Source Job

$100,649–$174,459/yr
US 4w PTO

  • Design, build, and maintain scalable data platforms using AWS to support analytics, machine learning, and emerging generative AI use cases.
  • Collaborate with data scientists, analysts, and engineering teams to translate business and AI requirements into scalable data solutions.
  • Work with large-scale datasets to build and optimize data pipelines using AWS services such as EMR (Spark, Trino), S3, Glue, Athena, and Airflow.

Python SQL AWS Data Pipelines

20 jobs similar to Data Engineer - Healthcare (Remote)

Jobs ranked by similarity.

$92,686–$125,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.

US

  • Architect and sustain self-healing pipelines using Astronomer/Airflow to ensure 24/7 data availability.
  • Design and optimize event-driven API ingestion frameworks leveraging AWS Lambda and DLT (Data Load Tool).
  • Manage high-performance modeling within AWS Redshift, utilizing DBT to transform raw transactional data into high-fidelity business intelligence.

Odisea helps close the opportunity gap between Colombia and the United States by redefining nearshoring. They are building a passionate team of professionals committed to this purpose.

US 4w PTO

  • Architect production-grade data pipelines integrating clinical data across multiple channels.
  • Build and optimize cloud-native data infrastructure using AWS.
  • Collaborate with data science teams to build foundations for predictive analytics.

Jobgether is a platform that uses AI to match candidates with jobs and ensure applications are reviewed quickly and fairly. They help the hiring company identify the top-fitting candidates.

US

  • Own the data engineering roadmap.
  • Lead, mentor, and scale a high-performing data engineering team.
  • Design and evolve our core data infrastructure on AWS, Apache Airflow, and Apache Spark.

Tekmetric is an all-in-one, cloud-based platform helping auto repair shops run smarter, grow faster, and serve customers better. Officially founded in Houston in 2017, Tekmetric has grown from a single shop’s vision to the industry’s leading solution. They value transparency, integrity, innovation, and a service-first mindset.

US

  • Experience integrating data from multiple data sources.
  • Experience with database technologies such as SQL Server, Redshift, Postgres, and RDS.
  • Experience designing, building, and maintaining data pipelines.

Bluelight Consulting is a leading software consultancy dedicated to designing and developing innovative technology that enhances users' lives. With a presence across the United States and Central/South America, Bluelight is in an exciting phase of expansion, continually seeking exceptional talent to join its dynamic and diverse community.

$89,440–$94,380/yr
US

  • Design, build, and maintain scalable data pipelines.
  • Develop and optimize ETL/ELT processes using cloud data technologies.
  • Partner with teams to understand data requirements and improve data capture strategies.

Blueprint is a technology solutions firm with a strong presence across the United States, solving complicated problems for their clients. They are bold, smart, agile, and fun, and believe in unique perspectives, building teams of people with diverse skillsets and backgrounds.

US Unlimited PTO

  • Work cross-functionally with Product and subject matter experts to conceptualize, prototype, and build data solutions
  • Connect disparate datasets (e.g. claims, contract rates, demographics data) to empower internal and external stakeholders
  • Build and maintain data engineering systems that support AI use cases, including scalable ingestion pipelines, feature generation, and downstream products

Turquoise Health aims to make healthcare pricing simpler, more transparent, and lower cost. They are a Series B startup backed by top VCs with an accomplished group of folks with a passion for improving healthcare.

$167,000–$200,000/yr
US Unlimited PTO

  • Build, optimize, and maintain data pipelines that power our business
  • Define and build out abstracted, reusable data sets for Business Intelligence, Marketing, and Data Science Research
  • Design, build, and evangelize a federated data validation framework to monitor potential data inconsistencies

Garner Health strives to transform the healthcare economy, delivering accessible, high-quality healthcare. They are a fast-growing healthcare technology company dedicated to making a meaningful impact on healthcare at scale with a team of talented, mission-driven individuals.

$140,000–$180,000/yr
US Unlimited PTO

  • Build and operate data services driving our applications and APIs
  • Collaborate with team members and across Engineering to iteratively prototype and develop new functionality
  • Partner with product managers and other Zusers

Zus is a shared health data platform designed to accelerate healthcare data interoperability by providing easy-to-use patient data via API, embedded components, and direct EHR integrations. Founded in 2021, it partners with HIEs and other data networks to aggregate patient clinical history and then translates that history into user-friendly information at the point of care.

$108,400–$135,500/yr
US North America

  • Design, develop, and maintain scalable data pipelines using cloud data services.
  • Serve as a technical leader, defining data engineering standards and best practices.
  • Lead the design and implementation of optimized data models in our cloud data warehouse.

Constant Contact empowers people by giving them the help and tools they need to grow online. They are energized by new challenges and possibilities, and they celebrate diversity and inclusion with programs in place to bring people together.

Global Unlimited PTO

  • Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
  • Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimise data models, and enable self-serve analytics and AI.
  • End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.

Trustonic makes smartphones affordable, enabling global access to devices and digital finance through secure smartphone locking technology. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. The company celebrates its diversity and is looking to do the right thing: for each other, the community and the planet.

  • Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
  • Implement data quality checks, monitoring, and validation processes.
  • Automate manual processes into centralized and scalable solutions.

Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2000 colleagues worldwide and traded on Nasdaq as part of Informa PLC.

$118,000–$148,000/yr
US

  • Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning
  • Partner cross-functionally with analytics, product, engineering and operations to deliver high-quality data solutions that drive measurable business impact
  • Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response

Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers and risk-takers who know the value of peace of mind in an unpredictable world.

$73,560–$95,628/yr
Canada

  • Practicing the KOHO values
  • Gathering requirements for, and implementing, streaming and batch pipelines
  • Developing and maintaining batch data pipelines using AWS Glue, Lambda, and Python

KOHO's mission is to make financial services better for every Canadian by offering transparent financial products designed to help users spend smart, save more, and build wealth. They are a performance organization that values autonomy, high trust, and work-life integration.

$140,000–$160,000/yr
US 4w PTO

  • Own and evolve the data infrastructure that powers Clever's core data products.
  • Maintain and improve data pipeline reliability, monitoring and resolving pipeline failures.
  • Design and implement ingestion for new operational data sources that support Clever's speed-to-match initiative.

Clever Real Estate is a venture-backed technology company aiming to revolutionize real estate transactions. They have built a leading online education platform helping consumers save money and have earned a 4.9 TrustPilot rating with over 3,800 reviews.

Europe

  • Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
  • Increase the robustness of existing production pipelines, identify bottlenecks, and set up robust monitoring, testing processes, and documentation templates.
  • Build custom applications and integrations that automate manual customer-operations tasks, supporting Product Operations, Support, and SRE in their day-to-day activities.

Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.

North America

  • Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines.
  • Architect and implement robust data infrastructure capable of handling high-volume data ingestion and processing.
  • Implement automated data quality checks, validation rules, and monitoring frameworks.

ShyftLabs is a data product company founded in early 2020 that works with Fortune 500 companies. Through innovation, they deliver digital solutions that help accelerate business growth across various industries, and they value strong business awareness.

Europe

  • Design and maintain scalable data pipelines.
  • Structure, transform, and optimize data in Snowflake.
  • Implement multi-source ETL/ELT flows (ERP, APIs, files).

QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. They help customers in various industries rapidly adapt to change and innovate for competitive advantage.

Europe Asia

  • Design, implement, and maintain robust, scalable data pipelines to support AI, analytics, and operational reporting
  • Own and evolve the data warehouse architecture, ensuring it meets performance, flexibility, and governance needs
  • Ensure data integrity, availability, lineage, and observability across complex pipelines

Remote People is building the infrastructure to power borderless teams. Their technology handles global payroll, benefits, taxes, and compliance, enabling businesses to compliantly hire anyone anywhere at the push of a button. They are a growing, international family.

Latin America

  • Develop and maintain data models for core package application and reporting databases.
  • Monitor execution and performance of daily pipelines, triage and escalate any issues.
  • Collaborate with analytics and business teams to improve data models and data pipelines.

Bluelight Consulting designs and develops innovative software to enhance users' lives, focusing on quality and customer satisfaction. They foster a collaborative work environment where team members can grow, and are expanding across the US and Central/South America, seeking exceptional talent.