Source Job

US

  • Design, develop and implement large scale, high-volume, high-performance data infrastructure and pipelines.
  • Build and implement ETL frameworks to improve code quality and reliability.
  • Guide and mentor other Data Engineers as a technical owner of parts of the data platform.

Python SQL ETL Airflow Spark

20 jobs similar to Senior Data Engineer

Jobs ranked by similarity.

$180,000–$220,000/yr
US Unlimited PTO

  • Design and implement robust, production-grade pipelines using Python, Spark SQL, and Airflow.
  • Lead efforts to canonicalize raw healthcare data into internal models.
  • Onboard new customers by integrating their raw data into internal pipelines and canonical models.

Machinify is a healthcare intelligence company delivering value, transparency, and efficiency to health plan clients. They serve over 85 health plans, including many of the top 20, representing more than 270 million lives, with an AI-powered platform and expertise.

$140,000–$160,000/yr
US 3w PTO

  • Design, develop, and maintain a core Python ETL framework.
  • Develop and optimize an automated refresh pipeline orchestrated through AWS Batch, Lambda, Step Functions, and EventBridge.
  • Build Python integrations with external systems that are robust, testable, and reusable.

BlastPoint is a B2B data analytics startup that helps companies engage with customers more effectively by discovering insights in their data. Founded in 2016 by Carnegie Mellon alumni, they are a tight-knit, forward-thinking team that serves diverse industries including energy, finance, retail, and transportation.

$142,000–$162,500/yr
US

  • Architect, build, and operate data infrastructure that powers Tebra’s intelligent features.
  • Translate business requirements into software solutions that accelerate our ability to deploy AI.
  • Monitor data pipelines, detect anomalies, and implement automated recovery systems.

Tebra unites Kareo and PatientPop, providing a digital backbone for practice well-being, supporting both products with a shared vision for modernized care. Over 100,000 providers trust Tebra to elevate patient experience and grow their practice, building the future of well-being with compassion and humanity.

$230,000–$265,000/yr
US Unlimited PTO

  • Design and build robust, highly scalable data pipelines and lakehouse infrastructure with PySpark, Databricks, and Airflow on AWS.
  • Improve the data platform development experience for Engineering, Data Science, and Product by creating intuitive abstractions, self‑service tooling, and clear documentation.
  • Own and maintain core data pipelines and models that power internal dashboards, ML models, and customer-facing products.

Parafin aims to grow small businesses by providing them with the financial tools they need through the platforms they already sell on. They are a Series C company backed by prominent venture capitalists, with a tight-knit team of innovators from companies like Stripe, Square, and Coinbase.

Data Engineer

UW
UK

  • Design, build, and maintain robust ETL/ELT pipelines to ingest large-scale datasets and high-frequency streams.
  • Lead the design and evolution of our enterprise data warehouse, ensuring it is scalable and performant.
  • Manage our data transformation layer using Dataform (preferred) or dbt to orchestrate complex, reliable workflows.

UW provides utilities all in one place, including energy, broadband, mobile, and insurance. They aim to double in size and offer savings to customers, fostering a culture that values imaginative and pragmatic problem-solvers.

Global

  • Build and optimize scalable, efficient ETL and data lake processes.
  • Own the ingestion, modeling, and transformation of structured and unstructured data.
  • Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows.

Launch Potato is a digital media company that connects consumers with brands through data-driven content and technology. They have a remote-first team spanning over 15 countries and have built a high-growth, high-performance culture.

EMEA

  • Design, build, and maintain ETL/ELT pipelines and integrations across legacy and cloud systems.
  • Model, store, and transform data to support analytics, reporting, and downstream applications.
  • Build API-based and file-based integrations across enterprise platforms.

Jobgether is a platform that uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company.

US

  • Design, build, and optimize scalable data architectures that power marketing analytics and survey measurement initiatives.
  • Deliver automated, high-impact data solutions and insights that enhance decision-making across teams.
  • Build robust pipelines, dashboards, and analytical frameworks in fast-paced environments.

ItD is a consulting and software development company that blends diversity, innovation, and integrity with real business results. They reject rigid hierarchy, empowering teams to deliver excellent results in a woman- and minority-led firm.

US Unlimited PTO

  • Design, build, and maintain pipelines that power all data use cases.
  • Develop intuitive, performant, and scalable data models that support product features.
  • Pay down technical debt, improve automation, and follow best practices in data modeling.

Patreon is a media and community platform where over 300,000 creators give their biggest fans access to exclusive work and experiences. They are leaders in the space, with creators having generated over $10 billion since Patreon's inception, and a team passionate about their mission.

US 3w PTO 2w paternity

  • Serve as a primary advisor to identify technical improvements and automation opportunities.
  • Build advanced data pipelines using the medallion architecture in Snowflake.
  • Write advanced ETL/ELT scripts to integrate data into enterprise data stores.

Spring Venture Group is a digital direct-to-consumer sales and marketing company focused on the senior market. They have a dedicated team of licensed insurance agents and leverage technology to help seniors navigate Medicare.

US

  • Use Google BigQuery and the GCP suite, or similar big data tools, to deliver on data warehouse and reporting requirements.
  • Utilize Databricks and similar ETL tools to perform data extraction and transformation of high-volume records.
  • Architect, design, and build pipelines to move large amounts of data from various sources.

Jobgether is a company that uses an AI-powered matching process to ensure applications are reviewed quickly and fairly. They identify top-fitting candidates and share the shortlist with the hiring company.

Global Unlimited PTO

  • Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
  • Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimize data models, and enable self-serve analytics and AI.
  • End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.

Trustonic provides smartphone locking technology, enabling global access to devices and digital finance. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. They celebrate diversity and aim to do the right thing for each other, the community, and the planet.

Spain 5w PTO

  • Design, build, and maintain scalable data pipelines.
  • Apply dimensional modeling techniques to design tables and views.
  • Automate manual processes to improve efficiency and speed.

The Knot Worldwide champions celebration and powers meaningful moments for millions around the world. They are a team of passionate dreamers and doers united by connection and committed to the global community, believing the best ideas come from empowered and collaborative teams.

South America

  • Design, develop, and maintain ETL/ELT pipelines on cloud-based data platforms.
  • Build data ingestion, transformation, and orchestration workflows using tools such as Azure Data Factory, Airflow, Fivetran, or similar.
  • Develop transformations and data processing logic using platforms such as Databricks, Snowflake, or equivalent.

Ankura Consulting Group, LLC is an independent global expert services and advisory firm. They deliver services and end-to-end solutions to help clients at critical inflection points related to conflict, crisis, performance, risk, strategy, and transformation, and consists of more than 2000 professionals.

North America South America

  • Designing, building, and maintaining data pipelines and data warehouses.
  • Developing ETL ecosystem tooling using Python, external APIs, Airbyte, Snowflake, dbt, PostgreSQL, MySQL, and Amazon S3.
  • Helping to define automated solutions to solve complex problems around better understanding data, users, and the market.

Neighborhoods.com provides real estate software platforms, including 55places.com and neighborhoods.com. They strive to foster an inclusive environment, comfortable for everyone, and will not tolerate harassment or discrimination of any kind.

US

  • Build and maintain scalable data pipelines from ingestion through transformation and delivery.
  • Design, build, and maintain our data warehouse and data marts.
  • Partner with stakeholders to translate business needs into clean data models.

Gurobi Optimization focuses on mathematical optimization. They empower customers to expand their use of mathematical optimization technology in order to make smarter decisions and solve some of the world's toughest and most impactful business problems.

Philippines

  • Design, develop, and optimize data architecture and pipelines aligned with ETL/ELT principles.
  • Architect workflows using dbt to convert raw data into actionable analytics.
  • Maintain production data pipelines with Python, dbt, Matillion, and Snowflake.

Jobgether is a platform that connects job seekers with partner companies. They use AI-powered matching to ensure applications are reviewed quickly and fairly.

$150,000–$170,000/yr
US

  • Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
  • Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
  • Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.

Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

Global

  • Build a scalable, reliable, operable and performant big data workflow platform.
  • Drive the usage of Freight's data model across the organization with multiple product teams.
  • Drive efficiency and reliability improvements through design and automation.

Uber Freight is an enterprise technology company powering intelligent logistics with end-to-end logistics applications, managed services, and an expansive carrier network. Today, the company manages nearly $20B of freight, has one of the largest networks of carriers, and is backed by best-in-class investors.

$120,000–$150,000/yr
US

  • Architect and maintain central storage and cloud environment.
  • Design and automate scalable ELT/ETL pipelines for data.
  • Support scientists and operational teams by designing data models.

Funga is a public benefit corporation using forest fungal networks to address climate change. They combine DNA sequencing and machine learning with forest microbiome research to improve wood creation, carbon sequestration, and forest resilience. They are a team of scientists and builders aiming to remove three gigatons of carbon dioxide from the atmosphere by 2050.