Source Job

Global

  • Architect our AWS-based data warehouse and ingestion pipelines.
  • Transform high-volume simulation outputs into clean, trusted datasets.
  • Establish schema standards and data contracts with engineering.

Python SQL AWS GCP Azure

20 jobs similar to Data Engineering Lead

Jobs ranked by similarity.

Global

  • Build and optimize scalable, efficient ETL and data lake processes.
  • Own the ingestion, modeling, and transformation of structured and unstructured data.
  • Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows.

Launch Potato is a digital media company that connects consumers with brands through data-driven content and technology. They have a remote-first team spanning over 15 countries and have built a high-growth, high-performance culture.

Latin America

  • Design, build, maintain, and operate scalable streaming and batch data pipelines.
  • Work with AWS services, including Redshift, EMR, and ECS, to support data processing and analytics workloads.
  • Develop and maintain data workflows using Python and SQL.

Southworks helps companies with software development and digital transformation. They focus on solving complex problems and delivering innovative solutions.

$230,000–$265,000/yr
US Unlimited PTO

  • Design and build robust, highly scalable data pipelines and lakehouse infrastructure with PySpark, Databricks, and Airflow on AWS.
  • Improve the data platform development experience for Engineering, Data Science, and Product by creating intuitive abstractions, self‑service tooling, and clear documentation.
  • Own and maintain core data pipelines and models that power internal dashboards, ML models, and customer-facing products.

Parafin aims to grow small businesses by providing them with the financial tools they need through the platforms they already sell on. They are a Series C company backed by prominent venture capitalists, with a tight-knit team of innovators from companies like Stripe, Square, and Coinbase.

South America

  • Design, develop, and maintain ETL/ELT pipelines on cloud-based data platforms.
  • Build data ingestion, transformation, and orchestration workflows using tools such as Azure Data Factory, Airflow, Fivetran, or similar.
  • Develop transformations and data processing logic using platforms such as Databricks, Snowflake, or equivalent.

Ankura Consulting Group, LLC is an independent global expert services and advisory firm of more than 2,000 professionals. They deliver services and end-to-end solutions to help clients at critical inflection points related to conflict, crisis, performance, risk, strategy, and transformation.

Global

  • Lead, mentor, and develop high-performing data engineering squads delivering production-grade pipelines and services.
  • Set technical and operational standards for quality, documentation, and reliability.
  • Partner with Program Management to plan, prioritise, and track delivery against sprint goals.

Forbes Digital Marketing Inc. is a high-growth digital media and technology company dedicated to helping consumers make confident, informed decisions about their money, health, and everyday life. We combine data-driven content, rigorous experimentation, and modern engineering to power a portfolio of global products and partnerships.

$140,000–$160,000/yr
US 3w PTO

  • Design, develop, and maintain a core Python ETL framework.
  • Develop and optimize an automated refresh pipeline orchestrated through AWS Batch, Lambda, Step Functions, and EventBridge.
  • Build Python integrations with external systems that are robust, testable, and reusable.
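As a rough illustration of what a "core Python ETL framework" like the one described might look like, here is a minimal sketch; every class, field, and function name below is hypothetical, not taken from the posting, and a real framework would add logging, retries, and orchestration hooks:

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Iterable

Record = dict[str, Any]

@dataclass
class Pipeline:
    """Tiny ETL skeleton: one extractor, a chain of transforms, one loader."""
    extract: Callable[[], Iterable[Record]]
    transforms: list[Callable[[Record], Record]] = field(default_factory=list)
    load: Callable[[list[Record]], None] = print

    def run(self) -> list[Record]:
        rows: list[Record] = []
        for rec in self.extract():
            for t in self.transforms:
                rec = t(rec)  # apply each transform in order
            rows.append(rec)
        self.load(rows)  # hand the cleaned batch to the sink
        return rows

# Example run: normalize a field, then add a derived one.
sink: list[Record] = []
p = Pipeline(
    extract=lambda: [{"name": " Ada "}, {"name": "Grace"}],
    transforms=[
        lambda r: {**r, "name": r["name"].strip()},
        lambda r: {**r, "name_len": len(r["name"])},
    ],
    load=sink.extend,
)
p.run()
# sink now holds [{"name": "Ada", "name_len": 3}, {"name": "Grace", "name_len": 5}]
```

In a real deployment, `extract` and `load` would wrap S3/database clients, and each `run()` would be triggered by the AWS Batch / Step Functions orchestration the bullets mention.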

BlastPoint is a B2B data analytics startup that helps companies engage with customers more effectively by discovering insights in their data. Founded in 2016 by Carnegie Mellon Alumni, they are a tight-knit, forward-thinking team that serves diverse industries including energy, finance, retail, and transportation.

$120,000–$160,000/yr
US

  • Design and implement scalable, reliable, and efficient data pipelines to support clinical, operational, and business needs.
  • Optimize data storage and processing in data lakes and cloud data warehouses (Azure, Databricks).
  • Proactively suggest improvements to infrastructure, processes, and automation to improve system efficiency, reduce costs, and enhance performance.

Care Access is dedicated to ensuring that every person has the opportunity to understand their health, access the care they need, and contribute to the medical breakthroughs of tomorrow. They are working to make the future of health better for all and have hundreds of research locations, mobile clinics, and clinicians across the globe.

$37,744–$48,521/yr
Europe

  • Develop data warehouse applications, including extraction, ingestion, and transformation processes.
  • Collaborate with customers and internal teams to understand business requirements.
  • Ensure quality assurance and data validation, working within a sprint-based methodology.

One Model, founded by industry veterans in HR analytics, has a data-first approach to their People Analytics Platform, giving them a competitive advantage. They foster a friendly, inclusive, and respectful workplace culture, offering the opportunity to contribute significantly to a young company and team.

Data Engineer

UK

  • Design, build, and maintain robust ETL/ELT pipelines to ingest large-scale datasets and high-frequency streams.
  • Lead the design and evolution of our enterprise data warehouse, ensuring it is scalable and performant.
  • Manage our data transformation layer using Dataform (preferred) or dbt to orchestrate complex, reliable workflows.

UW provides utilities all in one place, including energy, broadband, mobile, and insurance. They aim to double in size and offer savings to customers, fostering a culture that values imaginative and pragmatic problem-solvers.

$200,000–$220,000/yr
US Unlimited PTO

  • Design, develop, and maintain dbt data models that support our healthcare analytics products.
  • Integrate and transform customer data to conform to our data specifications and pipelines.
  • Design and execute initiatives that improve data platform and pipeline automation and resilience.

SmarterDx, a Smarter Technologies company, builds clinical AI that is transforming how hospitals translate care into payment. Founded by physicians in 2020, our platform connects clinical context with revenue intelligence, helping health systems recover millions in missed revenue, improve quality scores, and appeal every denial.

EMEA

  • Design, build, and maintain ETL/ELT pipelines and integrations across legacy and cloud systems.
  • Model, store, and transform data to support analytics, reporting, and downstream applications.
  • Build API-based and file-based integrations across enterprise platforms.

Jobgether is a platform that uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company.

Global Unlimited PTO

  • Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
  • Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimise data models, and enable self-serve analytics and AI.
  • End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.

Trustonic provides smartphone locking technology, enabling global access to devices and digital finance. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. They celebrate diversity and aim to do the right thing for each other, the community, and the planet.

US

  • Design, build, and optimize data pipelines to support AI and ML projects.
  • Integrate data from various sources to provide a unified data view for AI applications.
  • Implement processes to ensure data quality, consistency, and accuracy across systems.
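The third bullet, ensuring quality and consistency across systems, often reduces to rule-based validation at the pipeline boundary. A minimal sketch of that pattern follows; the rules and field names are invented for illustration, and in practice they would come from each source system's data contract:

```python
from typing import Any, Callable

Rule = tuple[str, Callable[[dict[str, Any]], bool]]

# Hypothetical checks for a customer record.
RULES: list[Rule] = [
    ("id is present", lambda r: r.get("id") is not None),
    ("email has @", lambda r: "@" in r.get("email", "")),
    ("amount non-negative", lambda r: r.get("amount", 0) >= 0),
]

def validate(record: dict[str, Any]) -> list[str]:
    """Return the names of every rule the record fails (empty list = clean)."""
    return [name for name, check in RULES if not check(record)]

good = {"id": 1, "email": "a@example.com", "amount": 10}
bad = {"email": "nope", "amount": -5}
assert validate(good) == []
assert validate(bad) == ["id is present", "email has @", "amount non-negative"]
```

Frameworks like Great Expectations or dbt tests formalize this same idea; the value is that failures are named and reportable rather than silently propagated downstream.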

The Tyndale Company is a leading national supplier of arc-rated flame-resistant clothing (FRC) to the energy sector. They are a family-owned business, 9x Top Workplace winner in PA and 5x winner in TX, providing a retail-style apparel experience.

$125,000–$145,000/yr
US

  • Design, build, and scale the lakehouse architecture that underpins analytics, machine learning, and AI.
  • Modernize our data ecosystem, making it discoverable, reliable, governed, and ready for self-service and intelligent automation.
  • Operate anywhere along the data lifecycle from ingestion and transformation to metadata, orchestration, and MLOps.

OnX is a pioneer in digital outdoor navigation with a suite of apps. With more than 400 employees, they have created regional “Basecamps” to help remote employees find connection and inspiration.

$120,000–$150,000/yr
US

  • Architect and maintain the central data storage and cloud environment.
  • Design and automate scalable ELT/ETL pipelines for data.
  • Support scientists and operational teams by designing data models.

Funga is a public benefit corporation using forest fungal networks to address climate change. They combine DNA sequencing and machine learning with forest microbiome research to improve wood creation, carbon sequestration, and forest resilience. They are a team of scientists and builders aiming to remove three gigatons of carbon dioxide from the atmosphere by 2050.

US

  • Build and maintain scalable data pipelines from ingestion through transformation and delivery.
  • Design, build, and maintain our data warehouse and data marts.
  • Partner with stakeholders to translate business needs into clean data models.

Gurobi Optimization focuses on mathematical optimization. They empower customers to expand their use of mathematical optimization technology in order to make smarter decisions and solve some of the world's toughest and most impactful business problems.

Latin America

  • Build and operate backend services and automation for the Snowflake data platform.
  • Support data ingestion pipelines (RDS/Oracle → Snowflake) and reverse ETL (Snowflake → RDS).
  • Develop and maintain Airflow (AWS MWAA) workflows for ingestion, data quality, and ops automation.
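The ingestion (RDS/Oracle → Snowflake) and reverse-ETL (Snowflake → RDS) flows described above are both instances of a copy-between-stores pattern that an Airflow task would wrap. A toy sketch using SQLite in place of both stores (table and column names invented):

```python
import sqlite3

def copy_table(src: sqlite3.Connection, dst: sqlite3.Connection,
               table: str, columns: list[str]) -> int:
    """Copy all rows of `table` from src to dst; returns the row count."""
    cols = ", ".join(columns)
    rows = src.execute(f"SELECT {cols} FROM {table}").fetchall()
    placeholders = ", ".join("?" for _ in columns)
    dst.executemany(f"INSERT INTO {table} ({cols}) VALUES ({placeholders})", rows)
    dst.commit()
    return len(rows)

# "RDS" source and "Snowflake" destination, both in-memory stand-ins here.
rds = sqlite3.connect(":memory:")
wh = sqlite3.connect(":memory:")
for db in (rds, wh):
    db.execute("CREATE TABLE orders (id INTEGER, total REAL)")
rds.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])

n = copy_table(rds, wh, "orders", ["id", "total"])  # ingestion direction
assert n == 2
# Reverse ETL is the same move in the other direction:
# copy_table(wh, rds, "orders", ["id", "total"])
```

In production this would use the Snowflake connector and incremental watermarks rather than full-table copies, with an Airflow (MWAA) DAG scheduling each direction.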

Upwork is the world’s work marketplace, serving everyone from one-person startups to over 30% of the Fortune 100. They provide a powerful, trust-driven platform that enables companies and talent to work together in new ways that unlock their potential. Last year, more than $3.8 billion of work was done through Upwork.

$185,000–$200,000/yr
US

  • Build and maintain Azure Data Factory pipelines to ingest data from multiple sources.
  • Write Python code in Databricks to clean raw data and move it into the silver layer, handling deduplication, type casting, and validation.
  • Monitor daily jobs and troubleshoot any failures to ensure pipeline stability.
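The "silver layer" cleaning step described above (deduplication, type casting, validation) can be sketched in plain Python. In Databricks this would normally be PySpark or pandas, and the field names below are invented for illustration:

```python
def clean(raw: list[dict]) -> list[dict]:
    """Deduplicate on id, cast amount to float, and drop invalid rows."""
    seen: set = set()
    silver: list[dict] = []
    for row in raw:
        rid = row.get("id")
        if rid is None or rid in seen:
            continue  # validation (missing id) + deduplication
        try:
            amount = float(row["amount"])  # type casting
        except (KeyError, TypeError, ValueError):
            continue  # drop rows that fail the cast
        seen.add(rid)
        silver.append({"id": rid, "amount": amount})
    return silver

raw = [
    {"id": 1, "amount": "9.50"},
    {"id": 1, "amount": "9.50"},   # duplicate id, dropped
    {"id": 2, "amount": "oops"},   # fails the cast, dropped
    {"id": 3, "amount": 3},
]
assert clean(raw) == [{"id": 1, "amount": 9.5}, {"id": 3, "amount": 3.0}]
```

The same logic in PySpark would be a `dropDuplicates(["id"])` plus column casts, scheduled by the Azure Data Factory pipelines the first bullet mentions.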

Jobgether is a platform that leverages AI to connect job seekers with employers. They focus on ensuring fair and efficient application reviews, connecting top candidates directly with hiring companies.

India

  • Lead the design and implementation of scalable ETL pipelines and data lakes in AWS.
  • Develop and optimise data architectures for terabyte-scale relational and distributed data systems.
  • Collaborate with Data Scientists, Software Engineers, and Architects to integrate data solutions into analytics platforms and applications.

Smart Working connects skilled professionals with outstanding global teams for full-time, long-term roles. They help discover meaningful work with teams that invest in your success, empowering you to grow personally and professionally in a remote-first world.

$150,000–$170,000/yr
US

  • Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
  • Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
  • Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.

Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.