Source Job

$122,400–$195,500/yr
US

  • Contribute to architecture and implement robust data pipelines.
  • Drive the creation of a secure, compliant, and privacy-focused data warehousing solution.
  • Partner with the data analytics team to deliver a data platform that supports accurate, actionable reporting.

Python PySpark Databricks ETL Terraform

20 jobs similar to Senior Data Engineer

Jobs ranked by similarity.

$140,400–$224,250/yr
US

  • Lead the implementation of a resilient, privacy-first data platform architecture.
  • Lead the design, infrastructure, and tooling decisions for platform optimization.
  • Develop AI-ready architecture by creating semantic layers that define and standardize business logic.

Headspace provides access to lifelong mental health support. They combine evidence-based content, clinical care, and innovative technology to help millions of members around the world get support that’s effective and personalized. They value connecting with courage, ownership, and iterating to great.

$110,000–$135,000/yr
Canada

  • Design and implement scalable data architectures to support business needs.
  • Build and optimize data pipelines, ensuring data accessibility and security.
  • Develop and maintain data models, databases, and data lakes, with robust data governance.

Terawatt Infrastructure delivers large scale, turnkey charging solutions for companies rapidly deploying AV and EV fleets. With a growing portfolio of sites across the US, Terawatt is building the permanent transportation and logistics infrastructure of tomorrow through capital, real estate, development, and site operations solutions.

$84,191–$106,194/yr
Canada

  • Design and implement scalable data ingestion and transformation pipelines using Databricks and cloud platforms
  • Lead architecture decisions for modern data platforms, including Medallion Architecture and Lakehouse patterns
  • Build and maintain ETL/ELT pipelines using Python and SQL, following engineering best practices

AOT Technologies helps enterprises and governments bring their ideas to life. As a boutique consulting firm, they partner with enterprises, startups, and governments to solve complex, mission-critical challenges. Their teams are collaborative and their leadership is transparent.

$140,000–$160,000/yr
US 4w PTO

  • Own and evolve the data infrastructure that powers Clever's core data products.
  • Maintain and improve data pipeline reliability, monitoring and resolving pipeline failures.
  • Design and implement ingestion for new operational data sources that support Clever's speed-to-match initiative.

Clever Real Estate is a venture-backed technology company aiming to revolutionize real estate transactions. They have built a leading online education platform helping consumers save money and have earned a 4.9 Trustpilot rating with over 3,800 reviews.

Global Unlimited PTO

  • Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
  • Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimise data models, and enable self-serve analytics and AI.
  • End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.

Trustonic makes smartphones affordable, enabling global access to devices and digital finance through secure smartphone locking technology. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. The company celebrates its diversity and is looking to do the right thing: for each other, the community and the planet.

Europe Asia

  • Design, implement, and maintain robust, scalable data pipelines to support AI, analytics, and operational reporting
  • Own and evolve the data warehouse architecture, ensuring it meets performance, flexibility, and governance needs
  • Ensure data integrity, availability, lineage, and observability across complex pipelines

Remote People is building the infrastructure to power borderless teams. Their technology handles global payroll, benefits, taxes, and compliance, enabling businesses to compliantly hire anyone anywhere at the push of a button. They are a growing, international family.

$146,400–$175,100/yr
US

  • Architect, implement, and maintain scalable data architectures.
  • Develop, optimize, and maintain ETL processes.
  • Optimize data processing and query performance.

Blueprint Technologies is a technology solutions firm that helps organizations unlock value from existing assets by leveraging cutting-edge technology. Their teams bring years of experience across multiple industries, and they deliberately build teams of people with diverse skillsets and backgrounds.

  • Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
  • Implement data quality checks, monitoring, and validation processes.
  • Automate manual processes into centralized and scalable solutions.

Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2000 colleagues worldwide and traded on Nasdaq as part of Informa PLC.

$92,686–$125,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.

Europe

  • Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
  • Increase the robustness of existing production pipelines, identify bottlenecks, and set up monitoring, testing processes, and documentation templates.
  • Build custom applications and integrations that automate manual customer-operations tasks, helping Product Operations / Support / SRE in their day-to-day activities.

Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.

Global

  • Design and develop data ingestion pipelines, preferably with Databricks experience.
  • Performance tune and optimize Databricks jobs, evaluating new features and refactoring code.
  • Collaborate with analysts, managers, architects, and senior developers to establish the application framework.

Databricks is a data and AI company whose Lakehouse platform unifies data engineering, analytics, and AI. They focus on innovation and teamwork.

US Unlimited PTO

  • Serve as the embedded technical lead for Databricks customer engagements.
  • Own Databricks platform architecture, design decisions, and technical standards.
  • Lead delivery of complex data pipelines and analytics workloads on Databricks.

540 is a forward-thinking company that the government turns to in order to #getshitdone. They break down barriers, build impactful technology, and solve mission-critical problems.

$130,000–$138,000/yr
US

  • Own the end-to-end architecture of the Consumer Canvas product.
  • Design systems across ingestion, identity spine creation, enrichment, governance, and activation workflows.
  • Collaborate with cross-functional partners (Data Scientists, ML Engineers, Data Engineers, Product Owners).

NIQ is the world’s leading consumer intelligence company, helping brands and advertisers understand what people buy, watch, and engage with. MRI-Simmons, a division within NIQ, runs one of the most comprehensive and detailed consumer insights studies in the U.S., collecting thousands of data attributes from 50,000+ respondents annually.

US

  • Shape, scale, and govern our modern data ecosystem.
  • Deliver high quality data products that power clinical, operational, financial, and analytical outcomes.
  • Work closely with teams across the organization to deliver governed, high-quality, analytics-ready data at scale.

Interwell Health is a kidney care management company that partners with physicians. They aim to reimagine healthcare and help patients live their best lives, driven by a mission to help people and to find better ways of doing things.

$195,000–$240,000/yr
US

  • Build and lead a team of 4–5 data engineers focused on reusable product artifacts
  • Own the product data engineering backlog in partnership with product management
  • Define and enforce technical standards for notebooks, pipelines, QC modules, and documentation

Qualified Health is redefining what’s possible with Generative AI in healthcare. They provide the guardrails for safe AI governance, healthcare-specific agent creation, and real-time algorithm monitoring, working alongside leading health systems to drive real change. They are a fast-growing company backed by premier investors.

$118,000–$148,000/yr
US

  • Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning
  • Partner cross-functionally with analytics, product, engineering and operations to deliver high-quality data solutions that drive measurable business impact
  • Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response

Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers and risk-takers who know the value of peace of mind in an unpredictable world.

Global

  • Design, develop, and maintain data pipelines using Azure Databricks
  • Build and optimize data transformations using PySpark and SQL in Databricks
  • Implement and maintain Lakehouse architectures using Delta Lake

Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation, supporting digital transformation for some of the world's largest enterprises. They employ nearly 1,000 full-time professionals, and their annual growth rate exceeds 25%.

Data Engineer

YLD
Europe

  • Build core infrastructure software (pipelines, APIs, data modelling) as part of our client's data platform team.
  • Coach and mentor other engineers to support the growth of their technical expertise.
  • Implement the appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption.

YLD is a software engineering and design consultancy that creates digital capabilities for their clients. The company has offices in London, Lisbon, and Porto and aims to attract, inspire, develop, and retain extraordinary people.

Latin America

  • Design and evolve scalable data pipelines and architectures.
  • Act as the primary anchor for data ingestion, transformation, and storage solutions.
  • Ensure mission-critical data is accessible and reliable.

CodeRoad provides end-to-end software development services, helping businesses scale with ideal infrastructure solutions. From staff augmentation to dedicated IT teams and general software engineering, their nearshore technology services empower businesses to thrive in an ever-evolving digital landscape.