
US

  • Prepare and manage pre-stage files for backbook conversion activities.
  • Support and execute data ingestion tasks in alignment with scheduled project events, including key mock events.
  • Monitor and ensure data ingestion completion within defined SLA windows.

Python, PySpark, SQL, AWS, ETL

20 jobs similar to Senior Data Engineer

Jobs ranked by similarity.

$120,000–$160,000/yr
US

  • You will join a team of talented engineers working closely with Data Scientists to build and scale our next-generation Ad EnGage data pipeline.
  • You will work with large-scale datasets (hundreds of TBs to petabyte-scale systems) using a modern data stack centered on AWS, Airflow, dbt, and Snowflake.
  • You’ll contribute to building reliable, high-quality data pipelines and improving the performance, scalability, and observability of our data platform.

EDO is the TV outcomes company. Their leading measurement platform connects convergent TV airings to the ad-driven consumer behaviors most predictive of future sales. They are headquartered in New York City and Los Angeles, with an office in San Francisco, and recognize the benefits of hybrid working.

Europe

  • Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
  • Increase the robustness of existing production pipelines, identify bottlenecks, and set up robust monitoring, testing processes, and documentation templates.
  • Build custom applications and integrations to automate manual tasks related to customer operations, helping Product Operations, Support, and SRE teams in their day-to-day activities.

Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.

Brazil

  • Design and implement data ingestion and transformation pipelines using PySpark/SparkSQL on Databricks.
  • Own data pipelines end-to-end in production: freshness, correctness, availability, and SLA adherence.
  • Build and maintain Delta Lake tables following medallion architecture patterns.

Pismo, founded in 2016, provides a comprehensive processing platform for banking, card issuing, and financial market infrastructure. With over 500 employees across more than 10 countries and now part of Visa, they empower firms to build and launch financial products rapidly with high security and availability standards.

$135,500–$200,000/yr
US

  • Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
  • Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
  • Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.

SmartAsset is an online destination for consumer-focused financial information and advice, whose mission is helping people make smart financial decisions, reaching an estimated 59 million people each month. A successful $110 million Series D funding round in 2021 valued the company at over $1 billion.

Global Unlimited PTO

  • Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
  • Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimize data models, and enable self-serve analytics and AI.
  • End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.

Trustonic makes smartphones affordable, enabling global access to devices and digital finance through secure smartphone locking technology. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. The company celebrates diversity and strives to do the right thing: for each other, the community, and the planet.

Global

  • Design and develop data ingestion pipelines, preferably with Databricks experience.
  • Performance tune and optimize Databricks jobs, evaluating new features and refactoring code.
  • Collaborate with analysts, managers, architects, and senior developers to establish application framework.

Databricks is a large data and AI company, focused on innovation and teamwork.

$122,400–$195,500/yr
US

  • Contribute to architecture and implement robust data pipelines.
  • Drive the creation of a secure, compliant, and privacy-focused data warehousing solution.
  • Partner with the data analytics team to deliver a data platform that supports accurate, actionable reporting.

Headspace provides access to lifelong mental health support. They combine evidence-based content, clinical care, and innovative technology to help millions of members around the world get support that’s effective and personalized. The company values making the mission matter, iterating to great, owning the outcome, and connecting with courage.

$179,469–$242,811/yr
US

  • Lead and grow a team of data engineers, providing mentorship and technical guidance.
  • Own execution of customer integrations across multiple product lines, ensuring on-time delivery.
  • Improve data quality and pipeline reliability by investing in better alerting and resilience.

Afresh is the leading AI company in fresh food, partnering with grocers to order billions of dollars of fresh food. They are on a mission to eliminate food waste and make fresh food accessible to all, and have prevented 200M lbs of food waste in 2025 alone.

$100,000–$147,680/yr
US

  • Design, develop, and maintain ETL processes and systems.
  • Collect, clean, and integrate data from various sources, including structured and unstructured data.
  • Ensure data quality and integrity by implementing data validation and error handling processes.

Cayuse Civil Services, LLC provides solutions in civil services. They value innovation, excellence, collaboration, adaptability, and integrity by fostering technical solutions that meet customer needs, promoting teamwork, and prioritizing quality in deliverables.

$150,000–$190,000/yr
US

  • Build and scale high-throughput streaming pipelines.
  • Model and deliver high-quality, production-grade real estate datasets.
  • Strengthen data quality and observability.

Luxury Presence is building the AI growth platform for real estate. Backed by Bessemer Venture Partners and other top investors, they are a Series C company on track to hit $100M in annual recurring revenue in the next six months. They are a global team ranked on the Inc. 5000 fastest-growing companies list three years in a row.

$100,649–$174,459/yr
US 4w PTO

  • Design, build, and maintain scalable data platforms using AWS to support analytics, machine learning, and emerging generative AI use cases.
  • Collaborate with data scientists, analysts, and engineering teams to translate business and AI requirements into scalable data solutions.
  • Work with large-scale datasets to build and optimize data pipelines using AWS services such as EMR (Spark, Trino), S3, Glue, Athena, and Airflow.

Experian is a global data and technology company, powering opportunities for people and businesses around the world. They invest in people and new advanced technologies to unlock the power of data and to innovate. A FTSE 100 Index company listed on the London Stock Exchange, they have a team of 23,300 people across 32 countries.

$100,000–$140,000/yr
US

  • Design, build, and maintain scalable data pipelines for clients across industries.
  • Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
  • Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.

NuView Analytics helps companies accelerate the time to insights from their data through data analytics, diligence, and fractional data science. They are a growth-stage consultancy that helps clients drive additional value from the data they are sitting on, and they value humility, intellectual rigor, and stewardship.

US Europe

  • Become a trusted data and AI advisor to clients, helping them translate business questions into AI-ready data architectures.
  • Design and implement AI-optimized data platforms, including cloud data warehouses, ETL/ELT pipelines, and analytic layers.
  • Engineer modern ELT/ETL pipelines that handle structured, semi-structured, and unstructured data to support AI and analytics use cases.

Aimpoint Digital is a dynamic and fully remote data and analytics consultancy. They work alongside the most innovative software providers in the data engineering space to solve their clients' toughest business problems and believe in blending modern tools and techniques with tried-and-true principles to deliver optimal data engineering solutions.

Europe

  • Develop engineering expertise within the Dataiku Platform to help maintain and develop system integrations, platform automations, and platform configurations.
  • Build and maintain Python and SQL data replication and data pipelines on large and often complex datasets.
  • Identify opportunities for improvement and optimization for greater scalability and delivery velocity.

Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.

$110,000–$135,000/yr
Canada

  • Design and implement scalable data architectures to support business needs.
  • Build and optimize data pipelines, ensuring data accessibility and security.
  • Develop and maintain data models, databases, and data lakes, with robust data governance.

Terawatt Infrastructure delivers large-scale, turnkey charging solutions for companies rapidly deploying AV and EV fleets. With a growing portfolio of sites across the US, Terawatt is building the permanent transportation and logistics infrastructure of tomorrow through capital, real estate, development, and site operations solutions.

$200,000–$240,000/yr
US Unlimited PTO

  • Lead and grow a team of data engineers responsible for SentiLink’s data platform and infrastructure.
  • Define and drive the technical vision for data ingestion, processing, storage, and serving systems.
  • Design and evolve scalable data pipelines (batch and real-time) to support product and data science use cases.

SentiLink provides identity and risk solutions. They empower institutions and individuals to transact with confidence. They have grown quickly and are backed by world-class investors.

$140,000–$160,000/yr
US

  • Design and build mission-critical data pipelines with a highly scalable distributed architecture.
  • Help continually improve ongoing reporting and analysis processes, simplifying self-service support for business stakeholders.
  • Build and support reusable frameworks to ingest, integrate, and provision data.

StockX is a Detroit-based technology leader focused on the online market for sneakers, apparel, accessories, electronics, collectibles, trading cards, and more. They employ 1,000 people across offices and verification centers around the world and their platform connects buyers and sellers using dynamic pricing mechanics.

$104,000–$164,000/yr
US

  • Build and manage business data pipelines and transform Firefox telemetry data into structured datasets.
  • Partner with data scientists, product, and marketing teams to turn datasets into models and metrics.
  • Ensure data accuracy and performance using observability tools and resolve data issues.

Mozilla Corporation is a technology company backed by a non-profit that has shaped the internet, creating brands like Firefox. With millions of users globally, they work in areas including AI and social media while remaining focused on making the internet better for people.

$160,800–$193,000/yr
US

  • Design and develop high-performance data converters for multi-sensor autonomous-driving data.
  • Design, build, and optimize large-scale ingestion and transformation pipelines capable of processing petabyte-scale autonomous-driving sensor data.
  • Implement automated data validation, quality checks, and lineage tracking to ensure reliability of production datasets.

Torc has been a leader in autonomous driving since 2007 and is now part of the Daimler family. They are focused solely on developing software for automated trucks to transform how the world moves freight and have a collaborative, energetic, and team-focused culture.