Source Job

$115,000–$145,000/yr
US

  • Collaborate with business leaders, engineers, and product managers to understand data needs.
  • Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

Python SQL AWS Airflow CI/CD

20 jobs similar to Sr Data Engineer

Jobs ranked by similarity.

Europe

  • Design, build, and maintain scalable, high-quality data pipelines.
  • Implement robust data ingestion, transformation, and storage using cloud-based technologies.
  • Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.

CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have partnerships with more than 1,000 clients and foster a diverse, inclusive, and safe work environment.

US Unlimited PTO

  • Partner with clients and implementation teams to understand data distribution requirements.
  • Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
  • Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.

Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, they're tackling big challenges in an industry that’s ready for change with a bold, curious, and collaborative team.

$110,572–$145,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis.

ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

India

  • Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
  • Develop and optimize data models in Snowflake or similar platforms.
  • Implement ETL/ELT processes using Python and modern data tools.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by the hiring company's internal team.

US

  • Build, manage, and operationalize data pipelines for marketing use cases.
  • Develop a comprehensive understanding of customer and marketing data requirements.
  • Transform large data sets into targeted customer audiences for personalized experiences.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

$115,000–$160,000/yr
US

As a key member of our Data Engineering team, you will:

  • Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives.
  • Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture.
  • Collaborate with the team to meet performance, scalability, and reliability goals.

PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.

US

  • Build and maintain scalable data pipelines using Snowflake OpenFlow and related Snowflake-native tools.
  • Develop and maintain Snowflake semantic views that support analytics and reporting needs.
  • Deliver clean, governed data sets for Sigma dashboards and embedded analytics use cases.

They are building the next-generation analytics stack centered on Snowflake (AWS) and Sigma. They value diverse perspectives and innovation.

Europe

  • Enable efficient consumption of domain data as a product by delivering and promoting strategically designed, actionable datasets and data models.
  • Build, maintain, and improve rock-solid data pipelines using a broad range of technologies like AWS Redshift, Trino, Spark, Airflow, and Kafka streaming for real-time processing.
  • Support teams without data engineers in building decentralised data solutions and product integrations, for example around DynamoDB.
  • Act as a data ambassador, promoting the value of data and our data platform among engineering teams and enabling cooperation.

OLX operates consumer brands that facilitate trade to build a more sustainable world. They have colleagues around the world who serve millions of people every month.

$175,000–$225,000/yr
US

  • Lead product requirements and advanced analytics requirements gathering efforts.
  • Work with analytics, data science, and wider engineering teams to help automate data analysis and visualization.
  • Build a scalable technology platform to support a growing business and deliver high-quality code to production.

Achieve is a leading digital personal finance company that helps everyday people move from struggling to thriving by providing innovative, personalized financial solutions. They have over 3,000 employees in mostly hybrid and 100% remote roles across the United States with hubs in Arizona, California, and Texas and a culture of putting people first.

$145,290–$185,000/yr
Unlimited PTO

  • Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data.
  • Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

$130,000–$176,000/yr
US Unlimited PTO

  • Design, develop, and implement end-to-end data pipelines to support data collection and transformation.
  • Lead the architecture and development of scalable and maintainable data solutions.
  • Collaborate with data scientists and analysts to provide clean and accessible data.

DexCare optimizes time in healthcare, streamlining patient access, reducing waits, and enhancing overall experiences.

$135,000–$165,000/yr
US Unlimited PTO

  • Design, build, and maintain scalable data pipelines.
  • Develop and implement data models for analytical use cases.
  • Implement data quality checks and governance practices.

MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.

India

  • Design, develop, and maintain scalable data pipelines and data warehouses.
  • Develop ETL/ELT processes using Python and modern data tools.
  • Ensure data quality, reliability, and performance across systems.

3Pillar Global is dedicated to engineering solutions that challenge conventional norms. They are an elite team of visionaries that actively shapes the tech landscape for their clients and sets global standards along the way.

Europe Unlimited PTO

  • Design, implement, and maintain scalable ETL/ELT pipelines using Python, SQL, and modern orchestration frameworks.
  • Build and optimize data models and schemas for cloud warehouses and relational databases, supporting AI and analytics workflows.
  • Lead large-scale data initiatives from planning through execution, ensuring performance, cost efficiency, and reliability.

This position is posted by Jobgether on behalf of a partner company.

US

  • Architect and maintain scalable, secure, and high-performing data pipelines to support analytics, reporting, and operational needs.
  • Develop and deploy production-grade data engineering code, ensuring reliability and performance across environments.
  • Manage end-to-end data workflows, including ingestion, transformation, modeling, and validation for multiple business systems.

Onebridge, a Marlabs Company, is a global AI and Data Analytics Consulting Firm that empowers organizations worldwide to drive better outcomes through data and technology. Since 2005, they have partnered with some of the largest healthcare, life sciences, financial services, and government entities across the globe.

$96,050–$113,000/yr
US

  • Creating and maintaining optimal data pipeline architecture.
  • Assembling large, complex data sets that meet functional & non-functional business requirements.
  • Building the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources using relevant technologies.

Mercer Advisors works with families to help them amplify and simplify their financial lives through integrated financial planning, investment management, tax, estate, and insurance services. They serve over 31,300 families in more than 90 cities across the U.S. and are ranked the #1 RIA Firm in the nation by Barron’s.

$155,000–$165,000/yr

  • Design, optimize and own data pipelines that scrape, process and ingest transaction and listing data from major auction houses and marketplaces.
  • Build comprehensive monitoring and alerting systems to track latency, uptime, and coverage metrics across all data sources.
  • Continuously improve our data infrastructure by modernizing storage and processing technologies, reducing manual interventions, and optimizing for cost, performance, and reliability.

Alt is unlocking the value of alternative assets, starting with the $5B trading-card market. They let collectors buy, sell, vault, and finance their cards in one place and are backed by leaders at Stripe, Coinbase, Seven Seven Six, and pro athletes like Tom Brady and Giannis Antetokounmpo.

$155,000–$180,000/yr
US

  • Design, build, and maintain robust and scalable data pipelines from diverse sources.
  • Leverage expert-level experience with dbt and Snowflake to structure, transform, and organize data.
  • Collaborate with engineering, product, and analytics teams to deliver data solutions that drive business value.

Topstep offers an engaging work environment, ranging from fully remote to hybrid, and fosters a culture of collaboration.

Brazil Canada US Latin America

  • Work alongside Caylent’s Architects, Engineering Managers, and Engineers to deliver AWS solutions.
  • Build solutions defined in project backlogs, writing production-ready, well-tested, and documented code across cloud environments.
  • Participate in Agile ceremonies such as daily standups, sprint planning, retrospectives, and demos.

Caylent is a cloud native services company that helps organizations bring the best out of their people and technology using Amazon Web Services (AWS). They are a global company operating fully remote, with employees in Canada, the United States, and Latin America, fostering a community of technological curiosity.

US

  • Architect and lead the evolution of our modern data platform.
  • Design and build production LLM pipelines and infrastructure that power intelligent operations.
  • Own end-to-end data acquisition and integration architecture across diverse sources.

Brightwheel is the largest, fastest-growing, and most loved platform in early education. They are trusted by millions of educators and families every day. The team is passionate, talented, and customer-focused and embodies their Leadership Principles in their work and culture.