Source Job: Data Engineer

$90,000–$150,000/yr
US

  • Build modern, scalable data pipelines that keep the data flowing.
  • Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning.
  • Unify and wrangle data from all kinds of sources.

SQL · ETL · Data Modeling · DevOps · CI/CD

20 jobs similar to Data Engineer

Jobs ranked by similarity.

Europe

  • Design, build, and maintain scalable, high-quality data pipelines.
  • Implement robust data ingestion, transformation, and storage using cloud-based technologies.
  • Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.

CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world and partnerships with more than 1,000 clients, they foster a diverse, inclusive, and safe work environment.

Brazil · Canada · US · Latin America

  • Work alongside Caylent’s Architects, Engineering Managers, and Engineers to deliver AWS solutions.
  • Build solutions defined in project backlogs, writing production-ready, well-tested, and documented code across cloud environments.
  • Participate in Agile ceremonies such as daily standups, sprint planning, retrospectives, and demos.

Caylent is a cloud native services company that helps organizations bring the best out of their people and technology using Amazon Web Services (AWS). They are a global, fully remote company with employees in Canada, the United States, and Latin America, fostering a community of technological curiosity.

$115,000–$145,000/yr
US

  • Collaborate with business leaders, engineers, and product managers to understand data needs.
  • Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

NBCUniversal is one of the world's leading media and entertainment companies. They create world-class content, which they distribute across their portfolio of film, television, and streaming, and bring to life through their global theme park destinations, consumer products, and experiences. They champion an inclusive culture and strive to attract and develop a talented workforce to create and deliver a wide range of content reflecting our world.

$130,000/yr
Americas · Unlimited PTO

  • Build and evolve our semantic layer; design, document, and optimize dbt models.
  • Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow (see the sketch after this list).
  • Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
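
As a purely illustrative aside (and not Customer.io's actual setup), the sketch below shows one common way to handle the orchestration side of these bullets: a small Python wrapper that runs a dbt model selection and fails loudly if any model breaks. The project directory and selector name are invented placeholders.

```python
# Minimal orchestration sketch: invoke dbt from Python and surface failures.
# The project directory and model selector below are hypothetical.
import subprocess
import sys

def run_dbt(selector: str, project_dir: str = "/srv/analytics/dbt") -> None:
    """Run `dbt run` for the selected models and raise on failure."""
    result = subprocess.run(
        ["dbt", "run", "--select", selector, "--project-dir", project_dir],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        print(result.stdout, file=sys.stderr)
        raise RuntimeError(f"dbt run failed for selector {selector!r}")

if __name__ == "__main__":
    run_dbt("semantic_layer")  # hypothetical selector for the semantic-layer models
```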

Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.

India

  • Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
  • Develop and optimize data models in Snowflake or similar platforms.
  • Implement ETL/ELT processes using Python and modern data tools.
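
As a hedged illustration of the last bullet, and not this employer's actual pipeline, the sketch below stages a CSV extract into Snowflake with the snowflake-connector-python package and then transforms it in-warehouse (ELT). Every identifier, path, and credential is a placeholder.

```python
# Minimal ELT sketch: load a CSV extract into a Snowflake staging table,
# then build a cleaned reporting table with SQL. All names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load: stage the raw file and copy it into the staging table's table stage.
    cur.execute("PUT file:///data/orders.csv @%RAW_ORDERS")
    cur.execute("COPY INTO RAW_ORDERS FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
    # Transform: deduplicate and filter inside the warehouse (the "T" in ELT).
    cur.execute("""
        CREATE OR REPLACE TABLE ANALYTICS.REPORTING.ORDERS AS
        SELECT DISTINCT order_id, customer_id, order_date, amount
        FROM RAW_ORDERS
        WHERE order_id IS NOT NULL
    """)
finally:
    conn.close()
```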

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates and share this shortlist directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by the hiring company's internal team.

US

  • Build and maintain scalable data pipelines using Snowflake OpenFlow and related Snowflake-native tools.
  • Develop and maintain Snowflake semantic views that support analytics and reporting needs.
  • Deliver clean, governed data sets for Sigma dashboards and embedded analytics use cases.

They are building the next-generation analytics stack centered on Snowflake (AWS) and Sigma. They value diverse perspectives and innovation.

$171,000–$220,000/yr
US · Unlimited PTO

  • Design, implement, and maintain robust, automated data pipelines.
  • Model and optimize data in Snowflake to support analytics.
  • Ensure data reliability through automated quality checks, monitoring, observability, and lineage visibility.
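
To make the quality-check bullet concrete, here is a minimal sketch of an automated check run against a warehouse table through a DB-API-style cursor; the table name and the two rules are assumptions chosen only for illustration.

```python
# Hedged sketch of an automated data-quality check: assert basic expectations
# about a published table. The table name and rules are placeholders.
def check_orders_table(cursor) -> None:
    """Raise if the ORDERS table violates simple completeness rules."""
    cursor.execute("SELECT COUNT(*) FROM ANALYTICS.REPORTING.ORDERS")
    row_count = cursor.fetchone()[0]

    cursor.execute(
        "SELECT COUNT(*) FROM ANALYTICS.REPORTING.ORDERS WHERE order_id IS NULL"
    )
    null_keys = cursor.fetchone()[0]

    if row_count == 0:
        raise ValueError("Quality check failed: ORDERS is empty")
    if null_keys > 0:
        raise ValueError(f"Quality check failed: {null_keys} rows have a NULL order_id")
```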

Acquisition.com focuses on acquiring and growing businesses. They foster a lean, high-ownership environment.

US · Unlimited PTO

  • Design, build, and maintain robust data pipelines.
  • Own and scale ETL/ELT processes using tools like dbt, BigQuery, and Python.
  • Build modular data models that power analytics, product features, and LLM agents.
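
Purely as a sketch of the kind of ELT step these bullets describe, and assuming the google-cloud-bigquery client, the example below materializes one modular model table from raw events; the dataset and table names are invented.

```python
# Illustrative ELT step with the google-cloud-bigquery client: rebuild a
# "model" table from raw events. Dataset and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
CREATE OR REPLACE TABLE analytics.daily_active_users AS
SELECT DATE(event_timestamp) AS activity_date,
       COUNT(DISTINCT user_id) AS active_users
FROM raw.events
WHERE event_timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY activity_date
"""

client.query(sql).result()  # blocks until the query job finishes
print("daily_active_users rebuilt")
```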

Jobgether is a platform that uses AI to match candidates with jobs. They aim to review applications quickly and fairly, ensuring the top-fitting candidates are identified and shared with hiring companies.

$120,000–$150,000/yr
US

  • Support our managed analytics service clients through their data-driven journey and deliver measurable business value through data modeling, API integration, SQL scripting, and data pipeline development.
  • Bridge the gap between data applications and insightful business reports.
  • Participate in building our data platform from the ground up by exploring new technologies and vendors within our cloud-first environment.

DataDrive is a fast-growing managed analytics service provider that delivers modern cloud analytics data platforms to data-driven organizations while also supporting the ongoing training, adoption, and growth of its clients' data cultures. DataDrive offers a unique team-oriented environment where one can develop their skills and work directly with some of the most talented analytics professionals in the business.

$96,050–$113,000/yr
US

  • Creating and maintaining optimal data pipeline architecture.
  • Assembling large, complex data sets that meet functional & non-functional business requirements.
  • Building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using relevant technologies.

Mercer Advisors works with families to help them amplify and simplify their financial lives through integrated financial planning, investment management, tax, estate, and insurance services. They serve over 31,300 families in more than 90 cities across the U.S. and are ranked the #1 RIA Firm in the nation by Barron’s.

$150,000–$160,000/yr
US

  • Design, build, and execute data pipelines.
  • Build the configurable ETL framework (a toy sketch follows this list).
  • Optimize SQL queries to maximize system performance.
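
The toy sketch below illustrates what a configurable ETL framework can look like in miniature: pipeline steps are declared as data and a small runner executes them in order. The step names and config shape are assumptions, not RefinedScience's design.

```python
# A toy config-driven ETL framework: each pipeline is declared as a list of
# step names, and a small runner applies the registered steps in order.
from typing import Callable, Dict, List

Registry = Dict[str, Callable[[list], list]]

STEPS: Registry = {
    "drop_nulls": lambda rows: [r for r in rows if all(v is not None for v in r.values())],
    "uppercase_country": lambda rows: [{**r, "country": r["country"].upper()} for r in rows],
}

PIPELINE_CONFIG: List[str] = ["drop_nulls", "uppercase_country"]  # order matters

def run_pipeline(rows: list, config: List[str], registry: Registry = STEPS) -> list:
    """Apply each configured step to the rows, in sequence."""
    for step_name in config:
        rows = registry[step_name](rows)
    return rows

if __name__ == "__main__":
    raw = [{"id": 1, "country": "us"}, {"id": 2, "country": None}]
    print(run_pipeline(raw, PIPELINE_CONFIG))  # -> [{'id': 1, 'country': 'US'}]
```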

RefinedScience is dedicated to delivering high-quality emerging tech solutions. The posting does not describe company size or culture, but the role emphasizes innovation and collaboration.

$110,572–$145,000/yr
US · Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis (see the sketch below).
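
As a hedged example tying together two of the AWS services named above (and not ATPCO's actual architecture), the sketch below shows a Lambda handler that starts a Glue ETL job whenever a new object lands in S3; the Glue job name and arguments are hypothetical.

```python
# Hypothetical wiring between S3, Lambda, and Glue: on each S3 put
# notification, kick off a Glue ETL job for the new object.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    # S3 put-notification events carry the bucket and key of the new object.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Pass the object location to the Glue job as job arguments.
        response = glue.start_job_run(
            JobName="ingest-to-redshift",  # hypothetical Glue job name
            Arguments={"--source_bucket": bucket, "--source_key": key},
        )
        print(f"Started Glue run {response['JobRunId']} for s3://{bucket}/{key}")
```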

ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries, and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

$145,290–$185,000/yr
Unlimited PTO

  • Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large data.
  • Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

US

  • Design, build, and maintain scalable data pipelines and workflows in Snowflake.
  • Integrate and ingest data from multiple systems into Snowflake.
  • Develop and optimize SQL queries, views, and materialized datasets.

GTX Solutions is a consulting firm specializing in modern data architecture, Customer Data Platforms (CDPs), and marketing technology enablement. They work with enterprise clients across industries including Retail, Travel, Hospitality, and Financial Services to design and implement scalable data ecosystems.

US

  • Architect and maintain scalable, secure, and high-performing data pipelines to support analytics, reporting, and operational needs.
  • Develop and deploy production-grade data engineering code, ensuring reliability and performance across environments.
  • Manage end-to-end data workflows, including ingestion, transformation, modeling, and validation for multiple business systems.

Onebridge, a Marlabs Company, is a global AI and Data Analytics Consulting Firm that empowers organizations worldwide to drive better outcomes through data and technology. Since 2005, they have partnered with some of the largest healthcare, life sciences, financial services, and government entities across the globe.

India

  • Design, develop, and maintain scalable data pipelines and data warehouses.
  • Develop ETL/ELT processes using Python and modern data tools.
  • Ensure data quality, reliability, and performance across systems.

3Pillar Global is dedicated to engineering solutions that challenge conventional norms. They are an elite team of visionaries that actively shapes the tech landscape for their clients and sets global standards along the way.

$135,000–$220,000/yr
US · Unlimited PTO

  • Design, develop, and maintain reliable end-to-end data pipelines that connect internal and external systems.
  • Contribute to the performance, scalability, and reliability of our entire data ecosystem.
  • Work with analysts to engineer data structures and orchestrate workflows that encode core business logic.

Roo is on a mission to empower animal healthcare professionals with opportunities to earn more and achieve greater flexibility in their careers and personal lives. Powered by groundbreaking technology, Roo has built the industry-leading veterinary staffing platform, connecting Veterinarians, Technicians, and Assistants with animal hospitals for relief work and hiring opportunities.

Europe

  • Maintain, configure, and optimize the existing data warehouse platform and pipelines.
  • Design and implement incremental data integration solutions prioritizing data quality, performance, and cost-efficiency.
  • Drive innovation by experimenting with new technologies and recommending platform improvements.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They appreciate your interest and wish you the best!

US

  • Build, manage, and operationalize data pipelines for marketing use cases.
  • Develop a comprehensive understanding of customer and marketing data requirements.
  • Transform large data sets into targeted customer audiences for personalized experiences.
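
To illustrate the audience-building bullet, here is a small pandas sketch that filters customer records into a targeted segment; the column names and thresholds are made up for the example.

```python
# Illustrative audience-building step with pandas: turn customer attributes
# into a targeted segment. Columns and thresholds are invented placeholders.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "last_purchase_days_ago": [5, 40, 12, 90],
    "email_opt_in": [True, True, False, True],
    "lifetime_value": [250.0, 1200.0, 80.0, 30.0],
})

# "Recent, opted-in, high-value" audience for a personalized campaign.
audience = customers[
    (customers["last_purchase_days_ago"] <= 30)
    & customers["email_opt_in"]
    & (customers["lifetime_value"] >= 100)
]["customer_id"].tolist()

print(audience)  # -> [1]
```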

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

Europe

  • Design and maintain scalable data pipelines.
  • Structure, transform, and optimize data in Snowflake.
  • Implement multi-source ETL/ELT flows (ERP, APIs, files).
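
As a rough sketch of a multi-source ingestion step like the last bullet describes, the example below normalizes one API source and one ERP file export into a common record shape before loading; the endpoint, file path, and field names are placeholders.

```python
# Sketch of multi-source ingestion: one REST API source and one ERP CSV export
# normalized into a common record shape. All names and URLs are hypothetical.
import csv
import requests

def fetch_api_orders(url: str) -> list[dict]:
    """Pull orders from a (hypothetical) REST endpoint."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return [{"order_id": o["id"], "amount": float(o["amount"]), "source": "api"}
            for o in resp.json()]

def fetch_file_orders(path: str) -> list[dict]:
    """Pull orders from an ERP CSV export."""
    with open(path, newline="") as fh:
        return [{"order_id": row["order_id"], "amount": float(row["amount"]), "source": "erp_file"}
                for row in csv.DictReader(fh)]

def ingest() -> list[dict]:
    records = fetch_api_orders("https://example.com/api/orders")
    records += fetch_file_orders("/data/erp/orders.csv")
    # Downstream, these rows would be bulk-loaded into Snowflake staging tables.
    return records
```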

QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. They help customers in various industries rapidly adapt to change and innovate for competitive advantage.