Source Job

Americas EMEA

  • Build and scale data services by designing, developing, and maintaining scalable backend systems and APIs.
  • Collaborate on data architecture and models, partnering with engineering and analytics teams to optimize storage and processing workflows.
  • Contribute to standards, quality, and governance by building reliable, observable data systems with strong testing and validation.

Python SQL Databricks Spark TypeScript

20 jobs similar to Data Engineer

Jobs ranked by similarity.

$110,000–$140,000/yr
US

  • Design, build, and maintain scalable and reliable data pipelines.
  • Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
  • Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.

Curinos empowers financial institutions to make better, faster, and more profitable decisions through industry-leading proprietary data, technologies, and insights.

Europe Unlimited PTO

Design, implement, and maintain scalable ETL/ELT pipelines using Python, SQL, and modern orchestration frameworks. Build and optimize data models and schemas for cloud warehouses and relational databases, supporting AI and analytics workflows. Lead large-scale data initiatives from planning through execution, ensuring performance, cost efficiency, and reliability.

This position is posted by Jobgether on behalf of a partner company.

$215,000–$240,000/yr
US

Own the design, build, and optimization of end-to-end data pipelines. Establish and enforce best practices in data modeling, orchestration, and system reliability. Collaborate with stakeholders to translate requirements into robust, scalable data solutions.

YipitData is the leading market research and analytics firm for the disruptive economy and most recently raised $475M from The Carlyle Group at a valuation of over $1B.

$69,100–$107,400/yr

  • Assist in executing data engineering projects within the Customer Intelligence portfolio to meet defined timelines and deliverables.
  • Build and maintain ETL pipelines based on user and project specifications to enable reliable data movement.
  • Develop and update technical documentation for key systems and data assets.

Stryker is one of the world’s leading medical technology companies and, together with its customers, is driven to make healthcare better.

Brazil

Design, build, and maintain a robust, self-service, scalable, and secure data platform. Create and edit data pipelines, considering business logic, levels of aggregation, and data quality. Enable teams to access and use data effectively through self-service tools and well-modeled datasets.

We are Grupo QuintoAndar, the largest real estate ecosystem in Latin America, with a diversified portfolio of brands and solutions across different countries.

Latam

Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources and identify trends and patterns, maintaining reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.

Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions for over two decades.

$155,000–$185,000/yr
US Unlimited PTO

As a Senior Data Engineer, shape a scalable data platform that drives business insights. Design and maintain robust data pipelines and collaborate with cross-functional teams. Tackle complex data challenges, implement best practices, and mentor junior engineers.

Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.

$145,290–$185,000/yr
Unlimited PTO

  • Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large data.
  • Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

Unlimited PTO

  • Design, develop, and maintain reliable, scalable ETL/ELT pipelines across ingestion, transformation, storage, and consumption layers.
  • Build and manage data models and transformations using dbt with strong testing and documentation practices.
  • Contribute to architectural decisions, technology selection, and data engineering standards.

Vitable is a health benefits platform making healthcare better for employers of everyday workers.

  • Architect and implement scalable Lakehouse solutions using Delta Tables and Delta Live Tables.
  • Design and orchestrate complex data workflows using Databricks Workflows and Jobs.
  • Develop production-grade Python and PySpark code, including custom Python libraries.

Coderio designs and delivers scalable digital solutions for global businesses with a strong technical foundation and a product mindset.

$200,000–$225,000/yr
US Canada

  • Develop data models and pipelines for customer-facing applications, research, reporting, and machine learning.
  • Optimize data models to support efficient data storage and retrieval processes for performance and scalability.
  • Optimize ETL processes for ingesting, processing, and transforming large volumes of structured and unstructured data into our data ecosystem.

Inspiren offers the most complete and connected ecosystem in senior living bringing peace of mind to residents, families, and staff.

$130,000–$176,000/yr
US Unlimited PTO

  • Design, develop, and implement end-to-end data pipelines to support data collection and transformation.
  • Lead the architecture and development of scalable and maintainable data solutions.
  • Collaborate with data scientists and analysts to provide clean and accessible data.

DexCare optimizes time in healthcare, streamlining patient access, reducing waits, and enhancing overall experiences.

US

  • Design and maintain data models that organize rich content into canonical structures optimized for product features, search, and retrieval.
  • Build high-reliability ETLs and streaming pipelines to process usage events, analytics data, behavioral signals, and application logs.
  • Develop data services that expose unified content to the application, such as metadata access APIs, indexing workflows, and retrieval-ready representations.

Udio's success hinges on hiring great people and creating an environment where we can be happy, feel challenged, and do our best work.

$115,000–$160,000/yr
US

As a key member of our Data Engineering team, you will: Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling that support business initiatives. Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture. Work with the team to meet performance, scalability, and reliability goals.

PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.

$180,000–$200,000/yr
US Canada

  • Collaborate with engineering, data science, ML, data engineering, and product analytics teams to understand and shape the future needs of our data platform and infrastructure.
  • Define, drive, and implement the future live ingestion layer of data into our data platform (e.g. Kafka, Kinesis).
  • Define and evolve standards for storage, compute, data management, provenance, and orchestration.

Inspiren offers the most complete and connected ecosystem in senior living.

$99,000–$185,800/yr
US

  • Build and monitor Cribl’s core data tech stack including data pipelines and data warehouse.
  • Develop cloud-native services and infrastructure that power scalable and reliable data systems.
  • Support Cribl’s growing data science and agentic initiatives by preparing model-ready datasets.

Cribl is a company that provides a data engine for IT and Security for various industries.

Mexico

  • Design, build, and maintain highly scalable, reliable, and efficient ETL/ELT pipelines.
  • Ingest data from a multitude of sources and transform raw data into clean, structured, and AI/ML-ready formats.
  • Work closely with data scientists, machine learning engineers, and business analysts to understand their data needs.

Valtech exists to unlock a better way to experience the world by blending crafts, categories, and cultures, helping brands create new value in an increasingly digital world.

$120,000–$138,000/yr
US 3w PTO

  • Collect, extract, transform, and correlate business data across the Subsplash platform.
  • Administer and tune data systems to optimize for performance.
  • Work with data warehousing/data lake environments to provide data marts for business analysis and intelligence.

Subsplash is an award-winning team that builds The Ultimate Engagement Platform™ for churches, Christian ministries, non-profits, and businesses around the world.

$155,000–$180,000/yr
US

  • Design, build, and maintain robust and scalable data pipelines from diverse sources.
  • Leverage expert-level experience with dbt and Snowflake to structure, transform, and organize data.
  • Collaborate with engineering, product, and analytics teams to deliver data solutions that drive business value.

Topstep offers an engaging working environment, ranging from fully remote to hybrid, and fosters a culture of collaboration.

Europe Unlimited PTO

  • Design and implement data cataloging infrastructure to make datasets discoverable and connected.
  • Develop consistent data access patterns and libraries for engineers and data scientists.
  • Collaborate with Data Engineers to ensure platform infrastructure complements data pipelines.

Jobgether helps people find jobs using an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly.