Source Job

EMEA Asia

  • Design, build, and scale performant data pipelines and infrastructure, primarily using ClickHouse, Python, and dbt.
  • Build systems that handle large-scale streaming and batch data, with a strong emphasis on correctness and operational stability (see the sketch below).
  • Own the end-to-end lifecycle of data pipelines, from raw ingestion to clean, well-defined datasets consumed by downstream teams.

Python SQL dbt
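
The source role above emphasizes correctness for large-scale batch and streaming loads into ClickHouse. As a rough, hedged illustration of that idea, the sketch below shows an idempotent batch-load step driven by a high-watermark; sqlite3 stands in for the warehouse purely so the snippet runs anywhere, and the events table, its columns, and the batch contents are all hypothetical.

```python
import sqlite3
from datetime import datetime, timezone

# Illustrative only: sqlite3 stands in for the warehouse (the role names
# ClickHouse) so the snippet runs anywhere; table and column names are
# hypothetical, not taken from the posting.

def load_new_events(conn: sqlite3.Connection, raw_rows: list[dict]) -> int:
    """Insert only rows newer than the last loaded timestamp (the high-watermark)."""
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS events (event_id TEXT PRIMARY KEY, ts TEXT, payload TEXT)"
    )
    cur.execute("SELECT COALESCE(MAX(ts), '') FROM events")
    watermark = cur.fetchone()[0]

    # Keep only rows past the watermark; INSERT OR IGNORE keeps reruns
    # idempotent if the same batch is delivered twice.
    fresh = [r for r in raw_rows if r["ts"] > watermark]
    cur.executemany(
        "INSERT OR IGNORE INTO events (event_id, ts, payload) VALUES (:event_id, :ts, :payload)",
        fresh,
    )
    conn.commit()
    return len(fresh)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    batch = [{"event_id": "a1", "ts": datetime.now(timezone.utc).isoformat(), "payload": "{}"}]
    print(load_new_events(conn, batch))  # 1 on the first run, 0 if the same batch is replayed
```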

20 jobs similar to Senior Data Engineer (Integrations)

Jobs ranked by similarity.

Indonesia

  • Design, build, and scale high-performance data pipelines and infrastructure using technologies such as ClickHouse, Python, and dbt
  • Own the full lifecycle of data pipelines, from raw ingestion through transformation to clean, well-defined datasets
  • Collaborate with downstream data consumers to define clear dataset contracts, schemas, and usage patterns

Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company.

$155,000–$180,000/yr
US

  • Design, build, and maintain robust and scalable data pipelines from diverse sources.
  • Leverage expert-level experience with dbt and Snowflake to structure, transform, and organize data.
  • Collaborate with engineering, product, and analytics teams to deliver data solutions that drive business value.

Topstep offers an engaging work environment, ranging from fully remote to hybrid, and fosters a culture of collaboration.

$110,000–$140,000/yr
US

  • Design, build, and maintain scalable and reliable data pipelines.
  • Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
  • Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.

Curinos empowers financial institutions to make better, faster and more profitable decisions through industry-leading proprietary data, technologies and insights.

$155,000–$185,000/yr
US Unlimited PTO

As a Senior Data Engineer, shape a scalable data platform that drives business insights. Design and maintain robust data pipelines and collaborate with cross-functional teams. Tackle complex data challenges, implement best practices, and mentor junior engineers.

Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.

$0–$200,000/yr
North America Latin America

  • Architect and maintain robust data pipelines to transform diverse data inputs.
  • Integrate data from various sources into a unified platform.
  • Build APIs with AI assistance to enable secure access to consolidated insights.

Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.

$180,000–$190,000/yr
US

Build end-to-end data solutions that include ingest, logging, validation, cleaning, transformation, and security. Lead the design, development, and delivery of scalable data pipelines and ETL processes. Design and evolve robust data models and storage patterns that support analytics and efficiency use cases.

Founded in 1997, Expression provides data fusion, data analytics, AI/ML, software engineering, information technology, and electromagnetic spectrum management solutions.
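
To ground the ingest/validation/cleaning/transformation language in the description above, here is a minimal Python sketch of a validate-and-clean step with logging. The field names, parsing rules, and sample records are hypothetical placeholders, not anything taken from the posting.

```python
import logging
from datetime import datetime

# Minimal validate-and-clean sketch; all fields and rules are hypothetical.
logging.basicConfig(level=logging.INFO)
log = logging.getLogger("clean_records")

def clean_record(raw: dict) -> dict | None:
    """Validate and normalize one raw record; log and drop it if it is unusable."""
    try:
        return {
            "user_id": str(raw["user_id"]).strip(),
            "amount": round(float(raw["amount"]), 2),
            "event_date": datetime.strptime(raw["event_date"], "%Y-%m-%d").date().isoformat(),
        }
    except (KeyError, ValueError) as exc:
        log.warning("dropping bad record %r: %s", raw, exc)
        return None

def transform(batch: list[dict]) -> list[dict]:
    """Keep only records that pass validation."""
    return [r for r in (clean_record(x) for x in batch) if r is not None]

if __name__ == "__main__":
    raw = [
        {"user_id": 7, "amount": "19.991", "event_date": "2024-05-01"},
        {"user_id": 8, "amount": "oops", "event_date": "2024-05-01"},
    ]
    print(transform(raw))  # one clean record; the malformed one is logged and dropped
```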

Latam

Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources and identify trends and patterns, maintaining reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.

Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions for over two decades.

$115,000–$160,000/yr
US

As a key member of our Data Engineering team, you will:

  • Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives.
  • Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture.
  • Collaborate with the team to meet performance, scalability, and reliability goals.

PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.

Europe Unlimited PTO

Design, implement, and maintain scalable ETL/ELT pipelines using Python, SQL, and modern orchestration frameworks. Build and optimize data models and schemas for cloud warehouses and relational databases, supporting AI and analytics workflows. Lead large-scale data initiatives from planning through execution, ensuring performance, cost efficiency, and reliability.

This position is posted by Jobgether on behalf of a partner company.

US

  • Design and maintain data models that organize rich content into canonical structures optimized for product features, search, and retrieval.
  • Build high-reliability ETLs and streaming pipelines to process usage events, analytics data, behavioral signals, and application logs.
  • Develop data services that expose unified content to the application, such as metadata access APIs, indexing workflows, and retrieval-ready representations.

Udio's success hinges on hiring great people and creating an environment where we can be happy, feel challenged, and do our best work.

$171,900–$216,400/yr
US Canada

  • Drive data engineering work across project phases, including discovery, design, build, test, deploy, and ongoing improvement.
  • Design and build scalable data pipelines using Microsoft Fabric (lakehouses, warehouses, pipelines, dataflows, notebooks).
  • Cleanse, model, and transform raw data to support analytics, reporting, semantic modeling, and governance needs.

Stoneridge Software helps clients succeed in implementing business software solutions. As a 2025 Top Workplace Honoree and Microsoft Solutions Partner, they take a meticulous approach to project delivery and empower clients' success with long-term support.

Brazil

  • Design, develop, and maintain data pipelines (batch and streaming) for ingesting, transforming, and delivering data for analytics and application consumption.
  • Build and evolve analytical data models (bronze/silver/gold layers, data marts, star schemas, wide tables), ensuring consistency, documentation, and reuse.
  • Implement data quality best practices (tests, validations, contracts, SLAs/SLOs, and freshness/completeness/accuracy monitoring) and help resolve incidents with root cause analysis (RCA); see the sketch below.

CI&T specializes in technological transformation, combining human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters around the world, they have partnered with more than 1,000 clients during their 30 years of history.
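
The last bullet above mentions monitoring freshness, completeness, and accuracy. The sketch below shows, in plain Python, one hedged way such checks might look; the thresholds, field names, and in-memory sample rows are assumptions for illustration, not CI&T's actual tooling.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness/completeness checks over in-memory rows.

def check_freshness(rows: list[dict], ts_field: str, max_lag: timedelta) -> bool:
    """Pass if the newest row is no older than max_lag."""
    if not rows:
        return False
    newest = max(datetime.fromisoformat(r[ts_field]) for r in rows)
    return datetime.now(timezone.utc) - newest <= max_lag

def check_completeness(rows: list[dict], required: list[str], min_ratio: float = 0.99) -> bool:
    """Pass if at least min_ratio of rows have every required field populated."""
    if not rows:
        return False
    complete = sum(all(r.get(f) not in (None, "") for f in required) for r in rows)
    return complete / len(rows) >= min_ratio

if __name__ == "__main__":
    sample = [{"id": "1", "loaded_at": datetime.now(timezone.utc).isoformat()}]
    print(check_freshness(sample, "loaded_at", timedelta(hours=1)))  # True
    print(check_completeness(sample, ["id", "loaded_at"]))           # True
```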

  • Lead complex product pipeline builds: Design and develop scalable data pipelines supporting cross-domain product launches in Payments, Credit, and Subscriptions, from scoping through delivery.
  • Collaborate cross-functionally: Partner with Product Analysts and Product Managers to define data requirements, while analysts continue owning pipelines alongside you.
  • Build product reporting infrastructure: Create reliable, performant marts and models that power product dashboards and self-service analytics.

KOHO aims to make financial services better for every Canadian, with no hidden fees or predatory interest rates, designing financial products to help users spend smart, save more, and build real wealth. They are a performance organization that cares deeply about outcomes.

Looking for young talent ready to go all in. Offering significant equity to people who want to build something that matters and define the future of AI in influencer marketing.

Influur is redefining how advertising works through creators, data, and AI, aiming to make influencer marketing as measurable, predictable, and scalable as paid ads.

US Unlimited PTO

  • Partner with clients and implementation teams to understand data distribution requirements.
  • Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
  • Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.

Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, their bold, curious, and collaborative team is tackling big challenges in an industry that's ready for change.

  • Design, build, and maintain cloud-native data infrastructure using Terraform for IaC.
  • Develop and optimize data pipelines leveraging AWS services and Snowflake.
  • Build and maintain LLM frameworks, ensuring high-quality and cost-effective outputs.

ClickUp is building the first truly converged AI workspace, unifying tasks, docs, chat, calendar, and enterprise search, all supercharged by context-driven AI.

  • Architect and implement scalable Lakehouse solutions using Delta Tables and Delta Live Tables.
  • Design and orchestrate complex data workflows using Databricks Workflows and Jobs.
  • Develop production-grade Python and PySpark code, including custom Python libraries (see the sketch below).

Coderio designs and delivers scalable digital solutions for global businesses with a strong technical foundation and a product mindset.
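
Because the Coderio role above calls for production-grade PySpark, here is a small, hedged sketch of a routine aggregation. The table, columns, and local Parquet output are hypothetical; in the role itself the job would presumably write Delta tables orchestrated by Databricks Workflows rather than local files.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical daily-orders rollup; runs locally with pyspark installed.
spark = SparkSession.builder.appName("daily_orders_sketch").getOrCreate()

orders = spark.createDataFrame(
    [("o1", "2024-05-01", 120.0), ("o2", "2024-05-01", 80.0)],
    ["order_id", "order_date", "amount"],
)

# Aggregate raw orders into one row per day with counts and revenue.
daily = (
    orders
    .groupBy("order_date")
    .agg(
        F.count("order_id").alias("order_count"),
        F.sum("amount").alias("revenue"),
    )
)

# Local Parquet stands in for the Delta table a real pipeline would target.
daily.write.mode("overwrite").parquet("/tmp/daily_orders_sketch")
spark.stop()
```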

Unlimited PTO

  • Design, develop, and maintain reliable, scalable ETL/ELT pipelines across ingestion, transformation, storage, and consumption layers.
  • Build and manage data models and transformations using dbt with strong testing and documentation practices.
  • Contribute to architectural decisions, technology selection, and data engineering standards.

Vitable is a health benefits platform making healthcare better for employers of everyday workers.

$215,000–$240,000/yr
US

Own the design, build, and optimization of end-to-end data pipelines. Establish and enforce best practices in data modeling, orchestration, and system reliability. Collaborate with stakeholders to translate requirements into robust, scalable data solutions.

YipitData is the leading market research and analytics firm for the disruptive economy and most recently raised $475M from The Carlyle Group at a valuation of over $1B.
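
As a loose illustration of the "system reliability" best practices mentioned above, the sketch below retries a flaky pipeline task with exponential backoff and jitter. The task, attempt count, and delays are invented for the example and are not YipitData's actual approach.

```python
import logging
import random
import time
from typing import Callable, TypeVar

# Hypothetical retry helper for a flaky pipeline step.
logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

T = TypeVar("T")

def run_with_retries(task: Callable[[], T], attempts: int = 3, base_delay: float = 1.0) -> T:
    """Run task, retrying on exception with exponential backoff plus jitter."""
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception as exc:
            if attempt == attempts:
                raise
            delay = base_delay * 2 ** (attempt - 1) + random.uniform(0, 0.5)
            log.warning("attempt %d failed (%s); retrying in %.1fs", attempt, exc, delay)
            time.sleep(delay)
    raise RuntimeError("unreachable")

if __name__ == "__main__":
    outcomes = iter([RuntimeError("transient"), "ok"])

    def task() -> str:
        item = next(outcomes)
        if isinstance(item, Exception):
            raise item
        return item

    print(run_with_retries(task))  # logs one retry, then prints "ok"
```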

$99,000–$185,800/yr
US

  • Build and monitor Cribl’s core data tech stack including data pipelines and data warehouse.
  • Develop cloud-native services and infrastructure that power scalable and reliable data systems.
  • Support Cribl’s growing data science and agentic initiatives by preparing model-ready datasets.

Cribl provides a data engine for IT and Security across various industries.