Design, build, and scale performant data pipelines and infrastructure, primarily using ClickHouse, Python, and dbt.
Build systems that handle large-scale streaming and batch data, with a strong emphasis on correctness and operational stability.
Own the end-to-end lifecycle of data pipelines, from raw ingestion to clean, well-defined datasets consumed by downstream teams.
Nansen is a leading blockchain analytics platform that empowers investors and professionals with real-time, actionable insights derived from on-chain data. We’re building the world’s best blockchain analytics platform, and data is at the heart of everything we do.
Design, build, and maintain scalable, high-quality data pipelines.
Implement robust data ingestion, transformation, and storage using cloud-based technologies.
Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they partner with more than 1,000 clients and foster a diverse, inclusive, and safe work environment.
Design and implement scalable, high-performance data architectures to support business needs.
Develop, automate, and maintain production-grade data pipelines using modern data stack tools and best practices.
Optimize data workflows and implement observability frameworks to monitor pipeline performance, reliability, and accuracy.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources and identify trends and patterns, maintaining reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions for over two decades.
As a Senior Data Engineer, shape a scalable data platform that drives business insights. Design and maintain robust data pipelines and collaborate with cross-functional teams. Tackle complex data challenges, implement best practices, and mentor junior engineers.
As a key member of our Data Engineering team, you will: Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives. Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture. Collaborate with the team to meet performance, scalability, and reliability goals.
PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.
Design, build, and maintain scalable and reliable data pipelines.
Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.
Curinos empowers financial institutions to make better, faster, and more profitable decisions through industry-leading proprietary data, technologies, and insights.
The Sr Data Engineer, DevX creates the best developer experience for data and application engineers at Basis. They design, implement, and maintain deployment and ETL pipelines for data products, and integrate diverse data sources and vendor products, including databases, APIs, and third-party services.
Basis Technologies empowers agencies and brands with cutting-edge software that automates digital media operations, offering flexible work options across the U.S.
Architect and maintain robust data pipelines to transform diverse data inputs.
Integrate data from various sources into a unified platform.
Build APIs with AI assistance to enable secure access to consolidated insights.
Abusix is committed to making the internet a safer place. Their globally distributed team spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
Design, develop, and maintain scalable data pipelines using Snowflake and dbt.
Write and optimize advanced SQL queries for performance and reliability.
Implement ETL/ELT processes to ingest and transform data from multiple sources.
Nagarro is a digital product engineering company that is scaling in a big way and builds products, services, and experiences that inspire, excite, and delight.
Drive data engineering work across project phases, including discovery, design, build, test, deploy, and ongoing improvement.
Design and build scalable data pipelines using Microsoft Fabric (lakehouses, warehouses, pipelines, dataflows, notebooks).
Cleanse, model, and transform raw data to support analytics, reporting, semantic modeling, and governance needs.
Stoneridge Software helps clients succeed in implementing business software solutions. As a 2025 Top Workplace Honoree and Microsoft Solutions Partner, they take a meticulous approach to project delivery and empower clients' success with long-term support.
Responsible for designing, building, and maintaining scalable data pipelines and warehouse architectures. Integrate, transform, and manage high-volume datasets across multiple platforms. Focus on ensuring data quality, performance, and security while driving innovation through the adoption of modern tools and technologies.
This position is posted by Jobgether on behalf of a partner company.
Serve as a core contributor, owning and maintaining critical parts of ClickHouse's data engineering ecosystem.
Craft tools that enable data engineers to harness ClickHouse's incredible speed and scale.
Build the foundation that thousands of data engineers rely on for their most critical data workloads.
ClickHouse leads the market in real-time analytics, data warehousing, observability, and AI workloads and is recognized on the 2025 Forbes Cloud 100 list.
Design, develop, and maintain data pipelines (batch and streaming) for ingesting, transforming, and serving data for analytics and application consumption.
Build and evolve analytical data models (bronze/silver/gold layers, data marts, star schemas, wide tables), ensuring consistency, documentation, and reuse.
Implement data quality best practices (tests, validations, contracts, SLAs/SLOs, monitoring of freshness/completeness/accuracy) and resolve incidents with root cause analysis (RCA).
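The quality practices listed above (tests, contracts, freshness monitoring) map closely onto dbt's built-in testing features, which several of these roles use. As a minimal sketch, with hypothetical source and model names invented purely for illustration, a dbt `schema.yml` might look like:

```yaml
# schema.yml — hypothetical names (raw_events, fct_transactions) for illustration only
version: 2

sources:
  - name: raw_events                 # assumed raw-layer source
    loaded_at_field: ingested_at     # column dbt uses to compute freshness
    freshness:                       # SLA-style freshness monitoring
      warn_after: {count: 2, period: hour}
      error_after: {count: 12, period: hour}
    tables:
      - name: transactions

models:
  - name: fct_transactions           # hypothetical gold-layer model
    columns:
      - name: transaction_id
        tests:
          - unique                   # completeness/accuracy checks
          - not_null
      - name: amount_usd
        tests:
          - not_null
```

Running `dbt test` and `dbt source freshness` (for example, in CI or on a schedule) would then enforce these checks and surface failures for incident triage.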
CI&T specializes in technological transformation, combining human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters around the world, they have partnered with more than 1,000 clients over their 30-year history.
Design, build, and maintain cloud-native data infrastructure using Terraform for IaC.
Develop and optimize data pipelines leveraging AWS services and Snowflake.
Build and maintain LLM frameworks, ensuring high-quality and cost-effective outputs.
ClickUp is building the first truly converged AI workspace, unifying tasks, docs, chat, calendar, and enterprise search, all supercharged by context-driven AI.
Optimize SQL queries to maximize system performance.
RefinedScience is dedicated to delivering high-quality emerging tech solutions. While the job description does not contain company size or culture information, the role seems to value innovation and collaboration.
Develop and maintain scalable data pipelines and ETL processes.
Design, build, and optimize data models and databases.
Perform data analysis, data mining, and statistical modeling.
We’re supporting a global fintech and digital currency platform in their search for a Senior Data Engineer to help scale and optimize their analytics and data infrastructure.
Design and develop scalable data pipelines and infrastructure to process large volumes of data efficiently.
Collaborate with cross-functional teams to ensure data integrity, accessibility, and usability.
Implement and maintain data quality measures throughout the data lifecycle
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, their culture values diverse identities and life experiences, fostering an inclusive and safe work environment.
Build and optimize Sauce's lakehouse architecture using Azure Databricks and Unity Catalog for data governance.
Create and maintain data quality tests and improve existing alerting setups.
Own the data warehouse by connecting data sources and maintaining the platform and architecture in coordination with R&D infrastructure and operations teams.
Sauce is a premier restaurant technology platform that helps businesses grow with its Commission-Free Delivery & Pickup structure and proprietary delivery optimization technology.