Source Job

Brazil

Design, build, and maintain a robust, self-service, scalable, and secure data platform. Create and edit data pipelines, considering business logic, levels of aggregation, and data quality. Enable teams to access and use data effectively through self-service tools and well-modeled datasets.

Python SQL Spark AWS Data Modeling

20 jobs similar to Data Engineer

Jobs ranked by similarity.

Latam

Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources and identify trends and patterns, maintaining reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.

Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions for over two decades.

South America

  • Design, develop, and maintain scalable and robust data pipelines.
  • Create solutions for data ingestion, transformation, and modeling using Databricks, Spark/PySpark, Cloudera, and Azure Data Factory (ADF).
  • Ensure the quality, integrity, and usability of data throughout the entire pipeline.

CI&T specializes in technological transformation, uniting human expertise with AI to create scalable tech solutions with a focus on Artificial Intelligence. With over 8,000 CI&Ters worldwide, they have partnered with over 1,000 clients during their 30-year history.

Brazil

  • Design, develop, and maintain data pipelines (batch and streaming) for ingesting, transforming, and serving data for analytics and application consumption.
  • Build and evolve analytical models (bronze/silver/gold layers, data marts, star schemas, wide tables), ensuring consistency, documentation, and reuse.
  • Implement data quality best practices (tests, validations, contracts, SLAs/SLOs, monitoring of freshness/completeness/accuracy) and resolve incidents with root cause analysis (RCA).

CI&T specializes in technological transformation, combining human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters around the world, they have partnered with more than 1,000 clients during their 30 years of history.

Americas EMEA

  • Build and scale data services by designing, developing, and maintaining scalable backend systems and APIs.
  • Collaborate on data architecture and models, partnering with engineering and analytics teams to optimize storage and processing workflows.
  • Contribute to standards, quality, and governance by building reliable, observable data systems with strong testing and validation.

Zapier builds and uses automation every day to make work more efficient, creative, and human.

$69,100–$107,400/yr

  • Assist in executing data engineering projects within the Customer Intelligence portfolio to meet defined timelines and deliverables.
  • Build and maintain ETL pipelines based on user and project specifications to enable reliable data movement.
  • Develop and update technical documentation for key systems and data assets.

Stryker is one of the world’s leading medical technology companies and, together with its customers, is driven to make healthcare better.

Brazil Canada US Latin America

  • Work alongside Caylent’s Architects, Engineering Managers, and Engineers to deliver AWS solutions.
  • Build solutions defined in project backlogs, writing production-ready, well-tested, and documented code across cloud environments.
  • Participate in Agile ceremonies such as daily standups, sprint planning, retrospectives, and demos.

Caylent is a cloud native services company that helps organizations bring the best out of their people and technology using Amazon Web Services (AWS). A global, fully remote company with employees in Canada, the United States, and Latin America, Caylent fosters a community of technological curiosity.

$110,000–$140,000/yr
US

  • Design, build, and maintain scalable and reliable data pipelines.
  • Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
  • Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.

Curinos empowers financial institutions to make better, faster, and more profitable decisions through industry-leading proprietary data, technologies, and insights.

Latin America

Build robust data pipelines at scale. Design and implement data schemas. Collaborate with Analytics/Data Science team to structure and house data.

Goods & Services is a product design and engineering company that solves mission-critical challenges for some of the world’s largest enterprises.

Europe Unlimited PTO

  • Design and implement data cataloging infrastructure to make datasets discoverable and connected.
  • Develop consistent data access patterns and libraries for engineers and data scientists.
  • Collaborate with Data Engineers to ensure platform infrastructure complements data pipelines.

Jobgether helps people find jobs, using an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly.

$130,000–$176,000/yr
US Unlimited PTO

  • Design, develop, and implement end-to-end data pipelines to support data collection and transformation.
  • Lead the architecture and development of scalable and maintainable data solutions.
  • Collaborate with data scientists and analysts to provide clean and accessible data.

DexCare optimizes time in healthcare, streamlining patient access, reducing waits, and enhancing overall experiences.

Europe Unlimited PTO

Design, implement, and maintain scalable ETL/ELT pipelines using Python, SQL, and modern orchestration frameworks. Build and optimize data models and schemas for cloud warehouses and relational databases, supporting AI and analytics workflows. Lead large-scale data initiatives from planning through execution, ensuring performance, cost efficiency, and reliability.

This position is posted by Jobgether on behalf of a partner company.

US Europe

  • Design and develop scalable data pipelines and infrastructure to process large volumes of data efficiently
  • Collaborate with cross-functional teams to ensure data integrity, accessibility, and usability
  • Implement and maintain data quality measures throughout the data lifecycle

CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have a culture that values diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment.

$155,000–$185,000/yr
US Unlimited PTO

As a Senior Data Engineer, shape a scalable data platform that drives business insights. Design and maintain robust data pipelines and collaborate with cross-functional teams. Tackle complex data challenges, implement best practices, and mentor junior engineers.

Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.

$99,000–$185,800/yr
US

  • Build and monitor Cribl’s core data tech stack including data pipelines and data warehouse.
  • Develop cloud-native services and infrastructure that power scalable and reliable data systems.
  • Support Cribl’s growing data science and agentic initiatives by preparing model-ready datasets.

Cribl provides a data engine for IT and Security teams across a range of industries.

$155,000–$180,000/yr
US

  • Design, build, and maintain robust and scalable data pipelines from diverse sources.
  • Leverage expert-level experience with dbt and Snowflake to structure, transform, and organize data.
  • Collaborate with engineering, product, and analytics teams to deliver data solutions that drive business value.

Topstep offers an engaging work environment, ranging from fully remote to hybrid, and fosters a culture of collaboration.

$215,000–$240,000/yr
US

Own the design, build, and optimization of end-to-end data pipelines. Establish and enforce best practices in data modeling, orchestration, and system reliability. Collaborate with stakeholders to translate requirements into robust, scalable data solutions.

YipitData is the leading market research and analytics firm for the disruptive economy and most recently raised $475M from The Carlyle Group at a valuation of over $1B.

US

  • Build, manage, and operationalize data pipelines for marketing use cases.
  • Develop a comprehensive understanding of customer and marketing data requirements.
  • Transform large data sets into targeted customer audiences for personalized experiences.

Jobgether uses an AI-powered matching process to ensure each application is reviewed quickly, objectively, and fairly against the role's core requirements. Its system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

Europe

Responsible for designing, building, and maintaining scalable data pipelines and warehouse architectures. Integrate, transform, and manage high-volume datasets across multiple platforms. Focus on ensuring data quality, performance, and security while driving innovation through the adoption of modern tools and technologies.

This position is posted by Jobgether on behalf of a partner company.

  • Architect and implement scalable Lakehouse solutions using Delta Tables and Delta Live Tables.
  • Design and orchestrate complex data workflows using Databricks Workflows and Jobs.
  • Develop production-grade Python and PySpark code, including custom Python libraries.

Coderio designs and delivers scalable digital solutions for global businesses with a strong technical foundation and a product mindset.

$5,000–$5,830/mo
Global

  • Develop and maintain scalable data pipelines and ETL processes.
  • Design, build, and optimize data models and databases.
  • Perform data analysis, data mining, and statistical modeling.

We’re supporting a global fintech and digital currency platform in their search for a Senior Data Engineer to help scale and optimize their analytics and data infrastructure.