Build robust data pipelines at scale. Design and implement data schemas. Collaborate with Analytics/Data Science team to structure and house data.
Source Job
20 jobs similar to Principal Data Engineer LATAM
Jobs ranked by similarity.
Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources and identify trends and patterns, maintaining reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions for over two decades.
- Work alongside Caylent’s Architects, Engineering Managers, and Engineers to deliver AWS solutions.
- Build solutions defined in project backlogs, writing production-ready, well-tested, and documented code across cloud environments.
- Participate in Agile ceremonies such as daily standups, sprint planning, retrospectives, and demos.
Caylent is a cloud-native services company that helps organizations bring the best out of their people and technology using Amazon Web Services (AWS). They are a global, fully remote company with employees in Canada, the United States, and Latin America, fostering a community of technological curiosity.
- Design, build, and maintain high-volume ETL pipelines and ingestion systems.
- Architect reusable data integrations to replace legacy, bucket-based custom reporting.
- Deliver BI-ready datasets consumed by tools such as Snowflake, Tableau, Looker, or internal platforms.
Ubiminds partners with American software product companies to scale their development footprint by offering staff augmentation and employer-of-record services.
- Design, develop, and implement end-to-end data pipelines to support data collection and transformation.
- Lead the architecture and development of scalable and maintainable data solutions.
- Collaborate with data scientists and analysts to provide clean and accessible data.
DexCare optimizes time in healthcare, streamlining patient access, reducing waits, and enhancing overall experiences.
- Design and implement end-to-end data integrations into analytical and operational data stores.
- Evaluate and recommend tooling, frameworks, and platforms for ingestion, transformation, and orchestration.
- Implement monitoring, alerting, and observability including metrics, logging, lineage, and data-quality controls.
Blue Coding specializes in hiring excellent developers and amazing people from all over Latin America and other parts of the world.
- Develop and maintain scalable data pipelines and ETL processes.
- Design, build, and optimize data models and databases.
- Perform data analysis, data mining, and statistical modeling.
We’re supporting a global fintech and digital currency platform in their search for a Senior Data Engineer to help scale and optimize their analytics and data infrastructure.
- Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
- Develop and optimize data models in Snowflake or similar platforms.
- Implement ETL/ELT processes using Python and modern data tools.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates and share this shortlist directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by the hiring company's internal team.
- Design, build, and optimize data pipelines to centralize data in a modern warehouse (PostHog).
- Automate ETL processes and existing spreadsheet-based reports.
- Work closely with finance and business stakeholders to understand ad hoc reporting needs and deliver efficient solutions.
Katapult is a nearshore software development agency that combines the best talent in LATAM with world-class execution, leadership experience, and an AI-first approach to product engineering. Katapult works with PMF+ startups and businesses in the United States using a team-augmentation model.
As a key member of our Data Engineering team, you will:
- Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives.
- Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture.
- Collaborate with the team to meet performance, scalability, and reliability goals.
PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.
- Partner with clients and implementation teams to understand data distribution requirements.
- Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
- Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, their bold, curious, and collaborative team is tackling big challenges in an industry that's ready for change.
Design, build, and maintain a robust, self-service, scalable, and secure data platform. Create and edit data pipelines, considering business logic, levels of aggregation, and data quality. Enable teams to access and use data effectively through self-service tools and well-modeled datasets.
We are Grupo QuintoAndar, the largest real estate ecosystem in Latin America, with a diversified portfolio of brands and solutions across different countries.
- Design, develop, and maintain scalable data pipelines and data warehouses.
- Develop ETL/ELT processes using Python and modern data tools.
- Ensure data quality, reliability, and performance across systems.
3Pillar Global is dedicated to engineering solutions that challenge conventional norms. They are an elite team of visionaries that actively shapes the tech landscape for their clients and sets global standards along the way.
- Build and monitor Cribl’s core data tech stack including data pipelines and data warehouse.
- Develop cloud-native services and infrastructure that power scalable and reliable data systems.
- Support Cribl’s growing data science and agentic initiatives by preparing model-ready datasets.
Cribl is a company that provides a data engine for IT and Security for various industries.
- Design, develop, and maintain robust data processes and solutions.
- Develop and maintain data models, databases, and data warehouses.
- Collaborate with stakeholders to gather requirements and provide data solutions.
Highmark Health is a national, blended health organization that includes one of America’s largest Blue Cross Blue Shield insurers.
- Lead and mentor a team of data engineers, providing feedback and support for career growth.
- Define and execute the long-term vision for Donorbox's data engineering infrastructure, focusing on scalability.
- Maintain, optimize, and scale the Postgres data warehouse, serving as the source of truth for analytics and reporting.
Donorbox is a leading fundraising platform and donor management system for nonprofit organizations that helps nonprofits become highly effective at raising funds.
- Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
- Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large data.
- Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Design and build scalable data pipelines to ensure seamless data flow from multiple sources. Automate data collection, transformation, and delivery processes to support real-time and batch processing requirements. Work with stakeholders to define and enforce data governance policies and standards.
Goods & Services is looking for a Data Governance Engineer to design, build, and maintain the data collection, storage, and analysis infrastructure.
- Design, develop, and maintain scalable data pipelines using Snowflake and dbt.
- Write and optimize advanced SQL queries for performance and reliability.
- Implement ETL/ELT processes to ingest and transform data from multiple sources.
Nagarro is a digital product engineering company that is scaling in a big way and builds products, services, and experiences that inspire, excite, and delight.
- Architect and maintain robust data pipelines to transform diverse data inputs.
- Integrate data from various sources into a unified platform.
- Build APIs with AI assistance to enable secure access to consolidated insights.
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
- Design and implement scalable, high-performance data architectures to support business needs.
- Develop, automate, and maintain production-grade data pipelines using modern data stack tools and best practices.
- Optimize data workflows and implement observability frameworks to monitor pipeline performance, reliability, and accuracy.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.