Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources to identify trends and patterns, and maintain reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.
Source Job
20 jobs similar to Data Engineer
Jobs ranked by similarity.
Build robust data pipelines at scale. Design and implement data schemas. Collaborate with the Analytics/Data Science team to structure and house data.
Goods & Services is a product design and engineering company that solves mission-critical challenges for some of the world’s largest enterprises.
- Design, build, and maintain scalable and reliable data pipelines.
- Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
- Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.
Curinos empowers financial institutions to make better, faster and more profitable decisions through industry-leading proprietary data, technologies and insights.
Design, build, and maintain a robust, self-service, scalable, and secure data platform. Create and edit data pipelines, considering business logic, levels of aggregation, and data quality. Enable teams to access and use data effectively through self-service tools and well-modeled datasets.
We are Grupo QuintoAndar, the largest real estate ecosystem in Latin America, with a diversified portfolio of brands and solutions across different countries.
- Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
- Develop and optimize data models in Snowflake or similar platforms.
- Implement ETL/ELT processes using Python and modern data tools.
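As a rough illustration of the "ETL/ELT processes using Python" bullet above, here is a minimal sketch that bulk-loads staged Parquet files into a Snowflake table with the snowflake-connector-python package. The account settings, stage, and table names are invented placeholders, not details from the posting.

```python
# Minimal ELT load sketch using snowflake-connector-python.
# All identifiers (account, warehouse, stage, table) are hypothetical placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="RAW",
    schema="SALES",
)
try:
    # Load newly staged files into the raw table; modeling happens later in SQL (ELT).
    conn.cursor().execute("""
        COPY INTO RAW.SALES.ORDERS
        FROM @RAW.SALES.ORDERS_STAGE
        FILE_FORMAT = (TYPE = 'PARQUET')
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
finally:
    conn.close()
```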
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates and share this shortlist directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by the hiring company's internal team.
- Partner with clients and implementation teams to understand data distribution requirements.
- Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
- Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, they're tackling big challenges in an industry that’s ready for change with a bold, curious, and collaborative team.
- Design, develop, and maintain scalable data pipelines and data warehouses.
- Develop ETL/ELT processes using Python and modern data tools.
- Ensure data quality, reliability, and performance across systems.
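To make the data-quality bullet above concrete, the following is a minimal sketch of rule-based checks on a pandas DataFrame; the column names, thresholds, and input file are illustrative assumptions, not requirements from the posting.

```python
# Simple rule-based data-quality checks on a pandas DataFrame.
# Column names ("order_id", "amount", "created_at") are hypothetical examples.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    failures = []
    if df["order_id"].duplicated().any():
        failures.append("order_id contains duplicates")
    if df["amount"].lt(0).any():
        failures.append("amount contains negative values")
    if df["created_at"].isna().mean() > 0.01:
        failures.append("created_at has more than 1% missing values")
    return failures

df = pd.read_parquet("orders.parquet")  # placeholder input
problems = run_quality_checks(df)
if problems:
    raise ValueError("Data quality checks failed: " + "; ".join(problems))
```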
3Pillar Global is dedicated to engineering solutions that challenge conventional norms. They are an elite team of visionaries who actively shape the tech landscape for their clients and set global standards along the way.
The Sr. Data Engineer, DevX creates the best developer experience for data and application engineers at Basis. They design, implement, and maintain deployment and ETL pipelines for data products, and integrate diverse data sources and vendor products, including databases, APIs, and third-party services.
Basis Technologies empowers agencies and brands with cutting-edge software that automates digital media operations, offering flexible work options across the U.S.
- Design, build, and maintain highly scalable, reliable, and efficient ETL/ELT pipelines.
- Ingest data from a multitude of sources and transform raw data into clean, structured, and AI/ML-ready formats.
- Work closely with data scientists, machine learning engineers, and business analysts to understand their data needs.
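One way the "raw data into clean, structured, and AI/ML-ready formats" bullet could look in practice is sketched below with pandas; the file paths and column names are hypothetical, chosen only to show the shape of the work.

```python
# Sketch: turn a raw CSV extract into a clean, ML-ready Parquet dataset.
# File paths and column names are illustrative assumptions.
import pandas as pd

raw = pd.read_csv("raw/events.csv")

clean = (
    raw.drop_duplicates(subset=["event_id"])
       .assign(event_time=lambda d: pd.to_datetime(d["event_time"], errors="coerce"))
       .dropna(subset=["event_time", "user_id"])
)

# One-hot encode a low-cardinality categorical so the output is model-ready.
features = pd.get_dummies(clean, columns=["channel"], prefix="channel")
features.to_parquet("curated/events.parquet", index=False)
```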
Valtech exists to unlock a better way to experience the world by blending crafts, categories, and cultures, helping brands unlock new value in an increasingly digital world.
Own the design, build, and optimization of end-to-end data pipelines. Establish and enforce best practices in data modeling, orchestration, and system reliability. Collaborate with stakeholders to translate requirements into robust, scalable data solutions.
YipitData is the leading market research and analytics firm for the disruptive economy and most recently raised $475M from The Carlyle Group at a valuation of over $1B.
As a key member of our Data Engineering team, you will:
- Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives.
- Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture.
- Collaborate with the team to meet performance, scalability, and reliability goals.
PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.
- Design, develop, and maintain scalable and robust data pipelines.
- Create solutions for data ingestion, transformation, and modeling using Databricks, Spark/PySpark, Cloudera, and Azure Data Factory (ADF).
- Ensure the quality, integrity, and usability of data throughout the entire pipeline.
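Since the bullets above explicitly name Spark/PySpark, here is a minimal PySpark ingestion-and-transformation sketch of the kind of pipeline described; the paths and column names are hypothetical, and Parquet stands in for whatever table format (for example Delta on Databricks) the team actually uses.

```python
# PySpark ingestion/transformation sketch (e.g., runnable on a Databricks cluster).
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

raw = (
    spark.read.option("header", "true")
         .csv("/mnt/landing/orders/*.csv")
)

orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
)

# Write a partitioned, analytics-ready copy.
orders.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/orders")
```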
CI&T specializes in technological transformation, uniting human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters worldwide, they have partnered with over 1,000 clients during their 30-year history, with a focus on Artificial Intelligence.
- Design, build, and maintain scalable, high-quality data pipelines.
- Implement robust data ingestion, transformation, and storage using cloud-based technologies.
- Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have partnerships with more than 1,000 clients and value diversity, fostering a diverse, inclusive, and safe work environment.
- Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
- Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large data.
- Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
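To illustrate the "AWS services to enable efficient data retrieval and analysis" bullet, here is a small boto3 sketch that lands a file in S3 and then triggers a Glue job to process it; the bucket, key, and job names are hypothetical placeholders rather than anything from the posting.

```python
# Sketch of a small AWS step: land a file in S3, then trigger a Glue job to process it.
# Bucket, key, and job names are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")
s3.upload_file("exports/fares_2024-01.parquet",
               "example-data-lake",
               "landing/fares/fares_2024-01.parquet")

glue = boto3.client("glue")
run = glue.start_job_run(
    JobName="transform-fares",                      # hypothetical Glue job
    Arguments={"--input_prefix": "landing/fares/"}, # passed to the job script
)
print("Started Glue run:", run["JobRunId"])
```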
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
- Design, build, and maintain robust and scalable data pipelines from diverse sources.
- Leverage expert-level experience with dbt and Snowflake to structure, transform, and organize data.
- Collaborate with engineering, product, and analytics teams to deliver data solutions that drive business value.
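Because the bullets above call out dbt and Snowflake specifically, a minimal sketch of driving a dbt build from Python follows; the model selector and target names are hypothetical, and shelling out to the dbt CLI is just one simple way to invoke it.

```python
# Sketch: orchestrate a dbt build (e.g., against Snowflake) from Python by
# shelling out to the dbt CLI. Selector and target names are hypothetical.
import subprocess

def dbt(*args: str) -> None:
    subprocess.run(["dbt", *args], check=True)

# Build staging and downstream models, then run their tests.
dbt("run", "--select", "staging+", "--target", "prod")
dbt("test", "--select", "staging+", "--target", "prod")
```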
Topstep offers an engaging work environment that ranges from fully remote to hybrid, and fosters a culture of collaboration.
- Work alongside Caylent’s Architects, Engineering Managers, and Engineers to deliver AWS solutions.
- Build solutions defined in project backlogs, writing production-ready, well-tested, and documented code across cloud environments.
- Participate in Agile ceremonies such as daily standups, sprint planning, retrospectives, and demos.
Caylent is a cloud native services company that helps organizations bring the best out of their people and technology using Amazon Web Services (AWS). They are a global, fully remote company with employees in Canada, the United States, and Latin America, fostering a community of technological curiosity.
- Design, develop, and maintain scalable data pipelines using Snowflake and dbt.
- Write and optimize advanced SQL queries for performance and reliability.
- Implement ETL/ELT processes to ingest and transform data from multiple sources.
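One plausible shape for the "pipelines using Snowflake and dbt" work described above is an orchestrated ingest-then-transform run; the Airflow DAG below is a hedged sketch (assuming Airflow 2.4+ for the `schedule` keyword), with the DAG id, task names, and dbt selector all invented for illustration.

```python
# Sketch: a daily ELT DAG that ingests source data and then runs dbt transformations.
# DAG id, task names, and the dbt selector are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def ingest_sources() -> None:
    # Placeholder: pull API extracts/files and land them in the warehouse's raw schema.
    pass

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_sources", python_callable=ingest_sources)
    transform = BashOperator(task_id="dbt_run", bash_command="dbt run --select marts+")
    ingest >> transform
```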
Nagarro is a digital product engineering company that is scaling in a big way and builds products, services, and experiences that inspire, excite, and delight.
- Migrate data and analytics workloads from BigQuery to Snowflake.
- Develop and optimize ETL/ELT pipelines using Python and SQL.
- Build analytics-ready datasets for reporting and dashboards.
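As a sketch of what one table's BigQuery-to-Snowflake migration path might look like in Python, the snippet below pulls a query result into pandas and bulk-loads it with write_pandas; the project, dataset, and table names are hypothetical, and this pattern only suits modestly sized tables (larger migrations would typically stage exported files instead).

```python
# Sketch: copy one BigQuery result set into Snowflake via pandas + write_pandas.
# Project, dataset, warehouse, and table names are hypothetical placeholders.
import os
from google.cloud import bigquery
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

bq = bigquery.Client(project="example-project")
df = bq.query("SELECT * FROM analytics.daily_revenue").to_dataframe()

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="MIGRATION_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    write_pandas(conn, df, "DAILY_REVENUE", auto_create_table=True)
finally:
    conn.close()
```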
Egen is a fast-growing and entrepreneurial company with a data-first mindset. They bring together the best engineering talent working with the most advanced technology platforms to help clients drive action and impact through data and insights.
- Design and develop scalable data pipelines and infrastructure to process large volumes of data efficiently.
- Collaborate with cross-functional teams to ensure data integrity, accessibility, and usability.
- Implement and maintain data quality measures throughout the data lifecycle.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have a culture that values diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment.
- Architect and maintain robust data pipelines to transform diverse data inputs.
- Integrate data from various sources into a unified platform.
- Build APIs with AI assistance to enable secure access to consolidated insights.
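To give a concrete, if simplified, picture of an API that "enables secure access to consolidated insights," here is a minimal FastAPI sketch with an API-key check; the endpoint path, header, key store, and sample payload are all hypothetical and stand in for whatever the actual platform exposes.

```python
# Minimal FastAPI sketch: an endpoint serving consolidated insights behind a
# simple API-key check. Endpoint, header, and data are hypothetical.
import os
from fastapi import Depends, FastAPI, Header, HTTPException

app = FastAPI()

def require_api_key(x_api_key: str = Header(...)) -> None:
    if x_api_key != os.environ.get("INSIGHTS_API_KEY"):
        raise HTTPException(status_code=401, detail="invalid API key")

@app.get("/insights/top-senders", dependencies=[Depends(require_api_key)])
def top_senders(limit: int = 10) -> list[dict]:
    # In a real service this would query the consolidated data platform.
    return [{"sender": "203.0.113.7", "reports": 42}][:limit]
```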
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
- Design, develop, and implement end-to-end data pipelines to support data collection and transformation.
- Lead the architecture and development of scalable and maintainable data solutions.
- Collaborate with data scientists and analysts to provide clean and accessible data.
DexCare optimizes time in healthcare, streamlining patient access, reducing waits, and enhancing overall experiences.