Partner with clients and implementation teams to understand data distribution requirements.
Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, they're tackling big challenges in an industry that’s ready for change with a bold, curious, and collaborative team.
Design and implement scalable, high-performance data architectures to support business needs.
Develop, automate, and maintain production-grade data pipelines using modern data stack tools and best practices.
Optimize data workflows and implement observability frameworks to monitor pipeline performance, reliability, and accuracy.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
Design and develop scalable data pipelines and infrastructure to process large volumes of data efficiently.
Collaborate with cross-functional teams to ensure data integrity, accessibility, and usability.
Implement and maintain data quality measures throughout the data lifecycle.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have a culture that values diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment.
Responsible for designing, building, and maintaining scalable data pipelines and warehouse architectures. Integrate, transform, and manage high-volume datasets across multiple platforms. Focus on ensuring data quality, performance, and security while driving innovation through the adoption of modern tools and technologies.
This position is posted by Jobgether on behalf of a partner company.
Optimize SQL queries to maximize system performance.
RefinedScience is dedicated to delivering high-quality emerging tech solutions, and the role emphasizes innovation and collaboration.
Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large datasets.
Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
As a key member of our Data Engineering team, you will: Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives. Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture. Collaborate with the team to meet performance, scalability, and reliability goals.
PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.
Design, build, and maintain scalable and reliable data pipelines.
Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.
Curinos empowers financial institutions to make better, faster, and more profitable decisions through industry-leading proprietary data, technologies, and insights.
Design, build, and scale data pipelines across a variety of source systems and streams.
Implement appropriate design patterns while optimizing performance, cost, security, scalability, and end-user experience.
Collaborate with cross-functional teams to understand data requirements and develop efficient data acquisition and integration strategies.
NBCUniversal is one of the world's leading media and entertainment companies creating world-class content across film, television, streaming and theme parks.
Develop and maintain scalable data pipelines and ETL processes.
Design, build, and optimize data models and databases.
Perform data analysis, data mining, and statistical modeling.
We’re supporting a global fintech and digital currency platform in their search for a Senior Data Engineer to help scale and optimize their analytics and data infrastructure.
Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
Develop and optimize data models in Snowflake or similar platforms.
Implement ETL/ELT processes using Python and modern data tools.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by the hiring company's internal team.
Design, develop, and maintain scalable data pipelines using Snowflake and dbt.
Write and optimize advanced SQL queries for performance and reliability.
Implement ETL/ELT processes to ingest and transform data from multiple sources.
Nagarro is a digital product engineering company that is scaling rapidly and builds products, services, and experiences that inspire, excite, and delight.
Build, manage, and operationalize data pipelines for marketing use cases.
Develop a comprehensive understanding of customer and marketing data requirements.
Transform large data sets into targeted customer audiences for personalized experiences.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Our system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Design, develop, and maintain scalable and robust data pipelines.
Create solutions for data ingestion, transformation, and modeling using Databricks, Spark/PySpark, Cloudera, and Azure Data Factory (ADF).
Ensure the quality, integrity, and usability of data throughout the entire pipeline.
CI&T specializes in technological transformation, uniting human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters worldwide, they have partnered with over 1,000 clients during their 30-year history, with a focus on Artificial Intelligence.
Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources and identify trends and patterns, maintaining reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions for over two decades.
The Sr Data Engineer, DevX creates the best developer experience for data and application engineers at Basis. They design, implement, and maintain deployment and ETL pipelines for data products, and integrate diverse data sources and vendor products, including databases, APIs, and third-party services.
Basis Technologies empowers agencies and brands with cutting-edge software that automates digital media operations, offering flexible work options across the U.S.