Design, build, and maintain highly scalable, reliable, and efficient ETL/ELT pipelines.
Ingest data from a multitude of sources and transform raw data into clean, structured, and AI/ML-ready formats.
Work closely with data scientists, machine learning engineers, and business analysts to understand their data needs.
Valtech exists to unlock a better way to experience the world by blending crafts, categories, and cultures, helping brands create new value in an increasingly digital world.
Build and maintain canonical data models and metric definitions.
Build dashboards and shared analytics tools to support decision making.
Northbeam is building the world’s most advanced marketing intelligence platform. They provide top eCommerce brands a unified view of their business data through powerful attribution modeling and customizable dashboards, and the company is experiencing rapid growth.
Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Collaborate with business leaders, engineers, and product managers to understand data needs.
Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
NBCUniversal is one of the world's leading media and entertainment companies. They create world-class content, distribute it across a portfolio of film, television, and streaming, and bring it to life through global theme park destinations, consumer products, and experiences. They champion an inclusive culture and strive to attract and develop a talented workforce that creates and delivers a wide range of content reflecting the world.
Lead and mentor a team of data engineers, fostering innovation, collaboration, and continuous improvement.
Design, implement, and optimize scalable data pipelines and ETL processes to meet evolving business needs.
Ensure data quality, governance, security, and compliance with industry standards and best practices.
Jobgether is a platform that connects job seekers with companies. They use an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements.
Architect and maintain scalable, secure, and high-performing data pipelines to support analytics, reporting, and operational needs.
Develop and deploy production-grade data engineering code, ensuring reliability and performance across environments.
Manage end-to-end data workflows, including ingestion, transformation, modeling, and validation for multiple business systems.
Onebridge, a Marlabs Company, is a global AI and Data Analytics Consulting Firm that empowers organizations worldwide to drive better outcomes through data and technology. Since 2005, they have partnered with some of the largest healthcare, life sciences, financial services, and government entities across the globe.
Lead requirements-gathering efforts for product and advanced analytics.
Work with analytics, data science, and wider engineering teams to help automate data analysis and visualization.
Build a scalable technology platform to support a growing business and deliver high-quality code to production.
Achieve is a leading digital personal finance company that helps everyday people move from struggling to thriving by providing innovative, personalized financial solutions. They have over 3,000 employees, mostly in hybrid and 100% remote roles across the United States, with hubs in Arizona, California, and Texas, and a culture of putting people first.
Design, develop, and optimize data pipelines and ETL processes to ensure high-quality data is available for analysis.
Analyze complex datasets to identify trends, patterns, and actionable insights that drive business performance.
Implement data quality checks and governance best practices to ensure data accuracy and reliability.
Modeling Data Solutions is seeking an experienced data analytics engineer to join its personal lines property team. This is an exciting opportunity to join the US Data Science Infrastructure department and help create cutting-edge pricing programs.
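As a rough sketch of the data quality checks mentioned in the responsibilities above (column names like `policy_id` and `premium` are illustrative assumptions, not taken from the posting):

```python
# Hypothetical sketch: validate rows before they reach downstream
# analytics, reporting failures rather than silently dropping data.
def check_quality(rows: list[dict]) -> dict:
    """Return counts of rows failing basic quality rules."""
    failures = {"missing_id": 0, "negative_amount": 0}
    for row in rows:
        # Rule 1: every record needs a non-empty identifier.
        if not row.get("policy_id"):
            failures["missing_id"] += 1
        # Rule 2: monetary amounts must be non-negative.
        if float(row.get("premium", 0)) < 0:
            failures["negative_amount"] += 1
    return failures

sample = [
    {"policy_id": "P1", "premium": "120.0"},
    {"policy_id": "", "premium": "80.0"},
    {"policy_id": "P3", "premium": "-5.0"},
]
report = check_quality(sample)
print(report)  # {'missing_id': 1, 'negative_amount': 1}
```

In production such rules would typically live in a framework like Great Expectations or dbt tests, but the shape of the check is the same.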
Design, implement, and maintain scalable ETL/ELT pipelines using Python, SQL, and modern orchestration frameworks.
Build and optimize data models and schemas for cloud warehouses and relational databases, supporting AI and analytics workflows.
Lead large-scale data initiatives from planning through execution, ensuring performance, cost efficiency, and reliability.
This position is posted by Jobgether on behalf of a partner company.
Utilize strong SQL & Python skills to engineer sound data pipelines and conduct routine and ad hoc analysis.
Build reporting dashboards and visualizations to define and track campaign/program KPIs.
Perform analyses on large data sets to understand drivers of operational efficiency.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. This company fosters a supportive and inclusive work environment.
Design, implement, and optimize robust and scalable data pipelines using SQL, Python, and cloud-based ETL tools such as Databricks.
Enhance our overarching data architecture strategy, assisting in decisions related to data storage, consumption, integration, and management within cloud environments.
Partner with data scientists, BI teams, and other engineering teams to understand and translate complex data requirements into actionable engineering solutions.
The New York Blood Center is a nonprofit blood collection and medical research organization. They are looking for a Senior Data Engineer to join their team.
Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark.
Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
Collaborate with cross-functional teams to deliver impactful data solutions.
Jobgether is an AI-powered platform that helps job seekers find suitable opportunities. They connect top-fitting candidates with hiring companies, streamlining the recruitment process through objective and fair assessments.
Design, build, and maintain scalable data pipelines.
Develop and implement data models for analytical use cases.
Implement data quality checks and governance practices.
MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.
Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
Develop and optimize data models in Snowflake or similar platforms.
Implement ETL/ELT processes using Python and modern data tools.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by the hiring company's internal team.
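A minimal sketch of the ETL/ELT work described in the responsibilities above (a simple extract-transform-load flow; the table, column names, and use of sqlite3 are assumptions for illustration — a real pipeline would target a cloud warehouse and run under an orchestrator such as Airflow or Dagster):

```python
import csv
import io
import sqlite3

def extract(csv_text: str) -> list[dict]:
    """Extract: read raw rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: drop rows missing an id, cast amounts to float."""
    return [
        (row["order_id"], float(row["amount"]))
        for row in rows
        if row.get("order_id")
    ]

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: insert transformed rows; return the table's row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

raw = "order_id,amount\nA1,10.50\n,3.00\nB2,7.25\n"
conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(raw)), conn)
print(loaded)  # 2 — the row with a missing order_id is filtered out
```

Keeping extract, transform, and load as separate functions mirrors how orchestration frameworks model pipelines: each stage can be retried, tested, and monitored independently.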
Architect and maintain robust data pipelines to transform diverse data inputs.
Integrate data from various sources into a unified platform.
Build APIs with AI assistance to enable secure access to consolidated insights.
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
Design, build, and maintain scalable data pipelines and workflows in Snowflake.
Integrate and ingest data from multiple systems into Snowflake.
Develop and optimize SQL queries, views, and materialized datasets.
GTX Solutions is a consulting firm specializing in modern data architecture, Customer Data Platforms (CDPs), and marketing technology enablement. They work with enterprise clients across industries including Retail, Travel, Hospitality, and Financial Services to design and implement scalable data ecosystems.