Design, implement, and optimize robust and scalable data pipelines using SQL, Python, and cloud-based ETL tools such as Databricks.
Enhance our overarching data architecture strategy, assisting in decisions related to data storage, consumption, integration, and management within cloud environments.
Partner with data scientists, BI teams, and other engineering teams to understand and translate complex data requirements into actionable engineering solutions.
Design, build, and maintain scalable data pipelines.
Develop and implement data models for analytical use cases.
Implement data quality checks and governance practices.
MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.
Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark.
Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
Collaborate with cross-functional teams to deliver impactful data solutions.
Jobgether is an AI-powered platform that helps job seekers find suitable opportunities. They connect top-fitting candidates with hiring companies, streamlining the recruitment process through objective and fair assessments.
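The ETL/ELT pipeline work the postings above describe can be sketched in miniature. This stdlib-only Python version (the table, columns, and sample data are invented for the demo) shows the extract → transform → load shape that a Databricks/PySpark job would follow with `spark.read`, DataFrame transformations, and `df.write`:

```python
# Minimal ETL sketch: extract raw records, apply cleaning/typing rules,
# load into SQLite. The dataset and schema are hypothetical.
import csv, io, sqlite3

RAW = """order_id,amount,country
1,19.99,US
2,,DE
3,5.50,us
"""

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    out = []
    for r in rows:
        if not r["amount"]:          # basic quality rule: drop rows missing amount
            continue
        out.append({"order_id": int(r["order_id"]),
                    "amount": float(r["amount"]),
                    "country": r["country"].upper()})  # normalize country codes
    return out

def load(rows, conn):
    conn.execute("CREATE TABLE orders (order_id INT, amount REAL, country TEXT)")
    conn.executemany("INSERT INTO orders VALUES (:order_id, :amount, :country)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT ROUND(SUM(amount), 2) FROM orders").fetchone()[0]
print(total)
```

In a real pipeline each stage would read from and write to durable storage (e.g. cloud object storage or Delta tables) rather than in-process memory, but the staged structure is the same.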
Design and develop scalable data pipelines and infrastructure to process large volumes of data efficiently.
Collaborate with cross-functional teams to ensure data integrity, accessibility, and usability.
Implement and maintain data quality measures throughout the data lifecycle.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have a culture that values diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment.
Design and engineer robust data pipelines using technologies like Databricks, Azure Data Factory, Apache Spark, and Delta Lake.
Craft healthcare data solutions: process massive healthcare datasets, optimize performance, and ensure data is accurate and secure.
Communicate technical concepts to non-technical stakeholders, manage multiple priorities, and meet deadlines.
Gentiva offers compassionate care in the comfort of patients' homes as a national leader in hospice, palliative, home health care, and advanced illness management. They have nearly 600 locations and thousands of clinicians across 38 states, offering rewarding careers in a collaborative environment.
Define and enforce data architecture and engineering standards across products and teams.
Design scalable, secure, and optimized cloud data architectures that align with business goals.
Collaborate with engineering and product teams to translate requirements into high-level and detailed system designs.
Jobgether connects candidates with partner companies using an AI-powered matching process. They ensure applications are reviewed quickly and fairly, and their system identifies top candidates to share with the hiring company.
Design, build, and maintain scalable, high-quality data pipelines.
Implement robust data ingestion, transformation, and storage using cloud-based technologies.
Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have partnerships with more than 1,000 clients and value diversity, fostering a diverse, inclusive, and safe work environment.
Drive data engineering work across project phases, including discovery, design, build, test, deploy, and ongoing improvement.
Design and build scalable data pipelines using Microsoft Fabric (lakehouses, warehouses, pipelines, dataflows, notebooks).
Cleanse, model, and transform raw data to support analytics, reporting, semantic modeling, and governance needs.
Stoneridge Software helps clients succeed in implementing business software solutions. As a 2025 Top Workplace Honoree and Microsoft Solutions Partner, they take a meticulous approach to project delivery and empower clients' success with long-term support.
Partner with clients and implementation teams to understand data distribution requirements.
Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, they're tackling big challenges in an industry that’s ready for change with a bold, curious, and collaborative team.
Build and operate data pipelines from D365, Power Platform, and other sources into the enterprise data platform.
Design and implement star schemas, data lakehouse structures, and semantic models for Power BI.
Optimize performance and cost management for reporting in Azure.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly and fairly. They identify top-fitting candidates and share the shortlist with the hiring company.
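Star schemas, which several of the postings above call for, center a fact table on surrogate keys into dimension tables. This is a small hedged sketch in SQLite — the table and column names are invented, and the final query is the kind of join/aggregate a Power BI semantic model would generate against such a schema:

```python
# Star schema sketch: one fact table joined to two dimensions.
# Schema and data are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, year INT);
CREATE TABLE fact_sales   (customer_key INT, date_key INT, amount REAL);
INSERT INTO dim_customer VALUES (1, 'EMEA'), (2, 'APAC');
INSERT INTO dim_date     VALUES (20240101, 2024), (20250101, 2025);
INSERT INTO fact_sales   VALUES (1, 20240101, 100.0), (1, 20250101, 50.0),
                                (2, 20250101, 25.0);
""")
rows = conn.execute("""
    SELECT c.region, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date d     ON d.date_key = f.date_key
    GROUP BY c.region, d.year
    ORDER BY c.region, d.year
""").fetchall()
print(rows)  # [('APAC', 2025, 25.0), ('EMEA', 2024, 100.0), ('EMEA', 2025, 50.0)]
```

Keeping measures in the fact table and descriptive attributes in the dimensions is what lets BI tools slice one set of numbers by many attributes without duplicating data.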
Optimize SQL queries to maximize system performance.
RefinedScience is dedicated to delivering high-quality emerging tech solutions. The job description does not include company size or culture details, but the posting suggests a team that values innovation and collaboration.
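"Optimize SQL queries to maximize system performance" most often starts with reading the query plan. A common first win — sketched here with SQLite's `EXPLAIN QUERY PLAN` on an invented table — is adding an index so a selective filter becomes an index search instead of a full table scan:

```python
# Query-optimization sketch: compare the query plan before and after
# adding an index. Table, data, and index name are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INT, ts TEXT)")
conn.executemany("INSERT INTO events (user_id, ts) VALUES (?, ?)",
                 [(i % 100, f"2025-01-{i % 28 + 1:02d}") for i in range(1000)])

def plan(sql):
    # The 'detail' column of EXPLAIN QUERY PLAN describes each step.
    return " ".join(r[3] for r in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"
before = plan(query)   # expect a full table SCAN
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
after = plan(query)    # expect a SEARCH using idx_events_user
print(before)
print(after)
```

The same habit — inspect the plan, fix the most expensive step, re-check — carries over to warehouse engines, where the levers are typically partitioning, clustering, and statistics rather than B-tree indexes.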
Creating and maintaining optimal data pipeline architecture.
Assembling large, complex data sets that meet functional & non-functional business requirements.
Building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using relevant technologies.
Mercer Advisors works with families to help them amplify and simplify their financial lives through integrated financial planning, investment management, tax, estate, and insurance services. They serve over 31,300 families in more than 90 cities across the U.S. and are ranked the #1 RIA Firm in the nation by Barron’s.
Build and operate data pipelines from various sources into the enterprise data platform.
Design and implement star schemas, lakehouse structures, and semantic models.
Establish data quality checks and reconciliation rules to ensure consistency.
Jobgether is a platform that uses AI-powered matching to ensure applications are reviewed quickly, objectively, and fairly. Their system identifies top-fitting candidates and shares this shortlist with the hiring company.
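Data quality checks and reconciliation rules of the kind the postings above require tend to reduce to small, composable predicates run after each load. This sketch (datasets, rule names, and the expected-drop count are all invented) shows a completeness check plus a source-vs-target row-count reconciliation:

```python
# Data quality sketch: a not-null rule on the target and a row-count
# reconciliation against the source. All values are illustrative.
source = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 7.5}, {"id": None, "amt": 3.0}]
target = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 7.5}]

def check_not_null(rows, column):
    bad = [r for r in rows if r[column] is None]
    return {"check": f"{column} not null", "passed": not bad, "failures": len(bad)}

def check_reconciles(src, tgt, dropped_expected):
    # Target may legitimately be smaller when quarantined rows are dropped,
    # so reconcile against the expected drop count rather than raw equality.
    ok = len(src) - len(tgt) == dropped_expected
    return {"check": "row count reconciliation", "passed": ok}

results = [check_not_null(target, "id"), check_reconciles(source, target, 1)]
print(all(r["passed"] for r in results))  # True
```

In production these predicates would typically live in a framework (dbt tests, Great Expectations, or Delta constraints) and feed alerting rather than a print statement.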
Oversee the design, development, implementation, and maintenance of all data-related systems.
Build and maintain scalable and reliable data pipelines, ensuring data quality and consistency across all systems.
Collaborate with data scientists, business analysts, and other stakeholders to identify data-related needs and requirements.
MoneyHero Group is a market-leading financial products platform in Greater Southeast Asia, reaching millions of monthly unique users and working with hundreds of commercial partners. They have a team of over 350 talented individuals and are backed by world-class organizations and companies.
Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large datasets.
Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Partner with our customer teams to develop engineering plans for onboarding our health system partners.
Build and support robust batch and streaming pipelines.
Mature our monitoring systems and processes to improve visibility and failure detection in our infrastructure.
Paradigm is rebuilding the clinical research ecosystem by enabling equitable access to trials for all patients. Incubated by ARCH Venture Partners and backed by leading healthcare and life sciences investors, Paradigm’s seamless infrastructure, implemented at healthcare provider organizations, will bring potentially life-saving therapies to patients faster.
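The pairing of pipelines with monitoring that the posting above describes can be sketched as a micro-batch processor that counts successes and failures per batch, so a failure spike shows up in metrics instead of disappearing silently. The record shape and metric names here are hypothetical:

```python
# Batch-pipeline sketch with a minimal monitoring hook: per-record
# success/failure counters that would feed dashboards and alerting.
from collections import Counter

metrics = Counter()

def process_batch(batch):
    out = []
    for rec in batch:
        try:
            out.append({"id": rec["id"], "value": float(rec["value"])})
            metrics["processed"] += 1
        except (KeyError, ValueError):
            metrics["failed"] += 1   # surfaced to monitoring instead of lost
    return out

stream = [[{"id": 1, "value": "3.5"}, {"id": 2, "value": "oops"}],
          [{"id": 3, "value": "1.0"}]]
results = [row for batch in stream for row in process_batch(batch)]
print(dict(metrics))  # {'processed': 2, 'failed': 1}
```

A streaming framework (Spark Structured Streaming, Kafka consumers) replaces the list-of-batches with an unbounded source, but the per-batch processing and metric emission follow the same pattern.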
Design, build, and maintain highly scalable, reliable, and efficient ETL/ELT pipelines.
Ingest data from a multitude of sources and transform raw data into clean, structured, and AI/ML-ready formats.
Work closely with data scientists, machine learning engineers, and business analysts to understand their data needs.
Valtech exists to unlock a better way to experience the world by blending crafts, categories, and cultures, helping brands unlock new value in an increasingly digital world.
Build and optimize Sauce's lakehouse architecture using Azure Databricks and Unity Catalog for data governance.
Create and maintain data quality tests and improve existing alerting setups.
Own the data warehouse: connect data sources and maintain the platform and architecture in coordination with the R&D, infrastructure, and operations teams.
Sauce is a premier restaurant technology platform that helps businesses grow with its Commission-Free Delivery & Pickup structure and proprietary delivery optimization technology.
Design and own scalable, secure data architectures and pipelines across our platforms.
Collaborate closely with data engineers, analysts, the MLOps Lead, AI and Data Architects, and business stakeholders to align architecture with business needs.
Define and maintain data models, governance frameworks, reference architectures, guidelines, best practices and operating principles for our data and analytics platforms.
Redcare Pharmacy is Europe’s No.1 e-pharmacy, powered by passionate teams and cutting-edge innovation. They strive to create a healthy collaborative work environment where every employee feels valued and inspired to contribute to their vision “Until every human has their health”.