Design, build, and maintain scalable data platforms using AWS to support analytics, machine learning, and emerging generative AI use cases.
Collaborate with data scientists, analysts, and engineering teams to translate business and AI requirements into scalable data solutions.
Work with large-scale datasets to build and optimize data pipelines using AWS services such as EMR (Spark, Trino), S3, Glue, Athena, and Airflow.
Experian is a global data and technology company, powering opportunities for people and businesses around the world. They invest in people and in new, advanced technologies to unlock the power of data and to innovate. A FTSE 100 Index company listed on the London Stock Exchange, they have a team of 23,300 people across 32 countries.
Guide clients on optimizing their data environment.
Develop systems engineering solutions, integrations, and architectures based on client needs.
Implement and advise on data warehouse solutions, ETL pipelines, and BI reporting tools.
Jobgether helps candidates get their applications reviewed quickly and objectively. They use an AI-powered matching process to ensure each application is assessed fairly against the role's core requirements.
Design and optimize scalable cloud-based data architectures.
Develop and maintain robust data pipelines, models, and systems across cloud platforms.
Mentor Data Engineers, guiding them in data modeling, troubleshooting, and best practices.
Personify Health has created a personalized health platform, bringing health plan administration, well-being solutions, and care navigation together. Their team is on a mission to empower people to lead healthier lives.
You will join a team of talented engineers working closely with Data Scientists to build and scale our next-generation Ad EnGage data pipeline.
You will work with large-scale datasets, from hundreds of terabytes to petabyte scale, using a modern data stack centered on AWS, Airflow, dbt, and Snowflake.
You’ll contribute to building reliable, high-quality data pipelines and improving the performance, scalability, and observability of our data platform.
EDO is the TV outcomes company. Their leading measurement platform connects convergent TV airings to the ad-driven consumer behaviors most predictive of future sales. They are headquartered in New York City and Los Angeles, with an office in San Francisco, and recognize the benefits of hybrid working.
Design, build, and maintain data pipelines using Snowflake, Airflow, and dbt.
Lead architectural discussions around the modern data stack.
Develop scalable ETL and ELT processes using Python and SQL.
They are a well-funded healthcare technology company using AI and modern data infrastructure to transform how healthcare and public health decisions are made. The team is small, mission-driven, and building systems that turn raw healthcare data into actionable intelligence at scale.
Design, build, and maintain scalable data pipelines for clients across industries.
Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.
NuView Analytics helps companies accelerate the time to insights from their data through data analytics, diligence, and fractional data science. They are a growth-stage company looking to drive additional value from the data they are sitting on, and they value humility, intellectual rigor, and stewardship.
Own and operate our data warehouse, pipelines, and transformation layer.
Design, build, and maintain scalable, reliable data pipelines that ingest data from across our platform and third-party sources, ensuring data is always available and trustworthy for downstream consumers.
Partner with data scientists and analysts to deliver clean, well-documented datasets and optimize query performance so teams spend less time wrangling data and more time generating insights.
Atticus makes it easy for any sick or injured person in crisis to get the life-changing aid they deserve. In the last six years, they've become the leading platform connecting people with disabilities to government benefits. In 2025, their team grew from 151 to 210, and they will grow again in 2026.
Architect production-grade data pipelines integrating clinical data across multiple channels.
Build and optimize cloud-native data infrastructure using AWS.
Collaborate with data science teams to build foundations for predictive analytics.
Jobgether is a platform that uses AI to match candidates with jobs and ensure applications are reviewed quickly and fairly. They help the hiring company identify the top-fitting candidates.
Lead, mentor, and scale a high-performing data engineering team.
Design and evolve our core data infrastructure on AWS, Apache Airflow, and Apache Spark.
Tekmetric is an all-in-one, cloud-based platform helping auto repair shops run smarter, grow faster, and serve customers better. Officially founded in Houston in 2017, Tekmetric has grown from a single shop’s vision to the industry’s leading solution. They value transparency, integrity, innovation, and a service-first mindset.
Build and maintain software and data pipelines in support of contract management and AI-assisted workflows.
Work cross-functionally with Product, Engineering, and subject matter experts to conceptualize, prototype, and build data solutions.
Contribute to data capabilities around contract modeling, automated pricing, and payer policy intelligence.
Turquoise Health is a Series C price transparency platform for finance leaders across healthcare. It is a remote-first, US-based team that values transparency, empathy, inclusivity, creativity, and ownership, with over 300 enterprise organizations as clients.
Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
Increase the robustness of existing production pipelines, identify bottlenecks, and set up reliable monitoring, testing processes, and documentation templates.
Build custom applications and integrations that automate manual customer-operations tasks, helping Product Operations, Support, and SRE teams in their day-to-day activities.
Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.
Help build scalable data solutions and streamline data ingestion.
Maintain high-quality databases that support our scientific and operational teams.
Optimize our data infrastructure to ensure efficient data access.
Funga is a public benefit corporation addressing the climate crisis by harnessing forest fungal networks. They are a team of passionate scientists and builders working to draw down at least three gigatons of carbon dioxide from the atmosphere by 2050.
Design, build, and maintain efficient data pipelines (ETL processes) to integrate data from various source systems into the data warehouse.
Develop and optimize data warehouse schemas and tables to support analytics and reporting needs.
Write and refine complex SQL queries and use scripting (e.g., Python) to transform and aggregate large datasets.
Deel is an all-in-one payroll and HR platform tailored for global teams. As one of the largest globally distributed companies, Deel's 7,000 team members span over 100 countries, fostering a dynamic culture of continuous learning and innovation.
Strong programming skills in Python and Linux Bash for automation and data workflows.
Expertise in Hadoop ecosystem tools and managing SQL databases for data storage and query optimization.
3Pillar is dedicated to engineering solutions that challenge conventional norms, going beyond traditional software development. They are an elite team of visionaries shaping the future direction of various endeavors, redefining urban living, and driving innovation across industries.
Own organization-wide data architecture, defining standards and designs.
Design and develop data pipelines, integrations, and platform features.
Partner with product managers to define new data features and capabilities.
They offer a connected equipment platform for managing mixed assets. The company values quality, continuous learning, and collaboration within a dynamic team environment.
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional and non-functional business requirements.
Identify, design, and implement internal process improvements, automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
Coderoad is a software development company that provides end-to-end software development services. It provides an opportunity to work on exciting, real-world projects in a supportive environment, offering staff augmentation, dedicated IT teams, and general software engineering.
Design, develop, and maintain scalable data pipelines and transformations.
Build and optimize complex SQL queries and data models.
Implement and manage real-time and batch data ingestion workflows using change data capture (CDC).
Builder Prime is revolutionizing the home improvement industry by providing an all-in-one business management solution, integrating CRM, estimating, production management, payments, and reporting. They raised their Series B, financed by Blueprint Equity, and look for collaborative and community-oriented team members.
Lead, manage, and mentor a group of data engineers.
Own the design and development of data pipelines and systems.
Partner cross-functionally with Data Science and Product managers.
TrueML is a mission-driven financial software company that aims to create better customer experiences for distressed borrowers. The TrueML team includes inspired data scientists, financial services industry experts and customer experience fanatics building technology to serve people.
Design and develop data pipelines, with an eye toward how they fit into the broader data architecture.
Make data modeling and schema design decisions, choosing the right structure for how data is stored, accessed, and extended.
Think beyond the immediate task: consider how today's integration point becomes tomorrow's platform surface area.
Cross Screen Media was founded by industry veterans and gives customers a new way to plan and execute video advertising campaigns. They are building an amazing team and offer a collaborative and creative atmosphere with inspired leadership.
Lead and manage a team of ~6 data engineers, driving execution, performance, and career development.
Own Kin’s data platform, including ingestion, storage, transformation, pipeline orchestration, and governance.
Build and optimize scalable data pipelines and architectures using tools like Snowflake, Databricks, dbt, and Airflow.
Kin simplifies homeowners' lives with smarter insurance, expanding to meet all homeowner needs. They employ Kinfolk across 35+ states and are recognized for growth, customer satisfaction, and a focus on long-term sustainability, fostering a culture of meaningful work and real impact.