Own and operate our data warehouse, pipelines, and transformation layer
Design, build, and maintain scalable, reliable data pipelines that ingest data from across our platform and third-party sources, ensuring data is always available and trustworthy for downstream consumers
Partner with data scientists and analysts to deliver clean, well-documented datasets and optimize query performance so teams spend less time wrangling data and more time generating insights
Design, build, and maintain scalable data pipelines for clients across industries.
Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.
NuView Analytics helps companies accelerate their time to insight through data analytics, diligence, and fractional data science. They are a growth-stage company focused on driving additional value from the data clients are already sitting on, and they value humility, intellectual rigor, and stewardship.
Design, build, and maintain efficient data pipelines (ETL processes) to integrate data from various source systems into the data warehouse.
Develop and optimize data warehouse schemas and tables to support analytics and reporting needs.
Write and refine complex SQL queries and use scripting (e.g., Python) to transform and aggregate large datasets.
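The scripting half of this bullet often amounts to aggregating a large extract outside the warehouse. A minimal, stdlib-only Python sketch of that pattern — the field names and grouping key here are hypothetical, not from any posting:

```python
from collections import defaultdict

def aggregate(rows, key="region", measure="revenue"):
    """Stream over row dicts and sum `measure` per `key`,
    without materializing the whole dataset in memory."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[measure]
    return dict(totals)

# Hypothetical sample rows standing in for a large extract.
rows = [
    {"region": "NA", "revenue": 120.0},
    {"region": "EU", "revenue": 80.0},
    {"region": "NA", "revenue": 30.0},
]
print(aggregate(rows))  # → {'NA': 150.0, 'EU': 80.0}
```

Streaming row by row rather than loading everything at once is what lets the same shape scale from a test fixture to a multi-gigabyte export.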
Deel is an all-in-one payroll and HR platform tailored for global teams. As one of the largest globally distributed companies, Deel's 7,000 team members span over 100 countries, fostering a dynamic culture of continuous learning and innovation.
You will join a team of talented engineers working closely with Data Scientists to build and scale our next-generation Ad EnGage data pipeline.
You will work with large-scale datasets (hundreds of TBs to petabyte-scale systems) using a modern data stack centered on AWS, Airflow, dbt, and Snowflake.
You’ll contribute to building reliable, high-quality data pipelines and improving the performance, scalability, and observability of our data platform.
EDO is the TV outcomes company. Their leading measurement platform connects convergent TV airings to the ad-driven consumer behaviors most predictive of future sales. They are headquartered in New York City and Los Angeles, with office space in San Francisco, and recognize the benefits of hybrid working.
Design and develop data pipelines, with an eye toward how they fit into the broader data architecture
Make data modeling and schema design decisions — choosing the right structure for how data is stored, accessed, and extended
Think beyond the immediate task: consider how today's integration point becomes tomorrow's platform surface area
Cross Screen Media was founded by industry veterans and gives customers a new way to plan and execute video advertising campaigns. They are building an amazing team and offer a collaborative, creative atmosphere with inspired leadership.
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional and non-functional business requirements.
Identify, design, and implement internal process improvements, automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
Coderoad is a software development company providing end-to-end services, including staff augmentation, dedicated IT teams, and general software engineering. It offers the chance to work on exciting, real-world projects in a supportive environment.
Build and manage business data pipelines and transform Firefox telemetry data into structured datasets.
Partner with data scientists, product, and marketing teams to turn datasets into models and metrics.
Ensure data accuracy and performance using observability tools and resolve data issues.
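Resolving data issues usually begins with simple row-level quality checks before reaching for observability tooling. A minimal sketch of that idea — the field names and checks are hypothetical and assume nothing about Mozilla's actual stack:

```python
def check_rows(rows, required=("client_id", "event_ts")):
    """Return (index, problem) pairs for rows that fail
    basic completeness checks. Field names are hypothetical."""
    issues = []
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
    return issues

rows = [
    {"client_id": "a1", "event_ts": "2024-01-01"},
    {"client_id": "", "event_ts": "2024-01-02"},
]
print(check_rows(rows))  # → [(1, 'missing client_id')]
```

Returning structured issues rather than raising on the first failure makes it easy to surface every problem in a batch at once.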
Mozilla Corporation is a technology company, backed by a non-profit, that has shaped the internet and created brands like Firefox. With millions of users globally, they work in areas including AI and social media while remaining focused on making the internet better for people.
Develop engineering expertise within the Dataiku Platform to help maintain and develop system integrations, platform automations, and platform configurations.
Build and maintain Python and SQL data replication and data pipelines on large, often complex data sets.
Identify opportunities for improvement and optimization to achieve greater scalability and delivery velocity.
Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.
Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
Increase the robustness of existing production pipelines, identify bottlenecks, and set up solid monitoring, testing processes, and documentation templates.
Build custom applications and integrations that automate manual customer-operations tasks, helping Product Operations, Support, and SRE in their day-to-day activities.
Guide clients on optimizing their data environment.
Develop system engineering, integrations, and architectures based on client needs.
Implement and advise on data warehouse solutions, ETL pipelines, and BI reporting tools.
Jobgether uses an AI-powered matching process to ensure each application is reviewed quickly, objectively, and fairly against the role's core requirements.
Design, build, and maintain scalable data platforms using AWS to support analytics, machine learning, and emerging generative AI use cases.
Collaborate with data scientists, analysts, and engineering teams to translate business and AI requirements into scalable data solutions.
Work with large-scale datasets to build and optimize data pipelines using AWS services such as EMR (Spark, Trino), S3, Glue, Athena, and Airflow.
Experian is a global data and technology company, powering opportunities for people and businesses around the world. They invest in people and new advanced technologies to unlock the power of data and to innovate. A FTSE 100 Index company listed on the London Stock Exchange, they have a team of 23,300 people across 32 countries.
Lead the architecture and evolution of scalable, distributed data pipelines, ensuring high availability and performance at scale
Build and maintain distributed web scraping systems using tools such as Playwright, Selenium, and BeautifulSoup
Integrate AI and LLMs into engineering workflows for code generation, automation, and optimization
MercatorAI is building scalable data infrastructure to power high-quality, data-driven decision making at scale. As an early-stage company, the team is focused on creating robust, future-ready systems that can handle complex data ingestion, transformation, and delivery across a growing national footprint.
Design, build, and maintain scalable data infrastructure to support analytics and reporting across the organization.
Develop and operate ETL pipelines to ingest, transform, and deliver large-scale datasets.
Partner closely with Data Analysts and cross-functional stakeholders to provide reliable datasets and guide them in using data effectively.
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York. With over two decades of experience, they deliver top-tier technology solutions to companies of all sizes. Their team of 600+ highly skilled tech professionals, based in Latin America, drives digital disruption by partnering with U.S. companies on their projects.
Own organization-wide data architecture, defining standards and designs.
Design and develop data pipelines, integrations, and platform features.
Partner with product managers to define new data features and capabilities.
They offer a connected equipment platform for managing mixed assets. The company values quality, continuous learning, and collaboration within a dynamic team environment.
Become a trusted data and AI advisor to clients, helping them translate business questions into AI-ready data architectures.
Design and implement AI-optimized data platforms, including cloud data warehouses, ETL/ELT pipelines, and analytic layers.
Engineer modern ELT/ETL pipelines that handle structured, semi-structured, and unstructured data to support AI and analytics use cases.
Aimpoint Digital is a dynamic and fully remote data and analytics consultancy. They work alongside the most innovative software providers in the data engineering space to solve their clients' toughest business problems and believe in blending modern tools and techniques with tried-and-true principles to deliver optimal data engineering solutions.
Enable self-service analytics for all team members by designing clean, intuitive data models and metrics through dbt, empowering employees to make informed, data-driven decisions.
Develop and refine custom data pipelines that ingest data from operational systems into our analytics platform, handling both streaming and batch data using third-party tooling and home-grown solutions.
Maintain and optimize the data platform infrastructure, focusing on data quality, ELT efficiency, and platform hygiene.
Auto Integrate makes leased vehicle maintenance frictionless for millions of customers in the USA and Canada. The business is managed by a small, global team within Fleetio, combining the resources of a scaled SaaS company with the agility of a niche market leader.
Design, build, and maintain databases that power Hologram's operations.
Build and maintain ETL pipelines that move and transform data reliably.
Audit existing pipelines and data models, identify unnecessary complexity, and refactor past bad decisions.
Hologram is building the future of IoT connectivity, delivering internet access to millions of connected devices worldwide. They process over 5 billion transactions per month across their global infrastructure and value a fun, upbeat, remote-first team united by their mission.
Design, build, and maintain robust ETL/ELT pipelines.
Develop and enforce data models and schema standards.
Build and maintain Looker explores, LookML models, and dashboards.
Smart Working believes your job should not only look right on paper but also feel right every day. This company breaks down geographic barriers and connects skilled professionals with outstanding global teams and products for full-time, long-term roles.
Lead the technical onboarding of partner institutions onto UDTS.
Design, build, and maintain scalable data pipelines and architectures.
Collaborate with team members to set engineering standards and guide data infrastructure strategy.
DataKind is a non-profit organization that uses data science and AI to address global challenges. They work with various sectors like health, humanitarian action, climate, economic opportunity, and education to create data-driven tools.
Lead and grow a team of data engineers, providing mentorship and technical guidance.
Own execution of customer integrations across multiple product lines, ensuring on-time delivery.
Improve data quality and pipeline reliability by investing in better alerting and resilience.
Afresh is the leading AI company in fresh food, partnering with grocers to order billions of dollars of fresh food. They are on a mission to eliminate food waste and make fresh food accessible to all, and have saved 200M lbs of food from going to waste in 2025 alone.
Lead and manage a team of ~6 data engineers, driving execution, performance, and career development.
Own Kin’s data platform, including ingestion, storage, transformation, pipeline orchestration, and governance.
Build and optimize scalable data pipelines and architectures using tools like Snowflake, Databricks, dbt, and Airflow.
Kin simplifies homeowners' lives with smarter insurance, expanding to meet all homeowner needs. They employ Kinfolk across 35+ states and are recognized for growth, customer satisfaction, and a focus on long-term sustainability, fostering a culture of meaningful work and real impact.