Collaborate with product managers, data analysts, and machine learning engineers to develop pipelines and ETL tasks.
Establish data architecture processes and practices that can be scheduled, automated, and replicated, and that serve as standards.
Manage individual Data Engineers to foster learning, growth and success at Doximity.
Doximity is transforming the healthcare industry with a mission to help every physician be more productive and provide better care for their patients. As medicine's largest network in the United States, they are committed to building diverse teams with an inclusive culture.
Design and develop data pipelines, with an eye toward how they fit into the broader data architecture
Make data modeling and schema design decisions — choosing the right structure for how data is stored, accessed, and extended
Think beyond the immediate task: consider how today's integration point becomes tomorrow's platform surface area
Cross Screen Media is founded by industry veterans and gives customers a new way to plan and execute video advertising campaigns. They require an amazing team and offer a collaborative and creative atmosphere, with inspired leadership.
Design, build, and maintain data products that support R&D, analytics, Lab, and scientific workflows.
Build and maintain data pipelines for large and complex datasets ensuring high data quality.
Partner with scientists and engineers to translate research needs into reusable data assets.
Natera is a global leader in cell-free DNA (cfDNA) testing, dedicated to oncology, women’s health, and organ health. They aim to make personalized genetic testing and diagnostics part of the standard of care to protect health and enable earlier and more targeted interventions that lead to longer, healthier lives.
Design, build, and maintain scalable data pipelines for clients across industries.
Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.
NuView Analytics helps companies accelerate the time to insights from their data through data analytics, diligence, and fractional data science. They are a growth-stage company looking to drive additional value from the data they are sitting on and value humility, intellectual rigor, and stewardship.
Design, build, and maintain efficient data pipelines (ETL processes) to integrate data from various source systems into the data warehouse.
Develop and optimize data warehouse schemas and tables to support analytics and reporting needs.
Write and refine complex SQL queries and use scripting (e.g., Python) to transform and aggregate large datasets.
Deel is an all-in-one payroll and HR platform tailored for global teams. As one of the largest globally distributed companies, Deel's 7,000 team members span over 100 countries, fostering a dynamic culture of continuous learning and innovation.
Design, build, and maintain scalable data infrastructure to support analytics and reporting across the organization.
Develop and operate ETL pipelines to ingest, transform, and deliver large-scale datasets.
Partner closely with Data Analysts and cross-functional stakeholders to provide reliable datasets and guide them in using data effectively.
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York. With over two decades of experience, they deliver top-tier technology solutions to companies of all sizes. Their team of 600+ highly skilled tech professionals, based in Latin America, drives digital disruption by partnering with U.S. companies on their projects.
Build and manage business data pipelines and transform Firefox telemetry data into structured datasets.
Partner with data scientists, product, and marketing teams to turn datasets into models and metrics.
Ensure data accuracy and performance using observability tools and resolve data issues.
Mozilla Corporation is a technology company backed by a non-profit that has shaped the internet, creating brands like Firefox. With millions of users globally, they work in areas including AI and social media while staying focused on making the internet better for people.
Own and operate our data warehouse, pipelines, and transformation layer
Design, build, and maintain scalable, reliable data pipelines that ingest data from across our platform and third-party sources, ensuring data is always available and trustworthy for downstream consumers
Partner with data scientists and analysts to deliver clean, well-documented datasets and optimize query performance so teams spend less time wrangling data and more time generating insights
Atticus makes it easy for any sick or injured person in crisis to get the life-changing aid they deserve. In the last six years, they've become the leading platform connecting people with disabilities to government benefits. In 2025, their team grew from 151 to 210, and they will grow again in 2026.
Organize and structure data systems at both macro and micro levels, designing and implementing data architectures that support business goals
Optimize data pipelines for performance, reliability, and scalability
Design, build, and maintain scalable ETL/ELT pipelines with Airflow to process large-scale, complex datasets
Deliver data products useful for machine learning and AI research and development (data models, metadata, and semantics)
Owkin is an AI company on a mission to solve the complexity of biology. It is building the first Biology Super Intelligence (BASI) by combining powerful biological large language models, multimodal patient data, and agentic software.
Lead the technical onboarding of partner institutions onto UDTS.
Design, build, and maintain scalable data pipelines and architectures.
Collaborate with team members to set engineering standards and guide data infrastructure strategy.
DataKind is a non-profit organization that uses data science and AI to address global challenges. They work with various sectors like health, humanitarian action, climate, economic opportunity, and education to create data-driven tools.
Design, build, and maintain scalable data platforms using AWS to support analytics, machine learning, and emerging generative AI use cases.
Collaborate with data scientists, analysts, and engineering teams to translate business and AI requirements into scalable data solutions.
Work with large-scale datasets to build and optimize data pipelines using AWS services such as EMR (Spark, Trino), S3, Glue, Athena, and Airflow.
Experian is a global data and technology company, powering opportunities for people and businesses around the world. They invest in people and new advanced technologies to unlock the power of data and to innovate. A FTSE 100 Index company listed on the London Stock Exchange, they have a team of 23,300 people across 32 countries.
Design, build, and maintain production data pipelines for multi-phase algorithmic workflows using Python and an orchestration framework such as Prefect, Airflow, or Jenkins.
Build and optimize advanced SQL transformations in Snowflake, including window functions, CTEs, stored procedures, UDFs, and semi-structured data processing.
Build and maintain dbt models for data transformation, identity resolution, and slowly changing dimension (SCD Type 2) tracking across 80+ models and multiple pipeline stages.
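As an aside, slowly changing dimension (SCD Type 2) tracking keeps a full history of each record by closing out the current row and inserting a new one whenever an attribute changes. In dbt this is typically handled by snapshots; the following Python sketch only illustrates the underlying idea, and all names in it are made up:

```python
from datetime import date

def scd2_apply(history, key, new_attrs, today):
    """Apply an SCD Type 2 update: close the open row for `key`
    and append a new versioned row if its attributes changed."""
    # find the currently open version (valid_to is None)
    current = next((r for r in history
                    if r["key"] == key and r["valid_to"] is None), None)
    if current and current["attrs"] == new_attrs:
        return history  # no change: keep the current row open
    if current:
        current["valid_to"] = today  # close the old version
    history.append({"key": key, "attrs": new_attrs,
                    "valid_from": today, "valid_to": None})
    return history

# usage: a customer moves city; history retains both versions
hist = []
scd2_apply(hist, "c1", {"city": "Austin"}, date(2024, 1, 1))
scd2_apply(hist, "c1", {"city": "Denver"}, date(2024, 6, 1))
# hist now holds two rows: the Austin row closed on 2024-06-01,
# and the Denver row still open
```

The same close-and-insert pattern is what a dbt snapshot materializes in the warehouse, with `valid_from`/`valid_to` playing the role of dbt's generated validity columns.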
Kalibri helps to redefine and rebuild the hotel industry. They are looking for passionate, energetic, and hardworking people with an entrepreneurial spirit, who dream big and challenge the status quo; their team is working on cutting-edge solutions for the industry.
Become a trusted data and AI advisor to clients, helping them translate business questions into AI-ready data architectures.
Design and implement AI-optimized data platforms, including cloud data warehouses, ETL/ELT pipelines, and analytic layers.
Engineer modern ELT/ETL pipelines that handle structured, semi-structured, and unstructured data to support AI and analytics use cases.
Aimpoint Digital is a dynamic and fully remote data and analytics consultancy. They work alongside the most innovative software providers in the data engineering space to solve their clients' toughest business problems and believe in blending modern tools and techniques with tried-and-true principles to deliver optimal data engineering solutions.
Manage and guide data teams to execute on enterprise data strategy.
Provide technical guidance and mentor team members on data technologies.
Design end-to-end data processing for enterprise data warehousing.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Lead the architecture and evolution of scalable, distributed data pipelines, ensuring high availability and performance at scale
Build and maintain distributed web scraping systems using tools such as Playwright, Selenium, and BeautifulSoup
Integrate AI and LLMs into engineering workflows for code generation, automation, and optimization
MercatorAI is building scalable data infrastructure to power high-quality, data-driven decision making at scale. As an early-stage company, the team is focused on creating robust, future-ready systems that can handle complex data ingestion, transformation, and delivery across a growing national footprint.
Partner closely with business stakeholders to understand their challenges and design end-to-end architecture.
Design, develop, and own robust, efficient, and scalable data models in Snowflake and Iceberg using dbt and advanced SQL.
Build and manage reliable data pipelines and CI/CD workflows using tools like Airflow, Python, and Terraform.
Motive empowers people who run physical operations with tools to make their work safer, more productive, and more profitable. Motive serves nearly 100,000 customers and provides complete visibility and control across a wide range of industries.
Design, build, and maintain databases that power Hologram's operations.
Build and maintain ETL pipelines that move and transform data reliably.
Audit existing pipelines and data models, identify complexity, and refactor bad decisions.
Hologram is building the future of IoT connectivity, delivering internet access to millions of connected devices worldwide. They process over 5 billion transactions per month across their global infrastructure, and their fun, upbeat, remote-first team is united by their mission.
Architect production-grade data pipelines that integrate clinical data.
Build and optimize cloud-native data infrastructure and ETL/ELT workflows.
Partner with data science and data analytics teams to build and operationalize data foundations.
Waymark is a mission-driven team transforming care for people with Medicaid benefits. Their community-based care teams use data science and ML technologies to support care across multiple states, reducing avoidable emergency department visits and hospitalizations.
Design, build, and maintain scalable data pipelines using Python and Airflow
Develop and optimize ETL/ELT processes for structured and unstructured data
Collaborate with data science teams to support Machine Learning workflows
Oowlish is a rapidly expanding software development company in Latin America. They foster a nurturing work environment, are certified as a Great Place to Work, and provide opportunities for professional development and international impact.
Design and optimize scalable cloud-based data architectures.
Develop and maintain robust data pipelines, models, and systems across cloud platforms.
Mentor Data Engineers, guiding them in data modeling, troubleshooting, and best practices.
Personify Health has created a personalized health platform, bringing health plan administration, well-being solutions, and care navigation together. Their team is on a mission to empower people to lead healthier lives.