Design, build, and maintain scalable data platforms using AWS to support analytics, machine learning, and emerging generative AI use cases.
Collaborate with data scientists, analysts, and engineering teams to translate business and AI requirements into scalable data solutions.
Work with large-scale datasets to build and optimize data pipelines using AWS services such as EMR (Spark, Trino), S3, Glue, Athena, and Airflow.
Experian is a global data and technology company, powering opportunities for people and businesses around the world. They invest in people and advanced technologies to unlock the power of data and to innovate. A FTSE 100 Index company listed on the London Stock Exchange, they have a team of 23,300 people across 32 countries.
Design, map, and implement complex data architectures that support enterprise-scale client onboarding and information exchange.
Translate business and functional requirements into scalable, secure, and cloud-native data solutions.
Serve as a trusted advisor to clients and internal stakeholders.
Availity delivers revenue cycle and related business solutions for health care professionals. They aim to help medical businesses thrive with powerful tools, actionable insights and expansive network reach in a changing industry. Availity is a global team with headquarters in Jacksonville, FL, and an office in Bangalore, India, plus a remote workforce across the United States.
Architect production-grade data pipelines integrating clinical data across multiple channels.
Build and optimize cloud-native data infrastructure using AWS.
Collaborate with data science teams to build foundations for predictive analytics.
Jobgether is a platform that uses AI to match candidates with jobs and ensure applications are reviewed quickly and fairly. They help the hiring company identify the top-fitting candidates.
Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimize data models, and enable self-serve analytics and AI.
End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.
Trustonic makes smartphones affordable, enabling global access to devices and digital finance through secure smartphone locking technology. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. The company celebrates diversity and strives to do the right thing for each other, the community, and the planet.
You will join a team of talented engineers working closely with Data Scientists to build and scale our next-generation Ad EnGage data pipeline.
You will work with large-scale datasets (hundreds of TBs to petabyte-scale systems) using a modern data stack centered on AWS, Airflow, dbt, and Snowflake.
You’ll contribute to building reliable, high-quality data pipelines and improving the performance, scalability, and observability of our data platform.
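One common building block of the pipeline reliability and observability mentioned above is a data-quality gate run after each load; here is a minimal sketch, in which the thresholds and field names are made-up examples rather than any team's actual checks:

```python
# Sketch of a lightweight data-quality gate a pipeline might run after
# each load. Thresholds and required fields are illustrative assumptions.
def check_batch(rows, min_rows=1, required_fields=("id", "ts")):
    """Return a list of human-readable failures; an empty list means
    the batch passes and downstream models may run."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"expected at least {min_rows} rows, got {len(rows)}")
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) is None:
                failures.append(f"row {i}: missing required field '{field}'")
    return failures
```

In practice a check like this would be wired into an orchestrator task (e.g. an Airflow task between load and transform) so a failing batch halts downstream models instead of silently propagating bad data.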
EDO is the TV outcomes company. Their leading measurement platform connects convergent TV airings to the ad-driven consumer behaviors most predictive of future sales. They are headquartered in New York City and Los Angeles, with office space in San Francisco, and they recognize the benefits of hybrid working.
Lead, mentor, and scale a high-performing data engineering team.
Design and evolve our core data infrastructure on AWS, Apache Airflow, and Apache Spark.
Tekmetric is an all-in-one, cloud-based platform helping auto repair shops run smarter, grow faster, and serve customers better. Officially founded in Houston in 2017, Tekmetric has grown from a single shop’s vision to the industry’s leading solution. They value transparency, integrity, innovation, and a service-first mindset.
Define and maintain the enterprise data architecture blueprint.
Collaborate with business and IT leaders to align data architecture.
Design and oversee the implementation of scalable data solutions.
EPAM NEORIS believes that transformation starts with people and they aim to bring in talent that wants to improve, learn constantly, and make an impact on every project. They are a diverse, inclusive, and continuously evolving team that promotes ideas, not hierarchies, investing in the real development of every individual.
Design, build, and maintain scalable data pipelines for clients across industries.
Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.
NuView Analytics helps companies accelerate the time to insights from their data through data analytics, diligence, and fractional data science. They work with growth-stage companies looking to drive additional value from the data they are sitting on, and they value humility, intellectual rigor, and stewardship.
Work as a positive team member to deliver quality applications and components within scope.
Proactively identify emerging technologies and promote their adoption.
Develop strategies for managing complex data sets by maintaining data standards and metadata.
Emory University is a leading research university that fosters excellence and attracts world-class talent to innovate today and prepare leaders for the future. They welcome candidates who can contribute to the excellence of their academic community.
Implement the core components of our data platform, including data modeling, pipelines, and retrieval-ready storage layers.
Build operationally excellent systems, incorporating automated testing, deep observability, and robust failure handling.
Model complex business domains by defining clear entities, event histories, and reusable datasets that reflect real-world logic.
Campminder provides digital transformation solutions to the summer camp industry. The company has 85+ employees and is known for its values-led culture and employee experience, having been listed on Outside Magazine’s 50 Best Places to Work for 8 consecutive years.
Design data transformation pipelines that convert raw health signals, user inputs, and third-party data into structured, queryable context.
Architect event-driven ingestion using tools like Kinesis, EventBridge, and SQS, handling duplicate events, traffic spikes, and partial failures gracefully.
Define flexible data models and schemas for a hierarchical health context ontology that evolves as new context types emerge.
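The duplicate-event concern called out above arises because sources like Kinesis and SQS deliver at-least-once; it is commonly handled with an idempotent consumer. Here is a minimal in-memory sketch, where the `event_id` field and TTL window are illustrative assumptions, not any team's actual design:

```python
import time

class IdempotentConsumer:
    """Drop duplicate events by id within a TTL window, a common
    pattern when a source delivers each event at least once."""

    def __init__(self, ttl_seconds=3600):
        self.ttl = ttl_seconds
        self._seen = {}  # event_id -> first-seen timestamp

    def _evict_expired(self, now):
        expired = [eid for eid, ts in self._seen.items() if now - ts > self.ttl]
        for eid in expired:
            del self._seen[eid]

    def process(self, event, handler, now=None):
        """Invoke handler once per event_id within the TTL window.
        Returns True if handled, False if the event was a duplicate."""
        now = time.time() if now is None else now
        self._evict_expired(now)
        eid = event["event_id"]  # assumed field name
        if eid in self._seen:
            return False
        handler(event)
        self._seen[eid] = now
        return True
```

In production the seen-set would live in shared storage (e.g. Redis or DynamoDB with conditional writes) rather than process memory, so deduplication survives restarts and works across consumer instances.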
Oura is dedicated to empowering individuals to take ownership of their inner potential. They provide award-winning products that enable their global community to gain deeper insights into their readiness, activity, and sleep quality through the Oura Ring and its connected app, enhancing the health and lives of millions.
Serve as the embedded technical lead for Databricks customer engagements.
Own Databricks platform architecture, design decisions, and technical standards.
Lead delivery of complex data pipelines and analytics workloads on Databricks.
540 is a forward-thinking company that the government turns to in order to #getshitdone. They break down barriers, build impactful technology, and solve mission-critical problems.
Design and govern high-quality, scalable data models that bridge business needs with technical implementation.
Translate complex, enterprise-level requirements into conceptual, logical, and physical data models that support analytics, reporting, and operational workloads.
Collaborate closely with business, analytics, and engineering teams to align data strategy with organizational goals.
Onebridge, a Marlabs Company, is a global AI and Data Analytics Consulting Firm that empowers organizations worldwide to drive better outcomes through data and technology. The company was founded in 2005 and partners with some of the largest healthcare, life sciences, financial services, and government entities across the globe.
Design, build, and maintain data pipelines using Snowflake, Airflow, and dbt.
Lead architectural discussions around the modern data stack.
Develop scalable ETL and ELT processes using Python and SQL.
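An ELT process of the kind described above, where raw data is loaded first and then transformed with SQL inside the warehouse, can be sketched in miniature with Python's built-in sqlite3 standing in for Snowflake; the table and column names are made up for illustration:

```python
import sqlite3

# Minimal ELT sketch: load raw rows first, then transform with SQL
# inside the "warehouse" (sqlite3 stands in for Snowflake here).
def run_elt(raw_orders):
    con = sqlite3.connect(":memory:")
    con.execute(
        "CREATE TABLE raw_orders (order_id TEXT, amount_cents INTEGER, status TEXT)"
    )
    con.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw_orders)

    # Transform step: build a cleaned model, much like a dbt model's SELECT.
    con.execute("""
        CREATE TABLE fct_orders AS
        SELECT order_id,
               amount_cents / 100.0 AS amount_usd
        FROM raw_orders
        WHERE status = 'completed'
    """)
    return con.execute(
        "SELECT order_id, amount_usd FROM fct_orders ORDER BY order_id"
    ).fetchall()
```

In a real stack the transform SELECT would live in a dbt model and the load step in an Airflow DAG; the point here is only the load-then-transform ordering that distinguishes ELT from ETL.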
They are a well-funded healthcare technology company using AI and modern data infrastructure to transform how healthcare and public health decisions are made. The team is small, mission-driven, and building systems that turn raw healthcare data into actionable intelligence at scale.
Become a trusted data and AI advisor to clients, helping them translate business questions into AI-ready data architectures.
Design and implement AI-optimized data platforms, including cloud data warehouses, ETL/ELT pipelines, and analytic layers.
Engineer modern ELT/ETL pipelines that handle structured, semi-structured, and unstructured data to support AI and analytics use cases.
Aimpoint Digital is a dynamic and fully remote data and analytics consultancy. They work alongside the most innovative software providers in the data engineering space to solve their clients' toughest business problems and believe in blending modern tools and techniques with tried-and-true principles to deliver optimal data engineering solutions.
Design and implement scalable data ingestion and transformation pipelines using Databricks and cloud platforms.
Lead architecture decisions for modern data platforms, including Medallion Architecture and Lakehouse patterns.
Build and maintain ETL/ELT pipelines using Python and SQL, following engineering best practices.
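The Medallion (bronze/silver/gold) layering mentioned above can be illustrated with plain Python; the record shapes and cleaning rules here are assumptions made for the sketch, not a Databricks API:

```python
# Medallion layering sketch: bronze keeps raw records as-is, silver
# cleans and deduplicates them, gold aggregates for consumption.
def to_silver(bronze):
    """Clean: drop rows missing keys, cast amounts, dedupe by id."""
    seen, silver = set(), []
    for row in bronze:
        rid = row.get("id")
        if rid is None or row.get("amount") is None or rid in seen:
            continue
        seen.add(rid)
        silver.append({"id": rid,
                       "region": row.get("region", "unknown"),
                       "amount": float(row["amount"])})
    return silver

def to_gold(silver):
    """Aggregate: total amount per region, ready for analytics."""
    totals = {}
    for row in silver:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals
```

Each layer stays materialized in practice (e.g. as Delta tables), so silver can be rebuilt from bronze when cleaning rules change without re-ingesting from the sources.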
AOT Technologies helps enterprises and governments bring their ideas to life. As a boutique consulting firm, they partner with enterprises, startups, and governments to solve complex, mission-critical challenges. Their teams are collaborative and their leadership is transparent.
Collaborate with cross-functional teams to understand business requirements and translate them into effective cloud-based data architecture solutions.
Design data models, data flow diagrams, and schema structures that optimize data storage, retrieval, and processing on cloud platforms.
Design and develop data integration strategies to consolidate data from diverse sources into a unified and coherent format.
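Consolidating diverse sources into a unified format, as the bullet above describes, typically means writing one adapter per source that maps into a canonical schema. A minimal sketch follows; every field name is invented for illustration:

```python
# Sketch of consolidating two differently-shaped sources into one
# canonical record format. All field names are illustrative assumptions.
def from_crm(row):
    # The hypothetical CRM export nests contact info and uses CamelCase keys.
    return {"customer_id": str(row["CustomerID"]),
            "email": row["Contact"]["Email"].lower(),
            "signup_date": row["CreatedOn"][:10]}

def from_billing(row):
    # The hypothetical billing export is flat with snake_case keys.
    return {"customer_id": str(row["cust_id"]),
            "email": row["email_address"].lower(),
            "signup_date": row["created_at"][:10]}

def consolidate(crm_rows, billing_rows):
    """Map each source through its adapter, then dedupe on customer_id,
    preferring the CRM record when both sources carry the same customer."""
    unified = {}
    for row in billing_rows:
        rec = from_billing(row)
        unified[rec["customer_id"]] = rec
    for row in crm_rows:
        rec = from_crm(row)
        unified[rec["customer_id"]] = rec
    return list(unified.values())
```

The survivorship rule (CRM wins on conflict) is the kind of decision a data integration strategy has to make explicit per field; here it is a blanket assumption to keep the sketch short.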
Nimble Gravity solves hard problems and believes the right data can transform and propel growth for any organization. They are a team of outdoor enthusiasts, adrenaline seekers, and experienced growth hackers.
Help build scalable data solutions and streamline data ingestion.
Maintain high-quality databases that support our scientific and operational teams.
Optimize our data infrastructure to ensure efficient data access.
Funga is a public benefit corporation addressing the climate crisis by harnessing forest fungal networks. They are a team of passionate scientists and builders working to draw down at least three gigatons of carbon dioxide from the atmosphere by 2050.
Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.
SmartAsset is an online destination for consumer-focused financial information and advice, on a mission to help people make smart financial decisions; it reaches an estimated 59 million people each month. A successful $110 million Series D funding round in 2021 valued the company at over $1 billion.
Architect production-grade data pipelines that integrate clinical data.
Build and optimize cloud-native data infrastructure and ETL/ELT workflows.
Partner with data science and data analytics teams to build and operationalize data foundations.
Waymark is a mission-driven team transforming care for people with Medicaid benefits. Their community-based care teams use data science and ML technologies to support care across multiple states, reducing avoidable emergency department visits and hospitalizations.