Architect and maintain scalable, secure, and high-performing data pipelines to support analytics, reporting, and operational needs.
Develop and deploy production-grade data engineering code, ensuring reliability and performance across environments.
Manage end-to-end data workflows, including ingestion, transformation, modeling, and validation for multiple business systems.
Onebridge, a Marlabs Company, is a global AI and data analytics consulting firm that empowers organizations worldwide to drive better outcomes through data and technology. Since 2005, they have partnered with some of the largest healthcare, life sciences, financial services, and government entities across the globe.
Design, implement, and maintain distributed ingestion pipelines for structured and unstructured data.
Build scalable ETL/ELT workflows to transform, validate, and enrich datasets for AI/ML model training and analytics.
Support preprocessing of unstructured assets for training pipelines, including format conversion, normalization, augmentation, and metadata extraction (a minimal sketch follows this list).
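For context, here is a minimal sketch of the kind of asset preprocessing this role describes, using Pillow; the paths, target resolution, and output layout are illustrative assumptions, not Meshy's actual pipeline.

```python
# Illustrative asset preprocessing: format conversion, normalization,
# and metadata extraction. Paths and TARGET_SIZE are assumptions.
from pathlib import Path
from PIL import Image

TARGET_SIZE = (512, 512)  # hypothetical training resolution

def preprocess_image(src: Path, dst_dir: Path) -> dict:
    """Convert an asset to RGB PNG at a fixed size and extract basic metadata."""
    img = Image.open(src)
    meta = {"source": src.name, "format": img.format, "size": img.size}
    img = img.convert("RGB").resize(TARGET_SIZE)  # format conversion + normalization
    dst_dir.mkdir(parents=True, exist_ok=True)
    img.save(dst_dir / (src.stem + ".png"))
    return meta

if __name__ == "__main__":
    for path in Path("raw_assets").glob("*.jpg"):
        print(preprocess_image(path, Path("normalized_assets")))
```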
Meshy is a leading 3D generative AI company transforming content creation by enabling the generation of 3D models from text and images. They have a global team distributed across North America, Asia, and Oceania and are backed by venture capital firms such as Sequoia and GGV, with $52 million in funding.
Work alongside Caylent’s architects, engineering managers, and engineers to deliver AWS solutions.
Build solutions defined in project backlogs, writing production-ready, well-tested, and documented code across cloud environments.
Participate in Agile ceremonies such as daily standups, sprint planning, retrospectives, and demos.
Caylent is a cloud native services company that helps organizations bring the best out of their people and technology using Amazon Web Services (AWS). They are a global company and operate fully remote with employees in Canada, the United States, and Latin America, fostering a community of technological curiosity.
Develop AWS Lambda functions, APIs, and web applications to support business operations (see the handler sketch after this list).
Build and maintain ELT pipelines to integrate and transform data across platforms.
Design and implement AI-powered automation to improve operational efficiency.
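As a rough illustration of the Lambda-based ELT work above, here is a minimal Python handler sketch; the S3 trigger, field names, and filtering rule are assumptions rather than M3's actual implementation.

```python
# Minimal AWS Lambda handler sketch for a small ELT step.
# The event shape assumes an S3 object-created trigger; the
# "status" filter is a hypothetical transformation.
import json
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # Pull the newly created object referenced in the S3 event.
    record = event["Records"][0]["s3"]
    bucket, key = record["bucket"]["name"], record["object"]["key"]
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    rows = json.loads(body)
    # Hypothetical transformation: keep only active rows for loading.
    active = [r for r in rows if r.get("status") == "active"]
    return {"statusCode": 200, "body": json.dumps({"loaded": len(active)})}
```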
M3 USA offers digital solutions across healthcare, life sciences, pharmaceuticals, and more. They use the internet to build a healthier world and more efficient healthcare systems, with trusted digital platforms that engage physician communities globally. M3 USA prides itself on a dynamic and innovative work environment where every team member contributes to global health advancements.
Design, build, and scale performant data pipelines and infrastructure, primarily using ClickHouse, Python, and dbt (a minimal ingestion sketch follows this list).
Build systems that handle large-scale streaming and batch data, with a strong emphasis on correctness and operational stability.
Own the end-to-end lifecycle of data pipelines, from raw ingestion to clean, well-defined datasets consumed by downstream teams.
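To make the stack concrete, here is a minimal sketch of batch ingestion into ClickHouse using the clickhouse-connect client; the host, table schema, and sample rows are assumptions, and in a dbt-based stack the downstream modeling would live in dbt models rather than in this script.

```python
# Illustrative raw ingestion into ClickHouse; schema and data are assumptions.
import clickhouse_connect

client = clickhouse_connect.get_client(host="localhost")  # assumed local instance

client.command("""
    CREATE TABLE IF NOT EXISTS raw_transfers (
        block_number UInt64,
        address String,
        amount Float64
    ) ENGINE = MergeTree ORDER BY block_number
""")

# Hypothetical on-chain records; real pipelines would stream these in batches.
batch = [(18_000_000, "0xabc...", 1.5), (18_000_001, "0xdef...", 0.25)]
client.insert("raw_transfers", batch,
              column_names=["block_number", "address", "amount"])

print(client.query("SELECT count() FROM raw_transfers").result_rows)
```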
Nansen is a leading blockchain analytics platform that empowers investors and professionals with real-time, actionable insights derived from on-chain data. They are building the world’s best blockchain analytics platform, and data is at the heart of everything they do.
Work alongside engineers, engineering managers, and project managers to deliver AWS solutions.
Guide Cayliens and customers alike through Agile ceremonies like stand-ups and retrospectives.
Translate customer requirements into a workable backlog of tickets for engineers.
Caylent is a cloud native services company that helps organizations bring the best out of their people and technology using Amazon Web Services (AWS). They operate fully remote with employees in Canada, the United States, and Latin America and foster a community of technological curiosity.
Play a key role in ensuring the stability, scalability, and efficiency of data platforms and pipelines.
Work at the intersection of data engineering, database administration, and technical operations, supporting critical analytics workflows.
Monitor, troubleshoot, and optimize databases and data warehouses, while implementing automation and orchestration solutions using Python.
This position is posted by Jobgether on behalf of a partner company, and they use an AI-powered matching process to ensure your application is reviewed quickly.
Building features related to our data pipelines, LLM usage, analytics APIs, etc.
Doing whatever is necessary to deliver value to our customers.
Creating notifications, alert emails, scheduled reports, connectors into third party systems, etc.
Scrunch helps marketing teams rethink how their products and services are discovered and surfaced on AI platforms like ChatGPT, Claude, and Gemini. They have scaled rapidly since their commercial launch and have more than 500 paying brands using the platform.
Building and maintaining pipelines that ingest large amounts of data from various sources.
Setting up Reverse-ETL syncs to power operational analytics.
Writing various styles of automated tests to ensure operational reliability and data integrity (a minimal sketch follows this list).
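As an illustration of the testing bullet above, here is a minimal pytest-style sketch of data-integrity checks; the fetch function and its expectations are hypothetical stand-ins for a query against a reverse-ETL target.

```python
# Illustrative data-integrity tests, runnable with pytest.
# fetch_synced_rows() is a hypothetical stand-in for a warehouse query.

def fetch_synced_rows():
    """Stand-in for querying the reverse-ETL target; hardcoded for the sketch."""
    return [
        {"user_id": 1, "email": "a@example.com"},
        {"user_id": 2, "email": "b@example.com"},
    ]

def test_no_duplicate_user_ids():
    rows = fetch_synced_rows()
    ids = [r["user_id"] for r in rows]
    assert len(ids) == len(set(ids)), "duplicate user_id in synced data"

def test_required_fields_present():
    for row in fetch_synced_rows():
        assert row.get("email"), f"missing email for user {row['user_id']}"
```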
Super.com helps people save more, earn more, and get more out of life. It invests in learning, celebrates bold ideas, and creates pathways for career growth with a fast-moving and people-first culture.
Support DataDrive’s managed analytics service clients through their data-driven journeys and deliver measurable business value through data modeling, API integration, SQL scripting, and data pipeline development.
Bridge the gap between data applications and insightful business reports.
Participate in building our data platform from the ground up by exploring new technologies and vendors within our cloud-first environment.
DataDrive is a fast-growing managed analytics service provider that delivers modern cloud analytics data platforms to data-driven organizations, while also supporting ongoing training, adoption, and growth of their clients’ data cultures. DataDrive offers a unique team-oriented environment where team members can develop their skills and work directly with some of the most talented analytics professionals in the business.
Architect and lead the evolution of our modern data platform.
Design and build production LLM pipelines and infrastructure that power intelligent operations.
Own end-to-end data acquisition and integration architecture across diverse sources.
Brightwheel is the largest, fastest-growing, and most loved platform in early education. They are trusted by millions of educators and families every day. The team is passionate, talented, and customer-focused and embodies their Leadership Principles in their work and culture.
Ship technically challenging projects end-to-end in a fast-paced, iterative environment.
Propel the business forward and recognize the impact of your work on the company’s business metrics.
Own features, services, caches, and databases, including: deployment, monitoring, debugging, and testing.
Super.com is on a mission to help people save more, earn more, and get more out of life. They move fast, think big, and always put people first, investing in learning, celebrating bold ideas, and creating pathways for career growth.
Creating and maintaining optimal data pipeline architecture.
Assembling large, complex data sets that meet functional & non-functional business requirements.
Building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using relevant technologies.
Mercer Advisors works with families to help them amplify and simplify their financial lives through integrated financial planning, investment management, tax, estate, and insurance services. They serve over 31,300 families in more than 90 cities across the U.S. and are ranked the #1 RIA Firm in the nation by Barron’s.
Build ETL/ELT pipelines for extracting data from sources and placing it in target destinations.
Transform data into formats usable by AI-based solutions (see the sketch after this list).
Manage datasets for AI model training and fine-tuning.
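For a sense of what "formats usable by AI-based solutions" can mean in practice, here is a minimal sketch that converts records into the JSONL chat format commonly used for fine-tuning; the fields and file names are assumptions.

```python
# Illustrative transformation of tabular records into JSONL chat examples
# for fine-tuning. The record schema and output path are assumptions.
import json

records = [
    {"question": "What is ELT?",
     "answer": "Extract, load, then transform inside the warehouse."},
]

with open("train.jsonl", "w") as f:
    for r in records:
        example = {
            "messages": [
                {"role": "user", "content": r["question"]},
                {"role": "assistant", "content": r["answer"]},
            ]
        }
        f.write(json.dumps(example) + "\n")
```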
Jobgether is an AI-powered platform that connects job seekers with employers. They use AI to match candidates with roles and ensure applications are reviewed quickly and fairly.
Collaborate with business leaders, engineers, and product managers to understand data needs.
Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
NBCUniversal is one of the world's leading media and entertainment companies. They create world-class content, which they distribute across their portfolio of film, television, and streaming, and bring to life through their global theme park destinations, consumer products, and experiences. They champion an inclusive culture and strive to attract and develop a talented workforce to create and deliver a wide range of content reflecting our world.
Code, test, and document new or modified data pipelines.
Conduct logical and physical database design.
Perform root cause analysis on internal and external data.
Aker Systems builds and operates ground-breaking, ultra-secure, high-performance, cloud-based data infrastructure for the enterprise. They were recognised as a ‘One to Watch’ on the Sunday Times Tech Track and won Thames Valley Tech Company of the Year.
Design, build, and optimize data pipelines to centralize data in a modern warehouse (PostHog).
Automate ETL processes and existing spreadsheet-based reports (a minimal sketch follows this list).
Work closely with finance and business stakeholders to understand ad hoc reporting needs and deliver efficient solutions
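As a rough sketch of automating a spreadsheet-based report with pandas; the file name, columns, and staging target are illustrative assumptions, and loading the result into PostHog would be a separate step.

```python
# Illustrative replacement for a manual spreadsheet report.
# File and column names are assumptions; requires openpyxl for read_excel.
import pandas as pd

df = pd.read_excel("monthly_report.xlsx")          # the existing manual report
df["month"] = pd.to_datetime(df["month"])          # normalize types
summary = df.groupby("region", as_index=False)["revenue"].sum()
summary.to_csv("report_out.csv", index=False)      # staged for warehouse load
print(summary)
```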
Katapult is a nearshore software development agency that combines the best talent in LATAM with world-class execution and leadership experience and an AI-first approach to product engineering. Katapult works with PMF+ startups and businesses in the United States using a team-augmentation model.
Enable efficient consumption of domain data as a product by delivering and promoting strategically designed, actionable datasets and data models.
Build, maintain, and improve rock-solid data pipelines using a broad range of technologies like AWS Redshift, Trino, Spark, Airflow, and Kafka streaming for real-time processing (a minimal DAG sketch follows this list).
Support teams without data engineers in building decentralised data solutions and product integrations, for example, around DynamoDB.
Act as a data ambassador, promoting the value of data and our data platform among engineering teams and enabling cooperation.
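To illustrate the orchestration side of this stack, here is a minimal Airflow DAG sketch (assuming Airflow 2.x); the task logic, schedule, and names are illustrative assumptions, not OLX's actual pipeline.

```python
# Illustrative two-task Airflow DAG: extract then load, run hourly.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull new events from the source, e.g. a Kafka topic")

def load():
    print("write the transformed batch to the warehouse, e.g. Redshift")

with DAG(
    dag_id="example_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```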
OLX operates consumer brands that facilitate trade to build a more sustainable world. They have colleagues around the world who serve millions of people every month.