Own the reliability, scalability, and cost discipline of ingestion and transformation systems
Design and deliver infrastructure for real-time/near-real-time feature computation
Lead and grow a small, ambitious team while raising technical standards
Yazio is a nutrition app with millions of users in over 150 countries, driven by its mission to transform the world through healthy eating. The company champions a focus-driven culture that values efficiency, offering a high-impact environment supported by a diverse, international team committed to growth and well-being.
Drive end-to-end delivery of core data engineering initiatives.
Lead and mentor the Data Engineering team.
Own data ingestion and processing for live and historical datasets.
BHFT is a proprietary algorithmic trading firm managing the full trading lifecycle — from software development to designing and deploying trading strategies. We are a 230-person company with a strong technology focus, where 70% of the team are engineers and technical specialists.
Architect, build, and operate data infrastructure that powers Tebra’s intelligent features.
Translate business requirements into software solutions that accelerate our ability to deploy AI.
Monitor data pipelines, detect anomalies, and implement automated recovery systems.
Tebra unites Kareo and PatientPop, providing a digital backbone for practice well-being, supporting both products with a shared vision for modernized care. Over 100,000 providers trust Tebra to elevate patient experience and grow their practice, building the future of well-being with compassion and humanity.
Manage and mentor a high-performing team, fostering a culture of technical excellence.
Define the Data Engineering team vision, balancing immediate business needs with a long-term shift towards a self-service data mesh architecture.
Oversee the development of core data pipelines and platform tools, ensuring high performance for ingestion services.
UW provides utilities all in one place, with one bill covering energy, broadband, mobile, and insurance, to help customers save. They aim to double in size and are looking for people who can help them achieve this goal through innovation and impact.
Define and plan the long-term strategy for the Data Platform.
Design and develop scalable distributed systems for data management.
Improve and add features to the ETL framework while maintaining SLAs.
Jobgether is a platform that connects job seekers with companies using an AI-powered matching process, ensuring applications are reviewed quickly, objectively, and fairly against the role's core requirements.
Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimise data models, and enable self-serve analytics and AI.
End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.
Trustonic provides smartphone locking technology, enabling global access to devices and digital finance. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. They celebrate diversity and aim to do the right thing for each other, the community, and the planet.
Design, deploy, and maintain large-scale Kafka event streaming infrastructure.
Collaborate with engineers to enable new features and ensure data pipeline reliability.
Execute and automate Kafka cluster upgrades, migrations, and major version rollouts.
Yelp's engineering culture values teamwork, individual authenticity, and creative solutions. They are all about helping their users, growing as engineers, and having fun in a collaborative environment.
Build CDC pipelines and real-time streaming (Kafka/Flink)
Design and maintain data models (raw to staging to core)
Implement observability, data transformations and quality checks
Jobgether is a platform that connects job seekers with companies, using an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly.
Design and implement scalable, high-performing data pipelines and optimize our data architecture.
Build and deploy cloud-native solutions leveraging Azure Data Services, Databricks and other Big Data technologies.
Collaborate across teams to understand and support their data needs while ensuring the data architecture supports ongoing and future initiatives.
Redcare Pharmacy is Europe’s No.1 e-pharmacy, powered by passionate teams and cutting-edge innovation. They strive to create a healthy, collaborative work environment where every employee feels valued and inspired to contribute to their vision: “Until every human has their health”.
Design and implement scalable, high-throughput data ingestion systems.
Build and evolve a centralized data lake using Apache Iceberg.
Provide technical leadership through mentorship, code reviews, and design discussions.
Coupa provides a total spend management platform for businesses, which uses community-generated AI to multiply margins. They have a collaborative culture driven by transparency, openness, and a shared commitment to excellence, and are expanding their impact across the globe.
Build and maintain production quality data pipelines between operational systems and BigQuery.
Implement data quality and freshness checks and monitoring processes to ensure data accuracy and consistency.
Maintain and contribute to our ingestion framework that leverages various purpose-built data load tool (dlt) connectors.
Grafana Labs is a remote-first, open-source powerhouse that provides visualization tools. They help companies manage their observability strategies with the Grafana LGTM Stack and have a global collaborative culture with a passion for meaningful work.
Build and maintain the high-throughput, event-driven pipelines responsible for processing the history of assets and vulnerabilities.
Design systems that handle massive scale, ensuring data accuracy and real-time availability.
Use Terraform and Datadog to deploy, monitor, and ensure the health of services in production.
Tenable is the Exposure Management company, with 44,000 organizations using it to understand and reduce cyber risk. Their global employees support 65 percent of the Fortune 500, 45 percent of the Global 2000, and large government agencies, fostering a culture of belonging, respect, and excellence.
Design, implement, and maintain robust, scalable data pipelines to support AI, analytics, and operational reporting
Own and evolve the data warehouse architecture, ensuring it meets performance, flexibility, and governance needs
Ensure data integrity, availability, lineage, and observability across complex pipelines
Remote People is building the infrastructure to power borderless teams. Their technology handles global payroll, benefits, taxes, and compliance, enabling businesses to compliantly hire anyone anywhere at the push of a button. They are a growing, international family.
Build and support highly available non-interactive and real-time data pipelines.
Build fault-tolerant, self-healing, adaptive, and highly accurate data computational pipelines.
Implement and maintain dbt transformation models, CI pipelines, and data contracts.
Valtech is an experience innovation company that helps brands unlock new value in an increasingly digital world. They blend crafts, categories, and cultures to drive transformation for leading organizations. Their people are the heart of their success, and they foster a workplace where everyone has the support to thrive, grow and innovate.
Collaborate with Data Science, Product Managers and Software Engineers to build robust ETL pipelines.
Contribute to architecture decisions, observability tooling, and data quality initiatives.
Contribute to a scalable internal framework for managing prompt engineering pipelines and AI workflows.
Federato is on a mission to defend the right to efficient, equitable insurance for all by enabling insurers to provide affordable coverage. They are well funded by the backers of Salesforce, Veeva, Zoom, Box, and others, and they value learning and the ability to change their minds.
Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines.
Architect and implement robust data infrastructure capable of handling high-volume data ingestion and processing.
Implement automated data quality checks, validation rules, and monitoring frameworks.
ShyftLabs is a data product company founded in early 2020 that works with Fortune 500 companies. They deliver digital solutions that help accelerate business growth across various industries through innovation, and they value strong business awareness.
Own Technical Excellence: Define and drive the architecture, design patterns, and engineering standards for the feature store platform.
V2 Implementation: Help design and execute the next generation of our feature store, building for scale, low-latency serving, and enterprise-grade reliability.
Guide Product Roadmap: Partner with Product and leadership to help shape the technical roadmap.
Redis created the product that runs the fast apps our world runs on. At Redis, people build the technology, tell its story, and sell it to 10,000+ customers worldwide, creating a faster world with simpler experiences.
Design and evolve scalable data pipelines and architectures.
Act as the primary anchor for data ingestion, transformation, and storage solutions.
Ensure mission-critical data is accessible and reliable.
CodeRoad provides end-to-end software development services, helping businesses scale with ideal infrastructure solutions. From staff augmentation to dedicated IT teams and general software engineering, their nearshore technology services empower businesses to thrive in an ever-evolving digital landscape.