Manage and guide data teams to execute on enterprise data strategy.
Provide technical guidance and mentor team members on data technologies.
Design end-to-end data processing for enterprise data warehousing.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Build and manage business data pipelines and transform Firefox telemetry data into structured datasets.
Partner with data scientists, product, and marketing teams to turn datasets into models and metrics.
Ensure data accuracy and performance using observability tools and resolve data issues.
Mozilla Corporation is a technology company backed by a non-profit that has shaped the internet, creating brands like Firefox. With millions of users globally, they work in areas including AI and social media while staying focused on making the internet better for people.
Lead and grow a team of data engineers, providing mentorship and technical guidance.
Own execution of customer integrations across multiple product lines, ensuring on-time delivery.
Improve data quality and pipeline reliability by investing in better alerting and resilience.
Afresh is the leading AI company in fresh food, partnering with grocers to order billions of dollars of fresh food. They are on a mission to eliminate food waste and make fresh food accessible to all, and have prevented 200M lbs of food waste in 2025 alone.
Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.
SmartAsset is an online destination for consumer-focused financial information and advice whose mission is helping people make smart financial decisions, reaching an estimated 59 million people each month. A successful $110 million Series D funding round in 2021 valued the company at over $1 billion.
Own organization-wide data architecture, defining standards and designs.
Design and develop data pipelines, integrations, and platform features.
Partner with product managers to define new data features and capabilities.
The company offers a connected equipment platform for managing mixed assets and values quality, continuous learning, and collaboration within a dynamic team environment.
Build core infrastructure software (pipelines, APIs, data modelling) as part of our client's data platform team.
Coach and mentor other engineers to support the growth of their technical expertise.
Implement appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption.
YLD is a software engineering and design consultancy that creates digital capabilities for their clients. The company has offices in London, Lisbon, and Porto and aims to attract, inspire, develop, and retain extraordinary people.
Lead and manage a team of ~6 data engineers, driving execution, performance, and career development.
Own Kin’s data platform, including ingestion, storage, transformation, pipeline orchestration, and governance.
Build and optimize scalable data pipelines and architectures using tools like Snowflake, Databricks, dbt, and Airflow.
Kin simplifies homeowners' lives with smarter insurance, expanding to meet all homeowner needs. They employ Kinfolk across 35+ states and are recognized for growth, customer satisfaction, and a focus on long-term sustainability, fostering a culture of meaningful work and real impact.
Design and implement scalable data architectures to support business needs.
Build and optimize data pipelines, ensuring data accessibility and security.
Develop and maintain data models, databases, and data lakes, with robust data governance.
Terawatt Infrastructure delivers large-scale, turnkey charging solutions for companies rapidly deploying AV and EV fleets. With a growing portfolio of sites across the US, Terawatt is building the permanent transportation and logistics infrastructure of tomorrow through capital, real estate, development, and site operations solutions.
Assist in delivering on internal data and business intelligence initiatives.
Design, implement, and maintain ETL processes to support data warehousing and systems integration needs.
Develop and enhance data models to deliver value to the organization.
CRB delivers life-changing solutions for manufacturers in the life sciences and food and beverage industries. They have over 1,100 expert professionals, and their mission, vision, and core values center around client satisfaction and employee experience.
Design, build, and maintain scalable data pipelines.
Develop and optimize ETL/ELT processes using cloud data technologies.
Partner with teams to understand data requirements and improve data capture strategies.
Blueprint is a technology solutions firm with a strong presence across the United States, solving complicated problems for their clients. They are bold, smart, agile, and fun, and believe in unique perspectives, building teams of people with diverse skillsets and backgrounds.
Collaborate with product managers, data analysts, and machine learning engineers to develop pipelines and ETL tasks.
Establish data architecture processes and practices that can be scheduled, automated, and replicated, and that serve as standards.
Manage individual Data Engineers to foster learning, growth, and success at Doximity.
Doximity is transforming the healthcare industry with a mission to help every physician be more productive and provide better care for their patients. As medicine's largest network in the United States, they are committed to building diverse teams with an inclusive culture.
Lead the design, development, and technical stewardship of data engineering systems.
Provide direct leadership, accountability, and professional development for the Clinical Data Engineering team.
Serve as the primary data engineering liaison to clinical and operational partners.
OHSU's Information Technology Group (ITG) provides and supports technology and information services. The Business Intelligence and Advanced Analytics (BIAA) Division leverages informational assets to enhance financial, clinical, operational, and research decision-making.
Partner closely with business stakeholders to understand their challenges and design end-to-end architecture.
Design, develop, and own robust, efficient, and scalable data models in Snowflake and Iceberg using dbt and advanced SQL.
Build and manage reliable data pipelines and CI/CD workflows using tools like Airflow, Python, and Terraform.
Motive empowers people who run physical operations with tools to make their work safer, more productive, and more profitable. Motive serves nearly 100,000 customers and provides complete visibility and control across a wide range of industries.
Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
Implement data quality checks, monitoring, and validation processes.
Automate manual processes into centralized and scalable solutions.
Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2,000 colleagues worldwide and are traded on Nasdaq as part of Informa PLC.
Build and maintain robust data pipelines processing large volumes of data.
Update and optimise our data platform for speed, scalability, and cost.
Develop processes and tools to monitor and analyse model performance and data accuracy.
Moniepoint is Africa's all-in-one financial ecosystem, empowering businesses and their customers with seamless payment, banking, credit, and management tools. They processed $182 billion in 2023 and are Nigeria’s largest merchant acquirer, cultivating a culture of innovation, teamwork, and growth.
Design and build mission-critical data pipelines with a highly scalable distributed architecture.
Help continually improve ongoing reporting and analysis processes, simplifying self-service support for business stakeholders.
Build and support a reusable framework to ingest, integrate, and provision data.
StockX is a Detroit-based technology leader focused on the online market for sneakers, apparel, accessories, electronics, collectibles, trading cards, and more. They employ 1,000 people across offices and verification centers around the world and their platform connects buyers and sellers using dynamic pricing mechanics.
Design, build, and maintain scalable data pipelines for clients across industries.
Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.
NuView Analytics helps companies accelerate the time to insights from their data through data analytics, diligence, and fractional data science. They are a growth-stage company helping clients drive additional value from the data they are sitting on, and they value humility, intellectual rigor, and stewardship.
Lead and grow a team of data engineers responsible for SentiLink’s data platform and infrastructure.
Define and drive the technical vision for data ingestion, processing, storage, and serving systems.
Design and evolve scalable data pipelines (batch and real-time) to support product and data science use cases.
SentiLink provides identity and risk solutions that empower institutions and individuals to transact with confidence. They have grown quickly and are backed by world-class investors.
Lead the technical onboarding of partner institutions onto UDTS.
Design, build, and maintain scalable data pipelines and architectures.
Collaborate with team members to set engineering standards and guide data infrastructure strategy.
DataKind is a non-profit organization that uses data science and AI to address global challenges. They work with various sectors like health, humanitarian action, climate, economic opportunity, and education to create data-driven tools.