Oversee a team of data engineers, guiding them in developing and managing data systems.
Streamline operational processes, enhance data quality, and ensure the reliability of data pipelines.
Foster a collaborative culture that empowers engineers and contributes to innovation.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates and share the shortlist directly with the hiring company.
Build and lead a team of 4-5 data engineers focused on reusable product artifacts
Own the product data engineering backlog in partnership with product management
Define and enforce technical standards for notebooks, pipelines, QC modules, and documentation
Qualified Health is redefining what’s possible with Generative AI in healthcare. They provide the guardrails for safe AI governance, healthcare-specific agent creation, and real-time algorithm monitoring, working alongside leading health systems to drive real change. They are a fast-growing company backed by premier investors.
Manage and support a team of 6 Data Engineers, helping them focus on impactful technical initiatives as the platform scales.
Drive execution excellence, ensuring the team delivers with high velocity, quality, and reliability.
Foster a healthy and sustainable team environment by helping the team manage workload and focus on meaningful engineering work.
Bluefish believes that AI represents the next major chapter of the internet and that consumers will increasingly use AI to consume information and media online. Bluefish is building the platform that helps brands engage consumers on this new AI channel, with powerful enterprise tools to manage AI brand safety and engage consumers with thoughtful and personalized AI marketing experiences.
Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning
Partner cross-functionally with analytics, product, engineering, and operations to deliver high-quality data solutions that drive measurable business impact
Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response
Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers and risk-takers who know the value of peace of mind in an unpredictable world.
Design, build, and maintain scalable data pipelines for clients across industries.
Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.
NuView Analytics helps companies accelerate the time to insight from their data through data analytics, diligence, and fractional data science. Their clients are growth-stage companies looking to drive additional value from the data they are sitting on, and the team values humility, intellectual rigor, and stewardship.
Partner closely with business stakeholders to understand their challenges and design end-to-end architecture.
Design, develop, and own robust, efficient, and scalable data models in Snowflake and Iceberg using dbt and advanced SQL.
Build and manage reliable data pipelines and CI/CD workflows using tools like Airflow, Python, and Terraform.
Motive empowers people who run physical operations with tools to make their work safer, more productive, and more profitable. Motive serves nearly 100,000 customers and provides complete visibility and control across a wide range of industries.
Build core infrastructure software (pipelines, APIs, data modeling) as part of our client's data platform team.
Coach and mentor other engineers to grow their technical expertise.
Implement appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption.
YLD is a software engineering and design consultancy that creates digital capabilities for their clients. The company has offices in London, Lisbon, and Porto and aims to attract, inspire, develop, and retain extraordinary people.
Build and manage business data pipelines and transform Firefox telemetry data into structured datasets.
Partner with data scientists, product, and marketing teams to turn datasets into models and metrics.
Ensure data accuracy and performance using observability tools and resolve data issues.
Mozilla Corporation is a technology company backed by a non-profit that has shaped the internet, creating brands like Firefox. With millions of users globally, they focus on areas including AI and social media while remaining focused on making the internet better for people.
Own the delivery of scalable internal data solutions.
Translate business needs into clear technical designs and working systems.
Build and improve data pipelines, integrations, and automation.
Transparent Hiring is recruiting for a fast-growing reinsurance company operating across Germany and the United States. The environment is collaborative and driven by a strong “build and ship” mindset.
Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.
SmartAsset is an online destination for consumer-focused financial information and advice, whose mission is helping people make smart financial decisions, reaching an estimated 59 million people each month. A successful $110 million Series D funding round in 2021 valued the company at over $1 billion.
Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
Harden existing production pipelines, identify bottlenecks, and set up robust monitoring, testing processes, and documentation templates.
Build custom applications and integrations that automate manual customer-operations tasks, helping Product Operations, Support, and SRE in their day-to-day activities.
Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.
Design and implement end-to-end data pipelines to transform raw data into actionable insights.
Lead the development of complex data models and transformations leveraging Snowflake and dbt.
Ensure our data infrastructure remains scalable, reliable, and production-ready.
They are a Managed Nearshore Teams provider headquartered in Austin, specializing in building and embedding high-performing software development teams. Their model gives you the opportunity to work on international challenges, collaborate with diverse teams, and grow your career while being part of a company that values expertise, creativity, and impact.
Collaborate with product managers, data analysts, and machine learning engineers to develop pipelines and ETL tasks.
Establish data architecture processes and practices that can be scheduled, automated, and replicated, and that serve as standards.
Manage individual Data Engineers to foster learning, growth and success at Doximity.
Doximity is transforming the healthcare industry with a mission to help every physician be more productive and provide better care for their patients. As medicine's largest network in the United States, they are committed to building diverse teams with an inclusive culture.
Own the architecture, reliability, and evolution of our GCP data platform.
Translate business priorities and client requirements into a clear, deliverable technical roadmap.
Consumer Edge empowers top investment firms and global consumer and corporate brands with cutting-edge insights into consumer spending, leveraging privacy-compliant data across geographies. Their real-time intelligence and merchant-level benchmarks give clients a competitive edge.
Lead, mentor, and scale a high-performing data engineering team.
Design and evolve our core data infrastructure on AWS, Apache Airflow, and Apache Spark.
Tekmetric is an all-in-one, cloud-based platform helping auto repair shops run smarter, grow faster, and serve customers better. Officially founded in Houston in 2017, Tekmetric has grown from a single shop’s vision to the industry’s leading solution. They value transparency, integrity, innovation, and a service-first mindset.
Develop and maintain data models for core package application and reporting databases.
Monitor execution and performance of daily pipelines, triage and escalate any issues.
Collaborate with analytics and business teams to improve data models and data pipelines.
Bluelight Consulting designs and develops innovative software to enhance users' lives, focusing on quality and customer satisfaction. They foster a collaborative work environment where team members can grow, and are expanding across the US and Central/South America, seeking exceptional talent.
Build, optimize, and maintain data pipelines that power our business
Define and build abstracted, reusable data sets for use in Business Intelligence, Marketing, and Data Science research
Design, build, and evangelize a federated data validation framework to monitor potential data inconsistencies
Garner Health strives to transform the healthcare economy, delivering accessible, high-quality healthcare. They are a fast-growing healthcare technology company dedicated to making a meaningful impact on healthcare at scale with a team of talented, mission-driven individuals.
Lead the data engineering team and manage its day-to-day activities.
Drive data architecture development using AI innovations.
Collaborate with product teams to enhance data strategies.
Jobgether is a platform that uses AI-powered matching to connect job seekers with employers, ensuring applications are reviewed quickly and fairly. They aim to streamline the hiring process and support both candidates and companies in finding the right fit.
Lead the architecture and evolution of scalable, distributed data pipelines, ensuring high availability and performance at scale
Build and maintain distributed web scraping systems using tools such as Playwright, Selenium, and BeautifulSoup
Integrate AI and LLMs into engineering workflows for code generation, automation, and optimization
MercatorAI is building scalable data infrastructure to power high-quality, data-driven decision making at scale. As an early-stage company, the team is focused on creating robust, future-ready systems that can handle complex data ingestion, transformation, and delivery across a growing national footprint.
Define and work within our data governance practices, including a data catalog/dictionary and data quality management.
Manage lights-out data operations of our ETL/ELT pipelines, ranging from streaming inputs to batch file loads, to support customer reporting, development, and operations.
Untangle, normalize, and synthesize data as needed to permit joins and comparisons across disparate sources, as well as further analysis, including ML processing.
Evermore is a technology company that administers Smart Benefits to connect people to products and services. They are backed by leading investors including General Catalyst, Define Ventures, Lightspeed Venture Partners, Pinegrove Capital Partners, and Qiming Venture Partners.