Collaborate with Data Science, Product Managers and Software Engineers to build robust ETL pipelines.
Contribute to architecture decisions, observability tooling, and data quality initiatives.
Contribute to a scalable internal framework for managing prompt engineering pipelines and AI workflows.
Federato is on a mission to defend the right to efficient, equitable insurance for all by enabling insurers to provide affordable coverage. They are well funded by investors behind Salesforce, Veeva, Zoom, Box, and others, and they value learning and the ability to change their minds.
Build and Maintain Bronze/Silver Layer Pipelines: ensure core data sources land accurately, on time, and with full lineage.
Lead Data Ingestion, Transformation, and Enrichment: own the end-to-end pipeline from raw file landing through cleansed, conformed staging tables, including deduplication, standardization, code mapping, and entity resolution.
Develop Automated Ingestion Pipelines: use Snowpipe, Matillion, or custom solutions with reliability, observability, and minimal manual intervention in mind.
Precision AQ is building a centralized Data Hub to consolidate fragmented data infrastructure, establish enterprise-wide data governance, and enable AI-ready analytics across our life sciences portfolio. They value innovation and a collaborative environment.
Design and implement robust, production-grade pipelines using Python, Spark SQL, and Airflow.
Lead efforts to canonicalize raw healthcare data into internal models.
Onboard new customers by integrating their raw data into internal pipelines and canonical models.
Machinify is a healthcare intelligence company delivering value, transparency, and efficiency to health plan clients. They serve over 85 health plans, including many of the top 20, representing more than 270 million lives, with an AI-powered platform and expertise.
Design, build, and maintain scalable data pipelines using Microsoft Fabric and Apache Airflow
Ingest, transform, and integrate data from a variety of sources, including relational systems, APIs, and MongoDB
Design and maintain analytical data models, including fact and dimension tables, to support reporting and analytics
Theoria Medical is a comprehensive medical group and technology company dedicated to serving patients across the care continuum with an emphasis on post-acute care and primary care. Theoria serves facilities across the United States with a multitude of services to improve the quality of care delivered, refine facility processes, and enhance critical relationships.
Designing, developing, and maintaining robust, scalable, and well-documented data pipelines
Collaborating with the Tech Lead to ensure quality, performance, and maintainability of Data products
Contributing to the continuous improvement of engineering practices (CI/CD, automation, testing, documentation)
Accor Tech & Digital is the power engine of Accor's technology, digital business, and transformation. Their 5,000 talents are committed to delivering the best tech and digital experiences to guests, hotels, and staff across 110 countries and to shaping the future of hospitality.
Build and maintain scalable data pipelines from ingestion through transformation and delivery.
Design, build, and maintain our data warehouse and data marts.
Partner with stakeholders to translate business needs into clean data models.
Gurobi Optimization focuses on mathematical optimization. They empower customers to expand their use of mathematical optimization technology in order to make smarter decisions and solve some of the world's toughest and most impactful business problems.
Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Own the delivery of scalable internal data solutions.
Translate business needs into clear technical designs and working systems.
Build and improve data pipelines, integrations, and automation.
Transparent Hiring is recruiting for a fast-growing reinsurance company operating across Germany and the United States. The environment is collaborative and driven by a strong “build and ship” mindset.
Design, build, and maintain scalable data pipelines.
Apply dimensional modeling techniques to design tables and views.
Automate manual processes to improve efficiency and speed.
The Knot Worldwide champions celebration and powers meaningful moments for millions around the world. They are a team of passionate dreamers and doers united by connection and committed to the global community, believing the best ideas come from empowered and collaborative teams.
Design, develop, and optimize data architecture and pipelines aligned with ETL/ELT principles.
Architect workflows using DBT to convert raw data into actionable analytics.
Maintain production data pipelines with Python, DBT, Matillion, and Snowflake.
Jobgether is a platform that connects job seekers with partner companies. They use AI-powered matching to ensure applications are reviewed quickly and fairly.
Lead the foundational setup of new data environments.
Design, build, and manage scalable data pipelines.
Develop and maintain robust DBT data models to support semantic layer build-outs.
CompassX is a boutique business and technology consulting firm that helps Fortune 500 and high-growth clients deliver their most strategic initiatives through digital and data-driven projects. Recognized as a three-time winner of Consulting Magazine’s Best Boutique Firms to Work For, consultants value the freedom to shape their client work and maintain a direct line to leadership.
Design, develop, and maintain a core Python ETL framework.
Develop and optimize an automated refresh pipeline orchestrated through AWS Batch, Lambda, Step Functions, and EventBridge.
Build Python integrations with external systems that are robust, testable, and reusable.
BlastPoint is a B2B data analytics startup that helps companies engage with customers more effectively by discovering insights in their data. Founded in 2016 by Carnegie Mellon alumni, they are a tight-knit, forward-thinking team that serves diverse industries including energy, finance, retail, and transportation.
Own the data infrastructure and pipelines for business-critical product and user data.
Conceptualize and build best-in-class internal tools that enable others to increasingly self-serve their data needs.
Take an AI-first approach to scaling Runway's data tooling and its adoption.
Runway is building AI to simulate the world through merging art and science. They aspire to continuously build impossible things and their ability to do so relies on building an incredible team.
Designing, building and maintaining data pipelines and data warehouses.
Developing ETL ecosystem tooling using Python, external APIs, Airbyte, Snowflake, dbt, PostgreSQL, MySQL, and Amazon S3.
Helping to define automated solutions to complex problems in understanding data, users, and the market.
Neighborhoods.com provides real estate software platforms, including 55places.com and neighborhoods.com. They strive to foster an inclusive environment, comfortable for everyone, and will not tolerate harassment or discrimination of any kind.
Design, build, and maintain ETL/ELT pipelines and integrations across legacy and cloud systems.
Model, store, and transform data to support analytics, reporting, and downstream applications.
Build API-based and file-based integrations across enterprise platforms.
Jobgether is a platform that uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company.
Architect, build, and operate data infrastructure that powers Tebra’s intelligent features.
Translate business requirements into software solutions that accelerate our ability to deploy AI.
Monitor data pipelines, detect anomalies, and implement automated recovery systems.
Tebra unites Kareo and PatientPop, providing a digital backbone for practice well-being, supporting both products with a shared vision for modernized care. Over 100,000 providers trust Tebra to elevate patient experience and grow their practice, building the future of well-being with compassion and humanity.
Build CDC pipelines and real-time streaming (Kafka/Flink)
Design and maintain data models (raw to staging to core)
Implement observability, data transformations and quality checks
Jobgether is a platform that connects job seekers with companies. They use an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly.
Develop and optimize real-time data pipelines from existing data sources to support marketing science initiatives.
Structure and organize large-scale datasets to ensure high performance, scalability, and reliability.
Build and maintain production-grade data tables powering 20–50 marketing signals across ~20 million advertisers.
ItD is a consulting and software development company that blends diversity, innovation, and integrity with real business results while rejecting rigid hierarchy. They are a woman- and minority-led firm employing a global community, with excellent benefits such as medical, dental, vision, and life insurance, paid holidays, a 401(k), and networking and career learning and development programs.
Design, build, and maintain robust ETL/ELT pipelines to ingest large-scale datasets and high-frequency streams.
Lead the design and evolution of our enterprise data warehouse, ensuring it is scalable and performant.
Manage our data transformation layer using Dataform (preferred) or dbt to orchestrate complex, reliable workflows.
UW provides utilities all in one place, including energy, broadband, mobile, and insurance. They aim to double in size and offer savings to customers, fostering a culture that values imaginative and pragmatic problem-solvers.