Collaborate with analytics and finance teams to enable trusted metrics and dashboards in Looker
Design, build, and maintain scalable data pipelines using Python and dbt
Develop and optimize BigQuery data models for analytics and product use cases
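To illustrate the Python/dbt/BigQuery stack named in the bullets above, here is a minimal load-then-transform sketch; the project, table, and dbt selector names are hypothetical, and it assumes dbt-core 1.5+ for the programmatic runner:

```python
import pandas as pd
from google.cloud import bigquery
from dbt.cli.main import dbtRunner  # programmatic entry point, dbt-core >= 1.5

def load_and_transform(df: pd.DataFrame) -> None:
    """Load a raw extract into BigQuery, then rebuild the dbt models on top of it."""
    client = bigquery.Client()  # uses application-default credentials
    # "my-project.raw.events" is a hypothetical destination table.
    job = client.load_table_from_dataframe(
        df,
        "my-project.raw.events",
        job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
    )
    job.result()  # block until the load finishes

    # "marts" is an assumed dbt selector for the downstream models.
    result = dbtRunner().invoke(["run", "--select", "marts"])
    if not result.success:
        raise RuntimeError("dbt run failed")
```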
UJET delivers a cloud platform that redefines customer experience with AI and a mobile-first approach. They are an innovative company committed to exceptional interactions and accelerated growth in the AI-driven world.
Demonstrate deep knowledge of the data engineering domain to build and support non-interactive and real-time, highly available data pipelines.
Build fault-tolerant, self-healing, adaptive, and highly accurate data computational pipelines.
Implement and maintain dbt transformation models, CI pipelines, and data contracts for curated campaign, ad group, keyword, audience, and landing-page marts.
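dbt enforces data contracts through YAML model configuration; as a language-neutral illustration of the same idea, this sketch validates a frame against an agreed schema. The mart and column names are invented for the example:

```python
import pandas as pd

# A hypothetical contract for a curated keyword mart: column -> expected dtype.
KEYWORD_MART_CONTRACT = {
    "campaign_id": "int64",
    "keyword": "object",
    "clicks": "int64",
    "spend": "float64",
}

def enforce_contract(df: pd.DataFrame, contract: dict[str, str]) -> None:
    """Fail fast if the frame drifts from the agreed schema."""
    missing = set(contract) - set(df.columns)
    if missing:
        raise ValueError(f"missing contracted columns: {sorted(missing)}")
    for col, dtype in contract.items():
        if str(df[col].dtype) != dtype:
            raise TypeError(f"{col}: expected {dtype}, got {df[col].dtype}")
```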
Valtech is an experience innovation company that helps the world's most recognized brands unlock new value in an increasingly digital world. They blend crafts, categories, and cultures to drive transformation for leading organizations. The people at Valtech are the heart of their success, and they foster a workplace where everyone has the support to thrive, grow and innovate.
Build and maintain scalable data pipelines from ingestion through transformation and delivery.
Design, build, and maintain our data warehouse and data marts.
Partner with stakeholders to translate business needs into clean data models.
Gurobi Optimization focuses on mathematical optimization software. They empower customers to expand their use of the technology in order to make smarter decisions and solve some of the world's toughest and most impactful business problems.
Create and maintain optimal data pipeline architecture
Ensure reliability and scalability of our data infrastructure
Build the infrastructure required for optimal extraction, transformation, and loading of data
Kamino Retail (powered by Equativ) is a pioneering SaaS platform at the forefront of retail media innovation. They equip retailers with advanced tools and solutions to revolutionize their advertising strategies, amplify customer engagement, and drive results.
Build and maintain Azure Data Factory pipelines for data ingestion.
Write Python code in Databricks for data cleaning and transformation.
Monitor daily jobs and troubleshoot pipeline failures to ensure reliability.
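A minimal sketch of the kind of cleaning-and-transformation step the Databricks bullet describes, using PySpark; the `raw.orders` and `clean.orders` table names are hypothetical, and in Databricks the Spark session is provided for you:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already available in a Databricks notebook

# "raw.orders" and "clean.orders" are hypothetical catalog tables.
raw = spark.table("raw.orders")

clean = (
    raw.dropDuplicates(["order_id"])                       # remove replayed events
       .withColumn("email", F.lower(F.trim("email")))      # normalize identifiers
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("order_ts").isNotNull())              # drop unusable rows
)

clean.write.mode("overwrite").saveAsTable("clean.orders")
```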
Jobgether is a platform that helps candidates find relevant jobs through AI-powered matching. The company ensures applications are reviewed quickly, objectively, and fairly against the role's core requirements.
Deliver on data warehouse and reporting requirements using Google BigQuery and the GCP suite.
Architect, design, and build pipelines to move large amounts of data from a variety of sources.
Improve existing data warehouse architecture to enable robust user-facing and internal reporting.
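As a sketch of one common BigQuery warehouse-architecture improvement, partitioning and clustering a reporting table keeps user-facing queries fast and cheap as data grows; the project, dataset, and column names here are assumptions:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical reporting table: partition by day, cluster by the most
# selective filter columns so dashboard queries scan less data.
ddl = """
CREATE TABLE IF NOT EXISTS `my-project.reporting.wallet_activity` (
  wallet_id STRING,
  activity_ts TIMESTAMP,
  asset STRING,
  amount NUMERIC
)
PARTITION BY DATE(activity_ts)
CLUSTER BY wallet_id, asset
"""
client.query(ddl).result()
```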
Bitwave is a rapidly expanding startup that specializes in software for businesses that use digital assets and crypto. Their platform provides cryptocurrency accounting, tax tracking, bookkeeping, digital asset treasury management, and crypto AR/AP tooling, and they recently added full DeFi support.
Lead, mentor, and develop high-performing data engineering squads delivering production-grade pipelines and services.
Set technical and operational standards for quality, documentation, and reliability.
Partner with Program Management to plan, prioritize, and track delivery against sprint goals.
Forbes Digital Marketing Inc. is a high-growth digital media and technology company dedicated to helping consumers make confident, informed decisions about their money, health, and everyday life. They combine data-driven content, rigorous experimentation, and modern engineering to power a portfolio of global products and partnerships.
Design, build, and scale the lakehouse architecture that underpins analytics, machine learning, and AI.
Modernize our data ecosystem, making it discoverable, reliable, governed, and ready for self-service and intelligent automation.
Operate anywhere along the data lifecycle from ingestion and transformation to metadata, orchestration, and MLOps.
OnX is a pioneer in digital outdoor navigation with a suite of apps. With more than 400 employees, they have created regional “Basecamps” to help remote employees find connection and inspiration.
Design, develop, and maintain ETL/ELT pipelines on cloud-based data platforms.
Build data ingestion, transformation, and orchestration workflows using tools such as Azure Data Factory, Airflow, Fivetran, or similar.
Develop transformations and data processing logic using platforms such as Databricks, Snowflake, or equivalent.
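A minimal Airflow 2.x DAG of the shape these bullets describe, with a hypothetical DAG id and stubbed tasks standing in for the real extraction and Databricks/Snowflake processing:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    ...  # stub: pull from the source system or trigger Fivetran/ADF

def transform() -> None:
    ...  # stub: submit the Databricks or Snowflake processing job

with DAG(
    dag_id="daily_ingestion",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task   # transform runs only after extract succeeds
```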
Ankura Consulting Group, LLC is an independent global expert services and advisory firm. They deliver services and end-to-end solutions to help clients at critical inflection points related to conflict, crisis, performance, risk, strategy, and transformation, and the firm consists of more than 2,000 professionals.
Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Architect, build, and operate data infrastructure that powers Tebra’s intelligent features.
Translate business requirements into software solutions that accelerate our ability to deploy AI.
Monitor data pipelines, detect anomalies, and implement automated recovery systems.
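One simple way to approach the anomaly-detection and automated-recovery bullets is a z-score check over recent pipeline metrics plus a retry wrapper; this is an illustrative sketch, not Tebra's actual system:

```python
import statistics

def detect_anomaly(history: list[float], latest: float, z_threshold: float = 3.0) -> bool:
    """Flag the latest pipeline metric (e.g. a row count) if it deviates more
    than z_threshold standard deviations from recent history."""
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > z_threshold

def run_with_recovery(step, retries: int = 2):
    """Automated recovery: retry a failed pipeline step before escalating."""
    for attempt in range(retries + 1):
        try:
            return step()
        except Exception:
            if attempt == retries:
                raise  # escalate after exhausting retries

# Example: today's load is far below the recent daily row counts.
assert detect_anomaly([10_120, 9_980, 10_045, 10_210], 2_300) is True
```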
Tebra unites Kareo and PatientPop, providing a digital backbone for practice well-being, supporting both products with a shared vision for modernized care. Over 100,000 providers trust Tebra to elevate patient experience and grow their practice, building the future of well-being with compassion and humanity.
Lead the end-to-end data architecture, designing and implementing data pipelines, warehouses, and lakes that handle petabyte-scale datasets.
Collaborate with product teams to enable data-driven decision-making across the organization.
Establish best practices for data quality, governance, and security while mentoring senior engineers and conducting technical reviews.
Cority is a global enterprise EHS software provider creating industry-leading technology. They have been around for over 35 years and are known for strong employee culture and client satisfaction.
Design, develop, and maintain a core Python ETL framework.
Develop and optimize an automated refresh pipeline orchestrated through AWS Batch, Lambda, Step Functions, and EventBridge.
Build Python integrations with external systems that are robust, testable, and reusable.
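A hedged sketch of the orchestration pattern named above: a Lambda handler, triggered by an EventBridge schedule, starts a Step Functions execution that drives the refresh (AWS Batch jobs sit behind the state machine; the ARN is a placeholder):

```python
import json

import boto3

sfn = boto3.client("stepfunctions")

# Hypothetical state machine that runs the automated refresh pipeline.
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:refresh"

def handler(event, context):
    """Lambda entry point, invoked by an EventBridge schedule."""
    response = sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        input=json.dumps({"trigger": event.get("source", "manual")}),
    )
    return {"executionArn": response["executionArn"]}
```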
BlastPoint is a B2B data analytics startup that helps companies engage with customers more effectively by discovering insights in their data. Founded in 2016 by Carnegie Mellon alumni, they are a tight-knit, forward-thinking team that serves diverse industries including energy, finance, retail, and transportation.
Design, build, and maintain robust ETL/ELT pipelines to ingest large-scale datasets and high-frequency streams.
Lead the design and evolution of our enterprise data warehouse, ensuring it is scalable and performant.
Manage our data transformation layer using Dataform (preferred) or dbt to orchestrate complex, reliable workflows.
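Dataform models themselves are written in SQLX; a thin Python wrapper like the one below can schedule runs with retries. The exact `dataform run` invocation is an assumption to adapt to the real project layout:

```python
import subprocess
import time

def run_transformations(project_dir: str = ".", attempts: int = 3) -> None:
    """Kick off the transformation layer and retry transient failures.
    The CLI invocation is an assumption -- adjust for your Dataform (or dbt) setup."""
    for attempt in range(1, attempts + 1):
        # `dataform run` compiles the project and executes its workflow.
        result = subprocess.run(["dataform", "run", project_dir])
        if result.returncode == 0:
            return
        time.sleep(30 * attempt)  # simple linear backoff between retries
    raise RuntimeError("transformation run kept failing")
```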
UW provides utilities all in one place, including energy, broadband, mobile, and insurance. They aim to double in size and offer savings to customers, fostering a culture that values imaginative and pragmatic problem-solvers.
Collaborate with cross-functional teams and business units to understand and communicate Ion’s needs and goals.
Develop, improve, and maintain robust data pipelines for extracting and transforming data from log files and event streams.
Design models and algorithms to derive insights and metrics from large datasets.
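A minimal Python sketch of the log-and-event-stream work described above: stream newline-delimited JSON events defensively, then reduce them to a simple metric (the `procedure_type` field is a hypothetical example, not Ion's actual schema):

```python
import json
from collections import Counter
from pathlib import Path
from typing import Iterator

def read_events(log_path: Path) -> Iterator[dict]:
    """Stream one JSON event per line; skip malformed records rather than abort."""
    with log_path.open() as fh:
        for line in fh:
            try:
                yield json.loads(line)
            except json.JSONDecodeError:
                continue

def procedure_counts(events: Iterator[dict]) -> Counter:
    """Derive a simple metric: event volume per (hypothetical) procedure_type."""
    return Counter(e.get("procedure_type", "unknown") for e in events)
```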
Intuitive is a global leader in robotic-assisted surgery and minimally invasive care. Their technologies, like the da Vinci surgical system and Ion, have transformed how care is delivered for millions of patients worldwide. They are a team of engineers, clinicians, and innovators united by one purpose: to make surgery smarter, safer, and more human.
Design, build, and maintain scalable data pipelines.
Apply dimensional modeling techniques to design tables and views.
Automate manual processes to improve efficiency and speed.
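As a small illustration of dimensional modeling, here is a star schema with one fact table and two dimensions, built in SQLite purely for demonstration; the wedding-vendor table names are invented, not The Knot Worldwide's actual model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the real warehouse

# A small star schema: one fact table keyed to two dimension tables.
conn.executescript("""
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,      -- surrogate key, e.g. 20240115
    full_date TEXT, year INTEGER, month INTEGER
);
CREATE TABLE dim_vendor (
    vendor_key INTEGER PRIMARY KEY,
    vendor_name TEXT, category TEXT
);
CREATE TABLE fact_bookings (
    booking_id INTEGER PRIMARY KEY,
    date_key INTEGER REFERENCES dim_date(date_key),
    vendor_key INTEGER REFERENCES dim_vendor(vendor_key),
    amount REAL                        -- additive measure
);
""")
```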
The Knot Worldwide champions celebration and powers meaningful moments for millions around the world. They are a team of passionate dreamers and doers united by connection and committed to the global community, believing the best ideas come from empowered and collaborative teams.
Build and Maintain Bronze/Silver Layer Pipelines: ensure core data sources land accurately, on time, and with full lineage.
Lead Data Ingestion, Transformation, and Enrichment: own the end-to-end pipeline from raw file landing through cleansed, conformed staging tables, including deduplication, standardization, code mapping, and entity resolution.
Develop Automated Ingestion Pipelines: use Snowpipe, Matillion, or custom solutions with reliability, observability, and minimal manual intervention in mind.
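A hedged sketch of the Snowpipe approach named above, issued through the Snowflake Python connector; the stage, table, and connection details are placeholders:

```python
import snowflake.connector

# Connection details are placeholders, not real credentials.
conn = snowflake.connector.connect(
    account="my_account", user="loader", password="...", warehouse="LOAD_WH"
)

# A pipe continuously copies newly staged files into the bronze table,
# with no manual intervention once AUTO_INGEST notifications are wired up.
conn.cursor().execute("""
CREATE PIPE IF NOT EXISTS bronze.claims_pipe
  AUTO_INGEST = TRUE
  AS COPY INTO bronze.claims_raw
     FROM @bronze.claims_stage
     FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")
```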
Precision AQ is building a centralized Data Hub to consolidate fragmented data infrastructure, establish enterprise-wide data governance, and enable AI-ready analytics across their life sciences portfolio. They value innovation and a collaborative environment.
Build and optimize scalable, efficient ETL and data lake processes.
Own the ingestion, modeling, and transformation of structured and unstructured data.
Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows.
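One simple monitoring-and-QA building block of the kind described above is a freshness gate; this sketch assumes a DB-API cursor whose driver returns timezone-aware datetimes, and the table/column names are trusted internal identifiers supplied by the caller:

```python
from datetime import datetime, timedelta, timezone

def check_freshness(cursor, table: str, ts_column: str, max_lag_hours: int = 24) -> None:
    """Quality gate: fail if the newest row in `table` is older than the SLA."""
    # Identifiers come from internal config, not user input, so plain
    # interpolation is acceptable for this monitoring query.
    cursor.execute(f"SELECT MAX({ts_column}) FROM {table}")
    latest = cursor.fetchone()[0]
    if latest is None:
        raise AssertionError(f"{table} is empty")
    lag = datetime.now(timezone.utc) - latest
    if lag > timedelta(hours=max_lag_hours):
        raise AssertionError(f"{table} is stale by {lag}")
```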
Launch Potato is a digital media company that connects consumers with brands through data-driven content and technology. They have a remote-first team spanning over 15 countries and have built a high-growth, high-performance culture.
Design, build, and maintain scalable data pipelines using Microsoft Fabric and Apache Airflow
Ingest, transform, and integrate data from a variety of sources, including relational systems, APIs, and MongoDB
Design and maintain analytical data models, including fact and dimension tables, to support reporting and analytics
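To illustrate the MongoDB-ingestion bullet, here is a minimal sketch that flattens documents into a frame ready for fact/dimension modeling; the connection string, database, collection, and field names are all placeholders:

```python
import pandas as pd
from pymongo import MongoClient

# Connection string and collection names are placeholders.
client = MongoClient("mongodb://localhost:27017")
visits = client["emr"]["visits"]

# Project only the fields needed downstream, then flatten to a table.
cursor = visits.find(
    {"status": "closed"},
    {"_id": 1, "patient_id": 1, "facility_id": 1, "visit_date": 1, "charge": 1},
)
df = pd.json_normalize(list(cursor))

# Downstream, patient_id/facility_id become dimension keys and charge the
# additive measure in a fact table, per the modeling bullet above.
print(df.head())
```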
Theoria Medical is a comprehensive medical group and technology company dedicated to serving patients across the care continuum with an emphasis on post-acute care and primary care. Theoria serves facilities across the United States with a multitude of services to improve the quality of care delivered, refine facility processes, and enhance critical relationships.