Manage and mentor a high-performing team, fostering a culture of technical excellence.
Define the Data Engineering team vision, balancing immediate business needs with a long-term shift towards a self-service data mesh architecture.
Oversee the development of core data pipelines and platform tools, ensuring high performance for ingestion services.
UW provides utilities all in one place, with a single bill covering energy, broadband, mobile, and insurance, helping customers save. They are aiming to double in size and are looking for people to help them achieve this goal through innovation and impact.
Own and drive the data and analytics strategy, roadmap, and execution across the organization.
Scale and mentor a high-performing team spanning data engineering, analytics, and platform.
Partner with the executive team and cross-functional leaders to surface insights that inform business strategy.
Newsela takes authentic, real-world content from trusted sources and makes it instruction-ready for K-12 classrooms. Over 3.3 million teachers and 40 million students have registered with Newsela for personalized content.
Lead the design and delivery of complex data engineering projects.
Design and develop core components of our data platform.
Mentor engineers on the team, elevating their skills and promoting best practices in data engineering.
MoonPay is a unified payments platform for digital currency, making it easy for anyone to buy, sell, swap, and pay in digital currencies. They are trusted by over 30 million customers and over 500 ecosystem partners, driving mainstream crypto adoption worldwide.
Oversee a team of data engineers, guiding them in developing and managing data systems.
Streamline operational processes, enhance data quality, and ensure the reliability of data pipelines.
Foster a collaborative culture that empowers engineers and contributes to innovation.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates and share the shortlist directly with the hiring company.
Manage and support a team of 6 Data Engineers, helping them focus on impactful technical initiatives as the platform scales.
Drive execution excellence, ensuring the team delivers with high velocity, quality, and reliability.
Foster a healthy and sustainable team environment by helping the team manage workload and focus on meaningful engineering work.
Bluefish believes that AI represents the next major chapter of the internet and that consumers will increasingly use AI to consume information and media online. Bluefish is building the platform that helps brands engage consumers on this new AI channel, with powerful enterprise tools to manage AI brand safety and engage consumers with thoughtful and personalized AI marketing experiences.
Define and execute HappyCo’s overall data strategy, aligned with company and product goals.
Design and implement a canonical entity model that connects data across operational systems.
Oversee the architecture and evolution of HappyCo’s data platform, including the data warehouse, pipelines, and transformation layers.
HappyCo builds modern property management software that helps owners and operators deliver better living experiences at scale. They are a values-driven company that offers a flexible, supportive culture. Their team is made up of thinkers, talkers, planners, makers, builders, and everything in between.
Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines.
Architect and implement robust data infrastructure capable of handling high-volume data ingestion and processing.
Implement automated data quality checks, validation rules, and monitoring frameworks.
ShyftLabs is a data product company founded in early 2020 that works with Fortune 500 companies. They deliver digital solutions that help accelerate business growth across various industries through innovation, and they value strong business awareness.
Design and implement scalable, high-throughput data ingestion systems.
Build and evolve a centralized data lake using Apache Iceberg.
Provide technical leadership through mentorship, code reviews, and design discussions.
Coupa provides a total spend management platform for businesses, which uses community-generated AI to multiply margins. They have a collaborative culture driven by transparency, openness, and a shared commitment to excellence, and are expanding their impact across the globe.
Design and implement scalable data architectures to support business needs.
Build and optimize data pipelines, ensuring data accessibility and security.
Develop and maintain data models, databases, and data lakes, with robust data governance.
Terawatt Infrastructure delivers large scale, turnkey charging solutions for companies rapidly deploying AV and EV fleets. With a growing portfolio of sites across the US, Terawatt is building the permanent transportation and logistics infrastructure of tomorrow through capital, real estate, development, and site operations solutions.
Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning.
Partner cross-functionally with analytics, product, engineering, and operations to deliver high-quality data solutions that drive measurable business impact.
Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response.
Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers and risk-takers who know the value of peace of mind in an unpredictable world.
Lead, mentor, and scale a high-performing team of data engineers and data scientists.
Define and drive the data architecture vision, technical standards, and best practices across client engagements.
Oversee end-to-end delivery of data platforms, pipelines, and tooling across client projects.
Jobgether is a platform that helps connect job seekers with companies. They use AI to match candidates with relevant roles, ensuring a fair and objective review process.
Build and lead a team of 4-5 data engineers focused on reusable product artifacts.
Own the product data engineering backlog in partnership with product management.
Define and enforce technical standards for notebooks, pipelines, QC modules, and documentation.
Qualified Health is redefining what’s possible with Generative AI in healthcare. They provide the guardrails for safe AI governance, healthcare-specific agent creation, and real-time algorithm monitoring, working alongside leading health systems to drive real change. They are a fast-growing company backed by premier investors.
Design, develop, and maintain scalable data pipelines using cloud data services.
Serve as a technical leader, defining data engineering standards and best practices.
Lead the design and implementation of optimized data models in our cloud data warehouse.
Constant Contact empowers people by giving them the help and tools they need to grow online. They are energized by new challenges and possibilities, and they celebrate diversity and inclusion with programs in place to bring people together.
Design, build, and maintain reliable ETL pipelines, integrating data from multiple sources into the Google Cloud Data Warehouse.
Own the product data structure, mapping product features and behaviors to analytics-ready data models, and define meaningful KPIs.
Act as the primary bridge between Backend Engineering and BI, owning the flow from data production to analytics consumption.
TuoTempo, part of the Docplanner group since 2019, develops the market-leading CRM solution dedicated to hospitals, medical centers, and health insurance providers. The platform manages and automates the entire patient journey, centralizing contacts, communications, and processes in a single modular system integrated with the software already used by organizations.
Lead and develop three sub-teams: Platform Engineering & ETL, Analytics Engineering, and Data Science & Analytics.
Own the Data Lakehouse architecture: Trino, Iceberg/GCS, Airflow, Airbyte, Redpanda CDC, dbt.
Support product launches with data change management: coordinate data impact analysis for new products (fixed income, global stocks, perps, 24/5 trading) across downstream datasets, dashboards, and reverse ETL.
Alpaca is a US-headquartered self-clearing broker-dealer and brokerage infrastructure for stocks, ETFs, options, crypto, fixed income, 24/5 trading, and more. Their global team of 230+ members is a diverse group of experienced engineers, traders, and brokerage professionals who are working to achieve their mission of opening financial services to everyone on the planet.
Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
Increase the resilience of existing production pipelines, identify bottlenecks, and set up robust monitoring, testing processes, and documentation templates.
Build custom applications and integrations that automate manual customer-operations tasks, helping Product Operations, Support, and SRE teams in their day-to-day activities.
Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.
Design and implement end-to-end data pipelines to transform raw data into actionable insights.
Lead the development of complex data models and transformations leveraging Snowflake and dbt.
Ensure our data infrastructure remains scalable, reliable, and production-ready.
They are a managed nearshore teams provider headquartered in Austin, specializing in building and embedding high-performing software development teams. Their model offers the opportunity to work on international challenges, collaborate with diverse teams, and grow your career at a company that values expertise, creativity, and impact.
Help build scalable data solutions and streamline data ingestion.
Maintain high-quality databases that support our scientific and operational teams.
Optimize our data infrastructure to ensure efficient data access.
Funga is a public benefit corporation addressing the climate crisis by harnessing forest fungal networks. They are a team of passionate scientists and builders working to draw down at least three gigatons of carbon dioxide from the atmosphere by 2050.
Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
Implement data quality checks, monitoring, and validation processes.
Automate manual processes into centralized and scalable solutions.
Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2,000 colleagues worldwide and are traded on Nasdaq as part of Informa PLC.