Build and evolve our semantic layer; design, document, and optimize dbt models (see the sketch after this list).
Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
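To make the dbt responsibilities above a bit more tangible, here is a minimal sketch of building and testing a single model programmatically. It assumes dbt-core 1.5 or later (which exposes dbtRunner) and an already-configured project; the model name semantic_orders is purely illustrative.

```python
# Minimal sketch: build and test one dbt model from Python.
# Assumes dbt-core >= 1.5 and a configured dbt project in the working
# directory; "semantic_orders" is a hypothetical model name.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

# Build the model and everything downstream of it.
run_result: dbtRunnerResult = dbt.invoke(["run", "--select", "semantic_orders+"])
if not run_result.success:
    raise SystemExit("dbt run failed")

# Execute the schema tests documented alongside the model.
test_result: dbtRunnerResult = dbt.invoke(["test", "--select", "semantic_orders"])
print("tests passed" if test_result.success else "tests failed")
```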
Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.
Create and maintain optimal data pipeline architecture.
Ensure reliability and scalability of our data infrastructure.
Build the infrastructure required for optimal extraction, transformation, and loading of data.
Kamino Retail (powered by Equativ) is a pioneering SaaS platform at the forefront of retail media innovation. They equip retailers with advanced tools and solutions to revolutionize their advertising strategies, amplify customer engagement, and drive results.
Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Own the data infrastructure and pipelines for business-critical product and user data.
Conceptualize and build best-in-class internal tools that enable others to increasingly self-serve their data needs.
Take an AI-first approach to scaling Runway’s data tooling and its use.
Runway is building AI to simulate the world by merging art and science. They aspire to continuously build impossible things, and their ability to do so relies on building an incredible team.
Design, build, and optimize scalable data lakes, warehouses, and data pipelines using Snowflake and modern cloud platforms (see the sketch after this list).
Develop and maintain robust data models (ELT/ETL), ensuring clean, reliable, and well-documented datasets.
Design and manage end-to-end data pipelines that ingest, transform, and unify data from multiple systems.
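As a rough illustration of the Snowflake pipeline work described above, the sketch below copies staged files into a raw table and publishes a de-duplicated view for downstream models. It assumes the snowflake-connector-python package; the warehouse, stage, table, and view names are all hypothetical.

```python
# Minimal ELT sketch against Snowflake: load staged files, then expose a
# cleaned view. All object names below are illustrative.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",  # hypothetical warehouse
    database="ANALYTICS",      # hypothetical database
    schema="RAW",
)
cur = conn.cursor()
try:
    # Ingest new files from an external stage (load first, transform later).
    cur.execute(
        "COPY INTO raw_orders FROM @landing_stage/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Publish a documented, de-duplicated view for downstream models.
    cur.execute("""
        CREATE OR REPLACE VIEW analytics.core.orders_clean AS
        SELECT *
        FROM raw_orders
        QUALIFY ROW_NUMBER() OVER (
            PARTITION BY order_id ORDER BY loaded_at DESC
        ) = 1
    """)
finally:
    cur.close()
    conn.close()
```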
Cobalt Service Partners is building the leading commercial access and security integration business in North America. Backed by Alpine Investors, with $15B+ in AUM, Cobalt has scaled rapidly since launch through acquisitions and is building a differentiated, data-driven platform.
Build modern, scalable data pipelines that keep the data flowing.
Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning.
Unify and wrangle data from all kinds of sources.
InterWorks is a people-focused tech consultancy that empowers clients with customized, collaborative solutions. They value unique contributions, and their people are the glue that holds their business together.
Design, build, and maintain scalable data pipelines.
Apply dimensional modeling techniques to design tables and views, as sketched after this list.
Automate manual processes to improve efficiency and speed.
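To make the dimensional-modeling bullet concrete, here is a toy star schema: one fact table surrounded by dimensions, plus a reporting view that joins them back together. It uses SQLite so it runs with only the Python standard library; every table and column name is invented for illustration.

```python
# Toy star schema: surrogate-keyed dimensions, a narrow fact table holding
# only keys and additive measures, and a view for reporting.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension: one row per calendar date, keyed by a surrogate integer.
    CREATE TABLE dim_date (
        date_key   INTEGER PRIMARY KEY,  -- e.g. 20240131
        full_date  TEXT NOT NULL,
        month_name TEXT NOT NULL
    );

    -- Dimension: one row per customer.
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT NOT NULL,
        region        TEXT NOT NULL
    );

    -- Fact: one row per order line; only foreign keys and measures.
    CREATE TABLE fact_orders (
        date_key     INTEGER REFERENCES dim_date(date_key),
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        quantity     INTEGER NOT NULL,
        revenue      REAL NOT NULL
    );

    -- Reporting view that joins the star back together.
    CREATE VIEW v_revenue_by_region AS
    SELECT c.region, SUM(f.revenue) AS total_revenue
    FROM fact_orders AS f
    JOIN dim_customer AS c USING (customer_key)
    GROUP BY c.region;
""")
conn.close()
```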
The Knot Worldwide champions celebration and powers meaningful moments for millions around the world. They are a team of passionate dreamers and doers united by connection and committed to the global community, believing the best ideas come from empowered and collaborative teams.
Assist in designing and implementing Snowflake-based analytics solutions.
Build and maintain data pipelines adhering to enterprise architecture principles.
Act as a technical leader within the team, ensuring quality deliverables.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Design and develop data integration pipelines across cloud and legacy systems.
Lead and support data engineering implementation efforts.
Apply advanced analytics techniques to support business insights and decision-making.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company, with the final decision managed by the internal team.
Build modern, scalable data pipelines that keep the data flowing and keep our clients happy.
Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning.
Unify and wrangle data from all kinds of sources: SQL, APIs, spreadsheets, cloud storage, and more.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Design, develop, and optimize EDP data pipelines using Python, Airflow, dbt, and Snowflake for scalable financial data processing (see the sketch after this list).
Build performant Snowflake data models and dbt transformations following best practices, standards, and documentation guidelines.
Own ingestion, orchestration, monitoring, and SLA-driven workflows; proactively troubleshoot failures and improve reliability.
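As a sketch of what such an orchestrated, SLA-driven workflow could look like, the DAG below chains ingestion, a dbt run, and dbt tests with retries and a one-hour SLA. It assumes Apache Airflow 2.4 or later; the DAG id, schedule, script path, and dbt project directory are all hypothetical.

```python
# Minimal orchestration sketch: ingest, transform with dbt, then test, with
# retries and an SLA for reliability. Paths and names are illustrative.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
    "sla": timedelta(hours=1),  # flag tasks that have not finished within an hour
}

with DAG(
    dag_id="edp_orders_pipeline",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
    default_args=default_args,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_orders",
        bash_command="python /opt/pipelines/ingest_orders.py",  # hypothetical script
    )
    transform = BashOperator(
        task_id="dbt_run_orders",
        bash_command="dbt run --select orders --project-dir /opt/dbt/edp",
    )
    test = BashOperator(
        task_id="dbt_test_orders",
        bash_command="dbt test --select orders --project-dir /opt/dbt/edp",
    )

    ingest >> transform >> test
```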
Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation, supporting digital transformation for some of the world's largest enterprises. They employ nearly 1,000 full-time professionals, and their culture of Relentless Performance has enabled over 99% of Miratech's engagements to succeed.
Build data pipelines for coaching and user analytics.
Build data systems that power product features.
Establish our data infrastructure and architecture.
Mento is a career technology company that helps people be exceptional and thrive at work through human, AI, and software-based coaching. They strive to create a fun, conscientious, collaborative, and supportive work environment, with a unique model that brings coaching to every member of the global workforce.
Design, build, and maintain efficient ETL/ELT processes and reliable data pipelines.
Build and maintain dashboards and visualizations in Looker Studio and other BI tools.
Ensure data quality, consistency, and accessibility across the organization.
Cove began by renting coliving spaces and has expanded to provide flexible, comfortable stays in beautiful properties. With over 6,000 rooms across Singapore and Indonesia, and a growing presence in South Korea and Japan, they aim to build the leading tech-enabled flexible living platform in Asia Pacific, encouraging authenticity and fun.
Designing and maintaining scalable, secure data pipelines that feed BigQuery from diverse sources (see the sketch after this list).
Owning our infrastructure-as-code setup using Terraform.
Automating data QA, modeling, and maintenance tasks using scripting and AI.
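Below is a minimal sketch of the BigQuery ingestion and automated QA described above, assuming the google-cloud-bigquery client library and application-default credentials; the project, dataset, table, and bucket names are invented for illustration.

```python
# Load newline-delimited JSON from Cloud Storage into BigQuery, then run a
# simple duplicate check. Every resource name below is hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.analytics.raw_events"  # hypothetical project.dataset.table
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Ingest files exported by an upstream system into the raw table.
load_job = client.load_table_from_uri(
    "gs://example-landing-bucket/events/*.json",  # hypothetical bucket
    table_id,
    job_config=job_config,
)
load_job.result()  # block until the load finishes; raises on failure

# Lightweight QA: fail loudly if the load produced duplicate event IDs.
sql = f"""
    SELECT event_id, COUNT(*) AS n
    FROM `{table_id}`
    GROUP BY event_id
    HAVING n > 1
"""
duplicates = list(client.query(sql).result())
if duplicates:
    raise RuntimeError(f"{len(duplicates)} duplicate event_id values found")
```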
TheyDo helps enterprises align around their customers with an AI-powered journey management platform, fostering smarter decisions and enhanced experiences. With $50M in backing and a global team across 27 countries, TheyDo champions a customer-led, people-first culture.
Architect, design, implement, and operate end-to-end data engineering solutions.
Develop and manage robust data integrations with external vendors.
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.
SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. With over 59 million people reached each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.
Ensuring the implementation of best development practices.
Making technical and modeling choices in consultation with other Data Leads.
Accor Tech & Digital is the powerhouse behind Accor's technology, digital business, and transformation. Their 5,000 talents are committed to delivering the best tech and digital experiences to guests, hotels, and staff across 110 countries, and to shaping the future of hospitality.
Architect and maintain scalable, secure, and high-performing data pipelines to support analytics, reporting, and operational needs.
Develop and deploy production-grade data engineering code, ensuring reliability and performance across environments.
Manage end-to-end data workflows, including ingestion, transformation, modeling, and validation for multiple business systems.
Onebridge, a Marlabs Company, is a global AI and Data Analytics Consulting Firm that empowers organizations worldwide to drive better outcomes through data and technology. Since 2005, they have partnered with some of the largest healthcare, life sciences, financial services, and government entities across the globe.
Develop and optimize ETL and ELT processes using tools like dbt, Informatica, or Talend.
Define data architectures and flows, ensuring solutions align with business needs.
Contribute to the creation of reusable data frameworks and accelerators for multiple projects.
Jobgether uses an AI-powered matching process to ensure candidate applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.