Design, build, and operate large-scale data infrastructure systems across multiple environments to store, aggregate, and process large amounts of data.
Build a data platform-as-a-service for internal customers, ensuring data integrity, sanity, tagging, and discoverability.
Bridge the gap between engineering and analytics, helping inform the roadmap for data infrastructure for the company.
Build and evolve our semantic layer; design, document, and optimize dbt models.
Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
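Responsibilities like the ETL/orchestration bullet above usually reduce to an extract, transform, load sequence. A minimal sketch, assuming toy in-memory source data and SQLite standing in for a real warehouse; the table and column names are illustrative, not from any of these postings:

```python
import sqlite3

# Hypothetical event records standing in for a raw source extract.
RAW_EVENTS = [
    {"user_id": 1, "event": "signup", "amount": "0"},
    {"user_id": 2, "event": "purchase", "amount": "49.25"},
    {"user_id": 2, "event": "purchase", "amount": "19.50"},
]

def extract():
    """Pull raw rows from the source (stubbed with an in-memory list)."""
    return list(RAW_EVENTS)

def transform(rows):
    """Cast the string amounts to floats and aggregate spend per user."""
    totals = {}
    for row in rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])
    return [{"user_id": uid, "total_spend": total} for uid, total in sorted(totals.items())]

def load(conn, rows):
    """Write the aggregated rows to a warehouse table (stubbed with SQLite)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS user_spend (user_id INTEGER PRIMARY KEY, total_spend REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO user_spend VALUES (:user_id, :total_spend)", rows
    )
    conn.commit()

def run_pipeline(conn):
    load(conn, transform(extract()))

conn = sqlite3.connect(":memory:")
run_pipeline(conn)
print(conn.execute("SELECT user_id, total_spend FROM user_spend ORDER BY user_id").fetchall())
# → [(1, 0.0), (2, 68.75)]
```

In a real deployment each stage would be a task in an orchestrator such as Airflow, with the same separation of concerns.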
Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.
Build and own Sardine’s internal data infrastructure integrating CRM, marketing, product, finance, and operational systems.
Design, improve, and own ETL/ELT pipelines to ensure clean, reliable, and scalable data flows across the organization.
Partner with data, engineering, revenue/business operations, and executive stakeholders to define and track KPIs.
Sardine is a leader in fraud prevention and AML compliance. Their platform uses device intelligence, behavioral biometrics, machine learning, and AI to stop fraud before it happens. Sardine has over 300 banks, retailers, and fintechs as clients and a remote-first work culture.
Designing and maintaining scalable, secure data pipelines that feed BigQuery from diverse sources
Owning our infrastructure-as-code setup using Terraform
Automating data QA, modeling, and maintenance tasks using scripting and AI
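"Automating data QA" in bullets like the one above typically means scripted checks that run after each load. A minimal sketch with two common checks, null rate and uniqueness, over rows represented as dicts; the column names are hypothetical:

```python
# Hypothetical lightweight QA checks of the kind a pipeline might run
# after each load; the ROWS sample and column names are illustrative.

def null_fraction(rows, column):
    """Return the fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def is_unique(rows, column):
    """Return True when every non-null value in `column` is distinct."""
    values = [r[column] for r in rows if r.get(column) is not None]
    return len(values) == len(set(values))

ROWS = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "c@example.com"},
]

print(null_fraction(ROWS, "email"))  # fraction of rows with a null email
print(is_unique(ROWS, "id"))
```

Checks like these are easy to schedule alongside the pipeline itself and to alert on when a threshold is exceeded.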
TheyDo helps enterprises align around their customers with an AI-powered journey management platform, fostering smarter decisions and enhanced experiences. With $50M backing and a global team across 27 countries, TheyDo champions a customer-led, people-first culture.
Architect, design, implement, and operate end-to-end data engineering solutions.
Develop and manage robust data integrations with external vendors.
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.
SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. With over 59 million people reached each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.
Design, build, and maintain robust data pipelines.
Own and scale ETL/ELT processes using tools like dbt, BigQuery, and Python.
Build modular data models that power analytics, product features, and LLM agents.
Jobgether is a platform that uses AI to match candidates with jobs. They aim to review applications quickly and fairly, ensuring the top-fitting candidates are identified and shared with hiring companies.
Build modern, scalable data pipelines that keep the data flowing.
Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning.
Unify and wrangle data from all kinds of sources.
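"Unify and wrangle data from all kinds of sources" usually means normalizing differently shaped records into one schema. A minimal sketch, assuming a toy JSON API payload and a CSV export as the two sources; the field names are hypothetical:

```python
import csv
import io
import json

# Hypothetical records arriving in two shapes: a JSON API payload
# (camelCase keys) and a CSV export (snake_case header, string values).
API_PAYLOAD = json.dumps([{"customerId": 7, "plan": "pro"}])
CSV_EXPORT = "customer_id,plan\n8,free\n"

def from_api(payload):
    """Normalize the API's camelCase records to the unified schema."""
    return [
        {"customer_id": r["customerId"], "plan": r["plan"]}
        for r in json.loads(payload)
    ]

def from_csv(text):
    """Normalize the CSV export, casting the id back to an integer."""
    return [
        {"customer_id": int(r["customer_id"]), "plan": r["plan"]}
        for r in csv.DictReader(io.StringIO(text))
    ]

unified = from_api(API_PAYLOAD) + from_csv(CSV_EXPORT)
print(unified)
# → [{'customer_id': 7, 'plan': 'pro'}, {'customer_id': 8, 'plan': 'free'}]
```

One normalizer per source keeps downstream models insulated from each source's quirks.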
InterWorks is a people-focused tech consultancy that empowers clients with customized, collaborative solutions. They value unique contributions, and their people are the glue that holds their business together.
Assist in designing and implementing Snowflake-based analytics solutions.
Build and maintain data pipelines adhering to enterprise architecture principles.
Act as a technical leader within the team, ensuring quality deliverables.
Jobgether is a company that uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Oversee the design, development, implementation, and maintenance of all data-related systems.
Build and maintain scalable and reliable data pipelines, ensuring data quality and consistency across all systems.
Collaborate with data scientists, business analysts, and other stakeholders to identify data-related needs and requirements.
MoneyHero Group is a market-leading financial products platform in Greater Southeast Asia, reaching millions of monthly unique users and working with hundreds of commercial partners. They have a team of over 350 talented individuals and are backed by world-class organizations and companies.
Design and develop data integration pipelines across cloud and legacy systems.
Lead and support data engineering implementation efforts.
Apply advanced analytics techniques to support business insights and decision-making.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company, with the final decision managed by the internal team.
Support our managed analytics service clients through their data-driven journeys and deliver measurable business value through data modeling, API integration, SQL scripting, and data pipeline development.
Bridge the gap between data applications and insightful business reports.
Participate in building our data platform from the ground up by exploring new technologies & vendors within our cloud-first environment
DataDrive is a fast-growing managed analytics service provider that delivers modern cloud analytics data platforms to data-driven organizations, while also supporting ongoing training, adoption, and growth of their clients’ data cultures. DataDrive offers a unique team-oriented environment where one can develop their skills and work directly with some of the most talented analytics professionals in the business.
Design and develop robust ETL/ELT pipelines using Python, Airflow, and dbt.
Build and optimize Snowflake data models for performance, scalability, and cost efficiency.
Implement ingestion pipelines for internal and external financial datasets (Market, Securities, Pricing, ESG, Ratings).
Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation. They support digital transformation for some of the world's largest enterprises. Miratech employs nearly 1,000 full-time professionals, with an annual growth rate exceeding 25%, and offers a ForeverRemote work culture.
Build modern, scalable data pipelines that keep the data flowing—and keep our clients happy
Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning
Unify and wrangle data from all kinds of sources: SQL, APIs, spreadsheets, cloud storage, and more
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Collaborate with business leaders, engineers, and product managers to understand data needs.
Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
NBCUniversal is one of the world's leading media and entertainment companies, creating world-class content distributed across a portfolio of film, television, and streaming and brought to life through global theme park destinations, consumer products, and experiences. The company champions an inclusive culture and strives to attract and develop a talented workforce to create and deliver a wide range of content reflecting our world.
Build and own pipelines for the creation, curation, and processing of large-scale multimodal datasets.
Build and own ETL and CDC streams from Postgres and ClickHouse to analytics warehouses.
Manage production databases (Postgres, ClickHouse) and optimize them for performance and reliability.
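The core of a CDC stream like the one described above is folding an ordered log of change events into current table state. A minimal sketch with hypothetical insert/update/delete events; the event shape is illustrative, not the format Postgres or ClickHouse actually emit:

```python
# Hypothetical change-data-capture events; applying them in order
# reconstructs the current table state on the warehouse side.

EVENTS = [
    {"op": "insert", "id": 1, "row": {"name": "ada"}},
    {"op": "insert", "id": 2, "row": {"name": "bob"}},
    {"op": "update", "id": 1, "row": {"name": "ada l."}},
    {"op": "delete", "id": 2, "row": None},
]

def apply_cdc(state, event):
    """Fold one CDC event into the table state, keyed by primary key."""
    if event["op"] == "delete":
        state.pop(event["id"], None)
    else:
        # Insert and update are both treated as an upsert, which makes
        # replaying the log idempotent.
        state[event["id"]] = event["row"]
    return state

state = {}
for event in EVENTS:
    apply_cdc(state, event)
print(state)
# → {1: {'name': 'ada l.'}}
```

Real pipelines add ordering guarantees and batching, but the upsert-or-delete fold is the same idea.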
Runway is building AI to simulate the world by merging art and science. They believe that world models are at the frontier of progress in artificial intelligence. The Runway team consists of creative, open-minded, caring, and ambitious people who are determined to change the world.
Design, develop, and maintain reliable end-to-end data pipelines that connect internal and external systems.
Contribute to the performance, scalability, and reliability of our entire data ecosystem.
Work with analysts to engineer data structures and orchestrate workflows that encode core business logic.
Roo is on a mission to empower animal healthcare professionals with opportunities to earn more and achieve greater flexibility in their careers and personal lives. Powered by groundbreaking technology, Roo has built the industry-leading veterinary staffing platform, connecting Veterinarians, Technicians, and Assistants with animal hospitals for relief work and hiring opportunities.
Contribute to the maintenance of data pipelines to improve ingestion and data quality.
Develop and maintain Python-based ingestion pipelines integrating data from APIs and third-party systems.
Maintain and optimize dbt transformation workflows to support curated data models.
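Python-based ingestion from third-party APIs, as in the bullets above, is usually a cursor-pagination loop. A minimal sketch where `fetch_page` stubs the remote endpoint; the page shape and cursor names are hypothetical:

```python
# Hypothetical paginated API: each page carries items plus a cursor
# for the next page, with None signalling the final page.
PAGES = {
    None: {"items": [{"id": 1}, {"id": 2}], "next": "p2"},
    "p2": {"items": [{"id": 3}], "next": None},
}

def fetch_page(cursor):
    """Stub for the HTTP call a real pipeline would make per page."""
    return PAGES[cursor]

def ingest_all():
    """Walk cursor-based pages until the API signals there are no more."""
    items, cursor = [], None
    while True:
        page = fetch_page(cursor)
        items.extend(page["items"])
        cursor = page["next"]
        if cursor is None:
            return items

print([record["id"] for record in ingest_all()])
# → [1, 2, 3]
```

A production version would add retries and rate-limit handling around `fetch_page`, but the loop structure is unchanged.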
National Debt Relief was founded in 2009 with the goal of helping consumers deal with overwhelming debt. They are one of the most-trusted and best-rated consumer debt relief providers in the United States, having helped over 450,000 people.
Maintain, configure, and optimize the existing data warehouse platform and pipelines.
Design and implement incremental data integration solutions prioritizing data quality, performance, and cost-efficiency.
Drive innovation by experimenting with new technologies and recommending platform improvements.
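"Incremental data integration" as in the bullets above is most often a watermark pattern: each run pulls only rows changed since the last recorded high-water mark. A minimal sketch over toy rows; the `updated_at` field and values are hypothetical:

```python
# Hypothetical source rows with a monotonically increasing change
# timestamp; only rows newer than the stored watermark are extracted.

SOURCE = [
    {"id": 1, "updated_at": 100},
    {"id": 2, "updated_at": 200},
    {"id": 3, "updated_at": 300},
]

def incremental_extract(rows, watermark):
    """Return rows changed since `watermark`, plus the new watermark."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

# A previous run recorded watermark 100, so only ids 2 and 3 are pulled.
batch, watermark = incremental_extract(SOURCE, 100)
print([r["id"] for r in batch], watermark)
# → [2, 3] 300
```

Persisting the watermark between runs is what keeps the load both cheap and idempotent.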
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements.
Migrate data and analytics workloads from BigQuery to Snowflake
Develop and optimize ETL/ELT pipelines using Python and SQL
Build analytics-ready datasets for reporting and dashboards
Egen is a fast-growing and entrepreneurial company with a data-first mindset. They bring together the best engineering talent working with the most advanced technology platforms to help clients drive action and impact through data and insights.