Build and own Sardine’s internal data infrastructure integrating CRM, marketing, product, finance, and operational systems.
Design, improve, and own ETL/ELT pipelines to ensure clean, reliable, and scalable data flows across the organization.
Partner with data, engineering, revenue/business operations, and executive stakeholders to define and track KPIs.
Sardine is a leader in fraud prevention and AML compliance. Their platform uses device intelligence, behavior biometrics, machine learning, and AI to stop fraud before it happens. Sardine serves over 300 banks, retailers, and fintechs as clients and operates with a remote-first work culture.
Design, build, and operate scheduled and event-driven data pipelines for simulation outputs, telemetry, logs, dashboards, and scenario metadata.
Build and operate data storage systems (structured and semi-structured) optimized for scale, versioning, and replay.
Support analytics, reporting, and ML workflows by exposing clean, well-documented datasets and APIs.
Onebrief is collaboration and AI-powered workflow software designed specifically for military staffs, transforming staff work to make it faster, smarter, and more efficient. The company is all-remote, with employees working alongside customers; it was founded in 2019 and has raised over $320M.
Lead requirements-gathering efforts for product and advanced analytics.
Work with analytics, data science, and wider engineering teams to automate data analysis and visualization.
Build a scalable technology platform to support a growing business and deliver high-quality code to production.
Achieve is a leading digital personal finance company that helps everyday people move from struggling to thriving by providing innovative, personalized financial solutions. They have over 3,000 employees, mostly in hybrid and fully remote roles across the United States, with hubs in Arizona, California, and Texas and a culture of putting people first.
Design, build, and maintain efficient ETL/ELT processes and reliable data pipelines.
Build and maintain dashboards and visualizations in Looker Studio and other BI tools.
Ensure data quality, consistency, and accessibility across the organization.
Cove began by renting coliving spaces and has expanded to provide flexible, comfortable stays in beautiful properties. With over 6,000 rooms across Singapore and Indonesia, and growing in South Korea and Japan, they aim to build the leading tech-enabled flexible living platform in Asia Pacific while encouraging authenticity and fun.
Be the Analytics Engineering lead within the Sales and Marketing organization.
Be the data steward for Sales and Marketing: architect and improve data collection.
Develop and maintain robust data pipelines and workflows for data ingestion and transformation.
Reddit is a community-driven platform built on shared interests and trust, fostering open and authentic conversations. With over 100,000 active communities and approximately 116 million daily active unique visitors, it serves as a major source of information on the internet.
Utilize strong SQL & Python skills to engineer sound data pipelines and conduct routine and ad hoc analysis.
Build reporting dashboards and visualizations to design, create, and track campaign/program KPIs.
Perform analyses on large data sets to understand drivers of operational efficiency.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. This company fosters a supportive and inclusive work environment.
Build modern, scalable data pipelines that keep the data flowing.
Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning.
Unify and wrangle data from all kinds of sources.
InterWorks is a people-focused tech consultancy that empowers clients with customized, collaborative solutions. They value unique contributions, and their people are the glue that holds their business together.
Design, build, and maintain scalable data pipelines.
Develop and implement data models for analytical use cases.
Implement data quality checks and governance practices.
MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.
Manage the flow of data from ingestion to final consumption.
Develop and maintain entity-relationship models.
Design and implement data pipelines, preferably using SQL or Python.
Teachable is a platform for experts and businesses who take education seriously. They help experts and businesses scale their impact and operations through courses, coaching, and digital downloads that students actually love. Teachable is part of the global Hotmart Company portfolio.
Architect end-to-end solutions to gather product telemetry.
Evolve and maintain our telemetry stack with a focus on scalability.
Define and implement best practices for product telemetry tracking.
Rasa is a leader in generative conversational AI, enabling enterprises to build and deliver next-level AI assistants. Rasa was founded in 2016, is remote-first, and has a global presence with a small, committed team that is growing inclusively.
Architect and maintain scalable, secure, and high-performing data pipelines to support analytics, reporting, and operational needs.
Develop and deploy production-grade data engineering code, ensuring reliability and performance across environments.
Manage end-to-end data workflows, including ingestion, transformation, modeling, and validation for multiple business systems.
Onebridge, a Marlabs Company, is a global AI and Data Analytics Consulting Firm that empowers organizations worldwide to drive better outcomes through data and technology. Since 2005, they have partnered with some of the largest healthcare, life sciences, financial services, and government entities across the globe.
Design, build, and maintain robust data pipelines.
Own and scale ETL/ELT processes using tools like dbt, BigQuery, and Python.
Build modular data models that power analytics, product features, and LLM agents.
Jobgether is a platform that uses AI to match candidates with jobs. They aim to review applications quickly and fairly, ensuring the top-fitting candidates are identified and shared with hiring companies.
Design, build, and operate ETL pipelines at scale.
Design data structures for data products.
Develop and operate APIs and tools related to data products and machine learning products.
Mercari provides a marketplace platform. They value teamwork and provide career growth opportunities as the company continues to expand.
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional and non-functional business requirements.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using relevant technologies.
Mercer Advisors works with families to help them amplify and simplify their financial lives through integrated financial planning, investment management, tax, estate, and insurance services. They serve over 31,300 families in more than 90 cities across the U.S. and are ranked the #1 RIA Firm in the nation by Barron’s.
Lead the end-to-end data architecture, designing and implementing data pipelines, warehouses, and lakes that handle petabyte-scale datasets.
Collaborate with product teams to enable data-driven decision-making across the organization.
Establish best practices for data quality, governance, and security while mentoring senior engineers and conducting technical reviews.
Cority is a global enterprise EHS software provider creating industry-leading technology. They have operated for over 35 years and are known for a strong employee culture and client satisfaction.
Collaborate with business leaders, engineers, and product managers to understand data needs.
Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications or self-service solutions.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
NBCUniversal is one of the world's leading media and entertainment companies, creating world-class content that they distribute across a portfolio of film, television, and streaming and bring to life through global theme park destinations, consumer products, and experiences. They champion an inclusive culture and strive to attract and develop a talented workforce to create and deliver a wide range of content that reflects the world.
Build modern, scalable data pipelines that keep the data flowing and keep our clients happy.
Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning.
Unify and wrangle data from all kinds of sources: SQL, APIs, spreadsheets, cloud storage, and more.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Build and evolve our semantic layer; design, document, and optimize dbt models.
Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.
Architect, design, implement, and operate end-to-end data engineering solutions.
Develop and manage robust data integrations with external vendors.
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.
SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. Reaching over 59 million people each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.
Bridge the gap between application engineering and data infrastructure.
Own the optimization of high-volume data pipelines and tune operational databases.
Define how to ingest massive bursts of medical information, model it for transactional locking, and transform it for analytical querying.
Synthesis Health is a mission- and values-driven company with tremendous dedication to its customers. The 100% remote team is dedicated to revolutionizing healthcare through innovation, collaboration, and commitment to its core values and behaviors.