Design, build, and maintain robust data pipelines.
Own and scale ETL/ELT processes using tools like dbt, BigQuery, and Python.
Build modular data models that power analytics, product features, and LLM agents.
Jobgether is a platform that uses AI to match candidates with jobs. They aim to review applications quickly and fairly, ensuring the top-fitting candidates are identified and shared with hiring companies.
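Stacks like the one above (dbt, BigQuery, Python) usually mean layered SQL transforms: a staging model that cleans and type-casts raw data, and a mart model that aggregates it for consumers. A minimal sketch of that pattern, using Python's stdlib sqlite3 as a stand-in for BigQuery — the table names (raw_orders, stg_orders) are hypothetical:

```python
import sqlite3

# In-memory stand-in for a warehouse; table and column names are illustrative.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, status TEXT)")
con.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                [(1, "10.50", "paid"), (2, "3.00", "refunded"), (3, "7.25", "paid")])

# Staging model: clean and type-cast, one concern per layer (dbt style).
con.execute("""
    CREATE VIEW stg_orders AS
    SELECT id, CAST(amount AS REAL) AS amount, status
    FROM raw_orders
""")

# Mart-level query: aggregate for analytics consumers.
revenue = con.execute(
    "SELECT SUM(amount) FROM stg_orders WHERE status = 'paid'"
).fetchone()[0]
print(revenue)  # → 17.75
```

In dbt proper, each layer would be its own versioned SQL model; the point is the same separation of cleaning from aggregation.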
Optimize SQL queries to maximize system performance.
RefinedScience is dedicated to delivering high-quality emerging-tech solutions. While the job description does not specify company size or culture, the role appears to value innovation and collaboration.
Designing and maintaining scalable, secure data pipelines that feed BigQuery from diverse sources
Owning our infrastructure-as-code setup using Terraform
Automating data QA, modeling, and maintenance tasks using scripting and AI
TheyDo helps enterprises align around their customers with an AI-powered journey management platform, fostering smarter decisions and enhanced experiences. With $50M backing and a global team across 27 countries, TheyDo champions a customer-led, people-first culture.
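"Automating data QA" in postings like this typically means scripted assertions that run after each load. A minimal sketch with stdlib sqlite3 — the table, column, and thresholds are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user_id INTEGER, ts TEXT)")
con.executemany("INSERT INTO events VALUES (?, ?)",
                [(1, "2024-01-01"), (2, "2024-01-02"), (None, "2024-01-03")])

def qa_checks(con, table, not_null_col, min_rows=1, max_null_rate=0.5):
    """Run simple data-quality assertions; return a list of failure messages."""
    failures = []
    rows = con.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    if rows < min_rows:
        failures.append(f"{table}: only {rows} rows (expected >= {min_rows})")
    nulls = con.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {not_null_col} IS NULL"
    ).fetchone()[0]
    if rows and nulls / rows > max_null_rate:
        failures.append(f"{table}.{not_null_col}: null rate {nulls / rows:.0%}")
    return failures

print(qa_checks(con, "events", "user_id"))  # → [] (1/3 nulls is under the 50% threshold)
```

In practice these checks would be wired into the orchestrator so a failed assertion blocks downstream models.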
Design, build, and operate ETL pipelines at scale.
Design data structures for data products.
Develop and operate API/tools related to data products and machine learning products.
Mercari is a company that provides a marketplace platform. They value teamwork and provide career growth opportunities as the company continues to expand.
Lead the development of ETL pipelines and Data Lake infrastructure.
Perform ETL and sanitization on disparate data flows from multiple internal and external sources.
Maintain data security, quality, and performance with detailed documentation.
Swing Left helps people maximize their impact on competitive elections to help Democrats win. They have one million members, who have raised more than $140 million and engaged more than 50 million voters.
Build and evolve our semantic layer, design, document, and optimize dbt models.
Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.
Design, build, and maintain efficient ETL/ELT processes and reliable data pipelines.
Build and maintain dashboards and visualizations in Looker Studio and other BI tools.
Ensure data quality, consistency, and accessibility across the organization.
Cove began with renting coliving spaces and has expanded to provide flexible, comfortable stays in beautiful properties. With over 6,000 rooms across Singapore and Indonesia and a growing presence in South Korea and Japan, they aim to build the leading tech-enabled flexible living platform in Asia Pacific, encouraging authenticity and fun.
Architect and maintain robust data pipelines to transform diverse data inputs.
Integrate data from various sources into a unified platform.
Build APIs with AI assistance to enable secure access to consolidated insights.
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
Migrate data and analytics workloads from BigQuery to Snowflake
Develop and optimize ETL/ELT pipelines using Python and SQL
Build analytics-ready datasets for reporting and dashboards
Egen is a fast-growing and entrepreneurial company with a data-first mindset. They bring together the best engineering talent working with the most advanced technology platforms to help clients drive action and impact through data and insights.
Utilize strong SQL & Python skills to engineer sound data pipelines and conduct routine and ad hoc analysis.
Build reporting dashboards and visualizations to define and track campaign/program KPIs.
Perform analyses on large data sets to understand drivers of operational efficiency.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. This company fosters a supportive and inclusive work environment.
Maintain, configure, and optimize the existing data warehouse platform and pipelines.
Design and implement incremental data integration solutions prioritizing data quality, performance, and cost-efficiency.
Drive innovation by experimenting with new technologies and recommending platform improvements.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They appreciate your interest and wish you the best!
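"Incremental data integration" as in the bullets above usually means loading only rows newer than a stored high-water mark, which keeps both cost and runtime proportional to change volume. A minimal sketch using stdlib sqlite3 for both source and warehouse — table names and the cursor column (updated_at) are hypothetical:

```python
import sqlite3

# Source and warehouse stand-ins; schemas are illustrative.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, "2024-01-01"), (2, "2024-01-05"), (3, "2024-01-09")])

wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
wh.execute("CREATE TABLE watermarks (table_name TEXT PRIMARY KEY, high_mark TEXT)")

def incremental_load(src, wh, table, cursor_col):
    """Copy only rows newer than the stored high-water mark, then advance it."""
    row = wh.execute("SELECT high_mark FROM watermarks WHERE table_name = ?",
                     (table,)).fetchone()
    mark = row[0] if row else ""
    new_rows = src.execute(
        f"SELECT * FROM {table} WHERE {cursor_col} > ? ORDER BY {cursor_col}",
        (mark,)).fetchall()
    wh.executemany(f"INSERT INTO {table} VALUES (?, ?)", new_rows)
    if new_rows:
        wh.execute("INSERT OR REPLACE INTO watermarks VALUES (?, ?)",
                   (table, new_rows[-1][-1]))
    return len(new_rows)

print(incremental_load(src, wh, "orders", "updated_at"))  # → 3
print(incremental_load(src, wh, "orders", "updated_at"))  # → 0 (nothing new)
```

The same watermark idea underlies incremental models in dbt and merge-based loads in most warehouses.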
Design, build, and optimize robust and scalable data pipelines into our production BigQuery data warehouse.
Mentor other engineers, lead complex projects, and set high standards for data quality and engineering excellence.
Empower our BI tools, reporting, Marketing, and Data Science initiatives by ensuring a highly reliable and performant data ecosystem.
Peerspace is the leading online marketplace for venue rentals for meetings, productions, and events, opening doors to inspiring spaces worldwide. They have facilitated over $500M in transactions and are backed by investors like GV (Google Ventures) and Foundation Capital.
Design, build, and operate scheduled and event-driven data pipelines for simulation outputs, telemetry, logs, dashboards, and scenario metadata
Build and operate data storage systems (structured and semi-structured) optimized for scale, versioning, and replay
Support analytics, reporting, and ML workflows by exposing clean, well-documented datasets and APIs
Onebrief builds collaboration and AI-powered workflow software designed specifically for military staffs, making staff work faster, smarter, and more efficient. The company is all-remote, with employees working alongside customers; founded in 2019, it has raised $320M+.
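The event-driven pipelines described above reduce, at their core, to a consumer loop over a queue of events that folds each one into downstream state. A minimal single-process sketch with the stdlib queue and json modules — the event shape and aggregate are hypothetical:

```python
import json
import queue

events = queue.Queue()

# Simulated telemetry events; in production these would arrive from a broker
# (Pub/Sub, Kafka, SQS, etc.) rather than being enqueued locally.
for payload in [{"run": 1, "metric": "latency_ms", "value": 42},
                {"run": 1, "metric": "latency_ms", "value": 58}]:
    events.put(json.dumps(payload))

metrics = {}

def handle(raw):
    """Parse one event and fold it into an aggregate keyed by metric name."""
    e = json.loads(raw)
    metrics.setdefault(e["metric"], []).append(e["value"])

while not events.empty():
    handle(events.get())

print(metrics)  # → {'latency_ms': [42, 58]}
```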
Creating and maintaining optimal data pipeline architecture.
Assembling large, complex data sets that meet functional & non-functional business requirements.
Building the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources using relevant technologies.
Mercer Advisors works with families to help them amplify and simplify their financial lives through integrated financial planning, investment management, tax, estate, and insurance services. They serve over 31,300 families in more than 90 cities across the U.S. and are ranked the #1 RIA Firm in the nation by Barron’s.
Build and own Sardine’s internal data infrastructure integrating CRM, marketing, product, finance, and operational systems.
Design, improve, and own ETL/ELT pipelines to ensure clean, reliable, and scalable data flows across the organization.
Partner with data, engineering, revenue/business operations, and executive stakeholders to define and track KPIs.
Sardine is a leader in fraud prevention and AML compliance. Their platform uses device intelligence, behavior biometrics, machine learning, and AI to stop fraud before it happens. Sardine has over 300 banks, retailers, and fintechs as clients and a remote-first work culture.
Architect, design, implement, and operate end-to-end data engineering solutions.
Develop and manage robust data integrations with external vendors.
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.
SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. With over 59 million people reached each month, they operate SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.
Build highly reliable data services to integrate with dozens of blockchains.
Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real-time.
Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.
TRM is a blockchain intelligence company on a mission to build a safer financial system. They are a lean, high-impact team tackling critical challenges, empowering governments, financial institutions, and crypto companies to combat fraud and financial crime.
Build and operate data pipelines from D365, Power Platform, and other sources into the enterprise data platform.
Design and implement star schemas, data lake house structures, and semantic models for Power BI.
Optimize performance and cost management for reporting in Azure.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly and fairly. They identify top-fitting candidates and share the shortlist with the hiring company.
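A star schema, as called for above, is one wide fact table of measures keyed to narrow dimension tables of attributes; Power BI semantic models typically sit directly on this shape. A minimal sketch with stdlib sqlite3 — the tables (dim_customer, fact_sales) are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# One dimension, one fact; names and columns are illustrative.
con.execute("CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, region TEXT)")
con.execute("""CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount REAL)""")
con.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                [(1, "EMEA"), (2, "APAC")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 100.0), (1, 50.0), (2, 75.0)])

# A typical report query: aggregate the fact, slice by a dimension attribute.
rows = con.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_key)
    GROUP BY d.region ORDER BY d.region
""").fetchall()
print(rows)  # → [('APAC', 75.0), ('EMEA', 150.0)]
```

Keeping measures only in facts and attributes only in dimensions is what lets the semantic layer resolve any slice with a single join per dimension.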
Architect and maintain robust, scalable, and secure data infrastructure on AWS leveraging Databricks.
Design, develop, and maintain data pipelines, primarily using tools like Airbyte and custom-built services in Go, to automate data ingestion and ETL processes.
Oversee the creation and maintenance of the data lake, ensuring efficient storage, high data quality, and effective partitioning and organization, with performance monitoring and alerting.
Trust Wallet is the leading non-custodial cryptocurrency wallet, trusted by over 200 million people worldwide to securely manage and grow their digital assets. They aim to be a trusted personal companion — helping users safely navigate Web3, the on-chain economy, and the emerging AI-powered future.