Deliver on data warehouse and reporting requirements using Google BigQuery and the GCP suite.
Architect, design, and build pipelines to move large amounts of data from a variety of sources.
Improve existing data warehouse architecture to enable robust user-facing and internal reporting.
Bitwave is a rapidly expanding startup that specializes in software for businesses that use digital assets and crypto. Its platform provides cryptocurrency accounting, tax tracking, bookkeeping, digital asset treasury management, and crypto AR/AP tooling, and the company recently added full DeFi support.
Designing and maintaining scalable, secure data pipelines that feed BigQuery from diverse sources
Owning our infrastructure-as-code setup using Terraform
Automating data QA, modeling, and maintenance tasks using scripting and AI
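The data-QA automation above can be sketched in plain Python. This is a minimal illustration, not Bitwave's actual tooling: it assumes rows arrive as dicts (e.g. materialized from a BigQuery query result), and the column names and checks are hypothetical.

```python
# Hypothetical automated QA checks over rows pulled from a warehouse table.

def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def is_unique(rows, key):
    """True if every row has a distinct value for `key`."""
    values = [r[key] for r in rows]
    return len(values) == len(set(values))

rows = [
    {"tx_id": "a1", "amount": 10.0},
    {"tx_id": "a2", "amount": None},
    {"tx_id": "a3", "amount": 7.5},
]

assert null_rate(rows, "amount") == 1 / 3
assert is_unique(rows, "tx_id")
```

Checks like these are typically scheduled after each pipeline run so regressions in completeness or key integrity surface before they reach reports.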
TheyDo helps enterprises align around their customers with an AI-powered journey management platform, fostering smarter decisions and enhanced experiences. With $50M backing and a global team across 27 countries, TheyDo champions a customer-led, people-first culture.
Design, build, and maintain robust data pipelines.
Own and scale ETL/ELT processes using tools like dbt, BigQuery, and Python.
Build modular data models that power analytics, product features, and LLM agents.
Jobgether is a platform that uses AI to match candidates with jobs. They aim to review applications quickly and fairly, ensuring the top-fitting candidates are identified and shared with hiring companies.
Build modern, scalable data pipelines that keep the data flowing and keep our clients happy
Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning
Unify and wrangle data from all kinds of sources: SQL, APIs, spreadsheets, cloud storage, and more
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark.
Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
Collaborate with cross-functional teams to deliver impactful data solutions.
Jobgether is an AI-powered platform that helps job seekers find suitable opportunities. They connect top-fitting candidates with hiring companies, streamlining the recruitment process through objective and fair assessments.
Design, build, and operate ETL pipelines at scale.
Design data structure for data products.
Develop and operate API/tools related to data products and machine learning products.
Mercari is a company that provides a marketplace platform. They value teamwork and provide career growth opportunities as the company continues to expand.
Architect, design, implement, and operate end-to-end data engineering solutions.
Develop and manage robust data integrations with external vendors.
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.
SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. With over 59 million people reached each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.
Drive automation through effective metadata management.
Learn and apply modern data preparation and integration techniques.
Jobgether uses an AI-powered matching process to ensure candidate applications are reviewed quickly, objectively, and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company.
Design and build robust data pipelines that integrate data from diverse sources.
Build streaming data pipelines using Kafka and AWS services to enable real-time data processing.
Create and operate data services that make curated datasets accessible to internal teams and external partners.
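A core pattern behind the streaming bullet above is windowed aggregation. As an illustrative sketch only (not Quanata's implementation, and with hypothetical event fields), the tumbling-window counting a Kafka consumer might apply looks like this in plain Python:

```python
from collections import defaultdict

# Hypothetical tumbling-window aggregation over (timestamp, key) events,
# as a streaming consumer might apply per micro-batch.

def tumbling_window_counts(events, window_seconds=60):
    """Count events per (window_start, key) bucket."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "quote"), (30, "quote"), (65, "bind"), (70, "quote")]
result = tumbling_window_counts(events, window_seconds=60)
# → {(0, 'quote'): 2, (60, 'bind'): 1, (60, 'quote'): 1}
```

In production the same logic usually lives in a managed stream processor rather than application code, but the bucketing arithmetic is identical.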
Quanata aims to ensure a better world through context-based insurance solutions. Backed by State Farm, its customer-centered team creates innovative technologies, digital products, and brands, blending Silicon Valley talent with insurer expertise.
Build and evolve our semantic layer; design, document, and optimize dbt models.
Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.
Maintain, configure, and optimize the existing data warehouse platform and pipelines.
Design and implement incremental data integration solutions prioritizing data quality, performance, and cost-efficiency.
Drive innovation by experimenting with new technologies and recommending platform improvements.
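The incremental-integration bullet above usually means watermark-based loading: pull only rows updated since the last successful run, so scan volume (and warehouse cost) tracks the delta rather than the full table. A minimal sketch, with illustrative field names:

```python
# Hypothetical watermark-based incremental load step.

def incremental_load(source_rows, last_watermark):
    """Return rows newer than `last_watermark`, plus the new watermark."""
    fresh = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in fresh),
                        default=last_watermark)
    return fresh, new_watermark

source = [
    {"id": 1, "updated_at": 100},
    {"id": 2, "updated_at": 205},
    {"id": 3, "updated_at": 310},
]
fresh, wm = incremental_load(source, last_watermark=150)
assert [r["id"] for r in fresh] == [2, 3]
assert wm == 310
```

The new watermark is persisted only after the load commits, so a failed run safely re-reads the same delta on retry.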
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They appreciate your interest and wish you the best!
Contribute to the technical integrity and evolution of the Data Platform tech stack.
Design and implement core features and enhancements within the Data Platform.
Build and maintain robust data extraction, loading, and transformation processes.
Dimagi is an award-winning social enterprise and certified B-corp building software solutions and providing technology consulting services to improve the quality of essential services for underserved populations. They are passionate and committed to tackling complex health and social inequities and working towards a brighter future for all.
Take full ownership of data structures and data handling
Relai is on a mission to make Bitcoin the go-to savings technology, striving to make it simple, accessible, and secure. As Europe’s leading Bitcoin startup, they are expanding their team of passionate Bitcoiners.
Be the Analytics Engineering lead within the Sales and Marketing organization.
Be the data steward for Sales and Marketing: architect and improve the data collection.
Develop and maintain robust data pipelines and workflows for data ingestion and transformation.
Reddit is a community-driven platform built on shared interests and trust, fostering open and authentic conversations. With over 100,000 active communities and approximately 116 million daily active unique visitors, it serves as a major source of information on the internet.
Lead the development of ETL pipelines and Data Lake infrastructure.
Perform ETL and sanitization on disparate data flows from multiple internal and external sources.
Maintain data security, quality, and performance with detailed documentation.
Swing Left helps people maximize their impact on competitive elections to help Democrats win. They have one million members, who have raised more than $140 million and engaged more than 50 million voters.
Manage the flow of data from ingestion to final consumption.
Develop and maintain entity-relationship models.
Design and implement data pipelines, preferably using SQL or Python.
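The entity-relationship bullet above can be made concrete with a toy one-to-many model. This is purely illustrative (table and column names are hypothetical, and sqlite3 stands in for the production warehouse):

```python
import sqlite3

# Hypothetical one-to-many ER model: one course has many enrollments.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE course (
        course_id INTEGER PRIMARY KEY,
        title     TEXT NOT NULL
    );
    CREATE TABLE enrollment (
        enrollment_id INTEGER PRIMARY KEY,
        course_id     INTEGER NOT NULL REFERENCES course(course_id),
        student_email TEXT NOT NULL
    );
""")
conn.execute("INSERT INTO course VALUES (1, 'Intro to SQL')")
conn.executemany(
    "INSERT INTO enrollment (course_id, student_email) VALUES (?, ?)",
    [(1, "a@example.com"), (1, "b@example.com")],
)
row = conn.execute("""
    SELECT c.title, COUNT(*) AS students
    FROM course c JOIN enrollment e ON e.course_id = c.course_id
    GROUP BY c.title
""").fetchone()
assert row == ("Intro to SQL", 2)
```

The foreign key on `enrollment.course_id` is what the ER diagram's crow's-foot edge encodes; downstream pipelines join along exactly these keys.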
Teachable is a platform for experts and businesses who take education seriously. They help experts and businesses scale their impact and operations through courses, coaching, and digital downloads that students actually love. Teachable is part of the global Hotmart Company portfolio.
Design, build, and maintain scalable data pipelines.
Develop and implement data models for analytical use cases.
Implement data quality checks and governance practices.
MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.
Creating and maintaining optimal data pipeline architecture.
Assembling large, complex data sets that meet functional & non-functional business requirements.
Building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using relevant technologies.
Mercer Advisors works with families to help them amplify and simplify their financial lives through integrated financial planning, investment management, tax, estate, and insurance services. They serve over 31,300 families in more than 90 cities across the U.S. and are ranked the #1 RIA Firm in the nation by Barron’s.
Serve as a primary advisor to identify technical improvements and automation opportunities.
Build advanced data pipelines using the medallion architecture in Snowflake.
Write advanced ETL/ELT scripts to integrate data into enterprise data stores.
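The medallion architecture named above layers data as bronze (raw as ingested), silver (cleaned and deduplicated), and gold (business-level aggregates). As a hedged sketch only, with plain Python standing in for Snowflake tables and illustrative field names:

```python
# Hypothetical bronze → silver → gold flow of a medallion architecture.

bronze = [  # raw, as ingested: duplicates and bad values included
    {"account": " A-1 ", "balance": "100.0"},
    {"account": " A-1 ", "balance": "100.0"},
    {"account": "A-2",   "balance": "oops"},
    {"account": "A-3",   "balance": "250.5"},
]

def to_silver(rows):
    """Clean and deduplicate: trim keys, drop unparseable balances."""
    seen, silver = set(), []
    for r in rows:
        account = r["account"].strip()
        try:
            balance = float(r["balance"])
        except ValueError:
            continue  # quarantine in practice; dropped here for brevity
        if account not in seen:
            seen.add(account)
            silver.append({"account": account, "balance": balance})
    return silver

def to_gold(rows):
    """Aggregate silver rows into a business-level metric."""
    return {"total_balance": sum(r["balance"] for r in rows)}

silver = to_silver(bronze)
assert [r["account"] for r in silver] == ["A-1", "A-3"]
assert to_gold(silver) == {"total_balance": 350.5}
```

Keeping bronze immutable means silver and gold can always be rebuilt from raw history when cleaning rules change.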
Spring Venture Group is a digital direct-to-consumer sales and marketing company focused on the senior market. They have a dedicated team of licensed insurance agents and leverage technology to help seniors navigate Medicare.
Build a scalable, reliable, operable, and performant big data workflow platform.
Drive the usage of Freight's data model across the organization with multiple product teams.
Drive efficiency and reliability improvements through design and automation.
Uber Freight is an enterprise technology company powering intelligent logistics with end-to-end logistics applications, managed services, and an expansive carrier network. Today, the company manages nearly $20B of freight, has one of the largest networks of carriers and is backed by best-in-class investors.