Integrate extracted data into our OpenData platform
Veeva Systems is a mission-driven organization and pioneer in industry cloud, helping life sciences companies bring therapies to patients faster. As one of the fastest-growing SaaS companies in history, they surpassed $2B in revenue with extensive growth potential ahead. They value doing the right thing, customer success, employee success, and speed.
Design, build, and maintain ETL/ELT pipelines and integrations across legacy and cloud systems.
Model, store, and transform data to support analytics, reporting, and downstream applications.
Build API-based and file-based integrations across enterprise platforms.
Jobgether is a platform that uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly. It identifies the top-fitting candidates and shares this shortlist directly with the hiring company.
Build and optimize scalable, efficient ETL and data lake processes.
Own the ingestion, modeling, and transformation of structured and unstructured data.
Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows.
Launch Potato is a digital media company that connects consumers with brands through data-driven content and technology. They have a remote-first team spanning over 15 countries and have built a high-growth, high-performance culture.
Identify and explore public and compliant channels for 3D model assets.
Design and develop high-performance web crawlers to automate the end-to-end pipeline.
Maintain and optimize distributed crawler systems; monitor node health to ensure system stability.
Meshy, headquartered in Silicon Valley, is a leading 3D generative AI company on a mission to unleash 3D creativity by transforming the content creation pipeline. They are trusted by top developers, backed by premier venture capital firms, and have raised $52 million in funding.
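As an illustration of the crawler work described in the bullets above, here is a minimal breadth-first crawl sketch. The function names and the injected `fetch` callable are hypothetical; a real deployment would plug in an HTTP client, politeness delays, and distributed queueing.

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed, fetch, max_pages=100):
    """Breadth-first crawl starting at `seed`.

    `fetch(url) -> html` is injected so the traversal logic stays
    testable without network access. Returns the set of discovered URLs
    (may slightly exceed `max_pages`, since links found on the last
    fetched page are still recorded).
    """
    seen = {seed}
    queue = deque([seed])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        parser = LinkExtractor()
        parser.feed(fetch(url))
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen
```

Injecting `fetch` keeps the frontier logic separate from networking, which also makes node-level monitoring (retries, timeouts) easy to wrap around the fetcher alone.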
Build modern, scalable data pipelines that keep the data flowing and keep our clients happy.
Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning
Unify and wrangle data from all kinds of sources: SQL, APIs, spreadsheets, cloud storage, and more
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Design, develop, and maintain Epic’s core backend systems and services.
Contribute to the design and development of Epic’s enterprise data warehouse (EDW).
Build, optimize, and maintain data pipelines to ensure high data quality, reliability, and performance.
Epic is the leading digital reading platform for kids ages 12 and under, used by millions of children, families, and educators around the world. As Epic continues to grow, they are reimagining what reading can be through thoughtful technology, data, and global collaboration to make learning more engaging, accessible, and impactful.
Creating and maintaining optimal data pipeline architecture.
Assembling large, complex data sets that meet functional & non-functional business requirements.
Building the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources using relevant technologies.
Mercer Advisors works with families to help them amplify and simplify their financial lives through integrated financial planning, investment management, tax, estate, and insurance services. They serve over 31,300 families in more than 90 cities across the U.S. and are ranked the #1 RIA Firm in the nation by Barron’s.
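A toy version of the extract/transform/load flow described in the bullets above, as a sketch only: the source data and schema are hypothetical, and an in-memory `sqlite3` database stands in for the target warehouse.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract from a source system.
RAW = """account_id,balance
A1,1200.50
A2,980.00
A1,300.25
"""

def extract(text):
    """Parse CSV text into row dicts (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Clean types and aggregate balances per account."""
    totals = {}
    for row in rows:
        acct = row["account_id"].strip()
        totals[acct] = totals.get(acct, 0.0) + float(row["balance"])
    return sorted(totals.items())

def load(pairs, conn):
    """Write the curated result into the warehouse table; return row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS balances (account_id TEXT, total REAL)")
    conn.executemany("INSERT INTO balances VALUES (?, ?)", pairs)
    return conn.execute("SELECT COUNT(*) FROM balances").fetchone()[0]
```

Keeping the three steps as separate pure-ish functions is what makes pipelines like this testable and easy to rewire when sources change.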
Design, develop, and maintain dbt data models that support our healthcare analytics products.
Integrate and transform customer data to conform to our data specifications and pipelines.
Design and execute initiatives that improve data platform and pipeline automation and resilience.
SmarterDx, a Smarter Technologies company, builds clinical AI that is transforming how hospitals translate care into payment. Founded by physicians in 2020, SmarterDx offers a platform that connects clinical context with revenue intelligence, helping health systems recover millions in missed revenue, improve quality scores, and appeal every denial.
Architect, design, implement, and operate end-to-end data engineering solutions.
Develop and manage robust data integrations with external vendors.
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.
SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. With over 59 million people reached each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.
Code, test, and document new or modified data pipelines.
Conduct logical and physical database design.
Perform root cause analysis on internal and external data.
Aker Systems builds and operates ground-breaking, ultra-secure, high-performance, cloud-based data infrastructure for the enterprise. They were recognised as a ‘One to Watch’ on the Sunday Times Tech Track and won the Thames Valley Tech Company of the Year award.
Collect and extract data from websites, directories, and other sources using manual and automated methods
Ensure accuracy, consistency, and completeness of gathered data
Maintain organized records and documentation of data sources and collection processes
Foundry for Good builds businesses that do good. Across our family of brands, we support nonprofits, trade associations, and mission-driven organizations with innovative software, impactful marketing strategies, and tools that empower positive change. Our employee retention rate reflects our commitment to competitive pay, respect, and career development.
Design and build robust data pipelines that integrate data from diverse sources.
Build streaming data pipelines using Kafka and AWS services to enable real-time data processing.
Create and operate data services that make curated datasets accessible to internal teams and external partners.
Quanata aims to ensure a better world through context-based insurance solutions. Backed by State Farm, its customer-centered team creates innovative technologies, digital products, and brands, blending Silicon Valley talent with insurer expertise.
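The real-time processing mentioned in the bullets above (the role names Kafka and AWS) often reduces to windowed aggregation over an event stream. This sketch simulates the consumer loop with a plain iterable so the windowing logic is visible; the `(timestamp, event_type)` schema is a hypothetical stand-in for real messages.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group a stream of (timestamp, event_type) pairs into fixed
    (tumbling) windows and count events per type in each window.

    In a Kafka deployment, `events` would be messages pulled from a
    consumer rather than an in-memory list.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, event_type in events:
        # Align each event to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][event_type] += 1
    return {w: dict(counts) for w, counts in windows.items()}
```

Tumbling (non-overlapping) windows are the simplest case; sliding or session windows change only how `window_start` is assigned.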
Build modern, scalable data pipelines that keep the data flowing.
Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning.
Unify and wrangle data from all kinds of sources.
InterWorks is a people-focused tech consultancy that empowers clients with customized, collaborative solutions. They value unique contributions, and their people are the glue that holds their business together.
Collaborate with business leaders, engineers, and product managers to understand data needs.
Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
NBCUniversal is one of the world's leading media and entertainment companies, creating world-class content that we distribute across our portfolio of film, television, and streaming, and bring to life through our global theme park destinations, consumer products, and experiences. We champion an inclusive culture and strive to attract and develop a talented workforce to create and deliver a wide range of content reflecting our world.
Develop data warehouse applications, including extraction, ingestion, and transformation processes.
Collaborate with customers and internal teams to understand business requirements.
Ensure quality assurance and data validation, utilizing the sprint methodology.
One Model, founded by industry veterans in HR analytics, takes a data-first approach to its People Analytics Platform, giving it a competitive advantage. They foster a friendly, inclusive, and respectful workplace culture, offering the opportunity to contribute significantly to a young company and team.
Design, optimize, and own data pipelines that scrape, process, and ingest transaction and listing data from major auction houses and marketplaces.
Build comprehensive monitoring and alerting systems to track latency, uptime, and coverage metrics across all data sources.
Continuously improve our data infrastructure by modernizing storage and processing technologies, reducing manual interventions, and optimizing for cost, performance, and reliability.
Alt is unlocking the value of alternative assets, starting with the $5B trading-card market. They let collectors buy, sell, vault, and finance their cards in one place and are backed by leaders at Stripe, Coinbase, Seven Seven Six, and pro athletes like Tom Brady and Giannis Antetokounmpo.
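The monitoring and alerting responsibility described above might be sketched as a per-source latency tracker like the one below. The class name, source names, and the 500 ms threshold are all hypothetical; a production system would emit these metrics to a monitoring backend rather than compute them in-process.

```python
import statistics

class SourceMonitor:
    """Tracks per-source latency samples and failure counts, and flags
    sources whose ~95th-percentile latency exceeds a threshold."""

    def __init__(self, p95_threshold_ms=500):
        self.p95_threshold_ms = p95_threshold_ms
        self.latencies = {}   # source -> list of latency samples (ms)
        self.failures = {}    # source -> failed-request count

    def record(self, source, latency_ms, ok=True):
        self.latencies.setdefault(source, []).append(latency_ms)
        if not ok:
            self.failures[source] = self.failures.get(source, 0) + 1

    def alerts(self):
        """Return sources whose ~p95 latency is over the threshold."""
        slow = []
        for source, samples in self.latencies.items():
            # quantiles(n=20) yields 19 cut points; the last is ~p95.
            p95 = statistics.quantiles(samples, n=20)[-1]
            if p95 > self.p95_threshold_ms:
                slow.append(source)
        return sorted(slow)
```

Tracking a high percentile rather than the mean is the usual choice here, since a single slow source can hide behind a healthy average.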
Design and build scalable and high-performance data software solutions using Golang and Python.
Build and deploy Kubernetes-based systems to manage containerized applications in cloud-native environments.
Collaborate with cross-functional teams to understand and address customer needs, ensuring systems evolve accordingly.
Machinify is a healthcare intelligence company focused on delivering value, transparency, and efficiency to health plan clients. They deploy a configurable, AI-powered platform used by over 85 health plans, representing more than 270 million lives, and foster a flexible and trusting environment.
Designing and maintaining scalable, secure data pipelines that feed BigQuery from diverse sources
Owning our infrastructure-as-code setup using Terraform
Automating data QA, modeling, and maintenance tasks using scripting and AI
TheyDo helps enterprises align around their customers with an AI-powered journey management platform, fostering smarter decisions and enhanced experiences. With $50M backing and a global team across 27 countries, TheyDo champions a customer-led, people-first culture.
Design, build, and operate scheduled and event-driven data pipelines for simulation outputs, telemetry, logs, dashboards, and scenario metadata
Build and operate data storage systems (structured and semi-structured) optimized for scale, versioning, and replay
Support analytics, reporting, and ML workflows by exposing clean, well-documented datasets and APIs
Onebrief builds collaboration and AI-powered workflow software designed specifically for military staffs. They transform this work, making staffs faster, smarter, and more efficient. The company is all-remote, with employees working alongside customers; it was founded in 2019 and has raised $320M+.