Build modern, scalable data pipelines that keep the data flowing and keep our clients happy.
Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning.
Unify and wrangle data from all kinds of sources: SQL, APIs, spreadsheets, cloud storage, and more.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Lead requirements-gathering efforts for product and advanced analytics.
Work with analytics, data science, and wider engineering teams to automate data analysis and visualization.
Build a scalable technology platform to support a growing business and deliver high-quality code to production.
Achieve is a leading digital personal finance company that helps everyday people move from struggling to thriving by providing innovative, personalized financial solutions. They have over 3,000 employees in mostly hybrid and 100% remote roles across the United States with hubs in Arizona, California, and Texas and a culture of putting people first.
Drive automation through effective metadata management (see the sketch after this list).
Learn and apply modern data preparation and integration techniques.
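To make "automation through effective metadata management" concrete, here is a minimal, hypothetical Python sketch in which a small metadata registry drives load-statement generation, so onboarding a new dataset becomes a metadata edit rather than a new hand-written job. The table names, registry layout, and the render_load_sql helper are illustrative assumptions, not details from the listing.

```python
# Hypothetical metadata-driven ingestion: one generic loop replaces N
# copy-pasted pipelines. All names below are invented for illustration.
TABLE_REGISTRY = [
    {"name": "orders",    "source": "s3://raw/orders/",    "load": "incremental", "key": "updated_at"},
    {"name": "customers", "source": "s3://raw/customers/", "load": "full",        "key": None},
]

def render_load_sql(meta: dict) -> str:
    """Build a load statement from table metadata instead of a hand-written job."""
    if meta["load"] == "incremental":
        return (f"COPY staging.{meta['name']} FROM '{meta['source']}' "
                f"-- then MERGE on {meta['key']} > last watermark")
    return f"TRUNCATE analytics.{meta['name']}; COPY analytics.{meta['name']} FROM '{meta['source']}'"

for meta in TABLE_REGISTRY:
    print(render_load_sql(meta))  # adding a table is now a one-line registry edit
```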
Jobgether uses an AI-powered matching process to ensure candidate applications are reviewed quickly, objectively, and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company.
Own Neko’s Data Engineering strategy, ensuring long-term leadership in preventive health.
Drive innovation in data engineering technologies and practices, balancing current and future needs.
Build a platform supporting responsible AI: fairness, privacy, explainability, and compliance.
Neko Health is a Swedish healthcare technology company focused on shifting healthcare from reactive treatment toward preventative health and early detection. They have over 500 employees, with offices in Stockholm, London, and Manchester, and aim to create a flexible work environment.
Design and implement data pipelines and workflows.
Develop and maintain data models for optimal data storage.
Collaborate with cross-functional teams to gather data requirements.
Jobgether is a platform that connects job seekers with companies using AI-powered matching. They aim to ensure applications are reviewed quickly and fairly.
Manage the flow of data from ingestion to final consumption.
Develop and maintain entity-relationship models.
Design and implement data pipelines, preferably using SQL or Python (see the sketch after this list).
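As an illustration of the Python side of that bullet, here is a self-contained extract-transform-load sketch using only the standard library; the sample data, column names, and in-memory SQLite target are hypothetical stand-ins for a real source and warehouse.

```python
import csv, io, sqlite3

# Made-up sample input standing in for a real extract.
RAW_CSV = io.StringIO("course_id,price\n1, 49.00 \n2,\n3,19.50\n")

def extract(fh):
    return list(csv.DictReader(fh))

def transform(rows):
    # Drop rows with missing prices and normalize types.
    return [(int(r["course_id"]), float(r["price"])) for r in rows if r["price"].strip()]

def load(rows):
    con = sqlite3.connect(":memory:")  # stand-in for the warehouse
    con.execute("CREATE TABLE course_prices (course_id INTEGER, price REAL)")
    con.executemany("INSERT INTO course_prices VALUES (?, ?)", rows)
    return con.execute("SELECT COUNT(*) FROM course_prices").fetchone()[0]

print(load(transform(extract(RAW_CSV))))  # -> 2 (the empty-price row is dropped)
```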
Teachable is a platform for experts and businesses who take education seriously. They help experts and businesses scale their impact and operations through courses, coaching, and digital downloads that students actually love. Teachable is part of the global Hotmart Company portfolio.
Design and build robust data pipelines that integrate data from diverse sources.
Build streaming data pipelines using Kafka and AWS services to enable real-time data processing (see the sketch after this list).
Create and operate data services that make curated datasets accessible to internal teams and external partners.
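One plausible shape for the streaming bullet above, sketched with the kafka-python and boto3 libraries (both assumptions; the listing names only Kafka and AWS): consume events and land micro-batches in S3. The topic, consumer group, and bucket names are placeholders.

```python
import json
import boto3
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "policy-events",                       # placeholder topic name
    bootstrap_servers="localhost:9092",
    group_id="s3-lander",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
s3 = boto3.client("s3")

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 500:                  # flush in micro-batches, not per event
        key = f"raw/policy-events/offset={message.offset}.json"
        s3.put_object(Bucket="example-data-lake", Key=key,
                      Body=json.dumps(batch).encode("utf-8"))
        batch.clear()
```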
Quanata aims to ensure a better world through context-based insurance solutions. It is a customer-centered team creating innovative technologies, digital products, and brands, backed by State Farm, blending Silicon Valley talent with insurer expertise.
Build and evolve our semantic layer; design, document, and optimize dbt models (see the sketch after this list).
Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
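A minimal orchestration sketch, assuming Apache Airflow as the scheduler (the listing says only "ETL/orchestration"): one DAG builds the dbt models, then runs dbt's tests before anything downstream consumes them. The DAG id, schedule, and project path are invented.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_semantic_layer",           # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(task_id="dbt_run",
                           bash_command="dbt run --project-dir /opt/dbt")
    dbt_test = BashOperator(task_id="dbt_test",
                            bash_command="dbt test --project-dir /opt/dbt")
    dbt_run >> dbt_test  # build first, then let tests gate downstream use
```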
Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.
Collaborate with business leaders, engineers, and product managers to understand data needs.
Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and self-service solutions.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.
NBCUniversal is one of the world's leading media and entertainment companies, creating world-class content that it distributes across its portfolio of film, television, and streaming and brings to life through global theme park destinations, consumer products, and experiences. The company champions an inclusive culture and strives to attract and develop a talented workforce to create and deliver a wide range of content reflecting our world.
Architect and maintain robust data pipelines to transform diverse data inputs.
Integrate data from various sources into a unified platform.
Build APIs with AI assistance to enable secure access to consolidated insights (see the sketch after this list).
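For flavor, a hypothetical read-only endpoint over consolidated data, sketched with FastAPI (the listing names no framework); the Insight model and in-memory store are placeholders for a real query layer sitting behind proper authentication.

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Insight(BaseModel):
    source: str
    abuse_reports: int

# Stand-in for the unified platform; a real service would query it instead.
FAKE_STORE = {"198.51.100.7": Insight(source="honeypot", abuse_reports=12)}

@app.get("/insights/{ip}", response_model=Insight)
def get_insight(ip: str) -> Insight:
    if ip not in FAKE_STORE:
        raise HTTPException(status_code=404, detail="unknown address")
    return FAKE_STORE[ip]
```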
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
Design, build, and maintain scalable, high-quality data pipelines.
Implement robust data ingestion, transformation, and storage using cloud-based technologies.
Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world and partnerships with more than 1,000 clients, they foster a diverse, inclusive, and safe work environment.
Design, optimize, and own data pipelines that scrape, process, and ingest transaction and listing data from major auction houses and marketplaces.
Build comprehensive monitoring and alerting systems to track latency, uptime, and coverage metrics across all data sources (see the sketch after this list).
Continuously improve our data infrastructure by modernizing storage and processing technologies, reducing manual interventions, and optimizing for cost, performance, and reliability.
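A minimal freshness check of the kind the monitoring bullet describes, assuming a Postgres-compatible warehouse reachable via psycopg2; the source names, thresholds, and table layout are invented.

```python
import psycopg2  # assumption: the warehouse speaks the Postgres protocol

THRESHOLDS = {"auction_house_a": 3600, "marketplace_b": 900}  # max lag, seconds

def check_freshness(dsn: str) -> list[str]:
    """Return one alert string per source whose newest row is too old."""
    alerts = []
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        for source, max_lag in THRESHOLDS.items():
            cur.execute(
                "SELECT EXTRACT(EPOCH FROM now() - MAX(ingested_at)) "
                "FROM raw_listings WHERE source = %s", (source,))
            lag = cur.fetchone()[0]
            if lag is None or lag > max_lag:
                alerts.append(f"{source}: stale or empty (lag={lag}s)")
    return alerts  # a real system would route these to Slack/PagerDuty
```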
Alt is unlocking the value of alternative assets, starting with the $5B trading-card market. They let collectors buy, sell, vault, and finance their cards in one place and are backed by leaders at Stripe, Coinbase, Seven Seven Six, and pro athletes like Tom Brady and Giannis Antetokounmpo.
Design, build, and maintain scalable data pipelines.
Develop and implement data models for analytical use cases.
Implement data quality checks and governance practices (see the sketch after this list).
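One lightweight way to express such checks, sketched with pandas (an assumption; the listing names no tooling): a rule table that gates publication of a batch. Column names and rules are placeholders.

```python
import pandas as pd

# Hypothetical rules: each maps a column to a predicate the batch must satisfy.
RULES = {
    "record_id": lambda s: s.notna().all() and s.is_unique,
    "amount":    lambda s: (s >= 0).all(),
}

def validate(df: pd.DataFrame) -> list[str]:
    """Return failed rule names; an empty list means the batch may be published."""
    return [col for col, rule in RULES.items() if col not in df or not rule(df[col])]

batch = pd.DataFrame({"record_id": [1, 2, 3], "amount": [10.0, 0.0, 5.5]})
assert validate(batch) == []  # this batch passes both rules
```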
MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.
Enable efficient consumption of domain data as a product by delivering and promoting strategically designed, actionable datasets and data models.
Build, maintain, and improve rock-solid data pipelines using a broad range of technologies like AWS Redshift, Trino, Spark, Airflow, and Kafka streaming for real-time processing (see the sketch after this list).
Support teams without data engineers in building decentralised data solutions and product integrations, for example around DynamoDB.
Act as a data ambassador, promoting the value of data and our data platform among engineering teams and enabling cooperation.
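A batch-transform sketch with PySpark, one of the technologies the list names; the S3 paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("listings_daily").getOrCreate()

raw = spark.read.parquet("s3://example-raw/listings/")   # placeholder path
clean = (
    raw.dropDuplicates(["listing_id"])
       .withColumn("event_date", F.to_date("created_at"))
       .filter(F.col("price") > 0)
)
# Partitioning by date keeps downstream Trino/Redshift scans cheap.
clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-curated/listings/")
```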
OLX operates consumer brands that facilitate trade to build a more sustainable world. They have colleagues around the world who serve millions of people every month.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis (see the sketch after this list).
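As a small illustration of wiring these AWS services together, here is a boto3 sketch that starts a Glue job and polls it to completion; the job name and polling cadence are invented.

```python
import time
import boto3

glue = boto3.client("glue")

run_id = glue.start_job_run(JobName="fares_transform")["JobRunId"]  # invented job
while True:
    run = glue.get_job_run(JobName="fares_transform", RunId=run_id)
    state = run["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED"):
        break
    time.sleep(30)  # Glue runs are long-lived; poll rather than spin
print(state)
```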
ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries, and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional / non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.
Tivity Health, Inc. provides healthy life-changing solutions, including SilverSneakers®, Prime® Fitness, and WholeHealth Living®. They help adults improve their health and support them on life's journey by providing access to in-person and virtual physical activity, social and mental enrichment programs. Tivity Health is an equal employment opportunity employer and is committed to a proactive program of diversity development.
Build modern, scalable data pipelines that keep the data flowing.
Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning.
Unify and wrangle data from all kinds of sources.
InterWorks is a people-focused tech consultancy that empowers clients with customized, collaborative solutions. They value unique contributions, and their people are the glue that holds their business together.
Design, build, and operate scheduled and event-driven data pipelines for simulation outputs, telemetry, logs, dashboards, and scenario metadata (see the sketch after this list).
Build and operate data storage systems (structured and semi-structured) optimized for scale, versioning, and replay.
Support analytics, reporting, and ML workflows by exposing clean, well-documented datasets and APIs.
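One plausible event-driven entry point, assuming S3-triggered AWS Lambda (the listing does not name a trigger): a handler that picks up each new object, parses it, and hands it to the versioned store. The bucket layout and payload format are assumptions.

```python
import json
import urllib.parse
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Invoked by S3 object-created notifications (hypothetical wiring)."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        payload = json.loads(body)  # e.g. one simulation-output file
        # ...validate, enrich, and forward to the versioned store here...
        print(f"ingested {key}: {len(payload)} records")
```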
Onebrief is collaboration and AI-powered workflow software designed specifically for military staffs, transforming staff work to make teams faster, smarter, and more efficient. The company is all-remote, with employees working alongside customers; it was founded in 2019 and has raised $320m+.
Design and develop scalable data pipelines and infrastructure to process large volumes of data efficiently.
Collaborate with cross-functional teams to ensure data integrity, accessibility, and usability.
Implement and maintain data quality measures throughout the data lifecycle.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have a culture that values diverse identities and life experiences and fosters an inclusive, safe work environment.
Maintain, configure, and optimize the existing data warehouse platform and pipelines.
Design and implement incremental data integration solutions prioritizing data quality, performance, and cost-efficiency (see the sketch after this list).
Drive innovation by experimenting with new technologies and recommending platform improvements.
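To illustrate the incremental pattern, here is a watermark-based load sketched with sqlite3 so it runs anywhere; in practice the connections would point at the warehouse, and the table and column names are invented.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE source_orders (id INTEGER, updated_at TEXT);
    CREATE TABLE target_orders (id INTEGER, updated_at TEXT);
    CREATE TABLE watermarks (tbl TEXT PRIMARY KEY, high_mark TEXT);
    INSERT INTO source_orders VALUES (1, '2024-01-01'), (2, '2024-02-01');
    INSERT INTO watermarks VALUES ('orders', '2024-01-15');
""")

def incremental_load(con):
    # Only rows newer than the stored watermark move, so cost tracks change
    # volume rather than table size.
    (mark,) = con.execute("SELECT high_mark FROM watermarks WHERE tbl='orders'").fetchone()
    rows = con.execute(
        "SELECT id, updated_at FROM source_orders WHERE updated_at > ?", (mark,)).fetchall()
    con.executemany("INSERT INTO target_orders VALUES (?, ?)", rows)
    if rows:
        con.execute("UPDATE watermarks SET high_mark=? WHERE tbl='orders'",
                    (max(r[1] for r in rows),))
    return len(rows)

print(incremental_load(con))  # -> 1 (only the 2024-02-01 row is new)
```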
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They appreciate your interest and wish you the best!