Create and maintain optimal data pipeline architecture
Ensure reliability and scalability of our data infrastructure
Build the infrastructure required for optimal extraction, transformation, and loading of data
Kamino Retail (powered by Equativ) is a pioneering SaaS platform at the forefront of retail media innovation. They equip retailers with advanced tools and solutions to revolutionize their advertising strategies, amplify customer engagement, and drive results.
Designing, building and maintaining data pipelines and data warehouses.
Developing ETL ecosystem tooling using Python, external APIs, Airbyte, Snowflake, dbt, PostgreSQL, MySQL, and Amazon S3.
Helping to define automated solutions to solve complex problems around better understanding data, users, and the market
Neighborhoods.com provides real estate software platforms, including 55places.com and neighborhoods.com. They strive to foster an inclusive environment, comfortable for everyone, and will not tolerate harassment or discrimination of any kind.
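Stacks like the one above (Python, Airbyte, Snowflake, dbt) usually follow an extract-load-transform pattern: land raw data first, then clean it inside the warehouse. A toy sketch, with the stdlib `sqlite3` standing in for a warehouse such as Snowflake; all table and column names are hypothetical:

```python
import sqlite3

def run_pipeline(raw_rows):
    """Load raw rows, then transform them in the warehouse (ELT)."""
    conn = sqlite3.connect(":memory:")  # stand-in for Snowflake
    conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?)", raw_rows)

    # Transform step: the kind of cleanup a dbt model expresses in SQL --
    # cast text amounts to numbers and drop malformed rows.
    conn.execute("""
        CREATE TABLE orders AS
        SELECT id, CAST(amount AS REAL) AS amount
        FROM raw_orders
        WHERE amount GLOB '[0-9]*'
    """)
    return conn.execute("SELECT id, amount FROM orders ORDER BY id").fetchall()

print(run_pipeline([(1, "10.5"), (2, "oops"), (3, "7")]))  # → [(1, 10.5), (3, 7.0)]
```

In a real deployment the extract/load half would be handled by a connector tool like Airbyte and the transform half by versioned dbt models, but the raw-then-clean layering is the same.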
Build and maintain reliable pipelines and datasets that enable Flex models.
Drive pragmatic improvements in architecture, tooling, and operating practices.
Implement testing, lineage/definitions, and guardrails so stakeholders can trust outputs.
Enode is a platform that powers the next generation of green energy apps by connecting and optimizing the world’s energy devices through its APIs, enabling flexible demand that prioritizes renewable energy. Backed by leading investors like Y Combinator, Lowercarbon Capital, and Creandum, Enode fosters a mission-driven, passionate team.
Develop and maintain custom integrations and ETL processes.
Design, implement, and optimize data pipelines.
Perform quality assurance on downstream data.
Newsela provides real-world content from trusted sources and adapts it for K-12 classrooms. Over 3.3 million teachers and 40 million students use Newsela.
Design, build, and optimize scalable data architectures that power marketing analytics and survey measurement initiatives.
Deliver automated, high-impact data solutions and insights that enhance decision-making across teams.
Build robust pipelines, dashboards, and analytical frameworks in fast-paced environments.
ItD is a consulting and software development company that blends diversity, innovation, and integrity with real business results. They reject rigid hierarchy, empowering teams to deliver excellent results in a woman- and minority-led firm.
Design, build, and maintain robust ETL/ELT pipelines to ingest large-scale datasets and high-frequency streams.
Lead the design and evolution of our enterprise data warehouse, ensuring it is scalable and performant.
Manage our data transformation layer using Dataform (preferred) or dbt to orchestrate complex, reliable workflows.
UW provides utilities all in one place, including energy, broadband, mobile, and insurance. They aim to double in size and offer savings to customers, fostering a culture that values imaginative and pragmatic problem-solvers.
Design and implement complex data models, model metadata, build reports and dashboards, and create reporting tools for users of data science and ML products.
Design and deploy data infrastructure needed to drive data-driven decision-making solutions
Be the company’s expert on data administration and master data management
Vanta's mission is to help businesses earn and prove trust by continuously monitoring and verifying security. They empower companies to improve and prove their security with ease. Vanta has a kind and talented team, and while some have prior security experience, many have been successful without it.
Design, build, and scale the lakehouse architecture that underpins analytics, machine learning, and AI.
Modernize our data ecosystem, making it discoverable, reliable, governed, and ready for self-service and intelligent automation.
Operate anywhere along the data lifecycle from ingestion and transformation to metadata, orchestration, and MLOps.
OnX is a pioneer in digital outdoor navigation with a suite of apps. With more than 400 employees, they have created regional “Basecamps” to help remote employees find connection and inspiration.
Build data pipelines for coaching and user analytics.
Build data systems that power product features.
Establish our data infrastructure and architecture.
Mento is a career technology company that helps people be exceptional and thrive at work through human, AI, and software-based coaching. They strive to create a fun, conscientious, collaborative, and supportive work environment, with a unique model that brings coaching to every member of the global workforce.
Architect and maintain central storage and cloud environment.
Design and automate scalable ELT/ETL pipelines for data.
Support scientists and operational teams by designing data models.
Funga is a public benefit corporation using forest fungal networks to address climate change. They combine DNA sequencing and machine learning with forest microbiome research to improve wood creation, carbon sequestration, and forest resilience. They are a team of scientists and builders aiming to remove three gigatons of carbon dioxide from the atmosphere by 2050.
Develop and optimize ETL and ELT processes using tools like dbt, Informatica, or Talend.
Define data architectures and flows, ensuring solutions align with business needs.
Contribute to the creation of reusable data frameworks and accelerators for multiple projects.
Jobgether uses an AI-powered matching process to ensure candidate applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Design, build, and optimize scalable data lakes, warehouses, and data pipelines using Snowflake and modern cloud platforms.
Develop and maintain robust data models (ELT/ETL), ensuring clean, reliable, and well-documented datasets.
Design and manage end-to-end data pipelines that ingest, transform, and unify data from multiple systems.
Cobalt Service Partners is building the leading commercial access and security integration business in North America. Backed by Alpine Investors, with $15B+ in AUM, Cobalt has scaled rapidly since launch through acquisitions and is building a differentiated, data-driven platform.
Design, develop, and maintain ETL/ELT pipelines on cloud-based data platforms.
Build data ingestion, transformation, and orchestration workflows using tools such as Azure Data Factory, Airflow, Fivetran, or similar.
Develop transformations and data processing logic using platforms such as Databricks, Snowflake, or equivalent.
Ankura Consulting Group, LLC is an independent global expert services and advisory firm of more than 2,000 professionals. They deliver services and end-to-end solutions to help clients at critical inflection points related to conflict, crisis, performance, risk, strategy, and transformation.
Designing, developing, and maintaining robust, scalable, and well-documented data pipelines
Collaborating with the Tech Lead to ensure quality, performance, and maintainability of Data products
Contributing to the continuous improvement of engineering practices (CI/CD, automation, testing, documentation)
Accor Tech & Digital is the power engine of Accor technology, digital business, and transformation. Their 5,000 talents are committed to delivering the best tech and digital experiences to guests, hotels, and staff across 110 countries and to shaping the future of hospitality.
Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Design, build, and maintain ETL/ELT pipelines and integrations across legacy and cloud systems.
Model, store, and transform data to support analytics, reporting, and downstream applications.
Build API-based and file-based integrations across enterprise platforms.
Jobgether is a platform that uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company.
Design, build, and optimize data pipelines to support AI and ML projects.
Integrate data from various sources to provide a unified data view for AI applications.
Implement processes to ensure data quality, consistency, and accuracy across systems.
The Tyndale Company is a leading national supplier of arc-rated flame-resistant clothing (FRC) to the energy sector. They are a family-owned business, 9x Top Workplace winner in PA and 5x winner in TX, providing a retail-style apparel experience.
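Quality, consistency, and accuracy checks of the kind listed above are often implemented as simple rule-based validators run after each load. A minimal sketch in plain Python; the rule set and record fields here are hypothetical:

```python
# Minimal rule-based data-quality checker: each rule returns a list of
# problem descriptions for a record; field names are illustrative only.

def check_not_null(record, field):
    return [] if record.get(field) is not None else [f"{field} is null"]

def check_range(record, field, lo, hi):
    value = record.get(field)
    if value is None or lo <= value <= hi:
        return []
    return [f"{field}={value} outside [{lo}, {hi}]"]

def validate(records):
    """Return {record index: [issues]} for records failing any rule."""
    failures = {}
    for i, rec in enumerate(records):
        issues = (check_not_null(rec, "id")
                  + check_range(rec, "age", 0, 130))
        if issues:
            failures[i] = issues
    return failures

print(validate([{"id": 1, "age": 34}, {"id": None, "age": 200}]))
```

Dedicated frameworks (e.g. dbt tests or Great Expectations) express the same idea declaratively, but the underlying contract is this: every record either passes all rules or is reported with a reason.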
Take full ownership of data structures and data handling
Relai is on a mission to make Bitcoin the go-to savings technology, striving to make it simple, accessible, and secure. As Europe’s leading Bitcoin startup, they are expanding their team of passionate Bitcoiners.
Translate complex business requirements into robust, scalable models and data services.
Model data as a product and shape its structure in our data lake across our medallion architecture.
Contribute to our data governance initiatives, helping to maintain our data catalogue and uphold high data quality standards.
Pennylane is a fast-growing Fintech in France and Europe. They provide accounting and financial software for small businesses and accountants with over 1000 employees across 25 nationalities, fostering a remote-friendly culture and being recognized as a great place to work.
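The medallion architecture mentioned above layers a data lake into bronze (raw as ingested), silver (cleaned and deduplicated), and gold (aggregated for consumption). A sketch of that layering in plain Python, under the assumption of a toy event schema; all field names are hypothetical:

```python
# Medallion-style layering: bronze holds raw events exactly as ingested,
# silver cleans and deduplicates them, gold aggregates for consumers.

def to_silver(bronze_rows):
    """Clean: drop rows missing a key, deduplicate on (user, day)."""
    seen, silver = set(), []
    for row in bronze_rows:
        if row.get("user") is None:
            continue  # malformed ingest
        key = (row["user"], row["day"])
        if key in seen:
            continue  # duplicate ingest
        seen.add(key)
        silver.append(row)
    return silver

def to_gold(silver_rows):
    """Aggregate: total amount per user."""
    totals = {}
    for row in silver_rows:
        totals[row["user"]] = totals.get(row["user"], 0) + row["amount"]
    return totals

bronze = [
    {"user": "a", "day": 1, "amount": 10},
    {"user": "a", "day": 1, "amount": 10},  # duplicate
    {"user": None, "day": 2, "amount": 5},  # malformed
    {"user": "b", "day": 1, "amount": 3},
]
print(to_gold(to_silver(bronze)))  # → {'a': 10, 'b': 3}
```

In production each layer is typically a set of warehouse tables maintained by SQL transforms rather than in-memory lists, but the bronze-silver-gold contract is the same.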