Building and maintaining production-grade data pipelines in cloud data warehouses.
Designing and developing dbt models across bronze, silver, and gold layers.
Crafting easy-to-understand visualizations and dashboards in Looker or equivalent BI tools.
Plume is a trans-founded, mission-driven company with a vision to transform healthcare for every trans life by making gender-affirming hormone therapy easily accessible. They offer an affirming, trans-centered, culturally inclusive, and fun work environment filled with purpose.
Design, build, and maintain scalable data pipelines for clients across industries.
Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.
NuView Analytics helps companies accelerate the time to insights from their data through data analytics, diligence, and fractional data science. They are a growth-stage company looking to drive additional value from the data they are sitting on and value humility, intellectual rigor, and stewardship.
QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. They help customers in various industries rapidly adapt to change and innovate for competitive advantage.
Design, build, and maintain scalable data pipelines.
Develop and optimize ETL/ELT processes using cloud data technologies.
Partner with teams to understand data requirements and improve data capture strategies.
Blueprint is a technology solutions firm with a strong presence across the United States, solving complicated problems for their clients. They are bold, smart, agile, and fun, and believe in unique perspectives, building teams of people with diverse skillsets and backgrounds.
Responsible for building core infrastructure software (pipelines, APIs, data modelling) as part of a client's data platform team.
Coach & mentor other engineers to support the growth of their technical expertise.
Implementing the appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption.
YLD is a software engineering and design consultancy that creates digital capabilities for their clients. The company has offices in London, Lisbon, and Porto and aims to attract, inspire, develop, and retain extraordinary people.
Design, implement, and maintain robust, scalable data pipelines to support AI, analytics, and operational reporting
Own and evolve the data warehouse architecture, ensuring it meets performance, flexibility, and governance needs
Ensure data integrity, availability, lineage, and observability across complex pipelines
Remote People is building the infrastructure to power borderless teams. Their technology handles global payroll, benefits, taxes, and compliance, enabling businesses to compliantly hire anyone anywhere at the push of a button. They are a growing, international family.
Design and implement scalable data architectures to support business needs.
Build and optimize data pipelines, ensuring data accessibility and security.
Develop and maintain data models, databases, and data lakes, with robust data governance.
Terawatt Infrastructure delivers large scale, turnkey charging solutions for companies rapidly deploying AV and EV fleets. With a growing portfolio of sites across the US, Terawatt is building the permanent transportation and logistics infrastructure of tomorrow through capital, real estate, development, and site operations solutions.
Enable self-service analytics for all team members by designing clean, intuitive data models and metrics through dbt, empowering employees to make informed, data-driven decisions.
Develop and refine custom data pipelines that ingest data from operational systems to our analytics platform, handling both streaming and batch data using third-party tooling and home-grown solutions.
Maintain and optimize the data platform infrastructure, focusing on data quality, ELT efficiency, and platform hygiene.
Auto Integrate makes leased vehicle maintenance frictionless for millions of customers in the USA and Canada. The business is managed by a small, global team within Fleetio, combining the resources of a scaled SaaS company with the agility of a niche market leader.
Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.
SmartAsset is an online destination for consumer-focused financial information and advice, whose mission is helping people make smart financial decisions, reaching an estimated 59 million people each month. A successful $110 million Series D funding round in 2021 valued the company at over $1 billion.
Become a trusted advisor, partnering with data owners, analysts, business users, and executive stakeholders to translate business needs into scalable analytics solutions.
Work independently as part of a small team to solve complex analytics engineering use-cases across a variety of industries.
Design and develop the analytical layer, including curated data models, semantic layers, metrics definitions, and transformation pipelines.
Aimpoint Digital is a dynamic and fully remote data and analytics consultancy. We partner with innovative software providers in the data analytics & engineering space to solve our clients' toughest business problems.
Organize and structure data systems at both macro and micro levels, designing and implementing data architectures that support business goals
Optimize data pipelines for performance, reliability, and scalability
Design, build, and maintain scalable ETL/ELT pipelines with Airflow to process large-scale, complex datasets
Deliver data products useful for machine learning and AI research and development (data models, metadata, and semantics)
Owkin is an AI company on a mission to solve the complexity of biology. It is building the first Biology Super Intelligence (BASI) by combining powerful biological large language models, multimodal patient data, and agentic software.
Design, build, and maintain data pipelines using Snowflake, Airflow, and dbt
Lead architectural discussions around the modern data stack
Develop scalable ETL and ELT processes using Python and SQL
They are a well-funded healthcare technology company using AI and modern data infrastructure to transform how healthcare and public health decisions are made. The team is small, mission-driven, and building systems that turn raw healthcare data into actionable intelligence at scale.
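The bullets above name a Snowflake/Airflow/dbt stack with ETL and ELT in Python and SQL. As a rough stdlib-only sketch of the ELT shape such a pipeline takes (sqlite3 standing in for the warehouse; all table and field names are invented for illustration):

```python
# Minimal ELT sketch: extract raw CSV rows, load them untransformed,
# then transform inside the warehouse with SQL -- the pattern dbt builds on.
# sqlite3 stands in for Snowflake; names are invented for illustration.
import csv
import io
import sqlite3

RAW_CSV = """patient_id,visit_date,charge
p1,2024-01-03,120.50
p2,2024-01-03,80.00
p1,2024-02-10,95.25
"""

def extract(csv_text: str) -> list[tuple]:
    reader = csv.DictReader(io.StringIO(csv_text))
    return [(r["patient_id"], r["visit_date"], float(r["charge"])) for r in reader]

def load_raw(conn: sqlite3.Connection, rows: list[tuple]) -> None:
    conn.execute("CREATE TABLE raw_visits (patient_id TEXT, visit_date TEXT, charge REAL)")
    conn.executemany("INSERT INTO raw_visits VALUES (?, ?, ?)", rows)

def transform(conn: sqlite3.Connection) -> None:
    # Transformation happens after loading, in SQL -- the "T" of ELT.
    conn.execute("""
        CREATE TABLE monthly_charges AS
        SELECT patient_id, substr(visit_date, 1, 7) AS month, SUM(charge) AS total
        FROM raw_visits
        GROUP BY patient_id, month
    """)

conn = sqlite3.connect(":memory:")
load_raw(conn, extract(RAW_CSV))
transform(conn)
totals = dict(conn.execute("SELECT patient_id || '/' || month, total FROM monthly_charges").fetchall())
```

In a production stack, an Airflow DAG would schedule the extract/load steps and dbt models would own the SQL transformations.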
Lead the architecture and evolution of scalable, distributed data pipelines, ensuring high availability and performance at scale
Build and maintain distributed web scraping systems using tools such as Playwright, Selenium, and BeautifulSoup
Integrate AI and LLMs into engineering workflows for code generation, automation, and optimization
MercatorAI is building scalable data infrastructure to power high-quality, data-driven decision making at scale. As an early-stage company, the team is focused on creating robust, future-ready systems that can handle complex data ingestion, transformation, and delivery across a growing national footprint.
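The scraping bullet above names Playwright, Selenium, and BeautifulSoup; whatever the fetch backend, a distributed scraper also needs a politeness layer. A minimal sketch of that layer, with the fetcher injected so the example needs no network (all names invented):

```python
# Sketch of the politeness layer a scraper wraps around any fetch backend
# (Playwright, Selenium, plain HTTP): per-domain rate limiting plus bounded
# retries with exponential backoff. The fetch callable is injected.
import time
from collections import defaultdict
from urllib.parse import urlparse

class PoliteFetcher:
    def __init__(self, fetch, min_interval=1.0, max_retries=3,
                 sleep=time.sleep, clock=time.monotonic):
        self.fetch = fetch                # callable: url -> str, may raise OSError
        self.min_interval = min_interval  # seconds between hits to one domain
        self.max_retries = max_retries
        self.sleep = sleep
        self.clock = clock
        self.last_hit = defaultdict(lambda: float("-inf"))

    def get(self, url: str) -> str:
        domain = urlparse(url).netloc
        wait = self.min_interval - (self.clock() - self.last_hit[domain])
        if wait > 0:
            self.sleep(wait)              # respect the per-domain interval
        for attempt in range(self.max_retries):
            self.last_hit[domain] = self.clock()
            try:
                return self.fetch(url)
            except OSError:
                if attempt == self.max_retries - 1:
                    raise
                self.sleep(2 ** attempt)  # back off before retrying

# Usage with a fake fetcher that fails once, then succeeds:
calls = []
def flaky(url):
    calls.append(url)
    if len(calls) == 1:
        raise OSError("timeout")
    return "<html>ok</html>"

f = PoliteFetcher(flaky, min_interval=0.0, sleep=lambda s: None)
html = f.get("https://example.com/page")
```

Injecting `sleep` and `clock` keeps the wrapper testable without real delays, which matters once the scraper is distributed across workers.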
Lead the implementation of a resilient, privacy-first data platform architecture.
Lead the design, infrastructure, and tooling decisions for platform optimization.
Develop AI-ready architecture by creating semantic layers that define and standardize business logic.
Headspace provides access to lifelong mental health support. They combine evidence-based content, clinical care, and innovative technology to help millions of members around the world get support that’s effective and personalized. They value connecting with courage, ownership, and iterating to great.
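The bullets above mention semantic layers that define and standardize business logic. The core idea, sketched in plain Python with invented metric names: each metric is defined once, by name, so every consumer computes it the same way.

```python
# Tiny sketch of a semantic layer: business metrics are registered once as
# named functions over rows, so every dashboard computes "revenue" identically.
# Metric names and the sample data are invented for illustration.
METRICS = {}

def metric(name):
    def register(fn):
        METRICS[name] = fn
        return fn
    return register

@metric("revenue")
def revenue(rows):
    return sum(r["price"] * r["qty"] for r in rows)

@metric("avg_order_value")
def avg_order_value(rows):
    # Derived metrics reuse base definitions instead of re-implementing them.
    return revenue(rows) / len(rows) if rows else 0.0

def evaluate(name, rows):
    return METRICS[name](rows)

orders = [{"price": 10.0, "qty": 2}, {"price": 5.0, "qty": 4}]
rev = evaluate("revenue", orders)          # 40.0
aov = evaluate("avg_order_value", orders)  # 20.0
```

Real semantic layers (dbt's, or a warehouse-native one) express the same registry idea declaratively in SQL and YAML rather than Python.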
Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimize data models, and enable self-serve analytics and AI.
End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.
Trustonic makes smartphones affordable, enabling global access to devices and digital finance through secure smartphone locking technology. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. The company celebrates its diversity and is looking to do the right thing: for each other, the community, and the planet.
Build and manage business data pipelines and transform Firefox telemetry data into structured datasets.
Partner with data scientists, product, and marketing teams to turn datasets into models and metrics.
Ensure data accuracy and performance using observability tools and resolve data issues.
Mozilla Corporation is a technology company backed by a non-profit that has shaped the internet, creating brands like Firefox. With millions of users globally, they work in areas including AI and social media while remaining focused on making the internet better for people.
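The bullets above call for ensuring data accuracy with observability tooling. Two of the most common per-batch checks, sketched with invented field names and thresholds:

```python
# Sketch of two basic data-quality checks pipelines run on each batch:
# a null-rate threshold on a required column, and a freshness check on the
# newest timestamp. Field names and thresholds are invented for illustration.
from datetime import datetime, timedelta, timezone

def null_rate(rows, field):
    if not rows:
        return 0.0
    return sum(1 for r in rows if r.get(field) is None) / len(rows)

def is_fresh(rows, field, max_age, now=None):
    now = now or datetime.now(timezone.utc)
    newest = max(r[field] for r in rows)
    return now - newest <= max_age

now = datetime(2024, 6, 1, 12, 0, tzinfo=timezone.utc)
batch = [
    {"client_id": "a",  "ts": now - timedelta(minutes=5)},
    {"client_id": None, "ts": now - timedelta(minutes=90)},
]
bad_ids = null_rate(batch, "client_id")                      # 0.5
fresh = is_fresh(batch, "ts", timedelta(hours=1), now=now)   # newest row is 5 min old
```

Observability platforms wire checks like these to alerting so a failing threshold pages someone before dashboards go stale.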
Design and implement scalable data ingestion and transformation pipelines using Databricks and cloud platforms
Lead architecture decisions for modern data platforms, including Medallion Architecture and Lakehouse patterns
Build and maintain ETL/ELT pipelines using Python and SQL, following engineering best practices
AOT Technologies helps enterprises and governments bring their ideas to life. As a boutique consulting firm, they partner with enterprises, startups, and governments to solve complex, mission-critical challenges. Their teams are collaborative and their leadership is transparent.
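The architecture bullet above names the Medallion pattern. A plain-Python sketch of its three layers, with invented data (in practice each layer is a Databricks or dbt table, not an in-memory list):

```python
# Sketch of Medallion layering: bronze holds raw records as received,
# silver cleans and types them, gold aggregates for reporting.
# Device names and fields are invented for illustration.
bronze = [
    {"device": "d1", "reading": "21.5", "ok": "true"},
    {"device": "d1", "reading": "bad",  "ok": "true"},   # malformed row
    {"device": "d2", "reading": "19.0", "ok": "false"},
]

def to_silver(raw):
    clean = []
    for r in raw:
        try:
            clean.append({"device": r["device"],
                          "reading": float(r["reading"]),
                          "ok": r["ok"] == "true"})
        except ValueError:
            continue  # drop malformed rows instead of failing the whole batch
    return clean

def to_gold(silver):
    readings = {}
    for r in silver:
        if r["ok"]:
            readings.setdefault(r["device"], []).append(r["reading"])
    # Gold layer: average reading per healthy device, ready for a dashboard.
    return {d: sum(v) / len(v) for d, v in readings.items()}

silver = to_silver(bronze)
gold = to_gold(silver)
```

Keeping bronze untouched is the key design choice: silver and gold can always be rebuilt from it when cleaning rules change.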
Design, build, and maintain databases that power Hologram's operations.
Build and maintain ETL pipelines that move and transform data reliably.
Audit existing pipelines and data models, identify complexity, and refactor bad decisions.
Hologram is building the future of IoT connectivity, delivering internet access to millions of connected devices worldwide. They process over 5 billion transactions per month across their global infrastructure and value a fun, upbeat, remote-first team united by their mission.
Partner closely with business stakeholders to understand their challenges and design end-to-end architecture.
Design, develop, and own robust, efficient, and scalable data models in Snowflake and Iceberg using dbt and advanced SQL.
Build and manage reliable data pipelines and CI/CD workflows using tools like Airflow, Python, and Terraform.
Motive empowers people who run physical operations with tools to make their work safer, more productive, and more profitable. Motive serves nearly 100,000 customers and provides complete visibility and control across a wide range of industries.