Design and implement robust, production-grade pipelines using Python, Spark SQL, and Airflow.
Lead efforts to canonicalize raw healthcare data into internal models.
Onboard new customers by integrating their raw data into internal pipelines and canonical models.
Machinify is a healthcare intelligence company delivering value, transparency, and efficiency to health plan clients. They serve over 85 health plans, including many of the top 20, representing more than 270 million lives, with an AI-powered platform and expertise.
Play a senior tech lead and architect role, building world-class data solutions and applications that power crucial business decisions throughout the organization.
Enable a world-class engineering practice, drive the approach with which we use data, develop backend systems and data models to serve insight needs, and play an active role in building Atlassian's data-driven culture.
Maintain a high bar for operational data quality and proactively address performance, scale, complexity and security considerations.
At Atlassian, they're motivated by a common goal: to unleash the potential of every team. Their software products help teams all over the planet and their solutions are designed for all types of work. They ensure that their products and culture continue to incorporate everyone's perspectives and experience, and never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status.
Design, build, and operate scalable data pipelines using batch and real-time processing technologies.
Build data infrastructure that ingests real-time events and stores them efficiently across databases.
Establish and enforce data contracts with backend engineering teams by implementing schema management.
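As a sketch of what enforcing a data contract via schema management can look like in practice (field names and types here are hypothetical, not from the listing):

```python
# Minimal data-contract check, assuming events arrive as JSON-like dicts.
# The schema below is a hypothetical example of a producer/consumer contract.
from typing import Any

# The "contract": each field the producer agrees to emit, with its type.
USER_EVENT_SCHEMA: dict[str, type] = {
    "user_id": str,
    "event_type": str,
    "timestamp_ms": int,
}

def violations(record: dict[str, Any], schema: dict[str, type]) -> list[str]:
    """Return a list of contract violations for one record (empty = valid)."""
    errors = []
    for field, expected in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            errors.append(f"{field}: expected {expected.__name__}, "
                          f"got {type(record[field]).__name__}")
    return errors

good = {"user_id": "u1", "event_type": "click", "timestamp_ms": 1700000000000}
bad = {"user_id": "u1", "timestamp_ms": "not-an-int"}
print(violations(good, USER_EVENT_SCHEMA))  # []
print(violations(bad, USER_EVENT_SCHEMA))
```

In production this kind of check typically runs in a schema registry or at the ingestion boundary, rejecting or quarantining records before they reach downstream tables.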
Fetch provides a platform where millions of people earn rewards for buying brands they love. They have received investments from SoftBank, Univision, and Hamilton Lane and partnerships ranging from challenger brands to Fortune 500 companies. Fetch fosters a people-first culture rooted in trust, accountability, and innovation.
Design and build robust, highly scalable data pipelines and lakehouse infrastructure with PySpark, Databricks, and Airflow on AWS.
Improve the data platform development experience for Engineering, Data Science, and Product by creating intuitive abstractions, self‑service tooling, and clear documentation.
Own and maintain core data pipelines and models that power internal dashboards, ML models, and customer-facing products.
Parafin aims to grow small businesses by providing them with the financial tools they need through the platforms they already sell on. They are a Series C company backed by prominent venture capitalists, with a tight-knit team of innovators from companies like Stripe, Square, and Coinbase.
Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimize data models, and enable self-serve analytics and AI.
End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.
Trustonic provides smartphone locking technology, enabling global access to devices and digital finance. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. They celebrate diversity and aim to do the right thing for each other, the community, and the planet.
Define and plan the long-term strategy for the Data Platform.
Design and develop scalable distributed systems for data management.
Improve and add features to the ETL framework while maintaining SLAs.
Jobgether is a platform that connects job seekers with companies using an AI-powered matching process, ensuring applications are reviewed quickly, objectively, and fairly against each role's core requirements.
Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Design, build, and maintain pipelines that power all data use cases.
Develop intuitive, performant, and scalable data models that support product features.
Pay down technical debt, improve automation, and follow best practices in data modeling.
Patreon is a media and community platform where over 300,000 creators give their biggest fans access to exclusive work and experiences. They are leaders in the space, with over $10 billion generated by creators since Patreon's inception, with a team passionate about their mission.
Design and implement scalable, performant data models.
Develop and optimize processes to improve the correctness of third-party data.
Implement data quality principles to raise the bar for reliability of data.
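"Data quality principles" usually reduce to automated checks that gate a pipeline stage. A minimal sketch, assuming tabular rows as dicts (column names and thresholds are hypothetical):

```python
# Two common data-quality checks: null rate and uniqueness.
# Rows, columns, and the 10% threshold below are illustrative assumptions.

def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    nulls = sum(1 for r in rows if r.get(column) is None)
    return nulls / len(rows)

def is_unique(rows, column):
    """True if every non-null value in `column` appears exactly once."""
    values = [r[column] for r in rows if r.get(column) is not None]
    return len(values) == len(set(values))

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@x.com"},
]

# Fail the load rather than propagate bad data downstream.
checks = {
    "id is unique": is_unique(rows, "id"),                        # id=2 repeats
    "email null rate <= 10%": null_rate(rows, "email") <= 0.10,   # 1/3 is null
}
failed = [name for name, ok in checks.items() if not ok]
print(failed)
```

Frameworks such as Great Expectations or dbt tests implement the same idea declaratively at scale.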
SmithRx is a venture-backed Health-Tech company disrupting the Pharmacy Benefit Management (PBM) sector with a next-generation drug acquisition platform. They have a mission-driven and collaborative culture that inspires employees to transform the U.S. healthcare system.
Architect, design, implement, and operate end-to-end data engineering solutions.
Develop and manage robust data integrations with external vendors.
Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.
SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. Reaching over 59 million people each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.
Build a scalable, reliable, operable, and performant big data workflow platform.
Drive the usage of Freight's data model across the organization with multiple product teams.
Drive efficiency and reliability improvements through design and automation.
Uber Freight is an enterprise technology company powering intelligent logistics with end-to-end logistics applications, managed services, and an expansive carrier network. Today, the company manages nearly $20B of freight, has one of the largest networks of carriers and is backed by best-in-class investors.
Collaborate with cross-functional teams and business units to understand and communicate Ion’s needs and goals.
Develop, improve, and maintain robust data pipelines for extracting and transforming data from log files and event streams.
Design models and algorithms to derive insights and metrics from large datasets.
Intuitive is a global leader in robotic-assisted surgery and minimally invasive care. Their technologies, like the da Vinci surgical system and Ion, have transformed how care is delivered for millions of patients worldwide. They are a team of engineers, clinicians, and innovators united by one purpose: to make surgery smarter, safer, and more human.
Design, build, and maintain enterprise-scale data pipelines on Snowflake Data Platform.
Design, build, and maintain cloud-native AI/ML solutions (AWS, Azure) that support advanced analytics and decision making.
Implement best practices for data quality, observability, lineage, and governance.
FUJIFILM Biotechnologies is dedicated to making a real difference in people’s lives. They partner with innovative biopharma companies to advance vaccines, cures, and gene therapies, fostering a culture that fuels passion and drive, known as Genki.
Build and optimize scalable, efficient ETL and data lake processes.
Own the ingestion, modeling, and transformation of structured and unstructured data.
Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows.
Launch Potato is a digital media company that connects consumers with brands through data-driven content and technology. They have a remote-first team spanning over 15 countries and have built a high-growth, high-performance culture.
Design, develop, and maintain ETL/ELT pipelines on cloud-based data platforms.
Build data ingestion, transformation, and orchestration workflows using tools such as Azure Data Factory, Airflow, Fivetran, or similar.
Develop transformations and data processing logic using platforms such as Databricks, Snowflake, or equivalent.
Ankura Consulting Group, LLC is an independent global expert services and advisory firm. They deliver services and end-to-end solutions to help clients at critical inflection points related to conflict, crisis, performance, risk, strategy, and transformation, and consists of more than 2000 professionals.
Design and develop scalable, maintainable, and reusable software components with a strong emphasis on performance and reliability.
Collaborate with product managers to translate requirements into well-architected solutions, owning features from design through delivery.
Build intuitive and extensible user experiences using modern UI frameworks, ensuring flexibility for customer-specific needs.
ServiceNow is a global market leader that brings innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500. Their intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work.
Architect and implement Databricks Lakehouse solutions for large-scale data platforms.
Design and optimize batch & streaming data pipelines using Apache Spark (PySpark/SQL).
Implement Delta Lake best practices (ACID, schema enforcement, time travel, performance tuning).
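As an illustrative sketch, the Delta Lake features named above map to concrete SQL; the table and column names here are hypothetical:

```sql
-- Schema enforcement: writes that do not match the declared schema are rejected.
CREATE TABLE events (
  user_id  STRING,
  event_ts TIMESTAMP
) USING DELTA;

-- Time travel: query the table as of an earlier version or timestamp.
SELECT * FROM events VERSION AS OF 3;
SELECT * FROM events TIMESTAMP AS OF '2024-01-01';

-- Performance tuning: compact small files and co-locate data by key.
OPTIMIZE events ZORDER BY (user_id);
```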
They are looking for a Databricks Architect to design and lead modern Lakehouse data platforms using Databricks. The role focuses on building scalable, high-performance data pipelines and enabling analytics and AI use cases on cloud-native data platforms.
Lead, mentor, and develop high-performing data engineering squads delivering production-grade pipelines and services.
Set technical and operational standards for quality, documentation, and reliability.
Partner with Program Management to plan, prioritize, and track delivery against sprint goals.
Forbes Digital Marketing Inc. is a high-growth digital media and technology company dedicated to helping consumers make confident, informed decisions about their money, health, and everyday life. They combine data-driven content, rigorous experimentation, and modern engineering to power a portfolio of global products and partnerships.
Design, build, and optimize data pipelines and workflows.
Drive scalable data solutions to support business decisions.
Contribute to architectural decisions and provide technical leadership.
Jobgether is a platform that uses AI to match candidates with jobs. They focus on ensuring fair and objective reviews of applications by using AI to identify top-fitting candidates for hiring companies.
Design, build, and scale the lakehouse architecture that underpins analytics, machine learning, and AI.
Modernize our data ecosystem, making it discoverable, reliable, governed, and ready for self-service and intelligent automation.
Operate anywhere along the data lifecycle from ingestion and transformation to metadata, orchestration, and MLOps.
OnX is a pioneer in digital outdoor navigation with a suite of apps. With more than 400 employees, they have created regional “Basecamps” to help remote employees find connection and inspiration.