Architect, build, and operate data infrastructure that powers Tebra’s intelligent features.
Translate business requirements into software solutions that accelerate our ability to deploy AI.
Monitor data pipelines, detect anomalies, and implement automated recovery systems.
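A minimal sketch of what such an automated check-and-recover loop might look like, assuming a simple z-score rule over daily row counts; the function names and thresholds are illustrative, not Tebra's actual tooling:

```python
import statistics

def row_count_is_anomalous(history: list[int], today: int, z_threshold: float = 3.0) -> bool:
    """Flag today's load when its row count sits more than `z_threshold`
    standard deviations away from the trailing window's mean."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history) or 1.0  # guard against a zero-variance window
    return abs(today - mean) / stdev > z_threshold

def run_with_recovery(load_fn, history: list[int], max_retries: int = 2) -> int:
    """Run a load and retry automatically when the output looks anomalous."""
    for attempt in range(max_retries + 1):
        rows = load_fn()
        if not row_count_is_anomalous(history, rows):
            return rows
        print(f"Anomalous row count ({rows}) on attempt {attempt + 1}; retrying")
    raise RuntimeError("Load still anomalous after retries; alerting on-call.")
```

In production the retry and alerting would usually live in the orchestrator, but the shape of the check is the same.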
Tebra unites Kareo and PatientPop, providing a digital backbone for practice well-being and supporting both products with a shared vision for modernized care. Over 100,000 providers trust Tebra to elevate the patient experience and grow their practices, building the future of well-being with compassion and humanity.
Design, build, and operate scalable data pipelines using batch and real-time processing technologies.
Build data infrastructure that ingests real-time events and stores them efficiently across databases.
Establish and enforce data contracts with backend engineering teams by implementing schema management.
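As a hedged illustration of what enforcing such a contract can look like, here is a sketch using Pydantic as the validation layer; the event fields are hypothetical, not Fetch's actual schema:

```python
from datetime import datetime
from pydantic import BaseModel, ValidationError

class PurchaseEvent(BaseModel):
    """Hypothetical event contract agreed with the backend team."""
    event_id: str
    user_id: int
    brand: str
    occurred_at: datetime

def accept(payload: dict) -> PurchaseEvent | None:
    """Admit only events that satisfy the contract; reject the rest."""
    try:
        return PurchaseEvent(**payload)
    except ValidationError as err:
        # A real pipeline would route violations to a dead-letter queue for triage.
        print(f"Contract violation: {err}")
        return None
```

A schema registry (e.g., for Avro or Protobuf) plays the same role at the message-bus level.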
Fetch provides a platform where millions of people earn rewards for buying brands they love. They have received investments from SoftBank, Univision, and Hamilton Lane, and their partnerships range from challenger brands to Fortune 500 companies. Fetch fosters a people-first culture rooted in trust, accountability, and innovation.
Design and implement robust, production-grade pipelines using Python, Spark SQL, and Airflow (sketched below).
Lead efforts to canonicalize raw healthcare data into internal models.
Onboard new customers by integrating their raw data into internal pipelines and canonical models.
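For flavor, a skeletal Airflow DAG that schedules a Spark SQL canonicalization step; the DAG, task, and table names are invented for the example:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def canonicalize_claims():
    """Map raw customer data into a canonical model with Spark SQL.
    The raw.claims and canonical.claims tables are hypothetical."""
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("canonicalize").getOrCreate()
    spark.sql(
        "INSERT OVERWRITE TABLE canonical.claims "
        "SELECT claim_id, member_id, service_date FROM raw.claims"
    )

with DAG(
    dag_id="canonicalize_healthcare_data",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="canonicalize_claims", python_callable=canonicalize_claims)
```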
Machinify is a healthcare intelligence company delivering value, transparency, and efficiency to health plan clients. They serve over 85 health plans, including many of the top 20, representing more than 270 million lives, with an AI-powered platform and expertise.
Drive end-to-end delivery of core data engineering initiatives.
Lead and mentor the Data Engineering team.
Own data ingestion and processing for live and historical datasets.
BHFT is a proprietary algorithmic trading firm managing the full trading lifecycle, from software development to the design and deployment of trading strategies. We are a 230-person company with a strong technology focus, where 70% of the team are engineers and technical specialists.
Play a senior tech lead and architect role, building world-class data solutions and applications that power crucial business decisions throughout the organization.
Enable a world-class engineering practice, drive how we use data, develop backend systems and data models that serve insight needs, and play an active role in building Atlassian's data-driven culture.
Maintain a high bar for operational data quality and proactively address performance, scale, complexity, and security considerations.
At Atlassian, they're motivated by a common goal: to unleash the potential of every team. Their software products help teams all over the planet and their solutions are designed for all types of work. They ensure that their products and culture continue to incorporate everyone's perspectives and experience, and never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status.
Build a scalable, reliable, operable, and performant big data workflow platform.
Drive adoption of Freight's data model across the organization, partnering with multiple product teams.
Drive efficiency and reliability improvements through design and automation.
Uber Freight is an enterprise technology company powering intelligent logistics with end-to-end logistics applications, managed services, and an expansive carrier network. Today, the company manages nearly $20B of freight, operates one of the largest carrier networks, and is backed by best-in-class investors.
Design and build scalable and high-performance data software solutions using Golang and Python.
Build and deploy Kubernetes-based systems to manage containerized applications in cloud-native environments.
Collaborate with cross-functional teams to understand and address customer needs, ensuring systems evolve alongside those needs.
Machinify is a healthcare intelligence company focused on delivering value, transparency, and efficiency to health plan clients. They deploy a configurable, AI-powered platform used by over 85 health plans, representing more than 270 million lives, and foster a flexible and trusting environment.
Design and develop scalable, maintainable, and reusable software components with a strong emphasis on performance and reliability.
Collaborate with product managers to translate requirements into well-architected solutions, owning features from design through delivery.
Build intuitive and extensible user experiences using modern UI frameworks, ensuring flexibility for customer-specific needs.
ServiceNow is a global market leader that brings innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500. Their intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work.
Design and build robust, highly scalable data pipelines and lakehouse infrastructure with PySpark, Databricks, and Airflow on AWS (see the sketch below).
Improve the data platform development experience for Engineering, Data Science, and Product by creating intuitive abstractions, self-service tooling, and clear documentation.
Own and maintain core data pipelines and models that power internal dashboards, ML models, and customer-facing products.
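A compact sketch of the kind of PySpark model such a pipeline might own; bucket paths and column names are made up for illustration:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_merchant_metrics").getOrCreate()

# Hypothetical raw events landed in the lake; the schema is illustrative.
events = spark.read.parquet("s3://example-lake/raw/events/")

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "merchant_id")
    .agg(
        F.count("*").alias("event_count"),
        F.sum("amount").alias("gross_volume"),
    )
)

# Dashboards and ML feature jobs would read this mart downstream.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-lake/marts/daily_merchant_metrics/"
)
```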
Parafin aims to grow small businesses by providing them with the financial tools they need through the platforms they already sell on. They are a Series C company backed by prominent venture capitalists, with a tight-knit team of innovators from companies like Stripe, Square, and Coinbase.
Bridge AI & Enterprise Infrastructure: Create the integration layer between modern Python-based AI frameworks and Tenable’s robust JVM-based microservices architecture.
Tenable is an Exposure Management company that helps organizations understand and reduce cyber risk. Around 44,000 organizations rely on them, and they foster a culture of belonging, respect, and excellence.
Write highly maintainable and performant Python/PySpark code.
Understand cloud environments, particularly Microsoft Azure, and data orchestration systems.
Work with data lakes and common data transformation and storage formats.
YLD helps clients build the skills and capabilities they need to stay ahead of the competition. They are a remote-first consultancy specializing in software engineering, product design, and data with teams based across London, Lisbon, and Porto.
Design, build, maintain, and operate scalable streaming and batch data pipelines.
Work with AWS services, including Redshift, EMR, and ECS, to support data processing and analytics workloads (see the sketch below).
Develop and maintain data workflows using Python and SQL.
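As one hedged example of gluing these services together, a minimal boto3 call that submits a Spark step to an EMR cluster; the cluster ID and script path are placeholders:

```python
import boto3

emr = boto3.client("emr", region_name="us-east-1")

# JobFlowId and the S3 script location are placeholders, not real resources.
emr.add_job_flow_steps(
    JobFlowId="j-EXAMPLECLUSTER",
    Steps=[
        {
            "Name": "nightly-transform",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["spark-submit", "s3://example-bucket/jobs/transform.py"],
            },
        }
    ],
)
```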
Southworks helps companies with software development and digital transformation. They focus on solving complex problems and delivering innovative solutions.
Build and optimize scalable, efficient ETL and data lake processes.
Own the ingestion, modeling, and transformation of structured and unstructured data.
Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows.
Launch Potato is a digital media company that connects consumers with brands through data-driven content and technology. They have a remote-first team spanning over 15 countries and have built a high-growth, high-performance culture.
Design, build, and operate ETL pipelines at scale.
Design data structures for data products.
Develop and operate APIs and tools for data products and machine learning products.
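One possible shape for such an API, sketched with FastAPI; the endpoint and the in-memory feature store are hypothetical stand-ins:

```python
from fastapi import FastAPI

app = FastAPI()

# Hypothetical precomputed features; a real service would hit a database or cache.
_FEATURES = {"user_123": {"purchase_count_30d": 7, "avg_order_value": 23.5}}

@app.get("/features/{user_id}")
def get_features(user_id: str) -> dict:
    """Serve precomputed features to ML and product services."""
    return _FEATURES.get(user_id, {})
```

Run with `uvicorn main:app` to expose it locally.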
Mercari provides a marketplace platform. They value teamwork and offer career growth opportunities as the company continues to expand.
Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimize data models, and enable self-serve analytics and AI.
End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.
Trustonic provides smartphone locking technology, enabling global access to devices and digital finance. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. They celebrate diversity and aim to do the right thing for each other, the community, and the planet.
Architect and implement Databricks Lakehouse solutions for large-scale data platforms.
Design and optimize batch & streaming data pipelines using Apache Spark (PySpark/SQL).
Implement Delta Lake best practices (ACID, schema enforcement, time travel, performance tuning).
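To make those practices concrete, a hedged sketch using the open-source delta-spark package; the paths are illustrative, and a Databricks cluster would ship with this configuration built in:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("delta_sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config(
        "spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
    )
    .getOrCreate()
)

path = "/tmp/example_delta_table"

# ACID append with schema enforcement: mismatched writes fail instead of corrupting data.
df = spark.createDataFrame([(1, "debit"), (2, "credit")], ["txn_id", "kind"])
df.write.format("delta").mode("append").save(path)

# Time travel: read the table exactly as it stood at version 0.
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)

# Performance tuning: compact the small files left behind by incremental writes.
DeltaTable.forPath(spark, path).optimize().executeCompaction()
```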
They are looking for a Databricks Architect to design and lead modern Lakehouse data platforms using Databricks. The role focuses on building scalable, high-performance data pipelines and enabling analytics and AI use cases on cloud-native data platforms.
Engage directly with current and prospective clients to understand business needs, translate them into technical requirements, and communicate findings in a clear, actionable way.
Partner with internal and client stakeholders to shape solutions, develop proposals, and contribute to go-to-market initiatives.
Design, develop, and deploy efficient data pipelines for both structured and unstructured data.
Resultant consists of a team of engineers, mathematicians, data analysts, project managers, and business consultants. They partner with clients in the public and private sectors to help them overcome complex challenges, empowering clients to drive meaningful change.
Design, build, and optimize data pipelines and workflows.
Drive scalable data solutions to support business decisions.
Contribute to architectural decisions and provide technical leadership.
Jobgether is a platform that uses AI to match candidates with jobs. They focus on ensuring fair and objective reviews of applications by using AI to identify top-fitting candidates for hiring companies.
Design, build, and maintain pipelines that power all data use cases.
Develop intuitive, performant, and scalable data models that support product features.
Pay down technical debt, improve automation, and follow best practices in data modeling.
Patreon is a media and community platform where over 300,000 creators give their biggest fans access to exclusive work and experiences. They are leaders in the space, with over $10 billion generated by creators since Patreon's inception, and a team passionate about their mission.
Lead the end-to-end data architecture, designing and implementing data pipelines, warehouses, and lakes that handle petabyte-scale datasets.
Collaborate with product teams to enable data-driven decision-making across the organization.
Establish best practices for data quality, governance, and security while mentoring senior engineers and conducting technical reviews.
Cority is a global enterprise EHS software provider creating industry-leading technology. They have been around for over 35 years and are known for strong employee culture and client satisfaction.