Lead requirements-gathering efforts for product and advanced analytics.
Work with analytics, data science, and wider engineering teams to automate data analysis and visualization.
Build a scalable technology platform to support a growing business and deliver high-quality code to production.
Achieve is a leading digital personal finance company that helps everyday people move from struggling to thriving by providing innovative, personalized financial solutions. They have over 3,000 employees in mostly hybrid and 100% remote roles across the United States with hubs in Arizona, California, and Texas and a culture of putting people first.
Bridge the gap between application engineering and data infrastructure.
Own the optimization of high-volume data pipelines and tune operational databases.
Define how to ingest massive bursts of medical information, model it for transactional workloads, and transform it for analytical querying.
Synthesis Health is a mission- and values-driven company with tremendous dedication to its customers. The 100% remote team is dedicated to revolutionizing healthcare through innovation, collaboration, and commitment to its core values and behaviors.
Design and implement scalable, performant data models.
Develop and optimize processes to improve the correctness of third-party data.
Implement data quality principles to raise the bar for reliability of data.
SmithRx is a venture-backed Health-Tech company disrupting the Pharmacy Benefit Management (PBM) sector with a next-generation drug acquisition platform. They have a mission-driven and collaborative culture that inspires employees to transform the U.S. healthcare system.
Design, develop, and maintain dbt data models that support our healthcare analytics products.
Integrate and transform customer data to conform to our data specifications and pipelines.
Design and execute initiatives that improve data platform and pipeline automation and resilience.
SmarterDx, a Smarter Technologies company, builds clinical AI that is transforming how hospitals translate care into payment. Founded by physicians in 2020, our platform connects clinical context with revenue intelligence, helping health systems recover millions in missed revenue, improve quality scores, and appeal every denial.
Design and launch our first production-grade data warehouse.
Implement reliable ingestion from core clinical and operational systems.
Establish data quality and monitoring standards used across engineering.
Zócalo Health is a tech-enabled, community-oriented primary care organization. They serve people who have historically been underserved by the one-size-fits-all healthcare system and are scaling rapidly across states and populations.
Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Design, implement, and optimize robust and scalable data pipelines using SQL, Python, and cloud-based ETL tools such as Databricks.
Enhance our overarching data architecture strategy, assisting in decisions related to data storage, consumption, integration, and management within cloud environments.
Partner with data scientists, BI teams, and other engineering teams to understand and translate complex data requirements into actionable engineering solutions.
The New York Blood Center is a nonprofit blood collection and medical research organization. They are looking for a Senior Data Engineer to join their team.
Design and develop data integration pipelines across cloud and legacy systems.
Lead and support data engineering implementation efforts.
Apply advanced analytics techniques to support business insights and decision-making.
Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company, with the final decision managed by the internal team.
Play a senior tech lead and architect role to build world-class data solutions and applications that power crucial business decisions throughout the organization.
Enable a world-class engineering practice, drive the approach with which we use data, develop backend systems and data models to serve insight needs, and play an active role in building Atlassian's data-driven culture.
Maintain a high bar for operational data quality and proactively address performance, scale, complexity and security considerations.
At Atlassian, they're motivated by a common goal: to unleash the potential of every team. Their software products help teams all over the planet and their solutions are designed for all types of work. They ensure that their products and culture continue to incorporate everyone's perspectives and experience, and never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status.
Build production-grade data systems by writing clean, modular, well-tested Python code for data pipelines and platform services.
Design and maintain relational data models aligned with medallion architectures (bronze/silver/gold).
Maintain transformation and data quality by implementing idempotent transformation logic using SQLMesh/Tobiko (preferred) or dbt.
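For readers skimming the requirements above, "idempotent transformation logic" means a load that can be safely replayed: running the same batch twice leaves the warehouse in the same state. The listing names SQLMesh/Tobiko or dbt; the sketch below only illustrates the property itself, using sqlite3 and a hypothetical `silver_events` table rather than either tool.

```python
import sqlite3

# Minimal sketch of an idempotent load: rows are keyed and replaced,
# not blindly appended, so replaying a batch is a no-op.
# Table and column names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE silver_events (event_id TEXT PRIMARY KEY, amount REAL)"
)

def load_batch(conn, batch):
    # INSERT OR REPLACE keys on the primary key, so identical
    # replays produce the same final state.
    conn.executemany(
        "INSERT OR REPLACE INTO silver_events (event_id, amount) VALUES (?, ?)",
        batch,
    )
    conn.commit()

batch = [("e1", 10.0), ("e2", 25.5)]
load_batch(conn, batch)
load_batch(conn, batch)  # replay: no duplicates created

rows = conn.execute("SELECT COUNT(*) FROM silver_events").fetchone()[0]
print(rows)  # prints 2, not 4
```

In SQLMesh or dbt the same effect comes from incremental models with a declared unique key, rather than a hand-written upsert.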
Axle is a bioscience and information technology company that offers advancements in translational research, biomedical informatics, and data science applications to research centers and healthcare organizations nationally and abroad. With experts in biomedical science, software engineering, and program management, they focus on developing and applying research tools and techniques to empower decision-making and accelerate research discoveries.
Contribute to the technical integrity and evolution of the Data Platform tech stack.
Design and implement core features and enhancements within the Data Platform.
Build and maintain robust data extraction, loading, and transformation processes.
Dimagi is an award-winning social enterprise and certified B-corp building software solutions and providing technology consulting services to improve the quality of essential services for underserved populations. They are passionate and committed to tackling complex health and social inequities and working towards a brighter future for all.
Build and evolve our semantic layer; design, document, and optimize dbt models.
Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.
Design, build, and maintain scalable data pipelines.
Develop and implement data models for analytical use cases.
Implement data quality checks and governance practices.
MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.
Maintain, configure, and optimize the existing data warehouse platform and pipelines.
Design and implement incremental data integration solutions prioritizing data quality, performance, and cost-efficiency.
Drive innovation by experimenting with new technologies and recommending platform improvements.
Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements.
Build modern, scalable data pipelines that keep the data flowing.
Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning.
Unify and wrangle data from all kinds of sources.
InterWorks is a people-focused tech consultancy that empowers clients with customized, collaborative solutions. They value unique contributions, and their people are the glue that holds their business together.
Build and operate data pipelines using AWS-native data tools and distributed processing frameworks.
Operate and improve core data platform services, addressing incidents, performance issues, and operational toil.
Partner with data producers and consumers to onboard pipelines, troubleshoot issues, and improve platform usability.
Fetch is a platform where millions of people earn rewards for buying the brands they love, and a whole lot more. With investments from SoftBank, Univision, and Hamilton Lane, and partnerships with Fortune 500 companies, it is reshaping how brands and consumers connect in the marketplace. Ranked as one of America’s Best Startup Employers by Forbes, Fetch fosters a people-first culture rooted in trust, accountability, and innovation.
Design, build, and operate scalable data pipelines using batch and real-time processing technologies.
Build data infrastructure that ingests real-time events and stores them efficiently across databases.
Establish and enforce data contracts with backend engineering teams by implementing schema management.
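The "data contracts with schema management" responsibility above refers to an agreement between producers and the pipeline on event shape, enforced at ingestion. A minimal sketch, with entirely hypothetical field names (real systems typically use a schema registry with Avro or Protobuf rather than hand-rolled checks):

```python
# A data contract as a declared schema; events that don't conform
# are rejected before they reach downstream consumers.
# Field names and types here are illustrative assumptions.
CONTRACT = {"user_id": str, "brand": str, "amount_cents": int}

def validate(event: dict) -> bool:
    # Exact field set, then per-field type check.
    if set(event) != set(CONTRACT):
        return False
    return all(isinstance(event[k], t) for k, t in CONTRACT.items())

good = {"user_id": "u1", "brand": "acme", "amount_cents": 499}
bad = {"user_id": "u1", "brand": "acme"}  # missing field: rejected
print(validate(good), validate(bad))  # prints True False
```

Rejecting (or quarantining) non-conforming events at the boundary is what makes the contract enforceable rather than advisory.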
Fetch provides a platform where millions of people earn rewards for buying brands they love. They have received investments from SoftBank, Univision, and Hamilton Lane and partnerships ranging from challenger brands to Fortune 500 companies. Fetch fosters a people-first culture rooted in trust, accountability, and innovation.
Design and build scalable data pipelines that ingest, process, and transform high-volume event streams and historical data.
Develop and maintain APIs that deliver analytics, trend reports, and drill-down capabilities to internal teams and external customers.
Build robust infrastructure for data quality monitoring, ensuring accuracy and completeness across customer and artifact datasets.
Socket helps devs and security teams ship faster by cutting out security busywork. They have raised $65M in funding from top angels, operators, and security leaders.
Lead the end-to-end data architecture, designing and implementing data pipelines, warehouses, and lakes that handle petabyte-scale datasets.
Collaborate with product teams to enable data-driven decision-making across the organization.
Establish best practices for data quality, governance, and security while mentoring senior engineers and conducting technical reviews.
Cority is a global enterprise EHS software provider creating industry-leading technology. They have been around for over 35 years and are known for strong employee culture and client satisfaction.
Oversee the design, development, implementation, and maintenance of all data-related systems.
Build and maintain scalable and reliable data pipelines, ensuring data quality and consistency across all systems.
Collaborate with data scientists, business analysts, and other stakeholders to identify data-related needs and requirements.
MoneyHero Group is a market-leading financial products platform in Greater Southeast Asia, reaching millions of monthly unique users and working with hundreds of commercial partners. They have a team of over 350 talented individuals and are backed by world-class organizations and companies.