Build and maintain Azure Data Factory pipelines to ingest data from multiple sources.
Write Python code in Databricks to clean raw data and move it into the silver layer, handling deduplication, type casting, and validation.
Monitor daily jobs and troubleshoot any failures to ensure pipeline stability.
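The bronze-to-silver cleaning step described above — deduplication, type casting, and validation — can be sketched in plain Python; a production job would typically express the same logic in PySpark on Databricks, and the field names and rules here are illustrative:

```python
from datetime import datetime

def clean_to_silver(raw_rows):
    """Deduplicate, cast types, and validate raw (bronze) rows.

    Each raw row is a dict of strings; "id" is assumed to be the
    business key (illustrative -- real schemas vary).
    """
    seen = set()
    silver = []
    for row in raw_rows:
        key = row.get("id")
        if key is None or key in seen:  # drop rows missing a key or already seen
            continue
        seen.add(key)
        try:
            silver.append({
                "id": int(key),                     # cast business key to int
                "amount": float(row["amount"]),     # cast amount to float
                "event_ts": datetime.fromisoformat(row["event_ts"]),  # parse timestamp
            })
        except (KeyError, ValueError):
            continue  # validation: reject rows that fail casting
    return silver
```

Rejecting rather than repairing bad rows keeps the silver layer trustworthy; rejected rows would normally be routed to a quarantine table for inspection.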
Jobgether is a platform that leverages AI to connect job seekers with employers. They focus on ensuring fair and efficient application reviews, connecting top candidates directly with hiring companies.
Design, develop, and maintain ETL/ELT pipelines on cloud-based data platforms.
Build data ingestion, transformation, and orchestration workflows using tools such as Azure Data Factory, Airflow, Fivetran, or similar.
Develop transformations and data processing logic using platforms such as Databricks, Snowflake, or equivalent.
Ankura Consulting Group, LLC is an independent global expert services and advisory firm of more than 2,000 professionals. They deliver services and end-to-end solutions to help clients at critical inflection points related to conflict, crisis, performance, risk, strategy, and transformation.
Design and implement scalable, reliable, and efficient data pipelines to support clinical, operational, and business needs.
Optimize data storage and processing in data lakes and cloud data warehouses (Azure, Databricks).
Proactively suggest improvements to infrastructure, processes, and automation to improve system efficiency, reduce costs, and enhance performance.
Care Access is dedicated to ensuring that every person has the opportunity to understand their health, access the care they need, and contribute to the medical breakthroughs of tomorrow. They are working to make the future of health better for all and have hundreds of research locations, mobile clinics, and clinicians across the globe.
Build and optimize scalable, efficient ETL and data lake processes.
Own the ingestion, modeling, and transformation of structured and unstructured data.
Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows.
Launch Potato is a digital media company that connects consumers with brands through data-driven content and technology. They have a remote-first team spanning over 15 countries and have built a high-growth, high-performance culture.
Lead support of the client’s Azure Data platform and Power BI environment, including responding to escalations and helping analyze and resolve incidents in the customer’s environment.
Consult, develop, and advise on solutions in Microsoft Azure with tools such as Synapse, Data Factory, Databricks, Azure ML, Data Lake, Data Warehouse, and Power BI.
Consistently learn, apply, and refine skills around data engineering and data analytics.
3Cloud hires people who aren’t afraid to experiment or fail and who are willing to give direct and candid feedback. They hire people who challenge and hold each other accountable for living 3Cloud’s core values because they know that it will result in amazing experiences and solutions for clients.
Design, build, and maintain scalable data pipelines using Microsoft Fabric and Apache Airflow.
Ingest, transform, and integrate data from a variety of sources, including relational systems, APIs, and MongoDB.
Design and maintain analytical data models, including fact and dimension tables, to support reporting and analytics.
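The fact and dimension modeling described above can be sketched with sqlite3 standing in for the warehouse; the star schema below (one fact table keyed to two dimensions) and its column names are illustrative:

```python
import sqlite3

# In-memory SQLite stands in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
conn.execute("INSERT INTO fact_sales VALUES (1, 20240101, 250.0)")

# Typical analytical query: join the fact table to a dimension.
total = conn.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_key)
    GROUP BY d.name
""").fetchall()
```

Keeping descriptive attributes in dimensions and measures in the fact table is what lets reporting tools group and filter cheaply.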
Theoria Medical is a comprehensive medical group and technology company dedicated to serving patients across the care continuum with an emphasis on post-acute care and primary care. Theoria serves facilities across the United States with a multitude of services to improve the quality of care delivered, refine facility processes, and enhance critical relationships.
Develop data warehouse applications, including extraction, ingestion, and transformation processes.
Collaborate with customers and internal teams to understand business requirements.
Perform quality assurance and data validation, following sprint methodology.
One Model, founded by industry veterans in HR analytics, has a data-first approach to their People Analytics Platform, giving them a competitive advantage. They foster a friendly, inclusive, and respectful workplace culture, offering the opportunity to contribute significantly to a young company and team.
Create and maintain optimal data pipeline architecture.
Extend our machine learning platform by designing tools that interface with cloud services.
Build the infrastructure required for optimal extraction, transformation, and loading of data.
NinjaHoldings aims to revolutionize how Americans interact with financial services. They have a lean and innovative team that empowers people overlooked by traditional financial institutions through digital banking and lending products.
Design and optimize scalable data pipelines and data architecture.
Build cloud-native data solutions using Azure, Databricks (Unity Catalog & Delta Lake), and other big data technologies.
Contribute to a strong data culture through continuous learning and knowledge sharing.
Redcare Pharmacy is Europe’s No.1 e-pharmacy, powered by passionate teams and cutting-edge innovation. They strive to create a healthy collaborative work environment where every employee feels valued and inspired to contribute to their vision.
Architect, build, and operate data infrastructure that powers Tebra’s intelligent features.
Translate business requirements into software solutions that accelerate our ability to deploy AI.
Monitor data pipelines, detect anomalies, and implement automated recovery systems.
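The anomaly-detection part of pipeline monitoring can be sketched as a simple z-score check on a pipeline metric such as daily row counts; the metric and threshold here are illustrative:

```python
import statistics

def is_anomalous(history, latest, z_threshold=3.0):
    """Flag `latest` as anomalous if it lies more than `z_threshold`
    standard deviations from the mean of `history`.

    `history` is a list of recent values for some pipeline metric
    (row counts, latencies, etc. -- illustrative choice).
    """
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean  # flat history: any change is anomalous
    return abs(latest - mean) / stdev > z_threshold
```

An automated recovery system would hang off this predicate, e.g. re-running the failed task or paging on repeated anomalies.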
Tebra unites Kareo and PatientPop, providing a digital backbone for practice well-being, supporting both products with a shared vision for modernized care. Over 100,000 providers trust Tebra to elevate patient experience and grow their practice, building the future of well-being with compassion and humanity.
Play a key role in designing, developing, and delivering modern data solutions that drive business insight and innovation.
Implement scalable, high-performing cloud architectures that support analytics, AI, and operational excellence.
Be responsible for technical delivery, authoring solution documentation, and ensuring data pipelines and models meet enterprise standards for performance, reliability, and cost efficiency.
3Cloud is a company where people aren’t afraid to experiment or fail. They hire people who care about the collective growth and success of the company, challenging each other to live by 3Cloud’s core values, and resulting in amazing experiences and solutions for clients and each other.
Build and maintain production quality data pipelines between operational systems and BigQuery.
Implement data quality and freshness checks and monitoring processes to ensure data accuracy and consistency.
Maintain and contribute to our ingestion framework that leverages various purpose-built data load tool (dlt) connectors.
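A freshness check like the one described above reduces to comparing a table's latest load timestamp against a maximum allowed age; in practice the timestamp would come from a BigQuery metadata query, and the 24-hour threshold here is illustrative:

```python
from datetime import datetime, timedelta, timezone

def is_fresh(last_loaded_at, max_age=timedelta(hours=24), now=None):
    """Return True if the table's latest load timestamp is within
    `max_age` of now. `now` is injectable for testing."""
    now = now or datetime.now(timezone.utc)
    return now - last_loaded_at <= max_age
```

Running such a check on a schedule, and alerting when it returns False, is a common first line of defense against silently stalled pipelines.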
Grafana Labs is a remote-first, open-source powerhouse that provides visualization tools. They help companies manage their observability strategies with the Grafana LGTM Stack and have a global collaborative culture with a passion for meaningful work.
Build and maintain scalable data pipelines from ingestion through transformation and delivery.
Design, build, and maintain our data warehouse and data marts.
Partner with stakeholders to translate business needs into clean data models.
Gurobi Optimization focuses on mathematical optimization. They empower customers to expand their use of mathematical optimization technology in order to make smarter decisions and solve some of the world's toughest and most impactful business problems.
Design, build, maintain, and operate scalable streaming and batch data pipelines.
Work with AWS services, including Redshift, EMR, and ECS, to support data processing and analytics workloads.
Develop and maintain data workflows using Python and SQL.
Southworks helps companies with software development and digital transformation. They focus on solving complex problems and delivering innovative solutions.
Write highly maintainable and performant Python/PySpark code.
Work within cloud environments, particularly Microsoft Azure, and with data orchestration systems.
Work with data lakes and common data transformation and storage formats.
YLD helps clients build the skills and capabilities they need to stay ahead of the competition. They are a remote-first consultancy specializing in software engineering, product design, and data with teams based across London, Lisbon, and Porto.
Design, build, and maintain ETL/ELT pipelines and integrations across legacy and cloud systems.
Model, store, and transform data to support analytics, reporting, and downstream applications.
Build API-based and file-based integrations across enterprise platforms.
Jobgether is a platform that uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company.
Architect our AWS-based data warehouse and ingestion pipelines.
Transform high-volume simulation outputs into clean, trusted datasets.
Establish schema standards and data contracts with engineering.
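A lightweight data contract like the one described above can be sketched as a schema that both producer and consumer validate records against; the field names and types here are illustrative:

```python
# Illustrative data contract: required fields and their expected types.
CONTRACT = {
    "run_id": str,
    "scenario": str,
    "output_value": float,
}

def violates_contract(record, contract=CONTRACT):
    """Return a list of violations (missing fields or wrong types);
    an empty list means the record satisfies the contract."""
    problems = []
    for field, expected_type in contract.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"wrong type for {field}: {type(record[field]).__name__}")
    return problems
```

Enforcing the contract at the boundary between engineering teams catches schema drift before it corrupts downstream datasets.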
Onebrief provides collaboration and AI-powered workflow software designed for military staffs, making them faster, smarter, and more efficient. The company, founded in 2019, values ownership and excellence, with a team spanning veterans and technologists; it has raised $320M+ from investors and is valued at $2.15B.
Design and implement robust, production-grade pipelines using Python, Spark SQL, and Airflow.
Lead efforts to canonicalize raw healthcare data into internal models.
Onboard new customers by integrating their raw data into internal pipelines and canonical models.
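Canonicalizing raw source data into internal models, as described above, often amounts to per-source field mappings into one target schema; the source names, mappings, and canonical fields here are illustrative:

```python
# Illustrative per-source field mappings into one canonical schema.
SOURCE_MAPPINGS = {
    "vendor_a": {"member_id": "patient_id", "svc_date": "service_date"},
    "vendor_b": {"pid": "patient_id", "date_of_service": "service_date"},
}

def canonicalize(source, raw_record):
    """Rename a raw record's fields to the canonical schema for the
    given source; fields outside the mapping are dropped."""
    mapping = SOURCE_MAPPINGS[source]
    return {canonical: raw_record[raw]
            for raw, canonical in mapping.items()
            if raw in raw_record}
```

Onboarding a new customer then reduces to authoring one mapping entry rather than writing a bespoke pipeline per source.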
Machinify is a healthcare intelligence company delivering value, transparency, and efficiency to health plan clients. They serve over 85 health plans, including many of the top 20, representing more than 270 million lives, with an AI-powered platform and expertise.
Design, develop, and maintain a core Python ETL framework.
Develop and optimize an automated refresh pipeline orchestrated through AWS Batch, Lambda, Step Functions, and EventBridge.
Build Python integrations with external systems that are robust, testable, and reusable.
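A core Python ETL framework like the one described can be sketched as composable extract/transform/load callables run in order; the step interface below is an illustrative minimum, not the actual framework:

```python
class Pipeline:
    """Minimal ETL framework sketch: an extract callable, a chain of
    transform callables, and a load callable, run in order."""

    def __init__(self, extract, transforms, load):
        self.extract = extract
        self.transforms = transforms
        self.load = load

    def run(self):
        data = self.extract()
        for transform in self.transforms:
            data = transform(data)  # each step consumes the previous output
        return self.load(data)
```

Wrapping external systems behind plain callables is what makes each integration robust, testable, and reusable in isolation, as the bullet above calls for.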
BlastPoint is a B2B data analytics startup that helps companies engage with customers more effectively by discovering insights in their data. Founded in 2016 by Carnegie Mellon Alumni, they are a tight-knit, forward-thinking team that serves diverse industries including energy, finance, retail, and transportation.
Play a senior tech lead and architect role to build world-class data solutions and applications that power crucial business decisions throughout the organization.
Enable a world-class engineering practice, drive our approach to using data, develop backend systems and data models to serve insight needs, and play an active role in building Atlassian's data-driven culture.
Maintain a high bar for operational data quality and proactively address performance, scale, complexity, and security considerations.
At Atlassian, they're motivated by a common goal: to unleash the potential of every team. Their software products help teams all over the planet and their solutions are designed for all types of work. They ensure that their products and culture continue to incorporate everyone's perspectives and experience, and never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status.