Source Job

Latin America

  • Design and implement data pipelines using Databricks, PySpark, and Delta Lake.
  • Work closely with business stakeholders and analysts to understand KPIs.
  • Model and structure data using dimensional modeling techniques.
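The dimensional-modeling responsibility above can be pictured as a minimal star schema: a fact table carrying measures, joined to a dimension table through surrogate keys. A plain-Python sketch (all table and column names are invented for illustration, not taken from the listing):

```python
# Toy star schema: fact rows reference a dimension via a surrogate key.
dim_customer = [
    {"customer_key": 1, "name": "Acme", "region": "LATAM"},
    {"customer_key": 2, "name": "Globex", "region": "US"},
]

fact_orders = [
    {"order_id": 100, "customer_key": 1, "amount": 250.0},
    {"order_id": 101, "customer_key": 2, "amount": 400.0},
    {"order_id": 102, "customer_key": 1, "amount": 150.0},
]

def revenue_by_region(facts, dims):
    """Join facts to the customer dimension and aggregate a KPI."""
    region_of = {d["customer_key"]: d["region"] for d in dims}
    totals = {}
    for row in facts:
        region = region_of[row["customer_key"]]
        totals[region] = totals.get(region, 0.0) + row["amount"]
    return totals

print(revenue_by_region(fact_orders, dim_customer))
```

The same fact/dimension split is what the KPI conversations with stakeholders ultimately land on: measures live in facts, descriptive attributes in dimensions.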

Databricks · PySpark · SQL · Delta Lake · Airflow

20 jobs similar to Databricks Engineer

Jobs ranked by similarity.

US

  • Architect and implement Databricks Lakehouse solutions for large-scale data platforms.
  • Design and optimize batch & streaming data pipelines using Apache Spark (PySpark/SQL).
  • Implement Delta Lake best practices (ACID, schema enforcement, time travel, performance tuning).
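Of the Delta Lake practices named above, schema enforcement is the easiest to illustrate outside Databricks: a write whose rows do not match the declared table schema is rejected as a whole rather than silently appended. A stand-in sketch in plain Python (the schema, rows, and function names are hypothetical):

```python
# Toy illustration of schema enforcement: a batch is appended
# atomically only if every row matches the declared schema.
SCHEMA = {"id": int, "event": str, "ts": float}

def validate(row, schema=SCHEMA):
    """True only if the row has exactly the schema's columns and types."""
    if set(row) != set(schema):
        return False
    return all(isinstance(row[col], typ) for col, typ in schema.items())

def append(table, rows):
    """All-or-nothing append: any bad row rejects the whole batch."""
    if not all(validate(r) for r in rows):
        raise ValueError("schema mismatch: batch rejected")
    table.extend(rows)

table = []
append(table, [{"id": 1, "event": "click", "ts": 1.0}])
# append(table, [{"id": "x", "event": "click", "ts": 2.0}])  # would raise
```

In Delta Lake itself this gate is enforced transactionally on write, alongside ACID guarantees and time travel over table versions.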

They are looking for a Databricks Architect to design and lead modern Lakehouse data platforms. The role focuses on building scalable, high-performance data pipelines on Databricks and enabling analytics and AI use cases on cloud-native data platforms.

$120,000–$160,000/yr
US

  • Design and implement scalable, reliable, and efficient data pipelines to support clinical, operational, and business needs.
  • Optimize data storage and processing in data lakes and cloud data warehouses (Azure, Databricks).
  • Proactively suggest improvements to infrastructure, processes, and automation to improve system efficiency, reduce costs, and enhance performance.

Care Access is dedicated to ensuring that every person has the opportunity to understand their health, access the care they need, and contribute to the medical breakthroughs of tomorrow. They are working to make the future of health better for all and have hundreds of research locations, mobile clinics, and clinicians across the globe.

US

  • Design, build, and maintain scalable ETL pipelines for large-scale data processing.
  • Implement data transformations and workflows using PySpark at an intermediate to advanced level.
  • Optimize pipelines for performance, scalability, and cost efficiency across environments.

Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York. Their team of 600+ highly skilled tech professionals, based in Latin America, drives digital disruption by partnering with U.S. companies on their most impactful projects.

$150,000–$170,000/yr
US

  • Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
  • Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
  • Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.

Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

$125,000–$150,000/yr
US

  • Design, implement, and optimize robust and scalable data pipelines using SQL, Python, and cloud-based ETL tools such as Databricks.
  • Enhance our overarching data architecture strategy, assisting in decisions related to data storage, consumption, integration, and management within cloud environments.
  • Partner with data scientists, BI teams, and other engineering teams to understand and translate complex data requirements into actionable engineering solutions.

The New York Blood Center is a nonprofit blood collection and medical research organization. They are looking for a Senior Data Engineer to join their team.

Latin America

  • Develop and maintain custom connectors using Airbyte.
  • Build and optimize data transformation pipelines using AWS Glue.
  • Structure data to enable efficient AWS Athena queries.
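Structuring data for efficient Athena queries usually means Hive-style partitioning in S3: partition columns become path segments, so a `WHERE` clause on those columns prunes the scan to matching prefixes. A sketch of that key scheme (the bucket, table, and column names are made up):

```python
from datetime import date

def partitioned_key(bucket, table, dt, country, filename):
    """Build a Hive-style S3 key. Athena can prune partitions when a
    query filters on dt or country, because each value is a path segment."""
    return (f"s3://{bucket}/{table}/"
            f"dt={dt.isoformat()}/country={country}/{filename}")

key = partitioned_key("analytics-lake", "events",
                      date(2024, 5, 1), "br", "part-0000.parquet")
print(key)
```

Pairing this layout with a columnar format such as Parquet is what keeps Athena scan costs proportional to the data actually queried.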

Coderoad is a software development company that provides end-to-end services. They offer opportunities to work on real-world projects, helping individuals skill up and advance their careers in a supportive environment.

India

  • Design, build, and optimize data pipelines and workflows.
  • Drive scalable data solutions to support business decisions.
  • Contribute to architectural decisions and provide technical leadership.

Jobgether is a platform that uses AI to match candidates with jobs. They focus on ensuring fair and objective reviews of applications by using AI to identify top-fitting candidates for hiring companies.

$135,000–$165,000/yr
US · Unlimited PTO

  • Design, build, and maintain scalable data pipelines.
  • Develop and implement data models for analytical use cases.
  • Implement data quality checks and governance practices.
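A data quality check of the kind listed above is, at its simplest, a threshold gate on a computed metric: measure something about a batch (here, the null rate of a column) and fail the batch if it crosses a limit. A minimal sketch (column names and thresholds are invented):

```python
def null_rate(rows, column):
    """Fraction of rows where the column is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def check_quality(rows, column, max_null_rate=0.05):
    """A simple governance gate: flag the batch if too many nulls slip through."""
    rate = null_rate(rows, column)
    return {"column": column, "null_rate": rate, "passed": rate <= max_null_rate}

batch = [{"email": "a@x.io"}, {"email": None},
         {"email": "b@x.io"}, {"email": "c@x.io"}]
print(check_quality(batch, "email", max_null_rate=0.5))
```

Production stacks express the same idea declaratively (e.g. test suites attached to models), but the contract is identical: metric, threshold, pass/fail.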

MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.

Global

  • Assist in designing and implementing Snowflake-based analytics solutions.
  • Build and maintain data pipelines adhering to enterprise architecture principles.
  • Act as a technical leader within the team, ensuring quality deliverables.

Jobgether is a company that uses an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

Europe

  • Design and evolve the enterprise Azure Lakehouse architecture.
  • Lead the transformation of classic Data Warehouse environments into modern Lakehouse.
  • Define and implement architecture principles, standards, patterns, and best practices for data engineering and analytics platforms.

Deutsche Telekom IT Solutions Slovakia, formerly T-Systems Slovakia, has been a part of the Košice region since 2006. They have grown to be the second-largest employer in eastern Slovakia, with over 3,900 employees, aiming to provide innovative information and communication technology services.

Global

  • Architect and maintain robust, scalable, and secure data infrastructure on AWS leveraging Databricks.
  • Design, develop, and maintain data pipelines, primarily using tools like Airbyte and custom-built services in Go, to automate data ingestion and ETL processes.
  • Oversee the creation and maintenance of the data lake, ensuring efficient storage, high data quality, effective partitioning and organization, and robust performance monitoring and alerting.

Trust Wallet is the leading non-custodial cryptocurrency wallet, trusted by over 200 million people worldwide to securely manage and grow their digital assets. They aim to be a trusted personal companion — helping users safely navigate Web3, the on-chain economy, and the emerging AI-powered future.

$125,000–$145,000/yr
US

  • Design, build, and scale the lakehouse architecture that underpins analytics, machine learning, and AI.
  • Modernize our data ecosystem, making it discoverable, reliable, governed, and ready for self-service and intelligent automation.
  • Operate anywhere along the data lifecycle from ingestion and transformation to metadata, orchestration, and MLOps.

OnX is a pioneer in digital outdoor navigation with a suite of apps. With more than 400 employees, they have created regional “Basecamps” to help remote employees find connection and inspiration.

Global

  • Build and optimize scalable, efficient ETL and data lake processes.
  • Own the ingestion, modeling, and transformation of structured and unstructured data.
  • Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows.

Launch Potato is a digital media company that connects consumers with brands through data-driven content and technology. They have a remote-first team spanning over 15 countries and have built a high-growth, high-performance culture.

$152,000–$257,200/yr
US

  • Collaborate with cross-functional teams and business units to understand and communicate Ion’s needs and goals.
  • Develop, improve, and maintain robust data pipelines for extracting and transforming data from log files and event streams.
  • Design models and algorithms to derive insights and metrics from large datasets.

Intuitive is a global leader in robotic-assisted surgery and minimally invasive care. Their technologies, like the da Vinci surgical system and Ion, have transformed how care is delivered for millions of patients worldwide. They are a team of engineers, clinicians, and innovators united by one purpose: to make surgery smarter, safer, and more human.

US · Unlimited PTO

  • Design, build, and maintain pipelines that power all data use cases.
  • Develop intuitive, performant, and scalable data models that support product features.
  • Pay down technical debt, improve automation, and follow best practices in data modeling.

Patreon is a media and community platform where over 300,000 creators give their biggest fans access to exclusive work and experiences. They are leaders in the space, with over $10 billion generated by creators since Patreon's inception, with a team passionate about their mission.

US

  • Design, build, and optimize scalable data lakes, warehouses, and data pipelines using Snowflake and modern cloud platforms.
  • Develop and maintain robust data models (ELT/ETL), ensuring clean, reliable, and well-documented datasets.
  • Design and manage end-to-end data pipelines that ingest, transform, and unify data from multiple systems.

Cobalt Service Partners is building the leading commercial access and security integration business in North America. Backed by Alpine Investors, with $15B+ in AUM, Cobalt has scaled rapidly since launch through acquisitions and is building a differentiated, data-driven platform.

Latin America

  • Build and operate backend services and automation for the Snowflake data platform.
  • Support data ingestion pipelines (RDS/Oracle → Snowflake) and reverse ETL (Snowflake → RDS).
  • Develop and maintain Airflow (AWS MWAA) workflows for ingestion, data quality, and ops automation.
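Airflow expresses workflows like these as DAGs of tasks, and the scheduler's core job is to run tasks in dependency order. That idea can be sketched without Airflow itself using the standard library's topological sorter (the task names mirror the bullets above but are invented):

```python
from graphlib import TopologicalSorter

# Hypothetical ingestion workflow: each task maps to the set of tasks
# it depends on, mirroring how an Airflow DAG wires
# ingest -> load -> data quality -> reverse ETL.
deps = {
    "extract_rds": set(),
    "load_snowflake": {"extract_rds"},
    "data_quality": {"load_snowflake"},
    "reverse_etl": {"data_quality"},
}

order = list(TopologicalSorter(deps).static_order())
print(order)
```

In MWAA the same dependencies would be declared with task operators and `>>` chaining; the sorter above is just the scheduling invariant made visible.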

Upwork is the world’s work marketplace, serving everyone from one-person startups to over 30% of the Fortune 100. They provide a powerful, trust-driven platform that enables companies and talent to work together in new ways that unlock their potential. Last year, more than $3.8 billion of work was done through Upwork.

Global

  • Design, develop, and optimize EDP data pipelines using Python, Airflow, DBT, and Snowflake for scalable financial data processing.
  • Build performant Snowflake data models and DBT transformations following best practices, standards, and documentation guidelines.
  • Own ingestion, orchestration, monitoring, and SLA-driven workflows; proactively troubleshoot failures and improve reliability.
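An SLA-driven workflow like the one described typically compares each run's landing time against a deadline and flags breaches for proactive troubleshooting. A minimal sketch of that check (the two-hour SLA and timestamps are invented):

```python
from datetime import datetime, timedelta

def sla_status(scheduled, finished, sla=timedelta(hours=2)):
    """Flag a pipeline run that landed later than its SLA allows."""
    lateness = finished - scheduled
    return {
        "late_by": max(lateness - sla, timedelta(0)),
        "breached": lateness > sla,
    }

# A run scheduled for 06:00 that finished at 08:30 against a 2h SLA.
run = sla_status(datetime(2024, 5, 1, 6, 0), datetime(2024, 5, 1, 8, 30))
print(run)
```

Orchestrators such as Airflow attach this kind of deadline to tasks directly (SLA miss callbacks); the function above is the comparison they perform.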

Miratech is a global IT services and consulting company that brings together enterprise and start-up innovation, supporting digital transformation for some of the world's largest enterprises. They retain nearly 1000 full-time professionals, and their culture of Relentless Performance has enabled over 99% of Miratech's engagements to succeed.

Brazil

  • Writing highly maintainable and performant Python/PySpark code.
  • Understanding of cloud environments, particularly Microsoft Azure, and of data orchestration systems.
  • Working with data lakes and understanding common data transformation and storage formats.

YLD helps clients build the skills and capabilities they need to stay ahead of the competition. They are a remote-first consultancy specializing in software engineering, product design, and data with teams based across London, Lisbon, and Porto.

$194,400–$305,500/yr
US

  • Play a senior tech lead and architect role, building world-class data solutions and applications that power crucial business decisions throughout the organization.
  • Enable a world-class engineering practice: drive the approach with which Atlassian uses data, develop backend systems and data models that serve insight needs, and play an active role in building Atlassian's data-driven culture.
  • Maintain a high bar for operational data quality and proactively address performance, scale, complexity, and security considerations.

At Atlassian, they're motivated by a common goal: to unleash the potential of every team. Their software products help teams all over the planet and their solutions are designed for all types of work. They ensure that their products and culture continue to incorporate everyone's perspectives and experience, and never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status.