Source Job

  • Design and implement data ingestion and transformation pipelines using Databricks, PySpark, and distributed processing.
  • Apply Delta Lake principles, focusing on change data capture (CDC) and schema evolution, and integrate data quality frameworks into CI/CD pipelines to safeguard data integrity.
  • Develop and optimize complex SQL and Python scripts, handle both structured and unstructured data, and improve inconsistent legacy datasets.
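The CDC-with-schema-evolution behavior the bullets above describe can be sketched in plain Python. This is a hypothetical illustration, not Delta Lake itself: a dict stands in for the target table, and the backfill of new columns with None mimics what Delta's mergeSchema option does during a MERGE.

```python
# Hypothetical sketch: applying a CDC batch (upserts and deletes) to a
# target table while evolving the schema, modeled on plain Python dicts.

def merge_cdc(target, changes, key="id"):
    """Merge a CDC batch into target: {key_value: {column: value}}.

    changes is a list of events: {"op": "upsert"|"delete", "row": {...}}.
    Columns that appear only in incoming rows are backfilled onto
    existing rows as None, mimicking schema evolution.
    """
    for event in changes:
        row = event["row"]
        if event["op"] == "delete":
            target.pop(row[key], None)
            continue
        # Schema evolution: backfill columns new to the table.
        existing_cols = {c for r in target.values() for c in r}
        for col in set(row) - existing_cols:
            for r in target.values():
                r.setdefault(col, None)
        # Upsert: merge the incoming row over any existing one.
        target[row[key]] = {**target.get(row[key], {}), **row}
    return target
```

In real Delta Lake this is a single `MERGE INTO` with schema evolution enabled; the sketch only shows the semantics.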

SQL Python Databricks PySpark Airflow

20 jobs similar to Senior Data Engineer

Jobs ranked by similarity.

  • Architect and implement scalable Lakehouse solutions using Delta Tables and Delta Live Tables.
  • Design and orchestrate complex data workflows using Databricks Workflows and Jobs.
  • Develop production-grade Python and PySpark code, including custom Python libraries.
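Orchestrators like Databricks Workflows resolve tasks from their declared dependencies (each task's depends_on list). A minimal sketch of that resolution, with illustrative task names that are assumptions, not part of any real job:

```python
# Hypothetical sketch: derive a valid execution order from a task graph
# of the kind Databricks Workflows expresses via depends_on.

def run_order(tasks):
    """Return an execution order for {task_name: [dependency_names]}."""
    order, done = [], set()

    def visit(name, path=()):
        if name in done:
            return
        if name in path:
            raise ValueError(f"cycle involving {name!r}")
        for dep in tasks[name]:
            visit(dep, path + (name,))
        done.add(name)
        order.append(name)

    for name in tasks:
        visit(name)
    return order
```

A real workflow runner would also execute independent tasks in parallel; this depth-first walk only guarantees every task runs after its dependencies.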

Coderio designs and delivers scalable digital solutions for global businesses with a strong technical foundation and a product mindset.

US Unlimited PTO

  • Partner with clients and implementation teams to understand data distribution requirements.
  • Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
  • Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.

Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, their bold, curious, and collaborative team is tackling big challenges in an industry that's ready for change.

US

  • Design and engineer robust data pipelines using technologies like Databricks, Azure Data Factory, Apache Spark, and Delta Lake.
  • Craft healthcare data solutions: processing massive healthcare datasets, optimizing performance, and ensuring data is accurate and secure.
  • Communicate technical concepts to non-technical stakeholders, manage multiple priorities, and meet deadlines.

Gentiva offers compassionate care in the comfort of patients' homes as a national leader in hospice, palliative, home health care, and advanced illness management. They have nearly 600 locations and thousands of clinicians across 38 states, offering rewarding careers in a collaborative environment.

$110,000–$140,000/yr
US

  • Design, build, and maintain scalable and reliable data pipelines.
  • Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
  • Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.

Curinos empowers financial institutions to make better, faster, and more profitable decisions through industry-leading proprietary data, technologies, and insights.

$215,000–$240,000/yr
US

Own the design, build, and optimization of end-to-end data pipelines. Establish and enforce best practices in data modeling, orchestration, and system reliability. Collaborate with stakeholders to translate requirements into robust, scalable data solutions.

YipitData is the leading market research and analytics firm for the disruptive economy and most recently raised $475M from The Carlyle Group at a valuation of over $1B.

Brazil Colombia Mexico Argentina 2w PTO

  • Build and optimize Sauce's lakehouse architecture using Azure Databricks and Unity Catalog for data governance.
  • Create and maintain data quality tests and improve existing alerting setups.
  • Own the data warehouse by connecting data sources and maintaining the platform and architecture in coordination with R&D infrastructure and operations teams.

Sauce is a premier restaurant technology platform that helps businesses grow with its Commission-Free Delivery & Pickup structure and proprietary delivery optimization technology.

India

Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions. Design and oversee the data architecture and infrastructure, ensuring scalability, performance, and security. Create scalable and efficient data processing frameworks, including ETL processes and data pipelines.

Lingaro has been on the market since 2008, with 1,500+ professionals currently on board across 7 global sites, and emphasizes career growth and skills development.

$150,000–$165,000/yr
US Unlimited PTO 11w maternity

  • Partner with our customer teams to develop engineering plans for onboarding our health system partners
  • Build and support robust batch and streaming pipelines
  • Evolve the maturity of our monitoring systems and processes to improve visibility and failure detection in our infrastructure

Paradigm is rebuilding the clinical research ecosystem by enabling equitable access to trials for all patients. Incubated by ARCH Venture Partners and backed by leading healthcare and life sciences investors, Paradigm’s seamless infrastructure, implemented at healthcare provider organizations, will bring potentially life-saving therapies to patients faster.

Europe Unlimited PTO

Design, implement, and maintain scalable ETL/ELT pipelines using Python, SQL, and modern orchestration frameworks. Build and optimize data models and schemas for cloud warehouses and relational databases, supporting AI and analytics workflows. Lead large-scale data initiatives from planning through execution, ensuring performance, cost efficiency, and reliability.

This position is posted by Jobgether on behalf of a partner company.

$150,000–$185,000/yr

  • Design, build, and oversee the deployment of technology for managing structured and unstructured data.
  • Develop tools leveraging AI, ML, and big-data to cleanse, organize, and transform data.
  • Design and maintain CI/CD pipelines using GitHub Actions to automate deployment, testing, and monitoring.

NBCUniversal is one of the world's leading media and entertainment companies creating world-class content across film, television, streaming, theme parks, and more.

$155,000–$180,000/yr
US

  • Design, build, and maintain robust and scalable data pipelines from diverse sources.
  • Leverage expert-level experience with dbt and Snowflake to structure, transform, and organize data.
  • Collaborate with engineering, product, and analytics teams to deliver data solutions that drive business value.

Topstep offers an engaging work environment, ranging from fully remote to hybrid, and fosters a culture of collaboration.

$155,000–$185,000/yr
US Unlimited PTO

As a Senior Data Engineer, shape a scalable data platform that drives business insights. Design and maintain robust data pipelines and collaborate with cross-functional teams. Tackle complex data challenges, implement best practices, and mentor junior engineers.

Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.

$117,629–$183,501/yr
US Canada

The Sr. Data Engineer, DevX creates the best developer experience for data and application engineers at Basis. They design, implement, and maintain deployment and ETL pipelines for data products, and integrate diverse data sources and vendor products, including databases, APIs, and third-party services.

Basis Technologies empowers agencies and brands with cutting-edge software that automates digital media operations, offering flexible work options across the U.S.

South America

  • Design, develop, and maintain scalable and robust data pipelines.
  • Create solutions for data ingestion, transformation, and modeling using Databricks, Spark/PySpark, Cloudera, and Azure Data Factory (ADF).
  • Ensure the quality, integrity, and usability of data throughout the entire pipeline.

CI&T specializes in technological transformation, uniting human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters worldwide, they have partnered with over 1,000 clients during their 30-year history, with a focus on Artificial Intelligence.

$115,000–$160,000/yr
US

As a key member of our Data Engineering team, you will:

  • Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives.
  • Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture.
  • Collaborate with the team to meet performance, scalability, and reliability goals.

PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.

$150,000–$160,000/yr
US

  • Design, build and execute data pipelines.
  • Build the configurable ETL framework.
  • Optimize SQL queries to maximize system performance.

RefinedScience is dedicated to delivering high-quality emerging tech solutions. While the job description does not include company size or culture information, the role appears to emphasize innovation and collaboration.

US Europe

  • Design and develop scalable data pipelines and infrastructure to process large volumes of data efficiently
  • Collaborate with cross-functional teams to ensure data integrity, accessibility, and usability
  • Implement and maintain data quality measures throughout the data lifecycle

CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have a culture that values diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment.

Mexico

  • Design, build, and maintain highly scalable, reliable, and efficient ETL/ELT pipelines.
  • Ingest data from a multitude of sources and transform raw data into clean, structured, and AI/ML-ready formats.
  • Work closely with data scientists, machine learning engineers, and business analysts to understand their data needs.
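The raw-to-AI/ML-ready transformation the bullets above describe can be sketched as a small cleaning step: normalize field names, coerce types, and drop rows unusable for training. Column names here ("amount", "label") are illustrative assumptions, not taken from any actual pipeline.

```python
# Hypothetical sketch: turn messy raw rows into clean, typed,
# ML-ready records. Rows missing a label or with an unparseable
# numeric feature are dropped rather than passed downstream.

def to_ml_ready(raw_rows):
    clean = []
    for row in raw_rows:
        # Normalize keys: strip whitespace, lowercase.
        r = {k.strip().lower(): v for k, v in row.items()}
        if r.get("label") in (None, ""):
            continue  # supervised training needs a label
        try:
            r["amount"] = float(r.get("amount", 0) or 0)
        except (TypeError, ValueError):
            continue  # unparseable numeric feature: drop the row
        clean.append(r)
    return clean
```

In a production pipeline this logic would live in Spark or dbt transformations with data quality tests around it; the sketch only shows the shape of the step.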

Valtech exists to unlock a better way to experience the world by blending crafts, categories, and cultures, helping brands unlock new value in an increasingly digital world.

$145,290–$185,000/yr
Unlimited PTO

  • Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large data.
  • Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

Europe

  • Design, build, and maintain scalable, high-quality data pipelines.
  • Implement robust data ingestion, transformation, and storage using cloud-based technologies.
  • Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.

CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have partnerships with more than 1,000 clients and value diversity, fostering a diverse, inclusive, and safe work environment.