Source Job
Own the design, build, and optimization of end-to-end data pipelines. Establish and enforce best practices in data modeling, orchestration, and system reliability. Collaborate with stakeholders to translate requirements into robust, scalable data solutions.
20 jobs similar to Data Engineering Manager
Jobs ranked by similarity.
- Design, build, and maintain scalable and reliable data pipelines.
- Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
- Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.
Curinos empowers financial institutions to make better, faster and more profitable decisions through industry-leading proprietary data, technologies and insights.
- Architect and implement scalable Lakehouse solutions using Delta Tables and Delta Live Tables (see the sketch after this list).
- Design and orchestrate complex data workflows using Databricks Workflows and Jobs.
- Develop production-grade Python and PySpark code, including custom Python libraries.
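As a hedged illustration of the stack named above, here is a minimal Delta Live Tables sketch in PySpark. It assumes a Databricks DLT pipeline runtime (which supplies the `dlt` module and a `spark` session); the table names and storage path are hypothetical, not from the posting.

```python
# Minimal Delta Live Tables sketch. This only runs inside a Databricks
# DLT pipeline, where `dlt` and `spark` are provided by the runtime.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders landed from cloud storage (hypothetical path).")
def orders_raw():
    return (
        spark.readStream.format("cloudFiles")   # Databricks Auto Loader
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/orders/")           # hypothetical location
    )

@dlt.table(comment="Typed, validated orders ready for downstream marts.")
@dlt.expect_or_drop("valid_amount", "amount >= 0")  # drop rows failing the check
def orders_clean():
    return (
        dlt.read_stream("orders_raw")
        .withColumn("amount", F.col("amount").cast("double"))
        .withColumn("order_ts", F.to_timestamp("order_ts"))
    )
```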
Coderio designs and delivers scalable digital solutions for global businesses with a strong technical foundation and a product mindset.
Design, implement, and maintain scalable ETL/ELT pipelines using Python, SQL, and modern orchestration frameworks. Build and optimize data models and schemas for cloud warehouses and relational databases, supporting AI and analytics workflows. Lead large-scale data initiatives from planning through execution, ensuring performance, cost efficiency, and reliability.
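As one hedged example of the "modern orchestration frameworks" this role mentions, below is a minimal Apache Airflow DAG sketch; the DAG id, task names, and stub callables are all hypothetical.

```python
# Hedged sketch: a two-task daily ETL DAG in Apache Airflow (2.4+ syntax).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders(**context):
    """Pull raw rows from a source system (stub for illustration)."""

def load_warehouse(**context):
    """Write transformed rows to the warehouse (stub for illustration)."""

with DAG(
    dag_id="daily_orders_etl",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)
    extract >> load                     # run extract before load
```

The `extract >> load` line declares the dependency; the scheduler then handles retries and run ordering.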
This position is posted by Jobgether on behalf of a partner company.
- Assist in executing data engineering projects within the Customer Intelligence portfolio to meet defined timelines and deliverables.
- Build and maintain ETL pipelines based on user and project specifications to enable reliable data movement.
- Develop and update technical documentation for key systems and data assets.
Stryker is one of the world’s leading medical technology companies and, together with its customers, is driven to make healthcare better.
Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources and identify trends and patterns, maintaining reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions for over two decades.
- Build and scale data services by designing, developing, and maintaining scalable backend systems and APIs.
- Collaborate on data architecture and models, partnering with engineering and analytics teams to optimize storage and processing workflows.
- Contribute to standards, quality, and governance by building reliable, observable data systems with strong testing and validation.
Zapier builds and uses automation every day to make work more efficient, creative, and human.
- Partner with clients and implementation teams to understand data distribution requirements.
- Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
- Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, their bold, curious, and collaborative team is tackling big challenges in an industry that’s ready for change.
- Design, develop, and implement end-to-end data pipelines to support data collection and transformation.
- Lead the architecture and development of scalable and maintainable data solutions.
- Collaborate with data scientists and analysts to provide clean and accessible data.
DexCare optimizes time in healthcare, streamlining patient access, reducing waits, and enhancing overall experiences.
- Design and engineer robust data pipelines using technologies like Databricks, Azure Data Factory, Apache Spark, and Delta Lake.
- Craft healthcare data solutions: process massive healthcare datasets, optimize performance, and ensure data is accurate and secure.
- Communicate technical concepts to non-technical stakeholders, manage multiple priorities, and meet deadlines.
Gentiva offers compassionate care in the comfort of patients' homes as a national leader in hospice, palliative, home health care, and advanced illness management. They have nearly 600 locations and thousands of clinicians across 38 states, offering rewarding careers in a collaborative environment.
- Design, build, and maintain robust and scalable data pipelines from diverse sources.
- Leverage expert-level experience with dbt and Snowflake to structure, transform, and organize data (see the sketch after this list).
- Collaborate with engineering, product, and analytics teams to deliver data solutions that drive business value.
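Since this role calls out dbt on Snowflake, here is a hedged sketch of a dbt Python model using the Snowpark backend (most dbt models are SQL; a Python model is shown to keep all examples here in one language). The upstream model and column names are hypothetical.

```python
# Hedged sketch of a dbt Python model on Snowflake (Snowpark backend).
# In a dbt project this file would live under models/ and run via `dbt run`.
from snowflake.snowpark.functions import col, count, sum as sum_

def model(dbt, session):
    dbt.config(materialized="table")

    orders = dbt.ref("stg_orders")  # hypothetical upstream staging model

    # One row per customer with lifetime spend and order count.
    return (
        orders.group_by("customer_id")
        .agg(
            sum_(col("amount")).alias("lifetime_value"),
            count(col("order_id")).alias("order_count"),
        )
    )
```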
Topstep offers an engaging work environment, ranging from fully remote to hybrid, and fosters a culture of collaboration.
- Design and implement data ingestion and transformation pipelines using Databricks, PySpark, and distributed processing.
- Implement Delta Lake principles, focusing on change data capture (CDC) and schema evolution, and integrate data quality frameworks into CI/CD pipelines to protect data integrity (see the sketch after this list).
- Develop and optimize complex SQL and Python scripts, handle both structured and unstructured data, and improve inconsistent legacy datasets.
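A minimal sketch of the CDC pattern named above, assuming a Spark session with Delta Lake already configured (for example, a Databricks cluster); the paths, key column, and `op` change-type column are hypothetical.

```python
# Hedged sketch: CDC upsert into a Delta table, with schema evolution
# enabled so new source columns merge into the target automatically.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # assumes Delta Lake is configured
spark.conf.set("spark.databricks.delta.schema.autoMerge.enabled", "true")

changes = spark.read.format("delta").load("/mnt/cdc/customers_changes")  # hypothetical feed
target = DeltaTable.forPath(spark, "/mnt/lake/customers")                # hypothetical target

(
    target.alias("t")
    .merge(changes.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedDelete(condition="s.op = 'DELETE'")    # tombstoned rows
    .whenMatchedUpdateAll(condition="s.op = 'UPDATE'") # changed rows
    .whenNotMatchedInsertAll()                         # brand-new keys
    .execute()
)
```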
Mobile Wave Solutions is a professional services company specializing in software development as a service, with a team of over 120 engineers.
- Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
- Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data.
- Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
- Design, build, and maintain the pipelines that power all data use cases.
- Develop intuitive, performant, and scalable data models that support product features, internal analytics, experimentation, and machine learning workloads.
- Define and enforce standards for accuracy, completeness, lineage, and dependency management.
Patreon is a media and community platform where over 300,000 creators give their biggest fans access to exclusive work and experiences.
- Lead and mentor a team of data engineers, fostering innovation, collaboration, and continuous improvement.
- Design, implement, and optimize scalable data pipelines and ETL processes to meet evolving business needs.
- Ensure data quality, governance, security, and compliance with industry standards and best practices.
Jobgether is a platform that connects job seekers with companies. They use an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly against the role's core requirements.
As a key member of our Data Engineering team, you will: Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives. Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture. Work with the team to meet performance, scalability, and reliability goals.
PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.
- Partner with our customer teams to develop engineering plans to implement solutions for our health system partners.
- Build and support robust batch and streaming pipelines (see the sketch after this list).
- Mature our monitoring systems and processes to improve visibility and failure detection across our infrastructure.
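As a hedged sketch of a streaming pipeline like those described above, here is a Structured Streaming job that reads from Kafka and appends to a Delta table. The broker address, topic, and paths are hypothetical, and the cluster is assumed to have the Kafka and Delta packages available.

```python
# Hedged sketch: stream events from Kafka into a Delta table with
# checkpointing for exactly-once bookkeeping. All names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # assumes Kafka + Delta packages present

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "patient-events")             # hypothetical topic
    .load()
)

parsed = events.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/chk/patient_events")  # stream state
    .outputMode("append")
    .start("/mnt/lake/patient_events")
)
query.awaitTermination()
```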
Paradigm is rebuilding the clinical research ecosystem by enabling equitable access to trials for all patients. Incubated by ARCH Venture Partners and backed by leading healthcare and life sciences investors, Paradigm’s seamless infrastructure, implemented at healthcare provider organizations, will bring potentially life-saving therapies to patients faster.
Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions. Design and oversee the data architecture and infrastructure, ensuring scalability, performance, and security. Create scalable and efficient data processing frameworks, including ETL processes and data pipelines.
Lingaro has been on the market since 2008, with 1,500+ talents currently on board across 7 global sites, and emphasizes career growth and skills development.
The Sr Data Engineer, DevX creates the best developer experience for data and application engineers at Basis. They design, implement, and maintain deployment and ETL pipelines for data products, and integrate diverse data sources and vendor products, including databases, APIs, and third-party services.
Basis Technologies empowers agencies and brands with cutting-edge software that automates digital media operations, offering flexible work options across the U.S.
As a Senior Data Engineer, shape a scalable data platform that drives business insights. Design and maintain robust data pipelines and collaborate with cross-functional teams. Tackle complex data challenges, implement best practices, and mentor junior engineers.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
- Design, build, and scale data pipelines across a variety of source systems and streams.
- Implement appropriate design patterns while optimizing performance, cost, security, scale, and end-user experience.
- Collaborate with cross-functional teams to understand data requirements and develop efficient data acquisition and integration strategies.
NBCUniversal is one of the world's leading media and entertainment companies creating world-class content across film, television, streaming and theme parks.