Source Job
Design and build scalable data pipelines to ensure seamless data flow from multiple sources. Automate data collection, transformation, and delivery processes to support real-time and batch processing requirements. Work with stakeholders to define and enforce data governance policies and standards.
20 jobs similar to Data Governance Engineer
Jobs ranked by similarity.
- Design, develop, and implement end-to-end data pipelines to support data collection and transformation.
- Lead the architecture and development of scalable and maintainable data solutions.
- Collaborate with data scientists and analysts to provide clean and accessible data.
DexCare optimizes time in healthcare, streamlining patient access, reducing waits, and enhancing overall experiences.
- Build and optimize Sauce's lakehouse architecture using Azure Databricks and Unity Catalog for data governance.
- Create and maintain data quality tests and improve existing alerting setups.
- Own the data warehouse: connect data sources and maintain the platform and architecture in coordination with R&D infrastructure and operations teams.
Sauce is a premier restaurant technology platform that helps businesses grow with its Commission-Free Delivery & Pickup structure and proprietary delivery optimization technology.
- Design and implement end-to-end data integrations into analytical and operational data stores.
- Evaluate and recommend tooling, frameworks, and platforms for ingestion, transformation, and orchestration.
- Implement monitoring, alerting, and observability including metrics, logging, lineage, and data-quality controls.
Blue Coding specializes in hiring excellent developers and amazing people from all over Latin America and other parts of the world.
Build robust data pipelines at scale. Design and implement data schemas. Collaborate with Analytics/Data Science team to structure and house data.
Goods & Services is a product design and engineering company that solves mission-critical challenges for some of the world’s largest enterprises.
- Architect and maintain robust data pipelines to transform diverse data inputs.
- Integrate data from various sources into a unified platform.
- Build APIs with AI assistance to enable secure access to consolidated insights.
Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.
- Design, build, and maintain highly scalable, reliable, and efficient ETL/ELT pipelines.
- Ingest data from a multitude of sources and transform raw data into clean, structured, and AI/ML-ready formats.
- Work closely with data scientists, machine learning engineers, and business analysts to understand their data needs.
Valtech exists to unlock a better way to experience the world by blending crafts, categories, and cultures, helping brands unlock new value in an increasingly digital world.
- Design, build, and maintain scalable, high-quality data pipelines.
- Implement robust data ingestion, transformation, and storage using cloud-based technologies.
- Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.
CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world and partnerships with more than 1,000 clients, they value diversity and foster an inclusive, safe work environment.
- Design, build, and scale data pipelines across a variety of source systems and streams.
- Implement appropriate design patterns while optimizing for performance, cost, security, scalability, and end-user experience.
- Collaborate with cross-functional teams to understand data requirements and develop efficient data acquisition and integration strategies.
NBCUniversal is one of the world's leading media and entertainment companies creating world-class content across film, television, streaming and theme parks.
- Contribute to the design and implementation of highly scalable data infrastructure.
- Implement and maintain end-to-end data pipelines supporting batch and real-time analytics.
- Work with Product, Engineering, and Business teams to understand data requirements.
Docker makes app development easier so developers can focus on what matters, and it has a remote-first team that spans the globe.
- Design, develop, and maintain robust data processes and solutions.
- Develop and maintain data models, databases, and data warehouses.
- Collaborate with stakeholders to gather requirements and provide data solutions.
Highmark Health is a national, blended health organization that includes one of America’s largest Blue Cross Blue Shield insurers.
Work with data end-to-end, exploring, cleaning, and assembling large, complex datasets. Analyze raw data from multiple sources and identify trends and patterns, maintaining reliable data pipelines. Build analytics-ready outputs and models that enable self-service and trustworthy insights across the organization.
Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions for over two decades.
- Design and implement scalable, high-performance data architectures to support business needs.
- Develop, automate, and maintain production-grade data pipelines using modern data stack tools and best practices.
- Optimize data workflows and implement observability frameworks to monitor pipeline performance, reliability, and accuracy.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions. Design and oversee the data architecture and infrastructure, ensuring scalability, performance, and security. Create scalable and efficient data processing frameworks, including ETL processes and data pipelines.
Lingaro has been on the market since 2008, with 1,500+ talents currently on board across 7 global sites, and emphasizes career growth and skills development.
- Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
- Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large datasets.
- Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
- Design, build, and execute data pipelines.
- Build a configurable ETL framework.
- Optimize SQL queries to maximize system performance.
RefinedScience is dedicated to delivering high-quality emerging tech solutions. While the job description does not describe company size or culture, the role appears to value innovation and collaboration.
- Design, build, and maintain scalable and reliable data pipelines.
- Develop and maintain ETL data pipelines for large volumes of data, writing clean, maintainable, and efficient code.
- Work closely with product managers, data scientists, and software engineers to create and prepare datasets from disparate sources.
Curinos empowers financial institutions to make better, faster, and more profitable decisions through industry-leading proprietary data, technologies, and insights.
Own the design, build, and optimization of end-to-end data pipelines. Establish and enforce best practices in data modeling, orchestration, and system reliability. Collaborate with stakeholders to translate requirements into robust, scalable data solutions.
YipitData is the leading market research and analytics firm for the disruptive economy and most recently raised $475M from The Carlyle Group at a valuation of over $1B.
The Senior Data Engineer, DevX creates the best developer experience for data and application engineers at Basis. They design, implement, and maintain deployment and ETL pipelines for data products, and integrate diverse data sources and vendor products, including databases, APIs, and third-party services.
Basis Technologies empowers agencies and brands with cutting-edge software that automates digital media operations, offering flexible work options across the U.S.
- Build and monitor Cribl’s core data tech stack, including data pipelines and the data warehouse.
- Develop cloud-native services and infrastructure that power scalable and reliable data systems.
- Support Cribl’s growing data science and agentic initiatives by preparing model-ready datasets.
Cribl is a company that provides a data engine for IT and Security for various industries.
- Develop data models and pipelines for customer-facing applications, research, reporting and machine learning.
- Optimize data models to support efficient data storage and retrieval processes for performance and scalability.
- Optimize ETL processes for ingesting, processing and transforming large volumes of structured and unstructured data into our data ecosystem.
Inspiren offers the most complete and connected ecosystem in senior living, bringing peace of mind to residents, families, and staff.