Source Job

Europe Asia

  • Design, implement, and maintain robust, scalable data pipelines to support AI, analytics, and operational reporting
  • Own and evolve the data warehouse architecture, ensuring it meets performance, flexibility, and governance needs
  • Ensure data integrity, availability, lineage, and observability across complex pipelines

Python SQL AWS Terraform Data Modeling

20 jobs similar to Senior Data Engineer EMEA

Jobs ranked by similarity.

Latin America

  • Design and evolve scalable data pipelines and architectures.
  • Act as the primary anchor for data ingestion, transformation, and storage solutions.
  • Ensure mission-critical data is accessible and reliable.

CodeRoad provides end-to-end software development services, helping businesses scale with ideal infrastructure solutions. From staff augmentation to dedicated IT teams and general software engineering, their nearshore technology services empower businesses to thrive in an ever-evolving digital landscape.

Global Unlimited PTO

  • Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
  • Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimize data models, and enable self-serve analytics and AI.
  • End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.

Trustonic provides smartphone locking technology, enabling global access to devices and digital finance. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. They celebrate diversity and aim to do the right thing for each other, the community, and the planet.

Europe

  • Design and maintain scalable data pipelines.
  • Structure, transform, and optimize data in Snowflake.
  • Implement multi-source ETL/ELT flows (ERP, APIs, files).

QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. They help customers in various industries rapidly adapt to change and innovate for competitive advantage.

Global

  • Design, implement, and maintain scalable, high-performance data architectures connecting relational and non-relational systems.
  • Manage end-to-end data pipelines, ensuring seamless ingestion from scrapers to AI/ML workflows.
  • Audit and optimize existing workflows for efficiency, accuracy, and flexibility.

Jobgether is a pioneering HR Tech startup, operating entirely remotely and leading the revolution in the world of work. As the largest job search engine designed exclusively for remote workers, its mission is to empower individuals to discover opportunities that align seamlessly with their unique lifestyles.

$89,440–$94,380/yr
US

  • Design, build, and maintain scalable data pipelines.
  • Develop and optimize ETL/ELT processes using cloud data technologies.
  • Partner with teams to understand data requirements and improve data capture strategies.

Blueprint is a technology solutions firm with a strong presence across the United States, solving complicated problems for their clients. They are bold, smart, agile, and fun, and believe in unique perspectives, building teams of people with diverse skillsets and backgrounds.

$108,400–$135,500/yr
US North America

  • Design, develop, and maintain scalable data pipelines using cloud data services.
  • Serve as a technical leader, defining data engineering standards and best practices.
  • Lead the design and implementation of optimized data models in our cloud data warehouse.

Constant Contact empowers people by giving them the help and tools they need to grow online. They are energized by new challenges and possibilities, and they celebrate diversity and inclusion with programs in place to bring people together.

$125,000–$165,000/yr
Global

  • Create and maintain optimal data pipeline architecture.
  • Extend our machine learning platform by designing tools that interface with cloud services.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data.

NinjaHoldings aims to revolutionize how Americans interact with financial services. They have a lean and innovative team that empowers people overlooked by traditional financial institutions through digital banking and lending products.

US

  • Build and maintain scalable data pipelines from ingestion through transformation and delivery.
  • Design, build, and maintain our data warehouse and data marts.
  • Partner with stakeholders to translate business needs into clean data models.

Gurobi Optimization focuses on mathematical optimization. They empower customers to expand their use of mathematical optimization technology in order to make smarter decisions and solve some of the world's toughest and most impactful business problems.

$185,000–$215,000/yr
US

  • Collaborate with Data Science, Product Managers and Software Engineers to build robust ETL pipelines.
  • Contribute to architecture decisions, observability tooling, and data quality initiatives.
  • Contribute to a scalable internal framework for managing prompt engineering pipelines and AI workflows.

Federato is on a mission to defend the right to efficient, equitable insurance for all by enabling insurers to provide affordable coverage. They are well funded by the backers of Salesforce, Veeva, Zoom, Box, and others, and they value learning and the ability to change their minds.

$142,000–$162,500/yr
US

  • Architect, build, and operate data infrastructure that powers Tebra’s intelligent features.
  • Translate business requirements into software solutions that accelerate our ability to deploy AI.
  • Monitor data pipelines, detect anomalies, and implement automated recovery systems.

Tebra unites Kareo and PatientPop, providing a digital backbone for practice well-being, supporting both products with a shared vision for modernized care. Over 100,000 providers trust Tebra to elevate patient experience and grow their practice, building the future of well-being with compassion and humanity.

US

  • Design, develop and implement large scale, high-volume, high-performance data infrastructure and pipelines.
  • Build and implement ETL frameworks to improve code quality and reliability.
  • Guide and mentor other Data Engineers as a technical owner of parts of the data platform.

Jobgether is a platform that connects job seekers with companies. They use AI-powered matching to ensure applications are reviewed quickly and fairly.

$92,686–$125,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.

Mexico

  • Design, implement, and maintain scalable, high-performance data architectures connecting relational and non-relational systems.
  • Manage end-to-end data pipelines, ensuring seamless ingestion from scrapers to AI/ML workflows.
  • Audit and optimize existing workflows for efficiency, accuracy, and flexibility.

Jobgether is a pioneering HR Tech startup, operating entirely remotely and leading the revolution in the world of work. They are a job search engine designed exclusively for remote workers, with a team of 30 individuals located across the globe.

US

  • Design, build, and optimize data pipelines to support AI and ML projects.
  • Integrate data from various sources to provide a unified data view for AI applications.
  • Implement processes to ensure data quality, consistency, and accuracy across systems.

The Tyndale Company is a leading national supplier of arc-rated flame-resistant clothing (FRC) to the energy sector. They are a family-owned business, 9x Top Workplace winner in PA and 5x winner in TX, providing a retail-style apparel experience.

Global

  • Build and maintain robust data pipelines processing large volumes of data.
  • Update and optimize our data platform for speed, scalability, and cost.
  • Develop processes and tools to monitor and analyze model performance and data accuracy.

Moniepoint is Africa's all-in-one financial ecosystem, empowering businesses and their customers with seamless payment, banking, credit, and management tools. They processed $182 billion in 2023 and are Nigeria’s largest merchant acquirer, cultivating a culture of innovation, teamwork, and growth.

  • Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
  • Implement data quality checks, monitoring, and validation processes.
  • Automate manual processes into centralized and scalable solutions.

Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2000 colleagues worldwide and traded on Nasdaq as part of Informa PLC.

Philippines

  • Design, develop, and optimize data architecture and pipelines aligned with ETL/ELT principles.
  • Architect workflows using DBT to convert raw data into actionable analytics.
  • Maintain production data pipelines with Python, DBT, Matillion, and Snowflake.

Jobgether is a platform that connects job seekers with partner companies. They use AI-powered matching to ensure applications are reviewed quickly and fairly.

US Unlimited PTO

  • Work cross-functionally with Product and subject matter experts to conceptualize, prototype, and build data solutions
  • Connect disparate datasets (e.g. claims, contract rates, demographics data) to empower internal and external stakeholders
  • Build and maintain data engineering systems that support AI use cases, including scalable ingestion pipelines, feature generation, and downstream products

Turquoise Health aims to make healthcare pricing simpler, more transparent, and lower cost. They are a Series B startup backed by top VCs with an accomplished group of folks with a passion for improving healthcare.

US

  • Architect and sustain self-healing pipelines using Astronomer/Airflow to ensure 24/7 data availability.
  • Design and optimize event-driven API ingestion frameworks leveraging AWS Lambda and DLT (Data Load Tool).
  • Manage high-performance modeling within AWS Redshift, utilizing DBT to transform raw transactional data into high-fidelity business intelligence.

Odisea helps close the opportunity gap between Colombia and the United States by redefining nearshoring. They are building a passionate team of professionals committed to this purpose.

$230,000–$265,000/yr
US Unlimited PTO

  • Design and build robust, highly scalable data pipelines and lakehouse infrastructure with PySpark, Databricks, and Airflow on AWS.
  • Improve the data platform development experience for Engineering, Data Science, and Product by creating intuitive abstractions, self‑service tooling, and clear documentation.
  • Own and maintain core data pipelines and models that power internal dashboards, ML models, and customer-facing products.

Parafin aims to grow small businesses by providing them with the financial tools they need through the platforms they already sell on. They are a Series C company backed by prominent venture capitalists, with a tight-knit team of innovators from companies like Stripe, Square, and Coinbase.