Source Job

Europe

  • Organize and structure data systems at both macro and micro levels, designing and implementing data architectures that support business goals
  • Optimize data pipelines for performance, reliability, and scalability
  • Design, build, and maintain scalable ETL/ELT pipelines with Airflow to process large-scale, complex datasets
  • Deliver data products useful for machine learning and AI research and development (data models, metadata, and semantics)

Python SQL Airflow Docker Kubernetes
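The ETL/ELT work described above can be illustrated with a minimal extract–transform–load sketch in plain Python. In practice each function would become one Airflow task in a DAG; all dataset and field names here are hypothetical.

```python
# Minimal ETL sketch: each function would typically map to one Airflow task.
# All record shapes and field names are hypothetical examples.

def extract(rows):
    """Pull raw records from a source system (stubbed as an in-memory list)."""
    return list(rows)

def transform(records):
    """Normalize field names, cast types, and drop incomplete records."""
    return [
        {"user_id": r["id"], "amount": float(r["amount"])}
        for r in records
        if r.get("id") is not None and r.get("amount") is not None
    ]

def load(records, sink):
    """Append cleaned records to a destination (stubbed as a list)."""
    sink.extend(records)
    return len(records)

raw = [{"id": 1, "amount": "9.99"}, {"id": None, "amount": "3.50"}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

The point of splitting extract, transform, and load into separate functions is exactly what an orchestrator like Airflow exploits: each step can be retried, scheduled, and monitored independently.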

20 jobs similar to Data Engineer

Jobs ranked by similarity.

$135,500–$200,000/yr
US

  • Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
  • Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
  • Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.

SmartAsset is an online destination for consumer-focused financial information and advice, whose mission is helping people make smart financial decisions; it reaches an estimated 59 million people each month. A successful $110 million Series D funding round in 2021 valued the company at over $1 billion.

Data Engineer

YLD
Europe

  • Responsible for building core infrastructure software (pipelines, APIs, data modelling) as part of our client's data platform team.
  • Coach & mentor other engineers to support the growth of their technical expertise.
  • Implement appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption.

YLD is a software engineering and design consultancy that creates digital capabilities for their clients. The company has offices in London, Lisbon, and Porto and aims to attract, inspire, develop, and retain extraordinary people.

Europe Asia

  • Design, implement, and maintain robust, scalable data pipelines to support AI, analytics, and operational reporting
  • Own and evolve the data warehouse architecture, ensuring it meets performance, flexibility, and governance needs
  • Ensure data integrity, availability, lineage, and observability across complex pipelines

Remote People is building the infrastructure to power borderless teams. Their technology handles global payroll, benefits, taxes, and compliance, enabling businesses to compliantly hire anyone anywhere at the push of a button. They are a growing, international family.

  • Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
  • Implement data quality checks, monitoring, and validation processes.
  • Automate manual processes into centralized and scalable solutions.
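"Data quality checks" in bullets like the ones above usually means simple, automated assertions run over each batch before it is loaded. A minimal sketch, with hypothetical field names and rules:

```python
def check_batch(records, required_fields):
    """Return a list of human-readable data-quality failures for one batch.
    The required-field rule here is a hypothetical example of a check."""
    failures = []
    if not records:
        failures.append("batch is empty")
    for i, rec in enumerate(records):
        for field in required_fields:
            if rec.get(field) in (None, ""):
                failures.append(f"record {i}: missing {field}")
    return failures

# One record is complete, one has an empty timestamp field.
batch = [{"id": 1, "ts": "2024-01-01"}, {"id": 2, "ts": ""}]
issues = check_batch(batch, ["id", "ts"])
```

In a production pipeline the returned failures would feed monitoring and alerting rather than being inspected by hand.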

Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2000 colleagues worldwide and traded on Nasdaq as part of Informa PLC.

Europe

  • Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
  • Increase the robustness of existing production pipelines, identify bottlenecks, and set up robust monitoring, testing processes, and documentation templates.
  • Build custom applications and integrations that automate manual customer-operations tasks, helping Product Operations / Support / SRE in their day-to-day activities.

Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.

$100,000–$140,000/yr
US

  • Design, build, and maintain scalable data pipelines for clients across industries.
  • Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
  • Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.

NuView Analytics helps companies accelerate the time to insights from their data through data analytics, diligence, and fractional data science. They are a growth-stage company looking to drive additional value from the data they are sitting on and value humility, intellectual rigor, and stewardship.

$219,625–$235,675/yr
US Unlimited PTO

  • Define and work within our data governance practices, including a catalog/dictionary and management of data quality.
  • Manage lights-out data operations of our ETL/ELT pipelines ranging from streaming inputs to batch file loads, to support customer reporting, development, and operations.
  • Untangle, normalize, and synthesize data as needed to permit joins and comparisons across disparate sources, as well as further analysis, including ML processing.

Evermore is a technology company that administers Smart Benefits to connect people to products and services. They are backed by leading investors including General Catalyst, Define Ventures, Lightspeed Venture Partners, Pinegrove Capital Partners, and Qiming Venture Partners.

Europe Asia

  • Create innovative solutions for handling petabytes of data with billions of rows and joins.
  • Create real-time and offline feature-generation pipelines, keeping our data infrastructure reliable and fast.
  • Develop and productionize data pipelines for our ML models in both bare-metal and cloud environments.

Kayzen is a mobile demand-side platform (DSP) dedicated to democratizing programmatic advertising. They enable leading apps, agencies, media buyers, and brands to run programmatic customer acquisition, retargeting, and brand performance campaigns through their self-serve and managed service options.

$180,000–$200,000/yr
US

  • Lead the architecture and evolution of scalable, distributed data pipelines, ensuring high availability and performance at scale
  • Build and maintain distributed web scraping systems using tools such as Playwright, Selenium, and BeautifulSoup
  • Integrate AI and LLMs into engineering workflows for code generation, automation, and optimization
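The scraping stack named above (Playwright, Selenium, BeautifulSoup) is third-party tooling; as a rough stand-in, the parse step such systems perform can be sketched with only Python's standard library. The sample HTML and link paths below are invented for illustration.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags: a toy stand-in for the parse
    step of a scraping pipeline (Playwright/Selenium would supply the HTML)."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Inline sample document; a real scraper would fetch rendered pages instead.
sample = '<ul><li><a href="/jobs/1">Data Engineer</a></li><li><a href="/jobs/2">Analyst</a></li></ul>'
parser = LinkExtractor()
parser.feed(sample)
```

Distributed scrapers layer fetching, rendering, retries, and deduplication around a parse step like this one.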

MercatorAI is building scalable data infrastructure to power high-quality, data-driven decision making at scale. As an early-stage company, the team is focused on creating robust, future-ready systems that can handle complex data ingestion, transformation, and delivery across a growing national footprint.

$118,000–$148,000/yr
US

  • Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning
  • Partner cross-functionally with analytics, product, engineering and operations to deliver high-quality data solutions that drive measurable business impact
  • Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response

Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers and risk-takers who know the value of peace of mind in an unpredictable world.

Europe

  • Design and maintain scalable data pipelines.
  • Structure, transform, and optimize data in Snowflake.
  • Implement multi-source ETL/ELT flows (ERP, APIs, files).

QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. They help customers in various industries rapidly adapt to change and innovate for competitive advantage.

$110,000–$135,000/yr
Canada

  • Design and implement scalable data architectures to support business needs.
  • Build and optimize data pipelines, ensuring data accessibility and security.
  • Develop and maintain data models, databases, and data lakes, with robust data governance.

Terawatt Infrastructure delivers large scale, turnkey charging solutions for companies rapidly deploying AV and EV fleets. With a growing portfolio of sites across the US, Terawatt is building the permanent transportation and logistics infrastructure of tomorrow through capital, real estate, development, and site operations solutions.

$89,440–$94,380/yr
US

  • Design, build, and maintain scalable data pipelines.
  • Develop and optimize ETL/ELT processes using cloud data technologies.
  • Partner with teams to understand data requirements and improve data capture strategies.

Blueprint is a technology solutions firm with a strong presence across the United States, solving complicated problems for their clients. They are bold, smart, agile, and fun, and believe in unique perspectives, building teams of people with diverse skillsets and backgrounds.

$106,000–$120,000/yr
US

  • Lead the technical onboarding of partner institutions onto UDTS.
  • Design, build, and maintain scalable data pipelines and architectures.
  • Collaborate with team members to set engineering standards and guide data infrastructure strategy.

DataKind is a non-profit organization that uses data science and AI to address global challenges. They work with various sectors like health, humanitarian action, climate, economic opportunity, and education to create data-driven tools.

Global Unlimited PTO

  • Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
  • Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimise data models, and enable self-serve analytics and AI.
  • End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.

Trustonic makes smartphones affordable, enabling global access to devices and digital finance through secure smartphone locking technology. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. The company celebrates its diversity and is looking to do the right thing: for each other, the community and the planet.

$172,000–$254,000/yr
US Canada

  • Collaborate with product managers, data analysts, and machine learning engineers to develop pipelines and ETL tasks.
  • Establish data architecture processes and practices that can be scheduled, automated, replicated and serve as standards.
  • Manage individual Data Engineers to foster learning, growth and success at Doximity.

Doximity is transforming the healthcare industry with a mission to help every physician be more productive and provide better care for their patients. As medicine's largest network in the United States, they are committed to building diverse teams with an inclusive culture.

US

  • Apply an in-depth understanding of data structures and information content.
  • Design the architecture for a new data and analytics platform to support analytics, data science, and machine learning.
  • Develop complex SQL queries to obtain data from our source systems.

Cascade Financial Services helps thousands of families realize the dream of home ownership by offering mortgage loan solutions customized to the manufactured housing marketplace. Their culture lives in their team members, and they are focused and dedicated to providing a platform for growth.

$120,000–$150,000/yr
US

  • Maintain and continuously improve your technical expertise to be an Airflow expert.
  • Work with customers to educate and guide them regarding Airflow best practices.
  • Collaborate with team members to design, prototype, and implement engineering solutions.

Astronomer empowers data teams to bring software, analytics, and AI to life and is behind Astro, the unified DataOps platform powered by Apache Airflow®. They are trusted by more than 800 of the world's leading enterprises, letting businesses do more with their data.

$104,000–$164,000/yr
US

  • Build and manage business data pipelines and transform Firefox telemetry data into structured datasets.
  • Partner with data scientists, product, and marketing teams to turn datasets into models and metrics.
  • Ensure data accuracy and performance using observability tools and resolve data issues.

Mozilla Corporation is a technology company backed by a non-profit that has shaped the internet, creating brands like Firefox. With millions of users globally, they focus on areas including AI and social media while remaining focused on making the internet better for people.

Global

  • Build and maintain robust data pipelines processing large volumes of data
  • Update and optimise our data platform for speed, scalability and cost
  • Develop processes and tools to monitor and analyse model performance and data accuracy

Moniepoint is Africa's all-in-one financial ecosystem, empowering businesses and their customers with seamless payment, banking, credit, and management tools. They processed $182 billion in 2023 and are Nigeria’s largest merchant acquirer, cultivating a culture of innovation, teamwork, and growth.