Source Job

$124,250–$166,750/yr
US Unlimited PTO

  • Designing data workflows, pipelines, and pathways to support ETL/ELT, while ensuring data quality and the exchange of sensitive information across AWS cloud and on-prem environments.
  • Implementing systems engineering principles to ensure the reliability, scalability, and maintainability of data-layer systems, including backup and recovery strategies.
  • Architecting, deploying, and operating data streaming technologies (Kafka, Redpanda, Kinesis, etc.) on Kubernetes, with an emphasis on declarative definitions to reduce complexity in day-2 operations.
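
On the last bullet: "declarative definitions" typically means desired state is expressed as Kubernetes custom resources that an operator reconciles, rather than scripts mutating brokers directly. A minimal sketch, assuming the Strimzi operator (the posting names none) and the official kubernetes Python client; topic, namespace, and cluster names are placeholders:

```python
# Declare a Kafka topic as a custom resource; the operator reconciles the
# cluster toward this state, which is what keeps day-2 operations simple.
from kubernetes import client, config

config.load_kube_config()  # use load_incluster_config() when running in-cluster

topic = {
    "apiVersion": "kafka.strimzi.io/v1beta2",    # Strimzi CRD (an assumption)
    "kind": "KafkaTopic",
    "metadata": {
        "name": "orders",                        # hypothetical topic
        "namespace": "kafka",                    # hypothetical namespace
        "labels": {"strimzi.io/cluster": "prod"},
    },
    "spec": {
        "partitions": 12,
        "replicas": 3,
        "config": {"retention.ms": "604800000"}, # 7 days
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="kafka.strimzi.io",
    version="v1beta2",
    namespace="kafka",
    plural="kafkatopics",
    body=topic,
)
```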

Kubernetes Kafka AWS Terraform

20 jobs similar to New Data Engineer

Jobs ranked by similarity.

Europe 6w PTO

  • Own reliability, scalability and cost discipline of ingestion and transformation systems
  • Design and deliver infrastructure for real-time/near-real-time feature computation
  • Lead and grow a small, ambitious team while raising technical standards

Yazio is a nutrition app with millions of users in over 150 countries, driven by its mission to transform the world through healthy eating. They champion a focus-driven culture that values efficiency, offering a high-impact environment supported by a diverse, international team committed to growth and well-being.

Europe Unlimited PTO

  • Lead the design and delivery of complex data engineering projects.
  • Design and develop core components of our data platform.
  • Mentor engineers on the team, elevating their skills and promoting best practices in data engineering.

MoonPay is a unified payments platform for digital currency, making it easy for anyone to buy, sell, swap, and pay in digital currencies. They are trusted by over 30 million customers and over 500 ecosystem partners, driving mainstream crypto adoption worldwide.

$118,000–$148,000/yr
US

  • Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning
  • Partner cross-functionally with analytics, product, engineering and operations to deliver high-quality data solutions that drive measurable business impact
  • Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response
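
As a concrete reading of the third bullet, a minimal sketch of a data-quality gate that rejects a batch before it flows downstream; field names, thresholds, and the assumption that `event_time` values are timezone-aware datetimes are illustrative, not from the posting:

```python
from datetime import datetime, timedelta, timezone

def check_events_batch(rows: list[dict]) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    if not rows:
        return ["empty batch"]

    # Completeness: more than 1% null user_id is treated as an incident.
    null_user = sum(1 for r in rows if r.get("user_id") is None)
    if null_user / len(rows) > 0.01:
        failures.append(f"user_id null rate {null_user / len(rows):.2%}")

    # Freshness: the newest event must be less than an hour old.
    newest = max(r["event_time"] for r in rows)
    if datetime.now(timezone.utc) - newest > timedelta(hours=1):
        failures.append(f"stale data: newest event at {newest.isoformat()}")
    return failures
```

A pipeline would call this between extract and load, raising (and alerting) on any non-empty result.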

Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers and risk-takers who know the value of peace of mind in an unpredictable world.

$135,000–$185,000/yr
Canada

  • Design, deploy, and maintain large-scale Kafka event streaming infrastructure.
  • Collaborate with engineers to enable new features and ensure data pipeline reliability.
  • Execute and automate Kafka cluster upgrades, migrations, and major version rollouts.
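
The upgrade bullet implies preflight checks before touching brokers. A hedged sketch using confluent-kafka's AdminClient that refuses to proceed while any partition is under-replicated; the bootstrap address is a placeholder:

```python
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "kafka:9092"})
metadata = admin.list_topics(timeout=10)

# A partition whose in-sync replica set is smaller than its replica set
# would lose redundancy if its leader's broker were restarted mid-rollout.
under_replicated = [
    (topic.topic, p.id)
    for topic in metadata.topics.values()
    for p in topic.partitions.values()
    if len(p.isrs) < len(p.replicas)
]

if under_replicated:
    raise SystemExit(f"hold the rollout, under-replicated: {under_replicated}")
print("all partitions fully replicated; safe to roll brokers one at a time")
```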

Yelp's engineering culture values teamwork, individual authenticity, and creative solutions. They are all about helping their users, growing as engineers, and having fun in a collaborative environment.

Global Unlimited PTO

  • Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
  • Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimise data models, and enable self-serve analytics and AI.
  • End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.

Trustonic makes smartphones affordable, enabling global access to devices and digital finance through secure smartphone locking technology. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. The company celebrates its diversity and is looking to do the right thing: for each other, the community and the planet.

US

  • Build and maintain the high-throughput, event-driven pipelines responsible for processing the history of assets and vulnerabilities.
  • Design systems that handle massive scale, ensuring data accuracy and real-time availability.
  • Use Terraform and Datadog to deploy, monitor, and ensure the health of services in production.
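
For the Datadog half of the last bullet, a small sketch of in-service instrumentation with the datadog package's DogStatsD client (Terraform would own the monitors and dashboards themselves); metric names, tags, and the agent address are made up:

```python
from datadog import initialize, statsd

initialize(statsd_host="localhost", statsd_port=8125)  # local Datadog agent

def process_event(event: dict) -> None:
    # Time the hot path and count throughput so monitors can alert on both.
    with statsd.timed("pipeline.event.latency", tags=["stage:enrich"]):
        ...  # the actual enrichment work goes here
    statsd.increment("pipeline.events.processed", tags=["stage:enrich"])
```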

Tenable is the Exposure Management company, with 44,000 organizations using it to understand and reduce cyber risk. Their global employees support 65 percent of the Fortune 500, 45 percent of the Global 2000, and large government agencies, fostering a culture of belonging, respect, and excellence.

UK

  • Manage and mentor a high-performing team, fostering a culture of technical excellence.
  • Define the Data Engineering team vision, balancing immediate business needs with a long-term shift towards a self-service data mesh architecture.
  • Oversee the development of core data pipelines and platform tools, ensuring high performance for ingestion services.

UW provides utilities all in one place, with one bill covering energy, broadband, mobile, and insurance, targeting savings for customers. They are aiming to double in size and are looking for people to help them achieve this goal through innovation and impact.

$120,000–$150,000/yr
US Unlimited PTO

  • Help build scalable data solutions and streamline data ingestion.
  • Maintain high-quality databases that support our scientific and operational teams.
  • Optimize our data infrastructure to ensure efficient data access.

Funga is a public benefit corporation addressing the climate crisis by harnessing forest fungal networks. They are a team of passionate scientists and builders working to draw down at least three gigatons of carbon dioxide from the atmosphere by 2050.

US Central America South America

  • Work closely with data analysts, data engineers, and machine learning engineers, understanding their needs and challenges.
  • Design and implement systems, or refine existing ones, that form the foundations of our data platform.
  • Provide insights into the performance and stability of the foundational systems of the data platform through thoughtful monitoring and analysis.

Doximity is transforming the healthcare industry, helping every physician be more productive and provide better care for their patients. As medicine's largest network in the United States, they are committed to building diverse teams with an inclusive culture that can make a direct impact on the healthcare system.

Europe Asia

  • Create innovative solutions for handling petabytes of data with billions of rows and joins.
  • Create real-time and offline feature-generation pipelines, keeping our data infrastructure reliable and fast!
  • Develop and productionize data pipelines for our ML models in both bare-metal and cloud environments.

Kayzen is a mobile demand-side platform (DSP) dedicated to democratizing programmatic advertising. They enable leading apps, agencies, media buyers, and brands to run programmatic customer acquisition, retargeting, and brand performance campaigns through their self-serve and managed service options.

Egypt

  • Build CDC pipelines and real-time streaming (Kafka/Flink), sketched after this list
  • Design and maintain data models (raw to staging to core)
  • Implement observability, data transformations and quality checks
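
A minimal sketch of the CDC leg from the first bullet: consume Debezium-style change events from Kafka and route them by operation type toward raw/staging tables. Debezium itself is an assumption (the listing only says CDC over Kafka/Flink), and the topic name and downstream writer are placeholders:

```python
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "kafka:9092",
    "group.id": "cdc-raw-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["pg.public.orders"])  # hypothetical CDC topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        envelope = json.loads(msg.value())
        op = envelope.get("op")        # c=create, u=update, d=delete
        if op in ("c", "u"):
            row = envelope["after"]    # post-image of the changed row
        elif op == "d":
            row = envelope["before"]   # pre-image, used to tombstone
        else:
            continue
        print(op, row)                 # stand-in for the staging-table writer
finally:
    consumer.close()
```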

Jobgether is a platform that connects job seekers with companies. They use an AI-powered matching process to ensure applications are reviewed quickly, objectively, and fairly.

US

  • Design, build, and maintain streaming services using Kafka, Spring Boot, and Spring Cloud Stream.
  • Develop and manage Kafka connectors for data integration and own schema management and evolution using Protocol Buffers (see the compatibility sketch after this list).
  • Collaborate with cross-functional teams on API design, data contracts, and integration patterns, plus write infrastructure as code using Terraform.
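
The stack named above is JVM (Spring Boot / Spring Cloud Stream), so here is a language-neutral sketch of the schema-evolution concern from the second bullet: before deploying a producer with a changed Protobuf schema, ask the schema registry whether it stays compatible with the latest registered version. A Confluent Schema Registry and its REST API are assumptions, and the URL, subject, and schema are placeholders:

```python
import json
import requests

REGISTRY = "http://schema-registry:8081"   # placeholder
SUBJECT = "orders-value"                   # hypothetical subject

new_schema = 'syntax = "proto3"; message Order { string id = 1; int64 total_cents = 2; }'

resp = requests.post(
    f"{REGISTRY}/compatibility/subjects/{SUBJECT}/versions/latest",
    headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
    data=json.dumps({"schema": new_schema, "schemaType": "PROTOBUF"}),
)
resp.raise_for_status()
ok = resp.json().get("is_compatible")
print("compatible" if ok else "breaking change; do not deploy")
```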

Life360's mission is to keep people close to the ones they love through their mobile app and Tile tracking devices. They have more than 500 remote-first employees and deliver peace of mind while enhancing everyday family life.

$140,000–$160,000/yr
US 4w PTO

  • Own and evolve the data infrastructure that powers Clever's core data products.
  • Maintain and improve data pipeline reliability, monitoring and resolving pipeline failures.
  • Design and implement ingestion for new operational data sources that support Clever's speed-to-match initiative.

Clever Real Estate is a venture-backed technology company aiming to revolutionize real estate transactions. They have built a leading online education platform helping consumers save money and have earned a 4.9 Trustpilot rating with over 3,800 reviews.

Global

  • Design and implement real-time streaming data pipelines for high-volume event data.
  • Develop and operate distributed data processing systems using technologies such as Apache Flink, Apache Kafka, Apache Druid.
  • Build scalable ingestion pipelines capable of handling millions of events per second.
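
One consumer-side lever for "millions of events per second": poll in large batches with manual offset commits, amortizing per-message overhead. A confluent-kafka sketch; broker address, topic, and batch size are placeholders:

```python
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "kafka:9092",
    "group.id": "ingest",
    "enable.auto.commit": False,    # commit only after a batch is durable
    "fetch.min.bytes": 1_048_576,   # let the broker coalesce ~1 MiB fetches
})
consumer.subscribe(["events"])

while True:
    batch = consumer.consume(num_messages=10_000, timeout=1.0)
    records = [m for m in batch if not m.error()]
    if not records:
        continue
    # ... hand `records` to the stream processor / Druid ingestion here ...
    consumer.commit(asynchronous=False)  # at-least-once checkpoint
```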

$92,686–$125,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis
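
A hedged sketch of the shape the third bullet describes, one of several reasonable wirings: an S3-triggered Lambda that loads each new object into Redshift with a COPY statement via the Redshift Data API. Cluster, database, table, and role names are placeholders:

```python
import boto3

redshift = boto3.client("redshift-data")

def handler(event, context):
    """Lambda entry point for a standard S3 ObjectCreated notification."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        redshift.execute_statement(
            ClusterIdentifier="analytics",     # hypothetical cluster
            Database="warehouse",
            DbUser="loader",
            Sql=(
                f"COPY staging.events FROM 's3://{bucket}/{key}' "
                "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy' "
                "FORMAT AS PARQUET"
            ),
        )
```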

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.

Europe

  • Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
  • Increase the robustness of existing production pipelines, identify bottlenecks, and set up robust monitoring, testing processes, and documentation templates.
  • Build custom applications and integrations to automate manual tasks related to customer operations, helping Product Operations / Support / SRE in their day-to-day activities.

Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.

US

  • Design and operate platform-oriented core services that support high-scale distributed systems across OnePay’s product ecosystem
  • Architect, build, and manage Kafka-based event streaming systems that power real-time data flows and mission-critical financial workflows (a producer sketch follows this list)
  • Develop internal developer tooling and shared infrastructure that improve velocity, reliability, and observability across engineering teams
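
For "mission-critical financial workflows", the usual Kafka producing posture is idempotence plus acks=all, so retries cannot duplicate a payment event. A minimal confluent-kafka sketch; broker address, topic, and payload are placeholders:

```python
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "kafka:9092",
    "enable.idempotence": True,   # broker dedupes on retry
    "acks": "all",                # wait for the full in-sync replica set
    "max.in.flight.requests.per.connection": 5,  # safe with idempotence on
})

producer.produce("payments", key=b"txn-42", value=b'{"amount_cents": 1299}')
producer.flush(10)  # block until delivery is confirmed or the timeout hits
```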

OnePay is a consumer fintech trusted by millions of Americans to make money better. It is an all-in-one financial services platform that brings together banking, high-yield savings, credit cards, point-of-sale lending, investing, and crypto in one place.

Global

  • Lead the modernization of the core data ingestion infrastructure that powers our global intelligence products.
  • Drive the architectural transition of established ingestion systems into a configurable, high-scale, cloud-native ecosystem.
  • Provide growth mentorship, technical decision-making, and operational excellence while leading your AI-enabled team to enterprise scale.

Siteimprove is the leader in agentic content intelligence that brings together accessibility, analytics, SEO/AEO, and content strategy into a single, continuous flow. Their AI agents work alongside marketing and digital teams to create high-performing, accessible content.

US Canada

  • Build platforms that scale: design and operate foundational infrastructure that handles billions of events and enables the company to grow with minimal friction.
  • Enable product velocity: create tooling that lets engineers ship faster and more reliably without becoming infrastructure experts themselves.
  • Drive technical direction: shape Metronome's infrastructure strategy, make platform-level architectural decisions, and mentor engineers across the organization.

Metronome is the leading usage-based billing platform built for modern software companies. They compute millions of invoices per billing period and are scaling rapidly to accommodate new customers, saving them hours of development time and manual invoicing. They've raised over $128M from leading investors including NEA, Andreessen Horowitz, General Catalyst, Elad Gil, and Workday Ventures.

US

  • Design, develop, and implement large-scale, high-volume, high-performance data infrastructure and pipelines.
  • Build and implement ETL frameworks to improve code quality and reliability (a skeleton sketch follows this list).
  • Guide and mentor other Data Engineers as a technical owner of parts of the data platform.
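
A toy sketch of what the "ETL frameworks" bullet can mean in practice: a shared skeleton that standardizes batching and the extract/transform/load contract so individual pipelines stay small and testable. Entirely illustrative; the names and trivial callables are made up:

```python
import logging
from typing import Callable, Iterable

log = logging.getLogger("etl")

def run_pipeline(
    extract: Callable[[], Iterable[dict]],
    transform: Callable[[dict], dict],
    load: Callable[[list[dict]], None],
    batch_size: int = 1000,
) -> int:
    """Pull records, transform each, load in batches; return the row count."""
    batch, total = [], 0
    for record in extract():
        batch.append(transform(record))
        if len(batch) >= batch_size:
            load(batch)
            total += len(batch)
            batch = []
    if batch:
        load(batch)
        total += len(batch)
    log.info("pipeline loaded %d rows", total)
    return total

# Usage: each pipeline supplies its three callables.
run_pipeline(
    extract=lambda: iter([{"id": 1}, {"id": 2}]),
    transform=lambda r: {**r, "valid": True},
    load=lambda b: print("loading", len(b), "rows"),
)
```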

Jobgether is a platform that connects job seekers with companies. They use AI-powered matching to ensure applications are reviewed quickly and fairly.