Source Job

US Unlimited PTO 20w maternity 14w paternity

  • Build and operate data pipelines using AWS-native data tools and distributed processing frameworks.
  • Operate and improve core data platform services, addressing incidents, performance issues, and operational toil.
  • Partner with data producers and consumers to onboard pipelines, troubleshoot issues, and improve platform usability.

AWS Spark Python Java Go

20 jobs similar to Data Platform Engineer

Jobs ranked by similarity.

US

  • Architect and lead the evolution of our modern data platform.
  • Design and build production LLM pipelines and infrastructure that power intelligent operations.
  • Own end-to-end data acquisition and integration architecture across diverse sources.

Brightwheel is the largest, fastest-growing, and most loved platform in early education. They are trusted by millions of educators and families every day. The team is passionate, talented, and customer-focused and embodies their Leadership Principles in their work and culture.

$180,000–$200,000/yr
US Canada

  • Collaborate with engineering, data science, ML, data engineering, and product analytics teams to understand and shape the future needs of our data platform and infrastructure.
  • Define, drive, and implement the future live ingestion layer of data into our data platform (e.g. Kafka, Kinesis).
  • Define and evolve standards for storage, compute, data management, provenance, and orchestration.

Inspiren offers the most complete and connected ecosystem in senior living.

$115,000–$160,000/yr
US

As a key member of our Data Engineering team, you will:

  • Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives.
  • Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture.
  • Collaborate with the team to meet performance, scalability, and reliability goals.

PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.

Global

  • Design and implement event-driven pipelines using AWS services to ingest data from external sources in real-time.
  • Build and maintain streaming data pipelines between HubSpot CRM and PostgreSQL, handling webhook events and API polling.
  • Implement schema validation, data type checking, and automated quality gates at the ingestion layer to prevent bad data from entering the system.
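The ingestion-layer quality gate described above can be sketched in plain Python. The field names and rules here are illustrative assumptions, not PropHero's actual schema:

```python
# Minimal sketch of a schema/quality gate at the ingestion layer.
# REQUIRED_FIELDS and the event shape below are hypothetical examples.

REQUIRED_FIELDS = {"event_id": str, "contact_email": str, "updated_at": str}

def validate_event(event: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the event passes."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in event:
            errors.append(f"missing field: {field}")
        elif not isinstance(event[field], expected_type):
            errors.append(f"wrong type for {field}: expected {expected_type.__name__}")
    return errors

def ingest(events: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Split a batch into accepted rows and rejected rows paired with their errors."""
    accepted, rejected = [], []
    for event in events:
        errors = validate_event(event)
        if errors:
            rejected.append((event, errors))  # bad data never reaches the warehouse
        else:
            accepted.append(event)
    return accepted, rejected
```

In a real deployment the rejected rows would typically be routed to a dead-letter queue or quarantine table rather than dropped.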

PropHero is a property analytics platform provider. They have reached €30M in revenue in 4 years with 25% QoQ growth, are already profitable, and offer a modern, cloud-native AWS data platform.

Europe

  • Enable efficient consumption of domain data as a product by delivering and promoting strategically designed, actionable datasets and data models.
  • Build, maintain, and improve rock-solid data pipelines using a broad range of technologies such as AWS Redshift, Trino, Spark, Airflow, and Kafka streaming for real-time processing.
  • Support teams without data engineers in building decentralised data solutions and product integrations, for example around DynamoDB.
  • Act as a data ambassador, promoting the value of data and our data platform among engineering teams and enabling cooperation.

OLX operates consumer brands that facilitate trade to build a more sustainable world. They have colleagues around the world who serve millions of people every month.

$110,572–$145,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis.
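The ingest-transform-load pattern in the bullets above can be illustrated with a minimal sketch; the CSV source format and field names (`fare_id`, `amount`) are invented for the example, not ATPCO's actual data model:

```python
import csv
import io
import json

def extract(raw_csv: str) -> list[dict]:
    """Parse a raw CSV feed into records (one source format among many in practice)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Normalize types and drop rows that cannot be parsed."""
    out = []
    for r in records:
        try:
            out.append({"fare_id": r["fare_id"], "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # a real pipeline would route bad rows to a dead-letter store
    return out

def load(records: list[dict]) -> str:
    """Serialize to newline-delimited JSON, a common warehouse staging format."""
    return "\n".join(json.dumps(r) for r in records)
```

At AWS scale these three steps would map onto services like Glue (extract/transform) and S3 plus Redshift (load), but the shape of the pipeline is the same.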

ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

US Unlimited PTO

  • Partner with clients and implementation teams to understand data distribution requirements.
  • Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
  • Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.

Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, they're tackling big challenges in an industry that’s ready for change with a bold, curious, and collaborative team.

Brazil Canada US Latin America

  • Work alongside Caylent’s Architects, Engineering Managers, and Engineers to deliver AWS solutions.
  • Build solutions defined in project backlogs, writing production-ready, well-tested, and documented code across cloud environments.
  • Participate in Agile ceremonies such as daily standups, sprint planning, retrospectives, and demos.

Caylent is a cloud native services company that helps organizations bring the best out of their people and technology using Amazon Web Services (AWS). They are a global company and operate fully remote with employees in Canada, the United States, and Latin America fostering a community of technological curiosity.

US

  • Build, manage, and operationalize data pipelines for marketing use cases.
  • Develop a comprehensive understanding of customer and marketing data requirements.
  • Transform large data sets into targeted customer audiences for personalized experiences.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Our system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

$59,520–$77,520/yr
EMEA

  • Design, develop, test, and maintain scalable applications using modern frameworks.
  • Actively participate in Agile/Scrum ceremonies, contributing to planning, estimation, and continuous improvement.
  • Contribute to architectural design discussions, test planning, and operational excellence initiatives.

Tealium is a trusted leader in real-time Customer Data Platforms (CDP), helping organizations unify their customer data to deliver more personalized, privacy-conscious experiences. Team Tealium has team members present in nearly 20 countries worldwide, serving customers across more than 30 countries, winning together with respect and appreciation.

US

  • Build and maintain scalable data pipelines using Snowflake OpenFlow and related Snowflake-native tools.
  • Develop and maintain Snowflake semantic views that support analytics and reporting needs.
  • Deliver clean, governed data sets for Sigma dashboards and embedded analytics use cases.

They are building the next-generation analytics stack centered on Snowflake (AWS) and Sigma. They value diverse perspectives and innovation.

$150,000–$200,000/yr
US

  • Be a trusted advisor to customers on best practices, methodologies, and technologies for implementing data engineering solutions.
  • Design, implement, and maintain modern data pipelines to deliver optimal solutions utilizing appropriate cloud technologies.
  • Partner with product owners and business SMEs to analyze customer requirements and provide a supportable and sustainable engineered solution.

CapTech is an award-winning consulting firm that collaborates with clients to achieve what’s possible through the power of technology. They are passionate about their work and the results they achieve for their clients. From the outset, their founders shared a collective passion to create a consultancy centered on strong relationships that would stand the test of time. Today they work alongside clients that include Fortune 100 companies, mid-sized enterprises, and government agencies, a list that spans across the country.

Brazil

  • Design, build, and maintain a robust, self-service, scalable, and secure data platform.
  • Create and edit data pipelines, considering business logic, levels of aggregation, and data quality.
  • Enable teams to access and use data effectively through self-service tools and well-modeled datasets.

We are Grupo QuintoAndar, the largest real estate ecosystem in Latin America, with a diversified portfolio of brands and solutions across different countries.

India

  • Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
  • Develop and optimize data models in Snowflake or similar platforms.
  • Implement ETL/ELT processes using Python and modern data tools.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by their internal team.

$115,000–$145,000/yr
US

  • Collaborate with business leaders, engineers, and product managers to understand data needs.
  • Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, as well as cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

NBCUniversal is one of the world's leading media and entertainment companies that creates world-class content, which we distribute across our portfolio of film, television, and streaming, and bring to life through our global theme park destinations, consumer products, and experiences. We champion an inclusive culture and strive to attract and develop a talented workforce to create and deliver a wide range of content reflecting our world.

Europe

  • Design, build, and maintain scalable, high-quality data pipelines.
  • Implement robust data ingestion, transformation, and storage using cloud-based technologies.
  • Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.

CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they have partnerships with more than 1,000 clients and value diversity, fostering a diverse, inclusive, and safe work environment.

US Unlimited PTO

  • Develop and maintain data pipelines and ETL processes using Python in AWS environments
  • Write and deploy AWS Lambda functions for data processing tasks
  • Collaborate with team members using Git for version control and code collaboration
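A minimal shape for the Lambda-based processing step mentioned above might look like the following. The event structure mimics an S3 object-created trigger record, and the handler body is a placeholder, not 540's actual code:

```python
import json

def lambda_handler(event, context):
    """Entry point AWS Lambda invokes; here it collects keys from S3 trigger records.

    A production handler would use boto3 to fetch and process each object.
    """
    keys = [
        record["s3"]["object"]["key"]
        for record in event.get("Records", [])
        if record.get("eventName", "").startswith("ObjectCreated")
    ]
    return {"statusCode": 200, "body": json.dumps({"processed": keys})}
```

Deployed behind an S3 event notification, this runs once per uploaded batch with no servers to manage, which is the usual appeal of Lambda for lightweight ETL steps.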

540 is a forward-thinking company that the government turns to in order to #getshitdone. They break down barriers, build impactful technology, and solve mission-critical problems.

UK 5w PTO

  • Code, test, and document new or modified data pipelines.
  • Conduct logical and physical database design.
  • Perform root cause analysis on internal and external data.

Aker Systems builds and operates ground-breaking, ultra-secure, high performance, cloud-based data infrastructure for the enterprise. They were recognised as a ‘One to Watch’ on the Sunday Times Tech Track and won the Thames Valley Tech Company of the year.

US

  • Design, build, and maintain custom AWS-native data pipelines using AWS Lambda, AWS Glue, S3, DynamoDB, and EventBridge.
  • Develop and support ETL processes that push structured and unstructured data to downstream systems, including Salesforce Marketing Cloud.
  • Lead the technical design and availability of AWS services, ensuring solutions are scalable, reliable, and fully supported by AWS-native features.

Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York, delivering top-tier technology solutions to companies of all sizes. Their team of 600+ highly skilled tech professionals, based in Latin America, drives digital disruption by partnering with U.S. companies on their most impactful projects.

US

  • Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark.
  • Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
  • Collaborate with cross-functional teams to deliver impactful data solutions.

Jobgether is an AI-powered platform that helps job seekers find suitable opportunities. They connect top-fitting candidates with hiring companies, streamlining the recruitment process through objective and fair assessments.