Source Job

US

  • Design and maintain ETL pipelines that ingest, process, and load data into AWS Neptune.
  • Develop and evolve graph data models representing relationships across users, sessions, devices, and security events.
  • Integrate diverse data sources including S3, relational databases, streaming services, and APIs into a cohesive graph architecture (a minimal loading sketch follows this list).
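
As a rough illustration of the kind of load step such a pipeline might perform, here is a minimal sketch using the gremlinpython client. The endpoint, vertex label, and property names are placeholders for illustration, not details from the posting.

  from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection
  from gremlin_python.process.anonymous_traversal import traversal
  from gremlin_python.process.graph_traversal import __
  from gremlin_python.process.traversal import T

  # Connect to a (hypothetical) Neptune cluster over its Gremlin endpoint.
  conn = DriverRemoteConnection("wss://your-neptune-endpoint:8182/gremlin", "g")
  g = traversal().withRemote(conn)

  # Idempotent upsert: create the user vertex only if it does not already exist.
  g.V("user-123").fold().coalesce(
      __.unfold(),
      __.addV("user").property(T.id, "user-123").property("email", "alice@example.com"),
  ).next()

  conn.close()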

Python SQL AWS ETL

20 jobs similar to Data Engineer, Graph Analytics & Machine Learning

Jobs ranked by similarity.

US

  • Query and analyze complex graph data in AWS Neptune using Gremlin (see the query sketch after this list)
  • Develop and maintain Python-based scripts for data extraction, cleaning, transformation, and exploratory analysis
  • Perform data validation, profiling, and consistency checks across multiple data sources and entity types
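
For flavor, here is a small Gremlin-in-Python query of the sort this role might run against Neptune; the labels ("device", "security_event") and the edge name are illustrative assumptions, not taken from the listing.

  from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection
  from gremlin_python.process.anonymous_traversal import traversal
  from gremlin_python.process.graph_traversal import __
  from gremlin_python.process.traversal import Order

  conn = DriverRemoteConnection("wss://your-neptune-endpoint:8182/gremlin", "g")
  g = traversal().withRemote(conn)

  # Top 10 devices by attached security events -- a simple profiling/consistency check.
  noisiest = (
      g.V().hasLabel("device")
       .project("device_id", "event_count")
       .by("device_id")
       .by(__.out("generated").hasLabel("security_event").count())
       .order().by(__.select("event_count"), Order.desc)
       .limit(10)
       .toList()
  )
  print(noisiest)
  conn.close()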

Keeper Security is transforming cybersecurity for organizations around the world with next-generation privileged access management. Keeper deploys in minutes, not months, and seamlessly integrates with any tech stack to prevent breaches, reduce help desk costs and ensure compliance.

$96,050–$113,000/yr
US

  • Creating and maintaining optimal data pipeline architecture.
  • Assembling large, complex data sets that meet functional & non-functional business requirements.
  • Building the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources using relevant technologies.

Mercer Advisors works with families to help them amplify and simplify their financial lives through integrated financial planning, investment management, tax, estate, and insurance services. They serve over 31,300 families in more than 90 cities across the U.S. and are ranked the #1 RIA Firm in the nation by Barron’s.

$120,000–$150,000/yr
US

  • Support our managed analytics service clients through their data-driven journey and deliver measurable business value through data modeling, API integration, SQL scripting, and data pipeline development.
  • Bridge the important gap between data applications and insightful business reports.
  • Participate in building our data platform from the ground up by exploring new technologies & vendors within our cloud-first environment.

DataDrive is a fast-growing managed analytics service provider that delivers modern cloud analytics data platforms to data-driven organizations while also supporting ongoing training, adoption, and growth of its clients' data cultures. DataDrive offers a unique team-oriented environment where you can develop your skills and work directly with some of the most talented analytics professionals in the business.

$110,572–$145,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis (a small load example follows this list)
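
One common concrete step behind bullets like these is copying staged files from S3 into Redshift. The sketch below assumes a psycopg2 connection and an IAM role with S3 read access; the cluster, table, bucket, and role names are all placeholders.

  import os
  import psycopg2

  # Placeholder connection details; a real pipeline would read these from Secrets Manager.
  conn = psycopg2.connect(
      host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
      port=5439,
      dbname="analytics",
      user="etl_user",
      password=os.environ["REDSHIFT_PASSWORD"],
  )

  copy_sql = """
      COPY analytics.page_views
      FROM 's3://example-bucket/page_views/dt=2024-06-01/'
      IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
      FORMAT AS PARQUET;
  """

  # psycopg2 commits on a clean exit from the connection context manager.
  with conn, conn.cursor() as cur:
      cur.execute(copy_sql)
  conn.close()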

ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries, and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

US Unlimited PTO

  • Partner with clients and implementation teams to understand data distribution requirements.
  • Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
  • Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.

Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, they're tackling big challenges in an industry that’s ready for change with a bold, curious, and collaborative team.

Europe

  • Design, build, and maintain scalable, high-quality data pipelines.
  • Implement robust data ingestion, transformation, and storage using cloud-based technologies.
  • Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.

CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world, they partner with more than 1,000 clients and foster a diverse, inclusive, and safe work environment.

$115,000–$160,000/yr
US

As a key member of our Data Engineering team, you will:

  • Collaborate with Data Science, Reporting, Analytics, and other engineering teams to build data pipelines, infrastructure, and tooling to support business initiatives.
  • Oversee the design and maintenance of data pipelines and contribute to the continual enhancement of the data engineering architecture.
  • Collaborate with the team to meet performance, scalability, and reliability goals.

PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences.

Brazil Canada US Latin America

  • Work alongside Caylent’s Architects, Engineering Managers, and Engineers to deliver AWS solutions.
  • Build solutions defined in project backlogs, writing production-ready, well-tested, and documented code across cloud environments.
  • Participate in Agile ceremonies such as daily standups, sprint planning, retrospectives, and demos.

Caylent is a cloud native services company that helps organizations bring the best out of their people and technology using Amazon Web Services (AWS). They are a global, fully remote company with employees in Canada, the United States, and Latin America, fostering a community of technological curiosity.

US Unlimited PTO

  • Develop and maintain data pipelines and ETL processes using Python in AWS environments
  • Write and deploy AWS Lambda functions for data processing tasks (sketched after this list)
  • Collaborate with team members using Git for version control and code collaboration
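
A minimal sketch of the Lambda-plus-S3 pattern these bullets describe; the bucket layout, key prefixes, and cleaning step are assumptions for illustration only.

  import json
  import boto3

  s3 = boto3.client("s3")

  def handler(event, context):
      """Triggered by an S3 put event; reads the new object and writes a cleaned copy."""
      record = event["Records"][0]["s3"]
      bucket = record["bucket"]["name"]
      key = record["object"]["key"]

      body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
      rows = [line.strip().lower() for line in body.splitlines() if line.strip()]

      # Write the processed file under a clean/ prefix alongside the raw data.
      s3.put_object(
          Bucket=bucket,
          Key=key.replace("raw/", "clean/", 1),
          Body="\n".join(rows).encode("utf-8"),
      )
      return {"statusCode": 200, "body": json.dumps({"rows_processed": len(rows)})}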

540 is a forward-thinking company that the government turns to in order to #getshitdone. They break down barriers, build impactful technology, and solve mission-critical problems.

$115,000–$145,000/yr
US

  • Collaborate with business leaders, engineers, and product managers to understand data needs.
  • Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, as well as cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

NBCUniversal is one of the world's leading media and entertainment companies, creating world-class content that it distributes across its portfolio of film, television, and streaming and brings to life through its global theme park destinations, consumer products, and experiences. The company champions an inclusive culture and strives to attract and develop a talented workforce to create and deliver a wide range of content reflecting our world.

India

  • Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
  • Develop and optimize data models in Snowflake or similar platforms.
  • Implement ETL/ELT processes using Python and modern data tools (see the sketch after this list).
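
A small ELT sketch along these lines, using the snowflake-connector-python package; the warehouse, stage, and table names are invented for the example.

  import os
  import snowflake.connector

  conn = snowflake.connector.connect(
      account=os.environ["SNOWFLAKE_ACCOUNT"],
      user=os.environ["SNOWFLAKE_USER"],
      password=os.environ["SNOWFLAKE_PASSWORD"],
      warehouse="ETL_WH",
      database="ANALYTICS",
  )

  with conn.cursor() as cur:
      # Load staged files into a raw table, then derive a reporting table (ELT style).
      cur.execute(
          "COPY INTO staging.orders_raw FROM @staging.orders_stage "
          "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
      )
      cur.execute("""
          CREATE OR REPLACE TABLE reporting.daily_orders AS
          SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
          FROM staging.orders_raw
          GROUP BY order_date
      """)
  conn.close()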

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates and share this shortlist directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by the hiring company's internal team.

US

  • Build and maintain scalable data pipelines using Snowflake OpenFlow and related Snowflake-native tools.
  • Develop and maintain Snowflake semantic views that support analytics and reporting needs.
  • Deliver clean, governed data sets for Sigma dashboards and embedded analytics use cases.

They are building the next-generation analytics stack centered on Snowflake (AWS) and Sigma. They value diverse perspectives and innovation.

India

  • Design, develop, and maintain scalable data pipelines and data warehouses.
  • Develop ETL/ELT processes using Python and modern data tools.
  • Ensure data quality, reliability, and performance across systems.

3Pillar Global is dedicated to engineering solutions that challenge conventional norms. They are an elite team of visionaries that actively shapes the tech landscape for their clients and sets global standards along the way.

Europe

  • Design and maintain scalable data pipelines.
  • Structure, transform, and optimize data in Snowflake.
  • Implement multi-source ETL/ELT flows (ERP, APIs, files).

QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. They help customers in various industries rapidly adapt to change and innovate for competitive advantage.

Europe

  • Enable efficient consumption of domain data as a product by delivering and promoting strategically designed actionable datasets and data models
  • Build, maintain, and improve rock-solid data pipelines using a broad range of technologies like AWS Redshift, Trino, Spark, Airflow, and Kafka streaming for real-time processing (a bare-bones Airflow example follows this list)
  • Support teams without data engineers in building decentralised data solutions and product integrations, for example, around DynamoDB
  • Act as a data ambassador, promoting the value of data and our data platform among engineering teams and enabling cooperation
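
For a sense of what pipelines orchestrated with Airflow can look like in code, here is a bare-bones daily DAG; the task bodies, schedule, and names are placeholders, not OLX's actual setup.

  from datetime import datetime
  from airflow import DAG
  from airflow.operators.python import PythonOperator

  def extract_events(**_):
      print("pull yesterday's events from the source system to S3")

  def load_redshift(**_):
      print("COPY the staged files from S3 into Redshift")

  # A minimal daily pipeline with two dependent tasks.
  with DAG(
      dag_id="events_daily",
      start_date=datetime(2024, 1, 1),
      schedule="@daily",
      catchup=False,
  ) as dag:
      extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
      load = PythonOperator(task_id="load_redshift", python_callable=load_redshift)
      extract >> load  # load runs only after extraction succeeds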

OLX operates consumer brands that facilitate trade to build a more sustainable world. They have colleagues around the world who serve millions of people every month.

$145,290–$185,000/yr
Unlimited PTO

  • Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large data.
  • Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

$171,000–$220,000/yr
US Unlimited PTO

  • Design, implement, and maintain robust, automated data pipelines.
  • Model and optimize data in Snowflake to support analytics.
  • Ensure data reliability through automated quality checks, monitoring, observability, and lineage visibility (a minimal check sketch follows this list).
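
Automated quality checks like those mentioned above often boil down to assertion queries run after each load. A minimal sketch, assuming a Snowflake connection and invented table names:

  # Each check maps a name to a query whose result must be zero for the check to pass.
  CHECKS = {
      "no_null_order_dates": "SELECT COUNT(*) FROM reporting.daily_orders WHERE order_date IS NULL",
      "no_negative_revenue": "SELECT COUNT(*) FROM reporting.daily_orders WHERE revenue < 0",
  }

  def run_checks(conn):
      """Return the names of failed checks; an empty list means the load is healthy."""
      failures = []
      with conn.cursor() as cur:
          for name, sql in CHECKS.items():
              cur.execute(sql)
              violations = cur.fetchone()[0]
              if violations:
                  failures.append(name)
      return failures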

Acquisition.com focuses on acquiring and growing businesses. They foster a lean, high-ownership environment.

Data Engineer

Egen
$124,800–$145,600/yr

  • Migrate data and analytics workloads from BigQuery to Snowflake (see the sketch after this list)
  • Develop and optimize ETL/ELT pipelines using Python and SQL
  • Build analytics-ready datasets for reporting and dashboards
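
One plausible shape for such a migration step: pull a BigQuery table into a DataFrame, then land it in Snowflake with write_pandas. The project, dataset, table, and credential handling are placeholders.

  import os
  from google.cloud import bigquery
  import snowflake.connector
  from snowflake.connector.pandas_tools import write_pandas

  # Read the source table from BigQuery into a pandas DataFrame.
  bq = bigquery.Client(project="example-project")
  df = bq.query("SELECT * FROM analytics.orders WHERE order_date >= '2024-01-01'").to_dataframe()

  # Land it in Snowflake, creating the target table if it does not exist yet.
  sf = snowflake.connector.connect(
      account=os.environ["SNOWFLAKE_ACCOUNT"],
      user=os.environ["SNOWFLAKE_USER"],
      password=os.environ["SNOWFLAKE_PASSWORD"],
      warehouse="MIGRATION_WH",
      database="ANALYTICS",
      schema="PUBLIC",
  )
  write_pandas(sf, df, table_name="ORDERS", auto_create_table=True)
  sf.close()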

Egen is a fast-growing and entrepreneurial company with a data-first mindset. They bring together the best engineering talent working with the most advanced technology platforms to help clients drive action and impact through data and insights.

$170,000–$200,000/yr
US 12w maternity

  • Architect, design, and lead the implementation of highly complex, scalable, and resilient data solutions in the cloud.
  • Quickly build subject matter expertise in a specific business area and data domain.
  • Support defining and executing the overarching strategy for the analytics engineering function.

Huntress is a fully remote, global team of passionate experts and ethical badasses on a mission to break down the barriers to cybersecurity. Founded in 2015 by former NSA cyber operators, Huntress protects all businesses with enterprise-grade, fully owned, and managed cybersecurity products.

$67,000–$157,000/yr
US 4w PTO

  • Design, develop, and optimize data pipelines and ETL processes to ensure high-quality data is available for analysis.
  • Analyze complex datasets to identify trends, patterns, and actionable insights that drive business performance.
  • Implement data quality checks and governance best practices to ensure data accuracy and reliability.

Modeling Data Solutions is seeking an experienced data analytics engineer to join its personal lines property team. This is an exciting opportunity to join the US Data Science Infrastructure department and help create cutting-edge pricing programs.