Source Job

$120,000–$150,000/yr
US Unlimited PTO

  • Help build scalable data solutions and streamline data ingestion.
  • Maintain high-quality databases that support our scientific and operational teams.
  • Optimize our data infrastructure to ensure efficient data access.

SQL Python AWS GCP DevOps

20 jobs similar to Data Infrastructure Engineer

Jobs ranked by similarity.

Europe

  • Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
  • Increase the robustness of existing production pipelines, identify bottlenecks, and set up robust monitoring, testing processes, and documentation templates.
  • Build custom applications and integrations to automate manual tasks related to customer operations, helping Product Operations / Support / SRE in their day-to-day activities.

Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.

US Canada

  • Own and evolve our data infrastructure, including pipelines into our data warehouse.
  • Manage and improve cloud infrastructure and DevOps workflows.
  • Ensure platform reliability so product and design teams aren't pulled into backend or operational firefighting.

Meridio is a remote-first company on a mission to make health benefits for small businesses simple, affordable, and accessible. As they scale smart, they’re focused on building systems that reduce complexity instead of adding it.

  • Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
  • Implement data quality checks, monitoring, and validation processes.
  • Automate manual processes into centralized and scalable solutions.

Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2,000 colleagues worldwide, traded on Nasdaq as part of Informa PLC.

$92,686–$125,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis.

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.

$140,000–$160,000/yr
US 4w PTO

  • Own and evolve the data infrastructure that powers Clever's core data products.
  • Maintain and improve data pipeline reliability, monitoring and resolving pipeline failures.
  • Design and implement ingestion for new operational data sources that support Clever's speed-to-match initiative.

Clever Real Estate is a venture-backed technology company aiming to revolutionize real estate transactions. They have built a leading online education platform helping consumers save money and have earned a 4.9 TrustPilot rating with over 3,800 reviews.

$120,000–$140,000/yr
US

  • Design, build, and scale modern data platforms.
  • Lead the development of robust data pipelines and optimize data architecture.
  • Translate complex requirements into scalable data solutions.

JBS is an equal opportunity employer that values its employees. They are committed to hiring individuals authorized for employment in the United States on a W2 basis.

North America

  • Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines.
  • Architect and implement robust data infrastructure capable of handling high-volume data ingestion and processing.
  • Implement automated data quality checks, validation rules, and monitoring frameworks.

ShyftLabs is a data product company founded in early 2020 that works with Fortune 500 companies. They deliver digital solutions that help accelerate business growth across various industries through innovation, and they value strong business awareness.

$118,000–$148,000/yr
US

  • Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning.
  • Partner cross-functionally with analytics, product, engineering, and operations to deliver high-quality data solutions that drive measurable business impact.
  • Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response.

Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers and risk-takers who know the value of peace of mind in an unpredictable world.

$190,000–$280,500/yr
US Canada

  • Identify structural weaknesses and eliminate operational fragility.
  • Define clear ingestion, validation, and testing standards across the platform.
  • Drive ambiguous initiatives from concept to production-ready outcomes.

Life360's mission is to keep people close to the ones they love. By continuing to innovate and deliver for their customers, they have become a household name and the must-have mobile-based membership for families. Life360 has more than 500 (and growing!) remote-first employees.

Global Unlimited PTO

  • Lead infrastructure initiatives across the engineering organization.
  • Design technical quality bar and architectural standards.
  • Build platforms and AI-enabled systems for multiple teams.

Fieldguide is automating and streamlining the work of assurance and audit practitioners, specifically within cybersecurity, privacy, and financial audit, building software for the people who enable trust between businesses. They are based in San Francisco, CA, but built as a remote-first company with an inclusive, driven, humble, and supportive team.

US

  • Design, develop and implement large scale, high-volume, high-performance data infrastructure and pipelines.
  • Build and implement ETL frameworks to improve code quality and reliability.
  • Guide and mentor other Data Engineers as a technical owner of parts of the data platform.

Jobgether is a platform that connects job seekers with companies. They use AI-powered matching to ensure applications are reviewed quickly and fairly.

Europe

  • Design, deploy, and manage cloud infrastructure.
  • Build and maintain ETL pipelines.
  • Develop and manage APIs, databases, and middleware.

The Starknet Foundation stewards Starknet, a permissionless validity rollup scaling blockchains. They pioneered ZK-STARK technology and are entering a new era settling on both Bitcoin and Ethereum, aiming to build a unified execution layer for secure assets.

$108,400–$135,500/yr
US North America

  • Design, develop, and maintain scalable data pipelines using cloud data services.
  • Serve as a technical leader, defining data engineering standards and best practices.
  • Lead the design and implementation of optimized data models in our cloud data warehouse.

Constant Contact empowers people by giving them the help and tools they need to grow online. They are energized by new challenges and possibilities, and they celebrate diversity and inclusion with programs in place to bring people together.

$149,000–$193,500/yr
US

  • Design and implement scalable, high-throughput data ingestion systems.
  • Build and evolve a centralized data lake using Apache Iceberg.
  • Provide technical leadership through mentorship, code reviews, and design discussions.

Coupa provides a total spend management platform for businesses, which uses community-generated AI to multiply margins. They have a collaborative culture driven by transparency, openness, and a shared commitment to excellence, and are expanding their impact across the globe.

$140,000–$180,000/yr
US Unlimited PTO

  • Build and operate data services driving our applications and APIs.
  • Collaborate with team members and across Engineering to iteratively prototype and develop new functionality.
  • Partner with product managers and other Zusers.

Zus is a shared health data platform designed to accelerate healthcare data interoperability by providing easy-to-use patient data via API, embedded components, and direct EHR integrations. Founded in 2021, it partners with HIEs and other data networks to aggregate patient clinical history and then translates that history into user-friendly information at the point of care.

US

  • Experience with the integration of data from multiple data sources.
  • Experience with various database technologies such as SQL Server, Redshift, Postgres, and RDS.
  • Experience designing, building, and maintaining data pipelines.

Bluelight Consulting is a leading software consultancy dedicated to designing and developing innovative technology that enhances users' lives. With a presence across the United States and Central/South America, Bluelight is in an exciting phase of expansion, continually seeking exceptional talent to join its dynamic and diverse community.

UK

  • Manage and mentor a high-performing team, fostering a culture of technical excellence.
  • Define the Data Engineering team vision, balancing immediate business needs with a long-term shift towards a self-service data mesh architecture.
  • Oversee the development of core data pipelines and platform tools, ensuring high performance for ingestion services.

UW provides utilities all in one place, with one bill covering energy, broadband, mobile, and insurance, targeting savings for customers. They are aiming to double in size and are looking for people to help them achieve this goal through innovation and impact.

Global

  • Design, implement, and maintain scalable, high-performance data architectures connecting relational and non-relational systems.
  • Manage end-to-end data pipelines, ensuring seamless ingestion from scrapers to AI/ML workflows.
  • Audit and optimize existing workflows for efficiency, accuracy, and flexibility.

Jobgether is a pioneering HR Tech startup, operating entirely remotely, and leading the revolution in the world of work. As the largest job search engine designed exclusively for remote workers, its mission is to empower individuals to discover opportunities that align seamlessly with their unique lifestyles.

$89,440–$94,380/yr
US

  • Design, build, and maintain scalable data pipelines.
  • Develop and optimize ETL/ELT processes using cloud data technologies.
  • Partner with teams to understand data requirements and improve data capture strategies.

Blueprint is a technology solutions firm with a strong presence across the United States, solving complicated problems for their clients. They are bold, smart, agile, and fun, and believe in unique perspectives, building teams of people with diverse skillsets and backgrounds.

Europe

  • Design and maintain scalable data pipelines.
  • Structure, transform, and optimize data in Snowflake.
  • Implement multi-source ETL/ELT flows (ERP, APIs, files).

QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. They help customers in various industries rapidly adapt to change and innovate for competitive advantage.