Source Job

US

  • Design and manage a modular data layer architecture.
  • Develop and optimize ETL pipelines for real-time and batch processing.
  • Implement metadata management solutions to improve lineage tracking.

Data Governance · ArcGIS · PostgreSQL · SQL

20 jobs similar to DevOps Engineer

Jobs ranked by similarity.

$65,000–$85,000/yr
US

  • Working with customer resources to support GIS data maintenance for Motorola Solutions Inc. products.
  • Installing ESRI software, creating/administering models and scripts, and manipulating data.
  • Providing first-line GIS data support as a liaison to deployment, support, and development teams.

Motorola Solutions is a global company focused on providing critical communication, video security, and command center technologies for public safety agencies and enterprises. They foster a close-knit community and are committed to keeping people safer by coordinating safety efforts across communities, schools, hospitals, and businesses.

$96,300–$154,400/yr
US

  • Administer and maintain the Atlan Data Catalog, including user onboarding, role/permission configuration, data asset management, and integration with source systems.
  • Design, implement, and continuously evolve Kapitus’s Atlan metamodel (custom asset types, custom properties, relationship models, classification taxonomy, business glossary structure, and propagation/lineage rules).
  • Partner with Data Engineering, Analytics, Product, and business data stewards to ingest and curate technical and business metadata, ensuring accuracy, completeness, and alignment with governance standards.

Kapitus is a reliable and respected name in small business financing. They provide small businesses with the financing they need by being both a direct lender and a marketplace built with a trusted network of lending partners. The company is fast-paced, and teammates need to be self-directed.

Mexico

  • Design, build, and maintain highly scalable, reliable, and efficient ETL/ELT pipelines.
  • Ingest data from a multitude of sources and transform raw data into clean, structured, and AI/ML-ready formats.
  • Work closely with data scientists, machine learning engineers, and business analysts to understand their data needs.

Valtech exists to unlock a better way to experience the world by blending crafts, categories, and cultures, helping brands create new value in an increasingly digital world.

Up to $200,000/yr
North America · Latin America

  • Architect and maintain robust data pipelines to transform diverse data inputs.
  • Integrate data from various sources into a unified platform.
  • Build APIs with AI assistance to enable secure access to consolidated insights.

Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.

  • Help analyst users onboard and use the system to ensure their success.
  • Integrate new data sources for new and existing customers to ensure successful onboarding.
  • Take on work outside formal sprints when it is needed to support demos and pilots that require rapid product changes.

Danti is on a mission to transform how people interact with the vast amounts of data being generated about our physical world. They are building an AI-powered knowledge engine designed to make this data accessible and usable for everyone. Danti fosters a fun, collaborative, hybrid work environment.

North America · Asia · Oceania · Unlimited PTO

  • Design, implement, and maintain distributed ingestion pipelines for structured and unstructured data.
  • Build scalable ETL/ELT workflows to transform, validate, and enrich datasets for AI/ML model training and analytics.
  • Support preprocessing of unstructured assets for training pipelines, including format conversion, normalization, augmentation, and metadata extraction (a minimal sketch follows this list).
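
For the preprocessing bullet above, here is a minimal sketch of what one such step could look like, assuming Pillow for image handling; the target size, output format, directory names, and metadata fields are illustrative assumptions, not details from the listing.

```python
# A hedged sketch of one preprocessing step for image assets ahead of training.
# Assumes Pillow is installed; the target size, output format, directory names,
# and metadata fields are illustrative choices, not details from the listing.
import json
from pathlib import Path

from PIL import Image

TARGET_SIZE = (512, 512)  # assumed model input size


def preprocess_image(src: Path, out_dir: Path) -> dict:
    """Convert one asset to RGB PNG at a fixed size and extract basic metadata."""
    out_dir.mkdir(parents=True, exist_ok=True)
    with Image.open(src) as img:
        meta = {
            "source": str(src),
            "format": img.format,
            "width": img.width,
            "height": img.height,
        }
        img.convert("RGB").resize(TARGET_SIZE).save(out_dir / f"{src.stem}.png")
    (out_dir / f"{src.stem}.json").write_text(json.dumps(meta))
    return meta


if __name__ == "__main__":
    for path in Path("raw_assets").iterdir():  # hypothetical input folder
        if path.suffix.lower() in {".jpg", ".jpeg", ".png", ".webp"}:
            preprocess_image(path, Path("processed_assets"))
```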

Meshy is a leading 3D generative AI company that transforms content creation by turning text and images into 3D models. They have a global team distributed across North America, Asia, and Oceania and are backed by venture capital firms like Sequoia and GGV, with $52 million in funding.

$86,000–$138,000/yr
US

  • Design and implement ETL/ELT pipelines to integrate data from multiple sources into secure environments.
  • Develop dashboards and reports using BI tools.
  • Integrate security controls into all data engineering processes.

Peraton is a national security company that drives missions of consequence. They deliver solutions and technologies to protect the nation and allies, operating across various domains and serving government agencies and the U.S. armed forces.

$87,841–$109,801/yr
Unlimited PTO

  • Bridge the gap between application engineering and data infrastructure.
  • Own the optimization of high-volume data pipelines and tune operational databases.
  • Define how to ingest massive bursts of medical information, model it for transactional locking, and transform it for analytical querying.

Synthesis Health is a mission- and values-driven company with tremendous dedication to its customers. The 100% remote team is dedicated to revolutionizing healthcare through innovation, collaboration, and commitment to its core values and behaviors.

Europe

  • Own Neko’s Data Engineering strategy, ensuring long-term leadership in preventive health.
  • Drive innovation in data engineering technologies and practices, balancing current and future needs.
  • Build a platform supporting responsible AI — fairness, privacy, explainability, and compliance.

Neko Health is a Swedish healthcare technology company focused on shifting healthcare from reactive treatment toward preventative health and early detection. They have over 500 employees with offices in Stockholm, London and Manchester and are focused on creating a flexible work environment.

$125,000–$150,000/yr
US

  • Design, implement, and optimize robust and scalable data pipelines using SQL, Python, and cloud-based ETL tools such as Databricks (see the sketch after this list).
  • Enhance our overarching data architecture strategy, assisting in decisions related to data storage, consumption, integration, and management within cloud environments.
  • Partner with data scientists, BI teams, and other engineering teams to understand and translate complex data requirements into actionable engineering solutions.
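
As referenced in the first bullet, this is a minimal sketch of a Databricks-style PySpark pipeline step; the landing path, key column, and target table are placeholders, and a real pipeline would add incremental loads, error handling, and tests.

```python
# A minimal sketch of a Databricks-style PySpark pipeline step, assuming a
# SparkSession with Delta Lake available. The landing path, key column, and
# target table are placeholders, not details from the listing.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw CSV files from a (placeholder) landing zone.
raw = spark.read.option("header", True).csv("/mnt/raw/events/*.csv")

# Basic cleanup: deduplicate on a key, parse timestamps, drop null keys.
clean = (
    raw.dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_id").isNotNull())
)

# Persist to a governed Delta table for downstream BI and data science use.
clean.write.format("delta").mode("overwrite").saveAsTable("analytics.events_clean")
```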

The New York Blood Center is a medical organization seeking a Senior Data Engineer to join its team.

Brazil · Colombia · Mexico · Argentina · 2w PTO

  • Build and optimize Sauce's lakehouse architecture using Azure Databricks and Unity Catalog for data governance.
  • Create and maintain data quality tests and improve existing alerting setups (a minimal check is sketched after this list).
  • Own the data warehouse by connecting data sources and maintaining the platform and architecture in coordination with R&D infrastructure and operations teams.
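
The data quality bullet above can be made concrete with a small thresholded check. This is a sketch under assumptions: a PySpark session on Databricks, a placeholder Unity Catalog table and column, and a print statement standing in for whatever alerting setup is actually in place.

```python
# A minimal data-quality check, assuming a PySpark session on Databricks.
# The table name, column, threshold, and the print-based "alert" are
# placeholders for the real alerting setup.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()


def check_null_rate(table: str, column: str, max_null_rate: float = 0.01) -> bool:
    """Fail the check if the null rate of `column` exceeds the threshold."""
    df = spark.table(table)
    total = df.count()
    nulls = df.filter(F.col(column).isNull()).count()
    rate = nulls / total if total else 0.0
    if rate > max_null_rate:
        # In practice this would page an alerting channel; here we just print.
        print(f"ALERT: {table}.{column} null rate {rate:.2%} > {max_null_rate:.2%}")
        return False
    return True


check_null_rate("main.delivery.orders", "delivered_at")  # placeholder table/column
```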

Sauce is a premier restaurant technology platform that helps businesses grow with its Commission-Free Delivery & Pickup structure and proprietary delivery optimization technology.

Global

  • Transform satellite imagery and geospatial data into actionable insights.
  • Support Environmental & Social Impact Assessments (EIA) and ensure compliance with environmental regulations.
  • Enable end‑to‑end geospatial evidence for permitting, impact assessment, compliance, restoration, and climate resilience.

ERM is the world’s largest advisory firm focused solely on sustainability, offering unparalleled expertise across business and finance. Their diverse global team of experts works with the world’s leading organizations to help them set clear sustainability targets, measure progress, and operationalize strategy through deep implementation and business transformation.

Global

  • Design, build, and operate scheduled and event-driven data pipelines for simulation outputs, telemetry, logs, dashboards, and scenario metadata.
  • Build and operate data storage systems (structured and semi-structured) optimized for scale, versioning, and replay.
  • Support analytics, reporting, and ML workflows by exposing clean, well-documented datasets and APIs.

Onebrief is collaboration and AI-powered workflow software designed specifically for military staffs. It transforms staff work, making staffs faster, smarter, and more efficient. The company is all-remote, with employees working alongside customers; it was founded in 2019 and has raised $320M+.

$110,572–$145,000/yr
US · Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis (see the sketch after this list).
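
The last bullet names several AWS services; one common pattern that connects a few of them is an S3-triggered Lambda that loads new objects into Redshift through the Redshift Data API. The sketch below assumes boto3, and the cluster, database, user, IAM role, and table names are placeholders rather than details from the listing.

```python
# A hedged sketch of an S3-triggered AWS Lambda that loads new objects into
# Amazon Redshift through the Redshift Data API (boto3). The cluster, database,
# user, IAM role, and table names are placeholders, not details from the listing.
import boto3

redshift = boto3.client("redshift-data")


def handler(event, context):
    """For each new S3 object in the trigger event, issue a Redshift COPY."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        redshift.execute_statement(
            ClusterIdentifier="analytics-cluster",  # placeholder
            Database="warehouse",                   # placeholder
            DbUser="etl_user",                      # placeholder
            Sql=f"""
                COPY staging.fares
                FROM 's3://{bucket}/{key}'
                IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
                FORMAT AS PARQUET;
            """,
        )
```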

ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries, and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

$130,000–$176,000/yr
US · Unlimited PTO

  • Design, develop, and implement end-to-end data pipelines to support data collection and transformation.
  • Lead the architecture and development of scalable and maintainable data solutions.
  • Collaborate with data scientists and analysts to provide clean and accessible data.

DexCare optimizes time in healthcare, streamlining patient access, reducing waits, and enhancing overall experiences.

Global

  • Design and implement event-driven pipelines using AWS services to ingest data from external sources in real-time.
  • Build and maintain streaming data pipelines between HubSpot CRM and PostgreSQL, handling webhook events and API polling.
  • Implement schema validation, data type checking, and automated quality gates at the ingestion layer to prevent bad data from entering the system (sketched after this list).
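
Here is a minimal sketch of the ingestion-layer quality gate described in the last bullet, assuming pydantic for schema and type validation and psycopg2 for PostgreSQL; the event shape, table, and column names are invented for illustration, not taken from PropHero's actual systems.

```python
# A minimal sketch of an ingestion-layer quality gate, assuming pydantic for
# schema/type validation and psycopg2 for PostgreSQL. The event shape, table,
# and column names are invented for illustration.
from datetime import datetime

import psycopg2
from pydantic import BaseModel, ValidationError


class ContactEvent(BaseModel):
    """Expected shape of one webhook event; bad or missing fields fail fast."""
    contact_id: int
    email: str
    updated_at: datetime


def ingest_event(raw: dict, conn) -> bool:
    """Validate one event at the ingestion layer; reject bad records up front."""
    try:
        event = ContactEvent(**raw)
    except ValidationError as err:
        print(f"rejected event: {err}")  # in practice: route to a dead-letter queue
        return False

    with conn.cursor() as cur:
        cur.execute(
            """
            INSERT INTO hubspot_contacts (contact_id, email, updated_at)
            VALUES (%s, %s, %s)
            ON CONFLICT (contact_id) DO UPDATE
              SET email = EXCLUDED.email, updated_at = EXCLUDED.updated_at
            """,
            (event.contact_id, event.email, event.updated_at),
        )
    conn.commit()
    return True


if __name__ == "__main__":
    conn = psycopg2.connect("dbname=crm")  # placeholder DSN
    ingest_event(
        {"contact_id": 42, "email": "a@example.com", "updated_at": "2024-01-01T00:00:00Z"},
        conn,
    )
```

Validating at the ingestion boundary keeps malformed webhook payloads out of PostgreSQL rather than cleaning them up downstream.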

PropHero is a property analytics platform provider. They reached €30M in revenue in 4 years with 25% QoQ growth, are already profitable, and offer a modern, cloud-native AWS data platform.

US

  • Design and engineer robust data pipelines using technologies like Databricks, Azure Data Factory, Apache Spark, and Delta Lake.
  • Craft healthcare data solutions: processing massive healthcare datasets, optimizing performance, and ensuring data is accurate and secure.
  • Communicate technical concepts to non-technical stakeholders, manage multiple priorities, and meet deadlines.

Gentiva offers compassionate care in the comfort of patients' homes as a national leader in hospice, palliative, home health care, and advanced illness management. They have nearly 600 locations and thousands of clinicians across 38 states, offering rewarding careers in a collaborative environment.

$67,000–$157,000/yr
US · 4w PTO

  • Design, develop, and optimize data pipelines and ETL processes to ensure high-quality data is available for analysis.
  • Analyze complex datasets to identify trends, patterns, and actionable insights that drive business performance.
  • Implement data quality checks and governance best practices to ensure data accuracy and reliability.

Modeling Data Solutions is seeking an experienced data analytics engineer to join its personal lines property team. This is an exciting opportunity to join the US Data Science Infrastructure department and help create cutting-edge pricing programs.

Latin America

  • Build robust data pipelines at scale.
  • Design and implement data schemas.
  • Collaborate with the Analytics/Data Science team to structure and house data.

Goods & Services is a product design and engineering company that solves mission-critical challenges for some of the world’s largest enterprises.

$175,000–$225,000/yr
US

  • Lead requirements-gathering efforts for the product and for advanced analytics.
  • Work with analytics, data science, and wider engineering teams to help automate data analysis and visualization.
  • Build a scalable technology platform to support a growing business and deliver high-quality code to production.

Achieve is a leading digital personal finance company that helps everyday people move from struggling to thriving by providing innovative, personalized financial solutions. They have over 3,000 employees in mostly hybrid and 100% remote roles across the United States, with hubs in Arizona, California, and Texas, and a culture of putting people first.