Source Job

Latin America

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements, automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.

Databricks, Python, SQL, Scala, Spark

20 jobs similar to Data Engineer

Jobs ranked by similarity.

$110,000–$135,000/yr
Canada

  • Design and implement scalable data architectures to support business needs.
  • Build and optimize data pipelines, ensuring data accessibility and security.
  • Develop and maintain data models, databases, and data lakes, with robust data governance.

Terawatt Infrastructure delivers large scale, turnkey charging solutions for companies rapidly deploying AV and EV fleets. With a growing portfolio of sites across the US, Terawatt is building the permanent transportation and logistics infrastructure of tomorrow through capital, real estate, development, and site operations solutions.

$120,000–$160,000/yr
US

  • You will join a team of talented engineers working closely with Data Scientists to build and scale our next-generation Ad EnGage data pipeline.
  • You will work with large-scale datasets (hundreds of terabytes to petabyte scale) using a modern data stack centered on AWS, Airflow, dbt, and Snowflake.
  • You’ll contribute to building reliable, high-quality data pipelines and improving the performance, scalability, and observability of our data platform.

EDO is the TV outcomes company. Their leading measurement platform connects convergent TV airings to the ad-driven consumer behaviors most predictive of future sales. They are headquartered in New York City and Los Angeles with an office space in San Francisco and recognize the benefits of hybrid working.

Europe

  • Develop engineering expertise within the Dataiku Platform to help maintain and develop system integrations, platform automations, and platform configurations.
  • Build and maintain Python and SQL data replication and data pipelines on large, often complex data sets.
  • Identify opportunities for improvement and optimization for greater scalability and delivery velocity.

Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.

Latin America

  • Design, build, and maintain data pipelines (ETL/ELT) in batch and streaming environments.
  • Develop solutions for ingesting and processing large volumes of structured, unstructured, and semi-structured data.
  • Create data products that respond to the analytical needs of the business.
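Responsibilities like the batch ETL/ELT work above usually reduce to an extract-transform-load loop. A minimal, stdlib-only Python sketch of that shape (the file format, field names, and validation rule are hypothetical stand-ins, not this employer's actual pipeline):

```python
import csv
import io

# Hypothetical raw input; real pipelines would read from object storage
# or a source database rather than an inline string.
RAW = """id,amount,currency
1,10.50,USD
2,,USD
3,99.99,EUR
"""

def extract(text):
    """Extract: parse raw CSV into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts and cast types."""
    out = []
    for row in rows:
        if not row["amount"]:
            continue  # skip incomplete records
        out.append({"id": int(row["id"]),
                    "amount": float(row["amount"]),
                    "currency": row["currency"]})
    return out

def load(rows, sink):
    """Load: append validated rows to the target store."""
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(RAW)), warehouse)  # loads 2 of 3 rows
```

The same three stages appear in streaming pipelines as well; only the trigger (per-event or micro-batch instead of scheduled) changes.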

EX Squared LATAM builds high-impact digital solutions, working with exceptional talent throughout Latin America. They foster a culture of collaboration, continuous learning, and technical excellence.

Latin America

  • Design, build, and maintain data pipelines using Snowflake, Airflow, and dbt
  • Lead architectural discussions around the modern data stack
  • Develop scalable ETL and ELT processes using Python and SQL

They are a well-funded healthcare technology company using AI and modern data infrastructure to transform how healthcare and public health decisions are made. The team is small, mission-driven, and building systems that turn raw healthcare data into actionable intelligence at scale.

Brazil

  • Design and implement data ingestion and transformation pipelines using PySpark/SparkSQL on Databricks.
  • Own data pipelines end-to-end in production: freshness, correctness, availability, and SLA adherence.
  • Build and maintain Delta Lake tables following medallion architecture patterns.
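The medallion pattern named above layers raw (bronze), cleansed (silver), and business-ready (gold) tables. A toy sketch of that flow, with plain Python lists standing in for Delta Lake tables (a real Databricks pipeline would express each layer as PySpark transformations writing Delta tables; all sample data is made up):

```python
bronze = [  # bronze: raw ingested events, stored as received
    {"order_id": "1", "amount": "30.0", "status": "paid"},
    {"order_id": "1", "amount": "30.0", "status": "paid"},   # duplicate
    {"order_id": "2", "amount": "bad", "status": "paid"},    # malformed
    {"order_id": "3", "amount": "12.5", "status": "refunded"},
]

def to_silver(rows):
    """Silver: deduplicate and enforce types, dropping bad records."""
    seen, out = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue  # deduplicate on the business key
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # quarantine-worthy record; dropped here for brevity
        seen.add(r["order_id"])
        out.append({"order_id": r["order_id"], "amount": amount,
                    "status": r["status"]})
    return out

def to_gold(rows):
    """Gold: business-level aggregate (revenue by status)."""
    totals = {}
    for r in rows:
        totals[r["status"]] = totals.get(r["status"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
```

Keeping bronze untouched is what makes the later layers reproducible: silver and gold can always be rebuilt from the raw records.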

Pismo, founded in 2016, provides a comprehensive processing platform for banking, card issuing, and financial market infrastructure. With over 500 employees across more than 10 countries and now part of Visa, they empower firms to build and launch financial products rapidly with high security and availability standards.

$84,191–$106,194/yr
Canada

  • Design and implement scalable data ingestion and transformation pipelines using Databricks and cloud platforms
  • Lead architecture decisions for modern data platforms, including Medallion Architecture and Lakehouse patterns
  • Build and maintain ETL/ELT pipelines using Python and SQL, following engineering best practices

AOT Technologies helps enterprises and governments bring their ideas to life. As a boutique consulting firm, they partner with enterprises, startups, and governments to solve complex, mission-critical challenges. Their teams are collaborative and their leadership is transparent.

Europe

  • Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
  • Increase the robustness of existing production pipelines, identify bottlenecks, and set up robust monitoring, testing processes, and documentation templates.
  • Build custom applications and integrations that automate manual customer-operations tasks, helping Product Operations / Support / SRE in their day-to-day activities.

Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.

Latin America

  • Design, build, and maintain scalable data pipelines using Python and Airflow
  • Develop and optimize ETL/ELT processes for structured and unstructured data
  • Collaborate with data science teams to support Machine Learning workflows

Oowlish is a rapidly expanding software development company in Latin America. They foster a nurturing work environment, are certified as a Great Place to Work, and provide opportunities for professional development and international impact.

$106,000–$120,000/yr
US

  • Lead the technical onboarding of partner institutions onto UDTS.
  • Design, build, and maintain scalable data pipelines and architectures.
  • Collaborate with team members to set engineering standards and guide data infrastructure strategy.

DataKind is a non-profit organization that uses data science and AI to address global challenges. They work with various sectors like health, humanitarian action, climate, economic opportunity, and education to create data-driven tools.

US (Unlimited PTO)

  • Serve as the embedded technical lead for Databricks customer engagements.
  • Own Databricks platform architecture, design decisions, and technical standards.
  • Lead delivery of complex data pipelines and analytics workloads on Databricks.

540 is a forward-thinking company that the government turns to in order to #getshitdone. They break down barriers, build impactful technology, and solve mission-critical problems.

$100,000–$140,000/yr
US

  • Design, build, and maintain scalable data pipelines for clients across industries.
  • Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
  • Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.

NuView Analytics helps companies accelerate the time to insights from their data through data analytics, diligence, and fractional data science. They are a growth-stage company looking to drive additional value from the data they are sitting on and value humility, intellectual rigor, and stewardship.

$140,000–$160,000/yr
US

  • Design and build mission-critical data pipelines with a highly scalable distributed architecture.
  • Help continually improve ongoing reporting and analysis processes, simplifying self-service support for business stakeholders.
  • Build and support reusable frameworks to ingest, integrate, and provision data.

StockX is a Detroit-based technology leader focused on the online market for sneakers, apparel, accessories, electronics, collectibles, trading cards, and more. They employ 1,000 people across offices and verification centers around the world and their platform connects buyers and sellers using dynamic pricing mechanics.

Global

  • Design and develop data ingestion pipelines; Databricks experience preferred.
  • Performance tune and optimize Databricks jobs, evaluating new features and refactoring code.
  • Collaborate with analysts, managers, architects, and senior developers to establish application framework.

Databricks is a data and AI company. They are likely a medium-to-large company, focusing on innovation and teamwork.

US

  • Own organizational-wide data architecture, defining standards and designs.
  • Design and develop data pipelines, integrations, and platform features.
  • Partner with product managers to define new data features and capabilities.

They offer a connected equipment platform for managing mixed assets. The company values quality, continuous learning, and collaboration within a dynamic team environment.

US

  • Lead and manage a team of ~6 data engineers, driving execution, performance, and career development.
  • Own Kin’s data platform, including ingestion, storage, transformation, pipeline orchestration, and governance.
  • Build and optimize scalable data pipelines and architectures using tools like Snowflake, Databricks, dbt, and Airflow.

Kin simplifies homeowners' lives with smarter insurance, expanding to meet all homeowner needs. They employ Kinfolk across 35+ states and are recognized for growth, customer satisfaction, and a focus on long-term sustainability, fostering a culture of meaningful work and real impact.

South America

  • Design, develop, and optimize scalable data pipelines using SQL and Python/PySpark.
  • Build and maintain analytics-oriented data models (e.g., Star Schema and OBTs).
  • Ensure data quality, consistency, and governance across the entire pipeline.
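The star-schema modeling mentioned above pairs a fact table with dimension tables keyed for analytical joins. A tiny sketch using Python's built-in sqlite3 (table and column names are illustrative only; a production version would live in a warehouse and be built via SQL/PySpark):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Dimension: one row per customer, with descriptive attributes.
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        name TEXT,
        country TEXT
    );
    -- Fact: one row per sale, referencing the dimension by surrogate key.
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount REAL
    );
""")
con.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Ana", "BR"), (2, "Luis", "MX")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0)])

# Analytical query: revenue by country via the dimension join.
rows = con.execute("""
    SELECT d.country, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_key)
    GROUP BY d.country ORDER BY d.country
""").fetchall()
```

An OBT (one big table) flattens this same join into a single wide table, trading storage for simpler downstream queries.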

CI&T specializes in technological transformation, uniting human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters worldwide, they have partnered with more than 1,000 clients throughout their 30-year history, highlighting that Artificial Intelligence is a key aspect of their operations.

$146,400–$175,100/yr
US

  • Architect, implement, and maintain scalable data architectures.
  • Develop, optimize, and maintain ETL processes.
  • Optimize data processing and query performance.

Blueprint Technologies is a technology solutions firm that helps organizations unlock value from existing assets by leveraging cutting-edge technology. Their teams have unique perspectives and years of experience across multiple industries. They believe in unique perspectives and build teams of people with diverse skillsets and backgrounds.

Europe, Asia

  • Create innovative solutions for handling petabytes of data with billions of rows and joins.
  • Create real-time and offline feature-generation pipelines, keeping our data infrastructure reliable and fast.
  • Develop and productionize data pipelines for our ML models in both bare-metal and cloud environments.

Kayzen is a mobile demand-side platform (DSP) dedicated to democratizing programmatic advertising. They enable leading apps, agencies, media buyers, and brands to run programmatic customer acquisition, retargeting, and brand performance campaigns through their self-serve and managed service options.