Source Job

Global

  • Deliver high-quality instruction in Data Engineering covering essential tools and technologies.
  • Manage and monitor student performance, providing individual feedback and support.
  • Facilitate hands-on labs, guiding students through practical coding exercises.

Python SQL Tableau AWS Git

20 jobs similar to Data Engineering Educator

Jobs ranked by similarity.

Global

  • Support ~30 Learners in virtual training programs.
  • Provide synchronous and asynchronous support to Learners.
  • Ensure learning, comprehension, and retention of program content.

Correlation One develops workforce skills for the AI economy. They partner with enterprises and governments to develop talent and close critical data, digital, and technology skills gaps, empowering underrepresented communities.

$120,000–$140,000/yr
US

  • Design, build, and scale modern data platforms.
  • Lead the development of robust data pipelines and optimize data architecture.
  • Translate complex requirements into scalable data solutions.

JBS is an equal opportunity employer that values its employees. They are committed to hiring individuals authorized for employment in the United States on a W2 basis.

  • Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion.
  • Implement data quality checks, monitoring, and validation processes.
  • Automate manual processes into centralized and scalable solutions.
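The "data quality checks, monitoring, and validation" bullet above describes a common pattern: a quality gate that splits a batch into clean and rejected records before loading. A minimal sketch in pure Python, where the column names and validation rules are hypothetical examples, not this employer's actual schema:

```python
# Minimal sketch of row-level data quality checks in an ETL pipeline.
# Column names and validation rules are hypothetical examples.

def validate_row(row: dict) -> list[str]:
    """Return a list of data quality violations for one record."""
    errors = []
    if not row.get("user_id"):
        errors.append("missing user_id")
    amount = row.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("amount must be a non-negative number")
    if row.get("currency") not in {"USD", "EUR", "GBP"}:
        errors.append(f"unknown currency: {row.get('currency')}")
    return errors

def run_quality_gate(rows: list[dict]):
    """Split a batch into clean rows and rejected rows with reasons."""
    clean, rejected = [], []
    for row in rows:
        errors = validate_row(row)
        if errors:
            rejected.append((row, errors))
        else:
            clean.append(row)
    return clean, rejected
```

In practice this logic usually lives in a framework (e.g. pipeline-level tests or expectations) rather than hand-rolled functions, but the shape of the check is the same.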

Informa TechTarget accelerates growth from R&D to ROI, informing and connecting technology buyers and sellers. They are a vibrant community of over 2,000 colleagues worldwide and traded on Nasdaq as part of Informa PLC.

US

  • Design, develop, and implement large-scale, high-volume, high-performance data infrastructure and pipelines.
  • Build and implement ETL frameworks to improve code quality and reliability.
  • Guide and mentor other Data Engineers as a technical owner of parts of the data platform.

Jobgether is a platform that connects job seekers with companies. They use AI-powered matching to ensure applications are reviewed quickly and fairly.

Global

  • Architect our AWS-based data warehouse and ingestion pipelines.
  • Transform high-volume simulation outputs into clean, trusted datasets.
  • Establish schema standards and data contracts with engineering.
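"Schema standards and data contracts" in the bullets above amount to an explicit agreement between producer and consumer, enforced at the pipeline boundary. A very rough sketch, with hypothetical field names and types:

```python
# Rough sketch of a data contract: a declared schema that producer
# output is checked against before it enters downstream datasets.
# The contract fields below are hypothetical examples.

CONTRACT = {
    "run_id": str,
    "timestamp": float,
    "metric_value": float,
}

def conforms(record: dict, contract: dict = CONTRACT) -> bool:
    """True if the record has exactly the contracted fields and types."""
    if set(record) != set(contract):
        return False
    return all(isinstance(record[k], t) for k, t in contract.items())
```

Real contracts typically carry more than types (nullability, ranges, ownership, versioning) and are checked by tooling in CI or at ingestion, but the core idea is this kind of boundary validation.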

Onebrief provides collaboration and AI-powered workflow software designed for military staffs, making them faster, smarter, and more efficient. Founded in 2019, the company values ownership and excellence, with a team spanning veterans and technologists; it has raised $320M+ from investors and is valued at $2.15B.

Data Engineer

ItD
US

  • Design, build, and optimize scalable data architectures that power marketing analytics and survey measurement initiatives.
  • Deliver automated, high-impact data solutions and insights that enhance decision-making across teams.
  • Build robust pipelines, dashboards, and analytical frameworks in fast-paced environments.

ItD is a consulting and software development company that blends diversity, innovation, and integrity with real business results. A woman- and minority-led firm, they reject rigid hierarchies, empowering teams to deliver excellent results.

US

  • Architect and sustain self-healing pipelines using Astronomer/Airflow to ensure 24/7 data availability.
  • Design and optimize event-driven API ingestion frameworks leveraging AWS Lambda and DLT (Data Load Tool).
  • Manage high-performance modeling within AWS Redshift, utilizing DBT to transform raw transactional data into high-fidelity business intelligence.

Odisea helps close the opportunity gap between Colombia and the United States by redefining nearshoring. They are building a passionate team of professionals committed to this purpose.

$92,686–$125,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.

US

  • Design and evolve a scalable, secure, cloud-native lakehouse platform.
  • Drive long-term platform strategy and evaluate emerging technologies.
  • Develop frameworks for ingestion, quality, lineage, metadata, and observability.

Indiana General is focused on enhancing healthcare by delivering high-quality data products that support clinical and operational outcomes. They provide a collaborative and inclusive work environment with opportunities for professional growth and development.

Europe Asia

  • Design, implement, and maintain robust, scalable data pipelines to support AI, analytics, and operational reporting
  • Own and evolve the data warehouse architecture, ensuring it meets performance, flexibility, and governance needs
  • Ensure data integrity, availability, lineage, and observability across complex pipelines

Remote People is building the infrastructure to power borderless teams. Their technology handles global payroll, benefits, taxes, and compliance, enabling businesses to compliantly hire anyone anywhere at the push of a button. They are a growing, international family.

$84,191–$106,194/yr
Canada

  • Design and implement scalable data ingestion and transformation pipelines using Databricks and cloud platforms
  • Lead architecture decisions for modern data platforms, including Medallion Architecture and Lakehouse patterns
  • Build and maintain ETL/ELT pipelines using Python and SQL, following engineering best practices

AOT Technologies helps enterprises and governments bring their ideas to life. As a boutique consulting firm, they partner with enterprises, startups, and governments to solve complex, mission-critical challenges. Their teams are collaborative and their leadership is transparent.

Europe

  • Architect, develop, and deploy robust, scalable data solutions using Azure tools.
  • Design and optimize ETL/ELT data pipelines using Python, PySpark, and SQL.
  • Build and manage modern data architectures, including data lakes and warehouses.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. The system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

Global

  • Design, implement, and maintain scalable, high-performance data architectures connecting relational and non-relational systems.
  • Manage end-to-end data pipelines, ensuring seamless ingestion from scrapers to AI/ML workflows.
  • Audit and optimize existing workflows for efficiency, accuracy, and flexibility.

Jobgether is a pioneering HR Tech startup, operating entirely remotely, and leading the revolution in the world of work. As the largest job search engine designed exclusively for remote workers, its mission is to empower individuals to discover opportunities that align seamlessly with their unique lifestyles.

$150,000–$180,000/yr
US

  • Design and implementation of reliable, maintainable, and scalable GenAI systems.
  • Serve as a subject matter expert for machine learning systems owned by the team.
  • Mentor junior and mid-level engineers through code reviews and design collaboration.

Trajector specializes in medical evidence services, guiding clients through disability benefits complexities. They are a global team of over 1,800 dedicated individuals, streamlining the path to benefits and ensuring access to rightful compensation for those with disabilities.

Europe

  • Co-lead the “Revenue” Feature Team
  • Build a strong partnership with your Product Owner counterpart
  • Guide technical and modeling choices in accordance with Accor standards

Accor Tech & Digital drives Accor's technology and digital transformation. They have 5,000 employees committed to delivering tech and digital experiences to guests, hotels, and staff across 110 countries, shaping the future of hospitality.

  • Design and implement data models and data architecture solutions.
  • Develop and maintain ELT pipelines and data integration workflows.
  • Analyze business requirements and translate them into technical specifications.

Dijital Team helps businesses with their digital transformation initiatives. They offer consulting services with a focus on cloud technologies.

Global

  • Lead the instruction of a “virtual classroom”.
  • Conduct lectures during specified dates and times.
  • Instruct learners using provided Cloud Support Specialist content.

Correlation One develops workforce skills for the AI economy. They partner with enterprises and governments to develop talent and close critical data, digital, and technology skills gaps, empowering underrepresented communities and accelerating careers.

Europe 5w PTO

  • Use Gemini and other LLMs to generate Google Apps Script/JavaScript code, connecting our systems and building bespoke internal tools.
  • Leverage Google Cloud Project environments to manage data and build structured workflows that capture clean data at the source.
  • Translate technical data into plain English narratives for our Chief People Officer and senior stakeholders through Looker Studio or Tableau dashboards.

Reach is Britain and Ireland's largest commercial news publisher, connecting with millions of people daily through various platforms. They value diverse perspectives and foster an inclusive workplace where everyone feels welcome and supported, encouraging applications from people of all backgrounds.

Latin America

  • Design and evolve scalable data pipelines and architectures.
  • Act as the primary anchor for data ingestion, transformation, and storage solutions.
  • Ensure mission-critical data is accessible and reliable.

CodeRoad provides end-to-end software development services, helping businesses scale with ideal infrastructure solutions. From staff augmentation to dedicated IT teams and general software engineering, their nearshore technology services empower businesses to thrive in an ever-evolving digital landscape.

$180,000–$200,000/yr
US

  • Lead the architecture and evolution of scalable, distributed data pipelines, ensuring high availability and performance at scale
  • Build and maintain distributed web scraping systems using tools such as Playwright, Selenium, and BeautifulSoup
  • Integrate AI and LLMs into engineering workflows for code generation, automation, and optimization

MercatorAI is building scalable data infrastructure to power high-quality, data-driven decision making at scale. As an early-stage company, the team is focused on creating robust, future-ready systems that can handle complex data ingestion, transformation, and delivery across a growing national footprint.