Source Job: Sr Data Engineer

US Unlimited PTO

  • Design, build, and maintain robust data pipelines.
  • Own and scale ETL/ELT processes using tools like dbt, BigQuery, and Python (see the sketch after this list).
  • Build modular data models that power analytics, product features, and LLM agents.
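
The dbt/BigQuery/Python stack named above maps onto a simple ELT pattern: land raw rows in the warehouse first, then transform them there with SQL. Below is a minimal sketch of that pattern using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical placeholders, not details from the posting.

```python
"""Illustrative ELT step: land raw rows in BigQuery, then transform with SQL.

All project, dataset, and table names are hypothetical placeholders.
"""
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Step 1 (EL): append newly extracted rows into a raw staging table.
load_job = client.load_table_from_json(
    [{"event_id": "e1", "user_id": "u42", "event_type": "signup"}],
    "example-project.raw.events",  # hypothetical destination table
)
load_job.result()  # wait for the load to finish

# Step 2 (T): build a modeled table from the raw layer with plain SQL.
transform_sql = """
CREATE OR REPLACE TABLE `example-project.analytics.events_by_user` AS
SELECT user_id, event_type, COUNT(*) AS events
FROM `example-project.raw.events`
GROUP BY user_id, event_type
"""
client.query(transform_sql).result()
```

In practice the transformation step would usually live in a dbt model rather than an inline query, which is what keeps the modeling layer modular and testable.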

SQL BigQuery dbt Python ETL/ELT

20 jobs similar to Sr Data Engineer

Jobs ranked by similarity.

$130,000/yr
Americas Unlimited PTO

  • Build and evolve our semantic layer, design, document, and optimize dbt models.
  • Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow (see the orchestration sketch after this list).
  • Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.
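
For the orchestration bullet above, one lightweight pattern is to wrap the dbt CLI in a scheduled script that fails loudly when any model or test fails. This is only a sketch: the project path and the `semantic_layer` selector are hypothetical, and a production setup would more likely run inside an orchestrator such as Airflow or Dagster.

```python
"""Minimal orchestration sketch: build and test a dbt project in sequence.

The project directory and selector below are hypothetical placeholders.
"""
import subprocess
import sys

DBT_PROJECT_DIR = "/opt/analytics/dbt"  # hypothetical project location


def run_dbt(*args: str) -> None:
    """Invoke the dbt CLI and abort if it returns a non-zero exit code."""
    cmd = ["dbt", *args, "--project-dir", DBT_PROJECT_DIR]
    result = subprocess.run(cmd, check=False)
    if result.returncode != 0:
        sys.exit(f"dbt step failed: {' '.join(cmd)}")


if __name__ == "__main__":
    run_dbt("run", "--select", "semantic_layer")   # build the models
    run_dbt("test", "--select", "semantic_layer")  # then run their tests
```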

Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.

Data Engineer

Egen
$124,800–$145,600/yr

  • Migrate data and analytics workloads from BigQuery to Snowflake (see the migration sketch after this list)
  • Develop and optimize ETL/ELT pipelines using Python and SQL
  • Build analytics-ready datasets for reporting and dashboards
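
One common route for the BigQuery-to-Snowflake migration described above is to export each table to Parquet files in cloud storage and COPY them into Snowflake through an external stage. The sketch below assumes a GCS bucket, an already-configured storage integration and stage, and placeholder credentials; none of these names come from the posting.

```python
"""Sketch of migrating one table from BigQuery to Snowflake via Parquet in GCS.

Bucket, stage, table, and credential values are hypothetical placeholders.
"""
from google.cloud import bigquery   # pip install google-cloud-bigquery
import snowflake.connector          # pip install snowflake-connector-python

# 1. Export the BigQuery table to Parquet files in a GCS bucket.
bq = bigquery.Client(project="example-project")
extract_config = bigquery.ExtractJobConfig(destination_format="PARQUET")
bq.extract_table(
    "example-project.analytics.orders",                 # hypothetical source table
    "gs://example-migration-bucket/orders/*.parquet",   # hypothetical bucket
    job_config=extract_config,
).result()

# 2. Load the exported files through a pre-created external stage in Snowflake.
conn = snowflake.connector.connect(
    account="example_account", user="loader", password="...",  # placeholder credentials
    warehouse="LOAD_WH", database="ANALYTICS", schema="PUBLIC",
)
with conn.cursor() as cur:
    cur.execute("""
        COPY INTO orders
        FROM @gcs_migration_stage/orders/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
conn.close()
```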

Egen is a fast-growing and entrepreneurial company with a data-first mindset. They bring together the best engineering talent working with the most advanced technology platforms to help clients drive action and impact through data and insights.

Global

  • Design, build, and maintain efficient ETL/ELT processes and reliable data pipelines.
  • Build and maintain dashboards and visualizations in Looker Studio and other BI tools.
  • Ensure data quality, consistency, and accessibility across the organization.

Cove began by renting coliving spaces and has expanded to provide flexible, comfortable stays in beautiful properties. With over 6,000 rooms across Singapore and Indonesia and a growing presence in South Korea and Japan, they aim to build the leading tech-enabled flexible living platform in Asia Pacific, encouraging authenticity and fun.

$90,000–$150,000/yr
US

  • Build modern, scalable data pipelines that keep the data flowing.
  • Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning.
  • Unify and wrangle data from all kinds of sources.

InterWorks is a people-focused tech consultancy that empowers clients with customized, collaborative solutions. They value unique contributions, and their people are the glue that holds their business together.

India

  • Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
  • Develop and optimize data models in Snowflake or similar platforms.
  • Implement ETL/ELT processes using Python and modern data tools.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates and share this shortlist directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by the hiring company's internal team.

$160,000–$175,000/yr
US Unlimited PTO

  • Design, build, and optimize robust and scalable data pipelines into our production BigQuery data warehouse (see the incremental-load sketch after this list).
  • Mentor other engineers, lead complex projects, and set high standards for data quality and engineering excellence.
  • Empower our BI tools, reporting, Marketing, and Data Science initiatives by ensuring a highly reliable and performant data ecosystem.
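
A pipeline feeding a production BigQuery warehouse usually has to be rerunnable without creating duplicates. One way to get that, sketched below with hypothetical dataset and table names, is an incremental MERGE over a short lookback window so late or replayed rows update in place.

```python
"""Sketch of an idempotent incremental load into a BigQuery warehouse table.

Project, dataset, table names, and the 3-day lookback are hypothetical.
"""
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

merge_sql = """
MERGE `example-project.warehouse.bookings` AS target
USING (
  SELECT booking_id, status, amount_usd, updated_at
  FROM `example-project.staging.bookings_increment`
  WHERE updated_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 3 DAY)
) AS source
ON target.booking_id = source.booking_id
WHEN MATCHED AND source.updated_at > target.updated_at THEN
  UPDATE SET status = source.status,
             amount_usd = source.amount_usd,
             updated_at = source.updated_at
WHEN NOT MATCHED THEN
  INSERT (booking_id, status, amount_usd, updated_at)
  VALUES (source.booking_id, source.status, source.amount_usd, source.updated_at)
"""
client.query(merge_sql).result()  # reruns converge to the same final state
```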

Peerspace is the leading online marketplace for venue rentals for meetings, productions, and events, opening doors to inspiring spaces worldwide. They have facilitated over $500M in transactions and are backed by investors like GV (Google Ventures) and Foundation Capital.

$150,000–$190,000/yr
US Canada Unlimited PTO 12w maternity

  • Own end-to-end internal ELT data pipelines.
  • Build and maintain canonical data models and metric definitions (see the sketch after this list).
  • Build dashboards and shared analytics tools to support decision making.
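
Canonical metric definitions are usually centralized so every dashboard and pipeline computes the same number the same way. The toy registry below illustrates the idea in plain Python; the metric names and SQL expressions are invented for illustration and are not Northbeam's actual definitions.

```python
"""Toy sketch of a shared metric registry; names and SQL are illustrative only."""
from dataclasses import dataclass


@dataclass(frozen=True)
class Metric:
    name: str
    sql: str          # expression evaluated over a canonical fact table
    description: str


METRICS = {
    "blended_roas": Metric(
        name="blended_roas",
        sql="SUM(attributed_revenue) / NULLIF(SUM(ad_spend), 0)",
        description="Revenue per ad dollar across all channels.",
    ),
    "new_customer_rate": Metric(
        name="new_customer_rate",
        sql="COUNTIF(is_first_order) / COUNT(*)",
        description="Share of orders placed by first-time customers.",
    ),
}


def select_clause(metric_names: list[str]) -> str:
    """Render a SELECT list that dashboards and pipelines can reuse verbatim."""
    return ",\n  ".join(f"{METRICS[m].sql} AS {m}" for m in metric_names)


# Example: select_clause(["blended_roas", "new_customer_rate"]) yields two
# consistently defined expressions ready to embed in a reporting query.
```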

Northbeam is building the world’s most advanced marketing intelligence platform. They provide top eCommerce brands with a unified view of their business data through powerful attribution modeling and customizable dashboards, and they are experiencing rapid growth.

$171,000–$220,000/yr
US Unlimited PTO

  • Design, implement, and maintain robust, automated data pipelines.
  • Model and optimize data in Snowflake to support analytics.
  • Ensure data reliability through automated quality checks, monitoring, observability, and lineage visibility (see the quality-check sketch after this list).
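
Automated quality checks often come down to running small SQL probes against the warehouse and failing the pipeline when a threshold is breached. The sketch below targets Snowflake (as the bullets do) with hypothetical connection details, table names, and thresholds; dedicated tooling such as dbt tests or Great Expectations is the more common production choice.

```python
"""Minimal data quality gate: freshness and null-rate probes before publishing.

Connection details, table names, and thresholds are hypothetical placeholders.
"""
import snowflake.connector

CHECKS = [
    # (description, SQL returning one numeric value, maximum allowed value)
    ("orders staleness in hours",
     "SELECT TIMESTAMPDIFF('hour', MAX(loaded_at), CURRENT_TIMESTAMP()) FROM analytics.orders",
     6),
    ("orders null customer_id rate",
     "SELECT COUNT_IF(customer_id IS NULL) / NULLIF(COUNT(*), 0) FROM analytics.orders",
     0.01),
]

conn = snowflake.connector.connect(
    account="example_account", user="dq_bot", password="...",  # placeholder credentials
    warehouse="DQ_WH", database="PROD",
)
failures = []
with conn.cursor() as cur:
    for name, sql, limit in CHECKS:
        value = cur.execute(sql).fetchone()[0]
        if value is None or value > limit:
            failures.append(f"{name}: {value} (limit {limit})")
conn.close()

if failures:
    raise SystemExit("Data quality checks failed:\n" + "\n".join(failures))
```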

Acquisition.com focuses on acquiring and growing businesses. They foster a lean, high-ownership environment.

$190,800–$267,100/yr
US

  • Be the Analytics Engineering lead within the Sales and Marketing organization.
  • Be the data steward for Sales and Marketing: architect and improve the data collection.
  • Develop and maintain robust data pipelines and workflows for data ingestion and transformation.

Reddit is a community-driven platform built on shared interests and trust, fostering open and authentic conversations. With over 100,000 active communities and approximately 116 million daily active unique visitors, it serves as a major source of information on the internet.

India

  • Design, develop, and maintain scalable data pipelines and data warehouses.
  • Develop ETL/ELT processes using Python and modern data tools.
  • Ensure data quality, reliability, and performance across systems.

3Pillar Global is dedicated to engineering solutions that challenge conventional norms. They are an elite team of visionaries that actively shapes the tech landscape for their clients and sets global standards along the way.

US

  • Design, build, and maintain scalable data pipelines and workflows in Snowflake.
  • Integrate and ingest data from multiple systems into Snowflake.
  • Develop and optimize SQL queries, views, and materialized datasets.

GTX Solutions is a consulting firm specializing in modern data architecture, Customer Data Platforms (CDPs), and marketing technology enablement. They work with enterprise clients across industries including Retail, Travel, Hospitality, and Financial Services to design and implement scalable data ecosystems.

US

  • Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark (see the PySpark sketch after this list).
  • Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
  • Collaborate with cross-functional teams to deliver impactful data solutions.
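
The Databricks/PySpark workflow described above often reduces to reading raw files, cleaning them with DataFrame transformations, and writing an aggregated output. A hedged sketch with hypothetical paths and columns:

```python
"""Illustrative PySpark batch transform; paths and columns are placeholders.

On Databricks the SparkSession is provided for you and the output would more
typically be a Delta table than plain Parquet.
"""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily").getOrCreate()

raw = spark.read.json("/mnt/raw/orders/")  # hypothetical landing path

cleaned = (
    raw
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount_usd") > 0)
)

daily = (
    cleaned
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.count("*").alias("orders"), F.sum("amount_usd").alias("revenue_usd"))
)

daily.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/analytics/orders_daily/")
```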

Jobgether is an AI-powered platform that helps job seekers find suitable opportunities. They connect top-fitting candidates with hiring companies, streamlining the recruitment process through objective and fair assessments.

$135,000–$220,000/yr
US Unlimited PTO

  • Design, develop, and maintain reliable end-to-end data pipelines that connect internal and external systems.
  • Contribute to the performance, scalability, and reliability of our entire data ecosystem.
  • Work with analysts to engineer data structures and orchestrate workflows that encode core business logic.

Roo is on a mission to empower animal healthcare professionals with opportunities to earn more and achieve greater flexibility in their careers and personal lives. Powered by groundbreaking technology, Roo has built the industry-leading veterinary staffing platform, connecting Veterinarians, Technicians, and Assistants with animal hospitals for relief work and hiring opportunities.

$175,000–$225,000/yr
US

  • Lead requirements-gathering efforts for product and advanced analytics.
  • Work with analytics, data science, and wider engineering teams to help automate data analysis and visualization.
  • Build a scalable technology platform to support a growing business and deliver high-quality code to production.

Achieve is a leading digital personal finance company that helps everyday people move from struggling to thriving by providing innovative, personalized financial solutions. They have over 3,000 employees in mostly hybrid and fully remote roles across the United States, with hubs in Arizona, California, and Texas, and a culture of putting people first.

$115,000–$145,000/yr
US

  • Collaborate with business leaders, engineers, and product managers to understand data needs.
  • Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

NBCUniversal is one of the world's leading media and entertainment companies, creating world-class content distributed across a portfolio of film, television, and streaming and brought to life through global theme park destinations, consumer products, and experiences. They champion an inclusive culture and strive to attract and develop a talented workforce to create and deliver a wide range of content reflecting our world.

$98,900–$138,000/yr
US

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

Tivity Health, Inc. provides healthy life-changing solutions, including SilverSneakers®, Prime® Fitness, and WholeHealth Living®. They help adults improve their health and support them on life's journey by providing access to in-person and virtual physical activity as well as social and mental enrichment programs. Tivity Health is an equal employment opportunity employer and is committed to a proactive program of diversity development.

$137,700–$206,500/yr
US

  • Provide flexible, trustworthy data models by transforming data from multiple sources with dbt, testing for quality, deploying to visualization tools, and publishing documentation.
  • Collaborate directly with stakeholders to define problems and determine requirements for a solution. Set and maintain best practices for data models and processes.
  • Proactively identify gaps or design flaws in our data models, and bring recommendations for how to fix them. Own documentation of our tools and data.

Articulate Global, LLC, is a SaaS provider of creator platforms for online workplace training. They have more than 118,000 customers in 170 countries, including all 100 of the Fortune 100 companies. They were named one of Inc. Magazine’s Best Workplaces 2022 and are a leader in building a human-centered organization.

$135,000–$165,000/yr
US Unlimited PTO

  • Design, build, and maintain scalable data pipelines.
  • Develop and implement data models for analytical use cases.
  • Implement data quality checks and governance practices.

MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.

US

  • Utilize strong SQL & Python skills to engineer sound data pipelines and conduct routine and ad hoc analysis.
  • Build reporting dashboards and visualizations to design, create, and track campaign/program KPIs.
  • Perform analyses on large data sets to understand drivers of operational efficiency.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. This company fosters a supportive and inclusive work environment.

Brazil

  • Design, develop, and maintain data pipelines (batch and streaming) to ingest, transform, and make data available for analytics and application consumption (see the streaming sketch after this list).
  • Build and evolve analytical data models (bronze/silver/gold layers, data marts, star schemas, wide tables), ensuring consistency, documentation, and reuse.
  • Implement data quality best practices (tests, validations, contracts, SLAs/SLOs, monitoring of freshness/completeness/accuracy) and help resolve incidents with root cause analysis (RCA).
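
The batch-and-streaming plus bronze/silver/gold requirements above fit the medallion pattern: stream raw events into an append-only bronze layer, then refine them in batch into silver and gold tables. Below is a minimal Spark Structured Streaming sketch of just the bronze ingestion step; the paths, schema, and checkpoint location are placeholders rather than anything from CI&T's stack.

```python
"""Illustrative streaming ingestion into a bronze layer; all paths are placeholders."""
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("bronze_ingest").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
    StructField("payload", StringType()),
])

# Stream newly arriving JSON files from the landing zone into append-only bronze storage.
bronze = (
    spark.readStream
    .schema(event_schema)
    .json("/data/landing/events/")  # hypothetical landing path
)

query = (
    bronze.writeStream
    .format("parquet")
    .option("path", "/data/bronze/events/")             # bronze layer
    .option("checkpointLocation", "/data/_chk/bronze_events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```

The silver and gold layers would then typically be batch transformations (deduplication, conformance, star-schema or wide-table aggregates) over the bronze output.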

CI&T specializes in technological transformation, combining human expertise with AI to create scalable tech solutions. With over 8,000 CI&Ters around the world, they have partnered with more than 1,000 clients during their 30 years of history.