Source Job

$158,000–$168,000/yr
US Unlimited PTO

  • Building and maintaining production-grade data pipelines in cloud data warehouses.
  • Designing and developing dbt models across bronze, silver, and gold layers.
  • Crafting easy-to-understand visualizations and dashboards in Looker or equivalent BI tools.
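The bronze/silver/gold layering in the bullets above can be sketched in plain Python. This is illustrative only: real dbt models are SQL/Jinja files, and the function and field names here are invented for the example.

```python
# Illustrative sketch of medallion (bronze/silver/gold) layering.
# Real dbt models are SQL; this mimics the idea with plain Python.

def bronze(raw_rows):
    """Bronze: land raw records as-is, tagging the source system."""
    return [dict(r, _source="orders_api") for r in raw_rows]

def silver(bronze_rows):
    """Silver: clean and standardize (drop invalid rows, cast types)."""
    return [
        {"order_id": r["id"], "amount": float(r["amount"])}
        for r in bronze_rows
        if r.get("id") is not None and r.get("amount") is not None
    ]

def gold(silver_rows):
    """Gold: business-ready aggregate (total revenue)."""
    return {"total_revenue": sum(r["amount"] for r in silver_rows)}

raw = [{"id": 1, "amount": "10.5"}, {"id": None, "amount": "3"}, {"id": 2, "amount": "4.5"}]
report = gold(silver(bronze(raw)))
```

Each layer only reads from the layer below it, which is the discipline the bronze/silver/gold convention enforces.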

SQL dbt Airflow Python Looker

20 jobs similar to Senior Data Engineer (Data + Applied AI)

Jobs ranked by similarity.

US

  • Collaborate closely with business stakeholders to design comprehensive data solutions.
  • Design, develop, and manage robust data models in Snowflake and Iceberg utilizing dbt and advanced SQL.
  • Build and maintain data pipelines and CI/CD workflows using Airflow, Python, and Terraform.

The company pursues AI-driven initiatives and values innovation and a dynamic work environment.

$89,440–$94,380/yr
US

  • Design, build, and maintain scalable data pipelines.
  • Develop and optimize ETL/ELT processes using cloud data technologies.
  • Partner with teams to understand data requirements and improve data capture strategies.

Blueprint is a technology solutions firm with a strong presence across the United States, solving complicated problems for their clients. They are bold, smart, agile, and fun, and believe in unique perspectives, building teams with diverse skill sets and backgrounds.

Europe Asia

  • Design, implement, and maintain robust, scalable data pipelines to support AI, analytics, and operational reporting
  • Own and evolve the data warehouse architecture, ensuring it meets performance, flexibility, and governance needs
  • Ensure data integrity, availability, lineage, and observability across complex pipelines

Remote People is building the infrastructure to power borderless teams. Their technology handles global payroll, benefits, taxes, and compliance, enabling businesses to compliantly hire anyone anywhere at the push of a button. They are a growing, international family.

$100,000–$140,000/yr
US

  • Design, build, and maintain scalable data pipelines for clients across industries.
  • Architect and optimize cloud data warehouse solutions, adapting to each client's stack.
  • Collaborate with analysts and data scientists to ensure data is clean, reliable, and well-modeled.

NuView Analytics helps companies accelerate the time to insights from their data through data analytics, diligence, and fractional data science. They are a growth-stage company looking to drive additional value from the data they are sitting on and value humility, intellectual rigor, and stewardship.

$127,000–$175,000/yr
US

  • Partner closely with business stakeholders to understand their challenges and design end-to-end architecture.
  • Design, develop, and own robust, efficient, and scalable data models in Snowflake and Iceberg using dbt and advanced SQL.
  • Build and manage reliable data pipelines and CI/CD workflows using tools like Airflow, Python, and Terraform.

Motive empowers people who run physical operations with tools to make their work safer, more productive, and more profitable. Motive serves nearly 100,000 customers and provides complete visibility and control across a wide range of industries.

$135,500–$200,000/yr
US

  • Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
  • Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
  • Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.

SmartAsset is an online destination for consumer-focused financial information and advice, whose mission is helping people make smart financial decisions, reaching an estimated 59 million people each month. A successful $110 million Series D funding round in 2021 valued the company at over $1 billion.

Latin America

  • Design, build, and maintain data pipelines using Snowflake, Airflow, and dbt
  • Lead architectural discussions around the modern data stack
  • Develop scalable ETL and ELT processes using Python and SQL
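The "ETL and ELT processes using Python and SQL" bullet above can be illustrated with a minimal ELT sketch, using SQLite as a stand-in for a cloud warehouse like Snowflake (table and column names are invented for the example):

```python
import sqlite3

# Extract-load: land raw events first, then transform with SQL (the ELT pattern).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [(1, 10.0), (1, 5.0), (2, 7.5)],
)

# Transform step: aggregate the raw landing table into an analytics-ready table.
conn.execute(
    """CREATE TABLE user_totals AS
       SELECT user_id, SUM(amount) AS total
       FROM raw_events
       GROUP BY user_id"""
)
rows = conn.execute("SELECT user_id, total FROM user_totals ORDER BY user_id").fetchall()
```

The distinguishing choice is that the aggregation runs as SQL inside the warehouse rather than in Python before loading, which is what separates ELT from classic ETL.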

They are a well-funded healthcare technology company using AI and modern data infrastructure to transform how healthcare and public health decisions are made. The team is small, mission-driven, and building systems that turn raw healthcare data into actionable intelligence at scale.

$170,000–$193,363/yr
US

  • Design fault-tolerant dbt models to synthesize data from multiple sources into mart tables
  • Design and implement Sigma dashboards and Streamlit apps to provide clear insights into performance
  • Automate regular reporting workflows to reduce manual effort and increase data consistency

Weedmaps is a global leader in the cannabis industry. They are dedicated to transparency, education, and community, serving cannabis consumers and businesses in the U.S. and worldwide.

Global Unlimited PTO

  • Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
  • Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimize data models, and enable self-serve analytics and AI.
  • End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.

Trustonic makes smartphones affordable, enabling global access to devices and digital finance through secure smartphone locking technology. They partner with mobile carriers, retailers, and financiers across 30+ countries, powering device financing solutions. The company celebrates its diversity and looks to do the right thing for each other, the community, and the planet.

$180,000–$200,000/yr
US

  • Lead the architecture and evolution of scalable, distributed data pipelines, ensuring high availability and performance at scale
  • Build and maintain distributed web scraping systems using tools such as Playwright, Selenium, and BeautifulSoup
  • Integrate AI and LLMs into engineering workflows for code generation, automation, and optimization

MercatorAI is building scalable data infrastructure to power high-quality, data-driven decision making. As an early-stage company, the team is focused on creating robust, future-ready systems that can handle complex data ingestion, transformation, and delivery across a growing national footprint.

$118,000–$148,000/yr
US

  • Design, build, and maintain scalable batch and real-time data pipelines that power analytics, experimentation, and machine learning
  • Partner cross-functionally with analytics, product, engineering and operations to deliver high-quality data solutions that drive measurable business impact
  • Champion data quality, reliability, and observability by implementing best practices in testing, monitoring, lineage, and incident response

Gopuff is reimagining how people purchase everyday essentials, from snacks to household goods to alcohol, all delivered in minutes. They are assembling a team of thinkers, dreamers and risk-takers who know the value of peace of mind in an unpredictable world.

Data Engineer

YLD
Europe

  • Build core infrastructure software (pipelines, APIs, data modelling) as part of our client's data platform team.
  • Coach and mentor other engineers to support the growth of their technical expertise.
  • Implement the appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption.

YLD is a software engineering and design consultancy that creates digital capabilities for their clients. The company has offices in London, Lisbon, and Porto and aims to attract, inspire, develop, and retain extraordinary people.

$151,000–$205,000/yr
US Unlimited PTO

  • Extend, optimize, and maintain core data models for reports, machine learning, and generative AI.
  • Implement automation and operationalize ML models to streamline operational processes and improve efficiency.
  • Partner with engineering, product, and analytics teams to deliver seamless integrations and customer-facing data products.

Boulevard provides a client experience platform for appointment-based, self-care businesses, helping customers enhance client experiences. They value diversity and inclusivity, offering equal opportunities and aiming to create a supportive work environment.

US

  • Lead and manage a team of ~6 data engineers, driving execution, performance, and career development.
  • Own Kin’s data platform, including ingestion, storage, transformation, pipeline orchestration, and governance.
  • Build and optimize scalable data pipelines and architectures using tools like Snowflake, Databricks, dbt, and Airflow.

Kin simplifies homeowners' lives with smarter insurance, expanding to meet all homeowner needs. They employ Kinfolk across 35+ states and are recognized for growth, customer satisfaction, and a focus on long-term sustainability, fostering a culture of meaningful work and real impact.

$150,000–$190,000/yr
US

  • Build and scale high-throughput streaming pipelines.
  • Model and deliver high-quality, production-grade real estate datasets.
  • Strengthen data quality and observability.

Luxury Presence is building the AI growth platform for real estate. Backed by Bessemer Venture Partners and other top investors, they are a Series C company on track to hit $100M in annual recurring revenue in the next six months. They are a global team ranked on the Inc. 5000 fastest-growing companies list three years in a row.

US UK

  • Become a trusted advisor, partnering with data owners, analysts, business users, and executive stakeholders to translate business needs into scalable analytics solutions.
  • Work independently as part of a small team to solve complex analytics engineering use-cases across a variety of industries.
  • Design and develop the analytical layer, including curated data models, semantic layers, metrics definitions, and transformation pipelines.

Aimpoint Digital is a dynamic and fully remote data and analytics consultancy. We partner with innovative software providers in the data analytics & engineering space to solve our clients' toughest business problems.

$180,000–$220,000/yr
US Unlimited PTO 14w maternity

  • Design, build, and maintain databases that power Hologram's operations.
  • Build and maintain ETL pipelines that move and transform data reliably.
  • Audit existing pipelines and data models, identify complexity, and refactor bad decisions.

Hologram is building the future of IoT connectivity, delivering internet access to millions of connected devices worldwide. They process over 5 billion transactions per month across their global infrastructure and value a fun, upbeat, remote-first team united by their mission.

$110,000–$135,000/yr
Canada

  • Design and implement scalable data architectures to support business needs.
  • Build and optimize data pipelines, ensuring data accessibility and security.
  • Develop and maintain data models, databases, and data lakes, with robust data governance.

Terawatt Infrastructure delivers large-scale, turnkey charging solutions for companies rapidly deploying AV and EV fleets. With a growing portfolio of sites across the US, Terawatt is building the permanent transportation and logistics infrastructure of tomorrow through capital, real estate, development, and site operations solutions.

North America 5w PTO

  • Design, build, and maintain scalable data pipelines.
  • Partner with stakeholders to design scalable data solutions.
  • Develop reliable data models.

Optro is the leading audit, risk, ESG, and InfoSec platform on the market, surpassing $300M ARR and continuing to grow. More than 50% of the Fortune 500 leverage their award-winning technology. They innovate, take pride in what they produce, and support each other in breaking through barriers.

Global

  • Design and implement scalable data models in Snowflake
  • Build and maintain transformation pipelines using dbt
  • Develop optimized star/snowflake schemas for analytics and reporting
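The star/snowflake-schema bullet above boils down to splitting flat records into a fact table keyed to dimension tables. A plain-Python sketch of that core move (all table and column names are invented for the example):

```python
# Split denormalized sales records into a dimension table and a fact table,
# the core restructuring behind a star schema.
flat = [
    {"product": "widget", "region": "US", "amount": 20.0},
    {"product": "widget", "region": "EU", "amount": 15.0},
    {"product": "gadget", "region": "US", "amount": 30.0},
]

# Dimension: assign one surrogate key per distinct product.
dim_product = {name: key for key, name in
               enumerate(sorted({r["product"] for r in flat}), start=1)}

# Fact: keep the measures, replace the product name with its foreign key.
fact_sales = [
    {"product_key": dim_product[r["product"]],
     "region": r["region"],
     "amount": r["amount"]}
    for r in flat
]
```

Descriptive attributes live once in the dimension while the fact table stays narrow, which is what makes the schema cheap to scan and join for analytics queries.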

We are looking for a highly skilled Snowflake Data Engineer. We work closely with business stakeholders and deliver high-quality data models and insights.