Source Job

$215,000–$300,000/yr
US 4w PTO 12w maternity 12w paternity

  • Design and build robust data pipelines that integrate data from diverse sources.
  • Build streaming data pipelines using Kafka and AWS services to enable real-time data processing (see the sketch after this list).
  • Create and operate data services that make curated datasets accessible to internal teams and external partners.
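
A minimal sketch of the Kafka-to-AWS pattern this role describes, consuming events with kafka-python and landing batches in S3 via boto3; the topic, bucket, and batch size are assumptions, not details from the posting.

```python
# Minimal Kafka -> S3 sketch (assumed topic/bucket names, not from the posting).
import json

import boto3
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "orders.events",                      # hypothetical topic
    bootstrap_servers=["broker:9092"],
    group_id="s3-sink",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)
s3 = boto3.client("s3")

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 500:                 # flush in small batches
        key = f"raw/orders/offset={message.offset}.json"
        s3.put_object(
            Bucket="example-data-lake",   # hypothetical bucket
            Key=key,
            Body=json.dumps(batch).encode("utf-8"),
        )
        batch = []
```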

Python SQL AWS Kafka Terraform

20 jobs similar to Senior Data Engineer [Remote-US]

Jobs ranked by similarity.

US Unlimited PTO

  • Partner with clients and implementation teams to understand data distribution requirements.
  • Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity (sketched after this list).
  • Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
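
As a rough illustration of the Databricks-to-Snowflake integration mentioned above: a PySpark job that checks row counts before writing through the Snowflake Spark connector. Connection options, paths, and table names are placeholders.

```python
# Sketch of a Databricks -> Snowflake load (placeholder credentials/tables).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("claims_to_snowflake").getOrCreate()

df = spark.read.parquet("s3://example-lake/claims/")   # hypothetical source
clean = df.dropDuplicates(["claim_id"]).filter("claim_id IS NOT NULL")

# Accuracy check before load: fail fast if cleaning dropped too many rows.
if clean.count() < 0.99 * df.count():
    raise ValueError("More than 1% of rows lost in cleaning; aborting load")

sf_options = {
    "sfURL": "example.snowflakecomputing.com",  # placeholder account URL
    "sfUser": "loader",
    "sfPassword": "***",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "CLAIMS",
    "sfWarehouse": "LOAD_WH",
}
(clean.write.format("snowflake")        # Snowflake Spark connector
      .options(**sf_options)
      .option("dbtable", "CLAIMS_CURATED")
      .mode("append")
      .save())
```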

Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, their bold, curious, and collaborative team is tackling big challenges in an industry that's ready for change.

Europe

  • Design, build, and maintain scalable, high-quality data pipelines.
  • Implement robust data ingestion, transformation, and storage using cloud-based technologies.
  • Collaborate with stakeholders to understand business goals and translate them into data engineering solutions.

CI&T is a tech transformation specialist, uniting human expertise with AI to create scalable tech solutions. With over 8,000 employees around the world and partnerships with more than 1,000 clients, they foster a diverse, inclusive, and safe work environment.

Europe

  • Maintain, configure, and optimize the existing data warehouse platform and pipelines.
  • Design and implement incremental data integration solutions prioritizing data quality, performance, and cost-efficiency (one common shape is sketched after this list).
  • Drive innovation by experimenting with new technologies and recommending platform improvements.
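
One common shape for the incremental-integration bullet above is watermark-based extraction: pull only rows changed since the last successful run. A sketch using a DB-API connection with Postgres-style placeholders; the table and column names are invented.

```python
# Watermark-based incremental extract (invented table/column names).
from datetime import datetime

def incremental_load(conn, last_watermark: datetime) -> datetime:
    """Copy rows updated since last_watermark; return the new watermark."""
    cur = conn.cursor()
    cur.execute(
        "SELECT id, payload, updated_at FROM source_orders "
        "WHERE updated_at > %s ORDER BY updated_at",
        (last_watermark,),
    )
    new_watermark = last_watermark
    for row_id, payload, updated_at in cur.fetchall():
        cur.execute(
            "INSERT INTO dw_orders (id, payload, updated_at) "
            "VALUES (%s, %s, %s) "
            "ON CONFLICT (id) DO UPDATE SET payload = EXCLUDED.payload, "
            "updated_at = EXCLUDED.updated_at",      # Postgres-style upsert
            (row_id, payload, updated_at),
        )
        new_watermark = max(new_watermark, updated_at)
    conn.commit()
    return new_watermark
```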

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They appreciate your interest and wish you the best!

$110,572–$145,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis (see the sketch after this list).
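
To make the AWS stack in the last bullet concrete, a hedged sketch: a Lambda handler that reacts to an object landing in S3 and issues a Redshift COPY through the Redshift Data API. The cluster, database, role ARN, and table are placeholders.

```python
# Lambda sketch: S3 put event -> Redshift COPY via the Redshift Data API.
import boto3

redshift = boto3.client("redshift-data")

def handler(event, context):
    # S3 event notifications carry bucket/key under Records[].s3.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        redshift.execute_statement(
            ClusterIdentifier="example-cluster",   # placeholder
            Database="analytics",
            DbUser="loader",
            Sql=(
                f"COPY staging.events FROM 's3://{bucket}/{key}' "
                "IAM_ROLE 'arn:aws:iam::123456789012:role/copy-role' "  # placeholder
                "FORMAT AS JSON 'auto'"
            ),
        )
```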

ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries, and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

Global

  • Design and implement event-driven pipelines using AWS services to ingest data from external sources in real-time.
  • Build and maintain streaming data pipelines between HubSpot CRM and PostgreSQL, handling webhook events and API polling.
  • Implement schema validation, data type checking, and automated quality gates at the ingestion layer to prevent bad data from entering the system (sketched after this list).
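
A minimal sketch of the quality-gate idea, validating HubSpot-style webhook payloads with pydantic before they reach PostgreSQL; the field names and table layout are assumptions.

```python
# Ingestion-layer quality gate sketch (assumed webhook fields/tables).
from pydantic import BaseModel, ValidationError

class ContactEvent(BaseModel):
    object_id: int          # HubSpot-style numeric object id (assumed)
    property_name: str
    property_value: str
    occurred_at: int        # epoch milliseconds

def ingest(raw_events: list[dict], pg_cursor) -> int:
    """Insert valid events; route malformed ones to a dead-letter table."""
    accepted = 0
    for raw in raw_events:
        try:
            event = ContactEvent(**raw)   # schema + type check
        except ValidationError as exc:
            pg_cursor.execute(
                "INSERT INTO ingest_dead_letter (payload, error) VALUES (%s, %s)",
                (str(raw), str(exc)),
            )
            continue
        pg_cursor.execute(
            "INSERT INTO contact_events (object_id, name, value, occurred_at) "
            "VALUES (%s, %s, %s, to_timestamp(%s / 1000.0))",
            (event.object_id, event.property_name,
             event.property_value, event.occurred_at),
        )
        accepted += 1
    return accepted
```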

PropHero is a property analytics platform provider. They have reached €30M in revenue in four years with 25% QoQ growth, are already profitable, and offer a modern, cloud-native AWS data platform.

$155,000–$165,000/yr

  • Design, optimize, and own data pipelines that scrape, process, and ingest transaction and listing data from major auction houses and marketplaces.
  • Build comprehensive monitoring and alerting systems to track latency, uptime, and coverage metrics across all data sources (a freshness check is sketched after this list).
  • Continuously improve our data infrastructure by modernizing storage and processing technologies, reducing manual interventions, and optimizing for cost, performance, and reliability.
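
One way to cover the monitoring bullet above is a per-source freshness check that pages when data goes stale. The sources, thresholds, and alerting hook below are invented for illustration.

```python
# Data-freshness monitor sketch (invented sources/thresholds).
from datetime import datetime, timedelta, timezone

SOURCES = {                    # hypothetical source -> max allowed staleness
    "auction_house_a": timedelta(hours=1),
    "marketplace_b": timedelta(hours=6),
}

def check_freshness(conn, alert) -> None:
    cur = conn.cursor()
    for source, max_lag in SOURCES.items():
        cur.execute(
            "SELECT MAX(ingested_at) FROM listings WHERE source = %s",
            (source,),
        )
        (latest,) = cur.fetchone()
        now = datetime.now(timezone.utc)
        if latest is None or now - latest > max_lag:
            alert(f"{source} is stale: last ingest {latest}")  # e.g. pager hook
```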

Alt is unlocking the value of alternative assets, starting with the $5B trading-card market. They let collectors buy, sell, vault, and finance their cards in one place and are backed by leaders at Stripe, Coinbase, Seven Seven Six, and pro athletes like Tom Brady and Giannis Antetokounmpo.

US

  • Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark (see the sketch after this list).
  • Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
  • Collaborate with cross-functional teams to deliver impactful data solutions.
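
A small PySpark ETL sketch in the spirit of the first bullet: ingest raw JSON, normalize types, deduplicate, and write a curated table. Paths and columns are placeholders.

```python
# PySpark ETL sketch (placeholder paths/columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = spark.read.json("s3://example-raw/orders/")          # hypothetical path

curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
       .filter(F.col("order_id").isNotNull())
)

# On Databricks this would typically land as a Delta table.
curated.write.mode("overwrite").format("delta").saveAsTable("analytics.orders")
```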

Jobgether is an AI-powered platform that helps job seekers find suitable opportunities. They connect top-fitting candidates with hiring companies, streamlining the recruitment process through objective and fair assessments.

$115,000–$145,000/yr
US

  • Collaborate with business leaders, engineers, and product managers to understand data needs.
  • Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, as well as cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

NBCUniversal is one of the world's leading media and entertainment companies. It creates world-class content, which it distributes across its portfolio of film, television, and streaming, and brings to life through its global theme park destinations, consumer products, and experiences. It champions an inclusive culture and strives to attract and develop a talented workforce to create and deliver a wide range of content reflecting our world.

US

  • Architect and lead the evolution of our modern data platform.
  • Design and build production LLM pipelines and infrastructure that power intelligent operations (one hedged illustration follows this list).
  • Own end-to-end data acquisition and integration architecture across diverse sources.
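
The posting is vendor-agnostic about its LLM pipelines; as one hedged illustration, a batch step that classifies free-text records with the OpenAI Python client. The model, prompt, and label set are assumptions, not Brightwheel's actual stack.

```python
# LLM pipeline step sketch: classify free-text records (assumed model/labels).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

LABELS = ["billing", "enrollment", "other"]   # hypothetical categories

def classify(note: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": f"Answer with exactly one of: {', '.join(LABELS)}."},
            {"role": "user", "content": note},
        ],
    )
    label = response.choices[0].message.content.strip().lower()
    return label if label in LABELS else "other"   # guard against drift
```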

Brightwheel is the largest, fastest-growing, and most loved platform in early education. They are trusted by millions of educators and families every day. The team is passionate, talented, and customer-focused and embodies their Leadership Principles in their work and culture.

US

  • Build and maintain scalable data pipelines using Snowflake OpenFlow and related Snowflake-native tools.
  • Develop and maintain Snowflake semantic views that support analytics and reporting needs.
  • Deliver clean, governed data sets for Sigma dashboards and embedded analytics use cases (a stand-in sketch follows this list).
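
Snowflake OpenFlow and semantic views are product-specific features not detailed in the posting; as a stand-in, here is a sketch with snowflake-connector-python that publishes a plain governed view for a Sigma dashboard. The account, credentials, and object names are placeholders.

```python
# Sketch: publish a governed view for Sigma via snowflake-connector-python.
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="example-account",   # placeholders throughout
    user="transformer",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="REPORTING",
)

conn.cursor().execute("""
    CREATE OR REPLACE VIEW REPORTING.DAILY_ACTIVE_USERS AS
    SELECT activity_date, COUNT(DISTINCT user_id) AS dau
    FROM RAW.EVENTS
    GROUP BY activity_date
""")
conn.close()
```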

They are building the next-generation analytics stack centered on Snowflake (AWS) and Sigma. They value diverse perspectives and innovation.

$96,050–$113,000/yr
US

  • Creating and maintaining optimal data pipeline architecture.
  • Assembling large, complex data sets that meet functional & non-functional business requirements.
  • Building the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources using relevant technologies.

Mercer Advisors works with families to help them amplify and simplify their financial lives through integrated financial planning, investment management, tax, estate, and insurance services. They serve over 31,300 families in more than 90 cities across the U.S. and are ranked the #1 RIA Firm in the nation by Barron’s.

India

  • Design, build, and maintain scalable data pipelines and warehouses for analytics and reporting.
  • Develop and optimize data models in Snowflake or similar platforms.
  • Implement ETL/ELT processes using Python and modern data tools.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company; the final decision and next steps (interviews, assessments) are managed by their internal team.

Nigeria

  • Design and implement data pipelines and workflows.
  • Develop and maintain data models for optimal data storage.
  • Collaborate with cross-functional teams to gather data requirements.

Jobgether is a platform that connects job seekers with companies using AI-powered matching. They aim to ensure applications are reviewed quickly and fairly.

Global

  • Build data pipelines for various data structures.
  • Drive automation through effective metadata management.
  • Learn and apply modern data preparation and integration techniques.

Jobgether uses an AI-powered matching process to ensure candidate applications are reviewed quickly, objectively, and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company.

Europe

  • Enable efficient consumption of domain data as a product by delivering and promoting strategically designed, actionable datasets and data models.
  • Build, maintain, and improve rock-solid data pipelines using a broad range of technologies like AWS Redshift, Trino, Spark, Airflow, and Kafka streaming for real-time processing (an Airflow sketch follows this list).
  • Support teams without data engineers in building decentralised data solutions and product integrations, for example around DynamoDB.
  • Act as a data ambassador, promoting the value of data and our data platform among engineering teams and enabling cooperation.
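
A compact sketch of how one such pipeline might be scheduled with the Airflow TaskFlow API; the task bodies, DAG name, and schedule are illustrative only.

```python
# Airflow TaskFlow sketch (illustrative tasks/schedule).
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
def listings_pipeline():
    @task
    def extract() -> list[dict]:
        # e.g. read new events from Kafka or an S3 landing zone
        return [{"id": 1, "price": 100}]

    @task
    def load(rows: list[dict]) -> None:
        # e.g. COPY into Redshift or publish to Trino-readable storage
        print(f"loaded {len(rows)} rows")

    load(extract())

listings_pipeline()
```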

OLX operates consumer brands that facilitate trade to build a more sustainable world. They have colleagues around the world who serve millions of people every month.

US Unlimited PTO

  • Develop and maintain data pipelines and ETL processes using Python in AWS environments.
  • Write and deploy AWS Lambda functions for data processing tasks (sketched after this list).
  • Collaborate with team members using Git for version control and code collaboration.
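
A bare-bones sketch of the Lambda bullet above: a handler that reads a landed S3 object, drops malformed lines, and writes the cleaned copy onward. The bucket names are placeholders.

```python
# Lambda data-processing sketch (placeholder buckets).
import json

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

        clean = []
        for line in body.decode("utf-8").splitlines():
            try:
                clean.append(json.loads(line))   # drop malformed lines
            except json.JSONDecodeError:
                continue

        s3.put_object(
            Bucket="example-clean-bucket",       # placeholder target
            Key=key,
            Body=json.dumps(clean).encode("utf-8"),
        )
```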

540 is a forward-thinking company that the government turns to in order to #getshitdone. They break down barriers, build impactful technology, and solve mission-critical problems.

$158,000–$190,000/yr
US Unlimited PTO

  • Write and ship a lot of code, working with analysts and stakeholders to refine requirements and debug data sets.
  • Drive architectural decisions and thoughtfully balance trade-offs in system design, leading cross-functional data initiatives.
  • Champion high standards in data quality, security, and discoverability, translating complex technical challenges into clear solutions.

ezCater is a technology company that connects workplaces with over 100,000 restaurants nationwide, providing solutions for employee meals and meetings. They are backed by top investors including Insight, Iconiq, Lightspeed, GIC, SoftBank, and Quadrille, and value work/life harmony.

US Unlimited PTO 20w maternity 14w paternity

  • Build and operate data pipelines using AWS-native data tools and distributed processing frameworks.
  • Operate and improve core data platform services, addressing incidents, performance issues, and operational toil.
  • Partner with data producers and consumers to onboard pipelines, troubleshoot issues, and improve platform usability.

Fetch is a platform where millions of people earn rewards for buying the brands they love, and a whole lot more. With investments from SoftBank, Univision, and Hamilton Lane, and partnerships with Fortune 500 companies, it is reshaping how brands and consumers connect in the marketplace. Ranked one of America's Best Startup Employers by Forbes, Fetch fosters a people-first culture rooted in trust, accountability, and innovation.

$120,000–$150,000/yr
US

  • Support managed analytics service clients through their data-driven journey and deliver measurable business value through data modeling, API integration, SQL scripting, and data pipeline development.
  • Bridge the gap between data applications and insightful business reports.
  • Participate in building our data platform from the ground up by exploring new technologies and vendors within our cloud-first environment.

DataDrive is a fast-growing managed analytics service provider that delivers modern cloud analytics data platforms to data-driven organizations while supporting the ongoing training, adoption, and growth of its clients' data cultures. DataDrive offers a team-oriented environment where one can develop their skills and work directly with some of the most talented analytics professionals in the business.

India

  • Design, develop, and maintain scalable data pipelines and data warehouses.
  • Develop ETL/ELT processes using Python and modern data tools.
  • Ensure data quality, reliability, and performance across systems.

3Pillar Global is dedicated to engineering solutions that challenge conventional norms. They are an elite team of visionaries that actively shapes the tech landscape for their clients and sets global standards along the way.