Source Job

US · Unlimited PTO

  • Design, build, and maintain pipelines that power all data use cases.
  • Develop intuitive, performant, and scalable data models that support product features.
  • Pay down technical debt, improve automation, and follow best practices in data modeling.

SQL · Spark · Python · Airflow · Data Modeling

20 jobs similar to Senior Software Engineer, Data

Jobs ranked by similarity.

$194,400–$305,500/yr
US

  • Play a senior tech lead and architect role to build world-class data solutions and applications that power crucial business decisions throughout the organization.
  • Enable a world-class engineering practice, drive the approach with which we use data, develop backend systems and data models to serve insights needs, and play an active role in building Atlassian's data-driven culture.
  • Maintain a high bar for operational data quality and proactively address performance, scale, complexity and security considerations.

At Atlassian, they're motivated by a common goal: to unleash the potential of every team. Their software products help teams all over the planet and their solutions are designed for all types of work. They ensure that their products and culture continue to incorporate everyone's perspectives and experience, and never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status.

$135,000–$165,000/yr
US · Unlimited PTO

  • Design, build, and maintain scalable data pipelines.
  • Develop and implement data models for analytical use cases.
  • Implement data quality checks and governance practices.

MO helps government leaders shape the future. They engineer scalable, human-centered solutions that help agencies deliver their mission faster and better. They are building a company where technologists, designers, and builders can serve the mission and grow their craft.

$110,572–$145,000/yr
US · Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis.

ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

US · Unlimited PTO

  • Design and implement scalable, performant data models.
  • Develop and optimize processes to improve the correctness of third-party data.
  • Implement data quality principles to raise the bar for reliability of data.

SmithRx is a venture-backed Health-Tech company disrupting the Pharmacy Benefit Management (PBM) sector with a next-generation drug acquisition platform. They have a mission-driven and collaborative culture that inspires employees to transform the U.S. healthcare system.

  • Design, build, and evolve scalable data pipelines and systems that ensure financial accuracy and integrity at scale.
  • Explore and apply Spotify’s Data and AI ecosystem to solve engineering problems and improve developer efficiency.
  • Partner closely with Finance and Audiobooks stakeholders to understand their needs and deliver financially impactful features.

Spotify transformed music listening forever when they launched in 2008. Today, they are the world’s most popular audio streaming subscription service.

$135,000–$220,000/yr
US · Unlimited PTO

  • Design, develop, and maintain reliable end-to-end data pipelines that connect internal and external systems.
  • Contribute to the performance, scalability, and reliability of our entire data ecosystem.
  • Work with analysts to engineer data structures and orchestrate workflows that encode core business logic.

Roo is on a mission to empower animal healthcare professionals with opportunities to earn more and achieve greater flexibility in their careers and personal lives. Powered by groundbreaking technology, Roo has built the industry-leading veterinary staffing platform, connecting Veterinarians, Technicians, and Assistants with animal hospitals for relief work and hiring opportunities.

US · Unlimited PTO

  • Use your expertise to build software products that rely on data.
  • Design and refine data models that underpin product functionality while implementing monitoring systems to ensure reliability and performance.
  • Collaborate with our Data Guild to define the organization’s data strategy, influencing decisions on tooling, architecture, and engineering standards.

Freshpaint provides a layer of data governance to make current web analytics tools HIPAA-compliant, helping healthcare marketers promote access to care and safeguard patient privacy. They are a fully remote company backed by leading investors including Y-Combinator and Intel Capital.

Europe

  • Enable efficient consumption of domain data as a product by delivering and promoting strategically designed, actionable datasets and data models.
  • Build, maintain, and improve rock-solid data pipelines using a broad range of technologies like AWS Redshift, Trino, Spark, Airflow, and Kafka streaming for real-time processing.
  • Support teams without data engineers in building decentralised data solutions and product integrations, for example, around DynamoDB.
  • Act as a data ambassador, promoting the value of data and our data platform among engineering teams and enabling cooperation.

OLX operates consumer brands that facilitate trade to build a more sustainable world. They have colleagues around the world who serve millions of people every month.

$190,000–$220,000/yr
US

  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.

TRM is a blockchain intelligence company on a mission to build a safer financial system. They are a lean, high-impact team tackling critical challenges, and they empower governments, financial institutions, and crypto companies.

$190,800–$267,100/yr
US

  • Be the Analytics Engineering lead within the Sales and Marketing organization.
  • Be the data steward for Sales and Marketing: architect and improve the data collection.
  • Develop and maintain robust data pipelines and workflows for data ingestion and transformation.

Reddit is a community-driven platform built on shared interests and trust, fostering open and authentic conversations. With over 100,000 active communities and approximately 116 million daily active unique visitors, it serves as a major source of information on the internet.

$135,500–$200,000/yr
US

  • Architect, design, implement, and operate end-to-end data engineering solutions.
  • Develop and manage robust data integrations with external vendors.
  • Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.

SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. With over 59 million people reached each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.

US

  • Design, build, and maintain scalable data pipelines and workflows in Snowflake.
  • Integrate and ingest data from multiple systems into Snowflake.
  • Develop and optimize SQL queries, views, and materialized datasets.

GTX Solutions is a consulting firm specializing in modern data architecture, Customer Data Platforms (CDPs), and marketing technology enablement. They work with enterprise clients across industries including Retail, Travel, Hospitality, and Financial Services to design and implement scalable data ecosystems.

US

  • Design, build, and maintain scalable ETL pipelines for large-scale data processing.
  • Implement data transformations and workflows using PySpark at an intermediate to advanced level.
  • Optimize pipelines for performance, scalability, and cost efficiency across environments.

Truelogic is a leading provider of nearshore staff augmentation services headquartered in New York. Their team of 600+ highly skilled tech professionals, based in Latin America, drives digital disruption by partnering with U.S. companies on their most impactful projects.

$130,000/yr
Americas · Unlimited PTO

  • Build and evolve our semantic layer; design, document, and optimize dbt models.
  • Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
  • Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.

Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.

US · Unlimited PTO

  • Design, build, and operate scalable data pipelines using batch and real-time processing technologies.
  • Build data infrastructure that ingests real-time events and stores them efficiently across databases.
  • Establish and enforce data contracts with backend engineering teams by implementing schema management.

Fetch provides a platform where millions of people earn rewards for buying brands they love. They have received investments from SoftBank, Univision, and Hamilton Lane and partnerships ranging from challenger brands to Fortune 500 companies. Fetch fosters a people-first culture rooted in trust, accountability, and innovation.

$115,000–$145,000/yr
US

  • Collaborate with business leaders, engineers, and product managers to understand data needs.
  • Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

NBCUniversal is one of the world's leading media and entertainment companies. They create world-class content, which they distribute across their portfolio of film, television, and streaming, and bring to life through their global theme park destinations, consumer products, and experiences. They champion an inclusive culture and strive to attract and develop a talented workforce to create and deliver a wide range of content reflecting our world.

US

  • Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark.
  • Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
  • Collaborate with cross-functional teams to deliver impactful data solutions.

Jobgether is an AI-powered platform that helps job seekers find suitable opportunities. They connect top-fitting candidates with hiring companies, streamlining the recruitment process through objective and fair assessments.

$150,000–$160,000/yr
US

  • Design, build and execute data pipelines.
  • Build the configurable ETL framework.
  • Optimize SQL queries to maximize system performance.

RefinedScience is dedicated to delivering high-quality emerging tech solutions. While the job description does not include company size or culture details, the role suggests an emphasis on innovation and collaboration.

$0–$200,000/yr
North America · Latin America

  • Architect and maintain robust data pipelines to transform diverse data inputs.
  • Integrate data from various sources into a unified platform.
  • Build APIs with AI assistance to enable secure access to consolidated insights.

Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.

$175,000–$225,000/yr
US

  • Lead product requirements and advanced analytics requirements gathering efforts.
  • Work with analytics, data science, and wider engineering teams to help automate data analysis and visualization.
  • Build a scalable technology platform to support a growing business and deliver high-quality code to production.

Achieve is a leading digital personal finance company that helps everyday people move from struggling to thriving by providing innovative, personalized financial solutions. They have over 3,000 employees in mostly hybrid and 100% remote roles across the United States with hubs in Arizona, California, and Texas and a culture of putting people first.