Source Job

Global

  • Design, maintain, and scale streaming ETL pipelines for blockchain data.
  • Build and optimize ClickHouse data models and materialized views for high-performance analytics.
  • Implement data transformations and decoding logic.

SQL Python Flink Kafka ClickHouse

20 jobs similar to Data Engineer

Jobs ranked by similarity.

$190,000–$220,000/yr
US

  • Build highly reliable data services to integrate with dozens of blockchains.
  • Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real time.
  • Design and architect intricate data models for optimal storage and retrieval to support sub-second latency for querying blockchain data.

TRM is a blockchain intelligence company on a mission to build a safer financial system. They are a lean, high-impact team tackling critical challenges, and they empower governments, financial institutions, and crypto companies.

$130,000/yr
Americas Unlimited PTO

  • Build and evolve our semantic layer, design, document, and optimize dbt models.
  • Develop and maintain ETL/orchestration pipelines to ensure reliable and scalable data flow.
  • Partner with data analysts, scientists, and stakeholders to enable high-quality data access and experimentation.

Customer.io's platform is used by over 7,500 companies to send billions of emails, push notifications, in-app messages, and SMS every day. They power automated communication and help teams send smarter, more relevant messages using real-time behavioral data; their culture values empathy, transparency, and responsibility.

US

  • Deliver on data warehouse and reporting requirements using Google BigQuery and the GCP suite.
  • Architect, design, and build pipelines to move large amounts of data from a variety of sources.
  • Improve the existing data warehouse architecture to enable robust user-facing and internal reporting.

Bitwave is a rapidly expanding startup that specializes in software for businesses that use digital assets and crypto. Their platform provides cryptocurrency accounting, tax tracking, bookkeeping, digital asset treasury management, and crypto AR/AP tooling, and they recently added full DeFi support.

$40,754–$60,227/yr
Global Unlimited PTO

  • Build and own Sardine’s internal data infrastructure integrating CRM, marketing, product, finance, and operational systems.
  • Design, improve, and own ETL/ELT pipelines to ensure clean, reliable, and scalable data flows across the organization.
  • Partner with data, engineering, revenue/business operations, and executive stakeholders to define and track KPIs.

Sardine is a leader in fraud prevention and AML compliance. Their platform uses device intelligence, behavior biometrics, machine learning, and AI to stop fraud before it happens. Sardine has over 300 banks, retailers, and fintechs as clients and a remote-first work culture.

$90,000–$150,000/yr
US

  • Build modern, scalable data pipelines that keep the data flowing.
  • Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning.
  • Unify and wrangle data from all kinds of sources.

InterWorks is a people-focused tech consultancy that empowers clients with customized, collaborative solutions. They value unique contributions, and their people are the glue that holds their business together.

  • Design and build scalable data pipelines that ingest, process, and transform high-volume event streams and historical data
  • Develop and maintain APIs that deliver analytics, trend reports, and drill-down capabilities to internal teams and external customers
  • Build robust infrastructure for data quality monitoring, ensuring accuracy and completeness across customer and artifact datasets

Socket helps devs and security teams ship faster by cutting out security busywork. They have raised $65M in funding from top angels, operators, and security leaders.

US Unlimited PTO

  • Design, build, and maintain robust data pipelines.
  • Own and scale ETL/ELT processes using tools like dbt, BigQuery, and Python.
  • Build modular data models that power analytics, product features, and LLM agents.

Jobgether is a platform that uses AI to match candidates with jobs. They aim to review applications quickly and fairly, ensuring the top-fitting candidates are identified and shared with hiring companies.

$85,000–$95,000/yr
US

  • Design and develop data integration pipelines across cloud and legacy systems.
  • Lead and support data engineering implementation efforts.
  • Apply advanced analytics techniques to support business insights and decision-making.

Jobgether uses an AI-powered matching process to ensure applications are reviewed quickly and fairly. They identify the top-fitting candidates and share this shortlist directly with the hiring company, with the final decision managed by the internal team.

Global 5w PTO 26w maternity 6w paternity

  • Designing and maintaining scalable, secure data pipelines that feed BigQuery from diverse sources
  • Owning our infrastructure-as-code setup using Terraform
  • Automating data QA, modeling, and maintenance tasks using scripting and AI

TheyDo helps enterprises align around their customers with an AI-powered journey management platform, fostering smarter decisions and enhanced experiences. With $50M backing and a global team across 27 countries, TheyDo champions a customer-led, people-first culture.

$135,500–$200,000/yr
US

  • Architect, design, implement, and operate end-to-end data engineering solutions.
  • Develop and manage robust data integrations with external vendors.
  • Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams.

SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. Reaching over 59 million people each month, they operate the SmartAsset Advisor Marketing Platform (AMP) to connect consumers with fiduciary financial advisors.

$171,000–$220,000/yr
US Unlimited PTO

  • Design, implement, and maintain robust, automated data pipelines.
  • Model and optimize data in Snowflake to support analytics.
  • Ensure data reliability through automated quality checks, monitoring, observability, and lineage visibility.

Acquisition.com focuses on acquiring and growing businesses, fostering a lean, high-ownership environment.

$190,800–$267,100/yr
US

  • Be the Analytics Engineering lead within the Sales and Marketing organization.
  • Be the data steward for Sales and Marketing: architect and improve the data collection.
  • Develop and maintain robust data pipelines and workflows for data ingestion and transformation.

Reddit is a community-driven platform built on shared interests and trust, fostering open and authentic conversations. With over 100,000 active communities and approximately 116 million daily active unique visitors, it serves as a major source of information on the internet.

$90,000–$150,000/yr
Global

  • Build modern, scalable data pipelines that keep the data flowing and keep clients happy
  • Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning
  • Unify and wrangle data from all kinds of sources: SQL, APIs, spreadsheets, cloud storage, and more

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Their system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

Europe

  • Enable efficient consumption of domain data as a product by delivering and promoting strategically designed actionable datasets and data models
  • Build, maintain, and improve rock-solid data pipelines using a broad range of technologies like AWS Redshift, Trino, Spark, Airflow, and Kafka streaming for real-time processing
  • Support teams without data engineers in building decentralised data solutions and product integrations, for example, around DynamoDB
  • Act as a data ambassador, promoting the value of data and our data platform among engineering teams and enabling cooperation

OLX operates consumer brands that facilitate trade to build a more sustainable world. They have colleagues around the world who serve millions of people every month.

$96,050–$113,000/yr
US

  • Creating and maintaining optimal data pipeline architecture.
  • Assembling large, complex data sets that meet functional & non-functional business requirements.
  • Building the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources using relevant technologies.

Mercer Advisors works with families to help them amplify and simplify their financial lives through integrated financial planning, investment management, tax, estate, and insurance services. They serve over 31,300 families in more than 90 cities across the U.S. and are ranked the #1 RIA Firm in the nation by Barron’s.

Global

  • Design, build, and operate scheduled and event-driven data pipelines for simulation outputs, telemetry, logs, dashboards, and scenario metadata
  • Build and operate data storage systems (structured and semi-structured) optimized for scale, versioning, and replay
  • Support analytics, reporting, and ML workflows by exposing clean, well-documented datasets and APIs

Onebrief builds collaboration and AI-powered workflow software designed specifically for military staffs, transforming their work to make staffs faster, smarter, and more efficient. The company is all-remote, with employees working alongside customers; it was founded in 2019 and has raised $320M+.

$0–$200,000/yr
North America Latin America

  • Architect and maintain robust data pipelines to transform diverse data inputs.
  • Integrate data from various sources into a unified platform.
  • Build APIs with AI assistance to enable secure access to consolidated insights.

Abusix is committed to making the internet a safer place. They are a globally distributed team that spans multiple countries and thrives in a culture rooted in trust, ownership, and collaboration.

$110,572–$145,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis

ATPCO is the world's primary source for air fare content. They hold over 200 million fares across 160 countries, and the travel industry relies on their technology and data solutions. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.

$115,000–$145,000/yr
US

  • Collaborate with business leaders, engineers, and product managers to understand data needs.
  • Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

NBCUniversal is one of the world's leading media and entertainment companies, creating world-class content that it distributes across its portfolio of film, television, and streaming and brings to life through its global theme park destinations, consumer products, and experiences. The company champions an inclusive culture and strives to attract and develop a talented workforce to create and deliver a wide range of content reflecting the world.

US Unlimited PTO

  • Design, build, and maintain pipelines that power all data use cases.
  • Develop intuitive, performant, and scalable data models that support product features.
  • Pay down technical debt, improve automation, and follow best practices in data modeling.

Patreon is a media and community platform where over 300,000 creators give their biggest fans access to exclusive work and experiences. They are leaders in the space, with over $10 billion generated by creators since Patreon's inception, with a team passionate about their mission.