Source Job

Global

  • You will design, build, and maintain scalable data pipelines and infrastructure.
  • You will work on a wide array of blockchain economics projects, analyses, and systems.
  • You will optimize network incentives and systems by deeply understanding network dynamics.

SQL · Python · Data Analysis · Data Science · Data Modeling

20 jobs similar to Junior Data Engineer

Jobs ranked by similarity.

Global

  • Own competitor intelligence across Trust Wallet's core verticals.
  • Monitor on-chain activity across EVM chains and identify trends relevant to Trust Wallet's product, growth, and strategy.
  • Build dashboards and analyses that connect on-chain signals with Trust Wallet's product context.

Trust Wallet is the leading non-custodial cryptocurrency wallet. Trusted by over 200 million people worldwide to securely manage and grow their digital assets, it offers a seamless, multi-chain experience backed by industry-leading self-custody technology.

US Europe

  • Own the delivery of scalable internal data solutions.
  • Translate business needs into clear technical designs and working systems.
  • Build and improve data pipelines, integrations, and automation.

Transparent Hiring is recruiting for a fast-growing reinsurance company operating across Germany and the United States. The environment is collaborative and driven by a strong “build and ship” mindset.

Global

  • Build and maintain robust data pipelines processing large volumes of data.
  • Update and optimise our data platform for speed, scalability, and cost.
  • Develop processes and tools to monitor and analyse model performance and data accuracy.

Moniepoint is Africa's all-in-one financial ecosystem, empowering businesses and their customers with seamless payment, banking, credit, and management tools. They processed $182 billion in 2023 and are Nigeria’s largest merchant acquirer, cultivating a culture of innovation, teamwork, and growth.

Global

  • Analyse validator performance, staking economics, and revenue drivers across networks.
  • Monitor the blockchain ecosystem to detect emerging opportunities, risks, and strategic developments.
  • Design and deploy end-to-end data solutions, from analysis and modeling to production deployment.

P2P.org, the largest institutional staking provider, manages over $10B TVL and holds a 20%+ market share in restaking. They focus on researching and improving their infrastructure to maximize APR while enhancing security, serving clients like BitGo and Ledger with their client-centric approach.

US Europe

  • Own the architecture and delivery of scalable internal solutions.
  • Translate business needs into clear technical designs and working systems.
  • Build and improve data pipelines, integrations, and automation.

Transparent Hiring is recruiting for a fast-growing reinsurance company operating across Germany and the United States. The environment is collaborative and hands-on, driven by a strong “build and ship” mindset.

India

  • Design, build, and maintain data pipelines connecting internal systems to the organisation’s Delta Lake environment.
  • Develop and optimise SQL-based data transformations and relational data models to support analytics and reporting.
  • Integrate new data sources and systems into the data platform as the organisation expands its technology landscape.

Smart Working believes your job should not only look right on paper but also feel right every day. It connects skilled professionals with outstanding global teams and products for full-time, long-term roles, offering a genuine community that values growth and well-being.

Global

  • Handle technical and business tasks from analysts related to our core tools.
  • Participate in code reviews for analysts and identify suboptimal processes.
  • Monitor load and alerts from our services.

P2P.org is the largest institutional staking provider with a TVL of over $10B and a market share exceeding 20% in restaking. They unite talented individuals globally, sharing a passion for decentralized finance to shape finance's future with code, learning, and connection.

US

  • Design, develop and implement large scale, high-volume, high-performance data infrastructure and pipelines.
  • Build and implement ETL frameworks to improve code quality and reliability.
  • Guide and mentor other Data Engineers as a technical owner of parts of the data platform.

Jobgether is a platform that connects job seekers with companies. They use AI-powered matching to ensure applications are reviewed quickly and fairly.

Europe Asia

  • Create innovative solutions for handling petabytes of data with billions of rows and joins.
  • Create real-time and offline feature-generation pipelines, and manage our data infrastructure to keep it reliable and fast.
  • Develop and productionize data pipelines for our ML models in both bare-metal and cloud environments.

Kayzen is a mobile demand-side platform (DSP) dedicated to democratizing programmatic advertising. They enable leading apps, agencies, media buyers, and brands to run programmatic customer acquisition, retargeting, and brand performance campaigns through their self-serve and managed service options.

Europe

  • Build core infrastructure software (pipelines, APIs, data modelling) as part of our client's data platform team.
  • Coach and mentor other engineers to support the growth of their technical expertise.
  • Implement the appropriate technologies for scaling data access patterns, batch processing, and data streaming for soft real-time consumption.

YLD is a software engineering and design consultancy that creates digital capabilities for their clients. The company has offices in London, Lisbon, and Porto and aims to attract, inspire, develop, and retain extraordinary people.

Canada

  • Play a crucial part in designing and delivering an internal platform.
  • Translate business requirements into efficient technical solutions.
  • Create sustainable technological solutions that drive business growth.

Jobgether uses an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. They identify the top-fitting candidates, and this shortlist is then shared directly with the hiring company.

Europe

  • Build pipelines to load data from various systems into Dataiku via S3 or Snowflake.
  • Increase the robustness of existing production pipelines, identify bottlenecks, and set up robust monitoring, testing processes, and documentation templates.
  • Build custom applications and integrations that automate manual customer-operations tasks, helping Product Operations, Support, and SRE in their day-to-day activities.

Dataiku is the Platform for AI Success, the enterprise orchestration layer for building, deploying, and governing AI. The world’s leading companies rely on Dataiku to operationalize AI and run it as a true business performance engine delivering measurable value.

$160,800–$193,000/yr
US

  • Design and develop high‑performance data converters for multi‑sensor autonomous‑driving data.
  • Design, build, and optimize large‑scale ingestion and transformation pipelines capable of processing petabyte‑scale autonomous‑driving sensor data.
  • Implement automated data validation, quality checks, and lineage tracking to ensure reliability of production datasets.

Torc has been a leader in autonomous driving since 2007 and is now part of the Daimler family. They are focused solely on developing software for automated trucks to transform how the world moves freight and have a collaborative, energetic, and team-focused culture.

$167,000–$200,000/yr
US Unlimited PTO

  • Build, optimize, and maintain data pipelines that power our business.
  • Define and build out abstracted, reusable data sets for Business Intelligence, Marketing, and Data Science Research.
  • Design, build, and evangelize a federated data validation framework used to monitor potential data inconsistencies.

Garner Health strives to transform the healthcare economy, delivering accessible, high-quality healthcare. They are a fast-growing healthcare technology company dedicated to making a meaningful impact on healthcare at scale with a team of talented, mission-driven individuals.

India

  • Design, build, and maintain robust and scalable data pipelines.
  • Leverage the Microsoft Fabric ecosystem to unify data storage and analytics.
  • Architect and maintain automated CI/CD pipelines using Azure DevOps.

66degrees is an end-to-end AI transformation partner that guides enterprises from complex business challenges to clear, quantifiable outcomes. They are a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology.

Data Engineer

ItD
US

  • Design, build, and optimize scalable data architectures that power marketing analytics and survey measurement initiatives.
  • Deliver automated, high-impact data solutions and insights that enhance decision-making across teams.
  • Build robust pipelines, dashboards, and analytical frameworks in fast-paced environments.

ItD is a consulting and software development company that blends diversity, innovation, and integrity with real business results. They reject any strong hierarchy, empowering teams to deliver excellent results in a woman- and minority-led firm.

$89,440–$94,380/yr
US

  • Design, build, and maintain scalable data pipelines.
  • Develop and optimize ETL/ELT processes using cloud data technologies.
  • Partner with teams to understand data requirements and improve data capture strategies.

Blueprint is a technology solutions firm with a strong presence across the United States, solving complicated problems for their clients. They are bold, smart, agile, and fun, and believe in unique perspectives, building teams of people with diverse skillsets and backgrounds.

Global

  • Design, implement, and maintain scalable, high-performance data architectures connecting relational and non-relational systems.
  • Manage end-to-end data pipelines, ensuring seamless ingestion from scrapers to AI/ML workflows.
  • Audit and optimize existing workflows for efficiency, accuracy, and flexibility.

Jobgether is a pioneering HR Tech startup, operating entirely remotely, and leading the revolution in the world of work. As the largest job search engine designed exclusively for remote workers, its mission is to empower individuals to discover opportunities that align seamlessly with their unique lifestyles.

$92,686–$125,000/yr
US Unlimited PTO

  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design data models and schemas that facilitate data analysis and reporting.
  • Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large volumes of data from various sources into usable formats.
  • Build and optimize data storage and processing systems, including data warehouses, data lakes, and big data platforms, using AWS services such as Amazon Redshift, AWS Glue, AWS EMR, AWS S3, and AWS Lambda, to enable efficient data retrieval and analysis.

ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. Every day, the travel industry relies on ATPCO's technology and data solutions to help millions of travelers reach their destinations efficiently. At ATPCO, they believe in flexibility, trust, and a culture where your wellbeing comes first.

Global

  • Collaborate across engineering teams to design and implement secure solutions.
  • Drive automation for microservices deployments to Node Operators.
  • Gain a deep understanding of each Chainlink product’s operational needs.

Chainlink is the industry-standard oracle platform bringing the capital markets onchain and powering the majority of decentralized finance (DeFi). They provide the essential data, interoperability, compliance, and privacy standards needed to power advanced blockchain use cases.