Optimize SQL queries to maximize system performance.
RefinedScience is dedicated to delivering high-quality emerging tech solutions. The posting does not specify company size or culture, but the role emphasizes innovation and collaboration.
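To make the SQL optimization responsibility above concrete, here is a minimal, hedged sketch using Python's built-in sqlite3 module: it compares the query plan for a hypothetical events table before and after adding an index. The schema, column names, and index are illustrative assumptions; a production warehouse would rely on its own EXPLAIN tooling and profiling.

```python
import sqlite3

# Hypothetical schema: an "events" table filtered frequently by user_id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)")
conn.executemany(
    "INSERT INTO events (user_id, payload) VALUES (?, ?)",
    [(i % 100, f"event-{i}") for i in range(10_000)],
)

query = "SELECT COUNT(*) FROM events WHERE user_id = ?"

def show_plan(label: str) -> None:
    """Print the query plan so we can see whether an index is used."""
    plan = conn.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchall()
    print(label, plan)

show_plan("before index:")   # reports a full table scan
conn.execute("CREATE INDEX idx_events_user_id ON events (user_id)")
show_plan("after index:")    # reports an index search on idx_events_user_id
```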
Partner with data scientists and stakeholders to translate business and ML/AI use cases into scalable data architectures.
Design, develop, and maintain scalable and efficient data pipelines and ETL processes to ingest, process, and transform large datasets.
Build and optimize data storage and processing systems using AWS services to enable efficient data retrieval and analysis.
ATPCO is the world's primary source for air fare content, holding over 200 million fares across 160 countries. They provide technology and data solutions to the travel industry, helping millions of travelers reach their destinations efficiently. ATPCO believes in flexibility, trust, and a culture where your wellbeing comes first.
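As a rough illustration of the pipeline responsibilities above, the sketch below outlines one extract-transform-load step with boto3, the AWS SDK for Python. The bucket names, object keys, and CSV schema are hypothetical; a production pipeline would add streaming, partitioning, retries, and schema validation.

```python
import csv
import io

import boto3  # AWS SDK for Python; credentials are assumed to be configured externally

# Hypothetical bucket and object keys for illustration only.
SOURCE_BUCKET = "raw-fares"
SOURCE_KEY = "fares/2024-06-01.csv"
TARGET_BUCKET = "curated-fares"
TARGET_KEY = "fares/2024-06-01-usd-only.csv"

s3 = boto3.client("s3")

# Extract: read the raw CSV from S3.
raw = s3.get_object(Bucket=SOURCE_BUCKET, Key=SOURCE_KEY)["Body"].read().decode("utf-8")
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: keep only USD-denominated fares and normalise the amount field.
transformed = [
    {**row, "amount": f"{float(row['amount']):.2f}"}
    for row in rows
    if row.get("currency") == "USD"
]

# Load: write the curated file back to S3 (assumes at least one USD row survived).
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=transformed[0].keys())
writer.writeheader()
writer.writerows(transformed)
s3.put_object(Bucket=TARGET_BUCKET, Key=TARGET_KEY, Body=out.getvalue().encode("utf-8"))
```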
Design, develop, and maintain backend services in Rust and Scala.
Build and operate distributed, data-intensive applications.
Deploy, monitor, and maintain systems on Kubernetes within AWS environments.
Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
Seeking a skilled software engineer to design and develop high-performance applications for large-scale data processing. Work in a fully remote, collaborative R&D environment building solutions to protect critical data and enhance security operations. Take ownership of core application development, contribute to system architecture, and collaborate with cross-functional teams.
Design and develop a highly available, scalable, and secure ClickHouse Cloud platform.
Build innovative deployment automation across cloud, hybrid, and on-prem systems.
Solve unique scaling, reliability, and performance challenges in regulated environments.
ClickHouse is a fast-growing, privately held cloud company recognized on the 2025 Forbes Cloud 100 list. With over 2,000 customers and ARR that has more than quadrupled over the past year, ClickHouse leads the market in real-time analytics, data warehousing, observability, and AI workloads.
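One small, hedged example of the operational side of such a platform: a post-deployment readiness check that polls a service's health endpoint across environments, using only the Python standard library. The endpoint URLs are placeholders, not real ClickHouse Cloud endpoints.

```python
import time
import urllib.request

# Hypothetical health endpoints for the same service deployed to different environments.
ENDPOINTS = {
    "cloud": "https://cloud.example.com/healthz",
    "hybrid": "https://hybrid.example.internal/healthz",
    "on-prem": "https://onprem.example.internal/healthz",
}

def wait_until_healthy(url: str, timeout_s: float = 60.0, interval_s: float = 2.0) -> bool:
    """Poll a health endpoint until it returns HTTP 200 or the timeout expires."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    return True
        except OSError:
            pass  # not reachable yet; retry until the deadline
        time.sleep(interval_s)
    return False

if __name__ == "__main__":
    for env, url in ENDPOINTS.items():
        print(env, "healthy" if wait_until_healthy(url, timeout_s=10) else "unhealthy")
```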
Design, build, operate, and maintain critical backend systems for alerting, ensuring reliability, scalability, and performance. Drive projects from ideation through to production and operations, actively contributing to roadmap planning. Collaborate with cross-functional teams to deliver features that meet user needs and business objectives.
We use an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements.
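A minimal sketch of the alerting idea above, assuming a simplified rule model with hypothetical names: each rule pairs a metric source with a threshold and a notifier. A production system would add scheduling, deduplication, and escalation.

```python
import time
from dataclasses import dataclass
from typing import Callable

@dataclass
class AlertRule:
    """Deliberately simplified alerting rule; all names are illustrative."""
    name: str
    metric: Callable[[], float]    # callable returning the current metric value
    threshold: float
    notify: Callable[[str], None]  # callable that delivers the alert message

def evaluate(rules: list[AlertRule]) -> None:
    """Evaluate each rule once and notify when the metric breaches its threshold."""
    for rule in rules:
        value = rule.metric()
        if value > rule.threshold:
            rule.notify(f"{rule.name}: value {value:.2f} exceeded threshold {rule.threshold:.2f}")

if __name__ == "__main__":
    # Fake metric source and notifier so the sketch runs standalone.
    rules = [AlertRule("error_rate", lambda: 0.07, 0.05, print)]
    while True:
        evaluate(rules)
        time.sleep(30)  # evaluation interval; a real service would also deduplicate alerts
```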
Responsible for designing, building, and maintaining scalable data pipelines and warehouse architectures. Integrate, transform, and manage high-volume datasets across multiple platforms. Focus on ensuring data quality, performance, and security while driving innovation through the adoption of modern tools and technologies.
This position is posted by Jobgether on behalf of a partner company.
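To illustrate the data quality aspect of the role above, here is a hedged Python sketch that validates a CSV extract before it is loaded into a warehouse. The required columns and checks are assumptions made for the example, not a prescribed contract.

```python
import csv
from pathlib import Path

REQUIRED_COLUMNS = {"id", "created_at", "amount"}  # hypothetical contract for an incoming dataset

def validate(path: Path) -> list[str]:
    """Return a list of data quality issues found in a CSV extract before loading it."""
    issues: list[str] = []
    with path.open(newline="") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            issues.append(f"missing columns: {sorted(missing)}")
            return issues
        seen_ids: set[str] = set()
        for line_no, row in enumerate(reader, start=2):
            if not row["id"]:
                issues.append(f"line {line_no}: empty id")
            elif row["id"] in seen_ids:
                issues.append(f"line {line_no}: duplicate id {row['id']}")
            else:
                seen_ids.add(row["id"])
            try:
                float(row["amount"])
            except ValueError:
                issues.append(f"line {line_no}: non-numeric amount {row['amount']!r}")
    return issues
```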
As a Solution Engineer, you will meet with data partners, assess requirements, and recommend pipeline improvements to manage data export and delivery in an efficient, scalable process. You will be part of a small, cross-functional team that includes other engineers, a product manager, data scientists, and others. Success requires the ability to take on ambiguous, complex problems and promote innovative solutions to address immediate needs and support future growth.
Fetch is a rewards app that empowers consumers to live rewarded throughout their day; it has delivered more than $1 billion in rewards and earned over 5 million five-star reviews.
Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions. Design and oversee the data architecture and infrastructure, ensuring scalability, performance, and security. Create scalable and efficient data processing frameworks, including ETL processes and data pipelines.
Lingaro has been on the market since 2008, with more than 1,500 professionals across 7 global sites, and emphasizes career growth and skills development.
Design and architect scalable data pipelines and infrastructure on GCP. Provide technical guidance to data engineering teams. Partner with Sales and Engineering to translate client requirements into data engineering solutions.
66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology.
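As a hedged sketch of a GCP-based pipeline like the one described above, the example below uses the google-cloud-bigquery client to load newline-delimited JSON from Cloud Storage and run a downstream aggregation. Project, bucket, dataset, and table names are placeholders.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery; uses application default credentials

# Project, dataset, and table names are placeholders for illustration.
client = bigquery.Client(project="my-gcp-project")

# Load a daily batch of newline-delimited JSON from Cloud Storage into a table.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    autodetect=True,
)
load_job = client.load_table_from_uri(
    "gs://my-raw-bucket/events/2024-06-01/*.json",
    "my-gcp-project.analytics.events",
    job_config=job_config,
)
load_job.result()  # block until the load job completes

# Downstream transformation expressed as SQL, run as a query job.
query_job = client.query(
    """
    SELECT DATE(event_ts) AS day, COUNT(*) AS events
    FROM `my-gcp-project.analytics.events`
    GROUP BY day
    ORDER BY day
    """
)
for row in query_job.result():
    print(row.day, row.events)
```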
Partner with clients and implementation teams to understand data distribution requirements.
Design and develop data pipelines integrating with Databricks and Snowflake, ensuring accuracy and integrity.
Lead architecture and implementation of solutions for health plan clients, optimizing cloud-based technologies.
Abacus Insights is changing the way healthcare works by unlocking the power of data to enable the right care at the right time. Backed by $100M from top VCs, the company is tackling big challenges in an industry ready for change, with a bold, curious, and collaborative team.
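A minimal sketch of the Snowflake side of such a pipeline, using the snowflake-connector-python client: staged data is copied into a raw table and merged into a curated layer, followed by a simple reconciliation count. All connection parameters, stages, and table names are hypothetical, and the Databricks integration is omitted.

```python
import snowflake.connector  # pip install snowflake-connector-python

# All connection parameters and object names below are placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="pipeline_user",
    password="***",
    warehouse="LOAD_WH",
    database="CLAIMS",
    schema="CURATED",
)

try:
    cur = conn.cursor()
    # Load a staged file into a raw table, then upsert into the curated layer.
    cur.execute(
        "COPY INTO RAW_CLAIMS FROM @claims_stage/2024-06-01/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute(
        """
        MERGE INTO CURATED_CLAIMS t
        USING RAW_CLAIMS s ON t.claim_id = s.claim_id
        WHEN MATCHED THEN UPDATE SET t.status = s.status, t.updated_at = s.updated_at
        WHEN NOT MATCHED THEN INSERT (claim_id, status, updated_at)
            VALUES (s.claim_id, s.status, s.updated_at)
        """
    )
    # Simple integrity check: row counts should reconcile after the merge.
    cur.execute("SELECT COUNT(*) FROM CURATED_CLAIMS")
    print("curated rows:", cur.fetchone()[0])
finally:
    conn.close()
```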
Serve as a core contributor, owning and maintaining critical parts of ClickHouse's data engineering ecosystem.
Craft tools that enable data engineers to harness ClickHouse's incredible speed and scale.
Build the foundation that thousands of data engineers rely on for their most critical data workloads.
ClickHouse leads the market in real-time analytics, data warehousing, observability, and AI workloads and is recognized on the 2025 Forbes Cloud 100 list.
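To ground the tooling angle above, here is a hedged sketch with the clickhouse-connect Python client: it creates a small table, batch-inserts rows, and runs an aggregation. Host, credentials, and table names are placeholders.

```python
import datetime as dt

import clickhouse_connect  # pip install clickhouse-connect; assumes a reachable ClickHouse server

# Host, credentials, and table name are placeholders for illustration.
client = clickhouse_connect.get_client(host="localhost", username="default", password="")

client.command(
    """
    CREATE TABLE IF NOT EXISTS pipeline_events (
        ts DateTime,
        pipeline String,
        rows_processed UInt64
    ) ENGINE = MergeTree ORDER BY (pipeline, ts)
    """
)

# Batch insert: the client accepts rows plus explicit column names.
client.insert(
    "pipeline_events",
    [[dt.datetime.now(), "daily_orders", 125_000],
     [dt.datetime.now(), "daily_users", 42_000]],
    column_names=["ts", "pipeline", "rows_processed"],
)

# Fast aggregation over the freshly inserted data.
result = client.query(
    "SELECT pipeline, sum(rows_processed) FROM pipeline_events GROUP BY pipeline"
)
print(result.result_rows)
```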
Design, code, test, and debug data-driven applications in accordance with established coding standards and best practices.
Develop supporting data tooling, including data pipelines, APIs, and related tools for those applications, ensuring performant, scalable, and secure solutions.
Integrate with various data sources and technologies to ensure the right data is available for the solutions in a timely manner.
EasyPost, founded in 2012, is a YC unicorn with the mission to simplify shipping for businesses, from startups to Fortune 500 companies. They provide a developer-friendly REST API for shipping. The team is rapidly growing and fosters a culture of builders and problem-solvers who value elegant architecture and fast decisions.
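As a small, hypothetical example of an API backing a data-driven application (not EasyPost's actual API), the FastAPI sketch below serves a per-day shipment count from an in-memory stand-in for a datastore; the route, schema, and figures are illustrative only.

```python
from datetime import date

from fastapi import FastAPI, HTTPException  # pip install fastapi uvicorn
from pydantic import BaseModel

app = FastAPI()

# In-memory stand-in for a real datastore; the data is illustrative only.
DAILY_SHIPMENTS = {date(2024, 6, 1): 1250, date(2024, 6, 2): 1412}

class ShipmentStats(BaseModel):
    day: date
    shipments: int

@app.get("/stats/shipments/{day}", response_model=ShipmentStats)
def shipment_stats(day: date) -> ShipmentStats:
    """Return the shipment count for a given day, or 404 if no data was ingested."""
    if day not in DAILY_SHIPMENTS:
        raise HTTPException(status_code=404, detail="no data for that day")
    return ShipmentStats(day=day, shipments=DAILY_SHIPMENTS[day])

# Run locally with: uvicorn this_module:app --reload
```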
Lead the development of robust, scalable web applications across frontend and backend. Architect and manage API-driven services to support complex analytics and data-intensive applications. Mentor engineers, provide technical guidance, and enforce software engineering best practices.
Design and implement scalable, high-performance data architectures to support business needs.
Develop, automate, and maintain production-grade data pipelines using modern data stack tools and best practices.
Optimize data workflows and implement observability frameworks to monitor pipeline performance, reliability, and accuracy.
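To illustrate the observability responsibility above, here is a minimal sketch: a decorator that records duration and output row count for each pipeline step. In practice these measurements would feed a metrics or observability framework rather than plain logs; the step name and data are illustrative.

```python
import functools
import logging
import time
from typing import Callable, Iterable

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("pipeline")

def observed(step: str) -> Callable:
    """Decorator that records duration and output row count for a pipeline step."""
    def decorator(fn: Callable[..., Iterable]) -> Callable[..., list]:
        @functools.wraps(fn)
        def wrapper(*args, **kwargs) -> list:
            start = time.perf_counter()
            rows = list(fn(*args, **kwargs))
            elapsed = time.perf_counter() - start
            # In a real deployment these would be emitted as metrics, not just log lines.
            logger.info("step=%s rows=%d duration_s=%.3f", step, len(rows), elapsed)
            return rows
        return wrapper
    return decorator

@observed("extract_orders")
def extract_orders() -> Iterable[dict]:
    # Stand-in for reading from a source system.
    return [{"order_id": i, "amount": i * 1.5} for i in range(1000)]

if __name__ == "__main__":
    extract_orders()
```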