Design, develop, and maintain scalable, secure, and high-performance data platforms. Build and manage data pipelines (ETL/ELT) using tools such as Apache Airflow, DBT, SQLMesh or similar. Architect and optimize lakehouse solutions (e.g., Iceberg). Lead the design and implementation of data infrastructure components (streaming, batch processing, orchestration, lineage, observability).
Design & Build Pipelines by developing and maintaining robust and scalable ETL/ELT pipelines, moving data from diverse sources into Jooble's data warehouse. Ensure Data Quality & Observability by implementing a comprehensive data observability strategy. Optimize & Automate data processing and continuously optimize the data storage strategies and query performance. Govern & Document data processes, models, and architecture in the data catalog.
The senior data engineer develops, constructs, tests, and maintains architectures, such as databases and large-scale processing systems, and ensures that the architecture fully supports the requirements of the business. This role involves collaborating closely with the business's Data and Analytics teams, gathering technical requirements for exceptional data governance, and overseeing the activities of junior data engineers.
As a key member of our Data Engineering team you will develop and maintain APIs, services, and orchestration systems to facilitate data fulfillment for partners and internal consumers. Integrate with third-party APIs and landing zones. Oversee the design and maintenance of data systems and contribute to the continual enhancement of the data fulfillment platform.
Ditto aims to expand the internet beyond traditional reach with groundbreaking software that empowers devices to synchronize data in real-time. Your role involves guiding a talented team to build and enhance key backend systems, ensuring reliability, performance, and scalability. You will work closely with product managers, designers, and other teams to shape the future of technology and the services that underpin it.
As a Senior Solutions Architect on the Digital Native team, shape the future of the Data & AI landscape by working with the most sophisticated data engineering and data science teams. Partner with the sales team and provide technical leadership to help customers understand how Databricks can solve their business problems. Consult on Big Data architectures and implement proofs of concept for strategic projects spanning data engineering, data science, machine learning, and SQL analysis workflows.
This is a high-impact opportunity to build scalable infrastructure that unifies data from multiple sources into a trusted, accessible foundation for the business. In the coming months, you'll focus on migrating pipelines to a modern architecture, improving data quality and governance, and enabling teams to better leverage data across our products. We're looking for an experienced engineer with strong hands-on expertise in tools like Snowflake, Airflow, Airbyte, and dbt.
You will be a key contributor to the design, development, and scaling of the core platform services that power our AI Process Platform. You will work on complex, distributed systems at the intersection of AI, automation, and human-in-the-loop workflows, ensuring our platform is robust, scalable, and highly performant. This is an opportunity to make a significant impact on a rapidly growing product.
Our backend systems power the clients used by millions of customers every year to buy their groceries online. These systems must also support tight integration with the largest retailers in the US and Canada. You will work closely with other teams to understand their main pain points and translate them into self-serve and reliable solutions.
As a Specialist Solutions Architect, you will guide customers in building big data solutions on Databricks. You will be in a customer-facing role, working with and supporting Solution Architects, which requires hands-on production experience with Apache Spark™ and expertise in other data technologies. SSAs help customers through the design and successful implementation of essential workloads.