Design and evolve the enterprise Azure Lakehouse architecture.
Lead the transformation of classic Data Warehouse environments into a modern Lakehouse architecture.
Define and implement architecture principles, standards, patterns, and best practices for data engineering and analytics platforms.
Deutsche Telekom IT Solutions Slovakia, formerly T-Systems Slovakia, has been a part of the Košice region since 2006. It has grown to become the second-largest employer in eastern Slovakia, with over 3,900 employees, and aims to provide innovative information and communication technology services.
Design, build, and scale the lakehouse architecture that underpins analytics, machine learning, and AI.
Modernize our data ecosystem, making it discoverable, reliable, governed, and ready for self-service and intelligent automation.
Operate anywhere along the data lifecycle, from ingestion and transformation to metadata, orchestration, and MLOps.
OnX is a pioneer in digital outdoor navigation with a suite of apps. With more than 400 employees, they have created regional “Basecamps” to help remote employees find connection and inspiration.
Architect and implement Databricks Lakehouse solutions for large-scale data platforms.
Design and optimize batch & streaming data pipelines using Apache Spark (PySpark/SQL).
Implement Delta Lake best practices (ACID, schema enforcement, time travel, performance tuning).
They are looking for a Databricks Architect to design and lead modern Lakehouse data platforms using Databricks. The role focuses on building scalable, high-performance data pipelines and enabling analytics and AI use cases on cloud-native data platforms.
Design, build, and optimize scalable data lakes, warehouses, and data pipelines using Snowflake and modern cloud platforms.
Develop and maintain robust data models (ELT/ETL), ensuring clean, reliable, and well-documented datasets.
Design and manage end-to-end data pipelines that ingest, transform, and unify data from multiple systems.
Cobalt Service Partners is building the leading commercial access and security integration business in North America. Backed by Alpine Investors, with $15B+ in AUM, Cobalt has scaled rapidly since launch through acquisitions and is building a differentiated, data-driven platform.
Define and govern enterprise data architecture standards.
Ensure interoperability and high data quality.
Drive innovation and strategic architecture direction.
Jobgether connects job seekers with partner companies through an AI-powered matching process. Their system quickly and fairly reviews applications against core requirements, ensuring top candidates are shared with hiring companies.
Design and implement data solutions for enterprise customers.
Create and maintain technical documentation and architectural diagrams.
Ensure quality and governance standards are met throughout the engineering lifecycle.
Jobgether helps candidates get hired. Their AI-powered matching process ensures applications are reviewed quickly, objectively, and fairly against each role's core requirements.
Architect and develop cloud-native data platforms, focusing on modern data warehousing, transformation, and orchestration frameworks.
Design scalable data pipelines and models, ensure data quality and observability, and contribute to backend services and infrastructure supporting data-driven features.
Collaborate across multiple teams, influence architectural decisions, mentor engineers, and implement best practices for CI/CD and pipeline delivery.
Jobgether's AI-powered matching system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company.
Lead support of the client’s Azure data platform and Power BI environment, including responding to escalations and helping analyze and resolve incidents in the customer’s environment.
Consult, develop, and advise on solutions in Microsoft Azure with tools such as Synapse, Data Factory, Databricks, Azure ML, Data Lake, Data Warehouse, and Power BI.
Consistently learn, apply, and refine skills around data engineering and data analytics.
3Cloud hires people who aren’t afraid to experiment or fail and who are willing to give direct and candid feedback. They hire people who challenge and hold each other accountable for living 3Cloud’s core values because they know that it will result in amazing experiences and solutions for clients.