Job Description
Responsibilities:
- Architect and implement scalable Lakehouse solutions.
- Design and orchestrate complex data workflows using Databricks.
- Manage Databricks platform internals, including cluster configuration.
Requirements:
- Hands-on experience across the full Databricks ecosystem.
- Deep experience in data engineering fundamentals.
- Proficiency in Python, PySpark, and SQL.
Nice to have:
- Experience developing internal Python libraries for reuse across pipelines.
- Experience implementing automated testing strategies for data pipelines.
- Experience collaborating with cross-functional teams to establish best practices.
About Coderio
Coderio designs and delivers scalable digital solutions for global businesses, combining a strong technical foundation with a product mindset.