Accountabilities:
- Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark (see the illustrative sketch after this list).
- Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
- Work on cloud platforms (Azure, AWS) to build and manage data lakes and scalable architectures.
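For context, a workflow of the kind described above often follows an extract-transform-load shape in PySpark. The sketch below is a minimal, hedged illustration only: the storage paths, column names (`event_type`, `event_ts`), and the daily aggregation are assumptions for demonstration, not details from this posting.

```python
# Minimal PySpark ETL sketch. All paths and column names are
# hypothetical placeholders, not taken from the job posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw event data from a data lake location (placeholder path).
raw = spark.read.parquet("s3://example-bucket/raw/events/")

# Transform: drop malformed rows and compute a daily count per event type.
daily = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .agg(F.count("*").alias("event_count"))
)

# Load: write partitioned output for downstream consumers.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/"
)

spark.stop()
```

Partitioning the output by date, as sketched here, is a common choice for large datasets because it lets downstream queries prune irrelevant partitions.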
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- 3–6 years of experience in Data Engineering or related roles.
- Hands-on experience with big data processing frameworks and data lakes.
Benefits:
- Flexible remote working conditions.
- Opportunities for professional growth and training.
- Health and wellness benefits.
Jobgether
Jobgether is an AI-powered platform that helps job seekers find suitable opportunities. It connects best-fitting candidates with hiring companies, streamlining the recruitment process through objective and fair assessments.