Job Responsibilities:

  • Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines.
  • Architect and implement robust data infrastructure capable of handling high-volume data ingestion and processing.
  • Develop and manage our central data warehouse in Google BigQuery.

Basic Qualifications:

  • 5+ years of hands-on experience in data engineering, data integration, or data platform development.
  • Strong programming and query skills in SQL and Python.
  • Experience working with distributed version control systems such as Git in an Agile/Scrum environment.

Why You’ll Love Working at ShyftLabs:

  • Work Arrangement: This role is currently fully remote, providing flexibility to work from home.

  • Comprehensive Benefits: We cover 100% of health, dental, and vision insurance premiums for you and your dependents, which means no out-of-pocket costs.

  • Growth & Learning: Access extensive learning and development resources to keep leveling up your skills.

ShyftLabs

ShyftLabs is a data product company founded in early 2020 that works with Fortune 500 companies, delivering digital solutions that accelerate business growth across industries through innovation and strong business awareness.

Apply for This Position