What you'll do:
- Design and build mission-critical data pipelines on a highly scalable distributed architecture, including data ingestion (streaming, events, and batch), data integration, and data curation
- Help continually improve ongoing reporting and analysis processes, simplifying self-service support for business stakeholders
- Build and support reusable frameworks to ingest, integrate, and provision data
About you:
- 3 to 5 years' experience in data warehouse / data lakehouse technical architecture
- 3+ years of experience in using programming languages (Python / Scala / Java / C#)
- Minimum 3 years of experience with Big Data and Big Data tools in one or more of the following: batch processing (e.g. Hadoop distributions, Spark), real-time processing (e.g. Kafka, Flink/Spark Streaming)
Skills:
- Experience with Database Architecture/Schema design
- Strong familiarity with batch processing and workflow tools such as dbt, Airflow, and NiFi
- Experience providing technical leadership and mentoring other engineers on data engineering best practices
StockX
StockX is a Detroit-based technology leader focused on the online market for sneakers, apparel, accessories, electronics, collectibles, trading cards, and more. The company employs 1,000 people across offices and verification centers around the world, and its platform connects buyers and sellers using dynamic pricing mechanics.