- Lead and grow a team of data engineers responsible for scalable, reliable, and secure ingestion, transformation, and pipeline management across batch and streaming systems.
- Drive technical excellence in the design and operation of Airflow, Kafka, Spark, and Iceberg-based data workflows, ensuring data freshness, completeness, and quality SLAs are consistently met.
- Partner with Product Management, Data Infrastructure, and Analytics Engineering to define the roadmap for ingestion, self-service pipeline automation, and data quality frameworks aligned to business goals.
- Establish and track operational metrics to improve the reliability and visibility of data systems.
- Build a strong engineering culture focused on craftsmanship, ownership, and learning, mentoring engineers through design reviews, incident retrospectives, and technical deep dives.
- Collaborate cross-functionally to develop declarative, self-service tools that reduce dependencies on central teams and improve "time to insight" for internal stakeholders.
- Contribute to the longer-term architectural strategy for the unified data lakehouse, data catalog, and real-time infrastructure that will power Webflow's next generation of AI and ML use cases.