- Design, develop, and maintain scalable, secure, and high-performance data platforms.
- Build and manage data pipelines (ETL/ELT) using tools such as Apache Airflow, DBT, SQLMesh, or similar.
- Architect and optimize lakehouse solutions (e.g., Iceberg).
- Lead the design and implementation of data infrastructure components (streaming, batch processing, orchestration, lineage, observability).
- Ensure data quality, governance, and compliance (GDPR, HIPAA, etc.) across all data processes.
- Automate infrastructure provisioning and CI/CD pipelines for data platform components using tools like Terraform, CircleCI, or similar.
- Collaborate cross-functionally with data scientists, analytics teams, and product engineers to understand data needs and deliver scalable solutions.
- Mentor experienced data engineers and set best practices for code quality, testing, and platform reliability.
- Monitor and troubleshoot performance issues in real-time data flows and long-running batch jobs.
- Stay ahead of trends in data engineering, proactively recommending new technologies and approaches to keep our stack modern and efficient.