- Proven experience leading data platform or data lake initiatives in cloud environments.
- Deep understanding of agile methodologies, sprint planning, and backlog management.
- Strong technical literacy in data architecture, pipelines, and storage optimization.
- Excellent communication and stakeholder management skills.
- Ability to balance short-term delivery with long-term platform scalability.
- Hands-on experience with data lake solutions on Snowflake or Databricks.
- Familiarity with data integration tools and ETL/ELT pipelines (e.g., Airflow, dbt, Fivetran, Azure Data Factory).
- Knowledge of cloud infrastructure (AWS, Azure, or GCP) and DevOps processes.
- Understanding of enterprise data governance, security, and compliance frameworks.
- Ability to define KPIs for operational efficiency and data reliability.