- Design, build, and maintain ETL pipelines using Stitch and AWS-native services.
- Own and optimize the Snowflake data warehouse, including schema design and performance tuning.
- Implement dbt models to transform raw data into clean, analytics-ready datasets.
- Support Looker by developing LookML models, explores, and governed metrics.
- Build data quality checks, logging, and monitoring across pipelines.
- Design scalable, resilient, AI-ready data ecosystems that support both structured and unstructured data for machine learning and AI use cases.
- Monitor, troubleshoot, and optimize pipelines for speed, scalability, and cost efficiency in AI workloads.
- Partner with product, support, operations, and engineering teams to define and deliver data solutions.
- Contribute to roadmap planning and provide architectural guidance, including data modeling standards, storage strategies, governance, and long-term scalability.