- 8+ years of deep, hands-on experience in Data Engineering, Data Infrastructure, or building Data Engineering tools and frameworks.
- Exceptional expertise in data structures, algorithms, and distributed systems.
- Mastery of Python for large-scale data processing.
- Extensive experience designing, building, and optimizing complex, fault-tolerant data pipelines (batch and real-time streaming).
- Deep understanding of and hands-on experience with cloud-native data platforms, especially Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Composer, and Dataproc.
- Demonstrated experience with modern data orchestration (e.g., Airflow), data transformation (e.g., dbt), and data warehousing concepts.
- Strong knowledge of and ability to implement unit, integration, and functional testing strategies.
- Experience providing technical leadership and guidance.
- Strong communication skills and the ability to work well in a diverse team setting.
- Demonstrated experience working with many cross-functional partners.
- Demonstrated experience leading a software product or component vision and delivery plan.