- 8+ years of hands-on experience in Data Engineering, Data Infrastructure, or building Data Engineering tools.
- Proven experience in a technical leadership role, mentoring engineers and leading projects.
- Exceptional expertise in data structures, algorithms, and distributed systems.
- Mastery of Python for large-scale data processing.
- Extensive experience designing and building fault-tolerant data pipelines, both batch and real-time streaming.
- Deep hands-on experience with cloud-native data platforms, especially Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Composer, and Dataproc.
- Demonstrated experience with modern data orchestration (e.g., Airflow), data transformation (dbt), and data warehousing.
- Experience with project management and breaking down complex problems.
- In-depth knowledge of unit, integration, and functional testing strategies.
- Strong communication skills and the ability to work well in a diverse, cross-functional setting.
- Demonstrated experience leading a software product or component vision and delivery plan.