- 5–8 years of total experience in data-intensive environments
- Minimum 2 years as a Data Engineer, building or maintaining large-scale pipelines and distributed data systems
- Minimum 3 years in Product Management or Technical Program Management, owning data or infrastructure products
- Deep understanding of data engineering concepts: ETL frameworks, data modeling, distributed processing, and streaming systems (e.g., Spark, Airflow, Kafka, Flink)
- Strong familiarity with cloud data platforms (Snowflake, BigQuery, Redshift) and data lake architectures
- Proficient in SQL; comfortable reading Python or similar scripting languages
- Demonstrated ability to write clear technical documentation and product specifications
- Excellent verbal and written communication skills
- Highly proactive and self-directed