- Experience with various data pipeline patterns (push-based, pull-based, eventually consistent, batch, streaming)
- Familiarity with Delta Lake
- Experience with Scala Spark
- Experience with Python Spark
- Experience with Airflow
- Ability to think in systems and design data workflows
- Flexibility and strong opinions on data pipeline instrumentation
- Ability to influence and lead engineers
- Attention to detail in schema evolution, data contracts, and dependency graphs
- Pragmatism in development and shipping
- Focus on performance and cost awareness