- 5+ years in data engineering, including architecting and operating production ETL/ELT pipelines for real-time, high-volume analytics platforms.
- Deep proficiency with ClickHouse, Postgres, Athena, and distributed data systems (Kafka, cloud-native stores).
- Proven experience designing both batch and streaming pipelines.
- Advanced programming skills in Python and SQL (Java a plus).
- Expertise in workflow orchestration (Airflow, Step Functions), CI/CD, and automated testing for data pipelines.
- Experience operating in high-scale, low-latency environments.
- Understanding of security, privacy, and compliance requirements for financial-grade platforms.
- Strong communication, business-alignment, and documentation skills.