- Design, implement, and optimize high-throughput, low-latency data pipelines
- Build scalable data ingestion and transformation frameworks
- Ensure fault tolerance, exactly-once semantics, and replayability in event processing
- Establish reusable patterns for pipeline development, deployment, and monitoring
- Design data models across streaming, analytical, and serving layers
- Partner with Data Architects to evolve domain-driven data models
- Implement optimal partitioning, clustering, and indexing strategies
- Embed observability, lineage, and data quality validations into pipelines
- Define and enforce SLAs and SLOs for data delivery, freshness, and accuracy
- Collaborate with Platform Engineers on metrics, alerts, and self-healing mechanisms
- Act as a senior engineer, contributing to design reviews and mentoring junior engineers
- Work closely with AI, Platform, and Product teams