- Design, implement, and optimize real-time data pipelines that handle billions of events per day under strict SLAs.
- Architect data flows for bidstream data, auction logs, impression tracking, and user-behavior data.
- Build scalable, reliable event ingestion and processing systems using Kafka, Flink, Spark Structured Streaming, or similar technologies (a minimal consumer sketch follows this list).
- Operate data infrastructure on Kubernetes, managing deployments, autoscaling, resource limits, and high availability.
- Collaborate with backend engineers to integrate OpenRTB signals into our data platform in near real time.
- Ensure high throughput, low latency, and resilience across our streaming infrastructure (see the windowed-aggregation sketch below).
- Design and manage event schemas (Avro, Protobuf), schema evolution strategies, and metadata tracking (see the schema-evolution sketch below).
- Implement observability, alerting, and performance monitoring for critical data services (see the metrics sketch below).
- Contribute to decisions on data modeling and data-retention strategies for real-time use cases.
- Mentor other engineers and advocate for best practices in streaming architecture, reliability, and performance.
- Continuously evaluate new tools, trends, and techniques to evolve our modern streaming stack.
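
To make the ingestion responsibility concrete, here is a minimal sketch of a Kafka consume-process-commit loop using confluent-kafka; the broker address, topic name, consumer group, and the `process` stub are all illustrative placeholders, not part of the actual platform.

```python
# Minimal Kafka event-ingestion loop (confluent-kafka).
# Broker, topic, and group id below are hypothetical placeholders.
from confluent_kafka import Consumer, KafkaError

def process(payload: bytes) -> None:
    """Stand-in for real event processing (parse, enrich, forward)."""
    print(payload[:80])

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # placeholder broker
    "group.id": "bidstream-ingest",          # hypothetical consumer group
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,             # commit manually after processing
})
consumer.subscribe(["auction-logs"])         # hypothetical topic name

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            # Partition EOF is informational; anything else is a real error.
            if msg.error().code() != KafkaError._PARTITION_EOF:
                raise RuntimeError(msg.error())
            continue
        process(msg.value())
        consumer.commit(message=msg)         # at-least-once delivery
finally:
    consumer.close()
```

Committing offsets only after processing gives at-least-once semantics, which is the usual starting point before layering on idempotent or transactional writes.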
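For the low-latency processing responsibility, the sketch below shows event-time windowed counting over a Kafka topic with PySpark Structured Streaming, under assumed names: the broker address, `impressions` topic, checkpoint path, and console sink are placeholders for illustration only.

```python
# Event-time windowed aggregation over a Kafka topic with PySpark
# Structured Streaming. Topic, broker, and checkpoint path are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("impression-counts").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
    .option("subscribe", "impressions")                    # hypothetical topic
    .load()
)

# The Kafka source exposes a `timestamp` column; treat it as event time,
# tolerate 2 minutes of lateness, and count impressions per 1-minute window.
counts = (
    events
    .withWatermark("timestamp", "2 minutes")
    .groupBy(window(col("timestamp"), "1 minute"))
    .count()
)

query = (
    counts.writeStream
    .outputMode("update")
    .format("console")                                     # swap for a real sink
    .option("checkpointLocation", "/tmp/chk/impressions")  # placeholder path
    .start()
)
query.awaitTermination()
```

The watermark bounds how long state is kept for late events, which is the main lever for trading completeness against latency and state size in this kind of aggregation.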
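The schema-evolution responsibility can be illustrated with a small fastavro example of backward compatibility: a reader using a newer schema (one added optional field with a default) decodes a record written with the old schema. The record and field names here are hypothetical.

```python
# Backward-compatible Avro schema evolution with fastavro.
# Record and field names are illustrative, not a real production schema.
import io
from fastavro import parse_schema, schemaless_reader, schemaless_writer

writer_schema = parse_schema({
    "type": "record", "name": "Impression",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "ts", "type": "long"},
    ],
})

# v2 adds `campaign` with a default, so old records remain readable.
reader_schema = parse_schema({
    "type": "record", "name": "Impression",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "ts", "type": "long"},
        {"name": "campaign", "type": ["null", "string"], "default": None},
    ],
})

buf = io.BytesIO()
schemaless_writer(buf, writer_schema, {"id": "abc", "ts": 1700000000})
buf.seek(0)
record = schemaless_reader(buf, writer_schema, reader_schema)
print(record)  # {'id': 'abc', 'ts': 1700000000, 'campaign': None}
```

New fields with defaults (and never removing fields that lack them) is the core rule that keeps this kind of evolution safe under a schema registry's backward-compatibility check.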
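Finally, for the observability responsibility, here is a minimal throughput-and-latency instrumentation sketch with prometheus_client; the metric names, port, and simulated workload are assumptions for the example.

```python
# Minimal pipeline instrumentation with prometheus_client.
# Metric names, port, and the simulated work are placeholders.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

EVENTS = Counter("pipeline_events_total", "Events processed")
LATENCY = Histogram(
    "pipeline_process_seconds", "Per-event processing latency",
    buckets=(0.001, 0.005, 0.01, 0.05, 0.1, 0.5),
)

def handle(event: bytes) -> None:
    with LATENCY.time():                     # records elapsed seconds
        time.sleep(random.uniform(0, 0.01))  # stand-in for real work
    EVENTS.inc()

if __name__ == "__main__":
    start_http_server(8000)                  # exposes /metrics for scraping
    while True:
        handle(b"event")
```

A counter plus a latency histogram is enough to derive the rate, error, and duration signals that typical streaming SLO alerts are built on.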