- Bachelor's degree (or equivalent) in Computer Science or a related field.
- 5+ years of hands-on experience architecting scalable APIs and distributed systems.
- Exceptional programming skills in Python and proficiency in SQL or Spark SQL.
- In-depth experience with data stores such as BigQuery and Postgres.
- Proficiency in data pipeline tools such as Airflow and dbt.
- Expertise in data processing technologies including Dataflow, Spark, Kafka, and Flink.
- Competence in deploying and monitoring infrastructure with tools such as Docker, Terraform, Kubernetes, and Datadog.
- Proven ability to load, query, and transform large datasets.