- Bachelor's degree (or equivalent) in Computer Science or a related field.
- 3+ years of hands-on experience architecting scalable APIs and distributed systems, guiding projects from initial ideation through successful production deployment.
- Exceptional programming skills in Python, along with strong SQL or SparkSQL.
- In-depth experience with data stores such as BigQuery and Postgres.
- Proficiency with data pipeline and workflow orchestration tools such as Airflow and dbt.
- Expertise in data processing and streaming technologies, including Dataflow, Spark, Kafka, and Flink.
- Competence in deploying and monitoring infrastructure on public cloud platforms, using tools such as Docker, Terraform, Kubernetes, and Datadog.
- Proven ability to load, query, and transform extensive datasets.