- Bachelor’s degree with a minimum of 1.5 years of experience working successfully in globally distributed teams
- Must have experience with Python and data-handling frameworks (Spark, Beam, etc.)
- Experience applying cloud storage and compute to data pipelines on GCP (GCS, BigQuery, Cloud Composer, etc.)
- Write and orchestrate data pipelines in Airflow (see the minimal sketch after this list)
- Experience handling data from third-party providers is a great plus: Google Analytics, Google Ads, etc.
- Experience manipulating, processing, and extracting value from large, disconnected datasets
- Experience with software engineering practices in data engineering (e.g. release management, testing) and the corresponding tooling (dbt, Great Expectations, ...)
- Basic knowledge of dbt is good to have
- Knowledge of data privacy and security
- Excellent verbal and written communication skills
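
To illustrate the Airflow and GCP items above, here is a minimal, hedged sketch of a DAG that loads a daily partition from GCS into BigQuery using the Google provider's GCSToBigQueryOperator. The bucket, dataset, and table names are hypothetical placeholders, not part of this posting.

```python
# Minimal Airflow DAG sketch: orchestrate a daily GCS-to-BigQuery load.
# All resource names (bucket, dataset, table) are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="example_gcs_to_bq",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-bucket",                    # hypothetical GCS bucket
        source_objects=["events/{{ ds }}/*.json"],  # templated daily partition
        destination_project_dataset_table="analytics.events",  # hypothetical BQ table
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )
```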