- Minimum of 6 years of industry experience in data infrastructure/data engineering.
- Minimum of 6 years of experience with Python and SQL.
- Minimum of 3 years of industry experience using DBT.
- Minimum of 3 years of industry experience using Snowflake.
- Familiarity with AWS services (Lambda, Step Functions, Glue, RDS, EKS, DMS, EMR).
- Industry experience with big data platforms and tools (Snowflake, Kafka, Hadoop, Hive, Spark, Cassandra, Airflow).
- Industry experience working with relational and NoSQL databases.
- Strong fundamentals in data structures, algorithms, and design patterns.
- Prior experience working on cross-functional teams.
- Experience building and improving user onboarding funnels and designing comprehensive experiments.
- Industry experience with Infrastructure as Code tools (CDK and Terraform).
- Experience with CI/CD practices to improve code stability and quality.
- Motivated to help other engineers succeed.
- Excited to work in an ambiguous, fast-paced, high-growth environment.
- Strong written and verbal communication skills.