- 2+ years of software engineering experience, preferably with a focus on data engineering or analytics systems.
- Experience with a major cloud platform (AWS, GCP, or Azure), including its core data services (e.g., S3, GCS).
- Proficiency with SQL and experience with a cloud data warehouse (e.g., Snowflake, Redshift, BigQuery).
- Familiarity with data transformation tools (e.g., dbt) and modern BI platforms (e.g., Sigma).
- Familiarity with workflow orchestration tools (e.g., Apache Airflow, Dagster).
- Proficiency in Python, Go, Kotlin, or another programming language commonly used in data engineering.
- Familiarity with version control (Git) and modern software development practices (CI/CD).
- Basic understanding of data warehousing concepts (e.g., dimensional modeling) and analytics architectures.
- An eagerness to learn about distributed data systems and stream processing concepts.
- Foundational knowledge of data quality and testing principles.
- Strong communication and collaboration skills.
- Ability to take direction and work effectively as part of a team.
- A proactive attitude toward problem-solving and self-improvement.
- Experience in an internship or junior role at a technology company (Preferred).
- Knowledge of container technologies (Docker, Kubernetes) (Preferred).
- Advanced degree in Computer Science, Data Engineering, or a related technical field (Preferred).