- 2 years of professional experience in data engineering or a related field
- Experience designing, developing, or maintaining data pipelines, ETL/ELT processes, or data-driven systems
- Solid understanding of data modeling, data warehousing concepts, and data quality principles
- Experience working with cloud-based data platforms (AWS, Azure, or GCP)
- Strong SQL skills, with experience in SQL/NoSQL databases (MySQL, MongoDB) and cloud data warehouses (Snowflake, Redshift, BigQuery)
- Proficiency in a programming language such as Python or Java
- Familiarity with modern data technologies (Spark, Airflow, dbt, Kafka, or similar) is a plus
- Strong problem-solving skills
- Ability to work collaboratively in a cross-functional team
- Good communication skills
- Willingness to learn, ask questions, and grow