- 7+ years of experience in data warehousing, database administration, or database development.
- 5+ years of hands-on experience as a Data Engineer using SQL and Python.
- Expertise with cloud platforms such as AWS, GCP, or Azure.
- Familiarity with containerization tools such as Docker or Kubernetes.
- Proven experience working with large-scale datasets using technologies such as Snowflake, BigQuery, Redshift, Databricks, or similar.
- Strong knowledge of ELT/ETL tools and data integration frameworks, including Airflow, dbt, Python, and REST APIs.
- Solid understanding of data modeling, performance optimization, and pipeline maintainability.
- Excellent English communication skills.