Qualifications:
- 5+ years of experience with ETL/ELT tools and processes, including at least one year using Apache Airflow
- 3+ years of experience with Python
- 3+ years of experience with data warehousing concepts (Snowflake preferred)
- 2+ years of experience with reporting tools (Power BI and Cognos preferred)
- 1+ years of experience with cloud platforms (AWS preferred) and their data services
- Strong proficiency in SQL and experience with relational databases (Oracle and Snowflake preferred)
- Familiarity with data lake concepts and technologies
- Understanding of data modeling principles and techniques
Responsibilities:
- Design, develop, and implement robust and scalable data pipelines
- Ensure data quality, reliability, and efficient data access
- Work with various stakeholders across the organization