- Design, develop, and maintain robust and scalable data pipelines to ingest, transform, and store data from diverse sources
- Optimize data systems for performance, scalability, and reliability in a cloud-native environment
- Work closely with data analysts, data scientists, and other stakeholders to ensure high data quality and availability
- Develop and manage data models using dbt, ensuring modular, testable, and well-documented transformation layers
- Implement and enforce data governance, security, and privacy standards
- Manage and optimize cloud data warehouses, especially Snowflake, for performance, cost-efficiency, and scalability
- Monitor, troubleshoot, and improve data workflows and ETL/ELT processes
- Collaborate in the design and deployment of data lakes, warehouses, and lakehouse architectures
Skills: Docker, Python, SQL (+10 more)
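As a concrete (purely illustrative) flavor of the ingest/transform/store work described above, here is a minimal Python sketch of one ELT step. The sample records, table name, and field names are invented for the example, and an in-memory SQLite database stands in for a real warehouse such as Snowflake; production pipelines would pull from real sources and load into managed storage.

```python
import sqlite3

# Hypothetical raw records from an upstream source (assumption: in a real
# pipeline these would come from an API, file drop, or stream).
RAW_RECORDS = [
    {"id": 1, "amount": "10.50", "currency": "usd"},
    {"id": 2, "amount": "3.25", "currency": "USD"},
    {"id": 3, "amount": None, "currency": "usd"},  # dirty row, filtered out below
]

def transform(records):
    """Drop rows with missing amounts; normalise numeric types and casing."""
    cleaned = []
    for rec in records:
        if rec["amount"] is None:
            continue
        cleaned.append(
            {
                "id": rec["id"],
                "amount": float(rec["amount"]),
                "currency": rec["currency"].upper(),
            }
        )
    return cleaned

def load(records, conn):
    """Store transformed rows in a local SQLite table (warehouse stand-in)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments "
        "(id INTEGER PRIMARY KEY, amount REAL, currency TEXT)"
    )
    conn.executemany(
        "INSERT INTO payments (id, amount, currency) "
        "VALUES (:id, :amount, :currency)",
        records,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(RAW_RECORDS), conn)
rows = conn.execute("SELECT id, amount, currency FROM payments ORDER BY id").fetchall()
print(rows)  # → [(1, 10.5, 'USD'), (2, 3.25, 'USD')]
```

The same clean/validate/load shape scales up to the dbt-managed transformation layers and Snowflake optimisation work the role calls for.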