- Design, develop, and maintain end-to-end data pipelines in Snowflake, from ingestion through curated data models.
- Support data ingestion processes, including configuring and maintaining automated data connectors.
- Contribute to migrating existing ETL workflows to dbt, ensuring logic parity while improving performance and maintainability.
- Develop well-structured, modular dbt models, macros, and tests following best practices (a minimal example is sketched below).
- Participate in architectural discussions and help shape standards for data modeling, testing, and deployment.
- Collaborate with business stakeholders and analytics users to translate requirements into reliable data models and metrics.
- Contribute to improving data quality, observability, and documentation across the data platform.
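As a rough illustration of the kind of modular dbt model this role involves, here is a minimal staging-model sketch in dbt-style SQL for Snowflake. The source, table, and column names (raw_shop, orders, order_id) are hypothetical, and the unique/not_null tests mentioned in the comments would be declared in a companion schema .yml file rather than in the model itself.

```sql
-- models/staging/stg_orders.sql
-- Hypothetical staging model: source, table, and column names are illustrative only.
-- Scope is deliberately narrow (rename, cast, light cleanup); joins and business
-- logic belong in downstream intermediate/mart models.
-- Companion tests (e.g. unique and not_null on order_id) would be declared in a
-- schema .yml file alongside this model.

with source as (

    select * from {{ source('raw_shop', 'orders') }}

),

renamed as (

    select
        id                                  as order_id,
        customer_id,
        cast(ordered_at as timestamp_ntz)   as ordered_at,
        lower(status)                       as order_status
    from source

)

select * from renamed
```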