- 7+ years designing and implementing large-scale data architectures for SaaS platforms
- Deep expertise with modern cloud data warehouses (Snowflake, Redshift, BigQuery)
- Extensive experience building scalable data pipelines using tools such as Airflow, dbt, Estuary, Fivetran, or custom solutions
- Proven experience designing data architectures for multi-tenant SaaS applications
- Strong background in dimensional modeling, star/snowflake schemas, and data normalization/denormalization strategies
- Experience designing integration frameworks, API-based data exchange, and event-driven architectures
- AWS experience (required), including RDS, MSK, S3, Lambda, and related data services
- Expertise in PostgreSQL (operational) and Snowflake or similar (analytical)
- Experience with query optimization and indexing strategies
- Experience with Kafka/MSK for event streaming and real-time data pipelines
- Experience with dbt, Airflow, Estuary, or similar data orchestration tools
- Strong Python and SQL skills
- Experience with Terraform, CloudFormation, or similar for provisioning data infrastructure
- Experience with data migration projects, especially customer data migrations in SaaS contexts
- Knowledge of data governance, compliance (GDPR, CCPA), and security best practices
- Experience with observability and monitoring tools for data pipelines (Datadog, CloudWatch, etc.)
- Strong communication skills