- 7+ years designing and implementing large-scale data architectures for SaaS platforms
- Deep expertise with modern cloud data warehouses (Snowflake, Redshift, BigQuery)
- Extensive experience building scalable data pipelines using tools such as Airflow, dbt, Estuary, Fivetran, or custom solutions
- Proven experience designing data architectures for multi-tenant SaaS applications
- Strong background in dimensional modeling, star/snowflake schemas, and data normalization/denormalization strategies
- Experience designing integration frameworks, API-based data exchange, and event-driven architectures
- Experience with AWS (RDS, MSK, S3, Lambda)
- Expertise in PostgreSQL and Snowflake (or similar databases)
- Experience with Kafka/MSK for event streaming
- Experience with dbt, Airflow, Estuary, or similar data orchestration tools
- Strong Python and SQL skills
- Experience with data migration projects
- Knowledge of data governance, compliance (GDPR, CCPA), and security best practices
- Experience with observability and monitoring tools for data pipelines (Datadog, CloudWatch, etc.)
- Strong communication skills