- 5+ years of experience in data engineering, analytics engineering, or related technical roles, with a focus on building tools and platforms.
- Expert-level proficiency in Python, writing clean, maintainable code for robust, production-grade systems.
- Proficiency with Kubernetes and container orchestration, including deploying and managing data services in production.
- Experience with Helm charts for packaging, versioning, and deploying applications on Kubernetes.
- Strong infrastructure skills, including knowledge of infrastructure-as-code tools (e.g., Terraform, CloudFormation) and cloud platforms (AWS preferred).
- Strong knowledge of dbt and modern analytics engineering practices.
- Experience with workflow orchestration tools, particularly Airflow.
- Hands-on experience with modern data warehouses such as Redshift and Snowflake, including performance optimization and cost management.
- A product mindset when building internal tools.
- Strong communication and collaboration skills.