8+ years of professional experience in the data engineering field
Polyglot programming expertise; hands-on, current Python experience is a must-have
Experience with marketing channel data automation, pipeline monitoring, and data delivery is strongly preferred
Extensive experience designing and developing on the Snowflake Cloud Data Platform
Proficiency with multiple cloud platforms such as AWS, Azure, and/or GCP
Proficiency in designing and implementing data pipelines using diverse data sources, including databases, APIs, external data providers, and streaming sources
Demonstrated history of designing efficient data models using the Medallion Architecture
Deep understanding of and experience with relational (SQL Server, Oracle, Postgres, and MySQL) and NoSQL databases
Experience building and supporting REST APIs for both inbound and outbound data workflows
Solid grasp of distributed systems concepts for designing scalable, fault-tolerant data architectures
Excellent critical thinking skills for performing root cause analysis on internal and external processes and data
Excellent analytical skills for working with structured and unstructured datasets
Ability to build processes that support data transformation, workload management, data structures, dependency management, and metadata
Ability to build and optimize datasets, 'big data' pipelines, and architectures
Ability to understand and tell the story embedded in the data
Ability to communicate with non-technical audiences
Strong knowledge of coding standards, best practices, and data governance
Demonstrated AI literacy
Must attend occasional in-person meetings.
Must have a fast, reliable DSL, fiber, or broadband connection (not mobile or a broadband stick) with a minimum actual speed of 50 Mbps.