Senior Staff Software Engineer, Backend (Data and Storage Services)
Remote (US) · Full-Time · Staff
Salary: $232,000 to $310,000 USD per year
Job Details
- Required Skills
- AWS, Python, SQL, Snowflake, Spark, CI/CD, Terraform, Data modeling, dbt
Requirements
- Architect and Implement: Design, develop, and maintain core components of Affirm's lakehouse analytics platform, with a focus on scalability, governance, and reliability.
- Snowflake Expertise: Leverage deep knowledge of Snowflake to architect RBAC models, dynamic data masking, warehouse optimization, and multi-cluster compute strategies, drawing on a thorough understanding of Snowflake internals, including query profiling, micro-partitioning, clustering, materialized views, and cost attribution.
- Analytics Engineering: Drive the technical strategy for data modeling and transformation using dbt, including testing frameworks, documentation standards, and CI/CD for data pipelines.
- Data Governance & Privacy: Design and operate data governance frameworks using tools like Atlan, including data cataloging, lineage tracking, classification, and automated privacy policy enforcement.
- Lakehouse Architecture: Tackle the challenges of large-scale analytical data systems, including Apache Iceberg table management, schema evolution, storage optimization, and integration with Spark and Snowflake.
- Collaboration: Work closely with product managers, software engineers, and analysts to translate business requirements into technical solutions, and partner with fellow engineers to deliver high-quality data infrastructure.
- Mentorship: Guide and mentor junior and senior engineers, sharing your expertise and fostering a culture of technical excellence.
- Innovation: Stay ahead of the curve by researching and experimenting with emerging technologies and trends in the lakehouse, data governance, and analytics engineering space.
- Experience: 10+ years of experience in software engineering or data engineering, with a proven track record of delivering complex data platform solutions that improve accessibility, performance, and governance of analytics infrastructure.
- Snowflake Expertise: 6+ years of hands-on experience with Snowflake or comparable analytical data warehouses, including RBAC design, data masking, query optimization, and cost management.
- Lakehouse & Big Data: Strong experience with Apache Iceberg, Spark, and cloud-native data lake architectures on AWS (S3, EKS).
- Analytics Engineering: Experience with dbt or equivalent transformation frameworks, including data modeling best practices, testing, and CI/CD for data pipelines.
- Problem Solving: Exceptional problem-solving and analytical skills, with the ability to identify and resolve complex technical challenges and establish long-lasting solutions and processes.
- Programming Skills: Proficiency in Python and SQL, with a strong emphasis on clean, maintainable code. Experience with Kotlin or Go is a plus.
- Leadership: Demonstrated leadership and mentorship skills, with the ability to inspire and guide others. You can also work cross-functionally addressing technical challenges and influencing roadmaps outside your direct area of ownership.
- Innovation: You drive innovation in the platforms you build and operate, and have experience contributing to open-source projects. You are passionate about engaging with the data engineering community.
- Infrastructure as Code (IaC): Familiarity with automation tools like Terraform for managing data infrastructure.
- Communication: Excellent communication and interpersonal skills, with the ability to clearly articulate technical ideas to both technical and non-technical audiences.
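The RBAC and dynamic data masking expertise called for above can be illustrated with a minimal sketch. This is plain Python standing in for the concept behind a Snowflake masking policy (the value a reader sees depends on the role they hold); the role names and masking rule are hypothetical, not Snowflake's actual policy engine:

```python
# Sketch of role-based dynamic data masking, analogous in spirit to a
# Snowflake masking policy. Role names and the masking rule are hypothetical.

FULL_ACCESS_ROLES = {"ANALYTICS_ADMIN", "PRIVACY_OFFICER"}

def mask_email(value: str, current_role: str) -> str:
    """Return the raw email for privileged roles, a masked form otherwise."""
    if current_role in FULL_ACCESS_ROLES:
        return value
    local, _, domain = value.partition("@")
    # Keep the first character and the domain so masked values remain
    # loosely joinable and debuggable without exposing the identity.
    return f"{local[0]}***@{domain}"

print(mask_email("jane.doe@example.com", "ANALYTICS_ADMIN"))  # jane.doe@example.com
print(mask_email("jane.doe@example.com", "BI_READER"))        # j***@example.com
```

In Snowflake itself, the equivalent logic would live in a `CREATE MASKING POLICY` statement attached to a column, with `CURRENT_ROLE()` driving the branch.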
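Similarly, the dbt testing requirement above boils down to automated assertions over model outputs. A sketch of the two most common generic tests (`unique` and `not_null`) in plain Python; in dbt these are declared in schema YAML and compiled to SQL, and the model and column names here are hypothetical:

```python
# Sketch of dbt-style generic data tests (not_null / unique) as plain
# functions over rows. Model and column names are hypothetical.

def test_not_null(rows, column):
    """Return rows where `column` is NULL (an empty list means the test passes)."""
    return [r for r in rows if r.get(column) is None]

def test_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

orders = [
    {"order_id": 1, "user_id": 10},
    {"order_id": 2, "user_id": None},
    {"order_id": 2, "user_id": 11},
]
print(test_not_null(orders, "user_id"))  # [{'order_id': 2, 'user_id': None}]
print(test_unique(orders, "order_id"))   # [2]
```

Both tests follow dbt's convention that a test passes when it returns zero failing records, which is what makes them enforceable gates in a CI/CD pipeline.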
Responsibilities
- Architect and evolve Affirm's lakehouse analytics platform, driving strategy around Snowflake, Apache Iceberg, and Spark to deliver scalable, high-performance analytical infrastructure.
- Design and implement robust Role-Based Access Control (RBAC) and dynamic data masking policies in Snowflake, ensuring data access is secure, compliant, and auditable across the organization.
- Lead the technical direction of analytics engineering practices, including data modeling, transformation pipelines (dbt), and data quality frameworks that enable trustworthy, self-service analytics.
- Drive data governance and privacy engineering initiatives, leveraging tools like Atlan to manage data cataloging, lineage, classification, and policy enforcement.
- Identify and execute cost optimization strategies across Affirm's analytical compute and storage footprint, including Snowflake warehouse tuning, query optimization, and efficient data lifecycle management.
- Collaborate with product engineering, data science, and business intelligence teams to understand their data needs and provide continuous guidance on design, architecture, and best practices.
- Establish and champion best practices for lakehouse operations at scale, including schema evolution, table maintenance, partitioning strategies, and observability.
- Stay ahead of industry trends in analytical data platforms, data governance, and privacy technologies, and identify opportunities to innovate and improve our data offerings.
- Mentor engineers across the Lake Analytics Platform and Analytics Engineering teams, providing guidance on emerging technologies, development practices, and fostering a culture of technical excellence.
- Participate in an on-call rotation and collaborate with other teams such as SRE to resolve production issues.
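One basic building block of the cost-attribution work described above is rolling warehouse credit consumption up by warehouse and converting it to dollars. A minimal sketch over hypothetical metering rows; in Snowflake, the real data would come from the `ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY` view, and the per-credit rate below is illustrative, not a real contract rate:

```python
from collections import defaultdict

# Sketch of per-warehouse cost attribution: sum credits by warehouse and
# convert to dollars at a flat rate. The rows, warehouse names, and
# $/credit rate are hypothetical.

DOLLARS_PER_CREDIT = 3.0  # illustrative rate, not a real contract figure

def attribute_costs(metering_rows):
    """Aggregate credits per warehouse and return estimated dollar cost."""
    credits = defaultdict(float)
    for row in metering_rows:
        credits[row["warehouse_name"]] += row["credits_used"]
    return {wh: round(c * DOLLARS_PER_CREDIT, 2) for wh, c in sorted(credits.items())}

rows = [
    {"warehouse_name": "BI_WH", "credits_used": 12.5},
    {"warehouse_name": "ETL_WH", "credits_used": 40.0},
    {"warehouse_name": "BI_WH", "credits_used": 7.5},
]
print(attribute_costs(rows))  # {'BI_WH': 60.0, 'ETL_WH': 120.0}
```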
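The lineage-tracking side of the governance work above largely reduces to graph traversal: given edges from upstream to downstream datasets, find everything affected by a change. A minimal sketch, with hypothetical dataset names; a catalog such as Atlan maintains and exposes this graph in practice:

```python
from collections import deque

# Sketch of downstream-impact analysis over a data lineage graph.
# Edges map each dataset to the datasets built from it; names are hypothetical.

LINEAGE = {
    "raw.payments": ["staging.payments"],
    "staging.payments": ["marts.revenue", "marts.risk"],
    "marts.revenue": ["dashboards.finance"],
}

def downstream(dataset):
    """Return all datasets transitively derived from `dataset` (BFS)."""
    seen, queue = set(), deque([dataset])
    while queue:
        for child in LINEAGE.get(queue.popleft(), []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return sorted(seen)

print(downstream("raw.payments"))
# ['dashboards.finance', 'marts.revenue', 'marts.risk', 'staging.payments']
```

The same traversal, run in reverse, answers the provenance question ("where did this dashboard's data come from?") that auditors and privacy reviews typically ask.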