- Design and implement pipelines that use LLMs to analyze and score identity data in real time
- Integrate AI models directly into the decision-making loop, balancing accuracy with latency and cost
- Architect scalable data solutions using GCP and Python
- Manage data storage and retrieval using BigQuery and Apache Iceberg to support querying of TBs of data
- Use Apache Spark for data transformations and batch processing
- Maintain and optimize our system instances for 10k+ executions/minute
- Take ownership of the deployment process, ensuring safe releases with testing and rollback strategies
- Manage PostgreSQL performance under heavy load, optimizing complex queries and indexing strategies
- Use Claude Code and other AI engineering tools to accelerate development, refactoring, and testing
- Proactively monitor the system and investigate and fix the root causes of incidents
Tech stack: PostgreSQL, Python, GCP, +5 more