- Strong foundation in transformer architectures (encoder-only, decoder-only, and multimodal; e.g., BERT, GPT, LLaMA, Mistral)
- Hands-on experience with frontier-model tooling such as tool calling, MCP, context management, prompt tuning, and evaluation frameworks
- Knowledge of retrieval systems, embeddings, semantic search, and orchestration of complex AI workflows
- Experience with NLP tasks such as summarization, entity extraction, dialogue systems, or semantic understanding
- Experience working with frontier models from OpenAI and Anthropic, including token-based inference and orchestration
- Proficiency in Rust or a similar high-performance language (Go, C++, systems-level Python)
- Experience building production-grade services in cloud-native environments (AWS, GCP, or Azure)
- Strong background in scalability, observability, and distributed systems
- Experience deploying and operating AI systems in production
- Familiarity with CI/CD pipelines, automated testing, and evaluation frameworks