Strong software engineering background, with experience in AI/ML frameworks such as TensorFlow and PyTorch and their application to generative models.
Proven experience integrating AI/ML models into real-world applications, particularly those involving production use cases of LLMs.
Deep understanding of model behavior and context windows, along with strong practical experience implementing and optimizing RAG systems.
Hands-on experience with cloud infrastructure and deployment tools, particularly Terraform, and with deploying services on platforms such as AWS, GCP, or Azure.
Proficiency in unit, integration, and system testing, with a focus on ensuring robustness, reliability, and performance of AI-powered systems.
Comfortable working in cross-functional teams, collaborating with Product Managers, Designers, and Lead Engineers to build cohesive, user-centered solutions.
Responsibilities:
Bring AI solutions to our existing Web App, shipping full-stack AI projects end to end using pre-trained models.
Build and integrate components for Ordio’s AI infrastructure, supporting production-level inference and fine-tuning.
Write clean, efficient, and maintainable code for AI-driven applications, APIs, and user interfaces, following best practices.
Design and implement internal and external MCP servers, enabling interaction with services through chat-based user input.
Deploy AI solutions to the cloud via Terraform, leveraging cloud-native services for compute, storage, networking, and AI-specific offerings.
Work closely with Product Managers, Lead Developers, UX/UI Designers, and other team members to integrate AI components seamlessly into the overall solution.
Develop and execute unit, integration, and system tests to ensure the reliability and robustness of AI solutions.