Apply design and coding best practices to write highly optimized, scalable, and maintainable code for ML systems.
Design and optimize Izola’s multiagent orchestration for dynamic, context-aware dialogue across modalities.
Build and maintain answer quality monitoring pipelines using platforms like Arize and LangSmith, and tune our retrieval-augmented generation (RAG) architecture to improve relevance and accuracy.
Research and apply cutting-edge generative AI, NLP, and reinforcement learning techniques to enhance Izola’s conversational intelligence as requirements evolve.
Work closely with product, design, and engineering teams to deliver multiagent conversational experiences.
Apply rigorous testing, observability, and performance optimization for Izola’s production environment.
Requirements
Five-plus years designing and building enterprise-grade, cloud-native applications on public clouds (AWS preferred).
Four-plus years in ML engineering, owning end-to-end data pipelines for scalable AI systems (e.g., search, recommendations).
Deep expertise in LLM-powered conversational systems and RAG architectures, including multisource ingestion and indexing for hybrid semantic and keyword retrieval.
Proven experience in designing and operating multiagent systems using modern orchestration frameworks (e.g., Agno, Google ADK).
Experience in integrating ML observability platforms (e.g., Arize, LangSmith) to support evaluation and production readiness.
Hands-on expertise in retriever tuning and personalization, balancing relevance, latency, and scale.
Advanced proficiency in Python and cloud-native development, with production ownership in AWS environments.
Strong experience with MLOps practices and scalable ML architectures, from experimentation to production deployment.
Demonstrated ability to solve ambiguous, high-impact problems and influence outcomes across cross-functional teams.
Excellent communication skills, with a track record of translating complex technical concepts into clear guidance for diverse stakeholders.