Angular, AWS, Azure, Google Cloud Platform (GCP), JavaScript, Python, TypeScript, C#, C, AI, ML, NLP, GenAI, HTML5, RESTful, Remote Work
Role Overview
Design and develop state-of-the-art GenAI and general AI solutions as well as multiagent systems to solve complex business problems
Lead the design and architecture of scalable, efficient, and high-performance data systems that support processing of massive volumes of structured and unstructured data
Stay up to date with the latest trends in AI, NLP, LLMs and big data technologies
Contribute to the development and implementation of new techniques that improve performance and innovation
Collaborate with cross-functional teams, including engineers, product owners, and other stakeholders to deploy AI models into production systems and deliver value to the business
Translate high-level business requirements into detailed functional, technical and system specifications
Develop, maintain, and support multiple distributed applications
Contribute to 24x7 availability and scalability of the solutions
Analyze existing designs and interfaces and create design extensions or enhancements
Design and enhance UIs with a focus on the user experience
Test software designs and solutions (including debugging and troubleshooting)
Promote the firm’s values and set an example for team members by upholding those values
Requirements
Bachelor’s degree in Computer Science, Engineering, or a related discipline (or equivalent experience)
A minimum of 5 years of experience in information technology
Hands-on experience developing or integrating AI/ML models into applications
Proficiency in one or more programming languages such as Python, C#, or JavaScript/TypeScript
Hands-on experience with GenAI models, such as large language models (LLMs)
Experience consuming and building RESTful or event-driven APIs
Solid understanding of cloud platforms (Azure, AWS, or GCP) and containerized deployments
Familiarity with data pipelines, feature stores, or model lifecycle management
Demonstrated experience with big-data technologies and the ability to process, clean, and analyze large-scale datasets