Collaborates with business stakeholders to determine functional and non-functional requirements for new software and/or applications.
Provides recommended business workflows and architecture for the Data / ML and AI platform, understanding requirements and helping each project team document them.
Leads development of the Data Modernization strategy and roadmap, enabling the insurance carrier to migrate data computing to the cloud using cloud-native Data / ML and AI assets, tools, and capabilities.
Designs and supports the application architecture utilized by in-house developed applications, vendor applications, and databases; ensures development efforts are carried out using a consistent, quality-driven approach.
Serves as a hands-on Data / ML and AI Specialist Architect, including code and framework development. Provides hands-on technical support, guidance, and coaching to Data / ML and AI developers.
Oversees the development team's progress to ensure consistency with the initial design, and ensures the software meets all requirements for quality, security, modifiability, extensibility, etc.
Designs and implements long-term strategic goals and short-term tactical plans for managing and maintaining Data / ML and AI software systems and platforms, including assessing and addressing technical risks.
Builds a Databricks Delta Lake-based Lakehouse using PySpark jobs, Databricks Workflows, Unity Catalog, Delta Sharing, serverless computing, and the Medallion architecture.
Designs, develops, and maintains Databricks-based data architectures and models for high availability, scalability, and performance.
Demonstrates expertise in implementing data-security best practices to safeguard sensitive information within Databricks.
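The Medallion architecture named above refines data in bronze → silver → gold layers. As a minimal illustration of that flow, the sketch below uses plain Python in place of actual PySpark/Delta code; the record fields (`policy_id`, `premium`, `state`) are hypothetical examples, not part of any real schema:

```python
# Minimal sketch of the Medallion (bronze/silver/gold) refinement pattern.
# Plain Python stands in for PySpark/Delta; all field names are hypothetical.

from collections import defaultdict

# Bronze: raw, append-only ingestion (kept as-is, including bad rows).
bronze = [
    {"policy_id": "P1", "premium": "100.0", "state": "FL"},
    {"policy_id": "P1", "premium": "100.0", "state": "FL"},  # duplicate
    {"policy_id": "P2", "premium": "bad",   "state": "FL"},  # malformed
    {"policy_id": "P3", "premium": "250.0", "state": "GA"},
]

def to_silver(rows):
    """Silver: validated, typed, de-duplicated records."""
    seen, out = set(), []
    for r in rows:
        try:
            premium = float(r["premium"])
        except ValueError:
            continue  # drop rows that fail type validation
        key = r["policy_id"]
        if key in seen:
            continue  # drop duplicate policy records
        seen.add(key)
        out.append({"policy_id": key, "premium": premium, "state": r["state"]})
    return out

def to_gold(rows):
    """Gold: business-level aggregate (total premium per state)."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["state"]] += r["premium"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'FL': 100.0, 'GA': 250.0}
```

In a real Databricks deployment, each layer would be a Delta table governed by Unity Catalog, with the bronze → silver → gold steps orchestrated as PySpark jobs in a Databricks Workflow.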
Requirements
Bachelor’s degree in computer science, IT, or information systems.
Two or more years implementing AI solutions for P&C (property and casualty) insurance companies.
Experience leading best practices and end-to-end architecture, design, and delivery.
Ten or more years of experience in data modeling and data engineering, including at least five years in an architecture-focused, lead technical role.
Five or more years of experience architecting enterprise-grade cloud data platforms using Databricks.
Hands-on experience with Databricks deployment, configuration, and administration.
Working knowledge of cloud computing technologies and services.
Certifications in Data / ML and AI platform configuration and integration preferred.
Exposure to multiple, diverse technologies, platforms and processing environments.
Advanced knowledge of the design and development of solutions across different functional domains leveraging varied technologies.
Experience with multiple middleware technologies (application servers, ESBs, and message brokers), programming languages (e.g., Java, .NET, JavaScript), operating systems (e.g., Windows, Red Hat Linux), and databases (e.g., Oracle, SQL Server, PostgreSQL, and NoSQL databases such as MongoDB) preferred.
Advanced knowledge of one or more operating systems, databases, and programming languages used by FCCI.
Experience supporting custom and third-party business applications, as well as understanding project interdependencies and systems integrations.
Advanced working knowledge of server hardware, software, and related equipment and technologies.
Solid understanding of the Agile framework and incremental SDLC.
Ability to understand and improve systems; evaluates a system’s current state and practices and recommends modifications.
Excellent communication, negotiation, interpersonal and organizational skills.
Ability to analyze, define, and solve problems, exercising good judgment and decision-making.