Collaborates with business stakeholders to determine functional and non-functional requirements for new software and/or applications.
Provides recommended business workflows and architectures for the Data / ML and AI platform, understanding requirements and helping each project team document them.
Leads and develops the data modernization strategy and roadmap, enabling the insurance carrier to migrate data computing to the cloud using cloud-native Data / ML and AI assets, tools, and capabilities.
Designs and supports the application architecture utilized by in-house developed applications, vendor applications and databases; ensures development efforts are carried out using a consistent and quality driven approach.
Serves as a hands-on Data / ML and AI Specialist Architect, including code and framework development.
Provides hands-on technical support, guidance, and coaching to Data / ML and AI developers.
Oversees the development team's progress to ensure consistency with the initial design, and ensures the software meets all requirements for quality, security, modifiability, extensibility, etc.
Designs and implements long-term strategic goals and short-term tactical plans for managing and maintaining Data / ML and AI software systems and platforms, including assessing and addressing technical risks.
Builds a Databricks Delta Lake-based lakehouse using PySpark jobs, Databricks Workflows, Unity Catalog, Delta Sharing, serverless computing, and the medallion architecture.
Designs, develops, and maintains Databricks-based data architectures and models for high availability, scalability, and performance.
Demonstrates expertise in implementing data security best practices to safeguard sensitive information within Databricks.
Requirements
Bachelor’s degree in computer science, IT, or information systems.
Two or more years implementing AI solutions for property and casualty (P&C) insurance companies.
Experience leading best practices and end-to-end architecture, design, and delivery.
Ten or more years of experience in data modeling and data engineering, including at least five years in an architecture-focused, lead technical role.
Five or more years of experience architecting enterprise-grade cloud data platforms using Databricks.
Hands-on experience with Databricks deployment, configuration, and administration.