P&C GenAI as a Service: Hyperscaling GenAI in insurance
GenAI adoption in insurance has remained painful and slow. Why?
While GenAI has suddenly changed many aspects of our daily lives with its amazing capabilities, adoption in insurance remains slow and tedious. This isn’t due to a lack of need or interest from insurers. Instead, the delay stems from the complexity of insurance, which involves intricate processes, nuanced language in lengthy free-form documents and a highly privacy-, compliance- and regulation-sensitive environment.
Despite progress on various GenAI-enabled solutions, the gap between the need for and potential of GenAI and its actual deployment remains wide. Why?
A different app for each process: Over the years, point solutions have emerged to tackle specific tasks or processes, but implementing each one costs time and money.
No cross-learning: Each application is narrowly trained for its task, making it difficult to extract learnings or data from one to enrich another.
Stuck at 80% accuracy: Despite seemingly high accuracy, the lack of reliability often demands extensive manual intervention, diminishing the expected improvements in turnaround time and cost efficiency. Even slight inconsistencies can undermine trust and hinder the widespread adoption of AI solutions.
No reuse of common building blocks: Enterprise services, external data sources, prompt libraries and model libraries are rebuilt for every new application.
The complexity of managing multiple point solutions: With each app requiring separate updates and management, complexity multiplies exponentially, spiraling down the rabbit hole of interconnections and interdependencies.
Multiplying cybersecurity, vendor dependency and regulatory concerns: Each application, served by a different vendor on a different platform, brings its own risks.
Lack of IP ownership and data confidentiality concerns: Many GenAI vendors are unfortunately not transparent about how their models were trained. How many give you ownership of the IP built on your data? How can you be sure that your data isn't enriching your competitors?
Consultants and SI firms’ love for ‘reinventing the wheel’: Yes, each insurer has unique business rules, underwriting guidelines and processes, but the industry follows a common set of data structures, document types and processes, and yet each engagement rebuilds these from scratch.
In a nutshell, this is causing our industry to suffer from the ‘death by a thousand cuts’ syndrome.
Why are ‘bare metal LLMs’ not enough?
Hyperscaler-managed LLMs are indeed the underlying foundation of most GenAI infrastructure. However, building multiple applications from scratch on bare metal LLMs isn’t cost-effective, fast or scalable. Using these models off the shelf requires insurers to invest heavily in building everything from workflows to models and analytics, leading to inefficiencies and delayed ROI.
LLMs are now very well trained in the English language, but we are missing the layer of ‘insurance language training’: a basic understanding of our industry’s typical terms, forms, documents, processes and rules.
Further, while hyperscalers provide the necessary infrastructure-layer elements, there is still a gap: a foundational ‘software chassis’ for rapidly building applications on shared enterprise components.
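To make the ‘software chassis’ idea concrete, here is a minimal sketch under stated assumptions: the hyperscaler model is injected as a plain callable, and the glossary, prompt assembly and the two example app functions are hypothetical names for illustration, not any vendor’s actual API. The point is that the shared pieces (model access, insurance glossary, prompt assembly) are built once, and each new application becomes a thin function on top of them.

```python
# Minimal sketch of a shared "software chassis" for insurance GenAI apps.
# All names here (Chassis, build_prompt, the two app functions) are
# hypothetical illustrations. The LLM is injected as a plain callable so any
# hyperscaler-managed model could back it without changing the apps.

from dataclasses import dataclass, field
from typing import Callable

# One shared glossary / prompt library instead of one per point solution.
INSURANCE_GLOSSARY = {
    "FNOL": "First Notice of Loss - the initial report of a claim",
    "ACORD 125": "Commercial insurance application form",
    "BI limit": "Bodily injury liability limit on a policy",
}

@dataclass
class Chassis:
    """Shared components: model access, domain glossary, prompt assembly."""
    llm: Callable[[str], str]                                   # injected model client
    glossary: dict = field(default_factory=lambda: dict(INSURANCE_GLOSSARY))

    def build_prompt(self, task: str, document: str) -> str:
        terms = "\n".join(f"- {k}: {v}" for k, v in self.glossary.items())
        return (
            "You are assisting a P&C insurance workflow.\n"
            f"Domain terms:\n{terms}\n\nTask: {task}\n\nDocument:\n{document}"
        )

# Two thin "apps" reuse the same chassis instead of re-implementing it.
def summarize_fnol(chassis: Chassis, fnol_text: str) -> str:
    return chassis.llm(chassis.build_prompt(
        "Summarize this FNOL in 3 bullet points.", fnol_text))

def extract_submission_fields(chassis: Chassis, acord_text: str) -> str:
    return chassis.llm(chassis.build_prompt(
        "Extract insured name, limits and effective dates as JSON.", acord_text))

if __name__ == "__main__":
    # Stub model so the sketch runs without any vendor credentials.
    echo_llm = lambda prompt: f"[model response for prompt of {len(prompt)} chars]"
    chassis = Chassis(llm=echo_llm)
    print(summarize_fnol(chassis, "Insured reports rear-end collision on I-95..."))
```

In this sketch the model client is dependency-injected, which is also how a chassis can soften vendor lock-in: swapping hyperscalers changes one constructor argument rather than every application.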