How a leading insurtech platform scaled 25+ GenAI use cases to production

From scattered AI experiments to processing 30 million policies a month, one commercial insurance technology company transformed how it manages GenAI operations at scale.

About

A leading commercial insurance technology platform that simplifies the quote-to-bind process for agents, brokers, and carriers through AI-powered digital solutions.

Industry

Insurance

Region

North America

Headquarters

Netherlands, Europe

Why Portkey:

Prompt management, cost tracking, API key governance, enterprise security.

30M+
Policies processed per month
The challenge of scaling AI in insurance

By 2024, this insurtech company had expanded its AI initiatives beyond its core quoting platform. What started as a few AI-powered features for document processing and data extraction had grown into a sprawling ecosystem of over 25 GenAI use cases across underwriting automation, submission processing, coverage analysis, and customer support.

The platform team faced a critical challenge: how do you take dozens of AI pilots to production without losing control? Teams could ship features quickly, but when something went wrong, finding where and why felt like archaeology.

The gap in visibility

As teams across the organization experimented with LLMs, the company started experiencing familiar growing pains.

Every team was taking its own approach with different prompts, different models, different deployment methods. Efforts were being duplicated across departments, with no central view of what was being built or which models were being used.

There was no easy way to monitor performance, quality, or cost across the various use cases. Identifying issues like failures, hallucinations, or inefficiencies became nearly impossible until problems surfaced in production.

Security and compliance requirements, especially critical in insurance, were hard to enforce when teams moved fast. API keys were scattered, usage was untracked, and the risk of data leakage grew with every new experiment.

Finding the right infrastructure partner

The team evaluated several options before selecting Portkey's AI Gateway. For an insurance company handling sensitive policy data and regulatory requirements, the decision came down to a few key factors:

  • Centralized prompt management to version, test, and deploy prompts across all use cases

  • Granular cost tracking to understand spend per use case and optimize accordingly

  • Enterprise-grade API key governance to ensure keys were used correctly and securely

  • Comprehensive observability to monitor every LLM call across the organization

  • Compliance-ready architecture with SOC2, HIPAA, and GDPR support
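At its core, centralized prompt management is a versioned registry with a promotion step: teams publish new prompt versions freely, but production traffic only sees the version that has been explicitly promoted. A minimal sketch of the idea, not Portkey's actual API; `PromptRegistry`, the prompt IDs, and the version names are all illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class PromptRegistry:
    """Toy centralized prompt store: every prompt has named versions,
    and exactly one version is marked as the production default."""
    _versions: dict = field(default_factory=dict)    # (prompt_id, version) -> template
    _production: dict = field(default_factory=dict)  # prompt_id -> promoted version

    def publish(self, prompt_id, version, template):
        self._versions[(prompt_id, version)] = template

    def promote(self, prompt_id, version):
        if (prompt_id, version) not in self._versions:
            raise KeyError(f"unknown version {version!r} for {prompt_id!r}")
        self._production[prompt_id] = version

    def render(self, prompt_id, version=None, **values):
        # Callers that pass no version get whatever is promoted to production.
        v = version or self._production[prompt_id]
        return self._versions[(prompt_id, v)].format(**values)

registry = PromptRegistry()
registry.publish("coverage-analysis", "v1", "Summarize coverage for policy {policy_id}.")
registry.publish("coverage-analysis", "v2", "List exclusions and limits for policy {policy_id}.")
registry.promote("coverage-analysis", "v1")

# Production uses the promoted version; a team can still test v2 side by side.
print(registry.render("coverage-analysis", policy_id="P-1001"))
print(registry.render("coverage-analysis", version="v2", policy_id="P-1001"))
```

Promoting a new version is then a one-line change that takes effect everywhere at once, which is what makes "rapid iteration without breaking production" possible.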

Implementing a unified GenAI platform

The organization deployed Portkey as the central control plane for all LLM interactions. This approach ensured that:

  • All GenAI use cases, from document extraction to coverage analysis, routed through a single, governed gateway

  • Prompt versions were managed centrally, enabling rapid iteration without breaking production workflows

  • Cost and usage data was tracked at the use case level, giving leadership visibility into AI investments

  • Security policies were enforced consistently, with proper key rotation and access controls
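The control-plane pattern above can be sketched in a few lines: one chokepoint that holds provider keys centrally, checks which models each use case may call, and logs every request for later analysis. This is a conceptual toy, not Portkey's implementation; `Gateway`, `fake_provider`, and the model and use-case names are made up, and the provider call is stubbed so the example runs offline:

```python
import time
from dataclasses import dataclass

@dataclass
class LLMCall:
    use_case: str
    model: str
    tokens_in: int
    tokens_out: int
    ts: float

class Gateway:
    """Toy control plane: a single chokepoint for every LLM request."""
    def __init__(self, provider_keys, allowed_models):
        self._keys = provider_keys      # provider -> API key (teams never see these)
        self._allowed = allowed_models  # use_case -> set of permitted models
        self.log = []                   # every call is recorded here

    def complete(self, use_case, model, prompt, send_fn):
        # Enforce the per-use-case model policy before anything leaves the gateway.
        if model not in self._allowed.get(use_case, set()):
            raise PermissionError(f"{use_case} is not allowed to call {model}")
        # send_fn stands in for the real provider call; the gateway injects the key.
        reply, tokens_in, tokens_out = send_fn(self._keys["openai"], model, prompt)
        self.log.append(LLMCall(use_case, model, tokens_in, tokens_out, time.time()))
        return reply

# Stubbed provider so the sketch runs without a network call.
def fake_provider(api_key, model, prompt):
    return f"[{model}] ok", len(prompt.split()), 5

gw = Gateway({"openai": "sk-central"}, {"doc-extraction": {"gpt-4o-mini"}})
print(gw.complete("doc-extraction", "gpt-4o-mini", "Extract the named insured.", fake_provider))
```

Because every call passes through `complete`, key rotation, model policy, and usage logging live in one place instead of being re-implemented by each team.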

Visibility, control, and confidence at scale

After implementing Portkey, the insurtech platform saw tangible improvements across its AI operations.

  • Full visibility into AI operations: With 30 million policies flowing through the platform monthly, leadership now has a clear view of how AI is being used, what it costs, and where it's delivering value. Cost tracking per use case revealed optimization opportunities that had been invisible before.

  • Governance without friction: API key management, once a scattered mess across teams, is now centralized and secure. New use cases can be spun up quickly while maintaining compliance with insurance industry requirements.

  • Faster iteration, lower risk: Prompt management enables teams to experiment and improve their AI applications without the fear of breaking production. What used to require careful coordination now happens continuously.

The platform now supports 25+ GenAI use cases in production, powering everything from intelligent document processing to automated underwriting assistance, all with the visibility and control that enterprise insurance operations demand.

Lessons for teams building production agents

Based on this insurance company's experience, teams building GenAI systems in regulated industries should keep several points in mind:

  • Start with governance, not as an afterthought. Retrofitting access controls, key management, and audit trails onto dozens of live use cases is far harder than building on a governed foundation from the start.

  • Make costs visible early. Without granular tracking, AI spend can spiral quickly. Understanding cost per use case helps prioritize investments and identify inefficiencies before they become budget problems.

  • Choose partners who understand enterprise needs. The right infrastructure partner makes the difference between sustainable growth and constant firefighting.
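Making costs visible, as the second lesson urges, boils down to tagging every logged call with its use case and rolling token counts up against a price table. A minimal sketch of that roll-up; the per-1K-token prices, model name, and use-case names below are hypothetical, and real prices vary by provider and model:

```python
from collections import defaultdict

# Hypothetical (input, output) USD prices per 1K tokens.
PRICE_PER_1K = {"gpt-4o-mini": (0.15, 0.60)}

def cost_by_use_case(calls):
    """Roll (use_case, model, tokens_in, tokens_out) records up into
    dollars spent per use case."""
    totals = defaultdict(float)
    for use_case, model, tokens_in, tokens_out in calls:
        p_in, p_out = PRICE_PER_1K[model]
        totals[use_case] += tokens_in / 1000 * p_in + tokens_out / 1000 * p_out
    return dict(totals)

calls = [
    ("doc-extraction", "gpt-4o-mini", 2000, 500),
    ("coverage-analysis", "gpt-4o-mini", 8000, 3000),
    ("doc-extraction", "gpt-4o-mini", 1000, 250),
]
print(cost_by_use_case(calls))
```

Once spend is broken down this way, the "optimization opportunities that had been invisible" tend to surface on their own: one use case dominating the bill, or an expensive model used where a cheaper one would do.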

The path this insurance company took shows how crucial it is to think about GenAI infrastructure strategically. As AI becomes central to insurance operations, having a flexible yet controlled infrastructure will be the difference between scalable innovation and unmanageable complexity.

Build your AI app's control panel now

AI Gateway, Observability, Guardrails, Governance, and Prompt Management, all in one platform.
