In this article, Adam Nunes, Director of Sales at Federato and a long-time insurance enthusiast with a passion for digital transformation, argues that insurers looking to realize the business benefits of data and AI must first address a ‘data plumbing’ problem that continues to impede meaningful underwriting innovation. An overreliance on so-called “core systems” like policy administration, combined with a reluctance to renovate the underlying data architecture, has left underwriters, the linchpins of profitable growth, high and dry.
My wife and I are obsessed with Mid-Century Modern architecture – there’s just something about exposed wood and open floor plans. It’s a throwback to a simpler time filled with neighborhood cocktail parties and malt shops. A few years ago, a well-maintained and renovated mid-century modern home built in 1957 came up for sale. After a bidding war and waiving every real estate buyer’s right known to man, we moved into our dream home.
Seven months later, we found ourselves in a new homeowner’s worst nightmare. The plumbing had a catastrophic failure. Thinking I was Tim “The Tool Man” Taylor, I attempted to fix the plumbing myself. In the course of trying to do so, my father-in-law was electrocuted (he’s fine!) and I wound up in the hospital with an allergic reaction from breathing in dust from the floorboards we sawed up. Long story short: not only did my wife and I have to pay an electrician to completely rewire our new home, but we also ended up replacing every pipe in the house. This ate up a sizable chunk of our rainy-day fund (not to mention the ER bill…).
After speaking with our neighbors, we learned our situation was in no way unique. In the 1990s, the Homeowners Association had decided to upgrade to a more modern, high-pressure, energy-efficient well system. The new system didn’t play well with the original galvanized plumbing, and the added pressure chewed away at what remained of the pipes’ integrity.
As someone who is always looking for parallels to insurance, I found that this is all too similar to what is occurring in the insurance industry today. While the technological infrastructure in place may have been suitable for an earlier era, the explosion of new data sources and the growing demands placed on insurance IT organizations for modernization are exposing the very real cracks in the underlying ‘data plumbing’. The growing pressure will inevitably burst the technological pipes because they were not built with this exponential increase in data or function in mind.
Technology has never been more important to insurance carriers, MGAs, and brokers, but the pipes and solutions in place today – while “modern” at the time of implementation – are now proving wholly insufficient for the current and future state of insurance.
If we continue to ask systems to do things they were not built to do, we’re heading for a costly plumbing failure.
Historically, insurance companies used mainframes and patched-together systems to administer insurance policies. These legacy systems largely stayed tucked away in a dark corner of the server room, as IT teams were too afraid to work on them for fear that the house would collapse on itself in the process of trying to improve them.
At the turn of this century (early 2000s), a seismic shift occurred within insurance technology. Cutting-edge tech companies saw an opportunity to build technology tailored to modernizing specific insurance functions, such as claims, billing, and rating. This was really the first wave of what would come to be known as “insurtech”. These new solutions promised to help insurers modernize “core” insurance functions and finally have a central place to store, access, and model the underlying data that had previously been inaccessible and underutilized.
Over time, this space transformed into what we now know as “core systems” and policy administration systems, which gave insurance organizations a location for storing mission-critical policy data and administering the policy lifecycle – issuing a policy, renewals, billing, etc.
Somewhere along the way during this core technology evolution, “digital transformation” became synonymous with implementing a new policy admin system. Insurance organizations began putting more and more data into policy administration systems in entirely customized ways, such that no two implementations or instances looked alike. This is one reason why the policy admin implementation process typically takes several years, and why services costs often equal or exceed the cost of the software itself.
As a byproduct, different systems have been implemented for different functions. It is in no way unique for a carrier to have data residing in multiple systems and in entirely different schemas. It’s a bit like five people trying to work together with no common language in which to communicate. This disparate approach has become a major problem for insurers trying to make use of their data assets, because these systems are not dynamic and do not communicate directly with one another.
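To make the “no common language” point concrete, here is a minimal sketch of how the same policy might look in two siloed systems, and what it takes just to get both records into one shared shape. The system names, field names, and values below are purely illustrative assumptions, not any real carrier’s schema.

```python
# A minimal sketch of the "no common language" problem: the same policy
# lives in two hypothetical systems under different field names and formats.
from datetime import date

# Record as it might sit in a legacy policy admin system (illustrative)
pas_record = {
    "POL_NUM": "CPP-0042731",
    "INSD_NM": "ACME FABRICATION LLC",
    "EFF_DT": "20240401",          # yyyymmdd string
    "WRTN_PREM": 125000,           # whole dollars
}

# The "same" policy as a separate claims system might store it (illustrative)
claims_record = {
    "policyRef": "CPP0042731",     # no hyphen
    "insuredName": "Acme Fabrication, LLC",
    "effectiveDate": "04/01/2024", # mm/dd/yyyy
    "premium_cents": 12500000,     # cents, not dollars
}

def normalize_pas(rec: dict) -> dict:
    """Map the legacy PAS shape onto one common schema."""
    return {
        "policy_id": rec["POL_NUM"].replace("-", ""),
        "insured_name": rec["INSD_NM"].title(),
        "effective_date": date(int(rec["EFF_DT"][:4]),
                               int(rec["EFF_DT"][4:6]),
                               int(rec["EFF_DT"][6:])),
        "written_premium_usd": float(rec["WRTN_PREM"]),
    }

def normalize_claims(rec: dict) -> dict:
    """Map the claims-system shape onto the same common schema."""
    month, day, year = rec["effectiveDate"].split("/")
    return {
        "policy_id": rec["policyRef"],
        "insured_name": rec["insuredName"].replace(",", ""),
        "effective_date": date(int(year), int(month), int(day)),
        "written_premium_usd": rec["premium_cents"] / 100,
    }

# Only once both records speak the same schema can they be joined,
# analyzed, or fed to a model.
unified = [normalize_pas(pas_record), normalize_claims(claims_record)]
print(unified)
```

Multiply that mapping effort across hundreds of fields and dozens of systems, and the scale of the problem becomes clear.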
The result has been an incredible amount of technical debt which consumes an eye-watering amount of resources (both in dollars and human capital). According to a recent article in InsurTech Digital, “For incumbent insurers, maintaining legacy systems eats away as much as 70-80% of IT budgets, leaving them little room to innovate.” Oftentimes, insurance organizations hold off on implementing new data sources because it’s nearly impossible for them to take advantage of their existing data assets, let alone capitalize on new ones.
The CEO of GEICO recently stated that the personal lines insurer has over 600 systems that don’t talk to each other. They understand the need to invest in consolidating those systems and making them dynamic, which is why their combined ratio is expected to shoot to 99% by the end of this year. That is nearly unheard of from a company that prides itself on profitability.
Until very recently, there hasn’t been a solution that can cleanse the data, orchestrate it into a single schema, and map pathways back to the existing systems of record without becoming a monolithic, decades-long project.
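One way to picture “mapping pathways back to the systems of record” is to have every unified record carry provenance: a pointer to the source system and key it came from. The sketch below is an assumption about one possible shape, not a description of any particular product; all names are hypothetical.

```python
# A minimal sketch of keeping "pathways back" to the systems of record:
# every unified record carries provenance, so a value can always be traced
# to the source system and key it came from.
from dataclasses import dataclass, field

@dataclass
class SourceRef:
    system: str        # e.g. "legacy_pas", "claims_db" (illustrative names)
    record_key: str    # primary key in that source system

@dataclass
class UnifiedPolicy:
    policy_id: str
    insured_name: str
    written_premium_usd: float
    sources: list[SourceRef] = field(default_factory=list)

policy = UnifiedPolicy(
    policy_id="CPP0042731",
    insured_name="Acme Fabrication LLC",
    written_premium_usd=125000.0,
    sources=[
        SourceRef(system="legacy_pas", record_key="POL_NUM=CPP-0042731"),
        SourceRef(system="claims_db", record_key="policyRef=CPP0042731"),
    ],
)

# Downstream tools work against the single schema, while any write-back or
# audit can follow `sources` to the original system of record.
for ref in policy.sources:
    print(f"{policy.policy_id} <- {ref.system}:{ref.record_key}")
```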
As a longtime insurance nerd, the funny part of all this to me is that core systems have never adequately addressed the core of an insurer’s profit center: underwriting.
Instead, underwriters are expected to access or input data into multiple legacy systems, sift through data to find meaning in it, search through PDF underwriting rules and guidelines to understand what they’re supposed to do, and ultimately make a decision on an individual risk.
We’ve all seen the statistics from McKinsey and others – underwriters spend 40% of their time on administrative tasks. The static systems where underwriters transact business end up impeding frontline underwriters’ ability to make data-driven risk selection decisions quickly, efficiently, and precisely. Instead, underwriters spend their time foraging for data (both internal and third-party) or inputting data into different systems rather than making risk selection decisions or focusing on building broker and agent relationships.
Even though an insurer’s unique risk selection strategy, underwriting rules, and portfolio optimization are the key drivers of profitability, meaningful underwriting improvements have often been delayed or shelved due to the perceived complexity involved, combined with a lack of internal IT resources. It’s a status quo that leaves underwriters high and dry.
Given the acceleration of emerging technologies like generative AI, large language models, and advanced data analytics utilizing machine learning, the stakes are high for insurers that choose to continue down the path of legacy systems maintenance without taking a look around the corner to see the truly transformative innovations that are coming fast.
The key prerequisite for developing AI models is data. While large language models, AI, and advanced analytics are becoming talking points for executives, I see very little hope for insurers that don’t start with the first principles of unifying their data and systems to enable those advances. The real question is whether insurers can do that quickly enough across all their data (not just a subset of thoroughly scrubbed and cleansed data) to seize the first-mover advantage that exists today in the marketplace.
"What insurers need is a platform outside of the traditional “core” systems – many of which are now over 20 years old – that has all the pertinent data unified into one schema. Up until a few years ago, this was a pipedream (no pun intended), but no longer. Now, with data graph technology and federated data architecture (which lends Federato its name), insurers finally have powerful new tools to address a very old problem. Gartner estimates that 80% of data and analytics innovations will come from graph technology by 2025."
"Those who understand the end game realize that so-called core systems are not where all data needs to reside and be housed. The policy administration system is fantastic for what it does – administering policies. What a PAS is not great at is integrating an ever-growing list of data attributes or acting as a workflow engine at the underwriting desk."
In the near future, the policy administration system will act as the transactional layer, while real-time data analysis and workflow will occur in an entirely new kind of core insurance system – a separate but integrated platform purpose-built for those functions.
Insurance has a legacy plumbing problem which creates a significant downstream data problem for underwriting and analytics teams. It’s a problem that continues to impede underwriting productivity, precision, and outcomes. Until insurers fix the legacy problem, they will continue to struggle with combined ratios and leveraging the value of their existing data assets.
As we rapidly move to a more data-rich world, yesterday’s “modern” and “core” systems are starting to look more and more like today’s legacy solutions. Insurers that continue to ignore the foundational plumbing do so at their own peril. It is very difficult to build effective applications on top of antiquated architecture. Organizations that keep spending millions of dollars and countless years maintaining or upgrading these legacy systems in the making will fail to capitalize on the opportunities that lie ahead, and will fall behind competitors that are more nimble and willing to invest in the future.
These investments in renovating the core will absolutely increase an insurer’s value by making it more attractive to insurance buyers and to new generations of talent looking for a home. The good news is that solving this challenge doesn’t require insurers to demolish and rebuild the house. The foundation is stable. What needs to happen is an upgrade to the systems that are integral to the future pathways through which dynamic data – and actionable insights – will flow.