By Russ Banham
In Lloyd’s coffee house in London more than 330 years ago, the property and casualty insurance industry was formed to spread the risks of cargo-carrying ships plying the world’s seas. The industry grew and prospered, with little change in the underlying model. Now, an entirely new structure is taking shape.
The compelling reason for building this new model is the existential threat of keeping things as they are. If insurers and brokerages don’t begin to share their own and client data in novel ways to reduce acquisition costs and operating expenses, some other entity—like, say, a giant technology company—might squash them like a stack of yesteryear’s CDs.
“The cost base from which we all are working is fast becoming unsustainable,” warns Alastair Burns, chief marketing officer at London-based insurance holding group Navigators International. “A war on cost must be waged in the insurance marketplace.”
This cost base is predicated on the traditional ways of providing insurance to businesses, which have not changed much since Edward Lloyd brewed his first pot of coffee. Companies from the largest corporations to the smallest Main Street businesses rely on an insurance broker or agent to understand their risks and transfer them to an insurance carrier willing to absorb these exposures. The insurer then spreads these risks through global reinsurance markets.
This system has worked pretty well for centuries. Then along came data analytics—the ability to search through massive volumes of data to discover useful information for decision-making purposes. This technology and others—machine learning, robotic process automation and artificial intelligence—have created a dangerous entry point for non-insurance entities to potentially compete against brokers and carriers, leveraging capital in some cases from institutional investors.
Other industries have collapsed from such incursions. The new insurance model under discussion would fortify the barricades. By sharing client data, brokers and carriers can reduce the cost of insurance, pave the way toward the development of innovative new types of insurance, and, most important, keep non-insurance entities at bay.
“Whoever has data is king,” says Jonathan Prinn, group head of broking at Ed Broking, a wholesale insurance and reinsurance brokerage company based in London. “If this data isn’t shared amongst us, we will fail to provide value to our clients. And if we fail, someone else will provide this value.”
Need for a New Model
One of the industry’s clear concerns is client acquisition costs, which are much higher than the cost of acquiring new customers in other industries. Prinn blames high acquisition costs for Lloyd’s of London’s dismal 114% combined ratio in 2017, meaning the venerable insurance marketplace paid out more in claims and expenses than it received in premiums.
“The cost of underwriting and broking commissions at Lloyd’s was about 42%, which is unacceptable and untenable in the world going forward,” he says. “No intermediary model I know of is anywhere close to 42%.”
In the United Kingdom, Prinn says, “real estate agents charge around 2%, credit card companies like Mastercard charge around 1.3% and moving $2 from a bank account to buy coffee using Apple Pay is free.”
The 42% figure he cites is composed of 30%-plus broker commission levels and the carriers’ administration costs, which last year ranged between 10% and 12%.
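The arithmetic behind these figures can be sketched in a few lines. In this minimal illustration, the 30% commission and 12% administration figures follow the article; the loss ratio shown is back-solved from Lloyd’s reported 114% result, not a separately reported number.

```python
# Back-of-the-envelope sketch of the combined-ratio arithmetic.
# Expense figures follow the article (roughly 30% commissions plus
# 10-12% carrier administration); the loss ratio is back-solved from
# the reported 114% combined ratio, not a reported figure itself.

def combined_ratio(loss_ratio: float, expense_ratio: float) -> float:
    """Claims plus acquisition/operating costs, as a share of premium."""
    return loss_ratio + expense_ratio

expense_ratio = 0.30 + 0.12                 # commissions + administration
implied_loss_ratio = 1.14 - expense_ratio   # what a 114% result implies

# Anything above 1.0 means the market paid out more than it took in.
assert combined_ratio(implied_loss_ratio, expense_ratio) > 1.0
```

The point of the sketch is that even a healthy loss ratio cannot rescue a result when 42 cents of every premium dollar is consumed before a single claim is paid.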
One could argue these expenses are lower in the United States—according to A.M. Best, incurred broker commissions/expenses plus underwriting expenses are between 30% and 31%—but that’s beside the point. “Too much of the money that comes into the system is now consumed by acquisition and operating costs,” says Jamie Garratt, who heads the digital underwriting strategy at Talbot Underwriting, a London-based underwriting services provider.
The culprit, the interviewees contend, is the traditional linear process of transacting insurance, whereby a business goes to a broker who goes to a carrier that goes to a reinsurance broker who goes to a reinsurance carrier. The alternative is the sharing of data among brokers and carriers in a free marketplace. “If brokers share client exposure data with all and not just some insurers, everyone’s cost base reduces,” Burns says.
Garratt agrees. “If we can move data quickly and with much less friction between a client and the end risk bearers,” he says, “we can remove the replication of work up and down the value chain, reducing the high cost this produces for all of us.”
An Architecture Emerges
How might this new model look and operate? Instead of the customary linear progression, clients would share operating, financial and other data with brokers; the brokers would evaluate and package these data into discrete elements of risk; these elements would be submitted to the global insurance markets for their review; and insurers and reinsurers would decide whether to assume these risks based on their underwriting appetites and portfolio diversification objectives.
Prinn provided the example of how this might work in the commercial aviation insurance market, which is currently divided between one set of carriers that provide insurance to smaller courier-type aircraft and another set that services large commercial airlines.
“The reason for the split in the market is risk—smaller aircraft take off and land a lot to drop off freight, and the biggest risks in flying occur when landing and taking off,” he explains. “Consequently, the risks for the smaller air carriers are significantly higher.”
Assuming all aircraft owners provide their altitude, speed, turbulence and other data to a broker, cost theoretically would come down. “Airplanes produce an extraordinary amount of data coming off sensors that could be dropped into a blockchain or other type of distributed ledger technology,” Prinn says. “If the data from every airline was available in this platform, there would be many thousands of data items related to every step of the journey, fundamentally changing how insurance is structured.”
The job of the broker would be to gather and assess these granular levels of exposure data and present them to the insurance markets for consideration. Were this to occur, Prinn describes what might happen next: “An insurance carrier might say it will insure this many planes taking off to 1,000 feet in particular regions of the world; this many planes going from 10,000 feet to 20,000 feet across the Atlantic; and this many planes going from 25,000 feet to 35,000 feet across specific land masses.”
The carrier might also decide to insure similar elements as the plane makes its descent to the ultimate landing. “If this happened,” Prinn says, “the need to separate the aviation insurance market into two sectors would be eliminated.”
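The slicing Prinn describes can be pictured as a simple data structure. The sketch below is purely illustrative—the class names, flight phases and altitude bands are assumptions for the example, not a description of any real platform or schema.

```python
# Hypothetical sketch of breaking one flight into discrete risk
# elements that carriers could each assume (or decline) individually,
# per their underwriting appetite. All names and bands are illustrative.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass(frozen=True)
class RiskElement:
    phase: str                          # "takeoff", "climb", "cruise", "descent"
    altitude_band_ft: Tuple[int, int]   # (lower, upper) altitude in feet
    region: str

def slice_flight(region: str) -> List[RiskElement]:
    """Package a single flight as separately insurable elements."""
    bands = [
        ("takeoff", (0, 1_000)),        # ground up to 1,000 feet
        ("climb",   (10_000, 20_000)),  # e.g. crossing the Atlantic
        ("cruise",  (25_000, 35_000)),  # across specific land masses
        ("descent", (0, 35_000)),       # cruise altitude down to landing
    ]
    return [RiskElement(phase, band, region) for phase, band in bands]

# A carrier with an appetite only for cruise-phase risk could bid on
# just that element, leaving takeoff and landing risk to others:
cruise_only = [e for e in slice_flight("North Atlantic") if e.phase == "cruise"]
```

Each element, rather than the whole aircraft, becomes the unit of trade—which is why the split between small-aircraft and large-airline markets would no longer be necessary.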
What he has just described is a far cry from the current model of providing commercial aviation insurance, in which a broker representing a specific airline submits that company’s breadth of risks to an insurer with which the broker often does business. The new model would democratize the process to offer pieces of risk to all aviation insurance carriers in a free marketplace.
Can the risks of other industries be similarly sliced and diced by brokers and offered to the insurance markets? “Absolutely,” says data scientist Henna Karna, chief data officer at global insurer XL Catlin. “We’re continually looking at data that exists in our environment and creating ways to measure it.”
These data are not limited to a company’s internal financial and operating data. “Exogenous credit data, political risk data, cyber risk data, and unstructured data can all be part of the picture,” Karna says. “The goal is to model the risk by considering every available and quantifiable factor that may affect it. We’re getting closer and closer to mining, analyzing, modeling and managing all this data for insurance purposes.”
Better Data, Better Risks
The sharing of detailed client exposure data in a digital platform provides the opportunity for insurers to diversify their risk portfolios by reducing exposure accumulations in specific areas. Karna provides the example of a global manufacturing plant that operates 9 a.m. to 5 p.m. Monday to Friday.
“Say the data coming from the sensors attached to factory equipment indicate the company makes most of its products on Tuesdays and Wednesdays, with the bulk of these items coming off the production line between 9 a.m. and 11 a.m.,” she says. “A blackout or machine glitch during this period on a Tuesday or Wednesday would cause a much bigger business interruption risk than other times of the day the rest of the week.”
Today, the company’s insurance policy from a single carrier does not distinguish these risk factors. By breaking up the risk into different pieces, and bringing in other quantifiable data like seasonal weather and machine maintenance, other carriers have the opportunity to choose which exposure elements they may want to bear. In turn, this helps the insurers balance their risk portfolios with exposures that are uncorrelated, Karna says.
Burns agrees. “By modeling and sharing risks, carriers can avoid aggregating too much of a single risk,” he says. “Insurers have a finite amount of exposure they can take. Their balance sheets are only so big.”
The new model would present a way for brokers to spread risks among different insurers to their clients’ benefit. “If we have a system where brokers share client risks with all the insurance markets in a centralized shared services model,” Burns says, “this would be a far more efficient way for insurers to spread their risks and for clients to receive more competitive premiums.”
Current expenses up and down the insurance value chain would wither, Garratt says. “If clients share their data with brokers, and brokers share this data with insurance markets, it will result in reduced costs for all parties,” he explains. “All those inputs, calculations and equations that each broker and carrier must do on their own would be removed from the process. There would be no more need for rekeying all this data as it’s passed along the value chain. Instead, the exposure data would move seamlessly and transparently from the client to all potential risk bearers, dramatically reducing acquisition and operating costs.”
This would certainly be a positive development for business clients. “If a broker presents a good commercial risk to all the insurance markets—which is apparent in the client’s exposure data—carriers can compete to charge less in assuming this risk, based on their respective underwriting appetites,” Garratt says.
Although the new model would produce less traditional income for brokers, the loss of revenue would be offset by heightened efficiencies. “Brokers have significant operating expenses that are a reflection of the current inefficiencies in the market,” Garratt explains. “If you remove these inefficiencies, brokers can reduce their costs to maintain current profit margins.”
Moreover, the broker’s enhanced knowledge of clients’ exposures presents the opportunity to become more involved in mitigating clients’ risks. “If a broker comes to us with a client whose exposure data indicate it is risky, we would charge more to assume the risk or not take it at all,” Garratt says. “But crucially for the client, the broker and the underwriter now have the ability to help the client manage these exposures, working to reduce the activities and behaviors that resulted in the company being perceived as a higher risk.”
This creates value for the client and possibly a new revenue stream for the broker, while furthering the closeness of the relationship brokers enjoy with clients. “Less money is consumed by process and transaction, and more is used to pay claims, provide value-added services and develop new products,” he adds. “The outcome in all cases is better, quicker and cheaper service for our clients, which is what we all must focus on.”
Customized, Complex Products
By participating in the proposed data-sharing model, brokerages and insurers will be in a more opportune position to create bespoke insurance products. “All that exposure data in the hands of a carrier or a broker can be an engine of innovation,” Karna says.
She provided the example of a large retail chain caught in the crosshairs of a reputational disaster. By mining social media data, the company’s broker may learn that flash mobs are forming to protest the business in a variety of store locations.
“For the sake of argument, let’s assume the organization has comprehensive property insurance absorbing losses caused by vandalism that occur in its stores’ parking lots, which generally have 300 to 400 cars parked per lot,” Karna says. “Now let’s assume the financial limits on this insurance policy are $1 million. Would this amount cover the potential vandalism of cars in the aggregate?”
Possibly not. Even if it did, how much of the total limit would be left over to absorb additional property losses for the remainder of the policy period? “The opportunity now exists for the broker to craft a bespoke insurance policy absorbing the one-time property damage losses caused by vandalism and present it to carriers for their consideration,” Karna says.
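Karna’s question comes down to simple arithmetic. In the rough sketch below, the lot size and policy limit come from her example; the average per-car damage figure is a purely hypothetical assumption for illustration.

```python
# Back-of-the-envelope check of whether a $1 million limit would cover
# aggregate parking-lot vandalism. Lot size and limit follow the
# article's example; the per-car damage figure is a hypothetical
# assumption, not a market statistic.

policy_limit = 1_000_000
cars_per_lot = 350           # article cites 300 to 400 cars per lot
avg_damage_per_car = 3_500   # assumed average repair cost (hypothetical)

aggregate_loss = cars_per_lot * avg_damage_per_car   # 1,225,000
shortfall = aggregate_loss - policy_limit            # 225,000

# One badly hit lot could exhaust the limit entirely, leaving nothing
# for other property losses during the rest of the policy period.
```

Under these assumptions a single incident overruns the limit—which is precisely the gap a bespoke one-time vandalism policy would be designed to fill.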
Prinn cites similar value. “Brokers and carriers have the ability to take a very complex commercial insurance program like one sees in the oil and gas sector and ferret out comparisons across different oil and gas companies,” he says. “You now can take a common look at these risks, which wasn’t possible before, allowing complex (insurance) products to be packaged.”
There is even the opportunity for brokers and carriers to share customer data with other industries, assuming the data’s owners opt in to such use, as privacy regulations require.
“Wouldn’t it be great if I bought travel insurance and the carrier shared this data with a rental car agency to set up a vehicle for me when I landed?” Prinn says. “Or the pension side of my insurance company knows I have children and recommends setting up a college plan at a local bank? The future broker or carrier might well be able to help with these things, which is very exciting. And we’re still really at the beginning of all this.”
Nevertheless, he is confident the current model of insurance will be relegated to the trash bin of history. “I was a street broker at Marsh placing Fortune 500 risks for 20 years with carriers that I knew from pure personal experience alone could take on these risks,” he says. “I can tell you unequivocally that this art form is dying. The broker of the future will know how to use data and analytics to find the right insurance markets for their clients, as opposed to relying on experience alone. The model of tomorrow will be a blend of art and science.”
When he talks on the subject, Prinn often mentions his 10-year-old daughter’s interest in playing the online game Minecraft. She watches the YouTube videos of Daniel Middleton, a professional gamer in his 20s who plays Minecraft and comments on the game’s intricacies for his online audience.
“I mention Dan because last year he made something like $16 million,” Prinn says. “If you told me five years ago that some guy who sat in his home commenting to kids on a video game would make this kind of money, I would have fallen over laughing. In no way could this have been predicted.”
He feels the same way about the new model of insurance. “Ten years ago, if you told me the traditional model would change so the customary parties to the transaction could share data, I would have dismissed it out of hand,” he says. “And now it is upon us.”
The Tools Are Here
Change is tough for any industry, but the alternative is stark. Just look at the many industries that failed to heed the disruption caused by a technology interloper. “If a broker believes its future value remains as a middleman, where it is being paid merely to access an insurance market, it is dead before it knows it,” Prinn says. “The entire industry is under siege by technology companies that will create better models unless we do it first.”
Down the line, as the internet of things becomes ever more mainstream, millions of sensors will produce an abundance of data, sharpening the ability of brokers to analyze complex risk exposures to a fine point—if they take pains to make this happen.
Yes, there are obstacles to be overcome, including the need to structure data into common formats. But these are relatively easy hurdles to surmount, and organizations like ACORD are already tackling the problem. “Once we have structured data, the entire ecosystem can move in sync to share information and bring down costs,” Burns says. “There’s no going back to the way things have been.”
Others agree. “We now have the tools to do things we’d only dreamed about before,” Karna says. “Much of our work here now with data is focused on seizing these opportunities.”
The future is as bright as the industry allows. “There’s a tremendous amount of excitement now about why the traditional insurance market must change and how this can occur,” Garratt says. “And that can be a catalyst for needed change to occur at a really fast speed—to the benefit of the insurance market and our clients.”
Banham is a Pulitzer Prize-nominated business journalist and author who writes frequently about the intersection of technology and insurance.