A “new normal” has created the need for catastrophe modeling to move from historic to predictive methods of assessing extreme weather risk, according to a panel of risk experts speaking Tuesday at the annual Chartered Property Casualty Underwriter Society meeting in Anaheim, Calif.
The four-day conference included a keynote speech by NBA star Earvin “Magic” Johnson, and various sessions tackling topics from telematics to pricing analytics to employment practices liability prevention to terrorism risks.
The modeling panel on Tuesday was focused on “global extreme weather,” and included experts from EQECAT and Zurich North America. It was moderated by Louis E. Nunez, international product underwriter at Zurich.
Tom Larsen, senior vice president and product architect at EQECAT, cited a report from research group The Geneva Association that asserts that global climate is now moving into a “new normal.”
Larsen was referring to a report put out by the group last year asserting that the impacts of warming oceans and climate change threaten the insurability of catastrophe risk.
As the world’s climate and weather patterns change, so should the modeling of this risk, said Larsen, who called for a shift from historic to more predictive risk assessment methods.
He added, “We are now sailing into the unknown.”
Larsen said it doesn’t matter whether climate change is manmade or a result of periodic changes in the climate that have occurred throughout history.
“We don’t need to prove causation,” he said.
On a high note, the insurance industry by and large has time to adapt to these changes, considering that, for example, most property insurance renews on one-year cycles – a short timeframe compared with a 50-year-plus horizon for climate change, Larsen added.
Given that sea levels are rising at an average rate of 3 mm per year, according to data from the National Oceanic and Atmospheric Administration, it will take roughly 100 years for sea levels to rise a foot, Larsen noted.
“It’s sufficiently slow” to build into models, he said.
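As a back-of-the-envelope check on that timescale, here is a minimal sketch; the 3 mm-per-year figure is the NOAA rate cited above, and the rest is simple unit conversion.

```python
# Back-of-the-envelope check on the sea-level-rise timescale cited above.
# The ~3 mm/year rate is the NOAA figure mentioned in the article.

MM_PER_FOOT = 304.8           # 1 foot = 304.8 mm
rate_mm_per_year = 3.0        # cited average rate of global sea-level rise

years_per_foot = MM_PER_FOOT / rate_mm_per_year
print(f"At {rate_mm_per_year} mm/year, one foot of rise takes ~{years_per_foot:.0f} years")
# Roughly a century, which is why Larsen calls the change slow enough to build
# into models that are recalibrated against one-year policy cycles.
```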
The bigger question is whether there is sufficient data now being gathered to develop effective cat models in a changing climate.
He noted that despite access to modern computer modeling and years of weather data, three insurers ran into insolvency following the devastating 2011 tornado in Joplin, Mo.
Many in the field are now including conditional frequency models based on assumptions about ensuing weather patterns, he said, adding that such moves are leading to the adoption of tools like Tail-Value-at-Risk (TVaR) measurement, which quantifies the expected loss given that losses exceed a specified probability level.
However, while there is a growing need to adopt alternative techniques such as TVaR for modeling catastrophe risk, those measures are generally not accepted by ratings agencies and regulators, Larsen said.
“Knowing the climate is changing, we really don’t have a good frequency-of-severity model,” he added.
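For readers unfamiliar with the measure, the sketch below shows how TVaR differs from a plain percentile (value-at-risk) when both are computed from simulated annual losses. The lognormal loss distribution and the 1-in-100 (99th percentile) threshold are hypothetical, chosen only to illustrate the calculation, not drawn from any EQECAT model.

```python
import numpy as np

# Hypothetical simulated annual losses ($bn); the lognormal parameters are
# illustrative only and not taken from any real catastrophe model.
rng = np.random.default_rng(42)
losses = rng.lognormal(mean=1.5, sigma=1.2, size=100_000)

p = 0.99  # 1-in-100-year level

# Value-at-Risk: the loss level exceeded with probability (1 - p).
var = np.quantile(losses, p)

# Tail-Value-at-Risk: the expected loss *given* that the VaR level is exceeded,
# i.e. it looks past the percentile at how bad the tail actually is.
tvar = losses[losses >= var].mean()

print(f"99% VaR:  ${var:,.1f}bn")
print(f"99% TVaR: ${tvar:,.1f}bn  (average of the worst 1% of simulated years)")
```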
Lindene Patton, a former climate product officer for Zurich Insurance Group who helped develop the White House’s National Climate Assessment report, agreed with Larsen’s call to develop and implement more modeling tools, adding that there is “a temporal dissonance between the instruments we’re trying to offer and the challenge we’re trying to solve.”
David F. Smith, senior vice president of model development for EQECAT, said there is too much volatility to rely simply on historical models.
“The volatility is just enormous in these things,” Smith said.
Using Superstorm Sandy as an example, he noted that the average return period for such a storm is one-in-20 years for New York and one-in-180 years for New Jersey.
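As a quick way to read those figures, the sketch below converts a return period into an annual exceedance probability and the chance of at least one such storm over a longer horizon; the 30-year horizon is an arbitrary illustrative choice, not something cited by the panel.

```python
# Translate a return period into an annual exceedance probability and the
# chance of at least one such storm over a multi-year horizon.

def annual_probability(return_period_years: float) -> float:
    return 1.0 / return_period_years

def prob_at_least_one(return_period_years: float, horizon_years: int) -> float:
    p = annual_probability(return_period_years)
    return 1.0 - (1.0 - p) ** horizon_years

# The 20- and 180-year return periods are the ones Smith cited; the 30-year
# horizon is an illustrative choice.
for state, rp in [("New York", 20), ("New Jersey", 180)]:
    print(f"{state}: {annual_probability(rp):.1%} per year, "
          f"{prob_at_least_one(rp, 30):.0%} chance of at least one in 30 years")
```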
Even when looking at average annual losses there’s a great deal of volatility, he said.
The average annual loss for the U.S. is $11 billion, but the standard deviation is $22 billion, with two-thirds of normalized losses over the past 50 years coming from roughly a dozen seasons, he added.
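That gap between the average and the standard deviation is typical of heavy-tailed loss distributions. The small simulation below uses parameters invented purely for illustration (not fitted to the figures above) to show the same pattern: a standard deviation exceeding the mean, with a handful of seasons carrying most of the total.

```python
import numpy as np

# Hypothetical annual U.S. hurricane-loss simulation; the lognormal parameters
# are invented for illustration and not fitted to the figures cited above.
rng = np.random.default_rng(0)
annual_losses = rng.lognormal(mean=1.2, sigma=1.5, size=50)  # 50 "seasons", $bn

mean_loss = annual_losses.mean()
std_loss = annual_losses.std(ddof=1)

# Share of total losses contributed by the worst dozen seasons.
worst_12_share = np.sort(annual_losses)[-12:].sum() / annual_losses.sum()

print(f"mean ${mean_loss:.1f}bn, std ${std_loss:.1f}bn, "
      f"worst 12 of 50 seasons = {worst_12_share:.0%} of total losses")
# A heavy-tailed distribution routinely puts the standard deviation above the
# mean and concentrates most of the total in a few seasons -- the volatility
# Smith is describing.
```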