Claims Data as an Offensive Weapon?

Tim Crossley, London Market Sales Manager, FINEOS

I had the pleasure of attending the recent Verisk Insurance Symposium in London. Verisk put on a great show and certainly demonstrated their commitment to the London Market. I decided to attend because of my interest in the link between better quality claims data and more accurate catastrophe modelling. During the Symposium, one of the Verisk executives used the phrase ‘high resolution’ to describe the data needed to drive leading-edge cat models, and it certainly resonated with me. One of the things we majored on in designing our claims system for the Lloyd’s market was making it easy to add useful additional data to the core claims data supplied via ECF (the market’s Electronic Claims File).

On the one hand, our clients can ensure that all participants on a claim are uniquely and correctly identified, from Loss Adjusters to Lawyers, Risk Managers to Recovery Experts. This provides a great basis for ‘slicing and dicing’ claims data in all sorts of ways and getting a feel for which partnerships and relationships are working well and which are a bit ‘under par’. People seem to like the ‘cockpit’ analogy we use when describing FINEOS Claims, and the extra level of instrumentation we provide takes a lot of the guesswork out of running a claims operation. Too many organisations seem to be flying blind at the moment. Simple questions from the board like ‘who are the top five customers of our claims service?’ and ‘what percentage of our legal fee expenditure is represented by our top five lawyers?’ can now be answered effortlessly, whereas they would previously have led to much soul searching and gnashing of teeth. The daring, or even enlightened, Claims Director can of course serve up the newly available, high-resolution data for self-service analysis by their colleagues, business partners or even customers, but I do understand that this isn’t a path to be trodden lightly.
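To make that concrete, here is a minimal sketch in Python with pandas, using an entirely hypothetical fee-line dataset (the column names and figures are my own invention for illustration, not the FINEOS data model): once every law firm on a claim is uniquely identified, the board’s legal-fee question reduces to a simple aggregation.

```python
import pandas as pd

# Hypothetical fee lines: each row is a paid legal fee on a claim,
# keyed to a uniquely identified law firm -- the payoff of clean participant data.
fees = pd.DataFrame({
    "claim_ref": ["C001", "C002", "C003", "C004", "C005", "C006"],
    "law_firm":  ["Firm A", "Firm B", "Firm A", "Firm C", "Firm B", "Firm D"],
    "fee_paid":  [12000.0, 8500.0, 4300.0, 22000.0, 9100.0, 1500.0],
})

# Total legal spend per firm, largest first.
by_firm = fees.groupby("law_firm")["fee_paid"].sum().sort_values(ascending=False)

# The board's question: what share of legal fees goes to our top five firms?
top_five_share = by_firm.head(5).sum() / by_firm.sum()
print(by_firm.head(5))
print(f"Top five firms' share of legal fees: {top_five_share:.1%}")
```

The same pattern answers the ‘top five customers’ question; the hard part is the clean, unique identification of participants, not the query itself.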

On the other hand, I hear time and time again that better, more granular coding of claims is essential if actuaries and risk managers are to have a useful historical basis for pricing and reserving decisions. I have also heard reinsurers make a clear link between the consistency and standardisation of claims data and the cost of reinsurance. Clearly there is a cost, in time and effort, to augmenting the standard ECF claims data, and I have heard both schools of thought on the subject. Some organisations are focused on operational efficiency as the sole goal, whereas others are willing to invest to lift the fog of obscurity and turn claims data into an offensive weapon. My personal opinion is that it has to be worth the extra effort. Investing in a multi-million-dollar, next-generation modelling platform without making sure it has the best quality inputs is like fuelling a Porsche 911 with chip fat. Garbage in, garbage out, anybody?
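As a toy illustration of the garbage-in, garbage-out point (hypothetical numbers, plain Python): claims coded only at a coarse level hand the actuary a single blended severity figure, while granular peril codes expose the very different distributions underneath.

```python
import statistics

# Hypothetical claim severities (GBP), all coded merely as "Property"
# under a coarse scheme, but carrying peril sub-codes under a granular one.
claims = [
    ("flood", 180_000), ("flood", 220_000), ("flood", 150_000),
    ("theft", 4_000),   ("theft", 6_500),   ("theft", 5_200),
]

# Coarse coding: one blended average hides two very different perils.
blended = statistics.mean(amount for _, amount in claims)
print(f"Blended 'Property' severity: {blended:,.0f}")

# Granular coding: severity by peril -- a basis an actuary can actually price on.
for peril in ("flood", "theft"):
    severity = statistics.mean(a for p, a in claims if p == peril)
    print(f"Mean {peril} severity: {severity:,.0f}")
```

Scale that up to a next-generation cat model and the cost of coarse coding compounds quickly.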
