
04 March 2013

Risk.net: Technology providers look to the cloud to meet insurers' modelling challenges




Cloud technology potentially offers insurers an efficient way to undertake the huge amount of actuarial and risk modelling calculations that need to be performed. But with concerns around data security and reliability, is it really a fail-safe option?


The requirements for insurers to provide ever more detailed and frequent risk analysis and capital calculations, not least for the impending Solvency II regime, place ever greater demands on technology. Affording and managing the infrastructure to meet these requirements is a burden that even the largest insurers can find challenging.

One potential solution is 'cloud computing' - huge pools of processing and data management resources available online and on demand. The vendors of systems for actuarial modelling and risk management, where the most computationally intense tasks tend to be concentrated, are attempting to make their products more viable by adapting their software to run in the cloud, and some insurers are already testing the waters. But transferring processing and data to an apparently nebulous environment like the 'cloud' raises a number of issues, such as data security, regulatory compliance and reliability.

The secret of cloud technology is 'virtualisation'. Traditionally, an actuarial model or other application runs on a particular machine with its own processor, operating system, memory and storage. By contrast, cloud operators separate the application from the other elements - the processors, operating systems, memory and storage - which they gather in vast stores in networked data centres.

When an insurer wants to run an application such as an actuarial model, the cloud provider quickly builds a 'virtual machine' from its store of resources, with the appropriate level of processors, memory and other elements to complete the task in the required time. In this way, cloud operators can provide tailor-made virtual machines, with tens of thousands of processors if necessary, on a temporary or longer-term basis for modelling runs or other intensive computational tasks, charging only rental rates and bearing all the systems implementation, maintenance and upgrade burdens.
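The sizing logic described above can be sketched in a few lines: given an estimate of the total work a modelling run involves and a deadline, choose how many processors to rent. The figures and the near-linear scaling assumption below are purely illustrative (real provisioning depends on the provider's instance types and how well the model parallelises):

```python
import math

def cores_needed(core_hours: float, deadline_hours: float) -> int:
    """Cores required to finish `core_hours` of embarrassingly parallel
    work within `deadline_hours`, assuming near-linear scaling.

    Hypothetical sizing helper; real cloud schedulers account for
    start-up time, instance granularity and imperfect parallelism.
    """
    return math.ceil(core_hours / deadline_hours)

# A stochastic valuation run estimated at 50,000 core-hours,
# due overnight within a 12-hour window:
print(cores_needed(50_000, 12))  # 4167 cores
```

Because the cores are rented rather than owned, the same calculation can be rerun for each reporting cycle as the workload and deadline change.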

Last year, risk management consultancy Towers Watson adapted its MoSes financial modelling application to run in Microsoft's Windows Azure cloud, while Seattle-based actuarial consultancy Milliman has also released a cloud version of its financial modelling application, MG-Alfa, on Windows Azure.

Joel Fox, risk consulting and software director at Towers Watson, based in London, says a growing problem for companies in recent years has been determining the appropriate level of IT infrastructure to serve fluctuating business needs. "The financial and regulatory reporting cycles, product development timetables and other analytical requirements inevitably lead to peaks and troughs in IT activity levels for insurers", he says.

Companies, Fox claims, are faced with the dilemma of whether they should invest in maximum capacity to handle peak demand, knowing they will have idle capacity during off-peak periods. "Understandably, some companies have been or are reluctant to invest on these terms because data centres that permit parallel processing of tasks on a large scale don't come cheap", he adds.
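The dilemma Fox describes can be made concrete with a back-of-the-envelope comparison between owning peak capacity year-round and renting it only when needed. All numbers here are hypothetical, chosen only to illustrate why low average utilisation favours the rental model:

```python
def owned_cost(peak_cores: int, cost_per_core_year: float) -> float:
    """Annual cost of owning enough cores to cover peak demand,
    paid whether or not the cores are busy."""
    return peak_cores * cost_per_core_year

def cloud_cost(peak_cores: int, utilisation: float,
               rate_per_core_hour: float) -> float:
    """Annual cost of renting the same capacity on demand,
    paying only for the fraction of the year it is actually used."""
    hours_per_year = 8_760
    return peak_cores * hours_per_year * utilisation * rate_per_core_hour

# Illustrative figures: 4,000-core peak, 10% average utilisation,
# $300 per owned core per year vs $0.10 per rented core-hour.
print(owned_cost(4_000, 300.0))        # 1,200,000
print(cloud_cost(4_000, 0.10, 0.10))   # 350,400
```

Under these assumed figures the on-demand model costs well under half as much; at high sustained utilisation the comparison can reverse, which is why the peaks-and-troughs pattern of reporting cycles matters.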

Brian Reid, MG-Alfa global sales director at Milliman, claims running MG-Alfa on the Azure cloud provides unlimited capacity on demand, which he says is affordable and reliable. "We saw that clients were struggling with infrastructure obstacles as their need for computing capacity increased exponentially under the emerging regulatory requirements", he adds.

Full article (Risk.net subscription required)



© Risk.net




