Pulling Big Insurance Into the In-Memory Fold
Not so long ago, well before “big data” became the buzzword of choice, many businesses could already have considered their work data-intensive. However, as that data has grown larger and more diverse, and as the business models behind it have demanded ever-faster reactions, the standard model has started to feel a little slow.
Many applications in these businesses were not just outgrowing their data management pens on the software front; their computing environments were also shifting.
Now, many users have massive, diverse, wide-ranging data sets that are both memory- and CPU-intensive, yet do not require the “big data persistence” so many vendors have touted.
The answer to these problems comes in the form of in-memory technologies, which are slowly changing the way some key business areas operate, including insurance and risk management. While we’re going to be hitting on SAS and its answers for insurance in a moment, we wanted to point briefly to an SAP video that puts the simpler side of these analytics questions in some context first.
While SAP has lately been vocal about fitting its HANA platform to the needs of insurance, SAS recently reached out to the insurance industry, noting that it stands to benefit from new developments that deliver better performance and scalability than traditional data management platforms.
SAS told its insurance customers that when it comes to finding more sophisticated solutions to contend with their massive data and analytics needs, most vendors are “recycling relational and OLAP technology from 20 years ago and marketing it as new and improved.”
When it comes to performing near-instantaneous query and reporting on sums and counts of billions of records, SAS says users won’t be hard-pressed to find a company that can prove efficacy. However, the company claims it’s not so easy to find solutions that can do this across hundreds or thousands of variables while also supporting the distributed analysis and predictive modeling that insurance and risk management companies require.
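To make that claim a little more concrete, here is a minimal sketch of the general technique at work: records are partitioned across memory, per-partition sums and counts are computed in parallel, and the cheap partial results are merged at the end. This is purely illustrative; the partition layout, the claim_amount column, and the thread-pool approach are our assumptions, not a description of SAS’s implementation.

```python
# Hedged sketch: parallel sum/count over in-memory partitions.
# Everything here (data shape, column name) is hypothetical.
from concurrent.futures import ThreadPoolExecutor

# Each "partition" stands in for a chunk of records pinned in memory,
# potentially on a different node in a distributed deployment.
partitions = [
    [{"claim_amount": 1200.0}, {"claim_amount": 430.5}],
    [{"claim_amount": 980.0}, {"claim_amount": 2150.75}],
]

def partial_aggregate(partition, column):
    """Compute (sum, count) for a single in-memory partition."""
    values = [row[column] for row in partition]
    return sum(values), len(values)

with ThreadPoolExecutor() as pool:
    partials = list(pool.map(lambda p: partial_aggregate(p, "claim_amount"),
                             partitions))

# Merging partials is trivial: add the sums, add the counts.
total, count = map(sum, zip(*partials))
print(f"sum={total}, count={count}, mean={total / count:.2f}")
```

The design point worth noticing is that the expensive scan happens once per partition, in parallel, while the merge step touches only a handful of numbers, which is what makes this pattern scale to billions of records.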
According to the analytics giant, the LASR Analytic Server was tailored to fit the needs of a number of heavy-duty analytics use cases. As the company describes it, LASR is a read-mostly, stateless, distributed in-memory server that provides “secure, multi-user, concurrent access to data in a distributed computing environment.”
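As a rough illustration of what a “read-mostly” design can look like in practice, the sketch below lets any number of concurrent readers work lock-free against an immutable in-memory snapshot, while occasional writers serialize among themselves and swap in a fresh snapshot wholesale. The class and its names are hypothetical assumptions for illustration, not LASR code.

```python
# Hedged sketch of a read-mostly, multi-user access pattern.
import threading

class ReadMostlyTable:
    def __init__(self, rows):
        # Immutable snapshot: readers never observe a partial update.
        self._snapshot = tuple(rows)
        self._write_lock = threading.Lock()

    def read(self):
        # Readers take no lock; they simply grab the current snapshot
        # reference, which writers replace atomically.
        return self._snapshot

    def reload(self, rows):
        # Rare writers serialize and replace the snapshot wholesale.
        with self._write_lock:
            self._snapshot = tuple(rows)

table = ReadMostlyTable([("policy-1", 1200.0), ("policy-2", 430.5)])
print(len(table.read()), "rows loaded")
```

Because reads dominate and never block, this kind of pattern is one way a server can offer many concurrent users consistent access to the same in-memory data without paying locking costs on every query.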
As the company describes it, the platform provides insurance and other big data industries with a direct-access, NoSQL, NoMDX server engineered for the new breed of applications (i.e., those that have outgrown their database containers). SAS claims the performance is achieved by tapping multithreaded and distributed in-memory analytic architectures.
Countless vendor whitepapers describe solutions and use cases, and while we tend to pass most of them by, the SAS whitepaper describing the thin-layer technology that lets LASR run inside distributed environments like Hadoop, as well as within traditional databases like Greenplum and Teradata, might be worth a read.