May 16, 2014

Big Data Going to Waste, Study Finds

Organizations are able to take advantage of just a fraction of the data they’re collecting and storing, according to a study commissioned by VoltDB. The reason? A shocking lack of throughput in their operational databases, the NewSQL database developer says.

The study of 368 IT professionals painted a dim picture of the performance of operational databases at US firms. While the problems are not causing organizations to ditch big data projects en masse, the issues are handcuffing the ability of businesses to make full use of the data they collect.

The key finding is this: 72 percent of respondents say they’re unable to access and make decisions on more than 50 percent of their data. In other words, only 28 percent of respondents can actually act on the majority of the data they collect.

The survey says this dismal state of affairs is hampering organizations’ capability to achieve big data goals, such as delivering personalized customer experiences, driving higher revenue, and lowering costs.

The operational issues aren’t causing many organizations to ditch their big data projects. VoltDB’s research finds fewer than 17 percent of respondents say they ended such projects in the last two years due to database problems. Instead, the big data problems are pushing IT pros to get creative with workarounds.

The most popular technique for overcoming database shortcomings is to cache data, which half of all respondents indicated doing, according to the survey. More than one-third say they have either adopted batch ETL processes to move data or bought additional licenses to keep up with data flows. About 31 percent bought a separate stream-processing engine to manage real-time flows, while 15 percent say they have never encountered a problem with their database of record.

Traditional SQL-based relational databases remain the most popular, with 71 percent of respondents reporting their use. MySQL is used by about half of respondents, while NoSQL databases are used by 38 percent. NewSQL databases are in use at only about 17 percent of organizations, fewer than the share still running hierarchical mainframe databases.

The survey shows how far most organizations are from achieving their big data dreams, says VoltDB CEO Bruce Reading. “Organizations must have the ability to not only ingest massive amounts of data, but also immediately analyze and act on that data in a meaningful way to realize the big payoff: improving their bottom lines,” Reading says in a press release.

VoltDB develops an ACID-compliant, in-memory relational NewSQL database aimed at powering the transactions and basic analytic functions in hyperscale Web and mobile applications. The company, which was co-founded by database pioneer Mike Stonebraker, recently told Datanami it grew revenues by nearly 300 percent last year.

Related Items:

VoltDB Turns to Real-Time Analytics with NewSQL Database

Picking the Right Tool for Your Big Data Job

RDBMs: The Hot New Technology of 2014?
