
Features
Big Data – Scale Up or Scale Out or Both
The term “Big Data” is generally used to describe datasets that are too large or complex to be analyzed with standard database management systems. When a dataset qualifies as “Big Data” is a moving target, since the amount of data created each year keeps growing, as do the software tools and the hardware (speed and capacity) used to make sense of the information. Many use the terms volume (amount of data), velocity (speed of data in and out), and variety (range of data types and sources) to describe “Big Data”. Read more…
This Week’s Big Data Big Seven
We wrap up this week with news about a new high performance, data-intensive supercomputer from SGI; new Hadoop announcements, including those from Hortonworks, Datameer, and Karmasphere; software enhancements for big data infrastructure from ScaleOut; and some other startup goodness, all with an eye on next week's International.... Read more…
Bar Set for Data-Intensive Supercomputing
This week the San Diego Supercomputer Center introduced the flash-based, shared memory Gordon supercomputer. Built by Appro and sporting capabilities in the 36 million IOPS range, the system prompted the center's director to state, without exaggeration, that a new era of data-intensive science has begun. Read more…