
February 16, 2012

Alpine Data Climbs Analytics Mountain

Robert Gelber

Today’s big data arena is filled with promises of taming ever-growing data complexity, cutting operational costs, and speeding up analytics. One startup founded by a pair of analytics veterans claims to have an even faster analytics model than the rest of the pack.

Amidst all the applications and offerings from startups and titans alike, there have been plenty of success stories of big data wrangling and analytics know-how. In 2010, Stanford alum Anderson Wong and Texas State alum Yi-Ling Chen co-founded Alpine Data Labs, a company that claims to deliver advanced analytics simply and cost-effectively by harnessing the database’s own processing power.

The San Mateo startup recognized recent changes in database and HPC technologies and has capitalized on the need to update antiquated data mining techniques. Alpine traces the ineffectiveness of earlier analytics technologies to the split between data storage and data modeling: data must be compressed, converted and transferred from the storage system to the modeling environment and back again.

Time is lost and extra computing cycles are wasted on all of this overhead. It is comparable to a person standing right next to the TV who wants to change the channel but can’t find the remote. They search and search, but never make the faster, simpler decision of pressing the channel button on the TV itself.

Yesterday Alpine VP Clint Johnson sat down for an interview with Ron Powell on the Beye Network, and there were a couple of interesting takeaways from their conversation. The first was a valid question regarding Alpine’s “in-database” approach, since the application platform draws processing power from a system that was designed to manage records. Johnson’s answer: “If you have tools that can run the analytics directly within the database, then the only issue left is workload management. It’s making sure that the analytics process doesn’t affect normal reporting.”

Earlier in the interview, Johnson spoke highly of Hadoop and its ability to provide a means of entry into advanced analytics at a comparatively low investment. Later on, the VP mentioned plans to introduce compatibility with DB2 and Hadoop.

Alpine’s offering is an application platform called Alpine Miner. Rather than the split model-versus-storage approach, the platform uses an “in-database” model, which makes use of the processing capabilities of the database engine itself.

Currently supported databases are Greenplum, Oracle 11g, Oracle Exadata Data Computing Machine and PostgreSQL. The Alpine Miner client runs on either Windows XP or OS X 10.5 or higher (no word yet on Linux compatibility). The application allows for value analysis, frequency analysis, SQL execution and other operations that help aggregate, explore, transform and analyze data.
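
Alpine Miner itself is proprietary, but the in-database idea is easy to sketch. Instead of pulling every row out to the client and counting there, the frequency analysis is pushed into the database as SQL, so only the small result set ever leaves the engine. Here is a minimal illustration using Python’s built-in sqlite3 module — SQLite and the toy transaction data are stand-ins for this sketch, not anything Alpine ships; the company targets engines like Greenplum and PostgreSQL:

```python
import sqlite3

# Build a toy table of transactions (invented stand-in data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (customer TEXT, product TEXT)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [("alice", "widget"), ("bob", "widget"),
     ("alice", "gadget"), ("carol", "widget")],
)

# Extract-then-model approach: ship every row to the client,
# then do the counting in the application.
rows = conn.execute("SELECT product FROM transactions").fetchall()
counts_client = {}
for (product,) in rows:
    counts_client[product] = counts_client.get(product, 0) + 1

# In-database approach: the engine performs the frequency analysis;
# only one row per distinct product crosses the wire.
counts_indb = dict(conn.execute(
    "SELECT product, COUNT(*) FROM transactions GROUP BY product"
))

# Same answer, far less data movement.
assert counts_client == counts_indb
```

The difference is trivial at four rows, but at the scales Alpine is pitching, avoiding the compress-convert-transfer round trip is the whole value proposition.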

It will be interesting to see how Alpine’s model fares against other industry analytics platforms. If Alpine Miner gains steam, there may be more in-database developers on the horizon.

BigDATAwire