September 26, 2013

Dr. Hadoop Being Put to the Challenge at Mount Sinai

Isaac Lopez

Big data luminary Jeff Hammerbacher, who built a reputation last decade as the original data scientist behind Mark Zuckerberg’s Facebook empire, has been famously quoted lamenting that “the best minds of my generation are thinking about how to make people click ads – that sucks.” Now at the Mount Sinai School of Medicine, the poster boy of data science is set to show what this big data thing can do.

Having joined the Mount Sinai School of Medicine in July 2012, Hammerbacher (who is also chief scientist and a co-founder of Hadoop distro vendor Cloudera) has settled into his role there as an assistant professor at the Icahn School of Medicine. The MIT Technology Review reports that the data science wunderkind is currently leading the design of a new computing cluster, which will undoubtedly become the new Hadoop heart of the medical school – an institution that treats half a million patients each year.

“We’re going out on a limb – we’re saying this can deliver value to the hospital,” Hammerbacher told the Review. He and his colleagues hope that the sizeable investments Mount Sinai is making, including a $120 million electronic medical records system, pay off in the “game changing” ways that big data technologies are hyped to produce. In that sense, the Mount Sinai experiment is, for all practical purposes, a high wire act for Hadoop itself. This time the stakes aren’t ad revenues, but the patients occupying the hospital’s 1,406 beds.

The experiment certainly comes with its share of challenges. The Review reports that Mount Sinai is already part of an experiment launched by Washington, D.C. called “accountable care,” in which hospitals are paid based on outcomes rather than billing by procedure (an arrangement under which hospitals earn more the sicker a patient gets). If Hammerbacher’s system can improve outcomes while cutting costs, Mount Sinai could potentially serve as a role model for a new way healthcare can be done.

As it is, the Review reports that Hammerbacher has been given virtual carte blanche to muster whatever resources he needs to get the job done. The organization has opened its pocketbook wide to recruit as many people as it can, not to mention its investments in equipment to process massive amounts of data – the lifeblood of the project.

While the organization has plenty of data to go around, including 26,735 patient DNA and plasma samples, Hammerbacher has his work cut out for him in getting patient data into the Hadoop cluster quickly enough for analysis to happen. Recognizing this challenge, Hammerbacher is reportedly hoping that other angles can be taken to effect positive health outcomes. These include searching for relational connections that can provide insights – such as between hospital infections and the DNA of microbes present in an ICU – or tracking patients remotely through at-home monitoring equipment.

While the challenges Hammerbacher and his colleagues face in building this new system are steep, the resources and brainpower being thrown at the effort are encouraging. And while Hammerbacher laments the priorities of his generation, this particular data scientist seems to be in the right place at the right time – and that’s awesome.

Here’s a look at Jeff Hammerbacher presenting at the SINAInnovations conference last November: 

Related items:

Can Algorithms Be Authors? One May Be Coming to Your Pocket Soon 

Reducing Suicide Rates with Predictive Analytics 

Healthcare Organizations Face Daunting Data Challenges 