Alzheimer’s Research
Using Big Data Against Memory Loss
In an unprecedented project, researchers are attempting to find ways of diagnosing Alzheimer’s, even before the onset of symptoms. A new computer architecture is expected to help in this effort, as it is able to process huge amounts of information faster than ever before.
This pairing may seem surprising at first, but memory problems set the agenda of Alzheimer’s research in more than one respect. The disease alters the brain on many different levels, from cellular metabolism to anatomy to mental performance. Understanding cause and effect for even a single piece of this puzzle requires compiling an enormous amount of information. Investigations on this “Big Data” scale can only be done with computers. And, in fact, even these no longer really suffice.
The sheer mass of data involved overwhelms even the highest-performing computers, explains Joachim Schultze. “For each individual data-processing step, increasingly huge volumes of data have to be loaded into main memory again and again. That takes an incredibly long time.” Until recently, merely preparing the genome data for the actual analysis took several days; scientists have since managed to shorten this to 25 minutes. Several other analytical steps still take hours, however, and an entire analysis may take days. For this reason, many of the questions that occupy Alzheimer’s researchers simply cannot be addressed using current computers.
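To picture the kind of bottleneck Schultze describes, here is a deliberately simplified Python sketch, with made-up step names that have nothing to do with the DZNE’s actual software: a conventional, step-wise pipeline in which every stage writes its intermediate result to storage and the next stage reads it back in. As the data set grows, this constant saving and reloading of intermediates comes to dominate the runtime.

```python
# Illustrative sketch only (hypothetical step names, not the DZNE pipeline):
# a conventional preprocessing chain in which every stage writes its
# intermediate result to disk and the next stage loads it back in.
import json
import tempfile
from pathlib import Path

def quality_filter(records):
    # Placeholder for a real quality-control step.
    return [r for r in records if r["quality"] >= 30]

def annotate(records):
    # Placeholder for a real annotation step.
    return [{**r, "annotated": True} for r in records]

def run_staged_pipeline(records, workdir):
    """Each step saves its output to storage and the next step reloads it."""
    stage1 = Path(workdir) / "filtered.json"
    stage1.write_text(json.dumps(quality_filter(records)))

    stage2 = Path(workdir) / "annotated.json"
    stage2.write_text(json.dumps(annotate(json.loads(stage1.read_text()))))

    # One final reload just to hand the result back to the caller.
    return json.loads(stage2.read_text())

if __name__ == "__main__":
    toy_data = [{"id": i, "quality": q} for i, q in enumerate([12, 35, 40, 28])]
    with tempfile.TemporaryDirectory() as workdir:
        print(run_staged_pipeline(toy_data, workdir))
```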
The new computer architecture that HPE developed for “The Machine” may mean a big step forward. At its heart is a redesign of the computer’s main memory, a component that has received little attention until now. The approach is called “Memory-Driven Computing,” and it is intended to enable much faster processing of large volumes of data.
The precondition for this, however, is that all algorithms are adapted to the new architecture. The work is worthwhile, as an initial test run after the Discover Conference showed: using the new data-processing structure, the team from the DZNE and HP Labs was able to compress the preprocessing time from 25 minutes to 39 seconds, while using at least 40 percent less energy. This saves the scientists not only time but also costs. Joachim Schultze is thrilled: “It is incredible when you think of what we can already achieve with processes that are not yet fully optimized.”
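For illustration only, the toy pipeline from the earlier sketch can be restructured so that the data stays resident in memory for the entire run. In a very loose sense, this is the flavor of adaptation the researchers describe: the steps hand their results directly to one another instead of saving and reloading them between stages. (The real adaptation to Memory-Driven Computing is, of course, far more involved.)

```python
# Illustrative sketch only (hypothetical step names, not the DZNE code):
# the same two-step preprocessing, restructured so intermediates never
# leave main memory and no files are written or re-read between steps.
def quality_filter(records):
    # Placeholder for a real quality-control step.
    return [r for r in records if r["quality"] >= 30]

def annotate(records):
    # Placeholder for a real annotation step.
    return [{**r, "annotated": True} for r in records]

def run_in_memory_pipeline(records):
    """Chain the steps directly; each result stays in memory."""
    filtered = quality_filter(records)   # kept in memory ...
    return annotate(filtered)            # ... and consumed by the next step

if __name__ == "__main__":
    toy_data = [{"id": i, "quality": q} for i, q in enumerate([12, 35, 40, 28])]
    print(run_in_memory_pipeline(toy_data))
```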
Even the computer experts are waxing lyrical about these results, and not only on account of the superiority shown by the new architecture. “We have produced an example of how information technology can contribute to solving a problem of great social relevance, namely Alzheimer’s. And this impressed a lot of people a great deal. After the conference, some of them came up to us and said, ‘Amazing, we were not aware of this. We learned something here.’”
In the coming year, the collaboration partners plan to transition further components of the genomic analysis to “Memory-Driven Computing.” After that, the processing of anatomical data derived from imaging will be tackled. If all goes well, it will mean far fewer coffee breaks for the scientists in the short term. Over the long term, their big gamble may just pay off, enabling them to diagnose Alzheimer’s before symptoms appear, and perhaps even to understand the causes of the disease. That would be a great source of hope for the development of effective therapies.