The US Department of Energy and IBM have signed a partnership to build a 20-petaflop machine (that is, a computer crunching 20,000,000,000,000,000 instructions per second) by 2011-2012, and to follow up with an exaflop machine (a 1 followed by 18 zeros), providing the processing power of one billion of today's PCs.
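A back-of-the-envelope check of the scales quoted above; the assumption that a typical PC of the time sustains roughly one gigaflop is mine, used only to verify the "one billion PCs" figure.

```python
# Orders of magnitude behind the headline numbers.
PETA = 10**15
EXA = 10**18

target_2012 = 20 * PETA      # 20 petaflops = 2e16 operations per second
exaflop = 1 * EXA            # exaflop machine = 1e18 operations per second

# Assumption: a typical desktop PC sustains about 1 gigaflop.
pc_flops = 10**9

print(exaflop // pc_flops)   # -> 1000000000, i.e. one billion PCs
```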
This machine will be able to process the exabyte of data expected to be generated every day by the Square Kilometre Array telescope project (www.skatelescope.org). The project includes the development of a new form of solid-state data storage, the “Racetrack Memory”.
Clearly this project is just starting, and many obstacles will need to be tackled and solved. It is also focused on a highly scientific objective, and the solutions are likely to be very expensive. Nevertheless, we have learnt that major scientific endeavours generate a fallout of results applicable to the everyday world. I expect that a project aiming at managing exabytes of data day in, day out will create amazing opportunities for our communities in managing and understanding the data they produce.
In 2008-2009 IBM delivered machines sporting a petaflop of processing power. The short-term target is to multiply that power by 20, and the ten-year target to multiply it by 1,000. This processing power is awesome, yet it is what it takes to process the huge amount of data created by the new telescope (and other physics experiments, like the LHC, are no joke either). It is definitely too much for our everyday needs, but in this area too we will see tremendous growth in data, and we will surely benefit from more processing power, perhaps available on demand in a “cloud” through pervasive, distributed computing. Just think of personalised medicine, where our genome will be used to create the right drug to cure or prevent a disease. What today would take a few months (decoding the genome, analysing its various genes and loci, and creating the right protein) should take only a few hours. Even then, we will still require much less processing power than what is targeted to support the SKA telescope.
Major breakthroughs are needed in power consumption: today it would take a nuclear power plant to feed such a computer. Data transfer will also be a major challenge. Moving an exabyte of data per day is equivalent to all the data moved around the globe through all telecommunications networks in the year 2000 (voice included, of course).
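To make the transfer challenge concrete, here is the arithmetic behind moving one exabyte in a single day (the sustained-rate figure is a straightforward calculation, not a quote from the project):

```python
# Sustained bandwidth needed to move one exabyte in 24 hours.
EXABYTE = 10**18            # bytes (decimal definition)
SECONDS_PER_DAY = 86_400

bytes_per_second = EXABYTE / SECONDS_PER_DAY
terabits_per_second = bytes_per_second * 8 / 10**12

print(round(terabits_per_second))   # roughly 93 Tbit/s, around the clock
```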
Researchers are looking into stream computing, a technique for analysing and sorting data on the fly as they move across networks, storing only what is needed and discarding the rest. Although storage density keeps increasing, a radically different storage technology is needed when you aim at storing exabytes.
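The stream-computing idea can be sketched in a few lines: examine records as they flow past and keep only the interesting ones, rather than storing everything first. The data values and the threshold below are arbitrary illustrations, not anything from the actual SKA pipeline.

```python
# Toy sketch of on-the-fly filtering, the core idea of stream computing.

def sensor_stream():
    """Stand-in for an endless feed of instrument readings."""
    for value in [0.1, 4.2, 0.3, 9.7, 0.05, 6.1]:
        yield value

def keep_interesting(stream, threshold=1.0):
    """Pass along readings above the threshold; discard the rest
    immediately, so nothing uninteresting is ever stored."""
    for reading in stream:
        if reading > threshold:
            yield reading

print(list(keep_interesting(sensor_stream())))  # [4.2, 9.7, 6.1]
```

In a real deployment the filter would run on network hardware against terabits of traffic per second, but the principle is the same: storage is reserved for the tiny fraction of data worth keeping.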
One promising avenue is spintronics memory, being studied by IBM: http://www.almaden.ibm.com/spinaps/research/sd/?racetrack
For more info on these futuristic computers take a look at: