Engineers at IBM are taking inspiration from the human brain in an attempt to find a way to power supercomputers.
In less than two decades, scientists predict, we'll have supercomputers that can perform a trillion billion operations per second - 300,000 times more than today's computers.
The only minor problem is that they would use more electrical energy than the world can currently produce.
However, scientists are now taking a lesson from a machine that's thousands of times denser and more efficient than any technology around today - the human brain.
Our brains are so efficient partly because they use a single network of blood vessels to deliver energy and carry away heat at the same time.
IBM is now copying this strategy by developing 'electronic blood' - a fluid that carries an electrical charge and flows to the computer's processors, cooling the machine while delivering power.
The technology is probably decades away, Bruno Michel of IBM Research told CNN Money, but when it arrives it could dramatically shrink supercomputers and make them significantly more energy efficient.
Today's supercomputers are huge - for example, a "petaflop" supercomputer, which can perform one quadrillion operations per second, takes up about half a football field. They are so large partly because the chips run so hot that they have to be spaced out inside the body of the machine.
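To put those figures side by side, here is a back-of-envelope arithmetic sketch in Python, purely illustrative and using only the numbers quoted in this article:

    # Rough check of the scales quoted above (illustrative only).
    petaflop = 1e15                    # one quadrillion operations per second
    future_machine = 1e12 * 1e9        # "a trillion billion" operations per second
    print(future_machine / petaflop)   # 1,000,000 - a million petaflop machines
    # The "300,000 times more than today's computers" figure implies today's
    # machines are being counted as a few petaflops each:
    print(future_machine / 300_000)    # about 3.3e15, i.e. roughly 3 petaflops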
But if they were powered by this new electronic blood, the chips could be stacked three-dimensionally, meaning the size of the computers could shrink drastically.
The IBM scientists believe that if they could use electronic blood to cool a petaflop machine and stack its chips, it could be shrunk to the size of a desktop computer.
"It makes much more sense to have the chips stacked," Chris Sciacca, spokesman for IBM Research told CNN Money. "What we want to do is to make water-cooled supercomputers of the future that are the size of a sugar cube."
But it's not only the size of today's supercomputers that's an issue - they also guzzle ever more energy. The chips generate so much heat that half of a machine's energy currently goes to cooling equipment such as fans.
CNN Money reports: "In the United States, data centers are responsible for more than 2% of the country's electricity usage, according to researchers at Villanova University. If the global cloud computing industry were considered to be a single country, it would be the fifth-largest in the world in terms of energy consumption."
It just goes to show that no matter how far technology advances, the human brain really does seem to be the best model we have for information processing. Nature wins again.