by David Wolpert, from the Scientific American website
Computing could be vastly more efficient, but for that to happen we need to better understand the thermodynamics of computing...
The company is taking a souped-up shipping container stuffed full of computer servers and submerging it in the ocean. The most recent round is taking place near Scotland's Orkney Islands, and involves a total of 864 standard Microsoft data-center servers.
Many people have impugned the rationality of the company that put Seattle on the high-tech map, but seriously - why is Microsoft doing this?
Precise estimates vary, but currently about 5 percent of all energy consumption in the U.S. goes just to running computers - a huge cost to the economy as a whole.
Moreover, all that energy used by those computers ultimately gets converted into heat.
This results in a second cost: the heat has to be removed, and cooling all those computers is itself hugely expensive - one motivation for Microsoft's underwater data centers.
These issues don't only arise in artificial, digital computers.
There are many naturally occurring computers, and they, too, require huge amounts of energy. To give a rather pointed example, the human brain is a computer.
This particular computer uses some 10-20 percent of all the calories that a human consumes.
Think about it: as much as a fifth of the energy you burn each day goes to the computation happening between your ears. Is that penalty why 'intelligence' is so rare in the evolutionary record? Nobody knows - and nobody has even had the mathematical tools to ask the question before.
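The scale of that penalty is easy to estimate. Here is a minimal back-of-the-envelope sketch; the 2,000-kilocalorie daily diet is an assumed typical value, not a figure from the article:

```python
# Rough estimate of the brain's power draw, assuming a 2,000 kcal/day
# diet (an assumed typical value) and the 10-20 percent figure above.
KCAL_TO_JOULES = 4184.0
SECONDS_PER_DAY = 24 * 60 * 60

daily_intake_j = 2000 * KCAL_TO_JOULES           # ~8.4e6 J per day
body_power_w = daily_intake_j / SECONDS_PER_DAY  # ~97 W average

brain_power_low = 0.10 * body_power_w            # ~10 W
brain_power_high = 0.20 * body_power_w           # ~19 W
print(f"Brain power: {brain_power_low:.0f}-{brain_power_high:.0f} W")
```

On these assumptions the brain runs on roughly 10 to 20 watts - a tiny fraction of what a data-center rack draws for far less flexible computation.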
Indeed, the comparison of thermodynamic costs in artificial and cellular computers can be extremely humbling for modern computer engineers.
For example, a large fraction of the energy budget of a cell goes to translating mRNA into sequences of amino acids (i.e., proteins) in the cell's ribosomes.
But the thermodynamic efficiency of this computation - the amount of energy required by a ribosome per elementary operation - is many orders of magnitude superior to the thermodynamic efficiency of our current artificial computers.
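To put rough numbers on that comparison, here is a sketch using figures commonly cited in the biophysics literature; the specific values - about four ATP/GTP equivalents hydrolyzed per amino acid added, each releasing roughly 20 kT of free energy in vivo - are illustrative assumptions, not figures from the article:

```python
import math

kB = 1.380649e-23   # Boltzmann constant, J/K
T = 310.0           # body temperature, K

# Assumed biophysics figures (illustrative):
# ~4 ATP/GTP equivalents hydrolyzed per amino acid incorporated,
# each releasing roughly 20 kT of free energy in vivo.
atp_per_amino_acid = 4
free_energy_per_atp = 20 * kB * T

cost_per_operation = atp_per_amino_acid * free_energy_per_atp  # ~3.4e-19 J
landauer_limit = kB * T * math.log(2)                          # ~3.0e-21 J

print(f"Ribosome, per amino acid: {cost_per_operation:.1e} J")
print(f"Landauer limit at 310 K:  {landauer_limit:.1e} J")
print(f"Ratio: {cost_per_operation / landauer_limit:.0f}x")
```

Even on these rough numbers, the ribosome operates within about two orders of magnitude of the absolute thermodynamic floor for an elementary operation - far closer to that floor than conventional digital logic gets.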
These are some of the issues my collaborators and I are grappling with in an ongoing research project at the Santa Fe Institute.
We are not the first to investigate these issues; they have been considered for over a century and a half, using semi-formal reasoning based on what was essentially back-of-the-envelope-style analysis rather than rigorous mathematical arguments, since the relevant math wasn't fully mature at the time.
So whatever else they are, computers are definitely non-equilibrium systems. In fact, they are often very-far-from-equilibrium systems.
These breakthroughs allow us to analyze all kinds of issues concerning how heat, energy, and information get transformed in non-equilibrium systems.
For example, we can now calculate the (non-zero) probability that a given nanoscale system will violate the second law, reducing its entropy, in a given time interval. (We now understand that the second law does not say that the entropy of a closed system cannot decrease, only that its expected entropy cannot decrease.)
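The result behind that calculation is the detailed fluctuation theorem, which says that the probability of a trajectory that reduces entropy by an amount σ, relative to one that produces σ, is suppressed by a factor exp(−σ/kB). A minimal sketch of what that suppression looks like:

```python
import math

# Detailed fluctuation theorem: P(-sigma) / P(+sigma) = exp(-sigma / kB),
# where sigma is the entropy produced over the time interval.
# Expressing sigma in units of kB, the ratio is simply exp(-n).
suppression = {n: math.exp(-n) for n in (1, 10, 100)}

for n, ratio in suppression.items():
    print(f"entropy change of {n} kB: suppression factor {ratio:.3e}")
```

For a nanoscale system producing only a few kB of entropy, such "violations" are rare but observable in the lab; for a macroscopic system producing astronomically many kB, the suppression factor is effectively zero, which is why we never see everyday objects spontaneously un-mix.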
There are no controversies here arising from semi-formal reasoning; instead, there are many hundreds of peer-reviewed articles in top journals, a large fraction involving experimental confirmations of theoretical predictions.
This has already been done for bit erasure, the topic of concern to Landauer and others, and we now have a fully formal understanding of the thermodynamic costs in erasing a bit (which turn out to be surprisingly subtle).
For example, moving from bits to circuits, my collaborators and I now have a detailed analysis of the thermodynamic costs of "straight-line circuits."
Surprisingly, this analysis has resulted in novel extensions of information theory. Moreover, in contrast to the kind of analysis pioneered by Landauer, this analysis of the thermodynamic costs of circuits is exact, not just a lower bound.
In light of the foregoing, it seems that there might be far more thermodynamic trade-offs in performing a computation than had been appreciated in conventional computer science, involving thermodynamic costs in addition to the costs of memory resources and number of time-steps.
Such trade-offs would apply in both artificial and biological computers.
We highly encourage people to visit it, sign up, and start improving it; the more scientists get involved, from the more fields, the better!