**The world of computing is in transition. As chips become smaller and faster, they dissipate more heat, and that heat is energy wasted outright.**

**By some estimates, the difference between the amount of energy required to carry out a computation and the amount that today's computers actually use is some eight orders of magnitude. Clearly, there is room for improvement.**

So the search is on to find more efficient forms of computation, and there is no shortage of options.

One of the outside runners in the race to take the world of logic by storm is reversible computing. By that, computer scientists mean computation that takes place in steps that are time reversible.

So if a logic gate changes an input X into an output Y, then there is an inverse operation that reverses this step. Crucially, these must be one-to-one mappings: each input produces a unique output, and each output can be traced back to a unique input.
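As a concrete illustration (not drawn from the paper itself), the controlled-NOT (CNOT) gate is a standard example of a reversible gate: it maps the bit pair (a, b) to (a, a XOR b), a one-to-one mapping that happens to be its own inverse.

```python
def cnot(a, b):
    """Reversible controlled-NOT gate: flips b when a is 1."""
    return a, a ^ b

# Applying the gate twice recovers the original input,
# demonstrating that the step is time reversible.
for a in (0, 1):
    for b in (0, 1):
        assert cnot(*cnot(a, b)) == (a, b)

# The mapping is one-to-one: four distinct inputs give
# four distinct outputs, so no information is destroyed.
outputs = {cnot(a, b) for a in (0, 1) for b in (0, 1)}
assert len(outputs) == 4
```

Because the mapping is a bijection on the input bits, no information is lost at any step, which is exactly the property reversibility demands.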

These requirements for reversibility place tight constraints on the types of physical systems that can do this kind of work, not to mention on their design and manufacture. Ordinary computer chips do not qualify: their logic gates are not reversible, and they suffer from another problem as well.

When conventional logic gates produce several outputs, some of these are not used and the energy required to generate them is simply lost. These are known as garbage states. "Minimization of the garbage outputs is one of the major goals in reversible logic design and synthesis," say Himanshu Thapliyal and Nagarajan Ranganathan at the University of South Florida.

Today, they propose a new way of detecting errors in computations, one they say is ideally suited to reversible computing and, what's more, naturally reduces the number of garbage states a computation produces.

Before we look at their approach, let's quickly go over a conventional method of error detection. This simply involves doing the calculation twice and comparing the results. If they are the same, then the computation is considered error free.

This method has an obvious limitation: if the original computation and its duplicate both make the same error, the comparison passes and the error goes undetected.
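The scheme and its blind spot can be sketched in a few lines. The faulty adder below is hypothetical, chosen only to model a systematic (rather than transient) fault: because it corrupts both runs identically, the comparison passes even though the answer is wrong.

```python
def faulty_add(a, b):
    """Hypothetical adder with a systematic fault:
    the lowest bit of the result is always cleared."""
    return (a + b) & ~1

def duplicate_and_compare(compute, a, b):
    """Conventional error detection: run the computation
    twice and accept the result only if the runs agree."""
    first, second = compute(a, b), compute(a, b)
    return first if first == second else None  # None = error detected

# Both runs make the same mistake, so the check passes:
# 3 + 4 should be 7, but 6 is accepted as "error free".
assert duplicate_and_compare(faulty_add, 3, 4) == 6
```

Transient faults that strike only one of the two runs would be caught; deterministic ones slip through.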

Thapliyal and Ranganathan have a different approach which gets around this problem. If a reversible computation produces a series of outputs, then the inverse computation on these outputs should reproduce the original states.

So their idea is to perform the inverse computation on the output states: if this reproduces the original states, the computation is error free. And because this relies on reversible logic steps, it naturally minimises the number of garbage states produced along the way.
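The idea can be sketched with a tiny reversible circuit built from the CNOT gates above. This is an illustration of the principle, not the authors' exact construction: run the circuit forwards, replay it backwards, and accept the output only if the original inputs reappear.

```python
def forward(bits):
    """A toy reversible circuit: two CNOT gates in sequence."""
    a, b, c = bits
    b ^= a          # CNOT with a controlling b
    c ^= b          # CNOT with b controlling c
    return (a, b, c)

def inverse(bits):
    """The same gates applied in reverse order undo the circuit."""
    a, b, c = bits
    c ^= b          # undo CNOT(b -> c)
    b ^= a          # undo CNOT(a -> b)
    return (a, b, c)

def verified_run(bits):
    """Run forwards, then backwards; mismatch signals an error."""
    out = forward(bits)
    if inverse(out) != bits:
        raise RuntimeError("computation error detected")
    return out

assert verified_run((1, 0, 1)) == (1, 1, 0)
```

Unlike duplicate-and-compare, the backward pass exercises different gate operations from the forward pass, so a fault would have to corrupt the two passes in exactly complementary ways to escape detection.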

There are one or two caveats, of course. The first is that nobody has yet succeeded in building a properly reversible logic gate, so this work is entirely theoretical.

But there are a number of computing schemes that have the potential to work like this. Thapliyal and Ranganathan point in particular to the emerging technology of quantum cellular automata and show how their approach might be applied.

The beauty of this approach is that it has the potential to be dissipation-free. So not only would it use far less energy than conventional computing, it needn't lose any energy at all. At least in theory.

At first glance, that seems to contradict one of the foundations of computer science: Rolf Landauer's principle that the erasure of a bit of information always dissipates a small amount of energy as heat. This is the basic reason that conventional chips get so hot.
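Landauer's principle puts a precise figure on this. The minimum energy dissipated when a single bit is erased is

```latex
E_{\min} = k_B T \ln 2
```

where $k_B \approx 1.38 \times 10^{-23}$ J/K is Boltzmann's constant and $T$ is the temperature. At room temperature ($T \approx 300$ K) this works out to roughly $2.9 \times 10^{-21}$ joules per erased bit. Tiny per bit, but real chips dissipate many orders of magnitude more than this bound on every logic operation.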

But this principle need not apply to reversible computing because if no bits are erased, no energy is dissipated. In fact, there is no known limit to the efficiency of reversible computing. If a perfectly reversible physical process can be found to carry and process the bits, then computing could become dissipation free.

For the moment, that's a wild dream. But in the next few years, as quantum processes begin to play a larger part in computation of all kinds, we may well hear much more about reversible computing and its potential to slash the energy wasted in computing.

**Source: ZeitNews.org**