Why computers need to chill out

September 21, 2018
In February 2011, the US quiz show Jeopardy! hosted a contest between its two best contestants and an IBM supercomputer named Watson.

In the battle of brain versus machine that ensued, the computer Watson won. While its victory was cause for celebration for some, for others it was an occasion for deep questioning.

What, some wondered, was the engineering cost of Watson’s victory?

While it met its objectives of speed (answering within three seconds) and accuracy, the price paid in energy consumption was enormous.

In answering a question, a human uses around one-sixtieth of a watt-hour. By contrast, Watson consumes about 114 watt-hours. On top of that, it gets hot – and keeping Watson’s temperature within acceptable limits means an even higher electricity bill.
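To put those two figures side by side, here is a quick back-of-the-envelope calculation using the numbers above:

```python
# Figures from the article: a human answering a question uses about
# 1/60 of a watt-hour; Watson uses about 114 watt-hours per answer.
human_wh = 1 / 60      # watt-hours per answer, human brain
watson_wh = 114        # watt-hours per answer, Watson

ratio = watson_wh / human_wh
print(f"Watson uses roughly {ratio:,.0f}x more energy per answer")
# roughly 6,840x
```

In other words, machine intelligence of this kind currently pays a four-orders-of-magnitude energy premium over the brain it is imitating.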

Researchers at the Masdar Institute of Science and Technology are tackling this very question: how to control, reduce, and ultimately minimize the amount of energy consumed by integrated micro- and nano-electronic circuits – the fundamental building blocks of all our electronic gadgets.

Reducing the energy consumed by our mobile and portable devices means more battery life between charges and, ultimately, a longer interval before the battery needs to be replaced.

Cardiologists love nothing more than an ultra-low-power integrated circuit in pacemakers and defibrillators implanted in their patients.
It means patients don’t need to go through the painful process of device maintenance and replacement so often. More life from batteries simply means more life for their patients.

There are many levels at which this can be achieved. One can work at the physical level of the transistor, the fundamental switching device in integrated circuits, and make each one consume less energy as it switches between its two logic states of conduction and non-conduction.
Or one can work at the level of the wires connecting these transistors into circuits. For a given function, there are several patterns of transistors and wiring that could do the job; the question for circuit design engineers is which will consume least energy.
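The physics behind the transistor-level approach can be sketched with a standard first-order model (the numbers below are illustrative assumptions, not measurements from any particular chip): each time a transistor switches, it must charge or discharge a small load capacitance, costing energy of roughly half the capacitance times the square of the supply voltage.

```python
# Illustrative sketch of switching energy: E = (1/2) * C * V^2 per event.
# C and V below are assumed, plausible values, not figures from the article.
C = 1e-15   # load capacitance in farads (1 femtofarad)
V = 1.0     # supply voltage in volts

energy_per_switch = 0.5 * C * V**2   # joules
print(f"{energy_per_switch:.2e} J per switching event")
```

The V-squared dependence is the key point: shaving even a little off the supply voltage pays back quadratically in energy, which is why voltage reduction recurs in every low-power technique that follows.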

At another level, the energy use of the part of the integrated circuit that communicates with the outside world can be reduced.
Driving signals off-chip (to use an engineering expression) uses energy. The more energy-efficient the output circuitry is, the better for the whole integrated circuit.

Other methods are more architectural in nature. One common theme is that of hardware parallelism. The concept is somewhat counterintuitive; one might expect that putting more circuits on the same task would use more, rather than less, power. But in fact, using parallel circuits to perform the same processing on independent pieces of data allows the chips to run slower, which in turn allows the supply voltage to be decreased, resulting in a net decrease in consumed energy. This insight, achieved in the early nineties, turned out to be a decisive step towards making lower-power chips. It has resulted, among other things, in the multiple-core chips that now drive our phones and computers.
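The parallelism argument above can be made concrete with a simple first-order model (all numbers here are idealized assumptions for illustration): dynamic power in CMOS logic scales roughly as capacitance times voltage squared times frequency, and at lower clock frequencies the circuit can tolerate a lower supply voltage, so energy per operation falls with the square of the voltage.

```python
# A hedged sketch of the parallelism argument, under the idealized
# assumption that required supply voltage scales linearly with clock
# frequency. Energy per operation is then proportional to V^2.

def energy_per_op(voltage):
    """Relative dynamic energy per operation, proportional to V^2."""
    return voltage ** 2

# One fast unit at full voltage versus two parallel units, each at half
# the frequency (same total throughput) and, in this model, half the voltage.
serial = energy_per_op(1.0)     # single unit, V = 1.0
parallel = energy_per_op(0.5)   # each of two parallel units, V = 0.5

print(f"energy per operation drops to {parallel / serial:.0%}")
```

In this idealized model, doubling the hardware cuts energy per operation to a quarter; real chips see smaller but still substantial gains, since voltage cannot be scaled down indefinitely.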

All these efforts to cut the power used by our chips, servers, and data centres have been grouped together under the moniker of “green computing”.
The US-based Semiconductor Research Corporation (SRC) has identified energy reduction as one of the Grand Challenges facing the microelectronics industry.

With generous support from the Abu Dhabi Advanced Technology Investment Company, SRC is funding my work on this Grand Challenge, and that of several colleagues at UAE universities.

It is my hope that one day this work will enable our future grandchildren to play Jeopardy on their grandparents’ iWatson tablet with the same intuitive ease with which my three-year-old daughter now plays Dora on her mum’s iPhone.

Dr. Ibrahim Elfadel is a professor of microsystems engineering at the Masdar Institute of Science and Technology.