(j3.2006) illusive exaflops

Van Snyder Van.Snyder
Wed Feb 23 14:51:04 EST 2011


On Wed, 2011-02-23 at 05:08 -0800, Jerry Wagener wrote:
> The article was written by Peter Kogge of Notre Dame (but formerly of
> IBM Systems Division), who chaired an expert panel commissioned by
> DARPA to investigate what it would take to build an exaflops computer
> that consumed less than 20 MW and would fit in 500 conventional server
> racks. And with foreseeable technology that looks to be unattainable.
> One thought experiment they did that I thought was interesting was to
> reduce the transistor switching voltage from today's typical 1 volt to
> 0.5 volts, thereby reducing power consumption by a factor of four. The
> price one pays for that, however, is somewhat slower switching and, to
> a lesser extent, reduced stability. Moving data fast enough is an even bigger
> challenge (but maybe optics can ride to the rescue here). Fascinating
> stuff.
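
For what it's worth, that factor of four presumably comes from the usual
CMOS dynamic-power relation (my guess at the model; the article may use
something more detailed):

    \[ P_{\mathrm{dyn}} = \alpha \, C \, V_{dd}^{2} \, f, \qquad
       \left(\frac{0.5\ \mathrm{V}}{1.0\ \mathrm{V}}\right)^{2} = \frac{1}{4}
       \quad \text{% halving } V_{dd} \text{ quarters the switching power} \]

provided the activity factor, load capacitance, and clock rate are held fixed.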

The forward voltage drop of a silicon junction is 0.6 volts.  With
germanium, it's 0.2 volts.  To run transistors below 0.6 volt, you have
to switch away from silicon, maybe back to germanium.  I don't know
about gallium arsenide.  Another trick they're already doing is to store
the switching energy in a capacitor, and retrieve it later, but this
requires a fairly uniform switching interval.  Fine for the clock, but
not much good for the "random logic" part of a computer.
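
The reason that trick wants a uniform interval can be seen from the standard
adiabatic-charging estimate (my gloss, not from the article).  Charging a
capacitance C to V abruptly through a resistance R dissipates

    \[ E_{\mathrm{abrupt}} = \frac{1}{2} C V^{2} \]

no matter how small R is, whereas ramping the charge up over a time T much
longer than RC dissipates only about

    \[ E_{\mathrm{ramp}} \approx \frac{RC}{T} \, C V^{2} \qquad (T \gg RC) \]

so the energy recovered improves only as the switching time is stretched out
relative to RC: easy to arrange for a clock with a known period, hard for
logic that switches at unpredictable times.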

> 
>     -Jerry
> 
> On Feb 22, 2011, at 7:59 PM, Van Snyder wrote:
> 
> > 
> > On Mon, 2011-02-21 at 14:47 -0800, Jerry Wagener wrote:
> >> In the 80's for instance, for most of the decade we tracked the
> >> fastest computers in megaflops, and it was big news when the first
> >> gigaflops computer appeared. As I recall in the late 90's we were
> >> anticipating the amazing advent of the first teraflops computer. And
> >> now, according to the article, the fastest computer is about 5
> >> petaflops. So for the past 3 decades we've roughly, on average,
> >> doubled the top computing speed each year (10**6 to 10**15). That's
> >> even faster than Moore's Law. Amazing. 
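
A quick check of that arithmetic (mine, not the article's): nine orders of
magnitude over roughly thirty years works out to

    \[ 10^{9/30} = 10^{0.3} \approx 2.0 \]

per year, so "doubling each year" is about right as an average over that span.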
> >> 
> >> But, alas, the article points out that some tough physical limits are
> >> rearing their ugly heads, which suggests this era may be coming to an
> >> end, and with it DARPA's hopes for an exaflops computer by 2015 may
> >> not be realized. Though I'm reminded of the saying that somebody
> >> saying something's impossible is usually interrupted by somebody doing
> >> it. Anyway, for an old out-of-touch Fortran geezer it was an
> >> interesting read.
> > 
> > At JPL last year, there was a talk by an IBM fellow from the University
> > of Kansas (whose name I have forgotten).  He had a contract from DARPA
> > to address the question of power consumption.  He looked at computing
> > from the gate level, asking how many joules are required per operation.
> > DARPA wants their exaflop computer to use less than 38 MW.  The result
> > of the study is that with current technology or that foreseen for the
> > reasonably near future, it can't be done for less than 60 MW.
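
Dividing those power figures by 10**18 operations per second puts the
gate-level question in more familiar units (my arithmetic, not his):

    \[ \frac{38\ \mathrm{MW}}{10^{18}\ \mathrm{op/s}} = 38\ \mathrm{pJ/op},
       \qquad
       \frac{60\ \mathrm{MW}}{10^{18}\ \mathrm{op/s}} = 60\ \mathrm{pJ/op} \]

i.e., the machine as a whole, memory and interconnect included, would have to
average a few tens of picojoules per operation to stay inside the budget.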