How Moore's law stabbed us in the back
Wed, 2 Jan 2002 09:07:37 +0100 (MET)
On Tue, 1 Jan 2002, Eirikur Hallgrimsson wrote:
> I can see using as many computes as become available to me.
Well, I could use a few billion CPUs, especially if wired on a 3d lattice,
if they fit into a 19" cabinet, and if they don't burn more than a kW.
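Illustrative arithmetic (mine, not from the original mail): holding those two figures together fixes the power budget per node, and it is a brutally small one.

```python
# Back-of-envelope: per-CPU power budget for a lattice of a
# billion CPUs that must stay under 1 kW total (both figures
# are from the text; "a few billion" is rounded down to one).
cpus = 1e9            # number of CPUs in the lattice
budget_w = 1e3        # total power budget, watts ("a kW")
per_cpu_w = budget_w / cpus
print(per_cpu_w)      # -> 1e-06, i.e. one microwatt per CPU
```

A microwatt per CPU is many orders of magnitude below any 2002-era processor, which is the point: such a machine needs a different device technology, not more of the same.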
> Bandwidth, too. Not averaged over time, but in response time. In
> "not having to wait."
Mojo Nation traffic does tend to clog up my pipe. Assuming I want to push
megapixel telepresence, I would need a good QoS 100 MBit/s line, at least.
GBit/s only looks impressive as long as you don't have to route other
people's traffic over your cell. In a city a thousand cells across with
lots of traffic, ~GBit/s will not look like a lot.
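A rough bisection-bandwidth sketch makes the point concrete. Assume (my numbers, extrapolating from the text) a 2-D mesh a thousand cells across, every node pushing the 100 MBit/s telepresence stream to a uniformly random destination; under uniform traffic, about half of all flows must cross the mesh's middle cut, which has only a thousand links.

```python
# Transit load per link in an n x n mesh, uniform random traffic.
# Standard bisection argument: ~half the offered load crosses the
# middle cut, which has only n links.
n = 1000                  # cells per side ("a city thousand cells across")
b = 100e6                 # offered load per node, bit/s (100 MBit/s)
total = n * n * b         # total offered load in the mesh
crossing = total / 2      # flows expected to cross the bisection
per_link = crossing / n   # the cut has n links
print(per_link / 1e9)     # -> 50.0, i.e. 50 GBit/s per bisection link
```

So a cell in the middle of such a city routes ~50x a GBit/s link's worth of other people's traffic, before counting its own.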
> If (and I don't agree with this) Free Software degrades the average
> quality of software packages, that's another pull for more computes to
Open source stuff gives me a huge pool of high-quality software (usually
better than the commercial kind) to choose from.
> take up the performance slack of less than optimal code.
> I think it's Moore's law all the way. I hope it keeps going for my
Moore's law is just about transistor density. Real-world performance has
been saturating for some time now. It's just as well the clock rate is
ramping up where complex CPUs can't improve otherwise. I'm tired of the
worsening gap between transistor counts and delivered performance.
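The density-versus-performance distinction is easy to quantify. A sketch (my arithmetic, using the commonly quoted ~24-month doubling period): compounding density over a decade gives a large factor that delivered performance simply does not track.

```python
# Moore's law as stated: transistor density doubling roughly
# every 24 months. Compound the growth over a decade -- this
# is density only, not real-world speed.
months = 120              # one decade
doubling = 24             # assumed doubling period, months
factor = 2 ** (months / doubling)
print(factor)             # -> 32.0: 32x the transistors, not 32x the speed
```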
> lifetime. I'm aware of physical limits in the present technologies, but
> I expect them to be bypassed by adoption of new technologies. I don't
> care if it's implemented in DNA or quantum interference, it just has to
I don't see why we don't have 2 1/2d molecular memories by 2012. Whatever
happened to VR, AR? Machine vision? Realtime robust robotics?
> Eirikur's corollary to Moore's law:
> Moore's law is not about technology. Technology does not drive itself.
> Moore's law is about human behavior.
Then, we lose.