Why Russell is wrong, or, What Simulating Humans Takes

Jeff Bone jbone@jump.net
Tue, 29 May 2001 10:23:17 -0500


Eugene Leitl wrote:

> Please stop pulling numbers out of thin air.

Well, I'll admit I didn't have the book handy that presents the argument I
was trying to reconstruct.  Those numbers weren't out of thin air, but
rather out of my own admittedly faulty wetware.  Given the constraints, and
the fact that we're just trying for a rough approximation here, what do you
suggest?

> Complete and utter balderdash. You can safely assume Kurzweil is smoking
> crack here.

I'll merely point out that Kurzweil has a long and uniquely good track
record for the quality and accuracy of his public predictions about
technology, whereas Eugene...  doesn't, at least not yet.

> Actually, a distributed.net type of crunch today would make lots of sense,
> since we're so ridiculously remote from realtime. A single 1 GHz CPU would
> probably generate a biologically realistic simulation of a single neuron
> (synapse, dendrite, and all) at 1:1000, so 1 ms activity would take you 1
> s to compute. Such latencies are sufficient for today's networks, even
> for dialup.

Oh, good, we're only three orders of magnitude away!  I thought we actually had
problems.  ;-)
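
Seriously, though, the three orders of magnitude are per neuron; scale up
and it's worse.  A back-of-envelope sketch in Python, taking Eugene's 1:1000
figure at face value and assuming ~10^11 neurons in a human brain:

    # Cost of a realtime whole-brain simulation, taking Eugene's figure
    # of one 1 GHz CPU per neuron running at 1:1000 realtime.
    NEURONS = 1e11        # rough human neuron count (assumption)
    SLOWDOWN = 1000       # 1 ms of activity costs 1 s on one CPU

    cpus_per_neuron = SLOWDOWN               # to hold ONE neuron at realtime
    total_cpus = NEURONS * cpus_per_neuron   # the whole brain at realtime
    print("GHz-CPUs for realtime whole-brain sim: %.0e" % total_cpus)  # ~1e14

So call it fourteen orders of magnitude for the whole enchilada, not three.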

> Moore doesn't say anything about performance. Moore says something about
> the number of devices on a single affordable piece of silicon.

That's true, but it's widely regarded as having a direct implication for
performance.  You're literally correct, but you're being pedantic here.
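
To make the implication concrete: grant the usual reading (a doubling every
~18 months, devices translating roughly into crunch) and closing a fixed gap
is just arithmetic.  A sketch, with both numbers as assumptions on my part:

    import math

    # Years of Moore-style doubling needed to close a performance gap.
    DOUBLING_MONTHS = 18    # canonical Moore's-law-ish period (assumption)
    gap = 1000              # e.g., the 1:1000 single-neuron slowdown above

    doublings = math.log2(gap)                  # ~10 doublings for 1000x
    years = doublings * DOUBLING_MONTHS / 12.0
    print("%.1f doublings, ~%.0f years" % (doublings, years))  # ~10, ~15

So the single-neuron gap closes in about fifteen years on that reading.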

> You're talking about crunch, using some abstract metric. Fact is that the
> number of molecule-sized devices alone translate into moles very rapidly
> if you start following the line a few decades into the future.  And these
> are not small molecules, so you can presume a mole of them is a big chunk
> of molecular crystal -- which needs to be cooled and I/O'd. Because the
> stuff keeps doubling, you then run out of matter in the local solar
> system.

Actually, that's not what K. talks about at all.
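
That said, the doubling-into-moles arithmetic itself is easy to sanity-check;
a sketch, assuming ~10^8 devices per chip today and the same 18-month
doubling (both numbers mine, not Eugene's):

    import math

    # Sanity check on "the stuff keeps doubling, you run out of matter".
    AVOGADRO = 6.022e23
    devices_today = 1e8           # order of a 2001-era CPU transistor count
    doubling_months = 18

    doublings = math.log2(AVOGADRO / devices_today)     # ~52 doublings
    years = doublings * doubling_months / 12.0
    print("~%.0f doublings, ~%.0f years to a mole of devices"
          % (doublings, years))

A mole of devices is most of a century out even on that straight-line read,
which is comfortably inside the Omega timeframe Eugene gives below.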

> There is no femtotech, and there are not even hints of femtoscale
> phenomena which would appear to be useful for computation.

Really?  How about doing computation using quark spin?

> DNA and RNA
> computers are worthless,

Really?  Check out [2]; it talks extensively about such things, IIRC.

> > Etc. etc. etc.  And there's a good chance
>
> Whence the optimism?

Call it professional opinion.  I've never seen a piece of software worked on
by more than one person that didn't have some amount of suckitude going
on. ;-)  Complexity due to an improper level of abstraction, or uneven
application of one, can by itself account for 2-3 orders of magnitude in
code size / complexity; to see this, compare Plan 9 with the equivalent
subsystems in current commercial OSes.  Furthermore, look at the benefits of
strong theoretical tools such as Bloom filters, which have been known for a
long time but are sufficiently obscure that even good programmers fail to
use them in many cases where they'd be appropriate...
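
Case in point, since I brought it up: the whole Bloom filter trick fits in a
screenful.  A minimal sketch (bit count, hash count, and the salted-MD5 hash
scheme are all mine, purely illustrative):

    import hashlib

    class BloomFilter:
        """Set membership in a fixed number of bits: false positives
        possible, false negatives impossible."""
        def __init__(self, num_bits=1024, num_hashes=4):
            self.num_bits = num_bits
            self.num_hashes = num_hashes
            self.bits = [False] * num_bits

        def _positions(self, item):
            # Derive k bit positions from salted MD5 digests.
            for i in range(self.num_hashes):
                digest = hashlib.md5(("%d:%s" % (i, item)).encode()).hexdigest()
                yield int(digest, 16) % self.num_bits

        def add(self, item):
            for pos in self._positions(item):
                self.bits[pos] = True

        def __contains__(self, item):
            # True means "probably there"; False means "definitely not".
            return all(self.bits[pos] for pos in self._positions(item))

    bf = BloomFilter()
    bf.add("plan9")
    print("plan9" in bf)     # True
    print("multics" in bf)   # almost certainly False

Constant space, no false negatives, tunable false-positive rate: exactly the
kind of known-but-obscure tool I mean.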

> I estimate Omega hardware to be less than 100 years away. After that, you
> can only scale up volume.

Well, huh: despite all the objections, there's actually consensus.  I too
rate it highly unlikely that we'll see the end of this century without
hitting that milestone.

> Forget "CPU".

Don't get hung up on the terminology.  "Processing node," if you will.  Whatever.

> Also, this is all fun, and stuff, but this doesn't help us to kick the Man
> in the gonads, as you need a constructive IP landscape for most of the
> above scenarios.

Oh hell, I guess I forgot: is that what we're trying to do here?  ;-)

jb