Re: Huge drivespace Re: computing budgeting (fwd)


From: Eugene Leitl (eugene.leitl@lrz.uni-muenchen.de)
Date: Thu Sep 07 2000 - 00:43:29 PDT


Jeff Bone writes:

> That's a great question. Discontinuities like that present huge
> opportunities. IME, though, the "apps" usually lag significantly because
> people just can't figure out how to do anything but more-of-the-same -w-
> discontinuous tech, often for years after it appears. A VC buddy of mine asked

I have not noticed the Net having a lot of bandwidth slack at any
point in the past. Traffic always expands to saturate the available
bandwidth (there's a law named for this, I forget whose). Apart from
that, in local networks switched 200 Mbps is too slow; 1 Gbps would
be about right just now. In a decade we should have 10-100 Gbps in
local (optical, of course) networks, so 1 Gbps nonlocal, if it indeed
arrives, would not look so exorbitant.
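For scale, a quick back-of-envelope (Python; the file size and raw
rates are my illustrative numbers, not anything from the thread):

    # Time to move a 4.7 GB DVD image at various raw link rates
    # (protocol overhead ignored; illustrative only).
    size_bits = 4.7e9 * 8
    rates = [("100 Mbps", 100e6), ("1 Gbps", 1e9), ("10 Gbps", 10e9)]
    for name, bps in rates:
        print(f"{name}: {size_bits / bps:.1f} s")
    # 100 Mbps: 376.0 s, 1 Gbps: 37.6 s, 10 Gbps: 3.8 s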

> me recently what I'd do -w- a gigabit to the house; after spending several

By the time you have a Gbps to your house, you will complain that it
is not enough, just as people now bitch that SDSL could be a tad
faster (symmetrical 50-60 kBytes/s is really not that dramatic). VDSL
will take a few years yet. Apart from prohibitively expensive
shenanigans like stringing fiber through the neighbourhood, the only
way to get a Gbps to your house is a TeraBeam type of thing
(free-space optics), which will take at least a decade to reach end
customers, and then only in major cities (they will then have to bury
fiber in thick bundles between those cities). Notice that this makes
the network essentially backboneless within a city: from the dish in
the window to the tower, maybe on to another tower or two, before the
bits dive down to yet another dish in another window. Unless your
friend lives in another city ;)

If I had a Gbps to my house right now, I could afford to host magnet
content at home without fear of being /.'d. I would run a large
number of community services (IRC, Freenet, remailers, IP onion
routers), broadcast several multimedia streams, participate in real
VR, watch TV, and run distributed thingies: ALife simulations,
large-scale molecular simulations, distributed spiking codes, pooling
my machines with my friends'. Believe me, given enough CPU I can fill
even a Tbps pipe easily.

Btw, by the time I have at least 1 Gbps to my 100 nearest neighbours,
and they're willing to let me have their spare cycles for free, and
this holds globally (so that I can net ~1 million contributors,
provided the project is interesting enough), the number of GFlops
(and especially MMX-type multimedia OPS) in your average desktop will
be extremely formidable. This is a lot of power. With this much power
I can brute-force a number of things: how to mutate machine
instructions robustly, how to semiautomatically design molecular
machines via the inverse protein folding route, and how to sample the
space of machine-phase processes and mechanosynthetic reactions. All
very useful stuff.
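How formidable? A minimal sketch, assuming a sustained per-desktop
rate (the 5 GFlops figure is my guess, not a measurement):

    # Back-of-envelope aggregate for ~1 million pooled desktops.
    nodes = 1_000_000
    gflops_per_node = 5                  # assumed sustained rate per box
    print(f"~{nodes * gflops_per_node / 1e6:.0f} PFlops aggregate")
    # ~5 PFlops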

> weeks doing due diligence on it, I had to sadly conclude that what *will* get
> done with it for the foreseeable future is just incremental: endless variations

How long is the foreseeable future? 2-3 years?

> on video-on-demand and a slow but steady creep upwards towards quality consumer
> video telephony. (1Gb/s is about 3x the discontinuity between 300baud modems
> and my DSL. Imagine trying to predict the Web back in 1980. Ted Nelson was

Well, 1280x1024 true colour at 25 fps and good quality sounds like a
lot of bandwidth. People might figure out that hooking a high-quality
solid state camera to a mobile teleoperated platform is a nice way to
go travelling. Once there is a certain critical mass of these around
the world, you could hop between spatially widely separated
unoccupied platforms instantaneously. With these you could
telecommute to the Moon, if you can stand the ~1.3 s one-way light
lag.
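Both numbers check out on the back of an envelope (uncompressed
video; the Moon figure is just the mean distance over c):

    # Raw 1280x1024 true colour at 25 fps, plus Earth-Moon light lag.
    bps = 1280 * 1024 * 24 * 25          # bits/s, no compression
    one_way_s = 384_400 / 299_792        # mean distance (km) over c (km/s)
    print(f"{bps / 1e6:.0f} Mbps raw video")                  # ~786 Mbps
    print(f"{one_way_s:.2f} s one way, {2 * one_way_s:.2f} s round trip")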

> Still, I'd love to have a complete snapshot copy of the Web. :-) :-) Might

I would like to have a complete, searchable local trail of the web
stuff I have visited. The ratio between the available bandwidth (and
the content that needs that kind of bandwidth) and the available
storage seems to stay roughly constant.
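That trail would not even be big. A sketch with guessed rates (both
figures are pure assumptions on my part):

    # A year of browsing, cached locally.
    pages_per_day = 500
    kb_per_page = 50                     # HTML plus images, rough guess
    print(f"~{pages_per_day * kb_per_page * 365 / 1e6:.0f} GB/year")
    # ~9 GB/year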

> enable some interesting search / knowledge management / knowledge extraction
> stuff... might enable some interesting knowledge bases, research tools, etc.
> Provides an interesting brain seed for AI efforts, though I shudder to think
> about the distribution of kinds of content. ;-)

Yeah, it's a quick way to become a footnote in the fossil record.

> For that matter, 10TB is a hell of a lot of virtual memory. Never mind that it

Virtual memory = memory so slow it's virtually unusable.

Btw, we're talking about rotating bits cruising on a surface: a 2d
storage medium. Now, as long as the area stays about constant (no
tennis-court-sized platter folded up in your desktop), it doesn't
matter an awful lot whether your bits take 1 um^2 or (100 nm)^2. But
if you store the same thing in a volume (say, 1 cm^3), those orders
of magnitude become cumulative. Especially once you realize that this
wipes out the boundary between RAM and peripheral storage.
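The compounding is easy to see at a fixed bit pitch (the 100 nm
spacing is an arbitrary assumption):

    # Same bit pitch on a surface vs in a volume.
    pitch_nm = 100                       # assumed bit spacing
    per_cm = 1e7 / pitch_nm              # bits along 1 cm (1 cm = 1e7 nm)
    print(f"{per_cm**2 / 8e9:.2f} GB per cm^2")   # ~1.25 GB
    print(f"{per_cm**3 / 8e12:.0f} TB per cm^3")  # ~125 TB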

What I see next is people figuring out how to encode a bit in a
bacteriorhodopsin-type engineered protein (transmembrane proteins
embedded in lipid bilayers are probably a lot easier to engineer than
figuring out how these babies should autoassemble in 3d), embed the
stuff (autoassembled into a 2d addressable matrix) in a polymerizable
Langmuir-Blodgett film, and coat, say, 200 mm wafers with it. That
would give you nice stackable storage platters, without them having
to rotate. Being solid state, these things would be so fast you'd
need a fiber to talk to them without saturating the link.
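How much would one platter hold? A sketch; the 10 nm bit pitch is my
assumption for such a protein array, nothing measured:

    import math
    # One 200 mm wafer, one bit per (10 nm)^2 membrane patch.
    radius_nm = 100e6                    # 100 mm radius in nm
    bits = math.pi * radius_nm**2 / 10**2
    print(f"~{bits / 8e12:.0f} TB per platter")   # ~39 TB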

> would totally break most virtual memory swapping / paging algorithms, but using
> it as such would enable some really, really big stuff: think massively complex
> neural nets, GA, GP. Hell, -w- 10TB I could actually unload my
> archive-of-all-email-since-time-began. ;-) When you start to get that much

With email, you're bottlenecked by your eyeballs. You can't need more
than a few GBytes of it, unless you only read the Subject: lines ;)
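The eyeball ceiling is easy to bound (reading rate and hours are my
assumptions):

    # A lifetime of nonstop reading: 300 words/min, 8 h/day, 40 years.
    words = 300 * 60 * 8 * 365 * 40      # ~2.1e9 words
    print(f"~{words * 6 / 1e9:.0f} GB of plain text, tops")   # ~13 GB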

> storage capacity, it really starts to make sense never to throw anything away.
 
What? You mean, you're not a digital pack rat?

> Tying back to the 1GB thing, one problem -w- VOD is that giving every customer
> 10 hrs of upstream storage would eat the current annual world output of 33GB
> Quantum Fireballs in even a modest 1M subscriber network. 10TB storage media
> at $500 price points really starts to make the notion of
> remote-storage-as-utility feasible.
>
> But of course the right answer to the question is I'd do whatever it took to
> make a complete copy of every song that ever hit Napster. :-) That would eat a
> noticeable fraction of the 10TB drive.

We're talking about a lousy 1000 GBytes. You can already buy 80 GByte
EIDE drives, right? That's just over one order of magnitude away. 1
TByte sounds like about the right size to start ripping all those DVD
videos at full quality.
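Roughly how many discs is that? (Disc sizes are the standard single-
and dual-layer capacities.)

    # Full-quality DVD rips per TByte.
    for gb in (4.7, 8.5):                # single- and dual-layer discs
        print(f"{gb} GB discs: {1e12 / (gb * 1e9):.0f} per TByte")
    # 4.7 GB: ~213, 8.5 GB: ~118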


