Ron Resnick writes:
> A followup to recent discussions of physics analogies to computing
> (eg Climbing Clueful Mountain and Ernie's rebuttal - Physics of Objects).
>
> Ron.
> --
>
> Bennett Elected to the National Academy of Sciences
...Information on Charles H. Bennett removed...
> Bennett along with Gilles Brassard of the Universite de Montreal,
> invented quantum cryptography, which uses the uncertainty principle
> to protect secret messages from eavesdropping. He is also one of the
> framers of the concept of quantum teleportation, where the exact
> quantum state of a particle such as a photon or atom can be disembodied
> from that particle and reincarnated in another particle that has never
> been near the first particle.
how many forkers subscribe to the theory of quantum cryptography, let
alone quantum teleportation? i don't care how funky quantum physics
is, i'm quite unconvinced that any of these ideas can be considered
"inventions".
i recently read "bringing schrodinger's cat to life" by staff writer
philip yam in the june issue of sciam.
here, philip covers, in brief, everything from david pritchard's
experiments with interference patterns (created by playing with the
quantum state of an atom when coupled with a measuring device, in this
case a photon) to schrodinger and bohr's complaints about
quantum theory.
of particular note (and something i hadn't read elsewhere before) is
the discussion therein of the quantum-classical changeover - the state
at which the rules of classical physics wrestle reality away from the
quantum - especially where such "thought experiments" are now real lab
experiments.
now this is all well and good, and i get the gist of the bulk of it,
but when the article dives into the "futures" of quantum computing at
the end and in a side panel, i lost interest.
can some physicist explain to me, for instance, what the following
means:
"Even if experiments cannot yet tackle the measurement problem fully,
they have much to contribute to a very hot field: quantum computing.
A classical computer is built of transistors that switch between 0 or
1. In a quantum computer, however, the "transistors" remain in a
superposition of 0 _and_ 1 (called a quantum bit, or qubit);
calculations proceed via interactions between superposed states until
a measurement is performed. Then the superpositions collapse, and the
machine delivers a final result."
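for what it's worth, here's how i'd picture that paragraph in code -- a
toy sketch of a single qubit as a pair of amplitudes (my own names and
simplification, nothing from the article), with a hadamard gate putting
it into superposition and a measurement collapsing it:

```python
import random

# toy single-qubit "transistor": a pair of real amplitudes for |0> and |1>
# (real amplitudes are enough for this sketch; a real qubit uses complex ones)

def hadamard(state):
    """rotate a qubit into (or back out of) an equal superposition."""
    a, b = state
    s = 2 ** -0.5
    return (s * (a + b), s * (a - b))

def measure(state):
    """collapse: the qubit comes out 0 or 1 with probability amplitude**2."""
    a, _ = state
    return 0 if random.random() < a * a else 1

qubit = (1.0, 0.0)       # start in a definite 0
qubit = hadamard(qubit)  # now 0 _and_ 1: amplitudes (0.707..., 0.707...)
bit = measure(qubit)     # collapse to a definite 0 or 1, 50/50
print(bit)
```

note that applying hadamard twice brings the amplitudes back to (1, 0)
exactly -- the |1> contributions cancel out. that cancellation is the
interference trick quantum algorithms lean on to "process" with bits
that are both 0 and 1.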
how does a machine "process" with bits that are both 0 _and_ 1? how
does the observer know when to observe (to collapse the wave
function)? plus, as the article states a few paragraphs later, even
if the gate speed of such a system were 0.1 milliseconds, the bits
would have to remain in superposition for at least a year to complete
a meaningful computation (in this case, factoring a 200-digit number).
can anyone make a guess where they came up with this number?
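my own hedged guess at the arithmetic (assuming they mean shor's
factoring algorithm and an n**3 gate count -- my assumptions, not
anything the article says):

```python
import math

# back-of-envelope, all assumptions mine: shor's factoring algorithm on an
# n-bit number takes on the order of n**3 elementary gate operations,
# executed one after another at the article's 0.1 ms per gate.

digits = 200
bits = math.ceil(digits * math.log2(10))  # a 200-digit number is ~665 bits
gates = bits ** 3                         # ~2.9e8 operations
gate_time = 0.1e-3                        # 0.1 ms per gate, per the article

seconds = gates * gate_time
print(f"~{bits} bits, ~{gates:.1e} gates, ~{seconds / 3600:.0f} hours")
```

which lands around 8 hours, not a year -- so either their gate count
carries a fat constant factor (repeated runs, scratch qubits, error
handling) or they used a different cost model entirely. either way the
point survives: the superposition has to hold together for an absurdly
long time.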
just looking for answers in this schrodinger's cat eats schrodinger's
cat world,
joe
p.s. oh yes, they hint at quantum teleportation, quantum key
cryptography, and other wackiness in a side-bar.