Re: The Fabric of Reality...

From: Karl Anderson (kra@monkey.org)
Date: Mon Aug 21 2000 - 13:28:51 PDT


"Gordon Mohr" <gojomo@usa.net> writes:

> Karl Anderson writes:
> > "Gordon Mohr" <gojomo@usa.net> writes:
> > > We could then, just for kicks, live a few hundred "simulated natural
> > > lifetimes", in the blink of an eye -- if such an exercise were at all
> > > interesting. We could rerun the same life with slightly different
> > > initial conditions; we could invite friends over for guest-starring
> > > roles; we could discuss the twists and turns between lifetime sorties.
> >
> > Yeah, but that would be extremely unethical, unless you don't believe
> > in hard AI, in which case "cannot be differentiated" is false.
> >
> > "Whoops, I messed up the input parameters a notch - a version of you
> > just spent a lifetime getting gangraped by pirates. Butterfly effect,
> > dude!"
>
> I don't exactly follow your comments on the relevance of "hard AI". I
> can see ethical dangers, but not insurmountable problems.

I was thinking of complete sims with sentient players, not wallpaper
or actors like you mention later. Regardless, there's at least one
sentient player.

> If the main character in such simulations is ethically "me", and I
> voluntarily agree to enter and live with the outcomes, knowing that
> any pain will at least be finite in duration, who has been unethically
> treated? Mortality itself may just be a safety mechanism, to ensure
> nothing goes too awry for too long on any one "run".

Sure, any sentient being has the right to choose to commit suicide,
but would you choose to commit suicide a million times? I hope that I
wouldn't, and then neither would my copies. And it had better be a
damn good simulation before I agree to live out my life in it - I
don't want to figure out after a year that it's a sim.

> Perhaps at every second, the simulation freezes. A forked version
> of me is explained the situation, and given a chance to rescue the
> naive version before continuing. Would that provide suitable ongoing
> informed consent?

How do you rescue the naive version? It's alive and sentient - you
can't just turn it off.

What do you do with each forked version after it chooses for the naive
one? Kill it?

I thought all of this talk was hokey until I read Greg Egan's
_Diaspora_, where a group of (somewhat simplistic) AIs have their own
ethical rule about created sentients, which is basically: once it has
cycles, it's a citizen. One character somewhat unethically clones
himself to do a rather harsh job, and you see it from the clone's
POV - he wakes and looks back at the original, who is trying not to
look relieved.

-- 
Karl Anderson      kra@monkey.org           http://www.pobox.com/~kra/

