From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Aug 18 2000 - 17:38:07 PDT

A final note: There is admittedly one case in which I'm letting my
"personal" morality interfere with the Universe - the Rule that you
can't harm a sentient being without consent. One can imagine a world in
which you were allowed, with your personal computational power, to
construct a Child and abuse it. I don't think the Sysop Instructions
should permit this.

I could plead cognitive necessity - that the concept of "human rights"
will be more stable if it clearly applies to minds in general, rather
than applying to some minds and not others. I do not so plead. I feel
obliged to safeguard the rights of future beings as well as present
ones.

Sincerely,
Eliezer.
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://singinst.org/home.html