Thus spake Lucas Gonze:
> What I don't get about his stuff is how he sidesteps the concept of self. I.e.,
> in his concept it is somehow fundamental that AI things would not have a self. I
> get his point that his AI things aren't evolved, so there is no selection, so
> there is no pressure to create units of selection, so there's nothing to possess
> selfhood.
Damn, I guess I'm really going to have to read this thing.
But my immediate reaction is, what the hell does evolution
have to do with a sense of self?
We are actually incredibly clueless about the concept of a sense of
self. (In earlier ages we might have called it a "soul"?) You can't
even prove to me that you have a sense of self, instead of just acting
like you have a sense of self. So do pets have a sense of self? Do
insects? Trees? How can you tell?
I've always felt that if you build something that acts like it has a
sense of self, then you are morally obligated to *treat* it like it has a
sense of self. Hmm, but then there's the question of, is it possible
to build something that has a sense of self but has been engineered to
act like it doesn't? Jeez, what an ethical nightmare.
- Joe
--
Joseph S. Barrera III
Software Architect, Broadbase (=> Kana) Software, Inc.
1.650.219.4557 (cell) / 1.650.588.4801 (home)
joe@barrera.org / joebar@broadbase.com / napier@waste.org
www.barrera.org / www.broadbase.com / www.waste.org