Re: New Scientist Trial by laptop


From: Eugene Leitl (eugene.leitl@lrz.uni-muenchen.de)
Date: Sat Jul 29 2000 - 14:14:31 PDT


Gregory Alan Bolcer writes:
> The point is you want to remove the human from the loop.
> Humans are fallible and biased. In Mexico, for immigration control

Humans also have common sense, which computers lack. This is why,
for instance, intensive care has proven remarkably hard to automate.
While humans routinely introduce errors (and kill patients), they can
also diagnose unprecedented conditions by reasoning, which computers
so far cannot do. (Debugging a nontrivial physical setup is not
theorem proving.)

While codifying laws, and reasoning applied to those laws, would be
neat, laws are not theorems. The real world is not a formal system,
and attempts to treat it as one must fail dramatically.

Moreover, an officer with a laptop is not Robocop. He still has to
classify the real-world occurrence, and as such retains freedom of
interpretation.

> and interstate commerce, they took the inspection agents and turned them into
> secondary inspectors. Every tourist or traveler simply goes up
> to a stoplight. You push the button, and if it turns green, you
> get to merrily go on your way; if it turns red, they go through
> all your stuff with a fine-toothed comb. The decision is taken
> completely out of the hands of anyone at the scene, and everyone
> has a fatalistic determinism about the whole scenario.
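
For illustration only, a minimal Python sketch of the random red/green
selector Greg describes; the 10% inspection rate and the name
press_button are assumptions, not details from his post:

    import random

    # Hypothetical sketch of the secondary-inspection stoplight: the
    # traveler presses a button and a random draw, not a person at the
    # scene, decides whether they are waved through or searched.
    SECONDARY_INSPECTION_RATE = 0.10  # assumed fraction sent to inspection

    def press_button(rate=SECONDARY_INSPECTION_RATE):
        """Return 'green' (go on your way) or 'red' (full inspection)."""
        return "red" if random.random() < rate else "green"

    if __name__ == "__main__":
        for traveler in range(5):
            print(f"Traveler {traveler}: light is {press_button()}")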


