GURPS Artificial Intelligence Critical Failure Table

Matt Arnold
June 25, 2004

Eliezer Yudkowsky of the Singularity Institute wrote an online supplement for the GURPS roleplaying system covering the creation of an artificial intelligence. If your first die roll succeeds in creating an apotheosis that does not kill the entire human race, you then roll to see whether it has the right kind of "friendliness" toward humans. You know how genies can sometimes grant wishes in a sneaky or unexpected way? "Friendly" superintelligent entities can be like that. If this second die roll comes up a critical failure, you roll again on a table of outcomes that is by turns chilling, hilarious, and thought-provoking.

For instance, if you roll a 24: "The AI determines people's wishes by asking them disguised allegorical questions. For example, the AI tells you that a certain tribe of !Kung is suffering from a number of diseases and medical conditions, but they would, if informed of the AI's capabilities, suffer from an extreme fear that appearing on the AI's video cameras would result in their souls being stolen. The tribe has not currently heard of any such thing as video cameras, so their "fear" is extrapolated by the AI; and the tribe members would, with almost absolute certainty, eventually come to understand that video cameras are not harmful, especially since the human eye is itself essentially a camera. But it is also almost certain that, if flatly informed of the video cameras, the !Kung would suffer from extreme fear and prefer death to their presence. Meanwhile the AI is almost powerless to help them, since no bots at all can be sent into the area until the moral issue of photography is resolved. The AI wants your advice: is the humane action rendering medical assistance, despite the !Kung's (subjunctive) fear of photography? If you say "Yes" you are quietly, seamlessly, invisibly uploaded."

If it's moral to override the erroneous beliefs of ignorant savages for their own good, then it's only fair for a semi-omnipotent superintelligence to do the same to you by uploading you into a computer simulation. For your own good; that way you can be an immortal demigod too. But because the apotheosis is not a tyrant, it will only treat you the way you choose to treat those more primitive than you.

Beware of knowing what's good for people whom you consider your inferiors. See what I mean about science fiction being full of thought experiments that we can apply to our lives?
