Boo "Paperclip Maximizers" as a term

This is an analogy used in informal arguments about AI's potential for catastrophic risk. The value of the analogy, in my view, was that it captured the idea of "a random outcome that nobody asked for". Paperclips are what you'd call a niche interest, for humans nearly everywhere in the past or future. So an incredibly powerful computer that somehow maximized the number of paperclips on earth above everything else, against the wishes of its controllers, would produce a random outcome that nobody asked for, least of all the people who don't care one bit about paperclips.
