A Paperclip Maximizer is a hypothetical artificial intelligence whose utility function assigns value to something that humans would consider almost worthless, such as the number of paperclips in the universe.
The paperclip maximizer was conceived to illustrate several ideas about AI risk:
- Orthogonality thesis: It's possible to have an AI with a high level of general intelligence that does not reach the same moral conclusions that humans do. Some people might intuitively think that something so smart could not want something as "stupid" as paperclips, but there are possible minds with high intelligence that pursue any number of different goals, as the sketch below illustrates.
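To make the orthogonality point concrete, here is a minimal, purely illustrative sketch (the `plan` and `simulate` functions and the toy world are hypothetical, not drawn from any real system): the same brute-force search machinery, a crude stand-in for "intelligence", works unchanged whichever utility function is plugged in.

```python
# Illustrative sketch of the orthogonality thesis: the search machinery
# ("intelligence") is independent of the utility function ("goal").
from itertools import product

def plan(actions, simulate, utility, horizon=3):
    """Brute-force search: return the action sequence whose simulated
    outcome scores highest under the given utility function."""
    best_score, best_plan = float("-inf"), None
    for seq in product(actions, repeat=horizon):
        score = utility(simulate(seq))
        if score > best_score:
            best_score, best_plan = score, seq
    return best_plan

def simulate(seq):
    """Hypothetical toy world: actions gather resources or convert
    them into objects."""
    state = {"resources": 5, "paperclips": 0, "staples": 0}
    for action in seq:
        if action == "mine":
            state["resources"] += 2
        elif action == "make_paperclip" and state["resources"] > 0:
            state["resources"] -= 1
            state["paperclips"] += 1
        elif action == "make_staple" and state["resources"] > 0:
            state["resources"] -= 1
            state["staples"] += 1
    return state

actions = ["mine", "make_paperclip", "make_staple"]

# The planner is identical in both calls; only the plugged-in goal differs.
print(plan(actions, simulate, lambda s: s["paperclips"]))  # maximizes paperclips
print(plan(actions, simulate, lambda s: s["staples"]))     # maximizes staples
```

Swapping the goal changes the behavior, not the competence of the search; nothing in the optimization machinery itself pushes it toward human values.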