AI ALIGNMENT FORUM

swordsintoploughshares

Comments

Choosing to Choose?
swordsintoploughshares · 7y · 70

Nice point. I think most of the time this doesn't apply, because agents don't try to maximize "utility" as some abstract quantity; rather, each possesses (transparently or opaquely) some concrete function U(⋅) whose value it tries to increase. Utility is a device, specific to each agent, that refers to states of the world; it isn't a kind of universal brownie point that everyone likes.

On the other hand, putting "utility" itself into your utility function is fertile ground for wireheading temptations: I define my utility to be ∞! There, I win.
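The distinction can be sketched in a few lines of Python (a hypothetical illustration: `U`, `choose`, and the paperclip action set are invented for the example, not anything from the post):

```python
def U(state):
    """A concrete utility function over world states (assumed example):
    this agent happens to care about paperclips."""
    return state.get("paperclips", 0)

def choose(actions, utility):
    """Pick the action whose resulting world state the agent values most."""
    return max(actions, key=lambda a: utility(a["outcome"]))

actions = [
    {"name": "build_factory", "outcome": {"paperclips": 10}},
    {"name": "do_nothing",    "outcome": {"paperclips": 0}},
]

print(choose(actions, U)["name"])  # -> build_factory

# A wireheaded agent instead scores "utility" by fiat, detached from outcomes:
wirehead = lambda outcome: float("inf")  # "I define my utility to be ∞!"
# Every action now ties at infinity, so choice no longer tracks the world.
```

The point is that `U` does real work only because it maps world states to values; an agent whose "utility" is the abstract quantity itself has nothing left for choice to discriminate on.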

5 · Logical Uncertainty and Functional Decision Theory · 7y · 0