Utilitarianism

Edited by steven0461, TerminalAwareness, Yoav Ravid, Vladimir_Nesov, et al.; last updated 18th Mar 2021

Utilitarianism is a moral philosophy holding that what matters is the sum of everyone's welfare: the "greatest good for the greatest number".

Not to be confused with maximization of utility or of expected utility. An expected utility maximizer sums over possible worlds, weighted by probability; a utilitarian also sums over people within each world.
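
To make the contrast concrete, here is a minimal sketch in symbols (the notation is ours, not standard on this page): write $p(s)$ for the probability of world $s$, $u(s)$ for an agent's utility in that world, and $w_i(s)$ for person $i$'s welfare in it.

$$V_{EU} = \sum_s p(s)\,u(s) \qquad \text{vs.} \qquad V_{total} = \sum_s p(s) \sum_i w_i(s)$$

An expected utility maximizer can have any $u$; a total utilitarian's $u(s)$ is specifically the inner sum $\sum_i w_i(s)$, which assumes that individual welfares are interpersonally comparable and can be added.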

Utilitarianism comes in different variants. Unlike standard total utilitarianism, which sums welfare across a population, average utilitarianism values the average welfare of the population's members. Negative utilitarianism seeks only to minimize suffering, and is often discussed for its extreme implications.
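
A toy example with made-up numbers shows how total and average utilitarianism come apart. Compare population $A$, with 10 people each at welfare 5, to population $B$, with 100 people each at welfare 1:

$$\text{Total: } 10 \times 5 = 50 < 100 \times 1 = 100 \qquad \text{Average: } 5 > 1$$

Total utilitarianism prefers $B$, while average utilitarianism prefers $A$. A negative utilitarian, counting only suffering, is indifferent between the two so long as no one's welfare is negative.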

Related Pages: Negative Utilitarianism, Consequentialism, Ethics & Morality, Fun Theory, Complexity of Value

Posts tagged Utilitarianism:
- Comparing Utilities (abramdemski, 5y; 29 karma, 15 comments)
- AXRP Episode 3 - Negotiable Reinforcement Learning with Andrew Critch (DanielFilan, 5y; 11 karma, 0 comments)