Utilitarianism

Edited by steven0461, TerminalAwareness, Yoav Ravid, Vladimir_Nesov, et al. last updated 18th Mar 2021

Utilitarianism is a moral philosophy holding that what matters is the sum of everyone's welfare, or the "greatest good for the greatest number".

Not to be confused with maximization of utility, or expected utility. An expected utility maximizer sums probability-weighted utility over possible worlds; a utilitarian doesn't just sum over possible worlds, but also sums welfare over people.
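One way to make the distinction precise (the notation here is illustrative, not from the article): write $p(w)$ for the probability of possible world $w$ and $u_i(w)$ for person $i$'s welfare in it.

$$\mathbb{E}[U] = \sum_{w} p(w)\, u(w) \qquad \text{(expected utility: a sum over worlds)}$$

$$U_{\text{total}}(w) = \sum_{i} u_i(w) \qquad \text{(total utilitarian value of a world: a sum over people)}$$

A total utilitarian who also maximizes expected utility therefore maximizes $\sum_{w} p(w) \sum_{i} u_i(w)$, summing over both worlds and people.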

Utilitarianism comes in different variants. For example, unlike standard total utilitarianism, which sums welfare across everyone, average utilitarianism values the average welfare among a group's members. Negative utilitarianism seeks only to minimize suffering, and is often discussed for its extreme implications, such as the common objection that it would endorse painlessly ending all sentient life. The sketch below compares the three rules on a toy example.
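As a minimal sketch of how these variants can disagree (the function names and welfare numbers here are invented for illustration, not from the article):

```python
def total_utility(welfares):
    """Standard total utilitarianism: sum of everyone's welfare."""
    return sum(welfares)

def average_utility(welfares):
    """Average utilitarianism: mean welfare among the group's members."""
    return sum(welfares) / len(welfares)

def negative_utility(welfares):
    """Negative utilitarianism: only suffering (negative welfare) counts."""
    return sum(w for w in welfares if w < 0)

# Two hypothetical populations with made-up welfare levels.
small_happy = [9, 9]             # two people, both very well off
large_mixed = [5, 5, 5, 5, -1]   # more people, lower average, some suffering

for rule in (total_utility, average_utility, negative_utility):
    print(rule.__name__, rule(small_happy), rule(large_mixed))
```

Under these made-up numbers, total utilitarianism prefers the larger group (19 vs. 18), while average and negative utilitarianism both prefer the smaller, happier one; the three rules can rank the very same outcomes differently.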

Related Pages: Negative Utilitarianism, Consequentialism, Ethics & Morality, Fun Theory, Complexity of Value

Posts tagged Utilitarianism
29 · Comparing Utilities · abramdemski · 5y · 15 comments
11 · AXRP Episode 3 - Negotiable Reinforcement Learning with Andrew Critch · DanielFilan · 5y · 0 comments