AI Alignment Posts

Alignment Newsletter #28

rohinmshah · Alignment Newsletter · 3d · 5 points · 8 min read · 0 comments

Standard ML Oracles vs Counterfactual ones

Stuart_Armstrong · 8d · 5 points · 5 min read · 0 comments

A Rationality Condition for CDT Is That It Equal EDT (Part 2)

abramdemski · 9d · 3 points · 7 min read · 0 comments

Alignment Newsletter #27

rohinmshah · Alignment Newsletter · 10d · 1 point · 9 min read · 0 comments

A Rationality Condition for CDT Is That It Equal EDT (Part 1)

abramdemski · 15d · 4 points · 8 min read · 3 comments

The Rocket Alignment Problem

Eliezer_Yudkowsky · 15d · 18 points · 15 min read · 0 comments

Alignment Newsletter #26

rohinmshah · Alignment Newsletter · 16d · 2 points · 7 min read · 0 comments

EDT solves 5 and 10 with conditional oracles

jessicata · 18d · 17 points · 12 min read · 5 comments

[Link] New DeepMind AI Safety Research Blog

Vika · 21d · 13 points · 1 min read · 0 comments

Asymptotic Decision Theory (Improved Writeup)

Diffractor · 21d · 6 points · 13 min read · 8 comments