AI ALIGNMENT FORUM

Hazard

Worried that I might already be a post-rationalist. I'm very interested in minimizing miscommunication, and helping people through the uncanny valley of rationality. Feel free to pm me about either of those things.

Comments
Toy model piece #1: Partial preferences revisited
Hazard · 6y · 10

For cycles, it looks like the projection to W̄ is akin to taking all the worlds that form a given cycle and compressing them into a single world.

In your example, it's true that wᵢ < wⱼ and wⱼ < wᵢ whenever i ≠ j. That's the condition for equivalence under the projection, so you have w₁ = w₂ = w₃. If you think of the ordering as a directed graph, you can collapse those worlds to a single point without messing up the ordering.
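The collapse described above can be sketched in code: treat strict preference as edges of a directed graph, take its transitive closure, and identify any two worlds that are each preferred to the other. This is a minimal illustration with hypothetical names (`reachable`, `collapse_cycles`), not an implementation from the post.

```python
def reachable(edges, n):
    """Transitive closure of the strict-preference relation '<'
    over worlds 0..n-1, given as a set of (i, j) pairs."""
    reach = set(edges)
    changed = True
    while changed:
        changed = False
        for u, v in list(reach):
            for w in range(n):
                if (v, w) in reach and (u, w) not in reach:
                    reach.add((u, w))
                    changed = True
    return reach

def collapse_cycles(edges, n):
    """Group worlds into equivalence classes: i ~ j iff i < j and j < i
    both hold (i.e., they sit on a common preference cycle)."""
    reach = reachable(edges, n)
    classes, assigned = [], {}
    for i in range(n):
        if i in assigned:
            continue
        cls = {i} | {j for j in range(n)
                     if (i, j) in reach and (j, i) in reach}
        for j in cls:
            assigned[j] = len(classes)
        classes.append(sorted(cls))
    return classes

# Three worlds in a cycle w1 < w2 < w3 < w1 collapse to a single point,
# while an acyclic chain stays as three separate worlds.
print(collapse_cycles([(0, 1), (1, 2), (2, 0)], 3))  # [[0, 1, 2]]
print(collapse_cycles([(0, 1), (1, 2)], 3))          # [[0], [1], [2]]
```

Collapsing each mutual-preference class to one node is exactly the condensation of a directed graph to its strongly connected components, which is acyclic by construction, so the induced ordering on the collapsed worlds stays consistent.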
