Today is March 15th, 2024, and marks the beginning of the WorldsEnd movement.

A movement that acknowledges the end of the human race before 2050 due to unaligned superintelligence. As such, this movement is about maximising utility over the remaining years on Earth instead of focusing on aligning AI, given that alignment would require a miracle at this point. WorldsEnders borrow from the future. They do not have 401(k) accounts or meaningfully save for retirement. They try to avoid paying back debt (to institutions, not to people who might face hardship as a result). They care less about what people think of them, because after all, who would care what some walking-dead humans thought of them anyway? They live their lives fully in every moment and take no shit from literally anyone. They do these things to maximise their utility over the expected lifetime of humanity. They are short- and medium-termists rather than long-termists. They are not worried about mortgages or owning property in 25 years. They do not expect to be alive then.

A cancer patient with a 1% five-year survival rate might choose to skip a harsh treatment that would only increase their chances to 1.5%. Yet we are supposed to spend the only time we have left working on AI alignment even when we don't expect it to work? Let's stop deluding ourselves. Let's actually stop deluding ourselves. Let's accept that we are about to die and make the most of the time we have left.

We are at the WorldsEnd. Let's go out with a bang!


I'd rather die trying to help solve the alignment problem than accept your idea that the world is ending.

Early April Fools' joke. I don't seriously believe this.

I'd love to see some reasoning and value calculations, or sketches of what to do INSTEAD of the things you eschew (planning, saving, and working toward slight improvements in our chances).

Even if the likelihood is small, it seems like the maximum-value activities are those which prepare for and optimize a continued future. Who knows, maybe the horse will learn to sing!

See my other comment on how this is just a shitpost.

Also, humans don't base their decisions on raw expected-value calculations. Almost everyone would take $1 million over a 0.1% chance of $10 billion, even though the expected value of the latter is higher (cf. Pascal's mugging).
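The arithmetic behind this comment can be checked in a couple of lines (a sketch; the figures are the comment's own):

```python
# Compare a guaranteed payout against a small chance at a much larger one.
sure_thing = 1_000_000           # guaranteed $1 million
prize = 10_000_000_000           # $10 billion
p = 0.001                        # 0.1% chance

ev_gamble = p * prize            # expected value of the gamble: $10 million
print(ev_gamble > sure_thing)    # → True: the gamble's EV is 10x the sure thing
```

So a pure expected-value maximiser would take the gamble, which is exactly what almost nobody does.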

This is just a shitpost.

Yeah, don't do that.  

Just this once, I promise.

I would take this movement seriously and endorse it if there were a detailed plan for the future of the movement in case the human race is still around in 2051 and I'm homeless and buried in debt.

It was originally intended as an April Fools' joke, lol. This isn't a serious movement, but it does reflect a bit of my hopelessness about AI alignment working.