Intelligence Explosion
• Applied to What is the nature of humans general intelligence and it's implications for AGI? by Will_Pearson (3d ago)
• Applied to Carl Shulman On Dwarkesh Podcast June 2023 by Moonicker (2mo ago)
• Applied to A thought experiment for comparing "biological" vs "digital" intelligence increase/explosion by Super AGI (2mo ago)
• Applied to AGI will be made of heterogeneous components, Transformer and Selective SSM blocks will be among them by Roman Leventov (3mo ago)
• Applied to LLMs May Find It Hard to FOOM by Roger Dearnaley (4mo ago)
• Applied to A Simple Theory Of Consciousness by SherlockHolmes (8mo ago)
• Applied to How Smart Are Humans? by Joar Skalse (9mo ago)
• Applied to Do not miss the cutoff for immortality! There is a probability that you will live forever as an immortal superintelligent being and you can increase your odds by convincing others to make achieving the technological singularity as quickly and safely as possible the collective goal/project of all of humanity, Similar to "Fable of the Dragon-Tyrant." by Oliver--Klozoff (9mo ago)
• Applied to Carl Shulman on The Lunar Society (7 hour, two-part podcast) by ESRogs (9mo ago)
• Applied to What is Intelligence? by IsaacRosedale (1y ago)
• Applied to A basic mathematical structure of intelligence by Golol (1y ago)
• Applied to A method for empirical back-testing of AI's ability to self-improve by Michael Tontchev (1y ago)
• Applied to Why I'm Sceptical of Foom by Cinera Verinia (1y ago)
• Applied to Power-Seeking AI and Existential Risk by Antonio Franca (1y ago)
• Applied to Towards a Formalisation of Returns on Cognitive Reinvestment (Part 1) by Cinera Verinia (2y ago)
• Applied to The Hard Intelligence Hypothesis and Its Bearing on Succession Induced Foom by Cinera Verinia (2y ago)
• Applied to Singularity FAQ by Multicore (2y ago)