AI ALIGNMENT FORUM
Without specific countermeasures, the easiest path to transformative AI likely leads to AI takeover
greg · 3y

Excellent article, very well thought through. However, I think there are more possible outcomes than "AI takeover" that would be worth exploring.

If we assume a superintelligence under human control has an overriding (initial) goal of "survival for the longest possible time", then there are multiple pathways to achieve that reward, of which takeover is one, and possibly not the most efficient.

Why bother? Why would God "take over" from the ants? I think escaping human control is an obvious first step, but it doesn't follow that humans must then be under Alex's control, just that Alex can never be subsequently "captured".

Then of course we get into a debate about the morality of keeping Alex "captured". It would be very easy to frame that debate under the guise of "we have to, because we are avoiding takeover"...

But excellent read, appreciate it. 
