Cristian Trout

Philosophy graduate interested in metaphysics, meta-ethics, AI safety, and a whole bunch of other things. Meta-ethical and moral theories of choice: naturalist realism + virtue ethics.

Thinks longtermism rests on a false premise – that for every moral agent and every value-bearing location, the agent's spatio-temporal distance from that location does not factor into the value borne there (e.g. no matter how near or far a person is from you, that person should matter the same to you).

Thinks we should spend a lot more resources trying to delay HLMI – make AGI development uncool. Questions what we really need AGI for anyway. Accepts the epithet "luddite" so long as this is understood to describe someone who:

  • suspects that on net, technological progress yields diminishing marginal human flourishing
  • OR believes workers have a right to organize to defend their interests (you know – what the original Luddites were doing)
  • OR suspects that, with regard to AI, the Luddite fallacy may not be a fallacy: AI really could lead to widespread, permanent technological unemployment, and that might not be a good thing
  • OR, considering the common-sense observation that societies can only adapt so quickly, suspects that excessive rates of technological change can lead to social harms, independent of how the technology is used.
    • Assuming some baseline number of good and bad actors, we always need norms, regulations, etc. to ensure new tech is a net benefit for society. But excessively rapid change makes the optimal norms and regulations excessively fast-moving targets. On a deeper note, social bonds could start to fray under excessively rapid change: think of different generations or groups, having adopted a given new tech at different times or rates, becoming unable to connect with one another, their experiences varying too greatly. Think of teachers being unable to connect with, guide, or prepare their students effectively, given that their own experience is already outdated and will only become more so by the time the students are adults.

Subscribes to Crocker's Rules: unvarnished critical (but constructive) feedback is welcome.
