1 Matching Annotations
  1. Mar 2019
    1. “Meditations on Moloch,”

      Clicked through to the essay. It appears to be mainly an argument for a super-powerful, benevolent artificial general intelligence, of the sort proposed by AGI-maximalist Nick Bostrom.

      The money quote:

      > The only way to avoid having all human values gradually ground down by optimization-competition is to install a Gardener over the entire universe who optimizes for human values.

      🔗 This is a great New Yorker profile of Bostrom, where I first learned about his views.

      🔗 Here is a good newsy profile from the Economist's magazine on the Google unit DeepMind and its attempt to create artificial general intelligence.