4 Matching Annotations
  1. May 2023
    1. It turns out that backpropagation is a special case of a general technique in numerical analysis called automatic differentiation

      Automatic differentiation is a technique from numerical analysis. That is why real analysis is an important area of mathematics to study for anyone who wants to go into AI research.
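      The quote calls backprop a special case of automatic differentiation. A minimal sketch of the idea, using forward-mode autodiff with dual numbers (the `Dual` class and the function `f` are illustrative choices, not from the article):

      ```python
      # Minimal forward-mode automatic differentiation using dual numbers.
      # Each Dual carries a value and its derivative; the arithmetic rules
      # propagate exact derivatives alongside the ordinary computation.
      class Dual:
          def __init__(self, val, dot=0.0):
              self.val, self.dot = val, dot

          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.val + other.val, self.dot + other.dot)

          __radd__ = __add__

          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              # Product rule: (uv)' = u'v + uv'
              return Dual(self.val * other.val,
                          self.dot * other.val + self.val * other.dot)

          __rmul__ = __mul__

      def f(x):
          return 3 * x * x + 2 * x + 1  # f'(x) = 6x + 2

      x = Dual(2.0, 1.0)  # seed the derivative dx/dx = 1
      y = f(x)
      print(y.val, y.dot)  # f(2) = 17.0, f'(2) = 14.0
      ```

      Backprop is the reverse-mode variant of the same idea: instead of pushing derivatives forward with the computation, it pulls them backward from the loss.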

    2. In Hinton’s view, that’s what thought is: a dance of vectors.

      In Hinton's view, a thought is just a pattern of neural activity, and that pattern can be captured as a vector: each neuron's activity corresponds to a number, and each number is a coordinate of one big vector.
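      The note can be made concrete with a toy example: a handful of neuron activities collected into one vector. The neuron count and input values below are made up for illustration.

      ```python
      # A "thought" as a vector: each neuron's activity is one number,
      # and together the numbers are the coordinates of a single point
      # in a high-dimensional activity space.
      import math

      inputs = [0.5, -1.2, 3.0]            # signals arriving at three neurons
      activity = [1 / (1 + math.exp(-x))   # sigmoid activation per neuron
                  for x in inputs]
      print(activity)  # one point in 3-dimensional "thought space"
      ```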

    3. a deep neural net, meaning one with more than two or three layers.

      A deep neural net is one with more than two or three layers.

    4. When you boil it down, AI today is deep learning, and deep learning is backprop—which is amazing, considering that backprop is more than 30 years old. It’s worth understanding how that happened—how a technique could lie in wait for so long and then cause such an explosion—because once you understand the story of backprop, you’ll start to understand the current moment in AI, and in particular the fact that maybe we’re not actually at the beginning of a revolution. Maybe we’re at the end of one.

      Geoffrey Hinton's main contribution was the idea of backprop (developed with Rumelhart and Williams), which enabled multilayer neural nets to learn.
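      A minimal sketch of what backprop enables: a small multilayer net learning XOR, a function no single-layer net can represent. The layer sizes, learning rate, and epoch count below are arbitrary choices for illustration, not from the article.

      ```python
      import math, random

      random.seed(0)

      # XOR dataset: not linearly separable, so a net with no hidden
      # layer cannot learn it; one hidden layer trained by backprop can.
      data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

      def sigmoid(z):
          return 1 / (1 + math.exp(-z))

      H = 3  # hidden units (an arbitrary small choice)
      W = [[random.uniform(-1, 1) for _ in range(H)] for _ in range(2)]
      b = [random.uniform(-1, 1) for _ in range(H)]
      V = [random.uniform(-1, 1) for _ in range(H)]
      c = random.uniform(-1, 1)
      lr = 0.5

      def forward(x):
          h = [sigmoid(x[0] * W[0][j] + x[1] * W[1][j] + b[j]) for j in range(H)]
          y = sigmoid(sum(V[j] * h[j] for j in range(H)) + c)
          return h, y

      losses = []
      for epoch in range(10000):
          total = 0.0
          for x, t in data:
              h, y = forward(x)
              total += (y - t) ** 2
              # Backward pass: the chain rule applied layer by layer.
              dy = 2 * (y - t) * y * (1 - y)          # grad at output pre-activation
              for j in range(H):
                  dh = dy * V[j] * h[j] * (1 - h[j])  # error propagated to hidden j
                  V[j] -= lr * dy * h[j]
                  b[j] -= lr * dh
                  W[0][j] -= lr * dh * x[0]
                  W[1][j] -= lr * dh * x[1]
              c -= lr * dy
          losses.append(total)

      print(losses[0], losses[-1])  # loss should fall as the net learns
      ```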