2 Matching Annotations
  1. Apr 2021
  2. Aug 2020
    1. It might be instructive to think about what it would take to create a program that wasn’t NLP-led but had a model of eighth-grade science sufficient to understand and answer questions about hundreds of different things like “growth is driven by cell division” and “What can magnets be used for?”. It would be a nightmare of many different (probably handcrafted) models. Speaking somewhat loosely, language allows intellectual capacities to be greatly compressed. From this point of view, it shouldn’t be surprising that some of the first signs of really broad capacity (common sense reasoning, wide-ranging problem solving, etc.) have been found in language-based programs: words and their relationships are just a vastly more efficient way of representing knowledge than the alternatives.

      DePonySum asks us to consider what you would need to program in order to answer a wide range of eighth-grade science questions (e.g., “What can magnets be used for?”). The answer: a whole slew of separately trained and optimized models, one per topic (see the toy sketch at the end of this note).

      Language, they say, is a way to compress intellectual capacities.

      It is then no surprise that common sense reasoning and wide-ranging problem solving first show up in language models. Words and their relationships are probably a far more efficient way of representing knowledge than the alternatives.
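
      To make the contrast concrete, here is a minimal, purely hypothetical Python sketch of the non-NLP route; every handler, name, and “fact” in it is invented for illustration and does not come from DePonySum’s post:

      ```python
      # One handcrafted model per topic: the approach the note calls a nightmare.
      # Every name and "fact" below is hypothetical, made up for this sketch.

      def answer_magnets(question: str) -> str:
          # Narrow, hand-coded knowledge about a single topic.
          return "Magnets attract iron and are used in compasses and motors."

      def answer_cell_division(question: str) -> str:
          return "Growth is driven by cell division: cells copy themselves and split."

      # The registry grows by one bespoke entry for every new kind of question.
      HANDLERS = {
          "magnet": answer_magnets,
          "cell division": answer_cell_division,
          # ... hundreds more handcrafted handlers, one per topic ...
      }

      def answer(question: str) -> str:
          q = question.lower()
          for keyword, handler in HANDLERS.items():
              if keyword in q:
                  return handler(question)
          raise KeyError("No handcrafted model covers this question.")

      print(answer("What can magnets be used for?"))
      ```

      A language model collapses the whole registry into a single interface whose coverage comes from learned word relationships rather than enumerated handlers: the knowledge is compressed into the statistics of language instead of being hand-built case by case.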