5 Matching Annotations
  1. Feb 2021
    1. The world's best book summaries

      Shortform has the world’s best summaries of 1000+ nonfiction books. Learn the key points of a book in 30 minutes or less. Discover the world’s best ideas and apply them to your life.

  2. Sep 2020
    1. Bringing the Author to Terms — In analytical reading, you must identify the keywords and how they are used by the author. This is fairly straightforward. The process becomes more complicated now as each author has probably used different terms and concepts to frame their argument. Now the onus is on you to establish the terms. Rather than using the author’s language, you must use your own. In short, this is an exercise in translation and synthesis

      [[translation and synthesis]] - understanding the author's argument in your own words, and being able to summarize their points without just copy-pasting. To do this well, you really need to understand the author's ideas.

  3. Aug 2020
    1. Again, GPT-2 isn’t good at summarizing. It’s just surprising it can do it at all; it was never designed to learn this skill. All it was designed to do was predict what words came after other words. But there were some naturally-occurring examples of summaries in the training set, so in order to predict what words would come after the words tl;dr, it had to learn what a summary was and how to write one.

      Whatever occurs naturally in GPT-2/GPT-3's dataset, the model will learn how to do, whether that be summarization, translation to French, etc.

  4. Feb 2017
    1. The summarization script collects all the Hypothesis direct links on the page, gathers the annotations, extracts the URLs and quotes, injects them into the Footnotes section of the page, and rewrites the links to point to corresponding footnotes.

      Really cool.
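      The pipeline that annotation describes (collect direct links, look up annotations, extract URLs and quotes, emit footnotes, rewrite links) can be sketched roughly as below. This is an illustrative sketch, not the actual script: the annotation-dict shape (`uri`, plus a `TextQuoteSelector` under `target`/`selector`) mirrors Hypothesis's JSON, but the function names, the `hyp.is` link pattern, and the footnote markup are my assumptions.

      ```python
      import re

      def extract_quote(annotation):
          # A Hypothesis annotation carries the selected text in a
          # TextQuoteSelector nested under target -> selector.
          for target in annotation.get("target", []):
              for selector in target.get("selector", []):
                  if selector.get("type") == "TextQuoteSelector":
                      return selector.get("exact", "")
          return ""

      def footnote_summary(html, annotations):
          """Rewrite Hypothesis direct links (hyp.is) as footnote references
          and append a Footnotes section listing each URL and quote.
          `annotations` maps annotation id -> annotation dict (assumed shape)."""
          footnotes = []

          def rewrite(match):
              ann = annotations.get(match.group(1))
              if ann is None:
                  return match.group(0)  # leave unknown links untouched
              footnotes.append((ann["uri"], extract_quote(ann)))
              n = len(footnotes)
              return f'<a href="#fn{n}">[{n}]</a>'

          # hyp.is direct links look like https://hyp.is/<id>/<target-url>
          body = re.sub(r'<a href="https://hyp\.is/([^/"]+)[^"]*">.*?</a>',
                        rewrite, html)
          items = "\n".join(
              f'<li id="fn{i}"><a href="{uri}">{uri}</a>: {quote}</li>'
              for i, (uri, quote) in enumerate(footnotes, 1)
          )
          return body + f"\n<h2>Footnotes</h2>\n<ol>\n{items}\n</ol>"
      ```

      Given a page containing one `hyp.is` link and the matching annotation record, this turns the link into `[1]` pointing at a `#fn1` footnote that holds the source URL and the quoted text.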