9 Matching Annotations
    1. Traditionally, a major challenge for building language models was figuring out the most useful way of representing different words—especially because the meanings of many words depend heavily on context. The next-word prediction approach allows researchers to sidestep this thorny theoretical puzzle by turning it into an empirical problem. It turns out that if we provide enough data and computing power, language models end up learning a lot about how human language works simply by figuring out how to best predict the next word. The downside is that we wind up with systems whose inner workings we don’t fully understand. Tim Lee was on staff at Ars from 2017 to 2021. He recently launched a new newsletter, Understanding AI. It explores how AI works and how it's changing our world. You can subscribe to his newsletter here. Sean Trott is an Assistant Professor at University of California, San Diego, where he conducts research on language understanding in humans and large language models. He writes about these topics, and others, in his newsletter The Counterfactual.

      Final annotation - One question came to mind while reading this article: will developers end up giving AI a new personality and voice to speak to members/users of ChatGPT? And what else do AI developers plan to program AI to do?

    2. The

      Big picture annotation - The main point of the article is to explain the development from the original ChatGPT editions to the most recent versions, describing how developers programmed a training system that taught the AI to become accustomed to answering questions and to detect which questions to answer. The article also discusses potential future versions of AI and how different they might be from our present versions.

    3. In “the customer asked the mechanic to fix his car,” does "his" refer to the customer or the mechanic? In “the professor urged the student to do her homework” does "her" refer to the professor or the student? In “fruit flies like a banana” is "flies" a verb (referring to fruit soaring across the sky) or a noun (referring to banana-loving insects)?

      Connection annotation - This section connects to the video about ChatGPT: the speaker briefly explained tokens, how they played a big part in development, and how tokens helped with word sequences and adapting to English.

    4. OpenAI’s first LLM, GPT-1, was released in 2018. It used 768-dimensional word vectors and had 12 layers for a total of 117 million parameters. A few months later, OpenAI released GPT-2. Its largest version had 1,600-dimensional word vectors, 48 layers, and a total of 1.5 billion parameters.

      Connection annotation - This section connects to the part of the video, toward the middle, where the speaker explained the differences between the versions of ChatGPT.
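      The scale jump between the two models quoted above can be made concrete with a quick calculation (a small sketch using only the figures stated in the article; the dictionary layout is just for illustration):

      ```python
      # Rough scale comparison between GPT-1 and GPT-2, using the figures
      # quoted in the article (vector dimensions, layers, total parameters).
      gpt1 = {"dims": 768, "layers": 12, "params": 117_000_000}
      gpt2 = {"dims": 1600, "layers": 48, "params": 1_500_000_000}

      param_ratio = gpt2["params"] / gpt1["params"]  # ~12.8x more parameters
      dim_ratio = gpt2["dims"] / gpt1["dims"]        # ~2.1x wider word vectors
      layer_ratio = gpt2["layers"] / gpt1["layers"]  # 4x deeper

      print(f"GPT-2 has about {param_ratio:.1f}x the parameters of GPT-1")
      ```

      So in only a few months, the parameter count grew by more than an order of magnitude, which is the growth pattern the video also highlights across later versions.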

    5. For example, the word "bank" can refer to a financial institution or to the land next to a river. Or consider the following sentences: John picks up a magazine. Susan works for a magazine. The meanings of magazine in these sentences are related but subtly different. John picks up a physical magazine, while Susan works for an organization that publishes physical magazines.

      Restatement annotation - What this section is saying is that many English words have two or more meanings, and the "tokens" used to program the AI helped with this problem.
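      A toy sketch can show why this is hard (a made-up vocabulary, not OpenAI's actual tokenizer): the word "bank" maps to the same token id in both senses, so only the surrounding context can tell the meanings apart.

      ```python
      # Toy tokenizer sketch: an invented vocabulary for illustration only.
      vocab = {"the": 0, "bank": 1, "of": 2, "river": 3, "robbed": 4, "was": 5}

      def tokenize(sentence):
          """Map each lowercase word to its toy token id."""
          return [vocab[w] for w in sentence.lower().split()]

      a = tokenize("the bank was robbed")    # financial institution
      b = tokenize("the bank of the river")  # land next to a river

      # "bank" gets id 1 in both sentences; the model's later layers must
      # use the neighboring words to disambiguate the two meanings.
      print(a, b)
      ```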

    6. One reason is scale. It’s hard to overstate the sheer number of examples that a model like GPT-3 sees. GPT-3 was trained on a corpus of approximately 500 billion words. For comparison, a typical human child encounters roughly 100 million words by age 10.

      Restatement annotation - This section states that GPT-3's training exposed the AI to vastly more words (roughly 500 billion) than a typical 10-year-old child encounters (about 100 million).
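      Using only the two figures quoted in the article, the gap works out as follows (a one-line sketch, not a precise claim about either corpus):

      ```python
      # Scale of GPT-3's training data vs. a child's exposure, per the article.
      gpt3_training_words = 500_000_000_000  # ~500 billion words
      child_words_by_10 = 100_000_000        # ~100 million words by age 10

      ratio = gpt3_training_words // child_words_by_10
      print(f"GPT-3 saw roughly {ratio:,}x as many words as a 10-year-old")
      ```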

    7. Word

      Tracking annotation - For the first tracking annotation, I expect to learn how AI was developed to understand language, because English has a lot of words whose meaning depends on context.

    8. AI

      Video annotation (summary) - ChatGPT was the fastest-growing platform, outpacing TikTok and other social media platforms. ChatGPT was programmed with something called tokens; according to the video, tokens are numerical representations of words, and the model was programmed to analyze these words as sequences. ChatGPT was also programmed to spot illegal requests from members/users, deciding whether to answer valuable requests rather than harmful ones. A question that came to mind: if ChatGPT is this complex, does the world plan on inventing other AI like it for other purposes?