4 Matching Annotations
- Feb 2025
www.reddit.com
Good discussion and outline of research on the natural method of language acquisition versus the grammar-translation method.
www.catholicculture.org
Dreamt of learning Latin? Here’s how you’ll finally do it by [[Thomas V. Mirus]]
A non-specialist's look at his own Latin language acquisition, with lots of resources around Hans Ørberg's Lingua Latina text.
- May 2024
media.dltj.org
Google Translate is generative AI
Google Translate as generative AI
- Feb 2023
arstechnica.com
An AI model that can learn and work with this kind of problem needs to handle order in a very flexible way. The old models—LSTMs and RNNs—had word order implicitly built into the models. Processing an input sequence of words meant feeding them into the model in order. A model knew what word went first because that’s the word it saw first. Transformers instead handled sequence order numerically, with every word assigned a number. This is called "positional encoding." So to the model, the sentence “I love AI; I wish AI loved me” looks something like (I 1) (love 2) (AI 3) (; 4) (I 5) (wish 6) (AI 7) (loved 8) (me 9).
Google’s “the transformer”
One breakthrough was positional encoding, rather than having to process the input in the order it was given; a second was computing over matrices rather than individual vectors. This research came out of Google's work on machine translation.
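To make the quoted idea concrete, here is a minimal sketch in Python of the sinusoidal positional encoding from the original transformer paper ("Attention Is All You Need"); the function name and the tiny 8-dimensional encoding are illustrative choices, not taken from the article.

```python
import math

def positional_encoding(seq_len: int, d_model: int) -> list[list[float]]:
    """Sinusoidal positional encoding: each position in the sequence
    gets a unique vector of sines and cosines at different frequencies,
    so the model can recover word order without reading tokens one at
    a time."""
    pe = [[0.0] * d_model for _ in range(seq_len)]
    for pos in range(seq_len):
        for i in range(0, d_model, 2):
            angle = pos / (10000 ** (i / d_model))
            pe[pos][i] = math.sin(angle)
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(angle)
    return pe

# The nine tokens of "I love AI; I wish AI loved me" each get a
# distinct position vector, so the two occurrences of "I" (and of
# "AI") are distinguishable by position alone.
tokens = ["I", "love", "AI", ";", "I", "wish", "AI", "loved", "me"]
for tok, vec in zip(tokens, positional_encoding(len(tokens), 8)):
    print(tok, [round(v, 3) for v in vec])
```

Because the position information is baked into the vectors themselves, the model can consume all nine tokens at once instead of sequentially, which is what frees transformers from the word-by-word processing of LSTMs and RNNs.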