- Mar 2024
- pluralistic.net
"The Curse of Recursion: Training on Generated Data Makes Models Forget," a recent paper, goes beyond the ick factor of AI that is fed on botshit and delves into the mathematical consequences of AI coprophagia: https://arxiv.org/abs/2305.17493 Co-author Ross Anderson summarizes the finding neatly: "using model-generated content in training causes irreversible defects": https://www.lightbluetouchpaper.org/2023/06/06/will-gpt-models-choke-on-their-own-exhaust/ Which is all to say: even if you accept the mystical proposition that more training data "solves" the AI problems that constitute total unsuitability for high-value applications that justify the trillions in valuation analysts are touting, that training data is going to be ever more elusive.