- Apr 2024
- arxiv.org
We consistently find that, far from exhibiting "zero-shot" generalization, multimodal models require exponentially more data to achieve linear improvements in downstream "zero-shot" performance
An exponential increase in training data is needed for merely linear improvements in the "zero-shot" performance of multimodal models. This implies an imminent, more or less current, brick wall in improvement.
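The exponential-data-for-linear-gains relationship amounts to a log-linear scaling law: performance grows roughly with the logarithm of the data, so inverting it shows the data requirement growing exponentially with each fixed performance step. A minimal sketch, with hypothetical coefficients `a` and `b` (not taken from the paper):

```python
import math

def data_needed(target_perf, a=0.05, b=0.0):
    """Invert an assumed log-linear scaling law perf = a*log(N) + b
    to get the training-data count N required for a target performance."""
    return math.exp((target_perf - b) / a)

# Each fixed +0.1 step in performance multiplies the data requirement
# by a constant factor e^(0.1/a): linear gains, exponential cost.
for perf in (0.1, 0.2, 0.3):
    print(f"perf {perf:.1f} -> data ~ {data_needed(perf):.1e}")
```

With these illustrative numbers, every +0.1 of performance costs about e^2 (roughly 7.4x) more data, which is the shape of the wall the annotation describes.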