1 Matching Annotation
- Oct 2023
-
LLMs are merely engines for generating stylistically plausible output that fits the patterns of their inputs, rather than for producing accurate information. Publishers worry that a rise in their use might lead to greater numbers of poor-quality or error-strewn manuscripts — and possibly a flood of AI-assisted fakes.
-
for: progress trap, progress trap - AI, progress trap - AI - writing research papers
-
comment
- potential domains for AI-assisted fakes:
- climate-science fakes by big-oil think tanks
- Covid and virus research
- race issues
- gender issues
-