1 Matching Annotation
  1. Last 7 days
    1. the LLM can effectively track functional emotional states of entities in its context window, including the Assistant, by attending to these representations across token positions, a capability of transformer architectures not shared by biological recurrent neural networks

      The Transformer's attention mechanism gives the LLM a capability the human brain lacks: by "attending back" to the cached emotion vectors at every past position, it can track emotional state across time. This is the fundamental architectural difference between Transformers and the brain's recurrent networks: Claude's way of tracking emotion is more like "flipping through a history log" than anything a human brain does.
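      The contrast in the note can be sketched in a few lines. This is an illustrative toy, not Claude's actual internals: a transformer-style query attends over cached vectors at every past position and can still recover an early one exactly, while an RNN-style update compresses the same history into a single state, diluting older signals with each step. The vectors and the leaky-update rule here are invented for illustration.

      ```python
      import numpy as np

      def attend(query, cache):
          """Softmax attention over all cached position vectors."""
          keys = np.stack(cache)                 # (T, d): one vector per past position
          scores = keys @ query                  # similarity to each past position
          weights = np.exp(scores - scores.max())
          weights /= weights.sum()
          return weights @ keys                  # weighted read over the whole history

      # Toy "emotion vectors" stored at successive token positions.
      cache = [np.array([1.0, 0.0]),   # position 0: joy-like
               np.array([0.0, 1.0]),   # position 1: fear-like
               np.array([0.0, 1.0])]   # position 2: fear-like

      # A query aligned with position 0 still reads it back almost exactly,
      # no matter how many later positions were appended.
      readout = attend(np.array([5.0, 0.0]), cache)

      # RNN-style contrast: the same history squeezed into one running state;
      # the position-0 signal shrinks with every later update.
      state = np.zeros(2)
      for v in cache:
          state = 0.5 * state + 0.5 * v   # simple leaky update
      ```

      After the loop, `readout` is still dominated by the position-0 vector, while `state` retains only a faded trace of it: the transformer-style read is random access into the past, the recurrent one is a lossy summary.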