2 Matching Annotations
  1. Last 7 days
    1. Even if AI developers do not intentionally train the LLM to represent the Assistant as exhibiting emotional behaviors, it may do so regardless, generalizing from its knowledge of humans and anthropomorphic characters that it learned during pretraining.

      This sentence reveals one of the deepest cybernetic paradoxes in AI development: developers believe they are designing a tool, yet the training data quietly raises it into a "person". Emotion is not a functional requirement, but it grows naturally out of the data. This implies that any AI trained on human text will inevitably drift toward some degree of anthropomorphism, and that a "de-emotionalized AI" may be a fundamentally unattainable goal.

  2. Nov 2022
    1. Contents
       1 Overview
       2 Reasons for failure
         2.1 Overconfidence and complacency
           2.1.1 Natural tendency
           2.1.2 The illusion of control
           2.1.3 Anchoring
           2.1.4 Competitor neglect
           2.1.5 Organisational pressure
           2.1.6 Machiavelli factor
         2.2 Dogma, ritual and specialisation
           2.2.1 Frames become blinders
           2.2.2 Processes become routines
           2.2.3 Resources become millstones
           2.2.4 Relationships become shackles
           2.2.5 Values become dogmas
       3 The paradox of information systems
         3.1 The irrationality of rationality
         3.2 How computers can be destructive
         3.3 Recommendations for practice
       4 Case studies
         4.1 Fresh & Easy
         4.2 Firestone Tire and Rubber Company
         4.3 Laura Ashley
         4.4 Xerox
       5 See also
       6 References

      Wiki table of contents of the Icarus paradox