2 Matching Annotations
  1. Nov 2022
    1. Contents
       1 Overview
       2 Reasons for failure
         2.1 Overconfidence and complacency
           2.1.1 Natural tendency
           2.1.2 The illusion of control
           2.1.3 Anchoring
           2.1.4 Competitor neglect
           2.1.5 Organisational pressure
           2.1.6 Machiavelli factor
         2.2 Dogma, ritual and specialisation
           2.2.1 Frames become blinders
           2.2.2 Processes become routines
           2.2.3 Resources become millstones
           2.2.4 Relationships become shackles
           2.2.5 Values become dogmas
       3 The paradox of information systems
         3.1 The irrationality of rationality
         3.2 How computers can be destructive
         3.3 Recommendations for practice
       4 Case studies
         4.1 Fresh & Easy
         4.2 Firestone Tire and Rubber Company
         4.3 Laura Ashley
         4.4 Xerox
       5 See also
       6 References

      Wiki table of contents of the Icarus paradox

    2. The paradox of information systems. Drummond suggests in her 2008 paper that computer-based information systems can undermine or even destroy the organisation they were meant to support, and that it is precisely what makes them useful that makes them destructive – a phenomenon encapsulated by the Icarus paradox.[9] For example, a defence communication system is designed to improve efficiency by eliminating the need for meetings between military commanders, who can now simply use the system to brief one another or report to a higher authority. However, the new system becomes destructive precisely because the commanders no longer need to meet face-to-face, which weakens mutual trust and so undermines the organisation.[10] Ultimately, computer-based systems are reliable and efficient only up to a point. For more complex tasks, organisations are advised to focus on developing their workforce instead. One reason for the paradox is that rationality assumes more is better, whereas intensification may be counter-productive.[11]
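
      The claim that intensification may be counter-productive can be made concrete with a toy inverted-U model. The sketch below is entirely hypothetical – the log-shaped gains, the quadratic trust-erosion term, and every constant are illustrative assumptions, not anything from Drummond's paper: coordination gains from the system grow with diminishing returns while the erosion of face-to-face trust compounds with reliance, so net benefit peaks at moderate use and turns negative under intensification.

      ```python
      # Toy model (entirely hypothetical numbers) of "more is better" failing
      # past a threshold: each unit of system use adds efficiency with
      # diminishing returns, but also erodes the face-to-face trust the
      # organisation quietly depends on.
      import numpy as np

      u = np.linspace(0, 10, 11)          # intensity of system use
      efficiency = 10 * np.log1p(u)       # diminishing coordination gains
      trust_cost = 0.4 * u**2             # trust erosion compounds with reliance
      net = efficiency - trust_cost

      for use, value in zip(u, net):
          print(f"use={use:4.1f}  net benefit={value:6.2f}")

      # The peak is interior (around u = 3), not at maximum use.
      print("optimum at use =", u[int(np.argmax(net))])
      ```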

      From the Wikipedia page on the Icarus paradox. An example of architectural design/technical debt carrying an "interest rate" that eventually collapsed the organization. How can one "pay down the principal" and not just the "compound interest"? What does that look like for this scenario? More investment in workforce retraining? (A toy amortization sketch follows.)
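
      As a back-of-the-envelope illustration of the loan metaphor (all figures hypothetical), the sketch below treats the flawed architecture as a principal that charges "interest" as recurring friction each release. Paying only the interest keeps the debt in place forever; paying even slightly more than the interest retires it over time.

      ```python
      # Toy amortization model of technical debt (hypothetical figures).
      # Servicing only the "interest" (recurring workarounds) leaves the
      # "principal" (the flawed architecture) untouched; paying more than
      # the interest shrinks the principal and hence all future interest.

      def simulate(principal: float, rate: float, payment: float, periods: int) -> float:
          """Remaining principal after `periods`, with a fixed per-period
          payment applied first to interest, then to principal."""
          for _ in range(periods):
              interest = principal * rate
              principal += interest - payment  # unpaid interest compounds
              principal = max(principal, 0.0)
          return principal

      debt = 100.0   # arbitrary units of rework effort
      rate = 0.10    # 10% of the debt resurfaces as friction each release

      # Interest-only payments: the organisation never escapes the debt.
      print(simulate(debt, rate, payment=10.0, periods=20))  # stays at 100.0

      # Paying slightly more than the interest retires the principal.
      print(simulate(debt, rate, payment=15.0, periods=20))  # reaches 0.0
      ```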

      Humans are complex, adaptive systems. Machines have a long history of being complicated, efficient (but not robust) systems. Is there a way to bridge this gap? What does an antifragile system of machines look like? Supervised learning? How do we ensure we don't fall prey to the oracle problem?
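
      One way to make the oracle problem concrete (with made-up data; the threshold learner and the biased labeler below are illustrative inventions, not a standard API) is to train and evaluate a learner against a systematically biased oracle: the learner scores near-perfectly by the oracle's own lights while quietly diverging from ground truth, so the flaw is invisible from inside the system.

      ```python
      # Minimal sketch of the oracle problem in supervised learning
      # (hypothetical data): trained and evaluated against a biased oracle,
      # the model looks perfect while being systematically wrong.
      import numpy as np

      rng = np.random.default_rng(0)
      x = rng.uniform(-3, 3, 5000)

      y_true = (x > 0.0).astype(int)     # ground-truth rule
      y_oracle = (x > 1.0).astype(int)   # biased oracle: threshold shifted

      # Learner: pick the threshold that best matches the oracle's labels.
      candidates = np.linspace(-3, 3, 601)
      scores = [((x > t).astype(int) == y_oracle).mean() for t in candidates]
      t_hat = candidates[int(np.argmax(scores))]
      y_pred = (x > t_hat).astype(int)

      print(f"learned threshold: {t_hat:.2f}")                   # ~1.0, not 0.0
      print("accuracy vs oracle:", (y_pred == y_oracle).mean())  # ~1.00
      print("accuracy vs truth: ", (y_pred == y_true).mean())    # ~0.83
      ```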

      Baskerville, R.L.; Land, F. (2004). "Socially Self-destructing Systems". In The Social Study of Information and Communication Technology: Innovation, Actors, Contexts. Oxford: Oxford University Press. pp. 263–285.