3 Matching Annotations
  1. Jan 2019
    1. abstract independently existing “object”

      Since forever, apparently, science has relied on something like Aristotle's "Unmoved Mover": not a god exactly, but some real or imagined unaffected observer whose presence serves as a fixed point from which to accumulate data. Why are we tempted to think this way? Aren't we all moving? What fixed point is there? I'm tempted to return to the analogy of floating baskets tied together: there is an illusion of being grounded, but we aren't really.

    1. We know by now that there is no Greenwich Mean Time in knowledge production in the posthuman era.

      To say it another way (although why do that, when she just knocked it out of the park?), borrowing an analogy from economics: knowledge production is a series of floating baskets, all fluctuating together. There is no firm base on which everything is built; the structure persists by virtue of the relations of its parts, one to another.

  2. Nov 2018
    1. Rethinking floating point for deep learning

      [Network compression and acceleration]

      Jeff Johnson of Facebook AI Research refines a novel floating-point representation (posit) to make it better suited to neural-network training and inference, and runs comparative experiments on FPGAs. Compared with the IEEE-754 floating-point standard, the modified number system enables low-bit training and efficient inference, so models can be deployed on embedded and other resource-constrained devices without a separate post-training quantization and compression step. Unlike the usual pruning and quantization approaches to model compression, this work attacks the compression-and-acceleration problem at the more fundamental, lower level of the number representation itself. It is a novel angle with good results and worth studying in depth. Beyond the paper, the author also provides a code implementation and a blog post to aid understanding.
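
      To make the representation concrete, here is a minimal sketch of decoding a standard n-bit posit into a float, assuming Gustafson's original (n, es) layout with sign, regime, exponent, and fraction fields. The paper refines posit further for deep learning, so this shows the baseline encoding the annotation refers to rather than the paper's exact variant; the function name `decode_posit` and the (n=8, es=1) defaults are illustrative choices, not taken from the paper or its code.

      ```python
      # Illustrative decoder for a standard n-bit posit (sign, regime,
      # exponent, fraction); defaults (n=8, es=1) are assumptions.
      def decode_posit(bits: int, n: int = 8, es: int = 1) -> float:
          mask = (1 << n) - 1
          bits &= mask
          if bits == 0:
              return 0.0
          if bits == 1 << (n - 1):
              return float("nan")              # NaR ("not a real")
          negative = bool(bits >> (n - 1))
          if negative:
              bits = -bits & mask              # negatives are two's complement
          # Regime: run of identical bits after the sign bit; its length
          # sets the coarse scale useed**k, where useed = 2**(2**es).
          first = (bits >> (n - 2)) & 1
          run, i = 0, n - 2
          while i >= 0 and (bits >> i) & 1 == first:
              run, i = run + 1, i - 1
          k = run - 1 if first else -run
          i -= 1                               # skip the regime terminator
          # Exponent: up to `es` bits; truncated low bits count as zero.
          exp = 0
          for _ in range(es):
              exp <<= 1
              if i >= 0:
                  exp |= (bits >> i) & 1
                  i -= 1
          # Fraction: whatever bits remain, with an implicit leading 1.
          frac_bits = max(i + 1, 0)
          frac = bits & ((1 << frac_bits) - 1)
          mantissa = 1.0 + frac / (1 << frac_bits)
          value = mantissa * 2.0 ** (k * (1 << es) + exp)
          return -value if negative else value

      # Tapered precision: values near 1.0 keep the most fraction bits,
      # extremes keep almost none.
      print(decode_posit(0x40))                # 1.0
      print(decode_posit(0x48))                # 1.5
      print(decode_posit(0x01))                # 2**-12, minpos for (8,1)
      print(decode_posit(0x7F))                # 4096.0, maxpos for (8,1)
      ```

      The regime's run-length encoding is what produces this tapered precision, and it is one reason such a format can plausibly cover training's dynamic range at 8 bits without the separate quantization pass the annotation mentions.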