8 Matching Annotations
  1. May 2020
    1. Teacher forcing is a training technique that is applicable to RNNs that have connections from their output to their hidden states at the next time step. (Left) At train time, we feed the correct output y(t) drawn from the train set as input to h(t+1). (Right) When the model is deployed, the true output is generally not known. In this case, we approximate the correct output y(t) with the model's output o(t), and feed the output back into the model.

      Teacher forcing is a strategy to parallelize training: when the only recurrence goes from the output back into the next hidden state, each time step can be trained independently given the ground-truth y(t).
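
      A minimal NumPy sketch of the idea (all names and sizes here are illustrative, not from the book): with no hidden-to-hidden connection, each training step only needs x(t) and the ground-truth y(t-1), so the steps decouple; at test time the model's own o(t-1) is fed back instead.

        import numpy as np

        rng = np.random.default_rng(0)
        n_in, n_out, n_hidden = 3, 2, 8
        U = rng.normal(scale=0.1, size=(n_hidden, n_in + n_out))  # [x(t); feedback] -> hidden
        V = rng.normal(scale=0.1, size=(n_out, n_hidden))         # hidden -> output
        b, c = np.zeros(n_hidden), np.zeros(n_out)

        def unroll(x_seq, y_seq=None):
            """Teacher forcing if y_seq is given: feed the true y(t-1) into step t.
            Otherwise (deployment) feed the model's own previous output o(t-1)."""
            prev = np.zeros(n_out)
            outputs = []
            for t, x_t in enumerate(x_seq):
                feed = y_seq[t - 1] if (y_seq is not None and t > 0) else prev
                h = np.tanh(U @ np.concatenate([x_t, feed]) + b)  # no h(t-1) term here
                prev = V @ h + c
                outputs.append(prev)
            return np.stack(outputs)

        x = rng.normal(size=(6, n_in))
        y = rng.normal(size=(6, n_out))
        train_out = unroll(x, y)   # teacher forcing: conditioned on the true y(t-1)
        test_out = unroll(x)       # free-running: conditioned on its own o(t-1)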

    2. the network typically learns to use h(t) as a kind of lossy summary of the task-relevant aspects of the past sequence of inputs up to t

      The hidden state h(t) is a lossy, high-level summary of whatever happened up to time step t.
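
      A compact way to write this is the book's general recurrence, h(t) = f(h(t-1), x(t); θ): a fixed-size h(t) has to summarize an arbitrarily long prefix x(1), ..., x(t), which is why the summary is necessarily lossy.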

    3. Parameter sharing makes it possible to extend and apply the model to examples of different forms (different lengths, here) and generalize across them. If we had separate parameters for each value of the time index, we could not generalize to sequence lengths not seen during training, nor share statistical strength across different sequence lengths and across different positions in time. Such sharing is particularly important when a specific piece of information can occur at multiple positions within the sequence.

      RNNs use the same parameters at every time step. This lets the inferred "meaning" generalize even when it appears at a different position in the sequence.
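
      A tiny NumPy sketch (the names U, W, b and the tanh cell are just illustrative): the same parameters are applied at every step, so one model handles sequences of any length and a given pattern is treated the same wherever it occurs.

        import numpy as np

        rng = np.random.default_rng(0)
        n_in, n_hidden = 3, 5
        U = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input  -> hidden
        W = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden -> hidden
        b = np.zeros(n_hidden)

        def run_rnn(x_seq):
            """Reuse the same (U, W, b) at every time step; h stays fixed-size
            no matter how long the input sequence is."""
            h = np.zeros(n_hidden)
            for x_t in x_seq:
                h = np.tanh(U @ x_t + W @ h + b)
            return h

        h_short = run_rnn(rng.normal(size=(4, n_in)))    # length-4 sequence
        h_long = run_rnn(rng.normal(size=(200, n_in)))   # length-200 sequence
        print(h_short.shape, h_long.shape)               # both (5,)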

  2. Jan 2019
    1. MAD-GAN: Multivariate Anomaly Detection for Time Series Data with Generative Adversarial Networks

      This paper is quite impressive: it uses a GAN for anomaly detection on multivariate time-series data. The most striking part is that both G and D are built purely from LSTM-RNNs! It deserves my attention not only for that, but also because this model could inspire my own thinking about "non-template gravitational-wave detection"!

  3. Dec 2018
    1. CFUN: Combining Faster R-CNN and U-net Network for Efficient Whole Heart Segmentation

      The figures look really nice~~~

    2. Using Convolutional Neural Networks to Classify Audio Signal in Noisy Sound Scenes

      First locate the signal, then filter it out; this is very similar to LIGO's approach to finding event waveforms~ Also another example of combining RNNs with CNNs~

    3. Seeing in the dark with recurrent convolutional neural networks

      At a glance, some of the results are quite close to my own paper, and this paper has a great deal worth borrowing for my work! It also shows, once again, that recurrence (recurrent memory-like units) has an essential role to play in pattern recognition!

  4. Jul 2015