148 Matching Annotations
  1. May 2019
  2. Apr 2019
  3. Mar 2019
    1. A Sensitivity Analysis of (and Practitioners’ Guide to) Convolutional Neural Networks for Sentence Classification

  4. arxiv.org
    1. To the best of our knowledge, there has not been any other work exploring the use of attention-based architectures for NMT


  5. Feb 2019
    1. The road to a champion question-answering system: QANet, using CNNs for QA tasks


      QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension

    1. To overcome this issue, we explore data generation using templates and terminologies and data augmentation approaches. Namely, we report our experiments using paraphrasing and word representations learned on a large EHR corpus with Fasttext and ELMo, to learn a NLU model without any available dataset. We evaluate on a NLU task of natural language queries in EHRs divided in slot-filling and intent classification sub-tasks. On the slot-filling task, we obtain a F-score of 0.76 with the ELMo representation; and on the classification task, a mean F-score of 0.71. Our results show that this method could be used to develop a baseline system



  6. www.iro.umontreal.ca
    1. bi-directional Jordan-type network that takes into account both past and future dependencies among slots works best

      A bi-directional Jordan-type network works best for slot filling

    2. Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding

    1. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

    1. 10 Exciting Ideas of 2018 in NLP

      • 1 Unsupervised machine translation
      • 2 Pretrained language models
      • 3 Common-sense reasoning datasets
      • 4 Meta-learning
      • 5 Robust unsupervised methods
      • 6 Truly understanding the representations in representation learning; pretrained language models really do exhibit capabilities similar to those of image models
      • 7 Successful applications of multi-task learning
      • 8 Combining semi-supervised learning with transfer learning
      • 9 QA and reasoning datasets over large bodies of text
      • 10 Inductive bias
    2. Phrase-Based & Neural Unsupervised Machine Translation (EMNLP 2018)
  7. Jan 2019
  8. Dec 2018
    1. A semantic treebank is a collection of natural language sentences annotated with a meaning representation. These resources use a formal representation of each sentence's semantic structure.