Sequence to Sequence (seq2seq)

Papers

  1. Sequence to Sequence Learning with Neural Networks
  2. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
  3. Neural Machine Translation by Jointly Learning to Align and Translate

Theory

  1. seq2seq study notes
  2. An informal survey of four neural-network sequence decoding models
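
The linked notes all describe the same core recipe: an encoder reads the source sequence into a fixed-size context vector, and a decoder unrolls from that context to emit the target sequence one step at a time. Below is a minimal PyTorch sketch of that recipe; the framework choice, vocabulary size, and dimensions are illustrative assumptions, not taken from the posts above.

```python
# Minimal encoder-decoder sketch (teacher forcing). All sizes are toy
# assumptions chosen for illustration.
import torch
import torch.nn as nn

VOCAB, EMB, HID = 1000, 32, 64  # hypothetical vocab/embedding/hidden sizes

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)

    def forward(self, src):             # src: (batch, src_len) of token ids
        _, h = self.rnn(self.emb(src))  # h: (1, batch, HID), the context vector
        return h

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, tgt, h):          # teacher forcing: feed the gold prefix
        o, h = self.rnn(self.emb(tgt), h)
        return self.out(o), h           # logits: (batch, tgt_len, VOCAB)

enc, dec = Encoder(), Decoder()
src = torch.randint(0, VOCAB, (2, 7))  # a fake batch of 2 source sequences
tgt = torch.randint(0, VOCAB, (2, 5))
logits, _ = dec(tgt, enc(src))
print(logits.shape)                     # torch.Size([2, 5, 1000])
```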

Practical Applications

  1. Text summarization with TensorFlow: an attention-based sequence-to-sequence model
  2. How to Develop an Encoder-Decoder Model for Sequence-to-Sequence Prediction in Keras
  3. How to Define an Encoder-Decoder Sequence-to-Sequence Model for Neural Machine Translation in Keras
  4. The sequence_to_sequence project on Zhihu
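
The two Keras tutorials above build on the same trick: take the encoder LSTM's final hidden and cell states and use them as the decoder LSTM's initial state. A minimal sketch of that wiring, with hypothetical sequence lengths and layer sizes, might look like this:

```python
# Minimal Keras encoder-decoder wiring; all dimensions are hypothetical.
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

n_in, n_out, n_feat, n_units = 7, 5, 50, 128  # toy lengths and sizes

# Encoder: discard per-step outputs, keep only the final (h, c) states.
enc_inputs = Input(shape=(n_in, n_feat))
_, state_h, state_c = LSTM(n_units, return_state=True)(enc_inputs)

# Decoder: start from the encoder states, emit one distribution per step.
dec_inputs = Input(shape=(n_out, n_feat))
dec_seq = LSTM(n_units, return_sequences=True)(
    dec_inputs, initial_state=[state_h, state_c])
outputs = Dense(n_feat, activation="softmax")(dec_seq)

model = Model([enc_inputs, dec_inputs], outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.summary()
```

At inference time the tutorials go on to split this graph into separate encoder and decoder models, so that generation can proceed one token at a time.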

Attention Mechanisms

  1. A one-article overview of attention mechanisms in NLP
  2. Attention Models in natural language processing: what they are and why
  3. Attention Models in NLP
  4. Self-attention mechanisms in natural language processing
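
The posts above differ in framing, but the mechanism they describe reduces to one computation: score each source position against a query, softmax the scores into weights, and return the weighted average of the values. A bare-bones NumPy sketch of that computation (all shapes and data are toy assumptions):

```python
# Scaled dot-product attention for a single query; shapes are toy assumptions.
import numpy as np

def attention(q, K, V):
    scores = K @ q / np.sqrt(len(q))         # one score per source position
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ V, weights              # context = weighted average of values

rng = np.random.default_rng(0)
K = rng.normal(size=(6, 8))  # 6 source positions, dimension 8
V = rng.normal(size=(6, 8))
q = rng.normal(size=8)       # one decoder-side query
context, w = attention(q, K, V)
print(np.round(w, 2), context.shape)  # weights sum to 1; context: (8,)
```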

Attention Is All You Need

  1. A detailed, illustrated walkthrough of "Attention Is All You Need"
  2. A detailed explanation plus a PyTorch implementation
  3. Another detailed explanation
  4. A commentary on "Attention Is All You Need"
  5. The Transformer – Attention is all you need.
  6. A PyTorch implementation
  7. Chinese-language material plus a PyTorch implementation
  8. Google's official TensorFlow implementation
  9. Chinese-language material
  10. Layer-by-layer diagrams of the Transformer
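
All of these walkthroughs center on the paper's one formula, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The sketch below generalizes the single-query NumPy example above to the batched matrix form the Transformer uses, with the optional mask from the decoder's self-attention; tensor sizes are toy assumptions.

```python
# Scaled dot-product attention as defined in the paper; sizes are toy values.
import math
import torch

def scaled_dot_product_attention(Q, K, V, mask=None):
    d_k = Q.size(-1)
    scores = Q @ K.transpose(-2, -1) / math.sqrt(d_k)  # (batch, q_len, k_len)
    if mask is not None:                               # e.g. causal decoder mask
        scores = scores.masked_fill(mask == 0, float("-inf"))
    return torch.softmax(scores, dim=-1) @ V

Q = torch.randn(2, 5, 16)  # (batch, query positions, d_k)
K = torch.randn(2, 7, 16)  # (batch, key positions, d_k)
V = torch.randn(2, 7, 16)
print(scaled_dot_product_attention(Q, K, V).shape)  # torch.Size([2, 5, 16])
```

Multi-head attention in the paper then runs several copies of this in parallel on separately projected Q, K, and V, and concatenates the results.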

