
Paper Notes: "Neural Machine Translation by Jointly Learning to Align and Translate"

Chinese title: 基于聯合學習對齊和翻譯的神經機器翻譯

Contents

Abstract

Background: Neural Machine Translation

Task Definition

Encoder-Decoder Architecture (Baseline)

Encoder (Baseline)

Decoder (Baseline)

Model Performance

Existing Problems

Learning to Align and Translate

RNNenc vs RNNsearch

The RNNsearch Encoder

The RNNsearch Decoder

The Attention Idea

The Attention Mechanism

Computation Steps of the RNNsearch Decoder

The RNNsearch Model

Example

Experimental Setup and Results

Experimental Setup

Evaluation Metric: BLEU

Model Performance

Analysis of Results

Future Work

  • Abstract

  1. The task definition of neural machine translation
  2. The shortcomings of the encoder-decoder model used by conventional neural machine translation
  3. This paper proposes a neural machine translation model that automatically searches the source sentence for the parts relevant to predicting a target word
  4. The performance of the proposed model
  • Background: Neural Machine Translation

Task Definition

The model takes as input a source sentence of 1-of-K coded word vectors:

$\mathbf{x} = (x_1, \dots, x_{T_x}),\ x_i \in \mathbb{R}^{K_x}$

and outputs a target sentence of 1-of-K coded word vectors:

$\mathbf{y} = (y_1, \dots, y_{T_y}),\ y_i \in \mathbb{R}^{K_y}$

Training objective: maximize the conditional probability of the target sentence given the source sentence,

$\arg\max_{\mathbf{y}} p(\mathbf{y} \mid \mathbf{x})$

Encoder-Decoder Architecture (Baseline)

Model name: RNNenc

Encoder (Baseline)

$\mathbf{x} = (x_1, \dots, x_{T_x})$: the input sentence sequence

$h_t = f(x_t, h_{t-1})$: the encoder hidden state at time $t$

$c = q(\{h_1, \dots, h_{T_x}\})$: the context vector generated from the sequence of hidden states

The encoder reads the input sentence sequence x and produces a context vector c.
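The encoder recurrence above can be sketched in a few lines. Here is a toy, pure-Python illustration (not the paper's actual gated unit) in which $f$ is a single tanh layer and $q$ simply takes the last hidden state as the context vector; all weights and dimensions are made-up values:

```python
import math

# Toy sketch of the baseline encoder: h_t = f(x_t, h_{t-1}) with f a single
# tanh layer, and c = q(h_1..h_T) chosen as "take the last hidden state".
# All weights and dimensions below are made-up illustration values.

def rnn_encode(xs, W_x, W_h):
    """Run a vanilla RNN over the input vectors; return every hidden state."""
    h = [0.0] * len(W_h)  # h_0 is the zero vector
    states = []
    for x in xs:          # one step per source word
        h = [math.tanh(sum(W_x[i][k] * x[k] for k in range(len(x))) +
                       sum(W_h[i][k] * h[k] for k in range(len(h))))
             for i in range(len(W_h))]
        states.append(h)
    return states

xs = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # toy 2-D word embeddings
W_x = [[0.5, -0.3], [0.1, 0.4]]
W_h = [[0.2, 0.0], [0.0, 0.2]]

states = rnn_encode(xs, W_x, W_h)
c = states[-1]   # context vector: the final hidden state
```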

Decoder (Baseline)

$\mathbf{y} = (y_1, \dots, y_{T_y})$: the generated sentence sequence

$s_t$: the decoder hidden state

The decoder is trained to predict the next word $y_t$ given the context vector $c$ and all previously predicted words $\{y_1, \dots, y_{t-1}\}$. In other words, the decoder defines a probability over the translation $\mathbf{y}$ by decomposing the joint probability into ordered conditional probabilities:

$p(\mathbf{y}) = \prod_{t=1}^{T} p(y_t \mid \{y_1, \dots, y_{t-1}\}, c)$

With an RNN, each conditional probability is modeled as:

$p(y_t \mid \{y_1, \dots, y_{t-1}\}, c) = g(y_{t-1}, s_t, c)$

where $g$ is a nonlinear, potentially multi-layered function that outputs the probability of $y_t$.
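The chain-rule factorization means the probability of a whole translation is simply the product of the per-step conditional probabilities (summed in log space in practice, to avoid underflow). A tiny illustration with made-up step probabilities:

```python
import math

# p(y) = prod_t p(y_t | y_<t, c): multiply the per-step conditionals.
# The three probabilities below are made-up values for illustration.
step_probs = [0.9, 0.7, 0.8]                    # p(y_t | y_<t, c) at each step
log_p_y = sum(math.log(p) for p in step_probs)  # sum of log-probabilities
p_y = math.exp(log_p_y)                         # equals 0.9 * 0.7 * 0.8
```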

Model Performance

In machine translation, this sequence-to-sequence approach achieves performance close to the state of the art on the English-to-French translation task, outperforming conventional phrase-based systems.

Existing Problems

  1. The model must memorize the semantic information of the entire sentence
  2. Sentences of any length are encoded into a fixed-dimensional vector, which limits the representation of long sentences during translation
  3. This differs from how humans translate: people do not attend to every word of the source sentence when generating the target translation
  • Learning to Align and Translate

A new neural machine translation model is proposed: RNNsearch

Encoder: a bidirectional RNN, whose hidden state encodes information about both the preceding and the following words

Decoder: an attention mechanism that computes a weighted sum over the input hidden states

RNNenc vs RNNsearch

RNNenc:

  1. Encodes the entire input sentence into a single fixed-length vector
  2. Uses a unidirectional RNN

RNNsearch:

  1. Encodes the input sentence into a variable-length sequence of vectors
  2. Adaptively selects a subset of these vectors while decoding the translation
  3. Uses a bidirectional RNN

The RNNsearch Encoder

Forward RNN:

Input: the source sentence in order, $(x_1, \dots, x_{T_x})$

Output: the forward hidden states $(\overrightarrow{h}_1, \dots, \overrightarrow{h}_{T_x})$

Backward RNN:

Input: the source sentence in reverse order, $(x_{T_x}, \dots, x_1)$

Output: the backward hidden states $(\overleftarrow{h}_1, \dots, \overleftarrow{h}_{T_x})$

Concatenation:

$h_j = \left[\overrightarrow{h}_j^{\top}; \overleftarrow{h}_j^{\top}\right]^{\top}$
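The forward/backward reading and per-position concatenation can be sketched as follows; the toy `rnn_step` here is a made-up stand-in for the gated unit the paper actually uses:

```python
import math

# Bidirectional encoder sketch: run a toy RNN forward and backward over the
# sentence, then concatenate the two states at each position j to form the
# annotation h_j. rnn_step is a made-up stand-in for the paper's gated unit.

def rnn_step(x, h):
    return [math.tanh(xi + 0.5 * hi) for xi, hi in zip(x, h)]

def run_rnn(xs):
    h, states = [0.0, 0.0], []
    for x in xs:
        h = rnn_step(x, h)
        states.append(h)
    return states

xs = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]        # toy 2-D word embeddings
fwd = run_rnn(xs)               # reads x_1 .. x_Tx
bwd = run_rnn(xs[::-1])[::-1]   # reads x_Tx .. x_1, re-aligned to positions
annotations = [f + b for f, b in zip(fwd, bwd)]  # h_j = [fwd_j ; bwd_j]
```

Each `annotations[j]` thus summarizes the whole sentence with an emphasis on the words around position j.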

The RNNsearch Decoder

The conditional probability of the target word $y_i$:

$p(y_i \mid y_1, \dots, y_{i-1}, \mathbf{x}) = g(y_{i-1}, s_i, c_i)$

where $s_i = f(s_{i-1}, y_{i-1}, c_i)$ is the decoder hidden state at time $i$.

Difference from the RNNenc model: here the probability is conditioned on a distinct context vector $c_i$ for each target word $y_i$, rather than on a single fixed vector $c$.

The Attention Idea

Idea: concentrate on the relevant part of the context when generating each word

The Attention Mechanism

The context vector is computed as a weighted sum of the annotations:

$c_i = \sum_{j=1}^{T_x} \alpha_{ij} h_j$

The weight (attention score) of each annotation $h_j$:

$\alpha_{ij} = \dfrac{\exp(e_{ij})}{\sum_{k=1}^{T_x} \exp(e_{ik})}$

Alignment model:

$e_{ij} = a(s_{i-1}, h_j)$, which scores how well the inputs around position $j$ and the output at position $i$ match.

Computation Steps of the RNNsearch Decoder

  1. Compute the attention scores (alignment model): $e_{ij} = a(s_{i-1}, h_j)$
  2. Compute the context vector weighted by the attention scores: $c_i = \sum_{j=1}^{T_x} \alpha_{ij} h_j$, with $\alpha_{ij} = \exp(e_{ij}) / \sum_{k=1}^{T_x} \exp(e_{ik})$
  3. Generate the new hidden state: $s_i = f(s_{i-1}, y_{i-1}, c_i)$
  4. Compute the new target-word distribution: $p(y_i \mid y_1, \dots, y_{i-1}, \mathbf{x}) = g(y_{i-1}, s_i, c_i)$
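The four steps above can be walked through on toy numbers. The alignment model below uses a simplified additive form, and $f$ and $g$ are toy stand-ins (the paper uses gated units and a maxout output layer), so the values are illustrative only:

```python
import math

def softmax(es):
    m = max(es)
    exps = [math.exp(e - m) for e in es]
    z = sum(exps)
    return [v / z for v in exps]

# Toy previous decoder state s_{i-1} and encoder annotations h_j.
s_prev = [0.1, -0.2]
annotations = [[0.3, 0.7], [0.9, 0.1], [-0.4, 0.5]]

# Step 1: attention scores e_ij = a(s_{i-1}, h_j) (simplified additive form).
e = [sum(math.tanh(0.5 * sp + 0.5 * hj) for sp, hj in zip(s_prev, h))
     for h in annotations]

# Step 2: normalize to alpha_ij, then take the weighted sum for context c_i.
alpha = softmax(e)
c = [sum(a * h[k] for a, h in zip(alpha, annotations)) for k in range(2)]

# Step 3: new hidden state s_i = f(s_{i-1}, y_{i-1}, c_i) (toy f, y omitted).
s = [math.tanh(sp + ck) for sp, ck in zip(s_prev, c)]

# Step 4: output distribution over a toy 3-word vocabulary via g (toy logits).
p_y = softmax([s[0], s[1], s[0] - s[1]])
```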

The RNNsearch Model

(figure omitted)

Example

(figure omitted)

  • Experimental Setup and Results

Experimental Setup

Models: RNNsearch and RNNenc

Task: translation from English (source language) to French (target language)

Dataset: the WMT'14 dataset

Comparison: each model is trained twice, once on sentences of length up to 30 and once on sentences of length up to 50

Evaluation Metric: BLEU

A text-evaluation algorithm that measures the correspondence between a machine translation and professional human translations.
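To make the metric concrete, here is a heavily simplified BLEU sketch (single reference, n-grams up to 2, no smoothing); real evaluations use the full BLEU-4, e.g. via the sacrebleu or nltk packages:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Counter of the n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu2(candidate, reference):
    """Simplified BLEU: clipped 1- and 2-gram precision plus brevity penalty."""
    precisions = []
    for n in (1, 2):
        cand, ref = ngrams(candidate, n), ngrams(reference, n)
        overlap = sum(min(c, ref[g]) for g, c in cand.items())  # clipped counts
        precisions.append(overlap / max(sum(cand.values()), 1))
    if min(precisions) == 0:
        return 0.0
    bp = min(1.0, math.exp(1 - len(reference) / len(candidate)))
    return bp * math.exp(sum(math.log(p) for p in precisions) / 2)

ref = "the cat is on the mat".split()
hyp = "the cat is on mat".split()
score = bleu2(hyp, ref)   # a value between 0 and 1; higher is better
```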

Model Performance

(Figure: BLEU scores on the test set as a function of sentence length.)

BLEU scores of the generated translations with respect to sentence length, computed over the whole test set, including sentences containing unknown words. The RNNsearch models perform markedly better on long sentences.

Analysis of Results

(Figure: alignment matrices found by RNNsearch.)

The x-axis shows the words of the source sentence and the y-axis the words of the target sentence; each cell shows the attention score $\alpha_{ij}$ between the $j$-th source word and the $i$-th target word, rendered in grayscale (black for values near 0, white for values near 1).

  • Future Work

  1. Different ways of computing the attention lead to different results
  2. A unidirectional LSTM combined with attention-score computation can achieve a comparable effect
  3. Other methods for computing the attention scores could be proposed