NiuTrans / Toy-MT-Introduction · Commits

Commit a744eab3 authored Apr 12, 2020 by xiaotong
minor updates of sections 2-3
parent d8921642
Showing 6 changed files with 826 additions and 73 deletions
Book/Chapter2/Figures/figure-probability-values-corresponding-to-different-derivations.tex  +1 −1
Book/Chapter2/chapter2.tex  +8 −8
Book/Chapter3/Figures/figure-processes-SMT.tex  +1 −1
Book/mt-book-xelatex.idx  +263 −27
Book/mt-book-xelatex.ptc  +547 −30
Book/mt-book-xelatex.tex  +6 −6
Book/Chapter2/Figures/figure-probability-values-corresponding-to-different-derivations.tex
View file @ a744eab3
...
...
@@ -66,7 +66,7 @@
\end{scope}
\draw [->,thick,ublue] ([xshift=-2em]sent.south) ..controls + (south:2em) and +(north:2em).. ([xshift=-8em,yshift=-2em]sent.south);
\draw [->,thick,ublue] ([xshift=-1em]sent.south) ..controls + (south:2em) and +(north:2em).. ([xshift=-2em,yshift=-3em]sent.south);
\draw [->,thick,ublue] ([xshift=-1em]sent.south) ..controls + (south:2em) and +(north:2em).. ([xshift=-2em,yshift=-2em]sent.south);
\draw [->,thick,ublue] ([xshift=0em]sent.south) ..controls + (south:2em) and +(north:2em).. ([xshift=6.5em,yshift=-2em]sent.south);
\draw [->,thick,ublue,dotted] ([xshift=1em]sent.south) ..controls + (south:1.5em) and +(north:2.5em).. ([xshift=12.5em,yshift=-2em]sent.south);
...
...
Book/Chapter2/chapter2.tex
View file @ a744eab3
...
...
@@ -227,9 +227,9 @@ F(X)=\int_{-\infty}^x f(x)dx
\parinterval 举个例子,小张从家到公司有三条路分别为$a$,$b$,$c$,选择每条路的概率分别为0.5,0.3,0.2。令:
\begin{itemize}
\item $S_a$:小张选择a路去上班
\item $S_b$:小张选择b路去上班
\item $S_c$:小张选择c路去上班
\item $S_a$:小张选择$a$路去上班
\item $S_b$:小张选择$b$路去上班
\item $S_c$:小张选择$c$路去上班
\item $S$:小张去上班
\end{itemize}
...
...
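A quick reading note on the hunk above: the three route events are mutually exclusive and cover all options, so the probability of $S$ follows directly from the stated values. The worked equation below is an illustration added here, not part of the commit:

% Illustration only (not part of the diff): the route events are mutually exclusive.
\begin{eqnarray}
\textrm{P}(S) & = & \textrm{P}(S_a)+\textrm{P}(S_b)+\textrm{P}(S_c) \nonumber \\
              & = & 0.5+0.3+0.2 \;=\; 1.0 \nonumber
\end{eqnarray}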
@@ -308,7 +308,7 @@ F(X)=\int_{-\infty}^x f(x)dx
\subsubsection{KL距离}\index{Chapter2.2.5.2}
\parinterval 如果同一个随机变量$X$上有两个独立的概率分布P$(x)$和Q$(x)$,那么可以使用KL距离("Kullback-Leibler"散度)来衡量这两个分布的不同,这种度量就是{\small\bfnew{相对熵}}(Relative Entropy)。其公式如下:
\parinterval 如果同一个随机变量$X$上有两个独立的概率分布P$(x)$和Q$(x)$,那么可以使用KL距离(``Kullback-Leibler''散度)来衡量这两个分布的不同,这种度量就是{\small\bfnew{相对熵}}(Relative Entropy)。其公式如下:
\begin{eqnarray}
\textrm{D}_{\textrm{KL}}(\textrm{P}\parallel\textrm{Q}) & = & \sum_{x \in \textrm{X}}[\textrm{P}(x)\log \frac{\textrm{P}(x)}{\textrm{Q}(x)}] \nonumber \\
& = & \sum_{x \in \textrm{X}}[\textrm{P}(x)(\log\textrm{P}(x)-\log\textrm{Q}(x))]
...
...
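The KL formula added in this hunk is easy to sanity-check numerically. The following minimal Python sketch is an illustration only (not part of the commit; the distributions `p` and `q` and the function name are made up) and evaluates both lines of the eqnarray, the direct form and the expanded log difference:

import math

def kl_divergence(p, q):
    # D_KL(P || Q) = sum_x P(x) * log(P(x) / Q(x)); only terms with P(x) > 0 contribute.
    return sum(px * math.log(px / q[x]) for x, px in p.items() if px > 0)

# Made-up distributions over the same support (illustration only).
p = {"a": 0.5, "b": 0.3, "c": 0.2}
q = {"a": 0.4, "b": 0.4, "c": 0.2}

direct = kl_divergence(p, q)
expanded = sum(px * (math.log(px) - math.log(q[x])) for x, px in p.items() if px > 0)
assert abs(direct - expanded) < 1e-12  # the two forms in the eqnarray agree
print(direct)

Swapping p and q generally gives a different value, since KL divergence is not symmetric.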
@@ -773,7 +773,7 @@ r^* = (r + 1)\frac{n_{r + 1}}{n_r}
\parinterval 基于这个公式,就可以估计所有0次$n$-gram的频次$n_0 r^*=(r+1)n_1=n_1$。要把这个重新估计的统计数转化为概率,需要进行归一化处理:对于每个统计数为$r$的事件,其概率为
\begin{eqnarray}
\textrm{P}_r=r^*/N
\textrm{P}_r=\frac{r^*}{N}
\end{eqnarray}
其中
...
...
@@ -784,7 +784,7 @@ N & = & \sum_{r=0}^{\infty}{r^{*}n_r} \nonumber \\
\label{eq:2.4-10}
\end{eqnarray}
也就是说,$N$仍然为这个整个样本分布最初的计数。这样样本中所有事件的概率之和为:
也就是说,$N$仍然为这个整个样本分布最初的计数。样本中所有事件的概率之和为:
\begin{eqnarray}
\textrm{P}(r>0) & = & \sum_{r>0}{\textrm{P}_r} \nonumber \\
...
...
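The two hunks above only touch notation ($r^*/N$ becomes $\frac{r^*}{N}$, and a stray ``这样'' is dropped), but the surrounding Good-Turing recipe can be mirrored in a few lines. The sketch below is an illustration under assumed inputs (the event counts and the function name are made up, not the book's):

from collections import Counter

def good_turing(counts):
    # r* = (r + 1) * n_{r+1} / n_r, then P_r = r* / N with N the original sample size.
    n = Counter(counts.values())               # n_r: number of events seen exactly r times
    N = sum(r * n_r for r, n_r in n.items())   # equals sum_r r* n_r, the original count
    # Note: the largest r gets r* = 0 here, the usual tail issue of plain Good-Turing.
    r_star = {r: (r + 1) * n.get(r + 1, 0) / n[r] for r in n}
    p_r = {r: r_star[r] / N for r in n}        # probability of each event seen r times
    p_unseen_total = n.get(1, 0) / N           # mass reserved for unseen events: n_0 * r*_0 = n_1
    return p_r, p_unseen_total

# Made-up event counts (illustration only).
counts = {"the": 3, "cat": 2, "sat": 2, "on": 1, "mat": 1, "dog": 1}
print(good_turing(counts))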
@@ -1152,7 +1152,7 @@ r_6: & & \textrm{VP} \to \textrm{VV}\ \textrm{NN} \nonumber
\end{figure}
%-------------------------------------------
\parinterval 图\ref{fig:2.5-9}展示了基于统计的句法分析的流程。首先,通过树库上的统计,获得各个规则的概率,这样就得到了一个上下文无关句法分析模型$\textrm{P}(\cdot)$。对于任意句法分析结果$d=r_1 \cdot r_2 \cdot ... \cdot r_n$,都能通过如下公式计算其概率值:
\parinterval 图\ref{fig:2.5-9}展示了基于统计的句法分析的流程。首先,通过树库上的统计,获得各个规则的概率,这样就得到了一个上下文无关句法分析模型$\textrm{P}(\cdot)$。对于任意句法分析结果$d=r_1 \circ r_2 \circ ... \circ r_n$,都能通过如下公式计算其概率值:
\begin{equation}
\textrm{P}(d)=\prod_{i=1}^{n}\textrm{P}(r_i)
...
...
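The change in this hunk is purely notational ($\cdot$ becomes $\circ$ in the derivation $d=r_1 \circ r_2 \circ ... \circ r_n$), but the accompanying formula $\textrm{P}(d)=\prod_{i=1}^{n}\textrm{P}(r_i)$ is simple enough to mirror in code. A minimal sketch, with made-up rule probabilities rather than values from the book:

from math import prod  # Python 3.8+

# Made-up probabilities for a few context-free rules (illustration only).
rule_prob = {
    "S -> NP VP":  0.8,
    "NP -> NN":    0.4,
    "VP -> VV NN": 0.3,
}

def derivation_prob(derivation):
    # P(d) = product of P(r_i) over the rules used in the derivation.
    return prod(rule_prob[r] for r in derivation)

d = ["S -> NP VP", "NP -> NN", "VP -> VV NN"]
print(derivation_prob(d))  # 0.8 * 0.4 * 0.3 = 0.096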
@@ -1182,7 +1182,7 @@ r_6: & & \textrm{VP} \to \textrm{VV}\ \textrm{NN} \nonumber
\begin{itemize}
\item 在建模方面,本章介绍的三个任务均采用的是基于人工先验知识进行模型设计的思路。也就是,问题所表达的现象被``一步一步''生成出来。这是一种典型的生成式建模思想,它把要解决的问题看作一些观测结果的隐含变量(比如,句子是观测结果,分词结果是隐含在背后的变量),之后通过对隐含变量生成观测结果的过程进行建模,以达到对问题进行数学描述的目的。这类模型一般需要依赖一些独立性假设,假设的合理性对最终的性能有较大影响。相对{\small\sffamily\bfseries{生成模型}}(Generative Model),另一类方法{\small\sffamily\bfseries{判别模型}}(Discriminative Model),它直接描述了从隐含变量生成观测结果的过程,这样对问题的建模更加直接,同时这类模型可以更加灵活的引入不同的特征。判别式模型在自然语言处理中也有广泛应用\cite{shannon1948mathematical}\cite{ng2002discriminative}。 在本书的第四章也会使用到判别式模型。
\item 从现在自然语言处理的前沿看,基于端到端学习的深度学习方法在很多任务中都取得了领先的性能。但是,本章并没有涉及深度学习及相关方法,这是由于笔者认为:{\color{red}对问题的建模是自然语言处理的基础,对问题的本质刻画并不会因为方法的改变而改变}。因此,本章的内容没有太多的陷入到更加复杂的模型和算法设计中,相反,我们希望关注对基本问题的理解和描述。不过,一些前沿方法仍可以作为参考,包括:基于条件随机场和双向长短时记忆模型的序列标注模型(\cite{lafferty2001conditional}\cite{huang2015bidirectional}\cite{ma2016end}、神经语言模型\cite{bengio2003neural}\cite{mikolov2010recurrent}、神经句法分析模型\cite{chen2014fast}\cite{zhu2015long}。
\item 从现在自然语言处理的前沿看,基于端到端学习的深度学习方法在很多任务中都取得了领先的性能。但是,本章并没有涉及深度学习及相关方法,这是由于笔者认为:{\color{red}对问题的建模是自然语言处理的基础,对问题的本质刻画并不会因为方法的改变而改变}。因此,本章的内容没有太多的陷入到更加复杂的模型和算法设计中,相反,我们希望关注对基本问题的理解和描述。不过,一些前沿方法仍可以作为参考,包括:基于条件随机场和双向长短时记忆模型的序列标注模型\cite{lafferty2001conditional}\cite{huang2015bidirectional}\cite{ma2016end}、神经语言模型\cite{bengio2003neural}\cite{mikolov2010recurrent}、神经句法分析模型\cite{chen2014fast}\cite{zhu2015long}。
\item 此外,本章并没有对模型的推断方法进行深入介绍。比如,对于一个句子如何有效的找到概率最大的分词结果?显然,简单枚举是不可行的。对于这类问题比较简单的解决方法是使用动态规划\cite{huang2008advanced}。如果使用动态规划的条件不满足,可以考虑使用更加复杂的搜索策略,并配合一定剪枝方法。实际上,无论是$n$-gram语言模型还是简单的上下文无关文法都有高效的推断方法。比如,$n$-gram语言模型可以被视为概率有限状态自动机,因此可以直接使用成熟的自动机工具。对于更复杂的句法分析问题,可以考虑使用移进-规约方法来解决推断问题\cite{aho1972theory}。
\end{itemize}
...
...
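The last bullet in the hunk above points out that enumerating every segmentation of a sentence is infeasible and that dynamic programming is the simple remedy. A minimal Viterbi-style sketch under a unigram word model follows; the dictionary, probabilities, and function name are made up for illustration and are not the book's implementation:

# Made-up unigram word probabilities (illustration only).
word_prob = {"确实": 0.1, "现在": 0.1, "数据": 0.1, "很": 0.05, "多": 0.05, "很多": 0.08}

def best_segmentation(sentence, max_len=4):
    # best[i] = (probability, segmentation) of the best split of sentence[:i],
    # extended one dictionary word at a time (dynamic programming over end positions).
    best = {0: (1.0, [])}
    for i in range(1, len(sentence) + 1):
        for j in range(max(0, i - max_len), i):
            word = sentence[j:i]
            if j in best and word in word_prob:
                p = best[j][0] * word_prob[word]
                if i not in best or p > best[i][0]:
                    best[i] = (p, best[j][1] + [word])
    return best.get(len(sentence))

print(best_segmentation("确实现在数据很多"))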
Book/Chapter3/Figures/figure-processes-SMT.tex
View file @ a744eab3
...
...
@@ -26,7 +26,7 @@
\draw [->,very thick,ublue] ([xshift=0.2em]corpus.east) -- ([xshift=3.2em]corpus.east) node [pos=0.5, above] {\color{red}{\scriptsize{模型学习}}};
{
\draw [->,very thick,ublue] ([xshift=0.4em]model.east) -- ([xshift=3.4em]model.east) node [inner sep=0pt,pos=0.5, above,yshift=0.3em] (decodingarrow) {\color{red}{\scriptsize{穷举\&计算}}};
\draw [->,very thick,ublue] ([xshift=0.4em]model.east) -- ([xshift=3.4em]model.east) node [inner sep=0pt,pos=0.5, above,yshift=0.3em] (decodingarrow) {\color{red}{\scriptsize{搜索\&计算}}};
{
\scriptsize
\node [anchor=north west,inner sep=2pt] (sentlabel) at ([xshift=5.5em,yshift=-0.9em]model.north east) {{\color{ublue} \sffamily\bfseries{机器翻译引擎}}};
...
...
Book/mt-book-xelatex.idx
View file @ a744eab3
\indexentry{Chapter2.1|hyperpage}{8}
\indexentry{Chapter2.2|hyperpage}{9}
\indexentry{Chapter2.2.1|hyperpage}{9}
\indexentry{Chapter2.2.2|hyperpage}{11}
\indexentry{Chapter2.2.3|hyperpage}{12}
\indexentry{Chapter2.2.4|hyperpage}{13}
\indexentry{Chapter2.2.5|hyperpage}{15}
\indexentry{Chapter2.2.5.1|hyperpage}{15}
\indexentry{Chapter2.2.5.2|hyperpage}{16}
\indexentry{Chapter2.2.5.3|hyperpage}{16}
\indexentry{Chapter2.3|hyperpage}{17}
\indexentry{Chapter2.3.1|hyperpage}{18}
\indexentry{Chapter2.3.2|hyperpage}{19}
\indexentry{Chapter2.3.2.1|hyperpage}{19}
\indexentry{Chapter2.3.2.2|hyperpage}{20}
\indexentry{Chapter2.3.2.3|hyperpage}{22}
\indexentry{Chapter2.4|hyperpage}{24}
\indexentry{Chapter2.4.1|hyperpage}{25}
\indexentry{Chapter2.4.2|hyperpage}{27}
\indexentry{Chapter2.4.2.1|hyperpage}{28}
\indexentry{Chapter2.4.2.2|hyperpage}{29}
\indexentry{Chapter2.4.2.3|hyperpage}{30}
\indexentry{Chapter2.5|hyperpage}{32}
\indexentry{Chapter2.5.1|hyperpage}{32}
\indexentry{Chapter2.5.2|hyperpage}{34}
\indexentry{Chapter2.5.3|hyperpage}{38}
\indexentry{Chapter2.6|hyperpage}{40}
\indexentry{Chapter1.1|hyperpage}{13}
\indexentry{Chapter1.2|hyperpage}{16}
\indexentry{Chapter1.3|hyperpage}{21}
\indexentry{Chapter1.4|hyperpage}{22}
\indexentry{Chapter1.4.1|hyperpage}{22}
\indexentry{Chapter1.4.2|hyperpage}{24}
\indexentry{Chapter1.4.3|hyperpage}{25}
\indexentry{Chapter1.4.4|hyperpage}{26}
\indexentry{Chapter1.4.5|hyperpage}{27}
\indexentry{Chapter1.5|hyperpage}{28}
\indexentry{Chapter1.5.1|hyperpage}{28}
\indexentry{Chapter1.5.2|hyperpage}{29}
\indexentry{Chapter1.5.2.1|hyperpage}{29}
\indexentry{Chapter1.5.2.2|hyperpage}{31}
\indexentry{Chapter1.5.2.3|hyperpage}{31}
\indexentry{Chapter1.6|hyperpage}{32}
\indexentry{Chapter1.7|hyperpage}{34}
\indexentry{Chapter1.7.1|hyperpage}{34}
\indexentry{Chapter1.7.1.1|hyperpage}{35}
\indexentry{Chapter1.7.1.2|hyperpage}{36}
\indexentry{Chapter1.7.2|hyperpage}{38}
\indexentry{Chapter1.8|hyperpage}{40}
\indexentry{Chapter2.1|hyperpage}{46}
\indexentry{Chapter2.2|hyperpage}{47}
\indexentry{Chapter2.2.1|hyperpage}{47}
\indexentry{Chapter2.2.2|hyperpage}{49}
\indexentry{Chapter2.2.3|hyperpage}{50}
\indexentry{Chapter2.2.4|hyperpage}{51}
\indexentry{Chapter2.2.5|hyperpage}{53}
\indexentry{Chapter2.2.5.1|hyperpage}{53}
\indexentry{Chapter2.2.5.2|hyperpage}{54}
\indexentry{Chapter2.2.5.3|hyperpage}{54}
\indexentry{Chapter2.3|hyperpage}{55}
\indexentry{Chapter2.3.1|hyperpage}{56}
\indexentry{Chapter2.3.2|hyperpage}{57}
\indexentry{Chapter2.3.2.1|hyperpage}{57}
\indexentry{Chapter2.3.2.2|hyperpage}{58}
\indexentry{Chapter2.3.2.3|hyperpage}{60}
\indexentry{Chapter2.4|hyperpage}{62}
\indexentry{Chapter2.4.1|hyperpage}{63}
\indexentry{Chapter2.4.2|hyperpage}{65}
\indexentry{Chapter2.4.2.1|hyperpage}{66}
\indexentry{Chapter2.4.2.2|hyperpage}{67}
\indexentry{Chapter2.4.2.3|hyperpage}{68}
\indexentry{Chapter2.5|hyperpage}{70}
\indexentry{Chapter2.5.1|hyperpage}{70}
\indexentry{Chapter2.5.2|hyperpage}{72}
\indexentry{Chapter2.5.3|hyperpage}{76}
\indexentry{Chapter2.6|hyperpage}{78}
\indexentry{Chapter3.1|hyperpage}{83}
\indexentry{Chapter3.2|hyperpage}{85}
\indexentry{Chapter3.2.1|hyperpage}{85}
\indexentry{Chapter3.2.1.1|hyperpage}{85}
\indexentry{Chapter3.2.1.2|hyperpage}{86}
\indexentry{Chapter3.2.1.3|hyperpage}{87}
\indexentry{Chapter3.2.2|hyperpage}{87}
\indexentry{Chapter3.2.3|hyperpage}{88}
\indexentry{Chapter3.2.3.1|hyperpage}{88}
\indexentry{Chapter3.2.3.2|hyperpage}{89}
\indexentry{Chapter3.2.3.3|hyperpage}{90}
\indexentry{Chapter3.2.4|hyperpage}{91}
\indexentry{Chapter3.2.4.1|hyperpage}{91}
\indexentry{Chapter3.2.4.2|hyperpage}{93}
\indexentry{Chapter3.2.5|hyperpage}{94}
\indexentry{Chapter3.3|hyperpage}{97}
\indexentry{Chapter3.3.1|hyperpage}{97}
\indexentry{Chapter3.3.2|hyperpage}{100}
\indexentry{Chapter3.3.2.1|hyperpage}{101}
\indexentry{Chapter3.3.2.2|hyperpage}{102}
\indexentry{Chapter3.3.2.3|hyperpage}{103}
\indexentry{Chapter3.4|hyperpage}{104}
\indexentry{Chapter3.4.1|hyperpage}{104}
\indexentry{Chapter3.4.2|hyperpage}{106}
\indexentry{Chapter3.4.3|hyperpage}{107}
\indexentry{Chapter3.4.4|hyperpage}{108}
\indexentry{Chapter3.4.4.1|hyperpage}{108}
\indexentry{Chapter3.4.4.2|hyperpage}{109}
\indexentry{Chapter3.5|hyperpage}{114}
\indexentry{Chapter3.5.1|hyperpage}{115}
\indexentry{Chapter3.5.2|hyperpage}{117}
\indexentry{Chapter3.5.3|hyperpage}{119}
\indexentry{Chapter3.5.4|hyperpage}{120}
\indexentry{Chapter3.5.5|hyperpage}{122}
\indexentry{Chapter3.5.5|hyperpage}{124}
\indexentry{Chapter3.6|hyperpage}{125}
\indexentry{Chapter3.6.1|hyperpage}{125}
\indexentry{Chapter3.6.2|hyperpage}{126}
\indexentry{Chapter3.6.4|hyperpage}{127}
\indexentry{Chapter3.6.5|hyperpage}{127}
\indexentry{Chapter3.7|hyperpage}{127}
\indexentry{Chapter4.1|hyperpage}{129}
\indexentry{Chapter4.1.1|hyperpage}{131}
\indexentry{Chapter4.1.2|hyperpage}{132}
\indexentry{Chapter4.2|hyperpage}{134}
\indexentry{Chapter4.2.1|hyperpage}{134}
\indexentry{Chapter4.2.2|hyperpage}{137}
\indexentry{Chapter4.2.2.1|hyperpage}{137}
\indexentry{Chapter4.2.2.2|hyperpage}{138}
\indexentry{Chapter4.2.2.3|hyperpage}{139}
\indexentry{Chapter4.2.3|hyperpage}{140}
\indexentry{Chapter4.2.3.1|hyperpage}{140}
\indexentry{Chapter4.2.3.2|hyperpage}{141}
\indexentry{Chapter4.2.3.3|hyperpage}{142}
\indexentry{Chapter4.2.4|hyperpage}{144}
\indexentry{Chapter4.2.4.1|hyperpage}{144}
\indexentry{Chapter4.2.4.2|hyperpage}{145}
\indexentry{Chapter4.2.4.3|hyperpage}{146}
\indexentry{Chapter4.2.5|hyperpage}{147}
\indexentry{Chapter4.2.6|hyperpage}{147}
\indexentry{Chapter4.2.7|hyperpage}{151}
\indexentry{Chapter4.2.7.1|hyperpage}{152}
\indexentry{Chapter4.2.7.2|hyperpage}{152}
\indexentry{Chapter4.2.7.3|hyperpage}{153}
\indexentry{Chapter4.2.7.4|hyperpage}{154}
\indexentry{Chapter4.3|hyperpage}{155}
\indexentry{Chapter4.3.1|hyperpage}{158}
\indexentry{Chapter4.3.1.1|hyperpage}{159}
\indexentry{Chapter4.3.1.2|hyperpage}{160}
\indexentry{Chapter4.3.1.3|hyperpage}{161}
\indexentry{Chapter4.3.1.4|hyperpage}{162}
\indexentry{Chapter4.3.2|hyperpage}{162}
\indexentry{Chapter4.3.3|hyperpage}{164}
\indexentry{Chapter4.3.4|hyperpage}{165}
\indexentry{Chapter4.3.5|hyperpage}{168}
\indexentry{Chapter4.4|hyperpage}{170}
\indexentry{Chapter4.4.1|hyperpage}{173}
\indexentry{Chapter4.4.2|hyperpage}{175}
\indexentry{Chapter4.4.2.1|hyperpage}{176}
\indexentry{Chapter4.4.2.2|hyperpage}{177}
\indexentry{Chapter4.4.2.3|hyperpage}{179}
\indexentry{Chapter4.4.3|hyperpage}{180}
\indexentry{Chapter4.4.3.1|hyperpage}{181}
\indexentry{Chapter4.4.3.2|hyperpage}{184}
\indexentry{Chapter4.4.3.3|hyperpage}{185}
\indexentry{Chapter4.4.3.4|hyperpage}{187}
\indexentry{Chapter4.4.3.5|hyperpage}{188}
\indexentry{Chapter4.4.4|hyperpage}{189}
\indexentry{Chapter4.4.4.1|hyperpage}{190}
\indexentry{Chapter4.4.4.2|hyperpage}{191}
\indexentry{Chapter4.4.5|hyperpage}{191}
\indexentry{Chapter4.4.5|hyperpage}{193}
\indexentry{Chapter4.4.7|hyperpage}{197}
\indexentry{Chapter4.4.7.1|hyperpage}{198}
\indexentry{Chapter4.4.7.2|hyperpage}{198}
\indexentry{Chapter4.5|hyperpage}{200}
\indexentry{Chapter5.1|hyperpage}{206}
\indexentry{Chapter5.1.1|hyperpage}{206}
\indexentry{Chapter5.1.1.1|hyperpage}{206}
\indexentry{Chapter5.1.1.2|hyperpage}{207}
\indexentry{Chapter5.1.1.3|hyperpage}{208}
\indexentry{Chapter5.1.2|hyperpage}{209}
\indexentry{Chapter5.1.2.1|hyperpage}{209}
\indexentry{Chapter5.1.2.2|hyperpage}{210}
\indexentry{Chapter5.2|hyperpage}{210}
\indexentry{Chapter5.2.1|hyperpage}{210}
\indexentry{Chapter5.2.1.1|hyperpage}{211}
\indexentry{Chapter5.2.1.2|hyperpage}{212}
\indexentry{Chapter5.2.1.3|hyperpage}{212}
\indexentry{Chapter5.2.1.4|hyperpage}{213}
\indexentry{Chapter5.2.1.5|hyperpage}{214}
\indexentry{Chapter5.2.1.6|hyperpage}{215}
\indexentry{Chapter5.2.2|hyperpage}{216}
\indexentry{Chapter5.2.2.1|hyperpage}{217}
\indexentry{Chapter5.2.2.2|hyperpage}{217}
\indexentry{Chapter5.2.2.3|hyperpage}{218}
\indexentry{Chapter5.2.2.4|hyperpage}{219}
\indexentry{Chapter5.2.3|hyperpage}{220}
\indexentry{Chapter5.2.3.1|hyperpage}{220}
\indexentry{Chapter5.2.3.2|hyperpage}{222}
\indexentry{Chapter5.2.4|hyperpage}{224}
\indexentry{Chapter5.3|hyperpage}{227}
\indexentry{Chapter5.3.1|hyperpage}{227}
\indexentry{Chapter5.3.1.1|hyperpage}{227}
\indexentry{Chapter5.3.1.2|hyperpage}{229}
\indexentry{Chapter5.3.1.3|hyperpage}{230}
\indexentry{Chapter5.3.2|hyperpage}{231}
\indexentry{Chapter5.3.3|hyperpage}{232}
\indexentry{Chapter5.3.4|hyperpage}{236}
\indexentry{Chapter5.3.5|hyperpage}{237}
\indexentry{Chapter5.4|hyperpage}{238}
\indexentry{Chapter5.4.1|hyperpage}{239}
\indexentry{Chapter5.4.2|hyperpage}{240}
\indexentry{Chapter5.4.2.1|hyperpage}{241}
\indexentry{Chapter5.4.2.2|hyperpage}{243}
\indexentry{Chapter5.4.2.3|hyperpage}{245}
\indexentry{Chapter5.4.3|hyperpage}{248}
\indexentry{Chapter5.4.4|hyperpage}{250}
\indexentry{Chapter5.4.4.1|hyperpage}{250}
\indexentry{Chapter5.4.4.2|hyperpage}{251}
\indexentry{Chapter5.4.4.3|hyperpage}{251}
\indexentry{Chapter5.4.5|hyperpage}{253}
\indexentry{Chapter5.4.6|hyperpage}{254}
\indexentry{Chapter5.4.6.1|hyperpage}{255}
\indexentry{Chapter5.4.6.2|hyperpage}{257}
\indexentry{Chapter5.4.6.3|hyperpage}{258}
\indexentry{Chapter5.5|hyperpage}{260}
\indexentry{Chapter5.5.1|hyperpage}{260}
\indexentry{Chapter5.5.1.1|hyperpage}{261}
\indexentry{Chapter5.5.1.2|hyperpage}{263}
\indexentry{Chapter5.5.1.3|hyperpage}{264}
\indexentry{Chapter5.5.1.4|hyperpage}{265}
\indexentry{Chapter5.5.2|hyperpage}{266}
\indexentry{Chapter5.5.2.1|hyperpage}{266}
\indexentry{Chapter5.5.2.2|hyperpage}{266}
\indexentry{Chapter5.5.3|hyperpage}{268}
\indexentry{Chapter5.5.3.1|hyperpage}{268}
\indexentry{Chapter5.5.3.2|hyperpage}{270}
\indexentry{Chapter5.5.3.3|hyperpage}{270}
\indexentry{Chapter5.5.3.4|hyperpage}{271}
\indexentry{Chapter5.5.3.5|hyperpage}{272}
\indexentry{Chapter5.6|hyperpage}{272}
\indexentry{Chapter6.1|hyperpage}{275}
\indexentry{Chapter6.1.1|hyperpage}{277}
\indexentry{Chapter6.1.2|hyperpage}{279}
\indexentry{Chapter6.1.3|hyperpage}{282}
\indexentry{Chapter6.2|hyperpage}{284}
\indexentry{Chapter6.2.1|hyperpage}{284}
\indexentry{Chapter6.2.2|hyperpage}{285}
\indexentry{Chapter6.2.3|hyperpage}{286}
\indexentry{Chapter6.2.4|hyperpage}{287}
\indexentry{Chapter6.3|hyperpage}{288}
\indexentry{Chapter6.3.1|hyperpage}{290}
\indexentry{Chapter6.3.2|hyperpage}{292}
\indexentry{Chapter6.3.3|hyperpage}{296}
\indexentry{Chapter6.3.3.1|hyperpage}{296}
\indexentry{Chapter6.3.3.2|hyperpage}{296}
\indexentry{Chapter6.3.3.3|hyperpage}{298}
\indexentry{Chapter6.3.3.4|hyperpage}{299}
\indexentry{Chapter6.3.3.5|hyperpage}{301}
\indexentry{Chapter6.3.4|hyperpage}{301}
\indexentry{Chapter6.3.4.1|hyperpage}{302}
\indexentry{Chapter6.3.4.2|hyperpage}{303}
\indexentry{Chapter6.3.4.3|hyperpage}{306}
\indexentry{Chapter6.3.5|hyperpage}{308}
\indexentry{Chapter6.3.5.1|hyperpage}{309}
\indexentry{Chapter6.3.5.2|hyperpage}{309}
\indexentry{Chapter6.3.5.3|hyperpage}{310}
\indexentry{Chapter6.3.5.4|hyperpage}{310}
\indexentry{Chapter6.3.5.5|hyperpage}{311}
\indexentry{Chapter6.3.5.5|hyperpage}{312}
\indexentry{Chapter6.3.6|hyperpage}{313}
\indexentry{Chapter6.3.6.1|hyperpage}{315}
\indexentry{Chapter6.3.6.2|hyperpage}{316}
\indexentry{Chapter6.3.6.3|hyperpage}{317}
\indexentry{Chapter6.3.7|hyperpage}{318}
\indexentry{Chapter6.4|hyperpage}{320}
\indexentry{Chapter6.4.1|hyperpage}{321}
\indexentry{Chapter6.4.2|hyperpage}{322}
\indexentry{Chapter6.4.3|hyperpage}{325}
\indexentry{Chapter6.4.4|hyperpage}{327}
\indexentry{Chapter6.4.5|hyperpage}{328}
\indexentry{Chapter6.4.6|hyperpage}{329}
\indexentry{Chapter6.4.7|hyperpage}{331}
\indexentry{Chapter6.4.8|hyperpage}{332}
\indexentry{Chapter6.4.9|hyperpage}{333}
\indexentry{Chapter6.4.10|hyperpage}{336}
\indexentry{Chapter6.5|hyperpage}{336}
\indexentry{Chapter6.5.1|hyperpage}{337}
\indexentry{Chapter6.5.2|hyperpage}{337}
\indexentry{Chapter6.5.3|hyperpage}{338}
\indexentry{Chapter6.5.4|hyperpage}{338}
\indexentry{Chapter6.5.5|hyperpage}{339}
\indexentry{Chapter6.6|hyperpage}{340}
Book/mt-book-xelatex.ptc
View file @ a744eab3
\boolfalse {citerequest}\boolfalse {citetracker}\boolfalse {pagetracker}\boolfalse {backtracker}\relax
\babel@toc {english}{}
\defcounter {refsection}{0}\relax
\contentsline {part}{\@mypartnumtocformat {I}{机器翻译基础}}{5}{part.1}%
\select@language {english}
\defcounter {refsection}{0}\relax
\contentsline {part}{\@mypartnumtocformat {I}{机器翻译基础}}{11}{part.1}
\ttl@starttoc {default@1}
\defcounter {refsection}{0}\relax
\contentsline {chapter}{\numberline {1}机器翻译简介}{7}{chapter.1}%
\contentsline {chapter}{\numberline {1}机器翻译简介}{13}{chapter.1}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.1}机器翻译的概念}{13}{section.1.1}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.2}机器翻译简史}{16}{section.1.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.2.1}人工翻译}{16}{subsection.1.2.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.2.2}机器翻译的萌芽}{17}{subsection.1.2.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.2.3}机器翻译的受挫}{18}{subsection.1.2.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.2.4}机器翻译的快速成长}{19}{subsection.1.2.4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.2.5}机器翻译的爆发}{20}{subsection.1.2.5}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.3}机器翻译现状}{21}{section.1.3}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.4}机器翻译方法}{22}{section.1.4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.4.1}基于规则的机器翻译}{22}{subsection.1.4.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.4.2}基于实例的机器翻译}{24}{subsection.1.4.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.4.3}统计机器翻译}{25}{subsection.1.4.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.4.4}神经机器翻译}{26}{subsection.1.4.4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.4.5}对比分析}{27}{subsection.1.4.5}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.5}翻译质量评价}{28}{section.1.5}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.5.1}人工评价}{28}{subsection.1.5.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.5.2}自动评价}{29}{subsection.1.5.2}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{BLEU}{29}{section*.15}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{TER}{31}{section*.16}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于检测点的评价}{31}{section*.17}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.6}机器翻译应用}{32}{section.1.6}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.7}开源项目与评测}{34}{section.1.7}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.7.1}开源机器翻译系统}{34}{subsection.1.7.1}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{统计机器翻译开源系统}{35}{section*.19}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{神经机器翻译开源系统}{36}{section*.20}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.7.2}常用数据集及公开评测任务}{38}{subsection.1.7.2}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.8}推荐学习资源}{40}{section.1.8}
\defcounter {refsection}{0}\relax
\contentsline {chapter}{\numberline {2}词法、语法及统计建模基础}{45}{chapter.2}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {2.1}问题概述 }{46}{section.2.1}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {2.2}概率论基础}{47}{section.2.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {2.2.1}随机变量和概率}{47}{subsection.2.2.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {2.2.2}联合概率、条件概率和边缘概率}{49}{subsection.2.2.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {2.2.3}链式法则}{50}{subsection.2.2.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {2.2.4}贝叶斯法则}{51}{subsection.2.2.4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {2.2.5}KL距离和熵}{53}{subsection.2.2.5}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{信息熵}{53}{section*.27}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{KL距离}{54}{section*.29}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{交叉熵}{54}{section*.30}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {2.3}中文分词}{55}{section.2.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {2.3.1}基于词典的分词方法}{56}{subsection.2.3.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {2.3.2}基于统计的分词方法}{57}{subsection.2.3.2}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{统计模型的学习与推断}{57}{section*.34}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{掷骰子游戏}{58}{section*.36}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{全概率分词方法}{60}{section*.40}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {2.4}$n$-gram语言模型 }{62}{section.2.4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {2.4.1}建模}{63}{subsection.2.4.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {2.4.2}未登录词和平滑算法}{65}{subsection.2.4.2}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{加法平滑方法}{66}{section*.46}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{古德-图灵估计法}{67}{section*.48}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{Kneser-Ney平滑方法}{68}{section*.50}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {2.5}句法分析(短语结构分析)}{70}{section.2.5}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {2.5.1}句子的句法树表示}{70}{subsection.2.5.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {2.5.2}上下文无关文法}{72}{subsection.2.5.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {2.5.3}规则和推导的概率}{76}{subsection.2.5.3}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {2.6}小结及深入阅读}{78}{section.2.6}
\defcounter {refsection}{0}\relax
\contentsline {part}{\@mypartnumtocformat {II}{统计机器翻译}}{81}{part.2}
\ttl@stoptoc {default@1}
\ttl@starttoc {default@2}
\defcounter {refsection}{0}\relax
\contentsline {chapter}{\numberline {3}基于词的机器翻译模型}{83}{chapter.3}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {3.1}什么是基于词的翻译模型}{83}{section.3.1}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {3.2}构建一个简单的机器翻译系统}{85}{section.3.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.2.1}如何进行翻译?}{85}{subsection.3.2.1}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)机器翻译流程}{86}{section*.63}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(三)人工 vs. 机器}{87}{section*.65}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.2.2}基本框架}{87}{subsection.3.2.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.2.3}单词翻译概率}{88}{subsection.3.2.3}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)什么是单词翻译概率?}{88}{section*.67}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)如何从一个双语平行数据中学习?}{89}{section*.69}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(三)如何从大量的双语平行数据中学习?}{90}{section*.70}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.2.4}句子级翻译模型}{91}{subsection.3.2.4}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)句子级翻译的基础模型}{91}{section*.72}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)生成流畅的译文}{93}{section*.74}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.2.5}解码}{94}{subsection.3.2.5}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {3.3}基于词的翻译建模}{97}{section.3.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.3.1}噪声信道模型}{97}{subsection.3.3.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.3.2}统计机器翻译的三个基本问题}{100}{subsection.3.3.2}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{词对齐}{101}{section*.83}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于词对齐的翻译模型}{102}{section*.86}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于词对齐的翻译实例}{103}{section*.88}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {3.4}IBM模型1-2}{104}{section.3.4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.4.1}IBM模型1}{104}{subsection.3.4.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.4.2}IBM模型2}{106}{subsection.3.4.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.4.3}解码及计算优化}{107}{subsection.3.4.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.4.4}训练}{108}{subsection.3.4.4}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)目标函数}{108}{section*.93}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)优化}{109}{section*.95}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {3.5}IBM模型3-5及隐马尔可夫模型}{114}{section.3.5}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.5.1}基于产出率的翻译模型}{115}{subsection.3.5.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.5.2}IBM 模型3}{117}{subsection.3.5.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.5.3}IBM 模型4}{119}{subsection.3.5.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.5.4} IBM 模型5}{120}{subsection.3.5.4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.5.5}隐马尔可夫模型}{122}{subsection.3.5.5}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{隐马尔可夫模型}{122}{section*.107}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{词对齐模型}{123}{section*.109}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.5.6}解码和训练}{124}{subsection.3.5.6}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {3.6}问题分析}{125}{section.3.6}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.6.1}词对齐及对称化}{125}{subsection.3.6.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.6.2}Deficiency}{126}{subsection.3.6.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.6.3}句子长度}{127}{subsection.3.6.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {3.6.4}其它问题}{127}{subsection.3.6.4}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {3.7}小结及深入阅读}{127}{section.3.7}
\defcounter {refsection}{0}\relax
\contentsline {chapter}{\numberline {4}基于短语和句法的机器翻译模型}{129}{chapter.4}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {4.1}翻译中的结构信息}{129}{section.4.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.1.1}更大粒度的翻译单元}{131}{subsection.4.1.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.1.2}句子的结构信息}{132}{subsection.4.1.2}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {4.2}基于短语的翻译模型}{134}{section.4.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.2.1}机器翻译中的短语}{134}{subsection.4.2.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.2.2}数学建模及判别式模型}{137}{subsection.4.2.2}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于翻译推导的建模}{137}{section*.121}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{对数线性模型}{138}{section*.122}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{搭建模型的基本流程}{139}{section*.123}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.2.3}短语抽取}{140}{subsection.4.2.3}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{与词对齐一致的短语}{140}{section*.126}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{获取词对齐}{141}{section*.130}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{度量双语短语质量}{142}{section*.132}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.2.4}调序}{144}{subsection.4.2.4}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于距离的调序}{144}{section*.136}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于方向的调序}{145}{section*.138}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于分类的调序}{146}{section*.141}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.2.5}特征}{147}{subsection.4.2.5}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.2.6}最小错误率训练}{147}{subsection.4.2.6}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.2.7}栈解码}{151}{subsection.4.2.7}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{翻译候选匹配}{152}{section*.146}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{翻译假设扩展}{152}{section*.148}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{剪枝}{153}{section*.150}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{解码中的栈结构}{154}{section*.152}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {4.3}基于层次短语的模型}{155}{section.4.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.3.1}同步上下文无关文法}{158}{subsection.4.3.1}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{文法定义}{159}{section*.157}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{推导}{160}{section*.158}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{胶水规则}{161}{section*.159}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{处理流程}{162}{section*.160}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.3.2}层次短语规则抽取}{162}{subsection.4.3.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.3.3}翻译模型及特征}{164}{subsection.4.3.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.3.4}CYK解码}{165}{subsection.4.3.4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.3.5}立方剪枝}{168}{subsection.4.3.5}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {4.4}基于语言学句法的模型}{170}{section.4.4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.4.1}基于句法的翻译模型分类}{173}{subsection.4.4.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.4.2}基于树结构的文法}{175}{subsection.4.4.2}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{树到树翻译规则}{176}{section*.176}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于树结构的翻译推导}{177}{section*.178}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{树到串翻译规则}{179}{section*.181}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.4.3}树到串翻译规则抽取}{180}{subsection.4.4.3}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{树的切割与最小规则}{181}{section*.183}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{空对齐处理}{184}{section*.189}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{组合规则}{185}{section*.191}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{SPMT规则}{187}{section*.193}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{句法树二叉化}{188}{section*.195}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.4.4}树到树翻译规则抽取}{189}{subsection.4.4.4}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于节点对齐的规则抽取}{190}{section*.199}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于对齐矩阵的规则抽取}{191}{section*.202}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.4.5}句法翻译模型的特征}{191}{subsection.4.4.5}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.4.6}基于超图的推导空间表示}{193}{subsection.4.4.6}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {4.4.7}基于树的解码 vs 基于串的解码}{197}{subsection.4.4.7}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于树的解码}{198}{section*.209}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于串的解码}{198}{section*.212}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {4.5}小结及深入阅读}{200}{section.4.5}
\defcounter {refsection}{0}\relax
\contentsline {part}{\@mypartnumtocformat {III}{神经机器翻译}}{203}{part.3}
\ttl@stoptoc {default@2}
\ttl@starttoc {default@3}
\defcounter {refsection}{0}\relax
\contentsline {chapter}{\numberline {5}人工神经网络和神经语言建模}{205}{chapter.5}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {5.1}深度学习与人工神经网络}{206}{section.5.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.1.1}发展简史}{206}{subsection.5.1.1}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)早期的人工神经网络和第一次寒冬}{206}{section*.214}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)神经网络的第二次高潮和第二次寒冬}{207}{section*.215}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(三)深度学习和神经网络的崛起}{208}{section*.216}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.1.2}为什么需要深度学习}{209}{subsection.5.1.2}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)端到端学习和表示学习}{209}{section*.218}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)深度学习的效果}{210}{section*.220}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {5.2}神经网络基础}{210}{section.5.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.2.1}线性代数基础}{210}{subsection.5.2.1}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)标量、向量和矩阵}{211}{section*.222}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)矩阵的转置}{212}{section*.223}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(三)矩阵加法和数乘}{212}{section*.224}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(四)矩阵乘法和矩阵点乘}{213}{section*.225}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(五)线性映射}{214}{section*.226}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(六)范数}{215}{section*.227}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.2.2}人工神经元和感知机}{216}{subsection.5.2.2}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)感知机\ \raisebox {0.5mm}{------}\ 最简单的人工神经元模型}{217}{section*.230}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)神经元内部权重}{217}{section*.233}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(三)神经元的输入\ \raisebox {0.5mm}{------}\ 离散 vs 连续}{218}{section*.235}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(四)神经元内部的参数学习}{219}{section*.237}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.2.3}多层神经网络}{220}{subsection.5.2.3}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)线性变换和激活函数}{220}{section*.239}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)单层神经网络$\rightarrow $多层神经网络}{222}{section*.246}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.2.4}函数拟合能力}{224}{subsection.5.2.4}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {5.3}神经网络的张量实现}{227}{section.5.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.3.1} 张量及其计算}{227}{subsection.5.3.1}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)张量}{227}{section*.256}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)张量的矩阵乘法}{229}{section*.259}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(三)张量的单元操作}{230}{section*.261}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.3.2}张量的物理存储形式}{231}{subsection.5.3.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.3.3}使用开源框架实现张量计算}{232}{subsection.5.3.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.3.4}神经网络中的前向传播}{236}{subsection.5.3.4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.3.5}神经网络实例}{237}{subsection.5.3.5}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {5.4}神经网络的参数训练}{238}{section.5.4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.4.1}损失函数}{239}{subsection.5.4.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.4.2}基于梯度的参数优化}{240}{subsection.5.4.2}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)梯度下降}{241}{section*.279}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)梯度获取}{243}{section*.281}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(三)基于梯度的方法的变种和改进}{245}{section*.285}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.4.3}参数更新的并行化策略}{248}{subsection.5.4.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.4.4}梯度消失、梯度爆炸和稳定性训练}{250}{subsection.5.4.4}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)梯度消失现象及解决方法}{250}{section*.288}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)梯度爆炸现象及解决方法}{251}{section*.292}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(三)稳定性训练}{251}{section*.293}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.4.5}过拟合}{253}{subsection.5.4.5}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.4.6}反向传播}{254}{subsection.5.4.6}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)输出层的反向传播}{255}{section*.296}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)隐藏层的反向传播}{257}{section*.300}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(三)程序实现}{258}{section*.303}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {5.5}神经语言模型}{260}{section.5.5}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.5.1}基于神经网络的语言建模}{260}{subsection.5.5.1}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)基于前馈神经网络的语言模型}{261}{section*.306}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)基于循环神经网络的语言模型}{263}{section*.309}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(三)基于自注意力机制的语言模型}{264}{section*.311}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(四)语言模型的评价}{265}{section*.313}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.5.2}单词表示模型}{266}{subsection.5.5.2}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)One-hot编码}{266}{section*.314}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)分布式表示}{266}{section*.316}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {5.5.3}句子表示模型及预训练}{268}{subsection.5.5.3}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(一)简单的上下文表示模型}{268}{section*.320}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(二)ELMO模型}{270}{section*.323}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(三)GPT模型}{270}{section*.325}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(四)BERT模型}{271}{section*.327}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{(五)为什么要预训练?}{272}{section*.329}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {5.6}小结及深入阅读}{272}{section.5.6}
\defcounter {refsection}{0}\relax
\contentsline {chapter}{\numberline {6}神经机器翻译模型}{275}{chapter.6}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {6.1}神经机器翻译的发展简史}{275}{section.6.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.1.1}神经机器翻译的起源}{277}{subsection.6.1.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.1.2}神经机器翻译的品质 }{279}{subsection.6.1.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.1.3}神经机器翻译的优势 }{282}{subsection.6.1.3}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {6.2}编码器-解码器框架}{284}{section.6.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.2.1}框架结构}{284}{subsection.6.2.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.2.2}表示学习}{285}{subsection.6.2.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.2.3}简单的运行实例}{286}{subsection.6.2.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.2.4}机器翻译范式的对比}{287}{subsection.6.2.4}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {6.3}基于循环神经网络的翻译模型及注意力机制}{288}{section.6.3}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.3.1}建模}{290}{subsection.6.3.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.3.2}输入(词嵌入)及输出(Softmax)}{292}{subsection.6.3.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.3.3}循环神经网络结构}{296}{subsection.6.3.3}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{循环神经单元(RNN)}{296}{section*.351}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{长短时记忆网络(LSTM)}{296}{section*.352}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{门控循环单元(GRU)}{298}{section*.355}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{双向模型}{299}{section*.357}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{多层循环神经网络}{301}{section*.359}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.3.4}注意力机制}{301}{subsection.6.3.4}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{翻译中的注意力机制}{302}{section*.362}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{上下文向量的计算}{303}{section*.365}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{注意力机制的解读}{306}{section*.370}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.3.5}训练}{308}{subsection.6.3.5}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{损失函数}{309}{section*.373}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{长参数初始化}{309}{section*.374}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{优化策略}{310}{section*.375}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{梯度裁剪}{310}{section*.377}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{学习率策略}{311}{section*.378}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{并行训练}{312}{section*.381}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {6.3.6}推断}{313}{subsection.6.3.6}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{贪婪搜索}{315}{section*.385}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.1}机器翻译的概念}{7}{section.1.1}%
\contentsline {subsubsection}{束搜索}{316}{section*.388}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.2}机器翻译简史}{10}{section.1.2}%
\contentsline {subsubsection}{长度惩罚}{317}{section*.390}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.2.1}人工翻译}{10}{subsection.1.2.1}%
\contentsline {subsection}{\numberline {6.3.7}实例-GNMT}{318}{subsection.6.3.7}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.2.2}机器翻译的萌芽}{11}{subsection.1.2.2}%
\contentsline {section}{\numberline {6.4}Transformer}{320}{section.6.4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.2.3}机器翻译的受挫}{12}{subsection.1.2.3}%
\contentsline {subsection}{\numberline {6.4.1}自注意力模型}{321}{subsection.6.4.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.2.4}机器翻译的快速成长}{13}{subsection.1.2.4}%
\contentsline {subsection}{\numberline {6.4.2}Transformer架构}{322}{subsection.6.4.2}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.2.5}机器翻译的爆发}{14}{subsection.1.2.5}%
\contentsline {subsection}{\numberline {6.4.3}位置编码}{325}{subsection.6.4.3}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.3}机器翻译现状}{15}{section.1.3}%
\contentsline {subsection}{\numberline {6.4.4}基于点乘的注意力机制}{327}{subsection.6.4.4}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.4}机器翻译方法}{16}{section.1.4}%
\contentsline {subsection}{\numberline {6.4.5}掩码操作}{328}{subsection.6.4.5}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.4.1}基于规则的机器翻译}{16}{subsection.1.4.1}%
\contentsline {subsection}{\numberline {6.4.6}多头注意力}{329}{subsection.6.4.6}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.4.2}基于实例的机器翻译}{18}{subsection.1.4.2}%
\contentsline {subsection}{\numberline {6.4.7}残差网络和层正则化}{331}{subsection.6.4.7}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.4.3}统计机器翻译}{19}{subsection.1.4.3}%
\contentsline {subsection}{\numberline {6.4.8}前馈全连接网络子层}{332}{subsection.6.4.8}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.4.4}神经机器翻译}{20}{subsection.1.4.4}%
\contentsline {subsection}{\numberline {6.4.9}训练}{333}{subsection.6.4.9}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.4.5}对比分析}{21}{subsection.1.4.5}%
\contentsline {subsection}{\numberline {6.4.10}推断}{336}{subsection.6.4.10}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.5}翻译质量评价}{22}{section.1.5}%
\contentsline {section}{\numberline {6.5}序列到序列问题及应用}{336}{section.6.5}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.5.1}人工评价}{22}{subsection.1.5.1}%
\contentsline {subsection}{\numberline {6.5.1}自动问答}{337}{subsection.6.5.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.5.2}自动评价}{23}{subsection.1.5.2}%
\contentsline {subsection}{\numberline {6.5.2}自动文摘}{337}{subsection.6.5.2}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{BLEU}{23}{section*.15}%
\contentsline {subsection}{\numberline {6.5.3}文言文翻译}{338}{subsection.6.5.3}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{TER}{25}{section*.16}%
\contentsline {subsection}{\numberline {6.5.4}对联生成}{338}{subsection.6.5.4}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{基于检测点的评价}{25}{section*.17}%
\contentsline {subsection}{\numberline {6.5.5}古诗生成}{339}{subsection.6.5.5}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.6}机器翻译应用}{26}{section.1.6}%
\contentsline {section}{\numberline {6.6}小结及深入阅读}{340}{section.6.6}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.7}开源项目与评测}{28}{section.1.7}%
\contentsline {part}{\@mypartnumtocformat {IV}{附录}}{343}{part.4}
\ttl@stoptoc {default@3}
\ttl@starttoc {default@4}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.7.1}开源机器翻译系统}{28}{subsection.1.7.1}%
\contentsline {chapter}{\numberline {A}附录A}{345}{Appendix.1.A}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{统计机器翻译开源系统}{29}{section*.19}%
\contentsline {chapter}{\numberline {B}附录B}{347}{Appendix.2.B}
\defcounter {refsection}{0}\relax
\contentsline {subsubsection}{神经机器翻译开源系统}{30}{section*.20}%
\contentsline {section}{\numberline {B.1}IBM模型3训练方法}{347}{section.2.B.1}
\defcounter {refsection}{0}\relax
\contentsline {subsection}{\numberline {1.7.2}常用数据集及公开评测任务}{32}{subsection.1.7.2}%
\contentsline {section}{\numberline {B.2}IBM模型4训练方法}{349}{section.2.B.2}
\defcounter {refsection}{0}\relax
\contentsline {section}{\numberline {1.8}推荐学习资源}{34}{section.1.8}%
\contentsline {section}{\numberline {B.3}IBM模型5训练方法}{350}{section.2.B.3}
\contentsfinish
Book/mt-book-xelatex.tex
View file @ a744eab3
...
...
@@ -112,13 +112,13 @@
% CHAPTERS
%----------------------------------------------------------------------------------------
%
\include{Chapter1/chapter1}
\include{Chapter1/chapter1}
\include{Chapter2/chapter2}
%
\include{Chapter3/chapter3}
%
\include{Chapter4/chapter4}
%
\include{Chapter5/chapter5}
%
\include{Chapter6/chapter6}
%
\include{ChapterAppend/chapterappend}
\include{Chapter3/chapter3}
\include{Chapter4/chapter4}
\include{Chapter5/chapter5}
\include{Chapter6/chapter6}
\include{ChapterAppend/chapterappend}
...
...