Commit e119c723 by xiaotong

definitions of tensors

parent 6081a1a7
@@ -105,62 +105,33 @@
\subsection{A Simple Implementation of Neural Networks: Tensor Computation}
%%%------------------------------------------------------------------------------------------------------------
%%% tensors
%%% a simple definition of tensors
\begin{frame}{What Is a Tensor?}
\begin{itemize}
\item For a neural network, the input $\textbf{x}$ and the output $\textbf{y}$ are not restricted to vectors
\item In \textbf{deep learning}, a tensor is ``simply'' defined as a \alert{multi-dimensional array}
\end{itemize}
\begin{center}
\begin{tikzpicture}
\node [anchor=center] (y) at (0,0) {\LARGE{$\textbf{y}$}};
\node [anchor=west] (eq) at (y.east) {\LARGE{$=$}};
\node [anchor=west] (func) at (eq.east) {\LARGE{$f$}};
\node [anchor=west] (brace01) at (func.east) {\LARGE{$($}};
\node [anchor=west] (x) at (brace01.east) {\LARGE{$\textbf{x}$}};
\node [anchor=west] (dot) at (x.east) {\LARGE{$\cdot$}};
\node [anchor=west] (w) at (dot.east) {\LARGE{$\textbf{w}$}};
\node [anchor=west] (plus) at (w.east) {\LARGE{$+$}};
\node [anchor=west] (b) at (plus.east) {\LARGE{$\textbf{b}$}};
\node [anchor=west] (brace02) at (b.east) {\LARGE{$)$}};
\visible<2->{
\node [anchor=center,fill=yellow!30] (x2) at (x) {\LARGE{$\textbf{x}$}};
\node [anchor=south] (xlabel) at ([xshift=-3em,yshift=1.5em]x.north) {\alert{Vector? Matrix? ...}};
\draw [<-] ([yshift=0.2em,xshift=-0.5em]x2.north) -- ([xshift=1em]xlabel.south);
\node [anchor=center,fill=red!20] (y2) at (y) {\LARGE{$\textbf{y}$}};
\draw [<-] ([yshift=0.2em,xshift=0.5em]y2.north) -- ([xshift=-1em]xlabel.south);
\node [anchor=center,fill=green!20] (w2) at (w) {\LARGE{$\textbf{w}$}};
\node [anchor=north] (wlabel) at ([yshift=-1.0em]w.south) {matrix, e.g.,};
\draw [<-] ([yshift=-0.2em]w2.south) -- (wlabel.north);
\node [anchor=west] (wsample) at ([xshift=-0.5em]wlabel.east) {\footnotesize{$\left(\begin{array}{c c} 1 & 2 \\ 3 & 4 \end{array}\right)$}};
\node [anchor=center,fill=purple!20] (b2) at (b) {\LARGE{$\textbf{b}$}};
\node [anchor=south] (blabel) at ([yshift=1.3em]b.north) {vector, e.g.,};
\draw [<-] ([yshift=0.2em]b2.north) -- (blabel.south);
\node [anchor=west] (bsample) at ([xshift=-0.5em]blabel.east) {\footnotesize{$(1, 3)$}};
}
\end{tikzpicture}
\end{center}
\end{frame}
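The deep-learning view on this slide ("a tensor is a multi-dimensional array") can be sketched with NumPy, reusing the matrix $\textbf{w}$ and vector $\textbf{b}$ shown above. This is a hypothetical illustration; the slides themselves do not mention NumPy, and $f$ is taken to be the identity here.

```python
import numpy as np

# In the deep-learning sense, a "tensor" is just a multi-dimensional array.
x = np.array([1, 3])              # order-1 tensor (vector)
w = np.array([[1, 2], [3, 4]])    # order-2 tensor (matrix), as on the slide
b = np.array([1, 3])              # bias vector, as on the slide

# y = f(x . w + b), with f taken as the identity for this sketch
y = x @ w + b
print(y)                          # the linear part of the layer

# An order-3 tensor: two stacked 2x2 matrices, like the last example on the slides
x3 = np.array([[[-1, 3], [0.2, 2]],
               [[-1, 3], [0.2, 2]]])
print(x3.shape)                   # (2, 2, 2)
```

Here `ndim` gives the order of each tensor (1 for `x`, 2 for `w`, 3 for `x3`), which is all "multi-dimensional array" means in this context.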
%%%------------------------------------------------------------------------------------------------------------
%%% a tensor is a multilinear function
\begin{frame}{In Fact, a Tensor Is a Function - Don't Panic, Just Take a Look :)}
\begin{itemize}
\item $\textbf{x}$ and $\textbf{y}$ are actually instances of what is called a tensor. \textbf{To put it very responsibly}: a tensor is \alert{not} a simple extension of vectors and matrices; indeed, a multi-dimensional array is \alert{not} even a required representation of a tensor
\item<2-> Strictly speaking, a tensor is:
\begin{itemize}
\item<2-> \textbf{A definition you will not understand}: an abstract object that satisfies certain coordinate-transformation relations when the coordinate system changes; it is a geometric quantity invariant under coordinate transformations of the reference frame (the geometric definition)
\item<3-> \textbf{A definition you still will not understand}: a quantity defined from vectors and covectors via the tensor product (the algebraic definition)
\item<4-> \textbf{A definition that can actually be explained}: \alert{a tensor is a multilinear function}, i.e., a multilinear map defined on a Cartesian product of vector spaces
\begin{itemize}
\item Here a tensor is written as $T(v_0,...,v_r)$, whose inputs are the $r+1$ vectors $\{v_0,...,v_r\}$
\item Multilinear means that the function is linear in each of its arguments; for example, for any given $v_i$ we have
\begin{displaymath}
T(v_0,...,v_i+c \cdot u,...,v_r) = T(v_0,...,v_i,...,v_r) + c \cdot T(v_0,...,u,...,v_r)
\end{displaymath}
where $c$ is an arbitrary scalar. This property is essential: the other definitions above can be derived from it.
\end{itemize}
\end{itemize}
\end{itemize}
\begin{center}
\begin{tikzpicture}
\begin{scope}
\visible<4->{\node [anchor=west] (vector) at (0,0) {$\textbf{x} = (1, 3)$};}
\visible<5->{\node [anchor=west] (matrix) at ([xshift=0.1in]vector.east) {$\textbf{x} = \left(\begin{array}{c c} -1 & 3 \\ 0.2 & 2 \end{array}\right)$};}
\visible<6->{\node [anchor=west] (tensor3d) at ([xshift=0.1in]matrix.east) {What? $\textbf{x} = \left(\begin{array}{c} \left(\begin{array}{c c} -1 & 3 \\ 0.2 & 2 \end{array}\right) \\ \left(\begin{array}{c c} -1 & 3 \\ 0.2 & 2 \end{array}\right) \end{array}\right)$};}
\end{scope}
\end{tikzpicture}
\end{center}
\end{frame}
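The multilinearity property on this slide can be checked numerically. The sketch below is a hypothetical example not taken from the slides: it instantiates $T$ as a bilinear function $T(v_0, v_1) = v_0^{T} M v_1$ (the $r = 1$ case) and verifies the identity for the second argument.

```python
import numpy as np

# A bilinear function T(v0, v1) = v0^T M v1 (multilinear with r = 1);
# M is an arbitrary fixed matrix chosen for illustration.
M = np.array([[2.0, -1.0], [0.5, 3.0]])

def T(v0, v1):
    return v0 @ M @ v1

v0 = np.array([1.0, 2.0])
v1 = np.array([-1.0, 0.5])
u  = np.array([4.0, 1.0])
c  = 2.5

# Linearity in the second argument:
#   T(v0, v1 + c*u) == T(v0, v1) + c * T(v0, u)
lhs = T(v0, v1 + c * u)
rhs = T(v0, v1) + c * T(v0, u)
print(np.isclose(lhs, rhs))  # True
```

The same check applies to any argument position; an inner product, a matrix product, and a determinant (in each column) are all multilinear functions in this sense.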
%%%------------------------------------------------------------------------------------------------------------