From a7417a5b7c187f903324a8e65218a21be9428787 Mon Sep 17 00:00:00 2001
From: caoying03
Date: Sun, 8 Jan 2017 16:28:02 +0800
Subject: [PATCH] delete train.list and test.list

---
 label_semantic_roles/.gitignore      | 0
 label_semantic_roles/README.md       | 2 +-
 label_semantic_roles/data/test.list  | 1 -
 label_semantic_roles/data/train.list | 1 -
 4 files changed, 1 insertion(+), 3 deletions(-)
 create mode 100644 label_semantic_roles/.gitignore
 delete mode 100644 label_semantic_roles/data/test.list
 delete mode 100644 label_semantic_roles/data/train.list

diff --git a/label_semantic_roles/.gitignore b/label_semantic_roles/.gitignore
new file mode 100644
index 0000000..e69de29
diff --git a/label_semantic_roles/README.md b/label_semantic_roles/README.md
index 6c0735f..0b9deae 100644
--- a/label_semantic_roles/README.md
+++ b/label_semantic_roles/README.md
@@ -70,7 +70,7 @@ An RNN is equivalent to an unrolled feed-forward network, and it is commonly assumed that an RNN, over time
In the vast majority of natural language processing tasks, we almost always have access to the whole sentence. In that case, being able to obtain future information at time $t$, just as we obtain historical information, is a great help to sequence learning tasks. To overcome this limitation, we can design a bidirectional recurrent network unit. Its idea is simple and direct: make a small modification to the stacked recurrent neural network of the previous section by stacking multiple LSTM units and letting each LSTM layer read the previous layer's output sequence in alternating order: forward, reverse, forward, and so on. Starting from layer 2, the LSTM unit at time $t$ can then always see both past and future information. Figure 3 shows the structure of this LSTM-based bidirectional recurrent neural network.

-
+
Figure 3. Structure of the LSTM-based bidirectional recurrent neural network

diff --git a/label_semantic_roles/data/test.list b/label_semantic_roles/data/test.list
deleted file mode 100644
index ec370e8..0000000
--- a/label_semantic_roles/data/test.list
+++ /dev/null
@@ -1 +0,0 @@
-./data/feature
diff --git a/label_semantic_roles/data/train.list b/label_semantic_roles/data/train.list
deleted file mode 100644
index ec370e8..0000000
--- a/label_semantic_roles/data/train.list
+++ /dev/null
@@ -1 +0,0 @@
-./data/feature
--
GitLab
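The alternating-direction stacking described in the README hunk above can be sketched in a few lines. This is only an illustration of the direction schedule, not the repo's implementation: a toy elementwise recurrence stands in for a real LSTM cell, and the names `toy_recurrent_layer` and `stacked_bidirectional` are hypothetical. Layer 1 reads its input forward, layer 2 reads layer 1's output in reverse, layer 3 forward again, and so on, so from layer 2 onward each time step has seen both past and future context.

```python
def toy_recurrent_layer(seq, reverse=False):
    """Run a trivial recurrence h_t = 0.5 * h_{t-1} + x_t over the sequence.

    This stands in for an LSTM layer; only the traversal direction matters here.
    """
    order = reversed(seq) if reverse else list(seq)
    h, out = 0.0, []
    for x in order:
        h = 0.5 * h + x
        out.append(h)
    if reverse:
        out.reverse()  # restore original time order for the next layer
    return out

def stacked_bidirectional(seq, depth):
    """Stack `depth` recurrent layers, alternating forward / reverse direction."""
    out = list(seq)
    for layer in range(depth):
        out = toy_recurrent_layer(out, reverse=(layer % 2 == 1))
    return out

print(stacked_bidirectional([1.0, 2.0, 3.0, 4.0], depth=2))
```

With `depth=1` each output step depends only on the past; with `depth=2` the second (reversed) layer folds in future context, which is why the README says the benefit appears "from layer 2" onward.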