Commit e0a85db7 authored by Luo Tao

fix dead links, reduce image size

Parent e0a81dca
@@ -33,8 +33,7 @@ PaddlePaddle
yield src_ids, trg_ids, trg_ids_next
For more details on how to write a data provider, please refer to :ref:`api_pydataprovider2`. The complete data provider file is ``demo/seqToseq/dataprovider.py``.
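The `yield src_ids, trg_ids, trg_ids_next` line above can be illustrated with a minimal sketch. This is not the actual ``demo/seqToseq/dataprovider.py``; the dictionaries and the ``<s>``/``<e>`` markers below are illustrative assumptions:

```python
def make_provider(src_dict, trg_dict, start="<s>", end="<e>"):
    """Return a generator over (src_ids, trg_ids, trg_ids_next) tuples."""
    def process(sentence_pairs):
        for src_words, trg_words in sentence_pairs:
            src_ids = [src_dict[w] for w in src_words]
            # The decoder input starts with <s>; the labels are the same
            # target words shifted left by one position, ending with <e>.
            trg_ids = [trg_dict[start]] + [trg_dict[w] for w in trg_words]
            trg_ids_next = [trg_dict[w] for w in trg_words] + [trg_dict[end]]
            yield src_ids, trg_ids, trg_ids_next
    return process
```

The shifted ``trg_ids_next`` sequence gives the decoder its next-word training targets at every position.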
Configure the Recurrent Neural Network Architecture
@@ -132,9 +131,7 @@ Sequence to Sequence Model with Attention
The encoder part of the model is shown below. The layer is named ``grumemory`` to indicate a gated recurrent neural network. If the network architecture is simple, this approach is recommended, because it is faster than ``recurrent_group``. We have implemented most of the commonly used recurrent neural network architectures; see :ref:`api_trainer_config_helpers_layers` for more details.
We also project the encoding vector into a ``decoder_size``-dimensional space. This is done by taking the first instance of the backward recurrent network and projecting it into
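A rough sketch of the encoder described above, in the style of the old ``trainer_config_helpers`` API. The layer names, sizes, and exact signatures here are reconstructed from memory and may differ from ``demo/seqToseq/seqToseq_net.py``:

```python
src_embedding = embedding_layer(input=source_words, size=word_vector_dim)
# grumemory runs a whole GRU over the sequence in a single layer,
# which is why it is faster than an equivalent recurrent_group.
src_backward = grumemory(input=src_embedding, size=encoder_size, reverse=True)
# Take the first instance of the backward network and project it into a
# decoder_size-dimensional space to initialize the decoder.
encoder_first = first_seq(input=src_backward)
decoder_boot = fc_layer(input=encoder_first, size=decoder_size)
```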
@@ -276,9 +273,6 @@ attention, the gated recurrent unit step function, and the output function:
result_file=gen_trans_file)
outputs(beam_gen)
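The ``beam_gen`` output above comes from beam-search decoding. As a language-agnostic illustration of the technique itself (plain Python over a toy next-token model, not PaddlePaddle's implementation):

```python
import heapq
import math

def beam_search(next_probs, bos_id, eos_id, beam_size, max_length):
    """Keep the beam_size best partial sequences, expanding each by every
    possible next token, until they end with eos_id or reach max_length."""
    beams = [(0.0, [bos_id])]              # (log-probability, token ids)
    finished = []
    for _ in range(max_length):
        candidates = []
        for logp, seq in beams:
            if seq[-1] == eos_id:          # already complete: set it aside
                finished.append((logp, seq))
                continue
            for token, p in next_probs(seq).items():
                candidates.append((logp + math.log(p), seq + [token]))
        if not candidates:
            break
        beams = heapq.nlargest(beam_size, candidates, key=lambda c: c[0])
    finished.extend(b for b in beams if b[1][-1] == eos_id)
    best = max(finished or beams, key=lambda c: c[0])
    return best[1]
```

Here ``next_probs`` stands in for the decoder's next-token distribution given the prefix generated so far.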
Note that this generation technique is only useful for decoder-like generation processes. If you are working on a sequence tagging task, please refer to :ref:`semantic_role_labeling` for more details.
The complete configuration file is ``demo/seqToseq/seqToseq_net.py``.
@@ -331,15 +331,15 @@ For sharing the training data across all the Kubernetes nodes, we use EFS (Elastic
1. Make sure you added the AmazonElasticFileSystemFullAccess policy in your group.
1. Create the Elastic File System in the AWS console, and attach the new VPC to it.
<center>![](src/create_efs.png)</center>
1. Modify the Kubernetes security group under ec2/Security Groups: add an additional inbound policy "All TCP TCP 0 - 65535 0.0.0.0/0" to the Kubernetes default VPC security group.
<center>![](src/add_security_group.png)</center>
1. Follow the EC2 mount instructions to mount the disk onto all the Kubernetes nodes; we recommend mounting the EFS disk onto ~/efs.
<center>![](src/efs_mount.png)</center>
Before starting the training, you should place your user config and divided training data onto EFS. When the training starts, each task will copy the related files from EFS into its container, and will also write the training results back onto EFS. We will show you how to place the data later in this article.
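As one hedged sketch of what "divided training data" can mean here (the directory layout and helper below are illustrative, not part of the tutorial): copy the data files round-robin into one sub-directory per trainer under the EFS mount point.

```python
import os
import shutil

def shard_files(data_files, efs_root, num_trainers):
    """Copy data files round-robin into trainer_0 ... trainer_{n-1}
    directories, so each trainer task reads its own shard from EFS."""
    for i in range(num_trainers):
        os.makedirs(os.path.join(efs_root, "trainer_%d" % i), exist_ok=True)
    for index, path in enumerate(sorted(data_files)):
        dest = os.path.join(efs_root, "trainer_%d" % (index % num_trainers))
        shutil.copy(path, dest)
```

Each task can then copy only its own `trainer_<id>` directory from EFS into the container at startup.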
doc/tutorials/gan/gan.png (32.5 KB → 17.4 KB)
@@ -4,9 +4,7 @@ This demo implements GAN training described in the original [GAN paper](https://
The high-level structure of GAN is shown in Figure 1 below. It is composed of two major parts: a generator and a discriminator, both of which are based on neural networks. The generator takes in some kind of noise with a known distribution and transforms it into an image. The discriminator takes in an image and determines whether it is artificially generated by the generator or a real image. So the generator and the discriminator play a competitive game, in which the generator tries to generate images that look as real as possible to fool the discriminator, while the discriminator tries to distinguish between real and fake images.
<center>![](./gan.png)</center>
<p align="center">
Figure 1. GAN-Model-Structure
<a href="https://ishmaelbelghazi.github.io/ALI/">figure credit</a>
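The competitive game described above is usually written as a pair of losses. A minimal sketch, where `d_real`/`d_fake` stand for the discriminator's probability outputs on a real image and on a generated one (the function names and values here are illustrative, not from `gan_trainer.py`):

```python
import math

def discriminator_loss(d_real, d_fake):
    # The discriminator maximizes log D(x) + log(1 - D(G(z)));
    # written as a loss to minimize, that objective is negated.
    return -(math.log(d_real) + math.log(1.0 - d_fake))

def generator_loss(d_fake):
    # Non-saturating generator loss: minimize -log D(G(z)), i.e. push
    # the discriminator's output on fake images toward 1 ("real").
    return -math.log(d_fake)
```

Training alternates between the two: a discriminator update lowers `discriminator_loss`, then a generator update lowers `generator_loss`.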
@@ -111,9 +109,7 @@ $python gan_trainer.py -d uniform --useGpu 1
```
The generated samples can be found in ./uniform_samples/ and one example is shown below as Figure 2. One can see that it roughly recovers the 2D uniform distribution.
<center>![](./uniform_sample.png)</center>
<p align="center">
Figure 2. Uniform Sample
</p>
@@ -135,9 +131,7 @@ To train the GAN model on mnist data, one can use the following command:
$python gan_trainer.py -d mnist --useGpu 1
```
The generated sample images can be found at ./mnist_samples/ and one example is shown below as Figure 3.
<center>![](./mnist_sample.png)</center>
<p align="center">
Figure 3. MNIST Sample
</p>
doc/tutorials/gan/uniform_sample.png (20.1 KB → 24.3 KB)