diff --git a/.gitignore b/.gitignore
index b68535bac52a9557bb11e0e329081fc2321edc32..61a80a88edb71e9ba4192f84ab7821ba139bb9ce 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,9 +1,6 @@
-.DS_Store
+*.DS_Store
+*.vs
+*.user
*.pyc
-.*~
-fluid/neural_machine_translation/transformer/deps
-fluid/neural_machine_translation/transformer/train.data
-fluid/neural_machine_translation/transformer/train.pkl
-fluid/neural_machine_translation/transformer/train.sh
-fluid/neural_machine_translation/transformer/train.tok.clean.bpe.32000.en-de
-fluid/neural_machine_translation/transformer/vocab.bpe.32000.refined
+*~
+*.vscode
diff --git a/.gitmodules b/.gitmodules
index 4280afc141eb2884c3673e3a59a3e11c3aece4bd..100d42ff0fcc9c935dea6fd5108ab9171b9018c5 100644
--- a/.gitmodules
+++ b/.gitmodules
@@ -1,9 +1,15 @@
-[submodule "fluid/PaddleNLP/LAC"]
- path = fluid/PaddleNLP/LAC
+[submodule "PaddleNLP/LAC"]
+ path = PaddleNLP/LAC
url = https://github.com/baidu/lac.git
-[submodule "fluid/PaddleNLP/SimNet"]
- path = fluid/PaddleNLP/SimNet
+[submodule "PaddleNLP/SimNet"]
+ path = PaddleNLP/SimNet
url = https://github.com/baidu/AnyQ.git
-[submodule "fluid/PaddleNLP/Senta"]
- path = fluid/PaddleNLP/Senta
+[submodule "PaddleNLP/Senta"]
+ path = PaddleNLP/Senta
url = https://github.com/baidu/Senta.git
+[submodule "PaddleNLP/LARK"]
+ path = PaddleNLP/LARK
+ url = https://github.com/PaddlePaddle/LARK.git
+[submodule "PaddleNLP/knowledge-driven-dialogue"]
+ path = PaddleNLP/knowledge-driven-dialogue
+ url = https://github.com/baidu/knowledge-driven-dialogue
diff --git a/AutoDL/HiNAS_models/README.md b/AutoDL/HiNAS_models/README.md
new file mode 100755
index 0000000000000000000000000000000000000000..9c67736aa30643baf72ce42ed2ca3321d4e22165
--- /dev/null
+++ b/AutoDL/HiNAS_models/README.md
@@ -0,0 +1,76 @@
+# Image Classification Models
+This directory contains six image classification models, all discovered automatically by the Baidu Big Data Lab (BDL) Hierarchical Neural Architecture Search (HiNAS) project; they achieve 96.1% accuracy on the CIFAR-10 dataset. The models fall into two categories: the first three, named HiNAS 0-2, have no skip links, while the last three, named HiNAS 3-5, contain skip links similar to the shortcut connections in ResNet.
+
+---
+## Table of Contents
+- [Installation](#installation)
+- [Data preparation](#data-preparation)
+- [Training a model](#training-a-model)
+- [Model performances](#model-performances)
+
+## Installation
+Running the trainer in this directory requires:
+
+- PaddlePaddle Fluid >= v0.15.0
+- cuDNN >= 6.0
+
+If the PaddlePaddle and cuDNN versions in your runtime environment do not meet these requirements, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) and make an update.
+
+## Data preparation
+
+When you run the sample code for the first time, the trainer will automatically download the CIFAR-10 dataset. Please make sure your environment has an internet connection.
+
+The dataset will be downloaded to `dataset/cifar/cifar-10-python.tar.gz` in the same directory as the trainer. If the automatic download fails, you can download cifar-10-python.tar.gz from https://www.cs.toronto.edu/~kriz/cifar.html and place it at the location mentioned above.
+
+## Training a model
+
+After the environment is ready, you can train the models. There are two entry points: `train_hinas.py` and `train_hinas_res.py`. The former trains models 0-2 (without skip links), and the latter trains models 3-5 (with skip links).
+
+Train models 0~2 (without skip links):
+```
+python train_hinas.py --model=m_id # m_id can be 0, 1 or 2.
+```
+Train models 3~5 (with skip links):
+```
+python train_hinas_res.py --model=m_id # m_id can be 0, 1 or 2.
+```
+
+In addition, both `train_hinas.py` and `train_hinas_res.py` support the following parameters:
+
+- **random_flip_left_right**: Randomly flip images horizontally. (Default: True)
+- **random_flip_up_down**: Randomly flip images vertically. (Default: False)
+- **cutout**: Apply cutout to images. (Default: True)
+- **standardize_image**: Standardize each image. (Default: True)
+- **pad_and_cut_image**: Randomly pad images, then crop back to the original size. (Default: True)
+- **shuffle_image**: Shuffle the order of input images during training. (Default: True)
+- **lr_max**: Learning rate at the beginning of training. (Default: 0.1)
+- **lr_min**: Learning rate at the end of training. (Default: 0.0001)
+- **batch_size**: Training batch size. (Default: 128)
+- **num_epochs**: Total number of training epochs. (Default: 200)
+- **weight_decay**: L2 regularization coefficient. (Default: 0.0004)
+- **momentum**: The momentum coefficient of the momentum optimizer. (Default: 0.9)
+- **dropout_rate**: Dropout rate of the dropout layer. (Default: 0.5)
+- **bn_decay**: The decay/momentum coefficient (also called moving average decay) of the batch norm layers. (Default: 0.9)
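+
+For example, a hypothetical run that overrides a few of these defaults:
+```
+python train_hinas.py --model=0 --batch_size=64 --lr_max=0.05 --num_epochs=100
+```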
+
+
+## Model performances
+
+All six models are trained with the same hyperparameters:
+
+- learning rate: 0.1 -> 0.0001 with cosine annealing
+- total epoch: 200
+- batch size: 128
+- L2 decay: 0.000400
+- optimizer: momentum optimizer with m=0.9 and Nesterov momentum enabled
+- preprocess: random horizontal flip + image standardization + cutout
+
+Below is the accuracy on the CIFAR-10 dataset:
+
+| model | round 1 | round 2 | round 3 | max | avg |
+|----------|---------|---------|---------|--------|--------|
+| HiNAS-0 | 0.9548 | 0.9520 | 0.9513 | 0.9548 | 0.9527 |
+| HiNAS-1 | 0.9452 | 0.9462 | 0.9420 | 0.9462 | 0.9445 |
+| HiNAS-2 | 0.9508 | 0.9506 | 0.9483 | 0.9508 | 0.9499 |
+| HiNAS-3 | 0.9607 | 0.9623 | 0.9601 | 0.9623 | 0.9611 |
+| HiNAS-4 | 0.9611 | 0.9584 | 0.9586 | 0.9611 | 0.9594 |
+| HiNAS-5 | 0.9578 | 0.9588 | 0.9594 | 0.9594 | 0.9586 |
diff --git a/AutoDL/HiNAS_models/README_cn.md b/AutoDL/HiNAS_models/README_cn.md
new file mode 100755
index 0000000000000000000000000000000000000000..8ca3bcbfb8d1ea1a15f969c1a1db22ff2ec854f1
--- /dev/null
+++ b/AutoDL/HiNAS_models/README_cn.md
@@ -0,0 +1,78 @@
+# Image Classification Models
+This directory contains six image classification models, all discovered automatically by the Baidu Big Data Lab Hierarchical Neural Architecture Search (HiNAS) project; they achieve 96.1% accuracy on the CIFAR-10 dataset. The six models fall into two categories: the first three, named HiNAS 0-2, have no skip links, while the last three, named HiNAS 3-5, contain skip links that work like the shortcut connections in ResNet.
+
+---
+## Table of Contents
+- [Installation](#installation)
+- [Data preparation](#data-preparation)
+- [Training a model](#training-a-model)
+- [Model performances](#model-performances)
+
+## Installation
+Minimum requirements:
+
+- PaddlePaddle Fluid >= v0.15.0
+- cuDNN >= 6.0
+
+If your runtime environment does not meet these requirements, you can update PaddlePaddle by following the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html).
+
+## Data preparation
+
+When you train a model for the first time, the trainer will automatically download the CIFAR-10 dataset. Please make sure your environment has an internet connection.
+
+The dataset will be downloaded to `dataset/cifar/cifar-10-python.tar.gz` in the same directory as the trainer. If the automatic download fails, you can download cifar-10-python.tar.gz from https://www.cs.toronto.edu/~kriz/cifar.html yourself and place it at the location above.
+
+
+## Training a model
+Once the environment is ready, you can train a model. There are two entry points, `train_hinas.py` and `train_hinas_res.py`: the former trains models 0-2 (without skip links), and the latter trains models 3-5 (with skip links).
+
+Train models 0~2 (without skip links):
+```
+python train_hinas.py --model=m_id # m_id can be 0, 1 or 2.
+```
+Train models 3~5 (with skip links):
+```
+python train_hinas_res.py --model=m_id # m_id can be 0, 1 or 2.
+```
+
+In addition, both `train_hinas.py` and `train_hinas_res.py` support the following parameters:
+
+Initialization options:
+
+- random_flip_left_right: randomly flip images horizontally (Default: True)
+- random_flip_up_down: randomly flip images vertically (Default: False)
+- cutout: randomly mask out part of each image (Default: True)
+- standardize_image: standardize every pixel of each image (Default: True)
+- pad_and_cut_image: randomly pad images, then crop back to the original size (Default: True)
+- shuffle_image: shuffle the order of input images during training (Default: True)
+- lr_max: learning rate at the beginning of training (Default: 0.1)
+- lr_min: learning rate at the end of training (Default: 0.0001)
+- batch_size: training batch size (Default: 128)
+- num_epochs: total number of training epochs (Default: 200)
+- weight_decay: L2 regularization coefficient during training (Default: 0.0004)
+- momentum: momentum coefficient of the momentum optimizer (Default: 0.9)
+- dropout_rate: dropout rate of the dropout layer (Default: 0.5)
+- bn_decay: decay/momentum coefficient (i.e. moving average decay) of the batch norm layers (Default: 0.9)
+
+
+
+## Model performances
+All six models are trained with the same hyperparameters:
+
+- learning rate: 0.1 -> 0.0001 with cosine annealing
+- total epoch: 200
+- batch size: 128
+- L2 decay: 0.000400
+- optimizer: momentum optimizer with m=0.9 and Nesterov momentum enabled
+- preprocess: random horizontal flip + image standardization + cutout
+
+Below is the accuracy of the six models on the CIFAR-10 dataset:
+
+| model | round 1 | round 2 | round 3 | max | avg |
+|----------|---------|---------|---------|--------|--------|
+| HiNAS-0 | 0.9548 | 0.9520 | 0.9513 | 0.9548 | 0.9527 |
+| HiNAS-1 | 0.9452 | 0.9462 | 0.9420 | 0.9462 | 0.9445 |
+| HiNAS-2 | 0.9508 | 0.9506 | 0.9483 | 0.9508 | 0.9499 |
+| HiNAS-3 | 0.9607 | 0.9623 | 0.9601 | 0.9623 | 0.9611 |
+| HiNAS-4 | 0.9611 | 0.9584 | 0.9586 | 0.9611 | 0.9594 |
+| HiNAS-5 | 0.9578 | 0.9588 | 0.9594 | 0.9594 | 0.9586 |
diff --git a/fluid/DeepASR/data_utils/__init__.py b/AutoDL/HiNAS_models/build/__init__.py
old mode 100644
new mode 100755
similarity index 100%
rename from fluid/DeepASR/data_utils/__init__.py
rename to AutoDL/HiNAS_models/build/__init__.py
diff --git a/fluid/PaddleCV/HiNAS_models/build/layers.py b/AutoDL/HiNAS_models/build/layers.py
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/build/layers.py
rename to AutoDL/HiNAS_models/build/layers.py
diff --git a/fluid/PaddleCV/HiNAS_models/build/ops.py b/AutoDL/HiNAS_models/build/ops.py
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/build/ops.py
rename to AutoDL/HiNAS_models/build/ops.py
diff --git a/fluid/PaddleCV/HiNAS_models/build/resnet_base.py b/AutoDL/HiNAS_models/build/resnet_base.py
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/build/resnet_base.py
rename to AutoDL/HiNAS_models/build/resnet_base.py
diff --git a/fluid/PaddleCV/HiNAS_models/build/vgg_base.py b/AutoDL/HiNAS_models/build/vgg_base.py
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/build/vgg_base.py
rename to AutoDL/HiNAS_models/build/vgg_base.py
diff --git a/AutoDL/HiNAS_models/nn_paddle.py b/AutoDL/HiNAS_models/nn_paddle.py
new file mode 100755
index 0000000000000000000000000000000000000000..d3a3ddd60cf3e5e114de322f3eea763e5a2e6018
--- /dev/null
+++ b/AutoDL/HiNAS_models/nn_paddle.py
@@ -0,0 +1,139 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import math
+
+import numpy as np
+import paddle
+import paddle.fluid as fluid
+from paddle.fluid.contrib.trainer import *
+from paddle.fluid.layers.learning_rate_scheduler import _decay_step_counter
+import reader
+
+from absl import flags
+
+# import preprocess
+
+FLAGS = flags.FLAGS
+
+flags.DEFINE_float("lr_max", 0.1, "initial learning rate")
+flags.DEFINE_float("lr_min", 0.0001, "limiting learning rate")
+
+flags.DEFINE_integer("batch_size", 128, "batch size")
+flags.DEFINE_integer("num_epochs", 200, "total epochs to train")
+flags.DEFINE_float("weight_decay", 0.0004, "weight decay")
+
+flags.DEFINE_float("momentum", 0.9, "momentum")
+
+flags.DEFINE_boolean("shuffle_image", True, "shuffle input images on training")
+
+dataset_train_size = 50000
+
+
+class Model(object):
+ def __init__(self, build_fn, tokens):
+ print("learning rate: %f -> %f, cosine annealing" %
+ (FLAGS.lr_max, FLAGS.lr_min))
+ print("epoch: %d" % FLAGS.num_epochs)
+ print("batch size: %d" % FLAGS.batch_size)
+ print("L2 decay: %f" % FLAGS.weight_decay)
+
+ self.max_step = dataset_train_size * FLAGS.num_epochs // FLAGS.batch_size
+
+ self.build_fn = build_fn
+ self.tokens = tokens
+ print("Token is %s" % ",".join(map(str, tokens)))
+
+ def cosine_annealing(self):
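+        """Cosine-annealed learning rate.
+
+        lr(step) = lr_min + (lr_max - lr_min) / 2 * (1 + cos(pi * step / max_step)),
+        decaying smoothly from lr_max at step 0 to lr_min at max_step.
+        """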
+ step = _decay_step_counter()
+ lr = FLAGS.lr_min + (FLAGS.lr_max - FLAGS.lr_min) / 2 \
+ * (1.0 + fluid.layers.ops.cos(step / self.max_step * math.pi))
+ return lr
+
+ def optimizer_program(self):
+ return fluid.optimizer.Momentum(
+ learning_rate=self.cosine_annealing(),
+ momentum=FLAGS.momentum,
+ use_nesterov=True,
+ regularization=fluid.regularizer.L2DecayRegularizer(
+ FLAGS.weight_decay))
+
+ def inference_network(self):
+ images = fluid.layers.data(
+ name='pixel', shape=[3, 32, 32], dtype='float32')
+ return self.build_fn(images, self.tokens)
+
+ def train_network(self):
+ predict = self.inference_network()
+ label = fluid.layers.data(name='label', shape=[1], dtype='int64')
+ cost = fluid.layers.cross_entropy(input=predict, label=label)
+ avg_cost = fluid.layers.mean(cost)
+ accuracy = fluid.layers.accuracy(input=predict, label=label)
+ # self.parameters = fluid.parameters.create(avg_cost)
+ return [avg_cost, accuracy]
+
+ def run(self):
+ train_files = reader.train10()
+ test_files = reader.test10()
+
+ if FLAGS.shuffle_image:
+ train_reader = paddle.batch(
+ paddle.reader.shuffle(train_files, dataset_train_size),
+ batch_size=FLAGS.batch_size)
+ else:
+ train_reader = paddle.batch(
+ train_files, batch_size=FLAGS.batch_size)
+
+ test_reader = paddle.batch(test_files, batch_size=FLAGS.batch_size)
+
+ costs = []
+ accs = []
+
+ def event_handler(event):
+ if isinstance(event, EndStepEvent):
+ costs.append(event.metrics[0])
+ accs.append(event.metrics[1])
+ if event.step % 20 == 0:
+ print("Epoch %d, Step %d, Loss %f, Acc %f" % (
+ event.epoch, event.step, np.mean(costs), np.mean(accs)))
+ del costs[:]
+ del accs[:]
+
+ if isinstance(event, EndEpochEvent):
+ if event.epoch % 3 == 0 or event.epoch == FLAGS.num_epochs - 1:
+ avg_cost, accuracy = trainer.test(
+ reader=test_reader, feed_order=['pixel', 'label'])
+
+ event_handler.best_acc = max(event_handler.best_acc,
+ accuracy)
+ print("Test with epoch %d, Loss %f, Acc %f" %
+ (event.epoch, avg_cost, accuracy))
+ print("Best acc %f" % event_handler.best_acc)
+
+ event_handler.best_acc = 0.0
+ place = fluid.CUDAPlace(0)
+ trainer = Trainer(
+ train_func=self.train_network,
+ optimizer_func=self.optimizer_program,
+ place=place)
+
+ trainer.train(
+ reader=train_reader,
+ num_epochs=FLAGS.num_epochs,
+ event_handler=event_handler,
+ feed_order=['pixel', 'label'])
diff --git a/fluid/PaddleCV/HiNAS_models/reader.py b/AutoDL/HiNAS_models/reader.py
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/reader.py
rename to AutoDL/HiNAS_models/reader.py
diff --git a/fluid/PaddleCV/HiNAS_models/tokens/15113.pkl b/AutoDL/HiNAS_models/tokens/15113.pkl
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/tokens/15113.pkl
rename to AutoDL/HiNAS_models/tokens/15113.pkl
diff --git a/fluid/PaddleCV/HiNAS_models/tokens/15383.pkl b/AutoDL/HiNAS_models/tokens/15383.pkl
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/tokens/15383.pkl
rename to AutoDL/HiNAS_models/tokens/15383.pkl
diff --git a/fluid/PaddleCV/HiNAS_models/tokens/15613.pkl b/AutoDL/HiNAS_models/tokens/15613.pkl
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/tokens/15613.pkl
rename to AutoDL/HiNAS_models/tokens/15613.pkl
diff --git a/fluid/PaddleCV/HiNAS_models/tokens/17754.pkl b/AutoDL/HiNAS_models/tokens/17754.pkl
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/tokens/17754.pkl
rename to AutoDL/HiNAS_models/tokens/17754.pkl
diff --git a/fluid/PaddleCV/HiNAS_models/tokens/17925.pkl b/AutoDL/HiNAS_models/tokens/17925.pkl
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/tokens/17925.pkl
rename to AutoDL/HiNAS_models/tokens/17925.pkl
diff --git a/fluid/PaddleCV/HiNAS_models/tokens/18089.pkl b/AutoDL/HiNAS_models/tokens/18089.pkl
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/tokens/18089.pkl
rename to AutoDL/HiNAS_models/tokens/18089.pkl
diff --git a/fluid/PaddleCV/HiNAS_models/train_hinas.py b/AutoDL/HiNAS_models/train_hinas.py
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/train_hinas.py
rename to AutoDL/HiNAS_models/train_hinas.py
diff --git a/fluid/PaddleCV/HiNAS_models/train_hinas_res.py b/AutoDL/HiNAS_models/train_hinas_res.py
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/train_hinas_res.py
rename to AutoDL/HiNAS_models/train_hinas_res.py
diff --git a/AutoDL/LRC/README.md b/AutoDL/LRC/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..df9af47d4a3876371673cbbfef0ad2553768b9a5
--- /dev/null
+++ b/AutoDL/LRC/README.md
@@ -0,0 +1,74 @@
+# LRC: Local Rademacher Complexity Regularization
+Regularizing Deep Neural Networks (DNNs) to improve their generalization capability is important and challenging. This directory contains an image classification model based on a novel regularizer rooted in Local Rademacher Complexity (LRC). We appreciate the contribution of [DARTS](https://arxiv.org/abs/1806.09055) to our research. The model combines LRC regularization with DARTS and is evaluated on the CIFAR-10 dataset. Code accompanying the paper
+> [An Empirical Study on Regularization of Deep Neural Networks by Local Rademacher Complexity](https://arxiv.org/abs/1902.00873)\
+> Yingzhen Yang, Xingjian Li, Jun Huan.\
+> _arXiv:1902.00873_.
+
+---
+# Table of Contents
+
+- [Installation](#installation)
+- [Data preparation](#data-preparation)
+- [Training](#training)
+
+## Installation
+
+Running the sample code in this directory requires PaddlePaddle Fluid v1.2.0 or later. If the PaddlePaddle version on your device is lower than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/documentation/docs/zh/1.2/beginners_guide/install/index_cn.html#paddlepaddle) and make an update.
+
+## Data preparation
+
+When using the CIFAR-10 dataset for the first time, you can download it as follows:
+
+ sh ./dataset/download.sh
+
+Please make sure your environment has an internet connection.
+
+The dataset will be downloaded to `dataset/cifar/cifar-10-batches-py` in the same directory as the `train.py`. If automatic download fails, you can download cifar-10-python.tar.gz from https://www.cs.toronto.edu/~kriz/cifar.html and decompress it to the location mentioned above.
+
+
+## Training
+
+After data preparation, you can start training with:
+
+ python -u train_mixup.py \
+ --batch_size=80 \
+ --auxiliary \
+ --weight_decay=0.0003 \
+ --learning_rate=0.025 \
+ --lrc_loss_lambda=0.7 \
+ --cutout
+- Set ```export CUDA_VISIBLE_DEVICES=0``` to specify one GPU for training.
+- For more help on arguments:
+
+ python train_mixup.py --help
+
+**data reader introduction:**
+
+* The data reader is defined in `reader.py`.
+* Images are reshaped to 32 * 32.
+* In the training stage, images are padded to 40 * 40 and then randomly cropped back to the original size.
+* In the training stage, images are randomly flipped horizontally.
+* Images are scaled to [0, 1] and standardized with the per-channel CIFAR mean and std.
+* In the training stage, cutout is applied to images at random.
+* The order of the input images is shuffled during training.
+
+**model configuration:**
+
+* Use auxiliary loss and auxiliary\_weight=0.4.
+* Use dropout and drop\_path\_prob=0.2.
+* Set lrc\_loss\_lambda=0.7.
+
+**training strategy:**
+
+* Use momentum optimizer with momentum=0.9.
+* Weight decay is 0.0003.
+* Use cosine decay with init\_lr=0.025.
+* Total epoch is 600.
+* Use a Xavier initializer for conv2d weights, a Constant initializer for batch norm weights and a Normal initializer for fc weights.
+* Initialize batch norm and fc biases to zero, and do not add bias to conv2d.
+
+
+## Reference
+
+ - DARTS: Differentiable Architecture Search [`paper`](https://arxiv.org/abs/1806.09055)
+ - Differentiable architecture search in PyTorch [`code`](https://github.com/quark0/darts)
diff --git a/AutoDL/LRC/README_cn.md b/AutoDL/LRC/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..06dc937074de199af31db97ee200e7690443b1b0
--- /dev/null
+++ b/AutoDL/LRC/README_cn.md
@@ -0,0 +1,71 @@
+# LRC: Local Rademacher Complexity Regularization
+Choosing a regularizer that improves the generalization ability of deep neural networks is important and challenging. This directory contains an image classification model with a novel regularizer based on Local Rademacher Complexity (LRC). We are very grateful to [DARTS](https://arxiv.org/abs/1806.09055) for its help with this research. The model combines the LRC regularizer with the DARTS network and achieves excellent results on the CIFAR-10 dataset. Code released together with the paper
+> [An Empirical Study on Regularization of Deep Neural Networks by Local Rademacher Complexity](https://arxiv.org/abs/1902.00873)\
+> Yingzhen Yang, Xingjian Li, Jun Huan.\
+> _arXiv:1902.00873_.
+
+---
+# Table of Contents
+
+- [Installation](#installation)
+- [Data preparation](#data-preparation)
+- [Training](#training)
+
+## Installation
+
+Running the sample code in this directory requires PaddlePaddle Fluid v1.2.0 or later. If the PaddlePaddle version in your runtime environment is lower than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/documentation/docs/zh/1.2/beginners_guide/install/index_cn.html#paddlepaddle) to update PaddlePaddle.
+
+## Data preparation
+
+When using the CIFAR-10 dataset for the first time, you can download it with the following command:
+
+ sh ./dataset/download.sh
+
+Please make sure your environment has an internet connection. The data will be downloaded to `dataset/cifar/cifar-10-batches-py` in the same directory as `train.py`. If the download fails, you can download cifar-10-python.tar.gz from https://www.cs.toronto.edu/~kriz/cifar.html yourself and decompress it to the location above.
+
+## Training
+
+After the data is ready, you can start training with the following command:
+
+ python -u train_mixup.py \
+ --batch_size=80 \
+ --auxiliary \
+ --weight_decay=0.0003 \
+ --learning_rate=0.025 \
+ --lrc_loss_lambda=0.7 \
+ --cutout
+- Set ```export CUDA_VISIBLE_DEVICES=0``` to train on a single GPU.
+- For optional arguments, see:
+
+ python train_mixup.py --help
+
+**Data reader notes:**
+
+* The data reader is defined in `reader.py`.
+* Input images are resized to 32 * 32.
+* During training, images are padded to 40 * 40 and then randomly cropped back to the original input size.
+* During training, images are randomly flipped horizontally.
+* Images are scaled to [0, 1] and standardized with the per-channel CIFAR mean and std.
+* During training, cutout is applied to images at random.
+* During training, the order of the input images is shuffled.
+
+**Model configuration:**
+
+* Use an auxiliary loss with auxiliary weight 0.4.
+* Use dropout with drop rate 0.2.
+* Set lrc\_loss\_lambda to 0.7.
+
+**Training strategy:**
+
+* Train with the momentum optimizer, momentum=0.9.
+* The weight decay coefficient is 0.0003.
+* Use cosine learning rate decay with an initial learning rate of 0.025.
+* Train for 600 epochs in total.
+* Use a Xavier initializer for conv weights, a constant initializer for batch norm weights and a Gaussian initializer for fully connected weights.
+* Initialize batch norm and fully connected biases to a zero constant, and do not add bias to conv layers.
+
+
+## Reference
+
+  - DARTS: Differentiable Architecture Search [`paper`](https://arxiv.org/abs/1806.09055)
+  - Differentiable architecture search in PyTorch [`code`](https://github.com/quark0/darts)
diff --git a/AutoDL/LRC/dataset/download.sh b/AutoDL/LRC/dataset/download.sh
new file mode 100644
index 0000000000000000000000000000000000000000..0981c3b6878421f80d392f314fd0ae836644a63c
--- /dev/null
+++ b/AutoDL/LRC/dataset/download.sh
@@ -0,0 +1,10 @@
+DIR="$( cd "$(dirname "$0")" ; pwd -P )"
+cd "$DIR"
+mkdir -p cifar
+cd cifar
+# Download the data.
+echo "Downloading..."
+wget https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz
+# Extract the data.
+echo "Extracting..."
+tar zvxf cifar-10-python.tar.gz
diff --git a/AutoDL/LRC/genotypes.py b/AutoDL/LRC/genotypes.py
new file mode 100644
index 0000000000000000000000000000000000000000..349fbd2478a7c2d1bb4cc3dd901b470de3c8b906
--- /dev/null
+++ b/AutoDL/LRC/genotypes.py
@@ -0,0 +1,116 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# Based on:
+# --------------------------------------------------------
+# DARTS
+# Copyright (c) 2018, Hanxiao Liu.
+# Licensed under the Apache License, Version 2.0;
+# --------------------------------------------------------
+
+from collections import namedtuple
+
+Genotype = namedtuple('Genotype', 'normal normal_concat reduce reduce_concat')
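+# Each cell is encoded as a list of (op_name, input_index) pairs, two per
+# intermediate node; input_index selects among the two cell inputs and the
+# previously built nodes. The *_concat fields list which node outputs are
+# concatenated to form the cell output.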
+
+PRIMITIVES = [
+ 'none', 'max_pool_3x3', 'avg_pool_3x3', 'skip_connect', 'sep_conv_3x3',
+ 'sep_conv_5x5', 'dil_conv_3x3', 'dil_conv_5x5'
+]
+
+NASNet = Genotype(
+ normal=[
+ ('sep_conv_5x5', 1),
+ ('sep_conv_3x3', 0),
+ ('sep_conv_5x5', 0),
+ ('sep_conv_3x3', 0),
+ ('avg_pool_3x3', 1),
+ ('skip_connect', 0),
+ ('avg_pool_3x3', 0),
+ ('avg_pool_3x3', 0),
+ ('sep_conv_3x3', 1),
+ ('skip_connect', 1),
+ ],
+ normal_concat=[2, 3, 4, 5, 6],
+ reduce=[
+ ('sep_conv_5x5', 1),
+ ('sep_conv_7x7', 0),
+ ('max_pool_3x3', 1),
+ ('sep_conv_7x7', 0),
+ ('avg_pool_3x3', 1),
+ ('sep_conv_5x5', 0),
+ ('skip_connect', 3),
+ ('avg_pool_3x3', 2),
+ ('sep_conv_3x3', 2),
+ ('max_pool_3x3', 1),
+ ],
+ reduce_concat=[4, 5, 6], )
+
+AmoebaNet = Genotype(
+ normal=[
+ ('avg_pool_3x3', 0),
+ ('max_pool_3x3', 1),
+ ('sep_conv_3x3', 0),
+ ('sep_conv_5x5', 2),
+ ('sep_conv_3x3', 0),
+ ('avg_pool_3x3', 3),
+ ('sep_conv_3x3', 1),
+ ('skip_connect', 1),
+ ('skip_connect', 0),
+ ('avg_pool_3x3', 1),
+ ],
+ normal_concat=[4, 5, 6],
+ reduce=[
+ ('avg_pool_3x3', 0),
+ ('sep_conv_3x3', 1),
+ ('max_pool_3x3', 0),
+ ('sep_conv_7x7', 2),
+ ('sep_conv_7x7', 0),
+ ('avg_pool_3x3', 1),
+ ('max_pool_3x3', 0),
+ ('max_pool_3x3', 1),
+ ('conv_7x1_1x7', 0),
+ ('sep_conv_3x3', 5),
+ ],
+ reduce_concat=[3, 4, 6])
+
+DARTS_V1 = Genotype(
+ normal=[('sep_conv_3x3', 1), ('sep_conv_3x3', 0), ('skip_connect', 0),
+ ('sep_conv_3x3', 1), ('skip_connect', 0), ('sep_conv_3x3', 1),
+ ('sep_conv_3x3', 0), ('skip_connect', 2)],
+ normal_concat=[2, 3, 4, 5],
+ reduce=[('max_pool_3x3', 0), ('max_pool_3x3', 1), ('skip_connect', 2),
+ ('max_pool_3x3', 0), ('max_pool_3x3', 0), ('skip_connect', 2),
+ ('skip_connect', 2), ('avg_pool_3x3', 0)],
+ reduce_concat=[2, 3, 4, 5])
+DARTS_V2 = Genotype(
+ normal=[('sep_conv_3x3', 0), ('sep_conv_3x3', 1), ('sep_conv_3x3', 0),
+ ('sep_conv_3x3', 1), ('sep_conv_3x3', 1), ('skip_connect', 0),
+ ('skip_connect', 0), ('dil_conv_3x3', 2)],
+ normal_concat=[2, 3, 4, 5],
+ reduce=[('max_pool_3x3', 0), ('max_pool_3x3', 1), ('skip_connect', 2),
+ ('max_pool_3x3', 1), ('max_pool_3x3', 0), ('skip_connect', 2),
+ ('skip_connect', 2), ('max_pool_3x3', 1)],
+ reduce_concat=[2, 3, 4, 5])
+
+MY_DARTS = Genotype(
+ normal=[('sep_conv_3x3', 0), ('skip_connect', 1), ('skip_connect', 0),
+ ('dil_conv_5x5', 1), ('skip_connect', 0), ('sep_conv_3x3', 1),
+ ('skip_connect', 0), ('sep_conv_3x3', 1)],
+ normal_concat=range(2, 6),
+ reduce=[('max_pool_3x3', 0), ('max_pool_3x3', 1), ('max_pool_3x3', 0),
+ ('skip_connect', 2), ('max_pool_3x3', 0), ('skip_connect', 2),
+ ('skip_connect', 2), ('skip_connect', 3)],
+ reduce_concat=range(2, 6))
+
+DARTS = MY_DARTS
diff --git a/AutoDL/LRC/learning_rate.py b/AutoDL/LRC/learning_rate.py
new file mode 100644
index 0000000000000000000000000000000000000000..3965171b487884d36e4a7447f10f312204803bf8
--- /dev/null
+++ b/AutoDL/LRC/learning_rate.py
@@ -0,0 +1,43 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# Based on:
+# --------------------------------------------------------
+# DARTS
+# Copyright (c) 2018, Hanxiao Liu.
+# Licensed under the Apache License, Version 2.0;
+# --------------------------------------------------------
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import paddle.fluid.layers.ops as ops
+from paddle.fluid.layers.learning_rate_scheduler import _decay_step_counter
+import math
+from paddle.fluid.initializer import init_on_cpu
+
+
+def cosine_decay(learning_rate, num_epoch, steps_one_epoch):
+ """Applies cosine decay to the learning rate.
+ lr = 0.5 * (math.cos(epoch * (math.pi / 120)) + 1)
+ """
+ global_step = _decay_step_counter()
+
+ with init_on_cpu():
+ decayed_lr = learning_rate * \
+ (ops.cos((global_step / steps_one_epoch) \
+ * math.pi / num_epoch) + 1)/2
+ return decayed_lr
diff --git a/AutoDL/LRC/model.py b/AutoDL/LRC/model.py
new file mode 100644
index 0000000000000000000000000000000000000000..45a403495ecc0b7cc0ac3b541d75702adbef31b2
--- /dev/null
+++ b/AutoDL/LRC/model.py
@@ -0,0 +1,313 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+#
+# Based on:
+# --------------------------------------------------------
+# DARTS
+# Copyright (c) 2018, Hanxiao Liu.
+# Licensed under the Apache License, Version 2.0;
+# --------------------------------------------------------
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import sys
+import numpy as np
+import time
+import functools
+import paddle
+import paddle.fluid as fluid
+from operations import *
+
+
+class Cell(object):
+ def __init__(self, genotype, C_prev_prev, C_prev, C, reduction,
+ reduction_prev):
+ print(C_prev_prev, C_prev, C)
+
+ if reduction_prev:
+ self.preprocess0 = functools.partial(FactorizedReduce, C_out=C)
+ else:
+ self.preprocess0 = functools.partial(
+ ReLUConvBN, C_out=C, kernel_size=1, stride=1, padding=0)
+ self.preprocess1 = functools.partial(
+ ReLUConvBN, C_out=C, kernel_size=1, stride=1, padding=0)
+ if reduction:
+ op_names, indices = zip(*genotype.reduce)
+ concat = genotype.reduce_concat
+ else:
+ op_names, indices = zip(*genotype.normal)
+ concat = genotype.normal_concat
+ print(op_names, indices, concat, reduction)
+ self._compile(C, op_names, indices, concat, reduction)
+
+ def _compile(self, C, op_names, indices, concat, reduction):
+ assert len(op_names) == len(indices)
+ self._steps = len(op_names) // 2
+ self._concat = concat
+ self.multiplier = len(concat)
+
+ self._ops = []
+ for name, index in zip(op_names, indices):
+ stride = 2 if reduction and index < 2 else 1
+ op = functools.partial(OPS[name], C=C, stride=stride, affine=True)
+ self._ops += [op]
+ self._indices = indices
+
+ def forward(self, s0, s1, drop_prob, is_train, name):
+ self.training = is_train
+ preprocess0_name = name + 'preprocess0.'
+ preprocess1_name = name + 'preprocess1.'
+ s0 = self.preprocess0(s0, name=preprocess0_name)
+ s1 = self.preprocess1(s1, name=preprocess1_name)
+ out = [s0, s1]
+ for i in range(self._steps):
+ h1 = out[self._indices[2 * i]]
+ h2 = out[self._indices[2 * i + 1]]
+ op1 = self._ops[2 * i]
+ op2 = self._ops[2 * i + 1]
+ h3 = op1(h1, name=name + '_ops.' + str(2 * i) + '.')
+ h4 = op2(h2, name=name + '_ops.' + str(2 * i + 1) + '.')
+ if self.training and drop_prob > 0.:
+ if h3 != h1:
+ h3 = fluid.layers.dropout(
+ h3,
+ drop_prob,
+ dropout_implementation='upscale_in_train')
+ if h4 != h2:
+ h4 = fluid.layers.dropout(
+ h4,
+ drop_prob,
+ dropout_implementation='upscale_in_train')
+ s = h3 + h4
+ out += [s]
+ return fluid.layers.concat([out[i] for i in self._concat], axis=1)
+
+
+def AuxiliaryHeadCIFAR(input, num_classes, aux_name='auxiliary_head'):
+ relu_a = fluid.layers.relu(input)
+ pool_a = fluid.layers.pool2d(relu_a, 5, 'avg', 3)
+ conv2d_a = fluid.layers.conv2d(
+ pool_a,
+ 128,
+ 1,
+ name=aux_name + '.features.2',
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=aux_name + '.features.2.weight'),
+ bias_attr=False)
+ bn_a_name = aux_name + '.features.3'
+ bn_a = fluid.layers.batch_norm(
+ conv2d_a,
+ act='relu',
+ name=bn_a_name,
+ param_attr=ParamAttr(
+ initializer=Constant(1.), name=bn_a_name + '.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.), name=bn_a_name + '.bias'),
+ moving_mean_name=bn_a_name + '.running_mean',
+ moving_variance_name=bn_a_name + '.running_var')
+ conv2d_b = fluid.layers.conv2d(
+ bn_a,
+ 768,
+ 2,
+ name=aux_name + '.features.5',
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=aux_name + '.features.5.weight'),
+ bias_attr=False)
+ bn_b_name = aux_name + '.features.6'
+ bn_b = fluid.layers.batch_norm(
+ conv2d_b,
+ act='relu',
+ name=bn_b_name,
+ param_attr=ParamAttr(
+ initializer=Constant(1.), name=bn_b_name + '.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.), name=bn_b_name + '.bias'),
+ moving_mean_name=bn_b_name + '.running_mean',
+ moving_variance_name=bn_b_name + '.running_var')
+ fc_name = aux_name + '.classifier'
+ fc = fluid.layers.fc(bn_b,
+ num_classes,
+ name=fc_name,
+ param_attr=ParamAttr(
+ initializer=Normal(scale=1e-3),
+ name=fc_name + '.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.), name=fc_name + '.bias'))
+ return fc
+
+
+def StemConv(input, C_out, kernel_size, padding):
+ conv_a = fluid.layers.conv2d(
+ input,
+ C_out,
+ kernel_size,
+ padding=padding,
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0), name='stem.0.weight'),
+ bias_attr=False)
+ bn_a = fluid.layers.batch_norm(
+ conv_a,
+ param_attr=ParamAttr(
+ initializer=Constant(1.), name='stem.1.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.), name='stem.1.bias'),
+ moving_mean_name='stem.1.running_mean',
+ moving_variance_name='stem.1.running_var')
+ return bn_a
+
+
+class NetworkCIFAR(object):
+ def __init__(self, C, class_num, layers, auxiliary, genotype):
+ self.class_num = class_num
+ self._layers = layers
+ self._auxiliary = auxiliary
+
+ stem_multiplier = 3
+ self.drop_path_prob = 0
+ C_curr = stem_multiplier * C
+
+ C_prev_prev, C_prev, C_curr = C_curr, C_curr, C
+ self.cells = []
+ reduction_prev = False
+ for i in range(layers):
+ if i in [layers // 3, 2 * layers // 3]:
+ C_curr *= 2
+ reduction = True
+ else:
+ reduction = False
+ cell = Cell(genotype, C_prev_prev, C_prev, C_curr, reduction,
+ reduction_prev)
+ reduction_prev = reduction
+ self.cells += [cell]
+ C_prev_prev, C_prev = C_prev, cell.multiplier * C_curr
+ if i == 2 * layers // 3:
+ C_to_auxiliary = C_prev
+
+ def forward(self, init_channel, is_train):
+ self.training = is_train
+ self.logits_aux = None
+ num_channel = init_channel * 3
+ s0 = StemConv(self.image, num_channel, kernel_size=3, padding=1)
+ s1 = s0
+ for i, cell in enumerate(self.cells):
+ name = 'cells.' + str(i) + '.'
+ s0, s1 = s1, cell.forward(s0, s1, self.drop_path_prob, is_train,
+ name)
+ if i == int(2 * self._layers // 3):
+ if self._auxiliary and self.training:
+ self.logits_aux = AuxiliaryHeadCIFAR(s1, self.class_num)
+ out = fluid.layers.adaptive_pool2d(s1, (1, 1), "avg")
+ self.logits = fluid.layers.fc(out,
+ size=self.class_num,
+ param_attr=ParamAttr(
+ initializer=Normal(scale=1e-3),
+ name='classifier.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.),
+ name='classifier.bias'))
+ return self.logits, self.logits_aux
+
+ def build_input(self, image_shape, batch_size, is_train):
+ if is_train:
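+            # Training-side fields, in read_file order: image, mixup labels
+            # y_a and y_b, the mixup coefficient lam, flattened label and
+            # non-label indices for the LRC loss, and the Rademacher
+            # variables rad_var.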
+ py_reader = fluid.layers.py_reader(
+ capacity=64,
+ shapes=[[-1] + image_shape, [-1, 1], [-1, 1], [-1, 1], [-1, 1],
+ [-1, 1], [-1, batch_size, self.class_num - 1]],
+ lod_levels=[0, 0, 0, 0, 0, 0, 0],
+ dtypes=[
+ "float32", "int64", "int64", "float32", "int32", "int32",
+ "float32"
+ ],
+ use_double_buffer=True,
+ name='train_reader')
+ else:
+ py_reader = fluid.layers.py_reader(
+ capacity=64,
+ shapes=[[-1] + image_shape, [-1, 1]],
+ lod_levels=[0, 0],
+ dtypes=["float32", "int64"],
+ use_double_buffer=True,
+ name='test_reader')
+ return py_reader
+
+ def train_model(self, py_reader, init_channels, aux, aux_w, batch_size,
+ loss_lambda):
+ self.image, self.ya, self.yb, self.lam, self.label_reshape,\
+ self.non_label_reshape, self.rad_var = fluid.layers.read_file(py_reader)
+ self.logits, self.logits_aux = self.forward(init_channels, True)
+ self.mixup_loss = self.mixup_loss(aux, aux_w)
+ self.lrc_loss = self.lrc_loss(batch_size)
+ return self.mixup_loss + loss_lambda * self.lrc_loss
+
+ def test_model(self, py_reader, init_channels):
+ self.image, self.ya = fluid.layers.read_file(py_reader)
+ self.logits, _ = self.forward(init_channels, False)
+ prob = fluid.layers.softmax(self.logits, use_cudnn=False)
+ loss = fluid.layers.cross_entropy(prob, self.ya)
+ acc_1 = fluid.layers.accuracy(self.logits, self.ya, k=1)
+ acc_5 = fluid.layers.accuracy(self.logits, self.ya, k=5)
+ return loss, acc_1, acc_5
+
+ def mixup_loss(self, auxiliary, auxiliary_weight):
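+        """Mixup cross-entropy: lam * CE(pred, y_a) + (1 - lam) * CE(pred, y_b),
+        optionally adding the same mixup loss computed on the auxiliary head,
+        scaled by auxiliary_weight."""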
+ prob = fluid.layers.softmax(self.logits, use_cudnn=False)
+ loss_a = fluid.layers.cross_entropy(prob, self.ya)
+ loss_b = fluid.layers.cross_entropy(prob, self.yb)
+ loss_a_mean = fluid.layers.reduce_mean(loss_a)
+ loss_b_mean = fluid.layers.reduce_mean(loss_b)
+ loss = self.lam * loss_a_mean + (1 - self.lam) * loss_b_mean
+ if auxiliary:
+ prob_aux = fluid.layers.softmax(self.logits_aux, use_cudnn=False)
+ loss_a_aux = fluid.layers.cross_entropy(prob_aux, self.ya)
+ loss_b_aux = fluid.layers.cross_entropy(prob_aux, self.yb)
+ loss_a_aux_mean = fluid.layers.reduce_mean(loss_a_aux)
+ loss_b_aux_mean = fluid.layers.reduce_mean(loss_b_aux)
+ loss_aux = self.lam * loss_a_aux_mean + (1 - self.lam
+ ) * loss_b_aux_mean
+ return loss + auxiliary_weight * loss_aux
+
+ def lrc_loss(self, batch_size):
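+        """LRC regularizer: for every sample, gathers the logit of the true
+        label and the logits of the other class_num - 1 labels, multiplies
+        their differences by the Rademacher variables rad_var, and averages
+        the normalized absolute sums over the sampled Rademacher draws."""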
+ y_diff_reshape = fluid.layers.reshape(self.logits, shape=(-1, 1))
+ label_reshape = fluid.layers.squeeze(self.label_reshape, axes=[1])
+ non_label_reshape = fluid.layers.squeeze(
+ self.non_label_reshape, axes=[1])
+ label_reshape.stop_gradient = True
+        non_label_reshape.stop_gradient = True
+
+ y_diff_label_reshape = fluid.layers.gather(y_diff_reshape,
+ label_reshape)
+ y_diff_non_label_reshape = fluid.layers.gather(y_diff_reshape,
+ non_label_reshape)
+ y_diff_label = fluid.layers.reshape(
+ y_diff_label_reshape, shape=(-1, batch_size, 1))
+ y_diff_non_label = fluid.layers.reshape(
+ y_diff_non_label_reshape,
+ shape=(-1, batch_size, self.class_num - 1))
+ y_diff_ = y_diff_non_label - y_diff_label
+
+ y_diff_ = fluid.layers.transpose(y_diff_, perm=[1, 2, 0])
+ rad_var_trans = fluid.layers.transpose(self.rad_var, perm=[1, 2, 0])
+ rad_y_diff_trans = rad_var_trans * y_diff_
+ lrc_loss_sum = fluid.layers.reduce_sum(rad_y_diff_trans, dim=[0, 1])
+ lrc_loss_ = fluid.layers.abs(lrc_loss_sum) / (batch_size *
+ (self.class_num - 1))
+ lrc_loss_mean = fluid.layers.reduce_mean(lrc_loss_)
+
+ return lrc_loss_mean
diff --git a/AutoDL/LRC/operations.py b/AutoDL/LRC/operations.py
new file mode 100644
index 0000000000000000000000000000000000000000..b015722a1bc5dbf682c90812a971f3dbb2cd8c9a
--- /dev/null
+++ b/AutoDL/LRC/operations.py
@@ -0,0 +1,349 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+#
+# Based on:
+# --------------------------------------------------------
+# DARTS
+# Copyright (c) 2018, Hanxiao Liu.
+# Licensed under the Apache License, Version 2.0;
+# --------------------------------------------------------
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import sys
+import numpy as np
+import time
+import paddle
+import paddle.fluid as fluid
+from paddle.fluid.param_attr import ParamAttr
+from paddle.fluid.initializer import Xavier
+from paddle.fluid.initializer import Normal
+from paddle.fluid.initializer import Constant
+
+OPS = {
+ 'none' : lambda input, C, stride, name, affine: Zero(input, stride, name),
+ 'avg_pool_3x3' : lambda input, C, stride, name, affine: fluid.layers.pool2d(input, 3, 'avg', pool_stride=stride, pool_padding=1, name=name),
+ 'max_pool_3x3' : lambda input, C, stride, name, affine: fluid.layers.pool2d(input, 3, 'max', pool_stride=stride, pool_padding=1, name=name),
+ 'skip_connect' : lambda input,C, stride, name, affine: Identity(input, name) if stride == 1 else FactorizedReduce(input, C, name=name, affine=affine),
+ 'sep_conv_3x3' : lambda input,C, stride, name, affine: SepConv(input, C, C, 3, stride, 1, name=name, affine=affine),
+ 'sep_conv_5x5' : lambda input,C, stride, name, affine: SepConv(input, C, C, 5, stride, 2, name=name, affine=affine),
+ 'sep_conv_7x7' : lambda input,C, stride, name, affine: SepConv(input, C, C, 7, stride, 3, name=name, affine=affine),
+ 'dil_conv_3x3' : lambda input,C, stride, name, affine: DilConv(input, C, C, 3, stride, 2, 2, name=name, affine=affine),
+ 'dil_conv_5x5' : lambda input,C, stride, name, affine: DilConv(input, C, C, 5, stride, 4, 2, name=name, affine=affine),
+    'conv_7x1_1x7' : lambda input,C, stride, name, affine: SevenConv(input, C, stride, name=name, affine=affine)
+}
+
+
+def ReLUConvBN(input, C_out, kernel_size, stride, padding, name='',
+ affine=True):
+ relu_a = fluid.layers.relu(input)
+ conv2d_a = fluid.layers.conv2d(
+ relu_a,
+ C_out,
+ kernel_size,
+ stride,
+ padding,
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=name + 'op.1.weight'),
+ bias_attr=False)
+ if affine:
+ reluconvbn_out = fluid.layers.batch_norm(
+ conv2d_a,
+ param_attr=ParamAttr(
+ initializer=Constant(1.), name=name + 'op.2.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.), name=name + 'op.2.bias'),
+ moving_mean_name=name + 'op.2.running_mean',
+ moving_variance_name=name + 'op.2.running_var')
+ else:
+ reluconvbn_out = fluid.layers.batch_norm(
+ conv2d_a,
+ param_attr=ParamAttr(
+ initializer=Constant(1.),
+ learning_rate=0.,
+ name=name + 'op.2.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.),
+ learning_rate=0.,
+ name=name + 'op.2.bias'),
+ moving_mean_name=name + 'op.2.running_mean',
+ moving_variance_name=name + 'op.2.running_var')
+ return reluconvbn_out
+
+
+def DilConv(input,
+ C_in,
+ C_out,
+ kernel_size,
+ stride,
+ padding,
+ dilation,
+ name='',
+ affine=True):
+ relu_a = fluid.layers.relu(input)
+ conv2d_a = fluid.layers.conv2d(
+ relu_a,
+ C_in,
+ kernel_size,
+ stride,
+ padding,
+ dilation,
+ groups=C_in,
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=name + 'op.1.weight'),
+ bias_attr=False,
+ use_cudnn=False)
+ conv2d_b = fluid.layers.conv2d(
+ conv2d_a,
+ C_out,
+ 1,
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=name + 'op.2.weight'),
+ bias_attr=False)
+ if affine:
+ dilconv_out = fluid.layers.batch_norm(
+ conv2d_b,
+ param_attr=ParamAttr(
+ initializer=Constant(1.), name=name + 'op.3.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.), name=name + 'op.3.bias'),
+ moving_mean_name=name + 'op.3.running_mean',
+ moving_variance_name=name + 'op.3.running_var')
+ else:
+ dilconv_out = fluid.layers.batch_norm(
+ conv2d_b,
+ param_attr=ParamAttr(
+ initializer=Constant(1.),
+ learning_rate=0.,
+ name=name + 'op.3.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.),
+ learning_rate=0.,
+ name=name + 'op.3.bias'),
+ moving_mean_name=name + 'op.3.running_mean',
+ moving_variance_name=name + 'op.3.running_var')
+ return dilconv_out
+
+
+def SepConv(input,
+ C_in,
+ C_out,
+ kernel_size,
+ stride,
+ padding,
+ name='',
+ affine=True):
+ relu_a = fluid.layers.relu(input)
+ conv2d_a = fluid.layers.conv2d(
+ relu_a,
+ C_in,
+ kernel_size,
+ stride,
+ padding,
+ groups=C_in,
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=name + 'op.1.weight'),
+ bias_attr=False,
+ use_cudnn=False)
+ conv2d_b = fluid.layers.conv2d(
+ conv2d_a,
+ C_in,
+ 1,
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=name + 'op.2.weight'),
+ bias_attr=False)
+ if affine:
+ bn_a = fluid.layers.batch_norm(
+ conv2d_b,
+ param_attr=ParamAttr(
+ initializer=Constant(1.), name=name + 'op.3.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.), name=name + 'op.3.bias'),
+ moving_mean_name=name + 'op.3.running_mean',
+ moving_variance_name=name + 'op.3.running_var')
+ else:
+ bn_a = fluid.layers.batch_norm(
+ conv2d_b,
+ param_attr=ParamAttr(
+ initializer=Constant(1.),
+ learning_rate=0.,
+ name=name + 'op.3.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.),
+ learning_rate=0.,
+ name=name + 'op.3.bias'),
+ moving_mean_name=name + 'op.3.running_mean',
+ moving_variance_name=name + 'op.3.running_var')
+
+ relu_b = fluid.layers.relu(bn_a)
+ conv2d_d = fluid.layers.conv2d(
+ relu_b,
+ C_in,
+ kernel_size,
+ 1,
+ padding,
+ groups=C_in,
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=name + 'op.5.weight'),
+ bias_attr=False,
+ use_cudnn=False)
+ conv2d_e = fluid.layers.conv2d(
+ conv2d_d,
+ C_out,
+ 1,
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=name + 'op.6.weight'),
+ bias_attr=False)
+ if affine:
+ sepconv_out = fluid.layers.batch_norm(
+ conv2d_e,
+ param_attr=ParamAttr(
+ initializer=Constant(1.), name=name + 'op.7.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.), name=name + 'op.7.bias'),
+ moving_mean_name=name + 'op.7.running_mean',
+ moving_variance_name=name + 'op.7.running_var')
+ else:
+ sepconv_out = fluid.layers.batch_norm(
+ conv2d_e,
+ param_attr=ParamAttr(
+ initializer=Constant(1.),
+ learning_rate=0.,
+ name=name + 'op.7.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.),
+ learning_rate=0.,
+ name=name + 'op.7.bias'),
+ moving_mean_name=name + 'op.7.running_mean',
+ moving_variance_name=name + 'op.7.running_var')
+ return sepconv_out
+
+
+def SevenConv(input, C_out, stride, name='', affine=True):
+ relu_a = fluid.layers.relu(input)
+ conv2d_a = fluid.layers.conv2d(
+ relu_a,
+ C_out, (1, 7), (1, stride), (0, 3),
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=name + 'op.1.weight'),
+ bias_attr=False)
+ conv2d_b = fluid.layers.conv2d(
+ conv2d_a,
+ C_out, (7, 1), (stride, 1), (3, 0),
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=name + 'op.2.weight'),
+ bias_attr=False)
+ if affine:
+ out = fluid.layers.batch_norm(
+ conv2d_b,
+ param_attr=ParamAttr(
+ initializer=Constant(1.), name=name + 'op.3.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.), name=name + 'op.3.bias'),
+ moving_mean_name=name + 'op.3.running_mean',
+ moving_variance_name=name + 'op.3.running_var')
+ else:
+ out = fluid.layers.batch_norm(
+ conv2d_b,
+ param_attr=ParamAttr(
+ initializer=Constant(1.),
+ learning_rate=0.,
+ name=name + 'op.3.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.),
+ learning_rate=0.,
+ name=name + 'op.3.bias'),
+ moving_mean_name=name + 'op.3.running_mean',
+            moving_variance_name=name + 'op.3.running_var')
+    return out
+
+
+def Identity(input, name=''):
+ return input
+
+
+def Zero(input, stride, name=''):
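+    # Builds a (H, W) mask that is zero at every stride-th position and
+    # multiplies it into the input; with stride 1 this zeroes the whole
+    # feature map. The spatial size is left unchanged.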
+ ones = np.ones(input.shape[-2:])
+ ones[::stride, ::stride] = 0
+ ones = fluid.layers.assign(ones)
+ return input * ones
+
+
+def FactorizedReduce(input, C_out, name='', affine=True):
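+    # Halves the spatial resolution with two parallel stride-2 1x1 convs of
+    # C_out // 2 channels each (the second applied to the input shifted by
+    # one pixel), then concatenates them along channels and batch-normalizes.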
+ relu_a = fluid.layers.relu(input)
+ conv2d_a = fluid.layers.conv2d(
+ relu_a,
+ C_out // 2,
+ 1,
+ 2,
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=name + 'conv_1.weight'),
+ bias_attr=False)
+ h_end = relu_a.shape[2]
+ w_end = relu_a.shape[3]
+ slice_a = fluid.layers.slice(relu_a, [2, 3], [1, 1], [h_end, w_end])
+ conv2d_b = fluid.layers.conv2d(
+ slice_a,
+ C_out // 2,
+ 1,
+ 2,
+ param_attr=ParamAttr(
+ initializer=Xavier(
+ uniform=False, fan_in=0),
+ name=name + 'conv_2.weight'),
+ bias_attr=False)
+ out = fluid.layers.concat([conv2d_a, conv2d_b], axis=1)
+ if affine:
+ out = fluid.layers.batch_norm(
+ out,
+ param_attr=ParamAttr(
+ initializer=Constant(1.), name=name + 'bn.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.), name=name + 'bn.bias'),
+ moving_mean_name=name + 'bn.running_mean',
+ moving_variance_name=name + 'bn.running_var')
+ else:
+ out = fluid.layers.batch_norm(
+ out,
+ param_attr=ParamAttr(
+ initializer=Constant(1.),
+ learning_rate=0.,
+ name=name + 'bn.weight'),
+ bias_attr=ParamAttr(
+ initializer=Constant(0.),
+ learning_rate=0.,
+ name=name + 'bn.bias'),
+ moving_mean_name=name + 'bn.running_mean',
+ moving_variance_name=name + 'bn.running_var')
+ return out
diff --git a/AutoDL/LRC/reader.py b/AutoDL/LRC/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..20b32b504e9245c4ff3892f08736d800080daab4
--- /dev/null
+++ b/AutoDL/LRC/reader.py
@@ -0,0 +1,187 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# Based on:
+# --------------------------------------------------------
+# DARTS
+# Copyright (c) 2018, Hanxiao Liu.
+# Licensed under the Apache License, Version 2.0;
+# --------------------------------------------------------
+"""
+CIFAR-10 dataset.
+This module will download dataset from
+https://www.cs.toronto.edu/~kriz/cifar.html and parse train/test set into
+paddle reader creators.
+The CIFAR-10 dataset consists of 60000 32x32 colour images in 10 classes,
+with 6000 images per class. There are 50000 training images and 10000 test images.
+"""
+
+from PIL import Image
+from PIL import ImageOps
+import numpy as np
+
+import cPickle
+import random
+import utils
+import paddle.fluid as fluid
+import time
+import os
+import functools
+import paddle.reader
+
+__all__ = ['train10', 'test10']
+
+image_size = 32
+image_depth = 3
+half_length = 8
+
+CIFAR_MEAN = [0.4914, 0.4822, 0.4465]
+CIFAR_STD = [0.24703233, 0.24348505, 0.26158768]
+
+
+def generate_reshape_label(label, batch_size, CIFAR_CLASSES=10):
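+    """For each sample i with label y_i, computes indices into the flattened
+    (batch_size * CIFAR_CLASSES) logit vector: reshape_label holds the index
+    of the true class, reshape_non_label the indices of the remaining
+    CIFAR_CLASSES - 1 classes. Both are consumed by gather() in the LRC loss."""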
+ reshape_label = np.zeros((batch_size, 1), dtype='int32')
+ reshape_non_label = np.zeros(
+ (batch_size * (CIFAR_CLASSES - 1), 1), dtype='int32')
+ num = 0
+ for i in range(batch_size):
+ label_i = label[i]
+ reshape_label[i] = label_i + i * CIFAR_CLASSES
+ for j in range(CIFAR_CLASSES):
+ if label_i != j:
+ reshape_non_label[num] = \
+ j + i * CIFAR_CLASSES
+ num += 1
+ return reshape_label, reshape_non_label
+
+
+def generate_bernoulli_number(batch_size, CIFAR_CLASSES=10):
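+    """Draws rcc_iters independent batches of Rademacher variables (+1 or -1
+    with equal probability) of shape (batch_size, CIFAR_CLASSES - 1), used to
+    estimate the Rademacher-complexity term of the LRC loss."""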
+ rcc_iters = 50
+ rad_var = np.zeros((rcc_iters, batch_size, CIFAR_CLASSES - 1))
+ for i in range(rcc_iters):
+ bernoulli_num = np.random.binomial(size=batch_size, n=1, p=0.5)
+ bernoulli_map = np.array([])
+ ones = np.ones((CIFAR_CLASSES - 1, 1))
+ for batch_id in range(batch_size):
+ num = bernoulli_num[batch_id]
+ var_id = 2 * ones * num - 1
+ bernoulli_map = np.append(bernoulli_map, var_id)
+ rad_var[i] = bernoulli_map.reshape((batch_size, CIFAR_CLASSES - 1))
+ return rad_var.astype('float32')
+
+
+def preprocess(sample, is_training, args):
+ image_array = sample.reshape(3, image_size, image_size)
+ rgb_array = np.transpose(image_array, (1, 2, 0))
+ img = Image.fromarray(rgb_array, 'RGB')
+
+ if is_training:
+        # pad and random crop
+ img = ImageOps.expand(img, (4, 4, 4, 4), fill=0) # pad to 40 * 40 * 3
+ left_top = np.random.randint(9, size=2) # rand 0 - 8
+ img = img.crop((left_top[0], left_top[1], left_top[0] + image_size,
+ left_top[1] + image_size))
+ if np.random.randint(2):
+ img = img.transpose(Image.FLIP_LEFT_RIGHT)
+
+ img = np.array(img).astype(np.float32)
+
+ # per_image_standardization
+ img_float = img / 255.0
+ img = (img_float - CIFAR_MEAN) / CIFAR_STD
+
+ if is_training and args.cutout:
+ center = np.random.randint(image_size, size=2)
+ offset_width = max(0, center[0] - half_length)
+ offset_height = max(0, center[1] - half_length)
+ target_width = min(center[0] + half_length, image_size)
+ target_height = min(center[1] + half_length, image_size)
+
+ for i in range(offset_height, target_height):
+ for j in range(offset_width, target_width):
+ img[i][j][:] = 0.0
+
+ img = np.transpose(img, (2, 0, 1))
+ return img
+
+
+def reader_creator_filepath(filename, sub_name, is_training, args):
+ files = os.listdir(filename)
+ names = [each_item for each_item in files if sub_name in each_item]
+ names.sort()
+ datasets = []
+ for name in names:
+ print("Reading file " + name)
+ batch = cPickle.load(open(filename + name, 'rb'))
+ data = batch['data']
+ labels = batch.get('labels', batch.get('fine_labels', None))
+ assert labels is not None
+ dataset = zip(data, labels)
+ datasets.extend(dataset)
+ random.shuffle(datasets)
+
+ def read_batch(datasets, args):
+ for sample, label in datasets:
+ im = preprocess(sample, is_training, args)
+ yield im, [int(label)]
+
+ def reader():
+ batch_data = []
+ batch_label = []
+ for data, label in read_batch(datasets, args):
+ batch_data.append(data)
+ batch_label.append(label)
+ if len(batch_data) == args.batch_size:
+ batch_data = np.array(batch_data, dtype='float32')
+ batch_label = np.array(batch_label, dtype='int64')
+ if is_training:
+ flatten_label, flatten_non_label = \
+ generate_reshape_label(batch_label, args.batch_size)
+ rad_var = generate_bernoulli_number(args.batch_size)
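+                    # Standard mixup: utils.mixup_data presumably blends
+                    # pairs of images with a coefficient lam drawn from
+                    # Beta(mix_alpha, mix_alpha) and returns both label sets
+                    # y_a and y_b for the mixed loss.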
+ mixed_x, y_a, y_b, lam = utils.mixup_data(
+ batch_data, batch_label, args.batch_size,
+ args.mix_alpha)
+ batch_out = [[mixed_x, y_a, y_b, lam, flatten_label, \
+ flatten_non_label, rad_var]]
+ yield batch_out
+ else:
+ batch_out = [[batch_data, batch_label]]
+ yield batch_out
+ batch_data = []
+ batch_label = []
+
+ return reader
+
+
+def train10(args):
+ """
+ CIFAR-10 training set creator.
+ It returns a reader creator, each sample in the reader is image pixels in
+ [0, 1] and label in [0, 9].
+ :return: Training reader creator
+ :rtype: callable
+ """
+
+ return reader_creator_filepath(args.data, 'data_batch', True, args)
+
+
+def test10(args):
+ """
+ CIFAR-10 test set creator.
+ It returns a reader creator, each sample in the reader is image pixels in
+ [0, 1] and label in [0, 9].
+ :return: Test reader creator.
+ :rtype: callable
+ """
+ return reader_creator_filepath(args.data, 'test_batch', False, args)
diff --git a/AutoDL/LRC/run.sh b/AutoDL/LRC/run.sh
new file mode 100644
index 0000000000000000000000000000000000000000..9f1a045d49789c3e9aebbc2a73b84b11da471b5a
--- /dev/null
+++ b/AutoDL/LRC/run.sh
@@ -0,0 +1,8 @@
+CUDA_VISIBLE_DEVICES=0 python -u train_mixup.py \
+--batch_size=80 \
+--auxiliary \
+--weight_decay=0.0003 \
+--learning_rate=0.025 \
+--lrc_loss_lambda=0.7 \
+--cutout
+
diff --git a/AutoDL/LRC/train_mixup.py b/AutoDL/LRC/train_mixup.py
new file mode 100644
index 0000000000000000000000000000000000000000..de752c84bcf9276aa83540d60370517e66c0704f
--- /dev/null
+++ b/AutoDL/LRC/train_mixup.py
@@ -0,0 +1,247 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+#
+# Based on:
+# --------------------------------------------------------
+# DARTS
+# Copyright (c) 2018, Hanxiao Liu.
+# Licensed under the Apache License, Version 2.0;
+# --------------------------------------------------------
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from learning_rate import cosine_decay
+import numpy as np
+import argparse
+from model import NetworkCIFAR as Network
+import reader
+import sys
+import os
+import time
+import logging
+import genotypes
+import paddle.fluid as fluid
+import shutil
+import utils
+import cPickle as cp
+
+parser = argparse.ArgumentParser("cifar")
+parser.add_argument(
+ '--data',
+ type=str,
+ default='./dataset/cifar/cifar-10-batches-py/',
+ help='location of the data corpus')
+parser.add_argument('--batch_size', type=int, default=96, help='batch size')
+parser.add_argument(
+ '--learning_rate', type=float, default=0.025, help='init learning rate')
+parser.add_argument('--momentum', type=float, default=0.9, help='momentum')
+parser.add_argument(
+ '--weight_decay', type=float, default=3e-4, help='weight decay')
+parser.add_argument(
+ '--report_freq', type=float, default=50, help='report frequency')
+parser.add_argument(
+ '--epochs', type=int, default=600, help='num of training epochs')
+parser.add_argument(
+ '--init_channels', type=int, default=36, help='num of init channels')
+parser.add_argument(
+ '--layers', type=int, default=20, help='total number of layers')
+parser.add_argument(
+ '--model_path',
+ type=str,
+ default='saved_models',
+ help='path to save the model')
+parser.add_argument(
+ '--auxiliary',
+ action='store_true',
+ default=False,
+ help='use auxiliary tower')
+parser.add_argument(
+ '--auxiliary_weight',
+ type=float,
+ default=0.4,
+ help='weight for auxiliary loss')
+parser.add_argument(
+ '--cutout', action='store_true', default=False, help='use cutout')
+parser.add_argument(
+ '--cutout_length', type=int, default=16, help='cutout length')
+parser.add_argument(
+ '--drop_path_prob', type=float, default=0.2, help='drop path probability')
+parser.add_argument('--save', type=str, default='EXP', help='experiment name')
+parser.add_argument(
+ '--arch', type=str, default='DARTS', help='which architecture to use')
+parser.add_argument(
+ '--grad_clip', type=float, default=5, help='gradient clipping')
+parser.add_argument(
+ '--lr_exp_decay',
+ action='store_true',
+ default=False,
+ help='use exponential_decay learning_rate')
+parser.add_argument('--mix_alpha', type=float, default=0.5, help='mixup alpha')
+parser.add_argument(
+ '--lrc_loss_lambda', default=0, type=float, help='lrc_loss_lambda')
+parser.add_argument(
+ '--loss_type',
+ default=1,
+ type=float,
+ help='loss_type 0: cross entropy 1: multi margin loss 2: max margin loss')
+
+args = parser.parse_args()
+
+CIFAR_CLASSES = 10
+dataset_train_size = 50000
+image_size = 32
+
+
+def main():
+ image_shape = [3, image_size, image_size]
+ devices = os.getenv("CUDA_VISIBLE_DEVICES") or ""
+ devices_num = len(devices.split(","))
+ logging.info("args = %s", args)
+ genotype = eval("genotypes.%s" % args.arch)
+ model = Network(args.init_channels, CIFAR_CLASSES, args.layers,
+ args.auxiliary, genotype)
+ steps_one_epoch = dataset_train_size / (devices_num * args.batch_size)
+ train(model, args, image_shape, steps_one_epoch)
+
+
+def build_program(main_prog, startup_prog, args, is_train, model, im_shape,
+ steps_one_epoch):
+ out = []
+ with fluid.program_guard(main_prog, startup_prog):
+ py_reader = model.build_input(im_shape, args.batch_size, is_train)
+ if is_train:
+ with fluid.unique_name.guard():
+ loss = model.train_model(py_reader, args.init_channels,
+ args.auxiliary, args.auxiliary_weight,
+ args.batch_size, args.lrc_loss_lambda)
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=cosine_decay(args.learning_rate, \
+ args.epochs, steps_one_epoch),
+ regularization=fluid.regularizer.L2Decay(\
+ args.weight_decay),
+ momentum=args.momentum)
+ optimizer.minimize(loss)
+ out = [py_reader, loss]
+ else:
+ with fluid.unique_name.guard():
+ loss, acc_1, acc_5 = model.test_model(py_reader,
+ args.init_channels)
+ out = [py_reader, loss, acc_1, acc_5]
+ return out
+
+
+def train(model, args, im_shape, steps_one_epoch):
+ train_startup_prog = fluid.Program()
+ test_startup_prog = fluid.Program()
+ train_prog = fluid.Program()
+ test_prog = fluid.Program()
+
+ train_py_reader, loss_train = build_program(train_prog, train_startup_prog,
+ args, True, model, im_shape,
+ steps_one_epoch)
+
+ test_py_reader, loss_test, acc_1, acc_5 = build_program(
+ test_prog, test_startup_prog, args, False, model, im_shape,
+ steps_one_epoch)
+
+ test_prog = test_prog.clone(for_test=True)
+
+ place = fluid.CUDAPlace(0)
+ exe = fluid.Executor(place)
+ exe.run(train_startup_prog)
+ exe.run(test_startup_prog)
+
+ exec_strategy = fluid.ExecutionStrategy()
+ exec_strategy.num_threads = 1
+ train_exe = fluid.ParallelExecutor(
+ main_program=train_prog,
+ use_cuda=True,
+ loss_name=loss_train.name,
+ exec_strategy=exec_strategy)
+ train_reader = reader.train10(args)
+ test_reader = reader.test10(args)
+ train_py_reader.decorate_paddle_reader(train_reader)
+ test_py_reader.decorate_paddle_reader(test_reader)
+
+ fluid.clip.set_gradient_clip(fluid.clip.GradientClipByNorm(args.grad_clip))
+ fluid.memory_optimize(fluid.default_main_program())
+
+ def save_model(postfix, main_prog):
+ model_path = os.path.join(args.model_path, postfix)
+ if os.path.isdir(model_path):
+ shutil.rmtree(model_path)
+ fluid.io.save_persistables(exe, model_path, main_program=main_prog)
+
+ def test(epoch_id):
+ test_fetch_list = [loss_test, acc_1, acc_5]
+ objs = utils.AvgrageMeter()
+ top1 = utils.AvgrageMeter()
+ top5 = utils.AvgrageMeter()
+ test_py_reader.start()
+ test_start_time = time.time()
+ step_id = 0
+ try:
+ while True:
+ prev_test_start_time = test_start_time
+ test_start_time = time.time()
+ loss_test_v, acc_1_v, acc_5_v = exe.run(
+ test_prog, fetch_list=test_fetch_list)
+ objs.update(np.array(loss_test_v), args.batch_size)
+ top1.update(np.array(acc_1_v), args.batch_size)
+ top5.update(np.array(acc_5_v), args.batch_size)
+ if step_id % args.report_freq == 0:
+ print("Epoch {}, Step {}, acc_1 {}, acc_5 {}, time {}".
+ format(epoch_id, step_id,
+ np.array(acc_1_v),
+ np.array(acc_5_v), test_start_time -
+ prev_test_start_time))
+ step_id += 1
+ except fluid.core.EOFException:
+ test_py_reader.reset()
+ print("Epoch {0}, top1 {1}, top5 {2}".format(epoch_id, top1.avg,
+ top5.avg))
+
+ train_fetch_list = [loss_train]
+ epoch_start_time = time.time()
+ for epoch_id in range(args.epochs):
+ model.drop_path_prob = args.drop_path_prob * epoch_id / args.epochs
+ train_py_reader.start()
+ epoch_end_time = time.time()
+ if epoch_id > 0:
+ print("Epoch {}, total time {}".format(epoch_id - 1, epoch_end_time
+ - epoch_start_time))
+ epoch_start_time = epoch_end_time
+ start_time = time.time()
+ step_id = 0
+ try:
+ while True:
+ prev_start_time = start_time
+ start_time = time.time()
+ loss_v, = train_exe.run(
+ fetch_list=[v.name for v in train_fetch_list])
+ print("Epoch {}, Step {}, loss {}, time {}".format(epoch_id, step_id, \
+ np.array(loss_v).mean(), start_time-prev_start_time))
+ step_id += 1
+ sys.stdout.flush()
+ except fluid.core.EOFException:
+ train_py_reader.reset()
+ if epoch_id % 50 == 0 or epoch_id == args.epochs - 1:
+ save_model(str(epoch_id), train_prog)
+ test(epoch_id)
+
+
+if __name__ == '__main__':
+ main()
diff --git a/AutoDL/LRC/utils.py b/AutoDL/LRC/utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..4002b57c6e91f9a4f7992156c4fa07f9e55d628c
--- /dev/null
+++ b/AutoDL/LRC/utils.py
@@ -0,0 +1,55 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# Based on:
+# --------------------------------------------------------
+# DARTS
+# Copyright (c) 2018, Hanxiao Liu.
+# Licensed under the Apache License, Version 2.0;
+# --------------------------------------------------------
+
+import os
+import sys
+import time
+import math
+import numpy as np
+
+
+def mixup_data(x, y, batch_size, alpha=1.0):
+ '''Compute the mixup data. Return mixed inputs, pairs of targets, and lambda'''
+ if alpha > 0.:
+ lam = np.random.beta(alpha, alpha)
+ else:
+ lam = 1.
+ index = np.random.permutation(batch_size)
+
+ mixed_x = lam * x + (1 - lam) * x[index, :]
+ y_a, y_b = y, y[index]
+ return mixed_x.astype('float32'), y_a.astype('int64'),\
+ y_b.astype('int64'), np.array(lam, dtype='float32')
+
+
+class AvgrageMeter(object):
+ def __init__(self):
+ self.reset()
+
+ def reset(self):
+ self.avg = 0
+ self.sum = 0
+ self.cnt = 0
+
+ def update(self, val, n=1):
+ self.sum += val * n
+ self.cnt += n
+ self.avg = self.sum / self.cnt
diff --git a/PaddleCV/README.md b/PaddleCV/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..bec8ec938ca01469d79ca75c8781d112839e07d2
--- /dev/null
+++ b/PaddleCV/README.md
@@ -0,0 +1,87 @@
+PaddleCV
+========
+
+Image Classification
+--------------------
+
+Image classification assigns images to categories according to their semantic content. It is a fundamental problem in computer vision and underlies higher-level vision tasks such as object detection, image segmentation, object tracking, behavior analysis, and face recognition. It is widely used across many fields: face recognition and intelligent video analysis in security, traffic scene recognition in transportation, content-based image retrieval and automatic photo album organization on the internet, and image recognition in medicine.
+
+In the deep learning era, the accuracy of image classification has risen dramatically. For this task we show how to train commonly used models on the classic ImageNet dataset, including AlexNet, VGG, GoogLeNet, ResNet, Inception-v4, MobileNet, DPN (Dual Path Network), and SE-ResNeXt, and we release the [trained models](https://github.com/PaddlePaddle/models/blob/develop/PaddleCV/image_classification/README_cn.md#已有模型及其性能) for users to download and use. We also provide a tool that converts Caffe models into PaddlePaddle Fluid model configuration and parameter files.
+
+- [AlexNet](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification/models)
+- [VGG](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification/models)
+- [GoogleNet](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification/models)
+- [Residual Network](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification/models)
+- [Inception-v4](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification/models)
+- [MobileNet](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification/models)
+- [Dual Path Network](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification/models)
+- [SE-ResNeXt](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification/models)
+- [Tool for converting Caffe models into Paddle Fluid configuration and parameter files](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/caffe2fluid)
+
+Object Detection
+----------------
+
+Given an image or a video frame, object detection asks the computer to locate every object and assign each one a category. For humans this is an easy task; a computer, however, only "sees" the numbers the image has been encoded into, so high-level semantic concepts such as a person or an object appearing in the image are hard to grasp, and localizing where an object appears is harder still. Since objects can appear anywhere in an image or frame, take on widely varying shapes, and sit against vastly different backgrounds, object detection remains a challenging problem for computers.
+
+In the object detection task, we show how to train general-purpose object detection models on [PASCAL VOC](http://host.robots.ox.ac.uk/pascal/VOC/) and [MS COCO](http://cocodataset.org/#home) data. Currently the SSD (Single Shot MultiBox Detector) algorithm is covered; it is one of the newer and better-performing detection algorithms, notable for combining fast detection with high accuracy.
+
+Detecting faces in open environments, especially small, blurry, and partially occluded faces, is also a challenging task. We also show how to train PyramidBox, a face detection model developed at Baidu, on the [WIDER FACE](http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace) data; in March 2018 the algorithm took [first place](http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace/WiderFace_Results.html) in multiple WIDER FACE evaluations.
+
+Faster RCNN is the classic two-stage object detector. Compared with traditional region-proposal methods, its RPN shares convolutional parameters, which greatly improves the efficiency of region proposal while producing high-quality candidate regions.
+
+Mask RCNN is the classic instance segmentation model built on Faster RCNN: it adds a segmentation branch that produces mask predictions, decoupling mask prediction from class prediction.
+
+- [Single Shot MultiBox Detector](https://github.com/PaddlePaddle/models/blob/develop/PaddleCV/object_detection/README_cn.md)
+- [Face Detector: PyramidBox](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/face_detection/README_cn.md)
+- [Faster RCNN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/rcnn/README_cn.md)
+- [Mask RCNN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/rcnn/README_cn.md)
+
+Image Semantic Segmentation
+---------------------------
+
+Semantic segmentation, as the name suggests, groups/segments the pixels of an image according to their semantic meaning. "Semantics" here refers to understanding the image content, e.g., describing which object is doing what and where; "segmentation" means labeling every pixel of the image with the class it belongs to. In recent years it has been used in autonomous driving to segment street scenes and avoid pedestrians and vehicles, and in medical image analysis to assist diagnosis.
+
+In the semantic segmentation task, we show how to segment images with the Image Cascade Network (ICNet), which balances accuracy and speed better than other segmentation algorithms.
+
+- [ICNet](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/icnet)
+
+Image Generation
+----------------
+
+Image generation produces a target image from an input vector, which may be random noise or a user-specified condition vector. Typical applications include handwriting generation, face synthesis, style transfer, and image inpainting. Today image generation is mostly achieved with generative adversarial networks (GANs). A GAN consists of two sub-networks: a generator and a discriminator. The generator takes random noise or a condition vector as input and outputs a target image; the discriminator is a classifier that takes an image and decides whether it is real. During training, the generator and the discriminator improve their abilities through a continual game against each other.
+
+In the image generation task, we show how to generate handwritten digits with DCGAN and ConditionalGAN, and introduce CycleGAN for style transfer.
+
+- [DCGAN & ConditionalGAN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/gan/c_gan)
+- [CycleGAN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/gan/cycle_gan)
+
+Scene Text Recognition
+----------------------
+
+Many scene images contain rich textual information that plays an important role in understanding the image and greatly helps people perceive and comprehend scene content. Scene text recognition converts image information into character sequences despite complex backgrounds, low resolution, diverse fonts, and arbitrary layouts; it can be viewed as a special kind of translation: from image input to natural language output. The progress of scene text recognition has also spawned new applications, such as automatically reading road signs to help street-view apps obtain more accurate address information.
+
+In the scene text recognition task, we show how to combine CNN-based image feature extraction with RNN-based sequence translation, avoiding hand-crafted features and character segmentation, and using automatically learned image features to recognize characters. Currently the CRNN-CTC model and an attention-based sequence-to-sequence model are covered.
+
+- [CRNN-CTC model](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/ocr_recognition)
+- [Attention model](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/ocr_recognition)
+
+
+Metric Learning
+---------------
+
+Metric learning, also known as distance metric learning or similarity learning, learns distances between objects so that their associations and comparative relations can be analyzed. It is widely applicable in practice: it can assist classification and clustering, and it is broadly used in image retrieval, face recognition, and related fields. Traditionally, one had to choose suitable features and hand-craft a distance function for each task; metric learning instead learns a task-specific distance function automatically. Combined with deep learning, metric learning has achieved strong results in face recognition/verification, person re-identification (Re-ID), image retrieval, and more. For this task we mainly introduce Fluid-based deep metric learning models, covering loss functions such as the triplet and quadruplet losses.
+
+- [Metric Learning](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/metric_learning)
+
+
+Video Classification
+--------------------
+
+Video classification is the foundation of video understanding. Unlike image classification, the object to be classified is no longer a still image but a video composed of many frames that also carries audio and motion information, so understanding a video requires more context: not only what each frame is and contains, but also how the frames relate to one another. Video classification methods are mainly based on convolutional neural networks, recurrent neural networks, or combinations of the two. For this task we introduce Fluid-based video classification models, currently including the Temporal Segment Network (TSN), with more models to be added over time.
+
+
+- [TSN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/video_classification)
diff --git a/PaddleCV/adversarial/README.md b/PaddleCV/adversarial/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..91661f7e1675d59c7d38c4c09bc67d5b9339573d
--- /dev/null
+++ b/PaddleCV/adversarial/README.md
@@ -0,0 +1,112 @@
+The minimum PaddlePaddle version needed for the code sample in this directory is the latest develop branch. If you are on a version of PaddlePaddle earlier than this, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
+
+---
+
+# Advbox
+
+Advbox is a toolbox for generating adversarial examples that fool neural networks, and it can benchmark the robustness of machine learning models.
+
+Advbox is based on [PaddlePaddle](https://github.com/PaddlePaddle/Paddle) Fluid and is under continual development, always welcoming contributions of the latest methods of adversarial attack and defense.
+
+
+## Overview
+[Szegedy et al.](https://arxiv.org/abs/1312.6199) were the first to discover an intriguing property of deep neural networks in the context of image classification: despite their state-of-the-art performance, deep networks are surprisingly susceptible to adversarial attacks in the form of small image perturbations that remain (almost) imperceptible to the human visual system. These perturbations are found by optimizing the input to maximize the prediction error, and the images modified by them are called adversarial examples. The profound implications of these results triggered wide interest among researchers in adversarial attacks and defenses for deep learning in general.
+
+Advbox is similar to [Foolbox](https://github.com/bethgelab/foolbox) and [CleverHans](https://github.com/tensorflow/cleverhans). CleverHans only supports the TensorFlow framework, while Foolbox interfaces with many popular machine learning frameworks such as PyTorch, Keras, TensorFlow, Theano, Lasagne, and MXNet. However, neither of these two excellent libraries supports PaddlePaddle, an easy-to-use, efficient, flexible, and scalable deep learning platform originally developed by Baidu scientists and engineers to apply deep learning to many products at Baidu.
+
+## Usage
+Advbox provides many stable reference implementations of modern methods for generating adversarial examples, such as FGSM, DeepFool, and JSMA. When you want to benchmark the robustness of your neural networks, you can use Advbox to generate adversarial examples and evaluate the networks on them. A typical workflow, sketched below, is:
+
+1. Train a model and save its parameters.
+2. Load the trained parameters, then reconstruct the model.
+3. Use Advbox to generate the adversarial samples.
+
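+A minimal sketch of this workflow for an MNIST classifier is shown below; it
+mirrors `tutorials/mnist_tutorial_fgsm.py`, and the parameter path and the
+`one_image`/`one_label` sample variables are illustrative placeholders.
+
+```python
+import paddle.fluid as fluid
+from advbox.adversary import Adversary
+from advbox.attacks.gradient_method import FGSM
+from advbox.models.paddle import PaddleModel
+
+# step 2: rebuild the network that was trained in step 1 and load its weights
+img = fluid.layers.data(name='img', shape=[1, 28, 28], dtype='float32')
+label = fluid.layers.data(name='label', shape=[1], dtype='int64')
+logits = fluid.layers.fc(input=img, size=10, act='softmax')
+avg_cost = fluid.layers.mean(
+    fluid.layers.cross_entropy(input=logits, label=label))
+
+exe = fluid.Executor(fluid.CPUPlace())
+exe.run(fluid.default_startup_program())
+fluid.io.load_params(exe, './mnist_params')  # path to the saved parameters
+
+# wrap the program so Advbox can query predictions and gradients;
+# (-1, 1) is the valid range of the normalized input pixels
+m = PaddleModel(fluid.default_main_program(), 'img', 'label',
+                logits.name, avg_cost.name, (-1, 1))
+
+# step 3: craft an adversarial example for one (image, label) pair, where
+# one_image is a numpy array shaped like 'img' and one_label its true class
+attack = FGSM(m)
+adversary = attack(Adversary(one_image, one_label), epsilons=0.3)
+if adversary.is_successful():
+    print('attack succeeded, adversarial label: %d' %
+          adversary.adversarial_label)
+```
+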
+
+#### Dependencies
+* PaddlePaddle: [the latest develop branch](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html)
+* Python 2.x
+
+#### Structure
+
+Network models, implementations of the attack methods, and the criterion that defines adversarial examples are the three essential elements for generating adversarial examples. For brevity, Advbox adopts misclassification as the adversarial criterion.
+
+The structure of the Advbox module is as follows:
+
+ .
+ ├── advbox
+ | ├── __init__.py
+ | ├── attack
+ | ├── __init__.py
+ | ├── base.py
+ | ├── deepfool.py
+ | ├── gradient_method.py
+ | ├── lbfgs.py
+ | └── saliency.py
+ | ├── models
+ | ├── __init__.py
+ | ├── base.py
+ | └── paddle.py
+ | └── adversary.py
+ ├── tutorials
+ | ├── __init__.py
+ | ├── mnist_model.py
+ | ├── mnist_tutorial_lbfgs.py
+ | ├── mnist_tutorial_fgsm.py
+ | ├── mnist_tutorial_bim.py
+ | ├── mnist_tutorial_ilcm.py
+ | ├── mnist_tutorial_mifgsm.py
+ | ├── mnist_tutorial_jsma.py
+ | └── mnist_tutorial_deepfool.py
+ └── README.md
+
+**advbox.attack**
+
+Advbox implements several popular adversarial attacks that search for adversarial examples. Each attack method uses a distance measure (L1, L2, etc.) to quantify the size of the adversarial perturbation. Advbox makes it easy to craft adversarial examples, as some attack methods perform internal hyperparameter tuning to find the minimum perturbation.
+
+**advbox.model**
+
+Advbox implements an interface to PaddlePaddle. Additionally, interfaces to other deep learning frameworks such as TensorFlow can also be defined and employed. The module is used to compute predictions and gradients for given inputs in a specific framework.
+
+**advbox.adversary**
+
+Adversary contains the original object, the target, and the adversarial examples. It provides misclassification as the criterion for accepting an adversarial example.
+
+## Tutorials
+The `./tutorials/` folder provides tutorials for generating adversarial examples on the MNIST dataset. You can slightly modify the code to apply it to other datasets. The following attack methods are supported in Advbox:
+
+* [L-BFGS](https://arxiv.org/abs/1312.6199)
+* [FGSM](https://arxiv.org/abs/1412.6572)
+* [BIM](https://arxiv.org/abs/1607.02533)
+* [ILCM](https://arxiv.org/abs/1607.02533)
+* [MI-FGSM](https://arxiv.org/pdf/1710.06081.pdf)
+* [JSMA](https://arxiv.org/pdf/1511.07528)
+* [DeepFool](https://arxiv.org/abs/1511.04599)
+
+## Testing
+Benchmarks on a vanilla CNN model.
+
+> MNIST
+
+| adversarial attacks | fooling rate (non-targeted) | fooling rate (targeted) | max_epsilon | iterations | Strength |
+|:-----:| :----: | :---: | :----: | :----: | :----: |
+|L-BFGS| --- | 89.2% | --- | One shot | *** |
+|FGSM| 57.8% | 26.55% | 0.3 | One shot| *** |
+|BIM| 97.4% | --- | 0.1 | 100 | **** |
+|ILCM| --- | 100.0% | 0.1 | 100 | **** |
+|MI-FGSM| 94.4% | 100.0% | 0.1 | 100 | **** |
+|JSMA| 96.8% | 90.4%| 0.1 | 2000 | *** |
+|DeepFool| 97.7% | 51.3% | --- | 100 | **** |
+
+* The strength rating (more asterisks means stronger) is based on our impression of the reviewed literature.
+
+---
+## References
+* [Intriguing properties of neural networks](https://arxiv.org/abs/1312.6199), C. Szegedy et al., arxiv 2014
+* [Explaining and Harnessing Adversarial Examples](https://arxiv.org/abs/1412.6572), I. Goodfellow et al., ICLR 2015
+* [Adversarial Examples In The Physical World](https://arxiv.org/pdf/1607.02533v3.pdf), A. Kurakin et al., ICLR workshop 2017
+* [Boosting Adversarial Attacks with Momentum](https://arxiv.org/abs/1710.06081), Yinpeng Dong et al., arxiv 2018
+* [The Limitations of Deep Learning in Adversarial Settings](https://arxiv.org/abs/1511.07528), N. Papernot et al., ESSP 2016
+* [DeepFool: a simple and accurate method to fool deep neural networks](https://arxiv.org/abs/1511.04599), S. Moosavi-Dezfooli et al., CVPR 2016
+* [Foolbox: A Python toolbox to benchmark the robustness of machine learning models](https://arxiv.org/abs/1707.04131), Jonas Rauber et al., arxiv 2018
+* [CleverHans: An adversarial example library for constructing attacks, building defenses, and benchmarking both](https://github.com/tensorflow/cleverhans#setting-up-cleverhans)
+* [Threat of Adversarial Attacks on Deep Learning in Computer Vision: A Survey](https://arxiv.org/abs/1801.00553), Naveed Akhtar, Ajmal Mian, arxiv 2018
diff --git a/fluid/adversarial/advbox/__init__.py b/PaddleCV/adversarial/advbox/__init__.py
similarity index 100%
rename from fluid/adversarial/advbox/__init__.py
rename to PaddleCV/adversarial/advbox/__init__.py
diff --git a/fluid/adversarial/advbox/adversary.py b/PaddleCV/adversarial/advbox/adversary.py
similarity index 100%
rename from fluid/adversarial/advbox/adversary.py
rename to PaddleCV/adversarial/advbox/adversary.py
diff --git a/fluid/adversarial/advbox/attacks/__init__.py b/PaddleCV/adversarial/advbox/attacks/__init__.py
similarity index 100%
rename from fluid/adversarial/advbox/attacks/__init__.py
rename to PaddleCV/adversarial/advbox/attacks/__init__.py
diff --git a/fluid/adversarial/advbox/attacks/base.py b/PaddleCV/adversarial/advbox/attacks/base.py
similarity index 100%
rename from fluid/adversarial/advbox/attacks/base.py
rename to PaddleCV/adversarial/advbox/attacks/base.py
diff --git a/fluid/adversarial/advbox/attacks/deepfool.py b/PaddleCV/adversarial/advbox/attacks/deepfool.py
similarity index 100%
rename from fluid/adversarial/advbox/attacks/deepfool.py
rename to PaddleCV/adversarial/advbox/attacks/deepfool.py
diff --git a/fluid/adversarial/advbox/attacks/gradient_method.py b/PaddleCV/adversarial/advbox/attacks/gradient_method.py
similarity index 100%
rename from fluid/adversarial/advbox/attacks/gradient_method.py
rename to PaddleCV/adversarial/advbox/attacks/gradient_method.py
diff --git a/fluid/adversarial/advbox/attacks/lbfgs.py b/PaddleCV/adversarial/advbox/attacks/lbfgs.py
similarity index 100%
rename from fluid/adversarial/advbox/attacks/lbfgs.py
rename to PaddleCV/adversarial/advbox/attacks/lbfgs.py
diff --git a/fluid/adversarial/advbox/attacks/saliency.py b/PaddleCV/adversarial/advbox/attacks/saliency.py
similarity index 100%
rename from fluid/adversarial/advbox/attacks/saliency.py
rename to PaddleCV/adversarial/advbox/attacks/saliency.py
diff --git a/fluid/adversarial/advbox/models/__init__.py b/PaddleCV/adversarial/advbox/models/__init__.py
similarity index 100%
rename from fluid/adversarial/advbox/models/__init__.py
rename to PaddleCV/adversarial/advbox/models/__init__.py
diff --git a/fluid/adversarial/advbox/models/base.py b/PaddleCV/adversarial/advbox/models/base.py
similarity index 100%
rename from fluid/adversarial/advbox/models/base.py
rename to PaddleCV/adversarial/advbox/models/base.py
diff --git a/fluid/adversarial/advbox/models/paddle.py b/PaddleCV/adversarial/advbox/models/paddle.py
similarity index 100%
rename from fluid/adversarial/advbox/models/paddle.py
rename to PaddleCV/adversarial/advbox/models/paddle.py
diff --git a/fluid/adversarial/tutorials/__init__.py b/PaddleCV/adversarial/tutorials/__init__.py
similarity index 100%
rename from fluid/adversarial/tutorials/__init__.py
rename to PaddleCV/adversarial/tutorials/__init__.py
diff --git a/fluid/adversarial/tutorials/mnist_model.py b/PaddleCV/adversarial/tutorials/mnist_model.py
similarity index 100%
rename from fluid/adversarial/tutorials/mnist_model.py
rename to PaddleCV/adversarial/tutorials/mnist_model.py
diff --git a/fluid/adversarial/tutorials/mnist_tutorial_bim.py b/PaddleCV/adversarial/tutorials/mnist_tutorial_bim.py
similarity index 100%
rename from fluid/adversarial/tutorials/mnist_tutorial_bim.py
rename to PaddleCV/adversarial/tutorials/mnist_tutorial_bim.py
diff --git a/fluid/adversarial/tutorials/mnist_tutorial_deepfool.py b/PaddleCV/adversarial/tutorials/mnist_tutorial_deepfool.py
similarity index 100%
rename from fluid/adversarial/tutorials/mnist_tutorial_deepfool.py
rename to PaddleCV/adversarial/tutorials/mnist_tutorial_deepfool.py
diff --git a/fluid/adversarial/tutorials/mnist_tutorial_fgsm.py b/PaddleCV/adversarial/tutorials/mnist_tutorial_fgsm.py
similarity index 100%
rename from fluid/adversarial/tutorials/mnist_tutorial_fgsm.py
rename to PaddleCV/adversarial/tutorials/mnist_tutorial_fgsm.py
diff --git a/fluid/adversarial/tutorials/mnist_tutorial_ilcm.py b/PaddleCV/adversarial/tutorials/mnist_tutorial_ilcm.py
similarity index 100%
rename from fluid/adversarial/tutorials/mnist_tutorial_ilcm.py
rename to PaddleCV/adversarial/tutorials/mnist_tutorial_ilcm.py
diff --git a/fluid/adversarial/tutorials/mnist_tutorial_jsma.py b/PaddleCV/adversarial/tutorials/mnist_tutorial_jsma.py
similarity index 100%
rename from fluid/adversarial/tutorials/mnist_tutorial_jsma.py
rename to PaddleCV/adversarial/tutorials/mnist_tutorial_jsma.py
diff --git a/fluid/adversarial/tutorials/mnist_tutorial_lbfgs.py b/PaddleCV/adversarial/tutorials/mnist_tutorial_lbfgs.py
similarity index 100%
rename from fluid/adversarial/tutorials/mnist_tutorial_lbfgs.py
rename to PaddleCV/adversarial/tutorials/mnist_tutorial_lbfgs.py
diff --git a/fluid/adversarial/tutorials/mnist_tutorial_mifgsm.py b/PaddleCV/adversarial/tutorials/mnist_tutorial_mifgsm.py
similarity index 100%
rename from fluid/adversarial/tutorials/mnist_tutorial_mifgsm.py
rename to PaddleCV/adversarial/tutorials/mnist_tutorial_mifgsm.py
diff --git a/fluid/PaddleCV/caffe2fluid/.gitignore b/PaddleCV/caffe2fluid/.gitignore
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/.gitignore
rename to PaddleCV/caffe2fluid/.gitignore
diff --git a/PaddleCV/caffe2fluid/README.md b/PaddleCV/caffe2fluid/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..8520342325a1ef4e08d8f9669969acd5b6b57851
--- /dev/null
+++ b/PaddleCV/caffe2fluid/README.md
@@ -0,0 +1,87 @@
+### Caffe2Fluid
+This tool converts a Caffe model into a Fluid model.
+
+### Key Features
+1. Converts a Caffe model into a Fluid model together with the code that defines the network (useful for re-training)
+
+2. PyCaffe is not required if you only want to convert the model without running Caffe inference
+
+3. Conversion of Caffe's custom layers is also supported by extending this tool
+
+4. A set of tools in `examples/imagenet/tools` is provided to compare inference results against Caffe
+
+### HowTo
+1. Prepare `caffepb.py` in `./proto` if your Python has no `pycaffe` module; two options are provided:
+ - Generate pycaffe from caffe.proto
+ ```
+ bash ./proto/compile.sh
+ ```
+
+ - Download one from github directly
+ ```
+ cd proto/ && wget https://raw.githubusercontent.com/ethereon/caffe-tensorflow/master/kaffe/caffe/caffepb.py
+ ```
+
+2. Convert the Caffe model to Fluid model
+ - Generate fluid code and weight file
+ ```
+ python convert.py alexnet.prototxt \
+ --caffemodel alexnet.caffemodel \
+ --data-output-path alexnet.npy \
+ --code-output-path alexnet.py
+ ```
+
+ - Save weights as fluid model file
+ ```
+ # only infer the last layer's result
+ python alexnet.py alexnet.npy ./fluid
+    # infer the results of these two layers
+ python alexnet.py alexnet.npy ./fluid fc8,prob
+ ```
+
+3. Use the converted model to infer
+ - See more details in `examples/imagenet/tools/run.sh`
+
+4. Compare the inference results with Caffe
+ - See more details in `examples/imagenet/tools/diff.sh`
+
+### How to convert a custom layer
+1. Implement your custom layer in a file under `kaffe/custom_layers`, e.g. `mylayer.py`
+    - Implement `shape_func(input_shape, [other_caffe_params])` to calculate the output shape
+    - Implement `layer_func(inputs, name, [other_caffe_params])` to construct a fluid layer
+    - Register the two functions: `register(kind='MyType', shape=shape_func, layer=layer_func)`
+    - Notes: more examples can be found in `kaffe/custom_layers`; a minimal sketch is also given after this list
+
+2. Add `import mylayer` to `kaffe/custom_layers/__init__.py`
+
+3. Prepare your pycaffe as your customized version (same as the environment preparation above)
+ - (option1) replace `proto/caffe.proto` with your own caffe.proto and compile it
+ - (option2) change your `pycaffe` to the customized version
+
+4. Convert the Caffe model to Fluid model
+
+5. Set the environment variable `CAFFE2FLUID_CUSTOM_LAYERS` to the parent directory of `custom_layers`
+ ```
+ export CAFFE2FLUID_CUSTOM_LAYERS=/path/to/caffe2fluid/kaffe
+ ```
+
+6. Use the converted model by loading it from `xxxnet.py` and `xxxnet.npy` (not needed if the model is already in `fluid/model` and `fluid/params`)
+
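+For reference, here is a hypothetical `mylayer.py` following the same pattern
+as the shipped layers in `kaffe/custom_layers` (compare `power.py`); the layer
+kind `MyType` and its `scale` parameter are purely illustrative:
+
+```python
+""" a hypothetical custom layer illustrating the register() pattern """
+from .register import register
+
+
+def mylayer_shape(input_shape, scale=1.0):
+    """ an element-wise layer keeps the input shape unchanged """
+    return input_shape
+
+
+def mylayer_layer(input, name, scale=1.0):
+    """ build the layer from fluid ops """
+    import paddle.fluid as fluid
+    return fluid.layers.scale(input, scale=scale)
+
+
+register(kind='MyType', shape=mylayer_shape, layer=mylayer_layer)
+```
+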
+### Tested models
+- Lenet:
+[model addr](https://github.com/ethereon/caffe-tensorflow/blob/master/examples/mnist)
+
+- ResNets (ResNet-50, ResNet-101, ResNet-152):
+[model addr](https://onedrive.live.com/?authkey=%21AAFW2-FVoxeVRck&id=4006CBB8476FF777%2117887&cid=4006CBB8476FF777)
+
+- GoogleNet:
+[model addr](https://gist.github.com/jimmie33/7ea9f8ac0da259866b854460f4526034)
+
+- VGG:
+[model addr](https://gist.github.com/ksimonyan/211839e770f7b538e2d8)
+
+- AlexNet:
+[model addr](https://github.com/BVLC/caffe/tree/master/models/bvlc_alexnet)
+
+### Notes
+Some of this code comes from [caffe-tensorflow](https://github.com/ethereon/caffe-tensorflow).
diff --git a/fluid/PaddleCV/caffe2fluid/convert.py b/PaddleCV/caffe2fluid/convert.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/convert.py
rename to PaddleCV/caffe2fluid/convert.py
diff --git a/fluid/PaddleCV/caffe2fluid/examples/imagenet/README.md b/PaddleCV/caffe2fluid/examples/imagenet/README.md
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/examples/imagenet/README.md
rename to PaddleCV/caffe2fluid/examples/imagenet/README.md
diff --git a/fluid/PaddleCV/caffe2fluid/examples/imagenet/compare.py b/PaddleCV/caffe2fluid/examples/imagenet/compare.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/examples/imagenet/compare.py
rename to PaddleCV/caffe2fluid/examples/imagenet/compare.py
diff --git a/fluid/PaddleCV/caffe2fluid/examples/imagenet/data/65.jpeg b/PaddleCV/caffe2fluid/examples/imagenet/data/65.jpeg
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/examples/imagenet/data/65.jpeg
rename to PaddleCV/caffe2fluid/examples/imagenet/data/65.jpeg
diff --git a/fluid/PaddleCV/caffe2fluid/examples/imagenet/infer.py b/PaddleCV/caffe2fluid/examples/imagenet/infer.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/examples/imagenet/infer.py
rename to PaddleCV/caffe2fluid/examples/imagenet/infer.py
diff --git a/fluid/PaddleCV/caffe2fluid/examples/imagenet/tools/cmp.sh b/PaddleCV/caffe2fluid/examples/imagenet/tools/cmp.sh
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/examples/imagenet/tools/cmp.sh
rename to PaddleCV/caffe2fluid/examples/imagenet/tools/cmp.sh
diff --git a/fluid/PaddleCV/caffe2fluid/examples/imagenet/tools/cmp_layers.sh b/PaddleCV/caffe2fluid/examples/imagenet/tools/cmp_layers.sh
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/examples/imagenet/tools/cmp_layers.sh
rename to PaddleCV/caffe2fluid/examples/imagenet/tools/cmp_layers.sh
diff --git a/fluid/PaddleCV/caffe2fluid/examples/imagenet/tools/diff.sh b/PaddleCV/caffe2fluid/examples/imagenet/tools/diff.sh
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/examples/imagenet/tools/diff.sh
rename to PaddleCV/caffe2fluid/examples/imagenet/tools/diff.sh
diff --git a/fluid/PaddleCV/caffe2fluid/examples/imagenet/tools/run.sh b/PaddleCV/caffe2fluid/examples/imagenet/tools/run.sh
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/examples/imagenet/tools/run.sh
rename to PaddleCV/caffe2fluid/examples/imagenet/tools/run.sh
diff --git a/fluid/PaddleCV/caffe2fluid/examples/imagenet/tools/test.sh b/PaddleCV/caffe2fluid/examples/imagenet/tools/test.sh
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/examples/imagenet/tools/test.sh
rename to PaddleCV/caffe2fluid/examples/imagenet/tools/test.sh
diff --git a/fluid/PaddleCV/caffe2fluid/examples/mnist/README.md b/PaddleCV/caffe2fluid/examples/mnist/README.md
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/examples/mnist/README.md
rename to PaddleCV/caffe2fluid/examples/mnist/README.md
diff --git a/fluid/PaddleCV/caffe2fluid/examples/mnist/evaluate.py b/PaddleCV/caffe2fluid/examples/mnist/evaluate.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/examples/mnist/evaluate.py
rename to PaddleCV/caffe2fluid/examples/mnist/evaluate.py
diff --git a/fluid/PaddleCV/caffe2fluid/examples/mnist/run.sh b/PaddleCV/caffe2fluid/examples/mnist/run.sh
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/examples/mnist/run.sh
rename to PaddleCV/caffe2fluid/examples/mnist/run.sh
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/__init__.py b/PaddleCV/caffe2fluid/kaffe/__init__.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/__init__.py
rename to PaddleCV/caffe2fluid/kaffe/__init__.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/caffe/__init__.py b/PaddleCV/caffe2fluid/kaffe/caffe/__init__.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/caffe/__init__.py
rename to PaddleCV/caffe2fluid/kaffe/caffe/__init__.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/caffe/resolver.py b/PaddleCV/caffe2fluid/kaffe/caffe/resolver.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/caffe/resolver.py
rename to PaddleCV/caffe2fluid/kaffe/caffe/resolver.py
diff --git a/PaddleCV/caffe2fluid/kaffe/custom_layers/__init__.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..8505aee05f90f999347b687349fdfd7c7caf1a0f
--- /dev/null
+++ b/PaddleCV/caffe2fluid/kaffe/custom_layers/__init__.py
@@ -0,0 +1,114 @@
+"""
+"""
+
+from .register import get_registered_layers
+#custom layer import begins
+
+import axpy
+import flatten
+import argmax
+import reshape
+import roipooling
+import priorbox
+import permute
+import detection_out
+import normalize
+import select
+import crop
+import power
+import reduction
+
+#custom layer import ends
+
+custom_layers = get_registered_layers()
+
+
+def set_args(f, params, node=None):
+ """ set args for function 'f' using the parameters in node.layer.parameters
+
+ Args:
+ f (function): a python function object
+        params (object): an object containing the attributes needed by f's arguments
+
+ Returns:
+ arg_names (list): a list of argument names
+        kwargs (dict): a dict containing the needed arguments
+ """
+ from ..protobuf_to_dict import protobuf_to_dict
+
+ argc = f.__code__.co_argcount
+ arg_list = f.__code__.co_varnames[0:argc]
+
+ kwargs = {}
+ for arg_name in arg_list:
+ if arg_name in params:
+ kwargs[arg_name] = params[arg_name]
+
+ if node is not None and len(node.metadata):
+ kwargs.update(node.metadata)
+
+ return arg_list, kwargs
+
+
+def has_layer(kind):
+ """ test whether this layer exists in custom layer
+ """
+ return kind in custom_layers
+
+
+def compute_output_shape(kind, node):
+ assert kind in custom_layers, "layer[%s] not exist in custom layers" % (
+ kind)
+ shape_func = custom_layers[kind]['shape']
+
+ parents = node.parents
+ inputs = [list(p.output_shape) for p in parents]
+ arg_names, kwargs = set_args(shape_func, node.params)
+
+ if len(inputs) == 1:
+ inputs = inputs[0]
+
+ return shape_func(inputs, **kwargs)
+
+
+def make_node(template, kind, node):
+    """ make a PaddleNode for a custom layer, i.e. construct a piece of
+        code that defines a layer implemented in 'custom_layers'
+
+ Args:
+        @template (PaddleNode): a factory used to create new PaddleNode instances
+ @kind (str): type of custom layer
+ @node (graph.Node): a layer in the net
+
+ Returns:
+ instance of PaddleNode
+ """
+ assert kind in custom_layers, "layer[%s] not exist in custom layers" % (
+ kind)
+
+ layer_func = custom_layers[kind]['layer']
+
+ #construct arguments needed by custom layer function from node's parameters
+ arg_names, kwargs = set_args(layer_func, node.params, node)
+
+ return template('custom_layer', kind, **kwargs)
+
+
+def make_custom_layer(kind, inputs, name, *args, **kwargs):
+ """ execute a custom layer which is implemented by users
+
+ Args:
+ @kind (str): type name of this layer
+ @inputs (vars): variable list created by fluid
+        @name (str): name for this layer
+ @args (tuple): other positional arguments
+ @kwargs (dict): other kv arguments
+
+ Returns:
+ output (var): output variable for this layer
+ """
+ assert kind in custom_layers, "layer[%s] not exist in custom layers" % (
+ kind)
+
+ layer_func = custom_layers[kind]['layer']
+ return layer_func(inputs, name, *args, **kwargs)
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/argmax.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/argmax.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/argmax.py
rename to PaddleCV/caffe2fluid/kaffe/custom_layers/argmax.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/axpy.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/axpy.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/axpy.py
rename to PaddleCV/caffe2fluid/kaffe/custom_layers/axpy.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/crop.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/crop.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/crop.py
rename to PaddleCV/caffe2fluid/kaffe/custom_layers/crop.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/detection_out.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/detection_out.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/detection_out.py
rename to PaddleCV/caffe2fluid/kaffe/custom_layers/detection_out.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/flatten.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/flatten.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/flatten.py
rename to PaddleCV/caffe2fluid/kaffe/custom_layers/flatten.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/normalize.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/normalize.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/normalize.py
rename to PaddleCV/caffe2fluid/kaffe/custom_layers/normalize.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/permute.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/permute.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/permute.py
rename to PaddleCV/caffe2fluid/kaffe/custom_layers/permute.py
diff --git a/PaddleCV/caffe2fluid/kaffe/custom_layers/power.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/power.py
new file mode 100644
index 0000000000000000000000000000000000000000..a8b91f4394aa81eee717b4013d110d6e6a8dcb8e
--- /dev/null
+++ b/PaddleCV/caffe2fluid/kaffe/custom_layers/power.py
@@ -0,0 +1,40 @@
+""" a custom layer for 'power', maybe we should implement this in standard way.
+ more info can be found here: http://caffe.berkeleyvision.org/tutorial/layers/power.html
+"""
+from .register import register
+
+
+def power_shape(input_shape, shape=None):
+ """ calculate the output shape of this layer using input shape
+
+ Args:
+ @input_shape (list of num): a list of number which represents the input shape
+
+ Returns:
+ @output_shape (list of num): a list of numbers represent the output shape
+ """
+ return input_shape
+
+
+def power_layer(input, name, power=1.0, scale=1.0, shift=0.0):
+ """ build a layer of type 'Power' using fluid
+
+ Args:
+ @input (variables): input fluid variable for this layer
+ @name (str): name for this layer
+ @power (float): parameter from caffe's Power layer
+ @scale (float): parameter from caffe's Power layer
+ @shift (float): parameter from caffe's Power layer
+
+ Returns:
+ output (variable): output variable for this layer
+ """
+ import paddle.fluid as fluid
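+    # Caffe's Power layer computes (shift + scale * x) ** power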
+ scale_out = fluid.layers.scale(
+ input, scale=scale, bias=shift, bias_after_scale=True)
+ output = fluid.layers.pow(scale_out, factor=power)
+
+ return output
+
+
+register(kind='Power', shape=power_shape, layer=power_layer)
diff --git a/PaddleCV/caffe2fluid/kaffe/custom_layers/priorbox.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/priorbox.py
new file mode 100644
index 0000000000000000000000000000000000000000..e3eb6407caa7660347f81452b9668028eded81d5
--- /dev/null
+++ b/PaddleCV/caffe2fluid/kaffe/custom_layers/priorbox.py
@@ -0,0 +1,103 @@
+""" A custom layer for 'priorbox' which is used in ssd to generate prior box info
+    Since the order of prior boxes differs between caffe and paddle,
+    we use 'slice' and 'concat' ops to align them.
+"""
+
+from .register import register
+
+
+def priorbox_shape(input_shapes, min_size, max_size=None, aspect_ratio=None):
+ """ calculate the output shape of this layer using input shapes
+
+ Args:
+ @input_shapes (list of tuples): a list of input shapes
+
+ Returns:
+ @output_shape (list of num): a list of numbers represent the output shape
+ """
+    assert len(input_shapes) == 2, "invalid inputs for PriorBox"
+ fc_shape = input_shapes[0]
+ N = 1
+    if max_size is not None:
+        N += 1
+    if aspect_ratio is not None:
+        N += 2 * len(aspect_ratio)
+
+ N_bbx = fc_shape[2] * fc_shape[3] * N
+ output_shape = [1, 2, 4 * N_bbx]
+ return output_shape
+
+
+def priorbox_layer(inputs,
+ name,
+ min_size,
+ max_size=None,
+ aspect_ratio=None,
+ variance=[0.1, 0.1, 0.2, 0.2],
+ flip=False,
+ clip=False,
+ step=0.0,
+ offset=0.5):
+ """ build a layer of type 'Priorbox' using fluid
+
+ Args:
+ @inputs (list of variables): input fluid variables for this layer
+ @name (str): name for this layer
+
+ Returns:
+ output (variable): output variable for this layer
+ """
+ import paddle.fluid as fluid
+
+ assert len(inputs) == 2, "invalid inputs for Priorbox[%s]" % (name)
+ input = inputs[0]
+ image = inputs[1]
+ steps = tuple(step) if type(step) is list or type(step) is tuple else (step,
+ step)
+ box, variance_ = fluid.layers.prior_box(
+ input,
+ image,
+ min_size,
+ max_size,
+ aspect_ratio,
+ variance,
+ flip,
+ clip,
+ steps,
+ offset,
+ min_max_aspect_ratios_order=True)
+ """
+ #adjust layout when the output is not consistent with caffe's
+
+ feat_shape = list(input.shape)
+ H = feat_shape[2]
+ W = feat_shape[3]
+ box_tmp = fluid.layers.reshape(box, [H, W, -1, 4])
+ nb_prior_bbx = int(box_tmp.shape[2])
+ tensor_list = fluid.layers.split(box_tmp, nb_prior_bbx, 2)
+
+ #TODO:
+ # current implementation for this layer is not efficient
+ # and we should fix this bug in future when Paddle support the same prior-box layout with Caffe
+ index_list = [0]
+ index_list = index_list * nb_prior_bbx
+ index_offset = 0
+ if max_size is not None:
+ index_list[1] = -1
+ index_offset = 1
+ for ii in xrange(2 * len(aspect_ratio)):
+ index_list[ii + 1 + index_offset] = ii + 1
+
+ tensor_list_gathered = [tensor_list[ii] for ii in index_list]
+ caffe_prior_bbx = fluid.layers.concat(tensor_list_gathered, axis=2)
+ box = fluid.layers.reshape(caffe_prior_bbx, [1, 1, -1])
+ """
+
+ box = fluid.layers.reshape(box, [1, 1, -1])
+ variance_ = fluid.layers.reshape(variance_, [1, 1, -1])
+ output = fluid.layers.concat([box, variance_], axis=1)
+
+ return output
+
+
+register(kind='PriorBox', shape=priorbox_shape, layer=priorbox_layer)
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/reduction.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/reduction.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/reduction.py
rename to PaddleCV/caffe2fluid/kaffe/custom_layers/reduction.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/register.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/register.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/register.py
rename to PaddleCV/caffe2fluid/kaffe/custom_layers/register.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/reshape.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/reshape.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/reshape.py
rename to PaddleCV/caffe2fluid/kaffe/custom_layers/reshape.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/roipooling.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/roipooling.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/roipooling.py
rename to PaddleCV/caffe2fluid/kaffe/custom_layers/roipooling.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/select.py b/PaddleCV/caffe2fluid/kaffe/custom_layers/select.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/select.py
rename to PaddleCV/caffe2fluid/kaffe/custom_layers/select.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/errors.py b/PaddleCV/caffe2fluid/kaffe/errors.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/errors.py
rename to PaddleCV/caffe2fluid/kaffe/errors.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/graph.py b/PaddleCV/caffe2fluid/kaffe/graph.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/graph.py
rename to PaddleCV/caffe2fluid/kaffe/graph.py
diff --git a/PaddleCV/caffe2fluid/kaffe/layers.py b/PaddleCV/caffe2fluid/kaffe/layers.py
new file mode 100644
index 0000000000000000000000000000000000000000..0d0aa1adc3c5a5e21f74d188d1784a94cf4acf9e
--- /dev/null
+++ b/PaddleCV/caffe2fluid/kaffe/layers.py
@@ -0,0 +1,250 @@
+import re
+import numbers
+from collections import namedtuple
+
+import custom_layers
+from .shapes import *
+
+LAYER_DESCRIPTORS = {
+
+ # Caffe Types
+ 'AbsVal': shape_identity,
+ 'Accuracy': shape_scalar,
+ 'ArgMax': shape_not_implemented,
+ 'BatchNorm': shape_identity,
+ 'BNLL': shape_not_implemented,
+ 'Concat': shape_concat,
+ 'ContrastiveLoss': shape_scalar,
+ 'Convolution': shape_convolution,
+ 'Deconvolution': shape_deconvolution,
+ 'Data': shape_data,
+ 'Dropout': shape_identity,
+ 'DummyData': shape_data,
+ 'Crop': shape_crop,
+ 'EuclideanLoss': shape_scalar,
+ 'Eltwise': shape_identity,
+ 'Exp': shape_identity,
+ 'Flatten': shape_not_implemented,
+ 'HDF5Data': shape_data,
+ 'HDF5Output': shape_identity,
+ 'HingeLoss': shape_scalar,
+ 'Im2col': shape_not_implemented,
+ 'ImageData': shape_data,
+ 'InfogainLoss': shape_scalar,
+ 'InnerProduct': shape_inner_product,
+ 'Input': shape_data,
+ 'LRN': shape_identity,
+ 'MemoryData': shape_mem_data,
+ 'MultinomialLogisticLoss': shape_scalar,
+ 'MVN': shape_not_implemented,
+ 'Pooling': shape_pool,
+ 'Power': shape_power,
+ 'ReLU': shape_identity,
+ 'PReLU': shape_identity,
+ 'Scale': shape_identity,
+ 'Sigmoid': shape_identity,
+ 'SigmoidCrossEntropyLoss': shape_scalar,
+ 'Silence': shape_not_implemented,
+ 'Softmax': shape_identity,
+ 'SoftmaxWithLoss': shape_scalar,
+ 'Split': shape_not_implemented,
+ 'Slice': shape_not_implemented,
+ 'TanH': shape_identity,
+ 'WindowData': shape_not_implemented,
+ 'Threshold': shape_identity,
+}
+
+# layer types in 'V1LayerParameter'
+# (v1layertype name, enum value, mapped to layer type)
+v1_layertypes = [
+ ('ABSVAL', 35),
+ ('ACCURACY', 1),
+ ('ARGMAX', 30),
+ ('BNLL', 2),
+ ('CONCAT', 3),
+ ('CONVOLUTION', 4),
+ ('DATA', 5),
+ ('DECONVOLUTION', 39),
+ ('DROPOUT', 6),
+ ('ELTWISE', 25),
+ ('EXP', 38),
+ ('FLATTEN', 8),
+ ('IM2COL', 11),
+ ('INNERPRODUCT', 14),
+ ('LRN', 15),
+ ('MEMORYDATA', 29),
+ ('MULTINOMIALLOGISTICLOSS', 16),
+ ('MVN', 34),
+ ('POOLING', 17),
+ ('POWER', 26),
+ ('RELU', 18),
+ ('SIGMOID', 19),
+ ('SIGMOIDCROSSENTROPYLOSS', 27),
+ ('SILENCE', 36),
+ ('SOFTMAX', 20),
+ ('SPLIT', 22),
+ ('SLICE', 33),
+ ('TANH', 23),
+ ('WINDOWDATA', 24),
+ ('THRESHOLD', 31),
+]
+
+LAYER_TYPES = LAYER_DESCRIPTORS.keys()
+LayerType = type('LayerType', (), {t: t for t in LAYER_TYPES})
+
+#map the layer name in V1 to standard name
+V1_LAYER_MAP = {'_not_init_': True}
+
+
+def get_v1_layer_map():
+ global V1_LAYER_MAP
+ if '_not_init_' not in V1_LAYER_MAP:
+ return V1_LAYER_MAP
+ else:
+ del V1_LAYER_MAP['_not_init_']
+
+ name2layer = {}
+ for n in LAYER_TYPES:
+ name2layer[n.upper()] = n
+
+ for l in v1_layertypes:
+ n, v = l
+ if n in name2layer and v not in V1_LAYER_MAP:
+ V1_LAYER_MAP[v] = name2layer[n]
+ else:
+ raise KaffeError('not found v1 layer type %s' % n)
+ return V1_LAYER_MAP
+
+
+class NodeKind(LayerType):
+ @staticmethod
+ def map_raw_kind(kind):
+ if custom_layers.has_layer(kind):
+ return kind
+
+ if kind in LAYER_TYPES:
+ return kind
+
+ v1_layers = get_v1_layer_map()
+ if kind in v1_layers:
+ return v1_layers[kind]
+ else:
+ return None
+
+ @staticmethod
+ def compute_output_shape(node):
+ if custom_layers.has_layer(node.kind):
+ return custom_layers.compute_output_shape(node.kind, node)
+
+ try:
+ val = LAYER_DESCRIPTORS[node.kind](node)
+ return val
+ except NotImplementedError:
+ raise KaffeError(
+ 'Output shape computation not implemented for type: %s' %
+ node.kind)
+
+
+class NodeDispatchError(KaffeError):
+ pass
+
+
+class NodeDispatch(object):
+ @staticmethod
+ def get_handler_name(node_kind):
+ if len(node_kind) <= 6:
+ # A catch-all for things like ReLU and tanh
+ return node_kind.lower()
+ # Convert from CamelCase to under_scored
+ name = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', node_kind)
+ return re.sub('([a-z0-9])([A-Z])', r'\1_\2', name).lower()
+
+ def get_handler(self, node_kind, prefix):
+ if custom_layers.has_layer(node_kind):
+ return getattr(self, 'map_custom')
+
+ name = self.get_handler_name(node_kind)
+ name = '_'.join((prefix, name))
+ try:
+ return getattr(self, name)
+ except AttributeError:
+ raise NodeDispatchError(
+ 'No handler found for node kind: %s (expected: %s)' %
+ (node_kind, name))
+
+
+class LayerAdapter(object):
+ def __init__(self, layer, kind):
+ self.layer = layer
+ self.kind = kind
+
+ @property
+ def parameters(self):
+ name = NodeDispatch.get_handler_name(self.kind)
+ if self.kind.lower() == "normalize":
+ name = "norm"
+ elif self.kind.lower() == "deconvolution":
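+    # mixup (Zhang et al., ICLR 2018): draw lam ~ Beta(alpha, alpha) and
+    # convexly combine each sample with a randomly permuted partner:
+    #   x_mix = lam * x_i + (1 - lam) * x_j, keeping both labels y_a and y_b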
+ name = "convolution"
+
+ name = '_'.join((name, 'param'))
+ try:
+ return getattr(self.layer, name)
+ except AttributeError:
+ print(dir(self.layer))
+ raise NodeDispatchError(
+ 'Caffe parameters not found attr[%s] for layer kind[%s]' %
+ (name, self.kind))
+
+ @staticmethod
+ def get_kernel_value(scalar, repeated, idx, default=None):
+ if scalar:
+ return scalar
+ if repeated:
+ if isinstance(repeated, numbers.Number):
+ return repeated
+ if len(repeated) == 1:
+ # Same value applies to all spatial dimensions
+ return int(repeated[0])
+ assert idx < len(repeated)
+ # Extract the value for the given spatial dimension
+ return repeated[idx]
+ if default is None:
+ raise ValueError('Unable to determine kernel parameter!')
+ return default
+
+ @property
+ def kernel_parameters(self):
+ assert self.kind in (NodeKind.Convolution, NodeKind.Pooling,\
+ NodeKind.Deconvolution)
+
+ params = self.parameters
+ k_h = self.get_kernel_value(params.kernel_h, params.kernel_size, 0)
+ k_w = self.get_kernel_value(params.kernel_w, params.kernel_size, 1)
+ s_h = self.get_kernel_value(
+ params.stride_h, params.stride, 0, default=1)
+ s_w = self.get_kernel_value(
+ params.stride_w, params.stride, 1, default=1)
+ p_h = self.get_kernel_value(params.pad_h, params.pad, 0, default=0)
+ p_w = self.get_kernel_value(params.pad_w, params.pad, 1, default=0)
+
+ dila_h = dila_w = 1
+ if self.kind in (NodeKind.Convolution, NodeKind.Deconvolution):
+ dila_len = len(params.dilation)
+ if dila_len == 2:
+ dila_h = params.dilation[0]
+ dila_w = params.dilation[1]
+ elif dila_len == 1:
+ dila_h = dila_w = params.dilation[0]
+ else:
+ assert dila_len == 0, "invalid length[%s] of dilation in convolution" % (
+ dila_len)
+
+ return KernelParameters(k_h, k_w, s_h, s_w, p_h, p_w, dila_h, dila_w)
+
+
+KernelParameters = namedtuple(
+ 'KernelParameters',
+ [
+ 'kernel_h', 'kernel_w', 'stride_h', 'stride_w', 'pad_h', 'pad_w',
+ 'dila_h', 'dila_w'
+ ], )
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/net_template.py b/PaddleCV/caffe2fluid/kaffe/net_template.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/net_template.py
rename to PaddleCV/caffe2fluid/kaffe/net_template.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/paddle/__init__.py b/PaddleCV/caffe2fluid/kaffe/paddle/__init__.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/paddle/__init__.py
rename to PaddleCV/caffe2fluid/kaffe/paddle/__init__.py
diff --git a/PaddleCV/caffe2fluid/kaffe/paddle/network.py b/PaddleCV/caffe2fluid/kaffe/paddle/network.py
new file mode 100644
index 0000000000000000000000000000000000000000..718bd196fa107b7adf20ff09d1ec192b090af8cd
--- /dev/null
+++ b/PaddleCV/caffe2fluid/kaffe/paddle/network.py
@@ -0,0 +1,576 @@
+import sys
+import os
+import math
+import numpy as np
+
+
+def import_fluid():
+ import paddle.fluid as fluid
+ return fluid
+
+
+def layer(op):
+ '''Decorator for composable network layers.'''
+
+ def layer_decorated(self, *args, **kwargs):
+ # Automatically set a name if not provided.
+ name = kwargs.setdefault('name', self.get_unique_name(op.__name__))
+ # Figure out the layer inputs.
+ if len(self.terminals) == 0:
+ raise RuntimeError('No input variables found for layer %s.' % name)
+ elif len(self.terminals) == 1:
+ layer_input = self.terminals[0]
+ else:
+ layer_input = list(self.terminals)
+
+ self.layer_reverse_trace[name] = layer_input
+ # Perform the operation and get the output.
+ layer_output = op(self, layer_input, *args, **kwargs)
+ # Add to layer LUT.
+ self.layers[name] = layer_output
+ self.var2name[layer_output.name] = (name, layer_output)
+
+ # This output is now the input for the next layer.
+ self.feed(layer_output)
+ # Return self for chained calls.
+ return self
+
+ return layer_decorated
+
+
+class Network(object):
+ def __init__(self, inputs, trainable=True):
+ # The input nodes for this network
+ self.inputs = inputs
+ # The current list of terminal nodes
+ self.terminals = []
+ # Mapping from layer names to layers
+ self.layers = dict(inputs)
+ # If true, the resulting variables are set as trainable
+ self.trainable = trainable
+        # Paddle runtime environment (place and executor), created lazily
+ self.paddle_env = None
+ self.output_names = []
+ self.name_trace = None
+
+ self.layer_reverse_trace = {}
+ self.var2name = {}
+ self.setup()
+
+ def setup(self):
+ '''Construct the network. '''
+ raise NotImplementedError('Must be implemented by the subclass.')
+
+ def locate_ancestor(self, v, which=[0], ancestor_level=1):
+        """ find an ancestor of the node 'v', which is a fluid variable
+ """
+ ancestor = None
+ which = which * ancestor_level
+ name = self.var2name[v.name][0]
+
+ for i in range(ancestor_level):
+ v = self.layer_reverse_trace[name]
+ if type(v) is list:
+ ancestor = self.var2name[v[which[i]].name]
+ else:
+ ancestor = self.var2name[v.name]
+ name = ancestor[0]
+ return ancestor
+
+ def load(self, data_path, exe=None, place=None, ignore_missing=False):
+ '''Load network weights.
+ data_path: The path to the numpy-serialized network weights
+ ignore_missing: If true, serialized weights for missing layers are ignored.
+ '''
+ fluid = import_fluid()
+        #load fluid model directly
+ if os.path.isdir(data_path):
+            assert (exe is not None), \
+                'must provide an executor to load fluid model'
+ fluid.io.load_persistables(executor=exe, dirname=data_path)
+ return True
+
+ #load model from a npy file
+ if exe is None or place is None:
+ if self.paddle_env is None:
+ place = fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ self.paddle_env = {'place': place, 'exe': exe}
+                exe.run(fluid.default_startup_program())
+ else:
+ place = self.paddle_env['place']
+ exe = self.paddle_env['exe']
+
+ data_dict = np.load(data_path).item()
+ for op_name in data_dict:
+ if op_name == 'caffe2fluid_name_trace':
+ self.name_trace = data_dict[op_name]
+ continue
+
+ layer = self.layers[op_name]
+ for param_name, data in data_dict[op_name].iteritems():
+ try:
+ name = '%s_%s' % (op_name, param_name)
+ v = fluid.global_scope().find_var(name)
+ w = v.get_tensor()
+ w.set(data.reshape(w.shape()), place)
+ except ValueError:
+ if not ignore_missing:
+ raise
+ return True
+
+ def feed(self, *args):
+ '''Set the input(s) for the next operation by replacing the terminal nodes.
+ The arguments can be either layer names or the actual layers.
+ '''
+ assert len(args) != 0
+ self.terminals = []
+ for fed_layer in args:
+ if isinstance(fed_layer, basestring):
+ try:
+ fed_layer = self.layers[fed_layer]
+ except KeyError:
+ raise KeyError('Unknown layer name fed: %s' % fed_layer)
+ self.terminals.append(fed_layer)
+ return self
+
+ def get_output(self):
+ '''Returns the current network output.'''
+ return self.terminals[-1]
+
+ def get_unique_name(self, prefix):
+ '''Returns an index-suffixed unique name for the given prefix.
+ This is used for auto-generating layer names based on the type-prefix.
+ '''
+ ident = sum(t.startswith(prefix) for t, _ in self.layers.items()) + 1
+ return '%s_%d' % (prefix, ident)
+
+ def get_unique_output_name(self, prefix, layertype):
+ '''Returns an index-suffixed unique name for the given prefix.
+ This is used for auto-generating layer names based on the type-prefix.
+ '''
+ ident = sum(t.startswith(prefix) for t in self.output_names) + 1
+ unique_name = '%s.%s.output.%d' % (prefix, layertype, ident)
+ self.output_names.append(unique_name)
+ return unique_name
+
+ @layer
+ def conv(self,
+ input,
+ k_h,
+ k_w,
+ c_o,
+ s_h,
+ s_w,
+ name,
+ relu=True,
+ relu_negative_slope=0.0,
+ padding=None,
+ dilation=1,
+ group=1,
+ biased=True):
+ if padding is None:
+ padding = [0, 0]
+
+ # Get the number of channels in the input
+ c_i, h_i, w_i = input.shape[1:]
+
+ # Verify that the grouping parameter is valid
+ assert c_i % group == 0
+ assert c_o % group == 0
+
+ fluid = import_fluid()
+ prefix = name + '_'
+ leaky_relu = False
+ act = 'relu'
+ if relu is False:
+ act = None
+ elif relu_negative_slope != 0.0:
+ leaky_relu = True
+ act = None
+
+ output = fluid.layers.conv2d(
+ name=self.get_unique_output_name(name, 'conv2d'),
+ input=input,
+ filter_size=[k_h, k_w],
+ num_filters=c_o,
+ stride=[s_h, s_w],
+ padding=padding,
+ dilation=dilation,
+ groups=group,
+ param_attr=fluid.ParamAttr(name=prefix + "weights"),
+ bias_attr=fluid.ParamAttr(name=prefix + "biases"),
+ act=act)
+
+ if leaky_relu:
+ output = fluid.layers.leaky_relu(output, alpha=relu_negative_slope)
+
+ return output
+
+ @layer
+ def deconv(self,
+ input,
+ k_h,
+ k_w,
+ c_o,
+ s_h,
+ s_w,
+ name,
+ relu=True,
+ relu_negative_slope=0.0,
+ padding=None,
+ dilation=1,
+ biased=True):
+ if padding is None:
+ padding = [0, 0]
+
+ # Get the number of channels in the input
+ c_i, h_i, w_i = input.shape[1:]
+
+ fluid = import_fluid()
+ prefix = name + '_'
+ leaky_relu = False
+ act = 'relu'
+ if relu is False:
+ act = None
+ elif relu_negative_slope != 0.0:
+ leaky_relu = True
+ act = None
+
+ p_h = padding[0]
+ p_w = padding[1]
+ h_o = (h_i - 1) * s_h - 2 * p_h + dilation * (k_h - 1) + 1
+ w_o = (w_i - 1) * s_w - 2 * p_w + dilation * (k_w - 1) + 1
+ output = fluid.layers.conv2d_transpose(
+ name=self.get_unique_output_name(name, 'conv2d_transpose'),
+ input=input,
+ num_filters=c_o,
+ output_size=[h_o, w_o],
+ filter_size=[k_h, k_w],
+ padding=padding,
+ stride=[s_h, s_w],
+ dilation=dilation,
+ param_attr=fluid.ParamAttr(name=prefix + "weights"),
+ bias_attr=fluid.ParamAttr(name=prefix + "biases"),
+ act=act)
+
+ if leaky_relu:
+ output = fluid.layers.leaky_relu(output, alpha=relu_negative_slope)
+
+ return output
+
+ @layer
+ def relu(self, input, name):
+ fluid = import_fluid()
+ output = fluid.layers.relu(input)
+ return output
+
+ @layer
+ def prelu(self, input, channel_shared, name):
+ fluid = import_fluid()
+ if channel_shared:
+ mode = 'all'
+ else:
+ mode = 'channel'
+
+ prefix = name + '_'
+ output = fluid.layers.prelu(
+ input,
+ mode=mode,
+ param_attr=fluid.ParamAttr(name=prefix + 'negslope'))
+ return output
+
+ def pool(self,
+ pool_type,
+ input,
+ k_h,
+ k_w,
+ s_h,
+ s_w,
+ ceil_mode,
+ padding,
+ name,
+ exclusive=True):
+ # Get the spatial size of the input
+ in_hw = input.shape[2:]
+ k_hw = [k_h, k_w]
+ s_hw = [s_h, s_w]
+
+ fluid = import_fluid()
+ output = fluid.layers.pool2d(
+ name=name,
+ input=input,
+ pool_size=k_hw,
+ pool_stride=s_hw,
+ pool_padding=padding,
+ ceil_mode=ceil_mode,
+ pool_type=pool_type,
+ exclusive=exclusive)
+ return output
+
+ @layer
+ def max_pool(self,
+ input,
+ k_h,
+ k_w,
+ s_h,
+ s_w,
+ ceil_mode,
+ padding=[0, 0],
+ name=None):
+ return self.pool(
+ 'max',
+ input,
+ k_h,
+ k_w,
+ s_h,
+ s_w,
+ ceil_mode,
+ padding,
+ name=self.get_unique_output_name(name, 'max_pool'))
+
+ @layer
+ def avg_pool(self,
+ input,
+ k_h,
+ k_w,
+ s_h,
+ s_w,
+ ceil_mode,
+ padding=[0, 0],
+ name=None):
+ return self.pool(
+ 'avg',
+ input,
+ k_h,
+ k_w,
+ s_h,
+ s_w,
+ ceil_mode,
+ padding,
+ name=self.get_unique_output_name(name, 'avg_pool'),
+ exclusive=False)
+
+ @layer
+ def sigmoid(self, input, name):
+ fluid = import_fluid()
+ return fluid.layers.sigmoid(
+ input, name=self.get_unique_output_name(name, 'sigmoid'))
+
+ @layer
+ def tanh(self, input, name):
+ fluid = import_fluid()
+ return fluid.layers.tanh(
+ input, name=self.get_unique_output_name(name, 'tanh'))
+
+ @layer
+ def lrn(self, input, radius, alpha, beta, name, bias=1.0):
+ fluid = import_fluid()
+ output = fluid.layers.lrn(input=input,
+ n=radius,
+ k=bias,
+ alpha=alpha,
+ beta=beta,
+ name=self.get_unique_output_name(name, 'lrn'))
+ return output
+
+ @layer
+ def concat(self, inputs, axis, name):
+ fluid = import_fluid()
+ output = fluid.layers.concat(
+ input=inputs,
+ axis=axis,
+ name=self.get_unique_output_name(name, 'concat'))
+ return output
+
+ @layer
+ def add(self, inputs, name):
+ fluid = import_fluid()
+ output = inputs[0]
+ for i in inputs[1:]:
+ output = fluid.layers.elementwise_add(
+ x=output, y=i, name=self.get_unique_output_name(name, 'add'))
+ return output
+
+ @layer
+ def max(self, inputs, name):
+ fluid = import_fluid()
+ output = inputs[0]
+ for i in inputs[1:]:
+ output = fluid.layers.elementwise_max(
+ x=output, y=i, name=self.get_unique_output_name(name, 'max'))
+ return output
+
+ @layer
+ def multiply(self, inputs, name):
+ fluid = import_fluid()
+ output = inputs[0]
+ for i in inputs[1:]:
+ output = fluid.layers.elementwise_mul(
+ x=output, y=i, name=self.get_unique_output_name(name, 'mul'))
+ return output
+
+ @layer
+ def fc(self, input, num_out, name, relu=True, act=None):
+ fluid = import_fluid()
+
+ if act is None:
+ act = 'relu' if relu is True else None
+
+ prefix = name + '_'
+ output = fluid.layers.fc(
+ name=self.get_unique_output_name(name, 'fc'),
+ input=input,
+ size=num_out,
+ act=act,
+ param_attr=fluid.ParamAttr(name=prefix + 'weights'),
+ bias_attr=fluid.ParamAttr(name=prefix + 'biases'))
+ return output
+
+ @layer
+ def softmax(self, input, axis=2, name=None):
+ fluid = import_fluid()
+ shape = input.shape
+ dims = len(shape)
+ axis = axis + dims if axis < 0 else axis
+
+ need_transpose = (axis + 1 != dims)
+
+ if need_transpose:
+ # move `axis` to the last position so softmax runs over the last dim
+ order = list(range(dims))
+ order.remove(axis)
+ order.append(axis)
+ input = fluid.layers.transpose(
+ input,
+ perm=order,
+ name=self.get_unique_output_name(name, 'transpose'))
+
+ output = fluid.layers.softmax(
+ input, name=self.get_unique_output_name(name, 'softmax'))
+
+ if need_transpose:
+ # restore the original order: move the last dim back to position `axis`
+ order = list(range(dims))
+ order.insert(axis, order.pop(-1))
+ output = fluid.layers.transpose(
+ output,
+ perm=order,
+ name=self.get_unique_output_name(name, 'transpose'))
+ return output
+
+ @layer
+ def batch_normalization(self,
+ input,
+ name,
+ scale_offset=True,
+ eps=1e-5,
+ relu=False,
+ relu_negative_slope=0.0):
+ # NOTE: Currently, only inference is supported
+ fluid = import_fluid()
+ prefix = name + '_'
+ param_attr = None if scale_offset is False else fluid.ParamAttr(
+ name=prefix + 'scale')
+ bias_attr = None if scale_offset is False else fluid.ParamAttr(
+ name=prefix + 'offset')
+ mean_name = prefix + 'mean'
+ variance_name = prefix + 'variance'
+
+ leaky_relu = False
+ act = 'relu'
+ if relu is False:
+ act = None
+ elif relu_negative_slope != 0.0:
+ leaky_relu = True
+ act = None
+
+ output = fluid.layers.batch_norm(
+ name=self.get_unique_output_name(name, 'batch_norm'),
+ input=input,
+ is_test=True,
+ param_attr=param_attr,
+ bias_attr=bias_attr,
+ moving_mean_name=mean_name,
+ moving_variance_name=variance_name,
+ epsilon=eps,
+ act=act)
+
+ if leaky_relu:
+ output = fluid.layers.leaky_relu(output, alpha=relu_negative_slope)
+
+ return output
+
+ @layer
+ def dropout(self, input, drop_prob, name, is_test=True):
+ fluid = import_fluid()
+ if is_test:
+ output = input
+ else:
+ output = fluid.layers.dropout(
+ input,
+ dropout_prob=drop_prob,
+ is_test=is_test,
+ name=self.get_unique_output_name(name, 'dropout'))
+ return output
+
+ @layer
+ def scale(self, input, axis=1, num_axes=1, name=None):
+ fluid = import_fluid()
+
+ assert num_axes == 1, "the scale layer does not support num_axes=%d yet" % (
+ num_axes)
+
+ prefix = name + '_'
+ scale_shape = input.shape[axis:axis + num_axes]
+ param_attr = fluid.ParamAttr(name=prefix + 'scale')
+ scale_param = fluid.layers.create_parameter(
+ shape=scale_shape,
+ dtype=input.dtype,
+ name=name,
+ attr=param_attr,
+ is_bias=True,
+ default_initializer=fluid.initializer.Constant(value=1.0))
+
+ offset_attr = fluid.ParamAttr(name=prefix + 'offset')
+ offset_param = fluid.layers.create_parameter(
+ shape=scale_shape,
+ dtype=input.dtype,
+ name=name,
+ attr=offset_attr,
+ is_bias=True,
+ default_initializer=fluid.initializer.Constant(value=0.0))
+
+ output = fluid.layers.elementwise_mul(
+ input,
+ scale_param,
+ axis=axis,
+ name=self.get_unique_output_name(name, 'scale_mul'))
+ output = fluid.layers.elementwise_add(
+ output,
+ offset_param,
+ axis=axis,
+ name=self.get_unique_output_name(name, 'scale_add'))
+ return output
+
+ def custom_layer_factory(self):
+ """ get a custom layer maker provided by subclass
+ """
+ raise NotImplementedError(
+ '[custom_layer_factory] must be implemented by the subclass.')
+
+ @layer
+ def custom_layer(self, inputs, kind, name, *args, **kwargs):
+ """ make custom layer
+ """
+ #FIX ME:
+ # there is a trick for different API between caffe and paddle
+ if kind == "DetectionOutput":
+ conf_var = inputs[1]
+ real_conf_var = self.locate_ancestor(conf_var, ancestor_level=2)
+ inputs[1] = real_conf_var[1]
+
+ name = self.get_unique_output_name(name, kind)
+ layer_factory = self.custom_layer_factory()
+ return layer_factory(kind, inputs, name, *args, **kwargs)
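+
+
+# Usage sketch (illustrative only; the class and variable names below are
+# hypothetical and not part of the generated code). A converted model
+# subclasses Network and chains the decorated layer methods: each call feeds
+# its output to the next layer through feed(), and get_output() returns the
+# last terminal node:
+#
+# class SimpleNet(Network):
+#     def setup(self):
+#         (self.feed('data')
+#              .conv(5, 5, 20, 1, 1, relu=True, name='conv1')
+#              .max_pool(2, 2, 2, 2, False, name='pool1')
+#              .fc(500, name='fc1')
+#              .softmax(axis=1, name='prob'))
+#
+# net = SimpleNet({'data': data_var})  # data_var: a fluid input variable
+# prob = net.get_output()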
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/paddle/transformer.py b/PaddleCV/caffe2fluid/kaffe/paddle/transformer.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/paddle/transformer.py
rename to PaddleCV/caffe2fluid/kaffe/paddle/transformer.py
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/protobuf_to_dict.py b/PaddleCV/caffe2fluid/kaffe/protobuf_to_dict.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/protobuf_to_dict.py
rename to PaddleCV/caffe2fluid/kaffe/protobuf_to_dict.py
diff --git a/PaddleCV/caffe2fluid/kaffe/shapes.py b/PaddleCV/caffe2fluid/kaffe/shapes.py
new file mode 100644
index 0000000000000000000000000000000000000000..4bbdbdebd403f524d0db23e206f0fd394d8e46e4
--- /dev/null
+++ b/PaddleCV/caffe2fluid/kaffe/shapes.py
@@ -0,0 +1,160 @@
+import math
+from collections import namedtuple
+
+from .errors import KaffeError
+
+Tensor4DShape = namedtuple('Tensor4DShape',
+ ['batch_size', 'channels', 'height', 'width'])
+
+Tensor3DShape = namedtuple('Tensor3DShape', ['batch_size', 'data1', 'data2'])
+
+Tensor2DShape = namedtuple('Tensor2DShape', ['batch_size', 'data'])
+
+ScalarShape = namedtuple('ScalarShape', ['batch_size'])
+
+
+def make_tensor(batch_size, d1=None, d2=None, d3=None):
+ if d3 is not None:
+ return Tensor4DShape(batch_size, d1, d2, d3)
+ elif d1 is not None and d2 is not None:
+ return Tensor3DShape(batch_size, d1, d2)
+ elif d1 is not None and d2 is None:
+ return Tensor2DShape(batch_size, d1)
+ elif d1 is None and d2 is None and d3 is None:
+ return ScalarShape(batch_size)
+ else:
+ raise NotImplementedError('invalid params for make_tensor %s' \
+ % (str((batch_size, d1, d2, d3))))
+
+
+def get_filter_output_shape(i_h, i_w, params, round_func):
+ dila_h = getattr(params, 'dila_h', 1)
+ dila_w = getattr(params, 'dila_w', 1)
+
+ o_h = (i_h + 2 * params.pad_h -
+ (dila_h * (params.kernel_h - 1) + 1)) / float(params.stride_h) + 1
+ o_w = (i_w + 2 * params.pad_w -
+ (dila_w * (params.kernel_w - 1) + 1)) / float(params.stride_w) + 1
+
+ return (int(round_func(o_h)), int(round_func(o_w)))
+
+
+def get_strided_kernel_output_shape(node, round_func):
+ assert node.layer is not None
+ input_shape = node.get_only_parent().output_shape
+ o_h, o_w = get_filter_output_shape(input_shape.height, input_shape.width,
+ node.layer.kernel_parameters, round_func)
+ params = node.layer.parameters
+ has_c_o = hasattr(params, 'num_output')
+ c = params.num_output if has_c_o else input_shape.channels
+ return make_tensor(input_shape.batch_size, c, o_h, o_w)
+
+
+def shape_not_implemented(node):
+ raise NotImplementedError
+
+
+def shape_identity(node):
+ assert len(node.parents) > 0
+ return node.parents[0].output_shape
+
+
+def shape_scalar(node):
+ return make_tensor(1, 1, 1, 1)
+
+
+def shape_crop(node):
+ raise KaffeError('the crop layer is implemented in custom_layers')
+
+
+def shape_power(node):
+ raise KaffeError('the power layer is implemented in custom_layers')
+
+
+def shape_data(node):
+ if node.output_shape:
+ # Old-style input specification
+ shape = node.output_shape
+ else:
+ try:
+ # New-style input specification
+ shape = list(map(int, node.parameters.shape[0].dim))
+ except:
+ # We most likely have a data layer on our hands. The problem is,
+ # Caffe infers the dimensions of the data from the source (eg: LMDB).
+ # We want to avoid reading datasets here. Fail for now.
+ # This can be temporarily fixed by transforming the data layer to
+ # Caffe's "input" layer (as is usually used in the "deploy" version).
+ # TODO: Find a better solution for this.
+ raise KaffeError(
+ 'Cannot determine dimensions of data layer.\n'
+ 'See comments in function shape_data for more info.')
+ return shape
+
+
+def shape_mem_data(node):
+ params = node.parameters
+ return make_tensor(params.batch_size, params.channels, params.height,
+ params.width)
+
+
+def shape_concat(node):
+ axis = node.layer.parameters.axis
+ output_shape = None
+ for parent in node.parents:
+ if output_shape is None:
+ output_shape = list(parent.output_shape)
+ else:
+ output_shape[axis] += parent.output_shape[axis]
+ return tuple(output_shape)
+
+
+def shape_convolution(node):
+ return get_strided_kernel_output_shape(node, math.floor)
+
+
+def shape_deconvolution(node):
+ assert node.layer is not None
+ input_shape = node.get_only_parent().output_shape
+ h_i = input_shape.height
+ w_i = input_shape.width
+
+ params = node.layer.kernel_parameters
+ p_h = params.pad_h
+ p_w = params.pad_w
+
+ dila_h = params.dila_h
+ dila_w = params.dila_w
+
+ k_h = params.kernel_h
+ k_w = params.kernel_w
+
+ s_h = params.stride_h
+ s_w = params.stride_w
+
+ h_o = (h_i - 1) * s_h - 2 * p_h + dila_h * (k_h - 1) + 1
+ w_o = (w_i - 1) * s_w - 2 * p_w + dila_w * (k_w - 1) + 1
+
+ params = node.layer.parameters
+ has_c_o = hasattr(params, 'num_output')
+ c = params.num_output if has_c_o else input_shape.channels
+ return make_tensor(input_shape.batch_size, c, h_o, w_o)
+
+
+def shape_pool(node):
+ global_pool = getattr(node.layer.parameters, 'global_pooling', False)
+ if global_pool:
+ input_shape = node.get_only_parent().output_shape
+ return make_tensor(input_shape.batch_size, input_shape.channels, 1, 1)
+
+ ceil_mode = getattr(node.layer.parameters, 'ceil_mode', True)
+ if ceil_mode is True:
+ method = math.ceil
+ else:
+ method = math.floor
+ return get_strided_kernel_output_shape(node, method)
+
+
+def shape_inner_product(node):
+ input_shape = node.get_only_parent().output_shape
+ return make_tensor(input_shape.batch_size, node.layer.parameters.num_output)
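+
+
+# Usage sketch (illustrative only; KernelParams is a stand-in for the kernel
+# parameter record that the graph builder normally attaches to a layer). It
+# shows how get_filter_output_shape rounds a 3x3, stride-2, pad-1 convolution
+# applied to a 224x224 input:
+#
+# KernelParams = namedtuple('KernelParams',
+#     ['kernel_h', 'kernel_w', 'stride_h', 'stride_w', 'pad_h', 'pad_w'])
+# params = KernelParams(3, 3, 2, 2, 1, 1)
+# get_filter_output_shape(224, 224, params, math.floor)  # -> (112, 112)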
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/transformers.py b/PaddleCV/caffe2fluid/kaffe/transformers.py
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/kaffe/transformers.py
rename to PaddleCV/caffe2fluid/kaffe/transformers.py
diff --git a/fluid/PaddleCV/caffe2fluid/proto/caffe.proto b/PaddleCV/caffe2fluid/proto/caffe.proto
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/proto/caffe.proto
rename to PaddleCV/caffe2fluid/proto/caffe.proto
diff --git a/fluid/PaddleCV/caffe2fluid/proto/compile.sh b/PaddleCV/caffe2fluid/proto/compile.sh
similarity index 100%
rename from fluid/PaddleCV/caffe2fluid/proto/compile.sh
rename to PaddleCV/caffe2fluid/proto/compile.sh
diff --git a/PaddleCV/deeplabv3+/.gitignore b/PaddleCV/deeplabv3+/.gitignore
new file mode 100644
index 0000000000000000000000000000000000000000..cfe470860395ca802c6d95dc829385cd0d112ad4
--- /dev/null
+++ b/PaddleCV/deeplabv3+/.gitignore
@@ -0,0 +1,6 @@
+*.tgz
+deeplabv3plus_gn_init*
+deeplabv3plus_xception65_initialize*
+*.log
+*.sh
+output*
diff --git a/PaddleCV/deeplabv3+/.run_ce.sh b/PaddleCV/deeplabv3+/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..c4e6055e1d9d3ad9a9b039d5973c100e76a8aadf
--- /dev/null
+++ b/PaddleCV/deeplabv3+/.run_ce.sh
@@ -0,0 +1,28 @@
+#!/bin/bash
+
+export MKL_NUM_THREADS=1
+export OMP_NUM_THREADS=1
+
+DATASET_PATH=${HOME}/.cache/paddle/dataset/cityscape/
+
+cudaid=${deeplabv3plus:=0} # use 0-th card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py \
+--batch_size=2 \
+--train_crop_size=769 \
+--total_step=50 \
+--save_weights_path=output1 \
+--dataset_path=$DATASET_PATH \
+--enable_ce | python _ce.py
+
+cudaid=${deeplabv3plus_m:=0,1,2,3} # use 0,1,2,3 card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py \
+--batch_size=8 \
+--train_crop_size=769 \
+--total_step=50 \
+--save_weights_path=output4 \
+--dataset_path=$DATASET_PATH \
+--enable_ce | python _ce.py
diff --git a/PaddleCV/deeplabv3+/README.md b/PaddleCV/deeplabv3+/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..eff83fee192d6a34cb338f5c705fb7ec1f59fd07
--- /dev/null
+++ b/PaddleCV/deeplabv3+/README.md
@@ -0,0 +1,116 @@
+DeepLab: running the example programs in this directory requires PaddlePaddle Fluid v1.3.0 or above. If your installed version of PaddlePaddle is lower than this requirement, please update it following the instructions in the installation documentation. When running on GPU, cuDNN v7 is required.
+
+
+## Code structure
+```
+├── models.py # network architecture definition
+├── train.py # training script
+├── eval.py # evaluation script
+└── reader.py # common utilities and data preprocessing
+```
+
+## Introduction
+
+DeepLabv3+ is the latest work in the DeepLab series of semantic segmentation networks, following DeepLabv1, DeepLabv2, and DeepLabv3.
+In this latest work, the authors fuse multi-scale information through an encoder-decoder structure while keeping the original atrous convolutions and the ASPP module.
+Its backbone is the Xception model, which improves both the robustness and the speed of semantic segmentation, reaching a new state-of-the-art performance of 89.0% mIOU on the PASCAL VOC 2012 dataset.
+
+
+
+
+## Data preparation
+
+This example uses the Cityscapes dataset. Please register at the [Cityscapes website](https://www.cityscapes-dataset.com) and download it.
+After downloading, the data directory is laid out as follows:
+```
+data/cityscape/
+|-- gtFine
+| |-- test
+| |-- train
+| `-- val
+|-- leftImg8bit
+ |-- test
+ |-- train
+ `-- val
+```
+
+## Preparing pretrained models
+
+To save GPU memory, we use Group Norm as the normalization method here.
+To train the model from scratch, download our initialization model:
+```
+wget https://paddle-deeplab.bj.bcebos.com/deeplabv3plus_gn_init.tgz
+tar -xf deeplabv3plus_gn_init.tgz && rm deeplabv3plus_gn_init.tgz
+```
+To fine-tune from the final trained model, or to use it directly for prediction, download our final model:
+```
+wget https://paddle-deeplab.bj.bcebos.com/deeplabv3plus_gn.tgz
+tar -xf deeplabv3plus_gn.tgz && rm deeplabv3plus_gn.tgz
+```
+
+
+## Training and prediction
+
+### Training
+Run the following command to train, specifying the save path for the weights, the path of the initial weights, and the dataset location:
+```
+python ./train.py \
+ --batch_size=1 \
+ --train_crop_size=769 \
+ --total_step=50 \
+ --norm_type=gn \
+ --init_weights_path=$INIT_WEIGHTS_PATH \
+ --save_weights_path=$SAVE_WEIGHTS_PATH \
+ --dataset_path=$DATASET_PATH
+```
+Run the following command for more usage information:
+```
+python train.py --help
+```
+The command above only verifies that training runs correctly: it iterates just 50 steps with a batch size of 1. To reproduce
+the experiments of the original paper, use the following settings:
+```
+CUDA_VISIBLE_DEVICES=0 \
+python ./train.py \
+ --batch_size=4 \
+ --parallel=True \
+ --norm_type=gn \
+ --train_crop_size=769 \
+ --total_step=500000 \
+ --base_lr=0.001 \
+ --init_weights_path=deeplabv3plus_gn_init \
+ --save_weights_path=output \
+ --dataset_path=$DATASET_PATH
+```
+If you run out of GPU memory, try reducing `batch_size` while scaling `total_step` up and `base_lr` down by the same factor, so that the products batch_size * total_step and batch_size * base_lr stay unchanged. Thanks to the properties of Group Norm, changing `batch_size` does not noticeably affect the results while saving considerable memory; for example, you can set `--batch_size=2 --total_step=1000000 --base_lr=0.0005`. The snippet below illustrates the arithmetic (it is an illustration only, not part of the training scripts).
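+
+```python
+# Hypothetical example (not part of the training scripts): rescale the
+# settings while keeping batch_size * total_step and batch_size * base_lr
+# constant.
+batch_size, total_step, base_lr = 4, 500000, 0.001
+factor = batch_size / 2.0                 # move from batch size 4 to 2
+print(2, int(total_step * factor), base_lr / factor)  # 2 1000000 0.0005
+```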
+
+### Evaluation
+Run the following command to evaluate on the `Cityscape` validation set:
+```
+python ./eval.py \
+ --init_weights_path=deeplabv3plus_gn \
+ --norm_type=gn \
+ --dataset_path=$DATASET_PATH
+```
+The model file must be specified with the `--init_weights_path` option. The metric reported by the evaluation script is mean IoU.
+
+
+## Experimental results
+After training completes, run `eval.py` on the validation set; it produces the following results:
+```
+load from: ../models/deeplabv3plus_gn
+total number 500
+step: 500, mIoU: 0.7881
+```
+
+## Additional information
+
+| Dataset | norm type | pretrained model | trained model | mean IoU |
+|---|---|---|---|---|
+|CityScape | group norm | [deeplabv3plus_gn_init.tgz](https://paddle-deeplab.bj.bcebos.com/deeplabv3plus_gn_init.tgz) | [deeplabv3plus_gn.tgz](https://paddle-deeplab.bj.bcebos.com/deeplabv3plus_gn.tgz) | 0.7881 |
+
+## References
+
+- [Encoder-Decoder with Atrous Separable Convolution for Semantic Image Segmentation](https://arxiv.org/abs/1802.02611)
diff --git a/fluid/DeepASR/data_utils/augmentor/__init__.py b/PaddleCV/deeplabv3+/__init__.py
similarity index 100%
rename from fluid/DeepASR/data_utils/augmentor/__init__.py
rename to PaddleCV/deeplabv3+/__init__.py
diff --git a/PaddleCV/deeplabv3+/_ce.py b/PaddleCV/deeplabv3+/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..b0127d6445213b9d3934220fa36e9eb44d3e04b4
--- /dev/null
+++ b/PaddleCV/deeplabv3+/_ce.py
@@ -0,0 +1,60 @@
+# This file is only used for the continuous evaluation test!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi
+from kpi import DurationKpi
+
+each_pass_duration_card1_kpi = DurationKpi('each_pass_duration_card1', 0.1, 0, actived=True)
+train_loss_card1_kpi = CostKpi('train_loss_card1', 0.05, 0)
+each_pass_duration_card4_kpi = DurationKpi('each_pass_duration_card4', 0.1, 0, actived=True)
+train_loss_card4_kpi = CostKpi('train_loss_card4', 0.05, 0)
+
+tracking_kpis = [
+ each_pass_duration_card1_kpi,
+ train_loss_card1_kpi,
+ each_pass_duration_card4_kpi,
+ train_loss_card4_kpi,
+ ]
+
+
+def parse_log(log):
+ '''
+ This method should be implemented by model developers.
+
+ The suggestion:
+
+ each line in the log should be key, value, for example:
+
+ "
+ train_cost\t1.0
+ test_cost\t1.0
+ train_cost\t1.0
+ train_cost\t1.0
+ train_acc\t1.2
+ "
+ '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ log_to_ce(log)
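+
+
+# Usage sketch (illustrative only): parse_log only accepts lines of the exact
+# form "kpis\t<name>\t<value>"; every other line is printed and skipped.
+#
+# sample = "kpis\ttrain_loss_card1\t0.42\nsome other log line"
+# list(parse_log(sample))  # -> [('train_loss_card1', 0.42)]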
diff --git a/PaddleCV/deeplabv3+/eval.py b/PaddleCV/deeplabv3+/eval.py
new file mode 100644
index 0000000000000000000000000000000000000000..a0bd5b74eb785c20172e062f97780e987525ae3a
--- /dev/null
+++ b/PaddleCV/deeplabv3+/eval.py
@@ -0,0 +1,139 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+if 'FLAGS_fraction_of_gpu_memory_to_use' not in os.environ:
+ os.environ['FLAGS_fraction_of_gpu_memory_to_use'] = '0.98'
+os.environ['FLAGS_enable_parallel_graph'] = '1'
+
+import paddle
+import paddle.fluid as fluid
+import numpy as np
+import argparse
+from reader import CityscapeDataset
+import reader
+import models
+import sys
+import utility
+
+parser = argparse.ArgumentParser()
+add_arg = lambda *args: utility.add_arguments(*args, argparser=parser)
+
+# yapf: disable
+add_arg('total_step', int, -1, "Number of steps to evaluate; -1 for full evaluation.")
+add_arg('init_weights_path', str, None, "Path of the weights to evaluate.")
+add_arg('dataset_path', str, None, "Cityscape dataset path.")
+add_arg('use_gpu', bool, True, "Whether to use GPU.")
+add_arg('num_classes', int, 19, "Number of classes.")
+add_arg('use_py_reader', bool, True, "Whether to use py_reader.")
+add_arg('norm_type', str, 'bn', "Normalization type, should be 'bn' or 'gn'.")
+# yapf: enable
+
+
+def mean_iou(pred, label):
+ label = fluid.layers.elementwise_min(
+ label, fluid.layers.assign(np.array(
+ [num_classes], dtype=np.int32)))
+ label_ignore = (label == num_classes).astype('int32')
+ label_nignore = (label != num_classes).astype('int32')
+
+ pred = pred * label_nignore + label_ignore * num_classes
+
+ miou, wrong, correct = fluid.layers.mean_iou(pred, label, num_classes + 1)
+ return miou, wrong, correct
+
+
+def load_model():
+ if os.path.isdir(args.init_weights_path):
+ fluid.io.load_params(
+ exe, dirname=args.init_weights_path, main_program=tp)
+ else:
+ fluid.io.load_params(
+ exe, dirname="", filename=args.init_weights_path, main_program=tp)
+
+
+CityscapeDataset = reader.CityscapeDataset
+
+args = parser.parse_args()
+
+models.clean()
+models.is_train = False
+models.default_norm_type = args.norm_type
+deeplabv3p = models.deeplabv3p
+
+image_shape = [1025, 2049]
+eval_shape = [1024, 2048]
+
+sp = fluid.Program()
+tp = fluid.Program()
+batch_size = 1
+reader.default_config['crop_size'] = -1
+reader.default_config['shuffle'] = False
+num_classes = args.num_classes
+
+with fluid.program_guard(tp, sp):
+ if args.use_py_reader:
+ py_reader = fluid.layers.py_reader(capacity=64,
+ shapes=[[1, 3, 0, 0], [1] + eval_shape],
+ dtypes=['float32', 'int32'])
+ img, label = fluid.layers.read_file(py_reader)
+ else:
+ img = fluid.layers.data(name='img', shape=[3, 0, 0], dtype='float32')
+ label = fluid.layers.data(name='label', shape=eval_shape, dtype='int32')
+
+ img = fluid.layers.resize_bilinear(img, image_shape)
+ logit = deeplabv3p(img)
+ logit = fluid.layers.resize_bilinear(logit, eval_shape)
+ pred = fluid.layers.argmax(logit, axis=1).astype('int32')
+ miou, out_wrong, out_correct = mean_iou(pred, label)
+
+tp = tp.clone(True)
+fluid.memory_optimize(
+ tp,
+ print_log=False,
+ skip_opt_set=set([pred.name, miou.name, out_wrong.name, out_correct.name]),
+ level=1)
+
+place = fluid.CPUPlace()
+if args.use_gpu:
+ place = fluid.CUDAPlace(0)
+exe = fluid.Executor(place)
+exe.run(sp)
+
+if args.init_weights_path:
+ print("load from:", args.init_weights_path)
+ load_model()
+
+dataset = CityscapeDataset(args.dataset_path, 'val')
+if args.total_step == -1:
+ total_step = len(dataset.label_files)
+else:
+ total_step = args.total_step
+
+batches = dataset.get_batch_generator(batch_size, total_step)
+if args.use_py_reader:
+ py_reader.decorate_tensor_provider(lambda: ((b[1], b[2]) for b in batches))
+ py_reader.start()
+
+sum_iou = 0
+all_correct = np.array([0], dtype=np.int64)
+all_wrong = np.array([0], dtype=np.int64)
+
+for i in range(total_step):
+ if not args.use_py_reader:
+ _, imgs, labels, names = next(batches)
+ result = exe.run(tp,
+ feed={'img': imgs,
+ 'label': labels},
+ fetch_list=[pred, miou, out_wrong, out_correct])
+ else:
+ result = exe.run(tp,
+ fetch_list=[pred, miou, out_wrong, out_correct])
+
+ wrong = result[2][:-1] + all_wrong
+ right = result[3][:-1] + all_correct
+ all_wrong = wrong.copy()
+ all_correct = right.copy()
+ mp = (wrong + right) != 0
+ miou2 = np.mean((right[mp] * 1.0 / (right[mp] + wrong[mp])))
+ print('step: %s, mIoU: %s' % (i + 1, miou2))
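+
+# Note on the accumulation above (illustrative): mean_iou returns per-class
+# `wrong` and `correct` counts with the ignore class in the last slot, which
+# the [:-1] slice drops. Accumulating the counts over images and averaging
+# right / (right + wrong) over the classes actually seen yields the
+# dataset-level mIoU, e.g. for two classes:
+#
+# right = np.array([80, 30]); wrong = np.array([20, 10])
+# np.mean(right / (right + wrong + 0.0))  # -> (0.8 + 0.75) / 2 = 0.775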
diff --git a/fluid/PaddleCV/deeplabv3+/imgs/model.png b/PaddleCV/deeplabv3+/imgs/model.png
similarity index 100%
rename from fluid/PaddleCV/deeplabv3+/imgs/model.png
rename to PaddleCV/deeplabv3+/imgs/model.png
diff --git a/PaddleCV/deeplabv3+/models.py b/PaddleCV/deeplabv3+/models.py
new file mode 100644
index 0000000000000000000000000000000000000000..117ab5da539da1a403fb99d8642b3f7b4f864355
--- /dev/null
+++ b/PaddleCV/deeplabv3+/models.py
@@ -0,0 +1,332 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+
+import contextlib
+import os
+name_scope = ""
+
+decode_channel = 48
+encode_channel = 256
+label_number = 19
+
+bn_momentum = 0.99
+dropout_keep_prop = 0.9
+is_train = True
+
+op_results = {}
+
+default_epsilon = 1e-3
+default_norm_type = 'bn'
+default_group_number = 32
+depthwise_use_cudnn = False
+
+bn_regularizer = fluid.regularizer.L2DecayRegularizer(regularization_coeff=0.0)
+depthwise_regularizer = fluid.regularizer.L2DecayRegularizer(
+ regularization_coeff=0.0)
+
+
+@contextlib.contextmanager
+def scope(name):
+ global name_scope
+ bk = name_scope
+ name_scope = name_scope + name + '/'
+ yield
+ name_scope = bk
+
+
+def check(data, number):
+ if type(data) == int:
+ return [data] * number
+ assert len(data) == number
+ return data
+
+
+def clean():
+ global op_results
+ op_results = {}
+
+
+def append_op_result(result, name):
+ global op_results
+ op_index = len(op_results)
+ name = name_scope + name + str(op_index)
+ op_results[name] = result
+ return result
+
+
+def conv(*args, **kargs):
+ if "xception" in name_scope:
+ init_std = 0.09
+ elif "logit" in name_scope:
+ init_std = 0.01
+ elif name_scope.endswith('depthwise/'):
+ init_std = 0.33
+ else:
+ init_std = 0.06
+ if name_scope.endswith('depthwise/'):
+ regularizer = depthwise_regularizer
+ else:
+ regularizer = None
+
+ kargs['param_attr'] = fluid.ParamAttr(
+ name=name_scope + 'weights',
+ regularizer=regularizer,
+ initializer=fluid.initializer.TruncatedNormal(
+ loc=0.0, scale=init_std))
+ if 'bias_attr' in kargs and kargs['bias_attr']:
+ kargs['bias_attr'] = fluid.ParamAttr(
+ name=name_scope + 'biases',
+ regularizer=regularizer,
+ initializer=fluid.initializer.ConstantInitializer(value=0.0))
+ else:
+ kargs['bias_attr'] = False
+ kargs['name'] = name_scope + 'conv'
+ return append_op_result(fluid.layers.conv2d(*args, **kargs), 'conv')
+
+
+def group_norm(input, G, eps=1e-5, param_attr=None, bias_attr=None):
+ N, C, H, W = input.shape
+ if C % G != 0:
+ # G does not divide the channel count; search outward from G for the
+ # nearest group count that divides C evenly.
+ for d in range(10):
+ for t in [d, -d]:
+ if G + t <= 0: continue
+ if C % (G + t) == 0:
+ G = G + t
+ break
+ if C % G == 0:
+ break
+ assert C % G == 0
+ x = fluid.layers.group_norm(
+ input,
+ groups=G,
+ param_attr=param_attr,
+ bias_attr=bias_attr,
+ name=name_scope + 'group_norm')
+ return x
+
+
+def bn(*args, **kargs):
+ if default_norm_type == 'bn':
+ with scope('BatchNorm'):
+ return append_op_result(
+ fluid.layers.batch_norm(
+ *args,
+ epsilon=default_epsilon,
+ momentum=bn_momentum,
+ param_attr=fluid.ParamAttr(
+ name=name_scope + 'gamma', regularizer=bn_regularizer),
+ bias_attr=fluid.ParamAttr(
+ name=name_scope + 'beta', regularizer=bn_regularizer),
+ moving_mean_name=name_scope + 'moving_mean',
+ moving_variance_name=name_scope + 'moving_variance',
+ **kargs),
+ 'bn')
+ elif default_norm_type == 'gn':
+ with scope('GroupNorm'):
+ return append_op_result(
+ group_norm(
+ args[0],
+ default_group_number,
+ eps=default_epsilon,
+ param_attr=fluid.ParamAttr(
+ name=name_scope + 'gamma', regularizer=bn_regularizer),
+ bias_attr=fluid.ParamAttr(
+ name=name_scope + 'beta', regularizer=bn_regularizer)),
+ 'gn')
+ else:
+ raise "Unsupport norm type:" + default_norm_type
+
+
+def bn_relu(data):
+ return append_op_result(fluid.layers.relu(bn(data)), 'relu')
+
+
+def relu(data):
+ return append_op_result(
+ fluid.layers.relu(
+ data, name=name_scope + 'relu'), 'relu')
+
+
+def seperate_conv(input, channel, stride, filter, dilation=1, act=None):
+ with scope('depthwise'):
+ input = conv(
+ input,
+ input.shape[1],
+ filter,
+ stride,
+ groups=input.shape[1],
+ padding=(filter // 2) * dilation,
+ dilation=dilation,
+ use_cudnn=depthwise_use_cudnn)
+ input = bn(input)
+ if act: input = act(input)
+ with scope('pointwise'):
+ input = conv(input, channel, 1, 1, groups=1, padding=0)
+ input = bn(input)
+ if act: input = act(input)
+ return input
+
+
+def xception_block(input,
+ channels,
+ strides=1,
+ filters=3,
+ dilation=1,
+ skip_conv=True,
+ has_skip=True,
+ activation_fn_in_separable_conv=False):
+ repeat_number = 3
+ channels = check(channels, repeat_number)
+ filters = check(filters, repeat_number)
+ strides = check(strides, repeat_number)
+ data = input
+ results = []
+ for i in range(repeat_number):
+ with scope('separable_conv' + str(i + 1)):
+ if not activation_fn_in_separable_conv:
+ data = relu(data)
+ data = seperate_conv(
+ data,
+ channels[i],
+ strides[i],
+ filters[i],
+ dilation=dilation)
+ else:
+ data = seperate_conv(
+ data,
+ channels[i],
+ strides[i],
+ filters[i],
+ dilation=dilation,
+ act=relu)
+ results.append(data)
+ if not has_skip:
+ return append_op_result(data, 'xception_block'), results
+ if skip_conv:
+ with scope('shortcut'):
+ skip = bn(
+ conv(
+ input, channels[-1], 1, strides[-1], groups=1, padding=0))
+ else:
+ skip = input
+ return append_op_result(data + skip, 'xception_block'), results
+
+
+def entry_flow(data):
+ with scope("entry_flow"):
+ with scope("conv1"):
+ data = conv(data, 32, 3, stride=2, padding=1)
+ data = bn_relu(data)
+ with scope("conv2"):
+ data = conv(data, 64, 3, stride=1, padding=1)
+ data = bn_relu(data)
+ with scope("block1"):
+ data, _ = xception_block(data, 128, [1, 1, 2])
+ with scope("block2"):
+ data, results = xception_block(data, 256, [1, 1, 2])
+ with scope("block3"):
+ data, _ = xception_block(data, 728, [1, 1, 2])
+ return data, results[1]
+
+
+def middle_flow(data):
+ with scope("middle_flow"):
+ for i in range(16):
+ with scope("block" + str(i + 1)):
+ data, _ = xception_block(data, 728, [1, 1, 1], skip_conv=False)
+ return data
+
+
+def exit_flow(data):
+ with scope("exit_flow"):
+ with scope('block1'):
+ data, _ = xception_block(data, [728, 1024, 1024], [1, 1, 1])
+ with scope('block2'):
+ data, _ = xception_block(
+ data, [1536, 1536, 2048], [1, 1, 1],
+ dilation=2,
+ has_skip=False,
+ activation_fn_in_separable_conv=True)
+ return data
+
+
+def dropout(x, keep_rate):
+ if is_train:
+ return fluid.layers.dropout(x, 1 - keep_rate) / keep_rate
+ else:
+ return x
+
+
+def encoder(input):
+ with scope('encoder'):
+ channel = 256
+ with scope("image_pool"):
+ image_avg = fluid.layers.reduce_mean(input, [2, 3], keep_dim=True)
+ append_op_result(image_avg, 'reduce_mean')
+ image_avg = bn_relu(
+ conv(
+ image_avg, channel, 1, 1, groups=1, padding=0))
+ image_avg = fluid.layers.resize_bilinear(image_avg, input.shape[2:])
+
+ with scope("aspp0"):
+ aspp0 = bn_relu(conv(input, channel, 1, 1, groups=1, padding=0))
+ with scope("aspp1"):
+ aspp1 = seperate_conv(input, channel, 1, 3, dilation=6, act=relu)
+ with scope("aspp2"):
+ aspp2 = seperate_conv(input, channel, 1, 3, dilation=12, act=relu)
+ with scope("aspp3"):
+ aspp3 = seperate_conv(input, channel, 1, 3, dilation=18, act=relu)
+ with scope("concat"):
+ data = append_op_result(
+ fluid.layers.concat(
+ [image_avg, aspp0, aspp1, aspp2, aspp3], axis=1),
+ 'concat')
+ data = bn_relu(conv(data, channel, 1, 1, groups=1, padding=0))
+ data = dropout(data, dropout_keep_prop)
+ return data
+
+
+def decoder(encode_data, decode_shortcut):
+ with scope('decoder'):
+ with scope('concat'):
+ decode_shortcut = bn_relu(
+ conv(
+ decode_shortcut, decode_channel, 1, 1, groups=1, padding=0))
+ encode_data = fluid.layers.resize_bilinear(
+ encode_data, decode_shortcut.shape[2:])
+ encode_data = fluid.layers.concat(
+ [encode_data, decode_shortcut], axis=1)
+ append_op_result(encode_data, 'concat')
+ with scope("separable_conv1"):
+ encode_data = seperate_conv(
+ encode_data, encode_channel, 1, 3, dilation=1, act=relu)
+ with scope("separable_conv2"):
+ encode_data = seperate_conv(
+ encode_data, encode_channel, 1, 3, dilation=1, act=relu)
+ return encode_data
+
+
+def deeplabv3p(img):
+ global default_epsilon
+ append_op_result(img, 'img')
+ with scope('xception_65'):
+ default_epsilon = 1e-3
+ # Entry flow
+ data, decode_shortcut = entry_flow(img)
+ # Middle flow
+ data = middle_flow(data)
+ # Exit flow
+ data = exit_flow(data)
+ default_epsilon = 1e-5
+ encode_data = encoder(data)
+ encode_data = decoder(encode_data, decode_shortcut)
+ with scope('logit'):
+ logit = conv(
+ encode_data, label_number, 1, stride=1, padding=0, bias_attr=True)
+ logit = fluid.layers.resize_bilinear(logit, img.shape[2:])
+ return logit
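+
+
+# Usage sketch (illustrative only): the scope() context manager builds
+# hierarchical parameter names by appending to the global name_scope, which
+# conv() then uses for its ParamAttr names.
+#
+# with scope('xception_65'):
+#     with scope('entry_flow'):
+#         with scope('conv1'):
+#             print(name_scope)  # -> 'xception_65/entry_flow/conv1/'
+#             # conv(...) here would create a parameter named
+#             # 'xception_65/entry_flow/conv1/weights'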
diff --git a/PaddleCV/deeplabv3+/reader.py b/PaddleCV/deeplabv3+/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..a660f924c3e4ae31688ed39a67a29216c4abce6b
--- /dev/null
+++ b/PaddleCV/deeplabv3+/reader.py
@@ -0,0 +1,158 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import cv2
+import numpy as np
+import os
+import six
+
+default_config = {
+ "shuffle": True,
+ "min_resize": 0.5,
+ "max_resize": 4,
+ "crop_size": 769,
+}
+
+
+def slice_with_pad(a, s, value=0):
+ pads = []
+ slices = []
+ for i in range(len(a.shape)):
+ if i >= len(s):
+ pads.append([0, 0])
+ slices.append([0, a.shape[i]])
+ else:
+ l, r = s[i]
+ if l < 0:
+ pl = -l
+ l = 0
+ else:
+ pl = 0
+ if r > a.shape[i]:
+ pr = r - a.shape[i]
+ r = a.shape[i]
+ else:
+ pr = 0
+ pads.append([pl, pr])
+ slices.append([l, r])
+ slices = list(map(lambda x: slice(x[0], x[1], 1), slices))
+ a = a[slices]
+ a = np.pad(a, pad_width=pads, mode='constant', constant_values=value)
+ return a
+
+
+class CityscapeDataset:
+ def __init__(self, dataset_dir, subset='train', config=default_config):
+ label_dirname = os.path.join(dataset_dir, 'gtFine/' + subset)
+ if six.PY2:
+ import commands
+ label_files = commands.getoutput(
+ "find %s -type f | grep labelTrainIds | sort" %
+ label_dirname).splitlines()
+ else:
+ import subprocess
+ label_files = subprocess.getstatusoutput(
+ "find %s -type f | grep labelTrainIds | sort" %
+ label_dirname)[-1].splitlines()
+ self.label_files = label_files
+ self.label_dirname = label_dirname
+ self.index = 0
+ self.subset = subset
+ self.dataset_dir = dataset_dir
+ self.config = config
+ self.reset()
+ print("total number", len(label_files))
+
+ def reset(self, shuffle=False):
+ self.index = 0
+ # honor an explicit shuffle request as well as the config default
+ if shuffle or self.config["shuffle"]:
+ np.random.shuffle(self.label_files)
+
+ def next_img(self):
+ self.index += 1
+ if self.index >= len(self.label_files):
+ self.reset()
+
+ def get_img(self):
+ shape = self.config["crop_size"]
+ while True:
+ ln = self.label_files[self.index]
+ img_name = os.path.join(
+ self.dataset_dir,
+ 'leftImg8bit/' + self.subset + ln[len(self.label_dirname):])
+ img_name = img_name.replace('gtFine_labelTrainIds', 'leftImg8bit')
+ label = cv2.imread(ln)
+ img = cv2.imread(img_name)
+ if img is None:
+ print("load img failed:", img_name)
+ self.next_img()
+ else:
+ break
+ if shape == -1:
+ return img, label, ln
+
+ if np.random.rand() > 0.5:
+ range_l = 1
+ range_r = self.config['max_resize']
+ else:
+ range_l = self.config['min_resize']
+ range_r = 1
+
+ if np.random.rand() > 0.5:
+ assert len(img.shape) == 3 and len(
+ label.shape) == 3, "{} {}".format(img.shape, label.shape)
+ img = img[:, :, ::-1]
+ label = label[:, :, ::-1]
+
+ random_scale = np.random.rand(1) * (range_r - range_l) + range_l
+ crop_size = int(shape / random_scale)
+ bb = crop_size // 2
+
+ def _randint(low, high):
+ return int(np.random.rand(1) * (high - low) + low)
+
+ offset_x = np.random.randint(bb, max(bb + 1, img.shape[0] -
+ bb)) - crop_size // 2
+ offset_y = np.random.randint(bb, max(bb + 1, img.shape[1] -
+ bb)) - crop_size // 2
+ img_crop = slice_with_pad(img, [[offset_x, offset_x + crop_size],
+ [offset_y, offset_y + crop_size]], 128)
+ img = cv2.resize(img_crop, (shape, shape))
+ label_crop = slice_with_pad(label, [[offset_x, offset_x + crop_size],
+ [offset_y, offset_y + crop_size]],
+ 255)
+ label = cv2.resize(
+ label_crop, (shape, shape), interpolation=cv2.INTER_NEAREST)
+ return img, label, ln + str(
+ (offset_x, offset_y, crop_size, random_scale))
+
+ def get_batch(self, batch_size=1):
+ imgs = []
+ labels = []
+ names = []
+ while len(imgs) < batch_size:
+ img, label, ln = self.get_img()
+ imgs.append(img)
+ labels.append(label)
+ names.append(ln)
+ self.next_img()
+ return np.array(imgs), np.array(labels), names
+
+ def get_batch_generator(self, batch_size, total_step):
+ def do_get_batch():
+ for i in range(total_step):
+ imgs, labels, names = self.get_batch(batch_size)
+ labels = labels.astype(np.int32)[:, :, :, 0]
+ imgs = imgs[:, :, :, ::-1].transpose(
+ 0, 3, 1, 2).astype(np.float32) / (255.0 / 2) - 1
+ yield i, imgs, labels, names
+
+ batches = do_get_batch()
+ try:
+ from prefetch_generator import BackgroundGenerator
+ batches = BackgroundGenerator(batches, 100)
+ except ImportError:
+ print(
+ "You can install 'prefetch_generator' to accelerate data reading."
+ )
+ return batches
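+
+
+# Usage sketch (illustrative only): slice_with_pad behaves like ordinary
+# slicing, but pads with `value` wherever the requested window falls outside
+# the array, so random crops near image borders keep a fixed size.
+#
+# a = np.arange(16).reshape(4, 4)
+# slice_with_pad(a, [[-1, 3], [0, 4]], value=0).shape  # -> (4, 4)
+# # row -1..0 is zero padding; rows 0..3 come from `a`.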
diff --git a/PaddleCV/deeplabv3+/train.py b/PaddleCV/deeplabv3+/train.py
new file mode 100755
index 0000000000000000000000000000000000000000..5e983ed291f2b434a79ecefa7a3583f2741bd0ab
--- /dev/null
+++ b/PaddleCV/deeplabv3+/train.py
@@ -0,0 +1,233 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+if 'FLAGS_fraction_of_gpu_memory_to_use' not in os.environ:
+ os.environ['FLAGS_fraction_of_gpu_memory_to_use'] = '0.98'
+
+import paddle
+import paddle.fluid as fluid
+import numpy as np
+import argparse
+from reader import CityscapeDataset
+import reader
+import models
+import time
+import contextlib
+import paddle.fluid.profiler as profiler
+import utility
+
+parser = argparse.ArgumentParser()
+add_arg = lambda *args: utility.add_arguments(*args, argparser=parser)
+
+# yapf: disable
+add_arg('batch_size', int, 4, "The number of images in each batch during training.")
+add_arg('train_crop_size', int, 769, "Image crop size during training.")
+add_arg('base_lr', float, 0.001, "The base learning rate for model training.")
+add_arg('total_step', int, 500000, "Total number of training steps.")
+add_arg('init_weights_path', str, None, "Path of the initial weights in paddlepaddle format.")
+add_arg('save_weights_path', str, None, "Path of the saved weights during training.")
+add_arg('dataset_path', str, None, "Cityscape dataset path.")
+add_arg('parallel', bool, True, "Whether to use ParallelExecutor.")
+add_arg('use_gpu', bool, True, "Whether to use GPU.")
+add_arg('num_classes', int, 19, "Number of classes.")
+add_arg('load_logit_layer', bool, True, "Whether to load the last logit fc layer. Set this to False when training with a different number of classes.")
+add_arg('memory_optimize', bool, True, "Whether to use the memory optimizer.")
+add_arg('norm_type', str, 'bn', "Normalization type, should be 'bn' or 'gn'.")
+add_arg('profile', bool, False, "Enable profiler.")
+add_arg('use_py_reader', bool, True, "Whether to use py_reader.")
+parser.add_argument(
+ '--enable_ce',
+ action='store_true',
+ help='If set, run the task with continuous evaluation logs. Users can ignore this argument.')
+# yapf: enable
+
+@contextlib.contextmanager
+def profile_context(profile=True):
+ if profile:
+ with profiler.profiler('All', 'total', '/tmp/profile_file2'):
+ yield
+ else:
+ yield
+
+def load_model():
+ if os.path.isdir(args.init_weights_path):
+ load_vars = [
+ x for x in tp.list_vars()
+ if isinstance(x, fluid.framework.Parameter) and x.name.find('logit') ==
+ -1
+ ]
+ if args.load_logit_layer:
+ fluid.io.load_params(
+ exe, dirname=args.init_weights_path, main_program=tp)
+ else:
+ fluid.io.load_vars(exe, dirname=args.init_weights_path, vars=load_vars)
+ else:
+ fluid.io.load_params(
+ exe,
+ dirname="",
+ filename=args.init_weights_path,
+ main_program=tp)
+
+
+
+def save_model():
+ assert not os.path.isfile(args.save_weights_path)
+ fluid.io.save_params(
+ exe, dirname=args.save_weights_path, main_program=tp)
+
+
+def loss(logit, label):
+ label_nignore = fluid.layers.less_than(
+ label.astype('float32'),
+ fluid.layers.assign(np.array([num_classes], 'float32')),
+ force_cpu=False).astype('float32')
+ logit = fluid.layers.transpose(logit, [0, 2, 3, 1])
+ logit = fluid.layers.reshape(logit, [-1, num_classes])
+ label = fluid.layers.reshape(label, [-1, 1])
+ label = fluid.layers.cast(label, 'int64')
+ label_nignore = fluid.layers.reshape(label_nignore, [-1, 1])
+ logit = fluid.layers.softmax(logit, use_cudnn=False)
+ loss = fluid.layers.cross_entropy(logit, label, ignore_index=255)
+ label_nignore.stop_gradient = True
+ label.stop_gradient = True
+ return loss, label_nignore
+
+
+args = parser.parse_args()
+utility.print_arguments(args)
+
+models.clean()
+models.bn_momentum = 0.9997
+models.dropout_keep_prop = 0.9
+models.label_number = args.num_classes
+models.default_norm_type = args.norm_type
+deeplabv3p = models.deeplabv3p
+
+sp = fluid.Program()
+tp = fluid.Program()
+
+# only for ce
+if args.enable_ce:
+ SEED = 102
+ sp.random_seed = SEED
+ tp.random_seed = SEED
+
+crop_size = args.train_crop_size
+batch_size = args.batch_size
+image_shape = [crop_size, crop_size]
+reader.default_config['crop_size'] = crop_size
+reader.default_config['shuffle'] = True
+num_classes = args.num_classes
+weight_decay = 0.00004
+
+base_lr = args.base_lr
+total_step = args.total_step
+
+with fluid.program_guard(tp, sp):
+ if args.use_py_reader:
+ batch_size_each = batch_size // fluid.core.get_cuda_device_count()
+ py_reader = fluid.layers.py_reader(capacity=64,
+ shapes=[[batch_size_each, 3] + image_shape, [batch_size_each] + image_shape],
+ dtypes=['float32', 'int32'])
+ img, label = fluid.layers.read_file(py_reader)
+ else:
+ img = fluid.layers.data(
+ name='img', shape=[3] + image_shape, dtype='float32')
+ label = fluid.layers.data(name='label', shape=image_shape, dtype='int32')
+ logit = deeplabv3p(img)
+ pred = fluid.layers.argmax(logit, axis=1).astype('int32')
+ loss, mask = loss(logit, label)
+ lr = fluid.layers.polynomial_decay(
+ base_lr, total_step, end_learning_rate=0, power=0.9)
+ area = fluid.layers.elementwise_max(
+ fluid.layers.reduce_mean(mask),
+ fluid.layers.assign(np.array(
+ [0.1], dtype=np.float32)))
+ loss_mean = fluid.layers.reduce_mean(loss) / area
+
+ opt = fluid.optimizer.Momentum(
+ lr,
+ momentum=0.9,
+ regularization=fluid.regularizer.L2DecayRegularizer(
+ regularization_coeff=weight_decay))
+ optimize_ops, params_grads = opt.minimize(loss_mean, startup_program=sp)
+ # The IR memory optimizer has some issues; we need to set the gradients
+ # persistable to work around them.
+ for p, g in params_grads: g.persistable = True
+
+
+exec_strategy = fluid.ExecutionStrategy()
+exec_strategy.num_threads = fluid.core.get_cuda_device_count()
+exec_strategy.num_iteration_per_drop_scope = 100
+build_strategy = fluid.BuildStrategy()
+if args.memory_optimize:
+ build_strategy.fuse_relu_depthwise_conv = True
+ build_strategy.enable_inplace = True
+ build_strategy.memory_optimize = True
+
+place = fluid.CPUPlace()
+if args.use_gpu:
+ place = fluid.CUDAPlace(0)
+exe = fluid.Executor(place)
+exe.run(sp)
+
+if args.init_weights_path:
+ print("load from:", args.init_weights_path)
+ load_model()
+
+dataset = reader.CityscapeDataset(args.dataset_path, 'train')
+
+if args.parallel:
+ binary = fluid.compiler.CompiledProgram(tp).with_data_parallel(
+ loss_name=loss_mean.name,
+ build_strategy=build_strategy,
+ exec_strategy=exec_strategy)
+else:
+ binary = fluid.compiler.CompiledProgram(tp)
+
+if args.use_py_reader:
+ assert(batch_size % fluid.core.get_cuda_device_count() == 0)
+ def data_gen():
+ batches = dataset.get_batch_generator(
+ batch_size // fluid.core.get_cuda_device_count(),
+ total_step * fluid.core.get_cuda_device_count())
+ for b in batches:
+ yield b[1], b[2]
+ py_reader.decorate_tensor_provider(data_gen)
+ py_reader.start()
+else:
+ batches = dataset.get_batch_generator(batch_size, total_step)
+total_time = 0.0
+epoch_idx = 0
+train_loss = 0
+
+with profile_context(args.profile):
+ for i in range(total_step):
+ epoch_idx += 1
+ begin_time = time.time()
+ prev_start_time = time.time()
+ if not args.use_py_reader:
+ _, imgs, labels, names = next(batches)
+ train_loss, = exe.run(binary,
+ feed={'img': imgs,
+ 'label': labels}, fetch_list=[loss_mean])
+ else:
+ train_loss, = exe.run(binary, fetch_list=[loss_mean])
+ train_loss = np.mean(train_loss)
+ end_time = time.time()
+ total_time += end_time - begin_time
+ if i % 100 == 0:
+ print("Model is saved to", args.save_weights_path)
+ save_model()
+ print("step {:d}, loss: {:.6f}, step_time_cost: {:.3f}".format(
+ i, train_loss, end_time - prev_start_time))
+
+print("Training done. Model is saved to", args.save_weights_path)
+save_model()
+
+if args.enable_ce:
+ gpu_num = fluid.core.get_cuda_device_count()
+ print("kpis\teach_pass_duration_card%s\t%s" %
+ (gpu_num, total_time / epoch_idx))
+ print("kpis\ttrain_loss_card%s\t%s" % (gpu_num, train_loss))
diff --git a/fluid/PaddleCV/face_detection/utility.py b/PaddleCV/deeplabv3+/utility.py
similarity index 100%
rename from fluid/PaddleCV/face_detection/utility.py
rename to PaddleCV/deeplabv3+/utility.py
diff --git a/fluid/PaddleCV/face_detection/.gitignore b/PaddleCV/face_detection/.gitignore
similarity index 100%
rename from fluid/PaddleCV/face_detection/.gitignore
rename to PaddleCV/face_detection/.gitignore
diff --git a/PaddleCV/face_detection/.run_ce.sh b/PaddleCV/face_detection/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..0b8632516b06b9ea48691a098b7ac25b171decd5
--- /dev/null
+++ b/PaddleCV/face_detection/.run_ce.sh
@@ -0,0 +1,17 @@
+#!/bin/bash
+
+export MKL_NUM_THREADS=1
+export OMP_NUM_THREADS=1
+
+
+cudaid=${face_detection:=0} # use 0-th card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py --batch_size=2 --epoc_num=1 --batch_num=200 --parallel=False --enable_ce | python _ce.py
+
+
+cudaid=${face_detection_m:=0,1,2,3} # use 0,1,2,3 card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py --batch_size=8 --epoc_num=1 --batch_num=200 --parallel=False --enable_ce | python _ce.py
+
diff --git a/PaddleCV/face_detection/README.md b/PaddleCV/face_detection/README.md
new file mode 120000
index 0000000000000000000000000000000000000000..4015683cfa5969297febc12e7ca1264afabbc0b5
--- /dev/null
+++ b/PaddleCV/face_detection/README.md
@@ -0,0 +1 @@
+README_cn.md
\ No newline at end of file
diff --git a/PaddleCV/face_detection/README_cn.md b/PaddleCV/face_detection/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..f63fbed02ab34520d79b2d2b000e31f5eb22e7f8
--- /dev/null
+++ b/PaddleCV/face_detection/README_cn.md
@@ -0,0 +1,185 @@
+## PyramidBox Face Detection
+
+## Table of Contents
+- [Introduction](#introduction)
+- [Data preparation](#data-preparation)
+- [Model training](#model-training)
+- [Model evaluation](#model-evaluation)
+- [Model release](#model-release)
+
+### Introduction
+
+Face detection is a classic computer vision task, and detecting small, blurred, and occluded faces in unconstrained scenes is the most challenging problem in this area. [PyramidBox](https://arxiv.org/pdf/1803.07737.pdf) is a single-stage, SSD-based face detector that exploits contextual information to tackle hard faces. As shown in the figure below, PyramidBox makes predictions at different levels on feature maps of six scales. The work mainly comprises the following modules: LFPN, Pyramid Anchors, CPM, and data-anchor-sampling. See the corresponding paper at https://arxiv.org/pdf/1803.07737.pdf for details; a brief introduction follows.
+
+
+
+The PyramidBox face detection model
+
+
+**LFPN**: LFPN stands for Low-level Feature Pyramid Networks. In detection tasks, LFPN combines high-level features, which carry more context, with low-level features, which carry more texture. High-level features are used to detect large faces, while low-level features are used to detect small faces. To integrate the high-level features into the high-resolution low-level features, we build the low-level FPN with a top-down fusion that starts from an intermediate layer.
+
+**Pyramid Anchors**: The algorithm uses a semi-supervised solution to generate approximate labels with face-related semantics and proposes an anchor-based context-assisted method, which introduces supervised information for learning the contextual features of small, blurred, and partially occluded faces. Starting from the annotated face labels, users can expand them by fixed ratios to obtain head labels (expanded by 1/2 on each side) and body labels (with a customizable expansion ratio).
+
+**CPM**: CPM stands for Context-sensitive Predict Module. This method designs a context-sensitive structure (CPM) to improve the expressive power of the prediction network.
+
+**Data-anchor-sampling**: A new sampling method, called data-anchor-sampling, is designed to increase the diversity of training samples across scales. It changes the distribution of the training samples and puts more emphasis on smaller faces.
+
+The PyramidBox model demonstrates robust detection on the example image below, which contains one thousand faces; the model detects 880 of them.
+
+
+PyramidBox face detection performance
+
+
+
+
+### Data preparation
+
+This tutorial trains and evaluates the model on the [WIDER FACE dataset](http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace/); the official site gives a detailed description of the data.
+
+The WIDER FACE dataset contains 32,203 images with 393,703 annotated faces, which vary considerably in scale, pose, and occlusion. The dataset is organized into 61 scene categories; within each category, 40% of the images are randomly selected for training, 10% for validation, and 50% for testing.
+
+First, download the training and validation sets from the official site and place them under the `data` directory; the site provides Google Drive and Baidu Cloud links, so use whichever is convenient. Then download the annotations for the training and validation sets:
+
+```bash
+./data/download.sh
+```
+
+准备好数据之后,`data`目录如下:
+
+```
+data
+|-- download.sh
+|-- wider_face_split
+| |-- readme.txt
+| |-- wider_face_train_bbx_gt.txt
+| |-- wider_face_val_bbx_gt.txt
+| `-- ...
+|-- WIDER_train
+| `-- images
+| |-- 0--Parade
+| ...
+| `-- 9--Press_Conference
+`-- WIDER_val
+ `-- images
+ |-- 0--Parade
+ ...
+ `-- 9--Press_Conference
+```
+
+
+### Model training
+
+#### Downloading the pretrained model
+
+We provide a pretrained model with a VGGNet backbone; download it with the following commands:
+
+
+```bash
+wget http://paddlemodels.bj.bcebos.com/vgg_ilsvrc_16_fc_reduced.tar.gz
+tar -xf vgg_ilsvrc_16_fc_reduced.tar.gz && rm -f vgg_ilsvrc_16_fc_reduced.tar.gz
+```
+
+Note: this pretrained model was converted from [Caffe](http://cs.unc.edu/~wliu/projects/ParseNet/VGG_ILSVRC_16_layers_fc_reduced.caffemodel). We will release our own pretrained model soon.
+
+
+#### Starting training
+
+
+`train.py` is the main entry point of the training module; an example invocation:
+
+```bash
+python -u train.py --batch_size=16 --pretrained_model=vgg_ilsvrc_16_fc_reduced
+```
+ - You can choose which GPUs to use by setting `export CUDA_VISIBLE_DEVICES=0,1,2,3`; `batch_size` defaults to 12 or 16.
+ - For more optional arguments, see:
+ ```bash
+ python train.py --help
+ ```
+ - The model converges after more than 150 training epochs. With 4 Nvidia Tesla P40 GPUs in parallel and `batch_size=16`, each epoch takes about 40 minutes, so the total training time is roughly 100 hours.
+
+Data augmentation used during training:
+
+**Data augmentation**: data loading is defined in `reader.py`, and all images are resized to 640x640. During training, images are further augmented with random distortion, flipping, cropping, and so on, similar to the augmentation in the [SSD object detection algorithm](https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleCV/object_detection/README.md). In addition, the data-anchor-sampling described above is applied:
+
+ **Scale transform (data-anchor-sampling)**: the image is randomly rescaled within a certain range of scales, which greatly increases the scale variation of faces. Concretely, for a randomly selected face with a given height and width, compute $v=\sqrt{width \times height}$ and determine between which of the anchor scales $[16, 32, 64, 128, 256, 512]$ the value of $v$ lies. For example, if $v=45$, then $v$ falls between $32$ and $64$ (see the sketch at the end of this document).
+
+
+
+
+PyramidBox prediction visualization
+
+
+
+### Model release
+
+
+
+| Model | Pretrained model | Training data | Test data | mAP |
+|:------------------------:|:------------------:|:----------------:|:------------:|:----:|
+|[Pyramidbox-v1-SSD 640x640](http://paddlemodels.bj.bcebos.com/PyramidBox_WiderFace.tar.gz) | [VGGNet](http://paddlemodels.bj.bcebos.com/vgg_ilsvrc_16_fc_reduced.tar.gz) | WIDER FACE train | WIDER FACE Val | 96.0%/ 94.8%/ 88.8% |
+
+#### Performance curves
+
+
+
+
+WIDER FACE Easy/Medium/Hard set
+
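+
+The short sketch below illustrates the first step of data-anchor-sampling for the $v=45$ example above (it is an illustration only, not part of the training code):
+
+```python
+import math
+
+# Locate the anchor scales surrounding a 45x45 face.
+anchors = [16, 32, 64, 128, 256, 512]
+v = math.sqrt(45 * 45)                     # face size v = 45.0
+lower = max(a for a in anchors if a <= v)  # -> 32
+upper = min(a for a in anchors if a > v)   # -> 64
+print(v, lower, upper)
+```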
diff --git a/fluid/DeepASR/model_utils/__init__.py b/PaddleCV/face_detection/__init__.py
similarity index 100%
rename from fluid/DeepASR/model_utils/__init__.py
rename to PaddleCV/face_detection/__init__.py
diff --git a/PaddleCV/face_detection/_ce.py b/PaddleCV/face_detection/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..7d8325dca6d8f7a08c8f7f0b734d2643b3c550b1
--- /dev/null
+++ b/PaddleCV/face_detection/_ce.py
@@ -0,0 +1,65 @@
+# This file is only used for the continuous evaluation test!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi
+from kpi import DurationKpi
+
+
+each_pass_duration_card1_kpi = DurationKpi('each_pass_duration_card1', 0.08, 0, actived=True)
+train_face_loss_card1_kpi = CostKpi('train_face_loss_card1', 0.08, 0)
+train_head_loss_card1_kpi = CostKpi('train_head_loss_card1', 0.08, 0)
+each_pass_duration_card4_kpi = DurationKpi('each_pass_duration_card4', 0.08, 0, actived=True)
+train_face_loss_card4_kpi = CostKpi('train_face_loss_card4', 0.08, 0)
+train_head_loss_card4_kpi = CostKpi('train_head_loss_card4', 0.08, 0)
+
+tracking_kpis = [
+ each_pass_duration_card1_kpi,
+ train_face_loss_card1_kpi,
+ train_head_loss_card1_kpi,
+ each_pass_duration_card4_kpi,
+ train_face_loss_card4_kpi,
+ train_head_loss_card4_kpi,
+ ]
+
+
+def parse_log(log):
+ '''
+ This method should be implemented by model developers.
+
+ The suggestion:
+
+ each line in the log should be key, value, for example:
+
+ "
+ train_cost\t1.0
+ test_cost\t1.0
+ train_cost\t1.0
+ train_cost\t1.0
+ train_acc\t1.2
+ "
+ '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ log_to_ce(log)
diff --git a/fluid/PaddleCV/face_detection/data/download.sh b/PaddleCV/face_detection/data/download.sh
similarity index 100%
rename from fluid/PaddleCV/face_detection/data/download.sh
rename to PaddleCV/face_detection/data/download.sh
diff --git a/fluid/PaddleCV/face_detection/image_util.py b/PaddleCV/face_detection/image_util.py
similarity index 100%
rename from fluid/PaddleCV/face_detection/image_util.py
rename to PaddleCV/face_detection/image_util.py
diff --git a/fluid/PaddleCV/face_detection/images/0_Parade_marchingband_1_356.jpg b/PaddleCV/face_detection/images/0_Parade_marchingband_1_356.jpg
similarity index 100%
rename from fluid/PaddleCV/face_detection/images/0_Parade_marchingband_1_356.jpg
rename to PaddleCV/face_detection/images/0_Parade_marchingband_1_356.jpg
diff --git a/fluid/PaddleCV/face_detection/images/12_Group_Group_12_Group_Group_12_935.jpg b/PaddleCV/face_detection/images/12_Group_Group_12_Group_Group_12_935.jpg
similarity index 100%
rename from fluid/PaddleCV/face_detection/images/12_Group_Group_12_Group_Group_12_935.jpg
rename to PaddleCV/face_detection/images/12_Group_Group_12_Group_Group_12_935.jpg
diff --git a/fluid/PaddleCV/face_detection/images/28_Sports_Fan_Sports_Fan_28_770.jpg b/PaddleCV/face_detection/images/28_Sports_Fan_Sports_Fan_28_770.jpg
similarity index 100%
rename from fluid/PaddleCV/face_detection/images/28_Sports_Fan_Sports_Fan_28_770.jpg
rename to PaddleCV/face_detection/images/28_Sports_Fan_Sports_Fan_28_770.jpg
diff --git a/fluid/PaddleCV/face_detection/images/4_Dancing_Dancing_4_194.jpg b/PaddleCV/face_detection/images/4_Dancing_Dancing_4_194.jpg
similarity index 100%
rename from fluid/PaddleCV/face_detection/images/4_Dancing_Dancing_4_194.jpg
rename to PaddleCV/face_detection/images/4_Dancing_Dancing_4_194.jpg
diff --git a/fluid/PaddleCV/face_detection/images/architecture_of_pyramidbox.jpg b/PaddleCV/face_detection/images/architecture_of_pyramidbox.jpg
similarity index 100%
rename from fluid/PaddleCV/face_detection/images/architecture_of_pyramidbox.jpg
rename to PaddleCV/face_detection/images/architecture_of_pyramidbox.jpg
diff --git a/fluid/PaddleCV/face_detection/images/demo_img.jpg b/PaddleCV/face_detection/images/demo_img.jpg
similarity index 100%
rename from fluid/PaddleCV/face_detection/images/demo_img.jpg
rename to PaddleCV/face_detection/images/demo_img.jpg
diff --git a/fluid/PaddleCV/face_detection/images/wider_pr_cruve_int_easy_val.jpg b/PaddleCV/face_detection/images/wider_pr_cruve_int_easy_val.jpg
similarity index 100%
rename from fluid/PaddleCV/face_detection/images/wider_pr_cruve_int_easy_val.jpg
rename to PaddleCV/face_detection/images/wider_pr_cruve_int_easy_val.jpg
diff --git a/fluid/PaddleCV/face_detection/images/wider_pr_cruve_int_hard_val.jpg b/PaddleCV/face_detection/images/wider_pr_cruve_int_hard_val.jpg
similarity index 100%
rename from fluid/PaddleCV/face_detection/images/wider_pr_cruve_int_hard_val.jpg
rename to PaddleCV/face_detection/images/wider_pr_cruve_int_hard_val.jpg
diff --git a/fluid/PaddleCV/face_detection/images/wider_pr_cruve_int_medium_val.jpg b/PaddleCV/face_detection/images/wider_pr_cruve_int_medium_val.jpg
similarity index 100%
rename from fluid/PaddleCV/face_detection/images/wider_pr_cruve_int_medium_val.jpg
rename to PaddleCV/face_detection/images/wider_pr_cruve_int_medium_val.jpg
diff --git a/fluid/PaddleCV/face_detection/profile.py b/PaddleCV/face_detection/profile.py
similarity index 100%
rename from fluid/PaddleCV/face_detection/profile.py
rename to PaddleCV/face_detection/profile.py
diff --git a/fluid/PaddleCV/face_detection/pyramidbox.py b/PaddleCV/face_detection/pyramidbox.py
similarity index 100%
rename from fluid/PaddleCV/face_detection/pyramidbox.py
rename to PaddleCV/face_detection/pyramidbox.py
diff --git a/PaddleCV/face_detection/reader.py b/PaddleCV/face_detection/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..4839ba5c5389a696fe0cb5f4fcd24daff42f217f
--- /dev/null
+++ b/PaddleCV/face_detection/reader.py
@@ -0,0 +1,324 @@
+# Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+from PIL import Image
+import numpy as np
+import os
+import six
+import math
+import paddle
+import image_util
+
+
+class Settings(object):
+ def __init__(self,
+ dataset=None,
+ data_dir=None,
+ label_file=None,
+ resize_h=None,
+ resize_w=None,
+ mean_value=[104., 117., 123.],
+ apply_distort=True,
+ apply_expand=True,
+ ap_version='11point',
+ toy=0):
+ self.dataset = dataset
+ self.ap_version = ap_version
+ self.toy = toy
+ self.data_dir = data_dir
+ self.apply_distort = apply_distort
+ self.apply_expand = apply_expand
+ self.resize_height = resize_h
+ self.resize_width = resize_w
+ self.img_mean = np.array(mean_value)[:, np.newaxis, np.newaxis].astype(
+ 'float32')
+ self.expand_prob = 0.5
+ self.expand_max_ratio = 4
+ self.hue_prob = 0.5
+ self.hue_delta = 18
+ self.contrast_prob = 0.5
+ self.contrast_delta = 0.5
+ self.saturation_prob = 0.5
+ self.saturation_delta = 0.5
+ self.brightness_prob = 0.5
+        # brightness_delta is the delta value normalized by 256.
+ self.brightness_delta = 0.125
+ self.scale = 0.007843 # 1 / 127.5
+ self.data_anchor_sampling_prob = 0.5
+ self.min_face_size = 8.0
+
+
+def to_chw_bgr(image):
+ """
+ Transpose image from HWC to CHW and from RBG to BGR.
+ Args:
+ image (np.array): an image with HWC and RBG layout.
+ """
+ # HWC to CHW
+ if len(image.shape) == 3:
+ image = np.swapaxes(image, 1, 2)
+ image = np.swapaxes(image, 1, 0)
+    # RGB to BGR
+ image = image[[2, 1, 0], :, :]
+ return image
+
+
+def preprocess(img, bbox_labels, mode, settings, image_path):
+ img_width, img_height = img.size
+ sampled_labels = bbox_labels
+ if mode == 'train':
+ if settings.apply_distort:
+ img = image_util.distort_image(img, settings)
+ if settings.apply_expand:
+ img, bbox_labels, img_width, img_height = image_util.expand_image(
+ img, bbox_labels, img_width, img_height, settings)
+
+ # sampling
+ batch_sampler = []
+
+ prob = np.random.uniform(0., 1.)
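+        # With probability ~0.5, use PyramidBox-style data-anchor-sampling:
+        # crop so that a randomly chosen face is resized towards one of the
+        # anchor scales in scale_array; otherwise fall back to the SSD-style
+        # batch samplers in the else-branch below.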
+ if prob > settings.data_anchor_sampling_prob:
+ scale_array = np.array([16, 32, 64, 128, 256, 512])
+ batch_sampler.append(
+ image_util.sampler(1, 10, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.2,
+ 0.0, True))
+ sampled_bbox = image_util.generate_batch_random_samples(
+ batch_sampler, bbox_labels, img_width, img_height, scale_array,
+ settings.resize_width, settings.resize_height)
+ img = np.array(img)
+ if len(sampled_bbox) > 0:
+ idx = int(np.random.uniform(0, len(sampled_bbox)))
+ img, sampled_labels = image_util.crop_image_sampling(
+ img, bbox_labels, sampled_bbox[idx], img_width, img_height,
+ settings.resize_width, settings.resize_height,
+ settings.min_face_size)
+
+ img = img.astype('uint8')
+ img = Image.fromarray(img)
+
+ else:
+            # hard-coded SSD-style sampler parameters
+ batch_sampler.append(
+ image_util.sampler(1, 50, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0,
+ 0.0, True))
+ batch_sampler.append(
+ image_util.sampler(1, 50, 0.3, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0,
+ 0.0, True))
+ batch_sampler.append(
+ image_util.sampler(1, 50, 0.3, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0,
+ 0.0, True))
+ batch_sampler.append(
+ image_util.sampler(1, 50, 0.3, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0,
+ 0.0, True))
+ batch_sampler.append(
+ image_util.sampler(1, 50, 0.3, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0,
+ 0.0, True))
+ sampled_bbox = image_util.generate_batch_samples(
+ batch_sampler, bbox_labels, img_width, img_height)
+
+ img = np.array(img)
+ if len(sampled_bbox) > 0:
+ idx = int(np.random.uniform(0, len(sampled_bbox)))
+ img, sampled_labels = image_util.crop_image(
+ img, bbox_labels, sampled_bbox[idx], img_width, img_height,
+ settings.resize_width, settings.resize_height,
+ settings.min_face_size)
+
+ img = Image.fromarray(img)
+ interp_mode = [
+ Image.BILINEAR, Image.HAMMING, Image.NEAREST, Image.BICUBIC,
+ Image.LANCZOS
+ ]
+ interp_indx = np.random.randint(0, 5)
+
+ img = img.resize(
+ (settings.resize_width, settings.resize_height),
+ resample=interp_mode[interp_indx])
+ img = np.array(img)
+
+ if mode == 'train':
+ mirror = int(np.random.uniform(0, 2))
+ if mirror == 1:
+ img = img[:, ::-1, :]
+ for i in six.moves.xrange(len(sampled_labels)):
+ tmp = sampled_labels[i][1]
+ sampled_labels[i][1] = 1 - sampled_labels[i][3]
+ sampled_labels[i][3] = 1 - tmp
+
+ img = to_chw_bgr(img)
+ img = img.astype('float32')
+ img -= settings.img_mean
+ img = img * settings.scale
+ return img, sampled_labels
+
+
+def load_file_list(input_txt):
+ with open(input_txt, 'r') as f_dir:
+ lines_input_txt = f_dir.readlines()
+
+ file_dict = {}
+ num_class = 0
+ for i in range(len(lines_input_txt)):
+ line_txt = lines_input_txt[i].strip('\n\t\r')
+ if '--' in line_txt:
+ if i != 0:
+ num_class += 1
+ file_dict[num_class] = []
+ file_dict[num_class].append(line_txt)
+ if '--' not in line_txt:
+ if len(line_txt) > 6:
+ split_str = line_txt.split(' ')
+ x1_min = float(split_str[0])
+ y1_min = float(split_str[1])
+ x2_max = float(split_str[2])
+ y2_max = float(split_str[3])
+ line_txt = str(x1_min) + ' ' + str(y1_min) + ' ' + str(
+ x2_max) + ' ' + str(y2_max)
+ file_dict[num_class].append(line_txt)
+ else:
+ file_dict[num_class].append(line_txt)
+
+ return list(file_dict.values())
+
+
+def expand_bboxes(bboxes,
+ expand_left=2.,
+ expand_up=2.,
+ expand_right=2.,
+ expand_down=2.):
+ """
+ Expand bboxes, expand 2 times by defalut.
+ """
+ expand_boxes = []
+ for bbox in bboxes:
+ xmin = bbox[0]
+ ymin = bbox[1]
+ xmax = bbox[2]
+ ymax = bbox[3]
+ w = xmax - xmin
+ h = ymax - ymin
+ ex_xmin = max(xmin - w / expand_left, 0.)
+ ex_ymin = max(ymin - h / expand_up, 0.)
+ ex_xmax = min(xmax + w / expand_right, 1.)
+ ex_ymax = min(ymax + h / expand_down, 1.)
+ expand_boxes.append([ex_xmin, ex_ymin, ex_xmax, ex_ymax])
+ return expand_boxes
+
+
+def train_generator(settings, file_list, batch_size, shuffle=True):
+ def reader():
+ if shuffle:
+ np.random.shuffle(file_list)
+ batch_out = []
+ for item in file_list:
+ image_name = item[0]
+ image_path = os.path.join(settings.data_dir, image_name)
+ im = Image.open(image_path)
+ if im.mode == 'L':
+ im = im.convert('RGB')
+ im_width, im_height = im.size
+
+ # layout: label | xmin | ymin | xmax | ymax
+ bbox_labels = []
+ for index_box in range(len(item)):
+ if index_box >= 2:
+ bbox_sample = []
+ temp_info_box = item[index_box].split(' ')
+ xmin = float(temp_info_box[0])
+ ymin = float(temp_info_box[1])
+ w = float(temp_info_box[2])
+ h = float(temp_info_box[3])
+
+ # Filter out wrong labels
+ if w < 0 or h < 0:
+ continue
+ xmax = xmin + w
+ ymax = ymin + h
+
+ bbox_sample.append(1)
+ bbox_sample.append(float(xmin) / im_width)
+ bbox_sample.append(float(ymin) / im_height)
+ bbox_sample.append(float(xmax) / im_width)
+ bbox_sample.append(float(ymax) / im_height)
+ bbox_labels.append(bbox_sample)
+ im, sample_labels = preprocess(im, bbox_labels, "train", settings,
+ image_path)
+ sample_labels = np.array(sample_labels)
+ if len(sample_labels) == 0: continue
+
+ im = im.astype('float32')
+ face_box = sample_labels[:, 1:5]
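+            # PyramidBox also supervises a head branch; approximate head boxes
+            # by expanding each face box (doubled width/height).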
+ head_box = expand_bboxes(face_box)
+ label = [1] * len(face_box)
+ batch_out.append((im, face_box, head_box, label))
+ if len(batch_out) == batch_size:
+ yield batch_out
+ batch_out = []
+
+ return reader
+
+
+def train(settings, file_list, batch_size, shuffle=True, num_workers=8):
+ file_lists = load_file_list(file_list)
+    n = int(math.ceil(len(file_lists) / num_workers))
+ split_lists = [file_lists[i:i + n] for i in range(0, len(file_lists), n)]
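+    # Shard the file list across num_workers processes; each shard gets its own
+    # generator, and multiprocess_reader merges their outputs.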
+ readers = []
+    for item in split_lists:
+        readers.append(train_generator(settings, item, batch_size, shuffle))
+ return paddle.reader.multiprocess_reader(readers, False)
+
+
+def test(settings, file_list):
+ file_lists = load_file_list(file_list)
+
+ def reader():
+ for image in file_lists:
+ image_name = image[0]
+ image_path = os.path.join(settings.data_dir, image_name)
+ im = Image.open(image_path)
+ if im.mode == 'L':
+ im = im.convert('RGB')
+ yield im, image_path
+
+ return reader
+
+
+def infer(settings, image_path):
+ def batch_reader():
+ img = Image.open(image_path)
+        if img.mode == 'L':
+            img = img.convert('RGB')
+ im_width, im_height = img.size
+ if settings.resize_width and settings.resize_height:
+ img = img.resize((settings.resize_width, settings.resize_height),
+ Image.ANTIALIAS)
+ img = np.array(img)
+ img = to_chw_bgr(img)
+ img = img.astype('float32')
+ img -= settings.img_mean
+ img = img * settings.scale
+ return np.array([img])
+
+ return batch_reader
diff --git a/PaddleCV/face_detection/train.py b/PaddleCV/face_detection/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..2108bcc32a378bbb0803032108ddafea4161e202
--- /dev/null
+++ b/PaddleCV/face_detection/train.py
@@ -0,0 +1,271 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import os
+import shutil
+import numpy as np
+import time
+import argparse
+import functools
+
+import paddle
+import paddle.fluid as fluid
+from pyramidbox import PyramidBox
+import reader
+from utility import add_arguments, print_arguments
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+
+# yapf: disable
+add_arg('parallel', bool, True, "Whether to use multi-GPU/threads or not.")
+add_arg('learning_rate', float, 0.001, "The start learning rate.")
+add_arg('batch_size', int, 16, "Minibatch size.")
+add_arg('epoc_num', int, 160, "Epoch number.")
+add_arg('use_gpu', bool, True, "Whether to use GPU.")
+add_arg('use_pyramidbox', bool, True, "Whether to use the PyramidBox model.")
+add_arg('model_save_dir', str, 'output', "The path to save the model.")
+add_arg('resize_h', int, 640, "The resized image height.")
+add_arg('resize_w', int, 640, "The resized image width.")
+add_arg('mean_BGR', str, '104., 117., 123.', "Mean values for the B, G, R channels, which will be subtracted.")
+add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
+add_arg('pretrained_model', str, './vgg_ilsvrc_16_fc_reduced/', "The init model path.")
+add_arg('data_dir', str, 'data', "The base directory of the dataset.")
+parser.add_argument('--enable_ce', action='store_true', help='If set, run the task with continuous evaluation logs.')
+parser.add_argument('--batch_num', type=int, help="batch num for ce")
+parser.add_argument('--num_devices', type=int, default=1, help='Number of GPU devices')
+# yapf: enable
+
+train_parameters = {
+ "train_images": 12880,
+ "image_shape": [3, 640, 640],
+ "class_num": 2,
+ "batch_size": 16,
+ "lr": 0.001,
+ "lr_epochs": [99, 124, 149],
+ "lr_decay": [1, 0.1, 0.01, 0.001],
+ "epoc_num": 160,
+ "optimizer_method": "momentum",
+ "use_pyramidbox": True
+}
+
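+# Piecewise learning-rate decay: the base lr is multiplied by lr_decay[i]
+# once training passes boundary lr_epochs[i] (epochs 99/124/149 by default).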
+def optimizer_setting(train_params):
+ batch_size = train_params["batch_size"]
+ iters = train_params["train_images"] // batch_size
+ lr = train_params["lr"]
+ optimizer_method = train_params["optimizer_method"]
+ boundaries = [i * iters for i in train_params["lr_epochs"]]
+ values = [i * lr for i in train_params["lr_decay"]]
+
+ if optimizer_method == "momentum":
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=fluid.layers.piecewise_decay(boundaries, values),
+ momentum=0.9,
+ regularization=fluid.regularizer.L2Decay(0.0005),
+ )
+ else:
+ optimizer = fluid.optimizer.RMSProp(
+ learning_rate=fluid.layers.piecewise_decay(boundaries, values),
+ regularization=fluid.regularizer.L2Decay(0.0005),
+ )
+ return optimizer
+
+
+def build_program(train_params, main_prog, startup_prog, args):
+ use_pyramidbox = train_params["use_pyramidbox"]
+ image_shape = train_params["image_shape"]
+ class_num = train_params["class_num"]
+ with fluid.program_guard(main_prog, startup_prog):
+ py_reader = fluid.layers.py_reader(
+ capacity=8,
+ shapes=[[-1] + image_shape, [-1, 4], [-1, 4], [-1, 1]],
+ lod_levels=[0, 1, 1, 1],
+ dtypes=["float32", "float32", "float32", "int32"],
+ use_double_buffer=True)
+ with fluid.unique_name.guard():
+ image, face_box, head_box, gt_label = fluid.layers.read_file(py_reader)
+ fetches = []
+ network = PyramidBox(image=image,
+ face_box=face_box,
+ head_box=head_box,
+ gt_label=gt_label,
+ sub_network=use_pyramidbox)
+ if use_pyramidbox:
+ face_loss, head_loss, loss = network.train()
+ fetches = [face_loss, head_loss]
+ else:
+ loss = network.vgg_ssd_loss()
+ fetches = [loss]
+ optimizer = optimizer_setting(train_params)
+ optimizer.minimize(loss)
+ return py_reader, fetches, loss
+
+def train(args, config, train_params, train_file_list):
+ batch_size = train_params["batch_size"]
+ epoc_num = train_params["epoc_num"]
+ optimizer_method = train_params["optimizer_method"]
+ use_pyramidbox = train_params["use_pyramidbox"]
+
+ use_gpu = args.use_gpu
+ model_save_dir = args.model_save_dir
+ pretrained_model = args.pretrained_model
+ with_memory_optimization = args.with_mem_opt
+
+ devices = os.getenv("CUDA_VISIBLE_DEVICES") or ""
+ devices_num = len(devices.split(","))
+ batch_size_per_device = batch_size // devices_num
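+    # The global batch size is split evenly across the devices listed in
+    # CUDA_VISIBLE_DEVICES.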
+ iters_per_epoc = train_params["train_images"] // batch_size
+ num_workers = 8
+ is_shuffle = True
+
+ startup_prog = fluid.Program()
+ train_prog = fluid.Program()
+
+    # only for CE
+    if args.enable_ce:
+        SEED = 102
+        startup_prog.random_seed = SEED
+        train_prog.random_seed = SEED
+        num_workers = 1
+        pretrained_model = ""
+        if args.batch_num is not None:
+            iters_per_epoc = args.batch_num
+
+    train_py_reader, fetches, loss = build_program(
+        train_params=train_params,
+        main_prog=train_prog,
+        startup_prog=startup_prog,
+        args=args)
+
+ if with_memory_optimization:
+ fluid.memory_optimize(train_prog)
+
+ place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(startup_prog)
+
+ start_epoc = 0
+ if pretrained_model:
+ if pretrained_model.isdigit():
+ start_epoc = int(pretrained_model) + 1
+ pretrained_model = os.path.join(model_save_dir, pretrained_model)
+ print("Resume from %s " %(pretrained_model))
+
+ if not os.path.exists(pretrained_model):
+ raise ValueError("The pre-trained model path [%s] does not exist." %
+ (pretrained_model))
+ def if_exist(var):
+ return os.path.exists(os.path.join(pretrained_model, var.name))
+ fluid.io.load_vars(
+ exe, pretrained_model, main_program=train_prog, predicate=if_exist)
+    train_reader = reader.train(config,
+                                train_file_list,
+                                batch_size_per_device,
+                                shuffle=is_shuffle,
+                                num_workers=num_workers)
+ train_py_reader.decorate_paddle_reader(train_reader)
+
+ if args.parallel:
+        train_exe = fluid.ParallelExecutor(
+            main_program=train_prog,
+            use_cuda=use_gpu,
+            loss_name=loss.name)
+
+ def save_model(postfix, program):
+ model_path = os.path.join(model_save_dir, postfix)
+ if os.path.isdir(model_path):
+ shutil.rmtree(model_path)
+
+ print('save models to %s' % (model_path))
+ fluid.io.save_persistables(exe, model_path, main_program=program)
+
+ total_time = 0.0
+ epoch_idx = 0
+ face_loss = 0
+ head_loss = 0
+ for pass_id in range(start_epoc, epoc_num):
+ epoch_idx += 1
+        epoch_start_time = time.time()
+        start_time = epoch_start_time
+        prev_start_time = start_time
+        end_time = 0
+ batch_id = 0
+ train_py_reader.start()
+ while True:
+ try:
+ prev_start_time = start_time
+ start_time = time.time()
+ if args.parallel:
+                    fetch_vars = train_exe.run(
+                        fetch_list=[v.name for v in fetches])
+ else:
+ fetch_vars = exe.run(train_prog, fetch_list=fetches)
+ end_time = time.time()
+ fetch_vars = [np.mean(np.array(v)) for v in fetch_vars]
+ face_loss = fetch_vars[0]
+ head_loss = fetch_vars[1]
+ if batch_id % 10 == 0:
+ if not args.use_pyramidbox:
+ print("Pass {:d}, batch {:d}, loss {:.6f}, time {:.5f}".format(
+ pass_id, batch_id, face_loss,
+ start_time - prev_start_time))
+ else:
+ print("Pass {:d}, batch {:d}, face loss {:.6f}, " \
+ "head loss {:.6f}, " \
+ "time {:.5f}".format(pass_id,
+ batch_id, face_loss, head_loss,
+ start_time - prev_start_time))
+ batch_id += 1
+ except (fluid.core.EOFException, StopIteration):
+ train_py_reader.reset()
+ break
+        epoch_end_time = time.time()
+        total_time += epoch_end_time - epoch_start_time
+ save_model(str(pass_id), train_prog)
+
+ # only for ce
+ if args.enable_ce:
+ gpu_num = get_cards(args)
+ print("kpis\teach_pass_duration_card%s\t%s" %
+ (gpu_num, total_time / epoch_idx))
+ print("kpis\ttrain_face_loss_card%s\t%s" %
+ (gpu_num, face_loss))
+ print("kpis\ttrain_head_loss_card%s\t%s" %
+ (gpu_num, head_loss))
+
+
+def get_cards(args):
+ if args.enable_ce:
+ cards = os.environ.get('CUDA_VISIBLE_DEVICES')
+ num = len(cards.split(","))
+ return num
+ else:
+ return args.num_devices
+
+
+if __name__ == '__main__':
+ args = parser.parse_args()
+ print_arguments(args)
+
+ data_dir = os.path.join(args.data_dir, 'WIDER_train/images/')
+ train_file_list = os.path.join(args.data_dir,
+ 'wider_face_split/wider_face_train_bbx_gt.txt')
+ mean_BGR = [float(m) for m in args.mean_BGR.split(",")]
+ image_shape = [3, int(args.resize_h), int(args.resize_w)]
+ train_parameters["image_shape"] = image_shape
+ train_parameters["use_pyramidbox"] = args.use_pyramidbox
+ train_parameters["batch_size"] = args.batch_size
+ train_parameters["lr"] = args.learning_rate
+ train_parameters["epoc_num"] = args.epoc_num
+
+
+ config = reader.Settings(
+ data_dir=data_dir,
+ resize_h=image_shape[1],
+ resize_w=image_shape[2],
+ apply_distort=True,
+ apply_expand=False,
+ mean_value=mean_BGR,
+ ap_version='11point')
+ train(args, config, train_parameters, train_file_list)
diff --git a/PaddleCV/face_detection/utility.py b/PaddleCV/face_detection/utility.py
new file mode 100644
index 0000000000000000000000000000000000000000..aebb9acbf4f450b50f020d96ccd3b13be5d7afaf
--- /dev/null
+++ b/PaddleCV/face_detection/utility.py
@@ -0,0 +1,60 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import distutils.util
+import six
+
+
+def print_arguments(args):
+ """Print argparse's arguments.
+
+ Usage:
+
+ .. code-block:: python
+
+ parser = argparse.ArgumentParser()
+ parser.add_argument("name", default="Jonh", type=str, help="User name.")
+ args = parser.parse_args()
+ print_arguments(args)
+
+ :param args: Input argparse.Namespace for printing.
+ :type args: argparse.Namespace
+ """
+ print("----------- Configuration Arguments -----------")
+ for arg, value in sorted(six.iteritems(vars(args))):
+ print("%s: %s" % (arg, value))
+ print("------------------------------------------------")
+
+
+def add_arguments(argname, type, default, help, argparser, **kwargs):
+ """Add argparse's argument.
+
+ Usage:
+
+ .. code-block:: python
+
+ parser = argparse.ArgumentParser()
+ add_argument("name", str, "Jonh", "User name.", parser)
+ args = parser.parse_args()
+ """
+ type = distutils.util.strtobool if type == bool else type
+ argparser.add_argument(
+ "--" + argname,
+ default=default,
+ type=type,
+ help=help + ' Default: %(default)s.',
+ **kwargs)
diff --git a/fluid/PaddleCV/face_detection/visualize.py b/PaddleCV/face_detection/visualize.py
similarity index 100%
rename from fluid/PaddleCV/face_detection/visualize.py
rename to PaddleCV/face_detection/visualize.py
diff --git a/PaddleCV/face_detection/widerface_eval.py b/PaddleCV/face_detection/widerface_eval.py
new file mode 100644
index 0000000000000000000000000000000000000000..a0525136a275002f6aac1816f327c497ed7822b2
--- /dev/null
+++ b/PaddleCV/face_detection/widerface_eval.py
@@ -0,0 +1,320 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import os
+import time
+import numpy as np
+import argparse
+import functools
+from PIL import Image
+
+import paddle.fluid as fluid
+import reader
+from pyramidbox import PyramidBox
+from visualize import draw_bboxes
+from utility import add_arguments, print_arguments
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+
+# yapf: disable
+add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+add_arg('use_pyramidbox', bool, True, "Whether to use the PyramidBox model.")
+add_arg('data_dir', str, 'data/WIDER_val/images/', "The validation dataset path.")
+add_arg('model_dir', str, '', "The model path.")
+add_arg('pred_dir', str, 'pred', "The path to save the evaluation results.")
+add_arg('file_list', str, 'data/wider_face_split/wider_face_val_bbx_gt.txt', "The path of the validation file list.")
+add_arg('infer', bool, False, "Whether to run inference instead of evaluation.")
+add_arg('confs_threshold', float, 0.15, "Confidence threshold to draw bbox.")
+add_arg('image_path', str, '', "The image used for inference and visualization.")
+# yapf: enable
+
+
+def infer(args, config):
+ model_dir = args.model_dir
+ pred_dir = args.pred_dir
+ if not os.path.exists(model_dir):
+ raise ValueError("The model path [%s] does not exist." % (model_dir))
+
+ if args.infer:
+ image_path = args.image_path
+ image = Image.open(image_path)
+        if image.mode == 'L':
+            image = image.convert('RGB')
+ shrink, max_shrink = get_shrink(image.size[1], image.size[0])
+
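+        # Multi-scale testing ensemble: original scale, horizontal flip,
+        # shrunken/enlarged scales, and an image pyramid; all detections are
+        # merged with box voting.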
+ det0 = detect_face(image, shrink)
+ if args.use_gpu:
+ det1 = flip_test(image, shrink)
+ [det2, det3] = multi_scale_test(image, max_shrink)
+ det4 = multi_scale_test_pyramid(image, max_shrink)
+ det = np.row_stack((det0, det1, det2, det3, det4))
+ dets = bbox_vote(det)
+ else:
+            # When inferring on CPU, skip the multi-scale ensemble and keep the single-scale result.
+ dets = det0
+
+ keep_index = np.where(dets[:, 4] >= args.confs_threshold)[0]
+ dets = dets[keep_index, :]
+ draw_bboxes(image_path, dets[:, 0:4])
+ else:
+ test_reader = reader.test(config, args.file_list)
+ for image, image_path in test_reader():
+ shrink, max_shrink = get_shrink(image.size[1], image.size[0])
+
+ det0 = detect_face(image, shrink)
+ det1 = flip_test(image, shrink)
+ [det2, det3] = multi_scale_test(image, max_shrink)
+ det4 = multi_scale_test_pyramid(image, max_shrink)
+ det = np.row_stack((det0, det1, det2, det3, det4))
+ dets = bbox_vote(det)
+
+ save_widerface_bboxes(image_path, dets, pred_dir)
+
+ print("Finish evaluation.")
+
+
+def save_widerface_bboxes(image_path, bboxes_scores, output_dir):
+ """
+ Save predicted results, including bbox and score into text file.
+ Args:
+ image_path (string): file name.
+        bboxes_scores (np.array|list): the predicted bboxes and scores; layout
+            is (xmin, ymin, xmax, ymax, score).
+ output_dir (string): output directory.
+ """
+ image_name = image_path.split('/')[-1]
+ image_class = image_path.split('/')[-2]
+
+ odir = os.path.join(output_dir, image_class)
+ if not os.path.exists(odir):
+ os.makedirs(odir)
+
+    ofname = os.path.join(odir, '%s.txt' % (image_name[:-4]))
+    with open(ofname, 'w') as f:
+        f.write('{:s}\n'.format(image_class + '/' + image_name))
+        f.write('{:d}\n'.format(bboxes_scores.shape[0]))
+        for box_score in bboxes_scores:
+            xmin, ymin, xmax, ymax, score = box_score
+            f.write('{:.1f} {:.1f} {:.1f} {:.1f} {:.3f}\n'.format(
+                xmin, ymin, (xmax - xmin + 1), (ymax - ymin + 1), score))
+ print("The predicted result is saved as {}".format(ofname))
+
+
+def detect_face(image, shrink):
+ image_shape = [3, image.size[1], image.size[0]]
+ if shrink != 1:
+ h, w = int(image_shape[1] * shrink), int(image_shape[2] * shrink)
+ image = image.resize((w, h), Image.ANTIALIAS)
+ image_shape = [3, h, w]
+
+ img = np.array(image)
+ img = reader.to_chw_bgr(img)
+ mean = [104., 117., 123.]
+ scale = 0.007843
+ img = img.astype('float32')
+ img -= np.array(mean)[:, np.newaxis, np.newaxis].astype('float32')
+ img = img * scale
+ img = [img]
+ img = np.array(img)
+
+ detection, = exe.run(infer_program,
+ feed={'image': img},
+ fetch_list=fetches,
+ return_numpy=False)
+ detection = np.array(detection)
+    # detection layout: label, score, xmin, ymin, xmax, ymax
+ if np.prod(detection.shape) == 1:
+ print("No face detected")
+ return np.array([[0, 0, 0, 0, 0]])
+ det_conf = detection[:, 1]
+ det_xmin = image_shape[2] * detection[:, 2] / shrink
+ det_ymin = image_shape[1] * detection[:, 3] / shrink
+ det_xmax = image_shape[2] * detection[:, 4] / shrink
+ det_ymax = image_shape[1] * detection[:, 5] / shrink
+
+ det = np.column_stack((det_xmin, det_ymin, det_xmax, det_ymax, det_conf))
+ return det
+
+
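+# Box voting: detections are processed in descending score order; all boxes
+# with IoU >= 0.3 against the current top box are merged into one box by
+# score-weighted coordinate averaging, keeping the maximum score. At most
+# 750 boxes are returned.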
+def bbox_vote(det):
+ order = det[:, 4].ravel().argsort()[::-1]
+ det = det[order, :]
+ if det.shape[0] == 0:
+ dets = np.array([[10, 10, 20, 20, 0.002]])
+ det = np.empty(shape=[0, 5])
+ while det.shape[0] > 0:
+ # IOU
+ area = (det[:, 2] - det[:, 0] + 1) * (det[:, 3] - det[:, 1] + 1)
+ xx1 = np.maximum(det[0, 0], det[:, 0])
+ yy1 = np.maximum(det[0, 1], det[:, 1])
+ xx2 = np.minimum(det[0, 2], det[:, 2])
+ yy2 = np.minimum(det[0, 3], det[:, 3])
+ w = np.maximum(0.0, xx2 - xx1 + 1)
+ h = np.maximum(0.0, yy2 - yy1 + 1)
+ inter = w * h
+ o = inter / (area[0] + area[:] - inter)
+
+ # nms
+ merge_index = np.where(o >= 0.3)[0]
+ det_accu = det[merge_index, :]
+ det = np.delete(det, merge_index, 0)
+        if merge_index.shape[0] <= 1:
+            if det.shape[0] == 0:
+                try:
+                    dets = np.row_stack((dets, det_accu))
+                except NameError:  # dets is undefined until the first merge
+                    dets = det_accu
+            continue
+ det_accu[:, 0:4] = det_accu[:, 0:4] * np.tile(det_accu[:, -1:], (1, 4))
+ max_score = np.max(det_accu[:, 4])
+ det_accu_sum = np.zeros((1, 5))
+ det_accu_sum[:, 0:4] = np.sum(det_accu[:, 0:4],
+ axis=0) / np.sum(det_accu[:, -1:])
+ det_accu_sum[:, 4] = max_score
+        try:
+            dets = np.row_stack((dets, det_accu_sum))
+        except NameError:  # dets is undefined until the first merge
+            dets = det_accu_sum
+ dets = dets[0:750, :]
+ return dets
+
+
+def flip_test(image, shrink):
+ img = image.transpose(Image.FLIP_LEFT_RIGHT)
+ det_f = detect_face(img, shrink)
+ det_t = np.zeros(det_f.shape)
+ # image.size: [width, height]
+ det_t[:, 0] = image.size[0] - det_f[:, 2]
+ det_t[:, 1] = det_f[:, 1]
+ det_t[:, 2] = image.size[0] - det_f[:, 0]
+ det_t[:, 3] = det_f[:, 3]
+ det_t[:, 4] = det_f[:, 4]
+ return det_t
+
+
+def multi_scale_test(image, max_shrink):
+    # Shrunken images are only used to detect big faces.
+ st = 0.5 if max_shrink >= 0.75 else 0.5 * max_shrink
+ det_s = detect_face(image, st)
+ index = np.where(
+ np.maximum(det_s[:, 2] - det_s[:, 0] + 1, det_s[:, 3] - det_s[:, 1] + 1)
+ > 30)[0]
+ det_s = det_s[index, :]
+    # Detect at an enlarged scale (up to 2x).
+ bt = min(2, max_shrink) if max_shrink > 1 else (st + max_shrink) / 2
+ det_b = detect_face(image, bt)
+
+    # Keep doubling the scale up to max_shrink to find small faces.
+ if max_shrink > 2:
+ bt *= 2
+ while bt < max_shrink:
+ det_b = np.row_stack((det_b, detect_face(image, bt)))
+ bt *= 2
+ det_b = np.row_stack((det_b, detect_face(image, max_shrink)))
+
+ # Enlarged images are only used to detect small faces.
+ if bt > 1:
+ index = np.where(
+ np.minimum(det_b[:, 2] - det_b[:, 0] + 1,
+ det_b[:, 3] - det_b[:, 1] + 1) < 100)[0]
+ det_b = det_b[index, :]
+    # Shrunken images are only used to detect big faces.
+ else:
+ index = np.where(
+ np.maximum(det_b[:, 2] - det_b[:, 0] + 1,
+ det_b[:, 3] - det_b[:, 1] + 1) > 30)[0]
+ det_b = det_b[index, :]
+ return det_s, det_b
+
+
+def multi_scale_test_pyramid(image, max_shrink):
+ # Use image pyramids to detect faces
+ det_b = detect_face(image, 0.25)
+ index = np.where(
+ np.maximum(det_b[:, 2] - det_b[:, 0] + 1, det_b[:, 3] - det_b[:, 1] + 1)
+ > 30)[0]
+ det_b = det_b[index, :]
+
+ st = [0.75, 1.25, 1.5, 1.75]
+    for i in range(len(st)):
+        if st[i] <= max_shrink:
+ det_temp = detect_face(image, st[i])
+ # Enlarged images are only used to detect small faces.
+ if st[i] > 1:
+ index = np.where(
+ np.minimum(det_temp[:, 2] - det_temp[:, 0] + 1,
+ det_temp[:, 3] - det_temp[:, 1] + 1) < 100)[0]
+ det_temp = det_temp[index, :]
+            # Shrunken images are only used to detect big faces.
+ else:
+ index = np.where(
+ np.maximum(det_temp[:, 2] - det_temp[:, 0] + 1,
+ det_temp[:, 3] - det_temp[:, 1] + 1) > 30)[0]
+ det_temp = det_temp[index, :]
+ det_b = np.row_stack((det_b, det_temp))
+ return det_b
+
+
+def get_shrink(height, width):
+ """
+ Args:
+ height (int): image height.
+ width (int): image width.
+ """
+ # avoid out of memory
+ max_shrink_v1 = (0x7fffffff / 577.0 / (height * width))**0.5
+ max_shrink_v2 = ((678 * 1024 * 2.0 * 2.0) / (height * width))**0.5
+
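+    # get_round truncates x to `loc` decimal places by string slicing (no
+    # rounding); values with fewer than three decimals are returned unchanged.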
+    def get_round(x, loc):
+        str_x = str(x)
+        if '.' in str_x:
+            str_before, str_after = str_x.split('.')
+            len_after = len(str_after)
+            if len_after >= 3:
+                str_final = str_before + '.' + str_after[0:loc]
+                return float(str_final)
+            else:
+                return x
+        # Fallback for values without a decimal point.
+        return x
+
+ max_shrink = get_round(min(max_shrink_v1, max_shrink_v2), 2) - 0.3
+ if max_shrink >= 1.5 and max_shrink < 2:
+ max_shrink = max_shrink - 0.1
+ elif max_shrink >= 2 and max_shrink < 3:
+ max_shrink = max_shrink - 0.2
+ elif max_shrink >= 3 and max_shrink < 4:
+ max_shrink = max_shrink - 0.3
+ elif max_shrink >= 4 and max_shrink < 5:
+ max_shrink = max_shrink - 0.4
+ elif max_shrink >= 5:
+ max_shrink = max_shrink - 0.5
+
+ shrink = max_shrink if max_shrink < 1 else 1
+ return shrink, max_shrink
+
+
+if __name__ == '__main__':
+ args = parser.parse_args()
+ print_arguments(args)
+ config = reader.Settings(data_dir=args.data_dir)
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ main_program = fluid.Program()
+ startup_program = fluid.Program()
+ image_shape = [3, 1024, 1024]
+ with fluid.program_guard(main_program, startup_program):
+ network = PyramidBox(
+ data_shape=image_shape,
+ sub_network=args.use_pyramidbox,
+ is_infer=True)
+ infer_program, nmsed_out = network.infer(main_program)
+ fetches = [nmsed_out]
+ fluid.io.load_persistables(
+ exe, args.model_dir, main_program=infer_program)
+ # save model and program
+ #fluid.io.save_inference_model('pyramidbox_model',
+ # ['image'], [nmsed_out], exe, main_program=infer_program,
+ # model_filename='model', params_filename='params')
+ infer(args, config)
diff --git a/PaddleCV/gan/c_gan/.run_ce.sh b/PaddleCV/gan/c_gan/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..eb43acc363e66e10f6ae4052e426dfa25b2d3e8f
--- /dev/null
+++ b/PaddleCV/gan/c_gan/.run_ce.sh
@@ -0,0 +1,9 @@
+#!/bin/bash
+
+# This file is only used for continuous evaluation.
+export FLAGS_cudnn_deterministic=True
+export ce_mode=1
+(CUDA_VISIBLE_DEVICES=2 python c_gan.py --batch_size=121 --epoch=1 --run_ce=True --use_gpu=True & \
+CUDA_VISIBLE_DEVICES=3 python dc_gan.py --batch_size=121 --epoch=1 --run_ce=True --use_gpu=True) | python _ce.py
+
+
diff --git a/PaddleCV/gan/c_gan/README.md b/PaddleCV/gan/c_gan/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..9f3c18fd0fb9a943f728548f655d3dd3cef73288
--- /dev/null
+++ b/PaddleCV/gan/c_gan/README.md
@@ -0,0 +1,76 @@
+
+
+Running the examples in this directory requires the latest PaddlePaddle develop version. If your installed PaddlePaddle is older than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update it.
+
+## Code structure
+```
+├── network.py # Defines the basic generator and discriminator networks.
+├── utility.py # Defines common utility functions.
+├── dc_gan.py # Training script for DCGAN.
+└── c_gan.py # Training script for conditionalGAN.
+```
+
+## Introduction
+TODO
+
+## Data preparation
+
+This tutorial uses the MNIST dataset to train and test the models; the dataset is downloaded automatically by the `paddle.dataset` module.
+
+## Training and testing conditionalGAN
+
+To train conditionalGAN on a single GPU:
+
+```
+env CUDA_VISIBLE_DEVICES=0 python c_gan.py --output="./result"
+```
+
+At fixed intervals during training, one batch of data is used for testing; the results are saved as images to the path specified by the `--output` option.
+
+Run `python c_gan.py --help` for more usage information and detailed parameter descriptions.
+
+Figure 1 shows the conditionalGAN training losses, where the horizontal axis is the number of training epochs and the vertical axis is the loss on the training set; 'G_loss' and 'D_loss' are the training losses of the generator and the discriminator, respectively. Figure 2 shows predictions of the conditionalGAN model after 19 epochs of training.
+
+(Figure 1: conditionalGAN training loss | Figure 2: conditionalGAN predictions)
+
+## Training and testing DCGAN
+
+To train DCGAN on a single GPU:
+
+```
+env CUDA_VISIBLE_DEVICES=0 python dc_gan.py --output="./result"
+```
+
+At fixed intervals during training, one batch of data is used for testing; the results are saved as images to the path specified by the `--output` option.
+
+Run `python dc_gan.py --help` for more usage information and detailed parameter descriptions.
+
+
+Figure 3 shows predictions of the DCGAN model after 10 epochs of training:
+
+(Figure 3: DCGAN predictions)
+
diff --git a/fluid/PaddleCV/gan/c_gan/_ce.py b/PaddleCV/gan/c_gan/_ce.py
similarity index 100%
rename from fluid/PaddleCV/gan/c_gan/_ce.py
rename to PaddleCV/gan/c_gan/_ce.py
diff --git a/PaddleCV/gan/c_gan/c_gan.py b/PaddleCV/gan/c_gan/c_gan.py
new file mode 100644
index 0000000000000000000000000000000000000000..ebf5f87fda33375e045022aca860e81b752ffaf5
--- /dev/null
+++ b/PaddleCV/gan/c_gan/c_gan.py
@@ -0,0 +1,204 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import sys
+import os
+import six
+import argparse
+import functools
+import matplotlib
+import numpy as np
+import paddle
+import time
+import paddle.fluid as fluid
+from utility import get_parent_function_name, plot, check, add_arguments, print_arguments
+from network import G_cond, D_cond
+matplotlib.use('agg')
+import matplotlib.pyplot as plt
+import matplotlib.gridspec as gridspec
+
+
+NOISE_SIZE = 100
+LEARNING_RATE = 2e-4
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('batch_size', int, 121, "Minibatch size.")
+add_arg('epoch', int, 20, "The number of epochs to train.")
+add_arg('output', str, "./output", "The directory to save the model and test results.")
+add_arg('use_gpu', bool, True, "Whether to use GPU to train.")
+add_arg('run_ce', bool, False, "Whether to run for continuous evaluation (CE).")
+# yapf: enable
+
+
+def loss(x, label):
+ return fluid.layers.mean(
+ fluid.layers.sigmoid_cross_entropy_with_logits(
+ x=x, label=label))
+
+
+def train(args):
+
+ if args.run_ce:
+ np.random.seed(10)
+ fluid.default_startup_program().random_seed = 90
+
+ d_program = fluid.Program()
+ dg_program = fluid.Program()
+
+ with fluid.program_guard(d_program):
+ conditions = fluid.layers.data(
+ name='conditions', shape=[1], dtype='float32')
+ img = fluid.layers.data(name='img', shape=[784], dtype='float32')
+ label = fluid.layers.data(name='label', shape=[1], dtype='float32')
+ d_logit = D_cond(img, conditions)
+ d_loss = loss(d_logit, label)
+
+ with fluid.program_guard(dg_program):
+ conditions = fluid.layers.data(
+ name='conditions', shape=[1], dtype='float32')
+ noise = fluid.layers.data(
+ name='noise', shape=[NOISE_SIZE], dtype='float32')
+ g_img = G_cond(z=noise, y=conditions)
+
+ g_program = dg_program.clone()
+ g_program_test = dg_program.clone(for_test=True)
+
+ dg_logit = D_cond(g_img, conditions)
+ dg_loss = loss(
+ dg_logit,
+ fluid.layers.fill_constant_batch_size_like(
+ input=noise, dtype='float32', shape=[-1, 1], value=1.0))
+
+ opt = fluid.optimizer.Adam(learning_rate=LEARNING_RATE)
+
+ opt.minimize(loss=d_loss)
+ parameters = [p.name for p in g_program.global_block().all_parameters()]
+
+ opt.minimize(loss=dg_loss, parameter_list=parameters)
+
+ exe = fluid.Executor(fluid.CPUPlace())
+ if args.use_gpu:
+ exe = fluid.Executor(fluid.CUDAPlace(0))
+ exe.run(fluid.default_startup_program())
+ if args.run_ce:
+ train_reader = paddle.batch(
+ paddle.dataset.mnist.train(),
+ batch_size=args.batch_size)
+ else:
+ train_reader = paddle.batch(
+ paddle.reader.shuffle(
+ paddle.dataset.mnist.train(), buf_size=60000),
+ batch_size=args.batch_size)
+
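+    # Update the generator twice for every discriminator update, a common
+    # heuristic to keep the adversarial game balanced.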
+ NUM_TRAIN_TIMES_OF_DG = 2
+ const_n = np.random.uniform(
+ low=-1.0, high=1.0,
+ size=[args.batch_size, NOISE_SIZE]).astype('float32')
+ t_time = 0
+    losses = [[], []]
+ for pass_id in range(args.epoch):
+ for batch_id, data in enumerate(train_reader()):
+ if len(data) != args.batch_size:
+ continue
+ noise_data = np.random.uniform(
+ low=-1.0, high=1.0,
+ size=[args.batch_size, NOISE_SIZE]).astype('float32')
+ real_image = np.array(list(map(lambda x: x[0], data))).reshape(
+ -1, 784).astype('float32')
+ conditions_data = np.array([x[1] for x in data]).reshape(
+ [-1, 1]).astype("float32")
+ real_labels = np.ones(
+ shape=[real_image.shape[0], 1], dtype='float32')
+ fake_labels = np.zeros(
+ shape=[real_image.shape[0], 1], dtype='float32')
+ total_label = np.concatenate([real_labels, fake_labels])
+ s_time = time.time()
+            generated_image = exe.run(
+                g_program,
+                feed={'noise': noise_data,
+                      'conditions': conditions_data},
+                fetch_list=[g_img])[0]
+
+ total_images = np.concatenate([real_image, generated_image])
+
+            d_loss_1 = exe.run(d_program,
+                               feed={
+                                   'img': generated_image,
+                                   'label': fake_labels,
+                                   'conditions': conditions_data
+                               },
+                               fetch_list=[d_loss])[0][0]
+
+            d_loss_2 = exe.run(d_program,
+                               feed={
+                                   'img': real_image,
+                                   'label': real_labels,
+                                   'conditions': conditions_data
+                               },
+                               fetch_list=[d_loss])[0][0]
+
+ d_loss_n = d_loss_1 + d_loss_2
+ losses[0].append(d_loss_n)
+ for _ in six.moves.xrange(NUM_TRAIN_TIMES_OF_DG):
+ noise_data = np.random.uniform(
+ low=-1.0, high=1.0,
+ size=[args.batch_size, NOISE_SIZE]).astype('float32')
+                dg_loss_n = exe.run(
+                    dg_program,
+                    feed={'noise': noise_data,
+                          'conditions': conditions_data},
+                    fetch_list=[dg_loss])[0][0]
+ losses[1].append(dg_loss_n)
+ batch_time = time.time() - s_time
+ t_time += batch_time
+
+ if batch_id % 10 == 0 and not args.run_ce:
+ if not os.path.exists(args.output):
+ os.makedirs(args.output)
+ # generate image each batch
+                generated_images = exe.run(
+                    g_program_test,
+                    feed={'noise': const_n,
+                          'conditions': conditions_data},
+                    fetch_list=[g_img])[0]
+ total_images = np.concatenate([real_image, generated_images])
+ fig = plot(total_images)
+ msg = "Epoch ID={0}\n Batch ID={1}\n D-Loss={2}\n DG-Loss={3}\n gen={4}\n " \
+ "Batch_time_cost={5:.2f}".format(
+ pass_id, batch_id, d_loss_n, dg_loss_n, check(generated_images), batch_time)
+ print(msg)
+ plt.title(msg)
+ plt.savefig(
+ '{}/{:04d}_{:04d}.png'.format(args.output, pass_id,
+ batch_id),
+ bbox_inches='tight')
+ plt.close(fig)
+
+ if args.run_ce:
+ print("kpis,cgan_d_train_cost,{}".format(np.mean(losses[0])))
+ print("kpis,cgan_g_train_cost,{}".format(np.mean(losses[1])))
+ print("kpis,cgan_duration,{}".format(t_time / args.epoch))
+
+
+if __name__ == "__main__":
+ args = parser.parse_args()
+ print_arguments(args)
+ train(args)
diff --git a/fluid/PaddleCV/gan/c_gan/dc_gan.py b/PaddleCV/gan/c_gan/dc_gan.py
similarity index 100%
rename from fluid/PaddleCV/gan/c_gan/dc_gan.py
rename to PaddleCV/gan/c_gan/dc_gan.py
diff --git a/fluid/PaddleCV/gan/c_gan/images/DCGAN_demo.png b/PaddleCV/gan/c_gan/images/DCGAN_demo.png
similarity index 100%
rename from fluid/PaddleCV/gan/c_gan/images/DCGAN_demo.png
rename to PaddleCV/gan/c_gan/images/DCGAN_demo.png
diff --git a/fluid/PaddleCV/gan/c_gan/images/conditionalGAN_demo.png b/PaddleCV/gan/c_gan/images/conditionalGAN_demo.png
similarity index 100%
rename from fluid/PaddleCV/gan/c_gan/images/conditionalGAN_demo.png
rename to PaddleCV/gan/c_gan/images/conditionalGAN_demo.png
diff --git a/fluid/PaddleCV/gan/c_gan/images/conditionalGAN_loss.png b/PaddleCV/gan/c_gan/images/conditionalGAN_loss.png
similarity index 100%
rename from fluid/PaddleCV/gan/c_gan/images/conditionalGAN_loss.png
rename to PaddleCV/gan/c_gan/images/conditionalGAN_loss.png
diff --git a/fluid/PaddleCV/gan/c_gan/network.py b/PaddleCV/gan/c_gan/network.py
similarity index 100%
rename from fluid/PaddleCV/gan/c_gan/network.py
rename to PaddleCV/gan/c_gan/network.py
diff --git a/fluid/PaddleCV/gan/c_gan/utility.py b/PaddleCV/gan/c_gan/utility.py
similarity index 100%
rename from fluid/PaddleCV/gan/c_gan/utility.py
rename to PaddleCV/gan/c_gan/utility.py
diff --git a/fluid/PaddleCV/gan/cycle_gan/.run_ce.sh b/PaddleCV/gan/cycle_gan/.run_ce.sh
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/.run_ce.sh
rename to PaddleCV/gan/cycle_gan/.run_ce.sh
diff --git a/PaddleCV/gan/cycle_gan/README.md b/PaddleCV/gan/cycle_gan/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..0a9be53c783a557b7c2306f65377e4cafa8cfd90
--- /dev/null
+++ b/PaddleCV/gan/cycle_gan/README.md
@@ -0,0 +1,91 @@
+
+
+Running the examples in this directory requires the latest PaddlePaddle develop version. If your installed PaddlePaddle is older than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update it.
+
+## Code structure
+```
+├── data_reader.py # Reads and preprocesses the data.
+├── layers.py # Wraps and defines the basic layers.
+├── model.py # Defines the basic generator and discriminator networks.
+├── trainer.py # Builds the losses and training networks.
+├── train.py # Training script.
+└── infer.py # Inference script.
+```
+
+## Introduction
+TODO
+
+## Data preparation
+
+This tutorial uses the horse2zebra dataset to train and test the model. The dataset was obtained by filtering the [ImageNet](http://www.image-net.org/) dataset with the keywords 'wild horse' and 'zebra' and downloading the matching images.
+
+The horse2zebra training set contains 1069 horse images and 1336 zebra images. The test set contains 121 horse images and 141 zebra images.
+
+After the data has been downloaded and processed, organize it into the following directory structure:
+
+```
+data
+|-- horse2zebra
+| |-- testA
+| |-- testA.txt
+| |-- testB
+| |-- testB.txt
+| |-- trainA
+| |-- trainA.txt
+| |-- trainB
+| `-- trainB.txt
+
+```
+
+The `data` folder must sit in the same directory as the training script `train.py`. `testA` holds the horse test images and `testB` holds the zebra test images; `testA.txt` and `testB.txt` list the paths of the horse and zebra test images respectively, in the following format:
+
+```
+testA/n02381460_9243.jpg
+testA/n02381460_9244.jpg
+testA/n02381460_9245.jpg
+```
+
+The training data is organized in the same way as the test data.
+
+
+## Model training and inference
+
+### Training
+
+To train on a single GPU:
+
+```
+env CUDA_VISIBLE_DEVICES=0 python train.py
+```
+
+Run `python train.py --help` for more usage information and detailed parameter descriptions.
+
+Figure 1 shows the training losses over 152 epochs, where the horizontal axis is the number of training epochs and the vertical axis is the loss on the training set; 'g_A_loss', 'g_B_loss', 'd_A_loss' and 'd_B_loss' are the training losses of generator A, generator B, discriminator A and discriminator B, respectively.
+
+(Figure 1: CycleGAN training losses)
+
+### Inference
+
+Run the following command to read multiple images and run inference:
+
+```
+env CUDA_VISIBLE_DEVICES=0 python infer.py \
+    --init_model="checkpoints/1" --input="./data/inputA/*" \
+    --input_style A --output="./output"
+```
+
+Figures 2 and 3 show predictions of the model after 150 epochs of training:
+
+(Figure 2: A-to-B results | Figure 3: B-to-A results)
+
diff --git a/fluid/PaddleCV/gan/cycle_gan/_ce.py b/PaddleCV/gan/cycle_gan/_ce.py
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/_ce.py
rename to PaddleCV/gan/cycle_gan/_ce.py
diff --git a/fluid/PaddleCV/gan/cycle_gan/data/horse2zebra/trainA.txt b/PaddleCV/gan/cycle_gan/data/horse2zebra/trainA.txt
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/data/horse2zebra/trainA.txt
rename to PaddleCV/gan/cycle_gan/data/horse2zebra/trainA.txt
diff --git a/fluid/PaddleCV/gan/cycle_gan/data/horse2zebra/trainA/n02381460_1001.jpg b/PaddleCV/gan/cycle_gan/data/horse2zebra/trainA/n02381460_1001.jpg
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/data/horse2zebra/trainA/n02381460_1001.jpg
rename to PaddleCV/gan/cycle_gan/data/horse2zebra/trainA/n02381460_1001.jpg
diff --git a/fluid/PaddleCV/gan/cycle_gan/data/horse2zebra/trainB.txt b/PaddleCV/gan/cycle_gan/data/horse2zebra/trainB.txt
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/data/horse2zebra/trainB.txt
rename to PaddleCV/gan/cycle_gan/data/horse2zebra/trainB.txt
diff --git a/fluid/PaddleCV/gan/cycle_gan/data/horse2zebra/trainB/n02391049_10007.jpg b/PaddleCV/gan/cycle_gan/data/horse2zebra/trainB/n02391049_10007.jpg
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/data/horse2zebra/trainB/n02391049_10007.jpg
rename to PaddleCV/gan/cycle_gan/data/horse2zebra/trainB/n02391049_10007.jpg
diff --git a/fluid/PaddleCV/gan/cycle_gan/data_reader.py b/PaddleCV/gan/cycle_gan/data_reader.py
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/data_reader.py
rename to PaddleCV/gan/cycle_gan/data_reader.py
diff --git a/fluid/PaddleCV/gan/cycle_gan/images/A2B.jpg b/PaddleCV/gan/cycle_gan/images/A2B.jpg
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/images/A2B.jpg
rename to PaddleCV/gan/cycle_gan/images/A2B.jpg
diff --git a/fluid/PaddleCV/gan/cycle_gan/images/B2A.jpg b/PaddleCV/gan/cycle_gan/images/B2A.jpg
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/images/B2A.jpg
rename to PaddleCV/gan/cycle_gan/images/B2A.jpg
diff --git a/fluid/PaddleCV/gan/cycle_gan/images/cycleGAN_loss.png b/PaddleCV/gan/cycle_gan/images/cycleGAN_loss.png
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/images/cycleGAN_loss.png
rename to PaddleCV/gan/cycle_gan/images/cycleGAN_loss.png
diff --git a/PaddleCV/gan/cycle_gan/infer.py b/PaddleCV/gan/cycle_gan/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..2282cf6c6aa6d41b1caba41a3cf42c08aea7ecef
--- /dev/null
+++ b/PaddleCV/gan/cycle_gan/infer.py
@@ -0,0 +1,69 @@
+import argparse
+import functools
+import os
+from PIL import Image
+from paddle.fluid import core
+import paddle.fluid as fluid
+import paddle
+import numpy as np
+from scipy.misc import imsave
+from model import *
+import glob
+from utility import add_arguments, print_arguments
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('input', str, None, "The images to be inferred.")
+add_arg('output', str, "./infer_result", "The directory to save the inference results.")
+add_arg('init_model', str, None, "The init model file or directory.")
+add_arg('input_style', str, "A", "The style of the input, A or B.")
+add_arg('use_gpu', bool, True, "Whether to use GPU.")
+# yapf: enable
+
+
+def infer(args):
+ data_shape = [-1, 3, 256, 256]
+ input = fluid.layers.data(name='input', shape=data_shape, dtype='float32')
+ if args.input_style == "A":
+ model_name = 'g_a'
+ fake = build_generator_resnet_9blocks(input, name="g_A")
+ elif args.input_style == "B":
+ model_name = 'g_b'
+ fake = build_generator_resnet_9blocks(input, name="g_B")
+ else:
+ raise "Input with style [%s] is not supported." % args.input_style
+ # prepare environment
+ place = fluid.CPUPlace()
+ if args.use_gpu:
+ place = fluid.CUDAPlace(0)
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+ fluid.io.load_persistables(exe, args.init_model + "/" + model_name)
+
+ if not os.path.exists(args.output):
+ os.makedirs(args.output)
+ for file in glob.glob(args.input):
+ print "read %s" % file
+ image_name = os.path.basename(file)
+ image = Image.open(file)
+ image = image.resize((256, 256))
+ image = np.array(image) / 127.5 - 1
+ if len(image.shape) != 3:
+ continue
+ data = image.transpose([2, 0, 1])[np.newaxis, :].astype("float32")
+ tensor = core.LoDTensor()
+ tensor.set(data, place)
+
+ fake_temp = exe.run(fetch_list=[fake.name], feed={"input": tensor})
+ fake_temp = np.squeeze(fake_temp[0]).transpose([1, 2, 0])
+ input_temp = np.squeeze(data).transpose([1, 2, 0])
+
+ imsave(args.output + "/fake_" + image_name, (
+ (fake_temp + 1) * 127.5).astype(np.uint8))
+
+
+if __name__ == "__main__":
+ args = parser.parse_args()
+ print_arguments(args)
+ infer(args)
diff --git a/PaddleCV/gan/cycle_gan/layers.py b/PaddleCV/gan/cycle_gan/layers.py
new file mode 100644
index 0000000000000000000000000000000000000000..d918946fd659c4b02784c0f9060dd3011cb5840e
--- /dev/null
+++ b/PaddleCV/gan/cycle_gan/layers.py
@@ -0,0 +1,161 @@
+from __future__ import division
+import paddle.fluid as fluid
+import numpy as np
+import os
+
+# cudnn is not better when batch size is 1.
+use_cudnn = False
+if 'ce_mode' in os.environ:
+ use_cudnn = False
+
+
+def cal_padding(img_size, stride, filter_size, dilation=1):
+ """Calculate padding size."""
+ valid_filter_size = dilation * (filter_size - 1) + 1
+ if img_size % stride == 0:
+ out_size = max(filter_size - stride, 0)
+ else:
+ out_size = max(filter_size - (img_size % stride), 0)
+ return out_size // 2, out_size - out_size // 2
+
+
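+# Instance normalization built from primitive ops: per-sample, per-channel
+# mean/variance over the spatial dims (2, 3), with learnable per-channel
+# scale and offset parameters.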
+def instance_norm(input, name=None):
+ # TODO(lvmengsi@baidu.com): Check the accuracy when using fluid.layers.layer_norm.
+ # return fluid.layers.layer_norm(input, begin_norm_axis=2)
+ helper = fluid.layer_helper.LayerHelper("instance_norm", **locals())
+ dtype = helper.input_dtype()
+ epsilon = 1e-5
+ mean = fluid.layers.reduce_mean(input, dim=[2, 3], keep_dim=True)
+ var = fluid.layers.reduce_mean(
+ fluid.layers.square(input - mean), dim=[2, 3], keep_dim=True)
+ if name is not None:
+ scale_name = name + "_scale"
+ offset_name = name + "_offset"
+ scale_param = fluid.ParamAttr(
+ name=scale_name,
+ initializer=fluid.initializer.TruncatedNormal(1.0, 0.02),
+ trainable=True)
+ offset_param = fluid.ParamAttr(
+ name=offset_name,
+ initializer=fluid.initializer.Constant(0.0),
+ trainable=True)
+ scale = helper.create_parameter(
+ attr=scale_param, shape=input.shape[1:2], dtype=dtype)
+ offset = helper.create_parameter(
+ attr=offset_param, shape=input.shape[1:2], dtype=dtype)
+
+ tmp = fluid.layers.elementwise_mul(x=(input - mean), y=scale, axis=1)
+ tmp = tmp / fluid.layers.sqrt(var + epsilon)
+ tmp = fluid.layers.elementwise_add(tmp, offset, axis=1)
+ return tmp
+
+
+def conv2d(input,
+ num_filters=64,
+ filter_size=7,
+ stride=1,
+ stddev=0.02,
+ padding="VALID",
+ name="conv2d",
+ norm=True,
+ relu=True,
+ relufactor=0.0):
+ """Wrapper for conv2d op to support VALID and SAME padding mode."""
+ need_crop = False
+ if padding == "SAME":
+ top_padding, bottom_padding = cal_padding(input.shape[2], stride,
+ filter_size)
+ left_padding, right_padding = cal_padding(input.shape[2], stride,
+ filter_size)
+ height_padding = bottom_padding
+ width_padding = right_padding
+ if top_padding != bottom_padding or left_padding != right_padding:
+ height_padding = top_padding + stride
+ width_padding = left_padding + stride
+ need_crop = True
+ else:
+ height_padding = 0
+ width_padding = 0
+
+ padding = [height_padding, width_padding]
+ param_attr = fluid.ParamAttr(
+ name=name + "_w",
+ initializer=fluid.initializer.TruncatedNormal(scale=stddev))
+ bias_attr = fluid.ParamAttr(
+ name=name + "_b", initializer=fluid.initializer.Constant(0.0))
+ conv = fluid.layers.conv2d(
+ input,
+ num_filters,
+ filter_size,
+ name=name,
+ stride=stride,
+ padding=padding,
+ use_cudnn=use_cudnn,
+ param_attr=param_attr,
+ bias_attr=bias_attr)
+ if need_crop:
+ conv = fluid.layers.crop(
+ conv,
+ shape=(-1, conv.shape[1], conv.shape[2] - 1, conv.shape[3] - 1),
+ offsets=(0, 0, 1, 1))
+ if norm:
+ conv = instance_norm(input=conv, name=name + "_norm")
+ if relu:
+ conv = fluid.layers.leaky_relu(conv, alpha=relufactor)
+ return conv
+
+
+def deconv2d(input,
+ out_shape,
+ num_filters=64,
+ filter_size=7,
+ stride=1,
+ stddev=0.02,
+ padding="VALID",
+ name="conv2d",
+ norm=True,
+ relu=True,
+ relufactor=0.0):
+ """Wrapper for deconv2d op to support VALID and SAME padding mode."""
+ need_crop = False
+ if padding == "SAME":
+ top_padding, bottom_padding = cal_padding(out_shape[0], stride,
+ filter_size)
+ left_padding, right_padding = cal_padding(out_shape[1], stride,
+ filter_size)
+ height_padding = top_padding
+ width_padding = left_padding
+ if top_padding != bottom_padding or left_padding != right_padding:
+ need_crop = True
+ else:
+ height_padding = 0
+ width_padding = 0
+
+ padding = [height_padding, width_padding]
+
+ param_attr = fluid.ParamAttr(
+ name=name + "_w",
+ initializer=fluid.initializer.TruncatedNormal(scale=stddev))
+ bias_attr = fluid.ParamAttr(
+ name=name + "_b", initializer=fluid.initializer.Constant(0.0))
+ conv = fluid.layers.conv2d_transpose(
+ input,
+ num_filters,
+ name=name,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ use_cudnn=use_cudnn,
+ param_attr=param_attr,
+ bias_attr=bias_attr)
+
+ if need_crop:
+ conv = fluid.layers.crop(
+ conv,
+ shape=(-1, conv.shape[1], conv.shape[2] - 1, conv.shape[3] - 1),
+ offsets=(0, 0, 0, 0))
+ if norm:
+ conv = instance_norm(input=conv, name=name + "_norm")
+ if relu:
+ conv = fluid.layers.leaky_relu(conv, alpha=relufactor)
+ return conv
diff --git a/fluid/PaddleCV/gan/cycle_gan/model.py b/PaddleCV/gan/cycle_gan/model.py
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/model.py
rename to PaddleCV/gan/cycle_gan/model.py
diff --git a/PaddleCV/gan/cycle_gan/train.py b/PaddleCV/gan/cycle_gan/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..3f7bb03929958e56b22c5ae3288bd7e20c5b6058
--- /dev/null
+++ b/PaddleCV/gan/cycle_gan/train.py
@@ -0,0 +1,239 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import random
+import sys
+import paddle
+import argparse
+import functools
+import time
+import numpy as np
+from scipy.misc import imsave
+import paddle.fluid as fluid
+import paddle.fluid.profiler as profiler
+from paddle.fluid import core
+import data_reader
+from utility import add_arguments, print_arguments, ImagePool
+from trainer import *
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('batch_size', int, 1, "Minibatch size.")
+add_arg('epoch', int, 2, "The number of epochs to train.")
+add_arg('output', str, "./output_0", "The directory to save the model and test results.")
+add_arg('init_model', str, None, "The init model file or directory.")
+add_arg('save_checkpoints', bool, True, "Whether to save checkpoints.")
+add_arg('run_test', bool, True, "Whether to run test.")
+add_arg('use_gpu', bool, True, "Whether to use GPU to train.")
+add_arg('profile', bool, False, "Whether to profile.")
+add_arg('run_ce', bool, False, "Whether to run for continuous evaluation (CE).")
+# yapf: enable
+
+
+def train(args):
+
+ max_images_num = data_reader.max_images_num()
+ shuffle = True
+ if args.run_ce:
+ np.random.seed(10)
+ fluid.default_startup_program().random_seed = 90
+ max_images_num = 1
+ shuffle = False
+ data_shape = [-1] + data_reader.image_shape()
+
+ input_A = fluid.layers.data(
+ name='input_A', shape=data_shape, dtype='float32')
+ input_B = fluid.layers.data(
+ name='input_B', shape=data_shape, dtype='float32')
+ fake_pool_A = fluid.layers.data(
+ name='fake_pool_A', shape=data_shape, dtype='float32')
+ fake_pool_B = fluid.layers.data(
+ name='fake_pool_B', shape=data_shape, dtype='float32')
+
+ g_A_trainer = GATrainer(input_A, input_B)
+ g_B_trainer = GBTrainer(input_A, input_B)
+ d_A_trainer = DATrainer(input_A, fake_pool_A)
+ d_B_trainer = DBTrainer(input_B, fake_pool_B)
+
+ # prepare environment
+ place = fluid.CPUPlace()
+ if args.use_gpu:
+ place = fluid.CUDAPlace(0)
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
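+    # The image pools keep a history of previously generated images; feeding
+    # the discriminators a mix of current and past fakes (as in the CycleGAN
+    # paper) reduces oscillation during adversarial training.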
+ A_pool = ImagePool()
+ B_pool = ImagePool()
+
+ A_reader = paddle.batch(
+ data_reader.a_reader(shuffle=shuffle), args.batch_size)()
+ B_reader = paddle.batch(
+ data_reader.b_reader(shuffle=shuffle), args.batch_size)()
+ if not args.run_ce:
+ A_test_reader = data_reader.a_test_reader()
+ B_test_reader = data_reader.b_test_reader()
+
+ def test(epoch):
+ out_path = args.output + "/test"
+ if not os.path.exists(out_path):
+ os.makedirs(out_path)
+ for data_A, data_B in zip(A_test_reader(), B_test_reader()):
+ A_name = data_A[1]
+ B_name = data_B[1]
+ tensor_A = core.LoDTensor()
+ tensor_B = core.LoDTensor()
+ tensor_A.set(data_A[0], place)
+ tensor_B.set(data_B[0], place)
+ fake_A_temp, fake_B_temp, cyc_A_temp, cyc_B_temp = exe.run(
+ g_A_trainer.infer_program,
+ fetch_list=[
+ g_A_trainer.fake_A, g_A_trainer.fake_B, g_A_trainer.cyc_A,
+ g_A_trainer.cyc_B
+ ],
+ feed={"input_A": tensor_A,
+ "input_B": tensor_B})
+ fake_A_temp = np.squeeze(fake_A_temp[0]).transpose([1, 2, 0])
+ fake_B_temp = np.squeeze(fake_B_temp[0]).transpose([1, 2, 0])
+ cyc_A_temp = np.squeeze(cyc_A_temp[0]).transpose([1, 2, 0])
+ cyc_B_temp = np.squeeze(cyc_B_temp[0]).transpose([1, 2, 0])
+ input_A_temp = np.squeeze(data_A[0]).transpose([1, 2, 0])
+ input_B_temp = np.squeeze(data_B[0]).transpose([1, 2, 0])
+
+ imsave(out_path + "/fakeB_" + str(epoch) + "_" + A_name, (
+ (fake_B_temp + 1) * 127.5).astype(np.uint8))
+ imsave(out_path + "/fakeA_" + str(epoch) + "_" + B_name, (
+ (fake_A_temp + 1) * 127.5).astype(np.uint8))
+ imsave(out_path + "/cycA_" + str(epoch) + "_" + A_name, (
+ (cyc_A_temp + 1) * 127.5).astype(np.uint8))
+ imsave(out_path + "/cycB_" + str(epoch) + "_" + B_name, (
+ (cyc_B_temp + 1) * 127.5).astype(np.uint8))
+ imsave(out_path + "/inputA_" + str(epoch) + "_" + A_name, (
+ (input_A_temp + 1) * 127.5).astype(np.uint8))
+ imsave(out_path + "/inputB_" + str(epoch) + "_" + B_name, (
+ (input_B_temp + 1) * 127.5).astype(np.uint8))
+
+ def checkpoints(epoch):
+ out_path = args.output + "/checkpoints/" + str(epoch)
+ if not os.path.exists(out_path):
+ os.makedirs(out_path)
+ fluid.io.save_persistables(
+ exe, out_path + "/g_a", main_program=g_A_trainer.program)
+ fluid.io.save_persistables(
+ exe, out_path + "/g_b", main_program=g_B_trainer.program)
+ fluid.io.save_persistables(
+ exe, out_path + "/d_a", main_program=d_A_trainer.program)
+ fluid.io.save_persistables(
+ exe, out_path + "/d_b", main_program=d_B_trainer.program)
+ print("saved checkpoint to {}".format(out_path))
+ sys.stdout.flush()
+
+ def init_model():
+        assert os.path.exists(
+            args.init_model), "[%s] can't be found." % args.init_model
+ fluid.io.load_persistables(
+ exe, args.init_model + "/g_a", main_program=g_A_trainer.program)
+ fluid.io.load_persistables(
+ exe, args.init_model + "/g_b", main_program=g_B_trainer.program)
+ fluid.io.load_persistables(
+ exe, args.init_model + "/d_a", main_program=d_A_trainer.program)
+ fluid.io.load_persistables(
+ exe, args.init_model + "/d_b", main_program=d_B_trainer.program)
+ print("Load model from {}".format(args.init_model))
+
+ if args.init_model:
+ init_model()
+ losses = [[], []]
+ t_time = 0
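+    # Disable inplace and memory-reuse passes so that variables fetched from
+    # one program (e.g. the fake images recycled through the image pools)
+    # are not overwritten by buffer reuse in another.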
+ build_strategy = fluid.BuildStrategy()
+ build_strategy.enable_inplace = False
+ build_strategy.memory_optimize = False
+
+ g_A_trainer_program = fluid.CompiledProgram(
+ g_A_trainer.program).with_data_parallel(
+ loss_name=g_A_trainer.g_loss_A.name, build_strategy=build_strategy)
+ g_B_trainer_program = fluid.CompiledProgram(
+ g_B_trainer.program).with_data_parallel(
+ loss_name=g_B_trainer.g_loss_B.name, build_strategy=build_strategy)
+ d_B_trainer_program = fluid.CompiledProgram(
+ d_B_trainer.program).with_data_parallel(
+ loss_name=d_B_trainer.d_loss_B.name, build_strategy=build_strategy)
+ d_A_trainer_program = fluid.CompiledProgram(
+ d_A_trainer.program).with_data_parallel(
+ loss_name=d_A_trainer.d_loss_A.name, build_strategy=build_strategy)
+ for epoch in range(args.epoch):
+ batch_id = 0
+ for i in range(max_images_num):
+ data_A = next(A_reader)
+ data_B = next(B_reader)
+ tensor_A = core.LoDTensor()
+ tensor_B = core.LoDTensor()
+ tensor_A.set(data_A, place)
+ tensor_B.set(data_B, place)
+ s_time = time.time()
+ # optimize the g_A network
+ g_A_loss, fake_B_tmp = exe.run(
+ g_A_trainer_program,
+ fetch_list=[g_A_trainer.g_loss_A, g_A_trainer.fake_B],
+ feed={"input_A": tensor_A,
+ "input_B": tensor_B})
+
+ fake_pool_B = B_pool.pool_image(fake_B_tmp)
+
+ # optimize the d_B network
+ d_B_loss = exe.run(
+ d_B_trainer_program,
+ fetch_list=[d_B_trainer.d_loss_B],
+ feed={"input_B": tensor_B,
+ "fake_pool_B": fake_pool_B})[0]
+
+ # optimize the g_B network
+ g_B_loss, fake_A_tmp = exe.run(
+ g_B_trainer_program,
+ fetch_list=[g_B_trainer.g_loss_B, g_B_trainer.fake_A],
+ feed={"input_A": tensor_A,
+ "input_B": tensor_B})
+
+ fake_pool_A = A_pool.pool_image(fake_A_tmp)
+
+ # optimize the d_A network
+ d_A_loss = exe.run(
+ d_A_trainer_program,
+ fetch_list=[d_A_trainer.d_loss_A],
+ feed={"input_A": tensor_A,
+ "fake_pool_A": fake_pool_A})[0]
+ batch_time = time.time() - s_time
+ t_time += batch_time
+            print(
+                "epoch{}; batch{}; g_A_loss: {}; d_B_loss: {}; g_B_loss: {}; d_A_loss: {}; "
+                "Batch_time_cost: {:.2f}".format(epoch, batch_id, g_A_loss[0],
+                                                 d_B_loss[0], g_B_loss[0],
+                                                 d_A_loss[0], batch_time))
+ losses[0].append(g_A_loss[0])
+ losses[1].append(d_A_loss[0])
+ sys.stdout.flush()
+ batch_id += 1
+
+ if args.run_test and not args.run_ce:
+ test(epoch)
+ if args.save_checkpoints and not args.run_ce:
+ checkpoints(epoch)
+ if args.run_ce:
+ print("kpis,g_train_cost,{}".format(np.mean(losses[0])))
+ print("kpis,d_train_cost,{}".format(np.mean(losses[1])))
+ print("kpis,duration,{}".format(t_time / args.epoch))
+
+
+if __name__ == "__main__":
+ args = parser.parse_args()
+ print_arguments(args)
+ if args.profile:
+ if args.use_gpu:
+ with profiler.cuda_profiler("cuda_profiler.txt", 'csv') as nvprof:
+ train(args)
+ else:
+ with profiler.profiler("CPU", sorted_key='total') as cpuprof:
+ train(args)
+ else:
+ train(args)
diff --git a/fluid/PaddleCV/gan/cycle_gan/trainer.py b/PaddleCV/gan/cycle_gan/trainer.py
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/trainer.py
rename to PaddleCV/gan/cycle_gan/trainer.py
diff --git a/fluid/PaddleCV/gan/cycle_gan/utility.py b/PaddleCV/gan/cycle_gan/utility.py
similarity index 100%
rename from fluid/PaddleCV/gan/cycle_gan/utility.py
rename to PaddleCV/gan/cycle_gan/utility.py
diff --git a/PaddleCV/human_pose_estimation/README.md b/PaddleCV/human_pose_estimation/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..c563ce31b6a7d84de85b3d41f70a153f3e4fb7b7
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/README.md
@@ -0,0 +1,117 @@
+# Simple Baselines for Human Pose Estimation in Fluid
+
+## Introduction
+This is a simple demonstration of a re-implementation of the paper [Simple Baselines for Human Pose Estimation and Tracking](https://arxiv.org/abs/1804.06208) (ECCV'18) from MSRA in [PaddlePaddle.Fluid](http://www.paddlepaddle.org/en).
+
+
+
+> **Demo video**: *Bruno Mars - That’s What I Like [Official Video]*.
+
+## Requirements
+
+ - Python == 2.7 or 3.6
+ - PaddlePaddle >= 1.1.0 (<= 1.3.0)
+ - opencv-python >= 3.3
+
+## Environment
+
+The code is developed and tested on 4 Tesla K40/P40 GPU cards on CentOS with CUDA-9.0/8.0 and cuDNN-7.0 installed.
+
+## Results on MPII Val
+| Arch | Head | Shoulder | Elbow | Wrist | Hip | Knee | Ankle | Mean | Mean@0.1| Models |
+| ---- |:----:|:--------:|:-----:|:-----:|:---:|:----:|:-----:|:----:|:-------:|:------:|
+| 256x256\_pose\_resnet\_50 in PyTorch | 96.351 | 95.329 | 88.989 | 83.176 | 88.420 | 83.960 | 79.594 | 88.532 | 33.911 | - |
+| 256x256\_pose\_resnet\_50 in Fluid | 96.385 | 95.363 | 89.211 | 84.084 | 88.454 | 84.182 | 79.546 | 88.748 | 33.750 | [`link`](https://paddlemodels.bj.bcebos.com/pose/pose-resnet50-mpii-256x256.tar.gz) |
+| 384x384\_pose\_resnet\_50 in PyTorch | 96.658 | 95.754 | 89.790 | 84.614 | 88.523 | 84.666 | 79.287 | 89.066 | 38.046 | - |
+| 384x384\_pose\_resnet\_50 in Fluid | 96.862 | 95.635 | 90.046 | 85.557 | 88.818 | 84.948 | 78.484 | 89.235 | 38.093 | [`link`](https://paddlemodels.bj.bcebos.com/pose/pose-resnet50-mpii-384x384.tar.gz) |
+
+## Results on COCO val2017 (with a detector achieving human AP of 56.4 on COCO val2017)
+| Arch | AP | Ap .5 | AP .75 | AP (M) | AP (L) | AR | AR .5 | AR .75 | AR (M) | AR (L) | Models |
+| ---- |:--:|:-----:|:------:|:------:|:------:|:--:|:-----:|:------:|:------:|:------:|:------:|
+| 256x192\_pose\_resnet\_50 in PyTorch | 0.704 | 0.886 | 0.783 | 0.671 | 0.772 | 0.763 | 0.929 | 0.834 | 0.721 | 0.824 | - |
+| 256x192\_pose\_resnet\_50 in Fluid | 0.712 | 0.897 | 0.786 | 0.683 | 0.756 | 0.741 | 0.906 | 0.806 | 0.709 | 0.790 | [`link`](https://paddlemodels.bj.bcebos.com/pose/pose-resnet50-coco-256x192.tar.gz) |
+| 384x288\_pose\_resnet\_50 in PyTorch | 0.722 | 0.893 | 0.789 | 0.681 | 0.797 | 0.776 | 0.932 | 0.838 | 0.728 | 0.846 | - |
+| 384x288\_pose\_resnet\_50 in Fluid | 0.727 | 0.897 | 0.796 | 0.690 | 0.783 | 0.754 | 0.907 | 0.813 | 0.714 | 0.814 | [`link`](https://paddlemodels.bj.bcebos.com/pose/pose-resnet50-coco-384x288.tar.gz) |
+
+### Notes:
+
+ - Flip test is used.
+ - We did not search extensively for the best model; the last saved checkpoint is simply used for validation.
+
+## Getting Started
+
+### Prepare Datasets and Pretrained Models
+
+ - Follow the [instructions](https://github.com/Microsoft/human-pose-estimation.pytorch#data-preparation) to prepare the datasets.
+ - Download the pretrained ResNet-50 model in PaddlePaddle.Fluid on ImageNet from [Model Zoo](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification#supported-models-and-performances).
+
+```bash
+wget http://paddle-imagenet-models.bj.bcebos.com/resnet_50_model.tar
+```
+
+Then, put them in the folder `pretrained` under the root directory of this repo, so that the layout looks like:
+
+```
+${THIS REPO ROOT}
+ `-- pretrained
+ `-- resnet_50
+ |-- 115
+ `-- data
+ `-- coco
+ |-- annotations
+ |-- images
+ `-- mpii
+ |-- annot
+ |-- images
+```
+
+### Install [COCOAPI](https://github.com/cocodataset/cocoapi)
+
+```bash
+# COCOAPI=/path/to/clone/cocoapi
+git clone https://github.com/cocodataset/cocoapi.git $COCOAPI
+cd $COCOAPI/PythonAPI
+# if cython is not installed
+pip install Cython
+# Install into global site-packages
+make install
+# Alternatively, if you do not have permissions or prefer
+# not to install the COCO API into global site-packages
+python2 setup.py install --user
+```
+
+### Perform Validation
+
+Download the checkpoint of Pose-ResNet-50 trained on the MPII dataset from [here](https://paddlemodels.bj.bcebos.com/pose/pose-resnet50-mpii-384x384.tar.gz), extract it into the folder `checkpoints` under the root directory of this repo, and then run
+
+```bash
+python val.py --dataset 'mpii' --checkpoint 'checkpoints/pose-resnet50-mpii-384x384' --data_root 'data/mpii'
+```
+
+### Perform Training
+
+```bash
+python train.py --dataset 'mpii' --data_root 'data/mpii'
+```
+
+**Note**: Configurations for training are aggregated in `lib/mpii_reader.py` and `lib/coco_reader.py`; see the sketch below.
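+
+For example, the MPII settings in `lib/mpii_reader.py` live in a plain `Config` class; the data-augmentation and input-size fields below are copied from that file and can be edited directly:
+
+```python
+class Config:
+    SCALE_FACTOR = 0.3       # random scale augmentation range
+    ROT_FACTOR = 40          # random rotation range (degrees)
+    FLIP = True              # random horizontal flip
+    SIGMA = 3                # Gaussian sigma of the target heatmaps
+    IMAGE_SIZE = [384, 384]  # network input size
+    HEATMAP_SIZE = [96, 96]  # output heatmap size
+```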
+
+### Perform Test on Images
+
+We also support applying the pre-trained models to custom images.
+
+Put the images into the folder `test` under the root directory of this repo, then run
+
+```bash
+python test.py --checkpoint 'checkpoints/pose-resnet-50-384x384-mpii'
+```
+
+Because this simple baseline for human pose estimation is a top-down method, if there are multiple persons in an image, a detector such as [Faster R-CNN](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/rcnn), [SSD](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/object_detection) or others should be used first to crop each person out, for example:
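+
+The snippet below is a minimal sketch, not part of this repo; `boxes` stands in for the (x, y, w, h) person detections returned by whatever detector is used:
+
+```python
+import os
+import cv2
+
+img = cv2.imread('street.jpg')    # any image with people in it
+boxes = [(40, 60, 180, 420)]      # placeholder detections (x, y, w, h)
+if not os.path.exists('test'):
+    os.makedirs('test')
+for i, (x, y, w, h) in enumerate(boxes):
+    crop = img[y:y + h, x:x + w]  # crop one person out
+    cv2.imwrite('test/person_%d.jpg' % i, crop)
+```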
+
+## Reference
+
+ - Simple Baselines for Human Pose Estimation and Tracking in PyTorch [`code`](https://github.com/Microsoft/human-pose-estimation.pytorch#data-preparation)
+
+## License
+
+This code is released under the Apache License 2.0.
diff --git a/PaddleCV/human_pose_estimation/README_cn.md b/PaddleCV/human_pose_estimation/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..86f71aa580c96789f73e87917921ea3456f03241
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/README_cn.md
@@ -0,0 +1,109 @@
+# Keypoint Detection (Simple Baselines for Human Pose Estimation)
+
+## Introduction
+This directory contains a reproduction of the paper [Simple Baselines for Human Pose Estimation and Tracking](https://arxiv.org/abs/1804.06208) (ECCV'18).
+
+
+
+> **Demo video**: *Bruno Mars - That’s What I Like [Official Video]*.
+
+## Requirements
+
+The code in this directory has been tested on 4 Tesla K40/P40 GPU cards on CentOS with CUDA-9.0/8.0 and cuDNN-7.0.
+
+ - Python == 2.7 / 3.6
+ - PaddlePaddle >= 1.1.0 (<= 1.3.0)
+ - opencv-python >= 3.3
+
+## Results on MPII Val
+| Arch | Head | Shoulder | Elbow | Wrist | Hip | Knee | Ankle | Mean | Mean@0.1| Models |
+| ---- |:----:|:--------:|:-----:|:-----:|:---:|:----:|:-----:|:----:|:-------:|:------:|
+| 256x256\_pose\_resnet\_50 in PyTorch | 96.351 | 95.329 | 88.989 | 83.176 | 88.420 | 83.960 | 79.594 | 88.532 | 33.911 | - |
+| 256x256\_pose\_resnet\_50 in Fluid | 96.385 | 95.363 | 89.211 | 84.084 | 88.454 | 84.182 | 79.546 | 88.748 | 33.750 | [`link`](https://paddlemodels.bj.bcebos.com/pose/pose-resnet50-mpii-256x256.tar.gz) |
+| 384x384\_pose\_resnet\_50 in PyTorch | 96.658 | 95.754 | 89.790 | 84.614 | 88.523 | 84.666 | 79.287 | 89.066 | 38.046 | - |
+| 384x384\_pose\_resnet\_50 in Fluid | 96.862 | 95.635 | 90.046 | 85.557 | 88.818 | 84.948 | 78.484 | 89.235 | 38.093 | [`link`](https://paddlemodels.bj.bcebos.com/pose/pose-resnet50-mpii-384x384.tar.gz) |
+
+## Results on COCO val2017 (with a detector achieving human AP of 56.4 on COCO val2017)
+| Arch | AP | Ap .5 | AP .75 | AP (M) | AP (L) | AR | AR .5 | AR .75 | AR (M) | AR (L) | Models |
+| ---- |:--:|:-----:|:------:|:------:|:------:|:--:|:-----:|:------:|:------:|:------:|:------:|
+| 256x192\_pose\_resnet\_50 in PyTorch | 0.704 | 0.886 | 0.783 | 0.671 | 0.772 | 0.763 | 0.929 | 0.834 | 0.721 | 0.824 | - |
+| 256x192\_pose\_resnet\_50 in Fluid | 0.712 | 0.897 | 0.786 | 0.683 | 0.756 | 0.741 | 0.906 | 0.806 | 0.709 | 0.790 | [`link`](https://paddlemodels.bj.bcebos.com/pose/pose-resnet50-coco-256x192.tar.gz) |
+| 384x288\_pose\_resnet\_50 in PyTorch | 0.722 | 0.893 | 0.789 | 0.681 | 0.797 | 0.776 | 0.932 | 0.838 | 0.728 | 0.846 | - |
+| 384x288\_pose\_resnet\_50 in Fluid | 0.727 | 0.897 | 0.796 | 0.690 | 0.783 | 0.754 | 0.907 | 0.813 | 0.714 | 0.814 | [`link`](https://paddlemodels.bj.bcebos.com/pose/pose-resnet50-coco-384x288.tar.gz) |
+
+### Notes
+
+ - Flip test is used.
+ - No hyperparameter search was performed for these results; training with the configurations below and taking the model after the last epoch reproduces the numbers above.
+
+## Getting Started
+
+### Prepare Datasets and Pretrained Models
+
+ - Follow the [instructions](https://github.com/Microsoft/human-pose-estimation.pytorch#data-preparation) to prepare the datasets.
+ - Download the pretrained ResNet-50 model.
+
+```bash
+wget http://paddle-imagenet-models.bj.bcebos.com/resnet_50_model.tar
+```
+
+After downloading, extract the model into the folder `pretrained` under the root directory of this repo. The default file tree is:
+
+```
+${THIS REPO ROOT}
+ `-- pretrained
+ `-- resnet_50
+ |-- 115
+ `-- data
+ `-- coco
+ |-- annotations
+ |-- images
+ `-- mpii
+ |-- annot
+ |-- images
+```
+
+### Install [COCOAPI](https://github.com/cocodataset/cocoapi)
+
+```bash
+# COCOAPI=/path/to/clone/cocoapi
+git clone https://github.com/cocodataset/cocoapi.git $COCOAPI
+cd $COCOAPI/PythonAPI
+# if cython is not installed
+pip install Cython
+# Install into global site-packages
+make install
+# Alternatively, if you do not have permissions or prefer
+# not to install the COCO API into global site-packages
+python2 setup.py install --user
+```
+
+### Perform Validation (COCO or MPII)
+
+Download the COCO/MPII pretrained models (see the links in the last column of the tables above), save them into the folder `checkpoints` under the root directory of this repo, and run:
+
+```bash
+python val.py --dataset 'mpii' --checkpoint 'checkpoints/pose-resnet50-mpii-384x384' --data_root 'data/mpii'
+```
+
+### Perform Training
+
+```bash
+python train.py --dataset 'mpii' --data_root 'data/mpii'
+```
+
+**Note**: Detailed configurations are kept in `lib/mpii_reader.py` and `lib/coco_reader.py`; the `--dataset` option selects which configuration is used.
+
+### Perform Test on Images (with the COCO or MPII pretrained models above)
+
+We also support applying the pretrained keypoint detection models to arbitrary images.
+
+Put the test images into the folder `test` under the root directory of this repo, then run
+
+```bash
+python test.py --checkpoint 'checkpoints/pose-resnet-50-384x384-mpii'
+```
+
+## References
+
+- Simple Baselines for Human Pose Estimation and Tracking in PyTorch [`code`](https://github.com/Microsoft/human-pose-estimation.pytorch#data-preparation)
\ No newline at end of file
diff --git a/fluid/PaddleCV/human_pose_estimation/demo.gif b/PaddleCV/human_pose_estimation/demo.gif
similarity index 100%
rename from fluid/PaddleCV/human_pose_estimation/demo.gif
rename to PaddleCV/human_pose_estimation/demo.gif
diff --git a/fluid/PaddleCV/HiNAS_models/build/__init__.py b/PaddleCV/human_pose_estimation/lib/__init__.py
old mode 100755
new mode 100644
similarity index 100%
rename from fluid/PaddleCV/HiNAS_models/build/__init__.py
rename to PaddleCV/human_pose_estimation/lib/__init__.py
diff --git a/fluid/PaddleCV/human_pose_estimation/lib/base_reader.py b/PaddleCV/human_pose_estimation/lib/base_reader.py
similarity index 100%
rename from fluid/PaddleCV/human_pose_estimation/lib/base_reader.py
rename to PaddleCV/human_pose_estimation/lib/base_reader.py
diff --git a/PaddleCV/human_pose_estimation/lib/coco_reader.py b/PaddleCV/human_pose_estimation/lib/coco_reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..91b26c6dc3e170982f3fa1c054f4ea0d484460d3
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/lib/coco_reader.py
@@ -0,0 +1,330 @@
+# Copyright (c) 2018-present, Baidu, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+##############################################################################
+
+"""Data reader for COCO dataset."""
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import os
+import functools
+import numpy as np
+import cv2
+import random
+
+from utils.transforms import fliplr_joints
+from utils.transforms import get_affine_transform
+from utils.transforms import affine_transform
+from lib.base_reader import visualize, generate_target
+from pycocotools.coco import COCO
+
+# NOTE
+# -- COCO Dataset --
+# "keypoints":
+# {
+# 0: "nose",
+# 1: "left_eye",
+# 2: "right_eye",
+# 3: "left_ear",
+# 4: "right_ear",
+# 5: "left_shoulder",
+# 6: "right_shoulder",
+# 7: "left_elbow",
+# 8: "right_elbow",
+# 9: "left_wrist",
+# 10: "right_wrist",
+# 11: "left_hip",
+# 12: "right_hip",
+# 13: "left_knee",
+# 14: "right_knee",
+# 15: "left_ankle",
+# 16: "right_ankle"
+# },
+#
+# "skeleton":
+# [
+# [16,14],[14,12],[17,15],[15,13],[12,13],[6,12],[7,13], [6,7],[6,8],
+# [7,9],[8,10],[9,11],[2,3],[1,2],[1,3],[2,4],[3,5],[4,6],[5,7]
+# ]
+
+class Config:
+ """Configurations for COCO dataset.
+ """
+ DEBUG = False
+ TMPDIR = 'tmp_fold_for_debug'
+
+ # For reader
+ BUF_SIZE = 102400
+    THREAD = 1 if DEBUG else 8  # must be larger than 0
+
+ # Fixed infos of dataset
+ DATAROOT = 'data/coco'
+ IMAGEDIR = 'images'
+ NUM_JOINTS = 17
+ FLIP_PAIRS = [[1, 2], [3, 4], [5, 6], [7, 8], [9, 10], [11, 12], [13, 14], [15, 16]]
+ PARENT_IDS = None
+
+ # CFGS
+ SCALE_FACTOR = 0.3
+ ROT_FACTOR = 40
+ FLIP = True
+ TARGET_TYPE = 'gaussian'
+ SIGMA = 3
+ IMAGE_SIZE = [288, 384]
+ HEATMAP_SIZE = [72, 96]
+ ASPECT_RATIO = IMAGE_SIZE[0] * 1.0 / IMAGE_SIZE[1]
+ MEAN = [0.485, 0.456, 0.406]
+ STD = [0.229, 0.224, 0.225]
+ PIXEL_STD = 200
+
+cfg = Config()
+
+def _box2cs(box):
+ x, y, w, h = box[:4]
+ return _xywh2cs(x, y, w, h)
+
+def _xywh2cs(x, y, w, h):
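+    """Convert an (x, y, w, h) box into center/scale form: the box is first
+    expanded to match ASPECT_RATIO, the scale is expressed in units of
+    PIXEL_STD (200 px), and valid boxes get an extra 1.25x margin.
+    """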
+ center = np.zeros((2), dtype=np.float32)
+ center[0] = x + w * 0.5
+ center[1] = y + h * 0.5
+
+ if w > cfg.ASPECT_RATIO * h:
+ h = w * 1.0 / cfg.ASPECT_RATIO
+ elif w < cfg.ASPECT_RATIO * h:
+ w = h * cfg.ASPECT_RATIO
+ scale = np.array(
+ [w * 1.0 / cfg.PIXEL_STD, h * 1.0 / cfg.PIXEL_STD],
+ dtype=np.float32)
+ if center[0] != -1:
+ scale = scale * 1.25
+
+ return center, scale
+
+def _select_data(db):
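+    """Filter ground-truth records: drop records with no visible joints, then
+    keep a record only if the mean position of its visible joints lies close
+    enough to the bbox center, measured by a Gaussian similarity (ks) over
+    the box area against a threshold that grows with the visible-joint count.
+    """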
+ db_selected = []
+ for rec in db:
+ num_vis = 0
+ joints_x = 0.0
+ joints_y = 0.0
+ for joint, joint_vis in zip(
+ rec['joints_3d'], rec['joints_3d_vis']):
+ if joint_vis[0] <= 0:
+ continue
+ num_vis += 1
+
+ joints_x += joint[0]
+ joints_y += joint[1]
+ if num_vis == 0:
+ continue
+
+ joints_x, joints_y = joints_x / num_vis, joints_y / num_vis
+
+ area = rec['scale'][0] * rec['scale'][1] * (cfg.PIXEL_STD**2)
+ joints_center = np.array([joints_x, joints_y])
+ bbox_center = np.array(rec['center'])
+ diff_norm2 = np.linalg.norm((joints_center-bbox_center), 2)
+ ks = np.exp(-1.0*(diff_norm2**2) / ((0.2)**2*2.0*area))
+
+ metric = (0.2 / 16) * num_vis + 0.45 - 0.2 / 16
+ if ks > metric:
+ db_selected.append(rec)
+
+ print('=> num db: {}'.format(len(db)))
+ print('=> num selected db: {}'.format(len(db_selected)))
+ return db_selected
+
+def _load_coco_keypoint_annotation(image_set_index, coco, _coco_ind_to_class_ind, image_set):
+ """Ground truth bbox and keypoints.
+ """
+ print('generating coco gt_db...')
+ gt_db = []
+ for index in image_set_index:
+ im_ann = coco.loadImgs(index)[0]
+ width = im_ann['width']
+ height = im_ann['height']
+
+ annIds = coco.getAnnIds(imgIds=index, iscrowd=False)
+ objs = coco.loadAnns(annIds)
+
+ # Sanitize bboxes
+ valid_objs = []
+ for obj in objs:
+ x, y, w, h = obj['bbox']
+ x1 = np.max((0, x))
+ y1 = np.max((0, y))
+ x2 = np.min((width - 1, x1 + np.max((0, w - 1))))
+ y2 = np.min((height - 1, y1 + np.max((0, h - 1))))
+ if obj['area'] > 0 and x2 >= x1 and y2 >= y1:
+ obj['clean_bbox'] = [x1, y1, x2-x1, y2-y1]
+ valid_objs.append(obj)
+ objs = valid_objs
+
+ rec = []
+ for obj in objs:
+ cls = _coco_ind_to_class_ind[obj['category_id']]
+ if cls != 1:
+ continue
+
+ # Ignore objs without keypoints annotation
+ if max(obj['keypoints']) == 0:
+ continue
+
+ joints_3d = np.zeros((cfg.NUM_JOINTS, 3), dtype=np.float)
+ joints_3d_vis = np.zeros((cfg.NUM_JOINTS, 3), dtype=np.float)
+ for ipt in range(cfg.NUM_JOINTS):
+ joints_3d[ipt, 0] = obj['keypoints'][ipt * 3 + 0]
+ joints_3d[ipt, 1] = obj['keypoints'][ipt * 3 + 1]
+ joints_3d[ipt, 2] = 0
+ t_vis = obj['keypoints'][ipt * 3 + 2]
+ if t_vis > 1:
+ t_vis = 1
+ joints_3d_vis[ipt, 0] = t_vis
+ joints_3d_vis[ipt, 1] = t_vis
+ joints_3d_vis[ipt, 2] = 0
+
+ center, scale = _box2cs(obj['clean_bbox'][:4])
+ rec.append({
+ 'image': os.path.join(cfg.DATAROOT, cfg.IMAGEDIR, image_set+'2017', '%012d.jpg' % index),
+ 'center': center,
+ 'scale': scale,
+ 'joints_3d': joints_3d,
+ 'joints_3d_vis': joints_3d_vis,
+ 'filename': '%012d.jpg' % index,
+ 'imgnum': 0,
+ })
+
+ gt_db.extend(rec)
+ return gt_db
+
+def data_augmentation(sample, is_train):
+ image_file = sample['image']
+ filename = sample['filename'] if 'filename' in sample else ''
+ joints = sample['joints_3d']
+ joints_vis = sample['joints_3d_vis']
+ c = sample['center']
+ s = sample['scale']
+ score = sample['score'] if 'score' in sample else 1
+ # imgnum = sample['imgnum'] if 'imgnum' in sample else ''
+ r = 0
+
+ data_numpy = cv2.imread(
+ image_file, cv2.IMREAD_COLOR | cv2.IMREAD_IGNORE_ORIENTATION)
+
+ if is_train:
+ sf = cfg.SCALE_FACTOR
+ rf = cfg.ROT_FACTOR
+ s = s * np.clip(np.random.randn()*sf + 1, 1 - sf, 1 + sf)
+ r = np.clip(np.random.randn()*rf, -rf*2, rf*2) \
+ if random.random() <= 0.6 else 0
+
+ if cfg.FLIP and random.random() <= 0.5:
+ data_numpy = data_numpy[:, ::-1, :]
+ joints, joints_vis = fliplr_joints(
+ joints, joints_vis, data_numpy.shape[1], cfg.FLIP_PAIRS)
+ c[0] = data_numpy.shape[1] - c[0] - 1
+
+ trans = get_affine_transform(c, s, r, cfg.IMAGE_SIZE)
+ input = cv2.warpAffine(
+ data_numpy,
+ trans,
+ (int(cfg.IMAGE_SIZE[0]), int(cfg.IMAGE_SIZE[1])),
+ flags=cv2.INTER_LINEAR)
+
+ for i in range(cfg.NUM_JOINTS):
+ if joints_vis[i, 0] > 0.0:
+ joints[i, 0:2] = affine_transform(joints[i, 0:2], trans)
+
+ # Numpy target
+ target, target_weight = generate_target(cfg, joints, joints_vis)
+
+ if cfg.DEBUG:
+ visualize(cfg, filename, data_numpy, input.copy(), joints, target)
+
+ # Normalization
+ input = input.astype('float32').transpose((2, 0, 1)) / 255
+ input -= np.array(cfg.MEAN).reshape((3, 1, 1))
+ input /= np.array(cfg.STD).reshape((3, 1, 1))
+
+ if is_train:
+ return input, target, target_weight
+ else:
+ return input, target, target_weight, c, s, score, image_file
+
+# Create a reader
+def _reader_creator(root, image_set, shuffle=False, is_train=False, use_gt_bbox=False):
+
+ def reader():
+ if image_set in ['train', 'val']:
+ file_name = os.path.join(root, 'annotations', 'person_keypoints_'+image_set+'2017.json')
+ elif image_set in ['test', 'test-dev']:
+ file_name = os.path.join(root, 'annotations', 'image_info_'+image_set+'2017.json')
+ else:
+ raise ValueError("The dataset '{}' is not supported".format(image_set))
+
+ # Load annotations
+ coco = COCO(file_name)
+
+ # Deal with class names
+ cats = [cat['name']
+ for cat in coco.loadCats(coco.getCatIds())]
+ classes = ['__background__'] + cats
+ print('=> classes: {}'.format(classes))
+ num_classes = len(classes)
+ _class_to_ind = dict(zip(classes, range(num_classes)))
+ _class_to_coco_ind = dict(zip(cats, coco.getCatIds()))
+ _coco_ind_to_class_ind = dict([(_class_to_coco_ind[cls],
+ _class_to_ind[cls])
+ for cls in classes[1:]])
+
+ # Load image file names
+ image_set_index = coco.getImgIds()
+ num_images = len(image_set_index)
+ print('=> num_images: {}'.format(num_images))
+
+ if is_train or use_gt_bbox:
+ gt_db = _load_coco_keypoint_annotation(
+ image_set_index, coco, _coco_ind_to_class_ind, image_set)
+ gt_db = _select_data(gt_db)
+
+ if shuffle:
+ random.shuffle(gt_db)
+
+ for db in gt_db:
+ yield db
+
+ mapper = functools.partial(data_augmentation, is_train=is_train)
+ return reader, mapper
+
+def train():
+ reader, mapper = _reader_creator(cfg.DATAROOT, 'train', shuffle=True, is_train=True)
+ def pop():
+ for i, x in enumerate(reader()):
+ yield mapper(x)
+ return pop
+
+def valid():
+ reader, mapper = _reader_creator(cfg.DATAROOT, 'val', shuffle=False, is_train=False, use_gt_bbox=True)
+ def pop():
+ for i, x in enumerate(reader()):
+ yield mapper(x)
+ return pop
+
+def test():
+ reader, mapper = _reader_creator(cfg.DATAROOT, 'test', shuffle=False, is_train=False, use_gt_bbox=True)
+ def pop():
+ for i, x in enumerate(reader()):
+ yield mapper(x)
+ return pop
diff --git a/PaddleCV/human_pose_estimation/lib/mpii_reader.py b/PaddleCV/human_pose_estimation/lib/mpii_reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..41ea25ed252155913ca52523c3e78c46199a34bb
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/lib/mpii_reader.py
@@ -0,0 +1,216 @@
+# Copyright (c) 2018-present, Baidu, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+##############################################################################
+
+"""Data reader for MPII."""
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import os
+import random
+import functools
+import json
+import numpy as np
+import cv2
+
+from utils.transforms import fliplr_joints
+from utils.transforms import get_affine_transform
+from utils.transforms import affine_transform
+from lib.base_reader import visualize, generate_target
+
+class Config:
+ """Configurations for MPII dataset.
+ """
+ DEBUG = False
+ TMPDIR = 'tmp_fold_for_debug'
+
+ # For reader
+ BUF_SIZE = 102400
+    THREAD = 1 if DEBUG else 8  # must be larger than 0
+
+ # Fixed infos of dataset
+ DATAROOT = 'data/mpii'
+ IMAGEDIR = 'images'
+ NUM_JOINTS = 16
+ FLIP_PAIRS = [[0, 5], [1, 4], [2, 3], [10, 15], [11, 14], [12, 13]]
+ PARENT_IDS = [1, 2, 6, 6, 3, 4, 6, 6, 7, 8, 11, 12, 7, 7, 13, 14]
+
+ # CFGS
+ SCALE_FACTOR = 0.3
+ ROT_FACTOR = 40
+ FLIP = True
+ TARGET_TYPE = 'gaussian'
+ SIGMA = 3
+ IMAGE_SIZE = [384, 384]
+ HEATMAP_SIZE = [96, 96]
+ MEAN = [0.485, 0.456, 0.406]
+ STD = [0.229, 0.224, 0.225]
+
+cfg = Config()
+
+def data_augmentation(sample, is_train):
+ image_file = sample['image']
+ filename = sample['filename'] if 'filename' in sample else ''
+ joints = sample['joints_3d']
+ joints_vis = sample['joints_3d_vis']
+ c = sample['center']
+ s = sample['scale']
+ score = sample['score'] if 'score' in sample else 1
+ # imgnum = sample['imgnum'] if 'imgnum' in sample else ''
+ r = 0
+
+ data_numpy = cv2.imread(
+ image_file, cv2.IMREAD_COLOR | cv2.IMREAD_IGNORE_ORIENTATION)
+
+ if is_train:
+ sf = cfg.SCALE_FACTOR
+ rf = cfg.ROT_FACTOR
+ s = s * np.clip(np.random.randn()*sf + 1, 1 - sf, 1 + sf)
+ r = np.clip(np.random.randn()*rf, -rf*2, rf*2) \
+ if random.random() <= 0.6 else 0
+
+ if cfg.FLIP and random.random() <= 0.5:
+ data_numpy = data_numpy[:, ::-1, :]
+ joints, joints_vis = fliplr_joints(
+ joints, joints_vis, data_numpy.shape[1], cfg.FLIP_PAIRS)
+ c[0] = data_numpy.shape[1] - c[0] - 1
+
+ trans = get_affine_transform(c, s, r, cfg.IMAGE_SIZE)
+ input = cv2.warpAffine(
+ data_numpy,
+ trans,
+ (int(cfg.IMAGE_SIZE[0]), int(cfg.IMAGE_SIZE[1])),
+ flags=cv2.INTER_LINEAR)
+
+ for i in range(cfg.NUM_JOINTS):
+ if joints_vis[i, 0] > 0.0:
+ joints[i, 0:2] = affine_transform(joints[i, 0:2], trans)
+
+ # Numpy target
+ target, target_weight = generate_target(cfg, joints, joints_vis)
+
+ if cfg.DEBUG:
+ visualize(cfg, filename, data_numpy, input.copy(), joints, target)
+
+ # Normalization
+ input = input.astype('float32').transpose((2, 0, 1)) / 255
+ input -= np.array(cfg.MEAN).reshape((3, 1, 1))
+ input /= np.array(cfg.STD).reshape((3, 1, 1))
+
+ if is_train:
+ return input, target, target_weight
+ else:
+ return input, target, target_weight, c, s, score
+
+def test_data_augmentation(sample):
+ image_file = sample['image']
+ filename = sample['filename'] if 'filename' in sample else ''
+
+ file_id = int(filename.split('.')[0])
+
+ input = cv2.imread(
+ image_file, cv2.IMREAD_COLOR | cv2.IMREAD_IGNORE_ORIENTATION)
+
+ input = cv2.resize(input, (int(cfg.IMAGE_SIZE[0]), int(cfg.IMAGE_SIZE[1])))
+
+ # Normalization
+ input = input.astype('float32').transpose((2, 0, 1)) / 255
+ input -= np.array(cfg.MEAN).reshape((3, 1, 1))
+ input /= np.array(cfg.STD).reshape((3, 1, 1))
+
+ return input, file_id
+
+# Create a reader
+def _reader_creator(root, image_set, shuffle=False, is_train=False):
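+    """Build a sample generator and its mapper. 'train'/'valid' samples come
+    from the MPII JSON annotations; for 'test', raw images are read from the
+    local 'test' folder and only resized and normalized.
+    """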
+ def reader():
+ if image_set != 'test':
+ file_name = os.path.join(root, 'annot', image_set+'.json')
+ with open(file_name) as anno_file:
+ anno = json.load(anno_file)
+ print('=> load {} samples of {} dataset'.format(len(anno), image_set))
+
+ if shuffle:
+ random.shuffle(anno)
+
+ for a in anno:
+ image_name = a['image']
+
+ c = np.array(a['center'], dtype=np.float)
+ s = np.array([a['scale'], a['scale']], dtype=np.float)
+
+ # Adjust center/scale slightly to avoid cropping limbs
+ if c[0] != -1:
+ c[1] = c[1] + 15 * s[1]
+ s = s * 1.25
+
+                # MPII uses the MATLAB format: indices are 1-based,
+                # so convert them to 0-based indices first
+ c = c - 1
+
+ joints_3d = np.zeros((cfg.NUM_JOINTS, 3), dtype=np.float)
+ joints_3d_vis = np.zeros((cfg.NUM_JOINTS, 3), dtype=np.float)
+
+ joints = np.array(a['joints'])
+ joints[:, 0:2] = joints[:, 0:2] - 1
+ joints_vis = np.array(a['joints_vis'])
+ assert len(joints) == cfg.NUM_JOINTS, \
+ 'joint num diff: {} vs {}'.format(len(joints), cfg.NUM_JOINTS)
+
+ joints_3d[:, 0:2] = joints[:, 0:2]
+ joints_3d_vis[:, 0] = joints_vis[:]
+ joints_3d_vis[:, 1] = joints_vis[:]
+
+ yield dict(
+ image=os.path.join(cfg.DATAROOT, cfg.IMAGEDIR, image_name),
+ center=c,
+ scale=s,
+ joints_3d=joints_3d,
+ joints_3d_vis=joints_3d_vis,
+ filename=image_name,
+ test_mode=False,
+ imagenum=0)
+ else:
+ fold = 'test'
+ for img_name in os.listdir(fold):
+ yield dict(image=os.path.join(fold, img_name),
+ filename=img_name)
+
+    if image_set != 'test':
+ mapper = functools.partial(data_augmentation, is_train=is_train)
+ else:
+ mapper = functools.partial(test_data_augmentation)
+ return reader, mapper
+
+def train():
+ reader, mapper = _reader_creator(cfg.DATAROOT, 'train', shuffle=True, is_train=True)
+ def pop():
+ for i, x in enumerate(reader()):
+ yield mapper(x)
+ return pop
+
+def valid():
+ reader, mapper = _reader_creator(cfg.DATAROOT, 'valid', shuffle=False, is_train=False)
+ def pop():
+ for i, x in enumerate(reader()):
+ yield mapper(x)
+ return pop
+
+def test():
+ reader, mapper = _reader_creator(cfg.DATAROOT, 'test')
+ def pop():
+ for i, x in enumerate(reader()):
+ yield mapper(x)
+ return pop
diff --git a/PaddleCV/human_pose_estimation/lib/pose_resnet.py b/PaddleCV/human_pose_estimation/lib/pose_resnet.py
new file mode 100644
index 0000000000000000000000000000000000000000..e9578edb9ce8e90996dc33efe611a445f1f8c5c6
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/lib/pose_resnet.py
@@ -0,0 +1,192 @@
+# Copyright (c) 2018-present, Baidu, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+##############################################################################
+
+"""Functions for building network."""
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle.fluid as fluid
+
+__all__ = ["ResNet", "ResNet50", "ResNet101", "ResNet152"]
+
+# Global parameters
+BN_MOMENTUM = 0.9
+
+class ResNet():
+ def __init__(self, layers=50, kps_num=16, test_mode=False):
+ """
+ :param layers: int, the layers number which is used here
+ :param kps_num: int, the number of keypoints in accord with the dataset
+ :param test_mode: bool, if True, only return output heatmaps, no loss
+
+ :return: loss, output heatmaps
+ """
+ self.k = kps_num
+ self.layers = layers
+ self.test_mode = test_mode
+
+ def net(self, input, target=None, target_weight=None):
+ layers = self.layers
+ supported_layers = [50, 101, 152]
+ assert layers in supported_layers, \
+ "supported layers are {} but input layer is {}".format(supported_layers, layers)
+
+ if layers == 50:
+ depth = [3, 4, 6, 3]
+ elif layers == 101:
+ depth = [3, 4, 23, 3]
+ elif layers == 152:
+ depth = [3, 8, 36, 3]
+ num_filters = [64, 128, 256, 512]
+
+ conv = self.conv_bn_layer(
+ input=input, num_filters=64, filter_size=7, stride=2, act='relu')
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+
+ for block in range(len(depth)):
+ for i in range(depth[block]):
+ conv = self.bottleneck_block(
+ input=conv,
+ num_filters=num_filters[block],
+ stride=2 if i == 0 and block != 0 else 1)
+
+        # Three deconvolution stages upsample the backbone features to the
+        # heatmap resolution; each stage doubles the spatial size.
+        for _ in range(3):
+            conv = fluid.layers.conv2d_transpose(
+                input=conv, num_filters=256,
+                filter_size=4,
+                padding=1,
+                stride=2,
+                param_attr=fluid.param_attr.ParamAttr(
+                    initializer=fluid.initializer.Normal(0., 0.001)),
+                act=None,
+                bias_attr=False)
+            conv = fluid.layers.batch_norm(
+                input=conv, act='relu', momentum=BN_MOMENTUM)
+
+ out = fluid.layers.conv2d(
+ input=conv,
+ num_filters=self.k,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ act=None,
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Normal(0., 0.001)))
+
+ if self.test_mode:
+ return out
+ else:
+ loss = self.calc_loss(out, target, target_weight)
+ return loss, out
+
+ def conv_bn_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Normal(0., 0.001)),
+ act=None,
+ bias_attr=False)
+ return fluid.layers.batch_norm(input=conv, act=act, momentum=BN_MOMENTUM)
+
+ def shortcut(self, input, ch_out, stride):
+ ch_in = input.shape[1]
+ if ch_in != ch_out or stride != 1:
+ return self.conv_bn_layer(input, ch_out, 1, stride)
+ else:
+ return input
+
+ def calc_loss(self, heatmap, target, target_weight):
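+        """Joint-wise weighted MSE: each of the k predicted heatmaps is
+        compared to its target with a mean squared error over pixels, masked
+        by target_weight so unlabeled joints contribute nothing, then
+        averaged and halved.
+        """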
+ _, c, h, w = heatmap.shape
+ x = fluid.layers.reshape(heatmap, (-1, self.k, h*w))
+ y = fluid.layers.reshape(target, (-1, self.k, h*w))
+ w = fluid.layers.reshape(target_weight, (-1, self.k))
+
+ x = fluid.layers.split(x, num_or_sections=self.k, dim=1)
+ y = fluid.layers.split(y, num_or_sections=self.k, dim=1)
+ w = fluid.layers.split(w, num_or_sections=self.k, dim=1)
+
+ _list = []
+ for idx in range(self.k):
+ _tmp = fluid.layers.scale(x=x[idx] - y[idx], scale=1.)
+ _tmp = _tmp * _tmp
+ _tmp = fluid.layers.reduce_mean(_tmp, dim=2)
+ _list.append(_tmp * w[idx])
+
+ _loss = fluid.layers.concat(_list, axis=0)
+ _loss = fluid.layers.reduce_mean(_loss)
+ return 0.5 * _loss
+
+ def bottleneck_block(self, input, num_filters, stride):
+ conv0 = self.conv_bn_layer(
+ input=input, num_filters=num_filters, filter_size=1, act='relu')
+ conv1 = self.conv_bn_layer(
+ input=conv0,
+ num_filters=num_filters,
+ filter_size=3,
+ stride=stride,
+ act='relu')
+ conv2 = self.conv_bn_layer(
+ input=conv1, num_filters=num_filters * 4, filter_size=1, act=None)
+
+ short = self.shortcut(input, num_filters * 4, stride)
+
+ return fluid.layers.elementwise_add(x=short, y=conv2, act='relu')
+
+def ResNet50():
+ model = ResNet(layers=50)
+ return model
+
+def ResNet101():
+ model = ResNet(layers=101)
+ return model
+
+def ResNet152():
+ model = ResNet(layers=152)
+ return model
diff --git a/PaddleCV/human_pose_estimation/test.py b/PaddleCV/human_pose_estimation/test.py
new file mode 100644
index 0000000000000000000000000000000000000000..8ede66d2d4e2b7eef8adffb70be46ff5ab626b83
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/test.py
@@ -0,0 +1,136 @@
+# Copyright (c) 2018-present, Baidu, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+##############################################################################
+
+"""Functions for inference."""
+
+import sys
+import argparse
+import functools
+import paddle
+import paddle.fluid as fluid
+import paddle.fluid.layers as layers
+
+from lib import pose_resnet
+from utils.transforms import flip_back
+from utils.utility import *
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+
+# yapf: disable
+add_arg('batch_size', int, 32, "Minibatch size.")
+add_arg('dataset', str, 'mpii', "Dataset")
+add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+add_arg('kp_dim', int, 16, "Class number.")
+add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
+add_arg('checkpoint', str, None, "Path of the checkpoint to be loaded.")
+add_arg('flip_test', bool, True, "Whether to use flip test.")
+add_arg('shift_heatmap', bool, True, "Whether to shift the flipped heatmap.")
+# yapf: enable
+
+
+def print_immediately(s):
+ print(s)
+ sys.stdout.flush()
+
+
+def test(args):
+ import lib.mpii_reader as reader
+ if args.dataset == 'coco':
+ IMAGE_SIZE = [288, 384]
+ FLIP_PAIRS = [[1, 2], [3, 4], [5, 6], [7, 8], [9, 10], [11, 12], [13, 14], [15, 16]]
+ args.kp_dim = 17
+ elif args.dataset == 'mpii':
+ IMAGE_SIZE = [384, 384]
+ FLIP_PAIRS = [[0, 5], [1, 4], [2, 3], [10, 15], [11, 14], [12, 13]]
+ args.kp_dim = 16
+ else:
+ raise ValueError('The dataset {} is not supported yet.'.format(args.dataset))
+
+ print_arguments(args)
+
+ # Image and target
+ image = layers.data(name='image', shape=[3, IMAGE_SIZE[1], IMAGE_SIZE[0]], dtype='float32')
+ file_id = layers.data(name='file_id', shape=[1,], dtype='int')
+
+ # Build model
+ model = pose_resnet.ResNet(layers=50, kps_num=args.kp_dim, test_mode=True)
+
+ # Output
+ output = model.net(input=image, target=None, target_weight=None)
+
+ if args.with_mem_opt:
+ fluid.memory_optimize(fluid.default_main_program(),
+ skip_opt_set=[output.name])
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ if args.checkpoint is not None:
+ fluid.io.load_persistables(exe, args.checkpoint)
+
+ # Dataloader
+ test_reader = paddle.batch(reader.test(), batch_size=args.batch_size)
+ feeder = fluid.DataFeeder(place=place, feed_list=[image, file_id])
+
+ test_exe = fluid.ParallelExecutor(
+ use_cuda=True if args.use_gpu else False,
+ main_program=fluid.default_main_program().clone(for_test=True),
+ loss_name=None)
+
+ fetch_list = [image.name, output.name]
+
+ for batch_id, data in enumerate(test_reader()):
+ print_immediately("Processing batch #%d" % batch_id)
+ num_images = len(data)
+
+ file_ids = []
+ for i in range(num_images):
+ file_ids.append(data[i][1])
+
+ input_image, out_heatmaps = test_exe.run(
+ fetch_list=fetch_list,
+ feed=feeder.feed(data))
+
+ if args.flip_test:
+            # Flip all the images in the same batch
+ data_fliped = []
+ for i in range(num_images):
+ data_fliped.append((
+ data[i][0][:, :, ::-1],
+ data[i][1]))
+
+ # Inference again
+ _, output_flipped = test_exe.run(
+ fetch_list=fetch_list,
+ feed=feeder.feed(data_fliped))
+
+ # Flip back
+ output_flipped = flip_back(output_flipped, FLIP_PAIRS)
+
+            # Features are not aligned; shift the flipped heatmaps for higher accuracy
+ if args.shift_heatmap:
+ output_flipped[:, :, :, 1:] = \
+ output_flipped.copy()[:, :, :, 0:-1]
+
+ # Aggregate
+ out_heatmaps = (out_heatmaps + output_flipped) * 0.5
+ save_predict_results(input_image, out_heatmaps, file_ids, fold_name='results')
+
+
+if __name__ == '__main__':
+ args = parser.parse_args()
+ test(args)
diff --git a/PaddleCV/human_pose_estimation/train.py b/PaddleCV/human_pose_estimation/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..c6d757321111babbfa4c0c37d5cb12bbab92ac3c
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/train.py
@@ -0,0 +1,179 @@
+# Copyright (c) 2018-present, Baidu, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+##############################################################################
+
+"""Functions for training."""
+
+import os
+import sys
+import numpy as np
+import cv2
+import paddle
+import paddle.fluid as fluid
+import paddle.fluid.layers as layers
+import argparse
+import functools
+
+from lib import pose_resnet
+from utils.utility import *
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('batch_size', int, 128, "Total minibatch size.")
+add_arg('dataset', str, 'mpii', "Dataset, valid value: mpii, coco")
+add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+add_arg('num_epochs', int, 140, "Number of epochs.")
+add_arg('total_images', int, 144406, "Training image number.")
+add_arg('kp_dim', int, 16, "Class number.")
+add_arg('model_save_dir', str, "output", "Model save directory")
+add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
+add_arg('pretrained_model', str, "pretrained/resnet_50/115", "Directory of the pretrained model.")
+add_arg('checkpoint', str, None, "Checkpoint path for resuming training.")
+add_arg('lr', float, 0.001, "Set learning rate.")
+add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
+# yapf: enable
+
+def optimizer_setting(args, params):
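+    # Piecewise decay drops the learning rate by 10x at epochs 90 and 120
+    # (converted to iteration boundaries below) and optimizes with Adam;
+    # any other strategy falls back to Momentum with L2 weight decay.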
+ lr_drop_ratio = 0.1
+
+ ls = params["learning_strategy"]
+
+ if ls["name"] == "piecewise_decay":
+ total_images = params["total_images"]
+ batch_size = ls["batch_size"]
+ step = int(total_images / batch_size + 1)
+
+ ls['epochs'] = [90, 120]
+ print('=> LR will be dropped at the epoch of {}'.format(ls['epochs']))
+
+ bd = [step * e for e in ls["epochs"]]
+ base_lr = params["lr"]
+ lr = []
+ lr = [base_lr * (lr_drop_ratio**i) for i in range(len(bd) + 1)]
+
+ # AdamOptimizer
+ optimizer = paddle.fluid.optimizer.AdamOptimizer(
+ learning_rate=fluid.layers.piecewise_decay(
+ boundaries=bd, values=lr))
+ else:
+ lr = params["lr"]
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=lr,
+ momentum=0.9,
+ regularization=fluid.regularizer.L2Decay(0.0005))
+
+ return optimizer
+
+
+def print_immediately(s):
+ print(s)
+ sys.stdout.flush()
+
+
+def train(args):
+ if args.dataset == 'coco':
+ import lib.coco_reader as reader
+ IMAGE_SIZE = [288, 384]
+ HEATMAP_SIZE = [72, 96]
+ args.kp_dim = 17
+ args.total_images = 144406 # 149813
+ elif args.dataset == 'mpii':
+ import lib.mpii_reader as reader
+ IMAGE_SIZE = [384, 384]
+ HEATMAP_SIZE = [96, 96]
+ args.kp_dim = 16
+ args.total_images = 22246
+ else:
+ raise ValueError('The dataset {} is not supported yet.'.format(args.dataset))
+
+ print_arguments(args)
+
+ # Image and target
+ image = layers.data(name='image', shape=[3, IMAGE_SIZE[1], IMAGE_SIZE[0]], dtype='float32')
+ target = layers.data(name='target', shape=[args.kp_dim, HEATMAP_SIZE[1], HEATMAP_SIZE[0]], dtype='float32')
+ target_weight = layers.data(name='target_weight', shape=[args.kp_dim, 1], dtype='float32')
+
+ # Build model
+ model = pose_resnet.ResNet(layers=50, kps_num=args.kp_dim)
+
+ # Output
+ loss, output = model.net(input=image, target=target, target_weight=target_weight)
+
+ # Parameters from model and arguments
+ params = {}
+ params["total_images"] = args.total_images
+ params["lr"] = args.lr
+ params["num_epochs"] = args.num_epochs
+ params["learning_strategy"] = {}
+ params["learning_strategy"]["batch_size"] = args.batch_size
+ params["learning_strategy"]["name"] = args.lr_strategy
+
+ # Initialize optimizer
+ optimizer = optimizer_setting(args, params)
+ optimizer.minimize(loss)
+
+ if args.with_mem_opt:
+ fluid.memory_optimize(fluid.default_main_program(),
+ skip_opt_set=[loss.name, output.name, target.name])
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+
+ if args.pretrained_model:
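+        # Load only the variables whose parameter files exist in the
+        # pretrained directory; the deconvolution and head layers are absent
+        # from the ImageNet checkpoint and keep their fresh initialization.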
+ def if_exist(var):
+ exist_flag = os.path.exists(os.path.join(args.pretrained_model, var.name))
+ return exist_flag
+ fluid.io.load_vars(exe, args.pretrained_model, predicate=if_exist)
+
+ if args.checkpoint is not None:
+ fluid.io.load_persistables(exe, args.checkpoint)
+
+ # Dataloader
+ train_reader = paddle.batch(reader.train(), batch_size=args.batch_size)
+ feeder = fluid.DataFeeder(place=place, feed_list=[image, target, target_weight])
+
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=True if args.use_gpu else False, loss_name=loss.name)
+ fetch_list = [image.name, loss.name, output.name]
+
+ for pass_id in range(params["num_epochs"]):
+ for batch_id, data in enumerate(train_reader()):
+ current_lr = np.array(paddle.fluid.global_scope().find_var('learning_rate').get_tensor())
+
+ input_image, loss, out_heatmaps = train_exe.run(
+ fetch_list, feed=feeder.feed(data))
+
+ loss = np.mean(np.array(loss))
+
+            print_immediately('Epoch [{:3d}] Batch [{:4d}] LR: {:.10f} '
+                              'Loss = {:.5f}'.format(
+                                  pass_id, batch_id, current_lr[0], loss))
+
+ if batch_id % 10 == 0:
+ save_batch_heatmaps(input_image, out_heatmaps, file_name='visualization@train.jpg', normalize=True)
+
+ model_path = os.path.join(args.model_save_dir + '/' + 'simplebase-{}'.format(args.dataset),
+ str(pass_id))
+ if not os.path.isdir(model_path):
+ os.makedirs(model_path)
+ fluid.io.save_persistables(exe, model_path)
+
+
+if __name__ == '__main__':
+ args = parser.parse_args()
+ train(args)
+
diff --git a/fluid/PaddleCV/faster_rcnn/models/__init__.py b/PaddleCV/human_pose_estimation/utils/__init__.py
similarity index 100%
rename from fluid/PaddleCV/faster_rcnn/models/__init__.py
rename to PaddleCV/human_pose_estimation/utils/__init__.py
diff --git a/PaddleCV/human_pose_estimation/utils/base_evaluator.py b/PaddleCV/human_pose_estimation/utils/base_evaluator.py
new file mode 100644
index 0000000000000000000000000000000000000000..c5201c563306b48979cba13b0c1b3edd80f52516
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/utils/base_evaluator.py
@@ -0,0 +1,32 @@
+# Copyright (c) 2019-present, Baidu, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+##############################################################################
+
+"""Interface for evaluation."""
+
+
+class BaseEvaluator(object):
+ def __init__(self, root, kp_dim):
+ """
+ :param root: the root dir of dataset
+ :param kp_dim: the dimension of keypoints
+ """
+ self.root = root
+ self.kp_dim = kp_dim
+
+ def evaluate(self, *args, **kwargs):
+ """
+        Needs to be implemented for the specific task / dataset.
+ """
+ raise NotImplementedError
diff --git a/PaddleCV/human_pose_estimation/utils/coco_evaluator.py b/PaddleCV/human_pose_estimation/utils/coco_evaluator.py
new file mode 100644
index 0000000000000000000000000000000000000000..9afb2e22fb533b748101e09f59e903c528e8a707
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/utils/coco_evaluator.py
@@ -0,0 +1,189 @@
+# Copyright (c) 2019-present, Baidu, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+##############################################################################
+
+"""Interface for COCO evaluation."""
+
+import os
+import json
+
+import numpy as np
+from collections import defaultdict
+from collections import OrderedDict
+import pickle
+
+from utils.base_evaluator import BaseEvaluator
+from utils.nms_utils import oks_nms
+from pycocotools.coco import COCO
+from pycocotools.cocoeval import COCOeval
+
+
+class COCOEvaluator(BaseEvaluator):
+ def __init__(self, root, kp_dim=17):
+ """
+ :param root: the root dir of dataset
+ :param kp_dim: the dimension of keypoints
+ """
+ super(COCOEvaluator, self).__init__(root, kp_dim)
+ self.kp_dim = kp_dim
+ self.in_vis_thre = 0.2
+ self.oks_thre = 0.9
+ self.coco = COCO(os.path.join(root, 'annotations', 'person_keypoints_val2017.json'))
+
+ cats = [cat['name']
+ for cat in self.coco.loadCats(self.coco.getCatIds())]
+ self.classes = ['__background__'] + cats
+ self.num_classes = len(self.classes)
+ self._class_to_ind = dict(zip(self.classes, range(self.num_classes)))
+ self._class_to_coco_ind = dict(zip(cats, self.coco.getCatIds()))
+ self._coco_ind_to_class_ind = dict([(self._class_to_coco_ind[cls],
+ self._class_to_ind[cls])
+ for cls in self.classes[1:]])
+
+ def evaluate(self, preds, output_dir, all_boxes, img_path, *args, **kwargs):
+ """
+ :param preds: the predictions to be evaluated
+ :param output_dir: target directory to save evaluation results
+        :param all_boxes: per-sample box info (center, scale, area, score)
+        :param img_path: paths of the original images
+ :return:
+ """
+ res_folder = os.path.join(output_dir, 'results')
+ if not os.path.exists(res_folder):
+ os.makedirs(res_folder)
+ res_file = os.path.join(res_folder, 'keypoints_coco_results.json')
+
+ # person x (keypoints)
+ _kpts = []
+ for idx, kpt in enumerate(preds):
+ _kpts.append({
+ 'keypoints': kpt,
+ 'center': all_boxes[idx][0:2],
+ 'scale': all_boxes[idx][2:4],
+ 'area': all_boxes[idx][4],
+ 'score': all_boxes[idx][5],
+ 'image': int(img_path[idx][-16:-4])
+ })
+ # image x person x (keypoints)
+ kpts = defaultdict(list)
+ for kpt in _kpts:
+ kpts[kpt['image']].append(kpt)
+
+ # rescoring and oks nms
+ kp_dim = self.kp_dim
+ in_vis_thre = self.in_vis_thre
+ oks_thre = self.oks_thre
+ oks_nmsed_kpts = []
+ for img in kpts.keys():
+ img_kpts = kpts[img]
+ for n_p in img_kpts:
+ box_score = n_p['score']
+ kpt_score = 0
+ valid_num = 0
+ for n_jt in range(0, kp_dim):
+ t_s = n_p['keypoints'][n_jt][2]
+ if t_s > in_vis_thre:
+ kpt_score = kpt_score + t_s
+ valid_num = valid_num + 1
+ if valid_num != 0:
+ kpt_score = kpt_score / valid_num
+ # rescoring
+ n_p['score'] = kpt_score * box_score
+ keep = oks_nms([img_kpts[i] for i in range(len(img_kpts))], oks_thre)
+ if len(keep) == 0:
+ oks_nmsed_kpts.append(img_kpts)
+ else:
+ oks_nmsed_kpts.append([img_kpts[_keep] for _keep in keep])
+
+ self._write_coco_keypoint_results(oks_nmsed_kpts, res_file)
+ info_str = self._do_python_keypoint_eval(res_file, res_folder)
+ name_value = OrderedDict(info_str)
+ return name_value, name_value['AP']
+
+ def _write_coco_keypoint_results(self, keypoints, res_file):
+ data_pack = [{'cat_id': self._class_to_coco_ind[cls],
+ 'cls_ind': cls_ind,
+ 'cls': cls,
+ 'ann_type': 'keypoints',
+ 'keypoints': keypoints
+ }
+ for cls_ind, cls in enumerate(self.classes) if not cls == '__background__']
+
+ results = self._coco_keypoint_results_one_category_kernel(data_pack[0])
+ with open(res_file, 'w') as f:
+ json.dump(results, f, sort_keys=True, indent=4)
+ try:
+ with open(res_file, 'r') as f:
+ json.load(f)
+ except Exception:
+ content = []
+ with open(res_file, 'r') as f:
+ for line in f:
+ content.append(line)
+ content[-1] = ']'
+ with open(res_file, 'w') as f:
+ for c in content:
+ f.write(c)
+
+ def _coco_keypoint_results_one_category_kernel(self, data_pack):
+ cat_id = data_pack['cat_id']
+ keypoints = data_pack['keypoints']
+ cat_results = []
+
+ for img_kpts in keypoints:
+ if len(img_kpts) == 0:
+ continue
+
+ _key_points = np.array([img_kpts[k]['keypoints']
+ for k in range(len(img_kpts))])
+ key_points = np.zeros(
+ (_key_points.shape[0], self.kp_dim * 3), dtype=np.float64)
+
+ for ipt in range(self.kp_dim):
+ key_points[:, ipt * 3 + 0] = _key_points[:, ipt, 0]
+ key_points[:, ipt * 3 + 1] = _key_points[:, ipt, 1]
+ key_points[:, ipt * 3 + 2] = _key_points[:, ipt, 2] # keypoints score.
+
+ result = [{'image_id': img_kpts[k]['image'],
+ 'category_id': cat_id,
+ 'keypoints': list(key_points[k]),
+ 'score': img_kpts[k]['score'],
+ 'center': list(img_kpts[k]['center']),
+ 'scale': list(img_kpts[k]['scale'])
+ } for k in range(len(img_kpts))]
+ cat_results.extend(result)
+
+ return cat_results
+
+ def _do_python_keypoint_eval(self, res_file, res_folder):
+ coco_dt = self.coco.loadRes(res_file)
+ coco_eval = COCOeval(self.coco, coco_dt, 'keypoints')
+ coco_eval.params.useSegm = None
+ coco_eval.evaluate()
+ coco_eval.accumulate()
+ coco_eval.summarize()
+ stats_names = ['AP', 'Ap .5', 'AP .75', 'AP (M)', 'AP (L)', 'AR', 'AR .5', 'AR .75', 'AR (M)', 'AR (L)']
+
+ info_str = []
+ for ind, name in enumerate(stats_names):
+ info_str.append((name, coco_eval.stats[ind]))
+
+ eval_file = os.path.join(res_folder, 'keypoints_val_results.pkl')
+
+ with open(eval_file, 'wb') as f:
+ pickle.dump(coco_eval, f, pickle.HIGHEST_PROTOCOL)
+
+ print('=> coco eval results saved to %s' % eval_file)
+
+ return info_str
+
diff --git a/PaddleCV/human_pose_estimation/utils/evaluator_builder.py b/PaddleCV/human_pose_estimation/utils/evaluator_builder.py
new file mode 100644
index 0000000000000000000000000000000000000000..2f9b856ade04be6355f3e33fa3d7cf7863b8dc77
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/utils/evaluator_builder.py
@@ -0,0 +1,33 @@
+# Copyright (c) 2019-present, Baidu, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+##############################################################################
+
+"""Interface for building evaluator."""
+
+from utils.coco_evaluator import COCOEvaluator
+from utils.mpii_evaluator import MPIIEvaluator
+
+
+evaluator_map = {
+ 'coco': COCOEvaluator,
+ 'mpii': MPIIEvaluator
+}
+
+
+def create_evaluator(dataset):
+ """
+ :param dataset: name of the dataset to be evaluated ('coco' or 'mpii')
+ :return: the evaluator class registered for the dataset
+ """
+ return evaluator_map[dataset]
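+
+# Example (mirroring the usage in val.py): the returned class is instantiated
+# by the caller, e.g. create_evaluator('coco')(data_root, kp_dim).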
diff --git a/PaddleCV/human_pose_estimation/utils/mpii_evaluator.py b/PaddleCV/human_pose_estimation/utils/mpii_evaluator.py
new file mode 100644
index 0000000000000000000000000000000000000000..ed247fa8f537af65fb12fa6ee2e3cf08e015a2b8
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/utils/mpii_evaluator.py
@@ -0,0 +1,117 @@
+# Copyright (c) 2019-present, Baidu, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+##############################################################################
+
+"""Interface for MPII evaluation."""
+
+import os
+import numpy as np
+from collections import OrderedDict
+from scipy.io import loadmat, savemat
+
+from utils.base_evaluator import BaseEvaluator
+
+
+class MPIIEvaluator(BaseEvaluator):
+ def __init__(self, root, kp_dim=16):
+ """
+ :param root: the root dir of dataset
+ :param kp_dim: the dimension of keypoints
+ """
+ super(MPIIEvaluator, self).__init__(root, kp_dim)
+ self.root = root
+ self.kp_dim = kp_dim
+ self.sc_bias = 0.6
+ self.threshold = 0.5
+
+ def evaluate(self, preds, output_dir, *args, **kwargs):
+ """
+ :param preds: the predictions to be evaluated
+ :param output_dir: target directory to save evaluation results
+ :return:
+ """
+ # Convert 0-based index to 1-based index
+ preds = preds[:, :, 0:2] + 1.0
+
+ if output_dir:
+ pred_file = os.path.join(output_dir, 'pred.mat')
+ savemat(pred_file, mdict={'preds': preds})
+
+ gt_file = os.path.join(self.root, 'annot', 'gt_valid.mat')
+ gt_dict = loadmat(gt_file)
+ dataset_joints = gt_dict['dataset_joints']
+ jnt_missing = gt_dict['jnt_missing']
+ pos_gt_src = gt_dict['pos_gt_src']
+ headboxes_src = gt_dict['headboxes_src']
+
+ pos_pred_src = np.transpose(preds, [1, 2, 0])
+
+ head = np.where(dataset_joints == 'head')[1][0]
+ lsho = np.where(dataset_joints == 'lsho')[1][0]
+ lelb = np.where(dataset_joints == 'lelb')[1][0]
+ lwri = np.where(dataset_joints == 'lwri')[1][0]
+ lhip = np.where(dataset_joints == 'lhip')[1][0]
+ lkne = np.where(dataset_joints == 'lkne')[1][0]
+ lank = np.where(dataset_joints == 'lank')[1][0]
+
+ rsho = np.where(dataset_joints == 'rsho')[1][0]
+ relb = np.where(dataset_joints == 'relb')[1][0]
+ rwri = np.where(dataset_joints == 'rwri')[1][0]
+ rkne = np.where(dataset_joints == 'rkne')[1][0]
+ rank = np.where(dataset_joints == 'rank')[1][0]
+ rhip = np.where(dataset_joints == 'rhip')[1][0]
+
+ jnt_visible = 1 - jnt_missing
+ uv_error = pos_pred_src - pos_gt_src
+ uv_err = np.linalg.norm(uv_error, axis=1)
+ headsizes = headboxes_src[1, :, :] - headboxes_src[0, :, :]
+ headsizes = np.linalg.norm(headsizes, axis=0)
+ headsizes *= self.sc_bias
+ scale = np.multiply(headsizes, np.ones((len(uv_err), 1)))
+ scaled_uv_err = np.divide(uv_err, scale)
+ scaled_uv_err = np.multiply(scaled_uv_err, jnt_visible)
+ jnt_count = np.sum(jnt_visible, axis=1)
+ less_than_threshold = np.multiply((scaled_uv_err <= self.threshold), jnt_visible)
+ PCKh = np.divide(100. * np.sum(less_than_threshold, axis=1), jnt_count)
+
+ # PCKh over a sweep of thresholds from 0 to 0.5 in steps of 0.01
+ rng = np.arange(0, 0.5 + 0.01, 0.01)
+ pckAll = np.zeros((len(rng), self.kp_dim))
+
+ for r in range(len(rng)):
+ thresh = rng[r]
+ less_than_threshold = np.multiply(scaled_uv_err <= thresh, jnt_visible)
+ pckAll[r, :] = np.divide(100. * np.sum(less_than_threshold, axis=1), jnt_count)
+
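+ # Exclude pelvis (index 6) and thorax (index 7) from the reported metrics.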
+ PCKh = np.ma.array(PCKh, mask=False)
+ PCKh.mask[6:8] = True
+
+ jnt_count = np.ma.array(jnt_count, mask=False)
+ jnt_count.mask[6:8] = True
+ jnt_ratio = jnt_count / np.sum(jnt_count).astype(np.float64)
+
+ name_value = [
+ ('Head', PCKh[head]),
+ ('Shoulder', 0.5 * (PCKh[lsho] + PCKh[rsho])),
+ ('Elbow', 0.5 * (PCKh[lelb] + PCKh[relb])),
+ ('Wrist', 0.5 * (PCKh[lwri] + PCKh[rwri])),
+ ('Hip', 0.5 * (PCKh[lhip] + PCKh[rhip])),
+ ('Knee', 0.5 * (PCKh[lkne] + PCKh[rkne])),
+ ('Ankle', 0.5 * (PCKh[lank] + PCKh[rank])),
+ ('Mean', np.sum(PCKh * jnt_ratio)),
+ ('Mean@0.1', np.sum(pckAll[11, :] * jnt_ratio))
+ ]
+ name_value = OrderedDict(name_value)
+
+ return name_value, name_value['Mean']
diff --git a/PaddleCV/human_pose_estimation/utils/nms_utils.py b/PaddleCV/human_pose_estimation/utils/nms_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..ea72ddbaac0fee163ea6a91604aeef8c2312b171
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/utils/nms_utils.py
@@ -0,0 +1,71 @@
+# Copyright (c) 2019-present, Baidu, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+##############################################################################
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import numpy as np
+
+
+def oks_iou(g, d, a_g, a_d, sigmas=None, in_vis_thre=None):
+ if not isinstance(sigmas, np.ndarray):
+ sigmas = np.array([.26, .25, .25, .35, .35, .79, .79, .72, .72, .62, .62, 1.07, 1.07, .87, .87, .89, .89]) / 10.0
+ variances = (sigmas * 2) ** 2
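+ # Object Keypoint Similarity between ground truth g and each detection in d:
+ # OKS = mean_i exp(-d_i^2 / (2 * s * k_i^2)), where d_i is the distance for
+ # keypoint i, s is the average object area and k_i = 2 * sigma_i.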
+ xg = g[0::3]
+ yg = g[1::3]
+ vg = g[2::3]
+ ious = np.zeros((d.shape[0]))
+ for n_d in range(0, d.shape[0]):
+ xd = d[n_d, 0::3]
+ yd = d[n_d, 1::3]
+ vd = d[n_d, 2::3]
+ dx = xd - xg
+ dy = yd - yg
+ e = (dx ** 2 + dy ** 2) / variances / ((a_g + a_d[n_d]) / 2 + np.spacing(1)) / 2
+ if in_vis_thre is not None:
+ ind = np.logical_and(vg > in_vis_thre, vd > in_vis_thre)
+ e = e[ind]
+ ious[n_d] = np.sum(np.exp(-e)) / e.shape[0] if e.shape[0] != 0 else 0.0
+ return ious
+
+
+def oks_nms(kpts_db, thresh, sigmas=None, in_vis_thre=None):
+ """
+ greedily select boxes with high confidence and overlap with current maximum <= thresh
+ rule out overlap >= thresh, overlap = oks
+ :param kpts_db
+ :param thresh: retain overlap < thresh
+ :return: indexes to keep
+ """
+ if len(kpts_db) == 0:
+ return []
+
+ scores = np.array([kpts_db[i]['score'] for i in range(len(kpts_db))])
+ kpts = np.array([kpts_db[i]['keypoints'].flatten() for i in range(len(kpts_db))])
+ areas = np.array([kpts_db[i]['area'] for i in range(len(kpts_db))])
+
+ order = scores.argsort()[::-1]
+
+ keep = []
+ while order.size > 0:
+ i = order[0]
+ keep.append(i)
+
+ oks_ovr = oks_iou(kpts[i], kpts[order[1:]], areas[i], areas[order[1:]], sigmas, in_vis_thre)
+
+ inds = np.where(oks_ovr <= thresh)[0]
+ order = order[inds + 1]
+
+ return keep
\ No newline at end of file
diff --git a/fluid/PaddleCV/human_pose_estimation/utils/transforms.py b/PaddleCV/human_pose_estimation/utils/transforms.py
similarity index 100%
rename from fluid/PaddleCV/human_pose_estimation/utils/transforms.py
rename to PaddleCV/human_pose_estimation/utils/transforms.py
diff --git a/fluid/PaddleCV/human_pose_estimation/utils/utility.py b/PaddleCV/human_pose_estimation/utils/utility.py
similarity index 100%
rename from fluid/PaddleCV/human_pose_estimation/utils/utility.py
rename to PaddleCV/human_pose_estimation/utils/utility.py
diff --git a/PaddleCV/human_pose_estimation/val.py b/PaddleCV/human_pose_estimation/val.py
new file mode 100644
index 0000000000000000000000000000000000000000..4224ec1ec50fbfec473a573ef4624350c66e2215
--- /dev/null
+++ b/PaddleCV/human_pose_estimation/val.py
@@ -0,0 +1,233 @@
+# Copyright (c) 2018-present, Baidu, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+##############################################################################
+
+"""Functions for validation."""
+
+import os
+import sys
+import argparse
+import functools
+import numpy as np
+import paddle
+import paddle.fluid as fluid
+import paddle.fluid.layers as layers
+
+from lib import pose_resnet
+from utils.transforms import flip_back
+from utils.utility import *
+from utils.evaluator_builder import create_evaluator
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+
+# yapf: disable
+add_arg('batch_size', int, 128, "Minibatch size.")
+add_arg('dataset', str, 'coco', "Dataset name, 'coco' or 'mpii'.")
+add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+add_arg('num_epochs', int, 140, "Number of epochs.")
+add_arg('total_images', int, 144406, "Training image number.")
+add_arg('kp_dim', int, 16, "Number of keypoints (set per dataset).")
+add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
+add_arg('pretrained_model', str, None, "Path to a pretrained model.")
+add_arg('checkpoint', str, None, "Checkpoint path to resume from.")
+add_arg('lr', float, 0.001, "Set learning rate.")
+add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
+add_arg('flip_test', bool, True, "Whether to average predictions over flipped inputs at test time.")
+add_arg('shift_heatmap', bool, True, "Whether to shift the flipped heatmap by one pixel for better alignment.")
+add_arg('post_process', bool, True, "Whether to post-process the predicted keypoint locations.")
+add_arg('data_root', str, "data/coco", "Root directory of dataset")
+# yapf: enable
+
+
+def print_immediately(s):
+ print(s)
+ sys.stdout.flush()
+
+
+def valid(args):
+ if args.dataset == 'coco':
+ import lib.coco_reader as reader
+ IMAGE_SIZE = [288, 384]
+ HEATMAP_SIZE = [72, 96]
+ FLIP_PAIRS = [[1, 2], [3, 4], [5, 6], [7, 8], [9, 10], [11, 12], [13, 14], [15, 16]]
+ args.kp_dim = 17
+ args.total_images = 6108
+ elif args.dataset == 'mpii':
+ import lib.mpii_reader as reader
+ IMAGE_SIZE = [384, 384]
+ HEATMAP_SIZE = [96, 96]
+ FLIP_PAIRS = [[0, 5], [1, 4], [2, 3], [10, 15], [11, 14], [12, 13]]
+ args.kp_dim = 16
+ args.total_images = 2958
+ else:
+ raise ValueError('The dataset {} is not supported yet.'.format(args.dataset))
+
+ print_arguments(args)
+
+ # Image and target
+ image = layers.data(name='image', shape=[3, IMAGE_SIZE[1], IMAGE_SIZE[0]], dtype='float32')
+ target = layers.data(name='target', shape=[args.kp_dim, HEATMAP_SIZE[1], HEATMAP_SIZE[0]], dtype='float32')
+ target_weight = layers.data(name='target_weight', shape=[args.kp_dim, 1], dtype='float32')
+ center = layers.data(name='center', shape=[2,], dtype='float32')
+ scale = layers.data(name='scale', shape=[2,], dtype='float32')
+ score = layers.data(name='score', shape=[1,], dtype='float32')
+
+ # Build model
+ model = pose_resnet.ResNet(layers=50, kps_num=args.kp_dim)
+
+ # Output
+ loss, output = model.net(input=image, target=target, target_weight=target_weight)
+
+ # Parameters from model and arguments
+ params = {}
+ params["total_images"] = args.total_images
+ params["lr"] = args.lr
+ params["num_epochs"] = args.num_epochs
+ params["learning_strategy"] = {}
+ params["learning_strategy"]["batch_size"] = args.batch_size
+ params["learning_strategy"]["name"] = args.lr_strategy
+
+ if args.with_mem_opt:
+ fluid.memory_optimize(fluid.default_main_program(),
+ skip_opt_set=[loss.name, output.name, target.name])
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ if args.pretrained_model:
+ def if_exist(var):
+ exist_flag = os.path.exists(os.path.join(args.pretrained_model, var.name))
+ if exist_flag:
+ print("Copy pretrianed weights from: %s" % var.name)
+ return exist_flag
+ fluid.io.load_vars(exe, args.pretrained_model, predicate=if_exist)
+
+ if args.checkpoint is not None:
+ fluid.io.load_persistables(exe, args.checkpoint)
+
+ # Dataloader
+ valid_reader = paddle.batch(reader.valid(), batch_size=args.batch_size)
+ feeder = fluid.DataFeeder(place=place, feed_list=[image, target, target_weight, center, scale, score])
+
+ valid_exe = fluid.ParallelExecutor(
+ use_cuda=args.use_gpu,
+ main_program=fluid.default_main_program().clone(for_test=True),
+ loss_name=loss.name)
+
+ fetch_list = [image.name, loss.name, output.name, target.name]
+
+ # For validation
+ acc = AverageMeter()
+ idx = 0
+
+ num_samples = args.total_images
+ all_preds = np.zeros((num_samples, args.kp_dim, 3), dtype=np.float32)
+ all_boxes = np.zeros((num_samples, 6))
+
+ image_path = []
+ for batch_id, meta in enumerate(valid_reader()):
+ num_images = len(meta)
+ data = meta
+ if args.dataset == 'coco':
+ for i in range(num_images):
+ image_path.append(meta[i][-1])
+ data[i] = data[i][:-1]
+
+ num_images = len(data)
+
+ centers = []
+ scales = []
+ scores = []
+ for i in range(num_images):
+ centers.append(data[i][3])
+ scales.append(data[i][4])
+ scores.append(data[i][5])
+
+ input_image, loss, out_heatmaps, target_heatmaps = valid_exe.run(
+ fetch_list=fetch_list,
+ feed=feeder.feed(data))
+
+ if args.flip_test:
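+ # Flip test: run a second forward pass on horizontally flipped inputs, flip the heatmaps back, and average them with the original predictions.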
+ # Flip all the images in the same batch
+ data_flipped = []
+ for i in range(num_images):
+ # Input, target, target_weight, c, s, score
+ data_flipped.append((
+ data[i][0][:, :, ::-1],
+ data[i][1],
+ data[i][2],
+ data[i][3],
+ data[i][4],
+ data[i][5]))
+
+ # Inference again
+ _, _, output_flipped, _ = valid_exe.run(
+ fetch_list=fetch_list,
+ feed=feeder.feed(data_flipped))
+
+ # Flip back
+ output_flipped = flip_back(output_flipped, FLIP_PAIRS)
+
+ # Feature is not aligned, shift flipped heatmap for higher accuracy
+ if args.shift_heatmap:
+ output_flipped[:, :, :, 1:] = \
+ output_flipped.copy()[:, :, :, 0:-1]
+
+ # Aggregate
+ # out_heatmaps.shape: size[b, args.kp_dim, 96, 96]
+ out_heatmaps = (out_heatmaps + output_flipped) * 0.5
+
+ loss = np.mean(np.array(loss))
+
+ # Accuracy
+ _, avg_acc, cnt, pred = accuracy(out_heatmaps, target_heatmaps)
+ acc.update(avg_acc, cnt)
+
+ # Current center, scale, score
+ centers = np.array(centers)
+ scales = np.array(scales)
+ scores = np.array(scores)
+
+ preds, maxvals = get_final_preds(
+ args, out_heatmaps, centers, scales)
+
+ all_preds[idx:idx + num_images, :, 0:2] = preds[:, :, 0:2]
+ all_preds[idx:idx + num_images, :, 2:3] = maxvals
+ # all_boxes layout: [0:2] center, [2:4] scale, [4] area (prod of scale * 200), [5] detection score
+ all_boxes[idx:idx + num_images, 0:2] = centers[:, 0:2]
+ all_boxes[idx:idx + num_images, 2:4] = scales[:, 0:2]
+ all_boxes[idx:idx + num_images, 4] = np.prod(scales*200, 1)
+ all_boxes[idx:idx + num_images, 5] = scores
+
+ idx += num_images
+
+ print_immediately('Batch [{:4d}] '
+ 'Loss = {:.5f} '
+ 'Acc = {:.5f}'.format(batch_id, loss, acc.avg))
+
+ if batch_id % 10 == 0:
+ save_batch_heatmaps(input_image, out_heatmaps, file_name='visualization@val.jpg', normalize=True)
+
+ # Evaluate
+ output_dir = './'
+ evaluator = create_evaluator(args.dataset)(args.data_root, args.kp_dim)
+ name_values, perf_indicator = evaluator.evaluate(all_preds, output_dir, all_boxes, image_path)
+ print_name_value(name_values, perf_indicator)
+
+
+if __name__ == '__main__':
+ args = parser.parse_args()
+ valid(args)
diff --git a/fluid/PaddleCV/icnet/.run_ce.sh b/PaddleCV/icnet/.run_ce.sh
similarity index 100%
rename from fluid/PaddleCV/icnet/.run_ce.sh
rename to PaddleCV/icnet/.run_ce.sh
diff --git a/PaddleCV/icnet/README.md b/PaddleCV/icnet/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..4698a41c894bb0a3d3a4d44f74df5dc2832f4d0f
--- /dev/null
+++ b/PaddleCV/icnet/README.md
@@ -0,0 +1,108 @@
+
+## Code Structure
+```
+├── network.py # Network architecture definition
+├── train.py # Training script
+├── eval.py # Evaluation script
+├── infer.py # Inference script
+├── cityscape.py # Data preprocessing script
+└── utils.py # Common utility functions
+```
+
+## Introduction
+
+Image Cascade Network (ICNet) targets real-time semantic segmentation. Compared with other approaches that compress computation, ICNet takes both speed and accuracy into account.
+The main idea of ICNet is to transform the input into multiple resolutions and process each with a sub-network of matching computational complexity, then merge the results. ICNet consists of three sub-networks: a computationally heavy network handles the low-resolution input and a lightweight network handles the high-resolution input, striking a balance between the accuracy available from high-resolution images and the efficiency of low-complexity networks.
+
+The overall network structure is shown below:
+
+Figure 1
+
+
+## Data Preparation
+
+This model uses the Cityscapes dataset; please register at the [Cityscapes website](https://www.cityscapes-dataset.com) to download it. After downloading, process the data following the instructions and tools [here](https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/preparation/createTrainIdLabelImgs.py#L3).
+The processed data layout:
+```
+data/cityscape/
+|-- gtFine
+| |-- test
+| |-- train
+| `-- val
+|-- leftImg8bit
+| |-- test
+| |-- train
+| `-- val
+|-- train.list
+`-- val.list
+```
+Here, train.list and val.list are the list files used for training and testing respectively. The first column is the input image and the second column is the annotation, separated by a space. For example:
+```
+leftImg8bit/train/stuttgart/stuttgart_000021_000019_leftImg8bit.png gtFine/train/stuttgart/stuttgart_000021_000019_gtFine_labelTrainIds.png
+leftImg8bit/train/stuttgart/stuttgart_000072_000019_leftImg8bit.png gtFine/train/stuttgart/stuttgart_000072_000019_gtFine_labelTrainIds.png
+```
+After downloading and preparing the data, update the corresponding data paths in the `cityscape.py` script.
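+
+For reference, the path constants defined at the top of this repository's `cityscape.py` are:
+```
+DATA_PATH = "./data/cityscape"
+TRAIN_LIST = DATA_PATH + "/train.list"
+TEST_LIST = DATA_PATH + "/val.list"
+```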
+
+## Training and Inference
+
+### Training
+Run the following command to start training, specifying the checkpoint save path:
+```
+python train.py --batch_size=16 --use_gpu=True --checkpoint_path="./chkpnt/"
+```
+Use the following command for more usage information:
+```
+python train.py --help
+```
+During training, the `loss` of each network branch on the training set is printed according to the user's settings, for example:
+```
+Iter[0]; train loss: 2.338; sub4_loss: 3.367; sub24_loss: 4.120; sub124_loss: 0.151
+```
+### Testing
+Run the following command to evaluate on the `Cityscape` test data:
+```
+python eval.py --model_path="./cnkpnt/100" --use_gpu=True
+```
+The model file must be specified with the `--model_path` option.
+The evaluation metric reported by the test script is mean IoU.
+
+### Inference
+Run the following command to run inference on the specified data:
+```
+python infer.py \
+--model_path="./cnkpnt/100" \
+--images_path="./data/cityscape/" \
+--images_list="./data/cityscape/infer.list"
+```
+The `--images_list` option specifies a list file in which each line is the path of an image to predict.
+Prediction results are saved to the `output` folder under the current path by default.
+
+## Experimental Results
+Figure 2 shows the training loss curve on the `CityScape` training set:
+
+Figure 2
+
+
+Training on the training set and validating on the validation set gives mean_IoU = 67.0% (67.7% in the paper).
+
+Figure 3 shows example results produced by the `infer.py` script: the first row is the original input image, the second row is the human annotation, and the third row is the output of our model.
+
+Figure 3
+
+
+## Other Information
+|Dataset | pretrained model |
+|---|---|
+|CityScape | [pretrained_model](https://paddle-icnet-models.bj.bcebos.com/model_1000.tar.gz) |
+
+## Reference
+
+- [ICNet for Real-Time Semantic Segmentation on High-Resolution Images](https://arxiv.org/abs/1704.08545)
diff --git a/fluid/PaddleCV/icnet/_ce.py b/PaddleCV/icnet/_ce.py
similarity index 100%
rename from fluid/PaddleCV/icnet/_ce.py
rename to PaddleCV/icnet/_ce.py
diff --git a/PaddleCV/icnet/cityscape.py b/PaddleCV/icnet/cityscape.py
new file mode 100644
index 0000000000000000000000000000000000000000..281658fa1e00bf38a9e7f0fa1a7ed2e9b7559539
--- /dev/null
+++ b/PaddleCV/icnet/cityscape.py
@@ -0,0 +1,249 @@
+"""Reader for Cityscape dataset.
+"""
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import cv2
+import numpy as np
+import paddle.dataset as dataset
+
+DATA_PATH = "./data/cityscape"
+TRAIN_LIST = DATA_PATH + "/train.list"
+TEST_LIST = DATA_PATH + "/val.list"
+IGNORE_LABEL = 255
+NUM_CLASSES = 19
+TRAIN_DATA_SHAPE = (3, 720, 720)
+TEST_DATA_SHAPE = (3, 1024, 2048)
+IMG_MEAN = np.array((103.939, 116.779, 123.68), dtype=np.float32)
+
+
+def train_data_shape():
+ return TRAIN_DATA_SHAPE
+
+
+def test_data_shape():
+ return TEST_DATA_SHAPE
+
+
+def num_classes():
+ return NUM_CLASSES
+
+
+class DataGenerater:
+ def __init__(self, data_list, mode="train", flip=True, scaling=True):
+ self.flip = flip
+ self.scaling = scaling
+ self.image_label = []
+ with open(data_list, 'r') as f:
+ for line in f:
+ image_file, label_file = line.strip().split(' ')
+ self.image_label.append((image_file, label_file))
+
+ def create_train_reader(self, batch_size):
+ """
+ Create a reader for train dataset.
+ """
+
+ def reader():
+ np.random.shuffle(self.image_label)
+ images = []
+ labels_sub1 = []
+ labels_sub2 = []
+ labels_sub4 = []
+ count = 0
+ for image, label in self.image_label:
+ image, label_sub1, label_sub2, label_sub4 = self.process_train_data(
+ image, label)
+ count += 1
+ images.append(image)
+ labels_sub1.append(label_sub1)
+ labels_sub2.append(label_sub2)
+ labels_sub4.append(label_sub4)
+ if count == batch_size:
+ yield self.mask(
+ np.array(images),
+ np.array(labels_sub1),
+ np.array(labels_sub2), np.array(labels_sub4))
+ images = []
+ labels_sub1 = []
+ labels_sub2 = []
+ labels_sub4 = []
+ count = 0
+ if images:
+ yield self.mask(
+ np.array(images),
+ np.array(labels_sub1),
+ np.array(labels_sub2), np.array(labels_sub4))
+
+ return reader
+
+ def create_test_reader(self):
+ """
+ Create a reader for test dataset.
+ """
+
+ def reader():
+ for image, label in self.image_label:
+ image, label = self.load(image, label)
+ image = dataset.image.to_chw(image)[np.newaxis, :]
+ label = label[np.newaxis, :, :, np.newaxis].astype("float32")
+ label_mask = np.where((label != IGNORE_LABEL).flatten())[
+ 0].astype("int32")
+ yield image, label, label_mask
+
+ return reader
+
+ def process_train_data(self, image, label):
+ """
+ Process training data.
+ """
+ image, label = self.load(image, label)
+ if self.flip:
+ image, label = self.random_flip(image, label)
+ if self.scaling:
+ image, label = self.random_scaling(image, label)
+ image, label = self.resize(image, label, out_size=TRAIN_DATA_SHAPE[1:])
+ label = label.astype("float32")
+ label_sub1 = dataset.image.to_chw(self.scale_label(label, factor=4))
+ label_sub2 = dataset.image.to_chw(self.scale_label(label, factor=8))
+ label_sub4 = dataset.image.to_chw(self.scale_label(label, factor=16))
+ image = dataset.image.to_chw(image)
+ return image, label_sub1, label_sub2, label_sub4
+
+ def load(self, image, label):
+ """
+ Load image from file.
+ """
+ image = dataset.image.load_image(
+ DATA_PATH + "/" + image, is_color=True).astype("float32")
+ image -= IMG_MEAN
+ label = dataset.image.load_image(
+ DATA_PATH + "/" + label, is_color=False).astype("float32")
+ return image, label
+
+ def random_flip(self, image, label):
+ """
+ Flip image and label randomly.
+ """
+ r = np.random.rand(1)
+ if r > 0.5:
+ image = dataset.image.left_right_flip(image, is_color=True)
+ label = dataset.image.left_right_flip(label, is_color=False)
+ return image, label
+
+ def random_scaling(self, image, label):
+ """
+ Scale image and label randomly.
+ """
+ scale = np.random.uniform(0.5, 2.0, 1)[0]
+ h_new = int(image.shape[0] * scale)
+ w_new = int(image.shape[1] * scale)
+ image = cv2.resize(image, (w_new, h_new))
+ label = cv2.resize(
+ label, (w_new, h_new), interpolation=cv2.INTER_NEAREST)
+ return image, label
+
+ def padding_as(self, image, h, w, is_color):
+ """
+ Padding image.
+ """
+ pad_h = max(image.shape[0], h) - image.shape[0]
+ pad_w = max(image.shape[1], w) - image.shape[1]
+ if is_color:
+ return np.pad(image, ((0, pad_h), (0, pad_w), (0, 0)), 'constant')
+ else:
+ return np.pad(image, ((0, pad_h), (0, pad_w)), 'constant')
+
+ def random_crop(self, im, out_shape, is_color=True):
+ h, w = im.shape[:2]
+ h_start = np.random.randint(0, h - out_shape[0] + 1)
+ w_start = np.random.randint(0, w - out_shape[1] + 1)
+ h_end, w_end = h_start + out_shape[0], w_start + out_shape[1]
+ if is_color:
+ im = im[h_start:h_end, w_start:w_end, :]
+ else:
+ im = im[h_start:h_end, w_start:w_end]
+ return im
+
+ def resize(self, image, label, out_size):
+ """
+ Resize image and label by padding or cropping.
+ """
+ ignore_label = IGNORE_LABEL
+ label = label - ignore_label
+ if len(label.shape) == 2:
+ label = label[:, :, np.newaxis]
+ combined = np.concatenate((image, label), axis=2)
+ combined = self.padding_as(
+ combined, out_size[0], out_size[1], is_color=True)
+ combined = self.random_crop(combined, out_size, is_color=True)
+ image = combined[:, :, 0:3]
+ label = combined[:, :, 3:4] + ignore_label
+ return image, label
+
+ def scale_label(self, label, factor):
+ """
+ Scale label according to factor.
+ """
+ h = label.shape[0] // factor
+ w = label.shape[1] // factor
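+ # Note: cv2.resize takes dsize as (width, height).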
+ return cv2.resize(
+ label, (w, h), interpolation=cv2.INTER_NEAREST)[:, :, np.newaxis]
+
+ def mask(self, image, label0, label1, label2):
+ """
+ Get mask for valid pixels.
+ """
+ mask_sub1 = np.where(((label0 < (NUM_CLASSES + 1)) & (
+ label0 != IGNORE_LABEL)).flatten())[0].astype("int32")
+ mask_sub2 = np.where(((label1 < (NUM_CLASSES + 1)) & (
+ label1 != IGNORE_LABEL)).flatten())[0].astype("int32")
+ mask_sub4 = np.where(((label2 < (NUM_CLASSES + 1)) & (
+ label2 != IGNORE_LABEL)).flatten())[0].astype("int32")
+ return image.astype(
+ "float32"), label0, mask_sub1, label1, mask_sub2, label2, mask_sub4
+
+
+def train(batch_size=32, flip=True, scaling=True):
+ """
+ Cityscape training set reader.
+ It returns a reader, in which each result is a batch with batch_size samples.
+
+ :param batch_size: The batch size of each batch returned by the reader.
+ :type batch_size: int
+ :param flip: Whether to flip images randomly.
+ :type flip: bool
+ :param scaling: Whether to scale images randomly.
+ :type scaling: bool
+ :return: Training reader.
+ :rtype: callable
+ """
+ reader = DataGenerater(
+ TRAIN_LIST, flip=flip, scaling=scaling).create_train_reader(batch_size)
+ return reader
+
+
+def test():
+ """
+ Cityscape validation set reader.
+ It returns a reader, in which each result is a sample.
+
+ :return: Test reader.
+ :rtype: callable
+ """
+ reader = DataGenerater(TEST_LIST).create_test_reader()
+ return reader
+
+
+def infer(image_list=TEST_LIST):
+ """
+ Infer set reader.
+ It returns a reader, in which each result is a sample.
+
+ :param image_list: The image list file in which each line is the path of an image to be inferred.
+ :type image_list: str
+ :return: Infer reader.
+ :rtype: callable
+ """
+ reader = DataGenerater(image_list).create_test_reader()
+ return reader
diff --git a/fluid/PaddleCV/icnet/data/cityscape/gtFine/train/stuttgart/stuttgart_000021_000019_gtFine_labelTrainIds.png b/PaddleCV/icnet/data/cityscape/gtFine/train/stuttgart/stuttgart_000021_000019_gtFine_labelTrainIds.png
similarity index 100%
rename from fluid/PaddleCV/icnet/data/cityscape/gtFine/train/stuttgart/stuttgart_000021_000019_gtFine_labelTrainIds.png
rename to PaddleCV/icnet/data/cityscape/gtFine/train/stuttgart/stuttgart_000021_000019_gtFine_labelTrainIds.png
diff --git a/fluid/PaddleCV/icnet/data/cityscape/leftImg8bit/train/stuttgart/stuttgart_000021_000019_leftImg8bit.png b/PaddleCV/icnet/data/cityscape/leftImg8bit/train/stuttgart/stuttgart_000021_000019_leftImg8bit.png
similarity index 100%
rename from fluid/PaddleCV/icnet/data/cityscape/leftImg8bit/train/stuttgart/stuttgart_000021_000019_leftImg8bit.png
rename to PaddleCV/icnet/data/cityscape/leftImg8bit/train/stuttgart/stuttgart_000021_000019_leftImg8bit.png
diff --git a/fluid/PaddleCV/icnet/data/cityscape/train.list b/PaddleCV/icnet/data/cityscape/train.list
similarity index 100%
rename from fluid/PaddleCV/icnet/data/cityscape/train.list
rename to PaddleCV/icnet/data/cityscape/train.list
diff --git a/fluid/PaddleCV/icnet/eval.py b/PaddleCV/icnet/eval.py
similarity index 100%
rename from fluid/PaddleCV/icnet/eval.py
rename to PaddleCV/icnet/eval.py
diff --git a/PaddleCV/icnet/icnet.py b/PaddleCV/icnet/icnet.py
new file mode 100644
index 0000000000000000000000000000000000000000..3286ce74072f0fde2b215763d50156dcd152a99c
--- /dev/null
+++ b/PaddleCV/icnet/icnet.py
@@ -0,0 +1,304 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle.fluid as fluid
+import numpy as np
+import sys
+
+
+def conv(input,
+ k_h,
+ k_w,
+ c_o,
+ s_h,
+ s_w,
+ relu=False,
+ padding="VALID",
+ biased=False,
+ name=None):
+ act = None
+ tmp = input
+ if relu:
+ act = "relu"
+ if padding == "SAME":
+ padding_h = max(k_h - s_h, 0)
+ padding_w = max(k_w - s_w, 0)
+ padding_top = padding_h // 2
+ padding_left = padding_w // 2
+ padding_bottom = padding_h - padding_top
+ padding_right = padding_w - padding_left
+ padding = [
+ 0, 0, 0, 0, padding_top, padding_bottom, padding_left, padding_right
+ ]
+ tmp = fluid.layers.pad(tmp, padding)
+ tmp = fluid.layers.conv2d(
+ tmp,
+ num_filters=c_o,
+ filter_size=[k_h, k_w],
+ stride=[s_h, s_w],
+ groups=1,
+ act=act,
+ bias_attr=biased,
+ use_cudnn=False,
+ name=name)
+ return tmp
+
+
+def atrous_conv(input,
+ k_h,
+ k_w,
+ c_o,
+ dilation,
+ relu=False,
+ padding="VALID",
+ biased=False,
+ name=None):
+ act = None
+ if relu:
+ act = "relu"
+ tmp = input
+ if padding == "SAME":
+ padding_h = max(dilation * (k_h - 1), 0)
+ padding_w = max(dilation * (k_w - 1), 0)
+ padding_top = padding_h // 2
+ padding_left = padding_w // 2
+ padding_bottom = padding_h - padding_top
+ padding_right = padding_w - padding_left
+ padding = [
+ 0, 0, 0, 0, padding_top, padding_bottom, padding_left, padding_right
+ ]
+ tmp = fluid.layers.pad(tmp, padding)
+
+ tmp = fluid.layers.conv2d(
+ tmp,
+ num_filters=c_o,
+ filter_size=[k_h, k_w],
+ dilation=dilation,
+ groups=1,
+ act=act,
+ bias_attr=biased,
+ use_cudnn=False,
+ name=name)
+ return tmp
+
+
+def zero_padding(input, padding):
+ return fluid.layers.pad(input,
+ [0, 0, 0, 0, padding, padding, padding, padding])
+
+
+def bn(input, relu=False, name=None, is_test=False):
+ act = None
+ if relu:
+ act = 'relu'
+ name = input.name.split(".")[0] + "_bn"
+ tmp = fluid.layers.batch_norm(
+ input, act=act, momentum=0.95, epsilon=1e-5, name=name)
+ return tmp
+
+
+def avg_pool(input, k_h, k_w, s_h, s_w, name=None, padding=0):
+ temp = fluid.layers.pool2d(
+ input,
+ pool_size=[k_h, k_w],
+ pool_type="avg",
+ pool_stride=[s_h, s_w],
+ pool_padding=padding,
+ name=name)
+ return temp
+
+
+def max_pool(input, k_h, k_w, s_h, s_w, name=None, padding=0):
+ temp = fluid.layers.pool2d(
+ input,
+ pool_size=[k_h, k_w],
+ pool_type="max",
+ pool_stride=[s_h, s_w],
+ pool_padding=padding,
+ name=name)
+ return temp
+
+
+def interp(input, out_shape):
+ out_shape = list(out_shape.astype("int32"))
+ return fluid.layers.resize_bilinear(input, out_shape=out_shape)
+
+
+def dilation_convs(input):
+ tmp = res_block(input, filter_num=256, padding=1, name="conv3_2")
+ tmp = res_block(tmp, filter_num=256, padding=1, name="conv3_3")
+ tmp = res_block(tmp, filter_num=256, padding=1, name="conv3_4")
+
+ tmp = proj_block(tmp, filter_num=512, padding=2, dilation=2, name="conv4_1")
+ tmp = res_block(tmp, filter_num=512, padding=2, dilation=2, name="conv4_2")
+ tmp = res_block(tmp, filter_num=512, padding=2, dilation=2, name="conv4_3")
+ tmp = res_block(tmp, filter_num=512, padding=2, dilation=2, name="conv4_4")
+ tmp = res_block(tmp, filter_num=512, padding=2, dilation=2, name="conv4_5")
+ tmp = res_block(tmp, filter_num=512, padding=2, dilation=2, name="conv4_6")
+
+ tmp = proj_block(
+ tmp, filter_num=1024, padding=4, dilation=4, name="conv5_1")
+ tmp = res_block(tmp, filter_num=1024, padding=4, dilation=4, name="conv5_2")
+ tmp = res_block(tmp, filter_num=1024, padding=4, dilation=4, name="conv5_3")
+ return tmp
+
+
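+# PSPNet-style pyramid pooling: average-pool the feature map at four scales, upsample each back via bilinear interpolation, and sum with the input.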
+def pyramid_pooling(input, input_shape):
+ shape = np.ceil(input_shape // 32).astype("int32")
+ h, w = shape
+ pool1 = avg_pool(input, h, w, h, w)
+ pool1_interp = interp(pool1, shape)
+ pool2 = avg_pool(input, h // 2, w // 2, h // 2, w // 2)
+ pool2_interp = interp(pool2, shape)
+ pool3 = avg_pool(input, h // 3, w // 3, h // 3, w // 3)
+ pool3_interp = interp(pool3, shape)
+ pool4 = avg_pool(input, h // 4, w // 4, h // 4, w // 4)
+ pool4_interp = interp(pool4, shape)
+ conv5_3_sum = input + pool4_interp + pool3_interp + pool2_interp + pool1_interp
+ return conv5_3_sum
+
+
+def shared_convs(image):
+ tmp = conv(image, 3, 3, 32, 2, 2, padding='SAME', name="conv1_1_3_3_s2")
+ tmp = bn(tmp, relu=True)
+ tmp = conv(tmp, 3, 3, 32, 1, 1, padding='SAME', name="conv1_2_3_3")
+ tmp = bn(tmp, relu=True)
+ tmp = conv(tmp, 3, 3, 64, 1, 1, padding='SAME', name="conv1_3_3_3")
+ tmp = bn(tmp, relu=True)
+ tmp = max_pool(tmp, 3, 3, 2, 2, padding=[1, 1])
+
+ tmp = proj_block(tmp, filter_num=128, padding=0, name="conv2_1")
+ tmp = res_block(tmp, filter_num=128, padding=1, name="conv2_2")
+ tmp = res_block(tmp, filter_num=128, padding=1, name="conv2_3")
+ tmp = proj_block(tmp, filter_num=256, padding=1, stride=2, name="conv3_1")
+ return tmp
+
+
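+# Bottleneck residual block: 1x1 reduce -> (atrous) 3x3 -> 1x1 increase, added to the identity shortcut.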
+def res_block(input, filter_num, padding=0, dilation=None, name=None):
+ tmp = conv(input, 1, 1, filter_num // 4, 1, 1, name=name + "_1_1_reduce")
+ tmp = bn(tmp, relu=True)
+ tmp = zero_padding(tmp, padding=padding)
+ if dilation is None:
+ tmp = conv(tmp, 3, 3, filter_num // 4, 1, 1, name=name + "_3_3")
+ else:
+ tmp = atrous_conv(
+ tmp, 3, 3, filter_num // 4, dilation, name=name + "_3_3")
+ tmp = bn(tmp, relu=True)
+ tmp = conv(tmp, 1, 1, filter_num, 1, 1, name=name + "_1_1_increase")
+ tmp = bn(tmp, relu=False)
+ tmp = input + tmp
+ tmp = fluid.layers.relu(tmp)
+ return tmp
+
+
+def proj_block(input, filter_num, padding=0, dilation=None, stride=1,
+ name=None):
+ proj = conv(
+ input, 1, 1, filter_num, stride, stride, name=name + "_1_1_proj")
+ proj_bn = bn(proj, relu=False)
+
+ tmp = conv(
+ input, 1, 1, filter_num // 4, stride, stride, name=name + "_1_1_reduce")
+ tmp = bn(tmp, relu=True)
+
+ tmp = zero_padding(tmp, padding=padding)
+ if padding == 0:
+ padding = 'SAME'
+ else:
+ padding = 'VALID'
+ if dilation is None:
+ tmp = conv(
+ tmp,
+ 3,
+ 3,
+ filter_num // 4,
+ 1,
+ 1,
+ padding=padding,
+ name=name + "_3_3")
+ else:
+ tmp = atrous_conv(
+ tmp,
+ 3,
+ 3,
+ filter_num // 4,
+ dilation,
+ padding=padding,
+ name=name + "_3_3")
+
+ tmp = bn(tmp, relu=True)
+ tmp = conv(tmp, 1, 1, filter_num, 1, 1, name=name + "_1_1_increase")
+ tmp = bn(tmp, relu=False)
+ tmp = proj_bn + tmp
+ tmp = fluid.layers.relu(tmp)
+ return tmp
+
+
+def sub_net_4(input, input_shape):
+ tmp = interp(input, out_shape=(input_shape // 32))
+ tmp = dilation_convs(tmp)
+ tmp = pyramid_pooling(tmp, input_shape)
+ tmp = conv(tmp, 1, 1, 256, 1, 1, name="conv5_4_k1")
+ tmp = bn(tmp, relu=True)
+ tmp = interp(tmp, out_shape=np.ceil(input_shape / 16))
+ return tmp
+
+
+def sub_net_2(input):
+ tmp = conv(input, 1, 1, 128, 1, 1, name="conv3_1_sub2_proj")
+ tmp = bn(tmp, relu=False)
+ return tmp
+
+
+def sub_net_1(input):
+ tmp = conv(input, 3, 3, 32, 2, 2, padding='SAME', name="conv1_sub1")
+ tmp = bn(tmp, relu=True)
+ tmp = conv(tmp, 3, 3, 32, 2, 2, padding='SAME', name="conv2_sub1")
+ tmp = bn(tmp, relu=True)
+ tmp = conv(tmp, 3, 3, 64, 2, 2, padding='SAME', name="conv3_sub1")
+ tmp = bn(tmp, relu=True)
+ tmp = conv(tmp, 1, 1, 128, 1, 1, name="conv3_sub1_proj")
+ tmp = bn(tmp, relu=False)
+ return tmp
+
+
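+# Cascade Feature Fusion (CFF): refine the coarser branch with a dilated 3x3 conv, sum it with the finer branch, apply ReLU, then upsample.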
+def CCF24(sub2_out, sub4_out, input_shape):
+ tmp = zero_padding(sub4_out, padding=2)
+ tmp = atrous_conv(tmp, 3, 3, 128, 2, name="conv_sub4")
+ tmp = bn(tmp, relu=False)
+ tmp = tmp + sub2_out
+ tmp = fluid.layers.relu(tmp)
+ tmp = interp(tmp, input_shape // 8)
+ return tmp
+
+
+def CCF124(sub1_out, sub24_out, input_shape):
+ tmp = zero_padding(sub24_out, padding=2)
+ tmp = atrous_conv(tmp, 3, 3, 128, 2, name="conv_sub2")
+ tmp = bn(tmp, relu=False)
+ tmp = tmp + sub1_out
+ tmp = fluid.layers.relu(tmp)
+ tmp = interp(tmp, input_shape // 4)
+ return tmp
+
+
+def icnet(data, num_classes, input_shape):
+ image_sub1 = data
+ image_sub2 = interp(data, out_shape=input_shape * 0.5)
+
+ s_convs = shared_convs(image_sub2)
+ sub4_out = sub_net_4(s_convs, input_shape)
+ sub2_out = sub_net_2(s_convs)
+ sub1_out = sub_net_1(image_sub1)
+
+ sub24_out = CCF24(sub2_out, sub4_out, input_shape)
+ sub124_out = CCF124(sub1_out, sub24_out, input_shape)
+
+ conv6_cls = conv(
+ sub124_out, 1, 1, num_classes, 1, 1, biased=True, name="conv6_cls")
+ sub4_out = conv(
+ sub4_out, 1, 1, num_classes, 1, 1, biased=True, name="sub4_out")
+ sub24_out = conv(
+ sub24_out, 1, 1, num_classes, 1, 1, biased=True, name="sub24_out")
+
+ return sub4_out, sub24_out, conv6_cls
diff --git a/fluid/PaddleCV/icnet/images/icnet.png b/PaddleCV/icnet/images/icnet.png
similarity index 100%
rename from fluid/PaddleCV/icnet/images/icnet.png
rename to PaddleCV/icnet/images/icnet.png
diff --git a/fluid/PaddleCV/icnet/images/result.png b/PaddleCV/icnet/images/result.png
similarity index 100%
rename from fluid/PaddleCV/icnet/images/result.png
rename to PaddleCV/icnet/images/result.png
diff --git a/fluid/PaddleCV/icnet/images/train_loss.png b/PaddleCV/icnet/images/train_loss.png
similarity index 100%
rename from fluid/PaddleCV/icnet/images/train_loss.png
rename to PaddleCV/icnet/images/train_loss.png
diff --git a/PaddleCV/icnet/infer.py b/PaddleCV/icnet/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..fddc375af223962add58cb6ddf7c0d319b318c99
--- /dev/null
+++ b/PaddleCV/icnet/infer.py
@@ -0,0 +1,135 @@
+"""Infer for ICNet model."""
+from __future__ import print_function
+import cityscape
+import argparse
+import functools
+import sys
+import os
+import cv2
+
+import paddle.fluid as fluid
+import paddle
+from icnet import icnet
+from utils import add_arguments, print_arguments, get_feeder_data
+from paddle.fluid.layers.learning_rate_scheduler import _decay_step_counter
+from paddle.fluid.initializer import init_on_cpu
+import numpy as np
+
+IMG_MEAN = np.array((103.939, 116.779, 123.68), dtype=np.float32)
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('model_path', str, None, "Model path.")
+add_arg('images_list', str, None, "List file with images to be inferred.")
+add_arg('images_path', str, None, "The images path.")
+add_arg('out_path', str, "./output", "Output path.")
+add_arg('use_gpu', bool, True, "Whether use GPU to test.")
+# yapf: enable
+
+data_shape = [3, 1024, 2048]
+num_classes = 19
+
+label_colours = [
+ [128, 64, 128], # 0 = road
+ [244, 35, 231], # 1 = sidewalk
+ [69, 69, 69], # 2 = building
+ [102, 102, 156], # 3 = wall
+ [190, 153, 153], # 4 = fence
+ [153, 153, 153], # 5 = pole
+ [250, 170, 29], # 6 = traffic light
+ [219, 219, 0], # 7 = traffic sign
+ [106, 142, 35], # 8 = vegetation
+ [152, 250, 152], # 9 = terrain
+ [69, 129, 180], # 10 = sky
+ [219, 19, 60], # 11 = person
+ [255, 0, 0], # 12 = rider
+ [0, 0, 142], # 13 = car
+ [0, 0, 69], # 14 = truck
+ [0, 60, 100], # 15 = bus
+ [0, 79, 100], # 16 = train
+ [0, 0, 230], # 17 = motorcycle
+ [119, 10, 32] # 18 = bicycle
+]
+
+
+def color(input):
+ """
+ Convert infered result to color image.
+ """
+ result = []
+ for i in input.flatten():
+ result.append(
+ [label_colours[i][2], label_colours[i][1], label_colours[i][0]])
+ result = np.array(result).reshape([input.shape[0], input.shape[1], 3])
+ return result
+
+
+def infer(args):
+ data_shape = cityscape.test_data_shape()
+ num_classes = cityscape.num_classes()
+ # define network
+ images = fluid.layers.data(name='image', shape=data_shape, dtype='float32')
+ _, _, sub124_out = icnet(images, num_classes,
+ np.array(data_shape[1:]).astype("float32"))
+ predict = fluid.layers.resize_bilinear(
+ sub124_out, out_shape=data_shape[1:3])
+ predict = fluid.layers.transpose(predict, perm=[0, 2, 3, 1])
+ predict = fluid.layers.reshape(predict, shape=[-1, num_classes])
+ _, predict = fluid.layers.topk(predict, k=1)
+ predict = fluid.layers.reshape(
+ predict,
+ shape=[data_shape[1], data_shape[2], -1]) # batch_size should be 1
+ inference_program = fluid.default_main_program().clone(for_test=True)
+ # prepare environment
+ place = fluid.CPUPlace()
+ if args.use_gpu:
+ place = fluid.CUDAPlace(0)
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+ assert os.path.exists(args.model_path)
+ fluid.io.load_params(exe, args.model_path)
+ print("loaded model from: %s" % args.model_path)
+ sys.stdout.flush()
+
+ if not os.path.isdir(args.out_path):
+ os.makedirs(args.out_path)
+
+ for line in open(args.images_list):
+ image_file = args.images_path + "/" + line.strip()
+ filename = os.path.basename(image_file)
+ image = paddle.dataset.image.load_image(
+ image_file, is_color=True).astype("float32")
+ image -= IMG_MEAN
+ img = paddle.dataset.image.to_chw(image)[np.newaxis, :]
+ image_t = fluid.core.LoDTensor()
+ image_t.set(img, place)
+ result = exe.run(inference_program,
+ feed={"image": image_t},
+ fetch_list=[predict])
+ cv2.imwrite(args.out_path + "/" + filename + "_result.png",
+ color(result[0]))
+ print("Saved images into: %s" % args.out_path)
+
+
+def main():
+ args = parser.parse_args()
+ print_arguments(args)
+ infer(args)
+
+
+if __name__ == "__main__":
+ main()
diff --git a/PaddleCV/icnet/train.py b/PaddleCV/icnet/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..07efe4c6c065644f50cfa7ce39ccea9a3673cbd4
--- /dev/null
+++ b/PaddleCV/icnet/train.py
@@ -0,0 +1,155 @@
+"""Trainer for ICNet model."""
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from icnet import icnet
+import cityscape
+import argparse
+import functools
+import sys
+import os
+import time
+import paddle.fluid as fluid
+import numpy as np
+from utils import add_arguments, print_arguments, get_feeder_data
+from paddle.fluid.layers.learning_rate_scheduler import _decay_step_counter
+from paddle.fluid.initializer import init_on_cpu
+
+if 'ce_mode' in os.environ:
+ np.random.seed(10)
+ fluid.default_startup_program().random_seed = 90
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('batch_size', int, 16, "Minibatch size.")
+add_arg('checkpoint_path', str, None, "Checkpoint save path.")
+add_arg('init_model', str, None, "Pretrained model path.")
+add_arg('use_gpu', bool, True, "Whether to use GPU for training.")
+add_arg('random_mirror', bool, True, "Whether to augment data with random mirroring.")
+add_arg('random_scaling', bool, True, "Whether to augment data with random scaling.")
+# yapf: enable
+
+LAMBDA1 = 0.16
+LAMBDA2 = 0.4
+LAMBDA3 = 1.0
+LEARNING_RATE = 0.003
+POWER = 0.9
+LOG_PERIOD = 100
+CHECKPOINT_PERIOD = 100
+TOTAL_STEP = 100
+
+no_grad_set = []
+
+
+def create_loss(predict, label, mask, num_classes):
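+ # Flatten predictions to [N*H*W, C] and labels to [N*H*W, 1], gather only the pixels selected by mask, then average the softmax cross-entropy over them.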
+ predict = fluid.layers.transpose(predict, perm=[0, 2, 3, 1])
+ predict = fluid.layers.reshape(predict, shape=[-1, num_classes])
+ label = fluid.layers.reshape(label, shape=[-1, 1])
+ predict = fluid.layers.gather(predict, mask)
+ label = fluid.layers.gather(label, mask)
+ label = fluid.layers.cast(label, dtype="int64")
+ loss = fluid.layers.softmax_with_cross_entropy(predict, label)
+ no_grad_set.append(label.name)
+ return fluid.layers.reduce_mean(loss)
+
+
+def poly_decay():
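+ # Polynomial decay: lr = LEARNING_RATE * (1 - global_step / TOTAL_STEP) ** POWER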
+ global_step = _decay_step_counter()
+ with init_on_cpu():
+ decayed_lr = LEARNING_RATE * (fluid.layers.pow(
+ (1 - global_step / TOTAL_STEP), POWER))
+ return decayed_lr
+
+
+def train(args):
+ data_shape = cityscape.train_data_shape()
+ num_classes = cityscape.num_classes()
+ # define network
+ images = fluid.layers.data(name='image', shape=data_shape, dtype='float32')
+ label_sub1 = fluid.layers.data(name='label_sub1', shape=[1], dtype='int32')
+ label_sub2 = fluid.layers.data(name='label_sub2', shape=[1], dtype='int32')
+ label_sub4 = fluid.layers.data(name='label_sub4', shape=[1], dtype='int32')
+ mask_sub1 = fluid.layers.data(name='mask_sub1', shape=[-1], dtype='int32')
+ mask_sub2 = fluid.layers.data(name='mask_sub2', shape=[-1], dtype='int32')
+ mask_sub4 = fluid.layers.data(name='mask_sub4', shape=[-1], dtype='int32')
+
+ sub4_out, sub24_out, sub124_out = icnet(
+ images, num_classes, np.array(data_shape[1:]).astype("float32"))
+ loss_sub4 = create_loss(sub4_out, label_sub4, mask_sub4, num_classes)
+ loss_sub24 = create_loss(sub24_out, label_sub2, mask_sub2, num_classes)
+ loss_sub124 = create_loss(sub124_out, label_sub1, mask_sub1, num_classes)
+ reduced_loss = LAMBDA1 * loss_sub4 + LAMBDA2 * loss_sub24 + LAMBDA3 * loss_sub124
+
+ regularizer = fluid.regularizer.L2Decay(0.0001)
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=poly_decay(), momentum=0.9, regularization=regularizer)
+ _, params_grads = optimizer.minimize(reduced_loss, no_grad_set=no_grad_set)
+
+ # prepare environment
+ place = fluid.CPUPlace()
+ if args.use_gpu:
+ place = fluid.CUDAPlace(0)
+ exe = fluid.Executor(place)
+
+ exe.run(fluid.default_startup_program())
+
+ if args.init_model is not None:
+ print("load model from: %s" % args.init_model)
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(args.init_model, var.name))
+
+ fluid.io.load_vars(exe, args.init_model, predicate=if_exist)
+
+ iter_id = 0
+ t_loss = 0.
+ sub4_loss = 0.
+ sub24_loss = 0.
+ sub124_loss = 0.
+ train_reader = cityscape.train(
+ args.batch_size, flip=args.random_mirror, scaling=args.random_scaling)
+ start_time = time.time()
+ while True:
+ # train a pass
+ for data in train_reader():
+ if iter_id > TOTAL_STEP:
+ end_time = time.time()
+ print("kpis train_duration %f" % (end_time - start_time))
+ return
+ iter_id += 1
+ results = exe.run(
+ feed=get_feeder_data(data, place),
+ fetch_list=[reduced_loss, loss_sub4, loss_sub24, loss_sub124])
+ t_loss += results[0]
+ sub4_loss += results[1]
+ sub24_loss += results[2]
+ sub124_loss += results[3]
+ # training log
+ if iter_id % LOG_PERIOD == 0:
+ print(
+ "Iter[%d]; train loss: %.3f; sub4_loss: %.3f; sub24_loss: %.3f; sub124_loss: %.3f"
+ % (iter_id, t_loss / LOG_PERIOD, sub4_loss / LOG_PERIOD,
+ sub24_loss / LOG_PERIOD, sub124_loss / LOG_PERIOD))
+ print("kpis train_cost %f" % (t_loss / LOG_PERIOD))
+
+ t_loss = 0.
+ sub4_loss = 0.
+ sub24_loss = 0.
+ sub124_loss = 0.
+ sys.stdout.flush()
+
+ if iter_id % CHECKPOINT_PERIOD == 0 and args.checkpoint_path is not None:
+ dir_name = args.checkpoint_path + "/" + str(iter_id)
+ fluid.io.save_persistables(exe, dirname=dir_name)
+ print("Saved checkpoint: %s" % (dir_name))
+
+
+def main():
+ args = parser.parse_args()
+ print_arguments(args)
+ train(args)
+
+
+if __name__ == "__main__":
+ main()
diff --git a/fluid/PaddleCV/icnet/utils.py b/PaddleCV/icnet/utils.py
similarity index 100%
rename from fluid/PaddleCV/icnet/utils.py
rename to PaddleCV/icnet/utils.py
diff --git a/fluid/PaddleCV/image_classification/.gitignore b/PaddleCV/image_classification/.gitignore
similarity index 100%
rename from fluid/PaddleCV/image_classification/.gitignore
rename to PaddleCV/image_classification/.gitignore
diff --git a/PaddleCV/image_classification/.run_ce.sh b/PaddleCV/image_classification/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..cc0d894a634bc0add12fd83840990eacf77382cc
--- /dev/null
+++ b/PaddleCV/image_classification/.run_ce.sh
@@ -0,0 +1,13 @@
+#!/bin/bash
+
+# This file is only used for continuous evaluation.
+export FLAGS_cudnn_deterministic=True
+BATCH_SIZE=56
+cudaid=${object_detection_cudaid:=0}
+export CUDA_VISIBLE_DEVICES=$cudaid
+python train.py --batch_size=${BATCH_SIZE} --num_epochs=5 --enable_ce=True --lr_strategy=cosine_decay | python _ce.py
+
+BATCH_SIZE=224
+cudaid=${object_detection_cudaid_m:=0, 1, 2, 3}
+export CUDA_VISIBLE_DEVICES=$cudaid
+python train.py --batch_size=${BATCH_SIZE} --num_epochs=5 --enable_ce=True --lr_strategy=cosine_decay | python _ce.py
diff --git a/PaddleCV/image_classification/README.md b/PaddleCV/image_classification/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..4f37d8f5b57aed073e1e9522380bb4a1e9181d61
--- /dev/null
+++ b/PaddleCV/image_classification/README.md
@@ -0,0 +1,167 @@
+# Image Classification and Model Zoo
+Image classification, an important field of computer vision, aims to classify an image into one of a set of pre-defined labels. In recent years, researchers have developed many kinds of neural networks that greatly improve classification performance. This page introduces how to do image classification with PaddlePaddle Fluid.
+
+---
+## Table of Contents
+- [Installation](#installation)
+- [Data preparation](#data-preparation)
+- [Training a model with flexible parameters](#training-a-model-with-flexible-parameters)
+- [Using Mixed-Precision Training](#using-mixed-precision-training)
+- [Finetuning](#finetuning)
+- [Evaluation](#evaluation)
+- [Inference](#inference)
+- [Supported models and performances](#supported-models-and-performances)
+
+## Installation
+
+Running the sample code in this directory requires PaddlePaddle Fluid v0.13.0 or later; the latest release version is recommended. If the PaddlePaddle on your device is lower than v0.13.0, please follow the instructions in the [installation document](http://paddlepaddle.org/documentation/docs/zh/1.3/beginners_guide/install/index_cn.html) to update it.
+
+## Data preparation
+
+An example for ImageNet classification is as follows. First of all, the ImageNet data can be prepared by running:
+```
+cd data/ILSVRC2012/
+sh download_imagenet2012.sh
+```
+
+In the shell script ```download_imagenet2012.sh```, there are three steps to prepare data:
+
+**step-1:** Register at ```image-net.org``` first in order to get a pair of ```Username``` and ```AccessKey```, which are used to download ImageNet data.
+
+**step-2:** Download the ImageNet-2012 dataset from the website. The training and validation data will be downloaded into the folders "train" and "val" respectively. Please note that the data is more than 40 GB in size, so the download will take a while. Users who have already downloaded the ImageNet data can organize it under ```data/ILSVRC2012``` directly.
+
+**step-3:** Download training and validation label files. There are two label files which contain train and validation image labels respectively:
+
+* *train_list.txt*: label file of the imagenet-2012 training set, with each line separated by ```SPACE```, like:
+```
+train/n02483708/n02483708_2436.jpeg 369
+train/n03998194/n03998194_7015.jpeg 741
+train/n04523525/n04523525_38118.jpeg 884
+...
+```
+* *val_list.txt*: label file of the imagenet-2012 validation set, with each line separated by ```SPACE```, like:
+```
+val/ILSVRC2012_val_00000001.jpeg 65
+val/ILSVRC2012_val_00000002.jpeg 970
+val/ILSVRC2012_val_00000003.jpeg 230
+...
+```
+
+You may need to modify the path in reader.py to load data correctly.
+
+## Training a model with flexible parameters
+
+After data preparation, one can start the training step by:
+
+```
+python train.py \
+ --model=SE_ResNeXt50_32x4d \
+ --batch_size=32 \
+ --total_images=1281167 \
+ --class_dim=1000 \
+ --image_shape=3,224,224 \
+ --model_save_dir=output/ \
+ --with_mem_opt=False \
+ --lr_strategy=piecewise_decay \
+ --lr=0.1
+```
+**parameter introduction:**
+* **model**: name of the model to use. Default: "SE_ResNeXt50_32x4d".
+* **num_epochs**: the number of epochs. Default: 120.
+* **batch_size**: the size of each mini-batch. Default: 256.
+* **use_gpu**: whether to use GPU or not. Default: True.
+* **total_images**: total number of images in the training set. Default: 1281167.
+* **class_dim**: the class number of the classification task. Default: 1000.
+* **image_shape**: input size of the network. Default: "3,224,224".
+* **model_save_dir**: the directory to save trained model. Default: "output".
+* **with_mem_opt**: whether to use memory optimization or not. Default: True.
+* **lr_strategy**: learning rate changing strategy. Default: "piecewise_decay".
+* **lr**: initialized learning rate. Default: 0.1.
+* **pretrained_model**: path to a pretrained model to initialize from. Default: None.
+* **checkpoint**: the checkpoint path to resume. Default: None.
+* **data_dir**: the data path. Default: "./data/ILSVRC2012".
+* **fp16**: whether to enable half precision training with fp16. Default: False.
+* **scale_loss**: scale loss for fp16. Default: 1.0.
+* **l2_decay**: L2_decay parameter. Default: 1e-4.
+* **momentum_rate**: momentum_rate. Default: 0.9.
+
+Alternatively, you can start training by running ```run.sh```.
+
+**data reader introduction:** Data readers are defined in ```reader.py``` and ```reader_cv2.py```; using the CV2 reader can improve reading speed. In the [training stage](#training-a-model-with-flexible-parameters), random crop and flipping are used, while center crop is used in the [Evaluation](#evaluation) and [Inference](#inference) stages. Supported data augmentation includes:
+* rotation
+* color jitter
+* random crop
+* center crop
+* resize
+* flipping
+
+## Using Mixed-Precision Training
+
+You may add `--fp16=1` to train using mixed precision, in which case the training process uses float16 while the output model ("master" parameters) is saved as float32. You may also need to pass `--scale_loss` to overcome accuracy issues; usually `--scale_loss=8.0` will do.
+
+Note that currently `--fp16` cannot be used together with `--with_mem_opt`, so pass `--with_mem_opt=0` to disable the memory optimization pass.
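+
+For example, a minimal invocation combining these flags (the model name is reused from the examples above; add other flags as needed) could look like:
+```
+python train.py \
+        --model=SE_ResNeXt50_32x4d \
+        --fp16=1 \
+        --scale_loss=8.0 \
+        --with_mem_opt=0
+```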
+
+## Finetuning
+
+Finetuning adapts pretrained weights to a specific task by loading them before training. After setting ```path_to_pretrain_model```, one can finetune a model as:
+```
+python train.py \
+ --model=SE_ResNeXt50_32x4d \
+ --pretrained_model=${path_to_pretrain_model} \
+ --batch_size=32 \
+ --total_images=1281167 \
+ --class_dim=1000 \
+ --image_shape=3,224,224 \
+ --model_save_dir=output/ \
+ --with_mem_opt=True \
+ --lr_strategy=piecewise_decay \
+ --lr=0.1
+```
+
+## Evaluation
+Evaluation is to evaluate the performance of a trained model. One can download [pretrained models](#supported-models-and-performances) and set its path to ```path_to_pretrain_model```. Then top1/top5 accuracy can be obtained by running the following command:
+```
+python eval.py \
+ --model=SE_ResNeXt50_32x4d \
+ --batch_size=32 \
+ --class_dim=1000 \
+ --image_shape=3,224,224 \
+ --with_mem_opt=True \
+ --pretrained_model=${path_to_pretrain_model}
+```
+
+## Inference
+Inference is used to get prediction scores or image features from trained models:
+```
+python infer.py \
+ --model=SE_ResNeXt50_32x4d \
+ --class_dim=1000 \
+ --image_shape=3,224,224 \
+ --with_mem_opt=True \
+ --pretrained_model=${path_to_pretrain_model}
+```
+
+## Supported models and performances
+
+Available top-1/top-5 validation accuracy on ImageNet 2012 is listed in the table below. Pretrained models can be downloaded by clicking the corresponding model name.
+
+- Released models: specify parameter names
+
+|model | top-1/top-5 accuracy(PIL)| top-1/top-5 accuracy(CV2) |
+|- |:-: |:-:|
+|[AlexNet](http://paddle-imagenet-models-name.bj.bcebos.com/AlexNet_pretrained.zip) | 56.71%/79.18% | 55.88%/78.65% |
+|[VGG11](https://paddle-imagenet-models-name.bj.bcebos.com/VGG11_pretrained.zip) | 69.22%/89.09% | 69.01%/88.90% |
+|[VGG13](https://paddle-imagenet-models-name.bj.bcebos.com/VGG13_pretrained.zip) | 70.14%/89.48% | 69.83%/89.13% |
+|[VGG16](https://paddle-imagenet-models-name.bj.bcebos.com/VGG16_pretrained.zip) | 72.08%/90.63% | 71.65%/90.57% |
+|[VGG19](https://paddle-imagenet-models-name.bj.bcebos.com/VGG19_pretrained.zip) | 72.56%/90.83% | 72.32%/90.98% |
+|[MobileNetV1](http://paddle-imagenet-models-name.bj.bcebos.com/MobileNetV1_pretrained.zip) | 70.91%/89.54% | 70.51%/89.35% |
+|[MobileNetV2](https://paddle-imagenet-models-name.bj.bcebos.com/MobileNetV2_pretrained.zip) | 71.90%/90.55% | 71.53%/90.41% |
+|[ResNet18](https://paddle-imagenet-models-name.bj.bcebos.com/ResNet18_pretrained.tar) | 70.85%/89.89% | 70.65%/89.89% |
+|[ResNet34](https://paddle-imagenet-models-name.bj.bcebos.com/ResNet34_pretrained.tar) | 74.41%/92.03% | 74.13%/91.97% |
+|[ResNet50](http://paddle-imagenet-models-name.bj.bcebos.com/ResNet50_pretrained.zip) | 76.35%/92.80% | 76.22%/92.92% |
+|[ResNet101](http://paddle-imagenet-models-name.bj.bcebos.com/ResNet101_pretrained.zip) | 77.49%/93.57% | 77.56%/93.64% |
+|[ResNet152](https://paddle-imagenet-models-name.bj.bcebos.com/ResNet152_pretrained.zip) | 78.12%/93.93% | 77.92%/93.87% |
+|[SE_ResNeXt50_32x4d](https://paddle-imagenet-models-name.bj.bcebos.com/SE_ResNext50_32x4d_pretrained.zip) | 78.50%/94.01% | 78.44%/93.96% |
+|[SE_ResNeXt101_32x4d](https://paddle-imagenet-models-name.bj.bcebos.com/SE_ResNeXt101_32x4d_pretrained.zip) | 79.26%/94.22% | 79.12%/94.20% |
+|[GoogleNet](https://paddle-imagenet-models-name.bj.bcebos.com/GoogleNet_pretrained.tar) | 70.50%/89.59% | 70.27%/89.58% |
+|[ShuffleNetV2](https://paddle-imagenet-models-name.bj.bcebos.com/ShuffleNet_pretrained.tar) | | 69.48%/88.99% |
diff --git a/PaddleCV/image_classification/README_cn.md b/PaddleCV/image_classification/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..34b0ac158e16616177957f45c4c89fc600008e9d
--- /dev/null
+++ b/PaddleCV/image_classification/README_cn.md
@@ -0,0 +1,163 @@
+# Image Classification and Model Zoo
+Image classification is an important field of computer vision; its goal is to classify images into predefined labels. Recently, many researchers have proposed different kinds of neural networks that greatly improve classification performance. This page describes how to perform image classification with PaddlePaddle.
+
+---
+## Table of Contents
+- [Installation](#installation)
+- [Data preparation](#data-preparation)
+- [Training a model](#training-a-model)
+- [Mixed-precision training](#mixed-precision-training)
+- [Finetuning](#finetuning)
+- [Evaluation](#evaluation)
+- [Inference](#inference)
+- [Supported models and performances](#supported-models-and-performances)
+
+## Installation
+
+Running the sample code in this directory requires PaddlePaddle Fluid v0.13.0 or later. If the PaddlePaddle version in your runtime environment is lower than this, please update it following the instructions in the [installation document](http://paddlepaddle.org/documentation/docs/zh/1.3/beginners_guide/install/index_cn.html).
+
+## Data preparation
+
+An example for the ImageNet classification task is given below. First, prepare the data as follows:
+```
+cd data/ILSVRC2012/
+sh download_imagenet2012.sh
+```
+The ```download_imagenet2012.sh``` script prepares the data in three steps:
+
+**Step 1:** Register on the ```image-net.org``` website to obtain a pair of ```Username``` and ```AccessKey```.
+
+**Step 2:** Download the ImageNet-2012 image data from the ImageNet official website. The training and validation sets will be downloaded into the "train" and "val" directories respectively. Note that the ImageNet data is larger than 40GB and downloading is very time-consuming; users who have already downloaded ImageNet can place the data directly under ```data/ILSVRC2012```.
+
+**Step 3:** Download the label files for the training and validation sets. The following two files contain the labels of the training and validation images respectively:
+
+* *train_list.txt*: label file of the ImageNet-2012 training set, with the image path and label on each line separated by a space, for example:
+```
+train/n02483708/n02483708_2436.jpeg 369
+train/n03998194/n03998194_7015.jpeg 741
+train/n04523525/n04523525_38118.jpeg 884
+...
+```
+* *val_list.txt*: label file of the ImageNet-2012 validation set, with the image path and label on each line separated by a space, for example:
+```
+val/ILSVRC2012_val_00000001.jpeg 65
+val/ILSVRC2012_val_00000002.jpeg 970
+val/ILSVRC2012_val_00000003.jpeg 230
+...
+```
+Note: you may need to adjust the paths in reader.py according to your local environment to read the data correctly.
+
+## Training a model
+
+After data preparation, training can be started as follows:
+```
+python train.py \
+ --model=SE_ResNeXt50_32x4d \
+ --batch_size=32 \
+ --total_images=1281167 \
+ --class_dim=1000 \
+ --image_shape=3,224,224 \
+ --model_save_dir=output/ \
+ --with_mem_opt=False \
+ --lr_strategy=piecewise_decay \
+ --lr=0.1
+```
+**Parameter introduction:**
+* **model**: model name. Default: "SE_ResNeXt50_32x4d".
+* **num_epochs**: number of training epochs. Default: 120.
+* **batch_size**: mini-batch size. Default: 256.
+* **use_gpu**: whether to run on GPU. Default: True.
+* **total_images**: total number of training images; for ImageNet-2012 the default is 1281167.
+* **class_dim**: number of classes. Default: 1000.
+* **image_shape**: input image size. Default: "3,224,224".
+* **model_save_dir**: directory to save the model. Default: "output/".
+* **with_mem_opt**: whether to enable memory optimization. Default: False.
+* **lr_strategy**: learning rate decay strategy. Default: "piecewise_decay".
+* **lr**: initial learning rate. Default: 0.1.
+* **pretrained_model**: path of the pretrained model. Default: None.
+* **checkpoint**: checkpoint path to resume training from (a concrete model directory such as "output/SE_ResNeXt50_32x4d/100/"). Default: None.
+* **fp16**: whether to enable mixed-precision training. Default: False.
+* **scale_loss**: loss scale value for mixed-precision training. Default: 1.0.
+* **l2_decay**: L2 weight decay value. Default: 1e-4.
+* **momentum_rate**: momentum rate. Default: 0.9.
+
+A training script is provided in ```run.sh```.
+
+**Data reader introduction:** Data readers are defined in ```reader.py``` and ```reader_cv2.py```. In general, the CV2 reader is faster while the PIL reader yields relatively higher accuracy; the PIL-based reader is currently the default. In the [training stage](#training-a-model), the default augmentations are random crop and horizontal flip, while the default in the [evaluation](#evaluation) and [inference](#inference) stages is center crop. Currently supported data augmentations are:
+* rotation
+* color jitter
+* random crop
+* center crop
+* resize
+* horizontal flip
+
+## Mixed-precision training
+
+Mixed-precision training can be enabled with `--fp16=True`, in which case float16 data is used during training while the output model parameters (the "master" parameters) are saved as float32. You may also need to pass `--scale_loss` to address fp16 accuracy issues; `--scale_loss=8.0` is usually enough.
+
+Note that mixed-precision training currently cannot be used together with memory optimization, so pass `--with_mem_opt=False` to disable the memory optimization feature.
+
+## Finetuning
+
+Finetuning adapts the weights of a trained model to a specific task. After setting ```path_to_pretrain_model```, a model can be finetuned with:
+```
+python train.py \
+ --model=SE_ResNeXt50_32x4d \
+ --pretrained_model=${path_to_pretrain_model} \
+ --batch_size=32 \
+ --total_images=1281167 \
+ --class_dim=1000 \
+ --image_shape=3,224,224 \
+ --model_save_dir=output/ \
+ --with_mem_opt=True \
+ --lr_strategy=piecewise_decay \
+ --lr=0.1
+```
+
+## Evaluation
+Evaluation measures the performance of a trained model. You can download one of the [supported models](#supported-models-and-performances) and set ```path_to_pretrain_model``` to its path. The top-1/top-5 accuracy can then be obtained by running:
+```
+python eval.py \
+ --model=SE_ResNeXt50_32x4d \
+ --batch_size=32 \
+ --class_dim=1000 \
+ --image_shape=3,224,224 \
+ --with_mem_opt=True \
+ --pretrained_model=${path_to_pretrain_model}
+```
+
+## Inference
+Inference produces prediction scores or image features from a trained model:
+```
+python infer.py \
+ --model=SE_ResNeXt50_32x4d \
+ --class_dim=1000 \
+ --image_shape=3,224,224 \
+ --with_mem_opt=True \
+ --pretrained_model=${path_to_pretrain_model}
+```
+
+## Supported models and performances
+The table below lists the image classification models supported under the ```models``` directory, together with the top-1/top-5 accuracy of the trained models on the ImageNet-2012 validation set. The corresponding pretrained model can be downloaded by clicking the model name.
+
+- Released models:
+
+|model | top-1/top-5 accuracy(PIL)| top-1/top-5 accuracy(CV2) |
+|- |:-: |:-:|
+|[AlexNet](http://paddle-imagenet-models-name.bj.bcebos.com/AlexNet_pretrained.zip) | 56.71%/79.18% | 55.88%/78.65% |
+|[VGG11](https://paddle-imagenet-models-name.bj.bcebos.com/VGG11_pretrained.zip) | 69.22%/89.09% | 69.01%/88.90% |
+|[VGG13](https://paddle-imagenet-models-name.bj.bcebos.com/VGG13_pretrained.zip) | 70.14%/89.48% | 69.83%/89.13% |
+|[VGG16](https://paddle-imagenet-models-name.bj.bcebos.com/VGG16_pretrained.zip) | 72.08%/90.63% | 71.65%/90.57% |
+|[VGG19](https://paddle-imagenet-models-name.bj.bcebos.com/VGG19_pretrained.zip) | 72.56%/90.83% | 72.32%/90.98% |
+|[MobileNetV1](http://paddle-imagenet-models-name.bj.bcebos.com/MobileNetV1_pretrained.zip) | 70.91%/89.54% | 70.51%/89.35% |
+|[MobileNetV2](https://paddle-imagenet-models-name.bj.bcebos.com/MobileNetV2_pretrained.zip) | 71.90%/90.55% | 71.53%/90.41% |
+|[ResNet18](https://paddle-imagenet-models-name.bj.bcebos.com/ResNet18_pretrained.tar) | 70.85%/89.89% | 70.65%/89.89% |
+|[ResNet34](https://paddle-imagenet-models-name.bj.bcebos.com/ResNet34_pretrained.tar) | 74.41%/92.03% | 74.13%/91.97% |
+|[ResNet50](http://paddle-imagenet-models-name.bj.bcebos.com/ResNet50_pretrained.zip) | 76.35%/92.80% | 76.22%/92.92% |
+|[ResNet101](http://paddle-imagenet-models-name.bj.bcebos.com/ResNet101_pretrained.zip) | 77.49%/93.57% | 77.56%/93.64% |
+|[ResNet152](https://paddle-imagenet-models-name.bj.bcebos.com/ResNet152_pretrained.zip) | 78.12%/93.93% | 77.92%/93.87% |
+|[SE_ResNeXt50_32x4d](https://paddle-imagenet-models-name.bj.bcebos.com/SE_ResNext50_32x4d_pretrained.zip) | 78.50%/94.01% | 78.44%/93.96% |
+|[SE_ResNeXt101_32x4d](https://paddle-imagenet-models-name.bj.bcebos.com/SE_ResNeXt101_32x4d_pretrained.zip) | 79.26%/94.22% | 79.12%/94.20% |
+|[GoogleNet](https://paddle-imagenet-models-name.bj.bcebos.com/GoogleNet_pretrained.tar) | 70.50%/89.59% | 70.27%/89.58% |
+|[ShuffleNetV2](https://paddle-imagenet-models-name.bj.bcebos.com/ShuffleNet_pretrained.tar) | | 69.48%/88.99% |
diff --git a/PaddleCV/image_classification/README_ngraph.md b/PaddleCV/image_classification/README_ngraph.md
new file mode 100644
index 0000000000000000000000000000000000000000..bb8190758d876244df931a090134f8410b6d38b3
--- /dev/null
+++ b/PaddleCV/image_classification/README_ngraph.md
@@ -0,0 +1,43 @@
+
+# PaddlePaddle inference and training script
+This directory contains the configuration and instructions to run PaddlePaddle + nGraph for local training and inference.
+
+# How to build PaddlePaddle framework with nGraph engine
+In order to build the PaddlePaddle + nGraph engine and run the proper script, follow these steps:
+1. Install the PaddlePaddle project
+2. Set environment exports for nGraph and OpenMP
+3. Run the inference/training script
+
+Currently supported models:
+* ResNet50 (inference and training).
+
+Only the Adam optimizer is supported so far.
+
+A short description of the aforementioned steps:
+
+## 1. Install PaddlePaddle
+Follow the PaddlePaddle [installation instruction](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification#installation) to install PaddlePaddle. If you [build from source](https://github.com/PaddlePaddle/FluidDoc/blob/develop/doc/fluid/beginners_guide/install/compile/compile_Ubuntu_en.md), please use the following cmake arguments and make sure `-DWITH_NGRAPH=ON` is set.
+```
+cmake .. -DCMAKE_BUILD_TYPE=Release -DWITH_GPU=OFF -DWITH_MKL=ON -DWITH_MKLDNN=ON -DWITH_NGRAPH=ON
+```
+Note: MKLDNN and MKL are required.
+
+## 2. Set env exports for nGraph and OMP
+Set the following exports needed for running nGraph:
+```
+export FLAGS_use_ngraph=true
+export OMP_NUM_THREADS=
+```
+
+If multiple threads are used, you may export the following for better performance:
+```
+export KMP_AFFINITY=granularity=fine,compact,1,0
+```
+
+## 3. Run the benchmark script
+If everything built successfully, you can run the command in the ResNet50 nGraph section of the script [run.sh](https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleCV/image_classification/run.sh) to start the benchmark job locally. You will need to uncomment the `#ResNet50 nGraph` part of the script.
+
+The above runs the training job with nGraph. To run the inference job with nGraph:
+
+Please download the pre-trained ResNet50 model from the [supported models](https://github.com/PaddlePaddle/models/tree/72dcc7c1a8d5de9d19fbd65b4143bd0d661eee2c/fluid/PaddleCV/image_classification#supported-models-and-performances) list for the inference script.
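+
+A hypothetical invocation, assuming the pre-trained weights were extracted to `ResNet50_pretrained/` and using the `eval.py` script from the parent directory (the exact command in `run.sh` may differ):
+```
+FLAGS_use_ngraph=true python eval.py \
+       --model=ResNet50 \
+       --use_gpu=False \
+       --pretrained_model=ResNet50_pretrained
+```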
+
diff --git a/fluid/PaddleCV/human_pose_estimation/lib/__init__.py b/PaddleCV/image_classification/__init__.py
similarity index 100%
rename from fluid/PaddleCV/human_pose_estimation/lib/__init__.py
rename to PaddleCV/image_classification/__init__.py
diff --git a/fluid/PaddleCV/image_classification/_ce.py b/PaddleCV/image_classification/_ce.py
similarity index 100%
rename from fluid/PaddleCV/image_classification/_ce.py
rename to PaddleCV/image_classification/_ce.py
diff --git a/fluid/PaddleCV/image_classification/data/ILSVRC2012/download_imagenet2012.sh b/PaddleCV/image_classification/data/ILSVRC2012/download_imagenet2012.sh
similarity index 100%
rename from fluid/PaddleCV/image_classification/data/ILSVRC2012/download_imagenet2012.sh
rename to PaddleCV/image_classification/data/ILSVRC2012/download_imagenet2012.sh
diff --git a/PaddleCV/image_classification/dist_train/README.md b/PaddleCV/image_classification/dist_train/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..8875429146045e60457246fbc9b0a34d874d8855
--- /dev/null
+++ b/PaddleCV/image_classification/dist_train/README.md
@@ -0,0 +1,110 @@
+# Distributed Image Classification Models Training
+
+This folder contains implementations of **Image Classification Models**, designed to support
+large-scale distributed training with two distributed modes: parameter server mode and NCCL2 (NVIDIA NCCL2 communication library) collective mode.
+
+## Getting Started
+
+Before getting started, please make sure you have gone through the ImageNet [Data Preparation](../README.md#data-preparation).
+
+1. The entrypoint file is `dist_train.py`; its command-line arguments are almost the same as those of the original `train.py`, with the following arguments specific to distributed training.
+
+ - `update_method`, specify the update method, can choose from local, pserver or nccl2.
+ - `multi_batch_repeat`, set this greater than 1 to merge batches before pushing gradients to pservers.
+ - `start_test_pass`, when to start running tests.
+ - `num_threads`, how many threads will be used for ParallelExecutor.
+    - `split_var`, in pserver mode, whether to split one parameter onto several pservers, default: True.
+    - `async_mode`, do async training, default: False.
+ - `reduce_strategy`, choose from "reduce", "allreduce".
+
+    You can check out more details of these flags by running `python dist_train.py --help`.
+
+1. Runtime configurations
+
+    We use environment variables to distinguish the training roles of a distributed training job (see the example below).
+
+ - General envs:
+        - `PADDLE_TRAINER_ID`, the unique trainer ID of a job, in the range [0, PADDLE_TRAINERS).
+ - `PADDLE_TRAINERS_NUM`, the trainer count of a distributed job.
+ - `PADDLE_CURRENT_ENDPOINT`, current process endpoint.
+ - Pserver mode:
+ - `PADDLE_TRAINING_ROLE`, the current training role, should be in [PSERVER, TRAINER].
+ - `PADDLE_PSERVER_ENDPOINTS`, the parameter server endpoint list, separated by ",".
+ - NCCL2 mode:
+ - `PADDLE_TRAINER_ENDPOINTS`, endpoint list for each worker, separated by ",".
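+
+    For example, a minimal environment for worker 0 of a hypothetical two-trainer NCCL2 job on one machine (mirroring `run_nccl2_mode.sh`) might look like:
+
+    ```
+    export PADDLE_TRAINING_ROLE=TRAINER
+    export PADDLE_TRAINER_ID=0
+    export PADDLE_TRAINERS_NUM=2
+    export PADDLE_TRAINER_ENDPOINTS=127.0.0.1:7160,127.0.0.1:7161
+    export PADDLE_CURRENT_ENDPOINT=127.0.0.1:7160
+    ```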
+
+### Try Out Different Distributed Training Modes
+
+You can test if distributed training works on a single node before deploying to the "real" cluster.
+
+***NOTE: for best performance, we recommend using the multi-process mode (see mode 4 below), preferably together with fp16.***
+
+***NOTE: for nccl2 distributed mode, you must ensure each node trains the same number of samples, or set `skip_unbalanced_data` to 1 to do sync training.***
+
+1. Simply run `python dist_train.py` to start local training with the default configuration.
+2. For pserver mode, run `bash run_ps_mode.sh` to start 2 pservers and 2 trainers; these 2 trainers
+   will use GPU 0 and 1 to simulate 2 workers.
+3. For nccl2 mode, run `bash run_nccl2_mode.sh` to start 2 workers.
+4. For local/distributed multi-process mode, run `run_mp_mode.sh` (this test uses 4 GPUs).
+
+### Visualize the Training Process
+
+It's easy to draw the learning curve from the training logs; for example,
+the logs of ResNet50 look as follows:
+
+``` text
+Pass 0, batch 30, loss 7.569439, acc1: 0.0125, acc5: 0.0125, avg batch time 0.1720
+Pass 0, batch 60, loss 7.027379, acc1: 0.0, acc5: 0.0, avg batch time 0.1551
+Pass 0, batch 90, loss 6.819984, acc1: 0.0, acc5: 0.0125, avg batch time 0.1492
+Pass 0, batch 120, loss 6.9076853, acc1: 0.0, acc5: 0.0125, avg batch time 0.1464
+```
+
+The figure below shows the top-1 training accuracy for local training with 8 GPUs, distributed training
+with 32 GPUs, and distributed training with the batch-merge feature turned on. Note that the
+red curve is trained with the original model configuration, which does not have the warmup and some detailed
+modifications.
+
+For distributed training with 32 GPUs using `--model DistResnet` we can achieve a test accuracy of 75.5% after
+90 passes of training (the test accuracy is not shown in the figure below). We can also achieve this result
+with higher throughput using the "batch merge" feature by setting `--multi_batch_repeat 4`.
+
+*Training top-1 accuracy curves*
+
+
+### Finetuning for Distributed Training
+
+The default resnet50 distributed training config is based on this paper: https://arxiv.org/pdf/1706.02677.pdf
+
+- use `--model DistResnet`
+- we use 32 P40 GPUs on 4 nodes, each node with 8 GPUs
+- we set `batch_size=32` for each GPU; in the `batch_merge=on` case, we repeat 4 times before communicating with the pserver.
+- the learning rate starts from 0.1 and warms up to 0.4 in 5 passes (because we already have gradient merging,
+  we only need to scale linearly with the trainer count) using 4 nodes; a sketch of this schedule follows this list.
+- using batch_merge (`--multi_batch_repeat 4`) can make better use of GPU computing power and increase the
+  total training throughput. Because in the fine-tune configuration we have to use `batch_size=32` per GPU,
+  and recent GPUs are so fast that the communication between nodes may bound the total speed, in batch_merge mode
+  we run several batches of forward and backward computation, then merge the gradients and send them to the pserver for
+  optimization. We use different batch norm mean and variance variables in each repeat so that adding repeats
+  behaves the same as adding more GPUs.
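+
+A minimal sketch of the warmup-then-piecewise schedule described above, in plain Python (the boundaries mirror the pass-30/60/80 decay used by `dist_train.py`; the numbers are illustrative):
+
+``` python
+def lr_at_step(step, steps_per_pass, start_lr=0.1, end_lr=0.4):
+    # Linear warmup from start_lr to end_lr over the first 5 passes.
+    warmup_steps = 5 * steps_per_pass
+    if step < warmup_steps:
+        return start_lr + (end_lr - start_lr) * step / warmup_steps
+    # Afterwards, piecewise decay: multiply by 0.1 at passes 30, 60 and 80.
+    lr = end_lr
+    for boundary in (30 * steps_per_pass, 60 * steps_per_pass, 80 * steps_per_pass):
+        if step >= boundary:
+            lr *= 0.1
+    return lr
+```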
+
+
+### Performance
+
+The figure below shows Fluid distributed training performance. We ran these experiments on a 4-node V100 GPU cluster,
+each node with 8 V100 GPU cards, 32 GPUs in total. All modes can reach the "state of the art" (choose the loss scale carefully when using fp16 mode) of the ResNet50 model on the ImageNet dataset. The Y axis in the figure shows
+the images/s while the X axis shows the number of GPUs.
+
+*Performance of Multiple-GPU Training of Resnet50 on Imagenet*
+
+
+The second figure shows the speed-ups achieved when using multiple GPUs, derived from the figure above.
+
+*Speed-ups of Multiple-GPU Training of Resnet50 on Imagenet*
+
+
diff --git a/fluid/PaddleCV/human_pose_estimation/utils/__init__.py b/PaddleCV/image_classification/dist_train/__init__.py
similarity index 100%
rename from fluid/PaddleCV/human_pose_estimation/utils/__init__.py
rename to PaddleCV/image_classification/dist_train/__init__.py
diff --git a/PaddleCV/image_classification/dist_train/batch_merge.py b/PaddleCV/image_classification/dist_train/batch_merge.py
new file mode 100644
index 0000000000000000000000000000000000000000..ee365de38494723a7d7542bd162169c4211ee63e
--- /dev/null
+++ b/PaddleCV/image_classification/dist_train/batch_merge.py
@@ -0,0 +1,43 @@
+import paddle.fluid as fluid
+import numpy as np
+
+def copyback_repeat_bn_params(main_prog):
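+    # Copy the batch-norm Mean/Variance values accumulated in the
+    # "<name>.repeat.0" variables (created by the batch-merge pass)
+    # back into the original variables before running the test program.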
+ repeat_vars = set()
+ for op in main_prog.global_block().ops:
+ if op.type == "batch_norm":
+ repeat_vars.add(op.input("Mean")[0])
+ repeat_vars.add(op.input("Variance")[0])
+ for vname in repeat_vars:
+ real_var = fluid.global_scope().find_var("%s.repeat.0" % vname).get_tensor()
+ orig_var = fluid.global_scope().find_var(vname).get_tensor()
+ orig_var.set(np.array(real_var), fluid.CUDAPlace(0)) # test on GPU0
+
+def append_bn_repeat_init_op(main_prog, startup_prog, num_repeats):
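+    # For every batch-norm Mean/Variance variable, create "<name>.repeat.<i>"
+    # copies (one per merged batch) and append fill_constant ops to the
+    # startup program so each repeat starts from the same initial value.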
+ repeat_vars = set()
+ for op in main_prog.global_block().ops:
+ if op.type == "batch_norm":
+ repeat_vars.add(op.input("Mean")[0])
+ repeat_vars.add(op.input("Variance")[0])
+
+ for i in range(num_repeats):
+ for op in startup_prog.global_block().ops:
+ if op.type == "fill_constant":
+ for oname in op.output_arg_names:
+ if oname in repeat_vars:
+ var = startup_prog.global_block().var(oname)
+ repeat_var_name = "%s.repeat.%d" % (oname, i)
+ repeat_var = startup_prog.global_block().create_var(
+ name=repeat_var_name,
+ type=var.type,
+ dtype=var.dtype,
+ shape=var.shape,
+ persistable=var.persistable
+ )
+ main_prog.global_block()._clone_variable(repeat_var)
+ startup_prog.global_block().append_op(
+ type="fill_constant",
+ inputs={},
+ outputs={"Out": repeat_var},
+ attrs=op.all_attrs()
+ )
+
diff --git a/PaddleCV/image_classification/dist_train/dist_train.py b/PaddleCV/image_classification/dist_train/dist_train.py
new file mode 100644
index 0000000000000000000000000000000000000000..9d053916e0a2844170d782b65f1217c864ba27d9
--- /dev/null
+++ b/PaddleCV/image_classification/dist_train/dist_train.py
@@ -0,0 +1,374 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import argparse
+import time
+import os
+import traceback
+import functools
+import subprocess
+
+import numpy as np
+
+import paddle
+import paddle.fluid as fluid
+import paddle.fluid.core as core
+import six
+import sys
+sys.path.append("..")
+import models
+import utils
+from reader import train, val
+from utility import add_arguments, print_arguments
+from batch_merge import copyback_repeat_bn_params, append_bn_repeat_init_op
+from dist_utils import pserver_prepare, nccl2_prepare
+from env import dist_env
+
+def parse_args():
+ parser = argparse.ArgumentParser(description=__doc__)
+ add_arg = functools.partial(add_arguments, argparser=parser)
+ # yapf: disable
+ add_arg('batch_size', int, 256, "Minibatch size.")
+ add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+ add_arg('total_images', int, 1281167, "Training image number.")
+ add_arg('num_epochs', int, 120, "number of epochs.")
+ add_arg('class_dim', int, 1000, "Class number.")
+ add_arg('image_shape', str, "3,224,224", "input image size")
+ add_arg('model_save_dir', str, "output", "model save directory")
+ add_arg('with_mem_opt', bool, False, "Whether to use memory optimization or not.")
+ add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
+ add_arg('checkpoint', str, None, "Whether to resume checkpoint.")
+ add_arg('lr', float, 0.1, "set learning rate.")
+ add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
+ add_arg('model', str, "DistResNet", "Set the network to use.")
+ add_arg('enable_ce', bool, False, "If set True, enable continuous evaluation job.")
+ add_arg('data_dir', str, "./data/ILSVRC2012", "The ImageNet dataset root dir.")
+ add_arg('model_category', str, "models", "Whether to use models_name or not, valid value:'models','models_name'" )
+ add_arg('fp16', bool, False, "Enable half precision training with fp16." )
+ add_arg('scale_loss', float, 1.0, "Scale loss for fp16." )
+ add_arg('reduce_master_grad', bool, False, "Whether to allreduce fp32 gradients." )
+ # for distributed
+ add_arg('update_method', str, "local", "Can be local, pserver, nccl2.")
+ add_arg('multi_batch_repeat', int, 1, "Batch merge repeats.")
+ add_arg('start_test_pass', int, 0, "Start test after x passes.")
+ add_arg('num_threads', int, 8, "Use num_threads to run the fluid program.")
+ add_arg('split_var', bool, True, "Split params on pserver.")
+ add_arg('async_mode', bool, False, "Async distributed training, only for pserver mode.")
+ add_arg('reduce_strategy', str, "allreduce", "Choose from reduce or allreduce.")
+    add_arg('skip_unbalanced_data', bool, False,   "Skip the remaining data in a pass if the data is not balanced across nodes.")
+    add_arg('enable_sequential_execution', bool, False,   "Enable sequential execution of ops in the fluid program.")
+ # yapf: enable
+ args = parser.parse_args()
+ return args
+
+def get_device_num():
+ if os.getenv("CPU_NUM"):
+ return int(os.getenv("CPU_NUM"))
+ visible_device = os.getenv('CUDA_VISIBLE_DEVICES')
+ if visible_device:
+ device_num = len(visible_device.split(','))
+ else:
+ device_num = subprocess.check_output(['nvidia-smi', '-L']).decode().count('\n')
+ return device_num
+
+def prepare_reader(is_train, pyreader, args, pass_id=0):
+ if is_train:
+ reader = train(data_dir=args.data_dir, pass_id_as_seed=pass_id)
+ else:
+ reader = val(data_dir=args.data_dir)
+ if is_train:
+        bs = args.batch_size // get_device_num()  # per-device batch size must be an integer
+ else:
+ bs = 16
+ pyreader.decorate_paddle_reader(
+ paddle.batch(
+ reader,
+ batch_size=bs))
+
+def build_program(is_train, main_prog, startup_prog, args):
+ pyreader = None
+ class_dim = args.class_dim
+ image_shape = [int(m) for m in args.image_shape.split(",")]
+
+ trainer_count = args.dist_env["num_trainers"]
+ device_num_per_worker = get_device_num()
+ with fluid.program_guard(main_prog, startup_prog):
+ pyreader = fluid.layers.py_reader(
+ capacity=16,
+ shapes=([-1] + image_shape, (-1, 1)),
+ dtypes=('float32', 'int64'),
+ name="train_reader" if is_train else "test_reader",
+ use_double_buffer=True)
+ with fluid.unique_name.guard():
+ image, label = fluid.layers.read_file(pyreader)
+ if args.fp16:
+ image = fluid.layers.cast(image, "float16")
+ model_def = models.__dict__[args.model](layers=50, is_train=is_train)
+ predict = model_def.net(image, class_dim=class_dim)
+ cost, pred = fluid.layers.softmax_with_cross_entropy(predict, label, return_softmax=True)
+ if args.scale_loss > 1:
+ avg_cost = fluid.layers.mean(x=cost) * float(args.scale_loss)
+ else:
+ avg_cost = fluid.layers.mean(x=cost)
+
+ batch_acc1 = fluid.layers.accuracy(input=pred, label=label, k=1)
+ batch_acc5 = fluid.layers.accuracy(input=pred, label=label, k=5)
+
+ optimizer = None
+ if is_train:
+ start_lr = args.lr
+ end_lr = args.lr * trainer_count * args.multi_batch_repeat
+ if os.getenv("FLAGS_selected_gpus"):
+ # in multi process mode, "trainer_count" will be total devices
+ # in the whole cluster, and we need to scale num_of nodes.
+ end_lr /= device_num_per_worker
+
+ total_images = args.total_images / trainer_count
+ step = int(total_images / (args.batch_size * args.multi_batch_repeat) + 1)
+ warmup_steps = step * 5 # warmup 5 passes
+ epochs = [30, 60, 80]
+ bd = [step * e for e in epochs]
+ base_lr = end_lr
+                lr = [base_lr * (0.1**i) for i in range(len(bd) + 1)]
+ print("start lr: %s, end lr: %s, decay boundaries: %s" % (
+ start_lr,
+ end_lr,
+ bd
+ ))
+
+ # NOTE: we put weight decay in layers config, and remove
+ # weight decay on bn layers, so don't add weight decay in
+ # optimizer config.
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=utils.learning_rate.lr_warmup(
+ fluid.layers.piecewise_decay(
+ boundaries=bd, values=lr),
+ warmup_steps, start_lr, end_lr),
+ momentum=0.9)
+ if args.fp16:
+ params_grads = optimizer.backward(avg_cost)
+ master_params_grads = utils.create_master_params_grads(
+ params_grads, main_prog, startup_prog, args.scale_loss,
+ reduce_master_grad = args.reduce_master_grad)
+ optimizer.apply_gradients(master_params_grads)
+ utils.master_param_to_train_param(master_params_grads, params_grads, main_prog)
+ else:
+ optimizer.minimize(avg_cost)
+
+ # prepare reader for current program
+ prepare_reader(is_train, pyreader, args)
+
+ return pyreader, avg_cost, batch_acc1, batch_acc5
+
+
+def test_single(exe, test_prog, args, pyreader, fetch_list):
+ acc1 = fluid.metrics.Accuracy()
+ acc5 = fluid.metrics.Accuracy()
+ test_losses = []
+ pyreader.start()
+ while True:
+ try:
+ acc_rets = exe.run(program=test_prog, fetch_list=fetch_list)
+ test_losses.append(acc_rets[0])
+ acc1.update(value=np.array(acc_rets[1]), weight=args.batch_size)
+ acc5.update(value=np.array(acc_rets[2]), weight=args.batch_size)
+ except fluid.core.EOFException:
+ pyreader.reset()
+ break
+ test_avg_loss = np.mean(np.array(test_losses))
+ return test_avg_loss, np.mean(acc1.eval()), np.mean(acc5.eval())
+
+def test_parallel(exe, test_prog, args, pyreader, fetch_list):
+ acc1 = fluid.metrics.Accuracy()
+ acc5 = fluid.metrics.Accuracy()
+ test_losses = []
+ pyreader.start()
+ while True:
+ try:
+ acc_rets = exe.run(fetch_list=fetch_list)
+ test_losses.append(acc_rets[0])
+ acc1.update(value=np.array(acc_rets[1]), weight=args.batch_size)
+ acc5.update(value=np.array(acc_rets[2]), weight=args.batch_size)
+ except fluid.core.EOFException:
+ pyreader.reset()
+ break
+ test_avg_loss = np.mean(np.array(test_losses))
+ return test_avg_loss, np.mean(acc1.eval()), np.mean(acc5.eval())
+
+
+def run_pserver(train_prog, startup_prog):
+ server_exe = fluid.Executor(fluid.CPUPlace())
+ server_exe.run(startup_prog)
+ server_exe.run(train_prog)
+
+def train_parallel(args):
+ train_prog = fluid.Program()
+ test_prog = fluid.Program()
+ startup_prog = fluid.Program()
+
+ train_pyreader, train_cost, train_acc1, train_acc5 = build_program(True, train_prog, startup_prog, args)
+ test_pyreader, test_cost, test_acc1, test_acc5 = build_program(False, test_prog, startup_prog, args)
+
+ if args.update_method == "pserver":
+ train_prog, startup_prog = pserver_prepare(args, train_prog, startup_prog)
+ elif args.update_method == "nccl2":
+ nccl2_prepare(args, startup_prog)
+
+ if args.dist_env["training_role"] == "PSERVER":
+ run_pserver(train_prog, startup_prog)
+ exit(0)
+
+ if args.use_gpu:
+ # NOTE: for multi process mode: one process per GPU device.
+ gpu_id = 0
+ if os.getenv("FLAGS_selected_gpus"):
+ gpu_id = int(os.getenv("FLAGS_selected_gpus"))
+ place = core.CUDAPlace(gpu_id) if args.use_gpu else core.CPUPlace()
+
+ startup_exe = fluid.Executor(place)
+ if args.multi_batch_repeat > 1:
+ append_bn_repeat_init_op(train_prog, startup_prog, args.multi_batch_repeat)
+ startup_exe.run(startup_prog)
+
+ if args.checkpoint:
+ fluid.io.load_persistables(startup_exe, args.checkpoint, main_program=train_prog)
+
+ strategy = fluid.ExecutionStrategy()
+ strategy.num_threads = args.num_threads
+ build_strategy = fluid.BuildStrategy()
+ build_strategy.enable_inplace = False
+ build_strategy.memory_optimize = False
+ build_strategy.enable_sequential_execution = bool(args.enable_sequential_execution)
+
+
+ if args.reduce_strategy == "reduce":
+ build_strategy.reduce_strategy = fluid.BuildStrategy(
+ ).ReduceStrategy.Reduce
+ else:
+ build_strategy.reduce_strategy = fluid.BuildStrategy(
+ ).ReduceStrategy.AllReduce
+
+ if args.update_method == "pserver" or args.update_method == "local":
+ # parameter server mode distributed training, merge
+ # gradients on local server, do not initialize
+ # ParallelExecutor with multi server all-reduce mode.
+ num_trainers = 1
+ trainer_id = 0
+ else:
+ num_trainers = args.dist_env["num_trainers"]
+ trainer_id = args.dist_env["trainer_id"]
+ # Set this to let build_strategy to add "allreduce_deps_pass" automatically
+ build_strategy.num_trainers = num_trainers
+ build_strategy.trainer_id = trainer_id
+
+ if args.multi_batch_repeat > 1:
+ pass_builder = build_strategy._finalize_strategy_and_create_passes()
+ mypass = pass_builder.insert_pass(
+ len(pass_builder.all_passes()) - 4, "multi_batch_merge_pass")
+ mypass.set("num_repeats", args.multi_batch_repeat)
+
+ exe = fluid.ParallelExecutor(
+ True,
+ train_cost.name,
+ main_program=train_prog,
+ exec_strategy=strategy,
+ build_strategy=build_strategy,
+ num_trainers=num_trainers,
+ trainer_id=trainer_id)
+
+ # Uncomment below lines to use ParallelExecutor to run test.
+ # test_exe = fluid.ParallelExecutor(
+ # True,
+ # main_program=test_prog,
+ # share_vars_from=exe,
+ # scope=fluid.global_scope().new_scope()
+ # )
+
+ over_all_start = time.time()
+ fetch_list = [train_cost.name, train_acc1.name, train_acc5.name]
+    steps_per_pass = args.total_images // args.batch_size // args.dist_env["num_trainers"]
+ for pass_id in range(args.num_epochs):
+ num_samples = 0
+ start_time = time.time()
+ batch_id = 1
+ # use pass_id+1 as per pass global shuffle for distributed training
+ prepare_reader(True, train_pyreader, args, pass_id + 1)
+ train_pyreader.start()
+ while True:
+ try:
+ if batch_id % 30 == 0:
+ fetch_ret = exe.run(fetch_list)
+ fetched_data = [np.mean(np.array(d)) for d in fetch_ret]
+ print("Pass [%d/%d], batch [%d/%d], loss %s, acc1: %s, acc5: %s, avg batch time %.4f" %
+ (pass_id, args.num_epochs, batch_id, steps_per_pass, fetched_data[0], fetched_data[1],
+ fetched_data[2], (time.time()-start_time) / batch_id))
+ else:
+ fetch_ret = exe.run([])
+ except fluid.core.EOFException:
+ break
+ except fluid.core.EnforceNotMet:
+ traceback.print_exc()
+ break
+ num_samples += args.batch_size
+ batch_id += 1
+ if args.skip_unbalanced_data and batch_id >= steps_per_pass:
+ break
+
+ print_train_time(start_time, time.time(), num_samples)
+ train_pyreader.reset()
+ if pass_id >= args.start_test_pass:
+ if args.multi_batch_repeat > 1:
+ copyback_repeat_bn_params(train_prog)
+ test_fetch_list = [test_cost.name, test_acc1.name, test_acc5.name]
+ test_ret = test_single(startup_exe, test_prog, args, test_pyreader,test_fetch_list)
+ # NOTE: switch to below line if you use ParallelExecutor to run test.
+ # test_ret = test_parallel(test_exe, test_prog, args, test_pyreader,test_fetch_list)
+ print("Pass: %d, Test Loss %s, test acc1: %s, test acc5: %s\n" %
+ (pass_id, test_ret[0], test_ret[1], test_ret[2]))
+ model_path = os.path.join(args.model_save_dir + '/' + args.model,
+ str(pass_id))
+ print("saving model to ", model_path)
+ if not os.path.isdir(model_path):
+ os.makedirs(model_path)
+ fluid.io.save_persistables(startup_exe, model_path, main_program=train_prog)
+ startup_exe.close()
+ print("total train time: ", time.time() - over_all_start)
+
+
+def print_train_time(start_time, end_time, num_samples):
+ train_elapsed = end_time - start_time
+ examples_per_sec = num_samples / train_elapsed
+    print('\nTotal examples: %d, total time: %.5f, %.5f examples/sec\n' %
+ (num_samples, train_elapsed, examples_per_sec))
+
+
+def print_paddle_envs():
+ print('----------- Configuration envs -----------')
+ for k in os.environ:
+ if "PADDLE_" in k:
+ print("ENV %s:%s" % (k, os.environ[k]))
+ print('------------------------------------------------')
+
+
+def main():
+ args = parse_args()
+ print_arguments(args)
+ print_paddle_envs()
+ args.dist_env = dist_env()
+ train_parallel(args)
+
+if __name__ == "__main__":
+ main()
+
diff --git a/PaddleCV/image_classification/dist_train/dist_utils.py b/PaddleCV/image_classification/dist_train/dist_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..51007273f717fe815d684aaae7c02b3d7245c4e7
--- /dev/null
+++ b/PaddleCV/image_classification/dist_train/dist_utils.py
@@ -0,0 +1,43 @@
+import os
+import paddle.fluid as fluid
+
+
+def nccl2_prepare(args, startup_prog):
+ config = fluid.DistributeTranspilerConfig()
+ config.mode = "nccl2"
+ t = fluid.DistributeTranspiler(config=config)
+
+ envs = args.dist_env
+
+ t.transpile(envs["trainer_id"],
+ trainers=','.join(envs["trainer_endpoints"]),
+ current_endpoint=envs["current_endpoint"],
+ startup_program=startup_prog)
+
+
+def pserver_prepare(args, train_prog, startup_prog):
+ config = fluid.DistributeTranspilerConfig()
+ config.slice_var_up = args.split_var
+ t = fluid.DistributeTranspiler(config=config)
+ envs = args.dist_env
+ training_role = envs["training_role"]
+
+ t.transpile(
+ envs["trainer_id"],
+ program=train_prog,
+ pservers=envs["pserver_endpoints"],
+ trainers=envs["num_trainers"],
+ sync_mode=not args.async_mode,
+ startup_program=startup_prog)
+ if training_role == "PSERVER":
+ pserver_program = t.get_pserver_program(envs["current_endpoint"])
+ pserver_startup_program = t.get_startup_program(
+ envs["current_endpoint"], pserver_program, startup_program=startup_prog)
+ return pserver_program, pserver_startup_program
+ elif training_role == "TRAINER":
+ train_program = t.get_trainer_program()
+ return train_program, startup_prog
+ else:
+ raise ValueError(
+ 'PADDLE_TRAINING_ROLE environment variable must be either TRAINER or PSERVER'
+ )
diff --git a/PaddleCV/image_classification/dist_train/env.py b/PaddleCV/image_classification/dist_train/env.py
new file mode 100644
index 0000000000000000000000000000000000000000..f85297e4d3e24322176ad25ee34366f446e18896
--- /dev/null
+++ b/PaddleCV/image_classification/dist_train/env.py
@@ -0,0 +1,33 @@
+import os
+
+
+def dist_env():
+ """
+    Return a dict of all variables that distributed training may use.
+ NOTE: you may rewrite this function to suit your cluster environments.
+ """
+ trainer_id = int(os.getenv("PADDLE_TRAINER_ID", "0"))
+ num_trainers = 1
+ training_role = os.getenv("PADDLE_TRAINING_ROLE", "TRAINER")
+ assert(training_role == "PSERVER" or training_role == "TRAINER")
+
+ # - PADDLE_TRAINER_ENDPOINTS means nccl2 mode.
+ # - PADDLE_PSERVER_ENDPOINTS means pserver mode.
+ # - PADDLE_CURRENT_ENDPOINT means current process endpoint.
+ trainer_endpoints = os.getenv("PADDLE_TRAINER_ENDPOINTS")
+ pserver_endpoints = os.getenv("PADDLE_PSERVER_ENDPOINTS")
+ current_endpoint = os.getenv("PADDLE_CURRENT_ENDPOINT")
+ if trainer_endpoints:
+ trainer_endpoints = trainer_endpoints.split(",")
+ num_trainers = len(trainer_endpoints)
+ elif pserver_endpoints:
+ num_trainers = int(os.getenv("PADDLE_TRAINERS_NUM"))
+
+ return {
+ "trainer_id": trainer_id,
+ "num_trainers": num_trainers,
+ "current_endpoint": current_endpoint,
+ "training_role": training_role,
+ "pserver_endpoints": pserver_endpoints,
+ "trainer_endpoints": trainer_endpoints
+ }
diff --git a/PaddleCV/image_classification/dist_train/run_mp_mode.sh b/PaddleCV/image_classification/dist_train/run_mp_mode.sh
new file mode 100755
index 0000000000000000000000000000000000000000..bf04e078284f02be0774209a599b839d0bbf20f5
--- /dev/null
+++ b/PaddleCV/image_classification/dist_train/run_mp_mode.sh
@@ -0,0 +1,19 @@
+#!/bin/bash
+
+# Test using 4 GPUs
+export CUDA_VISIBLE_DEVICES="0,1,2,3"
+export MODEL="DistResNet"
+export PADDLE_TRAINER_ENDPOINTS="127.0.0.1:7160,127.0.0.1:7161,127.0.0.1:7162,127.0.0.1:7163"
+# PADDLE_TRAINERS_NUM is used only by the reader in nccl2 mode
+export PADDLE_TRAINERS_NUM="4"
+
+mkdir -p logs
+
+for i in {0..3}
+do
+PADDLE_TRAINING_ROLE="TRAINER" \
+PADDLE_CURRENT_ENDPOINT="127.0.0.1:716${i}" \
+PADDLE_TRAINER_ID="${i}" \
+FLAGS_selected_gpus="${i}" \
+python dist_train.py --model $MODEL --update_method nccl2 --batch_size 32 --fp16 1 --scale_loss 8 &> logs/tr$i.log &
+done
diff --git a/PaddleCV/image_classification/dist_train/run_nccl2_mode.sh b/PaddleCV/image_classification/dist_train/run_nccl2_mode.sh
new file mode 100755
index 0000000000000000000000000000000000000000..120a96647e093de6af362bd51d8e6942249db56f
--- /dev/null
+++ b/PaddleCV/image_classification/dist_train/run_nccl2_mode.sh
@@ -0,0 +1,24 @@
+#!/bin/bash
+
+export MODEL="DistResNet"
+export PADDLE_TRAINER_ENDPOINTS="127.0.0.1:7160,127.0.0.1:7161"
+# PADDLE_TRAINERS_NUM is used only by the reader in nccl2 mode
+export PADDLE_TRAINERS_NUM="2"
+
+mkdir -p logs
+
+# NOTE: set NCCL_P2P_DISABLE so that nccl2 distributed training can run on a single node.
+
+PADDLE_TRAINING_ROLE="TRAINER" \
+PADDLE_CURRENT_ENDPOINT="127.0.0.1:7160" \
+PADDLE_TRAINER_ID="0" \
+CUDA_VISIBLE_DEVICES="0" \
+NCCL_P2P_DISABLE="1" \
+python dist_train.py --model $MODEL --update_method nccl2 --batch_size 32 &> logs/tr0.log &
+
+PADDLE_TRAINING_ROLE="TRAINER" \
+PADDLE_CURRENT_ENDPOINT="127.0.0.1:7161" \
+PADDLE_TRAINER_ID="1" \
+CUDA_VISIBLE_DEVICES="1" \
+NCCL_P2P_DISABLE="1" \
+python dist_train.py --model $MODEL --update_method nccl2 --batch_size 32 &> logs/tr1.log &
diff --git a/PaddleCV/image_classification/dist_train/run_ps_mode.sh b/PaddleCV/image_classification/dist_train/run_ps_mode.sh
new file mode 100755
index 0000000000000000000000000000000000000000..99926afbb04e0bc2795a4fd7fd8b4ff58aefec31
--- /dev/null
+++ b/PaddleCV/image_classification/dist_train/run_ps_mode.sh
@@ -0,0 +1,27 @@
+#!/bin/bash
+
+export MODEL="DistResNet"
+export PADDLE_PSERVER_ENDPOINTS="127.0.0.1:7160,127.0.0.1:7161"
+export PADDLE_TRAINERS_NUM="2"
+
+mkdir -p logs
+
+PADDLE_TRAINING_ROLE="PSERVER" \
+PADDLE_CURRENT_ENDPOINT="127.0.0.1:7160" \
+python dist_train.py --model $MODEL --update_method pserver --batch_size 32 &> logs/ps0.log &
+
+PADDLE_TRAINING_ROLE="PSERVER" \
+PADDLE_CURRENT_ENDPOINT="127.0.0.1:7161" \
+python dist_train.py --model $MODEL --update_method pserver --batch_size 32 &> logs/ps1.log &
+
+PADDLE_TRAINING_ROLE="TRAINER" \
+PADDLE_CURRENT_ENDPOINT="127.0.0.1:7160" \
+PADDLE_TRAINER_ID="0" \
+CUDA_VISIBLE_DEVICES="0" \
+python dist_train.py --model $MODEL --update_method pserver --batch_size 32 &> logs/tr0.log &
+
+PADDLE_TRAINING_ROLE="TRAINER" \
+PADDLE_CURRENT_ENDPOINT="127.0.0.1:7161" \
+PADDLE_TRAINER_ID="1" \
+CUDA_VISIBLE_DEVICES="1" \
+python dist_train.py --model $MODEL --update_method pserver --batch_size 32 &> logs/tr1.log &
diff --git a/PaddleCV/image_classification/eval.py b/PaddleCV/image_classification/eval.py
new file mode 100644
index 0000000000000000000000000000000000000000..3bcc4e696c4795a98408a9b1e0f4ff5d0303ec77
--- /dev/null
+++ b/PaddleCV/image_classification/eval.py
@@ -0,0 +1,129 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import numpy as np
+import time
+import sys
+import paddle
+import paddle.fluid as fluid
+import reader as reader
+import argparse
+import functools
+import models
+from utils.learning_rate import cosine_decay
+from utils.utility import add_arguments, print_arguments
+import math
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('batch_size', int, 256, "Minibatch size.")
+add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+add_arg('class_dim', int, 1000, "Class number.")
+add_arg('image_shape', str, "3,224,224", "Input image size")
+add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
+add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
+add_arg('model', str, "SE_ResNeXt50_32x4d", "Set the network to use.")
+
+# yapf: enable
+
+def eval(args):
+ # parameters from arguments
+ class_dim = args.class_dim
+ model_name = args.model
+ pretrained_model = args.pretrained_model
+ with_memory_optimization = args.with_mem_opt
+ image_shape = [int(m) for m in args.image_shape.split(",")]
+
+ model_list = [m for m in dir(models) if "__" not in m]
+ assert model_name in model_list, "{} is not in lists: {}".format(args.model,
+ model_list)
+
+ image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
+ label = fluid.layers.data(name='label', shape=[1], dtype='int64')
+
+ # model definition
+ model = models.__dict__[model_name]()
+
+ if model_name == "GoogleNet":
+ out0, out1, out2 = model.net(input=image, class_dim=class_dim)
+ cost0 = fluid.layers.cross_entropy(input=out0, label=label)
+ cost1 = fluid.layers.cross_entropy(input=out1, label=label)
+ cost2 = fluid.layers.cross_entropy(input=out2, label=label)
+ avg_cost0 = fluid.layers.mean(x=cost0)
+ avg_cost1 = fluid.layers.mean(x=cost1)
+ avg_cost2 = fluid.layers.mean(x=cost2)
+
+ avg_cost = avg_cost0 + 0.3 * avg_cost1 + 0.3 * avg_cost2
+ acc_top1 = fluid.layers.accuracy(input=out0, label=label, k=1)
+ acc_top5 = fluid.layers.accuracy(input=out0, label=label, k=5)
+ else:
+ out = model.net(input=image, class_dim=class_dim)
+ cost, pred = fluid.layers.softmax_with_cross_entropy(
+ out, label, return_softmax=True)
+ avg_cost = fluid.layers.mean(x=cost)
+ acc_top1 = fluid.layers.accuracy(input=pred, label=label, k=1)
+ acc_top5 = fluid.layers.accuracy(input=pred, label=label, k=5)
+
+ test_program = fluid.default_main_program().clone(for_test=True)
+
+ fetch_list = [avg_cost.name, acc_top1.name, acc_top5.name]
+ if with_memory_optimization:
+ fluid.memory_optimize(
+ fluid.default_main_program(), skip_opt_set=set(fetch_list))
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ if pretrained_model:
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(pretrained_model, var.name))
+
+ fluid.io.load_vars(exe, pretrained_model, predicate=if_exist)
+
+ val_reader = paddle.batch(reader.val(), batch_size=args.batch_size)
+ feeder = fluid.DataFeeder(place=place, feed_list=[image, label])
+
+ test_info = [[], [], []]
+ cnt = 0
+ for batch_id, data in enumerate(val_reader()):
+ t1 = time.time()
+ loss, acc1, acc5 = exe.run(test_program,
+ fetch_list=fetch_list,
+ feed=feeder.feed(data))
+ t2 = time.time()
+ period = t2 - t1
+ loss = np.mean(loss)
+ acc1 = np.mean(acc1)
+ acc5 = np.mean(acc5)
+ test_info[0].append(loss * len(data))
+ test_info[1].append(acc1 * len(data))
+ test_info[2].append(acc5 * len(data))
+ cnt += len(data)
+ if batch_id % 10 == 0:
+ print("Testbatch {0},loss {1}, "
+ "acc1 {2},acc5 {3},time {4}".format(batch_id, \
+ "%.5f"%loss,"%.5f"%acc1, "%.5f"%acc5, \
+ "%2.2f sec" % period))
+ sys.stdout.flush()
+
+ test_loss = np.sum(test_info[0]) / cnt
+ test_acc1 = np.sum(test_info[1]) / cnt
+ test_acc5 = np.sum(test_info[2]) / cnt
+
+ print("Test_loss {0}, test_acc1 {1}, test_acc5 {2}".format(
+ "%.5f"%test_loss, "%.5f"%test_acc1, "%.5f"%test_acc5))
+ sys.stdout.flush()
+
+
+def main():
+ args = parser.parse_args()
+ print_arguments(args)
+ eval(args)
+
+
+if __name__ == '__main__':
+ main()
diff --git a/PaddleCV/image_classification/fast_imagenet/README.md b/PaddleCV/image_classification/fast_imagenet/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..84805745ef4c48908ce0fd612aff6b0109ab5995
--- /dev/null
+++ b/PaddleCV/image_classification/fast_imagenet/README.md
@@ -0,0 +1,28 @@
+# PaddlePaddle Fast ImageNet Training
+
+PaddlePaddle Fast ImageNet can train the ImageNet dataset in fewer epochs. We implemented it following the blog post
+[Now anyone can train Imagenet in 18 minutes](https://www.fast.ai/2018/08/10/fastai-diu-imagenet/) published on the [fast.ai](https://www.fast.ai) website.
+PaddlePaddle Fast ImageNet uses dynamic batch sizes, dynamic image sizes, rectangular image validation, etc., so that it can reach the baseline
+(acc1: 75.9%, acc5: 93.0%) in 26 epochs on 8 V100 GPUs.
+
+## Experiment
+
+1. Prepare the training data and resize the images to 160 and 352 using `resize.py`; the prepared data folder should look like:
+ ``` text
+ `-ImageNet
+ |-train
+ |-validation
+ |-160
+ |-train
+ `-validation
+ `-352
+ |-train
+ `-validation
+ ```
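+   `resize.py` (under `tools/`) hardcodes its source and destination paths; edit `PATH` and `DEST` at the top of the script to match your layout, then run it once before training:
+   ```
+   python tools/resize.py
+   ```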
+1. Install the requirements by `pip install -r requirements.txt`.
+1. Launch the training job: `python train.py --data_dir /data/imagenet`
+1. Learning curve: we launched the training job on V100 GPU cards; the accuracy curve is shown in `src/acc_curve.png`.
+
diff --git a/PaddleCV/image_classification/fast_imagenet/requirements.txt b/PaddleCV/image_classification/fast_imagenet/requirements.txt
new file mode 100644
index 0000000000000000000000000000000000000000..5e13381c497be8ec79924595b4de747b56c3f444
--- /dev/null
+++ b/PaddleCV/image_classification/fast_imagenet/requirements.txt
@@ -0,0 +1,3 @@
+torch==0.4.1
+torchvision
+tqdm
diff --git a/PaddleCV/image_classification/fast_imagenet/src/acc_curve.png b/PaddleCV/image_classification/fast_imagenet/src/acc_curve.png
new file mode 100644
index 0000000000000000000000000000000000000000..184c4f2cbdebb6640a62ad35923af5703941ff6d
Binary files /dev/null and b/PaddleCV/image_classification/fast_imagenet/src/acc_curve.png differ
diff --git a/PaddleCV/image_classification/fast_imagenet/tools/resize.py b/PaddleCV/image_classification/fast_imagenet/tools/resize.py
new file mode 100644
index 0000000000000000000000000000000000000000..0cb7fcde3b247ec8af1cb8287e092b2c746a5701
--- /dev/null
+++ b/PaddleCV/image_classification/fast_imagenet/tools/resize.py
@@ -0,0 +1,44 @@
+from PIL import Image
+from pathlib import Path
+from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor
+from functools import partial
+import multiprocessing
+cpus = multiprocessing.cpu_count()
+cpus = min(36,cpus)
+
+
+PATH = Path('/data/imagenet2')
+DEST = Path('/data/imagenet2/sz')
+def mkdir(path):
+ if not path.exists():
+ path.mkdir()
+
+mkdir(DEST)
+szs = (160, 352)
+
+def resize_img(p, im, fn, sz):
+ w,h = im.size
+ ratio = min(h/sz,w/sz)
+ im = im.resize((int(w/ratio), int(h/ratio)), resample=Image.BICUBIC)
+ new_fn = DEST/str(sz)/fn.relative_to(PATH)
+    mkdir(new_fn.parent)  # Path.parent is a property, not a method
+ im.save(new_fn)
+
+def resizes(p, fn):
+ im = Image.open(fn)
+ for sz in szs: resize_img(p, im, fn, sz)
+
+def resize_imgs(p):
+ files = p.glob('*/*.jpeg')
+ with ProcessPoolExecutor(cpus) as e: e.map(partial(resizes, p), files)
+
+
+for sz in szs:
+ ssz=str(sz)
+ mkdir((DEST/ssz))
+    for ds in ('validation','train'): mkdir((DEST/ssz/ds))
+
+for ds in ("validation", "train"):
+ print(PATH/ds)
+ resize_imgs(PATH/ds)
\ No newline at end of file
diff --git a/PaddleCV/image_classification/fast_imagenet/tools/valprep.sh b/PaddleCV/image_classification/fast_imagenet/tools/valprep.sh
new file mode 100644
index 0000000000000000000000000000000000000000..d1fcd9dd3bd5a376af8d10630a0af42982a65e7e
--- /dev/null
+++ b/PaddleCV/image_classification/fast_imagenet/tools/valprep.sh
@@ -0,0 +1,51000 @@
+mkdir -p n01440764
+mkdir -p n01443537
+mkdir -p n01484850
+mkdir -p n01491361
+mkdir -p n01494475
+mkdir -p n01496331
+mkdir -p n01498041
+mkdir -p n01514668
+mkdir -p n01514859
+mkdir -p n01518878
+mkdir -p n01530575
+mkdir -p n01531178
+mkdir -p n01532829
+mkdir -p n01534433
+mkdir -p n01537544
+mkdir -p n01558993
+mkdir -p n01560419
+mkdir -p n01580077
+mkdir -p n01582220
+mkdir -p n01592084
+mkdir -p n01601694
+mkdir -p n01608432
+mkdir -p n01614925
+mkdir -p n01616318
+mkdir -p n01622779
+mkdir -p n01629819
+mkdir -p n01630670
+mkdir -p n01631663
+mkdir -p n01632458
+mkdir -p n01632777
+mkdir -p n01641577
+mkdir -p n01644373
+mkdir -p n01644900
+mkdir -p n01664065
+mkdir -p n01665541
+mkdir -p n01667114
+mkdir -p n01667778
+mkdir -p n01669191
+mkdir -p n01675722
+mkdir -p n01677366
+mkdir -p n01682714
+mkdir -p n01685808
+mkdir -p n01687978
+mkdir -p n01688243
+mkdir -p n01689811
+mkdir -p n01692333
+mkdir -p n01693334
+mkdir -p n01694178
+mkdir -p n01695060
+mkdir -p n01697457
+mkdir -p n01698640
+mkdir -p n01704323
+mkdir -p n01728572
+mkdir -p n01728920
+mkdir -p n01729322
+mkdir -p n01729977
+mkdir -p n01734418
+mkdir -p n01735189
+mkdir -p n01737021
+mkdir -p n01739381
+mkdir -p n01740131
+mkdir -p n01742172
+mkdir -p n01744401
+mkdir -p n01748264
+mkdir -p n01749939
+mkdir -p n01751748
+mkdir -p n01753488
+mkdir -p n01755581
+mkdir -p n01756291
+mkdir -p n01768244
+mkdir -p n01770081
+mkdir -p n01770393
+mkdir -p n01773157
+mkdir -p n01773549
+mkdir -p n01773797
+mkdir -p n01774384
+mkdir -p n01774750
+mkdir -p n01775062
+mkdir -p n01776313
+mkdir -p n01784675
+mkdir -p n01795545
+mkdir -p n01796340
+mkdir -p n01797886
+mkdir -p n01798484
+mkdir -p n01806143
+mkdir -p n01806567
+mkdir -p n01807496
+mkdir -p n01817953
+mkdir -p n01818515
+mkdir -p n01819313
+mkdir -p n01820546
+mkdir -p n01824575
+mkdir -p n01828970
+mkdir -p n01829413
+mkdir -p n01833805
+mkdir -p n01843065
+mkdir -p n01843383
+mkdir -p n01847000
+mkdir -p n01855032
+mkdir -p n01855672
+mkdir -p n01860187
+mkdir -p n01871265
+mkdir -p n01872401
+mkdir -p n01873310
+mkdir -p n01877812
+mkdir -p n01882714
+mkdir -p n01883070
+mkdir -p n01910747
+mkdir -p n01914609
+mkdir -p n01917289
+mkdir -p n01924916
+mkdir -p n01930112
+mkdir -p n01943899
+mkdir -p n01944390
+mkdir -p n01945685
+mkdir -p n01950731
+mkdir -p n01955084
+mkdir -p n01968897
+mkdir -p n01978287
+mkdir -p n01978455
+mkdir -p n01980166
+mkdir -p n01981276
+mkdir -p n01983481
+mkdir -p n01984695
+mkdir -p n01985128
+mkdir -p n01986214
+mkdir -p n01990800
+mkdir -p n02002556
+mkdir -p n02002724
+mkdir -p n02006656
+mkdir -p n02007558
+mkdir -p n02009229
+mkdir -p n02009912
+mkdir -p n02011460
+mkdir -p n02012849
+mkdir -p n02013706
+mkdir -p n02017213
+mkdir -p n02018207
+mkdir -p n02018795
+mkdir -p n02025239
+mkdir -p n02027492
+mkdir -p n02028035
+mkdir -p n02033041
+mkdir -p n02037110
+mkdir -p n02051845
+mkdir -p n02056570
+mkdir -p n02058221
+mkdir -p n02066245
+mkdir -p n02071294
+mkdir -p n02074367
+mkdir -p n02077923
+mkdir -p n02085620
+mkdir -p n02085782
+mkdir -p n02085936
+mkdir -p n02086079
+mkdir -p n02086240
+mkdir -p n02086646
+mkdir -p n02086910
+mkdir -p n02087046
+mkdir -p n02087394
+mkdir -p n02088094
+mkdir -p n02088238
+mkdir -p n02088364
+mkdir -p n02088466
+mkdir -p n02088632
+mkdir -p n02089078
+mkdir -p n02089867
+mkdir -p n02089973
+mkdir -p n02090379
+mkdir -p n02090622
+mkdir -p n02090721
+mkdir -p n02091032
+mkdir -p n02091134
+mkdir -p n02091244
+mkdir -p n02091467
+mkdir -p n02091635
+mkdir -p n02091831
+mkdir -p n02092002
+mkdir -p n02092339
+mkdir -p n02093256
+mkdir -p n02093428
+mkdir -p n02093647
+mkdir -p n02093754
+mkdir -p n02093859
+mkdir -p n02093991
+mkdir -p n02094114
+mkdir -p n02094258
+mkdir -p n02094433
+mkdir -p n02095314
+mkdir -p n02095570
+mkdir -p n02095889
+mkdir -p n02096051
+mkdir -p n02096177
+mkdir -p n02096294
+mkdir -p n02096437
+mkdir -p n02096585
+mkdir -p n02097047
+mkdir -p n02097130
+mkdir -p n02097209
+mkdir -p n02097298
+mkdir -p n02097474
+mkdir -p n02097658
+mkdir -p n02098105
+mkdir -p n02098286
+mkdir -p n02098413
+mkdir -p n02099267
+mkdir -p n02099429
+mkdir -p n02099601
+mkdir -p n02099712
+mkdir -p n02099849
+mkdir -p n02100236
+mkdir -p n02100583
+mkdir -p n02100735
+mkdir -p n02100877
+mkdir -p n02101006
+mkdir -p n02101388
+mkdir -p n02101556
+mkdir -p n02102040
+mkdir -p n02102177
+mkdir -p n02102318
+mkdir -p n02102480
+mkdir -p n02102973
+mkdir -p n02104029
+mkdir -p n02104365
+mkdir -p n02105056
+mkdir -p n02105162
+mkdir -p n02105251
+mkdir -p n02105412
+mkdir -p n02105505
+mkdir -p n02105641
+mkdir -p n02105855
+mkdir -p n02106030
+mkdir -p n02106166
+mkdir -p n02106382
+mkdir -p n02106550
+mkdir -p n02106662
+mkdir -p n02107142
+mkdir -p n02107312
+mkdir -p n02107574
+mkdir -p n02107683
+mkdir -p n02107908
+mkdir -p n02108000
+mkdir -p n02108089
+mkdir -p n02108422
+mkdir -p n02108551
+mkdir -p n02108915
+mkdir -p n02109047
+mkdir -p n02109525
+mkdir -p n02109961
+mkdir -p n02110063
+mkdir -p n02110185
+mkdir -p n02110341
+mkdir -p n02110627
+mkdir -p n02110806
+mkdir -p n02110958
+mkdir -p n02111129
+mkdir -p n02111277
+mkdir -p n02111500
+mkdir -p n02111889
+mkdir -p n02112018
+mkdir -p n02112137
+mkdir -p n02112350
+mkdir -p n02112706
+mkdir -p n02113023
+mkdir -p n02113186
+mkdir -p n02113624
+mkdir -p n02113712
+mkdir -p n02113799
+mkdir -p n02113978
+mkdir -p n02114367
+mkdir -p n02114548
+mkdir -p n02114712
+mkdir -p n02114855
+mkdir -p n02115641
+mkdir -p n02115913
+mkdir -p n02116738
+mkdir -p n02117135
+mkdir -p n02119022
+mkdir -p n02119789
+mkdir -p n02120079
+mkdir -p n02120505
+mkdir -p n02123045
+mkdir -p n02123159
+mkdir -p n02123394
+mkdir -p n02123597
+mkdir -p n02124075
+mkdir -p n02125311
+mkdir -p n02127052
+mkdir -p n02128385
+mkdir -p n02128757
+mkdir -p n02128925
+mkdir -p n02129165
+mkdir -p n02129604
+mkdir -p n02130308
+mkdir -p n02132136
+mkdir -p n02133161
+mkdir -p n02134084
+mkdir -p n02134418
+mkdir -p n02137549
+mkdir -p n02138441
+mkdir -p n02165105
+mkdir -p n02165456
+mkdir -p n02167151
+mkdir -p n02168699
+mkdir -p n02169497
+mkdir -p n02172182
+mkdir -p n02174001
+mkdir -p n02177972
+mkdir -p n02190166
+mkdir -p n02206856
+mkdir -p n02219486
+mkdir -p n02226429
+mkdir -p n02229544
+mkdir -p n02231487
+mkdir -p n02233338
+mkdir -p n02236044
+mkdir -p n02256656
+mkdir -p n02259212
+mkdir -p n02264363
+mkdir -p n02268443
+mkdir -p n02268853
+mkdir -p n02276258
+mkdir -p n02277742
+mkdir -p n02279972
+mkdir -p n02280649
+mkdir -p n02281406
+mkdir -p n02281787
+mkdir -p n02317335
+mkdir -p n02319095
+mkdir -p n02321529
+mkdir -p n02325366
+mkdir -p n02326432
+mkdir -p n02328150
+mkdir -p n02342885
+mkdir -p n02346627
+mkdir -p n02356798
+mkdir -p n02361337
+mkdir -p n02363005
+mkdir -p n02364673
+mkdir -p n02389026
+mkdir -p n02391049
+mkdir -p n02395406
+mkdir -p n02396427
+mkdir -p n02397096
+mkdir -p n02398521
+mkdir -p n02403003
+mkdir -p n02408429
+mkdir -p n02410509
+mkdir -p n02412080
+mkdir -p n02415577
+mkdir -p n02417914
+mkdir -p n02422106
+mkdir -p n02422699
+mkdir -p n02423022
+mkdir -p n02437312
+mkdir -p n02437616
+mkdir -p n02441942
+mkdir -p n02442845
+mkdir -p n02443114
+mkdir -p n02443484
+mkdir -p n02444819
+mkdir -p n02445715
+mkdir -p n02447366
+mkdir -p n02454379
+mkdir -p n02457408
+mkdir -p n02480495
+mkdir -p n02480855
+mkdir -p n02481823
+mkdir -p n02483362
+mkdir -p n02483708
+mkdir -p n02484975
+mkdir -p n02486261
+mkdir -p n02486410
+mkdir -p n02487347
+mkdir -p n02488291
+mkdir -p n02488702
+mkdir -p n02489166
+mkdir -p n02490219
+mkdir -p n02492035
+mkdir -p n02492660
+mkdir -p n02493509
+mkdir -p n02493793
+mkdir -p n02494079
+mkdir -p n02497673
+mkdir -p n02500267
+mkdir -p n02504013
+mkdir -p n02504458
+mkdir -p n02509815
+mkdir -p n02510455
+mkdir -p n02514041
+mkdir -p n02526121
+mkdir -p n02536864
+mkdir -p n02606052
+mkdir -p n02607072
+mkdir -p n02640242
+mkdir -p n02641379
+mkdir -p n02643566
+mkdir -p n02655020
+mkdir -p n02666196
+mkdir -p n02667093
+mkdir -p n02669723
+mkdir -p n02672831
+mkdir -p n02676566
+mkdir -p n02687172
+mkdir -p n02690373
+mkdir -p n02692877
+mkdir -p n02699494
+mkdir -p n02701002
+mkdir -p n02704792
+mkdir -p n02708093
+mkdir -p n02727426
+mkdir -p n02730930
+mkdir -p n02747177
+mkdir -p n02749479
+mkdir -p n02769748
+mkdir -p n02776631
+mkdir -p n02777292
+mkdir -p n02782093
+mkdir -p n02783161
+mkdir -p n02786058
+mkdir -p n02787622
+mkdir -p n02788148
+mkdir -p n02790996
+mkdir -p n02791124
+mkdir -p n02791270
+mkdir -p n02793495
+mkdir -p n02794156
+mkdir -p n02795169
+mkdir -p n02797295
+mkdir -p n02799071
+mkdir -p n02802426
+mkdir -p n02804414
+mkdir -p n02804610
+mkdir -p n02807133
+mkdir -p n02808304
+mkdir -p n02808440
+mkdir -p n02814533
+mkdir -p n02814860
+mkdir -p n02815834
+mkdir -p n02817516
+mkdir -p n02823428
+mkdir -p n02823750
+mkdir -p n02825657
+mkdir -p n02834397
+mkdir -p n02835271
+mkdir -p n02837789
+mkdir -p n02840245
+mkdir -p n02841315
+mkdir -p n02843684
+mkdir -p n02859443
+mkdir -p n02860847
+mkdir -p n02865351
+mkdir -p n02869837
+mkdir -p n02870880
+mkdir -p n02871525
+mkdir -p n02877765
+mkdir -p n02879718
+mkdir -p n02883205
+mkdir -p n02892201
+mkdir -p n02892767
+mkdir -p n02894605
+mkdir -p n02895154
+mkdir -p n02906734
+mkdir -p n02909870
+mkdir -p n02910353
+mkdir -p n02916936
+mkdir -p n02917067
+mkdir -p n02927161
+mkdir -p n02930766
+mkdir -p n02939185
+mkdir -p n02948072
+mkdir -p n02950826
+mkdir -p n02951358
+mkdir -p n02951585
+mkdir -p n02963159
+mkdir -p n02965783
+mkdir -p n02966193
+mkdir -p n02966687
+mkdir -p n02971356
+mkdir -p n02974003
+mkdir -p n02977058
+mkdir -p n02978881
+mkdir -p n02979186
+mkdir -p n02980441
+mkdir -p n02981792
+mkdir -p n02988304
+mkdir -p n02992211
+mkdir -p n02992529
+mkdir -p n02999410
+mkdir -p n03000134
+mkdir -p n03000247
+mkdir -p n03000684
+mkdir -p n03014705
+mkdir -p n03016953
+mkdir -p n03017168
+mkdir -p n03018349
+mkdir -p n03026506
+mkdir -p n03028079
+mkdir -p n03032252
+mkdir -p n03041632
+mkdir -p n03042490
+mkdir -p n03045698
+mkdir -p n03047690
+mkdir -p n03062245
+mkdir -p n03063599
+mkdir -p n03063689
+mkdir -p n03065424
+mkdir -p n03075370
+mkdir -p n03085013
+mkdir -p n03089624
+mkdir -p n03095699
+mkdir -p n03100240
+mkdir -p n03109150
+mkdir -p n03110669
+mkdir -p n03124043
+mkdir -p n03124170
+mkdir -p n03125729
+mkdir -p n03126707
+mkdir -p n03127747
+mkdir -p n03127925
+mkdir -p n03131574
+mkdir -p n03133878
+mkdir -p n03134739
+mkdir -p n03141823
+mkdir -p n03146219
+mkdir -p n03160309
+mkdir -p n03179701
+mkdir -p n03180011
+mkdir -p n03187595
+mkdir -p n03188531
+mkdir -p n03196217
+mkdir -p n03197337
+mkdir -p n03201208
+mkdir -p n03207743
+mkdir -p n03207941
+mkdir -p n03208938
+mkdir -p n03216828
+mkdir -p n03218198
+mkdir -p n03220513
+mkdir -p n03223299
+mkdir -p n03240683
+mkdir -p n03249569
+mkdir -p n03250847
+mkdir -p n03255030
+mkdir -p n03259280
+mkdir -p n03271574
+mkdir -p n03272010
+mkdir -p n03272562
+mkdir -p n03290653
+mkdir -p n03291819
+mkdir -p n03297495
+mkdir -p n03314780
+mkdir -p n03325584
+mkdir -p n03337140
+mkdir -p n03344393
+mkdir -p n03345487
+mkdir -p n03347037
+mkdir -p n03355925
+mkdir -p n03372029
+mkdir -p n03376595
+mkdir -p n03379051
+mkdir -p n03384352
+mkdir -p n03388043
+mkdir -p n03388183
+mkdir -p n03388549
+mkdir -p n03393912
+mkdir -p n03394916
+mkdir -p n03400231
+mkdir -p n03404251
+mkdir -p n03417042
+mkdir -p n03424325
+mkdir -p n03425413
+mkdir -p n03443371
+mkdir -p n03444034
+mkdir -p n03445777
+mkdir -p n03445924
+mkdir -p n03447447
+mkdir -p n03447721
+mkdir -p n03450230
+mkdir -p n03452741
+mkdir -p n03457902
+mkdir -p n03459775
+mkdir -p n03461385
+mkdir -p n03467068
+mkdir -p n03476684
+mkdir -p n03476991
+mkdir -p n03478589
+mkdir -p n03481172
+mkdir -p n03482405
+mkdir -p n03483316
+mkdir -p n03485407
+mkdir -p n03485794
+mkdir -p n03492542
+mkdir -p n03494278
+mkdir -p n03495258
+mkdir -p n03496892
+mkdir -p n03498962
+mkdir -p n03527444
+mkdir -p n03529860
+mkdir -p n03530642
+mkdir -p n03532672
+mkdir -p n03534580
+mkdir -p n03535780
+mkdir -p n03538406
+mkdir -p n03544143
+mkdir -p n03584254
+mkdir -p n03584829
+mkdir -p n03590841
+mkdir -p n03594734
+mkdir -p n03594945
+mkdir -p n03595614
+mkdir -p n03598930
+mkdir -p n03599486
+mkdir -p n03602883
+mkdir -p n03617480
+mkdir -p n03623198
+mkdir -p n03627232
+mkdir -p n03630383
+mkdir -p n03633091
+mkdir -p n03637318
+mkdir -p n03642806
+mkdir -p n03649909
+mkdir -p n03657121
+mkdir -p n03658185
+mkdir -p n03661043
+mkdir -p n03662601
+mkdir -p n03666591
+mkdir -p n03670208
+mkdir -p n03673027
+mkdir -p n03676483
+mkdir -p n03680355
+mkdir -p n03690938
+mkdir -p n03691459
+mkdir -p n03692522
+mkdir -p n03697007
+mkdir -p n03706229
+mkdir -p n03709823
+mkdir -p n03710193
+mkdir -p n03710637
+mkdir -p n03710721
+mkdir -p n03717622
+mkdir -p n03720891
+mkdir -p n03721384
+mkdir -p n03724870
+mkdir -p n03729826
+mkdir -p n03733131
+mkdir -p n03733281
+mkdir -p n03733805
+mkdir -p n03742115
+mkdir -p n03743016
+mkdir -p n03759954
+mkdir -p n03761084
+mkdir -p n03763968
+mkdir -p n03764736
+mkdir -p n03769881
+mkdir -p n03770439
+mkdir -p n03770679
+mkdir -p n03773504
+mkdir -p n03775071
+mkdir -p n03775546
+mkdir -p n03776460
+mkdir -p n03777568
+mkdir -p n03777754
+mkdir -p n03781244
+mkdir -p n03782006
+mkdir -p n03785016
+mkdir -p n03786901
+mkdir -p n03787032
+mkdir -p n03788195
+mkdir -p n03788365
+mkdir -p n03791053
+mkdir -p n03792782
+mkdir -p n03792972
+mkdir -p n03793489
+mkdir -p n03794056
+mkdir -p n03796401
+mkdir -p n03803284
+mkdir -p n03804744
+mkdir -p n03814639
+mkdir -p n03814906
+mkdir -p n03825788
+mkdir -p n03832673
+mkdir -p n03837869
+mkdir -p n03838899
+mkdir -p n03840681
+mkdir -p n03841143
+mkdir -p n03843555
+mkdir -p n03854065
+mkdir -p n03857828
+mkdir -p n03866082
+mkdir -p n03868242
+mkdir -p n03868863
+mkdir -p n03871628
+mkdir -p n03873416
+mkdir -p n03874293
+mkdir -p n03874599
+mkdir -p n03876231
+mkdir -p n03877472
+mkdir -p n03877845
+mkdir -p n03884397
+mkdir -p n03887697
+mkdir -p n03888257
+mkdir -p n03888605
+mkdir -p n03891251
+mkdir -p n03891332
+mkdir -p n03895866
+mkdir -p n03899768
+mkdir -p n03902125
+mkdir -p n03903868
+mkdir -p n03908618
+mkdir -p n03908714
+mkdir -p n03916031
+mkdir -p n03920288
+mkdir -p n03924679
+mkdir -p n03929660
+mkdir -p n03929855
+mkdir -p n03930313
+mkdir -p n03930630
+mkdir -p n03933933
+mkdir -p n03935335
+mkdir -p n03937543
+mkdir -p n03938244
+mkdir -p n03942813
+mkdir -p n03944341
+mkdir -p n03947888
+mkdir -p n03950228
+mkdir -p n03954731
+mkdir -p n03956157
+mkdir -p n03958227
+mkdir -p n03961711
+mkdir -p n03967562
+mkdir -p n03970156
+mkdir -p n03976467
+mkdir -p n03976657
+mkdir -p n03977966
+mkdir -p n03980874
+mkdir -p n03982430
+mkdir -p n03983396
+mkdir -p n03991062
+mkdir -p n03992509
+mkdir -p n03995372
+mkdir -p n03998194
+mkdir -p n04004767
+mkdir -p n04005630
+mkdir -p n04008634
+mkdir -p n04009552
+mkdir -p n04019541
+mkdir -p n04023962
+mkdir -p n04026417
+mkdir -p n04033901
+mkdir -p n04033995
+mkdir -p n04037443
+mkdir -p n04039381
+mkdir -p n04040759
+mkdir -p n04041544
+mkdir -p n04044716
+mkdir -p n04049303
+mkdir -p n04065272
+mkdir -p n04067472
+mkdir -p n04069434
+mkdir -p n04070727
+mkdir -p n04074963
+mkdir -p n04081281
+mkdir -p n04086273
+mkdir -p n04090263
+mkdir -p n04099969
+mkdir -p n04111531
+mkdir -p n04116512
+mkdir -p n04118538
+mkdir -p n04118776
+mkdir -p n04120489
+mkdir -p n04125021
+mkdir -p n04127249
+mkdir -p n04131690
+mkdir -p n04133789
+mkdir -p n04136333
+mkdir -p n04141076
+mkdir -p n04141327
+mkdir -p n04141975
+mkdir -p n04146614
+mkdir -p n04147183
+mkdir -p n04149813
+mkdir -p n04152593
+mkdir -p n04153751
+mkdir -p n04154565
+mkdir -p n04162706
+mkdir -p n04179913
+mkdir -p n04192698
+mkdir -p n04200800
+mkdir -p n04201297
+mkdir -p n04204238
+mkdir -p n04204347
+mkdir -p n04208210
+mkdir -p n04209133
+mkdir -p n04209239
+mkdir -p n04228054
+mkdir -p n04229816
+mkdir -p n04235860
+mkdir -p n04238763
+mkdir -p n04239074
+mkdir -p n04243546
+mkdir -p n04251144
+mkdir -p n04252077
+mkdir -p n04252225
+mkdir -p n04254120
+mkdir -p n04254680
+mkdir -p n04254777
+mkdir -p n04258138
+mkdir -p n04259630
+mkdir -p n04263257
+mkdir -p n04264628
+mkdir -p n04265275
+mkdir -p n04266014
+mkdir -p n04270147
+mkdir -p n04273569
+mkdir -p n04275548
+mkdir -p n04277352
+mkdir -p n04285008
+mkdir -p n04286575
+mkdir -p n04296562
+mkdir -p n04310018
+mkdir -p n04311004
+mkdir -p n04311174
+mkdir -p n04317175
+mkdir -p n04325704
+mkdir -p n04326547
+mkdir -p n04328186
+mkdir -p n04330267
+mkdir -p n04332243
+mkdir -p n04335435
+mkdir -p n04336792
+mkdir -p n04344873
+mkdir -p n04346328
+mkdir -p n04347754
+mkdir -p n04350905
+mkdir -p n04355338
+mkdir -p n04355933
+mkdir -p n04356056
+mkdir -p n04357314
+mkdir -p n04366367
+mkdir -p n04367480
+mkdir -p n04370456
+mkdir -p n04371430
+mkdir -p n04371774
+mkdir -p n04372370
+mkdir -p n04376876
+mkdir -p n04380533
+mkdir -p n04389033
+mkdir -p n04392985
+mkdir -p n04398044
+mkdir -p n04399382
+mkdir -p n04404412
+mkdir -p n04409515
+mkdir -p n04417672
+mkdir -p n04418357
+mkdir -p n04423845
+mkdir -p n04428191
+mkdir -p n04429376
+mkdir -p n04435653
+mkdir -p n04442312
+mkdir -p n04443257
+mkdir -p n04447861
+mkdir -p n04456115
+mkdir -p n04458633
+mkdir -p n04461696
+mkdir -p n04462240
+mkdir -p n04465501
+mkdir -p n04467665
+mkdir -p n04476259
+mkdir -p n04479046
+mkdir -p n04482393
+mkdir -p n04483307
+mkdir -p n04485082
+mkdir -p n04486054
+mkdir -p n04487081
+mkdir -p n04487394
+mkdir -p n04493381
+mkdir -p n04501370
+mkdir -p n04505470
+mkdir -p n04507155
+mkdir -p n04509417
+mkdir -p n04515003
+mkdir -p n04517823
+mkdir -p n04522168
+mkdir -p n04523525
+mkdir -p n04525038
+mkdir -p n04525305
+mkdir -p n04532106
+mkdir -p n04532670
+mkdir -p n04536866
+mkdir -p n04540053
+mkdir -p n04542943
+mkdir -p n04548280
+mkdir -p n04548362
+mkdir -p n04550184
+mkdir -p n04552348
+mkdir -p n04553703
+mkdir -p n04554684
+mkdir -p n04557648
+mkdir -p n04560804
+mkdir -p n04562935
+mkdir -p n04579145
+mkdir -p n04579432
+mkdir -p n04584207
+mkdir -p n04589890
+mkdir -p n04590129
+mkdir -p n04591157
+mkdir -p n04591713
+mkdir -p n04592741
+mkdir -p n04596742
+mkdir -p n04597913
+mkdir -p n04599235
+mkdir -p n04604644
+mkdir -p n04606251
+mkdir -p n04612504
+mkdir -p n04613696
+mkdir -p n06359193
+mkdir -p n06596364
+mkdir -p n06785654
+mkdir -p n06794110
+mkdir -p n06874185
+mkdir -p n07248320
+mkdir -p n07565083
+mkdir -p n07579787
+mkdir -p n07583066
+mkdir -p n07584110
+mkdir -p n07590611
+mkdir -p n07613480
+mkdir -p n07614500
+mkdir -p n07615774
+mkdir -p n07684084
+mkdir -p n07693725
+mkdir -p n07695742
+mkdir -p n07697313
+mkdir -p n07697537
+mkdir -p n07711569
+mkdir -p n07714571
+mkdir -p n07714990
+mkdir -p n07715103
+mkdir -p n07716358
+mkdir -p n07716906
+mkdir -p n07717410
+mkdir -p n07717556
+mkdir -p n07718472
+mkdir -p n07718747
+mkdir -p n07720875
+mkdir -p n07730033
+mkdir -p n07734744
+mkdir -p n07742313
+mkdir -p n07745940
+mkdir -p n07747607
+mkdir -p n07749582
+mkdir -p n07753113
+mkdir -p n07753275
+mkdir -p n07753592
+mkdir -p n07754684
+mkdir -p n07760859
+mkdir -p n07768694
+mkdir -p n07802026
+mkdir -p n07831146
+mkdir -p n07836838
+mkdir -p n07860988
+mkdir -p n07871810
+mkdir -p n07873807
+mkdir -p n07875152
+mkdir -p n07880968
+mkdir -p n07892512
+mkdir -p n07920052
+mkdir -p n07930864
+mkdir -p n07932039
+mkdir -p n09193705
+mkdir -p n09229709
+mkdir -p n09246464
+mkdir -p n09256479
+mkdir -p n09288635
+mkdir -p n09332890
+mkdir -p n09399592
+mkdir -p n09421951
+mkdir -p n09428293
+mkdir -p n09468604
+mkdir -p n09472597
+mkdir -p n09835506
+mkdir -p n10148035
+mkdir -p n10565667
+mkdir -p n11879895
+mkdir -p n11939491
+mkdir -p n12057211
+mkdir -p n12144580
+mkdir -p n12267677
+mkdir -p n12620546
+mkdir -p n12768682
+mkdir -p n12985857
+mkdir -p n12998815
+mkdir -p n13037406
+mkdir -p n13040303
+mkdir -p n13044778
+mkdir -p n13052670
+mkdir -p n13054560
+mkdir -p n13133613
+mkdir -p n15075141
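+
+# All synset directories now exist; the remaining commands move each
+# validation image out of val/ into the directory named by its
+# ground-truth synset label.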
+mv val/ILSVRC2012_val_00000001.JPEG n01751748/
+mv val/ILSVRC2012_val_00000002.JPEG n09193705/
+mv val/ILSVRC2012_val_00000003.JPEG n02105855/
+mv val/ILSVRC2012_val_00000004.JPEG n04263257/
+mv val/ILSVRC2012_val_00000005.JPEG n03125729/
+mv val/ILSVRC2012_val_00000006.JPEG n01735189/
+mv val/ILSVRC2012_val_00000007.JPEG n02346627/
+mv val/ILSVRC2012_val_00000008.JPEG n02776631/
+mv val/ILSVRC2012_val_00000009.JPEG n03794056/
+mv val/ILSVRC2012_val_00000010.JPEG n02328150/
+mv val/ILSVRC2012_val_00000011.JPEG n01917289/
+mv val/ILSVRC2012_val_00000012.JPEG n02125311/
+mv val/ILSVRC2012_val_00000013.JPEG n02484975/
+mv val/ILSVRC2012_val_00000014.JPEG n04065272/
+mv val/ILSVRC2012_val_00000015.JPEG n03496892/
+mv val/ILSVRC2012_val_00000016.JPEG n02066245/
+mv val/ILSVRC2012_val_00000017.JPEG n01914609/
+mv val/ILSVRC2012_val_00000018.JPEG n01616318/
+mv val/ILSVRC2012_val_00000019.JPEG n02971356/
+mv val/ILSVRC2012_val_00000020.JPEG n03126707/
+mv val/ILSVRC2012_val_00000021.JPEG n02346627/
+mv val/ILSVRC2012_val_00000022.JPEG n02091244/
+mv val/ILSVRC2012_val_00000023.JPEG n07742313/
+mv val/ILSVRC2012_val_00000024.JPEG n03956157/
+mv val/ILSVRC2012_val_00000025.JPEG n01616318/
+mv val/ILSVRC2012_val_00000026.JPEG n04380533/
+mv val/ILSVRC2012_val_00000027.JPEG n02114548/
+mv val/ILSVRC2012_val_00000028.JPEG n02089973/
+mv val/ILSVRC2012_val_00000029.JPEG n01729977/
+mv val/ILSVRC2012_val_00000030.JPEG n04435653/
+mv val/ILSVRC2012_val_00000031.JPEG n02280649/
+mv val/ILSVRC2012_val_00000032.JPEG n03444034/
+mv val/ILSVRC2012_val_00000033.JPEG n02077923/
+mv val/ILSVRC2012_val_00000034.JPEG n09835506/
+mv val/ILSVRC2012_val_00000035.JPEG n03478589/
+mv val/ILSVRC2012_val_00000036.JPEG n04532106/
+mv val/ILSVRC2012_val_00000037.JPEG n01644900/
+mv val/ILSVRC2012_val_00000038.JPEG n02666196/
+mv val/ILSVRC2012_val_00000039.JPEG n04141327/
+mv val/ILSVRC2012_val_00000040.JPEG n01773797/
+mv val/ILSVRC2012_val_00000041.JPEG n03125729/
+mv val/ILSVRC2012_val_00000042.JPEG n04049303/
+mv val/ILSVRC2012_val_00000043.JPEG n02006656/
+mv val/ILSVRC2012_val_00000044.JPEG n02097209/
+mv val/ILSVRC2012_val_00000045.JPEG n02111277/
+mv val/ILSVRC2012_val_00000046.JPEG n03950228/
+mv val/ILSVRC2012_val_00000047.JPEG n03393912/
+mv val/ILSVRC2012_val_00000048.JPEG n02089973/
+mv val/ILSVRC2012_val_00000049.JPEG n03930630/
+mv val/ILSVRC2012_val_00000050.JPEG n02640242/
+mv val/ILSVRC2012_val_00000051.JPEG n01828970/
+mv val/ILSVRC2012_val_00000052.JPEG n01632777/
+mv val/ILSVRC2012_val_00000053.JPEG n04372370/
+mv val/ILSVRC2012_val_00000054.JPEG n03485794/
+mv val/ILSVRC2012_val_00000055.JPEG n02443114/
+mv val/ILSVRC2012_val_00000056.JPEG n02930766/
+mv val/ILSVRC2012_val_00000057.JPEG n02112018/
+mv val/ILSVRC2012_val_00000058.JPEG n13040303/
+mv val/ILSVRC2012_val_00000059.JPEG n04485082/
+mv val/ILSVRC2012_val_00000060.JPEG n03482405/
+mv val/ILSVRC2012_val_00000061.JPEG n02963159/
+mv val/ILSVRC2012_val_00000062.JPEG n02093859/
+mv val/ILSVRC2012_val_00000063.JPEG n01910747/
+mv val/ILSVRC2012_val_00000064.JPEG n01693334/
+mv val/ILSVRC2012_val_00000065.JPEG n04371430/
+mv val/ILSVRC2012_val_00000066.JPEG n02526121/
+mv val/ILSVRC2012_val_00000067.JPEG n01871265/
+mv val/ILSVRC2012_val_00000068.JPEG n04532106/
+mv val/ILSVRC2012_val_00000069.JPEG n04482393/
+mv val/ILSVRC2012_val_00000070.JPEG n04370456/
+mv val/ILSVRC2012_val_00000071.JPEG n02927161/
+mv val/ILSVRC2012_val_00000072.JPEG n02074367/
+mv val/ILSVRC2012_val_00000073.JPEG n01608432/
+mv val/ILSVRC2012_val_00000074.JPEG n02966193/
+mv val/ILSVRC2012_val_00000075.JPEG n01795545/
+mv val/ILSVRC2012_val_00000076.JPEG n02791270/
+mv val/ILSVRC2012_val_00000077.JPEG n02087394/
+mv val/ILSVRC2012_val_00000078.JPEG n02116738/
+mv val/ILSVRC2012_val_00000079.JPEG n02091635/
+mv val/ILSVRC2012_val_00000080.JPEG n02895154/
+mv val/ILSVRC2012_val_00000081.JPEG n09193705/
+mv val/ILSVRC2012_val_00000082.JPEG n02088094/
+mv val/ILSVRC2012_val_00000083.JPEG n04200800/
+mv val/ILSVRC2012_val_00000084.JPEG n01737021/
+mv val/ILSVRC2012_val_00000085.JPEG n02974003/
+mv val/ILSVRC2012_val_00000086.JPEG n03032252/
+mv val/ILSVRC2012_val_00000087.JPEG n02483708/
+mv val/ILSVRC2012_val_00000088.JPEG n01632458/
+mv val/ILSVRC2012_val_00000089.JPEG n02992529/
+mv val/ILSVRC2012_val_00000090.JPEG n01698640/
+mv val/ILSVRC2012_val_00000091.JPEG n02114548/
+mv val/ILSVRC2012_val_00000092.JPEG n02497673/
+mv val/ILSVRC2012_val_00000093.JPEG n02480855/
+mv val/ILSVRC2012_val_00000094.JPEG n04147183/
+mv val/ILSVRC2012_val_00000095.JPEG n02487347/
+mv val/ILSVRC2012_val_00000096.JPEG n03895866/
+mv val/ILSVRC2012_val_00000097.JPEG n02325366/
+mv val/ILSVRC2012_val_00000098.JPEG n02033041/
+mv val/ILSVRC2012_val_00000099.JPEG n07745940/
+mv val/ILSVRC2012_val_00000100.JPEG n02415577/
+mv val/ILSVRC2012_val_00000101.JPEG n02951585/
+mv val/ILSVRC2012_val_00000102.JPEG n02087394/
+mv val/ILSVRC2012_val_00000103.JPEG n04485082/
+mv val/ILSVRC2012_val_00000104.JPEG n04505470/
+mv val/ILSVRC2012_val_00000105.JPEG n02097658/
+mv val/ILSVRC2012_val_00000106.JPEG n04591157/
+mv val/ILSVRC2012_val_00000107.JPEG n01770081/
+mv val/ILSVRC2012_val_00000108.JPEG n02992211/
+mv val/ILSVRC2012_val_00000109.JPEG n03691459/
+mv val/ILSVRC2012_val_00000110.JPEG n03594734/
+mv val/ILSVRC2012_val_00000111.JPEG n01983481/
+mv val/ILSVRC2012_val_00000112.JPEG n03937543/
+mv val/ILSVRC2012_val_00000113.JPEG n02105412/
+mv val/ILSVRC2012_val_00000114.JPEG n03843555/
+mv val/ILSVRC2012_val_00000115.JPEG n02091244/
+mv val/ILSVRC2012_val_00000116.JPEG n07831146/
+mv val/ILSVRC2012_val_00000117.JPEG n03710637/
+mv val/ILSVRC2012_val_00000118.JPEG n03733281/
+mv val/ILSVRC2012_val_00000119.JPEG n03782006/
+mv val/ILSVRC2012_val_00000120.JPEG n03733131/
+mv val/ILSVRC2012_val_00000121.JPEG n03933933/
+mv val/ILSVRC2012_val_00000122.JPEG n02980441/
+mv val/ILSVRC2012_val_00000123.JPEG n04409515/
+mv val/ILSVRC2012_val_00000124.JPEG n02606052/
+mv val/ILSVRC2012_val_00000125.JPEG n02226429/
+mv val/ILSVRC2012_val_00000126.JPEG n02883205/
+mv val/ILSVRC2012_val_00000127.JPEG n02422699/
+mv val/ILSVRC2012_val_00000128.JPEG n01614925/
+mv val/ILSVRC2012_val_00000129.JPEG n07697537/
+mv val/ILSVRC2012_val_00000130.JPEG n02123394/
+mv val/ILSVRC2012_val_00000131.JPEG n04252077/
+mv val/ILSVRC2012_val_00000132.JPEG n03337140/
+mv val/ILSVRC2012_val_00000133.JPEG n02117135/
+mv val/ILSVRC2012_val_00000134.JPEG n02107142/
+mv val/ILSVRC2012_val_00000135.JPEG n04037443/
+mv val/ILSVRC2012_val_00000136.JPEG n02397096/
+mv val/ILSVRC2012_val_00000137.JPEG n03187595/
+mv val/ILSVRC2012_val_00000138.JPEG n02319095/
+mv val/ILSVRC2012_val_00000139.JPEG n07932039/
+mv val/ILSVRC2012_val_00000140.JPEG n03372029/
+mv val/ILSVRC2012_val_00000141.JPEG n02088466/
+mv val/ILSVRC2012_val_00000142.JPEG n02319095/
+mv val/ILSVRC2012_val_00000143.JPEG n04125021/
+mv val/ILSVRC2012_val_00000144.JPEG n03954731/
+mv val/ILSVRC2012_val_00000145.JPEG n09421951/
+mv val/ILSVRC2012_val_00000146.JPEG n04487394/
+mv val/ILSVRC2012_val_00000147.JPEG n02113624/
+mv val/ILSVRC2012_val_00000148.JPEG n03843555/
+mv val/ILSVRC2012_val_00000149.JPEG n03485407/
+mv val/ILSVRC2012_val_00000150.JPEG n09332890/
+mv val/ILSVRC2012_val_00000151.JPEG n03642806/
+mv val/ILSVRC2012_val_00000152.JPEG n03710193/
+mv val/ILSVRC2012_val_00000153.JPEG n01677366/
+mv val/ILSVRC2012_val_00000154.JPEG n01950731/
+mv val/ILSVRC2012_val_00000155.JPEG n07714990/
+mv val/ILSVRC2012_val_00000156.JPEG n02114855/
+mv val/ILSVRC2012_val_00000157.JPEG n02119022/
+mv val/ILSVRC2012_val_00000158.JPEG n04086273/
+mv val/ILSVRC2012_val_00000159.JPEG n04201297/
+mv val/ILSVRC2012_val_00000160.JPEG n03733281/
+mv val/ILSVRC2012_val_00000161.JPEG n02100877/
+mv val/ILSVRC2012_val_00000162.JPEG n03016953/
+mv val/ILSVRC2012_val_00000163.JPEG n03733805/
+mv val/ILSVRC2012_val_00000164.JPEG n03063599/
+mv val/ILSVRC2012_val_00000165.JPEG n07714990/
+mv val/ILSVRC2012_val_00000166.JPEG n03854065/
+mv val/ILSVRC2012_val_00000167.JPEG n04149813/
+mv val/ILSVRC2012_val_00000168.JPEG n03786901/
+mv val/ILSVRC2012_val_00000169.JPEG n03467068/
+mv val/ILSVRC2012_val_00000170.JPEG n02087046/
+mv val/ILSVRC2012_val_00000171.JPEG n04326547/
+mv val/ILSVRC2012_val_00000172.JPEG n02100735/
+mv val/ILSVRC2012_val_00000173.JPEG n03775546/
+mv val/ILSVRC2012_val_00000174.JPEG n02111500/
+mv val/ILSVRC2012_val_00000175.JPEG n02814533/
+mv val/ILSVRC2012_val_00000176.JPEG n02097047/
+mv val/ILSVRC2012_val_00000177.JPEG n02027492/
+mv val/ILSVRC2012_val_00000178.JPEG n02109961/
+mv val/ILSVRC2012_val_00000179.JPEG n02389026/
+mv val/ILSVRC2012_val_00000180.JPEG n02105855/
+mv val/ILSVRC2012_val_00000181.JPEG n02445715/
+mv val/ILSVRC2012_val_00000182.JPEG n03259280/
+mv val/ILSVRC2012_val_00000183.JPEG n07711569/
+mv val/ILSVRC2012_val_00000184.JPEG n03710637/
+mv val/ILSVRC2012_val_00000185.JPEG n03670208/
+mv val/ILSVRC2012_val_00000186.JPEG n02128757/
+mv val/ILSVRC2012_val_00000187.JPEG n04467665/
+mv val/ILSVRC2012_val_00000188.JPEG n02114855/
+mv val/ILSVRC2012_val_00000189.JPEG n01873310/
+mv val/ILSVRC2012_val_00000190.JPEG n03476684/
+mv val/ILSVRC2012_val_00000191.JPEG n02093428/
+mv val/ILSVRC2012_val_00000192.JPEG n03891251/
+mv val/ILSVRC2012_val_00000193.JPEG n02859443/
+mv val/ILSVRC2012_val_00000194.JPEG n04125021/
+mv val/ILSVRC2012_val_00000195.JPEG n01978287/
+mv val/ILSVRC2012_val_00000196.JPEG n02643566/
+mv val/ILSVRC2012_val_00000197.JPEG n07697537/
+mv val/ILSVRC2012_val_00000198.JPEG n01560419/
+mv val/ILSVRC2012_val_00000199.JPEG n03290653/
+mv val/ILSVRC2012_val_00000200.JPEG n13037406/
+mv val/ILSVRC2012_val_00000201.JPEG n03891332/
+mv val/ILSVRC2012_val_00000202.JPEG n02883205/
+mv val/ILSVRC2012_val_00000203.JPEG n02106382/
+mv val/ILSVRC2012_val_00000204.JPEG n02672831/
+mv val/ILSVRC2012_val_00000205.JPEG n04330267/
+mv val/ILSVRC2012_val_00000206.JPEG n02489166/
+mv val/ILSVRC2012_val_00000207.JPEG n02058221/
+mv val/ILSVRC2012_val_00000208.JPEG n03584829/
+mv val/ILSVRC2012_val_00000209.JPEG n07565083/
+mv val/ILSVRC2012_val_00000210.JPEG n03125729/
+mv val/ILSVRC2012_val_00000211.JPEG n02123597/
+mv val/ILSVRC2012_val_00000212.JPEG n04536866/
+mv val/ILSVRC2012_val_00000213.JPEG n02965783/
+mv val/ILSVRC2012_val_00000214.JPEG n09428293/
+mv val/ILSVRC2012_val_00000215.JPEG n02965783/
+mv val/ILSVRC2012_val_00000216.JPEG n11879895/
+mv val/ILSVRC2012_val_00000217.JPEG n01560419/
+mv val/ILSVRC2012_val_00000218.JPEG n01775062/
+mv val/ILSVRC2012_val_00000219.JPEG n03595614/
+mv val/ILSVRC2012_val_00000220.JPEG n02110958/
+mv val/ILSVRC2012_val_00000221.JPEG n03709823/
+mv val/ILSVRC2012_val_00000222.JPEG n03777754/
+mv val/ILSVRC2012_val_00000223.JPEG n02951585/
+mv val/ILSVRC2012_val_00000224.JPEG n02100877/
+mv val/ILSVRC2012_val_00000225.JPEG n01629819/
+mv val/ILSVRC2012_val_00000226.JPEG n02909870/
+mv val/ILSVRC2012_val_00000227.JPEG n02101388/
+mv val/ILSVRC2012_val_00000228.JPEG n02091244/
+mv val/ILSVRC2012_val_00000229.JPEG n01667114/
+mv val/ILSVRC2012_val_00000230.JPEG n03998194/
+mv val/ILSVRC2012_val_00000231.JPEG n01986214/
+mv val/ILSVRC2012_val_00000232.JPEG n04192698/
+mv val/ILSVRC2012_val_00000233.JPEG n02128757/
+mv val/ILSVRC2012_val_00000234.JPEG n02793495/
+mv val/ILSVRC2012_val_00000235.JPEG n09256479/
+mv val/ILSVRC2012_val_00000236.JPEG n01443537/
+mv val/ILSVRC2012_val_00000237.JPEG n02089973/
+mv val/ILSVRC2012_val_00000238.JPEG n01981276/
+mv val/ILSVRC2012_val_00000239.JPEG n02837789/
+mv val/ILSVRC2012_val_00000240.JPEG n03888605/
+mv val/ILSVRC2012_val_00000241.JPEG n03201208/
+mv val/ILSVRC2012_val_00000242.JPEG n02480855/
+mv val/ILSVRC2012_val_00000243.JPEG n03814639/
+mv val/ILSVRC2012_val_00000244.JPEG n04090263/
+mv val/ILSVRC2012_val_00000245.JPEG n01986214/
+mv val/ILSVRC2012_val_00000246.JPEG n02415577/
+mv val/ILSVRC2012_val_00000247.JPEG n01534433/
+mv val/ILSVRC2012_val_00000248.JPEG n02093256/
+mv val/ILSVRC2012_val_00000249.JPEG n03134739/
+mv val/ILSVRC2012_val_00000250.JPEG n03016953/
+mv val/ILSVRC2012_val_00000251.JPEG n12620546/
+mv val/ILSVRC2012_val_00000252.JPEG n03937543/
+mv val/ILSVRC2012_val_00000253.JPEG n02815834/
+mv val/ILSVRC2012_val_00000254.JPEG n03776460/
+mv val/ILSVRC2012_val_00000255.JPEG n10565667/
+mv val/ILSVRC2012_val_00000256.JPEG n03207743/
+mv val/ILSVRC2012_val_00000257.JPEG n02992529/
+mv val/ILSVRC2012_val_00000258.JPEG n01631663/
+mv val/ILSVRC2012_val_00000259.JPEG n03729826/
+mv val/ILSVRC2012_val_00000260.JPEG n04033995/
+mv val/ILSVRC2012_val_00000261.JPEG n04462240/
+mv val/ILSVRC2012_val_00000262.JPEG n01443537/
+mv val/ILSVRC2012_val_00000263.JPEG n02091831/
+mv val/ILSVRC2012_val_00000264.JPEG n03874293/
+mv val/ILSVRC2012_val_00000265.JPEG n03874599/
+mv val/ILSVRC2012_val_00000266.JPEG n04238763/
+mv val/ILSVRC2012_val_00000267.JPEG n07584110/
+mv val/ILSVRC2012_val_00000268.JPEG n02749479/
+mv val/ILSVRC2012_val_00000269.JPEG n02110185/
+mv val/ILSVRC2012_val_00000270.JPEG n09193705/
+mv val/ILSVRC2012_val_00000271.JPEG n04311004/
+mv val/ILSVRC2012_val_00000272.JPEG n02788148/
+mv val/ILSVRC2012_val_00000273.JPEG n02445715/
+mv val/ILSVRC2012_val_00000274.JPEG n06874185/
+mv val/ILSVRC2012_val_00000275.JPEG n04074963/
+mv val/ILSVRC2012_val_00000276.JPEG n01631663/
+mv val/ILSVRC2012_val_00000277.JPEG n03803284/
+mv val/ILSVRC2012_val_00000278.JPEG n01828970/
+mv val/ILSVRC2012_val_00000279.JPEG n02096437/
+mv val/ILSVRC2012_val_00000280.JPEG n04554684/
+mv val/ILSVRC2012_val_00000281.JPEG n03599486/
+mv val/ILSVRC2012_val_00000282.JPEG n03595614/
+mv val/ILSVRC2012_val_00000283.JPEG n02123394/
+mv val/ILSVRC2012_val_00000284.JPEG n04515003/
+mv val/ILSVRC2012_val_00000285.JPEG n04591157/
+mv val/ILSVRC2012_val_00000286.JPEG n04560804/
+mv val/ILSVRC2012_val_00000287.JPEG n02794156/
+mv val/ILSVRC2012_val_00000288.JPEG n03344393/
+mv val/ILSVRC2012_val_00000289.JPEG n02687172/
+mv val/ILSVRC2012_val_00000290.JPEG n04328186/
+mv val/ILSVRC2012_val_00000291.JPEG n04479046/
+mv val/ILSVRC2012_val_00000292.JPEG n03967562/
+mv val/ILSVRC2012_val_00000293.JPEG n01440764/
+mv val/ILSVRC2012_val_00000294.JPEG n04465501/
+mv val/ILSVRC2012_val_00000295.JPEG n03457902/
+mv val/ILSVRC2012_val_00000296.JPEG n04532670/
+mv val/ILSVRC2012_val_00000297.JPEG n01688243/
+mv val/ILSVRC2012_val_00000298.JPEG n01749939/
+mv val/ILSVRC2012_val_00000299.JPEG n01768244/
+mv val/ILSVRC2012_val_00000300.JPEG n02091831/
+mv val/ILSVRC2012_val_00000301.JPEG n02321529/
+mv val/ILSVRC2012_val_00000302.JPEG n02939185/
+mv val/ILSVRC2012_val_00000303.JPEG n02129604/
+mv val/ILSVRC2012_val_00000304.JPEG n12985857/
+mv val/ILSVRC2012_val_00000305.JPEG n03485794/
+mv val/ILSVRC2012_val_00000306.JPEG n02408429/
+mv val/ILSVRC2012_val_00000307.JPEG n01443537/
+mv val/ILSVRC2012_val_00000308.JPEG n03590841/
+mv val/ILSVRC2012_val_00000309.JPEG n07697537/
+mv val/ILSVRC2012_val_00000310.JPEG n04154565/
+mv val/ILSVRC2012_val_00000311.JPEG n03443371/
+mv val/ILSVRC2012_val_00000312.JPEG n02514041/
+mv val/ILSVRC2012_val_00000313.JPEG n09468604/
+mv val/ILSVRC2012_val_00000314.JPEG n03769881/
+mv val/ILSVRC2012_val_00000315.JPEG n02787622/
+mv val/ILSVRC2012_val_00000316.JPEG n02526121/
+mv val/ILSVRC2012_val_00000317.JPEG n03888605/
+mv val/ILSVRC2012_val_00000318.JPEG n01622779/
+mv val/ILSVRC2012_val_00000319.JPEG n01872401/
+mv val/ILSVRC2012_val_00000320.JPEG n07745940/
+mv val/ILSVRC2012_val_00000321.JPEG n03085013/
+mv val/ILSVRC2012_val_00000322.JPEG n02445715/
+mv val/ILSVRC2012_val_00000323.JPEG n02120505/
+mv val/ILSVRC2012_val_00000324.JPEG n01751748/
+mv val/ILSVRC2012_val_00000325.JPEG n04141327/
+mv val/ILSVRC2012_val_00000326.JPEG n02443484/
+mv val/ILSVRC2012_val_00000327.JPEG n02089078/
+mv val/ILSVRC2012_val_00000328.JPEG n01608432/
+mv val/ILSVRC2012_val_00000329.JPEG n01514668/
+mv val/ILSVRC2012_val_00000330.JPEG n03160309/
+mv val/ILSVRC2012_val_00000331.JPEG n04070727/
+mv val/ILSVRC2012_val_00000332.JPEG n07715103/
+mv val/ILSVRC2012_val_00000333.JPEG n02110958/
+mv val/ILSVRC2012_val_00000334.JPEG n03976657/
+mv val/ILSVRC2012_val_00000335.JPEG n03902125/
+mv val/ILSVRC2012_val_00000336.JPEG n02909870/
+mv val/ILSVRC2012_val_00000337.JPEG n01740131/
+mv val/ILSVRC2012_val_00000338.JPEG n04532106/
+mv val/ILSVRC2012_val_00000339.JPEG n03197337/
+mv val/ILSVRC2012_val_00000340.JPEG n02493509/
+mv val/ILSVRC2012_val_00000341.JPEG n10148035/
+mv val/ILSVRC2012_val_00000342.JPEG n02172182/
+mv val/ILSVRC2012_val_00000343.JPEG n02437616/
+mv val/ILSVRC2012_val_00000344.JPEG n03062245/
+mv val/ILSVRC2012_val_00000345.JPEG n04286575/
+mv val/ILSVRC2012_val_00000346.JPEG n03018349/
+mv val/ILSVRC2012_val_00000347.JPEG n02951358/
+mv val/ILSVRC2012_val_00000348.JPEG n02130308/
+mv val/ILSVRC2012_val_00000349.JPEG n04277352/
+mv val/ILSVRC2012_val_00000350.JPEG n02096585/
+mv val/ILSVRC2012_val_00000351.JPEG n04589890/
+mv val/ILSVRC2012_val_00000352.JPEG n02965783/
+mv val/ILSVRC2012_val_00000353.JPEG n02978881/
+mv val/ILSVRC2012_val_00000354.JPEG n02804414/
+mv val/ILSVRC2012_val_00000355.JPEG n02112137/
+mv val/ILSVRC2012_val_00000356.JPEG n02007558/
+mv val/ILSVRC2012_val_00000357.JPEG n03670208/
+mv val/ILSVRC2012_val_00000358.JPEG n02894605/
+mv val/ILSVRC2012_val_00000359.JPEG n03657121/
+mv val/ILSVRC2012_val_00000360.JPEG n03876231/
+mv val/ILSVRC2012_val_00000361.JPEG n02165105/
+mv val/ILSVRC2012_val_00000362.JPEG n01669191/
+mv val/ILSVRC2012_val_00000363.JPEG n02011460/
+mv val/ILSVRC2012_val_00000364.JPEG n03710193/
+mv val/ILSVRC2012_val_00000365.JPEG n03796401/
+mv val/ILSVRC2012_val_00000366.JPEG n02916936/
+mv val/ILSVRC2012_val_00000367.JPEG n03492542/
+mv val/ILSVRC2012_val_00000368.JPEG n03998194/
+mv val/ILSVRC2012_val_00000369.JPEG n04552348/
+mv val/ILSVRC2012_val_00000370.JPEG n01824575/
+mv val/ILSVRC2012_val_00000371.JPEG n01917289/
+mv val/ILSVRC2012_val_00000372.JPEG n03461385/
+mv val/ILSVRC2012_val_00000373.JPEG n03874293/
+mv val/ILSVRC2012_val_00000374.JPEG n03272010/
+mv val/ILSVRC2012_val_00000375.JPEG n02099712/
+mv val/ILSVRC2012_val_00000376.JPEG n02999410/
+mv val/ILSVRC2012_val_00000377.JPEG n04179913/
+mv val/ILSVRC2012_val_00000378.JPEG n07831146/
+mv val/ILSVRC2012_val_00000379.JPEG n02096177/
+mv val/ILSVRC2012_val_00000380.JPEG n04350905/
+mv val/ILSVRC2012_val_00000381.JPEG n04507155/
+mv val/ILSVRC2012_val_00000382.JPEG n03743016/
+mv val/ILSVRC2012_val_00000383.JPEG n02105505/
+mv val/ILSVRC2012_val_00000384.JPEG n03649909/
+mv val/ILSVRC2012_val_00000385.JPEG n03680355/
+mv val/ILSVRC2012_val_00000386.JPEG n01910747/
+mv val/ILSVRC2012_val_00000387.JPEG n03529860/
+mv val/ILSVRC2012_val_00000388.JPEG n02787622/
+mv val/ILSVRC2012_val_00000389.JPEG n02012849/
+mv val/ILSVRC2012_val_00000390.JPEG n02011460/
+mv val/ILSVRC2012_val_00000391.JPEG n02094114/
+mv val/ILSVRC2012_val_00000392.JPEG n02950826/
+mv val/ILSVRC2012_val_00000393.JPEG n02105855/
+mv val/ILSVRC2012_val_00000394.JPEG n09288635/
+mv val/ILSVRC2012_val_00000395.JPEG n01773797/
+mv val/ILSVRC2012_val_00000396.JPEG n01774750/
+mv val/ILSVRC2012_val_00000397.JPEG n04409515/
+mv val/ILSVRC2012_val_00000398.JPEG n02497673/
+mv val/ILSVRC2012_val_00000399.JPEG n02113799/
+mv val/ILSVRC2012_val_00000400.JPEG n02786058/
+mv val/ILSVRC2012_val_00000401.JPEG n02443484/
+mv val/ILSVRC2012_val_00000402.JPEG n02981792/
+mv val/ILSVRC2012_val_00000403.JPEG n03095699/
+mv val/ILSVRC2012_val_00000404.JPEG n01664065/
+mv val/ILSVRC2012_val_00000405.JPEG n02092002/
+mv val/ILSVRC2012_val_00000406.JPEG n07711569/
+mv val/ILSVRC2012_val_00000407.JPEG n02219486/
+mv val/ILSVRC2012_val_00000408.JPEG n13133613/
+mv val/ILSVRC2012_val_00000409.JPEG n02114548/
+mv val/ILSVRC2012_val_00000410.JPEG n03529860/
+mv val/ILSVRC2012_val_00000411.JPEG n02097298/
+mv val/ILSVRC2012_val_00000412.JPEG n13133613/
+mv val/ILSVRC2012_val_00000413.JPEG n04355933/
+mv val/ILSVRC2012_val_00000414.JPEG n01537544/
+mv val/ILSVRC2012_val_00000415.JPEG n01847000/
+mv val/ILSVRC2012_val_00000416.JPEG n04428191/
+mv val/ILSVRC2012_val_00000417.JPEG n02666196/
+mv val/ILSVRC2012_val_00000418.JPEG n02268443/
+mv val/ILSVRC2012_val_00000419.JPEG n03291819/
+mv val/ILSVRC2012_val_00000420.JPEG n01828970/
+mv val/ILSVRC2012_val_00000421.JPEG n04099969/
+mv val/ILSVRC2012_val_00000422.JPEG n02747177/
+mv val/ILSVRC2012_val_00000423.JPEG n07720875/
+mv val/ILSVRC2012_val_00000424.JPEG n02088094/
+mv val/ILSVRC2012_val_00000425.JPEG n02113624/
+mv val/ILSVRC2012_val_00000426.JPEG n03710637/
+mv val/ILSVRC2012_val_00000427.JPEG n03637318/
+mv val/ILSVRC2012_val_00000428.JPEG n03942813/
+mv val/ILSVRC2012_val_00000429.JPEG n02093859/
+mv val/ILSVRC2012_val_00000430.JPEG n03794056/
+mv val/ILSVRC2012_val_00000431.JPEG n02930766/
+mv val/ILSVRC2012_val_00000432.JPEG n02930766/
+mv val/ILSVRC2012_val_00000433.JPEG n04525038/
+mv val/ILSVRC2012_val_00000434.JPEG n03796401/
+mv val/ILSVRC2012_val_00000435.JPEG n03709823/
+mv val/ILSVRC2012_val_00000436.JPEG n02097047/
+mv val/ILSVRC2012_val_00000437.JPEG n04604644/
+mv val/ILSVRC2012_val_00000438.JPEG n03938244/
+mv val/ILSVRC2012_val_00000439.JPEG n01560419/
+mv val/ILSVRC2012_val_00000440.JPEG n02097298/
+mv val/ILSVRC2012_val_00000441.JPEG n02091635/
+mv val/ILSVRC2012_val_00000442.JPEG n04136333/
+mv val/ILSVRC2012_val_00000443.JPEG n07718747/
+mv val/ILSVRC2012_val_00000444.JPEG n02417914/
+mv val/ILSVRC2012_val_00000445.JPEG n03355925/
+mv val/ILSVRC2012_val_00000446.JPEG n02445715/
+mv val/ILSVRC2012_val_00000447.JPEG n02445715/
+mv val/ILSVRC2012_val_00000448.JPEG n03495258/
+mv val/ILSVRC2012_val_00000449.JPEG n04447861/
+mv val/ILSVRC2012_val_00000450.JPEG n02111500/
+mv val/ILSVRC2012_val_00000451.JPEG n03584829/
+mv val/ILSVRC2012_val_00000452.JPEG n03977966/
+mv val/ILSVRC2012_val_00000453.JPEG n04116512/
+mv val/ILSVRC2012_val_00000454.JPEG n04019541/
+mv val/ILSVRC2012_val_00000455.JPEG n04200800/
+mv val/ILSVRC2012_val_00000456.JPEG n02408429/
+mv val/ILSVRC2012_val_00000457.JPEG n02085936/
+mv val/ILSVRC2012_val_00000458.JPEG n03992509/
+mv val/ILSVRC2012_val_00000459.JPEG n02769748/
+mv val/ILSVRC2012_val_00000460.JPEG n04613696/
+mv val/ILSVRC2012_val_00000461.JPEG n07716906/
+mv val/ILSVRC2012_val_00000462.JPEG n02085782/
+mv val/ILSVRC2012_val_00000463.JPEG n07718472/
+mv val/ILSVRC2012_val_00000464.JPEG n04398044/
+mv val/ILSVRC2012_val_00000465.JPEG n03920288/
+mv val/ILSVRC2012_val_00000466.JPEG n01860187/
+mv val/ILSVRC2012_val_00000467.JPEG n03272010/
+mv val/ILSVRC2012_val_00000468.JPEG n04008634/
+mv val/ILSVRC2012_val_00000469.JPEG n04090263/
+mv val/ILSVRC2012_val_00000470.JPEG n02028035/
+mv val/ILSVRC2012_val_00000471.JPEG n01677366/
+mv val/ILSVRC2012_val_00000472.JPEG n13037406/
+mv val/ILSVRC2012_val_00000473.JPEG n04067472/
+mv val/ILSVRC2012_val_00000474.JPEG n02095889/
+mv val/ILSVRC2012_val_00000475.JPEG n04532670/
+mv val/ILSVRC2012_val_00000476.JPEG n01582220/
+mv val/ILSVRC2012_val_00000477.JPEG n03476684/
+mv val/ILSVRC2012_val_00000478.JPEG n02395406/
+mv val/ILSVRC2012_val_00000479.JPEG n04487394/
+mv val/ILSVRC2012_val_00000480.JPEG n02443484/
+mv val/ILSVRC2012_val_00000481.JPEG n02510455/
+mv val/ILSVRC2012_val_00000482.JPEG n04550184/
+mv val/ILSVRC2012_val_00000483.JPEG n02814860/
+mv val/ILSVRC2012_val_00000484.JPEG n12144580/
+mv val/ILSVRC2012_val_00000485.JPEG n03126707/
+mv val/ILSVRC2012_val_00000486.JPEG n02486410/
+mv val/ILSVRC2012_val_00000487.JPEG n02125311/
+mv val/ILSVRC2012_val_00000488.JPEG n03777754/
+mv val/ILSVRC2012_val_00000489.JPEG n03924679/
+mv val/ILSVRC2012_val_00000490.JPEG n04613696/
+mv val/ILSVRC2012_val_00000491.JPEG n07875152/
+mv val/ILSVRC2012_val_00000492.JPEG n02058221/
+mv val/ILSVRC2012_val_00000493.JPEG n03188531/
+mv val/ILSVRC2012_val_00000494.JPEG n02777292/
+mv val/ILSVRC2012_val_00000495.JPEG n02489166/
+mv val/ILSVRC2012_val_00000496.JPEG n02066245/
+mv val/ILSVRC2012_val_00000497.JPEG n04579432/
+mv val/ILSVRC2012_val_00000498.JPEG n01630670/
+mv val/ILSVRC2012_val_00000499.JPEG n02666196/
+mv val/ILSVRC2012_val_00000500.JPEG n02091635/
+mv val/ILSVRC2012_val_00000501.JPEG n02114548/
+mv val/ILSVRC2012_val_00000502.JPEG n02356798/
+mv val/ILSVRC2012_val_00000503.JPEG n03201208/
+mv val/ILSVRC2012_val_00000504.JPEG n03240683/
+mv val/ILSVRC2012_val_00000505.JPEG n03590841/
+mv val/ILSVRC2012_val_00000506.JPEG n03018349/
+mv val/ILSVRC2012_val_00000507.JPEG n02104029/
+mv val/ILSVRC2012_val_00000508.JPEG n04251144/
+mv val/ILSVRC2012_val_00000509.JPEG n10148035/
+mv val/ILSVRC2012_val_00000510.JPEG n02169497/
+mv val/ILSVRC2012_val_00000511.JPEG n02089867/
+mv val/ILSVRC2012_val_00000512.JPEG n01734418/
+mv val/ILSVRC2012_val_00000513.JPEG n04476259/
+mv val/ILSVRC2012_val_00000514.JPEG n02843684/
+mv val/ILSVRC2012_val_00000515.JPEG n04008634/
+mv val/ILSVRC2012_val_00000516.JPEG n03400231/
+mv val/ILSVRC2012_val_00000517.JPEG n02119022/
+mv val/ILSVRC2012_val_00000518.JPEG n02137549/
+mv val/ILSVRC2012_val_00000519.JPEG n03761084/
+mv val/ILSVRC2012_val_00000520.JPEG n02490219/
+mv val/ILSVRC2012_val_00000521.JPEG n03840681/
+mv val/ILSVRC2012_val_00000522.JPEG n04346328/
+mv val/ILSVRC2012_val_00000523.JPEG n01677366/
+mv val/ILSVRC2012_val_00000524.JPEG n02102318/
+mv val/ILSVRC2012_val_00000525.JPEG n04458633/
+mv val/ILSVRC2012_val_00000526.JPEG n04476259/
+mv val/ILSVRC2012_val_00000527.JPEG n04209239/
+mv val/ILSVRC2012_val_00000528.JPEG n01795545/
+mv val/ILSVRC2012_val_00000529.JPEG n10565667/
+mv val/ILSVRC2012_val_00000530.JPEG n02114367/
+mv val/ILSVRC2012_val_00000531.JPEG n02107574/
+mv val/ILSVRC2012_val_00000532.JPEG n03032252/
+mv val/ILSVRC2012_val_00000533.JPEG n02104365/
+mv val/ILSVRC2012_val_00000534.JPEG n03133878/
+mv val/ILSVRC2012_val_00000535.JPEG n04336792/
+mv val/ILSVRC2012_val_00000536.JPEG n02112137/
+mv val/ILSVRC2012_val_00000537.JPEG n03000684/
+mv val/ILSVRC2012_val_00000538.JPEG n04553703/
+mv val/ILSVRC2012_val_00000539.JPEG n02102480/
+mv val/ILSVRC2012_val_00000540.JPEG n03825788/
+mv val/ILSVRC2012_val_00000541.JPEG n01695060/
+mv val/ILSVRC2012_val_00000542.JPEG n03250847/
+mv val/ILSVRC2012_val_00000543.JPEG n07860988/
+mv val/ILSVRC2012_val_00000544.JPEG n04310018/
+mv val/ILSVRC2012_val_00000545.JPEG n02071294/
+mv val/ILSVRC2012_val_00000546.JPEG n01945685/
+mv val/ILSVRC2012_val_00000547.JPEG n01855672/
+mv val/ILSVRC2012_val_00000548.JPEG n02037110/
+mv val/ILSVRC2012_val_00000549.JPEG n03868863/
+mv val/ILSVRC2012_val_00000550.JPEG n04229816/
+mv val/ILSVRC2012_val_00000551.JPEG n12057211/
+mv val/ILSVRC2012_val_00000552.JPEG n02408429/
+mv val/ILSVRC2012_val_00000553.JPEG n02481823/
+mv val/ILSVRC2012_val_00000554.JPEG n07716358/
+mv val/ILSVRC2012_val_00000555.JPEG n04487394/
+mv val/ILSVRC2012_val_00000556.JPEG n03662601/
+mv val/ILSVRC2012_val_00000557.JPEG n02979186/
+mv val/ILSVRC2012_val_00000558.JPEG n02910353/
+mv val/ILSVRC2012_val_00000559.JPEG n04266014/
+mv val/ILSVRC2012_val_00000560.JPEG n03895866/
+mv val/ILSVRC2012_val_00000561.JPEG n04443257/
+mv val/ILSVRC2012_val_00000562.JPEG n02917067/
+mv val/ILSVRC2012_val_00000563.JPEG n04149813/
+mv val/ILSVRC2012_val_00000564.JPEG n03041632/
+mv val/ILSVRC2012_val_00000565.JPEG n02364673/
+mv val/ILSVRC2012_val_00000566.JPEG n02999410/
+mv val/ILSVRC2012_val_00000567.JPEG n04435653/
+mv val/ILSVRC2012_val_00000568.JPEG n04228054/
+mv val/ILSVRC2012_val_00000569.JPEG n02814860/
+mv val/ILSVRC2012_val_00000570.JPEG n01531178/
+mv val/ILSVRC2012_val_00000571.JPEG n03662601/
+mv val/ILSVRC2012_val_00000572.JPEG n07880968/
+mv val/ILSVRC2012_val_00000573.JPEG n04487081/
+mv val/ILSVRC2012_val_00000574.JPEG n07614500/
+mv val/ILSVRC2012_val_00000575.JPEG n03532672/
+mv val/ILSVRC2012_val_00000576.JPEG n01807496/
+mv val/ILSVRC2012_val_00000577.JPEG n02011460/
+mv val/ILSVRC2012_val_00000578.JPEG n02074367/
+mv val/ILSVRC2012_val_00000579.JPEG n04462240/
+mv val/ILSVRC2012_val_00000580.JPEG n02977058/
+mv val/ILSVRC2012_val_00000581.JPEG n02281406/
+mv val/ILSVRC2012_val_00000582.JPEG n03041632/
+mv val/ILSVRC2012_val_00000583.JPEG n04350905/
+mv val/ILSVRC2012_val_00000584.JPEG n02788148/
+mv val/ILSVRC2012_val_00000585.JPEG n02137549/
+mv val/ILSVRC2012_val_00000586.JPEG n04562935/
+mv val/ILSVRC2012_val_00000587.JPEG n04590129/
+mv val/ILSVRC2012_val_00000588.JPEG n02093991/
+mv val/ILSVRC2012_val_00000589.JPEG n03995372/
+mv val/ILSVRC2012_val_00000590.JPEG n02111889/
+mv val/ILSVRC2012_val_00000591.JPEG n04081281/
+mv val/ILSVRC2012_val_00000592.JPEG n02133161/
+mv val/ILSVRC2012_val_00000593.JPEG n02006656/
+mv val/ILSVRC2012_val_00000594.JPEG n02107908/
+mv val/ILSVRC2012_val_00000595.JPEG n04347754/
+mv val/ILSVRC2012_val_00000596.JPEG n02950826/
+mv val/ILSVRC2012_val_00000597.JPEG n02504013/
+mv val/ILSVRC2012_val_00000598.JPEG n04560804/
+mv val/ILSVRC2012_val_00000599.JPEG n02088364/
+mv val/ILSVRC2012_val_00000600.JPEG n02128385/
+mv val/ILSVRC2012_val_00000601.JPEG n02860847/
+mv val/ILSVRC2012_val_00000602.JPEG n04399382/
+mv val/ILSVRC2012_val_00000603.JPEG n02105412/
+mv val/ILSVRC2012_val_00000604.JPEG n02115641/
+mv val/ILSVRC2012_val_00000605.JPEG n07753592/
+mv val/ILSVRC2012_val_00000606.JPEG n07880968/
+mv val/ILSVRC2012_val_00000607.JPEG n03598930/
+mv val/ILSVRC2012_val_00000608.JPEG n03724870/
+mv val/ILSVRC2012_val_00000609.JPEG n02066245/
+mv val/ILSVRC2012_val_00000610.JPEG n02128925/
+mv val/ILSVRC2012_val_00000611.JPEG n04465501/
+mv val/ILSVRC2012_val_00000612.JPEG n02094258/
+mv val/ILSVRC2012_val_00000613.JPEG n02086646/
+mv val/ILSVRC2012_val_00000614.JPEG n04141076/
+mv val/ILSVRC2012_val_00000615.JPEG n04136333/
+mv val/ILSVRC2012_val_00000616.JPEG n13133613/
+mv val/ILSVRC2012_val_00000617.JPEG n02342885/
+mv val/ILSVRC2012_val_00000618.JPEG n02281406/
+mv val/ILSVRC2012_val_00000619.JPEG n03443371/
+mv val/ILSVRC2012_val_00000620.JPEG n07613480/
+mv val/ILSVRC2012_val_00000621.JPEG n04008634/
+mv val/ILSVRC2012_val_00000622.JPEG n04141327/
+mv val/ILSVRC2012_val_00000623.JPEG n04347754/
+mv val/ILSVRC2012_val_00000624.JPEG n03314780/
+mv val/ILSVRC2012_val_00000625.JPEG n02165456/
+mv val/ILSVRC2012_val_00000626.JPEG n03930313/
+mv val/ILSVRC2012_val_00000627.JPEG n04392985/
+mv val/ILSVRC2012_val_00000628.JPEG n01872401/
+mv val/ILSVRC2012_val_00000629.JPEG n04204238/
+mv val/ILSVRC2012_val_00000630.JPEG n07831146/
+mv val/ILSVRC2012_val_00000631.JPEG n02690373/
+mv val/ILSVRC2012_val_00000632.JPEG n12144580/
+mv val/ILSVRC2012_val_00000633.JPEG n02776631/
+mv val/ILSVRC2012_val_00000634.JPEG n02877765/
+mv val/ILSVRC2012_val_00000635.JPEG n02108089/
+mv val/ILSVRC2012_val_00000636.JPEG n03532672/
+mv val/ILSVRC2012_val_00000637.JPEG n03126707/
+mv val/ILSVRC2012_val_00000638.JPEG n01560419/
+mv val/ILSVRC2012_val_00000639.JPEG n02268853/
+mv val/ILSVRC2012_val_00000640.JPEG n03691459/
+mv val/ILSVRC2012_val_00000641.JPEG n03404251/
+mv val/ILSVRC2012_val_00000642.JPEG n02364673/
+mv val/ILSVRC2012_val_00000643.JPEG n02101556/
+mv val/ILSVRC2012_val_00000644.JPEG n02326432/
+mv val/ILSVRC2012_val_00000645.JPEG n03954731/
+mv val/ILSVRC2012_val_00000646.JPEG n07831146/
+mv val/ILSVRC2012_val_00000647.JPEG n03584254/
+mv val/ILSVRC2012_val_00000648.JPEG n02012849/
+mv val/ILSVRC2012_val_00000649.JPEG n03804744/
+mv val/ILSVRC2012_val_00000650.JPEG n02128385/
+mv val/ILSVRC2012_val_00000651.JPEG n01530575/
+mv val/ILSVRC2012_val_00000652.JPEG n03933933/
+mv val/ILSVRC2012_val_00000653.JPEG n04409515/
+mv val/ILSVRC2012_val_00000654.JPEG n02823428/
+mv val/ILSVRC2012_val_00000655.JPEG n01877812/
+mv val/ILSVRC2012_val_00000656.JPEG n03920288/
+mv val/ILSVRC2012_val_00000657.JPEG n02510455/
+mv val/ILSVRC2012_val_00000658.JPEG n02112350/
+mv val/ILSVRC2012_val_00000659.JPEG n03594945/
+mv val/ILSVRC2012_val_00000660.JPEG n03642806/
+mv val/ILSVRC2012_val_00000661.JPEG n02395406/
+mv val/ILSVRC2012_val_00000662.JPEG n03452741/
+mv val/ILSVRC2012_val_00000663.JPEG n02860847/
+mv val/ILSVRC2012_val_00000664.JPEG n03673027/
+mv val/ILSVRC2012_val_00000665.JPEG n02102040/
+mv val/ILSVRC2012_val_00000666.JPEG n04505470/
+mv val/ILSVRC2012_val_00000667.JPEG n04086273/
+mv val/ILSVRC2012_val_00000668.JPEG n02099849/
+mv val/ILSVRC2012_val_00000669.JPEG n01990800/
+mv val/ILSVRC2012_val_00000670.JPEG n03781244/
+mv val/ILSVRC2012_val_00000671.JPEG n04461696/
+mv val/ILSVRC2012_val_00000672.JPEG n02106166/
+mv val/ILSVRC2012_val_00000673.JPEG n04141076/
+mv val/ILSVRC2012_val_00000674.JPEG n07717556/
+mv val/ILSVRC2012_val_00000675.JPEG n02361337/
+mv val/ILSVRC2012_val_00000676.JPEG n03976657/
+mv val/ILSVRC2012_val_00000677.JPEG n03832673/
+mv val/ILSVRC2012_val_00000678.JPEG n03109150/
+mv val/ILSVRC2012_val_00000679.JPEG n01776313/
+mv val/ILSVRC2012_val_00000680.JPEG n03788195/
+mv val/ILSVRC2012_val_00000681.JPEG n03884397/
+mv val/ILSVRC2012_val_00000682.JPEG n04019541/
+mv val/ILSVRC2012_val_00000683.JPEG n01693334/
+mv val/ILSVRC2012_val_00000684.JPEG n03633091/
+mv val/ILSVRC2012_val_00000685.JPEG n02325366/
+mv val/ILSVRC2012_val_00000686.JPEG n03623198/
+mv val/ILSVRC2012_val_00000687.JPEG n02795169/
+mv val/ILSVRC2012_val_00000688.JPEG n01744401/
+mv val/ILSVRC2012_val_00000689.JPEG n01955084/
+mv val/ILSVRC2012_val_00000690.JPEG n02002556/
+mv val/ILSVRC2012_val_00000691.JPEG n07754684/
+mv val/ILSVRC2012_val_00000692.JPEG n02174001/
+mv val/ILSVRC2012_val_00000693.JPEG n02793495/
+mv val/ILSVRC2012_val_00000694.JPEG n02095889/
+mv val/ILSVRC2012_val_00000695.JPEG n02484975/
+mv val/ILSVRC2012_val_00000696.JPEG n02094433/
+mv val/ILSVRC2012_val_00000697.JPEG n09229709/
+mv val/ILSVRC2012_val_00000698.JPEG n03207941/
+mv val/ILSVRC2012_val_00000699.JPEG n02655020/
+mv val/ILSVRC2012_val_00000700.JPEG n03773504/
+mv val/ILSVRC2012_val_00000701.JPEG n04367480/
+mv val/ILSVRC2012_val_00000702.JPEG n03933933/
+mv val/ILSVRC2012_val_00000703.JPEG n01955084/
+mv val/ILSVRC2012_val_00000704.JPEG n04355933/
+mv val/ILSVRC2012_val_00000705.JPEG n13040303/
+mv val/ILSVRC2012_val_00000706.JPEG n02786058/
+mv val/ILSVRC2012_val_00000707.JPEG n04090263/
+mv val/ILSVRC2012_val_00000708.JPEG n02101006/
+mv val/ILSVRC2012_val_00000709.JPEG n02124075/
+mv val/ILSVRC2012_val_00000710.JPEG n03720891/
+mv val/ILSVRC2012_val_00000711.JPEG n07749582/
+mv val/ILSVRC2012_val_00000712.JPEG n04517823/
+mv val/ILSVRC2012_val_00000713.JPEG n01534433/
+mv val/ILSVRC2012_val_00000714.JPEG n04335435/
+mv val/ILSVRC2012_val_00000715.JPEG n03661043/
+mv val/ILSVRC2012_val_00000716.JPEG n02101556/
+mv val/ILSVRC2012_val_00000717.JPEG n03785016/
+mv val/ILSVRC2012_val_00000718.JPEG n03133878/
+mv val/ILSVRC2012_val_00000719.JPEG n02113978/
+mv val/ILSVRC2012_val_00000720.JPEG n02930766/
+mv val/ILSVRC2012_val_00000721.JPEG n02783161/
+mv val/ILSVRC2012_val_00000722.JPEG n03958227/
+mv val/ILSVRC2012_val_00000723.JPEG n02441942/
+mv val/ILSVRC2012_val_00000724.JPEG n02859443/
+mv val/ILSVRC2012_val_00000725.JPEG n02096437/
+mv val/ILSVRC2012_val_00000726.JPEG n02447366/
+mv val/ILSVRC2012_val_00000727.JPEG n07742313/
+mv val/ILSVRC2012_val_00000728.JPEG n07583066/
+mv val/ILSVRC2012_val_00000729.JPEG n02110063/
+mv val/ILSVRC2012_val_00000730.JPEG n03146219/
+mv val/ILSVRC2012_val_00000731.JPEG n12998815/
+mv val/ILSVRC2012_val_00000732.JPEG n03425413/
+mv val/ILSVRC2012_val_00000733.JPEG n02123394/
+mv val/ILSVRC2012_val_00000734.JPEG n03594734/
+mv val/ILSVRC2012_val_00000735.JPEG n02006656/
+mv val/ILSVRC2012_val_00000736.JPEG n02992211/
+mv val/ILSVRC2012_val_00000737.JPEG n04442312/
+mv val/ILSVRC2012_val_00000738.JPEG n03032252/
+mv val/ILSVRC2012_val_00000739.JPEG n01608432/
+mv val/ILSVRC2012_val_00000740.JPEG n02927161/
+mv val/ILSVRC2012_val_00000741.JPEG n03485794/
+mv val/ILSVRC2012_val_00000742.JPEG n07583066/
+mv val/ILSVRC2012_val_00000743.JPEG n03347037/
+mv val/ILSVRC2012_val_00000744.JPEG n01847000/
+mv val/ILSVRC2012_val_00000745.JPEG n04557648/
+mv val/ILSVRC2012_val_00000746.JPEG n03478589/
+mv val/ILSVRC2012_val_00000747.JPEG n01530575/
+mv val/ILSVRC2012_val_00000748.JPEG n02098105/
+mv val/ILSVRC2012_val_00000749.JPEG n01755581/
+mv val/ILSVRC2012_val_00000750.JPEG n03045698/
+mv val/ILSVRC2012_val_00000751.JPEG n02028035/
+mv val/ILSVRC2012_val_00000752.JPEG n03538406/
+mv val/ILSVRC2012_val_00000753.JPEG n03956157/
+mv val/ILSVRC2012_val_00000754.JPEG n01871265/
+mv val/ILSVRC2012_val_00000755.JPEG n13044778/
+mv val/ILSVRC2012_val_00000756.JPEG n02119789/
+mv val/ILSVRC2012_val_00000757.JPEG n07875152/
+mv val/ILSVRC2012_val_00000758.JPEG n02107908/
+mv val/ILSVRC2012_val_00000759.JPEG n02791124/
+mv val/ILSVRC2012_val_00000760.JPEG n03697007/
+mv val/ILSVRC2012_val_00000761.JPEG n03207743/
+mv val/ILSVRC2012_val_00000762.JPEG n02791270/
+mv val/ILSVRC2012_val_00000763.JPEG n02865351/
+mv val/ILSVRC2012_val_00000764.JPEG n03345487/
+mv val/ILSVRC2012_val_00000765.JPEG n03976467/
+mv val/ILSVRC2012_val_00000766.JPEG n03124043/
+mv val/ILSVRC2012_val_00000767.JPEG n04252225/
+mv val/ILSVRC2012_val_00000768.JPEG n02165105/
+mv val/ILSVRC2012_val_00000769.JPEG n03314780/
+mv val/ILSVRC2012_val_00000770.JPEG n04040759/
+mv val/ILSVRC2012_val_00000771.JPEG n02730930/
+mv val/ILSVRC2012_val_00000772.JPEG n02236044/
+mv val/ILSVRC2012_val_00000773.JPEG n07873807/
+mv val/ILSVRC2012_val_00000774.JPEG n02006656/
+mv val/ILSVRC2012_val_00000775.JPEG n02514041/
+mv val/ILSVRC2012_val_00000776.JPEG n03534580/
+mv val/ILSVRC2012_val_00000777.JPEG n03179701/
+mv val/ILSVRC2012_val_00000778.JPEG n04366367/
+mv val/ILSVRC2012_val_00000779.JPEG n02138441/
+mv val/ILSVRC2012_val_00000780.JPEG n03450230/
+mv val/ILSVRC2012_val_00000781.JPEG n01943899/
+mv val/ILSVRC2012_val_00000782.JPEG n07836838/
+mv val/ILSVRC2012_val_00000783.JPEG n03691459/
+mv val/ILSVRC2012_val_00000784.JPEG n04467665/
+mv val/ILSVRC2012_val_00000785.JPEG n02115641/
+mv val/ILSVRC2012_val_00000786.JPEG n01742172/
+mv val/ILSVRC2012_val_00000787.JPEG n02795169/
+mv val/ILSVRC2012_val_00000788.JPEG n02481823/
+mv val/ILSVRC2012_val_00000789.JPEG n07583066/
+mv val/ILSVRC2012_val_00000790.JPEG n02749479/
+mv val/ILSVRC2012_val_00000791.JPEG n01665541/
+mv val/ILSVRC2012_val_00000792.JPEG n04131690/
+mv val/ILSVRC2012_val_00000793.JPEG n03769881/
+mv val/ILSVRC2012_val_00000794.JPEG n02009229/
+mv val/ILSVRC2012_val_00000795.JPEG n04487081/
+mv val/ILSVRC2012_val_00000796.JPEG n02123159/
+mv val/ILSVRC2012_val_00000797.JPEG n04542943/
+mv val/ILSVRC2012_val_00000798.JPEG n07760859/
+mv val/ILSVRC2012_val_00000799.JPEG n02097658/
+mv val/ILSVRC2012_val_00000800.JPEG n02113799/
+mv val/ILSVRC2012_val_00000801.JPEG n07932039/
+mv val/ILSVRC2012_val_00000802.JPEG n02097474/
+mv val/ILSVRC2012_val_00000803.JPEG n03793489/
+mv val/ILSVRC2012_val_00000804.JPEG n02791124/
+mv val/ILSVRC2012_val_00000805.JPEG n04591713/
+mv val/ILSVRC2012_val_00000806.JPEG n01735189/
+mv val/ILSVRC2012_val_00000807.JPEG n01631663/
+mv val/ILSVRC2012_val_00000808.JPEG n02892767/
+mv val/ILSVRC2012_val_00000809.JPEG n04458633/
+mv val/ILSVRC2012_val_00000810.JPEG n02277742/
+mv val/ILSVRC2012_val_00000811.JPEG n07697537/
+mv val/ILSVRC2012_val_00000812.JPEG n03781244/
+mv val/ILSVRC2012_val_00000813.JPEG n02791270/
+mv val/ILSVRC2012_val_00000814.JPEG n03854065/
+mv val/ILSVRC2012_val_00000815.JPEG n04356056/
+mv val/ILSVRC2012_val_00000816.JPEG n07802026/
+mv val/ILSVRC2012_val_00000817.JPEG n03733131/
+mv val/ILSVRC2012_val_00000818.JPEG n01980166/
+mv val/ILSVRC2012_val_00000819.JPEG n02174001/
+mv val/ILSVRC2012_val_00000820.JPEG n07684084/
+mv val/ILSVRC2012_val_00000821.JPEG n01981276/
+mv val/ILSVRC2012_val_00000822.JPEG n03874293/
+mv val/ILSVRC2012_val_00000823.JPEG n03146219/
+mv val/ILSVRC2012_val_00000824.JPEG n02099267/
+mv val/ILSVRC2012_val_00000825.JPEG n02018207/
+mv val/ILSVRC2012_val_00000826.JPEG n04398044/
+mv val/ILSVRC2012_val_00000827.JPEG n03832673/
+mv val/ILSVRC2012_val_00000828.JPEG n02493509/
+mv val/ILSVRC2012_val_00000829.JPEG n03478589/
+mv val/ILSVRC2012_val_00000830.JPEG n06359193/
+mv val/ILSVRC2012_val_00000831.JPEG n02971356/
+mv val/ILSVRC2012_val_00000832.JPEG n02093754/
+mv val/ILSVRC2012_val_00000833.JPEG n04487081/
+mv val/ILSVRC2012_val_00000834.JPEG n03929855/
+mv val/ILSVRC2012_val_00000835.JPEG n03485407/
+mv val/ILSVRC2012_val_00000836.JPEG n01930112/
+mv val/ILSVRC2012_val_00000837.JPEG n01592084/
+mv val/ILSVRC2012_val_00000838.JPEG n02088238/
+mv val/ILSVRC2012_val_00000839.JPEG n04613696/
+mv val/ILSVRC2012_val_00000840.JPEG n03967562/
+mv val/ILSVRC2012_val_00000841.JPEG n03814639/
+mv val/ILSVRC2012_val_00000842.JPEG n04311174/
+mv val/ILSVRC2012_val_00000843.JPEG n04286575/
+mv val/ILSVRC2012_val_00000844.JPEG n03884397/
+mv val/ILSVRC2012_val_00000845.JPEG n03534580/
+mv val/ILSVRC2012_val_00000846.JPEG n03793489/
+mv val/ILSVRC2012_val_00000847.JPEG n02106382/
+mv val/ILSVRC2012_val_00000848.JPEG n03045698/
+mv val/ILSVRC2012_val_00000849.JPEG n03661043/
+mv val/ILSVRC2012_val_00000850.JPEG n03814906/
+mv val/ILSVRC2012_val_00000851.JPEG n02669723/
+mv val/ILSVRC2012_val_00000852.JPEG n03459775/
+mv val/ILSVRC2012_val_00000853.JPEG n03785016/
+mv val/ILSVRC2012_val_00000854.JPEG n04584207/
+mv val/ILSVRC2012_val_00000855.JPEG n03657121/
+mv val/ILSVRC2012_val_00000856.JPEG n03476991/
+mv val/ILSVRC2012_val_00000857.JPEG n04243546/
+mv val/ILSVRC2012_val_00000858.JPEG n04560804/
+mv val/ILSVRC2012_val_00000859.JPEG n03788365/
+mv val/ILSVRC2012_val_00000860.JPEG n01796340/
+mv val/ILSVRC2012_val_00000861.JPEG n04019541/
+mv val/ILSVRC2012_val_00000862.JPEG n03496892/
+mv val/ILSVRC2012_val_00000863.JPEG n07711569/
+mv val/ILSVRC2012_val_00000864.JPEG n03788195/
+mv val/ILSVRC2012_val_00000865.JPEG n02133161/
+mv val/ILSVRC2012_val_00000866.JPEG n04548362/
+mv val/ILSVRC2012_val_00000867.JPEG n02113712/
+mv val/ILSVRC2012_val_00000868.JPEG n03673027/
+mv val/ILSVRC2012_val_00000869.JPEG n12144580/
+mv val/ILSVRC2012_val_00000870.JPEG n02481823/
+mv val/ILSVRC2012_val_00000871.JPEG n02132136/
+mv val/ILSVRC2012_val_00000872.JPEG n03956157/
+mv val/ILSVRC2012_val_00000873.JPEG n01532829/
+mv val/ILSVRC2012_val_00000874.JPEG n04493381/
+mv val/ILSVRC2012_val_00000875.JPEG n02094258/
+mv val/ILSVRC2012_val_00000876.JPEG n03483316/
+mv val/ILSVRC2012_val_00000877.JPEG n01770081/
+mv val/ILSVRC2012_val_00000878.JPEG n02006656/
+mv val/ILSVRC2012_val_00000879.JPEG n02871525/
+mv val/ILSVRC2012_val_00000880.JPEG n01580077/
+mv val/ILSVRC2012_val_00000881.JPEG n07730033/
+mv val/ILSVRC2012_val_00000882.JPEG n02097474/
+mv val/ILSVRC2012_val_00000883.JPEG n02093647/
+mv val/ILSVRC2012_val_00000884.JPEG n02088466/
+mv val/ILSVRC2012_val_00000885.JPEG n01795545/
+mv val/ILSVRC2012_val_00000886.JPEG n07716906/
+mv val/ILSVRC2012_val_00000887.JPEG n03481172/
+mv val/ILSVRC2012_val_00000888.JPEG n01608432/
+mv val/ILSVRC2012_val_00000889.JPEG n02097209/
+mv val/ILSVRC2012_val_00000890.JPEG n01629819/
+mv val/ILSVRC2012_val_00000891.JPEG n07695742/
+mv val/ILSVRC2012_val_00000892.JPEG n02389026/
+mv val/ILSVRC2012_val_00000893.JPEG n02977058/
+mv val/ILSVRC2012_val_00000894.JPEG n04090263/
+mv val/ILSVRC2012_val_00000895.JPEG n04522168/
+mv val/ILSVRC2012_val_00000896.JPEG n02871525/
+mv val/ILSVRC2012_val_00000897.JPEG n04258138/
+mv val/ILSVRC2012_val_00000898.JPEG n02127052/
+mv val/ILSVRC2012_val_00000899.JPEG n04476259/
+mv val/ILSVRC2012_val_00000900.JPEG n03617480/
+mv val/ILSVRC2012_val_00000901.JPEG n04273569/
+mv val/ILSVRC2012_val_00000902.JPEG n03485794/
+mv val/ILSVRC2012_val_00000903.JPEG n06794110/
+mv val/ILSVRC2012_val_00000904.JPEG n03085013/
+mv val/ILSVRC2012_val_00000905.JPEG n02974003/
+mv val/ILSVRC2012_val_00000906.JPEG n02869837/
+mv val/ILSVRC2012_val_00000907.JPEG n02086240/
+mv val/ILSVRC2012_val_00000908.JPEG n01685808/
+mv val/ILSVRC2012_val_00000909.JPEG n02088466/
+mv val/ILSVRC2012_val_00000910.JPEG n03584829/
+mv val/ILSVRC2012_val_00000911.JPEG n01514668/
+mv val/ILSVRC2012_val_00000912.JPEG n02114367/
+mv val/ILSVRC2012_val_00000913.JPEG n03447447/
+mv val/ILSVRC2012_val_00000914.JPEG n04435653/
+mv val/ILSVRC2012_val_00000915.JPEG n03065424/
+mv val/ILSVRC2012_val_00000916.JPEG n01616318/
+mv val/ILSVRC2012_val_00000917.JPEG n02841315/
+mv val/ILSVRC2012_val_00000918.JPEG n02655020/
+mv val/ILSVRC2012_val_00000919.JPEG n03496892/
+mv val/ILSVRC2012_val_00000920.JPEG n04040759/
+mv val/ILSVRC2012_val_00000921.JPEG n01496331/
+mv val/ILSVRC2012_val_00000922.JPEG n02094258/
+mv val/ILSVRC2012_val_00000923.JPEG n03787032/
+mv val/ILSVRC2012_val_00000924.JPEG n02172182/
+mv val/ILSVRC2012_val_00000925.JPEG n01693334/
+mv val/ILSVRC2012_val_00000926.JPEG n02168699/
+mv val/ILSVRC2012_val_00000927.JPEG n03793489/
+mv val/ILSVRC2012_val_00000928.JPEG n07613480/
+mv val/ILSVRC2012_val_00000929.JPEG n01824575/
+mv val/ILSVRC2012_val_00000930.JPEG n01665541/
+mv val/ILSVRC2012_val_00000931.JPEG n04065272/
+mv val/ILSVRC2012_val_00000932.JPEG n02699494/
+mv val/ILSVRC2012_val_00000933.JPEG n02526121/
+mv val/ILSVRC2012_val_00000934.JPEG n01774750/
+mv val/ILSVRC2012_val_00000935.JPEG n03126707/
+mv val/ILSVRC2012_val_00000936.JPEG n04254777/
+mv val/ILSVRC2012_val_00000937.JPEG n02325366/
+mv val/ILSVRC2012_val_00000938.JPEG n01665541/
+mv val/ILSVRC2012_val_00000939.JPEG n02007558/
+mv val/ILSVRC2012_val_00000940.JPEG n01873310/
+mv val/ILSVRC2012_val_00000941.JPEG n01734418/
+mv val/ILSVRC2012_val_00000942.JPEG n03271574/
+mv val/ILSVRC2012_val_00000943.JPEG n01776313/
+mv val/ILSVRC2012_val_00000944.JPEG n01644373/
+mv val/ILSVRC2012_val_00000945.JPEG n02486410/
+mv val/ILSVRC2012_val_00000946.JPEG n02106662/
+mv val/ILSVRC2012_val_00000947.JPEG n03125729/
+mv val/ILSVRC2012_val_00000948.JPEG n02087394/
+mv val/ILSVRC2012_val_00000949.JPEG n02094433/
+mv val/ILSVRC2012_val_00000950.JPEG n07684084/
+mv val/ILSVRC2012_val_00000951.JPEG n04532670/
+mv val/ILSVRC2012_val_00000952.JPEG n01843383/
+mv val/ILSVRC2012_val_00000953.JPEG n02835271/
+mv val/ILSVRC2012_val_00000954.JPEG n12985857/
+mv val/ILSVRC2012_val_00000955.JPEG n04485082/
+mv val/ILSVRC2012_val_00000956.JPEG n02167151/
+mv val/ILSVRC2012_val_00000957.JPEG n03394916/
+mv val/ILSVRC2012_val_00000958.JPEG n01664065/
+mv val/ILSVRC2012_val_00000959.JPEG n04286575/
+mv val/ILSVRC2012_val_00000960.JPEG n03874293/
+mv val/ILSVRC2012_val_00000961.JPEG n02699494/
+mv val/ILSVRC2012_val_00000962.JPEG n01601694/
+mv val/ILSVRC2012_val_00000963.JPEG n01582220/
+mv val/ILSVRC2012_val_00000964.JPEG n02486261/
+mv val/ILSVRC2012_val_00000965.JPEG n02268853/
+mv val/ILSVRC2012_val_00000966.JPEG n03947888/
+mv val/ILSVRC2012_val_00000967.JPEG n13040303/
+mv val/ILSVRC2012_val_00000968.JPEG n03967562/
+mv val/ILSVRC2012_val_00000969.JPEG n03602883/
+mv val/ILSVRC2012_val_00000970.JPEG n01882714/
+mv val/ILSVRC2012_val_00000971.JPEG n04505470/
+mv val/ILSVRC2012_val_00000972.JPEG n02226429/
+mv val/ILSVRC2012_val_00000973.JPEG n04522168/
+mv val/ILSVRC2012_val_00000974.JPEG n02481823/
+mv val/ILSVRC2012_val_00000975.JPEG n02108422/
+mv val/ILSVRC2012_val_00000976.JPEG n03670208/
+mv val/ILSVRC2012_val_00000977.JPEG n07718747/
+mv val/ILSVRC2012_val_00000978.JPEG n01688243/
+mv val/ILSVRC2012_val_00000979.JPEG n02747177/
+mv val/ILSVRC2012_val_00000980.JPEG n07248320/
+mv val/ILSVRC2012_val_00000981.JPEG n02328150/
+mv val/ILSVRC2012_val_00000982.JPEG n02963159/
+mv val/ILSVRC2012_val_00000983.JPEG n02117135/
+mv val/ILSVRC2012_val_00000984.JPEG n03676483/
+mv val/ILSVRC2012_val_00000985.JPEG n06596364/
+mv val/ILSVRC2012_val_00000986.JPEG n01775062/
+mv val/ILSVRC2012_val_00000987.JPEG n03724870/
+mv val/ILSVRC2012_val_00000988.JPEG n03347037/
+mv val/ILSVRC2012_val_00000989.JPEG n13133613/
+mv val/ILSVRC2012_val_00000990.JPEG n02319095/
+mv val/ILSVRC2012_val_00000991.JPEG n03944341/
+mv val/ILSVRC2012_val_00000992.JPEG n02088238/
+mv val/ILSVRC2012_val_00000993.JPEG n02110185/
+mv val/ILSVRC2012_val_00000994.JPEG n01443537/
+mv val/ILSVRC2012_val_00000995.JPEG n06794110/
+mv val/ILSVRC2012_val_00000996.JPEG n02606052/
+mv val/ILSVRC2012_val_00000997.JPEG n02113186/
+mv val/ILSVRC2012_val_00000998.JPEG n02704792/
+mv val/ILSVRC2012_val_00000999.JPEG n03692522/
+mv val/ILSVRC2012_val_00001000.JPEG n03018349/
+mv val/ILSVRC2012_val_00001001.JPEG n02095314/
+mv val/ILSVRC2012_val_00001002.JPEG n04523525/
+mv val/ILSVRC2012_val_00001003.JPEG n02356798/
+mv val/ILSVRC2012_val_00001004.JPEG n04228054/
+mv val/ILSVRC2012_val_00001005.JPEG n02108000/
+mv val/ILSVRC2012_val_00001006.JPEG n04371430/
+mv val/ILSVRC2012_val_00001007.JPEG n01770393/
+mv val/ILSVRC2012_val_00001008.JPEG n04456115/
+mv val/ILSVRC2012_val_00001009.JPEG n02110958/
+mv val/ILSVRC2012_val_00001010.JPEG n01631663/
+mv val/ILSVRC2012_val_00001011.JPEG n02708093/
+mv val/ILSVRC2012_val_00001012.JPEG n02835271/
+mv val/ILSVRC2012_val_00001013.JPEG n02807133/
+mv val/ILSVRC2012_val_00001014.JPEG n02280649/
+mv val/ILSVRC2012_val_00001015.JPEG n02277742/
+mv val/ILSVRC2012_val_00001016.JPEG n03857828/
+mv val/ILSVRC2012_val_00001017.JPEG n03452741/
+mv val/ILSVRC2012_val_00001018.JPEG n03388043/
+mv val/ILSVRC2012_val_00001019.JPEG n06596364/
+mv val/ILSVRC2012_val_00001020.JPEG n04252225/
+mv val/ILSVRC2012_val_00001021.JPEG n04458633/
+mv val/ILSVRC2012_val_00001022.JPEG n01689811/
+mv val/ILSVRC2012_val_00001023.JPEG n03935335/
+mv val/ILSVRC2012_val_00001024.JPEG n01560419/
+mv val/ILSVRC2012_val_00001025.JPEG n02500267/
+mv val/ILSVRC2012_val_00001026.JPEG n02319095/
+mv val/ILSVRC2012_val_00001027.JPEG n02412080/
+mv val/ILSVRC2012_val_00001028.JPEG n02096437/
+mv val/ILSVRC2012_val_00001029.JPEG n03814639/
+mv val/ILSVRC2012_val_00001030.JPEG n03494278/
+mv val/ILSVRC2012_val_00001031.JPEG n01518878/
+mv val/ILSVRC2012_val_00001032.JPEG n02486261/
+mv val/ILSVRC2012_val_00001033.JPEG n01629819/
+mv val/ILSVRC2012_val_00001034.JPEG n04606251/
+mv val/ILSVRC2012_val_00001035.JPEG n03787032/
+mv val/ILSVRC2012_val_00001036.JPEG n01877812/
+mv val/ILSVRC2012_val_00001037.JPEG n01773157/
+mv val/ILSVRC2012_val_00001038.JPEG n02104365/
+mv val/ILSVRC2012_val_00001039.JPEG n02113978/
+mv val/ILSVRC2012_val_00001040.JPEG n02123394/
+mv val/ILSVRC2012_val_00001041.JPEG n02966687/
+mv val/ILSVRC2012_val_00001042.JPEG n01728920/
+mv val/ILSVRC2012_val_00001043.JPEG n02916936/
+mv val/ILSVRC2012_val_00001044.JPEG n01860187/
+mv val/ILSVRC2012_val_00001045.JPEG n03255030/
+mv val/ILSVRC2012_val_00001046.JPEG n02011460/
+mv val/ILSVRC2012_val_00001047.JPEG n02087394/
+mv val/ILSVRC2012_val_00001048.JPEG n02817516/
+mv val/ILSVRC2012_val_00001049.JPEG n02085620/
+mv val/ILSVRC2012_val_00001050.JPEG n02437616/
+mv val/ILSVRC2012_val_00001051.JPEG n02606052/
+mv val/ILSVRC2012_val_00001052.JPEG n03447721/
+mv val/ILSVRC2012_val_00001053.JPEG n01773157/
+mv val/ILSVRC2012_val_00001054.JPEG n02497673/
+mv val/ILSVRC2012_val_00001055.JPEG n04380533/
+mv val/ILSVRC2012_val_00001056.JPEG n02056570/
+mv val/ILSVRC2012_val_00001057.JPEG n01917289/
+mv val/ILSVRC2012_val_00001058.JPEG n12267677/
+mv val/ILSVRC2012_val_00001059.JPEG n04325704/
+mv val/ILSVRC2012_val_00001060.JPEG n02130308/
+mv val/ILSVRC2012_val_00001061.JPEG n02730930/
+mv val/ILSVRC2012_val_00001062.JPEG n03933933/
+mv val/ILSVRC2012_val_00001063.JPEG n02981792/
+mv val/ILSVRC2012_val_00001064.JPEG n07892512/
+mv val/ILSVRC2012_val_00001065.JPEG n02112018/
+mv val/ILSVRC2012_val_00001066.JPEG n02398521/
+mv val/ILSVRC2012_val_00001067.JPEG n02009912/
+mv val/ILSVRC2012_val_00001068.JPEG n02002724/
+mv val/ILSVRC2012_val_00001069.JPEG n02086079/
+mv val/ILSVRC2012_val_00001070.JPEG n02100236/
+mv val/ILSVRC2012_val_00001071.JPEG n03085013/
+mv val/ILSVRC2012_val_00001072.JPEG n02837789/
+mv val/ILSVRC2012_val_00001073.JPEG n02018795/
+mv val/ILSVRC2012_val_00001074.JPEG n02106382/
+mv val/ILSVRC2012_val_00001075.JPEG n02489166/
+mv val/ILSVRC2012_val_00001076.JPEG n03937543/
+mv val/ILSVRC2012_val_00001077.JPEG n02910353/
+mv val/ILSVRC2012_val_00001078.JPEG n07836838/
+mv val/ILSVRC2012_val_00001079.JPEG n15075141/
+mv val/ILSVRC2012_val_00001080.JPEG n02877765/
+mv val/ILSVRC2012_val_00001081.JPEG n03602883/
+mv val/ILSVRC2012_val_00001082.JPEG n02233338/
+mv val/ILSVRC2012_val_00001083.JPEG n13037406/
+mv val/ILSVRC2012_val_00001084.JPEG n01580077/
+mv val/ILSVRC2012_val_00001085.JPEG n04069434/
+mv val/ILSVRC2012_val_00001086.JPEG n04371774/
+mv val/ILSVRC2012_val_00001087.JPEG n03938244/
+mv val/ILSVRC2012_val_00001088.JPEG n02326432/
+mv val/ILSVRC2012_val_00001089.JPEG n03085013/
+mv val/ILSVRC2012_val_00001090.JPEG n02804610/
+mv val/ILSVRC2012_val_00001091.JPEG n04141975/
+mv val/ILSVRC2012_val_00001092.JPEG n02484975/
+mv val/ILSVRC2012_val_00001093.JPEG n02930766/
+mv val/ILSVRC2012_val_00001094.JPEG n03000134/
+mv val/ILSVRC2012_val_00001095.JPEG n02488702/
+mv val/ILSVRC2012_val_00001096.JPEG n02113023/
+mv val/ILSVRC2012_val_00001097.JPEG n02088632/
+mv val/ILSVRC2012_val_00001098.JPEG n02783161/
+mv val/ILSVRC2012_val_00001099.JPEG n02490219/
+mv val/ILSVRC2012_val_00001100.JPEG n04505470/
+mv val/ILSVRC2012_val_00001101.JPEG n02123394/
+mv val/ILSVRC2012_val_00001102.JPEG n04357314/
+mv val/ILSVRC2012_val_00001103.JPEG n02825657/
+mv val/ILSVRC2012_val_00001104.JPEG n02493509/
+mv val/ILSVRC2012_val_00001105.JPEG n03720891/
+mv val/ILSVRC2012_val_00001106.JPEG n03673027/
+mv val/ILSVRC2012_val_00001107.JPEG n03492542/
+mv val/ILSVRC2012_val_00001108.JPEG n01739381/
+mv val/ILSVRC2012_val_00001109.JPEG n02105056/
+mv val/ILSVRC2012_val_00001110.JPEG n03481172/
+mv val/ILSVRC2012_val_00001111.JPEG n03947888/
+mv val/ILSVRC2012_val_00001112.JPEG n02099601/
+mv val/ILSVRC2012_val_00001113.JPEG n02105505/
+mv val/ILSVRC2012_val_00001114.JPEG n01514859/
+mv val/ILSVRC2012_val_00001115.JPEG n07871810/
+mv val/ILSVRC2012_val_00001116.JPEG n03445924/
+mv val/ILSVRC2012_val_00001117.JPEG n12267677/
+mv val/ILSVRC2012_val_00001118.JPEG n04536866/
+mv val/ILSVRC2012_val_00001119.JPEG n03314780/
+mv val/ILSVRC2012_val_00001120.JPEG n12768682/
+mv val/ILSVRC2012_val_00001121.JPEG n02028035/
+mv val/ILSVRC2012_val_00001122.JPEG n01980166/
+mv val/ILSVRC2012_val_00001123.JPEG n02099601/
+mv val/ILSVRC2012_val_00001124.JPEG n01981276/
+mv val/ILSVRC2012_val_00001125.JPEG n07730033/
+mv val/ILSVRC2012_val_00001126.JPEG n02909870/
+mv val/ILSVRC2012_val_00001127.JPEG n04179913/
+mv val/ILSVRC2012_val_00001128.JPEG n02089973/
+mv val/ILSVRC2012_val_00001129.JPEG n02111277/
+mv val/ILSVRC2012_val_00001130.JPEG n12057211/
+mv val/ILSVRC2012_val_00001131.JPEG n01632458/
+mv val/ILSVRC2012_val_00001132.JPEG n02123394/
+mv val/ILSVRC2012_val_00001133.JPEG n04350905/
+mv val/ILSVRC2012_val_00001134.JPEG n03937543/
+mv val/ILSVRC2012_val_00001135.JPEG n02730930/
+mv val/ILSVRC2012_val_00001136.JPEG n01795545/
+mv val/ILSVRC2012_val_00001137.JPEG n02091244/
+mv val/ILSVRC2012_val_00001138.JPEG n01632777/
+mv val/ILSVRC2012_val_00001139.JPEG n03584829/
+mv val/ILSVRC2012_val_00001140.JPEG n03709823/
+mv val/ILSVRC2012_val_00001141.JPEG n02086646/
+mv val/ILSVRC2012_val_00001142.JPEG n01824575/
+mv val/ILSVRC2012_val_00001143.JPEG n03977966/
+mv val/ILSVRC2012_val_00001144.JPEG n03417042/
+mv val/ILSVRC2012_val_00001145.JPEG n02892201/
+mv val/ILSVRC2012_val_00001146.JPEG n01806143/
+mv val/ILSVRC2012_val_00001147.JPEG n02105855/
+mv val/ILSVRC2012_val_00001148.JPEG n02115913/
+mv val/ILSVRC2012_val_00001149.JPEG n03902125/
+mv val/ILSVRC2012_val_00001150.JPEG n01774384/
+mv val/ILSVRC2012_val_00001151.JPEG n07880968/
+mv val/ILSVRC2012_val_00001152.JPEG n02112137/
+mv val/ILSVRC2012_val_00001153.JPEG n09428293/
+mv val/ILSVRC2012_val_00001154.JPEG n04116512/
+mv val/ILSVRC2012_val_00001155.JPEG n02486410/
+mv val/ILSVRC2012_val_00001156.JPEG n03930630/
+mv val/ILSVRC2012_val_00001157.JPEG n04090263/
+mv val/ILSVRC2012_val_00001158.JPEG n01843383/
+mv val/ILSVRC2012_val_00001159.JPEG n07802026/
+mv val/ILSVRC2012_val_00001160.JPEG n04429376/
+mv val/ILSVRC2012_val_00001161.JPEG n02317335/
+mv val/ILSVRC2012_val_00001162.JPEG n02027492/
+mv val/ILSVRC2012_val_00001163.JPEG n01818515/
+mv val/ILSVRC2012_val_00001164.JPEG n02086646/
+mv val/ILSVRC2012_val_00001165.JPEG n02018207/
+mv val/ILSVRC2012_val_00001166.JPEG n04371430/
+mv val/ILSVRC2012_val_00001167.JPEG n03347037/
+mv val/ILSVRC2012_val_00001168.JPEG n03014705/
+mv val/ILSVRC2012_val_00001169.JPEG n04125021/
+mv val/ILSVRC2012_val_00001170.JPEG n03764736/
+mv val/ILSVRC2012_val_00001171.JPEG n02981792/
+mv val/ILSVRC2012_val_00001172.JPEG n02114367/
+mv val/ILSVRC2012_val_00001173.JPEG n04192698/
+mv val/ILSVRC2012_val_00001174.JPEG n04330267/
+mv val/ILSVRC2012_val_00001175.JPEG n03729826/
+mv val/ILSVRC2012_val_00001176.JPEG n02607072/
+mv val/ILSVRC2012_val_00001177.JPEG n02504458/
+mv val/ILSVRC2012_val_00001178.JPEG n03769881/
+mv val/ILSVRC2012_val_00001179.JPEG n02018207/
+mv val/ILSVRC2012_val_00001180.JPEG n03929855/
+mv val/ILSVRC2012_val_00001181.JPEG n04591157/
+mv val/ILSVRC2012_val_00001182.JPEG n03947888/
+mv val/ILSVRC2012_val_00001183.JPEG n04317175/
+mv val/ILSVRC2012_val_00001184.JPEG n03125729/
+mv val/ILSVRC2012_val_00001185.JPEG n01749939/
+mv val/ILSVRC2012_val_00001186.JPEG n04399382/
+mv val/ILSVRC2012_val_00001187.JPEG n02276258/
+mv val/ILSVRC2012_val_00001188.JPEG n03598930/
+mv val/ILSVRC2012_val_00001189.JPEG n02606052/
+mv val/ILSVRC2012_val_00001190.JPEG n03089624/
+mv val/ILSVRC2012_val_00001191.JPEG n02099601/
+mv val/ILSVRC2012_val_00001192.JPEG n03770439/
+mv val/ILSVRC2012_val_00001193.JPEG n02655020/
+mv val/ILSVRC2012_val_00001194.JPEG n07745940/
+mv val/ILSVRC2012_val_00001195.JPEG n02095314/
+mv val/ILSVRC2012_val_00001196.JPEG n04336792/
+mv val/ILSVRC2012_val_00001197.JPEG n04033995/
+mv val/ILSVRC2012_val_00001198.JPEG n02112018/
+mv val/ILSVRC2012_val_00001199.JPEG n02132136/
+mv val/ILSVRC2012_val_00001200.JPEG n02860847/
+mv val/ILSVRC2012_val_00001201.JPEG n03100240/
+mv val/ILSVRC2012_val_00001202.JPEG n02966687/
+mv val/ILSVRC2012_val_00001203.JPEG n02111129/
+mv val/ILSVRC2012_val_00001204.JPEG n04273569/
+mv val/ILSVRC2012_val_00001205.JPEG n04149813/
+mv val/ILSVRC2012_val_00001206.JPEG n02092002/
+mv val/ILSVRC2012_val_00001207.JPEG n03769881/
+mv val/ILSVRC2012_val_00001208.JPEG n04599235/
+mv val/ILSVRC2012_val_00001209.JPEG n03825788/
+mv val/ILSVRC2012_val_00001210.JPEG n04118776/
+mv val/ILSVRC2012_val_00001211.JPEG n04336792/
+mv val/ILSVRC2012_val_00001212.JPEG n02115641/
+mv val/ILSVRC2012_val_00001213.JPEG n01622779/
+mv val/ILSVRC2012_val_00001214.JPEG n02909870/
+mv val/ILSVRC2012_val_00001215.JPEG n02276258/
+mv val/ILSVRC2012_val_00001216.JPEG n02977058/
+mv val/ILSVRC2012_val_00001217.JPEG n02326432/
+mv val/ILSVRC2012_val_00001218.JPEG n01608432/
+mv val/ILSVRC2012_val_00001219.JPEG n03347037/
+mv val/ILSVRC2012_val_00001220.JPEG n02978881/
+mv val/ILSVRC2012_val_00001221.JPEG n02787622/
+mv val/ILSVRC2012_val_00001222.JPEG n02093256/
+mv val/ILSVRC2012_val_00001223.JPEG n02101556/
+mv val/ILSVRC2012_val_00001224.JPEG n02100735/
+mv val/ILSVRC2012_val_00001225.JPEG n02085782/
+mv val/ILSVRC2012_val_00001226.JPEG n02342885/
+mv val/ILSVRC2012_val_00001227.JPEG n03733281/
+mv val/ILSVRC2012_val_00001228.JPEG n02085782/
+mv val/ILSVRC2012_val_00001229.JPEG n03706229/
+mv val/ILSVRC2012_val_00001230.JPEG n02002724/
+mv val/ILSVRC2012_val_00001231.JPEG n13037406/
+mv val/ILSVRC2012_val_00001232.JPEG n02422106/
+mv val/ILSVRC2012_val_00001233.JPEG n07614500/
+mv val/ILSVRC2012_val_00001234.JPEG n02113712/
+mv val/ILSVRC2012_val_00001235.JPEG n04336792/
+mv val/ILSVRC2012_val_00001236.JPEG n02486261/
+mv val/ILSVRC2012_val_00001237.JPEG n02356798/
+mv val/ILSVRC2012_val_00001238.JPEG n02268443/
+mv val/ILSVRC2012_val_00001239.JPEG n04179913/
+mv val/ILSVRC2012_val_00001240.JPEG n04277352/
+mv val/ILSVRC2012_val_00001241.JPEG n02346627/
+mv val/ILSVRC2012_val_00001242.JPEG n03089624/
+mv val/ILSVRC2012_val_00001243.JPEG n02835271/
+mv val/ILSVRC2012_val_00001244.JPEG n02086240/
+mv val/ILSVRC2012_val_00001245.JPEG n04579432/
+mv val/ILSVRC2012_val_00001246.JPEG n03180011/
+mv val/ILSVRC2012_val_00001247.JPEG n04285008/
+mv val/ILSVRC2012_val_00001248.JPEG n02408429/
+mv val/ILSVRC2012_val_00001249.JPEG n04392985/
+mv val/ILSVRC2012_val_00001250.JPEG n02091244/
+mv val/ILSVRC2012_val_00001251.JPEG n02815834/
+mv val/ILSVRC2012_val_00001252.JPEG n02834397/
+mv val/ILSVRC2012_val_00001253.JPEG n04009552/
+mv val/ILSVRC2012_val_00001254.JPEG n02488291/
+mv val/ILSVRC2012_val_00001255.JPEG n03290653/
+mv val/ILSVRC2012_val_00001256.JPEG n03325584/
+mv val/ILSVRC2012_val_00001257.JPEG n03637318/
+mv val/ILSVRC2012_val_00001258.JPEG n02730930/
+mv val/ILSVRC2012_val_00001259.JPEG n02865351/
+mv val/ILSVRC2012_val_00001260.JPEG n02119789/
+mv val/ILSVRC2012_val_00001261.JPEG n03929855/
+mv val/ILSVRC2012_val_00001262.JPEG n03676483/
+mv val/ILSVRC2012_val_00001263.JPEG n04423845/
+mv val/ILSVRC2012_val_00001264.JPEG n03874293/
+mv val/ILSVRC2012_val_00001265.JPEG n03908618/
+mv val/ILSVRC2012_val_00001266.JPEG n03598930/
+mv val/ILSVRC2012_val_00001267.JPEG n02090379/
+mv val/ILSVRC2012_val_00001268.JPEG n01944390/
+mv val/ILSVRC2012_val_00001269.JPEG n04152593/
+mv val/ILSVRC2012_val_00001270.JPEG n09288635/
+mv val/ILSVRC2012_val_00001271.JPEG n02066245/
+mv val/ILSVRC2012_val_00001272.JPEG n01768244/
+mv val/ILSVRC2012_val_00001273.JPEG n03272010/
+mv val/ILSVRC2012_val_00001274.JPEG n01531178/
+mv val/ILSVRC2012_val_00001275.JPEG n03255030/
+mv val/ILSVRC2012_val_00001276.JPEG n03676483/
+mv val/ILSVRC2012_val_00001277.JPEG n02002556/
+mv val/ILSVRC2012_val_00001278.JPEG n02749479/
+mv val/ILSVRC2012_val_00001279.JPEG n02415577/
+mv val/ILSVRC2012_val_00001280.JPEG n02403003/
+mv val/ILSVRC2012_val_00001281.JPEG n07565083/
+mv val/ILSVRC2012_val_00001282.JPEG n02981792/
+mv val/ILSVRC2012_val_00001283.JPEG n01776313/
+mv val/ILSVRC2012_val_00001284.JPEG n02097474/
+mv val/ILSVRC2012_val_00001285.JPEG n02667093/
+mv val/ILSVRC2012_val_00001286.JPEG n02096177/
+mv val/ILSVRC2012_val_00001287.JPEG n03255030/
+mv val/ILSVRC2012_val_00001288.JPEG n01819313/
+mv val/ILSVRC2012_val_00001289.JPEG n02791124/
+mv val/ILSVRC2012_val_00001290.JPEG n02279972/
+mv val/ILSVRC2012_val_00001291.JPEG n04090263/
+mv val/ILSVRC2012_val_00001292.JPEG n09193705/
+mv val/ILSVRC2012_val_00001293.JPEG n04335435/
+mv val/ILSVRC2012_val_00001294.JPEG n03733131/
+mv val/ILSVRC2012_val_00001295.JPEG n03250847/
+mv val/ILSVRC2012_val_00001296.JPEG n04263257/
+mv val/ILSVRC2012_val_00001297.JPEG n02096585/
+mv val/ILSVRC2012_val_00001298.JPEG n03976467/
+mv val/ILSVRC2012_val_00001299.JPEG n02963159/
+mv val/ILSVRC2012_val_00001300.JPEG n04613696/
+mv val/ILSVRC2012_val_00001301.JPEG n04310018/
+mv val/ILSVRC2012_val_00001302.JPEG n02107574/
+mv val/ILSVRC2012_val_00001303.JPEG n03724870/
+mv val/ILSVRC2012_val_00001304.JPEG n09428293/
+mv val/ILSVRC2012_val_00001305.JPEG n02101006/
+mv val/ILSVRC2012_val_00001306.JPEG n04372370/
+mv val/ILSVRC2012_val_00001307.JPEG n03930630/
+mv val/ILSVRC2012_val_00001308.JPEG n07584110/
+mv val/ILSVRC2012_val_00001309.JPEG n01735189/
+mv val/ILSVRC2012_val_00001310.JPEG n04599235/
+mv val/ILSVRC2012_val_00001311.JPEG n02835271/
+mv val/ILSVRC2012_val_00001312.JPEG n04330267/
+mv val/ILSVRC2012_val_00001313.JPEG n02108915/
+mv val/ILSVRC2012_val_00001314.JPEG n02110185/
+mv val/ILSVRC2012_val_00001315.JPEG n07684084/
+mv val/ILSVRC2012_val_00001316.JPEG n04204347/
+mv val/ILSVRC2012_val_00001317.JPEG n02672831/
+mv val/ILSVRC2012_val_00001318.JPEG n03742115/
+mv val/ILSVRC2012_val_00001319.JPEG n04131690/
+mv val/ILSVRC2012_val_00001320.JPEG n09428293/
+mv val/ILSVRC2012_val_00001321.JPEG n04487394/
+mv val/ILSVRC2012_val_00001322.JPEG n03710193/
+mv val/ILSVRC2012_val_00001323.JPEG n09332890/
+mv val/ILSVRC2012_val_00001324.JPEG n03478589/
+mv val/ILSVRC2012_val_00001325.JPEG n04486054/
+mv val/ILSVRC2012_val_00001326.JPEG n02951358/
+mv val/ILSVRC2012_val_00001327.JPEG n09428293/
+mv val/ILSVRC2012_val_00001328.JPEG n04596742/
+mv val/ILSVRC2012_val_00001329.JPEG n01872401/
+mv val/ILSVRC2012_val_00001330.JPEG n04505470/
+mv val/ILSVRC2012_val_00001331.JPEG n04154565/
+mv val/ILSVRC2012_val_00001332.JPEG n02666196/
+mv val/ILSVRC2012_val_00001333.JPEG n02437616/
+mv val/ILSVRC2012_val_00001334.JPEG n03724870/
+mv val/ILSVRC2012_val_00001335.JPEG n02120079/
+mv val/ILSVRC2012_val_00001336.JPEG n01828970/
+mv val/ILSVRC2012_val_00001337.JPEG n03141823/
+mv val/ILSVRC2012_val_00001338.JPEG n01698640/
+mv val/ILSVRC2012_val_00001339.JPEG n03095699/
+mv val/ILSVRC2012_val_00001340.JPEG n04099969/
+mv val/ILSVRC2012_val_00001341.JPEG n02123045/
+mv val/ILSVRC2012_val_00001342.JPEG n04482393/
+mv val/ILSVRC2012_val_00001343.JPEG n04026417/
+mv val/ILSVRC2012_val_00001344.JPEG n02110806/
+mv val/ILSVRC2012_val_00001345.JPEG n04033901/
+mv val/ILSVRC2012_val_00001346.JPEG n04041544/
+mv val/ILSVRC2012_val_00001347.JPEG n02869837/
+mv val/ILSVRC2012_val_00001348.JPEG n04136333/
+mv val/ILSVRC2012_val_00001349.JPEG n02112350/
+mv val/ILSVRC2012_val_00001350.JPEG n03388043/
+mv val/ILSVRC2012_val_00001351.JPEG n03065424/
+mv val/ILSVRC2012_val_00001352.JPEG n02128757/
+mv val/ILSVRC2012_val_00001353.JPEG n04330267/
+mv val/ILSVRC2012_val_00001354.JPEG n02879718/
+mv val/ILSVRC2012_val_00001355.JPEG n02859443/
+mv val/ILSVRC2012_val_00001356.JPEG n01968897/
+mv val/ILSVRC2012_val_00001357.JPEG n01847000/
+mv val/ILSVRC2012_val_00001358.JPEG n01871265/
+mv val/ILSVRC2012_val_00001359.JPEG n02129165/
+mv val/ILSVRC2012_val_00001360.JPEG n02408429/
+mv val/ILSVRC2012_val_00001361.JPEG n04263257/
+mv val/ILSVRC2012_val_00001362.JPEG n13054560/
+mv val/ILSVRC2012_val_00001363.JPEG n02090379/
+mv val/ILSVRC2012_val_00001364.JPEG n04553703/
+mv val/ILSVRC2012_val_00001365.JPEG n03929660/
+mv val/ILSVRC2012_val_00001366.JPEG n01990800/
+mv val/ILSVRC2012_val_00001367.JPEG n03494278/
+mv val/ILSVRC2012_val_00001368.JPEG n01514859/
+mv val/ILSVRC2012_val_00001369.JPEG n02804610/
+mv val/ILSVRC2012_val_00001370.JPEG n01773157/
+mv val/ILSVRC2012_val_00001371.JPEG n02087046/
+mv val/ILSVRC2012_val_00001372.JPEG n07802026/
+mv val/ILSVRC2012_val_00001373.JPEG n03777754/
+mv val/ILSVRC2012_val_00001374.JPEG n07720875/
+mv val/ILSVRC2012_val_00001375.JPEG n01694178/
+mv val/ILSVRC2012_val_00001376.JPEG n06794110/
+mv val/ILSVRC2012_val_00001377.JPEG n02795169/
+mv val/ILSVRC2012_val_00001378.JPEG n07583066/
+mv val/ILSVRC2012_val_00001379.JPEG n02094114/
+mv val/ILSVRC2012_val_00001380.JPEG n03841143/
+mv val/ILSVRC2012_val_00001381.JPEG n01985128/
+mv val/ILSVRC2012_val_00001382.JPEG n03776460/
+mv val/ILSVRC2012_val_00001383.JPEG n02859443/
+mv val/ILSVRC2012_val_00001384.JPEG n02808304/
+mv val/ILSVRC2012_val_00001385.JPEG n02092339/
+mv val/ILSVRC2012_val_00001386.JPEG n02441942/
+mv val/ILSVRC2012_val_00001387.JPEG n02002724/
+mv val/ILSVRC2012_val_00001388.JPEG n04296562/
+mv val/ILSVRC2012_val_00001389.JPEG n02086910/
+mv val/ILSVRC2012_val_00001390.JPEG n02690373/
+mv val/ILSVRC2012_val_00001391.JPEG n01616318/
+mv val/ILSVRC2012_val_00001392.JPEG n07718472/
+mv val/ILSVRC2012_val_00001393.JPEG n02086240/
+mv val/ILSVRC2012_val_00001394.JPEG n04049303/
+mv val/ILSVRC2012_val_00001395.JPEG n04235860/
+mv val/ILSVRC2012_val_00001396.JPEG n06359193/
+mv val/ILSVRC2012_val_00001397.JPEG n02110958/
+mv val/ILSVRC2012_val_00001398.JPEG n01518878/
+mv val/ILSVRC2012_val_00001399.JPEG n02950826/
+mv val/ILSVRC2012_val_00001400.JPEG n03447721/
+mv val/ILSVRC2012_val_00001401.JPEG n02111129/
+mv val/ILSVRC2012_val_00001402.JPEG n04517823/
+mv val/ILSVRC2012_val_00001403.JPEG n03769881/
+mv val/ILSVRC2012_val_00001404.JPEG n02112350/
+mv val/ILSVRC2012_val_00001405.JPEG n07693725/
+mv val/ILSVRC2012_val_00001406.JPEG n07747607/
+mv val/ILSVRC2012_val_00001407.JPEG n02444819/
+mv val/ILSVRC2012_val_00001408.JPEG n02109047/
+mv val/ILSVRC2012_val_00001409.JPEG n04485082/
+mv val/ILSVRC2012_val_00001410.JPEG n10148035/
+mv val/ILSVRC2012_val_00001411.JPEG n03127925/
+mv val/ILSVRC2012_val_00001412.JPEG n04328186/
+mv val/ILSVRC2012_val_00001413.JPEG n03347037/
+mv val/ILSVRC2012_val_00001414.JPEG n02102480/
+mv val/ILSVRC2012_val_00001415.JPEG n07614500/
+mv val/ILSVRC2012_val_00001416.JPEG n02676566/
+mv val/ILSVRC2012_val_00001417.JPEG n04599235/
+mv val/ILSVRC2012_val_00001418.JPEG n03534580/
+mv val/ILSVRC2012_val_00001419.JPEG n02093256/
+mv val/ILSVRC2012_val_00001420.JPEG n03710721/
+mv val/ILSVRC2012_val_00001421.JPEG n02167151/
+mv val/ILSVRC2012_val_00001422.JPEG n04116512/
+mv val/ILSVRC2012_val_00001423.JPEG n04141975/
+mv val/ILSVRC2012_val_00001424.JPEG n03877472/
+mv val/ILSVRC2012_val_00001425.JPEG n02092339/
+mv val/ILSVRC2012_val_00001426.JPEG n03042490/
+mv val/ILSVRC2012_val_00001427.JPEG n04604644/
+mv val/ILSVRC2012_val_00001428.JPEG n03355925/
+mv val/ILSVRC2012_val_00001429.JPEG n04009552/
+mv val/ILSVRC2012_val_00001430.JPEG n03598930/
+mv val/ILSVRC2012_val_00001431.JPEG n02672831/
+mv val/ILSVRC2012_val_00001432.JPEG n03425413/
+mv val/ILSVRC2012_val_00001433.JPEG n03649909/
+mv val/ILSVRC2012_val_00001434.JPEG n02099429/
+mv val/ILSVRC2012_val_00001435.JPEG n01819313/
+mv val/ILSVRC2012_val_00001436.JPEG n02640242/
+mv val/ILSVRC2012_val_00001437.JPEG n02978881/
+mv val/ILSVRC2012_val_00001438.JPEG n03670208/
+mv val/ILSVRC2012_val_00001439.JPEG n02342885/
+mv val/ILSVRC2012_val_00001440.JPEG n03888257/
+mv val/ILSVRC2012_val_00001441.JPEG n03729826/
+mv val/ILSVRC2012_val_00001442.JPEG n02457408/
+mv val/ILSVRC2012_val_00001443.JPEG n02860847/
+mv val/ILSVRC2012_val_00001444.JPEG n09246464/
+mv val/ILSVRC2012_val_00001445.JPEG n02097298/
+mv val/ILSVRC2012_val_00001446.JPEG n03649909/
+mv val/ILSVRC2012_val_00001447.JPEG n04228054/
+mv val/ILSVRC2012_val_00001448.JPEG n02113624/
+mv val/ILSVRC2012_val_00001449.JPEG n01978287/
+mv val/ILSVRC2012_val_00001450.JPEG n03895866/
+mv val/ILSVRC2012_val_00001451.JPEG n03393912/
+mv val/ILSVRC2012_val_00001452.JPEG n03127925/
+mv val/ILSVRC2012_val_00001453.JPEG n03720891/
+mv val/ILSVRC2012_val_00001454.JPEG n01774384/
+mv val/ILSVRC2012_val_00001455.JPEG n04065272/
+mv val/ILSVRC2012_val_00001456.JPEG n03485407/
+mv val/ILSVRC2012_val_00001457.JPEG n04033901/
+mv val/ILSVRC2012_val_00001458.JPEG n02488291/
+mv val/ILSVRC2012_val_00001459.JPEG n12057211/
+mv val/ILSVRC2012_val_00001460.JPEG n01774750/
+mv val/ILSVRC2012_val_00001461.JPEG n01798484/
+mv val/ILSVRC2012_val_00001462.JPEG n01537544/
+mv val/ILSVRC2012_val_00001463.JPEG n07720875/
+mv val/ILSVRC2012_val_00001464.JPEG n03838899/
+mv val/ILSVRC2012_val_00001465.JPEG n04120489/
+mv val/ILSVRC2012_val_00001466.JPEG n02264363/
+mv val/ILSVRC2012_val_00001467.JPEG n02113978/
+mv val/ILSVRC2012_val_00001468.JPEG n02799071/
+mv val/ILSVRC2012_val_00001469.JPEG n02114367/
+mv val/ILSVRC2012_val_00001470.JPEG n04332243/
+mv val/ILSVRC2012_val_00001471.JPEG n03062245/
+mv val/ILSVRC2012_val_00001472.JPEG n02077923/
+mv val/ILSVRC2012_val_00001473.JPEG n02398521/
+mv val/ILSVRC2012_val_00001474.JPEG n04435653/
+mv val/ILSVRC2012_val_00001475.JPEG n01692333/
+mv val/ILSVRC2012_val_00001476.JPEG n07831146/
+mv val/ILSVRC2012_val_00001477.JPEG n04523525/
+mv val/ILSVRC2012_val_00001478.JPEG n02342885/
+mv val/ILSVRC2012_val_00001479.JPEG n07753275/
+mv val/ILSVRC2012_val_00001480.JPEG n01807496/
+mv val/ILSVRC2012_val_00001481.JPEG n02098413/
+mv val/ILSVRC2012_val_00001482.JPEG n01744401/
+mv val/ILSVRC2012_val_00001483.JPEG n07836838/
+mv val/ILSVRC2012_val_00001484.JPEG n02104029/
+mv val/ILSVRC2012_val_00001485.JPEG n02092339/
+mv val/ILSVRC2012_val_00001486.JPEG n02092339/
+mv val/ILSVRC2012_val_00001487.JPEG n02115913/
+mv val/ILSVRC2012_val_00001488.JPEG n01608432/
+mv val/ILSVRC2012_val_00001489.JPEG n03325584/
+mv val/ILSVRC2012_val_00001490.JPEG n02066245/
+mv val/ILSVRC2012_val_00001491.JPEG n03345487/
+mv val/ILSVRC2012_val_00001492.JPEG n03394916/
+mv val/ILSVRC2012_val_00001493.JPEG n01773797/
+mv val/ILSVRC2012_val_00001494.JPEG n02113186/
+mv val/ILSVRC2012_val_00001495.JPEG n02667093/
+mv val/ILSVRC2012_val_00001496.JPEG n02124075/
+mv val/ILSVRC2012_val_00001497.JPEG n04118538/
+mv val/ILSVRC2012_val_00001498.JPEG n02134084/
+mv val/ILSVRC2012_val_00001499.JPEG n02317335/
+mv val/ILSVRC2012_val_00001500.JPEG n03047690/
+mv val/ILSVRC2012_val_00001501.JPEG n03938244/
+mv val/ILSVRC2012_val_00001502.JPEG n02219486/
+mv val/ILSVRC2012_val_00001503.JPEG n07718747/
+mv val/ILSVRC2012_val_00001504.JPEG n02490219/
+mv val/ILSVRC2012_val_00001505.JPEG n04326547/
+mv val/ILSVRC2012_val_00001506.JPEG n02690373/
+mv val/ILSVRC2012_val_00001507.JPEG n07717556/
+mv val/ILSVRC2012_val_00001508.JPEG n01580077/
+mv val/ILSVRC2012_val_00001509.JPEG n02443484/
+mv val/ILSVRC2012_val_00001510.JPEG n04443257/
+mv val/ILSVRC2012_val_00001511.JPEG n04033995/
+mv val/ILSVRC2012_val_00001512.JPEG n07590611/
+mv val/ILSVRC2012_val_00001513.JPEG n02403003/
+mv val/ILSVRC2012_val_00001514.JPEG n07768694/
+mv val/ILSVRC2012_val_00001515.JPEG n03803284/
+mv val/ILSVRC2012_val_00001516.JPEG n04371774/
+mv val/ILSVRC2012_val_00001517.JPEG n02802426/
+mv val/ILSVRC2012_val_00001518.JPEG n06794110/
+mv val/ILSVRC2012_val_00001519.JPEG n04483307/
+mv val/ILSVRC2012_val_00001520.JPEG n02791270/
+mv val/ILSVRC2012_val_00001521.JPEG n02028035/
+mv val/ILSVRC2012_val_00001522.JPEG n03764736/
+mv val/ILSVRC2012_val_00001523.JPEG n07860988/
+mv val/ILSVRC2012_val_00001524.JPEG n09421951/
+mv val/ILSVRC2012_val_00001525.JPEG n03773504/
+mv val/ILSVRC2012_val_00001526.JPEG n04152593/
+mv val/ILSVRC2012_val_00001527.JPEG n04367480/
+mv val/ILSVRC2012_val_00001528.JPEG n02950826/
+mv val/ILSVRC2012_val_00001529.JPEG n02168699/
+mv val/ILSVRC2012_val_00001530.JPEG n04458633/
+mv val/ILSVRC2012_val_00001531.JPEG n01983481/
+mv val/ILSVRC2012_val_00001532.JPEG n04404412/
+mv val/ILSVRC2012_val_00001533.JPEG n04252225/
+mv val/ILSVRC2012_val_00001534.JPEG n04596742/
+mv val/ILSVRC2012_val_00001535.JPEG n02480495/
+mv val/ILSVRC2012_val_00001536.JPEG n02281787/
+mv val/ILSVRC2012_val_00001537.JPEG n01795545/
+mv val/ILSVRC2012_val_00001538.JPEG n02089867/
+mv val/ILSVRC2012_val_00001539.JPEG n02169497/
+mv val/ILSVRC2012_val_00001540.JPEG n02666196/
+mv val/ILSVRC2012_val_00001541.JPEG n04311004/
+mv val/ILSVRC2012_val_00001542.JPEG n02879718/
+mv val/ILSVRC2012_val_00001543.JPEG n03457902/
+mv val/ILSVRC2012_val_00001544.JPEG n02074367/
+mv val/ILSVRC2012_val_00001545.JPEG n03297495/
+mv val/ILSVRC2012_val_00001546.JPEG n02481823/
+mv val/ILSVRC2012_val_00001547.JPEG n04485082/
+mv val/ILSVRC2012_val_00001548.JPEG n02091244/
+mv val/ILSVRC2012_val_00001549.JPEG n07718747/
+mv val/ILSVRC2012_val_00001550.JPEG n02102480/
+mv val/ILSVRC2012_val_00001551.JPEG n04147183/
+mv val/ILSVRC2012_val_00001552.JPEG n03014705/
+mv val/ILSVRC2012_val_00001553.JPEG n02814860/
+mv val/ILSVRC2012_val_00001554.JPEG n04532670/
+mv val/ILSVRC2012_val_00001555.JPEG n02094114/
+mv val/ILSVRC2012_val_00001556.JPEG n01532829/
+mv val/ILSVRC2012_val_00001557.JPEG n01664065/
+mv val/ILSVRC2012_val_00001558.JPEG n04090263/
+mv val/ILSVRC2012_val_00001559.JPEG n03995372/
+mv val/ILSVRC2012_val_00001560.JPEG n03134739/
+mv val/ILSVRC2012_val_00001561.JPEG n06596364/
+mv val/ILSVRC2012_val_00001562.JPEG n03710637/
+mv val/ILSVRC2012_val_00001563.JPEG n01807496/
+mv val/ILSVRC2012_val_00001564.JPEG n02096294/
+mv val/ILSVRC2012_val_00001565.JPEG n04026417/
+mv val/ILSVRC2012_val_00001566.JPEG n02165105/
+mv val/ILSVRC2012_val_00001567.JPEG n03998194/
+mv val/ILSVRC2012_val_00001568.JPEG n02112706/
+mv val/ILSVRC2012_val_00001569.JPEG n04366367/
+mv val/ILSVRC2012_val_00001570.JPEG n02177972/
+mv val/ILSVRC2012_val_00001571.JPEG n04152593/
+mv val/ILSVRC2012_val_00001572.JPEG n04442312/
+mv val/ILSVRC2012_val_00001573.JPEG n01697457/
+mv val/ILSVRC2012_val_00001574.JPEG n03775071/
+mv val/ILSVRC2012_val_00001575.JPEG n07892512/
+mv val/ILSVRC2012_val_00001576.JPEG n02091831/
+mv val/ILSVRC2012_val_00001577.JPEG n02101388/
+mv val/ILSVRC2012_val_00001578.JPEG n01749939/
+mv val/ILSVRC2012_val_00001579.JPEG n03384352/
+mv val/ILSVRC2012_val_00001580.JPEG n02484975/
+mv val/ILSVRC2012_val_00001581.JPEG n03868242/
+mv val/ILSVRC2012_val_00001582.JPEG n01753488/
+mv val/ILSVRC2012_val_00001583.JPEG n02687172/
+mv val/ILSVRC2012_val_00001584.JPEG n02807133/
+mv val/ILSVRC2012_val_00001585.JPEG n02231487/
+mv val/ILSVRC2012_val_00001586.JPEG n02018795/
+mv val/ILSVRC2012_val_00001587.JPEG n04270147/
+mv val/ILSVRC2012_val_00001588.JPEG n03063599/
+mv val/ILSVRC2012_val_00001589.JPEG n04591713/
+mv val/ILSVRC2012_val_00001590.JPEG n03895866/
+mv val/ILSVRC2012_val_00001591.JPEG n03481172/
+mv val/ILSVRC2012_val_00001592.JPEG n04456115/
+mv val/ILSVRC2012_val_00001593.JPEG n01755581/
+mv val/ILSVRC2012_val_00001594.JPEG n02319095/
+mv val/ILSVRC2012_val_00001595.JPEG n02526121/
+mv val/ILSVRC2012_val_00001596.JPEG n01796340/
+mv val/ILSVRC2012_val_00001597.JPEG n02094433/
+mv val/ILSVRC2012_val_00001598.JPEG n01558993/
+mv val/ILSVRC2012_val_00001599.JPEG n04238763/
+mv val/ILSVRC2012_val_00001600.JPEG n03127925/
+mv val/ILSVRC2012_val_00001601.JPEG n03017168/
+mv val/ILSVRC2012_val_00001602.JPEG n02692877/
+mv val/ILSVRC2012_val_00001603.JPEG n04179913/
+mv val/ILSVRC2012_val_00001604.JPEG n02791124/
+mv val/ILSVRC2012_val_00001605.JPEG n03494278/
+mv val/ILSVRC2012_val_00001606.JPEG n06596364/
+mv val/ILSVRC2012_val_00001607.JPEG n01751748/
+mv val/ILSVRC2012_val_00001608.JPEG n02074367/
+mv val/ILSVRC2012_val_00001609.JPEG n03249569/
+mv val/ILSVRC2012_val_00001610.JPEG n04357314/
+mv val/ILSVRC2012_val_00001611.JPEG n07579787/
+mv val/ILSVRC2012_val_00001612.JPEG n04550184/
+mv val/ILSVRC2012_val_00001613.JPEG n06596364/
+mv val/ILSVRC2012_val_00001614.JPEG n03761084/
+mv val/ILSVRC2012_val_00001615.JPEG n07718472/
+mv val/ILSVRC2012_val_00001616.JPEG n03376595/
+mv val/ILSVRC2012_val_00001617.JPEG n04428191/
+mv val/ILSVRC2012_val_00001618.JPEG n01773157/
+mv val/ILSVRC2012_val_00001619.JPEG n07248320/
+mv val/ILSVRC2012_val_00001620.JPEG n03400231/
+mv val/ILSVRC2012_val_00001621.JPEG n04447861/
+mv val/ILSVRC2012_val_00001622.JPEG n03854065/
+mv val/ILSVRC2012_val_00001623.JPEG n01694178/
+mv val/ILSVRC2012_val_00001624.JPEG n02111500/
+mv val/ILSVRC2012_val_00001625.JPEG n04111531/
+mv val/ILSVRC2012_val_00001626.JPEG n02090622/
+mv val/ILSVRC2012_val_00001627.JPEG n03450230/
+mv val/ILSVRC2012_val_00001628.JPEG n04536866/
+mv val/ILSVRC2012_val_00001629.JPEG n01817953/
+mv val/ILSVRC2012_val_00001630.JPEG n02843684/
+mv val/ILSVRC2012_val_00001631.JPEG n03776460/
+mv val/ILSVRC2012_val_00001632.JPEG n04201297/
+mv val/ILSVRC2012_val_00001633.JPEG n04204238/
+mv val/ILSVRC2012_val_00001634.JPEG n02094114/
+mv val/ILSVRC2012_val_00001635.JPEG n04238763/
+mv val/ILSVRC2012_val_00001636.JPEG n01667114/
+mv val/ILSVRC2012_val_00001637.JPEG n02116738/
+mv val/ILSVRC2012_val_00001638.JPEG n03709823/
+mv val/ILSVRC2012_val_00001639.JPEG n04153751/
+mv val/ILSVRC2012_val_00001640.JPEG n02422699/
+mv val/ILSVRC2012_val_00001641.JPEG n01796340/
+mv val/ILSVRC2012_val_00001642.JPEG n07836838/
+mv val/ILSVRC2012_val_00001643.JPEG n02027492/
+mv val/ILSVRC2012_val_00001644.JPEG n03478589/
+mv val/ILSVRC2012_val_00001645.JPEG n01689811/
+mv val/ILSVRC2012_val_00001646.JPEG n02110958/
+mv val/ILSVRC2012_val_00001647.JPEG n03538406/
+mv val/ILSVRC2012_val_00001648.JPEG n03207743/
+mv val/ILSVRC2012_val_00001649.JPEG n01669191/
+mv val/ILSVRC2012_val_00001650.JPEG n06794110/
+mv val/ILSVRC2012_val_00001651.JPEG n02087394/
+mv val/ILSVRC2012_val_00001652.JPEG n01641577/
+mv val/ILSVRC2012_val_00001653.JPEG n07873807/
+mv val/ILSVRC2012_val_00001654.JPEG n03314780/
+mv val/ILSVRC2012_val_00001655.JPEG n04591157/
+mv val/ILSVRC2012_val_00001656.JPEG n02487347/
+mv val/ILSVRC2012_val_00001657.JPEG n04277352/
+mv val/ILSVRC2012_val_00001658.JPEG n07749582/
+mv val/ILSVRC2012_val_00001659.JPEG n03792782/
+mv val/ILSVRC2012_val_00001660.JPEG n03947888/
+mv val/ILSVRC2012_val_00001661.JPEG n03792782/
+mv val/ILSVRC2012_val_00001662.JPEG n01669191/
+mv val/ILSVRC2012_val_00001663.JPEG n02102318/
+mv val/ILSVRC2012_val_00001664.JPEG n03788365/
+mv val/ILSVRC2012_val_00001665.JPEG n03899768/
+mv val/ILSVRC2012_val_00001666.JPEG n04392985/
+mv val/ILSVRC2012_val_00001667.JPEG n01629819/
+mv val/ILSVRC2012_val_00001668.JPEG n04557648/
+mv val/ILSVRC2012_val_00001669.JPEG n02640242/
+mv val/ILSVRC2012_val_00001670.JPEG n02325366/
+mv val/ILSVRC2012_val_00001671.JPEG n07749582/
+mv val/ILSVRC2012_val_00001672.JPEG n04264628/
+mv val/ILSVRC2012_val_00001673.JPEG n04487081/
+mv val/ILSVRC2012_val_00001674.JPEG n02978881/
+mv val/ILSVRC2012_val_00001675.JPEG n03720891/
+mv val/ILSVRC2012_val_00001676.JPEG n01494475/
+mv val/ILSVRC2012_val_00001677.JPEG n02951358/
+mv val/ILSVRC2012_val_00001678.JPEG n01828970/
+mv val/ILSVRC2012_val_00001679.JPEG n04286575/
+mv val/ILSVRC2012_val_00001680.JPEG n04540053/
+mv val/ILSVRC2012_val_00001681.JPEG n04332243/
+mv val/ILSVRC2012_val_00001682.JPEG n04367480/
+mv val/ILSVRC2012_val_00001683.JPEG n03840681/
+mv val/ILSVRC2012_val_00001684.JPEG n02106662/
+mv val/ILSVRC2012_val_00001685.JPEG n03376595/
+mv val/ILSVRC2012_val_00001686.JPEG n02113186/
+mv val/ILSVRC2012_val_00001687.JPEG n03085013/
+mv val/ILSVRC2012_val_00001688.JPEG n09246464/
+mv val/ILSVRC2012_val_00001689.JPEG n03127747/
+mv val/ILSVRC2012_val_00001690.JPEG n04367480/
+mv val/ILSVRC2012_val_00001691.JPEG n03290653/
+mv val/ILSVRC2012_val_00001692.JPEG n07760859/
+mv val/ILSVRC2012_val_00001693.JPEG n02102973/
+mv val/ILSVRC2012_val_00001694.JPEG n03290653/
+mv val/ILSVRC2012_val_00001695.JPEG n01751748/
+mv val/ILSVRC2012_val_00001696.JPEG n02089973/
+mv val/ILSVRC2012_val_00001697.JPEG n02086910/
+mv val/ILSVRC2012_val_00001698.JPEG n02112350/
+mv val/ILSVRC2012_val_00001699.JPEG n03272562/
+mv val/ILSVRC2012_val_00001700.JPEG n04456115/
+mv val/ILSVRC2012_val_00001701.JPEG n03785016/
+mv val/ILSVRC2012_val_00001702.JPEG n02110341/
+mv val/ILSVRC2012_val_00001703.JPEG n01728920/
+mv val/ILSVRC2012_val_00001704.JPEG n04554684/
+mv val/ILSVRC2012_val_00001705.JPEG n02417914/
+mv val/ILSVRC2012_val_00001706.JPEG n01756291/
+mv val/ILSVRC2012_val_00001707.JPEG n03590841/
+mv val/ILSVRC2012_val_00001708.JPEG n01877812/
+mv val/ILSVRC2012_val_00001709.JPEG n02113186/
+mv val/ILSVRC2012_val_00001710.JPEG n02093256/
+mv val/ILSVRC2012_val_00001711.JPEG n02099849/
+mv val/ILSVRC2012_val_00001712.JPEG n02397096/
+mv val/ILSVRC2012_val_00001713.JPEG n03642806/
+mv val/ILSVRC2012_val_00001714.JPEG n02231487/
+mv val/ILSVRC2012_val_00001715.JPEG n04179913/
+mv val/ILSVRC2012_val_00001716.JPEG n02012849/
+mv val/ILSVRC2012_val_00001717.JPEG n02279972/
+mv val/ILSVRC2012_val_00001718.JPEG n04447861/
+mv val/ILSVRC2012_val_00001719.JPEG n04355933/
+mv val/ILSVRC2012_val_00001720.JPEG n01560419/
+mv val/ILSVRC2012_val_00001721.JPEG n02445715/
+mv val/ILSVRC2012_val_00001722.JPEG n03770679/
+mv val/ILSVRC2012_val_00001723.JPEG n03929855/
+mv val/ILSVRC2012_val_00001724.JPEG n01688243/
+mv val/ILSVRC2012_val_00001725.JPEG n06596364/
+mv val/ILSVRC2012_val_00001726.JPEG n07930864/
+mv val/ILSVRC2012_val_00001727.JPEG n01945685/
+mv val/ILSVRC2012_val_00001728.JPEG n01631663/
+mv val/ILSVRC2012_val_00001729.JPEG n03216828/
+mv val/ILSVRC2012_val_00001730.JPEG n03995372/
+mv val/ILSVRC2012_val_00001731.JPEG n02782093/
+mv val/ILSVRC2012_val_00001732.JPEG n01860187/
+mv val/ILSVRC2012_val_00001733.JPEG n04443257/
+mv val/ILSVRC2012_val_00001734.JPEG n04579432/
+mv val/ILSVRC2012_val_00001735.JPEG n07745940/
+mv val/ILSVRC2012_val_00001736.JPEG n04146614/
+mv val/ILSVRC2012_val_00001737.JPEG n02177972/
+mv val/ILSVRC2012_val_00001738.JPEG n04392985/
+mv val/ILSVRC2012_val_00001739.JPEG n01644373/
+mv val/ILSVRC2012_val_00001740.JPEG n02317335/
+mv val/ILSVRC2012_val_00001741.JPEG n04553703/
+mv val/ILSVRC2012_val_00001742.JPEG n02138441/
+mv val/ILSVRC2012_val_00001743.JPEG n13040303/
+mv val/ILSVRC2012_val_00001744.JPEG n01985128/
+mv val/ILSVRC2012_val_00001745.JPEG n02134418/
+mv val/ILSVRC2012_val_00001746.JPEG n01945685/
+mv val/ILSVRC2012_val_00001747.JPEG n02526121/
+mv val/ILSVRC2012_val_00001748.JPEG n02317335/
+mv val/ILSVRC2012_val_00001749.JPEG n01820546/
+mv val/ILSVRC2012_val_00001750.JPEG n04501370/
+mv val/ILSVRC2012_val_00001751.JPEG n01560419/
+mv val/ILSVRC2012_val_00001752.JPEG n02268443/
+mv val/ILSVRC2012_val_00001753.JPEG n03796401/
+mv val/ILSVRC2012_val_00001754.JPEG n03916031/
+mv val/ILSVRC2012_val_00001755.JPEG n02992211/
+mv val/ILSVRC2012_val_00001756.JPEG n03127747/
+mv val/ILSVRC2012_val_00001757.JPEG n03180011/
+mv val/ILSVRC2012_val_00001758.JPEG n02102480/
+mv val/ILSVRC2012_val_00001759.JPEG n04277352/
+mv val/ILSVRC2012_val_00001760.JPEG n01776313/
+mv val/ILSVRC2012_val_00001761.JPEG n03017168/
+mv val/ILSVRC2012_val_00001762.JPEG n02111129/
+mv val/ILSVRC2012_val_00001763.JPEG n02190166/
+mv val/ILSVRC2012_val_00001764.JPEG n02098413/
+mv val/ILSVRC2012_val_00001765.JPEG n02090721/
+mv val/ILSVRC2012_val_00001766.JPEG n01776313/
+mv val/ILSVRC2012_val_00001767.JPEG n09421951/
+mv val/ILSVRC2012_val_00001768.JPEG n02113023/
+mv val/ILSVRC2012_val_00001769.JPEG n02672831/
+mv val/ILSVRC2012_val_00001770.JPEG n03764736/
+mv val/ILSVRC2012_val_00001771.JPEG n04146614/
+mv val/ILSVRC2012_val_00001772.JPEG n03347037/
+mv val/ILSVRC2012_val_00001773.JPEG n03868242/
+mv val/ILSVRC2012_val_00001774.JPEG n02667093/
+mv val/ILSVRC2012_val_00001775.JPEG n02093647/
+mv val/ILSVRC2012_val_00001776.JPEG n02169497/
+mv val/ILSVRC2012_val_00001777.JPEG n02089973/
+mv val/ILSVRC2012_val_00001778.JPEG n07747607/
+mv val/ILSVRC2012_val_00001779.JPEG n02085782/
+mv val/ILSVRC2012_val_00001780.JPEG n02815834/
+mv val/ILSVRC2012_val_00001781.JPEG n02105412/
+mv val/ILSVRC2012_val_00001782.JPEG n02086910/
+mv val/ILSVRC2012_val_00001783.JPEG n04204238/
+mv val/ILSVRC2012_val_00001784.JPEG n03530642/
+mv val/ILSVRC2012_val_00001785.JPEG n07583066/
+mv val/ILSVRC2012_val_00001786.JPEG n04039381/
+mv val/ILSVRC2012_val_00001787.JPEG n02965783/
+mv val/ILSVRC2012_val_00001788.JPEG n04501370/
+mv val/ILSVRC2012_val_00001789.JPEG n04086273/
+mv val/ILSVRC2012_val_00001790.JPEG n04263257/
+mv val/ILSVRC2012_val_00001791.JPEG n02443484/
+mv val/ILSVRC2012_val_00001792.JPEG n04162706/
+mv val/ILSVRC2012_val_00001793.JPEG n07613480/
+mv val/ILSVRC2012_val_00001794.JPEG n04525038/
+mv val/ILSVRC2012_val_00001795.JPEG n04266014/
+mv val/ILSVRC2012_val_00001796.JPEG n03721384/
+mv val/ILSVRC2012_val_00001797.JPEG n04467665/
+mv val/ILSVRC2012_val_00001798.JPEG n04523525/
+mv val/ILSVRC2012_val_00001799.JPEG n04162706/
+mv val/ILSVRC2012_val_00001800.JPEG n02025239/
+mv val/ILSVRC2012_val_00001801.JPEG n04146614/
+mv val/ILSVRC2012_val_00001802.JPEG n01677366/
+mv val/ILSVRC2012_val_00001803.JPEG n04179913/
+mv val/ILSVRC2012_val_00001804.JPEG n04125021/
+mv val/ILSVRC2012_val_00001805.JPEG n02917067/
+mv val/ILSVRC2012_val_00001806.JPEG n04392985/
+mv val/ILSVRC2012_val_00001807.JPEG n04550184/
+mv val/ILSVRC2012_val_00001808.JPEG n02090721/
+mv val/ILSVRC2012_val_00001809.JPEG n03796401/
+mv val/ILSVRC2012_val_00001810.JPEG n03014705/
+mv val/ILSVRC2012_val_00001811.JPEG n04344873/
+mv val/ILSVRC2012_val_00001812.JPEG n02091635/
+mv val/ILSVRC2012_val_00001813.JPEG n01608432/
+mv val/ILSVRC2012_val_00001814.JPEG n03690938/
+mv val/ILSVRC2012_val_00001815.JPEG n04141975/
+mv val/ILSVRC2012_val_00001816.JPEG n01629819/
+mv val/ILSVRC2012_val_00001817.JPEG n04523525/
+mv val/ILSVRC2012_val_00001818.JPEG n01955084/
+mv val/ILSVRC2012_val_00001819.JPEG n01756291/
+mv val/ILSVRC2012_val_00001820.JPEG n04443257/
+mv val/ILSVRC2012_val_00001821.JPEG n02927161/
+mv val/ILSVRC2012_val_00001822.JPEG n07880968/
+mv val/ILSVRC2012_val_00001823.JPEG n07836838/
+mv val/ILSVRC2012_val_00001824.JPEG n02484975/
+mv val/ILSVRC2012_val_00001825.JPEG n02091032/
+mv val/ILSVRC2012_val_00001826.JPEG n07714571/
+mv val/ILSVRC2012_val_00001827.JPEG n03535780/
+mv val/ILSVRC2012_val_00001828.JPEG n04149813/
+mv val/ILSVRC2012_val_00001829.JPEG n09468604/
+mv val/ILSVRC2012_val_00001830.JPEG n02033041/
+mv val/ILSVRC2012_val_00001831.JPEG n03584254/
+mv val/ILSVRC2012_val_00001832.JPEG n04550184/
+mv val/ILSVRC2012_val_00001833.JPEG n03887697/
+mv val/ILSVRC2012_val_00001834.JPEG n03838899/
+mv val/ILSVRC2012_val_00001835.JPEG n02174001/
+mv val/ILSVRC2012_val_00001836.JPEG n03272010/
+mv val/ILSVRC2012_val_00001837.JPEG n03297495/
+mv val/ILSVRC2012_val_00001838.JPEG n04074963/
+mv val/ILSVRC2012_val_00001839.JPEG n03649909/
+mv val/ILSVRC2012_val_00001840.JPEG n03496892/
+mv val/ILSVRC2012_val_00001841.JPEG n03467068/
+mv val/ILSVRC2012_val_00001842.JPEG n02268853/
+mv val/ILSVRC2012_val_00001843.JPEG n03400231/
+mv val/ILSVRC2012_val_00001844.JPEG n02093256/
+mv val/ILSVRC2012_val_00001845.JPEG n04367480/
+mv val/ILSVRC2012_val_00001846.JPEG n02091134/
+mv val/ILSVRC2012_val_00001847.JPEG n04118776/
+mv val/ILSVRC2012_val_00001848.JPEG n02086646/
+mv val/ILSVRC2012_val_00001849.JPEG n07753592/
+mv val/ILSVRC2012_val_00001850.JPEG n02504013/
+mv val/ILSVRC2012_val_00001851.JPEG n02104365/
+mv val/ILSVRC2012_val_00001852.JPEG n02096177/
+mv val/ILSVRC2012_val_00001853.JPEG n03961711/
+mv val/ILSVRC2012_val_00001854.JPEG n04069434/
+mv val/ILSVRC2012_val_00001855.JPEG n03376595/
+mv val/ILSVRC2012_val_00001856.JPEG n01817953/
+mv val/ILSVRC2012_val_00001857.JPEG n01955084/
+mv val/ILSVRC2012_val_00001858.JPEG n02107142/
+mv val/ILSVRC2012_val_00001859.JPEG n03344393/
+mv val/ILSVRC2012_val_00001860.JPEG n03709823/
+mv val/ILSVRC2012_val_00001861.JPEG n02974003/
+mv val/ILSVRC2012_val_00001862.JPEG n02090379/
+mv val/ILSVRC2012_val_00001863.JPEG n04332243/
+mv val/ILSVRC2012_val_00001864.JPEG n03125729/
+mv val/ILSVRC2012_val_00001865.JPEG n03935335/
+mv val/ILSVRC2012_val_00001866.JPEG n02814860/
+mv val/ILSVRC2012_val_00001867.JPEG n01860187/
+mv val/ILSVRC2012_val_00001868.JPEG n03220513/
+mv val/ILSVRC2012_val_00001869.JPEG n02094114/
+mv val/ILSVRC2012_val_00001870.JPEG n03877472/
+mv val/ILSVRC2012_val_00001871.JPEG n02009912/
+mv val/ILSVRC2012_val_00001872.JPEG n02108000/
+mv val/ILSVRC2012_val_00001873.JPEG n02229544/
+mv val/ILSVRC2012_val_00001874.JPEG n03697007/
+mv val/ILSVRC2012_val_00001875.JPEG n03124170/
+mv val/ILSVRC2012_val_00001876.JPEG n02206856/
+mv val/ILSVRC2012_val_00001877.JPEG n03841143/
+mv val/ILSVRC2012_val_00001878.JPEG n04153751/
+mv val/ILSVRC2012_val_00001879.JPEG n01742172/
+mv val/ILSVRC2012_val_00001880.JPEG n13133613/
+mv val/ILSVRC2012_val_00001881.JPEG n04525305/
+mv val/ILSVRC2012_val_00001882.JPEG n01930112/
+mv val/ILSVRC2012_val_00001883.JPEG n02795169/
+mv val/ILSVRC2012_val_00001884.JPEG n02233338/
+mv val/ILSVRC2012_val_00001885.JPEG n02417914/
+mv val/ILSVRC2012_val_00001886.JPEG n03935335/
+mv val/ILSVRC2012_val_00001887.JPEG n01770393/
+mv val/ILSVRC2012_val_00001888.JPEG n02125311/
+mv val/ILSVRC2012_val_00001889.JPEG n03482405/
+mv val/ILSVRC2012_val_00001890.JPEG n04604644/
+mv val/ILSVRC2012_val_00001891.JPEG n02009912/
+mv val/ILSVRC2012_val_00001892.JPEG n03791053/
+mv val/ILSVRC2012_val_00001893.JPEG n03223299/
+mv val/ILSVRC2012_val_00001894.JPEG n03032252/
+mv val/ILSVRC2012_val_00001895.JPEG n04501370/
+mv val/ILSVRC2012_val_00001896.JPEG n03372029/
+mv val/ILSVRC2012_val_00001897.JPEG n03485794/
+mv val/ILSVRC2012_val_00001898.JPEG n02110341/
+mv val/ILSVRC2012_val_00001899.JPEG n04200800/
+mv val/ILSVRC2012_val_00001900.JPEG n02106166/
+mv val/ILSVRC2012_val_00001901.JPEG n04592741/
+mv val/ILSVRC2012_val_00001902.JPEG n02950826/
+mv val/ILSVRC2012_val_00001903.JPEG n04041544/
+mv val/ILSVRC2012_val_00001904.JPEG n07831146/
+mv val/ILSVRC2012_val_00001905.JPEG n04116512/
+mv val/ILSVRC2012_val_00001906.JPEG n01514859/
+mv val/ILSVRC2012_val_00001907.JPEG n03868242/
+mv val/ILSVRC2012_val_00001908.JPEG n03026506/
+mv val/ILSVRC2012_val_00001909.JPEG n02443484/
+mv val/ILSVRC2012_val_00001910.JPEG n02701002/
+mv val/ILSVRC2012_val_00001911.JPEG n04116512/
+mv val/ILSVRC2012_val_00001912.JPEG n02815834/
+mv val/ILSVRC2012_val_00001913.JPEG n03929855/
+mv val/ILSVRC2012_val_00001914.JPEG n03676483/
+mv val/ILSVRC2012_val_00001915.JPEG n01534433/
+mv val/ILSVRC2012_val_00001916.JPEG n02701002/
+mv val/ILSVRC2012_val_00001917.JPEG n02113978/
+mv val/ILSVRC2012_val_00001918.JPEG n04371430/
+mv val/ILSVRC2012_val_00001919.JPEG n03991062/
+mv val/ILSVRC2012_val_00001920.JPEG n07718472/
+mv val/ILSVRC2012_val_00001921.JPEG n02268853/
+mv val/ILSVRC2012_val_00001922.JPEG n04264628/
+mv val/ILSVRC2012_val_00001923.JPEG n02098105/
+mv val/ILSVRC2012_val_00001924.JPEG n07565083/
+mv val/ILSVRC2012_val_00001925.JPEG n02112706/
+mv val/ILSVRC2012_val_00001926.JPEG n02094114/
+mv val/ILSVRC2012_val_00001927.JPEG n02093991/
+mv val/ILSVRC2012_val_00001928.JPEG n02488291/
+mv val/ILSVRC2012_val_00001929.JPEG n02093859/
+mv val/ILSVRC2012_val_00001930.JPEG n03047690/
+mv val/ILSVRC2012_val_00001931.JPEG n01682714/
+mv val/ILSVRC2012_val_00001932.JPEG n07717410/
+mv val/ILSVRC2012_val_00001933.JPEG n01883070/
+mv val/ILSVRC2012_val_00001934.JPEG n04562935/
+mv val/ILSVRC2012_val_00001935.JPEG n01498041/
+mv val/ILSVRC2012_val_00001936.JPEG n07745940/
+mv val/ILSVRC2012_val_00001937.JPEG n02109525/
+mv val/ILSVRC2012_val_00001938.JPEG n01644900/
+mv val/ILSVRC2012_val_00001939.JPEG n01694178/
+mv val/ILSVRC2012_val_00001940.JPEG n03063689/
+mv val/ILSVRC2012_val_00001941.JPEG n02894605/
+mv val/ILSVRC2012_val_00001942.JPEG n01682714/
+mv val/ILSVRC2012_val_00001943.JPEG n03544143/
+mv val/ILSVRC2012_val_00001944.JPEG n02101556/
+mv val/ILSVRC2012_val_00001945.JPEG n02966687/
+mv val/ILSVRC2012_val_00001946.JPEG n03485407/
+mv val/ILSVRC2012_val_00001947.JPEG n03657121/
+mv val/ILSVRC2012_val_00001948.JPEG n02236044/
+mv val/ILSVRC2012_val_00001949.JPEG n07860988/
+mv val/ILSVRC2012_val_00001950.JPEG n01677366/
+mv val/ILSVRC2012_val_00001951.JPEG n07718747/
+mv val/ILSVRC2012_val_00001952.JPEG n02690373/
+mv val/ILSVRC2012_val_00001953.JPEG n04099969/
+mv val/ILSVRC2012_val_00001954.JPEG n03814639/
+mv val/ILSVRC2012_val_00001955.JPEG n02098413/
+mv val/ILSVRC2012_val_00001956.JPEG n01985128/
+mv val/ILSVRC2012_val_00001957.JPEG n02093647/
+mv val/ILSVRC2012_val_00001958.JPEG n02504458/
+mv val/ILSVRC2012_val_00001959.JPEG n01944390/
+mv val/ILSVRC2012_val_00001960.JPEG n03445924/
+mv val/ILSVRC2012_val_00001961.JPEG n03866082/
+mv val/ILSVRC2012_val_00001962.JPEG n03355925/
+mv val/ILSVRC2012_val_00001963.JPEG n02105855/
+mv val/ILSVRC2012_val_00001964.JPEG n03041632/
+mv val/ILSVRC2012_val_00001965.JPEG n03791053/
+mv val/ILSVRC2012_val_00001966.JPEG n03954731/
+mv val/ILSVRC2012_val_00001967.JPEG n07695742/
+mv val/ILSVRC2012_val_00001968.JPEG n02102040/
+mv val/ILSVRC2012_val_00001969.JPEG n03956157/
+mv val/ILSVRC2012_val_00001970.JPEG n03983396/
+mv val/ILSVRC2012_val_00001971.JPEG n02105855/
+mv val/ILSVRC2012_val_00001972.JPEG n03249569/
+mv val/ILSVRC2012_val_00001973.JPEG n03976467/
+mv val/ILSVRC2012_val_00001974.JPEG n03843555/
+mv val/ILSVRC2012_val_00001975.JPEG n02641379/
+mv val/ILSVRC2012_val_00001976.JPEG n03272562/
+mv val/ILSVRC2012_val_00001977.JPEG n03658185/
+mv val/ILSVRC2012_val_00001978.JPEG n03976467/
+mv val/ILSVRC2012_val_00001979.JPEG n02398521/
+mv val/ILSVRC2012_val_00001980.JPEG n03791053/
+mv val/ILSVRC2012_val_00001981.JPEG n03065424/
+mv val/ILSVRC2012_val_00001982.JPEG n03759954/
+mv val/ILSVRC2012_val_00001983.JPEG n03216828/
+mv val/ILSVRC2012_val_00001984.JPEG n03796401/
+mv val/ILSVRC2012_val_00001985.JPEG n01980166/
+mv val/ILSVRC2012_val_00001986.JPEG n09193705/
+mv val/ILSVRC2012_val_00001987.JPEG n01773797/
+mv val/ILSVRC2012_val_00001988.JPEG n02129604/
+mv val/ILSVRC2012_val_00001989.JPEG n04009552/
+mv val/ILSVRC2012_val_00001990.JPEG n02980441/
+mv val/ILSVRC2012_val_00001991.JPEG n03188531/
+mv val/ILSVRC2012_val_00001992.JPEG n02100735/
+mv val/ILSVRC2012_val_00001993.JPEG n07860988/
+mv val/ILSVRC2012_val_00001994.JPEG n03929855/
+mv val/ILSVRC2012_val_00001995.JPEG n04037443/
+mv val/ILSVRC2012_val_00001996.JPEG n03467068/
+mv val/ILSVRC2012_val_00001997.JPEG n02094114/
+mv val/ILSVRC2012_val_00001998.JPEG n03899768/
+mv val/ILSVRC2012_val_00001999.JPEG n04525038/
+mv val/ILSVRC2012_val_00002000.JPEG n02074367/
+mv val/ILSVRC2012_val_00002001.JPEG n04033901/
+mv val/ILSVRC2012_val_00002002.JPEG n02012849/
+mv val/ILSVRC2012_val_00002003.JPEG n02009229/
+mv val/ILSVRC2012_val_00002004.JPEG n02109961/
+mv val/ILSVRC2012_val_00002005.JPEG n03804744/
+mv val/ILSVRC2012_val_00002006.JPEG n02396427/
+mv val/ILSVRC2012_val_00002007.JPEG n02233338/
+mv val/ILSVRC2012_val_00002008.JPEG n03240683/
+mv val/ILSVRC2012_val_00002009.JPEG n03393912/
+mv val/ILSVRC2012_val_00002010.JPEG n03777568/
+mv val/ILSVRC2012_val_00002011.JPEG n02494079/
+mv val/ILSVRC2012_val_00002012.JPEG n02106662/
+mv val/ILSVRC2012_val_00002013.JPEG n04033995/
+mv val/ILSVRC2012_val_00002014.JPEG n02231487/
+mv val/ILSVRC2012_val_00002015.JPEG n04355338/
+mv val/ILSVRC2012_val_00002016.JPEG n04550184/
+mv val/ILSVRC2012_val_00002017.JPEG n02699494/
+mv val/ILSVRC2012_val_00002018.JPEG n04118538/
+mv val/ILSVRC2012_val_00002019.JPEG n03388043/
+mv val/ILSVRC2012_val_00002020.JPEG n02869837/
+mv val/ILSVRC2012_val_00002021.JPEG n02097047/
+mv val/ILSVRC2012_val_00002022.JPEG n03063689/
+mv val/ILSVRC2012_val_00002023.JPEG n01530575/
+mv val/ILSVRC2012_val_00002024.JPEG n02091032/
+mv val/ILSVRC2012_val_00002025.JPEG n03042490/
+mv val/ILSVRC2012_val_00002026.JPEG n03930313/
+mv val/ILSVRC2012_val_00002027.JPEG n02264363/
+mv val/ILSVRC2012_val_00002028.JPEG n02442845/
+mv val/ILSVRC2012_val_00002029.JPEG n02325366/
+mv val/ILSVRC2012_val_00002030.JPEG n01883070/
+mv val/ILSVRC2012_val_00002031.JPEG n01614925/
+mv val/ILSVRC2012_val_00002032.JPEG n03447721/
+mv val/ILSVRC2012_val_00002033.JPEG n03444034/
+mv val/ILSVRC2012_val_00002034.JPEG n02979186/
+mv val/ILSVRC2012_val_00002035.JPEG n02815834/
+mv val/ILSVRC2012_val_00002036.JPEG n02123394/
+mv val/ILSVRC2012_val_00002037.JPEG n03250847/
+mv val/ILSVRC2012_val_00002038.JPEG n02883205/
+mv val/ILSVRC2012_val_00002039.JPEG n04554684/
+mv val/ILSVRC2012_val_00002040.JPEG n03047690/
+mv val/ILSVRC2012_val_00002041.JPEG n01773157/
+mv val/ILSVRC2012_val_00002042.JPEG n02172182/
+mv val/ILSVRC2012_val_00002043.JPEG n03249569/
+mv val/ILSVRC2012_val_00002044.JPEG n04613696/
+mv val/ILSVRC2012_val_00002045.JPEG n03692522/
+mv val/ILSVRC2012_val_00002046.JPEG n04044716/
+mv val/ILSVRC2012_val_00002047.JPEG n12985857/
+mv val/ILSVRC2012_val_00002048.JPEG n02342885/
+mv val/ILSVRC2012_val_00002049.JPEG n03425413/
+mv val/ILSVRC2012_val_00002050.JPEG n02895154/
+mv val/ILSVRC2012_val_00002051.JPEG n01704323/
+mv val/ILSVRC2012_val_00002052.JPEG n01560419/
+mv val/ILSVRC2012_val_00002053.JPEG n02974003/
+mv val/ILSVRC2012_val_00002054.JPEG n07695742/
+mv val/ILSVRC2012_val_00002055.JPEG n03016953/
+mv val/ILSVRC2012_val_00002056.JPEG n03729826/
+mv val/ILSVRC2012_val_00002057.JPEG n03250847/
+mv val/ILSVRC2012_val_00002058.JPEG n02927161/
+mv val/ILSVRC2012_val_00002059.JPEG n02091635/
+mv val/ILSVRC2012_val_00002060.JPEG n01990800/
+mv val/ILSVRC2012_val_00002061.JPEG n02980441/
+mv val/ILSVRC2012_val_00002062.JPEG n02676566/
+mv val/ILSVRC2012_val_00002063.JPEG n02114548/
+mv val/ILSVRC2012_val_00002064.JPEG n02422699/
+mv val/ILSVRC2012_val_00002065.JPEG n04208210/
+mv val/ILSVRC2012_val_00002066.JPEG n02109961/
+mv val/ILSVRC2012_val_00002067.JPEG n04332243/
+mv val/ILSVRC2012_val_00002068.JPEG n04127249/
+mv val/ILSVRC2012_val_00002069.JPEG n03871628/
+mv val/ILSVRC2012_val_00002070.JPEG n02391049/
+mv val/ILSVRC2012_val_00002071.JPEG n01537544/
+mv val/ILSVRC2012_val_00002072.JPEG n02124075/
+mv val/ILSVRC2012_val_00002073.JPEG n02422106/
+mv val/ILSVRC2012_val_00002074.JPEG n01775062/
+mv val/ILSVRC2012_val_00002075.JPEG n03188531/
+mv val/ILSVRC2012_val_00002076.JPEG n02443114/
+mv val/ILSVRC2012_val_00002077.JPEG n01694178/
+mv val/ILSVRC2012_val_00002078.JPEG n03063689/
+mv val/ILSVRC2012_val_00002079.JPEG n02088364/
+mv val/ILSVRC2012_val_00002080.JPEG n04476259/
+mv val/ILSVRC2012_val_00002081.JPEG n04442312/
+mv val/ILSVRC2012_val_00002082.JPEG n03792972/
+mv val/ILSVRC2012_val_00002083.JPEG n07831146/
+mv val/ILSVRC2012_val_00002084.JPEG n02483708/
+mv val/ILSVRC2012_val_00002085.JPEG n04346328/
+mv val/ILSVRC2012_val_00002086.JPEG n04591713/
+mv val/ILSVRC2012_val_00002087.JPEG n03794056/
+mv val/ILSVRC2012_val_00002088.JPEG n04153751/
+mv val/ILSVRC2012_val_00002089.JPEG n03782006/
+mv val/ILSVRC2012_val_00002090.JPEG n02058221/
+mv val/ILSVRC2012_val_00002091.JPEG n04162706/
+mv val/ILSVRC2012_val_00002092.JPEG n04522168/
+mv val/ILSVRC2012_val_00002093.JPEG n03673027/
+mv val/ILSVRC2012_val_00002094.JPEG n04483307/
+mv val/ILSVRC2012_val_00002095.JPEG n03691459/
+mv val/ILSVRC2012_val_00002096.JPEG n03478589/
+mv val/ILSVRC2012_val_00002097.JPEG n02102318/
+mv val/ILSVRC2012_val_00002098.JPEG n07749582/
+mv val/ILSVRC2012_val_00002099.JPEG n07730033/
+mv val/ILSVRC2012_val_00002100.JPEG n01829413/
+mv val/ILSVRC2012_val_00002101.JPEG n01729977/
+mv val/ILSVRC2012_val_00002102.JPEG n04501370/
+mv val/ILSVRC2012_val_00002103.JPEG n09472597/
+mv val/ILSVRC2012_val_00002104.JPEG n03781244/
+mv val/ILSVRC2012_val_00002105.JPEG n02134084/
+mv val/ILSVRC2012_val_00002106.JPEG n01742172/
+mv val/ILSVRC2012_val_00002107.JPEG n03782006/
+mv val/ILSVRC2012_val_00002108.JPEG n04553703/
+mv val/ILSVRC2012_val_00002109.JPEG n09835506/
+mv val/ILSVRC2012_val_00002110.JPEG n03804744/
+mv val/ILSVRC2012_val_00002111.JPEG n02088238/
+mv val/ILSVRC2012_val_00002112.JPEG n04067472/
+mv val/ILSVRC2012_val_00002113.JPEG n03764736/
+mv val/ILSVRC2012_val_00002114.JPEG n02992529/
+mv val/ILSVRC2012_val_00002115.JPEG n03874599/
+mv val/ILSVRC2012_val_00002116.JPEG n03124043/
+mv val/ILSVRC2012_val_00002117.JPEG n04065272/
+mv val/ILSVRC2012_val_00002118.JPEG n02782093/
+mv val/ILSVRC2012_val_00002119.JPEG n03788195/
+mv val/ILSVRC2012_val_00002120.JPEG n04389033/
+mv val/ILSVRC2012_val_00002121.JPEG n03673027/
+mv val/ILSVRC2012_val_00002122.JPEG n04389033/
+mv val/ILSVRC2012_val_00002123.JPEG n03775071/
+mv val/ILSVRC2012_val_00002124.JPEG n07753113/
+mv val/ILSVRC2012_val_00002125.JPEG n12144580/
+mv val/ILSVRC2012_val_00002126.JPEG n02013706/
+mv val/ILSVRC2012_val_00002127.JPEG n02190166/
+mv val/ILSVRC2012_val_00002128.JPEG n04275548/
+mv val/ILSVRC2012_val_00002129.JPEG n03250847/
+mv val/ILSVRC2012_val_00002130.JPEG n03947888/
+mv val/ILSVRC2012_val_00002131.JPEG n01729977/
+mv val/ILSVRC2012_val_00002132.JPEG n02138441/
+mv val/ILSVRC2012_val_00002133.JPEG n04264628/
+mv val/ILSVRC2012_val_00002134.JPEG n03967562/
+mv val/ILSVRC2012_val_00002135.JPEG n03445924/
+mv val/ILSVRC2012_val_00002136.JPEG n04355338/
+mv val/ILSVRC2012_val_00002137.JPEG n02640242/
+mv val/ILSVRC2012_val_00002138.JPEG n01440764/
+mv val/ILSVRC2012_val_00002139.JPEG n12267677/
+mv val/ILSVRC2012_val_00002140.JPEG n02489166/
+mv val/ILSVRC2012_val_00002141.JPEG n02165105/
+mv val/ILSVRC2012_val_00002142.JPEG n03599486/
+mv val/ILSVRC2012_val_00002143.JPEG n03272010/
+mv val/ILSVRC2012_val_00002144.JPEG n02018207/
+mv val/ILSVRC2012_val_00002145.JPEG n02747177/
+mv val/ILSVRC2012_val_00002146.JPEG n04487081/
+mv val/ILSVRC2012_val_00002147.JPEG n02119789/
+mv val/ILSVRC2012_val_00002148.JPEG n02666196/
+mv val/ILSVRC2012_val_00002149.JPEG n02606052/
+mv val/ILSVRC2012_val_00002150.JPEG n02086646/
+mv val/ILSVRC2012_val_00002151.JPEG n04040759/
+mv val/ILSVRC2012_val_00002152.JPEG n01984695/
+mv val/ILSVRC2012_val_00002153.JPEG n12998815/
+mv val/ILSVRC2012_val_00002154.JPEG n01751748/
+mv val/ILSVRC2012_val_00002155.JPEG n04584207/
+mv val/ILSVRC2012_val_00002156.JPEG n04149813/
+mv val/ILSVRC2012_val_00002157.JPEG n01981276/
+mv val/ILSVRC2012_val_00002158.JPEG n02841315/
+mv val/ILSVRC2012_val_00002159.JPEG n03777754/
+mv val/ILSVRC2012_val_00002160.JPEG n04376876/
+mv val/ILSVRC2012_val_00002161.JPEG n02859443/
+mv val/ILSVRC2012_val_00002162.JPEG n04389033/
+mv val/ILSVRC2012_val_00002163.JPEG n01665541/
+mv val/ILSVRC2012_val_00002164.JPEG n04208210/
+mv val/ILSVRC2012_val_00002165.JPEG n04041544/
+mv val/ILSVRC2012_val_00002166.JPEG n02071294/
+mv val/ILSVRC2012_val_00002167.JPEG n13052670/
+mv val/ILSVRC2012_val_00002168.JPEG n01616318/
+mv val/ILSVRC2012_val_00002169.JPEG n03871628/
+mv val/ILSVRC2012_val_00002170.JPEG n02028035/
+mv val/ILSVRC2012_val_00002171.JPEG n03110669/
+mv val/ILSVRC2012_val_00002172.JPEG n01819313/
+mv val/ILSVRC2012_val_00002173.JPEG n04229816/
+mv val/ILSVRC2012_val_00002174.JPEG n02769748/
+mv val/ILSVRC2012_val_00002175.JPEG n03832673/
+mv val/ILSVRC2012_val_00002176.JPEG n02095889/
+mv val/ILSVRC2012_val_00002177.JPEG n01806143/
+mv val/ILSVRC2012_val_00002178.JPEG n02708093/
+mv val/ILSVRC2012_val_00002179.JPEG n07753113/
+mv val/ILSVRC2012_val_00002180.JPEG n02804610/
+mv val/ILSVRC2012_val_00002181.JPEG n02879718/
+mv val/ILSVRC2012_val_00002182.JPEG n03595614/
+mv val/ILSVRC2012_val_00002183.JPEG n02769748/
+mv val/ILSVRC2012_val_00002184.JPEG n07802026/
+mv val/ILSVRC2012_val_00002185.JPEG n04357314/
+mv val/ILSVRC2012_val_00002186.JPEG n09288635/
+mv val/ILSVRC2012_val_00002187.JPEG n07753592/
+mv val/ILSVRC2012_val_00002188.JPEG n04525038/
+mv val/ILSVRC2012_val_00002189.JPEG n04590129/
+mv val/ILSVRC2012_val_00002190.JPEG n01981276/
+mv val/ILSVRC2012_val_00002191.JPEG n01530575/
+mv val/ILSVRC2012_val_00002192.JPEG n02006656/
+mv val/ILSVRC2012_val_00002193.JPEG n03903868/
+mv val/ILSVRC2012_val_00002194.JPEG n02095570/
+mv val/ILSVRC2012_val_00002195.JPEG n03602883/
+mv val/ILSVRC2012_val_00002196.JPEG n03476991/
+mv val/ILSVRC2012_val_00002197.JPEG n04328186/
+mv val/ILSVRC2012_val_00002198.JPEG n03617480/
+mv val/ILSVRC2012_val_00002199.JPEG n03272562/
+mv val/ILSVRC2012_val_00002200.JPEG n02328150/
+mv val/ILSVRC2012_val_00002201.JPEG n04536866/
+mv val/ILSVRC2012_val_00002202.JPEG n02814860/
+mv val/ILSVRC2012_val_00002203.JPEG n03710193/
+mv val/ILSVRC2012_val_00002204.JPEG n04263257/
+mv val/ILSVRC2012_val_00002205.JPEG n02699494/
+mv val/ILSVRC2012_val_00002206.JPEG n04418357/
+mv val/ILSVRC2012_val_00002207.JPEG n01496331/
+mv val/ILSVRC2012_val_00002208.JPEG n02086079/
+mv val/ILSVRC2012_val_00002209.JPEG n03495258/
+mv val/ILSVRC2012_val_00002210.JPEG n03417042/
+mv val/ILSVRC2012_val_00002211.JPEG n03065424/
+mv val/ILSVRC2012_val_00002212.JPEG n03041632/
+mv val/ILSVRC2012_val_00002213.JPEG n04467665/
+mv val/ILSVRC2012_val_00002214.JPEG n02085936/
+mv val/ILSVRC2012_val_00002215.JPEG n03956157/
+mv val/ILSVRC2012_val_00002216.JPEG n02110341/
+mv val/ILSVRC2012_val_00002217.JPEG n07760859/
+mv val/ILSVRC2012_val_00002218.JPEG n03467068/
+mv val/ILSVRC2012_val_00002219.JPEG n02825657/
+mv val/ILSVRC2012_val_00002220.JPEG n02669723/
+mv val/ILSVRC2012_val_00002221.JPEG n07579787/
+mv val/ILSVRC2012_val_00002222.JPEG n02097658/
+mv val/ILSVRC2012_val_00002223.JPEG n03717622/
+mv val/ILSVRC2012_val_00002224.JPEG n03590841/
+mv val/ILSVRC2012_val_00002225.JPEG n02268443/
+mv val/ILSVRC2012_val_00002226.JPEG n07697313/
+mv val/ILSVRC2012_val_00002227.JPEG n02859443/
+mv val/ILSVRC2012_val_00002228.JPEG n01622779/
+mv val/ILSVRC2012_val_00002229.JPEG n02999410/
+mv val/ILSVRC2012_val_00002230.JPEG n01877812/
+mv val/ILSVRC2012_val_00002231.JPEG n01744401/
+mv val/ILSVRC2012_val_00002232.JPEG n01669191/
+mv val/ILSVRC2012_val_00002233.JPEG n04507155/
+mv val/ILSVRC2012_val_00002234.JPEG n02108000/
+mv val/ILSVRC2012_val_00002235.JPEG n10148035/
+mv val/ILSVRC2012_val_00002236.JPEG n04009552/
+mv val/ILSVRC2012_val_00002237.JPEG n09421951/
+mv val/ILSVRC2012_val_00002238.JPEG n03457902/
+mv val/ILSVRC2012_val_00002239.JPEG n02091032/
+mv val/ILSVRC2012_val_00002240.JPEG n03759954/
+mv val/ILSVRC2012_val_00002241.JPEG n01443537/
+mv val/ILSVRC2012_val_00002242.JPEG n02011460/
+mv val/ILSVRC2012_val_00002243.JPEG n01984695/
+mv val/ILSVRC2012_val_00002244.JPEG n02791270/
+mv val/ILSVRC2012_val_00002245.JPEG n03617480/
+mv val/ILSVRC2012_val_00002246.JPEG n02089973/
+mv val/ILSVRC2012_val_00002247.JPEG n02105641/
+mv val/ILSVRC2012_val_00002248.JPEG n03595614/
+mv val/ILSVRC2012_val_00002249.JPEG n03207941/
+mv val/ILSVRC2012_val_00002250.JPEG n03146219/
+mv val/ILSVRC2012_val_00002251.JPEG n04367480/
+mv val/ILSVRC2012_val_00002252.JPEG n07695742/
+mv val/ILSVRC2012_val_00002253.JPEG n03376595/
+mv val/ILSVRC2012_val_00002254.JPEG n09835506/
+mv val/ILSVRC2012_val_00002255.JPEG n02342885/
+mv val/ILSVRC2012_val_00002256.JPEG n03393912/
+mv val/ILSVRC2012_val_00002257.JPEG n04311004/
+mv val/ILSVRC2012_val_00002258.JPEG n04589890/
+mv val/ILSVRC2012_val_00002259.JPEG n02114367/
+mv val/ILSVRC2012_val_00002260.JPEG n02104029/
+mv val/ILSVRC2012_val_00002261.JPEG n01945685/
+mv val/ILSVRC2012_val_00002262.JPEG n02094114/
+mv val/ILSVRC2012_val_00002263.JPEG n01824575/
+mv val/ILSVRC2012_val_00002264.JPEG n04380533/
+mv val/ILSVRC2012_val_00002265.JPEG n02025239/
+mv val/ILSVRC2012_val_00002266.JPEG n03218198/
+mv val/ILSVRC2012_val_00002267.JPEG n02110627/
+mv val/ILSVRC2012_val_00002268.JPEG n04026417/
+mv val/ILSVRC2012_val_00002269.JPEG n02749479/
+mv val/ILSVRC2012_val_00002270.JPEG n07613480/
+mv val/ILSVRC2012_val_00002271.JPEG n02437312/
+mv val/ILSVRC2012_val_00002272.JPEG n03347037/
+mv val/ILSVRC2012_val_00002273.JPEG n02403003/
+mv val/ILSVRC2012_val_00002274.JPEG n03942813/
+mv val/ILSVRC2012_val_00002275.JPEG n03450230/
+mv val/ILSVRC2012_val_00002276.JPEG n04252225/
+mv val/ILSVRC2012_val_00002277.JPEG n02108000/
+mv val/ILSVRC2012_val_00002278.JPEG n03837869/
+mv val/ILSVRC2012_val_00002279.JPEG n02165105/
+mv val/ILSVRC2012_val_00002280.JPEG n03000247/
+mv val/ILSVRC2012_val_00002281.JPEG n04344873/
+mv val/ILSVRC2012_val_00002282.JPEG n02504458/
+mv val/ILSVRC2012_val_00002283.JPEG n02110185/
+mv val/ILSVRC2012_val_00002284.JPEG n01498041/
+mv val/ILSVRC2012_val_00002285.JPEG n04270147/
+mv val/ILSVRC2012_val_00002286.JPEG n04239074/
+mv val/ILSVRC2012_val_00002287.JPEG n03924679/
+mv val/ILSVRC2012_val_00002288.JPEG n02086646/
+mv val/ILSVRC2012_val_00002289.JPEG n09835506/
+mv val/ILSVRC2012_val_00002290.JPEG n03424325/
+mv val/ILSVRC2012_val_00002291.JPEG n04370456/
+mv val/ILSVRC2012_val_00002292.JPEG n03777754/
+mv val/ILSVRC2012_val_00002293.JPEG n03529860/
+mv val/ILSVRC2012_val_00002294.JPEG n02102040/
+mv val/ILSVRC2012_val_00002295.JPEG n01688243/
+mv val/ILSVRC2012_val_00002296.JPEG n02110627/
+mv val/ILSVRC2012_val_00002297.JPEG n02100735/
+mv val/ILSVRC2012_val_00002298.JPEG n02102177/
+mv val/ILSVRC2012_val_00002299.JPEG n04086273/
+mv val/ILSVRC2012_val_00002300.JPEG n01883070/
+mv val/ILSVRC2012_val_00002301.JPEG n04366367/
+mv val/ILSVRC2012_val_00002302.JPEG n02107574/
+mv val/ILSVRC2012_val_00002303.JPEG n02102480/
+mv val/ILSVRC2012_val_00002304.JPEG n04008634/
+mv val/ILSVRC2012_val_00002305.JPEG n02169497/
+mv val/ILSVRC2012_val_00002306.JPEG n04141327/
+mv val/ILSVRC2012_val_00002307.JPEG n02442845/
+mv val/ILSVRC2012_val_00002308.JPEG n03662601/
+mv val/ILSVRC2012_val_00002309.JPEG n01855032/
+mv val/ILSVRC2012_val_00002310.JPEG n04589890/
+mv val/ILSVRC2012_val_00002311.JPEG n02018795/
+mv val/ILSVRC2012_val_00002312.JPEG n03271574/
+mv val/ILSVRC2012_val_00002313.JPEG n02097298/
+mv val/ILSVRC2012_val_00002314.JPEG n03445777/
+mv val/ILSVRC2012_val_00002315.JPEG n02102040/
+mv val/ILSVRC2012_val_00002316.JPEG n03617480/
+mv val/ILSVRC2012_val_00002317.JPEG n02108422/
+mv val/ILSVRC2012_val_00002318.JPEG n02097474/
+mv val/ILSVRC2012_val_00002319.JPEG n02109525/
+mv val/ILSVRC2012_val_00002320.JPEG n02097474/
+mv val/ILSVRC2012_val_00002321.JPEG n11879895/
+mv val/ILSVRC2012_val_00002322.JPEG n03223299/
+mv val/ILSVRC2012_val_00002323.JPEG n02100583/
+mv val/ILSVRC2012_val_00002324.JPEG n03840681/
+mv val/ILSVRC2012_val_00002325.JPEG n02091032/
+mv val/ILSVRC2012_val_00002326.JPEG n01843065/
+mv val/ILSVRC2012_val_00002327.JPEG n03769881/
+mv val/ILSVRC2012_val_00002328.JPEG n02091467/
+mv val/ILSVRC2012_val_00002329.JPEG n02134418/
+mv val/ILSVRC2012_val_00002330.JPEG n02109047/
+mv val/ILSVRC2012_val_00002331.JPEG n04456115/
+mv val/ILSVRC2012_val_00002332.JPEG n03866082/
+mv val/ILSVRC2012_val_00002333.JPEG n04239074/
+mv val/ILSVRC2012_val_00002334.JPEG n02484975/
+mv val/ILSVRC2012_val_00002335.JPEG n04259630/
+mv val/ILSVRC2012_val_00002336.JPEG n07760859/
+mv val/ILSVRC2012_val_00002337.JPEG n09246464/
+mv val/ILSVRC2012_val_00002338.JPEG n01484850/
+mv val/ILSVRC2012_val_00002339.JPEG n02443114/
+mv val/ILSVRC2012_val_00002340.JPEG n04251144/
+mv val/ILSVRC2012_val_00002341.JPEG n03843555/
+mv val/ILSVRC2012_val_00002342.JPEG n04131690/
+mv val/ILSVRC2012_val_00002343.JPEG n07716906/
+mv val/ILSVRC2012_val_00002344.JPEG n03584254/
+mv val/ILSVRC2012_val_00002345.JPEG n04033901/
+mv val/ILSVRC2012_val_00002346.JPEG n04146614/
+mv val/ILSVRC2012_val_00002347.JPEG n03633091/
+mv val/ILSVRC2012_val_00002348.JPEG n13037406/
+mv val/ILSVRC2012_val_00002349.JPEG n04254680/
+mv val/ILSVRC2012_val_00002350.JPEG n07583066/
+mv val/ILSVRC2012_val_00002351.JPEG n03483316/
+mv val/ILSVRC2012_val_00002352.JPEG n02056570/
+mv val/ILSVRC2012_val_00002353.JPEG n02102177/
+mv val/ILSVRC2012_val_00002354.JPEG n04355338/
+mv val/ILSVRC2012_val_00002355.JPEG n01669191/
+mv val/ILSVRC2012_val_00002356.JPEG n04039381/
+mv val/ILSVRC2012_val_00002357.JPEG n01532829/
+mv val/ILSVRC2012_val_00002358.JPEG n02978881/
+mv val/ILSVRC2012_val_00002359.JPEG n03691459/
+mv val/ILSVRC2012_val_00002360.JPEG n04118776/
+mv val/ILSVRC2012_val_00002361.JPEG n02672831/
+mv val/ILSVRC2012_val_00002362.JPEG n06785654/
+mv val/ILSVRC2012_val_00002363.JPEG n07749582/
+mv val/ILSVRC2012_val_00002364.JPEG n02536864/
+mv val/ILSVRC2012_val_00002365.JPEG n02116738/
+mv val/ILSVRC2012_val_00002366.JPEG n04239074/
+mv val/ILSVRC2012_val_00002367.JPEG n02483708/
+mv val/ILSVRC2012_val_00002368.JPEG n03124170/
+mv val/ILSVRC2012_val_00002369.JPEG n07930864/
+mv val/ILSVRC2012_val_00002370.JPEG n02018207/
+mv val/ILSVRC2012_val_00002371.JPEG n04074963/
+mv val/ILSVRC2012_val_00002372.JPEG n01514859/
+mv val/ILSVRC2012_val_00002373.JPEG n02089867/
+mv val/ILSVRC2012_val_00002374.JPEG n03804744/
+mv val/ILSVRC2012_val_00002375.JPEG n04116512/
+mv val/ILSVRC2012_val_00002376.JPEG n02802426/
+mv val/ILSVRC2012_val_00002377.JPEG n03627232/
+mv val/ILSVRC2012_val_00002378.JPEG n03787032/
+mv val/ILSVRC2012_val_00002379.JPEG n02281406/
+mv val/ILSVRC2012_val_00002380.JPEG n07613480/
+mv val/ILSVRC2012_val_00002381.JPEG n02526121/
+mv val/ILSVRC2012_val_00002382.JPEG n02860847/
+mv val/ILSVRC2012_val_00002383.JPEG n01806143/
+mv val/ILSVRC2012_val_00002384.JPEG n03706229/
+mv val/ILSVRC2012_val_00002385.JPEG n03982430/
+mv val/ILSVRC2012_val_00002386.JPEG n04009552/
+mv val/ILSVRC2012_val_00002387.JPEG n01616318/
+mv val/ILSVRC2012_val_00002388.JPEG n01828970/
+mv val/ILSVRC2012_val_00002389.JPEG n03920288/
+mv val/ILSVRC2012_val_00002390.JPEG n03680355/
+mv val/ILSVRC2012_val_00002391.JPEG n02727426/
+mv val/ILSVRC2012_val_00002392.JPEG n02963159/
+mv val/ILSVRC2012_val_00002393.JPEG n02102973/
+mv val/ILSVRC2012_val_00002394.JPEG n04209133/
+mv val/ILSVRC2012_val_00002395.JPEG n01798484/
+mv val/ILSVRC2012_val_00002396.JPEG n02190166/
+mv val/ILSVRC2012_val_00002397.JPEG n02091635/
+mv val/ILSVRC2012_val_00002398.JPEG n02089078/
+mv val/ILSVRC2012_val_00002399.JPEG n04371774/
+mv val/ILSVRC2012_val_00002400.JPEG n04515003/
+mv val/ILSVRC2012_val_00002401.JPEG n02655020/
+mv val/ILSVRC2012_val_00002402.JPEG n02104029/
+mv val/ILSVRC2012_val_00002403.JPEG n01877812/
+mv val/ILSVRC2012_val_00002404.JPEG n02794156/
+mv val/ILSVRC2012_val_00002405.JPEG n02974003/
+mv val/ILSVRC2012_val_00002406.JPEG n02096585/
+mv val/ILSVRC2012_val_00002407.JPEG n04525305/
+mv val/ILSVRC2012_val_00002408.JPEG n02672831/
+mv val/ILSVRC2012_val_00002409.JPEG n02113712/
+mv val/ILSVRC2012_val_00002410.JPEG n02917067/
+mv val/ILSVRC2012_val_00002411.JPEG n02096437/
+mv val/ILSVRC2012_val_00002412.JPEG n07745940/
+mv val/ILSVRC2012_val_00002413.JPEG n02326432/
+mv val/ILSVRC2012_val_00002414.JPEG n03314780/
+mv val/ILSVRC2012_val_00002415.JPEG n02236044/
+mv val/ILSVRC2012_val_00002416.JPEG n02102973/
+mv val/ILSVRC2012_val_00002417.JPEG n02093428/
+mv val/ILSVRC2012_val_00002418.JPEG n03297495/
+mv val/ILSVRC2012_val_00002419.JPEG n03676483/
+mv val/ILSVRC2012_val_00002420.JPEG n03775071/
+mv val/ILSVRC2012_val_00002421.JPEG n04536866/
+mv val/ILSVRC2012_val_00002422.JPEG n04554684/
+mv val/ILSVRC2012_val_00002423.JPEG n03400231/
+mv val/ILSVRC2012_val_00002424.JPEG n04346328/
+mv val/ILSVRC2012_val_00002425.JPEG n01530575/
+mv val/ILSVRC2012_val_00002426.JPEG n04133789/
+mv val/ILSVRC2012_val_00002427.JPEG n03160309/
+mv val/ILSVRC2012_val_00002428.JPEG n01930112/
+mv val/ILSVRC2012_val_00002429.JPEG n03494278/
+mv val/ILSVRC2012_val_00002430.JPEG n03063599/
+mv val/ILSVRC2012_val_00002431.JPEG n03891332/
+mv val/ILSVRC2012_val_00002432.JPEG n04476259/
+mv val/ILSVRC2012_val_00002433.JPEG n02410509/
+mv val/ILSVRC2012_val_00002434.JPEG n03417042/
+mv val/ILSVRC2012_val_00002435.JPEG n07753113/
+mv val/ILSVRC2012_val_00002436.JPEG n03498962/
+mv val/ILSVRC2012_val_00002437.JPEG n03991062/
+mv val/ILSVRC2012_val_00002438.JPEG n04086273/
+mv val/ILSVRC2012_val_00002439.JPEG n01739381/
+mv val/ILSVRC2012_val_00002440.JPEG n07753275/
+mv val/ILSVRC2012_val_00002441.JPEG n03065424/
+mv val/ILSVRC2012_val_00002442.JPEG n03476991/
+mv val/ILSVRC2012_val_00002443.JPEG n07565083/
+mv val/ILSVRC2012_val_00002444.JPEG n01608432/
+mv val/ILSVRC2012_val_00002445.JPEG n04258138/
+mv val/ILSVRC2012_val_00002446.JPEG n03803284/
+mv val/ILSVRC2012_val_00002447.JPEG n02120079/
+mv val/ILSVRC2012_val_00002448.JPEG n02454379/
+mv val/ILSVRC2012_val_00002449.JPEG n01537544/
+mv val/ILSVRC2012_val_00002450.JPEG n02492035/
+mv val/ILSVRC2012_val_00002451.JPEG n02219486/
+mv val/ILSVRC2012_val_00002452.JPEG n01735189/
+mv val/ILSVRC2012_val_00002453.JPEG n03594734/
+mv val/ILSVRC2012_val_00002454.JPEG n02442845/
+mv val/ILSVRC2012_val_00002455.JPEG n04485082/
+mv val/ILSVRC2012_val_00002456.JPEG n03599486/
+mv val/ILSVRC2012_val_00002457.JPEG n02086079/
+mv val/ILSVRC2012_val_00002458.JPEG n03995372/
+mv val/ILSVRC2012_val_00002459.JPEG n04501370/
+mv val/ILSVRC2012_val_00002460.JPEG n02113712/
+mv val/ILSVRC2012_val_00002461.JPEG n02102480/
+mv val/ILSVRC2012_val_00002462.JPEG n03599486/
+mv val/ILSVRC2012_val_00002463.JPEG n04162706/
+mv val/ILSVRC2012_val_00002464.JPEG n03868242/
+mv val/ILSVRC2012_val_00002465.JPEG n04209133/
+mv val/ILSVRC2012_val_00002466.JPEG n02791124/
+mv val/ILSVRC2012_val_00002467.JPEG n01819313/
+mv val/ILSVRC2012_val_00002468.JPEG n02116738/
+mv val/ILSVRC2012_val_00002469.JPEG n02894605/
+mv val/ILSVRC2012_val_00002470.JPEG n03764736/
+mv val/ILSVRC2012_val_00002471.JPEG n03476684/
+mv val/ILSVRC2012_val_00002472.JPEG n02123159/
+mv val/ILSVRC2012_val_00002473.JPEG n02325366/
+mv val/ILSVRC2012_val_00002474.JPEG n03457902/
+mv val/ILSVRC2012_val_00002475.JPEG n02123597/
+mv val/ILSVRC2012_val_00002476.JPEG n09399592/
+mv val/ILSVRC2012_val_00002477.JPEG n02488291/
+mv val/ILSVRC2012_val_00002478.JPEG n03788365/
+mv val/ILSVRC2012_val_00002479.JPEG n01770081/
+mv val/ILSVRC2012_val_00002480.JPEG n01498041/
+mv val/ILSVRC2012_val_00002481.JPEG n02110341/
+mv val/ILSVRC2012_val_00002482.JPEG n02834397/
+mv val/ILSVRC2012_val_00002483.JPEG n02391049/
+mv val/ILSVRC2012_val_00002484.JPEG n02113023/
+mv val/ILSVRC2012_val_00002485.JPEG n02099712/
+mv val/ILSVRC2012_val_00002486.JPEG n01739381/
+mv val/ILSVRC2012_val_00002487.JPEG n02980441/
+mv val/ILSVRC2012_val_00002488.JPEG n02027492/
+mv val/ILSVRC2012_val_00002489.JPEG n03208938/
+mv val/ILSVRC2012_val_00002490.JPEG n07734744/
+mv val/ILSVRC2012_val_00002491.JPEG n02027492/
+mv val/ILSVRC2012_val_00002492.JPEG n02108000/
+mv val/ILSVRC2012_val_00002493.JPEG n03902125/
+mv val/ILSVRC2012_val_00002494.JPEG n04044716/
+mv val/ILSVRC2012_val_00002495.JPEG n09428293/
+mv val/ILSVRC2012_val_00002496.JPEG n01981276/
+mv val/ILSVRC2012_val_00002497.JPEG n02869837/
+mv val/ILSVRC2012_val_00002498.JPEG n03425413/
+mv val/ILSVRC2012_val_00002499.JPEG n03085013/
+mv val/ILSVRC2012_val_00002500.JPEG n03804744/
+mv val/ILSVRC2012_val_00002501.JPEG n02443114/
+mv val/ILSVRC2012_val_00002502.JPEG n01983481/
+mv val/ILSVRC2012_val_00002503.JPEG n02088466/
+mv val/ILSVRC2012_val_00002504.JPEG n02077923/
+mv val/ILSVRC2012_val_00002505.JPEG n01740131/
+mv val/ILSVRC2012_val_00002506.JPEG n09468604/
+mv val/ILSVRC2012_val_00002507.JPEG n02783161/
+mv val/ILSVRC2012_val_00002508.JPEG n03888257/
+mv val/ILSVRC2012_val_00002509.JPEG n02797295/
+mv val/ILSVRC2012_val_00002510.JPEG n04252225/
+mv val/ILSVRC2012_val_00002511.JPEG n01622779/
+mv val/ILSVRC2012_val_00002512.JPEG n01669191/
+mv val/ILSVRC2012_val_00002513.JPEG n03710637/
+mv val/ILSVRC2012_val_00002514.JPEG n01669191/
+mv val/ILSVRC2012_val_00002515.JPEG n01983481/
+mv val/ILSVRC2012_val_00002516.JPEG n02108422/
+mv val/ILSVRC2012_val_00002517.JPEG n04111531/
+mv val/ILSVRC2012_val_00002518.JPEG n04179913/
+mv val/ILSVRC2012_val_00002519.JPEG n04204238/
+mv val/ILSVRC2012_val_00002520.JPEG n04389033/
+mv val/ILSVRC2012_val_00002521.JPEG n02087046/
+mv val/ILSVRC2012_val_00002522.JPEG n01872401/
+mv val/ILSVRC2012_val_00002523.JPEG n02692877/
+mv val/ILSVRC2012_val_00002524.JPEG n01632777/
+mv val/ILSVRC2012_val_00002525.JPEG n02640242/
+mv val/ILSVRC2012_val_00002526.JPEG n02927161/
+mv val/ILSVRC2012_val_00002527.JPEG n02814860/
+mv val/ILSVRC2012_val_00002528.JPEG n03792972/
+mv val/ILSVRC2012_val_00002529.JPEG n04039381/
+mv val/ILSVRC2012_val_00002530.JPEG n02480855/
+mv val/ILSVRC2012_val_00002531.JPEG n03599486/
+mv val/ILSVRC2012_val_00002532.JPEG n04326547/
+mv val/ILSVRC2012_val_00002533.JPEG n03691459/
+mv val/ILSVRC2012_val_00002534.JPEG n04592741/
+mv val/ILSVRC2012_val_00002535.JPEG n03014705/
+mv val/ILSVRC2012_val_00002536.JPEG n01582220/
+mv val/ILSVRC2012_val_00002537.JPEG n13052670/
+mv val/ILSVRC2012_val_00002538.JPEG n02802426/
+mv val/ILSVRC2012_val_00002539.JPEG n01797886/
+mv val/ILSVRC2012_val_00002540.JPEG n04263257/
+mv val/ILSVRC2012_val_00002541.JPEG n04350905/
+mv val/ILSVRC2012_val_00002542.JPEG n03372029/
+mv val/ILSVRC2012_val_00002543.JPEG n02484975/
+mv val/ILSVRC2012_val_00002544.JPEG n09428293/
+mv val/ILSVRC2012_val_00002545.JPEG n03887697/
+mv val/ILSVRC2012_val_00002546.JPEG n02112350/
+mv val/ILSVRC2012_val_00002547.JPEG n03110669/
+mv val/ILSVRC2012_val_00002548.JPEG n02910353/
+mv val/ILSVRC2012_val_00002549.JPEG n02096294/
+mv val/ILSVRC2012_val_00002550.JPEG n02102177/
+mv val/ILSVRC2012_val_00002551.JPEG n02115913/
+mv val/ILSVRC2012_val_00002552.JPEG n02804610/
+mv val/ILSVRC2012_val_00002553.JPEG n04239074/
+mv val/ILSVRC2012_val_00002554.JPEG n04005630/
+mv val/ILSVRC2012_val_00002555.JPEG n04118538/
+mv val/ILSVRC2012_val_00002556.JPEG n04067472/
+mv val/ILSVRC2012_val_00002557.JPEG n02128757/
+mv val/ILSVRC2012_val_00002558.JPEG n02097658/
+mv val/ILSVRC2012_val_00002559.JPEG n02099849/
+mv val/ILSVRC2012_val_00002560.JPEG n01882714/
+mv val/ILSVRC2012_val_00002561.JPEG n02494079/
+mv val/ILSVRC2012_val_00002562.JPEG n03379051/
+mv val/ILSVRC2012_val_00002563.JPEG n02808440/
+mv val/ILSVRC2012_val_00002564.JPEG n04392985/
+mv val/ILSVRC2012_val_00002565.JPEG n02114548/
+mv val/ILSVRC2012_val_00002566.JPEG n02206856/
+mv val/ILSVRC2012_val_00002567.JPEG n03976657/
+mv val/ILSVRC2012_val_00002568.JPEG n01729322/
+mv val/ILSVRC2012_val_00002569.JPEG n07831146/
+mv val/ILSVRC2012_val_00002570.JPEG n01883070/
+mv val/ILSVRC2012_val_00002571.JPEG n02361337/
+mv val/ILSVRC2012_val_00002572.JPEG n02128757/
+mv val/ILSVRC2012_val_00002573.JPEG n02097130/
+mv val/ILSVRC2012_val_00002574.JPEG n04447861/
+mv val/ILSVRC2012_val_00002575.JPEG n13052670/
+mv val/ILSVRC2012_val_00002576.JPEG n02096177/
+mv val/ILSVRC2012_val_00002577.JPEG n03691459/
+mv val/ILSVRC2012_val_00002578.JPEG n02134084/
+mv val/ILSVRC2012_val_00002579.JPEG n02494079/
+mv val/ILSVRC2012_val_00002580.JPEG n03642806/
+mv val/ILSVRC2012_val_00002581.JPEG n04136333/
+mv val/ILSVRC2012_val_00002582.JPEG n02268853/
+mv val/ILSVRC2012_val_00002583.JPEG n02417914/
+mv val/ILSVRC2012_val_00002584.JPEG n03891332/
+mv val/ILSVRC2012_val_00002585.JPEG n09246464/
+mv val/ILSVRC2012_val_00002586.JPEG n03032252/
+mv val/ILSVRC2012_val_00002587.JPEG n02825657/
+mv val/ILSVRC2012_val_00002588.JPEG n03498962/
+mv val/ILSVRC2012_val_00002589.JPEG n03160309/
+mv val/ILSVRC2012_val_00002590.JPEG n04026417/
+mv val/ILSVRC2012_val_00002591.JPEG n04296562/
+mv val/ILSVRC2012_val_00002592.JPEG n03534580/
+mv val/ILSVRC2012_val_00002593.JPEG n03216828/
+mv val/ILSVRC2012_val_00002594.JPEG n07880968/
+mv val/ILSVRC2012_val_00002595.JPEG n03393912/
+mv val/ILSVRC2012_val_00002596.JPEG n02948072/
+mv val/ILSVRC2012_val_00002597.JPEG n04560804/
+mv val/ILSVRC2012_val_00002598.JPEG n04152593/
+mv val/ILSVRC2012_val_00002599.JPEG n04509417/
+mv val/ILSVRC2012_val_00002600.JPEG n03884397/
+mv val/ILSVRC2012_val_00002601.JPEG n02129604/
+mv val/ILSVRC2012_val_00002602.JPEG n01944390/
+mv val/ILSVRC2012_val_00002603.JPEG n04310018/
+mv val/ILSVRC2012_val_00002604.JPEG n04086273/
+mv val/ILSVRC2012_val_00002605.JPEG n07584110/
+mv val/ILSVRC2012_val_00002606.JPEG n04258138/
+mv val/ILSVRC2012_val_00002607.JPEG n04264628/
+mv val/ILSVRC2012_val_00002608.JPEG n13040303/
+mv val/ILSVRC2012_val_00002609.JPEG n02109525/
+mv val/ILSVRC2012_val_00002610.JPEG n04462240/
+mv val/ILSVRC2012_val_00002611.JPEG n02791270/
+mv val/ILSVRC2012_val_00002612.JPEG n03384352/
+mv val/ILSVRC2012_val_00002613.JPEG n04070727/
+mv val/ILSVRC2012_val_00002614.JPEG n02108422/
+mv val/ILSVRC2012_val_00002615.JPEG n03485407/
+mv val/ILSVRC2012_val_00002616.JPEG n02093647/
+mv val/ILSVRC2012_val_00002617.JPEG n03000134/
+mv val/ILSVRC2012_val_00002618.JPEG n03089624/
+mv val/ILSVRC2012_val_00002619.JPEG n07615774/
+mv val/ILSVRC2012_val_00002620.JPEG n03956157/
+mv val/ILSVRC2012_val_00002621.JPEG n02776631/
+mv val/ILSVRC2012_val_00002622.JPEG n01729977/
+mv val/ILSVRC2012_val_00002623.JPEG n03868242/
+mv val/ILSVRC2012_val_00002624.JPEG n03899768/
+mv val/ILSVRC2012_val_00002625.JPEG n01871265/
+mv val/ILSVRC2012_val_00002626.JPEG n03180011/
+mv val/ILSVRC2012_val_00002627.JPEG n03630383/
+mv val/ILSVRC2012_val_00002628.JPEG n01968897/
+mv val/ILSVRC2012_val_00002629.JPEG n02939185/
+mv val/ILSVRC2012_val_00002630.JPEG n02097474/
+mv val/ILSVRC2012_val_00002631.JPEG n04154565/
+mv val/ILSVRC2012_val_00002632.JPEG n04462240/
+mv val/ILSVRC2012_val_00002633.JPEG n02028035/
+mv val/ILSVRC2012_val_00002634.JPEG n04041544/
+mv val/ILSVRC2012_val_00002635.JPEG n02111129/
+mv val/ILSVRC2012_val_00002636.JPEG n03026506/
+mv val/ILSVRC2012_val_00002637.JPEG n04389033/
+mv val/ILSVRC2012_val_00002638.JPEG n02808440/
+mv val/ILSVRC2012_val_00002639.JPEG n03124170/
+mv val/ILSVRC2012_val_00002640.JPEG n02129165/
+mv val/ILSVRC2012_val_00002641.JPEG n02776631/
+mv val/ILSVRC2012_val_00002642.JPEG n04259630/
+mv val/ILSVRC2012_val_00002643.JPEG n03902125/
+mv val/ILSVRC2012_val_00002644.JPEG n07760859/
+mv val/ILSVRC2012_val_00002645.JPEG n01744401/
+mv val/ILSVRC2012_val_00002646.JPEG n02128757/
+mv val/ILSVRC2012_val_00002647.JPEG n02843684/
+mv val/ILSVRC2012_val_00002648.JPEG n02091134/
+mv val/ILSVRC2012_val_00002649.JPEG n02256656/
+mv val/ILSVRC2012_val_00002650.JPEG n03814639/
+mv val/ILSVRC2012_val_00002651.JPEG n02666196/
+mv val/ILSVRC2012_val_00002652.JPEG n02497673/
+mv val/ILSVRC2012_val_00002653.JPEG n13054560/
+mv val/ILSVRC2012_val_00002654.JPEG n01914609/
+mv val/ILSVRC2012_val_00002655.JPEG n01580077/
+mv val/ILSVRC2012_val_00002656.JPEG n02089867/
+mv val/ILSVRC2012_val_00002657.JPEG n03630383/
+mv val/ILSVRC2012_val_00002658.JPEG n02025239/
+mv val/ILSVRC2012_val_00002659.JPEG n02123597/
+mv val/ILSVRC2012_val_00002660.JPEG n02807133/
+mv val/ILSVRC2012_val_00002661.JPEG n03673027/
+mv val/ILSVRC2012_val_00002662.JPEG n04317175/
+mv val/ILSVRC2012_val_00002663.JPEG n15075141/
+mv val/ILSVRC2012_val_00002664.JPEG n01795545/
+mv val/ILSVRC2012_val_00002665.JPEG n03888257/
+mv val/ILSVRC2012_val_00002666.JPEG n03062245/
+mv val/ILSVRC2012_val_00002667.JPEG n04209133/
+mv val/ILSVRC2012_val_00002668.JPEG n01531178/
+mv val/ILSVRC2012_val_00002669.JPEG n02410509/
+mv val/ILSVRC2012_val_00002670.JPEG n04162706/
+mv val/ILSVRC2012_val_00002671.JPEG n03814639/
+mv val/ILSVRC2012_val_00002672.JPEG n02102177/
+mv val/ILSVRC2012_val_00002673.JPEG n04399382/
+mv val/ILSVRC2012_val_00002674.JPEG n03220513/
+mv val/ILSVRC2012_val_00002675.JPEG n06874185/
+mv val/ILSVRC2012_val_00002676.JPEG n04152593/
+mv val/ILSVRC2012_val_00002677.JPEG n07880968/
+mv val/ILSVRC2012_val_00002678.JPEG n02066245/
+mv val/ILSVRC2012_val_00002679.JPEG n01735189/
+mv val/ILSVRC2012_val_00002680.JPEG n03271574/
+mv val/ILSVRC2012_val_00002681.JPEG n01592084/
+mv val/ILSVRC2012_val_00002682.JPEG n04355933/
+mv val/ILSVRC2012_val_00002683.JPEG n02085936/
+mv val/ILSVRC2012_val_00002684.JPEG n01978455/
+mv val/ILSVRC2012_val_00002685.JPEG n04597913/
+mv val/ILSVRC2012_val_00002686.JPEG n07871810/
+mv val/ILSVRC2012_val_00002687.JPEG n02093859/
+mv val/ILSVRC2012_val_00002688.JPEG n01773549/
+mv val/ILSVRC2012_val_00002689.JPEG n03126707/
+mv val/ILSVRC2012_val_00002690.JPEG n03452741/
+mv val/ILSVRC2012_val_00002691.JPEG n02027492/
+mv val/ILSVRC2012_val_00002692.JPEG n02408429/
+mv val/ILSVRC2012_val_00002693.JPEG n01985128/
+mv val/ILSVRC2012_val_00002694.JPEG n03670208/
+mv val/ILSVRC2012_val_00002695.JPEG n04458633/
+mv val/ILSVRC2012_val_00002696.JPEG n04273569/
+mv val/ILSVRC2012_val_00002697.JPEG n03785016/
+mv val/ILSVRC2012_val_00002698.JPEG n01751748/
+mv val/ILSVRC2012_val_00002699.JPEG n03188531/
+mv val/ILSVRC2012_val_00002700.JPEG n02917067/
+mv val/ILSVRC2012_val_00002701.JPEG n02086240/
+mv val/ILSVRC2012_val_00002702.JPEG n03770439/
+mv val/ILSVRC2012_val_00002703.JPEG n03240683/
+mv val/ILSVRC2012_val_00002704.JPEG n03920288/
+mv val/ILSVRC2012_val_00002705.JPEG n03954731/
+mv val/ILSVRC2012_val_00002706.JPEG n02109525/
+mv val/ILSVRC2012_val_00002707.JPEG n03016953/
+mv val/ILSVRC2012_val_00002708.JPEG n02107683/
+mv val/ILSVRC2012_val_00002709.JPEG n01665541/
+mv val/ILSVRC2012_val_00002710.JPEG n04310018/
+mv val/ILSVRC2012_val_00002711.JPEG n03485407/
+mv val/ILSVRC2012_val_00002712.JPEG n03187595/
+mv val/ILSVRC2012_val_00002713.JPEG n03814639/
+mv val/ILSVRC2012_val_00002714.JPEG n02095570/
+mv val/ILSVRC2012_val_00002715.JPEG n01968897/
+mv val/ILSVRC2012_val_00002716.JPEG n03874599/
+mv val/ILSVRC2012_val_00002717.JPEG n02493509/
+mv val/ILSVRC2012_val_00002718.JPEG n02130308/
+mv val/ILSVRC2012_val_00002719.JPEG n02749479/
+mv val/ILSVRC2012_val_00002720.JPEG n01945685/
+mv val/ILSVRC2012_val_00002721.JPEG n02536864/
+mv val/ILSVRC2012_val_00002722.JPEG n04154565/
+mv val/ILSVRC2012_val_00002723.JPEG n02328150/
+mv val/ILSVRC2012_val_00002724.JPEG n03908618/
+mv val/ILSVRC2012_val_00002725.JPEG n01737021/
+mv val/ILSVRC2012_val_00002726.JPEG n02408429/
+mv val/ILSVRC2012_val_00002727.JPEG n02231487/
+mv val/ILSVRC2012_val_00002728.JPEG n04131690/
+mv val/ILSVRC2012_val_00002729.JPEG n03970156/
+mv val/ILSVRC2012_val_00002730.JPEG n01530575/
+mv val/ILSVRC2012_val_00002731.JPEG n04336792/
+mv val/ILSVRC2012_val_00002732.JPEG n02951358/
+mv val/ILSVRC2012_val_00002733.JPEG n02879718/
+mv val/ILSVRC2012_val_00002734.JPEG n03944341/
+mv val/ILSVRC2012_val_00002735.JPEG n03788195/
+mv val/ILSVRC2012_val_00002736.JPEG n02895154/
+mv val/ILSVRC2012_val_00002737.JPEG n03838899/
+mv val/ILSVRC2012_val_00002738.JPEG n02037110/
+mv val/ILSVRC2012_val_00002739.JPEG n04009552/
+mv val/ILSVRC2012_val_00002740.JPEG n03141823/
+mv val/ILSVRC2012_val_00002741.JPEG n02102973/
+mv val/ILSVRC2012_val_00002742.JPEG n07730033/
+mv val/ILSVRC2012_val_00002743.JPEG n01984695/
+mv val/ILSVRC2012_val_00002744.JPEG n07693725/
+mv val/ILSVRC2012_val_00002745.JPEG n04065272/
+mv val/ILSVRC2012_val_00002746.JPEG n01631663/
+mv val/ILSVRC2012_val_00002747.JPEG n02699494/
+mv val/ILSVRC2012_val_00002748.JPEG n03095699/
+mv val/ILSVRC2012_val_00002749.JPEG n02112350/
+mv val/ILSVRC2012_val_00002750.JPEG n04019541/
+mv val/ILSVRC2012_val_00002751.JPEG n09835506/
+mv val/ILSVRC2012_val_00002752.JPEG n01484850/
+mv val/ILSVRC2012_val_00002753.JPEG n07697313/
+mv val/ILSVRC2012_val_00002754.JPEG n01729322/
+mv val/ILSVRC2012_val_00002755.JPEG n03085013/
+mv val/ILSVRC2012_val_00002756.JPEG n04041544/
+mv val/ILSVRC2012_val_00002757.JPEG n02396427/
+mv val/ILSVRC2012_val_00002758.JPEG n02879718/
+mv val/ILSVRC2012_val_00002759.JPEG n03891332/
+mv val/ILSVRC2012_val_00002760.JPEG n04590129/
+mv val/ILSVRC2012_val_00002761.JPEG n03271574/
+mv val/ILSVRC2012_val_00002762.JPEG n02454379/
+mv val/ILSVRC2012_val_00002763.JPEG n01944390/
+mv val/ILSVRC2012_val_00002764.JPEG n02099267/
+mv val/ILSVRC2012_val_00002765.JPEG n02097658/
+mv val/ILSVRC2012_val_00002766.JPEG n07720875/
+mv val/ILSVRC2012_val_00002767.JPEG n02484975/
+mv val/ILSVRC2012_val_00002768.JPEG n03733805/
+mv val/ILSVRC2012_val_00002769.JPEG n02086240/
+mv val/ILSVRC2012_val_00002770.JPEG n04204238/
+mv val/ILSVRC2012_val_00002771.JPEG n03483316/
+mv val/ILSVRC2012_val_00002772.JPEG n03201208/
+mv val/ILSVRC2012_val_00002773.JPEG n02095570/
+mv val/ILSVRC2012_val_00002774.JPEG n01630670/
+mv val/ILSVRC2012_val_00002775.JPEG n03201208/
+mv val/ILSVRC2012_val_00002776.JPEG n01755581/
+mv val/ILSVRC2012_val_00002777.JPEG n02879718/
+mv val/ILSVRC2012_val_00002778.JPEG n03065424/
+mv val/ILSVRC2012_val_00002779.JPEG n02037110/
+mv val/ILSVRC2012_val_00002780.JPEG n02108915/
+mv val/ILSVRC2012_val_00002781.JPEG n02807133/
+mv val/ILSVRC2012_val_00002782.JPEG n04023962/
+mv val/ILSVRC2012_val_00002783.JPEG n01669191/
+mv val/ILSVRC2012_val_00002784.JPEG n02098286/
+mv val/ILSVRC2012_val_00002785.JPEG n04252225/
+mv val/ILSVRC2012_val_00002786.JPEG n02115641/
+mv val/ILSVRC2012_val_00002787.JPEG n02281787/
+mv val/ILSVRC2012_val_00002788.JPEG n06794110/
+mv val/ILSVRC2012_val_00002789.JPEG n02391049/
+mv val/ILSVRC2012_val_00002790.JPEG n04486054/
+mv val/ILSVRC2012_val_00002791.JPEG n01817953/
+mv val/ILSVRC2012_val_00002792.JPEG n04041544/
+mv val/ILSVRC2012_val_00002793.JPEG n04277352/
+mv val/ILSVRC2012_val_00002794.JPEG n02107574/
+mv val/ILSVRC2012_val_00002795.JPEG n09193705/
+mv val/ILSVRC2012_val_00002796.JPEG n04371774/
+mv val/ILSVRC2012_val_00002797.JPEG n04372370/
+mv val/ILSVRC2012_val_00002798.JPEG n03724870/
+mv val/ILSVRC2012_val_00002799.JPEG n03388183/
+mv val/ILSVRC2012_val_00002800.JPEG n04371430/
+mv val/ILSVRC2012_val_00002801.JPEG n02788148/
+mv val/ILSVRC2012_val_00002802.JPEG n01817953/
+mv val/ILSVRC2012_val_00002803.JPEG n02699494/
+mv val/ILSVRC2012_val_00002804.JPEG n07730033/
+mv val/ILSVRC2012_val_00002805.JPEG n09468604/
+mv val/ILSVRC2012_val_00002806.JPEG n04254777/
+mv val/ILSVRC2012_val_00002807.JPEG n04501370/
+mv val/ILSVRC2012_val_00002808.JPEG n03637318/
+mv val/ILSVRC2012_val_00002809.JPEG n02782093/
+mv val/ILSVRC2012_val_00002810.JPEG n04152593/
+mv val/ILSVRC2012_val_00002811.JPEG n01882714/
+mv val/ILSVRC2012_val_00002812.JPEG n02916936/
+mv val/ILSVRC2012_val_00002813.JPEG n03661043/
+mv val/ILSVRC2012_val_00002814.JPEG n04336792/
+mv val/ILSVRC2012_val_00002815.JPEG n02422699/
+mv val/ILSVRC2012_val_00002816.JPEG n04019541/
+mv val/ILSVRC2012_val_00002817.JPEG n01664065/
+mv val/ILSVRC2012_val_00002818.JPEG n03325584/
+mv val/ILSVRC2012_val_00002819.JPEG n03976657/
+mv val/ILSVRC2012_val_00002820.JPEG n04423845/
+mv val/ILSVRC2012_val_00002821.JPEG n04404412/
+mv val/ILSVRC2012_val_00002822.JPEG n03527444/
+mv val/ILSVRC2012_val_00002823.JPEG n02123045/
+mv val/ILSVRC2012_val_00002824.JPEG n02094114/
+mv val/ILSVRC2012_val_00002825.JPEG n01558993/
+mv val/ILSVRC2012_val_00002826.JPEG n03062245/
+mv val/ILSVRC2012_val_00002827.JPEG n02113712/
+mv val/ILSVRC2012_val_00002828.JPEG n03662601/
+mv val/ILSVRC2012_val_00002829.JPEG n03065424/
+mv val/ILSVRC2012_val_00002830.JPEG n03388183/
+mv val/ILSVRC2012_val_00002831.JPEG n03447721/
+mv val/ILSVRC2012_val_00002832.JPEG n01667778/
+mv val/ILSVRC2012_val_00002833.JPEG n03584254/
+mv val/ILSVRC2012_val_00002834.JPEG n03000247/
+mv val/ILSVRC2012_val_00002835.JPEG n07718747/
+mv val/ILSVRC2012_val_00002836.JPEG n01737021/
+mv val/ILSVRC2012_val_00002837.JPEG n02676566/
+mv val/ILSVRC2012_val_00002838.JPEG n01795545/
+mv val/ILSVRC2012_val_00002839.JPEG n07860988/
+mv val/ILSVRC2012_val_00002840.JPEG n04086273/
+mv val/ILSVRC2012_val_00002841.JPEG n04332243/
+mv val/ILSVRC2012_val_00002842.JPEG n03447721/
+mv val/ILSVRC2012_val_00002843.JPEG n01829413/
+mv val/ILSVRC2012_val_00002844.JPEG n02236044/
+mv val/ILSVRC2012_val_00002845.JPEG n02165105/
+mv val/ILSVRC2012_val_00002846.JPEG n01796340/
+mv val/ILSVRC2012_val_00002847.JPEG n02092339/
+mv val/ILSVRC2012_val_00002848.JPEG n01443537/
+mv val/ILSVRC2012_val_00002849.JPEG n04370456/
+mv val/ILSVRC2012_val_00002850.JPEG n03961711/
+mv val/ILSVRC2012_val_00002851.JPEG n07579787/
+mv val/ILSVRC2012_val_00002852.JPEG n01753488/
+mv val/ILSVRC2012_val_00002853.JPEG n02708093/
+mv val/ILSVRC2012_val_00002854.JPEG n02111277/
+mv val/ILSVRC2012_val_00002855.JPEG n01774750/
+mv val/ILSVRC2012_val_00002856.JPEG n04286575/
+mv val/ILSVRC2012_val_00002857.JPEG n02483708/
+mv val/ILSVRC2012_val_00002858.JPEG n02002724/
+mv val/ILSVRC2012_val_00002859.JPEG n02536864/
+mv val/ILSVRC2012_val_00002860.JPEG n03400231/
+mv val/ILSVRC2012_val_00002861.JPEG n03485794/
+mv val/ILSVRC2012_val_00002862.JPEG n02480495/
+mv val/ILSVRC2012_val_00002863.JPEG n02509815/
+mv val/ILSVRC2012_val_00002864.JPEG n04111531/
+mv val/ILSVRC2012_val_00002865.JPEG n07716358/
+mv val/ILSVRC2012_val_00002866.JPEG n01968897/
+mv val/ILSVRC2012_val_00002867.JPEG n04579145/
+mv val/ILSVRC2012_val_00002868.JPEG n02892201/
+mv val/ILSVRC2012_val_00002869.JPEG n02091134/
+mv val/ILSVRC2012_val_00002870.JPEG n04118776/
+mv val/ILSVRC2012_val_00002871.JPEG n03249569/
+mv val/ILSVRC2012_val_00002872.JPEG n01601694/
+mv val/ILSVRC2012_val_00002873.JPEG n04522168/
+mv val/ILSVRC2012_val_00002874.JPEG n02441942/
+mv val/ILSVRC2012_val_00002875.JPEG n03271574/
+mv val/ILSVRC2012_val_00002876.JPEG n02692877/
+mv val/ILSVRC2012_val_00002877.JPEG n03930313/
+mv val/ILSVRC2012_val_00002878.JPEG n02100735/
+mv val/ILSVRC2012_val_00002879.JPEG n04428191/
+mv val/ILSVRC2012_val_00002880.JPEG n03706229/
+mv val/ILSVRC2012_val_00002881.JPEG n02119789/
+mv val/ILSVRC2012_val_00002882.JPEG n02111277/
+mv val/ILSVRC2012_val_00002883.JPEG n01629819/
+mv val/ILSVRC2012_val_00002884.JPEG n04476259/
+mv val/ILSVRC2012_val_00002885.JPEG n03958227/
+mv val/ILSVRC2012_val_00002886.JPEG n03240683/
+mv val/ILSVRC2012_val_00002887.JPEG n02504458/
+mv val/ILSVRC2012_val_00002888.JPEG n04461696/
+mv val/ILSVRC2012_val_00002889.JPEG n09229709/
+mv val/ILSVRC2012_val_00002890.JPEG n01728920/
+mv val/ILSVRC2012_val_00002891.JPEG n02422106/
+mv val/ILSVRC2012_val_00002892.JPEG n03450230/
+mv val/ILSVRC2012_val_00002893.JPEG n02268853/
+mv val/ILSVRC2012_val_00002894.JPEG n03902125/
+mv val/ILSVRC2012_val_00002895.JPEG n03868863/
+mv val/ILSVRC2012_val_00002896.JPEG n09428293/
+mv val/ILSVRC2012_val_00002897.JPEG n04482393/
+mv val/ILSVRC2012_val_00002898.JPEG n03680355/
+mv val/ILSVRC2012_val_00002899.JPEG n01744401/
+mv val/ILSVRC2012_val_00002900.JPEG n12620546/
+mv val/ILSVRC2012_val_00002901.JPEG n02002556/
+mv val/ILSVRC2012_val_00002902.JPEG n04136333/
+mv val/ILSVRC2012_val_00002903.JPEG n02447366/
+mv val/ILSVRC2012_val_00002904.JPEG n02226429/
+mv val/ILSVRC2012_val_00002905.JPEG n03249569/
+mv val/ILSVRC2012_val_00002906.JPEG n02281406/
+mv val/ILSVRC2012_val_00002907.JPEG n03721384/
+mv val/ILSVRC2012_val_00002908.JPEG n03874599/
+mv val/ILSVRC2012_val_00002909.JPEG n02951585/
+mv val/ILSVRC2012_val_00002910.JPEG n04074963/
+mv val/ILSVRC2012_val_00002911.JPEG n02480495/
+mv val/ILSVRC2012_val_00002912.JPEG n03929855/
+mv val/ILSVRC2012_val_00002913.JPEG n03016953/
+mv val/ILSVRC2012_val_00002914.JPEG n03376595/
+mv val/ILSVRC2012_val_00002915.JPEG n07747607/
+mv val/ILSVRC2012_val_00002916.JPEG n15075141/
+mv val/ILSVRC2012_val_00002917.JPEG n02085620/
+mv val/ILSVRC2012_val_00002918.JPEG n04141975/
+mv val/ILSVRC2012_val_00002919.JPEG n03733805/
+mv val/ILSVRC2012_val_00002920.JPEG n03670208/
+mv val/ILSVRC2012_val_00002921.JPEG n02085620/
+mv val/ILSVRC2012_val_00002922.JPEG n01491361/
+mv val/ILSVRC2012_val_00002923.JPEG n03803284/
+mv val/ILSVRC2012_val_00002924.JPEG n02415577/
+mv val/ILSVRC2012_val_00002925.JPEG n07714571/
+mv val/ILSVRC2012_val_00002926.JPEG n03929855/
+mv val/ILSVRC2012_val_00002927.JPEG n13037406/
+mv val/ILSVRC2012_val_00002928.JPEG n01740131/
+mv val/ILSVRC2012_val_00002929.JPEG n01580077/
+mv val/ILSVRC2012_val_00002930.JPEG n03891251/
+mv val/ILSVRC2012_val_00002931.JPEG n02128925/
+mv val/ILSVRC2012_val_00002932.JPEG n01664065/
+mv val/ILSVRC2012_val_00002933.JPEG n02090379/
+mv val/ILSVRC2012_val_00002934.JPEG n07920052/
+mv val/ILSVRC2012_val_00002935.JPEG n02279972/
+mv val/ILSVRC2012_val_00002936.JPEG n02490219/
+mv val/ILSVRC2012_val_00002937.JPEG n02906734/
+mv val/ILSVRC2012_val_00002938.JPEG n01914609/
+mv val/ILSVRC2012_val_00002939.JPEG n01704323/
+mv val/ILSVRC2012_val_00002940.JPEG n02105412/
+mv val/ILSVRC2012_val_00002941.JPEG n03492542/
+mv val/ILSVRC2012_val_00002942.JPEG n04482393/
+mv val/ILSVRC2012_val_00002943.JPEG n02788148/
+mv val/ILSVRC2012_val_00002944.JPEG n01985128/
+mv val/ILSVRC2012_val_00002945.JPEG n03388549/
+mv val/ILSVRC2012_val_00002946.JPEG n04251144/
+mv val/ILSVRC2012_val_00002947.JPEG n02939185/
+mv val/ILSVRC2012_val_00002948.JPEG n02114548/
+mv val/ILSVRC2012_val_00002949.JPEG n07836838/
+mv val/ILSVRC2012_val_00002950.JPEG n10148035/
+mv val/ILSVRC2012_val_00002951.JPEG n03976467/
+mv val/ILSVRC2012_val_00002952.JPEG n03447721/
+mv val/ILSVRC2012_val_00002953.JPEG n02006656/
+mv val/ILSVRC2012_val_00002954.JPEG n07802026/
+mv val/ILSVRC2012_val_00002955.JPEG n04370456/
+mv val/ILSVRC2012_val_00002956.JPEG n02417914/
+mv val/ILSVRC2012_val_00002957.JPEG n01776313/
+mv val/ILSVRC2012_val_00002958.JPEG n02112018/
+mv val/ILSVRC2012_val_00002959.JPEG n03938244/
+mv val/ILSVRC2012_val_00002960.JPEG n02536864/
+mv val/ILSVRC2012_val_00002961.JPEG n07802026/
+mv val/ILSVRC2012_val_00002962.JPEG n04501370/
+mv val/ILSVRC2012_val_00002963.JPEG n02963159/
+mv val/ILSVRC2012_val_00002964.JPEG n03759954/
+mv val/ILSVRC2012_val_00002965.JPEG n02028035/
+mv val/ILSVRC2012_val_00002966.JPEG n04044716/
+mv val/ILSVRC2012_val_00002967.JPEG n02123394/
+mv val/ILSVRC2012_val_00002968.JPEG n02823428/
+mv val/ILSVRC2012_val_00002969.JPEG n01491361/
+mv val/ILSVRC2012_val_00002970.JPEG n04008634/
+mv val/ILSVRC2012_val_00002971.JPEG n01877812/
+mv val/ILSVRC2012_val_00002972.JPEG n07615774/
+mv val/ILSVRC2012_val_00002973.JPEG n09256479/
+mv val/ILSVRC2012_val_00002974.JPEG n01833805/
+mv val/ILSVRC2012_val_00002975.JPEG n04127249/
+mv val/ILSVRC2012_val_00002976.JPEG n04507155/
+mv val/ILSVRC2012_val_00002977.JPEG n03673027/
+mv val/ILSVRC2012_val_00002978.JPEG n01882714/
+mv val/ILSVRC2012_val_00002979.JPEG n03697007/
+mv val/ILSVRC2012_val_00002980.JPEG n03637318/
+mv val/ILSVRC2012_val_00002981.JPEG n04332243/
+mv val/ILSVRC2012_val_00002982.JPEG n12267677/
+mv val/ILSVRC2012_val_00002983.JPEG n07714571/
+mv val/ILSVRC2012_val_00002984.JPEG n03485794/
+mv val/ILSVRC2012_val_00002985.JPEG n04004767/
+mv val/ILSVRC2012_val_00002986.JPEG n02795169/
+mv val/ILSVRC2012_val_00002987.JPEG n02120505/
+mv val/ILSVRC2012_val_00002988.JPEG n02086646/
+mv val/ILSVRC2012_val_00002989.JPEG n02107908/
+mv val/ILSVRC2012_val_00002990.JPEG n03888257/
+mv val/ILSVRC2012_val_00002991.JPEG n01795545/
+mv val/ILSVRC2012_val_00002992.JPEG n03272010/
+mv val/ILSVRC2012_val_00002993.JPEG n07714571/
+mv val/ILSVRC2012_val_00002994.JPEG n02097047/
+mv val/ILSVRC2012_val_00002995.JPEG n03874293/
+mv val/ILSVRC2012_val_00002996.JPEG n02391049/
+mv val/ILSVRC2012_val_00002997.JPEG n01855672/
+mv val/ILSVRC2012_val_00002998.JPEG n01871265/
+mv val/ILSVRC2012_val_00002999.JPEG n04208210/
+mv val/ILSVRC2012_val_00003000.JPEG n02487347/
+mv val/ILSVRC2012_val_00003001.JPEG n02013706/
+mv val/ILSVRC2012_val_00003002.JPEG n02096051/
+mv val/ILSVRC2012_val_00003003.JPEG n03598930/
+mv val/ILSVRC2012_val_00003004.JPEG n03873416/
+mv val/ILSVRC2012_val_00003005.JPEG n02871525/
+mv val/ILSVRC2012_val_00003006.JPEG n02102973/
+mv val/ILSVRC2012_val_00003007.JPEG n03710637/
+mv val/ILSVRC2012_val_00003008.JPEG n01773157/
+mv val/ILSVRC2012_val_00003009.JPEG n03208938/
+mv val/ILSVRC2012_val_00003010.JPEG n04325704/
+mv val/ILSVRC2012_val_00003011.JPEG n02002724/
+mv val/ILSVRC2012_val_00003012.JPEG n02137549/
+mv val/ILSVRC2012_val_00003013.JPEG n02125311/
+mv val/ILSVRC2012_val_00003014.JPEG n01440764/
+mv val/ILSVRC2012_val_00003015.JPEG n01806567/
+mv val/ILSVRC2012_val_00003016.JPEG n03345487/
+mv val/ILSVRC2012_val_00003017.JPEG n04209239/
+mv val/ILSVRC2012_val_00003018.JPEG n07860988/
+mv val/ILSVRC2012_val_00003019.JPEG n07802026/
+mv val/ILSVRC2012_val_00003020.JPEG n07714571/
+mv val/ILSVRC2012_val_00003021.JPEG n12768682/
+mv val/ILSVRC2012_val_00003022.JPEG n02108422/
+mv val/ILSVRC2012_val_00003023.JPEG n01770393/
+mv val/ILSVRC2012_val_00003024.JPEG n03124043/
+mv val/ILSVRC2012_val_00003025.JPEG n04023962/
+mv val/ILSVRC2012_val_00003026.JPEG n02105056/
+mv val/ILSVRC2012_val_00003027.JPEG n04476259/
+mv val/ILSVRC2012_val_00003028.JPEG n02871525/
+mv val/ILSVRC2012_val_00003029.JPEG n03598930/
+mv val/ILSVRC2012_val_00003030.JPEG n02206856/
+mv val/ILSVRC2012_val_00003031.JPEG n03223299/
+mv val/ILSVRC2012_val_00003032.JPEG n02259212/
+mv val/ILSVRC2012_val_00003033.JPEG n02607072/
+mv val/ILSVRC2012_val_00003034.JPEG n02834397/
+mv val/ILSVRC2012_val_00003035.JPEG n02364673/
+mv val/ILSVRC2012_val_00003036.JPEG n03131574/
+mv val/ILSVRC2012_val_00003037.JPEG n02802426/
+mv val/ILSVRC2012_val_00003038.JPEG n02117135/
+mv val/ILSVRC2012_val_00003039.JPEG n04370456/
+mv val/ILSVRC2012_val_00003040.JPEG n01829413/
+mv val/ILSVRC2012_val_00003041.JPEG n04033901/
+mv val/ILSVRC2012_val_00003042.JPEG n02123159/
+mv val/ILSVRC2012_val_00003043.JPEG n02794156/
+mv val/ILSVRC2012_val_00003044.JPEG n02132136/
+mv val/ILSVRC2012_val_00003045.JPEG n02883205/
+mv val/ILSVRC2012_val_00003046.JPEG n07720875/
+mv val/ILSVRC2012_val_00003047.JPEG n03920288/
+mv val/ILSVRC2012_val_00003048.JPEG n02892201/
+mv val/ILSVRC2012_val_00003049.JPEG n04285008/
+mv val/ILSVRC2012_val_00003050.JPEG n03345487/
+mv val/ILSVRC2012_val_00003051.JPEG n03661043/
+mv val/ILSVRC2012_val_00003052.JPEG n04423845/
+mv val/ILSVRC2012_val_00003053.JPEG n02013706/
+mv val/ILSVRC2012_val_00003054.JPEG n01924916/
+mv val/ILSVRC2012_val_00003055.JPEG n03095699/
+mv val/ILSVRC2012_val_00003056.JPEG n09428293/
+mv val/ILSVRC2012_val_00003057.JPEG n04153751/
+mv val/ILSVRC2012_val_00003058.JPEG n02865351/
+mv val/ILSVRC2012_val_00003059.JPEG n03384352/
+mv val/ILSVRC2012_val_00003060.JPEG n02786058/
+mv val/ILSVRC2012_val_00003061.JPEG n02099429/
+mv val/ILSVRC2012_val_00003062.JPEG n03014705/
+mv val/ILSVRC2012_val_00003063.JPEG n02113712/
+mv val/ILSVRC2012_val_00003064.JPEG n01833805/
+mv val/ILSVRC2012_val_00003065.JPEG n03924679/
+mv val/ILSVRC2012_val_00003066.JPEG n03937543/
+mv val/ILSVRC2012_val_00003067.JPEG n02892767/
+mv val/ILSVRC2012_val_00003068.JPEG n01819313/
+mv val/ILSVRC2012_val_00003069.JPEG n02109047/
+mv val/ILSVRC2012_val_00003070.JPEG n01694178/
+mv val/ILSVRC2012_val_00003071.JPEG n01729322/
+mv val/ILSVRC2012_val_00003072.JPEG n02808440/
+mv val/ILSVRC2012_val_00003073.JPEG n04266014/
+mv val/ILSVRC2012_val_00003074.JPEG n01978287/
+mv val/ILSVRC2012_val_00003075.JPEG n04111531/
+mv val/ILSVRC2012_val_00003076.JPEG n04540053/
+mv val/ILSVRC2012_val_00003077.JPEG n02100735/
+mv val/ILSVRC2012_val_00003078.JPEG n03935335/
+mv val/ILSVRC2012_val_00003079.JPEG n04372370/
+mv val/ILSVRC2012_val_00003080.JPEG n03930630/
+mv val/ILSVRC2012_val_00003081.JPEG n02443114/
+mv val/ILSVRC2012_val_00003082.JPEG n03854065/
+mv val/ILSVRC2012_val_00003083.JPEG n03724870/
+mv val/ILSVRC2012_val_00003084.JPEG n09193705/
+mv val/ILSVRC2012_val_00003085.JPEG n02640242/
+mv val/ILSVRC2012_val_00003086.JPEG n03967562/
+mv val/ILSVRC2012_val_00003087.JPEG n07711569/
+mv val/ILSVRC2012_val_00003088.JPEG n04147183/
+mv val/ILSVRC2012_val_00003089.JPEG n03710721/
+mv val/ILSVRC2012_val_00003090.JPEG n02965783/
+mv val/ILSVRC2012_val_00003091.JPEG n02951585/
+mv val/ILSVRC2012_val_00003092.JPEG n01582220/
+mv val/ILSVRC2012_val_00003093.JPEG n03014705/
+mv val/ILSVRC2012_val_00003094.JPEG n02643566/
+mv val/ILSVRC2012_val_00003095.JPEG n01739381/
+mv val/ILSVRC2012_val_00003096.JPEG n03814906/
+mv val/ILSVRC2012_val_00003097.JPEG n01882714/
+mv val/ILSVRC2012_val_00003098.JPEG n01729322/
+mv val/ILSVRC2012_val_00003099.JPEG n02860847/
+mv val/ILSVRC2012_val_00003100.JPEG n04350905/
+mv val/ILSVRC2012_val_00003101.JPEG n01697457/
+mv val/ILSVRC2012_val_00003102.JPEG n03220513/
+mv val/ILSVRC2012_val_00003103.JPEG n04311004/
+mv val/ILSVRC2012_val_00003104.JPEG n03877472/
+mv val/ILSVRC2012_val_00003105.JPEG n04209239/
+mv val/ILSVRC2012_val_00003106.JPEG n04149813/
+mv val/ILSVRC2012_val_00003107.JPEG n03770679/
+mv val/ILSVRC2012_val_00003108.JPEG n04548362/
+mv val/ILSVRC2012_val_00003109.JPEG n07930864/
+mv val/ILSVRC2012_val_00003110.JPEG n03661043/
+mv val/ILSVRC2012_val_00003111.JPEG n03400231/
+mv val/ILSVRC2012_val_00003112.JPEG n02930766/
+mv val/ILSVRC2012_val_00003113.JPEG n04613696/
+mv val/ILSVRC2012_val_00003114.JPEG n03866082/
+mv val/ILSVRC2012_val_00003115.JPEG n01990800/
+mv val/ILSVRC2012_val_00003116.JPEG n01534433/
+mv val/ILSVRC2012_val_00003117.JPEG n03947888/
+mv val/ILSVRC2012_val_00003118.JPEG n02492660/
+mv val/ILSVRC2012_val_00003119.JPEG n01985128/
+mv val/ILSVRC2012_val_00003120.JPEG n03793489/
+mv val/ILSVRC2012_val_00003121.JPEG n03977966/
+mv val/ILSVRC2012_val_00003122.JPEG n01795545/
+mv val/ILSVRC2012_val_00003123.JPEG n04086273/
+mv val/ILSVRC2012_val_00003124.JPEG n01688243/
+mv val/ILSVRC2012_val_00003125.JPEG n02423022/
+mv val/ILSVRC2012_val_00003126.JPEG n04277352/
+mv val/ILSVRC2012_val_00003127.JPEG n03877472/
+mv val/ILSVRC2012_val_00003128.JPEG n03208938/
+mv val/ILSVRC2012_val_00003129.JPEG n04476259/
+mv val/ILSVRC2012_val_00003130.JPEG n04550184/
+mv val/ILSVRC2012_val_00003131.JPEG n03063599/
+mv val/ILSVRC2012_val_00003132.JPEG n04523525/
+mv val/ILSVRC2012_val_00003133.JPEG n02123597/
+mv val/ILSVRC2012_val_00003134.JPEG n02708093/
+mv val/ILSVRC2012_val_00003135.JPEG n02134418/
+mv val/ILSVRC2012_val_00003136.JPEG n02086079/
+mv val/ILSVRC2012_val_00003137.JPEG n11879895/
+mv val/ILSVRC2012_val_00003138.JPEG n03676483/
+mv val/ILSVRC2012_val_00003139.JPEG n02107574/
+mv val/ILSVRC2012_val_00003140.JPEG n02113978/
+mv val/ILSVRC2012_val_00003141.JPEG n03764736/
+mv val/ILSVRC2012_val_00003142.JPEG n03642806/
+mv val/ILSVRC2012_val_00003143.JPEG n01748264/
+mv val/ILSVRC2012_val_00003144.JPEG n02167151/
+mv val/ILSVRC2012_val_00003145.JPEG n04612504/
+mv val/ILSVRC2012_val_00003146.JPEG n02817516/
+mv val/ILSVRC2012_val_00003147.JPEG n02051845/
+mv val/ILSVRC2012_val_00003148.JPEG n03724870/
+mv val/ILSVRC2012_val_00003149.JPEG n02077923/
+mv val/ILSVRC2012_val_00003150.JPEG n01443537/
+mv val/ILSVRC2012_val_00003151.JPEG n03065424/
+mv val/ILSVRC2012_val_00003152.JPEG n02105505/
+mv val/ILSVRC2012_val_00003153.JPEG n02051845/
+mv val/ILSVRC2012_val_00003154.JPEG n02087394/
+mv val/ILSVRC2012_val_00003155.JPEG n01735189/
+mv val/ILSVRC2012_val_00003156.JPEG n04310018/
+mv val/ILSVRC2012_val_00003157.JPEG n01632458/
+mv val/ILSVRC2012_val_00003158.JPEG n02509815/
+mv val/ILSVRC2012_val_00003159.JPEG n02093859/
+mv val/ILSVRC2012_val_00003160.JPEG n01669191/
+mv val/ILSVRC2012_val_00003161.JPEG n03868242/
+mv val/ILSVRC2012_val_00003162.JPEG n03400231/
+mv val/ILSVRC2012_val_00003163.JPEG n02423022/
+mv val/ILSVRC2012_val_00003164.JPEG n02090622/
+mv val/ILSVRC2012_val_00003165.JPEG n03146219/
+mv val/ILSVRC2012_val_00003166.JPEG n02397096/
+mv val/ILSVRC2012_val_00003167.JPEG n03532672/
+mv val/ILSVRC2012_val_00003168.JPEG n02013706/
+mv val/ILSVRC2012_val_00003169.JPEG n01622779/
+mv val/ILSVRC2012_val_00003170.JPEG n02483708/
+mv val/ILSVRC2012_val_00003171.JPEG n03187595/
+mv val/ILSVRC2012_val_00003172.JPEG n02114712/
+mv val/ILSVRC2012_val_00003173.JPEG n03131574/
+mv val/ILSVRC2012_val_00003174.JPEG n03476991/
+mv val/ILSVRC2012_val_00003175.JPEG n03838899/
+mv val/ILSVRC2012_val_00003176.JPEG n02105162/
+mv val/ILSVRC2012_val_00003177.JPEG n04604644/
+mv val/ILSVRC2012_val_00003178.JPEG n01689811/
+mv val/ILSVRC2012_val_00003179.JPEG n02113624/
+mv val/ILSVRC2012_val_00003180.JPEG n03691459/
+mv val/ILSVRC2012_val_00003181.JPEG n15075141/
+mv val/ILSVRC2012_val_00003182.JPEG n01773797/
+mv val/ILSVRC2012_val_00003183.JPEG n01491361/
+mv val/ILSVRC2012_val_00003184.JPEG n04209133/
+mv val/ILSVRC2012_val_00003185.JPEG n04476259/
+mv val/ILSVRC2012_val_00003186.JPEG n03444034/
+mv val/ILSVRC2012_val_00003187.JPEG n02488291/
+mv val/ILSVRC2012_val_00003188.JPEG n03485407/
+mv val/ILSVRC2012_val_00003189.JPEG n01630670/
+mv val/ILSVRC2012_val_00003190.JPEG n04599235/
+mv val/ILSVRC2012_val_00003191.JPEG n02174001/
+mv val/ILSVRC2012_val_00003192.JPEG n02834397/
+mv val/ILSVRC2012_val_00003193.JPEG n02509815/
+mv val/ILSVRC2012_val_00003194.JPEG n03538406/
+mv val/ILSVRC2012_val_00003195.JPEG n03535780/
+mv val/ILSVRC2012_val_00003196.JPEG n02105855/
+mv val/ILSVRC2012_val_00003197.JPEG n04501370/
+mv val/ILSVRC2012_val_00003198.JPEG n02098105/
+mv val/ILSVRC2012_val_00003199.JPEG n03763968/
+mv val/ILSVRC2012_val_00003200.JPEG n03095699/
+mv val/ILSVRC2012_val_00003201.JPEG n04591713/
+mv val/ILSVRC2012_val_00003202.JPEG n02363005/
+mv val/ILSVRC2012_val_00003203.JPEG n03599486/
+mv val/ILSVRC2012_val_00003204.JPEG n01491361/
+mv val/ILSVRC2012_val_00003205.JPEG n02090622/
+mv val/ILSVRC2012_val_00003206.JPEG n03590841/
+mv val/ILSVRC2012_val_00003207.JPEG n03832673/
+mv val/ILSVRC2012_val_00003208.JPEG n02013706/
+mv val/ILSVRC2012_val_00003209.JPEG n06874185/
+mv val/ILSVRC2012_val_00003210.JPEG n06596364/
+mv val/ILSVRC2012_val_00003211.JPEG n04074963/
+mv val/ILSVRC2012_val_00003212.JPEG n04389033/
+mv val/ILSVRC2012_val_00003213.JPEG n02447366/
+mv val/ILSVRC2012_val_00003214.JPEG n01631663/
+mv val/ILSVRC2012_val_00003215.JPEG n02841315/
+mv val/ILSVRC2012_val_00003216.JPEG n03733805/
+mv val/ILSVRC2012_val_00003217.JPEG n03146219/
+mv val/ILSVRC2012_val_00003218.JPEG n02974003/
+mv val/ILSVRC2012_val_00003219.JPEG n03947888/
+mv val/ILSVRC2012_val_00003220.JPEG n02095570/
+mv val/ILSVRC2012_val_00003221.JPEG n02422106/
+mv val/ILSVRC2012_val_00003222.JPEG n04049303/
+mv val/ILSVRC2012_val_00003223.JPEG n02396427/
+mv val/ILSVRC2012_val_00003224.JPEG n03891251/
+mv val/ILSVRC2012_val_00003225.JPEG n02422106/
+mv val/ILSVRC2012_val_00003226.JPEG n04486054/
+mv val/ILSVRC2012_val_00003227.JPEG n02091831/
+mv val/ILSVRC2012_val_00003228.JPEG n07760859/
+mv val/ILSVRC2012_val_00003229.JPEG n03179701/
+mv val/ILSVRC2012_val_00003230.JPEG n03947888/
+mv val/ILSVRC2012_val_00003231.JPEG n03692522/
+mv val/ILSVRC2012_val_00003232.JPEG n02097298/
+mv val/ILSVRC2012_val_00003233.JPEG n03602883/
+mv val/ILSVRC2012_val_00003234.JPEG n02974003/
+mv val/ILSVRC2012_val_00003235.JPEG n02951585/
+mv val/ILSVRC2012_val_00003236.JPEG n04141327/
+mv val/ILSVRC2012_val_00003237.JPEG n04357314/
+mv val/ILSVRC2012_val_00003238.JPEG n02786058/
+mv val/ILSVRC2012_val_00003239.JPEG n02268853/
+mv val/ILSVRC2012_val_00003240.JPEG n04596742/
+mv val/ILSVRC2012_val_00003241.JPEG n03788365/
+mv val/ILSVRC2012_val_00003242.JPEG n02111277/
+mv val/ILSVRC2012_val_00003243.JPEG n02104365/
+mv val/ILSVRC2012_val_00003244.JPEG n03584254/
+mv val/ILSVRC2012_val_00003245.JPEG n04509417/
+mv val/ILSVRC2012_val_00003246.JPEG n03494278/
+mv val/ILSVRC2012_val_00003247.JPEG n02939185/
+mv val/ILSVRC2012_val_00003248.JPEG n02363005/
+mv val/ILSVRC2012_val_00003249.JPEG n03047690/
+mv val/ILSVRC2012_val_00003250.JPEG n04366367/
+mv val/ILSVRC2012_val_00003251.JPEG n04409515/
+mv val/ILSVRC2012_val_00003252.JPEG n04380533/
+mv val/ILSVRC2012_val_00003253.JPEG n03187595/
+mv val/ILSVRC2012_val_00003254.JPEG n01882714/
+mv val/ILSVRC2012_val_00003255.JPEG n03680355/
+mv val/ILSVRC2012_val_00003256.JPEG n03124170/
+mv val/ILSVRC2012_val_00003257.JPEG n01986214/
+mv val/ILSVRC2012_val_00003258.JPEG n04004767/
+mv val/ILSVRC2012_val_00003259.JPEG n01833805/
+mv val/ILSVRC2012_val_00003260.JPEG n04141076/
+mv val/ILSVRC2012_val_00003261.JPEG n02033041/
+mv val/ILSVRC2012_val_00003262.JPEG n03109150/
+mv val/ILSVRC2012_val_00003263.JPEG n04560804/
+mv val/ILSVRC2012_val_00003264.JPEG n07930864/
+mv val/ILSVRC2012_val_00003265.JPEG n02114548/
+mv val/ILSVRC2012_val_00003266.JPEG n02877765/
+mv val/ILSVRC2012_val_00003267.JPEG n02093754/
+mv val/ILSVRC2012_val_00003268.JPEG n01737021/
+mv val/ILSVRC2012_val_00003269.JPEG n02093647/
+mv val/ILSVRC2012_val_00003270.JPEG n03794056/
+mv val/ILSVRC2012_val_00003271.JPEG n01843383/
+mv val/ILSVRC2012_val_00003272.JPEG n01978287/
+mv val/ILSVRC2012_val_00003273.JPEG n01669191/
+mv val/ILSVRC2012_val_00003274.JPEG n02870880/
+mv val/ILSVRC2012_val_00003275.JPEG n02071294/
+mv val/ILSVRC2012_val_00003276.JPEG n02098286/
+mv val/ILSVRC2012_val_00003277.JPEG n04120489/
+mv val/ILSVRC2012_val_00003278.JPEG n04239074/
+mv val/ILSVRC2012_val_00003279.JPEG n01537544/
+mv val/ILSVRC2012_val_00003280.JPEG n02504013/
+mv val/ILSVRC2012_val_00003281.JPEG n03929855/
+mv val/ILSVRC2012_val_00003282.JPEG n09193705/
+mv val/ILSVRC2012_val_00003283.JPEG n03534580/
+mv val/ILSVRC2012_val_00003284.JPEG n03018349/
+mv val/ILSVRC2012_val_00003285.JPEG n04179913/
+mv val/ILSVRC2012_val_00003286.JPEG n01735189/
+mv val/ILSVRC2012_val_00003287.JPEG n01665541/
+mv val/ILSVRC2012_val_00003288.JPEG n12768682/
+mv val/ILSVRC2012_val_00003289.JPEG n02669723/
+mv val/ILSVRC2012_val_00003290.JPEG n03930313/
+mv val/ILSVRC2012_val_00003291.JPEG n04200800/
+mv val/ILSVRC2012_val_00003292.JPEG n02363005/
+mv val/ILSVRC2012_val_00003293.JPEG n04552348/
+mv val/ILSVRC2012_val_00003294.JPEG n03992509/
+mv val/ILSVRC2012_val_00003295.JPEG n02123159/
+mv val/ILSVRC2012_val_00003296.JPEG n04505470/
+mv val/ILSVRC2012_val_00003297.JPEG n01518878/
+mv val/ILSVRC2012_val_00003298.JPEG n01742172/
+mv val/ILSVRC2012_val_00003299.JPEG n02445715/
+mv val/ILSVRC2012_val_00003300.JPEG n03584254/
+mv val/ILSVRC2012_val_00003301.JPEG n02101556/
+mv val/ILSVRC2012_val_00003302.JPEG n02398521/
+mv val/ILSVRC2012_val_00003303.JPEG n02106166/
+mv val/ILSVRC2012_val_00003304.JPEG n04372370/
+mv val/ILSVRC2012_val_00003305.JPEG n04346328/
+mv val/ILSVRC2012_val_00003306.JPEG n02109047/
+mv val/ILSVRC2012_val_00003307.JPEG n03498962/
+mv val/ILSVRC2012_val_00003308.JPEG n01980166/
+mv val/ILSVRC2012_val_00003309.JPEG n07753275/
+mv val/ILSVRC2012_val_00003310.JPEG n04447861/
+mv val/ILSVRC2012_val_00003311.JPEG n09332890/
+mv val/ILSVRC2012_val_00003312.JPEG n04417672/
+mv val/ILSVRC2012_val_00003313.JPEG n07248320/
+mv val/ILSVRC2012_val_00003314.JPEG n02412080/
+mv val/ILSVRC2012_val_00003315.JPEG n03218198/
+mv val/ILSVRC2012_val_00003316.JPEG n04428191/
+mv val/ILSVRC2012_val_00003317.JPEG n04447861/
+mv val/ILSVRC2012_val_00003318.JPEG n04557648/
+mv val/ILSVRC2012_val_00003319.JPEG n01677366/
+mv val/ILSVRC2012_val_00003320.JPEG n01774750/
+mv val/ILSVRC2012_val_00003321.JPEG n09399592/
+mv val/ILSVRC2012_val_00003322.JPEG n02859443/
+mv val/ILSVRC2012_val_00003323.JPEG n04456115/
+mv val/ILSVRC2012_val_00003324.JPEG n02018795/
+mv val/ILSVRC2012_val_00003325.JPEG n03935335/
+mv val/ILSVRC2012_val_00003326.JPEG n04465501/
+mv val/ILSVRC2012_val_00003327.JPEG n02112706/
+mv val/ILSVRC2012_val_00003328.JPEG n02799071/
+mv val/ILSVRC2012_val_00003329.JPEG n07684084/
+mv val/ILSVRC2012_val_00003330.JPEG n01614925/
+mv val/ILSVRC2012_val_00003331.JPEG n02167151/
+mv val/ILSVRC2012_val_00003332.JPEG n04606251/
+mv val/ILSVRC2012_val_00003333.JPEG n04317175/
+mv val/ILSVRC2012_val_00003334.JPEG n04311004/
+mv val/ILSVRC2012_val_00003335.JPEG n02077923/
+mv val/ILSVRC2012_val_00003336.JPEG n04326547/
+mv val/ILSVRC2012_val_00003337.JPEG n02483708/
+mv val/ILSVRC2012_val_00003338.JPEG n02963159/
+mv val/ILSVRC2012_val_00003339.JPEG n07565083/
+mv val/ILSVRC2012_val_00003340.JPEG n04557648/
+mv val/ILSVRC2012_val_00003341.JPEG n02397096/
+mv val/ILSVRC2012_val_00003342.JPEG n04133789/
+mv val/ILSVRC2012_val_00003343.JPEG n02229544/
+mv val/ILSVRC2012_val_00003344.JPEG n04317175/
+mv val/ILSVRC2012_val_00003345.JPEG n07749582/
+mv val/ILSVRC2012_val_00003346.JPEG n03803284/
+mv val/ILSVRC2012_val_00003347.JPEG n04456115/
+mv val/ILSVRC2012_val_00003348.JPEG n01828970/
+mv val/ILSVRC2012_val_00003349.JPEG n02408429/
+mv val/ILSVRC2012_val_00003350.JPEG n01632458/
+mv val/ILSVRC2012_val_00003351.JPEG n03028079/
+mv val/ILSVRC2012_val_00003352.JPEG n03291819/
+mv val/ILSVRC2012_val_00003353.JPEG n01773797/
+mv val/ILSVRC2012_val_00003354.JPEG n02096585/
+mv val/ILSVRC2012_val_00003355.JPEG n02110341/
+mv val/ILSVRC2012_val_00003356.JPEG n01669191/
+mv val/ILSVRC2012_val_00003357.JPEG n01986214/
+mv val/ILSVRC2012_val_00003358.JPEG n03742115/
+mv val/ILSVRC2012_val_00003359.JPEG n01910747/
+mv val/ILSVRC2012_val_00003360.JPEG n02966687/
+mv val/ILSVRC2012_val_00003361.JPEG n02025239/
+mv val/ILSVRC2012_val_00003362.JPEG n07615774/
+mv val/ILSVRC2012_val_00003363.JPEG n02090721/
+mv val/ILSVRC2012_val_00003364.JPEG n01855672/
+mv val/ILSVRC2012_val_00003365.JPEG n02965783/
+mv val/ILSVRC2012_val_00003366.JPEG n03924679/
+mv val/ILSVRC2012_val_00003367.JPEG n11879895/
+mv val/ILSVRC2012_val_00003368.JPEG n02113186/
+mv val/ILSVRC2012_val_00003369.JPEG n04270147/
+mv val/ILSVRC2012_val_00003370.JPEG n02804610/
+mv val/ILSVRC2012_val_00003371.JPEG n06359193/
+mv val/ILSVRC2012_val_00003372.JPEG n02965783/
+mv val/ILSVRC2012_val_00003373.JPEG n03777754/
+mv val/ILSVRC2012_val_00003374.JPEG n09399592/
+mv val/ILSVRC2012_val_00003375.JPEG n01693334/
+mv val/ILSVRC2012_val_00003376.JPEG n04033901/
+mv val/ILSVRC2012_val_00003377.JPEG n02098413/
+mv val/ILSVRC2012_val_00003378.JPEG n01981276/
+mv val/ILSVRC2012_val_00003379.JPEG n03657121/
+mv val/ILSVRC2012_val_00003380.JPEG n02096437/
+mv val/ILSVRC2012_val_00003381.JPEG n03841143/
+mv val/ILSVRC2012_val_00003382.JPEG n02123394/
+mv val/ILSVRC2012_val_00003383.JPEG n02447366/
+mv val/ILSVRC2012_val_00003384.JPEG n03345487/
+mv val/ILSVRC2012_val_00003385.JPEG n02963159/
+mv val/ILSVRC2012_val_00003386.JPEG n01580077/
+mv val/ILSVRC2012_val_00003387.JPEG n03481172/
+mv val/ILSVRC2012_val_00003388.JPEG n02483362/
+mv val/ILSVRC2012_val_00003389.JPEG n02894605/
+mv val/ILSVRC2012_val_00003390.JPEG n02109525/
+mv val/ILSVRC2012_val_00003391.JPEG n04525038/
+mv val/ILSVRC2012_val_00003392.JPEG n01917289/
+mv val/ILSVRC2012_val_00003393.JPEG n03983396/
+mv val/ILSVRC2012_val_00003394.JPEG n04462240/
+mv val/ILSVRC2012_val_00003395.JPEG n04153751/
+mv val/ILSVRC2012_val_00003396.JPEG n03992509/
+mv val/ILSVRC2012_val_00003397.JPEG n02906734/
+mv val/ILSVRC2012_val_00003398.JPEG n03290653/
+mv val/ILSVRC2012_val_00003399.JPEG n02017213/
+mv val/ILSVRC2012_val_00003400.JPEG n02808440/
+mv val/ILSVRC2012_val_00003401.JPEG n04515003/
+mv val/ILSVRC2012_val_00003402.JPEG n02422106/
+mv val/ILSVRC2012_val_00003403.JPEG n02115913/
+mv val/ILSVRC2012_val_00003404.JPEG n03720891/
+mv val/ILSVRC2012_val_00003405.JPEG n10148035/
+mv val/ILSVRC2012_val_00003406.JPEG n02794156/
+mv val/ILSVRC2012_val_00003407.JPEG n02096294/
+mv val/ILSVRC2012_val_00003408.JPEG n03220513/
+mv val/ILSVRC2012_val_00003409.JPEG n02437312/
+mv val/ILSVRC2012_val_00003410.JPEG n02058221/
+mv val/ILSVRC2012_val_00003411.JPEG n04540053/
+mv val/ILSVRC2012_val_00003412.JPEG n07753592/
+mv val/ILSVRC2012_val_00003413.JPEG n02105641/
+mv val/ILSVRC2012_val_00003414.JPEG n04325704/
+mv val/ILSVRC2012_val_00003415.JPEG n04447861/
+mv val/ILSVRC2012_val_00003416.JPEG n07695742/
+mv val/ILSVRC2012_val_00003417.JPEG n03666591/
+mv val/ILSVRC2012_val_00003418.JPEG n03642806/
+mv val/ILSVRC2012_val_00003419.JPEG n01910747/
+mv val/ILSVRC2012_val_00003420.JPEG n03733281/
+mv val/ILSVRC2012_val_00003421.JPEG n01768244/
+mv val/ILSVRC2012_val_00003422.JPEG n03888605/
+mv val/ILSVRC2012_val_00003423.JPEG n13133613/
+mv val/ILSVRC2012_val_00003424.JPEG n03590841/
+mv val/ILSVRC2012_val_00003425.JPEG n03127925/
+mv val/ILSVRC2012_val_00003426.JPEG n02488291/
+mv val/ILSVRC2012_val_00003427.JPEG n04208210/
+mv val/ILSVRC2012_val_00003428.JPEG n04592741/
+mv val/ILSVRC2012_val_00003429.JPEG n04557648/
+mv val/ILSVRC2012_val_00003430.JPEG n02169497/
+mv val/ILSVRC2012_val_00003431.JPEG n01773549/
+mv val/ILSVRC2012_val_00003432.JPEG n02672831/
+mv val/ILSVRC2012_val_00003433.JPEG n03742115/
+mv val/ILSVRC2012_val_00003434.JPEG n01983481/
+mv val/ILSVRC2012_val_00003435.JPEG n02113978/
+mv val/ILSVRC2012_val_00003436.JPEG n03494278/
+mv val/ILSVRC2012_val_00003437.JPEG n02490219/
+mv val/ILSVRC2012_val_00003438.JPEG n02488291/
+mv val/ILSVRC2012_val_00003439.JPEG n03062245/
+mv val/ILSVRC2012_val_00003440.JPEG n02167151/
+mv val/ILSVRC2012_val_00003441.JPEG n02676566/
+mv val/ILSVRC2012_val_00003442.JPEG n04392985/
+mv val/ILSVRC2012_val_00003443.JPEG n03877472/
+mv val/ILSVRC2012_val_00003444.JPEG n02168699/
+mv val/ILSVRC2012_val_00003445.JPEG n02488291/
+mv val/ILSVRC2012_val_00003446.JPEG n02840245/
+mv val/ILSVRC2012_val_00003447.JPEG n03014705/
+mv val/ILSVRC2012_val_00003448.JPEG n04044716/
+mv val/ILSVRC2012_val_00003449.JPEG n02119022/
+mv val/ILSVRC2012_val_00003450.JPEG n01824575/
+mv val/ILSVRC2012_val_00003451.JPEG n02840245/
+mv val/ILSVRC2012_val_00003452.JPEG n04023962/
+mv val/ILSVRC2012_val_00003453.JPEG n03032252/
+mv val/ILSVRC2012_val_00003454.JPEG n02486410/
+mv val/ILSVRC2012_val_00003455.JPEG n03197337/
+mv val/ILSVRC2012_val_00003456.JPEG n02974003/
+mv val/ILSVRC2012_val_00003457.JPEG n04086273/
+mv val/ILSVRC2012_val_00003458.JPEG n02441942/
+mv val/ILSVRC2012_val_00003459.JPEG n03496892/
+mv val/ILSVRC2012_val_00003460.JPEG n03721384/
+mv val/ILSVRC2012_val_00003461.JPEG n03538406/
+mv val/ILSVRC2012_val_00003462.JPEG n03041632/
+mv val/ILSVRC2012_val_00003463.JPEG n02927161/
+mv val/ILSVRC2012_val_00003464.JPEG n02408429/
+mv val/ILSVRC2012_val_00003465.JPEG n03759954/
+mv val/ILSVRC2012_val_00003466.JPEG n03690938/
+mv val/ILSVRC2012_val_00003467.JPEG n01930112/
+mv val/ILSVRC2012_val_00003468.JPEG n01744401/
+mv val/ILSVRC2012_val_00003469.JPEG n02992529/
+mv val/ILSVRC2012_val_00003470.JPEG n03873416/
+mv val/ILSVRC2012_val_00003471.JPEG n07615774/
+mv val/ILSVRC2012_val_00003472.JPEG n02012849/
+mv val/ILSVRC2012_val_00003473.JPEG n03777568/
+mv val/ILSVRC2012_val_00003474.JPEG n03676483/
+mv val/ILSVRC2012_val_00003475.JPEG n01968897/
+mv val/ILSVRC2012_val_00003476.JPEG n03866082/
+mv val/ILSVRC2012_val_00003477.JPEG n04005630/
+mv val/ILSVRC2012_val_00003478.JPEG n04285008/
+mv val/ILSVRC2012_val_00003479.JPEG n02841315/
+mv val/ILSVRC2012_val_00003480.JPEG n02106030/
+mv val/ILSVRC2012_val_00003481.JPEG n02276258/
+mv val/ILSVRC2012_val_00003482.JPEG n02422106/
+mv val/ILSVRC2012_val_00003483.JPEG n03649909/
+mv val/ILSVRC2012_val_00003484.JPEG n03017168/
+mv val/ILSVRC2012_val_00003485.JPEG n02097474/
+mv val/ILSVRC2012_val_00003486.JPEG n02948072/
+mv val/ILSVRC2012_val_00003487.JPEG n02256656/
+mv val/ILSVRC2012_val_00003488.JPEG n04179913/
+mv val/ILSVRC2012_val_00003489.JPEG n09835506/
+mv val/ILSVRC2012_val_00003490.JPEG n02111889/
+mv val/ILSVRC2012_val_00003491.JPEG n02988304/
+mv val/ILSVRC2012_val_00003492.JPEG n07836838/
+mv val/ILSVRC2012_val_00003493.JPEG n02051845/
+mv val/ILSVRC2012_val_00003494.JPEG n02971356/
+mv val/ILSVRC2012_val_00003495.JPEG n02640242/
+mv val/ILSVRC2012_val_00003496.JPEG n03065424/
+mv val/ILSVRC2012_val_00003497.JPEG n04201297/
+mv val/ILSVRC2012_val_00003498.JPEG n02281406/
+mv val/ILSVRC2012_val_00003499.JPEG n02134418/
+mv val/ILSVRC2012_val_00003500.JPEG n02500267/
+mv val/ILSVRC2012_val_00003501.JPEG n02895154/
+mv val/ILSVRC2012_val_00003502.JPEG n02870880/
+mv val/ILSVRC2012_val_00003503.JPEG n03617480/
+mv val/ILSVRC2012_val_00003504.JPEG n02415577/
+mv val/ILSVRC2012_val_00003505.JPEG n03733131/
+mv val/ILSVRC2012_val_00003506.JPEG n03594734/
+mv val/ILSVRC2012_val_00003507.JPEG n04152593/
+mv val/ILSVRC2012_val_00003508.JPEG n04258138/
+mv val/ILSVRC2012_val_00003509.JPEG n04286575/
+mv val/ILSVRC2012_val_00003510.JPEG n04336792/
+mv val/ILSVRC2012_val_00003511.JPEG n02484975/
+mv val/ILSVRC2012_val_00003512.JPEG n04041544/
+mv val/ILSVRC2012_val_00003513.JPEG n04081281/
+mv val/ILSVRC2012_val_00003514.JPEG n03291819/
+mv val/ILSVRC2012_val_00003515.JPEG n04584207/
+mv val/ILSVRC2012_val_00003516.JPEG n02100877/
+mv val/ILSVRC2012_val_00003517.JPEG n03459775/
+mv val/ILSVRC2012_val_00003518.JPEG n01498041/
+mv val/ILSVRC2012_val_00003519.JPEG n04429376/
+mv val/ILSVRC2012_val_00003520.JPEG n04252077/
+mv val/ILSVRC2012_val_00003521.JPEG n04515003/
+mv val/ILSVRC2012_val_00003522.JPEG n02108089/
+mv val/ILSVRC2012_val_00003523.JPEG n03876231/
+mv val/ILSVRC2012_val_00003524.JPEG n03838899/
+mv val/ILSVRC2012_val_00003525.JPEG n07716358/
+mv val/ILSVRC2012_val_00003526.JPEG n02025239/
+mv val/ILSVRC2012_val_00003527.JPEG n02965783/
+mv val/ILSVRC2012_val_00003528.JPEG n04033901/
+mv val/ILSVRC2012_val_00003529.JPEG n03841143/
+mv val/ILSVRC2012_val_00003530.JPEG n02102318/
+mv val/ILSVRC2012_val_00003531.JPEG n03888605/
+mv val/ILSVRC2012_val_00003532.JPEG n03777568/
+mv val/ILSVRC2012_val_00003533.JPEG n04350905/
+mv val/ILSVRC2012_val_00003534.JPEG n02870880/
+mv val/ILSVRC2012_val_00003535.JPEG n04277352/
+mv val/ILSVRC2012_val_00003536.JPEG n07720875/
+mv val/ILSVRC2012_val_00003537.JPEG n02317335/
+mv val/ILSVRC2012_val_00003538.JPEG n02504458/
+mv val/ILSVRC2012_val_00003539.JPEG n02488291/
+mv val/ILSVRC2012_val_00003540.JPEG n02137549/
+mv val/ILSVRC2012_val_00003541.JPEG n02490219/
+mv val/ILSVRC2012_val_00003542.JPEG n04428191/
+mv val/ILSVRC2012_val_00003543.JPEG n03662601/
+mv val/ILSVRC2012_val_00003544.JPEG n04532670/
+mv val/ILSVRC2012_val_00003545.JPEG n02105412/
+mv val/ILSVRC2012_val_00003546.JPEG n02091831/
+mv val/ILSVRC2012_val_00003547.JPEG n04154565/
+mv val/ILSVRC2012_val_00003548.JPEG n01531178/
+mv val/ILSVRC2012_val_00003549.JPEG n07753275/
+mv val/ILSVRC2012_val_00003550.JPEG n02117135/
+mv val/ILSVRC2012_val_00003551.JPEG n01882714/
+mv val/ILSVRC2012_val_00003552.JPEG n03272010/
+mv val/ILSVRC2012_val_00003553.JPEG n03759954/
+mv val/ILSVRC2012_val_00003554.JPEG n03866082/
+mv val/ILSVRC2012_val_00003555.JPEG n03992509/
+mv val/ILSVRC2012_val_00003556.JPEG n02137549/
+mv val/ILSVRC2012_val_00003557.JPEG n01537544/
+mv val/ILSVRC2012_val_00003558.JPEG n01494475/
+mv val/ILSVRC2012_val_00003559.JPEG n03179701/
+mv val/ILSVRC2012_val_00003560.JPEG n01694178/
+mv val/ILSVRC2012_val_00003561.JPEG n04554684/
+mv val/ILSVRC2012_val_00003562.JPEG n04204347/
+mv val/ILSVRC2012_val_00003563.JPEG n11879895/
+mv val/ILSVRC2012_val_00003564.JPEG n04366367/
+mv val/ILSVRC2012_val_00003565.JPEG n04371430/
+mv val/ILSVRC2012_val_00003566.JPEG n12057211/
+mv val/ILSVRC2012_val_00003567.JPEG n02730930/
+mv val/ILSVRC2012_val_00003568.JPEG n03461385/
+mv val/ILSVRC2012_val_00003569.JPEG n01728572/
+mv val/ILSVRC2012_val_00003570.JPEG n01688243/
+mv val/ILSVRC2012_val_00003571.JPEG n04141975/
+mv val/ILSVRC2012_val_00003572.JPEG n02174001/
+mv val/ILSVRC2012_val_00003573.JPEG n04310018/
+mv val/ILSVRC2012_val_00003574.JPEG n02077923/
+mv val/ILSVRC2012_val_00003575.JPEG n02105505/
+mv val/ILSVRC2012_val_00003576.JPEG n03250847/
+mv val/ILSVRC2012_val_00003577.JPEG n01776313/
+mv val/ILSVRC2012_val_00003578.JPEG n04532106/
+mv val/ILSVRC2012_val_00003579.JPEG n02346627/
+mv val/ILSVRC2012_val_00003580.JPEG n04493381/
+mv val/ILSVRC2012_val_00003581.JPEG n07742313/
+mv val/ILSVRC2012_val_00003582.JPEG n04335435/
+mv val/ILSVRC2012_val_00003583.JPEG n02112018/
+mv val/ILSVRC2012_val_00003584.JPEG n02097298/
+mv val/ILSVRC2012_val_00003585.JPEG n04254120/
+mv val/ILSVRC2012_val_00003586.JPEG n02231487/
+mv val/ILSVRC2012_val_00003587.JPEG n03394916/
+mv val/ILSVRC2012_val_00003588.JPEG n01806143/
+mv val/ILSVRC2012_val_00003589.JPEG n04311004/
+mv val/ILSVRC2012_val_00003590.JPEG n03216828/
+mv val/ILSVRC2012_val_00003591.JPEG n07615774/
+mv val/ILSVRC2012_val_00003592.JPEG n07614500/
+mv val/ILSVRC2012_val_00003593.JPEG n07768694/
+mv val/ILSVRC2012_val_00003594.JPEG n07248320/
+mv val/ILSVRC2012_val_00003595.JPEG n03594734/
+mv val/ILSVRC2012_val_00003596.JPEG n04008634/
+mv val/ILSVRC2012_val_00003597.JPEG n02091134/
+mv val/ILSVRC2012_val_00003598.JPEG n02606052/
+mv val/ILSVRC2012_val_00003599.JPEG n04310018/
+mv val/ILSVRC2012_val_00003600.JPEG n07714990/
+mv val/ILSVRC2012_val_00003601.JPEG n01945685/
+mv val/ILSVRC2012_val_00003602.JPEG n02326432/
+mv val/ILSVRC2012_val_00003603.JPEG n01704323/
+mv val/ILSVRC2012_val_00003604.JPEG n01944390/
+mv val/ILSVRC2012_val_00003605.JPEG n01514668/
+mv val/ILSVRC2012_val_00003606.JPEG n01514668/
+mv val/ILSVRC2012_val_00003607.JPEG n01740131/
+mv val/ILSVRC2012_val_00003608.JPEG n04356056/
+mv val/ILSVRC2012_val_00003609.JPEG n03492542/
+mv val/ILSVRC2012_val_00003610.JPEG n02643566/
+mv val/ILSVRC2012_val_00003611.JPEG n03759954/
+mv val/ILSVRC2012_val_00003612.JPEG n03854065/
+mv val/ILSVRC2012_val_00003613.JPEG n03781244/
+mv val/ILSVRC2012_val_00003614.JPEG n03125729/
+mv val/ILSVRC2012_val_00003615.JPEG n02087394/
+mv val/ILSVRC2012_val_00003616.JPEG n02093754/
+mv val/ILSVRC2012_val_00003617.JPEG n02802426/
+mv val/ILSVRC2012_val_00003618.JPEG n03527444/
+mv val/ILSVRC2012_val_00003619.JPEG n07747607/
+mv val/ILSVRC2012_val_00003620.JPEG n03394916/
+mv val/ILSVRC2012_val_00003621.JPEG n01644373/
+mv val/ILSVRC2012_val_00003622.JPEG n02823428/
+mv val/ILSVRC2012_val_00003623.JPEG n02106550/
+mv val/ILSVRC2012_val_00003624.JPEG n03954731/
+mv val/ILSVRC2012_val_00003625.JPEG n01944390/
+mv val/ILSVRC2012_val_00003626.JPEG n09472597/
+mv val/ILSVRC2012_val_00003627.JPEG n03126707/
+mv val/ILSVRC2012_val_00003628.JPEG n02102973/
+mv val/ILSVRC2012_val_00003629.JPEG n03443371/
+mv val/ILSVRC2012_val_00003630.JPEG n03529860/
+mv val/ILSVRC2012_val_00003631.JPEG n02489166/
+mv val/ILSVRC2012_val_00003632.JPEG n04606251/
+mv val/ILSVRC2012_val_00003633.JPEG n04371774/
+mv val/ILSVRC2012_val_00003634.JPEG n03197337/
+mv val/ILSVRC2012_val_00003635.JPEG n04252225/
+mv val/ILSVRC2012_val_00003636.JPEG n01986214/
+mv val/ILSVRC2012_val_00003637.JPEG n03841143/
+mv val/ILSVRC2012_val_00003638.JPEG n02111129/
+mv val/ILSVRC2012_val_00003639.JPEG n04251144/
+mv val/ILSVRC2012_val_00003640.JPEG n02782093/
+mv val/ILSVRC2012_val_00003641.JPEG n03786901/
+mv val/ILSVRC2012_val_00003642.JPEG n04542943/
+mv val/ILSVRC2012_val_00003643.JPEG n03196217/
+mv val/ILSVRC2012_val_00003644.JPEG n01735189/
+mv val/ILSVRC2012_val_00003645.JPEG n03125729/
+mv val/ILSVRC2012_val_00003646.JPEG n02089867/
+mv val/ILSVRC2012_val_00003647.JPEG n04009552/
+mv val/ILSVRC2012_val_00003648.JPEG n02860847/
+mv val/ILSVRC2012_val_00003649.JPEG n02229544/
+mv val/ILSVRC2012_val_00003650.JPEG n01871265/
+mv val/ILSVRC2012_val_00003651.JPEG n03930313/
+mv val/ILSVRC2012_val_00003652.JPEG n04296562/
+mv val/ILSVRC2012_val_00003653.JPEG n03388549/
+mv val/ILSVRC2012_val_00003654.JPEG n02437616/
+mv val/ILSVRC2012_val_00003655.JPEG n02423022/
+mv val/ILSVRC2012_val_00003656.JPEG n02190166/
+mv val/ILSVRC2012_val_00003657.JPEG n04522168/
+mv val/ILSVRC2012_val_00003658.JPEG n04136333/
+mv val/ILSVRC2012_val_00003659.JPEG n02009229/
+mv val/ILSVRC2012_val_00003660.JPEG n07716358/
+mv val/ILSVRC2012_val_00003661.JPEG n01798484/
+mv val/ILSVRC2012_val_00003662.JPEG n01990800/
+mv val/ILSVRC2012_val_00003663.JPEG n04525038/
+mv val/ILSVRC2012_val_00003664.JPEG n07754684/
+mv val/ILSVRC2012_val_00003665.JPEG n01582220/
+mv val/ILSVRC2012_val_00003666.JPEG n03673027/
+mv val/ILSVRC2012_val_00003667.JPEG n02977058/
+mv val/ILSVRC2012_val_00003668.JPEG n04317175/
+mv val/ILSVRC2012_val_00003669.JPEG n03495258/
+mv val/ILSVRC2012_val_00003670.JPEG n02692877/
+mv val/ILSVRC2012_val_00003671.JPEG n02089973/
+mv val/ILSVRC2012_val_00003672.JPEG n01843065/
+mv val/ILSVRC2012_val_00003673.JPEG n03584254/
+mv val/ILSVRC2012_val_00003674.JPEG n02802426/
+mv val/ILSVRC2012_val_00003675.JPEG n02364673/
+mv val/ILSVRC2012_val_00003676.JPEG n01807496/
+mv val/ILSVRC2012_val_00003677.JPEG n02172182/
+mv val/ILSVRC2012_val_00003678.JPEG n03742115/
+mv val/ILSVRC2012_val_00003679.JPEG n02687172/
+mv val/ILSVRC2012_val_00003680.JPEG n02769748/
+mv val/ILSVRC2012_val_00003681.JPEG n07716358/
+mv val/ILSVRC2012_val_00003682.JPEG n03028079/
+mv val/ILSVRC2012_val_00003683.JPEG n02107142/
+mv val/ILSVRC2012_val_00003684.JPEG n02749479/
+mv val/ILSVRC2012_val_00003685.JPEG n02417914/
+mv val/ILSVRC2012_val_00003686.JPEG n04296562/
+mv val/ILSVRC2012_val_00003687.JPEG n01829413/
+mv val/ILSVRC2012_val_00003688.JPEG n01698640/
+mv val/ILSVRC2012_val_00003689.JPEG n03935335/
+mv val/ILSVRC2012_val_00003690.JPEG n02096294/
+mv val/ILSVRC2012_val_00003691.JPEG n02112706/
+mv val/ILSVRC2012_val_00003692.JPEG n02692877/
+mv val/ILSVRC2012_val_00003693.JPEG n01740131/
+mv val/ILSVRC2012_val_00003694.JPEG n07754684/
+mv val/ILSVRC2012_val_00003695.JPEG n04136333/
+mv val/ILSVRC2012_val_00003696.JPEG n02112137/
+mv val/ILSVRC2012_val_00003697.JPEG n02326432/
+mv val/ILSVRC2012_val_00003698.JPEG n02113624/
+mv val/ILSVRC2012_val_00003699.JPEG n07715103/
+mv val/ILSVRC2012_val_00003700.JPEG n02484975/
+mv val/ILSVRC2012_val_00003701.JPEG n03781244/
+mv val/ILSVRC2012_val_00003702.JPEG n01630670/
+mv val/ILSVRC2012_val_00003703.JPEG n02701002/
+mv val/ILSVRC2012_val_00003704.JPEG n03776460/
+mv val/ILSVRC2012_val_00003705.JPEG n01978455/
+mv val/ILSVRC2012_val_00003706.JPEG n01755581/
+mv val/ILSVRC2012_val_00003707.JPEG n01819313/
+mv val/ILSVRC2012_val_00003708.JPEG n03838899/
+mv val/ILSVRC2012_val_00003709.JPEG n04146614/
+mv val/ILSVRC2012_val_00003710.JPEG n04251144/
+mv val/ILSVRC2012_val_00003711.JPEG n02113023/
+mv val/ILSVRC2012_val_00003712.JPEG n02483362/
+mv val/ILSVRC2012_val_00003713.JPEG n04456115/
+mv val/ILSVRC2012_val_00003714.JPEG n02101006/
+mv val/ILSVRC2012_val_00003715.JPEG n02992211/
+mv val/ILSVRC2012_val_00003716.JPEG n02037110/
+mv val/ILSVRC2012_val_00003717.JPEG n03045698/
+mv val/ILSVRC2012_val_00003718.JPEG n02963159/
+mv val/ILSVRC2012_val_00003719.JPEG n03249569/
+mv val/ILSVRC2012_val_00003720.JPEG n06359193/
+mv val/ILSVRC2012_val_00003721.JPEG n03196217/
+mv val/ILSVRC2012_val_00003722.JPEG n01693334/
+mv val/ILSVRC2012_val_00003723.JPEG n02085936/
+mv val/ILSVRC2012_val_00003724.JPEG n03697007/
+mv val/ILSVRC2012_val_00003725.JPEG n02092002/
+mv val/ILSVRC2012_val_00003726.JPEG n02099712/
+mv val/ILSVRC2012_val_00003727.JPEG n02793495/
+mv val/ILSVRC2012_val_00003728.JPEG n03710721/
+mv val/ILSVRC2012_val_00003729.JPEG n02102318/
+mv val/ILSVRC2012_val_00003730.JPEG n03895866/
+mv val/ILSVRC2012_val_00003731.JPEG n02097209/
+mv val/ILSVRC2012_val_00003732.JPEG n03127747/
+mv val/ILSVRC2012_val_00003733.JPEG n01950731/
+mv val/ILSVRC2012_val_00003734.JPEG n02106166/
+mv val/ILSVRC2012_val_00003735.JPEG n01443537/
+mv val/ILSVRC2012_val_00003736.JPEG n03372029/
+mv val/ILSVRC2012_val_00003737.JPEG n04229816/
+mv val/ILSVRC2012_val_00003738.JPEG n01990800/
+mv val/ILSVRC2012_val_00003739.JPEG n04258138/
+mv val/ILSVRC2012_val_00003740.JPEG n03637318/
+mv val/ILSVRC2012_val_00003741.JPEG n03633091/
+mv val/ILSVRC2012_val_00003742.JPEG n03770439/
+mv val/ILSVRC2012_val_00003743.JPEG n01818515/
+mv val/ILSVRC2012_val_00003744.JPEG n04069434/
+mv val/ILSVRC2012_val_00003745.JPEG n02110063/
+mv val/ILSVRC2012_val_00003746.JPEG n01664065/
+mv val/ILSVRC2012_val_00003747.JPEG n02504458/
+mv val/ILSVRC2012_val_00003748.JPEG n01641577/
+mv val/ILSVRC2012_val_00003749.JPEG n04562935/
+mv val/ILSVRC2012_val_00003750.JPEG n03825788/
+mv val/ILSVRC2012_val_00003751.JPEG n03873416/
+mv val/ILSVRC2012_val_00003752.JPEG n02484975/
+mv val/ILSVRC2012_val_00003753.JPEG n01984695/
+mv val/ILSVRC2012_val_00003754.JPEG n03761084/
+mv val/ILSVRC2012_val_00003755.JPEG n02892201/
+mv val/ILSVRC2012_val_00003756.JPEG n04392985/
+mv val/ILSVRC2012_val_00003757.JPEG n04357314/
+mv val/ILSVRC2012_val_00003758.JPEG n02097130/
+mv val/ILSVRC2012_val_00003759.JPEG n03394916/
+mv val/ILSVRC2012_val_00003760.JPEG n03124170/
+mv val/ILSVRC2012_val_00003761.JPEG n03938244/
+mv val/ILSVRC2012_val_00003762.JPEG n01582220/
+mv val/ILSVRC2012_val_00003763.JPEG n04133789/
+mv val/ILSVRC2012_val_00003764.JPEG n07871810/
+mv val/ILSVRC2012_val_00003765.JPEG n02114855/
+mv val/ILSVRC2012_val_00003766.JPEG n02445715/
+mv val/ILSVRC2012_val_00003767.JPEG n03017168/
+mv val/ILSVRC2012_val_00003768.JPEG n01729977/
+mv val/ILSVRC2012_val_00003769.JPEG n02101006/
+mv val/ILSVRC2012_val_00003770.JPEG n04153751/
+mv val/ILSVRC2012_val_00003771.JPEG n07730033/
+mv val/ILSVRC2012_val_00003772.JPEG n02802426/
+mv val/ILSVRC2012_val_00003773.JPEG n02130308/
+mv val/ILSVRC2012_val_00003774.JPEG n02096585/
+mv val/ILSVRC2012_val_00003775.JPEG n01860187/
+mv val/ILSVRC2012_val_00003776.JPEG n01980166/
+mv val/ILSVRC2012_val_00003777.JPEG n02825657/
+mv val/ILSVRC2012_val_00003778.JPEG n03450230/
+mv val/ILSVRC2012_val_00003779.JPEG n04037443/
+mv val/ILSVRC2012_val_00003780.JPEG n04090263/
+mv val/ILSVRC2012_val_00003781.JPEG n02361337/
+mv val/ILSVRC2012_val_00003782.JPEG n02823750/
+mv val/ILSVRC2012_val_00003783.JPEG n02843684/
+mv val/ILSVRC2012_val_00003784.JPEG n03372029/
+mv val/ILSVRC2012_val_00003785.JPEG n01749939/
+mv val/ILSVRC2012_val_00003786.JPEG n02808440/
+mv val/ILSVRC2012_val_00003787.JPEG n03384352/
+mv val/ILSVRC2012_val_00003788.JPEG n02129165/
+mv val/ILSVRC2012_val_00003789.JPEG n02095570/
+mv val/ILSVRC2012_val_00003790.JPEG n02916936/
+mv val/ILSVRC2012_val_00003791.JPEG n02098105/
+mv val/ILSVRC2012_val_00003792.JPEG n02093256/
+mv val/ILSVRC2012_val_00003793.JPEG n03445777/
+mv val/ILSVRC2012_val_00003794.JPEG n02111500/
+mv val/ILSVRC2012_val_00003795.JPEG n04553703/
+mv val/ILSVRC2012_val_00003796.JPEG n03871628/
+mv val/ILSVRC2012_val_00003797.JPEG n03876231/
+mv val/ILSVRC2012_val_00003798.JPEG n03062245/
+mv val/ILSVRC2012_val_00003799.JPEG n03207941/
+mv val/ILSVRC2012_val_00003800.JPEG n04428191/
+mv val/ILSVRC2012_val_00003801.JPEG n02408429/
+mv val/ILSVRC2012_val_00003802.JPEG n04005630/
+mv val/ILSVRC2012_val_00003803.JPEG n02777292/
+mv val/ILSVRC2012_val_00003804.JPEG n03877845/
+mv val/ILSVRC2012_val_00003805.JPEG n04599235/
+mv val/ILSVRC2012_val_00003806.JPEG n02514041/
+mv val/ILSVRC2012_val_00003807.JPEG n04081281/
+mv val/ILSVRC2012_val_00003808.JPEG n02111889/
+mv val/ILSVRC2012_val_00003809.JPEG n03208938/
+mv val/ILSVRC2012_val_00003810.JPEG n02105855/
+mv val/ILSVRC2012_val_00003811.JPEG n10565667/
+mv val/ILSVRC2012_val_00003812.JPEG n02493793/
+mv val/ILSVRC2012_val_00003813.JPEG n02676566/
+mv val/ILSVRC2012_val_00003814.JPEG n02219486/
+mv val/ILSVRC2012_val_00003815.JPEG n04147183/
+mv val/ILSVRC2012_val_00003816.JPEG n01531178/
+mv val/ILSVRC2012_val_00003817.JPEG n04542943/
+mv val/ILSVRC2012_val_00003818.JPEG n02492660/
+mv val/ILSVRC2012_val_00003819.JPEG n04235860/
+mv val/ILSVRC2012_val_00003820.JPEG n02321529/
+mv val/ILSVRC2012_val_00003821.JPEG n01687978/
+mv val/ILSVRC2012_val_00003822.JPEG n02066245/
+mv val/ILSVRC2012_val_00003823.JPEG n01818515/
+mv val/ILSVRC2012_val_00003824.JPEG n03461385/
+mv val/ILSVRC2012_val_00003825.JPEG n03710637/
+mv val/ILSVRC2012_val_00003826.JPEG n03854065/
+mv val/ILSVRC2012_val_00003827.JPEG n01872401/
+mv val/ILSVRC2012_val_00003828.JPEG n01847000/
+mv val/ILSVRC2012_val_00003829.JPEG n03690938/
+mv val/ILSVRC2012_val_00003830.JPEG n06596364/
+mv val/ILSVRC2012_val_00003831.JPEG n07932039/
+mv val/ILSVRC2012_val_00003832.JPEG n02102973/
+mv val/ILSVRC2012_val_00003833.JPEG n01806567/
+mv val/ILSVRC2012_val_00003834.JPEG n02106382/
+mv val/ILSVRC2012_val_00003835.JPEG n15075141/
+mv val/ILSVRC2012_val_00003836.JPEG n02109047/
+mv val/ILSVRC2012_val_00003837.JPEG n02087394/
+mv val/ILSVRC2012_val_00003838.JPEG n01774750/
+mv val/ILSVRC2012_val_00003839.JPEG n02128385/
+mv val/ILSVRC2012_val_00003840.JPEG n07871810/
+mv val/ILSVRC2012_val_00003841.JPEG n02086240/
+mv val/ILSVRC2012_val_00003842.JPEG n04209239/
+mv val/ILSVRC2012_val_00003843.JPEG n07749582/
+mv val/ILSVRC2012_val_00003844.JPEG n04392985/
+mv val/ILSVRC2012_val_00003845.JPEG n02058221/
+mv val/ILSVRC2012_val_00003846.JPEG n01644373/
+mv val/ILSVRC2012_val_00003847.JPEG n03127925/
+mv val/ILSVRC2012_val_00003848.JPEG n03690938/
+mv val/ILSVRC2012_val_00003849.JPEG n04485082/
+mv val/ILSVRC2012_val_00003850.JPEG n03388183/
+mv val/ILSVRC2012_val_00003851.JPEG n02110627/
+mv val/ILSVRC2012_val_00003852.JPEG n02165105/
+mv val/ILSVRC2012_val_00003853.JPEG n03785016/
+mv val/ILSVRC2012_val_00003854.JPEG n02259212/
+mv val/ILSVRC2012_val_00003855.JPEG n02108915/
+mv val/ILSVRC2012_val_00003856.JPEG n02099267/
+mv val/ILSVRC2012_val_00003857.JPEG n04044716/
+mv val/ILSVRC2012_val_00003858.JPEG n01990800/
+mv val/ILSVRC2012_val_00003859.JPEG n01986214/
+mv val/ILSVRC2012_val_00003860.JPEG n01632777/
+mv val/ILSVRC2012_val_00003861.JPEG n01580077/
+mv val/ILSVRC2012_val_00003862.JPEG n02106030/
+mv val/ILSVRC2012_val_00003863.JPEG n01632458/
+mv val/ILSVRC2012_val_00003864.JPEG n03337140/
+mv val/ILSVRC2012_val_00003865.JPEG n01695060/
+mv val/ILSVRC2012_val_00003866.JPEG n09399592/
+mv val/ILSVRC2012_val_00003867.JPEG n04116512/
+mv val/ILSVRC2012_val_00003868.JPEG n03443371/
+mv val/ILSVRC2012_val_00003869.JPEG n02097658/
+mv val/ILSVRC2012_val_00003870.JPEG n04039381/
+mv val/ILSVRC2012_val_00003871.JPEG n02422699/
+mv val/ILSVRC2012_val_00003872.JPEG n02105855/
+mv val/ILSVRC2012_val_00003873.JPEG n03792782/
+mv val/ILSVRC2012_val_00003874.JPEG n02229544/
+mv val/ILSVRC2012_val_00003875.JPEG n01950731/
+mv val/ILSVRC2012_val_00003876.JPEG n02256656/
+mv val/ILSVRC2012_val_00003877.JPEG n03916031/
+mv val/ILSVRC2012_val_00003878.JPEG n01534433/
+mv val/ILSVRC2012_val_00003879.JPEG n03791053/
+mv val/ILSVRC2012_val_00003880.JPEG n04200800/
+mv val/ILSVRC2012_val_00003881.JPEG n03314780/
+mv val/ILSVRC2012_val_00003882.JPEG n04120489/
+mv val/ILSVRC2012_val_00003883.JPEG n04584207/
+mv val/ILSVRC2012_val_00003884.JPEG n01820546/
+mv val/ILSVRC2012_val_00003885.JPEG n04125021/
+mv val/ILSVRC2012_val_00003886.JPEG n02930766/
+mv val/ILSVRC2012_val_00003887.JPEG n02093647/
+mv val/ILSVRC2012_val_00003888.JPEG n02910353/
+mv val/ILSVRC2012_val_00003889.JPEG n03452741/
+mv val/ILSVRC2012_val_00003890.JPEG n03482405/
+mv val/ILSVRC2012_val_00003891.JPEG n04380533/
+mv val/ILSVRC2012_val_00003892.JPEG n01622779/
+mv val/ILSVRC2012_val_00003893.JPEG n07768694/
+mv val/ILSVRC2012_val_00003894.JPEG n03042490/
+mv val/ILSVRC2012_val_00003895.JPEG n03461385/
+mv val/ILSVRC2012_val_00003896.JPEG n04285008/
+mv val/ILSVRC2012_val_00003897.JPEG n04540053/
+mv val/ILSVRC2012_val_00003898.JPEG n02099267/
+mv val/ILSVRC2012_val_00003899.JPEG n12057211/
+mv val/ILSVRC2012_val_00003900.JPEG n04118776/
+mv val/ILSVRC2012_val_00003901.JPEG n04162706/
+mv val/ILSVRC2012_val_00003902.JPEG n12620546/
+mv val/ILSVRC2012_val_00003903.JPEG n01534433/
+mv val/ILSVRC2012_val_00003904.JPEG n01675722/
+mv val/ILSVRC2012_val_00003905.JPEG n02089078/
+mv val/ILSVRC2012_val_00003906.JPEG n03290653/
+mv val/ILSVRC2012_val_00003907.JPEG n02883205/
+mv val/ILSVRC2012_val_00003908.JPEG n07697537/
+mv val/ILSVRC2012_val_00003909.JPEG n03393912/
+mv val/ILSVRC2012_val_00003910.JPEG n02113186/
+mv val/ILSVRC2012_val_00003911.JPEG n03014705/
+mv val/ILSVRC2012_val_00003912.JPEG n04435653/
+mv val/ILSVRC2012_val_00003913.JPEG n03590841/
+mv val/ILSVRC2012_val_00003914.JPEG n03773504/
+mv val/ILSVRC2012_val_00003915.JPEG n02782093/
+mv val/ILSVRC2012_val_00003916.JPEG n02980441/
+mv val/ILSVRC2012_val_00003917.JPEG n04239074/
+mv val/ILSVRC2012_val_00003918.JPEG n04228054/
+mv val/ILSVRC2012_val_00003919.JPEG n03877845/
+mv val/ILSVRC2012_val_00003920.JPEG n04023962/
+mv val/ILSVRC2012_val_00003921.JPEG n04404412/
+mv val/ILSVRC2012_val_00003922.JPEG n02088238/
+mv val/ILSVRC2012_val_00003923.JPEG n03617480/
+mv val/ILSVRC2012_val_00003924.JPEG n03670208/
+mv val/ILSVRC2012_val_00003925.JPEG n09229709/
+mv val/ILSVRC2012_val_00003926.JPEG n02971356/
+mv val/ILSVRC2012_val_00003927.JPEG n04553703/
+mv val/ILSVRC2012_val_00003928.JPEG n01748264/
+mv val/ILSVRC2012_val_00003929.JPEG n02091467/
+mv val/ILSVRC2012_val_00003930.JPEG n07697537/
+mv val/ILSVRC2012_val_00003931.JPEG n02113186/
+mv val/ILSVRC2012_val_00003932.JPEG n07615774/
+mv val/ILSVRC2012_val_00003933.JPEG n02328150/
+mv val/ILSVRC2012_val_00003934.JPEG n02883205/
+mv val/ILSVRC2012_val_00003935.JPEG n07579787/
+mv val/ILSVRC2012_val_00003936.JPEG n01514668/
+mv val/ILSVRC2012_val_00003937.JPEG n03877845/
+mv val/ILSVRC2012_val_00003938.JPEG n02108915/
+mv val/ILSVRC2012_val_00003939.JPEG n07760859/
+mv val/ILSVRC2012_val_00003940.JPEG n02125311/
+mv val/ILSVRC2012_val_00003941.JPEG n03899768/
+mv val/ILSVRC2012_val_00003942.JPEG n01924916/
+mv val/ILSVRC2012_val_00003943.JPEG n02487347/
+mv val/ILSVRC2012_val_00003944.JPEG n02979186/
+mv val/ILSVRC2012_val_00003945.JPEG n03594945/
+mv val/ILSVRC2012_val_00003946.JPEG n03895866/
+mv val/ILSVRC2012_val_00003947.JPEG n02441942/
+mv val/ILSVRC2012_val_00003948.JPEG n13040303/
+mv val/ILSVRC2012_val_00003949.JPEG n03710193/
+mv val/ILSVRC2012_val_00003950.JPEG n03709823/
+mv val/ILSVRC2012_val_00003951.JPEG n03544143/
+mv val/ILSVRC2012_val_00003952.JPEG n02843684/
+mv val/ILSVRC2012_val_00003953.JPEG n02085782/
+mv val/ILSVRC2012_val_00003954.JPEG n02088466/
+mv val/ILSVRC2012_val_00003955.JPEG n01910747/
+mv val/ILSVRC2012_val_00003956.JPEG n04599235/
+mv val/ILSVRC2012_val_00003957.JPEG n01847000/
+mv val/ILSVRC2012_val_00003958.JPEG n02423022/
+mv val/ILSVRC2012_val_00003959.JPEG n03476991/
+mv val/ILSVRC2012_val_00003960.JPEG n02690373/
+mv val/ILSVRC2012_val_00003961.JPEG n07730033/
+mv val/ILSVRC2012_val_00003962.JPEG n03733281/
+mv val/ILSVRC2012_val_00003963.JPEG n02129604/
+mv val/ILSVRC2012_val_00003964.JPEG n02027492/
+mv val/ILSVRC2012_val_00003965.JPEG n04443257/
+mv val/ILSVRC2012_val_00003966.JPEG n03977966/
+mv val/ILSVRC2012_val_00003967.JPEG n03992509/
+mv val/ILSVRC2012_val_00003968.JPEG n02108422/
+mv val/ILSVRC2012_val_00003969.JPEG n07875152/
+mv val/ILSVRC2012_val_00003970.JPEG n03793489/
+mv val/ILSVRC2012_val_00003971.JPEG n03127925/
+mv val/ILSVRC2012_val_00003972.JPEG n04579145/
+mv val/ILSVRC2012_val_00003973.JPEG n02395406/
+mv val/ILSVRC2012_val_00003974.JPEG n02119022/
+mv val/ILSVRC2012_val_00003975.JPEG n03706229/
+mv val/ILSVRC2012_val_00003976.JPEG n03902125/
+mv val/ILSVRC2012_val_00003977.JPEG n03777568/
+mv val/ILSVRC2012_val_00003978.JPEG n02125311/
+mv val/ILSVRC2012_val_00003979.JPEG n04458633/
+mv val/ILSVRC2012_val_00003980.JPEG n02672831/
+mv val/ILSVRC2012_val_00003981.JPEG n01784675/
+mv val/ILSVRC2012_val_00003982.JPEG n02138441/
+mv val/ILSVRC2012_val_00003983.JPEG n04328186/
+mv val/ILSVRC2012_val_00003984.JPEG n02120505/
+mv val/ILSVRC2012_val_00003985.JPEG n01644373/
+mv val/ILSVRC2012_val_00003986.JPEG n03544143/
+mv val/ILSVRC2012_val_00003987.JPEG n01818515/
+mv val/ILSVRC2012_val_00003988.JPEG n03877472/
+mv val/ILSVRC2012_val_00003989.JPEG n04044716/
+mv val/ILSVRC2012_val_00003990.JPEG n04009552/
+mv val/ILSVRC2012_val_00003991.JPEG n03220513/
+mv val/ILSVRC2012_val_00003992.JPEG n04067472/
+mv val/ILSVRC2012_val_00003993.JPEG n02172182/
+mv val/ILSVRC2012_val_00003994.JPEG n02823750/
+mv val/ILSVRC2012_val_00003995.JPEG n02317335/
+mv val/ILSVRC2012_val_00003996.JPEG n04467665/
+mv val/ILSVRC2012_val_00003997.JPEG n02229544/
+mv val/ILSVRC2012_val_00003998.JPEG n04049303/
+mv val/ILSVRC2012_val_00003999.JPEG n02116738/
+mv val/ILSVRC2012_val_00004000.JPEG n07584110/
+mv val/ILSVRC2012_val_00004001.JPEG n02018795/
+mv val/ILSVRC2012_val_00004002.JPEG n03930313/
+mv val/ILSVRC2012_val_00004003.JPEG n02480495/
+mv val/ILSVRC2012_val_00004004.JPEG n02172182/
+mv val/ILSVRC2012_val_00004005.JPEG n09399592/
+mv val/ILSVRC2012_val_00004006.JPEG n01530575/
+mv val/ILSVRC2012_val_00004007.JPEG n02971356/
+mv val/ILSVRC2012_val_00004008.JPEG n02105641/
+mv val/ILSVRC2012_val_00004009.JPEG n01698640/
+mv val/ILSVRC2012_val_00004010.JPEG n04553703/
+mv val/ILSVRC2012_val_00004011.JPEG n02280649/
+mv val/ILSVRC2012_val_00004012.JPEG n01807496/
+mv val/ILSVRC2012_val_00004013.JPEG n02504458/
+mv val/ILSVRC2012_val_00004014.JPEG n03617480/
+mv val/ILSVRC2012_val_00004015.JPEG n03884397/
+mv val/ILSVRC2012_val_00004016.JPEG n02011460/
+mv val/ILSVRC2012_val_00004017.JPEG n02704792/
+mv val/ILSVRC2012_val_00004018.JPEG n03393912/
+mv val/ILSVRC2012_val_00004019.JPEG n01667114/
+mv val/ILSVRC2012_val_00004020.JPEG n03598930/
+mv val/ILSVRC2012_val_00004021.JPEG n01775062/
+mv val/ILSVRC2012_val_00004022.JPEG n07717410/
+mv val/ILSVRC2012_val_00004023.JPEG n04118776/
+mv val/ILSVRC2012_val_00004024.JPEG n03218198/
+mv val/ILSVRC2012_val_00004025.JPEG n03255030/
+mv val/ILSVRC2012_val_00004026.JPEG n02111129/
+mv val/ILSVRC2012_val_00004027.JPEG n02892201/
+mv val/ILSVRC2012_val_00004028.JPEG n03444034/
+mv val/ILSVRC2012_val_00004029.JPEG n03692522/
+mv val/ILSVRC2012_val_00004030.JPEG n02364673/
+mv val/ILSVRC2012_val_00004031.JPEG n07718747/
+mv val/ILSVRC2012_val_00004032.JPEG n04418357/
+mv val/ILSVRC2012_val_00004033.JPEG n04235860/
+mv val/ILSVRC2012_val_00004034.JPEG n03000684/
+mv val/ILSVRC2012_val_00004035.JPEG n03929660/
+mv val/ILSVRC2012_val_00004036.JPEG n03670208/
+mv val/ILSVRC2012_val_00004037.JPEG n01560419/
+mv val/ILSVRC2012_val_00004038.JPEG n02494079/
+mv val/ILSVRC2012_val_00004039.JPEG n03197337/
+mv val/ILSVRC2012_val_00004040.JPEG n01737021/
+mv val/ILSVRC2012_val_00004041.JPEG n07697313/
+mv val/ILSVRC2012_val_00004042.JPEG n02127052/
+mv val/ILSVRC2012_val_00004043.JPEG n03764736/
+mv val/ILSVRC2012_val_00004044.JPEG n04270147/
+mv val/ILSVRC2012_val_00004045.JPEG n02097474/
+mv val/ILSVRC2012_val_00004046.JPEG n04204347/
+mv val/ILSVRC2012_val_00004047.JPEG n03291819/
+mv val/ILSVRC2012_val_00004048.JPEG n03134739/
+mv val/ILSVRC2012_val_00004049.JPEG n02086240/
+mv val/ILSVRC2012_val_00004050.JPEG n03691459/
+mv val/ILSVRC2012_val_00004051.JPEG n01924916/
+mv val/ILSVRC2012_val_00004052.JPEG n04550184/
+mv val/ILSVRC2012_val_00004053.JPEG n02093754/
+mv val/ILSVRC2012_val_00004054.JPEG n03110669/
+mv val/ILSVRC2012_val_00004055.JPEG n02643566/
+mv val/ILSVRC2012_val_00004056.JPEG n02108422/
+mv val/ILSVRC2012_val_00004057.JPEG n02795169/
+mv val/ILSVRC2012_val_00004058.JPEG n02483362/
+mv val/ILSVRC2012_val_00004059.JPEG n03983396/
+mv val/ILSVRC2012_val_00004060.JPEG n02093647/
+mv val/ILSVRC2012_val_00004061.JPEG n02815834/
+mv val/ILSVRC2012_val_00004062.JPEG n04069434/
+mv val/ILSVRC2012_val_00004063.JPEG n03930313/
+mv val/ILSVRC2012_val_00004064.JPEG n02326432/
+mv val/ILSVRC2012_val_00004065.JPEG n02086079/
+mv val/ILSVRC2012_val_00004066.JPEG n03958227/
+mv val/ILSVRC2012_val_00004067.JPEG n04258138/
+mv val/ILSVRC2012_val_00004068.JPEG n03498962/
+mv val/ILSVRC2012_val_00004069.JPEG n03697007/
+mv val/ILSVRC2012_val_00004070.JPEG n03126707/
+mv val/ILSVRC2012_val_00004071.JPEG n02980441/
+mv val/ILSVRC2012_val_00004072.JPEG n03530642/
+mv val/ILSVRC2012_val_00004073.JPEG n02086910/
+mv val/ILSVRC2012_val_00004074.JPEG n02087394/
+mv val/ILSVRC2012_val_00004075.JPEG n02280649/
+mv val/ILSVRC2012_val_00004076.JPEG n04285008/
+mv val/ILSVRC2012_val_00004077.JPEG n02093256/
+mv val/ILSVRC2012_val_00004078.JPEG n01950731/
+mv val/ILSVRC2012_val_00004079.JPEG n03733131/
+mv val/ILSVRC2012_val_00004080.JPEG n04277352/
+mv val/ILSVRC2012_val_00004081.JPEG n02086240/
+mv val/ILSVRC2012_val_00004082.JPEG n03544143/
+mv val/ILSVRC2012_val_00004083.JPEG n03782006/
+mv val/ILSVRC2012_val_00004084.JPEG n01632777/
+mv val/ILSVRC2012_val_00004085.JPEG n02086646/
+mv val/ILSVRC2012_val_00004086.JPEG n03297495/
+mv val/ILSVRC2012_val_00004087.JPEG n09246464/
+mv val/ILSVRC2012_val_00004088.JPEG n02123597/
+mv val/ILSVRC2012_val_00004089.JPEG n02687172/
+mv val/ILSVRC2012_val_00004090.JPEG n04487081/
+mv val/ILSVRC2012_val_00004091.JPEG n02236044/
+mv val/ILSVRC2012_val_00004092.JPEG n03710193/
+mv val/ILSVRC2012_val_00004093.JPEG n02607072/
+mv val/ILSVRC2012_val_00004094.JPEG n02788148/
+mv val/ILSVRC2012_val_00004095.JPEG n01776313/
+mv val/ILSVRC2012_val_00004096.JPEG n04376876/
+mv val/ILSVRC2012_val_00004097.JPEG n02102973/
+mv val/ILSVRC2012_val_00004098.JPEG n07873807/
+mv val/ILSVRC2012_val_00004099.JPEG n03372029/
+mv val/ILSVRC2012_val_00004100.JPEG n02104029/
+mv val/ILSVRC2012_val_00004101.JPEG n02669723/
+mv val/ILSVRC2012_val_00004102.JPEG n01693334/
+mv val/ILSVRC2012_val_00004103.JPEG n12985857/
+mv val/ILSVRC2012_val_00004104.JPEG n03785016/
+mv val/ILSVRC2012_val_00004105.JPEG n02066245/
+mv val/ILSVRC2012_val_00004106.JPEG n01698640/
+mv val/ILSVRC2012_val_00004107.JPEG n04086273/
+mv val/ILSVRC2012_val_00004108.JPEG n03047690/
+mv val/ILSVRC2012_val_00004109.JPEG n04026417/
+mv val/ILSVRC2012_val_00004110.JPEG n01773797/
+mv val/ILSVRC2012_val_00004111.JPEG n03742115/
+mv val/ILSVRC2012_val_00004112.JPEG n02018207/
+mv val/ILSVRC2012_val_00004113.JPEG n01978455/
+mv val/ILSVRC2012_val_00004114.JPEG n02988304/
+mv val/ILSVRC2012_val_00004115.JPEG n03595614/
+mv val/ILSVRC2012_val_00004116.JPEG n02965783/
+mv val/ILSVRC2012_val_00004117.JPEG n02992529/
+mv val/ILSVRC2012_val_00004118.JPEG n01773157/
+mv val/ILSVRC2012_val_00004119.JPEG n03417042/
+mv val/ILSVRC2012_val_00004120.JPEG n03376595/
+mv val/ILSVRC2012_val_00004121.JPEG n04435653/
+mv val/ILSVRC2012_val_00004122.JPEG n07711569/
+mv val/ILSVRC2012_val_00004123.JPEG n03970156/
+mv val/ILSVRC2012_val_00004124.JPEG n02877765/
+mv val/ILSVRC2012_val_00004125.JPEG n04111531/
+mv val/ILSVRC2012_val_00004126.JPEG n09256479/
+mv val/ILSVRC2012_val_00004127.JPEG n02641379/
+mv val/ILSVRC2012_val_00004128.JPEG n04179913/
+mv val/ILSVRC2012_val_00004129.JPEG n02113023/
+mv val/ILSVRC2012_val_00004130.JPEG n03977966/
+mv val/ILSVRC2012_val_00004131.JPEG n04525038/
+mv val/ILSVRC2012_val_00004132.JPEG n02190166/
+mv val/ILSVRC2012_val_00004133.JPEG n04070727/
+mv val/ILSVRC2012_val_00004134.JPEG n02111277/
+mv val/ILSVRC2012_val_00004135.JPEG n02128757/
+mv val/ILSVRC2012_val_00004136.JPEG n01784675/
+mv val/ILSVRC2012_val_00004137.JPEG n02412080/
+mv val/ILSVRC2012_val_00004138.JPEG n03146219/
+mv val/ILSVRC2012_val_00004139.JPEG n03485794/
+mv val/ILSVRC2012_val_00004140.JPEG n01773157/
+mv val/ILSVRC2012_val_00004141.JPEG n02119022/
+mv val/ILSVRC2012_val_00004142.JPEG n02704792/
+mv val/ILSVRC2012_val_00004143.JPEG n01737021/
+mv val/ILSVRC2012_val_00004144.JPEG n03697007/
+mv val/ILSVRC2012_val_00004145.JPEG n03450230/
+mv val/ILSVRC2012_val_00004146.JPEG n01770081/
+mv val/ILSVRC2012_val_00004147.JPEG n03792782/
+mv val/ILSVRC2012_val_00004148.JPEG n02089867/
+mv val/ILSVRC2012_val_00004149.JPEG n02817516/
+mv val/ILSVRC2012_val_00004150.JPEG n03141823/
+mv val/ILSVRC2012_val_00004151.JPEG n01773157/
+mv val/ILSVRC2012_val_00004152.JPEG n07860988/
+mv val/ILSVRC2012_val_00004153.JPEG n02317335/
+mv val/ILSVRC2012_val_00004154.JPEG n04442312/
+mv val/ILSVRC2012_val_00004155.JPEG n04428191/
+mv val/ILSVRC2012_val_00004156.JPEG n04049303/
+mv val/ILSVRC2012_val_00004157.JPEG n12620546/
+mv val/ILSVRC2012_val_00004158.JPEG n04591157/
+mv val/ILSVRC2012_val_00004159.JPEG n03980874/
+mv val/ILSVRC2012_val_00004160.JPEG n03314780/
+mv val/ILSVRC2012_val_00004161.JPEG n02514041/
+mv val/ILSVRC2012_val_00004162.JPEG n03376595/
+mv val/ILSVRC2012_val_00004163.JPEG n01774384/
+mv val/ILSVRC2012_val_00004164.JPEG n01774384/
+mv val/ILSVRC2012_val_00004165.JPEG n04579432/
+mv val/ILSVRC2012_val_00004166.JPEG n04336792/
+mv val/ILSVRC2012_val_00004167.JPEG n01872401/
+mv val/ILSVRC2012_val_00004168.JPEG n02483708/
+mv val/ILSVRC2012_val_00004169.JPEG n03127925/
+mv val/ILSVRC2012_val_00004170.JPEG n03314780/
+mv val/ILSVRC2012_val_00004171.JPEG n03843555/
+mv val/ILSVRC2012_val_00004172.JPEG n01770081/
+mv val/ILSVRC2012_val_00004173.JPEG n02480855/
+mv val/ILSVRC2012_val_00004174.JPEG n04118776/
+mv val/ILSVRC2012_val_00004175.JPEG n01910747/
+mv val/ILSVRC2012_val_00004176.JPEG n03126707/
+mv val/ILSVRC2012_val_00004177.JPEG n02233338/
+mv val/ILSVRC2012_val_00004178.JPEG n02114855/
+mv val/ILSVRC2012_val_00004179.JPEG n02808304/
+mv val/ILSVRC2012_val_00004180.JPEG n02107683/
+mv val/ILSVRC2012_val_00004181.JPEG n03590841/
+mv val/ILSVRC2012_val_00004182.JPEG n01737021/
+mv val/ILSVRC2012_val_00004183.JPEG n01514859/
+mv val/ILSVRC2012_val_00004184.JPEG n04346328/
+mv val/ILSVRC2012_val_00004185.JPEG n02102480/
+mv val/ILSVRC2012_val_00004186.JPEG n02093754/
+mv val/ILSVRC2012_val_00004187.JPEG n09472597/
+mv val/ILSVRC2012_val_00004188.JPEG n09332890/
+mv val/ILSVRC2012_val_00004189.JPEG n03630383/
+mv val/ILSVRC2012_val_00004190.JPEG n02492035/
+mv val/ILSVRC2012_val_00004191.JPEG n04026417/
+mv val/ILSVRC2012_val_00004192.JPEG n02110185/
+mv val/ILSVRC2012_val_00004193.JPEG n03125729/
+mv val/ILSVRC2012_val_00004194.JPEG n04465501/
+mv val/ILSVRC2012_val_00004195.JPEG n07695742/
+mv val/ILSVRC2012_val_00004196.JPEG n03775546/
+mv val/ILSVRC2012_val_00004197.JPEG n02930766/
+mv val/ILSVRC2012_val_00004198.JPEG n07753275/
+mv val/ILSVRC2012_val_00004199.JPEG n07684084/
+mv val/ILSVRC2012_val_00004200.JPEG n04486054/
+mv val/ILSVRC2012_val_00004201.JPEG n01677366/
+mv val/ILSVRC2012_val_00004202.JPEG n03127747/
+mv val/ILSVRC2012_val_00004203.JPEG n02917067/
+mv val/ILSVRC2012_val_00004204.JPEG n04347754/
+mv val/ILSVRC2012_val_00004205.JPEG n02704792/
+mv val/ILSVRC2012_val_00004206.JPEG n07583066/
+mv val/ILSVRC2012_val_00004207.JPEG n07714990/
+mv val/ILSVRC2012_val_00004208.JPEG n02111500/
+mv val/ILSVRC2012_val_00004209.JPEG n03085013/
+mv val/ILSVRC2012_val_00004210.JPEG n02233338/
+mv val/ILSVRC2012_val_00004211.JPEG n03977966/
+mv val/ILSVRC2012_val_00004212.JPEG n03876231/
+mv val/ILSVRC2012_val_00004213.JPEG n07760859/
+mv val/ILSVRC2012_val_00004214.JPEG n03623198/
+mv val/ILSVRC2012_val_00004215.JPEG n02268853/
+mv val/ILSVRC2012_val_00004216.JPEG n07730033/
+mv val/ILSVRC2012_val_00004217.JPEG n02097047/
+mv val/ILSVRC2012_val_00004218.JPEG n02981792/
+mv val/ILSVRC2012_val_00004219.JPEG n01984695/
+mv val/ILSVRC2012_val_00004220.JPEG n04584207/
+mv val/ILSVRC2012_val_00004221.JPEG n01665541/
+mv val/ILSVRC2012_val_00004222.JPEG n01734418/
+mv val/ILSVRC2012_val_00004223.JPEG n02100877/
+mv val/ILSVRC2012_val_00004224.JPEG n03109150/
+mv val/ILSVRC2012_val_00004225.JPEG n02099712/
+mv val/ILSVRC2012_val_00004226.JPEG n01855672/
+mv val/ILSVRC2012_val_00004227.JPEG n02486410/
+mv val/ILSVRC2012_val_00004228.JPEG n02099267/
+mv val/ILSVRC2012_val_00004229.JPEG n03804744/
+mv val/ILSVRC2012_val_00004230.JPEG n04179913/
+mv val/ILSVRC2012_val_00004231.JPEG n02091032/
+mv val/ILSVRC2012_val_00004232.JPEG n04200800/
+mv val/ILSVRC2012_val_00004233.JPEG n04127249/
+mv val/ILSVRC2012_val_00004234.JPEG n01833805/
+mv val/ILSVRC2012_val_00004235.JPEG n01855672/
+mv val/ILSVRC2012_val_00004236.JPEG n02909870/
+mv val/ILSVRC2012_val_00004237.JPEG n04423845/
+mv val/ILSVRC2012_val_00004238.JPEG n03345487/
+mv val/ILSVRC2012_val_00004239.JPEG n04456115/
+mv val/ILSVRC2012_val_00004240.JPEG n04517823/
+mv val/ILSVRC2012_val_00004241.JPEG n07714990/
+mv val/ILSVRC2012_val_00004242.JPEG n03492542/
+mv val/ILSVRC2012_val_00004243.JPEG n01531178/
+mv val/ILSVRC2012_val_00004244.JPEG n07892512/
+mv val/ILSVRC2012_val_00004245.JPEG n01534433/
+mv val/ILSVRC2012_val_00004246.JPEG n03982430/
+mv val/ILSVRC2012_val_00004247.JPEG n04116512/
+mv val/ILSVRC2012_val_00004248.JPEG n02097130/
+mv val/ILSVRC2012_val_00004249.JPEG n04612504/
+mv val/ILSVRC2012_val_00004250.JPEG n03146219/
+mv val/ILSVRC2012_val_00004251.JPEG n02097130/
+mv val/ILSVRC2012_val_00004252.JPEG n04517823/
+mv val/ILSVRC2012_val_00004253.JPEG n07684084/
+mv val/ILSVRC2012_val_00004254.JPEG n01978455/
+mv val/ILSVRC2012_val_00004255.JPEG n02236044/
+mv val/ILSVRC2012_val_00004256.JPEG n01798484/
+mv val/ILSVRC2012_val_00004257.JPEG n04200800/
+mv val/ILSVRC2012_val_00004258.JPEG n01985128/
+mv val/ILSVRC2012_val_00004259.JPEG n09468604/
+mv val/ILSVRC2012_val_00004260.JPEG n02268853/
+mv val/ILSVRC2012_val_00004261.JPEG n02090622/
+mv val/ILSVRC2012_val_00004262.JPEG n03000684/
+mv val/ILSVRC2012_val_00004263.JPEG n04447861/
+mv val/ILSVRC2012_val_00004264.JPEG n04154565/
+mv val/ILSVRC2012_val_00004265.JPEG n02840245/
+mv val/ILSVRC2012_val_00004266.JPEG n03126707/
+mv val/ILSVRC2012_val_00004267.JPEG n02391049/
+mv val/ILSVRC2012_val_00004268.JPEG n04532106/
+mv val/ILSVRC2012_val_00004269.JPEG n01728572/
+mv val/ILSVRC2012_val_00004270.JPEG n03124043/
+mv val/ILSVRC2012_val_00004271.JPEG n01773549/
+mv val/ILSVRC2012_val_00004272.JPEG n02480855/
+mv val/ILSVRC2012_val_00004273.JPEG n07860988/
+mv val/ILSVRC2012_val_00004274.JPEG n02105056/
+mv val/ILSVRC2012_val_00004275.JPEG n03888605/
+mv val/ILSVRC2012_val_00004276.JPEG n02116738/
+mv val/ILSVRC2012_val_00004277.JPEG n02804610/
+mv val/ILSVRC2012_val_00004278.JPEG n02113799/
+mv val/ILSVRC2012_val_00004279.JPEG n03899768/
+mv val/ILSVRC2012_val_00004280.JPEG n01729322/
+mv val/ILSVRC2012_val_00004281.JPEG n07873807/
+mv val/ILSVRC2012_val_00004282.JPEG n02116738/
+mv val/ILSVRC2012_val_00004283.JPEG n02795169/
+mv val/ILSVRC2012_val_00004284.JPEG n02256656/
+mv val/ILSVRC2012_val_00004285.JPEG n07720875/
+mv val/ILSVRC2012_val_00004286.JPEG n03584829/
+mv val/ILSVRC2012_val_00004287.JPEG n02097209/
+mv val/ILSVRC2012_val_00004288.JPEG n02092002/
+mv val/ILSVRC2012_val_00004289.JPEG n07614500/
+mv val/ILSVRC2012_val_00004290.JPEG n03599486/
+mv val/ILSVRC2012_val_00004291.JPEG n02825657/
+mv val/ILSVRC2012_val_00004292.JPEG n02966687/
+mv val/ILSVRC2012_val_00004293.JPEG n04428191/
+mv val/ILSVRC2012_val_00004294.JPEG n02488702/
+mv val/ILSVRC2012_val_00004295.JPEG n01774384/
+mv val/ILSVRC2012_val_00004296.JPEG n03908618/
+mv val/ILSVRC2012_val_00004297.JPEG n03814639/
+mv val/ILSVRC2012_val_00004298.JPEG n02444819/
+mv val/ILSVRC2012_val_00004299.JPEG n02825657/
+mv val/ILSVRC2012_val_00004300.JPEG n02325366/
+mv val/ILSVRC2012_val_00004301.JPEG n03394916/
+mv val/ILSVRC2012_val_00004302.JPEG n02077923/
+mv val/ILSVRC2012_val_00004303.JPEG n03709823/
+mv val/ILSVRC2012_val_00004304.JPEG n04579432/
+mv val/ILSVRC2012_val_00004305.JPEG n03967562/
+mv val/ILSVRC2012_val_00004306.JPEG n01514668/
+mv val/ILSVRC2012_val_00004307.JPEG n04548280/
+mv val/ILSVRC2012_val_00004308.JPEG n03899768/
+mv val/ILSVRC2012_val_00004309.JPEG n02892201/
+mv val/ILSVRC2012_val_00004310.JPEG n01704323/
+mv val/ILSVRC2012_val_00004311.JPEG n01484850/
+mv val/ILSVRC2012_val_00004312.JPEG n03535780/
+mv val/ILSVRC2012_val_00004313.JPEG n03775546/
+mv val/ILSVRC2012_val_00004314.JPEG n03337140/
+mv val/ILSVRC2012_val_00004315.JPEG n01514859/
+mv val/ILSVRC2012_val_00004316.JPEG n01580077/
+mv val/ILSVRC2012_val_00004317.JPEG n01580077/
+mv val/ILSVRC2012_val_00004318.JPEG n04509417/
+mv val/ILSVRC2012_val_00004319.JPEG n03977966/
+mv val/ILSVRC2012_val_00004320.JPEG n02115641/
+mv val/ILSVRC2012_val_00004321.JPEG n07697313/
+mv val/ILSVRC2012_val_00004322.JPEG n07753275/
+mv val/ILSVRC2012_val_00004323.JPEG n04542943/
+mv val/ILSVRC2012_val_00004324.JPEG n02910353/
+mv val/ILSVRC2012_val_00004325.JPEG n02087046/
+mv val/ILSVRC2012_val_00004326.JPEG n04443257/
+mv val/ILSVRC2012_val_00004327.JPEG n03788365/
+mv val/ILSVRC2012_val_00004328.JPEG n04429376/
+mv val/ILSVRC2012_val_00004329.JPEG n01484850/
+mv val/ILSVRC2012_val_00004330.JPEG n02843684/
+mv val/ILSVRC2012_val_00004331.JPEG n04479046/
+mv val/ILSVRC2012_val_00004332.JPEG n01990800/
+mv val/ILSVRC2012_val_00004333.JPEG n09193705/
+mv val/ILSVRC2012_val_00004334.JPEG n02115641/
+mv val/ILSVRC2012_val_00004335.JPEG n01773549/
+mv val/ILSVRC2012_val_00004336.JPEG n09246464/
+mv val/ILSVRC2012_val_00004337.JPEG n03956157/
+mv val/ILSVRC2012_val_00004338.JPEG n03065424/
+mv val/ILSVRC2012_val_00004339.JPEG n02174001/
+mv val/ILSVRC2012_val_00004340.JPEG n01824575/
+mv val/ILSVRC2012_val_00004341.JPEG n02099267/
+mv val/ILSVRC2012_val_00004342.JPEG n02093647/
+mv val/ILSVRC2012_val_00004343.JPEG n03133878/
+mv val/ILSVRC2012_val_00004344.JPEG n01580077/
+mv val/ILSVRC2012_val_00004345.JPEG n01622779/
+mv val/ILSVRC2012_val_00004346.JPEG n03271574/
+mv val/ILSVRC2012_val_00004347.JPEG n07768694/
+mv val/ILSVRC2012_val_00004348.JPEG n04376876/
+mv val/ILSVRC2012_val_00004349.JPEG n01877812/
+mv val/ILSVRC2012_val_00004350.JPEG n03110669/
+mv val/ILSVRC2012_val_00004351.JPEG n01728920/
+mv val/ILSVRC2012_val_00004352.JPEG n04141327/
+mv val/ILSVRC2012_val_00004353.JPEG n04389033/
+mv val/ILSVRC2012_val_00004354.JPEG n02096294/
+mv val/ILSVRC2012_val_00004355.JPEG n02492035/
+mv val/ILSVRC2012_val_00004356.JPEG n03876231/
+mv val/ILSVRC2012_val_00004357.JPEG n07716906/
+mv val/ILSVRC2012_val_00004358.JPEG n02097474/
+mv val/ILSVRC2012_val_00004359.JPEG n02086240/
+mv val/ILSVRC2012_val_00004360.JPEG n02708093/
+mv val/ILSVRC2012_val_00004361.JPEG n02105641/
+mv val/ILSVRC2012_val_00004362.JPEG n01984695/
+mv val/ILSVRC2012_val_00004363.JPEG n03125729/
+mv val/ILSVRC2012_val_00004364.JPEG n03944341/
+mv val/ILSVRC2012_val_00004365.JPEG n03450230/
+mv val/ILSVRC2012_val_00004366.JPEG n02109525/
+mv val/ILSVRC2012_val_00004367.JPEG n04389033/
+mv val/ILSVRC2012_val_00004368.JPEG n07760859/
+mv val/ILSVRC2012_val_00004369.JPEG n01704323/
+mv val/ILSVRC2012_val_00004370.JPEG n04540053/
+mv val/ILSVRC2012_val_00004371.JPEG n02823428/
+mv val/ILSVRC2012_val_00004372.JPEG n02115641/
+mv val/ILSVRC2012_val_00004373.JPEG n03733281/
+mv val/ILSVRC2012_val_00004374.JPEG n02093754/
+mv val/ILSVRC2012_val_00004375.JPEG n01532829/
+mv val/ILSVRC2012_val_00004376.JPEG n07802026/
+mv val/ILSVRC2012_val_00004377.JPEG n09472597/
+mv val/ILSVRC2012_val_00004378.JPEG n02091134/
+mv val/ILSVRC2012_val_00004379.JPEG n03041632/
+mv val/ILSVRC2012_val_00004380.JPEG n04372370/
+mv val/ILSVRC2012_val_00004381.JPEG n01608432/
+mv val/ILSVRC2012_val_00004382.JPEG n04265275/
+mv val/ILSVRC2012_val_00004383.JPEG n02804414/
+mv val/ILSVRC2012_val_00004384.JPEG n03109150/
+mv val/ILSVRC2012_val_00004385.JPEG n04328186/
+mv val/ILSVRC2012_val_00004386.JPEG n02107312/
+mv val/ILSVRC2012_val_00004387.JPEG n03100240/
+mv val/ILSVRC2012_val_00004388.JPEG n03250847/
+mv val/ILSVRC2012_val_00004389.JPEG n03393912/
+mv val/ILSVRC2012_val_00004390.JPEG n02090622/
+mv val/ILSVRC2012_val_00004391.JPEG n02840245/
+mv val/ILSVRC2012_val_00004392.JPEG n02870880/
+mv val/ILSVRC2012_val_00004393.JPEG n04562935/
+mv val/ILSVRC2012_val_00004394.JPEG n02397096/
+mv val/ILSVRC2012_val_00004395.JPEG n03995372/
+mv val/ILSVRC2012_val_00004396.JPEG n02106662/
+mv val/ILSVRC2012_val_00004397.JPEG n02096177/
+mv val/ILSVRC2012_val_00004398.JPEG n02493509/
+mv val/ILSVRC2012_val_00004399.JPEG n02965783/
+mv val/ILSVRC2012_val_00004400.JPEG n01981276/
+mv val/ILSVRC2012_val_00004401.JPEG n01990800/
+mv val/ILSVRC2012_val_00004402.JPEG n01698640/
+mv val/ILSVRC2012_val_00004403.JPEG n02088238/
+mv val/ILSVRC2012_val_00004404.JPEG n02107908/
+mv val/ILSVRC2012_val_00004405.JPEG n09399592/
+mv val/ILSVRC2012_val_00004406.JPEG n02790996/
+mv val/ILSVRC2012_val_00004407.JPEG n02091134/
+mv val/ILSVRC2012_val_00004408.JPEG n04252225/
+mv val/ILSVRC2012_val_00004409.JPEG n02447366/
+mv val/ILSVRC2012_val_00004410.JPEG n03179701/
+mv val/ILSVRC2012_val_00004411.JPEG n02123394/
+mv val/ILSVRC2012_val_00004412.JPEG n02974003/
+mv val/ILSVRC2012_val_00004413.JPEG n03124170/
+mv val/ILSVRC2012_val_00004414.JPEG n03045698/
+mv val/ILSVRC2012_val_00004415.JPEG n03271574/
+mv val/ILSVRC2012_val_00004416.JPEG n04067472/
+mv val/ILSVRC2012_val_00004417.JPEG n01494475/
+mv val/ILSVRC2012_val_00004418.JPEG n01984695/
+mv val/ILSVRC2012_val_00004419.JPEG n02321529/
+mv val/ILSVRC2012_val_00004420.JPEG n03062245/
+mv val/ILSVRC2012_val_00004421.JPEG n07892512/
+mv val/ILSVRC2012_val_00004422.JPEG n02123045/
+mv val/ILSVRC2012_val_00004423.JPEG n02099849/
+mv val/ILSVRC2012_val_00004424.JPEG n02672831/
+mv val/ILSVRC2012_val_00004425.JPEG n03854065/
+mv val/ILSVRC2012_val_00004426.JPEG n02825657/
+mv val/ILSVRC2012_val_00004427.JPEG n01644900/
+mv val/ILSVRC2012_val_00004428.JPEG n07745940/
+mv val/ILSVRC2012_val_00004429.JPEG n04366367/
+mv val/ILSVRC2012_val_00004430.JPEG n09288635/
+mv val/ILSVRC2012_val_00004431.JPEG n03447447/
+mv val/ILSVRC2012_val_00004432.JPEG n03124043/
+mv val/ILSVRC2012_val_00004433.JPEG n12267677/
+mv val/ILSVRC2012_val_00004434.JPEG n02091244/
+mv val/ILSVRC2012_val_00004435.JPEG n02111277/
+mv val/ILSVRC2012_val_00004436.JPEG n02088632/
+mv val/ILSVRC2012_val_00004437.JPEG n12985857/
+mv val/ILSVRC2012_val_00004438.JPEG n04517823/
+mv val/ILSVRC2012_val_00004439.JPEG n03594945/
+mv val/ILSVRC2012_val_00004440.JPEG n04049303/
+mv val/ILSVRC2012_val_00004441.JPEG n03908714/
+mv val/ILSVRC2012_val_00004442.JPEG n03697007/
+mv val/ILSVRC2012_val_00004443.JPEG n07714571/
+mv val/ILSVRC2012_val_00004444.JPEG n01986214/
+mv val/ILSVRC2012_val_00004445.JPEG n03014705/
+mv val/ILSVRC2012_val_00004446.JPEG n04238763/
+mv val/ILSVRC2012_val_00004447.JPEG n02950826/
+mv val/ILSVRC2012_val_00004448.JPEG n01755581/
+mv val/ILSVRC2012_val_00004449.JPEG n02108089/
+mv val/ILSVRC2012_val_00004450.JPEG n02111500/
+mv val/ILSVRC2012_val_00004451.JPEG n02028035/
+mv val/ILSVRC2012_val_00004452.JPEG n03425413/
+mv val/ILSVRC2012_val_00004453.JPEG n02276258/
+mv val/ILSVRC2012_val_00004454.JPEG n03690938/
+mv val/ILSVRC2012_val_00004455.JPEG n03478589/
+mv val/ILSVRC2012_val_00004456.JPEG n04579432/
+mv val/ILSVRC2012_val_00004457.JPEG n04209133/
+mv val/ILSVRC2012_val_00004458.JPEG n02492035/
+mv val/ILSVRC2012_val_00004459.JPEG n04479046/
+mv val/ILSVRC2012_val_00004460.JPEG n03131574/
+mv val/ILSVRC2012_val_00004461.JPEG n04026417/
+mv val/ILSVRC2012_val_00004462.JPEG n01981276/
+mv val/ILSVRC2012_val_00004463.JPEG n01514668/
+mv val/ILSVRC2012_val_00004464.JPEG n02643566/
+mv val/ILSVRC2012_val_00004465.JPEG n03791053/
+mv val/ILSVRC2012_val_00004466.JPEG n02870880/
+mv val/ILSVRC2012_val_00004467.JPEG n04235860/
+mv val/ILSVRC2012_val_00004468.JPEG n06596364/
+mv val/ILSVRC2012_val_00004469.JPEG n04019541/
+mv val/ILSVRC2012_val_00004470.JPEG n09246464/
+mv val/ILSVRC2012_val_00004471.JPEG n03065424/
+mv val/ILSVRC2012_val_00004472.JPEG n13054560/
+mv val/ILSVRC2012_val_00004473.JPEG n04597913/
+mv val/ILSVRC2012_val_00004474.JPEG n02111500/
+mv val/ILSVRC2012_val_00004475.JPEG n04252077/
+mv val/ILSVRC2012_val_00004476.JPEG n03857828/
+mv val/ILSVRC2012_val_00004477.JPEG n02100236/
+mv val/ILSVRC2012_val_00004478.JPEG n04442312/
+mv val/ILSVRC2012_val_00004479.JPEG n02363005/
+mv val/ILSVRC2012_val_00004480.JPEG n04040759/
+mv val/ILSVRC2012_val_00004481.JPEG n03127925/
+mv val/ILSVRC2012_val_00004482.JPEG n04033995/
+mv val/ILSVRC2012_val_00004483.JPEG n03662601/
+mv val/ILSVRC2012_val_00004484.JPEG n02966193/
+mv val/ILSVRC2012_val_00004485.JPEG n03761084/
+mv val/ILSVRC2012_val_00004486.JPEG n03838899/
+mv val/ILSVRC2012_val_00004487.JPEG n04081281/
+mv val/ILSVRC2012_val_00004488.JPEG n04243546/
+mv val/ILSVRC2012_val_00004489.JPEG n04252077/
+mv val/ILSVRC2012_val_00004490.JPEG n04487081/
+mv val/ILSVRC2012_val_00004491.JPEG n04417672/
+mv val/ILSVRC2012_val_00004492.JPEG n03662601/
+mv val/ILSVRC2012_val_00004493.JPEG n03476991/
+mv val/ILSVRC2012_val_00004494.JPEG n01829413/
+mv val/ILSVRC2012_val_00004495.JPEG n07614500/
+mv val/ILSVRC2012_val_00004496.JPEG n02701002/
+mv val/ILSVRC2012_val_00004497.JPEG n07754684/
+mv val/ILSVRC2012_val_00004498.JPEG n04258138/
+mv val/ILSVRC2012_val_00004499.JPEG n01744401/
+mv val/ILSVRC2012_val_00004500.JPEG n03259280/
+mv val/ILSVRC2012_val_00004501.JPEG n02676566/
+mv val/ILSVRC2012_val_00004502.JPEG n03017168/
+mv val/ILSVRC2012_val_00004503.JPEG n01817953/
+mv val/ILSVRC2012_val_00004504.JPEG n04049303/
+mv val/ILSVRC2012_val_00004505.JPEG n01692333/
+mv val/ILSVRC2012_val_00004506.JPEG n02108551/
+mv val/ILSVRC2012_val_00004507.JPEG n03134739/
+mv val/ILSVRC2012_val_00004508.JPEG n02410509/
+mv val/ILSVRC2012_val_00004509.JPEG n03871628/
+mv val/ILSVRC2012_val_00004510.JPEG n04525305/
+mv val/ILSVRC2012_val_00004511.JPEG n02093754/
+mv val/ILSVRC2012_val_00004512.JPEG n04461696/
+mv val/ILSVRC2012_val_00004513.JPEG n04523525/
+mv val/ILSVRC2012_val_00004514.JPEG n11939491/
+mv val/ILSVRC2012_val_00004515.JPEG n04612504/
+mv val/ILSVRC2012_val_00004516.JPEG n03706229/
+mv val/ILSVRC2012_val_00004517.JPEG n02167151/
+mv val/ILSVRC2012_val_00004518.JPEG n01582220/
+mv val/ILSVRC2012_val_00004519.JPEG n03692522/
+mv val/ILSVRC2012_val_00004520.JPEG n03595614/
+mv val/ILSVRC2012_val_00004521.JPEG n02823428/
+mv val/ILSVRC2012_val_00004522.JPEG n03950228/
+mv val/ILSVRC2012_val_00004523.JPEG n04399382/
+mv val/ILSVRC2012_val_00004524.JPEG n03877845/
+mv val/ILSVRC2012_val_00004525.JPEG n04596742/
+mv val/ILSVRC2012_val_00004526.JPEG n04005630/
+mv val/ILSVRC2012_val_00004527.JPEG n03724870/
+mv val/ILSVRC2012_val_00004528.JPEG n03445924/
+mv val/ILSVRC2012_val_00004529.JPEG n07614500/
+mv val/ILSVRC2012_val_00004530.JPEG n01883070/
+mv val/ILSVRC2012_val_00004531.JPEG n03710637/
+mv val/ILSVRC2012_val_00004532.JPEG n04120489/
+mv val/ILSVRC2012_val_00004533.JPEG n03127925/
+mv val/ILSVRC2012_val_00004534.JPEG n03249569/
+mv val/ILSVRC2012_val_00004535.JPEG n02879718/
+mv val/ILSVRC2012_val_00004536.JPEG n04562935/
+mv val/ILSVRC2012_val_00004537.JPEG n03630383/
+mv val/ILSVRC2012_val_00004538.JPEG n02106662/
+mv val/ILSVRC2012_val_00004539.JPEG n02097474/
+mv val/ILSVRC2012_val_00004540.JPEG n02114855/
+mv val/ILSVRC2012_val_00004541.JPEG n09332890/
+mv val/ILSVRC2012_val_00004542.JPEG n02096051/
+mv val/ILSVRC2012_val_00004543.JPEG n03995372/
+mv val/ILSVRC2012_val_00004544.JPEG n03016953/
+mv val/ILSVRC2012_val_00004545.JPEG n03447447/
+mv val/ILSVRC2012_val_00004546.JPEG n10565667/
+mv val/ILSVRC2012_val_00004547.JPEG n07579787/
+mv val/ILSVRC2012_val_00004548.JPEG n02102040/
+mv val/ILSVRC2012_val_00004549.JPEG n02097298/
+mv val/ILSVRC2012_val_00004550.JPEG n01514668/
+mv val/ILSVRC2012_val_00004551.JPEG n04332243/
+mv val/ILSVRC2012_val_00004552.JPEG n03770679/
+mv val/ILSVRC2012_val_00004553.JPEG n02102040/
+mv val/ILSVRC2012_val_00004554.JPEG n01616318/
+mv val/ILSVRC2012_val_00004555.JPEG n01694178/
+mv val/ILSVRC2012_val_00004556.JPEG n02817516/
+mv val/ILSVRC2012_val_00004557.JPEG n02086240/
+mv val/ILSVRC2012_val_00004558.JPEG n03787032/
+mv val/ILSVRC2012_val_00004559.JPEG n01582220/
+mv val/ILSVRC2012_val_00004560.JPEG n02097130/
+mv val/ILSVRC2012_val_00004561.JPEG n03690938/
+mv val/ILSVRC2012_val_00004562.JPEG n02825657/
+mv val/ILSVRC2012_val_00004563.JPEG n02106662/
+mv val/ILSVRC2012_val_00004564.JPEG n02490219/
+mv val/ILSVRC2012_val_00004565.JPEG n02514041/
+mv val/ILSVRC2012_val_00004566.JPEG n03958227/
+mv val/ILSVRC2012_val_00004567.JPEG n03658185/
+mv val/ILSVRC2012_val_00004568.JPEG n03187595/
+mv val/ILSVRC2012_val_00004569.JPEG n02107908/
+mv val/ILSVRC2012_val_00004570.JPEG n07734744/
+mv val/ILSVRC2012_val_00004571.JPEG n02093859/
+mv val/ILSVRC2012_val_00004572.JPEG n02011460/
+mv val/ILSVRC2012_val_00004573.JPEG n04447861/
+mv val/ILSVRC2012_val_00004574.JPEG n02640242/
+mv val/ILSVRC2012_val_00004575.JPEG n02793495/
+mv val/ILSVRC2012_val_00004576.JPEG n02514041/
+mv val/ILSVRC2012_val_00004577.JPEG n01534433/
+mv val/ILSVRC2012_val_00004578.JPEG n02132136/
+mv val/ILSVRC2012_val_00004579.JPEG n02108422/
+mv val/ILSVRC2012_val_00004580.JPEG n01768244/
+mv val/ILSVRC2012_val_00004581.JPEG n04399382/
+mv val/ILSVRC2012_val_00004582.JPEG n01734418/
+mv val/ILSVRC2012_val_00004583.JPEG n02037110/
+mv val/ILSVRC2012_val_00004584.JPEG n02444819/
+mv val/ILSVRC2012_val_00004585.JPEG n03272562/
+mv val/ILSVRC2012_val_00004586.JPEG n02906734/
+mv val/ILSVRC2012_val_00004587.JPEG n01740131/
+mv val/ILSVRC2012_val_00004588.JPEG n03325584/
+mv val/ILSVRC2012_val_00004589.JPEG n03598930/
+mv val/ILSVRC2012_val_00004590.JPEG n02277742/
+mv val/ILSVRC2012_val_00004591.JPEG n03443371/
+mv val/ILSVRC2012_val_00004592.JPEG n03447721/
+mv val/ILSVRC2012_val_00004593.JPEG n02097130/
+mv val/ILSVRC2012_val_00004594.JPEG n04347754/
+mv val/ILSVRC2012_val_00004595.JPEG n03903868/
+mv val/ILSVRC2012_val_00004596.JPEG n03529860/
+mv val/ILSVRC2012_val_00004597.JPEG n06785654/
+mv val/ILSVRC2012_val_00004598.JPEG n01985128/
+mv val/ILSVRC2012_val_00004599.JPEG n02892767/
+mv val/ILSVRC2012_val_00004600.JPEG n02074367/
+mv val/ILSVRC2012_val_00004601.JPEG n02445715/
+mv val/ILSVRC2012_val_00004602.JPEG n03131574/
+mv val/ILSVRC2012_val_00004603.JPEG n02892201/
+mv val/ILSVRC2012_val_00004604.JPEG n02114548/
+mv val/ILSVRC2012_val_00004605.JPEG n02096294/
+mv val/ILSVRC2012_val_00004606.JPEG n03787032/
+mv val/ILSVRC2012_val_00004607.JPEG n03776460/
+mv val/ILSVRC2012_val_00004608.JPEG n02870880/
+mv val/ILSVRC2012_val_00004609.JPEG n04347754/
+mv val/ILSVRC2012_val_00004610.JPEG n03930313/
+mv val/ILSVRC2012_val_00004611.JPEG n02095889/
+mv val/ILSVRC2012_val_00004612.JPEG n02124075/
+mv val/ILSVRC2012_val_00004613.JPEG n01641577/
+mv val/ILSVRC2012_val_00004614.JPEG n07753592/
+mv val/ILSVRC2012_val_00004615.JPEG n02100583/
+mv val/ILSVRC2012_val_00004616.JPEG n04591157/
+mv val/ILSVRC2012_val_00004617.JPEG n02488291/
+mv val/ILSVRC2012_val_00004618.JPEG n03690938/
+mv val/ILSVRC2012_val_00004619.JPEG n03791053/
+mv val/ILSVRC2012_val_00004620.JPEG n02860847/
+mv val/ILSVRC2012_val_00004621.JPEG n04612504/
+mv val/ILSVRC2012_val_00004622.JPEG n01677366/
+mv val/ILSVRC2012_val_00004623.JPEG n02112350/
+mv val/ILSVRC2012_val_00004624.JPEG n03062245/
+mv val/ILSVRC2012_val_00004625.JPEG n02909870/
+mv val/ILSVRC2012_val_00004626.JPEG n09428293/
+mv val/ILSVRC2012_val_00004627.JPEG n01860187/
+mv val/ILSVRC2012_val_00004628.JPEG n02999410/
+mv val/ILSVRC2012_val_00004629.JPEG n13044778/
+mv val/ILSVRC2012_val_00004630.JPEG n04070727/
+mv val/ILSVRC2012_val_00004631.JPEG n02105855/
+mv val/ILSVRC2012_val_00004632.JPEG n01950731/
+mv val/ILSVRC2012_val_00004633.JPEG n04443257/
+mv val/ILSVRC2012_val_00004634.JPEG n02110341/
+mv val/ILSVRC2012_val_00004635.JPEG n04265275/
+mv val/ILSVRC2012_val_00004636.JPEG n04273569/
+mv val/ILSVRC2012_val_00004637.JPEG n03000247/
+mv val/ILSVRC2012_val_00004638.JPEG n01675722/
+mv val/ILSVRC2012_val_00004639.JPEG n03838899/
+mv val/ILSVRC2012_val_00004640.JPEG n13040303/
+mv val/ILSVRC2012_val_00004641.JPEG n03016953/
+mv val/ILSVRC2012_val_00004642.JPEG n03793489/
+mv val/ILSVRC2012_val_00004643.JPEG n02119022/
+mv val/ILSVRC2012_val_00004644.JPEG n04366367/
+mv val/ILSVRC2012_val_00004645.JPEG n03388549/
+mv val/ILSVRC2012_val_00004646.JPEG n06874185/
+mv val/ILSVRC2012_val_00004647.JPEG n02980441/
+mv val/ILSVRC2012_val_00004648.JPEG n03676483/
+mv val/ILSVRC2012_val_00004649.JPEG n04065272/
+mv val/ILSVRC2012_val_00004650.JPEG n02102040/
+mv val/ILSVRC2012_val_00004651.JPEG n04501370/
+mv val/ILSVRC2012_val_00004652.JPEG n01740131/
+mv val/ILSVRC2012_val_00004653.JPEG n04162706/
+mv val/ILSVRC2012_val_00004654.JPEG n04325704/
+mv val/ILSVRC2012_val_00004655.JPEG n01443537/
+mv val/ILSVRC2012_val_00004656.JPEG n02672831/
+mv val/ILSVRC2012_val_00004657.JPEG n02101006/
+mv val/ILSVRC2012_val_00004658.JPEG n04417672/
+mv val/ILSVRC2012_val_00004659.JPEG n01990800/
+mv val/ILSVRC2012_val_00004660.JPEG n02133161/
+mv val/ILSVRC2012_val_00004661.JPEG n02264363/
+mv val/ILSVRC2012_val_00004662.JPEG n04548280/
+mv val/ILSVRC2012_val_00004663.JPEG n03935335/
+mv val/ILSVRC2012_val_00004664.JPEG n02906734/
+mv val/ILSVRC2012_val_00004665.JPEG n01985128/
+mv val/ILSVRC2012_val_00004666.JPEG n02107574/
+mv val/ILSVRC2012_val_00004667.JPEG n03125729/
+mv val/ILSVRC2012_val_00004668.JPEG n03208938/
+mv val/ILSVRC2012_val_00004669.JPEG n02074367/
+mv val/ILSVRC2012_val_00004670.JPEG n03133878/
+mv val/ILSVRC2012_val_00004671.JPEG n02085782/
+mv val/ILSVRC2012_val_00004672.JPEG n02607072/
+mv val/ILSVRC2012_val_00004673.JPEG n03388043/
+mv val/ILSVRC2012_val_00004674.JPEG n02096585/
+mv val/ILSVRC2012_val_00004675.JPEG n07693725/
+mv val/ILSVRC2012_val_00004676.JPEG n02786058/
+mv val/ILSVRC2012_val_00004677.JPEG n01443537/
+mv val/ILSVRC2012_val_00004678.JPEG n01873310/
+mv val/ILSVRC2012_val_00004679.JPEG n02791124/
+mv val/ILSVRC2012_val_00004680.JPEG n04325704/
+mv val/ILSVRC2012_val_00004681.JPEG n03530642/
+mv val/ILSVRC2012_val_00004682.JPEG n04147183/
+mv val/ILSVRC2012_val_00004683.JPEG n02484975/
+mv val/ILSVRC2012_val_00004684.JPEG n02091635/
+mv val/ILSVRC2012_val_00004685.JPEG n03100240/
+mv val/ILSVRC2012_val_00004686.JPEG n02879718/
+mv val/ILSVRC2012_val_00004687.JPEG n02093991/
+mv val/ILSVRC2012_val_00004688.JPEG n11879895/
+mv val/ILSVRC2012_val_00004689.JPEG n01737021/
+mv val/ILSVRC2012_val_00004690.JPEG n13054560/
+mv val/ILSVRC2012_val_00004691.JPEG n01945685/
+mv val/ILSVRC2012_val_00004692.JPEG n04356056/
+mv val/ILSVRC2012_val_00004693.JPEG n02342885/
+mv val/ILSVRC2012_val_00004694.JPEG n04192698/
+mv val/ILSVRC2012_val_00004695.JPEG n04536866/
+mv val/ILSVRC2012_val_00004696.JPEG n04435653/
+mv val/ILSVRC2012_val_00004697.JPEG n01829413/
+mv val/ILSVRC2012_val_00004698.JPEG n01496331/
+mv val/ILSVRC2012_val_00004699.JPEG n03887697/
+mv val/ILSVRC2012_val_00004700.JPEG n03770679/
+mv val/ILSVRC2012_val_00004701.JPEG n12057211/
+mv val/ILSVRC2012_val_00004702.JPEG n12985857/
+mv val/ILSVRC2012_val_00004703.JPEG n04266014/
+mv val/ILSVRC2012_val_00004704.JPEG n02916936/
+mv val/ILSVRC2012_val_00004705.JPEG n04429376/
+mv val/ILSVRC2012_val_00004706.JPEG n02229544/
+mv val/ILSVRC2012_val_00004707.JPEG n03763968/
+mv val/ILSVRC2012_val_00004708.JPEG n03595614/
+mv val/ILSVRC2012_val_00004709.JPEG n02837789/
+mv val/ILSVRC2012_val_00004710.JPEG n02109047/
+mv val/ILSVRC2012_val_00004711.JPEG n02106030/
+mv val/ILSVRC2012_val_00004712.JPEG n03180011/
+mv val/ILSVRC2012_val_00004713.JPEG n02102973/
+mv val/ILSVRC2012_val_00004714.JPEG n02865351/
+mv val/ILSVRC2012_val_00004715.JPEG n02074367/
+mv val/ILSVRC2012_val_00004716.JPEG n02169497/
+mv val/ILSVRC2012_val_00004717.JPEG n02087046/
+mv val/ILSVRC2012_val_00004718.JPEG n03141823/
+mv val/ILSVRC2012_val_00004719.JPEG n02124075/
+mv val/ILSVRC2012_val_00004720.JPEG n02437312/
+mv val/ILSVRC2012_val_00004721.JPEG n07892512/
+mv val/ILSVRC2012_val_00004722.JPEG n01776313/
+mv val/ILSVRC2012_val_00004723.JPEG n02641379/
+mv val/ILSVRC2012_val_00004724.JPEG n01644900/
+mv val/ILSVRC2012_val_00004725.JPEG n03042490/
+mv val/ILSVRC2012_val_00004726.JPEG n03630383/
+mv val/ILSVRC2012_val_00004727.JPEG n03785016/
+mv val/ILSVRC2012_val_00004728.JPEG n07730033/
+mv val/ILSVRC2012_val_00004729.JPEG n03544143/
+mv val/ILSVRC2012_val_00004730.JPEG n02007558/
+mv val/ILSVRC2012_val_00004731.JPEG n02109047/
+mv val/ILSVRC2012_val_00004732.JPEG n02910353/
+mv val/ILSVRC2012_val_00004733.JPEG n02107312/
+mv val/ILSVRC2012_val_00004734.JPEG n02389026/
+mv val/ILSVRC2012_val_00004735.JPEG n01698640/
+mv val/ILSVRC2012_val_00004736.JPEG n03633091/
+mv val/ILSVRC2012_val_00004737.JPEG n04442312/
+mv val/ILSVRC2012_val_00004738.JPEG n07248320/
+mv val/ILSVRC2012_val_00004739.JPEG n04525038/
+mv val/ILSVRC2012_val_00004740.JPEG n03459775/
+mv val/ILSVRC2012_val_00004741.JPEG n03297495/
+mv val/ILSVRC2012_val_00004742.JPEG n03676483/
+mv val/ILSVRC2012_val_00004743.JPEG n03476991/
+mv val/ILSVRC2012_val_00004744.JPEG n02097658/
+mv val/ILSVRC2012_val_00004745.JPEG n03888257/
+mv val/ILSVRC2012_val_00004746.JPEG n02115913/
+mv val/ILSVRC2012_val_00004747.JPEG n01532829/
+mv val/ILSVRC2012_val_00004748.JPEG n02085936/
+mv val/ILSVRC2012_val_00004749.JPEG n01532829/
+mv val/ILSVRC2012_val_00004750.JPEG n02107312/
+mv val/ILSVRC2012_val_00004751.JPEG n02403003/
+mv val/ILSVRC2012_val_00004752.JPEG n03933933/
+mv val/ILSVRC2012_val_00004753.JPEG n02483362/
+mv val/ILSVRC2012_val_00004754.JPEG n02105162/
+mv val/ILSVRC2012_val_00004755.JPEG n02066245/
+mv val/ILSVRC2012_val_00004756.JPEG n01518878/
+mv val/ILSVRC2012_val_00004757.JPEG n01685808/
+mv val/ILSVRC2012_val_00004758.JPEG n03782006/
+mv val/ILSVRC2012_val_00004759.JPEG n07695742/
+mv val/ILSVRC2012_val_00004760.JPEG n09835506/
+mv val/ILSVRC2012_val_00004761.JPEG n04141076/
+mv val/ILSVRC2012_val_00004762.JPEG n02454379/
+mv val/ILSVRC2012_val_00004763.JPEG n02107683/
+mv val/ILSVRC2012_val_00004764.JPEG n03874293/
+mv val/ILSVRC2012_val_00004765.JPEG n02177972/
+mv val/ILSVRC2012_val_00004766.JPEG n02106166/
+mv val/ILSVRC2012_val_00004767.JPEG n04590129/
+mv val/ILSVRC2012_val_00004768.JPEG n03388549/
+mv val/ILSVRC2012_val_00004769.JPEG n04399382/
+mv val/ILSVRC2012_val_00004770.JPEG n02096585/
+mv val/ILSVRC2012_val_00004771.JPEG n02093256/
+mv val/ILSVRC2012_val_00004772.JPEG n02319095/
+mv val/ILSVRC2012_val_00004773.JPEG n04560804/
+mv val/ILSVRC2012_val_00004774.JPEG n02089973/
+mv val/ILSVRC2012_val_00004775.JPEG n03223299/
+mv val/ILSVRC2012_val_00004776.JPEG n02091244/
+mv val/ILSVRC2012_val_00004777.JPEG n02089867/
+mv val/ILSVRC2012_val_00004778.JPEG n04335435/
+mv val/ILSVRC2012_val_00004779.JPEG n03825788/
+mv val/ILSVRC2012_val_00004780.JPEG n02056570/
+mv val/ILSVRC2012_val_00004781.JPEG n01669191/
+mv val/ILSVRC2012_val_00004782.JPEG n02113978/
+mv val/ILSVRC2012_val_00004783.JPEG n03141823/
+mv val/ILSVRC2012_val_00004784.JPEG n02640242/
+mv val/ILSVRC2012_val_00004785.JPEG n02841315/
+mv val/ILSVRC2012_val_00004786.JPEG n04146614/
+mv val/ILSVRC2012_val_00004787.JPEG n03400231/
+mv val/ILSVRC2012_val_00004788.JPEG n02490219/
+mv val/ILSVRC2012_val_00004789.JPEG n03791053/
+mv val/ILSVRC2012_val_00004790.JPEG n07880968/
+mv val/ILSVRC2012_val_00004791.JPEG n02025239/
+mv val/ILSVRC2012_val_00004792.JPEG n03873416/
+mv val/ILSVRC2012_val_00004793.JPEG n02437616/
+mv val/ILSVRC2012_val_00004794.JPEG n03220513/
+mv val/ILSVRC2012_val_00004795.JPEG n02089973/
+mv val/ILSVRC2012_val_00004796.JPEG n03045698/
+mv val/ILSVRC2012_val_00004797.JPEG n02100735/
+mv val/ILSVRC2012_val_00004798.JPEG n04228054/
+mv val/ILSVRC2012_val_00004799.JPEG n06785654/
+mv val/ILSVRC2012_val_00004800.JPEG n04554684/
+mv val/ILSVRC2012_val_00004801.JPEG n03595614/
+mv val/ILSVRC2012_val_00004802.JPEG n03933933/
+mv val/ILSVRC2012_val_00004803.JPEG n03954731/
+mv val/ILSVRC2012_val_00004804.JPEG n02110806/
+mv val/ILSVRC2012_val_00004805.JPEG n02056570/
+mv val/ILSVRC2012_val_00004806.JPEG n04476259/
+mv val/ILSVRC2012_val_00004807.JPEG n03032252/
+mv val/ILSVRC2012_val_00004808.JPEG n02445715/
+mv val/ILSVRC2012_val_00004809.JPEG n03895866/
+mv val/ILSVRC2012_val_00004810.JPEG n02317335/
+mv val/ILSVRC2012_val_00004811.JPEG n04479046/
+mv val/ILSVRC2012_val_00004812.JPEG n02782093/
+mv val/ILSVRC2012_val_00004813.JPEG n02172182/
+mv val/ILSVRC2012_val_00004814.JPEG n02417914/
+mv val/ILSVRC2012_val_00004815.JPEG n03041632/
+mv val/ILSVRC2012_val_00004816.JPEG n04507155/
+mv val/ILSVRC2012_val_00004817.JPEG n02672831/
+mv val/ILSVRC2012_val_00004818.JPEG n02108000/
+mv val/ILSVRC2012_val_00004819.JPEG n07714990/
+mv val/ILSVRC2012_val_00004820.JPEG n03532672/
+mv val/ILSVRC2012_val_00004821.JPEG n02123597/
+mv val/ILSVRC2012_val_00004822.JPEG n03218198/
+mv val/ILSVRC2012_val_00004823.JPEG n02091134/
+mv val/ILSVRC2012_val_00004824.JPEG n02825657/
+mv val/ILSVRC2012_val_00004825.JPEG n02916936/
+mv val/ILSVRC2012_val_00004826.JPEG n03874599/
+mv val/ILSVRC2012_val_00004827.JPEG n03876231/
+mv val/ILSVRC2012_val_00004828.JPEG n03160309/
+mv val/ILSVRC2012_val_00004829.JPEG n04118538/
+mv val/ILSVRC2012_val_00004830.JPEG n03259280/
+mv val/ILSVRC2012_val_00004831.JPEG n03670208/
+mv val/ILSVRC2012_val_00004832.JPEG n07745940/
+mv val/ILSVRC2012_val_00004833.JPEG n03733805/
+mv val/ILSVRC2012_val_00004834.JPEG n01669191/
+mv val/ILSVRC2012_val_00004835.JPEG n03404251/
+mv val/ILSVRC2012_val_00004836.JPEG n07718747/
+mv val/ILSVRC2012_val_00004837.JPEG n07831146/
+mv val/ILSVRC2012_val_00004838.JPEG n02403003/
+mv val/ILSVRC2012_val_00004839.JPEG n02883205/
+mv val/ILSVRC2012_val_00004840.JPEG n02415577/
+mv val/ILSVRC2012_val_00004841.JPEG n01784675/
+mv val/ILSVRC2012_val_00004842.JPEG n02492035/
+mv val/ILSVRC2012_val_00004843.JPEG n03599486/
+mv val/ILSVRC2012_val_00004844.JPEG n01877812/
+mv val/ILSVRC2012_val_00004845.JPEG n01877812/
+mv val/ILSVRC2012_val_00004846.JPEG n03498962/
+mv val/ILSVRC2012_val_00004847.JPEG n04355338/
+mv val/ILSVRC2012_val_00004848.JPEG n03617480/
+mv val/ILSVRC2012_val_00004849.JPEG n03404251/
+mv val/ILSVRC2012_val_00004850.JPEG n02277742/
+mv val/ILSVRC2012_val_00004851.JPEG n02169497/
+mv val/ILSVRC2012_val_00004852.JPEG n02113624/
+mv val/ILSVRC2012_val_00004853.JPEG n04067472/
+mv val/ILSVRC2012_val_00004854.JPEG n04465501/
+mv val/ILSVRC2012_val_00004855.JPEG n04335435/
+mv val/ILSVRC2012_val_00004856.JPEG n02444819/
+mv val/ILSVRC2012_val_00004857.JPEG n09421951/
+mv val/ILSVRC2012_val_00004858.JPEG n04591157/
+mv val/ILSVRC2012_val_00004859.JPEG n01622779/
+mv val/ILSVRC2012_val_00004860.JPEG n03425413/
+mv val/ILSVRC2012_val_00004861.JPEG n02346627/
+mv val/ILSVRC2012_val_00004862.JPEG n04162706/
+mv val/ILSVRC2012_val_00004863.JPEG n03874293/
+mv val/ILSVRC2012_val_00004864.JPEG n02138441/
+mv val/ILSVRC2012_val_00004865.JPEG n04005630/
+mv val/ILSVRC2012_val_00004866.JPEG n03769881/
+mv val/ILSVRC2012_val_00004867.JPEG n03942813/
+mv val/ILSVRC2012_val_00004868.JPEG n04285008/
+mv val/ILSVRC2012_val_00004869.JPEG n02114855/
+mv val/ILSVRC2012_val_00004870.JPEG n02114712/
+mv val/ILSVRC2012_val_00004871.JPEG n02708093/
+mv val/ILSVRC2012_val_00004872.JPEG n03124170/
+mv val/ILSVRC2012_val_00004873.JPEG n01498041/
+mv val/ILSVRC2012_val_00004874.JPEG n07613480/
+mv val/ILSVRC2012_val_00004875.JPEG n02363005/
+mv val/ILSVRC2012_val_00004876.JPEG n03355925/
+mv val/ILSVRC2012_val_00004877.JPEG n13054560/
+mv val/ILSVRC2012_val_00004878.JPEG n03180011/
+mv val/ILSVRC2012_val_00004879.JPEG n04552348/
+mv val/ILSVRC2012_val_00004880.JPEG n02423022/
+mv val/ILSVRC2012_val_00004881.JPEG n04525038/
+mv val/ILSVRC2012_val_00004882.JPEG n02504013/
+mv val/ILSVRC2012_val_00004883.JPEG n02107312/
+mv val/ILSVRC2012_val_00004884.JPEG n02091467/
+mv val/ILSVRC2012_val_00004885.JPEG n02101006/
+mv val/ILSVRC2012_val_00004886.JPEG n03721384/
+mv val/ILSVRC2012_val_00004887.JPEG n07695742/
+mv val/ILSVRC2012_val_00004888.JPEG n02823428/
+mv val/ILSVRC2012_val_00004889.JPEG n04589890/
+mv val/ILSVRC2012_val_00004890.JPEG n04584207/
+mv val/ILSVRC2012_val_00004891.JPEG n04111531/
+mv val/ILSVRC2012_val_00004892.JPEG n03160309/
+mv val/ILSVRC2012_val_00004893.JPEG n01531178/
+mv val/ILSVRC2012_val_00004894.JPEG n02123394/
+mv val/ILSVRC2012_val_00004895.JPEG n02777292/
+mv val/ILSVRC2012_val_00004896.JPEG n04208210/
+mv val/ILSVRC2012_val_00004897.JPEG n01667114/
+mv val/ILSVRC2012_val_00004898.JPEG n01667114/
+mv val/ILSVRC2012_val_00004899.JPEG n04597913/
+mv val/ILSVRC2012_val_00004900.JPEG n03529860/
+mv val/ILSVRC2012_val_00004901.JPEG n03450230/
+mv val/ILSVRC2012_val_00004902.JPEG n02123045/
+mv val/ILSVRC2012_val_00004903.JPEG n12768682/
+mv val/ILSVRC2012_val_00004904.JPEG n01924916/
+mv val/ILSVRC2012_val_00004905.JPEG n02536864/
+mv val/ILSVRC2012_val_00004906.JPEG n04442312/
+mv val/ILSVRC2012_val_00004907.JPEG n02747177/
+mv val/ILSVRC2012_val_00004908.JPEG n07831146/
+mv val/ILSVRC2012_val_00004909.JPEG n02951358/
+mv val/ILSVRC2012_val_00004910.JPEG n03857828/
+mv val/ILSVRC2012_val_00004911.JPEG n03482405/
+mv val/ILSVRC2012_val_00004912.JPEG n03028079/
+mv val/ILSVRC2012_val_00004913.JPEG n04040759/
+mv val/ILSVRC2012_val_00004914.JPEG n02417914/
+mv val/ILSVRC2012_val_00004915.JPEG n01689811/
+mv val/ILSVRC2012_val_00004916.JPEG n03188531/
+mv val/ILSVRC2012_val_00004917.JPEG n04070727/
+mv val/ILSVRC2012_val_00004918.JPEG n07720875/
+mv val/ILSVRC2012_val_00004919.JPEG n02168699/
+mv val/ILSVRC2012_val_00004920.JPEG n11939491/
+mv val/ILSVRC2012_val_00004921.JPEG n01704323/
+mv val/ILSVRC2012_val_00004922.JPEG n03223299/
+mv val/ILSVRC2012_val_00004923.JPEG n01930112/
+mv val/ILSVRC2012_val_00004924.JPEG n02747177/
+mv val/ILSVRC2012_val_00004925.JPEG n03903868/
+mv val/ILSVRC2012_val_00004926.JPEG n02093428/
+mv val/ILSVRC2012_val_00004927.JPEG n01728572/
+mv val/ILSVRC2012_val_00004928.JPEG n03459775/
+mv val/ILSVRC2012_val_00004929.JPEG n04409515/
+mv val/ILSVRC2012_val_00004930.JPEG n03977966/
+mv val/ILSVRC2012_val_00004931.JPEG n03220513/
+mv val/ILSVRC2012_val_00004932.JPEG n04355933/
+mv val/ILSVRC2012_val_00004933.JPEG n03662601/
+mv val/ILSVRC2012_val_00004934.JPEG n03916031/
+mv val/ILSVRC2012_val_00004935.JPEG n07836838/
+mv val/ILSVRC2012_val_00004936.JPEG n07714571/
+mv val/ILSVRC2012_val_00004937.JPEG n03891332/
+mv val/ILSVRC2012_val_00004938.JPEG n02105251/
+mv val/ILSVRC2012_val_00004939.JPEG n03028079/
+mv val/ILSVRC2012_val_00004940.JPEG n02117135/
+mv val/ILSVRC2012_val_00004941.JPEG n02096585/
+mv val/ILSVRC2012_val_00004942.JPEG n04458633/
+mv val/ILSVRC2012_val_00004943.JPEG n02883205/
+mv val/ILSVRC2012_val_00004944.JPEG n01818515/
+mv val/ILSVRC2012_val_00004945.JPEG n01641577/
+mv val/ILSVRC2012_val_00004946.JPEG n04070727/
+mv val/ILSVRC2012_val_00004947.JPEG n02093428/
+mv val/ILSVRC2012_val_00004948.JPEG n03494278/
+mv val/ILSVRC2012_val_00004949.JPEG n03255030/
+mv val/ILSVRC2012_val_00004950.JPEG n03769881/
+mv val/ILSVRC2012_val_00004951.JPEG n07716358/
+mv val/ILSVRC2012_val_00004952.JPEG n03877845/
+mv val/ILSVRC2012_val_00004953.JPEG n07760859/
+mv val/ILSVRC2012_val_00004954.JPEG n03495258/
+mv val/ILSVRC2012_val_00004955.JPEG n04370456/
+mv val/ILSVRC2012_val_00004956.JPEG n02091134/
+mv val/ILSVRC2012_val_00004957.JPEG n03874293/
+mv val/ILSVRC2012_val_00004958.JPEG n03026506/
+mv val/ILSVRC2012_val_00004959.JPEG n03259280/
+mv val/ILSVRC2012_val_00004960.JPEG n02097209/
+mv val/ILSVRC2012_val_00004961.JPEG n03873416/
+mv val/ILSVRC2012_val_00004962.JPEG n07760859/
+mv val/ILSVRC2012_val_00004963.JPEG n02108422/
+mv val/ILSVRC2012_val_00004964.JPEG n01872401/
+mv val/ILSVRC2012_val_00004965.JPEG n01981276/
+mv val/ILSVRC2012_val_00004966.JPEG n04153751/
+mv val/ILSVRC2012_val_00004967.JPEG n02110185/
+mv val/ILSVRC2012_val_00004968.JPEG n02095570/
+mv val/ILSVRC2012_val_00004969.JPEG n01496331/
+mv val/ILSVRC2012_val_00004970.JPEG n04285008/
+mv val/ILSVRC2012_val_00004971.JPEG n03075370/
+mv val/ILSVRC2012_val_00004972.JPEG n02815834/
+mv val/ILSVRC2012_val_00004973.JPEG n09256479/
+mv val/ILSVRC2012_val_00004974.JPEG n02092339/
+mv val/ILSVRC2012_val_00004975.JPEG n02808304/
+mv val/ILSVRC2012_val_00004976.JPEG n09428293/
+mv val/ILSVRC2012_val_00004977.JPEG n02101006/
+mv val/ILSVRC2012_val_00004978.JPEG n02412080/
+mv val/ILSVRC2012_val_00004979.JPEG n04285008/
+mv val/ILSVRC2012_val_00004980.JPEG n03954731/
+mv val/ILSVRC2012_val_00004981.JPEG n04311004/
+mv val/ILSVRC2012_val_00004982.JPEG n03476991/
+mv val/ILSVRC2012_val_00004983.JPEG n01518878/
+mv val/ILSVRC2012_val_00004984.JPEG n02687172/
+mv val/ILSVRC2012_val_00004985.JPEG n02342885/
+mv val/ILSVRC2012_val_00004986.JPEG n02346627/
+mv val/ILSVRC2012_val_00004987.JPEG n02883205/
+mv val/ILSVRC2012_val_00004988.JPEG n03457902/
+mv val/ILSVRC2012_val_00004989.JPEG n02097658/
+mv val/ILSVRC2012_val_00004990.JPEG n02504458/
+mv val/ILSVRC2012_val_00004991.JPEG n03930313/
+mv val/ILSVRC2012_val_00004992.JPEG n02087394/
+mv val/ILSVRC2012_val_00004993.JPEG n02802426/
+mv val/ILSVRC2012_val_00004994.JPEG n03272010/
+mv val/ILSVRC2012_val_00004995.JPEG n02102318/
+mv val/ILSVRC2012_val_00004996.JPEG n02091467/
+mv val/ILSVRC2012_val_00004997.JPEG n02099849/
+mv val/ILSVRC2012_val_00004998.JPEG n04552348/
+mv val/ILSVRC2012_val_00004999.JPEG n02443114/
+mv val/ILSVRC2012_val_00005000.JPEG n02276258/
+mv val/ILSVRC2012_val_00005001.JPEG n03642806/
+mv val/ILSVRC2012_val_00005002.JPEG n02342885/
+mv val/ILSVRC2012_val_00005003.JPEG n03916031/
+mv val/ILSVRC2012_val_00005004.JPEG n02125311/
+mv val/ILSVRC2012_val_00005005.JPEG n02837789/
+mv val/ILSVRC2012_val_00005006.JPEG n02130308/
+mv val/ILSVRC2012_val_00005007.JPEG n04509417/
+mv val/ILSVRC2012_val_00005008.JPEG n03207941/
+mv val/ILSVRC2012_val_00005009.JPEG n03877845/
+mv val/ILSVRC2012_val_00005010.JPEG n13052670/
+mv val/ILSVRC2012_val_00005011.JPEG n02317335/
+mv val/ILSVRC2012_val_00005012.JPEG n03444034/
+mv val/ILSVRC2012_val_00005013.JPEG n03179701/
+mv val/ILSVRC2012_val_00005014.JPEG n04371774/
+mv val/ILSVRC2012_val_00005015.JPEG n03924679/
+mv val/ILSVRC2012_val_00005016.JPEG n02950826/
+mv val/ILSVRC2012_val_00005017.JPEG n02110958/
+mv val/ILSVRC2012_val_00005018.JPEG n02113978/
+mv val/ILSVRC2012_val_00005019.JPEG n02109961/
+mv val/ILSVRC2012_val_00005020.JPEG n02363005/
+mv val/ILSVRC2012_val_00005021.JPEG n02090622/
+mv val/ILSVRC2012_val_00005022.JPEG n07930864/
+mv val/ILSVRC2012_val_00005023.JPEG n03857828/
+mv val/ILSVRC2012_val_00005024.JPEG n03763968/
+mv val/ILSVRC2012_val_00005025.JPEG n07684084/
+mv val/ILSVRC2012_val_00005026.JPEG n02497673/
+mv val/ILSVRC2012_val_00005027.JPEG n02102480/
+mv val/ILSVRC2012_val_00005028.JPEG n04275548/
+mv val/ILSVRC2012_val_00005029.JPEG n04264628/
+mv val/ILSVRC2012_val_00005030.JPEG n02058221/
+mv val/ILSVRC2012_val_00005031.JPEG n01687978/
+mv val/ILSVRC2012_val_00005032.JPEG n02877765/
+mv val/ILSVRC2012_val_00005033.JPEG n01748264/
+mv val/ILSVRC2012_val_00005034.JPEG n02028035/
+mv val/ILSVRC2012_val_00005035.JPEG n02909870/
+mv val/ILSVRC2012_val_00005036.JPEG n04332243/
+mv val/ILSVRC2012_val_00005037.JPEG n09835506/
+mv val/ILSVRC2012_val_00005038.JPEG n04192698/
+mv val/ILSVRC2012_val_00005039.JPEG n03877845/
+mv val/ILSVRC2012_val_00005040.JPEG n03832673/
+mv val/ILSVRC2012_val_00005041.JPEG n04179913/
+mv val/ILSVRC2012_val_00005042.JPEG n03623198/
+mv val/ILSVRC2012_val_00005043.JPEG n02107908/
+mv val/ILSVRC2012_val_00005044.JPEG n04548362/
+mv val/ILSVRC2012_val_00005045.JPEG n01641577/
+mv val/ILSVRC2012_val_00005046.JPEG n02992211/
+mv val/ILSVRC2012_val_00005047.JPEG n04326547/
+mv val/ILSVRC2012_val_00005048.JPEG n02783161/
+mv val/ILSVRC2012_val_00005049.JPEG n03743016/
+mv val/ILSVRC2012_val_00005050.JPEG n01729977/
+mv val/ILSVRC2012_val_00005051.JPEG n04146614/
+mv val/ILSVRC2012_val_00005052.JPEG n01695060/
+mv val/ILSVRC2012_val_00005053.JPEG n03649909/
+mv val/ILSVRC2012_val_00005054.JPEG n02087394/
+mv val/ILSVRC2012_val_00005055.JPEG n03424325/
+mv val/ILSVRC2012_val_00005056.JPEG n01688243/
+mv val/ILSVRC2012_val_00005057.JPEG n03223299/
+mv val/ILSVRC2012_val_00005058.JPEG n01914609/
+mv val/ILSVRC2012_val_00005059.JPEG n02091032/
+mv val/ILSVRC2012_val_00005060.JPEG n02095570/
+mv val/ILSVRC2012_val_00005061.JPEG n07720875/
+mv val/ILSVRC2012_val_00005062.JPEG n02606052/
+mv val/ILSVRC2012_val_00005063.JPEG n03584829/
+mv val/ILSVRC2012_val_00005064.JPEG n02110185/
+mv val/ILSVRC2012_val_00005065.JPEG n03220513/
+mv val/ILSVRC2012_val_00005066.JPEG n07745940/
+mv val/ILSVRC2012_val_00005067.JPEG n01824575/
+mv val/ILSVRC2012_val_00005068.JPEG n02099601/
+mv val/ILSVRC2012_val_00005069.JPEG n11939491/
+mv val/ILSVRC2012_val_00005070.JPEG n07749582/
+mv val/ILSVRC2012_val_00005071.JPEG n03457902/
+mv val/ILSVRC2012_val_00005072.JPEG n01784675/
+mv val/ILSVRC2012_val_00005073.JPEG n02112018/
+mv val/ILSVRC2012_val_00005074.JPEG n03733131/
+mv val/ILSVRC2012_val_00005075.JPEG n04328186/
+mv val/ILSVRC2012_val_00005076.JPEG n04037443/
+mv val/ILSVRC2012_val_00005077.JPEG n03717622/
+mv val/ILSVRC2012_val_00005078.JPEG n01694178/
+mv val/ILSVRC2012_val_00005079.JPEG n02871525/
+mv val/ILSVRC2012_val_00005080.JPEG n02808440/
+mv val/ILSVRC2012_val_00005081.JPEG n04560804/
+mv val/ILSVRC2012_val_00005082.JPEG n02097474/
+mv val/ILSVRC2012_val_00005083.JPEG n02137549/
+mv val/ILSVRC2012_val_00005084.JPEG n01981276/
+mv val/ILSVRC2012_val_00005085.JPEG n02443114/
+mv val/ILSVRC2012_val_00005086.JPEG n02101006/
+mv val/ILSVRC2012_val_00005087.JPEG n04550184/
+mv val/ILSVRC2012_val_00005088.JPEG n12985857/
+mv val/ILSVRC2012_val_00005089.JPEG n02236044/
+mv val/ILSVRC2012_val_00005090.JPEG n02488291/
+mv val/ILSVRC2012_val_00005091.JPEG n04532106/
+mv val/ILSVRC2012_val_00005092.JPEG n03895866/
+mv val/ILSVRC2012_val_00005093.JPEG n03617480/
+mv val/ILSVRC2012_val_00005094.JPEG n03417042/
+mv val/ILSVRC2012_val_00005095.JPEG n03903868/
+mv val/ILSVRC2012_val_00005096.JPEG n03584254/
+mv val/ILSVRC2012_val_00005097.JPEG n02389026/
+mv val/ILSVRC2012_val_00005098.JPEG n04435653/
+mv val/ILSVRC2012_val_00005099.JPEG n02492035/
+mv val/ILSVRC2012_val_00005100.JPEG n01796340/
+mv val/ILSVRC2012_val_00005101.JPEG n03447721/
+mv val/ILSVRC2012_val_00005102.JPEG n03447447/
+mv val/ILSVRC2012_val_00005103.JPEG n03595614/
+mv val/ILSVRC2012_val_00005104.JPEG n04579145/
+mv val/ILSVRC2012_val_00005105.JPEG n02777292/
+mv val/ILSVRC2012_val_00005106.JPEG n04147183/
+mv val/ILSVRC2012_val_00005107.JPEG n02006656/
+mv val/ILSVRC2012_val_00005108.JPEG n03843555/
+mv val/ILSVRC2012_val_00005109.JPEG n02504458/
+mv val/ILSVRC2012_val_00005110.JPEG n03444034/
+mv val/ILSVRC2012_val_00005111.JPEG n03673027/
+mv val/ILSVRC2012_val_00005112.JPEG n04417672/
+mv val/ILSVRC2012_val_00005113.JPEG n10148035/
+mv val/ILSVRC2012_val_00005114.JPEG n04179913/
+mv val/ILSVRC2012_val_00005115.JPEG n03792972/
+mv val/ILSVRC2012_val_00005116.JPEG n04552348/
+mv val/ILSVRC2012_val_00005117.JPEG n02281406/
+mv val/ILSVRC2012_val_00005118.JPEG n02326432/
+mv val/ILSVRC2012_val_00005119.JPEG n02493509/
+mv val/ILSVRC2012_val_00005120.JPEG n03314780/
+mv val/ILSVRC2012_val_00005121.JPEG n03485407/
+mv val/ILSVRC2012_val_00005122.JPEG n01980166/
+mv val/ILSVRC2012_val_00005123.JPEG n04442312/
+mv val/ILSVRC2012_val_00005124.JPEG n03602883/
+mv val/ILSVRC2012_val_00005125.JPEG n01986214/
+mv val/ILSVRC2012_val_00005126.JPEG n02108915/
+mv val/ILSVRC2012_val_00005127.JPEG n02492660/
+mv val/ILSVRC2012_val_00005128.JPEG n03384352/
+mv val/ILSVRC2012_val_00005129.JPEG n04367480/
+mv val/ILSVRC2012_val_00005130.JPEG n04467665/
+mv val/ILSVRC2012_val_00005131.JPEG n02814860/
+mv val/ILSVRC2012_val_00005132.JPEG n01728572/
+mv val/ILSVRC2012_val_00005133.JPEG n03733281/
+mv val/ILSVRC2012_val_00005134.JPEG n03216828/
+mv val/ILSVRC2012_val_00005135.JPEG n02494079/
+mv val/ILSVRC2012_val_00005136.JPEG n03733805/
+mv val/ILSVRC2012_val_00005137.JPEG n02279972/
+mv val/ILSVRC2012_val_00005138.JPEG n01692333/
+mv val/ILSVRC2012_val_00005139.JPEG n02091635/
+mv val/ILSVRC2012_val_00005140.JPEG n04487081/
+mv val/ILSVRC2012_val_00005141.JPEG n03866082/
+mv val/ILSVRC2012_val_00005142.JPEG n03208938/
+mv val/ILSVRC2012_val_00005143.JPEG n07714990/
+mv val/ILSVRC2012_val_00005144.JPEG n02906734/
+mv val/ILSVRC2012_val_00005145.JPEG n02807133/
+mv val/ILSVRC2012_val_00005146.JPEG n02095570/
+mv val/ILSVRC2012_val_00005147.JPEG n03594945/
+mv val/ILSVRC2012_val_00005148.JPEG n03492542/
+mv val/ILSVRC2012_val_00005149.JPEG n02442845/
+mv val/ILSVRC2012_val_00005150.JPEG n01833805/
+mv val/ILSVRC2012_val_00005151.JPEG n02395406/
+mv val/ILSVRC2012_val_00005152.JPEG n06874185/
+mv val/ILSVRC2012_val_00005153.JPEG n02490219/
+mv val/ILSVRC2012_val_00005154.JPEG n02071294/
+mv val/ILSVRC2012_val_00005155.JPEG n02447366/
+mv val/ILSVRC2012_val_00005156.JPEG n01537544/
+mv val/ILSVRC2012_val_00005157.JPEG n02281787/
+mv val/ILSVRC2012_val_00005158.JPEG n02268443/
+mv val/ILSVRC2012_val_00005159.JPEG n03775546/
+mv val/ILSVRC2012_val_00005160.JPEG n04429376/
+mv val/ILSVRC2012_val_00005161.JPEG n03832673/
+mv val/ILSVRC2012_val_00005162.JPEG n04398044/
+mv val/ILSVRC2012_val_00005163.JPEG n04370456/
+mv val/ILSVRC2012_val_00005164.JPEG n02128757/
+mv val/ILSVRC2012_val_00005165.JPEG n04162706/
+mv val/ILSVRC2012_val_00005166.JPEG n04146614/
+mv val/ILSVRC2012_val_00005167.JPEG n04482393/
+mv val/ILSVRC2012_val_00005168.JPEG n07860988/
+mv val/ILSVRC2012_val_00005169.JPEG n02167151/
+mv val/ILSVRC2012_val_00005170.JPEG n02095889/
+mv val/ILSVRC2012_val_00005171.JPEG n02487347/
+mv val/ILSVRC2012_val_00005172.JPEG n01632777/
+mv val/ILSVRC2012_val_00005173.JPEG n02992211/
+mv val/ILSVRC2012_val_00005174.JPEG n02097658/
+mv val/ILSVRC2012_val_00005175.JPEG n02107683/
+mv val/ILSVRC2012_val_00005176.JPEG n03980874/
+mv val/ILSVRC2012_val_00005177.JPEG n07753592/
+mv val/ILSVRC2012_val_00005178.JPEG n02037110/
+mv val/ILSVRC2012_val_00005179.JPEG n03388183/
+mv val/ILSVRC2012_val_00005180.JPEG n01695060/
+mv val/ILSVRC2012_val_00005181.JPEG n04258138/
+mv val/ILSVRC2012_val_00005182.JPEG n02802426/
+mv val/ILSVRC2012_val_00005183.JPEG n03425413/
+mv val/ILSVRC2012_val_00005184.JPEG n02403003/
+mv val/ILSVRC2012_val_00005185.JPEG n03868242/
+mv val/ILSVRC2012_val_00005186.JPEG n02006656/
+mv val/ILSVRC2012_val_00005187.JPEG n02667093/
+mv val/ILSVRC2012_val_00005188.JPEG n02607072/
+mv val/ILSVRC2012_val_00005189.JPEG n02093647/
+mv val/ILSVRC2012_val_00005190.JPEG n02536864/
+mv val/ILSVRC2012_val_00005191.JPEG n04591713/
+mv val/ILSVRC2012_val_00005192.JPEG n02669723/
+mv val/ILSVRC2012_val_00005193.JPEG n03733805/
+mv val/ILSVRC2012_val_00005194.JPEG n03259280/
+mv val/ILSVRC2012_val_00005195.JPEG n03709823/
+mv val/ILSVRC2012_val_00005196.JPEG n04483307/
+mv val/ILSVRC2012_val_00005197.JPEG n03877472/
+mv val/ILSVRC2012_val_00005198.JPEG n02113023/
+mv val/ILSVRC2012_val_00005199.JPEG n04133789/
+mv val/ILSVRC2012_val_00005200.JPEG n06359193/
+mv val/ILSVRC2012_val_00005201.JPEG n03903868/
+mv val/ILSVRC2012_val_00005202.JPEG n03089624/
+mv val/ILSVRC2012_val_00005203.JPEG n02013706/
+mv val/ILSVRC2012_val_00005204.JPEG n04266014/
+mv val/ILSVRC2012_val_00005205.JPEG n02504013/
+mv val/ILSVRC2012_val_00005206.JPEG n02101006/
+mv val/ILSVRC2012_val_00005207.JPEG n02124075/
+mv val/ILSVRC2012_val_00005208.JPEG n01774750/
+mv val/ILSVRC2012_val_00005209.JPEG n02112350/
+mv val/ILSVRC2012_val_00005210.JPEG n02526121/
+mv val/ILSVRC2012_val_00005211.JPEG n03485407/
+mv val/ILSVRC2012_val_00005212.JPEG n03496892/
+mv val/ILSVRC2012_val_00005213.JPEG n02655020/
+mv val/ILSVRC2012_val_00005214.JPEG n07714571/
+mv val/ILSVRC2012_val_00005215.JPEG n02087394/
+mv val/ILSVRC2012_val_00005216.JPEG n03160309/
+mv val/ILSVRC2012_val_00005217.JPEG n02091831/
+mv val/ILSVRC2012_val_00005218.JPEG n03047690/
+mv val/ILSVRC2012_val_00005219.JPEG n04612504/
+mv val/ILSVRC2012_val_00005220.JPEG n02859443/
+mv val/ILSVRC2012_val_00005221.JPEG n04033995/
+mv val/ILSVRC2012_val_00005222.JPEG n02950826/
+mv val/ILSVRC2012_val_00005223.JPEG n03187595/
+mv val/ILSVRC2012_val_00005224.JPEG n01592084/
+mv val/ILSVRC2012_val_00005225.JPEG n07892512/
+mv val/ILSVRC2012_val_00005226.JPEG n04507155/
+mv val/ILSVRC2012_val_00005227.JPEG n01692333/
+mv val/ILSVRC2012_val_00005228.JPEG n01981276/
+mv val/ILSVRC2012_val_00005229.JPEG n02823750/
+mv val/ILSVRC2012_val_00005230.JPEG n04251144/
+mv val/ILSVRC2012_val_00005231.JPEG n04548362/
+mv val/ILSVRC2012_val_00005232.JPEG n07565083/
+mv val/ILSVRC2012_val_00005233.JPEG n04209133/
+mv val/ILSVRC2012_val_00005234.JPEG n01877812/
+mv val/ILSVRC2012_val_00005235.JPEG n04486054/
+mv val/ILSVRC2012_val_00005236.JPEG n09421951/
+mv val/ILSVRC2012_val_00005237.JPEG n02231487/
+mv val/ILSVRC2012_val_00005238.JPEG n02113799/
+mv val/ILSVRC2012_val_00005239.JPEG n02098413/
+mv val/ILSVRC2012_val_00005240.JPEG n04081281/
+mv val/ILSVRC2012_val_00005241.JPEG n02999410/
+mv val/ILSVRC2012_val_00005242.JPEG n02107312/
+mv val/ILSVRC2012_val_00005243.JPEG n02346627/
+mv val/ILSVRC2012_val_00005244.JPEG n01675722/
+mv val/ILSVRC2012_val_00005245.JPEG n02795169/
+mv val/ILSVRC2012_val_00005246.JPEG n03649909/
+mv val/ILSVRC2012_val_00005247.JPEG n04090263/
+mv val/ILSVRC2012_val_00005248.JPEG n03871628/
+mv val/ILSVRC2012_val_00005249.JPEG n01877812/
+mv val/ILSVRC2012_val_00005250.JPEG n03670208/
+mv val/ILSVRC2012_val_00005251.JPEG n03866082/
+mv val/ILSVRC2012_val_00005252.JPEG n03496892/
+mv val/ILSVRC2012_val_00005253.JPEG n07248320/
+mv val/ILSVRC2012_val_00005254.JPEG n04162706/
+mv val/ILSVRC2012_val_00005255.JPEG n02098413/
+mv val/ILSVRC2012_val_00005256.JPEG n04069434/
+mv val/ILSVRC2012_val_00005257.JPEG n03938244/
+mv val/ILSVRC2012_val_00005258.JPEG n02101006/
+mv val/ILSVRC2012_val_00005259.JPEG n02325366/
+mv val/ILSVRC2012_val_00005260.JPEG n03388549/
+mv val/ILSVRC2012_val_00005261.JPEG n03393912/
+mv val/ILSVRC2012_val_00005262.JPEG n01739381/
+mv val/ILSVRC2012_val_00005263.JPEG n02108089/
+mv val/ILSVRC2012_val_00005264.JPEG n03000134/
+mv val/ILSVRC2012_val_00005265.JPEG n03124170/
+mv val/ILSVRC2012_val_00005266.JPEG n02037110/
+mv val/ILSVRC2012_val_00005267.JPEG n02098105/
+mv val/ILSVRC2012_val_00005268.JPEG n01986214/
+mv val/ILSVRC2012_val_00005269.JPEG n03314780/
+mv val/ILSVRC2012_val_00005270.JPEG n10148035/
+mv val/ILSVRC2012_val_00005271.JPEG n04200800/
+mv val/ILSVRC2012_val_00005272.JPEG n03457902/
+mv val/ILSVRC2012_val_00005273.JPEG n02091831/
+mv val/ILSVRC2012_val_00005274.JPEG n02835271/
+mv val/ILSVRC2012_val_00005275.JPEG n03642806/
+mv val/ILSVRC2012_val_00005276.JPEG n02101388/
+mv val/ILSVRC2012_val_00005277.JPEG n02128757/
+mv val/ILSVRC2012_val_00005278.JPEG n04004767/
+mv val/ILSVRC2012_val_00005279.JPEG n02091635/
+mv val/ILSVRC2012_val_00005280.JPEG n04311004/
+mv val/ILSVRC2012_val_00005281.JPEG n04328186/
+mv val/ILSVRC2012_val_00005282.JPEG n01829413/
+mv val/ILSVRC2012_val_00005283.JPEG n02108000/
+mv val/ILSVRC2012_val_00005284.JPEG n03877845/
+mv val/ILSVRC2012_val_00005285.JPEG n03935335/
+mv val/ILSVRC2012_val_00005286.JPEG n01744401/
+mv val/ILSVRC2012_val_00005287.JPEG n01531178/
+mv val/ILSVRC2012_val_00005288.JPEG n13044778/
+mv val/ILSVRC2012_val_00005289.JPEG n02699494/
+mv val/ILSVRC2012_val_00005290.JPEG n01775062/
+mv val/ILSVRC2012_val_00005291.JPEG n02088364/
+mv val/ILSVRC2012_val_00005292.JPEG n04239074/
+mv val/ILSVRC2012_val_00005293.JPEG n03781244/
+mv val/ILSVRC2012_val_00005294.JPEG n02442845/
+mv val/ILSVRC2012_val_00005295.JPEG n03028079/
+mv val/ILSVRC2012_val_00005296.JPEG n09421951/
+mv val/ILSVRC2012_val_00005297.JPEG n12768682/
+mv val/ILSVRC2012_val_00005298.JPEG n02454379/
+mv val/ILSVRC2012_val_00005299.JPEG n03065424/
+mv val/ILSVRC2012_val_00005300.JPEG n02113023/
+mv val/ILSVRC2012_val_00005301.JPEG n01873310/
+mv val/ILSVRC2012_val_00005302.JPEG n03594945/
+mv val/ILSVRC2012_val_00005303.JPEG n03792782/
+mv val/ILSVRC2012_val_00005304.JPEG n03529860/
+mv val/ILSVRC2012_val_00005305.JPEG n02174001/
+mv val/ILSVRC2012_val_00005306.JPEG n02487347/
+mv val/ILSVRC2012_val_00005307.JPEG n01692333/
+mv val/ILSVRC2012_val_00005308.JPEG n02837789/
+mv val/ILSVRC2012_val_00005309.JPEG n04487394/
+mv val/ILSVRC2012_val_00005310.JPEG n02509815/
+mv val/ILSVRC2012_val_00005311.JPEG n03970156/
+mv val/ILSVRC2012_val_00005312.JPEG n02445715/
+mv val/ILSVRC2012_val_00005313.JPEG n02666196/
+mv val/ILSVRC2012_val_00005314.JPEG n02009912/
+mv val/ILSVRC2012_val_00005315.JPEG n01797886/
+mv val/ILSVRC2012_val_00005316.JPEG n07583066/
+mv val/ILSVRC2012_val_00005317.JPEG n02111500/
+mv val/ILSVRC2012_val_00005318.JPEG n03461385/
+mv val/ILSVRC2012_val_00005319.JPEG n04371774/
+mv val/ILSVRC2012_val_00005320.JPEG n04296562/
+mv val/ILSVRC2012_val_00005321.JPEG n02978881/
+mv val/ILSVRC2012_val_00005322.JPEG n02066245/
+mv val/ILSVRC2012_val_00005323.JPEG n02129604/
+mv val/ILSVRC2012_val_00005324.JPEG n03761084/
+mv val/ILSVRC2012_val_00005325.JPEG n09229709/
+mv val/ILSVRC2012_val_00005326.JPEG n01774750/
+mv val/ILSVRC2012_val_00005327.JPEG n02108915/
+mv val/ILSVRC2012_val_00005328.JPEG n01797886/
+mv val/ILSVRC2012_val_00005329.JPEG n04482393/
+mv val/ILSVRC2012_val_00005330.JPEG n03792782/
+mv val/ILSVRC2012_val_00005331.JPEG n02095314/
+mv val/ILSVRC2012_val_00005332.JPEG n01693334/
+mv val/ILSVRC2012_val_00005333.JPEG n04560804/
+mv val/ILSVRC2012_val_00005334.JPEG n04376876/
+mv val/ILSVRC2012_val_00005335.JPEG n07718747/
+mv val/ILSVRC2012_val_00005336.JPEG n01532829/
+mv val/ILSVRC2012_val_00005337.JPEG n03888605/
+mv val/ILSVRC2012_val_00005338.JPEG n02980441/
+mv val/ILSVRC2012_val_00005339.JPEG n01494475/
+mv val/ILSVRC2012_val_00005340.JPEG n02093754/
+mv val/ILSVRC2012_val_00005341.JPEG n07802026/
+mv val/ILSVRC2012_val_00005342.JPEG n04562935/
+mv val/ILSVRC2012_val_00005343.JPEG n02165456/
+mv val/ILSVRC2012_val_00005344.JPEG n02356798/
+mv val/ILSVRC2012_val_00005345.JPEG n03977966/
+mv val/ILSVRC2012_val_00005346.JPEG n03124170/
+mv val/ILSVRC2012_val_00005347.JPEG n02797295/
+mv val/ILSVRC2012_val_00005348.JPEG n04201297/
+mv val/ILSVRC2012_val_00005349.JPEG n04392985/
+mv val/ILSVRC2012_val_00005350.JPEG n04579432/
+mv val/ILSVRC2012_val_00005351.JPEG n02106550/
+mv val/ILSVRC2012_val_00005352.JPEG n02782093/
+mv val/ILSVRC2012_val_00005353.JPEG n04252077/
+mv val/ILSVRC2012_val_00005354.JPEG n04326547/
+mv val/ILSVRC2012_val_00005355.JPEG n02454379/
+mv val/ILSVRC2012_val_00005356.JPEG n02437312/
+mv val/ILSVRC2012_val_00005357.JPEG n01729977/
+mv val/ILSVRC2012_val_00005358.JPEG n02123045/
+mv val/ILSVRC2012_val_00005359.JPEG n04229816/
+mv val/ILSVRC2012_val_00005360.JPEG n02077923/
+mv val/ILSVRC2012_val_00005361.JPEG n03788195/
+mv val/ILSVRC2012_val_00005362.JPEG n02124075/
+mv val/ILSVRC2012_val_00005363.JPEG n02051845/
+mv val/ILSVRC2012_val_00005364.JPEG n02087394/
+mv val/ILSVRC2012_val_00005365.JPEG n02096437/
+mv val/ILSVRC2012_val_00005366.JPEG n02403003/
+mv val/ILSVRC2012_val_00005367.JPEG n02769748/
+mv val/ILSVRC2012_val_00005368.JPEG n04392985/
+mv val/ILSVRC2012_val_00005369.JPEG n02134084/
+mv val/ILSVRC2012_val_00005370.JPEG n02840245/
+mv val/ILSVRC2012_val_00005371.JPEG n04273569/
+mv val/ILSVRC2012_val_00005372.JPEG n03125729/
+mv val/ILSVRC2012_val_00005373.JPEG n03967562/
+mv val/ILSVRC2012_val_00005374.JPEG n03961711/
+mv val/ILSVRC2012_val_00005375.JPEG n03961711/
+mv val/ILSVRC2012_val_00005376.JPEG n07579787/
+mv val/ILSVRC2012_val_00005377.JPEG n04270147/
+mv val/ILSVRC2012_val_00005378.JPEG n02965783/
+mv val/ILSVRC2012_val_00005379.JPEG n02006656/
+mv val/ILSVRC2012_val_00005380.JPEG n03995372/
+mv val/ILSVRC2012_val_00005381.JPEG n03444034/
+mv val/ILSVRC2012_val_00005382.JPEG n02814860/
+mv val/ILSVRC2012_val_00005383.JPEG n04070727/
+mv val/ILSVRC2012_val_00005384.JPEG n04208210/
+mv val/ILSVRC2012_val_00005385.JPEG n04486054/
+mv val/ILSVRC2012_val_00005386.JPEG n03729826/
+mv val/ILSVRC2012_val_00005387.JPEG n02120079/
+mv val/ILSVRC2012_val_00005388.JPEG n04591713/
+mv val/ILSVRC2012_val_00005389.JPEG n02808304/
+mv val/ILSVRC2012_val_00005390.JPEG n02105641/
+mv val/ILSVRC2012_val_00005391.JPEG n03770439/
+mv val/ILSVRC2012_val_00005392.JPEG n04228054/
+mv val/ILSVRC2012_val_00005393.JPEG n02094114/
+mv val/ILSVRC2012_val_00005394.JPEG n03400231/
+mv val/ILSVRC2012_val_00005395.JPEG n02106166/
+mv val/ILSVRC2012_val_00005396.JPEG n03868863/
+mv val/ILSVRC2012_val_00005397.JPEG n02089078/
+mv val/ILSVRC2012_val_00005398.JPEG n03954731/
+mv val/ILSVRC2012_val_00005399.JPEG n04355338/
+mv val/ILSVRC2012_val_00005400.JPEG n02669723/
+mv val/ILSVRC2012_val_00005401.JPEG n04200800/
+mv val/ILSVRC2012_val_00005402.JPEG n04266014/
+mv val/ILSVRC2012_val_00005403.JPEG n03929855/
+mv val/ILSVRC2012_val_00005404.JPEG n02107312/
+mv val/ILSVRC2012_val_00005405.JPEG n04023962/
+mv val/ILSVRC2012_val_00005406.JPEG n03958227/
+mv val/ILSVRC2012_val_00005407.JPEG n01677366/
+mv val/ILSVRC2012_val_00005408.JPEG n02791124/
+mv val/ILSVRC2012_val_00005409.JPEG n03485407/
+mv val/ILSVRC2012_val_00005410.JPEG n02129165/
+mv val/ILSVRC2012_val_00005411.JPEG n03075370/
+mv val/ILSVRC2012_val_00005412.JPEG n01558993/
+mv val/ILSVRC2012_val_00005413.JPEG n02988304/
+mv val/ILSVRC2012_val_00005414.JPEG n04355933/
+mv val/ILSVRC2012_val_00005415.JPEG n02134418/
+mv val/ILSVRC2012_val_00005416.JPEG n01675722/
+mv val/ILSVRC2012_val_00005417.JPEG n07920052/
+mv val/ILSVRC2012_val_00005418.JPEG n02321529/
+mv val/ILSVRC2012_val_00005419.JPEG n02018795/
+mv val/ILSVRC2012_val_00005420.JPEG n03992509/
+mv val/ILSVRC2012_val_00005421.JPEG n03868863/
+mv val/ILSVRC2012_val_00005422.JPEG n03796401/
+mv val/ILSVRC2012_val_00005423.JPEG n02892767/
+mv val/ILSVRC2012_val_00005424.JPEG n04254120/
+mv val/ILSVRC2012_val_00005425.JPEG n03785016/
+mv val/ILSVRC2012_val_00005426.JPEG n04591157/
+mv val/ILSVRC2012_val_00005427.JPEG n01518878/
+mv val/ILSVRC2012_val_00005428.JPEG n06794110/
+mv val/ILSVRC2012_val_00005429.JPEG n01930112/
+mv val/ILSVRC2012_val_00005430.JPEG n02951585/
+mv val/ILSVRC2012_val_00005431.JPEG n07711569/
+mv val/ILSVRC2012_val_00005432.JPEG n01496331/
+mv val/ILSVRC2012_val_00005433.JPEG n02788148/
+mv val/ILSVRC2012_val_00005434.JPEG n03207743/
+mv val/ILSVRC2012_val_00005435.JPEG n03794056/
+mv val/ILSVRC2012_val_00005436.JPEG n04332243/
+mv val/ILSVRC2012_val_00005437.JPEG n04356056/
+mv val/ILSVRC2012_val_00005438.JPEG n07873807/
+mv val/ILSVRC2012_val_00005439.JPEG n02667093/
+mv val/ILSVRC2012_val_00005440.JPEG n03271574/
+mv val/ILSVRC2012_val_00005441.JPEG n02794156/
+mv val/ILSVRC2012_val_00005442.JPEG n02493793/
+mv val/ILSVRC2012_val_00005443.JPEG n03527444/
+mv val/ILSVRC2012_val_00005444.JPEG n02951585/
+mv val/ILSVRC2012_val_00005445.JPEG n03240683/
+mv val/ILSVRC2012_val_00005446.JPEG n02109961/
+mv val/ILSVRC2012_val_00005447.JPEG n01795545/
+mv val/ILSVRC2012_val_00005448.JPEG n03599486/
+mv val/ILSVRC2012_val_00005449.JPEG n04599235/
+mv val/ILSVRC2012_val_00005450.JPEG n01644900/
+mv val/ILSVRC2012_val_00005451.JPEG n07880968/
+mv val/ILSVRC2012_val_00005452.JPEG n04317175/
+mv val/ILSVRC2012_val_00005453.JPEG n02840245/
+mv val/ILSVRC2012_val_00005454.JPEG n02408429/
+mv val/ILSVRC2012_val_00005455.JPEG n07248320/
+mv val/ILSVRC2012_val_00005456.JPEG n04285008/
+mv val/ILSVRC2012_val_00005457.JPEG n02096585/
+mv val/ILSVRC2012_val_00005458.JPEG n02704792/
+mv val/ILSVRC2012_val_00005459.JPEG n04560804/
+mv val/ILSVRC2012_val_00005460.JPEG n03785016/
+mv val/ILSVRC2012_val_00005461.JPEG n02927161/
+mv val/ILSVRC2012_val_00005462.JPEG n03697007/
+mv val/ILSVRC2012_val_00005463.JPEG n07930864/
+mv val/ILSVRC2012_val_00005464.JPEG n07248320/
+mv val/ILSVRC2012_val_00005465.JPEG n02028035/
+mv val/ILSVRC2012_val_00005466.JPEG n02123597/
+mv val/ILSVRC2012_val_00005467.JPEG n02676566/
+mv val/ILSVRC2012_val_00005468.JPEG n07583066/
+mv val/ILSVRC2012_val_00005469.JPEG n02871525/
+mv val/ILSVRC2012_val_00005470.JPEG n02134084/
+mv val/ILSVRC2012_val_00005471.JPEG n02091032/
+mv val/ILSVRC2012_val_00005472.JPEG n04462240/
+mv val/ILSVRC2012_val_00005473.JPEG n02117135/
+mv val/ILSVRC2012_val_00005474.JPEG n02009912/
+mv val/ILSVRC2012_val_00005475.JPEG n09193705/
+mv val/ILSVRC2012_val_00005476.JPEG n09472597/
+mv val/ILSVRC2012_val_00005477.JPEG n02834397/
+mv val/ILSVRC2012_val_00005478.JPEG n03764736/
+mv val/ILSVRC2012_val_00005479.JPEG n01753488/
+mv val/ILSVRC2012_val_00005480.JPEG n03895866/
+mv val/ILSVRC2012_val_00005481.JPEG n02112018/
+mv val/ILSVRC2012_val_00005482.JPEG n02165105/
+mv val/ILSVRC2012_val_00005483.JPEG n02837789/
+mv val/ILSVRC2012_val_00005484.JPEG n03457902/
+mv val/ILSVRC2012_val_00005485.JPEG n04522168/
+mv val/ILSVRC2012_val_00005486.JPEG n04023962/
+mv val/ILSVRC2012_val_00005487.JPEG n04536866/
+mv val/ILSVRC2012_val_00005488.JPEG n04005630/
+mv val/ILSVRC2012_val_00005489.JPEG n02110627/
+mv val/ILSVRC2012_val_00005490.JPEG n02708093/
+mv val/ILSVRC2012_val_00005491.JPEG n04554684/
+mv val/ILSVRC2012_val_00005492.JPEG n01514668/
+mv val/ILSVRC2012_val_00005493.JPEG n02090379/
+mv val/ILSVRC2012_val_00005494.JPEG n07836838/
+mv val/ILSVRC2012_val_00005495.JPEG n02108089/
+mv val/ILSVRC2012_val_00005496.JPEG n03095699/
+mv val/ILSVRC2012_val_00005497.JPEG n04366367/
+mv val/ILSVRC2012_val_00005498.JPEG n04039381/
+mv val/ILSVRC2012_val_00005499.JPEG n07802026/
+mv val/ILSVRC2012_val_00005500.JPEG n03100240/
+mv val/ILSVRC2012_val_00005501.JPEG n03255030/
+mv val/ILSVRC2012_val_00005502.JPEG n04235860/
+mv val/ILSVRC2012_val_00005503.JPEG n02980441/
+mv val/ILSVRC2012_val_00005504.JPEG n03218198/
+mv val/ILSVRC2012_val_00005505.JPEG n01514668/
+mv val/ILSVRC2012_val_00005506.JPEG n03000684/
+mv val/ILSVRC2012_val_00005507.JPEG n02088094/
+mv val/ILSVRC2012_val_00005508.JPEG n02815834/
+mv val/ILSVRC2012_val_00005509.JPEG n03657121/
+mv val/ILSVRC2012_val_00005510.JPEG n03891251/
+mv val/ILSVRC2012_val_00005511.JPEG n02808440/
+mv val/ILSVRC2012_val_00005512.JPEG n02916936/
+mv val/ILSVRC2012_val_00005513.JPEG n03661043/
+mv val/ILSVRC2012_val_00005514.JPEG n04243546/
+mv val/ILSVRC2012_val_00005515.JPEG n04065272/
+mv val/ILSVRC2012_val_00005516.JPEG n03666591/
+mv val/ILSVRC2012_val_00005517.JPEG n04604644/
+mv val/ILSVRC2012_val_00005518.JPEG n04509417/
+mv val/ILSVRC2012_val_00005519.JPEG n03937543/
+mv val/ILSVRC2012_val_00005520.JPEG n04509417/
+mv val/ILSVRC2012_val_00005521.JPEG n02109961/
+mv val/ILSVRC2012_val_00005522.JPEG n04251144/
+mv val/ILSVRC2012_val_00005523.JPEG n02869837/
+mv val/ILSVRC2012_val_00005524.JPEG n02113712/
+mv val/ILSVRC2012_val_00005525.JPEG n02492660/
+mv val/ILSVRC2012_val_00005526.JPEG n02841315/
+mv val/ILSVRC2012_val_00005527.JPEG n07734744/
+mv val/ILSVRC2012_val_00005528.JPEG n04456115/
+mv val/ILSVRC2012_val_00005529.JPEG n02640242/
+mv val/ILSVRC2012_val_00005530.JPEG n03929855/
+mv val/ILSVRC2012_val_00005531.JPEG n04266014/
+mv val/ILSVRC2012_val_00005532.JPEG n01644900/
+mv val/ILSVRC2012_val_00005533.JPEG n02807133/
+mv val/ILSVRC2012_val_00005534.JPEG n03814639/
+mv val/ILSVRC2012_val_00005535.JPEG n01514859/
+mv val/ILSVRC2012_val_00005536.JPEG n01784675/
+mv val/ILSVRC2012_val_00005537.JPEG n04023962/
+mv val/ILSVRC2012_val_00005538.JPEG n02256656/
+mv val/ILSVRC2012_val_00005539.JPEG n01695060/
+mv val/ILSVRC2012_val_00005540.JPEG n03532672/
+mv val/ILSVRC2012_val_00005541.JPEG n04070727/
+mv val/ILSVRC2012_val_00005542.JPEG n03742115/
+mv val/ILSVRC2012_val_00005543.JPEG n03482405/
+mv val/ILSVRC2012_val_00005544.JPEG n01773797/
+mv val/ILSVRC2012_val_00005545.JPEG n03388183/
+mv val/ILSVRC2012_val_00005546.JPEG n03792782/
+mv val/ILSVRC2012_val_00005547.JPEG n09246464/
+mv val/ILSVRC2012_val_00005548.JPEG n03394916/
+mv val/ILSVRC2012_val_00005549.JPEG n13052670/
+mv val/ILSVRC2012_val_00005550.JPEG n03498962/
+mv val/ILSVRC2012_val_00005551.JPEG n02356798/
+mv val/ILSVRC2012_val_00005552.JPEG n02966193/
+mv val/ILSVRC2012_val_00005553.JPEG n01798484/
+mv val/ILSVRC2012_val_00005554.JPEG n03394916/
+mv val/ILSVRC2012_val_00005555.JPEG n04476259/
+mv val/ILSVRC2012_val_00005556.JPEG n03854065/
+mv val/ILSVRC2012_val_00005557.JPEG n03950228/
+mv val/ILSVRC2012_val_00005558.JPEG n02708093/
+mv val/ILSVRC2012_val_00005559.JPEG n02206856/
+mv val/ILSVRC2012_val_00005560.JPEG n03026506/
+mv val/ILSVRC2012_val_00005561.JPEG n04004767/
+mv val/ILSVRC2012_val_00005562.JPEG n03691459/
+mv val/ILSVRC2012_val_00005563.JPEG n01682714/
+mv val/ILSVRC2012_val_00005564.JPEG n02095570/
+mv val/ILSVRC2012_val_00005565.JPEG n02480855/
+mv val/ILSVRC2012_val_00005566.JPEG n03424325/
+mv val/ILSVRC2012_val_00005567.JPEG n01531178/
+mv val/ILSVRC2012_val_00005568.JPEG n03868863/
+mv val/ILSVRC2012_val_00005569.JPEG n02883205/
+mv val/ILSVRC2012_val_00005570.JPEG n02795169/
+mv val/ILSVRC2012_val_00005571.JPEG n04399382/
+mv val/ILSVRC2012_val_00005572.JPEG n02840245/
+mv val/ILSVRC2012_val_00005573.JPEG n02808304/
+mv val/ILSVRC2012_val_00005574.JPEG n01695060/
+mv val/ILSVRC2012_val_00005575.JPEG n02110063/
+mv val/ILSVRC2012_val_00005576.JPEG n01601694/
+mv val/ILSVRC2012_val_00005577.JPEG n04229816/
+mv val/ILSVRC2012_val_00005578.JPEG n02927161/
+mv val/ILSVRC2012_val_00005579.JPEG n03187595/
+mv val/ILSVRC2012_val_00005580.JPEG n02454379/
+mv val/ILSVRC2012_val_00005581.JPEG n04483307/
+mv val/ILSVRC2012_val_00005582.JPEG n01986214/
+mv val/ILSVRC2012_val_00005583.JPEG n02104029/
+mv val/ILSVRC2012_val_00005584.JPEG n04485082/
+mv val/ILSVRC2012_val_00005585.JPEG n02808304/
+mv val/ILSVRC2012_val_00005586.JPEG n03384352/
+mv val/ILSVRC2012_val_00005587.JPEG n02107574/
+mv val/ILSVRC2012_val_00005588.JPEG n02927161/
+mv val/ILSVRC2012_val_00005589.JPEG n03924679/
+mv val/ILSVRC2012_val_00005590.JPEG n01685808/
+mv val/ILSVRC2012_val_00005591.JPEG n02364673/
+mv val/ILSVRC2012_val_00005592.JPEG n04389033/
+mv val/ILSVRC2012_val_00005593.JPEG n07718472/
+mv val/ILSVRC2012_val_00005594.JPEG n01558993/
+mv val/ILSVRC2012_val_00005595.JPEG n03047690/
+mv val/ILSVRC2012_val_00005596.JPEG n03595614/
+mv val/ILSVRC2012_val_00005597.JPEG n02071294/
+mv val/ILSVRC2012_val_00005598.JPEG n03028079/
+mv val/ILSVRC2012_val_00005599.JPEG n01806143/
+mv val/ILSVRC2012_val_00005600.JPEG n03814639/
+mv val/ILSVRC2012_val_00005601.JPEG n02007558/
+mv val/ILSVRC2012_val_00005602.JPEG n04525038/
+mv val/ILSVRC2012_val_00005603.JPEG n02128385/
+mv val/ILSVRC2012_val_00005604.JPEG n02391049/
+mv val/ILSVRC2012_val_00005605.JPEG n04372370/
+mv val/ILSVRC2012_val_00005606.JPEG n03769881/
+mv val/ILSVRC2012_val_00005607.JPEG n02100877/
+mv val/ILSVRC2012_val_00005608.JPEG n09288635/
+mv val/ILSVRC2012_val_00005609.JPEG n03950228/
+mv val/ILSVRC2012_val_00005610.JPEG n02786058/
+mv val/ILSVRC2012_val_00005611.JPEG n03788365/
+mv val/ILSVRC2012_val_00005612.JPEG n01667114/
+mv val/ILSVRC2012_val_00005613.JPEG n02119789/
+mv val/ILSVRC2012_val_00005614.JPEG n02279972/
+mv val/ILSVRC2012_val_00005615.JPEG n02033041/
+mv val/ILSVRC2012_val_00005616.JPEG n02086910/
+mv val/ILSVRC2012_val_00005617.JPEG n01749939/
+mv val/ILSVRC2012_val_00005618.JPEG n03337140/
+mv val/ILSVRC2012_val_00005619.JPEG n07693725/
+mv val/ILSVRC2012_val_00005620.JPEG n02492660/
+mv val/ILSVRC2012_val_00005621.JPEG n02442845/
+mv val/ILSVRC2012_val_00005622.JPEG n02917067/
+mv val/ILSVRC2012_val_00005623.JPEG n03733281/
+mv val/ILSVRC2012_val_00005624.JPEG n07920052/
+mv val/ILSVRC2012_val_00005625.JPEG n02490219/
+mv val/ILSVRC2012_val_00005626.JPEG n02111277/
+mv val/ILSVRC2012_val_00005627.JPEG n02123394/
+mv val/ILSVRC2012_val_00005628.JPEG n02128757/
+mv val/ILSVRC2012_val_00005629.JPEG n02992211/
+mv val/ILSVRC2012_val_00005630.JPEG n03424325/
+mv val/ILSVRC2012_val_00005631.JPEG n03942813/
+mv val/ILSVRC2012_val_00005632.JPEG n04399382/
+mv val/ILSVRC2012_val_00005633.JPEG n04417672/
+mv val/ILSVRC2012_val_00005634.JPEG n01828970/
+mv val/ILSVRC2012_val_00005635.JPEG n03854065/
+mv val/ILSVRC2012_val_00005636.JPEG n02325366/
+mv val/ILSVRC2012_val_00005637.JPEG n02492035/
+mv val/ILSVRC2012_val_00005638.JPEG n03220513/
+mv val/ILSVRC2012_val_00005639.JPEG n02087046/
+mv val/ILSVRC2012_val_00005640.JPEG n03602883/
+mv val/ILSVRC2012_val_00005641.JPEG n01983481/
+mv val/ILSVRC2012_val_00005642.JPEG n01498041/
+mv val/ILSVRC2012_val_00005643.JPEG n02834397/
+mv val/ILSVRC2012_val_00005644.JPEG n03791053/
+mv val/ILSVRC2012_val_00005645.JPEG n04604644/
+mv val/ILSVRC2012_val_00005646.JPEG n07730033/
+mv val/ILSVRC2012_val_00005647.JPEG n01675722/
+mv val/ILSVRC2012_val_00005648.JPEG n02105056/
+mv val/ILSVRC2012_val_00005649.JPEG n04039381/
+mv val/ILSVRC2012_val_00005650.JPEG n02835271/
+mv val/ILSVRC2012_val_00005651.JPEG n02787622/
+mv val/ILSVRC2012_val_00005652.JPEG n04591157/
+mv val/ILSVRC2012_val_00005653.JPEG n02484975/
+mv val/ILSVRC2012_val_00005654.JPEG n04044716/
+mv val/ILSVRC2012_val_00005655.JPEG n02977058/
+mv val/ILSVRC2012_val_00005656.JPEG n03000247/
+mv val/ILSVRC2012_val_00005657.JPEG n03602883/
+mv val/ILSVRC2012_val_00005658.JPEG n02112018/
+mv val/ILSVRC2012_val_00005659.JPEG n04584207/
+mv val/ILSVRC2012_val_00005660.JPEG n03733281/
+mv val/ILSVRC2012_val_00005661.JPEG n04209133/
+mv val/ILSVRC2012_val_00005662.JPEG n02106662/
+mv val/ILSVRC2012_val_00005663.JPEG n01740131/
+mv val/ILSVRC2012_val_00005664.JPEG n03983396/
+mv val/ILSVRC2012_val_00005665.JPEG n04141327/
+mv val/ILSVRC2012_val_00005666.JPEG n03476684/
+mv val/ILSVRC2012_val_00005667.JPEG n03337140/
+mv val/ILSVRC2012_val_00005668.JPEG n04311174/
+mv val/ILSVRC2012_val_00005669.JPEG n02510455/
+mv val/ILSVRC2012_val_00005670.JPEG n03476991/
+mv val/ILSVRC2012_val_00005671.JPEG n04456115/
+mv val/ILSVRC2012_val_00005672.JPEG n03141823/
+mv val/ILSVRC2012_val_00005673.JPEG n04009552/
+mv val/ILSVRC2012_val_00005674.JPEG n03461385/
+mv val/ILSVRC2012_val_00005675.JPEG n01797886/
+mv val/ILSVRC2012_val_00005676.JPEG n01734418/
+mv val/ILSVRC2012_val_00005677.JPEG n02108915/
+mv val/ILSVRC2012_val_00005678.JPEG n04251144/
+mv val/ILSVRC2012_val_00005679.JPEG n04192698/
+mv val/ILSVRC2012_val_00005680.JPEG n04525038/
+mv val/ILSVRC2012_val_00005681.JPEG n03995372/
+mv val/ILSVRC2012_val_00005682.JPEG n01985128/
+mv val/ILSVRC2012_val_00005683.JPEG n07930864/
+mv val/ILSVRC2012_val_00005684.JPEG n02514041/
+mv val/ILSVRC2012_val_00005685.JPEG n02098413/
+mv val/ILSVRC2012_val_00005686.JPEG n03388183/
+mv val/ILSVRC2012_val_00005687.JPEG n02095889/
+mv val/ILSVRC2012_val_00005688.JPEG n02992529/
+mv val/ILSVRC2012_val_00005689.JPEG n07920052/
+mv val/ILSVRC2012_val_00005690.JPEG n03249569/
+mv val/ILSVRC2012_val_00005691.JPEG n02667093/
+mv val/ILSVRC2012_val_00005692.JPEG n03393912/
+mv val/ILSVRC2012_val_00005693.JPEG n03743016/
+mv val/ILSVRC2012_val_00005694.JPEG n03876231/
+mv val/ILSVRC2012_val_00005695.JPEG n02138441/
+mv val/ILSVRC2012_val_00005696.JPEG n07875152/
+mv val/ILSVRC2012_val_00005697.JPEG n02099601/
+mv val/ILSVRC2012_val_00005698.JPEG n01630670/
+mv val/ILSVRC2012_val_00005699.JPEG n02099429/
+mv val/ILSVRC2012_val_00005700.JPEG n03706229/
+mv val/ILSVRC2012_val_00005701.JPEG n03992509/
+mv val/ILSVRC2012_val_00005702.JPEG n03141823/
+mv val/ILSVRC2012_val_00005703.JPEG n03109150/
+mv val/ILSVRC2012_val_00005704.JPEG n02504013/
+mv val/ILSVRC2012_val_00005705.JPEG n02992529/
+mv val/ILSVRC2012_val_00005706.JPEG n01943899/
+mv val/ILSVRC2012_val_00005707.JPEG n03796401/
+mv val/ILSVRC2012_val_00005708.JPEG n01675722/
+mv val/ILSVRC2012_val_00005709.JPEG n04141327/
+mv val/ILSVRC2012_val_00005710.JPEG n07697537/
+mv val/ILSVRC2012_val_00005711.JPEG n04141327/
+mv val/ILSVRC2012_val_00005712.JPEG n02871525/
+mv val/ILSVRC2012_val_00005713.JPEG n04254680/
+mv val/ILSVRC2012_val_00005714.JPEG n07836838/
+mv val/ILSVRC2012_val_00005715.JPEG n03133878/
+mv val/ILSVRC2012_val_00005716.JPEG n02346627/
+mv val/ILSVRC2012_val_00005717.JPEG n03649909/
+mv val/ILSVRC2012_val_00005718.JPEG n02090622/
+mv val/ILSVRC2012_val_00005719.JPEG n03124170/
+mv val/ILSVRC2012_val_00005720.JPEG n04458633/
+mv val/ILSVRC2012_val_00005721.JPEG n04525305/
+mv val/ILSVRC2012_val_00005722.JPEG n03666591/
+mv val/ILSVRC2012_val_00005723.JPEG n02699494/
+mv val/ILSVRC2012_val_00005724.JPEG n03680355/
+mv val/ILSVRC2012_val_00005725.JPEG n01692333/
+mv val/ILSVRC2012_val_00005726.JPEG n02480495/
+mv val/ILSVRC2012_val_00005727.JPEG n03109150/
+mv val/ILSVRC2012_val_00005728.JPEG n02342885/
+mv val/ILSVRC2012_val_00005729.JPEG n02776631/
+mv val/ILSVRC2012_val_00005730.JPEG n04596742/
+mv val/ILSVRC2012_val_00005731.JPEG n03018349/
+mv val/ILSVRC2012_val_00005732.JPEG n04525305/
+mv val/ILSVRC2012_val_00005733.JPEG n01824575/
+mv val/ILSVRC2012_val_00005734.JPEG n01882714/
+mv val/ILSVRC2012_val_00005735.JPEG n02115641/
+mv val/ILSVRC2012_val_00005736.JPEG n02788148/
+mv val/ILSVRC2012_val_00005737.JPEG n04335435/
+mv val/ILSVRC2012_val_00005738.JPEG n02085936/
+mv val/ILSVRC2012_val_00005739.JPEG n02782093/
+mv val/ILSVRC2012_val_00005740.JPEG n03095699/
+mv val/ILSVRC2012_val_00005741.JPEG n03127925/
+mv val/ILSVRC2012_val_00005742.JPEG n09468604/
+mv val/ILSVRC2012_val_00005743.JPEG n07717410/
+mv val/ILSVRC2012_val_00005744.JPEG n03417042/
+mv val/ILSVRC2012_val_00005745.JPEG n12998815/
+mv val/ILSVRC2012_val_00005746.JPEG n02113023/
+mv val/ILSVRC2012_val_00005747.JPEG n07742313/
+mv val/ILSVRC2012_val_00005748.JPEG n04296562/
+mv val/ILSVRC2012_val_00005749.JPEG n07714571/
+mv val/ILSVRC2012_val_00005750.JPEG n02107312/
+mv val/ILSVRC2012_val_00005751.JPEG n01806143/
+mv val/ILSVRC2012_val_00005752.JPEG n04033995/
+mv val/ILSVRC2012_val_00005753.JPEG n02025239/
+mv val/ILSVRC2012_val_00005754.JPEG n03930313/
+mv val/ILSVRC2012_val_00005755.JPEG n02641379/
+mv val/ILSVRC2012_val_00005756.JPEG n03804744/
+mv val/ILSVRC2012_val_00005757.JPEG n07745940/
+mv val/ILSVRC2012_val_00005758.JPEG n02097658/
+mv val/ILSVRC2012_val_00005759.JPEG n07930864/
+mv val/ILSVRC2012_val_00005760.JPEG n03089624/
+mv val/ILSVRC2012_val_00005761.JPEG n02492035/
+mv val/ILSVRC2012_val_00005762.JPEG n02791124/
+mv val/ILSVRC2012_val_00005763.JPEG n02172182/
+mv val/ILSVRC2012_val_00005764.JPEG n02865351/
+mv val/ILSVRC2012_val_00005765.JPEG n01739381/
+mv val/ILSVRC2012_val_00005766.JPEG n03950228/
+mv val/ILSVRC2012_val_00005767.JPEG n02099429/
+mv val/ILSVRC2012_val_00005768.JPEG n01644900/
+mv val/ILSVRC2012_val_00005769.JPEG n02788148/
+mv val/ILSVRC2012_val_00005770.JPEG n01622779/
+mv val/ILSVRC2012_val_00005771.JPEG n02027492/
+mv val/ILSVRC2012_val_00005772.JPEG n04254120/
+mv val/ILSVRC2012_val_00005773.JPEG n03929855/
+mv val/ILSVRC2012_val_00005774.JPEG n02814533/
+mv val/ILSVRC2012_val_00005775.JPEG n02226429/
+mv val/ILSVRC2012_val_00005776.JPEG n07715103/
+mv val/ILSVRC2012_val_00005777.JPEG n03840681/
+mv val/ILSVRC2012_val_00005778.JPEG n02256656/
+mv val/ILSVRC2012_val_00005779.JPEG n01833805/
+mv val/ILSVRC2012_val_00005780.JPEG n12267677/
+mv val/ILSVRC2012_val_00005781.JPEG n01687978/
+mv val/ILSVRC2012_val_00005782.JPEG n04592741/
+mv val/ILSVRC2012_val_00005783.JPEG n04592741/
+mv val/ILSVRC2012_val_00005784.JPEG n07873807/
+mv val/ILSVRC2012_val_00005785.JPEG n02110627/
+mv val/ILSVRC2012_val_00005786.JPEG n02277742/
+mv val/ILSVRC2012_val_00005787.JPEG n04266014/
+mv val/ILSVRC2012_val_00005788.JPEG n01776313/
+mv val/ILSVRC2012_val_00005789.JPEG n02794156/
+mv val/ILSVRC2012_val_00005790.JPEG n02093428/
+mv val/ILSVRC2012_val_00005791.JPEG n04311004/
+mv val/ILSVRC2012_val_00005792.JPEG n03920288/
+mv val/ILSVRC2012_val_00005793.JPEG n03047690/
+mv val/ILSVRC2012_val_00005794.JPEG n03992509/
+mv val/ILSVRC2012_val_00005795.JPEG n02112350/
+mv val/ILSVRC2012_val_00005796.JPEG n04591157/
+mv val/ILSVRC2012_val_00005797.JPEG n03017168/
+mv val/ILSVRC2012_val_00005798.JPEG n03459775/
+mv val/ILSVRC2012_val_00005799.JPEG n01667778/
+mv val/ILSVRC2012_val_00005800.JPEG n01820546/
+mv val/ILSVRC2012_val_00005801.JPEG n03485794/
+mv val/ILSVRC2012_val_00005802.JPEG n02804610/
+mv val/ILSVRC2012_val_00005803.JPEG n03602883/
+mv val/ILSVRC2012_val_00005804.JPEG n03666591/
+mv val/ILSVRC2012_val_00005805.JPEG n01872401/
+mv val/ILSVRC2012_val_00005806.JPEG n04589890/
+mv val/ILSVRC2012_val_00005807.JPEG n02730930/
+mv val/ILSVRC2012_val_00005808.JPEG n02090379/
+mv val/ILSVRC2012_val_00005809.JPEG n03670208/
+mv val/ILSVRC2012_val_00005810.JPEG n02892201/
+mv val/ILSVRC2012_val_00005811.JPEG n03372029/
+mv val/ILSVRC2012_val_00005812.JPEG n03062245/
+mv val/ILSVRC2012_val_00005813.JPEG n02486410/
+mv val/ILSVRC2012_val_00005814.JPEG n04562935/
+mv val/ILSVRC2012_val_00005815.JPEG n01697457/
+mv val/ILSVRC2012_val_00005816.JPEG n02099429/
+mv val/ILSVRC2012_val_00005817.JPEG n04111531/
+mv val/ILSVRC2012_val_00005818.JPEG n01728920/
+mv val/ILSVRC2012_val_00005819.JPEG n04153751/
+mv val/ILSVRC2012_val_00005820.JPEG n02113624/
+mv val/ILSVRC2012_val_00005821.JPEG n01770393/
+mv val/ILSVRC2012_val_00005822.JPEG n04266014/
+mv val/ILSVRC2012_val_00005823.JPEG n02017213/
+mv val/ILSVRC2012_val_00005824.JPEG n03483316/
+mv val/ILSVRC2012_val_00005825.JPEG n01742172/
+mv val/ILSVRC2012_val_00005826.JPEG n02480855/
+mv val/ILSVRC2012_val_00005827.JPEG n01739381/
+mv val/ILSVRC2012_val_00005828.JPEG n01768244/
+mv val/ILSVRC2012_val_00005829.JPEG n03908714/
+mv val/ILSVRC2012_val_00005830.JPEG n02006656/
+mv val/ILSVRC2012_val_00005831.JPEG n02089867/
+mv val/ILSVRC2012_val_00005832.JPEG n03026506/
+mv val/ILSVRC2012_val_00005833.JPEG n01558993/
+mv val/ILSVRC2012_val_00005834.JPEG n03980874/
+mv val/ILSVRC2012_val_00005835.JPEG n03775546/
+mv val/ILSVRC2012_val_00005836.JPEG n01980166/
+mv val/ILSVRC2012_val_00005837.JPEG n09399592/
+mv val/ILSVRC2012_val_00005838.JPEG n02804610/
+mv val/ILSVRC2012_val_00005839.JPEG n04336792/
+mv val/ILSVRC2012_val_00005840.JPEG n02027492/
+mv val/ILSVRC2012_val_00005841.JPEG n04251144/
+mv val/ILSVRC2012_val_00005842.JPEG n02100735/
+mv val/ILSVRC2012_val_00005843.JPEG n03788365/
+mv val/ILSVRC2012_val_00005844.JPEG n13040303/
+mv val/ILSVRC2012_val_00005845.JPEG n02328150/
+mv val/ILSVRC2012_val_00005846.JPEG n15075141/
+mv val/ILSVRC2012_val_00005847.JPEG n07802026/
+mv val/ILSVRC2012_val_00005848.JPEG n01532829/
+mv val/ILSVRC2012_val_00005849.JPEG n03594734/
+mv val/ILSVRC2012_val_00005850.JPEG n02676566/
+mv val/ILSVRC2012_val_00005851.JPEG n04404412/
+mv val/ILSVRC2012_val_00005852.JPEG n02346627/
+mv val/ILSVRC2012_val_00005853.JPEG n02843684/
+mv val/ILSVRC2012_val_00005854.JPEG n02108000/
+mv val/ILSVRC2012_val_00005855.JPEG n02871525/
+mv val/ILSVRC2012_val_00005856.JPEG n02606052/
+mv val/ILSVRC2012_val_00005857.JPEG n03982430/
+mv val/ILSVRC2012_val_00005858.JPEG n02165456/
+mv val/ILSVRC2012_val_00005859.JPEG n02823750/
+mv val/ILSVRC2012_val_00005860.JPEG n01871265/
+mv val/ILSVRC2012_val_00005861.JPEG n02730930/
+mv val/ILSVRC2012_val_00005862.JPEG n03770679/
+mv val/ILSVRC2012_val_00005863.JPEG n04505470/
+mv val/ILSVRC2012_val_00005864.JPEG n03404251/
+mv val/ILSVRC2012_val_00005865.JPEG n01883070/
+mv val/ILSVRC2012_val_00005866.JPEG n02979186/
+mv val/ILSVRC2012_val_00005867.JPEG n02093991/
+mv val/ILSVRC2012_val_00005868.JPEG n01630670/
+mv val/ILSVRC2012_val_00005869.JPEG n04120489/
+mv val/ILSVRC2012_val_00005870.JPEG n01443537/
+mv val/ILSVRC2012_val_00005871.JPEG n04371774/
+mv val/ILSVRC2012_val_00005872.JPEG n03866082/
+mv val/ILSVRC2012_val_00005873.JPEG n01833805/
+mv val/ILSVRC2012_val_00005874.JPEG n03527444/
+mv val/ILSVRC2012_val_00005875.JPEG n03998194/
+mv val/ILSVRC2012_val_00005876.JPEG n03873416/
+mv val/ILSVRC2012_val_00005877.JPEG n02930766/
+mv val/ILSVRC2012_val_00005878.JPEG n03776460/
+mv val/ILSVRC2012_val_00005879.JPEG n06596364/
+mv val/ILSVRC2012_val_00005880.JPEG n02321529/
+mv val/ILSVRC2012_val_00005881.JPEG n04392985/
+mv val/ILSVRC2012_val_00005882.JPEG n03796401/
+mv val/ILSVRC2012_val_00005883.JPEG n04483307/
+mv val/ILSVRC2012_val_00005884.JPEG n02526121/
+mv val/ILSVRC2012_val_00005885.JPEG n02396427/
+mv val/ILSVRC2012_val_00005886.JPEG n02113023/
+mv val/ILSVRC2012_val_00005887.JPEG n03443371/
+mv val/ILSVRC2012_val_00005888.JPEG n07747607/
+mv val/ILSVRC2012_val_00005889.JPEG n01980166/
+mv val/ILSVRC2012_val_00005890.JPEG n02058221/
+mv val/ILSVRC2012_val_00005891.JPEG n02167151/
+mv val/ILSVRC2012_val_00005892.JPEG n02769748/
+mv val/ILSVRC2012_val_00005893.JPEG n03127925/
+mv val/ILSVRC2012_val_00005894.JPEG n02190166/
+mv val/ILSVRC2012_val_00005895.JPEG n03272562/
+mv val/ILSVRC2012_val_00005896.JPEG n02097130/
+mv val/ILSVRC2012_val_00005897.JPEG n04560804/
+mv val/ILSVRC2012_val_00005898.JPEG n02086240/
+mv val/ILSVRC2012_val_00005899.JPEG n04326547/
+mv val/ILSVRC2012_val_00005900.JPEG n02095314/
+mv val/ILSVRC2012_val_00005901.JPEG n01843383/
+mv val/ILSVRC2012_val_00005902.JPEG n02107312/
+mv val/ILSVRC2012_val_00005903.JPEG n03954731/
+mv val/ILSVRC2012_val_00005904.JPEG n02281406/
+mv val/ILSVRC2012_val_00005905.JPEG n02105641/
+mv val/ILSVRC2012_val_00005906.JPEG n03075370/
+mv val/ILSVRC2012_val_00005907.JPEG n02883205/
+mv val/ILSVRC2012_val_00005908.JPEG n01829413/
+mv val/ILSVRC2012_val_00005909.JPEG n02099849/
+mv val/ILSVRC2012_val_00005910.JPEG n02112137/
+mv val/ILSVRC2012_val_00005911.JPEG n07684084/
+mv val/ILSVRC2012_val_00005912.JPEG n03095699/
+mv val/ILSVRC2012_val_00005913.JPEG n02408429/
+mv val/ILSVRC2012_val_00005914.JPEG n10565667/
+mv val/ILSVRC2012_val_00005915.JPEG n02641379/
+mv val/ILSVRC2012_val_00005916.JPEG n02259212/
+mv val/ILSVRC2012_val_00005917.JPEG n02128757/
+mv val/ILSVRC2012_val_00005918.JPEG n03344393/
+mv val/ILSVRC2012_val_00005919.JPEG n01665541/
+mv val/ILSVRC2012_val_00005920.JPEG n04004767/
+mv val/ILSVRC2012_val_00005921.JPEG n07734744/
+mv val/ILSVRC2012_val_00005922.JPEG n02088364/
+mv val/ILSVRC2012_val_00005923.JPEG n02100583/
+mv val/ILSVRC2012_val_00005924.JPEG n02672831/
+mv val/ILSVRC2012_val_00005925.JPEG n01820546/
+mv val/ILSVRC2012_val_00005926.JPEG n03376595/
+mv val/ILSVRC2012_val_00005927.JPEG n04070727/
+mv val/ILSVRC2012_val_00005928.JPEG n02981792/
+mv val/ILSVRC2012_val_00005929.JPEG n03709823/
+mv val/ILSVRC2012_val_00005930.JPEG n02206856/
+mv val/ILSVRC2012_val_00005931.JPEG n01537544/
+mv val/ILSVRC2012_val_00005932.JPEG n01776313/
+mv val/ILSVRC2012_val_00005933.JPEG n04579145/
+mv val/ILSVRC2012_val_00005934.JPEG n02492035/
+mv val/ILSVRC2012_val_00005935.JPEG n02804414/
+mv val/ILSVRC2012_val_00005936.JPEG n02113799/
+mv val/ILSVRC2012_val_00005937.JPEG n02104365/
+mv val/ILSVRC2012_val_00005938.JPEG n03483316/
+mv val/ILSVRC2012_val_00005939.JPEG n09256479/
+mv val/ILSVRC2012_val_00005940.JPEG n03642806/
+mv val/ILSVRC2012_val_00005941.JPEG n07590611/
+mv val/ILSVRC2012_val_00005942.JPEG n02094433/
+mv val/ILSVRC2012_val_00005943.JPEG n02089973/
+mv val/ILSVRC2012_val_00005944.JPEG n02497673/
+mv val/ILSVRC2012_val_00005945.JPEG n01968897/
+mv val/ILSVRC2012_val_00005946.JPEG n02090721/
+mv val/ILSVRC2012_val_00005947.JPEG n02167151/
+mv val/ILSVRC2012_val_00005948.JPEG n02974003/
+mv val/ILSVRC2012_val_00005949.JPEG n02514041/
+mv val/ILSVRC2012_val_00005950.JPEG n03781244/
+mv val/ILSVRC2012_val_00005951.JPEG n02408429/
+mv val/ILSVRC2012_val_00005952.JPEG n02279972/
+mv val/ILSVRC2012_val_00005953.JPEG n04311174/
+mv val/ILSVRC2012_val_00005954.JPEG n01990800/
+mv val/ILSVRC2012_val_00005955.JPEG n02804610/
+mv val/ILSVRC2012_val_00005956.JPEG n03146219/
+mv val/ILSVRC2012_val_00005957.JPEG n13040303/
+mv val/ILSVRC2012_val_00005958.JPEG n07930864/
+mv val/ILSVRC2012_val_00005959.JPEG n04423845/
+mv val/ILSVRC2012_val_00005960.JPEG n02437616/
+mv val/ILSVRC2012_val_00005961.JPEG n03388043/
+mv val/ILSVRC2012_val_00005962.JPEG n04487394/
+mv val/ILSVRC2012_val_00005963.JPEG n04201297/
+mv val/ILSVRC2012_val_00005964.JPEG n02704792/
+mv val/ILSVRC2012_val_00005965.JPEG n01729322/
+mv val/ILSVRC2012_val_00005966.JPEG n04371430/
+mv val/ILSVRC2012_val_00005967.JPEG n03937543/
+mv val/ILSVRC2012_val_00005968.JPEG n03216828/
+mv val/ILSVRC2012_val_00005969.JPEG n02486261/
+mv val/ILSVRC2012_val_00005970.JPEG n02666196/
+mv val/ILSVRC2012_val_00005971.JPEG n04612504/
+mv val/ILSVRC2012_val_00005972.JPEG n03180011/
+mv val/ILSVRC2012_val_00005973.JPEG n03240683/
+mv val/ILSVRC2012_val_00005974.JPEG n03627232/
+mv val/ILSVRC2012_val_00005975.JPEG n01877812/
+mv val/ILSVRC2012_val_00005976.JPEG n04486054/
+mv val/ILSVRC2012_val_00005977.JPEG n02782093/
+mv val/ILSVRC2012_val_00005978.JPEG n02814533/
+mv val/ILSVRC2012_val_00005979.JPEG n02119022/
+mv val/ILSVRC2012_val_00005980.JPEG n03788195/
+mv val/ILSVRC2012_val_00005981.JPEG n07720875/
+mv val/ILSVRC2012_val_00005982.JPEG n02096051/
+mv val/ILSVRC2012_val_00005983.JPEG n03903868/
+mv val/ILSVRC2012_val_00005984.JPEG n02105162/
+mv val/ILSVRC2012_val_00005985.JPEG n04125021/
+mv val/ILSVRC2012_val_00005986.JPEG n03272010/
+mv val/ILSVRC2012_val_00005987.JPEG n03794056/
+mv val/ILSVRC2012_val_00005988.JPEG n02058221/
+mv val/ILSVRC2012_val_00005989.JPEG n03457902/
+mv val/ILSVRC2012_val_00005990.JPEG n04584207/
+mv val/ILSVRC2012_val_00005991.JPEG n03785016/
+mv val/ILSVRC2012_val_00005992.JPEG n04311004/
+mv val/ILSVRC2012_val_00005993.JPEG n03837869/
+mv val/ILSVRC2012_val_00005994.JPEG n02101556/
+mv val/ILSVRC2012_val_00005995.JPEG n03840681/
+mv val/ILSVRC2012_val_00005996.JPEG n03425413/
+mv val/ILSVRC2012_val_00005997.JPEG n03496892/
+mv val/ILSVRC2012_val_00005998.JPEG n02127052/
+mv val/ILSVRC2012_val_00005999.JPEG n01980166/
+mv val/ILSVRC2012_val_00006000.JPEG n03770439/
+mv val/ILSVRC2012_val_00006001.JPEG n04398044/
+mv val/ILSVRC2012_val_00006002.JPEG n02105412/
+mv val/ILSVRC2012_val_00006003.JPEG n03032252/
+mv val/ILSVRC2012_val_00006004.JPEG n03594734/
+mv val/ILSVRC2012_val_00006005.JPEG n02096437/
+mv val/ILSVRC2012_val_00006006.JPEG n10148035/
+mv val/ILSVRC2012_val_00006007.JPEG n01443537/
+mv val/ILSVRC2012_val_00006008.JPEG n04125021/
+mv val/ILSVRC2012_val_00006009.JPEG n03649909/
+mv val/ILSVRC2012_val_00006010.JPEG n02939185/
+mv val/ILSVRC2012_val_00006011.JPEG n01737021/
+mv val/ILSVRC2012_val_00006012.JPEG n02510455/
+mv val/ILSVRC2012_val_00006013.JPEG n02398521/
+mv val/ILSVRC2012_val_00006014.JPEG n02490219/
+mv val/ILSVRC2012_val_00006015.JPEG n03595614/
+mv val/ILSVRC2012_val_00006016.JPEG n04277352/
+mv val/ILSVRC2012_val_00006017.JPEG n03649909/
+mv val/ILSVRC2012_val_00006018.JPEG n07716906/
+mv val/ILSVRC2012_val_00006019.JPEG n02808440/
+mv val/ILSVRC2012_val_00006020.JPEG n03124170/
+mv val/ILSVRC2012_val_00006021.JPEG n03538406/
+mv val/ILSVRC2012_val_00006022.JPEG n03376595/
+mv val/ILSVRC2012_val_00006023.JPEG n02860847/
+mv val/ILSVRC2012_val_00006024.JPEG n01797886/
+mv val/ILSVRC2012_val_00006025.JPEG n04243546/
+mv val/ILSVRC2012_val_00006026.JPEG n03673027/
+mv val/ILSVRC2012_val_00006027.JPEG n04462240/
+mv val/ILSVRC2012_val_00006028.JPEG n03595614/
+mv val/ILSVRC2012_val_00006029.JPEG n04579432/
+mv val/ILSVRC2012_val_00006030.JPEG n01558993/
+mv val/ILSVRC2012_val_00006031.JPEG n04081281/
+mv val/ILSVRC2012_val_00006032.JPEG n04136333/
+mv val/ILSVRC2012_val_00006033.JPEG n03223299/
+mv val/ILSVRC2012_val_00006034.JPEG n03197337/
+mv val/ILSVRC2012_val_00006035.JPEG n02094114/
+mv val/ILSVRC2012_val_00006036.JPEG n03452741/
+mv val/ILSVRC2012_val_00006037.JPEG n04392985/
+mv val/ILSVRC2012_val_00006038.JPEG n02666196/
+mv val/ILSVRC2012_val_00006039.JPEG n02786058/
+mv val/ILSVRC2012_val_00006040.JPEG n09332890/
+mv val/ILSVRC2012_val_00006041.JPEG n03759954/
+mv val/ILSVRC2012_val_00006042.JPEG n04125021/
+mv val/ILSVRC2012_val_00006043.JPEG n03000684/
+mv val/ILSVRC2012_val_00006044.JPEG n04597913/
+mv val/ILSVRC2012_val_00006045.JPEG n01768244/
+mv val/ILSVRC2012_val_00006046.JPEG n02099601/
+mv val/ILSVRC2012_val_00006047.JPEG n07716358/
+mv val/ILSVRC2012_val_00006048.JPEG n03530642/
+mv val/ILSVRC2012_val_00006049.JPEG n01860187/
+mv val/ILSVRC2012_val_00006050.JPEG n02012849/
+mv val/ILSVRC2012_val_00006051.JPEG n02814860/
+mv val/ILSVRC2012_val_00006052.JPEG n02110063/
+mv val/ILSVRC2012_val_00006053.JPEG n03160309/
+mv val/ILSVRC2012_val_00006054.JPEG n02091032/
+mv val/ILSVRC2012_val_00006055.JPEG n15075141/
+mv val/ILSVRC2012_val_00006056.JPEG n02127052/
+mv val/ILSVRC2012_val_00006057.JPEG n02699494/
+mv val/ILSVRC2012_val_00006058.JPEG n04447861/
+mv val/ILSVRC2012_val_00006059.JPEG n02109961/
+mv val/ILSVRC2012_val_00006060.JPEG n03532672/
+mv val/ILSVRC2012_val_00006061.JPEG n04099969/
+mv val/ILSVRC2012_val_00006062.JPEG n03594945/
+mv val/ILSVRC2012_val_00006063.JPEG n02101556/
+mv val/ILSVRC2012_val_00006064.JPEG n04200800/
+mv val/ILSVRC2012_val_00006065.JPEG n02100236/
+mv val/ILSVRC2012_val_00006066.JPEG n04149813/
+mv val/ILSVRC2012_val_00006067.JPEG n07920052/
+mv val/ILSVRC2012_val_00006068.JPEG n04149813/
+mv val/ILSVRC2012_val_00006069.JPEG n02097209/
+mv val/ILSVRC2012_val_00006070.JPEG n03793489/
+mv val/ILSVRC2012_val_00006071.JPEG n09428293/
+mv val/ILSVRC2012_val_00006072.JPEG n03840681/
+mv val/ILSVRC2012_val_00006073.JPEG n02799071/
+mv val/ILSVRC2012_val_00006074.JPEG n04332243/
+mv val/ILSVRC2012_val_00006075.JPEG n01807496/
+mv val/ILSVRC2012_val_00006076.JPEG n04479046/
+mv val/ILSVRC2012_val_00006077.JPEG n02101388/
+mv val/ILSVRC2012_val_00006078.JPEG n02099849/
+mv val/ILSVRC2012_val_00006079.JPEG n02085620/
+mv val/ILSVRC2012_val_00006080.JPEG n02655020/
+mv val/ILSVRC2012_val_00006081.JPEG n02802426/
+mv val/ILSVRC2012_val_00006082.JPEG n04204347/
+mv val/ILSVRC2012_val_00006083.JPEG n02094433/
+mv val/ILSVRC2012_val_00006084.JPEG n02814533/
+mv val/ILSVRC2012_val_00006085.JPEG n04398044/
+mv val/ILSVRC2012_val_00006086.JPEG n04090263/
+mv val/ILSVRC2012_val_00006087.JPEG n02051845/
+mv val/ILSVRC2012_val_00006088.JPEG n04548362/
+mv val/ILSVRC2012_val_00006089.JPEG n04259630/
+mv val/ILSVRC2012_val_00006090.JPEG n04209133/
+mv val/ILSVRC2012_val_00006091.JPEG n04596742/
+mv val/ILSVRC2012_val_00006092.JPEG n02114855/
+mv val/ILSVRC2012_val_00006093.JPEG n02091635/
+mv val/ILSVRC2012_val_00006094.JPEG n01795545/
+mv val/ILSVRC2012_val_00006095.JPEG n02231487/
+mv val/ILSVRC2012_val_00006096.JPEG n07831146/
+mv val/ILSVRC2012_val_00006097.JPEG n02110341/
+mv val/ILSVRC2012_val_00006098.JPEG n01728920/
+mv val/ILSVRC2012_val_00006099.JPEG n02802426/
+mv val/ILSVRC2012_val_00006100.JPEG n01978455/
+mv val/ILSVRC2012_val_00006101.JPEG n03388043/
+mv val/ILSVRC2012_val_00006102.JPEG n03041632/
+mv val/ILSVRC2012_val_00006103.JPEG n03976657/
+mv val/ILSVRC2012_val_00006104.JPEG n02443484/
+mv val/ILSVRC2012_val_00006105.JPEG n01735189/
+mv val/ILSVRC2012_val_00006106.JPEG n04310018/
+mv val/ILSVRC2012_val_00006107.JPEG n02009229/
+mv val/ILSVRC2012_val_00006108.JPEG n02325366/
+mv val/ILSVRC2012_val_00006109.JPEG n03075370/
+mv val/ILSVRC2012_val_00006110.JPEG n04149813/
+mv val/ILSVRC2012_val_00006111.JPEG n03891251/
+mv val/ILSVRC2012_val_00006112.JPEG n02125311/
+mv val/ILSVRC2012_val_00006113.JPEG n04074963/
+mv val/ILSVRC2012_val_00006114.JPEG n02105855/
+mv val/ILSVRC2012_val_00006115.JPEG n04525038/
+mv val/ILSVRC2012_val_00006116.JPEG n02002724/
+mv val/ILSVRC2012_val_00006117.JPEG n03924679/
+mv val/ILSVRC2012_val_00006118.JPEG n03947888/
+mv val/ILSVRC2012_val_00006119.JPEG n03544143/
+mv val/ILSVRC2012_val_00006120.JPEG n01704323/
+mv val/ILSVRC2012_val_00006121.JPEG n02177972/
+mv val/ILSVRC2012_val_00006122.JPEG n04509417/
+mv val/ILSVRC2012_val_00006123.JPEG n07754684/
+mv val/ILSVRC2012_val_00006124.JPEG n03961711/
+mv val/ILSVRC2012_val_00006125.JPEG n02364673/
+mv val/ILSVRC2012_val_00006126.JPEG n07614500/
+mv val/ILSVRC2012_val_00006127.JPEG n04239074/
+mv val/ILSVRC2012_val_00006128.JPEG n02825657/
+mv val/ILSVRC2012_val_00006129.JPEG n02391049/
+mv val/ILSVRC2012_val_00006130.JPEG n03447721/
+mv val/ILSVRC2012_val_00006131.JPEG n03042490/
+mv val/ILSVRC2012_val_00006132.JPEG n04442312/
+mv val/ILSVRC2012_val_00006133.JPEG n02098105/
+mv val/ILSVRC2012_val_00006134.JPEG n03388043/
+mv val/ILSVRC2012_val_00006135.JPEG n03692522/
+mv val/ILSVRC2012_val_00006136.JPEG n04428191/
+mv val/ILSVRC2012_val_00006137.JPEG n02100236/
+mv val/ILSVRC2012_val_00006138.JPEG n04591157/
+mv val/ILSVRC2012_val_00006139.JPEG n03729826/
+mv val/ILSVRC2012_val_00006140.JPEG n03775071/
+mv val/ILSVRC2012_val_00006141.JPEG n02480855/
+mv val/ILSVRC2012_val_00006142.JPEG n03697007/
+mv val/ILSVRC2012_val_00006143.JPEG n02088094/
+mv val/ILSVRC2012_val_00006144.JPEG n02012849/
+mv val/ILSVRC2012_val_00006145.JPEG n02119789/
+mv val/ILSVRC2012_val_00006146.JPEG n02085782/
+mv val/ILSVRC2012_val_00006147.JPEG n03424325/
+mv val/ILSVRC2012_val_00006148.JPEG n01872401/
+mv val/ILSVRC2012_val_00006149.JPEG n01631663/
+mv val/ILSVRC2012_val_00006150.JPEG n02788148/
+mv val/ILSVRC2012_val_00006151.JPEG n01698640/
+mv val/ILSVRC2012_val_00006152.JPEG n02672831/
+mv val/ILSVRC2012_val_00006153.JPEG n04162706/
+mv val/ILSVRC2012_val_00006154.JPEG n04591157/
+mv val/ILSVRC2012_val_00006155.JPEG n02128385/
+mv val/ILSVRC2012_val_00006156.JPEG n02992529/
+mv val/ILSVRC2012_val_00006157.JPEG n03443371/
+mv val/ILSVRC2012_val_00006158.JPEG n03792782/
+mv val/ILSVRC2012_val_00006159.JPEG n04200800/
+mv val/ILSVRC2012_val_00006160.JPEG n04069434/
+mv val/ILSVRC2012_val_00006161.JPEG n02490219/
+mv val/ILSVRC2012_val_00006162.JPEG n03868242/
+mv val/ILSVRC2012_val_00006163.JPEG n04277352/
+mv val/ILSVRC2012_val_00006164.JPEG n03770439/
+mv val/ILSVRC2012_val_00006165.JPEG n01773157/
+mv val/ILSVRC2012_val_00006166.JPEG n04026417/
+mv val/ILSVRC2012_val_00006167.JPEG n03492542/
+mv val/ILSVRC2012_val_00006168.JPEG n02107908/
+mv val/ILSVRC2012_val_00006169.JPEG n04548362/
+mv val/ILSVRC2012_val_00006170.JPEG n03379051/
+mv val/ILSVRC2012_val_00006171.JPEG n01582220/
+mv val/ILSVRC2012_val_00006172.JPEG n02109047/
+mv val/ILSVRC2012_val_00006173.JPEG n04579145/
+mv val/ILSVRC2012_val_00006174.JPEG n02114548/
+mv val/ILSVRC2012_val_00006175.JPEG n04152593/
+mv val/ILSVRC2012_val_00006176.JPEG n02769748/
+mv val/ILSVRC2012_val_00006177.JPEG n04296562/
+mv val/ILSVRC2012_val_00006178.JPEG n02097209/
+mv val/ILSVRC2012_val_00006179.JPEG n01983481/
+mv val/ILSVRC2012_val_00006180.JPEG n04366367/
+mv val/ILSVRC2012_val_00006181.JPEG n03657121/
+mv val/ILSVRC2012_val_00006182.JPEG n02879718/
+mv val/ILSVRC2012_val_00006183.JPEG n02119789/
+mv val/ILSVRC2012_val_00006184.JPEG n03947888/
+mv val/ILSVRC2012_val_00006185.JPEG n02342885/
+mv val/ILSVRC2012_val_00006186.JPEG n04152593/
+mv val/ILSVRC2012_val_00006187.JPEG n04370456/
+mv val/ILSVRC2012_val_00006188.JPEG n03032252/
+mv val/ILSVRC2012_val_00006189.JPEG n07880968/
+mv val/ILSVRC2012_val_00006190.JPEG n04328186/
+mv val/ILSVRC2012_val_00006191.JPEG n02107574/
+mv val/ILSVRC2012_val_00006192.JPEG n02017213/
+mv val/ILSVRC2012_val_00006193.JPEG n01945685/
+mv val/ILSVRC2012_val_00006194.JPEG n04550184/
+mv val/ILSVRC2012_val_00006195.JPEG n01514859/
+mv val/ILSVRC2012_val_00006196.JPEG n04479046/
+mv val/ILSVRC2012_val_00006197.JPEG n07695742/
+mv val/ILSVRC2012_val_00006198.JPEG n03481172/
+mv val/ILSVRC2012_val_00006199.JPEG n07747607/
+mv val/ILSVRC2012_val_00006200.JPEG n02437312/
+mv val/ILSVRC2012_val_00006201.JPEG n03742115/
+mv val/ILSVRC2012_val_00006202.JPEG n01924916/
+mv val/ILSVRC2012_val_00006203.JPEG n01608432/
+mv val/ILSVRC2012_val_00006204.JPEG n04584207/
+mv val/ILSVRC2012_val_00006205.JPEG n02825657/
+mv val/ILSVRC2012_val_00006206.JPEG n12144580/
+mv val/ILSVRC2012_val_00006207.JPEG n01689811/
+mv val/ILSVRC2012_val_00006208.JPEG n04228054/
+mv val/ILSVRC2012_val_00006209.JPEG n02113624/
+mv val/ILSVRC2012_val_00006210.JPEG n07697313/
+mv val/ILSVRC2012_val_00006211.JPEG n04367480/
+mv val/ILSVRC2012_val_00006212.JPEG n04026417/
+mv val/ILSVRC2012_val_00006213.JPEG n01616318/
+mv val/ILSVRC2012_val_00006214.JPEG n02643566/
+mv val/ILSVRC2012_val_00006215.JPEG n04228054/
+mv val/ILSVRC2012_val_00006216.JPEG n01443537/
+mv val/ILSVRC2012_val_00006217.JPEG n04252077/
+mv val/ILSVRC2012_val_00006218.JPEG n01734418/
+mv val/ILSVRC2012_val_00006219.JPEG n02490219/
+mv val/ILSVRC2012_val_00006220.JPEG n02814533/
+mv val/ILSVRC2012_val_00006221.JPEG n01796340/
+mv val/ILSVRC2012_val_00006222.JPEG n03160309/
+mv val/ILSVRC2012_val_00006223.JPEG n04355933/
+mv val/ILSVRC2012_val_00006224.JPEG n03666591/
+mv val/ILSVRC2012_val_00006225.JPEG n02443114/
+mv val/ILSVRC2012_val_00006226.JPEG n03595614/
+mv val/ILSVRC2012_val_00006227.JPEG n02948072/
+mv val/ILSVRC2012_val_00006228.JPEG n03786901/
+mv val/ILSVRC2012_val_00006229.JPEG n04380533/
+mv val/ILSVRC2012_val_00006230.JPEG n01824575/
+mv val/ILSVRC2012_val_00006231.JPEG n02018207/
+mv val/ILSVRC2012_val_00006232.JPEG n02111500/
+mv val/ILSVRC2012_val_00006233.JPEG n03188531/
+mv val/ILSVRC2012_val_00006234.JPEG n03417042/
+mv val/ILSVRC2012_val_00006235.JPEG n13037406/
+mv val/ILSVRC2012_val_00006236.JPEG n02869837/
+mv val/ILSVRC2012_val_00006237.JPEG n03627232/
+mv val/ILSVRC2012_val_00006238.JPEG n07716906/
+mv val/ILSVRC2012_val_00006239.JPEG n02130308/
+mv val/ILSVRC2012_val_00006240.JPEG n02422106/
+mv val/ILSVRC2012_val_00006241.JPEG n03544143/
+mv val/ILSVRC2012_val_00006242.JPEG n02108551/
+mv val/ILSVRC2012_val_00006243.JPEG n03314780/
+mv val/ILSVRC2012_val_00006244.JPEG n01694178/
+mv val/ILSVRC2012_val_00006245.JPEG n02437312/
+mv val/ILSVRC2012_val_00006246.JPEG n02978881/
+mv val/ILSVRC2012_val_00006247.JPEG n04243546/
+mv val/ILSVRC2012_val_00006248.JPEG n02823428/
+mv val/ILSVRC2012_val_00006249.JPEG n03916031/
+mv val/ILSVRC2012_val_00006250.JPEG n01616318/
+mv val/ILSVRC2012_val_00006251.JPEG n01496331/
+mv val/ILSVRC2012_val_00006252.JPEG n15075141/
+mv val/ILSVRC2012_val_00006253.JPEG n02071294/
+mv val/ILSVRC2012_val_00006254.JPEG n03095699/
+mv val/ILSVRC2012_val_00006255.JPEG n04525305/
+mv val/ILSVRC2012_val_00006256.JPEG n02483362/
+mv val/ILSVRC2012_val_00006257.JPEG n02109047/
+mv val/ILSVRC2012_val_00006258.JPEG n02930766/
+mv val/ILSVRC2012_val_00006259.JPEG n03792972/
+mv val/ILSVRC2012_val_00006260.JPEG n04507155/
+mv val/ILSVRC2012_val_00006261.JPEG n02091032/
+mv val/ILSVRC2012_val_00006262.JPEG n01744401/
+mv val/ILSVRC2012_val_00006263.JPEG n03929660/
+mv val/ILSVRC2012_val_00006264.JPEG n01632458/
+mv val/ILSVRC2012_val_00006265.JPEG n02090622/
+mv val/ILSVRC2012_val_00006266.JPEG n13037406/
+mv val/ILSVRC2012_val_00006267.JPEG n01580077/
+mv val/ILSVRC2012_val_00006268.JPEG n03028079/
+mv val/ILSVRC2012_val_00006269.JPEG n04366367/
+mv val/ILSVRC2012_val_00006270.JPEG n03000247/
+mv val/ILSVRC2012_val_00006271.JPEG n02088094/
+mv val/ILSVRC2012_val_00006272.JPEG n04376876/
+mv val/ILSVRC2012_val_00006273.JPEG n02110341/
+mv val/ILSVRC2012_val_00006274.JPEG n03983396/
+mv val/ILSVRC2012_val_00006275.JPEG n02791124/
+mv val/ILSVRC2012_val_00006276.JPEG n02977058/
+mv val/ILSVRC2012_val_00006277.JPEG n03384352/
+mv val/ILSVRC2012_val_00006278.JPEG n03042490/
+mv val/ILSVRC2012_val_00006279.JPEG n02643566/
+mv val/ILSVRC2012_val_00006280.JPEG n04522168/
+mv val/ILSVRC2012_val_00006281.JPEG n02804414/
+mv val/ILSVRC2012_val_00006282.JPEG n07760859/
+mv val/ILSVRC2012_val_00006283.JPEG n02445715/
+mv val/ILSVRC2012_val_00006284.JPEG n01728920/
+mv val/ILSVRC2012_val_00006285.JPEG n04285008/
+mv val/ILSVRC2012_val_00006286.JPEG n01697457/
+mv val/ILSVRC2012_val_00006287.JPEG n03961711/
+mv val/ILSVRC2012_val_00006288.JPEG n03134739/
+mv val/ILSVRC2012_val_00006289.JPEG n01882714/
+mv val/ILSVRC2012_val_00006290.JPEG n07716358/
+mv val/ILSVRC2012_val_00006291.JPEG n02364673/
+mv val/ILSVRC2012_val_00006292.JPEG n02536864/
+mv val/ILSVRC2012_val_00006293.JPEG n07880968/
+mv val/ILSVRC2012_val_00006294.JPEG n03662601/
+mv val/ILSVRC2012_val_00006295.JPEG n02699494/
+mv val/ILSVRC2012_val_00006296.JPEG n04133789/
+mv val/ILSVRC2012_val_00006297.JPEG n04141076/
+mv val/ILSVRC2012_val_00006298.JPEG n04366367/
+mv val/ILSVRC2012_val_00006299.JPEG n02892201/
+mv val/ILSVRC2012_val_00006300.JPEG n02100877/
+mv val/ILSVRC2012_val_00006301.JPEG n01695060/
+mv val/ILSVRC2012_val_00006302.JPEG n07747607/
+mv val/ILSVRC2012_val_00006303.JPEG n02971356/
+mv val/ILSVRC2012_val_00006304.JPEG n02804414/
+mv val/ILSVRC2012_val_00006305.JPEG n01665541/
+mv val/ILSVRC2012_val_00006306.JPEG n02422699/
+mv val/ILSVRC2012_val_00006307.JPEG n03065424/
+mv val/ILSVRC2012_val_00006308.JPEG n07693725/
+mv val/ILSVRC2012_val_00006309.JPEG n04336792/
+mv val/ILSVRC2012_val_00006310.JPEG n07932039/
+mv val/ILSVRC2012_val_00006311.JPEG n04311174/
+mv val/ILSVRC2012_val_00006312.JPEG n07715103/
+mv val/ILSVRC2012_val_00006313.JPEG n02268853/
+mv val/ILSVRC2012_val_00006314.JPEG n02096585/
+mv val/ILSVRC2012_val_00006315.JPEG n01981276/
+mv val/ILSVRC2012_val_00006316.JPEG n04133789/
+mv val/ILSVRC2012_val_00006317.JPEG n02814860/
+mv val/ILSVRC2012_val_00006318.JPEG n03388183/
+mv val/ILSVRC2012_val_00006319.JPEG n01631663/
+mv val/ILSVRC2012_val_00006320.JPEG n02447366/
+mv val/ILSVRC2012_val_00006321.JPEG n01560419/
+mv val/ILSVRC2012_val_00006322.JPEG n02319095/
+mv val/ILSVRC2012_val_00006323.JPEG n04370456/
+mv val/ILSVRC2012_val_00006324.JPEG n04152593/
+mv val/ILSVRC2012_val_00006325.JPEG n02939185/
+mv val/ILSVRC2012_val_00006326.JPEG n01534433/
+mv val/ILSVRC2012_val_00006327.JPEG n02909870/
+mv val/ILSVRC2012_val_00006328.JPEG n01537544/
+mv val/ILSVRC2012_val_00006329.JPEG n07565083/
+mv val/ILSVRC2012_val_00006330.JPEG n02106030/
+mv val/ILSVRC2012_val_00006331.JPEG n01630670/
+mv val/ILSVRC2012_val_00006332.JPEG n02837789/
+mv val/ILSVRC2012_val_00006333.JPEG n03633091/
+mv val/ILSVRC2012_val_00006334.JPEG n01614925/
+mv val/ILSVRC2012_val_00006335.JPEG n13052670/
+mv val/ILSVRC2012_val_00006336.JPEG n02104029/
+mv val/ILSVRC2012_val_00006337.JPEG n02877765/
+mv val/ILSVRC2012_val_00006338.JPEG n02106166/
+mv val/ILSVRC2012_val_00006339.JPEG n02011460/
+mv val/ILSVRC2012_val_00006340.JPEG n03590841/
+mv val/ILSVRC2012_val_00006341.JPEG n02130308/
+mv val/ILSVRC2012_val_00006342.JPEG n01968897/
+mv val/ILSVRC2012_val_00006343.JPEG n02397096/
+mv val/ILSVRC2012_val_00006344.JPEG n02966193/
+mv val/ILSVRC2012_val_00006345.JPEG n02129165/
+mv val/ILSVRC2012_val_00006346.JPEG n03393912/
+mv val/ILSVRC2012_val_00006347.JPEG n03133878/
+mv val/ILSVRC2012_val_00006348.JPEG n03743016/
+mv val/ILSVRC2012_val_00006349.JPEG n03947888/
+mv val/ILSVRC2012_val_00006350.JPEG n02133161/
+mv val/ILSVRC2012_val_00006351.JPEG n02102480/
+mv val/ILSVRC2012_val_00006352.JPEG n02457408/
+mv val/ILSVRC2012_val_00006353.JPEG n02111889/
+mv val/ILSVRC2012_val_00006354.JPEG n02364673/
+mv val/ILSVRC2012_val_00006355.JPEG n02980441/
+mv val/ILSVRC2012_val_00006356.JPEG n02138441/
+mv val/ILSVRC2012_val_00006357.JPEG n03908714/
+mv val/ILSVRC2012_val_00006358.JPEG n04599235/
+mv val/ILSVRC2012_val_00006359.JPEG n03220513/
+mv val/ILSVRC2012_val_00006360.JPEG n01729977/
+mv val/ILSVRC2012_val_00006361.JPEG n02808304/
+mv val/ILSVRC2012_val_00006362.JPEG n03223299/
+mv val/ILSVRC2012_val_00006363.JPEG n03444034/
+mv val/ILSVRC2012_val_00006364.JPEG n03538406/
+mv val/ILSVRC2012_val_00006365.JPEG n03384352/
+mv val/ILSVRC2012_val_00006366.JPEG n02607072/
+mv val/ILSVRC2012_val_00006367.JPEG n07684084/
+mv val/ILSVRC2012_val_00006368.JPEG n07697537/
+mv val/ILSVRC2012_val_00006369.JPEG n07565083/
+mv val/ILSVRC2012_val_00006370.JPEG n02939185/
+mv val/ILSVRC2012_val_00006371.JPEG n04483307/
+mv val/ILSVRC2012_val_00006372.JPEG n01843065/
+mv val/ILSVRC2012_val_00006373.JPEG n03272010/
+mv val/ILSVRC2012_val_00006374.JPEG n04370456/
+mv val/ILSVRC2012_val_00006375.JPEG n03627232/
+mv val/ILSVRC2012_val_00006376.JPEG n03259280/
+mv val/ILSVRC2012_val_00006377.JPEG n01698640/
+mv val/ILSVRC2012_val_00006378.JPEG n01775062/
+mv val/ILSVRC2012_val_00006379.JPEG n02769748/
+mv val/ILSVRC2012_val_00006380.JPEG n04428191/
+mv val/ILSVRC2012_val_00006381.JPEG n04326547/
+mv val/ILSVRC2012_val_00006382.JPEG n02090721/
+mv val/ILSVRC2012_val_00006383.JPEG n02051845/
+mv val/ILSVRC2012_val_00006384.JPEG n03124170/
+mv val/ILSVRC2012_val_00006385.JPEG n02422106/
+mv val/ILSVRC2012_val_00006386.JPEG n02134418/
+mv val/ILSVRC2012_val_00006387.JPEG n09399592/
+mv val/ILSVRC2012_val_00006388.JPEG n03447721/
+mv val/ILSVRC2012_val_00006389.JPEG n04090263/
+mv val/ILSVRC2012_val_00006390.JPEG n04584207/
+mv val/ILSVRC2012_val_00006391.JPEG n03884397/
+mv val/ILSVRC2012_val_00006392.JPEG n02356798/
+mv val/ILSVRC2012_val_00006393.JPEG n02105641/
+mv val/ILSVRC2012_val_00006394.JPEG n03786901/
+mv val/ILSVRC2012_val_00006395.JPEG n02835271/
+mv val/ILSVRC2012_val_00006396.JPEG n02090379/
+mv val/ILSVRC2012_val_00006397.JPEG n03379051/
+mv val/ILSVRC2012_val_00006398.JPEG n04389033/
+mv val/ILSVRC2012_val_00006399.JPEG n01847000/
+mv val/ILSVRC2012_val_00006400.JPEG n02125311/
+mv val/ILSVRC2012_val_00006401.JPEG n02089078/
+mv val/ILSVRC2012_val_00006402.JPEG n01498041/
+mv val/ILSVRC2012_val_00006403.JPEG n01749939/
+mv val/ILSVRC2012_val_00006404.JPEG n02102177/
+mv val/ILSVRC2012_val_00006405.JPEG n04023962/
+mv val/ILSVRC2012_val_00006406.JPEG n03788365/
+mv val/ILSVRC2012_val_00006407.JPEG n02127052/
+mv val/ILSVRC2012_val_00006408.JPEG n04326547/
+mv val/ILSVRC2012_val_00006409.JPEG n01641577/
+mv val/ILSVRC2012_val_00006410.JPEG n02484975/
+mv val/ILSVRC2012_val_00006411.JPEG n07768694/
+mv val/ILSVRC2012_val_00006412.JPEG n03777754/
+mv val/ILSVRC2012_val_00006413.JPEG n04487394/
+mv val/ILSVRC2012_val_00006414.JPEG n07873807/
+mv val/ILSVRC2012_val_00006415.JPEG n02089078/
+mv val/ILSVRC2012_val_00006416.JPEG n02112137/
+mv val/ILSVRC2012_val_00006417.JPEG n03733281/
+mv val/ILSVRC2012_val_00006418.JPEG n04141975/
+mv val/ILSVRC2012_val_00006419.JPEG n02105251/
+mv val/ILSVRC2012_val_00006420.JPEG n04040759/
+mv val/ILSVRC2012_val_00006421.JPEG n13052670/
+mv val/ILSVRC2012_val_00006422.JPEG n07684084/
+mv val/ILSVRC2012_val_00006423.JPEG n03179701/
+mv val/ILSVRC2012_val_00006424.JPEG n03804744/
+mv val/ILSVRC2012_val_00006425.JPEG n03127747/
+mv val/ILSVRC2012_val_00006426.JPEG n01748264/
+mv val/ILSVRC2012_val_00006427.JPEG n02408429/
+mv val/ILSVRC2012_val_00006428.JPEG n03126707/
+mv val/ILSVRC2012_val_00006429.JPEG n03595614/
+mv val/ILSVRC2012_val_00006430.JPEG n04235860/
+mv val/ILSVRC2012_val_00006431.JPEG n02117135/
+mv val/ILSVRC2012_val_00006432.JPEG n03938244/
+mv val/ILSVRC2012_val_00006433.JPEG n02497673/
+mv val/ILSVRC2012_val_00006434.JPEG n03425413/
+mv val/ILSVRC2012_val_00006435.JPEG n04192698/
+mv val/ILSVRC2012_val_00006436.JPEG n03980874/
+mv val/ILSVRC2012_val_00006437.JPEG n01774384/
+mv val/ILSVRC2012_val_00006438.JPEG n04591157/
+mv val/ILSVRC2012_val_00006439.JPEG n02403003/
+mv val/ILSVRC2012_val_00006440.JPEG n01729322/
+mv val/ILSVRC2012_val_00006441.JPEG n02834397/
+mv val/ILSVRC2012_val_00006442.JPEG n03527444/
+mv val/ILSVRC2012_val_00006443.JPEG n03763968/
+mv val/ILSVRC2012_val_00006444.JPEG n04120489/
+mv val/ILSVRC2012_val_00006445.JPEG n02100735/
+mv val/ILSVRC2012_val_00006446.JPEG n01955084/
+mv val/ILSVRC2012_val_00006447.JPEG n02483362/
+mv val/ILSVRC2012_val_00006448.JPEG n02510455/
+mv val/ILSVRC2012_val_00006449.JPEG n01817953/
+mv val/ILSVRC2012_val_00006450.JPEG n03868242/
+mv val/ILSVRC2012_val_00006451.JPEG n02483362/
+mv val/ILSVRC2012_val_00006452.JPEG n04418357/
+mv val/ILSVRC2012_val_00006453.JPEG n01968897/
+mv val/ILSVRC2012_val_00006454.JPEG n03691459/
+mv val/ILSVRC2012_val_00006455.JPEG n01882714/
+mv val/ILSVRC2012_val_00006456.JPEG n02883205/
+mv val/ILSVRC2012_val_00006457.JPEG n01829413/
+mv val/ILSVRC2012_val_00006458.JPEG n02870880/
+mv val/ILSVRC2012_val_00006459.JPEG n02396427/
+mv val/ILSVRC2012_val_00006460.JPEG n01843383/
+mv val/ILSVRC2012_val_00006461.JPEG n10148035/
+mv val/ILSVRC2012_val_00006462.JPEG n02699494/
+mv val/ILSVRC2012_val_00006463.JPEG n01580077/
+mv val/ILSVRC2012_val_00006464.JPEG n04238763/
+mv val/ILSVRC2012_val_00006465.JPEG n03496892/
+mv val/ILSVRC2012_val_00006466.JPEG n07684084/
+mv val/ILSVRC2012_val_00006467.JPEG n02950826/
+mv val/ILSVRC2012_val_00006468.JPEG n03445777/
+mv val/ILSVRC2012_val_00006469.JPEG n01798484/
+mv val/ILSVRC2012_val_00006470.JPEG n03877845/
+mv val/ILSVRC2012_val_00006471.JPEG n04239074/
+mv val/ILSVRC2012_val_00006472.JPEG n01622779/
+mv val/ILSVRC2012_val_00006473.JPEG n02099712/
+mv val/ILSVRC2012_val_00006474.JPEG n02837789/
+mv val/ILSVRC2012_val_00006475.JPEG n07730033/
+mv val/ILSVRC2012_val_00006476.JPEG n09835506/
+mv val/ILSVRC2012_val_00006477.JPEG n04532106/
+mv val/ILSVRC2012_val_00006478.JPEG n03976467/
+mv val/ILSVRC2012_val_00006479.JPEG n03854065/
+mv val/ILSVRC2012_val_00006480.JPEG n01756291/
+mv val/ILSVRC2012_val_00006481.JPEG n07892512/
+mv val/ILSVRC2012_val_00006482.JPEG n15075141/
+mv val/ILSVRC2012_val_00006483.JPEG n02971356/
+mv val/ILSVRC2012_val_00006484.JPEG n02113023/
+mv val/ILSVRC2012_val_00006485.JPEG n04023962/
+mv val/ILSVRC2012_val_00006486.JPEG n02108551/
+mv val/ILSVRC2012_val_00006487.JPEG n02002724/
+mv val/ILSVRC2012_val_00006488.JPEG n09288635/
+mv val/ILSVRC2012_val_00006489.JPEG n03457902/
+mv val/ILSVRC2012_val_00006490.JPEG n03124170/
+mv val/ILSVRC2012_val_00006491.JPEG n01484850/
+mv val/ILSVRC2012_val_00006492.JPEG n04548362/
+mv val/ILSVRC2012_val_00006493.JPEG n03201208/
+mv val/ILSVRC2012_val_00006494.JPEG n01734418/
+mv val/ILSVRC2012_val_00006495.JPEG n02090622/
+mv val/ILSVRC2012_val_00006496.JPEG n03929660/
+mv val/ILSVRC2012_val_00006497.JPEG n03868863/
+mv val/ILSVRC2012_val_00006498.JPEG n02480855/
+mv val/ILSVRC2012_val_00006499.JPEG n02028035/
+mv val/ILSVRC2012_val_00006500.JPEG n01692333/
+mv val/ILSVRC2012_val_00006501.JPEG n02206856/
+mv val/ILSVRC2012_val_00006502.JPEG n03970156/
+mv val/ILSVRC2012_val_00006503.JPEG n07768694/
+mv val/ILSVRC2012_val_00006504.JPEG n04376876/
+mv val/ILSVRC2012_val_00006505.JPEG n02089973/
+mv val/ILSVRC2012_val_00006506.JPEG n03976467/
+mv val/ILSVRC2012_val_00006507.JPEG n03134739/
+mv val/ILSVRC2012_val_00006508.JPEG n03788195/
+mv val/ILSVRC2012_val_00006509.JPEG n04399382/
+mv val/ILSVRC2012_val_00006510.JPEG n04023962/
+mv val/ILSVRC2012_val_00006511.JPEG n03393912/
+mv val/ILSVRC2012_val_00006512.JPEG n12620546/
+mv val/ILSVRC2012_val_00006513.JPEG n03085013/
+mv val/ILSVRC2012_val_00006514.JPEG n02277742/
+mv val/ILSVRC2012_val_00006515.JPEG n03272562/
+mv val/ILSVRC2012_val_00006516.JPEG n01698640/
+mv val/ILSVRC2012_val_00006517.JPEG n04039381/
+mv val/ILSVRC2012_val_00006518.JPEG n02877765/
+mv val/ILSVRC2012_val_00006519.JPEG n03680355/
+mv val/ILSVRC2012_val_00006520.JPEG n01873310/
+mv val/ILSVRC2012_val_00006521.JPEG n04039381/
+mv val/ILSVRC2012_val_00006522.JPEG n02980441/
+mv val/ILSVRC2012_val_00006523.JPEG n04376876/
+mv val/ILSVRC2012_val_00006524.JPEG n01729322/
+mv val/ILSVRC2012_val_00006525.JPEG n02795169/
+mv val/ILSVRC2012_val_00006526.JPEG n01530575/
+mv val/ILSVRC2012_val_00006527.JPEG n04515003/
+mv val/ILSVRC2012_val_00006528.JPEG n02794156/
+mv val/ILSVRC2012_val_00006529.JPEG n02165105/
+mv val/ILSVRC2012_val_00006530.JPEG n03594945/
+mv val/ILSVRC2012_val_00006531.JPEG n02093991/
+mv val/ILSVRC2012_val_00006532.JPEG n02256656/
+mv val/ILSVRC2012_val_00006533.JPEG n02105412/
+mv val/ILSVRC2012_val_00006534.JPEG n03216828/
+mv val/ILSVRC2012_val_00006535.JPEG n02110806/
+mv val/ILSVRC2012_val_00006536.JPEG n03297495/
+mv val/ILSVRC2012_val_00006537.JPEG n02112137/
+mv val/ILSVRC2012_val_00006538.JPEG n03710721/
+mv val/ILSVRC2012_val_00006539.JPEG n02110185/
+mv val/ILSVRC2012_val_00006540.JPEG n09421951/
+mv val/ILSVRC2012_val_00006541.JPEG n02480855/
+mv val/ILSVRC2012_val_00006542.JPEG n04336792/
+mv val/ILSVRC2012_val_00006543.JPEG n02510455/
+mv val/ILSVRC2012_val_00006544.JPEG n02087046/
+mv val/ILSVRC2012_val_00006545.JPEG n02110627/
+mv val/ILSVRC2012_val_00006546.JPEG n04005630/
+mv val/ILSVRC2012_val_00006547.JPEG n02536864/
+mv val/ILSVRC2012_val_00006548.JPEG n04277352/
+mv val/ILSVRC2012_val_00006549.JPEG n01774750/
+mv val/ILSVRC2012_val_00006550.JPEG n02667093/
+mv val/ILSVRC2012_val_00006551.JPEG n04554684/
+mv val/ILSVRC2012_val_00006552.JPEG n02823750/
+mv val/ILSVRC2012_val_00006553.JPEG n03196217/
+mv val/ILSVRC2012_val_00006554.JPEG n01496331/
+mv val/ILSVRC2012_val_00006555.JPEG n01855032/
+mv val/ILSVRC2012_val_00006556.JPEG n02128757/
+mv val/ILSVRC2012_val_00006557.JPEG n03764736/
+mv val/ILSVRC2012_val_00006558.JPEG n02981792/
+mv val/ILSVRC2012_val_00006559.JPEG n03876231/
+mv val/ILSVRC2012_val_00006560.JPEG n04458633/
+mv val/ILSVRC2012_val_00006561.JPEG n03888257/
+mv val/ILSVRC2012_val_00006562.JPEG n01860187/
+mv val/ILSVRC2012_val_00006563.JPEG n04326547/
+mv val/ILSVRC2012_val_00006564.JPEG n09421951/
+mv val/ILSVRC2012_val_00006565.JPEG n07880968/
+mv val/ILSVRC2012_val_00006566.JPEG n02500267/
+mv val/ILSVRC2012_val_00006567.JPEG n01770081/
+mv val/ILSVRC2012_val_00006568.JPEG n03584254/
+mv val/ILSVRC2012_val_00006569.JPEG n07711569/
+mv val/ILSVRC2012_val_00006570.JPEG n09468604/
+mv val/ILSVRC2012_val_00006571.JPEG n01614925/
+mv val/ILSVRC2012_val_00006572.JPEG n03788365/
+mv val/ILSVRC2012_val_00006573.JPEG n04560804/
+mv val/ILSVRC2012_val_00006574.JPEG n01729977/
+mv val/ILSVRC2012_val_00006575.JPEG n03717622/
+mv val/ILSVRC2012_val_00006576.JPEG n02410509/
+mv val/ILSVRC2012_val_00006577.JPEG n02437312/
+mv val/ILSVRC2012_val_00006578.JPEG n03000684/
+mv val/ILSVRC2012_val_00006579.JPEG n01632777/
+mv val/ILSVRC2012_val_00006580.JPEG n02028035/
+mv val/ILSVRC2012_val_00006581.JPEG n07873807/
+mv val/ILSVRC2012_val_00006582.JPEG n01630670/
+mv val/ILSVRC2012_val_00006583.JPEG n03388183/
+mv val/ILSVRC2012_val_00006584.JPEG n02110185/
+mv val/ILSVRC2012_val_00006585.JPEG n02098413/
+mv val/ILSVRC2012_val_00006586.JPEG n02107142/
+mv val/ILSVRC2012_val_00006587.JPEG n04209133/
+mv val/ILSVRC2012_val_00006588.JPEG n07932039/
+mv val/ILSVRC2012_val_00006589.JPEG n03992509/
+mv val/ILSVRC2012_val_00006590.JPEG n04612504/
+mv val/ILSVRC2012_val_00006591.JPEG n01986214/
+mv val/ILSVRC2012_val_00006592.JPEG n04270147/
+mv val/ILSVRC2012_val_00006593.JPEG n06874185/
+mv val/ILSVRC2012_val_00006594.JPEG n02909870/
+mv val/ILSVRC2012_val_00006595.JPEG n02168699/
+mv val/ILSVRC2012_val_00006596.JPEG n03785016/
+mv val/ILSVRC2012_val_00006597.JPEG n01532829/
+mv val/ILSVRC2012_val_00006598.JPEG n04264628/
+mv val/ILSVRC2012_val_00006599.JPEG n02484975/
+mv val/ILSVRC2012_val_00006600.JPEG n02799071/
+mv val/ILSVRC2012_val_00006601.JPEG n04209133/
+mv val/ILSVRC2012_val_00006602.JPEG n07584110/
+mv val/ILSVRC2012_val_00006603.JPEG n01560419/
+mv val/ILSVRC2012_val_00006604.JPEG n02117135/
+mv val/ILSVRC2012_val_00006605.JPEG n07684084/
+mv val/ILSVRC2012_val_00006606.JPEG n03814906/
+mv val/ILSVRC2012_val_00006607.JPEG n03908618/
+mv val/ILSVRC2012_val_00006608.JPEG n02279972/
+mv val/ILSVRC2012_val_00006609.JPEG n02098413/
+mv val/ILSVRC2012_val_00006610.JPEG n02097658/
+mv val/ILSVRC2012_val_00006611.JPEG n04154565/
+mv val/ILSVRC2012_val_00006612.JPEG n02125311/
+mv val/ILSVRC2012_val_00006613.JPEG n02018795/
+mv val/ILSVRC2012_val_00006614.JPEG n02168699/
+mv val/ILSVRC2012_val_00006615.JPEG n02096177/
+mv val/ILSVRC2012_val_00006616.JPEG n03047690/
+mv val/ILSVRC2012_val_00006617.JPEG n02747177/
+mv val/ILSVRC2012_val_00006618.JPEG n03788365/
+mv val/ILSVRC2012_val_00006619.JPEG n02128385/
+mv val/ILSVRC2012_val_00006620.JPEG n03000134/
+mv val/ILSVRC2012_val_00006621.JPEG n03775546/
+mv val/ILSVRC2012_val_00006622.JPEG n04204238/
+mv val/ILSVRC2012_val_00006623.JPEG n04604644/
+mv val/ILSVRC2012_val_00006624.JPEG n03980874/
+mv val/ILSVRC2012_val_00006625.JPEG n03598930/
+mv val/ILSVRC2012_val_00006626.JPEG n01855672/
+mv val/ILSVRC2012_val_00006627.JPEG n02090721/
+mv val/ILSVRC2012_val_00006628.JPEG n07715103/
+mv val/ILSVRC2012_val_00006629.JPEG n02443114/
+mv val/ILSVRC2012_val_00006630.JPEG n02102177/
+mv val/ILSVRC2012_val_00006631.JPEG n04258138/
+mv val/ILSVRC2012_val_00006632.JPEG n04591713/
+mv val/ILSVRC2012_val_00006633.JPEG n03297495/
+mv val/ILSVRC2012_val_00006634.JPEG n01667778/
+mv val/ILSVRC2012_val_00006635.JPEG n04350905/
+mv val/ILSVRC2012_val_00006636.JPEG n04589890/
+mv val/ILSVRC2012_val_00006637.JPEG n06794110/
+mv val/ILSVRC2012_val_00006638.JPEG n03884397/
+mv val/ILSVRC2012_val_00006639.JPEG n04367480/
+mv val/ILSVRC2012_val_00006640.JPEG n03877845/
+mv val/ILSVRC2012_val_00006641.JPEG n10148035/
+mv val/ILSVRC2012_val_00006642.JPEG n03492542/
+mv val/ILSVRC2012_val_00006643.JPEG n04116512/
+mv val/ILSVRC2012_val_00006644.JPEG n03785016/
+mv val/ILSVRC2012_val_00006645.JPEG n01968897/
+mv val/ILSVRC2012_val_00006646.JPEG n02111889/
+mv val/ILSVRC2012_val_00006647.JPEG n04579432/
+mv val/ILSVRC2012_val_00006648.JPEG n03492542/
+mv val/ILSVRC2012_val_00006649.JPEG n02111277/
+mv val/ILSVRC2012_val_00006650.JPEG n03535780/
+mv val/ILSVRC2012_val_00006651.JPEG n03786901/
+mv val/ILSVRC2012_val_00006652.JPEG n02113799/
+mv val/ILSVRC2012_val_00006653.JPEG n04347754/
+mv val/ILSVRC2012_val_00006654.JPEG n03535780/
+mv val/ILSVRC2012_val_00006655.JPEG n02963159/
+mv val/ILSVRC2012_val_00006656.JPEG n03249569/
+mv val/ILSVRC2012_val_00006657.JPEG n03617480/
+mv val/ILSVRC2012_val_00006658.JPEG n04070727/
+mv val/ILSVRC2012_val_00006659.JPEG n02108000/
+mv val/ILSVRC2012_val_00006660.JPEG n03075370/
+mv val/ILSVRC2012_val_00006661.JPEG n03355925/
+mv val/ILSVRC2012_val_00006662.JPEG n04418357/
+mv val/ILSVRC2012_val_00006663.JPEG n02783161/
+mv val/ILSVRC2012_val_00006664.JPEG n02112137/
+mv val/ILSVRC2012_val_00006665.JPEG n03179701/
+mv val/ILSVRC2012_val_00006666.JPEG n02114367/
+mv val/ILSVRC2012_val_00006667.JPEG n02098286/
+mv val/ILSVRC2012_val_00006668.JPEG n02119022/
+mv val/ILSVRC2012_val_00006669.JPEG n03000684/
+mv val/ILSVRC2012_val_00006670.JPEG n01695060/
+mv val/ILSVRC2012_val_00006671.JPEG n15075141/
+mv val/ILSVRC2012_val_00006672.JPEG n02877765/
+mv val/ILSVRC2012_val_00006673.JPEG n02107683/
+mv val/ILSVRC2012_val_00006674.JPEG n03721384/
+mv val/ILSVRC2012_val_00006675.JPEG n02107142/
+mv val/ILSVRC2012_val_00006676.JPEG n02092339/
+mv val/ILSVRC2012_val_00006677.JPEG n02687172/
+mv val/ILSVRC2012_val_00006678.JPEG n02396427/
+mv val/ILSVRC2012_val_00006679.JPEG n01629819/
+mv val/ILSVRC2012_val_00006680.JPEG n03272010/
+mv val/ILSVRC2012_val_00006681.JPEG n10148035/
+mv val/ILSVRC2012_val_00006682.JPEG n04141076/
+mv val/ILSVRC2012_val_00006683.JPEG n04044716/
+mv val/ILSVRC2012_val_00006684.JPEG n04277352/
+mv val/ILSVRC2012_val_00006685.JPEG n02364673/
+mv val/ILSVRC2012_val_00006686.JPEG n04141975/
+mv val/ILSVRC2012_val_00006687.JPEG n01819313/
+mv val/ILSVRC2012_val_00006688.JPEG n03775546/
+mv val/ILSVRC2012_val_00006689.JPEG n03379051/
+mv val/ILSVRC2012_val_00006690.JPEG n01756291/
+mv val/ILSVRC2012_val_00006691.JPEG n03785016/
+mv val/ILSVRC2012_val_00006692.JPEG n04476259/
+mv val/ILSVRC2012_val_00006693.JPEG n04612504/
+mv val/ILSVRC2012_val_00006694.JPEG n01632777/
+mv val/ILSVRC2012_val_00006695.JPEG n03838899/
+mv val/ILSVRC2012_val_00006696.JPEG n02007558/
+mv val/ILSVRC2012_val_00006697.JPEG n01440764/
+mv val/ILSVRC2012_val_00006698.JPEG n02088094/
+mv val/ILSVRC2012_val_00006699.JPEG n01735189/
+mv val/ILSVRC2012_val_00006700.JPEG n02356798/
+mv val/ILSVRC2012_val_00006701.JPEG n02095889/
+mv val/ILSVRC2012_val_00006702.JPEG n09229709/
+mv val/ILSVRC2012_val_00006703.JPEG n02132136/
+mv val/ILSVRC2012_val_00006704.JPEG n02091635/
+mv val/ILSVRC2012_val_00006705.JPEG n07754684/
+mv val/ILSVRC2012_val_00006706.JPEG n03146219/
+mv val/ILSVRC2012_val_00006707.JPEG n03467068/
+mv val/ILSVRC2012_val_00006708.JPEG n03047690/
+mv val/ILSVRC2012_val_00006709.JPEG n02408429/
+mv val/ILSVRC2012_val_00006710.JPEG n02086910/
+mv val/ILSVRC2012_val_00006711.JPEG n02012849/
+mv val/ILSVRC2012_val_00006712.JPEG n04522168/
+mv val/ILSVRC2012_val_00006713.JPEG n01943899/
+mv val/ILSVRC2012_val_00006714.JPEG n12144580/
+mv val/ILSVRC2012_val_00006715.JPEG n01820546/
+mv val/ILSVRC2012_val_00006716.JPEG n01824575/
+mv val/ILSVRC2012_val_00006717.JPEG n01677366/
+mv val/ILSVRC2012_val_00006718.JPEG n03868242/
+mv val/ILSVRC2012_val_00006719.JPEG n03814639/
+mv val/ILSVRC2012_val_00006720.JPEG n02091635/
+mv val/ILSVRC2012_val_00006721.JPEG n04033901/
+mv val/ILSVRC2012_val_00006722.JPEG n02074367/
+mv val/ILSVRC2012_val_00006723.JPEG n04597913/
+mv val/ILSVRC2012_val_00006724.JPEG n07880968/
+mv val/ILSVRC2012_val_00006725.JPEG n01871265/
+mv val/ILSVRC2012_val_00006726.JPEG n03000684/
+mv val/ILSVRC2012_val_00006727.JPEG n01983481/
+mv val/ILSVRC2012_val_00006728.JPEG n07753592/
+mv val/ILSVRC2012_val_00006729.JPEG n04235860/
+mv val/ILSVRC2012_val_00006730.JPEG n02229544/
+mv val/ILSVRC2012_val_00006731.JPEG n03814906/
+mv val/ILSVRC2012_val_00006732.JPEG n03527444/
+mv val/ILSVRC2012_val_00006733.JPEG n04532106/
+mv val/ILSVRC2012_val_00006734.JPEG n02447366/
+mv val/ILSVRC2012_val_00006735.JPEG n04179913/
+mv val/ILSVRC2012_val_00006736.JPEG n04116512/
+mv val/ILSVRC2012_val_00006737.JPEG n01631663/
+mv val/ILSVRC2012_val_00006738.JPEG n04037443/
+mv val/ILSVRC2012_val_00006739.JPEG n03947888/
+mv val/ILSVRC2012_val_00006740.JPEG n02708093/
+mv val/ILSVRC2012_val_00006741.JPEG n03874293/
+mv val/ILSVRC2012_val_00006742.JPEG n04612504/
+mv val/ILSVRC2012_val_00006743.JPEG n04589890/
+mv val/ILSVRC2012_val_00006744.JPEG n02097130/
+mv val/ILSVRC2012_val_00006745.JPEG n03089624/
+mv val/ILSVRC2012_val_00006746.JPEG n03670208/
+mv val/ILSVRC2012_val_00006747.JPEG n04579145/
+mv val/ILSVRC2012_val_00006748.JPEG n03344393/
+mv val/ILSVRC2012_val_00006749.JPEG n07614500/
+mv val/ILSVRC2012_val_00006750.JPEG n04462240/
+mv val/ILSVRC2012_val_00006751.JPEG n01751748/
+mv val/ILSVRC2012_val_00006752.JPEG n04201297/
+mv val/ILSVRC2012_val_00006753.JPEG n07802026/
+mv val/ILSVRC2012_val_00006754.JPEG n02795169/
+mv val/ILSVRC2012_val_00006755.JPEG n07613480/
+mv val/ILSVRC2012_val_00006756.JPEG n07747607/
+mv val/ILSVRC2012_val_00006757.JPEG n02115913/
+mv val/ILSVRC2012_val_00006758.JPEG n02493793/
+mv val/ILSVRC2012_val_00006759.JPEG n03770679/
+mv val/ILSVRC2012_val_00006760.JPEG n02268443/
+mv val/ILSVRC2012_val_00006761.JPEG n02009912/
+mv val/ILSVRC2012_val_00006762.JPEG n04423845/
+mv val/ILSVRC2012_val_00006763.JPEG n01530575/
+mv val/ILSVRC2012_val_00006764.JPEG n01685808/
+mv val/ILSVRC2012_val_00006765.JPEG n07715103/
+mv val/ILSVRC2012_val_00006766.JPEG n03016953/
+mv val/ILSVRC2012_val_00006767.JPEG n03355925/
+mv val/ILSVRC2012_val_00006768.JPEG n04554684/
+mv val/ILSVRC2012_val_00006769.JPEG n04366367/
+mv val/ILSVRC2012_val_00006770.JPEG n03207941/
+mv val/ILSVRC2012_val_00006771.JPEG n03887697/
+mv val/ILSVRC2012_val_00006772.JPEG n04336792/
+mv val/ILSVRC2012_val_00006773.JPEG n03759954/
+mv val/ILSVRC2012_val_00006774.JPEG n03595614/
+mv val/ILSVRC2012_val_00006775.JPEG n02480855/
+mv val/ILSVRC2012_val_00006776.JPEG n04525038/
+mv val/ILSVRC2012_val_00006777.JPEG n04355338/
+mv val/ILSVRC2012_val_00006778.JPEG n02129165/
+mv val/ILSVRC2012_val_00006779.JPEG n03255030/
+mv val/ILSVRC2012_val_00006780.JPEG n02843684/
+mv val/ILSVRC2012_val_00006781.JPEG n04493381/
+mv val/ILSVRC2012_val_00006782.JPEG n02992211/
+mv val/ILSVRC2012_val_00006783.JPEG n03814906/
+mv val/ILSVRC2012_val_00006784.JPEG n04239074/
+mv val/ILSVRC2012_val_00006785.JPEG n06794110/
+mv val/ILSVRC2012_val_00006786.JPEG n03977966/
+mv val/ILSVRC2012_val_00006787.JPEG n02979186/
+mv val/ILSVRC2012_val_00006788.JPEG n03207941/
+mv val/ILSVRC2012_val_00006789.JPEG n07875152/
+mv val/ILSVRC2012_val_00006790.JPEG n01798484/
+mv val/ILSVRC2012_val_00006791.JPEG n02484975/
+mv val/ILSVRC2012_val_00006792.JPEG n02127052/
+mv val/ILSVRC2012_val_00006793.JPEG n02133161/
+mv val/ILSVRC2012_val_00006794.JPEG n03929660/
+mv val/ILSVRC2012_val_00006795.JPEG n02966687/
+mv val/ILSVRC2012_val_00006796.JPEG n12985857/
+mv val/ILSVRC2012_val_00006797.JPEG n01873310/
+mv val/ILSVRC2012_val_00006798.JPEG n07584110/
+mv val/ILSVRC2012_val_00006799.JPEG n02088094/
+mv val/ILSVRC2012_val_00006800.JPEG n01748264/
+mv val/ILSVRC2012_val_00006801.JPEG n02101006/
+mv val/ILSVRC2012_val_00006802.JPEG n03450230/
+mv val/ILSVRC2012_val_00006803.JPEG n03657121/
+mv val/ILSVRC2012_val_00006804.JPEG n03991062/
+mv val/ILSVRC2012_val_00006805.JPEG n02013706/
+mv val/ILSVRC2012_val_00006806.JPEG n03742115/
+mv val/ILSVRC2012_val_00006807.JPEG n03595614/
+mv val/ILSVRC2012_val_00006808.JPEG n04591713/
+mv val/ILSVRC2012_val_00006809.JPEG n03891251/
+mv val/ILSVRC2012_val_00006810.JPEG n01943899/
+mv val/ILSVRC2012_val_00006811.JPEG n03065424/
+mv val/ILSVRC2012_val_00006812.JPEG n04127249/
+mv val/ILSVRC2012_val_00006813.JPEG n03584829/
+mv val/ILSVRC2012_val_00006814.JPEG n02018207/
+mv val/ILSVRC2012_val_00006815.JPEG n02089973/
+mv val/ILSVRC2012_val_00006816.JPEG n03773504/
+mv val/ILSVRC2012_val_00006817.JPEG n01751748/
+mv val/ILSVRC2012_val_00006818.JPEG n02119022/
+mv val/ILSVRC2012_val_00006819.JPEG n02276258/
+mv val/ILSVRC2012_val_00006820.JPEG n04086273/
+mv val/ILSVRC2012_val_00006821.JPEG n01877812/
+mv val/ILSVRC2012_val_00006822.JPEG n02917067/
+mv val/ILSVRC2012_val_00006823.JPEG n02168699/
+mv val/ILSVRC2012_val_00006824.JPEG n02107574/
+mv val/ILSVRC2012_val_00006825.JPEG n03954731/
+mv val/ILSVRC2012_val_00006826.JPEG n02443114/
+mv val/ILSVRC2012_val_00006827.JPEG n02101556/
+mv val/ILSVRC2012_val_00006828.JPEG n01943899/
+mv val/ILSVRC2012_val_00006829.JPEG n03457902/
+mv val/ILSVRC2012_val_00006830.JPEG n01644900/
+mv val/ILSVRC2012_val_00006831.JPEG n01770081/
+mv val/ILSVRC2012_val_00006832.JPEG n03495258/
+mv val/ILSVRC2012_val_00006833.JPEG n02606052/
+mv val/ILSVRC2012_val_00006834.JPEG n02109047/
+mv val/ILSVRC2012_val_00006835.JPEG n01532829/
+mv val/ILSVRC2012_val_00006836.JPEG n02099429/
+mv val/ILSVRC2012_val_00006837.JPEG n02100735/
+mv val/ILSVRC2012_val_00006838.JPEG n03216828/
+mv val/ILSVRC2012_val_00006839.JPEG n04204347/
+mv val/ILSVRC2012_val_00006840.JPEG n02095889/
+mv val/ILSVRC2012_val_00006841.JPEG n03794056/
+mv val/ILSVRC2012_val_00006842.JPEG n02104365/
+mv val/ILSVRC2012_val_00006843.JPEG n03595614/
+mv val/ILSVRC2012_val_00006844.JPEG n01630670/
+mv val/ILSVRC2012_val_00006845.JPEG n03223299/
+mv val/ILSVRC2012_val_00006846.JPEG n04389033/
+mv val/ILSVRC2012_val_00006847.JPEG n01796340/
+mv val/ILSVRC2012_val_00006848.JPEG n02098286/
+mv val/ILSVRC2012_val_00006849.JPEG n02109525/
+mv val/ILSVRC2012_val_00006850.JPEG n04509417/
+mv val/ILSVRC2012_val_00006851.JPEG n01580077/
+mv val/ILSVRC2012_val_00006852.JPEG n04209239/
+mv val/ILSVRC2012_val_00006853.JPEG n01675722/
+mv val/ILSVRC2012_val_00006854.JPEG n07718747/
+mv val/ILSVRC2012_val_00006855.JPEG n02787622/
+mv val/ILSVRC2012_val_00006856.JPEG n04553703/
+mv val/ILSVRC2012_val_00006857.JPEG n02100877/
+mv val/ILSVRC2012_val_00006858.JPEG n02708093/
+mv val/ILSVRC2012_val_00006859.JPEG n01687978/
+mv val/ILSVRC2012_val_00006860.JPEG n01944390/
+mv val/ILSVRC2012_val_00006861.JPEG n02807133/
+mv val/ILSVRC2012_val_00006862.JPEG n03908714/
+mv val/ILSVRC2012_val_00006863.JPEG n12620546/
+mv val/ILSVRC2012_val_00006864.JPEG n04009552/
+mv val/ILSVRC2012_val_00006865.JPEG n04591713/
+mv val/ILSVRC2012_val_00006866.JPEG n02112350/
+mv val/ILSVRC2012_val_00006867.JPEG n02168699/
+mv val/ILSVRC2012_val_00006868.JPEG n03773504/
+mv val/ILSVRC2012_val_00006869.JPEG n03127747/
+mv val/ILSVRC2012_val_00006870.JPEG n03393912/
+mv val/ILSVRC2012_val_00006871.JPEG n03617480/
+mv val/ILSVRC2012_val_00006872.JPEG n02704792/
+mv val/ILSVRC2012_val_00006873.JPEG n03590841/
+mv val/ILSVRC2012_val_00006874.JPEG n03445924/
+mv val/ILSVRC2012_val_00006875.JPEG n02486261/
+mv val/ILSVRC2012_val_00006876.JPEG n03803284/
+mv val/ILSVRC2012_val_00006877.JPEG n03954731/
+mv val/ILSVRC2012_val_00006878.JPEG n02971356/
+mv val/ILSVRC2012_val_00006879.JPEG n03000247/
+mv val/ILSVRC2012_val_00006880.JPEG n03887697/
+mv val/ILSVRC2012_val_00006881.JPEG n02894605/
+mv val/ILSVRC2012_val_00006882.JPEG n04286575/
+mv val/ILSVRC2012_val_00006883.JPEG n02172182/
+mv val/ILSVRC2012_val_00006884.JPEG n01873310/
+mv val/ILSVRC2012_val_00006885.JPEG n04118538/
+mv val/ILSVRC2012_val_00006886.JPEG n04357314/
+mv val/ILSVRC2012_val_00006887.JPEG n02113624/
+mv val/ILSVRC2012_val_00006888.JPEG n02667093/
+mv val/ILSVRC2012_val_00006889.JPEG n03141823/
+mv val/ILSVRC2012_val_00006890.JPEG n04423845/
+mv val/ILSVRC2012_val_00006891.JPEG n03742115/
+mv val/ILSVRC2012_val_00006892.JPEG n02085620/
+mv val/ILSVRC2012_val_00006893.JPEG n02727426/
+mv val/ILSVRC2012_val_00006894.JPEG n04606251/
+mv val/ILSVRC2012_val_00006895.JPEG n02088466/
+mv val/ILSVRC2012_val_00006896.JPEG n03109150/
+mv val/ILSVRC2012_val_00006897.JPEG n03134739/
+mv val/ILSVRC2012_val_00006898.JPEG n02361337/
+mv val/ILSVRC2012_val_00006899.JPEG n03832673/
+mv val/ILSVRC2012_val_00006900.JPEG n02087394/
+mv val/ILSVRC2012_val_00006901.JPEG n02177972/
+mv val/ILSVRC2012_val_00006902.JPEG n04347754/
+mv val/ILSVRC2012_val_00006903.JPEG n07718747/
+mv val/ILSVRC2012_val_00006904.JPEG n03710721/
+mv val/ILSVRC2012_val_00006905.JPEG n03970156/
+mv val/ILSVRC2012_val_00006906.JPEG n04229816/
+mv val/ILSVRC2012_val_00006907.JPEG n01601694/
+mv val/ILSVRC2012_val_00006908.JPEG n02606052/
+mv val/ILSVRC2012_val_00006909.JPEG n03425413/
+mv val/ILSVRC2012_val_00006910.JPEG n03447447/
+mv val/ILSVRC2012_val_00006911.JPEG n04336792/
+mv val/ILSVRC2012_val_00006912.JPEG n04486054/
+mv val/ILSVRC2012_val_00006913.JPEG n04201297/
+mv val/ILSVRC2012_val_00006914.JPEG n07614500/
+mv val/ILSVRC2012_val_00006915.JPEG n02226429/
+mv val/ILSVRC2012_val_00006916.JPEG n01622779/
+mv val/ILSVRC2012_val_00006917.JPEG n04435653/
+mv val/ILSVRC2012_val_00006918.JPEG n09288635/
+mv val/ILSVRC2012_val_00006919.JPEG n02790996/
+mv val/ILSVRC2012_val_00006920.JPEG n02108000/
+mv val/ILSVRC2012_val_00006921.JPEG n03961711/
+mv val/ILSVRC2012_val_00006922.JPEG n03417042/
+mv val/ILSVRC2012_val_00006923.JPEG n03017168/
+mv val/ILSVRC2012_val_00006924.JPEG n03840681/
+mv val/ILSVRC2012_val_00006925.JPEG n02509815/
+mv val/ILSVRC2012_val_00006926.JPEG n04019541/
+mv val/ILSVRC2012_val_00006927.JPEG n01692333/
+mv val/ILSVRC2012_val_00006928.JPEG n01843065/
+mv val/ILSVRC2012_val_00006929.JPEG n03461385/
+mv val/ILSVRC2012_val_00006930.JPEG n04296562/
+mv val/ILSVRC2012_val_00006931.JPEG n02493509/
+mv val/ILSVRC2012_val_00006932.JPEG n03133878/
+mv val/ILSVRC2012_val_00006933.JPEG n02110627/
+mv val/ILSVRC2012_val_00006934.JPEG n07932039/
+mv val/ILSVRC2012_val_00006935.JPEG n02091831/
+mv val/ILSVRC2012_val_00006936.JPEG n03249569/
+mv val/ILSVRC2012_val_00006937.JPEG n02091467/
+mv val/ILSVRC2012_val_00006938.JPEG n03680355/
+mv val/ILSVRC2012_val_00006939.JPEG n07714990/
+mv val/ILSVRC2012_val_00006940.JPEG n02412080/
+mv val/ILSVRC2012_val_00006941.JPEG n03250847/
+mv val/ILSVRC2012_val_00006942.JPEG n03447721/
+mv val/ILSVRC2012_val_00006943.JPEG n02916936/
+mv val/ILSVRC2012_val_00006944.JPEG n02107683/
+mv val/ILSVRC2012_val_00006945.JPEG n02492035/
+mv val/ILSVRC2012_val_00006946.JPEG n03404251/
+mv val/ILSVRC2012_val_00006947.JPEG n02102177/
+mv val/ILSVRC2012_val_00006948.JPEG n07932039/
+mv val/ILSVRC2012_val_00006949.JPEG n04557648/
+mv val/ILSVRC2012_val_00006950.JPEG n04372370/
+mv val/ILSVRC2012_val_00006951.JPEG n03891251/
+mv val/ILSVRC2012_val_00006952.JPEG n02974003/
+mv val/ILSVRC2012_val_00006953.JPEG n15075141/
+mv val/ILSVRC2012_val_00006954.JPEG n02444819/
+mv val/ILSVRC2012_val_00006955.JPEG n04462240/
+mv val/ILSVRC2012_val_00006956.JPEG n02100236/
+mv val/ILSVRC2012_val_00006957.JPEG n02108551/
+mv val/ILSVRC2012_val_00006958.JPEG n04515003/
+mv val/ILSVRC2012_val_00006959.JPEG n02002556/
+mv val/ILSVRC2012_val_00006960.JPEG n02794156/
+mv val/ILSVRC2012_val_00006961.JPEG n04204238/
+mv val/ILSVRC2012_val_00006962.JPEG n04090263/
+mv val/ILSVRC2012_val_00006963.JPEG n04584207/
+mv val/ILSVRC2012_val_00006964.JPEG n02120505/
+mv val/ILSVRC2012_val_00006965.JPEG n03773504/
+mv val/ILSVRC2012_val_00006966.JPEG n02165456/
+mv val/ILSVRC2012_val_00006967.JPEG n07684084/
+mv val/ILSVRC2012_val_00006968.JPEG n04311174/
+mv val/ILSVRC2012_val_00006969.JPEG n02002556/
+mv val/ILSVRC2012_val_00006970.JPEG n02106382/
+mv val/ILSVRC2012_val_00006971.JPEG n01695060/
+mv val/ILSVRC2012_val_00006972.JPEG n02783161/
+mv val/ILSVRC2012_val_00006973.JPEG n02422699/
+mv val/ILSVRC2012_val_00006974.JPEG n03982430/
+mv val/ILSVRC2012_val_00006975.JPEG n02397096/
+mv val/ILSVRC2012_val_00006976.JPEG n03976657/
+mv val/ILSVRC2012_val_00006977.JPEG n02692877/
+mv val/ILSVRC2012_val_00006978.JPEG n03841143/
+mv val/ILSVRC2012_val_00006979.JPEG n03710637/
+mv val/ILSVRC2012_val_00006980.JPEG n04259630/
+mv val/ILSVRC2012_val_00006981.JPEG n02099601/
+mv val/ILSVRC2012_val_00006982.JPEG n03942813/
+mv val/ILSVRC2012_val_00006983.JPEG n12998815/
+mv val/ILSVRC2012_val_00006984.JPEG n11939491/
+mv val/ILSVRC2012_val_00006985.JPEG n04399382/
+mv val/ILSVRC2012_val_00006986.JPEG n03065424/
+mv val/ILSVRC2012_val_00006987.JPEG n01644373/
+mv val/ILSVRC2012_val_00006988.JPEG n04462240/
+mv val/ILSVRC2012_val_00006989.JPEG n03992509/
+mv val/ILSVRC2012_val_00006990.JPEG n03534580/
+mv val/ILSVRC2012_val_00006991.JPEG n02398521/
+mv val/ILSVRC2012_val_00006992.JPEG n02095889/
+mv val/ILSVRC2012_val_00006993.JPEG n02808440/
+mv val/ILSVRC2012_val_00006994.JPEG n04264628/
+mv val/ILSVRC2012_val_00006995.JPEG n02786058/
+mv val/ILSVRC2012_val_00006996.JPEG n04399382/
+mv val/ILSVRC2012_val_00006997.JPEG n03933933/
+mv val/ILSVRC2012_val_00006998.JPEG n04487081/
+mv val/ILSVRC2012_val_00006999.JPEG n01873310/
+mv val/ILSVRC2012_val_00007000.JPEG n04409515/
+mv val/ILSVRC2012_val_00007001.JPEG n02108089/
+mv val/ILSVRC2012_val_00007002.JPEG n02091831/
+mv val/ILSVRC2012_val_00007003.JPEG n07734744/
+mv val/ILSVRC2012_val_00007004.JPEG n04552348/
+mv val/ILSVRC2012_val_00007005.JPEG n04162706/
+mv val/ILSVRC2012_val_00007006.JPEG n02123045/
+mv val/ILSVRC2012_val_00007007.JPEG n13040303/
+mv val/ILSVRC2012_val_00007008.JPEG n02492035/
+mv val/ILSVRC2012_val_00007009.JPEG n03657121/
+mv val/ILSVRC2012_val_00007010.JPEG n02488291/
+mv val/ILSVRC2012_val_00007011.JPEG n02027492/
+mv val/ILSVRC2012_val_00007012.JPEG n02769748/
+mv val/ILSVRC2012_val_00007013.JPEG n07753113/
+mv val/ILSVRC2012_val_00007014.JPEG n03814639/
+mv val/ILSVRC2012_val_00007015.JPEG n01704323/
+mv val/ILSVRC2012_val_00007016.JPEG n02276258/
+mv val/ILSVRC2012_val_00007017.JPEG n04557648/
+mv val/ILSVRC2012_val_00007018.JPEG n03478589/
+mv val/ILSVRC2012_val_00007019.JPEG n04435653/
+mv val/ILSVRC2012_val_00007020.JPEG n03535780/
+mv val/ILSVRC2012_val_00007021.JPEG n04371774/
+mv val/ILSVRC2012_val_00007022.JPEG n02823750/
+mv val/ILSVRC2012_val_00007023.JPEG n02124075/
+mv val/ILSVRC2012_val_00007024.JPEG n07695742/
+mv val/ILSVRC2012_val_00007025.JPEG n03337140/
+mv val/ILSVRC2012_val_00007026.JPEG n03884397/
+mv val/ILSVRC2012_val_00007027.JPEG n01917289/
+mv val/ILSVRC2012_val_00007028.JPEG n07720875/
+mv val/ILSVRC2012_val_00007029.JPEG n07742313/
+mv val/ILSVRC2012_val_00007030.JPEG n04019541/
+mv val/ILSVRC2012_val_00007031.JPEG n02130308/
+mv val/ILSVRC2012_val_00007032.JPEG n02102040/
+mv val/ILSVRC2012_val_00007033.JPEG n02104365/
+mv val/ILSVRC2012_val_00007034.JPEG n02963159/
+mv val/ILSVRC2012_val_00007035.JPEG n01687978/
+mv val/ILSVRC2012_val_00007036.JPEG n07754684/
+mv val/ILSVRC2012_val_00007037.JPEG n02328150/
+mv val/ILSVRC2012_val_00007038.JPEG n02791124/
+mv val/ILSVRC2012_val_00007039.JPEG n04286575/
+mv val/ILSVRC2012_val_00007040.JPEG n04606251/
+mv val/ILSVRC2012_val_00007041.JPEG n03814639/
+mv val/ILSVRC2012_val_00007042.JPEG n09246464/
+mv val/ILSVRC2012_val_00007043.JPEG n02009229/
+mv val/ILSVRC2012_val_00007044.JPEG n01665541/
+mv val/ILSVRC2012_val_00007045.JPEG n04399382/
+mv val/ILSVRC2012_val_00007046.JPEG n04429376/
+mv val/ILSVRC2012_val_00007047.JPEG n04033995/
+mv val/ILSVRC2012_val_00007048.JPEG n04238763/
+mv val/ILSVRC2012_val_00007049.JPEG n09256479/
+mv val/ILSVRC2012_val_00007050.JPEG n01632458/
+mv val/ILSVRC2012_val_00007051.JPEG n04004767/
+mv val/ILSVRC2012_val_00007052.JPEG n04111531/
+mv val/ILSVRC2012_val_00007053.JPEG n03710637/
+mv val/ILSVRC2012_val_00007054.JPEG n02107908/
+mv val/ILSVRC2012_val_00007055.JPEG n04008634/
+mv val/ILSVRC2012_val_00007056.JPEG n02106382/
+mv val/ILSVRC2012_val_00007057.JPEG n02086079/
+mv val/ILSVRC2012_val_00007058.JPEG n07871810/
+mv val/ILSVRC2012_val_00007059.JPEG n02105505/
+mv val/ILSVRC2012_val_00007060.JPEG n02013706/
+mv val/ILSVRC2012_val_00007061.JPEG n03733131/
+mv val/ILSVRC2012_val_00007062.JPEG n07875152/
+mv val/ILSVRC2012_val_00007063.JPEG n03376595/
+mv val/ILSVRC2012_val_00007064.JPEG n03594945/
+mv val/ILSVRC2012_val_00007065.JPEG n01776313/
+mv val/ILSVRC2012_val_00007066.JPEG n03016953/
+mv val/ILSVRC2012_val_00007067.JPEG n04243546/
+mv val/ILSVRC2012_val_00007068.JPEG n04252225/
+mv val/ILSVRC2012_val_00007069.JPEG n03709823/
+mv val/ILSVRC2012_val_00007070.JPEG n02939185/
+mv val/ILSVRC2012_val_00007071.JPEG n02107574/
+mv val/ILSVRC2012_val_00007072.JPEG n02097047/
+mv val/ILSVRC2012_val_00007073.JPEG n02109525/
+mv val/ILSVRC2012_val_00007074.JPEG n03916031/
+mv val/ILSVRC2012_val_00007075.JPEG n02116738/
+mv val/ILSVRC2012_val_00007076.JPEG n07579787/
+mv val/ILSVRC2012_val_00007077.JPEG n02018795/
+mv val/ILSVRC2012_val_00007078.JPEG n03967562/
+mv val/ILSVRC2012_val_00007079.JPEG n03075370/
+mv val/ILSVRC2012_val_00007080.JPEG n12998815/
+mv val/ILSVRC2012_val_00007081.JPEG n01818515/
+mv val/ILSVRC2012_val_00007082.JPEG n02190166/
+mv val/ILSVRC2012_val_00007083.JPEG n02701002/
+mv val/ILSVRC2012_val_00007084.JPEG n01685808/
+mv val/ILSVRC2012_val_00007085.JPEG n12267677/
+mv val/ILSVRC2012_val_00007086.JPEG n02107683/
+mv val/ILSVRC2012_val_00007087.JPEG n07695742/
+mv val/ILSVRC2012_val_00007088.JPEG n02085782/
+mv val/ILSVRC2012_val_00007089.JPEG n03692522/
+mv val/ILSVRC2012_val_00007090.JPEG n02086646/
+mv val/ILSVRC2012_val_00007091.JPEG n03623198/
+mv val/ILSVRC2012_val_00007092.JPEG n03534580/
+mv val/ILSVRC2012_val_00007093.JPEG n02133161/
+mv val/ILSVRC2012_val_00007094.JPEG n07584110/
+mv val/ILSVRC2012_val_00007095.JPEG n03980874/
+mv val/ILSVRC2012_val_00007096.JPEG n03710721/
+mv val/ILSVRC2012_val_00007097.JPEG n03838899/
+mv val/ILSVRC2012_val_00007098.JPEG n04311174/
+mv val/ILSVRC2012_val_00007099.JPEG n03976467/
+mv val/ILSVRC2012_val_00007100.JPEG n02966687/
+mv val/ILSVRC2012_val_00007101.JPEG n03785016/
+mv val/ILSVRC2012_val_00007102.JPEG n02097658/
+mv val/ILSVRC2012_val_00007103.JPEG n04442312/
+mv val/ILSVRC2012_val_00007104.JPEG n04380533/
+mv val/ILSVRC2012_val_00007105.JPEG n03042490/
+mv val/ILSVRC2012_val_00007106.JPEG n03982430/
+mv val/ILSVRC2012_val_00007107.JPEG n02510455/
+mv val/ILSVRC2012_val_00007108.JPEG n02408429/
+mv val/ILSVRC2012_val_00007109.JPEG n02093859/
+mv val/ILSVRC2012_val_00007110.JPEG n07718472/
+mv val/ILSVRC2012_val_00007111.JPEG n02086079/
+mv val/ILSVRC2012_val_00007112.JPEG n02834397/
+mv val/ILSVRC2012_val_00007113.JPEG n03670208/
+mv val/ILSVRC2012_val_00007114.JPEG n01728572/
+mv val/ILSVRC2012_val_00007115.JPEG n02444819/
+mv val/ILSVRC2012_val_00007116.JPEG n02091467/
+mv val/ILSVRC2012_val_00007117.JPEG n04325704/
+mv val/ILSVRC2012_val_00007118.JPEG n04332243/
+mv val/ILSVRC2012_val_00007119.JPEG n03223299/
+mv val/ILSVRC2012_val_00007120.JPEG n01734418/
+mv val/ILSVRC2012_val_00007121.JPEG n03496892/
+mv val/ILSVRC2012_val_00007122.JPEG n01697457/
+mv val/ILSVRC2012_val_00007123.JPEG n03884397/
+mv val/ILSVRC2012_val_00007124.JPEG n03483316/
+mv val/ILSVRC2012_val_00007125.JPEG n04285008/
+mv val/ILSVRC2012_val_00007126.JPEG n01795545/
+mv val/ILSVRC2012_val_00007127.JPEG n03220513/
+mv val/ILSVRC2012_val_00007128.JPEG n02007558/
+mv val/ILSVRC2012_val_00007129.JPEG n01532829/
+mv val/ILSVRC2012_val_00007130.JPEG n02236044/
+mv val/ILSVRC2012_val_00007131.JPEG n06596364/
+mv val/ILSVRC2012_val_00007132.JPEG n04111531/
+mv val/ILSVRC2012_val_00007133.JPEG n03032252/
+mv val/ILSVRC2012_val_00007134.JPEG n03814639/
+mv val/ILSVRC2012_val_00007135.JPEG n04317175/
+mv val/ILSVRC2012_val_00007136.JPEG n04033995/
+mv val/ILSVRC2012_val_00007137.JPEG n02086079/
+mv val/ILSVRC2012_val_00007138.JPEG n07684084/
+mv val/ILSVRC2012_val_00007139.JPEG n01829413/
+mv val/ILSVRC2012_val_00007140.JPEG n02128757/
+mv val/ILSVRC2012_val_00007141.JPEG n03983396/
+mv val/ILSVRC2012_val_00007142.JPEG n04487081/
+mv val/ILSVRC2012_val_00007143.JPEG n02190166/
+mv val/ILSVRC2012_val_00007144.JPEG n04523525/
+mv val/ILSVRC2012_val_00007145.JPEG n04328186/
+mv val/ILSVRC2012_val_00007146.JPEG n04116512/
+mv val/ILSVRC2012_val_00007147.JPEG n03450230/
+mv val/ILSVRC2012_val_00007148.JPEG n04228054/
+mv val/ILSVRC2012_val_00007149.JPEG n02102177/
+mv val/ILSVRC2012_val_00007150.JPEG n03873416/
+mv val/ILSVRC2012_val_00007151.JPEG n02488702/
+mv val/ILSVRC2012_val_00007152.JPEG n02226429/
+mv val/ILSVRC2012_val_00007153.JPEG n02018207/
+mv val/ILSVRC2012_val_00007154.JPEG n04044716/
+mv val/ILSVRC2012_val_00007155.JPEG n03394916/
+mv val/ILSVRC2012_val_00007156.JPEG n01818515/
+mv val/ILSVRC2012_val_00007157.JPEG n01910747/
+mv val/ILSVRC2012_val_00007158.JPEG n03584829/
+mv val/ILSVRC2012_val_00007159.JPEG n03240683/
+mv val/ILSVRC2012_val_00007160.JPEG n04133789/
+mv val/ILSVRC2012_val_00007161.JPEG n03095699/
+mv val/ILSVRC2012_val_00007162.JPEG n04325704/
+mv val/ILSVRC2012_val_00007163.JPEG n02606052/
+mv val/ILSVRC2012_val_00007164.JPEG n02102318/
+mv val/ILSVRC2012_val_00007165.JPEG n02106382/
+mv val/ILSVRC2012_val_00007166.JPEG n03424325/
+mv val/ILSVRC2012_val_00007167.JPEG n02906734/
+mv val/ILSVRC2012_val_00007168.JPEG n01818515/
+mv val/ILSVRC2012_val_00007169.JPEG n04548362/
+mv val/ILSVRC2012_val_00007170.JPEG n04086273/
+mv val/ILSVRC2012_val_00007171.JPEG n07590611/
+mv val/ILSVRC2012_val_00007172.JPEG n02033041/
+mv val/ILSVRC2012_val_00007173.JPEG n04501370/
+mv val/ILSVRC2012_val_00007174.JPEG n02486261/
+mv val/ILSVRC2012_val_00007175.JPEG n03793489/
+mv val/ILSVRC2012_val_00007176.JPEG n02974003/
+mv val/ILSVRC2012_val_00007177.JPEG n09428293/
+mv val/ILSVRC2012_val_00007178.JPEG n02088466/
+mv val/ILSVRC2012_val_00007179.JPEG n04355933/
+mv val/ILSVRC2012_val_00007180.JPEG n02113712/
+mv val/ILSVRC2012_val_00007181.JPEG n02777292/
+mv val/ILSVRC2012_val_00007182.JPEG n02490219/
+mv val/ILSVRC2012_val_00007183.JPEG n02105056/
+mv val/ILSVRC2012_val_00007184.JPEG n02071294/
+mv val/ILSVRC2012_val_00007185.JPEG n02655020/
+mv val/ILSVRC2012_val_00007186.JPEG n03425413/
+mv val/ILSVRC2012_val_00007187.JPEG n02808440/
+mv val/ILSVRC2012_val_00007188.JPEG n02493509/
+mv val/ILSVRC2012_val_00007189.JPEG n03384352/
+mv val/ILSVRC2012_val_00007190.JPEG n02108422/
+mv val/ILSVRC2012_val_00007191.JPEG n04350905/
+mv val/ILSVRC2012_val_00007192.JPEG n07695742/
+mv val/ILSVRC2012_val_00007193.JPEG n02077923/
+mv val/ILSVRC2012_val_00007194.JPEG n03476991/
+mv val/ILSVRC2012_val_00007195.JPEG n03857828/
+mv val/ILSVRC2012_val_00007196.JPEG n02494079/
+mv val/ILSVRC2012_val_00007197.JPEG n01440764/
+mv val/ILSVRC2012_val_00007198.JPEG n02277742/
+mv val/ILSVRC2012_val_00007199.JPEG n02509815/
+mv val/ILSVRC2012_val_00007200.JPEG n07730033/
+mv val/ILSVRC2012_val_00007201.JPEG n01774384/
+mv val/ILSVRC2012_val_00007202.JPEG n02951585/
+mv val/ILSVRC2012_val_00007203.JPEG n02892201/
+mv val/ILSVRC2012_val_00007204.JPEG n02488702/
+mv val/ILSVRC2012_val_00007205.JPEG n02782093/
+mv val/ILSVRC2012_val_00007206.JPEG n03854065/
+mv val/ILSVRC2012_val_00007207.JPEG n04517823/
+mv val/ILSVRC2012_val_00007208.JPEG n03467068/
+mv val/ILSVRC2012_val_00007209.JPEG n07920052/
+mv val/ILSVRC2012_val_00007210.JPEG n03180011/
+mv val/ILSVRC2012_val_00007211.JPEG n02111129/
+mv val/ILSVRC2012_val_00007212.JPEG n02361337/
+mv val/ILSVRC2012_val_00007213.JPEG n03544143/
+mv val/ILSVRC2012_val_00007214.JPEG n07717556/
+mv val/ILSVRC2012_val_00007215.JPEG n03291819/
+mv val/ILSVRC2012_val_00007216.JPEG n02110063/
+mv val/ILSVRC2012_val_00007217.JPEG n03825788/
+mv val/ILSVRC2012_val_00007218.JPEG n02110185/
+mv val/ILSVRC2012_val_00007219.JPEG n02108422/
+mv val/ILSVRC2012_val_00007220.JPEG n01744401/
+mv val/ILSVRC2012_val_00007221.JPEG n04204347/
+mv val/ILSVRC2012_val_00007222.JPEG n01744401/
+mv val/ILSVRC2012_val_00007223.JPEG n02086079/
+mv val/ILSVRC2012_val_00007224.JPEG n01773549/
+mv val/ILSVRC2012_val_00007225.JPEG n03498962/
+mv val/ILSVRC2012_val_00007226.JPEG n02979186/
+mv val/ILSVRC2012_val_00007227.JPEG n01694178/
+mv val/ILSVRC2012_val_00007228.JPEG n04265275/
+mv val/ILSVRC2012_val_00007229.JPEG n04371774/
+mv val/ILSVRC2012_val_00007230.JPEG n01669191/
+mv val/ILSVRC2012_val_00007231.JPEG n01582220/
+mv val/ILSVRC2012_val_00007232.JPEG n02128925/
+mv val/ILSVRC2012_val_00007233.JPEG n02747177/
+mv val/ILSVRC2012_val_00007234.JPEG n02108551/
+mv val/ILSVRC2012_val_00007235.JPEG n02105056/
+mv val/ILSVRC2012_val_00007236.JPEG n02107312/
+mv val/ILSVRC2012_val_00007237.JPEG n01532829/
+mv val/ILSVRC2012_val_00007238.JPEG n01698640/
+mv val/ILSVRC2012_val_00007239.JPEG n03661043/
+mv val/ILSVRC2012_val_00007240.JPEG n02834397/
+mv val/ILSVRC2012_val_00007241.JPEG n03956157/
+mv val/ILSVRC2012_val_00007242.JPEG n01739381/
+mv val/ILSVRC2012_val_00007243.JPEG n02500267/
+mv val/ILSVRC2012_val_00007244.JPEG n02317335/
+mv val/ILSVRC2012_val_00007245.JPEG n02951358/
+mv val/ILSVRC2012_val_00007246.JPEG n02105505/
+mv val/ILSVRC2012_val_00007247.JPEG n07718747/
+mv val/ILSVRC2012_val_00007248.JPEG n04192698/
+mv val/ILSVRC2012_val_00007249.JPEG n04536866/
+mv val/ILSVRC2012_val_00007250.JPEG n03710637/
+mv val/ILSVRC2012_val_00007251.JPEG n02346627/
+mv val/ILSVRC2012_val_00007252.JPEG n03476684/
+mv val/ILSVRC2012_val_00007253.JPEG n02086910/
+mv val/ILSVRC2012_val_00007254.JPEG n02747177/
+mv val/ILSVRC2012_val_00007255.JPEG n02096177/
+mv val/ILSVRC2012_val_00007256.JPEG n04548280/
+mv val/ILSVRC2012_val_00007257.JPEG n01630670/
+mv val/ILSVRC2012_val_00007258.JPEG n01682714/
+mv val/ILSVRC2012_val_00007259.JPEG n04275548/
+mv val/ILSVRC2012_val_00007260.JPEG n03538406/
+mv val/ILSVRC2012_val_00007261.JPEG n02113712/
+mv val/ILSVRC2012_val_00007262.JPEG n09421951/
+mv val/ILSVRC2012_val_00007263.JPEG n01560419/
+mv val/ILSVRC2012_val_00007264.JPEG n04252225/
+mv val/ILSVRC2012_val_00007265.JPEG n02423022/
+mv val/ILSVRC2012_val_00007266.JPEG n01697457/
+mv val/ILSVRC2012_val_00007267.JPEG n02389026/
+mv val/ILSVRC2012_val_00007268.JPEG n03595614/
+mv val/ILSVRC2012_val_00007269.JPEG n02415577/
+mv val/ILSVRC2012_val_00007270.JPEG n04004767/
+mv val/ILSVRC2012_val_00007271.JPEG n02672831/
+mv val/ILSVRC2012_val_00007272.JPEG n03018349/
+mv val/ILSVRC2012_val_00007273.JPEG n03998194/
+mv val/ILSVRC2012_val_00007274.JPEG n03089624/
+mv val/ILSVRC2012_val_00007275.JPEG n04273569/
+mv val/ILSVRC2012_val_00007276.JPEG n02058221/
+mv val/ILSVRC2012_val_00007277.JPEG n03544143/
+mv val/ILSVRC2012_val_00007278.JPEG n02395406/
+mv val/ILSVRC2012_val_00007279.JPEG n03535780/
+mv val/ILSVRC2012_val_00007280.JPEG n03450230/
+mv val/ILSVRC2012_val_00007281.JPEG n03888605/
+mv val/ILSVRC2012_val_00007282.JPEG n13052670/
+mv val/ILSVRC2012_val_00007283.JPEG n01910747/
+mv val/ILSVRC2012_val_00007284.JPEG n01843065/
+mv val/ILSVRC2012_val_00007285.JPEG n03982430/
+mv val/ILSVRC2012_val_00007286.JPEG n03447721/
+mv val/ILSVRC2012_val_00007287.JPEG n01955084/
+mv val/ILSVRC2012_val_00007288.JPEG n01630670/
+mv val/ILSVRC2012_val_00007289.JPEG n03803284/
+mv val/ILSVRC2012_val_00007290.JPEG n02120079/
+mv val/ILSVRC2012_val_00007291.JPEG n03372029/
+mv val/ILSVRC2012_val_00007292.JPEG n02504458/
+mv val/ILSVRC2012_val_00007293.JPEG n03874599/
+mv val/ILSVRC2012_val_00007294.JPEG n02011460/
+mv val/ILSVRC2012_val_00007295.JPEG n02108089/
+mv val/ILSVRC2012_val_00007296.JPEG n03627232/
+mv val/ILSVRC2012_val_00007297.JPEG n02492660/
+mv val/ILSVRC2012_val_00007298.JPEG n04399382/
+mv val/ILSVRC2012_val_00007299.JPEG n02412080/
+mv val/ILSVRC2012_val_00007300.JPEG n03325584/
+mv val/ILSVRC2012_val_00007301.JPEG n03706229/
+mv val/ILSVRC2012_val_00007302.JPEG n02500267/
+mv val/ILSVRC2012_val_00007303.JPEG n02123159/
+mv val/ILSVRC2012_val_00007304.JPEG n04238763/
+mv val/ILSVRC2012_val_00007305.JPEG n02883205/
+mv val/ILSVRC2012_val_00007306.JPEG n13044778/
+mv val/ILSVRC2012_val_00007307.JPEG n07836838/
+mv val/ILSVRC2012_val_00007308.JPEG n02799071/
+mv val/ILSVRC2012_val_00007309.JPEG n01917289/
+mv val/ILSVRC2012_val_00007310.JPEG n04273569/
+mv val/ILSVRC2012_val_00007311.JPEG n04552348/
+mv val/ILSVRC2012_val_00007312.JPEG n01795545/
+mv val/ILSVRC2012_val_00007313.JPEG n02011460/
+mv val/ILSVRC2012_val_00007314.JPEG n03944341/
+mv val/ILSVRC2012_val_00007315.JPEG n02356798/
+mv val/ILSVRC2012_val_00007316.JPEG n04264628/
+mv val/ILSVRC2012_val_00007317.JPEG n02859443/
+mv val/ILSVRC2012_val_00007318.JPEG n02108915/
+mv val/ILSVRC2012_val_00007319.JPEG n02108422/
+mv val/ILSVRC2012_val_00007320.JPEG n04591713/
+mv val/ILSVRC2012_val_00007321.JPEG n02099849/
+mv val/ILSVRC2012_val_00007322.JPEG n07693725/
+mv val/ILSVRC2012_val_00007323.JPEG n01795545/
+mv val/ILSVRC2012_val_00007324.JPEG n04596742/
+mv val/ILSVRC2012_val_00007325.JPEG n03868242/
+mv val/ILSVRC2012_val_00007326.JPEG n03958227/
+mv val/ILSVRC2012_val_00007327.JPEG n02093991/
+mv val/ILSVRC2012_val_00007328.JPEG n03134739/
+mv val/ILSVRC2012_val_00007329.JPEG n01917289/
+mv val/ILSVRC2012_val_00007330.JPEG n02099712/
+mv val/ILSVRC2012_val_00007331.JPEG n03314780/
+mv val/ILSVRC2012_val_00007332.JPEG n11879895/
+mv val/ILSVRC2012_val_00007333.JPEG n10148035/
+mv val/ILSVRC2012_val_00007334.JPEG n02018795/
+mv val/ILSVRC2012_val_00007335.JPEG n02747177/
+mv val/ILSVRC2012_val_00007336.JPEG n04542943/
+mv val/ILSVRC2012_val_00007337.JPEG n03141823/
+mv val/ILSVRC2012_val_00007338.JPEG n02797295/
+mv val/ILSVRC2012_val_00007339.JPEG n01704323/
+mv val/ILSVRC2012_val_00007340.JPEG n02777292/
+mv val/ILSVRC2012_val_00007341.JPEG n02769748/
+mv val/ILSVRC2012_val_00007342.JPEG n04033995/
+mv val/ILSVRC2012_val_00007343.JPEG n01860187/
+mv val/ILSVRC2012_val_00007344.JPEG n02321529/
+mv val/ILSVRC2012_val_00007345.JPEG n01917289/
+mv val/ILSVRC2012_val_00007346.JPEG n03785016/
+mv val/ILSVRC2012_val_00007347.JPEG n03956157/
+mv val/ILSVRC2012_val_00007348.JPEG n03100240/
+mv val/ILSVRC2012_val_00007349.JPEG n04041544/
+mv val/ILSVRC2012_val_00007350.JPEG n02165105/
+mv val/ILSVRC2012_val_00007351.JPEG n03947888/
+mv val/ILSVRC2012_val_00007352.JPEG n03891251/
+mv val/ILSVRC2012_val_00007353.JPEG n03709823/
+mv val/ILSVRC2012_val_00007354.JPEG n02988304/
+mv val/ILSVRC2012_val_00007355.JPEG n02106030/
+mv val/ILSVRC2012_val_00007356.JPEG n02095570/
+mv val/ILSVRC2012_val_00007357.JPEG n02814860/
+mv val/ILSVRC2012_val_00007358.JPEG n03649909/
+mv val/ILSVRC2012_val_00007359.JPEG n03110669/
+mv val/ILSVRC2012_val_00007360.JPEG n02444819/
+mv val/ILSVRC2012_val_00007361.JPEG n04044716/
+mv val/ILSVRC2012_val_00007362.JPEG n04487394/
+mv val/ILSVRC2012_val_00007363.JPEG n02422106/
+mv val/ILSVRC2012_val_00007364.JPEG n04069434/
+mv val/ILSVRC2012_val_00007365.JPEG n02165456/
+mv val/ILSVRC2012_val_00007366.JPEG n02098105/
+mv val/ILSVRC2012_val_00007367.JPEG n02106382/
+mv val/ILSVRC2012_val_00007368.JPEG n02280649/
+mv val/ILSVRC2012_val_00007369.JPEG n02002556/
+mv val/ILSVRC2012_val_00007370.JPEG n01980166/
+mv val/ILSVRC2012_val_00007371.JPEG n02091032/
+mv val/ILSVRC2012_val_00007372.JPEG n09229709/
+mv val/ILSVRC2012_val_00007373.JPEG n03642806/
+mv val/ILSVRC2012_val_00007374.JPEG n03770679/
+mv val/ILSVRC2012_val_00007375.JPEG n02172182/
+mv val/ILSVRC2012_val_00007376.JPEG n07892512/
+mv val/ILSVRC2012_val_00007377.JPEG n01944390/
+mv val/ILSVRC2012_val_00007378.JPEG n04462240/
+mv val/ILSVRC2012_val_00007379.JPEG n02114548/
+mv val/ILSVRC2012_val_00007380.JPEG n02403003/
+mv val/ILSVRC2012_val_00007381.JPEG n03899768/
+mv val/ILSVRC2012_val_00007382.JPEG n09472597/
+mv val/ILSVRC2012_val_00007383.JPEG n03530642/
+mv val/ILSVRC2012_val_00007384.JPEG n02974003/
+mv val/ILSVRC2012_val_00007385.JPEG n02777292/
+mv val/ILSVRC2012_val_00007386.JPEG n02093428/
+mv val/ILSVRC2012_val_00007387.JPEG n01829413/
+mv val/ILSVRC2012_val_00007388.JPEG n02097298/
+mv val/ILSVRC2012_val_00007389.JPEG n01882714/
+mv val/ILSVRC2012_val_00007390.JPEG n01833805/
+mv val/ILSVRC2012_val_00007391.JPEG n03481172/
+mv val/ILSVRC2012_val_00007392.JPEG n02094114/
+mv val/ILSVRC2012_val_00007393.JPEG n03218198/
+mv val/ILSVRC2012_val_00007394.JPEG n02640242/
+mv val/ILSVRC2012_val_00007395.JPEG n02422699/
+mv val/ILSVRC2012_val_00007396.JPEG n03297495/
+mv val/ILSVRC2012_val_00007397.JPEG n04592741/
+mv val/ILSVRC2012_val_00007398.JPEG n01644373/
+mv val/ILSVRC2012_val_00007399.JPEG n02066245/
+mv val/ILSVRC2012_val_00007400.JPEG n03028079/
+mv val/ILSVRC2012_val_00007401.JPEG n04399382/
+mv val/ILSVRC2012_val_00007402.JPEG n03355925/
+mv val/ILSVRC2012_val_00007403.JPEG n03187595/
+mv val/ILSVRC2012_val_00007404.JPEG n02071294/
+mv val/ILSVRC2012_val_00007405.JPEG n01494475/
+mv val/ILSVRC2012_val_00007406.JPEG n02119789/
+mv val/ILSVRC2012_val_00007407.JPEG n02963159/
+mv val/ILSVRC2012_val_00007408.JPEG n03976657/
+mv val/ILSVRC2012_val_00007409.JPEG n03759954/
+mv val/ILSVRC2012_val_00007410.JPEG n02916936/
+mv val/ILSVRC2012_val_00007411.JPEG n02120079/
+mv val/ILSVRC2012_val_00007412.JPEG n03109150/
+mv val/ILSVRC2012_val_00007413.JPEG n04370456/
+mv val/ILSVRC2012_val_00007414.JPEG n02817516/
+mv val/ILSVRC2012_val_00007415.JPEG n01734418/
+mv val/ILSVRC2012_val_00007416.JPEG n02415577/
+mv val/ILSVRC2012_val_00007417.JPEG n03691459/
+mv val/ILSVRC2012_val_00007418.JPEG n04023962/
+mv val/ILSVRC2012_val_00007419.JPEG n02114712/
+mv val/ILSVRC2012_val_00007420.JPEG n03995372/
+mv val/ILSVRC2012_val_00007421.JPEG n06359193/
+mv val/ILSVRC2012_val_00007422.JPEG n01943899/
+mv val/ILSVRC2012_val_00007423.JPEG n01860187/
+mv val/ILSVRC2012_val_00007424.JPEG n02859443/
+mv val/ILSVRC2012_val_00007425.JPEG n02268443/
+mv val/ILSVRC2012_val_00007426.JPEG n02488702/
+mv val/ILSVRC2012_val_00007427.JPEG n03110669/
+mv val/ILSVRC2012_val_00007428.JPEG n03250847/
+mv val/ILSVRC2012_val_00007429.JPEG n02165105/
+mv val/ILSVRC2012_val_00007430.JPEG n02102480/
+mv val/ILSVRC2012_val_00007431.JPEG n03026506/
+mv val/ILSVRC2012_val_00007432.JPEG n04465501/
+mv val/ILSVRC2012_val_00007433.JPEG n03733131/
+mv val/ILSVRC2012_val_00007434.JPEG n01910747/
+mv val/ILSVRC2012_val_00007435.JPEG n04277352/
+mv val/ILSVRC2012_val_00007436.JPEG n03065424/
+mv val/ILSVRC2012_val_00007437.JPEG n01644900/
+mv val/ILSVRC2012_val_00007438.JPEG n02951358/
+mv val/ILSVRC2012_val_00007439.JPEG n04399382/
+mv val/ILSVRC2012_val_00007440.JPEG n02326432/
+mv val/ILSVRC2012_val_00007441.JPEG n03529860/
+mv val/ILSVRC2012_val_00007442.JPEG n03764736/
+mv val/ILSVRC2012_val_00007443.JPEG n02444819/
+mv val/ILSVRC2012_val_00007444.JPEG n02093256/
+mv val/ILSVRC2012_val_00007445.JPEG n02091134/
+mv val/ILSVRC2012_val_00007446.JPEG n02091635/
+mv val/ILSVRC2012_val_00007447.JPEG n11879895/
+mv val/ILSVRC2012_val_00007448.JPEG n03657121/
+mv val/ILSVRC2012_val_00007449.JPEG n04613696/
+mv val/ILSVRC2012_val_00007450.JPEG n03452741/
+mv val/ILSVRC2012_val_00007451.JPEG n04596742/
+mv val/ILSVRC2012_val_00007452.JPEG n02097474/
+mv val/ILSVRC2012_val_00007453.JPEG n02672831/
+mv val/ILSVRC2012_val_00007454.JPEG n01968897/
+mv val/ILSVRC2012_val_00007455.JPEG n02486410/
+mv val/ILSVRC2012_val_00007456.JPEG n02488291/
+mv val/ILSVRC2012_val_00007457.JPEG n02356798/
+mv val/ILSVRC2012_val_00007458.JPEG n07749582/
+mv val/ILSVRC2012_val_00007459.JPEG n04033995/
+mv val/ILSVRC2012_val_00007460.JPEG n03000684/
+mv val/ILSVRC2012_val_00007461.JPEG n04428191/
+mv val/ILSVRC2012_val_00007462.JPEG n02089078/
+mv val/ILSVRC2012_val_00007463.JPEG n04005630/
+mv val/ILSVRC2012_val_00007464.JPEG n03476991/
+mv val/ILSVRC2012_val_00007465.JPEG n02817516/
+mv val/ILSVRC2012_val_00007466.JPEG n04371774/
+mv val/ILSVRC2012_val_00007467.JPEG n12144580/
+mv val/ILSVRC2012_val_00007468.JPEG n12144580/
+mv val/ILSVRC2012_val_00007469.JPEG n03950228/
+mv val/ILSVRC2012_val_00007470.JPEG n02009912/
+mv val/ILSVRC2012_val_00007471.JPEG n03425413/
+mv val/ILSVRC2012_val_00007472.JPEG n04141975/
+mv val/ILSVRC2012_val_00007473.JPEG n02790996/
+mv val/ILSVRC2012_val_00007474.JPEG n01818515/
+mv val/ILSVRC2012_val_00007475.JPEG n07583066/
+mv val/ILSVRC2012_val_00007476.JPEG n04116512/
+mv val/ILSVRC2012_val_00007477.JPEG n03417042/
+mv val/ILSVRC2012_val_00007478.JPEG n01739381/
+mv val/ILSVRC2012_val_00007479.JPEG n01944390/
+mv val/ILSVRC2012_val_00007480.JPEG n03447721/
+mv val/ILSVRC2012_val_00007481.JPEG n03891332/
+mv val/ILSVRC2012_val_00007482.JPEG n01689811/
+mv val/ILSVRC2012_val_00007483.JPEG n04081281/
+mv val/ILSVRC2012_val_00007484.JPEG n02892767/
+mv val/ILSVRC2012_val_00007485.JPEG n04590129/
+mv val/ILSVRC2012_val_00007486.JPEG n01632777/
+mv val/ILSVRC2012_val_00007487.JPEG n02086910/
+mv val/ILSVRC2012_val_00007488.JPEG n01742172/
+mv val/ILSVRC2012_val_00007489.JPEG n04579145/
+mv val/ILSVRC2012_val_00007490.JPEG n02814860/
+mv val/ILSVRC2012_val_00007491.JPEG n04458633/
+mv val/ILSVRC2012_val_00007492.JPEG n04487394/
+mv val/ILSVRC2012_val_00007493.JPEG n02088632/
+mv val/ILSVRC2012_val_00007494.JPEG n03942813/
+mv val/ILSVRC2012_val_00007495.JPEG n04162706/
+mv val/ILSVRC2012_val_00007496.JPEG n07613480/
+mv val/ILSVRC2012_val_00007497.JPEG n02098413/
+mv val/ILSVRC2012_val_00007498.JPEG n04037443/
+mv val/ILSVRC2012_val_00007499.JPEG n02457408/
+mv val/ILSVRC2012_val_00007500.JPEG n04461696/
+mv val/ILSVRC2012_val_00007501.JPEG n02110185/
+mv val/ILSVRC2012_val_00007502.JPEG n03887697/
+mv val/ILSVRC2012_val_00007503.JPEG n03344393/
+mv val/ILSVRC2012_val_00007504.JPEG n04336792/
+mv val/ILSVRC2012_val_00007505.JPEG n04209239/
+mv val/ILSVRC2012_val_00007506.JPEG n02480495/
+mv val/ILSVRC2012_val_00007507.JPEG n02102480/
+mv val/ILSVRC2012_val_00007508.JPEG n04040759/
+mv val/ILSVRC2012_val_00007509.JPEG n03372029/
+mv val/ILSVRC2012_val_00007510.JPEG n03017168/
+mv val/ILSVRC2012_val_00007511.JPEG n02087046/
+mv val/ILSVRC2012_val_00007512.JPEG n02110185/
+mv val/ILSVRC2012_val_00007513.JPEG n04131690/
+mv val/ILSVRC2012_val_00007514.JPEG n02133161/
+mv val/ILSVRC2012_val_00007515.JPEG n02749479/
+mv val/ILSVRC2012_val_00007516.JPEG n02092002/
+mv val/ILSVRC2012_val_00007517.JPEG n04612504/
+mv val/ILSVRC2012_val_00007518.JPEG n03388183/
+mv val/ILSVRC2012_val_00007519.JPEG n03417042/
+mv val/ILSVRC2012_val_00007520.JPEG n02168699/
+mv val/ILSVRC2012_val_00007521.JPEG n07248320/
+mv val/ILSVRC2012_val_00007522.JPEG n02012849/
+mv val/ILSVRC2012_val_00007523.JPEG n03791053/
+mv val/ILSVRC2012_val_00007524.JPEG n02027492/
+mv val/ILSVRC2012_val_00007525.JPEG n07768694/
+mv val/ILSVRC2012_val_00007526.JPEG n02115913/
+mv val/ILSVRC2012_val_00007527.JPEG n02093428/
+mv val/ILSVRC2012_val_00007528.JPEG n01630670/
+mv val/ILSVRC2012_val_00007529.JPEG n02226429/
+mv val/ILSVRC2012_val_00007530.JPEG n01514859/
+mv val/ILSVRC2012_val_00007531.JPEG n07716358/
+mv val/ILSVRC2012_val_00007532.JPEG n02860847/
+mv val/ILSVRC2012_val_00007533.JPEG n04041544/
+mv val/ILSVRC2012_val_00007534.JPEG n02105505/
+mv val/ILSVRC2012_val_00007535.JPEG n02107683/
+mv val/ILSVRC2012_val_00007536.JPEG n03394916/
+mv val/ILSVRC2012_val_00007537.JPEG n03384352/
+mv val/ILSVRC2012_val_00007538.JPEG n04536866/
+mv val/ILSVRC2012_val_00007539.JPEG n02107312/
+mv val/ILSVRC2012_val_00007540.JPEG n04487081/
+mv val/ILSVRC2012_val_00007541.JPEG n02447366/
+mv val/ILSVRC2012_val_00007542.JPEG n02113186/
+mv val/ILSVRC2012_val_00007543.JPEG n03777754/
+mv val/ILSVRC2012_val_00007544.JPEG n03496892/
+mv val/ILSVRC2012_val_00007545.JPEG n09421951/
+mv val/ILSVRC2012_val_00007546.JPEG n02097298/
+mv val/ILSVRC2012_val_00007547.JPEG n02112706/
+mv val/ILSVRC2012_val_00007548.JPEG n02128757/
+mv val/ILSVRC2012_val_00007549.JPEG n02169497/
+mv val/ILSVRC2012_val_00007550.JPEG n03933933/
+mv val/ILSVRC2012_val_00007551.JPEG n02109961/
+mv val/ILSVRC2012_val_00007552.JPEG n04254120/
+mv val/ILSVRC2012_val_00007553.JPEG n04562935/
+mv val/ILSVRC2012_val_00007554.JPEG n02457408/
+mv val/ILSVRC2012_val_00007555.JPEG n02093754/
+mv val/ILSVRC2012_val_00007556.JPEG n15075141/
+mv val/ILSVRC2012_val_00007557.JPEG n02788148/
+mv val/ILSVRC2012_val_00007558.JPEG n01751748/
+mv val/ILSVRC2012_val_00007559.JPEG n02837789/
+mv val/ILSVRC2012_val_00007560.JPEG n06359193/
+mv val/ILSVRC2012_val_00007561.JPEG n01630670/
+mv val/ILSVRC2012_val_00007562.JPEG n03908618/
+mv val/ILSVRC2012_val_00007563.JPEG n07754684/
+mv val/ILSVRC2012_val_00007564.JPEG n02013706/
+mv val/ILSVRC2012_val_00007565.JPEG n03680355/
+mv val/ILSVRC2012_val_00007566.JPEG n02788148/
+mv val/ILSVRC2012_val_00007567.JPEG n06794110/
+mv val/ILSVRC2012_val_00007568.JPEG n02102040/
+mv val/ILSVRC2012_val_00007569.JPEG n01496331/
+mv val/ILSVRC2012_val_00007570.JPEG n03482405/
+mv val/ILSVRC2012_val_00007571.JPEG n02107312/
+mv val/ILSVRC2012_val_00007572.JPEG n13054560/
+mv val/ILSVRC2012_val_00007573.JPEG n03843555/
+mv val/ILSVRC2012_val_00007574.JPEG n01644373/
+mv val/ILSVRC2012_val_00007575.JPEG n02894605/
+mv val/ILSVRC2012_val_00007576.JPEG n01818515/
+mv val/ILSVRC2012_val_00007577.JPEG n03899768/
+mv val/ILSVRC2012_val_00007578.JPEG n02134084/
+mv val/ILSVRC2012_val_00007579.JPEG n01692333/
+mv val/ILSVRC2012_val_00007580.JPEG n02948072/
+mv val/ILSVRC2012_val_00007581.JPEG n03743016/
+mv val/ILSVRC2012_val_00007582.JPEG n07583066/
+mv val/ILSVRC2012_val_00007583.JPEG n02279972/
+mv val/ILSVRC2012_val_00007584.JPEG n07760859/
+mv val/ILSVRC2012_val_00007585.JPEG n03868863/
+mv val/ILSVRC2012_val_00007586.JPEG n02422699/
+mv val/ILSVRC2012_val_00007587.JPEG n02825657/
+mv val/ILSVRC2012_val_00007588.JPEG n02480855/
+mv val/ILSVRC2012_val_00007589.JPEG n02226429/
+mv val/ILSVRC2012_val_00007590.JPEG n04033901/
+mv val/ILSVRC2012_val_00007591.JPEG n01817953/
+mv val/ILSVRC2012_val_00007592.JPEG n04285008/
+mv val/ILSVRC2012_val_00007593.JPEG n04550184/
+mv val/ILSVRC2012_val_00007594.JPEG n04476259/
+mv val/ILSVRC2012_val_00007595.JPEG n02100877/
+mv val/ILSVRC2012_val_00007596.JPEG n09835506/
+mv val/ILSVRC2012_val_00007597.JPEG n02410509/
+mv val/ILSVRC2012_val_00007598.JPEG n03207743/
+mv val/ILSVRC2012_val_00007599.JPEG n03877845/
+mv val/ILSVRC2012_val_00007600.JPEG n03947888/
+mv val/ILSVRC2012_val_00007601.JPEG n01774750/
+mv val/ILSVRC2012_val_00007602.JPEG n02641379/
+mv val/ILSVRC2012_val_00007603.JPEG n04584207/
+mv val/ILSVRC2012_val_00007604.JPEG n02481823/
+mv val/ILSVRC2012_val_00007605.JPEG n07768694/
+mv val/ILSVRC2012_val_00007606.JPEG n02130308/
+mv val/ILSVRC2012_val_00007607.JPEG n04147183/
+mv val/ILSVRC2012_val_00007608.JPEG n04596742/
+mv val/ILSVRC2012_val_00007609.JPEG n02395406/
+mv val/ILSVRC2012_val_00007610.JPEG n07754684/
+mv val/ILSVRC2012_val_00007611.JPEG n04252225/
+mv val/ILSVRC2012_val_00007612.JPEG n04118538/
+mv val/ILSVRC2012_val_00007613.JPEG n09256479/
+mv val/ILSVRC2012_val_00007614.JPEG n07742313/
+mv val/ILSVRC2012_val_00007615.JPEG n02769748/
+mv val/ILSVRC2012_val_00007616.JPEG n03888257/
+mv val/ILSVRC2012_val_00007617.JPEG n03658185/
+mv val/ILSVRC2012_val_00007618.JPEG n04067472/
+mv val/ILSVRC2012_val_00007619.JPEG n02481823/
+mv val/ILSVRC2012_val_00007620.JPEG n03255030/
+mv val/ILSVRC2012_val_00007621.JPEG n03903868/
+mv val/ILSVRC2012_val_00007622.JPEG n03124043/
+mv val/ILSVRC2012_val_00007623.JPEG n03874599/
+mv val/ILSVRC2012_val_00007624.JPEG n06596364/
+mv val/ILSVRC2012_val_00007625.JPEG n04355933/
+mv val/ILSVRC2012_val_00007626.JPEG n04613696/
+mv val/ILSVRC2012_val_00007627.JPEG n04357314/
+mv val/ILSVRC2012_val_00007628.JPEG n02814860/
+mv val/ILSVRC2012_val_00007629.JPEG n02099601/
+mv val/ILSVRC2012_val_00007630.JPEG n01806567/
+mv val/ILSVRC2012_val_00007631.JPEG n02396427/
+mv val/ILSVRC2012_val_00007632.JPEG n02106166/
+mv val/ILSVRC2012_val_00007633.JPEG n03769881/
+mv val/ILSVRC2012_val_00007634.JPEG n02113023/
+mv val/ILSVRC2012_val_00007635.JPEG n04146614/
+mv val/ILSVRC2012_val_00007636.JPEG n02640242/
+mv val/ILSVRC2012_val_00007637.JPEG n02966193/
+mv val/ILSVRC2012_val_00007638.JPEG n02841315/
+mv val/ILSVRC2012_val_00007639.JPEG n02481823/
+mv val/ILSVRC2012_val_00007640.JPEG n03724870/
+mv val/ILSVRC2012_val_00007641.JPEG n03998194/
+mv val/ILSVRC2012_val_00007642.JPEG n04522168/
+mv val/ILSVRC2012_val_00007643.JPEG n02747177/
+mv val/ILSVRC2012_val_00007644.JPEG n02317335/
+mv val/ILSVRC2012_val_00007645.JPEG n04067472/
+mv val/ILSVRC2012_val_00007646.JPEG n02129165/
+mv val/ILSVRC2012_val_00007647.JPEG n07714571/
+mv val/ILSVRC2012_val_00007648.JPEG n03992509/
+mv val/ILSVRC2012_val_00007649.JPEG n03379051/
+mv val/ILSVRC2012_val_00007650.JPEG n04141975/
+mv val/ILSVRC2012_val_00007651.JPEG n02028035/
+mv val/ILSVRC2012_val_00007652.JPEG n02085936/
+mv val/ILSVRC2012_val_00007653.JPEG n04540053/
+mv val/ILSVRC2012_val_00007654.JPEG n02112137/
+mv val/ILSVRC2012_val_00007655.JPEG n03977966/
+mv val/ILSVRC2012_val_00007656.JPEG n03637318/
+mv val/ILSVRC2012_val_00007657.JPEG n03887697/
+mv val/ILSVRC2012_val_00007658.JPEG n09468604/
+mv val/ILSVRC2012_val_00007659.JPEG n03424325/
+mv val/ILSVRC2012_val_00007660.JPEG n04584207/
+mv val/ILSVRC2012_val_00007661.JPEG n01917289/
+mv val/ILSVRC2012_val_00007662.JPEG n07579787/
+mv val/ILSVRC2012_val_00007663.JPEG n03325584/
+mv val/ILSVRC2012_val_00007664.JPEG n01829413/
+mv val/ILSVRC2012_val_00007665.JPEG n04540053/
+mv val/ILSVRC2012_val_00007666.JPEG n03127925/
+mv val/ILSVRC2012_val_00007667.JPEG n01558993/
+mv val/ILSVRC2012_val_00007668.JPEG n02027492/
+mv val/ILSVRC2012_val_00007669.JPEG n03424325/
+mv val/ILSVRC2012_val_00007670.JPEG n03109150/
+mv val/ILSVRC2012_val_00007671.JPEG n06794110/
+mv val/ILSVRC2012_val_00007672.JPEG n01773797/
+mv val/ILSVRC2012_val_00007673.JPEG n03188531/
+mv val/ILSVRC2012_val_00007674.JPEG n02106382/
+mv val/ILSVRC2012_val_00007675.JPEG n03788365/
+mv val/ILSVRC2012_val_00007676.JPEG n02123159/
+mv val/ILSVRC2012_val_00007677.JPEG n01773797/
+mv val/ILSVRC2012_val_00007678.JPEG n02229544/
+mv val/ILSVRC2012_val_00007679.JPEG n02727426/
+mv val/ILSVRC2012_val_00007680.JPEG n02823428/
+mv val/ILSVRC2012_val_00007681.JPEG n02454379/
+mv val/ILSVRC2012_val_00007682.JPEG n02106030/
+mv val/ILSVRC2012_val_00007683.JPEG n01924916/
+mv val/ILSVRC2012_val_00007684.JPEG n12998815/
+mv val/ILSVRC2012_val_00007685.JPEG n04179913/
+mv val/ILSVRC2012_val_00007686.JPEG n04099969/
+mv val/ILSVRC2012_val_00007687.JPEG n07684084/
+mv val/ILSVRC2012_val_00007688.JPEG n03450230/
+mv val/ILSVRC2012_val_00007689.JPEG n04435653/
+mv val/ILSVRC2012_val_00007690.JPEG n02422106/
+mv val/ILSVRC2012_val_00007691.JPEG n03637318/
+mv val/ILSVRC2012_val_00007692.JPEG n03018349/
+mv val/ILSVRC2012_val_00007693.JPEG n04429376/
+mv val/ILSVRC2012_val_00007694.JPEG n03868863/
+mv val/ILSVRC2012_val_00007695.JPEG n02110806/
+mv val/ILSVRC2012_val_00007696.JPEG n02226429/
+mv val/ILSVRC2012_val_00007697.JPEG n02006656/
+mv val/ILSVRC2012_val_00007698.JPEG n03843555/
+mv val/ILSVRC2012_val_00007699.JPEG n06359193/
+mv val/ILSVRC2012_val_00007700.JPEG n01860187/
+mv val/ILSVRC2012_val_00007701.JPEG n01694178/
+mv val/ILSVRC2012_val_00007702.JPEG n02138441/
+mv val/ILSVRC2012_val_00007703.JPEG n03630383/
+mv val/ILSVRC2012_val_00007704.JPEG n04009552/
+mv val/ILSVRC2012_val_00007705.JPEG n02101006/
+mv val/ILSVRC2012_val_00007706.JPEG n03496892/
+mv val/ILSVRC2012_val_00007707.JPEG n03447721/
+mv val/ILSVRC2012_val_00007708.JPEG n07920052/
+mv val/ILSVRC2012_val_00007709.JPEG n07873807/
+mv val/ILSVRC2012_val_00007710.JPEG n01729977/
+mv val/ILSVRC2012_val_00007711.JPEG n03220513/
+mv val/ILSVRC2012_val_00007712.JPEG n01614925/
+mv val/ILSVRC2012_val_00007713.JPEG n02134084/
+mv val/ILSVRC2012_val_00007714.JPEG n03908618/
+mv val/ILSVRC2012_val_00007715.JPEG n03763968/
+mv val/ILSVRC2012_val_00007716.JPEG n03544143/
+mv val/ILSVRC2012_val_00007717.JPEG n02797295/
+mv val/ILSVRC2012_val_00007718.JPEG n04392985/
+mv val/ILSVRC2012_val_00007719.JPEG n01728920/
+mv val/ILSVRC2012_val_00007720.JPEG n03876231/
+mv val/ILSVRC2012_val_00007721.JPEG n03259280/
+mv val/ILSVRC2012_val_00007722.JPEG n03325584/
+mv val/ILSVRC2012_val_00007723.JPEG n04296562/
+mv val/ILSVRC2012_val_00007724.JPEG n02909870/
+mv val/ILSVRC2012_val_00007725.JPEG n02493793/
+mv val/ILSVRC2012_val_00007726.JPEG n02112706/
+mv val/ILSVRC2012_val_00007727.JPEG n02776631/
+mv val/ILSVRC2012_val_00007728.JPEG n02447366/
+mv val/ILSVRC2012_val_00007729.JPEG n01514859/
+mv val/ILSVRC2012_val_00007730.JPEG n03954731/
+mv val/ILSVRC2012_val_00007731.JPEG n03344393/
+mv val/ILSVRC2012_val_00007732.JPEG n04125021/
+mv val/ILSVRC2012_val_00007733.JPEG n03930630/
+mv val/ILSVRC2012_val_00007734.JPEG n04116512/
+mv val/ILSVRC2012_val_00007735.JPEG n02441942/
+mv val/ILSVRC2012_val_00007736.JPEG n03344393/
+mv val/ILSVRC2012_val_00007737.JPEG n02125311/
+mv val/ILSVRC2012_val_00007738.JPEG n02643566/
+mv val/ILSVRC2012_val_00007739.JPEG n03840681/
+mv val/ILSVRC2012_val_00007740.JPEG n02106662/
+mv val/ILSVRC2012_val_00007741.JPEG n03325584/
+mv val/ILSVRC2012_val_00007742.JPEG n07695742/
+mv val/ILSVRC2012_val_00007743.JPEG n01491361/
+mv val/ILSVRC2012_val_00007744.JPEG n03814906/
+mv val/ILSVRC2012_val_00007745.JPEG n03075370/
+mv val/ILSVRC2012_val_00007746.JPEG n02098286/
+mv val/ILSVRC2012_val_00007747.JPEG n02666196/
+mv val/ILSVRC2012_val_00007748.JPEG n07718472/
+mv val/ILSVRC2012_val_00007749.JPEG n02948072/
+mv val/ILSVRC2012_val_00007750.JPEG n01698640/
+mv val/ILSVRC2012_val_00007751.JPEG n03777754/
+mv val/ILSVRC2012_val_00007752.JPEG n07714571/
+mv val/ILSVRC2012_val_00007753.JPEG n01945685/
+mv val/ILSVRC2012_val_00007754.JPEG n03085013/
+mv val/ILSVRC2012_val_00007755.JPEG n03445777/
+mv val/ILSVRC2012_val_00007756.JPEG n04380533/
+mv val/ILSVRC2012_val_00007757.JPEG n01986214/
+mv val/ILSVRC2012_val_00007758.JPEG n03673027/
+mv val/ILSVRC2012_val_00007759.JPEG n03710193/
+mv val/ILSVRC2012_val_00007760.JPEG n02441942/
+mv val/ILSVRC2012_val_00007761.JPEG n01734418/
+mv val/ILSVRC2012_val_00007762.JPEG n02105412/
+mv val/ILSVRC2012_val_00007763.JPEG n03447447/
+mv val/ILSVRC2012_val_00007764.JPEG n04591157/
+mv val/ILSVRC2012_val_00007765.JPEG n02727426/
+mv val/ILSVRC2012_val_00007766.JPEG n04486054/
+mv val/ILSVRC2012_val_00007767.JPEG n02510455/
+mv val/ILSVRC2012_val_00007768.JPEG n03958227/
+mv val/ILSVRC2012_val_00007769.JPEG n01978455/
+mv val/ILSVRC2012_val_00007770.JPEG n04461696/
+mv val/ILSVRC2012_val_00007771.JPEG n03908618/
+mv val/ILSVRC2012_val_00007772.JPEG n04522168/
+mv val/ILSVRC2012_val_00007773.JPEG n02107908/
+mv val/ILSVRC2012_val_00007774.JPEG n07715103/
+mv val/ILSVRC2012_val_00007775.JPEG n04009552/
+mv val/ILSVRC2012_val_00007776.JPEG n03457902/
+mv val/ILSVRC2012_val_00007777.JPEG n03447447/
+mv val/ILSVRC2012_val_00007778.JPEG n01820546/
+mv val/ILSVRC2012_val_00007779.JPEG n02692877/
+mv val/ILSVRC2012_val_00007780.JPEG n03874599/
+mv val/ILSVRC2012_val_00007781.JPEG n02101388/
+mv val/ILSVRC2012_val_00007782.JPEG n02115641/
+mv val/ILSVRC2012_val_00007783.JPEG n03532672/
+mv val/ILSVRC2012_val_00007784.JPEG n03127925/
+mv val/ILSVRC2012_val_00007785.JPEG n04081281/
+mv val/ILSVRC2012_val_00007786.JPEG n02814533/
+mv val/ILSVRC2012_val_00007787.JPEG n02916936/
+mv val/ILSVRC2012_val_00007788.JPEG n02483708/
+mv val/ILSVRC2012_val_00007789.JPEG n02791124/
+mv val/ILSVRC2012_val_00007790.JPEG n04505470/
+mv val/ILSVRC2012_val_00007791.JPEG n04417672/
+mv val/ILSVRC2012_val_00007792.JPEG n03876231/
+mv val/ILSVRC2012_val_00007793.JPEG n01829413/
+mv val/ILSVRC2012_val_00007794.JPEG n09246464/
+mv val/ILSVRC2012_val_00007795.JPEG n01728920/
+mv val/ILSVRC2012_val_00007796.JPEG n02363005/
+mv val/ILSVRC2012_val_00007797.JPEG n07754684/
+mv val/ILSVRC2012_val_00007798.JPEG n07717556/
+mv val/ILSVRC2012_val_00007799.JPEG n03000247/
+mv val/ILSVRC2012_val_00007800.JPEG n01873310/
+mv val/ILSVRC2012_val_00007801.JPEG n02091635/
+mv val/ILSVRC2012_val_00007802.JPEG n07831146/
+mv val/ILSVRC2012_val_00007803.JPEG n02794156/
+mv val/ILSVRC2012_val_00007804.JPEG n03825788/
+mv val/ILSVRC2012_val_00007805.JPEG n03476991/
+mv val/ILSVRC2012_val_00007806.JPEG n04033901/
+mv val/ILSVRC2012_val_00007807.JPEG n02607072/
+mv val/ILSVRC2012_val_00007808.JPEG n02123394/
+mv val/ILSVRC2012_val_00007809.JPEG n03534580/
+mv val/ILSVRC2012_val_00007810.JPEG n01770081/
+mv val/ILSVRC2012_val_00007811.JPEG n02011460/
+mv val/ILSVRC2012_val_00007812.JPEG n02843684/
+mv val/ILSVRC2012_val_00007813.JPEG n02109525/
+mv val/ILSVRC2012_val_00007814.JPEG n03916031/
+mv val/ILSVRC2012_val_00007815.JPEG n04418357/
+mv val/ILSVRC2012_val_00007816.JPEG n03710637/
+mv val/ILSVRC2012_val_00007817.JPEG n03075370/
+mv val/ILSVRC2012_val_00007818.JPEG n01644900/
+mv val/ILSVRC2012_val_00007819.JPEG n04254680/
+mv val/ILSVRC2012_val_00007820.JPEG n07768694/
+mv val/ILSVRC2012_val_00007821.JPEG n04228054/
+mv val/ILSVRC2012_val_00007822.JPEG n04258138/
+mv val/ILSVRC2012_val_00007823.JPEG n04357314/
+mv val/ILSVRC2012_val_00007824.JPEG n07836838/
+mv val/ILSVRC2012_val_00007825.JPEG n03000134/
+mv val/ILSVRC2012_val_00007826.JPEG n04310018/
+mv val/ILSVRC2012_val_00007827.JPEG n03000134/
+mv val/ILSVRC2012_val_00007828.JPEG n02098413/
+mv val/ILSVRC2012_val_00007829.JPEG n02108000/
+mv val/ILSVRC2012_val_00007830.JPEG n04252077/
+mv val/ILSVRC2012_val_00007831.JPEG n02457408/
+mv val/ILSVRC2012_val_00007832.JPEG n04483307/
+mv val/ILSVRC2012_val_00007833.JPEG n02105505/
+mv val/ILSVRC2012_val_00007834.JPEG n03125729/
+mv val/ILSVRC2012_val_00007835.JPEG n02091467/
+mv val/ILSVRC2012_val_00007836.JPEG n03868242/
+mv val/ILSVRC2012_val_00007837.JPEG n02106166/
+mv val/ILSVRC2012_val_00007838.JPEG n03240683/
+mv val/ILSVRC2012_val_00007839.JPEG n02917067/
+mv val/ILSVRC2012_val_00007840.JPEG n02105056/
+mv val/ILSVRC2012_val_00007841.JPEG n04525305/
+mv val/ILSVRC2012_val_00007842.JPEG n01753488/
+mv val/ILSVRC2012_val_00007843.JPEG n02978881/
+mv val/ILSVRC2012_val_00007844.JPEG n03977966/
+mv val/ILSVRC2012_val_00007845.JPEG n02486261/
+mv val/ILSVRC2012_val_00007846.JPEG n04162706/
+mv val/ILSVRC2012_val_00007847.JPEG n02120079/
+mv val/ILSVRC2012_val_00007848.JPEG n03709823/
+mv val/ILSVRC2012_val_00007849.JPEG n03127747/
+mv val/ILSVRC2012_val_00007850.JPEG n02089973/
+mv val/ILSVRC2012_val_00007851.JPEG n03089624/
+mv val/ILSVRC2012_val_00007852.JPEG n03814906/
+mv val/ILSVRC2012_val_00007853.JPEG n01534433/
+mv val/ILSVRC2012_val_00007854.JPEG n04613696/
+mv val/ILSVRC2012_val_00007855.JPEG n03325584/
+mv val/ILSVRC2012_val_00007856.JPEG n04505470/
+mv val/ILSVRC2012_val_00007857.JPEG n03325584/
+mv val/ILSVRC2012_val_00007858.JPEG n02115641/
+mv val/ILSVRC2012_val_00007859.JPEG n03630383/
+mv val/ILSVRC2012_val_00007860.JPEG n01930112/
+mv val/ILSVRC2012_val_00007861.JPEG n04204238/
+mv val/ILSVRC2012_val_00007862.JPEG n03063689/
+mv val/ILSVRC2012_val_00007863.JPEG n02233338/
+mv val/ILSVRC2012_val_00007864.JPEG n03916031/
+mv val/ILSVRC2012_val_00007865.JPEG n02786058/
+mv val/ILSVRC2012_val_00007866.JPEG n02113799/
+mv val/ILSVRC2012_val_00007867.JPEG n03935335/
+mv val/ILSVRC2012_val_00007868.JPEG n04179913/
+mv val/ILSVRC2012_val_00007869.JPEG n03690938/
+mv val/ILSVRC2012_val_00007870.JPEG n02442845/
+mv val/ILSVRC2012_val_00007871.JPEG n01819313/
+mv val/ILSVRC2012_val_00007872.JPEG n01534433/
+mv val/ILSVRC2012_val_00007873.JPEG n01753488/
+mv val/ILSVRC2012_val_00007874.JPEG n02823750/
+mv val/ILSVRC2012_val_00007875.JPEG n01491361/
+mv val/ILSVRC2012_val_00007876.JPEG n03124043/
+mv val/ILSVRC2012_val_00007877.JPEG n01749939/
+mv val/ILSVRC2012_val_00007878.JPEG n02328150/
+mv val/ILSVRC2012_val_00007879.JPEG n03272562/
+mv val/ILSVRC2012_val_00007880.JPEG n02094258/
+mv val/ILSVRC2012_val_00007881.JPEG n04597913/
+mv val/ILSVRC2012_val_00007882.JPEG n01773549/
+mv val/ILSVRC2012_val_00007883.JPEG n03724870/
+mv val/ILSVRC2012_val_00007884.JPEG n01871265/
+mv val/ILSVRC2012_val_00007885.JPEG n01751748/
+mv val/ILSVRC2012_val_00007886.JPEG n04039381/
+mv val/ILSVRC2012_val_00007887.JPEG n03733805/
+mv val/ILSVRC2012_val_00007888.JPEG n02783161/
+mv val/ILSVRC2012_val_00007889.JPEG n02948072/
+mv val/ILSVRC2012_val_00007890.JPEG n02397096/
+mv val/ILSVRC2012_val_00007891.JPEG n02233338/
+mv val/ILSVRC2012_val_00007892.JPEG n02093647/
+mv val/ILSVRC2012_val_00007893.JPEG n03016953/
+mv val/ILSVRC2012_val_00007894.JPEG n04344873/
+mv val/ILSVRC2012_val_00007895.JPEG n02640242/
+mv val/ILSVRC2012_val_00007896.JPEG n01677366/
+mv val/ILSVRC2012_val_00007897.JPEG n02106166/
+mv val/ILSVRC2012_val_00007898.JPEG n07745940/
+mv val/ILSVRC2012_val_00007899.JPEG n03710637/
+mv val/ILSVRC2012_val_00007900.JPEG n03529860/
+mv val/ILSVRC2012_val_00007901.JPEG n02988304/
+mv val/ILSVRC2012_val_00007902.JPEG n04350905/
+mv val/ILSVRC2012_val_00007903.JPEG n02105056/
+mv val/ILSVRC2012_val_00007904.JPEG n01630670/
+mv val/ILSVRC2012_val_00007905.JPEG n12998815/
+mv val/ILSVRC2012_val_00007906.JPEG n02094258/
+mv val/ILSVRC2012_val_00007907.JPEG n03481172/
+mv val/ILSVRC2012_val_00007908.JPEG n04515003/
+mv val/ILSVRC2012_val_00007909.JPEG n04418357/
+mv val/ILSVRC2012_val_00007910.JPEG n03075370/
+mv val/ILSVRC2012_val_00007911.JPEG n04273569/
+mv val/ILSVRC2012_val_00007912.JPEG n01592084/
+mv val/ILSVRC2012_val_00007913.JPEG n03290653/
+mv val/ILSVRC2012_val_00007914.JPEG n04487394/
+mv val/ILSVRC2012_val_00007915.JPEG n02109047/
+mv val/ILSVRC2012_val_00007916.JPEG n02259212/
+mv val/ILSVRC2012_val_00007917.JPEG n04604644/
+mv val/ILSVRC2012_val_00007918.JPEG n03976467/
+mv val/ILSVRC2012_val_00007919.JPEG n04023962/
+mv val/ILSVRC2012_val_00007920.JPEG n02910353/
+mv val/ILSVRC2012_val_00007921.JPEG n03394916/
+mv val/ILSVRC2012_val_00007922.JPEG n02106662/
+mv val/ILSVRC2012_val_00007923.JPEG n01882714/
+mv val/ILSVRC2012_val_00007924.JPEG n03494278/
+mv val/ILSVRC2012_val_00007925.JPEG n01770393/
+mv val/ILSVRC2012_val_00007926.JPEG n03445924/
+mv val/ILSVRC2012_val_00007927.JPEG n02102177/
+mv val/ILSVRC2012_val_00007928.JPEG n02110958/
+mv val/ILSVRC2012_val_00007929.JPEG n02089973/
+mv val/ILSVRC2012_val_00007930.JPEG n01924916/
+mv val/ILSVRC2012_val_00007931.JPEG n02113799/
+mv val/ILSVRC2012_val_00007932.JPEG n01817953/
+mv val/ILSVRC2012_val_00007933.JPEG n02091134/
+mv val/ILSVRC2012_val_00007934.JPEG n01697457/
+mv val/ILSVRC2012_val_00007935.JPEG n03443371/
+mv val/ILSVRC2012_val_00007936.JPEG n04482393/
+mv val/ILSVRC2012_val_00007937.JPEG n01749939/
+mv val/ILSVRC2012_val_00007938.JPEG n01985128/
+mv val/ILSVRC2012_val_00007939.JPEG n04116512/
+mv val/ILSVRC2012_val_00007940.JPEG n03452741/
+mv val/ILSVRC2012_val_00007941.JPEG n03220513/
+mv val/ILSVRC2012_val_00007942.JPEG n02510455/
+mv val/ILSVRC2012_val_00007943.JPEG n03761084/
+mv val/ILSVRC2012_val_00007944.JPEG n02916936/
+mv val/ILSVRC2012_val_00007945.JPEG n02089867/
+mv val/ILSVRC2012_val_00007946.JPEG n02281406/
+mv val/ILSVRC2012_val_00007947.JPEG n03445777/
+mv val/ILSVRC2012_val_00007948.JPEG n03642806/
+mv val/ILSVRC2012_val_00007949.JPEG n03255030/
+mv val/ILSVRC2012_val_00007950.JPEG n09428293/
+mv val/ILSVRC2012_val_00007951.JPEG n01774750/
+mv val/ILSVRC2012_val_00007952.JPEG n03220513/
+mv val/ILSVRC2012_val_00007953.JPEG n04254777/
+mv val/ILSVRC2012_val_00007954.JPEG n13037406/
+mv val/ILSVRC2012_val_00007955.JPEG n04235860/
+mv val/ILSVRC2012_val_00007956.JPEG n07875152/
+mv val/ILSVRC2012_val_00007957.JPEG n01877812/
+mv val/ILSVRC2012_val_00007958.JPEG n02086240/
+mv val/ILSVRC2012_val_00007959.JPEG n03876231/
+mv val/ILSVRC2012_val_00007960.JPEG n02484975/
+mv val/ILSVRC2012_val_00007961.JPEG n03595614/
+mv val/ILSVRC2012_val_00007962.JPEG n03733805/
+mv val/ILSVRC2012_val_00007963.JPEG n02099712/
+mv val/ILSVRC2012_val_00007964.JPEG n03884397/
+mv val/ILSVRC2012_val_00007965.JPEG n03016953/
+mv val/ILSVRC2012_val_00007966.JPEG n02088632/
+mv val/ILSVRC2012_val_00007967.JPEG n04086273/
+mv val/ILSVRC2012_val_00007968.JPEG n02797295/
+mv val/ILSVRC2012_val_00007969.JPEG n04392985/
+mv val/ILSVRC2012_val_00007970.JPEG n03124043/
+mv val/ILSVRC2012_val_00007971.JPEG n02102480/
+mv val/ILSVRC2012_val_00007972.JPEG n02100583/
+mv val/ILSVRC2012_val_00007973.JPEG n01855032/
+mv val/ILSVRC2012_val_00007974.JPEG n02667093/
+mv val/ILSVRC2012_val_00007975.JPEG n01945685/
+mv val/ILSVRC2012_val_00007976.JPEG n03250847/
+mv val/ILSVRC2012_val_00007977.JPEG n01644373/
+mv val/ILSVRC2012_val_00007978.JPEG n04147183/
+mv val/ILSVRC2012_val_00007979.JPEG n02641379/
+mv val/ILSVRC2012_val_00007980.JPEG n02342885/
+mv val/ILSVRC2012_val_00007981.JPEG n03666591/
+mv val/ILSVRC2012_val_00007982.JPEG n03000134/
+mv val/ILSVRC2012_val_00007983.JPEG n03197337/
+mv val/ILSVRC2012_val_00007984.JPEG n02807133/
+mv val/ILSVRC2012_val_00007985.JPEG n03394916/
+mv val/ILSVRC2012_val_00007986.JPEG n01797886/
+mv val/ILSVRC2012_val_00007987.JPEG n02443114/
+mv val/ILSVRC2012_val_00007988.JPEG n02056570/
+mv val/ILSVRC2012_val_00007989.JPEG n02916936/
+mv val/ILSVRC2012_val_00007990.JPEG n04090263/
+mv val/ILSVRC2012_val_00007991.JPEG n01756291/
+mv val/ILSVRC2012_val_00007992.JPEG n03724870/
+mv val/ILSVRC2012_val_00007993.JPEG n02747177/
+mv val/ILSVRC2012_val_00007994.JPEG n04553703/
+mv val/ILSVRC2012_val_00007995.JPEG n01983481/
+mv val/ILSVRC2012_val_00007996.JPEG n04479046/
+mv val/ILSVRC2012_val_00007997.JPEG n07920052/
+mv val/ILSVRC2012_val_00007998.JPEG n01631663/
+mv val/ILSVRC2012_val_00007999.JPEG n01981276/
+mv val/ILSVRC2012_val_00008000.JPEG n02097474/
+mv val/ILSVRC2012_val_00008001.JPEG n02268443/
+mv val/ILSVRC2012_val_00008002.JPEG n01944390/
+mv val/ILSVRC2012_val_00008003.JPEG n02108422/
+mv val/ILSVRC2012_val_00008004.JPEG n04487081/
+mv val/ILSVRC2012_val_00008005.JPEG n07734744/
+mv val/ILSVRC2012_val_00008006.JPEG n02091244/
+mv val/ILSVRC2012_val_00008007.JPEG n02835271/
+mv val/ILSVRC2012_val_00008008.JPEG n01824575/
+mv val/ILSVRC2012_val_00008009.JPEG n02056570/
+mv val/ILSVRC2012_val_00008010.JPEG n03773504/
+mv val/ILSVRC2012_val_00008011.JPEG n01688243/
+mv val/ILSVRC2012_val_00008012.JPEG n03345487/
+mv val/ILSVRC2012_val_00008013.JPEG n03345487/
+mv val/ILSVRC2012_val_00008014.JPEG n02486410/
+mv val/ILSVRC2012_val_00008015.JPEG n03271574/
+mv val/ILSVRC2012_val_00008016.JPEG n03485407/
+mv val/ILSVRC2012_val_00008017.JPEG n02483362/
+mv val/ILSVRC2012_val_00008018.JPEG n02113712/
+mv val/ILSVRC2012_val_00008019.JPEG n02786058/
+mv val/ILSVRC2012_val_00008020.JPEG n04579145/
+mv val/ILSVRC2012_val_00008021.JPEG n02948072/
+mv val/ILSVRC2012_val_00008022.JPEG n03595614/
+mv val/ILSVRC2012_val_00008023.JPEG n03594734/
+mv val/ILSVRC2012_val_00008024.JPEG n01491361/
+mv val/ILSVRC2012_val_00008025.JPEG n01729977/
+mv val/ILSVRC2012_val_00008026.JPEG n04033995/
+mv val/ILSVRC2012_val_00008027.JPEG n04597913/
+mv val/ILSVRC2012_val_00008028.JPEG n01871265/
+mv val/ILSVRC2012_val_00008029.JPEG n02992211/
+mv val/ILSVRC2012_val_00008030.JPEG n02361337/
+mv val/ILSVRC2012_val_00008031.JPEG n04070727/
+mv val/ILSVRC2012_val_00008032.JPEG n02007558/
+mv val/ILSVRC2012_val_00008033.JPEG n03110669/
+mv val/ILSVRC2012_val_00008034.JPEG n09399592/
+mv val/ILSVRC2012_val_00008035.JPEG n02009912/
+mv val/ILSVRC2012_val_00008036.JPEG n03249569/
+mv val/ILSVRC2012_val_00008037.JPEG n02415577/
+mv val/ILSVRC2012_val_00008038.JPEG n02190166/
+mv val/ILSVRC2012_val_00008039.JPEG n02701002/
+mv val/ILSVRC2012_val_00008040.JPEG n03042490/
+mv val/ILSVRC2012_val_00008041.JPEG n01871265/
+mv val/ILSVRC2012_val_00008042.JPEG n02091467/
+mv val/ILSVRC2012_val_00008043.JPEG n03208938/
+mv val/ILSVRC2012_val_00008044.JPEG n02105505/
+mv val/ILSVRC2012_val_00008045.JPEG n04589890/
+mv val/ILSVRC2012_val_00008046.JPEG n02138441/
+mv val/ILSVRC2012_val_00008047.JPEG n04591157/
+mv val/ILSVRC2012_val_00008048.JPEG n03344393/
+mv val/ILSVRC2012_val_00008049.JPEG n01622779/
+mv val/ILSVRC2012_val_00008050.JPEG n01924916/
+mv val/ILSVRC2012_val_00008051.JPEG n02137549/
+mv val/ILSVRC2012_val_00008052.JPEG n04328186/
+mv val/ILSVRC2012_val_00008053.JPEG n07590611/
+mv val/ILSVRC2012_val_00008054.JPEG n01776313/
+mv val/ILSVRC2012_val_00008055.JPEG n04389033/
+mv val/ILSVRC2012_val_00008056.JPEG n02058221/
+mv val/ILSVRC2012_val_00008057.JPEG n03786901/
+mv val/ILSVRC2012_val_00008058.JPEG n02865351/
+mv val/ILSVRC2012_val_00008059.JPEG n02536864/
+mv val/ILSVRC2012_val_00008060.JPEG n04154565/
+mv val/ILSVRC2012_val_00008061.JPEG n02108422/
+mv val/ILSVRC2012_val_00008062.JPEG n07583066/
+mv val/ILSVRC2012_val_00008063.JPEG n03770439/
+mv val/ILSVRC2012_val_00008064.JPEG n04235860/
+mv val/ILSVRC2012_val_00008065.JPEG n03594945/
+mv val/ILSVRC2012_val_00008066.JPEG n02096051/
+mv val/ILSVRC2012_val_00008067.JPEG n03590841/
+mv val/ILSVRC2012_val_00008068.JPEG n04525038/
+mv val/ILSVRC2012_val_00008069.JPEG n02264363/
+mv val/ILSVRC2012_val_00008070.JPEG n04592741/
+mv val/ILSVRC2012_val_00008071.JPEG n02364673/
+mv val/ILSVRC2012_val_00008072.JPEG n01735189/
+mv val/ILSVRC2012_val_00008073.JPEG n02977058/
+mv val/ILSVRC2012_val_00008074.JPEG n02488291/
+mv val/ILSVRC2012_val_00008075.JPEG n07871810/
+mv val/ILSVRC2012_val_00008076.JPEG n03062245/
+mv val/ILSVRC2012_val_00008077.JPEG n04557648/
+mv val/ILSVRC2012_val_00008078.JPEG n03837869/
+mv val/ILSVRC2012_val_00008079.JPEG n01770081/
+mv val/ILSVRC2012_val_00008080.JPEG n04273569/
+mv val/ILSVRC2012_val_00008081.JPEG n03290653/
+mv val/ILSVRC2012_val_00008082.JPEG n03124043/
+mv val/ILSVRC2012_val_00008083.JPEG n02971356/
+mv val/ILSVRC2012_val_00008084.JPEG n02423022/
+mv val/ILSVRC2012_val_00008085.JPEG n02094114/
+mv val/ILSVRC2012_val_00008086.JPEG n01695060/
+mv val/ILSVRC2012_val_00008087.JPEG n01917289/
+mv val/ILSVRC2012_val_00008088.JPEG n02814533/
+mv val/ILSVRC2012_val_00008089.JPEG n03250847/
+mv val/ILSVRC2012_val_00008090.JPEG n02110063/
+mv val/ILSVRC2012_val_00008091.JPEG n02666196/
+mv val/ILSVRC2012_val_00008092.JPEG n02488291/
+mv val/ILSVRC2012_val_00008093.JPEG n02504013/
+mv val/ILSVRC2012_val_00008094.JPEG n02130308/
+mv val/ILSVRC2012_val_00008095.JPEG n01695060/
+mv val/ILSVRC2012_val_00008096.JPEG n03089624/
+mv val/ILSVRC2012_val_00008097.JPEG n02906734/
+mv val/ILSVRC2012_val_00008098.JPEG n02791124/
+mv val/ILSVRC2012_val_00008099.JPEG n09835506/
+mv val/ILSVRC2012_val_00008100.JPEG n07695742/
+mv val/ILSVRC2012_val_00008101.JPEG n06874185/
+mv val/ILSVRC2012_val_00008102.JPEG n04229816/
+mv val/ILSVRC2012_val_00008103.JPEG n02408429/
+mv val/ILSVRC2012_val_00008104.JPEG n02087394/
+mv val/ILSVRC2012_val_00008105.JPEG n03297495/
+mv val/ILSVRC2012_val_00008106.JPEG n02058221/
+mv val/ILSVRC2012_val_00008107.JPEG n03763968/
+mv val/ILSVRC2012_val_00008108.JPEG n01491361/
+mv val/ILSVRC2012_val_00008109.JPEG n03781244/
+mv val/ILSVRC2012_val_00008110.JPEG n03873416/
+mv val/ILSVRC2012_val_00008111.JPEG n02111277/
+mv val/ILSVRC2012_val_00008112.JPEG n13052670/
+mv val/ILSVRC2012_val_00008113.JPEG n02119022/
+mv val/ILSVRC2012_val_00008114.JPEG n02108000/
+mv val/ILSVRC2012_val_00008115.JPEG n02791124/
+mv val/ILSVRC2012_val_00008116.JPEG n03028079/
+mv val/ILSVRC2012_val_00008117.JPEG n02906734/
+mv val/ILSVRC2012_val_00008118.JPEG n02112350/
+mv val/ILSVRC2012_val_00008119.JPEG n02102318/
+mv val/ILSVRC2012_val_00008120.JPEG n04118776/
+mv val/ILSVRC2012_val_00008121.JPEG n02823428/
+mv val/ILSVRC2012_val_00008122.JPEG n04435653/
+mv val/ILSVRC2012_val_00008123.JPEG n03786901/
+mv val/ILSVRC2012_val_00008124.JPEG n02105505/
+mv val/ILSVRC2012_val_00008125.JPEG n01514859/
+mv val/ILSVRC2012_val_00008126.JPEG n02860847/
+mv val/ILSVRC2012_val_00008127.JPEG n01871265/
+mv val/ILSVRC2012_val_00008128.JPEG n07742313/
+mv val/ILSVRC2012_val_00008129.JPEG n01695060/
+mv val/ILSVRC2012_val_00008130.JPEG n01735189/
+mv val/ILSVRC2012_val_00008131.JPEG n03141823/
+mv val/ILSVRC2012_val_00008132.JPEG n02692877/
+mv val/ILSVRC2012_val_00008133.JPEG n04254680/
+mv val/ILSVRC2012_val_00008134.JPEG n02483708/
+mv val/ILSVRC2012_val_00008135.JPEG n02011460/
+mv val/ILSVRC2012_val_00008136.JPEG n02927161/
+mv val/ILSVRC2012_val_00008137.JPEG n02113978/
+mv val/ILSVRC2012_val_00008138.JPEG n02106166/
+mv val/ILSVRC2012_val_00008139.JPEG n03770679/
+mv val/ILSVRC2012_val_00008140.JPEG n02169497/
+mv val/ILSVRC2012_val_00008141.JPEG n04482393/
+mv val/ILSVRC2012_val_00008142.JPEG n02277742/
+mv val/ILSVRC2012_val_00008143.JPEG n04485082/
+mv val/ILSVRC2012_val_00008144.JPEG n01984695/
+mv val/ILSVRC2012_val_00008145.JPEG n03658185/
+mv val/ILSVRC2012_val_00008146.JPEG n01697457/
+mv val/ILSVRC2012_val_00008147.JPEG n09428293/
+mv val/ILSVRC2012_val_00008148.JPEG n02102480/
+mv val/ILSVRC2012_val_00008149.JPEG n04501370/
+mv val/ILSVRC2012_val_00008150.JPEG n04141975/
+mv val/ILSVRC2012_val_00008151.JPEG n01614925/
+mv val/ILSVRC2012_val_00008152.JPEG n02089078/
+mv val/ILSVRC2012_val_00008153.JPEG n03935335/
+mv val/ILSVRC2012_val_00008154.JPEG n02486410/
+mv val/ILSVRC2012_val_00008155.JPEG n01843065/
+mv val/ILSVRC2012_val_00008156.JPEG n01984695/
+mv val/ILSVRC2012_val_00008157.JPEG n02363005/
+mv val/ILSVRC2012_val_00008158.JPEG n04536866/
+mv val/ILSVRC2012_val_00008159.JPEG n04141076/
+mv val/ILSVRC2012_val_00008160.JPEG n01950731/
+mv val/ILSVRC2012_val_00008161.JPEG n03445777/
+mv val/ILSVRC2012_val_00008162.JPEG n02102040/
+mv val/ILSVRC2012_val_00008163.JPEG n07715103/
+mv val/ILSVRC2012_val_00008164.JPEG n09256479/
+mv val/ILSVRC2012_val_00008165.JPEG n03781244/
+mv val/ILSVRC2012_val_00008166.JPEG n02090379/
+mv val/ILSVRC2012_val_00008167.JPEG n02129165/
+mv val/ILSVRC2012_val_00008168.JPEG n04532670/
+mv val/ILSVRC2012_val_00008169.JPEG n02939185/
+mv val/ILSVRC2012_val_00008170.JPEG n04259630/
+mv val/ILSVRC2012_val_00008171.JPEG n03788365/
+mv val/ILSVRC2012_val_00008172.JPEG n03461385/
+mv val/ILSVRC2012_val_00008173.JPEG n04606251/
+mv val/ILSVRC2012_val_00008174.JPEG n04428191/
+mv val/ILSVRC2012_val_00008175.JPEG n02488702/
+mv val/ILSVRC2012_val_00008176.JPEG n01518878/
+mv val/ILSVRC2012_val_00008177.JPEG n02107142/
+mv val/ILSVRC2012_val_00008178.JPEG n01622779/
+mv val/ILSVRC2012_val_00008179.JPEG n02483708/
+mv val/ILSVRC2012_val_00008180.JPEG n07753113/
+mv val/ILSVRC2012_val_00008181.JPEG n07930864/
+mv val/ILSVRC2012_val_00008182.JPEG n01984695/
+mv val/ILSVRC2012_val_00008183.JPEG n03476684/
+mv val/ILSVRC2012_val_00008184.JPEG n02655020/
+mv val/ILSVRC2012_val_00008185.JPEG n03376595/
+mv val/ILSVRC2012_val_00008186.JPEG n01806143/
+mv val/ILSVRC2012_val_00008187.JPEG n04286575/
+mv val/ILSVRC2012_val_00008188.JPEG n02490219/
+mv val/ILSVRC2012_val_00008189.JPEG n02640242/
+mv val/ILSVRC2012_val_00008190.JPEG n04141975/
+mv val/ILSVRC2012_val_00008191.JPEG n03938244/
+mv val/ILSVRC2012_val_00008192.JPEG n02100735/
+mv val/ILSVRC2012_val_00008193.JPEG n04041544/
+mv val/ILSVRC2012_val_00008194.JPEG n02108915/
+mv val/ILSVRC2012_val_00008195.JPEG n03769881/
+mv val/ILSVRC2012_val_00008196.JPEG n02108551/
+mv val/ILSVRC2012_val_00008197.JPEG n02110185/
+mv val/ILSVRC2012_val_00008198.JPEG n02086646/
+mv val/ILSVRC2012_val_00008199.JPEG n03388043/
+mv val/ILSVRC2012_val_00008200.JPEG n07697313/
+mv val/ILSVRC2012_val_00008201.JPEG n02098105/
+mv val/ILSVRC2012_val_00008202.JPEG n04597913/
+mv val/ILSVRC2012_val_00008203.JPEG n04090263/
+mv val/ILSVRC2012_val_00008204.JPEG n02492660/
+mv val/ILSVRC2012_val_00008205.JPEG n02795169/
+mv val/ILSVRC2012_val_00008206.JPEG n02086240/
+mv val/ILSVRC2012_val_00008207.JPEG n02097130/
+mv val/ILSVRC2012_val_00008208.JPEG n02346627/
+mv val/ILSVRC2012_val_00008209.JPEG n01622779/
+mv val/ILSVRC2012_val_00008210.JPEG n01978287/
+mv val/ILSVRC2012_val_00008211.JPEG n01924916/
+mv val/ILSVRC2012_val_00008212.JPEG n02655020/
+mv val/ILSVRC2012_val_00008213.JPEG n02787622/
+mv val/ILSVRC2012_val_00008214.JPEG n02108551/
+mv val/ILSVRC2012_val_00008215.JPEG n03717622/
+mv val/ILSVRC2012_val_00008216.JPEG n07697313/
+mv val/ILSVRC2012_val_00008217.JPEG n02105505/
+mv val/ILSVRC2012_val_00008218.JPEG n07753113/
+mv val/ILSVRC2012_val_00008219.JPEG n04204347/
+mv val/ILSVRC2012_val_00008220.JPEG n02909870/
+mv val/ILSVRC2012_val_00008221.JPEG n01828970/
+mv val/ILSVRC2012_val_00008222.JPEG n02018795/
+mv val/ILSVRC2012_val_00008223.JPEG n07836838/
+mv val/ILSVRC2012_val_00008224.JPEG n01775062/
+mv val/ILSVRC2012_val_00008225.JPEG n07716358/
+mv val/ILSVRC2012_val_00008226.JPEG n01675722/
+mv val/ILSVRC2012_val_00008227.JPEG n02807133/
+mv val/ILSVRC2012_val_00008228.JPEG n02493793/
+mv val/ILSVRC2012_val_00008229.JPEG n02091467/
+mv val/ILSVRC2012_val_00008230.JPEG n02804414/
+mv val/ILSVRC2012_val_00008231.JPEG n12144580/
+mv val/ILSVRC2012_val_00008232.JPEG n02823428/
+mv val/ILSVRC2012_val_00008233.JPEG n09229709/
+mv val/ILSVRC2012_val_00008234.JPEG n03379051/
+mv val/ILSVRC2012_val_00008235.JPEG n02791270/
+mv val/ILSVRC2012_val_00008236.JPEG n01828970/
+mv val/ILSVRC2012_val_00008237.JPEG n03832673/
+mv val/ILSVRC2012_val_00008238.JPEG n04366367/
+mv val/ILSVRC2012_val_00008239.JPEG n03877845/
+mv val/ILSVRC2012_val_00008240.JPEG n03372029/
+mv val/ILSVRC2012_val_00008241.JPEG n03961711/
+mv val/ILSVRC2012_val_00008242.JPEG n03916031/
+mv val/ILSVRC2012_val_00008243.JPEG n03788365/
+mv val/ILSVRC2012_val_00008244.JPEG n04265275/
+mv val/ILSVRC2012_val_00008245.JPEG n01806143/
+mv val/ILSVRC2012_val_00008246.JPEG n04008634/
+mv val/ILSVRC2012_val_00008247.JPEG n02794156/
+mv val/ILSVRC2012_val_00008248.JPEG n03777754/
+mv val/ILSVRC2012_val_00008249.JPEG n01630670/
+mv val/ILSVRC2012_val_00008250.JPEG n07860988/
+mv val/ILSVRC2012_val_00008251.JPEG n04239074/
+mv val/ILSVRC2012_val_00008252.JPEG n04270147/
+mv val/ILSVRC2012_val_00008253.JPEG n03761084/
+mv val/ILSVRC2012_val_00008254.JPEG n04270147/
+mv val/ILSVRC2012_val_00008255.JPEG n04487081/
+mv val/ILSVRC2012_val_00008256.JPEG n02481823/
+mv val/ILSVRC2012_val_00008257.JPEG n02395406/
+mv val/ILSVRC2012_val_00008258.JPEG n02093859/
+mv val/ILSVRC2012_val_00008259.JPEG n03991062/
+mv val/ILSVRC2012_val_00008260.JPEG n04264628/
+mv val/ILSVRC2012_val_00008261.JPEG n04258138/
+mv val/ILSVRC2012_val_00008262.JPEG n06359193/
+mv val/ILSVRC2012_val_00008263.JPEG n02074367/
+mv val/ILSVRC2012_val_00008264.JPEG n07614500/
+mv val/ILSVRC2012_val_00008265.JPEG n02865351/
+mv val/ILSVRC2012_val_00008266.JPEG n07718747/
+mv val/ILSVRC2012_val_00008267.JPEG n04074963/
+mv val/ILSVRC2012_val_00008268.JPEG n04482393/
+mv val/ILSVRC2012_val_00008269.JPEG n03347037/
+mv val/ILSVRC2012_val_00008270.JPEG n02110063/
+mv val/ILSVRC2012_val_00008271.JPEG n07836838/
+mv val/ILSVRC2012_val_00008272.JPEG n02090379/
+mv val/ILSVRC2012_val_00008273.JPEG n03595614/
+mv val/ILSVRC2012_val_00008274.JPEG n03482405/
+mv val/ILSVRC2012_val_00008275.JPEG n13052670/
+mv val/ILSVRC2012_val_00008276.JPEG n04023962/
+mv val/ILSVRC2012_val_00008277.JPEG n03991062/
+mv val/ILSVRC2012_val_00008278.JPEG n04548280/
+mv val/ILSVRC2012_val_00008279.JPEG n02056570/
+mv val/ILSVRC2012_val_00008280.JPEG n02794156/
+mv val/ILSVRC2012_val_00008281.JPEG n13133613/
+mv val/ILSVRC2012_val_00008282.JPEG n02100877/
+mv val/ILSVRC2012_val_00008283.JPEG n03272010/
+mv val/ILSVRC2012_val_00008284.JPEG n02107683/
+mv val/ILSVRC2012_val_00008285.JPEG n04149813/
+mv val/ILSVRC2012_val_00008286.JPEG n04152593/
+mv val/ILSVRC2012_val_00008287.JPEG n02002556/
+mv val/ILSVRC2012_val_00008288.JPEG n03954731/
+mv val/ILSVRC2012_val_00008289.JPEG n01968897/
+mv val/ILSVRC2012_val_00008290.JPEG n03388043/
+mv val/ILSVRC2012_val_00008291.JPEG n03764736/
+mv val/ILSVRC2012_val_00008292.JPEG n02690373/
+mv val/ILSVRC2012_val_00008293.JPEG n02966193/
+mv val/ILSVRC2012_val_00008294.JPEG n01518878/
+mv val/ILSVRC2012_val_00008295.JPEG n02128385/
+mv val/ILSVRC2012_val_00008296.JPEG n03197337/
+mv val/ILSVRC2012_val_00008297.JPEG n02092002/
+mv val/ILSVRC2012_val_00008298.JPEG n03110669/
+mv val/ILSVRC2012_val_00008299.JPEG n03478589/
+mv val/ILSVRC2012_val_00008300.JPEG n02457408/
+mv val/ILSVRC2012_val_00008301.JPEG n02870880/
+mv val/ILSVRC2012_val_00008302.JPEG n02011460/
+mv val/ILSVRC2012_val_00008303.JPEG n02093428/
+mv val/ILSVRC2012_val_00008304.JPEG n03063689/
+mv val/ILSVRC2012_val_00008305.JPEG n03337140/
+mv val/ILSVRC2012_val_00008306.JPEG n04356056/
+mv val/ILSVRC2012_val_00008307.JPEG n02963159/
+mv val/ILSVRC2012_val_00008308.JPEG n04435653/
+mv val/ILSVRC2012_val_00008309.JPEG n03871628/
+mv val/ILSVRC2012_val_00008310.JPEG n02110627/
+mv val/ILSVRC2012_val_00008311.JPEG n02088238/
+mv val/ILSVRC2012_val_00008312.JPEG n03160309/
+mv val/ILSVRC2012_val_00008313.JPEG n03983396/
+mv val/ILSVRC2012_val_00008314.JPEG n02992529/
+mv val/ILSVRC2012_val_00008315.JPEG n03843555/
+mv val/ILSVRC2012_val_00008316.JPEG n01773549/
+mv val/ILSVRC2012_val_00008317.JPEG n02389026/
+mv val/ILSVRC2012_val_00008318.JPEG n09468604/
+mv val/ILSVRC2012_val_00008319.JPEG n04505470/
+mv val/ILSVRC2012_val_00008320.JPEG n02109961/
+mv val/ILSVRC2012_val_00008321.JPEG n02794156/
+mv val/ILSVRC2012_val_00008322.JPEG n03854065/
+mv val/ILSVRC2012_val_00008323.JPEG n04355338/
+mv val/ILSVRC2012_val_00008324.JPEG n02094433/
+mv val/ILSVRC2012_val_00008325.JPEG n13133613/
+mv val/ILSVRC2012_val_00008326.JPEG n03272010/
+mv val/ILSVRC2012_val_00008327.JPEG n01667778/
+mv val/ILSVRC2012_val_00008328.JPEG n03494278/
+mv val/ILSVRC2012_val_00008329.JPEG n12768682/
+mv val/ILSVRC2012_val_00008330.JPEG n02481823/
+mv val/ILSVRC2012_val_00008331.JPEG n03085013/
+mv val/ILSVRC2012_val_00008332.JPEG n03179701/
+mv val/ILSVRC2012_val_00008333.JPEG n01667778/
+mv val/ILSVRC2012_val_00008334.JPEG n02102040/
+mv val/ILSVRC2012_val_00008335.JPEG n02112706/
+mv val/ILSVRC2012_val_00008336.JPEG n02951585/
+mv val/ILSVRC2012_val_00008337.JPEG n02108089/
+mv val/ILSVRC2012_val_00008338.JPEG n02099601/
+mv val/ILSVRC2012_val_00008339.JPEG n07860988/
+mv val/ILSVRC2012_val_00008340.JPEG n04033995/
+mv val/ILSVRC2012_val_00008341.JPEG n03388183/
+mv val/ILSVRC2012_val_00008342.JPEG n02127052/
+mv val/ILSVRC2012_val_00008343.JPEG n02107142/
+mv val/ILSVRC2012_val_00008344.JPEG n03814639/
+mv val/ILSVRC2012_val_00008345.JPEG n04004767/
+mv val/ILSVRC2012_val_00008346.JPEG n02099712/
+mv val/ILSVRC2012_val_00008347.JPEG n01582220/
+mv val/ILSVRC2012_val_00008348.JPEG n02102177/
+mv val/ILSVRC2012_val_00008349.JPEG n02100735/
+mv val/ILSVRC2012_val_00008350.JPEG n03958227/
+mv val/ILSVRC2012_val_00008351.JPEG n02481823/
+mv val/ILSVRC2012_val_00008352.JPEG n01773549/
+mv val/ILSVRC2012_val_00008353.JPEG n03131574/
+mv val/ILSVRC2012_val_00008354.JPEG n04540053/
+mv val/ILSVRC2012_val_00008355.JPEG n03424325/
+mv val/ILSVRC2012_val_00008356.JPEG n03871628/
+mv val/ILSVRC2012_val_00008357.JPEG n02116738/
+mv val/ILSVRC2012_val_00008358.JPEG n09229709/
+mv val/ILSVRC2012_val_00008359.JPEG n02797295/
+mv val/ILSVRC2012_val_00008360.JPEG n02704792/
+mv val/ILSVRC2012_val_00008361.JPEG n02825657/
+mv val/ILSVRC2012_val_00008362.JPEG n02115913/
+mv val/ILSVRC2012_val_00008363.JPEG n03888605/
+mv val/ILSVRC2012_val_00008364.JPEG n02009229/
+mv val/ILSVRC2012_val_00008365.JPEG n03063689/
+mv val/ILSVRC2012_val_00008366.JPEG n07734744/
+mv val/ILSVRC2012_val_00008367.JPEG n02669723/
+mv val/ILSVRC2012_val_00008368.JPEG n02101556/
+mv val/ILSVRC2012_val_00008369.JPEG n03045698/
+mv val/ILSVRC2012_val_00008370.JPEG n04532106/
+mv val/ILSVRC2012_val_00008371.JPEG n03961711/
+mv val/ILSVRC2012_val_00008372.JPEG n04372370/
+mv val/ILSVRC2012_val_00008373.JPEG n02655020/
+mv val/ILSVRC2012_val_00008374.JPEG n02094433/
+mv val/ILSVRC2012_val_00008375.JPEG n02088466/
+mv val/ILSVRC2012_val_00008376.JPEG n04005630/
+mv val/ILSVRC2012_val_00008377.JPEG n12144580/
+mv val/ILSVRC2012_val_00008378.JPEG n02892767/
+mv val/ILSVRC2012_val_00008379.JPEG n02091244/
+mv val/ILSVRC2012_val_00008380.JPEG n03110669/
+mv val/ILSVRC2012_val_00008381.JPEG n03759954/
+mv val/ILSVRC2012_val_00008382.JPEG n03594945/
+mv val/ILSVRC2012_val_00008383.JPEG n03594945/
+mv val/ILSVRC2012_val_00008384.JPEG n04462240/
+mv val/ILSVRC2012_val_00008385.JPEG n07711569/
+mv val/ILSVRC2012_val_00008386.JPEG n03259280/
+mv val/ILSVRC2012_val_00008387.JPEG n04482393/
+mv val/ILSVRC2012_val_00008388.JPEG n02018207/
+mv val/ILSVRC2012_val_00008389.JPEG n03134739/
+mv val/ILSVRC2012_val_00008390.JPEG n03832673/
+mv val/ILSVRC2012_val_00008391.JPEG n04467665/
+mv val/ILSVRC2012_val_00008392.JPEG n04285008/
+mv val/ILSVRC2012_val_00008393.JPEG n02169497/
+mv val/ILSVRC2012_val_00008394.JPEG n03796401/
+mv val/ILSVRC2012_val_00008395.JPEG n02099267/
+mv val/ILSVRC2012_val_00008396.JPEG n02909870/
+mv val/ILSVRC2012_val_00008397.JPEG n02105412/
+mv val/ILSVRC2012_val_00008398.JPEG n04265275/
+mv val/ILSVRC2012_val_00008399.JPEG n01728572/
+mv val/ILSVRC2012_val_00008400.JPEG n04336792/
+mv val/ILSVRC2012_val_00008401.JPEG n02834397/
+mv val/ILSVRC2012_val_00008402.JPEG n02804414/
+mv val/ILSVRC2012_val_00008403.JPEG n04548362/
+mv val/ILSVRC2012_val_00008404.JPEG n03109150/
+mv val/ILSVRC2012_val_00008405.JPEG n02895154/
+mv val/ILSVRC2012_val_00008406.JPEG n03929660/
+mv val/ILSVRC2012_val_00008407.JPEG n01685808/
+mv val/ILSVRC2012_val_00008408.JPEG n02111500/
+mv val/ILSVRC2012_val_00008409.JPEG n04033995/
+mv val/ILSVRC2012_val_00008410.JPEG n01768244/
+mv val/ILSVRC2012_val_00008411.JPEG n02002556/
+mv val/ILSVRC2012_val_00008412.JPEG n03887697/
+mv val/ILSVRC2012_val_00008413.JPEG n04069434/
+mv val/ILSVRC2012_val_00008414.JPEG n03594734/
+mv val/ILSVRC2012_val_00008415.JPEG n02500267/
+mv val/ILSVRC2012_val_00008416.JPEG n07714990/
+mv val/ILSVRC2012_val_00008417.JPEG n02137549/
+mv val/ILSVRC2012_val_00008418.JPEG n03014705/
+mv val/ILSVRC2012_val_00008419.JPEG n02447366/
+mv val/ILSVRC2012_val_00008420.JPEG n01537544/
+mv val/ILSVRC2012_val_00008421.JPEG n07802026/
+mv val/ILSVRC2012_val_00008422.JPEG n03895866/
+mv val/ILSVRC2012_val_00008423.JPEG n04330267/
+mv val/ILSVRC2012_val_00008424.JPEG n03602883/
+mv val/ILSVRC2012_val_00008425.JPEG n02795169/
+mv val/ILSVRC2012_val_00008426.JPEG n04153751/
+mv val/ILSVRC2012_val_00008427.JPEG n03782006/
+mv val/ILSVRC2012_val_00008428.JPEG n02489166/
+mv val/ILSVRC2012_val_00008429.JPEG n03447721/
+mv val/ILSVRC2012_val_00008430.JPEG n03417042/
+mv val/ILSVRC2012_val_00008431.JPEG n04550184/
+mv val/ILSVRC2012_val_00008432.JPEG n02500267/
+mv val/ILSVRC2012_val_00008433.JPEG n02112706/
+mv val/ILSVRC2012_val_00008434.JPEG n03347037/
+mv val/ILSVRC2012_val_00008435.JPEG n02088364/
+mv val/ILSVRC2012_val_00008436.JPEG n02640242/
+mv val/ILSVRC2012_val_00008437.JPEG n03983396/
+mv val/ILSVRC2012_val_00008438.JPEG n02817516/
+mv val/ILSVRC2012_val_00008439.JPEG n01695060/
+mv val/ILSVRC2012_val_00008440.JPEG n13133613/
+mv val/ILSVRC2012_val_00008441.JPEG n02095314/
+mv val/ILSVRC2012_val_00008442.JPEG n03887697/
+mv val/ILSVRC2012_val_00008443.JPEG n02892767/
+mv val/ILSVRC2012_val_00008444.JPEG n07697313/
+mv val/ILSVRC2012_val_00008445.JPEG n11939491/
+mv val/ILSVRC2012_val_00008446.JPEG n04332243/
+mv val/ILSVRC2012_val_00008447.JPEG n02667093/
+mv val/ILSVRC2012_val_00008448.JPEG n02643566/
+mv val/ILSVRC2012_val_00008449.JPEG n02493509/
+mv val/ILSVRC2012_val_00008450.JPEG n04251144/
+mv val/ILSVRC2012_val_00008451.JPEG n02730930/
+mv val/ILSVRC2012_val_00008452.JPEG n04118776/
+mv val/ILSVRC2012_val_00008453.JPEG n02097209/
+mv val/ILSVRC2012_val_00008454.JPEG n04335435/
+mv val/ILSVRC2012_val_00008455.JPEG n03016953/
+mv val/ILSVRC2012_val_00008456.JPEG n03691459/
+mv val/ILSVRC2012_val_00008457.JPEG n04037443/
+mv val/ILSVRC2012_val_00008458.JPEG n02100583/
+mv val/ILSVRC2012_val_00008459.JPEG n02104029/
+mv val/ILSVRC2012_val_00008460.JPEG n02088466/
+mv val/ILSVRC2012_val_00008461.JPEG n09193705/
+mv val/ILSVRC2012_val_00008462.JPEG n03495258/
+mv val/ILSVRC2012_val_00008463.JPEG n02095314/
+mv val/ILSVRC2012_val_00008464.JPEG n03355925/
+mv val/ILSVRC2012_val_00008465.JPEG n07613480/
+mv val/ILSVRC2012_val_00008466.JPEG n02971356/
+mv val/ILSVRC2012_val_00008467.JPEG n04153751/
+mv val/ILSVRC2012_val_00008468.JPEG n01945685/
+mv val/ILSVRC2012_val_00008469.JPEG n01697457/
+mv val/ILSVRC2012_val_00008470.JPEG n04532106/
+mv val/ILSVRC2012_val_00008471.JPEG n02895154/
+mv val/ILSVRC2012_val_00008472.JPEG n04548362/
+mv val/ILSVRC2012_val_00008473.JPEG n04485082/
+mv val/ILSVRC2012_val_00008474.JPEG n02002724/
+mv val/ILSVRC2012_val_00008475.JPEG n02999410/
+mv val/ILSVRC2012_val_00008476.JPEG n03976467/
+mv val/ILSVRC2012_val_00008477.JPEG n02951358/
+mv val/ILSVRC2012_val_00008478.JPEG n03874293/
+mv val/ILSVRC2012_val_00008479.JPEG n02442845/
+mv val/ILSVRC2012_val_00008480.JPEG n04229816/
+mv val/ILSVRC2012_val_00008481.JPEG n01614925/
+mv val/ILSVRC2012_val_00008482.JPEG n02769748/
+mv val/ILSVRC2012_val_00008483.JPEG n04461696/
+mv val/ILSVRC2012_val_00008484.JPEG n02486410/
+mv val/ILSVRC2012_val_00008485.JPEG n03916031/
+mv val/ILSVRC2012_val_00008486.JPEG n04562935/
+mv val/ILSVRC2012_val_00008487.JPEG n02098413/
+mv val/ILSVRC2012_val_00008488.JPEG n02097474/
+mv val/ILSVRC2012_val_00008489.JPEG n03584829/
+mv val/ILSVRC2012_val_00008490.JPEG n02606052/
+mv val/ILSVRC2012_val_00008491.JPEG n02123394/
+mv val/ILSVRC2012_val_00008492.JPEG n03871628/
+mv val/ILSVRC2012_val_00008493.JPEG n04311004/
+mv val/ILSVRC2012_val_00008494.JPEG n02865351/
+mv val/ILSVRC2012_val_00008495.JPEG n01601694/
+mv val/ILSVRC2012_val_00008496.JPEG n02111129/
+mv val/ILSVRC2012_val_00008497.JPEG n04509417/
+mv val/ILSVRC2012_val_00008498.JPEG n01882714/
+mv val/ILSVRC2012_val_00008499.JPEG n03908714/
+mv val/ILSVRC2012_val_00008500.JPEG n02102973/
+mv val/ILSVRC2012_val_00008501.JPEG n03983396/
+mv val/ILSVRC2012_val_00008502.JPEG n02093859/
+mv val/ILSVRC2012_val_00008503.JPEG n03775071/
+mv val/ILSVRC2012_val_00008504.JPEG n02667093/
+mv val/ILSVRC2012_val_00008505.JPEG n02906734/
+mv val/ILSVRC2012_val_00008506.JPEG n07873807/
+mv val/ILSVRC2012_val_00008507.JPEG n04277352/
+mv val/ILSVRC2012_val_00008508.JPEG n04153751/
+mv val/ILSVRC2012_val_00008509.JPEG n01675722/
+mv val/ILSVRC2012_val_00008510.JPEG n01601694/
+mv val/ILSVRC2012_val_00008511.JPEG n04263257/
+mv val/ILSVRC2012_val_00008512.JPEG n01582220/
+mv val/ILSVRC2012_val_00008513.JPEG n03000134/
+mv val/ILSVRC2012_val_00008514.JPEG n04263257/
+mv val/ILSVRC2012_val_00008515.JPEG n04286575/
+mv val/ILSVRC2012_val_00008516.JPEG n06359193/
+mv val/ILSVRC2012_val_00008517.JPEG n02445715/
+mv val/ILSVRC2012_val_00008518.JPEG n03179701/
+mv val/ILSVRC2012_val_00008519.JPEG n04275548/
+mv val/ILSVRC2012_val_00008520.JPEG n02444819/
+mv val/ILSVRC2012_val_00008521.JPEG n02002724/
+mv val/ILSVRC2012_val_00008522.JPEG n03124170/
+mv val/ILSVRC2012_val_00008523.JPEG n02018795/
+mv val/ILSVRC2012_val_00008524.JPEG n02776631/
+mv val/ILSVRC2012_val_00008525.JPEG n12144580/
+mv val/ILSVRC2012_val_00008526.JPEG n03041632/
+mv val/ILSVRC2012_val_00008527.JPEG n02101556/
+mv val/ILSVRC2012_val_00008528.JPEG n04435653/
+mv val/ILSVRC2012_val_00008529.JPEG n04254120/
+mv val/ILSVRC2012_val_00008530.JPEG n04505470/
+mv val/ILSVRC2012_val_00008531.JPEG n03297495/
+mv val/ILSVRC2012_val_00008532.JPEG n02093256/
+mv val/ILSVRC2012_val_00008533.JPEG n03529860/
+mv val/ILSVRC2012_val_00008534.JPEG n01734418/
+mv val/ILSVRC2012_val_00008535.JPEG n04462240/
+mv val/ILSVRC2012_val_00008536.JPEG n02089867/
+mv val/ILSVRC2012_val_00008537.JPEG n03259280/
+mv val/ILSVRC2012_val_00008538.JPEG n03804744/
+mv val/ILSVRC2012_val_00008539.JPEG n02484975/
+mv val/ILSVRC2012_val_00008540.JPEG n03372029/
+mv val/ILSVRC2012_val_00008541.JPEG n02992529/
+mv val/ILSVRC2012_val_00008542.JPEG n01629819/
+mv val/ILSVRC2012_val_00008543.JPEG n03814639/
+mv val/ILSVRC2012_val_00008544.JPEG n04004767/
+mv val/ILSVRC2012_val_00008545.JPEG n02280649/
+mv val/ILSVRC2012_val_00008546.JPEG n04275548/
+mv val/ILSVRC2012_val_00008547.JPEG n04023962/
+mv val/ILSVRC2012_val_00008548.JPEG n03476684/
+mv val/ILSVRC2012_val_00008549.JPEG n01843383/
+mv val/ILSVRC2012_val_00008550.JPEG n02490219/
+mv val/ILSVRC2012_val_00008551.JPEG n03450230/
+mv val/ILSVRC2012_val_00008552.JPEG n02088238/
+mv val/ILSVRC2012_val_00008553.JPEG n02129165/
+mv val/ILSVRC2012_val_00008554.JPEG n07716906/
+mv val/ILSVRC2012_val_00008555.JPEG n02006656/
+mv val/ILSVRC2012_val_00008556.JPEG n07615774/
+mv val/ILSVRC2012_val_00008557.JPEG n04033901/
+mv val/ILSVRC2012_val_00008558.JPEG n02101388/
+mv val/ILSVRC2012_val_00008559.JPEG n02412080/
+mv val/ILSVRC2012_val_00008560.JPEG n02871525/
+mv val/ILSVRC2012_val_00008561.JPEG n01689811/
+mv val/ILSVRC2012_val_00008562.JPEG n02447366/
+mv val/ILSVRC2012_val_00008563.JPEG n02951585/
+mv val/ILSVRC2012_val_00008564.JPEG n03325584/
+mv val/ILSVRC2012_val_00008565.JPEG n04238763/
+mv val/ILSVRC2012_val_00008566.JPEG n01817953/
+mv val/ILSVRC2012_val_00008567.JPEG n07753275/
+mv val/ILSVRC2012_val_00008568.JPEG n03803284/
+mv val/ILSVRC2012_val_00008569.JPEG n03724870/
+mv val/ILSVRC2012_val_00008570.JPEG n01694178/
+mv val/ILSVRC2012_val_00008571.JPEG n04613696/
+mv val/ILSVRC2012_val_00008572.JPEG n03961711/
+mv val/ILSVRC2012_val_00008573.JPEG n04553703/
+mv val/ILSVRC2012_val_00008574.JPEG n04493381/
+mv val/ILSVRC2012_val_00008575.JPEG n04507155/
+mv val/ILSVRC2012_val_00008576.JPEG n03388183/
+mv val/ILSVRC2012_val_00008577.JPEG n04483307/
+mv val/ILSVRC2012_val_00008578.JPEG n02840245/
+mv val/ILSVRC2012_val_00008579.JPEG n01739381/
+mv val/ILSVRC2012_val_00008580.JPEG n03837869/
+mv val/ILSVRC2012_val_00008581.JPEG n03980874/
+mv val/ILSVRC2012_val_00008582.JPEG n02093647/
+mv val/ILSVRC2012_val_00008583.JPEG n02992529/
+mv val/ILSVRC2012_val_00008584.JPEG n03983396/
+mv val/ILSVRC2012_val_00008585.JPEG n02110958/
+mv val/ILSVRC2012_val_00008586.JPEG n01688243/
+mv val/ILSVRC2012_val_00008587.JPEG n02100236/
+mv val/ILSVRC2012_val_00008588.JPEG n01873310/
+mv val/ILSVRC2012_val_00008589.JPEG n04525038/
+mv val/ILSVRC2012_val_00008590.JPEG n03496892/
+mv val/ILSVRC2012_val_00008591.JPEG n04350905/
+mv val/ILSVRC2012_val_00008592.JPEG n02115913/
+mv val/ILSVRC2012_val_00008593.JPEG n01824575/
+mv val/ILSVRC2012_val_00008594.JPEG n04443257/
+mv val/ILSVRC2012_val_00008595.JPEG n01729322/
+mv val/ILSVRC2012_val_00008596.JPEG n03197337/
+mv val/ILSVRC2012_val_00008597.JPEG n09421951/
+mv val/ILSVRC2012_val_00008598.JPEG n07614500/
+mv val/ILSVRC2012_val_00008599.JPEG n03445777/
+mv val/ILSVRC2012_val_00008600.JPEG n03680355/
+mv val/ILSVRC2012_val_00008601.JPEG n04579145/
+mv val/ILSVRC2012_val_00008602.JPEG n03345487/
+mv val/ILSVRC2012_val_00008603.JPEG n03062245/
+mv val/ILSVRC2012_val_00008604.JPEG n02655020/
+mv val/ILSVRC2012_val_00008605.JPEG n02769748/
+mv val/ILSVRC2012_val_00008606.JPEG n03930630/
+mv val/ILSVRC2012_val_00008607.JPEG n03956157/
+mv val/ILSVRC2012_val_00008608.JPEG n04332243/
+mv val/ILSVRC2012_val_00008609.JPEG n03690938/
+mv val/ILSVRC2012_val_00008610.JPEG n04153751/
+mv val/ILSVRC2012_val_00008611.JPEG n04456115/
+mv val/ILSVRC2012_val_00008612.JPEG n02883205/
+mv val/ILSVRC2012_val_00008613.JPEG n01631663/
+mv val/ILSVRC2012_val_00008614.JPEG n02841315/
+mv val/ILSVRC2012_val_00008615.JPEG n02480495/
+mv val/ILSVRC2012_val_00008616.JPEG n02396427/
+mv val/ILSVRC2012_val_00008617.JPEG n04357314/
+mv val/ILSVRC2012_val_00008618.JPEG n01695060/
+mv val/ILSVRC2012_val_00008619.JPEG n02101556/
+mv val/ILSVRC2012_val_00008620.JPEG n03947888/
+mv val/ILSVRC2012_val_00008621.JPEG n04367480/
+mv val/ILSVRC2012_val_00008622.JPEG n03958227/
+mv val/ILSVRC2012_val_00008623.JPEG n01924916/
+mv val/ILSVRC2012_val_00008624.JPEG n02111129/
+mv val/ILSVRC2012_val_00008625.JPEG n02939185/
+mv val/ILSVRC2012_val_00008626.JPEG n01829413/
+mv val/ILSVRC2012_val_00008627.JPEG n02108915/
+mv val/ILSVRC2012_val_00008628.JPEG n03388183/
+mv val/ILSVRC2012_val_00008629.JPEG n02410509/
+mv val/ILSVRC2012_val_00008630.JPEG n04273569/
+mv val/ILSVRC2012_val_00008631.JPEG n02119789/
+mv val/ILSVRC2012_val_00008632.JPEG n04505470/
+mv val/ILSVRC2012_val_00008633.JPEG n02094258/
+mv val/ILSVRC2012_val_00008634.JPEG n02231487/
+mv val/ILSVRC2012_val_00008635.JPEG n02916936/
+mv val/ILSVRC2012_val_00008636.JPEG n02441942/
+mv val/ILSVRC2012_val_00008637.JPEG n04039381/
+mv val/ILSVRC2012_val_00008638.JPEG n02883205/
+mv val/ILSVRC2012_val_00008639.JPEG n02098413/
+mv val/ILSVRC2012_val_00008640.JPEG n01496331/
+mv val/ILSVRC2012_val_00008641.JPEG n03534580/
+mv val/ILSVRC2012_val_00008642.JPEG n07714990/
+mv val/ILSVRC2012_val_00008643.JPEG n04286575/
+mv val/ILSVRC2012_val_00008644.JPEG n03000247/
+mv val/ILSVRC2012_val_00008645.JPEG n03691459/
+mv val/ILSVRC2012_val_00008646.JPEG n03376595/
+mv val/ILSVRC2012_val_00008647.JPEG n01729322/
+mv val/ILSVRC2012_val_00008648.JPEG n12144580/
+mv val/ILSVRC2012_val_00008649.JPEG n04192698/
+mv val/ILSVRC2012_val_00008650.JPEG n03998194/
+mv val/ILSVRC2012_val_00008651.JPEG n02979186/
+mv val/ILSVRC2012_val_00008652.JPEG n02102973/
+mv val/ILSVRC2012_val_00008653.JPEG n02110627/
+mv val/ILSVRC2012_val_00008654.JPEG n01728572/
+mv val/ILSVRC2012_val_00008655.JPEG n03272010/
+mv val/ILSVRC2012_val_00008656.JPEG n03786901/
+mv val/ILSVRC2012_val_00008657.JPEG n04033901/
+mv val/ILSVRC2012_val_00008658.JPEG n02097047/
+mv val/ILSVRC2012_val_00008659.JPEG n03947888/
+mv val/ILSVRC2012_val_00008660.JPEG n07873807/
+mv val/ILSVRC2012_val_00008661.JPEG n02097047/
+mv val/ILSVRC2012_val_00008662.JPEG n07754684/
+mv val/ILSVRC2012_val_00008663.JPEG n02276258/
+mv val/ILSVRC2012_val_00008664.JPEG n02104365/
+mv val/ILSVRC2012_val_00008665.JPEG n01734418/
+mv val/ILSVRC2012_val_00008666.JPEG n03976467/
+mv val/ILSVRC2012_val_00008667.JPEG n02825657/
+mv val/ILSVRC2012_val_00008668.JPEG n01694178/
+mv val/ILSVRC2012_val_00008669.JPEG n01682714/
+mv val/ILSVRC2012_val_00008670.JPEG n02747177/
+mv val/ILSVRC2012_val_00008671.JPEG n03710193/
+mv val/ILSVRC2012_val_00008672.JPEG n09288635/
+mv val/ILSVRC2012_val_00008673.JPEG n02510455/
+mv val/ILSVRC2012_val_00008674.JPEG n02319095/
+mv val/ILSVRC2012_val_00008675.JPEG n02088364/
+mv val/ILSVRC2012_val_00008676.JPEG n02129604/
+mv val/ILSVRC2012_val_00008677.JPEG n04326547/
+mv val/ILSVRC2012_val_00008678.JPEG n03871628/
+mv val/ILSVRC2012_val_00008679.JPEG n02096177/
+mv val/ILSVRC2012_val_00008680.JPEG n09246464/
+mv val/ILSVRC2012_val_00008681.JPEG n03127925/
+mv val/ILSVRC2012_val_00008682.JPEG n02488702/
+mv val/ILSVRC2012_val_00008683.JPEG n06785654/
+mv val/ILSVRC2012_val_00008684.JPEG n02066245/
+mv val/ILSVRC2012_val_00008685.JPEG n12998815/
+mv val/ILSVRC2012_val_00008686.JPEG n01632777/
+mv val/ILSVRC2012_val_00008687.JPEG n02091244/
+mv val/ILSVRC2012_val_00008688.JPEG n01742172/
+mv val/ILSVRC2012_val_00008689.JPEG n03908618/
+mv val/ILSVRC2012_val_00008690.JPEG n04536866/
+mv val/ILSVRC2012_val_00008691.JPEG n03841143/
+mv val/ILSVRC2012_val_00008692.JPEG n01917289/
+mv val/ILSVRC2012_val_00008693.JPEG n02276258/
+mv val/ILSVRC2012_val_00008694.JPEG n03457902/
+mv val/ILSVRC2012_val_00008695.JPEG n04041544/
+mv val/ILSVRC2012_val_00008696.JPEG n03259280/
+mv val/ILSVRC2012_val_00008697.JPEG n02236044/
+mv val/ILSVRC2012_val_00008698.JPEG n02090379/
+mv val/ILSVRC2012_val_00008699.JPEG n04127249/
+mv val/ILSVRC2012_val_00008700.JPEG n03873416/
+mv val/ILSVRC2012_val_00008701.JPEG n02415577/
+mv val/ILSVRC2012_val_00008702.JPEG n03590841/
+mv val/ILSVRC2012_val_00008703.JPEG n02094258/
+mv val/ILSVRC2012_val_00008704.JPEG n03884397/
+mv val/ILSVRC2012_val_00008705.JPEG n01978287/
+mv val/ILSVRC2012_val_00008706.JPEG n02172182/
+mv val/ILSVRC2012_val_00008707.JPEG n01990800/
+mv val/ILSVRC2012_val_00008708.JPEG n04476259/
+mv val/ILSVRC2012_val_00008709.JPEG n03871628/
+mv val/ILSVRC2012_val_00008710.JPEG n03584829/
+mv val/ILSVRC2012_val_00008711.JPEG n04118776/
+mv val/ILSVRC2012_val_00008712.JPEG n02509815/
+mv val/ILSVRC2012_val_00008713.JPEG n02102480/
+mv val/ILSVRC2012_val_00008714.JPEG n01729977/
+mv val/ILSVRC2012_val_00008715.JPEG n02776631/
+mv val/ILSVRC2012_val_00008716.JPEG n03125729/
+mv val/ILSVRC2012_val_00008717.JPEG n02948072/
+mv val/ILSVRC2012_val_00008718.JPEG n01774384/
+mv val/ILSVRC2012_val_00008719.JPEG n01695060/
+mv val/ILSVRC2012_val_00008720.JPEG n07734744/
+mv val/ILSVRC2012_val_00008721.JPEG n01990800/
+mv val/ILSVRC2012_val_00008722.JPEG n02445715/
+mv val/ILSVRC2012_val_00008723.JPEG n03017168/
+mv val/ILSVRC2012_val_00008724.JPEG n02606052/
+mv val/ILSVRC2012_val_00008725.JPEG n04612504/
+mv val/ILSVRC2012_val_00008726.JPEG n02119789/
+mv val/ILSVRC2012_val_00008727.JPEG n02113978/
+mv val/ILSVRC2012_val_00008728.JPEG n03706229/
+mv val/ILSVRC2012_val_00008729.JPEG n02115913/
+mv val/ILSVRC2012_val_00008730.JPEG n02655020/
+mv val/ILSVRC2012_val_00008731.JPEG n02640242/
+mv val/ILSVRC2012_val_00008732.JPEG n03478589/
+mv val/ILSVRC2012_val_00008733.JPEG n03891251/
+mv val/ILSVRC2012_val_00008734.JPEG n02892201/
+mv val/ILSVRC2012_val_00008735.JPEG n02676566/
+mv val/ILSVRC2012_val_00008736.JPEG n01877812/
+mv val/ILSVRC2012_val_00008737.JPEG n02037110/
+mv val/ILSVRC2012_val_00008738.JPEG n07745940/
+mv val/ILSVRC2012_val_00008739.JPEG n02090721/
+mv val/ILSVRC2012_val_00008740.JPEG n04548280/
+mv val/ILSVRC2012_val_00008741.JPEG n02971356/
+mv val/ILSVRC2012_val_00008742.JPEG n03042490/
+mv val/ILSVRC2012_val_00008743.JPEG n02865351/
+mv val/ILSVRC2012_val_00008744.JPEG n04310018/
+mv val/ILSVRC2012_val_00008745.JPEG n07802026/
+mv val/ILSVRC2012_val_00008746.JPEG n01843065/
+mv val/ILSVRC2012_val_00008747.JPEG n01944390/
+mv val/ILSVRC2012_val_00008748.JPEG n03443371/
+mv val/ILSVRC2012_val_00008749.JPEG n01496331/
+mv val/ILSVRC2012_val_00008750.JPEG n13044778/
+mv val/ILSVRC2012_val_00008751.JPEG n03196217/
+mv val/ILSVRC2012_val_00008752.JPEG n02111889/
+mv val/ILSVRC2012_val_00008753.JPEG n09288635/
+mv val/ILSVRC2012_val_00008754.JPEG n03777568/
+mv val/ILSVRC2012_val_00008755.JPEG n03970156/
+mv val/ILSVRC2012_val_00008756.JPEG n02027492/
+mv val/ILSVRC2012_val_00008757.JPEG n09332890/
+mv val/ILSVRC2012_val_00008758.JPEG n04326547/
+mv val/ILSVRC2012_val_00008759.JPEG n04458633/
+mv val/ILSVRC2012_val_00008760.JPEG n02093428/
+mv val/ILSVRC2012_val_00008761.JPEG n03992509/
+mv val/ILSVRC2012_val_00008762.JPEG n03908618/
+mv val/ILSVRC2012_val_00008763.JPEG n03290653/
+mv val/ILSVRC2012_val_00008764.JPEG n04311004/
+mv val/ILSVRC2012_val_00008765.JPEG n03764736/
+mv val/ILSVRC2012_val_00008766.JPEG n04465501/
+mv val/ILSVRC2012_val_00008767.JPEG n03345487/
+mv val/ILSVRC2012_val_00008768.JPEG n04099969/
+mv val/ILSVRC2012_val_00008769.JPEG n02843684/
+mv val/ILSVRC2012_val_00008770.JPEG n02361337/
+mv val/ILSVRC2012_val_00008771.JPEG n02066245/
+mv val/ILSVRC2012_val_00008772.JPEG n02099601/
+mv val/ILSVRC2012_val_00008773.JPEG n03259280/
+mv val/ILSVRC2012_val_00008774.JPEG n02105641/
+mv val/ILSVRC2012_val_00008775.JPEG n01755581/
+mv val/ILSVRC2012_val_00008776.JPEG n03937543/
+mv val/ILSVRC2012_val_00008777.JPEG n03249569/
+mv val/ILSVRC2012_val_00008778.JPEG n02124075/
+mv val/ILSVRC2012_val_00008779.JPEG n03761084/
+mv val/ILSVRC2012_val_00008780.JPEG n02834397/
+mv val/ILSVRC2012_val_00008781.JPEG n03891251/
+mv val/ILSVRC2012_val_00008782.JPEG n07753275/
+mv val/ILSVRC2012_val_00008783.JPEG n04389033/
+mv val/ILSVRC2012_val_00008784.JPEG n03599486/
+mv val/ILSVRC2012_val_00008785.JPEG n04392985/
+mv val/ILSVRC2012_val_00008786.JPEG n01582220/
+mv val/ILSVRC2012_val_00008787.JPEG n03642806/
+mv val/ILSVRC2012_val_00008788.JPEG n01749939/
+mv val/ILSVRC2012_val_00008789.JPEG n01944390/
+mv val/ILSVRC2012_val_00008790.JPEG n03146219/
+mv val/ILSVRC2012_val_00008791.JPEG n09428293/
+mv val/ILSVRC2012_val_00008792.JPEG n02112350/
+mv val/ILSVRC2012_val_00008793.JPEG n03249569/
+mv val/ILSVRC2012_val_00008794.JPEG n02085936/
+mv val/ILSVRC2012_val_00008795.JPEG n03240683/
+mv val/ILSVRC2012_val_00008796.JPEG n04597913/
+mv val/ILSVRC2012_val_00008797.JPEG n03249569/
+mv val/ILSVRC2012_val_00008798.JPEG n02256656/
+mv val/ILSVRC2012_val_00008799.JPEG n07248320/
+mv val/ILSVRC2012_val_00008800.JPEG n04376876/
+mv val/ILSVRC2012_val_00008801.JPEG n03089624/
+mv val/ILSVRC2012_val_00008802.JPEG n04118538/
+mv val/ILSVRC2012_val_00008803.JPEG n02966687/
+mv val/ILSVRC2012_val_00008804.JPEG n03891332/
+mv val/ILSVRC2012_val_00008805.JPEG n01773157/
+mv val/ILSVRC2012_val_00008806.JPEG n02948072/
+mv val/ILSVRC2012_val_00008807.JPEG n01685808/
+mv val/ILSVRC2012_val_00008808.JPEG n04371430/
+mv val/ILSVRC2012_val_00008809.JPEG n02107312/
+mv val/ILSVRC2012_val_00008810.JPEG n01749939/
+mv val/ILSVRC2012_val_00008811.JPEG n02085936/
+mv val/ILSVRC2012_val_00008812.JPEG n02091831/
+mv val/ILSVRC2012_val_00008813.JPEG n02098105/
+mv val/ILSVRC2012_val_00008814.JPEG n02708093/
+mv val/ILSVRC2012_val_00008815.JPEG n02120505/
+mv val/ILSVRC2012_val_00008816.JPEG n01601694/
+mv val/ILSVRC2012_val_00008817.JPEG n06874185/
+mv val/ILSVRC2012_val_00008818.JPEG n02319095/
+mv val/ILSVRC2012_val_00008819.JPEG n01616318/
+mv val/ILSVRC2012_val_00008820.JPEG n01775062/
+mv val/ILSVRC2012_val_00008821.JPEG n13040303/
+mv val/ILSVRC2012_val_00008822.JPEG n03796401/
+mv val/ILSVRC2012_val_00008823.JPEG n04482393/
+mv val/ILSVRC2012_val_00008824.JPEG n03272562/
+mv val/ILSVRC2012_val_00008825.JPEG n03478589/
+mv val/ILSVRC2012_val_00008826.JPEG n02190166/
+mv val/ILSVRC2012_val_00008827.JPEG n02910353/
+mv val/ILSVRC2012_val_00008828.JPEG n02951358/
+mv val/ILSVRC2012_val_00008829.JPEG n01749939/
+mv val/ILSVRC2012_val_00008830.JPEG n12985857/
+mv val/ILSVRC2012_val_00008831.JPEG n04254120/
+mv val/ILSVRC2012_val_00008832.JPEG n03944341/
+mv val/ILSVRC2012_val_00008833.JPEG n03743016/
+mv val/ILSVRC2012_val_00008834.JPEG n01855672/
+mv val/ILSVRC2012_val_00008835.JPEG n04228054/
+mv val/ILSVRC2012_val_00008836.JPEG n03642806/
+mv val/ILSVRC2012_val_00008837.JPEG n03956157/
+mv val/ILSVRC2012_val_00008838.JPEG n04162706/
+mv val/ILSVRC2012_val_00008839.JPEG n02992211/
+mv val/ILSVRC2012_val_00008840.JPEG n01883070/
+mv val/ILSVRC2012_val_00008841.JPEG n03045698/
+mv val/ILSVRC2012_val_00008842.JPEG n02018207/
+mv val/ILSVRC2012_val_00008843.JPEG n01872401/
+mv val/ILSVRC2012_val_00008844.JPEG n04239074/
+mv val/ILSVRC2012_val_00008845.JPEG n07932039/
+mv val/ILSVRC2012_val_00008846.JPEG n04392985/
+mv val/ILSVRC2012_val_00008847.JPEG n02641379/
+mv val/ILSVRC2012_val_00008848.JPEG n01484850/
+mv val/ILSVRC2012_val_00008849.JPEG n01742172/
+mv val/ILSVRC2012_val_00008850.JPEG n04376876/
+mv val/ILSVRC2012_val_00008851.JPEG n04550184/
+mv val/ILSVRC2012_val_00008852.JPEG n03733805/
+mv val/ILSVRC2012_val_00008853.JPEG n04371774/
+mv val/ILSVRC2012_val_00008854.JPEG n04317175/
+mv val/ILSVRC2012_val_00008855.JPEG n03873416/
+mv val/ILSVRC2012_val_00008856.JPEG n02361337/
+mv val/ILSVRC2012_val_00008857.JPEG n02002556/
+mv val/ILSVRC2012_val_00008858.JPEG n02168699/
+mv val/ILSVRC2012_val_00008859.JPEG n02098413/
+mv val/ILSVRC2012_val_00008860.JPEG n02104365/
+mv val/ILSVRC2012_val_00008861.JPEG n03841143/
+mv val/ILSVRC2012_val_00008862.JPEG n02074367/
+mv val/ILSVRC2012_val_00008863.JPEG n04344873/
+mv val/ILSVRC2012_val_00008864.JPEG n07615774/
+mv val/ILSVRC2012_val_00008865.JPEG n04149813/
+mv val/ILSVRC2012_val_00008866.JPEG n02321529/
+mv val/ILSVRC2012_val_00008867.JPEG n12144580/
+mv val/ILSVRC2012_val_00008868.JPEG n02509815/
+mv val/ILSVRC2012_val_00008869.JPEG n03938244/
+mv val/ILSVRC2012_val_00008870.JPEG n01978455/
+mv val/ILSVRC2012_val_00008871.JPEG n03047690/
+mv val/ILSVRC2012_val_00008872.JPEG n04252077/
+mv val/ILSVRC2012_val_00008873.JPEG n02487347/
+mv val/ILSVRC2012_val_00008874.JPEG n03141823/
+mv val/ILSVRC2012_val_00008875.JPEG n02666196/
+mv val/ILSVRC2012_val_00008876.JPEG n02123045/
+mv val/ILSVRC2012_val_00008877.JPEG n02486410/
+mv val/ILSVRC2012_val_00008878.JPEG n02492660/
+mv val/ILSVRC2012_val_00008879.JPEG n03796401/
+mv val/ILSVRC2012_val_00008880.JPEG n02112350/
+mv val/ILSVRC2012_val_00008881.JPEG n07730033/
+mv val/ILSVRC2012_val_00008882.JPEG n03950228/
+mv val/ILSVRC2012_val_00008883.JPEG n04162706/
+mv val/ILSVRC2012_val_00008884.JPEG n02895154/
+mv val/ILSVRC2012_val_00008885.JPEG n02105641/
+mv val/ILSVRC2012_val_00008886.JPEG n03404251/
+mv val/ILSVRC2012_val_00008887.JPEG n02007558/
+mv val/ILSVRC2012_val_00008888.JPEG n01739381/
+mv val/ILSVRC2012_val_00008889.JPEG n02481823/
+mv val/ILSVRC2012_val_00008890.JPEG n04409515/
+mv val/ILSVRC2012_val_00008891.JPEG n02443114/
+mv val/ILSVRC2012_val_00008892.JPEG n02879718/
+mv val/ILSVRC2012_val_00008893.JPEG n03345487/
+mv val/ILSVRC2012_val_00008894.JPEG n02268853/
+mv val/ILSVRC2012_val_00008895.JPEG n12620546/
+mv val/ILSVRC2012_val_00008896.JPEG n03930313/
+mv val/ILSVRC2012_val_00008897.JPEG n04380533/
+mv val/ILSVRC2012_val_00008898.JPEG n01518878/
+mv val/ILSVRC2012_val_00008899.JPEG n04596742/
+mv val/ILSVRC2012_val_00008900.JPEG n03680355/
+mv val/ILSVRC2012_val_00008901.JPEG n02074367/
+mv val/ILSVRC2012_val_00008902.JPEG n01667778/
+mv val/ILSVRC2012_val_00008903.JPEG n03376595/
+mv val/ILSVRC2012_val_00008904.JPEG n04366367/
+mv val/ILSVRC2012_val_00008905.JPEG n02097047/
+mv val/ILSVRC2012_val_00008906.JPEG n02101006/
+mv val/ILSVRC2012_val_00008907.JPEG n01873310/
+mv val/ILSVRC2012_val_00008908.JPEG n03876231/
+mv val/ILSVRC2012_val_00008909.JPEG n04507155/
+mv val/ILSVRC2012_val_00008910.JPEG n02086910/
+mv val/ILSVRC2012_val_00008911.JPEG n04370456/
+mv val/ILSVRC2012_val_00008912.JPEG n02687172/
+mv val/ILSVRC2012_val_00008913.JPEG n03724870/
+mv val/ILSVRC2012_val_00008914.JPEG n02966193/
+mv val/ILSVRC2012_val_00008915.JPEG n02776631/
+mv val/ILSVRC2012_val_00008916.JPEG n03089624/
+mv val/ILSVRC2012_val_00008917.JPEG n04456115/
+mv val/ILSVRC2012_val_00008918.JPEG n03325584/
+mv val/ILSVRC2012_val_00008919.JPEG n01770081/
+mv val/ILSVRC2012_val_00008920.JPEG n04428191/
+mv val/ILSVRC2012_val_00008921.JPEG n01667778/
+mv val/ILSVRC2012_val_00008922.JPEG n02132136/
+mv val/ILSVRC2012_val_00008923.JPEG n02105162/
+mv val/ILSVRC2012_val_00008924.JPEG n03743016/
+mv val/ILSVRC2012_val_00008925.JPEG n04367480/
+mv val/ILSVRC2012_val_00008926.JPEG n02098105/
+mv val/ILSVRC2012_val_00008927.JPEG n03000134/
+mv val/ILSVRC2012_val_00008928.JPEG n02100236/
+mv val/ILSVRC2012_val_00008929.JPEG n02011460/
+mv val/ILSVRC2012_val_00008930.JPEG n02097047/
+mv val/ILSVRC2012_val_00008931.JPEG n02177972/
+mv val/ILSVRC2012_val_00008932.JPEG n04493381/
+mv val/ILSVRC2012_val_00008933.JPEG n03874293/
+mv val/ILSVRC2012_val_00008934.JPEG n02017213/
+mv val/ILSVRC2012_val_00008935.JPEG n03908714/
+mv val/ILSVRC2012_val_00008936.JPEG n02361337/
+mv val/ILSVRC2012_val_00008937.JPEG n02669723/
+mv val/ILSVRC2012_val_00008938.JPEG n02119022/
+mv val/ILSVRC2012_val_00008939.JPEG n02105505/
+mv val/ILSVRC2012_val_00008940.JPEG n03884397/
+mv val/ILSVRC2012_val_00008941.JPEG n02190166/
+mv val/ILSVRC2012_val_00008942.JPEG n03216828/
+mv val/ILSVRC2012_val_00008943.JPEG n02410509/
+mv val/ILSVRC2012_val_00008944.JPEG n02101556/
+mv val/ILSVRC2012_val_00008945.JPEG n02098286/
+mv val/ILSVRC2012_val_00008946.JPEG n03250847/
+mv val/ILSVRC2012_val_00008947.JPEG n02117135/
+mv val/ILSVRC2012_val_00008948.JPEG n03929660/
+mv val/ILSVRC2012_val_00008949.JPEG n04332243/
+mv val/ILSVRC2012_val_00008950.JPEG n03891332/
+mv val/ILSVRC2012_val_00008951.JPEG n02018207/
+mv val/ILSVRC2012_val_00008952.JPEG n01498041/
+mv val/ILSVRC2012_val_00008953.JPEG n03977966/
+mv val/ILSVRC2012_val_00008954.JPEG n02892767/
+mv val/ILSVRC2012_val_00008955.JPEG n03781244/
+mv val/ILSVRC2012_val_00008956.JPEG n02094433/
+mv val/ILSVRC2012_val_00008957.JPEG n02112137/
+mv val/ILSVRC2012_val_00008958.JPEG n02910353/
+mv val/ILSVRC2012_val_00008959.JPEG n03791053/
+mv val/ILSVRC2012_val_00008960.JPEG n01773157/
+mv val/ILSVRC2012_val_00008961.JPEG n03599486/
+mv val/ILSVRC2012_val_00008962.JPEG n11939491/
+mv val/ILSVRC2012_val_00008963.JPEG n01496331/
+mv val/ILSVRC2012_val_00008964.JPEG n02950826/
+mv val/ILSVRC2012_val_00008965.JPEG n09246464/
+mv val/ILSVRC2012_val_00008966.JPEG n02099429/
+mv val/ILSVRC2012_val_00008967.JPEG n02108551/
+mv val/ILSVRC2012_val_00008968.JPEG n02895154/
+mv val/ILSVRC2012_val_00008969.JPEG n09229709/
+mv val/ILSVRC2012_val_00008970.JPEG n07932039/
+mv val/ILSVRC2012_val_00008971.JPEG n03721384/
+mv val/ILSVRC2012_val_00008972.JPEG n03529860/
+mv val/ILSVRC2012_val_00008973.JPEG n02113186/
+mv val/ILSVRC2012_val_00008974.JPEG n03929660/
+mv val/ILSVRC2012_val_00008975.JPEG n02086646/
+mv val/ILSVRC2012_val_00008976.JPEG n02787622/
+mv val/ILSVRC2012_val_00008977.JPEG n02676566/
+mv val/ILSVRC2012_val_00008978.JPEG n02006656/
+mv val/ILSVRC2012_val_00008979.JPEG n02104365/
+mv val/ILSVRC2012_val_00008980.JPEG n03045698/
+mv val/ILSVRC2012_val_00008981.JPEG n03100240/
+mv val/ILSVRC2012_val_00008982.JPEG n03599486/
+mv val/ILSVRC2012_val_00008983.JPEG n03924679/
+mv val/ILSVRC2012_val_00008984.JPEG n03937543/
+mv val/ILSVRC2012_val_00008985.JPEG n02869837/
+mv val/ILSVRC2012_val_00008986.JPEG n02123394/
+mv val/ILSVRC2012_val_00008987.JPEG n01980166/
+mv val/ILSVRC2012_val_00008988.JPEG n04355933/
+mv val/ILSVRC2012_val_00008989.JPEG n03133878/
+mv val/ILSVRC2012_val_00008990.JPEG n03709823/
+mv val/ILSVRC2012_val_00008991.JPEG n06794110/
+mv val/ILSVRC2012_val_00008992.JPEG n02110341/
+mv val/ILSVRC2012_val_00008993.JPEG n01796340/
+mv val/ILSVRC2012_val_00008994.JPEG n02978881/
+mv val/ILSVRC2012_val_00008995.JPEG n03495258/
+mv val/ILSVRC2012_val_00008996.JPEG n03452741/
+mv val/ILSVRC2012_val_00008997.JPEG n02091032/
+mv val/ILSVRC2012_val_00008998.JPEG n04442312/
+mv val/ILSVRC2012_val_00008999.JPEG n04118776/
+mv val/ILSVRC2012_val_00009000.JPEG n01630670/
+mv val/ILSVRC2012_val_00009001.JPEG n03662601/
+mv val/ILSVRC2012_val_00009002.JPEG n02174001/
+mv val/ILSVRC2012_val_00009003.JPEG n04606251/
+mv val/ILSVRC2012_val_00009004.JPEG n02107142/
+mv val/ILSVRC2012_val_00009005.JPEG n03814906/
+mv val/ILSVRC2012_val_00009006.JPEG n03457902/
+mv val/ILSVRC2012_val_00009007.JPEG n02085782/
+mv val/ILSVRC2012_val_00009008.JPEG n03598930/
+mv val/ILSVRC2012_val_00009009.JPEG n02094258/
+mv val/ILSVRC2012_val_00009010.JPEG n03000247/
+mv val/ILSVRC2012_val_00009011.JPEG n02966193/
+mv val/ILSVRC2012_val_00009012.JPEG n02489166/
+mv val/ILSVRC2012_val_00009013.JPEG n04367480/
+mv val/ILSVRC2012_val_00009014.JPEG n02110063/
+mv val/ILSVRC2012_val_00009015.JPEG n07753275/
+mv val/ILSVRC2012_val_00009016.JPEG n07715103/
+mv val/ILSVRC2012_val_00009017.JPEG n04485082/
+mv val/ILSVRC2012_val_00009018.JPEG n03075370/
+mv val/ILSVRC2012_val_00009019.JPEG n02098105/
+mv val/ILSVRC2012_val_00009020.JPEG n13054560/
+mv val/ILSVRC2012_val_00009021.JPEG n02730930/
+mv val/ILSVRC2012_val_00009022.JPEG n03670208/
+mv val/ILSVRC2012_val_00009023.JPEG n02281787/
+mv val/ILSVRC2012_val_00009024.JPEG n04462240/
+mv val/ILSVRC2012_val_00009025.JPEG n02510455/
+mv val/ILSVRC2012_val_00009026.JPEG n02814860/
+mv val/ILSVRC2012_val_00009027.JPEG n04482393/
+mv val/ILSVRC2012_val_00009028.JPEG n03498962/
+mv val/ILSVRC2012_val_00009029.JPEG n09229709/
+mv val/ILSVRC2012_val_00009030.JPEG n02097130/
+mv val/ILSVRC2012_val_00009031.JPEG n04265275/
+mv val/ILSVRC2012_val_00009032.JPEG n04004767/
+mv val/ILSVRC2012_val_00009033.JPEG n02093647/
+mv val/ILSVRC2012_val_00009034.JPEG n01443537/
+mv val/ILSVRC2012_val_00009035.JPEG n01704323/
+mv val/ILSVRC2012_val_00009036.JPEG n02096437/
+mv val/ILSVRC2012_val_00009037.JPEG n03394916/
+mv val/ILSVRC2012_val_00009038.JPEG n04423845/
+mv val/ILSVRC2012_val_00009039.JPEG n02108422/
+mv val/ILSVRC2012_val_00009040.JPEG n03706229/
+mv val/ILSVRC2012_val_00009041.JPEG n02869837/
+mv val/ILSVRC2012_val_00009042.JPEG n01737021/
+mv val/ILSVRC2012_val_00009043.JPEG n03930313/
+mv val/ILSVRC2012_val_00009044.JPEG n04039381/
+mv val/ILSVRC2012_val_00009045.JPEG n02113186/
+mv val/ILSVRC2012_val_00009046.JPEG n02403003/
+mv val/ILSVRC2012_val_00009047.JPEG n02037110/
+mv val/ILSVRC2012_val_00009048.JPEG n03637318/
+mv val/ILSVRC2012_val_00009049.JPEG n02823750/
+mv val/ILSVRC2012_val_00009050.JPEG n01677366/
+mv val/ILSVRC2012_val_00009051.JPEG n02093256/
+mv val/ILSVRC2012_val_00009052.JPEG n02096294/
+mv val/ILSVRC2012_val_00009053.JPEG n06596364/
+mv val/ILSVRC2012_val_00009054.JPEG n03220513/
+mv val/ILSVRC2012_val_00009055.JPEG n02106030/
+mv val/ILSVRC2012_val_00009056.JPEG n02917067/
+mv val/ILSVRC2012_val_00009057.JPEG n02090622/
+mv val/ILSVRC2012_val_00009058.JPEG n04141076/
+mv val/ILSVRC2012_val_00009059.JPEG n01749939/
+mv val/ILSVRC2012_val_00009060.JPEG n02981792/
+mv val/ILSVRC2012_val_00009061.JPEG n02111889/
+mv val/ILSVRC2012_val_00009062.JPEG n02116738/
+mv val/ILSVRC2012_val_00009063.JPEG n09246464/
+mv val/ILSVRC2012_val_00009064.JPEG n02791124/
+mv val/ILSVRC2012_val_00009065.JPEG n02091244/
+mv val/ILSVRC2012_val_00009066.JPEG n02119022/
+mv val/ILSVRC2012_val_00009067.JPEG n02445715/
+mv val/ILSVRC2012_val_00009068.JPEG n03216828/
+mv val/ILSVRC2012_val_00009069.JPEG n03095699/
+mv val/ILSVRC2012_val_00009070.JPEG n03481172/
+mv val/ILSVRC2012_val_00009071.JPEG n04442312/
+mv val/ILSVRC2012_val_00009072.JPEG n02802426/
+mv val/ILSVRC2012_val_00009073.JPEG n09428293/
+mv val/ILSVRC2012_val_00009074.JPEG n03065424/
+mv val/ILSVRC2012_val_00009075.JPEG n02363005/
+mv val/ILSVRC2012_val_00009076.JPEG n12057211/
+mv val/ILSVRC2012_val_00009077.JPEG n02422106/
+mv val/ILSVRC2012_val_00009078.JPEG n02999410/
+mv val/ILSVRC2012_val_00009079.JPEG n03207743/
+mv val/ILSVRC2012_val_00009080.JPEG n03786901/
+mv val/ILSVRC2012_val_00009081.JPEG n02363005/
+mv val/ILSVRC2012_val_00009082.JPEG n02417914/
+mv val/ILSVRC2012_val_00009083.JPEG n01698640/
+mv val/ILSVRC2012_val_00009084.JPEG n03063599/
+mv val/ILSVRC2012_val_00009085.JPEG n04409515/
+mv val/ILSVRC2012_val_00009086.JPEG n03891251/
+mv val/ILSVRC2012_val_00009087.JPEG n03794056/
+mv val/ILSVRC2012_val_00009088.JPEG n02101388/
+mv val/ILSVRC2012_val_00009089.JPEG n04044716/
+mv val/ILSVRC2012_val_00009090.JPEG n02226429/
+mv val/ILSVRC2012_val_00009091.JPEG n01818515/
+mv val/ILSVRC2012_val_00009092.JPEG n01558993/
+mv val/ILSVRC2012_val_00009093.JPEG n02110806/
+mv val/ILSVRC2012_val_00009094.JPEG n03337140/
+mv val/ILSVRC2012_val_00009095.JPEG n03627232/
+mv val/ILSVRC2012_val_00009096.JPEG n04204238/
+mv val/ILSVRC2012_val_00009097.JPEG n07873807/
+mv val/ILSVRC2012_val_00009098.JPEG n03930630/
+mv val/ILSVRC2012_val_00009099.JPEG n04311174/
+mv val/ILSVRC2012_val_00009100.JPEG n01616318/
+mv val/ILSVRC2012_val_00009101.JPEG n04330267/
+mv val/ILSVRC2012_val_00009102.JPEG n04179913/
+mv val/ILSVRC2012_val_00009103.JPEG n04501370/
+mv val/ILSVRC2012_val_00009104.JPEG n02687172/
+mv val/ILSVRC2012_val_00009105.JPEG n02086079/
+mv val/ILSVRC2012_val_00009106.JPEG n03976467/
+mv val/ILSVRC2012_val_00009107.JPEG n03950228/
+mv val/ILSVRC2012_val_00009108.JPEG n01773797/
+mv val/ILSVRC2012_val_00009109.JPEG n03197337/
+mv val/ILSVRC2012_val_00009110.JPEG n02640242/
+mv val/ILSVRC2012_val_00009111.JPEG n01440764/
+mv val/ILSVRC2012_val_00009112.JPEG n02342885/
+mv val/ILSVRC2012_val_00009113.JPEG n02389026/
+mv val/ILSVRC2012_val_00009114.JPEG n02895154/
+mv val/ILSVRC2012_val_00009115.JPEG n02056570/
+mv val/ILSVRC2012_val_00009116.JPEG n04584207/
+mv val/ILSVRC2012_val_00009117.JPEG n03042490/
+mv val/ILSVRC2012_val_00009118.JPEG n09421951/
+mv val/ILSVRC2012_val_00009119.JPEG n01616318/
+mv val/ILSVRC2012_val_00009120.JPEG n03384352/
+mv val/ILSVRC2012_val_00009121.JPEG n07248320/
+mv val/ILSVRC2012_val_00009122.JPEG n03590841/
+mv val/ILSVRC2012_val_00009123.JPEG n03903868/
+mv val/ILSVRC2012_val_00009124.JPEG n02129165/
+mv val/ILSVRC2012_val_00009125.JPEG n02123159/
+mv val/ILSVRC2012_val_00009126.JPEG n03837869/
+mv val/ILSVRC2012_val_00009127.JPEG n03630383/
+mv val/ILSVRC2012_val_00009128.JPEG n02119789/
+mv val/ILSVRC2012_val_00009129.JPEG n07768694/
+mv val/ILSVRC2012_val_00009130.JPEG n02102973/
+mv val/ILSVRC2012_val_00009131.JPEG n03788195/
+mv val/ILSVRC2012_val_00009132.JPEG n01682714/
+mv val/ILSVRC2012_val_00009133.JPEG n02130308/
+mv val/ILSVRC2012_val_00009134.JPEG n03495258/
+mv val/ILSVRC2012_val_00009135.JPEG n03770439/
+mv val/ILSVRC2012_val_00009136.JPEG n02398521/
+mv val/ILSVRC2012_val_00009137.JPEG n02965783/
+mv val/ILSVRC2012_val_00009138.JPEG n02033041/
+mv val/ILSVRC2012_val_00009139.JPEG n02088094/
+mv val/ILSVRC2012_val_00009140.JPEG n02939185/
+mv val/ILSVRC2012_val_00009141.JPEG n01914609/
+mv val/ILSVRC2012_val_00009142.JPEG n04147183/
+mv val/ILSVRC2012_val_00009143.JPEG n03720891/
+mv val/ILSVRC2012_val_00009144.JPEG n02105641/
+mv val/ILSVRC2012_val_00009145.JPEG n01843383/
+mv val/ILSVRC2012_val_00009146.JPEG n01818515/
+mv val/ILSVRC2012_val_00009147.JPEG n02730930/
+mv val/ILSVRC2012_val_00009148.JPEG n02109961/
+mv val/ILSVRC2012_val_00009149.JPEG n04398044/
+mv val/ILSVRC2012_val_00009150.JPEG n04131690/
+mv val/ILSVRC2012_val_00009151.JPEG n01914609/
+mv val/ILSVRC2012_val_00009152.JPEG n03481172/
+mv val/ILSVRC2012_val_00009153.JPEG n04317175/
+mv val/ILSVRC2012_val_00009154.JPEG n03344393/
+mv val/ILSVRC2012_val_00009155.JPEG n04557648/
+mv val/ILSVRC2012_val_00009156.JPEG n02120505/
+mv val/ILSVRC2012_val_00009157.JPEG n02109961/
+mv val/ILSVRC2012_val_00009158.JPEG n02128385/
+mv val/ILSVRC2012_val_00009159.JPEG n02391049/
+mv val/ILSVRC2012_val_00009160.JPEG n03041632/
+mv val/ILSVRC2012_val_00009161.JPEG n09246464/
+mv val/ILSVRC2012_val_00009162.JPEG n03666591/
+mv val/ILSVRC2012_val_00009163.JPEG n02111129/
+mv val/ILSVRC2012_val_00009164.JPEG n02974003/
+mv val/ILSVRC2012_val_00009165.JPEG n02643566/
+mv val/ILSVRC2012_val_00009166.JPEG n03492542/
+mv val/ILSVRC2012_val_00009167.JPEG n02090622/
+mv val/ILSVRC2012_val_00009168.JPEG n02389026/
+mv val/ILSVRC2012_val_00009169.JPEG n01735189/
+mv val/ILSVRC2012_val_00009170.JPEG n03478589/
+mv val/ILSVRC2012_val_00009171.JPEG n03785016/
+mv val/ILSVRC2012_val_00009172.JPEG n03854065/
+mv val/ILSVRC2012_val_00009173.JPEG n03207743/
+mv val/ILSVRC2012_val_00009174.JPEG n04399382/
+mv val/ILSVRC2012_val_00009175.JPEG n02108422/
+mv val/ILSVRC2012_val_00009176.JPEG n04428191/
+mv val/ILSVRC2012_val_00009177.JPEG n07760859/
+mv val/ILSVRC2012_val_00009178.JPEG n03888605/
+mv val/ILSVRC2012_val_00009179.JPEG n02704792/
+mv val/ILSVRC2012_val_00009180.JPEG n03697007/
+mv val/ILSVRC2012_val_00009181.JPEG n03657121/
+mv val/ILSVRC2012_val_00009182.JPEG n04141975/
+mv val/ILSVRC2012_val_00009183.JPEG n04008634/
+mv val/ILSVRC2012_val_00009184.JPEG n02799071/
+mv val/ILSVRC2012_val_00009185.JPEG n02018795/
+mv val/ILSVRC2012_val_00009186.JPEG n02877765/
+mv val/ILSVRC2012_val_00009187.JPEG n07613480/
+mv val/ILSVRC2012_val_00009188.JPEG n11939491/
+mv val/ILSVRC2012_val_00009189.JPEG n02108089/
+mv val/ILSVRC2012_val_00009190.JPEG n02098413/
+mv val/ILSVRC2012_val_00009191.JPEG n01440764/
+mv val/ILSVRC2012_val_00009192.JPEG n01776313/
+mv val/ILSVRC2012_val_00009193.JPEG n03804744/
+mv val/ILSVRC2012_val_00009194.JPEG n01817953/
+mv val/ILSVRC2012_val_00009195.JPEG n02788148/
+mv val/ILSVRC2012_val_00009196.JPEG n03400231/
+mv val/ILSVRC2012_val_00009197.JPEG n03899768/
+mv val/ILSVRC2012_val_00009198.JPEG n02027492/
+mv val/ILSVRC2012_val_00009199.JPEG n02028035/
+mv val/ILSVRC2012_val_00009200.JPEG n02087394/
+mv val/ILSVRC2012_val_00009201.JPEG n04392985/
+mv val/ILSVRC2012_val_00009202.JPEG n01944390/
+mv val/ILSVRC2012_val_00009203.JPEG n04204238/
+mv val/ILSVRC2012_val_00009204.JPEG n03995372/
+mv val/ILSVRC2012_val_00009205.JPEG n02437616/
+mv val/ILSVRC2012_val_00009206.JPEG n03000684/
+mv val/ILSVRC2012_val_00009207.JPEG n03146219/
+mv val/ILSVRC2012_val_00009208.JPEG n01496331/
+mv val/ILSVRC2012_val_00009209.JPEG n02128925/
+mv val/ILSVRC2012_val_00009210.JPEG n02025239/
+mv val/ILSVRC2012_val_00009211.JPEG n03903868/
+mv val/ILSVRC2012_val_00009212.JPEG n06596364/
+mv val/ILSVRC2012_val_00009213.JPEG n01990800/
+mv val/ILSVRC2012_val_00009214.JPEG n03877845/
+mv val/ILSVRC2012_val_00009215.JPEG n02704792/
+mv val/ILSVRC2012_val_00009216.JPEG n01773549/
+mv val/ILSVRC2012_val_00009217.JPEG n03271574/
+mv val/ILSVRC2012_val_00009218.JPEG n02667093/
+mv val/ILSVRC2012_val_00009219.JPEG n01514668/
+mv val/ILSVRC2012_val_00009220.JPEG n02089867/
+mv val/ILSVRC2012_val_00009221.JPEG n02410509/
+mv val/ILSVRC2012_val_00009222.JPEG n09193705/
+mv val/ILSVRC2012_val_00009223.JPEG n04204238/
+mv val/ILSVRC2012_val_00009224.JPEG n02110806/
+mv val/ILSVRC2012_val_00009225.JPEG n02823428/
+mv val/ILSVRC2012_val_00009226.JPEG n01807496/
+mv val/ILSVRC2012_val_00009227.JPEG n07753592/
+mv val/ILSVRC2012_val_00009228.JPEG n02835271/
+mv val/ILSVRC2012_val_00009229.JPEG n04579432/
+mv val/ILSVRC2012_val_00009230.JPEG n03763968/
+mv val/ILSVRC2012_val_00009231.JPEG n01667114/
+mv val/ILSVRC2012_val_00009232.JPEG n01770393/
+mv val/ILSVRC2012_val_00009233.JPEG n02364673/
+mv val/ILSVRC2012_val_00009234.JPEG n03777568/
+mv val/ILSVRC2012_val_00009235.JPEG n04204238/
+mv val/ILSVRC2012_val_00009236.JPEG n04252077/
+mv val/ILSVRC2012_val_00009237.JPEG n01496331/
+mv val/ILSVRC2012_val_00009238.JPEG n02877765/
+mv val/ILSVRC2012_val_00009239.JPEG n01532829/
+mv val/ILSVRC2012_val_00009240.JPEG n02640242/
+mv val/ILSVRC2012_val_00009241.JPEG n04483307/
+mv val/ILSVRC2012_val_00009242.JPEG n04332243/
+mv val/ILSVRC2012_val_00009243.JPEG n03197337/
+mv val/ILSVRC2012_val_00009244.JPEG n02094433/
+mv val/ILSVRC2012_val_00009245.JPEG n03995372/
+mv val/ILSVRC2012_val_00009246.JPEG n03485407/
+mv val/ILSVRC2012_val_00009247.JPEG n02085782/
+mv val/ILSVRC2012_val_00009248.JPEG n04591157/
+mv val/ILSVRC2012_val_00009249.JPEG n07930864/
+mv val/ILSVRC2012_val_00009250.JPEG n02086079/
+mv val/ILSVRC2012_val_00009251.JPEG n01983481/
+mv val/ILSVRC2012_val_00009252.JPEG n04162706/
+mv val/ILSVRC2012_val_00009253.JPEG n02981792/
+mv val/ILSVRC2012_val_00009254.JPEG n02447366/
+mv val/ILSVRC2012_val_00009255.JPEG n03733805/
+mv val/ILSVRC2012_val_00009256.JPEG n02097298/
+mv val/ILSVRC2012_val_00009257.JPEG n04120489/
+mv val/ILSVRC2012_val_00009258.JPEG n04442312/
+mv val/ILSVRC2012_val_00009259.JPEG n07714990/
+mv val/ILSVRC2012_val_00009260.JPEG n02823428/
+mv val/ILSVRC2012_val_00009261.JPEG n02788148/
+mv val/ILSVRC2012_val_00009262.JPEG n02791270/
+mv val/ILSVRC2012_val_00009263.JPEG n11879895/
+mv val/ILSVRC2012_val_00009264.JPEG n03776460/
+mv val/ILSVRC2012_val_00009265.JPEG n02834397/
+mv val/ILSVRC2012_val_00009266.JPEG n03657121/
+mv val/ILSVRC2012_val_00009267.JPEG n02423022/
+mv val/ILSVRC2012_val_00009268.JPEG n03785016/
+mv val/ILSVRC2012_val_00009269.JPEG n03888257/
+mv val/ILSVRC2012_val_00009270.JPEG n02018207/
+mv val/ILSVRC2012_val_00009271.JPEG n01742172/
+mv val/ILSVRC2012_val_00009272.JPEG n04154565/
+mv val/ILSVRC2012_val_00009273.JPEG n02536864/
+mv val/ILSVRC2012_val_00009274.JPEG n03447721/
+mv val/ILSVRC2012_val_00009275.JPEG n02229544/
+mv val/ILSVRC2012_val_00009276.JPEG n04540053/
+mv val/ILSVRC2012_val_00009277.JPEG n04266014/
+mv val/ILSVRC2012_val_00009278.JPEG n03457902/
+mv val/ILSVRC2012_val_00009279.JPEG n03425413/
+mv val/ILSVRC2012_val_00009280.JPEG n02504013/
+mv val/ILSVRC2012_val_00009281.JPEG n02107312/
+mv val/ILSVRC2012_val_00009282.JPEG n02177972/
+mv val/ILSVRC2012_val_00009283.JPEG n02489166/
+mv val/ILSVRC2012_val_00009284.JPEG n04330267/
+mv val/ILSVRC2012_val_00009285.JPEG n03791053/
+mv val/ILSVRC2012_val_00009286.JPEG n04311004/
+mv val/ILSVRC2012_val_00009287.JPEG n02422699/
+mv val/ILSVRC2012_val_00009288.JPEG n02319095/
+mv val/ILSVRC2012_val_00009289.JPEG n04606251/
+mv val/ILSVRC2012_val_00009290.JPEG n04229816/
+mv val/ILSVRC2012_val_00009291.JPEG n02101556/
+mv val/ILSVRC2012_val_00009292.JPEG n04592741/
+mv val/ILSVRC2012_val_00009293.JPEG n03666591/
+mv val/ILSVRC2012_val_00009294.JPEG n02088094/
+mv val/ILSVRC2012_val_00009295.JPEG n02017213/
+mv val/ILSVRC2012_val_00009296.JPEG n03759954/
+mv val/ILSVRC2012_val_00009297.JPEG n02128925/
+mv val/ILSVRC2012_val_00009298.JPEG n03544143/
+mv val/ILSVRC2012_val_00009299.JPEG n03188531/
+mv val/ILSVRC2012_val_00009300.JPEG n03459775/
+mv val/ILSVRC2012_val_00009301.JPEG n04254680/
+mv val/ILSVRC2012_val_00009302.JPEG n03496892/
+mv val/ILSVRC2012_val_00009303.JPEG n02483362/
+mv val/ILSVRC2012_val_00009304.JPEG n02906734/
+mv val/ILSVRC2012_val_00009305.JPEG n07753275/
+mv val/ILSVRC2012_val_00009306.JPEG n02879718/
+mv val/ILSVRC2012_val_00009307.JPEG n02641379/
+mv val/ILSVRC2012_val_00009308.JPEG n02814860/
+mv val/ILSVRC2012_val_00009309.JPEG n03400231/
+mv val/ILSVRC2012_val_00009310.JPEG n02966687/
+mv val/ILSVRC2012_val_00009311.JPEG n09246464/
+mv val/ILSVRC2012_val_00009312.JPEG n02114712/
+mv val/ILSVRC2012_val_00009313.JPEG n02087046/
+mv val/ILSVRC2012_val_00009314.JPEG n02115913/
+mv val/ILSVRC2012_val_00009315.JPEG n03424325/
+mv val/ILSVRC2012_val_00009316.JPEG n03529860/
+mv val/ILSVRC2012_val_00009317.JPEG n01943899/
+mv val/ILSVRC2012_val_00009318.JPEG n04238763/
+mv val/ILSVRC2012_val_00009319.JPEG n03146219/
+mv val/ILSVRC2012_val_00009320.JPEG n02747177/
+mv val/ILSVRC2012_val_00009321.JPEG n02233338/
+mv val/ILSVRC2012_val_00009322.JPEG n13044778/
+mv val/ILSVRC2012_val_00009323.JPEG n03109150/
+mv val/ILSVRC2012_val_00009324.JPEG n02112350/
+mv val/ILSVRC2012_val_00009325.JPEG n03180011/
+mv val/ILSVRC2012_val_00009326.JPEG n02091831/
+mv val/ILSVRC2012_val_00009327.JPEG n03134739/
+mv val/ILSVRC2012_val_00009328.JPEG n03133878/
+mv val/ILSVRC2012_val_00009329.JPEG n01740131/
+mv val/ILSVRC2012_val_00009330.JPEG n02125311/
+mv val/ILSVRC2012_val_00009331.JPEG n02398521/
+mv val/ILSVRC2012_val_00009332.JPEG n02219486/
+mv val/ILSVRC2012_val_00009333.JPEG n04086273/
+mv val/ILSVRC2012_val_00009334.JPEG n02091244/
+mv val/ILSVRC2012_val_00009335.JPEG n02099849/
+mv val/ILSVRC2012_val_00009336.JPEG n02119789/
+mv val/ILSVRC2012_val_00009337.JPEG n04039381/
+mv val/ILSVRC2012_val_00009338.JPEG n02094114/
+mv val/ILSVRC2012_val_00009339.JPEG n04562935/
+mv val/ILSVRC2012_val_00009340.JPEG n03938244/
+mv val/ILSVRC2012_val_00009341.JPEG n07693725/
+mv val/ILSVRC2012_val_00009342.JPEG n12998815/
+mv val/ILSVRC2012_val_00009343.JPEG n04542943/
+mv val/ILSVRC2012_val_00009344.JPEG n02389026/
+mv val/ILSVRC2012_val_00009345.JPEG n03417042/
+mv val/ILSVRC2012_val_00009346.JPEG n01440764/
+mv val/ILSVRC2012_val_00009347.JPEG n02095889/
+mv val/ILSVRC2012_val_00009348.JPEG n02090379/
+mv val/ILSVRC2012_val_00009349.JPEG n02493509/
+mv val/ILSVRC2012_val_00009350.JPEG n02672831/
+mv val/ILSVRC2012_val_00009351.JPEG n01534433/
+mv val/ILSVRC2012_val_00009352.JPEG n02794156/
+mv val/ILSVRC2012_val_00009353.JPEG n02396427/
+mv val/ILSVRC2012_val_00009354.JPEG n02117135/
+mv val/ILSVRC2012_val_00009355.JPEG n03782006/
+mv val/ILSVRC2012_val_00009356.JPEG n04336792/
+mv val/ILSVRC2012_val_00009357.JPEG n03042490/
+mv val/ILSVRC2012_val_00009358.JPEG n03075370/
+mv val/ILSVRC2012_val_00009359.JPEG n02488291/
+mv val/ILSVRC2012_val_00009360.JPEG n04332243/
+mv val/ILSVRC2012_val_00009361.JPEG n02708093/
+mv val/ILSVRC2012_val_00009362.JPEG n02097209/
+mv val/ILSVRC2012_val_00009363.JPEG n02356798/
+mv val/ILSVRC2012_val_00009364.JPEG n03837869/
+mv val/ILSVRC2012_val_00009365.JPEG n04355338/
+mv val/ILSVRC2012_val_00009366.JPEG n03584829/
+mv val/ILSVRC2012_val_00009367.JPEG n03041632/
+mv val/ILSVRC2012_val_00009368.JPEG n06359193/
+mv val/ILSVRC2012_val_00009369.JPEG n03041632/
+mv val/ILSVRC2012_val_00009370.JPEG n03888257/
+mv val/ILSVRC2012_val_00009371.JPEG n03717622/
+mv val/ILSVRC2012_val_00009372.JPEG n04235860/
+mv val/ILSVRC2012_val_00009373.JPEG n04275548/
+mv val/ILSVRC2012_val_00009374.JPEG n01592084/
+mv val/ILSVRC2012_val_00009375.JPEG n03388549/
+mv val/ILSVRC2012_val_00009376.JPEG n01669191/
+mv val/ILSVRC2012_val_00009377.JPEG n07760859/
+mv val/ILSVRC2012_val_00009378.JPEG n02090622/
+mv val/ILSVRC2012_val_00009379.JPEG n01440764/
+mv val/ILSVRC2012_val_00009380.JPEG n01729322/
+mv val/ILSVRC2012_val_00009381.JPEG n02480495/
+mv val/ILSVRC2012_val_00009382.JPEG n07871810/
+mv val/ILSVRC2012_val_00009383.JPEG n04505470/
+mv val/ILSVRC2012_val_00009384.JPEG n04418357/
+mv val/ILSVRC2012_val_00009385.JPEG n03404251/
+mv val/ILSVRC2012_val_00009386.JPEG n03676483/
+mv val/ILSVRC2012_val_00009387.JPEG n02165105/
+mv val/ILSVRC2012_val_00009388.JPEG n04008634/
+mv val/ILSVRC2012_val_00009389.JPEG n03958227/
+mv val/ILSVRC2012_val_00009390.JPEG n02480855/
+mv val/ILSVRC2012_val_00009391.JPEG n02823750/
+mv val/ILSVRC2012_val_00009392.JPEG n07579787/
+mv val/ILSVRC2012_val_00009393.JPEG n02009912/
+mv val/ILSVRC2012_val_00009394.JPEG n07734744/
+mv val/ILSVRC2012_val_00009395.JPEG n03372029/
+mv val/ILSVRC2012_val_00009396.JPEG n01440764/
+mv val/ILSVRC2012_val_00009397.JPEG n02102177/
+mv val/ILSVRC2012_val_00009398.JPEG n03840681/
+mv val/ILSVRC2012_val_00009399.JPEG n07753275/
+mv val/ILSVRC2012_val_00009400.JPEG n03026506/
+mv val/ILSVRC2012_val_00009401.JPEG n01601694/
+mv val/ILSVRC2012_val_00009402.JPEG n03047690/
+mv val/ILSVRC2012_val_00009403.JPEG n02086079/
+mv val/ILSVRC2012_val_00009404.JPEG n02979186/
+mv val/ILSVRC2012_val_00009405.JPEG n02089078/
+mv val/ILSVRC2012_val_00009406.JPEG n02397096/
+mv val/ILSVRC2012_val_00009407.JPEG n12985857/
+mv val/ILSVRC2012_val_00009408.JPEG n02808304/
+mv val/ILSVRC2012_val_00009409.JPEG n04118538/
+mv val/ILSVRC2012_val_00009410.JPEG n04229816/
+mv val/ILSVRC2012_val_00009411.JPEG n09428293/
+mv val/ILSVRC2012_val_00009412.JPEG n07880968/
+mv val/ILSVRC2012_val_00009413.JPEG n04548280/
+mv val/ILSVRC2012_val_00009414.JPEG n03804744/
+mv val/ILSVRC2012_val_00009415.JPEG n01622779/
+mv val/ILSVRC2012_val_00009416.JPEG n02110063/
+mv val/ILSVRC2012_val_00009417.JPEG n02814860/
+mv val/ILSVRC2012_val_00009418.JPEG n02128385/
+mv val/ILSVRC2012_val_00009419.JPEG n01824575/
+mv val/ILSVRC2012_val_00009420.JPEG n01496331/
+mv val/ILSVRC2012_val_00009421.JPEG n04286575/
+mv val/ILSVRC2012_val_00009422.JPEG n03599486/
+mv val/ILSVRC2012_val_00009423.JPEG n03857828/
+mv val/ILSVRC2012_val_00009424.JPEG n03866082/
+mv val/ILSVRC2012_val_00009425.JPEG n03495258/
+mv val/ILSVRC2012_val_00009426.JPEG n02526121/
+mv val/ILSVRC2012_val_00009427.JPEG n02098105/
+mv val/ILSVRC2012_val_00009428.JPEG n02102973/
+mv val/ILSVRC2012_val_00009429.JPEG n03124043/
+mv val/ILSVRC2012_val_00009430.JPEG n04357314/
+mv val/ILSVRC2012_val_00009431.JPEG n07768694/
+mv val/ILSVRC2012_val_00009432.JPEG n03000134/
+mv val/ILSVRC2012_val_00009433.JPEG n03970156/
+mv val/ILSVRC2012_val_00009434.JPEG n04040759/
+mv val/ILSVRC2012_val_00009435.JPEG n02112706/
+mv val/ILSVRC2012_val_00009436.JPEG n04008634/
+mv val/ILSVRC2012_val_00009437.JPEG n04040759/
+mv val/ILSVRC2012_val_00009438.JPEG n06794110/
+mv val/ILSVRC2012_val_00009439.JPEG n02086646/
+mv val/ILSVRC2012_val_00009440.JPEG n02066245/
+mv val/ILSVRC2012_val_00009441.JPEG n03884397/
+mv val/ILSVRC2012_val_00009442.JPEG n03967562/
+mv val/ILSVRC2012_val_00009443.JPEG n04125021/
+mv val/ILSVRC2012_val_00009444.JPEG n02910353/
+mv val/ILSVRC2012_val_00009445.JPEG n02236044/
+mv val/ILSVRC2012_val_00009446.JPEG n01981276/
+mv val/ILSVRC2012_val_00009447.JPEG n07871810/
+mv val/ILSVRC2012_val_00009448.JPEG n02099849/
+mv val/ILSVRC2012_val_00009449.JPEG n03146219/
+mv val/ILSVRC2012_val_00009450.JPEG n04146614/
+mv val/ILSVRC2012_val_00009451.JPEG n09193705/
+mv val/ILSVRC2012_val_00009452.JPEG n02113023/
+mv val/ILSVRC2012_val_00009453.JPEG n02100236/
+mv val/ILSVRC2012_val_00009454.JPEG n13044778/
+mv val/ILSVRC2012_val_00009455.JPEG n03584829/
+mv val/ILSVRC2012_val_00009456.JPEG n03180011/
+mv val/ILSVRC2012_val_00009457.JPEG n02027492/
+mv val/ILSVRC2012_val_00009458.JPEG n03240683/
+mv val/ILSVRC2012_val_00009459.JPEG n02526121/
+mv val/ILSVRC2012_val_00009460.JPEG n01494475/
+mv val/ILSVRC2012_val_00009461.JPEG n02492660/
+mv val/ILSVRC2012_val_00009462.JPEG n01774750/
+mv val/ILSVRC2012_val_00009463.JPEG n07768694/
+mv val/ILSVRC2012_val_00009464.JPEG n02113712/
+mv val/ILSVRC2012_val_00009465.JPEG n03666591/
+mv val/ILSVRC2012_val_00009466.JPEG n12998815/
+mv val/ILSVRC2012_val_00009467.JPEG n03657121/
+mv val/ILSVRC2012_val_00009468.JPEG n02110806/
+mv val/ILSVRC2012_val_00009469.JPEG n03717622/
+mv val/ILSVRC2012_val_00009470.JPEG n02087394/
+mv val/ILSVRC2012_val_00009471.JPEG n02692877/
+mv val/ILSVRC2012_val_00009472.JPEG n02497673/
+mv val/ILSVRC2012_val_00009473.JPEG n04507155/
+mv val/ILSVRC2012_val_00009474.JPEG n02114855/
+mv val/ILSVRC2012_val_00009475.JPEG n04332243/
+mv val/ILSVRC2012_val_00009476.JPEG n02100877/
+mv val/ILSVRC2012_val_00009477.JPEG n04332243/
+mv val/ILSVRC2012_val_00009478.JPEG n02110627/
+mv val/ILSVRC2012_val_00009479.JPEG n03424325/
+mv val/ILSVRC2012_val_00009480.JPEG n02104365/
+mv val/ILSVRC2012_val_00009481.JPEG n01943899/
+mv val/ILSVRC2012_val_00009482.JPEG n03535780/
+mv val/ILSVRC2012_val_00009483.JPEG n02883205/
+mv val/ILSVRC2012_val_00009484.JPEG n01667778/
+mv val/ILSVRC2012_val_00009485.JPEG n01986214/
+mv val/ILSVRC2012_val_00009486.JPEG n02666196/
+mv val/ILSVRC2012_val_00009487.JPEG n02966687/
+mv val/ILSVRC2012_val_00009488.JPEG n02097658/
+mv val/ILSVRC2012_val_00009489.JPEG n03866082/
+mv val/ILSVRC2012_val_00009490.JPEG n04239074/
+mv val/ILSVRC2012_val_00009491.JPEG n02488702/
+mv val/ILSVRC2012_val_00009492.JPEG n01735189/
+mv val/ILSVRC2012_val_00009493.JPEG n04090263/
+mv val/ILSVRC2012_val_00009494.JPEG n04008634/
+mv val/ILSVRC2012_val_00009495.JPEG n03742115/
+mv val/ILSVRC2012_val_00009496.JPEG n03877472/
+mv val/ILSVRC2012_val_00009497.JPEG n03788195/
+mv val/ILSVRC2012_val_00009498.JPEG n03794056/
+mv val/ILSVRC2012_val_00009499.JPEG n01768244/
+mv val/ILSVRC2012_val_00009500.JPEG n02797295/
+mv val/ILSVRC2012_val_00009501.JPEG n02009229/
+mv val/ILSVRC2012_val_00009502.JPEG n03085013/
+mv val/ILSVRC2012_val_00009503.JPEG n02119789/
+mv val/ILSVRC2012_val_00009504.JPEG n04557648/
+mv val/ILSVRC2012_val_00009505.JPEG n02099267/
+mv val/ILSVRC2012_val_00009506.JPEG n03424325/
+mv val/ILSVRC2012_val_00009507.JPEG n03666591/
+mv val/ILSVRC2012_val_00009508.JPEG n01667778/
+mv val/ILSVRC2012_val_00009509.JPEG n07875152/
+mv val/ILSVRC2012_val_00009510.JPEG n01514668/
+mv val/ILSVRC2012_val_00009511.JPEG n02492660/
+mv val/ILSVRC2012_val_00009512.JPEG n03482405/
+mv val/ILSVRC2012_val_00009513.JPEG n04033901/
+mv val/ILSVRC2012_val_00009514.JPEG n04044716/
+mv val/ILSVRC2012_val_00009515.JPEG n03290653/
+mv val/ILSVRC2012_val_00009516.JPEG n12057211/
+mv val/ILSVRC2012_val_00009517.JPEG n02981792/
+mv val/ILSVRC2012_val_00009518.JPEG n01496331/
+mv val/ILSVRC2012_val_00009519.JPEG n02483362/
+mv val/ILSVRC2012_val_00009520.JPEG n03314780/
+mv val/ILSVRC2012_val_00009521.JPEG n04099969/
+mv val/ILSVRC2012_val_00009522.JPEG n02669723/
+mv val/ILSVRC2012_val_00009523.JPEG n02113799/
+mv val/ILSVRC2012_val_00009524.JPEG n02074367/
+mv val/ILSVRC2012_val_00009525.JPEG n02094258/
+mv val/ILSVRC2012_val_00009526.JPEG n03866082/
+mv val/ILSVRC2012_val_00009527.JPEG n04540053/
+mv val/ILSVRC2012_val_00009528.JPEG n02777292/
+mv val/ILSVRC2012_val_00009529.JPEG n03782006/
+mv val/ILSVRC2012_val_00009530.JPEG n02105251/
+mv val/ILSVRC2012_val_00009531.JPEG n03761084/
+mv val/ILSVRC2012_val_00009532.JPEG n01955084/
+mv val/ILSVRC2012_val_00009533.JPEG n02643566/
+mv val/ILSVRC2012_val_00009534.JPEG n02106662/
+mv val/ILSVRC2012_val_00009535.JPEG n01580077/
+mv val/ILSVRC2012_val_00009536.JPEG n01828970/
+mv val/ILSVRC2012_val_00009537.JPEG n02690373/
+mv val/ILSVRC2012_val_00009538.JPEG n03063599/
+mv val/ILSVRC2012_val_00009539.JPEG n02114548/
+mv val/ILSVRC2012_val_00009540.JPEG n03014705/
+mv val/ILSVRC2012_val_00009541.JPEG n03724870/
+mv val/ILSVRC2012_val_00009542.JPEG n02088364/
+mv val/ILSVRC2012_val_00009543.JPEG n07716358/
+mv val/ILSVRC2012_val_00009544.JPEG n03724870/
+mv val/ILSVRC2012_val_00009545.JPEG n03937543/
+mv val/ILSVRC2012_val_00009546.JPEG n02091635/
+mv val/ILSVRC2012_val_00009547.JPEG n02106382/
+mv val/ILSVRC2012_val_00009548.JPEG n07613480/
+mv val/ILSVRC2012_val_00009549.JPEG n13133613/
+mv val/ILSVRC2012_val_00009550.JPEG n04591157/
+mv val/ILSVRC2012_val_00009551.JPEG n02396427/
+mv val/ILSVRC2012_val_00009552.JPEG n03776460/
+mv val/ILSVRC2012_val_00009553.JPEG n02108089/
+mv val/ILSVRC2012_val_00009554.JPEG n02017213/
+mv val/ILSVRC2012_val_00009555.JPEG n04350905/
+mv val/ILSVRC2012_val_00009556.JPEG n02107683/
+mv val/ILSVRC2012_val_00009557.JPEG n04228054/
+mv val/ILSVRC2012_val_00009558.JPEG n01773549/
+mv val/ILSVRC2012_val_00009559.JPEG n03888257/
+mv val/ILSVRC2012_val_00009560.JPEG n02488291/
+mv val/ILSVRC2012_val_00009561.JPEG n04493381/
+mv val/ILSVRC2012_val_00009562.JPEG n01817953/
+mv val/ILSVRC2012_val_00009563.JPEG n01641577/
+mv val/ILSVRC2012_val_00009564.JPEG n02012849/
+mv val/ILSVRC2012_val_00009565.JPEG n01797886/
+mv val/ILSVRC2012_val_00009566.JPEG n02787622/
+mv val/ILSVRC2012_val_00009567.JPEG n02910353/
+mv val/ILSVRC2012_val_00009568.JPEG n04067472/
+mv val/ILSVRC2012_val_00009569.JPEG n03100240/
+mv val/ILSVRC2012_val_00009570.JPEG n02087046/
+mv val/ILSVRC2012_val_00009571.JPEG n03733131/
+mv val/ILSVRC2012_val_00009572.JPEG n02643566/
+mv val/ILSVRC2012_val_00009573.JPEG n02916936/
+mv val/ILSVRC2012_val_00009574.JPEG n02480495/
+mv val/ILSVRC2012_val_00009575.JPEG n02815834/
+mv val/ILSVRC2012_val_00009576.JPEG n02086079/
+mv val/ILSVRC2012_val_00009577.JPEG n02814860/
+mv val/ILSVRC2012_val_00009578.JPEG n02114712/
+mv val/ILSVRC2012_val_00009579.JPEG n07742313/
+mv val/ILSVRC2012_val_00009580.JPEG n01728920/
+mv val/ILSVRC2012_val_00009581.JPEG n02356798/
+mv val/ILSVRC2012_val_00009582.JPEG n13044778/
+mv val/ILSVRC2012_val_00009583.JPEG n01798484/
+mv val/ILSVRC2012_val_00009584.JPEG n04613696/
+mv val/ILSVRC2012_val_00009585.JPEG n02108915/
+mv val/ILSVRC2012_val_00009586.JPEG n02109047/
+mv val/ILSVRC2012_val_00009587.JPEG n03272010/
+mv val/ILSVRC2012_val_00009588.JPEG n04008634/
+mv val/ILSVRC2012_val_00009589.JPEG n02097209/
+mv val/ILSVRC2012_val_00009590.JPEG n01843065/
+mv val/ILSVRC2012_val_00009591.JPEG n02999410/
+mv val/ILSVRC2012_val_00009592.JPEG n04086273/
+mv val/ILSVRC2012_val_00009593.JPEG n03888257/
+mv val/ILSVRC2012_val_00009594.JPEG n02123394/
+mv val/ILSVRC2012_val_00009595.JPEG n04356056/
+mv val/ILSVRC2012_val_00009596.JPEG n09468604/
+mv val/ILSVRC2012_val_00009597.JPEG n01601694/
+mv val/ILSVRC2012_val_00009598.JPEG n03950228/
+mv val/ILSVRC2012_val_00009599.JPEG n04344873/
+mv val/ILSVRC2012_val_00009600.JPEG n02672831/
+mv val/ILSVRC2012_val_00009601.JPEG n12768682/
+mv val/ILSVRC2012_val_00009602.JPEG n02110341/
+mv val/ILSVRC2012_val_00009603.JPEG n10148035/
+mv val/ILSVRC2012_val_00009604.JPEG n02114367/
+mv val/ILSVRC2012_val_00009605.JPEG n04409515/
+mv val/ILSVRC2012_val_00009606.JPEG n03240683/
+mv val/ILSVRC2012_val_00009607.JPEG n04285008/
+mv val/ILSVRC2012_val_00009608.JPEG n07831146/
+mv val/ILSVRC2012_val_00009609.JPEG n03584254/
+mv val/ILSVRC2012_val_00009610.JPEG n01855672/
+mv val/ILSVRC2012_val_00009611.JPEG n02489166/
+mv val/ILSVRC2012_val_00009612.JPEG n03216828/
+mv val/ILSVRC2012_val_00009613.JPEG n03297495/
+mv val/ILSVRC2012_val_00009614.JPEG n04086273/
+mv val/ILSVRC2012_val_00009615.JPEG n01514859/
+mv val/ILSVRC2012_val_00009616.JPEG n01629819/
+mv val/ILSVRC2012_val_00009617.JPEG n02643566/
+mv val/ILSVRC2012_val_00009618.JPEG n02113023/
+mv val/ILSVRC2012_val_00009619.JPEG n02791270/
+mv val/ILSVRC2012_val_00009620.JPEG n03983396/
+mv val/ILSVRC2012_val_00009621.JPEG n07880968/
+mv val/ILSVRC2012_val_00009622.JPEG n02268853/
+mv val/ILSVRC2012_val_00009623.JPEG n03970156/
+mv val/ILSVRC2012_val_00009624.JPEG n02091831/
+mv val/ILSVRC2012_val_00009625.JPEG n02268853/
+mv val/ILSVRC2012_val_00009626.JPEG n02167151/
+mv val/ILSVRC2012_val_00009627.JPEG n03742115/
+mv val/ILSVRC2012_val_00009628.JPEG n03947888/
+mv val/ILSVRC2012_val_00009629.JPEG n04591157/
+mv val/ILSVRC2012_val_00009630.JPEG n03729826/
+mv val/ILSVRC2012_val_00009631.JPEG n02988304/
+mv val/ILSVRC2012_val_00009632.JPEG n03717622/
+mv val/ILSVRC2012_val_00009633.JPEG n02391049/
+mv val/ILSVRC2012_val_00009634.JPEG n02096585/
+mv val/ILSVRC2012_val_00009635.JPEG n02219486/
+mv val/ILSVRC2012_val_00009636.JPEG n02093647/
+mv val/ILSVRC2012_val_00009637.JPEG n02002556/
+mv val/ILSVRC2012_val_00009638.JPEG n02504458/
+mv val/ILSVRC2012_val_00009639.JPEG n01665541/
+mv val/ILSVRC2012_val_00009640.JPEG n03938244/
+mv val/ILSVRC2012_val_00009641.JPEG n03776460/
+mv val/ILSVRC2012_val_00009642.JPEG n02093256/
+mv val/ILSVRC2012_val_00009643.JPEG n02056570/
+mv val/ILSVRC2012_val_00009644.JPEG n02096051/
+mv val/ILSVRC2012_val_00009645.JPEG n02488702/
+mv val/ILSVRC2012_val_00009646.JPEG n07693725/
+mv val/ILSVRC2012_val_00009647.JPEG n01796340/
+mv val/ILSVRC2012_val_00009648.JPEG n02950826/
+mv val/ILSVRC2012_val_00009649.JPEG n01828970/
+mv val/ILSVRC2012_val_00009650.JPEG n03534580/
+mv val/ILSVRC2012_val_00009651.JPEG n03394916/
+mv val/ILSVRC2012_val_00009652.JPEG n04404412/
+mv val/ILSVRC2012_val_00009653.JPEG n03895866/
+mv val/ILSVRC2012_val_00009654.JPEG n01944390/
+mv val/ILSVRC2012_val_00009655.JPEG n04554684/
+mv val/ILSVRC2012_val_00009656.JPEG n02444819/
+mv val/ILSVRC2012_val_00009657.JPEG n03623198/
+mv val/ILSVRC2012_val_00009658.JPEG n04263257/
+mv val/ILSVRC2012_val_00009659.JPEG n04099969/
+mv val/ILSVRC2012_val_00009660.JPEG n02105855/
+mv val/ILSVRC2012_val_00009661.JPEG n03584829/
+mv val/ILSVRC2012_val_00009662.JPEG n04442312/
+mv val/ILSVRC2012_val_00009663.JPEG n01514668/
+mv val/ILSVRC2012_val_00009664.JPEG n02088364/
+mv val/ILSVRC2012_val_00009665.JPEG n01943899/
+mv val/ILSVRC2012_val_00009666.JPEG n02091831/
+mv val/ILSVRC2012_val_00009667.JPEG n02071294/
+mv val/ILSVRC2012_val_00009668.JPEG n03461385/
+mv val/ILSVRC2012_val_00009669.JPEG n04485082/
+mv val/ILSVRC2012_val_00009670.JPEG n01630670/
+mv val/ILSVRC2012_val_00009671.JPEG n01873310/
+mv val/ILSVRC2012_val_00009672.JPEG n02011460/
+mv val/ILSVRC2012_val_00009673.JPEG n02113978/
+mv val/ILSVRC2012_val_00009674.JPEG n01629819/
+mv val/ILSVRC2012_val_00009675.JPEG n07711569/
+mv val/ILSVRC2012_val_00009676.JPEG n04023962/
+mv val/ILSVRC2012_val_00009677.JPEG n01631663/
+mv val/ILSVRC2012_val_00009678.JPEG n02815834/
+mv val/ILSVRC2012_val_00009679.JPEG n01797886/
+mv val/ILSVRC2012_val_00009680.JPEG n03662601/
+mv val/ILSVRC2012_val_00009681.JPEG n02704792/
+mv val/ILSVRC2012_val_00009682.JPEG n02494079/
+mv val/ILSVRC2012_val_00009683.JPEG n02124075/
+mv val/ILSVRC2012_val_00009684.JPEG n03530642/
+mv val/ILSVRC2012_val_00009685.JPEG n03424325/
+mv val/ILSVRC2012_val_00009686.JPEG n02974003/
+mv val/ILSVRC2012_val_00009687.JPEG n01685808/
+mv val/ILSVRC2012_val_00009688.JPEG n02086910/
+mv val/ILSVRC2012_val_00009689.JPEG n04004767/
+mv val/ILSVRC2012_val_00009690.JPEG n03720891/
+mv val/ILSVRC2012_val_00009691.JPEG n04200800/
+mv val/ILSVRC2012_val_00009692.JPEG n01755581/
+mv val/ILSVRC2012_val_00009693.JPEG n04118776/
+mv val/ILSVRC2012_val_00009694.JPEG n02058221/
+mv val/ILSVRC2012_val_00009695.JPEG n03124170/
+mv val/ILSVRC2012_val_00009696.JPEG n03584829/
+mv val/ILSVRC2012_val_00009697.JPEG n01978455/
+mv val/ILSVRC2012_val_00009698.JPEG n02100583/
+mv val/ILSVRC2012_val_00009699.JPEG n03131574/
+mv val/ILSVRC2012_val_00009700.JPEG n03467068/
+mv val/ILSVRC2012_val_00009701.JPEG n02490219/
+mv val/ILSVRC2012_val_00009702.JPEG n02978881/
+mv val/ILSVRC2012_val_00009703.JPEG n02096051/
+mv val/ILSVRC2012_val_00009704.JPEG n04254120/
+mv val/ILSVRC2012_val_00009705.JPEG n03028079/
+mv val/ILSVRC2012_val_00009706.JPEG n04371774/
+mv val/ILSVRC2012_val_00009707.JPEG n02105641/
+mv val/ILSVRC2012_val_00009708.JPEG n02397096/
+mv val/ILSVRC2012_val_00009709.JPEG n04258138/
+mv val/ILSVRC2012_val_00009710.JPEG n03297495/
+mv val/ILSVRC2012_val_00009711.JPEG n02108000/
+mv val/ILSVRC2012_val_00009712.JPEG n02096585/
+mv val/ILSVRC2012_val_00009713.JPEG n02090721/
+mv val/ILSVRC2012_val_00009714.JPEG n02786058/
+mv val/ILSVRC2012_val_00009715.JPEG n02025239/
+mv val/ILSVRC2012_val_00009716.JPEG n01784675/
+mv val/ILSVRC2012_val_00009717.JPEG n03393912/
+mv val/ILSVRC2012_val_00009718.JPEG n01755581/
+mv val/ILSVRC2012_val_00009719.JPEG n02437616/
+mv val/ILSVRC2012_val_00009720.JPEG n02219486/
+mv val/ILSVRC2012_val_00009721.JPEG n03388549/
+mv val/ILSVRC2012_val_00009722.JPEG n02769748/
+mv val/ILSVRC2012_val_00009723.JPEG n03384352/
+mv val/ILSVRC2012_val_00009724.JPEG n03998194/
+mv val/ILSVRC2012_val_00009725.JPEG n02699494/
+mv val/ILSVRC2012_val_00009726.JPEG n04277352/
+mv val/ILSVRC2012_val_00009727.JPEG n03637318/
+mv val/ILSVRC2012_val_00009728.JPEG n02415577/
+mv val/ILSVRC2012_val_00009729.JPEG n03788365/
+mv val/ILSVRC2012_val_00009730.JPEG n01943899/
+mv val/ILSVRC2012_val_00009731.JPEG n02009229/
+mv val/ILSVRC2012_val_00009732.JPEG n04325704/
+mv val/ILSVRC2012_val_00009733.JPEG n04532670/
+mv val/ILSVRC2012_val_00009734.JPEG n01498041/
+mv val/ILSVRC2012_val_00009735.JPEG n03793489/
+mv val/ILSVRC2012_val_00009736.JPEG n04141076/
+mv val/ILSVRC2012_val_00009737.JPEG n04525038/
+mv val/ILSVRC2012_val_00009738.JPEG n04548362/
+mv val/ILSVRC2012_val_00009739.JPEG n02012849/
+mv val/ILSVRC2012_val_00009740.JPEG n02093754/
+mv val/ILSVRC2012_val_00009741.JPEG n03534580/
+mv val/ILSVRC2012_val_00009742.JPEG n04532670/
+mv val/ILSVRC2012_val_00009743.JPEG n02859443/
+mv val/ILSVRC2012_val_00009744.JPEG n02027492/
+mv val/ILSVRC2012_val_00009745.JPEG n04070727/
+mv val/ILSVRC2012_val_00009746.JPEG n03673027/
+mv val/ILSVRC2012_val_00009747.JPEG n11879895/
+mv val/ILSVRC2012_val_00009748.JPEG n02643566/
+mv val/ILSVRC2012_val_00009749.JPEG n04606251/
+mv val/ILSVRC2012_val_00009750.JPEG n04613696/
+mv val/ILSVRC2012_val_00009751.JPEG n03680355/
+mv val/ILSVRC2012_val_00009752.JPEG n01860187/
+mv val/ILSVRC2012_val_00009753.JPEG n04251144/
+mv val/ILSVRC2012_val_00009754.JPEG n01739381/
+mv val/ILSVRC2012_val_00009755.JPEG n02098413/
+mv val/ILSVRC2012_val_00009756.JPEG n04019541/
+mv val/ILSVRC2012_val_00009757.JPEG n02101556/
+mv val/ILSVRC2012_val_00009758.JPEG n03201208/
+mv val/ILSVRC2012_val_00009759.JPEG n04532106/
+mv val/ILSVRC2012_val_00009760.JPEG n02879718/
+mv val/ILSVRC2012_val_00009761.JPEG n02951585/
+mv val/ILSVRC2012_val_00009762.JPEG n04604644/
+mv val/ILSVRC2012_val_00009763.JPEG n04275548/
+mv val/ILSVRC2012_val_00009764.JPEG n02097474/
+mv val/ILSVRC2012_val_00009765.JPEG n03482405/
+mv val/ILSVRC2012_val_00009766.JPEG n07734744/
+mv val/ILSVRC2012_val_00009767.JPEG n03868242/
+mv val/ILSVRC2012_val_00009768.JPEG n04332243/
+mv val/ILSVRC2012_val_00009769.JPEG n04589890/
+mv val/ILSVRC2012_val_00009770.JPEG n03788365/
+mv val/ILSVRC2012_val_00009771.JPEG n03649909/
+mv val/ILSVRC2012_val_00009772.JPEG n02090721/
+mv val/ILSVRC2012_val_00009773.JPEG n02672831/
+mv val/ILSVRC2012_val_00009774.JPEG n02109525/
+mv val/ILSVRC2012_val_00009775.JPEG n02112018/
+mv val/ILSVRC2012_val_00009776.JPEG n07615774/
+mv val/ILSVRC2012_val_00009777.JPEG n02102480/
+mv val/ILSVRC2012_val_00009778.JPEG n03125729/
+mv val/ILSVRC2012_val_00009779.JPEG n01632458/
+mv val/ILSVRC2012_val_00009780.JPEG n04252225/
+mv val/ILSVRC2012_val_00009781.JPEG n01824575/
+mv val/ILSVRC2012_val_00009782.JPEG n02666196/
+mv val/ILSVRC2012_val_00009783.JPEG n03832673/
+mv val/ILSVRC2012_val_00009784.JPEG n02105641/
+mv val/ILSVRC2012_val_00009785.JPEG n07768694/
+mv val/ILSVRC2012_val_00009786.JPEG n03871628/
+mv val/ILSVRC2012_val_00009787.JPEG n03127925/
+mv val/ILSVRC2012_val_00009788.JPEG n03344393/
+mv val/ILSVRC2012_val_00009789.JPEG n02096177/
+mv val/ILSVRC2012_val_00009790.JPEG n03887697/
+mv val/ILSVRC2012_val_00009791.JPEG n03424325/
+mv val/ILSVRC2012_val_00009792.JPEG n03014705/
+mv val/ILSVRC2012_val_00009793.JPEG n03796401/
+mv val/ILSVRC2012_val_00009794.JPEG n03617480/
+mv val/ILSVRC2012_val_00009795.JPEG n04065272/
+mv val/ILSVRC2012_val_00009796.JPEG n03982430/
+mv val/ILSVRC2012_val_00009797.JPEG n04479046/
+mv val/ILSVRC2012_val_00009798.JPEG n03763968/
+mv val/ILSVRC2012_val_00009799.JPEG n02486410/
+mv val/ILSVRC2012_val_00009800.JPEG n07742313/
+mv val/ILSVRC2012_val_00009801.JPEG n02687172/
+mv val/ILSVRC2012_val_00009802.JPEG n03794056/
+mv val/ILSVRC2012_val_00009803.JPEG n04254680/
+mv val/ILSVRC2012_val_00009804.JPEG n03661043/
+mv val/ILSVRC2012_val_00009805.JPEG n02837789/
+mv val/ILSVRC2012_val_00009806.JPEG n02454379/
+mv val/ILSVRC2012_val_00009807.JPEG n01560419/
+mv val/ILSVRC2012_val_00009808.JPEG n04443257/
+mv val/ILSVRC2012_val_00009809.JPEG n07613480/
+mv val/ILSVRC2012_val_00009810.JPEG n02110806/
+mv val/ILSVRC2012_val_00009811.JPEG n01818515/
+mv val/ILSVRC2012_val_00009812.JPEG n02099712/
+mv val/ILSVRC2012_val_00009813.JPEG n03384352/
+mv val/ILSVRC2012_val_00009814.JPEG n04366367/
+mv val/ILSVRC2012_val_00009815.JPEG n03676483/
+mv val/ILSVRC2012_val_00009816.JPEG n02892767/
+mv val/ILSVRC2012_val_00009817.JPEG n02110627/
+mv val/ILSVRC2012_val_00009818.JPEG n02096294/
+mv val/ILSVRC2012_val_00009819.JPEG n01667778/
+mv val/ILSVRC2012_val_00009820.JPEG n02870880/
+mv val/ILSVRC2012_val_00009821.JPEG n03425413/
+mv val/ILSVRC2012_val_00009822.JPEG n01751748/
+mv val/ILSVRC2012_val_00009823.JPEG n04275548/
+mv val/ILSVRC2012_val_00009824.JPEG n03187595/
+mv val/ILSVRC2012_val_00009825.JPEG n02437312/
+mv val/ILSVRC2012_val_00009826.JPEG n03623198/
+mv val/ILSVRC2012_val_00009827.JPEG n01796340/
+mv val/ILSVRC2012_val_00009828.JPEG n09472597/
+mv val/ILSVRC2012_val_00009829.JPEG n04523525/
+mv val/ILSVRC2012_val_00009830.JPEG n02486261/
+mv val/ILSVRC2012_val_00009831.JPEG n01531178/
+mv val/ILSVRC2012_val_00009832.JPEG n02493509/
+mv val/ILSVRC2012_val_00009833.JPEG n02979186/
+mv val/ILSVRC2012_val_00009834.JPEG n03584829/
+mv val/ILSVRC2012_val_00009835.JPEG n03924679/
+mv val/ILSVRC2012_val_00009836.JPEG n02099601/
+mv val/ILSVRC2012_val_00009837.JPEG n03259280/
+mv val/ILSVRC2012_val_00009838.JPEG n04229816/
+mv val/ILSVRC2012_val_00009839.JPEG n01872401/
+mv val/ILSVRC2012_val_00009840.JPEG n04579432/
+mv val/ILSVRC2012_val_00009841.JPEG n01855672/
+mv val/ILSVRC2012_val_00009842.JPEG n01622779/
+mv val/ILSVRC2012_val_00009843.JPEG n02509815/
+mv val/ILSVRC2012_val_00009844.JPEG n04525305/
+mv val/ILSVRC2012_val_00009845.JPEG n04131690/
+mv val/ILSVRC2012_val_00009846.JPEG n02484975/
+mv val/ILSVRC2012_val_00009847.JPEG n09193705/
+mv val/ILSVRC2012_val_00009848.JPEG n02097658/
+mv val/ILSVRC2012_val_00009849.JPEG n02877765/
+mv val/ILSVRC2012_val_00009850.JPEG n02749479/
+mv val/ILSVRC2012_val_00009851.JPEG n06596364/
+mv val/ILSVRC2012_val_00009852.JPEG n01806567/
+mv val/ILSVRC2012_val_00009853.JPEG n02093428/
+mv val/ILSVRC2012_val_00009854.JPEG n01773157/
+mv val/ILSVRC2012_val_00009855.JPEG n03207941/
+mv val/ILSVRC2012_val_00009856.JPEG n03947888/
+mv val/ILSVRC2012_val_00009857.JPEG n01818515/
+mv val/ILSVRC2012_val_00009858.JPEG n02092339/
+mv val/ILSVRC2012_val_00009859.JPEG n02276258/
+mv val/ILSVRC2012_val_00009860.JPEG n03207743/
+mv val/ILSVRC2012_val_00009861.JPEG n02794156/
+mv val/ILSVRC2012_val_00009862.JPEG n02106166/
+mv val/ILSVRC2012_val_00009863.JPEG n03529860/
+mv val/ILSVRC2012_val_00009864.JPEG n04493381/
+mv val/ILSVRC2012_val_00009865.JPEG n02086079/
+mv val/ILSVRC2012_val_00009866.JPEG n02011460/
+mv val/ILSVRC2012_val_00009867.JPEG n03961711/
+mv val/ILSVRC2012_val_00009868.JPEG n03680355/
+mv val/ILSVRC2012_val_00009869.JPEG n04263257/
+mv val/ILSVRC2012_val_00009870.JPEG n01819313/
+mv val/ILSVRC2012_val_00009871.JPEG n02102177/
+mv val/ILSVRC2012_val_00009872.JPEG n04254120/
+mv val/ILSVRC2012_val_00009873.JPEG n03888257/
+mv val/ILSVRC2012_val_00009874.JPEG n03729826/
+mv val/ILSVRC2012_val_00009875.JPEG n04136333/
+mv val/ILSVRC2012_val_00009876.JPEG n04346328/
+mv val/ILSVRC2012_val_00009877.JPEG n02107908/
+mv val/ILSVRC2012_val_00009878.JPEG n02447366/
+mv val/ILSVRC2012_val_00009879.JPEG n03125729/
+mv val/ILSVRC2012_val_00009880.JPEG n03476684/
+mv val/ILSVRC2012_val_00009881.JPEG n02443114/
+mv val/ILSVRC2012_val_00009882.JPEG n03788195/
+mv val/ILSVRC2012_val_00009883.JPEG n03710637/
+mv val/ILSVRC2012_val_00009884.JPEG n03657121/
+mv val/ILSVRC2012_val_00009885.JPEG n03633091/
+mv val/ILSVRC2012_val_00009886.JPEG n03141823/
+mv val/ILSVRC2012_val_00009887.JPEG n07802026/
+mv val/ILSVRC2012_val_00009888.JPEG n02113978/
+mv val/ILSVRC2012_val_00009889.JPEG n01665541/
+mv val/ILSVRC2012_val_00009890.JPEG n01744401/
+mv val/ILSVRC2012_val_00009891.JPEG n02834397/
+mv val/ILSVRC2012_val_00009892.JPEG n03633091/
+mv val/ILSVRC2012_val_00009893.JPEG n04335435/
+mv val/ILSVRC2012_val_00009894.JPEG n02011460/
+mv val/ILSVRC2012_val_00009895.JPEG n02099712/
+mv val/ILSVRC2012_val_00009896.JPEG n03527444/
+mv val/ILSVRC2012_val_00009897.JPEG n03180011/
+mv val/ILSVRC2012_val_00009898.JPEG n02408429/
+mv val/ILSVRC2012_val_00009899.JPEG n02123394/
+mv val/ILSVRC2012_val_00009900.JPEG n03980874/
+mv val/ILSVRC2012_val_00009901.JPEG n04070727/
+mv val/ILSVRC2012_val_00009902.JPEG n03445777/
+mv val/ILSVRC2012_val_00009903.JPEG n04465501/
+mv val/ILSVRC2012_val_00009904.JPEG n03530642/
+mv val/ILSVRC2012_val_00009905.JPEG n03291819/
+mv val/ILSVRC2012_val_00009906.JPEG n04252077/
+mv val/ILSVRC2012_val_00009907.JPEG n01689811/
+mv val/ILSVRC2012_val_00009908.JPEG n02058221/
+mv val/ILSVRC2012_val_00009909.JPEG n02112137/
+mv val/ILSVRC2012_val_00009910.JPEG n01950731/
+mv val/ILSVRC2012_val_00009911.JPEG n01682714/
+mv val/ILSVRC2012_val_00009912.JPEG n02231487/
+mv val/ILSVRC2012_val_00009913.JPEG n07684084/
+mv val/ILSVRC2012_val_00009914.JPEG n03481172/
+mv val/ILSVRC2012_val_00009915.JPEG n02963159/
+mv val/ILSVRC2012_val_00009916.JPEG n07768694/
+mv val/ILSVRC2012_val_00009917.JPEG n03977966/
+mv val/ILSVRC2012_val_00009918.JPEG n02165456/
+mv val/ILSVRC2012_val_00009919.JPEG n02939185/
+mv val/ILSVRC2012_val_00009920.JPEG n04258138/
+mv val/ILSVRC2012_val_00009921.JPEG n02123045/
+mv val/ILSVRC2012_val_00009922.JPEG n02128757/
+mv val/ILSVRC2012_val_00009923.JPEG n02037110/
+mv val/ILSVRC2012_val_00009924.JPEG n02128925/
+mv val/ILSVRC2012_val_00009925.JPEG n02483362/
+mv val/ILSVRC2012_val_00009926.JPEG n03483316/
+mv val/ILSVRC2012_val_00009927.JPEG n04273569/
+mv val/ILSVRC2012_val_00009928.JPEG n04208210/
+mv val/ILSVRC2012_val_00009929.JPEG n03942813/
+mv val/ILSVRC2012_val_00009930.JPEG n03291819/
+mv val/ILSVRC2012_val_00009931.JPEG n03467068/
+mv val/ILSVRC2012_val_00009932.JPEG n02091467/
+mv val/ILSVRC2012_val_00009933.JPEG n02113624/
+mv val/ILSVRC2012_val_00009934.JPEG n03950228/
+mv val/ILSVRC2012_val_00009935.JPEG n03786901/
+mv val/ILSVRC2012_val_00009936.JPEG n04228054/
+mv val/ILSVRC2012_val_00009937.JPEG n03649909/
+mv val/ILSVRC2012_val_00009938.JPEG n01629819/
+mv val/ILSVRC2012_val_00009939.JPEG n02104365/
+mv val/ILSVRC2012_val_00009940.JPEG n02865351/
+mv val/ILSVRC2012_val_00009941.JPEG n02097047/
+mv val/ILSVRC2012_val_00009942.JPEG n03902125/
+mv val/ILSVRC2012_val_00009943.JPEG n02231487/
+mv val/ILSVRC2012_val_00009944.JPEG n04033995/
+mv val/ILSVRC2012_val_00009945.JPEG n02172182/
+mv val/ILSVRC2012_val_00009946.JPEG n01632777/
+mv val/ILSVRC2012_val_00009947.JPEG n02494079/
+mv val/ILSVRC2012_val_00009948.JPEG n02391049/
+mv val/ILSVRC2012_val_00009949.JPEG n02093256/
+mv val/ILSVRC2012_val_00009950.JPEG n03992509/
+mv val/ILSVRC2012_val_00009951.JPEG n03710721/
+mv val/ILSVRC2012_val_00009952.JPEG n03272010/
+mv val/ILSVRC2012_val_00009953.JPEG n03124043/
+mv val/ILSVRC2012_val_00009954.JPEG n02422699/
+mv val/ILSVRC2012_val_00009955.JPEG n02492035/
+mv val/ILSVRC2012_val_00009956.JPEG n02410509/
+mv val/ILSVRC2012_val_00009957.JPEG n04120489/
+mv val/ILSVRC2012_val_00009958.JPEG n02793495/
+mv val/ILSVRC2012_val_00009959.JPEG n03594734/
+mv val/ILSVRC2012_val_00009960.JPEG n03841143/
+mv val/ILSVRC2012_val_00009961.JPEG n03124043/
+mv val/ILSVRC2012_val_00009962.JPEG n04265275/
+mv val/ILSVRC2012_val_00009963.JPEG n02088466/
+mv val/ILSVRC2012_val_00009964.JPEG n02123159/
+mv val/ILSVRC2012_val_00009965.JPEG n03461385/
+mv val/ILSVRC2012_val_00009966.JPEG n01675722/
+mv val/ILSVRC2012_val_00009967.JPEG n02965783/
+mv val/ILSVRC2012_val_00009968.JPEG n07753113/
+mv val/ILSVRC2012_val_00009969.JPEG n07614500/
+mv val/ILSVRC2012_val_00009970.JPEG n04154565/
+mv val/ILSVRC2012_val_00009971.JPEG n03590841/
+mv val/ILSVRC2012_val_00009972.JPEG n02361337/
+mv val/ILSVRC2012_val_00009973.JPEG n07720875/
+mv val/ILSVRC2012_val_00009974.JPEG n01843383/
+mv val/ILSVRC2012_val_00009975.JPEG n04162706/
+mv val/ILSVRC2012_val_00009976.JPEG n02134418/
+mv val/ILSVRC2012_val_00009977.JPEG n03271574/
+mv val/ILSVRC2012_val_00009978.JPEG n01494475/
+mv val/ILSVRC2012_val_00009979.JPEG n01729977/
+mv val/ILSVRC2012_val_00009980.JPEG n01689811/
+mv val/ILSVRC2012_val_00009981.JPEG n01582220/
+mv val/ILSVRC2012_val_00009982.JPEG n02655020/
+mv val/ILSVRC2012_val_00009983.JPEG n03594945/
+mv val/ILSVRC2012_val_00009984.JPEG n02099712/
+mv val/ILSVRC2012_val_00009985.JPEG n02110627/
+mv val/ILSVRC2012_val_00009986.JPEG n02441942/
+mv val/ILSVRC2012_val_00009987.JPEG n02791124/
+mv val/ILSVRC2012_val_00009988.JPEG n02007558/
+mv val/ILSVRC2012_val_00009989.JPEG n03891332/
+mv val/ILSVRC2012_val_00009990.JPEG n02791270/
+mv val/ILSVRC2012_val_00009991.JPEG n02037110/
+mv val/ILSVRC2012_val_00009992.JPEG n02127052/
+mv val/ILSVRC2012_val_00009993.JPEG n01910747/
+mv val/ILSVRC2012_val_00009994.JPEG n01829413/
+mv val/ILSVRC2012_val_00009995.JPEG n04523525/
+mv val/ILSVRC2012_val_00009996.JPEG n02417914/
+mv val/ILSVRC2012_val_00009997.JPEG n04465501/
+mv val/ILSVRC2012_val_00009998.JPEG n01860187/
+mv val/ILSVRC2012_val_00009999.JPEG n03935335/
+mv val/ILSVRC2012_val_00010000.JPEG n03908714/
+mv val/ILSVRC2012_val_00010001.JPEG n02018207/
+mv val/ILSVRC2012_val_00010002.JPEG n02006656/
+mv val/ILSVRC2012_val_00010003.JPEG n07802026/
+mv val/ILSVRC2012_val_00010004.JPEG n03950228/
+mv val/ILSVRC2012_val_00010005.JPEG n07590611/
+mv val/ILSVRC2012_val_00010006.JPEG n02092002/
+mv val/ILSVRC2012_val_00010007.JPEG n04423845/
+mv val/ILSVRC2012_val_00010008.JPEG n02790996/
+mv val/ILSVRC2012_val_00010009.JPEG n04252225/
+mv val/ILSVRC2012_val_00010010.JPEG n03666591/
+mv val/ILSVRC2012_val_00010011.JPEG n02109961/
+mv val/ILSVRC2012_val_00010012.JPEG n03930630/
+mv val/ILSVRC2012_val_00010013.JPEG n02860847/
+mv val/ILSVRC2012_val_00010014.JPEG n04552348/
+mv val/ILSVRC2012_val_00010015.JPEG n02092339/
+mv val/ILSVRC2012_val_00010016.JPEG n09229709/
+mv val/ILSVRC2012_val_00010017.JPEG n02791270/
+mv val/ILSVRC2012_val_00010018.JPEG n07579787/
+mv val/ILSVRC2012_val_00010019.JPEG n03196217/
+mv val/ILSVRC2012_val_00010020.JPEG n02500267/
+mv val/ILSVRC2012_val_00010021.JPEG n02790996/
+mv val/ILSVRC2012_val_00010022.JPEG n01622779/
+mv val/ILSVRC2012_val_00010023.JPEG n02484975/
+mv val/ILSVRC2012_val_00010024.JPEG n02669723/
+mv val/ILSVRC2012_val_00010025.JPEG n02280649/
+mv val/ILSVRC2012_val_00010026.JPEG n11879895/
+mv val/ILSVRC2012_val_00010027.JPEG n03769881/
+mv val/ILSVRC2012_val_00010028.JPEG n02167151/
+mv val/ILSVRC2012_val_00010029.JPEG n02403003/
+mv val/ILSVRC2012_val_00010030.JPEG n03717622/
+mv val/ILSVRC2012_val_00010031.JPEG n02093991/
+mv val/ILSVRC2012_val_00010032.JPEG n03942813/
+mv val/ILSVRC2012_val_00010033.JPEG n04254680/
+mv val/ILSVRC2012_val_00010034.JPEG n04443257/
+mv val/ILSVRC2012_val_00010035.JPEG n01860187/
+mv val/ILSVRC2012_val_00010036.JPEG n09229709/
+mv val/ILSVRC2012_val_00010037.JPEG n02028035/
+mv val/ILSVRC2012_val_00010038.JPEG n02087394/
+mv val/ILSVRC2012_val_00010039.JPEG n01986214/
+mv val/ILSVRC2012_val_00010040.JPEG n02115641/
+mv val/ILSVRC2012_val_00010041.JPEG n02640242/
+mv val/ILSVRC2012_val_00010042.JPEG n04328186/
+mv val/ILSVRC2012_val_00010043.JPEG n03908618/
+mv val/ILSVRC2012_val_00010044.JPEG n04154565/
+mv val/ILSVRC2012_val_00010045.JPEG n02797295/
+mv val/ILSVRC2012_val_00010046.JPEG n02097209/
+mv val/ILSVRC2012_val_00010047.JPEG n02125311/
+mv val/ILSVRC2012_val_00010048.JPEG n07932039/
+mv val/ILSVRC2012_val_00010049.JPEG n02102973/
+mv val/ILSVRC2012_val_00010050.JPEG n03529860/
+mv val/ILSVRC2012_val_00010051.JPEG n01980166/
+mv val/ILSVRC2012_val_00010052.JPEG n02443114/
+mv val/ILSVRC2012_val_00010053.JPEG n03733131/
+mv val/ILSVRC2012_val_00010054.JPEG n07718472/
+mv val/ILSVRC2012_val_00010055.JPEG n03255030/
+mv val/ILSVRC2012_val_00010056.JPEG n02009912/
+mv val/ILSVRC2012_val_00010057.JPEG n02087394/
+mv val/ILSVRC2012_val_00010058.JPEG n03218198/
+mv val/ILSVRC2012_val_00010059.JPEG n02106550/
+mv val/ILSVRC2012_val_00010060.JPEG n03888605/
+mv val/ILSVRC2012_val_00010061.JPEG n01704323/
+mv val/ILSVRC2012_val_00010062.JPEG n02091635/
+mv val/ILSVRC2012_val_00010063.JPEG n03710721/
+mv val/ILSVRC2012_val_00010064.JPEG n02325366/
+mv val/ILSVRC2012_val_00010065.JPEG n02112350/
+mv val/ILSVRC2012_val_00010066.JPEG n03207743/
+mv val/ILSVRC2012_val_00010067.JPEG n03980874/
+mv val/ILSVRC2012_val_00010068.JPEG n03042490/
+mv val/ILSVRC2012_val_00010069.JPEG n07590611/
+mv val/ILSVRC2012_val_00010070.JPEG n02096051/
+mv val/ILSVRC2012_val_00010071.JPEG n02408429/
+mv val/ILSVRC2012_val_00010072.JPEG n02091244/
+mv val/ILSVRC2012_val_00010073.JPEG n03773504/
+mv val/ILSVRC2012_val_00010074.JPEG n01491361/
+mv val/ILSVRC2012_val_00010075.JPEG n02120505/
+mv val/ILSVRC2012_val_00010076.JPEG n02607072/
+mv val/ILSVRC2012_val_00010077.JPEG n02487347/
+mv val/ILSVRC2012_val_00010078.JPEG n02504458/
+mv val/ILSVRC2012_val_00010079.JPEG n04204347/
+mv val/ILSVRC2012_val_00010080.JPEG n02037110/
+mv val/ILSVRC2012_val_00010081.JPEG n02790996/
+mv val/ILSVRC2012_val_00010082.JPEG n02107312/
+mv val/ILSVRC2012_val_00010083.JPEG n04044716/
+mv val/ILSVRC2012_val_00010084.JPEG n02002556/
+mv val/ILSVRC2012_val_00010085.JPEG n02727426/
+mv val/ILSVRC2012_val_00010086.JPEG n04606251/
+mv val/ILSVRC2012_val_00010087.JPEG n02091831/
+mv val/ILSVRC2012_val_00010088.JPEG n03598930/
+mv val/ILSVRC2012_val_00010089.JPEG n03089624/
+mv val/ILSVRC2012_val_00010090.JPEG n01807496/
+mv val/ILSVRC2012_val_00010091.JPEG n07613480/
+mv val/ILSVRC2012_val_00010092.JPEG n04404412/
+mv val/ILSVRC2012_val_00010093.JPEG n04542943/
+mv val/ILSVRC2012_val_00010094.JPEG n09229709/
+mv val/ILSVRC2012_val_00010095.JPEG n03467068/
+mv val/ILSVRC2012_val_00010096.JPEG n01943899/
+mv val/ILSVRC2012_val_00010097.JPEG n11939491/
+mv val/ILSVRC2012_val_00010098.JPEG n02086646/
+mv val/ILSVRC2012_val_00010099.JPEG n02095314/
+mv val/ILSVRC2012_val_00010100.JPEG n02328150/
+mv val/ILSVRC2012_val_00010101.JPEG n02992529/
+mv val/ILSVRC2012_val_00010102.JPEG n02281787/
+mv val/ILSVRC2012_val_00010103.JPEG n04008634/
+mv val/ILSVRC2012_val_00010104.JPEG n07697313/
+mv val/ILSVRC2012_val_00010105.JPEG n03347037/
+mv val/ILSVRC2012_val_00010106.JPEG n02012849/
+mv val/ILSVRC2012_val_00010107.JPEG n02099429/
+mv val/ILSVRC2012_val_00010108.JPEG n04179913/
+mv val/ILSVRC2012_val_00010109.JPEG n02106662/
+mv val/ILSVRC2012_val_00010110.JPEG n03841143/
+mv val/ILSVRC2012_val_00010111.JPEG n07768694/
+mv val/ILSVRC2012_val_00010112.JPEG n07880968/
+mv val/ILSVRC2012_val_00010113.JPEG n02111129/
+mv val/ILSVRC2012_val_00010114.JPEG n04456115/
+mv val/ILSVRC2012_val_00010115.JPEG n04330267/
+mv val/ILSVRC2012_val_00010116.JPEG n01629819/
+mv val/ILSVRC2012_val_00010117.JPEG n04146614/
+mv val/ILSVRC2012_val_00010118.JPEG n03710193/
+mv val/ILSVRC2012_val_00010119.JPEG n03250847/
+mv val/ILSVRC2012_val_00010120.JPEG n02808304/
+mv val/ILSVRC2012_val_00010121.JPEG n03018349/
+mv val/ILSVRC2012_val_00010122.JPEG n01943899/
+mv val/ILSVRC2012_val_00010123.JPEG n02398521/
+mv val/ILSVRC2012_val_00010124.JPEG n03388549/
+mv val/ILSVRC2012_val_00010125.JPEG n02097658/
+mv val/ILSVRC2012_val_00010126.JPEG n03529860/
+mv val/ILSVRC2012_val_00010127.JPEG n02782093/
+mv val/ILSVRC2012_val_00010128.JPEG n01592084/
+mv val/ILSVRC2012_val_00010129.JPEG n04311174/
+mv val/ILSVRC2012_val_00010130.JPEG n02823750/
+mv val/ILSVRC2012_val_00010131.JPEG n04067472/
+mv val/ILSVRC2012_val_00010132.JPEG n02422699/
+mv val/ILSVRC2012_val_00010133.JPEG n03832673/
+mv val/ILSVRC2012_val_00010134.JPEG n04367480/
+mv val/ILSVRC2012_val_00010135.JPEG n04557648/
+mv val/ILSVRC2012_val_00010136.JPEG n02051845/
+mv val/ILSVRC2012_val_00010137.JPEG n01882714/
+mv val/ILSVRC2012_val_00010138.JPEG n02012849/
+mv val/ILSVRC2012_val_00010139.JPEG n03796401/
+mv val/ILSVRC2012_val_00010140.JPEG n01735189/
+mv val/ILSVRC2012_val_00010141.JPEG n09256479/
+mv val/ILSVRC2012_val_00010142.JPEG n03529860/
+mv val/ILSVRC2012_val_00010143.JPEG n11939491/
+mv val/ILSVRC2012_val_00010144.JPEG n03673027/
+mv val/ILSVRC2012_val_00010145.JPEG n01669191/
+mv val/ILSVRC2012_val_00010146.JPEG n03742115/
+mv val/ILSVRC2012_val_00010147.JPEG n02692877/
+mv val/ILSVRC2012_val_00010148.JPEG n02328150/
+mv val/ILSVRC2012_val_00010149.JPEG n07715103/
+mv val/ILSVRC2012_val_00010150.JPEG n02268443/
+mv val/ILSVRC2012_val_00010151.JPEG n02268853/
+mv val/ILSVRC2012_val_00010152.JPEG n01770393/
+mv val/ILSVRC2012_val_00010153.JPEG n07718747/
+mv val/ILSVRC2012_val_00010154.JPEG n07714571/
+mv val/ILSVRC2012_val_00010155.JPEG n01695060/
+mv val/ILSVRC2012_val_00010156.JPEG n01843065/
+mv val/ILSVRC2012_val_00010157.JPEG n03404251/
+mv val/ILSVRC2012_val_00010158.JPEG n02823750/
+mv val/ILSVRC2012_val_00010159.JPEG n04264628/
+mv val/ILSVRC2012_val_00010160.JPEG n03478589/
+mv val/ILSVRC2012_val_00010161.JPEG n02643566/
+mv val/ILSVRC2012_val_00010162.JPEG n01514859/
+mv val/ILSVRC2012_val_00010163.JPEG n02086646/
+mv val/ILSVRC2012_val_00010164.JPEG n01692333/
+mv val/ILSVRC2012_val_00010165.JPEG n03841143/
+mv val/ILSVRC2012_val_00010166.JPEG n03977966/
+mv val/ILSVRC2012_val_00010167.JPEG n04136333/
+mv val/ILSVRC2012_val_00010168.JPEG n02089973/
+mv val/ILSVRC2012_val_00010169.JPEG n02097298/
+mv val/ILSVRC2012_val_00010170.JPEG n04311174/
+mv val/ILSVRC2012_val_00010171.JPEG n01677366/
+mv val/ILSVRC2012_val_00010172.JPEG n01930112/
+mv val/ILSVRC2012_val_00010173.JPEG n02128925/
+mv val/ILSVRC2012_val_00010174.JPEG n03710721/
+mv val/ILSVRC2012_val_00010175.JPEG n02909870/
+mv val/ILSVRC2012_val_00010176.JPEG n02027492/
+mv val/ILSVRC2012_val_00010177.JPEG n04252077/
+mv val/ILSVRC2012_val_00010178.JPEG n03544143/
+mv val/ILSVRC2012_val_00010179.JPEG n09332890/
+mv val/ILSVRC2012_val_00010180.JPEG n04118776/
+mv val/ILSVRC2012_val_00010181.JPEG n04553703/
+mv val/ILSVRC2012_val_00010182.JPEG n02488702/
+mv val/ILSVRC2012_val_00010183.JPEG n02109525/
+mv val/ILSVRC2012_val_00010184.JPEG n04443257/
+mv val/ILSVRC2012_val_00010185.JPEG n01728572/
+mv val/ILSVRC2012_val_00010186.JPEG n03384352/
+mv val/ILSVRC2012_val_00010187.JPEG n04136333/
+mv val/ILSVRC2012_val_00010188.JPEG n07718472/
+mv val/ILSVRC2012_val_00010189.JPEG n03773504/
+mv val/ILSVRC2012_val_00010190.JPEG n04273569/
+mv val/ILSVRC2012_val_00010191.JPEG n02730930/
+mv val/ILSVRC2012_val_00010192.JPEG n02259212/
+mv val/ILSVRC2012_val_00010193.JPEG n03125729/
+mv val/ILSVRC2012_val_00010194.JPEG n01748264/
+mv val/ILSVRC2012_val_00010195.JPEG n03095699/
+mv val/ILSVRC2012_val_00010196.JPEG n02504458/
+mv val/ILSVRC2012_val_00010197.JPEG n04579432/
+mv val/ILSVRC2012_val_00010198.JPEG n02231487/
+mv val/ILSVRC2012_val_00010199.JPEG n04442312/
+mv val/ILSVRC2012_val_00010200.JPEG n03447447/
+mv val/ILSVRC2012_val_00010201.JPEG n02939185/
+mv val/ILSVRC2012_val_00010202.JPEG n02110341/
+mv val/ILSVRC2012_val_00010203.JPEG n04458633/
+mv val/ILSVRC2012_val_00010204.JPEG n03492542/
+mv val/ILSVRC2012_val_00010205.JPEG n02841315/
+mv val/ILSVRC2012_val_00010206.JPEG n04285008/
+mv val/ILSVRC2012_val_00010207.JPEG n02787622/
+mv val/ILSVRC2012_val_00010208.JPEG n01514668/
+mv val/ILSVRC2012_val_00010209.JPEG n03877472/
+mv val/ILSVRC2012_val_00010210.JPEG n04486054/
+mv val/ILSVRC2012_val_00010211.JPEG n04238763/
+mv val/ILSVRC2012_val_00010212.JPEG n02480495/
+mv val/ILSVRC2012_val_00010213.JPEG n07871810/
+mv val/ILSVRC2012_val_00010214.JPEG n01968897/
+mv val/ILSVRC2012_val_00010215.JPEG n03954731/
+mv val/ILSVRC2012_val_00010216.JPEG n03584829/
+mv val/ILSVRC2012_val_00010217.JPEG n03379051/
+mv val/ILSVRC2012_val_00010218.JPEG n02123394/
+mv val/ILSVRC2012_val_00010219.JPEG n03259280/
+mv val/ILSVRC2012_val_00010220.JPEG n07920052/
+mv val/ILSVRC2012_val_00010221.JPEG n02113712/
+mv val/ILSVRC2012_val_00010222.JPEG n02092002/
+mv val/ILSVRC2012_val_00010223.JPEG n02727426/
+mv val/ILSVRC2012_val_00010224.JPEG n04149813/
+mv val/ILSVRC2012_val_00010225.JPEG n01775062/
+mv val/ILSVRC2012_val_00010226.JPEG n03457902/
+mv val/ILSVRC2012_val_00010227.JPEG n03791053/
+mv val/ILSVRC2012_val_00010228.JPEG n02106550/
+mv val/ILSVRC2012_val_00010229.JPEG n09288635/
+mv val/ILSVRC2012_val_00010230.JPEG n01742172/
+mv val/ILSVRC2012_val_00010231.JPEG n02219486/
+mv val/ILSVRC2012_val_00010232.JPEG n04332243/
+mv val/ILSVRC2012_val_00010233.JPEG n02490219/
+mv val/ILSVRC2012_val_00010234.JPEG n04033901/
+mv val/ILSVRC2012_val_00010235.JPEG n03590841/
+mv val/ILSVRC2012_val_00010236.JPEG n04344873/
+mv val/ILSVRC2012_val_00010237.JPEG n07753592/
+mv val/ILSVRC2012_val_00010238.JPEG n02085936/
+mv val/ILSVRC2012_val_00010239.JPEG n03447721/
+mv val/ILSVRC2012_val_00010240.JPEG n01580077/
+mv val/ILSVRC2012_val_00010241.JPEG n02120505/
+mv val/ILSVRC2012_val_00010242.JPEG n02504458/
+mv val/ILSVRC2012_val_00010243.JPEG n03633091/
+mv val/ILSVRC2012_val_00010244.JPEG n02113023/
+mv val/ILSVRC2012_val_00010245.JPEG n02109525/
+mv val/ILSVRC2012_val_00010246.JPEG n11879895/
+mv val/ILSVRC2012_val_00010247.JPEG n03445924/
+mv val/ILSVRC2012_val_00010248.JPEG n01882714/
+mv val/ILSVRC2012_val_00010249.JPEG n02089867/
+mv val/ILSVRC2012_val_00010250.JPEG n04604644/
+mv val/ILSVRC2012_val_00010251.JPEG n03697007/
+mv val/ILSVRC2012_val_00010252.JPEG n02814533/
+mv val/ILSVRC2012_val_00010253.JPEG n02094114/
+mv val/ILSVRC2012_val_00010254.JPEG n01631663/
+mv val/ILSVRC2012_val_00010255.JPEG n02105251/
+mv val/ILSVRC2012_val_00010256.JPEG n02948072/
+mv val/ILSVRC2012_val_00010257.JPEG n04200800/
+mv val/ILSVRC2012_val_00010258.JPEG n01820546/
+mv val/ILSVRC2012_val_00010259.JPEG n03125729/
+mv val/ILSVRC2012_val_00010260.JPEG n03290653/
+mv val/ILSVRC2012_val_00010261.JPEG n02102480/
+mv val/ILSVRC2012_val_00010262.JPEG n04525038/
+mv val/ILSVRC2012_val_00010263.JPEG n03347037/
+mv val/ILSVRC2012_val_00010264.JPEG n03950228/
+mv val/ILSVRC2012_val_00010265.JPEG n02319095/
+mv val/ILSVRC2012_val_00010266.JPEG n03160309/
+mv val/ILSVRC2012_val_00010267.JPEG n03787032/
+mv val/ILSVRC2012_val_00010268.JPEG n02107574/
+mv val/ILSVRC2012_val_00010269.JPEG n04487394/
+mv val/ILSVRC2012_val_00010270.JPEG n04548280/
+mv val/ILSVRC2012_val_00010271.JPEG n07697537/
+mv val/ILSVRC2012_val_00010272.JPEG n01580077/
+mv val/ILSVRC2012_val_00010273.JPEG n03599486/
+mv val/ILSVRC2012_val_00010274.JPEG n04599235/
+mv val/ILSVRC2012_val_00010275.JPEG n01735189/
+mv val/ILSVRC2012_val_00010276.JPEG n04612504/
+mv val/ILSVRC2012_val_00010277.JPEG n02786058/
+mv val/ILSVRC2012_val_00010278.JPEG n03000247/
+mv val/ILSVRC2012_val_00010279.JPEG n02906734/
+mv val/ILSVRC2012_val_00010280.JPEG n13054560/
+mv val/ILSVRC2012_val_00010281.JPEG n02132136/
+mv val/ILSVRC2012_val_00010282.JPEG n02939185/
+mv val/ILSVRC2012_val_00010283.JPEG n02101006/
+mv val/ILSVRC2012_val_00010284.JPEG n04141975/
+mv val/ILSVRC2012_val_00010285.JPEG n04127249/
+mv val/ILSVRC2012_val_00010286.JPEG n07565083/
+mv val/ILSVRC2012_val_00010287.JPEG n01641577/
+mv val/ILSVRC2012_val_00010288.JPEG n02017213/
+mv val/ILSVRC2012_val_00010289.JPEG n02095889/
+mv val/ILSVRC2012_val_00010290.JPEG n02096585/
+mv val/ILSVRC2012_val_00010291.JPEG n03461385/
+mv val/ILSVRC2012_val_00010292.JPEG n02231487/
+mv val/ILSVRC2012_val_00010293.JPEG n04493381/
+mv val/ILSVRC2012_val_00010294.JPEG n02092339/
+mv val/ILSVRC2012_val_00010295.JPEG n04332243/
+mv val/ILSVRC2012_val_00010296.JPEG n02497673/
+mv val/ILSVRC2012_val_00010297.JPEG n02119022/
+mv val/ILSVRC2012_val_00010298.JPEG n02099601/
+mv val/ILSVRC2012_val_00010299.JPEG n04311004/
+mv val/ILSVRC2012_val_00010300.JPEG n03920288/
+mv val/ILSVRC2012_val_00010301.JPEG n02704792/
+mv val/ILSVRC2012_val_00010302.JPEG n02091032/
+mv val/ILSVRC2012_val_00010303.JPEG n03240683/
+mv val/ILSVRC2012_val_00010304.JPEG n03538406/
+mv val/ILSVRC2012_val_00010305.JPEG n04560804/
+mv val/ILSVRC2012_val_00010306.JPEG n01440764/
+mv val/ILSVRC2012_val_00010307.JPEG n02776631/
+mv val/ILSVRC2012_val_00010308.JPEG n02013706/
+mv val/ILSVRC2012_val_00010309.JPEG n02099849/
+mv val/ILSVRC2012_val_00010310.JPEG n01532829/
+mv val/ILSVRC2012_val_00010311.JPEG n02110341/
+mv val/ILSVRC2012_val_00010312.JPEG n01944390/
+mv val/ILSVRC2012_val_00010313.JPEG n03218198/
+mv val/ILSVRC2012_val_00010314.JPEG n02099712/
+mv val/ILSVRC2012_val_00010315.JPEG n04429376/
+mv val/ILSVRC2012_val_00010316.JPEG n03249569/
+mv val/ILSVRC2012_val_00010317.JPEG n02422106/
+mv val/ILSVRC2012_val_00010318.JPEG n04254777/
+mv val/ILSVRC2012_val_00010319.JPEG n04009552/
+mv val/ILSVRC2012_val_00010320.JPEG n03617480/
+mv val/ILSVRC2012_val_00010321.JPEG n03337140/
+mv val/ILSVRC2012_val_00010322.JPEG n01692333/
+mv val/ILSVRC2012_val_00010323.JPEG n02493509/
+mv val/ILSVRC2012_val_00010324.JPEG n12144580/
+mv val/ILSVRC2012_val_00010325.JPEG n03095699/
+mv val/ILSVRC2012_val_00010326.JPEG n03781244/
+mv val/ILSVRC2012_val_00010327.JPEG n03782006/
+mv val/ILSVRC2012_val_00010328.JPEG n02099429/
+mv val/ILSVRC2012_val_00010329.JPEG n09428293/
+mv val/ILSVRC2012_val_00010330.JPEG n04179913/
+mv val/ILSVRC2012_val_00010331.JPEG n02105251/
+mv val/ILSVRC2012_val_00010332.JPEG n07716358/
+mv val/ILSVRC2012_val_00010333.JPEG n04357314/
+mv val/ILSVRC2012_val_00010334.JPEG n03895866/
+mv val/ILSVRC2012_val_00010335.JPEG n02948072/
+mv val/ILSVRC2012_val_00010336.JPEG n03888257/
+mv val/ILSVRC2012_val_00010337.JPEG n03447447/
+mv val/ILSVRC2012_val_00010338.JPEG n07248320/
+mv val/ILSVRC2012_val_00010339.JPEG n01537544/
+mv val/ILSVRC2012_val_00010340.JPEG n02487347/
+mv val/ILSVRC2012_val_00010341.JPEG n03982430/
+mv val/ILSVRC2012_val_00010342.JPEG n02910353/
+mv val/ILSVRC2012_val_00010343.JPEG n07892512/
+mv val/ILSVRC2012_val_00010344.JPEG n09468604/
+mv val/ILSVRC2012_val_00010345.JPEG n03857828/
+mv val/ILSVRC2012_val_00010346.JPEG n03290653/
+mv val/ILSVRC2012_val_00010347.JPEG n03388043/
+mv val/ILSVRC2012_val_00010348.JPEG n03843555/
+mv val/ILSVRC2012_val_00010349.JPEG n04423845/
+mv val/ILSVRC2012_val_00010350.JPEG n04404412/
+mv val/ILSVRC2012_val_00010351.JPEG n04347754/
+mv val/ILSVRC2012_val_00010352.JPEG n01537544/
+mv val/ILSVRC2012_val_00010353.JPEG n02992529/
+mv val/ILSVRC2012_val_00010354.JPEG n02101388/
+mv val/ILSVRC2012_val_00010355.JPEG n02056570/
+mv val/ILSVRC2012_val_00010356.JPEG n02093859/
+mv val/ILSVRC2012_val_00010357.JPEG n02105412/
+mv val/ILSVRC2012_val_00010358.JPEG n03933933/
+mv val/ILSVRC2012_val_00010359.JPEG n02704792/
+mv val/ILSVRC2012_val_00010360.JPEG n03063599/
+mv val/ILSVRC2012_val_00010361.JPEG n12267677/
+mv val/ILSVRC2012_val_00010362.JPEG n04482393/
+mv val/ILSVRC2012_val_00010363.JPEG n01443537/
+mv val/ILSVRC2012_val_00010364.JPEG n03670208/
+mv val/ILSVRC2012_val_00010365.JPEG n04590129/
+mv val/ILSVRC2012_val_00010366.JPEG n07565083/
+mv val/ILSVRC2012_val_00010367.JPEG n04111531/
+mv val/ILSVRC2012_val_00010368.JPEG n03188531/
+mv val/ILSVRC2012_val_00010369.JPEG n02114712/
+mv val/ILSVRC2012_val_00010370.JPEG n04409515/
+mv val/ILSVRC2012_val_00010371.JPEG n03272010/
+mv val/ILSVRC2012_val_00010372.JPEG n02107312/
+mv val/ILSVRC2012_val_00010373.JPEG n02112018/
+mv val/ILSVRC2012_val_00010374.JPEG n03676483/
+mv val/ILSVRC2012_val_00010375.JPEG n03770439/
+mv val/ILSVRC2012_val_00010376.JPEG n13133613/
+mv val/ILSVRC2012_val_00010377.JPEG n04259630/
+mv val/ILSVRC2012_val_00010378.JPEG n02105641/
+mv val/ILSVRC2012_val_00010379.JPEG n04049303/
+mv val/ILSVRC2012_val_00010380.JPEG n02807133/
+mv val/ILSVRC2012_val_00010381.JPEG n03249569/
+mv val/ILSVRC2012_val_00010382.JPEG n02099267/
+mv val/ILSVRC2012_val_00010383.JPEG n04065272/
+mv val/ILSVRC2012_val_00010384.JPEG n07716906/
+mv val/ILSVRC2012_val_00010385.JPEG n02087394/
+mv val/ILSVRC2012_val_00010386.JPEG n01669191/
+mv val/ILSVRC2012_val_00010387.JPEG n04376876/
+mv val/ILSVRC2012_val_00010388.JPEG n01847000/
+mv val/ILSVRC2012_val_00010389.JPEG n02123597/
+mv val/ILSVRC2012_val_00010390.JPEG n04131690/
+mv val/ILSVRC2012_val_00010391.JPEG n02033041/
+mv val/ILSVRC2012_val_00010392.JPEG n04357314/
+mv val/ILSVRC2012_val_00010393.JPEG n01530575/
+mv val/ILSVRC2012_val_00010394.JPEG n02841315/
+mv val/ILSVRC2012_val_00010395.JPEG n01698640/
+mv val/ILSVRC2012_val_00010396.JPEG n04179913/
+mv val/ILSVRC2012_val_00010397.JPEG n01824575/
+mv val/ILSVRC2012_val_00010398.JPEG n02092002/
+mv val/ILSVRC2012_val_00010399.JPEG n02058221/
+mv val/ILSVRC2012_val_00010400.JPEG n03617480/
+mv val/ILSVRC2012_val_00010401.JPEG n04146614/
+mv val/ILSVRC2012_val_00010402.JPEG n02097130/
+mv val/ILSVRC2012_val_00010403.JPEG n09399592/
+mv val/ILSVRC2012_val_00010404.JPEG n02892201/
+mv val/ILSVRC2012_val_00010405.JPEG n02116738/
+mv val/ILSVRC2012_val_00010406.JPEG n04204347/
+mv val/ILSVRC2012_val_00010407.JPEG n04522168/
+mv val/ILSVRC2012_val_00010408.JPEG n04136333/
+mv val/ILSVRC2012_val_00010409.JPEG n01531178/
+mv val/ILSVRC2012_val_00010410.JPEG n02346627/
+mv val/ILSVRC2012_val_00010411.JPEG n02168699/
+mv val/ILSVRC2012_val_00010412.JPEG n01980166/
+mv val/ILSVRC2012_val_00010413.JPEG n07711569/
+mv val/ILSVRC2012_val_00010414.JPEG n03347037/
+mv val/ILSVRC2012_val_00010415.JPEG n04208210/
+mv val/ILSVRC2012_val_00010416.JPEG n02823750/
+mv val/ILSVRC2012_val_00010417.JPEG n02124075/
+mv val/ILSVRC2012_val_00010418.JPEG n02509815/
+mv val/ILSVRC2012_val_00010419.JPEG n03404251/
+mv val/ILSVRC2012_val_00010420.JPEG n02088364/
+mv val/ILSVRC2012_val_00010421.JPEG n01798484/
+mv val/ILSVRC2012_val_00010422.JPEG n02009912/
+mv val/ILSVRC2012_val_00010423.JPEG n03814639/
+mv val/ILSVRC2012_val_00010424.JPEG n02172182/
+mv val/ILSVRC2012_val_00010425.JPEG n03840681/
+mv val/ILSVRC2012_val_00010426.JPEG n02002556/
+mv val/ILSVRC2012_val_00010427.JPEG n03888257/
+mv val/ILSVRC2012_val_00010428.JPEG n03065424/
+mv val/ILSVRC2012_val_00010429.JPEG n03325584/
+mv val/ILSVRC2012_val_00010430.JPEG n02317335/
+mv val/ILSVRC2012_val_00010431.JPEG n02281406/
+mv val/ILSVRC2012_val_00010432.JPEG n03658185/
+mv val/ILSVRC2012_val_00010433.JPEG n02095570/
+mv val/ILSVRC2012_val_00010434.JPEG n03920288/
+mv val/ILSVRC2012_val_00010435.JPEG n03710637/
+mv val/ILSVRC2012_val_00010436.JPEG n02123597/
+mv val/ILSVRC2012_val_00010437.JPEG n03877472/
+mv val/ILSVRC2012_val_00010438.JPEG n04357314/
+mv val/ILSVRC2012_val_00010439.JPEG n07802026/
+mv val/ILSVRC2012_val_00010440.JPEG n04067472/
+mv val/ILSVRC2012_val_00010441.JPEG n02437616/
+mv val/ILSVRC2012_val_00010442.JPEG n03482405/
+mv val/ILSVRC2012_val_00010443.JPEG n01532829/
+mv val/ILSVRC2012_val_00010444.JPEG n04553703/
+mv val/ILSVRC2012_val_00010445.JPEG n03065424/
+mv val/ILSVRC2012_val_00010446.JPEG n02058221/
+mv val/ILSVRC2012_val_00010447.JPEG n07718472/
+mv val/ILSVRC2012_val_00010448.JPEG n04252225/
+mv val/ILSVRC2012_val_00010449.JPEG n02096585/
+mv val/ILSVRC2012_val_00010450.JPEG n02097658/
+mv val/ILSVRC2012_val_00010451.JPEG n04525305/
+mv val/ILSVRC2012_val_00010452.JPEG n12057211/
+mv val/ILSVRC2012_val_00010453.JPEG n04259630/
+mv val/ILSVRC2012_val_00010454.JPEG n02490219/
+mv val/ILSVRC2012_val_00010455.JPEG n04285008/
+mv val/ILSVRC2012_val_00010456.JPEG n01534433/
+mv val/ILSVRC2012_val_00010457.JPEG n01622779/
+mv val/ILSVRC2012_val_00010458.JPEG n04067472/
+mv val/ILSVRC2012_val_00010459.JPEG n04557648/
+mv val/ILSVRC2012_val_00010460.JPEG n03888257/
+mv val/ILSVRC2012_val_00010461.JPEG n02096051/
+mv val/ILSVRC2012_val_00010462.JPEG n01632458/
+mv val/ILSVRC2012_val_00010463.JPEG n02808304/
+mv val/ILSVRC2012_val_00010464.JPEG n12985857/
+mv val/ILSVRC2012_val_00010465.JPEG n01756291/
+mv val/ILSVRC2012_val_00010466.JPEG n02111500/
+mv val/ILSVRC2012_val_00010467.JPEG n02963159/
+mv val/ILSVRC2012_val_00010468.JPEG n02790996/
+mv val/ILSVRC2012_val_00010469.JPEG n03630383/
+mv val/ILSVRC2012_val_00010470.JPEG n07714990/
+mv val/ILSVRC2012_val_00010471.JPEG n04589890/
+mv val/ILSVRC2012_val_00010472.JPEG n02128757/
+mv val/ILSVRC2012_val_00010473.JPEG n02786058/
+mv val/ILSVRC2012_val_00010474.JPEG n02951358/
+mv val/ILSVRC2012_val_00010475.JPEG n03763968/
+mv val/ILSVRC2012_val_00010476.JPEG n02356798/
+mv val/ILSVRC2012_val_00010477.JPEG n01818515/
+mv val/ILSVRC2012_val_00010478.JPEG n02607072/
+mv val/ILSVRC2012_val_00010479.JPEG n07717410/
+mv val/ILSVRC2012_val_00010480.JPEG n03877472/
+mv val/ILSVRC2012_val_00010481.JPEG n04069434/
+mv val/ILSVRC2012_val_00010482.JPEG n02483362/
+mv val/ILSVRC2012_val_00010483.JPEG n04479046/
+mv val/ILSVRC2012_val_00010484.JPEG n02268853/
+mv val/ILSVRC2012_val_00010485.JPEG n10148035/
+mv val/ILSVRC2012_val_00010486.JPEG n02815834/
+mv val/ILSVRC2012_val_00010487.JPEG n02116738/
+mv val/ILSVRC2012_val_00010488.JPEG n04501370/
+mv val/ILSVRC2012_val_00010489.JPEG n03131574/
+mv val/ILSVRC2012_val_00010490.JPEG n02099712/
+mv val/ILSVRC2012_val_00010491.JPEG n02108915/
+mv val/ILSVRC2012_val_00010492.JPEG n04209239/
+mv val/ILSVRC2012_val_00010493.JPEG n03770439/
+mv val/ILSVRC2012_val_00010494.JPEG n02226429/
+mv val/ILSVRC2012_val_00010495.JPEG n12144580/
+mv val/ILSVRC2012_val_00010496.JPEG n02906734/
+mv val/ILSVRC2012_val_00010497.JPEG n02783161/
+mv val/ILSVRC2012_val_00010498.JPEG n02667093/
+mv val/ILSVRC2012_val_00010499.JPEG n04239074/
+mv val/ILSVRC2012_val_00010500.JPEG n02110063/
+mv val/ILSVRC2012_val_00010501.JPEG n01582220/
+mv val/ILSVRC2012_val_00010502.JPEG n07768694/
+mv val/ILSVRC2012_val_00010503.JPEG n01774750/
+mv val/ILSVRC2012_val_00010504.JPEG n03787032/
+mv val/ILSVRC2012_val_00010505.JPEG n12057211/
+mv val/ILSVRC2012_val_00010506.JPEG n03764736/
+mv val/ILSVRC2012_val_00010507.JPEG n01795545/
+mv val/ILSVRC2012_val_00010508.JPEG n03623198/
+mv val/ILSVRC2012_val_00010509.JPEG n01443537/
+mv val/ILSVRC2012_val_00010510.JPEG n02892201/
+mv val/ILSVRC2012_val_00010511.JPEG n03868242/
+mv val/ILSVRC2012_val_00010512.JPEG n03384352/
+mv val/ILSVRC2012_val_00010513.JPEG n02403003/
+mv val/ILSVRC2012_val_00010514.JPEG n03658185/
+mv val/ILSVRC2012_val_00010515.JPEG n03485794/
+mv val/ILSVRC2012_val_00010516.JPEG n02085782/
+mv val/ILSVRC2012_val_00010517.JPEG n04328186/
+mv val/ILSVRC2012_val_00010518.JPEG n03388183/
+mv val/ILSVRC2012_val_00010519.JPEG n04344873/
+mv val/ILSVRC2012_val_00010520.JPEG n07716358/
+mv val/ILSVRC2012_val_00010521.JPEG n02097047/
+mv val/ILSVRC2012_val_00010522.JPEG n01737021/
+mv val/ILSVRC2012_val_00010523.JPEG n01695060/
+mv val/ILSVRC2012_val_00010524.JPEG n02098286/
+mv val/ILSVRC2012_val_00010525.JPEG n04258138/
+mv val/ILSVRC2012_val_00010526.JPEG n03127747/
+mv val/ILSVRC2012_val_00010527.JPEG n07565083/
+mv val/ILSVRC2012_val_00010528.JPEG n01667114/
+mv val/ILSVRC2012_val_00010529.JPEG n03929660/
+mv val/ILSVRC2012_val_00010530.JPEG n03476684/
+mv val/ILSVRC2012_val_00010531.JPEG n03785016/
+mv val/ILSVRC2012_val_00010532.JPEG n04041544/
+mv val/ILSVRC2012_val_00010533.JPEG n02100236/
+mv val/ILSVRC2012_val_00010534.JPEG n03854065/
+mv val/ILSVRC2012_val_00010535.JPEG n03529860/
+mv val/ILSVRC2012_val_00010536.JPEG n02097209/
+mv val/ILSVRC2012_val_00010537.JPEG n02100236/
+mv val/ILSVRC2012_val_00010538.JPEG n04540053/
+mv val/ILSVRC2012_val_00010539.JPEG n02002556/
+mv val/ILSVRC2012_val_00010540.JPEG n03495258/
+mv val/ILSVRC2012_val_00010541.JPEG n02834397/
+mv val/ILSVRC2012_val_00010542.JPEG n04346328/
+mv val/ILSVRC2012_val_00010543.JPEG n03485407/
+mv val/ILSVRC2012_val_00010544.JPEG n02835271/
+mv val/ILSVRC2012_val_00010545.JPEG n01729977/
+mv val/ILSVRC2012_val_00010546.JPEG n02802426/
+mv val/ILSVRC2012_val_00010547.JPEG n03781244/
+mv val/ILSVRC2012_val_00010548.JPEG n02793495/
+mv val/ILSVRC2012_val_00010549.JPEG n02892767/
+mv val/ILSVRC2012_val_00010550.JPEG n02086240/
+mv val/ILSVRC2012_val_00010551.JPEG n02490219/
+mv val/ILSVRC2012_val_00010552.JPEG n02119022/
+mv val/ILSVRC2012_val_00010553.JPEG n06359193/
+mv val/ILSVRC2012_val_00010554.JPEG n03207743/
+mv val/ILSVRC2012_val_00010555.JPEG n01980166/
+mv val/ILSVRC2012_val_00010556.JPEG n04467665/
+mv val/ILSVRC2012_val_00010557.JPEG n04332243/
+mv val/ILSVRC2012_val_00010558.JPEG n03598930/
+mv val/ILSVRC2012_val_00010559.JPEG n04523525/
+mv val/ILSVRC2012_val_00010560.JPEG n03877472/
+mv val/ILSVRC2012_val_00010561.JPEG n03976657/
+mv val/ILSVRC2012_val_00010562.JPEG n02256656/
+mv val/ILSVRC2012_val_00010563.JPEG n02097130/
+mv val/ILSVRC2012_val_00010564.JPEG n02606052/
+mv val/ILSVRC2012_val_00010565.JPEG n04037443/
+mv val/ILSVRC2012_val_00010566.JPEG n02793495/
+mv val/ILSVRC2012_val_00010567.JPEG n03929855/
+mv val/ILSVRC2012_val_00010568.JPEG n04118776/
+mv val/ILSVRC2012_val_00010569.JPEG n02727426/
+mv val/ILSVRC2012_val_00010570.JPEG n01833805/
+mv val/ILSVRC2012_val_00010571.JPEG n02536864/
+mv val/ILSVRC2012_val_00010572.JPEG n03710721/
+mv val/ILSVRC2012_val_00010573.JPEG n03459775/
+mv val/ILSVRC2012_val_00010574.JPEG n04311004/
+mv val/ILSVRC2012_val_00010575.JPEG n02113712/
+mv val/ILSVRC2012_val_00010576.JPEG n02480495/
+mv val/ILSVRC2012_val_00010577.JPEG n03041632/
+mv val/ILSVRC2012_val_00010578.JPEG n02966193/
+mv val/ILSVRC2012_val_00010579.JPEG n03476684/
+mv val/ILSVRC2012_val_00010580.JPEG n07716358/
+mv val/ILSVRC2012_val_00010581.JPEG n04310018/
+mv val/ILSVRC2012_val_00010582.JPEG n07579787/
+mv val/ILSVRC2012_val_00010583.JPEG n02493793/
+mv val/ILSVRC2012_val_00010584.JPEG n02094433/
+mv val/ILSVRC2012_val_00010585.JPEG n07734744/
+mv val/ILSVRC2012_val_00010586.JPEG n01744401/
+mv val/ILSVRC2012_val_00010587.JPEG n03770679/
+mv val/ILSVRC2012_val_00010588.JPEG n04523525/
+mv val/ILSVRC2012_val_00010589.JPEG n02364673/
+mv val/ILSVRC2012_val_00010590.JPEG n03355925/
+mv val/ILSVRC2012_val_00010591.JPEG n07715103/
+mv val/ILSVRC2012_val_00010592.JPEG n02403003/
+mv val/ILSVRC2012_val_00010593.JPEG n01644900/
+mv val/ILSVRC2012_val_00010594.JPEG n01518878/
+mv val/ILSVRC2012_val_00010595.JPEG n02815834/
+mv val/ILSVRC2012_val_00010596.JPEG n04251144/
+mv val/ILSVRC2012_val_00010597.JPEG n02690373/
+mv val/ILSVRC2012_val_00010598.JPEG n02124075/
+mv val/ILSVRC2012_val_00010599.JPEG n04553703/
+mv val/ILSVRC2012_val_00010600.JPEG n04081281/
+mv val/ILSVRC2012_val_00010601.JPEG n02408429/
+mv val/ILSVRC2012_val_00010602.JPEG n01704323/
+mv val/ILSVRC2012_val_00010603.JPEG n02640242/
+mv val/ILSVRC2012_val_00010604.JPEG n03478589/
+mv val/ILSVRC2012_val_00010605.JPEG n04447861/
+mv val/ILSVRC2012_val_00010606.JPEG n07875152/
+mv val/ILSVRC2012_val_00010607.JPEG n04209133/
+mv val/ILSVRC2012_val_00010608.JPEG n07734744/
+mv val/ILSVRC2012_val_00010609.JPEG n04487081/
+mv val/ILSVRC2012_val_00010610.JPEG n02177972/
+mv val/ILSVRC2012_val_00010611.JPEG n02892767/
+mv val/ILSVRC2012_val_00010612.JPEG n02113624/
+mv val/ILSVRC2012_val_00010613.JPEG n03016953/
+mv val/ILSVRC2012_val_00010614.JPEG n07753275/
+mv val/ILSVRC2012_val_00010615.JPEG n02319095/
+mv val/ILSVRC2012_val_00010616.JPEG n07745940/
+mv val/ILSVRC2012_val_00010617.JPEG n02108000/
+mv val/ILSVRC2012_val_00010618.JPEG n02028035/
+mv val/ILSVRC2012_val_00010619.JPEG n02504458/
+mv val/ILSVRC2012_val_00010620.JPEG n02106550/
+mv val/ILSVRC2012_val_00010621.JPEG n07754684/
+mv val/ILSVRC2012_val_00010622.JPEG n03063599/
+mv val/ILSVRC2012_val_00010623.JPEG n03787032/
+mv val/ILSVRC2012_val_00010624.JPEG n02098105/
+mv val/ILSVRC2012_val_00010625.JPEG n03467068/
+mv val/ILSVRC2012_val_00010626.JPEG n02089867/
+mv val/ILSVRC2012_val_00010627.JPEG n02093428/
+mv val/ILSVRC2012_val_00010628.JPEG n07718747/
+mv val/ILSVRC2012_val_00010629.JPEG n07831146/
+mv val/ILSVRC2012_val_00010630.JPEG n03496892/
+mv val/ILSVRC2012_val_00010631.JPEG n03961711/
+mv val/ILSVRC2012_val_00010632.JPEG n01924916/
+mv val/ILSVRC2012_val_00010633.JPEG n01883070/
+mv val/ILSVRC2012_val_00010634.JPEG n01704323/
+mv val/ILSVRC2012_val_00010635.JPEG n03733281/
+mv val/ILSVRC2012_val_00010636.JPEG n03791053/
+mv val/ILSVRC2012_val_00010637.JPEG n02930766/
+mv val/ILSVRC2012_val_00010638.JPEG n03478589/
+mv val/ILSVRC2012_val_00010639.JPEG n01980166/
+mv val/ILSVRC2012_val_00010640.JPEG n01985128/
+mv val/ILSVRC2012_val_00010641.JPEG n09472597/
+mv val/ILSVRC2012_val_00010642.JPEG n03967562/
+mv val/ILSVRC2012_val_00010643.JPEG n02087394/
+mv val/ILSVRC2012_val_00010644.JPEG n01914609/
+mv val/ILSVRC2012_val_00010645.JPEG n02497673/
+mv val/ILSVRC2012_val_00010646.JPEG n03924679/
+mv val/ILSVRC2012_val_00010647.JPEG n03706229/
+mv val/ILSVRC2012_val_00010648.JPEG n02108089/
+mv val/ILSVRC2012_val_00010649.JPEG n15075141/
+mv val/ILSVRC2012_val_00010650.JPEG n03977966/
+mv val/ILSVRC2012_val_00010651.JPEG n07715103/
+mv val/ILSVRC2012_val_00010652.JPEG n03187595/
+mv val/ILSVRC2012_val_00010653.JPEG n02236044/
+mv val/ILSVRC2012_val_00010654.JPEG n04599235/
+mv val/ILSVRC2012_val_00010655.JPEG n03529860/
+mv val/ILSVRC2012_val_00010656.JPEG n04023962/
+mv val/ILSVRC2012_val_00010657.JPEG n02092339/
+mv val/ILSVRC2012_val_00010658.JPEG n02977058/
+mv val/ILSVRC2012_val_00010659.JPEG n07584110/
+mv val/ILSVRC2012_val_00010660.JPEG n07730033/
+mv val/ILSVRC2012_val_00010661.JPEG n03272010/
+mv val/ILSVRC2012_val_00010662.JPEG n03676483/
+mv val/ILSVRC2012_val_00010663.JPEG n02493509/
+mv val/ILSVRC2012_val_00010664.JPEG n09468604/
+mv val/ILSVRC2012_val_00010665.JPEG n02091467/
+mv val/ILSVRC2012_val_00010666.JPEG n03534580/
+mv val/ILSVRC2012_val_00010667.JPEG n03125729/
+mv val/ILSVRC2012_val_00010668.JPEG n04467665/
+mv val/ILSVRC2012_val_00010669.JPEG n01665541/
+mv val/ILSVRC2012_val_00010670.JPEG n04330267/
+mv val/ILSVRC2012_val_00010671.JPEG n02917067/
+mv val/ILSVRC2012_val_00010672.JPEG n03196217/
+mv val/ILSVRC2012_val_00010673.JPEG n02009229/
+mv val/ILSVRC2012_val_00010674.JPEG n03042490/
+mv val/ILSVRC2012_val_00010675.JPEG n01632458/
+mv val/ILSVRC2012_val_00010676.JPEG n03100240/
+mv val/ILSVRC2012_val_00010677.JPEG n02965783/
+mv val/ILSVRC2012_val_00010678.JPEG n02172182/
+mv val/ILSVRC2012_val_00010679.JPEG n03920288/
+mv val/ILSVRC2012_val_00010680.JPEG n03109150/
+mv val/ILSVRC2012_val_00010681.JPEG n07747607/
+mv val/ILSVRC2012_val_00010682.JPEG n02093859/
+mv val/ILSVRC2012_val_00010683.JPEG n02655020/
+mv val/ILSVRC2012_val_00010684.JPEG n03658185/
+mv val/ILSVRC2012_val_00010685.JPEG n03584254/
+mv val/ILSVRC2012_val_00010686.JPEG n02110806/
+mv val/ILSVRC2012_val_00010687.JPEG n04596742/
+mv val/ILSVRC2012_val_00010688.JPEG n02113799/
+mv val/ILSVRC2012_val_00010689.JPEG n01530575/
+mv val/ILSVRC2012_val_00010690.JPEG n03345487/
+mv val/ILSVRC2012_val_00010691.JPEG n02917067/
+mv val/ILSVRC2012_val_00010692.JPEG n03788195/
+mv val/ILSVRC2012_val_00010693.JPEG n02105162/
+mv val/ILSVRC2012_val_00010694.JPEG n15075141/
+mv val/ILSVRC2012_val_00010695.JPEG n04317175/
+mv val/ILSVRC2012_val_00010696.JPEG n04251144/
+mv val/ILSVRC2012_val_00010697.JPEG n02112018/
+mv val/ILSVRC2012_val_00010698.JPEG n04326547/
+mv val/ILSVRC2012_val_00010699.JPEG n03838899/
+mv val/ILSVRC2012_val_00010700.JPEG n01955084/
+mv val/ILSVRC2012_val_00010701.JPEG n02417914/
+mv val/ILSVRC2012_val_00010702.JPEG n02099849/
+mv val/ILSVRC2012_val_00010703.JPEG n02317335/
+mv val/ILSVRC2012_val_00010704.JPEG n03095699/
+mv val/ILSVRC2012_val_00010705.JPEG n02699494/
+mv val/ILSVRC2012_val_00010706.JPEG n04554684/
+mv val/ILSVRC2012_val_00010707.JPEG n03729826/
+mv val/ILSVRC2012_val_00010708.JPEG n04005630/
+mv val/ILSVRC2012_val_00010709.JPEG n02108422/
+mv val/ILSVRC2012_val_00010710.JPEG n03127925/
+mv val/ILSVRC2012_val_00010711.JPEG n02123045/
+mv val/ILSVRC2012_val_00010712.JPEG n03832673/
+mv val/ILSVRC2012_val_00010713.JPEG n02504013/
+mv val/ILSVRC2012_val_00010714.JPEG n01806567/
+mv val/ILSVRC2012_val_00010715.JPEG n04069434/
+mv val/ILSVRC2012_val_00010716.JPEG n04023962/
+mv val/ILSVRC2012_val_00010717.JPEG n04111531/
+mv val/ILSVRC2012_val_00010718.JPEG n02097209/
+mv val/ILSVRC2012_val_00010719.JPEG n02105056/
+mv val/ILSVRC2012_val_00010720.JPEG n02097209/
+mv val/ILSVRC2012_val_00010721.JPEG n03376595/
+mv val/ILSVRC2012_val_00010722.JPEG n02095314/
+mv val/ILSVRC2012_val_00010723.JPEG n01756291/
+mv val/ILSVRC2012_val_00010724.JPEG n03773504/
+mv val/ILSVRC2012_val_00010725.JPEG n01980166/
+mv val/ILSVRC2012_val_00010726.JPEG n06794110/
+mv val/ILSVRC2012_val_00010727.JPEG n04074963/
+mv val/ILSVRC2012_val_00010728.JPEG n02747177/
+mv val/ILSVRC2012_val_00010729.JPEG n02108551/
+mv val/ILSVRC2012_val_00010730.JPEG n03255030/
+mv val/ILSVRC2012_val_00010731.JPEG n03891251/
+mv val/ILSVRC2012_val_00010732.JPEG n03935335/
+mv val/ILSVRC2012_val_00010733.JPEG n03673027/
+mv val/ILSVRC2012_val_00010734.JPEG n02111277/
+mv val/ILSVRC2012_val_00010735.JPEG n03188531/
+mv val/ILSVRC2012_val_00010736.JPEG n02100236/
+mv val/ILSVRC2012_val_00010737.JPEG n02992529/
+mv val/ILSVRC2012_val_00010738.JPEG n02607072/
+mv val/ILSVRC2012_val_00010739.JPEG n02095889/
+mv val/ILSVRC2012_val_00010740.JPEG n02002556/
+mv val/ILSVRC2012_val_00010741.JPEG n02834397/
+mv val/ILSVRC2012_val_00010742.JPEG n02134084/
+mv val/ILSVRC2012_val_00010743.JPEG n07716906/
+mv val/ILSVRC2012_val_00010744.JPEG n02804414/
+mv val/ILSVRC2012_val_00010745.JPEG n02134084/
+mv val/ILSVRC2012_val_00010746.JPEG n04008634/
+mv val/ILSVRC2012_val_00010747.JPEG n02509815/
+mv val/ILSVRC2012_val_00010748.JPEG n04254120/
+mv val/ILSVRC2012_val_00010749.JPEG n04147183/
+mv val/ILSVRC2012_val_00010750.JPEG n04204238/
+mv val/ILSVRC2012_val_00010751.JPEG n03908714/
+mv val/ILSVRC2012_val_00010752.JPEG n04162706/
+mv val/ILSVRC2012_val_00010753.JPEG n03197337/
+mv val/ILSVRC2012_val_00010754.JPEG n11879895/
+mv val/ILSVRC2012_val_00010755.JPEG n03787032/
+mv val/ILSVRC2012_val_00010756.JPEG n04111531/
+mv val/ILSVRC2012_val_00010757.JPEG n02978881/
+mv val/ILSVRC2012_val_00010758.JPEG n02102177/
+mv val/ILSVRC2012_val_00010759.JPEG n03379051/
+mv val/ILSVRC2012_val_00010760.JPEG n04371774/
+mv val/ILSVRC2012_val_00010761.JPEG n01704323/
+mv val/ILSVRC2012_val_00010762.JPEG n03710721/
+mv val/ILSVRC2012_val_00010763.JPEG n01518878/
+mv val/ILSVRC2012_val_00010764.JPEG n03016953/
+mv val/ILSVRC2012_val_00010765.JPEG n02106382/
+mv val/ILSVRC2012_val_00010766.JPEG n04540053/
+mv val/ILSVRC2012_val_00010767.JPEG n01558993/
+mv val/ILSVRC2012_val_00010768.JPEG n02105412/
+mv val/ILSVRC2012_val_00010769.JPEG n02981792/
+mv val/ILSVRC2012_val_00010770.JPEG n03028079/
+mv val/ILSVRC2012_val_00010771.JPEG n03782006/
+mv val/ILSVRC2012_val_00010772.JPEG n02086079/
+mv val/ILSVRC2012_val_00010773.JPEG n04192698/
+mv val/ILSVRC2012_val_00010774.JPEG n02233338/
+mv val/ILSVRC2012_val_00010775.JPEG n03649909/
+mv val/ILSVRC2012_val_00010776.JPEG n03496892/
+mv val/ILSVRC2012_val_00010777.JPEG n02276258/
+mv val/ILSVRC2012_val_00010778.JPEG n03832673/
+mv val/ILSVRC2012_val_00010779.JPEG n04070727/
+mv val/ILSVRC2012_val_00010780.JPEG n03899768/
+mv val/ILSVRC2012_val_00010781.JPEG n03017168/
+mv val/ILSVRC2012_val_00010782.JPEG n03485794/
+mv val/ILSVRC2012_val_00010783.JPEG n04591157/
+mv val/ILSVRC2012_val_00010784.JPEG n02493509/
+mv val/ILSVRC2012_val_00010785.JPEG n02093754/
+mv val/ILSVRC2012_val_00010786.JPEG n02107683/
+mv val/ILSVRC2012_val_00010787.JPEG n04208210/
+mv val/ILSVRC2012_val_00010788.JPEG n02992529/
+mv val/ILSVRC2012_val_00010789.JPEG n03124043/
+mv val/ILSVRC2012_val_00010790.JPEG n03876231/
+mv val/ILSVRC2012_val_00010791.JPEG n03691459/
+mv val/ILSVRC2012_val_00010792.JPEG n01667778/
+mv val/ILSVRC2012_val_00010793.JPEG n07730033/
+mv val/ILSVRC2012_val_00010794.JPEG n04252225/
+mv val/ILSVRC2012_val_00010795.JPEG n04208210/
+mv val/ILSVRC2012_val_00010796.JPEG n02860847/
+mv val/ILSVRC2012_val_00010797.JPEG n01742172/
+mv val/ILSVRC2012_val_00010798.JPEG n02094114/
+mv val/ILSVRC2012_val_00010799.JPEG n03000134/
+mv val/ILSVRC2012_val_00010800.JPEG n07860988/
+mv val/ILSVRC2012_val_00010801.JPEG n01775062/
+mv val/ILSVRC2012_val_00010802.JPEG n03958227/
+mv val/ILSVRC2012_val_00010803.JPEG n03045698/
+mv val/ILSVRC2012_val_00010804.JPEG n03759954/
+mv val/ILSVRC2012_val_00010805.JPEG n02086240/
+mv val/ILSVRC2012_val_00010806.JPEG n03676483/
+mv val/ILSVRC2012_val_00010807.JPEG n04532670/
+mv val/ILSVRC2012_val_00010808.JPEG n02100583/
+mv val/ILSVRC2012_val_00010809.JPEG n02793495/
+mv val/ILSVRC2012_val_00010810.JPEG n01855032/
+mv val/ILSVRC2012_val_00010811.JPEG n04275548/
+mv val/ILSVRC2012_val_00010812.JPEG n04409515/
+mv val/ILSVRC2012_val_00010813.JPEG n03733131/
+mv val/ILSVRC2012_val_00010814.JPEG n03710193/
+mv val/ILSVRC2012_val_00010815.JPEG n07760859/
+mv val/ILSVRC2012_val_00010816.JPEG n03854065/
+mv val/ILSVRC2012_val_00010817.JPEG n01629819/
+mv val/ILSVRC2012_val_00010818.JPEG n02840245/
+mv val/ILSVRC2012_val_00010819.JPEG n03691459/
+mv val/ILSVRC2012_val_00010820.JPEG n03452741/
+mv val/ILSVRC2012_val_00010821.JPEG n03297495/
+mv val/ILSVRC2012_val_00010822.JPEG n03877472/
+mv val/ILSVRC2012_val_00010823.JPEG n02125311/
+mv val/ILSVRC2012_val_00010824.JPEG n04037443/
+mv val/ILSVRC2012_val_00010825.JPEG n02526121/
+mv val/ILSVRC2012_val_00010826.JPEG n01698640/
+mv val/ILSVRC2012_val_00010827.JPEG n04591713/
+mv val/ILSVRC2012_val_00010828.JPEG n02860847/
+mv val/ILSVRC2012_val_00010829.JPEG n02412080/
+mv val/ILSVRC2012_val_00010830.JPEG n01728572/
+mv val/ILSVRC2012_val_00010831.JPEG n04152593/
+mv val/ILSVRC2012_val_00010832.JPEG n02879718/
+mv val/ILSVRC2012_val_00010833.JPEG n02699494/
+mv val/ILSVRC2012_val_00010834.JPEG n02115913/
+mv val/ILSVRC2012_val_00010835.JPEG n03000134/
+mv val/ILSVRC2012_val_00010836.JPEG n02326432/
+mv val/ILSVRC2012_val_00010837.JPEG n02966193/
+mv val/ILSVRC2012_val_00010838.JPEG n04326547/
+mv val/ILSVRC2012_val_00010839.JPEG n04049303/
+mv val/ILSVRC2012_val_00010840.JPEG n04501370/
+mv val/ILSVRC2012_val_00010841.JPEG n07590611/
+mv val/ILSVRC2012_val_00010842.JPEG n02088466/
+mv val/ILSVRC2012_val_00010843.JPEG n01665541/
+mv val/ILSVRC2012_val_00010844.JPEG n03141823/
+mv val/ILSVRC2012_val_00010845.JPEG n02037110/
+mv val/ILSVRC2012_val_00010846.JPEG n02110958/
+mv val/ILSVRC2012_val_00010847.JPEG n03481172/
+mv val/ILSVRC2012_val_00010848.JPEG n07860988/
+mv val/ILSVRC2012_val_00010849.JPEG n02509815/
+mv val/ILSVRC2012_val_00010850.JPEG n02869837/
+mv val/ILSVRC2012_val_00010851.JPEG n03930313/
+mv val/ILSVRC2012_val_00010852.JPEG n03492542/
+mv val/ILSVRC2012_val_00010853.JPEG n02480855/
+mv val/ILSVRC2012_val_00010854.JPEG n02486261/
+mv val/ILSVRC2012_val_00010855.JPEG n03495258/
+mv val/ILSVRC2012_val_00010856.JPEG n03478589/
+mv val/ILSVRC2012_val_00010857.JPEG n03063599/
+mv val/ILSVRC2012_val_00010858.JPEG n04525038/
+mv val/ILSVRC2012_val_00010859.JPEG n02109525/
+mv val/ILSVRC2012_val_00010860.JPEG n02787622/
+mv val/ILSVRC2012_val_00010861.JPEG n01592084/
+mv val/ILSVRC2012_val_00010862.JPEG n02437616/
+mv val/ILSVRC2012_val_00010863.JPEG n13040303/
+mv val/ILSVRC2012_val_00010864.JPEG n04118776/
+mv val/ILSVRC2012_val_00010865.JPEG n02104365/
+mv val/ILSVRC2012_val_00010866.JPEG n02927161/
+mv val/ILSVRC2012_val_00010867.JPEG n03532672/
+mv val/ILSVRC2012_val_00010868.JPEG n03814639/
+mv val/ILSVRC2012_val_00010869.JPEG n01910747/
+mv val/ILSVRC2012_val_00010870.JPEG n01737021/
+mv val/ILSVRC2012_val_00010871.JPEG n03877845/
+mv val/ILSVRC2012_val_00010872.JPEG n07579787/
+mv val/ILSVRC2012_val_00010873.JPEG n09288635/
+mv val/ILSVRC2012_val_00010874.JPEG n01981276/
+mv val/ILSVRC2012_val_00010875.JPEG n03133878/
+mv val/ILSVRC2012_val_00010876.JPEG n02667093/
+mv val/ILSVRC2012_val_00010877.JPEG n02747177/
+mv val/ILSVRC2012_val_00010878.JPEG n02500267/
+mv val/ILSVRC2012_val_00010879.JPEG n04370456/
+mv val/ILSVRC2012_val_00010880.JPEG n01601694/
+mv val/ILSVRC2012_val_00010881.JPEG n03769881/
+mv val/ILSVRC2012_val_00010882.JPEG n04372370/
+mv val/ILSVRC2012_val_00010883.JPEG n02114712/
+mv val/ILSVRC2012_val_00010884.JPEG n02326432/
+mv val/ILSVRC2012_val_00010885.JPEG n03134739/
+mv val/ILSVRC2012_val_00010886.JPEG n03041632/
+mv val/ILSVRC2012_val_00010887.JPEG n01685808/
+mv val/ILSVRC2012_val_00010888.JPEG n02233338/
+mv val/ILSVRC2012_val_00010889.JPEG n01614925/
+mv val/ILSVRC2012_val_00010890.JPEG n03982430/
+mv val/ILSVRC2012_val_00010891.JPEG n03929855/
+mv val/ILSVRC2012_val_00010892.JPEG n04069434/
+mv val/ILSVRC2012_val_00010893.JPEG n04367480/
+mv val/ILSVRC2012_val_00010894.JPEG n03961711/
+mv val/ILSVRC2012_val_00010895.JPEG n03201208/
+mv val/ILSVRC2012_val_00010896.JPEG n02092002/
+mv val/ILSVRC2012_val_00010897.JPEG n04370456/
+mv val/ILSVRC2012_val_00010898.JPEG n04376876/
+mv val/ILSVRC2012_val_00010899.JPEG n02395406/
+mv val/ILSVRC2012_val_00010900.JPEG n03717622/
+mv val/ILSVRC2012_val_00010901.JPEG n04317175/
+mv val/ILSVRC2012_val_00010902.JPEG n02088094/
+mv val/ILSVRC2012_val_00010903.JPEG n02950826/
+mv val/ILSVRC2012_val_00010904.JPEG n01697457/
+mv val/ILSVRC2012_val_00010905.JPEG n04591157/
+mv val/ILSVRC2012_val_00010906.JPEG n01784675/
+mv val/ILSVRC2012_val_00010907.JPEG n03930630/
+mv val/ILSVRC2012_val_00010908.JPEG n04251144/
+mv val/ILSVRC2012_val_00010909.JPEG n02802426/
+mv val/ILSVRC2012_val_00010910.JPEG n07697537/
+mv val/ILSVRC2012_val_00010911.JPEG n01689811/
+mv val/ILSVRC2012_val_00010912.JPEG n12998815/
+mv val/ILSVRC2012_val_00010913.JPEG n04550184/
+mv val/ILSVRC2012_val_00010914.JPEG n04486054/
+mv val/ILSVRC2012_val_00010915.JPEG n01667778/
+mv val/ILSVRC2012_val_00010916.JPEG n03916031/
+mv val/ILSVRC2012_val_00010917.JPEG n01795545/
+mv val/ILSVRC2012_val_00010918.JPEG n02790996/
+mv val/ILSVRC2012_val_00010919.JPEG n01910747/
+mv val/ILSVRC2012_val_00010920.JPEG n02085936/
+mv val/ILSVRC2012_val_00010921.JPEG n03938244/
+mv val/ILSVRC2012_val_00010922.JPEG n03976467/
+mv val/ILSVRC2012_val_00010923.JPEG n02325366/
+mv val/ILSVRC2012_val_00010924.JPEG n03527444/
+mv val/ILSVRC2012_val_00010925.JPEG n02268443/
+mv val/ILSVRC2012_val_00010926.JPEG n03290653/
+mv val/ILSVRC2012_val_00010927.JPEG n03444034/
+mv val/ILSVRC2012_val_00010928.JPEG n02105056/
+mv val/ILSVRC2012_val_00010929.JPEG n02096437/
+mv val/ILSVRC2012_val_00010930.JPEG n03457902/
+mv val/ILSVRC2012_val_00010931.JPEG n03843555/
+mv val/ILSVRC2012_val_00010932.JPEG n02500267/
+mv val/ILSVRC2012_val_00010933.JPEG n02088094/
+mv val/ILSVRC2012_val_00010934.JPEG n02769748/
+mv val/ILSVRC2012_val_00010935.JPEG n04525038/
+mv val/ILSVRC2012_val_00010936.JPEG n02606052/
+mv val/ILSVRC2012_val_00010937.JPEG n04487081/
+mv val/ILSVRC2012_val_00010938.JPEG n02486261/
+mv val/ILSVRC2012_val_00010939.JPEG n03492542/
+mv val/ILSVRC2012_val_00010940.JPEG n03733131/
+mv val/ILSVRC2012_val_00010941.JPEG n02120505/
+mv val/ILSVRC2012_val_00010942.JPEG n07745940/
+mv val/ILSVRC2012_val_00010943.JPEG n02112137/
+mv val/ILSVRC2012_val_00010944.JPEG n07579787/
+mv val/ILSVRC2012_val_00010945.JPEG n02105505/
+mv val/ILSVRC2012_val_00010946.JPEG n03452741/
+mv val/ILSVRC2012_val_00010947.JPEG n10148035/
+mv val/ILSVRC2012_val_00010948.JPEG n04125021/
+mv val/ILSVRC2012_val_00010949.JPEG n04026417/
+mv val/ILSVRC2012_val_00010950.JPEG n02089867/
+mv val/ILSVRC2012_val_00010951.JPEG n03995372/
+mv val/ILSVRC2012_val_00010952.JPEG n02177972/
+mv val/ILSVRC2012_val_00010953.JPEG n03903868/
+mv val/ILSVRC2012_val_00010954.JPEG n04409515/
+mv val/ILSVRC2012_val_00010955.JPEG n01943899/
+mv val/ILSVRC2012_val_00010956.JPEG n02100236/
+mv val/ILSVRC2012_val_00010957.JPEG n03124170/
+mv val/ILSVRC2012_val_00010958.JPEG n03197337/
+mv val/ILSVRC2012_val_00010959.JPEG n02361337/
+mv val/ILSVRC2012_val_00010960.JPEG n04325704/
+mv val/ILSVRC2012_val_00010961.JPEG n03920288/
+mv val/ILSVRC2012_val_00010962.JPEG n03825788/
+mv val/ILSVRC2012_val_00010963.JPEG n02101388/
+mv val/ILSVRC2012_val_00010964.JPEG n11879895/
+mv val/ILSVRC2012_val_00010965.JPEG n03443371/
+mv val/ILSVRC2012_val_00010966.JPEG n02071294/
+mv val/ILSVRC2012_val_00010967.JPEG n07880968/
+mv val/ILSVRC2012_val_00010968.JPEG n03769881/
+mv val/ILSVRC2012_val_00010969.JPEG n03902125/
+mv val/ILSVRC2012_val_00010970.JPEG n02110806/
+mv val/ILSVRC2012_val_00010971.JPEG n03637318/
+mv val/ILSVRC2012_val_00010972.JPEG n04019541/
+mv val/ILSVRC2012_val_00010973.JPEG n03840681/
+mv val/ILSVRC2012_val_00010974.JPEG n02342885/
+mv val/ILSVRC2012_val_00010975.JPEG n03476684/
+mv val/ILSVRC2012_val_00010976.JPEG n02094114/
+mv val/ILSVRC2012_val_00010977.JPEG n04023962/
+mv val/ILSVRC2012_val_00010978.JPEG n03706229/
+mv val/ILSVRC2012_val_00010979.JPEG n02730930/
+mv val/ILSVRC2012_val_00010980.JPEG n02877765/
+mv val/ILSVRC2012_val_00010981.JPEG n04548362/
+mv val/ILSVRC2012_val_00010982.JPEG n02088632/
+mv val/ILSVRC2012_val_00010983.JPEG n04285008/
+mv val/ILSVRC2012_val_00010984.JPEG n07873807/
+mv val/ILSVRC2012_val_00010985.JPEG n03903868/
+mv val/ILSVRC2012_val_00010986.JPEG n04501370/
+mv val/ILSVRC2012_val_00010987.JPEG n04118538/
+mv val/ILSVRC2012_val_00010988.JPEG n02025239/
+mv val/ILSVRC2012_val_00010989.JPEG n03530642/
+mv val/ILSVRC2012_val_00010990.JPEG n02018207/
+mv val/ILSVRC2012_val_00010991.JPEG n03476684/
+mv val/ILSVRC2012_val_00010992.JPEG n03602883/
+mv val/ILSVRC2012_val_00010993.JPEG n02948072/
+mv val/ILSVRC2012_val_00010994.JPEG n02102040/
+mv val/ILSVRC2012_val_00010995.JPEG n02123394/
+mv val/ILSVRC2012_val_00010996.JPEG n01944390/
+mv val/ILSVRC2012_val_00010997.JPEG n02268853/
+mv val/ILSVRC2012_val_00010998.JPEG n04590129/
+mv val/ILSVRC2012_val_00010999.JPEG n01530575/
+mv val/ILSVRC2012_val_00011000.JPEG n02117135/
+mv val/ILSVRC2012_val_00011001.JPEG n03691459/
+mv val/ILSVRC2012_val_00011002.JPEG n02504013/
+mv val/ILSVRC2012_val_00011003.JPEG n03179701/
+mv val/ILSVRC2012_val_00011004.JPEG n04357314/
+mv val/ILSVRC2012_val_00011005.JPEG n04399382/
+mv val/ILSVRC2012_val_00011006.JPEG n03218198/
+mv val/ILSVRC2012_val_00011007.JPEG n02865351/
+mv val/ILSVRC2012_val_00011008.JPEG n03598930/
+mv val/ILSVRC2012_val_00011009.JPEG n02113978/
+mv val/ILSVRC2012_val_00011010.JPEG n03697007/
+mv val/ILSVRC2012_val_00011011.JPEG n01843383/
+mv val/ILSVRC2012_val_00011012.JPEG n02074367/
+mv val/ILSVRC2012_val_00011013.JPEG n02264363/
+mv val/ILSVRC2012_val_00011014.JPEG n01742172/
+mv val/ILSVRC2012_val_00011015.JPEG n02123045/
+mv val/ILSVRC2012_val_00011016.JPEG n02795169/
+mv val/ILSVRC2012_val_00011017.JPEG n03721384/
+mv val/ILSVRC2012_val_00011018.JPEG n02129165/
+mv val/ILSVRC2012_val_00011019.JPEG n03544143/
+mv val/ILSVRC2012_val_00011020.JPEG n04522168/
+mv val/ILSVRC2012_val_00011021.JPEG n12985857/
+mv val/ILSVRC2012_val_00011022.JPEG n02814860/
+mv val/ILSVRC2012_val_00011023.JPEG n02110958/
+mv val/ILSVRC2012_val_00011024.JPEG n02100735/
+mv val/ILSVRC2012_val_00011025.JPEG n13044778/
+mv val/ILSVRC2012_val_00011026.JPEG n02817516/
+mv val/ILSVRC2012_val_00011027.JPEG n07730033/
+mv val/ILSVRC2012_val_00011028.JPEG n04429376/
+mv val/ILSVRC2012_val_00011029.JPEG n04033995/
+mv val/ILSVRC2012_val_00011030.JPEG n04367480/
+mv val/ILSVRC2012_val_00011031.JPEG n03729826/
+mv val/ILSVRC2012_val_00011032.JPEG n02493793/
+mv val/ILSVRC2012_val_00011033.JPEG n04141975/
+mv val/ILSVRC2012_val_00011034.JPEG n01740131/
+mv val/ILSVRC2012_val_00011035.JPEG n01914609/
+mv val/ILSVRC2012_val_00011036.JPEG n02134418/
+mv val/ILSVRC2012_val_00011037.JPEG n01739381/
+mv val/ILSVRC2012_val_00011038.JPEG n02687172/
+mv val/ILSVRC2012_val_00011039.JPEG n02483362/
+mv val/ILSVRC2012_val_00011040.JPEG n13037406/
+mv val/ILSVRC2012_val_00011041.JPEG n01742172/
+mv val/ILSVRC2012_val_00011042.JPEG n02396427/
+mv val/ILSVRC2012_val_00011043.JPEG n02397096/
+mv val/ILSVRC2012_val_00011044.JPEG n01689811/
+mv val/ILSVRC2012_val_00011045.JPEG n09399592/
+mv val/ILSVRC2012_val_00011046.JPEG n04347754/
+mv val/ILSVRC2012_val_00011047.JPEG n02865351/
+mv val/ILSVRC2012_val_00011048.JPEG n04344873/
+mv val/ILSVRC2012_val_00011049.JPEG n02111889/
+mv val/ILSVRC2012_val_00011050.JPEG n02939185/
+mv val/ILSVRC2012_val_00011051.JPEG n04033995/
+mv val/ILSVRC2012_val_00011052.JPEG n02037110/
+mv val/ILSVRC2012_val_00011053.JPEG n01773157/
+mv val/ILSVRC2012_val_00011054.JPEG n03599486/
+mv val/ILSVRC2012_val_00011055.JPEG n02093647/
+mv val/ILSVRC2012_val_00011056.JPEG n01532829/
+mv val/ILSVRC2012_val_00011057.JPEG n02097209/
+mv val/ILSVRC2012_val_00011058.JPEG n02492660/
+mv val/ILSVRC2012_val_00011059.JPEG n04009552/
+mv val/ILSVRC2012_val_00011060.JPEG n04033901/
+mv val/ILSVRC2012_val_00011061.JPEG n02099429/
+mv val/ILSVRC2012_val_00011062.JPEG n02056570/
+mv val/ILSVRC2012_val_00011063.JPEG n02098413/
+mv val/ILSVRC2012_val_00011064.JPEG n02992211/
+mv val/ILSVRC2012_val_00011065.JPEG n03788195/
+mv val/ILSVRC2012_val_00011066.JPEG n03207743/
+mv val/ILSVRC2012_val_00011067.JPEG n03444034/
+mv val/ILSVRC2012_val_00011068.JPEG n03814639/
+mv val/ILSVRC2012_val_00011069.JPEG n04485082/
+mv val/ILSVRC2012_val_00011070.JPEG n01981276/
+mv val/ILSVRC2012_val_00011071.JPEG n01978455/
+mv val/ILSVRC2012_val_00011072.JPEG n03461385/
+mv val/ILSVRC2012_val_00011073.JPEG n01688243/
+mv val/ILSVRC2012_val_00011074.JPEG n02277742/
+mv val/ILSVRC2012_val_00011075.JPEG n03388043/
+mv val/ILSVRC2012_val_00011076.JPEG n02871525/
+mv val/ILSVRC2012_val_00011077.JPEG n02101556/
+mv val/ILSVRC2012_val_00011078.JPEG n03131574/
+mv val/ILSVRC2012_val_00011079.JPEG n02236044/
+mv val/ILSVRC2012_val_00011080.JPEG n07248320/
+mv val/ILSVRC2012_val_00011081.JPEG n03041632/
+mv val/ILSVRC2012_val_00011082.JPEG n02095314/
+mv val/ILSVRC2012_val_00011083.JPEG n04344873/
+mv val/ILSVRC2012_val_00011084.JPEG n02119022/
+mv val/ILSVRC2012_val_00011085.JPEG n02172182/
+mv val/ILSVRC2012_val_00011086.JPEG n13054560/
+mv val/ILSVRC2012_val_00011087.JPEG n01978287/
+mv val/ILSVRC2012_val_00011088.JPEG n03532672/
+mv val/ILSVRC2012_val_00011089.JPEG n04536866/
+mv val/ILSVRC2012_val_00011090.JPEG n02105412/
+mv val/ILSVRC2012_val_00011091.JPEG n04118538/
+mv val/ILSVRC2012_val_00011092.JPEG n02443484/
+mv val/ILSVRC2012_val_00011093.JPEG n01695060/
+mv val/ILSVRC2012_val_00011094.JPEG n02909870/
+mv val/ILSVRC2012_val_00011095.JPEG n02441942/
+mv val/ILSVRC2012_val_00011096.JPEG n02017213/
+mv val/ILSVRC2012_val_00011097.JPEG n02799071/
+mv val/ILSVRC2012_val_00011098.JPEG n04147183/
+mv val/ILSVRC2012_val_00011099.JPEG n04589890/
+mv val/ILSVRC2012_val_00011100.JPEG n02056570/
+mv val/ILSVRC2012_val_00011101.JPEG n02486261/
+mv val/ILSVRC2012_val_00011102.JPEG n03345487/
+mv val/ILSVRC2012_val_00011103.JPEG n04328186/
+mv val/ILSVRC2012_val_00011104.JPEG n02328150/
+mv val/ILSVRC2012_val_00011105.JPEG n04476259/
+mv val/ILSVRC2012_val_00011106.JPEG n04346328/
+mv val/ILSVRC2012_val_00011107.JPEG n04273569/
+mv val/ILSVRC2012_val_00011108.JPEG n03290653/
+mv val/ILSVRC2012_val_00011109.JPEG n03627232/
+mv val/ILSVRC2012_val_00011110.JPEG n02791124/
+mv val/ILSVRC2012_val_00011111.JPEG n02012849/
+mv val/ILSVRC2012_val_00011112.JPEG n02259212/
+mv val/ILSVRC2012_val_00011113.JPEG n02090379/
+mv val/ILSVRC2012_val_00011114.JPEG n03627232/
+mv val/ILSVRC2012_val_00011115.JPEG n03764736/
+mv val/ILSVRC2012_val_00011116.JPEG n02817516/
+mv val/ILSVRC2012_val_00011117.JPEG n04326547/
+mv val/ILSVRC2012_val_00011118.JPEG n03065424/
+mv val/ILSVRC2012_val_00011119.JPEG n02909870/
+mv val/ILSVRC2012_val_00011120.JPEG n01675722/
+mv val/ILSVRC2012_val_00011121.JPEG n04522168/
+mv val/ILSVRC2012_val_00011122.JPEG n13133613/
+mv val/ILSVRC2012_val_00011123.JPEG n02655020/
+mv val/ILSVRC2012_val_00011124.JPEG n04209133/
+mv val/ILSVRC2012_val_00011125.JPEG n02783161/
+mv val/ILSVRC2012_val_00011126.JPEG n03796401/
+mv val/ILSVRC2012_val_00011127.JPEG n03250847/
+mv val/ILSVRC2012_val_00011128.JPEG n01872401/
+mv val/ILSVRC2012_val_00011129.JPEG n01682714/
+mv val/ILSVRC2012_val_00011130.JPEG n01873310/
+mv val/ILSVRC2012_val_00011131.JPEG n01631663/
+mv val/ILSVRC2012_val_00011132.JPEG n04005630/
+mv val/ILSVRC2012_val_00011133.JPEG n02843684/
+mv val/ILSVRC2012_val_00011134.JPEG n02769748/
+mv val/ILSVRC2012_val_00011135.JPEG n02804610/
+mv val/ILSVRC2012_val_00011136.JPEG n03782006/
+mv val/ILSVRC2012_val_00011137.JPEG n01978455/
+mv val/ILSVRC2012_val_00011138.JPEG n02097298/
+mv val/ILSVRC2012_val_00011139.JPEG n02787622/
+mv val/ILSVRC2012_val_00011140.JPEG n07716906/
+mv val/ILSVRC2012_val_00011141.JPEG n02111129/
+mv val/ILSVRC2012_val_00011142.JPEG n02123045/
+mv val/ILSVRC2012_val_00011143.JPEG n02279972/
+mv val/ILSVRC2012_val_00011144.JPEG n02497673/
+mv val/ILSVRC2012_val_00011145.JPEG n02980441/
+mv val/ILSVRC2012_val_00011146.JPEG n02111129/
+mv val/ILSVRC2012_val_00011147.JPEG n03297495/
+mv val/ILSVRC2012_val_00011148.JPEG n04487081/
+mv val/ILSVRC2012_val_00011149.JPEG n04370456/
+mv val/ILSVRC2012_val_00011150.JPEG n01667778/
+mv val/ILSVRC2012_val_00011151.JPEG n03710193/
+mv val/ILSVRC2012_val_00011152.JPEG n02096294/
+mv val/ILSVRC2012_val_00011153.JPEG n03876231/
+mv val/ILSVRC2012_val_00011154.JPEG n03938244/
+mv val/ILSVRC2012_val_00011155.JPEG n02950826/
+mv val/ILSVRC2012_val_00011156.JPEG n04311174/
+mv val/ILSVRC2012_val_00011157.JPEG n04081281/
+mv val/ILSVRC2012_val_00011158.JPEG n01687978/
+mv val/ILSVRC2012_val_00011159.JPEG n04371774/
+mv val/ILSVRC2012_val_00011160.JPEG n06794110/
+mv val/ILSVRC2012_val_00011161.JPEG n02281406/
+mv val/ILSVRC2012_val_00011162.JPEG n04326547/
+mv val/ILSVRC2012_val_00011163.JPEG n02395406/
+mv val/ILSVRC2012_val_00011164.JPEG n02096051/
+mv val/ILSVRC2012_val_00011165.JPEG n02113186/
+mv val/ILSVRC2012_val_00011166.JPEG n04070727/
+mv val/ILSVRC2012_val_00011167.JPEG n02206856/
+mv val/ILSVRC2012_val_00011168.JPEG n02690373/
+mv val/ILSVRC2012_val_00011169.JPEG n01729977/
+mv val/ILSVRC2012_val_00011170.JPEG n03000684/
+mv val/ILSVRC2012_val_00011171.JPEG n01514859/
+mv val/ILSVRC2012_val_00011172.JPEG n03197337/
+mv val/ILSVRC2012_val_00011173.JPEG n03445924/
+mv val/ILSVRC2012_val_00011174.JPEG n04604644/
+mv val/ILSVRC2012_val_00011175.JPEG n02280649/
+mv val/ILSVRC2012_val_00011176.JPEG n02090379/
+mv val/ILSVRC2012_val_00011177.JPEG n02012849/
+mv val/ILSVRC2012_val_00011178.JPEG n01534433/
+mv val/ILSVRC2012_val_00011179.JPEG n07734744/
+mv val/ILSVRC2012_val_00011180.JPEG n03838899/
+mv val/ILSVRC2012_val_00011181.JPEG n02177972/
+mv val/ILSVRC2012_val_00011182.JPEG n04423845/
+mv val/ILSVRC2012_val_00011183.JPEG n03899768/
+mv val/ILSVRC2012_val_00011184.JPEG n02098105/
+mv val/ILSVRC2012_val_00011185.JPEG n03633091/
+mv val/ILSVRC2012_val_00011186.JPEG n02701002/
+mv val/ILSVRC2012_val_00011187.JPEG n04371430/
+mv val/ILSVRC2012_val_00011188.JPEG n02114367/
+mv val/ILSVRC2012_val_00011189.JPEG n03947888/
+mv val/ILSVRC2012_val_00011190.JPEG n01820546/
+mv val/ILSVRC2012_val_00011191.JPEG n02088238/
+mv val/ILSVRC2012_val_00011192.JPEG n03929855/
+mv val/ILSVRC2012_val_00011193.JPEG n04612504/
+mv val/ILSVRC2012_val_00011194.JPEG n02963159/
+mv val/ILSVRC2012_val_00011195.JPEG n02966193/
+mv val/ILSVRC2012_val_00011196.JPEG n02037110/
+mv val/ILSVRC2012_val_00011197.JPEG n03982430/
+mv val/ILSVRC2012_val_00011198.JPEG n02107574/
+mv val/ILSVRC2012_val_00011199.JPEG n02966193/
+mv val/ILSVRC2012_val_00011200.JPEG n04355933/
+mv val/ILSVRC2012_val_00011201.JPEG n03372029/
+mv val/ILSVRC2012_val_00011202.JPEG n02113978/
+mv val/ILSVRC2012_val_00011203.JPEG n04398044/
+mv val/ILSVRC2012_val_00011204.JPEG n02087046/
+mv val/ILSVRC2012_val_00011205.JPEG n02106166/
+mv val/ILSVRC2012_val_00011206.JPEG n04465501/
+mv val/ILSVRC2012_val_00011207.JPEG n03179701/
+mv val/ILSVRC2012_val_00011208.JPEG n10565667/
+mv val/ILSVRC2012_val_00011209.JPEG n03492542/
+mv val/ILSVRC2012_val_00011210.JPEG n01735189/
+mv val/ILSVRC2012_val_00011211.JPEG n02120079/
+mv val/ILSVRC2012_val_00011212.JPEG n02105251/
+mv val/ILSVRC2012_val_00011213.JPEG n01873310/
+mv val/ILSVRC2012_val_00011214.JPEG n02110063/
+mv val/ILSVRC2012_val_00011215.JPEG n03388183/
+mv val/ILSVRC2012_val_00011216.JPEG n02444819/
+mv val/ILSVRC2012_val_00011217.JPEG n02687172/
+mv val/ILSVRC2012_val_00011218.JPEG n01871265/
+mv val/ILSVRC2012_val_00011219.JPEG n02445715/
+mv val/ILSVRC2012_val_00011220.JPEG n04590129/
+mv val/ILSVRC2012_val_00011221.JPEG n12985857/
+mv val/ILSVRC2012_val_00011222.JPEG n01819313/
+mv val/ILSVRC2012_val_00011223.JPEG n03938244/
+mv val/ILSVRC2012_val_00011224.JPEG n02443114/
+mv val/ILSVRC2012_val_00011225.JPEG n04380533/
+mv val/ILSVRC2012_val_00011226.JPEG n04277352/
+mv val/ILSVRC2012_val_00011227.JPEG n02444819/
+mv val/ILSVRC2012_val_00011228.JPEG n02536864/
+mv val/ILSVRC2012_val_00011229.JPEG n02111277/
+mv val/ILSVRC2012_val_00011230.JPEG n02948072/
+mv val/ILSVRC2012_val_00011231.JPEG n03938244/
+mv val/ILSVRC2012_val_00011232.JPEG n07753113/
+mv val/ILSVRC2012_val_00011233.JPEG n01440764/
+mv val/ILSVRC2012_val_00011234.JPEG n09193705/
+mv val/ILSVRC2012_val_00011235.JPEG n02509815/
+mv val/ILSVRC2012_val_00011236.JPEG n01770393/
+mv val/ILSVRC2012_val_00011237.JPEG n01828970/
+mv val/ILSVRC2012_val_00011238.JPEG n03794056/
+mv val/ILSVRC2012_val_00011239.JPEG n03902125/
+mv val/ILSVRC2012_val_00011240.JPEG n02097474/
+mv val/ILSVRC2012_val_00011241.JPEG n07714571/
+mv val/ILSVRC2012_val_00011242.JPEG n02107908/
+mv val/ILSVRC2012_val_00011243.JPEG n01698640/
+mv val/ILSVRC2012_val_00011244.JPEG n04590129/
+mv val/ILSVRC2012_val_00011245.JPEG n02481823/
+mv val/ILSVRC2012_val_00011246.JPEG n04418357/
+mv val/ILSVRC2012_val_00011247.JPEG n02504013/
+mv val/ILSVRC2012_val_00011248.JPEG n02815834/
+mv val/ILSVRC2012_val_00011249.JPEG n01530575/
+mv val/ILSVRC2012_val_00011250.JPEG n03131574/
+mv val/ILSVRC2012_val_00011251.JPEG n02104365/
+mv val/ILSVRC2012_val_00011252.JPEG n04204238/
+mv val/ILSVRC2012_val_00011253.JPEG n02454379/
+mv val/ILSVRC2012_val_00011254.JPEG n04147183/
+mv val/ILSVRC2012_val_00011255.JPEG n02077923/
+mv val/ILSVRC2012_val_00011256.JPEG n02488291/
+mv val/ILSVRC2012_val_00011257.JPEG n02342885/
+mv val/ILSVRC2012_val_00011258.JPEG n02097474/
+mv val/ILSVRC2012_val_00011259.JPEG n07716358/
+mv val/ILSVRC2012_val_00011260.JPEG n03337140/
+mv val/ILSVRC2012_val_00011261.JPEG n04417672/
+mv val/ILSVRC2012_val_00011262.JPEG n01694178/
+mv val/ILSVRC2012_val_00011263.JPEG n04311004/
+mv val/ILSVRC2012_val_00011264.JPEG n06785654/
+mv val/ILSVRC2012_val_00011265.JPEG n07768694/
+mv val/ILSVRC2012_val_00011266.JPEG n04149813/
+mv val/ILSVRC2012_val_00011267.JPEG n01560419/
+mv val/ILSVRC2012_val_00011268.JPEG n03970156/
+mv val/ILSVRC2012_val_00011269.JPEG n04125021/
+mv val/ILSVRC2012_val_00011270.JPEG n09428293/
+mv val/ILSVRC2012_val_00011271.JPEG n04258138/
+mv val/ILSVRC2012_val_00011272.JPEG n03720891/
+mv val/ILSVRC2012_val_00011273.JPEG n04086273/
+mv val/ILSVRC2012_val_00011274.JPEG n02804610/
+mv val/ILSVRC2012_val_00011275.JPEG n03642806/
+mv val/ILSVRC2012_val_00011276.JPEG n03133878/
+mv val/ILSVRC2012_val_00011277.JPEG n02974003/
+mv val/ILSVRC2012_val_00011278.JPEG n01629819/
+mv val/ILSVRC2012_val_00011279.JPEG n03983396/
+mv val/ILSVRC2012_val_00011280.JPEG n04154565/
+mv val/ILSVRC2012_val_00011281.JPEG n02483362/
+mv val/ILSVRC2012_val_00011282.JPEG n04019541/
+mv val/ILSVRC2012_val_00011283.JPEG n03065424/
+mv val/ILSVRC2012_val_00011284.JPEG n04040759/
+mv val/ILSVRC2012_val_00011285.JPEG n06596364/
+mv val/ILSVRC2012_val_00011286.JPEG n04131690/
+mv val/ILSVRC2012_val_00011287.JPEG n01770393/
+mv val/ILSVRC2012_val_00011288.JPEG n04550184/
+mv val/ILSVRC2012_val_00011289.JPEG n02120079/
+mv val/ILSVRC2012_val_00011290.JPEG n03255030/
+mv val/ILSVRC2012_val_00011291.JPEG n02326432/
+mv val/ILSVRC2012_val_00011292.JPEG n03344393/
+mv val/ILSVRC2012_val_00011293.JPEG n12985857/
+mv val/ILSVRC2012_val_00011294.JPEG n01675722/
+mv val/ILSVRC2012_val_00011295.JPEG n01729322/
+mv val/ILSVRC2012_val_00011296.JPEG n02112137/
+mv val/ILSVRC2012_val_00011297.JPEG n04398044/
+mv val/ILSVRC2012_val_00011298.JPEG n02013706/
+mv val/ILSVRC2012_val_00011299.JPEG n04162706/
+mv val/ILSVRC2012_val_00011300.JPEG n04069434/
+mv val/ILSVRC2012_val_00011301.JPEG n03630383/
+mv val/ILSVRC2012_val_00011302.JPEG n02840245/
+mv val/ILSVRC2012_val_00011303.JPEG n01644900/
+mv val/ILSVRC2012_val_00011304.JPEG n03680355/
+mv val/ILSVRC2012_val_00011305.JPEG n04229816/
+mv val/ILSVRC2012_val_00011306.JPEG n09193705/
+mv val/ILSVRC2012_val_00011307.JPEG n02788148/
+mv val/ILSVRC2012_val_00011308.JPEG n04462240/
+mv val/ILSVRC2012_val_00011309.JPEG n03775546/
+mv val/ILSVRC2012_val_00011310.JPEG n06596364/
+mv val/ILSVRC2012_val_00011311.JPEG n02090721/
+mv val/ILSVRC2012_val_00011312.JPEG n03388183/
+mv val/ILSVRC2012_val_00011313.JPEG n04252077/
+mv val/ILSVRC2012_val_00011314.JPEG n03042490/
+mv val/ILSVRC2012_val_00011315.JPEG n01843065/
+mv val/ILSVRC2012_val_00011316.JPEG n02111129/
+mv val/ILSVRC2012_val_00011317.JPEG n01616318/
+mv val/ILSVRC2012_val_00011318.JPEG n04409515/
+mv val/ILSVRC2012_val_00011319.JPEG n10148035/
+mv val/ILSVRC2012_val_00011320.JPEG n01677366/
+mv val/ILSVRC2012_val_00011321.JPEG n02655020/
+mv val/ILSVRC2012_val_00011322.JPEG n02107683/
+mv val/ILSVRC2012_val_00011323.JPEG n02105162/
+mv val/ILSVRC2012_val_00011324.JPEG n03888257/
+mv val/ILSVRC2012_val_00011325.JPEG n02128925/
+mv val/ILSVRC2012_val_00011326.JPEG n03868863/
+mv val/ILSVRC2012_val_00011327.JPEG n04069434/
+mv val/ILSVRC2012_val_00011328.JPEG n01773797/
+mv val/ILSVRC2012_val_00011329.JPEG n03792782/
+mv val/ILSVRC2012_val_00011330.JPEG n03792782/
+mv val/ILSVRC2012_val_00011331.JPEG n01560419/
+mv val/ILSVRC2012_val_00011332.JPEG n07742313/
+mv val/ILSVRC2012_val_00011333.JPEG n13054560/
+mv val/ILSVRC2012_val_00011334.JPEG n02981792/
+mv val/ILSVRC2012_val_00011335.JPEG n03916031/
+mv val/ILSVRC2012_val_00011336.JPEG n03623198/
+mv val/ILSVRC2012_val_00011337.JPEG n04146614/
+mv val/ILSVRC2012_val_00011338.JPEG n11879895/
+mv val/ILSVRC2012_val_00011339.JPEG n01675722/
+mv val/ILSVRC2012_val_00011340.JPEG n02097130/
+mv val/ILSVRC2012_val_00011341.JPEG n04423845/
+mv val/ILSVRC2012_val_00011342.JPEG n02089973/
+mv val/ILSVRC2012_val_00011343.JPEG n04592741/
+mv val/ILSVRC2012_val_00011344.JPEG n01968897/
+mv val/ILSVRC2012_val_00011345.JPEG n07718747/
+mv val/ILSVRC2012_val_00011346.JPEG n02992529/
+mv val/ILSVRC2012_val_00011347.JPEG n07753275/
+mv val/ILSVRC2012_val_00011348.JPEG n07745940/
+mv val/ILSVRC2012_val_00011349.JPEG n02108422/
+mv val/ILSVRC2012_val_00011350.JPEG n02804414/
+mv val/ILSVRC2012_val_00011351.JPEG n02342885/
+mv val/ILSVRC2012_val_00011352.JPEG n03379051/
+mv val/ILSVRC2012_val_00011353.JPEG n02457408/
+mv val/ILSVRC2012_val_00011354.JPEG n02437312/
+mv val/ILSVRC2012_val_00011355.JPEG n03787032/
+mv val/ILSVRC2012_val_00011356.JPEG n02091032/
+mv val/ILSVRC2012_val_00011357.JPEG n02002556/
+mv val/ILSVRC2012_val_00011358.JPEG n03666591/
+mv val/ILSVRC2012_val_00011359.JPEG n03717622/
+mv val/ILSVRC2012_val_00011360.JPEG n07831146/
+mv val/ILSVRC2012_val_00011361.JPEG n03208938/
+mv val/ILSVRC2012_val_00011362.JPEG n02840245/
+mv val/ILSVRC2012_val_00011363.JPEG n03891332/
+mv val/ILSVRC2012_val_00011364.JPEG n04589890/
+mv val/ILSVRC2012_val_00011365.JPEG n03887697/
+mv val/ILSVRC2012_val_00011366.JPEG n04141076/
+mv val/ILSVRC2012_val_00011367.JPEG n03770439/
+mv val/ILSVRC2012_val_00011368.JPEG n02113023/
+mv val/ILSVRC2012_val_00011369.JPEG n02009912/
+mv val/ILSVRC2012_val_00011370.JPEG n02823750/
+mv val/ILSVRC2012_val_00011371.JPEG n04252077/
+mv val/ILSVRC2012_val_00011372.JPEG n02396427/
+mv val/ILSVRC2012_val_00011373.JPEG n02099601/
+mv val/ILSVRC2012_val_00011374.JPEG n02279972/
+mv val/ILSVRC2012_val_00011375.JPEG n01843383/
+mv val/ILSVRC2012_val_00011376.JPEG n02749479/
+mv val/ILSVRC2012_val_00011377.JPEG n04228054/
+mv val/ILSVRC2012_val_00011378.JPEG n04590129/
+mv val/ILSVRC2012_val_00011379.JPEG n01773797/
+mv val/ILSVRC2012_val_00011380.JPEG n02027492/
+mv val/ILSVRC2012_val_00011381.JPEG n02093428/
+mv val/ILSVRC2012_val_00011382.JPEG n02259212/
+mv val/ILSVRC2012_val_00011383.JPEG n01910747/
+mv val/ILSVRC2012_val_00011384.JPEG n02088364/
+mv val/ILSVRC2012_val_00011385.JPEG n02093754/
+mv val/ILSVRC2012_val_00011386.JPEG n07860988/
+mv val/ILSVRC2012_val_00011387.JPEG n02093428/
+mv val/ILSVRC2012_val_00011388.JPEG n01494475/
+mv val/ILSVRC2012_val_00011389.JPEG n03888605/
+mv val/ILSVRC2012_val_00011390.JPEG n04589890/
+mv val/ILSVRC2012_val_00011391.JPEG n02092339/
+mv val/ILSVRC2012_val_00011392.JPEG n07584110/
+mv val/ILSVRC2012_val_00011393.JPEG n02190166/
+mv val/ILSVRC2012_val_00011394.JPEG n02096051/
+mv val/ILSVRC2012_val_00011395.JPEG n04023962/
+mv val/ILSVRC2012_val_00011396.JPEG n02484975/
+mv val/ILSVRC2012_val_00011397.JPEG n03980874/
+mv val/ILSVRC2012_val_00011398.JPEG n02870880/
+mv val/ILSVRC2012_val_00011399.JPEG n01807496/
+mv val/ILSVRC2012_val_00011400.JPEG n02090721/
+mv val/ILSVRC2012_val_00011401.JPEG n02011460/
+mv val/ILSVRC2012_val_00011402.JPEG n02033041/
+mv val/ILSVRC2012_val_00011403.JPEG n01514668/
+mv val/ILSVRC2012_val_00011404.JPEG n02094114/
+mv val/ILSVRC2012_val_00011405.JPEG n02687172/
+mv val/ILSVRC2012_val_00011406.JPEG n02013706/
+mv val/ILSVRC2012_val_00011407.JPEG n04523525/
+mv val/ILSVRC2012_val_00011408.JPEG n07718747/
+mv val/ILSVRC2012_val_00011409.JPEG n02361337/
+mv val/ILSVRC2012_val_00011410.JPEG n07720875/
+mv val/ILSVRC2012_val_00011411.JPEG n04005630/
+mv val/ILSVRC2012_val_00011412.JPEG n04509417/
+mv val/ILSVRC2012_val_00011413.JPEG n07613480/
+mv val/ILSVRC2012_val_00011414.JPEG n01622779/
+mv val/ILSVRC2012_val_00011415.JPEG n03131574/
+mv val/ILSVRC2012_val_00011416.JPEG n01631663/
+mv val/ILSVRC2012_val_00011417.JPEG n02701002/
+mv val/ILSVRC2012_val_00011418.JPEG n03014705/
+mv val/ILSVRC2012_val_00011419.JPEG n02607072/
+mv val/ILSVRC2012_val_00011420.JPEG n01560419/
+mv val/ILSVRC2012_val_00011421.JPEG n03197337/
+mv val/ILSVRC2012_val_00011422.JPEG n09193705/
+mv val/ILSVRC2012_val_00011423.JPEG n02099849/
+mv val/ILSVRC2012_val_00011424.JPEG n03000134/
+mv val/ILSVRC2012_val_00011425.JPEG n02480495/
+mv val/ILSVRC2012_val_00011426.JPEG n03733805/
+mv val/ILSVRC2012_val_00011427.JPEG n07802026/
+mv val/ILSVRC2012_val_00011428.JPEG n01749939/
+mv val/ILSVRC2012_val_00011429.JPEG n03956157/
+mv val/ILSVRC2012_val_00011430.JPEG n01955084/
+mv val/ILSVRC2012_val_00011431.JPEG n03445777/
+mv val/ILSVRC2012_val_00011432.JPEG n02927161/
+mv val/ILSVRC2012_val_00011433.JPEG n02105162/
+mv val/ILSVRC2012_val_00011434.JPEG n02088238/
+mv val/ILSVRC2012_val_00011435.JPEG n06794110/
+mv val/ILSVRC2012_val_00011436.JPEG n09332890/
+mv val/ILSVRC2012_val_00011437.JPEG n02823428/
+mv val/ILSVRC2012_val_00011438.JPEG n03773504/
+mv val/ILSVRC2012_val_00011439.JPEG n03657121/
+mv val/ILSVRC2012_val_00011440.JPEG n04044716/
+mv val/ILSVRC2012_val_00011441.JPEG n07760859/
+mv val/ILSVRC2012_val_00011442.JPEG n03207941/
+mv val/ILSVRC2012_val_00011443.JPEG n07717410/
+mv val/ILSVRC2012_val_00011444.JPEG n01664065/
+mv val/ILSVRC2012_val_00011445.JPEG n03291819/
+mv val/ILSVRC2012_val_00011446.JPEG n01580077/
+mv val/ILSVRC2012_val_00011447.JPEG n02132136/
+mv val/ILSVRC2012_val_00011448.JPEG n01687978/
+mv val/ILSVRC2012_val_00011449.JPEG n09332890/
+mv val/ILSVRC2012_val_00011450.JPEG n04590129/
+mv val/ILSVRC2012_val_00011451.JPEG n04487081/
+mv val/ILSVRC2012_val_00011452.JPEG n03838899/
+mv val/ILSVRC2012_val_00011453.JPEG n01981276/
+mv val/ILSVRC2012_val_00011454.JPEG n03899768/
+mv val/ILSVRC2012_val_00011455.JPEG n04004767/
+mv val/ILSVRC2012_val_00011456.JPEG n03207743/
+mv val/ILSVRC2012_val_00011457.JPEG n02106166/
+mv val/ILSVRC2012_val_00011458.JPEG n07873807/
+mv val/ILSVRC2012_val_00011459.JPEG n04039381/
+mv val/ILSVRC2012_val_00011460.JPEG n03388549/
+mv val/ILSVRC2012_val_00011461.JPEG n03977966/
+mv val/ILSVRC2012_val_00011462.JPEG n03384352/
+mv val/ILSVRC2012_val_00011463.JPEG n02114367/
+mv val/ILSVRC2012_val_00011464.JPEG n07695742/
+mv val/ILSVRC2012_val_00011465.JPEG n02105412/
+mv val/ILSVRC2012_val_00011466.JPEG n04591157/
+mv val/ILSVRC2012_val_00011467.JPEG n01729322/
+mv val/ILSVRC2012_val_00011468.JPEG n02066245/
+mv val/ILSVRC2012_val_00011469.JPEG n03938244/
+mv val/ILSVRC2012_val_00011470.JPEG n03240683/
+mv val/ILSVRC2012_val_00011471.JPEG n07880968/
+mv val/ILSVRC2012_val_00011472.JPEG n03782006/
+mv val/ILSVRC2012_val_00011473.JPEG n02086646/
+mv val/ILSVRC2012_val_00011474.JPEG n01632777/
+mv val/ILSVRC2012_val_00011475.JPEG n02793495/
+mv val/ILSVRC2012_val_00011476.JPEG n02281406/
+mv val/ILSVRC2012_val_00011477.JPEG n02443484/
+mv val/ILSVRC2012_val_00011478.JPEG n03208938/
+mv val/ILSVRC2012_val_00011479.JPEG n04350905/
+mv val/ILSVRC2012_val_00011480.JPEG n03179701/
+mv val/ILSVRC2012_val_00011481.JPEG n03658185/
+mv val/ILSVRC2012_val_00011482.JPEG n02480855/
+mv val/ILSVRC2012_val_00011483.JPEG n01737021/
+mv val/ILSVRC2012_val_00011484.JPEG n09256479/
+mv val/ILSVRC2012_val_00011485.JPEG n04357314/
+mv val/ILSVRC2012_val_00011486.JPEG n03424325/
+mv val/ILSVRC2012_val_00011487.JPEG n02807133/
+mv val/ILSVRC2012_val_00011488.JPEG n01855032/
+mv val/ILSVRC2012_val_00011489.JPEG n01828970/
+mv val/ILSVRC2012_val_00011490.JPEG n03980874/
+mv val/ILSVRC2012_val_00011491.JPEG n02107683/
+mv val/ILSVRC2012_val_00011492.JPEG n03895866/
+mv val/ILSVRC2012_val_00011493.JPEG n07768694/
+mv val/ILSVRC2012_val_00011494.JPEG n02090721/
+mv val/ILSVRC2012_val_00011495.JPEG n02110958/
+mv val/ILSVRC2012_val_00011496.JPEG n02669723/
+mv val/ILSVRC2012_val_00011497.JPEG n04599235/
+mv val/ILSVRC2012_val_00011498.JPEG n02105641/
+mv val/ILSVRC2012_val_00011499.JPEG n02692877/
+mv val/ILSVRC2012_val_00011500.JPEG n02927161/
+mv val/ILSVRC2012_val_00011501.JPEG n01582220/
+mv val/ILSVRC2012_val_00011502.JPEG n02325366/
+mv val/ILSVRC2012_val_00011503.JPEG n04039381/
+mv val/ILSVRC2012_val_00011504.JPEG n02790996/
+mv val/ILSVRC2012_val_00011505.JPEG n07760859/
+mv val/ILSVRC2012_val_00011506.JPEG n02114712/
+mv val/ILSVRC2012_val_00011507.JPEG n02099712/
+mv val/ILSVRC2012_val_00011508.JPEG n04275548/
+mv val/ILSVRC2012_val_00011509.JPEG n04366367/
+mv val/ILSVRC2012_val_00011510.JPEG n02687172/
+mv val/ILSVRC2012_val_00011511.JPEG n02113624/
+mv val/ILSVRC2012_val_00011512.JPEG n02454379/
+mv val/ILSVRC2012_val_00011513.JPEG n04120489/
+mv val/ILSVRC2012_val_00011514.JPEG n03785016/
+mv val/ILSVRC2012_val_00011515.JPEG n02279972/
+mv val/ILSVRC2012_val_00011516.JPEG n04209239/
+mv val/ILSVRC2012_val_00011517.JPEG n01677366/
+mv val/ILSVRC2012_val_00011518.JPEG n01682714/
+mv val/ILSVRC2012_val_00011519.JPEG n01601694/
+mv val/ILSVRC2012_val_00011520.JPEG n02483708/
+mv val/ILSVRC2012_val_00011521.JPEG n07718747/
+mv val/ILSVRC2012_val_00011522.JPEG n04344873/
+mv val/ILSVRC2012_val_00011523.JPEG n02483362/
+mv val/ILSVRC2012_val_00011524.JPEG n07717556/
+mv val/ILSVRC2012_val_00011525.JPEG n01981276/
+mv val/ILSVRC2012_val_00011526.JPEG n02699494/
+mv val/ILSVRC2012_val_00011527.JPEG n03160309/
+mv val/ILSVRC2012_val_00011528.JPEG n02123597/
+mv val/ILSVRC2012_val_00011529.JPEG n03970156/
+mv val/ILSVRC2012_val_00011530.JPEG n01669191/
+mv val/ILSVRC2012_val_00011531.JPEG n01756291/
+mv val/ILSVRC2012_val_00011532.JPEG n02606052/
+mv val/ILSVRC2012_val_00011533.JPEG n02795169/
+mv val/ILSVRC2012_val_00011534.JPEG n03478589/
+mv val/ILSVRC2012_val_00011535.JPEG n02259212/
+mv val/ILSVRC2012_val_00011536.JPEG n06785654/
+mv val/ILSVRC2012_val_00011537.JPEG n02114712/
+mv val/ILSVRC2012_val_00011538.JPEG n04311174/
+mv val/ILSVRC2012_val_00011539.JPEG n03891332/
+mv val/ILSVRC2012_val_00011540.JPEG n04443257/
+mv val/ILSVRC2012_val_00011541.JPEG n01687978/
+mv val/ILSVRC2012_val_00011542.JPEG n04259630/
+mv val/ILSVRC2012_val_00011543.JPEG n02128925/
+mv val/ILSVRC2012_val_00011544.JPEG n02526121/
+mv val/ILSVRC2012_val_00011545.JPEG n03447721/
+mv val/ILSVRC2012_val_00011546.JPEG n04239074/
+mv val/ILSVRC2012_val_00011547.JPEG n03877472/
+mv val/ILSVRC2012_val_00011548.JPEG n03710637/
+mv val/ILSVRC2012_val_00011549.JPEG n07711569/
+mv val/ILSVRC2012_val_00011550.JPEG n04153751/
+mv val/ILSVRC2012_val_00011551.JPEG n01682714/
+mv val/ILSVRC2012_val_00011552.JPEG n03598930/
+mv val/ILSVRC2012_val_00011553.JPEG n04131690/
+mv val/ILSVRC2012_val_00011554.JPEG n01819313/
+mv val/ILSVRC2012_val_00011555.JPEG n02085620/
+mv val/ILSVRC2012_val_00011556.JPEG n02113023/
+mv val/ILSVRC2012_val_00011557.JPEG n03133878/
+mv val/ILSVRC2012_val_00011558.JPEG n07768694/
+mv val/ILSVRC2012_val_00011559.JPEG n04579432/
+mv val/ILSVRC2012_val_00011560.JPEG n04532670/
+mv val/ILSVRC2012_val_00011561.JPEG n03976467/
+mv val/ILSVRC2012_val_00011562.JPEG n04326547/
+mv val/ILSVRC2012_val_00011563.JPEG n02951358/
+mv val/ILSVRC2012_val_00011564.JPEG n02279972/
+mv val/ILSVRC2012_val_00011565.JPEG n03000247/
+mv val/ILSVRC2012_val_00011566.JPEG n03837869/
+mv val/ILSVRC2012_val_00011567.JPEG n09288635/
+mv val/ILSVRC2012_val_00011568.JPEG n03196217/
+mv val/ILSVRC2012_val_00011569.JPEG n03733805/
+mv val/ILSVRC2012_val_00011570.JPEG n02111889/
+mv val/ILSVRC2012_val_00011571.JPEG n04286575/
+mv val/ILSVRC2012_val_00011572.JPEG n01985128/
+mv val/ILSVRC2012_val_00011573.JPEG n02105056/
+mv val/ILSVRC2012_val_00011574.JPEG n02783161/
+mv val/ILSVRC2012_val_00011575.JPEG n03902125/
+mv val/ILSVRC2012_val_00011576.JPEG n02643566/
+mv val/ILSVRC2012_val_00011577.JPEG n04553703/
+mv val/ILSVRC2012_val_00011578.JPEG n03787032/
+mv val/ILSVRC2012_val_00011579.JPEG n02799071/
+mv val/ILSVRC2012_val_00011580.JPEG n02137549/
+mv val/ILSVRC2012_val_00011581.JPEG n03445777/
+mv val/ILSVRC2012_val_00011582.JPEG n03240683/
+mv val/ILSVRC2012_val_00011583.JPEG n02093256/
+mv val/ILSVRC2012_val_00011584.JPEG n01847000/
+mv val/ILSVRC2012_val_00011585.JPEG n01978455/
+mv val/ILSVRC2012_val_00011586.JPEG n02089973/
+mv val/ILSVRC2012_val_00011587.JPEG n03482405/
+mv val/ILSVRC2012_val_00011588.JPEG n06874185/
+mv val/ILSVRC2012_val_00011589.JPEG n02280649/
+mv val/ILSVRC2012_val_00011590.JPEG n02129604/
+mv val/ILSVRC2012_val_00011591.JPEG n02892767/
+mv val/ILSVRC2012_val_00011592.JPEG n02480495/
+mv val/ILSVRC2012_val_00011593.JPEG n02106662/
+mv val/ILSVRC2012_val_00011594.JPEG n12144580/
+mv val/ILSVRC2012_val_00011595.JPEG n03599486/
+mv val/ILSVRC2012_val_00011596.JPEG n02066245/
+mv val/ILSVRC2012_val_00011597.JPEG n02454379/
+mv val/ILSVRC2012_val_00011598.JPEG n01873310/
+mv val/ILSVRC2012_val_00011599.JPEG n03690938/
+mv val/ILSVRC2012_val_00011600.JPEG n02389026/
+mv val/ILSVRC2012_val_00011601.JPEG n02264363/
+mv val/ILSVRC2012_val_00011602.JPEG n02966193/
+mv val/ILSVRC2012_val_00011603.JPEG n02500267/
+mv val/ILSVRC2012_val_00011604.JPEG n03538406/
+mv val/ILSVRC2012_val_00011605.JPEG n01843065/
+mv val/ILSVRC2012_val_00011606.JPEG n04254680/
+mv val/ILSVRC2012_val_00011607.JPEG n04346328/
+mv val/ILSVRC2012_val_00011608.JPEG n03961711/
+mv val/ILSVRC2012_val_00011609.JPEG n03970156/
+mv val/ILSVRC2012_val_00011610.JPEG n03207941/
+mv val/ILSVRC2012_val_00011611.JPEG n03791053/
+mv val/ILSVRC2012_val_00011612.JPEG n02085936/
+mv val/ILSVRC2012_val_00011613.JPEG n03954731/
+mv val/ILSVRC2012_val_00011614.JPEG n03857828/
+mv val/ILSVRC2012_val_00011615.JPEG n02807133/
+mv val/ILSVRC2012_val_00011616.JPEG n02443114/
+mv val/ILSVRC2012_val_00011617.JPEG n02219486/
+mv val/ILSVRC2012_val_00011618.JPEG n03670208/
+mv val/ILSVRC2012_val_00011619.JPEG n04263257/
+mv val/ILSVRC2012_val_00011620.JPEG n03110669/
+mv val/ILSVRC2012_val_00011621.JPEG n01795545/
+mv val/ILSVRC2012_val_00011622.JPEG n03467068/
+mv val/ILSVRC2012_val_00011623.JPEG n02115913/
+mv val/ILSVRC2012_val_00011624.JPEG n02119789/
+mv val/ILSVRC2012_val_00011625.JPEG n04487081/
+mv val/ILSVRC2012_val_00011626.JPEG n02791124/
+mv val/ILSVRC2012_val_00011627.JPEG n04201297/
+mv val/ILSVRC2012_val_00011628.JPEG n04265275/
+mv val/ILSVRC2012_val_00011629.JPEG n01784675/
+mv val/ILSVRC2012_val_00011630.JPEG n02814533/
+mv val/ILSVRC2012_val_00011631.JPEG n02417914/
+mv val/ILSVRC2012_val_00011632.JPEG n07932039/
+mv val/ILSVRC2012_val_00011633.JPEG n02606052/
+mv val/ILSVRC2012_val_00011634.JPEG n01768244/
+mv val/ILSVRC2012_val_00011635.JPEG n04311004/
+mv val/ILSVRC2012_val_00011636.JPEG n03662601/
+mv val/ILSVRC2012_val_00011637.JPEG n02607072/
+mv val/ILSVRC2012_val_00011638.JPEG n01773549/
+mv val/ILSVRC2012_val_00011639.JPEG n02085620/
+mv val/ILSVRC2012_val_00011640.JPEG n02730930/
+mv val/ILSVRC2012_val_00011641.JPEG n04347754/
+mv val/ILSVRC2012_val_00011642.JPEG n02051845/
+mv val/ILSVRC2012_val_00011643.JPEG n01914609/
+mv val/ILSVRC2012_val_00011644.JPEG n03729826/
+mv val/ILSVRC2012_val_00011645.JPEG n02129165/
+mv val/ILSVRC2012_val_00011646.JPEG n01537544/
+mv val/ILSVRC2012_val_00011647.JPEG n03888605/
+mv val/ILSVRC2012_val_00011648.JPEG n03764736/
+mv val/ILSVRC2012_val_00011649.JPEG n04579145/
+mv val/ILSVRC2012_val_00011650.JPEG n01630670/
+mv val/ILSVRC2012_val_00011651.JPEG n01950731/
+mv val/ILSVRC2012_val_00011652.JPEG n03599486/
+mv val/ILSVRC2012_val_00011653.JPEG n03786901/
+mv val/ILSVRC2012_val_00011654.JPEG n04243546/
+mv val/ILSVRC2012_val_00011655.JPEG n04040759/
+mv val/ILSVRC2012_val_00011656.JPEG n03594945/
+mv val/ILSVRC2012_val_00011657.JPEG n01632458/
+mv val/ILSVRC2012_val_00011658.JPEG n02823750/
+mv val/ILSVRC2012_val_00011659.JPEG n04442312/
+mv val/ILSVRC2012_val_00011660.JPEG n02859443/
+mv val/ILSVRC2012_val_00011661.JPEG n01629819/
+mv val/ILSVRC2012_val_00011662.JPEG n04254777/
+mv val/ILSVRC2012_val_00011663.JPEG n04039381/
+mv val/ILSVRC2012_val_00011664.JPEG n01641577/
+mv val/ILSVRC2012_val_00011665.JPEG n04553703/
+mv val/ILSVRC2012_val_00011666.JPEG n03443371/
+mv val/ILSVRC2012_val_00011667.JPEG n04467665/
+mv val/ILSVRC2012_val_00011668.JPEG n03991062/
+mv val/ILSVRC2012_val_00011669.JPEG n02219486/
+mv val/ILSVRC2012_val_00011670.JPEG n02799071/
+mv val/ILSVRC2012_val_00011671.JPEG n04026417/
+mv val/ILSVRC2012_val_00011672.JPEG n03930313/
+mv val/ILSVRC2012_val_00011673.JPEG n02096585/
+mv val/ILSVRC2012_val_00011674.JPEG n03534580/
+mv val/ILSVRC2012_val_00011675.JPEG n07753113/
+mv val/ILSVRC2012_val_00011676.JPEG n03868863/
+mv val/ILSVRC2012_val_00011677.JPEG n01773549/
+mv val/ILSVRC2012_val_00011678.JPEG n03720891/
+mv val/ILSVRC2012_val_00011679.JPEG n02727426/
+mv val/ILSVRC2012_val_00011680.JPEG n02096177/
+mv val/ILSVRC2012_val_00011681.JPEG n03272562/
+mv val/ILSVRC2012_val_00011682.JPEG n02100236/
+mv val/ILSVRC2012_val_00011683.JPEG n03450230/
+mv val/ILSVRC2012_val_00011684.JPEG n03697007/
+mv val/ILSVRC2012_val_00011685.JPEG n02927161/
+mv val/ILSVRC2012_val_00011686.JPEG n01798484/
+mv val/ILSVRC2012_val_00011687.JPEG n02865351/
+mv val/ILSVRC2012_val_00011688.JPEG n01631663/
+mv val/ILSVRC2012_val_00011689.JPEG n02100236/
+mv val/ILSVRC2012_val_00011690.JPEG n03871628/
+mv val/ILSVRC2012_val_00011691.JPEG n03394916/
+mv val/ILSVRC2012_val_00011692.JPEG n03983396/
+mv val/ILSVRC2012_val_00011693.JPEG n03908714/
+mv val/ILSVRC2012_val_00011694.JPEG n02641379/
+mv val/ILSVRC2012_val_00011695.JPEG n07892512/
+mv val/ILSVRC2012_val_00011696.JPEG n01877812/
+mv val/ILSVRC2012_val_00011697.JPEG n01824575/
+mv val/ILSVRC2012_val_00011698.JPEG n02106030/
+mv val/ILSVRC2012_val_00011699.JPEG n02100583/
+mv val/ILSVRC2012_val_00011700.JPEG n03424325/
+mv val/ILSVRC2012_val_00011701.JPEG n02106166/
+mv val/ILSVRC2012_val_00011702.JPEG n01682714/
+mv val/ILSVRC2012_val_00011703.JPEG n04456115/
+mv val/ILSVRC2012_val_00011704.JPEG n01784675/
+mv val/ILSVRC2012_val_00011705.JPEG n03868242/
+mv val/ILSVRC2012_val_00011706.JPEG n02100877/
+mv val/ILSVRC2012_val_00011707.JPEG n04033901/
+mv val/ILSVRC2012_val_00011708.JPEG n04266014/
+mv val/ILSVRC2012_val_00011709.JPEG n04332243/
+mv val/ILSVRC2012_val_00011710.JPEG n02443114/
+mv val/ILSVRC2012_val_00011711.JPEG n04487081/
+mv val/ILSVRC2012_val_00011712.JPEG n01774750/
+mv val/ILSVRC2012_val_00011713.JPEG n02129165/
+mv val/ILSVRC2012_val_00011714.JPEG n01984695/
+mv val/ILSVRC2012_val_00011715.JPEG n03769881/
+mv val/ILSVRC2012_val_00011716.JPEG n02422106/
+mv val/ILSVRC2012_val_00011717.JPEG n04328186/
+mv val/ILSVRC2012_val_00011718.JPEG n02108915/
+mv val/ILSVRC2012_val_00011719.JPEG n02088364/
+mv val/ILSVRC2012_val_00011720.JPEG n02795169/
+mv val/ILSVRC2012_val_00011721.JPEG n01773157/
+mv val/ILSVRC2012_val_00011722.JPEG n03063689/
+mv val/ILSVRC2012_val_00011723.JPEG n04326547/
+mv val/ILSVRC2012_val_00011724.JPEG n01644900/
+mv val/ILSVRC2012_val_00011725.JPEG n09229709/
+mv val/ILSVRC2012_val_00011726.JPEG n02133161/
+mv val/ILSVRC2012_val_00011727.JPEG n03016953/
+mv val/ILSVRC2012_val_00011728.JPEG n02085620/
+mv val/ILSVRC2012_val_00011729.JPEG n07565083/
+mv val/ILSVRC2012_val_00011730.JPEG n02317335/
+mv val/ILSVRC2012_val_00011731.JPEG n04485082/
+mv val/ILSVRC2012_val_00011732.JPEG n02125311/
+mv val/ILSVRC2012_val_00011733.JPEG n04591157/
+mv val/ILSVRC2012_val_00011734.JPEG n02396427/
+mv val/ILSVRC2012_val_00011735.JPEG n04347754/
+mv val/ILSVRC2012_val_00011736.JPEG n02129604/
+mv val/ILSVRC2012_val_00011737.JPEG n02422699/
+mv val/ILSVRC2012_val_00011738.JPEG n02123597/
+mv val/ILSVRC2012_val_00011739.JPEG n03388183/
+mv val/ILSVRC2012_val_00011740.JPEG n03590841/
+mv val/ILSVRC2012_val_00011741.JPEG n02807133/
+mv val/ILSVRC2012_val_00011742.JPEG n03676483/
+mv val/ILSVRC2012_val_00011743.JPEG n03255030/
+mv val/ILSVRC2012_val_00011744.JPEG n02174001/
+mv val/ILSVRC2012_val_00011745.JPEG n04536866/
+mv val/ILSVRC2012_val_00011746.JPEG n02104029/
+mv val/ILSVRC2012_val_00011747.JPEG n02817516/
+mv val/ILSVRC2012_val_00011748.JPEG n02087046/
+mv val/ILSVRC2012_val_00011749.JPEG n02085782/
+mv val/ILSVRC2012_val_00011750.JPEG n02115641/
+mv val/ILSVRC2012_val_00011751.JPEG n02086910/
+mv val/ILSVRC2012_val_00011752.JPEG n02834397/
+mv val/ILSVRC2012_val_00011753.JPEG n03201208/
+mv val/ILSVRC2012_val_00011754.JPEG n02086240/
+mv val/ILSVRC2012_val_00011755.JPEG n02454379/
+mv val/ILSVRC2012_val_00011756.JPEG n02422699/
+mv val/ILSVRC2012_val_00011757.JPEG n02106662/
+mv val/ILSVRC2012_val_00011758.JPEG n04560804/
+mv val/ILSVRC2012_val_00011759.JPEG n02699494/
+mv val/ILSVRC2012_val_00011760.JPEG n02871525/
+mv val/ILSVRC2012_val_00011761.JPEG n04591157/
+mv val/ILSVRC2012_val_00011762.JPEG n04149813/
+mv val/ILSVRC2012_val_00011763.JPEG n03920288/
+mv val/ILSVRC2012_val_00011764.JPEG n02099267/
+mv val/ILSVRC2012_val_00011765.JPEG n02105412/
+mv val/ILSVRC2012_val_00011766.JPEG n01667778/
+mv val/ILSVRC2012_val_00011767.JPEG n03535780/
+mv val/ILSVRC2012_val_00011768.JPEG n02085936/
+mv val/ILSVRC2012_val_00011769.JPEG n03344393/
+mv val/ILSVRC2012_val_00011770.JPEG n03871628/
+mv val/ILSVRC2012_val_00011771.JPEG n02268853/
+mv val/ILSVRC2012_val_00011772.JPEG n02276258/
+mv val/ILSVRC2012_val_00011773.JPEG n03773504/
+mv val/ILSVRC2012_val_00011774.JPEG n04505470/
+mv val/ILSVRC2012_val_00011775.JPEG n02895154/
+mv val/ILSVRC2012_val_00011776.JPEG n01740131/
+mv val/ILSVRC2012_val_00011777.JPEG n02101388/
+mv val/ILSVRC2012_val_00011778.JPEG n01847000/
+mv val/ILSVRC2012_val_00011779.JPEG n04111531/
+mv val/ILSVRC2012_val_00011780.JPEG n02280649/
+mv val/ILSVRC2012_val_00011781.JPEG n04509417/
+mv val/ILSVRC2012_val_00011782.JPEG n01496331/
+mv val/ILSVRC2012_val_00011783.JPEG n02264363/
+mv val/ILSVRC2012_val_00011784.JPEG n02109525/
+mv val/ILSVRC2012_val_00011785.JPEG n03372029/
+mv val/ILSVRC2012_val_00011786.JPEG n03903868/
+mv val/ILSVRC2012_val_00011787.JPEG n01796340/
+mv val/ILSVRC2012_val_00011788.JPEG n02988304/
+mv val/ILSVRC2012_val_00011789.JPEG n02486261/
+mv val/ILSVRC2012_val_00011790.JPEG n07932039/
+mv val/ILSVRC2012_val_00011791.JPEG n03841143/
+mv val/ILSVRC2012_val_00011792.JPEG n02089867/
+mv val/ILSVRC2012_val_00011793.JPEG n02099429/
+mv val/ILSVRC2012_val_00011794.JPEG n03062245/
+mv val/ILSVRC2012_val_00011795.JPEG n02799071/
+mv val/ILSVRC2012_val_00011796.JPEG n03485794/
+mv val/ILSVRC2012_val_00011797.JPEG n03944341/
+mv val/ILSVRC2012_val_00011798.JPEG n02090379/
+mv val/ILSVRC2012_val_00011799.JPEG n04370456/
+mv val/ILSVRC2012_val_00011800.JPEG n04125021/
+mv val/ILSVRC2012_val_00011801.JPEG n03929855/
+mv val/ILSVRC2012_val_00011802.JPEG n02110063/
+mv val/ILSVRC2012_val_00011803.JPEG n02794156/
+mv val/ILSVRC2012_val_00011804.JPEG n04141076/
+mv val/ILSVRC2012_val_00011805.JPEG n02085936/
+mv val/ILSVRC2012_val_00011806.JPEG n04606251/
+mv val/ILSVRC2012_val_00011807.JPEG n02099712/
+mv val/ILSVRC2012_val_00011808.JPEG n01773549/
+mv val/ILSVRC2012_val_00011809.JPEG n02992529/
+mv val/ILSVRC2012_val_00011810.JPEG n03347037/
+mv val/ILSVRC2012_val_00011811.JPEG n02120505/
+mv val/ILSVRC2012_val_00011812.JPEG n02727426/
+mv val/ILSVRC2012_val_00011813.JPEG n03483316/
+mv val/ILSVRC2012_val_00011814.JPEG n04479046/
+mv val/ILSVRC2012_val_00011815.JPEG n03544143/
+mv val/ILSVRC2012_val_00011816.JPEG n03888605/
+mv val/ILSVRC2012_val_00011817.JPEG n04548362/
+mv val/ILSVRC2012_val_00011818.JPEG n13037406/
+mv val/ILSVRC2012_val_00011819.JPEG n04044716/
+mv val/ILSVRC2012_val_00011820.JPEG n02259212/
+mv val/ILSVRC2012_val_00011821.JPEG n02835271/
+mv val/ILSVRC2012_val_00011822.JPEG n01797886/
+mv val/ILSVRC2012_val_00011823.JPEG n02823428/
+mv val/ILSVRC2012_val_00011824.JPEG n04086273/
+mv val/ILSVRC2012_val_00011825.JPEG n02127052/
+mv val/ILSVRC2012_val_00011826.JPEG n03133878/
+mv val/ILSVRC2012_val_00011827.JPEG n03733281/
+mv val/ILSVRC2012_val_00011828.JPEG n02676566/
+mv val/ILSVRC2012_val_00011829.JPEG n02667093/
+mv val/ILSVRC2012_val_00011830.JPEG n04026417/
+mv val/ILSVRC2012_val_00011831.JPEG n07932039/
+mv val/ILSVRC2012_val_00011832.JPEG n04252077/
+mv val/ILSVRC2012_val_00011833.JPEG n03976467/
+mv val/ILSVRC2012_val_00011834.JPEG n04366367/
+mv val/ILSVRC2012_val_00011835.JPEG n03443371/
+mv val/ILSVRC2012_val_00011836.JPEG n04346328/
+mv val/ILSVRC2012_val_00011837.JPEG n02112018/
+mv val/ILSVRC2012_val_00011838.JPEG n03781244/
+mv val/ILSVRC2012_val_00011839.JPEG n03459775/
+mv val/ILSVRC2012_val_00011840.JPEG n03876231/
+mv val/ILSVRC2012_val_00011841.JPEG n01534433/
+mv val/ILSVRC2012_val_00011842.JPEG n03017168/
+mv val/ILSVRC2012_val_00011843.JPEG n02808304/
+mv val/ILSVRC2012_val_00011844.JPEG n07730033/
+mv val/ILSVRC2012_val_00011845.JPEG n02169497/
+mv val/ILSVRC2012_val_00011846.JPEG n02514041/
+mv val/ILSVRC2012_val_00011847.JPEG n04458633/
+mv val/ILSVRC2012_val_00011848.JPEG n02002556/
+mv val/ILSVRC2012_val_00011849.JPEG n03980874/
+mv val/ILSVRC2012_val_00011850.JPEG n03131574/
+mv val/ILSVRC2012_val_00011851.JPEG n01807496/
+mv val/ILSVRC2012_val_00011852.JPEG n04330267/
+mv val/ILSVRC2012_val_00011853.JPEG n01773549/
+mv val/ILSVRC2012_val_00011854.JPEG n02123159/
+mv val/ILSVRC2012_val_00011855.JPEG n04204347/
+mv val/ILSVRC2012_val_00011856.JPEG n02395406/
+mv val/ILSVRC2012_val_00011857.JPEG n02321529/
+mv val/ILSVRC2012_val_00011858.JPEG n03124043/
+mv val/ILSVRC2012_val_00011859.JPEG n03617480/
+mv val/ILSVRC2012_val_00011860.JPEG n01910747/
+mv val/ILSVRC2012_val_00011861.JPEG n01784675/
+mv val/ILSVRC2012_val_00011862.JPEG n03733131/
+mv val/ILSVRC2012_val_00011863.JPEG n07875152/
+mv val/ILSVRC2012_val_00011864.JPEG n04599235/
+mv val/ILSVRC2012_val_00011865.JPEG n09428293/
+mv val/ILSVRC2012_val_00011866.JPEG n07565083/
+mv val/ILSVRC2012_val_00011867.JPEG n02206856/
+mv val/ILSVRC2012_val_00011868.JPEG n03127747/
+mv val/ILSVRC2012_val_00011869.JPEG n02086240/
+mv val/ILSVRC2012_val_00011870.JPEG n04146614/
+mv val/ILSVRC2012_val_00011871.JPEG n04532670/
+mv val/ILSVRC2012_val_00011872.JPEG n03259280/
+mv val/ILSVRC2012_val_00011873.JPEG n02104365/
+mv val/ILSVRC2012_val_00011874.JPEG n01855032/
+mv val/ILSVRC2012_val_00011875.JPEG n04366367/
+mv val/ILSVRC2012_val_00011876.JPEG n02977058/
+mv val/ILSVRC2012_val_00011877.JPEG n02444819/
+mv val/ILSVRC2012_val_00011878.JPEG n02088632/
+mv val/ILSVRC2012_val_00011879.JPEG n04562935/
+mv val/ILSVRC2012_val_00011880.JPEG n03891251/
+mv val/ILSVRC2012_val_00011881.JPEG n07718747/
+mv val/ILSVRC2012_val_00011882.JPEG n02783161/
+mv val/ILSVRC2012_val_00011883.JPEG n03929855/
+mv val/ILSVRC2012_val_00011884.JPEG n01872401/
+mv val/ILSVRC2012_val_00011885.JPEG n07693725/
+mv val/ILSVRC2012_val_00011886.JPEG n02859443/
+mv val/ILSVRC2012_val_00011887.JPEG n04370456/
+mv val/ILSVRC2012_val_00011888.JPEG n02259212/
+mv val/ILSVRC2012_val_00011889.JPEG n02231487/
+mv val/ILSVRC2012_val_00011890.JPEG n04065272/
+mv val/ILSVRC2012_val_00011891.JPEG n02361337/
+mv val/ILSVRC2012_val_00011892.JPEG n02395406/
+mv val/ILSVRC2012_val_00011893.JPEG n02094433/
+mv val/ILSVRC2012_val_00011894.JPEG n01833805/
+mv val/ILSVRC2012_val_00011895.JPEG n02097474/
+mv val/ILSVRC2012_val_00011896.JPEG n03868242/
+mv val/ILSVRC2012_val_00011897.JPEG n04041544/
+mv val/ILSVRC2012_val_00011898.JPEG n02493793/
+mv val/ILSVRC2012_val_00011899.JPEG n02174001/
+mv val/ILSVRC2012_val_00011900.JPEG n02085620/
+mv val/ILSVRC2012_val_00011901.JPEG n12620546/
+mv val/ILSVRC2012_val_00011902.JPEG n02412080/
+mv val/ILSVRC2012_val_00011903.JPEG n02808440/
+mv val/ILSVRC2012_val_00011904.JPEG n02489166/
+mv val/ILSVRC2012_val_00011905.JPEG n04069434/
+mv val/ILSVRC2012_val_00011906.JPEG n03763968/
+mv val/ILSVRC2012_val_00011907.JPEG n03721384/
+mv val/ILSVRC2012_val_00011908.JPEG n04522168/
+mv val/ILSVRC2012_val_00011909.JPEG n03527444/
+mv val/ILSVRC2012_val_00011910.JPEG n04147183/
+mv val/ILSVRC2012_val_00011911.JPEG n02277742/
+mv val/ILSVRC2012_val_00011912.JPEG n03743016/
+mv val/ILSVRC2012_val_00011913.JPEG n02490219/
+mv val/ILSVRC2012_val_00011914.JPEG n01443537/
+mv val/ILSVRC2012_val_00011915.JPEG n01534433/
+mv val/ILSVRC2012_val_00011916.JPEG n02965783/
+mv val/ILSVRC2012_val_00011917.JPEG n02106382/
+mv val/ILSVRC2012_val_00011918.JPEG n02007558/
+mv val/ILSVRC2012_val_00011919.JPEG n03908618/
+mv val/ILSVRC2012_val_00011920.JPEG n04357314/
+mv val/ILSVRC2012_val_00011921.JPEG n02108089/
+mv val/ILSVRC2012_val_00011922.JPEG n01980166/
+mv val/ILSVRC2012_val_00011923.JPEG n03642806/
+mv val/ILSVRC2012_val_00011924.JPEG n04090263/
+mv val/ILSVRC2012_val_00011925.JPEG n02093256/
+mv val/ILSVRC2012_val_00011926.JPEG n02841315/
+mv val/ILSVRC2012_val_00011927.JPEG n01695060/
+mv val/ILSVRC2012_val_00011928.JPEG n04152593/
+mv val/ILSVRC2012_val_00011929.JPEG n04532670/
+mv val/ILSVRC2012_val_00011930.JPEG n04201297/
+mv val/ILSVRC2012_val_00011931.JPEG n03476684/
+mv val/ILSVRC2012_val_00011932.JPEG n02236044/
+mv val/ILSVRC2012_val_00011933.JPEG n02769748/
+mv val/ILSVRC2012_val_00011934.JPEG n03187595/
+mv val/ILSVRC2012_val_00011935.JPEG n02841315/
+mv val/ILSVRC2012_val_00011936.JPEG n04081281/
+mv val/ILSVRC2012_val_00011937.JPEG n07873807/
+mv val/ILSVRC2012_val_00011938.JPEG n04548362/
+mv val/ILSVRC2012_val_00011939.JPEG n03595614/
+mv val/ILSVRC2012_val_00011940.JPEG n04532670/
+mv val/ILSVRC2012_val_00011941.JPEG n03047690/
+mv val/ILSVRC2012_val_00011942.JPEG n04552348/
+mv val/ILSVRC2012_val_00011943.JPEG n01806143/
+mv val/ILSVRC2012_val_00011944.JPEG n04542943/
+mv val/ILSVRC2012_val_00011945.JPEG n07717556/
+mv val/ILSVRC2012_val_00011946.JPEG n03782006/
+mv val/ILSVRC2012_val_00011947.JPEG n02107574/
+mv val/ILSVRC2012_val_00011948.JPEG n04118776/
+mv val/ILSVRC2012_val_00011949.JPEG n04523525/
+mv val/ILSVRC2012_val_00011950.JPEG n04141327/
+mv val/ILSVRC2012_val_00011951.JPEG n03000684/
+mv val/ILSVRC2012_val_00011952.JPEG n02124075/
+mv val/ILSVRC2012_val_00011953.JPEG n02667093/
+mv val/ILSVRC2012_val_00011954.JPEG n03976467/
+mv val/ILSVRC2012_val_00011955.JPEG n02965783/
+mv val/ILSVRC2012_val_00011956.JPEG n06785654/
+mv val/ILSVRC2012_val_00011957.JPEG n04548280/
+mv val/ILSVRC2012_val_00011958.JPEG n03840681/
+mv val/ILSVRC2012_val_00011959.JPEG n04243546/
+mv val/ILSVRC2012_val_00011960.JPEG n03447721/
+mv val/ILSVRC2012_val_00011961.JPEG n03720891/
+mv val/ILSVRC2012_val_00011962.JPEG n03825788/
+mv val/ILSVRC2012_val_00011963.JPEG n02791270/
+mv val/ILSVRC2012_val_00011964.JPEG n02870880/
+mv val/ILSVRC2012_val_00011965.JPEG n03535780/
+mv val/ILSVRC2012_val_00011966.JPEG n02165456/
+mv val/ILSVRC2012_val_00011967.JPEG n02132136/
+mv val/ILSVRC2012_val_00011968.JPEG n04044716/
+mv val/ILSVRC2012_val_00011969.JPEG n03970156/
+mv val/ILSVRC2012_val_00011970.JPEG n03692522/
+mv val/ILSVRC2012_val_00011971.JPEG n01744401/
+mv val/ILSVRC2012_val_00011972.JPEG n04418357/
+mv val/ILSVRC2012_val_00011973.JPEG n02167151/
+mv val/ILSVRC2012_val_00011974.JPEG n02790996/
+mv val/ILSVRC2012_val_00011975.JPEG n03903868/
+mv val/ILSVRC2012_val_00011976.JPEG n02860847/
+mv val/ILSVRC2012_val_00011977.JPEG n02417914/
+mv val/ILSVRC2012_val_00011978.JPEG n01985128/
+mv val/ILSVRC2012_val_00011979.JPEG n02281787/
+mv val/ILSVRC2012_val_00011980.JPEG n10148035/
+mv val/ILSVRC2012_val_00011981.JPEG n02974003/
+mv val/ILSVRC2012_val_00011982.JPEG n03777754/
+mv val/ILSVRC2012_val_00011983.JPEG n03445777/
+mv val/ILSVRC2012_val_00011984.JPEG n04532106/
+mv val/ILSVRC2012_val_00011985.JPEG n02085782/
+mv val/ILSVRC2012_val_00011986.JPEG n03452741/
+mv val/ILSVRC2012_val_00011987.JPEG n03670208/
+mv val/ILSVRC2012_val_00011988.JPEG n03866082/
+mv val/ILSVRC2012_val_00011989.JPEG n02105162/
+mv val/ILSVRC2012_val_00011990.JPEG n03220513/
+mv val/ILSVRC2012_val_00011991.JPEG n03529860/
+mv val/ILSVRC2012_val_00011992.JPEG n04376876/
+mv val/ILSVRC2012_val_00011993.JPEG n01440764/
+mv val/ILSVRC2012_val_00011994.JPEG n03498962/
+mv val/ILSVRC2012_val_00011995.JPEG n02687172/
+mv val/ILSVRC2012_val_00011996.JPEG n01665541/
+mv val/ILSVRC2012_val_00011997.JPEG n04344873/
+mv val/ILSVRC2012_val_00011998.JPEG n02489166/
+mv val/ILSVRC2012_val_00011999.JPEG n03384352/
+mv val/ILSVRC2012_val_00012000.JPEG n02443484/
+mv val/ILSVRC2012_val_00012001.JPEG n03976657/
+mv val/ILSVRC2012_val_00012002.JPEG n04540053/
+mv val/ILSVRC2012_val_00012003.JPEG n01817953/
+mv val/ILSVRC2012_val_00012004.JPEG n02098105/
+mv val/ILSVRC2012_val_00012005.JPEG n02655020/
+mv val/ILSVRC2012_val_00012006.JPEG n01756291/
+mv val/ILSVRC2012_val_00012007.JPEG n02099267/
+mv val/ILSVRC2012_val_00012008.JPEG n04141327/
+mv val/ILSVRC2012_val_00012009.JPEG n07734744/
+mv val/ILSVRC2012_val_00012010.JPEG n03690938/
+mv val/ILSVRC2012_val_00012011.JPEG n02133161/
+mv val/ILSVRC2012_val_00012012.JPEG n10148035/
+mv val/ILSVRC2012_val_00012013.JPEG n03461385/
+mv val/ILSVRC2012_val_00012014.JPEG n03840681/
+mv val/ILSVRC2012_val_00012015.JPEG n02099267/
+mv val/ILSVRC2012_val_00012016.JPEG n03908618/
+mv val/ILSVRC2012_val_00012017.JPEG n02483708/
+mv val/ILSVRC2012_val_00012018.JPEG n03710637/
+mv val/ILSVRC2012_val_00012019.JPEG n02804610/
+mv val/ILSVRC2012_val_00012020.JPEG n02906734/
+mv val/ILSVRC2012_val_00012021.JPEG n07836838/
+mv val/ILSVRC2012_val_00012022.JPEG n03930313/
+mv val/ILSVRC2012_val_00012023.JPEG n02786058/
+mv val/ILSVRC2012_val_00012024.JPEG n01795545/
+mv val/ILSVRC2012_val_00012025.JPEG n02804610/
+mv val/ILSVRC2012_val_00012026.JPEG n02095570/
+mv val/ILSVRC2012_val_00012027.JPEG n03447721/
+mv val/ILSVRC2012_val_00012028.JPEG n04311004/
+mv val/ILSVRC2012_val_00012029.JPEG n04229816/
+mv val/ILSVRC2012_val_00012030.JPEG n04208210/
+mv val/ILSVRC2012_val_00012031.JPEG n03710193/
+mv val/ILSVRC2012_val_00012032.JPEG n03584829/
+mv val/ILSVRC2012_val_00012033.JPEG n04355338/
+mv val/ILSVRC2012_val_00012034.JPEG n03146219/
+mv val/ILSVRC2012_val_00012035.JPEG n02085620/
+mv val/ILSVRC2012_val_00012036.JPEG n04522168/
+mv val/ILSVRC2012_val_00012037.JPEG n02106030/
+mv val/ILSVRC2012_val_00012038.JPEG n03908618/
+mv val/ILSVRC2012_val_00012039.JPEG n02113624/
+mv val/ILSVRC2012_val_00012040.JPEG n04429376/
+mv val/ILSVRC2012_val_00012041.JPEG n02100877/
+mv val/ILSVRC2012_val_00012042.JPEG n02894605/
+mv val/ILSVRC2012_val_00012043.JPEG n02088632/
+mv val/ILSVRC2012_val_00012044.JPEG n02490219/
+mv val/ILSVRC2012_val_00012045.JPEG n02264363/
+mv val/ILSVRC2012_val_00012046.JPEG n04204238/
+mv val/ILSVRC2012_val_00012047.JPEG n07717556/
+mv val/ILSVRC2012_val_00012048.JPEG n02699494/
+mv val/ILSVRC2012_val_00012049.JPEG n13040303/
+mv val/ILSVRC2012_val_00012050.JPEG n02782093/
+mv val/ILSVRC2012_val_00012051.JPEG n04238763/
+mv val/ILSVRC2012_val_00012052.JPEG n03935335/
+mv val/ILSVRC2012_val_00012053.JPEG n02111889/
+mv val/ILSVRC2012_val_00012054.JPEG n04147183/
+mv val/ILSVRC2012_val_00012055.JPEG n02089078/
+mv val/ILSVRC2012_val_00012056.JPEG n03598930/
+mv val/ILSVRC2012_val_00012057.JPEG n04131690/
+mv val/ILSVRC2012_val_00012058.JPEG n01534433/
+mv val/ILSVRC2012_val_00012059.JPEG n04039381/
+mv val/ILSVRC2012_val_00012060.JPEG n02113023/
+mv val/ILSVRC2012_val_00012061.JPEG n03649909/
+mv val/ILSVRC2012_val_00012062.JPEG n02804610/
+mv val/ILSVRC2012_val_00012063.JPEG n02950826/
+mv val/ILSVRC2012_val_00012064.JPEG n07695742/
+mv val/ILSVRC2012_val_00012065.JPEG n03899768/
+mv val/ILSVRC2012_val_00012066.JPEG n03662601/
+mv val/ILSVRC2012_val_00012067.JPEG n02100877/
+mv val/ILSVRC2012_val_00012068.JPEG n06359193/
+mv val/ILSVRC2012_val_00012069.JPEG n04270147/
+mv val/ILSVRC2012_val_00012070.JPEG n03527444/
+mv val/ILSVRC2012_val_00012071.JPEG n04023962/
+mv val/ILSVRC2012_val_00012072.JPEG n03207743/
+mv val/ILSVRC2012_val_00012073.JPEG n03691459/
+mv val/ILSVRC2012_val_00012074.JPEG n02086646/
+mv val/ILSVRC2012_val_00012075.JPEG n04456115/
+mv val/ILSVRC2012_val_00012076.JPEG n04335435/
+mv val/ILSVRC2012_val_00012077.JPEG n04493381/
+mv val/ILSVRC2012_val_00012078.JPEG n03355925/
+mv val/ILSVRC2012_val_00012079.JPEG n02128757/
+mv val/ILSVRC2012_val_00012080.JPEG n03710637/
+mv val/ILSVRC2012_val_00012081.JPEG n02749479/
+mv val/ILSVRC2012_val_00012082.JPEG n04111531/
+mv val/ILSVRC2012_val_00012083.JPEG n02669723/
+mv val/ILSVRC2012_val_00012084.JPEG n04591157/
+mv val/ILSVRC2012_val_00012085.JPEG n02106550/
+mv val/ILSVRC2012_val_00012086.JPEG n04069434/
+mv val/ILSVRC2012_val_00012087.JPEG n01669191/
+mv val/ILSVRC2012_val_00012088.JPEG n03496892/
+mv val/ILSVRC2012_val_00012089.JPEG n01855672/
+mv val/ILSVRC2012_val_00012090.JPEG n03803284/
+mv val/ILSVRC2012_val_00012091.JPEG n04371774/
+mv val/ILSVRC2012_val_00012092.JPEG n02965783/
+mv val/ILSVRC2012_val_00012093.JPEG n01955084/
+mv val/ILSVRC2012_val_00012094.JPEG n03710637/
+mv val/ILSVRC2012_val_00012095.JPEG n04147183/
+mv val/ILSVRC2012_val_00012096.JPEG n03792782/
+mv val/ILSVRC2012_val_00012097.JPEG n04597913/
+mv val/ILSVRC2012_val_00012098.JPEG n04266014/
+mv val/ILSVRC2012_val_00012099.JPEG n02790996/
+mv val/ILSVRC2012_val_00012100.JPEG n02099601/
+mv val/ILSVRC2012_val_00012101.JPEG n03627232/
+mv val/ILSVRC2012_val_00012102.JPEG n02219486/
+mv val/ILSVRC2012_val_00012103.JPEG n07760859/
+mv val/ILSVRC2012_val_00012104.JPEG n02877765/
+mv val/ILSVRC2012_val_00012105.JPEG n07715103/
+mv val/ILSVRC2012_val_00012106.JPEG n02259212/
+mv val/ILSVRC2012_val_00012107.JPEG n07747607/
+mv val/ILSVRC2012_val_00012108.JPEG n04376876/
+mv val/ILSVRC2012_val_00012109.JPEG n01748264/
+mv val/ILSVRC2012_val_00012110.JPEG n04317175/
+mv val/ILSVRC2012_val_00012111.JPEG n02687172/
+mv val/ILSVRC2012_val_00012112.JPEG n13037406/
+mv val/ILSVRC2012_val_00012113.JPEG n02321529/
+mv val/ILSVRC2012_val_00012114.JPEG n02981792/
+mv val/ILSVRC2012_val_00012115.JPEG n02992211/
+mv val/ILSVRC2012_val_00012116.JPEG n03891332/
+mv val/ILSVRC2012_val_00012117.JPEG n01944390/
+mv val/ILSVRC2012_val_00012118.JPEG n02398521/
+mv val/ILSVRC2012_val_00012119.JPEG n07753275/
+mv val/ILSVRC2012_val_00012120.JPEG n01687978/
+mv val/ILSVRC2012_val_00012121.JPEG n03325584/
+mv val/ILSVRC2012_val_00012122.JPEG n01806143/
+mv val/ILSVRC2012_val_00012123.JPEG n01795545/
+mv val/ILSVRC2012_val_00012124.JPEG n02256656/
+mv val/ILSVRC2012_val_00012125.JPEG n13133613/
+mv val/ILSVRC2012_val_00012126.JPEG n06785654/
+mv val/ILSVRC2012_val_00012127.JPEG n02236044/
+mv val/ILSVRC2012_val_00012128.JPEG n04033901/
+mv val/ILSVRC2012_val_00012129.JPEG n02892767/
+mv val/ILSVRC2012_val_00012130.JPEG n03792972/
+mv val/ILSVRC2012_val_00012131.JPEG n07753592/
+mv val/ILSVRC2012_val_00012132.JPEG n01580077/
+mv val/ILSVRC2012_val_00012133.JPEG n03535780/
+mv val/ILSVRC2012_val_00012134.JPEG n03602883/
+mv val/ILSVRC2012_val_00012135.JPEG n02423022/
+mv val/ILSVRC2012_val_00012136.JPEG n03599486/
+mv val/ILSVRC2012_val_00012137.JPEG n02279972/
+mv val/ILSVRC2012_val_00012138.JPEG n02655020/
+mv val/ILSVRC2012_val_00012139.JPEG n03637318/
+mv val/ILSVRC2012_val_00012140.JPEG n02108000/
+mv val/ILSVRC2012_val_00012141.JPEG n03355925/
+mv val/ILSVRC2012_val_00012142.JPEG n04486054/
+mv val/ILSVRC2012_val_00012143.JPEG n01986214/
+mv val/ILSVRC2012_val_00012144.JPEG n03014705/
+mv val/ILSVRC2012_val_00012145.JPEG n04599235/
+mv val/ILSVRC2012_val_00012146.JPEG n02107312/
+mv val/ILSVRC2012_val_00012147.JPEG n04522168/
+mv val/ILSVRC2012_val_00012148.JPEG n03782006/
+mv val/ILSVRC2012_val_00012149.JPEG n02091244/
+mv val/ILSVRC2012_val_00012150.JPEG n04238763/
+mv val/ILSVRC2012_val_00012151.JPEG n01641577/
+mv val/ILSVRC2012_val_00012152.JPEG n02268853/
+mv val/ILSVRC2012_val_00012153.JPEG n07711569/
+mv val/ILSVRC2012_val_00012154.JPEG n03662601/
+mv val/ILSVRC2012_val_00012155.JPEG n02102318/
+mv val/ILSVRC2012_val_00012156.JPEG n01677366/
+mv val/ILSVRC2012_val_00012157.JPEG n02097209/
+mv val/ILSVRC2012_val_00012158.JPEG n03763968/
+mv val/ILSVRC2012_val_00012159.JPEG n03786901/
+mv val/ILSVRC2012_val_00012160.JPEG n02509815/
+mv val/ILSVRC2012_val_00012161.JPEG n02086910/
+mv val/ILSVRC2012_val_00012162.JPEG n06794110/
+mv val/ILSVRC2012_val_00012163.JPEG n07920052/
+mv val/ILSVRC2012_val_00012164.JPEG n03379051/
+mv val/ILSVRC2012_val_00012165.JPEG n02346627/
+mv val/ILSVRC2012_val_00012166.JPEG n02018795/
+mv val/ILSVRC2012_val_00012167.JPEG n02480495/
+mv val/ILSVRC2012_val_00012168.JPEG n07711569/
+mv val/ILSVRC2012_val_00012169.JPEG n04532670/
+mv val/ILSVRC2012_val_00012170.JPEG n02099712/
+mv val/ILSVRC2012_val_00012171.JPEG n02110806/
+mv val/ILSVRC2012_val_00012172.JPEG n03759954/
+mv val/ILSVRC2012_val_00012173.JPEG n02123597/
+mv val/ILSVRC2012_val_00012174.JPEG n04154565/
+mv val/ILSVRC2012_val_00012175.JPEG n03347037/
+mv val/ILSVRC2012_val_00012176.JPEG n02077923/
+mv val/ILSVRC2012_val_00012177.JPEG n02514041/
+mv val/ILSVRC2012_val_00012178.JPEG n01616318/
+mv val/ILSVRC2012_val_00012179.JPEG n02641379/
+mv val/ILSVRC2012_val_00012180.JPEG n04086273/
+mv val/ILSVRC2012_val_00012181.JPEG n02097298/
+mv val/ILSVRC2012_val_00012182.JPEG n02930766/
+mv val/ILSVRC2012_val_00012183.JPEG n01983481/
+mv val/ILSVRC2012_val_00012184.JPEG n03995372/
+mv val/ILSVRC2012_val_00012185.JPEG n03891332/
+mv val/ILSVRC2012_val_00012186.JPEG n03218198/
+mv val/ILSVRC2012_val_00012187.JPEG n02058221/
+mv val/ILSVRC2012_val_00012188.JPEG n01729322/
+mv val/ILSVRC2012_val_00012189.JPEG n02799071/
+mv val/ILSVRC2012_val_00012190.JPEG n01820546/
+mv val/ILSVRC2012_val_00012191.JPEG n04127249/
+mv val/ILSVRC2012_val_00012192.JPEG n02834397/
+mv val/ILSVRC2012_val_00012193.JPEG n02097209/
+mv val/ILSVRC2012_val_00012194.JPEG n03196217/
+mv val/ILSVRC2012_val_00012195.JPEG n03216828/
+mv val/ILSVRC2012_val_00012196.JPEG n02096585/
+mv val/ILSVRC2012_val_00012197.JPEG n04229816/
+mv val/ILSVRC2012_val_00012198.JPEG n11879895/
+mv val/ILSVRC2012_val_00012199.JPEG n03977966/
+mv val/ILSVRC2012_val_00012200.JPEG n03876231/
+mv val/ILSVRC2012_val_00012201.JPEG n03908618/
+mv val/ILSVRC2012_val_00012202.JPEG n03255030/
+mv val/ILSVRC2012_val_00012203.JPEG n02106662/
+mv val/ILSVRC2012_val_00012204.JPEG n02488702/
+mv val/ILSVRC2012_val_00012205.JPEG n02978881/
+mv val/ILSVRC2012_val_00012206.JPEG n03868242/
+mv val/ILSVRC2012_val_00012207.JPEG n03710721/
+mv val/ILSVRC2012_val_00012208.JPEG n03494278/
+mv val/ILSVRC2012_val_00012209.JPEG n02363005/
+mv val/ILSVRC2012_val_00012210.JPEG n02939185/
+mv val/ILSVRC2012_val_00012211.JPEG n07768694/
+mv val/ILSVRC2012_val_00012212.JPEG n04505470/
+mv val/ILSVRC2012_val_00012213.JPEG n02028035/
+mv val/ILSVRC2012_val_00012214.JPEG n02894605/
+mv val/ILSVRC2012_val_00012215.JPEG n07717410/
+mv val/ILSVRC2012_val_00012216.JPEG n07745940/
+mv val/ILSVRC2012_val_00012217.JPEG n04429376/
+mv val/ILSVRC2012_val_00012218.JPEG n04344873/
+mv val/ILSVRC2012_val_00012219.JPEG n02727426/
+mv val/ILSVRC2012_val_00012220.JPEG n01753488/
+mv val/ILSVRC2012_val_00012221.JPEG n02110806/
+mv val/ILSVRC2012_val_00012222.JPEG n03661043/
+mv val/ILSVRC2012_val_00012223.JPEG n01806567/
+mv val/ILSVRC2012_val_00012224.JPEG n01955084/
+mv val/ILSVRC2012_val_00012225.JPEG n03467068/
+mv val/ILSVRC2012_val_00012226.JPEG n02110063/
+mv val/ILSVRC2012_val_00012227.JPEG n03902125/
+mv val/ILSVRC2012_val_00012228.JPEG n03450230/
+mv val/ILSVRC2012_val_00012229.JPEG n01692333/
+mv val/ILSVRC2012_val_00012230.JPEG n02114855/
+mv val/ILSVRC2012_val_00012231.JPEG n01644900/
+mv val/ILSVRC2012_val_00012232.JPEG n07742313/
+mv val/ILSVRC2012_val_00012233.JPEG n07565083/
+mv val/ILSVRC2012_val_00012234.JPEG n04505470/
+mv val/ILSVRC2012_val_00012235.JPEG n02088364/
+mv val/ILSVRC2012_val_00012236.JPEG n03733131/
+mv val/ILSVRC2012_val_00012237.JPEG n02105056/
+mv val/ILSVRC2012_val_00012238.JPEG n02606052/
+mv val/ILSVRC2012_val_00012239.JPEG n03179701/
+mv val/ILSVRC2012_val_00012240.JPEG n07715103/
+mv val/ILSVRC2012_val_00012241.JPEG n02641379/
+mv val/ILSVRC2012_val_00012242.JPEG n03259280/
+mv val/ILSVRC2012_val_00012243.JPEG n07873807/
+mv val/ILSVRC2012_val_00012244.JPEG n04584207/
+mv val/ILSVRC2012_val_00012245.JPEG n02110063/
+mv val/ILSVRC2012_val_00012246.JPEG n03218198/
+mv val/ILSVRC2012_val_00012247.JPEG n02494079/
+mv val/ILSVRC2012_val_00012248.JPEG n01644373/
+mv val/ILSVRC2012_val_00012249.JPEG n04332243/
+mv val/ILSVRC2012_val_00012250.JPEG n02115913/
+mv val/ILSVRC2012_val_00012251.JPEG n02120079/
+mv val/ILSVRC2012_val_00012252.JPEG n09229709/
+mv val/ILSVRC2012_val_00012253.JPEG n02481823/
+mv val/ILSVRC2012_val_00012254.JPEG n04235860/
+mv val/ILSVRC2012_val_00012255.JPEG n02113799/
+mv val/ILSVRC2012_val_00012256.JPEG n02823428/
+mv val/ILSVRC2012_val_00012257.JPEG n04371774/
+mv val/ILSVRC2012_val_00012258.JPEG n02442845/
+mv val/ILSVRC2012_val_00012259.JPEG n01498041/
+mv val/ILSVRC2012_val_00012260.JPEG n03944341/
+mv val/ILSVRC2012_val_00012261.JPEG n09332890/
+mv val/ILSVRC2012_val_00012262.JPEG n02091134/
+mv val/ILSVRC2012_val_00012263.JPEG n02690373/
+mv val/ILSVRC2012_val_00012264.JPEG n02788148/
+mv val/ILSVRC2012_val_00012265.JPEG n02869837/
+mv val/ILSVRC2012_val_00012266.JPEG n04204238/
+mv val/ILSVRC2012_val_00012267.JPEG n01675722/
+mv val/ILSVRC2012_val_00012268.JPEG n02236044/
+mv val/ILSVRC2012_val_00012269.JPEG n02280649/
+mv val/ILSVRC2012_val_00012270.JPEG n12144580/
+mv val/ILSVRC2012_val_00012271.JPEG n01882714/
+mv val/ILSVRC2012_val_00012272.JPEG n04120489/
+mv val/ILSVRC2012_val_00012273.JPEG n02999410/
+mv val/ILSVRC2012_val_00012274.JPEG n03692522/
+mv val/ILSVRC2012_val_00012275.JPEG n01729322/
+mv val/ILSVRC2012_val_00012276.JPEG n04532670/
+mv val/ILSVRC2012_val_00012277.JPEG n03337140/
+mv val/ILSVRC2012_val_00012278.JPEG n02966193/
+mv val/ILSVRC2012_val_00012279.JPEG n07742313/
+mv val/ILSVRC2012_val_00012280.JPEG n03793489/
+mv val/ILSVRC2012_val_00012281.JPEG n04355933/
+mv val/ILSVRC2012_val_00012282.JPEG n03220513/
+mv val/ILSVRC2012_val_00012283.JPEG n02445715/
+mv val/ILSVRC2012_val_00012284.JPEG n04443257/
+mv val/ILSVRC2012_val_00012285.JPEG n04026417/
+mv val/ILSVRC2012_val_00012286.JPEG n02823428/
+mv val/ILSVRC2012_val_00012287.JPEG n03976467/
+mv val/ILSVRC2012_val_00012288.JPEG n02102177/
+mv val/ILSVRC2012_val_00012289.JPEG n03773504/
+mv val/ILSVRC2012_val_00012290.JPEG n04487394/
+mv val/ILSVRC2012_val_00012291.JPEG n02085936/
+mv val/ILSVRC2012_val_00012292.JPEG n07614500/
+mv val/ILSVRC2012_val_00012293.JPEG n02089078/
+mv val/ILSVRC2012_val_00012294.JPEG n02206856/
+mv val/ILSVRC2012_val_00012295.JPEG n04147183/
+mv val/ILSVRC2012_val_00012296.JPEG n04501370/
+mv val/ILSVRC2012_val_00012297.JPEG n02422699/
+mv val/ILSVRC2012_val_00012298.JPEG n02085782/
+mv val/ILSVRC2012_val_00012299.JPEG n02097130/
+mv val/ILSVRC2012_val_00012300.JPEG n03929660/
+mv val/ILSVRC2012_val_00012301.JPEG n01751748/
+mv val/ILSVRC2012_val_00012302.JPEG n02099849/
+mv val/ILSVRC2012_val_00012303.JPEG n01924916/
+mv val/ILSVRC2012_val_00012304.JPEG n01692333/
+mv val/ILSVRC2012_val_00012305.JPEG n04275548/
+mv val/ILSVRC2012_val_00012306.JPEG n03991062/
+mv val/ILSVRC2012_val_00012307.JPEG n01824575/
+mv val/ILSVRC2012_val_00012308.JPEG n03218198/
+mv val/ILSVRC2012_val_00012309.JPEG n02018207/
+mv val/ILSVRC2012_val_00012310.JPEG n03530642/
+mv val/ILSVRC2012_val_00012311.JPEG n03782006/
+mv val/ILSVRC2012_val_00012312.JPEG n03697007/
+mv val/ILSVRC2012_val_00012313.JPEG n07734744/
+mv val/ILSVRC2012_val_00012314.JPEG n01820546/
+mv val/ILSVRC2012_val_00012315.JPEG n02280649/
+mv val/ILSVRC2012_val_00012316.JPEG n02115913/
+mv val/ILSVRC2012_val_00012317.JPEG n04325704/
+mv val/ILSVRC2012_val_00012318.JPEG n02104029/
+mv val/ILSVRC2012_val_00012319.JPEG n03250847/
+mv val/ILSVRC2012_val_00012320.JPEG n11879895/
+mv val/ILSVRC2012_val_00012321.JPEG n03709823/
+mv val/ILSVRC2012_val_00012322.JPEG n03271574/
+mv val/ILSVRC2012_val_00012323.JPEG n04483307/
+mv val/ILSVRC2012_val_00012324.JPEG n04525038/
+mv val/ILSVRC2012_val_00012325.JPEG n02835271/
+mv val/ILSVRC2012_val_00012326.JPEG n02102318/
+mv val/ILSVRC2012_val_00012327.JPEG n04285008/
+mv val/ILSVRC2012_val_00012328.JPEG n01491361/
+mv val/ILSVRC2012_val_00012329.JPEG n01742172/
+mv val/ILSVRC2012_val_00012330.JPEG n02077923/
+mv val/ILSVRC2012_val_00012331.JPEG n01728572/
+mv val/ILSVRC2012_val_00012332.JPEG n01914609/
+mv val/ILSVRC2012_val_00012333.JPEG n03388549/
+mv val/ILSVRC2012_val_00012334.JPEG n03085013/
+mv val/ILSVRC2012_val_00012335.JPEG n02395406/
+mv val/ILSVRC2012_val_00012336.JPEG n03868863/
+mv val/ILSVRC2012_val_00012337.JPEG n04033901/
+mv val/ILSVRC2012_val_00012338.JPEG n02011460/
+mv val/ILSVRC2012_val_00012339.JPEG n02123159/
+mv val/ILSVRC2012_val_00012340.JPEG n02391049/
+mv val/ILSVRC2012_val_00012341.JPEG n04039381/
+mv val/ILSVRC2012_val_00012342.JPEG n01695060/
+mv val/ILSVRC2012_val_00012343.JPEG n02129165/
+mv val/ILSVRC2012_val_00012344.JPEG n03944341/
+mv val/ILSVRC2012_val_00012345.JPEG n04462240/
+mv val/ILSVRC2012_val_00012346.JPEG n02403003/
+mv val/ILSVRC2012_val_00012347.JPEG n03920288/
+mv val/ILSVRC2012_val_00012348.JPEG n03649909/
+mv val/ILSVRC2012_val_00012349.JPEG n04515003/
+mv val/ILSVRC2012_val_00012350.JPEG n03372029/
+mv val/ILSVRC2012_val_00012351.JPEG n02091467/
+mv val/ILSVRC2012_val_00012352.JPEG n04372370/
+mv val/ILSVRC2012_val_00012353.JPEG n02129165/
+mv val/ILSVRC2012_val_00012354.JPEG n01753488/
+mv val/ILSVRC2012_val_00012355.JPEG n02113712/
+mv val/ILSVRC2012_val_00012356.JPEG n03445777/
+mv val/ILSVRC2012_val_00012357.JPEG n04525305/
+mv val/ILSVRC2012_val_00012358.JPEG n01768244/
+mv val/ILSVRC2012_val_00012359.JPEG n02493509/
+mv val/ILSVRC2012_val_00012360.JPEG n03743016/
+mv val/ILSVRC2012_val_00012361.JPEG n12998815/
+mv val/ILSVRC2012_val_00012362.JPEG n03770439/
+mv val/ILSVRC2012_val_00012363.JPEG n02777292/
+mv val/ILSVRC2012_val_00012364.JPEG n02097298/
+mv val/ILSVRC2012_val_00012365.JPEG n01687978/
+mv val/ILSVRC2012_val_00012366.JPEG n04179913/
+mv val/ILSVRC2012_val_00012367.JPEG n02749479/
+mv val/ILSVRC2012_val_00012368.JPEG n03627232/
+mv val/ILSVRC2012_val_00012369.JPEG n03207743/
+mv val/ILSVRC2012_val_00012370.JPEG n03476991/
+mv val/ILSVRC2012_val_00012371.JPEG n07745940/
+mv val/ILSVRC2012_val_00012372.JPEG n01883070/
+mv val/ILSVRC2012_val_00012373.JPEG n03792972/
+mv val/ILSVRC2012_val_00012374.JPEG n03769881/
+mv val/ILSVRC2012_val_00012375.JPEG n02011460/
+mv val/ILSVRC2012_val_00012376.JPEG n02870880/
+mv val/ILSVRC2012_val_00012377.JPEG n02123045/
+mv val/ILSVRC2012_val_00012378.JPEG n04040759/
+mv val/ILSVRC2012_val_00012379.JPEG n07684084/
+mv val/ILSVRC2012_val_00012380.JPEG n02111277/
+mv val/ILSVRC2012_val_00012381.JPEG n01877812/
+mv val/ILSVRC2012_val_00012382.JPEG n04019541/
+mv val/ILSVRC2012_val_00012383.JPEG n03197337/
+mv val/ILSVRC2012_val_00012384.JPEG n02494079/
+mv val/ILSVRC2012_val_00012385.JPEG n03187595/
+mv val/ILSVRC2012_val_00012386.JPEG n02687172/
+mv val/ILSVRC2012_val_00012387.JPEG n02883205/
+mv val/ILSVRC2012_val_00012388.JPEG n07754684/
+mv val/ILSVRC2012_val_00012389.JPEG n09399592/
+mv val/ILSVRC2012_val_00012390.JPEG n02791270/
+mv val/ILSVRC2012_val_00012391.JPEG n03063689/
+mv val/ILSVRC2012_val_00012392.JPEG n03902125/
+mv val/ILSVRC2012_val_00012393.JPEG n02415577/
+mv val/ILSVRC2012_val_00012394.JPEG n02086240/
+mv val/ILSVRC2012_val_00012395.JPEG n02093991/
+mv val/ILSVRC2012_val_00012396.JPEG n02802426/
+mv val/ILSVRC2012_val_00012397.JPEG n03782006/
+mv val/ILSVRC2012_val_00012398.JPEG n03478589/
+mv val/ILSVRC2012_val_00012399.JPEG n02128385/
+mv val/ILSVRC2012_val_00012400.JPEG n02894605/
+mv val/ILSVRC2012_val_00012401.JPEG n02115641/
+mv val/ILSVRC2012_val_00012402.JPEG n02011460/
+mv val/ILSVRC2012_val_00012403.JPEG n02951358/
+mv val/ILSVRC2012_val_00012404.JPEG n02128757/
+mv val/ILSVRC2012_val_00012405.JPEG n02871525/
+mv val/ILSVRC2012_val_00012406.JPEG n02346627/
+mv val/ILSVRC2012_val_00012407.JPEG n03450230/
+mv val/ILSVRC2012_val_00012408.JPEG n09229709/
+mv val/ILSVRC2012_val_00012409.JPEG n02417914/
+mv val/ILSVRC2012_val_00012410.JPEG n01796340/
+mv val/ILSVRC2012_val_00012411.JPEG n02128925/
+mv val/ILSVRC2012_val_00012412.JPEG n04486054/
+mv val/ILSVRC2012_val_00012413.JPEG n02749479/
+mv val/ILSVRC2012_val_00012414.JPEG n02346627/
+mv val/ILSVRC2012_val_00012415.JPEG n01930112/
+mv val/ILSVRC2012_val_00012416.JPEG n02091032/
+mv val/ILSVRC2012_val_00012417.JPEG n02963159/
+mv val/ILSVRC2012_val_00012418.JPEG n01944390/
+mv val/ILSVRC2012_val_00012419.JPEG n02793495/
+mv val/ILSVRC2012_val_00012420.JPEG n02018207/
+mv val/ILSVRC2012_val_00012421.JPEG n04153751/
+mv val/ILSVRC2012_val_00012422.JPEG n02790996/
+mv val/ILSVRC2012_val_00012423.JPEG n02129165/
+mv val/ILSVRC2012_val_00012424.JPEG n03538406/
+mv val/ILSVRC2012_val_00012425.JPEG n02965783/
+mv val/ILSVRC2012_val_00012426.JPEG n03179701/
+mv val/ILSVRC2012_val_00012427.JPEG n03160309/
+mv val/ILSVRC2012_val_00012428.JPEG n01644373/
+mv val/ILSVRC2012_val_00012429.JPEG n01770393/
+mv val/ILSVRC2012_val_00012430.JPEG n02109961/
+mv val/ILSVRC2012_val_00012431.JPEG n01873310/
+mv val/ILSVRC2012_val_00012432.JPEG n03085013/
+mv val/ILSVRC2012_val_00012433.JPEG n01735189/
+mv val/ILSVRC2012_val_00012434.JPEG n04370456/
+mv val/ILSVRC2012_val_00012435.JPEG n02018207/
+mv val/ILSVRC2012_val_00012436.JPEG n02018795/
+mv val/ILSVRC2012_val_00012437.JPEG n02110627/
+mv val/ILSVRC2012_val_00012438.JPEG n03804744/
+mv val/ILSVRC2012_val_00012439.JPEG n03534580/
+mv val/ILSVRC2012_val_00012440.JPEG n07760859/
+mv val/ILSVRC2012_val_00012441.JPEG n01631663/
+mv val/ILSVRC2012_val_00012442.JPEG n04482393/
+mv val/ILSVRC2012_val_00012443.JPEG n02917067/
+mv val/ILSVRC2012_val_00012444.JPEG n07753592/
+mv val/ILSVRC2012_val_00012445.JPEG n03447447/
+mv val/ILSVRC2012_val_00012446.JPEG n02112706/
+mv val/ILSVRC2012_val_00012447.JPEG n03947888/
+mv val/ILSVRC2012_val_00012448.JPEG n02927161/
+mv val/ILSVRC2012_val_00012449.JPEG n04228054/
+mv val/ILSVRC2012_val_00012450.JPEG n03259280/
+mv val/ILSVRC2012_val_00012451.JPEG n07753275/
+mv val/ILSVRC2012_val_00012452.JPEG n07753592/
+mv val/ILSVRC2012_val_00012453.JPEG n02948072/
+mv val/ILSVRC2012_val_00012454.JPEG n07697313/
+mv val/ILSVRC2012_val_00012455.JPEG n01984695/
+mv val/ILSVRC2012_val_00012456.JPEG n11879895/
+mv val/ILSVRC2012_val_00012457.JPEG n02125311/
+mv val/ILSVRC2012_val_00012458.JPEG n12998815/
+mv val/ILSVRC2012_val_00012459.JPEG n03976657/
+mv val/ILSVRC2012_val_00012460.JPEG n02096294/
+mv val/ILSVRC2012_val_00012461.JPEG n04264628/
+mv val/ILSVRC2012_val_00012462.JPEG n04548362/
+mv val/ILSVRC2012_val_00012463.JPEG n02276258/
+mv val/ILSVRC2012_val_00012464.JPEG n03891251/
+mv val/ILSVRC2012_val_00012465.JPEG n03127925/
+mv val/ILSVRC2012_val_00012466.JPEG n02834397/
+mv val/ILSVRC2012_val_00012467.JPEG n03854065/
+mv val/ILSVRC2012_val_00012468.JPEG n02979186/
+mv val/ILSVRC2012_val_00012469.JPEG n07920052/
+mv val/ILSVRC2012_val_00012470.JPEG n02110627/
+mv val/ILSVRC2012_val_00012471.JPEG n02095314/
+mv val/ILSVRC2012_val_00012472.JPEG n04049303/
+mv val/ILSVRC2012_val_00012473.JPEG n02965783/
+mv val/ILSVRC2012_val_00012474.JPEG n02895154/
+mv val/ILSVRC2012_val_00012475.JPEG n02013706/
+mv val/ILSVRC2012_val_00012476.JPEG n04044716/
+mv val/ILSVRC2012_val_00012477.JPEG n03709823/
+mv val/ILSVRC2012_val_00012478.JPEG n02138441/
+mv val/ILSVRC2012_val_00012479.JPEG n02777292/
+mv val/ILSVRC2012_val_00012480.JPEG n01943899/
+mv val/ILSVRC2012_val_00012481.JPEG n07892512/
+mv val/ILSVRC2012_val_00012482.JPEG n02091831/
+mv val/ILSVRC2012_val_00012483.JPEG n03743016/
+mv val/ILSVRC2012_val_00012484.JPEG n01514668/
+mv val/ILSVRC2012_val_00012485.JPEG n04243546/
+mv val/ILSVRC2012_val_00012486.JPEG n02105251/
+mv val/ILSVRC2012_val_00012487.JPEG n03032252/
+mv val/ILSVRC2012_val_00012488.JPEG n01855032/
+mv val/ILSVRC2012_val_00012489.JPEG n04612504/
+mv val/ILSVRC2012_val_00012490.JPEG n03770679/
+mv val/ILSVRC2012_val_00012491.JPEG n03866082/
+mv val/ILSVRC2012_val_00012492.JPEG n02091134/
+mv val/ILSVRC2012_val_00012493.JPEG n03443371/
+mv val/ILSVRC2012_val_00012494.JPEG n03777568/
+mv val/ILSVRC2012_val_00012495.JPEG n03773504/
+mv val/ILSVRC2012_val_00012496.JPEG n02480855/
+mv val/ILSVRC2012_val_00012497.JPEG n07745940/
+mv val/ILSVRC2012_val_00012498.JPEG n02391049/
+mv val/ILSVRC2012_val_00012499.JPEG n01910747/
+mv val/ILSVRC2012_val_00012500.JPEG n02277742/
+mv val/ILSVRC2012_val_00012501.JPEG n03938244/
+mv val/ILSVRC2012_val_00012502.JPEG n02788148/
+mv val/ILSVRC2012_val_00012503.JPEG n01440764/
+mv val/ILSVRC2012_val_00012504.JPEG n03425413/
+mv val/ILSVRC2012_val_00012505.JPEG n03895866/
+mv val/ILSVRC2012_val_00012506.JPEG n03950228/
+mv val/ILSVRC2012_val_00012507.JPEG n02133161/
+mv val/ILSVRC2012_val_00012508.JPEG n01843065/
+mv val/ILSVRC2012_val_00012509.JPEG n02992211/
+mv val/ILSVRC2012_val_00012510.JPEG n02834397/
+mv val/ILSVRC2012_val_00012511.JPEG n02066245/
+mv val/ILSVRC2012_val_00012512.JPEG n03337140/
+mv val/ILSVRC2012_val_00012513.JPEG n07716358/
+mv val/ILSVRC2012_val_00012514.JPEG n03584829/
+mv val/ILSVRC2012_val_00012515.JPEG n02095314/
+mv val/ILSVRC2012_val_00012516.JPEG n02093991/
+mv val/ILSVRC2012_val_00012517.JPEG n02974003/
+mv val/ILSVRC2012_val_00012518.JPEG n02025239/
+mv val/ILSVRC2012_val_00012519.JPEG n04596742/
+mv val/ILSVRC2012_val_00012520.JPEG n02916936/
+mv val/ILSVRC2012_val_00012521.JPEG n01768244/
+mv val/ILSVRC2012_val_00012522.JPEG n03720891/
+mv val/ILSVRC2012_val_00012523.JPEG n02056570/
+mv val/ILSVRC2012_val_00012524.JPEG n02102177/
+mv val/ILSVRC2012_val_00012525.JPEG n04557648/
+mv val/ILSVRC2012_val_00012526.JPEG n02268853/
+mv val/ILSVRC2012_val_00012527.JPEG n02098105/
+mv val/ILSVRC2012_val_00012528.JPEG n01514859/
+mv val/ILSVRC2012_val_00012529.JPEG n04141975/
+mv val/ILSVRC2012_val_00012530.JPEG n02071294/
+mv val/ILSVRC2012_val_00012531.JPEG n03188531/
+mv val/ILSVRC2012_val_00012532.JPEG n04254777/
+mv val/ILSVRC2012_val_00012533.JPEG n03709823/
+mv val/ILSVRC2012_val_00012534.JPEG n03095699/
+mv val/ILSVRC2012_val_00012535.JPEG n04517823/
+mv val/ILSVRC2012_val_00012536.JPEG n03733131/
+mv val/ILSVRC2012_val_00012537.JPEG n07693725/
+mv val/ILSVRC2012_val_00012538.JPEG n03476684/
+mv val/ILSVRC2012_val_00012539.JPEG n03724870/
+mv val/ILSVRC2012_val_00012540.JPEG n03983396/
+mv val/ILSVRC2012_val_00012541.JPEG n02342885/
+mv val/ILSVRC2012_val_00012542.JPEG n02510455/
+mv val/ILSVRC2012_val_00012543.JPEG n03874293/
+mv val/ILSVRC2012_val_00012544.JPEG n02823428/
+mv val/ILSVRC2012_val_00012545.JPEG n04356056/
+mv val/ILSVRC2012_val_00012546.JPEG n01494475/
+mv val/ILSVRC2012_val_00012547.JPEG n04251144/
+mv val/ILSVRC2012_val_00012548.JPEG n02894605/
+mv val/ILSVRC2012_val_00012549.JPEG n02097658/
+mv val/ILSVRC2012_val_00012550.JPEG n04273569/
+mv val/ILSVRC2012_val_00012551.JPEG n02123045/
+mv val/ILSVRC2012_val_00012552.JPEG n03250847/
+mv val/ILSVRC2012_val_00012553.JPEG n01687978/
+mv val/ILSVRC2012_val_00012554.JPEG n02012849/
+mv val/ILSVRC2012_val_00012555.JPEG n03733131/
+mv val/ILSVRC2012_val_00012556.JPEG n02096294/
+mv val/ILSVRC2012_val_00012557.JPEG n02279972/
+mv val/ILSVRC2012_val_00012558.JPEG n01641577/
+mv val/ILSVRC2012_val_00012559.JPEG n03804744/
+mv val/ILSVRC2012_val_00012560.JPEG n02871525/
+mv val/ILSVRC2012_val_00012561.JPEG n04479046/
+mv val/ILSVRC2012_val_00012562.JPEG n07697313/
+mv val/ILSVRC2012_val_00012563.JPEG n02786058/
+mv val/ILSVRC2012_val_00012564.JPEG n01924916/
+mv val/ILSVRC2012_val_00012565.JPEG n07932039/
+mv val/ILSVRC2012_val_00012566.JPEG n02099712/
+mv val/ILSVRC2012_val_00012567.JPEG n03271574/
+mv val/ILSVRC2012_val_00012568.JPEG n02488702/
+mv val/ILSVRC2012_val_00012569.JPEG n02927161/
+mv val/ILSVRC2012_val_00012570.JPEG n02815834/
+mv val/ILSVRC2012_val_00012571.JPEG n02877765/
+mv val/ILSVRC2012_val_00012572.JPEG n04560804/
+mv val/ILSVRC2012_val_00012573.JPEG n03297495/
+mv val/ILSVRC2012_val_00012574.JPEG n04590129/
+mv val/ILSVRC2012_val_00012575.JPEG n03944341/
+mv val/ILSVRC2012_val_00012576.JPEG n03980874/
+mv val/ILSVRC2012_val_00012577.JPEG n02105056/
+mv val/ILSVRC2012_val_00012578.JPEG n01734418/
+mv val/ILSVRC2012_val_00012579.JPEG n03947888/
+mv val/ILSVRC2012_val_00012580.JPEG n02363005/
+mv val/ILSVRC2012_val_00012581.JPEG n06596364/
+mv val/ILSVRC2012_val_00012582.JPEG n07753275/
+mv val/ILSVRC2012_val_00012583.JPEG n02930766/
+mv val/ILSVRC2012_val_00012584.JPEG n02093859/
+mv val/ILSVRC2012_val_00012585.JPEG n03207941/
+mv val/ILSVRC2012_val_00012586.JPEG n01818515/
+mv val/ILSVRC2012_val_00012587.JPEG n03657121/
+mv val/ILSVRC2012_val_00012588.JPEG n01629819/
+mv val/ILSVRC2012_val_00012589.JPEG n03063689/
+mv val/ILSVRC2012_val_00012590.JPEG n03255030/
+mv val/ILSVRC2012_val_00012591.JPEG n02808440/
+mv val/ILSVRC2012_val_00012592.JPEG n02981792/
+mv val/ILSVRC2012_val_00012593.JPEG n09246464/
+mv val/ILSVRC2012_val_00012594.JPEG n04591713/
+mv val/ILSVRC2012_val_00012595.JPEG n03492542/
+mv val/ILSVRC2012_val_00012596.JPEG n04517823/
+mv val/ILSVRC2012_val_00012597.JPEG n03240683/
+mv val/ILSVRC2012_val_00012598.JPEG n07716358/
+mv val/ILSVRC2012_val_00012599.JPEG n07717556/
+mv val/ILSVRC2012_val_00012600.JPEG n02814533/
+mv val/ILSVRC2012_val_00012601.JPEG n01843383/
+mv val/ILSVRC2012_val_00012602.JPEG n03691459/
+mv val/ILSVRC2012_val_00012603.JPEG n02134418/
+mv val/ILSVRC2012_val_00012604.JPEG n02110185/
+mv val/ILSVRC2012_val_00012605.JPEG n02093754/
+mv val/ILSVRC2012_val_00012606.JPEG n02807133/
+mv val/ILSVRC2012_val_00012607.JPEG n07684084/
+mv val/ILSVRC2012_val_00012608.JPEG n02091244/
+mv val/ILSVRC2012_val_00012609.JPEG n03873416/
+mv val/ILSVRC2012_val_00012610.JPEG n02113624/
+mv val/ILSVRC2012_val_00012611.JPEG n02094433/
+mv val/ILSVRC2012_val_00012612.JPEG n02917067/
+mv val/ILSVRC2012_val_00012613.JPEG n03450230/
+mv val/ILSVRC2012_val_00012614.JPEG n03888605/
+mv val/ILSVRC2012_val_00012615.JPEG n01616318/
+mv val/ILSVRC2012_val_00012616.JPEG n04435653/
+mv val/ILSVRC2012_val_00012617.JPEG n02111277/
+mv val/ILSVRC2012_val_00012618.JPEG n02006656/
+mv val/ILSVRC2012_val_00012619.JPEG n02363005/
+mv val/ILSVRC2012_val_00012620.JPEG n02497673/
+mv val/ILSVRC2012_val_00012621.JPEG n07753592/
+mv val/ILSVRC2012_val_00012622.JPEG n07711569/
+mv val/ILSVRC2012_val_00012623.JPEG n01693334/
+mv val/ILSVRC2012_val_00012624.JPEG n03954731/
+mv val/ILSVRC2012_val_00012625.JPEG n04033995/
+mv val/ILSVRC2012_val_00012626.JPEG n04208210/
+mv val/ILSVRC2012_val_00012627.JPEG n02817516/
+mv val/ILSVRC2012_val_00012628.JPEG n07754684/
+mv val/ILSVRC2012_val_00012629.JPEG n02256656/
+mv val/ILSVRC2012_val_00012630.JPEG n13052670/
+mv val/ILSVRC2012_val_00012631.JPEG n04417672/
+mv val/ILSVRC2012_val_00012632.JPEG n11939491/
+mv val/ILSVRC2012_val_00012633.JPEG n02443114/
+mv val/ILSVRC2012_val_00012634.JPEG n03445777/
+mv val/ILSVRC2012_val_00012635.JPEG n02093859/
+mv val/ILSVRC2012_val_00012636.JPEG n07684084/
+mv val/ILSVRC2012_val_00012637.JPEG n03026506/
+mv val/ILSVRC2012_val_00012638.JPEG n04081281/
+mv val/ILSVRC2012_val_00012639.JPEG n02002724/
+mv val/ILSVRC2012_val_00012640.JPEG n02317335/
+mv val/ILSVRC2012_val_00012641.JPEG n03584829/
+mv val/ILSVRC2012_val_00012642.JPEG n04039381/
+mv val/ILSVRC2012_val_00012643.JPEG n03062245/
+mv val/ILSVRC2012_val_00012644.JPEG n02091134/
+mv val/ILSVRC2012_val_00012645.JPEG n07745940/
+mv val/ILSVRC2012_val_00012646.JPEG n02092002/
+mv val/ILSVRC2012_val_00012647.JPEG n03991062/
+mv val/ILSVRC2012_val_00012648.JPEG n02843684/
+mv val/ILSVRC2012_val_00012649.JPEG n03961711/
+mv val/ILSVRC2012_val_00012650.JPEG n04069434/
+mv val/ILSVRC2012_val_00012651.JPEG n01558993/
+mv val/ILSVRC2012_val_00012652.JPEG n07745940/
+mv val/ILSVRC2012_val_00012653.JPEG n04486054/
+mv val/ILSVRC2012_val_00012654.JPEG n04347754/
+mv val/ILSVRC2012_val_00012655.JPEG n02011460/
+mv val/ILSVRC2012_val_00012656.JPEG n02808304/
+mv val/ILSVRC2012_val_00012657.JPEG n02109961/
+mv val/ILSVRC2012_val_00012658.JPEG n04229816/
+mv val/ILSVRC2012_val_00012659.JPEG n04409515/
+mv val/ILSVRC2012_val_00012660.JPEG n04116512/
+mv val/ILSVRC2012_val_00012661.JPEG n03857828/
+mv val/ILSVRC2012_val_00012662.JPEG n02445715/
+mv val/ILSVRC2012_val_00012663.JPEG n03920288/
+mv val/ILSVRC2012_val_00012664.JPEG n02488702/
+mv val/ILSVRC2012_val_00012665.JPEG n03126707/
+mv val/ILSVRC2012_val_00012666.JPEG n07932039/
+mv val/ILSVRC2012_val_00012667.JPEG n02835271/
+mv val/ILSVRC2012_val_00012668.JPEG n03445924/
+mv val/ILSVRC2012_val_00012669.JPEG n01797886/
+mv val/ILSVRC2012_val_00012670.JPEG n03476684/
+mv val/ILSVRC2012_val_00012671.JPEG n03658185/
+mv val/ILSVRC2012_val_00012672.JPEG n01943899/
+mv val/ILSVRC2012_val_00012673.JPEG n02951358/
+mv val/ILSVRC2012_val_00012674.JPEG n03532672/
+mv val/ILSVRC2012_val_00012675.JPEG n02966193/
+mv val/ILSVRC2012_val_00012676.JPEG n02988304/
+mv val/ILSVRC2012_val_00012677.JPEG n02229544/
+mv val/ILSVRC2012_val_00012678.JPEG n02095570/
+mv val/ILSVRC2012_val_00012679.JPEG n02841315/
+mv val/ILSVRC2012_val_00012680.JPEG n04536866/
+mv val/ILSVRC2012_val_00012681.JPEG n02268853/
+mv val/ILSVRC2012_val_00012682.JPEG n03445924/
+mv val/ILSVRC2012_val_00012683.JPEG n03803284/
+mv val/ILSVRC2012_val_00012684.JPEG n04254777/
+mv val/ILSVRC2012_val_00012685.JPEG n02443484/
+mv val/ILSVRC2012_val_00012686.JPEG n03133878/
+mv val/ILSVRC2012_val_00012687.JPEG n02799071/
+mv val/ILSVRC2012_val_00012688.JPEG n13133613/
+mv val/ILSVRC2012_val_00012689.JPEG n02102040/
+mv val/ILSVRC2012_val_00012690.JPEG n02107908/
+mv val/ILSVRC2012_val_00012691.JPEG n03947888/
+mv val/ILSVRC2012_val_00012692.JPEG n04487394/
+mv val/ILSVRC2012_val_00012693.JPEG n03599486/
+mv val/ILSVRC2012_val_00012694.JPEG n03452741/
+mv val/ILSVRC2012_val_00012695.JPEG n02097298/
+mv val/ILSVRC2012_val_00012696.JPEG n04417672/
+mv val/ILSVRC2012_val_00012697.JPEG n02493793/
+mv val/ILSVRC2012_val_00012698.JPEG n02325366/
+mv val/ILSVRC2012_val_00012699.JPEG n07747607/
+mv val/ILSVRC2012_val_00012700.JPEG n03188531/
+mv val/ILSVRC2012_val_00012701.JPEG n04482393/
+mv val/ILSVRC2012_val_00012702.JPEG n02088632/
+mv val/ILSVRC2012_val_00012703.JPEG n04461696/
+mv val/ILSVRC2012_val_00012704.JPEG n03249569/
+mv val/ILSVRC2012_val_00012705.JPEG n07693725/
+mv val/ILSVRC2012_val_00012706.JPEG n02096437/
+mv val/ILSVRC2012_val_00012707.JPEG n01773797/
+mv val/ILSVRC2012_val_00012708.JPEG n02105162/
+mv val/ILSVRC2012_val_00012709.JPEG n02843684/
+mv val/ILSVRC2012_val_00012710.JPEG n02950826/
+mv val/ILSVRC2012_val_00012711.JPEG n02492660/
+mv val/ILSVRC2012_val_00012712.JPEG n04366367/
+mv val/ILSVRC2012_val_00012713.JPEG n01981276/
+mv val/ILSVRC2012_val_00012714.JPEG n03207941/
+mv val/ILSVRC2012_val_00012715.JPEG n02966193/
+mv val/ILSVRC2012_val_00012716.JPEG n03534580/
+mv val/ILSVRC2012_val_00012717.JPEG n02112018/
+mv val/ILSVRC2012_val_00012718.JPEG n01688243/
+mv val/ILSVRC2012_val_00012719.JPEG n04584207/
+mv val/ILSVRC2012_val_00012720.JPEG n02415577/
+mv val/ILSVRC2012_val_00012721.JPEG n01847000/
+mv val/ILSVRC2012_val_00012722.JPEG n02514041/
+mv val/ILSVRC2012_val_00012723.JPEG n02488291/
+mv val/ILSVRC2012_val_00012724.JPEG n02749479/
+mv val/ILSVRC2012_val_00012725.JPEG n04380533/
+mv val/ILSVRC2012_val_00012726.JPEG n02510455/
+mv val/ILSVRC2012_val_00012727.JPEG n02526121/
+mv val/ILSVRC2012_val_00012728.JPEG n07745940/
+mv val/ILSVRC2012_val_00012729.JPEG n03930313/
+mv val/ILSVRC2012_val_00012730.JPEG n03877845/
+mv val/ILSVRC2012_val_00012731.JPEG n01755581/
+mv val/ILSVRC2012_val_00012732.JPEG n01667114/
+mv val/ILSVRC2012_val_00012733.JPEG n02108000/
+mv val/ILSVRC2012_val_00012734.JPEG n02699494/
+mv val/ILSVRC2012_val_00012735.JPEG n02363005/
+mv val/ILSVRC2012_val_00012736.JPEG n02100877/
+mv val/ILSVRC2012_val_00012737.JPEG n03770439/
+mv val/ILSVRC2012_val_00012738.JPEG n02114712/
+mv val/ILSVRC2012_val_00012739.JPEG n02100735/
+mv val/ILSVRC2012_val_00012740.JPEG n02108000/
+mv val/ILSVRC2012_val_00012741.JPEG n02028035/
+mv val/ILSVRC2012_val_00012742.JPEG n02108551/
+mv val/ILSVRC2012_val_00012743.JPEG n02484975/
+mv val/ILSVRC2012_val_00012744.JPEG n07718747/
+mv val/ILSVRC2012_val_00012745.JPEG n03498962/
+mv val/ILSVRC2012_val_00012746.JPEG n01665541/
+mv val/ILSVRC2012_val_00012747.JPEG n02894605/
+mv val/ILSVRC2012_val_00012748.JPEG n04118776/
+mv val/ILSVRC2012_val_00012749.JPEG n02119022/
+mv val/ILSVRC2012_val_00012750.JPEG n04258138/
+mv val/ILSVRC2012_val_00012751.JPEG n04604644/
+mv val/ILSVRC2012_val_00012752.JPEG n02115641/
+mv val/ILSVRC2012_val_00012753.JPEG n07768694/
+mv val/ILSVRC2012_val_00012754.JPEG n12267677/
+mv val/ILSVRC2012_val_00012755.JPEG n03908714/
+mv val/ILSVRC2012_val_00012756.JPEG n03876231/
+mv val/ILSVRC2012_val_00012757.JPEG n07717556/
+mv val/ILSVRC2012_val_00012758.JPEG n11879895/
+mv val/ILSVRC2012_val_00012759.JPEG n01688243/
+mv val/ILSVRC2012_val_00012760.JPEG n03208938/
+mv val/ILSVRC2012_val_00012761.JPEG n12267677/
+mv val/ILSVRC2012_val_00012762.JPEG n02669723/
+mv val/ILSVRC2012_val_00012763.JPEG n02965783/
+mv val/ILSVRC2012_val_00012764.JPEG n02276258/
+mv val/ILSVRC2012_val_00012765.JPEG n01631663/
+mv val/ILSVRC2012_val_00012766.JPEG n04487394/
+mv val/ILSVRC2012_val_00012767.JPEG n02825657/
+mv val/ILSVRC2012_val_00012768.JPEG n01749939/
+mv val/ILSVRC2012_val_00012769.JPEG n04037443/
+mv val/ILSVRC2012_val_00012770.JPEG n04041544/
+mv val/ILSVRC2012_val_00012771.JPEG n03376595/
+mv val/ILSVRC2012_val_00012772.JPEG n04532670/
+mv val/ILSVRC2012_val_00012773.JPEG n02104365/
+mv val/ILSVRC2012_val_00012774.JPEG n02233338/
+mv val/ILSVRC2012_val_00012775.JPEG n02793495/
+mv val/ILSVRC2012_val_00012776.JPEG n03770439/
+mv val/ILSVRC2012_val_00012777.JPEG n01910747/
+mv val/ILSVRC2012_val_00012778.JPEG n04154565/
+mv val/ILSVRC2012_val_00012779.JPEG n01980166/
+mv val/ILSVRC2012_val_00012780.JPEG n03793489/
+mv val/ILSVRC2012_val_00012781.JPEG n02025239/
+mv val/ILSVRC2012_val_00012782.JPEG n02480495/
+mv val/ILSVRC2012_val_00012783.JPEG n03781244/
+mv val/ILSVRC2012_val_00012784.JPEG n04399382/
+mv val/ILSVRC2012_val_00012785.JPEG n07871810/
+mv val/ILSVRC2012_val_00012786.JPEG n04065272/
+mv val/ILSVRC2012_val_00012787.JPEG n02017213/
+mv val/ILSVRC2012_val_00012788.JPEG n01943899/
+mv val/ILSVRC2012_val_00012789.JPEG n04067472/
+mv val/ILSVRC2012_val_00012790.JPEG n03761084/
+mv val/ILSVRC2012_val_00012791.JPEG n02094433/
+mv val/ILSVRC2012_val_00012792.JPEG n03538406/
+mv val/ILSVRC2012_val_00012793.JPEG n02494079/
+mv val/ILSVRC2012_val_00012794.JPEG n04147183/
+mv val/ILSVRC2012_val_00012795.JPEG n04141076/
+mv val/ILSVRC2012_val_00012796.JPEG n04589890/
+mv val/ILSVRC2012_val_00012797.JPEG n01601694/
+mv val/ILSVRC2012_val_00012798.JPEG n02123394/
+mv val/ILSVRC2012_val_00012799.JPEG n06874185/
+mv val/ILSVRC2012_val_00012800.JPEG n02114548/
+mv val/ILSVRC2012_val_00012801.JPEG n03637318/
+mv val/ILSVRC2012_val_00012802.JPEG n03710193/
+mv val/ILSVRC2012_val_00012803.JPEG n04536866/
+mv val/ILSVRC2012_val_00012804.JPEG n09399592/
+mv val/ILSVRC2012_val_00012805.JPEG n03452741/
+mv val/ILSVRC2012_val_00012806.JPEG n03594945/
+mv val/ILSVRC2012_val_00012807.JPEG n07860988/
+mv val/ILSVRC2012_val_00012808.JPEG n03085013/
+mv val/ILSVRC2012_val_00012809.JPEG n02814533/
+mv val/ILSVRC2012_val_00012810.JPEG n03461385/
+mv val/ILSVRC2012_val_00012811.JPEG n04252077/
+mv val/ILSVRC2012_val_00012812.JPEG n02859443/
+mv val/ILSVRC2012_val_00012813.JPEG n04033901/
+mv val/ILSVRC2012_val_00012814.JPEG n01530575/
+mv val/ILSVRC2012_val_00012815.JPEG n03476684/
+mv val/ILSVRC2012_val_00012816.JPEG n04069434/
+mv val/ILSVRC2012_val_00012817.JPEG n02105056/
+mv val/ILSVRC2012_val_00012818.JPEG n02128385/
+mv val/ILSVRC2012_val_00012819.JPEG n01694178/
+mv val/ILSVRC2012_val_00012820.JPEG n01688243/
+mv val/ILSVRC2012_val_00012821.JPEG n03372029/
+mv val/ILSVRC2012_val_00012822.JPEG n04465501/
+mv val/ILSVRC2012_val_00012823.JPEG n02808440/
+mv val/ILSVRC2012_val_00012824.JPEG n04235860/
+mv val/ILSVRC2012_val_00012825.JPEG n02177972/
+mv val/ILSVRC2012_val_00012826.JPEG n13044778/
+mv val/ILSVRC2012_val_00012827.JPEG n02096177/
+mv val/ILSVRC2012_val_00012828.JPEG n01770081/
+mv val/ILSVRC2012_val_00012829.JPEG n01669191/
+mv val/ILSVRC2012_val_00012830.JPEG n02481823/
+mv val/ILSVRC2012_val_00012831.JPEG n07880968/
+mv val/ILSVRC2012_val_00012832.JPEG n03888605/
+mv val/ILSVRC2012_val_00012833.JPEG n02117135/
+mv val/ILSVRC2012_val_00012834.JPEG n02096437/
+mv val/ILSVRC2012_val_00012835.JPEG n02397096/
+mv val/ILSVRC2012_val_00012836.JPEG n01592084/
+mv val/ILSVRC2012_val_00012837.JPEG n03769881/
+mv val/ILSVRC2012_val_00012838.JPEG n03026506/
+mv val/ILSVRC2012_val_00012839.JPEG n02107574/
+mv val/ILSVRC2012_val_00012840.JPEG n02114367/
+mv val/ILSVRC2012_val_00012841.JPEG n03124170/
+mv val/ILSVRC2012_val_00012842.JPEG n03733281/
+mv val/ILSVRC2012_val_00012843.JPEG n03692522/
+mv val/ILSVRC2012_val_00012844.JPEG n02037110/
+mv val/ILSVRC2012_val_00012845.JPEG n02167151/
+mv val/ILSVRC2012_val_00012846.JPEG n01930112/
+mv val/ILSVRC2012_val_00012847.JPEG n03995372/
+mv val/ILSVRC2012_val_00012848.JPEG n03355925/
+mv val/ILSVRC2012_val_00012849.JPEG n03676483/
+mv val/ILSVRC2012_val_00012850.JPEG n03000247/
+mv val/ILSVRC2012_val_00012851.JPEG n02966193/
+mv val/ILSVRC2012_val_00012852.JPEG n02910353/
+mv val/ILSVRC2012_val_00012853.JPEG n01682714/
+mv val/ILSVRC2012_val_00012854.JPEG n02910353/
+mv val/ILSVRC2012_val_00012855.JPEG n02510455/
+mv val/ILSVRC2012_val_00012856.JPEG n02106550/
+mv val/ILSVRC2012_val_00012857.JPEG n02120079/
+mv val/ILSVRC2012_val_00012858.JPEG n03841143/
+mv val/ILSVRC2012_val_00012859.JPEG n04229816/
+mv val/ILSVRC2012_val_00012860.JPEG n02447366/
+mv val/ILSVRC2012_val_00012861.JPEG n02091467/
+mv val/ILSVRC2012_val_00012862.JPEG n04456115/
+mv val/ILSVRC2012_val_00012863.JPEG n03937543/
+mv val/ILSVRC2012_val_00012864.JPEG n01818515/
+mv val/ILSVRC2012_val_00012865.JPEG n04086273/
+mv val/ILSVRC2012_val_00012866.JPEG n02865351/
+mv val/ILSVRC2012_val_00012867.JPEG n03109150/
+mv val/ILSVRC2012_val_00012868.JPEG n02808304/
+mv val/ILSVRC2012_val_00012869.JPEG n03483316/
+mv val/ILSVRC2012_val_00012870.JPEG n01560419/
+mv val/ILSVRC2012_val_00012871.JPEG n07930864/
+mv val/ILSVRC2012_val_00012872.JPEG n04392985/
+mv val/ILSVRC2012_val_00012873.JPEG n04592741/
+mv val/ILSVRC2012_val_00012874.JPEG n04192698/
+mv val/ILSVRC2012_val_00012875.JPEG n02089973/
+mv val/ILSVRC2012_val_00012876.JPEG n03485794/
+mv val/ILSVRC2012_val_00012877.JPEG n07613480/
+mv val/ILSVRC2012_val_00012878.JPEG n02951585/
+mv val/ILSVRC2012_val_00012879.JPEG n01494475/
+mv val/ILSVRC2012_val_00012880.JPEG n01443537/
+mv val/ILSVRC2012_val_00012881.JPEG n02097298/
+mv val/ILSVRC2012_val_00012882.JPEG n02877765/
+mv val/ILSVRC2012_val_00012883.JPEG n02101388/
+mv val/ILSVRC2012_val_00012884.JPEG n03271574/
+mv val/ILSVRC2012_val_00012885.JPEG n03041632/
+mv val/ILSVRC2012_val_00012886.JPEG n03895866/
+mv val/ILSVRC2012_val_00012887.JPEG n02865351/
+mv val/ILSVRC2012_val_00012888.JPEG n02091134/
+mv val/ILSVRC2012_val_00012889.JPEG n02027492/
+mv val/ILSVRC2012_val_00012890.JPEG n03201208/
+mv val/ILSVRC2012_val_00012891.JPEG n03983396/
+mv val/ILSVRC2012_val_00012892.JPEG n02364673/
+mv val/ILSVRC2012_val_00012893.JPEG n02134084/
+mv val/ILSVRC2012_val_00012894.JPEG n02165105/
+mv val/ILSVRC2012_val_00012895.JPEG n01773549/
+mv val/ILSVRC2012_val_00012896.JPEG n04127249/
+mv val/ILSVRC2012_val_00012897.JPEG n04275548/
+mv val/ILSVRC2012_val_00012898.JPEG n01883070/
+mv val/ILSVRC2012_val_00012899.JPEG n02112706/
+mv val/ILSVRC2012_val_00012900.JPEG n03776460/
+mv val/ILSVRC2012_val_00012901.JPEG n02108000/
+mv val/ILSVRC2012_val_00012902.JPEG n02397096/
+mv val/ILSVRC2012_val_00012903.JPEG n04525305/
+mv val/ILSVRC2012_val_00012904.JPEG n02113624/
+mv val/ILSVRC2012_val_00012905.JPEG n02268853/
+mv val/ILSVRC2012_val_00012906.JPEG n02091134/
+mv val/ILSVRC2012_val_00012907.JPEG n03476991/
+mv val/ILSVRC2012_val_00012908.JPEG n02815834/
+mv val/ILSVRC2012_val_00012909.JPEG n04525305/
+mv val/ILSVRC2012_val_00012910.JPEG n03857828/
+mv val/ILSVRC2012_val_00012911.JPEG n03272010/
+mv val/ILSVRC2012_val_00012912.JPEG n04523525/
+mv val/ILSVRC2012_val_00012913.JPEG n04335435/
+mv val/ILSVRC2012_val_00012914.JPEG n03595614/
+mv val/ILSVRC2012_val_00012915.JPEG n07932039/
+mv val/ILSVRC2012_val_00012916.JPEG n03345487/
+mv val/ILSVRC2012_val_00012917.JPEG n03877472/
+mv val/ILSVRC2012_val_00012918.JPEG n04485082/
+mv val/ILSVRC2012_val_00012919.JPEG n02794156/
+mv val/ILSVRC2012_val_00012920.JPEG n03877472/
+mv val/ILSVRC2012_val_00012921.JPEG n03492542/
+mv val/ILSVRC2012_val_00012922.JPEG n02114712/
+mv val/ILSVRC2012_val_00012923.JPEG n02883205/
+mv val/ILSVRC2012_val_00012924.JPEG n02106662/
+mv val/ILSVRC2012_val_00012925.JPEG n03417042/
+mv val/ILSVRC2012_val_00012926.JPEG n03617480/
+mv val/ILSVRC2012_val_00012927.JPEG n02978881/
+mv val/ILSVRC2012_val_00012928.JPEG n02101556/
+mv val/ILSVRC2012_val_00012929.JPEG n04039381/
+mv val/ILSVRC2012_val_00012930.JPEG n02105641/
+mv val/ILSVRC2012_val_00012931.JPEG n02098413/
+mv val/ILSVRC2012_val_00012932.JPEG n04552348/
+mv val/ILSVRC2012_val_00012933.JPEG n02823750/
+mv val/ILSVRC2012_val_00012934.JPEG n07753113/
+mv val/ILSVRC2012_val_00012935.JPEG n02110063/
+mv val/ILSVRC2012_val_00012936.JPEG n09332890/
+mv val/ILSVRC2012_val_00012937.JPEG n09468604/
+mv val/ILSVRC2012_val_00012938.JPEG n02457408/
+mv val/ILSVRC2012_val_00012939.JPEG n01537544/
+mv val/ILSVRC2012_val_00012940.JPEG n02497673/
+mv val/ILSVRC2012_val_00012941.JPEG n09229709/
+mv val/ILSVRC2012_val_00012942.JPEG n04311004/
+mv val/ILSVRC2012_val_00012943.JPEG n02776631/
+mv val/ILSVRC2012_val_00012944.JPEG n02692877/
+mv val/ILSVRC2012_val_00012945.JPEG n03623198/
+mv val/ILSVRC2012_val_00012946.JPEG n04328186/
+mv val/ILSVRC2012_val_00012947.JPEG n03697007/
+mv val/ILSVRC2012_val_00012948.JPEG n02102177/
+mv val/ILSVRC2012_val_00012949.JPEG n01687978/
+mv val/ILSVRC2012_val_00012950.JPEG n03207743/
+mv val/ILSVRC2012_val_00012951.JPEG n03733131/
+mv val/ILSVRC2012_val_00012952.JPEG n02099429/
+mv val/ILSVRC2012_val_00012953.JPEG n03769881/
+mv val/ILSVRC2012_val_00012954.JPEG n02099601/
+mv val/ILSVRC2012_val_00012955.JPEG n02787622/
+mv val/ILSVRC2012_val_00012956.JPEG n03000134/
+mv val/ILSVRC2012_val_00012957.JPEG n03895866/
+mv val/ILSVRC2012_val_00012958.JPEG n02127052/
+mv val/ILSVRC2012_val_00012959.JPEG n04136333/
+mv val/ILSVRC2012_val_00012960.JPEG n02106662/
+mv val/ILSVRC2012_val_00012961.JPEG n13044778/
+mv val/ILSVRC2012_val_00012962.JPEG n01981276/
+mv val/ILSVRC2012_val_00012963.JPEG n03680355/
+mv val/ILSVRC2012_val_00012964.JPEG n03372029/
+mv val/ILSVRC2012_val_00012965.JPEG n03908618/
+mv val/ILSVRC2012_val_00012966.JPEG n03877472/
+mv val/ILSVRC2012_val_00012967.JPEG n04346328/
+mv val/ILSVRC2012_val_00012968.JPEG n04557648/
+mv val/ILSVRC2012_val_00012969.JPEG n04270147/
+mv val/ILSVRC2012_val_00012970.JPEG n04428191/
+mv val/ILSVRC2012_val_00012971.JPEG n02870880/
+mv val/ILSVRC2012_val_00012972.JPEG n03297495/
+mv val/ILSVRC2012_val_00012973.JPEG n02871525/
+mv val/ILSVRC2012_val_00012974.JPEG n02391049/
+mv val/ILSVRC2012_val_00012975.JPEG n02123045/
+mv val/ILSVRC2012_val_00012976.JPEG n01871265/
+mv val/ILSVRC2012_val_00012977.JPEG n02071294/
+mv val/ILSVRC2012_val_00012978.JPEG n02119022/
+mv val/ILSVRC2012_val_00012979.JPEG n04592741/
+mv val/ILSVRC2012_val_00012980.JPEG n02509815/
+mv val/ILSVRC2012_val_00012981.JPEG n03424325/
+mv val/ILSVRC2012_val_00012982.JPEG n02514041/
+mv val/ILSVRC2012_val_00012983.JPEG n02101006/
+mv val/ILSVRC2012_val_00012984.JPEG n02747177/
+mv val/ILSVRC2012_val_00012985.JPEG n01950731/
+mv val/ILSVRC2012_val_00012986.JPEG n02172182/
+mv val/ILSVRC2012_val_00012987.JPEG n04336792/
+mv val/ILSVRC2012_val_00012988.JPEG n04356056/
+mv val/ILSVRC2012_val_00012989.JPEG n04252077/
+mv val/ILSVRC2012_val_00012990.JPEG n01740131/
+mv val/ILSVRC2012_val_00012991.JPEG n04613696/
+mv val/ILSVRC2012_val_00012992.JPEG n04023962/
+mv val/ILSVRC2012_val_00012993.JPEG n04485082/
+mv val/ILSVRC2012_val_00012994.JPEG n02128925/
+mv val/ILSVRC2012_val_00012995.JPEG n02086079/
+mv val/ILSVRC2012_val_00012996.JPEG n03983396/
+mv val/ILSVRC2012_val_00012997.JPEG n02134084/
+mv val/ILSVRC2012_val_00012998.JPEG n02133161/
+mv val/ILSVRC2012_val_00012999.JPEG n02128925/
+mv val/ILSVRC2012_val_00013000.JPEG n04517823/
+mv val/ILSVRC2012_val_00013001.JPEG n07875152/
+mv val/ILSVRC2012_val_00013002.JPEG n02128385/
+mv val/ILSVRC2012_val_00013003.JPEG n04204347/
+mv val/ILSVRC2012_val_00013004.JPEG n02077923/
+mv val/ILSVRC2012_val_00013005.JPEG n03272010/
+mv val/ILSVRC2012_val_00013006.JPEG n02840245/
+mv val/ILSVRC2012_val_00013007.JPEG n02105641/
+mv val/ILSVRC2012_val_00013008.JPEG n01817953/
+mv val/ILSVRC2012_val_00013009.JPEG n04146614/
+mv val/ILSVRC2012_val_00013010.JPEG n04554684/
+mv val/ILSVRC2012_val_00013011.JPEG n03796401/
+mv val/ILSVRC2012_val_00013012.JPEG n04039381/
+mv val/ILSVRC2012_val_00013013.JPEG n02788148/
+mv val/ILSVRC2012_val_00013014.JPEG n04483307/
+mv val/ILSVRC2012_val_00013015.JPEG n02493793/
+mv val/ILSVRC2012_val_00013016.JPEG n03692522/
+mv val/ILSVRC2012_val_00013017.JPEG n03075370/
+mv val/ILSVRC2012_val_00013018.JPEG n03733281/
+mv val/ILSVRC2012_val_00013019.JPEG n04238763/
+mv val/ILSVRC2012_val_00013020.JPEG n02815834/
+mv val/ILSVRC2012_val_00013021.JPEG n03065424/
+mv val/ILSVRC2012_val_00013022.JPEG n02672831/
+mv val/ILSVRC2012_val_00013023.JPEG n03602883/
+mv val/ILSVRC2012_val_00013024.JPEG n04346328/
+mv val/ILSVRC2012_val_00013025.JPEG n02066245/
+mv val/ILSVRC2012_val_00013026.JPEG n03444034/
+mv val/ILSVRC2012_val_00013027.JPEG n03594734/
+mv val/ILSVRC2012_val_00013028.JPEG n15075141/
+mv val/ILSVRC2012_val_00013029.JPEG n12144580/
+mv val/ILSVRC2012_val_00013030.JPEG n07579787/
+mv val/ILSVRC2012_val_00013031.JPEG n02992529/
+mv val/ILSVRC2012_val_00013032.JPEG n04515003/
+mv val/ILSVRC2012_val_00013033.JPEG n02107142/
+mv val/ILSVRC2012_val_00013034.JPEG n02117135/
+mv val/ILSVRC2012_val_00013035.JPEG n01734418/
+mv val/ILSVRC2012_val_00013036.JPEG n01693334/
+mv val/ILSVRC2012_val_00013037.JPEG n02105505/
+mv val/ILSVRC2012_val_00013038.JPEG n02992211/
+mv val/ILSVRC2012_val_00013039.JPEG n02869837/
+mv val/ILSVRC2012_val_00013040.JPEG n13133613/
+mv val/ILSVRC2012_val_00013041.JPEG n02666196/
+mv val/ILSVRC2012_val_00013042.JPEG n04041544/
+mv val/ILSVRC2012_val_00013043.JPEG n03857828/
+mv val/ILSVRC2012_val_00013044.JPEG n04418357/
+mv val/ILSVRC2012_val_00013045.JPEG n02113978/
+mv val/ILSVRC2012_val_00013046.JPEG n01744401/
+mv val/ILSVRC2012_val_00013047.JPEG n02797295/
+mv val/ILSVRC2012_val_00013048.JPEG n02699494/
+mv val/ILSVRC2012_val_00013049.JPEG n02489166/
+mv val/ILSVRC2012_val_00013050.JPEG n02098286/
+mv val/ILSVRC2012_val_00013051.JPEG n04243546/
+mv val/ILSVRC2012_val_00013052.JPEG n02134418/
+mv val/ILSVRC2012_val_00013053.JPEG n02106662/
+mv val/ILSVRC2012_val_00013054.JPEG n03670208/
+mv val/ILSVRC2012_val_00013055.JPEG n04090263/
+mv val/ILSVRC2012_val_00013056.JPEG n02692877/
+mv val/ILSVRC2012_val_00013057.JPEG n03467068/
+mv val/ILSVRC2012_val_00013058.JPEG n04238763/
+mv val/ILSVRC2012_val_00013059.JPEG n03788365/
+mv val/ILSVRC2012_val_00013060.JPEG n03657121/
+mv val/ILSVRC2012_val_00013061.JPEG n02906734/
+mv val/ILSVRC2012_val_00013062.JPEG n02326432/
+mv val/ILSVRC2012_val_00013063.JPEG n02676566/
+mv val/ILSVRC2012_val_00013064.JPEG n02607072/
+mv val/ILSVRC2012_val_00013065.JPEG n03627232/
+mv val/ILSVRC2012_val_00013066.JPEG n02894605/
+mv val/ILSVRC2012_val_00013067.JPEG n03538406/
+mv val/ILSVRC2012_val_00013068.JPEG n04136333/
+mv val/ILSVRC2012_val_00013069.JPEG n01632458/
+mv val/ILSVRC2012_val_00013070.JPEG n04125021/
+mv val/ILSVRC2012_val_00013071.JPEG n03134739/
+mv val/ILSVRC2012_val_00013072.JPEG n01697457/
+mv val/ILSVRC2012_val_00013073.JPEG n03924679/
+mv val/ILSVRC2012_val_00013074.JPEG n04243546/
+mv val/ILSVRC2012_val_00013075.JPEG n09256479/
+mv val/ILSVRC2012_val_00013076.JPEG n02493793/
+mv val/ILSVRC2012_val_00013077.JPEG n07871810/
+mv val/ILSVRC2012_val_00013078.JPEG n02177972/
+mv val/ILSVRC2012_val_00013079.JPEG n01917289/
+mv val/ILSVRC2012_val_00013080.JPEG n02088466/
+mv val/ILSVRC2012_val_00013081.JPEG n04069434/
+mv val/ILSVRC2012_val_00013082.JPEG n03891251/
+mv val/ILSVRC2012_val_00013083.JPEG n02113799/
+mv val/ILSVRC2012_val_00013084.JPEG n07711569/
+mv val/ILSVRC2012_val_00013085.JPEG n01833805/
+mv val/ILSVRC2012_val_00013086.JPEG n04270147/
+mv val/ILSVRC2012_val_00013087.JPEG n04259630/
+mv val/ILSVRC2012_val_00013088.JPEG n02859443/
+mv val/ILSVRC2012_val_00013089.JPEG n04270147/
+mv val/ILSVRC2012_val_00013090.JPEG n02110063/
+mv val/ILSVRC2012_val_00013091.JPEG n03042490/
+mv val/ILSVRC2012_val_00013092.JPEG n03290653/
+mv val/ILSVRC2012_val_00013093.JPEG n02002724/
+mv val/ILSVRC2012_val_00013094.JPEG n02100583/
+mv val/ILSVRC2012_val_00013095.JPEG n01608432/
+mv val/ILSVRC2012_val_00013096.JPEG n03710193/
+mv val/ILSVRC2012_val_00013097.JPEG n03777754/
+mv val/ILSVRC2012_val_00013098.JPEG n02971356/
+mv val/ILSVRC2012_val_00013099.JPEG n04482393/
+mv val/ILSVRC2012_val_00013100.JPEG n13037406/
+mv val/ILSVRC2012_val_00013101.JPEG n01768244/
+mv val/ILSVRC2012_val_00013102.JPEG n03929855/
+mv val/ILSVRC2012_val_00013103.JPEG n03016953/
+mv val/ILSVRC2012_val_00013104.JPEG n07584110/
+mv val/ILSVRC2012_val_00013105.JPEG n02113023/
+mv val/ILSVRC2012_val_00013106.JPEG n04447861/
+mv val/ILSVRC2012_val_00013107.JPEG n02128925/
+mv val/ILSVRC2012_val_00013108.JPEG n02988304/
+mv val/ILSVRC2012_val_00013109.JPEG n04201297/
+mv val/ILSVRC2012_val_00013110.JPEG n02006656/
+mv val/ILSVRC2012_val_00013111.JPEG n01807496/
+mv val/ILSVRC2012_val_00013112.JPEG n03658185/
+mv val/ILSVRC2012_val_00013113.JPEG n03394916/
+mv val/ILSVRC2012_val_00013114.JPEG n07716358/
+mv val/ILSVRC2012_val_00013115.JPEG n07579787/
+mv val/ILSVRC2012_val_00013116.JPEG n02102177/
+mv val/ILSVRC2012_val_00013117.JPEG n01729322/
+mv val/ILSVRC2012_val_00013118.JPEG n03775071/
+mv val/ILSVRC2012_val_00013119.JPEG n04482393/
+mv val/ILSVRC2012_val_00013120.JPEG n02415577/
+mv val/ILSVRC2012_val_00013121.JPEG n02607072/
+mv val/ILSVRC2012_val_00013122.JPEG n02909870/
+mv val/ILSVRC2012_val_00013123.JPEG n03255030/
+mv val/ILSVRC2012_val_00013124.JPEG n03344393/
+mv val/ILSVRC2012_val_00013125.JPEG n02325366/
+mv val/ILSVRC2012_val_00013126.JPEG n02102480/
+mv val/ILSVRC2012_val_00013127.JPEG n02102177/
+mv val/ILSVRC2012_val_00013128.JPEG n04423845/
+mv val/ILSVRC2012_val_00013129.JPEG n02130308/
+mv val/ILSVRC2012_val_00013130.JPEG n03785016/
+mv val/ILSVRC2012_val_00013131.JPEG n02787622/
+mv val/ILSVRC2012_val_00013132.JPEG n04200800/
+mv val/ILSVRC2012_val_00013133.JPEG n02087046/
+mv val/ILSVRC2012_val_00013134.JPEG n04487394/
+mv val/ILSVRC2012_val_00013135.JPEG n04152593/
+mv val/ILSVRC2012_val_00013136.JPEG n04065272/
+mv val/ILSVRC2012_val_00013137.JPEG n07831146/
+mv val/ILSVRC2012_val_00013138.JPEG n02843684/
+mv val/ILSVRC2012_val_00013139.JPEG n07248320/
+mv val/ILSVRC2012_val_00013140.JPEG n03498962/
+mv val/ILSVRC2012_val_00013141.JPEG n02128757/
+mv val/ILSVRC2012_val_00013142.JPEG n04523525/
+mv val/ILSVRC2012_val_00013143.JPEG n02999410/
+mv val/ILSVRC2012_val_00013144.JPEG n03697007/
+mv val/ILSVRC2012_val_00013145.JPEG n02097209/
+mv val/ILSVRC2012_val_00013146.JPEG n11939491/
+mv val/ILSVRC2012_val_00013147.JPEG n04141327/
+mv val/ILSVRC2012_val_00013148.JPEG n07248320/
+mv val/ILSVRC2012_val_00013149.JPEG n04461696/
+mv val/ILSVRC2012_val_00013150.JPEG n02110185/
+mv val/ILSVRC2012_val_00013151.JPEG n02483708/
+mv val/ILSVRC2012_val_00013152.JPEG n03902125/
+mv val/ILSVRC2012_val_00013153.JPEG n02168699/
+mv val/ILSVRC2012_val_00013154.JPEG n02834397/
+mv val/ILSVRC2012_val_00013155.JPEG n02108915/
+mv val/ILSVRC2012_val_00013156.JPEG n02963159/
+mv val/ILSVRC2012_val_00013157.JPEG n03841143/
+mv val/ILSVRC2012_val_00013158.JPEG n02120505/
+mv val/ILSVRC2012_val_00013159.JPEG n02111129/
+mv val/ILSVRC2012_val_00013160.JPEG n02112350/
+mv val/ILSVRC2012_val_00013161.JPEG n03793489/
+mv val/ILSVRC2012_val_00013162.JPEG n03649909/
+mv val/ILSVRC2012_val_00013163.JPEG n04090263/
+mv val/ILSVRC2012_val_00013164.JPEG n02727426/
+mv val/ILSVRC2012_val_00013165.JPEG n04033995/
+mv val/ILSVRC2012_val_00013166.JPEG n01608432/
+mv val/ILSVRC2012_val_00013167.JPEG n02364673/
+mv val/ILSVRC2012_val_00013168.JPEG n02895154/
+mv val/ILSVRC2012_val_00013169.JPEG n07730033/
+mv val/ILSVRC2012_val_00013170.JPEG n02423022/
+mv val/ILSVRC2012_val_00013171.JPEG n02999410/
+mv val/ILSVRC2012_val_00013172.JPEG n07579787/
+mv val/ILSVRC2012_val_00013173.JPEG n02086079/
+mv val/ILSVRC2012_val_00013174.JPEG n01631663/
+mv val/ILSVRC2012_val_00013175.JPEG n02494079/
+mv val/ILSVRC2012_val_00013176.JPEG n04118776/
+mv val/ILSVRC2012_val_00013177.JPEG n03467068/
+mv val/ILSVRC2012_val_00013178.JPEG n03476684/
+mv val/ILSVRC2012_val_00013179.JPEG n03954731/
+mv val/ILSVRC2012_val_00013180.JPEG n03775546/
+mv val/ILSVRC2012_val_00013181.JPEG n02981792/
+mv val/ILSVRC2012_val_00013182.JPEG n01873310/
+mv val/ILSVRC2012_val_00013183.JPEG n01980166/
+mv val/ILSVRC2012_val_00013184.JPEG n04049303/
+mv val/ILSVRC2012_val_00013185.JPEG n04099969/
+mv val/ILSVRC2012_val_00013186.JPEG n02965783/
+mv val/ILSVRC2012_val_00013187.JPEG n02281787/
+mv val/ILSVRC2012_val_00013188.JPEG n02823750/
+mv val/ILSVRC2012_val_00013189.JPEG n02655020/
+mv val/ILSVRC2012_val_00013190.JPEG n02403003/
+mv val/ILSVRC2012_val_00013191.JPEG n02951358/
+mv val/ILSVRC2012_val_00013192.JPEG n02028035/
+mv val/ILSVRC2012_val_00013193.JPEG n02504458/
+mv val/ILSVRC2012_val_00013194.JPEG n03814639/
+mv val/ILSVRC2012_val_00013195.JPEG n02085620/
+mv val/ILSVRC2012_val_00013196.JPEG n04486054/
+mv val/ILSVRC2012_val_00013197.JPEG n03761084/
+mv val/ILSVRC2012_val_00013198.JPEG n07930864/
+mv val/ILSVRC2012_val_00013199.JPEG n04522168/
+mv val/ILSVRC2012_val_00013200.JPEG n04347754/
+mv val/ILSVRC2012_val_00013201.JPEG n01644373/
+mv val/ILSVRC2012_val_00013202.JPEG n02992211/
+mv val/ILSVRC2012_val_00013203.JPEG n04483307/
+mv val/ILSVRC2012_val_00013204.JPEG n02102973/
+mv val/ILSVRC2012_val_00013205.JPEG n04467665/
+mv val/ILSVRC2012_val_00013206.JPEG n03026506/
+mv val/ILSVRC2012_val_00013207.JPEG n03026506/
+mv val/ILSVRC2012_val_00013208.JPEG n07697537/
+mv val/ILSVRC2012_val_00013209.JPEG n01532829/
+mv val/ILSVRC2012_val_00013210.JPEG n04442312/
+mv val/ILSVRC2012_val_00013211.JPEG n02108551/
+mv val/ILSVRC2012_val_00013212.JPEG n01824575/
+mv val/ILSVRC2012_val_00013213.JPEG n04254777/
+mv val/ILSVRC2012_val_00013214.JPEG n03109150/
+mv val/ILSVRC2012_val_00013215.JPEG n01728920/
+mv val/ILSVRC2012_val_00013216.JPEG n04380533/
+mv val/ILSVRC2012_val_00013217.JPEG n02795169/
+mv val/ILSVRC2012_val_00013218.JPEG n04493381/
+mv val/ILSVRC2012_val_00013219.JPEG n03141823/
+mv val/ILSVRC2012_val_00013220.JPEG n01817953/
+mv val/ILSVRC2012_val_00013221.JPEG n04026417/
+mv val/ILSVRC2012_val_00013222.JPEG n02909870/
+mv val/ILSVRC2012_val_00013223.JPEG n01601694/
+mv val/ILSVRC2012_val_00013224.JPEG n02834397/
+mv val/ILSVRC2012_val_00013225.JPEG n03376595/
+mv val/ILSVRC2012_val_00013226.JPEG n02909870/
+mv val/ILSVRC2012_val_00013227.JPEG n07711569/
+mv val/ILSVRC2012_val_00013228.JPEG n03891251/
+mv val/ILSVRC2012_val_00013229.JPEG n01806567/
+mv val/ILSVRC2012_val_00013230.JPEG n03854065/
+mv val/ILSVRC2012_val_00013231.JPEG n03814906/
+mv val/ILSVRC2012_val_00013232.JPEG n02808304/
+mv val/ILSVRC2012_val_00013233.JPEG n04153751/
+mv val/ILSVRC2012_val_00013234.JPEG n07768694/
+mv val/ILSVRC2012_val_00013235.JPEG n04532106/
+mv val/ILSVRC2012_val_00013236.JPEG n02102973/
+mv val/ILSVRC2012_val_00013237.JPEG n02346627/
+mv val/ILSVRC2012_val_00013238.JPEG n13133613/
+mv val/ILSVRC2012_val_00013239.JPEG n02129604/
+mv val/ILSVRC2012_val_00013240.JPEG n02443484/
+mv val/ILSVRC2012_val_00013241.JPEG n03792972/
+mv val/ILSVRC2012_val_00013242.JPEG n02804414/
+mv val/ILSVRC2012_val_00013243.JPEG n02097298/
+mv val/ILSVRC2012_val_00013244.JPEG n02708093/
+mv val/ILSVRC2012_val_00013245.JPEG n01748264/
+mv val/ILSVRC2012_val_00013246.JPEG n03992509/
+mv val/ILSVRC2012_val_00013247.JPEG n04591713/
+mv val/ILSVRC2012_val_00013248.JPEG n02105162/
+mv val/ILSVRC2012_val_00013249.JPEG n03840681/
+mv val/ILSVRC2012_val_00013250.JPEG n02276258/
+mv val/ILSVRC2012_val_00013251.JPEG n02100583/
+mv val/ILSVRC2012_val_00013252.JPEG n02408429/
+mv val/ILSVRC2012_val_00013253.JPEG n03770679/
+mv val/ILSVRC2012_val_00013254.JPEG n07717556/
+mv val/ILSVRC2012_val_00013255.JPEG n02280649/
+mv val/ILSVRC2012_val_00013256.JPEG n02006656/
+mv val/ILSVRC2012_val_00013257.JPEG n04560804/
+mv val/ILSVRC2012_val_00013258.JPEG n04285008/
+mv val/ILSVRC2012_val_00013259.JPEG n03868863/
+mv val/ILSVRC2012_val_00013260.JPEG n02088238/
+mv val/ILSVRC2012_val_00013261.JPEG n02799071/
+mv val/ILSVRC2012_val_00013262.JPEG n04560804/
+mv val/ILSVRC2012_val_00013263.JPEG n02108551/
+mv val/ILSVRC2012_val_00013264.JPEG n02487347/
+mv val/ILSVRC2012_val_00013265.JPEG n01614925/
+mv val/ILSVRC2012_val_00013266.JPEG n04505470/
+mv val/ILSVRC2012_val_00013267.JPEG n04090263/
+mv val/ILSVRC2012_val_00013268.JPEG n03661043/
+mv val/ILSVRC2012_val_00013269.JPEG n01675722/
+mv val/ILSVRC2012_val_00013270.JPEG n01531178/
+mv val/ILSVRC2012_val_00013271.JPEG n01632458/
+mv val/ILSVRC2012_val_00013272.JPEG n01695060/
+mv val/ILSVRC2012_val_00013273.JPEG n04254777/
+mv val/ILSVRC2012_val_00013274.JPEG n04355933/
+mv val/ILSVRC2012_val_00013275.JPEG n03743016/
+mv val/ILSVRC2012_val_00013276.JPEG n04259630/
+mv val/ILSVRC2012_val_00013277.JPEG n01534433/
+mv val/ILSVRC2012_val_00013278.JPEG n02110958/
+mv val/ILSVRC2012_val_00013279.JPEG n02112350/
+mv val/ILSVRC2012_val_00013280.JPEG n02488702/
+mv val/ILSVRC2012_val_00013281.JPEG n02687172/
+mv val/ILSVRC2012_val_00013282.JPEG n09246464/
+mv val/ILSVRC2012_val_00013283.JPEG n02071294/
+mv val/ILSVRC2012_val_00013284.JPEG n02497673/
+mv val/ILSVRC2012_val_00013285.JPEG n03871628/
+mv val/ILSVRC2012_val_00013286.JPEG n07717556/
+mv val/ILSVRC2012_val_00013287.JPEG n02105412/
+mv val/ILSVRC2012_val_00013288.JPEG n02999410/
+mv val/ILSVRC2012_val_00013289.JPEG n02105412/
+mv val/ILSVRC2012_val_00013290.JPEG n04208210/
+mv val/ILSVRC2012_val_00013291.JPEG n04589890/
+mv val/ILSVRC2012_val_00013292.JPEG n03379051/
+mv val/ILSVRC2012_val_00013293.JPEG n03404251/
+mv val/ILSVRC2012_val_00013294.JPEG n03014705/
+mv val/ILSVRC2012_val_00013295.JPEG n04146614/
+mv val/ILSVRC2012_val_00013296.JPEG n03938244/
+mv val/ILSVRC2012_val_00013297.JPEG n02107142/
+mv val/ILSVRC2012_val_00013298.JPEG n03452741/
+mv val/ILSVRC2012_val_00013299.JPEG n01667114/
+mv val/ILSVRC2012_val_00013300.JPEG n04311174/
+mv val/ILSVRC2012_val_00013301.JPEG n01667778/
+mv val/ILSVRC2012_val_00013302.JPEG n03127747/
+mv val/ILSVRC2012_val_00013303.JPEG n02105412/
+mv val/ILSVRC2012_val_00013304.JPEG n09399592/
+mv val/ILSVRC2012_val_00013305.JPEG n07716906/
+mv val/ILSVRC2012_val_00013306.JPEG n03673027/
+mv val/ILSVRC2012_val_00013307.JPEG n03197337/
+mv val/ILSVRC2012_val_00013308.JPEG n03450230/
+mv val/ILSVRC2012_val_00013309.JPEG n02113186/
+mv val/ILSVRC2012_val_00013310.JPEG n01775062/
+mv val/ILSVRC2012_val_00013311.JPEG n04380533/
+mv val/ILSVRC2012_val_00013312.JPEG n06359193/
+mv val/ILSVRC2012_val_00013313.JPEG n03483316/
+mv val/ILSVRC2012_val_00013314.JPEG n02172182/
+mv val/ILSVRC2012_val_00013315.JPEG n03496892/
+mv val/ILSVRC2012_val_00013316.JPEG n03843555/
+mv val/ILSVRC2012_val_00013317.JPEG n04476259/
+mv val/ILSVRC2012_val_00013318.JPEG n02110806/
+mv val/ILSVRC2012_val_00013319.JPEG n04467665/
+mv val/ILSVRC2012_val_00013320.JPEG n04548280/
+mv val/ILSVRC2012_val_00013321.JPEG n01518878/
+mv val/ILSVRC2012_val_00013322.JPEG n02281787/
+mv val/ILSVRC2012_val_00013323.JPEG n02093647/
+mv val/ILSVRC2012_val_00013324.JPEG n04404412/
+mv val/ILSVRC2012_val_00013325.JPEG n04356056/
+mv val/ILSVRC2012_val_00013326.JPEG n03840681/
+mv val/ILSVRC2012_val_00013327.JPEG n03995372/
+mv val/ILSVRC2012_val_00013328.JPEG n02326432/
+mv val/ILSVRC2012_val_00013329.JPEG n02777292/
+mv val/ILSVRC2012_val_00013330.JPEG n01776313/
+mv val/ILSVRC2012_val_00013331.JPEG n03220513/
+mv val/ILSVRC2012_val_00013332.JPEG n02795169/
+mv val/ILSVRC2012_val_00013333.JPEG n02074367/
+mv val/ILSVRC2012_val_00013334.JPEG n01968897/
+mv val/ILSVRC2012_val_00013335.JPEG n07693725/
+mv val/ILSVRC2012_val_00013336.JPEG n02906734/
+mv val/ILSVRC2012_val_00013337.JPEG n03777754/
+mv val/ILSVRC2012_val_00013338.JPEG n02497673/
+mv val/ILSVRC2012_val_00013339.JPEG n03126707/
+mv val/ILSVRC2012_val_00013340.JPEG n04259630/
+mv val/ILSVRC2012_val_00013341.JPEG n03729826/
+mv val/ILSVRC2012_val_00013342.JPEG n04026417/
+mv val/ILSVRC2012_val_00013343.JPEG n01855032/
+mv val/ILSVRC2012_val_00013344.JPEG n02808440/
+mv val/ILSVRC2012_val_00013345.JPEG n04346328/
+mv val/ILSVRC2012_val_00013346.JPEG n03930313/
+mv val/ILSVRC2012_val_00013347.JPEG n04560804/
+mv val/ILSVRC2012_val_00013348.JPEG n03127925/
+mv val/ILSVRC2012_val_00013349.JPEG n07684084/
+mv val/ILSVRC2012_val_00013350.JPEG n04417672/
+mv val/ILSVRC2012_val_00013351.JPEG n02172182/
+mv val/ILSVRC2012_val_00013352.JPEG n02325366/
+mv val/ILSVRC2012_val_00013353.JPEG n03899768/
+mv val/ILSVRC2012_val_00013354.JPEG n01644900/
+mv val/ILSVRC2012_val_00013355.JPEG n02113186/
+mv val/ILSVRC2012_val_00013356.JPEG n03710637/
+mv val/ILSVRC2012_val_00013357.JPEG n03857828/
+mv val/ILSVRC2012_val_00013358.JPEG n02114548/
+mv val/ILSVRC2012_val_00013359.JPEG n04326547/
+mv val/ILSVRC2012_val_00013360.JPEG n02643566/
+mv val/ILSVRC2012_val_00013361.JPEG n02092002/
+mv val/ILSVRC2012_val_00013362.JPEG n03124170/
+mv val/ILSVRC2012_val_00013363.JPEG n02281406/
+mv val/ILSVRC2012_val_00013364.JPEG n01806567/
+mv val/ILSVRC2012_val_00013365.JPEG n04254680/
+mv val/ILSVRC2012_val_00013366.JPEG n03344393/
+mv val/ILSVRC2012_val_00013367.JPEG n01532829/
+mv val/ILSVRC2012_val_00013368.JPEG n02116738/
+mv val/ILSVRC2012_val_00013369.JPEG n02116738/
+mv val/ILSVRC2012_val_00013370.JPEG n02094258/
+mv val/ILSVRC2012_val_00013371.JPEG n03690938/
+mv val/ILSVRC2012_val_00013372.JPEG n03272562/
+mv val/ILSVRC2012_val_00013373.JPEG n03110669/
+mv val/ILSVRC2012_val_00013374.JPEG n03786901/
+mv val/ILSVRC2012_val_00013375.JPEG n07920052/
+mv val/ILSVRC2012_val_00013376.JPEG n04355933/
+mv val/ILSVRC2012_val_00013377.JPEG n01978455/
+mv val/ILSVRC2012_val_00013378.JPEG n01806143/
+mv val/ILSVRC2012_val_00013379.JPEG n01944390/
+mv val/ILSVRC2012_val_00013380.JPEG n03450230/
+mv val/ILSVRC2012_val_00013381.JPEG n02088364/
+mv val/ILSVRC2012_val_00013382.JPEG n03956157/
+mv val/ILSVRC2012_val_00013383.JPEG n02437312/
+mv val/ILSVRC2012_val_00013384.JPEG n03590841/
+mv val/ILSVRC2012_val_00013385.JPEG n04344873/
+mv val/ILSVRC2012_val_00013386.JPEG n02277742/
+mv val/ILSVRC2012_val_00013387.JPEG n02111277/
+mv val/ILSVRC2012_val_00013388.JPEG n01784675/
+mv val/ILSVRC2012_val_00013389.JPEG n04483307/
+mv val/ILSVRC2012_val_00013390.JPEG n02132136/
+mv val/ILSVRC2012_val_00013391.JPEG n04019541/
+mv val/ILSVRC2012_val_00013392.JPEG n01693334/
+mv val/ILSVRC2012_val_00013393.JPEG n01608432/
+mv val/ILSVRC2012_val_00013394.JPEG n01667114/
+mv val/ILSVRC2012_val_00013395.JPEG n02236044/
+mv val/ILSVRC2012_val_00013396.JPEG n03775546/
+mv val/ILSVRC2012_val_00013397.JPEG n01739381/
+mv val/ILSVRC2012_val_00013398.JPEG n02100583/
+mv val/ILSVRC2012_val_00013399.JPEG n02090622/
+mv val/ILSVRC2012_val_00013400.JPEG n01729322/
+mv val/ILSVRC2012_val_00013401.JPEG n04350905/
+mv val/ILSVRC2012_val_00013402.JPEG n02056570/
+mv val/ILSVRC2012_val_00013403.JPEG n04612504/
+mv val/ILSVRC2012_val_00013404.JPEG n04505470/
+mv val/ILSVRC2012_val_00013405.JPEG n12057211/
+mv val/ILSVRC2012_val_00013406.JPEG n03837869/
+mv val/ILSVRC2012_val_00013407.JPEG n01531178/
+mv val/ILSVRC2012_val_00013408.JPEG n04376876/
+mv val/ILSVRC2012_val_00013409.JPEG n02454379/
+mv val/ILSVRC2012_val_00013410.JPEG n02124075/
+mv val/ILSVRC2012_val_00013411.JPEG n02395406/
+mv val/ILSVRC2012_val_00013412.JPEG n02114367/
+mv val/ILSVRC2012_val_00013413.JPEG n03481172/
+mv val/ILSVRC2012_val_00013414.JPEG n02109047/
+mv val/ILSVRC2012_val_00013415.JPEG n07715103/
+mv val/ILSVRC2012_val_00013416.JPEG n04154565/
+mv val/ILSVRC2012_val_00013417.JPEG n02423022/
+mv val/ILSVRC2012_val_00013418.JPEG n01756291/
+mv val/ILSVRC2012_val_00013419.JPEG n02108089/
+mv val/ILSVRC2012_val_00013420.JPEG n02493793/
+mv val/ILSVRC2012_val_00013421.JPEG n03602883/
+mv val/ILSVRC2012_val_00013422.JPEG n02168699/
+mv val/ILSVRC2012_val_00013423.JPEG n01978455/
+mv val/ILSVRC2012_val_00013424.JPEG n02097298/
+mv val/ILSVRC2012_val_00013425.JPEG n02447366/
+mv val/ILSVRC2012_val_00013426.JPEG n04229816/
+mv val/ILSVRC2012_val_00013427.JPEG n07583066/
+mv val/ILSVRC2012_val_00013428.JPEG n03207743/
+mv val/ILSVRC2012_val_00013429.JPEG n07248320/
+mv val/ILSVRC2012_val_00013430.JPEG n02100583/
+mv val/ILSVRC2012_val_00013431.JPEG n02823750/
+mv val/ILSVRC2012_val_00013432.JPEG n01608432/
+mv val/ILSVRC2012_val_00013433.JPEG n04418357/
+mv val/ILSVRC2012_val_00013434.JPEG n01833805/
+mv val/ILSVRC2012_val_00013435.JPEG n03930630/
+mv val/ILSVRC2012_val_00013436.JPEG n03425413/
+mv val/ILSVRC2012_val_00013437.JPEG n02788148/
+mv val/ILSVRC2012_val_00013438.JPEG n03637318/
+mv val/ILSVRC2012_val_00013439.JPEG n04265275/
+mv val/ILSVRC2012_val_00013440.JPEG n02281787/
+mv val/ILSVRC2012_val_00013441.JPEG n04335435/
+mv val/ILSVRC2012_val_00013442.JPEG n02093428/
+mv val/ILSVRC2012_val_00013443.JPEG n06359193/
+mv val/ILSVRC2012_val_00013444.JPEG n03944341/
+mv val/ILSVRC2012_val_00013445.JPEG n04041544/
+mv val/ILSVRC2012_val_00013446.JPEG n04515003/
+mv val/ILSVRC2012_val_00013447.JPEG n02106550/
+mv val/ILSVRC2012_val_00013448.JPEG n02097130/
+mv val/ILSVRC2012_val_00013449.JPEG n02837789/
+mv val/ILSVRC2012_val_00013450.JPEG n07753275/
+mv val/ILSVRC2012_val_00013451.JPEG n04026417/
+mv val/ILSVRC2012_val_00013452.JPEG n03673027/
+mv val/ILSVRC2012_val_00013453.JPEG n03887697/
+mv val/ILSVRC2012_val_00013454.JPEG n03110669/
+mv val/ILSVRC2012_val_00013455.JPEG n03769881/
+mv val/ILSVRC2012_val_00013456.JPEG n01532829/
+mv val/ILSVRC2012_val_00013457.JPEG n02006656/
+mv val/ILSVRC2012_val_00013458.JPEG n04296562/
+mv val/ILSVRC2012_val_00013459.JPEG n04347754/
+mv val/ILSVRC2012_val_00013460.JPEG n01828970/
+mv val/ILSVRC2012_val_00013461.JPEG n03125729/
+mv val/ILSVRC2012_val_00013462.JPEG n03877472/
+mv val/ILSVRC2012_val_00013463.JPEG n02096051/
+mv val/ILSVRC2012_val_00013464.JPEG n04483307/
+mv val/ILSVRC2012_val_00013465.JPEG n02398521/
+mv val/ILSVRC2012_val_00013466.JPEG n03770679/
+mv val/ILSVRC2012_val_00013467.JPEG n02106662/
+mv val/ILSVRC2012_val_00013468.JPEG n03775546/
+mv val/ILSVRC2012_val_00013469.JPEG n04347754/
+mv val/ILSVRC2012_val_00013470.JPEG n02676566/
+mv val/ILSVRC2012_val_00013471.JPEG n03690938/
+mv val/ILSVRC2012_val_00013472.JPEG n07831146/
+mv val/ILSVRC2012_val_00013473.JPEG n04398044/
+mv val/ILSVRC2012_val_00013474.JPEG n01985128/
+mv val/ILSVRC2012_val_00013475.JPEG n02109047/
+mv val/ILSVRC2012_val_00013476.JPEG n03785016/
+mv val/ILSVRC2012_val_00013477.JPEG n03494278/
+mv val/ILSVRC2012_val_00013478.JPEG n03792972/
+mv val/ILSVRC2012_val_00013479.JPEG n02114367/
+mv val/ILSVRC2012_val_00013480.JPEG n03777754/
+mv val/ILSVRC2012_val_00013481.JPEG n04090263/
+mv val/ILSVRC2012_val_00013482.JPEG n02132136/
+mv val/ILSVRC2012_val_00013483.JPEG n03134739/
+mv val/ILSVRC2012_val_00013484.JPEG n01491361/
+mv val/ILSVRC2012_val_00013485.JPEG n09332890/
+mv val/ILSVRC2012_val_00013486.JPEG n03803284/
+mv val/ILSVRC2012_val_00013487.JPEG n02120079/
+mv val/ILSVRC2012_val_00013488.JPEG n03075370/
+mv val/ILSVRC2012_val_00013489.JPEG n02104365/
+mv val/ILSVRC2012_val_00013490.JPEG n03884397/
+mv val/ILSVRC2012_val_00013491.JPEG n02790996/
+mv val/ILSVRC2012_val_00013492.JPEG n01751748/
+mv val/ILSVRC2012_val_00013493.JPEG n07695742/
+mv val/ILSVRC2012_val_00013494.JPEG n02123045/
+mv val/ILSVRC2012_val_00013495.JPEG n03759954/
+mv val/ILSVRC2012_val_00013496.JPEG n03733131/
+mv val/ILSVRC2012_val_00013497.JPEG n12998815/
+mv val/ILSVRC2012_val_00013498.JPEG n03223299/
+mv val/ILSVRC2012_val_00013499.JPEG n07745940/
+mv val/ILSVRC2012_val_00013500.JPEG n04532106/
+mv val/ILSVRC2012_val_00013501.JPEG n02111889/
+mv val/ILSVRC2012_val_00013502.JPEG n02708093/
+mv val/ILSVRC2012_val_00013503.JPEG n01944390/
+mv val/ILSVRC2012_val_00013504.JPEG n01534433/
+mv val/ILSVRC2012_val_00013505.JPEG n02361337/
+mv val/ILSVRC2012_val_00013506.JPEG n02113624/
+mv val/ILSVRC2012_val_00013507.JPEG n02090721/
+mv val/ILSVRC2012_val_00013508.JPEG n02093256/
+mv val/ILSVRC2012_val_00013509.JPEG n02025239/
+mv val/ILSVRC2012_val_00013510.JPEG n04355933/
+mv val/ILSVRC2012_val_00013511.JPEG n03452741/
+mv val/ILSVRC2012_val_00013512.JPEG n01530575/
+mv val/ILSVRC2012_val_00013513.JPEG n01443537/
+mv val/ILSVRC2012_val_00013514.JPEG n04209239/
+mv val/ILSVRC2012_val_00013515.JPEG n02037110/
+mv val/ILSVRC2012_val_00013516.JPEG n04154565/
+mv val/ILSVRC2012_val_00013517.JPEG n03594945/
+mv val/ILSVRC2012_val_00013518.JPEG n04465501/
+mv val/ILSVRC2012_val_00013519.JPEG n07714990/
+mv val/ILSVRC2012_val_00013520.JPEG n03868863/
+mv val/ILSVRC2012_val_00013521.JPEG n01819313/
+mv val/ILSVRC2012_val_00013522.JPEG n04026417/
+mv val/ILSVRC2012_val_00013523.JPEG n04553703/
+mv val/ILSVRC2012_val_00013524.JPEG n02112706/
+mv val/ILSVRC2012_val_00013525.JPEG n01980166/
+mv val/ILSVRC2012_val_00013526.JPEG n02797295/
+mv val/ILSVRC2012_val_00013527.JPEG n03888257/
+mv val/ILSVRC2012_val_00013528.JPEG n02342885/
+mv val/ILSVRC2012_val_00013529.JPEG n03216828/
+mv val/ILSVRC2012_val_00013530.JPEG n03388043/
+mv val/ILSVRC2012_val_00013531.JPEG n03804744/
+mv val/ILSVRC2012_val_00013532.JPEG n02138441/
+mv val/ILSVRC2012_val_00013533.JPEG n01689811/
+mv val/ILSVRC2012_val_00013534.JPEG n04553703/
+mv val/ILSVRC2012_val_00013535.JPEG n02231487/
+mv val/ILSVRC2012_val_00013536.JPEG n04208210/
+mv val/ILSVRC2012_val_00013537.JPEG n03372029/
+mv val/ILSVRC2012_val_00013538.JPEG n02096177/
+mv val/ILSVRC2012_val_00013539.JPEG n04429376/
+mv val/ILSVRC2012_val_00013540.JPEG n03272010/
+mv val/ILSVRC2012_val_00013541.JPEG n02493509/
+mv val/ILSVRC2012_val_00013542.JPEG n03127747/
+mv val/ILSVRC2012_val_00013543.JPEG n02786058/
+mv val/ILSVRC2012_val_00013544.JPEG n03777568/
+mv val/ILSVRC2012_val_00013545.JPEG n04238763/
+mv val/ILSVRC2012_val_00013546.JPEG n03535780/
+mv val/ILSVRC2012_val_00013547.JPEG n03938244/
+mv val/ILSVRC2012_val_00013548.JPEG n02408429/
+mv val/ILSVRC2012_val_00013549.JPEG n02097658/
+mv val/ILSVRC2012_val_00013550.JPEG n02123159/
+mv val/ILSVRC2012_val_00013551.JPEG n03891251/
+mv val/ILSVRC2012_val_00013552.JPEG n02165105/
+mv val/ILSVRC2012_val_00013553.JPEG n02437312/
+mv val/ILSVRC2012_val_00013554.JPEG n02114712/
+mv val/ILSVRC2012_val_00013555.JPEG n04540053/
+mv val/ILSVRC2012_val_00013556.JPEG n04270147/
+mv val/ILSVRC2012_val_00013557.JPEG n02113186/
+mv val/ILSVRC2012_val_00013558.JPEG n02281406/
+mv val/ILSVRC2012_val_00013559.JPEG n03899768/
+mv val/ILSVRC2012_val_00013560.JPEG n04442312/
+mv val/ILSVRC2012_val_00013561.JPEG n04023962/
+mv val/ILSVRC2012_val_00013562.JPEG n02963159/
+mv val/ILSVRC2012_val_00013563.JPEG n02102973/
+mv val/ILSVRC2012_val_00013564.JPEG n01860187/
+mv val/ILSVRC2012_val_00013565.JPEG n03297495/
+mv val/ILSVRC2012_val_00013566.JPEG n03733805/
+mv val/ILSVRC2012_val_00013567.JPEG n03980874/
+mv val/ILSVRC2012_val_00013568.JPEG n04336792/
+mv val/ILSVRC2012_val_00013569.JPEG n04366367/
+mv val/ILSVRC2012_val_00013570.JPEG n02412080/
+mv val/ILSVRC2012_val_00013571.JPEG n02966687/
+mv val/ILSVRC2012_val_00013572.JPEG n03763968/
+mv val/ILSVRC2012_val_00013573.JPEG n02098286/
+mv val/ILSVRC2012_val_00013574.JPEG n01756291/
+mv val/ILSVRC2012_val_00013575.JPEG n03929855/
+mv val/ILSVRC2012_val_00013576.JPEG n03944341/
+mv val/ILSVRC2012_val_00013577.JPEG n03271574/
+mv val/ILSVRC2012_val_00013578.JPEG n04026417/
+mv val/ILSVRC2012_val_00013579.JPEG n07754684/
+mv val/ILSVRC2012_val_00013580.JPEG n01985128/
+mv val/ILSVRC2012_val_00013581.JPEG n07753113/
+mv val/ILSVRC2012_val_00013582.JPEG n01675722/
+mv val/ILSVRC2012_val_00013583.JPEG n02106166/
+mv val/ILSVRC2012_val_00013584.JPEG n02116738/
+mv val/ILSVRC2012_val_00013585.JPEG n03916031/
+mv val/ILSVRC2012_val_00013586.JPEG n04065272/
+mv val/ILSVRC2012_val_00013587.JPEG n03110669/
+mv val/ILSVRC2012_val_00013588.JPEG n07747607/
+mv val/ILSVRC2012_val_00013589.JPEG n02009912/
+mv val/ILSVRC2012_val_00013590.JPEG n03950228/
+mv val/ILSVRC2012_val_00013591.JPEG n03483316/
+mv val/ILSVRC2012_val_00013592.JPEG n07716358/
+mv val/ILSVRC2012_val_00013593.JPEG n03216828/
+mv val/ILSVRC2012_val_00013594.JPEG n09835506/
+mv val/ILSVRC2012_val_00013595.JPEG n03393912/
+mv val/ILSVRC2012_val_00013596.JPEG n02526121/
+mv val/ILSVRC2012_val_00013597.JPEG n03770439/
+mv val/ILSVRC2012_val_00013598.JPEG n02002724/
+mv val/ILSVRC2012_val_00013599.JPEG n02871525/
+mv val/ILSVRC2012_val_00013600.JPEG n01776313/
+mv val/ILSVRC2012_val_00013601.JPEG n04355933/
+mv val/ILSVRC2012_val_00013602.JPEG n03450230/
+mv val/ILSVRC2012_val_00013603.JPEG n02025239/
+mv val/ILSVRC2012_val_00013604.JPEG n02107312/
+mv val/ILSVRC2012_val_00013605.JPEG n04606251/
+mv val/ILSVRC2012_val_00013606.JPEG n03063599/
+mv val/ILSVRC2012_val_00013607.JPEG n01795545/
+mv val/ILSVRC2012_val_00013608.JPEG n04254777/
+mv val/ILSVRC2012_val_00013609.JPEG n02120079/
+mv val/ILSVRC2012_val_00013610.JPEG n01833805/
+mv val/ILSVRC2012_val_00013611.JPEG n02099601/
+mv val/ILSVRC2012_val_00013612.JPEG n13052670/
+mv val/ILSVRC2012_val_00013613.JPEG n02676566/
+mv val/ILSVRC2012_val_00013614.JPEG n03457902/
+mv val/ILSVRC2012_val_00013615.JPEG n03720891/
+mv val/ILSVRC2012_val_00013616.JPEG n03793489/
+mv val/ILSVRC2012_val_00013617.JPEG n01775062/
+mv val/ILSVRC2012_val_00013618.JPEG n01978287/
+mv val/ILSVRC2012_val_00013619.JPEG n10565667/
+mv val/ILSVRC2012_val_00013620.JPEG n02916936/
+mv val/ILSVRC2012_val_00013621.JPEG n03599486/
+mv val/ILSVRC2012_val_00013622.JPEG n02110958/
+mv val/ILSVRC2012_val_00013623.JPEG n01443537/
+mv val/ILSVRC2012_val_00013624.JPEG n04204238/
+mv val/ILSVRC2012_val_00013625.JPEG n02672831/
+mv val/ILSVRC2012_val_00013626.JPEG n07717410/
+mv val/ILSVRC2012_val_00013627.JPEG n04209239/
+mv val/ILSVRC2012_val_00013628.JPEG n01491361/
+mv val/ILSVRC2012_val_00013629.JPEG n02963159/
+mv val/ILSVRC2012_val_00013630.JPEG n03424325/
+mv val/ILSVRC2012_val_00013631.JPEG n03697007/
+mv val/ILSVRC2012_val_00013632.JPEG n03344393/
+mv val/ILSVRC2012_val_00013633.JPEG n03445777/
+mv val/ILSVRC2012_val_00013634.JPEG n02999410/
+mv val/ILSVRC2012_val_00013635.JPEG n02441942/
+mv val/ILSVRC2012_val_00013636.JPEG n04525038/
+mv val/ILSVRC2012_val_00013637.JPEG n02403003/
+mv val/ILSVRC2012_val_00013638.JPEG n07684084/
+mv val/ILSVRC2012_val_00013639.JPEG n03125729/
+mv val/ILSVRC2012_val_00013640.JPEG n02095570/
+mv val/ILSVRC2012_val_00013641.JPEG n01796340/
+mv val/ILSVRC2012_val_00013642.JPEG n03599486/
+mv val/ILSVRC2012_val_00013643.JPEG n07747607/
+mv val/ILSVRC2012_val_00013644.JPEG n04507155/
+mv val/ILSVRC2012_val_00013645.JPEG n07768694/
+mv val/ILSVRC2012_val_00013646.JPEG n04501370/
+mv val/ILSVRC2012_val_00013647.JPEG n07734744/
+mv val/ILSVRC2012_val_00013648.JPEG n02676566/
+mv val/ILSVRC2012_val_00013649.JPEG n01871265/
+mv val/ILSVRC2012_val_00013650.JPEG n03680355/
+mv val/ILSVRC2012_val_00013651.JPEG n02088466/
+mv val/ILSVRC2012_val_00013652.JPEG n10565667/
+mv val/ILSVRC2012_val_00013653.JPEG n02110958/
+mv val/ILSVRC2012_val_00013654.JPEG n02096437/
+mv val/ILSVRC2012_val_00013655.JPEG n01498041/
+mv val/ILSVRC2012_val_00013656.JPEG n02130308/
+mv val/ILSVRC2012_val_00013657.JPEG n07836838/
+mv val/ILSVRC2012_val_00013658.JPEG n03884397/
+mv val/ILSVRC2012_val_00013659.JPEG n04065272/
+mv val/ILSVRC2012_val_00013660.JPEG n02033041/
+mv val/ILSVRC2012_val_00013661.JPEG n02607072/
+mv val/ILSVRC2012_val_00013662.JPEG n13040303/
+mv val/ILSVRC2012_val_00013663.JPEG n02808304/
+mv val/ILSVRC2012_val_00013664.JPEG n03095699/
+mv val/ILSVRC2012_val_00013665.JPEG n03485407/
+mv val/ILSVRC2012_val_00013666.JPEG n02395406/
+mv val/ILSVRC2012_val_00013667.JPEG n04560804/
+mv val/ILSVRC2012_val_00013668.JPEG n02676566/
+mv val/ILSVRC2012_val_00013669.JPEG n04589890/
+mv val/ILSVRC2012_val_00013670.JPEG n02110958/
+mv val/ILSVRC2012_val_00013671.JPEG n02837789/
+mv val/ILSVRC2012_val_00013672.JPEG n01669191/
+mv val/ILSVRC2012_val_00013673.JPEG n02123045/
+mv val/ILSVRC2012_val_00013674.JPEG n07579787/
+mv val/ILSVRC2012_val_00013675.JPEG n01667778/
+mv val/ILSVRC2012_val_00013676.JPEG n12998815/
+mv val/ILSVRC2012_val_00013677.JPEG n04613696/
+mv val/ILSVRC2012_val_00013678.JPEG n02951585/
+mv val/ILSVRC2012_val_00013679.JPEG n03623198/
+mv val/ILSVRC2012_val_00013680.JPEG n03764736/
+mv val/ILSVRC2012_val_00013681.JPEG n02892767/
+mv val/ILSVRC2012_val_00013682.JPEG n02102318/
+mv val/ILSVRC2012_val_00013683.JPEG n04040759/
+mv val/ILSVRC2012_val_00013684.JPEG n02123045/
+mv val/ILSVRC2012_val_00013685.JPEG n03062245/
+mv val/ILSVRC2012_val_00013686.JPEG n02701002/
+mv val/ILSVRC2012_val_00013687.JPEG n03201208/
+mv val/ILSVRC2012_val_00013688.JPEG n04266014/
+mv val/ILSVRC2012_val_00013689.JPEG n01873310/
+mv val/ILSVRC2012_val_00013690.JPEG n04597913/
+mv val/ILSVRC2012_val_00013691.JPEG n03595614/
+mv val/ILSVRC2012_val_00013692.JPEG n07716906/
+mv val/ILSVRC2012_val_00013693.JPEG n02988304/
+mv val/ILSVRC2012_val_00013694.JPEG n03445924/
+mv val/ILSVRC2012_val_00013695.JPEG n02860847/
+mv val/ILSVRC2012_val_00013696.JPEG n02095889/
+mv val/ILSVRC2012_val_00013697.JPEG n02115913/
+mv val/ILSVRC2012_val_00013698.JPEG n01756291/
+mv val/ILSVRC2012_val_00013699.JPEG n02114548/
+mv val/ILSVRC2012_val_00013700.JPEG n02457408/
+mv val/ILSVRC2012_val_00013701.JPEG n03995372/
+mv val/ILSVRC2012_val_00013702.JPEG n01614925/
+mv val/ILSVRC2012_val_00013703.JPEG n02107312/
+mv val/ILSVRC2012_val_00013704.JPEG n03930630/
+mv val/ILSVRC2012_val_00013705.JPEG n03017168/
+mv val/ILSVRC2012_val_00013706.JPEG n03535780/
+mv val/ILSVRC2012_val_00013707.JPEG n01985128/
+mv val/ILSVRC2012_val_00013708.JPEG n02177972/
+mv val/ILSVRC2012_val_00013709.JPEG n03045698/
+mv val/ILSVRC2012_val_00013710.JPEG n13133613/
+mv val/ILSVRC2012_val_00013711.JPEG n04398044/
+mv val/ILSVRC2012_val_00013712.JPEG n02099267/
+mv val/ILSVRC2012_val_00013713.JPEG n01829413/
+mv val/ILSVRC2012_val_00013714.JPEG n02114712/
+mv val/ILSVRC2012_val_00013715.JPEG n02104029/
+mv val/ILSVRC2012_val_00013716.JPEG n01440764/
+mv val/ILSVRC2012_val_00013717.JPEG n04263257/
+mv val/ILSVRC2012_val_00013718.JPEG n04251144/
+mv val/ILSVRC2012_val_00013719.JPEG n03584254/
+mv val/ILSVRC2012_val_00013720.JPEG n03874599/
+mv val/ILSVRC2012_val_00013721.JPEG n06359193/
+mv val/ILSVRC2012_val_00013722.JPEG n04070727/
+mv val/ILSVRC2012_val_00013723.JPEG n04209133/
+mv val/ILSVRC2012_val_00013724.JPEG n04065272/
+mv val/ILSVRC2012_val_00013725.JPEG n01748264/
+mv val/ILSVRC2012_val_00013726.JPEG n02980441/
+mv val/ILSVRC2012_val_00013727.JPEG n02093754/
+mv val/ILSVRC2012_val_00013728.JPEG n02097658/
+mv val/ILSVRC2012_val_00013729.JPEG n03187595/
+mv val/ILSVRC2012_val_00013730.JPEG n01742172/
+mv val/ILSVRC2012_val_00013731.JPEG n04590129/
+mv val/ILSVRC2012_val_00013732.JPEG n03188531/
+mv val/ILSVRC2012_val_00013733.JPEG n02504013/
+mv val/ILSVRC2012_val_00013734.JPEG n02017213/
+mv val/ILSVRC2012_val_00013735.JPEG n02979186/
+mv val/ILSVRC2012_val_00013736.JPEG n02843684/
+mv val/ILSVRC2012_val_00013737.JPEG n04040759/
+mv val/ILSVRC2012_val_00013738.JPEG n01667778/
+mv val/ILSVRC2012_val_00013739.JPEG n01820546/
+mv val/ILSVRC2012_val_00013740.JPEG n02116738/
+mv val/ILSVRC2012_val_00013741.JPEG n04243546/
+mv val/ILSVRC2012_val_00013742.JPEG n04090263/
+mv val/ILSVRC2012_val_00013743.JPEG n03888605/
+mv val/ILSVRC2012_val_00013744.JPEG n01985128/
+mv val/ILSVRC2012_val_00013745.JPEG n02823750/
+mv val/ILSVRC2012_val_00013746.JPEG n04141975/
+mv val/ILSVRC2012_val_00013747.JPEG n03376595/
+mv val/ILSVRC2012_val_00013748.JPEG n02108915/
+mv val/ILSVRC2012_val_00013749.JPEG n03372029/
+mv val/ILSVRC2012_val_00013750.JPEG n02423022/
+mv val/ILSVRC2012_val_00013751.JPEG n01728920/
+mv val/ILSVRC2012_val_00013752.JPEG n02102973/
+mv val/ILSVRC2012_val_00013753.JPEG n01580077/
+mv val/ILSVRC2012_val_00013754.JPEG n02492660/
+mv val/ILSVRC2012_val_00013755.JPEG n07716906/
+mv val/ILSVRC2012_val_00013756.JPEG n02096294/
+mv val/ILSVRC2012_val_00013757.JPEG n03259280/
+mv val/ILSVRC2012_val_00013758.JPEG n03884397/
+mv val/ILSVRC2012_val_00013759.JPEG n02102973/
+mv val/ILSVRC2012_val_00013760.JPEG n03666591/
+mv val/ILSVRC2012_val_00013761.JPEG n02486410/
+mv val/ILSVRC2012_val_00013762.JPEG n02102480/
+mv val/ILSVRC2012_val_00013763.JPEG n02105162/
+mv val/ILSVRC2012_val_00013764.JPEG n09246464/
+mv val/ILSVRC2012_val_00013765.JPEG n02823750/
+mv val/ILSVRC2012_val_00013766.JPEG n04152593/
+mv val/ILSVRC2012_val_00013767.JPEG n03196217/
+mv val/ILSVRC2012_val_00013768.JPEG n01818515/
+mv val/ILSVRC2012_val_00013769.JPEG n04591157/
+mv val/ILSVRC2012_val_00013770.JPEG n04328186/
+mv val/ILSVRC2012_val_00013771.JPEG n01742172/
+mv val/ILSVRC2012_val_00013772.JPEG n01753488/
+mv val/ILSVRC2012_val_00013773.JPEG n02971356/
+mv val/ILSVRC2012_val_00013774.JPEG n09428293/
+mv val/ILSVRC2012_val_00013775.JPEG n02927161/
+mv val/ILSVRC2012_val_00013776.JPEG n03180011/
+mv val/ILSVRC2012_val_00013777.JPEG n04099969/
+mv val/ILSVRC2012_val_00013778.JPEG n02795169/
+mv val/ILSVRC2012_val_00013779.JPEG n02895154/
+mv val/ILSVRC2012_val_00013780.JPEG n03929660/
+mv val/ILSVRC2012_val_00013781.JPEG n01910747/
+mv val/ILSVRC2012_val_00013782.JPEG n03854065/
+mv val/ILSVRC2012_val_00013783.JPEG n02747177/
+mv val/ILSVRC2012_val_00013784.JPEG n03803284/
+mv val/ILSVRC2012_val_00013785.JPEG n02123394/
+mv val/ILSVRC2012_val_00013786.JPEG n04264628/
+mv val/ILSVRC2012_val_00013787.JPEG n04243546/
+mv val/ILSVRC2012_val_00013788.JPEG n02123159/
+mv val/ILSVRC2012_val_00013789.JPEG n01983481/
+mv val/ILSVRC2012_val_00013790.JPEG n02526121/
+mv val/ILSVRC2012_val_00013791.JPEG n12267677/
+mv val/ILSVRC2012_val_00013792.JPEG n06785654/
+mv val/ILSVRC2012_val_00013793.JPEG n04606251/
+mv val/ILSVRC2012_val_00013794.JPEG n01855672/
+mv val/ILSVRC2012_val_00013795.JPEG n02281406/
+mv val/ILSVRC2012_val_00013796.JPEG n04296562/
+mv val/ILSVRC2012_val_00013797.JPEG n01773549/
+mv val/ILSVRC2012_val_00013798.JPEG n02127052/
+mv val/ILSVRC2012_val_00013799.JPEG n02090622/
+mv val/ILSVRC2012_val_00013800.JPEG n02088094/
+mv val/ILSVRC2012_val_00013801.JPEG n04125021/
+mv val/ILSVRC2012_val_00013802.JPEG n01728920/
+mv val/ILSVRC2012_val_00013803.JPEG n03595614/
+mv val/ILSVRC2012_val_00013804.JPEG n02090622/
+mv val/ILSVRC2012_val_00013805.JPEG n04285008/
+mv val/ILSVRC2012_val_00013806.JPEG n03874293/
+mv val/ILSVRC2012_val_00013807.JPEG n02823428/
+mv val/ILSVRC2012_val_00013808.JPEG n02028035/
+mv val/ILSVRC2012_val_00013809.JPEG n02077923/
+mv val/ILSVRC2012_val_00013810.JPEG n02017213/
+mv val/ILSVRC2012_val_00013811.JPEG n03903868/
+mv val/ILSVRC2012_val_00013812.JPEG n02127052/
+mv val/ILSVRC2012_val_00013813.JPEG n04317175/
+mv val/ILSVRC2012_val_00013814.JPEG n02107683/
+mv val/ILSVRC2012_val_00013815.JPEG n01984695/
+mv val/ILSVRC2012_val_00013816.JPEG n03995372/
+mv val/ILSVRC2012_val_00013817.JPEG n02090721/
+mv val/ILSVRC2012_val_00013818.JPEG n02089867/
+mv val/ILSVRC2012_val_00013819.JPEG n10148035/
+mv val/ILSVRC2012_val_00013820.JPEG n01737021/
+mv val/ILSVRC2012_val_00013821.JPEG n01883070/
+mv val/ILSVRC2012_val_00013822.JPEG n01819313/
+mv val/ILSVRC2012_val_00013823.JPEG n03958227/
+mv val/ILSVRC2012_val_00013824.JPEG n03841143/
+mv val/ILSVRC2012_val_00013825.JPEG n03459775/
+mv val/ILSVRC2012_val_00013826.JPEG n03777568/
+mv val/ILSVRC2012_val_00013827.JPEG n03417042/
+mv val/ILSVRC2012_val_00013828.JPEG n02110185/
+mv val/ILSVRC2012_val_00013829.JPEG n03388549/
+mv val/ILSVRC2012_val_00013830.JPEG n03924679/
+mv val/ILSVRC2012_val_00013831.JPEG n02672831/
+mv val/ILSVRC2012_val_00013832.JPEG n02165456/
+mv val/ILSVRC2012_val_00013833.JPEG n03207743/
+mv val/ILSVRC2012_val_00013834.JPEG n04136333/
+mv val/ILSVRC2012_val_00013835.JPEG n02971356/
+mv val/ILSVRC2012_val_00013836.JPEG n04039381/
+mv val/ILSVRC2012_val_00013837.JPEG n04162706/
+mv val/ILSVRC2012_val_00013838.JPEG n02791124/
+mv val/ILSVRC2012_val_00013839.JPEG n03124170/
+mv val/ILSVRC2012_val_00013840.JPEG n01843065/
+mv val/ILSVRC2012_val_00013841.JPEG n04428191/
+mv val/ILSVRC2012_val_00013842.JPEG n03874599/
+mv val/ILSVRC2012_val_00013843.JPEG n02102480/
+mv val/ILSVRC2012_val_00013844.JPEG n04487394/
+mv val/ILSVRC2012_val_00013845.JPEG n01883070/
+mv val/ILSVRC2012_val_00013846.JPEG n02966193/
+mv val/ILSVRC2012_val_00013847.JPEG n01494475/
+mv val/ILSVRC2012_val_00013848.JPEG n02110341/
+mv val/ILSVRC2012_val_00013849.JPEG n07716358/
+mv val/ILSVRC2012_val_00013850.JPEG n07248320/
+mv val/ILSVRC2012_val_00013851.JPEG n02814860/
+mv val/ILSVRC2012_val_00013852.JPEG n04133789/
+mv val/ILSVRC2012_val_00013853.JPEG n02443114/
+mv val/ILSVRC2012_val_00013854.JPEG n02110063/
+mv val/ILSVRC2012_val_00013855.JPEG n04509417/
+mv val/ILSVRC2012_val_00013856.JPEG n02108089/
+mv val/ILSVRC2012_val_00013857.JPEG n04548362/
+mv val/ILSVRC2012_val_00013858.JPEG n01748264/
+mv val/ILSVRC2012_val_00013859.JPEG n03710637/
+mv val/ILSVRC2012_val_00013860.JPEG n02091467/
+mv val/ILSVRC2012_val_00013861.JPEG n02110341/
+mv val/ILSVRC2012_val_00013862.JPEG n02113624/
+mv val/ILSVRC2012_val_00013863.JPEG n01819313/
+mv val/ILSVRC2012_val_00013864.JPEG n02939185/
+mv val/ILSVRC2012_val_00013865.JPEG n03272562/
+mv val/ILSVRC2012_val_00013866.JPEG n02787622/
+mv val/ILSVRC2012_val_00013867.JPEG n12267677/
+mv val/ILSVRC2012_val_00013868.JPEG n04141327/
+mv val/ILSVRC2012_val_00013869.JPEG n02110958/
+mv val/ILSVRC2012_val_00013870.JPEG n01687978/
+mv val/ILSVRC2012_val_00013871.JPEG n04429376/
+mv val/ILSVRC2012_val_00013872.JPEG n01729322/
+mv val/ILSVRC2012_val_00013873.JPEG n02093647/
+mv val/ILSVRC2012_val_00013874.JPEG n07920052/
+mv val/ILSVRC2012_val_00013875.JPEG n01910747/
+mv val/ILSVRC2012_val_00013876.JPEG n02107908/
+mv val/ILSVRC2012_val_00013877.JPEG n03895866/
+mv val/ILSVRC2012_val_00013878.JPEG n02086079/
+mv val/ILSVRC2012_val_00013879.JPEG n02895154/
+mv val/ILSVRC2012_val_00013880.JPEG n13037406/
+mv val/ILSVRC2012_val_00013881.JPEG n03876231/
+mv val/ILSVRC2012_val_00013882.JPEG n04590129/
+mv val/ILSVRC2012_val_00013883.JPEG n01692333/
+mv val/ILSVRC2012_val_00013884.JPEG n03717622/
+mv val/ILSVRC2012_val_00013885.JPEG n02109525/
+mv val/ILSVRC2012_val_00013886.JPEG n04355338/
+mv val/ILSVRC2012_val_00013887.JPEG n03777568/
+mv val/ILSVRC2012_val_00013888.JPEG n03314780/
+mv val/ILSVRC2012_val_00013889.JPEG n03887697/
+mv val/ILSVRC2012_val_00013890.JPEG n04141975/
+mv val/ILSVRC2012_val_00013891.JPEG n01978287/
+mv val/ILSVRC2012_val_00013892.JPEG n04597913/
+mv val/ILSVRC2012_val_00013893.JPEG n04141975/
+mv val/ILSVRC2012_val_00013894.JPEG n02782093/
+mv val/ILSVRC2012_val_00013895.JPEG n03868242/
+mv val/ILSVRC2012_val_00013896.JPEG n02002724/
+mv val/ILSVRC2012_val_00013897.JPEG n03196217/
+mv val/ILSVRC2012_val_00013898.JPEG n04153751/
+mv val/ILSVRC2012_val_00013899.JPEG n01629819/
+mv val/ILSVRC2012_val_00013900.JPEG n02808440/
+mv val/ILSVRC2012_val_00013901.JPEG n02058221/
+mv val/ILSVRC2012_val_00013902.JPEG n01531178/
+mv val/ILSVRC2012_val_00013903.JPEG n02114712/
+mv val/ILSVRC2012_val_00013904.JPEG n03494278/
+mv val/ILSVRC2012_val_00013905.JPEG n04204347/
+mv val/ILSVRC2012_val_00013906.JPEG n03793489/
+mv val/ILSVRC2012_val_00013907.JPEG n03483316/
+mv val/ILSVRC2012_val_00013908.JPEG n04209239/
+mv val/ILSVRC2012_val_00013909.JPEG n03776460/
+mv val/ILSVRC2012_val_00013910.JPEG n04336792/
+mv val/ILSVRC2012_val_00013911.JPEG n02114548/
+mv val/ILSVRC2012_val_00013912.JPEG n02667093/
+mv val/ILSVRC2012_val_00013913.JPEG n02834397/
+mv val/ILSVRC2012_val_00013914.JPEG n04456115/
+mv val/ILSVRC2012_val_00013915.JPEG n03394916/
+mv val/ILSVRC2012_val_00013916.JPEG n04346328/
+mv val/ILSVRC2012_val_00013917.JPEG n01776313/
+mv val/ILSVRC2012_val_00013918.JPEG n02124075/
+mv val/ILSVRC2012_val_00013919.JPEG n02356798/
+mv val/ILSVRC2012_val_00013920.JPEG n03895866/
+mv val/ILSVRC2012_val_00013921.JPEG n02963159/
+mv val/ILSVRC2012_val_00013922.JPEG n01883070/
+mv val/ILSVRC2012_val_00013923.JPEG n03355925/
+mv val/ILSVRC2012_val_00013924.JPEG n02226429/
+mv val/ILSVRC2012_val_00013925.JPEG n03417042/
+mv val/ILSVRC2012_val_00013926.JPEG n02106550/
+mv val/ILSVRC2012_val_00013927.JPEG n02101388/
+mv val/ILSVRC2012_val_00013928.JPEG n04200800/
+mv val/ILSVRC2012_val_00013929.JPEG n02011460/
+mv val/ILSVRC2012_val_00013930.JPEG n02112706/
+mv val/ILSVRC2012_val_00013931.JPEG n04326547/
+mv val/ILSVRC2012_val_00013932.JPEG n01985128/
+mv val/ILSVRC2012_val_00013933.JPEG n03110669/
+mv val/ILSVRC2012_val_00013934.JPEG n03804744/
+mv val/ILSVRC2012_val_00013935.JPEG n04141327/
+mv val/ILSVRC2012_val_00013936.JPEG n11939491/
+mv val/ILSVRC2012_val_00013937.JPEG n02105251/
+mv val/ILSVRC2012_val_00013938.JPEG n03201208/
+mv val/ILSVRC2012_val_00013939.JPEG n07754684/
+mv val/ILSVRC2012_val_00013940.JPEG n01632777/
+mv val/ILSVRC2012_val_00013941.JPEG n04553703/
+mv val/ILSVRC2012_val_00013942.JPEG n04149813/
+mv val/ILSVRC2012_val_00013943.JPEG n02481823/
+mv val/ILSVRC2012_val_00013944.JPEG n03947888/
+mv val/ILSVRC2012_val_00013945.JPEG n01534433/
+mv val/ILSVRC2012_val_00013946.JPEG n03457902/
+mv val/ILSVRC2012_val_00013947.JPEG n02776631/
+mv val/ILSVRC2012_val_00013948.JPEG n04209239/
+mv val/ILSVRC2012_val_00013949.JPEG n04523525/
+mv val/ILSVRC2012_val_00013950.JPEG n04074963/
+mv val/ILSVRC2012_val_00013951.JPEG n02233338/
+mv val/ILSVRC2012_val_00013952.JPEG n03930313/
+mv val/ILSVRC2012_val_00013953.JPEG n03249569/
+mv val/ILSVRC2012_val_00013954.JPEG n03884397/
+mv val/ILSVRC2012_val_00013955.JPEG n01601694/
+mv val/ILSVRC2012_val_00013956.JPEG n04560804/
+mv val/ILSVRC2012_val_00013957.JPEG n02514041/
+mv val/ILSVRC2012_val_00013958.JPEG n03417042/
+mv val/ILSVRC2012_val_00013959.JPEG n07880968/
+mv val/ILSVRC2012_val_00013960.JPEG n03594734/
+mv val/ILSVRC2012_val_00013961.JPEG n03344393/
+mv val/ILSVRC2012_val_00013962.JPEG n02088632/
+mv val/ILSVRC2012_val_00013963.JPEG n02106662/
+mv val/ILSVRC2012_val_00013964.JPEG n02108551/
+mv val/ILSVRC2012_val_00013965.JPEG n01744401/
+mv val/ILSVRC2012_val_00013966.JPEG n02483708/
+mv val/ILSVRC2012_val_00013967.JPEG n02971356/
+mv val/ILSVRC2012_val_00013968.JPEG n02909870/
+mv val/ILSVRC2012_val_00013969.JPEG n02841315/
+mv val/ILSVRC2012_val_00013970.JPEG n03496892/
+mv val/ILSVRC2012_val_00013971.JPEG n02100583/
+mv val/ILSVRC2012_val_00013972.JPEG n03476684/
+mv val/ILSVRC2012_val_00013973.JPEG n07718472/
+mv val/ILSVRC2012_val_00013974.JPEG n01641577/
+mv val/ILSVRC2012_val_00013975.JPEG n06596364/
+mv val/ILSVRC2012_val_00013976.JPEG n03954731/
+mv val/ILSVRC2012_val_00013977.JPEG n04357314/
+mv val/ILSVRC2012_val_00013978.JPEG n04259630/
+mv val/ILSVRC2012_val_00013979.JPEG n07695742/
+mv val/ILSVRC2012_val_00013980.JPEG n04423845/
+mv val/ILSVRC2012_val_00013981.JPEG n03249569/
+mv val/ILSVRC2012_val_00013982.JPEG n04111531/
+mv val/ILSVRC2012_val_00013983.JPEG n02895154/
+mv val/ILSVRC2012_val_00013984.JPEG n04149813/
+mv val/ILSVRC2012_val_00013985.JPEG n02114712/
+mv val/ILSVRC2012_val_00013986.JPEG n04252225/
+mv val/ILSVRC2012_val_00013987.JPEG n03770679/
+mv val/ILSVRC2012_val_00013988.JPEG n02837789/
+mv val/ILSVRC2012_val_00013989.JPEG n04428191/
+mv val/ILSVRC2012_val_00013990.JPEG n02361337/
+mv val/ILSVRC2012_val_00013991.JPEG n02100236/
+mv val/ILSVRC2012_val_00013992.JPEG n01728920/
+mv val/ILSVRC2012_val_00013993.JPEG n03594945/
+mv val/ILSVRC2012_val_00013994.JPEG n02268443/
+mv val/ILSVRC2012_val_00013995.JPEG n07875152/
+mv val/ILSVRC2012_val_00013996.JPEG n07695742/
+mv val/ILSVRC2012_val_00013997.JPEG n02108551/
+mv val/ILSVRC2012_val_00013998.JPEG n01531178/
+mv val/ILSVRC2012_val_00013999.JPEG n01980166/
+mv val/ILSVRC2012_val_00014000.JPEG n02106382/
+mv val/ILSVRC2012_val_00014001.JPEG n03658185/
+mv val/ILSVRC2012_val_00014002.JPEG n02988304/
+mv val/ILSVRC2012_val_00014003.JPEG n04141076/
+mv val/ILSVRC2012_val_00014004.JPEG n02906734/
+mv val/ILSVRC2012_val_00014005.JPEG n02012849/
+mv val/ILSVRC2012_val_00014006.JPEG n02786058/
+mv val/ILSVRC2012_val_00014007.JPEG n01614925/
+mv val/ILSVRC2012_val_00014008.JPEG n02206856/
+mv val/ILSVRC2012_val_00014009.JPEG n01631663/
+mv val/ILSVRC2012_val_00014010.JPEG n03100240/
+mv val/ILSVRC2012_val_00014011.JPEG n03047690/
+mv val/ILSVRC2012_val_00014012.JPEG n03180011/
+mv val/ILSVRC2012_val_00014013.JPEG n02895154/
+mv val/ILSVRC2012_val_00014014.JPEG n02782093/
+mv val/ILSVRC2012_val_00014015.JPEG n03595614/
+mv val/ILSVRC2012_val_00014016.JPEG n09332890/
+mv val/ILSVRC2012_val_00014017.JPEG n07749582/
+mv val/ILSVRC2012_val_00014018.JPEG n04258138/
+mv val/ILSVRC2012_val_00014019.JPEG n03095699/
+mv val/ILSVRC2012_val_00014020.JPEG n02096177/
+mv val/ILSVRC2012_val_00014021.JPEG n01728920/
+mv val/ILSVRC2012_val_00014022.JPEG n03538406/
+mv val/ILSVRC2012_val_00014023.JPEG n01806143/
+mv val/ILSVRC2012_val_00014024.JPEG n02088238/
+mv val/ILSVRC2012_val_00014025.JPEG n04501370/
+mv val/ILSVRC2012_val_00014026.JPEG n09229709/
+mv val/ILSVRC2012_val_00014027.JPEG n04423845/
+mv val/ILSVRC2012_val_00014028.JPEG n02397096/
+mv val/ILSVRC2012_val_00014029.JPEG n02133161/
+mv val/ILSVRC2012_val_00014030.JPEG n02088238/
+mv val/ILSVRC2012_val_00014031.JPEG n02264363/
+mv val/ILSVRC2012_val_00014032.JPEG n02101006/
+mv val/ILSVRC2012_val_00014033.JPEG n04515003/
+mv val/ILSVRC2012_val_00014034.JPEG n02870880/
+mv val/ILSVRC2012_val_00014035.JPEG n04548280/
+mv val/ILSVRC2012_val_00014036.JPEG n04461696/
+mv val/ILSVRC2012_val_00014037.JPEG n03028079/
+mv val/ILSVRC2012_val_00014038.JPEG n02268853/
+mv val/ILSVRC2012_val_00014039.JPEG n03874599/
+mv val/ILSVRC2012_val_00014040.JPEG n01877812/
+mv val/ILSVRC2012_val_00014041.JPEG n02699494/
+mv val/ILSVRC2012_val_00014042.JPEG n12985857/
+mv val/ILSVRC2012_val_00014043.JPEG n02454379/
+mv val/ILSVRC2012_val_00014044.JPEG n04326547/
+mv val/ILSVRC2012_val_00014045.JPEG n02089867/
+mv val/ILSVRC2012_val_00014046.JPEG n01560419/
+mv val/ILSVRC2012_val_00014047.JPEG n02093256/
+mv val/ILSVRC2012_val_00014048.JPEG n04204347/
+mv val/ILSVRC2012_val_00014049.JPEG n04347754/
+mv val/ILSVRC2012_val_00014050.JPEG n02086240/
+mv val/ILSVRC2012_val_00014051.JPEG n04286575/
+mv val/ILSVRC2012_val_00014052.JPEG n04482393/
+mv val/ILSVRC2012_val_00014053.JPEG n03840681/
+mv val/ILSVRC2012_val_00014054.JPEG n04065272/
+mv val/ILSVRC2012_val_00014055.JPEG n02480855/
+mv val/ILSVRC2012_val_00014056.JPEG n02749479/
+mv val/ILSVRC2012_val_00014057.JPEG n03492542/
+mv val/ILSVRC2012_val_00014058.JPEG n02096437/
+mv val/ILSVRC2012_val_00014059.JPEG n02317335/
+mv val/ILSVRC2012_val_00014060.JPEG n02174001/
+mv val/ILSVRC2012_val_00014061.JPEG n04525305/
+mv val/ILSVRC2012_val_00014062.JPEG n04039381/
+mv val/ILSVRC2012_val_00014063.JPEG n07753592/
+mv val/ILSVRC2012_val_00014064.JPEG n13037406/
+mv val/ILSVRC2012_val_00014065.JPEG n02494079/
+mv val/ILSVRC2012_val_00014066.JPEG n04258138/
+mv val/ILSVRC2012_val_00014067.JPEG n02229544/
+mv val/ILSVRC2012_val_00014068.JPEG n01843383/
+mv val/ILSVRC2012_val_00014069.JPEG n01728920/
+mv val/ILSVRC2012_val_00014070.JPEG n04330267/
+mv val/ILSVRC2012_val_00014071.JPEG n02325366/
+mv val/ILSVRC2012_val_00014072.JPEG n02808304/
+mv val/ILSVRC2012_val_00014073.JPEG n04462240/
+mv val/ILSVRC2012_val_00014074.JPEG n03874293/
+mv val/ILSVRC2012_val_00014075.JPEG n03482405/
+mv val/ILSVRC2012_val_00014076.JPEG n01629819/
+mv val/ILSVRC2012_val_00014077.JPEG n03781244/
+mv val/ILSVRC2012_val_00014078.JPEG n04392985/
+mv val/ILSVRC2012_val_00014079.JPEG n04258138/
+mv val/ILSVRC2012_val_00014080.JPEG n03160309/
+mv val/ILSVRC2012_val_00014081.JPEG n02096585/
+mv val/ILSVRC2012_val_00014082.JPEG n01614925/
+mv val/ILSVRC2012_val_00014083.JPEG n02017213/
+mv val/ILSVRC2012_val_00014084.JPEG n04133789/
+mv val/ILSVRC2012_val_00014085.JPEG n04277352/
+mv val/ILSVRC2012_val_00014086.JPEG n02106030/
+mv val/ILSVRC2012_val_00014087.JPEG n04428191/
+mv val/ILSVRC2012_val_00014088.JPEG n03400231/
+mv val/ILSVRC2012_val_00014089.JPEG n03249569/
+mv val/ILSVRC2012_val_00014090.JPEG n01514668/
+mv val/ILSVRC2012_val_00014091.JPEG n10148035/
+mv val/ILSVRC2012_val_00014092.JPEG n02397096/
+mv val/ILSVRC2012_val_00014093.JPEG n07697313/
+mv val/ILSVRC2012_val_00014094.JPEG n07802026/
+mv val/ILSVRC2012_val_00014095.JPEG n03887697/
+mv val/ILSVRC2012_val_00014096.JPEG n07248320/
+mv val/ILSVRC2012_val_00014097.JPEG n01855032/
+mv val/ILSVRC2012_val_00014098.JPEG n03908618/
+mv val/ILSVRC2012_val_00014099.JPEG n02086910/
+mv val/ILSVRC2012_val_00014100.JPEG n04254680/
+mv val/ILSVRC2012_val_00014101.JPEG n02104365/
+mv val/ILSVRC2012_val_00014102.JPEG n03445777/
+mv val/ILSVRC2012_val_00014103.JPEG n02011460/
+mv val/ILSVRC2012_val_00014104.JPEG n07695742/
+mv val/ILSVRC2012_val_00014105.JPEG n04344873/
+mv val/ILSVRC2012_val_00014106.JPEG n01667778/
+mv val/ILSVRC2012_val_00014107.JPEG n02091244/
+mv val/ILSVRC2012_val_00014108.JPEG n01534433/
+mv val/ILSVRC2012_val_00014109.JPEG n02097474/
+mv val/ILSVRC2012_val_00014110.JPEG n02701002/
+mv val/ILSVRC2012_val_00014111.JPEG n03208938/
+mv val/ILSVRC2012_val_00014112.JPEG n03676483/
+mv val/ILSVRC2012_val_00014113.JPEG n03770439/
+mv val/ILSVRC2012_val_00014114.JPEG n01755581/
+mv val/ILSVRC2012_val_00014115.JPEG n02108915/
+mv val/ILSVRC2012_val_00014116.JPEG n01753488/
+mv val/ILSVRC2012_val_00014117.JPEG n02102480/
+mv val/ILSVRC2012_val_00014118.JPEG n03633091/
+mv val/ILSVRC2012_val_00014119.JPEG n03662601/
+mv val/ILSVRC2012_val_00014120.JPEG n01770393/
+mv val/ILSVRC2012_val_00014121.JPEG n07590611/
+mv val/ILSVRC2012_val_00014122.JPEG n04264628/
+mv val/ILSVRC2012_val_00014123.JPEG n03998194/
+mv val/ILSVRC2012_val_00014124.JPEG n02396427/
+mv val/ILSVRC2012_val_00014125.JPEG n02102040/
+mv val/ILSVRC2012_val_00014126.JPEG n01770393/
+mv val/ILSVRC2012_val_00014127.JPEG n04162706/
+mv val/ILSVRC2012_val_00014128.JPEG n02281406/
+mv val/ILSVRC2012_val_00014129.JPEG n12768682/
+mv val/ILSVRC2012_val_00014130.JPEG n01945685/
+mv val/ILSVRC2012_val_00014131.JPEG n03483316/
+mv val/ILSVRC2012_val_00014132.JPEG n01978287/
+mv val/ILSVRC2012_val_00014133.JPEG n02119022/
+mv val/ILSVRC2012_val_00014134.JPEG n02169497/
+mv val/ILSVRC2012_val_00014135.JPEG n03991062/
+mv val/ILSVRC2012_val_00014136.JPEG n04465501/
+mv val/ILSVRC2012_val_00014137.JPEG n07614500/
+mv val/ILSVRC2012_val_00014138.JPEG n01990800/
+mv val/ILSVRC2012_val_00014139.JPEG n01534433/
+mv val/ILSVRC2012_val_00014140.JPEG n03770679/
+mv val/ILSVRC2012_val_00014141.JPEG n09288635/
+mv val/ILSVRC2012_val_00014142.JPEG n03188531/
+mv val/ILSVRC2012_val_00014143.JPEG n09256479/
+mv val/ILSVRC2012_val_00014144.JPEG n04259630/
+mv val/ILSVRC2012_val_00014145.JPEG n02110627/
+mv val/ILSVRC2012_val_00014146.JPEG n04560804/
+mv val/ILSVRC2012_val_00014147.JPEG n02113978/
+mv val/ILSVRC2012_val_00014148.JPEG n02095889/
+mv val/ILSVRC2012_val_00014149.JPEG n04599235/
+mv val/ILSVRC2012_val_00014150.JPEG n03259280/
+mv val/ILSVRC2012_val_00014151.JPEG n02111277/
+mv val/ILSVRC2012_val_00014152.JPEG n02794156/
+mv val/ILSVRC2012_val_00014153.JPEG n04328186/
+mv val/ILSVRC2012_val_00014154.JPEG n04254680/
+mv val/ILSVRC2012_val_00014155.JPEG n03661043/
+mv val/ILSVRC2012_val_00014156.JPEG n03599486/
+mv val/ILSVRC2012_val_00014157.JPEG n02097130/
+mv val/ILSVRC2012_val_00014158.JPEG n02033041/
+mv val/ILSVRC2012_val_00014159.JPEG n02071294/
+mv val/ILSVRC2012_val_00014160.JPEG n03937543/
+mv val/ILSVRC2012_val_00014161.JPEG n09288635/
+mv val/ILSVRC2012_val_00014162.JPEG n03709823/
+mv val/ILSVRC2012_val_00014163.JPEG n02489166/
+mv val/ILSVRC2012_val_00014164.JPEG n03673027/
+mv val/ILSVRC2012_val_00014165.JPEG n01828970/
+mv val/ILSVRC2012_val_00014166.JPEG n04532106/
+mv val/ILSVRC2012_val_00014167.JPEG n03496892/
+mv val/ILSVRC2012_val_00014168.JPEG n01924916/
+mv val/ILSVRC2012_val_00014169.JPEG n04548280/
+mv val/ILSVRC2012_val_00014170.JPEG n02319095/
+mv val/ILSVRC2012_val_00014171.JPEG n02395406/
+mv val/ILSVRC2012_val_00014172.JPEG n02782093/
+mv val/ILSVRC2012_val_00014173.JPEG n04554684/
+mv val/ILSVRC2012_val_00014174.JPEG n02086240/
+mv val/ILSVRC2012_val_00014175.JPEG n03916031/
+mv val/ILSVRC2012_val_00014176.JPEG n02791270/
+mv val/ILSVRC2012_val_00014177.JPEG n07717410/
+mv val/ILSVRC2012_val_00014178.JPEG n04238763/
+mv val/ILSVRC2012_val_00014179.JPEG n02730930/
+mv val/ILSVRC2012_val_00014180.JPEG n01514859/
+mv val/ILSVRC2012_val_00014181.JPEG n01748264/
+mv val/ILSVRC2012_val_00014182.JPEG n02988304/
+mv val/ILSVRC2012_val_00014183.JPEG n03461385/
+mv val/ILSVRC2012_val_00014184.JPEG n03272562/
+mv val/ILSVRC2012_val_00014185.JPEG n04330267/
+mv val/ILSVRC2012_val_00014186.JPEG n07860988/
+mv val/ILSVRC2012_val_00014187.JPEG n02276258/
+mv val/ILSVRC2012_val_00014188.JPEG n07871810/
+mv val/ILSVRC2012_val_00014189.JPEG n02097474/
+mv val/ILSVRC2012_val_00014190.JPEG n02999410/
+mv val/ILSVRC2012_val_00014191.JPEG n04037443/
+mv val/ILSVRC2012_val_00014192.JPEG n01614925/
+mv val/ILSVRC2012_val_00014193.JPEG n04033901/
+mv val/ILSVRC2012_val_00014194.JPEG n03944341/
+mv val/ILSVRC2012_val_00014195.JPEG n02655020/
+mv val/ILSVRC2012_val_00014196.JPEG n01608432/
+mv val/ILSVRC2012_val_00014197.JPEG n03874599/
+mv val/ILSVRC2012_val_00014198.JPEG n03594945/
+mv val/ILSVRC2012_val_00014199.JPEG n04252225/
+mv val/ILSVRC2012_val_00014200.JPEG n07892512/
+mv val/ILSVRC2012_val_00014201.JPEG n03717622/
+mv val/ILSVRC2012_val_00014202.JPEG n03763968/
+mv val/ILSVRC2012_val_00014203.JPEG n02110627/
+mv val/ILSVRC2012_val_00014204.JPEG n02795169/
+mv val/ILSVRC2012_val_00014205.JPEG n03000134/
+mv val/ILSVRC2012_val_00014206.JPEG n02494079/
+mv val/ILSVRC2012_val_00014207.JPEG n03042490/
+mv val/ILSVRC2012_val_00014208.JPEG n03100240/
+mv val/ILSVRC2012_val_00014209.JPEG n07875152/
+mv val/ILSVRC2012_val_00014210.JPEG n02802426/
+mv val/ILSVRC2012_val_00014211.JPEG n02484975/
+mv val/ILSVRC2012_val_00014212.JPEG n09229709/
+mv val/ILSVRC2012_val_00014213.JPEG n02747177/
+mv val/ILSVRC2012_val_00014214.JPEG n06596364/
+mv val/ILSVRC2012_val_00014215.JPEG n04557648/
+mv val/ILSVRC2012_val_00014216.JPEG n02123394/
+mv val/ILSVRC2012_val_00014217.JPEG n02002724/
+mv val/ILSVRC2012_val_00014218.JPEG n02167151/
+mv val/ILSVRC2012_val_00014219.JPEG n02504013/
+mv val/ILSVRC2012_val_00014220.JPEG n01616318/
+mv val/ILSVRC2012_val_00014221.JPEG n03770439/
+mv val/ILSVRC2012_val_00014222.JPEG n04428191/
+mv val/ILSVRC2012_val_00014223.JPEG n02051845/
+mv val/ILSVRC2012_val_00014224.JPEG n04579145/
+mv val/ILSVRC2012_val_00014225.JPEG n02093754/
+mv val/ILSVRC2012_val_00014226.JPEG n12267677/
+mv val/ILSVRC2012_val_00014227.JPEG n01641577/
+mv val/ILSVRC2012_val_00014228.JPEG n02963159/
+mv val/ILSVRC2012_val_00014229.JPEG n02807133/
+mv val/ILSVRC2012_val_00014230.JPEG n04590129/
+mv val/ILSVRC2012_val_00014231.JPEG n03467068/
+mv val/ILSVRC2012_val_00014232.JPEG n01629819/
+mv val/ILSVRC2012_val_00014233.JPEG n02443484/
+mv val/ILSVRC2012_val_00014234.JPEG n02088238/
+mv val/ILSVRC2012_val_00014235.JPEG n02412080/
+mv val/ILSVRC2012_val_00014236.JPEG n03532672/
+mv val/ILSVRC2012_val_00014237.JPEG n04591157/
+mv val/ILSVRC2012_val_00014238.JPEG n04486054/
+mv val/ILSVRC2012_val_00014239.JPEG n02692877/
+mv val/ILSVRC2012_val_00014240.JPEG n02727426/
+mv val/ILSVRC2012_val_00014241.JPEG n04371774/
+mv val/ILSVRC2012_val_00014242.JPEG n04273569/
+mv val/ILSVRC2012_val_00014243.JPEG n03733131/
+mv val/ILSVRC2012_val_00014244.JPEG n03544143/
+mv val/ILSVRC2012_val_00014245.JPEG n02104365/
+mv val/ILSVRC2012_val_00014246.JPEG n02109961/
+mv val/ILSVRC2012_val_00014247.JPEG n03447447/
+mv val/ILSVRC2012_val_00014248.JPEG n01872401/
+mv val/ILSVRC2012_val_00014249.JPEG n03961711/
+mv val/ILSVRC2012_val_00014250.JPEG n02116738/
+mv val/ILSVRC2012_val_00014251.JPEG n01688243/
+mv val/ILSVRC2012_val_00014252.JPEG n01749939/
+mv val/ILSVRC2012_val_00014253.JPEG n03141823/
+mv val/ILSVRC2012_val_00014254.JPEG n02509815/
+mv val/ILSVRC2012_val_00014255.JPEG n12985857/
+mv val/ILSVRC2012_val_00014256.JPEG n01829413/
+mv val/ILSVRC2012_val_00014257.JPEG n02109047/
+mv val/ILSVRC2012_val_00014258.JPEG n02526121/
+mv val/ILSVRC2012_val_00014259.JPEG n02097658/
+mv val/ILSVRC2012_val_00014260.JPEG n03216828/
+mv val/ILSVRC2012_val_00014261.JPEG n02870880/
+mv val/ILSVRC2012_val_00014262.JPEG n04266014/
+mv val/ILSVRC2012_val_00014263.JPEG n04355338/
+mv val/ILSVRC2012_val_00014264.JPEG n03633091/
+mv val/ILSVRC2012_val_00014265.JPEG n01910747/
+mv val/ILSVRC2012_val_00014266.JPEG n02006656/
+mv val/ILSVRC2012_val_00014267.JPEG n03445924/
+mv val/ILSVRC2012_val_00014268.JPEG n02906734/
+mv val/ILSVRC2012_val_00014269.JPEG n04099969/
+mv val/ILSVRC2012_val_00014270.JPEG n02099712/
+mv val/ILSVRC2012_val_00014271.JPEG n02229544/
+mv val/ILSVRC2012_val_00014272.JPEG n04443257/
+mv val/ILSVRC2012_val_00014273.JPEG n02687172/
+mv val/ILSVRC2012_val_00014274.JPEG n04273569/
+mv val/ILSVRC2012_val_00014275.JPEG n02489166/
+mv val/ILSVRC2012_val_00014276.JPEG n03924679/
+mv val/ILSVRC2012_val_00014277.JPEG n12985857/
+mv val/ILSVRC2012_val_00014278.JPEG n02167151/
+mv val/ILSVRC2012_val_00014279.JPEG n02321529/
+mv val/ILSVRC2012_val_00014280.JPEG n02102040/
+mv val/ILSVRC2012_val_00014281.JPEG n02870880/
+mv val/ILSVRC2012_val_00014282.JPEG n01693334/
+mv val/ILSVRC2012_val_00014283.JPEG n02097298/
+mv val/ILSVRC2012_val_00014284.JPEG n01882714/
+mv val/ILSVRC2012_val_00014285.JPEG n04040759/
+mv val/ILSVRC2012_val_00014286.JPEG n03791053/
+mv val/ILSVRC2012_val_00014287.JPEG n02979186/
+mv val/ILSVRC2012_val_00014288.JPEG n02454379/
+mv val/ILSVRC2012_val_00014289.JPEG n03131574/
+mv val/ILSVRC2012_val_00014290.JPEG n04141327/
+mv val/ILSVRC2012_val_00014291.JPEG n02981792/
+mv val/ILSVRC2012_val_00014292.JPEG n02974003/
+mv val/ILSVRC2012_val_00014293.JPEG n02090721/
+mv val/ILSVRC2012_val_00014294.JPEG n04131690/
+mv val/ILSVRC2012_val_00014295.JPEG n02106030/
+mv val/ILSVRC2012_val_00014296.JPEG n02493793/
+mv val/ILSVRC2012_val_00014297.JPEG n02963159/
+mv val/ILSVRC2012_val_00014298.JPEG n04596742/
+mv val/ILSVRC2012_val_00014299.JPEG n11879895/
+mv val/ILSVRC2012_val_00014300.JPEG n03457902/
+mv val/ILSVRC2012_val_00014301.JPEG n02823750/
+mv val/ILSVRC2012_val_00014302.JPEG n01774750/
+mv val/ILSVRC2012_val_00014303.JPEG n03788365/
+mv val/ILSVRC2012_val_00014304.JPEG n02389026/
+mv val/ILSVRC2012_val_00014305.JPEG n02823750/
+mv val/ILSVRC2012_val_00014306.JPEG n02493509/
+mv val/ILSVRC2012_val_00014307.JPEG n07583066/
+mv val/ILSVRC2012_val_00014308.JPEG n01682714/
+mv val/ILSVRC2012_val_00014309.JPEG n03899768/
+mv val/ILSVRC2012_val_00014310.JPEG n02279972/
+mv val/ILSVRC2012_val_00014311.JPEG n07747607/
+mv val/ILSVRC2012_val_00014312.JPEG n01692333/
+mv val/ILSVRC2012_val_00014313.JPEG n04243546/
+mv val/ILSVRC2012_val_00014314.JPEG n04317175/
+mv val/ILSVRC2012_val_00014315.JPEG n02106550/
+mv val/ILSVRC2012_val_00014316.JPEG n01664065/
+mv val/ILSVRC2012_val_00014317.JPEG n01677366/
+mv val/ILSVRC2012_val_00014318.JPEG n02093754/
+mv val/ILSVRC2012_val_00014319.JPEG n04346328/
+mv val/ILSVRC2012_val_00014320.JPEG n02106550/
+mv val/ILSVRC2012_val_00014321.JPEG n02127052/
+mv val/ILSVRC2012_val_00014322.JPEG n03666591/
+mv val/ILSVRC2012_val_00014323.JPEG n03877845/
+mv val/ILSVRC2012_val_00014324.JPEG n03125729/
+mv val/ILSVRC2012_val_00014325.JPEG n03786901/
+mv val/ILSVRC2012_val_00014326.JPEG n03775071/
+mv val/ILSVRC2012_val_00014327.JPEG n02412080/
+mv val/ILSVRC2012_val_00014328.JPEG n01518878/
+mv val/ILSVRC2012_val_00014329.JPEG n03720891/
+mv val/ILSVRC2012_val_00014330.JPEG n01735189/
+mv val/ILSVRC2012_val_00014331.JPEG n02356798/
+mv val/ILSVRC2012_val_00014332.JPEG n02110806/
+mv val/ILSVRC2012_val_00014333.JPEG n03047690/
+mv val/ILSVRC2012_val_00014334.JPEG n04462240/
+mv val/ILSVRC2012_val_00014335.JPEG n02951585/
+mv val/ILSVRC2012_val_00014336.JPEG n01558993/
+mv val/ILSVRC2012_val_00014337.JPEG n03065424/
+mv val/ILSVRC2012_val_00014338.JPEG n02860847/
+mv val/ILSVRC2012_val_00014339.JPEG n02486410/
+mv val/ILSVRC2012_val_00014340.JPEG n02398521/
+mv val/ILSVRC2012_val_00014341.JPEG n04346328/
+mv val/ILSVRC2012_val_00014342.JPEG n02106030/
+mv val/ILSVRC2012_val_00014343.JPEG n02445715/
+mv val/ILSVRC2012_val_00014344.JPEG n04153751/
+mv val/ILSVRC2012_val_00014345.JPEG n02509815/
+mv val/ILSVRC2012_val_00014346.JPEG n01828970/
+mv val/ILSVRC2012_val_00014347.JPEG n04069434/
+mv val/ILSVRC2012_val_00014348.JPEG n07714571/
+mv val/ILSVRC2012_val_00014349.JPEG n13044778/
+mv val/ILSVRC2012_val_00014350.JPEG n01955084/
+mv val/ILSVRC2012_val_00014351.JPEG n03662601/
+mv val/ILSVRC2012_val_00014352.JPEG n01664065/
+mv val/ILSVRC2012_val_00014353.JPEG n02708093/
+mv val/ILSVRC2012_val_00014354.JPEG n02408429/
+mv val/ILSVRC2012_val_00014355.JPEG n03920288/
+mv val/ILSVRC2012_val_00014356.JPEG n02190166/
+mv val/ILSVRC2012_val_00014357.JPEG n02091635/
+mv val/ILSVRC2012_val_00014358.JPEG n04229816/
+mv val/ILSVRC2012_val_00014359.JPEG n01773549/
+mv val/ILSVRC2012_val_00014360.JPEG n02106662/
+mv val/ILSVRC2012_val_00014361.JPEG n02009912/
+mv val/ILSVRC2012_val_00014362.JPEG n01558993/
+mv val/ILSVRC2012_val_00014363.JPEG n02127052/
+mv val/ILSVRC2012_val_00014364.JPEG n02843684/
+mv val/ILSVRC2012_val_00014365.JPEG n02174001/
+mv val/ILSVRC2012_val_00014366.JPEG n03345487/
+mv val/ILSVRC2012_val_00014367.JPEG n01990800/
+mv val/ILSVRC2012_val_00014368.JPEG n03584254/
+mv val/ILSVRC2012_val_00014369.JPEG n02389026/
+mv val/ILSVRC2012_val_00014370.JPEG n02389026/
+mv val/ILSVRC2012_val_00014371.JPEG n04069434/
+mv val/ILSVRC2012_val_00014372.JPEG n03032252/
+mv val/ILSVRC2012_val_00014373.JPEG n07749582/
+mv val/ILSVRC2012_val_00014374.JPEG n02110627/
+mv val/ILSVRC2012_val_00014375.JPEG n02807133/
+mv val/ILSVRC2012_val_00014376.JPEG n02012849/
+mv val/ILSVRC2012_val_00014377.JPEG n03208938/
+mv val/ILSVRC2012_val_00014378.JPEG n02107142/
+mv val/ILSVRC2012_val_00014379.JPEG n03995372/
+mv val/ILSVRC2012_val_00014380.JPEG n02927161/
+mv val/ILSVRC2012_val_00014381.JPEG n03888257/
+mv val/ILSVRC2012_val_00014382.JPEG n02802426/
+mv val/ILSVRC2012_val_00014383.JPEG n09193705/
+mv val/ILSVRC2012_val_00014384.JPEG n07716906/
+mv val/ILSVRC2012_val_00014385.JPEG n03345487/
+mv val/ILSVRC2012_val_00014386.JPEG n02088094/
+mv val/ILSVRC2012_val_00014387.JPEG n03297495/
+mv val/ILSVRC2012_val_00014388.JPEG n02871525/
+mv val/ILSVRC2012_val_00014389.JPEG n02363005/
+mv val/ILSVRC2012_val_00014390.JPEG n02206856/
+mv val/ILSVRC2012_val_00014391.JPEG n02445715/
+mv val/ILSVRC2012_val_00014392.JPEG n02783161/
+mv val/ILSVRC2012_val_00014393.JPEG n02948072/
+mv val/ILSVRC2012_val_00014394.JPEG n09421951/
+mv val/ILSVRC2012_val_00014395.JPEG n02410509/
+mv val/ILSVRC2012_val_00014396.JPEG n02808304/
+mv val/ILSVRC2012_val_00014397.JPEG n03903868/
+mv val/ILSVRC2012_val_00014398.JPEG n02110063/
+mv val/ILSVRC2012_val_00014399.JPEG n03724870/
+mv val/ILSVRC2012_val_00014400.JPEG n07836838/
+mv val/ILSVRC2012_val_00014401.JPEG n04141975/
+mv val/ILSVRC2012_val_00014402.JPEG n02487347/
+mv val/ILSVRC2012_val_00014403.JPEG n02112137/
+mv val/ILSVRC2012_val_00014404.JPEG n02804610/
+mv val/ILSVRC2012_val_00014405.JPEG n07734744/
+mv val/ILSVRC2012_val_00014406.JPEG n04462240/
+mv val/ILSVRC2012_val_00014407.JPEG n03372029/
+mv val/ILSVRC2012_val_00014408.JPEG n02177972/
+mv val/ILSVRC2012_val_00014409.JPEG n02085620/
+mv val/ILSVRC2012_val_00014410.JPEG n01917289/
+mv val/ILSVRC2012_val_00014411.JPEG n04070727/
+mv val/ILSVRC2012_val_00014412.JPEG n02823428/
+mv val/ILSVRC2012_val_00014413.JPEG n02860847/
+mv val/ILSVRC2012_val_00014414.JPEG n04392985/
+mv val/ILSVRC2012_val_00014415.JPEG n02791124/
+mv val/ILSVRC2012_val_00014416.JPEG n01847000/
+mv val/ILSVRC2012_val_00014417.JPEG n01784675/
+mv val/ILSVRC2012_val_00014418.JPEG n02093991/
+mv val/ILSVRC2012_val_00014419.JPEG n03457902/
+mv val/ILSVRC2012_val_00014420.JPEG n02939185/
+mv val/ILSVRC2012_val_00014421.JPEG n04493381/
+mv val/ILSVRC2012_val_00014422.JPEG n03271574/
+mv val/ILSVRC2012_val_00014423.JPEG n02509815/
+mv val/ILSVRC2012_val_00014424.JPEG n03793489/
+mv val/ILSVRC2012_val_00014425.JPEG n02690373/
+mv val/ILSVRC2012_val_00014426.JPEG n03983396/
+mv val/ILSVRC2012_val_00014427.JPEG n02927161/
+mv val/ILSVRC2012_val_00014428.JPEG n03018349/
+mv val/ILSVRC2012_val_00014429.JPEG n03908618/
+mv val/ILSVRC2012_val_00014430.JPEG n02110341/
+mv val/ILSVRC2012_val_00014431.JPEG n03776460/
+mv val/ILSVRC2012_val_00014432.JPEG n02124075/
+mv val/ILSVRC2012_val_00014433.JPEG n04335435/
+mv val/ILSVRC2012_val_00014434.JPEG n03127747/
+mv val/ILSVRC2012_val_00014435.JPEG n02948072/
+mv val/ILSVRC2012_val_00014436.JPEG n03085013/
+mv val/ILSVRC2012_val_00014437.JPEG n02442845/
+mv val/ILSVRC2012_val_00014438.JPEG n02916936/
+mv val/ILSVRC2012_val_00014439.JPEG n01688243/
+mv val/ILSVRC2012_val_00014440.JPEG n02879718/
+mv val/ILSVRC2012_val_00014441.JPEG n02097298/
+mv val/ILSVRC2012_val_00014442.JPEG n04589890/
+mv val/ILSVRC2012_val_00014443.JPEG n02607072/
+mv val/ILSVRC2012_val_00014444.JPEG n02948072/
+mv val/ILSVRC2012_val_00014445.JPEG n04525038/
+mv val/ILSVRC2012_val_00014446.JPEG n02100735/
+mv val/ILSVRC2012_val_00014447.JPEG n02814533/
+mv val/ILSVRC2012_val_00014448.JPEG n03000134/
+mv val/ILSVRC2012_val_00014449.JPEG n03478589/
+mv val/ILSVRC2012_val_00014450.JPEG n02037110/
+mv val/ILSVRC2012_val_00014451.JPEG n04235860/
+mv val/ILSVRC2012_val_00014452.JPEG n02112137/
+mv val/ILSVRC2012_val_00014453.JPEG n04435653/
+mv val/ILSVRC2012_val_00014454.JPEG n04273569/
+mv val/ILSVRC2012_val_00014455.JPEG n03794056/
+mv val/ILSVRC2012_val_00014456.JPEG n01910747/
+mv val/ILSVRC2012_val_00014457.JPEG n01748264/
+mv val/ILSVRC2012_val_00014458.JPEG n01883070/
+mv val/ILSVRC2012_val_00014459.JPEG n04200800/
+mv val/ILSVRC2012_val_00014460.JPEG n04590129/
+mv val/ILSVRC2012_val_00014461.JPEG n03443371/
+mv val/ILSVRC2012_val_00014462.JPEG n02791124/
+mv val/ILSVRC2012_val_00014463.JPEG n03075370/
+mv val/ILSVRC2012_val_00014464.JPEG n03673027/
+mv val/ILSVRC2012_val_00014465.JPEG n01742172/
+mv val/ILSVRC2012_val_00014466.JPEG n03476684/
+mv val/ILSVRC2012_val_00014467.JPEG n01484850/
+mv val/ILSVRC2012_val_00014468.JPEG n01675722/
+mv val/ILSVRC2012_val_00014469.JPEG n02978881/
+mv val/ILSVRC2012_val_00014470.JPEG n03938244/
+mv val/ILSVRC2012_val_00014471.JPEG n02106166/
+mv val/ILSVRC2012_val_00014472.JPEG n01729977/
+mv val/ILSVRC2012_val_00014473.JPEG n04118776/
+mv val/ILSVRC2012_val_00014474.JPEG n04209239/
+mv val/ILSVRC2012_val_00014475.JPEG n03376595/
+mv val/ILSVRC2012_val_00014476.JPEG n04008634/
+mv val/ILSVRC2012_val_00014477.JPEG n02095889/
+mv val/ILSVRC2012_val_00014478.JPEG n01855032/
+mv val/ILSVRC2012_val_00014479.JPEG n03376595/
+mv val/ILSVRC2012_val_00014480.JPEG n04456115/
+mv val/ILSVRC2012_val_00014481.JPEG n02879718/
+mv val/ILSVRC2012_val_00014482.JPEG n04238763/
+mv val/ILSVRC2012_val_00014483.JPEG n02268443/
+mv val/ILSVRC2012_val_00014484.JPEG n02794156/
+mv val/ILSVRC2012_val_00014485.JPEG n02105505/
+mv val/ILSVRC2012_val_00014486.JPEG n01914609/
+mv val/ILSVRC2012_val_00014487.JPEG n03899768/
+mv val/ILSVRC2012_val_00014488.JPEG n02676566/
+mv val/ILSVRC2012_val_00014489.JPEG n02099601/
+mv val/ILSVRC2012_val_00014490.JPEG n02106382/
+mv val/ILSVRC2012_val_00014491.JPEG n04264628/
+mv val/ILSVRC2012_val_00014492.JPEG n04501370/
+mv val/ILSVRC2012_val_00014493.JPEG n03594734/
+mv val/ILSVRC2012_val_00014494.JPEG n03895866/
+mv val/ILSVRC2012_val_00014495.JPEG n04332243/
+mv val/ILSVRC2012_val_00014496.JPEG n04008634/
+mv val/ILSVRC2012_val_00014497.JPEG n02492035/
+mv val/ILSVRC2012_val_00014498.JPEG n01773797/
+mv val/ILSVRC2012_val_00014499.JPEG n04228054/
+mv val/ILSVRC2012_val_00014500.JPEG n02110958/
+mv val/ILSVRC2012_val_00014501.JPEG n06359193/
+mv val/ILSVRC2012_val_00014502.JPEG n02403003/
+mv val/ILSVRC2012_val_00014503.JPEG n04409515/
+mv val/ILSVRC2012_val_00014504.JPEG n03337140/
+mv val/ILSVRC2012_val_00014505.JPEG n02483708/
+mv val/ILSVRC2012_val_00014506.JPEG n02106166/
+mv val/ILSVRC2012_val_00014507.JPEG n04209133/
+mv val/ILSVRC2012_val_00014508.JPEG n02114367/
+mv val/ILSVRC2012_val_00014509.JPEG n03743016/
+mv val/ILSVRC2012_val_00014510.JPEG n03201208/
+mv val/ILSVRC2012_val_00014511.JPEG n03207941/
+mv val/ILSVRC2012_val_00014512.JPEG n02804414/
+mv val/ILSVRC2012_val_00014513.JPEG n04487081/
+mv val/ILSVRC2012_val_00014514.JPEG n01945685/
+mv val/ILSVRC2012_val_00014515.JPEG n02606052/
+mv val/ILSVRC2012_val_00014516.JPEG n03388043/
+mv val/ILSVRC2012_val_00014517.JPEG n03661043/
+mv val/ILSVRC2012_val_00014518.JPEG n02804610/
+mv val/ILSVRC2012_val_00014519.JPEG n04235860/
+mv val/ILSVRC2012_val_00014520.JPEG n02795169/
+mv val/ILSVRC2012_val_00014521.JPEG n03476991/
+mv val/ILSVRC2012_val_00014522.JPEG n03444034/
+mv val/ILSVRC2012_val_00014523.JPEG n03942813/
+mv val/ILSVRC2012_val_00014524.JPEG n04026417/
+mv val/ILSVRC2012_val_00014525.JPEG n03337140/
+mv val/ILSVRC2012_val_00014526.JPEG n02108422/
+mv val/ILSVRC2012_val_00014527.JPEG n04033995/
+mv val/ILSVRC2012_val_00014528.JPEG n03041632/
+mv val/ILSVRC2012_val_00014529.JPEG n02134418/
+mv val/ILSVRC2012_val_00014530.JPEG n04554684/
+mv val/ILSVRC2012_val_00014531.JPEG n03733131/
+mv val/ILSVRC2012_val_00014532.JPEG n02116738/
+mv val/ILSVRC2012_val_00014533.JPEG n03786901/
+mv val/ILSVRC2012_val_00014534.JPEG n03937543/
+mv val/ILSVRC2012_val_00014535.JPEG n04147183/
+mv val/ILSVRC2012_val_00014536.JPEG n04131690/
+mv val/ILSVRC2012_val_00014537.JPEG n03400231/
+mv val/ILSVRC2012_val_00014538.JPEG n02125311/
+mv val/ILSVRC2012_val_00014539.JPEG n02410509/
+mv val/ILSVRC2012_val_00014540.JPEG n01775062/
+mv val/ILSVRC2012_val_00014541.JPEG n02814533/
+mv val/ILSVRC2012_val_00014542.JPEG n02110185/
+mv val/ILSVRC2012_val_00014543.JPEG n04008634/
+mv val/ILSVRC2012_val_00014544.JPEG n04597913/
+mv val/ILSVRC2012_val_00014545.JPEG n01883070/
+mv val/ILSVRC2012_val_00014546.JPEG n07714990/
+mv val/ILSVRC2012_val_00014547.JPEG n02112350/
+mv val/ILSVRC2012_val_00014548.JPEG n02437616/
+mv val/ILSVRC2012_val_00014549.JPEG n03662601/
+mv val/ILSVRC2012_val_00014550.JPEG n02074367/
+mv val/ILSVRC2012_val_00014551.JPEG n04239074/
+mv val/ILSVRC2012_val_00014552.JPEG n03063689/
+mv val/ILSVRC2012_val_00014553.JPEG n07831146/
+mv val/ILSVRC2012_val_00014554.JPEG n02869837/
+mv val/ILSVRC2012_val_00014555.JPEG n03920288/
+mv val/ILSVRC2012_val_00014556.JPEG n13052670/
+mv val/ILSVRC2012_val_00014557.JPEG n03016953/
+mv val/ILSVRC2012_val_00014558.JPEG n02788148/
+mv val/ILSVRC2012_val_00014559.JPEG n04613696/
+mv val/ILSVRC2012_val_00014560.JPEG n02113023/
+mv val/ILSVRC2012_val_00014561.JPEG n03866082/
+mv val/ILSVRC2012_val_00014562.JPEG n02992529/
+mv val/ILSVRC2012_val_00014563.JPEG n04479046/
+mv val/ILSVRC2012_val_00014564.JPEG n04467665/
+mv val/ILSVRC2012_val_00014565.JPEG n04540053/
+mv val/ILSVRC2012_val_00014566.JPEG n02927161/
+mv val/ILSVRC2012_val_00014567.JPEG n03992509/
+mv val/ILSVRC2012_val_00014568.JPEG n04347754/
+mv val/ILSVRC2012_val_00014569.JPEG n03495258/
+mv val/ILSVRC2012_val_00014570.JPEG n03633091/
+mv val/ILSVRC2012_val_00014571.JPEG n02105251/
+mv val/ILSVRC2012_val_00014572.JPEG n02231487/
+mv val/ILSVRC2012_val_00014573.JPEG n02102318/
+mv val/ILSVRC2012_val_00014574.JPEG n02667093/
+mv val/ILSVRC2012_val_00014575.JPEG n01749939/
+mv val/ILSVRC2012_val_00014576.JPEG n02133161/
+mv val/ILSVRC2012_val_00014577.JPEG n03372029/
+mv val/ILSVRC2012_val_00014578.JPEG n02486261/
+mv val/ILSVRC2012_val_00014579.JPEG n04004767/
+mv val/ILSVRC2012_val_00014580.JPEG n02088466/
+mv val/ILSVRC2012_val_00014581.JPEG n07579787/
+mv val/ILSVRC2012_val_00014582.JPEG n02791270/
+mv val/ILSVRC2012_val_00014583.JPEG n03131574/
+mv val/ILSVRC2012_val_00014584.JPEG n02391049/
+mv val/ILSVRC2012_val_00014585.JPEG n01664065/
+mv val/ILSVRC2012_val_00014586.JPEG n02099429/
+mv val/ILSVRC2012_val_00014587.JPEG n01776313/
+mv val/ILSVRC2012_val_00014588.JPEG n03920288/
+mv val/ILSVRC2012_val_00014589.JPEG n02109047/
+mv val/ILSVRC2012_val_00014590.JPEG n02317335/
+mv val/ILSVRC2012_val_00014591.JPEG n04612504/
+mv val/ILSVRC2012_val_00014592.JPEG n03584254/
+mv val/ILSVRC2012_val_00014593.JPEG n03457902/
+mv val/ILSVRC2012_val_00014594.JPEG n02051845/
+mv val/ILSVRC2012_val_00014595.JPEG n03047690/
+mv val/ILSVRC2012_val_00014596.JPEG n04507155/
+mv val/ILSVRC2012_val_00014597.JPEG n02704792/
+mv val/ILSVRC2012_val_00014598.JPEG n01748264/
+mv val/ILSVRC2012_val_00014599.JPEG n02017213/
+mv val/ILSVRC2012_val_00014600.JPEG n03450230/
+mv val/ILSVRC2012_val_00014601.JPEG n02841315/
+mv val/ILSVRC2012_val_00014602.JPEG n04070727/
+mv val/ILSVRC2012_val_00014603.JPEG n02992211/
+mv val/ILSVRC2012_val_00014604.JPEG n03404251/
+mv val/ILSVRC2012_val_00014605.JPEG n02092339/
+mv val/ILSVRC2012_val_00014606.JPEG n12768682/
+mv val/ILSVRC2012_val_00014607.JPEG n07873807/
+mv val/ILSVRC2012_val_00014608.JPEG n03041632/
+mv val/ILSVRC2012_val_00014609.JPEG n03379051/
+mv val/ILSVRC2012_val_00014610.JPEG n04435653/
+mv val/ILSVRC2012_val_00014611.JPEG n04146614/
+mv val/ILSVRC2012_val_00014612.JPEG n02012849/
+mv val/ILSVRC2012_val_00014613.JPEG n03443371/
+mv val/ILSVRC2012_val_00014614.JPEG n04152593/
+mv val/ILSVRC2012_val_00014615.JPEG n04507155/
+mv val/ILSVRC2012_val_00014616.JPEG n03447447/
+mv val/ILSVRC2012_val_00014617.JPEG n04252225/
+mv val/ILSVRC2012_val_00014618.JPEG n03770439/
+mv val/ILSVRC2012_val_00014619.JPEG n13037406/
+mv val/ILSVRC2012_val_00014620.JPEG n01748264/
+mv val/ILSVRC2012_val_00014621.JPEG n04550184/
+mv val/ILSVRC2012_val_00014622.JPEG n03207941/
+mv val/ILSVRC2012_val_00014623.JPEG n07716906/
+mv val/ILSVRC2012_val_00014624.JPEG n03595614/
+mv val/ILSVRC2012_val_00014625.JPEG n07875152/
+mv val/ILSVRC2012_val_00014626.JPEG n04560804/
+mv val/ILSVRC2012_val_00014627.JPEG n04479046/
+mv val/ILSVRC2012_val_00014628.JPEG n03127925/
+mv val/ILSVRC2012_val_00014629.JPEG n07248320/
+mv val/ILSVRC2012_val_00014630.JPEG n02342885/
+mv val/ILSVRC2012_val_00014631.JPEG n02088466/
+mv val/ILSVRC2012_val_00014632.JPEG n03485407/
+mv val/ILSVRC2012_val_00014633.JPEG n09399592/
+mv val/ILSVRC2012_val_00014634.JPEG n04039381/
+mv val/ILSVRC2012_val_00014635.JPEG n04548280/
+mv val/ILSVRC2012_val_00014636.JPEG n02099267/
+mv val/ILSVRC2012_val_00014637.JPEG n04254777/
+mv val/ILSVRC2012_val_00014638.JPEG n06785654/
+mv val/ILSVRC2012_val_00014639.JPEG n02190166/
+mv val/ILSVRC2012_val_00014640.JPEG n03868242/
+mv val/ILSVRC2012_val_00014641.JPEG n04141076/
+mv val/ILSVRC2012_val_00014642.JPEG n02980441/
+mv val/ILSVRC2012_val_00014643.JPEG n03868863/
+mv val/ILSVRC2012_val_00014644.JPEG n02437312/
+mv val/ILSVRC2012_val_00014645.JPEG n02096177/
+mv val/ILSVRC2012_val_00014646.JPEG n02701002/
+mv val/ILSVRC2012_val_00014647.JPEG n03259280/
+mv val/ILSVRC2012_val_00014648.JPEG n02834397/
+mv val/ILSVRC2012_val_00014649.JPEG n15075141/
+mv val/ILSVRC2012_val_00014650.JPEG n07880968/
+mv val/ILSVRC2012_val_00014651.JPEG n02096585/
+mv val/ILSVRC2012_val_00014652.JPEG n09256479/
+mv val/ILSVRC2012_val_00014653.JPEG n02091032/
+mv val/ILSVRC2012_val_00014654.JPEG n03457902/
+mv val/ILSVRC2012_val_00014655.JPEG n02099849/
+mv val/ILSVRC2012_val_00014656.JPEG n02398521/
+mv val/ILSVRC2012_val_00014657.JPEG n02129165/
+mv val/ILSVRC2012_val_00014658.JPEG n03404251/
+mv val/ILSVRC2012_val_00014659.JPEG n01774384/
+mv val/ILSVRC2012_val_00014660.JPEG n03977966/
+mv val/ILSVRC2012_val_00014661.JPEG n02980441/
+mv val/ILSVRC2012_val_00014662.JPEG n02137549/
+mv val/ILSVRC2012_val_00014663.JPEG n03920288/
+mv val/ILSVRC2012_val_00014664.JPEG n01770081/
+mv val/ILSVRC2012_val_00014665.JPEG n03891332/
+mv val/ILSVRC2012_val_00014666.JPEG n03196217/
+mv val/ILSVRC2012_val_00014667.JPEG n02782093/
+mv val/ILSVRC2012_val_00014668.JPEG n02510455/
+mv val/ILSVRC2012_val_00014669.JPEG n03535780/
+mv val/ILSVRC2012_val_00014670.JPEG n04263257/
+mv val/ILSVRC2012_val_00014671.JPEG n02790996/
+mv val/ILSVRC2012_val_00014672.JPEG n03146219/
+mv val/ILSVRC2012_val_00014673.JPEG n01601694/
+mv val/ILSVRC2012_val_00014674.JPEG n03379051/
+mv val/ILSVRC2012_val_00014675.JPEG n03188531/
+mv val/ILSVRC2012_val_00014676.JPEG n02790996/
+mv val/ILSVRC2012_val_00014677.JPEG n04596742/
+mv val/ILSVRC2012_val_00014678.JPEG n01560419/
+mv val/ILSVRC2012_val_00014679.JPEG n03376595/
+mv val/ILSVRC2012_val_00014680.JPEG n12768682/
+mv val/ILSVRC2012_val_00014681.JPEG n02504013/
+mv val/ILSVRC2012_val_00014682.JPEG n03388043/
+mv val/ILSVRC2012_val_00014683.JPEG n02231487/
+mv val/ILSVRC2012_val_00014684.JPEG n03134739/
+mv val/ILSVRC2012_val_00014685.JPEG n03775071/
+mv val/ILSVRC2012_val_00014686.JPEG n02509815/
+mv val/ILSVRC2012_val_00014687.JPEG n07695742/
+mv val/ILSVRC2012_val_00014688.JPEG n02325366/
+mv val/ILSVRC2012_val_00014689.JPEG n09835506/
+mv val/ILSVRC2012_val_00014690.JPEG n04418357/
+mv val/ILSVRC2012_val_00014691.JPEG n04483307/
+mv val/ILSVRC2012_val_00014692.JPEG n04069434/
+mv val/ILSVRC2012_val_00014693.JPEG n03991062/
+mv val/ILSVRC2012_val_00014694.JPEG n02487347/
+mv val/ILSVRC2012_val_00014695.JPEG n03223299/
+mv val/ILSVRC2012_val_00014696.JPEG n02817516/
+mv val/ILSVRC2012_val_00014697.JPEG n03207743/
+mv val/ILSVRC2012_val_00014698.JPEG n02110627/
+mv val/ILSVRC2012_val_00014699.JPEG n04604644/
+mv val/ILSVRC2012_val_00014700.JPEG n02112350/
+mv val/ILSVRC2012_val_00014701.JPEG n02109961/
+mv val/ILSVRC2012_val_00014702.JPEG n03534580/
+mv val/ILSVRC2012_val_00014703.JPEG n03208938/
+mv val/ILSVRC2012_val_00014704.JPEG n03125729/
+mv val/ILSVRC2012_val_00014705.JPEG n03947888/
+mv val/ILSVRC2012_val_00014706.JPEG n04154565/
+mv val/ILSVRC2012_val_00014707.JPEG n01860187/
+mv val/ILSVRC2012_val_00014708.JPEG n02328150/
+mv val/ILSVRC2012_val_00014709.JPEG n02777292/
+mv val/ILSVRC2012_val_00014710.JPEG n02112018/
+mv val/ILSVRC2012_val_00014711.JPEG n02113978/
+mv val/ILSVRC2012_val_00014712.JPEG n02033041/
+mv val/ILSVRC2012_val_00014713.JPEG n07871810/
+mv val/ILSVRC2012_val_00014714.JPEG n10148035/
+mv val/ILSVRC2012_val_00014715.JPEG n01981276/
+mv val/ILSVRC2012_val_00014716.JPEG n07860988/
+mv val/ILSVRC2012_val_00014717.JPEG n03492542/
+mv val/ILSVRC2012_val_00014718.JPEG n04005630/
+mv val/ILSVRC2012_val_00014719.JPEG n02093428/
+mv val/ILSVRC2012_val_00014720.JPEG n04355933/
+mv val/ILSVRC2012_val_00014721.JPEG n02108089/
+mv val/ILSVRC2012_val_00014722.JPEG n03841143/
+mv val/ILSVRC2012_val_00014723.JPEG n02704792/
+mv val/ILSVRC2012_val_00014724.JPEG n02277742/
+mv val/ILSVRC2012_val_00014725.JPEG n03874599/
+mv val/ILSVRC2012_val_00014726.JPEG n04371774/
+mv val/ILSVRC2012_val_00014727.JPEG n01775062/
+mv val/ILSVRC2012_val_00014728.JPEG n03461385/
+mv val/ILSVRC2012_val_00014729.JPEG n02096585/
+mv val/ILSVRC2012_val_00014730.JPEG n02093754/
+mv val/ILSVRC2012_val_00014731.JPEG n02011460/
+mv val/ILSVRC2012_val_00014732.JPEG n02814533/
+mv val/ILSVRC2012_val_00014733.JPEG n02787622/
+mv val/ILSVRC2012_val_00014734.JPEG n02114367/
+mv val/ILSVRC2012_val_00014735.JPEG n01641577/
+mv val/ILSVRC2012_val_00014736.JPEG n03992509/
+mv val/ILSVRC2012_val_00014737.JPEG n04265275/
+mv val/ILSVRC2012_val_00014738.JPEG n02096051/
+mv val/ILSVRC2012_val_00014739.JPEG n07745940/
+mv val/ILSVRC2012_val_00014740.JPEG n02422106/
+mv val/ILSVRC2012_val_00014741.JPEG n01496331/
+mv val/ILSVRC2012_val_00014742.JPEG n03188531/
+mv val/ILSVRC2012_val_00014743.JPEG n07614500/
+mv val/ILSVRC2012_val_00014744.JPEG n02101006/
+mv val/ILSVRC2012_val_00014745.JPEG n02101006/
+mv val/ILSVRC2012_val_00014746.JPEG n13040303/
+mv val/ILSVRC2012_val_00014747.JPEG n02085936/
+mv val/ILSVRC2012_val_00014748.JPEG n03961711/
+mv val/ILSVRC2012_val_00014749.JPEG n02093991/
+mv val/ILSVRC2012_val_00014750.JPEG n07714571/
+mv val/ILSVRC2012_val_00014751.JPEG n01986214/
+mv val/ILSVRC2012_val_00014752.JPEG n01669191/
+mv val/ILSVRC2012_val_00014753.JPEG n01984695/
+mv val/ILSVRC2012_val_00014754.JPEG n03297495/
+mv val/ILSVRC2012_val_00014755.JPEG n02108422/
+mv val/ILSVRC2012_val_00014756.JPEG n03249569/
+mv val/ILSVRC2012_val_00014757.JPEG n04398044/
+mv val/ILSVRC2012_val_00014758.JPEG n03775546/
+mv val/ILSVRC2012_val_00014759.JPEG n01986214/
+mv val/ILSVRC2012_val_00014760.JPEG n04579432/
+mv val/ILSVRC2012_val_00014761.JPEG n07714571/
+mv val/ILSVRC2012_val_00014762.JPEG n01945685/
+mv val/ILSVRC2012_val_00014763.JPEG n02640242/
+mv val/ILSVRC2012_val_00014764.JPEG n06785654/
+mv val/ILSVRC2012_val_00014765.JPEG n04116512/
+mv val/ILSVRC2012_val_00014766.JPEG n02099429/
+mv val/ILSVRC2012_val_00014767.JPEG n09229709/
+mv val/ILSVRC2012_val_00014768.JPEG n01682714/
+mv val/ILSVRC2012_val_00014769.JPEG n01749939/
+mv val/ILSVRC2012_val_00014770.JPEG n02007558/
+mv val/ILSVRC2012_val_00014771.JPEG n01498041/
+mv val/ILSVRC2012_val_00014772.JPEG n04507155/
+mv val/ILSVRC2012_val_00014773.JPEG n02124075/
+mv val/ILSVRC2012_val_00014774.JPEG n02101006/
+mv val/ILSVRC2012_val_00014775.JPEG n02104029/
+mv val/ILSVRC2012_val_00014776.JPEG n02676566/
+mv val/ILSVRC2012_val_00014777.JPEG n02606052/
+mv val/ILSVRC2012_val_00014778.JPEG n04238763/
+mv val/ILSVRC2012_val_00014779.JPEG n02101388/
+mv val/ILSVRC2012_val_00014780.JPEG n02107312/
+mv val/ILSVRC2012_val_00014781.JPEG n03347037/
+mv val/ILSVRC2012_val_00014782.JPEG n02493509/
+mv val/ILSVRC2012_val_00014783.JPEG n02396427/
+mv val/ILSVRC2012_val_00014784.JPEG n04065272/
+mv val/ILSVRC2012_val_00014785.JPEG n03840681/
+mv val/ILSVRC2012_val_00014786.JPEG n04515003/
+mv val/ILSVRC2012_val_00014787.JPEG n02091635/
+mv val/ILSVRC2012_val_00014788.JPEG n02325366/
+mv val/ILSVRC2012_val_00014789.JPEG n04033901/
+mv val/ILSVRC2012_val_00014790.JPEG n01675722/
+mv val/ILSVRC2012_val_00014791.JPEG n03788365/
+mv val/ILSVRC2012_val_00014792.JPEG n13037406/
+mv val/ILSVRC2012_val_00014793.JPEG n03527444/
+mv val/ILSVRC2012_val_00014794.JPEG n01695060/
+mv val/ILSVRC2012_val_00014795.JPEG n04328186/
+mv val/ILSVRC2012_val_00014796.JPEG n07590611/
+mv val/ILSVRC2012_val_00014797.JPEG n01728572/
+mv val/ILSVRC2012_val_00014798.JPEG n02119022/
+mv val/ILSVRC2012_val_00014799.JPEG n02974003/
+mv val/ILSVRC2012_val_00014800.JPEG n02410509/
+mv val/ILSVRC2012_val_00014801.JPEG n07892512/
+mv val/ILSVRC2012_val_00014802.JPEG n07730033/
+mv val/ILSVRC2012_val_00014803.JPEG n04330267/
+mv val/ILSVRC2012_val_00014804.JPEG n03868863/
+mv val/ILSVRC2012_val_00014805.JPEG n02018207/
+mv val/ILSVRC2012_val_00014806.JPEG n02500267/
+mv val/ILSVRC2012_val_00014807.JPEG n02980441/
+mv val/ILSVRC2012_val_00014808.JPEG n01843065/
+mv val/ILSVRC2012_val_00014809.JPEG n02093859/
+mv val/ILSVRC2012_val_00014810.JPEG n02094114/
+mv val/ILSVRC2012_val_00014811.JPEG n07768694/
+mv val/ILSVRC2012_val_00014812.JPEG n04154565/
+mv val/ILSVRC2012_val_00014813.JPEG n02123394/
+mv val/ILSVRC2012_val_00014814.JPEG n03843555/
+mv val/ILSVRC2012_val_00014815.JPEG n02123159/
+mv val/ILSVRC2012_val_00014816.JPEG n02107574/
+mv val/ILSVRC2012_val_00014817.JPEG n01795545/
+mv val/ILSVRC2012_val_00014818.JPEG n02917067/
+mv val/ILSVRC2012_val_00014819.JPEG n02071294/
+mv val/ILSVRC2012_val_00014820.JPEG n03895866/
+mv val/ILSVRC2012_val_00014821.JPEG n03179701/
+mv val/ILSVRC2012_val_00014822.JPEG n03950228/
+mv val/ILSVRC2012_val_00014823.JPEG n04259630/
+mv val/ILSVRC2012_val_00014824.JPEG n02165105/
+mv val/ILSVRC2012_val_00014825.JPEG n02120079/
+mv val/ILSVRC2012_val_00014826.JPEG n02804610/
+mv val/ILSVRC2012_val_00014827.JPEG n02279972/
+mv val/ILSVRC2012_val_00014828.JPEG n01728920/
+mv val/ILSVRC2012_val_00014829.JPEG n02978881/
+mv val/ILSVRC2012_val_00014830.JPEG n03710637/
+mv val/ILSVRC2012_val_00014831.JPEG n01872401/
+mv val/ILSVRC2012_val_00014832.JPEG n03160309/
+mv val/ILSVRC2012_val_00014833.JPEG n02442845/
+mv val/ILSVRC2012_val_00014834.JPEG n09256479/
+mv val/ILSVRC2012_val_00014835.JPEG n02950826/
+mv val/ILSVRC2012_val_00014836.JPEG n02841315/
+mv val/ILSVRC2012_val_00014837.JPEG n04357314/
+mv val/ILSVRC2012_val_00014838.JPEG n02865351/
+mv val/ILSVRC2012_val_00014839.JPEG n04111531/
+mv val/ILSVRC2012_val_00014840.JPEG n07747607/
+mv val/ILSVRC2012_val_00014841.JPEG n03594945/
+mv val/ILSVRC2012_val_00014842.JPEG n03763968/
+mv val/ILSVRC2012_val_00014843.JPEG n04606251/
+mv val/ILSVRC2012_val_00014844.JPEG n03895866/
+mv val/ILSVRC2012_val_00014845.JPEG n02113978/
+mv val/ILSVRC2012_val_00014846.JPEG n04554684/
+mv val/ILSVRC2012_val_00014847.JPEG n04344873/
+mv val/ILSVRC2012_val_00014848.JPEG n04254120/
+mv val/ILSVRC2012_val_00014849.JPEG n01740131/
+mv val/ILSVRC2012_val_00014850.JPEG n03976467/
+mv val/ILSVRC2012_val_00014851.JPEG n07753275/
+mv val/ILSVRC2012_val_00014852.JPEG n02443484/
+mv val/ILSVRC2012_val_00014853.JPEG n02939185/
+mv val/ILSVRC2012_val_00014854.JPEG n02977058/
+mv val/ILSVRC2012_val_00014855.JPEG n13037406/
+mv val/ILSVRC2012_val_00014856.JPEG n07747607/
+mv val/ILSVRC2012_val_00014857.JPEG n04467665/
+mv val/ILSVRC2012_val_00014858.JPEG n01784675/
+mv val/ILSVRC2012_val_00014859.JPEG n04536866/
+mv val/ILSVRC2012_val_00014860.JPEG n02123159/
+mv val/ILSVRC2012_val_00014861.JPEG n02119789/
+mv val/ILSVRC2012_val_00014862.JPEG n04548362/
+mv val/ILSVRC2012_val_00014863.JPEG n02111129/
+mv val/ILSVRC2012_val_00014864.JPEG n06794110/
+mv val/ILSVRC2012_val_00014865.JPEG n04239074/
+mv val/ILSVRC2012_val_00014866.JPEG n03733805/
+mv val/ILSVRC2012_val_00014867.JPEG n02088466/
+mv val/ILSVRC2012_val_00014868.JPEG n03764736/
+mv val/ILSVRC2012_val_00014869.JPEG n01914609/
+mv val/ILSVRC2012_val_00014870.JPEG n02105505/
+mv val/ILSVRC2012_val_00014871.JPEG n02412080/
+mv val/ILSVRC2012_val_00014872.JPEG n04254680/
+mv val/ILSVRC2012_val_00014873.JPEG n04523525/
+mv val/ILSVRC2012_val_00014874.JPEG n07697537/
+mv val/ILSVRC2012_val_00014875.JPEG n01728920/
+mv val/ILSVRC2012_val_00014876.JPEG n02794156/
+mv val/ILSVRC2012_val_00014877.JPEG n02113978/
+mv val/ILSVRC2012_val_00014878.JPEG n13040303/
+mv val/ILSVRC2012_val_00014879.JPEG n01514859/
+mv val/ILSVRC2012_val_00014880.JPEG n04398044/
+mv val/ILSVRC2012_val_00014881.JPEG n02364673/
+mv val/ILSVRC2012_val_00014882.JPEG n01924916/
+mv val/ILSVRC2012_val_00014883.JPEG n02007558/
+mv val/ILSVRC2012_val_00014884.JPEG n03803284/
+mv val/ILSVRC2012_val_00014885.JPEG n02795169/
+mv val/ILSVRC2012_val_00014886.JPEG n03916031/
+mv val/ILSVRC2012_val_00014887.JPEG n02088238/
+mv val/ILSVRC2012_val_00014888.JPEG n02086646/
+mv val/ILSVRC2012_val_00014889.JPEG n03063689/
+mv val/ILSVRC2012_val_00014890.JPEG n01806143/
+mv val/ILSVRC2012_val_00014891.JPEG n04366367/
+mv val/ILSVRC2012_val_00014892.JPEG n03109150/
+mv val/ILSVRC2012_val_00014893.JPEG n04523525/
+mv val/ILSVRC2012_val_00014894.JPEG n04208210/
+mv val/ILSVRC2012_val_00014895.JPEG n01978287/
+mv val/ILSVRC2012_val_00014896.JPEG n03272010/
+mv val/ILSVRC2012_val_00014897.JPEG n03146219/
+mv val/ILSVRC2012_val_00014898.JPEG n03933933/
+mv val/ILSVRC2012_val_00014899.JPEG n04525305/
+mv val/ILSVRC2012_val_00014900.JPEG n03124043/
+mv val/ILSVRC2012_val_00014901.JPEG n02510455/
+mv val/ILSVRC2012_val_00014902.JPEG n01687978/
+mv val/ILSVRC2012_val_00014903.JPEG n01824575/
+mv val/ILSVRC2012_val_00014904.JPEG n04613696/
+mv val/ILSVRC2012_val_00014905.JPEG n06359193/
+mv val/ILSVRC2012_val_00014906.JPEG n03110669/
+mv val/ILSVRC2012_val_00014907.JPEG n03388183/
+mv val/ILSVRC2012_val_00014908.JPEG n03691459/
+mv val/ILSVRC2012_val_00014909.JPEG n02280649/
+mv val/ILSVRC2012_val_00014910.JPEG n03133878/
+mv val/ILSVRC2012_val_00014911.JPEG n02085782/
+mv val/ILSVRC2012_val_00014912.JPEG n02087046/
+mv val/ILSVRC2012_val_00014913.JPEG n02090721/
+mv val/ILSVRC2012_val_00014914.JPEG n02497673/
+mv val/ILSVRC2012_val_00014915.JPEG n04344873/
+mv val/ILSVRC2012_val_00014916.JPEG n04330267/
+mv val/ILSVRC2012_val_00014917.JPEG n01514859/
+mv val/ILSVRC2012_val_00014918.JPEG n02488702/
+mv val/ILSVRC2012_val_00014919.JPEG n04525038/
+mv val/ILSVRC2012_val_00014920.JPEG n07711569/
+mv val/ILSVRC2012_val_00014921.JPEG n01978455/
+mv val/ILSVRC2012_val_00014922.JPEG n01768244/
+mv val/ILSVRC2012_val_00014923.JPEG n02105855/
+mv val/ILSVRC2012_val_00014924.JPEG n04604644/
+mv val/ILSVRC2012_val_00014925.JPEG n02281406/
+mv val/ILSVRC2012_val_00014926.JPEG n01739381/
+mv val/ILSVRC2012_val_00014927.JPEG n01693334/
+mv val/ILSVRC2012_val_00014928.JPEG n02113978/
+mv val/ILSVRC2012_val_00014929.JPEG n07749582/
+mv val/ILSVRC2012_val_00014930.JPEG n03786901/
+mv val/ILSVRC2012_val_00014931.JPEG n01883070/
+mv val/ILSVRC2012_val_00014932.JPEG n09246464/
+mv val/ILSVRC2012_val_00014933.JPEG n03841143/
+mv val/ILSVRC2012_val_00014934.JPEG n03482405/
+mv val/ILSVRC2012_val_00014935.JPEG n12998815/
+mv val/ILSVRC2012_val_00014936.JPEG n03938244/
+mv val/ILSVRC2012_val_00014937.JPEG n04238763/
+mv val/ILSVRC2012_val_00014938.JPEG n03929855/
+mv val/ILSVRC2012_val_00014939.JPEG n02892201/
+mv val/ILSVRC2012_val_00014940.JPEG n02486261/
+mv val/ILSVRC2012_val_00014941.JPEG n02676566/
+mv val/ILSVRC2012_val_00014942.JPEG n01843065/
+mv val/ILSVRC2012_val_00014943.JPEG n01728920/
+mv val/ILSVRC2012_val_00014944.JPEG n03379051/
+mv val/ILSVRC2012_val_00014945.JPEG n02823750/
+mv val/ILSVRC2012_val_00014946.JPEG n02776631/
+mv val/ILSVRC2012_val_00014947.JPEG n02488291/
+mv val/ILSVRC2012_val_00014948.JPEG n02317335/
+mv val/ILSVRC2012_val_00014949.JPEG n02002724/
+mv val/ILSVRC2012_val_00014950.JPEG n01755581/
+mv val/ILSVRC2012_val_00014951.JPEG n03110669/
+mv val/ILSVRC2012_val_00014952.JPEG n04019541/
+mv val/ILSVRC2012_val_00014953.JPEG n03095699/
+mv val/ILSVRC2012_val_00014954.JPEG n04004767/
+mv val/ILSVRC2012_val_00014955.JPEG n03877845/
+mv val/ILSVRC2012_val_00014956.JPEG n02120505/
+mv val/ILSVRC2012_val_00014957.JPEG n02113624/
+mv val/ILSVRC2012_val_00014958.JPEG n07695742/
+mv val/ILSVRC2012_val_00014959.JPEG n03127747/
+mv val/ILSVRC2012_val_00014960.JPEG n03041632/
+mv val/ILSVRC2012_val_00014961.JPEG n01744401/
+mv val/ILSVRC2012_val_00014962.JPEG n02098286/
+mv val/ILSVRC2012_val_00014963.JPEG n02100735/
+mv val/ILSVRC2012_val_00014964.JPEG n02264363/
+mv val/ILSVRC2012_val_00014965.JPEG n04456115/
+mv val/ILSVRC2012_val_00014966.JPEG n02219486/
+mv val/ILSVRC2012_val_00014967.JPEG n02129165/
+mv val/ILSVRC2012_val_00014968.JPEG n04275548/
+mv val/ILSVRC2012_val_00014969.JPEG n03874599/
+mv val/ILSVRC2012_val_00014970.JPEG n03706229/
+mv val/ILSVRC2012_val_00014971.JPEG n01770081/
+mv val/ILSVRC2012_val_00014972.JPEG n02988304/
+mv val/ILSVRC2012_val_00014973.JPEG n02105505/
+mv val/ILSVRC2012_val_00014974.JPEG n02130308/
+mv val/ILSVRC2012_val_00014975.JPEG n02113799/
+mv val/ILSVRC2012_val_00014976.JPEG n06596364/
+mv val/ILSVRC2012_val_00014977.JPEG n02028035/
+mv val/ILSVRC2012_val_00014978.JPEG n01784675/
+mv val/ILSVRC2012_val_00014979.JPEG n04266014/
+mv val/ILSVRC2012_val_00014980.JPEG n02422106/
+mv val/ILSVRC2012_val_00014981.JPEG n03271574/
+mv val/ILSVRC2012_val_00014982.JPEG n01622779/
+mv val/ILSVRC2012_val_00014983.JPEG n04229816/
+mv val/ILSVRC2012_val_00014984.JPEG n02988304/
+mv val/ILSVRC2012_val_00014985.JPEG n02977058/
+mv val/ILSVRC2012_val_00014986.JPEG n03594734/
+mv val/ILSVRC2012_val_00014987.JPEG n03196217/
+mv val/ILSVRC2012_val_00014988.JPEG n04008634/
+mv val/ILSVRC2012_val_00014989.JPEG n03947888/
+mv val/ILSVRC2012_val_00014990.JPEG n03032252/
+mv val/ILSVRC2012_val_00014991.JPEG n02037110/
+mv val/ILSVRC2012_val_00014992.JPEG n03424325/
+mv val/ILSVRC2012_val_00014993.JPEG n03873416/
+mv val/ILSVRC2012_val_00014994.JPEG n03379051/
+mv val/ILSVRC2012_val_00014995.JPEG n02096437/
+mv val/ILSVRC2012_val_00014996.JPEG n03887697/
+mv val/ILSVRC2012_val_00014997.JPEG n04154565/
+mv val/ILSVRC2012_val_00014998.JPEG n03803284/
+mv val/ILSVRC2012_val_00014999.JPEG n06794110/
+mv val/ILSVRC2012_val_00015000.JPEG n03956157/
+mv val/ILSVRC2012_val_00015001.JPEG n03297495/
+mv val/ILSVRC2012_val_00015002.JPEG n03444034/
+mv val/ILSVRC2012_val_00015003.JPEG n09256479/
+mv val/ILSVRC2012_val_00015004.JPEG n02317335/
+mv val/ILSVRC2012_val_00015005.JPEG n03871628/
+mv val/ILSVRC2012_val_00015006.JPEG n04192698/
+mv val/ILSVRC2012_val_00015007.JPEG n07873807/
+mv val/ILSVRC2012_val_00015008.JPEG n02793495/
+mv val/ILSVRC2012_val_00015009.JPEG n03764736/
+mv val/ILSVRC2012_val_00015010.JPEG n02483362/
+mv val/ILSVRC2012_val_00015011.JPEG n01773797/
+mv val/ILSVRC2012_val_00015012.JPEG n03788195/
+mv val/ILSVRC2012_val_00015013.JPEG n03032252/
+mv val/ILSVRC2012_val_00015014.JPEG n04311174/
+mv val/ILSVRC2012_val_00015015.JPEG n02111889/
+mv val/ILSVRC2012_val_00015016.JPEG n03970156/
+mv val/ILSVRC2012_val_00015017.JPEG n04447861/
+mv val/ILSVRC2012_val_00015018.JPEG n02018795/
+mv val/ILSVRC2012_val_00015019.JPEG n03666591/
+mv val/ILSVRC2012_val_00015020.JPEG n03314780/
+mv val/ILSVRC2012_val_00015021.JPEG n02229544/
+mv val/ILSVRC2012_val_00015022.JPEG n02172182/
+mv val/ILSVRC2012_val_00015023.JPEG n02486410/
+mv val/ILSVRC2012_val_00015024.JPEG n02607072/
+mv val/ILSVRC2012_val_00015025.JPEG n02276258/
+mv val/ILSVRC2012_val_00015026.JPEG n04254777/
+mv val/ILSVRC2012_val_00015027.JPEG n02403003/
+mv val/ILSVRC2012_val_00015028.JPEG n02094114/
+mv val/ILSVRC2012_val_00015029.JPEG n09246464/
+mv val/ILSVRC2012_val_00015030.JPEG n02114367/
+mv val/ILSVRC2012_val_00015031.JPEG n03788365/
+mv val/ILSVRC2012_val_00015032.JPEG n03297495/
+mv val/ILSVRC2012_val_00015033.JPEG n02492660/
+mv val/ILSVRC2012_val_00015034.JPEG n04326547/
+mv val/ILSVRC2012_val_00015035.JPEG n03201208/
+mv val/ILSVRC2012_val_00015036.JPEG n04286575/
+mv val/ILSVRC2012_val_00015037.JPEG n03492542/
+mv val/ILSVRC2012_val_00015038.JPEG n03877472/
+mv val/ILSVRC2012_val_00015039.JPEG n01910747/
+mv val/ILSVRC2012_val_00015040.JPEG n01608432/
+mv val/ILSVRC2012_val_00015041.JPEG n02490219/
+mv val/ILSVRC2012_val_00015042.JPEG n03710637/
+mv val/ILSVRC2012_val_00015043.JPEG n04344873/
+mv val/ILSVRC2012_val_00015044.JPEG n02951358/
+mv val/ILSVRC2012_val_00015045.JPEG n01498041/
+mv val/ILSVRC2012_val_00015046.JPEG n01729322/
+mv val/ILSVRC2012_val_00015047.JPEG n04409515/
+mv val/ILSVRC2012_val_00015048.JPEG n04146614/
+mv val/ILSVRC2012_val_00015049.JPEG n03873416/
+mv val/ILSVRC2012_val_00015050.JPEG n02090721/
+mv val/ILSVRC2012_val_00015051.JPEG n04081281/
+mv val/ILSVRC2012_val_00015052.JPEG n03976467/
+mv val/ILSVRC2012_val_00015053.JPEG n02837789/
+mv val/ILSVRC2012_val_00015054.JPEG n04409515/
+mv val/ILSVRC2012_val_00015055.JPEG n03759954/
+mv val/ILSVRC2012_val_00015056.JPEG n02168699/
+mv val/ILSVRC2012_val_00015057.JPEG n03127925/
+mv val/ILSVRC2012_val_00015058.JPEG n03970156/
+mv val/ILSVRC2012_val_00015059.JPEG n01665541/
+mv val/ILSVRC2012_val_00015060.JPEG n03160309/
+mv val/ILSVRC2012_val_00015061.JPEG n04251144/
+mv val/ILSVRC2012_val_00015062.JPEG n04311174/
+mv val/ILSVRC2012_val_00015063.JPEG n02098413/
+mv val/ILSVRC2012_val_00015064.JPEG n02480855/
+mv val/ILSVRC2012_val_00015065.JPEG n01773549/
+mv val/ILSVRC2012_val_00015066.JPEG n02489166/
+mv val/ILSVRC2012_val_00015067.JPEG n03494278/
+mv val/ILSVRC2012_val_00015068.JPEG n02229544/
+mv val/ILSVRC2012_val_00015069.JPEG n01729977/
+mv val/ILSVRC2012_val_00015070.JPEG n04552348/
+mv val/ILSVRC2012_val_00015071.JPEG n04033995/
+mv val/ILSVRC2012_val_00015072.JPEG n01882714/
+mv val/ILSVRC2012_val_00015073.JPEG n04366367/
+mv val/ILSVRC2012_val_00015074.JPEG n03271574/
+mv val/ILSVRC2012_val_00015075.JPEG n03666591/
+mv val/ILSVRC2012_val_00015076.JPEG n02093428/
+mv val/ILSVRC2012_val_00015077.JPEG n02791124/
+mv val/ILSVRC2012_val_00015078.JPEG n03384352/
+mv val/ILSVRC2012_val_00015079.JPEG n03498962/
+mv val/ILSVRC2012_val_00015080.JPEG n03709823/
+mv val/ILSVRC2012_val_00015081.JPEG n02422699/
+mv val/ILSVRC2012_val_00015082.JPEG n02085782/
+mv val/ILSVRC2012_val_00015083.JPEG n04133789/
+mv val/ILSVRC2012_val_00015084.JPEG n02486261/
+mv val/ILSVRC2012_val_00015085.JPEG n12985857/
+mv val/ILSVRC2012_val_00015086.JPEG n04372370/
+mv val/ILSVRC2012_val_00015087.JPEG n03857828/
+mv val/ILSVRC2012_val_00015088.JPEG n04367480/
+mv val/ILSVRC2012_val_00015089.JPEG n04612504/
+mv val/ILSVRC2012_val_00015090.JPEG n04399382/
+mv val/ILSVRC2012_val_00015091.JPEG n01632458/
+mv val/ILSVRC2012_val_00015092.JPEG n03717622/
+mv val/ILSVRC2012_val_00015093.JPEG n02514041/
+mv val/ILSVRC2012_val_00015094.JPEG n02018207/
+mv val/ILSVRC2012_val_00015095.JPEG n07615774/
+mv val/ILSVRC2012_val_00015096.JPEG n02098413/
+mv val/ILSVRC2012_val_00015097.JPEG n03691459/
+mv val/ILSVRC2012_val_00015098.JPEG n02108915/
+mv val/ILSVRC2012_val_00015099.JPEG n07920052/
+mv val/ILSVRC2012_val_00015100.JPEG n04228054/
+mv val/ILSVRC2012_val_00015101.JPEG n04493381/
+mv val/ILSVRC2012_val_00015102.JPEG n04081281/
+mv val/ILSVRC2012_val_00015103.JPEG n03832673/
+mv val/ILSVRC2012_val_00015104.JPEG n13052670/
+mv val/ILSVRC2012_val_00015105.JPEG n04584207/
+mv val/ILSVRC2012_val_00015106.JPEG n04252225/
+mv val/ILSVRC2012_val_00015107.JPEG n01608432/
+mv val/ILSVRC2012_val_00015108.JPEG n02708093/
+mv val/ILSVRC2012_val_00015109.JPEG n04398044/
+mv val/ILSVRC2012_val_00015110.JPEG n02087046/
+mv val/ILSVRC2012_val_00015111.JPEG n04599235/
+mv val/ILSVRC2012_val_00015112.JPEG n02177972/
+mv val/ILSVRC2012_val_00015113.JPEG n02326432/
+mv val/ILSVRC2012_val_00015114.JPEG n02490219/
+mv val/ILSVRC2012_val_00015115.JPEG n03761084/
+mv val/ILSVRC2012_val_00015116.JPEG n02101556/
+mv val/ILSVRC2012_val_00015117.JPEG n04599235/
+mv val/ILSVRC2012_val_00015118.JPEG n04467665/
+mv val/ILSVRC2012_val_00015119.JPEG n02097658/
+mv val/ILSVRC2012_val_00015120.JPEG n01978287/
+mv val/ILSVRC2012_val_00015121.JPEG n04612504/
+mv val/ILSVRC2012_val_00015122.JPEG n02397096/
+mv val/ILSVRC2012_val_00015123.JPEG n03018349/
+mv val/ILSVRC2012_val_00015124.JPEG n02391049/
+mv val/ILSVRC2012_val_00015125.JPEG n07584110/
+mv val/ILSVRC2012_val_00015126.JPEG n02457408/
+mv val/ILSVRC2012_val_00015127.JPEG n01776313/
+mv val/ILSVRC2012_val_00015128.JPEG n02120079/
+mv val/ILSVRC2012_val_00015129.JPEG n02727426/
+mv val/ILSVRC2012_val_00015130.JPEG n02791270/
+mv val/ILSVRC2012_val_00015131.JPEG n04590129/
+mv val/ILSVRC2012_val_00015132.JPEG n02058221/
+mv val/ILSVRC2012_val_00015133.JPEG n03599486/
+mv val/ILSVRC2012_val_00015134.JPEG n03788365/
+mv val/ILSVRC2012_val_00015135.JPEG n02098105/
+mv val/ILSVRC2012_val_00015136.JPEG n02097047/
+mv val/ILSVRC2012_val_00015137.JPEG n03794056/
+mv val/ILSVRC2012_val_00015138.JPEG n02966193/
+mv val/ILSVRC2012_val_00015139.JPEG n01494475/
+mv val/ILSVRC2012_val_00015140.JPEG n02514041/
+mv val/ILSVRC2012_val_00015141.JPEG n01773157/
+mv val/ILSVRC2012_val_00015142.JPEG n07613480/
+mv val/ILSVRC2012_val_00015143.JPEG n09332890/
+mv val/ILSVRC2012_val_00015144.JPEG n02086910/
+mv val/ILSVRC2012_val_00015145.JPEG n02071294/
+mv val/ILSVRC2012_val_00015146.JPEG n02105412/
+mv val/ILSVRC2012_val_00015147.JPEG n02966193/
+mv val/ILSVRC2012_val_00015148.JPEG n02481823/
+mv val/ILSVRC2012_val_00015149.JPEG n04228054/
+mv val/ILSVRC2012_val_00015150.JPEG n02825657/
+mv val/ILSVRC2012_val_00015151.JPEG n03775071/
+mv val/ILSVRC2012_val_00015152.JPEG n02096177/
+mv val/ILSVRC2012_val_00015153.JPEG n02328150/
+mv val/ILSVRC2012_val_00015154.JPEG n01768244/
+mv val/ILSVRC2012_val_00015155.JPEG n03028079/
+mv val/ILSVRC2012_val_00015156.JPEG n03534580/
+mv val/ILSVRC2012_val_00015157.JPEG n01484850/
+mv val/ILSVRC2012_val_00015158.JPEG n09428293/
+mv val/ILSVRC2012_val_00015159.JPEG n03788365/
+mv val/ILSVRC2012_val_00015160.JPEG n02106550/
+mv val/ILSVRC2012_val_00015161.JPEG n03782006/
+mv val/ILSVRC2012_val_00015162.JPEG n04258138/
+mv val/ILSVRC2012_val_00015163.JPEG n03710637/
+mv val/ILSVRC2012_val_00015164.JPEG n02097298/
+mv val/ILSVRC2012_val_00015165.JPEG n03721384/
+mv val/ILSVRC2012_val_00015166.JPEG n02391049/
+mv val/ILSVRC2012_val_00015167.JPEG n02013706/
+mv val/ILSVRC2012_val_00015168.JPEG n02840245/
+mv val/ILSVRC2012_val_00015169.JPEG n03249569/
+mv val/ILSVRC2012_val_00015170.JPEG n02454379/
+mv val/ILSVRC2012_val_00015171.JPEG n02865351/
+mv val/ILSVRC2012_val_00015172.JPEG n02206856/
+mv val/ILSVRC2012_val_00015173.JPEG n02093991/
+mv val/ILSVRC2012_val_00015174.JPEG n01877812/
+mv val/ILSVRC2012_val_00015175.JPEG n03485407/
+mv val/ILSVRC2012_val_00015176.JPEG n02101388/
+mv val/ILSVRC2012_val_00015177.JPEG n03014705/
+mv val/ILSVRC2012_val_00015178.JPEG n04456115/
+mv val/ILSVRC2012_val_00015179.JPEG n03976657/
+mv val/ILSVRC2012_val_00015180.JPEG n03188531/
+mv val/ILSVRC2012_val_00015181.JPEG n02342885/
+mv val/ILSVRC2012_val_00015182.JPEG n02096437/
+mv val/ILSVRC2012_val_00015183.JPEG n02102318/
+mv val/ILSVRC2012_val_00015184.JPEG n03376595/
+mv val/ILSVRC2012_val_00015185.JPEG n03271574/
+mv val/ILSVRC2012_val_00015186.JPEG n02177972/
+mv val/ILSVRC2012_val_00015187.JPEG n03594945/
+mv val/ILSVRC2012_val_00015188.JPEG n03126707/
+mv val/ILSVRC2012_val_00015189.JPEG n02099712/
+mv val/ILSVRC2012_val_00015190.JPEG n01692333/
+mv val/ILSVRC2012_val_00015191.JPEG n02966687/
+mv val/ILSVRC2012_val_00015192.JPEG n03930313/
+mv val/ILSVRC2012_val_00015193.JPEG n01667778/
+mv val/ILSVRC2012_val_00015194.JPEG n07716906/
+mv val/ILSVRC2012_val_00015195.JPEG n01580077/
+mv val/ILSVRC2012_val_00015196.JPEG n03804744/
+mv val/ILSVRC2012_val_00015197.JPEG n02111277/
+mv val/ILSVRC2012_val_00015198.JPEG n03100240/
+mv val/ILSVRC2012_val_00015199.JPEG n04548280/
+mv val/ILSVRC2012_val_00015200.JPEG n02814533/
+mv val/ILSVRC2012_val_00015201.JPEG n04204347/
+mv val/ILSVRC2012_val_00015202.JPEG n04141327/
+mv val/ILSVRC2012_val_00015203.JPEG n02066245/
+mv val/ILSVRC2012_val_00015204.JPEG n02096585/
+mv val/ILSVRC2012_val_00015205.JPEG n02102480/
+mv val/ILSVRC2012_val_00015206.JPEG n03125729/
+mv val/ILSVRC2012_val_00015207.JPEG n03272010/
+mv val/ILSVRC2012_val_00015208.JPEG n03980874/
+mv val/ILSVRC2012_val_00015209.JPEG n07753592/
+mv val/ILSVRC2012_val_00015210.JPEG n02105412/
+mv val/ILSVRC2012_val_00015211.JPEG n02443114/
+mv val/ILSVRC2012_val_00015212.JPEG n04579432/
+mv val/ILSVRC2012_val_00015213.JPEG n02101556/
+mv val/ILSVRC2012_val_00015214.JPEG n03995372/
+mv val/ILSVRC2012_val_00015215.JPEG n02950826/
+mv val/ILSVRC2012_val_00015216.JPEG n01534433/
+mv val/ILSVRC2012_val_00015217.JPEG n02088238/
+mv val/ILSVRC2012_val_00015218.JPEG n07715103/
+mv val/ILSVRC2012_val_00015219.JPEG n02795169/
+mv val/ILSVRC2012_val_00015220.JPEG n01484850/
+mv val/ILSVRC2012_val_00015221.JPEG n01753488/
+mv val/ILSVRC2012_val_00015222.JPEG n02607072/
+mv val/ILSVRC2012_val_00015223.JPEG n01530575/
+mv val/ILSVRC2012_val_00015224.JPEG n01692333/
+mv val/ILSVRC2012_val_00015225.JPEG n04153751/
+mv val/ILSVRC2012_val_00015226.JPEG n02111500/
+mv val/ILSVRC2012_val_00015227.JPEG n03131574/
+mv val/ILSVRC2012_val_00015228.JPEG n03803284/
+mv val/ILSVRC2012_val_00015229.JPEG n02437312/
+mv val/ILSVRC2012_val_00015230.JPEG n02974003/
+mv val/ILSVRC2012_val_00015231.JPEG n02776631/
+mv val/ILSVRC2012_val_00015232.JPEG n04125021/
+mv val/ILSVRC2012_val_00015233.JPEG n09428293/
+mv val/ILSVRC2012_val_00015234.JPEG n02843684/
+mv val/ILSVRC2012_val_00015235.JPEG n03047690/
+mv val/ILSVRC2012_val_00015236.JPEG n02417914/
+mv val/ILSVRC2012_val_00015237.JPEG n03998194/
+mv val/ILSVRC2012_val_00015238.JPEG n03110669/
+mv val/ILSVRC2012_val_00015239.JPEG n02445715/
+mv val/ILSVRC2012_val_00015240.JPEG n04525305/
+mv val/ILSVRC2012_val_00015241.JPEG n03998194/
+mv val/ILSVRC2012_val_00015242.JPEG n01514668/
+mv val/ILSVRC2012_val_00015243.JPEG n02321529/
+mv val/ILSVRC2012_val_00015244.JPEG n02088466/
+mv val/ILSVRC2012_val_00015245.JPEG n01644373/
+mv val/ILSVRC2012_val_00015246.JPEG n07714571/
+mv val/ILSVRC2012_val_00015247.JPEG n04357314/
+mv val/ILSVRC2012_val_00015248.JPEG n03991062/
+mv val/ILSVRC2012_val_00015249.JPEG n02088094/
+mv val/ILSVRC2012_val_00015250.JPEG n02687172/
+mv val/ILSVRC2012_val_00015251.JPEG n02110185/
+mv val/ILSVRC2012_val_00015252.JPEG n02089078/
+mv val/ILSVRC2012_val_00015253.JPEG n09468604/
+mv val/ILSVRC2012_val_00015254.JPEG n02408429/
+mv val/ILSVRC2012_val_00015255.JPEG n04389033/
+mv val/ILSVRC2012_val_00015256.JPEG n03706229/
+mv val/ILSVRC2012_val_00015257.JPEG n02488702/
+mv val/ILSVRC2012_val_00015258.JPEG n03992509/
+mv val/ILSVRC2012_val_00015259.JPEG n02417914/
+mv val/ILSVRC2012_val_00015260.JPEG n04086273/
+mv val/ILSVRC2012_val_00015261.JPEG n07613480/
+mv val/ILSVRC2012_val_00015262.JPEG n04270147/
+mv val/ILSVRC2012_val_00015263.JPEG n03887697/
+mv val/ILSVRC2012_val_00015264.JPEG n01601694/
+mv val/ILSVRC2012_val_00015265.JPEG n02123159/
+mv val/ILSVRC2012_val_00015266.JPEG n01518878/
+mv val/ILSVRC2012_val_00015267.JPEG n07836838/
+mv val/ILSVRC2012_val_00015268.JPEG n04443257/
+mv val/ILSVRC2012_val_00015269.JPEG n01592084/
+mv val/ILSVRC2012_val_00015270.JPEG n03109150/
+mv val/ILSVRC2012_val_00015271.JPEG n02264363/
+mv val/ILSVRC2012_val_00015272.JPEG n02808304/
+mv val/ILSVRC2012_val_00015273.JPEG n04252225/
+mv val/ILSVRC2012_val_00015274.JPEG n01630670/
+mv val/ILSVRC2012_val_00015275.JPEG n04507155/
+mv val/ILSVRC2012_val_00015276.JPEG n03047690/
+mv val/ILSVRC2012_val_00015277.JPEG n03344393/
+mv val/ILSVRC2012_val_00015278.JPEG n02981792/
+mv val/ILSVRC2012_val_00015279.JPEG n03680355/
+mv val/ILSVRC2012_val_00015280.JPEG n07579787/
+mv val/ILSVRC2012_val_00015281.JPEG n02526121/
+mv val/ILSVRC2012_val_00015282.JPEG n01984695/
+mv val/ILSVRC2012_val_00015283.JPEG n04485082/
+mv val/ILSVRC2012_val_00015284.JPEG n03814639/
+mv val/ILSVRC2012_val_00015285.JPEG n02977058/
+mv val/ILSVRC2012_val_00015286.JPEG n03866082/
+mv val/ILSVRC2012_val_00015287.JPEG n04404412/
+mv val/ILSVRC2012_val_00015288.JPEG n04116512/
+mv val/ILSVRC2012_val_00015289.JPEG n03100240/
+mv val/ILSVRC2012_val_00015290.JPEG n03127925/
+mv val/ILSVRC2012_val_00015291.JPEG n01847000/
+mv val/ILSVRC2012_val_00015292.JPEG n02051845/
+mv val/ILSVRC2012_val_00015293.JPEG n02177972/
+mv val/ILSVRC2012_val_00015294.JPEG n02106030/
+mv val/ILSVRC2012_val_00015295.JPEG n03770679/
+mv val/ILSVRC2012_val_00015296.JPEG n03535780/
+mv val/ILSVRC2012_val_00015297.JPEG n03676483/
+mv val/ILSVRC2012_val_00015298.JPEG n01843383/
+mv val/ILSVRC2012_val_00015299.JPEG n01873310/
+mv val/ILSVRC2012_val_00015300.JPEG n02085936/
+mv val/ILSVRC2012_val_00015301.JPEG n02328150/
+mv val/ILSVRC2012_val_00015302.JPEG n03089624/
+mv val/ILSVRC2012_val_00015303.JPEG n02102318/
+mv val/ILSVRC2012_val_00015304.JPEG n02500267/
+mv val/ILSVRC2012_val_00015305.JPEG n04040759/
+mv val/ILSVRC2012_val_00015306.JPEG n04552348/
+mv val/ILSVRC2012_val_00015307.JPEG n02101006/
+mv val/ILSVRC2012_val_00015308.JPEG n07749582/
+mv val/ILSVRC2012_val_00015309.JPEG n03884397/
+mv val/ILSVRC2012_val_00015310.JPEG n02111129/
+mv val/ILSVRC2012_val_00015311.JPEG n03662601/
+mv val/ILSVRC2012_val_00015312.JPEG n03250847/
+mv val/ILSVRC2012_val_00015313.JPEG n02129604/
+mv val/ILSVRC2012_val_00015314.JPEG n03461385/
+mv val/ILSVRC2012_val_00015315.JPEG n03970156/
+mv val/ILSVRC2012_val_00015316.JPEG n04317175/
+mv val/ILSVRC2012_val_00015317.JPEG n03958227/
+mv val/ILSVRC2012_val_00015318.JPEG n07714990/
+mv val/ILSVRC2012_val_00015319.JPEG n01980166/
+mv val/ILSVRC2012_val_00015320.JPEG n03929660/
+mv val/ILSVRC2012_val_00015321.JPEG n03314780/
+mv val/ILSVRC2012_val_00015322.JPEG n01855032/
+mv val/ILSVRC2012_val_00015323.JPEG n03630383/
+mv val/ILSVRC2012_val_00015324.JPEG n01817953/
+mv val/ILSVRC2012_val_00015325.JPEG n02095889/
+mv val/ILSVRC2012_val_00015326.JPEG n04505470/
+mv val/ILSVRC2012_val_00015327.JPEG n02727426/
+mv val/ILSVRC2012_val_00015328.JPEG n03598930/
+mv val/ILSVRC2012_val_00015329.JPEG n02105855/
+mv val/ILSVRC2012_val_00015330.JPEG n02115913/
+mv val/ILSVRC2012_val_00015331.JPEG n03110669/
+mv val/ILSVRC2012_val_00015332.JPEG n10148035/
+mv val/ILSVRC2012_val_00015333.JPEG n02106550/
+mv val/ILSVRC2012_val_00015334.JPEG n02086079/
+mv val/ILSVRC2012_val_00015335.JPEG n04380533/
+mv val/ILSVRC2012_val_00015336.JPEG n10565667/
+mv val/ILSVRC2012_val_00015337.JPEG n03249569/
+mv val/ILSVRC2012_val_00015338.JPEG n02095889/
+mv val/ILSVRC2012_val_00015339.JPEG n02492660/
+mv val/ILSVRC2012_val_00015340.JPEG n07873807/
+mv val/ILSVRC2012_val_00015341.JPEG n02797295/
+mv val/ILSVRC2012_val_00015342.JPEG n04209239/
+mv val/ILSVRC2012_val_00015343.JPEG n02786058/
+mv val/ILSVRC2012_val_00015344.JPEG n02837789/
+mv val/ILSVRC2012_val_00015345.JPEG n02841315/
+mv val/ILSVRC2012_val_00015346.JPEG n02704792/
+mv val/ILSVRC2012_val_00015347.JPEG n03935335/
+mv val/ILSVRC2012_val_00015348.JPEG n04562935/
+mv val/ILSVRC2012_val_00015349.JPEG n02099429/
+mv val/ILSVRC2012_val_00015350.JPEG n02112137/
+mv val/ILSVRC2012_val_00015351.JPEG n03325584/
+mv val/ILSVRC2012_val_00015352.JPEG n04442312/
+mv val/ILSVRC2012_val_00015353.JPEG n04033995/
+mv val/ILSVRC2012_val_00015354.JPEG n07614500/
+mv val/ILSVRC2012_val_00015355.JPEG n02108089/
+mv val/ILSVRC2012_val_00015356.JPEG n03710721/
+mv val/ILSVRC2012_val_00015357.JPEG n03100240/
+mv val/ILSVRC2012_val_00015358.JPEG n02093859/
+mv val/ILSVRC2012_val_00015359.JPEG n02906734/
+mv val/ILSVRC2012_val_00015360.JPEG n04254777/
+mv val/ILSVRC2012_val_00015361.JPEG n07871810/
+mv val/ILSVRC2012_val_00015362.JPEG n02422106/
+mv val/ILSVRC2012_val_00015363.JPEG n04049303/
+mv val/ILSVRC2012_val_00015364.JPEG n03961711/
+mv val/ILSVRC2012_val_00015365.JPEG n02777292/
+mv val/ILSVRC2012_val_00015366.JPEG n04443257/
+mv val/ILSVRC2012_val_00015367.JPEG n04597913/
+mv val/ILSVRC2012_val_00015368.JPEG n02927161/
+mv val/ILSVRC2012_val_00015369.JPEG n03424325/
+mv val/ILSVRC2012_val_00015370.JPEG n03032252/
+mv val/ILSVRC2012_val_00015371.JPEG n02795169/
+mv val/ILSVRC2012_val_00015372.JPEG n02123394/
+mv val/ILSVRC2012_val_00015373.JPEG n01498041/
+mv val/ILSVRC2012_val_00015374.JPEG n01751748/
+mv val/ILSVRC2012_val_00015375.JPEG n03793489/
+mv val/ILSVRC2012_val_00015376.JPEG n03345487/
+mv val/ILSVRC2012_val_00015377.JPEG n02091635/
+mv val/ILSVRC2012_val_00015378.JPEG n02123159/
+mv val/ILSVRC2012_val_00015379.JPEG n02107142/
+mv val/ILSVRC2012_val_00015380.JPEG n02484975/
+mv val/ILSVRC2012_val_00015381.JPEG n03666591/
+mv val/ILSVRC2012_val_00015382.JPEG n03085013/
+mv val/ILSVRC2012_val_00015383.JPEG n04325704/
+mv val/ILSVRC2012_val_00015384.JPEG n03208938/
+mv val/ILSVRC2012_val_00015385.JPEG n04562935/
+mv val/ILSVRC2012_val_00015386.JPEG n04152593/
+mv val/ILSVRC2012_val_00015387.JPEG n09472597/
+mv val/ILSVRC2012_val_00015388.JPEG n07875152/
+mv val/ILSVRC2012_val_00015389.JPEG n04597913/
+mv val/ILSVRC2012_val_00015390.JPEG n04099969/
+mv val/ILSVRC2012_val_00015391.JPEG n03976657/
+mv val/ILSVRC2012_val_00015392.JPEG n02028035/
+mv val/ILSVRC2012_val_00015393.JPEG n03796401/
+mv val/ILSVRC2012_val_00015394.JPEG n02917067/
+mv val/ILSVRC2012_val_00015395.JPEG n02110958/
+mv val/ILSVRC2012_val_00015396.JPEG n02730930/
+mv val/ILSVRC2012_val_00015397.JPEG n02802426/
+mv val/ILSVRC2012_val_00015398.JPEG n02917067/
+mv val/ILSVRC2012_val_00015399.JPEG n02704792/
+mv val/ILSVRC2012_val_00015400.JPEG n07760859/
+mv val/ILSVRC2012_val_00015401.JPEG n02123597/
+mv val/ILSVRC2012_val_00015402.JPEG n01981276/
+mv val/ILSVRC2012_val_00015403.JPEG n01688243/
+mv val/ILSVRC2012_val_00015404.JPEG n03400231/
+mv val/ILSVRC2012_val_00015405.JPEG n02088238/
+mv val/ILSVRC2012_val_00015406.JPEG n07753275/
+mv val/ILSVRC2012_val_00015407.JPEG n02100583/
+mv val/ILSVRC2012_val_00015408.JPEG n01955084/
+mv val/ILSVRC2012_val_00015409.JPEG n02777292/
+mv val/ILSVRC2012_val_00015410.JPEG n01534433/
+mv val/ILSVRC2012_val_00015411.JPEG n03908714/
+mv val/ILSVRC2012_val_00015412.JPEG n02120079/
+mv val/ILSVRC2012_val_00015413.JPEG n04465501/
+mv val/ILSVRC2012_val_00015414.JPEG n02641379/
+mv val/ILSVRC2012_val_00015415.JPEG n02098286/
+mv val/ILSVRC2012_val_00015416.JPEG n01534433/
+mv val/ILSVRC2012_val_00015417.JPEG n02917067/
+mv val/ILSVRC2012_val_00015418.JPEG n04371774/
+mv val/ILSVRC2012_val_00015419.JPEG n02110958/
+mv val/ILSVRC2012_val_00015420.JPEG n03538406/
+mv val/ILSVRC2012_val_00015421.JPEG n03443371/
+mv val/ILSVRC2012_val_00015422.JPEG n03902125/
+mv val/ILSVRC2012_val_00015423.JPEG n03075370/
+mv val/ILSVRC2012_val_00015424.JPEG n04336792/
+mv val/ILSVRC2012_val_00015425.JPEG n02091831/
+mv val/ILSVRC2012_val_00015426.JPEG n02510455/
+mv val/ILSVRC2012_val_00015427.JPEG n02097047/
+mv val/ILSVRC2012_val_00015428.JPEG n03908618/
+mv val/ILSVRC2012_val_00015429.JPEG n02817516/
+mv val/ILSVRC2012_val_00015430.JPEG n02111889/
+mv val/ILSVRC2012_val_00015431.JPEG n01531178/
+mv val/ILSVRC2012_val_00015432.JPEG n02481823/
+mv val/ILSVRC2012_val_00015433.JPEG n03110669/
+mv val/ILSVRC2012_val_00015434.JPEG n02095570/
+mv val/ILSVRC2012_val_00015435.JPEG n03982430/
+mv val/ILSVRC2012_val_00015436.JPEG n03444034/
+mv val/ILSVRC2012_val_00015437.JPEG n07714571/
+mv val/ILSVRC2012_val_00015438.JPEG n07932039/
+mv val/ILSVRC2012_val_00015439.JPEG n01768244/
+mv val/ILSVRC2012_val_00015440.JPEG n02837789/
+mv val/ILSVRC2012_val_00015441.JPEG n03637318/
+mv val/ILSVRC2012_val_00015442.JPEG n04141975/
+mv val/ILSVRC2012_val_00015443.JPEG n01910747/
+mv val/ILSVRC2012_val_00015444.JPEG n03873416/
+mv val/ILSVRC2012_val_00015445.JPEG n03018349/
+mv val/ILSVRC2012_val_00015446.JPEG n02114548/
+mv val/ILSVRC2012_val_00015447.JPEG n07717556/
+mv val/ILSVRC2012_val_00015448.JPEG n03494278/
+mv val/ILSVRC2012_val_00015449.JPEG n03924679/
+mv val/ILSVRC2012_val_00015450.JPEG n02012849/
+mv val/ILSVRC2012_val_00015451.JPEG n02361337/
+mv val/ILSVRC2012_val_00015452.JPEG n02398521/
+mv val/ILSVRC2012_val_00015453.JPEG n03443371/
+mv val/ILSVRC2012_val_00015454.JPEG n07615774/
+mv val/ILSVRC2012_val_00015455.JPEG n02009912/
+mv val/ILSVRC2012_val_00015456.JPEG n02395406/
+mv val/ILSVRC2012_val_00015457.JPEG n02777292/
+mv val/ILSVRC2012_val_00015458.JPEG n02783161/
+mv val/ILSVRC2012_val_00015459.JPEG n02445715/
+mv val/ILSVRC2012_val_00015460.JPEG n03743016/
+mv val/ILSVRC2012_val_00015461.JPEG n03891332/
+mv val/ILSVRC2012_val_00015462.JPEG n04542943/
+mv val/ILSVRC2012_val_00015463.JPEG n15075141/
+mv val/ILSVRC2012_val_00015464.JPEG n02091244/
+mv val/ILSVRC2012_val_00015465.JPEG n02114367/
+mv val/ILSVRC2012_val_00015466.JPEG n03404251/
+mv val/ILSVRC2012_val_00015467.JPEG n03000134/
+mv val/ILSVRC2012_val_00015468.JPEG n01667114/
+mv val/ILSVRC2012_val_00015469.JPEG n03763968/
+mv val/ILSVRC2012_val_00015470.JPEG n02233338/
+mv val/ILSVRC2012_val_00015471.JPEG n09428293/
+mv val/ILSVRC2012_val_00015472.JPEG n03793489/
+mv val/ILSVRC2012_val_00015473.JPEG n04258138/
+mv val/ILSVRC2012_val_00015474.JPEG n04023962/
+mv val/ILSVRC2012_val_00015475.JPEG n01667778/
+mv val/ILSVRC2012_val_00015476.JPEG n03899768/
+mv val/ILSVRC2012_val_00015477.JPEG n13133613/
+mv val/ILSVRC2012_val_00015478.JPEG n03599486/
+mv val/ILSVRC2012_val_00015479.JPEG n03042490/
+mv val/ILSVRC2012_val_00015480.JPEG n04467665/
+mv val/ILSVRC2012_val_00015481.JPEG n03633091/
+mv val/ILSVRC2012_val_00015482.JPEG n02437616/
+mv val/ILSVRC2012_val_00015483.JPEG n02835271/
+mv val/ILSVRC2012_val_00015484.JPEG n03791053/
+mv val/ILSVRC2012_val_00015485.JPEG n04486054/
+mv val/ILSVRC2012_val_00015486.JPEG n07717410/
+mv val/ILSVRC2012_val_00015487.JPEG n07613480/
+mv val/ILSVRC2012_val_00015488.JPEG n01728920/
+mv val/ILSVRC2012_val_00015489.JPEG n03400231/
+mv val/ILSVRC2012_val_00015490.JPEG n02790996/
+mv val/ILSVRC2012_val_00015491.JPEG n02676566/
+mv val/ILSVRC2012_val_00015492.JPEG n04562935/
+mv val/ILSVRC2012_val_00015493.JPEG n02264363/
+mv val/ILSVRC2012_val_00015494.JPEG n04141975/
+mv val/ILSVRC2012_val_00015495.JPEG n03089624/
+mv val/ILSVRC2012_val_00015496.JPEG n03954731/
+mv val/ILSVRC2012_val_00015497.JPEG n03467068/
+mv val/ILSVRC2012_val_00015498.JPEG n02690373/
+mv val/ILSVRC2012_val_00015499.JPEG n02102040/
+mv val/ILSVRC2012_val_00015500.JPEG n01985128/
+mv val/ILSVRC2012_val_00015501.JPEG n04116512/
+mv val/ILSVRC2012_val_00015502.JPEG n02497673/
+mv val/ILSVRC2012_val_00015503.JPEG n04392985/
+mv val/ILSVRC2012_val_00015504.JPEG n03937543/
+mv val/ILSVRC2012_val_00015505.JPEG n02006656/
+mv val/ILSVRC2012_val_00015506.JPEG n01773549/
+mv val/ILSVRC2012_val_00015507.JPEG n02704792/
+mv val/ILSVRC2012_val_00015508.JPEG n02999410/
+mv val/ILSVRC2012_val_00015509.JPEG n07930864/
+mv val/ILSVRC2012_val_00015510.JPEG n02011460/
+mv val/ILSVRC2012_val_00015511.JPEG n02107312/
+mv val/ILSVRC2012_val_00015512.JPEG n02910353/
+mv val/ILSVRC2012_val_00015513.JPEG n01795545/
+mv val/ILSVRC2012_val_00015514.JPEG n04111531/
+mv val/ILSVRC2012_val_00015515.JPEG n02894605/
+mv val/ILSVRC2012_val_00015516.JPEG n01614925/
+mv val/ILSVRC2012_val_00015517.JPEG n02793495/
+mv val/ILSVRC2012_val_00015518.JPEG n02100877/
+mv val/ILSVRC2012_val_00015519.JPEG n03761084/
+mv val/ILSVRC2012_val_00015520.JPEG n02504013/
+mv val/ILSVRC2012_val_00015521.JPEG n02408429/
+mv val/ILSVRC2012_val_00015522.JPEG n07583066/
+mv val/ILSVRC2012_val_00015523.JPEG n01744401/
+mv val/ILSVRC2012_val_00015524.JPEG n03447447/
+mv val/ILSVRC2012_val_00015525.JPEG n03125729/
+mv val/ILSVRC2012_val_00015526.JPEG n01978287/
+mv val/ILSVRC2012_val_00015527.JPEG n04346328/
+mv val/ILSVRC2012_val_00015528.JPEG n03742115/
+mv val/ILSVRC2012_val_00015529.JPEG n02483708/
+mv val/ILSVRC2012_val_00015530.JPEG n13054560/
+mv val/ILSVRC2012_val_00015531.JPEG n02096177/
+mv val/ILSVRC2012_val_00015532.JPEG n03920288/
+mv val/ILSVRC2012_val_00015533.JPEG n02837789/
+mv val/ILSVRC2012_val_00015534.JPEG n03877472/
+mv val/ILSVRC2012_val_00015535.JPEG n02165105/
+mv val/ILSVRC2012_val_00015536.JPEG n03937543/
+mv val/ILSVRC2012_val_00015537.JPEG n03982430/
+mv val/ILSVRC2012_val_00015538.JPEG n03787032/
+mv val/ILSVRC2012_val_00015539.JPEG n07880968/
+mv val/ILSVRC2012_val_00015540.JPEG n04371774/
+mv val/ILSVRC2012_val_00015541.JPEG n04146614/
+mv val/ILSVRC2012_val_00015542.JPEG n03394916/
+mv val/ILSVRC2012_val_00015543.JPEG n03903868/
+mv val/ILSVRC2012_val_00015544.JPEG n02687172/
+mv val/ILSVRC2012_val_00015545.JPEG n01494475/
+mv val/ILSVRC2012_val_00015546.JPEG n02536864/
+mv val/ILSVRC2012_val_00015547.JPEG n02129165/
+mv val/ILSVRC2012_val_00015548.JPEG n07920052/
+mv val/ILSVRC2012_val_00015549.JPEG n01496331/
+mv val/ILSVRC2012_val_00015550.JPEG n02009912/
+mv val/ILSVRC2012_val_00015551.JPEG n02692877/
+mv val/ILSVRC2012_val_00015552.JPEG n02101006/
+mv val/ILSVRC2012_val_00015553.JPEG n03271574/
+mv val/ILSVRC2012_val_00015554.JPEG n04371774/
+mv val/ILSVRC2012_val_00015555.JPEG n01496331/
+mv val/ILSVRC2012_val_00015556.JPEG n04557648/
+mv val/ILSVRC2012_val_00015557.JPEG n02027492/
+mv val/ILSVRC2012_val_00015558.JPEG n02125311/
+mv val/ILSVRC2012_val_00015559.JPEG n03376595/
+mv val/ILSVRC2012_val_00015560.JPEG n01872401/
+mv val/ILSVRC2012_val_00015561.JPEG n04346328/
+mv val/ILSVRC2012_val_00015562.JPEG n02091134/
+mv val/ILSVRC2012_val_00015563.JPEG n04238763/
+mv val/ILSVRC2012_val_00015564.JPEG n01776313/
+mv val/ILSVRC2012_val_00015565.JPEG n01796340/
+mv val/ILSVRC2012_val_00015566.JPEG n01770081/
+mv val/ILSVRC2012_val_00015567.JPEG n03141823/
+mv val/ILSVRC2012_val_00015568.JPEG n01665541/
+mv val/ILSVRC2012_val_00015569.JPEG n04133789/
+mv val/ILSVRC2012_val_00015570.JPEG n02096437/
+mv val/ILSVRC2012_val_00015571.JPEG n02096051/
+mv val/ILSVRC2012_val_00015572.JPEG n10565667/
+mv val/ILSVRC2012_val_00015573.JPEG n04542943/
+mv val/ILSVRC2012_val_00015574.JPEG n03447447/
+mv val/ILSVRC2012_val_00015575.JPEG n09421951/
+mv val/ILSVRC2012_val_00015576.JPEG n02113624/
+mv val/ILSVRC2012_val_00015577.JPEG n03160309/
+mv val/ILSVRC2012_val_00015578.JPEG n02504458/
+mv val/ILSVRC2012_val_00015579.JPEG n01774750/
+mv val/ILSVRC2012_val_00015580.JPEG n03871628/
+mv val/ILSVRC2012_val_00015581.JPEG n04590129/
+mv val/ILSVRC2012_val_00015582.JPEG n12057211/
+mv val/ILSVRC2012_val_00015583.JPEG n03481172/
+mv val/ILSVRC2012_val_00015584.JPEG n03000247/
+mv val/ILSVRC2012_val_00015585.JPEG n04090263/
+mv val/ILSVRC2012_val_00015586.JPEG n04141076/
+mv val/ILSVRC2012_val_00015587.JPEG n01914609/
+mv val/ILSVRC2012_val_00015588.JPEG n03775071/
+mv val/ILSVRC2012_val_00015589.JPEG n02869837/
+mv val/ILSVRC2012_val_00015590.JPEG n04509417/
+mv val/ILSVRC2012_val_00015591.JPEG n04371430/
+mv val/ILSVRC2012_val_00015592.JPEG n02097209/
+mv val/ILSVRC2012_val_00015593.JPEG n04613696/
+mv val/ILSVRC2012_val_00015594.JPEG n02669723/
+mv val/ILSVRC2012_val_00015595.JPEG n02883205/
+mv val/ILSVRC2012_val_00015596.JPEG n01748264/
+mv val/ILSVRC2012_val_00015597.JPEG n01955084/
+mv val/ILSVRC2012_val_00015598.JPEG n04204238/
+mv val/ILSVRC2012_val_00015599.JPEG n03743016/
+mv val/ILSVRC2012_val_00015600.JPEG n02177972/
+mv val/ILSVRC2012_val_00015601.JPEG n03868863/
+mv val/ILSVRC2012_val_00015602.JPEG n04133789/
+mv val/ILSVRC2012_val_00015603.JPEG n02168699/
+mv val/ILSVRC2012_val_00015604.JPEG n04041544/
+mv val/ILSVRC2012_val_00015605.JPEG n02115913/
+mv val/ILSVRC2012_val_00015606.JPEG n02259212/
+mv val/ILSVRC2012_val_00015607.JPEG n02096177/
+mv val/ILSVRC2012_val_00015608.JPEG n02277742/
+mv val/ILSVRC2012_val_00015609.JPEG n04493381/
+mv val/ILSVRC2012_val_00015610.JPEG n02093859/
+mv val/ILSVRC2012_val_00015611.JPEG n03160309/
+mv val/ILSVRC2012_val_00015612.JPEG n04120489/
+mv val/ILSVRC2012_val_00015613.JPEG n09246464/
+mv val/ILSVRC2012_val_00015614.JPEG n04005630/
+mv val/ILSVRC2012_val_00015615.JPEG n03938244/
+mv val/ILSVRC2012_val_00015616.JPEG n03208938/
+mv val/ILSVRC2012_val_00015617.JPEG n04033901/
+mv val/ILSVRC2012_val_00015618.JPEG n02835271/
+mv val/ILSVRC2012_val_00015619.JPEG n04049303/
+mv val/ILSVRC2012_val_00015620.JPEG n02951585/
+mv val/ILSVRC2012_val_00015621.JPEG n04229816/
+mv val/ILSVRC2012_val_00015622.JPEG n01755581/
+mv val/ILSVRC2012_val_00015623.JPEG n01734418/
+mv val/ILSVRC2012_val_00015624.JPEG n01843065/
+mv val/ILSVRC2012_val_00015625.JPEG n02114367/
+mv val/ILSVRC2012_val_00015626.JPEG n09288635/
+mv val/ILSVRC2012_val_00015627.JPEG n04147183/
+mv val/ILSVRC2012_val_00015628.JPEG n03196217/
+mv val/ILSVRC2012_val_00015629.JPEG n04367480/
+mv val/ILSVRC2012_val_00015630.JPEG n03467068/
+mv val/ILSVRC2012_val_00015631.JPEG n01491361/
+mv val/ILSVRC2012_val_00015632.JPEG n02091831/
+mv val/ILSVRC2012_val_00015633.JPEG n04154565/
+mv val/ILSVRC2012_val_00015634.JPEG n07875152/
+mv val/ILSVRC2012_val_00015635.JPEG n07873807/
+mv val/ILSVRC2012_val_00015636.JPEG n02690373/
+mv val/ILSVRC2012_val_00015637.JPEG n02730930/
+mv val/ILSVRC2012_val_00015638.JPEG n04389033/
+mv val/ILSVRC2012_val_00015639.JPEG n02879718/
+mv val/ILSVRC2012_val_00015640.JPEG n03223299/
+mv val/ILSVRC2012_val_00015641.JPEG n01784675/
+mv val/ILSVRC2012_val_00015642.JPEG n03447721/
+mv val/ILSVRC2012_val_00015643.JPEG n01742172/
+mv val/ILSVRC2012_val_00015644.JPEG n01728572/
+mv val/ILSVRC2012_val_00015645.JPEG n12985857/
+mv val/ILSVRC2012_val_00015646.JPEG n03376595/
+mv val/ILSVRC2012_val_00015647.JPEG n03089624/
+mv val/ILSVRC2012_val_00015648.JPEG n03887697/
+mv val/ILSVRC2012_val_00015649.JPEG n04270147/
+mv val/ILSVRC2012_val_00015650.JPEG n01930112/
+mv val/ILSVRC2012_val_00015651.JPEG n02814533/
+mv val/ILSVRC2012_val_00015652.JPEG n07802026/
+mv val/ILSVRC2012_val_00015653.JPEG n07920052/
+mv val/ILSVRC2012_val_00015654.JPEG n03425413/
+mv val/ILSVRC2012_val_00015655.JPEG n06596364/
+mv val/ILSVRC2012_val_00015656.JPEG n03134739/
+mv val/ILSVRC2012_val_00015657.JPEG n02108422/
+mv val/ILSVRC2012_val_00015658.JPEG n12998815/
+mv val/ILSVRC2012_val_00015659.JPEG n07753113/
+mv val/ILSVRC2012_val_00015660.JPEG n02056570/
+mv val/ILSVRC2012_val_00015661.JPEG n09256479/
+mv val/ILSVRC2012_val_00015662.JPEG n04238763/
+mv val/ILSVRC2012_val_00015663.JPEG n02951585/
+mv val/ILSVRC2012_val_00015664.JPEG n04033901/
+mv val/ILSVRC2012_val_00015665.JPEG n01833805/
+mv val/ILSVRC2012_val_00015666.JPEG n01737021/
+mv val/ILSVRC2012_val_00015667.JPEG n01694178/
+mv val/ILSVRC2012_val_00015668.JPEG n06785654/
+mv val/ILSVRC2012_val_00015669.JPEG n02500267/
+mv val/ILSVRC2012_val_00015670.JPEG n02085782/
+mv val/ILSVRC2012_val_00015671.JPEG n03825788/
+mv val/ILSVRC2012_val_00015672.JPEG n03899768/
+mv val/ILSVRC2012_val_00015673.JPEG n01843383/
+mv val/ILSVRC2012_val_00015674.JPEG n02782093/
+mv val/ILSVRC2012_val_00015675.JPEG n01855672/
+mv val/ILSVRC2012_val_00015676.JPEG n04239074/
+mv val/ILSVRC2012_val_00015677.JPEG n04604644/
+mv val/ILSVRC2012_val_00015678.JPEG n07583066/
+mv val/ILSVRC2012_val_00015679.JPEG n03041632/
+mv val/ILSVRC2012_val_00015680.JPEG n02777292/
+mv val/ILSVRC2012_val_00015681.JPEG n03627232/
+mv val/ILSVRC2012_val_00015682.JPEG n03884397/
+mv val/ILSVRC2012_val_00015683.JPEG n02328150/
+mv val/ILSVRC2012_val_00015684.JPEG n04005630/
+mv val/ILSVRC2012_val_00015685.JPEG n02093859/
+mv val/ILSVRC2012_val_00015686.JPEG n01749939/
+mv val/ILSVRC2012_val_00015687.JPEG n03000134/
+mv val/ILSVRC2012_val_00015688.JPEG n04037443/
+mv val/ILSVRC2012_val_00015689.JPEG n03888257/
+mv val/ILSVRC2012_val_00015690.JPEG n01824575/
+mv val/ILSVRC2012_val_00015691.JPEG n07875152/
+mv val/ILSVRC2012_val_00015692.JPEG n02526121/
+mv val/ILSVRC2012_val_00015693.JPEG n07920052/
+mv val/ILSVRC2012_val_00015694.JPEG n02102040/
+mv val/ILSVRC2012_val_00015695.JPEG n02869837/
+mv val/ILSVRC2012_val_00015696.JPEG n02099849/
+mv val/ILSVRC2012_val_00015697.JPEG n04356056/
+mv val/ILSVRC2012_val_00015698.JPEG n01749939/
+mv val/ILSVRC2012_val_00015699.JPEG n02442845/
+mv val/ILSVRC2012_val_00015700.JPEG n04487081/
+mv val/ILSVRC2012_val_00015701.JPEG n02087046/
+mv val/ILSVRC2012_val_00015702.JPEG n04201297/
+mv val/ILSVRC2012_val_00015703.JPEG n02094433/
+mv val/ILSVRC2012_val_00015704.JPEG n02480495/
+mv val/ILSVRC2012_val_00015705.JPEG n02096585/
+mv val/ILSVRC2012_val_00015706.JPEG n01518878/
+mv val/ILSVRC2012_val_00015707.JPEG n04141975/
+mv val/ILSVRC2012_val_00015708.JPEG n02981792/
+mv val/ILSVRC2012_val_00015709.JPEG n01632458/
+mv val/ILSVRC2012_val_00015710.JPEG n02093647/
+mv val/ILSVRC2012_val_00015711.JPEG n02018207/
+mv val/ILSVRC2012_val_00015712.JPEG n04040759/
+mv val/ILSVRC2012_val_00015713.JPEG n01820546/
+mv val/ILSVRC2012_val_00015714.JPEG n03840681/
+mv val/ILSVRC2012_val_00015715.JPEG n03832673/
+mv val/ILSVRC2012_val_00015716.JPEG n02051845/
+mv val/ILSVRC2012_val_00015717.JPEG n01883070/
+mv val/ILSVRC2012_val_00015718.JPEG n03534580/
+mv val/ILSVRC2012_val_00015719.JPEG n02028035/
+mv val/ILSVRC2012_val_00015720.JPEG n03857828/
+mv val/ILSVRC2012_val_00015721.JPEG n01682714/
+mv val/ILSVRC2012_val_00015722.JPEG n04049303/
+mv val/ILSVRC2012_val_00015723.JPEG n02096585/
+mv val/ILSVRC2012_val_00015724.JPEG n04254120/
+mv val/ILSVRC2012_val_00015725.JPEG n02071294/
+mv val/ILSVRC2012_val_00015726.JPEG n03868863/
+mv val/ILSVRC2012_val_00015727.JPEG n02206856/
+mv val/ILSVRC2012_val_00015728.JPEG n04086273/
+mv val/ILSVRC2012_val_00015729.JPEG n02177972/
+mv val/ILSVRC2012_val_00015730.JPEG n02085782/
+mv val/ILSVRC2012_val_00015731.JPEG n03942813/
+mv val/ILSVRC2012_val_00015732.JPEG n01496331/
+mv val/ILSVRC2012_val_00015733.JPEG n04355933/
+mv val/ILSVRC2012_val_00015734.JPEG n02790996/
+mv val/ILSVRC2012_val_00015735.JPEG n04265275/
+mv val/ILSVRC2012_val_00015736.JPEG n03976467/
+mv val/ILSVRC2012_val_00015737.JPEG n02279972/
+mv val/ILSVRC2012_val_00015738.JPEG n02086240/
+mv val/ILSVRC2012_val_00015739.JPEG n01824575/
+mv val/ILSVRC2012_val_00015740.JPEG n09421951/
+mv val/ILSVRC2012_val_00015741.JPEG n02123159/
+mv val/ILSVRC2012_val_00015742.JPEG n02086079/
+mv val/ILSVRC2012_val_00015743.JPEG n07717410/
+mv val/ILSVRC2012_val_00015744.JPEG n02422106/
+mv val/ILSVRC2012_val_00015745.JPEG n02236044/
+mv val/ILSVRC2012_val_00015746.JPEG n01608432/
+mv val/ILSVRC2012_val_00015747.JPEG n03062245/
+mv val/ILSVRC2012_val_00015748.JPEG n07734744/
+mv val/ILSVRC2012_val_00015749.JPEG n01983481/
+mv val/ILSVRC2012_val_00015750.JPEG n04542943/
+mv val/ILSVRC2012_val_00015751.JPEG n01773797/
+mv val/ILSVRC2012_val_00015752.JPEG n02526121/
+mv val/ILSVRC2012_val_00015753.JPEG n01688243/
+mv val/ILSVRC2012_val_00015754.JPEG n01990800/
+mv val/ILSVRC2012_val_00015755.JPEG n02169497/
+mv val/ILSVRC2012_val_00015756.JPEG n01768244/
+mv val/ILSVRC2012_val_00015757.JPEG n01770393/
+mv val/ILSVRC2012_val_00015758.JPEG n03977966/
+mv val/ILSVRC2012_val_00015759.JPEG n02096585/
+mv val/ILSVRC2012_val_00015760.JPEG n03532672/
+mv val/ILSVRC2012_val_00015761.JPEG n07711569/
+mv val/ILSVRC2012_val_00015762.JPEG n01734418/
+mv val/ILSVRC2012_val_00015763.JPEG n04326547/
+mv val/ILSVRC2012_val_00015764.JPEG n09332890/
+mv val/ILSVRC2012_val_00015765.JPEG n04584207/
+mv val/ILSVRC2012_val_00015766.JPEG n02114712/
+mv val/ILSVRC2012_val_00015767.JPEG n02093754/
+mv val/ILSVRC2012_val_00015768.JPEG n03495258/
+mv val/ILSVRC2012_val_00015769.JPEG n01616318/
+mv val/ILSVRC2012_val_00015770.JPEG n02326432/
+mv val/ILSVRC2012_val_00015771.JPEG n04507155/
+mv val/ILSVRC2012_val_00015772.JPEG n03527444/
+mv val/ILSVRC2012_val_00015773.JPEG n01981276/
+mv val/ILSVRC2012_val_00015774.JPEG n02097298/
+mv val/ILSVRC2012_val_00015775.JPEG n03958227/
+mv val/ILSVRC2012_val_00015776.JPEG n02165105/
+mv val/ILSVRC2012_val_00015777.JPEG n07718472/
+mv val/ILSVRC2012_val_00015778.JPEG n04591157/
+mv val/ILSVRC2012_val_00015779.JPEG n04286575/
+mv val/ILSVRC2012_val_00015780.JPEG n04208210/
+mv val/ILSVRC2012_val_00015781.JPEG n02120505/
+mv val/ILSVRC2012_val_00015782.JPEG n04265275/
+mv val/ILSVRC2012_val_00015783.JPEG n04147183/
+mv val/ILSVRC2012_val_00015784.JPEG n03271574/
+mv val/ILSVRC2012_val_00015785.JPEG n02128385/
+mv val/ILSVRC2012_val_00015786.JPEG n02110958/
+mv val/ILSVRC2012_val_00015787.JPEG n03888257/
+mv val/ILSVRC2012_val_00015788.JPEG n02730930/
+mv val/ILSVRC2012_val_00015789.JPEG n01978455/
+mv val/ILSVRC2012_val_00015790.JPEG n02843684/
+mv val/ILSVRC2012_val_00015791.JPEG n03590841/
+mv val/ILSVRC2012_val_00015792.JPEG n03065424/
+mv val/ILSVRC2012_val_00015793.JPEG n03854065/
+mv val/ILSVRC2012_val_00015794.JPEG n01739381/
+mv val/ILSVRC2012_val_00015795.JPEG n01773797/
+mv val/ILSVRC2012_val_00015796.JPEG n03976657/
+mv val/ILSVRC2012_val_00015797.JPEG n04116512/
+mv val/ILSVRC2012_val_00015798.JPEG n02092339/
+mv val/ILSVRC2012_val_00015799.JPEG n01817953/
+mv val/ILSVRC2012_val_00015800.JPEG n02119789/
+mv val/ILSVRC2012_val_00015801.JPEG n01748264/
+mv val/ILSVRC2012_val_00015802.JPEG n02169497/
+mv val/ILSVRC2012_val_00015803.JPEG n03125729/
+mv val/ILSVRC2012_val_00015804.JPEG n02091467/
+mv val/ILSVRC2012_val_00015805.JPEG n07714571/
+mv val/ILSVRC2012_val_00015806.JPEG n02704792/
+mv val/ILSVRC2012_val_00015807.JPEG n02085936/
+mv val/ILSVRC2012_val_00015808.JPEG n02108915/
+mv val/ILSVRC2012_val_00015809.JPEG n03314780/
+mv val/ILSVRC2012_val_00015810.JPEG n02086646/
+mv val/ILSVRC2012_val_00015811.JPEG n07697537/
+mv val/ILSVRC2012_val_00015812.JPEG n03584829/
+mv val/ILSVRC2012_val_00015813.JPEG n03773504/
+mv val/ILSVRC2012_val_00015814.JPEG n04204347/
+mv val/ILSVRC2012_val_00015815.JPEG n01796340/
+mv val/ILSVRC2012_val_00015816.JPEG n03930313/
+mv val/ILSVRC2012_val_00015817.JPEG n02033041/
+mv val/ILSVRC2012_val_00015818.JPEG n02236044/
+mv val/ILSVRC2012_val_00015819.JPEG n02895154/
+mv val/ILSVRC2012_val_00015820.JPEG n02708093/
+mv val/ILSVRC2012_val_00015821.JPEG n02115641/
+mv val/ILSVRC2012_val_00015822.JPEG n04209239/
+mv val/ILSVRC2012_val_00015823.JPEG n01735189/
+mv val/ILSVRC2012_val_00015824.JPEG n03201208/
+mv val/ILSVRC2012_val_00015825.JPEG n09468604/
+mv val/ILSVRC2012_val_00015826.JPEG n03047690/
+mv val/ILSVRC2012_val_00015827.JPEG n04254777/
+mv val/ILSVRC2012_val_00015828.JPEG n06596364/
+mv val/ILSVRC2012_val_00015829.JPEG n03627232/
+mv val/ILSVRC2012_val_00015830.JPEG n01532829/
+mv val/ILSVRC2012_val_00015831.JPEG n01694178/
+mv val/ILSVRC2012_val_00015832.JPEG n04081281/
+mv val/ILSVRC2012_val_00015833.JPEG n03495258/
+mv val/ILSVRC2012_val_00015834.JPEG n02788148/
+mv val/ILSVRC2012_val_00015835.JPEG n01775062/
+mv val/ILSVRC2012_val_00015836.JPEG n04355933/
+mv val/ILSVRC2012_val_00015837.JPEG n03017168/
+mv val/ILSVRC2012_val_00015838.JPEG n04599235/
+mv val/ILSVRC2012_val_00015839.JPEG n03785016/
+mv val/ILSVRC2012_val_00015840.JPEG n07871810/
+mv val/ILSVRC2012_val_00015841.JPEG n03980874/
+mv val/ILSVRC2012_val_00015842.JPEG n02071294/
+mv val/ILSVRC2012_val_00015843.JPEG n04493381/
+mv val/ILSVRC2012_val_00015844.JPEG n04372370/
+mv val/ILSVRC2012_val_00015845.JPEG n02087046/
+mv val/ILSVRC2012_val_00015846.JPEG n04584207/
+mv val/ILSVRC2012_val_00015847.JPEG n04086273/
+mv val/ILSVRC2012_val_00015848.JPEG n02092339/
+mv val/ILSVRC2012_val_00015849.JPEG n02817516/
+mv val/ILSVRC2012_val_00015850.JPEG n03240683/
+mv val/ILSVRC2012_val_00015851.JPEG n12998815/
+mv val/ILSVRC2012_val_00015852.JPEG n03075370/
+mv val/ILSVRC2012_val_00015853.JPEG n02804414/
+mv val/ILSVRC2012_val_00015854.JPEG n01833805/
+mv val/ILSVRC2012_val_00015855.JPEG n01695060/
+mv val/ILSVRC2012_val_00015856.JPEG n04596742/
+mv val/ILSVRC2012_val_00015857.JPEG n04398044/
+mv val/ILSVRC2012_val_00015858.JPEG n02106382/
+mv val/ILSVRC2012_val_00015859.JPEG n04204238/
+mv val/ILSVRC2012_val_00015860.JPEG n02219486/
+mv val/ILSVRC2012_val_00015861.JPEG n02437312/
+mv val/ILSVRC2012_val_00015862.JPEG n04335435/
+mv val/ILSVRC2012_val_00015863.JPEG n01531178/
+mv val/ILSVRC2012_val_00015864.JPEG n04201297/
+mv val/ILSVRC2012_val_00015865.JPEG n03920288/
+mv val/ILSVRC2012_val_00015866.JPEG n03759954/
+mv val/ILSVRC2012_val_00015867.JPEG n03792782/
+mv val/ILSVRC2012_val_00015868.JPEG n02412080/
+mv val/ILSVRC2012_val_00015869.JPEG n04536866/
+mv val/ILSVRC2012_val_00015870.JPEG n03874293/
+mv val/ILSVRC2012_val_00015871.JPEG n02708093/
+mv val/ILSVRC2012_val_00015872.JPEG n02437312/
+mv val/ILSVRC2012_val_00015873.JPEG n04509417/
+mv val/ILSVRC2012_val_00015874.JPEG n01990800/
+mv val/ILSVRC2012_val_00015875.JPEG n04579145/
+mv val/ILSVRC2012_val_00015876.JPEG n02480495/
+mv val/ILSVRC2012_val_00015877.JPEG n04371430/
+mv val/ILSVRC2012_val_00015878.JPEG n02105056/
+mv val/ILSVRC2012_val_00015879.JPEG n03930630/
+mv val/ILSVRC2012_val_00015880.JPEG n03481172/
+mv val/ILSVRC2012_val_00015881.JPEG n02808440/
+mv val/ILSVRC2012_val_00015882.JPEG n07932039/
+mv val/ILSVRC2012_val_00015883.JPEG n04428191/
+mv val/ILSVRC2012_val_00015884.JPEG n02971356/
+mv val/ILSVRC2012_val_00015885.JPEG n02090379/
+mv val/ILSVRC2012_val_00015886.JPEG n03857828/
+mv val/ILSVRC2012_val_00015887.JPEG n02988304/
+mv val/ILSVRC2012_val_00015888.JPEG n02115913/
+mv val/ILSVRC2012_val_00015889.JPEG n04599235/
+mv val/ILSVRC2012_val_00015890.JPEG n04033901/
+mv val/ILSVRC2012_val_00015891.JPEG n11879895/
+mv val/ILSVRC2012_val_00015892.JPEG n03014705/
+mv val/ILSVRC2012_val_00015893.JPEG n02002724/
+mv val/ILSVRC2012_val_00015894.JPEG n02445715/
+mv val/ILSVRC2012_val_00015895.JPEG n02870880/
+mv val/ILSVRC2012_val_00015896.JPEG n02951585/
+mv val/ILSVRC2012_val_00015897.JPEG n02129604/
+mv val/ILSVRC2012_val_00015898.JPEG n02123394/
+mv val/ILSVRC2012_val_00015899.JPEG n01860187/
+mv val/ILSVRC2012_val_00015900.JPEG n03788195/
+mv val/ILSVRC2012_val_00015901.JPEG n03729826/
+mv val/ILSVRC2012_val_00015902.JPEG n01665541/
+mv val/ILSVRC2012_val_00015903.JPEG n01531178/
+mv val/ILSVRC2012_val_00015904.JPEG n04442312/
+mv val/ILSVRC2012_val_00015905.JPEG n02777292/
+mv val/ILSVRC2012_val_00015906.JPEG n13044778/
+mv val/ILSVRC2012_val_00015907.JPEG n07720875/
+mv val/ILSVRC2012_val_00015908.JPEG n02027492/
+mv val/ILSVRC2012_val_00015909.JPEG n02480855/
+mv val/ILSVRC2012_val_00015910.JPEG n04447861/
+mv val/ILSVRC2012_val_00015911.JPEG n02403003/
+mv val/ILSVRC2012_val_00015912.JPEG n03874599/
+mv val/ILSVRC2012_val_00015913.JPEG n01622779/
+mv val/ILSVRC2012_val_00015914.JPEG n02860847/
+mv val/ILSVRC2012_val_00015915.JPEG n03884397/
+mv val/ILSVRC2012_val_00015916.JPEG n13040303/
+mv val/ILSVRC2012_val_00015917.JPEG n03796401/
+mv val/ILSVRC2012_val_00015918.JPEG n03388549/
+mv val/ILSVRC2012_val_00015919.JPEG n03970156/
+mv val/ILSVRC2012_val_00015920.JPEG n02112137/
+mv val/ILSVRC2012_val_00015921.JPEG n03775071/
+mv val/ILSVRC2012_val_00015922.JPEG n01601694/
+mv val/ILSVRC2012_val_00015923.JPEG n02093991/
+mv val/ILSVRC2012_val_00015924.JPEG n01664065/
+mv val/ILSVRC2012_val_00015925.JPEG n02077923/
+mv val/ILSVRC2012_val_00015926.JPEG n02487347/
+mv val/ILSVRC2012_val_00015927.JPEG n02444819/
+mv val/ILSVRC2012_val_00015928.JPEG n02480855/
+mv val/ILSVRC2012_val_00015929.JPEG n04505470/
+mv val/ILSVRC2012_val_00015930.JPEG n03980874/
+mv val/ILSVRC2012_val_00015931.JPEG n03447447/
+mv val/ILSVRC2012_val_00015932.JPEG n01955084/
+mv val/ILSVRC2012_val_00015933.JPEG n02056570/
+mv val/ILSVRC2012_val_00015934.JPEG n03127747/
+mv val/ILSVRC2012_val_00015935.JPEG n02692877/
+mv val/ILSVRC2012_val_00015936.JPEG n06596364/
+mv val/ILSVRC2012_val_00015937.JPEG n03400231/
+mv val/ILSVRC2012_val_00015938.JPEG n03482405/
+mv val/ILSVRC2012_val_00015939.JPEG n03920288/
+mv val/ILSVRC2012_val_00015940.JPEG n03871628/
+mv val/ILSVRC2012_val_00015941.JPEG n03496892/
+mv val/ILSVRC2012_val_00015942.JPEG n12267677/
+mv val/ILSVRC2012_val_00015943.JPEG n04310018/
+mv val/ILSVRC2012_val_00015944.JPEG n02865351/
+mv val/ILSVRC2012_val_00015945.JPEG n01924916/
+mv val/ILSVRC2012_val_00015946.JPEG n03000247/
+mv val/ILSVRC2012_val_00015947.JPEG n03393912/
+mv val/ILSVRC2012_val_00015948.JPEG n02825657/
+mv val/ILSVRC2012_val_00015949.JPEG n06785654/
+mv val/ILSVRC2012_val_00015950.JPEG n02097474/
+mv val/ILSVRC2012_val_00015951.JPEG n04179913/
+mv val/ILSVRC2012_val_00015952.JPEG n02112350/
+mv val/ILSVRC2012_val_00015953.JPEG n03444034/
+mv val/ILSVRC2012_val_00015954.JPEG n03133878/
+mv val/ILSVRC2012_val_00015955.JPEG n02132136/
+mv val/ILSVRC2012_val_00015956.JPEG n02843684/
+mv val/ILSVRC2012_val_00015957.JPEG n01770393/
+mv val/ILSVRC2012_val_00015958.JPEG n01871265/
+mv val/ILSVRC2012_val_00015959.JPEG n03290653/
+mv val/ILSVRC2012_val_00015960.JPEG n03207941/
+mv val/ILSVRC2012_val_00015961.JPEG n03476991/
+mv val/ILSVRC2012_val_00015962.JPEG n03481172/
+mv val/ILSVRC2012_val_00015963.JPEG n04590129/
+mv val/ILSVRC2012_val_00015964.JPEG n01532829/
+mv val/ILSVRC2012_val_00015965.JPEG n03642806/
+mv val/ILSVRC2012_val_00015966.JPEG n03388183/
+mv val/ILSVRC2012_val_00015967.JPEG n02094258/
+mv val/ILSVRC2012_val_00015968.JPEG n03496892/
+mv val/ILSVRC2012_val_00015969.JPEG n04467665/
+mv val/ILSVRC2012_val_00015970.JPEG n02963159/
+mv val/ILSVRC2012_val_00015971.JPEG n02328150/
+mv val/ILSVRC2012_val_00015972.JPEG n02101388/
+mv val/ILSVRC2012_val_00015973.JPEG n09256479/
+mv val/ILSVRC2012_val_00015974.JPEG n03777568/
+mv val/ILSVRC2012_val_00015975.JPEG n02165456/
+mv val/ILSVRC2012_val_00015976.JPEG n03042490/
+mv val/ILSVRC2012_val_00015977.JPEG n02363005/
+mv val/ILSVRC2012_val_00015978.JPEG n13054560/
+mv val/ILSVRC2012_val_00015979.JPEG n02808440/
+mv val/ILSVRC2012_val_00015980.JPEG n04532670/
+mv val/ILSVRC2012_val_00015981.JPEG n01688243/
+mv val/ILSVRC2012_val_00015982.JPEG n03602883/
+mv val/ILSVRC2012_val_00015983.JPEG n02206856/
+mv val/ILSVRC2012_val_00015984.JPEG n03400231/
+mv val/ILSVRC2012_val_00015985.JPEG n02346627/
+mv val/ILSVRC2012_val_00015986.JPEG n01871265/
+mv val/ILSVRC2012_val_00015987.JPEG n01806567/
+mv val/ILSVRC2012_val_00015988.JPEG n02727426/
+mv val/ILSVRC2012_val_00015989.JPEG n04067472/
+mv val/ILSVRC2012_val_00015990.JPEG n02088094/
+mv val/ILSVRC2012_val_00015991.JPEG n04553703/
+mv val/ILSVRC2012_val_00015992.JPEG n13037406/
+mv val/ILSVRC2012_val_00015993.JPEG n07718472/
+mv val/ILSVRC2012_val_00015994.JPEG n04252077/
+mv val/ILSVRC2012_val_00015995.JPEG n04258138/
+mv val/ILSVRC2012_val_00015996.JPEG n02808440/
+mv val/ILSVRC2012_val_00015997.JPEG n02328150/
+mv val/ILSVRC2012_val_00015998.JPEG n03325584/
+mv val/ILSVRC2012_val_00015999.JPEG n01774750/
+mv val/ILSVRC2012_val_00016000.JPEG n02123159/
+mv val/ILSVRC2012_val_00016001.JPEG n02111277/
+mv val/ILSVRC2012_val_00016002.JPEG n04591157/
+mv val/ILSVRC2012_val_00016003.JPEG n03871628/
+mv val/ILSVRC2012_val_00016004.JPEG n03775071/
+mv val/ILSVRC2012_val_00016005.JPEG n04136333/
+mv val/ILSVRC2012_val_00016006.JPEG n03976467/
+mv val/ILSVRC2012_val_00016007.JPEG n03908618/
+mv val/ILSVRC2012_val_00016008.JPEG n03483316/
+mv val/ILSVRC2012_val_00016009.JPEG n04487394/
+mv val/ILSVRC2012_val_00016010.JPEG n02769748/
+mv val/ILSVRC2012_val_00016011.JPEG n04523525/
+mv val/ILSVRC2012_val_00016012.JPEG n12998815/
+mv val/ILSVRC2012_val_00016013.JPEG n04553703/
+mv val/ILSVRC2012_val_00016014.JPEG n04152593/
+mv val/ILSVRC2012_val_00016015.JPEG n02346627/
+mv val/ILSVRC2012_val_00016016.JPEG n02007558/
+mv val/ILSVRC2012_val_00016017.JPEG n03110669/
+mv val/ILSVRC2012_val_00016018.JPEG n01440764/
+mv val/ILSVRC2012_val_00016019.JPEG n09472597/
+mv val/ILSVRC2012_val_00016020.JPEG n02730930/
+mv val/ILSVRC2012_val_00016021.JPEG n02782093/
+mv val/ILSVRC2012_val_00016022.JPEG n04483307/
+mv val/ILSVRC2012_val_00016023.JPEG n02028035/
+mv val/ILSVRC2012_val_00016024.JPEG n04040759/
+mv val/ILSVRC2012_val_00016025.JPEG n03372029/
+mv val/ILSVRC2012_val_00016026.JPEG n02808440/
+mv val/ILSVRC2012_val_00016027.JPEG n02120505/
+mv val/ILSVRC2012_val_00016028.JPEG n03141823/
+mv val/ILSVRC2012_val_00016029.JPEG n02100236/
+mv val/ILSVRC2012_val_00016030.JPEG n01770393/
+mv val/ILSVRC2012_val_00016031.JPEG n01739381/
+mv val/ILSVRC2012_val_00016032.JPEG n03208938/
+mv val/ILSVRC2012_val_00016033.JPEG n03954731/
+mv val/ILSVRC2012_val_00016034.JPEG n04536866/
+mv val/ILSVRC2012_val_00016035.JPEG n04456115/
+mv val/ILSVRC2012_val_00016036.JPEG n03000247/
+mv val/ILSVRC2012_val_00016037.JPEG n04612504/
+mv val/ILSVRC2012_val_00016038.JPEG n02837789/
+mv val/ILSVRC2012_val_00016039.JPEG n03538406/
+mv val/ILSVRC2012_val_00016040.JPEG n02699494/
+mv val/ILSVRC2012_val_00016041.JPEG n03967562/
+mv val/ILSVRC2012_val_00016042.JPEG n04398044/
+mv val/ILSVRC2012_val_00016043.JPEG n03710721/
+mv val/ILSVRC2012_val_00016044.JPEG n04356056/
+mv val/ILSVRC2012_val_00016045.JPEG n04033995/
+mv val/ILSVRC2012_val_00016046.JPEG n02415577/
+mv val/ILSVRC2012_val_00016047.JPEG n04270147/
+mv val/ILSVRC2012_val_00016048.JPEG n03866082/
+mv val/ILSVRC2012_val_00016049.JPEG n03271574/
+mv val/ILSVRC2012_val_00016050.JPEG n02133161/
+mv val/ILSVRC2012_val_00016051.JPEG n03483316/
+mv val/ILSVRC2012_val_00016052.JPEG n01514668/
+mv val/ILSVRC2012_val_00016053.JPEG n03770679/
+mv val/ILSVRC2012_val_00016054.JPEG n04532670/
+mv val/ILSVRC2012_val_00016055.JPEG n03720891/
+mv val/ILSVRC2012_val_00016056.JPEG n02096437/
+mv val/ILSVRC2012_val_00016057.JPEG n03444034/
+mv val/ILSVRC2012_val_00016058.JPEG n02088632/
+mv val/ILSVRC2012_val_00016059.JPEG n02328150/
+mv val/ILSVRC2012_val_00016060.JPEG n02787622/
+mv val/ILSVRC2012_val_00016061.JPEG n12998815/
+mv val/ILSVRC2012_val_00016062.JPEG n07716358/
+mv val/ILSVRC2012_val_00016063.JPEG n02817516/
+mv val/ILSVRC2012_val_00016064.JPEG n03961711/
+mv val/ILSVRC2012_val_00016065.JPEG n02823428/
+mv val/ILSVRC2012_val_00016066.JPEG n01753488/
+mv val/ILSVRC2012_val_00016067.JPEG n02443114/
+mv val/ILSVRC2012_val_00016068.JPEG n04370456/
+mv val/ILSVRC2012_val_00016069.JPEG n04542943/
+mv val/ILSVRC2012_val_00016070.JPEG n03876231/
+mv val/ILSVRC2012_val_00016071.JPEG n02509815/
+mv val/ILSVRC2012_val_00016072.JPEG n04371430/
+mv val/ILSVRC2012_val_00016073.JPEG n04141975/
+mv val/ILSVRC2012_val_00016074.JPEG n02112350/
+mv val/ILSVRC2012_val_00016075.JPEG n02321529/
+mv val/ILSVRC2012_val_00016076.JPEG n02097474/
+mv val/ILSVRC2012_val_00016077.JPEG n04461696/
+mv val/ILSVRC2012_val_00016078.JPEG n03804744/
+mv val/ILSVRC2012_val_00016079.JPEG n02786058/
+mv val/ILSVRC2012_val_00016080.JPEG n12768682/
+mv val/ILSVRC2012_val_00016081.JPEG n01855032/
+mv val/ILSVRC2012_val_00016082.JPEG n03992509/
+mv val/ILSVRC2012_val_00016083.JPEG n01773797/
+mv val/ILSVRC2012_val_00016084.JPEG n02443484/
+mv val/ILSVRC2012_val_00016085.JPEG n02101006/
+mv val/ILSVRC2012_val_00016086.JPEG n09421951/
+mv val/ILSVRC2012_val_00016087.JPEG n03837869/
+mv val/ILSVRC2012_val_00016088.JPEG n04356056/
+mv val/ILSVRC2012_val_00016089.JPEG n01744401/
+mv val/ILSVRC2012_val_00016090.JPEG n02701002/
+mv val/ILSVRC2012_val_00016091.JPEG n03977966/
+mv val/ILSVRC2012_val_00016092.JPEG n02105056/
+mv val/ILSVRC2012_val_00016093.JPEG n02102318/
+mv val/ILSVRC2012_val_00016094.JPEG n03095699/
+mv val/ILSVRC2012_val_00016095.JPEG n01728572/
+mv val/ILSVRC2012_val_00016096.JPEG n01873310/
+mv val/ILSVRC2012_val_00016097.JPEG n03930313/
+mv val/ILSVRC2012_val_00016098.JPEG n03930630/
+mv val/ILSVRC2012_val_00016099.JPEG n06359193/
+mv val/ILSVRC2012_val_00016100.JPEG n02033041/
+mv val/ILSVRC2012_val_00016101.JPEG n04604644/
+mv val/ILSVRC2012_val_00016102.JPEG n03781244/
+mv val/ILSVRC2012_val_00016103.JPEG n04599235/
+mv val/ILSVRC2012_val_00016104.JPEG n02114548/
+mv val/ILSVRC2012_val_00016105.JPEG n02356798/
+mv val/ILSVRC2012_val_00016106.JPEG n03271574/
+mv val/ILSVRC2012_val_00016107.JPEG n07932039/
+mv val/ILSVRC2012_val_00016108.JPEG n02100735/
+mv val/ILSVRC2012_val_00016109.JPEG n04069434/
+mv val/ILSVRC2012_val_00016110.JPEG n04346328/
+mv val/ILSVRC2012_val_00016111.JPEG n09332890/
+mv val/ILSVRC2012_val_00016112.JPEG n12768682/
+mv val/ILSVRC2012_val_00016113.JPEG n02795169/
+mv val/ILSVRC2012_val_00016114.JPEG n04049303/
+mv val/ILSVRC2012_val_00016115.JPEG n02403003/
+mv val/ILSVRC2012_val_00016116.JPEG n04239074/
+mv val/ILSVRC2012_val_00016117.JPEG n02493793/
+mv val/ILSVRC2012_val_00016118.JPEG n02127052/
+mv val/ILSVRC2012_val_00016119.JPEG n04317175/
+mv val/ILSVRC2012_val_00016120.JPEG n02363005/
+mv val/ILSVRC2012_val_00016121.JPEG n03832673/
+mv val/ILSVRC2012_val_00016122.JPEG n04296562/
+mv val/ILSVRC2012_val_00016123.JPEG n03630383/
+mv val/ILSVRC2012_val_00016124.JPEG n01739381/
+mv val/ILSVRC2012_val_00016125.JPEG n02107683/
+mv val/ILSVRC2012_val_00016126.JPEG n02012849/
+mv val/ILSVRC2012_val_00016127.JPEG n03786901/
+mv val/ILSVRC2012_val_00016128.JPEG n04033995/
+mv val/ILSVRC2012_val_00016129.JPEG n03782006/
+mv val/ILSVRC2012_val_00016130.JPEG n02113624/
+mv val/ILSVRC2012_val_00016131.JPEG n02783161/
+mv val/ILSVRC2012_val_00016132.JPEG n02134418/
+mv val/ILSVRC2012_val_00016133.JPEG n03532672/
+mv val/ILSVRC2012_val_00016134.JPEG n02012849/
+mv val/ILSVRC2012_val_00016135.JPEG n02415577/
+mv val/ILSVRC2012_val_00016136.JPEG n02096437/
+mv val/ILSVRC2012_val_00016137.JPEG n03220513/
+mv val/ILSVRC2012_val_00016138.JPEG n01945685/
+mv val/ILSVRC2012_val_00016139.JPEG n02892201/
+mv val/ILSVRC2012_val_00016140.JPEG n04044716/
+mv val/ILSVRC2012_val_00016141.JPEG n07742313/
+mv val/ILSVRC2012_val_00016142.JPEG n03376595/
+mv val/ILSVRC2012_val_00016143.JPEG n02643566/
+mv val/ILSVRC2012_val_00016144.JPEG n01735189/
+mv val/ILSVRC2012_val_00016145.JPEG n01729977/
+mv val/ILSVRC2012_val_00016146.JPEG n02105251/
+mv val/ILSVRC2012_val_00016147.JPEG n09421951/
+mv val/ILSVRC2012_val_00016148.JPEG n02099712/
+mv val/ILSVRC2012_val_00016149.JPEG n03388043/
+mv val/ILSVRC2012_val_00016150.JPEG n02174001/
+mv val/ILSVRC2012_val_00016151.JPEG n04147183/
+mv val/ILSVRC2012_val_00016152.JPEG n02013706/
+mv val/ILSVRC2012_val_00016153.JPEG n13054560/
+mv val/ILSVRC2012_val_00016154.JPEG n02978881/
+mv val/ILSVRC2012_val_00016155.JPEG n09246464/
+mv val/ILSVRC2012_val_00016156.JPEG n02699494/
+mv val/ILSVRC2012_val_00016157.JPEG n02107312/
+mv val/ILSVRC2012_val_00016158.JPEG n03017168/
+mv val/ILSVRC2012_val_00016159.JPEG n07745940/
+mv val/ILSVRC2012_val_00016160.JPEG n02233338/
+mv val/ILSVRC2012_val_00016161.JPEG n02791270/
+mv val/ILSVRC2012_val_00016162.JPEG n01950731/
+mv val/ILSVRC2012_val_00016163.JPEG n03857828/
+mv val/ILSVRC2012_val_00016164.JPEG n02025239/
+mv val/ILSVRC2012_val_00016165.JPEG n03452741/
+mv val/ILSVRC2012_val_00016166.JPEG n02101388/
+mv val/ILSVRC2012_val_00016167.JPEG n03388549/
+mv val/ILSVRC2012_val_00016168.JPEG n01484850/
+mv val/ILSVRC2012_val_00016169.JPEG n02111277/
+mv val/ILSVRC2012_val_00016170.JPEG n01950731/
+mv val/ILSVRC2012_val_00016171.JPEG n02174001/
+mv val/ILSVRC2012_val_00016172.JPEG n02105162/
+mv val/ILSVRC2012_val_00016173.JPEG n02480855/
+mv val/ILSVRC2012_val_00016174.JPEG n03325584/
+mv val/ILSVRC2012_val_00016175.JPEG n03272562/
+mv val/ILSVRC2012_val_00016176.JPEG n03876231/
+mv val/ILSVRC2012_val_00016177.JPEG n01644373/
+mv val/ILSVRC2012_val_00016178.JPEG n04380533/
+mv val/ILSVRC2012_val_00016179.JPEG n07697537/
+mv val/ILSVRC2012_val_00016180.JPEG n04380533/
+mv val/ILSVRC2012_val_00016181.JPEG n02190166/
+mv val/ILSVRC2012_val_00016182.JPEG n07753592/
+mv val/ILSVRC2012_val_00016183.JPEG n01630670/
+mv val/ILSVRC2012_val_00016184.JPEG n02730930/
+mv val/ILSVRC2012_val_00016185.JPEG n03788195/
+mv val/ILSVRC2012_val_00016186.JPEG n02669723/
+mv val/ILSVRC2012_val_00016187.JPEG n02100735/
+mv val/ILSVRC2012_val_00016188.JPEG n03271574/
+mv val/ILSVRC2012_val_00016189.JPEG n03179701/
+mv val/ILSVRC2012_val_00016190.JPEG n02486261/
+mv val/ILSVRC2012_val_00016191.JPEG n02105412/
+mv val/ILSVRC2012_val_00016192.JPEG n02417914/
+mv val/ILSVRC2012_val_00016193.JPEG n01770081/
+mv val/ILSVRC2012_val_00016194.JPEG n02123394/
+mv val/ILSVRC2012_val_00016195.JPEG n01855672/
+mv val/ILSVRC2012_val_00016196.JPEG n02480495/
+mv val/ILSVRC2012_val_00016197.JPEG n02692877/
+mv val/ILSVRC2012_val_00016198.JPEG n01532829/
+mv val/ILSVRC2012_val_00016199.JPEG n04372370/
+mv val/ILSVRC2012_val_00016200.JPEG n01910747/
+mv val/ILSVRC2012_val_00016201.JPEG n03400231/
+mv val/ILSVRC2012_val_00016202.JPEG n02444819/
+mv val/ILSVRC2012_val_00016203.JPEG n04099969/
+mv val/ILSVRC2012_val_00016204.JPEG n03498962/
+mv val/ILSVRC2012_val_00016205.JPEG n04154565/
+mv val/ILSVRC2012_val_00016206.JPEG n02783161/
+mv val/ILSVRC2012_val_00016207.JPEG n03124170/
+mv val/ILSVRC2012_val_00016208.JPEG n03417042/
+mv val/ILSVRC2012_val_00016209.JPEG n04254120/
+mv val/ILSVRC2012_val_00016210.JPEG n07717410/
+mv val/ILSVRC2012_val_00016211.JPEG n04372370/
+mv val/ILSVRC2012_val_00016212.JPEG n07565083/
+mv val/ILSVRC2012_val_00016213.JPEG n03661043/
+mv val/ILSVRC2012_val_00016214.JPEG n04074963/
+mv val/ILSVRC2012_val_00016215.JPEG n02504458/
+mv val/ILSVRC2012_val_00016216.JPEG n03720891/
+mv val/ILSVRC2012_val_00016217.JPEG n03445924/
+mv val/ILSVRC2012_val_00016218.JPEG n03873416/
+mv val/ILSVRC2012_val_00016219.JPEG n03775071/
+mv val/ILSVRC2012_val_00016220.JPEG n02443114/
+mv val/ILSVRC2012_val_00016221.JPEG n03623198/
+mv val/ILSVRC2012_val_00016222.JPEG n03000247/
+mv val/ILSVRC2012_val_00016223.JPEG n02423022/
+mv val/ILSVRC2012_val_00016224.JPEG n03929660/
+mv val/ILSVRC2012_val_00016225.JPEG n02782093/
+mv val/ILSVRC2012_val_00016226.JPEG n01930112/
+mv val/ILSVRC2012_val_00016227.JPEG n01776313/
+mv val/ILSVRC2012_val_00016228.JPEG n03388183/
+mv val/ILSVRC2012_val_00016229.JPEG n02133161/
+mv val/ILSVRC2012_val_00016230.JPEG n02782093/
+mv val/ILSVRC2012_val_00016231.JPEG n03393912/
+mv val/ILSVRC2012_val_00016232.JPEG n03794056/
+mv val/ILSVRC2012_val_00016233.JPEG n09256479/
+mv val/ILSVRC2012_val_00016234.JPEG n07920052/
+mv val/ILSVRC2012_val_00016235.JPEG n03384352/
+mv val/ILSVRC2012_val_00016236.JPEG n02666196/
+mv val/ILSVRC2012_val_00016237.JPEG n02894605/
+mv val/ILSVRC2012_val_00016238.JPEG n03476684/
+mv val/ILSVRC2012_val_00016239.JPEG n02526121/
+mv val/ILSVRC2012_val_00016240.JPEG n02123045/
+mv val/ILSVRC2012_val_00016241.JPEG n03673027/
+mv val/ILSVRC2012_val_00016242.JPEG n03197337/
+mv val/ILSVRC2012_val_00016243.JPEG n02114548/
+mv val/ILSVRC2012_val_00016244.JPEG n04599235/
+mv val/ILSVRC2012_val_00016245.JPEG n02085936/
+mv val/ILSVRC2012_val_00016246.JPEG n02963159/
+mv val/ILSVRC2012_val_00016247.JPEG n04258138/
+mv val/ILSVRC2012_val_00016248.JPEG n03983396/
+mv val/ILSVRC2012_val_00016249.JPEG n03187595/
+mv val/ILSVRC2012_val_00016250.JPEG n03290653/
+mv val/ILSVRC2012_val_00016251.JPEG n03179701/
+mv val/ILSVRC2012_val_00016252.JPEG n01531178/
+mv val/ILSVRC2012_val_00016253.JPEG n02398521/
+mv val/ILSVRC2012_val_00016254.JPEG n02119789/
+mv val/ILSVRC2012_val_00016255.JPEG n02089867/
+mv val/ILSVRC2012_val_00016256.JPEG n04548362/
+mv val/ILSVRC2012_val_00016257.JPEG n02486410/
+mv val/ILSVRC2012_val_00016258.JPEG n01704323/
+mv val/ILSVRC2012_val_00016259.JPEG n01494475/
+mv val/ILSVRC2012_val_00016260.JPEG n04141327/
+mv val/ILSVRC2012_val_00016261.JPEG n02790996/
+mv val/ILSVRC2012_val_00016262.JPEG n02056570/
+mv val/ILSVRC2012_val_00016263.JPEG n02106166/
+mv val/ILSVRC2012_val_00016264.JPEG n02018795/
+mv val/ILSVRC2012_val_00016265.JPEG n04523525/
+mv val/ILSVRC2012_val_00016266.JPEG n03598930/
+mv val/ILSVRC2012_val_00016267.JPEG n04118776/
+mv val/ILSVRC2012_val_00016268.JPEG n03662601/
+mv val/ILSVRC2012_val_00016269.JPEG n04509417/
+mv val/ILSVRC2012_val_00016270.JPEG n02606052/
+mv val/ILSVRC2012_val_00016271.JPEG n02966193/
+mv val/ILSVRC2012_val_00016272.JPEG n03775071/
+mv val/ILSVRC2012_val_00016273.JPEG n02317335/
+mv val/ILSVRC2012_val_00016274.JPEG n03146219/
+mv val/ILSVRC2012_val_00016275.JPEG n03355925/
+mv val/ILSVRC2012_val_00016276.JPEG n02229544/
+mv val/ILSVRC2012_val_00016277.JPEG n02443114/
+mv val/ILSVRC2012_val_00016278.JPEG n03355925/
+mv val/ILSVRC2012_val_00016279.JPEG n04590129/
+mv val/ILSVRC2012_val_00016280.JPEG n02804414/
+mv val/ILSVRC2012_val_00016281.JPEG n02114367/
+mv val/ILSVRC2012_val_00016282.JPEG n03379051/
+mv val/ILSVRC2012_val_00016283.JPEG n02138441/
+mv val/ILSVRC2012_val_00016284.JPEG n03461385/
+mv val/ILSVRC2012_val_00016285.JPEG n04200800/
+mv val/ILSVRC2012_val_00016286.JPEG n03584829/
+mv val/ILSVRC2012_val_00016287.JPEG n01755581/
+mv val/ILSVRC2012_val_00016288.JPEG n04335435/
+mv val/ILSVRC2012_val_00016289.JPEG n03127747/
+mv val/ILSVRC2012_val_00016290.JPEG n04263257/
+mv val/ILSVRC2012_val_00016291.JPEG n04192698/
+mv val/ILSVRC2012_val_00016292.JPEG n01622779/
+mv val/ILSVRC2012_val_00016293.JPEG n02422699/
+mv val/ILSVRC2012_val_00016294.JPEG n02107683/
+mv val/ILSVRC2012_val_00016295.JPEG n04532670/
+mv val/ILSVRC2012_val_00016296.JPEG n02906734/
+mv val/ILSVRC2012_val_00016297.JPEG n02804414/
+mv val/ILSVRC2012_val_00016298.JPEG n12768682/
+mv val/ILSVRC2012_val_00016299.JPEG n02108089/
+mv val/ILSVRC2012_val_00016300.JPEG n02909870/
+mv val/ILSVRC2012_val_00016301.JPEG n03837869/
+mv val/ILSVRC2012_val_00016302.JPEG n02113186/
+mv val/ILSVRC2012_val_00016303.JPEG n02112350/
+mv val/ILSVRC2012_val_00016304.JPEG n01677366/
+mv val/ILSVRC2012_val_00016305.JPEG n03630383/
+mv val/ILSVRC2012_val_00016306.JPEG n02526121/
+mv val/ILSVRC2012_val_00016307.JPEG n02840245/
+mv val/ILSVRC2012_val_00016308.JPEG n01687978/
+mv val/ILSVRC2012_val_00016309.JPEG n04515003/
+mv val/ILSVRC2012_val_00016310.JPEG n15075141/
+mv val/ILSVRC2012_val_00016311.JPEG n02841315/
+mv val/ILSVRC2012_val_00016312.JPEG n02422106/
+mv val/ILSVRC2012_val_00016313.JPEG n02783161/
+mv val/ILSVRC2012_val_00016314.JPEG n02814533/
+mv val/ILSVRC2012_val_00016315.JPEG n02102177/
+mv val/ILSVRC2012_val_00016316.JPEG n02415577/
+mv val/ILSVRC2012_val_00016317.JPEG n03782006/
+mv val/ILSVRC2012_val_00016318.JPEG n01770081/
+mv val/ILSVRC2012_val_00016319.JPEG n02114548/
+mv val/ILSVRC2012_val_00016320.JPEG n03958227/
+mv val/ILSVRC2012_val_00016321.JPEG n01728920/
+mv val/ILSVRC2012_val_00016322.JPEG n03494278/
+mv val/ILSVRC2012_val_00016323.JPEG n01873310/
+mv val/ILSVRC2012_val_00016324.JPEG n02894605/
+mv val/ILSVRC2012_val_00016325.JPEG n01833805/
+mv val/ILSVRC2012_val_00016326.JPEG n03160309/
+mv val/ILSVRC2012_val_00016327.JPEG n04458633/
+mv val/ILSVRC2012_val_00016328.JPEG n03223299/
+mv val/ILSVRC2012_val_00016329.JPEG n12620546/
+mv val/ILSVRC2012_val_00016330.JPEG n12998815/
+mv val/ILSVRC2012_val_00016331.JPEG n01496331/
+mv val/ILSVRC2012_val_00016332.JPEG n04461696/
+mv val/ILSVRC2012_val_00016333.JPEG n01981276/
+mv val/ILSVRC2012_val_00016334.JPEG n03595614/
+mv val/ILSVRC2012_val_00016335.JPEG n02101388/
+mv val/ILSVRC2012_val_00016336.JPEG n03937543/
+mv val/ILSVRC2012_val_00016337.JPEG n03100240/
+mv val/ILSVRC2012_val_00016338.JPEG n03791053/
+mv val/ILSVRC2012_val_00016339.JPEG n04613696/
+mv val/ILSVRC2012_val_00016340.JPEG n02134084/
+mv val/ILSVRC2012_val_00016341.JPEG n04141975/
+mv val/ILSVRC2012_val_00016342.JPEG n02093859/
+mv val/ILSVRC2012_val_00016343.JPEG n03125729/
+mv val/ILSVRC2012_val_00016344.JPEG n02326432/
+mv val/ILSVRC2012_val_00016345.JPEG n03680355/
+mv val/ILSVRC2012_val_00016346.JPEG n03998194/
+mv val/ILSVRC2012_val_00016347.JPEG n01494475/
+mv val/ILSVRC2012_val_00016348.JPEG n02342885/
+mv val/ILSVRC2012_val_00016349.JPEG n03976657/
+mv val/ILSVRC2012_val_00016350.JPEG n01819313/
+mv val/ILSVRC2012_val_00016351.JPEG n04606251/
+mv val/ILSVRC2012_val_00016352.JPEG n01740131/
+mv val/ILSVRC2012_val_00016353.JPEG n02797295/
+mv val/ILSVRC2012_val_00016354.JPEG n02123394/
+mv val/ILSVRC2012_val_00016355.JPEG n02169497/
+mv val/ILSVRC2012_val_00016356.JPEG n03630383/
+mv val/ILSVRC2012_val_00016357.JPEG n01689811/
+mv val/ILSVRC2012_val_00016358.JPEG n03950228/
+mv val/ILSVRC2012_val_00016359.JPEG n07584110/
+mv val/ILSVRC2012_val_00016360.JPEG n04591713/
+mv val/ILSVRC2012_val_00016361.JPEG n04127249/
+mv val/ILSVRC2012_val_00016362.JPEG n12144580/
+mv val/ILSVRC2012_val_00016363.JPEG n07831146/
+mv val/ILSVRC2012_val_00016364.JPEG n03791053/
+mv val/ILSVRC2012_val_00016365.JPEG n02808440/
+mv val/ILSVRC2012_val_00016366.JPEG n02793495/
+mv val/ILSVRC2012_val_00016367.JPEG n02437312/
+mv val/ILSVRC2012_val_00016368.JPEG n02138441/
+mv val/ILSVRC2012_val_00016369.JPEG n02111500/
+mv val/ILSVRC2012_val_00016370.JPEG n02109961/
+mv val/ILSVRC2012_val_00016371.JPEG n03459775/
+mv val/ILSVRC2012_val_00016372.JPEG n03126707/
+mv val/ILSVRC2012_val_00016373.JPEG n03388549/
+mv val/ILSVRC2012_val_00016374.JPEG n02096294/
+mv val/ILSVRC2012_val_00016375.JPEG n03961711/
+mv val/ILSVRC2012_val_00016376.JPEG n04209133/
+mv val/ILSVRC2012_val_00016377.JPEG n04243546/
+mv val/ILSVRC2012_val_00016378.JPEG n02791270/
+mv val/ILSVRC2012_val_00016379.JPEG n01685808/
+mv val/ILSVRC2012_val_00016380.JPEG n02965783/
+mv val/ILSVRC2012_val_00016381.JPEG n03775546/
+mv val/ILSVRC2012_val_00016382.JPEG n02074367/
+mv val/ILSVRC2012_val_00016383.JPEG n03775546/
+mv val/ILSVRC2012_val_00016384.JPEG n03584254/
+mv val/ILSVRC2012_val_00016385.JPEG n02119789/
+mv val/ILSVRC2012_val_00016386.JPEG n02437312/
+mv val/ILSVRC2012_val_00016387.JPEG n03888257/
+mv val/ILSVRC2012_val_00016388.JPEG n03187595/
+mv val/ILSVRC2012_val_00016389.JPEG n02123045/
+mv val/ILSVRC2012_val_00016390.JPEG n03937543/
+mv val/ILSVRC2012_val_00016391.JPEG n02412080/
+mv val/ILSVRC2012_val_00016392.JPEG n01729322/
+mv val/ILSVRC2012_val_00016393.JPEG n03908714/
+mv val/ILSVRC2012_val_00016394.JPEG n02125311/
+mv val/ILSVRC2012_val_00016395.JPEG n01494475/
+mv val/ILSVRC2012_val_00016396.JPEG n02894605/
+mv val/ILSVRC2012_val_00016397.JPEG n03908618/
+mv val/ILSVRC2012_val_00016398.JPEG n02114855/
+mv val/ILSVRC2012_val_00016399.JPEG n02123159/
+mv val/ILSVRC2012_val_00016400.JPEG n03598930/
+mv val/ILSVRC2012_val_00016401.JPEG n02107142/
+mv val/ILSVRC2012_val_00016402.JPEG n03290653/
+mv val/ILSVRC2012_val_00016403.JPEG n02791124/
+mv val/ILSVRC2012_val_00016404.JPEG n03803284/
+mv val/ILSVRC2012_val_00016405.JPEG n03937543/
+mv val/ILSVRC2012_val_00016406.JPEG n03388043/
+mv val/ILSVRC2012_val_00016407.JPEG n03131574/
+mv val/ILSVRC2012_val_00016408.JPEG n02788148/
+mv val/ILSVRC2012_val_00016409.JPEG n02106382/
+mv val/ILSVRC2012_val_00016410.JPEG n04467665/
+mv val/ILSVRC2012_val_00016411.JPEG n02100877/
+mv val/ILSVRC2012_val_00016412.JPEG n04330267/
+mv val/ILSVRC2012_val_00016413.JPEG n03697007/
+mv val/ILSVRC2012_val_00016414.JPEG n03710721/
+mv val/ILSVRC2012_val_00016415.JPEG n02403003/
+mv val/ILSVRC2012_val_00016416.JPEG n02108089/
+mv val/ILSVRC2012_val_00016417.JPEG n03017168/
+mv val/ILSVRC2012_val_00016418.JPEG n03733281/
+mv val/ILSVRC2012_val_00016419.JPEG n03792972/
+mv val/ILSVRC2012_val_00016420.JPEG n02105056/
+mv val/ILSVRC2012_val_00016421.JPEG n01806567/
+mv val/ILSVRC2012_val_00016422.JPEG n01630670/
+mv val/ILSVRC2012_val_00016423.JPEG n03337140/
+mv val/ILSVRC2012_val_00016424.JPEG n03467068/
+mv val/ILSVRC2012_val_00016425.JPEG n01873310/
+mv val/ILSVRC2012_val_00016426.JPEG n02398521/
+mv val/ILSVRC2012_val_00016427.JPEG n02013706/
+mv val/ILSVRC2012_val_00016428.JPEG n04120489/
+mv val/ILSVRC2012_val_00016429.JPEG n02708093/
+mv val/ILSVRC2012_val_00016430.JPEG n02110341/
+mv val/ILSVRC2012_val_00016431.JPEG n03770679/
+mv val/ILSVRC2012_val_00016432.JPEG n02480495/
+mv val/ILSVRC2012_val_00016433.JPEG n03450230/
+mv val/ILSVRC2012_val_00016434.JPEG n03584254/
+mv val/ILSVRC2012_val_00016435.JPEG n02823750/
+mv val/ILSVRC2012_val_00016436.JPEG n04127249/
+mv val/ILSVRC2012_val_00016437.JPEG n02410509/
+mv val/ILSVRC2012_val_00016438.JPEG n04562935/
+mv val/ILSVRC2012_val_00016439.JPEG n04019541/
+mv val/ILSVRC2012_val_00016440.JPEG n04613696/
+mv val/ILSVRC2012_val_00016441.JPEG n01632777/
+mv val/ILSVRC2012_val_00016442.JPEG n07836838/
+mv val/ILSVRC2012_val_00016443.JPEG n02114855/
+mv val/ILSVRC2012_val_00016444.JPEG n02100236/
+mv val/ILSVRC2012_val_00016445.JPEG n02102318/
+mv val/ILSVRC2012_val_00016446.JPEG n07831146/
+mv val/ILSVRC2012_val_00016447.JPEG n03742115/
+mv val/ILSVRC2012_val_00016448.JPEG n03662601/
+mv val/ILSVRC2012_val_00016449.JPEG n03720891/
+mv val/ILSVRC2012_val_00016450.JPEG n02804610/
+mv val/ILSVRC2012_val_00016451.JPEG n02107142/
+mv val/ILSVRC2012_val_00016452.JPEG n03733131/
+mv val/ILSVRC2012_val_00016453.JPEG n03791053/
+mv val/ILSVRC2012_val_00016454.JPEG n03991062/
+mv val/ILSVRC2012_val_00016455.JPEG n02808304/
+mv val/ILSVRC2012_val_00016456.JPEG n03594945/
+mv val/ILSVRC2012_val_00016457.JPEG n02749479/
+mv val/ILSVRC2012_val_00016458.JPEG n04562935/
+mv val/ILSVRC2012_val_00016459.JPEG n02134084/
+mv val/ILSVRC2012_val_00016460.JPEG n02342885/
+mv val/ILSVRC2012_val_00016461.JPEG n03538406/
+mv val/ILSVRC2012_val_00016462.JPEG n02107683/
+mv val/ILSVRC2012_val_00016463.JPEG n02012849/
+mv val/ILSVRC2012_val_00016464.JPEG n01682714/
+mv val/ILSVRC2012_val_00016465.JPEG n02988304/
+mv val/ILSVRC2012_val_00016466.JPEG n07932039/
+mv val/ILSVRC2012_val_00016467.JPEG n02206856/
+mv val/ILSVRC2012_val_00016468.JPEG n03447447/
+mv val/ILSVRC2012_val_00016469.JPEG n01753488/
+mv val/ILSVRC2012_val_00016470.JPEG n01755581/
+mv val/ILSVRC2012_val_00016471.JPEG n02119022/
+mv val/ILSVRC2012_val_00016472.JPEG n04597913/
+mv val/ILSVRC2012_val_00016473.JPEG n03314780/
+mv val/ILSVRC2012_val_00016474.JPEG n02865351/
+mv val/ILSVRC2012_val_00016475.JPEG n03459775/
+mv val/ILSVRC2012_val_00016476.JPEG n01530575/
+mv val/ILSVRC2012_val_00016477.JPEG n04335435/
+mv val/ILSVRC2012_val_00016478.JPEG n09288635/
+mv val/ILSVRC2012_val_00016479.JPEG n02769748/
+mv val/ILSVRC2012_val_00016480.JPEG n02256656/
+mv val/ILSVRC2012_val_00016481.JPEG n03131574/
+mv val/ILSVRC2012_val_00016482.JPEG n03770439/
+mv val/ILSVRC2012_val_00016483.JPEG n02123045/
+mv val/ILSVRC2012_val_00016484.JPEG n02096177/
+mv val/ILSVRC2012_val_00016485.JPEG n04131690/
+mv val/ILSVRC2012_val_00016486.JPEG n02397096/
+mv val/ILSVRC2012_val_00016487.JPEG n01798484/
+mv val/ILSVRC2012_val_00016488.JPEG n02107574/
+mv val/ILSVRC2012_val_00016489.JPEG n02113186/
+mv val/ILSVRC2012_val_00016490.JPEG n01855672/
+mv val/ILSVRC2012_val_00016491.JPEG n03791053/
+mv val/ILSVRC2012_val_00016492.JPEG n03770679/
+mv val/ILSVRC2012_val_00016493.JPEG n01983481/
+mv val/ILSVRC2012_val_00016494.JPEG n02093256/
+mv val/ILSVRC2012_val_00016495.JPEG n01968897/
+mv val/ILSVRC2012_val_00016496.JPEG n02692877/
+mv val/ILSVRC2012_val_00016497.JPEG n02356798/
+mv val/ILSVRC2012_val_00016498.JPEG n07875152/
+mv val/ILSVRC2012_val_00016499.JPEG n02107312/
+mv val/ILSVRC2012_val_00016500.JPEG n02837789/
+mv val/ILSVRC2012_val_00016501.JPEG n03042490/
+mv val/ILSVRC2012_val_00016502.JPEG n03188531/
+mv val/ILSVRC2012_val_00016503.JPEG n03447721/
+mv val/ILSVRC2012_val_00016504.JPEG n02825657/
+mv val/ILSVRC2012_val_00016505.JPEG n03868242/
+mv val/ILSVRC2012_val_00016506.JPEG n04552348/
+mv val/ILSVRC2012_val_00016507.JPEG n01770081/
+mv val/ILSVRC2012_val_00016508.JPEG n02095314/
+mv val/ILSVRC2012_val_00016509.JPEG n04204347/
+mv val/ILSVRC2012_val_00016510.JPEG n02087394/
+mv val/ILSVRC2012_val_00016511.JPEG n04065272/
+mv val/ILSVRC2012_val_00016512.JPEG n02132136/
+mv val/ILSVRC2012_val_00016513.JPEG n02134418/
+mv val/ILSVRC2012_val_00016514.JPEG n01632777/
+mv val/ILSVRC2012_val_00016515.JPEG n04325704/
+mv val/ILSVRC2012_val_00016516.JPEG n03776460/
+mv val/ILSVRC2012_val_00016517.JPEG n01955084/
+mv val/ILSVRC2012_val_00016518.JPEG n02129604/
+mv val/ILSVRC2012_val_00016519.JPEG n01644900/
+mv val/ILSVRC2012_val_00016520.JPEG n02101006/
+mv val/ILSVRC2012_val_00016521.JPEG n04357314/
+mv val/ILSVRC2012_val_00016522.JPEG n12985857/
+mv val/ILSVRC2012_val_00016523.JPEG n03670208/
+mv val/ILSVRC2012_val_00016524.JPEG n07760859/
+mv val/ILSVRC2012_val_00016525.JPEG n04067472/
+mv val/ILSVRC2012_val_00016526.JPEG n02099849/
+mv val/ILSVRC2012_val_00016527.JPEG n03770679/
+mv val/ILSVRC2012_val_00016528.JPEG n02978881/
+mv val/ILSVRC2012_val_00016529.JPEG n03623198/
+mv val/ILSVRC2012_val_00016530.JPEG n03717622/
+mv val/ILSVRC2012_val_00016531.JPEG n04536866/
+mv val/ILSVRC2012_val_00016532.JPEG n02835271/
+mv val/ILSVRC2012_val_00016533.JPEG n07717410/
+mv val/ILSVRC2012_val_00016534.JPEG n04429376/
+mv val/ILSVRC2012_val_00016535.JPEG n02869837/
+mv val/ILSVRC2012_val_00016536.JPEG n03124170/
+mv val/ILSVRC2012_val_00016537.JPEG n01632458/
+mv val/ILSVRC2012_val_00016538.JPEG n01531178/
+mv val/ILSVRC2012_val_00016539.JPEG n03127925/
+mv val/ILSVRC2012_val_00016540.JPEG n02097047/
+mv val/ILSVRC2012_val_00016541.JPEG n03950228/
+mv val/ILSVRC2012_val_00016542.JPEG n03028079/
+mv val/ILSVRC2012_val_00016543.JPEG n02107312/
+mv val/ILSVRC2012_val_00016544.JPEG n13052670/
+mv val/ILSVRC2012_val_00016545.JPEG n02090721/
+mv val/ILSVRC2012_val_00016546.JPEG n07711569/
+mv val/ILSVRC2012_val_00016547.JPEG n02091831/
+mv val/ILSVRC2012_val_00016548.JPEG n01530575/
+mv val/ILSVRC2012_val_00016549.JPEG n04146614/
+mv val/ILSVRC2012_val_00016550.JPEG n01667114/
+mv val/ILSVRC2012_val_00016551.JPEG n03958227/
+mv val/ILSVRC2012_val_00016552.JPEG n02098286/
+mv val/ILSVRC2012_val_00016553.JPEG n07871810/
+mv val/ILSVRC2012_val_00016554.JPEG n01980166/
+mv val/ILSVRC2012_val_00016555.JPEG n02412080/
+mv val/ILSVRC2012_val_00016556.JPEG n02500267/
+mv val/ILSVRC2012_val_00016557.JPEG n01924916/
+mv val/ILSVRC2012_val_00016558.JPEG n04254680/
+mv val/ILSVRC2012_val_00016559.JPEG n02480495/
+mv val/ILSVRC2012_val_00016560.JPEG n01774384/
+mv val/ILSVRC2012_val_00016561.JPEG n03216828/
+mv val/ILSVRC2012_val_00016562.JPEG n07711569/
+mv val/ILSVRC2012_val_00016563.JPEG n03026506/
+mv val/ILSVRC2012_val_00016564.JPEG n01749939/
+mv val/ILSVRC2012_val_00016565.JPEG n03344393/
+mv val/ILSVRC2012_val_00016566.JPEG n03938244/
+mv val/ILSVRC2012_val_00016567.JPEG n02098105/
+mv val/ILSVRC2012_val_00016568.JPEG n01986214/
+mv val/ILSVRC2012_val_00016569.JPEG n01917289/
+mv val/ILSVRC2012_val_00016570.JPEG n04418357/
+mv val/ILSVRC2012_val_00016571.JPEG n02058221/
+mv val/ILSVRC2012_val_00016572.JPEG n02106030/
+mv val/ILSVRC2012_val_00016573.JPEG n02966193/
+mv val/ILSVRC2012_val_00016574.JPEG n03032252/
+mv val/ILSVRC2012_val_00016575.JPEG n02206856/
+mv val/ILSVRC2012_val_00016576.JPEG n03063599/
+mv val/ILSVRC2012_val_00016577.JPEG n02107312/
+mv val/ILSVRC2012_val_00016578.JPEG n03843555/
+mv val/ILSVRC2012_val_00016579.JPEG n02108551/
+mv val/ILSVRC2012_val_00016580.JPEG n01855672/
+mv val/ILSVRC2012_val_00016581.JPEG n02107142/
+mv val/ILSVRC2012_val_00016582.JPEG n02102040/
+mv val/ILSVRC2012_val_00016583.JPEG n04357314/
+mv val/ILSVRC2012_val_00016584.JPEG n04505470/
+mv val/ILSVRC2012_val_00016585.JPEG n03529860/
+mv val/ILSVRC2012_val_00016586.JPEG n02437312/
+mv val/ILSVRC2012_val_00016587.JPEG n02129604/
+mv val/ILSVRC2012_val_00016588.JPEG n03773504/
+mv val/ILSVRC2012_val_00016589.JPEG n02100877/
+mv val/ILSVRC2012_val_00016590.JPEG n03877472/
+mv val/ILSVRC2012_val_00016591.JPEG n04501370/
+mv val/ILSVRC2012_val_00016592.JPEG n07880968/
+mv val/ILSVRC2012_val_00016593.JPEG n04458633/
+mv val/ILSVRC2012_val_00016594.JPEG n02167151/
+mv val/ILSVRC2012_val_00016595.JPEG n03721384/
+mv val/ILSVRC2012_val_00016596.JPEG n02102480/
+mv val/ILSVRC2012_val_00016597.JPEG n07579787/
+mv val/ILSVRC2012_val_00016598.JPEG n02123394/
+mv val/ILSVRC2012_val_00016599.JPEG n02484975/
+mv val/ILSVRC2012_val_00016600.JPEG n03942813/
+mv val/ILSVRC2012_val_00016601.JPEG n04270147/
+mv val/ILSVRC2012_val_00016602.JPEG n03777568/
+mv val/ILSVRC2012_val_00016603.JPEG n02085782/
+mv val/ILSVRC2012_val_00016604.JPEG n01729977/
+mv val/ILSVRC2012_val_00016605.JPEG n04404412/
+mv val/ILSVRC2012_val_00016606.JPEG n04311174/
+mv val/ILSVRC2012_val_00016607.JPEG n03160309/
+mv val/ILSVRC2012_val_00016608.JPEG n02454379/
+mv val/ILSVRC2012_val_00016609.JPEG n02096294/
+mv val/ILSVRC2012_val_00016610.JPEG n04065272/
+mv val/ILSVRC2012_val_00016611.JPEG n02483362/
+mv val/ILSVRC2012_val_00016612.JPEG n02364673/
+mv val/ILSVRC2012_val_00016613.JPEG n03100240/
+mv val/ILSVRC2012_val_00016614.JPEG n07873807/
+mv val/ILSVRC2012_val_00016615.JPEG n03594734/
+mv val/ILSVRC2012_val_00016616.JPEG n04344873/
+mv val/ILSVRC2012_val_00016617.JPEG n07590611/
+mv val/ILSVRC2012_val_00016618.JPEG n01883070/
+mv val/ILSVRC2012_val_00016619.JPEG n03770439/
+mv val/ILSVRC2012_val_00016620.JPEG n03141823/
+mv val/ILSVRC2012_val_00016621.JPEG n02133161/
+mv val/ILSVRC2012_val_00016622.JPEG n01689811/
+mv val/ILSVRC2012_val_00016623.JPEG n01833805/
+mv val/ILSVRC2012_val_00016624.JPEG n02814860/
+mv val/ILSVRC2012_val_00016625.JPEG n04367480/
+mv val/ILSVRC2012_val_00016626.JPEG n03710637/
+mv val/ILSVRC2012_val_00016627.JPEG n07714571/
+mv val/ILSVRC2012_val_00016628.JPEG n02071294/
+mv val/ILSVRC2012_val_00016629.JPEG n01768244/
+mv val/ILSVRC2012_val_00016630.JPEG n03388183/
+mv val/ILSVRC2012_val_00016631.JPEG n01847000/
+mv val/ILSVRC2012_val_00016632.JPEG n03325584/
+mv val/ILSVRC2012_val_00016633.JPEG n01667114/
+mv val/ILSVRC2012_val_00016634.JPEG n02236044/
+mv val/ILSVRC2012_val_00016635.JPEG n04141327/
+mv val/ILSVRC2012_val_00016636.JPEG n03467068/
+mv val/ILSVRC2012_val_00016637.JPEG n01687978/
+mv val/ILSVRC2012_val_00016638.JPEG n04285008/
+mv val/ILSVRC2012_val_00016639.JPEG n03483316/
+mv val/ILSVRC2012_val_00016640.JPEG n03447447/
+mv val/ILSVRC2012_val_00016641.JPEG n02264363/
+mv val/ILSVRC2012_val_00016642.JPEG n02097209/
+mv val/ILSVRC2012_val_00016643.JPEG n04501370/
+mv val/ILSVRC2012_val_00016644.JPEG n09468604/
+mv val/ILSVRC2012_val_00016645.JPEG n02930766/
+mv val/ILSVRC2012_val_00016646.JPEG n01917289/
+mv val/ILSVRC2012_val_00016647.JPEG n04554684/
+mv val/ILSVRC2012_val_00016648.JPEG n02979186/
+mv val/ILSVRC2012_val_00016649.JPEG n02442845/
+mv val/ILSVRC2012_val_00016650.JPEG n03345487/
+mv val/ILSVRC2012_val_00016651.JPEG n02486410/
+mv val/ILSVRC2012_val_00016652.JPEG n02841315/
+mv val/ILSVRC2012_val_00016653.JPEG n03899768/
+mv val/ILSVRC2012_val_00016654.JPEG n09399592/
+mv val/ILSVRC2012_val_00016655.JPEG n03344393/
+mv val/ILSVRC2012_val_00016656.JPEG n02088364/
+mv val/ILSVRC2012_val_00016657.JPEG n03763968/
+mv val/ILSVRC2012_val_00016658.JPEG n02105162/
+mv val/ILSVRC2012_val_00016659.JPEG n04235860/
+mv val/ILSVRC2012_val_00016660.JPEG n03903868/
+mv val/ILSVRC2012_val_00016661.JPEG n09428293/
+mv val/ILSVRC2012_val_00016662.JPEG n03661043/
+mv val/ILSVRC2012_val_00016663.JPEG n03249569/
+mv val/ILSVRC2012_val_00016664.JPEG n02268443/
+mv val/ILSVRC2012_val_00016665.JPEG n02444819/
+mv val/ILSVRC2012_val_00016666.JPEG n02116738/
+mv val/ILSVRC2012_val_00016667.JPEG n03902125/
+mv val/ILSVRC2012_val_00016668.JPEG n02093991/
+mv val/ILSVRC2012_val_00016669.JPEG n02110185/
+mv val/ILSVRC2012_val_00016670.JPEG n03832673/
+mv val/ILSVRC2012_val_00016671.JPEG n03983396/
+mv val/ILSVRC2012_val_00016672.JPEG n07716358/
+mv val/ILSVRC2012_val_00016673.JPEG n02113712/
+mv val/ILSVRC2012_val_00016674.JPEG n03887697/
+mv val/ILSVRC2012_val_00016675.JPEG n03424325/
+mv val/ILSVRC2012_val_00016676.JPEG n03958227/
+mv val/ILSVRC2012_val_00016677.JPEG n01534433/
+mv val/ILSVRC2012_val_00016678.JPEG n02086646/
+mv val/ILSVRC2012_val_00016679.JPEG n04591713/
+mv val/ILSVRC2012_val_00016680.JPEG n07753113/
+mv val/ILSVRC2012_val_00016681.JPEG n03841143/
+mv val/ILSVRC2012_val_00016682.JPEG n02790996/
+mv val/ILSVRC2012_val_00016683.JPEG n02165456/
+mv val/ILSVRC2012_val_00016684.JPEG n02009229/
+mv val/ILSVRC2012_val_00016685.JPEG n02814860/
+mv val/ILSVRC2012_val_00016686.JPEG n04462240/
+mv val/ILSVRC2012_val_00016687.JPEG n02730930/
+mv val/ILSVRC2012_val_00016688.JPEG n02085620/
+mv val/ILSVRC2012_val_00016689.JPEG n02098413/
+mv val/ILSVRC2012_val_00016690.JPEG n03337140/
+mv val/ILSVRC2012_val_00016691.JPEG n02807133/
+mv val/ILSVRC2012_val_00016692.JPEG n04263257/
+mv val/ILSVRC2012_val_00016693.JPEG n02108422/
+mv val/ILSVRC2012_val_00016694.JPEG n02138441/
+mv val/ILSVRC2012_val_00016695.JPEG n01630670/
+mv val/ILSVRC2012_val_00016696.JPEG n04008634/
+mv val/ILSVRC2012_val_00016697.JPEG n02113799/
+mv val/ILSVRC2012_val_00016698.JPEG n02643566/
+mv val/ILSVRC2012_val_00016699.JPEG n12057211/
+mv val/ILSVRC2012_val_00016700.JPEG n01665541/
+mv val/ILSVRC2012_val_00016701.JPEG n04404412/
+mv val/ILSVRC2012_val_00016702.JPEG n03691459/
+mv val/ILSVRC2012_val_00016703.JPEG n01729977/
+mv val/ILSVRC2012_val_00016704.JPEG n03290653/
+mv val/ILSVRC2012_val_00016705.JPEG n01924916/
+mv val/ILSVRC2012_val_00016706.JPEG n02486410/
+mv val/ILSVRC2012_val_00016707.JPEG n04332243/
+mv val/ILSVRC2012_val_00016708.JPEG n13052670/
+mv val/ILSVRC2012_val_00016709.JPEG n03598930/
+mv val/ILSVRC2012_val_00016710.JPEG n02437616/
+mv val/ILSVRC2012_val_00016711.JPEG n02093991/
+mv val/ILSVRC2012_val_00016712.JPEG n01729977/
+mv val/ILSVRC2012_val_00016713.JPEG n02115641/
+mv val/ILSVRC2012_val_00016714.JPEG n02825657/
+mv val/ILSVRC2012_val_00016715.JPEG n02786058/
+mv val/ILSVRC2012_val_00016716.JPEG n02788148/
+mv val/ILSVRC2012_val_00016717.JPEG n02094258/
+mv val/ILSVRC2012_val_00016718.JPEG n02793495/
+mv val/ILSVRC2012_val_00016719.JPEG n03388043/
+mv val/ILSVRC2012_val_00016720.JPEG n02128757/
+mv val/ILSVRC2012_val_00016721.JPEG n02443484/
+mv val/ILSVRC2012_val_00016722.JPEG n02088094/
+mv val/ILSVRC2012_val_00016723.JPEG n03110669/
+mv val/ILSVRC2012_val_00016724.JPEG n01985128/
+mv val/ILSVRC2012_val_00016725.JPEG n07714990/
+mv val/ILSVRC2012_val_00016726.JPEG n02869837/
+mv val/ILSVRC2012_val_00016727.JPEG n03595614/
+mv val/ILSVRC2012_val_00016728.JPEG n04592741/
+mv val/ILSVRC2012_val_00016729.JPEG n02127052/
+mv val/ILSVRC2012_val_00016730.JPEG n07880968/
+mv val/ILSVRC2012_val_00016731.JPEG n02643566/
+mv val/ILSVRC2012_val_00016732.JPEG n09256479/
+mv val/ILSVRC2012_val_00016733.JPEG n02356798/
+mv val/ILSVRC2012_val_00016734.JPEG n02509815/
+mv val/ILSVRC2012_val_00016735.JPEG n04487394/
+mv val/ILSVRC2012_val_00016736.JPEG n03721384/
+mv val/ILSVRC2012_val_00016737.JPEG n01728572/
+mv val/ILSVRC2012_val_00016738.JPEG n02992211/
+mv val/ILSVRC2012_val_00016739.JPEG n03877845/
+mv val/ILSVRC2012_val_00016740.JPEG n02231487/
+mv val/ILSVRC2012_val_00016741.JPEG n02445715/
+mv val/ILSVRC2012_val_00016742.JPEG n02095570/
+mv val/ILSVRC2012_val_00016743.JPEG n04579145/
+mv val/ILSVRC2012_val_00016744.JPEG n03706229/
+mv val/ILSVRC2012_val_00016745.JPEG n02107574/
+mv val/ILSVRC2012_val_00016746.JPEG n01833805/
+mv val/ILSVRC2012_val_00016747.JPEG n01629819/
+mv val/ILSVRC2012_val_00016748.JPEG n03445777/
+mv val/ILSVRC2012_val_00016749.JPEG n03710721/
+mv val/ILSVRC2012_val_00016750.JPEG n03014705/
+mv val/ILSVRC2012_val_00016751.JPEG n04336792/
+mv val/ILSVRC2012_val_00016752.JPEG n04311174/
+mv val/ILSVRC2012_val_00016753.JPEG n03724870/
+mv val/ILSVRC2012_val_00016754.JPEG n03920288/
+mv val/ILSVRC2012_val_00016755.JPEG n03063689/
+mv val/ILSVRC2012_val_00016756.JPEG n03908618/
+mv val/ILSVRC2012_val_00016757.JPEG n02085620/
+mv val/ILSVRC2012_val_00016758.JPEG n02699494/
+mv val/ILSVRC2012_val_00016759.JPEG n02096437/
+mv val/ILSVRC2012_val_00016760.JPEG n03804744/
+mv val/ILSVRC2012_val_00016761.JPEG n04209239/
+mv val/ILSVRC2012_val_00016762.JPEG n03249569/
+mv val/ILSVRC2012_val_00016763.JPEG n11939491/
+mv val/ILSVRC2012_val_00016764.JPEG n01882714/
+mv val/ILSVRC2012_val_00016765.JPEG n02129165/
+mv val/ILSVRC2012_val_00016766.JPEG n03773504/
+mv val/ILSVRC2012_val_00016767.JPEG n04346328/
+mv val/ILSVRC2012_val_00016768.JPEG n02102040/
+mv val/ILSVRC2012_val_00016769.JPEG n12620546/
+mv val/ILSVRC2012_val_00016770.JPEG n02177972/
+mv val/ILSVRC2012_val_00016771.JPEG n02066245/
+mv val/ILSVRC2012_val_00016772.JPEG n03492542/
+mv val/ILSVRC2012_val_00016773.JPEG n02090721/
+mv val/ILSVRC2012_val_00016774.JPEG n04482393/
+mv val/ILSVRC2012_val_00016775.JPEG n01914609/
+mv val/ILSVRC2012_val_00016776.JPEG n02174001/
+mv val/ILSVRC2012_val_00016777.JPEG n02233338/
+mv val/ILSVRC2012_val_00016778.JPEG n01693334/
+mv val/ILSVRC2012_val_00016779.JPEG n01665541/
+mv val/ILSVRC2012_val_00016780.JPEG n02280649/
+mv val/ILSVRC2012_val_00016781.JPEG n01514668/
+mv val/ILSVRC2012_val_00016782.JPEG n01641577/
+mv val/ILSVRC2012_val_00016783.JPEG n02107683/
+mv val/ILSVRC2012_val_00016784.JPEG n04040759/
+mv val/ILSVRC2012_val_00016785.JPEG n03355925/
+mv val/ILSVRC2012_val_00016786.JPEG n04579432/
+mv val/ILSVRC2012_val_00016787.JPEG n02280649/
+mv val/ILSVRC2012_val_00016788.JPEG n02361337/
+mv val/ILSVRC2012_val_00016789.JPEG n03937543/
+mv val/ILSVRC2012_val_00016790.JPEG n03891251/
+mv val/ILSVRC2012_val_00016791.JPEG n02492035/
+mv val/ILSVRC2012_val_00016792.JPEG n03759954/
+mv val/ILSVRC2012_val_00016793.JPEG n03763968/
+mv val/ILSVRC2012_val_00016794.JPEG n01582220/
+mv val/ILSVRC2012_val_00016795.JPEG n03866082/
+mv val/ILSVRC2012_val_00016796.JPEG n04086273/
+mv val/ILSVRC2012_val_00016797.JPEG n04330267/
+mv val/ILSVRC2012_val_00016798.JPEG n04476259/
+mv val/ILSVRC2012_val_00016799.JPEG n04118776/
+mv val/ILSVRC2012_val_00016800.JPEG n03180011/
+mv val/ILSVRC2012_val_00016801.JPEG n03838899/
+mv val/ILSVRC2012_val_00016802.JPEG n03627232/
+mv val/ILSVRC2012_val_00016803.JPEG n04264628/
+mv val/ILSVRC2012_val_00016804.JPEG n02101006/
+mv val/ILSVRC2012_val_00016805.JPEG n02113624/
+mv val/ILSVRC2012_val_00016806.JPEG n02395406/
+mv val/ILSVRC2012_val_00016807.JPEG n01675722/
+mv val/ILSVRC2012_val_00016808.JPEG n04090263/
+mv val/ILSVRC2012_val_00016809.JPEG n03785016/
+mv val/ILSVRC2012_val_00016810.JPEG n02137549/
+mv val/ILSVRC2012_val_00016811.JPEG n02277742/
+mv val/ILSVRC2012_val_00016812.JPEG n03642806/
+mv val/ILSVRC2012_val_00016813.JPEG n07718472/
+mv val/ILSVRC2012_val_00016814.JPEG n03447447/
+mv val/ILSVRC2012_val_00016815.JPEG n03792782/
+mv val/ILSVRC2012_val_00016816.JPEG n04008634/
+mv val/ILSVRC2012_val_00016817.JPEG n04254777/
+mv val/ILSVRC2012_val_00016818.JPEG n01631663/
+mv val/ILSVRC2012_val_00016819.JPEG n04254680/
+mv val/ILSVRC2012_val_00016820.JPEG n02074367/
+mv val/ILSVRC2012_val_00016821.JPEG n01744401/
+mv val/ILSVRC2012_val_00016822.JPEG n03127747/
+mv val/ILSVRC2012_val_00016823.JPEG n02190166/
+mv val/ILSVRC2012_val_00016824.JPEG n03623198/
+mv val/ILSVRC2012_val_00016825.JPEG n02607072/
+mv val/ILSVRC2012_val_00016826.JPEG n02877765/
+mv val/ILSVRC2012_val_00016827.JPEG n02790996/
+mv val/ILSVRC2012_val_00016828.JPEG n02992529/
+mv val/ILSVRC2012_val_00016829.JPEG n02492660/
+mv val/ILSVRC2012_val_00016830.JPEG n02117135/
+mv val/ILSVRC2012_val_00016831.JPEG n01580077/
+mv val/ILSVRC2012_val_00016832.JPEG n03028079/
+mv val/ILSVRC2012_val_00016833.JPEG n02102040/
+mv val/ILSVRC2012_val_00016834.JPEG n01494475/
+mv val/ILSVRC2012_val_00016835.JPEG n04461696/
+mv val/ILSVRC2012_val_00016836.JPEG n01917289/
+mv val/ILSVRC2012_val_00016837.JPEG n04146614/
+mv val/ILSVRC2012_val_00016838.JPEG n04004767/
+mv val/ILSVRC2012_val_00016839.JPEG n02906734/
+mv val/ILSVRC2012_val_00016840.JPEG n01560419/
+mv val/ILSVRC2012_val_00016841.JPEG n02085936/
+mv val/ILSVRC2012_val_00016842.JPEG n12267677/
+mv val/ILSVRC2012_val_00016843.JPEG n03075370/
+mv val/ILSVRC2012_val_00016844.JPEG n01682714/
+mv val/ILSVRC2012_val_00016845.JPEG n02669723/
+mv val/ILSVRC2012_val_00016846.JPEG n01751748/
+mv val/ILSVRC2012_val_00016847.JPEG n02999410/
+mv val/ILSVRC2012_val_00016848.JPEG n10148035/
+mv val/ILSVRC2012_val_00016849.JPEG n02797295/
+mv val/ILSVRC2012_val_00016850.JPEG n03958227/
+mv val/ILSVRC2012_val_00016851.JPEG n03134739/
+mv val/ILSVRC2012_val_00016852.JPEG n01860187/
+mv val/ILSVRC2012_val_00016853.JPEG n02443114/
+mv val/ILSVRC2012_val_00016854.JPEG n03028079/
+mv val/ILSVRC2012_val_00016855.JPEG n03495258/
+mv val/ILSVRC2012_val_00016856.JPEG n03787032/
+mv val/ILSVRC2012_val_00016857.JPEG n02108089/
+mv val/ILSVRC2012_val_00016858.JPEG n01687978/
+mv val/ILSVRC2012_val_00016859.JPEG n01484850/
+mv val/ILSVRC2012_val_00016860.JPEG n02098105/
+mv val/ILSVRC2012_val_00016861.JPEG n03942813/
+mv val/ILSVRC2012_val_00016862.JPEG n02109525/
+mv val/ILSVRC2012_val_00016863.JPEG n04613696/
+mv val/ILSVRC2012_val_00016864.JPEG n01631663/
+mv val/ILSVRC2012_val_00016865.JPEG n09835506/
+mv val/ILSVRC2012_val_00016866.JPEG n01784675/
+mv val/ILSVRC2012_val_00016867.JPEG n02137549/
+mv val/ILSVRC2012_val_00016868.JPEG n09472597/
+mv val/ILSVRC2012_val_00016869.JPEG n02895154/
+mv val/ILSVRC2012_val_00016870.JPEG n03676483/
+mv val/ILSVRC2012_val_00016871.JPEG n04209239/
+mv val/ILSVRC2012_val_00016872.JPEG n01784675/
+mv val/ILSVRC2012_val_00016873.JPEG n03028079/
+mv val/ILSVRC2012_val_00016874.JPEG n03355925/
+mv val/ILSVRC2012_val_00016875.JPEG n03483316/
+mv val/ILSVRC2012_val_00016876.JPEG n03337140/
+mv val/ILSVRC2012_val_00016877.JPEG n03495258/
+mv val/ILSVRC2012_val_00016878.JPEG n04311004/
+mv val/ILSVRC2012_val_00016879.JPEG n04270147/
+mv val/ILSVRC2012_val_00016880.JPEG n03791053/
+mv val/ILSVRC2012_val_00016881.JPEG n02488702/
+mv val/ILSVRC2012_val_00016882.JPEG n02895154/
+mv val/ILSVRC2012_val_00016883.JPEG n02100583/
+mv val/ILSVRC2012_val_00016884.JPEG n10565667/
+mv val/ILSVRC2012_val_00016885.JPEG n04548280/
+mv val/ILSVRC2012_val_00016886.JPEG n02091134/
+mv val/ILSVRC2012_val_00016887.JPEG n01806567/
+mv val/ILSVRC2012_val_00016888.JPEG n02264363/
+mv val/ILSVRC2012_val_00016889.JPEG n02708093/
+mv val/ILSVRC2012_val_00016890.JPEG n02111277/
+mv val/ILSVRC2012_val_00016891.JPEG n02692877/
+mv val/ILSVRC2012_val_00016892.JPEG n03837869/
+mv val/ILSVRC2012_val_00016893.JPEG n03240683/
+mv val/ILSVRC2012_val_00016894.JPEG n03773504/
+mv val/ILSVRC2012_val_00016895.JPEG n03706229/
+mv val/ILSVRC2012_val_00016896.JPEG n03742115/
+mv val/ILSVRC2012_val_00016897.JPEG n01734418/
+mv val/ILSVRC2012_val_00016898.JPEG n12998815/
+mv val/ILSVRC2012_val_00016899.JPEG n03452741/
+mv val/ILSVRC2012_val_00016900.JPEG n06596364/
+mv val/ILSVRC2012_val_00016901.JPEG n03041632/
+mv val/ILSVRC2012_val_00016902.JPEG n02096585/
+mv val/ILSVRC2012_val_00016903.JPEG n04317175/
+mv val/ILSVRC2012_val_00016904.JPEG n07892512/
+mv val/ILSVRC2012_val_00016905.JPEG n01755581/
+mv val/ILSVRC2012_val_00016906.JPEG n03777568/
+mv val/ILSVRC2012_val_00016907.JPEG n03457902/
+mv val/ILSVRC2012_val_00016908.JPEG n02106382/
+mv val/ILSVRC2012_val_00016909.JPEG n01601694/
+mv val/ILSVRC2012_val_00016910.JPEG n03691459/
+mv val/ILSVRC2012_val_00016911.JPEG n02114855/
+mv val/ILSVRC2012_val_00016912.JPEG n03461385/
+mv val/ILSVRC2012_val_00016913.JPEG n02096294/
+mv val/ILSVRC2012_val_00016914.JPEG n03498962/
+mv val/ILSVRC2012_val_00016915.JPEG n04482393/
+mv val/ILSVRC2012_val_00016916.JPEG n02412080/
+mv val/ILSVRC2012_val_00016917.JPEG n03857828/
+mv val/ILSVRC2012_val_00016918.JPEG n02124075/
+mv val/ILSVRC2012_val_00016919.JPEG n02106550/
+mv val/ILSVRC2012_val_00016920.JPEG n03950228/
+mv val/ILSVRC2012_val_00016921.JPEG n07730033/
+mv val/ILSVRC2012_val_00016922.JPEG n02093991/
+mv val/ILSVRC2012_val_00016923.JPEG n07768694/
+mv val/ILSVRC2012_val_00016924.JPEG n02870880/
+mv val/ILSVRC2012_val_00016925.JPEG n02672831/
+mv val/ILSVRC2012_val_00016926.JPEG n02268443/
+mv val/ILSVRC2012_val_00016927.JPEG n03773504/
+mv val/ILSVRC2012_val_00016928.JPEG n09332890/
+mv val/ILSVRC2012_val_00016929.JPEG n02025239/
+mv val/ILSVRC2012_val_00016930.JPEG n04562935/
+mv val/ILSVRC2012_val_00016931.JPEG n07742313/
+mv val/ILSVRC2012_val_00016932.JPEG n04192698/
+mv val/ILSVRC2012_val_00016933.JPEG n04049303/
+mv val/ILSVRC2012_val_00016934.JPEG n01644900/
+mv val/ILSVRC2012_val_00016935.JPEG n02769748/
+mv val/ILSVRC2012_val_00016936.JPEG n01774384/
+mv val/ILSVRC2012_val_00016937.JPEG n02894605/
+mv val/ILSVRC2012_val_00016938.JPEG n03127747/
+mv val/ILSVRC2012_val_00016939.JPEG n03045698/
+mv val/ILSVRC2012_val_00016940.JPEG n03388549/
+mv val/ILSVRC2012_val_00016941.JPEG n03724870/
+mv val/ILSVRC2012_val_00016942.JPEG n03706229/
+mv val/ILSVRC2012_val_00016943.JPEG n03825788/
+mv val/ILSVRC2012_val_00016944.JPEG n01775062/
+mv val/ILSVRC2012_val_00016945.JPEG n03670208/
+mv val/ILSVRC2012_val_00016946.JPEG n02492035/
+mv val/ILSVRC2012_val_00016947.JPEG n01983481/
+mv val/ILSVRC2012_val_00016948.JPEG n04435653/
+mv val/ILSVRC2012_val_00016949.JPEG n03028079/
+mv val/ILSVRC2012_val_00016950.JPEG n03445924/
+mv val/ILSVRC2012_val_00016951.JPEG n02108000/
+mv val/ILSVRC2012_val_00016952.JPEG n01882714/
+mv val/ILSVRC2012_val_00016953.JPEG n02346627/
+mv val/ILSVRC2012_val_00016954.JPEG n09399592/
+mv val/ILSVRC2012_val_00016955.JPEG n12620546/
+mv val/ILSVRC2012_val_00016956.JPEG n03047690/
+mv val/ILSVRC2012_val_00016957.JPEG n02807133/
+mv val/ILSVRC2012_val_00016958.JPEG n03630383/
+mv val/ILSVRC2012_val_00016959.JPEG n03325584/
+mv val/ILSVRC2012_val_00016960.JPEG n02110063/
+mv val/ILSVRC2012_val_00016961.JPEG n07860988/
+mv val/ILSVRC2012_val_00016962.JPEG n01443537/
+mv val/ILSVRC2012_val_00016963.JPEG n04523525/
+mv val/ILSVRC2012_val_00016964.JPEG n02112706/
+mv val/ILSVRC2012_val_00016965.JPEG n02815834/
+mv val/ILSVRC2012_val_00016966.JPEG n03720891/
+mv val/ILSVRC2012_val_00016967.JPEG n03843555/
+mv val/ILSVRC2012_val_00016968.JPEG n02992211/
+mv val/ILSVRC2012_val_00016969.JPEG n02107908/
+mv val/ILSVRC2012_val_00016970.JPEG n03662601/
+mv val/ILSVRC2012_val_00016971.JPEG n03207743/
+mv val/ILSVRC2012_val_00016972.JPEG n04507155/
+mv val/ILSVRC2012_val_00016973.JPEG n02094433/
+mv val/ILSVRC2012_val_00016974.JPEG n02791270/
+mv val/ILSVRC2012_val_00016975.JPEG n02788148/
+mv val/ILSVRC2012_val_00016976.JPEG n02094258/
+mv val/ILSVRC2012_val_00016977.JPEG n02105162/
+mv val/ILSVRC2012_val_00016978.JPEG n04179913/
+mv val/ILSVRC2012_val_00016979.JPEG n07930864/
+mv val/ILSVRC2012_val_00016980.JPEG n03873416/
+mv val/ILSVRC2012_val_00016981.JPEG n02027492/
+mv val/ILSVRC2012_val_00016982.JPEG n02790996/
+mv val/ILSVRC2012_val_00016983.JPEG n03924679/
+mv val/ILSVRC2012_val_00016984.JPEG n07753275/
+mv val/ILSVRC2012_val_00016985.JPEG n03658185/
+mv val/ILSVRC2012_val_00016986.JPEG n02444819/
+mv val/ILSVRC2012_val_00016987.JPEG n07802026/
+mv val/ILSVRC2012_val_00016988.JPEG n01484850/
+mv val/ILSVRC2012_val_00016989.JPEG n02113186/
+mv val/ILSVRC2012_val_00016990.JPEG n02110341/
+mv val/ILSVRC2012_val_00016991.JPEG n02090622/
+mv val/ILSVRC2012_val_00016992.JPEG n04366367/
+mv val/ILSVRC2012_val_00016993.JPEG n01773157/
+mv val/ILSVRC2012_val_00016994.JPEG n03792972/
+mv val/ILSVRC2012_val_00016995.JPEG n02690373/
+mv val/ILSVRC2012_val_00016996.JPEG n02090622/
+mv val/ILSVRC2012_val_00016997.JPEG n06794110/
+mv val/ILSVRC2012_val_00016998.JPEG n02101388/
+mv val/ILSVRC2012_val_00016999.JPEG n07697313/
+mv val/ILSVRC2012_val_00017000.JPEG n03297495/
+mv val/ILSVRC2012_val_00017001.JPEG n03032252/
+mv val/ILSVRC2012_val_00017002.JPEG n01688243/
+mv val/ILSVRC2012_val_00017003.JPEG n02090379/
+mv val/ILSVRC2012_val_00017004.JPEG n02017213/
+mv val/ILSVRC2012_val_00017005.JPEG n04152593/
+mv val/ILSVRC2012_val_00017006.JPEG n02108551/
+mv val/ILSVRC2012_val_00017007.JPEG n03658185/
+mv val/ILSVRC2012_val_00017008.JPEG n02643566/
+mv val/ILSVRC2012_val_00017009.JPEG n04049303/
+mv val/ILSVRC2012_val_00017010.JPEG n03544143/
+mv val/ILSVRC2012_val_00017011.JPEG n03709823/
+mv val/ILSVRC2012_val_00017012.JPEG n01632458/
+mv val/ILSVRC2012_val_00017013.JPEG n02111500/
+mv val/ILSVRC2012_val_00017014.JPEG n07717556/
+mv val/ILSVRC2012_val_00017015.JPEG n01688243/
+mv val/ILSVRC2012_val_00017016.JPEG n07747607/
+mv val/ILSVRC2012_val_00017017.JPEG n01592084/
+mv val/ILSVRC2012_val_00017018.JPEG n03485794/
+mv val/ILSVRC2012_val_00017019.JPEG n02443114/
+mv val/ILSVRC2012_val_00017020.JPEG n03888257/
+mv val/ILSVRC2012_val_00017021.JPEG n07753592/
+mv val/ILSVRC2012_val_00017022.JPEG n01930112/
+mv val/ILSVRC2012_val_00017023.JPEG n03127747/
+mv val/ILSVRC2012_val_00017024.JPEG n01580077/
+mv val/ILSVRC2012_val_00017025.JPEG n12057211/
+mv val/ILSVRC2012_val_00017026.JPEG n03344393/
+mv val/ILSVRC2012_val_00017027.JPEG n03697007/
+mv val/ILSVRC2012_val_00017028.JPEG n01601694/
+mv val/ILSVRC2012_val_00017029.JPEG n01818515/
+mv val/ILSVRC2012_val_00017030.JPEG n04517823/
+mv val/ILSVRC2012_val_00017031.JPEG n04584207/
+mv val/ILSVRC2012_val_00017032.JPEG n02002724/
+mv val/ILSVRC2012_val_00017033.JPEG n03424325/
+mv val/ILSVRC2012_val_00017034.JPEG n03895866/
+mv val/ILSVRC2012_val_00017035.JPEG n03787032/
+mv val/ILSVRC2012_val_00017036.JPEG n02100236/
+mv val/ILSVRC2012_val_00017037.JPEG n03110669/
+mv val/ILSVRC2012_val_00017038.JPEG n04523525/
+mv val/ILSVRC2012_val_00017039.JPEG n01983481/
+mv val/ILSVRC2012_val_00017040.JPEG n04465501/
+mv val/ILSVRC2012_val_00017041.JPEG n02090721/
+mv val/ILSVRC2012_val_00017042.JPEG n02980441/
+mv val/ILSVRC2012_val_00017043.JPEG n02088094/
+mv val/ILSVRC2012_val_00017044.JPEG n02492035/
+mv val/ILSVRC2012_val_00017045.JPEG n03109150/
+mv val/ILSVRC2012_val_00017046.JPEG n02091635/
+mv val/ILSVRC2012_val_00017047.JPEG n07695742/
+mv val/ILSVRC2012_val_00017048.JPEG n02074367/
+mv val/ILSVRC2012_val_00017049.JPEG n07754684/
+mv val/ILSVRC2012_val_00017050.JPEG n02783161/
+mv val/ILSVRC2012_val_00017051.JPEG n03761084/
+mv val/ILSVRC2012_val_00017052.JPEG n02096585/
+mv val/ILSVRC2012_val_00017053.JPEG n04099969/
+mv val/ILSVRC2012_val_00017054.JPEG n01930112/
+mv val/ILSVRC2012_val_00017055.JPEG n03379051/
+mv val/ILSVRC2012_val_00017056.JPEG n02105412/
+mv val/ILSVRC2012_val_00017057.JPEG n02097298/
+mv val/ILSVRC2012_val_00017058.JPEG n04026417/
+mv val/ILSVRC2012_val_00017059.JPEG n03866082/
+mv val/ILSVRC2012_val_00017060.JPEG n04004767/
+mv val/ILSVRC2012_val_00017061.JPEG n01704323/
+mv val/ILSVRC2012_val_00017062.JPEG n04286575/
+mv val/ILSVRC2012_val_00017063.JPEG n02321529/
+mv val/ILSVRC2012_val_00017064.JPEG n04417672/
+mv val/ILSVRC2012_val_00017065.JPEG n04389033/
+mv val/ILSVRC2012_val_00017066.JPEG n02909870/
+mv val/ILSVRC2012_val_00017067.JPEG n01685808/
+mv val/ILSVRC2012_val_00017068.JPEG n01806143/
+mv val/ILSVRC2012_val_00017069.JPEG n02006656/
+mv val/ILSVRC2012_val_00017070.JPEG n03832673/
+mv val/ILSVRC2012_val_00017071.JPEG n07697313/
+mv val/ILSVRC2012_val_00017072.JPEG n07932039/
+mv val/ILSVRC2012_val_00017073.JPEG n02206856/
+mv val/ILSVRC2012_val_00017074.JPEG n12144580/
+mv val/ILSVRC2012_val_00017075.JPEG n02108422/
+mv val/ILSVRC2012_val_00017076.JPEG n07753113/
+mv val/ILSVRC2012_val_00017077.JPEG n03777754/
+mv val/ILSVRC2012_val_00017078.JPEG n04259630/
+mv val/ILSVRC2012_val_00017079.JPEG n02641379/
+mv val/ILSVRC2012_val_00017080.JPEG n13052670/
+mv val/ILSVRC2012_val_00017081.JPEG n03788365/
+mv val/ILSVRC2012_val_00017082.JPEG n02870880/
+mv val/ILSVRC2012_val_00017083.JPEG n02799071/
+mv val/ILSVRC2012_val_00017084.JPEG n02137549/
+mv val/ILSVRC2012_val_00017085.JPEG n02999410/
+mv val/ILSVRC2012_val_00017086.JPEG n04317175/
+mv val/ILSVRC2012_val_00017087.JPEG n02094114/
+mv val/ILSVRC2012_val_00017088.JPEG n03529860/
+mv val/ILSVRC2012_val_00017089.JPEG n03188531/
+mv val/ILSVRC2012_val_00017090.JPEG n03160309/
+mv val/ILSVRC2012_val_00017091.JPEG n03697007/
+mv val/ILSVRC2012_val_00017092.JPEG n02091831/
+mv val/ILSVRC2012_val_00017093.JPEG n03594734/
+mv val/ILSVRC2012_val_00017094.JPEG n04389033/
+mv val/ILSVRC2012_val_00017095.JPEG n02799071/
+mv val/ILSVRC2012_val_00017096.JPEG n07747607/
+mv val/ILSVRC2012_val_00017097.JPEG n02504458/
+mv val/ILSVRC2012_val_00017098.JPEG n04277352/
+mv val/ILSVRC2012_val_00017099.JPEG n01914609/
+mv val/ILSVRC2012_val_00017100.JPEG n02281787/
+mv val/ILSVRC2012_val_00017101.JPEG n03868863/
+mv val/ILSVRC2012_val_00017102.JPEG n09421951/
+mv val/ILSVRC2012_val_00017103.JPEG n03792782/
+mv val/ILSVRC2012_val_00017104.JPEG n02102318/
+mv val/ILSVRC2012_val_00017105.JPEG n01484850/
+mv val/ILSVRC2012_val_00017106.JPEG n04192698/
+mv val/ILSVRC2012_val_00017107.JPEG n02089867/
+mv val/ILSVRC2012_val_00017108.JPEG n03584254/
+mv val/ILSVRC2012_val_00017109.JPEG n01728572/
+mv val/ILSVRC2012_val_00017110.JPEG n03062245/
+mv val/ILSVRC2012_val_00017111.JPEG n02109047/
+mv val/ILSVRC2012_val_00017112.JPEG n02108422/
+mv val/ILSVRC2012_val_00017113.JPEG n02088632/
+mv val/ILSVRC2012_val_00017114.JPEG n02447366/
+mv val/ILSVRC2012_val_00017115.JPEG n02236044/
+mv val/ILSVRC2012_val_00017116.JPEG n02910353/
+mv val/ILSVRC2012_val_00017117.JPEG n02105056/
+mv val/ILSVRC2012_val_00017118.JPEG n03498962/
+mv val/ILSVRC2012_val_00017119.JPEG n03250847/
+mv val/ILSVRC2012_val_00017120.JPEG n04120489/
+mv val/ILSVRC2012_val_00017121.JPEG n02999410/
+mv val/ILSVRC2012_val_00017122.JPEG n03467068/
+mv val/ILSVRC2012_val_00017123.JPEG n03187595/
+mv val/ILSVRC2012_val_00017124.JPEG n03255030/
+mv val/ILSVRC2012_val_00017125.JPEG n04004767/
+mv val/ILSVRC2012_val_00017126.JPEG n02091635/
+mv val/ILSVRC2012_val_00017127.JPEG n04507155/
+mv val/ILSVRC2012_val_00017128.JPEG n03782006/
+mv val/ILSVRC2012_val_00017129.JPEG n02317335/
+mv val/ILSVRC2012_val_00017130.JPEG n02165456/
+mv val/ILSVRC2012_val_00017131.JPEG n04243546/
+mv val/ILSVRC2012_val_00017132.JPEG n02099849/
+mv val/ILSVRC2012_val_00017133.JPEG n04239074/
+mv val/ILSVRC2012_val_00017134.JPEG n09246464/
+mv val/ILSVRC2012_val_00017135.JPEG n04335435/
+mv val/ILSVRC2012_val_00017136.JPEG n03770439/
+mv val/ILSVRC2012_val_00017137.JPEG n01978455/
+mv val/ILSVRC2012_val_00017138.JPEG n01644373/
+mv val/ILSVRC2012_val_00017139.JPEG n02256656/
+mv val/ILSVRC2012_val_00017140.JPEG n02509815/
+mv val/ILSVRC2012_val_00017141.JPEG n03584254/
+mv val/ILSVRC2012_val_00017142.JPEG n03710721/
+mv val/ILSVRC2012_val_00017143.JPEG n01795545/
+mv val/ILSVRC2012_val_00017144.JPEG n07753592/
+mv val/ILSVRC2012_val_00017145.JPEG n02412080/
+mv val/ILSVRC2012_val_00017146.JPEG n07892512/
+mv val/ILSVRC2012_val_00017147.JPEG n02091032/
+mv val/ILSVRC2012_val_00017148.JPEG n04074963/
+mv val/ILSVRC2012_val_00017149.JPEG n03197337/
+mv val/ILSVRC2012_val_00017150.JPEG n03075370/
+mv val/ILSVRC2012_val_00017151.JPEG n02111129/
+mv val/ILSVRC2012_val_00017152.JPEG n03930630/
+mv val/ILSVRC2012_val_00017153.JPEG n01770081/
+mv val/ILSVRC2012_val_00017154.JPEG n04235860/
+mv val/ILSVRC2012_val_00017155.JPEG n02132136/
+mv val/ILSVRC2012_val_00017156.JPEG n02100735/
+mv val/ILSVRC2012_val_00017157.JPEG n01978287/
+mv val/ILSVRC2012_val_00017158.JPEG n02097658/
+mv val/ILSVRC2012_val_00017159.JPEG n04540053/
+mv val/ILSVRC2012_val_00017160.JPEG n04149813/
+mv val/ILSVRC2012_val_00017161.JPEG n02105251/
+mv val/ILSVRC2012_val_00017162.JPEG n01984695/
+mv val/ILSVRC2012_val_00017163.JPEG n03314780/
+mv val/ILSVRC2012_val_00017164.JPEG n02115641/
+mv val/ILSVRC2012_val_00017165.JPEG n04235860/
+mv val/ILSVRC2012_val_00017166.JPEG n02843684/
+mv val/ILSVRC2012_val_00017167.JPEG n04311004/
+mv val/ILSVRC2012_val_00017168.JPEG n04118776/
+mv val/ILSVRC2012_val_00017169.JPEG n02276258/
+mv val/ILSVRC2012_val_00017170.JPEG n02909870/
+mv val/ILSVRC2012_val_00017171.JPEG n02701002/
+mv val/ILSVRC2012_val_00017172.JPEG n02051845/
+mv val/ILSVRC2012_val_00017173.JPEG n04599235/
+mv val/ILSVRC2012_val_00017174.JPEG n01689811/
+mv val/ILSVRC2012_val_00017175.JPEG n03637318/
+mv val/ILSVRC2012_val_00017176.JPEG n03344393/
+mv val/ILSVRC2012_val_00017177.JPEG n04591713/
+mv val/ILSVRC2012_val_00017178.JPEG n02018795/
+mv val/ILSVRC2012_val_00017179.JPEG n02795169/
+mv val/ILSVRC2012_val_00017180.JPEG n04462240/
+mv val/ILSVRC2012_val_00017181.JPEG n03776460/
+mv val/ILSVRC2012_val_00017182.JPEG n03404251/
+mv val/ILSVRC2012_val_00017183.JPEG n03188531/
+mv val/ILSVRC2012_val_00017184.JPEG n07749582/
+mv val/ILSVRC2012_val_00017185.JPEG n01631663/
+mv val/ILSVRC2012_val_00017186.JPEG n02123597/
+mv val/ILSVRC2012_val_00017187.JPEG n02328150/
+mv val/ILSVRC2012_val_00017188.JPEG n02110958/
+mv val/ILSVRC2012_val_00017189.JPEG n02125311/
+mv val/ILSVRC2012_val_00017190.JPEG n04023962/
+mv val/ILSVRC2012_val_00017191.JPEG n03133878/
+mv val/ILSVRC2012_val_00017192.JPEG n03131574/
+mv val/ILSVRC2012_val_00017193.JPEG n02091467/
+mv val/ILSVRC2012_val_00017194.JPEG n01484850/
+mv val/ILSVRC2012_val_00017195.JPEG n02096177/
+mv val/ILSVRC2012_val_00017196.JPEG n01496331/
+mv val/ILSVRC2012_val_00017197.JPEG n02058221/
+mv val/ILSVRC2012_val_00017198.JPEG n03028079/
+mv val/ILSVRC2012_val_00017199.JPEG n02113023/
+mv val/ILSVRC2012_val_00017200.JPEG n02480855/
+mv val/ILSVRC2012_val_00017201.JPEG n02892201/
+mv val/ILSVRC2012_val_00017202.JPEG n04418357/
+mv val/ILSVRC2012_val_00017203.JPEG n03042490/
+mv val/ILSVRC2012_val_00017204.JPEG n03124170/
+mv val/ILSVRC2012_val_00017205.JPEG n12985857/
+mv val/ILSVRC2012_val_00017206.JPEG n04141975/
+mv val/ILSVRC2012_val_00017207.JPEG n01860187/
+mv val/ILSVRC2012_val_00017208.JPEG n02130308/
+mv val/ILSVRC2012_val_00017209.JPEG n04037443/
+mv val/ILSVRC2012_val_00017210.JPEG n13052670/
+mv val/ILSVRC2012_val_00017211.JPEG n07714571/
+mv val/ILSVRC2012_val_00017212.JPEG n02391049/
+mv val/ILSVRC2012_val_00017213.JPEG n04149813/
+mv val/ILSVRC2012_val_00017214.JPEG n04099969/
+mv val/ILSVRC2012_val_00017215.JPEG n01729977/
+mv val/ILSVRC2012_val_00017216.JPEG n04243546/
+mv val/ILSVRC2012_val_00017217.JPEG n02978881/
+mv val/ILSVRC2012_val_00017218.JPEG n03131574/
+mv val/ILSVRC2012_val_00017219.JPEG n02127052/
+mv val/ILSVRC2012_val_00017220.JPEG n04366367/
+mv val/ILSVRC2012_val_00017221.JPEG n02229544/
+mv val/ILSVRC2012_val_00017222.JPEG n01669191/
+mv val/ILSVRC2012_val_00017223.JPEG n02489166/
+mv val/ILSVRC2012_val_00017224.JPEG n07716906/
+mv val/ILSVRC2012_val_00017225.JPEG n03208938/
+mv val/ILSVRC2012_val_00017226.JPEG n02088466/
+mv val/ILSVRC2012_val_00017227.JPEG n02093754/
+mv val/ILSVRC2012_val_00017228.JPEG n01632777/
+mv val/ILSVRC2012_val_00017229.JPEG n04118538/
+mv val/ILSVRC2012_val_00017230.JPEG n02363005/
+mv val/ILSVRC2012_val_00017231.JPEG n02114855/
+mv val/ILSVRC2012_val_00017232.JPEG n09256479/
+mv val/ILSVRC2012_val_00017233.JPEG n02787622/
+mv val/ILSVRC2012_val_00017234.JPEG n02105412/
+mv val/ILSVRC2012_val_00017235.JPEG n03498962/
+mv val/ILSVRC2012_val_00017236.JPEG n12768682/
+mv val/ILSVRC2012_val_00017237.JPEG n03216828/
+mv val/ILSVRC2012_val_00017238.JPEG n03598930/
+mv val/ILSVRC2012_val_00017239.JPEG n02643566/
+mv val/ILSVRC2012_val_00017240.JPEG n03837869/
+mv val/ILSVRC2012_val_00017241.JPEG n07695742/
+mv val/ILSVRC2012_val_00017242.JPEG n01817953/
+mv val/ILSVRC2012_val_00017243.JPEG n01667778/
+mv val/ILSVRC2012_val_00017244.JPEG n04251144/
+mv val/ILSVRC2012_val_00017245.JPEG n02231487/
+mv val/ILSVRC2012_val_00017246.JPEG n04005630/
+mv val/ILSVRC2012_val_00017247.JPEG n03445777/
+mv val/ILSVRC2012_val_00017248.JPEG n04597913/
+mv val/ILSVRC2012_val_00017249.JPEG n07615774/
+mv val/ILSVRC2012_val_00017250.JPEG n02769748/
+mv val/ILSVRC2012_val_00017251.JPEG n01833805/
+mv val/ILSVRC2012_val_00017252.JPEG n01828970/
+mv val/ILSVRC2012_val_00017253.JPEG n01796340/
+mv val/ILSVRC2012_val_00017254.JPEG n01694178/
+mv val/ILSVRC2012_val_00017255.JPEG n03995372/
+mv val/ILSVRC2012_val_00017256.JPEG n03494278/
+mv val/ILSVRC2012_val_00017257.JPEG n03271574/
+mv val/ILSVRC2012_val_00017258.JPEG n03014705/
+mv val/ILSVRC2012_val_00017259.JPEG n02088632/
+mv val/ILSVRC2012_val_00017260.JPEG n03788195/
+mv val/ILSVRC2012_val_00017261.JPEG n02328150/
+mv val/ILSVRC2012_val_00017262.JPEG n02992529/
+mv val/ILSVRC2012_val_00017263.JPEG n03498962/
+mv val/ILSVRC2012_val_00017264.JPEG n02169497/
+mv val/ILSVRC2012_val_00017265.JPEG n02112137/
+mv val/ILSVRC2012_val_00017266.JPEG n02483362/
+mv val/ILSVRC2012_val_00017267.JPEG n07836838/
+mv val/ILSVRC2012_val_00017268.JPEG n02086240/
+mv val/ILSVRC2012_val_00017269.JPEG n01739381/
+mv val/ILSVRC2012_val_00017270.JPEG n02325366/
+mv val/ILSVRC2012_val_00017271.JPEG n03877472/
+mv val/ILSVRC2012_val_00017272.JPEG n04589890/
+mv val/ILSVRC2012_val_00017273.JPEG n02133161/
+mv val/ILSVRC2012_val_00017274.JPEG n01632777/
+mv val/ILSVRC2012_val_00017275.JPEG n02105162/
+mv val/ILSVRC2012_val_00017276.JPEG n04019541/
+mv val/ILSVRC2012_val_00017277.JPEG n01775062/
+mv val/ILSVRC2012_val_00017278.JPEG n02107574/
+mv val/ILSVRC2012_val_00017279.JPEG n04509417/
+mv val/ILSVRC2012_val_00017280.JPEG n01860187/
+mv val/ILSVRC2012_val_00017281.JPEG n02088632/
+mv val/ILSVRC2012_val_00017282.JPEG n03459775/
+mv val/ILSVRC2012_val_00017283.JPEG n03133878/
+mv val/ILSVRC2012_val_00017284.JPEG n04254680/
+mv val/ILSVRC2012_val_00017285.JPEG n01755581/
+mv val/ILSVRC2012_val_00017286.JPEG n02939185/
+mv val/ILSVRC2012_val_00017287.JPEG n02091134/
+mv val/ILSVRC2012_val_00017288.JPEG n02114712/
+mv val/ILSVRC2012_val_00017289.JPEG n07714990/
+mv val/ILSVRC2012_val_00017290.JPEG n02484975/
+mv val/ILSVRC2012_val_00017291.JPEG n03445924/
+mv val/ILSVRC2012_val_00017292.JPEG n03018349/
+mv val/ILSVRC2012_val_00017293.JPEG n02802426/
+mv val/ILSVRC2012_val_00017294.JPEG n01774384/
+mv val/ILSVRC2012_val_00017295.JPEG n03124043/
+mv val/ILSVRC2012_val_00017296.JPEG n03355925/
+mv val/ILSVRC2012_val_00017297.JPEG n03146219/
+mv val/ILSVRC2012_val_00017298.JPEG n03388183/
+mv val/ILSVRC2012_val_00017299.JPEG n02226429/
+mv val/ILSVRC2012_val_00017300.JPEG n07860988/
+mv val/ILSVRC2012_val_00017301.JPEG n03388183/
+mv val/ILSVRC2012_val_00017302.JPEG n04009552/
+mv val/ILSVRC2012_val_00017303.JPEG n02488291/
+mv val/ILSVRC2012_val_00017304.JPEG n03899768/
+mv val/ILSVRC2012_val_00017305.JPEG n03649909/
+mv val/ILSVRC2012_val_00017306.JPEG n03393912/
+mv val/ILSVRC2012_val_00017307.JPEG n02797295/
+mv val/ILSVRC2012_val_00017308.JPEG n03014705/
+mv val/ILSVRC2012_val_00017309.JPEG n03729826/
+mv val/ILSVRC2012_val_00017310.JPEG n01560419/
+mv val/ILSVRC2012_val_00017311.JPEG n02114367/
+mv val/ILSVRC2012_val_00017312.JPEG n03637318/
+mv val/ILSVRC2012_val_00017313.JPEG n02115641/
+mv val/ILSVRC2012_val_00017314.JPEG n04517823/
+mv val/ILSVRC2012_val_00017315.JPEG n02346627/
+mv val/ILSVRC2012_val_00017316.JPEG n02033041/
+mv val/ILSVRC2012_val_00017317.JPEG n02804414/
+mv val/ILSVRC2012_val_00017318.JPEG n07714990/
+mv val/ILSVRC2012_val_00017319.JPEG n04120489/
+mv val/ILSVRC2012_val_00017320.JPEG n03481172/
+mv val/ILSVRC2012_val_00017321.JPEG n02099267/
+mv val/ILSVRC2012_val_00017322.JPEG n10565667/
+mv val/ILSVRC2012_val_00017323.JPEG n03825788/
+mv val/ILSVRC2012_val_00017324.JPEG n03240683/
+mv val/ILSVRC2012_val_00017325.JPEG n02123597/
+mv val/ILSVRC2012_val_00017326.JPEG n02097130/
+mv val/ILSVRC2012_val_00017327.JPEG n02090721/
+mv val/ILSVRC2012_val_00017328.JPEG n02094433/
+mv val/ILSVRC2012_val_00017329.JPEG n02667093/
+mv val/ILSVRC2012_val_00017330.JPEG n03461385/
+mv val/ILSVRC2012_val_00017331.JPEG n02101388/
+mv val/ILSVRC2012_val_00017332.JPEG n09399592/
+mv val/ILSVRC2012_val_00017333.JPEG n02109047/
+mv val/ILSVRC2012_val_00017334.JPEG n04153751/
+mv val/ILSVRC2012_val_00017335.JPEG n04479046/
+mv val/ILSVRC2012_val_00017336.JPEG n03223299/
+mv val/ILSVRC2012_val_00017337.JPEG n13133613/
+mv val/ILSVRC2012_val_00017338.JPEG n01688243/
+mv val/ILSVRC2012_val_00017339.JPEG n02363005/
+mv val/ILSVRC2012_val_00017340.JPEG n04493381/
+mv val/ILSVRC2012_val_00017341.JPEG n02445715/
+mv val/ILSVRC2012_val_00017342.JPEG n02280649/
+mv val/ILSVRC2012_val_00017343.JPEG n03804744/
+mv val/ILSVRC2012_val_00017344.JPEG n04596742/
+mv val/ILSVRC2012_val_00017345.JPEG n04597913/
+mv val/ILSVRC2012_val_00017346.JPEG n01729322/
+mv val/ILSVRC2012_val_00017347.JPEG n02793495/
+mv val/ILSVRC2012_val_00017348.JPEG n04604644/
+mv val/ILSVRC2012_val_00017349.JPEG n04592741/
+mv val/ILSVRC2012_val_00017350.JPEG n03425413/
+mv val/ILSVRC2012_val_00017351.JPEG n04332243/
+mv val/ILSVRC2012_val_00017352.JPEG n04562935/
+mv val/ILSVRC2012_val_00017353.JPEG n02494079/
+mv val/ILSVRC2012_val_00017354.JPEG n07693725/
+mv val/ILSVRC2012_val_00017355.JPEG n07717410/
+mv val/ILSVRC2012_val_00017356.JPEG n06874185/
+mv val/ILSVRC2012_val_00017357.JPEG n03063689/
+mv val/ILSVRC2012_val_00017358.JPEG n02389026/
+mv val/ILSVRC2012_val_00017359.JPEG n02110627/
+mv val/ILSVRC2012_val_00017360.JPEG n03930630/
+mv val/ILSVRC2012_val_00017361.JPEG n01871265/
+mv val/ILSVRC2012_val_00017362.JPEG n07716358/
+mv val/ILSVRC2012_val_00017363.JPEG n02114712/
+mv val/ILSVRC2012_val_00017364.JPEG n03216828/
+mv val/ILSVRC2012_val_00017365.JPEG n06596364/
+mv val/ILSVRC2012_val_00017366.JPEG n03494278/
+mv val/ILSVRC2012_val_00017367.JPEG n07579787/
+mv val/ILSVRC2012_val_00017368.JPEG n04548280/
+mv val/ILSVRC2012_val_00017369.JPEG n04409515/
+mv val/ILSVRC2012_val_00017370.JPEG n02102040/
+mv val/ILSVRC2012_val_00017371.JPEG n07753113/
+mv val/ILSVRC2012_val_00017372.JPEG n01632777/
+mv val/ILSVRC2012_val_00017373.JPEG n02843684/
+mv val/ILSVRC2012_val_00017374.JPEG n02395406/
+mv val/ILSVRC2012_val_00017375.JPEG n02100583/
+mv val/ILSVRC2012_val_00017376.JPEG n03481172/
+mv val/ILSVRC2012_val_00017377.JPEG n02099849/
+mv val/ILSVRC2012_val_00017378.JPEG n02708093/
+mv val/ILSVRC2012_val_00017379.JPEG n01980166/
+mv val/ILSVRC2012_val_00017380.JPEG n02096294/
+mv val/ILSVRC2012_val_00017381.JPEG n01744401/
+mv val/ILSVRC2012_val_00017382.JPEG n03291819/
+mv val/ILSVRC2012_val_00017383.JPEG n04004767/
+mv val/ILSVRC2012_val_00017384.JPEG n01534433/
+mv val/ILSVRC2012_val_00017385.JPEG n03223299/
+mv val/ILSVRC2012_val_00017386.JPEG n03773504/
+mv val/ILSVRC2012_val_00017387.JPEG n04090263/
+mv val/ILSVRC2012_val_00017388.JPEG n02002724/
+mv val/ILSVRC2012_val_00017389.JPEG n02422106/
+mv val/ILSVRC2012_val_00017390.JPEG n04325704/
+mv val/ILSVRC2012_val_00017391.JPEG n01531178/
+mv val/ILSVRC2012_val_00017392.JPEG n02948072/
+mv val/ILSVRC2012_val_00017393.JPEG n02281787/
+mv val/ILSVRC2012_val_00017394.JPEG n04239074/
+mv val/ILSVRC2012_val_00017395.JPEG n04399382/
+mv val/ILSVRC2012_val_00017396.JPEG n03400231/
+mv val/ILSVRC2012_val_00017397.JPEG n02802426/
+mv val/ILSVRC2012_val_00017398.JPEG n02165456/
+mv val/ILSVRC2012_val_00017399.JPEG n02256656/
+mv val/ILSVRC2012_val_00017400.JPEG n02104029/
+mv val/ILSVRC2012_val_00017401.JPEG n06794110/
+mv val/ILSVRC2012_val_00017402.JPEG n07932039/
+mv val/ILSVRC2012_val_00017403.JPEG n02793495/
+mv val/ILSVRC2012_val_00017404.JPEG n02093754/
+mv val/ILSVRC2012_val_00017405.JPEG n02834397/
+mv val/ILSVRC2012_val_00017406.JPEG n02165456/
+mv val/ILSVRC2012_val_00017407.JPEG n03394916/
+mv val/ILSVRC2012_val_00017408.JPEG n02138441/
+mv val/ILSVRC2012_val_00017409.JPEG n01729977/
+mv val/ILSVRC2012_val_00017410.JPEG n02138441/
+mv val/ILSVRC2012_val_00017411.JPEG n04311174/
+mv val/ILSVRC2012_val_00017412.JPEG n03388043/
+mv val/ILSVRC2012_val_00017413.JPEG n03344393/
+mv val/ILSVRC2012_val_00017414.JPEG n03445924/
+mv val/ILSVRC2012_val_00017415.JPEG n02504013/
+mv val/ILSVRC2012_val_00017416.JPEG n13040303/
+mv val/ILSVRC2012_val_00017417.JPEG n02363005/
+mv val/ILSVRC2012_val_00017418.JPEG n02206856/
+mv val/ILSVRC2012_val_00017419.JPEG n03982430/
+mv val/ILSVRC2012_val_00017420.JPEG n03661043/
+mv val/ILSVRC2012_val_00017421.JPEG n02107574/
+mv val/ILSVRC2012_val_00017422.JPEG n03785016/
+mv val/ILSVRC2012_val_00017423.JPEG n02231487/
+mv val/ILSVRC2012_val_00017424.JPEG n04487394/
+mv val/ILSVRC2012_val_00017425.JPEG n04376876/
+mv val/ILSVRC2012_val_00017426.JPEG n04277352/
+mv val/ILSVRC2012_val_00017427.JPEG n07718472/
+mv val/ILSVRC2012_val_00017428.JPEG n04118776/
+mv val/ILSVRC2012_val_00017429.JPEG n01914609/
+mv val/ILSVRC2012_val_00017430.JPEG n01798484/
+mv val/ILSVRC2012_val_00017431.JPEG n01944390/
+mv val/ILSVRC2012_val_00017432.JPEG n03355925/
+mv val/ILSVRC2012_val_00017433.JPEG n03742115/
+mv val/ILSVRC2012_val_00017434.JPEG n02108089/
+mv val/ILSVRC2012_val_00017435.JPEG n03924679/
+mv val/ILSVRC2012_val_00017436.JPEG n03134739/
+mv val/ILSVRC2012_val_00017437.JPEG n02011460/
+mv val/ILSVRC2012_val_00017438.JPEG n02974003/
+mv val/ILSVRC2012_val_00017439.JPEG n02100583/
+mv val/ILSVRC2012_val_00017440.JPEG n01496331/
+mv val/ILSVRC2012_val_00017441.JPEG n01860187/
+mv val/ILSVRC2012_val_00017442.JPEG n02100236/
+mv val/ILSVRC2012_val_00017443.JPEG n04596742/
+mv val/ILSVRC2012_val_00017444.JPEG n02119789/
+mv val/ILSVRC2012_val_00017445.JPEG n02342885/
+mv val/ILSVRC2012_val_00017446.JPEG n04044716/
+mv val/ILSVRC2012_val_00017447.JPEG n04099969/
+mv val/ILSVRC2012_val_00017448.JPEG n03602883/
+mv val/ILSVRC2012_val_00017449.JPEG n07717556/
+mv val/ILSVRC2012_val_00017450.JPEG n04548280/
+mv val/ILSVRC2012_val_00017451.JPEG n03843555/
+mv val/ILSVRC2012_val_00017452.JPEG n04409515/
+mv val/ILSVRC2012_val_00017453.JPEG n02093647/
+mv val/ILSVRC2012_val_00017454.JPEG n01797886/
+mv val/ILSVRC2012_val_00017455.JPEG n04429376/
+mv val/ILSVRC2012_val_00017456.JPEG n03063599/
+mv val/ILSVRC2012_val_00017457.JPEG n07760859/
+mv val/ILSVRC2012_val_00017458.JPEG n02487347/
+mv val/ILSVRC2012_val_00017459.JPEG n01697457/
+mv val/ILSVRC2012_val_00017460.JPEG n03706229/
+mv val/ILSVRC2012_val_00017461.JPEG n02988304/
+mv val/ILSVRC2012_val_00017462.JPEG n03134739/
+mv val/ILSVRC2012_val_00017463.JPEG n02979186/
+mv val/ILSVRC2012_val_00017464.JPEG n02892201/
+mv val/ILSVRC2012_val_00017465.JPEG n03840681/
+mv val/ILSVRC2012_val_00017466.JPEG n03425413/
+mv val/ILSVRC2012_val_00017467.JPEG n13044778/
+mv val/ILSVRC2012_val_00017468.JPEG n04330267/
+mv val/ILSVRC2012_val_00017469.JPEG n03425413/
+mv val/ILSVRC2012_val_00017470.JPEG n02099849/
+mv val/ILSVRC2012_val_00017471.JPEG n04044716/
+mv val/ILSVRC2012_val_00017472.JPEG n01440764/
+mv val/ILSVRC2012_val_00017473.JPEG n02105251/
+mv val/ILSVRC2012_val_00017474.JPEG n03599486/
+mv val/ILSVRC2012_val_00017475.JPEG n03240683/
+mv val/ILSVRC2012_val_00017476.JPEG n02097130/
+mv val/ILSVRC2012_val_00017477.JPEG n04162706/
+mv val/ILSVRC2012_val_00017478.JPEG n03443371/
+mv val/ILSVRC2012_val_00017479.JPEG n02492660/
+mv val/ILSVRC2012_val_00017480.JPEG n03793489/
+mv val/ILSVRC2012_val_00017481.JPEG n04347754/
+mv val/ILSVRC2012_val_00017482.JPEG n04296562/
+mv val/ILSVRC2012_val_00017483.JPEG n03666591/
+mv val/ILSVRC2012_val_00017484.JPEG n04584207/
+mv val/ILSVRC2012_val_00017485.JPEG n04136333/
+mv val/ILSVRC2012_val_00017486.JPEG n02123159/
+mv val/ILSVRC2012_val_00017487.JPEG n04070727/
+mv val/ILSVRC2012_val_00017488.JPEG n02981792/
+mv val/ILSVRC2012_val_00017489.JPEG n07718472/
+mv val/ILSVRC2012_val_00017490.JPEG n01694178/
+mv val/ILSVRC2012_val_00017491.JPEG n10565667/
+mv val/ILSVRC2012_val_00017492.JPEG n04532670/
+mv val/ILSVRC2012_val_00017493.JPEG n02480495/
+mv val/ILSVRC2012_val_00017494.JPEG n07590611/
+mv val/ILSVRC2012_val_00017495.JPEG n02111277/
+mv val/ILSVRC2012_val_00017496.JPEG n04554684/
+mv val/ILSVRC2012_val_00017497.JPEG n01695060/
+mv val/ILSVRC2012_val_00017498.JPEG n04311004/
+mv val/ILSVRC2012_val_00017499.JPEG n02102480/
+mv val/ILSVRC2012_val_00017500.JPEG n04447861/
+mv val/ILSVRC2012_val_00017501.JPEG n02807133/
+mv val/ILSVRC2012_val_00017502.JPEG n04398044/
+mv val/ILSVRC2012_val_00017503.JPEG n04418357/
+mv val/ILSVRC2012_val_00017504.JPEG n03690938/
+mv val/ILSVRC2012_val_00017505.JPEG n01644373/
+mv val/ILSVRC2012_val_00017506.JPEG n03837869/
+mv val/ILSVRC2012_val_00017507.JPEG n02493793/
+mv val/ILSVRC2012_val_00017508.JPEG n01796340/
+mv val/ILSVRC2012_val_00017509.JPEG n02095889/
+mv val/ILSVRC2012_val_00017510.JPEG n03781244/
+mv val/ILSVRC2012_val_00017511.JPEG n02088466/
+mv val/ILSVRC2012_val_00017512.JPEG n02906734/
+mv val/ILSVRC2012_val_00017513.JPEG n04596742/
+mv val/ILSVRC2012_val_00017514.JPEG n12057211/
+mv val/ILSVRC2012_val_00017515.JPEG n02097658/
+mv val/ILSVRC2012_val_00017516.JPEG n03954731/
+mv val/ILSVRC2012_val_00017517.JPEG n02447366/
+mv val/ILSVRC2012_val_00017518.JPEG n03223299/
+mv val/ILSVRC2012_val_00017519.JPEG n03710637/
+mv val/ILSVRC2012_val_00017520.JPEG n03459775/
+mv val/ILSVRC2012_val_00017521.JPEG n04458633/
+mv val/ILSVRC2012_val_00017522.JPEG n02397096/
+mv val/ILSVRC2012_val_00017523.JPEG n03877472/
+mv val/ILSVRC2012_val_00017524.JPEG n07584110/
+mv val/ILSVRC2012_val_00017525.JPEG n03393912/
+mv val/ILSVRC2012_val_00017526.JPEG n07716906/
+mv val/ILSVRC2012_val_00017527.JPEG n07836838/
+mv val/ILSVRC2012_val_00017528.JPEG n03720891/
+mv val/ILSVRC2012_val_00017529.JPEG n02109961/
+mv val/ILSVRC2012_val_00017530.JPEG n04326547/
+mv val/ILSVRC2012_val_00017531.JPEG n01753488/
+mv val/ILSVRC2012_val_00017532.JPEG n02389026/
+mv val/ILSVRC2012_val_00017533.JPEG n07734744/
+mv val/ILSVRC2012_val_00017534.JPEG n07745940/
+mv val/ILSVRC2012_val_00017535.JPEG n02094114/
+mv val/ILSVRC2012_val_00017536.JPEG n02981792/
+mv val/ILSVRC2012_val_00017537.JPEG n02097298/
+mv val/ILSVRC2012_val_00017538.JPEG n03930630/
+mv val/ILSVRC2012_val_00017539.JPEG n02783161/
+mv val/ILSVRC2012_val_00017540.JPEG n04346328/
+mv val/ILSVRC2012_val_00017541.JPEG n01774750/
+mv val/ILSVRC2012_val_00017542.JPEG n01829413/
+mv val/ILSVRC2012_val_00017543.JPEG n02910353/
+mv val/ILSVRC2012_val_00017544.JPEG n02894605/
+mv val/ILSVRC2012_val_00017545.JPEG n02132136/
+mv val/ILSVRC2012_val_00017546.JPEG n04372370/
+mv val/ILSVRC2012_val_00017547.JPEG n04040759/
+mv val/ILSVRC2012_val_00017548.JPEG n02493509/
+mv val/ILSVRC2012_val_00017549.JPEG n03788195/
+mv val/ILSVRC2012_val_00017550.JPEG n04357314/
+mv val/ILSVRC2012_val_00017551.JPEG n02106166/
+mv val/ILSVRC2012_val_00017552.JPEG n02168699/
+mv val/ILSVRC2012_val_00017553.JPEG n02091831/
+mv val/ILSVRC2012_val_00017554.JPEG n02105056/
+mv val/ILSVRC2012_val_00017555.JPEG n01986214/
+mv val/ILSVRC2012_val_00017556.JPEG n02268443/
+mv val/ILSVRC2012_val_00017557.JPEG n01739381/
+mv val/ILSVRC2012_val_00017558.JPEG n01774384/
+mv val/ILSVRC2012_val_00017559.JPEG n02444819/
+mv val/ILSVRC2012_val_00017560.JPEG n02105641/
+mv val/ILSVRC2012_val_00017561.JPEG n01687978/
+mv val/ILSVRC2012_val_00017562.JPEG n04606251/
+mv val/ILSVRC2012_val_00017563.JPEG n03325584/
+mv val/ILSVRC2012_val_00017564.JPEG n04596742/
+mv val/ILSVRC2012_val_00017565.JPEG n02325366/
+mv val/ILSVRC2012_val_00017566.JPEG n02950826/
+mv val/ILSVRC2012_val_00017567.JPEG n04067472/
+mv val/ILSVRC2012_val_00017568.JPEG n02086646/
+mv val/ILSVRC2012_val_00017569.JPEG n02113799/
+mv val/ILSVRC2012_val_00017570.JPEG n04557648/
+mv val/ILSVRC2012_val_00017571.JPEG n04429376/
+mv val/ILSVRC2012_val_00017572.JPEG n01704323/
+mv val/ILSVRC2012_val_00017573.JPEG n02056570/
+mv val/ILSVRC2012_val_00017574.JPEG n02488291/
+mv val/ILSVRC2012_val_00017575.JPEG n07614500/
+mv val/ILSVRC2012_val_00017576.JPEG n03089624/
+mv val/ILSVRC2012_val_00017577.JPEG n01532829/
+mv val/ILSVRC2012_val_00017578.JPEG n03160309/
+mv val/ILSVRC2012_val_00017579.JPEG n04550184/
+mv val/ILSVRC2012_val_00017580.JPEG n07730033/
+mv val/ILSVRC2012_val_00017581.JPEG n02095570/
+mv val/ILSVRC2012_val_00017582.JPEG n04367480/
+mv val/ILSVRC2012_val_00017583.JPEG n04081281/
+mv val/ILSVRC2012_val_00017584.JPEG n04254120/
+mv val/ILSVRC2012_val_00017585.JPEG n04443257/
+mv val/ILSVRC2012_val_00017586.JPEG n03777568/
+mv val/ILSVRC2012_val_00017587.JPEG n03584829/
+mv val/ILSVRC2012_val_00017588.JPEG n04201297/
+mv val/ILSVRC2012_val_00017589.JPEG n12144580/
+mv val/ILSVRC2012_val_00017590.JPEG n02834397/
+mv val/ILSVRC2012_val_00017591.JPEG n03127925/
+mv val/ILSVRC2012_val_00017592.JPEG n02100735/
+mv val/ILSVRC2012_val_00017593.JPEG n02256656/
+mv val/ILSVRC2012_val_00017594.JPEG n02092002/
+mv val/ILSVRC2012_val_00017595.JPEG n01753488/
+mv val/ILSVRC2012_val_00017596.JPEG n04259630/
+mv val/ILSVRC2012_val_00017597.JPEG n03197337/
+mv val/ILSVRC2012_val_00017598.JPEG n02510455/
+mv val/ILSVRC2012_val_00017599.JPEG n02108422/
+mv val/ILSVRC2012_val_00017600.JPEG n02013706/
+mv val/ILSVRC2012_val_00017601.JPEG n03840681/
+mv val/ILSVRC2012_val_00017602.JPEG n02108089/
+mv val/ILSVRC2012_val_00017603.JPEG n04485082/
+mv val/ILSVRC2012_val_00017604.JPEG n03584829/
+mv val/ILSVRC2012_val_00017605.JPEG n02134084/
+mv val/ILSVRC2012_val_00017606.JPEG n03814639/
+mv val/ILSVRC2012_val_00017607.JPEG n04522168/
+mv val/ILSVRC2012_val_00017608.JPEG n04589890/
+mv val/ILSVRC2012_val_00017609.JPEG n04252225/
+mv val/ILSVRC2012_val_00017610.JPEG n03188531/
+mv val/ILSVRC2012_val_00017611.JPEG n03594945/
+mv val/ILSVRC2012_val_00017612.JPEG n03691459/
+mv val/ILSVRC2012_val_00017613.JPEG n04041544/
+mv val/ILSVRC2012_val_00017614.JPEG n04033901/
+mv val/ILSVRC2012_val_00017615.JPEG n04090263/
+mv val/ILSVRC2012_val_00017616.JPEG n02486410/
+mv val/ILSVRC2012_val_00017617.JPEG n03873416/
+mv val/ILSVRC2012_val_00017618.JPEG n03871628/
+mv val/ILSVRC2012_val_00017619.JPEG n02325366/
+mv val/ILSVRC2012_val_00017620.JPEG n02841315/
+mv val/ILSVRC2012_val_00017621.JPEG n02037110/
+mv val/ILSVRC2012_val_00017622.JPEG n02909870/
+mv val/ILSVRC2012_val_00017623.JPEG n01629819/
+mv val/ILSVRC2012_val_00017624.JPEG n07565083/
+mv val/ILSVRC2012_val_00017625.JPEG n02088094/
+mv val/ILSVRC2012_val_00017626.JPEG n03954731/
+mv val/ILSVRC2012_val_00017627.JPEG n12998815/
+mv val/ILSVRC2012_val_00017628.JPEG n03661043/
+mv val/ILSVRC2012_val_00017629.JPEG n04332243/
+mv val/ILSVRC2012_val_00017630.JPEG n02167151/
+mv val/ILSVRC2012_val_00017631.JPEG n04099969/
+mv val/ILSVRC2012_val_00017632.JPEG n04266014/
+mv val/ILSVRC2012_val_00017633.JPEG n03733131/
+mv val/ILSVRC2012_val_00017634.JPEG n02033041/
+mv val/ILSVRC2012_val_00017635.JPEG n02165456/
+mv val/ILSVRC2012_val_00017636.JPEG n02109047/
+mv val/ILSVRC2012_val_00017637.JPEG n02999410/
+mv val/ILSVRC2012_val_00017638.JPEG n02177972/
+mv val/ILSVRC2012_val_00017639.JPEG n02033041/
+mv val/ILSVRC2012_val_00017640.JPEG n03899768/
+mv val/ILSVRC2012_val_00017641.JPEG n01685808/
+mv val/ILSVRC2012_val_00017642.JPEG n04023962/
+mv val/ILSVRC2012_val_00017643.JPEG n02114712/
+mv val/ILSVRC2012_val_00017644.JPEG n03775546/
+mv val/ILSVRC2012_val_00017645.JPEG n02092002/
+mv val/ILSVRC2012_val_00017646.JPEG n02107142/
+mv val/ILSVRC2012_val_00017647.JPEG n02977058/
+mv val/ILSVRC2012_val_00017648.JPEG n01582220/
+mv val/ILSVRC2012_val_00017649.JPEG n04127249/
+mv val/ILSVRC2012_val_00017650.JPEG n03814906/
+mv val/ILSVRC2012_val_00017651.JPEG n03769881/
+mv val/ILSVRC2012_val_00017652.JPEG n03393912/
+mv val/ILSVRC2012_val_00017653.JPEG n03291819/
+mv val/ILSVRC2012_val_00017654.JPEG n02497673/
+mv val/ILSVRC2012_val_00017655.JPEG n03127925/
+mv val/ILSVRC2012_val_00017656.JPEG n09193705/
+mv val/ILSVRC2012_val_00017657.JPEG n07831146/
+mv val/ILSVRC2012_val_00017658.JPEG n03980874/
+mv val/ILSVRC2012_val_00017659.JPEG n07753113/
+mv val/ILSVRC2012_val_00017660.JPEG n01558993/
+mv val/ILSVRC2012_val_00017661.JPEG n02808304/
+mv val/ILSVRC2012_val_00017662.JPEG n03854065/
+mv val/ILSVRC2012_val_00017663.JPEG n04483307/
+mv val/ILSVRC2012_val_00017664.JPEG n02102040/
+mv val/ILSVRC2012_val_00017665.JPEG n04326547/
+mv val/ILSVRC2012_val_00017666.JPEG n02443484/
+mv val/ILSVRC2012_val_00017667.JPEG n09256479/
+mv val/ILSVRC2012_val_00017668.JPEG n03961711/
+mv val/ILSVRC2012_val_00017669.JPEG n01641577/
+mv val/ILSVRC2012_val_00017670.JPEG n03733131/
+mv val/ILSVRC2012_val_00017671.JPEG n04254680/
+mv val/ILSVRC2012_val_00017672.JPEG n02099601/
+mv val/ILSVRC2012_val_00017673.JPEG n02089078/
+mv val/ILSVRC2012_val_00017674.JPEG n03016953/
+mv val/ILSVRC2012_val_00017675.JPEG n03216828/
+mv val/ILSVRC2012_val_00017676.JPEG n02101388/
+mv val/ILSVRC2012_val_00017677.JPEG n02229544/
+mv val/ILSVRC2012_val_00017678.JPEG n02606052/
+mv val/ILSVRC2012_val_00017679.JPEG n04141076/
+mv val/ILSVRC2012_val_00017680.JPEG n01694178/
+mv val/ILSVRC2012_val_00017681.JPEG n03063689/
+mv val/ILSVRC2012_val_00017682.JPEG n01774384/
+mv val/ILSVRC2012_val_00017683.JPEG n02607072/
+mv val/ILSVRC2012_val_00017684.JPEG n02091244/
+mv val/ILSVRC2012_val_00017685.JPEG n03937543/
+mv val/ILSVRC2012_val_00017686.JPEG n04328186/
+mv val/ILSVRC2012_val_00017687.JPEG n03532672/
+mv val/ILSVRC2012_val_00017688.JPEG n03485407/
+mv val/ILSVRC2012_val_00017689.JPEG n07717556/
+mv val/ILSVRC2012_val_00017690.JPEG n02006656/
+mv val/ILSVRC2012_val_00017691.JPEG n04525305/
+mv val/ILSVRC2012_val_00017692.JPEG n02123597/
+mv val/ILSVRC2012_val_00017693.JPEG n02708093/
+mv val/ILSVRC2012_val_00017694.JPEG n02137549/
+mv val/ILSVRC2012_val_00017695.JPEG n07614500/
+mv val/ILSVRC2012_val_00017696.JPEG n03947888/
+mv val/ILSVRC2012_val_00017697.JPEG n03983396/
+mv val/ILSVRC2012_val_00017698.JPEG n03544143/
+mv val/ILSVRC2012_val_00017699.JPEG n01440764/
+mv val/ILSVRC2012_val_00017700.JPEG n01440764/
+mv val/ILSVRC2012_val_00017701.JPEG n03717622/
+mv val/ILSVRC2012_val_00017702.JPEG n02085620/
+mv val/ILSVRC2012_val_00017703.JPEG n02727426/
+mv val/ILSVRC2012_val_00017704.JPEG n03485794/
+mv val/ILSVRC2012_val_00017705.JPEG n03825788/
+mv val/ILSVRC2012_val_00017706.JPEG n04259630/
+mv val/ILSVRC2012_val_00017707.JPEG n02788148/
+mv val/ILSVRC2012_val_00017708.JPEG n03930630/
+mv val/ILSVRC2012_val_00017709.JPEG n04392985/
+mv val/ILSVRC2012_val_00017710.JPEG n02454379/
+mv val/ILSVRC2012_val_00017711.JPEG n02100236/
+mv val/ILSVRC2012_val_00017712.JPEG n01534433/
+mv val/ILSVRC2012_val_00017713.JPEG n02102318/
+mv val/ILSVRC2012_val_00017714.JPEG n04044716/
+mv val/ILSVRC2012_val_00017715.JPEG n02113186/
+mv val/ILSVRC2012_val_00017716.JPEG n02066245/
+mv val/ILSVRC2012_val_00017717.JPEG n02127052/
+mv val/ILSVRC2012_val_00017718.JPEG n01950731/
+mv val/ILSVRC2012_val_00017719.JPEG n03000684/
+mv val/ILSVRC2012_val_00017720.JPEG n02843684/
+mv val/ILSVRC2012_val_00017721.JPEG n04147183/
+mv val/ILSVRC2012_val_00017722.JPEG n02110063/
+mv val/ILSVRC2012_val_00017723.JPEG n07590611/
+mv val/ILSVRC2012_val_00017724.JPEG n02113712/
+mv val/ILSVRC2012_val_00017725.JPEG n04074963/
+mv val/ILSVRC2012_val_00017726.JPEG n03871628/
+mv val/ILSVRC2012_val_00017727.JPEG n02168699/
+mv val/ILSVRC2012_val_00017728.JPEG n09246464/
+mv val/ILSVRC2012_val_00017729.JPEG n07802026/
+mv val/ILSVRC2012_val_00017730.JPEG n01693334/
+mv val/ILSVRC2012_val_00017731.JPEG n03908714/
+mv val/ILSVRC2012_val_00017732.JPEG n02130308/
+mv val/ILSVRC2012_val_00017733.JPEG n09193705/
+mv val/ILSVRC2012_val_00017734.JPEG n02091244/
+mv val/ILSVRC2012_val_00017735.JPEG n02111500/
+mv val/ILSVRC2012_val_00017736.JPEG n03642806/
+mv val/ILSVRC2012_val_00017737.JPEG n04033901/
+mv val/ILSVRC2012_val_00017738.JPEG n02999410/
+mv val/ILSVRC2012_val_00017739.JPEG n02128925/
+mv val/ILSVRC2012_val_00017740.JPEG n06359193/
+mv val/ILSVRC2012_val_00017741.JPEG n07717410/
+mv val/ILSVRC2012_val_00017742.JPEG n02102318/
+mv val/ILSVRC2012_val_00017743.JPEG n04208210/
+mv val/ILSVRC2012_val_00017744.JPEG n02086079/
+mv val/ILSVRC2012_val_00017745.JPEG n03868863/
+mv val/ILSVRC2012_val_00017746.JPEG n03743016/
+mv val/ILSVRC2012_val_00017747.JPEG n03062245/
+mv val/ILSVRC2012_val_00017748.JPEG n03717622/
+mv val/ILSVRC2012_val_00017749.JPEG n04069434/
+mv val/ILSVRC2012_val_00017750.JPEG n03598930/
+mv val/ILSVRC2012_val_00017751.JPEG n01978287/
+mv val/ILSVRC2012_val_00017752.JPEG n04026417/
+mv val/ILSVRC2012_val_00017753.JPEG n01748264/
+mv val/ILSVRC2012_val_00017754.JPEG n02096294/
+mv val/ILSVRC2012_val_00017755.JPEG n04483307/
+mv val/ILSVRC2012_val_00017756.JPEG n01592084/
+mv val/ILSVRC2012_val_00017757.JPEG n03787032/
+mv val/ILSVRC2012_val_00017758.JPEG n03742115/
+mv val/ILSVRC2012_val_00017759.JPEG n01795545/
+mv val/ILSVRC2012_val_00017760.JPEG n02807133/
+mv val/ILSVRC2012_val_00017761.JPEG n02769748/
+mv val/ILSVRC2012_val_00017762.JPEG n02108915/
+mv val/ILSVRC2012_val_00017763.JPEG n04509417/
+mv val/ILSVRC2012_val_00017764.JPEG n02093754/
+mv val/ILSVRC2012_val_00017765.JPEG n02129604/
+mv val/ILSVRC2012_val_00017766.JPEG n02090622/
+mv val/ILSVRC2012_val_00017767.JPEG n01806567/
+mv val/ILSVRC2012_val_00017768.JPEG n04579432/
+mv val/ILSVRC2012_val_00017769.JPEG n04542943/
+mv val/ILSVRC2012_val_00017770.JPEG n03400231/
+mv val/ILSVRC2012_val_00017771.JPEG n07871810/
+mv val/ILSVRC2012_val_00017772.JPEG n09399592/
+mv val/ILSVRC2012_val_00017773.JPEG n02114367/
+mv val/ILSVRC2012_val_00017774.JPEG n04049303/
+mv val/ILSVRC2012_val_00017775.JPEG n02979186/
+mv val/ILSVRC2012_val_00017776.JPEG n02494079/
+mv val/ILSVRC2012_val_00017777.JPEG n03944341/
+mv val/ILSVRC2012_val_00017778.JPEG n03535780/
+mv val/ILSVRC2012_val_00017779.JPEG n03297495/
+mv val/ILSVRC2012_val_00017780.JPEG n07831146/
+mv val/ILSVRC2012_val_00017781.JPEG n02457408/
+mv val/ILSVRC2012_val_00017782.JPEG n04254680/
+mv val/ILSVRC2012_val_00017783.JPEG n03028079/
+mv val/ILSVRC2012_val_00017784.JPEG n03498962/
+mv val/ILSVRC2012_val_00017785.JPEG n02883205/
+mv val/ILSVRC2012_val_00017786.JPEG n02077923/
+mv val/ILSVRC2012_val_00017787.JPEG n02090721/
+mv val/ILSVRC2012_val_00017788.JPEG n04005630/
+mv val/ILSVRC2012_val_00017789.JPEG n02056570/
+mv val/ILSVRC2012_val_00017790.JPEG n01775062/
+mv val/ILSVRC2012_val_00017791.JPEG n03866082/
+mv val/ILSVRC2012_val_00017792.JPEG n02087394/
+mv val/ILSVRC2012_val_00017793.JPEG n04336792/
+mv val/ILSVRC2012_val_00017794.JPEG n01917289/
+mv val/ILSVRC2012_val_00017795.JPEG n04111531/
+mv val/ILSVRC2012_val_00017796.JPEG n02007558/
+mv val/ILSVRC2012_val_00017797.JPEG n04086273/
+mv val/ILSVRC2012_val_00017798.JPEG n02843684/
+mv val/ILSVRC2012_val_00017799.JPEG n13037406/
+mv val/ILSVRC2012_val_00017800.JPEG n04200800/
+mv val/ILSVRC2012_val_00017801.JPEG n03000684/
+mv val/ILSVRC2012_val_00017802.JPEG n03991062/
+mv val/ILSVRC2012_val_00017803.JPEG n02488702/
+mv val/ILSVRC2012_val_00017804.JPEG n02808440/
+mv val/ILSVRC2012_val_00017805.JPEG n03887697/
+mv val/ILSVRC2012_val_00017806.JPEG n01784675/
+mv val/ILSVRC2012_val_00017807.JPEG n02058221/
+mv val/ILSVRC2012_val_00017808.JPEG n02841315/
+mv val/ILSVRC2012_val_00017809.JPEG n02114367/
+mv val/ILSVRC2012_val_00017810.JPEG n03657121/
+mv val/ILSVRC2012_val_00017811.JPEG n02787622/
+mv val/ILSVRC2012_val_00017812.JPEG n03095699/
+mv val/ILSVRC2012_val_00017813.JPEG n03450230/
+mv val/ILSVRC2012_val_00017814.JPEG n02123394/
+mv val/ILSVRC2012_val_00017815.JPEG n02869837/
+mv val/ILSVRC2012_val_00017816.JPEG n03793489/
+mv val/ILSVRC2012_val_00017817.JPEG n02094258/
+mv val/ILSVRC2012_val_00017818.JPEG n04380533/
+mv val/ILSVRC2012_val_00017819.JPEG n02978881/
+mv val/ILSVRC2012_val_00017820.JPEG n07584110/
+mv val/ILSVRC2012_val_00017821.JPEG n02927161/
+mv val/ILSVRC2012_val_00017822.JPEG n02930766/
+mv val/ILSVRC2012_val_00017823.JPEG n02093428/
+mv val/ILSVRC2012_val_00017824.JPEG n04507155/
+mv val/ILSVRC2012_val_00017825.JPEG n03534580/
+mv val/ILSVRC2012_val_00017826.JPEG n03857828/
+mv val/ILSVRC2012_val_00017827.JPEG n01872401/
+mv val/ILSVRC2012_val_00017828.JPEG n03337140/
+mv val/ILSVRC2012_val_00017829.JPEG n02980441/
+mv val/ILSVRC2012_val_00017830.JPEG n02102177/
+mv val/ILSVRC2012_val_00017831.JPEG n02509815/
+mv val/ILSVRC2012_val_00017832.JPEG n02097047/
+mv val/ILSVRC2012_val_00017833.JPEG n02992529/
+mv val/ILSVRC2012_val_00017834.JPEG n02797295/
+mv val/ILSVRC2012_val_00017835.JPEG n03866082/
+mv val/ILSVRC2012_val_00017836.JPEG n02279972/
+mv val/ILSVRC2012_val_00017837.JPEG n03485794/
+mv val/ILSVRC2012_val_00017838.JPEG n03530642/
+mv val/ILSVRC2012_val_00017839.JPEG n01518878/
+mv val/ILSVRC2012_val_00017840.JPEG n04483307/
+mv val/ILSVRC2012_val_00017841.JPEG n04033901/
+mv val/ILSVRC2012_val_00017842.JPEG n07749582/
+mv val/ILSVRC2012_val_00017843.JPEG n02917067/
+mv val/ILSVRC2012_val_00017844.JPEG n03623198/
+mv val/ILSVRC2012_val_00017845.JPEG n02233338/
+mv val/ILSVRC2012_val_00017846.JPEG n03623198/
+mv val/ILSVRC2012_val_00017847.JPEG n03594945/
+mv val/ILSVRC2012_val_00017848.JPEG n02256656/
+mv val/ILSVRC2012_val_00017849.JPEG n02999410/
+mv val/ILSVRC2012_val_00017850.JPEG n02093991/
+mv val/ILSVRC2012_val_00017851.JPEG n02002724/
+mv val/ILSVRC2012_val_00017852.JPEG n03788365/
+mv val/ILSVRC2012_val_00017853.JPEG n03623198/
+mv val/ILSVRC2012_val_00017854.JPEG n02110063/
+mv val/ILSVRC2012_val_00017855.JPEG n01740131/
+mv val/ILSVRC2012_val_00017856.JPEG n04346328/
+mv val/ILSVRC2012_val_00017857.JPEG n04033995/
+mv val/ILSVRC2012_val_00017858.JPEG n02095889/
+mv val/ILSVRC2012_val_00017859.JPEG n04311174/
+mv val/ILSVRC2012_val_00017860.JPEG n02445715/
+mv val/ILSVRC2012_val_00017861.JPEG n03218198/
+mv val/ILSVRC2012_val_00017862.JPEG n02640242/
+mv val/ILSVRC2012_val_00017863.JPEG n04462240/
+mv val/ILSVRC2012_val_00017864.JPEG n03180011/
+mv val/ILSVRC2012_val_00017865.JPEG n02093256/
+mv val/ILSVRC2012_val_00017866.JPEG n03425413/
+mv val/ILSVRC2012_val_00017867.JPEG n02504013/
+mv val/ILSVRC2012_val_00017868.JPEG n03877472/
+mv val/ILSVRC2012_val_00017869.JPEG n02087046/
+mv val/ILSVRC2012_val_00017870.JPEG n03976467/
+mv val/ILSVRC2012_val_00017871.JPEG n02091134/
+mv val/ILSVRC2012_val_00017872.JPEG n04044716/
+mv val/ILSVRC2012_val_00017873.JPEG n02088364/
+mv val/ILSVRC2012_val_00017874.JPEG n02009912/
+mv val/ILSVRC2012_val_00017875.JPEG n02206856/
+mv val/ILSVRC2012_val_00017876.JPEG n03297495/
+mv val/ILSVRC2012_val_00017877.JPEG n02871525/
+mv val/ILSVRC2012_val_00017878.JPEG n03633091/
+mv val/ILSVRC2012_val_00017879.JPEG n02105855/
+mv val/ILSVRC2012_val_00017880.JPEG n03075370/
+mv val/ILSVRC2012_val_00017881.JPEG n02119789/
+mv val/ILSVRC2012_val_00017882.JPEG n01644373/
+mv val/ILSVRC2012_val_00017883.JPEG n03216828/
+mv val/ILSVRC2012_val_00017884.JPEG n03478589/
+mv val/ILSVRC2012_val_00017885.JPEG n03929855/
+mv val/ILSVRC2012_val_00017886.JPEG n02939185/
+mv val/ILSVRC2012_val_00017887.JPEG n01847000/
+mv val/ILSVRC2012_val_00017888.JPEG n02317335/
+mv val/ILSVRC2012_val_00017889.JPEG n01983481/
+mv val/ILSVRC2012_val_00017890.JPEG n03657121/
+mv val/ILSVRC2012_val_00017891.JPEG n02086910/
+mv val/ILSVRC2012_val_00017892.JPEG n02088238/
+mv val/ILSVRC2012_val_00017893.JPEG n02168699/
+mv val/ILSVRC2012_val_00017894.JPEG n03976467/
+mv val/ILSVRC2012_val_00017895.JPEG n07697313/
+mv val/ILSVRC2012_val_00017896.JPEG n03743016/
+mv val/ILSVRC2012_val_00017897.JPEG n04086273/
+mv val/ILSVRC2012_val_00017898.JPEG n04200800/
+mv val/ILSVRC2012_val_00017899.JPEG n01632777/
+mv val/ILSVRC2012_val_00017900.JPEG n03529860/
+mv val/ILSVRC2012_val_00017901.JPEG n03404251/
+mv val/ILSVRC2012_val_00017902.JPEG n03255030/
+mv val/ILSVRC2012_val_00017903.JPEG n03476991/
+mv val/ILSVRC2012_val_00017904.JPEG n04311174/
+mv val/ILSVRC2012_val_00017905.JPEG n02093991/
+mv val/ILSVRC2012_val_00017906.JPEG n03924679/
+mv val/ILSVRC2012_val_00017907.JPEG n03478589/
+mv val/ILSVRC2012_val_00017908.JPEG n04258138/
+mv val/ILSVRC2012_val_00017909.JPEG n01774384/
+mv val/ILSVRC2012_val_00017910.JPEG n02277742/
+mv val/ILSVRC2012_val_00017911.JPEG n01980166/
+mv val/ILSVRC2012_val_00017912.JPEG n02951358/
+mv val/ILSVRC2012_val_00017913.JPEG n03983396/
+mv val/ILSVRC2012_val_00017914.JPEG n03482405/
+mv val/ILSVRC2012_val_00017915.JPEG n02091244/
+mv val/ILSVRC2012_val_00017916.JPEG n01592084/
+mv val/ILSVRC2012_val_00017917.JPEG n02415577/
+mv val/ILSVRC2012_val_00017918.JPEG n02125311/
+mv val/ILSVRC2012_val_00017919.JPEG n03888257/
+mv val/ILSVRC2012_val_00017920.JPEG n03871628/
+mv val/ILSVRC2012_val_00017921.JPEG n02096437/
+mv val/ILSVRC2012_val_00017922.JPEG n03743016/
+mv val/ILSVRC2012_val_00017923.JPEG n04118776/
+mv val/ILSVRC2012_val_00017924.JPEG n02526121/
+mv val/ILSVRC2012_val_00017925.JPEG n07711569/
+mv val/ILSVRC2012_val_00017926.JPEG n01694178/
+mv val/ILSVRC2012_val_00017927.JPEG n01744401/
+mv val/ILSVRC2012_val_00017928.JPEG n03424325/
+mv val/ILSVRC2012_val_00017929.JPEG n10565667/
+mv val/ILSVRC2012_val_00017930.JPEG n02007558/
+mv val/ILSVRC2012_val_00017931.JPEG n01860187/
+mv val/ILSVRC2012_val_00017932.JPEG n03127925/
+mv val/ILSVRC2012_val_00017933.JPEG n04380533/
+mv val/ILSVRC2012_val_00017934.JPEG n03637318/
+mv val/ILSVRC2012_val_00017935.JPEG n02088238/
+mv val/ILSVRC2012_val_00017936.JPEG n04118538/
+mv val/ILSVRC2012_val_00017937.JPEG n02101006/
+mv val/ILSVRC2012_val_00017938.JPEG n02110958/
+mv val/ILSVRC2012_val_00017939.JPEG n01820546/
+mv val/ILSVRC2012_val_00017940.JPEG n02106550/
+mv val/ILSVRC2012_val_00017941.JPEG n03874293/
+mv val/ILSVRC2012_val_00017942.JPEG n02229544/
+mv val/ILSVRC2012_val_00017943.JPEG n03937543/
+mv val/ILSVRC2012_val_00017944.JPEG n03838899/
+mv val/ILSVRC2012_val_00017945.JPEG n04147183/
+mv val/ILSVRC2012_val_00017946.JPEG n03697007/
+mv val/ILSVRC2012_val_00017947.JPEG n02655020/
+mv val/ILSVRC2012_val_00017948.JPEG n01677366/
+mv val/ILSVRC2012_val_00017949.JPEG n02415577/
+mv val/ILSVRC2012_val_00017950.JPEG n03891332/
+mv val/ILSVRC2012_val_00017951.JPEG n03673027/
+mv val/ILSVRC2012_val_00017952.JPEG n02328150/
+mv val/ILSVRC2012_val_00017953.JPEG n02363005/
+mv val/ILSVRC2012_val_00017954.JPEG n04209133/
+mv val/ILSVRC2012_val_00017955.JPEG n04065272/
+mv val/ILSVRC2012_val_00017956.JPEG n04399382/
+mv val/ILSVRC2012_val_00017957.JPEG n02114548/
+mv val/ILSVRC2012_val_00017958.JPEG n03724870/
+mv val/ILSVRC2012_val_00017959.JPEG n12620546/
+mv val/ILSVRC2012_val_00017960.JPEG n04277352/
+mv val/ILSVRC2012_val_00017961.JPEG n02105855/
+mv val/ILSVRC2012_val_00017962.JPEG n01704323/
+mv val/ILSVRC2012_val_00017963.JPEG n01697457/
+mv val/ILSVRC2012_val_00017964.JPEG n02094433/
+mv val/ILSVRC2012_val_00017965.JPEG n02110958/
+mv val/ILSVRC2012_val_00017966.JPEG n02092339/
+mv val/ILSVRC2012_val_00017967.JPEG n01734418/
+mv val/ILSVRC2012_val_00017968.JPEG n02108915/
+mv val/ILSVRC2012_val_00017969.JPEG n02791270/
+mv val/ILSVRC2012_val_00017970.JPEG n01534433/
+mv val/ILSVRC2012_val_00017971.JPEG n04111531/
+mv val/ILSVRC2012_val_00017972.JPEG n03476684/
+mv val/ILSVRC2012_val_00017973.JPEG n02708093/
+mv val/ILSVRC2012_val_00017974.JPEG n01955084/
+mv val/ILSVRC2012_val_00017975.JPEG n01580077/
+mv val/ILSVRC2012_val_00017976.JPEG n01592084/
+mv val/ILSVRC2012_val_00017977.JPEG n03602883/
+mv val/ILSVRC2012_val_00017978.JPEG n02871525/
+mv val/ILSVRC2012_val_00017979.JPEG n04037443/
+mv val/ILSVRC2012_val_00017980.JPEG n02086910/
+mv val/ILSVRC2012_val_00017981.JPEG n13040303/
+mv val/ILSVRC2012_val_00017982.JPEG n07749582/
+mv val/ILSVRC2012_val_00017983.JPEG n01930112/
+mv val/ILSVRC2012_val_00017984.JPEG n13037406/
+mv val/ILSVRC2012_val_00017985.JPEG n03792972/
+mv val/ILSVRC2012_val_00017986.JPEG n01775062/
+mv val/ILSVRC2012_val_00017987.JPEG n02403003/
+mv val/ILSVRC2012_val_00017988.JPEG n02974003/
+mv val/ILSVRC2012_val_00017989.JPEG n01644373/
+mv val/ILSVRC2012_val_00017990.JPEG n02966193/
+mv val/ILSVRC2012_val_00017991.JPEG n03481172/
+mv val/ILSVRC2012_val_00017992.JPEG n02095570/
+mv val/ILSVRC2012_val_00017993.JPEG n03297495/
+mv val/ILSVRC2012_val_00017994.JPEG n01614925/
+mv val/ILSVRC2012_val_00017995.JPEG n01440764/
+mv val/ILSVRC2012_val_00017996.JPEG n02879718/
+mv val/ILSVRC2012_val_00017997.JPEG n02105641/
+mv val/ILSVRC2012_val_00017998.JPEG n03125729/
+mv val/ILSVRC2012_val_00017999.JPEG n03891332/
+mv val/ILSVRC2012_val_00018000.JPEG n01697457/
+mv val/ILSVRC2012_val_00018001.JPEG n03443371/
+mv val/ILSVRC2012_val_00018002.JPEG n03794056/
+mv val/ILSVRC2012_val_00018003.JPEG n02231487/
+mv val/ILSVRC2012_val_00018004.JPEG n02395406/
+mv val/ILSVRC2012_val_00018005.JPEG n02787622/
+mv val/ILSVRC2012_val_00018006.JPEG n03425413/
+mv val/ILSVRC2012_val_00018007.JPEG n02111889/
+mv val/ILSVRC2012_val_00018008.JPEG n01632458/
+mv val/ILSVRC2012_val_00018009.JPEG n02110806/
+mv val/ILSVRC2012_val_00018010.JPEG n03584829/
+mv val/ILSVRC2012_val_00018011.JPEG n03733805/
+mv val/ILSVRC2012_val_00018012.JPEG n04613696/
+mv val/ILSVRC2012_val_00018013.JPEG n07747607/
+mv val/ILSVRC2012_val_00018014.JPEG n02687172/
+mv val/ILSVRC2012_val_00018015.JPEG n03792782/
+mv val/ILSVRC2012_val_00018016.JPEG n02492035/
+mv val/ILSVRC2012_val_00018017.JPEG n02489166/
+mv val/ILSVRC2012_val_00018018.JPEG n03393912/
+mv val/ILSVRC2012_val_00018019.JPEG n03018349/
+mv val/ILSVRC2012_val_00018020.JPEG n03843555/
+mv val/ILSVRC2012_val_00018021.JPEG n02769748/
+mv val/ILSVRC2012_val_00018022.JPEG n02168699/
+mv val/ILSVRC2012_val_00018023.JPEG n03272010/
+mv val/ILSVRC2012_val_00018024.JPEG n04532106/
+mv val/ILSVRC2012_val_00018025.JPEG n01943899/
+mv val/ILSVRC2012_val_00018026.JPEG n01882714/
+mv val/ILSVRC2012_val_00018027.JPEG n03127747/
+mv val/ILSVRC2012_val_00018028.JPEG n02088632/
+mv val/ILSVRC2012_val_00018029.JPEG n04589890/
+mv val/ILSVRC2012_val_00018030.JPEG n12768682/
+mv val/ILSVRC2012_val_00018031.JPEG n07715103/
+mv val/ILSVRC2012_val_00018032.JPEG n02410509/
+mv val/ILSVRC2012_val_00018033.JPEG n03995372/
+mv val/ILSVRC2012_val_00018034.JPEG n01728920/
+mv val/ILSVRC2012_val_00018035.JPEG n02091134/
+mv val/ILSVRC2012_val_00018036.JPEG n01820546/
+mv val/ILSVRC2012_val_00018037.JPEG n01739381/
+mv val/ILSVRC2012_val_00018038.JPEG n02917067/
+mv val/ILSVRC2012_val_00018039.JPEG n04591157/
+mv val/ILSVRC2012_val_00018040.JPEG n07697313/
+mv val/ILSVRC2012_val_00018041.JPEG n01728920/
+mv val/ILSVRC2012_val_00018042.JPEG n02835271/
+mv val/ILSVRC2012_val_00018043.JPEG n02028035/
+mv val/ILSVRC2012_val_00018044.JPEG n03908714/
+mv val/ILSVRC2012_val_00018045.JPEG n02096294/
+mv val/ILSVRC2012_val_00018046.JPEG n02106030/
+mv val/ILSVRC2012_val_00018047.JPEG n03384352/
+mv val/ILSVRC2012_val_00018048.JPEG n02174001/
+mv val/ILSVRC2012_val_00018049.JPEG n04522168/
+mv val/ILSVRC2012_val_00018050.JPEG n03866082/
+mv val/ILSVRC2012_val_00018051.JPEG n02817516/
+mv val/ILSVRC2012_val_00018052.JPEG n01978287/
+mv val/ILSVRC2012_val_00018053.JPEG n04259630/
+mv val/ILSVRC2012_val_00018054.JPEG n04399382/
+mv val/ILSVRC2012_val_00018055.JPEG n02113978/
+mv val/ILSVRC2012_val_00018056.JPEG n03447721/
+mv val/ILSVRC2012_val_00018057.JPEG n02749479/
+mv val/ILSVRC2012_val_00018058.JPEG n03188531/
+mv val/ILSVRC2012_val_00018059.JPEG n02483708/
+mv val/ILSVRC2012_val_00018060.JPEG n07693725/
+mv val/ILSVRC2012_val_00018061.JPEG n03014705/
+mv val/ILSVRC2012_val_00018062.JPEG n01622779/
+mv val/ILSVRC2012_val_00018063.JPEG n03642806/
+mv val/ILSVRC2012_val_00018064.JPEG n02018207/
+mv val/ILSVRC2012_val_00018065.JPEG n09332890/
+mv val/ILSVRC2012_val_00018066.JPEG n03670208/
+mv val/ILSVRC2012_val_00018067.JPEG n03291819/
+mv val/ILSVRC2012_val_00018068.JPEG n02017213/
+mv val/ILSVRC2012_val_00018069.JPEG n02098286/
+mv val/ILSVRC2012_val_00018070.JPEG n04141327/
+mv val/ILSVRC2012_val_00018071.JPEG n02105251/
+mv val/ILSVRC2012_val_00018072.JPEG n02447366/
+mv val/ILSVRC2012_val_00018073.JPEG n02321529/
+mv val/ILSVRC2012_val_00018074.JPEG n03792782/
+mv val/ILSVRC2012_val_00018075.JPEG n01443537/
+mv val/ILSVRC2012_val_00018076.JPEG n01943899/
+mv val/ILSVRC2012_val_00018077.JPEG n04522168/
+mv val/ILSVRC2012_val_00018078.JPEG n13133613/
+mv val/ILSVRC2012_val_00018079.JPEG n03891251/
+mv val/ILSVRC2012_val_00018080.JPEG n02106166/
+mv val/ILSVRC2012_val_00018081.JPEG n04592741/
+mv val/ILSVRC2012_val_00018082.JPEG n04179913/
+mv val/ILSVRC2012_val_00018083.JPEG n03216828/
+mv val/ILSVRC2012_val_00018084.JPEG n04467665/
+mv val/ILSVRC2012_val_00018085.JPEG n01883070/
+mv val/ILSVRC2012_val_00018086.JPEG n07614500/
+mv val/ILSVRC2012_val_00018087.JPEG n02105162/
+mv val/ILSVRC2012_val_00018088.JPEG n04456115/
+mv val/ILSVRC2012_val_00018089.JPEG n04332243/
+mv val/ILSVRC2012_val_00018090.JPEG n04049303/
+mv val/ILSVRC2012_val_00018091.JPEG n07615774/
+mv val/ILSVRC2012_val_00018092.JPEG n01616318/
+mv val/ILSVRC2012_val_00018093.JPEG n07802026/
+mv val/ILSVRC2012_val_00018094.JPEG n03291819/
+mv val/ILSVRC2012_val_00018095.JPEG n01688243/
+mv val/ILSVRC2012_val_00018096.JPEG n02396427/
+mv val/ILSVRC2012_val_00018097.JPEG n09229709/
+mv val/ILSVRC2012_val_00018098.JPEG n09399592/
+mv val/ILSVRC2012_val_00018099.JPEG n02027492/
+mv val/ILSVRC2012_val_00018100.JPEG n04517823/
+mv val/ILSVRC2012_val_00018101.JPEG n03325584/
+mv val/ILSVRC2012_val_00018102.JPEG n02165456/
+mv val/ILSVRC2012_val_00018103.JPEG n03803284/
+mv val/ILSVRC2012_val_00018104.JPEG n02802426/
+mv val/ILSVRC2012_val_00018105.JPEG n09428293/
+mv val/ILSVRC2012_val_00018106.JPEG n02168699/
+mv val/ILSVRC2012_val_00018107.JPEG n02106662/
+mv val/ILSVRC2012_val_00018108.JPEG n03259280/
+mv val/ILSVRC2012_val_00018109.JPEG n03733131/
+mv val/ILSVRC2012_val_00018110.JPEG n04258138/
+mv val/ILSVRC2012_val_00018111.JPEG n01924916/
+mv val/ILSVRC2012_val_00018112.JPEG n01945685/
+mv val/ILSVRC2012_val_00018113.JPEG n09428293/
+mv val/ILSVRC2012_val_00018114.JPEG n02871525/
+mv val/ILSVRC2012_val_00018115.JPEG n02786058/
+mv val/ILSVRC2012_val_00018116.JPEG n03721384/
+mv val/ILSVRC2012_val_00018117.JPEG n04285008/
+mv val/ILSVRC2012_val_00018118.JPEG n03485794/
+mv val/ILSVRC2012_val_00018119.JPEG n01784675/
+mv val/ILSVRC2012_val_00018120.JPEG n04428191/
+mv val/ILSVRC2012_val_00018121.JPEG n02092002/
+mv val/ILSVRC2012_val_00018122.JPEG n04372370/
+mv val/ILSVRC2012_val_00018123.JPEG n04099969/
+mv val/ILSVRC2012_val_00018124.JPEG n03026506/
+mv val/ILSVRC2012_val_00018125.JPEG n02971356/
+mv val/ILSVRC2012_val_00018126.JPEG n02106030/
+mv val/ILSVRC2012_val_00018127.JPEG n04131690/
+mv val/ILSVRC2012_val_00018128.JPEG n01847000/
+mv val/ILSVRC2012_val_00018129.JPEG n03794056/
+mv val/ILSVRC2012_val_00018130.JPEG n12985857/
+mv val/ILSVRC2012_val_00018131.JPEG n02488702/
+mv val/ILSVRC2012_val_00018132.JPEG n01872401/
+mv val/ILSVRC2012_val_00018133.JPEG n03372029/
+mv val/ILSVRC2012_val_00018134.JPEG n01806567/
+mv val/ILSVRC2012_val_00018135.JPEG n01917289/
+mv val/ILSVRC2012_val_00018136.JPEG n03444034/
+mv val/ILSVRC2012_val_00018137.JPEG n01776313/
+mv val/ILSVRC2012_val_00018138.JPEG n02814533/
+mv val/ILSVRC2012_val_00018139.JPEG n02672831/
+mv val/ILSVRC2012_val_00018140.JPEG n03637318/
+mv val/ILSVRC2012_val_00018141.JPEG n02113978/
+mv val/ILSVRC2012_val_00018142.JPEG n02165456/
+mv val/ILSVRC2012_val_00018143.JPEG n04548280/
+mv val/ILSVRC2012_val_00018144.JPEG n02917067/
+mv val/ILSVRC2012_val_00018145.JPEG n01560419/
+mv val/ILSVRC2012_val_00018146.JPEG n02825657/
+mv val/ILSVRC2012_val_00018147.JPEG n04552348/
+mv val/ILSVRC2012_val_00018148.JPEG n02999410/
+mv val/ILSVRC2012_val_00018149.JPEG n02190166/
+mv val/ILSVRC2012_val_00018150.JPEG n03065424/
+mv val/ILSVRC2012_val_00018151.JPEG n02825657/
+mv val/ILSVRC2012_val_00018152.JPEG n07716358/
+mv val/ILSVRC2012_val_00018153.JPEG n02877765/
+mv val/ILSVRC2012_val_00018154.JPEG n09421951/
+mv val/ILSVRC2012_val_00018155.JPEG n12267677/
+mv val/ILSVRC2012_val_00018156.JPEG n01819313/
+mv val/ILSVRC2012_val_00018157.JPEG n04264628/
+mv val/ILSVRC2012_val_00018158.JPEG n03344393/
+mv val/ILSVRC2012_val_00018159.JPEG n02002724/
+mv val/ILSVRC2012_val_00018160.JPEG n01641577/
+mv val/ILSVRC2012_val_00018161.JPEG n02256656/
+mv val/ILSVRC2012_val_00018162.JPEG n01532829/
+mv val/ILSVRC2012_val_00018163.JPEG n03854065/
+mv val/ILSVRC2012_val_00018164.JPEG n02791270/
+mv val/ILSVRC2012_val_00018165.JPEG n02951585/
+mv val/ILSVRC2012_val_00018166.JPEG n03014705/
+mv val/ILSVRC2012_val_00018167.JPEG n01592084/
+mv val/ILSVRC2012_val_00018168.JPEG n01728572/
+mv val/ILSVRC2012_val_00018169.JPEG n01774750/
+mv val/ILSVRC2012_val_00018170.JPEG n03868242/
+mv val/ILSVRC2012_val_00018171.JPEG n04370456/
+mv val/ILSVRC2012_val_00018172.JPEG n03337140/
+mv val/ILSVRC2012_val_00018173.JPEG n03124043/
+mv val/ILSVRC2012_val_00018174.JPEG n03290653/
+mv val/ILSVRC2012_val_00018175.JPEG n02488291/
+mv val/ILSVRC2012_val_00018176.JPEG n04505470/
+mv val/ILSVRC2012_val_00018177.JPEG n04553703/
+mv val/ILSVRC2012_val_00018178.JPEG n02107574/
+mv val/ILSVRC2012_val_00018179.JPEG n01692333/
+mv val/ILSVRC2012_val_00018180.JPEG n12620546/
+mv val/ILSVRC2012_val_00018181.JPEG n04086273/
+mv val/ILSVRC2012_val_00018182.JPEG n03657121/
+mv val/ILSVRC2012_val_00018183.JPEG n01582220/
+mv val/ILSVRC2012_val_00018184.JPEG n03485407/
+mv val/ILSVRC2012_val_00018185.JPEG n03840681/
+mv val/ILSVRC2012_val_00018186.JPEG n07768694/
+mv val/ILSVRC2012_val_00018187.JPEG n03782006/
+mv val/ILSVRC2012_val_00018188.JPEG n02114548/
+mv val/ILSVRC2012_val_00018189.JPEG n11939491/
+mv val/ILSVRC2012_val_00018190.JPEG n04552348/
+mv val/ILSVRC2012_val_00018191.JPEG n03208938/
+mv val/ILSVRC2012_val_00018192.JPEG n02006656/
+mv val/ILSVRC2012_val_00018193.JPEG n03764736/
+mv val/ILSVRC2012_val_00018194.JPEG n07695742/
+mv val/ILSVRC2012_val_00018195.JPEG n01820546/
+mv val/ILSVRC2012_val_00018196.JPEG n02326432/
+mv val/ILSVRC2012_val_00018197.JPEG n02009229/
+mv val/ILSVRC2012_val_00018198.JPEG n02408429/
+mv val/ILSVRC2012_val_00018199.JPEG n03018349/
+mv val/ILSVRC2012_val_00018200.JPEG n03018349/
+mv val/ILSVRC2012_val_00018201.JPEG n02504458/
+mv val/ILSVRC2012_val_00018202.JPEG n02089973/
+mv val/ILSVRC2012_val_00018203.JPEG n01917289/
+mv val/ILSVRC2012_val_00018204.JPEG n01739381/
+mv val/ILSVRC2012_val_00018205.JPEG n02130308/
+mv val/ILSVRC2012_val_00018206.JPEG n04099969/
+mv val/ILSVRC2012_val_00018207.JPEG n02102040/
+mv val/ILSVRC2012_val_00018208.JPEG n03788195/
+mv val/ILSVRC2012_val_00018209.JPEG n03764736/
+mv val/ILSVRC2012_val_00018210.JPEG n02422699/
+mv val/ILSVRC2012_val_00018211.JPEG n01978287/
+mv val/ILSVRC2012_val_00018212.JPEG n02860847/
+mv val/ILSVRC2012_val_00018213.JPEG n02749479/
+mv val/ILSVRC2012_val_00018214.JPEG n03877845/
+mv val/ILSVRC2012_val_00018215.JPEG n03404251/
+mv val/ILSVRC2012_val_00018216.JPEG n04209133/
+mv val/ILSVRC2012_val_00018217.JPEG n07695742/
+mv val/ILSVRC2012_val_00018218.JPEG n04090263/
+mv val/ILSVRC2012_val_00018219.JPEG n03720891/
+mv val/ILSVRC2012_val_00018220.JPEG n04311174/
+mv val/ILSVRC2012_val_00018221.JPEG n03642806/
+mv val/ILSVRC2012_val_00018222.JPEG n03933933/
+mv val/ILSVRC2012_val_00018223.JPEG n04005630/
+mv val/ILSVRC2012_val_00018224.JPEG n02093991/
+mv val/ILSVRC2012_val_00018225.JPEG n02977058/
+mv val/ILSVRC2012_val_00018226.JPEG n09835506/
+mv val/ILSVRC2012_val_00018227.JPEG n03417042/
+mv val/ILSVRC2012_val_00018228.JPEG n01742172/
+mv val/ILSVRC2012_val_00018229.JPEG n03888257/
+mv val/ILSVRC2012_val_00018230.JPEG n02782093/
+mv val/ILSVRC2012_val_00018231.JPEG n07802026/
+mv val/ILSVRC2012_val_00018232.JPEG n03208938/
+mv val/ILSVRC2012_val_00018233.JPEG n02130308/
+mv val/ILSVRC2012_val_00018234.JPEG n02090622/
+mv val/ILSVRC2012_val_00018235.JPEG n04040759/
+mv val/ILSVRC2012_val_00018236.JPEG n02422699/
+mv val/ILSVRC2012_val_00018237.JPEG n03594945/
+mv val/ILSVRC2012_val_00018238.JPEG n02437616/
+mv val/ILSVRC2012_val_00018239.JPEG n03337140/
+mv val/ILSVRC2012_val_00018240.JPEG n09399592/
+mv val/ILSVRC2012_val_00018241.JPEG n02129604/
+mv val/ILSVRC2012_val_00018242.JPEG n02488291/
+mv val/ILSVRC2012_val_00018243.JPEG n04597913/
+mv val/ILSVRC2012_val_00018244.JPEG n03089624/
+mv val/ILSVRC2012_val_00018245.JPEG n03710193/
+mv val/ILSVRC2012_val_00018246.JPEG n02930766/
+mv val/ILSVRC2012_val_00018247.JPEG n04435653/
+mv val/ILSVRC2012_val_00018248.JPEG n01806567/
+mv val/ILSVRC2012_val_00018249.JPEG n03100240/
+mv val/ILSVRC2012_val_00018250.JPEG n01582220/
+mv val/ILSVRC2012_val_00018251.JPEG n03871628/
+mv val/ILSVRC2012_val_00018252.JPEG n02422106/
+mv val/ILSVRC2012_val_00018253.JPEG n02494079/
+mv val/ILSVRC2012_val_00018254.JPEG n04372370/
+mv val/ILSVRC2012_val_00018255.JPEG n07716358/
+mv val/ILSVRC2012_val_00018256.JPEG n04277352/
+mv val/ILSVRC2012_val_00018257.JPEG n02236044/
+mv val/ILSVRC2012_val_00018258.JPEG n03891332/
+mv val/ILSVRC2012_val_00018259.JPEG n03814639/
+mv val/ILSVRC2012_val_00018260.JPEG n02396427/
+mv val/ILSVRC2012_val_00018261.JPEG n02793495/
+mv val/ILSVRC2012_val_00018262.JPEG n02096437/
+mv val/ILSVRC2012_val_00018263.JPEG n02504458/
+mv val/ILSVRC2012_val_00018264.JPEG n02085936/
+mv val/ILSVRC2012_val_00018265.JPEG n01978287/
+mv val/ILSVRC2012_val_00018266.JPEG n04239074/
+mv val/ILSVRC2012_val_00018267.JPEG n03532672/
+mv val/ILSVRC2012_val_00018268.JPEG n02869837/
+mv val/ILSVRC2012_val_00018269.JPEG n02127052/
+mv val/ILSVRC2012_val_00018270.JPEG n03680355/
+mv val/ILSVRC2012_val_00018271.JPEG n02206856/
+mv val/ILSVRC2012_val_00018272.JPEG n03602883/
+mv val/ILSVRC2012_val_00018273.JPEG n01817953/
+mv val/ILSVRC2012_val_00018274.JPEG n03733805/
+mv val/ILSVRC2012_val_00018275.JPEG n03938244/
+mv val/ILSVRC2012_val_00018276.JPEG n03450230/
+mv val/ILSVRC2012_val_00018277.JPEG n04044716/
+mv val/ILSVRC2012_val_00018278.JPEG n02965783/
+mv val/ILSVRC2012_val_00018279.JPEG n03938244/
+mv val/ILSVRC2012_val_00018280.JPEG n01592084/
+mv val/ILSVRC2012_val_00018281.JPEG n03290653/
+mv val/ILSVRC2012_val_00018282.JPEG n04479046/
+mv val/ILSVRC2012_val_00018283.JPEG n07831146/
+mv val/ILSVRC2012_val_00018284.JPEG n01735189/
+mv val/ILSVRC2012_val_00018285.JPEG n04525305/
+mv val/ILSVRC2012_val_00018286.JPEG n02870880/
+mv val/ILSVRC2012_val_00018287.JPEG n02776631/
+mv val/ILSVRC2012_val_00018288.JPEG n02172182/
+mv val/ILSVRC2012_val_00018289.JPEG n04081281/
+mv val/ILSVRC2012_val_00018290.JPEG n03876231/
+mv val/ILSVRC2012_val_00018291.JPEG n01985128/
+mv val/ILSVRC2012_val_00018292.JPEG n01917289/
+mv val/ILSVRC2012_val_00018293.JPEG n10148035/
+mv val/ILSVRC2012_val_00018294.JPEG n04286575/
+mv val/ILSVRC2012_val_00018295.JPEG n03598930/
+mv val/ILSVRC2012_val_00018296.JPEG n02085782/
+mv val/ILSVRC2012_val_00018297.JPEG n02699494/
+mv val/ILSVRC2012_val_00018298.JPEG n04009552/
+mv val/ILSVRC2012_val_00018299.JPEG n03492542/
+mv val/ILSVRC2012_val_00018300.JPEG n07749582/
+mv val/ILSVRC2012_val_00018301.JPEG n03017168/
+mv val/ILSVRC2012_val_00018302.JPEG n03494278/
+mv val/ILSVRC2012_val_00018303.JPEG n02134418/
+mv val/ILSVRC2012_val_00018304.JPEG n03792782/
+mv val/ILSVRC2012_val_00018305.JPEG n01687978/
+mv val/ILSVRC2012_val_00018306.JPEG n13040303/
+mv val/ILSVRC2012_val_00018307.JPEG n03220513/
+mv val/ILSVRC2012_val_00018308.JPEG n03347037/
+mv val/ILSVRC2012_val_00018309.JPEG n03476684/
+mv val/ILSVRC2012_val_00018310.JPEG n01828970/
+mv val/ILSVRC2012_val_00018311.JPEG n02114367/
+mv val/ILSVRC2012_val_00018312.JPEG n07715103/
+mv val/ILSVRC2012_val_00018313.JPEG n02119789/
+mv val/ILSVRC2012_val_00018314.JPEG n01749939/
+mv val/ILSVRC2012_val_00018315.JPEG n03791053/
+mv val/ILSVRC2012_val_00018316.JPEG n02457408/
+mv val/ILSVRC2012_val_00018317.JPEG n01440764/
+mv val/ILSVRC2012_val_00018318.JPEG n01824575/
+mv val/ILSVRC2012_val_00018319.JPEG n04372370/
+mv val/ILSVRC2012_val_00018320.JPEG n07802026/
+mv val/ILSVRC2012_val_00018321.JPEG n04270147/
+mv val/ILSVRC2012_val_00018322.JPEG n04033901/
+mv val/ILSVRC2012_val_00018323.JPEG n04515003/
+mv val/ILSVRC2012_val_00018324.JPEG n03950228/
+mv val/ILSVRC2012_val_00018325.JPEG n04005630/
+mv val/ILSVRC2012_val_00018326.JPEG n02091032/
+mv val/ILSVRC2012_val_00018327.JPEG n02090379/
+mv val/ILSVRC2012_val_00018328.JPEG n02486410/
+mv val/ILSVRC2012_val_00018329.JPEG n07684084/
+mv val/ILSVRC2012_val_00018330.JPEG n04592741/
+mv val/ILSVRC2012_val_00018331.JPEG n02106382/
+mv val/ILSVRC2012_val_00018332.JPEG n02165456/
+mv val/ILSVRC2012_val_00018333.JPEG n02483708/
+mv val/ILSVRC2012_val_00018334.JPEG n01737021/
+mv val/ILSVRC2012_val_00018335.JPEG n02814533/
+mv val/ILSVRC2012_val_00018336.JPEG n04081281/
+mv val/ILSVRC2012_val_00018337.JPEG n03884397/
+mv val/ILSVRC2012_val_00018338.JPEG n07749582/
+mv val/ILSVRC2012_val_00018339.JPEG n01641577/
+mv val/ILSVRC2012_val_00018340.JPEG n03929855/
+mv val/ILSVRC2012_val_00018341.JPEG n04550184/
+mv val/ILSVRC2012_val_00018342.JPEG n04467665/
+mv val/ILSVRC2012_val_00018343.JPEG n03930313/
+mv val/ILSVRC2012_val_00018344.JPEG n02951585/
+mv val/ILSVRC2012_val_00018345.JPEG n02747177/
+mv val/ILSVRC2012_val_00018346.JPEG n04487394/
+mv val/ILSVRC2012_val_00018347.JPEG n01773549/
+mv val/ILSVRC2012_val_00018348.JPEG n04228054/
+mv val/ILSVRC2012_val_00018349.JPEG n02410509/
+mv val/ILSVRC2012_val_00018350.JPEG n04596742/
+mv val/ILSVRC2012_val_00018351.JPEG n02795169/
+mv val/ILSVRC2012_val_00018352.JPEG n03496892/
+mv val/ILSVRC2012_val_00018353.JPEG n04613696/
+mv val/ILSVRC2012_val_00018354.JPEG n02398521/
+mv val/ILSVRC2012_val_00018355.JPEG n03814906/
+mv val/ILSVRC2012_val_00018356.JPEG n02823750/
+mv val/ILSVRC2012_val_00018357.JPEG n02106550/
+mv val/ILSVRC2012_val_00018358.JPEG n02128385/
+mv val/ILSVRC2012_val_00018359.JPEG n02364673/
+mv val/ILSVRC2012_val_00018360.JPEG n03770679/
+mv val/ILSVRC2012_val_00018361.JPEG n02099429/
+mv val/ILSVRC2012_val_00018362.JPEG n01669191/
+mv val/ILSVRC2012_val_00018363.JPEG n12057211/
+mv val/ILSVRC2012_val_00018364.JPEG n04476259/
+mv val/ILSVRC2012_val_00018365.JPEG n02229544/
+mv val/ILSVRC2012_val_00018366.JPEG n03781244/
+mv val/ILSVRC2012_val_00018367.JPEG n02509815/
+mv val/ILSVRC2012_val_00018368.JPEG n02807133/
+mv val/ILSVRC2012_val_00018369.JPEG n02132136/
+mv val/ILSVRC2012_val_00018370.JPEG n03447721/
+mv val/ILSVRC2012_val_00018371.JPEG n02840245/
+mv val/ILSVRC2012_val_00018372.JPEG n03743016/
+mv val/ILSVRC2012_val_00018373.JPEG n04118776/
+mv val/ILSVRC2012_val_00018374.JPEG n04356056/
+mv val/ILSVRC2012_val_00018375.JPEG n02190166/
+mv val/ILSVRC2012_val_00018376.JPEG n03424325/
+mv val/ILSVRC2012_val_00018377.JPEG n04606251/
+mv val/ILSVRC2012_val_00018378.JPEG n04146614/
+mv val/ILSVRC2012_val_00018379.JPEG n04040759/
+mv val/ILSVRC2012_val_00018380.JPEG n07754684/
+mv val/ILSVRC2012_val_00018381.JPEG n02119022/
+mv val/ILSVRC2012_val_00018382.JPEG n02454379/
+mv val/ILSVRC2012_val_00018383.JPEG n02443484/
+mv val/ILSVRC2012_val_00018384.JPEG n04310018/
+mv val/ILSVRC2012_val_00018385.JPEG n03527444/
+mv val/ILSVRC2012_val_00018386.JPEG n04399382/
+mv val/ILSVRC2012_val_00018387.JPEG n03843555/
+mv val/ILSVRC2012_val_00018388.JPEG n01740131/
+mv val/ILSVRC2012_val_00018389.JPEG n02127052/
+mv val/ILSVRC2012_val_00018390.JPEG n02749479/
+mv val/ILSVRC2012_val_00018391.JPEG n03045698/
+mv val/ILSVRC2012_val_00018392.JPEG n02086240/
+mv val/ILSVRC2012_val_00018393.JPEG n01795545/
+mv val/ILSVRC2012_val_00018394.JPEG n04592741/
+mv val/ILSVRC2012_val_00018395.JPEG n02701002/
+mv val/ILSVRC2012_val_00018396.JPEG n04149813/
+mv val/ILSVRC2012_val_00018397.JPEG n02823750/
+mv val/ILSVRC2012_val_00018398.JPEG n01728920/
+mv val/ILSVRC2012_val_00018399.JPEG n04493381/
+mv val/ILSVRC2012_val_00018400.JPEG n02894605/
+mv val/ILSVRC2012_val_00018401.JPEG n03970156/
+mv val/ILSVRC2012_val_00018402.JPEG n03838899/
+mv val/ILSVRC2012_val_00018403.JPEG n03877845/
+mv val/ILSVRC2012_val_00018404.JPEG n03534580/
+mv val/ILSVRC2012_val_00018405.JPEG n02094258/
+mv val/ILSVRC2012_val_00018406.JPEG n03047690/
+mv val/ILSVRC2012_val_00018407.JPEG n02033041/
+mv val/ILSVRC2012_val_00018408.JPEG n03208938/
+mv val/ILSVRC2012_val_00018409.JPEG n03124043/
+mv val/ILSVRC2012_val_00018410.JPEG n03000134/
+mv val/ILSVRC2012_val_00018411.JPEG n03250847/
+mv val/ILSVRC2012_val_00018412.JPEG n01817953/
+mv val/ILSVRC2012_val_00018413.JPEG n02727426/
+mv val/ILSVRC2012_val_00018414.JPEG n01669191/
+mv val/ILSVRC2012_val_00018415.JPEG n02268443/
+mv val/ILSVRC2012_val_00018416.JPEG n03770439/
+mv val/ILSVRC2012_val_00018417.JPEG n02389026/
+mv val/ILSVRC2012_val_00018418.JPEG n04550184/
+mv val/ILSVRC2012_val_00018419.JPEG n02804610/
+mv val/ILSVRC2012_val_00018420.JPEG n03461385/
+mv val/ILSVRC2012_val_00018421.JPEG n02091244/
+mv val/ILSVRC2012_val_00018422.JPEG n02363005/
+mv val/ILSVRC2012_val_00018423.JPEG n02391049/
+mv val/ILSVRC2012_val_00018424.JPEG n07717410/
+mv val/ILSVRC2012_val_00018425.JPEG n03404251/
+mv val/ILSVRC2012_val_00018426.JPEG n07695742/
+mv val/ILSVRC2012_val_00018427.JPEG n04462240/
+mv val/ILSVRC2012_val_00018428.JPEG n01817953/
+mv val/ILSVRC2012_val_00018429.JPEG n06359193/
+mv val/ILSVRC2012_val_00018430.JPEG n01685808/
+mv val/ILSVRC2012_val_00018431.JPEG n02509815/
+mv val/ILSVRC2012_val_00018432.JPEG n09835506/
+mv val/ILSVRC2012_val_00018433.JPEG n04523525/
+mv val/ILSVRC2012_val_00018434.JPEG n04398044/
+mv val/ILSVRC2012_val_00018435.JPEG n01955084/
+mv val/ILSVRC2012_val_00018436.JPEG n02423022/
+mv val/ILSVRC2012_val_00018437.JPEG n02129604/
+mv val/ILSVRC2012_val_00018438.JPEG n02066245/
+mv val/ILSVRC2012_val_00018439.JPEG n01773797/
+mv val/ILSVRC2012_val_00018440.JPEG n02859443/
+mv val/ILSVRC2012_val_00018441.JPEG n04090263/
+mv val/ILSVRC2012_val_00018442.JPEG n03617480/
+mv val/ILSVRC2012_val_00018443.JPEG n04548280/
+mv val/ILSVRC2012_val_00018444.JPEG n03929855/
+mv val/ILSVRC2012_val_00018445.JPEG n03777754/
+mv val/ILSVRC2012_val_00018446.JPEG n02791270/
+mv val/ILSVRC2012_val_00018447.JPEG n02317335/
+mv val/ILSVRC2012_val_00018448.JPEG n03791053/
+mv val/ILSVRC2012_val_00018449.JPEG n03180011/
+mv val/ILSVRC2012_val_00018450.JPEG n01677366/
+mv val/ILSVRC2012_val_00018451.JPEG n03976467/
+mv val/ILSVRC2012_val_00018452.JPEG n02497673/
+mv val/ILSVRC2012_val_00018453.JPEG n01729322/
+mv val/ILSVRC2012_val_00018454.JPEG n03297495/
+mv val/ILSVRC2012_val_00018455.JPEG n02268853/
+mv val/ILSVRC2012_val_00018456.JPEG n01742172/
+mv val/ILSVRC2012_val_00018457.JPEG n07716906/
+mv val/ILSVRC2012_val_00018458.JPEG n03630383/
+mv val/ILSVRC2012_val_00018459.JPEG n02825657/
+mv val/ILSVRC2012_val_00018460.JPEG n02094258/
+mv val/ILSVRC2012_val_00018461.JPEG n07873807/
+mv val/ILSVRC2012_val_00018462.JPEG n03776460/
+mv val/ILSVRC2012_val_00018463.JPEG n01843383/
+mv val/ILSVRC2012_val_00018464.JPEG n02840245/
+mv val/ILSVRC2012_val_00018465.JPEG n02607072/
+mv val/ILSVRC2012_val_00018466.JPEG n01491361/
+mv val/ILSVRC2012_val_00018467.JPEG n03109150/
+mv val/ILSVRC2012_val_00018468.JPEG n03908618/
+mv val/ILSVRC2012_val_00018469.JPEG n02132136/
+mv val/ILSVRC2012_val_00018470.JPEG n01950731/
+mv val/ILSVRC2012_val_00018471.JPEG n02133161/
+mv val/ILSVRC2012_val_00018472.JPEG n04070727/
+mv val/ILSVRC2012_val_00018473.JPEG n03384352/
+mv val/ILSVRC2012_val_00018474.JPEG n03594945/
+mv val/ILSVRC2012_val_00018475.JPEG n03933933/
+mv val/ILSVRC2012_val_00018476.JPEG n03891332/
+mv val/ILSVRC2012_val_00018477.JPEG n01968897/
+mv val/ILSVRC2012_val_00018478.JPEG n09229709/
+mv val/ILSVRC2012_val_00018479.JPEG n02095314/
+mv val/ILSVRC2012_val_00018480.JPEG n02088364/
+mv val/ILSVRC2012_val_00018481.JPEG n01641577/
+mv val/ILSVRC2012_val_00018482.JPEG n03124170/
+mv val/ILSVRC2012_val_00018483.JPEG n03272562/
+mv val/ILSVRC2012_val_00018484.JPEG n02817516/
+mv val/ILSVRC2012_val_00018485.JPEG n01943899/
+mv val/ILSVRC2012_val_00018486.JPEG n07590611/
+mv val/ILSVRC2012_val_00018487.JPEG n04235860/
+mv val/ILSVRC2012_val_00018488.JPEG n03991062/
+mv val/ILSVRC2012_val_00018489.JPEG n02006656/
+mv val/ILSVRC2012_val_00018490.JPEG n04026417/
+mv val/ILSVRC2012_val_00018491.JPEG n02113799/
+mv val/ILSVRC2012_val_00018492.JPEG n04311004/
+mv val/ILSVRC2012_val_00018493.JPEG n02815834/
+mv val/ILSVRC2012_val_00018494.JPEG n04008634/
+mv val/ILSVRC2012_val_00018495.JPEG n07718472/
+mv val/ILSVRC2012_val_00018496.JPEG n02437616/
+mv val/ILSVRC2012_val_00018497.JPEG n04325704/
+mv val/ILSVRC2012_val_00018498.JPEG n03676483/
+mv val/ILSVRC2012_val_00018499.JPEG n03207941/
+mv val/ILSVRC2012_val_00018500.JPEG n02066245/
+mv val/ILSVRC2012_val_00018501.JPEG n03873416/
+mv val/ILSVRC2012_val_00018502.JPEG n02489166/
+mv val/ILSVRC2012_val_00018503.JPEG n03782006/
+mv val/ILSVRC2012_val_00018504.JPEG n04523525/
+mv val/ILSVRC2012_val_00018505.JPEG n03710637/
+mv val/ILSVRC2012_val_00018506.JPEG n02791270/
+mv val/ILSVRC2012_val_00018507.JPEG n09835506/
+mv val/ILSVRC2012_val_00018508.JPEG n01768244/
+mv val/ILSVRC2012_val_00018509.JPEG n03888257/
+mv val/ILSVRC2012_val_00018510.JPEG n04325704/
+mv val/ILSVRC2012_val_00018511.JPEG n02007558/
+mv val/ILSVRC2012_val_00018512.JPEG n01641577/
+mv val/ILSVRC2012_val_00018513.JPEG n03983396/
+mv val/ILSVRC2012_val_00018514.JPEG n04179913/
+mv val/ILSVRC2012_val_00018515.JPEG n03786901/
+mv val/ILSVRC2012_val_00018516.JPEG n03425413/
+mv val/ILSVRC2012_val_00018517.JPEG n02012849/
+mv val/ILSVRC2012_val_00018518.JPEG n03876231/
+mv val/ILSVRC2012_val_00018519.JPEG n02802426/
+mv val/ILSVRC2012_val_00018520.JPEG n04067472/
+mv val/ILSVRC2012_val_00018521.JPEG n02112350/
+mv val/ILSVRC2012_val_00018522.JPEG n02797295/
+mv val/ILSVRC2012_val_00018523.JPEG n03895866/
+mv val/ILSVRC2012_val_00018524.JPEG n07753113/
+mv val/ILSVRC2012_val_00018525.JPEG n03297495/
+mv val/ILSVRC2012_val_00018526.JPEG n02091635/
+mv val/ILSVRC2012_val_00018527.JPEG n04487394/
+mv val/ILSVRC2012_val_00018528.JPEG n03729826/
+mv val/ILSVRC2012_val_00018529.JPEG n02104029/
+mv val/ILSVRC2012_val_00018530.JPEG n02102973/
+mv val/ILSVRC2012_val_00018531.JPEG n03000247/
+mv val/ILSVRC2012_val_00018532.JPEG n01871265/
+mv val/ILSVRC2012_val_00018533.JPEG n03920288/
+mv val/ILSVRC2012_val_00018534.JPEG n03627232/
+mv val/ILSVRC2012_val_00018535.JPEG n02229544/
+mv val/ILSVRC2012_val_00018536.JPEG n02092339/
+mv val/ILSVRC2012_val_00018537.JPEG n02802426/
+mv val/ILSVRC2012_val_00018538.JPEG n03018349/
+mv val/ILSVRC2012_val_00018539.JPEG n13044778/
+mv val/ILSVRC2012_val_00018540.JPEG n03014705/
+mv val/ILSVRC2012_val_00018541.JPEG n02776631/
+mv val/ILSVRC2012_val_00018542.JPEG n03109150/
+mv val/ILSVRC2012_val_00018543.JPEG n13052670/
+mv val/ILSVRC2012_val_00018544.JPEG n03218198/
+mv val/ILSVRC2012_val_00018545.JPEG n04125021/
+mv val/ILSVRC2012_val_00018546.JPEG n04550184/
+mv val/ILSVRC2012_val_00018547.JPEG n04479046/
+mv val/ILSVRC2012_val_00018548.JPEG n04443257/
+mv val/ILSVRC2012_val_00018549.JPEG n03908618/
+mv val/ILSVRC2012_val_00018550.JPEG n02094433/
+mv val/ILSVRC2012_val_00018551.JPEG n02113186/
+mv val/ILSVRC2012_val_00018552.JPEG n02105162/
+mv val/ILSVRC2012_val_00018553.JPEG n02980441/
+mv val/ILSVRC2012_val_00018554.JPEG n02971356/
+mv val/ILSVRC2012_val_00018555.JPEG n07697313/
+mv val/ILSVRC2012_val_00018556.JPEG n02102177/
+mv val/ILSVRC2012_val_00018557.JPEG n04613696/
+mv val/ILSVRC2012_val_00018558.JPEG n02095889/
+mv val/ILSVRC2012_val_00018559.JPEG n02979186/
+mv val/ILSVRC2012_val_00018560.JPEG n09472597/
+mv val/ILSVRC2012_val_00018561.JPEG n03476684/
+mv val/ILSVRC2012_val_00018562.JPEG n02692877/
+mv val/ILSVRC2012_val_00018563.JPEG n01756291/
+mv val/ILSVRC2012_val_00018564.JPEG n03976657/
+mv val/ILSVRC2012_val_00018565.JPEG n03494278/
+mv val/ILSVRC2012_val_00018566.JPEG n03026506/
+mv val/ILSVRC2012_val_00018567.JPEG n04228054/
+mv val/ILSVRC2012_val_00018568.JPEG n04146614/
+mv val/ILSVRC2012_val_00018569.JPEG n03100240/
+mv val/ILSVRC2012_val_00018570.JPEG n02018795/
+mv val/ILSVRC2012_val_00018571.JPEG n01873310/
+mv val/ILSVRC2012_val_00018572.JPEG n04026417/
+mv val/ILSVRC2012_val_00018573.JPEG n02086910/
+mv val/ILSVRC2012_val_00018574.JPEG n04192698/
+mv val/ILSVRC2012_val_00018575.JPEG n02093991/
+mv val/ILSVRC2012_val_00018576.JPEG n04116512/
+mv val/ILSVRC2012_val_00018577.JPEG n02107908/
+mv val/ILSVRC2012_val_00018578.JPEG n02066245/
+mv val/ILSVRC2012_val_00018579.JPEG n04026417/
+mv val/ILSVRC2012_val_00018580.JPEG n02444819/
+mv val/ILSVRC2012_val_00018581.JPEG n02536864/
+mv val/ILSVRC2012_val_00018582.JPEG n02361337/
+mv val/ILSVRC2012_val_00018583.JPEG n03770439/
+mv val/ILSVRC2012_val_00018584.JPEG n02086646/
+mv val/ILSVRC2012_val_00018585.JPEG n03444034/
+mv val/ILSVRC2012_val_00018586.JPEG n04008634/
+mv val/ILSVRC2012_val_00018587.JPEG n02727426/
+mv val/ILSVRC2012_val_00018588.JPEG n07615774/
+mv val/ILSVRC2012_val_00018589.JPEG n02107908/
+mv val/ILSVRC2012_val_00018590.JPEG n03637318/
+mv val/ILSVRC2012_val_00018591.JPEG n04317175/
+mv val/ILSVRC2012_val_00018592.JPEG n03662601/
+mv val/ILSVRC2012_val_00018593.JPEG n09256479/
+mv val/ILSVRC2012_val_00018594.JPEG n03933933/
+mv val/ILSVRC2012_val_00018595.JPEG n03666591/
+mv val/ILSVRC2012_val_00018596.JPEG n02102318/
+mv val/ILSVRC2012_val_00018597.JPEG n07802026/
+mv val/ILSVRC2012_val_00018598.JPEG n04467665/
+mv val/ILSVRC2012_val_00018599.JPEG n03109150/
+mv val/ILSVRC2012_val_00018600.JPEG n03710721/
+mv val/ILSVRC2012_val_00018601.JPEG n02817516/
+mv val/ILSVRC2012_val_00018602.JPEG n01855672/
+mv val/ILSVRC2012_val_00018603.JPEG n03259280/
+mv val/ILSVRC2012_val_00018604.JPEG n02108089/
+mv val/ILSVRC2012_val_00018605.JPEG n01943899/
+mv val/ILSVRC2012_val_00018606.JPEG n02655020/
+mv val/ILSVRC2012_val_00018607.JPEG n02817516/
+mv val/ILSVRC2012_val_00018608.JPEG n07871810/
+mv val/ILSVRC2012_val_00018609.JPEG n03935335/
+mv val/ILSVRC2012_val_00018610.JPEG n03250847/
+mv val/ILSVRC2012_val_00018611.JPEG n04417672/
+mv val/ILSVRC2012_val_00018612.JPEG n04252077/
+mv val/ILSVRC2012_val_00018613.JPEG n01910747/
+mv val/ILSVRC2012_val_00018614.JPEG n03950228/
+mv val/ILSVRC2012_val_00018615.JPEG n02009912/
+mv val/ILSVRC2012_val_00018616.JPEG n02690373/
+mv val/ILSVRC2012_val_00018617.JPEG n02787622/
+mv val/ILSVRC2012_val_00018618.JPEG n01685808/
+mv val/ILSVRC2012_val_00018619.JPEG n02486410/
+mv val/ILSVRC2012_val_00018620.JPEG n04326547/
+mv val/ILSVRC2012_val_00018621.JPEG n03467068/
+mv val/ILSVRC2012_val_00018622.JPEG n01742172/
+mv val/ILSVRC2012_val_00018623.JPEG n02965783/
+mv val/ILSVRC2012_val_00018624.JPEG n04209133/
+mv val/ILSVRC2012_val_00018625.JPEG n06874185/
+mv val/ILSVRC2012_val_00018626.JPEG n01797886/
+mv val/ILSVRC2012_val_00018627.JPEG n01755581/
+mv val/ILSVRC2012_val_00018628.JPEG n03942813/
+mv val/ILSVRC2012_val_00018629.JPEG n02087394/
+mv val/ILSVRC2012_val_00018630.JPEG n02137549/
+mv val/ILSVRC2012_val_00018631.JPEG n03047690/
+mv val/ILSVRC2012_val_00018632.JPEG n04447861/
+mv val/ILSVRC2012_val_00018633.JPEG n04275548/
+mv val/ILSVRC2012_val_00018634.JPEG n02229544/
+mv val/ILSVRC2012_val_00018635.JPEG n03530642/
+mv val/ILSVRC2012_val_00018636.JPEG n01930112/
+mv val/ILSVRC2012_val_00018637.JPEG n04548362/
+mv val/ILSVRC2012_val_00018638.JPEG n04552348/
+mv val/ILSVRC2012_val_00018639.JPEG n02486261/
+mv val/ILSVRC2012_val_00018640.JPEG n02328150/
+mv val/ILSVRC2012_val_00018641.JPEG n03355925/
+mv val/ILSVRC2012_val_00018642.JPEG n02096177/
+mv val/ILSVRC2012_val_00018643.JPEG n02403003/
+mv val/ILSVRC2012_val_00018644.JPEG n01817953/
+mv val/ILSVRC2012_val_00018645.JPEG n01629819/
+mv val/ILSVRC2012_val_00018646.JPEG n03983396/
+mv val/ILSVRC2012_val_00018647.JPEG n03207941/
+mv val/ILSVRC2012_val_00018648.JPEG n01806567/
+mv val/ILSVRC2012_val_00018649.JPEG n02089973/
+mv val/ILSVRC2012_val_00018650.JPEG n07714990/
+mv val/ILSVRC2012_val_00018651.JPEG n03590841/
+mv val/ILSVRC2012_val_00018652.JPEG n02086646/
+mv val/ILSVRC2012_val_00018653.JPEG n03781244/
+mv val/ILSVRC2012_val_00018654.JPEG n02090622/
+mv val/ILSVRC2012_val_00018655.JPEG n03445924/
+mv val/ILSVRC2012_val_00018656.JPEG n02051845/
+mv val/ILSVRC2012_val_00018657.JPEG n04560804/
+mv val/ILSVRC2012_val_00018658.JPEG n09288635/
+mv val/ILSVRC2012_val_00018659.JPEG n03840681/
+mv val/ILSVRC2012_val_00018660.JPEG n01622779/
+mv val/ILSVRC2012_val_00018661.JPEG n03445924/
+mv val/ILSVRC2012_val_00018662.JPEG n02058221/
+mv val/ILSVRC2012_val_00018663.JPEG n03837869/
+mv val/ILSVRC2012_val_00018664.JPEG n02125311/
+mv val/ILSVRC2012_val_00018665.JPEG n02783161/
+mv val/ILSVRC2012_val_00018666.JPEG n01698640/
+mv val/ILSVRC2012_val_00018667.JPEG n02787622/
+mv val/ILSVRC2012_val_00018668.JPEG n03706229/
+mv val/ILSVRC2012_val_00018669.JPEG n02840245/
+mv val/ILSVRC2012_val_00018670.JPEG n02808440/
+mv val/ILSVRC2012_val_00018671.JPEG n03680355/
+mv val/ILSVRC2012_val_00018672.JPEG n01560419/
+mv val/ILSVRC2012_val_00018673.JPEG n01978287/
+mv val/ILSVRC2012_val_00018674.JPEG n02422699/
+mv val/ILSVRC2012_val_00018675.JPEG n01687978/
+mv val/ILSVRC2012_val_00018676.JPEG n01537544/
+mv val/ILSVRC2012_val_00018677.JPEG n03793489/
+mv val/ILSVRC2012_val_00018678.JPEG n03016953/
+mv val/ILSVRC2012_val_00018679.JPEG n04044716/
+mv val/ILSVRC2012_val_00018680.JPEG n01560419/
+mv val/ILSVRC2012_val_00018681.JPEG n02056570/
+mv val/ILSVRC2012_val_00018682.JPEG n03179701/
+mv val/ILSVRC2012_val_00018683.JPEG n09468604/
+mv val/ILSVRC2012_val_00018684.JPEG n03623198/
+mv val/ILSVRC2012_val_00018685.JPEG n02690373/
+mv val/ILSVRC2012_val_00018686.JPEG n02454379/
+mv val/ILSVRC2012_val_00018687.JPEG n04467665/
+mv val/ILSVRC2012_val_00018688.JPEG n02112018/
+mv val/ILSVRC2012_val_00018689.JPEG n04591157/
+mv val/ILSVRC2012_val_00018690.JPEG n04243546/
+mv val/ILSVRC2012_val_00018691.JPEG n04254777/
+mv val/ILSVRC2012_val_00018692.JPEG n01558993/
+mv val/ILSVRC2012_val_00018693.JPEG n07932039/
+mv val/ILSVRC2012_val_00018694.JPEG n04258138/
+mv val/ILSVRC2012_val_00018695.JPEG n02085936/
+mv val/ILSVRC2012_val_00018696.JPEG n03240683/
+mv val/ILSVRC2012_val_00018697.JPEG n04409515/
+mv val/ILSVRC2012_val_00018698.JPEG n03661043/
+mv val/ILSVRC2012_val_00018699.JPEG n01532829/
+mv val/ILSVRC2012_val_00018700.JPEG n03930630/
+mv val/ILSVRC2012_val_00018701.JPEG n02112350/
+mv val/ILSVRC2012_val_00018702.JPEG n02837789/
+mv val/ILSVRC2012_val_00018703.JPEG n02098286/
+mv val/ILSVRC2012_val_00018704.JPEG n04485082/
+mv val/ILSVRC2012_val_00018705.JPEG n03272562/
+mv val/ILSVRC2012_val_00018706.JPEG n02105505/
+mv val/ILSVRC2012_val_00018707.JPEG n03916031/
+mv val/ILSVRC2012_val_00018708.JPEG n07742313/
+mv val/ILSVRC2012_val_00018709.JPEG n03042490/
+mv val/ILSVRC2012_val_00018710.JPEG n02105855/
+mv val/ILSVRC2012_val_00018711.JPEG n04229816/
+mv val/ILSVRC2012_val_00018712.JPEG n04447861/
+mv val/ILSVRC2012_val_00018713.JPEG n02916936/
+mv val/ILSVRC2012_val_00018714.JPEG n02120505/
+mv val/ILSVRC2012_val_00018715.JPEG n02917067/
+mv val/ILSVRC2012_val_00018716.JPEG n01984695/
+mv val/ILSVRC2012_val_00018717.JPEG n02454379/
+mv val/ILSVRC2012_val_00018718.JPEG n03529860/
+mv val/ILSVRC2012_val_00018719.JPEG n03482405/
+mv val/ILSVRC2012_val_00018720.JPEG n04049303/
+mv val/ILSVRC2012_val_00018721.JPEG n03452741/
+mv val/ILSVRC2012_val_00018722.JPEG n02113023/
+mv val/ILSVRC2012_val_00018723.JPEG n03447721/
+mv val/ILSVRC2012_val_00018724.JPEG n01728572/
+mv val/ILSVRC2012_val_00018725.JPEG n03942813/
+mv val/ILSVRC2012_val_00018726.JPEG n03929855/
+mv val/ILSVRC2012_val_00018727.JPEG n03344393/
+mv val/ILSVRC2012_val_00018728.JPEG n01692333/
+mv val/ILSVRC2012_val_00018729.JPEG n01945685/
+mv val/ILSVRC2012_val_00018730.JPEG n03929660/
+mv val/ILSVRC2012_val_00018731.JPEG n07565083/
+mv val/ILSVRC2012_val_00018732.JPEG n04579432/
+mv val/ILSVRC2012_val_00018733.JPEG n03594734/
+mv val/ILSVRC2012_val_00018734.JPEG n03793489/
+mv val/ILSVRC2012_val_00018735.JPEG n02114712/
+mv val/ILSVRC2012_val_00018736.JPEG n02111129/
+mv val/ILSVRC2012_val_00018737.JPEG n02091244/
+mv val/ILSVRC2012_val_00018738.JPEG n12057211/
+mv val/ILSVRC2012_val_00018739.JPEG n02493793/
+mv val/ILSVRC2012_val_00018740.JPEG n03404251/
+mv val/ILSVRC2012_val_00018741.JPEG n03026506/
+mv val/ILSVRC2012_val_00018742.JPEG n01817953/
+mv val/ILSVRC2012_val_00018743.JPEG n02130308/
+mv val/ILSVRC2012_val_00018744.JPEG n02930766/
+mv val/ILSVRC2012_val_00018745.JPEG n03594734/
+mv val/ILSVRC2012_val_00018746.JPEG n02777292/
+mv val/ILSVRC2012_val_00018747.JPEG n02486410/
+mv val/ILSVRC2012_val_00018748.JPEG n09468604/
+mv val/ILSVRC2012_val_00018749.JPEG n02489166/
+mv val/ILSVRC2012_val_00018750.JPEG n01981276/
+mv val/ILSVRC2012_val_00018751.JPEG n04275548/
+mv val/ILSVRC2012_val_00018752.JPEG n02865351/
+mv val/ILSVRC2012_val_00018753.JPEG n04118538/
+mv val/ILSVRC2012_val_00018754.JPEG n01641577/
+mv val/ILSVRC2012_val_00018755.JPEG n02113624/
+mv val/ILSVRC2012_val_00018756.JPEG n04008634/
+mv val/ILSVRC2012_val_00018757.JPEG n01945685/
+mv val/ILSVRC2012_val_00018758.JPEG n02692877/
+mv val/ILSVRC2012_val_00018759.JPEG n02749479/
+mv val/ILSVRC2012_val_00018760.JPEG n03891332/
+mv val/ILSVRC2012_val_00018761.JPEG n02795169/
+mv val/ILSVRC2012_val_00018762.JPEG n02105641/
+mv val/ILSVRC2012_val_00018763.JPEG n04136333/
+mv val/ILSVRC2012_val_00018764.JPEG n04417672/
+mv val/ILSVRC2012_val_00018765.JPEG n04263257/
+mv val/ILSVRC2012_val_00018766.JPEG n06596364/
+mv val/ILSVRC2012_val_00018767.JPEG n02091032/
+mv val/ILSVRC2012_val_00018768.JPEG n03770679/
+mv val/ILSVRC2012_val_00018769.JPEG n07749582/
+mv val/ILSVRC2012_val_00018770.JPEG n02977058/
+mv val/ILSVRC2012_val_00018771.JPEG n03594734/
+mv val/ILSVRC2012_val_00018772.JPEG n02317335/
+mv val/ILSVRC2012_val_00018773.JPEG n04550184/
+mv val/ILSVRC2012_val_00018774.JPEG n02437312/
+mv val/ILSVRC2012_val_00018775.JPEG n01728572/
+mv val/ILSVRC2012_val_00018776.JPEG n02395406/
+mv val/ILSVRC2012_val_00018777.JPEG n04522168/
+mv val/ILSVRC2012_val_00018778.JPEG n04209133/
+mv val/ILSVRC2012_val_00018779.JPEG n02108000/
+mv val/ILSVRC2012_val_00018780.JPEG n01843383/
+mv val/ILSVRC2012_val_00018781.JPEG n04004767/
+mv val/ILSVRC2012_val_00018782.JPEG n03804744/
+mv val/ILSVRC2012_val_00018783.JPEG n04398044/
+mv val/ILSVRC2012_val_00018784.JPEG n02643566/
+mv val/ILSVRC2012_val_00018785.JPEG n13052670/
+mv val/ILSVRC2012_val_00018786.JPEG n03443371/
+mv val/ILSVRC2012_val_00018787.JPEG n02101388/
+mv val/ILSVRC2012_val_00018788.JPEG n02133161/
+mv val/ILSVRC2012_val_00018789.JPEG n02641379/
+mv val/ILSVRC2012_val_00018790.JPEG n03814906/
+mv val/ILSVRC2012_val_00018791.JPEG n02115913/
+mv val/ILSVRC2012_val_00018792.JPEG n02108915/
+mv val/ILSVRC2012_val_00018793.JPEG n01978287/
+mv val/ILSVRC2012_val_00018794.JPEG n04277352/
+mv val/ILSVRC2012_val_00018795.JPEG n04493381/
+mv val/ILSVRC2012_val_00018796.JPEG n01608432/
+mv val/ILSVRC2012_val_00018797.JPEG n04548280/
+mv val/ILSVRC2012_val_00018798.JPEG n03379051/
+mv val/ILSVRC2012_val_00018799.JPEG n03796401/
+mv val/ILSVRC2012_val_00018800.JPEG n02051845/
+mv val/ILSVRC2012_val_00018801.JPEG n04350905/
+mv val/ILSVRC2012_val_00018802.JPEG n04612504/
+mv val/ILSVRC2012_val_00018803.JPEG n03207743/
+mv val/ILSVRC2012_val_00018804.JPEG n02097298/
+mv val/ILSVRC2012_val_00018805.JPEG n03447447/
+mv val/ILSVRC2012_val_00018806.JPEG n02804610/
+mv val/ILSVRC2012_val_00018807.JPEG n01770393/
+mv val/ILSVRC2012_val_00018808.JPEG n10148035/
+mv val/ILSVRC2012_val_00018809.JPEG n02094258/
+mv val/ILSVRC2012_val_00018810.JPEG n03720891/
+mv val/ILSVRC2012_val_00018811.JPEG n02089078/
+mv val/ILSVRC2012_val_00018812.JPEG n02130308/
+mv val/ILSVRC2012_val_00018813.JPEG n02536864/
+mv val/ILSVRC2012_val_00018814.JPEG n03942813/
+mv val/ILSVRC2012_val_00018815.JPEG n02110341/
+mv val/ILSVRC2012_val_00018816.JPEG n04579432/
+mv val/ILSVRC2012_val_00018817.JPEG n07716358/
+mv val/ILSVRC2012_val_00018818.JPEG n03095699/
+mv val/ILSVRC2012_val_00018819.JPEG n02128925/
+mv val/ILSVRC2012_val_00018820.JPEG n04141975/
+mv val/ILSVRC2012_val_00018821.JPEG n02119789/
+mv val/ILSVRC2012_val_00018822.JPEG n03481172/
+mv val/ILSVRC2012_val_00018823.JPEG n03532672/
+mv val/ILSVRC2012_val_00018824.JPEG n02655020/
+mv val/ILSVRC2012_val_00018825.JPEG n07749582/
+mv val/ILSVRC2012_val_00018826.JPEG n02109961/
+mv val/ILSVRC2012_val_00018827.JPEG n02101556/
+mv val/ILSVRC2012_val_00018828.JPEG n03662601/
+mv val/ILSVRC2012_val_00018829.JPEG n03803284/
+mv val/ILSVRC2012_val_00018830.JPEG n02641379/
+mv val/ILSVRC2012_val_00018831.JPEG n04367480/
+mv val/ILSVRC2012_val_00018832.JPEG n02101388/
+mv val/ILSVRC2012_val_00018833.JPEG n04562935/
+mv val/ILSVRC2012_val_00018834.JPEG n01694178/
+mv val/ILSVRC2012_val_00018835.JPEG n02088466/
+mv val/ILSVRC2012_val_00018836.JPEG n02536864/
+mv val/ILSVRC2012_val_00018837.JPEG n03781244/
+mv val/ILSVRC2012_val_00018838.JPEG n04192698/
+mv val/ILSVRC2012_val_00018839.JPEG n02167151/
+mv val/ILSVRC2012_val_00018840.JPEG n02089078/
+mv val/ILSVRC2012_val_00018841.JPEG n03544143/
+mv val/ILSVRC2012_val_00018842.JPEG n03026506/
+mv val/ILSVRC2012_val_00018843.JPEG n02128925/
+mv val/ILSVRC2012_val_00018844.JPEG n04251144/
+mv val/ILSVRC2012_val_00018845.JPEG n03929855/
+mv val/ILSVRC2012_val_00018846.JPEG n03085013/
+mv val/ILSVRC2012_val_00018847.JPEG n03125729/
+mv val/ILSVRC2012_val_00018848.JPEG n01677366/
+mv val/ILSVRC2012_val_00018849.JPEG n03661043/
+mv val/ILSVRC2012_val_00018850.JPEG n04584207/
+mv val/ILSVRC2012_val_00018851.JPEG n04200800/
+mv val/ILSVRC2012_val_00018852.JPEG n02487347/
+mv val/ILSVRC2012_val_00018853.JPEG n02321529/
+mv val/ILSVRC2012_val_00018854.JPEG n03814906/
+mv val/ILSVRC2012_val_00018855.JPEG n01924916/
+mv val/ILSVRC2012_val_00018856.JPEG n02802426/
+mv val/ILSVRC2012_val_00018857.JPEG n01693334/
+mv val/ILSVRC2012_val_00018858.JPEG n02169497/
+mv val/ILSVRC2012_val_00018859.JPEG n02128925/
+mv val/ILSVRC2012_val_00018860.JPEG n07717556/
+mv val/ILSVRC2012_val_00018861.JPEG n03895866/
+mv val/ILSVRC2012_val_00018862.JPEG n02099429/
+mv val/ILSVRC2012_val_00018863.JPEG n03085013/
+mv val/ILSVRC2012_val_00018864.JPEG n11939491/
+mv val/ILSVRC2012_val_00018865.JPEG n09468604/
+mv val/ILSVRC2012_val_00018866.JPEG n02109047/
+mv val/ILSVRC2012_val_00018867.JPEG n07565083/
+mv val/ILSVRC2012_val_00018868.JPEG n04310018/
+mv val/ILSVRC2012_val_00018869.JPEG n02988304/
+mv val/ILSVRC2012_val_00018870.JPEG n07754684/
+mv val/ILSVRC2012_val_00018871.JPEG n02058221/
+mv val/ILSVRC2012_val_00018872.JPEG n02114367/
+mv val/ILSVRC2012_val_00018873.JPEG n03485794/
+mv val/ILSVRC2012_val_00018874.JPEG n03424325/
+mv val/ILSVRC2012_val_00018875.JPEG n04443257/
+mv val/ILSVRC2012_val_00018876.JPEG n01697457/
+mv val/ILSVRC2012_val_00018877.JPEG n02219486/
+mv val/ILSVRC2012_val_00018878.JPEG n02877765/
+mv val/ILSVRC2012_val_00018879.JPEG n01644900/
+mv val/ILSVRC2012_val_00018880.JPEG n03775071/
+mv val/ILSVRC2012_val_00018881.JPEG n02097047/
+mv val/ILSVRC2012_val_00018882.JPEG n02085620/
+mv val/ILSVRC2012_val_00018883.JPEG n07693725/
+mv val/ILSVRC2012_val_00018884.JPEG n03160309/
+mv val/ILSVRC2012_val_00018885.JPEG n02815834/
+mv val/ILSVRC2012_val_00018886.JPEG n03110669/
+mv val/ILSVRC2012_val_00018887.JPEG n03868863/
+mv val/ILSVRC2012_val_00018888.JPEG n04008634/
+mv val/ILSVRC2012_val_00018889.JPEG n03743016/
+mv val/ILSVRC2012_val_00018890.JPEG n02094114/
+mv val/ILSVRC2012_val_00018891.JPEG n03208938/
+mv val/ILSVRC2012_val_00018892.JPEG n07590611/
+mv val/ILSVRC2012_val_00018893.JPEG n04273569/
+mv val/ILSVRC2012_val_00018894.JPEG n03706229/
+mv val/ILSVRC2012_val_00018895.JPEG n02013706/
+mv val/ILSVRC2012_val_00018896.JPEG n07753592/
+mv val/ILSVRC2012_val_00018897.JPEG n02916936/
+mv val/ILSVRC2012_val_00018898.JPEG n02112137/
+mv val/ILSVRC2012_val_00018899.JPEG n02108089/
+mv val/ILSVRC2012_val_00018900.JPEG n03841143/
+mv val/ILSVRC2012_val_00018901.JPEG n03595614/
+mv val/ILSVRC2012_val_00018902.JPEG n03125729/
+mv val/ILSVRC2012_val_00018903.JPEG n07742313/
+mv val/ILSVRC2012_val_00018904.JPEG n02487347/
+mv val/ILSVRC2012_val_00018905.JPEG n04235860/
+mv val/ILSVRC2012_val_00018906.JPEG n02782093/
+mv val/ILSVRC2012_val_00018907.JPEG n01742172/
+mv val/ILSVRC2012_val_00018908.JPEG n04604644/
+mv val/ILSVRC2012_val_00018909.JPEG n04554684/
+mv val/ILSVRC2012_val_00018910.JPEG n04086273/
+mv val/ILSVRC2012_val_00018911.JPEG n02906734/
+mv val/ILSVRC2012_val_00018912.JPEG n02091635/
+mv val/ILSVRC2012_val_00018913.JPEG n03201208/
+mv val/ILSVRC2012_val_00018914.JPEG n07693725/
+mv val/ILSVRC2012_val_00018915.JPEG n09332890/
+mv val/ILSVRC2012_val_00018916.JPEG n02088364/
+mv val/ILSVRC2012_val_00018917.JPEG n03017168/
+mv val/ILSVRC2012_val_00018918.JPEG n03729826/
+mv val/ILSVRC2012_val_00018919.JPEG n03983396/
+mv val/ILSVRC2012_val_00018920.JPEG n03676483/
+mv val/ILSVRC2012_val_00018921.JPEG n04204347/
+mv val/ILSVRC2012_val_00018922.JPEG n04251144/
+mv val/ILSVRC2012_val_00018923.JPEG n02917067/
+mv val/ILSVRC2012_val_00018924.JPEG n04081281/
+mv val/ILSVRC2012_val_00018925.JPEG n03930313/
+mv val/ILSVRC2012_val_00018926.JPEG n03494278/
+mv val/ILSVRC2012_val_00018927.JPEG n03160309/
+mv val/ILSVRC2012_val_00018928.JPEG n02389026/
+mv val/ILSVRC2012_val_00018929.JPEG n03250847/
+mv val/ILSVRC2012_val_00018930.JPEG n03133878/
+mv val/ILSVRC2012_val_00018931.JPEG n02091635/
+mv val/ILSVRC2012_val_00018932.JPEG n02389026/
+mv val/ILSVRC2012_val_00018933.JPEG n02087394/
+mv val/ILSVRC2012_val_00018934.JPEG n02113799/
+mv val/ILSVRC2012_val_00018935.JPEG n02281787/
+mv val/ILSVRC2012_val_00018936.JPEG n04548280/
+mv val/ILSVRC2012_val_00018937.JPEG n04509417/
+mv val/ILSVRC2012_val_00018938.JPEG n03384352/
+mv val/ILSVRC2012_val_00018939.JPEG n02009229/
+mv val/ILSVRC2012_val_00018940.JPEG n04370456/
+mv val/ILSVRC2012_val_00018941.JPEG n07753275/
+mv val/ILSVRC2012_val_00018942.JPEG n02102177/
+mv val/ILSVRC2012_val_00018943.JPEG n01494475/
+mv val/ILSVRC2012_val_00018944.JPEG n03459775/
+mv val/ILSVRC2012_val_00018945.JPEG n02804610/
+mv val/ILSVRC2012_val_00018946.JPEG n04456115/
+mv val/ILSVRC2012_val_00018947.JPEG n02099712/
+mv val/ILSVRC2012_val_00018948.JPEG n01494475/
+mv val/ILSVRC2012_val_00018949.JPEG n04344873/
+mv val/ILSVRC2012_val_00018950.JPEG n03788195/
+mv val/ILSVRC2012_val_00018951.JPEG n01944390/
+mv val/ILSVRC2012_val_00018952.JPEG n01910747/
+mv val/ILSVRC2012_val_00018953.JPEG n03868242/
+mv val/ILSVRC2012_val_00018954.JPEG n03452741/
+mv val/ILSVRC2012_val_00018955.JPEG n13044778/
+mv val/ILSVRC2012_val_00018956.JPEG n01883070/
+mv val/ILSVRC2012_val_00018957.JPEG n02701002/
+mv val/ILSVRC2012_val_00018958.JPEG n02793495/
+mv val/ILSVRC2012_val_00018959.JPEG n02692877/
+mv val/ILSVRC2012_val_00018960.JPEG n03220513/
+mv val/ILSVRC2012_val_00018961.JPEG n01978287/
+mv val/ILSVRC2012_val_00018962.JPEG n02483362/
+mv val/ILSVRC2012_val_00018963.JPEG n01776313/
+mv val/ILSVRC2012_val_00018964.JPEG n02808304/
+mv val/ILSVRC2012_val_00018965.JPEG n03721384/
+mv val/ILSVRC2012_val_00018966.JPEG n02012849/
+mv val/ILSVRC2012_val_00018967.JPEG n03733281/
+mv val/ILSVRC2012_val_00018968.JPEG n07920052/
+mv val/ILSVRC2012_val_00018969.JPEG n02326432/
+mv val/ILSVRC2012_val_00018970.JPEG n04192698/
+mv val/ILSVRC2012_val_00018971.JPEG n02113799/
+mv val/ILSVRC2012_val_00018972.JPEG n02106550/
+mv val/ILSVRC2012_val_00018973.JPEG n02097298/
+mv val/ILSVRC2012_val_00018974.JPEG n02509815/
+mv val/ILSVRC2012_val_00018975.JPEG n02835271/
+mv val/ILSVRC2012_val_00018976.JPEG n04548280/
+mv val/ILSVRC2012_val_00018977.JPEG n04522168/
+mv val/ILSVRC2012_val_00018978.JPEG n03950228/
+mv val/ILSVRC2012_val_00018979.JPEG n01689811/
+mv val/ILSVRC2012_val_00018980.JPEG n09428293/
+mv val/ILSVRC2012_val_00018981.JPEG n01877812/
+mv val/ILSVRC2012_val_00018982.JPEG n02100583/
+mv val/ILSVRC2012_val_00018983.JPEG n01704323/
+mv val/ILSVRC2012_val_00018984.JPEG n03680355/
+mv val/ILSVRC2012_val_00018985.JPEG n03000247/
+mv val/ILSVRC2012_val_00018986.JPEG n03742115/
+mv val/ILSVRC2012_val_00018987.JPEG n04486054/
+mv val/ILSVRC2012_val_00018988.JPEG n02097298/
+mv val/ILSVRC2012_val_00018989.JPEG n02091635/
+mv val/ILSVRC2012_val_00018990.JPEG n03680355/
+mv val/ILSVRC2012_val_00018991.JPEG n02002556/
+mv val/ILSVRC2012_val_00018992.JPEG n02101388/
+mv val/ILSVRC2012_val_00018993.JPEG n01818515/
+mv val/ILSVRC2012_val_00018994.JPEG n02454379/
+mv val/ILSVRC2012_val_00018995.JPEG n03216828/
+mv val/ILSVRC2012_val_00018996.JPEG n03933933/
+mv val/ILSVRC2012_val_00018997.JPEG n02107683/
+mv val/ILSVRC2012_val_00018998.JPEG n04252077/
+mv val/ILSVRC2012_val_00018999.JPEG n02980441/
+mv val/ILSVRC2012_val_00019000.JPEG n04039381/
+mv val/ILSVRC2012_val_00019001.JPEG n03201208/
+mv val/ILSVRC2012_val_00019002.JPEG n02102177/
+mv val/ILSVRC2012_val_00019003.JPEG n03388549/
+mv val/ILSVRC2012_val_00019004.JPEG n04523525/
+mv val/ILSVRC2012_val_00019005.JPEG n03770439/
+mv val/ILSVRC2012_val_00019006.JPEG n03710193/
+mv val/ILSVRC2012_val_00019007.JPEG n01675722/
+mv val/ILSVRC2012_val_00019008.JPEG n04501370/
+mv val/ILSVRC2012_val_00019009.JPEG n04501370/
+mv val/ILSVRC2012_val_00019010.JPEG n02092002/
+mv val/ILSVRC2012_val_00019011.JPEG n03598930/
+mv val/ILSVRC2012_val_00019012.JPEG n07932039/
+mv val/ILSVRC2012_val_00019013.JPEG n02101006/
+mv val/ILSVRC2012_val_00019014.JPEG n02268853/
+mv val/ILSVRC2012_val_00019015.JPEG n04259630/
+mv val/ILSVRC2012_val_00019016.JPEG n03871628/
+mv val/ILSVRC2012_val_00019017.JPEG n02786058/
+mv val/ILSVRC2012_val_00019018.JPEG n03485794/
+mv val/ILSVRC2012_val_00019019.JPEG n02009912/
+mv val/ILSVRC2012_val_00019020.JPEG n02091244/
+mv val/ILSVRC2012_val_00019021.JPEG n02808304/
+mv val/ILSVRC2012_val_00019022.JPEG n01860187/
+mv val/ILSVRC2012_val_00019023.JPEG n07613480/
+mv val/ILSVRC2012_val_00019024.JPEG n01843065/
+mv val/ILSVRC2012_val_00019025.JPEG n02095889/
+mv val/ILSVRC2012_val_00019026.JPEG n01943899/
+mv val/ILSVRC2012_val_00019027.JPEG n02859443/
+mv val/ILSVRC2012_val_00019028.JPEG n02112350/
+mv val/ILSVRC2012_val_00019029.JPEG n02165456/
+mv val/ILSVRC2012_val_00019030.JPEG n01773797/
+mv val/ILSVRC2012_val_00019031.JPEG n02328150/
+mv val/ILSVRC2012_val_00019032.JPEG n03485407/
+mv val/ILSVRC2012_val_00019033.JPEG n01955084/
+mv val/ILSVRC2012_val_00019034.JPEG n01601694/
+mv val/ILSVRC2012_val_00019035.JPEG n03290653/
+mv val/ILSVRC2012_val_00019036.JPEG n01796340/
+mv val/ILSVRC2012_val_00019037.JPEG n06359193/
+mv val/ILSVRC2012_val_00019038.JPEG n01558993/
+mv val/ILSVRC2012_val_00019039.JPEG n03950228/
+mv val/ILSVRC2012_val_00019040.JPEG n02096437/
+mv val/ILSVRC2012_val_00019041.JPEG n02093859/
+mv val/ILSVRC2012_val_00019042.JPEG n01773549/
+mv val/ILSVRC2012_val_00019043.JPEG n04154565/
+mv val/ILSVRC2012_val_00019044.JPEG n02437616/
+mv val/ILSVRC2012_val_00019045.JPEG n02017213/
+mv val/ILSVRC2012_val_00019046.JPEG n04146614/
+mv val/ILSVRC2012_val_00019047.JPEG n02488702/
+mv val/ILSVRC2012_val_00019048.JPEG n02137549/
+mv val/ILSVRC2012_val_00019049.JPEG n02013706/
+mv val/ILSVRC2012_val_00019050.JPEG n02100735/
+mv val/ILSVRC2012_val_00019051.JPEG n04465501/
+mv val/ILSVRC2012_val_00019052.JPEG n02727426/
+mv val/ILSVRC2012_val_00019053.JPEG n04467665/
+mv val/ILSVRC2012_val_00019054.JPEG n02095889/
+mv val/ILSVRC2012_val_00019055.JPEG n02415577/
+mv val/ILSVRC2012_val_00019056.JPEG n03075370/
+mv val/ILSVRC2012_val_00019057.JPEG n02097298/
+mv val/ILSVRC2012_val_00019058.JPEG n02027492/
+mv val/ILSVRC2012_val_00019059.JPEG n02441942/
+mv val/ILSVRC2012_val_00019060.JPEG n02104029/
+mv val/ILSVRC2012_val_00019061.JPEG n03617480/
+mv val/ILSVRC2012_val_00019062.JPEG n03623198/
+mv val/ILSVRC2012_val_00019063.JPEG n02536864/
+mv val/ILSVRC2012_val_00019064.JPEG n07875152/
+mv val/ILSVRC2012_val_00019065.JPEG n04208210/
+mv val/ILSVRC2012_val_00019066.JPEG n02423022/
+mv val/ILSVRC2012_val_00019067.JPEG n03016953/
+mv val/ILSVRC2012_val_00019068.JPEG n01669191/
+mv val/ILSVRC2012_val_00019069.JPEG n04344873/
+mv val/ILSVRC2012_val_00019070.JPEG n02526121/
+mv val/ILSVRC2012_val_00019071.JPEG n09472597/
+mv val/ILSVRC2012_val_00019072.JPEG n03873416/
+mv val/ILSVRC2012_val_00019073.JPEG n01829413/
+mv val/ILSVRC2012_val_00019074.JPEG n12057211/
+mv val/ILSVRC2012_val_00019075.JPEG n02950826/
+mv val/ILSVRC2012_val_00019076.JPEG n02786058/
+mv val/ILSVRC2012_val_00019077.JPEG n02486410/
+mv val/ILSVRC2012_val_00019078.JPEG n02486261/
+mv val/ILSVRC2012_val_00019079.JPEG n02423022/
+mv val/ILSVRC2012_val_00019080.JPEG n02107574/
+mv val/ILSVRC2012_val_00019081.JPEG n03773504/
+mv val/ILSVRC2012_val_00019082.JPEG n01558993/
+mv val/ILSVRC2012_val_00019083.JPEG n02096177/
+mv val/ILSVRC2012_val_00019084.JPEG n03961711/
+mv val/ILSVRC2012_val_00019085.JPEG n01873310/
+mv val/ILSVRC2012_val_00019086.JPEG n04118538/
+mv val/ILSVRC2012_val_00019087.JPEG n02091032/
+mv val/ILSVRC2012_val_00019088.JPEG n03483316/
+mv val/ILSVRC2012_val_00019089.JPEG n13040303/
+mv val/ILSVRC2012_val_00019090.JPEG n03180011/
+mv val/ILSVRC2012_val_00019091.JPEG n02125311/
+mv val/ILSVRC2012_val_00019092.JPEG n02172182/
+mv val/ILSVRC2012_val_00019093.JPEG n03976657/
+mv val/ILSVRC2012_val_00019094.JPEG n02094258/
+mv val/ILSVRC2012_val_00019095.JPEG n02980441/
+mv val/ILSVRC2012_val_00019096.JPEG n02107312/
+mv val/ILSVRC2012_val_00019097.JPEG n01755581/
+mv val/ILSVRC2012_val_00019098.JPEG n02776631/
+mv val/ILSVRC2012_val_00019099.JPEG n02492660/
+mv val/ILSVRC2012_val_00019100.JPEG n01664065/
+mv val/ILSVRC2012_val_00019101.JPEG n01514668/
+mv val/ILSVRC2012_val_00019102.JPEG n02966193/
+mv val/ILSVRC2012_val_00019103.JPEG n02492035/
+mv val/ILSVRC2012_val_00019104.JPEG n03482405/
+mv val/ILSVRC2012_val_00019105.JPEG n04019541/
+mv val/ILSVRC2012_val_00019106.JPEG n03954731/
+mv val/ILSVRC2012_val_00019107.JPEG n02106550/
+mv val/ILSVRC2012_val_00019108.JPEG n04404412/
+mv val/ILSVRC2012_val_00019109.JPEG n02797295/
+mv val/ILSVRC2012_val_00019110.JPEG n01955084/
+mv val/ILSVRC2012_val_00019111.JPEG n04612504/
+mv val/ILSVRC2012_val_00019112.JPEG n04069434/
+mv val/ILSVRC2012_val_00019113.JPEG n02492035/
+mv val/ILSVRC2012_val_00019114.JPEG n10565667/
+mv val/ILSVRC2012_val_00019115.JPEG n02091134/
+mv val/ILSVRC2012_val_00019116.JPEG n01631663/
+mv val/ILSVRC2012_val_00019117.JPEG n02727426/
+mv val/ILSVRC2012_val_00019118.JPEG n02071294/
+mv val/ILSVRC2012_val_00019119.JPEG n02124075/
+mv val/ILSVRC2012_val_00019120.JPEG n02092002/
+mv val/ILSVRC2012_val_00019121.JPEG n02321529/
+mv val/ILSVRC2012_val_00019122.JPEG n04208210/
+mv val/ILSVRC2012_val_00019123.JPEG n01819313/
+mv val/ILSVRC2012_val_00019124.JPEG n02087046/
+mv val/ILSVRC2012_val_00019125.JPEG n04409515/
+mv val/ILSVRC2012_val_00019126.JPEG n03485794/
+mv val/ILSVRC2012_val_00019127.JPEG n04356056/
+mv val/ILSVRC2012_val_00019128.JPEG n02087046/
+mv val/ILSVRC2012_val_00019129.JPEG n02492035/
+mv val/ILSVRC2012_val_00019130.JPEG n02085782/
+mv val/ILSVRC2012_val_00019131.JPEG n03788365/
+mv val/ILSVRC2012_val_00019132.JPEG n02483708/
+mv val/ILSVRC2012_val_00019133.JPEG n04532106/
+mv val/ILSVRC2012_val_00019134.JPEG n02106030/
+mv val/ILSVRC2012_val_00019135.JPEG n03742115/
+mv val/ILSVRC2012_val_00019136.JPEG n03868242/
+mv val/ILSVRC2012_val_00019137.JPEG n03000684/
+mv val/ILSVRC2012_val_00019138.JPEG n02100236/
+mv val/ILSVRC2012_val_00019139.JPEG n02398521/
+mv val/ILSVRC2012_val_00019140.JPEG n03976657/
+mv val/ILSVRC2012_val_00019141.JPEG n03595614/
+mv val/ILSVRC2012_val_00019142.JPEG n03884397/
+mv val/ILSVRC2012_val_00019143.JPEG n03109150/
+mv val/ILSVRC2012_val_00019144.JPEG n02978881/
+mv val/ILSVRC2012_val_00019145.JPEG n02279972/
+mv val/ILSVRC2012_val_00019146.JPEG n02391049/
+mv val/ILSVRC2012_val_00019147.JPEG n03417042/
+mv val/ILSVRC2012_val_00019148.JPEG n01734418/
+mv val/ILSVRC2012_val_00019149.JPEG n07565083/
+mv val/ILSVRC2012_val_00019150.JPEG n03970156/
+mv val/ILSVRC2012_val_00019151.JPEG n02256656/
+mv val/ILSVRC2012_val_00019152.JPEG n01689811/
+mv val/ILSVRC2012_val_00019153.JPEG n02107683/
+mv val/ILSVRC2012_val_00019154.JPEG n04591713/
+mv val/ILSVRC2012_val_00019155.JPEG n02105855/
+mv val/ILSVRC2012_val_00019156.JPEG n04099969/
+mv val/ILSVRC2012_val_00019157.JPEG n02980441/
+mv val/ILSVRC2012_val_00019158.JPEG n07720875/
+mv val/ILSVRC2012_val_00019159.JPEG n04259630/
+mv val/ILSVRC2012_val_00019160.JPEG n07920052/
+mv val/ILSVRC2012_val_00019161.JPEG n03777754/
+mv val/ILSVRC2012_val_00019162.JPEG n02099429/
+mv val/ILSVRC2012_val_00019163.JPEG n03777568/
+mv val/ILSVRC2012_val_00019164.JPEG n03759954/
+mv val/ILSVRC2012_val_00019165.JPEG n02109525/
+mv val/ILSVRC2012_val_00019166.JPEG n04264628/
+mv val/ILSVRC2012_val_00019167.JPEG n03584829/
+mv val/ILSVRC2012_val_00019168.JPEG n04525305/
+mv val/ILSVRC2012_val_00019169.JPEG n02099712/
+mv val/ILSVRC2012_val_00019170.JPEG n01689811/
+mv val/ILSVRC2012_val_00019171.JPEG n02169497/
+mv val/ILSVRC2012_val_00019172.JPEG n02011460/
+mv val/ILSVRC2012_val_00019173.JPEG n02109961/
+mv val/ILSVRC2012_val_00019174.JPEG n03814906/
+mv val/ILSVRC2012_val_00019175.JPEG n02095314/
+mv val/ILSVRC2012_val_00019176.JPEG n03866082/
+mv val/ILSVRC2012_val_00019177.JPEG n02966687/
+mv val/ILSVRC2012_val_00019178.JPEG n03710721/
+mv val/ILSVRC2012_val_00019179.JPEG n02690373/
+mv val/ILSVRC2012_val_00019180.JPEG n02514041/
+mv val/ILSVRC2012_val_00019181.JPEG n03062245/
+mv val/ILSVRC2012_val_00019182.JPEG n02797295/
+mv val/ILSVRC2012_val_00019183.JPEG n02167151/
+mv val/ILSVRC2012_val_00019184.JPEG n01518878/
+mv val/ILSVRC2012_val_00019185.JPEG n13040303/
+mv val/ILSVRC2012_val_00019186.JPEG n13044778/
+mv val/ILSVRC2012_val_00019187.JPEG n02088364/
+mv val/ILSVRC2012_val_00019188.JPEG n03045698/
+mv val/ILSVRC2012_val_00019189.JPEG n03857828/
+mv val/ILSVRC2012_val_00019190.JPEG n09288635/
+mv val/ILSVRC2012_val_00019191.JPEG n03873416/
+mv val/ILSVRC2012_val_00019192.JPEG n10148035/
+mv val/ILSVRC2012_val_00019193.JPEG n02837789/
+mv val/ILSVRC2012_val_00019194.JPEG n03388183/
+mv val/ILSVRC2012_val_00019195.JPEG n03272010/
+mv val/ILSVRC2012_val_00019196.JPEG n13054560/
+mv val/ILSVRC2012_val_00019197.JPEG n02699494/
+mv val/ILSVRC2012_val_00019198.JPEG n02051845/
+mv val/ILSVRC2012_val_00019199.JPEG n02966193/
+mv val/ILSVRC2012_val_00019200.JPEG n02437312/
+mv val/ILSVRC2012_val_00019201.JPEG n04557648/
+mv val/ILSVRC2012_val_00019202.JPEG n02177972/
+mv val/ILSVRC2012_val_00019203.JPEG n03792782/
+mv val/ILSVRC2012_val_00019204.JPEG n01751748/
+mv val/ILSVRC2012_val_00019205.JPEG n02892767/
+mv val/ILSVRC2012_val_00019206.JPEG n04344873/
+mv val/ILSVRC2012_val_00019207.JPEG n03902125/
+mv val/ILSVRC2012_val_00019208.JPEG n01558993/
+mv val/ILSVRC2012_val_00019209.JPEG n02087394/
+mv val/ILSVRC2012_val_00019210.JPEG n02006656/
+mv val/ILSVRC2012_val_00019211.JPEG n01784675/
+mv val/ILSVRC2012_val_00019212.JPEG n02099601/
+mv val/ILSVRC2012_val_00019213.JPEG n03930313/
+mv val/ILSVRC2012_val_00019214.JPEG n02980441/
+mv val/ILSVRC2012_val_00019215.JPEG n02097209/
+mv val/ILSVRC2012_val_00019216.JPEG n02091032/
+mv val/ILSVRC2012_val_00019217.JPEG n03742115/
+mv val/ILSVRC2012_val_00019218.JPEG n02606052/
+mv val/ILSVRC2012_val_00019219.JPEG n02104365/
+mv val/ILSVRC2012_val_00019220.JPEG n02097130/
+mv val/ILSVRC2012_val_00019221.JPEG n07860988/
+mv val/ILSVRC2012_val_00019222.JPEG n02120079/
+mv val/ILSVRC2012_val_00019223.JPEG n04235860/
+mv val/ILSVRC2012_val_00019224.JPEG n02883205/
+mv val/ILSVRC2012_val_00019225.JPEG n02727426/
+mv val/ILSVRC2012_val_00019226.JPEG n02099267/
+mv val/ILSVRC2012_val_00019227.JPEG n03884397/
+mv val/ILSVRC2012_val_00019228.JPEG n02992211/
+mv val/ILSVRC2012_val_00019229.JPEG n03095699/
+mv val/ILSVRC2012_val_00019230.JPEG n04254777/
+mv val/ILSVRC2012_val_00019231.JPEG n02093859/
+mv val/ILSVRC2012_val_00019232.JPEG n03146219/
+mv val/ILSVRC2012_val_00019233.JPEG n04548362/
+mv val/ILSVRC2012_val_00019234.JPEG n04335435/
+mv val/ILSVRC2012_val_00019235.JPEG n02489166/
+mv val/ILSVRC2012_val_00019236.JPEG n01531178/
+mv val/ILSVRC2012_val_00019237.JPEG n02259212/
+mv val/ILSVRC2012_val_00019238.JPEG n02894605/
+mv val/ILSVRC2012_val_00019239.JPEG n02114855/
+mv val/ILSVRC2012_val_00019240.JPEG n03188531/
+mv val/ILSVRC2012_val_00019241.JPEG n02088466/
+mv val/ILSVRC2012_val_00019242.JPEG n03956157/
+mv val/ILSVRC2012_val_00019243.JPEG n04589890/
+mv val/ILSVRC2012_val_00019244.JPEG n04525038/
+mv val/ILSVRC2012_val_00019245.JPEG n02233338/
+mv val/ILSVRC2012_val_00019246.JPEG n04612504/
+mv val/ILSVRC2012_val_00019247.JPEG n07711569/
+mv val/ILSVRC2012_val_00019248.JPEG n02437312/
+mv val/ILSVRC2012_val_00019249.JPEG n03976657/
+mv val/ILSVRC2012_val_00019250.JPEG n12144580/
+mv val/ILSVRC2012_val_00019251.JPEG n01843065/
+mv val/ILSVRC2012_val_00019252.JPEG n02120505/
+mv val/ILSVRC2012_val_00019253.JPEG n07745940/
+mv val/ILSVRC2012_val_00019254.JPEG n04552348/
+mv val/ILSVRC2012_val_00019255.JPEG n03710721/
+mv val/ILSVRC2012_val_00019256.JPEG n03425413/
+mv val/ILSVRC2012_val_00019257.JPEG n01697457/
+mv val/ILSVRC2012_val_00019258.JPEG n02396427/
+mv val/ILSVRC2012_val_00019259.JPEG n02092339/
+mv val/ILSVRC2012_val_00019260.JPEG n02493509/
+mv val/ILSVRC2012_val_00019261.JPEG n02087046/
+mv val/ILSVRC2012_val_00019262.JPEG n02123159/
+mv val/ILSVRC2012_val_00019263.JPEG n04251144/
+mv val/ILSVRC2012_val_00019264.JPEG n04259630/
+mv val/ILSVRC2012_val_00019265.JPEG n02096051/
+mv val/ILSVRC2012_val_00019266.JPEG n04507155/
+mv val/ILSVRC2012_val_00019267.JPEG n02106662/
+mv val/ILSVRC2012_val_00019268.JPEG n03445777/
+mv val/ILSVRC2012_val_00019269.JPEG n03494278/
+mv val/ILSVRC2012_val_00019270.JPEG n01756291/
+mv val/ILSVRC2012_val_00019271.JPEG n03063689/
+mv val/ILSVRC2012_val_00019272.JPEG n02105162/
+mv val/ILSVRC2012_val_00019273.JPEG n04346328/
+mv val/ILSVRC2012_val_00019274.JPEG n04591713/
+mv val/ILSVRC2012_val_00019275.JPEG n03662601/
+mv val/ILSVRC2012_val_00019276.JPEG n02093428/
+mv val/ILSVRC2012_val_00019277.JPEG n02917067/
+mv val/ILSVRC2012_val_00019278.JPEG n03710721/
+mv val/ILSVRC2012_val_00019279.JPEG n02493509/
+mv val/ILSVRC2012_val_00019280.JPEG n02794156/
+mv val/ILSVRC2012_val_00019281.JPEG n07720875/
+mv val/ILSVRC2012_val_00019282.JPEG n01669191/
+mv val/ILSVRC2012_val_00019283.JPEG n02088364/
+mv val/ILSVRC2012_val_00019284.JPEG n01873310/
+mv val/ILSVRC2012_val_00019285.JPEG n04037443/
+mv val/ILSVRC2012_val_00019286.JPEG n03598930/
+mv val/ILSVRC2012_val_00019287.JPEG n07714571/
+mv val/ILSVRC2012_val_00019288.JPEG n04069434/
+mv val/ILSVRC2012_val_00019289.JPEG n03888257/
+mv val/ILSVRC2012_val_00019290.JPEG n07718472/
+mv val/ILSVRC2012_val_00019291.JPEG n03676483/
+mv val/ILSVRC2012_val_00019292.JPEG n03929660/
+mv val/ILSVRC2012_val_00019293.JPEG n02514041/
+mv val/ILSVRC2012_val_00019294.JPEG n02105056/
+mv val/ILSVRC2012_val_00019295.JPEG n04275548/
+mv val/ILSVRC2012_val_00019296.JPEG n03534580/
+mv val/ILSVRC2012_val_00019297.JPEG n04296562/
+mv val/ILSVRC2012_val_00019298.JPEG n03770439/
+mv val/ILSVRC2012_val_00019299.JPEG n02165456/
+mv val/ILSVRC2012_val_00019300.JPEG n02704792/
+mv val/ILSVRC2012_val_00019301.JPEG n03995372/
+mv val/ILSVRC2012_val_00019302.JPEG n04344873/
+mv val/ILSVRC2012_val_00019303.JPEG n02123159/
+mv val/ILSVRC2012_val_00019304.JPEG n11879895/
+mv val/ILSVRC2012_val_00019305.JPEG n02094114/
+mv val/ILSVRC2012_val_00019306.JPEG n02514041/
+mv val/ILSVRC2012_val_00019307.JPEG n03388549/
+mv val/ILSVRC2012_val_00019308.JPEG n01629819/
+mv val/ILSVRC2012_val_00019309.JPEG n02776631/
+mv val/ILSVRC2012_val_00019310.JPEG n02963159/
+mv val/ILSVRC2012_val_00019311.JPEG n03857828/
+mv val/ILSVRC2012_val_00019312.JPEG n07768694/
+mv val/ILSVRC2012_val_00019313.JPEG n01847000/
+mv val/ILSVRC2012_val_00019314.JPEG n02229544/
+mv val/ILSVRC2012_val_00019315.JPEG n02834397/
+mv val/ILSVRC2012_val_00019316.JPEG n04380533/
+mv val/ILSVRC2012_val_00019317.JPEG n07717410/
+mv val/ILSVRC2012_val_00019318.JPEG n02112706/
+mv val/ILSVRC2012_val_00019319.JPEG n03014705/
+mv val/ILSVRC2012_val_00019320.JPEG n11939491/
+mv val/ILSVRC2012_val_00019321.JPEG n02769748/
+mv val/ILSVRC2012_val_00019322.JPEG n03075370/
+mv val/ILSVRC2012_val_00019323.JPEG n03534580/
+mv val/ILSVRC2012_val_00019324.JPEG n02116738/
+mv val/ILSVRC2012_val_00019325.JPEG n02111277/
+mv val/ILSVRC2012_val_00019326.JPEG n03482405/
+mv val/ILSVRC2012_val_00019327.JPEG n02096294/
+mv val/ILSVRC2012_val_00019328.JPEG n01819313/
+mv val/ILSVRC2012_val_00019329.JPEG n02105056/
+mv val/ILSVRC2012_val_00019330.JPEG n04540053/
+mv val/ILSVRC2012_val_00019331.JPEG n03028079/
+mv val/ILSVRC2012_val_00019332.JPEG n03467068/
+mv val/ILSVRC2012_val_00019333.JPEG n02107683/
+mv val/ILSVRC2012_val_00019334.JPEG n12768682/
+mv val/ILSVRC2012_val_00019335.JPEG n02481823/
+mv val/ILSVRC2012_val_00019336.JPEG n02447366/
+mv val/ILSVRC2012_val_00019337.JPEG n03255030/
+mv val/ILSVRC2012_val_00019338.JPEG n02977058/
+mv val/ILSVRC2012_val_00019339.JPEG n12620546/
+mv val/ILSVRC2012_val_00019340.JPEG n03131574/
+mv val/ILSVRC2012_val_00019341.JPEG n02981792/
+mv val/ILSVRC2012_val_00019342.JPEG n02110063/
+mv val/ILSVRC2012_val_00019343.JPEG n03494278/
+mv val/ILSVRC2012_val_00019344.JPEG n02415577/
+mv val/ILSVRC2012_val_00019345.JPEG n02398521/
+mv val/ILSVRC2012_val_00019346.JPEG n04554684/
+mv val/ILSVRC2012_val_00019347.JPEG n03063599/
+mv val/ILSVRC2012_val_00019348.JPEG n04579145/
+mv val/ILSVRC2012_val_00019349.JPEG n04335435/
+mv val/ILSVRC2012_val_00019350.JPEG n04264628/
+mv val/ILSVRC2012_val_00019351.JPEG n04311004/
+mv val/ILSVRC2012_val_00019352.JPEG n02457408/
+mv val/ILSVRC2012_val_00019353.JPEG n02106550/
+mv val/ILSVRC2012_val_00019354.JPEG n04483307/
+mv val/ILSVRC2012_val_00019355.JPEG n02977058/
+mv val/ILSVRC2012_val_00019356.JPEG n02091244/
+mv val/ILSVRC2012_val_00019357.JPEG n02169497/
+mv val/ILSVRC2012_val_00019358.JPEG n03041632/
+mv val/ILSVRC2012_val_00019359.JPEG n03630383/
+mv val/ILSVRC2012_val_00019360.JPEG n02669723/
+mv val/ILSVRC2012_val_00019361.JPEG n02104029/
+mv val/ILSVRC2012_val_00019362.JPEG n02364673/
+mv val/ILSVRC2012_val_00019363.JPEG n02749479/
+mv val/ILSVRC2012_val_00019364.JPEG n02107312/
+mv val/ILSVRC2012_val_00019365.JPEG n02128925/
+mv val/ILSVRC2012_val_00019366.JPEG n02091831/
+mv val/ILSVRC2012_val_00019367.JPEG n04554684/
+mv val/ILSVRC2012_val_00019368.JPEG n01978287/
+mv val/ILSVRC2012_val_00019369.JPEG n02655020/
+mv val/ILSVRC2012_val_00019370.JPEG n02125311/
+mv val/ILSVRC2012_val_00019371.JPEG n04136333/
+mv val/ILSVRC2012_val_00019372.JPEG n07753113/
+mv val/ILSVRC2012_val_00019373.JPEG n01943899/
+mv val/ILSVRC2012_val_00019374.JPEG n04204347/
+mv val/ILSVRC2012_val_00019375.JPEG n03372029/
+mv val/ILSVRC2012_val_00019376.JPEG n04418357/
+mv val/ILSVRC2012_val_00019377.JPEG n02980441/
+mv val/ILSVRC2012_val_00019378.JPEG n02859443/
+mv val/ILSVRC2012_val_00019379.JPEG n04235860/
+mv val/ILSVRC2012_val_00019380.JPEG n09472597/
+mv val/ILSVRC2012_val_00019381.JPEG n02328150/
+mv val/ILSVRC2012_val_00019382.JPEG n02017213/
+mv val/ILSVRC2012_val_00019383.JPEG n01734418/
+mv val/ILSVRC2012_val_00019384.JPEG n03930313/
+mv val/ILSVRC2012_val_00019385.JPEG n03868242/
+mv val/ILSVRC2012_val_00019386.JPEG n04355338/
+mv val/ILSVRC2012_val_00019387.JPEG n04118538/
+mv val/ILSVRC2012_val_00019388.JPEG n02804610/
+mv val/ILSVRC2012_val_00019389.JPEG n02028035/
+mv val/ILSVRC2012_val_00019390.JPEG n02835271/
+mv val/ILSVRC2012_val_00019391.JPEG n02114548/
+mv val/ILSVRC2012_val_00019392.JPEG n03710193/
+mv val/ILSVRC2012_val_00019393.JPEG n04033901/
+mv val/ILSVRC2012_val_00019394.JPEG n01984695/
+mv val/ILSVRC2012_val_00019395.JPEG n03443371/
+mv val/ILSVRC2012_val_00019396.JPEG n03956157/
+mv val/ILSVRC2012_val_00019397.JPEG n07753113/
+mv val/ILSVRC2012_val_00019398.JPEG n03532672/
+mv val/ILSVRC2012_val_00019399.JPEG n01664065/
+mv val/ILSVRC2012_val_00019400.JPEG n02786058/
+mv val/ILSVRC2012_val_00019401.JPEG n02125311/
+mv val/ILSVRC2012_val_00019402.JPEG n02085620/
+mv val/ILSVRC2012_val_00019403.JPEG n02655020/
+mv val/ILSVRC2012_val_00019404.JPEG n04235860/
+mv val/ILSVRC2012_val_00019405.JPEG n03018349/
+mv val/ILSVRC2012_val_00019406.JPEG n13040303/
+mv val/ILSVRC2012_val_00019407.JPEG n03658185/
+mv val/ILSVRC2012_val_00019408.JPEG n04254680/
+mv val/ILSVRC2012_val_00019409.JPEG n01484850/
+mv val/ILSVRC2012_val_00019410.JPEG n03594945/
+mv val/ILSVRC2012_val_00019411.JPEG n04209133/
+mv val/ILSVRC2012_val_00019412.JPEG n03877845/
+mv val/ILSVRC2012_val_00019413.JPEG n12985857/
+mv val/ILSVRC2012_val_00019414.JPEG n02102040/
+mv val/ILSVRC2012_val_00019415.JPEG n02112018/
+mv val/ILSVRC2012_val_00019416.JPEG n03467068/
+mv val/ILSVRC2012_val_00019417.JPEG n02115641/
+mv val/ILSVRC2012_val_00019418.JPEG n04562935/
+mv val/ILSVRC2012_val_00019419.JPEG n03042490/
+mv val/ILSVRC2012_val_00019420.JPEG n04429376/
+mv val/ILSVRC2012_val_00019421.JPEG n02895154/
+mv val/ILSVRC2012_val_00019422.JPEG n13052670/
+mv val/ILSVRC2012_val_00019423.JPEG n01514668/
+mv val/ILSVRC2012_val_00019424.JPEG n01491361/
+mv val/ILSVRC2012_val_00019425.JPEG n01924916/
+mv val/ILSVRC2012_val_00019426.JPEG n04039381/
+mv val/ILSVRC2012_val_00019427.JPEG n02437616/
+mv val/ILSVRC2012_val_00019428.JPEG n04065272/
+mv val/ILSVRC2012_val_00019429.JPEG n01855672/
+mv val/ILSVRC2012_val_00019430.JPEG n03733281/
+mv val/ILSVRC2012_val_00019431.JPEG n03935335/
+mv val/ILSVRC2012_val_00019432.JPEG n02492035/
+mv val/ILSVRC2012_val_00019433.JPEG n02130308/
+mv val/ILSVRC2012_val_00019434.JPEG n04131690/
+mv val/ILSVRC2012_val_00019435.JPEG n01484850/
+mv val/ILSVRC2012_val_00019436.JPEG n03197337/
+mv val/ILSVRC2012_val_00019437.JPEG n03761084/
+mv val/ILSVRC2012_val_00019438.JPEG n03899768/
+mv val/ILSVRC2012_val_00019439.JPEG n02128385/
+mv val/ILSVRC2012_val_00019440.JPEG n04604644/
+mv val/ILSVRC2012_val_00019441.JPEG n03623198/
+mv val/ILSVRC2012_val_00019442.JPEG n04152593/
+mv val/ILSVRC2012_val_00019443.JPEG n02783161/
+mv val/ILSVRC2012_val_00019444.JPEG n04252225/
+mv val/ILSVRC2012_val_00019445.JPEG n04118538/
+mv val/ILSVRC2012_val_00019446.JPEG n02412080/
+mv val/ILSVRC2012_val_00019447.JPEG n03717622/
+mv val/ILSVRC2012_val_00019448.JPEG n02480495/
+mv val/ILSVRC2012_val_00019449.JPEG n02102480/
+mv val/ILSVRC2012_val_00019450.JPEG n02676566/
+mv val/ILSVRC2012_val_00019451.JPEG n02492035/
+mv val/ILSVRC2012_val_00019452.JPEG n04265275/
+mv val/ILSVRC2012_val_00019453.JPEG n07742313/
+mv val/ILSVRC2012_val_00019454.JPEG n03483316/
+mv val/ILSVRC2012_val_00019455.JPEG n03706229/
+mv val/ILSVRC2012_val_00019456.JPEG n02129165/
+mv val/ILSVRC2012_val_00019457.JPEG n07718747/
+mv val/ILSVRC2012_val_00019458.JPEG n03967562/
+mv val/ILSVRC2012_val_00019459.JPEG n01443537/
+mv val/ILSVRC2012_val_00019460.JPEG n02190166/
+mv val/ILSVRC2012_val_00019461.JPEG n01943899/
+mv val/ILSVRC2012_val_00019462.JPEG n02089078/
+mv val/ILSVRC2012_val_00019463.JPEG n03627232/
+mv val/ILSVRC2012_val_00019464.JPEG n02110958/
+mv val/ILSVRC2012_val_00019465.JPEG n03902125/
+mv val/ILSVRC2012_val_00019466.JPEG n04081281/
+mv val/ILSVRC2012_val_00019467.JPEG n02172182/
+mv val/ILSVRC2012_val_00019468.JPEG n02099849/
+mv val/ILSVRC2012_val_00019469.JPEG n02492035/
+mv val/ILSVRC2012_val_00019470.JPEG n02999410/
+mv val/ILSVRC2012_val_00019471.JPEG n04435653/
+mv val/ILSVRC2012_val_00019472.JPEG n03127925/
+mv val/ILSVRC2012_val_00019473.JPEG n07880968/
+mv val/ILSVRC2012_val_00019474.JPEG n04243546/
+mv val/ILSVRC2012_val_00019475.JPEG n03544143/
+mv val/ILSVRC2012_val_00019476.JPEG n01877812/
+mv val/ILSVRC2012_val_00019477.JPEG n02823750/
+mv val/ILSVRC2012_val_00019478.JPEG n02814533/
+mv val/ILSVRC2012_val_00019479.JPEG n02916936/
+mv val/ILSVRC2012_val_00019480.JPEG n02120505/
+mv val/ILSVRC2012_val_00019481.JPEG n02088632/
+mv val/ILSVRC2012_val_00019482.JPEG n02977058/
+mv val/ILSVRC2012_val_00019483.JPEG n07734744/
+mv val/ILSVRC2012_val_00019484.JPEG n02676566/
+mv val/ILSVRC2012_val_00019485.JPEG n01770081/
+mv val/ILSVRC2012_val_00019486.JPEG n04116512/
+mv val/ILSVRC2012_val_00019487.JPEG n02871525/
+mv val/ILSVRC2012_val_00019488.JPEG n02091032/
+mv val/ILSVRC2012_val_00019489.JPEG n02536864/
+mv val/ILSVRC2012_val_00019490.JPEG n03223299/
+mv val/ILSVRC2012_val_00019491.JPEG n02963159/
+mv val/ILSVRC2012_val_00019492.JPEG n03180011/
+mv val/ILSVRC2012_val_00019493.JPEG n03207743/
+mv val/ILSVRC2012_val_00019494.JPEG n03496892/
+mv val/ILSVRC2012_val_00019495.JPEG n03444034/
+mv val/ILSVRC2012_val_00019496.JPEG n03100240/
+mv val/ILSVRC2012_val_00019497.JPEG n04592741/
+mv val/ILSVRC2012_val_00019498.JPEG n02091831/
+mv val/ILSVRC2012_val_00019499.JPEG n04613696/
+mv val/ILSVRC2012_val_00019500.JPEG n02097130/
+mv val/ILSVRC2012_val_00019501.JPEG n03196217/
+mv val/ILSVRC2012_val_00019502.JPEG n04523525/
+mv val/ILSVRC2012_val_00019503.JPEG n04505470/
+mv val/ILSVRC2012_val_00019504.JPEG n04153751/
+mv val/ILSVRC2012_val_00019505.JPEG n03786901/
+mv val/ILSVRC2012_val_00019506.JPEG n03220513/
+mv val/ILSVRC2012_val_00019507.JPEG n02808440/
+mv val/ILSVRC2012_val_00019508.JPEG n04399382/
+mv val/ILSVRC2012_val_00019509.JPEG n03594945/
+mv val/ILSVRC2012_val_00019510.JPEG n01978455/
+mv val/ILSVRC2012_val_00019511.JPEG n01824575/
+mv val/ILSVRC2012_val_00019512.JPEG n01986214/
+mv val/ILSVRC2012_val_00019513.JPEG n03792782/
+mv val/ILSVRC2012_val_00019514.JPEG n02730930/
+mv val/ILSVRC2012_val_00019515.JPEG n03208938/
+mv val/ILSVRC2012_val_00019516.JPEG n02641379/
+mv val/ILSVRC2012_val_00019517.JPEG n02106030/
+mv val/ILSVRC2012_val_00019518.JPEG n02106550/
+mv val/ILSVRC2012_val_00019519.JPEG n02110063/
+mv val/ILSVRC2012_val_00019520.JPEG n03786901/
+mv val/ILSVRC2012_val_00019521.JPEG n04532670/
+mv val/ILSVRC2012_val_00019522.JPEG n03595614/
+mv val/ILSVRC2012_val_00019523.JPEG n13054560/
+mv val/ILSVRC2012_val_00019524.JPEG n02233338/
+mv val/ILSVRC2012_val_00019525.JPEG n03803284/
+mv val/ILSVRC2012_val_00019526.JPEG n03355925/
+mv val/ILSVRC2012_val_00019527.JPEG n02236044/
+mv val/ILSVRC2012_val_00019528.JPEG n02951585/
+mv val/ILSVRC2012_val_00019529.JPEG n03063599/
+mv val/ILSVRC2012_val_00019530.JPEG n03047690/
+mv val/ILSVRC2012_val_00019531.JPEG n01496331/
+mv val/ILSVRC2012_val_00019532.JPEG n02708093/
+mv val/ILSVRC2012_val_00019533.JPEG n02356798/
+mv val/ILSVRC2012_val_00019534.JPEG n04442312/
+mv val/ILSVRC2012_val_00019535.JPEG n02107574/
+mv val/ILSVRC2012_val_00019536.JPEG n03459775/
+mv val/ILSVRC2012_val_00019537.JPEG n04026417/
+mv val/ILSVRC2012_val_00019538.JPEG n02860847/
+mv val/ILSVRC2012_val_00019539.JPEG n02655020/
+mv val/ILSVRC2012_val_00019540.JPEG n03983396/
+mv val/ILSVRC2012_val_00019541.JPEG n03658185/
+mv val/ILSVRC2012_val_00019542.JPEG n04589890/
+mv val/ILSVRC2012_val_00019543.JPEG n03956157/
+mv val/ILSVRC2012_val_00019544.JPEG n02093991/
+mv val/ILSVRC2012_val_00019545.JPEG n02091032/
+mv val/ILSVRC2012_val_00019546.JPEG n02977058/
+mv val/ILSVRC2012_val_00019547.JPEG n01667114/
+mv val/ILSVRC2012_val_00019548.JPEG n02500267/
+mv val/ILSVRC2012_val_00019549.JPEG n03347037/
+mv val/ILSVRC2012_val_00019550.JPEG n07716906/
+mv val/ILSVRC2012_val_00019551.JPEG n03598930/
+mv val/ILSVRC2012_val_00019552.JPEG n02841315/
+mv val/ILSVRC2012_val_00019553.JPEG n04254777/
+mv val/ILSVRC2012_val_00019554.JPEG n04049303/
+mv val/ILSVRC2012_val_00019555.JPEG n13040303/
+mv val/ILSVRC2012_val_00019556.JPEG n03495258/
+mv val/ILSVRC2012_val_00019557.JPEG n04596742/
+mv val/ILSVRC2012_val_00019558.JPEG n15075141/
+mv val/ILSVRC2012_val_00019559.JPEG n02105251/
+mv val/ILSVRC2012_val_00019560.JPEG n01667114/
+mv val/ILSVRC2012_val_00019561.JPEG n01775062/
+mv val/ILSVRC2012_val_00019562.JPEG n02002724/
+mv val/ILSVRC2012_val_00019563.JPEG n04536866/
+mv val/ILSVRC2012_val_00019564.JPEG n01768244/
+mv val/ILSVRC2012_val_00019565.JPEG n02808440/
+mv val/ILSVRC2012_val_00019566.JPEG n02087046/
+mv val/ILSVRC2012_val_00019567.JPEG n02917067/
+mv val/ILSVRC2012_val_00019568.JPEG n04111531/
+mv val/ILSVRC2012_val_00019569.JPEG n02190166/
+mv val/ILSVRC2012_val_00019570.JPEG n03690938/
+mv val/ILSVRC2012_val_00019571.JPEG n13040303/
+mv val/ILSVRC2012_val_00019572.JPEG n04133789/
+mv val/ILSVRC2012_val_00019573.JPEG n03877845/
+mv val/ILSVRC2012_val_00019574.JPEG n01985128/
+mv val/ILSVRC2012_val_00019575.JPEG n03220513/
+mv val/ILSVRC2012_val_00019576.JPEG n03970156/
+mv val/ILSVRC2012_val_00019577.JPEG n04483307/
+mv val/ILSVRC2012_val_00019578.JPEG n01641577/
+mv val/ILSVRC2012_val_00019579.JPEG n03384352/
+mv val/ILSVRC2012_val_00019580.JPEG n02823750/
+mv val/ILSVRC2012_val_00019581.JPEG n02088238/
+mv val/ILSVRC2012_val_00019582.JPEG n04346328/
+mv val/ILSVRC2012_val_00019583.JPEG n04423845/
+mv val/ILSVRC2012_val_00019584.JPEG n04356056/
+mv val/ILSVRC2012_val_00019585.JPEG n04509417/
+mv val/ILSVRC2012_val_00019586.JPEG n02606052/
+mv val/ILSVRC2012_val_00019587.JPEG n01704323/
+mv val/ILSVRC2012_val_00019588.JPEG n07831146/
+mv val/ILSVRC2012_val_00019589.JPEG n02120505/
+mv val/ILSVRC2012_val_00019590.JPEG n02099601/
+mv val/ILSVRC2012_val_00019591.JPEG n02799071/
+mv val/ILSVRC2012_val_00019592.JPEG n02233338/
+mv val/ILSVRC2012_val_00019593.JPEG n03394916/
+mv val/ILSVRC2012_val_00019594.JPEG n02865351/
+mv val/ILSVRC2012_val_00019595.JPEG n03272562/
+mv val/ILSVRC2012_val_00019596.JPEG n03843555/
+mv val/ILSVRC2012_val_00019597.JPEG n09246464/
+mv val/ILSVRC2012_val_00019598.JPEG n02825657/
+mv val/ILSVRC2012_val_00019599.JPEG n02951585/
+mv val/ILSVRC2012_val_00019600.JPEG n03692522/
+mv val/ILSVRC2012_val_00019601.JPEG n04517823/
+mv val/ILSVRC2012_val_00019602.JPEG n03803284/
+mv val/ILSVRC2012_val_00019603.JPEG n02086910/
+mv val/ILSVRC2012_val_00019604.JPEG n07613480/
+mv val/ILSVRC2012_val_00019605.JPEG n09399592/
+mv val/ILSVRC2012_val_00019606.JPEG n03775071/
+mv val/ILSVRC2012_val_00019607.JPEG n02099429/
+mv val/ILSVRC2012_val_00019608.JPEG n07695742/
+mv val/ILSVRC2012_val_00019609.JPEG n03527444/
+mv val/ILSVRC2012_val_00019610.JPEG n04330267/
+mv val/ILSVRC2012_val_00019611.JPEG n03832673/
+mv val/ILSVRC2012_val_00019612.JPEG n02894605/
+mv val/ILSVRC2012_val_00019613.JPEG n02951585/
+mv val/ILSVRC2012_val_00019614.JPEG n09332890/
+mv val/ILSVRC2012_val_00019615.JPEG n13054560/
+mv val/ILSVRC2012_val_00019616.JPEG n03623198/
+mv val/ILSVRC2012_val_00019617.JPEG n02363005/
+mv val/ILSVRC2012_val_00019618.JPEG n04275548/
+mv val/ILSVRC2012_val_00019619.JPEG n09288635/
+mv val/ILSVRC2012_val_00019620.JPEG n03902125/
+mv val/ILSVRC2012_val_00019621.JPEG n04435653/
+mv val/ILSVRC2012_val_00019622.JPEG n04398044/
+mv val/ILSVRC2012_val_00019623.JPEG n02666196/
+mv val/ILSVRC2012_val_00019624.JPEG n04147183/
+mv val/ILSVRC2012_val_00019625.JPEG n02454379/
+mv val/ILSVRC2012_val_00019626.JPEG n02107574/
+mv val/ILSVRC2012_val_00019627.JPEG n04592741/
+mv val/ILSVRC2012_val_00019628.JPEG n04200800/
+mv val/ILSVRC2012_val_00019629.JPEG n02066245/
+mv val/ILSVRC2012_val_00019630.JPEG n01629819/
+mv val/ILSVRC2012_val_00019631.JPEG n03272562/
+mv val/ILSVRC2012_val_00019632.JPEG n03877472/
+mv val/ILSVRC2012_val_00019633.JPEG n02009229/
+mv val/ILSVRC2012_val_00019634.JPEG n03532672/
+mv val/ILSVRC2012_val_00019635.JPEG n02437312/
+mv val/ILSVRC2012_val_00019636.JPEG n02089078/
+mv val/ILSVRC2012_val_00019637.JPEG n04127249/
+mv val/ILSVRC2012_val_00019638.JPEG n03443371/
+mv val/ILSVRC2012_val_00019639.JPEG n02091635/
+mv val/ILSVRC2012_val_00019640.JPEG n02667093/
+mv val/ILSVRC2012_val_00019641.JPEG n03935335/
+mv val/ILSVRC2012_val_00019642.JPEG n02364673/
+mv val/ILSVRC2012_val_00019643.JPEG n02165105/
+mv val/ILSVRC2012_val_00019644.JPEG n03770439/
+mv val/ILSVRC2012_val_00019645.JPEG n03063599/
+mv val/ILSVRC2012_val_00019646.JPEG n02363005/
+mv val/ILSVRC2012_val_00019647.JPEG n03100240/
+mv val/ILSVRC2012_val_00019648.JPEG n02815834/
+mv val/ILSVRC2012_val_00019649.JPEG n04275548/
+mv val/ILSVRC2012_val_00019650.JPEG n02791270/
+mv val/ILSVRC2012_val_00019651.JPEG n02325366/
+mv val/ILSVRC2012_val_00019652.JPEG n01695060/
+mv val/ILSVRC2012_val_00019653.JPEG n02787622/
+mv val/ILSVRC2012_val_00019654.JPEG n07753113/
+mv val/ILSVRC2012_val_00019655.JPEG n02128385/
+mv val/ILSVRC2012_val_00019656.JPEG n04125021/
+mv val/ILSVRC2012_val_00019657.JPEG n02395406/
+mv val/ILSVRC2012_val_00019658.JPEG n04371430/
+mv val/ILSVRC2012_val_00019659.JPEG n03388043/
+mv val/ILSVRC2012_val_00019660.JPEG n12620546/
+mv val/ILSVRC2012_val_00019661.JPEG n04597913/
+mv val/ILSVRC2012_val_00019662.JPEG n03967562/
+mv val/ILSVRC2012_val_00019663.JPEG n02708093/
+mv val/ILSVRC2012_val_00019664.JPEG n02280649/
+mv val/ILSVRC2012_val_00019665.JPEG n02113978/
+mv val/ILSVRC2012_val_00019666.JPEG n09288635/
+mv val/ILSVRC2012_val_00019667.JPEG n03425413/
+mv val/ILSVRC2012_val_00019668.JPEG n03207941/
+mv val/ILSVRC2012_val_00019669.JPEG n01740131/
+mv val/ILSVRC2012_val_00019670.JPEG n04120489/
+mv val/ILSVRC2012_val_00019671.JPEG n02106382/
+mv val/ILSVRC2012_val_00019672.JPEG n02536864/
+mv val/ILSVRC2012_val_00019673.JPEG n04458633/
+mv val/ILSVRC2012_val_00019674.JPEG n03633091/
+mv val/ILSVRC2012_val_00019675.JPEG n03967562/
+mv val/ILSVRC2012_val_00019676.JPEG n04371430/
+mv val/ILSVRC2012_val_00019677.JPEG n02690373/
+mv val/ILSVRC2012_val_00019678.JPEG n02113186/
+mv val/ILSVRC2012_val_00019679.JPEG n02870880/
+mv val/ILSVRC2012_val_00019680.JPEG n02114855/
+mv val/ILSVRC2012_val_00019681.JPEG n02396427/
+mv val/ILSVRC2012_val_00019682.JPEG n02132136/
+mv val/ILSVRC2012_val_00019683.JPEG n02107908/
+mv val/ILSVRC2012_val_00019684.JPEG n01950731/
+mv val/ILSVRC2012_val_00019685.JPEG n02992529/
+mv val/ILSVRC2012_val_00019686.JPEG n03814639/
+mv val/ILSVRC2012_val_00019687.JPEG n03594734/
+mv val/ILSVRC2012_val_00019688.JPEG n07613480/
+mv val/ILSVRC2012_val_00019689.JPEG n07932039/
+mv val/ILSVRC2012_val_00019690.JPEG n03721384/
+mv val/ILSVRC2012_val_00019691.JPEG n02641379/
+mv val/ILSVRC2012_val_00019692.JPEG n03721384/
+mv val/ILSVRC2012_val_00019693.JPEG n03661043/
+mv val/ILSVRC2012_val_00019694.JPEG n04509417/
+mv val/ILSVRC2012_val_00019695.JPEG n02814533/
+mv val/ILSVRC2012_val_00019696.JPEG n02437616/
+mv val/ILSVRC2012_val_00019697.JPEG n04192698/
+mv val/ILSVRC2012_val_00019698.JPEG n02002724/
+mv val/ILSVRC2012_val_00019699.JPEG n15075141/
+mv val/ILSVRC2012_val_00019700.JPEG n03670208/
+mv val/ILSVRC2012_val_00019701.JPEG n02974003/
+mv val/ILSVRC2012_val_00019702.JPEG n02094433/
+mv val/ILSVRC2012_val_00019703.JPEG n03617480/
+mv val/ILSVRC2012_val_00019704.JPEG n04486054/
+mv val/ILSVRC2012_val_00019705.JPEG n03290653/
+mv val/ILSVRC2012_val_00019706.JPEG n03255030/
+mv val/ILSVRC2012_val_00019707.JPEG n04435653/
+mv val/ILSVRC2012_val_00019708.JPEG n02916936/
+mv val/ILSVRC2012_val_00019709.JPEG n01728572/
+mv val/ILSVRC2012_val_00019710.JPEG n01632777/
+mv val/ILSVRC2012_val_00019711.JPEG n03028079/
+mv val/ILSVRC2012_val_00019712.JPEG n02106382/
+mv val/ILSVRC2012_val_00019713.JPEG n12267677/
+mv val/ILSVRC2012_val_00019714.JPEG n02279972/
+mv val/ILSVRC2012_val_00019715.JPEG n02111129/
+mv val/ILSVRC2012_val_00019716.JPEG n01820546/
+mv val/ILSVRC2012_val_00019717.JPEG n03680355/
+mv val/ILSVRC2012_val_00019718.JPEG n03991062/
+mv val/ILSVRC2012_val_00019719.JPEG n02090721/
+mv val/ILSVRC2012_val_00019720.JPEG n02879718/
+mv val/ILSVRC2012_val_00019721.JPEG n01514668/
+mv val/ILSVRC2012_val_00019722.JPEG n01728572/
+mv val/ILSVRC2012_val_00019723.JPEG n04442312/
+mv val/ILSVRC2012_val_00019724.JPEG n03379051/
+mv val/ILSVRC2012_val_00019725.JPEG n02930766/
+mv val/ILSVRC2012_val_00019726.JPEG n03982430/
+mv val/ILSVRC2012_val_00019727.JPEG n02497673/
+mv val/ILSVRC2012_val_00019728.JPEG n02115641/
+mv val/ILSVRC2012_val_00019729.JPEG n02389026/
+mv val/ILSVRC2012_val_00019730.JPEG n02793495/
+mv val/ILSVRC2012_val_00019731.JPEG n03594945/
+mv val/ILSVRC2012_val_00019732.JPEG n03661043/
+mv val/ILSVRC2012_val_00019733.JPEG n04398044/
+mv val/ILSVRC2012_val_00019734.JPEG n01773797/
+mv val/ILSVRC2012_val_00019735.JPEG n03630383/
+mv val/ILSVRC2012_val_00019736.JPEG n07892512/
+mv val/ILSVRC2012_val_00019737.JPEG n02259212/
+mv val/ILSVRC2012_val_00019738.JPEG n02128757/
+mv val/ILSVRC2012_val_00019739.JPEG n03595614/
+mv val/ILSVRC2012_val_00019740.JPEG n03126707/
+mv val/ILSVRC2012_val_00019741.JPEG n04200800/
+mv val/ILSVRC2012_val_00019742.JPEG n12620546/
+mv val/ILSVRC2012_val_00019743.JPEG n02091032/
+mv val/ILSVRC2012_val_00019744.JPEG n01531178/
+mv val/ILSVRC2012_val_00019745.JPEG n03775071/
+mv val/ILSVRC2012_val_00019746.JPEG n02346627/
+mv val/ILSVRC2012_val_00019747.JPEG n02096294/
+mv val/ILSVRC2012_val_00019748.JPEG n04204347/
+mv val/ILSVRC2012_val_00019749.JPEG n02892201/
+mv val/ILSVRC2012_val_00019750.JPEG n01807496/
+mv val/ILSVRC2012_val_00019751.JPEG n03825788/
+mv val/ILSVRC2012_val_00019752.JPEG n02342885/
+mv val/ILSVRC2012_val_00019753.JPEG n02128385/
+mv val/ILSVRC2012_val_00019754.JPEG n07745940/
+mv val/ILSVRC2012_val_00019755.JPEG n04404412/
+mv val/ILSVRC2012_val_00019756.JPEG n03720891/
+mv val/ILSVRC2012_val_00019757.JPEG n02109961/
+mv val/ILSVRC2012_val_00019758.JPEG n03976657/
+mv val/ILSVRC2012_val_00019759.JPEG n02093256/
+mv val/ILSVRC2012_val_00019760.JPEG n03787032/
+mv val/ILSVRC2012_val_00019761.JPEG n03794056/
+mv val/ILSVRC2012_val_00019762.JPEG n04136333/
+mv val/ILSVRC2012_val_00019763.JPEG n03787032/
+mv val/ILSVRC2012_val_00019764.JPEG n02105855/
+mv val/ILSVRC2012_val_00019765.JPEG n01774384/
+mv val/ILSVRC2012_val_00019766.JPEG n02974003/
+mv val/ILSVRC2012_val_00019767.JPEG n02106030/
+mv val/ILSVRC2012_val_00019768.JPEG n04023962/
+mv val/ILSVRC2012_val_00019769.JPEG n03485794/
+mv val/ILSVRC2012_val_00019770.JPEG n02086910/
+mv val/ILSVRC2012_val_00019771.JPEG n02091134/
+mv val/ILSVRC2012_val_00019772.JPEG n02727426/
+mv val/ILSVRC2012_val_00019773.JPEG n04591157/
+mv val/ILSVRC2012_val_00019774.JPEG n03804744/
+mv val/ILSVRC2012_val_00019775.JPEG n04111531/
+mv val/ILSVRC2012_val_00019776.JPEG n03733805/
+mv val/ILSVRC2012_val_00019777.JPEG n02787622/
+mv val/ILSVRC2012_val_00019778.JPEG n02980441/
+mv val/ILSVRC2012_val_00019779.JPEG n03347037/
+mv val/ILSVRC2012_val_00019780.JPEG n01630670/
+mv val/ILSVRC2012_val_00019781.JPEG n04579432/
+mv val/ILSVRC2012_val_00019782.JPEG n01944390/
+mv val/ILSVRC2012_val_00019783.JPEG n12620546/
+mv val/ILSVRC2012_val_00019784.JPEG n02114712/
+mv val/ILSVRC2012_val_00019785.JPEG n03527444/
+mv val/ILSVRC2012_val_00019786.JPEG n04239074/
+mv val/ILSVRC2012_val_00019787.JPEG n01807496/
+mv val/ILSVRC2012_val_00019788.JPEG n01592084/
+mv val/ILSVRC2012_val_00019789.JPEG n02879718/
+mv val/ILSVRC2012_val_00019790.JPEG n04429376/
+mv val/ILSVRC2012_val_00019791.JPEG n02643566/
+mv val/ILSVRC2012_val_00019792.JPEG n07871810/
+mv val/ILSVRC2012_val_00019793.JPEG n07753113/
+mv val/ILSVRC2012_val_00019794.JPEG n03042490/
+mv val/ILSVRC2012_val_00019795.JPEG n02281787/
+mv val/ILSVRC2012_val_00019796.JPEG n03179701/
+mv val/ILSVRC2012_val_00019797.JPEG n01685808/
+mv val/ILSVRC2012_val_00019798.JPEG n03814906/
+mv val/ILSVRC2012_val_00019799.JPEG n02927161/
+mv val/ILSVRC2012_val_00019800.JPEG n02346627/
+mv val/ILSVRC2012_val_00019801.JPEG n03160309/
+mv val/ILSVRC2012_val_00019802.JPEG n04037443/
+mv val/ILSVRC2012_val_00019803.JPEG n02708093/
+mv val/ILSVRC2012_val_00019804.JPEG n03590841/
+mv val/ILSVRC2012_val_00019805.JPEG n04370456/
+mv val/ILSVRC2012_val_00019806.JPEG n02948072/
+mv val/ILSVRC2012_val_00019807.JPEG n02494079/
+mv val/ILSVRC2012_val_00019808.JPEG n06785654/
+mv val/ILSVRC2012_val_00019809.JPEG n04507155/
+mv val/ILSVRC2012_val_00019810.JPEG n02011460/
+mv val/ILSVRC2012_val_00019811.JPEG n02256656/
+mv val/ILSVRC2012_val_00019812.JPEG n04037443/
+mv val/ILSVRC2012_val_00019813.JPEG n03485794/
+mv val/ILSVRC2012_val_00019814.JPEG n03271574/
+mv val/ILSVRC2012_val_00019815.JPEG n04254777/
+mv val/ILSVRC2012_val_00019816.JPEG n02128757/
+mv val/ILSVRC2012_val_00019817.JPEG n04154565/
+mv val/ILSVRC2012_val_00019818.JPEG n03461385/
+mv val/ILSVRC2012_val_00019819.JPEG n02966193/
+mv val/ILSVRC2012_val_00019820.JPEG n02226429/
+mv val/ILSVRC2012_val_00019821.JPEG n02101006/
+mv val/ILSVRC2012_val_00019822.JPEG n02112018/
+mv val/ILSVRC2012_val_00019823.JPEG n07695742/
+mv val/ILSVRC2012_val_00019824.JPEG n02110341/
+mv val/ILSVRC2012_val_00019825.JPEG n02443114/
+mv val/ILSVRC2012_val_00019826.JPEG n02110185/
+mv val/ILSVRC2012_val_00019827.JPEG n02948072/
+mv val/ILSVRC2012_val_00019828.JPEG n02840245/
+mv val/ILSVRC2012_val_00019829.JPEG n03854065/
+mv val/ILSVRC2012_val_00019830.JPEG n02096294/
+mv val/ILSVRC2012_val_00019831.JPEG n02980441/
+mv val/ILSVRC2012_val_00019832.JPEG n03062245/
+mv val/ILSVRC2012_val_00019833.JPEG n03584829/
+mv val/ILSVRC2012_val_00019834.JPEG n01644900/
+mv val/ILSVRC2012_val_00019835.JPEG n03891251/
+mv val/ILSVRC2012_val_00019836.JPEG n03599486/
+mv val/ILSVRC2012_val_00019837.JPEG n02701002/
+mv val/ILSVRC2012_val_00019838.JPEG n02172182/
+mv val/ILSVRC2012_val_00019839.JPEG n03888605/
+mv val/ILSVRC2012_val_00019840.JPEG n03642806/
+mv val/ILSVRC2012_val_00019841.JPEG n04562935/
+mv val/ILSVRC2012_val_00019842.JPEG n01930112/
+mv val/ILSVRC2012_val_00019843.JPEG n02389026/
+mv val/ILSVRC2012_val_00019844.JPEG n02783161/
+mv val/ILSVRC2012_val_00019845.JPEG n02807133/
+mv val/ILSVRC2012_val_00019846.JPEG n04099969/
+mv val/ILSVRC2012_val_00019847.JPEG n03457902/
+mv val/ILSVRC2012_val_00019848.JPEG n03633091/
+mv val/ILSVRC2012_val_00019849.JPEG n03594945/
+mv val/ILSVRC2012_val_00019850.JPEG n07695742/
+mv val/ILSVRC2012_val_00019851.JPEG n07714990/
+mv val/ILSVRC2012_val_00019852.JPEG n03208938/
+mv val/ILSVRC2012_val_00019853.JPEG n04479046/
+mv val/ILSVRC2012_val_00019854.JPEG n09835506/
+mv val/ILSVRC2012_val_00019855.JPEG n03595614/
+mv val/ILSVRC2012_val_00019856.JPEG n01983481/
+mv val/ILSVRC2012_val_00019857.JPEG n03670208/
+mv val/ILSVRC2012_val_00019858.JPEG n01734418/
+mv val/ILSVRC2012_val_00019859.JPEG n01978455/
+mv val/ILSVRC2012_val_00019860.JPEG n03721384/
+mv val/ILSVRC2012_val_00019861.JPEG n02091635/
+mv val/ILSVRC2012_val_00019862.JPEG n02133161/
+mv val/ILSVRC2012_val_00019863.JPEG n04026417/
+mv val/ILSVRC2012_val_00019864.JPEG n01734418/
+mv val/ILSVRC2012_val_00019865.JPEG n03530642/
+mv val/ILSVRC2012_val_00019866.JPEG n04209133/
+mv val/ILSVRC2012_val_00019867.JPEG n04099969/
+mv val/ILSVRC2012_val_00019868.JPEG n01616318/
+mv val/ILSVRC2012_val_00019869.JPEG n02279972/
+mv val/ILSVRC2012_val_00019870.JPEG n03676483/
+mv val/ILSVRC2012_val_00019871.JPEG n03868863/
+mv val/ILSVRC2012_val_00019872.JPEG n02666196/
+mv val/ILSVRC2012_val_00019873.JPEG n02396427/
+mv val/ILSVRC2012_val_00019874.JPEG n01768244/
+mv val/ILSVRC2012_val_00019875.JPEG n03240683/
+mv val/ILSVRC2012_val_00019876.JPEG n02112018/
+mv val/ILSVRC2012_val_00019877.JPEG n13133613/
+mv val/ILSVRC2012_val_00019878.JPEG n03032252/
+mv val/ILSVRC2012_val_00019879.JPEG n04235860/
+mv val/ILSVRC2012_val_00019880.JPEG n02110627/
+mv val/ILSVRC2012_val_00019881.JPEG n03404251/
+mv val/ILSVRC2012_val_00019882.JPEG n04350905/
+mv val/ILSVRC2012_val_00019883.JPEG n02087046/
+mv val/ILSVRC2012_val_00019884.JPEG n01843383/
+mv val/ILSVRC2012_val_00019885.JPEG n01797886/
+mv val/ILSVRC2012_val_00019886.JPEG n02992211/
+mv val/ILSVRC2012_val_00019887.JPEG n02950826/
+mv val/ILSVRC2012_val_00019888.JPEG n02268853/
+mv val/ILSVRC2012_val_00019889.JPEG n03888605/
+mv val/ILSVRC2012_val_00019890.JPEG n07248320/
+mv val/ILSVRC2012_val_00019891.JPEG n03160309/
+mv val/ILSVRC2012_val_00019892.JPEG n07248320/
+mv val/ILSVRC2012_val_00019893.JPEG n03868242/
+mv val/ILSVRC2012_val_00019894.JPEG n01704323/
+mv val/ILSVRC2012_val_00019895.JPEG n01944390/
+mv val/ILSVRC2012_val_00019896.JPEG n04462240/
+mv val/ILSVRC2012_val_00019897.JPEG n06794110/
+mv val/ILSVRC2012_val_00019898.JPEG n03032252/
+mv val/ILSVRC2012_val_00019899.JPEG n04376876/
+mv val/ILSVRC2012_val_00019900.JPEG n02281406/
+mv val/ILSVRC2012_val_00019901.JPEG n02134418/
+mv val/ILSVRC2012_val_00019902.JPEG n03584829/
+mv val/ILSVRC2012_val_00019903.JPEG n03598930/
+mv val/ILSVRC2012_val_00019904.JPEG n04254777/
+mv val/ILSVRC2012_val_00019905.JPEG n04435653/
+mv val/ILSVRC2012_val_00019906.JPEG n02017213/
+mv val/ILSVRC2012_val_00019907.JPEG n04049303/
+mv val/ILSVRC2012_val_00019908.JPEG n03180011/
+mv val/ILSVRC2012_val_00019909.JPEG n03782006/
+mv val/ILSVRC2012_val_00019910.JPEG n02749479/
+mv val/ILSVRC2012_val_00019911.JPEG n04525305/
+mv val/ILSVRC2012_val_00019912.JPEG n02791270/
+mv val/ILSVRC2012_val_00019913.JPEG n04429376/
+mv val/ILSVRC2012_val_00019914.JPEG n02102318/
+mv val/ILSVRC2012_val_00019915.JPEG n07584110/
+mv val/ILSVRC2012_val_00019916.JPEG n02966687/
+mv val/ILSVRC2012_val_00019917.JPEG n02423022/
+mv val/ILSVRC2012_val_00019918.JPEG n02107142/
+mv val/ILSVRC2012_val_00019919.JPEG n02101556/
+mv val/ILSVRC2012_val_00019920.JPEG n04179913/
+mv val/ILSVRC2012_val_00019921.JPEG n02999410/
+mv val/ILSVRC2012_val_00019922.JPEG n02091134/
+mv val/ILSVRC2012_val_00019923.JPEG n02797295/
+mv val/ILSVRC2012_val_00019924.JPEG n04560804/
+mv val/ILSVRC2012_val_00019925.JPEG n01955084/
+mv val/ILSVRC2012_val_00019926.JPEG n07583066/
+mv val/ILSVRC2012_val_00019927.JPEG n03743016/
+mv val/ILSVRC2012_val_00019928.JPEG n03623198/
+mv val/ILSVRC2012_val_00019929.JPEG n03843555/
+mv val/ILSVRC2012_val_00019930.JPEG n02134084/
+mv val/ILSVRC2012_val_00019931.JPEG n02093256/
+mv val/ILSVRC2012_val_00019932.JPEG n02105505/
+mv val/ILSVRC2012_val_00019933.JPEG n03788195/
+mv val/ILSVRC2012_val_00019934.JPEG n07716906/
+mv val/ILSVRC2012_val_00019935.JPEG n04542943/
+mv val/ILSVRC2012_val_00019936.JPEG n04296562/
+mv val/ILSVRC2012_val_00019937.JPEG n02120079/
+mv val/ILSVRC2012_val_00019938.JPEG n03920288/
+mv val/ILSVRC2012_val_00019939.JPEG n02892767/
+mv val/ILSVRC2012_val_00019940.JPEG n04311174/
+mv val/ILSVRC2012_val_00019941.JPEG n04141327/
+mv val/ILSVRC2012_val_00019942.JPEG n02117135/
+mv val/ILSVRC2012_val_00019943.JPEG n03888605/
+mv val/ILSVRC2012_val_00019944.JPEG n04557648/
+mv val/ILSVRC2012_val_00019945.JPEG n04523525/
+mv val/ILSVRC2012_val_00019946.JPEG n02281787/
+mv val/ILSVRC2012_val_00019947.JPEG n02951358/
+mv val/ILSVRC2012_val_00019948.JPEG n03680355/
+mv val/ILSVRC2012_val_00019949.JPEG n07693725/
+mv val/ILSVRC2012_val_00019950.JPEG n02870880/
+mv val/ILSVRC2012_val_00019951.JPEG n02007558/
+mv val/ILSVRC2012_val_00019952.JPEG n06596364/
+mv val/ILSVRC2012_val_00019953.JPEG n01984695/
+mv val/ILSVRC2012_val_00019954.JPEG n03345487/
+mv val/ILSVRC2012_val_00019955.JPEG n02091244/
+mv val/ILSVRC2012_val_00019956.JPEG n09256479/
+mv val/ILSVRC2012_val_00019957.JPEG n02105162/
+mv val/ILSVRC2012_val_00019958.JPEG n07693725/
+mv val/ILSVRC2012_val_00019959.JPEG n03838899/
+mv val/ILSVRC2012_val_00019960.JPEG n03534580/
+mv val/ILSVRC2012_val_00019961.JPEG n02493509/
+mv val/ILSVRC2012_val_00019962.JPEG n02096177/
+mv val/ILSVRC2012_val_00019963.JPEG n07892512/
+mv val/ILSVRC2012_val_00019964.JPEG n02018795/
+mv val/ILSVRC2012_val_00019965.JPEG n04592741/
+mv val/ILSVRC2012_val_00019966.JPEG n01728920/
+mv val/ILSVRC2012_val_00019967.JPEG n07875152/
+mv val/ILSVRC2012_val_00019968.JPEG n01773797/
+mv val/ILSVRC2012_val_00019969.JPEG n02051845/
+mv val/ILSVRC2012_val_00019970.JPEG n04273569/
+mv val/ILSVRC2012_val_00019971.JPEG n03125729/
+mv val/ILSVRC2012_val_00019972.JPEG n01773549/
+mv val/ILSVRC2012_val_00019973.JPEG n04376876/
+mv val/ILSVRC2012_val_00019974.JPEG n04336792/
+mv val/ILSVRC2012_val_00019975.JPEG n02137549/
+mv val/ILSVRC2012_val_00019976.JPEG n03633091/
+mv val/ILSVRC2012_val_00019977.JPEG n01877812/
+mv val/ILSVRC2012_val_00019978.JPEG n02128757/
+mv val/ILSVRC2012_val_00019979.JPEG n04423845/
+mv val/ILSVRC2012_val_00019980.JPEG n02981792/
+mv val/ILSVRC2012_val_00019981.JPEG n03452741/
+mv val/ILSVRC2012_val_00019982.JPEG n01735189/
+mv val/ILSVRC2012_val_00019983.JPEG n04532106/
+mv val/ILSVRC2012_val_00019984.JPEG n02268853/
+mv val/ILSVRC2012_val_00019985.JPEG n07615774/
+mv val/ILSVRC2012_val_00019986.JPEG n03538406/
+mv val/ILSVRC2012_val_00019987.JPEG n01917289/
+mv val/ILSVRC2012_val_00019988.JPEG n01496331/
+mv val/ILSVRC2012_val_00019989.JPEG n01773549/
+mv val/ILSVRC2012_val_00019990.JPEG n03788195/
+mv val/ILSVRC2012_val_00019991.JPEG n02916936/
+mv val/ILSVRC2012_val_00019992.JPEG n03045698/
+mv val/ILSVRC2012_val_00019993.JPEG n03743016/
+mv val/ILSVRC2012_val_00019994.JPEG n03868863/
+mv val/ILSVRC2012_val_00019995.JPEG n04479046/
+mv val/ILSVRC2012_val_00019996.JPEG n01882714/
+mv val/ILSVRC2012_val_00019997.JPEG n03197337/
+mv val/ILSVRC2012_val_00019998.JPEG n02013706/
+mv val/ILSVRC2012_val_00019999.JPEG n07873807/
+mv val/ILSVRC2012_val_00020000.JPEG n02480855/
+mv val/ILSVRC2012_val_00020001.JPEG n04409515/
+mv val/ILSVRC2012_val_00020002.JPEG n02930766/
+mv val/ILSVRC2012_val_00020003.JPEG n03888257/
+mv val/ILSVRC2012_val_00020004.JPEG n03127925/
+mv val/ILSVRC2012_val_00020005.JPEG n11939491/
+mv val/ILSVRC2012_val_00020006.JPEG n02328150/
+mv val/ILSVRC2012_val_00020007.JPEG n02895154/
+mv val/ILSVRC2012_val_00020008.JPEG n02408429/
+mv val/ILSVRC2012_val_00020009.JPEG n02361337/
+mv val/ILSVRC2012_val_00020010.JPEG n02092339/
+mv val/ILSVRC2012_val_00020011.JPEG n01484850/
+mv val/ILSVRC2012_val_00020012.JPEG n03065424/
+mv val/ILSVRC2012_val_00020013.JPEG n02167151/
+mv val/ILSVRC2012_val_00020014.JPEG n01798484/
+mv val/ILSVRC2012_val_00020015.JPEG n02110341/
+mv val/ILSVRC2012_val_00020016.JPEG n02085620/
+mv val/ILSVRC2012_val_00020017.JPEG n04417672/
+mv val/ILSVRC2012_val_00020018.JPEG n02097047/
+mv val/ILSVRC2012_val_00020019.JPEG n04235860/
+mv val/ILSVRC2012_val_00020020.JPEG n02692877/
+mv val/ILSVRC2012_val_00020021.JPEG n04599235/
+mv val/ILSVRC2012_val_00020022.JPEG n04201297/
+mv val/ILSVRC2012_val_00020023.JPEG n02110341/
+mv val/ILSVRC2012_val_00020024.JPEG n03776460/
+mv val/ILSVRC2012_val_00020025.JPEG n02037110/
+mv val/ILSVRC2012_val_00020026.JPEG n02174001/
+mv val/ILSVRC2012_val_00020027.JPEG n02797295/
+mv val/ILSVRC2012_val_00020028.JPEG n02939185/
+mv val/ILSVRC2012_val_00020029.JPEG n03637318/
+mv val/ILSVRC2012_val_00020030.JPEG n03710721/
+mv val/ILSVRC2012_val_00020031.JPEG n02086646/
+mv val/ILSVRC2012_val_00020032.JPEG n03657121/
+mv val/ILSVRC2012_val_00020033.JPEG n02509815/
+mv val/ILSVRC2012_val_00020034.JPEG n07836838/
+mv val/ILSVRC2012_val_00020035.JPEG n04592741/
+mv val/ILSVRC2012_val_00020036.JPEG n04264628/
+mv val/ILSVRC2012_val_00020037.JPEG n04399382/
+mv val/ILSVRC2012_val_00020038.JPEG n02814533/
+mv val/ILSVRC2012_val_00020039.JPEG n04311174/
+mv val/ILSVRC2012_val_00020040.JPEG n02137549/
+mv val/ILSVRC2012_val_00020041.JPEG n07753113/
+mv val/ILSVRC2012_val_00020042.JPEG n02704792/
+mv val/ILSVRC2012_val_00020043.JPEG n02093859/
+mv val/ILSVRC2012_val_00020044.JPEG n01694178/
+mv val/ILSVRC2012_val_00020045.JPEG n03444034/
+mv val/ILSVRC2012_val_00020046.JPEG n01784675/
+mv val/ILSVRC2012_val_00020047.JPEG n02088466/
+mv val/ILSVRC2012_val_00020048.JPEG n03692522/
+mv val/ILSVRC2012_val_00020049.JPEG n02091244/
+mv val/ILSVRC2012_val_00020050.JPEG n02133161/
+mv val/ILSVRC2012_val_00020051.JPEG n09835506/
+mv val/ILSVRC2012_val_00020052.JPEG n01614925/
+mv val/ILSVRC2012_val_00020053.JPEG n02168699/
+mv val/ILSVRC2012_val_00020054.JPEG n02113624/
+mv val/ILSVRC2012_val_00020055.JPEG n03109150/
+mv val/ILSVRC2012_val_00020056.JPEG n02190166/
+mv val/ILSVRC2012_val_00020057.JPEG n03710721/
+mv val/ILSVRC2012_val_00020058.JPEG n02092002/
+mv val/ILSVRC2012_val_00020059.JPEG n01644373/
+mv val/ILSVRC2012_val_00020060.JPEG n04357314/
+mv val/ILSVRC2012_val_00020061.JPEG n01704323/
+mv val/ILSVRC2012_val_00020062.JPEG n01882714/
+mv val/ILSVRC2012_val_00020063.JPEG n03908618/
+mv val/ILSVRC2012_val_00020064.JPEG n04592741/
+mv val/ILSVRC2012_val_00020065.JPEG n02095570/
+mv val/ILSVRC2012_val_00020066.JPEG n02870880/
+mv val/ILSVRC2012_val_00020067.JPEG n04277352/
+mv val/ILSVRC2012_val_00020068.JPEG n03666591/
+mv val/ILSVRC2012_val_00020069.JPEG n09332890/
+mv val/ILSVRC2012_val_00020070.JPEG n02090721/
+mv val/ILSVRC2012_val_00020071.JPEG n04326547/
+mv val/ILSVRC2012_val_00020072.JPEG n04251144/
+mv val/ILSVRC2012_val_00020073.JPEG n04033901/
+mv val/ILSVRC2012_val_00020074.JPEG n02977058/
+mv val/ILSVRC2012_val_00020075.JPEG n03095699/
+mv val/ILSVRC2012_val_00020076.JPEG n02114548/
+mv val/ILSVRC2012_val_00020077.JPEG n02966193/
+mv val/ILSVRC2012_val_00020078.JPEG n07717410/
+mv val/ILSVRC2012_val_00020079.JPEG n04562935/
+mv val/ILSVRC2012_val_00020080.JPEG n02814860/
+mv val/ILSVRC2012_val_00020081.JPEG n02963159/
+mv val/ILSVRC2012_val_00020082.JPEG n02090721/
+mv val/ILSVRC2012_val_00020083.JPEG n03891251/
+mv val/ILSVRC2012_val_00020084.JPEG n02325366/
+mv val/ILSVRC2012_val_00020085.JPEG n03630383/
+mv val/ILSVRC2012_val_00020086.JPEG n03742115/
+mv val/ILSVRC2012_val_00020087.JPEG n03400231/
+mv val/ILSVRC2012_val_00020088.JPEG n07753275/
+mv val/ILSVRC2012_val_00020089.JPEG n02174001/
+mv val/ILSVRC2012_val_00020090.JPEG n01877812/
+mv val/ILSVRC2012_val_00020091.JPEG n02870880/
+mv val/ILSVRC2012_val_00020092.JPEG n02892201/
+mv val/ILSVRC2012_val_00020093.JPEG n02727426/
+mv val/ILSVRC2012_val_00020094.JPEG n02115913/
+mv val/ILSVRC2012_val_00020095.JPEG n02395406/
+mv val/ILSVRC2012_val_00020096.JPEG n03956157/
+mv val/ILSVRC2012_val_00020097.JPEG n02074367/
+mv val/ILSVRC2012_val_00020098.JPEG n07760859/
+mv val/ILSVRC2012_val_00020099.JPEG n04476259/
+mv val/ILSVRC2012_val_00020100.JPEG n03018349/
+mv val/ILSVRC2012_val_00020101.JPEG n04208210/
+mv val/ILSVRC2012_val_00020102.JPEG n04560804/
+mv val/ILSVRC2012_val_00020103.JPEG n03794056/
+mv val/ILSVRC2012_val_00020104.JPEG n03803284/
+mv val/ILSVRC2012_val_00020105.JPEG n03476684/
+mv val/ILSVRC2012_val_00020106.JPEG n01514668/
+mv val/ILSVRC2012_val_00020107.JPEG n04347754/
+mv val/ILSVRC2012_val_00020108.JPEG n01773157/
+mv val/ILSVRC2012_val_00020109.JPEG n01820546/
+mv val/ILSVRC2012_val_00020110.JPEG n04443257/
+mv val/ILSVRC2012_val_00020111.JPEG n03976657/
+mv val/ILSVRC2012_val_00020112.JPEG n04146614/
+mv val/ILSVRC2012_val_00020113.JPEG n02100583/
+mv val/ILSVRC2012_val_00020114.JPEG n04476259/
+mv val/ILSVRC2012_val_00020115.JPEG n01776313/
+mv val/ILSVRC2012_val_00020116.JPEG n02095570/
+mv val/ILSVRC2012_val_00020117.JPEG n03180011/
+mv val/ILSVRC2012_val_00020118.JPEG n02110806/
+mv val/ILSVRC2012_val_00020119.JPEG n02129165/
+mv val/ILSVRC2012_val_00020120.JPEG n02504013/
+mv val/ILSVRC2012_val_00020121.JPEG n02808304/
+mv val/ILSVRC2012_val_00020122.JPEG n03854065/
+mv val/ILSVRC2012_val_00020123.JPEG n02066245/
+mv val/ILSVRC2012_val_00020124.JPEG n01685808/
+mv val/ILSVRC2012_val_00020125.JPEG n03290653/
+mv val/ILSVRC2012_val_00020126.JPEG n01924916/
+mv val/ILSVRC2012_val_00020127.JPEG n03776460/
+mv val/ILSVRC2012_val_00020128.JPEG n02102973/
+mv val/ILSVRC2012_val_00020129.JPEG n03871628/
+mv val/ILSVRC2012_val_00020130.JPEG n04266014/
+mv val/ILSVRC2012_val_00020131.JPEG n04350905/
+mv val/ILSVRC2012_val_00020132.JPEG n02104029/
+mv val/ILSVRC2012_val_00020133.JPEG n03598930/
+mv val/ILSVRC2012_val_00020134.JPEG n04344873/
+mv val/ILSVRC2012_val_00020135.JPEG n10565667/
+mv val/ILSVRC2012_val_00020136.JPEG n02123045/
+mv val/ILSVRC2012_val_00020137.JPEG n02437312/
+mv val/ILSVRC2012_val_00020138.JPEG n03759954/
+mv val/ILSVRC2012_val_00020139.JPEG n02437616/
+mv val/ILSVRC2012_val_00020140.JPEG n02123159/
+mv val/ILSVRC2012_val_00020141.JPEG n01664065/
+mv val/ILSVRC2012_val_00020142.JPEG n02916936/
+mv val/ILSVRC2012_val_00020143.JPEG n03124170/
+mv val/ILSVRC2012_val_00020144.JPEG n02504013/
+mv val/ILSVRC2012_val_00020145.JPEG n03272562/
+mv val/ILSVRC2012_val_00020146.JPEG n03617480/
+mv val/ILSVRC2012_val_00020147.JPEG n02091244/
+mv val/ILSVRC2012_val_00020148.JPEG n02051845/
+mv val/ILSVRC2012_val_00020149.JPEG n02090622/
+mv val/ILSVRC2012_val_00020150.JPEG n04376876/
+mv val/ILSVRC2012_val_00020151.JPEG n04613696/
+mv val/ILSVRC2012_val_00020152.JPEG n02108551/
+mv val/ILSVRC2012_val_00020153.JPEG n04328186/
+mv val/ILSVRC2012_val_00020154.JPEG n01682714/
+mv val/ILSVRC2012_val_00020155.JPEG n03777754/
+mv val/ILSVRC2012_val_00020156.JPEG n02095570/
+mv val/ILSVRC2012_val_00020157.JPEG n07802026/
+mv val/ILSVRC2012_val_00020158.JPEG n02437616/
+mv val/ILSVRC2012_val_00020159.JPEG n02169497/
+mv val/ILSVRC2012_val_00020160.JPEG n02100735/
+mv val/ILSVRC2012_val_00020161.JPEG n01748264/
+mv val/ILSVRC2012_val_00020162.JPEG n03942813/
+mv val/ILSVRC2012_val_00020163.JPEG n04296562/
+mv val/ILSVRC2012_val_00020164.JPEG n02264363/
+mv val/ILSVRC2012_val_00020165.JPEG n04517823/
+mv val/ILSVRC2012_val_00020166.JPEG n03207743/
+mv val/ILSVRC2012_val_00020167.JPEG n02927161/
+mv val/ILSVRC2012_val_00020168.JPEG n04332243/
+mv val/ILSVRC2012_val_00020169.JPEG n02110185/
+mv val/ILSVRC2012_val_00020170.JPEG n04409515/
+mv val/ILSVRC2012_val_00020171.JPEG n02480495/
+mv val/ILSVRC2012_val_00020172.JPEG n09468604/
+mv val/ILSVRC2012_val_00020173.JPEG n02100735/
+mv val/ILSVRC2012_val_00020174.JPEG n07716358/
+mv val/ILSVRC2012_val_00020175.JPEG n15075141/
+mv val/ILSVRC2012_val_00020176.JPEG n03814639/
+mv val/ILSVRC2012_val_00020177.JPEG n02105251/
+mv val/ILSVRC2012_val_00020178.JPEG n01537544/
+mv val/ILSVRC2012_val_00020179.JPEG n01855672/
+mv val/ILSVRC2012_val_00020180.JPEG n01644900/
+mv val/ILSVRC2012_val_00020181.JPEG n04037443/
+mv val/ILSVRC2012_val_00020182.JPEG n02870880/
+mv val/ILSVRC2012_val_00020183.JPEG n02264363/
+mv val/ILSVRC2012_val_00020184.JPEG n04336792/
+mv val/ILSVRC2012_val_00020185.JPEG n09229709/
+mv val/ILSVRC2012_val_00020186.JPEG n03146219/
+mv val/ILSVRC2012_val_00020187.JPEG n02837789/
+mv val/ILSVRC2012_val_00020188.JPEG n03733281/
+mv val/ILSVRC2012_val_00020189.JPEG n04599235/
+mv val/ILSVRC2012_val_00020190.JPEG n04008634/
+mv val/ILSVRC2012_val_00020191.JPEG n02111500/
+mv val/ILSVRC2012_val_00020192.JPEG n04560804/
+mv val/ILSVRC2012_val_00020193.JPEG n02116738/
+mv val/ILSVRC2012_val_00020194.JPEG n02009229/
+mv val/ILSVRC2012_val_00020195.JPEG n03272562/
+mv val/ILSVRC2012_val_00020196.JPEG n02106030/
+mv val/ILSVRC2012_val_00020197.JPEG n03666591/
+mv val/ILSVRC2012_val_00020198.JPEG n02356798/
+mv val/ILSVRC2012_val_00020199.JPEG n09835506/
+mv val/ILSVRC2012_val_00020200.JPEG n02727426/
+mv val/ILSVRC2012_val_00020201.JPEG n02113712/
+mv val/ILSVRC2012_val_00020202.JPEG n02397096/
+mv val/ILSVRC2012_val_00020203.JPEG n04153751/
+mv val/ILSVRC2012_val_00020204.JPEG n02808304/
+mv val/ILSVRC2012_val_00020205.JPEG n02033041/
+mv val/ILSVRC2012_val_00020206.JPEG n02992529/
+mv val/ILSVRC2012_val_00020207.JPEG n02837789/
+mv val/ILSVRC2012_val_00020208.JPEG n03355925/
+mv val/ILSVRC2012_val_00020209.JPEG n03492542/
+mv val/ILSVRC2012_val_00020210.JPEG n03991062/
+mv val/ILSVRC2012_val_00020211.JPEG n02457408/
+mv val/ILSVRC2012_val_00020212.JPEG n03085013/
+mv val/ILSVRC2012_val_00020213.JPEG n04501370/
+mv val/ILSVRC2012_val_00020214.JPEG n02843684/
+mv val/ILSVRC2012_val_00020215.JPEG n02490219/
+mv val/ILSVRC2012_val_00020216.JPEG n02106382/
+mv val/ILSVRC2012_val_00020217.JPEG n02489166/
+mv val/ILSVRC2012_val_00020218.JPEG n03670208/
+mv val/ILSVRC2012_val_00020219.JPEG n02447366/
+mv val/ILSVRC2012_val_00020220.JPEG n02655020/
+mv val/ILSVRC2012_val_00020221.JPEG n13054560/
+mv val/ILSVRC2012_val_00020222.JPEG n03445924/
+mv val/ILSVRC2012_val_00020223.JPEG n03903868/
+mv val/ILSVRC2012_val_00020224.JPEG n02099601/
+mv val/ILSVRC2012_val_00020225.JPEG n02119022/
+mv val/ILSVRC2012_val_00020226.JPEG n02422106/
+mv val/ILSVRC2012_val_00020227.JPEG n04019541/
+mv val/ILSVRC2012_val_00020228.JPEG n04355933/
+mv val/ILSVRC2012_val_00020229.JPEG n04200800/
+mv val/ILSVRC2012_val_00020230.JPEG n02123597/
+mv val/ILSVRC2012_val_00020231.JPEG n13052670/
+mv val/ILSVRC2012_val_00020232.JPEG n03250847/
+mv val/ILSVRC2012_val_00020233.JPEG n02992529/
+mv val/ILSVRC2012_val_00020234.JPEG n02951585/
+mv val/ILSVRC2012_val_00020235.JPEG n03085013/
+mv val/ILSVRC2012_val_00020236.JPEG n01768244/
+mv val/ILSVRC2012_val_00020237.JPEG n04525305/
+mv val/ILSVRC2012_val_00020238.JPEG n03187595/
+mv val/ILSVRC2012_val_00020239.JPEG n01798484/
+mv val/ILSVRC2012_val_00020240.JPEG n03467068/
+mv val/ILSVRC2012_val_00020241.JPEG n04370456/
+mv val/ILSVRC2012_val_00020242.JPEG n03832673/
+mv val/ILSVRC2012_val_00020243.JPEG n02097130/
+mv val/ILSVRC2012_val_00020244.JPEG n03240683/
+mv val/ILSVRC2012_val_00020245.JPEG n04371430/
+mv val/ILSVRC2012_val_00020246.JPEG n04579432/
+mv val/ILSVRC2012_val_00020247.JPEG n04458633/
+mv val/ILSVRC2012_val_00020248.JPEG n04483307/
+mv val/ILSVRC2012_val_00020249.JPEG n02980441/
+mv val/ILSVRC2012_val_00020250.JPEG n02102318/
+mv val/ILSVRC2012_val_00020251.JPEG n04154565/
+mv val/ILSVRC2012_val_00020252.JPEG n03452741/
+mv val/ILSVRC2012_val_00020253.JPEG n03961711/
+mv val/ILSVRC2012_val_00020254.JPEG n02808440/
+mv val/ILSVRC2012_val_00020255.JPEG n03063689/
+mv val/ILSVRC2012_val_00020256.JPEG n02114855/
+mv val/ILSVRC2012_val_00020257.JPEG n02096051/
+mv val/ILSVRC2012_val_00020258.JPEG n04461696/
+mv val/ILSVRC2012_val_00020259.JPEG n04487394/
+mv val/ILSVRC2012_val_00020260.JPEG n02113186/
+mv val/ILSVRC2012_val_00020261.JPEG n07892512/
+mv val/ILSVRC2012_val_00020262.JPEG n03223299/
+mv val/ILSVRC2012_val_00020263.JPEG n04081281/
+mv val/ILSVRC2012_val_00020264.JPEG n04371774/
+mv val/ILSVRC2012_val_00020265.JPEG n04417672/
+mv val/ILSVRC2012_val_00020266.JPEG n03249569/
+mv val/ILSVRC2012_val_00020267.JPEG n03197337/
+mv val/ILSVRC2012_val_00020268.JPEG n02101006/
+mv val/ILSVRC2012_val_00020269.JPEG n01768244/
+mv val/ILSVRC2012_val_00020270.JPEG n02113186/
+mv val/ILSVRC2012_val_00020271.JPEG n03899768/
+mv val/ILSVRC2012_val_00020272.JPEG n02783161/
+mv val/ILSVRC2012_val_00020273.JPEG n01734418/
+mv val/ILSVRC2012_val_00020274.JPEG n01728920/
+mv val/ILSVRC2012_val_00020275.JPEG n02497673/
+mv val/ILSVRC2012_val_00020276.JPEG n03063599/
+mv val/ILSVRC2012_val_00020277.JPEG n04479046/
+mv val/ILSVRC2012_val_00020278.JPEG n02895154/
+mv val/ILSVRC2012_val_00020279.JPEG n02100877/
+mv val/ILSVRC2012_val_00020280.JPEG n01983481/
+mv val/ILSVRC2012_val_00020281.JPEG n03908618/
+mv val/ILSVRC2012_val_00020282.JPEG n04507155/
+mv val/ILSVRC2012_val_00020283.JPEG n03344393/
+mv val/ILSVRC2012_val_00020284.JPEG n01829413/
+mv val/ILSVRC2012_val_00020285.JPEG n02342885/
+mv val/ILSVRC2012_val_00020286.JPEG n02190166/
+mv val/ILSVRC2012_val_00020287.JPEG n07802026/
+mv val/ILSVRC2012_val_00020288.JPEG n03991062/
+mv val/ILSVRC2012_val_00020289.JPEG n02974003/
+mv val/ILSVRC2012_val_00020290.JPEG n01698640/
+mv val/ILSVRC2012_val_00020291.JPEG n04447861/
+mv val/ILSVRC2012_val_00020292.JPEG n03623198/
+mv val/ILSVRC2012_val_00020293.JPEG n04347754/
+mv val/ILSVRC2012_val_00020294.JPEG n07614500/
+mv val/ILSVRC2012_val_00020295.JPEG n12144580/
+mv val/ILSVRC2012_val_00020296.JPEG n04254680/
+mv val/ILSVRC2012_val_00020297.JPEG n04482393/
+mv val/ILSVRC2012_val_00020298.JPEG n01943899/
+mv val/ILSVRC2012_val_00020299.JPEG n03887697/
+mv val/ILSVRC2012_val_00020300.JPEG n03598930/
+mv val/ILSVRC2012_val_00020301.JPEG n02483362/
+mv val/ILSVRC2012_val_00020302.JPEG n02120079/
+mv val/ILSVRC2012_val_00020303.JPEG n03680355/
+mv val/ILSVRC2012_val_00020304.JPEG n03485407/
+mv val/ILSVRC2012_val_00020305.JPEG n02130308/
+mv val/ILSVRC2012_val_00020306.JPEG n02894605/
+mv val/ILSVRC2012_val_00020307.JPEG n03841143/
+mv val/ILSVRC2012_val_00020308.JPEG n02172182/
+mv val/ILSVRC2012_val_00020309.JPEG n02727426/
+mv val/ILSVRC2012_val_00020310.JPEG n04418357/
+mv val/ILSVRC2012_val_00020311.JPEG n02097209/
+mv val/ILSVRC2012_val_00020312.JPEG n03495258/
+mv val/ILSVRC2012_val_00020313.JPEG n02701002/
+mv val/ILSVRC2012_val_00020314.JPEG n03481172/
+mv val/ILSVRC2012_val_00020315.JPEG n02860847/
+mv val/ILSVRC2012_val_00020316.JPEG n04435653/
+mv val/ILSVRC2012_val_00020317.JPEG n03384352/
+mv val/ILSVRC2012_val_00020318.JPEG n04131690/
+mv val/ILSVRC2012_val_00020319.JPEG n02701002/
+mv val/ILSVRC2012_val_00020320.JPEG n03868863/
+mv val/ILSVRC2012_val_00020321.JPEG n01644373/
+mv val/ILSVRC2012_val_00020322.JPEG n03000247/
+mv val/ILSVRC2012_val_00020323.JPEG n02397096/
+mv val/ILSVRC2012_val_00020324.JPEG n04118776/
+mv val/ILSVRC2012_val_00020325.JPEG n02117135/
+mv val/ILSVRC2012_val_00020326.JPEG n02051845/
+mv val/ILSVRC2012_val_00020327.JPEG n03649909/
+mv val/ILSVRC2012_val_00020328.JPEG n02869837/
+mv val/ILSVRC2012_val_00020329.JPEG n03661043/
+mv val/ILSVRC2012_val_00020330.JPEG n02090622/
+mv val/ILSVRC2012_val_00020331.JPEG n02190166/
+mv val/ILSVRC2012_val_00020332.JPEG n02134084/
+mv val/ILSVRC2012_val_00020333.JPEG n02701002/
+mv val/ILSVRC2012_val_00020334.JPEG n03496892/
+mv val/ILSVRC2012_val_00020335.JPEG n02871525/
+mv val/ILSVRC2012_val_00020336.JPEG n04277352/
+mv val/ILSVRC2012_val_00020337.JPEG n02966193/
+mv val/ILSVRC2012_val_00020338.JPEG n07697313/
+mv val/ILSVRC2012_val_00020339.JPEG n03447447/
+mv val/ILSVRC2012_val_00020340.JPEG n03388183/
+mv val/ILSVRC2012_val_00020341.JPEG n02483708/
+mv val/ILSVRC2012_val_00020342.JPEG n03623198/
+mv val/ILSVRC2012_val_00020343.JPEG n09421951/
+mv val/ILSVRC2012_val_00020344.JPEG n02128925/
+mv val/ILSVRC2012_val_00020345.JPEG n02823428/
+mv val/ILSVRC2012_val_00020346.JPEG n02410509/
+mv val/ILSVRC2012_val_00020347.JPEG n02099429/
+mv val/ILSVRC2012_val_00020348.JPEG n04162706/
+mv val/ILSVRC2012_val_00020349.JPEG n01601694/
+mv val/ILSVRC2012_val_00020350.JPEG n06794110/
+mv val/ILSVRC2012_val_00020351.JPEG n03929660/
+mv val/ILSVRC2012_val_00020352.JPEG n07920052/
+mv val/ILSVRC2012_val_00020353.JPEG n04273569/
+mv val/ILSVRC2012_val_00020354.JPEG n02259212/
+mv val/ILSVRC2012_val_00020355.JPEG n03180011/
+mv val/ILSVRC2012_val_00020356.JPEG n01685808/
+mv val/ILSVRC2012_val_00020357.JPEG n02095889/
+mv val/ILSVRC2012_val_00020358.JPEG n04204347/
+mv val/ILSVRC2012_val_00020359.JPEG n02804414/
+mv val/ILSVRC2012_val_00020360.JPEG n02236044/
+mv val/ILSVRC2012_val_00020361.JPEG n04111531/
+mv val/ILSVRC2012_val_00020362.JPEG n02132136/
+mv val/ILSVRC2012_val_00020363.JPEG n07717556/
+mv val/ILSVRC2012_val_00020364.JPEG n03388183/
+mv val/ILSVRC2012_val_00020365.JPEG n04200800/
+mv val/ILSVRC2012_val_00020366.JPEG n04154565/
+mv val/ILSVRC2012_val_00020367.JPEG n02099601/
+mv val/ILSVRC2012_val_00020368.JPEG n03065424/
+mv val/ILSVRC2012_val_00020369.JPEG n03942813/
+mv val/ILSVRC2012_val_00020370.JPEG n01930112/
+mv val/ILSVRC2012_val_00020371.JPEG n04049303/
+mv val/ILSVRC2012_val_00020372.JPEG n02965783/
+mv val/ILSVRC2012_val_00020373.JPEG n03444034/
+mv val/ILSVRC2012_val_00020374.JPEG n03131574/
+mv val/ILSVRC2012_val_00020375.JPEG n02090721/
+mv val/ILSVRC2012_val_00020376.JPEG n02281787/
+mv val/ILSVRC2012_val_00020377.JPEG n04389033/
+mv val/ILSVRC2012_val_00020378.JPEG n07615774/
+mv val/ILSVRC2012_val_00020379.JPEG n02086240/
+mv val/ILSVRC2012_val_00020380.JPEG n02105412/
+mv val/ILSVRC2012_val_00020381.JPEG n03794056/
+mv val/ILSVRC2012_val_00020382.JPEG n03977966/
+mv val/ILSVRC2012_val_00020383.JPEG n01728572/
+mv val/ILSVRC2012_val_00020384.JPEG n03218198/
+mv val/ILSVRC2012_val_00020385.JPEG n07584110/
+mv val/ILSVRC2012_val_00020386.JPEG n02134084/
+mv val/ILSVRC2012_val_00020387.JPEG n03991062/
+mv val/ILSVRC2012_val_00020388.JPEG n03124170/
+mv val/ILSVRC2012_val_00020389.JPEG n04070727/
+mv val/ILSVRC2012_val_00020390.JPEG n03908618/
+mv val/ILSVRC2012_val_00020391.JPEG n07932039/
+mv val/ILSVRC2012_val_00020392.JPEG n02110806/
+mv val/ILSVRC2012_val_00020393.JPEG n01630670/
+mv val/ILSVRC2012_val_00020394.JPEG n03598930/
+mv val/ILSVRC2012_val_00020395.JPEG n04355338/
+mv val/ILSVRC2012_val_00020396.JPEG n03014705/
+mv val/ILSVRC2012_val_00020397.JPEG n02172182/
+mv val/ILSVRC2012_val_00020398.JPEG n03721384/
+mv val/ILSVRC2012_val_00020399.JPEG n02095314/
+mv val/ILSVRC2012_val_00020400.JPEG n02979186/
+mv val/ILSVRC2012_val_00020401.JPEG n01742172/
+mv val/ILSVRC2012_val_00020402.JPEG n04409515/
+mv val/ILSVRC2012_val_00020403.JPEG n02089973/
+mv val/ILSVRC2012_val_00020404.JPEG n02422699/
+mv val/ILSVRC2012_val_00020405.JPEG n03763968/
+mv val/ILSVRC2012_val_00020406.JPEG n02492660/
+mv val/ILSVRC2012_val_00020407.JPEG n02910353/
+mv val/ILSVRC2012_val_00020408.JPEG n03743016/
+mv val/ILSVRC2012_val_00020409.JPEG n03196217/
+mv val/ILSVRC2012_val_00020410.JPEG n02840245/
+mv val/ILSVRC2012_val_00020411.JPEG n03804744/
+mv val/ILSVRC2012_val_00020412.JPEG n04532106/
+mv val/ILSVRC2012_val_00020413.JPEG n03773504/
+mv val/ILSVRC2012_val_00020414.JPEG n02100236/
+mv val/ILSVRC2012_val_00020415.JPEG n02325366/
+mv val/ILSVRC2012_val_00020416.JPEG n07753275/
+mv val/ILSVRC2012_val_00020417.JPEG n03483316/
+mv val/ILSVRC2012_val_00020418.JPEG n01494475/
+mv val/ILSVRC2012_val_00020419.JPEG n04344873/
+mv val/ILSVRC2012_val_00020420.JPEG n04259630/
+mv val/ILSVRC2012_val_00020421.JPEG n03627232/
+mv val/ILSVRC2012_val_00020422.JPEG n02280649/
+mv val/ILSVRC2012_val_00020423.JPEG n02883205/
+mv val/ILSVRC2012_val_00020424.JPEG n04404412/
+mv val/ILSVRC2012_val_00020425.JPEG n04357314/
+mv val/ILSVRC2012_val_00020426.JPEG n04286575/
+mv val/ILSVRC2012_val_00020427.JPEG n03803284/
+mv val/ILSVRC2012_val_00020428.JPEG n02098413/
+mv val/ILSVRC2012_val_00020429.JPEG n04209239/
+mv val/ILSVRC2012_val_00020430.JPEG n01632777/
+mv val/ILSVRC2012_val_00020431.JPEG n03908618/
+mv val/ILSVRC2012_val_00020432.JPEG n02110185/
+mv val/ILSVRC2012_val_00020433.JPEG n02457408/
+mv val/ILSVRC2012_val_00020434.JPEG n02788148/
+mv val/ILSVRC2012_val_00020435.JPEG n03467068/
+mv val/ILSVRC2012_val_00020436.JPEG n01443537/
+mv val/ILSVRC2012_val_00020437.JPEG n04310018/
+mv val/ILSVRC2012_val_00020438.JPEG n03325584/
+mv val/ILSVRC2012_val_00020439.JPEG n02395406/
+mv val/ILSVRC2012_val_00020440.JPEG n03133878/
+mv val/ILSVRC2012_val_00020441.JPEG n02134084/
+mv val/ILSVRC2012_val_00020442.JPEG n02089867/
+mv val/ILSVRC2012_val_00020443.JPEG n01833805/
+mv val/ILSVRC2012_val_00020444.JPEG n03443371/
+mv val/ILSVRC2012_val_00020445.JPEG n03838899/
+mv val/ILSVRC2012_val_00020446.JPEG n03216828/
+mv val/ILSVRC2012_val_00020447.JPEG n03485794/
+mv val/ILSVRC2012_val_00020448.JPEG n03761084/
+mv val/ILSVRC2012_val_00020449.JPEG n02500267/
+mv val/ILSVRC2012_val_00020450.JPEG n04435653/
+mv val/ILSVRC2012_val_00020451.JPEG n01514668/
+mv val/ILSVRC2012_val_00020452.JPEG n10565667/
+mv val/ILSVRC2012_val_00020453.JPEG n01675722/
+mv val/ILSVRC2012_val_00020454.JPEG n02233338/
+mv val/ILSVRC2012_val_00020455.JPEG n02497673/
+mv val/ILSVRC2012_val_00020456.JPEG n01784675/
+mv val/ILSVRC2012_val_00020457.JPEG n03761084/
+mv val/ILSVRC2012_val_00020458.JPEG n02279972/
+mv val/ILSVRC2012_val_00020459.JPEG n03721384/
+mv val/ILSVRC2012_val_00020460.JPEG n02088238/
+mv val/ILSVRC2012_val_00020461.JPEG n03017168/
+mv val/ILSVRC2012_val_00020462.JPEG n01770081/
+mv val/ILSVRC2012_val_00020463.JPEG n03347037/
+mv val/ILSVRC2012_val_00020464.JPEG n02231487/
+mv val/ILSVRC2012_val_00020465.JPEG n12768682/
+mv val/ILSVRC2012_val_00020466.JPEG n03877472/
+mv val/ILSVRC2012_val_00020467.JPEG n02730930/
+mv val/ILSVRC2012_val_00020468.JPEG n02088238/
+mv val/ILSVRC2012_val_00020469.JPEG n01592084/
+mv val/ILSVRC2012_val_00020470.JPEG n03998194/
+mv val/ILSVRC2012_val_00020471.JPEG n03478589/
+mv val/ILSVRC2012_val_00020472.JPEG n03776460/
+mv val/ILSVRC2012_val_00020473.JPEG n02086910/
+mv val/ILSVRC2012_val_00020474.JPEG n02113624/
+mv val/ILSVRC2012_val_00020475.JPEG n02669723/
+mv val/ILSVRC2012_val_00020476.JPEG n01930112/
+mv val/ILSVRC2012_val_00020477.JPEG n04356056/
+mv val/ILSVRC2012_val_00020478.JPEG n12768682/
+mv val/ILSVRC2012_val_00020479.JPEG n09421951/
+mv val/ILSVRC2012_val_00020480.JPEG n03908618/
+mv val/ILSVRC2012_val_00020481.JPEG n02120079/
+mv val/ILSVRC2012_val_00020482.JPEG n02133161/
+mv val/ILSVRC2012_val_00020483.JPEG n03345487/
+mv val/ILSVRC2012_val_00020484.JPEG n02087046/
+mv val/ILSVRC2012_val_00020485.JPEG n04118538/
+mv val/ILSVRC2012_val_00020486.JPEG n03344393/
+mv val/ILSVRC2012_val_00020487.JPEG n02704792/
+mv val/ILSVRC2012_val_00020488.JPEG n02112018/
+mv val/ILSVRC2012_val_00020489.JPEG n02100583/
+mv val/ILSVRC2012_val_00020490.JPEG n03196217/
+mv val/ILSVRC2012_val_00020491.JPEG n04133789/
+mv val/ILSVRC2012_val_00020492.JPEG n02640242/
+mv val/ILSVRC2012_val_00020493.JPEG n02817516/
+mv val/ILSVRC2012_val_00020494.JPEG n01740131/
+mv val/ILSVRC2012_val_00020495.JPEG n01532829/
+mv val/ILSVRC2012_val_00020496.JPEG n04548362/
+mv val/ILSVRC2012_val_00020497.JPEG n04509417/
+mv val/ILSVRC2012_val_00020498.JPEG n02364673/
+mv val/ILSVRC2012_val_00020499.JPEG n02415577/
+mv val/ILSVRC2012_val_00020500.JPEG n04204347/
+mv val/ILSVRC2012_val_00020501.JPEG n12267677/
+mv val/ILSVRC2012_val_00020502.JPEG n03445777/
+mv val/ILSVRC2012_val_00020503.JPEG n07584110/
+mv val/ILSVRC2012_val_00020504.JPEG n03544143/
+mv val/ILSVRC2012_val_00020505.JPEG n03764736/
+mv val/ILSVRC2012_val_00020506.JPEG n07892512/
+mv val/ILSVRC2012_val_00020507.JPEG n01770393/
+mv val/ILSVRC2012_val_00020508.JPEG n01688243/
+mv val/ILSVRC2012_val_00020509.JPEG n04033995/
+mv val/ILSVRC2012_val_00020510.JPEG n04590129/
+mv val/ILSVRC2012_val_00020511.JPEG n01978287/
+mv val/ILSVRC2012_val_00020512.JPEG n02113712/
+mv val/ILSVRC2012_val_00020513.JPEG n02093428/
+mv val/ILSVRC2012_val_00020514.JPEG n01819313/
+mv val/ILSVRC2012_val_00020515.JPEG n02437312/
+mv val/ILSVRC2012_val_00020516.JPEG n03706229/
+mv val/ILSVRC2012_val_00020517.JPEG n03535780/
+mv val/ILSVRC2012_val_00020518.JPEG n02112137/
+mv val/ILSVRC2012_val_00020519.JPEG n04266014/
+mv val/ILSVRC2012_val_00020520.JPEG n02137549/
+mv val/ILSVRC2012_val_00020521.JPEG n03630383/
+mv val/ILSVRC2012_val_00020522.JPEG n03089624/
+mv val/ILSVRC2012_val_00020523.JPEG n04208210/
+mv val/ILSVRC2012_val_00020524.JPEG n03100240/
+mv val/ILSVRC2012_val_00020525.JPEG n02480495/
+mv val/ILSVRC2012_val_00020526.JPEG n02860847/
+mv val/ILSVRC2012_val_00020527.JPEG n03062245/
+mv val/ILSVRC2012_val_00020528.JPEG n04409515/
+mv val/ILSVRC2012_val_00020529.JPEG n04404412/
+mv val/ILSVRC2012_val_00020530.JPEG n02687172/
+mv val/ILSVRC2012_val_00020531.JPEG n04065272/
+mv val/ILSVRC2012_val_00020532.JPEG n03770439/
+mv val/ILSVRC2012_val_00020533.JPEG n04049303/
+mv val/ILSVRC2012_val_00020534.JPEG n03249569/
+mv val/ILSVRC2012_val_00020535.JPEG n02088238/
+mv val/ILSVRC2012_val_00020536.JPEG n01978287/
+mv val/ILSVRC2012_val_00020537.JPEG n04532106/
+mv val/ILSVRC2012_val_00020538.JPEG n01687978/
+mv val/ILSVRC2012_val_00020539.JPEG n01751748/
+mv val/ILSVRC2012_val_00020540.JPEG n02981792/
+mv val/ILSVRC2012_val_00020541.JPEG n03792972/
+mv val/ILSVRC2012_val_00020542.JPEG n04326547/
+mv val/ILSVRC2012_val_00020543.JPEG n01728920/
+mv val/ILSVRC2012_val_00020544.JPEG n04612504/
+mv val/ILSVRC2012_val_00020545.JPEG n07714990/
+mv val/ILSVRC2012_val_00020546.JPEG n03764736/
+mv val/ILSVRC2012_val_00020547.JPEG n07717410/
+mv val/ILSVRC2012_val_00020548.JPEG n04141327/
+mv val/ILSVRC2012_val_00020549.JPEG n03032252/
+mv val/ILSVRC2012_val_00020550.JPEG n02107574/
+mv val/ILSVRC2012_val_00020551.JPEG n02226429/
+mv val/ILSVRC2012_val_00020552.JPEG n01820546/
+mv val/ILSVRC2012_val_00020553.JPEG n02088364/
+mv val/ILSVRC2012_val_00020554.JPEG n03961711/
+mv val/ILSVRC2012_val_00020555.JPEG n07753113/
+mv val/ILSVRC2012_val_00020556.JPEG n02094114/
+mv val/ILSVRC2012_val_00020557.JPEG n03733805/
+mv val/ILSVRC2012_val_00020558.JPEG n02607072/
+mv val/ILSVRC2012_val_00020559.JPEG n02028035/
+mv val/ILSVRC2012_val_00020560.JPEG n03857828/
+mv val/ILSVRC2012_val_00020561.JPEG n02807133/
+mv val/ILSVRC2012_val_00020562.JPEG n04456115/
+mv val/ILSVRC2012_val_00020563.JPEG n02640242/
+mv val/ILSVRC2012_val_00020564.JPEG n02206856/
+mv val/ILSVRC2012_val_00020565.JPEG n12144580/
+mv val/ILSVRC2012_val_00020566.JPEG n02115913/
+mv val/ILSVRC2012_val_00020567.JPEG n03627232/
+mv val/ILSVRC2012_val_00020568.JPEG n02699494/
+mv val/ILSVRC2012_val_00020569.JPEG n01756291/
+mv val/ILSVRC2012_val_00020570.JPEG n03630383/
+mv val/ILSVRC2012_val_00020571.JPEG n02280649/
+mv val/ILSVRC2012_val_00020572.JPEG n02799071/
+mv val/ILSVRC2012_val_00020573.JPEG n07749582/
+mv val/ILSVRC2012_val_00020574.JPEG n01773157/
+mv val/ILSVRC2012_val_00020575.JPEG n09256479/
+mv val/ILSVRC2012_val_00020576.JPEG n04235860/
+mv val/ILSVRC2012_val_00020577.JPEG n06874185/
+mv val/ILSVRC2012_val_00020578.JPEG n02002556/
+mv val/ILSVRC2012_val_00020579.JPEG n02454379/
+mv val/ILSVRC2012_val_00020580.JPEG n03775546/
+mv val/ILSVRC2012_val_00020581.JPEG n02177972/
+mv val/ILSVRC2012_val_00020582.JPEG n02009229/
+mv val/ILSVRC2012_val_00020583.JPEG n03297495/
+mv val/ILSVRC2012_val_00020584.JPEG n03895866/
+mv val/ILSVRC2012_val_00020585.JPEG n01694178/
+mv val/ILSVRC2012_val_00020586.JPEG n01698640/
+mv val/ILSVRC2012_val_00020587.JPEG n01796340/
+mv val/ILSVRC2012_val_00020588.JPEG n03124043/
+mv val/ILSVRC2012_val_00020589.JPEG n02107683/
+mv val/ILSVRC2012_val_00020590.JPEG n02981792/
+mv val/ILSVRC2012_val_00020591.JPEG n04540053/
+mv val/ILSVRC2012_val_00020592.JPEG n07695742/
+mv val/ILSVRC2012_val_00020593.JPEG n02102318/
+mv val/ILSVRC2012_val_00020594.JPEG n02123597/
+mv val/ILSVRC2012_val_00020595.JPEG n04152593/
+mv val/ILSVRC2012_val_00020596.JPEG n01695060/
+mv val/ILSVRC2012_val_00020597.JPEG n04252077/
+mv val/ILSVRC2012_val_00020598.JPEG n01689811/
+mv val/ILSVRC2012_val_00020599.JPEG n01882714/
+mv val/ILSVRC2012_val_00020600.JPEG n04141327/
+mv val/ILSVRC2012_val_00020601.JPEG n07753592/
+mv val/ILSVRC2012_val_00020602.JPEG n02793495/
+mv val/ILSVRC2012_val_00020603.JPEG n04136333/
+mv val/ILSVRC2012_val_00020604.JPEG n03876231/
+mv val/ILSVRC2012_val_00020605.JPEG n02860847/
+mv val/ILSVRC2012_val_00020606.JPEG n04591157/
+mv val/ILSVRC2012_val_00020607.JPEG n04380533/
+mv val/ILSVRC2012_val_00020608.JPEG n03259280/
+mv val/ILSVRC2012_val_00020609.JPEG n03530642/
+mv val/ILSVRC2012_val_00020610.JPEG n01558993/
+mv val/ILSVRC2012_val_00020611.JPEG n04355338/
+mv val/ILSVRC2012_val_00020612.JPEG n02017213/
+mv val/ILSVRC2012_val_00020613.JPEG n02091032/
+mv val/ILSVRC2012_val_00020614.JPEG n07615774/
+mv val/ILSVRC2012_val_00020615.JPEG n07693725/
+mv val/ILSVRC2012_val_00020616.JPEG n02319095/
+mv val/ILSVRC2012_val_00020617.JPEG n04335435/
+mv val/ILSVRC2012_val_00020618.JPEG n06794110/
+mv val/ILSVRC2012_val_00020619.JPEG n11879895/
+mv val/ILSVRC2012_val_00020620.JPEG n09332890/
+mv val/ILSVRC2012_val_00020621.JPEG n02708093/
+mv val/ILSVRC2012_val_00020622.JPEG n02643566/
+mv val/ILSVRC2012_val_00020623.JPEG n03895866/
+mv val/ILSVRC2012_val_00020624.JPEG n03838899/
+mv val/ILSVRC2012_val_00020625.JPEG n03393912/
+mv val/ILSVRC2012_val_00020626.JPEG n02112137/
+mv val/ILSVRC2012_val_00020627.JPEG n01955084/
+mv val/ILSVRC2012_val_00020628.JPEG n02094433/
+mv val/ILSVRC2012_val_00020629.JPEG n02791124/
+mv val/ILSVRC2012_val_00020630.JPEG n03877472/
+mv val/ILSVRC2012_val_00020631.JPEG n03792782/
+mv val/ILSVRC2012_val_00020632.JPEG n01756291/
+mv val/ILSVRC2012_val_00020633.JPEG n02097474/
+mv val/ILSVRC2012_val_00020634.JPEG n03259280/
+mv val/ILSVRC2012_val_00020635.JPEG n02190166/
+mv val/ILSVRC2012_val_00020636.JPEG n07715103/
+mv val/ILSVRC2012_val_00020637.JPEG n02095889/
+mv val/ILSVRC2012_val_00020638.JPEG n04532106/
+mv val/ILSVRC2012_val_00020639.JPEG n04597913/
+mv val/ILSVRC2012_val_00020640.JPEG n03743016/
+mv val/ILSVRC2012_val_00020641.JPEG n04548362/
+mv val/ILSVRC2012_val_00020642.JPEG n02481823/
+mv val/ILSVRC2012_val_00020643.JPEG n03388549/
+mv val/ILSVRC2012_val_00020644.JPEG n02319095/
+mv val/ILSVRC2012_val_00020645.JPEG n03792972/
+mv val/ILSVRC2012_val_00020646.JPEG n02823750/
+mv val/ILSVRC2012_val_00020647.JPEG n03623198/
+mv val/ILSVRC2012_val_00020648.JPEG n03933933/
+mv val/ILSVRC2012_val_00020649.JPEG n02231487/
+mv val/ILSVRC2012_val_00020650.JPEG n03476684/
+mv val/ILSVRC2012_val_00020651.JPEG n02098286/
+mv val/ILSVRC2012_val_00020652.JPEG n02169497/
+mv val/ILSVRC2012_val_00020653.JPEG n03379051/
+mv val/ILSVRC2012_val_00020654.JPEG n02457408/
+mv val/ILSVRC2012_val_00020655.JPEG n07742313/
+mv val/ILSVRC2012_val_00020656.JPEG n07615774/
+mv val/ILSVRC2012_val_00020657.JPEG n02206856/
+mv val/ILSVRC2012_val_00020658.JPEG n04239074/
+mv val/ILSVRC2012_val_00020659.JPEG n03393912/
+mv val/ILSVRC2012_val_00020660.JPEG n01592084/
+mv val/ILSVRC2012_val_00020661.JPEG n03680355/
+mv val/ILSVRC2012_val_00020662.JPEG n02837789/
+mv val/ILSVRC2012_val_00020663.JPEG n03590841/
+mv val/ILSVRC2012_val_00020664.JPEG n01986214/
+mv val/ILSVRC2012_val_00020665.JPEG n03657121/
+mv val/ILSVRC2012_val_00020666.JPEG n03697007/
+mv val/ILSVRC2012_val_00020667.JPEG n01697457/
+mv val/ILSVRC2012_val_00020668.JPEG n02447366/
+mv val/ILSVRC2012_val_00020669.JPEG n04418357/
+mv val/ILSVRC2012_val_00020670.JPEG n04367480/
+mv val/ILSVRC2012_val_00020671.JPEG n03220513/
+mv val/ILSVRC2012_val_00020672.JPEG n04479046/
+mv val/ILSVRC2012_val_00020673.JPEG n03100240/
+mv val/ILSVRC2012_val_00020674.JPEG n03000684/
+mv val/ILSVRC2012_val_00020675.JPEG n01978287/
+mv val/ILSVRC2012_val_00020676.JPEG n02105855/
+mv val/ILSVRC2012_val_00020677.JPEG n03127925/
+mv val/ILSVRC2012_val_00020678.JPEG n02105855/
+mv val/ILSVRC2012_val_00020679.JPEG n02092002/
+mv val/ILSVRC2012_val_00020680.JPEG n02028035/
+mv val/ILSVRC2012_val_00020681.JPEG n02094258/
+mv val/ILSVRC2012_val_00020682.JPEG n04204347/
+mv val/ILSVRC2012_val_00020683.JPEG n01795545/
+mv val/ILSVRC2012_val_00020684.JPEG n02125311/
+mv val/ILSVRC2012_val_00020685.JPEG n02823750/
+mv val/ILSVRC2012_val_00020686.JPEG n02112137/
+mv val/ILSVRC2012_val_00020687.JPEG n03126707/
+mv val/ILSVRC2012_val_00020688.JPEG n02123597/
+mv val/ILSVRC2012_val_00020689.JPEG n03223299/
+mv val/ILSVRC2012_val_00020690.JPEG n01798484/
+mv val/ILSVRC2012_val_00020691.JPEG n02280649/
+mv val/ILSVRC2012_val_00020692.JPEG n01776313/
+mv val/ILSVRC2012_val_00020693.JPEG n02641379/
+mv val/ILSVRC2012_val_00020694.JPEG n01608432/
+mv val/ILSVRC2012_val_00020695.JPEG n03249569/
+mv val/ILSVRC2012_val_00020696.JPEG n01630670/
+mv val/ILSVRC2012_val_00020697.JPEG n03895866/
+mv val/ILSVRC2012_val_00020698.JPEG n03888257/
+mv val/ILSVRC2012_val_00020699.JPEG n02422106/
+mv val/ILSVRC2012_val_00020700.JPEG n02093859/
+mv val/ILSVRC2012_val_00020701.JPEG n04125021/
+mv val/ILSVRC2012_val_00020702.JPEG n04065272/
+mv val/ILSVRC2012_val_00020703.JPEG n03814906/
+mv val/ILSVRC2012_val_00020704.JPEG n03992509/
+mv val/ILSVRC2012_val_00020705.JPEG n04423845/
+mv val/ILSVRC2012_val_00020706.JPEG n03393912/
+mv val/ILSVRC2012_val_00020707.JPEG n02066245/
+mv val/ILSVRC2012_val_00020708.JPEG n02114548/
+mv val/ILSVRC2012_val_00020709.JPEG n10148035/
+mv val/ILSVRC2012_val_00020710.JPEG n01608432/
+mv val/ILSVRC2012_val_00020711.JPEG n04355338/
+mv val/ILSVRC2012_val_00020712.JPEG n04277352/
+mv val/ILSVRC2012_val_00020713.JPEG n03976467/
+mv val/ILSVRC2012_val_00020714.JPEG n02859443/
+mv val/ILSVRC2012_val_00020715.JPEG n04141076/
+mv val/ILSVRC2012_val_00020716.JPEG n02127052/
+mv val/ILSVRC2012_val_00020717.JPEG n02088466/
+mv val/ILSVRC2012_val_00020718.JPEG n07880968/
+mv val/ILSVRC2012_val_00020719.JPEG n09835506/
+mv val/ILSVRC2012_val_00020720.JPEG n03874293/
+mv val/ILSVRC2012_val_00020721.JPEG n03481172/
+mv val/ILSVRC2012_val_00020722.JPEG n04355338/
+mv val/ILSVRC2012_val_00020723.JPEG n02894605/
+mv val/ILSVRC2012_val_00020724.JPEG n03544143/
+mv val/ILSVRC2012_val_00020725.JPEG n02977058/
+mv val/ILSVRC2012_val_00020726.JPEG n01773157/
+mv val/ILSVRC2012_val_00020727.JPEG n02486261/
+mv val/ILSVRC2012_val_00020728.JPEG n02112137/
+mv val/ILSVRC2012_val_00020729.JPEG n03075370/
+mv val/ILSVRC2012_val_00020730.JPEG n01601694/
+mv val/ILSVRC2012_val_00020731.JPEG n04004767/
+mv val/ILSVRC2012_val_00020732.JPEG n04273569/
+mv val/ILSVRC2012_val_00020733.JPEG n04275548/
+mv val/ILSVRC2012_val_00020734.JPEG n02966193/
+mv val/ILSVRC2012_val_00020735.JPEG n03443371/
+mv val/ILSVRC2012_val_00020736.JPEG n01755581/
+mv val/ILSVRC2012_val_00020737.JPEG n02100877/
+mv val/ILSVRC2012_val_00020738.JPEG n04325704/
+mv val/ILSVRC2012_val_00020739.JPEG n02090379/
+mv val/ILSVRC2012_val_00020740.JPEG n02088466/
+mv val/ILSVRC2012_val_00020741.JPEG n03347037/
+mv val/ILSVRC2012_val_00020742.JPEG n03691459/
+mv val/ILSVRC2012_val_00020743.JPEG n01616318/
+mv val/ILSVRC2012_val_00020744.JPEG n01820546/
+mv val/ILSVRC2012_val_00020745.JPEG n04009552/
+mv val/ILSVRC2012_val_00020746.JPEG n03637318/
+mv val/ILSVRC2012_val_00020747.JPEG n01795545/
+mv val/ILSVRC2012_val_00020748.JPEG n02108000/
+mv val/ILSVRC2012_val_00020749.JPEG n01843383/
+mv val/ILSVRC2012_val_00020750.JPEG n03908618/
+mv val/ILSVRC2012_val_00020751.JPEG n07753275/
+mv val/ILSVRC2012_val_00020752.JPEG n02950826/
+mv val/ILSVRC2012_val_00020753.JPEG n04069434/
+mv val/ILSVRC2012_val_00020754.JPEG n02701002/
+mv val/ILSVRC2012_val_00020755.JPEG n02799071/
+mv val/ILSVRC2012_val_00020756.JPEG n02786058/
+mv val/ILSVRC2012_val_00020757.JPEG n02526121/
+mv val/ILSVRC2012_val_00020758.JPEG n03459775/
+mv val/ILSVRC2012_val_00020759.JPEG n04552348/
+mv val/ILSVRC2012_val_00020760.JPEG n04462240/
+mv val/ILSVRC2012_val_00020761.JPEG n02108915/
+mv val/ILSVRC2012_val_00020762.JPEG n02088364/
+mv val/ILSVRC2012_val_00020763.JPEG n02791270/
+mv val/ILSVRC2012_val_00020764.JPEG n01682714/
+mv val/ILSVRC2012_val_00020765.JPEG n02123394/
+mv val/ILSVRC2012_val_00020766.JPEG n02101388/
+mv val/ILSVRC2012_val_00020767.JPEG n02840245/
+mv val/ILSVRC2012_val_00020768.JPEG n04493381/
+mv val/ILSVRC2012_val_00020769.JPEG n01990800/
+mv val/ILSVRC2012_val_00020770.JPEG n04162706/
+mv val/ILSVRC2012_val_00020771.JPEG n13054560/
+mv val/ILSVRC2012_val_00020772.JPEG n01632777/
+mv val/ILSVRC2012_val_00020773.JPEG n02093859/
+mv val/ILSVRC2012_val_00020774.JPEG n02025239/
+mv val/ILSVRC2012_val_00020775.JPEG n02797295/
+mv val/ILSVRC2012_val_00020776.JPEG n03179701/
+mv val/ILSVRC2012_val_00020777.JPEG n02980441/
+mv val/ILSVRC2012_val_00020778.JPEG n04596742/
+mv val/ILSVRC2012_val_00020779.JPEG n01980166/
+mv val/ILSVRC2012_val_00020780.JPEG n09835506/
+mv val/ILSVRC2012_val_00020781.JPEG n03445777/
+mv val/ILSVRC2012_val_00020782.JPEG n03110669/
+mv val/ILSVRC2012_val_00020783.JPEG n02094114/
+mv val/ILSVRC2012_val_00020784.JPEG n02086079/
+mv val/ILSVRC2012_val_00020785.JPEG n01443537/
+mv val/ILSVRC2012_val_00020786.JPEG n02110063/
+mv val/ILSVRC2012_val_00020787.JPEG n04355338/
+mv val/ILSVRC2012_val_00020788.JPEG n01560419/
+mv val/ILSVRC2012_val_00020789.JPEG n03355925/
+mv val/ILSVRC2012_val_00020790.JPEG n02119022/
+mv val/ILSVRC2012_val_00020791.JPEG n03447447/
+mv val/ILSVRC2012_val_00020792.JPEG n02219486/
+mv val/ILSVRC2012_val_00020793.JPEG n02113624/
+mv val/ILSVRC2012_val_00020794.JPEG n04523525/
+mv val/ILSVRC2012_val_00020795.JPEG n01983481/
+mv val/ILSVRC2012_val_00020796.JPEG n10565667/
+mv val/ILSVRC2012_val_00020797.JPEG n03803284/
+mv val/ILSVRC2012_val_00020798.JPEG n04367480/
+mv val/ILSVRC2012_val_00020799.JPEG n03400231/
+mv val/ILSVRC2012_val_00020800.JPEG n01980166/
+mv val/ILSVRC2012_val_00020801.JPEG n04596742/
+mv val/ILSVRC2012_val_00020802.JPEG n02417914/
+mv val/ILSVRC2012_val_00020803.JPEG n02514041/
+mv val/ILSVRC2012_val_00020804.JPEG n02033041/
+mv val/ILSVRC2012_val_00020805.JPEG n02094114/
+mv val/ILSVRC2012_val_00020806.JPEG n02134084/
+mv val/ILSVRC2012_val_00020807.JPEG n13040303/
+mv val/ILSVRC2012_val_00020808.JPEG n03763968/
+mv val/ILSVRC2012_val_00020809.JPEG n04111531/
+mv val/ILSVRC2012_val_00020810.JPEG n02090622/
+mv val/ILSVRC2012_val_00020811.JPEG n02486261/
+mv val/ILSVRC2012_val_00020812.JPEG n03452741/
+mv val/ILSVRC2012_val_00020813.JPEG n04458633/
+mv val/ILSVRC2012_val_00020814.JPEG n02094114/
+mv val/ILSVRC2012_val_00020815.JPEG n02097658/
+mv val/ILSVRC2012_val_00020816.JPEG n01978455/
+mv val/ILSVRC2012_val_00020817.JPEG n02988304/
+mv val/ILSVRC2012_val_00020818.JPEG n04229816/
+mv val/ILSVRC2012_val_00020819.JPEG n02892767/
+mv val/ILSVRC2012_val_00020820.JPEG n02804414/
+mv val/ILSVRC2012_val_00020821.JPEG n03240683/
+mv val/ILSVRC2012_val_00020822.JPEG n01443537/
+mv val/ILSVRC2012_val_00020823.JPEG n02088632/
+mv val/ILSVRC2012_val_00020824.JPEG n02172182/
+mv val/ILSVRC2012_val_00020825.JPEG n02786058/
+mv val/ILSVRC2012_val_00020826.JPEG n02701002/
+mv val/ILSVRC2012_val_00020827.JPEG n04515003/
+mv val/ILSVRC2012_val_00020828.JPEG n07693725/
+mv val/ILSVRC2012_val_00020829.JPEG n03594945/
+mv val/ILSVRC2012_val_00020830.JPEG n02100735/
+mv val/ILSVRC2012_val_00020831.JPEG n04204347/
+mv val/ILSVRC2012_val_00020832.JPEG n02093754/
+mv val/ILSVRC2012_val_00020833.JPEG n09428293/
+mv val/ILSVRC2012_val_00020834.JPEG n03958227/
+mv val/ILSVRC2012_val_00020835.JPEG n03042490/
+mv val/ILSVRC2012_val_00020836.JPEG n06359193/
+mv val/ILSVRC2012_val_00020837.JPEG n02102177/
+mv val/ILSVRC2012_val_00020838.JPEG n03445924/
+mv val/ILSVRC2012_val_00020839.JPEG n04141975/
+mv val/ILSVRC2012_val_00020840.JPEG n03690938/
+mv val/ILSVRC2012_val_00020841.JPEG n02108089/
+mv val/ILSVRC2012_val_00020842.JPEG n03075370/
+mv val/ILSVRC2012_val_00020843.JPEG n04517823/
+mv val/ILSVRC2012_val_00020844.JPEG n03208938/
+mv val/ILSVRC2012_val_00020845.JPEG n03958227/
+mv val/ILSVRC2012_val_00020846.JPEG n10148035/
+mv val/ILSVRC2012_val_00020847.JPEG n02444819/
+mv val/ILSVRC2012_val_00020848.JPEG n02092002/
+mv val/ILSVRC2012_val_00020849.JPEG n10565667/
+mv val/ILSVRC2012_val_00020850.JPEG n02437312/
+mv val/ILSVRC2012_val_00020851.JPEG n02280649/
+mv val/ILSVRC2012_val_00020852.JPEG n02909870/
+mv val/ILSVRC2012_val_00020853.JPEG n03977966/
+mv val/ILSVRC2012_val_00020854.JPEG n03110669/
+mv val/ILSVRC2012_val_00020855.JPEG n03777568/
+mv val/ILSVRC2012_val_00020856.JPEG n07930864/
+mv val/ILSVRC2012_val_00020857.JPEG n04560804/
+mv val/ILSVRC2012_val_00020858.JPEG n03888605/
+mv val/ILSVRC2012_val_00020859.JPEG n02120505/
+mv val/ILSVRC2012_val_00020860.JPEG n03014705/
+mv val/ILSVRC2012_val_00020861.JPEG n01744401/
+mv val/ILSVRC2012_val_00020862.JPEG n03770439/
+mv val/ILSVRC2012_val_00020863.JPEG n03393912/
+mv val/ILSVRC2012_val_00020864.JPEG n02727426/
+mv val/ILSVRC2012_val_00020865.JPEG n02093754/
+mv val/ILSVRC2012_val_00020866.JPEG n03379051/
+mv val/ILSVRC2012_val_00020867.JPEG n03788195/
+mv val/ILSVRC2012_val_00020868.JPEG n02099601/
+mv val/ILSVRC2012_val_00020869.JPEG n02481823/
+mv val/ILSVRC2012_val_00020870.JPEG n03291819/
+mv val/ILSVRC2012_val_00020871.JPEG n04127249/
+mv val/ILSVRC2012_val_00020872.JPEG n03803284/
+mv val/ILSVRC2012_val_00020873.JPEG n03794056/
+mv val/ILSVRC2012_val_00020874.JPEG n03478589/
+mv val/ILSVRC2012_val_00020875.JPEG n02009912/
+mv val/ILSVRC2012_val_00020876.JPEG n07579787/
+mv val/ILSVRC2012_val_00020877.JPEG n02951358/
+mv val/ILSVRC2012_val_00020878.JPEG n03297495/
+mv val/ILSVRC2012_val_00020879.JPEG n04517823/
+mv val/ILSVRC2012_val_00020880.JPEG n03794056/
+mv val/ILSVRC2012_val_00020881.JPEG n03854065/
+mv val/ILSVRC2012_val_00020882.JPEG n04325704/
+mv val/ILSVRC2012_val_00020883.JPEG n03902125/
+mv val/ILSVRC2012_val_00020884.JPEG n03207941/
+mv val/ILSVRC2012_val_00020885.JPEG n03160309/
+mv val/ILSVRC2012_val_00020886.JPEG n02727426/
+mv val/ILSVRC2012_val_00020887.JPEG n03498962/
+mv val/ILSVRC2012_val_00020888.JPEG n02056570/
+mv val/ILSVRC2012_val_00020889.JPEG n01530575/
+mv val/ILSVRC2012_val_00020890.JPEG n03290653/
+mv val/ILSVRC2012_val_00020891.JPEG n03133878/
+mv val/ILSVRC2012_val_00020892.JPEG n02099267/
+mv val/ILSVRC2012_val_00020893.JPEG n03742115/
+mv val/ILSVRC2012_val_00020894.JPEG n04273569/
+mv val/ILSVRC2012_val_00020895.JPEG n02977058/
+mv val/ILSVRC2012_val_00020896.JPEG n03724870/
+mv val/ILSVRC2012_val_00020897.JPEG n04597913/
+mv val/ILSVRC2012_val_00020898.JPEG n03763968/
+mv val/ILSVRC2012_val_00020899.JPEG n03201208/
+mv val/ILSVRC2012_val_00020900.JPEG n02672831/
+mv val/ILSVRC2012_val_00020901.JPEG n02096437/
+mv val/ILSVRC2012_val_00020902.JPEG n02916936/
+mv val/ILSVRC2012_val_00020903.JPEG n04398044/
+mv val/ILSVRC2012_val_00020904.JPEG n03110669/
+mv val/ILSVRC2012_val_00020905.JPEG n01580077/
+mv val/ILSVRC2012_val_00020906.JPEG n03775546/
+mv val/ILSVRC2012_val_00020907.JPEG n01665541/
+mv val/ILSVRC2012_val_00020908.JPEG n03109150/
+mv val/ILSVRC2012_val_00020909.JPEG n01843383/
+mv val/ILSVRC2012_val_00020910.JPEG n01751748/
+mv val/ILSVRC2012_val_00020911.JPEG n04487394/
+mv val/ILSVRC2012_val_00020912.JPEG n02804414/
+mv val/ILSVRC2012_val_00020913.JPEG n04200800/
+mv val/ILSVRC2012_val_00020914.JPEG n03661043/
+mv val/ILSVRC2012_val_00020915.JPEG n01806143/
+mv val/ILSVRC2012_val_00020916.JPEG n01641577/
+mv val/ILSVRC2012_val_00020917.JPEG n02325366/
+mv val/ILSVRC2012_val_00020918.JPEG n03976467/
+mv val/ILSVRC2012_val_00020919.JPEG n02917067/
+mv val/ILSVRC2012_val_00020920.JPEG n01819313/
+mv val/ILSVRC2012_val_00020921.JPEG n04465501/
+mv val/ILSVRC2012_val_00020922.JPEG n01955084/
+mv val/ILSVRC2012_val_00020923.JPEG n03063599/
+mv val/ILSVRC2012_val_00020924.JPEG n04099969/
+mv val/ILSVRC2012_val_00020925.JPEG n02793495/
+mv val/ILSVRC2012_val_00020926.JPEG n02086079/
+mv val/ILSVRC2012_val_00020927.JPEG n02859443/
+mv val/ILSVRC2012_val_00020928.JPEG n03690938/
+mv val/ILSVRC2012_val_00020929.JPEG n13052670/
+mv val/ILSVRC2012_val_00020930.JPEG n02088238/
+mv val/ILSVRC2012_val_00020931.JPEG n02699494/
+mv val/ILSVRC2012_val_00020932.JPEG n03721384/
+mv val/ILSVRC2012_val_00020933.JPEG n02006656/
+mv val/ILSVRC2012_val_00020934.JPEG n02415577/
+mv val/ILSVRC2012_val_00020935.JPEG n02981792/
+mv val/ILSVRC2012_val_00020936.JPEG n02492035/
+mv val/ILSVRC2012_val_00020937.JPEG n03379051/
+mv val/ILSVRC2012_val_00020938.JPEG n02280649/
+mv val/ILSVRC2012_val_00020939.JPEG n03095699/
+mv val/ILSVRC2012_val_00020940.JPEG n03720891/
+mv val/ILSVRC2012_val_00020941.JPEG n03459775/
+mv val/ILSVRC2012_val_00020942.JPEG n02422106/
+mv val/ILSVRC2012_val_00020943.JPEG n01644373/
+mv val/ILSVRC2012_val_00020944.JPEG n03347037/
+mv val/ILSVRC2012_val_00020945.JPEG n02834397/
+mv val/ILSVRC2012_val_00020946.JPEG n03218198/
+mv val/ILSVRC2012_val_00020947.JPEG n03627232/
+mv val/ILSVRC2012_val_00020948.JPEG n04557648/
+mv val/ILSVRC2012_val_00020949.JPEG n02423022/
+mv val/ILSVRC2012_val_00020950.JPEG n01784675/
+mv val/ILSVRC2012_val_00020951.JPEG n03425413/
+mv val/ILSVRC2012_val_00020952.JPEG n04579432/
+mv val/ILSVRC2012_val_00020953.JPEG n07875152/
+mv val/ILSVRC2012_val_00020954.JPEG n03461385/
+mv val/ILSVRC2012_val_00020955.JPEG n03404251/
+mv val/ILSVRC2012_val_00020956.JPEG n03658185/
+mv val/ILSVRC2012_val_00020957.JPEG n07720875/
+mv val/ILSVRC2012_val_00020958.JPEG n01943899/
+mv val/ILSVRC2012_val_00020959.JPEG n12620546/
+mv val/ILSVRC2012_val_00020960.JPEG n03967562/
+mv val/ILSVRC2012_val_00020961.JPEG n02102480/
+mv val/ILSVRC2012_val_00020962.JPEG n02500267/
+mv val/ILSVRC2012_val_00020963.JPEG n02087046/
+mv val/ILSVRC2012_val_00020964.JPEG n03595614/
+mv val/ILSVRC2012_val_00020965.JPEG n02100236/
+mv val/ILSVRC2012_val_00020966.JPEG n07892512/
+mv val/ILSVRC2012_val_00020967.JPEG n04505470/
+mv val/ILSVRC2012_val_00020968.JPEG n01986214/
+mv val/ILSVRC2012_val_00020969.JPEG n02447366/
+mv val/ILSVRC2012_val_00020970.JPEG n01978455/
+mv val/ILSVRC2012_val_00020971.JPEG n03942813/
+mv val/ILSVRC2012_val_00020972.JPEG n02917067/
+mv val/ILSVRC2012_val_00020973.JPEG n02125311/
+mv val/ILSVRC2012_val_00020974.JPEG n04275548/
+mv val/ILSVRC2012_val_00020975.JPEG n02077923/
+mv val/ILSVRC2012_val_00020976.JPEG n01829413/
+mv val/ILSVRC2012_val_00020977.JPEG n04557648/
+mv val/ILSVRC2012_val_00020978.JPEG n02483362/
+mv val/ILSVRC2012_val_00020979.JPEG n03250847/
+mv val/ILSVRC2012_val_00020980.JPEG n02454379/
+mv val/ILSVRC2012_val_00020981.JPEG n02793495/
+mv val/ILSVRC2012_val_00020982.JPEG n03891251/
+mv val/ILSVRC2012_val_00020983.JPEG n03938244/
+mv val/ILSVRC2012_val_00020984.JPEG n03467068/
+mv val/ILSVRC2012_val_00020985.JPEG n02226429/
+mv val/ILSVRC2012_val_00020986.JPEG n02106166/
+mv val/ILSVRC2012_val_00020987.JPEG n04465501/
+mv val/ILSVRC2012_val_00020988.JPEG n04423845/
+mv val/ILSVRC2012_val_00020989.JPEG n02108422/
+mv val/ILSVRC2012_val_00020990.JPEG n02776631/
+mv val/ILSVRC2012_val_00020991.JPEG n01773797/
+mv val/ILSVRC2012_val_00020992.JPEG n03250847/
+mv val/ILSVRC2012_val_00020993.JPEG n04606251/
+mv val/ILSVRC2012_val_00020994.JPEG n01664065/
+mv val/ILSVRC2012_val_00020995.JPEG n04127249/
+mv val/ILSVRC2012_val_00020996.JPEG n04254777/
+mv val/ILSVRC2012_val_00020997.JPEG n02483362/
+mv val/ILSVRC2012_val_00020998.JPEG n03041632/
+mv val/ILSVRC2012_val_00020999.JPEG n01729322/
+mv val/ILSVRC2012_val_00021000.JPEG n02093859/
+mv val/ILSVRC2012_val_00021001.JPEG n02977058/
+mv val/ILSVRC2012_val_00021002.JPEG n04252225/
+mv val/ILSVRC2012_val_00021003.JPEG n02116738/
+mv val/ILSVRC2012_val_00021004.JPEG n02950826/
+mv val/ILSVRC2012_val_00021005.JPEG n03494278/
+mv val/ILSVRC2012_val_00021006.JPEG n02130308/
+mv val/ILSVRC2012_val_00021007.JPEG n03786901/
+mv val/ILSVRC2012_val_00021008.JPEG n04462240/
+mv val/ILSVRC2012_val_00021009.JPEG n03617480/
+mv val/ILSVRC2012_val_00021010.JPEG n04418357/
+mv val/ILSVRC2012_val_00021011.JPEG n02879718/
+mv val/ILSVRC2012_val_00021012.JPEG n03018349/
+mv val/ILSVRC2012_val_00021013.JPEG n03272010/
+mv val/ILSVRC2012_val_00021014.JPEG n03379051/
+mv val/ILSVRC2012_val_00021015.JPEG n01614925/
+mv val/ILSVRC2012_val_00021016.JPEG n02102040/
+mv val/ILSVRC2012_val_00021017.JPEG n01630670/
+mv val/ILSVRC2012_val_00021018.JPEG n03627232/
+mv val/ILSVRC2012_val_00021019.JPEG n13037406/
+mv val/ILSVRC2012_val_00021020.JPEG n09288635/
+mv val/ILSVRC2012_val_00021021.JPEG n07584110/
+mv val/ILSVRC2012_val_00021022.JPEG n02102177/
+mv val/ILSVRC2012_val_00021023.JPEG n03347037/
+mv val/ILSVRC2012_val_00021024.JPEG n01632458/
+mv val/ILSVRC2012_val_00021025.JPEG n01768244/
+mv val/ILSVRC2012_val_00021026.JPEG n03584254/
+mv val/ILSVRC2012_val_00021027.JPEG n04346328/
+mv val/ILSVRC2012_val_00021028.JPEG n03599486/
+mv val/ILSVRC2012_val_00021029.JPEG n03109150/
+mv val/ILSVRC2012_val_00021030.JPEG n03692522/
+mv val/ILSVRC2012_val_00021031.JPEG n15075141/
+mv val/ILSVRC2012_val_00021032.JPEG n01742172/
+mv val/ILSVRC2012_val_00021033.JPEG n02841315/
+mv val/ILSVRC2012_val_00021034.JPEG n13040303/
+mv val/ILSVRC2012_val_00021035.JPEG n02117135/
+mv val/ILSVRC2012_val_00021036.JPEG n02107142/
+mv val/ILSVRC2012_val_00021037.JPEG n04266014/
+mv val/ILSVRC2012_val_00021038.JPEG n03724870/
+mv val/ILSVRC2012_val_00021039.JPEG n07248320/
+mv val/ILSVRC2012_val_00021040.JPEG n02704792/
+mv val/ILSVRC2012_val_00021041.JPEG n03871628/
+mv val/ILSVRC2012_val_00021042.JPEG n01990800/
+mv val/ILSVRC2012_val_00021043.JPEG n02129604/
+mv val/ILSVRC2012_val_00021044.JPEG n02119789/
+mv val/ILSVRC2012_val_00021045.JPEG n02125311/
+mv val/ILSVRC2012_val_00021046.JPEG n04606251/
+mv val/ILSVRC2012_val_00021047.JPEG n07768694/
+mv val/ILSVRC2012_val_00021048.JPEG n03187595/
+mv val/ILSVRC2012_val_00021049.JPEG n04376876/
+mv val/ILSVRC2012_val_00021050.JPEG n04483307/
+mv val/ILSVRC2012_val_00021051.JPEG n02110063/
+mv val/ILSVRC2012_val_00021052.JPEG n02107142/
+mv val/ILSVRC2012_val_00021053.JPEG n02782093/
+mv val/ILSVRC2012_val_00021054.JPEG n04487081/
+mv val/ILSVRC2012_val_00021055.JPEG n01675722/
+mv val/ILSVRC2012_val_00021056.JPEG n01608432/
+mv val/ILSVRC2012_val_00021057.JPEG n03297495/
+mv val/ILSVRC2012_val_00021058.JPEG n02098105/
+mv val/ILSVRC2012_val_00021059.JPEG n01950731/
+mv val/ILSVRC2012_val_00021060.JPEG n04238763/
+mv val/ILSVRC2012_val_00021061.JPEG n02105855/
+mv val/ILSVRC2012_val_00021062.JPEG n04552348/
+mv val/ILSVRC2012_val_00021063.JPEG n02051845/
+mv val/ILSVRC2012_val_00021064.JPEG n02128925/
+mv val/ILSVRC2012_val_00021065.JPEG n02877765/
+mv val/ILSVRC2012_val_00021066.JPEG n02128385/
+mv val/ILSVRC2012_val_00021067.JPEG n02877765/
+mv val/ILSVRC2012_val_00021068.JPEG n01872401/
+mv val/ILSVRC2012_val_00021069.JPEG n01682714/
+mv val/ILSVRC2012_val_00021070.JPEG n03481172/
+mv val/ILSVRC2012_val_00021071.JPEG n02509815/
+mv val/ILSVRC2012_val_00021072.JPEG n02236044/
+mv val/ILSVRC2012_val_00021073.JPEG n02280649/
+mv val/ILSVRC2012_val_00021074.JPEG n02488702/
+mv val/ILSVRC2012_val_00021075.JPEG n03492542/
+mv val/ILSVRC2012_val_00021076.JPEG n01749939/
+mv val/ILSVRC2012_val_00021077.JPEG n03207743/
+mv val/ILSVRC2012_val_00021078.JPEG n03179701/
+mv val/ILSVRC2012_val_00021079.JPEG n02100877/
+mv val/ILSVRC2012_val_00021080.JPEG n01981276/
+mv val/ILSVRC2012_val_00021081.JPEG n03710637/
+mv val/ILSVRC2012_val_00021082.JPEG n03223299/
+mv val/ILSVRC2012_val_00021083.JPEG n01630670/
+mv val/ILSVRC2012_val_00021084.JPEG n03877472/
+mv val/ILSVRC2012_val_00021085.JPEG n01560419/
+mv val/ILSVRC2012_val_00021086.JPEG n02259212/
+mv val/ILSVRC2012_val_00021087.JPEG n04127249/
+mv val/ILSVRC2012_val_00021088.JPEG n03796401/
+mv val/ILSVRC2012_val_00021089.JPEG n04486054/
+mv val/ILSVRC2012_val_00021090.JPEG n01807496/
+mv val/ILSVRC2012_val_00021091.JPEG n03492542/
+mv val/ILSVRC2012_val_00021092.JPEG n01694178/
+mv val/ILSVRC2012_val_00021093.JPEG n01740131/
+mv val/ILSVRC2012_val_00021094.JPEG n01985128/
+mv val/ILSVRC2012_val_00021095.JPEG n03637318/
+mv val/ILSVRC2012_val_00021096.JPEG n03584254/
+mv val/ILSVRC2012_val_00021097.JPEG n07717556/
+mv val/ILSVRC2012_val_00021098.JPEG n07753592/
+mv val/ILSVRC2012_val_00021099.JPEG n02791124/
+mv val/ILSVRC2012_val_00021100.JPEG n03786901/
+mv val/ILSVRC2012_val_00021101.JPEG n02965783/
+mv val/ILSVRC2012_val_00021102.JPEG n03733131/
+mv val/ILSVRC2012_val_00021103.JPEG n04458633/
+mv val/ILSVRC2012_val_00021104.JPEG n01614925/
+mv val/ILSVRC2012_val_00021105.JPEG n04435653/
+mv val/ILSVRC2012_val_00021106.JPEG n03534580/
+mv val/ILSVRC2012_val_00021107.JPEG n04532106/
+mv val/ILSVRC2012_val_00021108.JPEG n02276258/
+mv val/ILSVRC2012_val_00021109.JPEG n01697457/
+mv val/ILSVRC2012_val_00021110.JPEG n03187595/
+mv val/ILSVRC2012_val_00021111.JPEG n04590129/
+mv val/ILSVRC2012_val_00021112.JPEG n04004767/
+mv val/ILSVRC2012_val_00021113.JPEG n03877472/
+mv val/ILSVRC2012_val_00021114.JPEG n07248320/
+mv val/ILSVRC2012_val_00021115.JPEG n03207743/
+mv val/ILSVRC2012_val_00021116.JPEG n02892767/
+mv val/ILSVRC2012_val_00021117.JPEG n03976467/
+mv val/ILSVRC2012_val_00021118.JPEG n03133878/
+mv val/ILSVRC2012_val_00021119.JPEG n03594734/
+mv val/ILSVRC2012_val_00021120.JPEG n01877812/
+mv val/ILSVRC2012_val_00021121.JPEG n03785016/
+mv val/ILSVRC2012_val_00021122.JPEG n04613696/
+mv val/ILSVRC2012_val_00021123.JPEG n03534580/
+mv val/ILSVRC2012_val_00021124.JPEG n02013706/
+mv val/ILSVRC2012_val_00021125.JPEG n01985128/
+mv val/ILSVRC2012_val_00021126.JPEG n02110806/
+mv val/ILSVRC2012_val_00021127.JPEG n02441942/
+mv val/ILSVRC2012_val_00021128.JPEG n04554684/
+mv val/ILSVRC2012_val_00021129.JPEG n03916031/
+mv val/ILSVRC2012_val_00021130.JPEG n01748264/
+mv val/ILSVRC2012_val_00021131.JPEG n04204347/
+mv val/ILSVRC2012_val_00021132.JPEG n03450230/
+mv val/ILSVRC2012_val_00021133.JPEG n01622779/
+mv val/ILSVRC2012_val_00021134.JPEG n02799071/
+mv val/ILSVRC2012_val_00021135.JPEG n02017213/
+mv val/ILSVRC2012_val_00021136.JPEG n03201208/
+mv val/ILSVRC2012_val_00021137.JPEG n02487347/
+mv val/ILSVRC2012_val_00021138.JPEG n02497673/
+mv val/ILSVRC2012_val_00021139.JPEG n01795545/
+mv val/ILSVRC2012_val_00021140.JPEG n02487347/
+mv val/ILSVRC2012_val_00021141.JPEG n04487081/
+mv val/ILSVRC2012_val_00021142.JPEG n03710637/
+mv val/ILSVRC2012_val_00021143.JPEG n04026417/
+mv val/ILSVRC2012_val_00021144.JPEG n07747607/
+mv val/ILSVRC2012_val_00021145.JPEG n02092002/
+mv val/ILSVRC2012_val_00021146.JPEG n02701002/
+mv val/ILSVRC2012_val_00021147.JPEG n02492660/
+mv val/ILSVRC2012_val_00021148.JPEG n03995372/
+mv val/ILSVRC2012_val_00021149.JPEG n02415577/
+mv val/ILSVRC2012_val_00021150.JPEG n02091831/
+mv val/ILSVRC2012_val_00021151.JPEG n02423022/
+mv val/ILSVRC2012_val_00021152.JPEG n02165456/
+mv val/ILSVRC2012_val_00021153.JPEG n03666591/
+mv val/ILSVRC2012_val_00021154.JPEG n04604644/
+mv val/ILSVRC2012_val_00021155.JPEG n02107142/
+mv val/ILSVRC2012_val_00021156.JPEG n02951358/
+mv val/ILSVRC2012_val_00021157.JPEG n02219486/
+mv val/ILSVRC2012_val_00021158.JPEG n04542943/
+mv val/ILSVRC2012_val_00021159.JPEG n03777568/
+mv val/ILSVRC2012_val_00021160.JPEG n03787032/
+mv val/ILSVRC2012_val_00021161.JPEG n04332243/
+mv val/ILSVRC2012_val_00021162.JPEG n02927161/
+mv val/ILSVRC2012_val_00021163.JPEG n09288635/
+mv val/ILSVRC2012_val_00021164.JPEG n01704323/
+mv val/ILSVRC2012_val_00021165.JPEG n02091244/
+mv val/ILSVRC2012_val_00021166.JPEG n02894605/
+mv val/ILSVRC2012_val_00021167.JPEG n04554684/
+mv val/ILSVRC2012_val_00021168.JPEG n02085936/
+mv val/ILSVRC2012_val_00021169.JPEG n03014705/
+mv val/ILSVRC2012_val_00021170.JPEG n01871265/
+mv val/ILSVRC2012_val_00021171.JPEG n02113799/
+mv val/ILSVRC2012_val_00021172.JPEG n02107683/
+mv val/ILSVRC2012_val_00021173.JPEG n03347037/
+mv val/ILSVRC2012_val_00021174.JPEG n04296562/
+mv val/ILSVRC2012_val_00021175.JPEG n09256479/
+mv val/ILSVRC2012_val_00021176.JPEG n02110341/
+mv val/ILSVRC2012_val_00021177.JPEG n06874185/
+mv val/ILSVRC2012_val_00021178.JPEG n03967562/
+mv val/ILSVRC2012_val_00021179.JPEG n02708093/
+mv val/ILSVRC2012_val_00021180.JPEG n04344873/
+mv val/ILSVRC2012_val_00021181.JPEG n02437616/
+mv val/ILSVRC2012_val_00021182.JPEG n04523525/
+mv val/ILSVRC2012_val_00021183.JPEG n02099712/
+mv val/ILSVRC2012_val_00021184.JPEG n04404412/
+mv val/ILSVRC2012_val_00021185.JPEG n04277352/
+mv val/ILSVRC2012_val_00021186.JPEG n02948072/
+mv val/ILSVRC2012_val_00021187.JPEG n04111531/
+mv val/ILSVRC2012_val_00021188.JPEG n03452741/
+mv val/ILSVRC2012_val_00021189.JPEG n02966193/
+mv val/ILSVRC2012_val_00021190.JPEG n03452741/
+mv val/ILSVRC2012_val_00021191.JPEG n02100735/
+mv val/ILSVRC2012_val_00021192.JPEG n04597913/
+mv val/ILSVRC2012_val_00021193.JPEG n07747607/
+mv val/ILSVRC2012_val_00021194.JPEG n03764736/
+mv val/ILSVRC2012_val_00021195.JPEG n02123159/
+mv val/ILSVRC2012_val_00021196.JPEG n02107574/
+mv val/ILSVRC2012_val_00021197.JPEG n01729977/
+mv val/ILSVRC2012_val_00021198.JPEG n03976467/
+mv val/ILSVRC2012_val_00021199.JPEG n03788195/
+mv val/ILSVRC2012_val_00021200.JPEG n07717556/
+mv val/ILSVRC2012_val_00021201.JPEG n15075141/
+mv val/ILSVRC2012_val_00021202.JPEG n04596742/
+mv val/ILSVRC2012_val_00021203.JPEG n01729977/
+mv val/ILSVRC2012_val_00021204.JPEG n03042490/
+mv val/ILSVRC2012_val_00021205.JPEG n02102040/
+mv val/ILSVRC2012_val_00021206.JPEG n02093991/
+mv val/ILSVRC2012_val_00021207.JPEG n12144580/
+mv val/ILSVRC2012_val_00021208.JPEG n02107908/
+mv val/ILSVRC2012_val_00021209.JPEG n04612504/
+mv val/ILSVRC2012_val_00021210.JPEG n02981792/
+mv val/ILSVRC2012_val_00021211.JPEG n01644900/
+mv val/ILSVRC2012_val_00021212.JPEG n02128385/
+mv val/ILSVRC2012_val_00021213.JPEG n02128925/
+mv val/ILSVRC2012_val_00021214.JPEG n02110806/
+mv val/ILSVRC2012_val_00021215.JPEG n01748264/
+mv val/ILSVRC2012_val_00021216.JPEG n02777292/
+mv val/ILSVRC2012_val_00021217.JPEG n04209239/
+mv val/ILSVRC2012_val_00021218.JPEG n02112350/
+mv val/ILSVRC2012_val_00021219.JPEG n02361337/
+mv val/ILSVRC2012_val_00021220.JPEG n04141327/
+mv val/ILSVRC2012_val_00021221.JPEG n02229544/
+mv val/ILSVRC2012_val_00021222.JPEG n02281406/
+mv val/ILSVRC2012_val_00021223.JPEG n03895866/
+mv val/ILSVRC2012_val_00021224.JPEG n02108915/
+mv val/ILSVRC2012_val_00021225.JPEG n12768682/
+mv val/ILSVRC2012_val_00021226.JPEG n02106030/
+mv val/ILSVRC2012_val_00021227.JPEG n03218198/
+mv val/ILSVRC2012_val_00021228.JPEG n04133789/
+mv val/ILSVRC2012_val_00021229.JPEG n02093428/
+mv val/ILSVRC2012_val_00021230.JPEG n03461385/
+mv val/ILSVRC2012_val_00021231.JPEG n02119789/
+mv val/ILSVRC2012_val_00021232.JPEG n03444034/
+mv val/ILSVRC2012_val_00021233.JPEG n02877765/
+mv val/ILSVRC2012_val_00021234.JPEG n03724870/
+mv val/ILSVRC2012_val_00021235.JPEG n03773504/
+mv val/ILSVRC2012_val_00021236.JPEG n01698640/
+mv val/ILSVRC2012_val_00021237.JPEG n02504013/
+mv val/ILSVRC2012_val_00021238.JPEG n02231487/
+mv val/ILSVRC2012_val_00021239.JPEG n01558993/
+mv val/ILSVRC2012_val_00021240.JPEG n06785654/
+mv val/ILSVRC2012_val_00021241.JPEG n01981276/
+mv val/ILSVRC2012_val_00021242.JPEG n02389026/
+mv val/ILSVRC2012_val_00021243.JPEG n04277352/
+mv val/ILSVRC2012_val_00021244.JPEG n02687172/
+mv val/ILSVRC2012_val_00021245.JPEG n03291819/
+mv val/ILSVRC2012_val_00021246.JPEG n04447861/
+mv val/ILSVRC2012_val_00021247.JPEG n04310018/
+mv val/ILSVRC2012_val_00021248.JPEG n02486410/
+mv val/ILSVRC2012_val_00021249.JPEG n02105855/
+mv val/ILSVRC2012_val_00021250.JPEG n02948072/
+mv val/ILSVRC2012_val_00021251.JPEG n03785016/
+mv val/ILSVRC2012_val_00021252.JPEG n02002724/
+mv val/ILSVRC2012_val_00021253.JPEG n03417042/
+mv val/ILSVRC2012_val_00021254.JPEG n03188531/
+mv val/ILSVRC2012_val_00021255.JPEG n02259212/
+mv val/ILSVRC2012_val_00021256.JPEG n02776631/
+mv val/ILSVRC2012_val_00021257.JPEG n02951585/
+mv val/ILSVRC2012_val_00021258.JPEG n03337140/
+mv val/ILSVRC2012_val_00021259.JPEG n01751748/
+mv val/ILSVRC2012_val_00021260.JPEG n02879718/
+mv val/ILSVRC2012_val_00021261.JPEG n04277352/
+mv val/ILSVRC2012_val_00021262.JPEG n12057211/
+mv val/ILSVRC2012_val_00021263.JPEG n02951585/
+mv val/ILSVRC2012_val_00021264.JPEG n03967562/
+mv val/ILSVRC2012_val_00021265.JPEG n07714571/
+mv val/ILSVRC2012_val_00021266.JPEG n02085620/
+mv val/ILSVRC2012_val_00021267.JPEG n02510455/
+mv val/ILSVRC2012_val_00021268.JPEG n02869837/
+mv val/ILSVRC2012_val_00021269.JPEG n01980166/
+mv val/ILSVRC2012_val_00021270.JPEG n01756291/
+mv val/ILSVRC2012_val_00021271.JPEG n03792972/
+mv val/ILSVRC2012_val_00021272.JPEG n02112137/
+mv val/ILSVRC2012_val_00021273.JPEG n03680355/
+mv val/ILSVRC2012_val_00021274.JPEG n03841143/
+mv val/ILSVRC2012_val_00021275.JPEG n07565083/
+mv val/ILSVRC2012_val_00021276.JPEG n07693725/
+mv val/ILSVRC2012_val_00021277.JPEG n07715103/
+mv val/ILSVRC2012_val_00021278.JPEG n01820546/
+mv val/ILSVRC2012_val_00021279.JPEG n01873310/
+mv val/ILSVRC2012_val_00021280.JPEG n03777568/
+mv val/ILSVRC2012_val_00021281.JPEG n01833805/
+mv val/ILSVRC2012_val_00021282.JPEG n02676566/
+mv val/ILSVRC2012_val_00021283.JPEG n03447721/
+mv val/ILSVRC2012_val_00021284.JPEG n02500267/
+mv val/ILSVRC2012_val_00021285.JPEG n03602883/
+mv val/ILSVRC2012_val_00021286.JPEG n04239074/
+mv val/ILSVRC2012_val_00021287.JPEG n04118538/
+mv val/ILSVRC2012_val_00021288.JPEG n04536866/
+mv val/ILSVRC2012_val_00021289.JPEG n04548362/
+mv val/ILSVRC2012_val_00021290.JPEG n02776631/
+mv val/ILSVRC2012_val_00021291.JPEG n01667778/
+mv val/ILSVRC2012_val_00021292.JPEG n03825788/
+mv val/ILSVRC2012_val_00021293.JPEG n03891332/
+mv val/ILSVRC2012_val_00021294.JPEG n04258138/
+mv val/ILSVRC2012_val_00021295.JPEG n04542943/
+mv val/ILSVRC2012_val_00021296.JPEG n02099849/
+mv val/ILSVRC2012_val_00021297.JPEG n03041632/
+mv val/ILSVRC2012_val_00021298.JPEG n04179913/
+mv val/ILSVRC2012_val_00021299.JPEG n01632458/
+mv val/ILSVRC2012_val_00021300.JPEG n01537544/
+mv val/ILSVRC2012_val_00021301.JPEG n02930766/
+mv val/ILSVRC2012_val_00021302.JPEG n03814639/
+mv val/ILSVRC2012_val_00021303.JPEG n02643566/
+mv val/ILSVRC2012_val_00021304.JPEG n03498962/
+mv val/ILSVRC2012_val_00021305.JPEG n01798484/
+mv val/ILSVRC2012_val_00021306.JPEG n02692877/
+mv val/ILSVRC2012_val_00021307.JPEG n03134739/
+mv val/ILSVRC2012_val_00021308.JPEG n03314780/
+mv val/ILSVRC2012_val_00021309.JPEG n02870880/
+mv val/ILSVRC2012_val_00021310.JPEG n07768694/
+mv val/ILSVRC2012_val_00021311.JPEG n04141076/
+mv val/ILSVRC2012_val_00021312.JPEG n03786901/
+mv val/ILSVRC2012_val_00021313.JPEG n03314780/
+mv val/ILSVRC2012_val_00021314.JPEG n02172182/
+mv val/ILSVRC2012_val_00021315.JPEG n02092339/
+mv val/ILSVRC2012_val_00021316.JPEG n03259280/
+mv val/ILSVRC2012_val_00021317.JPEG n07880968/
+mv val/ILSVRC2012_val_00021318.JPEG n02115641/
+mv val/ILSVRC2012_val_00021319.JPEG n01990800/
+mv val/ILSVRC2012_val_00021320.JPEG n12768682/
+mv val/ILSVRC2012_val_00021321.JPEG n07930864/
+mv val/ILSVRC2012_val_00021322.JPEG n03527444/
+mv val/ILSVRC2012_val_00021323.JPEG n02091244/
+mv val/ILSVRC2012_val_00021324.JPEG n03769881/
+mv val/ILSVRC2012_val_00021325.JPEG n01494475/
+mv val/ILSVRC2012_val_00021326.JPEG n03249569/
+mv val/ILSVRC2012_val_00021327.JPEG n02395406/
+mv val/ILSVRC2012_val_00021328.JPEG n03776460/
+mv val/ILSVRC2012_val_00021329.JPEG n12985857/
+mv val/ILSVRC2012_val_00021330.JPEG n02056570/
+mv val/ILSVRC2012_val_00021331.JPEG n02486410/
+mv val/ILSVRC2012_val_00021332.JPEG n01737021/
+mv val/ILSVRC2012_val_00021333.JPEG n02488702/
+mv val/ILSVRC2012_val_00021334.JPEG n01978455/
+mv val/ILSVRC2012_val_00021335.JPEG n01622779/
+mv val/ILSVRC2012_val_00021336.JPEG n02510455/
+mv val/ILSVRC2012_val_00021337.JPEG n01776313/
+mv val/ILSVRC2012_val_00021338.JPEG n07831146/
+mv val/ILSVRC2012_val_00021339.JPEG n02018207/
+mv val/ILSVRC2012_val_00021340.JPEG n02808304/
+mv val/ILSVRC2012_val_00021341.JPEG n01855032/
+mv val/ILSVRC2012_val_00021342.JPEG n03803284/
+mv val/ILSVRC2012_val_00021343.JPEG n02514041/
+mv val/ILSVRC2012_val_00021344.JPEG n02099849/
+mv val/ILSVRC2012_val_00021345.JPEG n01806143/
+mv val/ILSVRC2012_val_00021346.JPEG n03837869/
+mv val/ILSVRC2012_val_00021347.JPEG n03902125/
+mv val/ILSVRC2012_val_00021348.JPEG n02895154/
+mv val/ILSVRC2012_val_00021349.JPEG n04208210/
+mv val/ILSVRC2012_val_00021350.JPEG n02107142/
+mv val/ILSVRC2012_val_00021351.JPEG n01855672/
+mv val/ILSVRC2012_val_00021352.JPEG n02480495/
+mv val/ILSVRC2012_val_00021353.JPEG n04065272/
+mv val/ILSVRC2012_val_00021354.JPEG n03761084/
+mv val/ILSVRC2012_val_00021355.JPEG n02100236/
+mv val/ILSVRC2012_val_00021356.JPEG n02111277/
+mv val/ILSVRC2012_val_00021357.JPEG n02089867/
+mv val/ILSVRC2012_val_00021358.JPEG n04552348/
+mv val/ILSVRC2012_val_00021359.JPEG n02791124/
+mv val/ILSVRC2012_val_00021360.JPEG n02101556/
+mv val/ILSVRC2012_val_00021361.JPEG n02480855/
+mv val/ILSVRC2012_val_00021362.JPEG n02097658/
+mv val/ILSVRC2012_val_00021363.JPEG n03180011/
+mv val/ILSVRC2012_val_00021364.JPEG n03899768/
+mv val/ILSVRC2012_val_00021365.JPEG n02087394/
+mv val/ILSVRC2012_val_00021366.JPEG n02236044/
+mv val/ILSVRC2012_val_00021367.JPEG n02794156/
+mv val/ILSVRC2012_val_00021368.JPEG n04550184/
+mv val/ILSVRC2012_val_00021369.JPEG n02099849/
+mv val/ILSVRC2012_val_00021370.JPEG n02111129/
+mv val/ILSVRC2012_val_00021371.JPEG n03976657/
+mv val/ILSVRC2012_val_00021372.JPEG n01847000/
+mv val/ILSVRC2012_val_00021373.JPEG n04465501/
+mv val/ILSVRC2012_val_00021374.JPEG n03063599/
+mv val/ILSVRC2012_val_00021375.JPEG n03733131/
+mv val/ILSVRC2012_val_00021376.JPEG n09332890/
+mv val/ILSVRC2012_val_00021377.JPEG n02892767/
+mv val/ILSVRC2012_val_00021378.JPEG n01978455/
+mv val/ILSVRC2012_val_00021379.JPEG n02111129/
+mv val/ILSVRC2012_val_00021380.JPEG n03832673/
+mv val/ILSVRC2012_val_00021381.JPEG n04141327/
+mv val/ILSVRC2012_val_00021382.JPEG n02276258/
+mv val/ILSVRC2012_val_00021383.JPEG n03786901/
+mv val/ILSVRC2012_val_00021384.JPEG n02672831/
+mv val/ILSVRC2012_val_00021385.JPEG n01978455/
+mv val/ILSVRC2012_val_00021386.JPEG n02807133/
+mv val/ILSVRC2012_val_00021387.JPEG n03290653/
+mv val/ILSVRC2012_val_00021388.JPEG n03297495/
+mv val/ILSVRC2012_val_00021389.JPEG n02112350/
+mv val/ILSVRC2012_val_00021390.JPEG n02894605/
+mv val/ILSVRC2012_val_00021391.JPEG n03763968/
+mv val/ILSVRC2012_val_00021392.JPEG n02776631/
+mv val/ILSVRC2012_val_00021393.JPEG n04606251/
+mv val/ILSVRC2012_val_00021394.JPEG n03498962/
+mv val/ILSVRC2012_val_00021395.JPEG n04443257/
+mv val/ILSVRC2012_val_00021396.JPEG n04355933/
+mv val/ILSVRC2012_val_00021397.JPEG n02727426/
+mv val/ILSVRC2012_val_00021398.JPEG n12057211/
+mv val/ILSVRC2012_val_00021399.JPEG n04376876/
+mv val/ILSVRC2012_val_00021400.JPEG n02403003/
+mv val/ILSVRC2012_val_00021401.JPEG n03495258/
+mv val/ILSVRC2012_val_00021402.JPEG n04584207/
+mv val/ILSVRC2012_val_00021403.JPEG n04462240/
+mv val/ILSVRC2012_val_00021404.JPEG n01729322/
+mv val/ILSVRC2012_val_00021405.JPEG n03207941/
+mv val/ILSVRC2012_val_00021406.JPEG n02483708/
+mv val/ILSVRC2012_val_00021407.JPEG n10565667/
+mv val/ILSVRC2012_val_00021408.JPEG n03866082/
+mv val/ILSVRC2012_val_00021409.JPEG n04019541/
+mv val/ILSVRC2012_val_00021410.JPEG n04154565/
+mv val/ILSVRC2012_val_00021411.JPEG n13052670/
+mv val/ILSVRC2012_val_00021412.JPEG n02992211/
+mv val/ILSVRC2012_val_00021413.JPEG n03642806/
+mv val/ILSVRC2012_val_00021414.JPEG n03372029/
+mv val/ILSVRC2012_val_00021415.JPEG n03832673/
+mv val/ILSVRC2012_val_00021416.JPEG n03617480/
+mv val/ILSVRC2012_val_00021417.JPEG n01797886/
+mv val/ILSVRC2012_val_00021418.JPEG n04591157/
+mv val/ILSVRC2012_val_00021419.JPEG n04443257/
+mv val/ILSVRC2012_val_00021420.JPEG n03045698/
+mv val/ILSVRC2012_val_00021421.JPEG n03207941/
+mv val/ILSVRC2012_val_00021422.JPEG n04081281/
+mv val/ILSVRC2012_val_00021423.JPEG n02165105/
+mv val/ILSVRC2012_val_00021424.JPEG n02105412/
+mv val/ILSVRC2012_val_00021425.JPEG n02980441/
+mv val/ILSVRC2012_val_00021426.JPEG n02097658/
+mv val/ILSVRC2012_val_00021427.JPEG n02823750/
+mv val/ILSVRC2012_val_00021428.JPEG n02397096/
+mv val/ILSVRC2012_val_00021429.JPEG n03662601/
+mv val/ILSVRC2012_val_00021430.JPEG n01514859/
+mv val/ILSVRC2012_val_00021431.JPEG n03759954/
+mv val/ILSVRC2012_val_00021432.JPEG n02859443/
+mv val/ILSVRC2012_val_00021433.JPEG n02011460/
+mv val/ILSVRC2012_val_00021434.JPEG n03467068/
+mv val/ILSVRC2012_val_00021435.JPEG n04458633/
+mv val/ILSVRC2012_val_00021436.JPEG n02111277/
+mv val/ILSVRC2012_val_00021437.JPEG n01751748/
+mv val/ILSVRC2012_val_00021438.JPEG n03127747/
+mv val/ILSVRC2012_val_00021439.JPEG n03838899/
+mv val/ILSVRC2012_val_00021440.JPEG n07715103/
+mv val/ILSVRC2012_val_00021441.JPEG n02894605/
+mv val/ILSVRC2012_val_00021442.JPEG n02793495/
+mv val/ILSVRC2012_val_00021443.JPEG n07248320/
+mv val/ILSVRC2012_val_00021444.JPEG n03995372/
+mv val/ILSVRC2012_val_00021445.JPEG n02094258/
+mv val/ILSVRC2012_val_00021446.JPEG n03937543/
+mv val/ILSVRC2012_val_00021447.JPEG n03642806/
+mv val/ILSVRC2012_val_00021448.JPEG n02607072/
+mv val/ILSVRC2012_val_00021449.JPEG n03483316/
+mv val/ILSVRC2012_val_00021450.JPEG n02090622/
+mv val/ILSVRC2012_val_00021451.JPEG n04525305/
+mv val/ILSVRC2012_val_00021452.JPEG n02085936/
+mv val/ILSVRC2012_val_00021453.JPEG n03920288/
+mv val/ILSVRC2012_val_00021454.JPEG n03063599/
+mv val/ILSVRC2012_val_00021455.JPEG n01843065/
+mv val/ILSVRC2012_val_00021456.JPEG n02099267/
+mv val/ILSVRC2012_val_00021457.JPEG n01739381/
+mv val/ILSVRC2012_val_00021458.JPEG n03793489/
+mv val/ILSVRC2012_val_00021459.JPEG n02018207/
+mv val/ILSVRC2012_val_00021460.JPEG n03775071/
+mv val/ILSVRC2012_val_00021461.JPEG n01496331/
+mv val/ILSVRC2012_val_00021462.JPEG n06785654/
+mv val/ILSVRC2012_val_00021463.JPEG n03935335/
+mv val/ILSVRC2012_val_00021464.JPEG n03887697/
+mv val/ILSVRC2012_val_00021465.JPEG n07747607/
+mv val/ILSVRC2012_val_00021466.JPEG n03773504/
+mv val/ILSVRC2012_val_00021467.JPEG n07860988/
+mv val/ILSVRC2012_val_00021468.JPEG n04456115/
+mv val/ILSVRC2012_val_00021469.JPEG n02492035/
+mv val/ILSVRC2012_val_00021470.JPEG n03874293/
+mv val/ILSVRC2012_val_00021471.JPEG n04275548/
+mv val/ILSVRC2012_val_00021472.JPEG n03063689/
+mv val/ILSVRC2012_val_00021473.JPEG n02101006/
+mv val/ILSVRC2012_val_00021474.JPEG n01807496/
+mv val/ILSVRC2012_val_00021475.JPEG n02113978/
+mv val/ILSVRC2012_val_00021476.JPEG n02655020/
+mv val/ILSVRC2012_val_00021477.JPEG n02488702/
+mv val/ILSVRC2012_val_00021478.JPEG n02174001/
+mv val/ILSVRC2012_val_00021479.JPEG n04004767/
+mv val/ILSVRC2012_val_00021480.JPEG n04579432/
+mv val/ILSVRC2012_val_00021481.JPEG n04141975/
+mv val/ILSVRC2012_val_00021482.JPEG n03584254/
+mv val/ILSVRC2012_val_00021483.JPEG n02112706/
+mv val/ILSVRC2012_val_00021484.JPEG n03127747/
+mv val/ILSVRC2012_val_00021485.JPEG n02097047/
+mv val/ILSVRC2012_val_00021486.JPEG n04458633/
+mv val/ILSVRC2012_val_00021487.JPEG n02814533/
+mv val/ILSVRC2012_val_00021488.JPEG n02510455/
+mv val/ILSVRC2012_val_00021489.JPEG n02106166/
+mv val/ILSVRC2012_val_00021490.JPEG n02492035/
+mv val/ILSVRC2012_val_00021491.JPEG n13054560/
+mv val/ILSVRC2012_val_00021492.JPEG n04090263/
+mv val/ILSVRC2012_val_00021493.JPEG n02110341/
+mv val/ILSVRC2012_val_00021494.JPEG n02965783/
+mv val/ILSVRC2012_val_00021495.JPEG n04235860/
+mv val/ILSVRC2012_val_00021496.JPEG n01735189/
+mv val/ILSVRC2012_val_00021497.JPEG n01698640/
+mv val/ILSVRC2012_val_00021498.JPEG n07697313/
+mv val/ILSVRC2012_val_00021499.JPEG n02276258/
+mv val/ILSVRC2012_val_00021500.JPEG n03868242/
+mv val/ILSVRC2012_val_00021501.JPEG n02321529/
+mv val/ILSVRC2012_val_00021502.JPEG n03042490/
+mv val/ILSVRC2012_val_00021503.JPEG n04418357/
+mv val/ILSVRC2012_val_00021504.JPEG n03814906/
+mv val/ILSVRC2012_val_00021505.JPEG n02607072/
+mv val/ILSVRC2012_val_00021506.JPEG n04517823/
+mv val/ILSVRC2012_val_00021507.JPEG n03496892/
+mv val/ILSVRC2012_val_00021508.JPEG n07717556/
+mv val/ILSVRC2012_val_00021509.JPEG n02051845/
+mv val/ILSVRC2012_val_00021510.JPEG n03291819/
+mv val/ILSVRC2012_val_00021511.JPEG n09399592/
+mv val/ILSVRC2012_val_00021512.JPEG n02791124/
+mv val/ILSVRC2012_val_00021513.JPEG n02259212/
+mv val/ILSVRC2012_val_00021514.JPEG n02233338/
+mv val/ILSVRC2012_val_00021515.JPEG n07802026/
+mv val/ILSVRC2012_val_00021516.JPEG n03047690/
+mv val/ILSVRC2012_val_00021517.JPEG n03995372/
+mv val/ILSVRC2012_val_00021518.JPEG n03530642/
+mv val/ILSVRC2012_val_00021519.JPEG n02966687/
+mv val/ILSVRC2012_val_00021520.JPEG n02492035/
+mv val/ILSVRC2012_val_00021521.JPEG n02229544/
+mv val/ILSVRC2012_val_00021522.JPEG n01689811/
+mv val/ILSVRC2012_val_00021523.JPEG n01532829/
+mv val/ILSVRC2012_val_00021524.JPEG n03733805/
+mv val/ILSVRC2012_val_00021525.JPEG n01776313/
+mv val/ILSVRC2012_val_00021526.JPEG n02112137/
+mv val/ILSVRC2012_val_00021527.JPEG n04200800/
+mv val/ILSVRC2012_val_00021528.JPEG n07747607/
+mv val/ILSVRC2012_val_00021529.JPEG n03016953/
+mv val/ILSVRC2012_val_00021530.JPEG n03729826/
+mv val/ILSVRC2012_val_00021531.JPEG n07734744/
+mv val/ILSVRC2012_val_00021532.JPEG n02088094/
+mv val/ILSVRC2012_val_00021533.JPEG n04542943/
+mv val/ILSVRC2012_val_00021534.JPEG n02667093/
+mv val/ILSVRC2012_val_00021535.JPEG n03400231/
+mv val/ILSVRC2012_val_00021536.JPEG n04355933/
+mv val/ILSVRC2012_val_00021537.JPEG n03544143/
+mv val/ILSVRC2012_val_00021538.JPEG n02128385/
+mv val/ILSVRC2012_val_00021539.JPEG n04356056/
+mv val/ILSVRC2012_val_00021540.JPEG n02112018/
+mv val/ILSVRC2012_val_00021541.JPEG n02859443/
+mv val/ILSVRC2012_val_00021542.JPEG n02128925/
+mv val/ILSVRC2012_val_00021543.JPEG n02091032/
+mv val/ILSVRC2012_val_00021544.JPEG n04004767/
+mv val/ILSVRC2012_val_00021545.JPEG n02096051/
+mv val/ILSVRC2012_val_00021546.JPEG n02113712/
+mv val/ILSVRC2012_val_00021547.JPEG n02927161/
+mv val/ILSVRC2012_val_00021548.JPEG n03476991/
+mv val/ILSVRC2012_val_00021549.JPEG n02423022/
+mv val/ILSVRC2012_val_00021550.JPEG n12144580/
+mv val/ILSVRC2012_val_00021551.JPEG n04548280/
+mv val/ILSVRC2012_val_00021552.JPEG n03724870/
+mv val/ILSVRC2012_val_00021553.JPEG n04335435/
+mv val/ILSVRC2012_val_00021554.JPEG n07583066/
+mv val/ILSVRC2012_val_00021555.JPEG n02871525/
+mv val/ILSVRC2012_val_00021556.JPEG n03272010/
+mv val/ILSVRC2012_val_00021557.JPEG n02484975/
+mv val/ILSVRC2012_val_00021558.JPEG n02786058/
+mv val/ILSVRC2012_val_00021559.JPEG n09472597/
+mv val/ILSVRC2012_val_00021560.JPEG n04209133/
+mv val/ILSVRC2012_val_00021561.JPEG n03717622/
+mv val/ILSVRC2012_val_00021562.JPEG n03598930/
+mv val/ILSVRC2012_val_00021563.JPEG n02417914/
+mv val/ILSVRC2012_val_00021564.JPEG n01824575/
+mv val/ILSVRC2012_val_00021565.JPEG n04204238/
+mv val/ILSVRC2012_val_00021566.JPEG n02999410/
+mv val/ILSVRC2012_val_00021567.JPEG n04467665/
+mv val/ILSVRC2012_val_00021568.JPEG n04239074/
+mv val/ILSVRC2012_val_00021569.JPEG n03444034/
+mv val/ILSVRC2012_val_00021570.JPEG n04263257/
+mv val/ILSVRC2012_val_00021571.JPEG n03903868/
+mv val/ILSVRC2012_val_00021572.JPEG n02492035/
+mv val/ILSVRC2012_val_00021573.JPEG n02110627/
+mv val/ILSVRC2012_val_00021574.JPEG n02007558/
+mv val/ILSVRC2012_val_00021575.JPEG n02090379/
+mv val/ILSVRC2012_val_00021576.JPEG n03995372/
+mv val/ILSVRC2012_val_00021577.JPEG n04325704/
+mv val/ILSVRC2012_val_00021578.JPEG n04277352/
+mv val/ILSVRC2012_val_00021579.JPEG n02494079/
+mv val/ILSVRC2012_val_00021580.JPEG n02321529/
+mv val/ILSVRC2012_val_00021581.JPEG n12144580/
+mv val/ILSVRC2012_val_00021582.JPEG n01687978/
+mv val/ILSVRC2012_val_00021583.JPEG n03095699/
+mv val/ILSVRC2012_val_00021584.JPEG n02074367/
+mv val/ILSVRC2012_val_00021585.JPEG n02128925/
+mv val/ILSVRC2012_val_00021586.JPEG n02363005/
+mv val/ILSVRC2012_val_00021587.JPEG n02346627/
+mv val/ILSVRC2012_val_00021588.JPEG n04579145/
+mv val/ILSVRC2012_val_00021589.JPEG n03133878/
+mv val/ILSVRC2012_val_00021590.JPEG n02776631/
+mv val/ILSVRC2012_val_00021591.JPEG n03787032/
+mv val/ILSVRC2012_val_00021592.JPEG n03127747/
+mv val/ILSVRC2012_val_00021593.JPEG n01749939/
+mv val/ILSVRC2012_val_00021594.JPEG n01860187/
+mv val/ILSVRC2012_val_00021595.JPEG n04317175/
+mv val/ILSVRC2012_val_00021596.JPEG n12768682/
+mv val/ILSVRC2012_val_00021597.JPEG n02219486/
+mv val/ILSVRC2012_val_00021598.JPEG n03630383/
+mv val/ILSVRC2012_val_00021599.JPEG n02097130/
+mv val/ILSVRC2012_val_00021600.JPEG n02859443/
+mv val/ILSVRC2012_val_00021601.JPEG n03529860/
+mv val/ILSVRC2012_val_00021602.JPEG n02229544/
+mv val/ILSVRC2012_val_00021603.JPEG n03272562/
+mv val/ILSVRC2012_val_00021604.JPEG n04116512/
+mv val/ILSVRC2012_val_00021605.JPEG n01685808/
+mv val/ILSVRC2012_val_00021606.JPEG n03902125/
+mv val/ILSVRC2012_val_00021607.JPEG n02174001/
+mv val/ILSVRC2012_val_00021608.JPEG n02112706/
+mv val/ILSVRC2012_val_00021609.JPEG n02840245/
+mv val/ILSVRC2012_val_00021610.JPEG n04141975/
+mv val/ILSVRC2012_val_00021611.JPEG n01641577/
+mv val/ILSVRC2012_val_00021612.JPEG n02326432/
+mv val/ILSVRC2012_val_00021613.JPEG n07749582/
+mv val/ILSVRC2012_val_00021614.JPEG n02797295/
+mv val/ILSVRC2012_val_00021615.JPEG n04596742/
+mv val/ILSVRC2012_val_00021616.JPEG n02974003/
+mv val/ILSVRC2012_val_00021617.JPEG n01729977/
+mv val/ILSVRC2012_val_00021618.JPEG n02504013/
+mv val/ILSVRC2012_val_00021619.JPEG n02843684/
+mv val/ILSVRC2012_val_00021620.JPEG n03825788/
+mv val/ILSVRC2012_val_00021621.JPEG n04517823/
+mv val/ILSVRC2012_val_00021622.JPEG n03216828/
+mv val/ILSVRC2012_val_00021623.JPEG n04346328/
+mv val/ILSVRC2012_val_00021624.JPEG n02408429/
+mv val/ILSVRC2012_val_00021625.JPEG n01797886/
+mv val/ILSVRC2012_val_00021626.JPEG n02493509/
+mv val/ILSVRC2012_val_00021627.JPEG n02799071/
+mv val/ILSVRC2012_val_00021628.JPEG n04204347/
+mv val/ILSVRC2012_val_00021629.JPEG n07716906/
+mv val/ILSVRC2012_val_00021630.JPEG n06874185/
+mv val/ILSVRC2012_val_00021631.JPEG n02093647/
+mv val/ILSVRC2012_val_00021632.JPEG n02111889/
+mv val/ILSVRC2012_val_00021633.JPEG n04254777/
+mv val/ILSVRC2012_val_00021634.JPEG n02966687/
+mv val/ILSVRC2012_val_00021635.JPEG n03938244/
+mv val/ILSVRC2012_val_00021636.JPEG n02321529/
+mv val/ILSVRC2012_val_00021637.JPEG n03089624/
+mv val/ILSVRC2012_val_00021638.JPEG n02096585/
+mv val/ILSVRC2012_val_00021639.JPEG n02877765/
+mv val/ILSVRC2012_val_00021640.JPEG n03259280/
+mv val/ILSVRC2012_val_00021641.JPEG n02895154/
+mv val/ILSVRC2012_val_00021642.JPEG n02107574/
+mv val/ILSVRC2012_val_00021643.JPEG n07615774/
+mv val/ILSVRC2012_val_00021644.JPEG n03131574/
+mv val/ILSVRC2012_val_00021645.JPEG n02497673/
+mv val/ILSVRC2012_val_00021646.JPEG n01688243/
+mv val/ILSVRC2012_val_00021647.JPEG n04273569/
+mv val/ILSVRC2012_val_00021648.JPEG n03873416/
+mv val/ILSVRC2012_val_00021649.JPEG n03763968/
+mv val/ILSVRC2012_val_00021650.JPEG n01534433/
+mv val/ILSVRC2012_val_00021651.JPEG n03187595/
+mv val/ILSVRC2012_val_00021652.JPEG n02786058/
+mv val/ILSVRC2012_val_00021653.JPEG n02165105/
+mv val/ILSVRC2012_val_00021654.JPEG n02099601/
+mv val/ILSVRC2012_val_00021655.JPEG n02782093/
+mv val/ILSVRC2012_val_00021656.JPEG n01601694/
+mv val/ILSVRC2012_val_00021657.JPEG n03459775/
+mv val/ILSVRC2012_val_00021658.JPEG n01770081/
+mv val/ILSVRC2012_val_00021659.JPEG n04019541/
+mv val/ILSVRC2012_val_00021660.JPEG n01742172/
+mv val/ILSVRC2012_val_00021661.JPEG n03452741/
+mv val/ILSVRC2012_val_00021662.JPEG n03891251/
+mv val/ILSVRC2012_val_00021663.JPEG n01818515/
+mv val/ILSVRC2012_val_00021664.JPEG n03825788/
+mv val/ILSVRC2012_val_00021665.JPEG n04141975/
+mv val/ILSVRC2012_val_00021666.JPEG n02087394/
+mv val/ILSVRC2012_val_00021667.JPEG n02325366/
+mv val/ILSVRC2012_val_00021668.JPEG n02092339/
+mv val/ILSVRC2012_val_00021669.JPEG n07584110/
+mv val/ILSVRC2012_val_00021670.JPEG n03649909/
+mv val/ILSVRC2012_val_00021671.JPEG n02113712/
+mv val/ILSVRC2012_val_00021672.JPEG n04579145/
+mv val/ILSVRC2012_val_00021673.JPEG n03908714/
+mv val/ILSVRC2012_val_00021674.JPEG n04392985/
+mv val/ILSVRC2012_val_00021675.JPEG n02124075/
+mv val/ILSVRC2012_val_00021676.JPEG n13040303/
+mv val/ILSVRC2012_val_00021677.JPEG n02051845/
+mv val/ILSVRC2012_val_00021678.JPEG n02231487/
+mv val/ILSVRC2012_val_00021679.JPEG n02493509/
+mv val/ILSVRC2012_val_00021680.JPEG n01748264/
+mv val/ILSVRC2012_val_00021681.JPEG n03457902/
+mv val/ILSVRC2012_val_00021682.JPEG n03146219/
+mv val/ILSVRC2012_val_00021683.JPEG n01675722/
+mv val/ILSVRC2012_val_00021684.JPEG n03787032/
+mv val/ILSVRC2012_val_00021685.JPEG n02361337/
+mv val/ILSVRC2012_val_00021686.JPEG n07579787/
+mv val/ILSVRC2012_val_00021687.JPEG n04479046/
+mv val/ILSVRC2012_val_00021688.JPEG n02168699/
+mv val/ILSVRC2012_val_00021689.JPEG n02992211/
+mv val/ILSVRC2012_val_00021690.JPEG n02113624/
+mv val/ILSVRC2012_val_00021691.JPEG n02974003/
+mv val/ILSVRC2012_val_00021692.JPEG n04357314/
+mv val/ILSVRC2012_val_00021693.JPEG n07920052/
+mv val/ILSVRC2012_val_00021694.JPEG n07615774/
+mv val/ILSVRC2012_val_00021695.JPEG n03452741/
+mv val/ILSVRC2012_val_00021696.JPEG n03534580/
+mv val/ILSVRC2012_val_00021697.JPEG n02094258/
+mv val/ILSVRC2012_val_00021698.JPEG n04505470/
+mv val/ILSVRC2012_val_00021699.JPEG n02641379/
+mv val/ILSVRC2012_val_00021700.JPEG n03868863/
+mv val/ILSVRC2012_val_00021701.JPEG n02422699/
+mv val/ILSVRC2012_val_00021702.JPEG n03249569/
+mv val/ILSVRC2012_val_00021703.JPEG n02123394/
+mv val/ILSVRC2012_val_00021704.JPEG n02106662/
+mv val/ILSVRC2012_val_00021705.JPEG n01784675/
+mv val/ILSVRC2012_val_00021706.JPEG n04371430/
+mv val/ILSVRC2012_val_00021707.JPEG n04557648/
+mv val/ILSVRC2012_val_00021708.JPEG n02514041/
+mv val/ILSVRC2012_val_00021709.JPEG n02051845/
+mv val/ILSVRC2012_val_00021710.JPEG n03916031/
+mv val/ILSVRC2012_val_00021711.JPEG n01751748/
+mv val/ILSVRC2012_val_00021712.JPEG n02504458/
+mv val/ILSVRC2012_val_00021713.JPEG n07734744/
+mv val/ILSVRC2012_val_00021714.JPEG n02494079/
+mv val/ILSVRC2012_val_00021715.JPEG n03902125/
+mv val/ILSVRC2012_val_00021716.JPEG n02930766/
+mv val/ILSVRC2012_val_00021717.JPEG n03977966/
+mv val/ILSVRC2012_val_00021718.JPEG n03724870/
+mv val/ILSVRC2012_val_00021719.JPEG n04116512/
+mv val/ILSVRC2012_val_00021720.JPEG n03272010/
+mv val/ILSVRC2012_val_00021721.JPEG n04049303/
+mv val/ILSVRC2012_val_00021722.JPEG n03590841/
+mv val/ILSVRC2012_val_00021723.JPEG n02361337/
+mv val/ILSVRC2012_val_00021724.JPEG n04044716/
+mv val/ILSVRC2012_val_00021725.JPEG n03680355/
+mv val/ILSVRC2012_val_00021726.JPEG n03637318/
+mv val/ILSVRC2012_val_00021727.JPEG n11939491/
+mv val/ILSVRC2012_val_00021728.JPEG n03866082/
+mv val/ILSVRC2012_val_00021729.JPEG n03272010/
+mv val/ILSVRC2012_val_00021730.JPEG n02119789/
+mv val/ILSVRC2012_val_00021731.JPEG n07615774/
+mv val/ILSVRC2012_val_00021732.JPEG n03602883/
+mv val/ILSVRC2012_val_00021733.JPEG n03492542/
+mv val/ILSVRC2012_val_00021734.JPEG n04310018/
+mv val/ILSVRC2012_val_00021735.JPEG n02231487/
+mv val/ILSVRC2012_val_00021736.JPEG n02110185/
+mv val/ILSVRC2012_val_00021737.JPEG n03544143/
+mv val/ILSVRC2012_val_00021738.JPEG n03995372/
+mv val/ILSVRC2012_val_00021739.JPEG n02268443/
+mv val/ILSVRC2012_val_00021740.JPEG n01440764/
+mv val/ILSVRC2012_val_00021741.JPEG n02480855/
+mv val/ILSVRC2012_val_00021742.JPEG n02317335/
+mv val/ILSVRC2012_val_00021743.JPEG n01692333/
+mv val/ILSVRC2012_val_00021744.JPEG n02109961/
+mv val/ILSVRC2012_val_00021745.JPEG n03379051/
+mv val/ILSVRC2012_val_00021746.JPEG n03075370/
+mv val/ILSVRC2012_val_00021747.JPEG n02687172/
+mv val/ILSVRC2012_val_00021748.JPEG n04442312/
+mv val/ILSVRC2012_val_00021749.JPEG n03584254/
+mv val/ILSVRC2012_val_00021750.JPEG n01729977/
+mv val/ILSVRC2012_val_00021751.JPEG n02727426/
+mv val/ILSVRC2012_val_00021752.JPEG n03134739/
+mv val/ILSVRC2012_val_00021753.JPEG n01828970/
+mv val/ILSVRC2012_val_00021754.JPEG n02093428/
+mv val/ILSVRC2012_val_00021755.JPEG n02233338/
+mv val/ILSVRC2012_val_00021756.JPEG n02091831/
+mv val/ILSVRC2012_val_00021757.JPEG n02939185/
+mv val/ILSVRC2012_val_00021758.JPEG n04579432/
+mv val/ILSVRC2012_val_00021759.JPEG n04266014/
+mv val/ILSVRC2012_val_00021760.JPEG n03291819/
+mv val/ILSVRC2012_val_00021761.JPEG n03954731/
+mv val/ILSVRC2012_val_00021762.JPEG n03838899/
+mv val/ILSVRC2012_val_00021763.JPEG n07871810/
+mv val/ILSVRC2012_val_00021764.JPEG n02077923/
+mv val/ILSVRC2012_val_00021765.JPEG n12057211/
+mv val/ILSVRC2012_val_00021766.JPEG n02415577/
+mv val/ILSVRC2012_val_00021767.JPEG n02115641/
+mv val/ILSVRC2012_val_00021768.JPEG n03781244/
+mv val/ILSVRC2012_val_00021769.JPEG n07880968/
+mv val/ILSVRC2012_val_00021770.JPEG n07711569/
+mv val/ILSVRC2012_val_00021771.JPEG n03838899/
+mv val/ILSVRC2012_val_00021772.JPEG n03180011/
+mv val/ILSVRC2012_val_00021773.JPEG n02114712/
+mv val/ILSVRC2012_val_00021774.JPEG n03887697/
+mv val/ILSVRC2012_val_00021775.JPEG n02930766/
+mv val/ILSVRC2012_val_00021776.JPEG n01644900/
+mv val/ILSVRC2012_val_00021777.JPEG n02111277/
+mv val/ILSVRC2012_val_00021778.JPEG n02999410/
+mv val/ILSVRC2012_val_00021779.JPEG n03534580/
+mv val/ILSVRC2012_val_00021780.JPEG n02497673/
+mv val/ILSVRC2012_val_00021781.JPEG n02410509/
+mv val/ILSVRC2012_val_00021782.JPEG n02777292/
+mv val/ILSVRC2012_val_00021783.JPEG n03461385/
+mv val/ILSVRC2012_val_00021784.JPEG n04086273/
+mv val/ILSVRC2012_val_00021785.JPEG n03627232/
+mv val/ILSVRC2012_val_00021786.JPEG n01689811/
+mv val/ILSVRC2012_val_00021787.JPEG n09193705/
+mv val/ILSVRC2012_val_00021788.JPEG n01955084/
+mv val/ILSVRC2012_val_00021789.JPEG n03916031/
+mv val/ILSVRC2012_val_00021790.JPEG n04355338/
+mv val/ILSVRC2012_val_00021791.JPEG n04259630/
+mv val/ILSVRC2012_val_00021792.JPEG n03617480/
+mv val/ILSVRC2012_val_00021793.JPEG n01498041/
+mv val/ILSVRC2012_val_00021794.JPEG n02169497/
+mv val/ILSVRC2012_val_00021795.JPEG n02423022/
+mv val/ILSVRC2012_val_00021796.JPEG n02422106/
+mv val/ILSVRC2012_val_00021797.JPEG n02699494/
+mv val/ILSVRC2012_val_00021798.JPEG n02494079/
+mv val/ILSVRC2012_val_00021799.JPEG n04515003/
+mv val/ILSVRC2012_val_00021800.JPEG n03724870/
+mv val/ILSVRC2012_val_00021801.JPEG n02113799/
+mv val/ILSVRC2012_val_00021802.JPEG n03930630/
+mv val/ILSVRC2012_val_00021803.JPEG n04458633/
+mv val/ILSVRC2012_val_00021804.JPEG n04065272/
+mv val/ILSVRC2012_val_00021805.JPEG n02939185/
+mv val/ILSVRC2012_val_00021806.JPEG n02281787/
+mv val/ILSVRC2012_val_00021807.JPEG n02504458/
+mv val/ILSVRC2012_val_00021808.JPEG n02190166/
+mv val/ILSVRC2012_val_00021809.JPEG n03691459/
+mv val/ILSVRC2012_val_00021810.JPEG n02408429/
+mv val/ILSVRC2012_val_00021811.JPEG n07579787/
+mv val/ILSVRC2012_val_00021812.JPEG n02114712/
+mv val/ILSVRC2012_val_00021813.JPEG n04125021/
+mv val/ILSVRC2012_val_00021814.JPEG n04461696/
+mv val/ILSVRC2012_val_00021815.JPEG n03384352/
+mv val/ILSVRC2012_val_00021816.JPEG n03388183/
+mv val/ILSVRC2012_val_00021817.JPEG n03837869/
+mv val/ILSVRC2012_val_00021818.JPEG n03485407/
+mv val/ILSVRC2012_val_00021819.JPEG n01986214/
+mv val/ILSVRC2012_val_00021820.JPEG n03255030/
+mv val/ILSVRC2012_val_00021821.JPEG n02804610/
+mv val/ILSVRC2012_val_00021822.JPEG n03255030/
+mv val/ILSVRC2012_val_00021823.JPEG n01924916/
+mv val/ILSVRC2012_val_00021824.JPEG n04398044/
+mv val/ILSVRC2012_val_00021825.JPEG n04540053/
+mv val/ILSVRC2012_val_00021826.JPEG n02667093/
+mv val/ILSVRC2012_val_00021827.JPEG n03146219/
+mv val/ILSVRC2012_val_00021828.JPEG n02483708/
+mv val/ILSVRC2012_val_00021829.JPEG n03125729/
+mv val/ILSVRC2012_val_00021830.JPEG n09256479/
+mv val/ILSVRC2012_val_00021831.JPEG n02089078/
+mv val/ILSVRC2012_val_00021832.JPEG n02607072/
+mv val/ILSVRC2012_val_00021833.JPEG n03742115/
+mv val/ILSVRC2012_val_00021834.JPEG n04067472/
+mv val/ILSVRC2012_val_00021835.JPEG n02114712/
+mv val/ILSVRC2012_val_00021836.JPEG n03196217/
+mv val/ILSVRC2012_val_00021837.JPEG n04254120/
+mv val/ILSVRC2012_val_00021838.JPEG n02105412/
+mv val/ILSVRC2012_val_00021839.JPEG n03250847/
+mv val/ILSVRC2012_val_00021840.JPEG n02111500/
+mv val/ILSVRC2012_val_00021841.JPEG n07565083/
+mv val/ILSVRC2012_val_00021842.JPEG n04162706/
+mv val/ILSVRC2012_val_00021843.JPEG n01917289/
+mv val/ILSVRC2012_val_00021844.JPEG n03018349/
+mv val/ILSVRC2012_val_00021845.JPEG n03530642/
+mv val/ILSVRC2012_val_00021846.JPEG n02107908/
+mv val/ILSVRC2012_val_00021847.JPEG n02169497/
+mv val/ILSVRC2012_val_00021848.JPEG n02018795/
+mv val/ILSVRC2012_val_00021849.JPEG n03658185/
+mv val/ILSVRC2012_val_00021850.JPEG n03424325/
+mv val/ILSVRC2012_val_00021851.JPEG n02018207/
+mv val/ILSVRC2012_val_00021852.JPEG n03630383/
+mv val/ILSVRC2012_val_00021853.JPEG n03903868/
+mv val/ILSVRC2012_val_00021854.JPEG n07745940/
+mv val/ILSVRC2012_val_00021855.JPEG n02138441/
+mv val/ILSVRC2012_val_00021856.JPEG n03372029/
+mv val/ILSVRC2012_val_00021857.JPEG n02319095/
+mv val/ILSVRC2012_val_00021858.JPEG n01855672/
+mv val/ILSVRC2012_val_00021859.JPEG n03062245/
+mv val/ILSVRC2012_val_00021860.JPEG n07753592/
+mv val/ILSVRC2012_val_00021861.JPEG n04147183/
+mv val/ILSVRC2012_val_00021862.JPEG n04254777/
+mv val/ILSVRC2012_val_00021863.JPEG n03838899/
+mv val/ILSVRC2012_val_00021864.JPEG n02219486/
+mv val/ILSVRC2012_val_00021865.JPEG n04270147/
+mv val/ILSVRC2012_val_00021866.JPEG n07871810/
+mv val/ILSVRC2012_val_00021867.JPEG n01910747/
+mv val/ILSVRC2012_val_00021868.JPEG n02999410/
+mv val/ILSVRC2012_val_00021869.JPEG n12768682/
+mv val/ILSVRC2012_val_00021870.JPEG n03649909/
+mv val/ILSVRC2012_val_00021871.JPEG n04120489/
+mv val/ILSVRC2012_val_00021872.JPEG n02002724/
+mv val/ILSVRC2012_val_00021873.JPEG n01756291/
+mv val/ILSVRC2012_val_00021874.JPEG n02445715/
+mv val/ILSVRC2012_val_00021875.JPEG n02009912/
+mv val/ILSVRC2012_val_00021876.JPEG n01798484/
+mv val/ILSVRC2012_val_00021877.JPEG n04532670/
+mv val/ILSVRC2012_val_00021878.JPEG n04604644/
+mv val/ILSVRC2012_val_00021879.JPEG n04044716/
+mv val/ILSVRC2012_val_00021880.JPEG n02169497/
+mv val/ILSVRC2012_val_00021881.JPEG n02669723/
+mv val/ILSVRC2012_val_00021882.JPEG n04461696/
+mv val/ILSVRC2012_val_00021883.JPEG n02134084/
+mv val/ILSVRC2012_val_00021884.JPEG n03743016/
+mv val/ILSVRC2012_val_00021885.JPEG n01798484/
+mv val/ILSVRC2012_val_00021886.JPEG n03404251/
+mv val/ILSVRC2012_val_00021887.JPEG n02783161/
+mv val/ILSVRC2012_val_00021888.JPEG n03201208/
+mv val/ILSVRC2012_val_00021889.JPEG n02134084/
+mv val/ILSVRC2012_val_00021890.JPEG n02607072/
+mv val/ILSVRC2012_val_00021891.JPEG n03180011/
+mv val/ILSVRC2012_val_00021892.JPEG n02094433/
+mv val/ILSVRC2012_val_00021893.JPEG n03388549/
+mv val/ILSVRC2012_val_00021894.JPEG n07590611/
+mv val/ILSVRC2012_val_00021895.JPEG n02640242/
+mv val/ILSVRC2012_val_00021896.JPEG n02085782/
+mv val/ILSVRC2012_val_00021897.JPEG n02871525/
+mv val/ILSVRC2012_val_00021898.JPEG n03967562/
+mv val/ILSVRC2012_val_00021899.JPEG n02119789/
+mv val/ILSVRC2012_val_00021900.JPEG n04507155/
+mv val/ILSVRC2012_val_00021901.JPEG n04149813/
+mv val/ILSVRC2012_val_00021902.JPEG n03492542/
+mv val/ILSVRC2012_val_00021903.JPEG n02437312/
+mv val/ILSVRC2012_val_00021904.JPEG n02098105/
+mv val/ILSVRC2012_val_00021905.JPEG n01443537/
+mv val/ILSVRC2012_val_00021906.JPEG n01632458/
+mv val/ILSVRC2012_val_00021907.JPEG n02860847/
+mv val/ILSVRC2012_val_00021908.JPEG n02113023/
+mv val/ILSVRC2012_val_00021909.JPEG n03337140/
+mv val/ILSVRC2012_val_00021910.JPEG n12620546/
+mv val/ILSVRC2012_val_00021911.JPEG n03459775/
+mv val/ILSVRC2012_val_00021912.JPEG n11879895/
+mv val/ILSVRC2012_val_00021913.JPEG n03085013/
+mv val/ILSVRC2012_val_00021914.JPEG n02096585/
+mv val/ILSVRC2012_val_00021915.JPEG n02088466/
+mv val/ILSVRC2012_val_00021916.JPEG n01751748/
+mv val/ILSVRC2012_val_00021917.JPEG n02497673/
+mv val/ILSVRC2012_val_00021918.JPEG n02236044/
+mv val/ILSVRC2012_val_00021919.JPEG n03109150/
+mv val/ILSVRC2012_val_00021920.JPEG n02130308/
+mv val/ILSVRC2012_val_00021921.JPEG n04325704/
+mv val/ILSVRC2012_val_00021922.JPEG n03676483/
+mv val/ILSVRC2012_val_00021923.JPEG n02105412/
+mv val/ILSVRC2012_val_00021924.JPEG n03180011/
+mv val/ILSVRC2012_val_00021925.JPEG n02787622/
+mv val/ILSVRC2012_val_00021926.JPEG n02025239/
+mv val/ILSVRC2012_val_00021927.JPEG n01693334/
+mv val/ILSVRC2012_val_00021928.JPEG n02325366/
+mv val/ILSVRC2012_val_00021929.JPEG n02281787/
+mv val/ILSVRC2012_val_00021930.JPEG n04597913/
+mv val/ILSVRC2012_val_00021931.JPEG n04346328/
+mv val/ILSVRC2012_val_00021932.JPEG n04404412/
+mv val/ILSVRC2012_val_00021933.JPEG n02006656/
+mv val/ILSVRC2012_val_00021934.JPEG n02107312/
+mv val/ILSVRC2012_val_00021935.JPEG n02165456/
+mv val/ILSVRC2012_val_00021936.JPEG n03042490/
+mv val/ILSVRC2012_val_00021937.JPEG n04418357/
+mv val/ILSVRC2012_val_00021938.JPEG n02093428/
+mv val/ILSVRC2012_val_00021939.JPEG n04133789/
+mv val/ILSVRC2012_val_00021940.JPEG n07754684/
+mv val/ILSVRC2012_val_00021941.JPEG n03075370/
+mv val/ILSVRC2012_val_00021942.JPEG n03916031/
+mv val/ILSVRC2012_val_00021943.JPEG n04536866/
+mv val/ILSVRC2012_val_00021944.JPEG n07711569/
+mv val/ILSVRC2012_val_00021945.JPEG n02895154/
+mv val/ILSVRC2012_val_00021946.JPEG n02105251/
+mv val/ILSVRC2012_val_00021947.JPEG n02692877/
+mv val/ILSVRC2012_val_00021948.JPEG n03344393/
+mv val/ILSVRC2012_val_00021949.JPEG n04493381/
+mv val/ILSVRC2012_val_00021950.JPEG n04579145/
+mv val/ILSVRC2012_val_00021951.JPEG n03201208/
+mv val/ILSVRC2012_val_00021952.JPEG n04243546/
+mv val/ILSVRC2012_val_00021953.JPEG n02167151/
+mv val/ILSVRC2012_val_00021954.JPEG n01797886/
+mv val/ILSVRC2012_val_00021955.JPEG n09256479/
+mv val/ILSVRC2012_val_00021956.JPEG n01582220/
+mv val/ILSVRC2012_val_00021957.JPEG n04548362/
+mv val/ILSVRC2012_val_00021958.JPEG n03476684/
+mv val/ILSVRC2012_val_00021959.JPEG n04606251/
+mv val/ILSVRC2012_val_00021960.JPEG n04579432/
+mv val/ILSVRC2012_val_00021961.JPEG n02086910/
+mv val/ILSVRC2012_val_00021962.JPEG n02134084/
+mv val/ILSVRC2012_val_00021963.JPEG n02109525/
+mv val/ILSVRC2012_val_00021964.JPEG n04238763/
+mv val/ILSVRC2012_val_00021965.JPEG n03764736/
+mv val/ILSVRC2012_val_00021966.JPEG n04044716/
+mv val/ILSVRC2012_val_00021967.JPEG n04548362/
+mv val/ILSVRC2012_val_00021968.JPEG n02692877/
+mv val/ILSVRC2012_val_00021969.JPEG n03207941/
+mv val/ILSVRC2012_val_00021970.JPEG n04229816/
+mv val/ILSVRC2012_val_00021971.JPEG n03598930/
+mv val/ILSVRC2012_val_00021972.JPEG n04591157/
+mv val/ILSVRC2012_val_00021973.JPEG n02317335/
+mv val/ILSVRC2012_val_00021974.JPEG n01734418/
+mv val/ILSVRC2012_val_00021975.JPEG n15075141/
+mv val/ILSVRC2012_val_00021976.JPEG n03825788/
+mv val/ILSVRC2012_val_00021977.JPEG n04536866/
+mv val/ILSVRC2012_val_00021978.JPEG n04254777/
+mv val/ILSVRC2012_val_00021979.JPEG n02277742/
+mv val/ILSVRC2012_val_00021980.JPEG n03877845/
+mv val/ILSVRC2012_val_00021981.JPEG n02747177/
+mv val/ILSVRC2012_val_00021982.JPEG n01667778/
+mv val/ILSVRC2012_val_00021983.JPEG n01664065/
+mv val/ILSVRC2012_val_00021984.JPEG n03180011/
+mv val/ILSVRC2012_val_00021985.JPEG n02701002/
+mv val/ILSVRC2012_val_00021986.JPEG n13040303/
+mv val/ILSVRC2012_val_00021987.JPEG n03388549/
+mv val/ILSVRC2012_val_00021988.JPEG n04591713/
+mv val/ILSVRC2012_val_00021989.JPEG n04389033/
+mv val/ILSVRC2012_val_00021990.JPEG n02699494/
+mv val/ILSVRC2012_val_00021991.JPEG n02105162/
+mv val/ILSVRC2012_val_00021992.JPEG n02280649/
+mv val/ILSVRC2012_val_00021993.JPEG n04254777/
+mv val/ILSVRC2012_val_00021994.JPEG n02607072/
+mv val/ILSVRC2012_val_00021995.JPEG n01985128/
+mv val/ILSVRC2012_val_00021996.JPEG n03045698/
+mv val/ILSVRC2012_val_00021997.JPEG n03717622/
+mv val/ILSVRC2012_val_00021998.JPEG n02086240/
+mv val/ILSVRC2012_val_00021999.JPEG n03903868/
+mv val/ILSVRC2012_val_00022000.JPEG n02326432/
+mv val/ILSVRC2012_val_00022001.JPEG n02229544/
+mv val/ILSVRC2012_val_00022002.JPEG n03530642/
+mv val/ILSVRC2012_val_00022003.JPEG n01685808/
+mv val/ILSVRC2012_val_00022004.JPEG n02091467/
+mv val/ILSVRC2012_val_00022005.JPEG n03544143/
+mv val/ILSVRC2012_val_00022006.JPEG n03902125/
+mv val/ILSVRC2012_val_00022007.JPEG n02125311/
+mv val/ILSVRC2012_val_00022008.JPEG n09399592/
+mv val/ILSVRC2012_val_00022009.JPEG n04070727/
+mv val/ILSVRC2012_val_00022010.JPEG n07730033/
+mv val/ILSVRC2012_val_00022011.JPEG n07684084/
+mv val/ILSVRC2012_val_00022012.JPEG n04398044/
+mv val/ILSVRC2012_val_00022013.JPEG n03372029/
+mv val/ILSVRC2012_val_00022014.JPEG n03483316/
+mv val/ILSVRC2012_val_00022015.JPEG n03495258/
+mv val/ILSVRC2012_val_00022016.JPEG n01728572/
+mv val/ILSVRC2012_val_00022017.JPEG n04037443/
+mv val/ILSVRC2012_val_00022018.JPEG n02395406/
+mv val/ILSVRC2012_val_00022019.JPEG n03457902/
+mv val/ILSVRC2012_val_00022020.JPEG n03761084/
+mv val/ILSVRC2012_val_00022021.JPEG n01734418/
+mv val/ILSVRC2012_val_00022022.JPEG n02090721/
+mv val/ILSVRC2012_val_00022023.JPEG n03976657/
+mv val/ILSVRC2012_val_00022024.JPEG n03785016/
+mv val/ILSVRC2012_val_00022025.JPEG n01514668/
+mv val/ILSVRC2012_val_00022026.JPEG n04357314/
+mv val/ILSVRC2012_val_00022027.JPEG n02835271/
+mv val/ILSVRC2012_val_00022028.JPEG n02504013/
+mv val/ILSVRC2012_val_00022029.JPEG n02489166/
+mv val/ILSVRC2012_val_00022030.JPEG n03530642/
+mv val/ILSVRC2012_val_00022031.JPEG n02950826/
+mv val/ILSVRC2012_val_00022032.JPEG n02111889/
+mv val/ILSVRC2012_val_00022033.JPEG n04371774/
+mv val/ILSVRC2012_val_00022034.JPEG n04560804/
+mv val/ILSVRC2012_val_00022035.JPEG n03445924/
+mv val/ILSVRC2012_val_00022036.JPEG n02091831/
+mv val/ILSVRC2012_val_00022037.JPEG n07753592/
+mv val/ILSVRC2012_val_00022038.JPEG n03447721/
+mv val/ILSVRC2012_val_00022039.JPEG n01770081/
+mv val/ILSVRC2012_val_00022040.JPEG n02487347/
+mv val/ILSVRC2012_val_00022041.JPEG n02794156/
+mv val/ILSVRC2012_val_00022042.JPEG n02097209/
+mv val/ILSVRC2012_val_00022043.JPEG n03891251/
+mv val/ILSVRC2012_val_00022044.JPEG n02790996/
+mv val/ILSVRC2012_val_00022045.JPEG n03109150/
+mv val/ILSVRC2012_val_00022046.JPEG n04380533/
+mv val/ILSVRC2012_val_00022047.JPEG n03595614/
+mv val/ILSVRC2012_val_00022048.JPEG n04153751/
+mv val/ILSVRC2012_val_00022049.JPEG n04591713/
+mv val/ILSVRC2012_val_00022050.JPEG n02108915/
+mv val/ILSVRC2012_val_00022051.JPEG n04429376/
+mv val/ILSVRC2012_val_00022052.JPEG n01641577/
+mv val/ILSVRC2012_val_00022053.JPEG n04264628/
+mv val/ILSVRC2012_val_00022054.JPEG n03271574/
+mv val/ILSVRC2012_val_00022055.JPEG n02114367/
+mv val/ILSVRC2012_val_00022056.JPEG n07930864/
+mv val/ILSVRC2012_val_00022057.JPEG n02105641/
+mv val/ILSVRC2012_val_00022058.JPEG n02104365/
+mv val/ILSVRC2012_val_00022059.JPEG n03717622/
+mv val/ILSVRC2012_val_00022060.JPEG n04423845/
+mv val/ILSVRC2012_val_00022061.JPEG n02094258/
+mv val/ILSVRC2012_val_00022062.JPEG n02116738/
+mv val/ILSVRC2012_val_00022063.JPEG n01692333/
+mv val/ILSVRC2012_val_00022064.JPEG n02909870/
+mv val/ILSVRC2012_val_00022065.JPEG n02606052/
+mv val/ILSVRC2012_val_00022066.JPEG n02099849/
+mv val/ILSVRC2012_val_00022067.JPEG n02363005/
+mv val/ILSVRC2012_val_00022068.JPEG n07734744/
+mv val/ILSVRC2012_val_00022069.JPEG n02841315/
+mv val/ILSVRC2012_val_00022070.JPEG n01860187/
+mv val/ILSVRC2012_val_00022071.JPEG n02090721/
+mv val/ILSVRC2012_val_00022072.JPEG n03841143/
+mv val/ILSVRC2012_val_00022073.JPEG n02892201/
+mv val/ILSVRC2012_val_00022074.JPEG n04125021/
+mv val/ILSVRC2012_val_00022075.JPEG n04612504/
+mv val/ILSVRC2012_val_00022076.JPEG n01537544/
+mv val/ILSVRC2012_val_00022077.JPEG n04505470/
+mv val/ILSVRC2012_val_00022078.JPEG n02281406/
+mv val/ILSVRC2012_val_00022079.JPEG n03983396/
+mv val/ILSVRC2012_val_00022080.JPEG n02123045/
+mv val/ILSVRC2012_val_00022081.JPEG n01784675/
+mv val/ILSVRC2012_val_00022082.JPEG n02493509/
+mv val/ILSVRC2012_val_00022083.JPEG n03476991/
+mv val/ILSVRC2012_val_00022084.JPEG n03534580/
+mv val/ILSVRC2012_val_00022085.JPEG n02123159/
+mv val/ILSVRC2012_val_00022086.JPEG n02808440/
+mv val/ILSVRC2012_val_00022087.JPEG n04074963/
+mv val/ILSVRC2012_val_00022088.JPEG n01616318/
+mv val/ILSVRC2012_val_00022089.JPEG n03786901/
+mv val/ILSVRC2012_val_00022090.JPEG n03721384/
+mv val/ILSVRC2012_val_00022091.JPEG n02086240/
+mv val/ILSVRC2012_val_00022092.JPEG n02488702/
+mv val/ILSVRC2012_val_00022093.JPEG n03642806/
+mv val/ILSVRC2012_val_00022094.JPEG n03160309/
+mv val/ILSVRC2012_val_00022095.JPEG n01796340/
+mv val/ILSVRC2012_val_00022096.JPEG n13044778/
+mv val/ILSVRC2012_val_00022097.JPEG n09256479/
+mv val/ILSVRC2012_val_00022098.JPEG n03089624/
+mv val/ILSVRC2012_val_00022099.JPEG n02086910/
+mv val/ILSVRC2012_val_00022100.JPEG n04604644/
+mv val/ILSVRC2012_val_00022101.JPEG n04040759/
+mv val/ILSVRC2012_val_00022102.JPEG n07584110/
+mv val/ILSVRC2012_val_00022103.JPEG n04552348/
+mv val/ILSVRC2012_val_00022104.JPEG n04149813/
+mv val/ILSVRC2012_val_00022105.JPEG n02066245/
+mv val/ILSVRC2012_val_00022106.JPEG n01580077/
+mv val/ILSVRC2012_val_00022107.JPEG n04443257/
+mv val/ILSVRC2012_val_00022108.JPEG n04336792/
+mv val/ILSVRC2012_val_00022109.JPEG n02107683/
+mv val/ILSVRC2012_val_00022110.JPEG n01797886/
+mv val/ILSVRC2012_val_00022111.JPEG n02134418/
+mv val/ILSVRC2012_val_00022112.JPEG n02134418/
+mv val/ILSVRC2012_val_00022113.JPEG n01632777/
+mv val/ILSVRC2012_val_00022114.JPEG n06359193/
+mv val/ILSVRC2012_val_00022115.JPEG n01797886/
+mv val/ILSVRC2012_val_00022116.JPEG n03485407/
+mv val/ILSVRC2012_val_00022117.JPEG n04259630/
+mv val/ILSVRC2012_val_00022118.JPEG n03992509/
+mv val/ILSVRC2012_val_00022119.JPEG n07248320/
+mv val/ILSVRC2012_val_00022120.JPEG n04486054/
+mv val/ILSVRC2012_val_00022121.JPEG n03026506/
+mv val/ILSVRC2012_val_00022122.JPEG n02088632/
+mv val/ILSVRC2012_val_00022123.JPEG n03124043/
+mv val/ILSVRC2012_val_00022124.JPEG n02442845/
+mv val/ILSVRC2012_val_00022125.JPEG n02091467/
+mv val/ILSVRC2012_val_00022126.JPEG n03376595/
+mv val/ILSVRC2012_val_00022127.JPEG n04310018/
+mv val/ILSVRC2012_val_00022128.JPEG n02966687/
+mv val/ILSVRC2012_val_00022129.JPEG n03777568/
+mv val/ILSVRC2012_val_00022130.JPEG n03100240/
+mv val/ILSVRC2012_val_00022131.JPEG n04350905/
+mv val/ILSVRC2012_val_00022132.JPEG n02843684/
+mv val/ILSVRC2012_val_00022133.JPEG n02109961/
+mv val/ILSVRC2012_val_00022134.JPEG n01631663/
+mv val/ILSVRC2012_val_00022135.JPEG n03240683/
+mv val/ILSVRC2012_val_00022136.JPEG n03141823/
+mv val/ILSVRC2012_val_00022137.JPEG n02091635/
+mv val/ILSVRC2012_val_00022138.JPEG n01443537/
+mv val/ILSVRC2012_val_00022139.JPEG n11939491/
+mv val/ILSVRC2012_val_00022140.JPEG n02002724/
+mv val/ILSVRC2012_val_00022141.JPEG n03733281/
+mv val/ILSVRC2012_val_00022142.JPEG n02106662/
+mv val/ILSVRC2012_val_00022143.JPEG n03942813/
+mv val/ILSVRC2012_val_00022144.JPEG n03337140/
+mv val/ILSVRC2012_val_00022145.JPEG n03777568/
+mv val/ILSVRC2012_val_00022146.JPEG n04251144/
+mv val/ILSVRC2012_val_00022147.JPEG n07716906/
+mv val/ILSVRC2012_val_00022148.JPEG n01820546/
+mv val/ILSVRC2012_val_00022149.JPEG n03929660/
+mv val/ILSVRC2012_val_00022150.JPEG n03478589/
+mv val/ILSVRC2012_val_00022151.JPEG n02441942/
+mv val/ILSVRC2012_val_00022152.JPEG n02364673/
+mv val/ILSVRC2012_val_00022153.JPEG n09835506/
+mv val/ILSVRC2012_val_00022154.JPEG n04515003/
+mv val/ILSVRC2012_val_00022155.JPEG n02264363/
+mv val/ILSVRC2012_val_00022156.JPEG n01773157/
+mv val/ILSVRC2012_val_00022157.JPEG n01770393/
+mv val/ILSVRC2012_val_00022158.JPEG n03777568/
+mv val/ILSVRC2012_val_00022159.JPEG n04049303/
+mv val/ILSVRC2012_val_00022160.JPEG n02219486/
+mv val/ILSVRC2012_val_00022161.JPEG n02130308/
+mv val/ILSVRC2012_val_00022162.JPEG n02437312/
+mv val/ILSVRC2012_val_00022163.JPEG n02815834/
+mv val/ILSVRC2012_val_00022164.JPEG n02093647/
+mv val/ILSVRC2012_val_00022165.JPEG n01616318/
+mv val/ILSVRC2012_val_00022166.JPEG n04332243/
+mv val/ILSVRC2012_val_00022167.JPEG n12620546/
+mv val/ILSVRC2012_val_00022168.JPEG n10148035/
+mv val/ILSVRC2012_val_00022169.JPEG n02927161/
+mv val/ILSVRC2012_val_00022170.JPEG n02128757/
+mv val/ILSVRC2012_val_00022171.JPEG n03496892/
+mv val/ILSVRC2012_val_00022172.JPEG n03417042/
+mv val/ILSVRC2012_val_00022173.JPEG n04200800/
+mv val/ILSVRC2012_val_00022174.JPEG n02484975/
+mv val/ILSVRC2012_val_00022175.JPEG n01689811/
+mv val/ILSVRC2012_val_00022176.JPEG n02107574/
+mv val/ILSVRC2012_val_00022177.JPEG n03976657/
+mv val/ILSVRC2012_val_00022178.JPEG n03998194/
+mv val/ILSVRC2012_val_00022179.JPEG n02088632/
+mv val/ILSVRC2012_val_00022180.JPEG n04243546/
+mv val/ILSVRC2012_val_00022181.JPEG n03788365/
+mv val/ILSVRC2012_val_00022182.JPEG n02087046/
+mv val/ILSVRC2012_val_00022183.JPEG n10565667/
+mv val/ILSVRC2012_val_00022184.JPEG n03832673/
+mv val/ILSVRC2012_val_00022185.JPEG n02412080/
+mv val/ILSVRC2012_val_00022186.JPEG n01558993/
+mv val/ILSVRC2012_val_00022187.JPEG n03492542/
+mv val/ILSVRC2012_val_00022188.JPEG n04540053/
+mv val/ILSVRC2012_val_00022189.JPEG n01796340/
+mv val/ILSVRC2012_val_00022190.JPEG n04376876/
+mv val/ILSVRC2012_val_00022191.JPEG n02395406/
+mv val/ILSVRC2012_val_00022192.JPEG n03075370/
+mv val/ILSVRC2012_val_00022193.JPEG n07753592/
+mv val/ILSVRC2012_val_00022194.JPEG n02481823/
+mv val/ILSVRC2012_val_00022195.JPEG n02457408/
+mv val/ILSVRC2012_val_00022196.JPEG n02110806/
+mv val/ILSVRC2012_val_00022197.JPEG n03877472/
+mv val/ILSVRC2012_val_00022198.JPEG n01667778/
+mv val/ILSVRC2012_val_00022199.JPEG n03131574/
+mv val/ILSVRC2012_val_00022200.JPEG n03956157/
+mv val/ILSVRC2012_val_00022201.JPEG n02108422/
+mv val/ILSVRC2012_val_00022202.JPEG n02114548/
+mv val/ILSVRC2012_val_00022203.JPEG n03272010/
+mv val/ILSVRC2012_val_00022204.JPEG n03394916/
+mv val/ILSVRC2012_val_00022205.JPEG n01774384/
+mv val/ILSVRC2012_val_00022206.JPEG n03623198/
+mv val/ILSVRC2012_val_00022207.JPEG n02027492/
+mv val/ILSVRC2012_val_00022208.JPEG n04099969/
+mv val/ILSVRC2012_val_00022209.JPEG n02106662/
+mv val/ILSVRC2012_val_00022210.JPEG n02951358/
+mv val/ILSVRC2012_val_00022211.JPEG n01798484/
+mv val/ILSVRC2012_val_00022212.JPEG n13133613/
+mv val/ILSVRC2012_val_00022213.JPEG n03207743/
+mv val/ILSVRC2012_val_00022214.JPEG n04560804/
+mv val/ILSVRC2012_val_00022215.JPEG n02268443/
+mv val/ILSVRC2012_val_00022216.JPEG n03775071/
+mv val/ILSVRC2012_val_00022217.JPEG n04346328/
+mv val/ILSVRC2012_val_00022218.JPEG n01930112/
+mv val/ILSVRC2012_val_00022219.JPEG n03584254/
+mv val/ILSVRC2012_val_00022220.JPEG n02790996/
+mv val/ILSVRC2012_val_00022221.JPEG n09256479/
+mv val/ILSVRC2012_val_00022222.JPEG n01985128/
+mv val/ILSVRC2012_val_00022223.JPEG n02480495/
+mv val/ILSVRC2012_val_00022224.JPEG n02268853/
+mv val/ILSVRC2012_val_00022225.JPEG n03627232/
+mv val/ILSVRC2012_val_00022226.JPEG n03180011/
+mv val/ILSVRC2012_val_00022227.JPEG n02233338/
+mv val/ILSVRC2012_val_00022228.JPEG n03982430/
+mv val/ILSVRC2012_val_00022229.JPEG n02841315/
+mv val/ILSVRC2012_val_00022230.JPEG n03649909/
+mv val/ILSVRC2012_val_00022231.JPEG n04336792/
+mv val/ILSVRC2012_val_00022232.JPEG n09468604/
+mv val/ILSVRC2012_val_00022233.JPEG n02056570/
+mv val/ILSVRC2012_val_00022234.JPEG n02787622/
+mv val/ILSVRC2012_val_00022235.JPEG n03764736/
+mv val/ILSVRC2012_val_00022236.JPEG n02442845/
+mv val/ILSVRC2012_val_00022237.JPEG n02437616/
+mv val/ILSVRC2012_val_00022238.JPEG n03445924/
+mv val/ILSVRC2012_val_00022239.JPEG n01917289/
+mv val/ILSVRC2012_val_00022240.JPEG n02107312/
+mv val/ILSVRC2012_val_00022241.JPEG n02137549/
+mv val/ILSVRC2012_val_00022242.JPEG n03599486/
+mv val/ILSVRC2012_val_00022243.JPEG n03721384/
+mv val/ILSVRC2012_val_00022244.JPEG n04041544/
+mv val/ILSVRC2012_val_00022245.JPEG n01824575/
+mv val/ILSVRC2012_val_00022246.JPEG n04285008/
+mv val/ILSVRC2012_val_00022247.JPEG n01687978/
+mv val/ILSVRC2012_val_00022248.JPEG n01514668/
+mv val/ILSVRC2012_val_00022249.JPEG n04554684/
+mv val/ILSVRC2012_val_00022250.JPEG n04209239/
+mv val/ILSVRC2012_val_00022251.JPEG n03272562/
+mv val/ILSVRC2012_val_00022252.JPEG n03425413/
+mv val/ILSVRC2012_val_00022253.JPEG n02797295/
+mv val/ILSVRC2012_val_00022254.JPEG n02106382/
+mv val/ILSVRC2012_val_00022255.JPEG n06359193/
+mv val/ILSVRC2012_val_00022256.JPEG n03642806/
+mv val/ILSVRC2012_val_00022257.JPEG n01677366/
+mv val/ILSVRC2012_val_00022258.JPEG n03134739/
+mv val/ILSVRC2012_val_00022259.JPEG n02105641/
+mv val/ILSVRC2012_val_00022260.JPEG n01985128/
+mv val/ILSVRC2012_val_00022261.JPEG n03594945/
+mv val/ILSVRC2012_val_00022262.JPEG n07583066/
+mv val/ILSVRC2012_val_00022263.JPEG n02667093/
+mv val/ILSVRC2012_val_00022264.JPEG n02086646/
+mv val/ILSVRC2012_val_00022265.JPEG n07590611/
+mv val/ILSVRC2012_val_00022266.JPEG n02111889/
+mv val/ILSVRC2012_val_00022267.JPEG n03857828/
+mv val/ILSVRC2012_val_00022268.JPEG n04259630/
+mv val/ILSVRC2012_val_00022269.JPEG n02730930/
+mv val/ILSVRC2012_val_00022270.JPEG n04285008/
+mv val/ILSVRC2012_val_00022271.JPEG n03095699/
+mv val/ILSVRC2012_val_00022272.JPEG n03761084/
+mv val/ILSVRC2012_val_00022273.JPEG n02167151/
+mv val/ILSVRC2012_val_00022274.JPEG n04404412/
+mv val/ILSVRC2012_val_00022275.JPEG n04254120/
+mv val/ILSVRC2012_val_00022276.JPEG n04461696/
+mv val/ILSVRC2012_val_00022277.JPEG n04192698/
+mv val/ILSVRC2012_val_00022278.JPEG n01873310/
+mv val/ILSVRC2012_val_00022279.JPEG n03763968/
+mv val/ILSVRC2012_val_00022280.JPEG n02804414/
+mv val/ILSVRC2012_val_00022281.JPEG n04325704/
+mv val/ILSVRC2012_val_00022282.JPEG n01682714/
+mv val/ILSVRC2012_val_00022283.JPEG n02120505/
+mv val/ILSVRC2012_val_00022284.JPEG n03584829/
+mv val/ILSVRC2012_val_00022285.JPEG n04356056/
+mv val/ILSVRC2012_val_00022286.JPEG n04476259/
+mv val/ILSVRC2012_val_00022287.JPEG n09332890/
+mv val/ILSVRC2012_val_00022288.JPEG n04399382/
+mv val/ILSVRC2012_val_00022289.JPEG n03676483/
+mv val/ILSVRC2012_val_00022290.JPEG n03961711/
+mv val/ILSVRC2012_val_00022291.JPEG n09332890/
+mv val/ILSVRC2012_val_00022292.JPEG n02096294/
+mv val/ILSVRC2012_val_00022293.JPEG n04532106/
+mv val/ILSVRC2012_val_00022294.JPEG n04149813/
+mv val/ILSVRC2012_val_00022295.JPEG n03891251/
+mv val/ILSVRC2012_val_00022296.JPEG n06874185/
+mv val/ILSVRC2012_val_00022297.JPEG n02769748/
+mv val/ILSVRC2012_val_00022298.JPEG n04485082/
+mv val/ILSVRC2012_val_00022299.JPEG n04277352/
+mv val/ILSVRC2012_val_00022300.JPEG n03793489/
+mv val/ILSVRC2012_val_00022301.JPEG n03788365/
+mv val/ILSVRC2012_val_00022302.JPEG n02389026/
+mv val/ILSVRC2012_val_00022303.JPEG n03709823/
+mv val/ILSVRC2012_val_00022304.JPEG n03032252/
+mv val/ILSVRC2012_val_00022305.JPEG n02606052/
+mv val/ILSVRC2012_val_00022306.JPEG n03271574/
+mv val/ILSVRC2012_val_00022307.JPEG n03492542/
+mv val/ILSVRC2012_val_00022308.JPEG n01665541/
+mv val/ILSVRC2012_val_00022309.JPEG n01675722/
+mv val/ILSVRC2012_val_00022310.JPEG n03691459/
+mv val/ILSVRC2012_val_00022311.JPEG n07892512/
+mv val/ILSVRC2012_val_00022312.JPEG n02799071/
+mv val/ILSVRC2012_val_00022313.JPEG n02007558/
+mv val/ILSVRC2012_val_00022314.JPEG n02510455/
+mv val/ILSVRC2012_val_00022315.JPEG n03742115/
+mv val/ILSVRC2012_val_00022316.JPEG n04136333/
+mv val/ILSVRC2012_val_00022317.JPEG n03630383/
+mv val/ILSVRC2012_val_00022318.JPEG n02910353/
+mv val/ILSVRC2012_val_00022319.JPEG n02111129/
+mv val/ILSVRC2012_val_00022320.JPEG n02488702/
+mv val/ILSVRC2012_val_00022321.JPEG n01950731/
+mv val/ILSVRC2012_val_00022322.JPEG n04204238/
+mv val/ILSVRC2012_val_00022323.JPEG n04461696/
+mv val/ILSVRC2012_val_00022324.JPEG n02102318/
+mv val/ILSVRC2012_val_00022325.JPEG n03538406/
+mv val/ILSVRC2012_val_00022326.JPEG n03916031/
+mv val/ILSVRC2012_val_00022327.JPEG n02130308/
+mv val/ILSVRC2012_val_00022328.JPEG n04311174/
+mv val/ILSVRC2012_val_00022329.JPEG n01667114/
+mv val/ILSVRC2012_val_00022330.JPEG n02115641/
+mv val/ILSVRC2012_val_00022331.JPEG n04487394/
+mv val/ILSVRC2012_val_00022332.JPEG n02233338/
+mv val/ILSVRC2012_val_00022333.JPEG n02099267/
+mv val/ILSVRC2012_val_00022334.JPEG n01797886/
+mv val/ILSVRC2012_val_00022335.JPEG n02051845/
+mv val/ILSVRC2012_val_00022336.JPEG n04428191/
+mv val/ILSVRC2012_val_00022337.JPEG n02124075/
+mv val/ILSVRC2012_val_00022338.JPEG n04532670/
+mv val/ILSVRC2012_val_00022339.JPEG n03775546/
+mv val/ILSVRC2012_val_00022340.JPEG n07892512/
+mv val/ILSVRC2012_val_00022341.JPEG n02100877/
+mv val/ILSVRC2012_val_00022342.JPEG n04398044/
+mv val/ILSVRC2012_val_00022343.JPEG n04590129/
+mv val/ILSVRC2012_val_00022344.JPEG n02101388/
+mv val/ILSVRC2012_val_00022345.JPEG n04254680/
+mv val/ILSVRC2012_val_00022346.JPEG n04485082/
+mv val/ILSVRC2012_val_00022347.JPEG n03026506/
+mv val/ILSVRC2012_val_00022348.JPEG n04111531/
+mv val/ILSVRC2012_val_00022349.JPEG n03924679/
+mv val/ILSVRC2012_val_00022350.JPEG n01667778/
+mv val/ILSVRC2012_val_00022351.JPEG n02169497/
+mv val/ILSVRC2012_val_00022352.JPEG n04311004/
+mv val/ILSVRC2012_val_00022353.JPEG n03947888/
+mv val/ILSVRC2012_val_00022354.JPEG n02093754/
+mv val/ILSVRC2012_val_00022355.JPEG n01818515/
+mv val/ILSVRC2012_val_00022356.JPEG n03763968/
+mv val/ILSVRC2012_val_00022357.JPEG n04380533/
+mv val/ILSVRC2012_val_00022358.JPEG n02077923/
+mv val/ILSVRC2012_val_00022359.JPEG n02488702/
+mv val/ILSVRC2012_val_00022360.JPEG n01770393/
+mv val/ILSVRC2012_val_00022361.JPEG n02226429/
+mv val/ILSVRC2012_val_00022362.JPEG n07932039/
+mv val/ILSVRC2012_val_00022363.JPEG n02095314/
+mv val/ILSVRC2012_val_00022364.JPEG n01847000/
+mv val/ILSVRC2012_val_00022365.JPEG n03250847/
+mv val/ILSVRC2012_val_00022366.JPEG n04296562/
+mv val/ILSVRC2012_val_00022367.JPEG n02100236/
+mv val/ILSVRC2012_val_00022368.JPEG n03045698/
+mv val/ILSVRC2012_val_00022369.JPEG n07590611/
+mv val/ILSVRC2012_val_00022370.JPEG n03787032/
+mv val/ILSVRC2012_val_00022371.JPEG n02101006/
+mv val/ILSVRC2012_val_00022372.JPEG n01873310/
+mv val/ILSVRC2012_val_00022373.JPEG n02009912/
+mv val/ILSVRC2012_val_00022374.JPEG n02096051/
+mv val/ILSVRC2012_val_00022375.JPEG n07749582/
+mv val/ILSVRC2012_val_00022376.JPEG n02112018/
+mv val/ILSVRC2012_val_00022377.JPEG n03000134/
+mv val/ILSVRC2012_val_00022378.JPEG n03447721/
+mv val/ILSVRC2012_val_00022379.JPEG n04118776/
+mv val/ILSVRC2012_val_00022380.JPEG n03970156/
+mv val/ILSVRC2012_val_00022381.JPEG n01944390/
+mv val/ILSVRC2012_val_00022382.JPEG n07613480/
+mv val/ILSVRC2012_val_00022383.JPEG n02879718/
+mv val/ILSVRC2012_val_00022384.JPEG n01873310/
+mv val/ILSVRC2012_val_00022385.JPEG n03187595/
+mv val/ILSVRC2012_val_00022386.JPEG n03325584/
+mv val/ILSVRC2012_val_00022387.JPEG n01496331/
+mv val/ILSVRC2012_val_00022388.JPEG n02097298/
+mv val/ILSVRC2012_val_00022389.JPEG n03793489/
+mv val/ILSVRC2012_val_00022390.JPEG n02111500/
+mv val/ILSVRC2012_val_00022391.JPEG n04311174/
+mv val/ILSVRC2012_val_00022392.JPEG n01739381/
+mv val/ILSVRC2012_val_00022393.JPEG n02114548/
+mv val/ILSVRC2012_val_00022394.JPEG n02165105/
+mv val/ILSVRC2012_val_00022395.JPEG n01930112/
+mv val/ILSVRC2012_val_00022396.JPEG n02823428/
+mv val/ILSVRC2012_val_00022397.JPEG n04111531/
+mv val/ILSVRC2012_val_00022398.JPEG n02137549/
+mv val/ILSVRC2012_val_00022399.JPEG n04355338/
+mv val/ILSVRC2012_val_00022400.JPEG n03916031/
+mv val/ILSVRC2012_val_00022401.JPEG n03791053/
+mv val/ILSVRC2012_val_00022402.JPEG n02113186/
+mv val/ILSVRC2012_val_00022403.JPEG n04081281/
+mv val/ILSVRC2012_val_00022404.JPEG n02104029/
+mv val/ILSVRC2012_val_00022405.JPEG n03483316/
+mv val/ILSVRC2012_val_00022406.JPEG n04579145/
+mv val/ILSVRC2012_val_00022407.JPEG n01558993/
+mv val/ILSVRC2012_val_00022408.JPEG n01748264/
+mv val/ILSVRC2012_val_00022409.JPEG n02791270/
+mv val/ILSVRC2012_val_00022410.JPEG n03929660/
+mv val/ILSVRC2012_val_00022411.JPEG n02129604/
+mv val/ILSVRC2012_val_00022412.JPEG n02102040/
+mv val/ILSVRC2012_val_00022413.JPEG n03796401/
+mv val/ILSVRC2012_val_00022414.JPEG n02007558/
+mv val/ILSVRC2012_val_00022415.JPEG n11879895/
+mv val/ILSVRC2012_val_00022416.JPEG n06794110/
+mv val/ILSVRC2012_val_00022417.JPEG n07614500/
+mv val/ILSVRC2012_val_00022418.JPEG n02006656/
+mv val/ILSVRC2012_val_00022419.JPEG n04065272/
+mv val/ILSVRC2012_val_00022420.JPEG n02486261/
+mv val/ILSVRC2012_val_00022421.JPEG n02640242/
+mv val/ILSVRC2012_val_00022422.JPEG n01806143/
+mv val/ILSVRC2012_val_00022423.JPEG n03991062/
+mv val/ILSVRC2012_val_00022424.JPEG n02788148/
+mv val/ILSVRC2012_val_00022425.JPEG n09472597/
+mv val/ILSVRC2012_val_00022426.JPEG n03935335/
+mv val/ILSVRC2012_val_00022427.JPEG n02510455/
+mv val/ILSVRC2012_val_00022428.JPEG n03958227/
+mv val/ILSVRC2012_val_00022429.JPEG n02105641/
+mv val/ILSVRC2012_val_00022430.JPEG n04428191/
+mv val/ILSVRC2012_val_00022431.JPEG n03018349/
+mv val/ILSVRC2012_val_00022432.JPEG n02116738/
+mv val/ILSVRC2012_val_00022433.JPEG n03773504/
+mv val/ILSVRC2012_val_00022434.JPEG n02087046/
+mv val/ILSVRC2012_val_00022435.JPEG n03709823/
+mv val/ILSVRC2012_val_00022436.JPEG n01749939/
+mv val/ILSVRC2012_val_00022437.JPEG n02190166/
+mv val/ILSVRC2012_val_00022438.JPEG n02085782/
+mv val/ILSVRC2012_val_00022439.JPEG n01843065/
+mv val/ILSVRC2012_val_00022440.JPEG n03743016/
+mv val/ILSVRC2012_val_00022441.JPEG n01828970/
+mv val/ILSVRC2012_val_00022442.JPEG n01828970/
+mv val/ILSVRC2012_val_00022443.JPEG n03908714/
+mv val/ILSVRC2012_val_00022444.JPEG n03937543/
+mv val/ILSVRC2012_val_00022445.JPEG n02817516/
+mv val/ILSVRC2012_val_00022446.JPEG n04592741/
+mv val/ILSVRC2012_val_00022447.JPEG n02869837/
+mv val/ILSVRC2012_val_00022448.JPEG n03874293/
+mv val/ILSVRC2012_val_00022449.JPEG n04540053/
+mv val/ILSVRC2012_val_00022450.JPEG n03250847/
+mv val/ILSVRC2012_val_00022451.JPEG n02971356/
+mv val/ILSVRC2012_val_00022452.JPEG n02114548/
+mv val/ILSVRC2012_val_00022453.JPEG n02113023/
+mv val/ILSVRC2012_val_00022454.JPEG n04081281/
+mv val/ILSVRC2012_val_00022455.JPEG n03857828/
+mv val/ILSVRC2012_val_00022456.JPEG n03450230/
+mv val/ILSVRC2012_val_00022457.JPEG n04127249/
+mv val/ILSVRC2012_val_00022458.JPEG n02108089/
+mv val/ILSVRC2012_val_00022459.JPEG n02093428/
+mv val/ILSVRC2012_val_00022460.JPEG n04392985/
+mv val/ILSVRC2012_val_00022461.JPEG n04254120/
+mv val/ILSVRC2012_val_00022462.JPEG n02782093/
+mv val/ILSVRC2012_val_00022463.JPEG n02012849/
+mv val/ILSVRC2012_val_00022464.JPEG n03179701/
+mv val/ILSVRC2012_val_00022465.JPEG n04357314/
+mv val/ILSVRC2012_val_00022466.JPEG n13133613/
+mv val/ILSVRC2012_val_00022467.JPEG n02992211/
+mv val/ILSVRC2012_val_00022468.JPEG n04243546/
+mv val/ILSVRC2012_val_00022469.JPEG n01664065/
+mv val/ILSVRC2012_val_00022470.JPEG n01695060/
+mv val/ILSVRC2012_val_00022471.JPEG n04005630/
+mv val/ILSVRC2012_val_00022472.JPEG n03400231/
+mv val/ILSVRC2012_val_00022473.JPEG n03733131/
+mv val/ILSVRC2012_val_00022474.JPEG n02107142/
+mv val/ILSVRC2012_val_00022475.JPEG n02104365/
+mv val/ILSVRC2012_val_00022476.JPEG n04597913/
+mv val/ILSVRC2012_val_00022477.JPEG n04238763/
+mv val/ILSVRC2012_val_00022478.JPEG n04371430/
+mv val/ILSVRC2012_val_00022479.JPEG n03877472/
+mv val/ILSVRC2012_val_00022480.JPEG n04589890/
+mv val/ILSVRC2012_val_00022481.JPEG n04154565/
+mv val/ILSVRC2012_val_00022482.JPEG n01734418/
+mv val/ILSVRC2012_val_00022483.JPEG n03781244/
+mv val/ILSVRC2012_val_00022484.JPEG n07745940/
+mv val/ILSVRC2012_val_00022485.JPEG n02109961/
+mv val/ILSVRC2012_val_00022486.JPEG n01755581/
+mv val/ILSVRC2012_val_00022487.JPEG n07742313/
+mv val/ILSVRC2012_val_00022488.JPEG n04118776/
+mv val/ILSVRC2012_val_00022489.JPEG n01734418/
+mv val/ILSVRC2012_val_00022490.JPEG n02085782/
+mv val/ILSVRC2012_val_00022491.JPEG n03100240/
+mv val/ILSVRC2012_val_00022492.JPEG n02013706/
+mv val/ILSVRC2012_val_00022493.JPEG n03658185/
+mv val/ILSVRC2012_val_00022494.JPEG n03290653/
+mv val/ILSVRC2012_val_00022495.JPEG n02105505/
+mv val/ILSVRC2012_val_00022496.JPEG n03888257/
+mv val/ILSVRC2012_val_00022497.JPEG n02865351/
+mv val/ILSVRC2012_val_00022498.JPEG n02277742/
+mv val/ILSVRC2012_val_00022499.JPEG n02099849/
+mv val/ILSVRC2012_val_00022500.JPEG n03131574/
+mv val/ILSVRC2012_val_00022501.JPEG n02102177/
+mv val/ILSVRC2012_val_00022502.JPEG n02093428/
+mv val/ILSVRC2012_val_00022503.JPEG n02814860/
+mv val/ILSVRC2012_val_00022504.JPEG n01734418/
+mv val/ILSVRC2012_val_00022505.JPEG n01580077/
+mv val/ILSVRC2012_val_00022506.JPEG n04136333/
+mv val/ILSVRC2012_val_00022507.JPEG n04483307/
+mv val/ILSVRC2012_val_00022508.JPEG n01774384/
+mv val/ILSVRC2012_val_00022509.JPEG n02364673/
+mv val/ILSVRC2012_val_00022510.JPEG n06874185/
+mv val/ILSVRC2012_val_00022511.JPEG n07754684/
+mv val/ILSVRC2012_val_00022512.JPEG n07734744/
+mv val/ILSVRC2012_val_00022513.JPEG n04487081/
+mv val/ILSVRC2012_val_00022514.JPEG n07802026/
+mv val/ILSVRC2012_val_00022515.JPEG n09399592/
+mv val/ILSVRC2012_val_00022516.JPEG n03602883/
+mv val/ILSVRC2012_val_00022517.JPEG n04435653/
+mv val/ILSVRC2012_val_00022518.JPEG n02096437/
+mv val/ILSVRC2012_val_00022519.JPEG n02672831/
+mv val/ILSVRC2012_val_00022520.JPEG n02107683/
+mv val/ILSVRC2012_val_00022521.JPEG n02086646/
+mv val/ILSVRC2012_val_00022522.JPEG n01698640/
+mv val/ILSVRC2012_val_00022523.JPEG n03485794/
+mv val/ILSVRC2012_val_00022524.JPEG n03967562/
+mv val/ILSVRC2012_val_00022525.JPEG n01664065/
+mv val/ILSVRC2012_val_00022526.JPEG n03837869/
+mv val/ILSVRC2012_val_00022527.JPEG n01950731/
+mv val/ILSVRC2012_val_00022528.JPEG n02909870/
+mv val/ILSVRC2012_val_00022529.JPEG n01756291/
+mv val/ILSVRC2012_val_00022530.JPEG n02091467/
+mv val/ILSVRC2012_val_00022531.JPEG n03658185/
+mv val/ILSVRC2012_val_00022532.JPEG n02690373/
+mv val/ILSVRC2012_val_00022533.JPEG n02012849/
+mv val/ILSVRC2012_val_00022534.JPEG n03709823/
+mv val/ILSVRC2012_val_00022535.JPEG n02123597/
+mv val/ILSVRC2012_val_00022536.JPEG n13044778/
+mv val/ILSVRC2012_val_00022537.JPEG n02167151/
+mv val/ILSVRC2012_val_00022538.JPEG n03425413/
+mv val/ILSVRC2012_val_00022539.JPEG n07730033/
+mv val/ILSVRC2012_val_00022540.JPEG n03721384/
+mv val/ILSVRC2012_val_00022541.JPEG n03126707/
+mv val/ILSVRC2012_val_00022542.JPEG n02883205/
+mv val/ILSVRC2012_val_00022543.JPEG n02111889/
+mv val/ILSVRC2012_val_00022544.JPEG n03866082/
+mv val/ILSVRC2012_val_00022545.JPEG n01698640/
+mv val/ILSVRC2012_val_00022546.JPEG n04584207/
+mv val/ILSVRC2012_val_00022547.JPEG n03485407/
+mv val/ILSVRC2012_val_00022548.JPEG n02105251/
+mv val/ILSVRC2012_val_00022549.JPEG n03743016/
+mv val/ILSVRC2012_val_00022550.JPEG n03314780/
+mv val/ILSVRC2012_val_00022551.JPEG n03769881/
+mv val/ILSVRC2012_val_00022552.JPEG n01494475/
+mv val/ILSVRC2012_val_00022553.JPEG n04005630/
+mv val/ILSVRC2012_val_00022554.JPEG n03291819/
+mv val/ILSVRC2012_val_00022555.JPEG n03721384/
+mv val/ILSVRC2012_val_00022556.JPEG n04118776/
+mv val/ILSVRC2012_val_00022557.JPEG n03868242/
+mv val/ILSVRC2012_val_00022558.JPEG n04265275/
+mv val/ILSVRC2012_val_00022559.JPEG n09835506/
+mv val/ILSVRC2012_val_00022560.JPEG n03443371/
+mv val/ILSVRC2012_val_00022561.JPEG n03459775/
+mv val/ILSVRC2012_val_00022562.JPEG n04501370/
+mv val/ILSVRC2012_val_00022563.JPEG n01688243/
+mv val/ILSVRC2012_val_00022564.JPEG n03494278/
+mv val/ILSVRC2012_val_00022565.JPEG n02486410/
+mv val/ILSVRC2012_val_00022566.JPEG n02105251/
+mv val/ILSVRC2012_val_00022567.JPEG n03956157/
+mv val/ILSVRC2012_val_00022568.JPEG n02410509/
+mv val/ILSVRC2012_val_00022569.JPEG n02116738/
+mv val/ILSVRC2012_val_00022570.JPEG n04532106/
+mv val/ILSVRC2012_val_00022571.JPEG n02100236/
+mv val/ILSVRC2012_val_00022572.JPEG n04591157/
+mv val/ILSVRC2012_val_00022573.JPEG n02398521/
+mv val/ILSVRC2012_val_00022574.JPEG n04131690/
+mv val/ILSVRC2012_val_00022575.JPEG n03935335/
+mv val/ILSVRC2012_val_00022576.JPEG n02098105/
+mv val/ILSVRC2012_val_00022577.JPEG n04428191/
+mv val/ILSVRC2012_val_00022578.JPEG n02110627/
+mv val/ILSVRC2012_val_00022579.JPEG n03970156/
+mv val/ILSVRC2012_val_00022580.JPEG n03950228/
+mv val/ILSVRC2012_val_00022581.JPEG n02110341/
+mv val/ILSVRC2012_val_00022582.JPEG n04201297/
+mv val/ILSVRC2012_val_00022583.JPEG n07932039/
+mv val/ILSVRC2012_val_00022584.JPEG n07920052/
+mv val/ILSVRC2012_val_00022585.JPEG n03063689/
+mv val/ILSVRC2012_val_00022586.JPEG n02137549/
+mv val/ILSVRC2012_val_00022587.JPEG n03100240/
+mv val/ILSVRC2012_val_00022588.JPEG n01665541/
+mv val/ILSVRC2012_val_00022589.JPEG n04099969/
+mv val/ILSVRC2012_val_00022590.JPEG n02106382/
+mv val/ILSVRC2012_val_00022591.JPEG n02009912/
+mv val/ILSVRC2012_val_00022592.JPEG n03223299/
+mv val/ILSVRC2012_val_00022593.JPEG n02091635/
+mv val/ILSVRC2012_val_00022594.JPEG n03982430/
+mv val/ILSVRC2012_val_00022595.JPEG n04548362/
+mv val/ILSVRC2012_val_00022596.JPEG n01978455/
+mv val/ILSVRC2012_val_00022597.JPEG n01614925/
+mv val/ILSVRC2012_val_00022598.JPEG n02841315/
+mv val/ILSVRC2012_val_00022599.JPEG n07711569/
+mv val/ILSVRC2012_val_00022600.JPEG n04335435/
+mv val/ILSVRC2012_val_00022601.JPEG n02892767/
+mv val/ILSVRC2012_val_00022602.JPEG n03345487/
+mv val/ILSVRC2012_val_00022603.JPEG n02948072/
+mv val/ILSVRC2012_val_00022604.JPEG n04127249/
+mv val/ILSVRC2012_val_00022605.JPEG n02909870/
+mv val/ILSVRC2012_val_00022606.JPEG n02099712/
+mv val/ILSVRC2012_val_00022607.JPEG n04162706/
+mv val/ILSVRC2012_val_00022608.JPEG n01981276/
+mv val/ILSVRC2012_val_00022609.JPEG n02085620/
+mv val/ILSVRC2012_val_00022610.JPEG n02917067/
+mv val/ILSVRC2012_val_00022611.JPEG n07716358/
+mv val/ILSVRC2012_val_00022612.JPEG n04332243/
+mv val/ILSVRC2012_val_00022613.JPEG n03724870/
+mv val/ILSVRC2012_val_00022614.JPEG n04074963/
+mv val/ILSVRC2012_val_00022615.JPEG n01984695/
+mv val/ILSVRC2012_val_00022616.JPEG n03794056/
+mv val/ILSVRC2012_val_00022617.JPEG n03929855/
+mv val/ILSVRC2012_val_00022618.JPEG n01773157/
+mv val/ILSVRC2012_val_00022619.JPEG n01806567/
+mv val/ILSVRC2012_val_00022620.JPEG n04350905/
+mv val/ILSVRC2012_val_00022621.JPEG n03804744/
+mv val/ILSVRC2012_val_00022622.JPEG n10565667/
+mv val/ILSVRC2012_val_00022623.JPEG n07747607/
+mv val/ILSVRC2012_val_00022624.JPEG n03218198/
+mv val/ILSVRC2012_val_00022625.JPEG n03942813/
+mv val/ILSVRC2012_val_00022626.JPEG n01877812/
+mv val/ILSVRC2012_val_00022627.JPEG n03924679/
+mv val/ILSVRC2012_val_00022628.JPEG n07753592/
+mv val/ILSVRC2012_val_00022629.JPEG n02113799/
+mv val/ILSVRC2012_val_00022630.JPEG n02086079/
+mv val/ILSVRC2012_val_00022631.JPEG n03814639/
+mv val/ILSVRC2012_val_00022632.JPEG n02834397/
+mv val/ILSVRC2012_val_00022633.JPEG n02109525/
+mv val/ILSVRC2012_val_00022634.JPEG n07720875/
+mv val/ILSVRC2012_val_00022635.JPEG n04273569/
+mv val/ILSVRC2012_val_00022636.JPEG n03018349/
+mv val/ILSVRC2012_val_00022637.JPEG n03404251/
+mv val/ILSVRC2012_val_00022638.JPEG n03888257/
+mv val/ILSVRC2012_val_00022639.JPEG n03485407/
+mv val/ILSVRC2012_val_00022640.JPEG n07730033/
+mv val/ILSVRC2012_val_00022641.JPEG n13052670/
+mv val/ILSVRC2012_val_00022642.JPEG n02095889/
+mv val/ILSVRC2012_val_00022643.JPEG n01739381/
+mv val/ILSVRC2012_val_00022644.JPEG n01514859/
+mv val/ILSVRC2012_val_00022645.JPEG n02106030/
+mv val/ILSVRC2012_val_00022646.JPEG n07860988/
+mv val/ILSVRC2012_val_00022647.JPEG n03775546/
+mv val/ILSVRC2012_val_00022648.JPEG n04263257/
+mv val/ILSVRC2012_val_00022649.JPEG n03485794/
+mv val/ILSVRC2012_val_00022650.JPEG n03924679/
+mv val/ILSVRC2012_val_00022651.JPEG n04228054/
+mv val/ILSVRC2012_val_00022652.JPEG n02319095/
+mv val/ILSVRC2012_val_00022653.JPEG n02747177/
+mv val/ILSVRC2012_val_00022654.JPEG n03770679/
+mv val/ILSVRC2012_val_00022655.JPEG n03980874/
+mv val/ILSVRC2012_val_00022656.JPEG n02097658/
+mv val/ILSVRC2012_val_00022657.JPEG n02988304/
+mv val/ILSVRC2012_val_00022658.JPEG n07579787/
+mv val/ILSVRC2012_val_00022659.JPEG n02137549/
+mv val/ILSVRC2012_val_00022660.JPEG n01644373/
+mv val/ILSVRC2012_val_00022661.JPEG n02870880/
+mv val/ILSVRC2012_val_00022662.JPEG n04069434/
+mv val/ILSVRC2012_val_00022663.JPEG n13040303/
+mv val/ILSVRC2012_val_00022664.JPEG n02106550/
+mv val/ILSVRC2012_val_00022665.JPEG n02804414/
+mv val/ILSVRC2012_val_00022666.JPEG n07565083/
+mv val/ILSVRC2012_val_00022667.JPEG n03877845/
+mv val/ILSVRC2012_val_00022668.JPEG n03187595/
+mv val/ILSVRC2012_val_00022669.JPEG n02074367/
+mv val/ILSVRC2012_val_00022670.JPEG n02099712/
+mv val/ILSVRC2012_val_00022671.JPEG n01950731/
+mv val/ILSVRC2012_val_00022672.JPEG n03884397/
+mv val/ILSVRC2012_val_00022673.JPEG n03776460/
+mv val/ILSVRC2012_val_00022674.JPEG n04209133/
+mv val/ILSVRC2012_val_00022675.JPEG n03697007/
+mv val/ILSVRC2012_val_00022676.JPEG n01978287/
+mv val/ILSVRC2012_val_00022677.JPEG n03792972/
+mv val/ILSVRC2012_val_00022678.JPEG n07716906/
+mv val/ILSVRC2012_val_00022679.JPEG n04146614/
+mv val/ILSVRC2012_val_00022680.JPEG n03887697/
+mv val/ILSVRC2012_val_00022681.JPEG n02095889/
+mv val/ILSVRC2012_val_00022682.JPEG n02096177/
+mv val/ILSVRC2012_val_00022683.JPEG n04435653/
+mv val/ILSVRC2012_val_00022684.JPEG n02091032/
+mv val/ILSVRC2012_val_00022685.JPEG n02840245/
+mv val/ILSVRC2012_val_00022686.JPEG n02097658/
+mv val/ILSVRC2012_val_00022687.JPEG n02002724/
+mv val/ILSVRC2012_val_00022688.JPEG n02058221/
+mv val/ILSVRC2012_val_00022689.JPEG n03127747/
+mv val/ILSVRC2012_val_00022690.JPEG n04501370/
+mv val/ILSVRC2012_val_00022691.JPEG n01817953/
+mv val/ILSVRC2012_val_00022692.JPEG n02113186/
+mv val/ILSVRC2012_val_00022693.JPEG n01877812/
+mv val/ILSVRC2012_val_00022694.JPEG n04004767/
+mv val/ILSVRC2012_val_00022695.JPEG n02441942/
+mv val/ILSVRC2012_val_00022696.JPEG n02408429/
+mv val/ILSVRC2012_val_00022697.JPEG n04116512/
+mv val/ILSVRC2012_val_00022698.JPEG n02134418/
+mv val/ILSVRC2012_val_00022699.JPEG n03529860/
+mv val/ILSVRC2012_val_00022700.JPEG n03041632/
+mv val/ILSVRC2012_val_00022701.JPEG n03447447/
+mv val/ILSVRC2012_val_00022702.JPEG n03188531/
+mv val/ILSVRC2012_val_00022703.JPEG n03770439/
+mv val/ILSVRC2012_val_00022704.JPEG n03633091/
+mv val/ILSVRC2012_val_00022705.JPEG n02086646/
+mv val/ILSVRC2012_val_00022706.JPEG n02011460/
+mv val/ILSVRC2012_val_00022707.JPEG n04209133/
+mv val/ILSVRC2012_val_00022708.JPEG n04229816/
+mv val/ILSVRC2012_val_00022709.JPEG n01622779/
+mv val/ILSVRC2012_val_00022710.JPEG n01667114/
+mv val/ILSVRC2012_val_00022711.JPEG n01685808/
+mv val/ILSVRC2012_val_00022712.JPEG n02113186/
+mv val/ILSVRC2012_val_00022713.JPEG n02097047/
+mv val/ILSVRC2012_val_00022714.JPEG n03876231/
+mv val/ILSVRC2012_val_00022715.JPEG n02699494/
+mv val/ILSVRC2012_val_00022716.JPEG n03961711/
+mv val/ILSVRC2012_val_00022717.JPEG n03530642/
+mv val/ILSVRC2012_val_00022718.JPEG n03452741/
+mv val/ILSVRC2012_val_00022719.JPEG n02708093/
+mv val/ILSVRC2012_val_00022720.JPEG n01985128/
+mv val/ILSVRC2012_val_00022721.JPEG n02894605/
+mv val/ILSVRC2012_val_00022722.JPEG n03124170/
+mv val/ILSVRC2012_val_00022723.JPEG n03633091/
+mv val/ILSVRC2012_val_00022724.JPEG n13054560/
+mv val/ILSVRC2012_val_00022725.JPEG n02112137/
+mv val/ILSVRC2012_val_00022726.JPEG n02120505/
+mv val/ILSVRC2012_val_00022727.JPEG n01532829/
+mv val/ILSVRC2012_val_00022728.JPEG n03929660/
+mv val/ILSVRC2012_val_00022729.JPEG n04589890/
+mv val/ILSVRC2012_val_00022730.JPEG n04507155/
+mv val/ILSVRC2012_val_00022731.JPEG n01685808/
+mv val/ILSVRC2012_val_00022732.JPEG n02077923/
+mv val/ILSVRC2012_val_00022733.JPEG n04523525/
+mv val/ILSVRC2012_val_00022734.JPEG n04592741/
+mv val/ILSVRC2012_val_00022735.JPEG n02056570/
+mv val/ILSVRC2012_val_00022736.JPEG n03841143/
+mv val/ILSVRC2012_val_00022737.JPEG n02226429/
+mv val/ILSVRC2012_val_00022738.JPEG n04243546/
+mv val/ILSVRC2012_val_00022739.JPEG n04285008/
+mv val/ILSVRC2012_val_00022740.JPEG n02483708/
+mv val/ILSVRC2012_val_00022741.JPEG n03944341/
+mv val/ILSVRC2012_val_00022742.JPEG n04553703/
+mv val/ILSVRC2012_val_00022743.JPEG n03977966/
+mv val/ILSVRC2012_val_00022744.JPEG n02441942/
+mv val/ILSVRC2012_val_00022745.JPEG n01818515/
+mv val/ILSVRC2012_val_00022746.JPEG n03871628/
+mv val/ILSVRC2012_val_00022747.JPEG n03692522/
+mv val/ILSVRC2012_val_00022748.JPEG n07768694/
+mv val/ILSVRC2012_val_00022749.JPEG n02607072/
+mv val/ILSVRC2012_val_00022750.JPEG n04456115/
+mv val/ILSVRC2012_val_00022751.JPEG n04590129/
+mv val/ILSVRC2012_val_00022752.JPEG n03476991/
+mv val/ILSVRC2012_val_00022753.JPEG n02091134/
+mv val/ILSVRC2012_val_00022754.JPEG n03394916/
+mv val/ILSVRC2012_val_00022755.JPEG n01990800/
+mv val/ILSVRC2012_val_00022756.JPEG n02066245/
+mv val/ILSVRC2012_val_00022757.JPEG n02279972/
+mv val/ILSVRC2012_val_00022758.JPEG n01944390/
+mv val/ILSVRC2012_val_00022759.JPEG n02105251/
+mv val/ILSVRC2012_val_00022760.JPEG n04273569/
+mv val/ILSVRC2012_val_00022761.JPEG n03857828/
+mv val/ILSVRC2012_val_00022762.JPEG n02110185/
+mv val/ILSVRC2012_val_00022763.JPEG n02096051/
+mv val/ILSVRC2012_val_00022764.JPEG n01770081/
+mv val/ILSVRC2012_val_00022765.JPEG n02259212/
+mv val/ILSVRC2012_val_00022766.JPEG n02799071/
+mv val/ILSVRC2012_val_00022767.JPEG n01806143/
+mv val/ILSVRC2012_val_00022768.JPEG n03476684/
+mv val/ILSVRC2012_val_00022769.JPEG n01796340/
+mv val/ILSVRC2012_val_00022770.JPEG n03100240/
+mv val/ILSVRC2012_val_00022771.JPEG n01632777/
+mv val/ILSVRC2012_val_00022772.JPEG n02190166/
+mv val/ILSVRC2012_val_00022773.JPEG n02066245/
+mv val/ILSVRC2012_val_00022774.JPEG n03976657/
+mv val/ILSVRC2012_val_00022775.JPEG n03788365/
+mv val/ILSVRC2012_val_00022776.JPEG n02108422/
+mv val/ILSVRC2012_val_00022777.JPEG n03400231/
+mv val/ILSVRC2012_val_00022778.JPEG n04589890/
+mv val/ILSVRC2012_val_00022779.JPEG n04435653/
+mv val/ILSVRC2012_val_00022780.JPEG n02326432/
+mv val/ILSVRC2012_val_00022781.JPEG n03954731/
+mv val/ILSVRC2012_val_00022782.JPEG n04591157/
+mv val/ILSVRC2012_val_00022783.JPEG n02823428/
+mv val/ILSVRC2012_val_00022784.JPEG n07716358/
+mv val/ILSVRC2012_val_00022785.JPEG n02088632/
+mv val/ILSVRC2012_val_00022786.JPEG n01824575/
+mv val/ILSVRC2012_val_00022787.JPEG n01631663/
+mv val/ILSVRC2012_val_00022788.JPEG n02086079/
+mv val/ILSVRC2012_val_00022789.JPEG n03995372/
+mv val/ILSVRC2012_val_00022790.JPEG n04517823/
+mv val/ILSVRC2012_val_00022791.JPEG n02480855/
+mv val/ILSVRC2012_val_00022792.JPEG n03445777/
+mv val/ILSVRC2012_val_00022793.JPEG n04357314/
+mv val/ILSVRC2012_val_00022794.JPEG n03884397/
+mv val/ILSVRC2012_val_00022795.JPEG n03445924/
+mv val/ILSVRC2012_val_00022796.JPEG n03777754/
+mv val/ILSVRC2012_val_00022797.JPEG n03133878/
+mv val/ILSVRC2012_val_00022798.JPEG n03873416/
+mv val/ILSVRC2012_val_00022799.JPEG n02086240/
+mv val/ILSVRC2012_val_00022800.JPEG n04553703/
+mv val/ILSVRC2012_val_00022801.JPEG n04133789/
+mv val/ILSVRC2012_val_00022802.JPEG n07693725/
+mv val/ILSVRC2012_val_00022803.JPEG n02895154/
+mv val/ILSVRC2012_val_00022804.JPEG n02317335/
+mv val/ILSVRC2012_val_00022805.JPEG n04613696/
+mv val/ILSVRC2012_val_00022806.JPEG n01819313/
+mv val/ILSVRC2012_val_00022807.JPEG n03977966/
+mv val/ILSVRC2012_val_00022808.JPEG n02109047/
+mv val/ILSVRC2012_val_00022809.JPEG n03000247/
+mv val/ILSVRC2012_val_00022810.JPEG n02443114/
+mv val/ILSVRC2012_val_00022811.JPEG n03272010/
+mv val/ILSVRC2012_val_00022812.JPEG n01697457/
+mv val/ILSVRC2012_val_00022813.JPEG n04200800/
+mv val/ILSVRC2012_val_00022814.JPEG n02109047/
+mv val/ILSVRC2012_val_00022815.JPEG n02840245/
+mv val/ILSVRC2012_val_00022816.JPEG n01739381/
+mv val/ILSVRC2012_val_00022817.JPEG n06794110/
+mv val/ILSVRC2012_val_00022818.JPEG n01756291/
+mv val/ILSVRC2012_val_00022819.JPEG n01748264/
+mv val/ILSVRC2012_val_00022820.JPEG n03950228/
+mv val/ILSVRC2012_val_00022821.JPEG n02971356/
+mv val/ILSVRC2012_val_00022822.JPEG n02123159/
+mv val/ILSVRC2012_val_00022823.JPEG n04346328/
+mv val/ILSVRC2012_val_00022824.JPEG n02092339/
+mv val/ILSVRC2012_val_00022825.JPEG n01729977/
+mv val/ILSVRC2012_val_00022826.JPEG n03187595/
+mv val/ILSVRC2012_val_00022827.JPEG n02454379/
+mv val/ILSVRC2012_val_00022828.JPEG n03794056/
+mv val/ILSVRC2012_val_00022829.JPEG n03967562/
+mv val/ILSVRC2012_val_00022830.JPEG n04039381/
+mv val/ILSVRC2012_val_00022831.JPEG n02879718/
+mv val/ILSVRC2012_val_00022832.JPEG n02441942/
+mv val/ILSVRC2012_val_00022833.JPEG n04515003/
+mv val/ILSVRC2012_val_00022834.JPEG n04311174/
+mv val/ILSVRC2012_val_00022835.JPEG n03100240/
+mv val/ILSVRC2012_val_00022836.JPEG n03868242/
+mv val/ILSVRC2012_val_00022837.JPEG n03126707/
+mv val/ILSVRC2012_val_00022838.JPEG n04461696/
+mv val/ILSVRC2012_val_00022839.JPEG n13054560/
+mv val/ILSVRC2012_val_00022840.JPEG n04398044/
+mv val/ILSVRC2012_val_00022841.JPEG n01667114/
+mv val/ILSVRC2012_val_00022842.JPEG n01664065/
+mv val/ILSVRC2012_val_00022843.JPEG n02106382/
+mv val/ILSVRC2012_val_00022844.JPEG n04613696/
+mv val/ILSVRC2012_val_00022845.JPEG n02948072/
+mv val/ILSVRC2012_val_00022846.JPEG n12144580/
+mv val/ILSVRC2012_val_00022847.JPEG n03877472/
+mv val/ILSVRC2012_val_00022848.JPEG n02096585/
+mv val/ILSVRC2012_val_00022849.JPEG n03935335/
+mv val/ILSVRC2012_val_00022850.JPEG n04429376/
+mv val/ILSVRC2012_val_00022851.JPEG n02110185/
+mv val/ILSVRC2012_val_00022852.JPEG n03207941/
+mv val/ILSVRC2012_val_00022853.JPEG n02123045/
+mv val/ILSVRC2012_val_00022854.JPEG n03788195/
+mv val/ILSVRC2012_val_00022855.JPEG n04259630/
+mv val/ILSVRC2012_val_00022856.JPEG n02097209/
+mv val/ILSVRC2012_val_00022857.JPEG n02092002/
+mv val/ILSVRC2012_val_00022858.JPEG n01877812/
+mv val/ILSVRC2012_val_00022859.JPEG n03529860/
+mv val/ILSVRC2012_val_00022860.JPEG n02966687/
+mv val/ILSVRC2012_val_00022861.JPEG n03980874/
+mv val/ILSVRC2012_val_00022862.JPEG n02013706/
+mv val/ILSVRC2012_val_00022863.JPEG n02776631/
+mv val/ILSVRC2012_val_00022864.JPEG n02445715/
+mv val/ILSVRC2012_val_00022865.JPEG n01496331/
+mv val/ILSVRC2012_val_00022866.JPEG n01807496/
+mv val/ILSVRC2012_val_00022867.JPEG n02112137/
+mv val/ILSVRC2012_val_00022868.JPEG n02086646/
+mv val/ILSVRC2012_val_00022869.JPEG n04118776/
+mv val/ILSVRC2012_val_00022870.JPEG n03658185/
+mv val/ILSVRC2012_val_00022871.JPEG n01985128/
+mv val/ILSVRC2012_val_00022872.JPEG n02504013/
+mv val/ILSVRC2012_val_00022873.JPEG n12998815/
+mv val/ILSVRC2012_val_00022874.JPEG n02233338/
+mv val/ILSVRC2012_val_00022875.JPEG n12057211/
+mv val/ILSVRC2012_val_00022876.JPEG n07875152/
+mv val/ILSVRC2012_val_00022877.JPEG n03840681/
+mv val/ILSVRC2012_val_00022878.JPEG n03721384/
+mv val/ILSVRC2012_val_00022879.JPEG n03908714/
+mv val/ILSVRC2012_val_00022880.JPEG n02412080/
+mv val/ILSVRC2012_val_00022881.JPEG n02113799/
+mv val/ILSVRC2012_val_00022882.JPEG n02096437/
+mv val/ILSVRC2012_val_00022883.JPEG n02669723/
+mv val/ILSVRC2012_val_00022884.JPEG n03775546/
+mv val/ILSVRC2012_val_00022885.JPEG n03393912/
+mv val/ILSVRC2012_val_00022886.JPEG n07718472/
+mv val/ILSVRC2012_val_00022887.JPEG n01883070/
+mv val/ILSVRC2012_val_00022888.JPEG n02120079/
+mv val/ILSVRC2012_val_00022889.JPEG n01532829/
+mv val/ILSVRC2012_val_00022890.JPEG n04443257/
+mv val/ILSVRC2012_val_00022891.JPEG n02917067/
+mv val/ILSVRC2012_val_00022892.JPEG n02877765/
+mv val/ILSVRC2012_val_00022893.JPEG n02115913/
+mv val/ILSVRC2012_val_00022894.JPEG n07920052/
+mv val/ILSVRC2012_val_00022895.JPEG n01773797/
+mv val/ILSVRC2012_val_00022896.JPEG n02123159/
+mv val/ILSVRC2012_val_00022897.JPEG n03447447/
+mv val/ILSVRC2012_val_00022898.JPEG n04613696/
+mv val/ILSVRC2012_val_00022899.JPEG n03933933/
+mv val/ILSVRC2012_val_00022900.JPEG n04380533/
+mv val/ILSVRC2012_val_00022901.JPEG n01728572/
+mv val/ILSVRC2012_val_00022902.JPEG n03535780/
+mv val/ILSVRC2012_val_00022903.JPEG n04599235/
+mv val/ILSVRC2012_val_00022904.JPEG n02877765/
+mv val/ILSVRC2012_val_00022905.JPEG n13037406/
+mv val/ILSVRC2012_val_00022906.JPEG n02971356/
+mv val/ILSVRC2012_val_00022907.JPEG n02504458/
+mv val/ILSVRC2012_val_00022908.JPEG n02101388/
+mv val/ILSVRC2012_val_00022909.JPEG n04370456/
+mv val/ILSVRC2012_val_00022910.JPEG n09229709/
+mv val/ILSVRC2012_val_00022911.JPEG n02113624/
+mv val/ILSVRC2012_val_00022912.JPEG n02492035/
+mv val/ILSVRC2012_val_00022913.JPEG n02089867/
+mv val/ILSVRC2012_val_00022914.JPEG n09421951/
+mv val/ILSVRC2012_val_00022915.JPEG n02219486/
+mv val/ILSVRC2012_val_00022916.JPEG n02494079/
+mv val/ILSVRC2012_val_00022917.JPEG n02963159/
+mv val/ILSVRC2012_val_00022918.JPEG n03930630/
+mv val/ILSVRC2012_val_00022919.JPEG n02206856/
+mv val/ILSVRC2012_val_00022920.JPEG n02091831/
+mv val/ILSVRC2012_val_00022921.JPEG n02504013/
+mv val/ILSVRC2012_val_00022922.JPEG n02097298/
+mv val/ILSVRC2012_val_00022923.JPEG n09428293/
+mv val/ILSVRC2012_val_00022924.JPEG n04596742/
+mv val/ILSVRC2012_val_00022925.JPEG n01632777/
+mv val/ILSVRC2012_val_00022926.JPEG n02018207/
+mv val/ILSVRC2012_val_00022927.JPEG n03344393/
+mv val/ILSVRC2012_val_00022928.JPEG n03388549/
+mv val/ILSVRC2012_val_00022929.JPEG n03791053/
+mv val/ILSVRC2012_val_00022930.JPEG n01729322/
+mv val/ILSVRC2012_val_00022931.JPEG n02018207/
+mv val/ILSVRC2012_val_00022932.JPEG n03599486/
+mv val/ILSVRC2012_val_00022933.JPEG n03297495/
+mv val/ILSVRC2012_val_00022934.JPEG n02093859/
+mv val/ILSVRC2012_val_00022935.JPEG n01629819/
+mv val/ILSVRC2012_val_00022936.JPEG n04037443/
+mv val/ILSVRC2012_val_00022937.JPEG n01693334/
+mv val/ILSVRC2012_val_00022938.JPEG n02058221/
+mv val/ILSVRC2012_val_00022939.JPEG n03141823/
+mv val/ILSVRC2012_val_00022940.JPEG n04252225/
+mv val/ILSVRC2012_val_00022941.JPEG n04418357/
+mv val/ILSVRC2012_val_00022942.JPEG n01774384/
+mv val/ILSVRC2012_val_00022943.JPEG n03871628/
+mv val/ILSVRC2012_val_00022944.JPEG n03598930/
+mv val/ILSVRC2012_val_00022945.JPEG n03032252/
+mv val/ILSVRC2012_val_00022946.JPEG n02321529/
+mv val/ILSVRC2012_val_00022947.JPEG n02117135/
+mv val/ILSVRC2012_val_00022948.JPEG n02206856/
+mv val/ILSVRC2012_val_00022949.JPEG n03944341/
+mv val/ILSVRC2012_val_00022950.JPEG n02111129/
+mv val/ILSVRC2012_val_00022951.JPEG n02346627/
+mv val/ILSVRC2012_val_00022952.JPEG n03404251/
+mv val/ILSVRC2012_val_00022953.JPEG n02113023/
+mv val/ILSVRC2012_val_00022954.JPEG n02009229/
+mv val/ILSVRC2012_val_00022955.JPEG n02879718/
+mv val/ILSVRC2012_val_00022956.JPEG n01748264/
+mv val/ILSVRC2012_val_00022957.JPEG n01773549/
+mv val/ILSVRC2012_val_00022958.JPEG n04252077/
+mv val/ILSVRC2012_val_00022959.JPEG n02825657/
+mv val/ILSVRC2012_val_00022960.JPEG n03476991/
+mv val/ILSVRC2012_val_00022961.JPEG n03584254/
+mv val/ILSVRC2012_val_00022962.JPEG n04350905/
+mv val/ILSVRC2012_val_00022963.JPEG n13052670/
+mv val/ILSVRC2012_val_00022964.JPEG n04141076/
+mv val/ILSVRC2012_val_00022965.JPEG n03388549/
+mv val/ILSVRC2012_val_00022966.JPEG n02415577/
+mv val/ILSVRC2012_val_00022967.JPEG n02607072/
+mv val/ILSVRC2012_val_00022968.JPEG n04346328/
+mv val/ILSVRC2012_val_00022969.JPEG n01914609/
+mv val/ILSVRC2012_val_00022970.JPEG n02641379/
+mv val/ILSVRC2012_val_00022971.JPEG n03782006/
+mv val/ILSVRC2012_val_00022972.JPEG n01601694/
+mv val/ILSVRC2012_val_00022973.JPEG n03388183/
+mv val/ILSVRC2012_val_00022974.JPEG n03803284/
+mv val/ILSVRC2012_val_00022975.JPEG n02690373/
+mv val/ILSVRC2012_val_00022976.JPEG n02106662/
+mv val/ILSVRC2012_val_00022977.JPEG n02097047/
+mv val/ILSVRC2012_val_00022978.JPEG n07892512/
+mv val/ILSVRC2012_val_00022979.JPEG n02277742/
+mv val/ILSVRC2012_val_00022980.JPEG n10148035/
+mv val/ILSVRC2012_val_00022981.JPEG n02412080/
+mv val/ILSVRC2012_val_00022982.JPEG n02091635/
+mv val/ILSVRC2012_val_00022983.JPEG n01917289/
+mv val/ILSVRC2012_val_00022984.JPEG n03742115/
+mv val/ILSVRC2012_val_00022985.JPEG n04074963/
+mv val/ILSVRC2012_val_00022986.JPEG n03124043/
+mv val/ILSVRC2012_val_00022987.JPEG n02669723/
+mv val/ILSVRC2012_val_00022988.JPEG n04507155/
+mv val/ILSVRC2012_val_00022989.JPEG n02808304/
+mv val/ILSVRC2012_val_00022990.JPEG n02111500/
+mv val/ILSVRC2012_val_00022991.JPEG n03761084/
+mv val/ILSVRC2012_val_00022992.JPEG n01797886/
+mv val/ILSVRC2012_val_00022993.JPEG n03874599/
+mv val/ILSVRC2012_val_00022994.JPEG n03476991/
+mv val/ILSVRC2012_val_00022995.JPEG n04404412/
+mv val/ILSVRC2012_val_00022996.JPEG n02108915/
+mv val/ILSVRC2012_val_00022997.JPEG n01694178/
+mv val/ILSVRC2012_val_00022998.JPEG n02802426/
+mv val/ILSVRC2012_val_00022999.JPEG n02974003/
+mv val/ILSVRC2012_val_00023000.JPEG n03028079/
+mv val/ILSVRC2012_val_00023001.JPEG n03944341/
+mv val/ILSVRC2012_val_00023002.JPEG n03742115/
+mv val/ILSVRC2012_val_00023003.JPEG n02111500/
+mv val/ILSVRC2012_val_00023004.JPEG n02117135/
+mv val/ILSVRC2012_val_00023005.JPEG n02092339/
+mv val/ILSVRC2012_val_00023006.JPEG n04133789/
+mv val/ILSVRC2012_val_00023007.JPEG n03868242/
+mv val/ILSVRC2012_val_00023008.JPEG n07714990/
+mv val/ILSVRC2012_val_00023009.JPEG n07579787/
+mv val/ILSVRC2012_val_00023010.JPEG n04252077/
+mv val/ILSVRC2012_val_00023011.JPEG n02096051/
+mv val/ILSVRC2012_val_00023012.JPEG n02102480/
+mv val/ILSVRC2012_val_00023013.JPEG n02174001/
+mv val/ILSVRC2012_val_00023014.JPEG n03085013/
+mv val/ILSVRC2012_val_00023015.JPEG n01740131/
+mv val/ILSVRC2012_val_00023016.JPEG n02107312/
+mv val/ILSVRC2012_val_00023017.JPEG n04162706/
+mv val/ILSVRC2012_val_00023018.JPEG n02869837/
+mv val/ILSVRC2012_val_00023019.JPEG n02412080/
+mv val/ILSVRC2012_val_00023020.JPEG n04612504/
+mv val/ILSVRC2012_val_00023021.JPEG n01807496/
+mv val/ILSVRC2012_val_00023022.JPEG n04041544/
+mv val/ILSVRC2012_val_00023023.JPEG n03459775/
+mv val/ILSVRC2012_val_00023024.JPEG n02017213/
+mv val/ILSVRC2012_val_00023025.JPEG n02101006/
+mv val/ILSVRC2012_val_00023026.JPEG n07749582/
+mv val/ILSVRC2012_val_00023027.JPEG n02109047/
+mv val/ILSVRC2012_val_00023028.JPEG n07718472/
+mv val/ILSVRC2012_val_00023029.JPEG n02877765/
+mv val/ILSVRC2012_val_00023030.JPEG n01622779/
+mv val/ILSVRC2012_val_00023031.JPEG n01882714/
+mv val/ILSVRC2012_val_00023032.JPEG n03781244/
+mv val/ILSVRC2012_val_00023033.JPEG n02137549/
+mv val/ILSVRC2012_val_00023034.JPEG n02342885/
+mv val/ILSVRC2012_val_00023035.JPEG n03498962/
+mv val/ILSVRC2012_val_00023036.JPEG n04127249/
+mv val/ILSVRC2012_val_00023037.JPEG n06785654/
+mv val/ILSVRC2012_val_00023038.JPEG n02105412/
+mv val/ILSVRC2012_val_00023039.JPEG n03447447/
+mv val/ILSVRC2012_val_00023040.JPEG n09193705/
+mv val/ILSVRC2012_val_00023041.JPEG n02326432/
+mv val/ILSVRC2012_val_00023042.JPEG n04590129/
+mv val/ILSVRC2012_val_00023043.JPEG n02892201/
+mv val/ILSVRC2012_val_00023044.JPEG n03425413/
+mv val/ILSVRC2012_val_00023045.JPEG n04235860/
+mv val/ILSVRC2012_val_00023046.JPEG n03000247/
+mv val/ILSVRC2012_val_00023047.JPEG n03272562/
+mv val/ILSVRC2012_val_00023048.JPEG n03598930/
+mv val/ILSVRC2012_val_00023049.JPEG n02174001/
+mv val/ILSVRC2012_val_00023050.JPEG n03347037/
+mv val/ILSVRC2012_val_00023051.JPEG n07920052/
+mv val/ILSVRC2012_val_00023052.JPEG n01784675/
+mv val/ILSVRC2012_val_00023053.JPEG n07718747/
+mv val/ILSVRC2012_val_00023054.JPEG n02279972/
+mv val/ILSVRC2012_val_00023055.JPEG n02097298/
+mv val/ILSVRC2012_val_00023056.JPEG n03394916/
+mv val/ILSVRC2012_val_00023057.JPEG n03977966/
+mv val/ILSVRC2012_val_00023058.JPEG n03692522/
+mv val/ILSVRC2012_val_00023059.JPEG n03825788/
+mv val/ILSVRC2012_val_00023060.JPEG n07717556/
+mv val/ILSVRC2012_val_00023061.JPEG n02727426/
+mv val/ILSVRC2012_val_00023062.JPEG n02396427/
+mv val/ILSVRC2012_val_00023063.JPEG n07747607/
+mv val/ILSVRC2012_val_00023064.JPEG n04330267/
+mv val/ILSVRC2012_val_00023065.JPEG n03062245/
+mv val/ILSVRC2012_val_00023066.JPEG n02389026/
+mv val/ILSVRC2012_val_00023067.JPEG n02871525/
+mv val/ILSVRC2012_val_00023068.JPEG n02107142/
+mv val/ILSVRC2012_val_00023069.JPEG n02012849/
+mv val/ILSVRC2012_val_00023070.JPEG n02077923/
+mv val/ILSVRC2012_val_00023071.JPEG n03532672/
+mv val/ILSVRC2012_val_00023072.JPEG n03216828/
+mv val/ILSVRC2012_val_00023073.JPEG n02486261/
+mv val/ILSVRC2012_val_00023074.JPEG n01494475/
+mv val/ILSVRC2012_val_00023075.JPEG n04251144/
+mv val/ILSVRC2012_val_00023076.JPEG n02109047/
+mv val/ILSVRC2012_val_00023077.JPEG n03649909/
+mv val/ILSVRC2012_val_00023078.JPEG n01873310/
+mv val/ILSVRC2012_val_00023079.JPEG n03710637/
+mv val/ILSVRC2012_val_00023080.JPEG n01632458/
+mv val/ILSVRC2012_val_00023081.JPEG n02077923/
+mv val/ILSVRC2012_val_00023082.JPEG n04263257/
+mv val/ILSVRC2012_val_00023083.JPEG n04423845/
+mv val/ILSVRC2012_val_00023084.JPEG n02279972/
+mv val/ILSVRC2012_val_00023085.JPEG n01728572/
+mv val/ILSVRC2012_val_00023086.JPEG n02128757/
+mv val/ILSVRC2012_val_00023087.JPEG n04552348/
+mv val/ILSVRC2012_val_00023088.JPEG n07747607/
+mv val/ILSVRC2012_val_00023089.JPEG n07932039/
+mv val/ILSVRC2012_val_00023090.JPEG n02071294/
+mv val/ILSVRC2012_val_00023091.JPEG n02951585/
+mv val/ILSVRC2012_val_00023092.JPEG n02123159/
+mv val/ILSVRC2012_val_00023093.JPEG n04201297/
+mv val/ILSVRC2012_val_00023094.JPEG n03680355/
+mv val/ILSVRC2012_val_00023095.JPEG n02892767/
+mv val/ILSVRC2012_val_00023096.JPEG n03930630/
+mv val/ILSVRC2012_val_00023097.JPEG n01798484/
+mv val/ILSVRC2012_val_00023098.JPEG n01729977/
+mv val/ILSVRC2012_val_00023099.JPEG n01798484/
+mv val/ILSVRC2012_val_00023100.JPEG n04371430/
+mv val/ILSVRC2012_val_00023101.JPEG n02090379/
+mv val/ILSVRC2012_val_00023102.JPEG n03347037/
+mv val/ILSVRC2012_val_00023103.JPEG n03998194/
+mv val/ILSVRC2012_val_00023104.JPEG n03947888/
+mv val/ILSVRC2012_val_00023105.JPEG n02108422/
+mv val/ILSVRC2012_val_00023106.JPEG n02837789/
+mv val/ILSVRC2012_val_00023107.JPEG n03888257/
+mv val/ILSVRC2012_val_00023108.JPEG n01739381/
+mv val/ILSVRC2012_val_00023109.JPEG n04179913/
+mv val/ILSVRC2012_val_00023110.JPEG n07590611/
+mv val/ILSVRC2012_val_00023111.JPEG n02279972/
+mv val/ILSVRC2012_val_00023112.JPEG n03063599/
+mv val/ILSVRC2012_val_00023113.JPEG n02113712/
+mv val/ILSVRC2012_val_00023114.JPEG n02444819/
+mv val/ILSVRC2012_val_00023115.JPEG n03532672/
+mv val/ILSVRC2012_val_00023116.JPEG n02687172/
+mv val/ILSVRC2012_val_00023117.JPEG n07720875/
+mv val/ILSVRC2012_val_00023118.JPEG n01819313/
+mv val/ILSVRC2012_val_00023119.JPEG n02445715/
+mv val/ILSVRC2012_val_00023120.JPEG n03793489/
+mv val/ILSVRC2012_val_00023121.JPEG n02092002/
+mv val/ILSVRC2012_val_00023122.JPEG n03899768/
+mv val/ILSVRC2012_val_00023123.JPEG n03424325/
+mv val/ILSVRC2012_val_00023124.JPEG n02978881/
+mv val/ILSVRC2012_val_00023125.JPEG n01534433/
+mv val/ILSVRC2012_val_00023126.JPEG n02999410/
+mv val/ILSVRC2012_val_00023127.JPEG n04557648/
+mv val/ILSVRC2012_val_00023128.JPEG n01608432/
+mv val/ILSVRC2012_val_00023129.JPEG n02391049/
+mv val/ILSVRC2012_val_00023130.JPEG n03929660/
+mv val/ILSVRC2012_val_00023131.JPEG n02835271/
+mv val/ILSVRC2012_val_00023132.JPEG n03876231/
+mv val/ILSVRC2012_val_00023133.JPEG n02102318/
+mv val/ILSVRC2012_val_00023134.JPEG n02777292/
+mv val/ILSVRC2012_val_00023135.JPEG n04004767/
+mv val/ILSVRC2012_val_00023136.JPEG n03933933/
+mv val/ILSVRC2012_val_00023137.JPEG n07836838/
+mv val/ILSVRC2012_val_00023138.JPEG n01751748/
+mv val/ILSVRC2012_val_00023139.JPEG n07718472/
+mv val/ILSVRC2012_val_00023140.JPEG n04254777/
+mv val/ILSVRC2012_val_00023141.JPEG n03424325/
+mv val/ILSVRC2012_val_00023142.JPEG n03063599/
+mv val/ILSVRC2012_val_00023143.JPEG n02095570/
+mv val/ILSVRC2012_val_00023144.JPEG n01824575/
+mv val/ILSVRC2012_val_00023145.JPEG n04311004/
+mv val/ILSVRC2012_val_00023146.JPEG n01677366/
+mv val/ILSVRC2012_val_00023147.JPEG n03062245/
+mv val/ILSVRC2012_val_00023148.JPEG n03627232/
+mv val/ILSVRC2012_val_00023149.JPEG n03134739/
+mv val/ILSVRC2012_val_00023150.JPEG n04372370/
+mv val/ILSVRC2012_val_00023151.JPEG n03075370/
+mv val/ILSVRC2012_val_00023152.JPEG n02802426/
+mv val/ILSVRC2012_val_00023153.JPEG n03447721/
+mv val/ILSVRC2012_val_00023154.JPEG n01829413/
+mv val/ILSVRC2012_val_00023155.JPEG n02090379/
+mv val/ILSVRC2012_val_00023156.JPEG n04192698/
+mv val/ILSVRC2012_val_00023157.JPEG n03743016/
+mv val/ILSVRC2012_val_00023158.JPEG n01692333/
+mv val/ILSVRC2012_val_00023159.JPEG n02099601/
+mv val/ILSVRC2012_val_00023160.JPEG n03720891/
+mv val/ILSVRC2012_val_00023161.JPEG n02951585/
+mv val/ILSVRC2012_val_00023162.JPEG n01532829/
+mv val/ILSVRC2012_val_00023163.JPEG n02281406/
+mv val/ILSVRC2012_val_00023164.JPEG n02096177/
+mv val/ILSVRC2012_val_00023165.JPEG n03920288/
+mv val/ILSVRC2012_val_00023166.JPEG n02927161/
+mv val/ILSVRC2012_val_00023167.JPEG n04179913/
+mv val/ILSVRC2012_val_00023168.JPEG n02100236/
+mv val/ILSVRC2012_val_00023169.JPEG n04515003/
+mv val/ILSVRC2012_val_00023170.JPEG n07802026/
+mv val/ILSVRC2012_val_00023171.JPEG n02088632/
+mv val/ILSVRC2012_val_00023172.JPEG n03950228/
+mv val/ILSVRC2012_val_00023173.JPEG n09193705/
+mv val/ILSVRC2012_val_00023174.JPEG n03841143/
+mv val/ILSVRC2012_val_00023175.JPEG n02093647/
+mv val/ILSVRC2012_val_00023176.JPEG n04336792/
+mv val/ILSVRC2012_val_00023177.JPEG n04357314/
+mv val/ILSVRC2012_val_00023178.JPEG n03929660/
+mv val/ILSVRC2012_val_00023179.JPEG n02093647/
+mv val/ILSVRC2012_val_00023180.JPEG n02093428/
+mv val/ILSVRC2012_val_00023181.JPEG n04049303/
+mv val/ILSVRC2012_val_00023182.JPEG n01873310/
+mv val/ILSVRC2012_val_00023183.JPEG n02268853/
+mv val/ILSVRC2012_val_00023184.JPEG n03838899/
+mv val/ILSVRC2012_val_00023185.JPEG n01484850/
+mv val/ILSVRC2012_val_00023186.JPEG n03337140/
+mv val/ILSVRC2012_val_00023187.JPEG n01537544/
+mv val/ILSVRC2012_val_00023188.JPEG n02174001/
+mv val/ILSVRC2012_val_00023189.JPEG n03063599/
+mv val/ILSVRC2012_val_00023190.JPEG n02640242/
+mv val/ILSVRC2012_val_00023191.JPEG n03721384/
+mv val/ILSVRC2012_val_00023192.JPEG n04596742/
+mv val/ILSVRC2012_val_00023193.JPEG n02795169/
+mv val/ILSVRC2012_val_00023194.JPEG n02492660/
+mv val/ILSVRC2012_val_00023195.JPEG n02892201/
+mv val/ILSVRC2012_val_00023196.JPEG n02361337/
+mv val/ILSVRC2012_val_00023197.JPEG n04417672/
+mv val/ILSVRC2012_val_00023198.JPEG n02113624/
+mv val/ILSVRC2012_val_00023199.JPEG n02028035/
+mv val/ILSVRC2012_val_00023200.JPEG n02999410/
+mv val/ILSVRC2012_val_00023201.JPEG n01629819/
+mv val/ILSVRC2012_val_00023202.JPEG n02115913/
+mv val/ILSVRC2012_val_00023203.JPEG n02089078/
+mv val/ILSVRC2012_val_00023204.JPEG n01768244/
+mv val/ILSVRC2012_val_00023205.JPEG n04263257/
+mv val/ILSVRC2012_val_00023206.JPEG n01944390/
+mv val/ILSVRC2012_val_00023207.JPEG n01945685/
+mv val/ILSVRC2012_val_00023208.JPEG n02071294/
+mv val/ILSVRC2012_val_00023209.JPEG n03937543/
+mv val/ILSVRC2012_val_00023210.JPEG n02391049/
+mv val/ILSVRC2012_val_00023211.JPEG n02018207/
+mv val/ILSVRC2012_val_00023212.JPEG n02129165/
+mv val/ILSVRC2012_val_00023213.JPEG n02074367/
+mv val/ILSVRC2012_val_00023214.JPEG n01518878/
+mv val/ILSVRC2012_val_00023215.JPEG n03445777/
+mv val/ILSVRC2012_val_00023216.JPEG n04149813/
+mv val/ILSVRC2012_val_00023217.JPEG n02669723/
+mv val/ILSVRC2012_val_00023218.JPEG n02097047/
+mv val/ILSVRC2012_val_00023219.JPEG n02865351/
+mv val/ILSVRC2012_val_00023220.JPEG n07753592/
+mv val/ILSVRC2012_val_00023221.JPEG n02814533/
+mv val/ILSVRC2012_val_00023222.JPEG n03874599/
+mv val/ILSVRC2012_val_00023223.JPEG n07720875/
+mv val/ILSVRC2012_val_00023224.JPEG n04116512/
+mv val/ILSVRC2012_val_00023225.JPEG n02417914/
+mv val/ILSVRC2012_val_00023226.JPEG n02027492/
+mv val/ILSVRC2012_val_00023227.JPEG n03877845/
+mv val/ILSVRC2012_val_00023228.JPEG n02123159/
+mv val/ILSVRC2012_val_00023229.JPEG n04264628/
+mv val/ILSVRC2012_val_00023230.JPEG n02236044/
+mv val/ILSVRC2012_val_00023231.JPEG n02108089/
+mv val/ILSVRC2012_val_00023232.JPEG n04133789/
+mv val/ILSVRC2012_val_00023233.JPEG n04147183/
+mv val/ILSVRC2012_val_00023234.JPEG n02085620/
+mv val/ILSVRC2012_val_00023235.JPEG n02091134/
+mv val/ILSVRC2012_val_00023236.JPEG n03944341/
+mv val/ILSVRC2012_val_00023237.JPEG n13037406/
+mv val/ILSVRC2012_val_00023238.JPEG n02422106/
+mv val/ILSVRC2012_val_00023239.JPEG n01498041/
+mv val/ILSVRC2012_val_00023240.JPEG n03775071/
+mv val/ILSVRC2012_val_00023241.JPEG n04357314/
+mv val/ILSVRC2012_val_00023242.JPEG n02102040/
+mv val/ILSVRC2012_val_00023243.JPEG n01682714/
+mv val/ILSVRC2012_val_00023244.JPEG n01775062/
+mv val/ILSVRC2012_val_00023245.JPEG n03014705/
+mv val/ILSVRC2012_val_00023246.JPEG n01693334/
+mv val/ILSVRC2012_val_00023247.JPEG n01616318/
+mv val/ILSVRC2012_val_00023248.JPEG n04604644/
+mv val/ILSVRC2012_val_00023249.JPEG n03109150/
+mv val/ILSVRC2012_val_00023250.JPEG n02088238/
+mv val/ILSVRC2012_val_00023251.JPEG n01981276/
+mv val/ILSVRC2012_val_00023252.JPEG n02422106/
+mv val/ILSVRC2012_val_00023253.JPEG n01985128/
+mv val/ILSVRC2012_val_00023254.JPEG n04026417/
+mv val/ILSVRC2012_val_00023255.JPEG n01644900/
+mv val/ILSVRC2012_val_00023256.JPEG n02095570/
+mv val/ILSVRC2012_val_00023257.JPEG n04266014/
+mv val/ILSVRC2012_val_00023258.JPEG n02236044/
+mv val/ILSVRC2012_val_00023259.JPEG n02115913/
+mv val/ILSVRC2012_val_00023260.JPEG n01883070/
+mv val/ILSVRC2012_val_00023261.JPEG n03840681/
+mv val/ILSVRC2012_val_00023262.JPEG n02481823/
+mv val/ILSVRC2012_val_00023263.JPEG n03447721/
+mv val/ILSVRC2012_val_00023264.JPEG n01981276/
+mv val/ILSVRC2012_val_00023265.JPEG n03673027/
+mv val/ILSVRC2012_val_00023266.JPEG n02835271/
+mv val/ILSVRC2012_val_00023267.JPEG n02123159/
+mv val/ILSVRC2012_val_00023268.JPEG n02113186/
+mv val/ILSVRC2012_val_00023269.JPEG n03947888/
+mv val/ILSVRC2012_val_00023270.JPEG n02100877/
+mv val/ILSVRC2012_val_00023271.JPEG n03814639/
+mv val/ILSVRC2012_val_00023272.JPEG n02510455/
+mv val/ILSVRC2012_val_00023273.JPEG n04037443/
+mv val/ILSVRC2012_val_00023274.JPEG n03929660/
+mv val/ILSVRC2012_val_00023275.JPEG n03837869/
+mv val/ILSVRC2012_val_00023276.JPEG n02791270/
+mv val/ILSVRC2012_val_00023277.JPEG n03461385/
+mv val/ILSVRC2012_val_00023278.JPEG n02951585/
+mv val/ILSVRC2012_val_00023279.JPEG n04525305/
+mv val/ILSVRC2012_val_00023280.JPEG n02788148/
+mv val/ILSVRC2012_val_00023281.JPEG n02165105/
+mv val/ILSVRC2012_val_00023282.JPEG n04592741/
+mv val/ILSVRC2012_val_00023283.JPEG n02091467/
+mv val/ILSVRC2012_val_00023284.JPEG n03188531/
+mv val/ILSVRC2012_val_00023285.JPEG n02091134/
+mv val/ILSVRC2012_val_00023286.JPEG n03617480/
+mv val/ILSVRC2012_val_00023287.JPEG n03954731/
+mv val/ILSVRC2012_val_00023288.JPEG n04328186/
+mv val/ILSVRC2012_val_00023289.JPEG n02105162/
+mv val/ILSVRC2012_val_00023290.JPEG n02870880/
+mv val/ILSVRC2012_val_00023291.JPEG n03028079/
+mv val/ILSVRC2012_val_00023292.JPEG n04596742/
+mv val/ILSVRC2012_val_00023293.JPEG n04204347/
+mv val/ILSVRC2012_val_00023294.JPEG n02108422/
+mv val/ILSVRC2012_val_00023295.JPEG n01740131/
+mv val/ILSVRC2012_val_00023296.JPEG n02363005/
+mv val/ILSVRC2012_val_00023297.JPEG n03840681/
+mv val/ILSVRC2012_val_00023298.JPEG n04116512/
+mv val/ILSVRC2012_val_00023299.JPEG n02138441/
+mv val/ILSVRC2012_val_00023300.JPEG n04367480/
+mv val/ILSVRC2012_val_00023301.JPEG n01773797/
+mv val/ILSVRC2012_val_00023302.JPEG n04350905/
+mv val/ILSVRC2012_val_00023303.JPEG n02095314/
+mv val/ILSVRC2012_val_00023304.JPEG n09229709/
+mv val/ILSVRC2012_val_00023305.JPEG n02494079/
+mv val/ILSVRC2012_val_00023306.JPEG n03788365/
+mv val/ILSVRC2012_val_00023307.JPEG n02117135/
+mv val/ILSVRC2012_val_00023308.JPEG n01641577/
+mv val/ILSVRC2012_val_00023309.JPEG n04192698/
+mv val/ILSVRC2012_val_00023310.JPEG n02087046/
+mv val/ILSVRC2012_val_00023311.JPEG n12620546/
+mv val/ILSVRC2012_val_00023312.JPEG n02410509/
+mv val/ILSVRC2012_val_00023313.JPEG n03777568/
+mv val/ILSVRC2012_val_00023314.JPEG n02948072/
+mv val/ILSVRC2012_val_00023315.JPEG n03662601/
+mv val/ILSVRC2012_val_00023316.JPEG n02690373/
+mv val/ILSVRC2012_val_00023317.JPEG n02441942/
+mv val/ILSVRC2012_val_00023318.JPEG n03127925/
+mv val/ILSVRC2012_val_00023319.JPEG n02066245/
+mv val/ILSVRC2012_val_00023320.JPEG n02097130/
+mv val/ILSVRC2012_val_00023321.JPEG n03187595/
+mv val/ILSVRC2012_val_00023322.JPEG n02977058/
+mv val/ILSVRC2012_val_00023323.JPEG n03977966/
+mv val/ILSVRC2012_val_00023324.JPEG n03291819/
+mv val/ILSVRC2012_val_00023325.JPEG n02788148/
+mv val/ILSVRC2012_val_00023326.JPEG n03482405/
+mv val/ILSVRC2012_val_00023327.JPEG n02090721/
+mv val/ILSVRC2012_val_00023328.JPEG n02105641/
+mv val/ILSVRC2012_val_00023329.JPEG n04525038/
+mv val/ILSVRC2012_val_00023330.JPEG n04328186/
+mv val/ILSVRC2012_val_00023331.JPEG n03424325/
+mv val/ILSVRC2012_val_00023332.JPEG n03498962/
+mv val/ILSVRC2012_val_00023333.JPEG n03223299/
+mv val/ILSVRC2012_val_00023334.JPEG n04552348/
+mv val/ILSVRC2012_val_00023335.JPEG n09193705/
+mv val/ILSVRC2012_val_00023336.JPEG n07697537/
+mv val/ILSVRC2012_val_00023337.JPEG n04596742/
+mv val/ILSVRC2012_val_00023338.JPEG n01797886/
+mv val/ILSVRC2012_val_00023339.JPEG n01980166/
+mv val/ILSVRC2012_val_00023340.JPEG n02093991/
+mv val/ILSVRC2012_val_00023341.JPEG n01688243/
+mv val/ILSVRC2012_val_00023342.JPEG n01817953/
+mv val/ILSVRC2012_val_00023343.JPEG n03485407/
+mv val/ILSVRC2012_val_00023344.JPEG n01795545/
+mv val/ILSVRC2012_val_00023345.JPEG n02794156/
+mv val/ILSVRC2012_val_00023346.JPEG n02102480/
+mv val/ILSVRC2012_val_00023347.JPEG n01819313/
+mv val/ILSVRC2012_val_00023348.JPEG n03188531/
+mv val/ILSVRC2012_val_00023349.JPEG n02965783/
+mv val/ILSVRC2012_val_00023350.JPEG n03534580/
+mv val/ILSVRC2012_val_00023351.JPEG n02395406/
+mv val/ILSVRC2012_val_00023352.JPEG n02033041/
+mv val/ILSVRC2012_val_00023353.JPEG n03337140/
+mv val/ILSVRC2012_val_00023354.JPEG n04200800/
+mv val/ILSVRC2012_val_00023355.JPEG n02797295/
+mv val/ILSVRC2012_val_00023356.JPEG n02804414/
+mv val/ILSVRC2012_val_00023357.JPEG n02088364/
+mv val/ILSVRC2012_val_00023358.JPEG n03000247/
+mv val/ILSVRC2012_val_00023359.JPEG n03937543/
+mv val/ILSVRC2012_val_00023360.JPEG n02389026/
+mv val/ILSVRC2012_val_00023361.JPEG n01682714/
+mv val/ILSVRC2012_val_00023362.JPEG n02101388/
+mv val/ILSVRC2012_val_00023363.JPEG n01685808/
+mv val/ILSVRC2012_val_00023364.JPEG n07880968/
+mv val/ILSVRC2012_val_00023365.JPEG n02509815/
+mv val/ILSVRC2012_val_00023366.JPEG n03938244/
+mv val/ILSVRC2012_val_00023367.JPEG n04532670/
+mv val/ILSVRC2012_val_00023368.JPEG n03967562/
+mv val/ILSVRC2012_val_00023369.JPEG n03196217/
+mv val/ILSVRC2012_val_00023370.JPEG n02892767/
+mv val/ILSVRC2012_val_00023371.JPEG n01843383/
+mv val/ILSVRC2012_val_00023372.JPEG n02978881/
+mv val/ILSVRC2012_val_00023373.JPEG n01748264/
+mv val/ILSVRC2012_val_00023374.JPEG n04423845/
+mv val/ILSVRC2012_val_00023375.JPEG n02396427/
+mv val/ILSVRC2012_val_00023376.JPEG n03388043/
+mv val/ILSVRC2012_val_00023377.JPEG n03000134/
+mv val/ILSVRC2012_val_00023378.JPEG n04429376/
+mv val/ILSVRC2012_val_00023379.JPEG n03483316/
+mv val/ILSVRC2012_val_00023380.JPEG n03485407/
+mv val/ILSVRC2012_val_00023381.JPEG n02256656/
+mv val/ILSVRC2012_val_00023382.JPEG n04086273/
+mv val/ILSVRC2012_val_00023383.JPEG n02356798/
+mv val/ILSVRC2012_val_00023384.JPEG n02747177/
+mv val/ILSVRC2012_val_00023385.JPEG n01773157/
+mv val/ILSVRC2012_val_00023386.JPEG n03297495/
+mv val/ILSVRC2012_val_00023387.JPEG n02403003/
+mv val/ILSVRC2012_val_00023388.JPEG n07718472/
+mv val/ILSVRC2012_val_00023389.JPEG n03445924/
+mv val/ILSVRC2012_val_00023390.JPEG n01843383/
+mv val/ILSVRC2012_val_00023391.JPEG n02328150/
+mv val/ILSVRC2012_val_00023392.JPEG n03447447/
+mv val/ILSVRC2012_val_00023393.JPEG n02124075/
+mv val/ILSVRC2012_val_00023394.JPEG n02098105/
+mv val/ILSVRC2012_val_00023395.JPEG n06596364/
+mv val/ILSVRC2012_val_00023396.JPEG n03388183/
+mv val/ILSVRC2012_val_00023397.JPEG n06596364/
+mv val/ILSVRC2012_val_00023398.JPEG n02504013/
+mv val/ILSVRC2012_val_00023399.JPEG n04041544/
+mv val/ILSVRC2012_val_00023400.JPEG n02009912/
+mv val/ILSVRC2012_val_00023401.JPEG n02093859/
+mv val/ILSVRC2012_val_00023402.JPEG n04350905/
+mv val/ILSVRC2012_val_00023403.JPEG n02317335/
+mv val/ILSVRC2012_val_00023404.JPEG n07871810/
+mv val/ILSVRC2012_val_00023405.JPEG n02105855/
+mv val/ILSVRC2012_val_00023406.JPEG n02607072/
+mv val/ILSVRC2012_val_00023407.JPEG n02095570/
+mv val/ILSVRC2012_val_00023408.JPEG n02389026/
+mv val/ILSVRC2012_val_00023409.JPEG n06785654/
+mv val/ILSVRC2012_val_00023410.JPEG n09421951/
+mv val/ILSVRC2012_val_00023411.JPEG n02114855/
+mv val/ILSVRC2012_val_00023412.JPEG n03216828/
+mv val/ILSVRC2012_val_00023413.JPEG n01855032/
+mv val/ILSVRC2012_val_00023414.JPEG n03095699/
+mv val/ILSVRC2012_val_00023415.JPEG n02115641/
+mv val/ILSVRC2012_val_00023416.JPEG n01955084/
+mv val/ILSVRC2012_val_00023417.JPEG n03095699/
+mv val/ILSVRC2012_val_00023418.JPEG n03133878/
+mv val/ILSVRC2012_val_00023419.JPEG n03902125/
+mv val/ILSVRC2012_val_00023420.JPEG n02395406/
+mv val/ILSVRC2012_val_00023421.JPEG n04371774/
+mv val/ILSVRC2012_val_00023422.JPEG n04525305/
+mv val/ILSVRC2012_val_00023423.JPEG n03345487/
+mv val/ILSVRC2012_val_00023424.JPEG n02108551/
+mv val/ILSVRC2012_val_00023425.JPEG n01774750/
+mv val/ILSVRC2012_val_00023426.JPEG n02480495/
+mv val/ILSVRC2012_val_00023427.JPEG n03594945/
+mv val/ILSVRC2012_val_00023428.JPEG n02091635/
+mv val/ILSVRC2012_val_00023429.JPEG n04557648/
+mv val/ILSVRC2012_val_00023430.JPEG n03388549/
+mv val/ILSVRC2012_val_00023431.JPEG n01784675/
+mv val/ILSVRC2012_val_00023432.JPEG n13040303/
+mv val/ILSVRC2012_val_00023433.JPEG n13037406/
+mv val/ILSVRC2012_val_00023434.JPEG n01776313/
+mv val/ILSVRC2012_val_00023435.JPEG n02099601/
+mv val/ILSVRC2012_val_00023436.JPEG n03134739/
+mv val/ILSVRC2012_val_00023437.JPEG n02110185/
+mv val/ILSVRC2012_val_00023438.JPEG n01537544/
+mv val/ILSVRC2012_val_00023439.JPEG n13133613/
+mv val/ILSVRC2012_val_00023440.JPEG n02102040/
+mv val/ILSVRC2012_val_00023441.JPEG n01530575/
+mv val/ILSVRC2012_val_00023442.JPEG n01735189/
+mv val/ILSVRC2012_val_00023443.JPEG n01491361/
+mv val/ILSVRC2012_val_00023444.JPEG n07583066/
+mv val/ILSVRC2012_val_00023445.JPEG n02137549/
+mv val/ILSVRC2012_val_00023446.JPEG n03908714/
+mv val/ILSVRC2012_val_00023447.JPEG n03045698/
+mv val/ILSVRC2012_val_00023448.JPEG n01914609/
+mv val/ILSVRC2012_val_00023449.JPEG n02326432/
+mv val/ILSVRC2012_val_00023450.JPEG n01631663/
+mv val/ILSVRC2012_val_00023451.JPEG n03868242/
+mv val/ILSVRC2012_val_00023452.JPEG n03920288/
+mv val/ILSVRC2012_val_00023453.JPEG n03729826/
+mv val/ILSVRC2012_val_00023454.JPEG n02002724/
+mv val/ILSVRC2012_val_00023455.JPEG n03776460/
+mv val/ILSVRC2012_val_00023456.JPEG n03535780/
+mv val/ILSVRC2012_val_00023457.JPEG n03146219/
+mv val/ILSVRC2012_val_00023458.JPEG n02094258/
+mv val/ILSVRC2012_val_00023459.JPEG n03841143/
+mv val/ILSVRC2012_val_00023460.JPEG n02797295/
+mv val/ILSVRC2012_val_00023461.JPEG n02500267/
+mv val/ILSVRC2012_val_00023462.JPEG n04392985/
+mv val/ILSVRC2012_val_00023463.JPEG n02504458/
+mv val/ILSVRC2012_val_00023464.JPEG n01773797/
+mv val/ILSVRC2012_val_00023465.JPEG n04325704/
+mv val/ILSVRC2012_val_00023466.JPEG n03920288/
+mv val/ILSVRC2012_val_00023467.JPEG n02999410/
+mv val/ILSVRC2012_val_00023468.JPEG n02655020/
+mv val/ILSVRC2012_val_00023469.JPEG n02097474/
+mv val/ILSVRC2012_val_00023470.JPEG n09472597/
+mv val/ILSVRC2012_val_00023471.JPEG n02099712/
+mv val/ILSVRC2012_val_00023472.JPEG n02980441/
+mv val/ILSVRC2012_val_00023473.JPEG n04461696/
+mv val/ILSVRC2012_val_00023474.JPEG n02814533/
+mv val/ILSVRC2012_val_00023475.JPEG n03495258/
+mv val/ILSVRC2012_val_00023476.JPEG n01784675/
+mv val/ILSVRC2012_val_00023477.JPEG n03000684/
+mv val/ILSVRC2012_val_00023478.JPEG n07760859/
+mv val/ILSVRC2012_val_00023479.JPEG n04141327/
+mv val/ILSVRC2012_val_00023480.JPEG n02641379/
+mv val/ILSVRC2012_val_00023481.JPEG n04200800/
+mv val/ILSVRC2012_val_00023482.JPEG n04141327/
+mv val/ILSVRC2012_val_00023483.JPEG n01943899/
+mv val/ILSVRC2012_val_00023484.JPEG n04037443/
+mv val/ILSVRC2012_val_00023485.JPEG n04357314/
+mv val/ILSVRC2012_val_00023486.JPEG n02097474/
+mv val/ILSVRC2012_val_00023487.JPEG n03857828/
+mv val/ILSVRC2012_val_00023488.JPEG n01630670/
+mv val/ILSVRC2012_val_00023489.JPEG n02417914/
+mv val/ILSVRC2012_val_00023490.JPEG n02747177/
+mv val/ILSVRC2012_val_00023491.JPEG n04590129/
+mv val/ILSVRC2012_val_00023492.JPEG n02037110/
+mv val/ILSVRC2012_val_00023493.JPEG n03841143/
+mv val/ILSVRC2012_val_00023494.JPEG n04204238/
+mv val/ILSVRC2012_val_00023495.JPEG n04252225/
+mv val/ILSVRC2012_val_00023496.JPEG n02791270/
+mv val/ILSVRC2012_val_00023497.JPEG n09193705/
+mv val/ILSVRC2012_val_00023498.JPEG n04376876/
+mv val/ILSVRC2012_val_00023499.JPEG n02815834/
+mv val/ILSVRC2012_val_00023500.JPEG n01817953/
+mv val/ILSVRC2012_val_00023501.JPEG n04356056/
+mv val/ILSVRC2012_val_00023502.JPEG n02007558/
+mv val/ILSVRC2012_val_00023503.JPEG n02917067/
+mv val/ILSVRC2012_val_00023504.JPEG n03544143/
+mv val/ILSVRC2012_val_00023505.JPEG n03954731/
+mv val/ILSVRC2012_val_00023506.JPEG n03372029/
+mv val/ILSVRC2012_val_00023507.JPEG n02930766/
+mv val/ILSVRC2012_val_00023508.JPEG n04310018/
+mv val/ILSVRC2012_val_00023509.JPEG n03630383/
+mv val/ILSVRC2012_val_00023510.JPEG n04009552/
+mv val/ILSVRC2012_val_00023511.JPEG n02132136/
+mv val/ILSVRC2012_val_00023512.JPEG n07745940/
+mv val/ILSVRC2012_val_00023513.JPEG n02094114/
+mv val/ILSVRC2012_val_00023514.JPEG n02480855/
+mv val/ILSVRC2012_val_00023515.JPEG n02093991/
+mv val/ILSVRC2012_val_00023516.JPEG n02113624/
+mv val/ILSVRC2012_val_00023517.JPEG n03662601/
+mv val/ILSVRC2012_val_00023518.JPEG n12144580/
+mv val/ILSVRC2012_val_00023519.JPEG n02443114/
+mv val/ILSVRC2012_val_00023520.JPEG n01914609/
+mv val/ILSVRC2012_val_00023521.JPEG n04040759/
+mv val/ILSVRC2012_val_00023522.JPEG n02834397/
+mv val/ILSVRC2012_val_00023523.JPEG n02276258/
+mv val/ILSVRC2012_val_00023524.JPEG n04557648/
+mv val/ILSVRC2012_val_00023525.JPEG n07718472/
+mv val/ILSVRC2012_val_00023526.JPEG n02108915/
+mv val/ILSVRC2012_val_00023527.JPEG n07753113/
+mv val/ILSVRC2012_val_00023528.JPEG n02093428/
+mv val/ILSVRC2012_val_00023529.JPEG n03976467/
+mv val/ILSVRC2012_val_00023530.JPEG n01984695/
+mv val/ILSVRC2012_val_00023531.JPEG n02492035/
+mv val/ILSVRC2012_val_00023532.JPEG n04275548/
+mv val/ILSVRC2012_val_00023533.JPEG n02100877/
+mv val/ILSVRC2012_val_00023534.JPEG n04254777/
+mv val/ILSVRC2012_val_00023535.JPEG n02799071/
+mv val/ILSVRC2012_val_00023536.JPEG n03908618/
+mv val/ILSVRC2012_val_00023537.JPEG n03773504/
+mv val/ILSVRC2012_val_00023538.JPEG n03347037/
+mv val/ILSVRC2012_val_00023539.JPEG n02107574/
+mv val/ILSVRC2012_val_00023540.JPEG n03529860/
+mv val/ILSVRC2012_val_00023541.JPEG n02093256/
+mv val/ILSVRC2012_val_00023542.JPEG n03291819/
+mv val/ILSVRC2012_val_00023543.JPEG n02110958/
+mv val/ILSVRC2012_val_00023544.JPEG n04275548/
+mv val/ILSVRC2012_val_00023545.JPEG n04273569/
+mv val/ILSVRC2012_val_00023546.JPEG n02113023/
+mv val/ILSVRC2012_val_00023547.JPEG n03958227/
+mv val/ILSVRC2012_val_00023548.JPEG n04417672/
+mv val/ILSVRC2012_val_00023549.JPEG n03272562/
+mv val/ILSVRC2012_val_00023550.JPEG n01980166/
+mv val/ILSVRC2012_val_00023551.JPEG n01514668/
+mv val/ILSVRC2012_val_00023552.JPEG n02002556/
+mv val/ILSVRC2012_val_00023553.JPEG n02086079/
+mv val/ILSVRC2012_val_00023554.JPEG n02104365/
+mv val/ILSVRC2012_val_00023555.JPEG n01677366/
+mv val/ILSVRC2012_val_00023556.JPEG n03770679/
+mv val/ILSVRC2012_val_00023557.JPEG n02096177/
+mv val/ILSVRC2012_val_00023558.JPEG n02094258/
+mv val/ILSVRC2012_val_00023559.JPEG n01440764/
+mv val/ILSVRC2012_val_00023560.JPEG n01943899/
+mv val/ILSVRC2012_val_00023561.JPEG n02099849/
+mv val/ILSVRC2012_val_00023562.JPEG n03899768/
+mv val/ILSVRC2012_val_00023563.JPEG n01729322/
+mv val/ILSVRC2012_val_00023564.JPEG n01776313/
+mv val/ILSVRC2012_val_00023565.JPEG n06359193/
+mv val/ILSVRC2012_val_00023566.JPEG n02447366/
+mv val/ILSVRC2012_val_00023567.JPEG n03857828/
+mv val/ILSVRC2012_val_00023568.JPEG n03384352/
+mv val/ILSVRC2012_val_00023569.JPEG n02111277/
+mv val/ILSVRC2012_val_00023570.JPEG n02226429/
+mv val/ILSVRC2012_val_00023571.JPEG n04366367/
+mv val/ILSVRC2012_val_00023572.JPEG n01737021/
+mv val/ILSVRC2012_val_00023573.JPEG n01537544/
+mv val/ILSVRC2012_val_00023574.JPEG n02951358/
+mv val/ILSVRC2012_val_00023575.JPEG n04371430/
+mv val/ILSVRC2012_val_00023576.JPEG n03196217/
+mv val/ILSVRC2012_val_00023577.JPEG n02100236/
+mv val/ILSVRC2012_val_00023578.JPEG n04443257/
+mv val/ILSVRC2012_val_00023579.JPEG n04479046/
+mv val/ILSVRC2012_val_00023580.JPEG n03983396/
+mv val/ILSVRC2012_val_00023581.JPEG n03218198/
+mv val/ILSVRC2012_val_00023582.JPEG n02105505/
+mv val/ILSVRC2012_val_00023583.JPEG n01978287/
+mv val/ILSVRC2012_val_00023584.JPEG n04286575/
+mv val/ILSVRC2012_val_00023585.JPEG n03866082/
+mv val/ILSVRC2012_val_00023586.JPEG n04208210/
+mv val/ILSVRC2012_val_00023587.JPEG n03891332/
+mv val/ILSVRC2012_val_00023588.JPEG n03857828/
+mv val/ILSVRC2012_val_00023589.JPEG n02504013/
+mv val/ILSVRC2012_val_00023590.JPEG n03982430/
+mv val/ILSVRC2012_val_00023591.JPEG n04554684/
+mv val/ILSVRC2012_val_00023592.JPEG n04317175/
+mv val/ILSVRC2012_val_00023593.JPEG n04552348/
+mv val/ILSVRC2012_val_00023594.JPEG n12057211/
+mv val/ILSVRC2012_val_00023595.JPEG n02483362/
+mv val/ILSVRC2012_val_00023596.JPEG n02097474/
+mv val/ILSVRC2012_val_00023597.JPEG n02361337/
+mv val/ILSVRC2012_val_00023598.JPEG n02120505/
+mv val/ILSVRC2012_val_00023599.JPEG n03594945/
+mv val/ILSVRC2012_val_00023600.JPEG n03498962/
+mv val/ILSVRC2012_val_00023601.JPEG n01978455/
+mv val/ILSVRC2012_val_00023602.JPEG n01829413/
+mv val/ILSVRC2012_val_00023603.JPEG n02105505/
+mv val/ILSVRC2012_val_00023604.JPEG n01978455/
+mv val/ILSVRC2012_val_00023605.JPEG n04356056/
+mv val/ILSVRC2012_val_00023606.JPEG n07718472/
+mv val/ILSVRC2012_val_00023607.JPEG n01518878/
+mv val/ILSVRC2012_val_00023608.JPEG n02795169/
+mv val/ILSVRC2012_val_00023609.JPEG n03617480/
+mv val/ILSVRC2012_val_00023610.JPEG n03372029/
+mv val/ILSVRC2012_val_00023611.JPEG n02099267/
+mv val/ILSVRC2012_val_00023612.JPEG n04229816/
+mv val/ILSVRC2012_val_00023613.JPEG n07717410/
+mv val/ILSVRC2012_val_00023614.JPEG n02895154/
+mv val/ILSVRC2012_val_00023615.JPEG n02110185/
+mv val/ILSVRC2012_val_00023616.JPEG n04149813/
+mv val/ILSVRC2012_val_00023617.JPEG n02056570/
+mv val/ILSVRC2012_val_00023618.JPEG n04404412/
+mv val/ILSVRC2012_val_00023619.JPEG n03028079/
+mv val/ILSVRC2012_val_00023620.JPEG n02110341/
+mv val/ILSVRC2012_val_00023621.JPEG n04120489/
+mv val/ILSVRC2012_val_00023622.JPEG n02804414/
+mv val/ILSVRC2012_val_00023623.JPEG n02988304/
+mv val/ILSVRC2012_val_00023624.JPEG n02167151/
+mv val/ILSVRC2012_val_00023625.JPEG n04392985/
+mv val/ILSVRC2012_val_00023626.JPEG n07747607/
+mv val/ILSVRC2012_val_00023627.JPEG n02966687/
+mv val/ILSVRC2012_val_00023628.JPEG n09399592/
+mv val/ILSVRC2012_val_00023629.JPEG n03761084/
+mv val/ILSVRC2012_val_00023630.JPEG n03400231/
+mv val/ILSVRC2012_val_00023631.JPEG n04136333/
+mv val/ILSVRC2012_val_00023632.JPEG n04423845/
+mv val/ILSVRC2012_val_00023633.JPEG n02978881/
+mv val/ILSVRC2012_val_00023634.JPEG n02099429/
+mv val/ILSVRC2012_val_00023635.JPEG n07892512/
+mv val/ILSVRC2012_val_00023636.JPEG n02137549/
+mv val/ILSVRC2012_val_00023637.JPEG n01807496/
+mv val/ILSVRC2012_val_00023638.JPEG n04033995/
+mv val/ILSVRC2012_val_00023639.JPEG n03876231/
+mv val/ILSVRC2012_val_00023640.JPEG n03063599/
+mv val/ILSVRC2012_val_00023641.JPEG n04005630/
+mv val/ILSVRC2012_val_00023642.JPEG n02489166/
+mv val/ILSVRC2012_val_00023643.JPEG n03197337/
+mv val/ILSVRC2012_val_00023644.JPEG n04456115/
+mv val/ILSVRC2012_val_00023645.JPEG n03388043/
+mv val/ILSVRC2012_val_00023646.JPEG n03062245/
+mv val/ILSVRC2012_val_00023647.JPEG n03899768/
+mv val/ILSVRC2012_val_00023648.JPEG n04371430/
+mv val/ILSVRC2012_val_00023649.JPEG n03729826/
+mv val/ILSVRC2012_val_00023650.JPEG n02165456/
+mv val/ILSVRC2012_val_00023651.JPEG n02769748/
+mv val/ILSVRC2012_val_00023652.JPEG n02412080/
+mv val/ILSVRC2012_val_00023653.JPEG n02086240/
+mv val/ILSVRC2012_val_00023654.JPEG n01665541/
+mv val/ILSVRC2012_val_00023655.JPEG n02412080/
+mv val/ILSVRC2012_val_00023656.JPEG n02445715/
+mv val/ILSVRC2012_val_00023657.JPEG n01735189/
+mv val/ILSVRC2012_val_00023658.JPEG n02086079/
+mv val/ILSVRC2012_val_00023659.JPEG n02110185/
+mv val/ILSVRC2012_val_00023660.JPEG n07697537/
+mv val/ILSVRC2012_val_00023661.JPEG n02112350/
+mv val/ILSVRC2012_val_00023662.JPEG n02137549/
+mv val/ILSVRC2012_val_00023663.JPEG n02398521/
+mv val/ILSVRC2012_val_00023664.JPEG n02971356/
+mv val/ILSVRC2012_val_00023665.JPEG n03980874/
+mv val/ILSVRC2012_val_00023666.JPEG n02106030/
+mv val/ILSVRC2012_val_00023667.JPEG n02980441/
+mv val/ILSVRC2012_val_00023668.JPEG n09193705/
+mv val/ILSVRC2012_val_00023669.JPEG n03393912/
+mv val/ILSVRC2012_val_00023670.JPEG n04562935/
+mv val/ILSVRC2012_val_00023671.JPEG n03691459/
+mv val/ILSVRC2012_val_00023672.JPEG n02870880/
+mv val/ILSVRC2012_val_00023673.JPEG n02443484/
+mv val/ILSVRC2012_val_00023674.JPEG n02979186/
+mv val/ILSVRC2012_val_00023675.JPEG n02100735/
+mv val/ILSVRC2012_val_00023676.JPEG n01682714/
+mv val/ILSVRC2012_val_00023677.JPEG n02607072/
+mv val/ILSVRC2012_val_00023678.JPEG n01688243/
+mv val/ILSVRC2012_val_00023679.JPEG n02454379/
+mv val/ILSVRC2012_val_00023680.JPEG n02443484/
+mv val/ILSVRC2012_val_00023681.JPEG n07248320/
+mv val/ILSVRC2012_val_00023682.JPEG n03814639/
+mv val/ILSVRC2012_val_00023683.JPEG n04509417/
+mv val/ILSVRC2012_val_00023684.JPEG n04019541/
+mv val/ILSVRC2012_val_00023685.JPEG n03938244/
+mv val/ILSVRC2012_val_00023686.JPEG n01667114/
+mv val/ILSVRC2012_val_00023687.JPEG n03791053/
+mv val/ILSVRC2012_val_00023688.JPEG n04442312/
+mv val/ILSVRC2012_val_00023689.JPEG n02226429/
+mv val/ILSVRC2012_val_00023690.JPEG n01693334/
+mv val/ILSVRC2012_val_00023691.JPEG n02794156/
+mv val/ILSVRC2012_val_00023692.JPEG n01773549/
+mv val/ILSVRC2012_val_00023693.JPEG n01685808/
+mv val/ILSVRC2012_val_00023694.JPEG n03598930/
+mv val/ILSVRC2012_val_00023695.JPEG n02017213/
+mv val/ILSVRC2012_val_00023696.JPEG n02124075/
+mv val/ILSVRC2012_val_00023697.JPEG n02091134/
+mv val/ILSVRC2012_val_00023698.JPEG n01530575/
+mv val/ILSVRC2012_val_00023699.JPEG n03657121/
+mv val/ILSVRC2012_val_00023700.JPEG n01768244/
+mv val/ILSVRC2012_val_00023701.JPEG n04552348/
+mv val/ILSVRC2012_val_00023702.JPEG n02106030/
+mv val/ILSVRC2012_val_00023703.JPEG n01667114/
+mv val/ILSVRC2012_val_00023704.JPEG n02790996/
+mv val/ILSVRC2012_val_00023705.JPEG n02699494/
+mv val/ILSVRC2012_val_00023706.JPEG n03291819/
+mv val/ILSVRC2012_val_00023707.JPEG n01694178/
+mv val/ILSVRC2012_val_00023708.JPEG n02423022/
+mv val/ILSVRC2012_val_00023709.JPEG n01855672/
+mv val/ILSVRC2012_val_00023710.JPEG n03459775/
+mv val/ILSVRC2012_val_00023711.JPEG n04070727/
+mv val/ILSVRC2012_val_00023712.JPEG n03770439/
+mv val/ILSVRC2012_val_00023713.JPEG n03709823/
+mv val/ILSVRC2012_val_00023714.JPEG n01924916/
+mv val/ILSVRC2012_val_00023715.JPEG n06785654/
+mv val/ILSVRC2012_val_00023716.JPEG n03272562/
+mv val/ILSVRC2012_val_00023717.JPEG n02099429/
+mv val/ILSVRC2012_val_00023718.JPEG n03100240/
+mv val/ILSVRC2012_val_00023719.JPEG n02174001/
+mv val/ILSVRC2012_val_00023720.JPEG n06794110/
+mv val/ILSVRC2012_val_00023721.JPEG n03759954/
+mv val/ILSVRC2012_val_00023722.JPEG n04357314/
+mv val/ILSVRC2012_val_00023723.JPEG n03584829/
+mv val/ILSVRC2012_val_00023724.JPEG n03345487/
+mv val/ILSVRC2012_val_00023725.JPEG n03443371/
+mv val/ILSVRC2012_val_00023726.JPEG n02100236/
+mv val/ILSVRC2012_val_00023727.JPEG n03709823/
+mv val/ILSVRC2012_val_00023728.JPEG n04350905/
+mv val/ILSVRC2012_val_00023729.JPEG n02086910/
+mv val/ILSVRC2012_val_00023730.JPEG n02977058/
+mv val/ILSVRC2012_val_00023731.JPEG n02112018/
+mv val/ILSVRC2012_val_00023732.JPEG n04409515/
+mv val/ILSVRC2012_val_00023733.JPEG n04118776/
+mv val/ILSVRC2012_val_00023734.JPEG n03376595/
+mv val/ILSVRC2012_val_00023735.JPEG n02101556/
+mv val/ILSVRC2012_val_00023736.JPEG n02776631/
+mv val/ILSVRC2012_val_00023737.JPEG n02108551/
+mv val/ILSVRC2012_val_00023738.JPEG n03291819/
+mv val/ILSVRC2012_val_00023739.JPEG n07745940/
+mv val/ILSVRC2012_val_00023740.JPEG n02109047/
+mv val/ILSVRC2012_val_00023741.JPEG n04336792/
+mv val/ILSVRC2012_val_00023742.JPEG n03494278/
+mv val/ILSVRC2012_val_00023743.JPEG n03388183/
+mv val/ILSVRC2012_val_00023744.JPEG n02398521/
+mv val/ILSVRC2012_val_00023745.JPEG n03485794/
+mv val/ILSVRC2012_val_00023746.JPEG n03018349/
+mv val/ILSVRC2012_val_00023747.JPEG n03967562/
+mv val/ILSVRC2012_val_00023748.JPEG n02116738/
+mv val/ILSVRC2012_val_00023749.JPEG n02085620/
+mv val/ILSVRC2012_val_00023750.JPEG n02108551/
+mv val/ILSVRC2012_val_00023751.JPEG n02894605/
+mv val/ILSVRC2012_val_00023752.JPEG n07695742/
+mv val/ILSVRC2012_val_00023753.JPEG n01693334/
+mv val/ILSVRC2012_val_00023754.JPEG n04356056/
+mv val/ILSVRC2012_val_00023755.JPEG n02120079/
+mv val/ILSVRC2012_val_00023756.JPEG n04540053/
+mv val/ILSVRC2012_val_00023757.JPEG n03134739/
+mv val/ILSVRC2012_val_00023758.JPEG n01644900/
+mv val/ILSVRC2012_val_00023759.JPEG n01697457/
+mv val/ILSVRC2012_val_00023760.JPEG n02108000/
+mv val/ILSVRC2012_val_00023761.JPEG n03720891/
+mv val/ILSVRC2012_val_00023762.JPEG n03733281/
+mv val/ILSVRC2012_val_00023763.JPEG n04404412/
+mv val/ILSVRC2012_val_00023764.JPEG n02098105/
+mv val/ILSVRC2012_val_00023765.JPEG n02089867/
+mv val/ILSVRC2012_val_00023766.JPEG n01530575/
+mv val/ILSVRC2012_val_00023767.JPEG n03884397/
+mv val/ILSVRC2012_val_00023768.JPEG n03602883/
+mv val/ILSVRC2012_val_00023769.JPEG n02090721/
+mv val/ILSVRC2012_val_00023770.JPEG n04228054/
+mv val/ILSVRC2012_val_00023771.JPEG n03208938/
+mv val/ILSVRC2012_val_00023772.JPEG n02483708/
+mv val/ILSVRC2012_val_00023773.JPEG n02017213/
+mv val/ILSVRC2012_val_00023774.JPEG n02097047/
+mv val/ILSVRC2012_val_00023775.JPEG n02509815/
+mv val/ILSVRC2012_val_00023776.JPEG n02447366/
+mv val/ILSVRC2012_val_00023777.JPEG n03532672/
+mv val/ILSVRC2012_val_00023778.JPEG n01518878/
+mv val/ILSVRC2012_val_00023779.JPEG n02123045/
+mv val/ILSVRC2012_val_00023780.JPEG n01847000/
+mv val/ILSVRC2012_val_00023781.JPEG n02690373/
+mv val/ILSVRC2012_val_00023782.JPEG n02092002/
+mv val/ILSVRC2012_val_00023783.JPEG n02096177/
+mv val/ILSVRC2012_val_00023784.JPEG n04487081/
+mv val/ILSVRC2012_val_00023785.JPEG n02526121/
+mv val/ILSVRC2012_val_00023786.JPEG n02124075/
+mv val/ILSVRC2012_val_00023787.JPEG n03717622/
+mv val/ILSVRC2012_val_00023788.JPEG n02106030/
+mv val/ILSVRC2012_val_00023789.JPEG n02002724/
+mv val/ILSVRC2012_val_00023790.JPEG n03240683/
+mv val/ILSVRC2012_val_00023791.JPEG n03902125/
+mv val/ILSVRC2012_val_00023792.JPEG n03709823/
+mv val/ILSVRC2012_val_00023793.JPEG n02974003/
+mv val/ILSVRC2012_val_00023794.JPEG n02100583/
+mv val/ILSVRC2012_val_00023795.JPEG n03201208/
+mv val/ILSVRC2012_val_00023796.JPEG n01833805/
+mv val/ILSVRC2012_val_00023797.JPEG n13052670/
+mv val/ILSVRC2012_val_00023798.JPEG n02219486/
+mv val/ILSVRC2012_val_00023799.JPEG n02107574/
+mv val/ILSVRC2012_val_00023800.JPEG n07742313/
+mv val/ILSVRC2012_val_00023801.JPEG n02112018/
+mv val/ILSVRC2012_val_00023802.JPEG n02489166/
+mv val/ILSVRC2012_val_00023803.JPEG n02441942/
+mv val/ILSVRC2012_val_00023804.JPEG n07753275/
+mv val/ILSVRC2012_val_00023805.JPEG n01819313/
+mv val/ILSVRC2012_val_00023806.JPEG n02643566/
+mv val/ILSVRC2012_val_00023807.JPEG n03110669/
+mv val/ILSVRC2012_val_00023808.JPEG n04482393/
+mv val/ILSVRC2012_val_00023809.JPEG n04613696/
+mv val/ILSVRC2012_val_00023810.JPEG n02129604/
+mv val/ILSVRC2012_val_00023811.JPEG n02088466/
+mv val/ILSVRC2012_val_00023812.JPEG n02134418/
+mv val/ILSVRC2012_val_00023813.JPEG n02114855/
+mv val/ILSVRC2012_val_00023814.JPEG n04591157/
+mv val/ILSVRC2012_val_00023815.JPEG n02277742/
+mv val/ILSVRC2012_val_00023816.JPEG n02112350/
+mv val/ILSVRC2012_val_00023817.JPEG n03590841/
+mv val/ILSVRC2012_val_00023818.JPEG n04476259/
+mv val/ILSVRC2012_val_00023819.JPEG n02326432/
+mv val/ILSVRC2012_val_00023820.JPEG n01755581/
+mv val/ILSVRC2012_val_00023821.JPEG n11939491/
+mv val/ILSVRC2012_val_00023822.JPEG n04264628/
+mv val/ILSVRC2012_val_00023823.JPEG n12998815/
+mv val/ILSVRC2012_val_00023824.JPEG n02101388/
+mv val/ILSVRC2012_val_00023825.JPEG n02137549/
+mv val/ILSVRC2012_val_00023826.JPEG n02236044/
+mv val/ILSVRC2012_val_00023827.JPEG n02123394/
+mv val/ILSVRC2012_val_00023828.JPEG n02909870/
+mv val/ILSVRC2012_val_00023829.JPEG n03733805/
+mv val/ILSVRC2012_val_00023830.JPEG n04120489/
+mv val/ILSVRC2012_val_00023831.JPEG n03958227/
+mv val/ILSVRC2012_val_00023832.JPEG n02100877/
+mv val/ILSVRC2012_val_00023833.JPEG n02169497/
+mv val/ILSVRC2012_val_00023834.JPEG n02168699/
+mv val/ILSVRC2012_val_00023835.JPEG n03794056/
+mv val/ILSVRC2012_val_00023836.JPEG n04146614/
+mv val/ILSVRC2012_val_00023837.JPEG n03787032/
+mv val/ILSVRC2012_val_00023838.JPEG n03937543/
+mv val/ILSVRC2012_val_00023839.JPEG n03388549/
+mv val/ILSVRC2012_val_00023840.JPEG n01978455/
+mv val/ILSVRC2012_val_00023841.JPEG n06874185/
+mv val/ILSVRC2012_val_00023842.JPEG n03717622/
+mv val/ILSVRC2012_val_00023843.JPEG n07875152/
+mv val/ILSVRC2012_val_00023844.JPEG n01820546/
+mv val/ILSVRC2012_val_00023845.JPEG n03445777/
+mv val/ILSVRC2012_val_00023846.JPEG n02109961/
+mv val/ILSVRC2012_val_00023847.JPEG n04127249/
+mv val/ILSVRC2012_val_00023848.JPEG n07716358/
+mv val/ILSVRC2012_val_00023849.JPEG n03661043/
+mv val/ILSVRC2012_val_00023850.JPEG n01534433/
+mv val/ILSVRC2012_val_00023851.JPEG n03982430/
+mv val/ILSVRC2012_val_00023852.JPEG n02490219/
+mv val/ILSVRC2012_val_00023853.JPEG n04152593/
+mv val/ILSVRC2012_val_00023854.JPEG n03062245/
+mv val/ILSVRC2012_val_00023855.JPEG n01644373/
+mv val/ILSVRC2012_val_00023856.JPEG n02951358/
+mv val/ILSVRC2012_val_00023857.JPEG n04041544/
+mv val/ILSVRC2012_val_00023858.JPEG n02974003/
+mv val/ILSVRC2012_val_00023859.JPEG n02102318/
+mv val/ILSVRC2012_val_00023860.JPEG n04127249/
+mv val/ILSVRC2012_val_00023861.JPEG n02500267/
+mv val/ILSVRC2012_val_00023862.JPEG n04548280/
+mv val/ILSVRC2012_val_00023863.JPEG n02690373/
+mv val/ILSVRC2012_val_00023864.JPEG n02125311/
+mv val/ILSVRC2012_val_00023865.JPEG n01950731/
+mv val/ILSVRC2012_val_00023866.JPEG n02007558/
+mv val/ILSVRC2012_val_00023867.JPEG n12267677/
+mv val/ILSVRC2012_val_00023868.JPEG n03045698/
+mv val/ILSVRC2012_val_00023869.JPEG n01443537/
+mv val/ILSVRC2012_val_00023870.JPEG n02447366/
+mv val/ILSVRC2012_val_00023871.JPEG n02124075/
+mv val/ILSVRC2012_val_00023872.JPEG n03916031/
+mv val/ILSVRC2012_val_00023873.JPEG n03146219/
+mv val/ILSVRC2012_val_00023874.JPEG n02843684/
+mv val/ILSVRC2012_val_00023875.JPEG n02980441/
+mv val/ILSVRC2012_val_00023876.JPEG n03187595/
+mv val/ILSVRC2012_val_00023877.JPEG n02091134/
+mv val/ILSVRC2012_val_00023878.JPEG n03124170/
+mv val/ILSVRC2012_val_00023879.JPEG n07749582/
+mv val/ILSVRC2012_val_00023880.JPEG n03594734/
+mv val/ILSVRC2012_val_00023881.JPEG n02666196/
+mv val/ILSVRC2012_val_00023882.JPEG n03782006/
+mv val/ILSVRC2012_val_00023883.JPEG n07697537/
+mv val/ILSVRC2012_val_00023884.JPEG n02111889/
+mv val/ILSVRC2012_val_00023885.JPEG n03724870/
+mv val/ILSVRC2012_val_00023886.JPEG n02085620/
+mv val/ILSVRC2012_val_00023887.JPEG n03492542/
+mv val/ILSVRC2012_val_00023888.JPEG n02102177/
+mv val/ILSVRC2012_val_00023889.JPEG n04515003/
+mv val/ILSVRC2012_val_00023890.JPEG n02167151/
+mv val/ILSVRC2012_val_00023891.JPEG n03877472/
+mv val/ILSVRC2012_val_00023892.JPEG n07720875/
+mv val/ILSVRC2012_val_00023893.JPEG n02097209/
+mv val/ILSVRC2012_val_00023894.JPEG n03208938/
+mv val/ILSVRC2012_val_00023895.JPEG n01601694/
+mv val/ILSVRC2012_val_00023896.JPEG n04067472/
+mv val/ILSVRC2012_val_00023897.JPEG n02174001/
+mv val/ILSVRC2012_val_00023898.JPEG n02123394/
+mv val/ILSVRC2012_val_00023899.JPEG n07583066/
+mv val/ILSVRC2012_val_00023900.JPEG n03599486/
+mv val/ILSVRC2012_val_00023901.JPEG n04005630/
+mv val/ILSVRC2012_val_00023902.JPEG n01698640/
+mv val/ILSVRC2012_val_00023903.JPEG n03047690/
+mv val/ILSVRC2012_val_00023904.JPEG n03793489/
+mv val/ILSVRC2012_val_00023905.JPEG n02916936/
+mv val/ILSVRC2012_val_00023906.JPEG n02124075/
+mv val/ILSVRC2012_val_00023907.JPEG n01592084/
+mv val/ILSVRC2012_val_00023908.JPEG n03127747/
+mv val/ILSVRC2012_val_00023909.JPEG n02130308/
+mv val/ILSVRC2012_val_00023910.JPEG n02094114/
+mv val/ILSVRC2012_val_00023911.JPEG n04131690/
+mv val/ILSVRC2012_val_00023912.JPEG n03063599/
+mv val/ILSVRC2012_val_00023913.JPEG n02110341/
+mv val/ILSVRC2012_val_00023914.JPEG n04008634/
+mv val/ILSVRC2012_val_00023915.JPEG n03218198/
+mv val/ILSVRC2012_val_00023916.JPEG n01496331/
+mv val/ILSVRC2012_val_00023917.JPEG n03146219/
+mv val/ILSVRC2012_val_00023918.JPEG n03496892/
+mv val/ILSVRC2012_val_00023919.JPEG n02097047/
+mv val/ILSVRC2012_val_00023920.JPEG n02397096/
+mv val/ILSVRC2012_val_00023921.JPEG n03942813/
+mv val/ILSVRC2012_val_00023922.JPEG n03787032/
+mv val/ILSVRC2012_val_00023923.JPEG n02125311/
+mv val/ILSVRC2012_val_00023924.JPEG n02119789/
+mv val/ILSVRC2012_val_00023925.JPEG n01945685/
+mv val/ILSVRC2012_val_00023926.JPEG n02105162/
+mv val/ILSVRC2012_val_00023927.JPEG n03127747/
+mv val/ILSVRC2012_val_00023928.JPEG n02107142/
+mv val/ILSVRC2012_val_00023929.JPEG n02992529/
+mv val/ILSVRC2012_val_00023930.JPEG n12620546/
+mv val/ILSVRC2012_val_00023931.JPEG n04067472/
+mv val/ILSVRC2012_val_00023932.JPEG n01630670/
+mv val/ILSVRC2012_val_00023933.JPEG n02423022/
+mv val/ILSVRC2012_val_00023934.JPEG n02948072/
+mv val/ILSVRC2012_val_00023935.JPEG n01491361/
+mv val/ILSVRC2012_val_00023936.JPEG n04067472/
+mv val/ILSVRC2012_val_00023937.JPEG n04263257/
+mv val/ILSVRC2012_val_00023938.JPEG n03223299/
+mv val/ILSVRC2012_val_00023939.JPEG n02088238/
+mv val/ILSVRC2012_val_00023940.JPEG n02231487/
+mv val/ILSVRC2012_val_00023941.JPEG n01739381/
+mv val/ILSVRC2012_val_00023942.JPEG n01532829/
+mv val/ILSVRC2012_val_00023943.JPEG n02099849/
+mv val/ILSVRC2012_val_00023944.JPEG n09256479/
+mv val/ILSVRC2012_val_00023945.JPEG n01580077/
+mv val/ILSVRC2012_val_00023946.JPEG n03895866/
+mv val/ILSVRC2012_val_00023947.JPEG n02037110/
+mv val/ILSVRC2012_val_00023948.JPEG n07742313/
+mv val/ILSVRC2012_val_00023949.JPEG n02091032/
+mv val/ILSVRC2012_val_00023950.JPEG n03841143/
+mv val/ILSVRC2012_val_00023951.JPEG n01986214/
+mv val/ILSVRC2012_val_00023952.JPEG n04356056/
+mv val/ILSVRC2012_val_00023953.JPEG n02971356/
+mv val/ILSVRC2012_val_00023954.JPEG n01774384/
+mv val/ILSVRC2012_val_00023955.JPEG n02097474/
+mv val/ILSVRC2012_val_00023956.JPEG n04019541/
+mv val/ILSVRC2012_val_00023957.JPEG n07753275/
+mv val/ILSVRC2012_val_00023958.JPEG n01944390/
+mv val/ILSVRC2012_val_00023959.JPEG n04371774/
+mv val/ILSVRC2012_val_00023960.JPEG n02120079/
+mv val/ILSVRC2012_val_00023961.JPEG n07932039/
+mv val/ILSVRC2012_val_00023962.JPEG n04033901/
+mv val/ILSVRC2012_val_00023963.JPEG n04074963/
+mv val/ILSVRC2012_val_00023964.JPEG n02843684/
+mv val/ILSVRC2012_val_00023965.JPEG n03457902/
+mv val/ILSVRC2012_val_00023966.JPEG n02089078/
+mv val/ILSVRC2012_val_00023967.JPEG n03544143/
+mv val/ILSVRC2012_val_00023968.JPEG n02088238/
+mv val/ILSVRC2012_val_00023969.JPEG n02342885/
+mv val/ILSVRC2012_val_00023970.JPEG n01753488/
+mv val/ILSVRC2012_val_00023971.JPEG n02895154/
+mv val/ILSVRC2012_val_00023972.JPEG n04009552/
+mv val/ILSVRC2012_val_00023973.JPEG n01806143/
+mv val/ILSVRC2012_val_00023974.JPEG n03794056/
+mv val/ILSVRC2012_val_00023975.JPEG n01740131/
+mv val/ILSVRC2012_val_00023976.JPEG n02423022/
+mv val/ILSVRC2012_val_00023977.JPEG n02033041/
+mv val/ILSVRC2012_val_00023978.JPEG n03942813/
+mv val/ILSVRC2012_val_00023979.JPEG n04023962/
+mv val/ILSVRC2012_val_00023980.JPEG n03630383/
+mv val/ILSVRC2012_val_00023981.JPEG n04251144/
+mv val/ILSVRC2012_val_00023982.JPEG n04376876/
+mv val/ILSVRC2012_val_00023983.JPEG n02107142/
+mv val/ILSVRC2012_val_00023984.JPEG n01740131/
+mv val/ILSVRC2012_val_00023985.JPEG n03075370/
+mv val/ILSVRC2012_val_00023986.JPEG n01494475/
+mv val/ILSVRC2012_val_00023987.JPEG n04590129/
+mv val/ILSVRC2012_val_00023988.JPEG n02786058/
+mv val/ILSVRC2012_val_00023989.JPEG n01773549/
+mv val/ILSVRC2012_val_00023990.JPEG n02028035/
+mv val/ILSVRC2012_val_00023991.JPEG n01978287/
+mv val/ILSVRC2012_val_00023992.JPEG n02966193/
+mv val/ILSVRC2012_val_00023993.JPEG n03982430/
+mv val/ILSVRC2012_val_00023994.JPEG n02442845/
+mv val/ILSVRC2012_val_00023995.JPEG n07734744/
+mv val/ILSVRC2012_val_00023996.JPEG n07615774/
+mv val/ILSVRC2012_val_00023997.JPEG n03970156/
+mv val/ILSVRC2012_val_00023998.JPEG n03000134/
+mv val/ILSVRC2012_val_00023999.JPEG n01883070/
+mv val/ILSVRC2012_val_00024000.JPEG n02124075/
+mv val/ILSVRC2012_val_00024001.JPEG n07892512/
+mv val/ILSVRC2012_val_00024002.JPEG n03970156/
+mv val/ILSVRC2012_val_00024003.JPEG n03958227/
+mv val/ILSVRC2012_val_00024004.JPEG n04532670/
+mv val/ILSVRC2012_val_00024005.JPEG n03743016/
+mv val/ILSVRC2012_val_00024006.JPEG n04479046/
+mv val/ILSVRC2012_val_00024007.JPEG n02011460/
+mv val/ILSVRC2012_val_00024008.JPEG n02391049/
+mv val/ILSVRC2012_val_00024009.JPEG n03877845/
+mv val/ILSVRC2012_val_00024010.JPEG n01981276/
+mv val/ILSVRC2012_val_00024011.JPEG n02488291/
+mv val/ILSVRC2012_val_00024012.JPEG n01592084/
+mv val/ILSVRC2012_val_00024013.JPEG n03544143/
+mv val/ILSVRC2012_val_00024014.JPEG n02168699/
+mv val/ILSVRC2012_val_00024015.JPEG n01494475/
+mv val/ILSVRC2012_val_00024016.JPEG n03887697/
+mv val/ILSVRC2012_val_00024017.JPEG n03249569/
+mv val/ILSVRC2012_val_00024018.JPEG n03777754/
+mv val/ILSVRC2012_val_00024019.JPEG n02100236/
+mv val/ILSVRC2012_val_00024020.JPEG n02017213/
+mv val/ILSVRC2012_val_00024021.JPEG n02999410/
+mv val/ILSVRC2012_val_00024022.JPEG n03590841/
+mv val/ILSVRC2012_val_00024023.JPEG n03476991/
+mv val/ILSVRC2012_val_00024024.JPEG n04192698/
+mv val/ILSVRC2012_val_00024025.JPEG n01582220/
+mv val/ILSVRC2012_val_00024026.JPEG n04604644/
+mv val/ILSVRC2012_val_00024027.JPEG n03658185/
+mv val/ILSVRC2012_val_00024028.JPEG n03773504/
+mv val/ILSVRC2012_val_00024029.JPEG n02640242/
+mv val/ILSVRC2012_val_00024030.JPEG n01819313/
+mv val/ILSVRC2012_val_00024031.JPEG n02906734/
+mv val/ILSVRC2012_val_00024032.JPEG n07697537/
+mv val/ILSVRC2012_val_00024033.JPEG n02403003/
+mv val/ILSVRC2012_val_00024034.JPEG n04270147/
+mv val/ILSVRC2012_val_00024035.JPEG n03544143/
+mv val/ILSVRC2012_val_00024036.JPEG n02859443/
+mv val/ILSVRC2012_val_00024037.JPEG n03733131/
+mv val/ILSVRC2012_val_00024038.JPEG n03733131/
+mv val/ILSVRC2012_val_00024039.JPEG n04251144/
+mv val/ILSVRC2012_val_00024040.JPEG n01806143/
+mv val/ILSVRC2012_val_00024041.JPEG n04254120/
+mv val/ILSVRC2012_val_00024042.JPEG n04350905/
+mv val/ILSVRC2012_val_00024043.JPEG n02090379/
+mv val/ILSVRC2012_val_00024044.JPEG n01582220/
+mv val/ILSVRC2012_val_00024045.JPEG n03868242/
+mv val/ILSVRC2012_val_00024046.JPEG n02088466/
+mv val/ILSVRC2012_val_00024047.JPEG n02793495/
+mv val/ILSVRC2012_val_00024048.JPEG n04136333/
+mv val/ILSVRC2012_val_00024049.JPEG n03476684/
+mv val/ILSVRC2012_val_00024050.JPEG n02129604/
+mv val/ILSVRC2012_val_00024051.JPEG n02112137/
+mv val/ILSVRC2012_val_00024052.JPEG n01622779/
+mv val/ILSVRC2012_val_00024053.JPEG n02087046/
+mv val/ILSVRC2012_val_00024054.JPEG n02114548/
+mv val/ILSVRC2012_val_00024055.JPEG n07875152/
+mv val/ILSVRC2012_val_00024056.JPEG n01773549/
+mv val/ILSVRC2012_val_00024057.JPEG n03721384/
+mv val/ILSVRC2012_val_00024058.JPEG n01843065/
+mv val/ILSVRC2012_val_00024059.JPEG n01601694/
+mv val/ILSVRC2012_val_00024060.JPEG n04254680/
+mv val/ILSVRC2012_val_00024061.JPEG n07860988/
+mv val/ILSVRC2012_val_00024062.JPEG n04523525/
+mv val/ILSVRC2012_val_00024063.JPEG n01843383/
+mv val/ILSVRC2012_val_00024064.JPEG n03314780/
+mv val/ILSVRC2012_val_00024065.JPEG n04069434/
+mv val/ILSVRC2012_val_00024066.JPEG n02791270/
+mv val/ILSVRC2012_val_00024067.JPEG n04125021/
+mv val/ILSVRC2012_val_00024068.JPEG n07880968/
+mv val/ILSVRC2012_val_00024069.JPEG n03314780/
+mv val/ILSVRC2012_val_00024070.JPEG n04346328/
+mv val/ILSVRC2012_val_00024071.JPEG n04335435/
+mv val/ILSVRC2012_val_00024072.JPEG n02093647/
+mv val/ILSVRC2012_val_00024073.JPEG n04532106/
+mv val/ILSVRC2012_val_00024074.JPEG n04465501/
+mv val/ILSVRC2012_val_00024075.JPEG n02102177/
+mv val/ILSVRC2012_val_00024076.JPEG n04344873/
+mv val/ILSVRC2012_val_00024077.JPEG n03788195/
+mv val/ILSVRC2012_val_00024078.JPEG n03803284/
+mv val/ILSVRC2012_val_00024079.JPEG n09835506/
+mv val/ILSVRC2012_val_00024080.JPEG n01872401/
+mv val/ILSVRC2012_val_00024081.JPEG n01688243/
+mv val/ILSVRC2012_val_00024082.JPEG n02233338/
+mv val/ILSVRC2012_val_00024083.JPEG n03633091/
+mv val/ILSVRC2012_val_00024084.JPEG n03888605/
+mv val/ILSVRC2012_val_00024085.JPEG n02095570/
+mv val/ILSVRC2012_val_00024086.JPEG n04579145/
+mv val/ILSVRC2012_val_00024087.JPEG n03598930/
+mv val/ILSVRC2012_val_00024088.JPEG n02980441/
+mv val/ILSVRC2012_val_00024089.JPEG n03095699/
+mv val/ILSVRC2012_val_00024090.JPEG n02088466/
+mv val/ILSVRC2012_val_00024091.JPEG n04296562/
+mv val/ILSVRC2012_val_00024092.JPEG n01739381/
+mv val/ILSVRC2012_val_00024093.JPEG n02033041/
+mv val/ILSVRC2012_val_00024094.JPEG n04346328/
+mv val/ILSVRC2012_val_00024095.JPEG n01695060/
+mv val/ILSVRC2012_val_00024096.JPEG n03733281/
+mv val/ILSVRC2012_val_00024097.JPEG n04265275/
+mv val/ILSVRC2012_val_00024098.JPEG n01796340/
+mv val/ILSVRC2012_val_00024099.JPEG n07880968/
+mv val/ILSVRC2012_val_00024100.JPEG n02894605/
+mv val/ILSVRC2012_val_00024101.JPEG n04465501/
+mv val/ILSVRC2012_val_00024102.JPEG n01644900/
+mv val/ILSVRC2012_val_00024103.JPEG n03100240/
+mv val/ILSVRC2012_val_00024104.JPEG n03447721/
+mv val/ILSVRC2012_val_00024105.JPEG n03792782/
+mv val/ILSVRC2012_val_00024106.JPEG n01828970/
+mv val/ILSVRC2012_val_00024107.JPEG n02486261/
+mv val/ILSVRC2012_val_00024108.JPEG n02690373/
+mv val/ILSVRC2012_val_00024109.JPEG n01774750/
+mv val/ILSVRC2012_val_00024110.JPEG n09229709/
+mv val/ILSVRC2012_val_00024111.JPEG n03045698/
+mv val/ILSVRC2012_val_00024112.JPEG n03874293/
+mv val/ILSVRC2012_val_00024113.JPEG n12267677/
+mv val/ILSVRC2012_val_00024114.JPEG n03637318/
+mv val/ILSVRC2012_val_00024115.JPEG n02398521/
+mv val/ILSVRC2012_val_00024116.JPEG n02782093/
+mv val/ILSVRC2012_val_00024117.JPEG n01728572/
+mv val/ILSVRC2012_val_00024118.JPEG n02457408/
+mv val/ILSVRC2012_val_00024119.JPEG n04005630/
+mv val/ILSVRC2012_val_00024120.JPEG n04525305/
+mv val/ILSVRC2012_val_00024121.JPEG n01820546/
+mv val/ILSVRC2012_val_00024122.JPEG n02138441/
+mv val/ILSVRC2012_val_00024123.JPEG n03532672/
+mv val/ILSVRC2012_val_00024124.JPEG n02808440/
+mv val/ILSVRC2012_val_00024125.JPEG n12985857/
+mv val/ILSVRC2012_val_00024126.JPEG n02085620/
+mv val/ILSVRC2012_val_00024127.JPEG n04584207/
+mv val/ILSVRC2012_val_00024128.JPEG n02125311/
+mv val/ILSVRC2012_val_00024129.JPEG n07742313/
+mv val/ILSVRC2012_val_00024130.JPEG n03355925/
+mv val/ILSVRC2012_val_00024131.JPEG n03868242/
+mv val/ILSVRC2012_val_00024132.JPEG n03871628/
+mv val/ILSVRC2012_val_00024133.JPEG n03840681/
+mv val/ILSVRC2012_val_00024134.JPEG n04310018/
+mv val/ILSVRC2012_val_00024135.JPEG n02793495/
+mv val/ILSVRC2012_val_00024136.JPEG n02489166/
+mv val/ILSVRC2012_val_00024137.JPEG n02727426/
+mv val/ILSVRC2012_val_00024138.JPEG n04592741/
+mv val/ILSVRC2012_val_00024139.JPEG n02841315/
+mv val/ILSVRC2012_val_00024140.JPEG n02490219/
+mv val/ILSVRC2012_val_00024141.JPEG n04273569/
+mv val/ILSVRC2012_val_00024142.JPEG n04228054/
+mv val/ILSVRC2012_val_00024143.JPEG n03991062/
+mv val/ILSVRC2012_val_00024144.JPEG n02093647/
+mv val/ILSVRC2012_val_00024145.JPEG n02113023/
+mv val/ILSVRC2012_val_00024146.JPEG n01698640/
+mv val/ILSVRC2012_val_00024147.JPEG n04591713/
+mv val/ILSVRC2012_val_00024148.JPEG n02111277/
+mv val/ILSVRC2012_val_00024149.JPEG n04596742/
+mv val/ILSVRC2012_val_00024150.JPEG n02110627/
+mv val/ILSVRC2012_val_00024151.JPEG n03720891/
+mv val/ILSVRC2012_val_00024152.JPEG n04251144/
+mv val/ILSVRC2012_val_00024153.JPEG n03179701/
+mv val/ILSVRC2012_val_00024154.JPEG n02091244/
+mv val/ILSVRC2012_val_00024155.JPEG n07745940/
+mv val/ILSVRC2012_val_00024156.JPEG n03000247/
+mv val/ILSVRC2012_val_00024157.JPEG n04243546/
+mv val/ILSVRC2012_val_00024158.JPEG n07697313/
+mv val/ILSVRC2012_val_00024159.JPEG n03127925/
+mv val/ILSVRC2012_val_00024160.JPEG n01985128/
+mv val/ILSVRC2012_val_00024161.JPEG n03942813/
+mv val/ILSVRC2012_val_00024162.JPEG n02013706/
+mv val/ILSVRC2012_val_00024163.JPEG n02483708/
+mv val/ILSVRC2012_val_00024164.JPEG n01632458/
+mv val/ILSVRC2012_val_00024165.JPEG n02279972/
+mv val/ILSVRC2012_val_00024166.JPEG n02009912/
+mv val/ILSVRC2012_val_00024167.JPEG n02256656/
+mv val/ILSVRC2012_val_00024168.JPEG n01768244/
+mv val/ILSVRC2012_val_00024169.JPEG n02091635/
+mv val/ILSVRC2012_val_00024170.JPEG n03770679/
+mv val/ILSVRC2012_val_00024171.JPEG n12144580/
+mv val/ILSVRC2012_val_00024172.JPEG n01806567/
+mv val/ILSVRC2012_val_00024173.JPEG n04536866/
+mv val/ILSVRC2012_val_00024174.JPEG n03991062/
+mv val/ILSVRC2012_val_00024175.JPEG n02391049/
+mv val/ILSVRC2012_val_00024176.JPEG n02326432/
+mv val/ILSVRC2012_val_00024177.JPEG n04443257/
+mv val/ILSVRC2012_val_00024178.JPEG n02097047/
+mv val/ILSVRC2012_val_00024179.JPEG n02101006/
+mv val/ILSVRC2012_val_00024180.JPEG n02051845/
+mv val/ILSVRC2012_val_00024181.JPEG n03933933/
+mv val/ILSVRC2012_val_00024182.JPEG n03595614/
+mv val/ILSVRC2012_val_00024183.JPEG n07695742/
+mv val/ILSVRC2012_val_00024184.JPEG n07579787/
+mv val/ILSVRC2012_val_00024185.JPEG n02120079/
+mv val/ILSVRC2012_val_00024186.JPEG n02110627/
+mv val/ILSVRC2012_val_00024187.JPEG n02095314/
+mv val/ILSVRC2012_val_00024188.JPEG n03201208/
+mv val/ILSVRC2012_val_00024189.JPEG n03803284/
+mv val/ILSVRC2012_val_00024190.JPEG n02444819/
+mv val/ILSVRC2012_val_00024191.JPEG n03899768/
+mv val/ILSVRC2012_val_00024192.JPEG n02233338/
+mv val/ILSVRC2012_val_00024193.JPEG n02747177/
+mv val/ILSVRC2012_val_00024194.JPEG n03483316/
+mv val/ILSVRC2012_val_00024195.JPEG n04136333/
+mv val/ILSVRC2012_val_00024196.JPEG n03220513/
+mv val/ILSVRC2012_val_00024197.JPEG n03623198/
+mv val/ILSVRC2012_val_00024198.JPEG n03134739/
+mv val/ILSVRC2012_val_00024199.JPEG n03630383/
+mv val/ILSVRC2012_val_00024200.JPEG n02808440/
+mv val/ILSVRC2012_val_00024201.JPEG n03769881/
+mv val/ILSVRC2012_val_00024202.JPEG n02799071/
+mv val/ILSVRC2012_val_00024203.JPEG n04019541/
+mv val/ILSVRC2012_val_00024204.JPEG n01498041/
+mv val/ILSVRC2012_val_00024205.JPEG n04428191/
+mv val/ILSVRC2012_val_00024206.JPEG n02094433/
+mv val/ILSVRC2012_val_00024207.JPEG n03450230/
+mv val/ILSVRC2012_val_00024208.JPEG n02092002/
+mv val/ILSVRC2012_val_00024209.JPEG n03929660/
+mv val/ILSVRC2012_val_00024210.JPEG n03000134/
+mv val/ILSVRC2012_val_00024211.JPEG n01914609/
+mv val/ILSVRC2012_val_00024212.JPEG n03721384/
+mv val/ILSVRC2012_val_00024213.JPEG n04389033/
+mv val/ILSVRC2012_val_00024214.JPEG n02128385/
+mv val/ILSVRC2012_val_00024215.JPEG n03000247/
+mv val/ILSVRC2012_val_00024216.JPEG n02091244/
+mv val/ILSVRC2012_val_00024217.JPEG n02108000/
+mv val/ILSVRC2012_val_00024218.JPEG n02110063/
+mv val/ILSVRC2012_val_00024219.JPEG n02128385/
+mv val/ILSVRC2012_val_00024220.JPEG n02641379/
+mv val/ILSVRC2012_val_00024221.JPEG n01664065/
+mv val/ILSVRC2012_val_00024222.JPEG n02109525/
+mv val/ILSVRC2012_val_00024223.JPEG n07802026/
+mv val/ILSVRC2012_val_00024224.JPEG n07714571/
+mv val/ILSVRC2012_val_00024225.JPEG n03691459/
+mv val/ILSVRC2012_val_00024226.JPEG n02109961/
+mv val/ILSVRC2012_val_00024227.JPEG n01688243/
+mv val/ILSVRC2012_val_00024228.JPEG n04515003/
+mv val/ILSVRC2012_val_00024229.JPEG n04252225/
+mv val/ILSVRC2012_val_00024230.JPEG n02877765/
+mv val/ILSVRC2012_val_00024231.JPEG n03476991/
+mv val/ILSVRC2012_val_00024232.JPEG n07717410/
+mv val/ILSVRC2012_val_00024233.JPEG n04389033/
+mv val/ILSVRC2012_val_00024234.JPEG n02129165/
+mv val/ILSVRC2012_val_00024235.JPEG n01440764/
+mv val/ILSVRC2012_val_00024236.JPEG n12985857/
+mv val/ILSVRC2012_val_00024237.JPEG n04371430/
+mv val/ILSVRC2012_val_00024238.JPEG n03447721/
+mv val/ILSVRC2012_val_00024239.JPEG n02441942/
+mv val/ILSVRC2012_val_00024240.JPEG n02110958/
+mv val/ILSVRC2012_val_00024241.JPEG n02094433/
+mv val/ILSVRC2012_val_00024242.JPEG n04146614/
+mv val/ILSVRC2012_val_00024243.JPEG n03857828/
+mv val/ILSVRC2012_val_00024244.JPEG n03788195/
+mv val/ILSVRC2012_val_00024245.JPEG n03804744/
+mv val/ILSVRC2012_val_00024246.JPEG n02102040/
+mv val/ILSVRC2012_val_00024247.JPEG n02317335/
+mv val/ILSVRC2012_val_00024248.JPEG n09246464/
+mv val/ILSVRC2012_val_00024249.JPEG n02110958/
+mv val/ILSVRC2012_val_00024250.JPEG n02256656/
+mv val/ILSVRC2012_val_00024251.JPEG n03781244/
+mv val/ILSVRC2012_val_00024252.JPEG n01689811/
+mv val/ILSVRC2012_val_00024253.JPEG n02487347/
+mv val/ILSVRC2012_val_00024254.JPEG n02092002/
+mv val/ILSVRC2012_val_00024255.JPEG n03733805/
+mv val/ILSVRC2012_val_00024256.JPEG n01531178/
+mv val/ILSVRC2012_val_00024257.JPEG n02454379/
+mv val/ILSVRC2012_val_00024258.JPEG n02088238/
+mv val/ILSVRC2012_val_00024259.JPEG n01729322/
+mv val/ILSVRC2012_val_00024260.JPEG n01945685/
+mv val/ILSVRC2012_val_00024261.JPEG n01774384/
+mv val/ILSVRC2012_val_00024262.JPEG n01632458/
+mv val/ILSVRC2012_val_00024263.JPEG n03776460/
+mv val/ILSVRC2012_val_00024264.JPEG n01877812/
+mv val/ILSVRC2012_val_00024265.JPEG n07615774/
+mv val/ILSVRC2012_val_00024266.JPEG n02423022/
+mv val/ILSVRC2012_val_00024267.JPEG n03384352/
+mv val/ILSVRC2012_val_00024268.JPEG n01518878/
+mv val/ILSVRC2012_val_00024269.JPEG n03000684/
+mv val/ILSVRC2012_val_00024270.JPEG n02018207/
+mv val/ILSVRC2012_val_00024271.JPEG n03876231/
+mv val/ILSVRC2012_val_00024272.JPEG n02113799/
+mv val/ILSVRC2012_val_00024273.JPEG n01855032/
+mv val/ILSVRC2012_val_00024274.JPEG n02910353/
+mv val/ILSVRC2012_val_00024275.JPEG n02109047/
+mv val/ILSVRC2012_val_00024276.JPEG n03967562/
+mv val/ILSVRC2012_val_00024277.JPEG n02112018/
+mv val/ILSVRC2012_val_00024278.JPEG n02708093/
+mv val/ILSVRC2012_val_00024279.JPEG n02417914/
+mv val/ILSVRC2012_val_00024280.JPEG n13040303/
+mv val/ILSVRC2012_val_00024281.JPEG n04005630/
+mv val/ILSVRC2012_val_00024282.JPEG n02794156/
+mv val/ILSVRC2012_val_00024283.JPEG n01689811/
+mv val/ILSVRC2012_val_00024284.JPEG n02113186/
+mv val/ILSVRC2012_val_00024285.JPEG n03476991/
+mv val/ILSVRC2012_val_00024286.JPEG n03773504/
+mv val/ILSVRC2012_val_00024287.JPEG n03868863/
+mv val/ILSVRC2012_val_00024288.JPEG n03788365/
+mv val/ILSVRC2012_val_00024289.JPEG n02133161/
+mv val/ILSVRC2012_val_00024290.JPEG n02708093/
+mv val/ILSVRC2012_val_00024291.JPEG n07718747/
+mv val/ILSVRC2012_val_00024292.JPEG n02106030/
+mv val/ILSVRC2012_val_00024293.JPEG n03916031/
+mv val/ILSVRC2012_val_00024294.JPEG n02493793/
+mv val/ILSVRC2012_val_00024295.JPEG n02277742/
+mv val/ILSVRC2012_val_00024296.JPEG n02701002/
+mv val/ILSVRC2012_val_00024297.JPEG n04238763/
+mv val/ILSVRC2012_val_00024298.JPEG n07742313/
+mv val/ILSVRC2012_val_00024299.JPEG n01755581/
+mv val/ILSVRC2012_val_00024300.JPEG n02321529/
+mv val/ILSVRC2012_val_00024301.JPEG n01728572/
+mv val/ILSVRC2012_val_00024302.JPEG n12057211/
+mv val/ILSVRC2012_val_00024303.JPEG n03016953/
+mv val/ILSVRC2012_val_00024304.JPEG n04009552/
+mv val/ILSVRC2012_val_00024305.JPEG n02107312/
+mv val/ILSVRC2012_val_00024306.JPEG n04486054/
+mv val/ILSVRC2012_val_00024307.JPEG n03837869/
+mv val/ILSVRC2012_val_00024308.JPEG n04127249/
+mv val/ILSVRC2012_val_00024309.JPEG n03837869/
+mv val/ILSVRC2012_val_00024310.JPEG n03895866/
+mv val/ILSVRC2012_val_00024311.JPEG n03032252/
+mv val/ILSVRC2012_val_00024312.JPEG n04380533/
+mv val/ILSVRC2012_val_00024313.JPEG n02777292/
+mv val/ILSVRC2012_val_00024314.JPEG n01729322/
+mv val/ILSVRC2012_val_00024315.JPEG n02607072/
+mv val/ILSVRC2012_val_00024316.JPEG n03792972/
+mv val/ILSVRC2012_val_00024317.JPEG n03930630/
+mv val/ILSVRC2012_val_00024318.JPEG n02814533/
+mv val/ILSVRC2012_val_00024319.JPEG n04005630/
+mv val/ILSVRC2012_val_00024320.JPEG n04099969/
+mv val/ILSVRC2012_val_00024321.JPEG n02110806/
+mv val/ILSVRC2012_val_00024322.JPEG n03594734/
+mv val/ILSVRC2012_val_00024323.JPEG n03697007/
+mv val/ILSVRC2012_val_00024324.JPEG n02071294/
+mv val/ILSVRC2012_val_00024325.JPEG n02346627/
+mv val/ILSVRC2012_val_00024326.JPEG n02096294/
+mv val/ILSVRC2012_val_00024327.JPEG n01440764/
+mv val/ILSVRC2012_val_00024328.JPEG n12267677/
+mv val/ILSVRC2012_val_00024329.JPEG n02097658/
+mv val/ILSVRC2012_val_00024330.JPEG n02111889/
+mv val/ILSVRC2012_val_00024331.JPEG n03825788/
+mv val/ILSVRC2012_val_00024332.JPEG n04153751/
+mv val/ILSVRC2012_val_00024333.JPEG n04259630/
+mv val/ILSVRC2012_val_00024334.JPEG n04254680/
+mv val/ILSVRC2012_val_00024335.JPEG n02092002/
+mv val/ILSVRC2012_val_00024336.JPEG n01833805/
+mv val/ILSVRC2012_val_00024337.JPEG n04200800/
+mv val/ILSVRC2012_val_00024338.JPEG n04435653/
+mv val/ILSVRC2012_val_00024339.JPEG n07753113/
+mv val/ILSVRC2012_val_00024340.JPEG n03888257/
+mv val/ILSVRC2012_val_00024341.JPEG n01744401/
+mv val/ILSVRC2012_val_00024342.JPEG n04192698/
+mv val/ILSVRC2012_val_00024343.JPEG n02415577/
+mv val/ILSVRC2012_val_00024344.JPEG n04550184/
+mv val/ILSVRC2012_val_00024345.JPEG n02097474/
+mv val/ILSVRC2012_val_00024346.JPEG n02793495/
+mv val/ILSVRC2012_val_00024347.JPEG n04252225/
+mv val/ILSVRC2012_val_00024348.JPEG n03388549/
+mv val/ILSVRC2012_val_00024349.JPEG n02422106/
+mv val/ILSVRC2012_val_00024350.JPEG n02807133/
+mv val/ILSVRC2012_val_00024351.JPEG n02090622/
+mv val/ILSVRC2012_val_00024352.JPEG n03598930/
+mv val/ILSVRC2012_val_00024353.JPEG n01592084/
+mv val/ILSVRC2012_val_00024354.JPEG n01924916/
+mv val/ILSVRC2012_val_00024355.JPEG n07584110/
+mv val/ILSVRC2012_val_00024356.JPEG n02114712/
+mv val/ILSVRC2012_val_00024357.JPEG n03874599/
+mv val/ILSVRC2012_val_00024358.JPEG n03590841/
+mv val/ILSVRC2012_val_00024359.JPEG n09246464/
+mv val/ILSVRC2012_val_00024360.JPEG n04589890/
+mv val/ILSVRC2012_val_00024361.JPEG n03794056/
+mv val/ILSVRC2012_val_00024362.JPEG n03180011/
+mv val/ILSVRC2012_val_00024363.JPEG n02104029/
+mv val/ILSVRC2012_val_00024364.JPEG n03272562/
+mv val/ILSVRC2012_val_00024365.JPEG n04263257/
+mv val/ILSVRC2012_val_00024366.JPEG n03874599/
+mv val/ILSVRC2012_val_00024367.JPEG n07714990/
+mv val/ILSVRC2012_val_00024368.JPEG n02791124/
+mv val/ILSVRC2012_val_00024369.JPEG n03690938/
+mv val/ILSVRC2012_val_00024370.JPEG n02837789/
+mv val/ILSVRC2012_val_00024371.JPEG n02138441/
+mv val/ILSVRC2012_val_00024372.JPEG n02859443/
+mv val/ILSVRC2012_val_00024373.JPEG n03026506/
+mv val/ILSVRC2012_val_00024374.JPEG n02442845/
+mv val/ILSVRC2012_val_00024375.JPEG n04004767/
+mv val/ILSVRC2012_val_00024376.JPEG n02397096/
+mv val/ILSVRC2012_val_00024377.JPEG n04120489/
+mv val/ILSVRC2012_val_00024378.JPEG n01882714/
+mv val/ILSVRC2012_val_00024379.JPEG n03124170/
+mv val/ILSVRC2012_val_00024380.JPEG n03992509/
+mv val/ILSVRC2012_val_00024381.JPEG n01818515/
+mv val/ILSVRC2012_val_00024382.JPEG n03124170/
+mv val/ILSVRC2012_val_00024383.JPEG n02002724/
+mv val/ILSVRC2012_val_00024384.JPEG n03680355/
+mv val/ILSVRC2012_val_00024385.JPEG n02096051/
+mv val/ILSVRC2012_val_00024386.JPEG n02492660/
+mv val/ILSVRC2012_val_00024387.JPEG n04033995/
+mv val/ILSVRC2012_val_00024388.JPEG n04019541/
+mv val/ILSVRC2012_val_00024389.JPEG n02108915/
+mv val/ILSVRC2012_val_00024390.JPEG n01872401/
+mv val/ILSVRC2012_val_00024391.JPEG n04366367/
+mv val/ILSVRC2012_val_00024392.JPEG n04501370/
+mv val/ILSVRC2012_val_00024393.JPEG n04355338/
+mv val/ILSVRC2012_val_00024394.JPEG n03661043/
+mv val/ILSVRC2012_val_00024395.JPEG n02536864/
+mv val/ILSVRC2012_val_00024396.JPEG n01796340/
+mv val/ILSVRC2012_val_00024397.JPEG n02326432/
+mv val/ILSVRC2012_val_00024398.JPEG n02493509/
+mv val/ILSVRC2012_val_00024399.JPEG n02099849/
+mv val/ILSVRC2012_val_00024400.JPEG n02096051/
+mv val/ILSVRC2012_val_00024401.JPEG n02974003/
+mv val/ILSVRC2012_val_00024402.JPEG n03481172/
+mv val/ILSVRC2012_val_00024403.JPEG n03089624/
+mv val/ILSVRC2012_val_00024404.JPEG n01773157/
+mv val/ILSVRC2012_val_00024405.JPEG n03445777/
+mv val/ILSVRC2012_val_00024406.JPEG n02138441/
+mv val/ILSVRC2012_val_00024407.JPEG n07565083/
+mv val/ILSVRC2012_val_00024408.JPEG n03916031/
+mv val/ILSVRC2012_val_00024409.JPEG n02363005/
+mv val/ILSVRC2012_val_00024410.JPEG n01944390/
+mv val/ILSVRC2012_val_00024411.JPEG n02093754/
+mv val/ILSVRC2012_val_00024412.JPEG n04560804/
+mv val/ILSVRC2012_val_00024413.JPEG n12267677/
+mv val/ILSVRC2012_val_00024414.JPEG n03967562/
+mv val/ILSVRC2012_val_00024415.JPEG n07932039/
+mv val/ILSVRC2012_val_00024416.JPEG n03666591/
+mv val/ILSVRC2012_val_00024417.JPEG n02256656/
+mv val/ILSVRC2012_val_00024418.JPEG n03770439/
+mv val/ILSVRC2012_val_00024419.JPEG n04509417/
+mv val/ILSVRC2012_val_00024420.JPEG n03720891/
+mv val/ILSVRC2012_val_00024421.JPEG n07565083/
+mv val/ILSVRC2012_val_00024422.JPEG n07875152/
+mv val/ILSVRC2012_val_00024423.JPEG n01843383/
+mv val/ILSVRC2012_val_00024424.JPEG n03481172/
+mv val/ILSVRC2012_val_00024425.JPEG n02708093/
+mv val/ILSVRC2012_val_00024426.JPEG n02165105/
+mv val/ILSVRC2012_val_00024427.JPEG n02123394/
+mv val/ILSVRC2012_val_00024428.JPEG n01644900/
+mv val/ILSVRC2012_val_00024429.JPEG n02109961/
+mv val/ILSVRC2012_val_00024430.JPEG n04335435/
+mv val/ILSVRC2012_val_00024431.JPEG n02096177/
+mv val/ILSVRC2012_val_00024432.JPEG n02110185/
+mv val/ILSVRC2012_val_00024433.JPEG n02687172/
+mv val/ILSVRC2012_val_00024434.JPEG n04116512/
+mv val/ILSVRC2012_val_00024435.JPEG n01693334/
+mv val/ILSVRC2012_val_00024436.JPEG n03133878/
+mv val/ILSVRC2012_val_00024437.JPEG n02493793/
+mv val/ILSVRC2012_val_00024438.JPEG n01806143/
+mv val/ILSVRC2012_val_00024439.JPEG n07892512/
+mv val/ILSVRC2012_val_00024440.JPEG n03670208/
+mv val/ILSVRC2012_val_00024441.JPEG n04264628/
+mv val/ILSVRC2012_val_00024442.JPEG n03014705/
+mv val/ILSVRC2012_val_00024443.JPEG n07615774/
+mv val/ILSVRC2012_val_00024444.JPEG n02992211/
+mv val/ILSVRC2012_val_00024445.JPEG n03063599/
+mv val/ILSVRC2012_val_00024446.JPEG n04209239/
+mv val/ILSVRC2012_val_00024447.JPEG n02489166/
+mv val/ILSVRC2012_val_00024448.JPEG n07920052/
+mv val/ILSVRC2012_val_00024449.JPEG n04081281/
+mv val/ILSVRC2012_val_00024450.JPEG n04486054/
+mv val/ILSVRC2012_val_00024451.JPEG n02783161/
+mv val/ILSVRC2012_val_00024452.JPEG n03594734/
+mv val/ILSVRC2012_val_00024453.JPEG n03016953/
+mv val/ILSVRC2012_val_00024454.JPEG n02834397/
+mv val/ILSVRC2012_val_00024455.JPEG n04409515/
+mv val/ILSVRC2012_val_00024456.JPEG n03544143/
+mv val/ILSVRC2012_val_00024457.JPEG n01924916/
+mv val/ILSVRC2012_val_00024458.JPEG n02174001/
+mv val/ILSVRC2012_val_00024459.JPEG n04599235/
+mv val/ILSVRC2012_val_00024460.JPEG n07754684/
+mv val/ILSVRC2012_val_00024461.JPEG n07753275/
+mv val/ILSVRC2012_val_00024462.JPEG n02112706/
+mv val/ILSVRC2012_val_00024463.JPEG n03197337/
+mv val/ILSVRC2012_val_00024464.JPEG n02095570/
+mv val/ILSVRC2012_val_00024465.JPEG n02120079/
+mv val/ILSVRC2012_val_00024466.JPEG n03804744/
+mv val/ILSVRC2012_val_00024467.JPEG n01820546/
+mv val/ILSVRC2012_val_00024468.JPEG n02099849/
+mv val/ILSVRC2012_val_00024469.JPEG n04004767/
+mv val/ILSVRC2012_val_00024470.JPEG n02092339/
+mv val/ILSVRC2012_val_00024471.JPEG n03983396/
+mv val/ILSVRC2012_val_00024472.JPEG n01749939/
+mv val/ILSVRC2012_val_00024473.JPEG n04162706/
+mv val/ILSVRC2012_val_00024474.JPEG n04264628/
+mv val/ILSVRC2012_val_00024475.JPEG n03598930/
+mv val/ILSVRC2012_val_00024476.JPEG n02098286/
+mv val/ILSVRC2012_val_00024477.JPEG n07892512/
+mv val/ILSVRC2012_val_00024478.JPEG n03929660/
+mv val/ILSVRC2012_val_00024479.JPEG n04209133/
+mv val/ILSVRC2012_val_00024480.JPEG n03000684/
+mv val/ILSVRC2012_val_00024481.JPEG n04589890/
+mv val/ILSVRC2012_val_00024482.JPEG n02963159/
+mv val/ILSVRC2012_val_00024483.JPEG n02206856/
+mv val/ILSVRC2012_val_00024484.JPEG n03970156/
+mv val/ILSVRC2012_val_00024485.JPEG n04418357/
+mv val/ILSVRC2012_val_00024486.JPEG n02090379/
+mv val/ILSVRC2012_val_00024487.JPEG n03785016/
+mv val/ILSVRC2012_val_00024488.JPEG n02488291/
+mv val/ILSVRC2012_val_00024489.JPEG n04501370/
+mv val/ILSVRC2012_val_00024490.JPEG n04118538/
+mv val/ILSVRC2012_val_00024491.JPEG n04311174/
+mv val/ILSVRC2012_val_00024492.JPEG n03838899/
+mv val/ILSVRC2012_val_00024493.JPEG n02906734/
+mv val/ILSVRC2012_val_00024494.JPEG n01665541/
+mv val/ILSVRC2012_val_00024495.JPEG n03188531/
+mv val/ILSVRC2012_val_00024496.JPEG n03642806/
+mv val/ILSVRC2012_val_00024497.JPEG n03220513/
+mv val/ILSVRC2012_val_00024498.JPEG n02105855/
+mv val/ILSVRC2012_val_00024499.JPEG n03642806/
+mv val/ILSVRC2012_val_00024500.JPEG n02123394/
+mv val/ILSVRC2012_val_00024501.JPEG n02457408/
+mv val/ILSVRC2012_val_00024502.JPEG n03208938/
+mv val/ILSVRC2012_val_00024503.JPEG n04536866/
+mv val/ILSVRC2012_val_00024504.JPEG n02056570/
+mv val/ILSVRC2012_val_00024505.JPEG n02088466/
+mv val/ILSVRC2012_val_00024506.JPEG n04019541/
+mv val/ILSVRC2012_val_00024507.JPEG n02165456/
+mv val/ILSVRC2012_val_00024508.JPEG n02097209/
+mv val/ILSVRC2012_val_00024509.JPEG n02108000/
+mv val/ILSVRC2012_val_00024510.JPEG n04536866/
+mv val/ILSVRC2012_val_00024511.JPEG n02777292/
+mv val/ILSVRC2012_val_00024512.JPEG n02939185/
+mv val/ILSVRC2012_val_00024513.JPEG n04366367/
+mv val/ILSVRC2012_val_00024514.JPEG n01616318/
+mv val/ILSVRC2012_val_00024515.JPEG n03337140/
+mv val/ILSVRC2012_val_00024516.JPEG n04229816/
+mv val/ILSVRC2012_val_00024517.JPEG n03792782/
+mv val/ILSVRC2012_val_00024518.JPEG n07831146/
+mv val/ILSVRC2012_val_00024519.JPEG n03903868/
+mv val/ILSVRC2012_val_00024520.JPEG n03041632/
+mv val/ILSVRC2012_val_00024521.JPEG n02089867/
+mv val/ILSVRC2012_val_00024522.JPEG n07695742/
+mv val/ILSVRC2012_val_00024523.JPEG n03534580/
+mv val/ILSVRC2012_val_00024524.JPEG n03271574/
+mv val/ILSVRC2012_val_00024525.JPEG n01843383/
+mv val/ILSVRC2012_val_00024526.JPEG n07836838/
+mv val/ILSVRC2012_val_00024527.JPEG n02279972/
+mv val/ILSVRC2012_val_00024528.JPEG n07584110/
+mv val/ILSVRC2012_val_00024529.JPEG n02119789/
+mv val/ILSVRC2012_val_00024530.JPEG n01843065/
+mv val/ILSVRC2012_val_00024531.JPEG n02206856/
+mv val/ILSVRC2012_val_00024532.JPEG n03042490/
+mv val/ILSVRC2012_val_00024533.JPEG n02104029/
+mv val/ILSVRC2012_val_00024534.JPEG n04447861/
+mv val/ILSVRC2012_val_00024535.JPEG n03814906/
+mv val/ILSVRC2012_val_00024536.JPEG n02280649/
+mv val/ILSVRC2012_val_00024537.JPEG n03494278/
+mv val/ILSVRC2012_val_00024538.JPEG n02256656/
+mv val/ILSVRC2012_val_00024539.JPEG n02909870/
+mv val/ILSVRC2012_val_00024540.JPEG n03602883/
+mv val/ILSVRC2012_val_00024541.JPEG n01748264/
+mv val/ILSVRC2012_val_00024542.JPEG n02093428/
+mv val/ILSVRC2012_val_00024543.JPEG n03841143/
+mv val/ILSVRC2012_val_00024544.JPEG n03710193/
+mv val/ILSVRC2012_val_00024545.JPEG n01675722/
+mv val/ILSVRC2012_val_00024546.JPEG n02395406/
+mv val/ILSVRC2012_val_00024547.JPEG n03250847/
+mv val/ILSVRC2012_val_00024548.JPEG n02397096/
+mv val/ILSVRC2012_val_00024549.JPEG n12267677/
+mv val/ILSVRC2012_val_00024550.JPEG n03770679/
+mv val/ILSVRC2012_val_00024551.JPEG n02007558/
+mv val/ILSVRC2012_val_00024552.JPEG n03642806/
+mv val/ILSVRC2012_val_00024553.JPEG n07871810/
+mv val/ILSVRC2012_val_00024554.JPEG n03742115/
+mv val/ILSVRC2012_val_00024555.JPEG n02190166/
+mv val/ILSVRC2012_val_00024556.JPEG n07716358/
+mv val/ILSVRC2012_val_00024557.JPEG n01978455/
+mv val/ILSVRC2012_val_00024558.JPEG n02169497/
+mv val/ILSVRC2012_val_00024559.JPEG n04204347/
+mv val/ILSVRC2012_val_00024560.JPEG n03417042/
+mv val/ILSVRC2012_val_00024561.JPEG n02793495/
+mv val/ILSVRC2012_val_00024562.JPEG n03530642/
+mv val/ILSVRC2012_val_00024563.JPEG n03188531/
+mv val/ILSVRC2012_val_00024564.JPEG n02105505/
+mv val/ILSVRC2012_val_00024565.JPEG n02804414/
+mv val/ILSVRC2012_val_00024566.JPEG n02093754/
+mv val/ILSVRC2012_val_00024567.JPEG n02092339/
+mv val/ILSVRC2012_val_00024568.JPEG n02860847/
+mv val/ILSVRC2012_val_00024569.JPEG n02085936/
+mv val/ILSVRC2012_val_00024570.JPEG n02786058/
+mv val/ILSVRC2012_val_00024571.JPEG n02056570/
+mv val/ILSVRC2012_val_00024572.JPEG n02165456/
+mv val/ILSVRC2012_val_00024573.JPEG n03710637/
+mv val/ILSVRC2012_val_00024574.JPEG n04200800/
+mv val/ILSVRC2012_val_00024575.JPEG n04592741/
+mv val/ILSVRC2012_val_00024576.JPEG n03935335/
+mv val/ILSVRC2012_val_00024577.JPEG n02102973/
+mv val/ILSVRC2012_val_00024578.JPEG n04296562/
+mv val/ILSVRC2012_val_00024579.JPEG n04328186/
+mv val/ILSVRC2012_val_00024580.JPEG n12267677/
+mv val/ILSVRC2012_val_00024581.JPEG n01824575/
+mv val/ILSVRC2012_val_00024582.JPEG n02494079/
+mv val/ILSVRC2012_val_00024583.JPEG n02730930/
+mv val/ILSVRC2012_val_00024584.JPEG n02356798/
+mv val/ILSVRC2012_val_00024585.JPEG n03937543/
+mv val/ILSVRC2012_val_00024586.JPEG n03290653/
+mv val/ILSVRC2012_val_00024587.JPEG n02109047/
+mv val/ILSVRC2012_val_00024588.JPEG n02112137/
+mv val/ILSVRC2012_val_00024589.JPEG n02104365/
+mv val/ILSVRC2012_val_00024590.JPEG n02085620/
+mv val/ILSVRC2012_val_00024591.JPEG n09246464/
+mv val/ILSVRC2012_val_00024592.JPEG n01817953/
+mv val/ILSVRC2012_val_00024593.JPEG n03345487/
+mv val/ILSVRC2012_val_00024594.JPEG n02410509/
+mv val/ILSVRC2012_val_00024595.JPEG n02281787/
+mv val/ILSVRC2012_val_00024596.JPEG n04487081/
+mv val/ILSVRC2012_val_00024597.JPEG n01770393/
+mv val/ILSVRC2012_val_00024598.JPEG n03814906/
+mv val/ILSVRC2012_val_00024599.JPEG n01728920/
+mv val/ILSVRC2012_val_00024600.JPEG n02481823/
+mv val/ILSVRC2012_val_00024601.JPEG n01768244/
+mv val/ILSVRC2012_val_00024602.JPEG n03891251/
+mv val/ILSVRC2012_val_00024603.JPEG n04111531/
+mv val/ILSVRC2012_val_00024604.JPEG n03347037/
+mv val/ILSVRC2012_val_00024605.JPEG n03929660/
+mv val/ILSVRC2012_val_00024606.JPEG n02951585/
+mv val/ILSVRC2012_val_00024607.JPEG n02840245/
+mv val/ILSVRC2012_val_00024608.JPEG n02489166/
+mv val/ILSVRC2012_val_00024609.JPEG n01756291/
+mv val/ILSVRC2012_val_00024610.JPEG n02669723/
+mv val/ILSVRC2012_val_00024611.JPEG n07583066/
+mv val/ILSVRC2012_val_00024612.JPEG n02268443/
+mv val/ILSVRC2012_val_00024613.JPEG n04552348/
+mv val/ILSVRC2012_val_00024614.JPEG n04263257/
+mv val/ILSVRC2012_val_00024615.JPEG n04371774/
+mv val/ILSVRC2012_val_00024616.JPEG n03379051/
+mv val/ILSVRC2012_val_00024617.JPEG n04355338/
+mv val/ILSVRC2012_val_00024618.JPEG n04355933/
+mv val/ILSVRC2012_val_00024619.JPEG n04118538/
+mv val/ILSVRC2012_val_00024620.JPEG n04099969/
+mv val/ILSVRC2012_val_00024621.JPEG n04507155/
+mv val/ILSVRC2012_val_00024622.JPEG n02480495/
+mv val/ILSVRC2012_val_00024623.JPEG n03814639/
+mv val/ILSVRC2012_val_00024624.JPEG n02105855/
+mv val/ILSVRC2012_val_00024625.JPEG n02487347/
+mv val/ILSVRC2012_val_00024626.JPEG n04553703/
+mv val/ILSVRC2012_val_00024627.JPEG n04310018/
+mv val/ILSVRC2012_val_00024628.JPEG n03895866/
+mv val/ILSVRC2012_val_00024629.JPEG n03000247/
+mv val/ILSVRC2012_val_00024630.JPEG n01796340/
+mv val/ILSVRC2012_val_00024631.JPEG n03903868/
+mv val/ILSVRC2012_val_00024632.JPEG n03903868/
+mv val/ILSVRC2012_val_00024633.JPEG n07583066/
+mv val/ILSVRC2012_val_00024634.JPEG n04192698/
+mv val/ILSVRC2012_val_00024635.JPEG n02018795/
+mv val/ILSVRC2012_val_00024636.JPEG n02096177/
+mv val/ILSVRC2012_val_00024637.JPEG n02098286/
+mv val/ILSVRC2012_val_00024638.JPEG n03970156/
+mv val/ILSVRC2012_val_00024639.JPEG n03733281/
+mv val/ILSVRC2012_val_00024640.JPEG n07614500/
+mv val/ILSVRC2012_val_00024641.JPEG n03388043/
+mv val/ILSVRC2012_val_00024642.JPEG n02110958/
+mv val/ILSVRC2012_val_00024643.JPEG n01601694/
+mv val/ILSVRC2012_val_00024644.JPEG n07715103/
+mv val/ILSVRC2012_val_00024645.JPEG n02127052/
+mv val/ILSVRC2012_val_00024646.JPEG n02325366/
+mv val/ILSVRC2012_val_00024647.JPEG n03673027/
+mv val/ILSVRC2012_val_00024648.JPEG n02950826/
+mv val/ILSVRC2012_val_00024649.JPEG n02091467/
+mv val/ILSVRC2012_val_00024650.JPEG n03110669/
+mv val/ILSVRC2012_val_00024651.JPEG n03840681/
+mv val/ILSVRC2012_val_00024652.JPEG n03680355/
+mv val/ILSVRC2012_val_00024653.JPEG n02441942/
+mv val/ILSVRC2012_val_00024654.JPEG n03485407/
+mv val/ILSVRC2012_val_00024655.JPEG n02097474/
+mv val/ILSVRC2012_val_00024656.JPEG n02398521/
+mv val/ILSVRC2012_val_00024657.JPEG n02776631/
+mv val/ILSVRC2012_val_00024658.JPEG n02701002/
+mv val/ILSVRC2012_val_00024659.JPEG n02325366/
+mv val/ILSVRC2012_val_00024660.JPEG n03388043/
+mv val/ILSVRC2012_val_00024661.JPEG n07873807/
+mv val/ILSVRC2012_val_00024662.JPEG n03763968/
+mv val/ILSVRC2012_val_00024663.JPEG n04515003/
+mv val/ILSVRC2012_val_00024664.JPEG n02094258/
+mv val/ILSVRC2012_val_00024665.JPEG n02422699/
+mv val/ILSVRC2012_val_00024666.JPEG n01667114/
+mv val/ILSVRC2012_val_00024667.JPEG n04263257/
+mv val/ILSVRC2012_val_00024668.JPEG n07590611/
+mv val/ILSVRC2012_val_00024669.JPEG n02110185/
+mv val/ILSVRC2012_val_00024670.JPEG n03899768/
+mv val/ILSVRC2012_val_00024671.JPEG n03877845/
+mv val/ILSVRC2012_val_00024672.JPEG n03197337/
+mv val/ILSVRC2012_val_00024673.JPEG n12144580/
+mv val/ILSVRC2012_val_00024674.JPEG n04152593/
+mv val/ILSVRC2012_val_00024675.JPEG n02108089/
+mv val/ILSVRC2012_val_00024676.JPEG n02493793/
+mv val/ILSVRC2012_val_00024677.JPEG n02105855/
+mv val/ILSVRC2012_val_00024678.JPEG n03481172/
+mv val/ILSVRC2012_val_00024679.JPEG n04228054/
+mv val/ILSVRC2012_val_00024680.JPEG n03899768/
+mv val/ILSVRC2012_val_00024681.JPEG n02093754/
+mv val/ILSVRC2012_val_00024682.JPEG n01737021/
+mv val/ILSVRC2012_val_00024683.JPEG n02415577/
+mv val/ILSVRC2012_val_00024684.JPEG n01685808/
+mv val/ILSVRC2012_val_00024685.JPEG n01773157/
+mv val/ILSVRC2012_val_00024686.JPEG n02101388/
+mv val/ILSVRC2012_val_00024687.JPEG n03710721/
+mv val/ILSVRC2012_val_00024688.JPEG n01873310/
+mv val/ILSVRC2012_val_00024689.JPEG n03627232/
+mv val/ILSVRC2012_val_00024690.JPEG n02708093/
+mv val/ILSVRC2012_val_00024691.JPEG n02102318/
+mv val/ILSVRC2012_val_00024692.JPEG n07747607/
+mv val/ILSVRC2012_val_00024693.JPEG n02791124/
+mv val/ILSVRC2012_val_00024694.JPEG n02870880/
+mv val/ILSVRC2012_val_00024695.JPEG n03388549/
+mv val/ILSVRC2012_val_00024696.JPEG n04372370/
+mv val/ILSVRC2012_val_00024697.JPEG n03775071/
+mv val/ILSVRC2012_val_00024698.JPEG n04347754/
+mv val/ILSVRC2012_val_00024699.JPEG n03026506/
+mv val/ILSVRC2012_val_00024700.JPEG n07720875/
+mv val/ILSVRC2012_val_00024701.JPEG n01883070/
+mv val/ILSVRC2012_val_00024702.JPEG n03690938/
+mv val/ILSVRC2012_val_00024703.JPEG n03776460/
+mv val/ILSVRC2012_val_00024704.JPEG n01558993/
+mv val/ILSVRC2012_val_00024705.JPEG n04552348/
+mv val/ILSVRC2012_val_00024706.JPEG n03457902/
+mv val/ILSVRC2012_val_00024707.JPEG n07768694/
+mv val/ILSVRC2012_val_00024708.JPEG n04356056/
+mv val/ILSVRC2012_val_00024709.JPEG n04485082/
+mv val/ILSVRC2012_val_00024710.JPEG n09288635/
+mv val/ILSVRC2012_val_00024711.JPEG n07760859/
+mv val/ILSVRC2012_val_00024712.JPEG n03991062/
+mv val/ILSVRC2012_val_00024713.JPEG n04136333/
+mv val/ILSVRC2012_val_00024714.JPEG n03938244/
+mv val/ILSVRC2012_val_00024715.JPEG n02102177/
+mv val/ILSVRC2012_val_00024716.JPEG n03991062/
+mv val/ILSVRC2012_val_00024717.JPEG n04550184/
+mv val/ILSVRC2012_val_00024718.JPEG n04127249/
+mv val/ILSVRC2012_val_00024719.JPEG n01498041/
+mv val/ILSVRC2012_val_00024720.JPEG n03691459/
+mv val/ILSVRC2012_val_00024721.JPEG n03255030/
+mv val/ILSVRC2012_val_00024722.JPEG n02417914/
+mv val/ILSVRC2012_val_00024723.JPEG n02099429/
+mv val/ILSVRC2012_val_00024724.JPEG n04254777/
+mv val/ILSVRC2012_val_00024725.JPEG n04277352/
+mv val/ILSVRC2012_val_00024726.JPEG n01855032/
+mv val/ILSVRC2012_val_00024727.JPEG n01983481/
+mv val/ILSVRC2012_val_00024728.JPEG n04604644/
+mv val/ILSVRC2012_val_00024729.JPEG n02102973/
+mv val/ILSVRC2012_val_00024730.JPEG n02790996/
+mv val/ILSVRC2012_val_00024731.JPEG n02094258/
+mv val/ILSVRC2012_val_00024732.JPEG n02489166/
+mv val/ILSVRC2012_val_00024733.JPEG n03887697/
+mv val/ILSVRC2012_val_00024734.JPEG n02443114/
+mv val/ILSVRC2012_val_00024735.JPEG n04228054/
+mv val/ILSVRC2012_val_00024736.JPEG n01667778/
+mv val/ILSVRC2012_val_00024737.JPEG n02172182/
+mv val/ILSVRC2012_val_00024738.JPEG n04133789/
+mv val/ILSVRC2012_val_00024739.JPEG n03196217/
+mv val/ILSVRC2012_val_00024740.JPEG n02018207/
+mv val/ILSVRC2012_val_00024741.JPEG n03124170/
+mv val/ILSVRC2012_val_00024742.JPEG n02841315/
+mv val/ILSVRC2012_val_00024743.JPEG n02174001/
+mv val/ILSVRC2012_val_00024744.JPEG n02138441/
+mv val/ILSVRC2012_val_00024745.JPEG n02364673/
+mv val/ILSVRC2012_val_00024746.JPEG n03874599/
+mv val/ILSVRC2012_val_00024747.JPEG n02690373/
+mv val/ILSVRC2012_val_00024748.JPEG n12267677/
+mv val/ILSVRC2012_val_00024749.JPEG n02071294/
+mv val/ILSVRC2012_val_00024750.JPEG n02396427/
+mv val/ILSVRC2012_val_00024751.JPEG n02100236/
+mv val/ILSVRC2012_val_00024752.JPEG n04125021/
+mv val/ILSVRC2012_val_00024753.JPEG n01704323/
+mv val/ILSVRC2012_val_00024754.JPEG n02281406/
+mv val/ILSVRC2012_val_00024755.JPEG n02226429/
+mv val/ILSVRC2012_val_00024756.JPEG n02097298/
+mv val/ILSVRC2012_val_00024757.JPEG n02787622/
+mv val/ILSVRC2012_val_00024758.JPEG n02086910/
+mv val/ILSVRC2012_val_00024759.JPEG n02415577/
+mv val/ILSVRC2012_val_00024760.JPEG n02123597/
+mv val/ILSVRC2012_val_00024761.JPEG n03977966/
+mv val/ILSVRC2012_val_00024762.JPEG n03743016/
+mv val/ILSVRC2012_val_00024763.JPEG n02951585/
+mv val/ILSVRC2012_val_00024764.JPEG n04548280/
+mv val/ILSVRC2012_val_00024765.JPEG n03216828/
+mv val/ILSVRC2012_val_00024766.JPEG n02096437/
+mv val/ILSVRC2012_val_00024767.JPEG n02233338/
+mv val/ILSVRC2012_val_00024768.JPEG n02536864/
+mv val/ILSVRC2012_val_00024769.JPEG n01773157/
+mv val/ILSVRC2012_val_00024770.JPEG n03657121/
+mv val/ILSVRC2012_val_00024771.JPEG n02883205/
+mv val/ILSVRC2012_val_00024772.JPEG n03777754/
+mv val/ILSVRC2012_val_00024773.JPEG n01843065/
+mv val/ILSVRC2012_val_00024774.JPEG n15075141/
+mv val/ILSVRC2012_val_00024775.JPEG n04462240/
+mv val/ILSVRC2012_val_00024776.JPEG n02086240/
+mv val/ILSVRC2012_val_00024777.JPEG n03832673/
+mv val/ILSVRC2012_val_00024778.JPEG n04026417/
+mv val/ILSVRC2012_val_00024779.JPEG n04346328/
+mv val/ILSVRC2012_val_00024780.JPEG n02808440/
+mv val/ILSVRC2012_val_00024781.JPEG n04152593/
+mv val/ILSVRC2012_val_00024782.JPEG n03017168/
+mv val/ILSVRC2012_val_00024783.JPEG n03710193/
+mv val/ILSVRC2012_val_00024784.JPEG n02110341/
+mv val/ILSVRC2012_val_00024785.JPEG n02111500/
+mv val/ILSVRC2012_val_00024786.JPEG n02117135/
+mv val/ILSVRC2012_val_00024787.JPEG n02018207/
+mv val/ILSVRC2012_val_00024788.JPEG n03769881/
+mv val/ILSVRC2012_val_00024789.JPEG n02087394/
+mv val/ILSVRC2012_val_00024790.JPEG n04286575/
+mv val/ILSVRC2012_val_00024791.JPEG n02105855/
+mv val/ILSVRC2012_val_00024792.JPEG n03218198/
+mv val/ILSVRC2012_val_00024793.JPEG n04509417/
+mv val/ILSVRC2012_val_00024794.JPEG n02749479/
+mv val/ILSVRC2012_val_00024795.JPEG n01756291/
+mv val/ILSVRC2012_val_00024796.JPEG n03584254/
+mv val/ILSVRC2012_val_00024797.JPEG n07613480/
+mv val/ILSVRC2012_val_00024798.JPEG n02437312/
+mv val/ILSVRC2012_val_00024799.JPEG n04458633/
+mv val/ILSVRC2012_val_00024800.JPEG n01518878/
+mv val/ILSVRC2012_val_00024801.JPEG n01677366/
+mv val/ILSVRC2012_val_00024802.JPEG n02797295/
+mv val/ILSVRC2012_val_00024803.JPEG n07717410/
+mv val/ILSVRC2012_val_00024804.JPEG n03775071/
+mv val/ILSVRC2012_val_00024805.JPEG n04209133/
+mv val/ILSVRC2012_val_00024806.JPEG n03425413/
+mv val/ILSVRC2012_val_00024807.JPEG n04347754/
+mv val/ILSVRC2012_val_00024808.JPEG n02028035/
+mv val/ILSVRC2012_val_00024809.JPEG n02085936/
+mv val/ILSVRC2012_val_00024810.JPEG n04317175/
+mv val/ILSVRC2012_val_00024811.JPEG n04310018/
+mv val/ILSVRC2012_val_00024812.JPEG n13044778/
+mv val/ILSVRC2012_val_00024813.JPEG n01693334/
+mv val/ILSVRC2012_val_00024814.JPEG n03047690/
+mv val/ILSVRC2012_val_00024815.JPEG n03983396/
+mv val/ILSVRC2012_val_00024816.JPEG n02268443/
+mv val/ILSVRC2012_val_00024817.JPEG n04442312/
+mv val/ILSVRC2012_val_00024818.JPEG n02109961/
+mv val/ILSVRC2012_val_00024819.JPEG n04019541/
+mv val/ILSVRC2012_val_00024820.JPEG n04335435/
+mv val/ILSVRC2012_val_00024821.JPEG n07932039/
+mv val/ILSVRC2012_val_00024822.JPEG n03743016/
+mv val/ILSVRC2012_val_00024823.JPEG n02268443/
+mv val/ILSVRC2012_val_00024824.JPEG n04523525/
+mv val/ILSVRC2012_val_00024825.JPEG n02134418/
+mv val/ILSVRC2012_val_00024826.JPEG n02860847/
+mv val/ILSVRC2012_val_00024827.JPEG n02096051/
+mv val/ILSVRC2012_val_00024828.JPEG n02817516/
+mv val/ILSVRC2012_val_00024829.JPEG n04238763/
+mv val/ILSVRC2012_val_00024830.JPEG n12620546/
+mv val/ILSVRC2012_val_00024831.JPEG n02092002/
+mv val/ILSVRC2012_val_00024832.JPEG n13037406/
+mv val/ILSVRC2012_val_00024833.JPEG n03000134/
+mv val/ILSVRC2012_val_00024834.JPEG n04228054/
+mv val/ILSVRC2012_val_00024835.JPEG n02002724/
+mv val/ILSVRC2012_val_00024836.JPEG n02086079/
+mv val/ILSVRC2012_val_00024837.JPEG n03394916/
+mv val/ILSVRC2012_val_00024838.JPEG n04265275/
+mv val/ILSVRC2012_val_00024839.JPEG n04136333/
+mv val/ILSVRC2012_val_00024840.JPEG n02481823/
+mv val/ILSVRC2012_val_00024841.JPEG n04041544/
+mv val/ILSVRC2012_val_00024842.JPEG n03272562/
+mv val/ILSVRC2012_val_00024843.JPEG n02999410/
+mv val/ILSVRC2012_val_00024844.JPEG n02488702/
+mv val/ILSVRC2012_val_00024845.JPEG n01824575/
+mv val/ILSVRC2012_val_00024846.JPEG n03967562/
+mv val/ILSVRC2012_val_00024847.JPEG n02730930/
+mv val/ILSVRC2012_val_00024848.JPEG n01843383/
+mv val/ILSVRC2012_val_00024849.JPEG n04604644/
+mv val/ILSVRC2012_val_00024850.JPEG n02177972/
+mv val/ILSVRC2012_val_00024851.JPEG n01744401/
+mv val/ILSVRC2012_val_00024852.JPEG n07860988/
+mv val/ILSVRC2012_val_00024853.JPEG n04153751/
+mv val/ILSVRC2012_val_00024854.JPEG n01491361/
+mv val/ILSVRC2012_val_00024855.JPEG n03297495/
+mv val/ILSVRC2012_val_00024856.JPEG n04346328/
+mv val/ILSVRC2012_val_00024857.JPEG n03956157/
+mv val/ILSVRC2012_val_00024858.JPEG n02325366/
+mv val/ILSVRC2012_val_00024859.JPEG n02974003/
+mv val/ILSVRC2012_val_00024860.JPEG n03733281/
+mv val/ILSVRC2012_val_00024861.JPEG n03899768/
+mv val/ILSVRC2012_val_00024862.JPEG n07717556/
+mv val/ILSVRC2012_val_00024863.JPEG n02114367/
+mv val/ILSVRC2012_val_00024864.JPEG n04366367/
+mv val/ILSVRC2012_val_00024865.JPEG n03400231/
+mv val/ILSVRC2012_val_00024866.JPEG n02808440/
+mv val/ILSVRC2012_val_00024867.JPEG n01968897/
+mv val/ILSVRC2012_val_00024868.JPEG n02259212/
+mv val/ILSVRC2012_val_00024869.JPEG n03642806/
+mv val/ILSVRC2012_val_00024870.JPEG n01955084/
+mv val/ILSVRC2012_val_00024871.JPEG n03776460/
+mv val/ILSVRC2012_val_00024872.JPEG n09835506/
+mv val/ILSVRC2012_val_00024873.JPEG n01775062/
+mv val/ILSVRC2012_val_00024874.JPEG n02979186/
+mv val/ILSVRC2012_val_00024875.JPEG n02093991/
+mv val/ILSVRC2012_val_00024876.JPEG n04263257/
+mv val/ILSVRC2012_val_00024877.JPEG n04485082/
+mv val/ILSVRC2012_val_00024878.JPEG n04482393/
+mv val/ILSVRC2012_val_00024879.JPEG n03179701/
+mv val/ILSVRC2012_val_00024880.JPEG n01739381/
+mv val/ILSVRC2012_val_00024881.JPEG n02088238/
+mv val/ILSVRC2012_val_00024882.JPEG n03991062/
+mv val/ILSVRC2012_val_00024883.JPEG n13040303/
+mv val/ILSVRC2012_val_00024884.JPEG n01534433/
+mv val/ILSVRC2012_val_00024885.JPEG n01978455/
+mv val/ILSVRC2012_val_00024886.JPEG n02480495/
+mv val/ILSVRC2012_val_00024887.JPEG n02086910/
+mv val/ILSVRC2012_val_00024888.JPEG n02097209/
+mv val/ILSVRC2012_val_00024889.JPEG n02096294/
+mv val/ILSVRC2012_val_00024890.JPEG n04209133/
+mv val/ILSVRC2012_val_00024891.JPEG n09428293/
+mv val/ILSVRC2012_val_00024892.JPEG n03018349/
+mv val/ILSVRC2012_val_00024893.JPEG n07871810/
+mv val/ILSVRC2012_val_00024894.JPEG n01986214/
+mv val/ILSVRC2012_val_00024895.JPEG n01491361/
+mv val/ILSVRC2012_val_00024896.JPEG n02106662/
+mv val/ILSVRC2012_val_00024897.JPEG n03028079/
+mv val/ILSVRC2012_val_00024898.JPEG n04179913/
+mv val/ILSVRC2012_val_00024899.JPEG n04264628/
+mv val/ILSVRC2012_val_00024900.JPEG n03450230/
+mv val/ILSVRC2012_val_00024901.JPEG n04376876/
+mv val/ILSVRC2012_val_00024902.JPEG n02129165/
+mv val/ILSVRC2012_val_00024903.JPEG n02127052/
+mv val/ILSVRC2012_val_00024904.JPEG n02111500/
+mv val/ILSVRC2012_val_00024905.JPEG n04254680/
+mv val/ILSVRC2012_val_00024906.JPEG n02951358/
+mv val/ILSVRC2012_val_00024907.JPEG n03854065/
+mv val/ILSVRC2012_val_00024908.JPEG n02488702/
+mv val/ILSVRC2012_val_00024909.JPEG n02834397/
+mv val/ILSVRC2012_val_00024910.JPEG n02128757/
+mv val/ILSVRC2012_val_00024911.JPEG n03075370/
+mv val/ILSVRC2012_val_00024912.JPEG n07583066/
+mv val/ILSVRC2012_val_00024913.JPEG n03047690/
+mv val/ILSVRC2012_val_00024914.JPEG n01829413/
+mv val/ILSVRC2012_val_00024915.JPEG n03124043/
+mv val/ILSVRC2012_val_00024916.JPEG n01843065/
+mv val/ILSVRC2012_val_00024917.JPEG n07697537/
+mv val/ILSVRC2012_val_00024918.JPEG n07734744/
+mv val/ILSVRC2012_val_00024919.JPEG n02834397/
+mv val/ILSVRC2012_val_00024920.JPEG n02814860/
+mv val/ILSVRC2012_val_00024921.JPEG n02481823/
+mv val/ILSVRC2012_val_00024922.JPEG n04356056/
+mv val/ILSVRC2012_val_00024923.JPEG n03124043/
+mv val/ILSVRC2012_val_00024924.JPEG n01990800/
+mv val/ILSVRC2012_val_00024925.JPEG n03291819/
+mv val/ILSVRC2012_val_00024926.JPEG n02487347/
+mv val/ILSVRC2012_val_00024927.JPEG n03658185/
+mv val/ILSVRC2012_val_00024928.JPEG n04404412/
+mv val/ILSVRC2012_val_00024929.JPEG n03791053/
+mv val/ILSVRC2012_val_00024930.JPEG n03866082/
+mv val/ILSVRC2012_val_00024931.JPEG n02930766/
+mv val/ILSVRC2012_val_00024932.JPEG n02074367/
+mv val/ILSVRC2012_val_00024933.JPEG n02777292/
+mv val/ILSVRC2012_val_00024934.JPEG n04458633/
+mv val/ILSVRC2012_val_00024935.JPEG n02098286/
+mv val/ILSVRC2012_val_00024936.JPEG n02843684/
+mv val/ILSVRC2012_val_00024937.JPEG n04592741/
+mv val/ILSVRC2012_val_00024938.JPEG n01641577/
+mv val/ILSVRC2012_val_00024939.JPEG n03529860/
+mv val/ILSVRC2012_val_00024940.JPEG n01484850/
+mv val/ILSVRC2012_val_00024941.JPEG n04141076/
+mv val/ILSVRC2012_val_00024942.JPEG n03485407/
+mv val/ILSVRC2012_val_00024943.JPEG n03590841/
+mv val/ILSVRC2012_val_00024944.JPEG n04037443/
+mv val/ILSVRC2012_val_00024945.JPEG n07613480/
+mv val/ILSVRC2012_val_00024946.JPEG n01688243/
+mv val/ILSVRC2012_val_00024947.JPEG n04074963/
+mv val/ILSVRC2012_val_00024948.JPEG n02701002/
+mv val/ILSVRC2012_val_00024949.JPEG n03535780/
+mv val/ILSVRC2012_val_00024950.JPEG n02090379/
+mv val/ILSVRC2012_val_00024951.JPEG n02111889/
+mv val/ILSVRC2012_val_00024952.JPEG n06874185/
+mv val/ILSVRC2012_val_00024953.JPEG n07693725/
+mv val/ILSVRC2012_val_00024954.JPEG n07802026/
+mv val/ILSVRC2012_val_00024955.JPEG n07754684/
+mv val/ILSVRC2012_val_00024956.JPEG n01774384/
+mv val/ILSVRC2012_val_00024957.JPEG n01514668/
+mv val/ILSVRC2012_val_00024958.JPEG n02028035/
+mv val/ILSVRC2012_val_00024959.JPEG n04423845/
+mv val/ILSVRC2012_val_00024960.JPEG n02096051/
+mv val/ILSVRC2012_val_00024961.JPEG n02115641/
+mv val/ILSVRC2012_val_00024962.JPEG n01774384/
+mv val/ILSVRC2012_val_00024963.JPEG n02894605/
+mv val/ILSVRC2012_val_00024964.JPEG n03026506/
+mv val/ILSVRC2012_val_00024965.JPEG n02666196/
+mv val/ILSVRC2012_val_00024966.JPEG n03690938/
+mv val/ILSVRC2012_val_00024967.JPEG n02112706/
+mv val/ILSVRC2012_val_00024968.JPEG n03787032/
+mv val/ILSVRC2012_val_00024969.JPEG n01748264/
+mv val/ILSVRC2012_val_00024970.JPEG n03733131/
+mv val/ILSVRC2012_val_00024971.JPEG n03920288/
+mv val/ILSVRC2012_val_00024972.JPEG n04141076/
+mv val/ILSVRC2012_val_00024973.JPEG n02101006/
+mv val/ILSVRC2012_val_00024974.JPEG n03944341/
+mv val/ILSVRC2012_val_00024975.JPEG n12267677/
+mv val/ILSVRC2012_val_00024976.JPEG n03782006/
+mv val/ILSVRC2012_val_00024977.JPEG n03924679/
+mv val/ILSVRC2012_val_00024978.JPEG n02437616/
+mv val/ILSVRC2012_val_00024979.JPEG n02992529/
+mv val/ILSVRC2012_val_00024980.JPEG n02871525/
+mv val/ILSVRC2012_val_00024981.JPEG n02104029/
+mv val/ILSVRC2012_val_00024982.JPEG n03376595/
+mv val/ILSVRC2012_val_00024983.JPEG n04243546/
+mv val/ILSVRC2012_val_00024984.JPEG n03854065/
+mv val/ILSVRC2012_val_00024985.JPEG n03983396/
+mv val/ILSVRC2012_val_00024986.JPEG n02104029/
+mv val/ILSVRC2012_val_00024987.JPEG n01883070/
+mv val/ILSVRC2012_val_00024988.JPEG n07716906/
+mv val/ILSVRC2012_val_00024989.JPEG n02092002/
+mv val/ILSVRC2012_val_00024990.JPEG n02114855/
+mv val/ILSVRC2012_val_00024991.JPEG n03255030/
+mv val/ILSVRC2012_val_00024992.JPEG n01873310/
+mv val/ILSVRC2012_val_00024993.JPEG n01704323/
+mv val/ILSVRC2012_val_00024994.JPEG n04192698/
+mv val/ILSVRC2012_val_00024995.JPEG n03485407/
+mv val/ILSVRC2012_val_00024996.JPEG n02916936/
+mv val/ILSVRC2012_val_00024997.JPEG n07590611/
+mv val/ILSVRC2012_val_00024998.JPEG n02869837/
+mv val/ILSVRC2012_val_00024999.JPEG n03527444/
+mv val/ILSVRC2012_val_00025000.JPEG n03595614/
+mv val/ILSVRC2012_val_00025001.JPEG n02105412/
+mv val/ILSVRC2012_val_00025002.JPEG n09835506/
+mv val/ILSVRC2012_val_00025003.JPEG n04033901/
+mv val/ILSVRC2012_val_00025004.JPEG n04285008/
+mv val/ILSVRC2012_val_00025005.JPEG n02326432/
+mv val/ILSVRC2012_val_00025006.JPEG n02104029/
+mv val/ILSVRC2012_val_00025007.JPEG n07716906/
+mv val/ILSVRC2012_val_00025008.JPEG n07760859/
+mv val/ILSVRC2012_val_00025009.JPEG n03832673/
+mv val/ILSVRC2012_val_00025010.JPEG n03492542/
+mv val/ILSVRC2012_val_00025011.JPEG n02408429/
+mv val/ILSVRC2012_val_00025012.JPEG n03781244/
+mv val/ILSVRC2012_val_00025013.JPEG n02099849/
+mv val/ILSVRC2012_val_00025014.JPEG n03840681/
+mv val/ILSVRC2012_val_00025015.JPEG n02092339/
+mv val/ILSVRC2012_val_00025016.JPEG n03590841/
+mv val/ILSVRC2012_val_00025017.JPEG n01685808/
+mv val/ILSVRC2012_val_00025018.JPEG n01694178/
+mv val/ILSVRC2012_val_00025019.JPEG n07753592/
+mv val/ILSVRC2012_val_00025020.JPEG n03535780/
+mv val/ILSVRC2012_val_00025021.JPEG n02730930/
+mv val/ILSVRC2012_val_00025022.JPEG n04270147/
+mv val/ILSVRC2012_val_00025023.JPEG n02011460/
+mv val/ILSVRC2012_val_00025024.JPEG n04483307/
+mv val/ILSVRC2012_val_00025025.JPEG n01688243/
+mv val/ILSVRC2012_val_00025026.JPEG n01737021/
+mv val/ILSVRC2012_val_00025027.JPEG n02033041/
+mv val/ILSVRC2012_val_00025028.JPEG n03100240/
+mv val/ILSVRC2012_val_00025029.JPEG n03447447/
+mv val/ILSVRC2012_val_00025030.JPEG n03584829/
+mv val/ILSVRC2012_val_00025031.JPEG n02483362/
+mv val/ILSVRC2012_val_00025032.JPEG n03998194/
+mv val/ILSVRC2012_val_00025033.JPEG n02483362/
+mv val/ILSVRC2012_val_00025034.JPEG n03481172/
+mv val/ILSVRC2012_val_00025035.JPEG n01558993/
+mv val/ILSVRC2012_val_00025036.JPEG n04606251/
+mv val/ILSVRC2012_val_00025037.JPEG n01537544/
+mv val/ILSVRC2012_val_00025038.JPEG n02808440/
+mv val/ILSVRC2012_val_00025039.JPEG n03825788/
+mv val/ILSVRC2012_val_00025040.JPEG n01773157/
+mv val/ILSVRC2012_val_00025041.JPEG n04507155/
+mv val/ILSVRC2012_val_00025042.JPEG n04141076/
+mv val/ILSVRC2012_val_00025043.JPEG n02504013/
+mv val/ILSVRC2012_val_00025044.JPEG n04562935/
+mv val/ILSVRC2012_val_00025045.JPEG n07590611/
+mv val/ILSVRC2012_val_00025046.JPEG n04357314/
+mv val/ILSVRC2012_val_00025047.JPEG n01608432/
+mv val/ILSVRC2012_val_00025048.JPEG n02097658/
+mv val/ILSVRC2012_val_00025049.JPEG n03950228/
+mv val/ILSVRC2012_val_00025050.JPEG n02814860/
+mv val/ILSVRC2012_val_00025051.JPEG n01498041/
+mv val/ILSVRC2012_val_00025052.JPEG n04553703/
+mv val/ILSVRC2012_val_00025053.JPEG n12768682/
+mv val/ILSVRC2012_val_00025054.JPEG n03032252/
+mv val/ILSVRC2012_val_00025055.JPEG n02097474/
+mv val/ILSVRC2012_val_00025056.JPEG n01955084/
+mv val/ILSVRC2012_val_00025057.JPEG n07695742/
+mv val/ILSVRC2012_val_00025058.JPEG n02483708/
+mv val/ILSVRC2012_val_00025059.JPEG n02106550/
+mv val/ILSVRC2012_val_00025060.JPEG n04515003/
+mv val/ILSVRC2012_val_00025061.JPEG n02226429/
+mv val/ILSVRC2012_val_00025062.JPEG n04370456/
+mv val/ILSVRC2012_val_00025063.JPEG n03000684/
+mv val/ILSVRC2012_val_00025064.JPEG n03837869/
+mv val/ILSVRC2012_val_00025065.JPEG n02113799/
+mv val/ILSVRC2012_val_00025066.JPEG n02102480/
+mv val/ILSVRC2012_val_00025067.JPEG n03459775/
+mv val/ILSVRC2012_val_00025068.JPEG n02120079/
+mv val/ILSVRC2012_val_00025069.JPEG n02071294/
+mv val/ILSVRC2012_val_00025070.JPEG n13054560/
+mv val/ILSVRC2012_val_00025071.JPEG n04192698/
+mv val/ILSVRC2012_val_00025072.JPEG n02504458/
+mv val/ILSVRC2012_val_00025073.JPEG n04372370/
+mv val/ILSVRC2012_val_00025074.JPEG n04251144/
+mv val/ILSVRC2012_val_00025075.JPEG n02006656/
+mv val/ILSVRC2012_val_00025076.JPEG n03908618/
+mv val/ILSVRC2012_val_00025077.JPEG n04311174/
+mv val/ILSVRC2012_val_00025078.JPEG n03018349/
+mv val/ILSVRC2012_val_00025079.JPEG n13133613/
+mv val/ILSVRC2012_val_00025080.JPEG n03796401/
+mv val/ILSVRC2012_val_00025081.JPEG n04409515/
+mv val/ILSVRC2012_val_00025082.JPEG n02102480/
+mv val/ILSVRC2012_val_00025083.JPEG n02843684/
+mv val/ILSVRC2012_val_00025084.JPEG n04040759/
+mv val/ILSVRC2012_val_00025085.JPEG n02086646/
+mv val/ILSVRC2012_val_00025086.JPEG n02948072/
+mv val/ILSVRC2012_val_00025087.JPEG n07836838/
+mv val/ILSVRC2012_val_00025088.JPEG n03476684/
+mv val/ILSVRC2012_val_00025089.JPEG n02236044/
+mv val/ILSVRC2012_val_00025090.JPEG n04296562/
+mv val/ILSVRC2012_val_00025091.JPEG n02017213/
+mv val/ILSVRC2012_val_00025092.JPEG n04612504/
+mv val/ILSVRC2012_val_00025093.JPEG n02769748/
+mv val/ILSVRC2012_val_00025094.JPEG n07717410/
+mv val/ILSVRC2012_val_00025095.JPEG n07717410/
+mv val/ILSVRC2012_val_00025096.JPEG n01751748/
+mv val/ILSVRC2012_val_00025097.JPEG n03773504/
+mv val/ILSVRC2012_val_00025098.JPEG n02085782/
+mv val/ILSVRC2012_val_00025099.JPEG n04562935/
+mv val/ILSVRC2012_val_00025100.JPEG n04239074/
+mv val/ILSVRC2012_val_00025101.JPEG n07760859/
+mv val/ILSVRC2012_val_00025102.JPEG n07768694/
+mv val/ILSVRC2012_val_00025103.JPEG n03160309/
+mv val/ILSVRC2012_val_00025104.JPEG n01692333/
+mv val/ILSVRC2012_val_00025105.JPEG n03045698/
+mv val/ILSVRC2012_val_00025106.JPEG n03272562/
+mv val/ILSVRC2012_val_00025107.JPEG n04417672/
+mv val/ILSVRC2012_val_00025108.JPEG n03954731/
+mv val/ILSVRC2012_val_00025109.JPEG n04505470/
+mv val/ILSVRC2012_val_00025110.JPEG n04154565/
+mv val/ILSVRC2012_val_00025111.JPEG n03691459/
+mv val/ILSVRC2012_val_00025112.JPEG n04209239/
+mv val/ILSVRC2012_val_00025113.JPEG n04409515/
+mv val/ILSVRC2012_val_00025114.JPEG n02363005/
+mv val/ILSVRC2012_val_00025115.JPEG n07734744/
+mv val/ILSVRC2012_val_00025116.JPEG n02422699/
+mv val/ILSVRC2012_val_00025117.JPEG n03529860/
+mv val/ILSVRC2012_val_00025118.JPEG n04235860/
+mv val/ILSVRC2012_val_00025119.JPEG n04536866/
+mv val/ILSVRC2012_val_00025120.JPEG n01981276/
+mv val/ILSVRC2012_val_00025121.JPEG n03888257/
+mv val/ILSVRC2012_val_00025122.JPEG n02276258/
+mv val/ILSVRC2012_val_00025123.JPEG n03388043/
+mv val/ILSVRC2012_val_00025124.JPEG n07718472/
+mv val/ILSVRC2012_val_00025125.JPEG n02869837/
+mv val/ILSVRC2012_val_00025126.JPEG n02006656/
+mv val/ILSVRC2012_val_00025127.JPEG n03595614/
+mv val/ILSVRC2012_val_00025128.JPEG n02917067/
+mv val/ILSVRC2012_val_00025129.JPEG n01440764/
+mv val/ILSVRC2012_val_00025130.JPEG n01855032/
+mv val/ILSVRC2012_val_00025131.JPEG n03930630/
+mv val/ILSVRC2012_val_00025132.JPEG n02105505/
+mv val/ILSVRC2012_val_00025133.JPEG n01491361/
+mv val/ILSVRC2012_val_00025134.JPEG n03345487/
+mv val/ILSVRC2012_val_00025135.JPEG n04372370/
+mv val/ILSVRC2012_val_00025136.JPEG n03187595/
+mv val/ILSVRC2012_val_00025137.JPEG n01491361/
+mv val/ILSVRC2012_val_00025138.JPEG n04264628/
+mv val/ILSVRC2012_val_00025139.JPEG n04557648/
+mv val/ILSVRC2012_val_00025140.JPEG n02119022/
+mv val/ILSVRC2012_val_00025141.JPEG n02607072/
+mv val/ILSVRC2012_val_00025142.JPEG n02396427/
+mv val/ILSVRC2012_val_00025143.JPEG n07615774/
+mv val/ILSVRC2012_val_00025144.JPEG n04553703/
+mv val/ILSVRC2012_val_00025145.JPEG n07718472/
+mv val/ILSVRC2012_val_00025146.JPEG n03530642/
+mv val/ILSVRC2012_val_00025147.JPEG n02100583/
+mv val/ILSVRC2012_val_00025148.JPEG n04557648/
+mv val/ILSVRC2012_val_00025149.JPEG n03485407/
+mv val/ILSVRC2012_val_00025150.JPEG n07745940/
+mv val/ILSVRC2012_val_00025151.JPEG n01531178/
+mv val/ILSVRC2012_val_00025152.JPEG n03954731/
+mv val/ILSVRC2012_val_00025153.JPEG n04465501/
+mv val/ILSVRC2012_val_00025154.JPEG n12768682/
+mv val/ILSVRC2012_val_00025155.JPEG n04486054/
+mv val/ILSVRC2012_val_00025156.JPEG n03595614/
+mv val/ILSVRC2012_val_00025157.JPEG n04548362/
+mv val/ILSVRC2012_val_00025158.JPEG n07753113/
+mv val/ILSVRC2012_val_00025159.JPEG n02701002/
+mv val/ILSVRC2012_val_00025160.JPEG n04525038/
+mv val/ILSVRC2012_val_00025161.JPEG n02317335/
+mv val/ILSVRC2012_val_00025162.JPEG n02443484/
+mv val/ILSVRC2012_val_00025163.JPEG n02939185/
+mv val/ILSVRC2012_val_00025164.JPEG n03314780/
+mv val/ILSVRC2012_val_00025165.JPEG n02089078/
+mv val/ILSVRC2012_val_00025166.JPEG n02859443/
+mv val/ILSVRC2012_val_00025167.JPEG n02091467/
+mv val/ILSVRC2012_val_00025168.JPEG n02124075/
+mv val/ILSVRC2012_val_00025169.JPEG n03690938/
+mv val/ILSVRC2012_val_00025170.JPEG n02091831/
+mv val/ILSVRC2012_val_00025171.JPEG n02454379/
+mv val/ILSVRC2012_val_00025172.JPEG n04065272/
+mv val/ILSVRC2012_val_00025173.JPEG n03196217/
+mv val/ILSVRC2012_val_00025174.JPEG n02655020/
+mv val/ILSVRC2012_val_00025175.JPEG n04487394/
+mv val/ILSVRC2012_val_00025176.JPEG n04286575/
+mv val/ILSVRC2012_val_00025177.JPEG n03125729/
+mv val/ILSVRC2012_val_00025178.JPEG n03854065/
+mv val/ILSVRC2012_val_00025179.JPEG n03670208/
+mv val/ILSVRC2012_val_00025180.JPEG n02108422/
+mv val/ILSVRC2012_val_00025181.JPEG n02102480/
+mv val/ILSVRC2012_val_00025182.JPEG n02988304/
+mv val/ILSVRC2012_val_00025183.JPEG n02009229/
+mv val/ILSVRC2012_val_00025184.JPEG n02099267/
+mv val/ILSVRC2012_val_00025185.JPEG n02097209/
+mv val/ILSVRC2012_val_00025186.JPEG n02948072/
+mv val/ILSVRC2012_val_00025187.JPEG n02110806/
+mv val/ILSVRC2012_val_00025188.JPEG n02177972/
+mv val/ILSVRC2012_val_00025189.JPEG n03494278/
+mv val/ILSVRC2012_val_00025190.JPEG n01737021/
+mv val/ILSVRC2012_val_00025191.JPEG n13133613/
+mv val/ILSVRC2012_val_00025192.JPEG n04447861/
+mv val/ILSVRC2012_val_00025193.JPEG n04591713/
+mv val/ILSVRC2012_val_00025194.JPEG n03495258/
+mv val/ILSVRC2012_val_00025195.JPEG n02859443/
+mv val/ILSVRC2012_val_00025196.JPEG n02860847/
+mv val/ILSVRC2012_val_00025197.JPEG n04554684/
+mv val/ILSVRC2012_val_00025198.JPEG n03637318/
+mv val/ILSVRC2012_val_00025199.JPEG n04258138/
+mv val/ILSVRC2012_val_00025200.JPEG n01797886/
+mv val/ILSVRC2012_val_00025201.JPEG n03095699/
+mv val/ILSVRC2012_val_00025202.JPEG n04041544/
+mv val/ILSVRC2012_val_00025203.JPEG n03602883/
+mv val/ILSVRC2012_val_00025204.JPEG n04525038/
+mv val/ILSVRC2012_val_00025205.JPEG n03706229/
+mv val/ILSVRC2012_val_00025206.JPEG n02093859/
+mv val/ILSVRC2012_val_00025207.JPEG n02119022/
+mv val/ILSVRC2012_val_00025208.JPEG n02454379/
+mv val/ILSVRC2012_val_00025209.JPEG n07614500/
+mv val/ILSVRC2012_val_00025210.JPEG n02276258/
+mv val/ILSVRC2012_val_00025211.JPEG n07714571/
+mv val/ILSVRC2012_val_00025212.JPEG n02177972/
+mv val/ILSVRC2012_val_00025213.JPEG n02129604/
+mv val/ILSVRC2012_val_00025214.JPEG n01601694/
+mv val/ILSVRC2012_val_00025215.JPEG n04355338/
+mv val/ILSVRC2012_val_00025216.JPEG n02999410/
+mv val/ILSVRC2012_val_00025217.JPEG n07760859/
+mv val/ILSVRC2012_val_00025218.JPEG n02165456/
+mv val/ILSVRC2012_val_00025219.JPEG n02111129/
+mv val/ILSVRC2012_val_00025220.JPEG n03220513/
+mv val/ILSVRC2012_val_00025221.JPEG n02437616/
+mv val/ILSVRC2012_val_00025222.JPEG n04465501/
+mv val/ILSVRC2012_val_00025223.JPEG n03272010/
+mv val/ILSVRC2012_val_00025224.JPEG n02167151/
+mv val/ILSVRC2012_val_00025225.JPEG n02174001/
+mv val/ILSVRC2012_val_00025226.JPEG n02607072/
+mv val/ILSVRC2012_val_00025227.JPEG n04254120/
+mv val/ILSVRC2012_val_00025228.JPEG n07584110/
+mv val/ILSVRC2012_val_00025229.JPEG n03388549/
+mv val/ILSVRC2012_val_00025230.JPEG n03063599/
+mv val/ILSVRC2012_val_00025231.JPEG n02795169/
+mv val/ILSVRC2012_val_00025232.JPEG n02727426/
+mv val/ILSVRC2012_val_00025233.JPEG n02799071/
+mv val/ILSVRC2012_val_00025234.JPEG n10565667/
+mv val/ILSVRC2012_val_00025235.JPEG n02454379/
+mv val/ILSVRC2012_val_00025236.JPEG n07717410/
+mv val/ILSVRC2012_val_00025237.JPEG n02504013/
+mv val/ILSVRC2012_val_00025238.JPEG n04266014/
+mv val/ILSVRC2012_val_00025239.JPEG n04493381/
+mv val/ILSVRC2012_val_00025240.JPEG n03832673/
+mv val/ILSVRC2012_val_00025241.JPEG n02033041/
+mv val/ILSVRC2012_val_00025242.JPEG n02447366/
+mv val/ILSVRC2012_val_00025243.JPEG n03314780/
+mv val/ILSVRC2012_val_00025244.JPEG n02930766/
+mv val/ILSVRC2012_val_00025245.JPEG n02110806/
+mv val/ILSVRC2012_val_00025246.JPEG n04033901/
+mv val/ILSVRC2012_val_00025247.JPEG n02870880/
+mv val/ILSVRC2012_val_00025248.JPEG n01872401/
+mv val/ILSVRC2012_val_00025249.JPEG n03063689/
+mv val/ILSVRC2012_val_00025250.JPEG n03814906/
+mv val/ILSVRC2012_val_00025251.JPEG n01798484/
+mv val/ILSVRC2012_val_00025252.JPEG n02219486/
+mv val/ILSVRC2012_val_00025253.JPEG n02111129/
+mv val/ILSVRC2012_val_00025254.JPEG n03124170/
+mv val/ILSVRC2012_val_00025255.JPEG n03443371/
+mv val/ILSVRC2012_val_00025256.JPEG n01855672/
+mv val/ILSVRC2012_val_00025257.JPEG n03089624/
+mv val/ILSVRC2012_val_00025258.JPEG n04239074/
+mv val/ILSVRC2012_val_00025259.JPEG n03814906/
+mv val/ILSVRC2012_val_00025260.JPEG n04285008/
+mv val/ILSVRC2012_val_00025261.JPEG n02097474/
+mv val/ILSVRC2012_val_00025262.JPEG n01819313/
+mv val/ILSVRC2012_val_00025263.JPEG n02364673/
+mv val/ILSVRC2012_val_00025264.JPEG n03773504/
+mv val/ILSVRC2012_val_00025265.JPEG n04310018/
+mv val/ILSVRC2012_val_00025266.JPEG n04398044/
+mv val/ILSVRC2012_val_00025267.JPEG n13054560/
+mv val/ILSVRC2012_val_00025268.JPEG n01665541/
+mv val/ILSVRC2012_val_00025269.JPEG n02025239/
+mv val/ILSVRC2012_val_00025270.JPEG n03976657/
+mv val/ILSVRC2012_val_00025271.JPEG n04553703/
+mv val/ILSVRC2012_val_00025272.JPEG n07715103/
+mv val/ILSVRC2012_val_00025273.JPEG n02018795/
+mv val/ILSVRC2012_val_00025274.JPEG n03794056/
+mv val/ILSVRC2012_val_00025275.JPEG n03595614/
+mv val/ILSVRC2012_val_00025276.JPEG n03026506/
+mv val/ILSVRC2012_val_00025277.JPEG n02128925/
+mv val/ILSVRC2012_val_00025278.JPEG n03717622/
+mv val/ILSVRC2012_val_00025279.JPEG n03041632/
+mv val/ILSVRC2012_val_00025280.JPEG n04417672/
+mv val/ILSVRC2012_val_00025281.JPEG n07753275/
+mv val/ILSVRC2012_val_00025282.JPEG n07718747/
+mv val/ILSVRC2012_val_00025283.JPEG n01728920/
+mv val/ILSVRC2012_val_00025284.JPEG n03447447/
+mv val/ILSVRC2012_val_00025285.JPEG n02114548/
+mv val/ILSVRC2012_val_00025286.JPEG n02769748/
+mv val/ILSVRC2012_val_00025287.JPEG n01784675/
+mv val/ILSVRC2012_val_00025288.JPEG n02100877/
+mv val/ILSVRC2012_val_00025289.JPEG n02097658/
+mv val/ILSVRC2012_val_00025290.JPEG n04523525/
+mv val/ILSVRC2012_val_00025291.JPEG n02002556/
+mv val/ILSVRC2012_val_00025292.JPEG n03404251/
+mv val/ILSVRC2012_val_00025293.JPEG n03786901/
+mv val/ILSVRC2012_val_00025294.JPEG n04162706/
+mv val/ILSVRC2012_val_00025295.JPEG n02776631/
+mv val/ILSVRC2012_val_00025296.JPEG n13133613/
+mv val/ILSVRC2012_val_00025297.JPEG n04254777/
+mv val/ILSVRC2012_val_00025298.JPEG n04355338/
+mv val/ILSVRC2012_val_00025299.JPEG n02104029/
+mv val/ILSVRC2012_val_00025300.JPEG n04201297/
+mv val/ILSVRC2012_val_00025301.JPEG n03775071/
+mv val/ILSVRC2012_val_00025302.JPEG n02093754/
+mv val/ILSVRC2012_val_00025303.JPEG n03992509/
+mv val/ILSVRC2012_val_00025304.JPEG n03134739/
+mv val/ILSVRC2012_val_00025305.JPEG n12057211/
+mv val/ILSVRC2012_val_00025306.JPEG n04116512/
+mv val/ILSVRC2012_val_00025307.JPEG n02281787/
+mv val/ILSVRC2012_val_00025308.JPEG n07920052/
+mv val/ILSVRC2012_val_00025309.JPEG n02105641/
+mv val/ILSVRC2012_val_00025310.JPEG n01943899/
+mv val/ILSVRC2012_val_00025311.JPEG n03841143/
+mv val/ILSVRC2012_val_00025312.JPEG n02487347/
+mv val/ILSVRC2012_val_00025313.JPEG n04486054/
+mv val/ILSVRC2012_val_00025314.JPEG n02281787/
+mv val/ILSVRC2012_val_00025315.JPEG n02342885/
+mv val/ILSVRC2012_val_00025316.JPEG n03775546/
+mv val/ILSVRC2012_val_00025317.JPEG n02011460/
+mv val/ILSVRC2012_val_00025318.JPEG n02089078/
+mv val/ILSVRC2012_val_00025319.JPEG n03776460/
+mv val/ILSVRC2012_val_00025320.JPEG n04423845/
+mv val/ILSVRC2012_val_00025321.JPEG n02865351/
+mv val/ILSVRC2012_val_00025322.JPEG n03089624/
+mv val/ILSVRC2012_val_00025323.JPEG n04371774/
+mv val/ILSVRC2012_val_00025324.JPEG n01514859/
+mv val/ILSVRC2012_val_00025325.JPEG n01734418/
+mv val/ILSVRC2012_val_00025326.JPEG n02328150/
+mv val/ILSVRC2012_val_00025327.JPEG n09468604/
+mv val/ILSVRC2012_val_00025328.JPEG n03063689/
+mv val/ILSVRC2012_val_00025329.JPEG n02951585/
+mv val/ILSVRC2012_val_00025330.JPEG n02095314/
+mv val/ILSVRC2012_val_00025331.JPEG n03792972/
+mv val/ILSVRC2012_val_00025332.JPEG n03776460/
+mv val/ILSVRC2012_val_00025333.JPEG n02346627/
+mv val/ILSVRC2012_val_00025334.JPEG n02894605/
+mv val/ILSVRC2012_val_00025335.JPEG n01775062/
+mv val/ILSVRC2012_val_00025336.JPEG n02130308/
+mv val/ILSVRC2012_val_00025337.JPEG n04192698/
+mv val/ILSVRC2012_val_00025338.JPEG n13044778/
+mv val/ILSVRC2012_val_00025339.JPEG n01751748/
+mv val/ILSVRC2012_val_00025340.JPEG n07697537/
+mv val/ILSVRC2012_val_00025341.JPEG n03868242/
+mv val/ILSVRC2012_val_00025342.JPEG n04525038/
+mv val/ILSVRC2012_val_00025343.JPEG n02259212/
+mv val/ILSVRC2012_val_00025344.JPEG n02391049/
+mv val/ILSVRC2012_val_00025345.JPEG n04399382/
+mv val/ILSVRC2012_val_00025346.JPEG n02667093/
+mv val/ILSVRC2012_val_00025347.JPEG n01530575/
+mv val/ILSVRC2012_val_00025348.JPEG n01632777/
+mv val/ILSVRC2012_val_00025349.JPEG n03259280/
+mv val/ILSVRC2012_val_00025350.JPEG n02840245/
+mv val/ILSVRC2012_val_00025351.JPEG n04019541/
+mv val/ILSVRC2012_val_00025352.JPEG n02422699/
+mv val/ILSVRC2012_val_00025353.JPEG n02113712/
+mv val/ILSVRC2012_val_00025354.JPEG n03930630/
+mv val/ILSVRC2012_val_00025355.JPEG n02643566/
+mv val/ILSVRC2012_val_00025356.JPEG n02231487/
+mv val/ILSVRC2012_val_00025357.JPEG n04487394/
+mv val/ILSVRC2012_val_00025358.JPEG n03937543/
+mv val/ILSVRC2012_val_00025359.JPEG n03355925/
+mv val/ILSVRC2012_val_00025360.JPEG n01828970/
+mv val/ILSVRC2012_val_00025361.JPEG n01580077/
+mv val/ILSVRC2012_val_00025362.JPEG n07932039/
+mv val/ILSVRC2012_val_00025363.JPEG n02877765/
+mv val/ILSVRC2012_val_00025364.JPEG n02167151/
+mv val/ILSVRC2012_val_00025365.JPEG n03476991/
+mv val/ILSVRC2012_val_00025366.JPEG n02825657/
+mv val/ILSVRC2012_val_00025367.JPEG n01751748/
+mv val/ILSVRC2012_val_00025368.JPEG n03207941/
+mv val/ILSVRC2012_val_00025369.JPEG n03840681/
+mv val/ILSVRC2012_val_00025370.JPEG n09288635/
+mv val/ILSVRC2012_val_00025371.JPEG n01843383/
+mv val/ILSVRC2012_val_00025372.JPEG n04536866/
+mv val/ILSVRC2012_val_00025373.JPEG n03814906/
+mv val/ILSVRC2012_val_00025374.JPEG n04429376/
+mv val/ILSVRC2012_val_00025375.JPEG n04428191/
+mv val/ILSVRC2012_val_00025376.JPEG n03814906/
+mv val/ILSVRC2012_val_00025377.JPEG n04344873/
+mv val/ILSVRC2012_val_00025378.JPEG n01693334/
+mv val/ILSVRC2012_val_00025379.JPEG n03417042/
+mv val/ILSVRC2012_val_00025380.JPEG n02747177/
+mv val/ILSVRC2012_val_00025381.JPEG n01986214/
+mv val/ILSVRC2012_val_00025382.JPEG n02277742/
+mv val/ILSVRC2012_val_00025383.JPEG n03127747/
+mv val/ILSVRC2012_val_00025384.JPEG n02422699/
+mv val/ILSVRC2012_val_00025385.JPEG n12985857/
+mv val/ILSVRC2012_val_00025386.JPEG n02672831/
+mv val/ILSVRC2012_val_00025387.JPEG n02823428/
+mv val/ILSVRC2012_val_00025388.JPEG n02112018/
+mv val/ILSVRC2012_val_00025389.JPEG n04037443/
+mv val/ILSVRC2012_val_00025390.JPEG n07695742/
+mv val/ILSVRC2012_val_00025391.JPEG n02536864/
+mv val/ILSVRC2012_val_00025392.JPEG n02788148/
+mv val/ILSVRC2012_val_00025393.JPEG n02088364/
+mv val/ILSVRC2012_val_00025394.JPEG n02105251/
+mv val/ILSVRC2012_val_00025395.JPEG n02105641/
+mv val/ILSVRC2012_val_00025396.JPEG n02123159/
+mv val/ILSVRC2012_val_00025397.JPEG n03729826/
+mv val/ILSVRC2012_val_00025398.JPEG n03125729/
+mv val/ILSVRC2012_val_00025399.JPEG n04179913/
+mv val/ILSVRC2012_val_00025400.JPEG n02097474/
+mv val/ILSVRC2012_val_00025401.JPEG n03297495/
+mv val/ILSVRC2012_val_00025402.JPEG n03042490/
+mv val/ILSVRC2012_val_00025403.JPEG n04252225/
+mv val/ILSVRC2012_val_00025404.JPEG n03141823/
+mv val/ILSVRC2012_val_00025405.JPEG n09193705/
+mv val/ILSVRC2012_val_00025406.JPEG n04149813/
+mv val/ILSVRC2012_val_00025407.JPEG n02655020/
+mv val/ILSVRC2012_val_00025408.JPEG n03788365/
+mv val/ILSVRC2012_val_00025409.JPEG n03085013/
+mv val/ILSVRC2012_val_00025410.JPEG n02037110/
+mv val/ILSVRC2012_val_00025411.JPEG n01944390/
+mv val/ILSVRC2012_val_00025412.JPEG n02120505/
+mv val/ILSVRC2012_val_00025413.JPEG n04536866/
+mv val/ILSVRC2012_val_00025414.JPEG n07695742/
+mv val/ILSVRC2012_val_00025415.JPEG n02951358/
+mv val/ILSVRC2012_val_00025416.JPEG n03417042/
+mv val/ILSVRC2012_val_00025417.JPEG n03733131/
+mv val/ILSVRC2012_val_00025418.JPEG n04325704/
+mv val/ILSVRC2012_val_00025419.JPEG n03843555/
+mv val/ILSVRC2012_val_00025420.JPEG n03179701/
+mv val/ILSVRC2012_val_00025421.JPEG n02009229/
+mv val/ILSVRC2012_val_00025422.JPEG n04523525/
+mv val/ILSVRC2012_val_00025423.JPEG n02098413/
+mv val/ILSVRC2012_val_00025424.JPEG n02096585/
+mv val/ILSVRC2012_val_00025425.JPEG n03424325/
+mv val/ILSVRC2012_val_00025426.JPEG n02105162/
+mv val/ILSVRC2012_val_00025427.JPEG n04590129/
+mv val/ILSVRC2012_val_00025428.JPEG n01537544/
+mv val/ILSVRC2012_val_00025429.JPEG n02093991/
+mv val/ILSVRC2012_val_00025430.JPEG n03394916/
+mv val/ILSVRC2012_val_00025431.JPEG n01514668/
+mv val/ILSVRC2012_val_00025432.JPEG n13133613/
+mv val/ILSVRC2012_val_00025433.JPEG n03445924/
+mv val/ILSVRC2012_val_00025434.JPEG n03873416/
+mv val/ILSVRC2012_val_00025435.JPEG n01632458/
+mv val/ILSVRC2012_val_00025436.JPEG n03706229/
+mv val/ILSVRC2012_val_00025437.JPEG n02085782/
+mv val/ILSVRC2012_val_00025438.JPEG n01632777/
+mv val/ILSVRC2012_val_00025439.JPEG n04371430/
+mv val/ILSVRC2012_val_00025440.JPEG n12144580/
+mv val/ILSVRC2012_val_00025441.JPEG n01665541/
+mv val/ILSVRC2012_val_00025442.JPEG n02102040/
+mv val/ILSVRC2012_val_00025443.JPEG n02701002/
+mv val/ILSVRC2012_val_00025444.JPEG n04131690/
+mv val/ILSVRC2012_val_00025445.JPEG n04347754/
+mv val/ILSVRC2012_val_00025446.JPEG n13040303/
+mv val/ILSVRC2012_val_00025447.JPEG n01775062/
+mv val/ILSVRC2012_val_00025448.JPEG n02114712/
+mv val/ILSVRC2012_val_00025449.JPEG n01833805/
+mv val/ILSVRC2012_val_00025450.JPEG n03759954/
+mv val/ILSVRC2012_val_00025451.JPEG n02860847/
+mv val/ILSVRC2012_val_00025452.JPEG n04330267/
+mv val/ILSVRC2012_val_00025453.JPEG n02859443/
+mv val/ILSVRC2012_val_00025454.JPEG n02138441/
+mv val/ILSVRC2012_val_00025455.JPEG n01774384/
+mv val/ILSVRC2012_val_00025456.JPEG n07717556/
+mv val/ILSVRC2012_val_00025457.JPEG n04311004/
+mv val/ILSVRC2012_val_00025458.JPEG n03908714/
+mv val/ILSVRC2012_val_00025459.JPEG n02361337/
+mv val/ILSVRC2012_val_00025460.JPEG n04065272/
+mv val/ILSVRC2012_val_00025461.JPEG n04146614/
+mv val/ILSVRC2012_val_00025462.JPEG n04179913/
+mv val/ILSVRC2012_val_00025463.JPEG n01697457/
+mv val/ILSVRC2012_val_00025464.JPEG n03857828/
+mv val/ILSVRC2012_val_00025465.JPEG n04285008/
+mv val/ILSVRC2012_val_00025466.JPEG n02089078/
+mv val/ILSVRC2012_val_00025467.JPEG n01755581/
+mv val/ILSVRC2012_val_00025468.JPEG n02056570/
+mv val/ILSVRC2012_val_00025469.JPEG n02701002/
+mv val/ILSVRC2012_val_00025470.JPEG n02483708/
+mv val/ILSVRC2012_val_00025471.JPEG n02101556/
+mv val/ILSVRC2012_val_00025472.JPEG n01737021/
+mv val/ILSVRC2012_val_00025473.JPEG n03874599/
+mv val/ILSVRC2012_val_00025474.JPEG n02107683/
+mv val/ILSVRC2012_val_00025475.JPEG n03657121/
+mv val/ILSVRC2012_val_00025476.JPEG n01592084/
+mv val/ILSVRC2012_val_00025477.JPEG n03995372/
+mv val/ILSVRC2012_val_00025478.JPEG n03788195/
+mv val/ILSVRC2012_val_00025479.JPEG n02100877/
+mv val/ILSVRC2012_val_00025480.JPEG n03447447/
+mv val/ILSVRC2012_val_00025481.JPEG n09399592/
+mv val/ILSVRC2012_val_00025482.JPEG n04350905/
+mv val/ILSVRC2012_val_00025483.JPEG n04266014/
+mv val/ILSVRC2012_val_00025484.JPEG n02979186/
+mv val/ILSVRC2012_val_00025485.JPEG n02988304/
+mv val/ILSVRC2012_val_00025486.JPEG n02879718/
+mv val/ILSVRC2012_val_00025487.JPEG n03032252/
+mv val/ILSVRC2012_val_00025488.JPEG n01530575/
+mv val/ILSVRC2012_val_00025489.JPEG n03291819/
+mv val/ILSVRC2012_val_00025490.JPEG n04131690/
+mv val/ILSVRC2012_val_00025491.JPEG n02037110/
+mv val/ILSVRC2012_val_00025492.JPEG n01632458/
+mv val/ILSVRC2012_val_00025493.JPEG n02102177/
+mv val/ILSVRC2012_val_00025494.JPEG n04367480/
+mv val/ILSVRC2012_val_00025495.JPEG n01807496/
+mv val/ILSVRC2012_val_00025496.JPEG n02107908/
+mv val/ILSVRC2012_val_00025497.JPEG n01740131/
+mv val/ILSVRC2012_val_00025498.JPEG n02096585/
+mv val/ILSVRC2012_val_00025499.JPEG n04235860/
+mv val/ILSVRC2012_val_00025500.JPEG n02363005/
+mv val/ILSVRC2012_val_00025501.JPEG n02110958/
+mv val/ILSVRC2012_val_00025502.JPEG n07711569/
+mv val/ILSVRC2012_val_00025503.JPEG n03384352/
+mv val/ILSVRC2012_val_00025504.JPEG n03530642/
+mv val/ILSVRC2012_val_00025505.JPEG n03761084/
+mv val/ILSVRC2012_val_00025506.JPEG n03602883/
+mv val/ILSVRC2012_val_00025507.JPEG n01531178/
+mv val/ILSVRC2012_val_00025508.JPEG n01774384/
+mv val/ILSVRC2012_val_00025509.JPEG n04456115/
+mv val/ILSVRC2012_val_00025510.JPEG n01985128/
+mv val/ILSVRC2012_val_00025511.JPEG n01694178/
+mv val/ILSVRC2012_val_00025512.JPEG n03065424/
+mv val/ILSVRC2012_val_00025513.JPEG n04589890/
+mv val/ILSVRC2012_val_00025514.JPEG n04049303/
+mv val/ILSVRC2012_val_00025515.JPEG n07248320/
+mv val/ILSVRC2012_val_00025516.JPEG n06874185/
+mv val/ILSVRC2012_val_00025517.JPEG n04604644/
+mv val/ILSVRC2012_val_00025518.JPEG n01775062/
+mv val/ILSVRC2012_val_00025519.JPEG n02123597/
+mv val/ILSVRC2012_val_00025520.JPEG n02095570/
+mv val/ILSVRC2012_val_00025521.JPEG n01985128/
+mv val/ILSVRC2012_val_00025522.JPEG n02115913/
+mv val/ILSVRC2012_val_00025523.JPEG n01622779/
+mv val/ILSVRC2012_val_00025524.JPEG n01601694/
+mv val/ILSVRC2012_val_00025525.JPEG n04589890/
+mv val/ILSVRC2012_val_00025526.JPEG n01560419/
+mv val/ILSVRC2012_val_00025527.JPEG n01440764/
+mv val/ILSVRC2012_val_00025528.JPEG n02051845/
+mv val/ILSVRC2012_val_00025529.JPEG n03218198/
+mv val/ILSVRC2012_val_00025530.JPEG n03047690/
+mv val/ILSVRC2012_val_00025531.JPEG n03854065/
+mv val/ILSVRC2012_val_00025532.JPEG n02442845/
+mv val/ILSVRC2012_val_00025533.JPEG n02361337/
+mv val/ILSVRC2012_val_00025534.JPEG n02835271/
+mv val/ILSVRC2012_val_00025535.JPEG n01531178/
+mv val/ILSVRC2012_val_00025536.JPEG n02108422/
+mv val/ILSVRC2012_val_00025537.JPEG n02115913/
+mv val/ILSVRC2012_val_00025538.JPEG n03141823/
+mv val/ILSVRC2012_val_00025539.JPEG n02088238/
+mv val/ILSVRC2012_val_00025540.JPEG n03690938/
+mv val/ILSVRC2012_val_00025541.JPEG n03207941/
+mv val/ILSVRC2012_val_00025542.JPEG n02510455/
+mv val/ILSVRC2012_val_00025543.JPEG n01806143/
+mv val/ILSVRC2012_val_00025544.JPEG n01740131/
+mv val/ILSVRC2012_val_00025545.JPEG n03854065/
+mv val/ILSVRC2012_val_00025546.JPEG n02488291/
+mv val/ILSVRC2012_val_00025547.JPEG n04428191/
+mv val/ILSVRC2012_val_00025548.JPEG n03063599/
+mv val/ILSVRC2012_val_00025549.JPEG n02101556/
+mv val/ILSVRC2012_val_00025550.JPEG n02087046/
+mv val/ILSVRC2012_val_00025551.JPEG n02101556/
+mv val/ILSVRC2012_val_00025552.JPEG n03792972/
+mv val/ILSVRC2012_val_00025553.JPEG n04296562/
+mv val/ILSVRC2012_val_00025554.JPEG n02101006/
+mv val/ILSVRC2012_val_00025555.JPEG n02776631/
+mv val/ILSVRC2012_val_00025556.JPEG n01773797/
+mv val/ILSVRC2012_val_00025557.JPEG n03709823/
+mv val/ILSVRC2012_val_00025558.JPEG n04458633/
+mv val/ILSVRC2012_val_00025559.JPEG n02281406/
+mv val/ILSVRC2012_val_00025560.JPEG n03691459/
+mv val/ILSVRC2012_val_00025561.JPEG n03692522/
+mv val/ILSVRC2012_val_00025562.JPEG n02089867/
+mv val/ILSVRC2012_val_00025563.JPEG n03868863/
+mv val/ILSVRC2012_val_00025564.JPEG n02012849/
+mv val/ILSVRC2012_val_00025565.JPEG n03763968/
+mv val/ILSVRC2012_val_00025566.JPEG n01944390/
+mv val/ILSVRC2012_val_00025567.JPEG n01667114/
+mv val/ILSVRC2012_val_00025568.JPEG n03950228/
+mv val/ILSVRC2012_val_00025569.JPEG n02128385/
+mv val/ILSVRC2012_val_00025570.JPEG n02319095/
+mv val/ILSVRC2012_val_00025571.JPEG n04553703/
+mv val/ILSVRC2012_val_00025572.JPEG n03452741/
+mv val/ILSVRC2012_val_00025573.JPEG n03345487/
+mv val/ILSVRC2012_val_00025574.JPEG n02672831/
+mv val/ILSVRC2012_val_00025575.JPEG n03935335/
+mv val/ILSVRC2012_val_00025576.JPEG n02104365/
+mv val/ILSVRC2012_val_00025577.JPEG n01592084/
+mv val/ILSVRC2012_val_00025578.JPEG n04149813/
+mv val/ILSVRC2012_val_00025579.JPEG n03594734/
+mv val/ILSVRC2012_val_00025580.JPEG n02233338/
+mv val/ILSVRC2012_val_00025581.JPEG n01688243/
+mv val/ILSVRC2012_val_00025582.JPEG n07718472/
+mv val/ILSVRC2012_val_00025583.JPEG n03394916/
+mv val/ILSVRC2012_val_00025584.JPEG n13040303/
+mv val/ILSVRC2012_val_00025585.JPEG n01986214/
+mv val/ILSVRC2012_val_00025586.JPEG n02510455/
+mv val/ILSVRC2012_val_00025587.JPEG n04285008/
+mv val/ILSVRC2012_val_00025588.JPEG n03956157/
+mv val/ILSVRC2012_val_00025589.JPEG n02264363/
+mv val/ILSVRC2012_val_00025590.JPEG n03127747/
+mv val/ILSVRC2012_val_00025591.JPEG n03445777/
+mv val/ILSVRC2012_val_00025592.JPEG n04467665/
+mv val/ILSVRC2012_val_00025593.JPEG n03240683/
+mv val/ILSVRC2012_val_00025594.JPEG n03065424/
+mv val/ILSVRC2012_val_00025595.JPEG n04517823/
+mv val/ILSVRC2012_val_00025596.JPEG n02165105/
+mv val/ILSVRC2012_val_00025597.JPEG n03602883/
+mv val/ILSVRC2012_val_00025598.JPEG n01753488/
+mv val/ILSVRC2012_val_00025599.JPEG n04399382/
+mv val/ILSVRC2012_val_00025600.JPEG n09256479/
+mv val/ILSVRC2012_val_00025601.JPEG n02086910/
+mv val/ILSVRC2012_val_00025602.JPEG n03956157/
+mv val/ILSVRC2012_val_00025603.JPEG n03485794/
+mv val/ILSVRC2012_val_00025604.JPEG n02484975/
+mv val/ILSVRC2012_val_00025605.JPEG n02666196/
+mv val/ILSVRC2012_val_00025606.JPEG n02097209/
+mv val/ILSVRC2012_val_00025607.JPEG n03535780/
+mv val/ILSVRC2012_val_00025608.JPEG n02112018/
+mv val/ILSVRC2012_val_00025609.JPEG n03109150/
+mv val/ILSVRC2012_val_00025610.JPEG n04590129/
+mv val/ILSVRC2012_val_00025611.JPEG n01667778/
+mv val/ILSVRC2012_val_00025612.JPEG n02787622/
+mv val/ILSVRC2012_val_00025613.JPEG n02088364/
+mv val/ILSVRC2012_val_00025614.JPEG n03388549/
+mv val/ILSVRC2012_val_00025615.JPEG n02494079/
+mv val/ILSVRC2012_val_00025616.JPEG n01843065/
+mv val/ILSVRC2012_val_00025617.JPEG n02108551/
+mv val/ILSVRC2012_val_00025618.JPEG n03929855/
+mv val/ILSVRC2012_val_00025619.JPEG n03498962/
+mv val/ILSVRC2012_val_00025620.JPEG n02109525/
+mv val/ILSVRC2012_val_00025621.JPEG n04328186/
+mv val/ILSVRC2012_val_00025622.JPEG n09256479/
+mv val/ILSVRC2012_val_00025623.JPEG n04540053/
+mv val/ILSVRC2012_val_00025624.JPEG n03459775/
+mv val/ILSVRC2012_val_00025625.JPEG n03982430/
+mv val/ILSVRC2012_val_00025626.JPEG n02444819/
+mv val/ILSVRC2012_val_00025627.JPEG n01494475/
+mv val/ILSVRC2012_val_00025628.JPEG n02086079/
+mv val/ILSVRC2012_val_00025629.JPEG n02125311/
+mv val/ILSVRC2012_val_00025630.JPEG n03529860/
+mv val/ILSVRC2012_val_00025631.JPEG n01843383/
+mv val/ILSVRC2012_val_00025632.JPEG n03992509/
+mv val/ILSVRC2012_val_00025633.JPEG n01641577/
+mv val/ILSVRC2012_val_00025634.JPEG n04099969/
+mv val/ILSVRC2012_val_00025635.JPEG n04254777/
+mv val/ILSVRC2012_val_00025636.JPEG n01608432/
+mv val/ILSVRC2012_val_00025637.JPEG n02346627/
+mv val/ILSVRC2012_val_00025638.JPEG n02397096/
+mv val/ILSVRC2012_val_00025639.JPEG n02676566/
+mv val/ILSVRC2012_val_00025640.JPEG n01491361/
+mv val/ILSVRC2012_val_00025641.JPEG n02074367/
+mv val/ILSVRC2012_val_00025642.JPEG n04252225/
+mv val/ILSVRC2012_val_00025643.JPEG n04485082/
+mv val/ILSVRC2012_val_00025644.JPEG n02092002/
+mv val/ILSVRC2012_val_00025645.JPEG n02098286/
+mv val/ILSVRC2012_val_00025646.JPEG n02727426/
+mv val/ILSVRC2012_val_00025647.JPEG n03100240/
+mv val/ILSVRC2012_val_00025648.JPEG n13054560/
+mv val/ILSVRC2012_val_00025649.JPEG n02097298/
+mv val/ILSVRC2012_val_00025650.JPEG n02123045/
+mv val/ILSVRC2012_val_00025651.JPEG n02002724/
+mv val/ILSVRC2012_val_00025652.JPEG n02109047/
+mv val/ILSVRC2012_val_00025653.JPEG n03131574/
+mv val/ILSVRC2012_val_00025654.JPEG n02692877/
+mv val/ILSVRC2012_val_00025655.JPEG n02088632/
+mv val/ILSVRC2012_val_00025656.JPEG n04465501/
+mv val/ILSVRC2012_val_00025657.JPEG n02930766/
+mv val/ILSVRC2012_val_00025658.JPEG n01843065/
+mv val/ILSVRC2012_val_00025659.JPEG n03697007/
+mv val/ILSVRC2012_val_00025660.JPEG n02102973/
+mv val/ILSVRC2012_val_00025661.JPEG n04147183/
+mv val/ILSVRC2012_val_00025662.JPEG n02117135/
+mv val/ILSVRC2012_val_00025663.JPEG n07754684/
+mv val/ILSVRC2012_val_00025664.JPEG n02787622/
+mv val/ILSVRC2012_val_00025665.JPEG n02114548/
+mv val/ILSVRC2012_val_00025666.JPEG n04515003/
+mv val/ILSVRC2012_val_00025667.JPEG n01855672/
+mv val/ILSVRC2012_val_00025668.JPEG n01682714/
+mv val/ILSVRC2012_val_00025669.JPEG n02110063/
+mv val/ILSVRC2012_val_00025670.JPEG n04127249/
+mv val/ILSVRC2012_val_00025671.JPEG n03127925/
+mv val/ILSVRC2012_val_00025672.JPEG n04429376/
+mv val/ILSVRC2012_val_00025673.JPEG n03710193/
+mv val/ILSVRC2012_val_00025674.JPEG n03796401/
+mv val/ILSVRC2012_val_00025675.JPEG n02786058/
+mv val/ILSVRC2012_val_00025676.JPEG n02794156/
+mv val/ILSVRC2012_val_00025677.JPEG n02112018/
+mv val/ILSVRC2012_val_00025678.JPEG n02423022/
+mv val/ILSVRC2012_val_00025679.JPEG n02094114/
+mv val/ILSVRC2012_val_00025680.JPEG n02092339/
+mv val/ILSVRC2012_val_00025681.JPEG n03344393/
+mv val/ILSVRC2012_val_00025682.JPEG n03888605/
+mv val/ILSVRC2012_val_00025683.JPEG n02437312/
+mv val/ILSVRC2012_val_00025684.JPEG n02107574/
+mv val/ILSVRC2012_val_00025685.JPEG n03710637/
+mv val/ILSVRC2012_val_00025686.JPEG n01491361/
+mv val/ILSVRC2012_val_00025687.JPEG n04074963/
+mv val/ILSVRC2012_val_00025688.JPEG n02128385/
+mv val/ILSVRC2012_val_00025689.JPEG n04044716/
+mv val/ILSVRC2012_val_00025690.JPEG n02093991/
+mv val/ILSVRC2012_val_00025691.JPEG n02113186/
+mv val/ILSVRC2012_val_00025692.JPEG n01592084/
+mv val/ILSVRC2012_val_00025693.JPEG n07714990/
+mv val/ILSVRC2012_val_00025694.JPEG n02174001/
+mv val/ILSVRC2012_val_00025695.JPEG n02777292/
+mv val/ILSVRC2012_val_00025696.JPEG n02090379/
+mv val/ILSVRC2012_val_00025697.JPEG n04509417/
+mv val/ILSVRC2012_val_00025698.JPEG n02486261/
+mv val/ILSVRC2012_val_00025699.JPEG n02841315/
+mv val/ILSVRC2012_val_00025700.JPEG n02096051/
+mv val/ILSVRC2012_val_00025701.JPEG n01768244/
+mv val/ILSVRC2012_val_00025702.JPEG n03895866/
+mv val/ILSVRC2012_val_00025703.JPEG n03891332/
+mv val/ILSVRC2012_val_00025704.JPEG n02102177/
+mv val/ILSVRC2012_val_00025705.JPEG n04525038/
+mv val/ILSVRC2012_val_00025706.JPEG n03777754/
+mv val/ILSVRC2012_val_00025707.JPEG n07716906/
+mv val/ILSVRC2012_val_00025708.JPEG n02091244/
+mv val/ILSVRC2012_val_00025709.JPEG n02966687/
+mv val/ILSVRC2012_val_00025710.JPEG n01981276/
+mv val/ILSVRC2012_val_00025711.JPEG n02092339/
+mv val/ILSVRC2012_val_00025712.JPEG n04612504/
+mv val/ILSVRC2012_val_00025713.JPEG n09229709/
+mv val/ILSVRC2012_val_00025714.JPEG n02099429/
+mv val/ILSVRC2012_val_00025715.JPEG n04540053/
+mv val/ILSVRC2012_val_00025716.JPEG n03935335/
+mv val/ILSVRC2012_val_00025717.JPEG n01644373/
+mv val/ILSVRC2012_val_00025718.JPEG n02088466/
+mv val/ILSVRC2012_val_00025719.JPEG n04380533/
+mv val/ILSVRC2012_val_00025720.JPEG n02105162/
+mv val/ILSVRC2012_val_00025721.JPEG n02916936/
+mv val/ILSVRC2012_val_00025722.JPEG n01944390/
+mv val/ILSVRC2012_val_00025723.JPEG n02123159/
+mv val/ILSVRC2012_val_00025724.JPEG n03459775/
+mv val/ILSVRC2012_val_00025725.JPEG n01944390/
+mv val/ILSVRC2012_val_00025726.JPEG n02100735/
+mv val/ILSVRC2012_val_00025727.JPEG n01740131/
+mv val/ILSVRC2012_val_00025728.JPEG n03599486/
+mv val/ILSVRC2012_val_00025729.JPEG n02169497/
+mv val/ILSVRC2012_val_00025730.JPEG n03888605/
+mv val/ILSVRC2012_val_00025731.JPEG n04296562/
+mv val/ILSVRC2012_val_00025732.JPEG n03794056/
+mv val/ILSVRC2012_val_00025733.JPEG n03110669/
+mv val/ILSVRC2012_val_00025734.JPEG n02356798/
+mv val/ILSVRC2012_val_00025735.JPEG n03032252/
+mv val/ILSVRC2012_val_00025736.JPEG n04482393/
+mv val/ILSVRC2012_val_00025737.JPEG n03888605/
+mv val/ILSVRC2012_val_00025738.JPEG n01748264/
+mv val/ILSVRC2012_val_00025739.JPEG n02098413/
+mv val/ILSVRC2012_val_00025740.JPEG n03967562/
+mv val/ILSVRC2012_val_00025741.JPEG n03706229/
+mv val/ILSVRC2012_val_00025742.JPEG n13052670/
+mv val/ILSVRC2012_val_00025743.JPEG n04252225/
+mv val/ILSVRC2012_val_00025744.JPEG n02009229/
+mv val/ILSVRC2012_val_00025745.JPEG n04252225/
+mv val/ILSVRC2012_val_00025746.JPEG n09421951/
+mv val/ILSVRC2012_val_00025747.JPEG n01930112/
+mv val/ILSVRC2012_val_00025748.JPEG n04461696/
+mv val/ILSVRC2012_val_00025749.JPEG n04208210/
+mv val/ILSVRC2012_val_00025750.JPEG n02443484/
+mv val/ILSVRC2012_val_00025751.JPEG n03045698/
+mv val/ILSVRC2012_val_00025752.JPEG n03967562/
+mv val/ILSVRC2012_val_00025753.JPEG n07880968/
+mv val/ILSVRC2012_val_00025754.JPEG n02177972/
+mv val/ILSVRC2012_val_00025755.JPEG n01698640/
+mv val/ILSVRC2012_val_00025756.JPEG n02704792/
+mv val/ILSVRC2012_val_00025757.JPEG n04328186/
+mv val/ILSVRC2012_val_00025758.JPEG n01828970/
+mv val/ILSVRC2012_val_00025759.JPEG n04482393/
+mv val/ILSVRC2012_val_00025760.JPEG n03400231/
+mv val/ILSVRC2012_val_00025761.JPEG n03394916/
+mv val/ILSVRC2012_val_00025762.JPEG n04467665/
+mv val/ILSVRC2012_val_00025763.JPEG n04259630/
+mv val/ILSVRC2012_val_00025764.JPEG n01860187/
+mv val/ILSVRC2012_val_00025765.JPEG n03868863/
+mv val/ILSVRC2012_val_00025766.JPEG n03000134/
+mv val/ILSVRC2012_val_00025767.JPEG n02783161/
+mv val/ILSVRC2012_val_00025768.JPEG n02509815/
+mv val/ILSVRC2012_val_00025769.JPEG n04465501/
+mv val/ILSVRC2012_val_00025770.JPEG n02417914/
+mv val/ILSVRC2012_val_00025771.JPEG n04482393/
+mv val/ILSVRC2012_val_00025772.JPEG n02787622/
+mv val/ILSVRC2012_val_00025773.JPEG n02089867/
+mv val/ILSVRC2012_val_00025774.JPEG n03240683/
+mv val/ILSVRC2012_val_00025775.JPEG n02403003/
+mv val/ILSVRC2012_val_00025776.JPEG n04296562/
+mv val/ILSVRC2012_val_00025777.JPEG n02782093/
+mv val/ILSVRC2012_val_00025778.JPEG n02892201/
+mv val/ILSVRC2012_val_00025779.JPEG n03777754/
+mv val/ILSVRC2012_val_00025780.JPEG n04612504/
+mv val/ILSVRC2012_val_00025781.JPEG n03372029/
+mv val/ILSVRC2012_val_00025782.JPEG n01756291/
+mv val/ILSVRC2012_val_00025783.JPEG n03902125/
+mv val/ILSVRC2012_val_00025784.JPEG n03355925/
+mv val/ILSVRC2012_val_00025785.JPEG n01843383/
+mv val/ILSVRC2012_val_00025786.JPEG n04579432/
+mv val/ILSVRC2012_val_00025787.JPEG n02091134/
+mv val/ILSVRC2012_val_00025788.JPEG n04579432/
+mv val/ILSVRC2012_val_00025789.JPEG n03481172/
+mv val/ILSVRC2012_val_00025790.JPEG n02841315/
+mv val/ILSVRC2012_val_00025791.JPEG n07831146/
+mv val/ILSVRC2012_val_00025792.JPEG n03075370/
+mv val/ILSVRC2012_val_00025793.JPEG n02009912/
+mv val/ILSVRC2012_val_00025794.JPEG n04201297/
+mv val/ILSVRC2012_val_00025795.JPEG n02396427/
+mv val/ILSVRC2012_val_00025796.JPEG n01753488/
+mv val/ILSVRC2012_val_00025797.JPEG n03249569/
+mv val/ILSVRC2012_val_00025798.JPEG n04090263/
+mv val/ILSVRC2012_val_00025799.JPEG n01704323/
+mv val/ILSVRC2012_val_00025800.JPEG n02526121/
+mv val/ILSVRC2012_val_00025801.JPEG n04204347/
+mv val/ILSVRC2012_val_00025802.JPEG n02777292/
+mv val/ILSVRC2012_val_00025803.JPEG n03126707/
+mv val/ILSVRC2012_val_00025804.JPEG n04254120/
+mv val/ILSVRC2012_val_00025805.JPEG n02111277/
+mv val/ILSVRC2012_val_00025806.JPEG n01582220/
+mv val/ILSVRC2012_val_00025807.JPEG n02206856/
+mv val/ILSVRC2012_val_00025808.JPEG n02939185/
+mv val/ILSVRC2012_val_00025809.JPEG n01693334/
+mv val/ILSVRC2012_val_00025810.JPEG n02641379/
+mv val/ILSVRC2012_val_00025811.JPEG n04263257/
+mv val/ILSVRC2012_val_00025812.JPEG n04347754/
+mv val/ILSVRC2012_val_00025813.JPEG n07734744/
+mv val/ILSVRC2012_val_00025814.JPEG n01990800/
+mv val/ILSVRC2012_val_00025815.JPEG n04399382/
+mv val/ILSVRC2012_val_00025816.JPEG n04270147/
+mv val/ILSVRC2012_val_00025817.JPEG n03944341/
+mv val/ILSVRC2012_val_00025818.JPEG n01773549/
+mv val/ILSVRC2012_val_00025819.JPEG n03259280/
+mv val/ILSVRC2012_val_00025820.JPEG n02089078/
+mv val/ILSVRC2012_val_00025821.JPEG n02094433/
+mv val/ILSVRC2012_val_00025822.JPEG n04525305/
+mv val/ILSVRC2012_val_00025823.JPEG n04493381/
+mv val/ILSVRC2012_val_00025824.JPEG n01669191/
+mv val/ILSVRC2012_val_00025825.JPEG n02066245/
+mv val/ILSVRC2012_val_00025826.JPEG n02841315/
+mv val/ILSVRC2012_val_00025827.JPEG n03796401/
+mv val/ILSVRC2012_val_00025828.JPEG n04371430/
+mv val/ILSVRC2012_val_00025829.JPEG n04548362/
+mv val/ILSVRC2012_val_00025830.JPEG n03944341/
+mv val/ILSVRC2012_val_00025831.JPEG n01773157/
+mv val/ILSVRC2012_val_00025832.JPEG n03223299/
+mv val/ILSVRC2012_val_00025833.JPEG n03692522/
+mv val/ILSVRC2012_val_00025834.JPEG n03594945/
+mv val/ILSVRC2012_val_00025835.JPEG n02100877/
+mv val/ILSVRC2012_val_00025836.JPEG n03000134/
+mv val/ILSVRC2012_val_00025837.JPEG n02783161/
+mv val/ILSVRC2012_val_00025838.JPEG n03345487/
+mv val/ILSVRC2012_val_00025839.JPEG n02802426/
+mv val/ILSVRC2012_val_00025840.JPEG n01944390/
+mv val/ILSVRC2012_val_00025841.JPEG n02817516/
+mv val/ILSVRC2012_val_00025842.JPEG n02102973/
+mv val/ILSVRC2012_val_00025843.JPEG n03956157/
+mv val/ILSVRC2012_val_00025844.JPEG n03627232/
+mv val/ILSVRC2012_val_00025845.JPEG n02114712/
+mv val/ILSVRC2012_val_00025846.JPEG n03837869/
+mv val/ILSVRC2012_val_00025847.JPEG n02797295/
+mv val/ILSVRC2012_val_00025848.JPEG n04458633/
+mv val/ILSVRC2012_val_00025849.JPEG n03196217/
+mv val/ILSVRC2012_val_00025850.JPEG n02963159/
+mv val/ILSVRC2012_val_00025851.JPEG n02110341/
+mv val/ILSVRC2012_val_00025852.JPEG n02108551/
+mv val/ILSVRC2012_val_00025853.JPEG n09468604/
+mv val/ILSVRC2012_val_00025854.JPEG n03452741/
+mv val/ILSVRC2012_val_00025855.JPEG n02174001/
+mv val/ILSVRC2012_val_00025856.JPEG n04380533/
+mv val/ILSVRC2012_val_00025857.JPEG n07716358/
+mv val/ILSVRC2012_val_00025858.JPEG n04037443/
+mv val/ILSVRC2012_val_00025859.JPEG n03803284/
+mv val/ILSVRC2012_val_00025860.JPEG n03958227/
+mv val/ILSVRC2012_val_00025861.JPEG n09288635/
+mv val/ILSVRC2012_val_00025862.JPEG n04442312/
+mv val/ILSVRC2012_val_00025863.JPEG n03272562/
+mv val/ILSVRC2012_val_00025864.JPEG n03891251/
+mv val/ILSVRC2012_val_00025865.JPEG n04118776/
+mv val/ILSVRC2012_val_00025866.JPEG n04532670/
+mv val/ILSVRC2012_val_00025867.JPEG n01742172/
+mv val/ILSVRC2012_val_00025868.JPEG n03733281/
+mv val/ILSVRC2012_val_00025869.JPEG n02102177/
+mv val/ILSVRC2012_val_00025870.JPEG n03026506/
+mv val/ILSVRC2012_val_00025871.JPEG n02606052/
+mv val/ILSVRC2012_val_00025872.JPEG n01818515/
+mv val/ILSVRC2012_val_00025873.JPEG n04589890/
+mv val/ILSVRC2012_val_00025874.JPEG n04428191/
+mv val/ILSVRC2012_val_00025875.JPEG n02279972/
+mv val/ILSVRC2012_val_00025876.JPEG n02123045/
+mv val/ILSVRC2012_val_00025877.JPEG n04254120/
+mv val/ILSVRC2012_val_00025878.JPEG n03000684/
+mv val/ILSVRC2012_val_00025879.JPEG n01983481/
+mv val/ILSVRC2012_val_00025880.JPEG n02704792/
+mv val/ILSVRC2012_val_00025881.JPEG n07590611/
+mv val/ILSVRC2012_val_00025882.JPEG n04162706/
+mv val/ILSVRC2012_val_00025883.JPEG n02088632/
+mv val/ILSVRC2012_val_00025884.JPEG n02112706/
+mv val/ILSVRC2012_val_00025885.JPEG n03938244/
+mv val/ILSVRC2012_val_00025886.JPEG n02112018/
+mv val/ILSVRC2012_val_00025887.JPEG n02123597/
+mv val/ILSVRC2012_val_00025888.JPEG n01531178/
+mv val/ILSVRC2012_val_00025889.JPEG n02325366/
+mv val/ILSVRC2012_val_00025890.JPEG n03000684/
+mv val/ILSVRC2012_val_00025891.JPEG n02066245/
+mv val/ILSVRC2012_val_00025892.JPEG n02859443/
+mv val/ILSVRC2012_val_00025893.JPEG n03063599/
+mv val/ILSVRC2012_val_00025894.JPEG n07753113/
+mv val/ILSVRC2012_val_00025895.JPEG n02999410/
+mv val/ILSVRC2012_val_00025896.JPEG n03777568/
+mv val/ILSVRC2012_val_00025897.JPEG n02108089/
+mv val/ILSVRC2012_val_00025898.JPEG n01872401/
+mv val/ILSVRC2012_val_00025899.JPEG n02025239/
+mv val/ILSVRC2012_val_00025900.JPEG n01484850/
+mv val/ILSVRC2012_val_00025901.JPEG n03899768/
+mv val/ILSVRC2012_val_00025902.JPEG n04162706/
+mv val/ILSVRC2012_val_00025903.JPEG n02110341/
+mv val/ILSVRC2012_val_00025904.JPEG n02091467/
+mv val/ILSVRC2012_val_00025905.JPEG n04417672/
+mv val/ILSVRC2012_val_00025906.JPEG n03000134/
+mv val/ILSVRC2012_val_00025907.JPEG n04356056/
+mv val/ILSVRC2012_val_00025908.JPEG n04417672/
+mv val/ILSVRC2012_val_00025909.JPEG n01689811/
+mv val/ILSVRC2012_val_00025910.JPEG n02412080/
+mv val/ILSVRC2012_val_00025911.JPEG n02086646/
+mv val/ILSVRC2012_val_00025912.JPEG n02096294/
+mv val/ILSVRC2012_val_00025913.JPEG n01622779/
+mv val/ILSVRC2012_val_00025914.JPEG n02089973/
+mv val/ILSVRC2012_val_00025915.JPEG n02835271/
+mv val/ILSVRC2012_val_00025916.JPEG n09193705/
+mv val/ILSVRC2012_val_00025917.JPEG n04111531/
+mv val/ILSVRC2012_val_00025918.JPEG n04456115/
+mv val/ILSVRC2012_val_00025919.JPEG n09193705/
+mv val/ILSVRC2012_val_00025920.JPEG n03633091/
+mv val/ILSVRC2012_val_00025921.JPEG n07749582/
+mv val/ILSVRC2012_val_00025922.JPEG n07697537/
+mv val/ILSVRC2012_val_00025923.JPEG n02860847/
+mv val/ILSVRC2012_val_00025924.JPEG n01855672/
+mv val/ILSVRC2012_val_00025925.JPEG n03743016/
+mv val/ILSVRC2012_val_00025926.JPEG n02077923/
+mv val/ILSVRC2012_val_00025927.JPEG n07754684/
+mv val/ILSVRC2012_val_00025928.JPEG n01833805/
+mv val/ILSVRC2012_val_00025929.JPEG n02013706/
+mv val/ILSVRC2012_val_00025930.JPEG n03976657/
+mv val/ILSVRC2012_val_00025931.JPEG n03134739/
+mv val/ILSVRC2012_val_00025932.JPEG n03720891/
+mv val/ILSVRC2012_val_00025933.JPEG n02837789/
+mv val/ILSVRC2012_val_00025934.JPEG n04355933/
+mv val/ILSVRC2012_val_00025935.JPEG n03584829/
+mv val/ILSVRC2012_val_00025936.JPEG n09472597/
+mv val/ILSVRC2012_val_00025937.JPEG n01843065/
+mv val/ILSVRC2012_val_00025938.JPEG n01749939/
+mv val/ILSVRC2012_val_00025939.JPEG n03717622/
+mv val/ILSVRC2012_val_00025940.JPEG n03982430/
+mv val/ILSVRC2012_val_00025941.JPEG n02504458/
+mv val/ILSVRC2012_val_00025942.JPEG n02127052/
+mv val/ILSVRC2012_val_00025943.JPEG n03127747/
+mv val/ILSVRC2012_val_00025944.JPEG n04026417/
+mv val/ILSVRC2012_val_00025945.JPEG n03866082/
+mv val/ILSVRC2012_val_00025946.JPEG n01872401/
+mv val/ILSVRC2012_val_00025947.JPEG n02094258/
+mv val/ILSVRC2012_val_00025948.JPEG n03291819/
+mv val/ILSVRC2012_val_00025949.JPEG n02110627/
+mv val/ILSVRC2012_val_00025950.JPEG n03982430/
+mv val/ILSVRC2012_val_00025951.JPEG n02093256/
+mv val/ILSVRC2012_val_00025952.JPEG n02277742/
+mv val/ILSVRC2012_val_00025953.JPEG n02965783/
+mv val/ILSVRC2012_val_00025954.JPEG n04428191/
+mv val/ILSVRC2012_val_00025955.JPEG n01740131/
+mv val/ILSVRC2012_val_00025956.JPEG n02795169/
+mv val/ILSVRC2012_val_00025957.JPEG n02119789/
+mv val/ILSVRC2012_val_00025958.JPEG n03535780/
+mv val/ILSVRC2012_val_00025959.JPEG n03461385/
+mv val/ILSVRC2012_val_00025960.JPEG n01980166/
+mv val/ILSVRC2012_val_00025961.JPEG n02486410/
+mv val/ILSVRC2012_val_00025962.JPEG n03720891/
+mv val/ILSVRC2012_val_00025963.JPEG n04597913/
+mv val/ILSVRC2012_val_00025964.JPEG n03666591/
+mv val/ILSVRC2012_val_00025965.JPEG n02843684/
+mv val/ILSVRC2012_val_00025966.JPEG n04252225/
+mv val/ILSVRC2012_val_00025967.JPEG n10565667/
+mv val/ILSVRC2012_val_00025968.JPEG n02268443/
+mv val/ILSVRC2012_val_00025969.JPEG n01491361/
+mv val/ILSVRC2012_val_00025970.JPEG n02098105/
+mv val/ILSVRC2012_val_00025971.JPEG n03775071/
+mv val/ILSVRC2012_val_00025972.JPEG n03187595/
+mv val/ILSVRC2012_val_00025973.JPEG n07760859/
+mv val/ILSVRC2012_val_00025974.JPEG n02259212/
+mv val/ILSVRC2012_val_00025975.JPEG n03042490/
+mv val/ILSVRC2012_val_00025976.JPEG n03942813/
+mv val/ILSVRC2012_val_00025977.JPEG n04069434/
+mv val/ILSVRC2012_val_00025978.JPEG n04120489/
+mv val/ILSVRC2012_val_00025979.JPEG n01820546/
+mv val/ILSVRC2012_val_00025980.JPEG n04548280/
+mv val/ILSVRC2012_val_00025981.JPEG n07718472/
+mv val/ILSVRC2012_val_00025982.JPEG n02417914/
+mv val/ILSVRC2012_val_00025983.JPEG n02095314/
+mv val/ILSVRC2012_val_00025984.JPEG n06874185/
+mv val/ILSVRC2012_val_00025985.JPEG n03447447/
+mv val/ILSVRC2012_val_00025986.JPEG n03983396/
+mv val/ILSVRC2012_val_00025987.JPEG n04592741/
+mv val/ILSVRC2012_val_00025988.JPEG n02102177/
+mv val/ILSVRC2012_val_00025989.JPEG n03649909/
+mv val/ILSVRC2012_val_00025990.JPEG n03594945/
+mv val/ILSVRC2012_val_00025991.JPEG n02099712/
+mv val/ILSVRC2012_val_00025992.JPEG n04370456/
+mv val/ILSVRC2012_val_00025993.JPEG n04517823/
+mv val/ILSVRC2012_val_00025994.JPEG n07875152/
+mv val/ILSVRC2012_val_00025995.JPEG n03207941/
+mv val/ILSVRC2012_val_00025996.JPEG n02398521/
+mv val/ILSVRC2012_val_00025997.JPEG n03954731/
+mv val/ILSVRC2012_val_00025998.JPEG n01796340/
+mv val/ILSVRC2012_val_00025999.JPEG n01798484/
+mv val/ILSVRC2012_val_00026000.JPEG n02113712/
+mv val/ILSVRC2012_val_00026001.JPEG n01491361/
+mv val/ILSVRC2012_val_00026002.JPEG n04423845/
+mv val/ILSVRC2012_val_00026003.JPEG n03483316/
+mv val/ILSVRC2012_val_00026004.JPEG n04461696/
+mv val/ILSVRC2012_val_00026005.JPEG n02106550/
+mv val/ILSVRC2012_val_00026006.JPEG n01773157/
+mv val/ILSVRC2012_val_00026007.JPEG n13052670/
+mv val/ILSVRC2012_val_00026008.JPEG n02091244/
+mv val/ILSVRC2012_val_00026009.JPEG n03706229/
+mv val/ILSVRC2012_val_00026010.JPEG n01560419/
+mv val/ILSVRC2012_val_00026011.JPEG n03832673/
+mv val/ILSVRC2012_val_00026012.JPEG n02492660/
+mv val/ILSVRC2012_val_00026013.JPEG n04099969/
+mv val/ILSVRC2012_val_00026014.JPEG n03982430/
+mv val/ILSVRC2012_val_00026015.JPEG n04532670/
+mv val/ILSVRC2012_val_00026016.JPEG n01631663/
+mv val/ILSVRC2012_val_00026017.JPEG n02085782/
+mv val/ILSVRC2012_val_00026018.JPEG n01728920/
+mv val/ILSVRC2012_val_00026019.JPEG n03240683/
+mv val/ILSVRC2012_val_00026020.JPEG n04584207/
+mv val/ILSVRC2012_val_00026021.JPEG n01806567/
+mv val/ILSVRC2012_val_00026022.JPEG n01729977/
+mv val/ILSVRC2012_val_00026023.JPEG n01601694/
+mv val/ILSVRC2012_val_00026024.JPEG n04350905/
+mv val/ILSVRC2012_val_00026025.JPEG n04179913/
+mv val/ILSVRC2012_val_00026026.JPEG n04592741/
+mv val/ILSVRC2012_val_00026027.JPEG n02108422/
+mv val/ILSVRC2012_val_00026028.JPEG n02110806/
+mv val/ILSVRC2012_val_00026029.JPEG n02814533/
+mv val/ILSVRC2012_val_00026030.JPEG n01773797/
+mv val/ILSVRC2012_val_00026031.JPEG n02704792/
+mv val/ILSVRC2012_val_00026032.JPEG n02782093/
+mv val/ILSVRC2012_val_00026033.JPEG n03916031/
+mv val/ILSVRC2012_val_00026034.JPEG n03467068/
+mv val/ILSVRC2012_val_00026035.JPEG n03710721/
+mv val/ILSVRC2012_val_00026036.JPEG n04554684/
+mv val/ILSVRC2012_val_00026037.JPEG n01955084/
+mv val/ILSVRC2012_val_00026038.JPEG n07717556/
+mv val/ILSVRC2012_val_00026039.JPEG n02009229/
+mv val/ILSVRC2012_val_00026040.JPEG n02256656/
+mv val/ILSVRC2012_val_00026041.JPEG n03095699/
+mv val/ILSVRC2012_val_00026042.JPEG n02094258/
+mv val/ILSVRC2012_val_00026043.JPEG n02486410/
+mv val/ILSVRC2012_val_00026044.JPEG n02027492/
+mv val/ILSVRC2012_val_00026045.JPEG n04200800/
+mv val/ILSVRC2012_val_00026046.JPEG n04371430/
+mv val/ILSVRC2012_val_00026047.JPEG n03662601/
+mv val/ILSVRC2012_val_00026048.JPEG n02444819/
+mv val/ILSVRC2012_val_00026049.JPEG n01665541/
+mv val/ILSVRC2012_val_00026050.JPEG n01614925/
+mv val/ILSVRC2012_val_00026051.JPEG n02112018/
+mv val/ILSVRC2012_val_00026052.JPEG n03773504/
+mv val/ILSVRC2012_val_00026053.JPEG n04505470/
+mv val/ILSVRC2012_val_00026054.JPEG n02951358/
+mv val/ILSVRC2012_val_00026055.JPEG n02948072/
+mv val/ILSVRC2012_val_00026056.JPEG n02101556/
+mv val/ILSVRC2012_val_00026057.JPEG n03868242/
+mv val/ILSVRC2012_val_00026058.JPEG n02093256/
+mv val/ILSVRC2012_val_00026059.JPEG n01641577/
+mv val/ILSVRC2012_val_00026060.JPEG n02128385/
+mv val/ILSVRC2012_val_00026061.JPEG n03000684/
+mv val/ILSVRC2012_val_00026062.JPEG n03874293/
+mv val/ILSVRC2012_val_00026063.JPEG n03134739/
+mv val/ILSVRC2012_val_00026064.JPEG n01440764/
+mv val/ILSVRC2012_val_00026065.JPEG n02268853/
+mv val/ILSVRC2012_val_00026066.JPEG n07584110/
+mv val/ILSVRC2012_val_00026067.JPEG n04399382/
+mv val/ILSVRC2012_val_00026068.JPEG n01843065/
+mv val/ILSVRC2012_val_00026069.JPEG n03188531/
+mv val/ILSVRC2012_val_00026070.JPEG n02086240/
+mv val/ILSVRC2012_val_00026071.JPEG n04540053/
+mv val/ILSVRC2012_val_00026072.JPEG n01829413/
+mv val/ILSVRC2012_val_00026073.JPEG n04462240/
+mv val/ILSVRC2012_val_00026074.JPEG n03018349/
+mv val/ILSVRC2012_val_00026075.JPEG n03782006/
+mv val/ILSVRC2012_val_00026076.JPEG n07730033/
+mv val/ILSVRC2012_val_00026077.JPEG n03676483/
+mv val/ILSVRC2012_val_00026078.JPEG n04275548/
+mv val/ILSVRC2012_val_00026079.JPEG n03930630/
+mv val/ILSVRC2012_val_00026080.JPEG n03764736/
+mv val/ILSVRC2012_val_00026081.JPEG n02226429/
+mv val/ILSVRC2012_val_00026082.JPEG n02007558/
+mv val/ILSVRC2012_val_00026083.JPEG n04149813/
+mv val/ILSVRC2012_val_00026084.JPEG n01820546/
+mv val/ILSVRC2012_val_00026085.JPEG n01829413/
+mv val/ILSVRC2012_val_00026086.JPEG n02110185/
+mv val/ILSVRC2012_val_00026087.JPEG n02107683/
+mv val/ILSVRC2012_val_00026088.JPEG n03840681/
+mv val/ILSVRC2012_val_00026089.JPEG n02018207/
+mv val/ILSVRC2012_val_00026090.JPEG n01833805/
+mv val/ILSVRC2012_val_00026091.JPEG n03902125/
+mv val/ILSVRC2012_val_00026092.JPEG n03868863/
+mv val/ILSVRC2012_val_00026093.JPEG n03443371/
+mv val/ILSVRC2012_val_00026094.JPEG n02113978/
+mv val/ILSVRC2012_val_00026095.JPEG n03793489/
+mv val/ILSVRC2012_val_00026096.JPEG n02859443/
+mv val/ILSVRC2012_val_00026097.JPEG n02097047/
+mv val/ILSVRC2012_val_00026098.JPEG n04192698/
+mv val/ILSVRC2012_val_00026099.JPEG n07590611/
+mv val/ILSVRC2012_val_00026100.JPEG n07880968/
+mv val/ILSVRC2012_val_00026101.JPEG n07697537/
+mv val/ILSVRC2012_val_00026102.JPEG n02342885/
+mv val/ILSVRC2012_val_00026103.JPEG n02398521/
+mv val/ILSVRC2012_val_00026104.JPEG n02002724/
+mv val/ILSVRC2012_val_00026105.JPEG n02910353/
+mv val/ILSVRC2012_val_00026106.JPEG n02442845/
+mv val/ILSVRC2012_val_00026107.JPEG n02906734/
+mv val/ILSVRC2012_val_00026108.JPEG n02494079/
+mv val/ILSVRC2012_val_00026109.JPEG n02091831/
+mv val/ILSVRC2012_val_00026110.JPEG n02823750/
+mv val/ILSVRC2012_val_00026111.JPEG n04447861/
+mv val/ILSVRC2012_val_00026112.JPEG n01796340/
+mv val/ILSVRC2012_val_00026113.JPEG n03089624/
+mv val/ILSVRC2012_val_00026114.JPEG n03924679/
+mv val/ILSVRC2012_val_00026115.JPEG n01980166/
+mv val/ILSVRC2012_val_00026116.JPEG n04435653/
+mv val/ILSVRC2012_val_00026117.JPEG n03649909/
+mv val/ILSVRC2012_val_00026118.JPEG n02107142/
+mv val/ILSVRC2012_val_00026119.JPEG n02110063/
+mv val/ILSVRC2012_val_00026120.JPEG n02403003/
+mv val/ILSVRC2012_val_00026121.JPEG n04081281/
+mv val/ILSVRC2012_val_00026122.JPEG n01735189/
+mv val/ILSVRC2012_val_00026123.JPEG n01532829/
+mv val/ILSVRC2012_val_00026124.JPEG n03891251/
+mv val/ILSVRC2012_val_00026125.JPEG n02077923/
+mv val/ILSVRC2012_val_00026126.JPEG n03977966/
+mv val/ILSVRC2012_val_00026127.JPEG n03452741/
+mv val/ILSVRC2012_val_00026128.JPEG n04465501/
+mv val/ILSVRC2012_val_00026129.JPEG n02777292/
+mv val/ILSVRC2012_val_00026130.JPEG n02113799/
+mv val/ILSVRC2012_val_00026131.JPEG n04367480/
+mv val/ILSVRC2012_val_00026132.JPEG n03787032/
+mv val/ILSVRC2012_val_00026133.JPEG n01744401/
+mv val/ILSVRC2012_val_00026134.JPEG n02667093/
+mv val/ILSVRC2012_val_00026135.JPEG n03933933/
+mv val/ILSVRC2012_val_00026136.JPEG n01580077/
+mv val/ILSVRC2012_val_00026137.JPEG n02794156/
+mv val/ILSVRC2012_val_00026138.JPEG n01796340/
+mv val/ILSVRC2012_val_00026139.JPEG n02002556/
+mv val/ILSVRC2012_val_00026140.JPEG n02837789/
+mv val/ILSVRC2012_val_00026141.JPEG n01818515/
+mv val/ILSVRC2012_val_00026142.JPEG n09835506/
+mv val/ILSVRC2012_val_00026143.JPEG n04604644/
+mv val/ILSVRC2012_val_00026144.JPEG n01917289/
+mv val/ILSVRC2012_val_00026145.JPEG n03180011/
+mv val/ILSVRC2012_val_00026146.JPEG n02102480/
+mv val/ILSVRC2012_val_00026147.JPEG n03873416/
+mv val/ILSVRC2012_val_00026148.JPEG n03995372/
+mv val/ILSVRC2012_val_00026149.JPEG n03884397/
+mv val/ILSVRC2012_val_00026150.JPEG n03657121/
+mv val/ILSVRC2012_val_00026151.JPEG n02093754/
+mv val/ILSVRC2012_val_00026152.JPEG n02102318/
+mv val/ILSVRC2012_val_00026153.JPEG n02097658/
+mv val/ILSVRC2012_val_00026154.JPEG n02108422/
+mv val/ILSVRC2012_val_00026155.JPEG n01855672/
+mv val/ILSVRC2012_val_00026156.JPEG n02489166/
+mv val/ILSVRC2012_val_00026157.JPEG n03208938/
+mv val/ILSVRC2012_val_00026158.JPEG n02116738/
+mv val/ILSVRC2012_val_00026159.JPEG n07802026/
+mv val/ILSVRC2012_val_00026160.JPEG n03584254/
+mv val/ILSVRC2012_val_00026161.JPEG n02108000/
+mv val/ILSVRC2012_val_00026162.JPEG n09256479/
+mv val/ILSVRC2012_val_00026163.JPEG n02892767/
+mv val/ILSVRC2012_val_00026164.JPEG n02105162/
+mv val/ILSVRC2012_val_00026165.JPEG n03388549/
+mv val/ILSVRC2012_val_00026166.JPEG n02870880/
+mv val/ILSVRC2012_val_00026167.JPEG n02116738/
+mv val/ILSVRC2012_val_00026168.JPEG n01807496/
+mv val/ILSVRC2012_val_00026169.JPEG n03045698/
+mv val/ILSVRC2012_val_00026170.JPEG n03717622/
+mv val/ILSVRC2012_val_00026171.JPEG n03109150/
+mv val/ILSVRC2012_val_00026172.JPEG n03388549/
+mv val/ILSVRC2012_val_00026173.JPEG n02437616/
+mv val/ILSVRC2012_val_00026174.JPEG n07930864/
+mv val/ILSVRC2012_val_00026175.JPEG n03991062/
+mv val/ILSVRC2012_val_00026176.JPEG n03709823/
+mv val/ILSVRC2012_val_00026177.JPEG n03680355/
+mv val/ILSVRC2012_val_00026178.JPEG n02033041/
+mv val/ILSVRC2012_val_00026179.JPEG n02843684/
+mv val/ILSVRC2012_val_00026180.JPEG n02795169/
+mv val/ILSVRC2012_val_00026181.JPEG n02236044/
+mv val/ILSVRC2012_val_00026182.JPEG n02509815/
+mv val/ILSVRC2012_val_00026183.JPEG n04442312/
+mv val/ILSVRC2012_val_00026184.JPEG n12998815/
+mv val/ILSVRC2012_val_00026185.JPEG n03255030/
+mv val/ILSVRC2012_val_00026186.JPEG n02111889/
+mv val/ILSVRC2012_val_00026187.JPEG n03595614/
+mv val/ILSVRC2012_val_00026188.JPEG n03788195/
+mv val/ILSVRC2012_val_00026189.JPEG n02690373/
+mv val/ILSVRC2012_val_00026190.JPEG n01756291/
+mv val/ILSVRC2012_val_00026191.JPEG n01698640/
+mv val/ILSVRC2012_val_00026192.JPEG n07565083/
+mv val/ILSVRC2012_val_00026193.JPEG n01983481/
+mv val/ILSVRC2012_val_00026194.JPEG n03445777/
+mv val/ILSVRC2012_val_00026195.JPEG n03998194/
+mv val/ILSVRC2012_val_00026196.JPEG n02879718/
+mv val/ILSVRC2012_val_00026197.JPEG n07930864/
+mv val/ILSVRC2012_val_00026198.JPEG n03255030/
+mv val/ILSVRC2012_val_00026199.JPEG n02086646/
+mv val/ILSVRC2012_val_00026200.JPEG n04120489/
+mv val/ILSVRC2012_val_00026201.JPEG n03733281/
+mv val/ILSVRC2012_val_00026202.JPEG n01667114/
+mv val/ILSVRC2012_val_00026203.JPEG n03532672/
+mv val/ILSVRC2012_val_00026204.JPEG n03179701/
+mv val/ILSVRC2012_val_00026205.JPEG n04229816/
+mv val/ILSVRC2012_val_00026206.JPEG n03733281/
+mv val/ILSVRC2012_val_00026207.JPEG n09256479/
+mv val/ILSVRC2012_val_00026208.JPEG n02105251/
+mv val/ILSVRC2012_val_00026209.JPEG n03146219/
+mv val/ILSVRC2012_val_00026210.JPEG n04330267/
+mv val/ILSVRC2012_val_00026211.JPEG n06874185/
+mv val/ILSVRC2012_val_00026212.JPEG n12620546/
+mv val/ILSVRC2012_val_00026213.JPEG n01641577/
+mv val/ILSVRC2012_val_00026214.JPEG n02106550/
+mv val/ILSVRC2012_val_00026215.JPEG n02445715/
+mv val/ILSVRC2012_val_00026216.JPEG n03146219/
+mv val/ILSVRC2012_val_00026217.JPEG n02493793/
+mv val/ILSVRC2012_val_00026218.JPEG n02509815/
+mv val/ILSVRC2012_val_00026219.JPEG n02804610/
+mv val/ILSVRC2012_val_00026220.JPEG n03590841/
+mv val/ILSVRC2012_val_00026221.JPEG n01871265/
+mv val/ILSVRC2012_val_00026222.JPEG n02483362/
+mv val/ILSVRC2012_val_00026223.JPEG n02437616/
+mv val/ILSVRC2012_val_00026224.JPEG n03895866/
+mv val/ILSVRC2012_val_00026225.JPEG n02071294/
+mv val/ILSVRC2012_val_00026226.JPEG n03291819/
+mv val/ILSVRC2012_val_00026227.JPEG n13044778/
+mv val/ILSVRC2012_val_00026228.JPEG n02114855/
+mv val/ILSVRC2012_val_00026229.JPEG n01984695/
+mv val/ILSVRC2012_val_00026230.JPEG n02500267/
+mv val/ILSVRC2012_val_00026231.JPEG n06359193/
+mv val/ILSVRC2012_val_00026232.JPEG n01843065/
+mv val/ILSVRC2012_val_00026233.JPEG n03763968/
+mv val/ILSVRC2012_val_00026234.JPEG n02643566/
+mv val/ILSVRC2012_val_00026235.JPEG n04258138/
+mv val/ILSVRC2012_val_00026236.JPEG n02667093/
+mv val/ILSVRC2012_val_00026237.JPEG n07734744/
+mv val/ILSVRC2012_val_00026238.JPEG n04153751/
+mv val/ILSVRC2012_val_00026239.JPEG n02138441/
+mv val/ILSVRC2012_val_00026240.JPEG n03188531/
+mv val/ILSVRC2012_val_00026241.JPEG n07802026/
+mv val/ILSVRC2012_val_00026242.JPEG n02100583/
+mv val/ILSVRC2012_val_00026243.JPEG n07860988/
+mv val/ILSVRC2012_val_00026244.JPEG n01817953/
+mv val/ILSVRC2012_val_00026245.JPEG n02106166/
+mv val/ILSVRC2012_val_00026246.JPEG n02483708/
+mv val/ILSVRC2012_val_00026247.JPEG n03782006/
+mv val/ILSVRC2012_val_00026248.JPEG n02007558/
+mv val/ILSVRC2012_val_00026249.JPEG n04476259/
+mv val/ILSVRC2012_val_00026250.JPEG n02835271/
+mv val/ILSVRC2012_val_00026251.JPEG n03124170/
+mv val/ILSVRC2012_val_00026252.JPEG n04550184/
+mv val/ILSVRC2012_val_00026253.JPEG n03661043/
+mv val/ILSVRC2012_val_00026254.JPEG n04204238/
+mv val/ILSVRC2012_val_00026255.JPEG n03776460/
+mv val/ILSVRC2012_val_00026256.JPEG n03837869/
+mv val/ILSVRC2012_val_00026257.JPEG n04443257/
+mv val/ILSVRC2012_val_00026258.JPEG n02486261/
+mv val/ILSVRC2012_val_00026259.JPEG n01537544/
+mv val/ILSVRC2012_val_00026260.JPEG n02317335/
+mv val/ILSVRC2012_val_00026261.JPEG n02134418/
+mv val/ILSVRC2012_val_00026262.JPEG n04557648/
+mv val/ILSVRC2012_val_00026263.JPEG n01872401/
+mv val/ILSVRC2012_val_00026264.JPEG n04209239/
+mv val/ILSVRC2012_val_00026265.JPEG n01677366/
+mv val/ILSVRC2012_val_00026266.JPEG n02100735/
+mv val/ILSVRC2012_val_00026267.JPEG n02096437/
+mv val/ILSVRC2012_val_00026268.JPEG n04479046/
+mv val/ILSVRC2012_val_00026269.JPEG n01693334/
+mv val/ILSVRC2012_val_00026270.JPEG n02965783/
+mv val/ILSVRC2012_val_00026271.JPEG n01514859/
+mv val/ILSVRC2012_val_00026272.JPEG n07613480/
+mv val/ILSVRC2012_val_00026273.JPEG n02108422/
+mv val/ILSVRC2012_val_00026274.JPEG n01914609/
+mv val/ILSVRC2012_val_00026275.JPEG n03482405/
+mv val/ILSVRC2012_val_00026276.JPEG n03710637/
+mv val/ILSVRC2012_val_00026277.JPEG n04009552/
+mv val/ILSVRC2012_val_00026278.JPEG n02106166/
+mv val/ILSVRC2012_val_00026279.JPEG n01531178/
+mv val/ILSVRC2012_val_00026280.JPEG n02704792/
+mv val/ILSVRC2012_val_00026281.JPEG n04487394/
+mv val/ILSVRC2012_val_00026282.JPEG n02834397/
+mv val/ILSVRC2012_val_00026283.JPEG n02108915/
+mv val/ILSVRC2012_val_00026284.JPEG n02484975/
+mv val/ILSVRC2012_val_00026285.JPEG n04310018/
+mv val/ILSVRC2012_val_00026286.JPEG n02095570/
+mv val/ILSVRC2012_val_00026287.JPEG n03447721/
+mv val/ILSVRC2012_val_00026288.JPEG n02119022/
+mv val/ILSVRC2012_val_00026289.JPEG n03017168/
+mv val/ILSVRC2012_val_00026290.JPEG n03697007/
+mv val/ILSVRC2012_val_00026291.JPEG n03249569/
+mv val/ILSVRC2012_val_00026292.JPEG n02835271/
+mv val/ILSVRC2012_val_00026293.JPEG n04591713/
+mv val/ILSVRC2012_val_00026294.JPEG n03347037/
+mv val/ILSVRC2012_val_00026295.JPEG n02791124/
+mv val/ILSVRC2012_val_00026296.JPEG n01692333/
+mv val/ILSVRC2012_val_00026297.JPEG n01882714/
+mv val/ILSVRC2012_val_00026298.JPEG n03196217/
+mv val/ILSVRC2012_val_00026299.JPEG n02422699/
+mv val/ILSVRC2012_val_00026300.JPEG n04041544/
+mv val/ILSVRC2012_val_00026301.JPEG n03796401/
+mv val/ILSVRC2012_val_00026302.JPEG n02028035/
+mv val/ILSVRC2012_val_00026303.JPEG n02966193/
+mv val/ILSVRC2012_val_00026304.JPEG n04235860/
+mv val/ILSVRC2012_val_00026305.JPEG n03642806/
+mv val/ILSVRC2012_val_00026306.JPEG n03838899/
+mv val/ILSVRC2012_val_00026307.JPEG n02510455/
+mv val/ILSVRC2012_val_00026308.JPEG n01930112/
+mv val/ILSVRC2012_val_00026309.JPEG n03781244/
+mv val/ILSVRC2012_val_00026310.JPEG n02091032/
+mv val/ILSVRC2012_val_00026311.JPEG n02025239/
+mv val/ILSVRC2012_val_00026312.JPEG n03196217/
+mv val/ILSVRC2012_val_00026313.JPEG n02094114/
+mv val/ILSVRC2012_val_00026314.JPEG n01978455/
+mv val/ILSVRC2012_val_00026315.JPEG n04254120/
+mv val/ILSVRC2012_val_00026316.JPEG n13040303/
+mv val/ILSVRC2012_val_00026317.JPEG n03459775/
+mv val/ILSVRC2012_val_00026318.JPEG n07716358/
+mv val/ILSVRC2012_val_00026319.JPEG n03016953/
+mv val/ILSVRC2012_val_00026320.JPEG n03876231/
+mv val/ILSVRC2012_val_00026321.JPEG n02892767/
+mv val/ILSVRC2012_val_00026322.JPEG n04069434/
+mv val/ILSVRC2012_val_00026323.JPEG n02256656/
+mv val/ILSVRC2012_val_00026324.JPEG n02168699/
+mv val/ILSVRC2012_val_00026325.JPEG n02128757/
+mv val/ILSVRC2012_val_00026326.JPEG n01986214/
+mv val/ILSVRC2012_val_00026327.JPEG n02009229/
+mv val/ILSVRC2012_val_00026328.JPEG n02790996/
+mv val/ILSVRC2012_val_00026329.JPEG n03630383/
+mv val/ILSVRC2012_val_00026330.JPEG n07718747/
+mv val/ILSVRC2012_val_00026331.JPEG n02361337/
+mv val/ILSVRC2012_val_00026332.JPEG n02951585/
+mv val/ILSVRC2012_val_00026333.JPEG n07873807/
+mv val/ILSVRC2012_val_00026334.JPEG n03223299/
+mv val/ILSVRC2012_val_00026335.JPEG n07836838/
+mv val/ILSVRC2012_val_00026336.JPEG n04266014/
+mv val/ILSVRC2012_val_00026337.JPEG n03956157/
+mv val/ILSVRC2012_val_00026338.JPEG n02002724/
+mv val/ILSVRC2012_val_00026339.JPEG n02077923/
+mv val/ILSVRC2012_val_00026340.JPEG n02002556/
+mv val/ILSVRC2012_val_00026341.JPEG n02951358/
+mv val/ILSVRC2012_val_00026342.JPEG n03259280/
+mv val/ILSVRC2012_val_00026343.JPEG n02113186/
+mv val/ILSVRC2012_val_00026344.JPEG n02843684/
+mv val/ILSVRC2012_val_00026345.JPEG n04332243/
+mv val/ILSVRC2012_val_00026346.JPEG n01775062/
+mv val/ILSVRC2012_val_00026347.JPEG n02777292/
+mv val/ILSVRC2012_val_00026348.JPEG n04118538/
+mv val/ILSVRC2012_val_00026349.JPEG n02226429/
+mv val/ILSVRC2012_val_00026350.JPEG n03908618/
+mv val/ILSVRC2012_val_00026351.JPEG n02782093/
+mv val/ILSVRC2012_val_00026352.JPEG n03777568/
+mv val/ILSVRC2012_val_00026353.JPEG n02101556/
+mv val/ILSVRC2012_val_00026354.JPEG n02701002/
+mv val/ILSVRC2012_val_00026355.JPEG n02018795/
+mv val/ILSVRC2012_val_00026356.JPEG n02102318/
+mv val/ILSVRC2012_val_00026357.JPEG n03045698/
+mv val/ILSVRC2012_val_00026358.JPEG n04254680/
+mv val/ILSVRC2012_val_00026359.JPEG n02692877/
+mv val/ILSVRC2012_val_00026360.JPEG n12620546/
+mv val/ILSVRC2012_val_00026361.JPEG n02325366/
+mv val/ILSVRC2012_val_00026362.JPEG n01560419/
+mv val/ILSVRC2012_val_00026363.JPEG n02977058/
+mv val/ILSVRC2012_val_00026364.JPEG n03127925/
+mv val/ILSVRC2012_val_00026365.JPEG n04325704/
+mv val/ILSVRC2012_val_00026366.JPEG n03483316/
+mv val/ILSVRC2012_val_00026367.JPEG n02101556/
+mv val/ILSVRC2012_val_00026368.JPEG n03450230/
+mv val/ILSVRC2012_val_00026369.JPEG n04264628/
+mv val/ILSVRC2012_val_00026370.JPEG n02101556/
+mv val/ILSVRC2012_val_00026371.JPEG n03482405/
+mv val/ILSVRC2012_val_00026372.JPEG n07715103/
+mv val/ILSVRC2012_val_00026373.JPEG n03544143/
+mv val/ILSVRC2012_val_00026374.JPEG n02395406/
+mv val/ILSVRC2012_val_00026375.JPEG n01797886/
+mv val/ILSVRC2012_val_00026376.JPEG n03207941/
+mv val/ILSVRC2012_val_00026377.JPEG n04389033/
+mv val/ILSVRC2012_val_00026378.JPEG n01978455/
+mv val/ILSVRC2012_val_00026379.JPEG n01755581/
+mv val/ILSVRC2012_val_00026380.JPEG n02708093/
+mv val/ILSVRC2012_val_00026381.JPEG n03461385/
+mv val/ILSVRC2012_val_00026382.JPEG n02342885/
+mv val/ILSVRC2012_val_00026383.JPEG n01930112/
+mv val/ILSVRC2012_val_00026384.JPEG n04009552/
+mv val/ILSVRC2012_val_00026385.JPEG n02804610/
+mv val/ILSVRC2012_val_00026386.JPEG n13037406/
+mv val/ILSVRC2012_val_00026387.JPEG n02092339/
+mv val/ILSVRC2012_val_00026388.JPEG n02106550/
+mv val/ILSVRC2012_val_00026389.JPEG n04033995/
+mv val/ILSVRC2012_val_00026390.JPEG n02395406/
+mv val/ILSVRC2012_val_00026391.JPEG n03733131/
+mv val/ILSVRC2012_val_00026392.JPEG n02859443/
+mv val/ILSVRC2012_val_00026393.JPEG n04008634/
+mv val/ILSVRC2012_val_00026394.JPEG n02841315/
+mv val/ILSVRC2012_val_00026395.JPEG n02412080/
+mv val/ILSVRC2012_val_00026396.JPEG n03785016/
+mv val/ILSVRC2012_val_00026397.JPEG n01440764/
+mv val/ILSVRC2012_val_00026398.JPEG n03100240/
+mv val/ILSVRC2012_val_00026399.JPEG n01665541/
+mv val/ILSVRC2012_val_00026400.JPEG n03710721/
+mv val/ILSVRC2012_val_00026401.JPEG n04599235/
+mv val/ILSVRC2012_val_00026402.JPEG n04370456/
+mv val/ILSVRC2012_val_00026403.JPEG n02124075/
+mv val/ILSVRC2012_val_00026404.JPEG n02138441/
+mv val/ILSVRC2012_val_00026405.JPEG n03085013/
+mv val/ILSVRC2012_val_00026406.JPEG n01744401/
+mv val/ILSVRC2012_val_00026407.JPEG n04296562/
+mv val/ILSVRC2012_val_00026408.JPEG n09835506/
+mv val/ILSVRC2012_val_00026409.JPEG n03785016/
+mv val/ILSVRC2012_val_00026410.JPEG n07754684/
+mv val/ILSVRC2012_val_00026411.JPEG n04311004/
+mv val/ILSVRC2012_val_00026412.JPEG n02124075/
+mv val/ILSVRC2012_val_00026413.JPEG n02802426/
+mv val/ILSVRC2012_val_00026414.JPEG n04239074/
+mv val/ILSVRC2012_val_00026415.JPEG n02971356/
+mv val/ILSVRC2012_val_00026416.JPEG n02009229/
+mv val/ILSVRC2012_val_00026417.JPEG n02096177/
+mv val/ILSVRC2012_val_00026418.JPEG n01695060/
+mv val/ILSVRC2012_val_00026419.JPEG n03954731/
+mv val/ILSVRC2012_val_00026420.JPEG n01828970/
+mv val/ILSVRC2012_val_00026421.JPEG n02086240/
+mv val/ILSVRC2012_val_00026422.JPEG n02447366/
+mv val/ILSVRC2012_val_00026423.JPEG n03095699/
+mv val/ILSVRC2012_val_00026424.JPEG n03590841/
+mv val/ILSVRC2012_val_00026425.JPEG n03482405/
+mv val/ILSVRC2012_val_00026426.JPEG n02107574/
+mv val/ILSVRC2012_val_00026427.JPEG n02096294/
+mv val/ILSVRC2012_val_00026428.JPEG n03085013/
+mv val/ILSVRC2012_val_00026429.JPEG n04456115/
+mv val/ILSVRC2012_val_00026430.JPEG n04486054/
+mv val/ILSVRC2012_val_00026431.JPEG n04599235/
+mv val/ILSVRC2012_val_00026432.JPEG n03141823/
+mv val/ILSVRC2012_val_00026433.JPEG n04263257/
+mv val/ILSVRC2012_val_00026434.JPEG n03877845/
+mv val/ILSVRC2012_val_00026435.JPEG n04428191/
+mv val/ILSVRC2012_val_00026436.JPEG n03976657/
+mv val/ILSVRC2012_val_00026437.JPEG n02797295/
+mv val/ILSVRC2012_val_00026438.JPEG n03637318/
+mv val/ILSVRC2012_val_00026439.JPEG n03041632/
+mv val/ILSVRC2012_val_00026440.JPEG n07579787/
+mv val/ILSVRC2012_val_00026441.JPEG n02687172/
+mv val/ILSVRC2012_val_00026442.JPEG n03201208/
+mv val/ILSVRC2012_val_00026443.JPEG n04579145/
+mv val/ILSVRC2012_val_00026444.JPEG n01608432/
+mv val/ILSVRC2012_val_00026445.JPEG n02099849/
+mv val/ILSVRC2012_val_00026446.JPEG n01667114/
+mv val/ILSVRC2012_val_00026447.JPEG n04372370/
+mv val/ILSVRC2012_val_00026448.JPEG n02106166/
+mv val/ILSVRC2012_val_00026449.JPEG n03075370/
+mv val/ILSVRC2012_val_00026450.JPEG n02138441/
+mv val/ILSVRC2012_val_00026451.JPEG n03028079/
+mv val/ILSVRC2012_val_00026452.JPEG n01930112/
+mv val/ILSVRC2012_val_00026453.JPEG n03388183/
+mv val/ILSVRC2012_val_00026454.JPEG n03825788/
+mv val/ILSVRC2012_val_00026455.JPEG n13044778/
+mv val/ILSVRC2012_val_00026456.JPEG n02687172/
+mv val/ILSVRC2012_val_00026457.JPEG n03692522/
+mv val/ILSVRC2012_val_00026458.JPEG n02391049/
+mv val/ILSVRC2012_val_00026459.JPEG n04254120/
+mv val/ILSVRC2012_val_00026460.JPEG n03146219/
+mv val/ILSVRC2012_val_00026461.JPEG n03126707/
+mv val/ILSVRC2012_val_00026462.JPEG n02025239/
+mv val/ILSVRC2012_val_00026463.JPEG n07714571/
+mv val/ILSVRC2012_val_00026464.JPEG n02869837/
+mv val/ILSVRC2012_val_00026465.JPEG n01580077/
+mv val/ILSVRC2012_val_00026466.JPEG n03594945/
+mv val/ILSVRC2012_val_00026467.JPEG n02109525/
+mv val/ILSVRC2012_val_00026468.JPEG n04099969/
+mv val/ILSVRC2012_val_00026469.JPEG n03792972/
+mv val/ILSVRC2012_val_00026470.JPEG n03623198/
+mv val/ILSVRC2012_val_00026471.JPEG n01872401/
+mv val/ILSVRC2012_val_00026472.JPEG n02441942/
+mv val/ILSVRC2012_val_00026473.JPEG n03032252/
+mv val/ILSVRC2012_val_00026474.JPEG n02687172/
+mv val/ILSVRC2012_val_00026475.JPEG n02096294/
+mv val/ILSVRC2012_val_00026476.JPEG n02037110/
+mv val/ILSVRC2012_val_00026477.JPEG n04310018/
+mv val/ILSVRC2012_val_00026478.JPEG n02280649/
+mv val/ILSVRC2012_val_00026479.JPEG n03992509/
+mv val/ILSVRC2012_val_00026480.JPEG n04037443/
+mv val/ILSVRC2012_val_00026481.JPEG n01806567/
+mv val/ILSVRC2012_val_00026482.JPEG n02325366/
+mv val/ILSVRC2012_val_00026483.JPEG n03372029/
+mv val/ILSVRC2012_val_00026484.JPEG n02259212/
+mv val/ILSVRC2012_val_00026485.JPEG n04371430/
+mv val/ILSVRC2012_val_00026486.JPEG n02391049/
+mv val/ILSVRC2012_val_00026487.JPEG n01755581/
+mv val/ILSVRC2012_val_00026488.JPEG n01820546/
+mv val/ILSVRC2012_val_00026489.JPEG n02264363/
+mv val/ILSVRC2012_val_00026490.JPEG n01494475/
+mv val/ILSVRC2012_val_00026491.JPEG n03201208/
+mv val/ILSVRC2012_val_00026492.JPEG n01774750/
+mv val/ILSVRC2012_val_00026493.JPEG n03259280/
+mv val/ILSVRC2012_val_00026494.JPEG n02687172/
+mv val/ILSVRC2012_val_00026495.JPEG n04090263/
+mv val/ILSVRC2012_val_00026496.JPEG n02483708/
+mv val/ILSVRC2012_val_00026497.JPEG n04487081/
+mv val/ILSVRC2012_val_00026498.JPEG n03218198/
+mv val/ILSVRC2012_val_00026499.JPEG n02480495/
+mv val/ILSVRC2012_val_00026500.JPEG n01692333/
+mv val/ILSVRC2012_val_00026501.JPEG n03017168/
+mv val/ILSVRC2012_val_00026502.JPEG n01843065/
+mv val/ILSVRC2012_val_00026503.JPEG n03930630/
+mv val/ILSVRC2012_val_00026504.JPEG n02056570/
+mv val/ILSVRC2012_val_00026505.JPEG n03041632/
+mv val/ILSVRC2012_val_00026506.JPEG n02799071/
+mv val/ILSVRC2012_val_00026507.JPEG n03344393/
+mv val/ILSVRC2012_val_00026508.JPEG n01514859/
+mv val/ILSVRC2012_val_00026509.JPEG n02113978/
+mv val/ILSVRC2012_val_00026510.JPEG n02027492/
+mv val/ILSVRC2012_val_00026511.JPEG n01981276/
+mv val/ILSVRC2012_val_00026512.JPEG n02397096/
+mv val/ILSVRC2012_val_00026513.JPEG n04192698/
+mv val/ILSVRC2012_val_00026514.JPEG n03134739/
+mv val/ILSVRC2012_val_00026515.JPEG n02666196/
+mv val/ILSVRC2012_val_00026516.JPEG n02117135/
+mv val/ILSVRC2012_val_00026517.JPEG n04461696/
+mv val/ILSVRC2012_val_00026518.JPEG n02231487/
+mv val/ILSVRC2012_val_00026519.JPEG n09246464/
+mv val/ILSVRC2012_val_00026520.JPEG n04149813/
+mv val/ILSVRC2012_val_00026521.JPEG n02102040/
+mv val/ILSVRC2012_val_00026522.JPEG n02086910/
+mv val/ILSVRC2012_val_00026523.JPEG n04355338/
+mv val/ILSVRC2012_val_00026524.JPEG n02457408/
+mv val/ILSVRC2012_val_00026525.JPEG n02093428/
+mv val/ILSVRC2012_val_00026526.JPEG n01689811/
+mv val/ILSVRC2012_val_00026527.JPEG n03481172/
+mv val/ILSVRC2012_val_00026528.JPEG n07836838/
+mv val/ILSVRC2012_val_00026529.JPEG n03803284/
+mv val/ILSVRC2012_val_00026530.JPEG n01910747/
+mv val/ILSVRC2012_val_00026531.JPEG n04553703/
+mv val/ILSVRC2012_val_00026532.JPEG n03478589/
+mv val/ILSVRC2012_val_00026533.JPEG n03584829/
+mv val/ILSVRC2012_val_00026534.JPEG n04254777/
+mv val/ILSVRC2012_val_00026535.JPEG n04254120/
+mv val/ILSVRC2012_val_00026536.JPEG n02105505/
+mv val/ILSVRC2012_val_00026537.JPEG n02361337/
+mv val/ILSVRC2012_val_00026538.JPEG n03992509/
+mv val/ILSVRC2012_val_00026539.JPEG n02804610/
+mv val/ILSVRC2012_val_00026540.JPEG n02102318/
+mv val/ILSVRC2012_val_00026541.JPEG n01560419/
+mv val/ILSVRC2012_val_00026542.JPEG n01773549/
+mv val/ILSVRC2012_val_00026543.JPEG n03902125/
+mv val/ILSVRC2012_val_00026544.JPEG n06359193/
+mv val/ILSVRC2012_val_00026545.JPEG n02129165/
+mv val/ILSVRC2012_val_00026546.JPEG n02120079/
+mv val/ILSVRC2012_val_00026547.JPEG n02113712/
+mv val/ILSVRC2012_val_00026548.JPEG n01728920/
+mv val/ILSVRC2012_val_00026549.JPEG n03160309/
+mv val/ILSVRC2012_val_00026550.JPEG n07871810/
+mv val/ILSVRC2012_val_00026551.JPEG n04258138/
+mv val/ILSVRC2012_val_00026552.JPEG n03045698/
+mv val/ILSVRC2012_val_00026553.JPEG n04552348/
+mv val/ILSVRC2012_val_00026554.JPEG n13044778/
+mv val/ILSVRC2012_val_00026555.JPEG n03717622/
+mv val/ILSVRC2012_val_00026556.JPEG n02025239/
+mv val/ILSVRC2012_val_00026557.JPEG n02268443/
+mv val/ILSVRC2012_val_00026558.JPEG n02108915/
+mv val/ILSVRC2012_val_00026559.JPEG n04542943/
+mv val/ILSVRC2012_val_00026560.JPEG n03240683/
+mv val/ILSVRC2012_val_00026561.JPEG n02966687/
+mv val/ILSVRC2012_val_00026562.JPEG n07754684/
+mv val/ILSVRC2012_val_00026563.JPEG n03991062/
+mv val/ILSVRC2012_val_00026564.JPEG n02769748/
+mv val/ILSVRC2012_val_00026565.JPEG n03187595/
+mv val/ILSVRC2012_val_00026566.JPEG n03271574/
+mv val/ILSVRC2012_val_00026567.JPEG n02256656/
+mv val/ILSVRC2012_val_00026568.JPEG n03637318/
+mv val/ILSVRC2012_val_00026569.JPEG n04357314/
+mv val/ILSVRC2012_val_00026570.JPEG n03207941/
+mv val/ILSVRC2012_val_00026571.JPEG n01728920/
+mv val/ILSVRC2012_val_00026572.JPEG n04074963/
+mv val/ILSVRC2012_val_00026573.JPEG n03000684/
+mv val/ILSVRC2012_val_00026574.JPEG n04118538/
+mv val/ILSVRC2012_val_00026575.JPEG n03888257/
+mv val/ILSVRC2012_val_00026576.JPEG n03000134/
+mv val/ILSVRC2012_val_00026577.JPEG n02930766/
+mv val/ILSVRC2012_val_00026578.JPEG n02437616/
+mv val/ILSVRC2012_val_00026579.JPEG n01622779/
+mv val/ILSVRC2012_val_00026580.JPEG n03954731/
+mv val/ILSVRC2012_val_00026581.JPEG n04266014/
+mv val/ILSVRC2012_val_00026582.JPEG n02108915/
+mv val/ILSVRC2012_val_00026583.JPEG n01729977/
+mv val/ILSVRC2012_val_00026584.JPEG n04553703/
+mv val/ILSVRC2012_val_00026585.JPEG n02328150/
+mv val/ILSVRC2012_val_00026586.JPEG n07715103/
+mv val/ILSVRC2012_val_00026587.JPEG n03617480/
+mv val/ILSVRC2012_val_00026588.JPEG n02441942/
+mv val/ILSVRC2012_val_00026589.JPEG n01734418/
+mv val/ILSVRC2012_val_00026590.JPEG n02229544/
+mv val/ILSVRC2012_val_00026591.JPEG n02259212/
+mv val/ILSVRC2012_val_00026592.JPEG n03017168/
+mv val/ILSVRC2012_val_00026593.JPEG n02077923/
+mv val/ILSVRC2012_val_00026594.JPEG n03871628/
+mv val/ILSVRC2012_val_00026595.JPEG n02025239/
+mv val/ILSVRC2012_val_00026596.JPEG n02992211/
+mv val/ILSVRC2012_val_00026597.JPEG n01978287/
+mv val/ILSVRC2012_val_00026598.JPEG n01755581/
+mv val/ILSVRC2012_val_00026599.JPEG n04008634/
+mv val/ILSVRC2012_val_00026600.JPEG n01773797/
+mv val/ILSVRC2012_val_00026601.JPEG n04209239/
+mv val/ILSVRC2012_val_00026602.JPEG n04584207/
+mv val/ILSVRC2012_val_00026603.JPEG n02493793/
+mv val/ILSVRC2012_val_00026604.JPEG n01616318/
+mv val/ILSVRC2012_val_00026605.JPEG n04127249/
+mv val/ILSVRC2012_val_00026606.JPEG n01877812/
+mv val/ILSVRC2012_val_00026607.JPEG n02814860/
+mv val/ILSVRC2012_val_00026608.JPEG n03535780/
+mv val/ILSVRC2012_val_00026609.JPEG n04040759/
+mv val/ILSVRC2012_val_00026610.JPEG n02879718/
+mv val/ILSVRC2012_val_00026611.JPEG n02514041/
+mv val/ILSVRC2012_val_00026612.JPEG n04592741/
+mv val/ILSVRC2012_val_00026613.JPEG n03854065/
+mv val/ILSVRC2012_val_00026614.JPEG n01614925/
+mv val/ILSVRC2012_val_00026615.JPEG n04026417/
+mv val/ILSVRC2012_val_00026616.JPEG n03837869/
+mv val/ILSVRC2012_val_00026617.JPEG n02865351/
+mv val/ILSVRC2012_val_00026618.JPEG n04239074/
+mv val/ILSVRC2012_val_00026619.JPEG n06794110/
+mv val/ILSVRC2012_val_00026620.JPEG n02190166/
+mv val/ILSVRC2012_val_00026621.JPEG n04208210/
+mv val/ILSVRC2012_val_00026622.JPEG n02088238/
+mv val/ILSVRC2012_val_00026623.JPEG n02497673/
+mv val/ILSVRC2012_val_00026624.JPEG n03179701/
+mv val/ILSVRC2012_val_00026625.JPEG n04613696/
+mv val/ILSVRC2012_val_00026626.JPEG n01693334/
+mv val/ILSVRC2012_val_00026627.JPEG n02672831/
+mv val/ILSVRC2012_val_00026628.JPEG n02817516/
+mv val/ILSVRC2012_val_00026629.JPEG n02106662/
+mv val/ILSVRC2012_val_00026630.JPEG n04392985/
+mv val/ILSVRC2012_val_00026631.JPEG n03777754/
+mv val/ILSVRC2012_val_00026632.JPEG n03649909/
+mv val/ILSVRC2012_val_00026633.JPEG n04311004/
+mv val/ILSVRC2012_val_00026634.JPEG n01664065/
+mv val/ILSVRC2012_val_00026635.JPEG n04389033/
+mv val/ILSVRC2012_val_00026636.JPEG n02807133/
+mv val/ILSVRC2012_val_00026637.JPEG n03476991/
+mv val/ILSVRC2012_val_00026638.JPEG n03141823/
+mv val/ILSVRC2012_val_00026639.JPEG n03793489/
+mv val/ILSVRC2012_val_00026640.JPEG n02988304/
+mv val/ILSVRC2012_val_00026641.JPEG n03325584/
+mv val/ILSVRC2012_val_00026642.JPEG n01871265/
+mv val/ILSVRC2012_val_00026643.JPEG n09288635/
+mv val/ILSVRC2012_val_00026644.JPEG n04326547/
+mv val/ILSVRC2012_val_00026645.JPEG n02110063/
+mv val/ILSVRC2012_val_00026646.JPEG n03220513/
+mv val/ILSVRC2012_val_00026647.JPEG n02093859/
+mv val/ILSVRC2012_val_00026648.JPEG n01693334/
+mv val/ILSVRC2012_val_00026649.JPEG n02815834/
+mv val/ILSVRC2012_val_00026650.JPEG n02107574/
+mv val/ILSVRC2012_val_00026651.JPEG n04487081/
+mv val/ILSVRC2012_val_00026652.JPEG n04347754/
+mv val/ILSVRC2012_val_00026653.JPEG n07695742/
+mv val/ILSVRC2012_val_00026654.JPEG n04086273/
+mv val/ILSVRC2012_val_00026655.JPEG n04493381/
+mv val/ILSVRC2012_val_00026656.JPEG n01580077/
+mv val/ILSVRC2012_val_00026657.JPEG n02910353/
+mv val/ILSVRC2012_val_00026658.JPEG n07754684/
+mv val/ILSVRC2012_val_00026659.JPEG n04067472/
+mv val/ILSVRC2012_val_00026660.JPEG n12768682/
+mv val/ILSVRC2012_val_00026661.JPEG n01675722/
+mv val/ILSVRC2012_val_00026662.JPEG n02437312/
+mv val/ILSVRC2012_val_00026663.JPEG n04417672/
+mv val/ILSVRC2012_val_00026664.JPEG n03868863/
+mv val/ILSVRC2012_val_00026665.JPEG n13054560/
+mv val/ILSVRC2012_val_00026666.JPEG n02100735/
+mv val/ILSVRC2012_val_00026667.JPEG n03888605/
+mv val/ILSVRC2012_val_00026668.JPEG n04009552/
+mv val/ILSVRC2012_val_00026669.JPEG n04238763/
+mv val/ILSVRC2012_val_00026670.JPEG n03876231/
+mv val/ILSVRC2012_val_00026671.JPEG n03706229/
+mv val/ILSVRC2012_val_00026672.JPEG n02859443/
+mv val/ILSVRC2012_val_00026673.JPEG n01530575/
+mv val/ILSVRC2012_val_00026674.JPEG n01824575/
+mv val/ILSVRC2012_val_00026675.JPEG n02096437/
+mv val/ILSVRC2012_val_00026676.JPEG n04486054/
+mv val/ILSVRC2012_val_00026677.JPEG n02704792/
+mv val/ILSVRC2012_val_00026678.JPEG n02110185/
+mv val/ILSVRC2012_val_00026679.JPEG n01824575/
+mv val/ILSVRC2012_val_00026680.JPEG n12620546/
+mv val/ILSVRC2012_val_00026681.JPEG n03814906/
+mv val/ILSVRC2012_val_00026682.JPEG n04154565/
+mv val/ILSVRC2012_val_00026683.JPEG n02058221/
+mv val/ILSVRC2012_val_00026684.JPEG n02111129/
+mv val/ILSVRC2012_val_00026685.JPEG n03690938/
+mv val/ILSVRC2012_val_00026686.JPEG n03857828/
+mv val/ILSVRC2012_val_00026687.JPEG n01534433/
+mv val/ILSVRC2012_val_00026688.JPEG n09229709/
+mv val/ILSVRC2012_val_00026689.JPEG n02086910/
+mv val/ILSVRC2012_val_00026690.JPEG n04507155/
+mv val/ILSVRC2012_val_00026691.JPEG n02098105/
+mv val/ILSVRC2012_val_00026692.JPEG n02089078/
+mv val/ILSVRC2012_val_00026693.JPEG n04355933/
+mv val/ILSVRC2012_val_00026694.JPEG n02930766/
+mv val/ILSVRC2012_val_00026695.JPEG n03384352/
+mv val/ILSVRC2012_val_00026696.JPEG n02892201/
+mv val/ILSVRC2012_val_00026697.JPEG n03992509/
+mv val/ILSVRC2012_val_00026698.JPEG n02109961/
+mv val/ILSVRC2012_val_00026699.JPEG n04479046/
+mv val/ILSVRC2012_val_00026700.JPEG n03000247/
+mv val/ILSVRC2012_val_00026701.JPEG n03047690/
+mv val/ILSVRC2012_val_00026702.JPEG n04258138/
+mv val/ILSVRC2012_val_00026703.JPEG n04005630/
+mv val/ILSVRC2012_val_00026704.JPEG n02281787/
+mv val/ILSVRC2012_val_00026705.JPEG n01693334/
+mv val/ILSVRC2012_val_00026706.JPEG n03379051/
+mv val/ILSVRC2012_val_00026707.JPEG n01614925/
+mv val/ILSVRC2012_val_00026708.JPEG n04479046/
+mv val/ILSVRC2012_val_00026709.JPEG n04591713/
+mv val/ILSVRC2012_val_00026710.JPEG n03920288/
+mv val/ILSVRC2012_val_00026711.JPEG n02051845/
+mv val/ILSVRC2012_val_00026712.JPEG n01756291/
+mv val/ILSVRC2012_val_00026713.JPEG n02107312/
+mv val/ILSVRC2012_val_00026714.JPEG n04435653/
+mv val/ILSVRC2012_val_00026715.JPEG n03325584/
+mv val/ILSVRC2012_val_00026716.JPEG n02058221/
+mv val/ILSVRC2012_val_00026717.JPEG n02107683/
+mv val/ILSVRC2012_val_00026718.JPEG n02111277/
+mv val/ILSVRC2012_val_00026719.JPEG n03786901/
+mv val/ILSVRC2012_val_00026720.JPEG n07768694/
+mv val/ILSVRC2012_val_00026721.JPEG n03891332/
+mv val/ILSVRC2012_val_00026722.JPEG n04204347/
+mv val/ILSVRC2012_val_00026723.JPEG n03400231/
+mv val/ILSVRC2012_val_00026724.JPEG n03961711/
+mv val/ILSVRC2012_val_00026725.JPEG n02490219/
+mv val/ILSVRC2012_val_00026726.JPEG n03347037/
+mv val/ILSVRC2012_val_00026727.JPEG n04597913/
+mv val/ILSVRC2012_val_00026728.JPEG n02090721/
+mv val/ILSVRC2012_val_00026729.JPEG n03450230/
+mv val/ILSVRC2012_val_00026730.JPEG n02112137/
+mv val/ILSVRC2012_val_00026731.JPEG n03250847/
+mv val/ILSVRC2012_val_00026732.JPEG n03868242/
+mv val/ILSVRC2012_val_00026733.JPEG n02058221/
+mv val/ILSVRC2012_val_00026734.JPEG n04141327/
+mv val/ILSVRC2012_val_00026735.JPEG n03761084/
+mv val/ILSVRC2012_val_00026736.JPEG n02090379/
+mv val/ILSVRC2012_val_00026737.JPEG n02486261/
+mv val/ILSVRC2012_val_00026738.JPEG n02095570/
+mv val/ILSVRC2012_val_00026739.JPEG n01749939/
+mv val/ILSVRC2012_val_00026740.JPEG n02804610/
+mv val/ILSVRC2012_val_00026741.JPEG n04273569/
+mv val/ILSVRC2012_val_00026742.JPEG n02777292/
+mv val/ILSVRC2012_val_00026743.JPEG n03930630/
+mv val/ILSVRC2012_val_00026744.JPEG n03775546/
+mv val/ILSVRC2012_val_00026745.JPEG n07716906/
+mv val/ILSVRC2012_val_00026746.JPEG n02916936/
+mv val/ILSVRC2012_val_00026747.JPEG n02930766/
+mv val/ILSVRC2012_val_00026748.JPEG n03709823/
+mv val/ILSVRC2012_val_00026749.JPEG n02056570/
+mv val/ILSVRC2012_val_00026750.JPEG n02412080/
+mv val/ILSVRC2012_val_00026751.JPEG n02666196/
+mv val/ILSVRC2012_val_00026752.JPEG n03196217/
+mv val/ILSVRC2012_val_00026753.JPEG n04479046/
+mv val/ILSVRC2012_val_00026754.JPEG n04509417/
+mv val/ILSVRC2012_val_00026755.JPEG n01532829/
+mv val/ILSVRC2012_val_00026756.JPEG n07697313/
+mv val/ILSVRC2012_val_00026757.JPEG n02493793/
+mv val/ILSVRC2012_val_00026758.JPEG n02058221/
+mv val/ILSVRC2012_val_00026759.JPEG n04252077/
+mv val/ILSVRC2012_val_00026760.JPEG n02002556/
+mv val/ILSVRC2012_val_00026761.JPEG n02085936/
+mv val/ILSVRC2012_val_00026762.JPEG n03063599/
+mv val/ILSVRC2012_val_00026763.JPEG n04273569/
+mv val/ILSVRC2012_val_00026764.JPEG n04550184/
+mv val/ILSVRC2012_val_00026765.JPEG n03710193/
+mv val/ILSVRC2012_val_00026766.JPEG n01742172/
+mv val/ILSVRC2012_val_00026767.JPEG n02443484/
+mv val/ILSVRC2012_val_00026768.JPEG n03720891/
+mv val/ILSVRC2012_val_00026769.JPEG n03706229/
+mv val/ILSVRC2012_val_00026770.JPEG n02643566/
+mv val/ILSVRC2012_val_00026771.JPEG n03218198/
+mv val/ILSVRC2012_val_00026772.JPEG n03877845/
+mv val/ILSVRC2012_val_00026773.JPEG n01630670/
+mv val/ILSVRC2012_val_00026774.JPEG n07714990/
+mv val/ILSVRC2012_val_00026775.JPEG n02264363/
+mv val/ILSVRC2012_val_00026776.JPEG n01532829/
+mv val/ILSVRC2012_val_00026777.JPEG n04540053/
+mv val/ILSVRC2012_val_00026778.JPEG n02113712/
+mv val/ILSVRC2012_val_00026779.JPEG n04259630/
+mv val/ILSVRC2012_val_00026780.JPEG n03661043/
+mv val/ILSVRC2012_val_00026781.JPEG n03220513/
+mv val/ILSVRC2012_val_00026782.JPEG n03445924/
+mv val/ILSVRC2012_val_00026783.JPEG n07831146/
+mv val/ILSVRC2012_val_00026784.JPEG n01530575/
+mv val/ILSVRC2012_val_00026785.JPEG n03691459/
+mv val/ILSVRC2012_val_00026786.JPEG n01773157/
+mv val/ILSVRC2012_val_00026787.JPEG n06785654/
+mv val/ILSVRC2012_val_00026788.JPEG n03290653/
+mv val/ILSVRC2012_val_00026789.JPEG n03995372/
+mv val/ILSVRC2012_val_00026790.JPEG n03866082/
+mv val/ILSVRC2012_val_00026791.JPEG n02276258/
+mv val/ILSVRC2012_val_00026792.JPEG n03777568/
+mv val/ILSVRC2012_val_00026793.JPEG n01675722/
+mv val/ILSVRC2012_val_00026794.JPEG n12985857/
+mv val/ILSVRC2012_val_00026795.JPEG n02835271/
+mv val/ILSVRC2012_val_00026796.JPEG n03444034/
+mv val/ILSVRC2012_val_00026797.JPEG n02101006/
+mv val/ILSVRC2012_val_00026798.JPEG n03637318/
+mv val/ILSVRC2012_val_00026799.JPEG n03787032/
+mv val/ILSVRC2012_val_00026800.JPEG n04258138/
+mv val/ILSVRC2012_val_00026801.JPEG n03535780/
+mv val/ILSVRC2012_val_00026802.JPEG n04065272/
+mv val/ILSVRC2012_val_00026803.JPEG n02099267/
+mv val/ILSVRC2012_val_00026804.JPEG n03347037/
+mv val/ILSVRC2012_val_00026805.JPEG n01755581/
+mv val/ILSVRC2012_val_00026806.JPEG n03908714/
+mv val/ILSVRC2012_val_00026807.JPEG n02056570/
+mv val/ILSVRC2012_val_00026808.JPEG n02093647/
+mv val/ILSVRC2012_val_00026809.JPEG n01729977/
+mv val/ILSVRC2012_val_00026810.JPEG n04344873/
+mv val/ILSVRC2012_val_00026811.JPEG n01847000/
+mv val/ILSVRC2012_val_00026812.JPEG n02112350/
+mv val/ILSVRC2012_val_00026813.JPEG n01632458/
+mv val/ILSVRC2012_val_00026814.JPEG n04562935/
+mv val/ILSVRC2012_val_00026815.JPEG n03325584/
+mv val/ILSVRC2012_val_00026816.JPEG n04127249/
+mv val/ILSVRC2012_val_00026817.JPEG n04141076/
+mv val/ILSVRC2012_val_00026818.JPEG n04554684/
+mv val/ILSVRC2012_val_00026819.JPEG n07714571/
+mv val/ILSVRC2012_val_00026820.JPEG n02027492/
+mv val/ILSVRC2012_val_00026821.JPEG n03532672/
+mv val/ILSVRC2012_val_00026822.JPEG n02992529/
+mv val/ILSVRC2012_val_00026823.JPEG n02321529/
+mv val/ILSVRC2012_val_00026824.JPEG n03538406/
+mv val/ILSVRC2012_val_00026825.JPEG n03721384/
+mv val/ILSVRC2012_val_00026826.JPEG n02013706/
+mv val/ILSVRC2012_val_00026827.JPEG n04599235/
+mv val/ILSVRC2012_val_00026828.JPEG n02093991/
+mv val/ILSVRC2012_val_00026829.JPEG n02777292/
+mv val/ILSVRC2012_val_00026830.JPEG n02123394/
+mv val/ILSVRC2012_val_00026831.JPEG n07747607/
+mv val/ILSVRC2012_val_00026832.JPEG n03424325/
+mv val/ILSVRC2012_val_00026833.JPEG n03976657/
+mv val/ILSVRC2012_val_00026834.JPEG n04209239/
+mv val/ILSVRC2012_val_00026835.JPEG n02951585/
+mv val/ILSVRC2012_val_00026836.JPEG n07753592/
+mv val/ILSVRC2012_val_00026837.JPEG n04443257/
+mv val/ILSVRC2012_val_00026838.JPEG n03388183/
+mv val/ILSVRC2012_val_00026839.JPEG n10148035/
+mv val/ILSVRC2012_val_00026840.JPEG n03344393/
+mv val/ILSVRC2012_val_00026841.JPEG n04336792/
+mv val/ILSVRC2012_val_00026842.JPEG n02120505/
+mv val/ILSVRC2012_val_00026843.JPEG n01981276/
+mv val/ILSVRC2012_val_00026844.JPEG n03933933/
+mv val/ILSVRC2012_val_00026845.JPEG n01829413/
+mv val/ILSVRC2012_val_00026846.JPEG n03916031/
+mv val/ILSVRC2012_val_00026847.JPEG n02776631/
+mv val/ILSVRC2012_val_00026848.JPEG n01775062/
+mv val/ILSVRC2012_val_00026849.JPEG n04286575/
+mv val/ILSVRC2012_val_00026850.JPEG n04209239/
+mv val/ILSVRC2012_val_00026851.JPEG n07730033/
+mv val/ILSVRC2012_val_00026852.JPEG n02099712/
+mv val/ILSVRC2012_val_00026853.JPEG n07613480/
+mv val/ILSVRC2012_val_00026854.JPEG n02100583/
+mv val/ILSVRC2012_val_00026855.JPEG n03733805/
+mv val/ILSVRC2012_val_00026856.JPEG n03873416/
+mv val/ILSVRC2012_val_00026857.JPEG n04476259/
+mv val/ILSVRC2012_val_00026858.JPEG n02113799/
+mv val/ILSVRC2012_val_00026859.JPEG n02690373/
+mv val/ILSVRC2012_val_00026860.JPEG n09468604/
+mv val/ILSVRC2012_val_00026861.JPEG n02009912/
+mv val/ILSVRC2012_val_00026862.JPEG n01980166/
+mv val/ILSVRC2012_val_00026863.JPEG n02096294/
+mv val/ILSVRC2012_val_00026864.JPEG n03764736/
+mv val/ILSVRC2012_val_00026865.JPEG n03417042/
+mv val/ILSVRC2012_val_00026866.JPEG n03000134/
+mv val/ILSVRC2012_val_00026867.JPEG n10565667/
+mv val/ILSVRC2012_val_00026868.JPEG n04120489/
+mv val/ILSVRC2012_val_00026869.JPEG n02114855/
+mv val/ILSVRC2012_val_00026870.JPEG n04039381/
+mv val/ILSVRC2012_val_00026871.JPEG n04376876/
+mv val/ILSVRC2012_val_00026872.JPEG n02843684/
+mv val/ILSVRC2012_val_00026873.JPEG n02643566/
+mv val/ILSVRC2012_val_00026874.JPEG n03924679/
+mv val/ILSVRC2012_val_00026875.JPEG n03958227/
+mv val/ILSVRC2012_val_00026876.JPEG n03773504/
+mv val/ILSVRC2012_val_00026877.JPEG n02276258/
+mv val/ILSVRC2012_val_00026878.JPEG n03776460/
+mv val/ILSVRC2012_val_00026879.JPEG n03000684/
+mv val/ILSVRC2012_val_00026880.JPEG n02129165/
+mv val/ILSVRC2012_val_00026881.JPEG n03445924/
+mv val/ILSVRC2012_val_00026882.JPEG n02108089/
+mv val/ILSVRC2012_val_00026883.JPEG n04310018/
+mv val/ILSVRC2012_val_00026884.JPEG n03873416/
+mv val/ILSVRC2012_val_00026885.JPEG n02236044/
+mv val/ILSVRC2012_val_00026886.JPEG n03483316/
+mv val/ILSVRC2012_val_00026887.JPEG n02099601/
+mv val/ILSVRC2012_val_00026888.JPEG n02115913/
+mv val/ILSVRC2012_val_00026889.JPEG n02441942/
+mv val/ILSVRC2012_val_00026890.JPEG n03967562/
+mv val/ILSVRC2012_val_00026891.JPEG n04479046/
+mv val/ILSVRC2012_val_00026892.JPEG n04344873/
+mv val/ILSVRC2012_val_00026893.JPEG n02123597/
+mv val/ILSVRC2012_val_00026894.JPEG n02229544/
+mv val/ILSVRC2012_val_00026895.JPEG n03179701/
+mv val/ILSVRC2012_val_00026896.JPEG n02791124/
+mv val/ILSVRC2012_val_00026897.JPEG n04525305/
+mv val/ILSVRC2012_val_00026898.JPEG n03976657/
+mv val/ILSVRC2012_val_00026899.JPEG n04147183/
+mv val/ILSVRC2012_val_00026900.JPEG n02835271/
+mv val/ILSVRC2012_val_00026901.JPEG n01685808/
+mv val/ILSVRC2012_val_00026902.JPEG n02280649/
+mv val/ILSVRC2012_val_00026903.JPEG n01768244/
+mv val/ILSVRC2012_val_00026904.JPEG n02489166/
+mv val/ILSVRC2012_val_00026905.JPEG n04355338/
+mv val/ILSVRC2012_val_00026906.JPEG n02279972/
+mv val/ILSVRC2012_val_00026907.JPEG n03770679/
+mv val/ILSVRC2012_val_00026908.JPEG n01498041/
+mv val/ILSVRC2012_val_00026909.JPEG n04041544/
+mv val/ILSVRC2012_val_00026910.JPEG n02085620/
+mv val/ILSVRC2012_val_00026911.JPEG n02086240/
+mv val/ILSVRC2012_val_00026912.JPEG n03532672/
+mv val/ILSVRC2012_val_00026913.JPEG n02268853/
+mv val/ILSVRC2012_val_00026914.JPEG n02978881/
+mv val/ILSVRC2012_val_00026915.JPEG n02363005/
+mv val/ILSVRC2012_val_00026916.JPEG n04442312/
+mv val/ILSVRC2012_val_00026917.JPEG n02280649/
+mv val/ILSVRC2012_val_00026918.JPEG n02108915/
+mv val/ILSVRC2012_val_00026919.JPEG n04380533/
+mv val/ILSVRC2012_val_00026920.JPEG n04462240/
+mv val/ILSVRC2012_val_00026921.JPEG n03271574/
+mv val/ILSVRC2012_val_00026922.JPEG n03930630/
+mv val/ILSVRC2012_val_00026923.JPEG n02892767/
+mv val/ILSVRC2012_val_00026924.JPEG n01797886/
+mv val/ILSVRC2012_val_00026925.JPEG n01978287/
+mv val/ILSVRC2012_val_00026926.JPEG n02437616/
+mv val/ILSVRC2012_val_00026927.JPEG n03920288/
+mv val/ILSVRC2012_val_00026928.JPEG n03160309/
+mv val/ILSVRC2012_val_00026929.JPEG n01560419/
+mv val/ILSVRC2012_val_00026930.JPEG n02666196/
+mv val/ILSVRC2012_val_00026931.JPEG n03424325/
+mv val/ILSVRC2012_val_00026932.JPEG n02514041/
+mv val/ILSVRC2012_val_00026933.JPEG n02790996/
+mv val/ILSVRC2012_val_00026934.JPEG n02397096/
+mv val/ILSVRC2012_val_00026935.JPEG n01775062/
+mv val/ILSVRC2012_val_00026936.JPEG n02071294/
+mv val/ILSVRC2012_val_00026937.JPEG n02100583/
+mv val/ILSVRC2012_val_00026938.JPEG n04380533/
+mv val/ILSVRC2012_val_00026939.JPEG n01990800/
+mv val/ILSVRC2012_val_00026940.JPEG n03903868/
+mv val/ILSVRC2012_val_00026941.JPEG n07583066/
+mv val/ILSVRC2012_val_00026942.JPEG n02013706/
+mv val/ILSVRC2012_val_00026943.JPEG n02130308/
+mv val/ILSVRC2012_val_00026944.JPEG n02113023/
+mv val/ILSVRC2012_val_00026945.JPEG n03884397/
+mv val/ILSVRC2012_val_00026946.JPEG n03000684/
+mv val/ILSVRC2012_val_00026947.JPEG n04037443/
+mv val/ILSVRC2012_val_00026948.JPEG n01687978/
+mv val/ILSVRC2012_val_00026949.JPEG n02058221/
+mv val/ILSVRC2012_val_00026950.JPEG n02704792/
+mv val/ILSVRC2012_val_00026951.JPEG n07693725/
+mv val/ILSVRC2012_val_00026952.JPEG n04039381/
+mv val/ILSVRC2012_val_00026953.JPEG n03461385/
+mv val/ILSVRC2012_val_00026954.JPEG n01950731/
+mv val/ILSVRC2012_val_00026955.JPEG n03773504/
+mv val/ILSVRC2012_val_00026956.JPEG n02104365/
+mv val/ILSVRC2012_val_00026957.JPEG n04536866/
+mv val/ILSVRC2012_val_00026958.JPEG n02328150/
+mv val/ILSVRC2012_val_00026959.JPEG n07871810/
+mv val/ILSVRC2012_val_00026960.JPEG n03372029/
+mv val/ILSVRC2012_val_00026961.JPEG n04462240/
+mv val/ILSVRC2012_val_00026962.JPEG n02133161/
+mv val/ILSVRC2012_val_00026963.JPEG n02808304/
+mv val/ILSVRC2012_val_00026964.JPEG n03443371/
+mv val/ILSVRC2012_val_00026965.JPEG n01843065/
+mv val/ILSVRC2012_val_00026966.JPEG n01914609/
+mv val/ILSVRC2012_val_00026967.JPEG n01855032/
+mv val/ILSVRC2012_val_00026968.JPEG n04380533/
+mv val/ILSVRC2012_val_00026969.JPEG n02086646/
+mv val/ILSVRC2012_val_00026970.JPEG n02363005/
+mv val/ILSVRC2012_val_00026971.JPEG n04296562/
+mv val/ILSVRC2012_val_00026972.JPEG n04033995/
+mv val/ILSVRC2012_val_00026973.JPEG n02871525/
+mv val/ILSVRC2012_val_00026974.JPEG n03742115/
+mv val/ILSVRC2012_val_00026975.JPEG n02704792/
+mv val/ILSVRC2012_val_00026976.JPEG n02108915/
+mv val/ILSVRC2012_val_00026977.JPEG n03670208/
+mv val/ILSVRC2012_val_00026978.JPEG n02093428/
+mv val/ILSVRC2012_val_00026979.JPEG n04428191/
+mv val/ILSVRC2012_val_00026980.JPEG n09421951/
+mv val/ILSVRC2012_val_00026981.JPEG n01984695/
+mv val/ILSVRC2012_val_00026982.JPEG n02128757/
+mv val/ILSVRC2012_val_00026983.JPEG n01917289/
+mv val/ILSVRC2012_val_00026984.JPEG n04033901/
+mv val/ILSVRC2012_val_00026985.JPEG n02092002/
+mv val/ILSVRC2012_val_00026986.JPEG n03840681/
+mv val/ILSVRC2012_val_00026987.JPEG n03476684/
+mv val/ILSVRC2012_val_00026988.JPEG n04286575/
+mv val/ILSVRC2012_val_00026989.JPEG n04423845/
+mv val/ILSVRC2012_val_00026990.JPEG n02951358/
+mv val/ILSVRC2012_val_00026991.JPEG n03877845/
+mv val/ILSVRC2012_val_00026992.JPEG n01728572/
+mv val/ILSVRC2012_val_00026993.JPEG n03481172/
+mv val/ILSVRC2012_val_00026994.JPEG n03208938/
+mv val/ILSVRC2012_val_00026995.JPEG n02487347/
+mv val/ILSVRC2012_val_00026996.JPEG n02107908/
+mv val/ILSVRC2012_val_00026997.JPEG n07565083/
+mv val/ILSVRC2012_val_00026998.JPEG n04479046/
+mv val/ILSVRC2012_val_00026999.JPEG n03832673/
+mv val/ILSVRC2012_val_00027000.JPEG n02948072/
+mv val/ILSVRC2012_val_00027001.JPEG n02950826/
+mv val/ILSVRC2012_val_00027002.JPEG n03929660/
+mv val/ILSVRC2012_val_00027003.JPEG n04370456/
+mv val/ILSVRC2012_val_00027004.JPEG n02978881/
+mv val/ILSVRC2012_val_00027005.JPEG n01498041/
+mv val/ILSVRC2012_val_00027006.JPEG n02783161/
+mv val/ILSVRC2012_val_00027007.JPEG n03697007/
+mv val/ILSVRC2012_val_00027008.JPEG n01820546/
+mv val/ILSVRC2012_val_00027009.JPEG n03026506/
+mv val/ILSVRC2012_val_00027010.JPEG n04584207/
+mv val/ILSVRC2012_val_00027011.JPEG n02091467/
+mv val/ILSVRC2012_val_00027012.JPEG n02422699/
+mv val/ILSVRC2012_val_00027013.JPEG n02123045/
+mv val/ILSVRC2012_val_00027014.JPEG n03793489/
+mv val/ILSVRC2012_val_00027015.JPEG n03958227/
+mv val/ILSVRC2012_val_00027016.JPEG n02443484/
+mv val/ILSVRC2012_val_00027017.JPEG n02098286/
+mv val/ILSVRC2012_val_00027018.JPEG n02788148/
+mv val/ILSVRC2012_val_00027019.JPEG n04392985/
+mv val/ILSVRC2012_val_00027020.JPEG n12768682/
+mv val/ILSVRC2012_val_00027021.JPEG n03843555/
+mv val/ILSVRC2012_val_00027022.JPEG n02894605/
+mv val/ILSVRC2012_val_00027023.JPEG n04372370/
+mv val/ILSVRC2012_val_00027024.JPEG n02077923/
+mv val/ILSVRC2012_val_00027025.JPEG n02111889/
+mv val/ILSVRC2012_val_00027026.JPEG n01770393/
+mv val/ILSVRC2012_val_00027027.JPEG n02840245/
+mv val/ILSVRC2012_val_00027028.JPEG n01631663/
+mv val/ILSVRC2012_val_00027029.JPEG n02786058/
+mv val/ILSVRC2012_val_00027030.JPEG n04462240/
+mv val/ILSVRC2012_val_00027031.JPEG n02264363/
+mv val/ILSVRC2012_val_00027032.JPEG n03942813/
+mv val/ILSVRC2012_val_00027033.JPEG n02457408/
+mv val/ILSVRC2012_val_00027034.JPEG n03476991/
+mv val/ILSVRC2012_val_00027035.JPEG n02107312/
+mv val/ILSVRC2012_val_00027036.JPEG n02917067/
+mv val/ILSVRC2012_val_00027037.JPEG n04612504/
+mv val/ILSVRC2012_val_00027038.JPEG n02100583/
+mv val/ILSVRC2012_val_00027039.JPEG n04239074/
+mv val/ILSVRC2012_val_00027040.JPEG n04476259/
+mv val/ILSVRC2012_val_00027041.JPEG n02105855/
+mv val/ILSVRC2012_val_00027042.JPEG n03929855/
+mv val/ILSVRC2012_val_00027043.JPEG n02389026/
+mv val/ILSVRC2012_val_00027044.JPEG n04389033/
+mv val/ILSVRC2012_val_00027045.JPEG n03876231/
+mv val/ILSVRC2012_val_00027046.JPEG n04041544/
+mv val/ILSVRC2012_val_00027047.JPEG n01806143/
+mv val/ILSVRC2012_val_00027048.JPEG n07584110/
+mv val/ILSVRC2012_val_00027049.JPEG n02814533/
+mv val/ILSVRC2012_val_00027050.JPEG n03868863/
+mv val/ILSVRC2012_val_00027051.JPEG n02104365/
+mv val/ILSVRC2012_val_00027052.JPEG n02128925/
+mv val/ILSVRC2012_val_00027053.JPEG n02105251/
+mv val/ILSVRC2012_val_00027054.JPEG n04447861/
+mv val/ILSVRC2012_val_00027055.JPEG n04517823/
+mv val/ILSVRC2012_val_00027056.JPEG n02395406/
+mv val/ILSVRC2012_val_00027057.JPEG n04208210/
+mv val/ILSVRC2012_val_00027058.JPEG n02091831/
+mv val/ILSVRC2012_val_00027059.JPEG n04330267/
+mv val/ILSVRC2012_val_00027060.JPEG n02444819/
+mv val/ILSVRC2012_val_00027061.JPEG n02815834/
+mv val/ILSVRC2012_val_00027062.JPEG n02264363/
+mv val/ILSVRC2012_val_00027063.JPEG n01484850/
+mv val/ILSVRC2012_val_00027064.JPEG n02105641/
+mv val/ILSVRC2012_val_00027065.JPEG n02808440/
+mv val/ILSVRC2012_val_00027066.JPEG n02116738/
+mv val/ILSVRC2012_val_00027067.JPEG n01873310/
+mv val/ILSVRC2012_val_00027068.JPEG n03792972/
+mv val/ILSVRC2012_val_00027069.JPEG n02125311/
+mv val/ILSVRC2012_val_00027070.JPEG n01855032/
+mv val/ILSVRC2012_val_00027071.JPEG n02704792/
+mv val/ILSVRC2012_val_00027072.JPEG n07717556/
+mv val/ILSVRC2012_val_00027073.JPEG n03814906/
+mv val/ILSVRC2012_val_00027074.JPEG n01667114/
+mv val/ILSVRC2012_val_00027075.JPEG n03857828/
+mv val/ILSVRC2012_val_00027076.JPEG n01784675/
+mv val/ILSVRC2012_val_00027077.JPEG n02091032/
+mv val/ILSVRC2012_val_00027078.JPEG n04409515/
+mv val/ILSVRC2012_val_00027079.JPEG n01614925/
+mv val/ILSVRC2012_val_00027080.JPEG n03769881/
+mv val/ILSVRC2012_val_00027081.JPEG n02814533/
+mv val/ILSVRC2012_val_00027082.JPEG n02093754/
+mv val/ILSVRC2012_val_00027083.JPEG n07747607/
+mv val/ILSVRC2012_val_00027084.JPEG n03857828/
+mv val/ILSVRC2012_val_00027085.JPEG n04277352/
+mv val/ILSVRC2012_val_00027086.JPEG n02104029/
+mv val/ILSVRC2012_val_00027087.JPEG n04131690/
+mv val/ILSVRC2012_val_00027088.JPEG n02951358/
+mv val/ILSVRC2012_val_00027089.JPEG n02134084/
+mv val/ILSVRC2012_val_00027090.JPEG n07749582/
+mv val/ILSVRC2012_val_00027091.JPEG n03126707/
+mv val/ILSVRC2012_val_00027092.JPEG n04325704/
+mv val/ILSVRC2012_val_00027093.JPEG n02497673/
+mv val/ILSVRC2012_val_00027094.JPEG n02105412/
+mv val/ILSVRC2012_val_00027095.JPEG n01685808/
+mv val/ILSVRC2012_val_00027096.JPEG n07871810/
+mv val/ILSVRC2012_val_00027097.JPEG n02927161/
+mv val/ILSVRC2012_val_00027098.JPEG n04380533/
+mv val/ILSVRC2012_val_00027099.JPEG n04152593/
+mv val/ILSVRC2012_val_00027100.JPEG n02106382/
+mv val/ILSVRC2012_val_00027101.JPEG n04350905/
+mv val/ILSVRC2012_val_00027102.JPEG n01795545/
+mv val/ILSVRC2012_val_00027103.JPEG n03871628/
+mv val/ILSVRC2012_val_00027104.JPEG n02965783/
+mv val/ILSVRC2012_val_00027105.JPEG n07614500/
+mv val/ILSVRC2012_val_00027106.JPEG n03884397/
+mv val/ILSVRC2012_val_00027107.JPEG n03980874/
+mv val/ILSVRC2012_val_00027108.JPEG n02492035/
+mv val/ILSVRC2012_val_00027109.JPEG n02113712/
+mv val/ILSVRC2012_val_00027110.JPEG n03417042/
+mv val/ILSVRC2012_val_00027111.JPEG n04259630/
+mv val/ILSVRC2012_val_00027112.JPEG n03483316/
+mv val/ILSVRC2012_val_00027113.JPEG n01494475/
+mv val/ILSVRC2012_val_00027114.JPEG n02088238/
+mv val/ILSVRC2012_val_00027115.JPEG n07565083/
+mv val/ILSVRC2012_val_00027116.JPEG n07753113/
+mv val/ILSVRC2012_val_00027117.JPEG n04366367/
+mv val/ILSVRC2012_val_00027118.JPEG n04120489/
+mv val/ILSVRC2012_val_00027119.JPEG n04429376/
+mv val/ILSVRC2012_val_00027120.JPEG n02091467/
+mv val/ILSVRC2012_val_00027121.JPEG n02112350/
+mv val/ILSVRC2012_val_00027122.JPEG n02699494/
+mv val/ILSVRC2012_val_00027123.JPEG n03995372/
+mv val/ILSVRC2012_val_00027124.JPEG n02113186/
+mv val/ILSVRC2012_val_00027125.JPEG n01685808/
+mv val/ILSVRC2012_val_00027126.JPEG n03347037/
+mv val/ILSVRC2012_val_00027127.JPEG n02843684/
+mv val/ILSVRC2012_val_00027128.JPEG n02108089/
+mv val/ILSVRC2012_val_00027129.JPEG n03825788/
+mv val/ILSVRC2012_val_00027130.JPEG n03773504/
+mv val/ILSVRC2012_val_00027131.JPEG n02787622/
+mv val/ILSVRC2012_val_00027132.JPEG n04325704/
+mv val/ILSVRC2012_val_00027133.JPEG n03796401/
+mv val/ILSVRC2012_val_00027134.JPEG n01698640/
+mv val/ILSVRC2012_val_00027135.JPEG n03045698/
+mv val/ILSVRC2012_val_00027136.JPEG n02422699/
+mv val/ILSVRC2012_val_00027137.JPEG n04417672/
+mv val/ILSVRC2012_val_00027138.JPEG n04141327/
+mv val/ILSVRC2012_val_00027139.JPEG n04118538/
+mv val/ILSVRC2012_val_00027140.JPEG n02113624/
+mv val/ILSVRC2012_val_00027141.JPEG n04550184/
+mv val/ILSVRC2012_val_00027142.JPEG n01728572/
+mv val/ILSVRC2012_val_00027143.JPEG n04380533/
+mv val/ILSVRC2012_val_00027144.JPEG n04209133/
+mv val/ILSVRC2012_val_00027145.JPEG n01537544/
+mv val/ILSVRC2012_val_00027146.JPEG n07920052/
+mv val/ILSVRC2012_val_00027147.JPEG n04317175/
+mv val/ILSVRC2012_val_00027148.JPEG n01742172/
+mv val/ILSVRC2012_val_00027149.JPEG n02786058/
+mv val/ILSVRC2012_val_00027150.JPEG n03417042/
+mv val/ILSVRC2012_val_00027151.JPEG n03770679/
+mv val/ILSVRC2012_val_00027152.JPEG n02804414/
+mv val/ILSVRC2012_val_00027153.JPEG n02236044/
+mv val/ILSVRC2012_val_00027154.JPEG n03085013/
+mv val/ILSVRC2012_val_00027155.JPEG n04019541/
+mv val/ILSVRC2012_val_00027156.JPEG n03661043/
+mv val/ILSVRC2012_val_00027157.JPEG n03769881/
+mv val/ILSVRC2012_val_00027158.JPEG n01773797/
+mv val/ILSVRC2012_val_00027159.JPEG n02835271/
+mv val/ILSVRC2012_val_00027160.JPEG n01494475/
+mv val/ILSVRC2012_val_00027161.JPEG n01773797/
+mv val/ILSVRC2012_val_00027162.JPEG n02097298/
+mv val/ILSVRC2012_val_00027163.JPEG n01667114/
+mv val/ILSVRC2012_val_00027164.JPEG n02106030/
+mv val/ILSVRC2012_val_00027165.JPEG n02106030/
+mv val/ILSVRC2012_val_00027166.JPEG n03146219/
+mv val/ILSVRC2012_val_00027167.JPEG n01930112/
+mv val/ILSVRC2012_val_00027168.JPEG n02102177/
+mv val/ILSVRC2012_val_00027169.JPEG n13040303/
+mv val/ILSVRC2012_val_00027170.JPEG n04357314/
+mv val/ILSVRC2012_val_00027171.JPEG n04264628/
+mv val/ILSVRC2012_val_00027172.JPEG n07875152/
+mv val/ILSVRC2012_val_00027173.JPEG n04371774/
+mv val/ILSVRC2012_val_00027174.JPEG n02099849/
+mv val/ILSVRC2012_val_00027175.JPEG n03127925/
+mv val/ILSVRC2012_val_00027176.JPEG n02869837/
+mv val/ILSVRC2012_val_00027177.JPEG n03710193/
+mv val/ILSVRC2012_val_00027178.JPEG n02097130/
+mv val/ILSVRC2012_val_00027179.JPEG n07730033/
+mv val/ILSVRC2012_val_00027180.JPEG n04311004/
+mv val/ILSVRC2012_val_00027181.JPEG n03085013/
+mv val/ILSVRC2012_val_00027182.JPEG n02102040/
+mv val/ILSVRC2012_val_00027183.JPEG n04486054/
+mv val/ILSVRC2012_val_00027184.JPEG n02111889/
+mv val/ILSVRC2012_val_00027185.JPEG n04204238/
+mv val/ILSVRC2012_val_00027186.JPEG n03792972/
+mv val/ILSVRC2012_val_00027187.JPEG n03450230/
+mv val/ILSVRC2012_val_00027188.JPEG n03617480/
+mv val/ILSVRC2012_val_00027189.JPEG n02124075/
+mv val/ILSVRC2012_val_00027190.JPEG n03495258/
+mv val/ILSVRC2012_val_00027191.JPEG n03769881/
+mv val/ILSVRC2012_val_00027192.JPEG n02916936/
+mv val/ILSVRC2012_val_00027193.JPEG n01704323/
+mv val/ILSVRC2012_val_00027194.JPEG n03063599/
+mv val/ILSVRC2012_val_00027195.JPEG n01883070/
+mv val/ILSVRC2012_val_00027196.JPEG n01614925/
+mv val/ILSVRC2012_val_00027197.JPEG n04311004/
+mv val/ILSVRC2012_val_00027198.JPEG n01692333/
+mv val/ILSVRC2012_val_00027199.JPEG n03125729/
+mv val/ILSVRC2012_val_00027200.JPEG n04192698/
+mv val/ILSVRC2012_val_00027201.JPEG n03874293/
+mv val/ILSVRC2012_val_00027202.JPEG n03496892/
+mv val/ILSVRC2012_val_00027203.JPEG n04118776/
+mv val/ILSVRC2012_val_00027204.JPEG n02454379/
+mv val/ILSVRC2012_val_00027205.JPEG n04116512/
+mv val/ILSVRC2012_val_00027206.JPEG n01677366/
+mv val/ILSVRC2012_val_00027207.JPEG n01514668/
+mv val/ILSVRC2012_val_00027208.JPEG n03476991/
+mv val/ILSVRC2012_val_00027209.JPEG n03733805/
+mv val/ILSVRC2012_val_00027210.JPEG n03942813/
+mv val/ILSVRC2012_val_00027211.JPEG n03095699/
+mv val/ILSVRC2012_val_00027212.JPEG n02883205/
+mv val/ILSVRC2012_val_00027213.JPEG n02091467/
+mv val/ILSVRC2012_val_00027214.JPEG n02817516/
+mv val/ILSVRC2012_val_00027215.JPEG n06794110/
+mv val/ILSVRC2012_val_00027216.JPEG n03131574/
+mv val/ILSVRC2012_val_00027217.JPEG n02101388/
+mv val/ILSVRC2012_val_00027218.JPEG n01978455/
+mv val/ILSVRC2012_val_00027219.JPEG n02106382/
+mv val/ILSVRC2012_val_00027220.JPEG n02108915/
+mv val/ILSVRC2012_val_00027221.JPEG n03216828/
+mv val/ILSVRC2012_val_00027222.JPEG n07615774/
+mv val/ILSVRC2012_val_00027223.JPEG n07730033/
+mv val/ILSVRC2012_val_00027224.JPEG n01770393/
+mv val/ILSVRC2012_val_00027225.JPEG n04371430/
+mv val/ILSVRC2012_val_00027226.JPEG n02123159/
+mv val/ILSVRC2012_val_00027227.JPEG n01984695/
+mv val/ILSVRC2012_val_00027228.JPEG n01737021/
+mv val/ILSVRC2012_val_00027229.JPEG n02825657/
+mv val/ILSVRC2012_val_00027230.JPEG n02099267/
+mv val/ILSVRC2012_val_00027231.JPEG n03658185/
+mv val/ILSVRC2012_val_00027232.JPEG n02815834/
+mv val/ILSVRC2012_val_00027233.JPEG n02120079/
+mv val/ILSVRC2012_val_00027234.JPEG n03908714/
+mv val/ILSVRC2012_val_00027235.JPEG n04554684/
+mv val/ILSVRC2012_val_00027236.JPEG n04604644/
+mv val/ILSVRC2012_val_00027237.JPEG n03109150/
+mv val/ILSVRC2012_val_00027238.JPEG n03866082/
+mv val/ILSVRC2012_val_00027239.JPEG n03908714/
+mv val/ILSVRC2012_val_00027240.JPEG n03617480/
+mv val/ILSVRC2012_val_00027241.JPEG n02093647/
+mv val/ILSVRC2012_val_00027242.JPEG n02510455/
+mv val/ILSVRC2012_val_00027243.JPEG n04074963/
+mv val/ILSVRC2012_val_00027244.JPEG n03089624/
+mv val/ILSVRC2012_val_00027245.JPEG n02095314/
+mv val/ILSVRC2012_val_00027246.JPEG n03218198/
+mv val/ILSVRC2012_val_00027247.JPEG n02817516/
+mv val/ILSVRC2012_val_00027248.JPEG n01943899/
+mv val/ILSVRC2012_val_00027249.JPEG n03854065/
+mv val/ILSVRC2012_val_00027250.JPEG n03891251/
+mv val/ILSVRC2012_val_00027251.JPEG n04423845/
+mv val/ILSVRC2012_val_00027252.JPEG n04131690/
+mv val/ILSVRC2012_val_00027253.JPEG n04442312/
+mv val/ILSVRC2012_val_00027254.JPEG n01537544/
+mv val/ILSVRC2012_val_00027255.JPEG n03325584/
+mv val/ILSVRC2012_val_00027256.JPEG n02095889/
+mv val/ILSVRC2012_val_00027257.JPEG n03291819/
+mv val/ILSVRC2012_val_00027258.JPEG n03042490/
+mv val/ILSVRC2012_val_00027259.JPEG n02504013/
+mv val/ILSVRC2012_val_00027260.JPEG n03146219/
+mv val/ILSVRC2012_val_00027261.JPEG n04252077/
+mv val/ILSVRC2012_val_00027262.JPEG n02328150/
+mv val/ILSVRC2012_val_00027263.JPEG n01697457/
+mv val/ILSVRC2012_val_00027264.JPEG n02655020/
+mv val/ILSVRC2012_val_00027265.JPEG n04606251/
+mv val/ILSVRC2012_val_00027266.JPEG n07720875/
+mv val/ILSVRC2012_val_00027267.JPEG n02091831/
+mv val/ILSVRC2012_val_00027268.JPEG n02097209/
+mv val/ILSVRC2012_val_00027269.JPEG n01630670/
+mv val/ILSVRC2012_val_00027270.JPEG n01950731/
+mv val/ILSVRC2012_val_00027271.JPEG n01910747/
+mv val/ILSVRC2012_val_00027272.JPEG n07695742/
+mv val/ILSVRC2012_val_00027273.JPEG n03063689/
+mv val/ILSVRC2012_val_00027274.JPEG n01871265/
+mv val/ILSVRC2012_val_00027275.JPEG n03478589/
+mv val/ILSVRC2012_val_00027276.JPEG n07583066/
+mv val/ILSVRC2012_val_00027277.JPEG n02109525/
+mv val/ILSVRC2012_val_00027278.JPEG n03982430/
+mv val/ILSVRC2012_val_00027279.JPEG n04270147/
+mv val/ILSVRC2012_val_00027280.JPEG n01871265/
+mv val/ILSVRC2012_val_00027281.JPEG n02033041/
+mv val/ILSVRC2012_val_00027282.JPEG n03476991/
+mv val/ILSVRC2012_val_00027283.JPEG n01494475/
+mv val/ILSVRC2012_val_00027284.JPEG n09229709/
+mv val/ILSVRC2012_val_00027285.JPEG n03967562/
+mv val/ILSVRC2012_val_00027286.JPEG n03902125/
+mv val/ILSVRC2012_val_00027287.JPEG n02837789/
+mv val/ILSVRC2012_val_00027288.JPEG n04311004/
+mv val/ILSVRC2012_val_00027289.JPEG n04228054/
+mv val/ILSVRC2012_val_00027290.JPEG n02087394/
+mv val/ILSVRC2012_val_00027291.JPEG n04147183/
+mv val/ILSVRC2012_val_00027292.JPEG n02133161/
+mv val/ILSVRC2012_val_00027293.JPEG n03100240/
+mv val/ILSVRC2012_val_00027294.JPEG n04204238/
+mv val/ILSVRC2012_val_00027295.JPEG n02445715/
+mv val/ILSVRC2012_val_00027296.JPEG n03481172/
+mv val/ILSVRC2012_val_00027297.JPEG n04487394/
+mv val/ILSVRC2012_val_00027298.JPEG n03796401/
+mv val/ILSVRC2012_val_00027299.JPEG n02978881/
+mv val/ILSVRC2012_val_00027300.JPEG n01877812/
+mv val/ILSVRC2012_val_00027301.JPEG n01496331/
+mv val/ILSVRC2012_val_00027302.JPEG n07717410/
+mv val/ILSVRC2012_val_00027303.JPEG n02871525/
+mv val/ILSVRC2012_val_00027304.JPEG n02442845/
+mv val/ILSVRC2012_val_00027305.JPEG n02112706/
+mv val/ILSVRC2012_val_00027306.JPEG n02879718/
+mv val/ILSVRC2012_val_00027307.JPEG n03085013/
+mv val/ILSVRC2012_val_00027308.JPEG n02799071/
+mv val/ILSVRC2012_val_00027309.JPEG n03902125/
+mv val/ILSVRC2012_val_00027310.JPEG n02965783/
+mv val/ILSVRC2012_val_00027311.JPEG n02281406/
+mv val/ILSVRC2012_val_00027312.JPEG n04404412/
+mv val/ILSVRC2012_val_00027313.JPEG n02123159/
+mv val/ILSVRC2012_val_00027314.JPEG n02747177/
+mv val/ILSVRC2012_val_00027315.JPEG n04548280/
+mv val/ILSVRC2012_val_00027316.JPEG n04591713/
+mv val/ILSVRC2012_val_00027317.JPEG n04044716/
+mv val/ILSVRC2012_val_00027318.JPEG n03742115/
+mv val/ILSVRC2012_val_00027319.JPEG n02992211/
+mv val/ILSVRC2012_val_00027320.JPEG n07717410/
+mv val/ILSVRC2012_val_00027321.JPEG n10148035/
+mv val/ILSVRC2012_val_00027322.JPEG n02099429/
+mv val/ILSVRC2012_val_00027323.JPEG n02486261/
+mv val/ILSVRC2012_val_00027324.JPEG n04447861/
+mv val/ILSVRC2012_val_00027325.JPEG n03843555/
+mv val/ILSVRC2012_val_00027326.JPEG n04263257/
+mv val/ILSVRC2012_val_00027327.JPEG n04330267/
+mv val/ILSVRC2012_val_00027328.JPEG n02787622/
+mv val/ILSVRC2012_val_00027329.JPEG n02823750/
+mv val/ILSVRC2012_val_00027330.JPEG n01740131/
+mv val/ILSVRC2012_val_00027331.JPEG n04235860/
+mv val/ILSVRC2012_val_00027332.JPEG n03498962/
+mv val/ILSVRC2012_val_00027333.JPEG n02492660/
+mv val/ILSVRC2012_val_00027334.JPEG n02437312/
+mv val/ILSVRC2012_val_00027335.JPEG n07718747/
+mv val/ILSVRC2012_val_00027336.JPEG n03803284/
+mv val/ILSVRC2012_val_00027337.JPEG n02364673/
+mv val/ILSVRC2012_val_00027338.JPEG n02906734/
+mv val/ILSVRC2012_val_00027339.JPEG n07684084/
+mv val/ILSVRC2012_val_00027340.JPEG n03970156/
+mv val/ILSVRC2012_val_00027341.JPEG n03825788/
+mv val/ILSVRC2012_val_00027342.JPEG n03814906/
+mv val/ILSVRC2012_val_00027343.JPEG n07715103/
+mv val/ILSVRC2012_val_00027344.JPEG n02749479/
+mv val/ILSVRC2012_val_00027345.JPEG n02815834/
+mv val/ILSVRC2012_val_00027346.JPEG n02877765/
+mv val/ILSVRC2012_val_00027347.JPEG n02088364/
+mv val/ILSVRC2012_val_00027348.JPEG n02088632/
+mv val/ILSVRC2012_val_00027349.JPEG n04270147/
+mv val/ILSVRC2012_val_00027350.JPEG n07248320/
+mv val/ILSVRC2012_val_00027351.JPEG n01514668/
+mv val/ILSVRC2012_val_00027352.JPEG n01883070/
+mv val/ILSVRC2012_val_00027353.JPEG n02276258/
+mv val/ILSVRC2012_val_00027354.JPEG n04554684/
+mv val/ILSVRC2012_val_00027355.JPEG n02009229/
+mv val/ILSVRC2012_val_00027356.JPEG n07248320/
+mv val/ILSVRC2012_val_00027357.JPEG n01924916/
+mv val/ILSVRC2012_val_00027358.JPEG n03376595/
+mv val/ILSVRC2012_val_00027359.JPEG n03983396/
+mv val/ILSVRC2012_val_00027360.JPEG n02112018/
+mv val/ILSVRC2012_val_00027361.JPEG n01770393/
+mv val/ILSVRC2012_val_00027362.JPEG n02403003/
+mv val/ILSVRC2012_val_00027363.JPEG n02051845/
+mv val/ILSVRC2012_val_00027364.JPEG n02870880/
+mv val/ILSVRC2012_val_00027365.JPEG n02484975/
+mv val/ILSVRC2012_val_00027366.JPEG n02113799/
+mv val/ILSVRC2012_val_00027367.JPEG n03717622/
+mv val/ILSVRC2012_val_00027368.JPEG n07930864/
+mv val/ILSVRC2012_val_00027369.JPEG n07717410/
+mv val/ILSVRC2012_val_00027370.JPEG n02730930/
+mv val/ILSVRC2012_val_00027371.JPEG n03874599/
+mv val/ILSVRC2012_val_00027372.JPEG n02105162/
+mv val/ILSVRC2012_val_00027373.JPEG n02099712/
+mv val/ILSVRC2012_val_00027374.JPEG n01530575/
+mv val/ILSVRC2012_val_00027375.JPEG n03891332/
+mv val/ILSVRC2012_val_00027376.JPEG n01773157/
+mv val/ILSVRC2012_val_00027377.JPEG n02808440/
+mv val/ILSVRC2012_val_00027378.JPEG n02177972/
+mv val/ILSVRC2012_val_00027379.JPEG n03759954/
+mv val/ILSVRC2012_val_00027380.JPEG n07579787/
+mv val/ILSVRC2012_val_00027381.JPEG n02877765/
+mv val/ILSVRC2012_val_00027382.JPEG n03958227/
+mv val/ILSVRC2012_val_00027383.JPEG n03977966/
+mv val/ILSVRC2012_val_00027384.JPEG n03825788/
+mv val/ILSVRC2012_val_00027385.JPEG n03028079/
+mv val/ILSVRC2012_val_00027386.JPEG n04501370/
+mv val/ILSVRC2012_val_00027387.JPEG n02259212/
+mv val/ILSVRC2012_val_00027388.JPEG n03961711/
+mv val/ILSVRC2012_val_00027389.JPEG n03496892/
+mv val/ILSVRC2012_val_00027390.JPEG n03706229/
+mv val/ILSVRC2012_val_00027391.JPEG n04409515/
+mv val/ILSVRC2012_val_00027392.JPEG n12144580/
+mv val/ILSVRC2012_val_00027393.JPEG n03769881/
+mv val/ILSVRC2012_val_00027394.JPEG n09193705/
+mv val/ILSVRC2012_val_00027395.JPEG n02782093/
+mv val/ILSVRC2012_val_00027396.JPEG n01734418/
+mv val/ILSVRC2012_val_00027397.JPEG n04285008/
+mv val/ILSVRC2012_val_00027398.JPEG n02120505/
+mv val/ILSVRC2012_val_00027399.JPEG n02111277/
+mv val/ILSVRC2012_val_00027400.JPEG n02640242/
+mv val/ILSVRC2012_val_00027401.JPEG n02790996/
+mv val/ILSVRC2012_val_00027402.JPEG n02099267/
+mv val/ILSVRC2012_val_00027403.JPEG n07871810/
+mv val/ILSVRC2012_val_00027404.JPEG n01986214/
+mv val/ILSVRC2012_val_00027405.JPEG n01984695/
+mv val/ILSVRC2012_val_00027406.JPEG n12985857/
+mv val/ILSVRC2012_val_00027407.JPEG n04542943/
+mv val/ILSVRC2012_val_00027408.JPEG n03888605/
+mv val/ILSVRC2012_val_00027409.JPEG n04074963/
+mv val/ILSVRC2012_val_00027410.JPEG n10565667/
+mv val/ILSVRC2012_val_00027411.JPEG n04483307/
+mv val/ILSVRC2012_val_00027412.JPEG n09835506/
+mv val/ILSVRC2012_val_00027413.JPEG n02129165/
+mv val/ILSVRC2012_val_00027414.JPEG n03538406/
+mv val/ILSVRC2012_val_00027415.JPEG n01498041/
+mv val/ILSVRC2012_val_00027416.JPEG n04461696/
+mv val/ILSVRC2012_val_00027417.JPEG n03944341/
+mv val/ILSVRC2012_val_00027418.JPEG n03259280/
+mv val/ILSVRC2012_val_00027419.JPEG n01484850/
+mv val/ILSVRC2012_val_00027420.JPEG n04486054/
+mv val/ILSVRC2012_val_00027421.JPEG n03788195/
+mv val/ILSVRC2012_val_00027422.JPEG n09193705/
+mv val/ILSVRC2012_val_00027423.JPEG n03530642/
+mv val/ILSVRC2012_val_00027424.JPEG n04557648/
+mv val/ILSVRC2012_val_00027425.JPEG n02892201/
+mv val/ILSVRC2012_val_00027426.JPEG n04509417/
+mv val/ILSVRC2012_val_00027427.JPEG n03041632/
+mv val/ILSVRC2012_val_00027428.JPEG n02093256/
+mv val/ILSVRC2012_val_00027429.JPEG n02391049/
+mv val/ILSVRC2012_val_00027430.JPEG n04479046/
+mv val/ILSVRC2012_val_00027431.JPEG n03961711/
+mv val/ILSVRC2012_val_00027432.JPEG n15075141/
+mv val/ILSVRC2012_val_00027433.JPEG n02108915/
+mv val/ILSVRC2012_val_00027434.JPEG n01847000/
+mv val/ILSVRC2012_val_00027435.JPEG n02325366/
+mv val/ILSVRC2012_val_00027436.JPEG n03770439/
+mv val/ILSVRC2012_val_00027437.JPEG n03676483/
+mv val/ILSVRC2012_val_00027438.JPEG n06794110/
+mv val/ILSVRC2012_val_00027439.JPEG n01770393/
+mv val/ILSVRC2012_val_00027440.JPEG n02788148/
+mv val/ILSVRC2012_val_00027441.JPEG n03127925/
+mv val/ILSVRC2012_val_00027442.JPEG n03710721/
+mv val/ILSVRC2012_val_00027443.JPEG n02484975/
+mv val/ILSVRC2012_val_00027444.JPEG n02536864/
+mv val/ILSVRC2012_val_00027445.JPEG n02105855/
+mv val/ILSVRC2012_val_00027446.JPEG n03733131/
+mv val/ILSVRC2012_val_00027447.JPEG n04435653/
+mv val/ILSVRC2012_val_00027448.JPEG n02124075/
+mv val/ILSVRC2012_val_00027449.JPEG n03792782/
+mv val/ILSVRC2012_val_00027450.JPEG n04465501/
+mv val/ILSVRC2012_val_00027451.JPEG n01644373/
+mv val/ILSVRC2012_val_00027452.JPEG n02085620/
+mv val/ILSVRC2012_val_00027453.JPEG n03720891/
+mv val/ILSVRC2012_val_00027454.JPEG n03814639/
+mv val/ILSVRC2012_val_00027455.JPEG n03133878/
+mv val/ILSVRC2012_val_00027456.JPEG n02892201/
+mv val/ILSVRC2012_val_00027457.JPEG n02077923/
+mv val/ILSVRC2012_val_00027458.JPEG n02992211/
+mv val/ILSVRC2012_val_00027459.JPEG n02114712/
+mv val/ILSVRC2012_val_00027460.JPEG n02410509/
+mv val/ILSVRC2012_val_00027461.JPEG n03733131/
+mv val/ILSVRC2012_val_00027462.JPEG n03843555/
+mv val/ILSVRC2012_val_00027463.JPEG n02917067/
+mv val/ILSVRC2012_val_00027464.JPEG n02128385/
+mv val/ILSVRC2012_val_00027465.JPEG n04009552/
+mv val/ILSVRC2012_val_00027466.JPEG n03888605/
+mv val/ILSVRC2012_val_00027467.JPEG n03388043/
+mv val/ILSVRC2012_val_00027468.JPEG n04596742/
+mv val/ILSVRC2012_val_00027469.JPEG n03935335/
+mv val/ILSVRC2012_val_00027470.JPEG n06785654/
+mv val/ILSVRC2012_val_00027471.JPEG n02356798/
+mv val/ILSVRC2012_val_00027472.JPEG n02398521/
+mv val/ILSVRC2012_val_00027473.JPEG n03445924/
+mv val/ILSVRC2012_val_00027474.JPEG n03041632/
+mv val/ILSVRC2012_val_00027475.JPEG n03535780/
+mv val/ILSVRC2012_val_00027476.JPEG n07753113/
+mv val/ILSVRC2012_val_00027477.JPEG n02834397/
+mv val/ILSVRC2012_val_00027478.JPEG n01824575/
+mv val/ILSVRC2012_val_00027479.JPEG n07697313/
+mv val/ILSVRC2012_val_00027480.JPEG n04487081/
+mv val/ILSVRC2012_val_00027481.JPEG n02509815/
+mv val/ILSVRC2012_val_00027482.JPEG n02106550/
+mv val/ILSVRC2012_val_00027483.JPEG n01704323/
+mv val/ILSVRC2012_val_00027484.JPEG n01742172/
+mv val/ILSVRC2012_val_00027485.JPEG n02094433/
+mv val/ILSVRC2012_val_00027486.JPEG n01817953/
+mv val/ILSVRC2012_val_00027487.JPEG n03032252/
+mv val/ILSVRC2012_val_00027488.JPEG n01742172/
+mv val/ILSVRC2012_val_00027489.JPEG n02483362/
+mv val/ILSVRC2012_val_00027490.JPEG n02096437/
+mv val/ILSVRC2012_val_00027491.JPEG n02487347/
+mv val/ILSVRC2012_val_00027492.JPEG n02096294/
+mv val/ILSVRC2012_val_00027493.JPEG n04465501/
+mv val/ILSVRC2012_val_00027494.JPEG n02948072/
+mv val/ILSVRC2012_val_00027495.JPEG n03424325/
+mv val/ILSVRC2012_val_00027496.JPEG n02111500/
+mv val/ILSVRC2012_val_00027497.JPEG n02114367/
+mv val/ILSVRC2012_val_00027498.JPEG n01537544/
+mv val/ILSVRC2012_val_00027499.JPEG n01945685/
+mv val/ILSVRC2012_val_00027500.JPEG n02607072/
+mv val/ILSVRC2012_val_00027501.JPEG n04005630/
+mv val/ILSVRC2012_val_00027502.JPEG n04127249/
+mv val/ILSVRC2012_val_00027503.JPEG n07714990/
+mv val/ILSVRC2012_val_00027504.JPEG n03662601/
+mv val/ILSVRC2012_val_00027505.JPEG n03179701/
+mv val/ILSVRC2012_val_00027506.JPEG n09468604/
+mv val/ILSVRC2012_val_00027507.JPEG n01530575/
+mv val/ILSVRC2012_val_00027508.JPEG n03100240/
+mv val/ILSVRC2012_val_00027509.JPEG n06359193/
+mv val/ILSVRC2012_val_00027510.JPEG n02510455/
+mv val/ILSVRC2012_val_00027511.JPEG n02120079/
+mv val/ILSVRC2012_val_00027512.JPEG n02096437/
+mv val/ILSVRC2012_val_00027513.JPEG n03141823/
+mv val/ILSVRC2012_val_00027514.JPEG n01484850/
+mv val/ILSVRC2012_val_00027515.JPEG n04579432/
+mv val/ILSVRC2012_val_00027516.JPEG n04118538/
+mv val/ILSVRC2012_val_00027517.JPEG n02094433/
+mv val/ILSVRC2012_val_00027518.JPEG n02086910/
+mv val/ILSVRC2012_val_00027519.JPEG n01622779/
+mv val/ILSVRC2012_val_00027520.JPEG n07747607/
+mv val/ILSVRC2012_val_00027521.JPEG n07718747/
+mv val/ILSVRC2012_val_00027522.JPEG n02106030/
+mv val/ILSVRC2012_val_00027523.JPEG n02363005/
+mv val/ILSVRC2012_val_00027524.JPEG n03599486/
+mv val/ILSVRC2012_val_00027525.JPEG n03637318/
+mv val/ILSVRC2012_val_00027526.JPEG n02101388/
+mv val/ILSVRC2012_val_00027527.JPEG n03662601/
+mv val/ILSVRC2012_val_00027528.JPEG n03188531/
+mv val/ILSVRC2012_val_00027529.JPEG n02104029/
+mv val/ILSVRC2012_val_00027530.JPEG n11939491/
+mv val/ILSVRC2012_val_00027531.JPEG n04238763/
+mv val/ILSVRC2012_val_00027532.JPEG n01945685/
+mv val/ILSVRC2012_val_00027533.JPEG n02834397/
+mv val/ILSVRC2012_val_00027534.JPEG n02099712/
+mv val/ILSVRC2012_val_00027535.JPEG n01558993/
+mv val/ILSVRC2012_val_00027536.JPEG n03450230/
+mv val/ILSVRC2012_val_00027537.JPEG n03838899/
+mv val/ILSVRC2012_val_00027538.JPEG n04243546/
+mv val/ILSVRC2012_val_00027539.JPEG n02123159/
+mv val/ILSVRC2012_val_00027540.JPEG n04536866/
+mv val/ILSVRC2012_val_00027541.JPEG n02808304/
+mv val/ILSVRC2012_val_00027542.JPEG n04120489/
+mv val/ILSVRC2012_val_00027543.JPEG n03127925/
+mv val/ILSVRC2012_val_00027544.JPEG n04505470/
+mv val/ILSVRC2012_val_00027545.JPEG n03782006/
+mv val/ILSVRC2012_val_00027546.JPEG n02281406/
+mv val/ILSVRC2012_val_00027547.JPEG n04252225/
+mv val/ILSVRC2012_val_00027548.JPEG n02776631/
+mv val/ILSVRC2012_val_00027549.JPEG n02444819/
+mv val/ILSVRC2012_val_00027550.JPEG n04005630/
+mv val/ILSVRC2012_val_00027551.JPEG n03717622/
+mv val/ILSVRC2012_val_00027552.JPEG n03961711/
+mv val/ILSVRC2012_val_00027553.JPEG n03444034/
+mv val/ILSVRC2012_val_00027554.JPEG n03970156/
+mv val/ILSVRC2012_val_00027555.JPEG n01824575/
+mv val/ILSVRC2012_val_00027556.JPEG n02396427/
+mv val/ILSVRC2012_val_00027557.JPEG n02165456/
+mv val/ILSVRC2012_val_00027558.JPEG n02226429/
+mv val/ILSVRC2012_val_00027559.JPEG n02056570/
+mv val/ILSVRC2012_val_00027560.JPEG n07693725/
+mv val/ILSVRC2012_val_00027561.JPEG n04599235/
+mv val/ILSVRC2012_val_00027562.JPEG n03944341/
+mv val/ILSVRC2012_val_00027563.JPEG n02134418/
+mv val/ILSVRC2012_val_00027564.JPEG n03788365/
+mv val/ILSVRC2012_val_00027565.JPEG n07717410/
+mv val/ILSVRC2012_val_00027566.JPEG n04264628/
+mv val/ILSVRC2012_val_00027567.JPEG n03967562/
+mv val/ILSVRC2012_val_00027568.JPEG n04265275/
+mv val/ILSVRC2012_val_00027569.JPEG n03584254/
+mv val/ILSVRC2012_val_00027570.JPEG n01614925/
+mv val/ILSVRC2012_val_00027571.JPEG n07720875/
+mv val/ILSVRC2012_val_00027572.JPEG n03814639/
+mv val/ILSVRC2012_val_00027573.JPEG n04370456/
+mv val/ILSVRC2012_val_00027574.JPEG n04037443/
+mv val/ILSVRC2012_val_00027575.JPEG n03297495/
+mv val/ILSVRC2012_val_00027576.JPEG n02129604/
+mv val/ILSVRC2012_val_00027577.JPEG n03131574/
+mv val/ILSVRC2012_val_00027578.JPEG n04243546/
+mv val/ILSVRC2012_val_00027579.JPEG n02105855/
+mv val/ILSVRC2012_val_00027580.JPEG n03895866/
+mv val/ILSVRC2012_val_00027581.JPEG n03216828/
+mv val/ILSVRC2012_val_00027582.JPEG n02317335/
+mv val/ILSVRC2012_val_00027583.JPEG n02106030/
+mv val/ILSVRC2012_val_00027584.JPEG n03661043/
+mv val/ILSVRC2012_val_00027585.JPEG n01924916/
+mv val/ILSVRC2012_val_00027586.JPEG n02165456/
+mv val/ILSVRC2012_val_00027587.JPEG n04536866/
+mv val/ILSVRC2012_val_00027588.JPEG n01616318/
+mv val/ILSVRC2012_val_00027589.JPEG n02799071/
+mv val/ILSVRC2012_val_00027590.JPEG n03788195/
+mv val/ILSVRC2012_val_00027591.JPEG n02363005/
+mv val/ILSVRC2012_val_00027592.JPEG n01924916/
+mv val/ILSVRC2012_val_00027593.JPEG n04461696/
+mv val/ILSVRC2012_val_00027594.JPEG n04270147/
+mv val/ILSVRC2012_val_00027595.JPEG n02843684/
+mv val/ILSVRC2012_val_00027596.JPEG n04258138/
+mv val/ILSVRC2012_val_00027597.JPEG n03944341/
+mv val/ILSVRC2012_val_00027598.JPEG n01737021/
+mv val/ILSVRC2012_val_00027599.JPEG n01882714/
+mv val/ILSVRC2012_val_00027600.JPEG n02817516/
+mv val/ILSVRC2012_val_00027601.JPEG n02097298/
+mv val/ILSVRC2012_val_00027602.JPEG n01843383/
+mv val/ILSVRC2012_val_00027603.JPEG n04019541/
+mv val/ILSVRC2012_val_00027604.JPEG n04118776/
+mv val/ILSVRC2012_val_00027605.JPEG n02799071/
+mv val/ILSVRC2012_val_00027606.JPEG n03967562/
+mv val/ILSVRC2012_val_00027607.JPEG n03494278/
+mv val/ILSVRC2012_val_00027608.JPEG n02229544/
+mv val/ILSVRC2012_val_00027609.JPEG n04325704/
+mv val/ILSVRC2012_val_00027610.JPEG n03967562/
+mv val/ILSVRC2012_val_00027611.JPEG n13044778/
+mv val/ILSVRC2012_val_00027612.JPEG n03344393/
+mv val/ILSVRC2012_val_00027613.JPEG n04557648/
+mv val/ILSVRC2012_val_00027614.JPEG n03447721/
+mv val/ILSVRC2012_val_00027615.JPEG n09472597/
+mv val/ILSVRC2012_val_00027616.JPEG n04118538/
+mv val/ILSVRC2012_val_00027617.JPEG n03424325/
+mv val/ILSVRC2012_val_00027618.JPEG n04599235/
+mv val/ILSVRC2012_val_00027619.JPEG n01530575/
+mv val/ILSVRC2012_val_00027620.JPEG n02835271/
+mv val/ILSVRC2012_val_00027621.JPEG n09472597/
+mv val/ILSVRC2012_val_00027622.JPEG n02092002/
+mv val/ILSVRC2012_val_00027623.JPEG n02730930/
+mv val/ILSVRC2012_val_00027624.JPEG n04599235/
+mv val/ILSVRC2012_val_00027625.JPEG n02422699/
+mv val/ILSVRC2012_val_00027626.JPEG n03657121/
+mv val/ILSVRC2012_val_00027627.JPEG n01622779/
+mv val/ILSVRC2012_val_00027628.JPEG n03903868/
+mv val/ILSVRC2012_val_00027629.JPEG n02090721/
+mv val/ILSVRC2012_val_00027630.JPEG n04443257/
+mv val/ILSVRC2012_val_00027631.JPEG n01734418/
+mv val/ILSVRC2012_val_00027632.JPEG n07714571/
+mv val/ILSVRC2012_val_00027633.JPEG n01496331/
+mv val/ILSVRC2012_val_00027634.JPEG n02264363/
+mv val/ILSVRC2012_val_00027635.JPEG n03483316/
+mv val/ILSVRC2012_val_00027636.JPEG n03742115/
+mv val/ILSVRC2012_val_00027637.JPEG n07714990/
+mv val/ILSVRC2012_val_00027638.JPEG n03590841/
+mv val/ILSVRC2012_val_00027639.JPEG n03871628/
+mv val/ILSVRC2012_val_00027640.JPEG n04311174/
+mv val/ILSVRC2012_val_00027641.JPEG n02114548/
+mv val/ILSVRC2012_val_00027642.JPEG n03255030/
+mv val/ILSVRC2012_val_00027643.JPEG n02105505/
+mv val/ILSVRC2012_val_00027644.JPEG n07579787/
+mv val/ILSVRC2012_val_00027645.JPEG n07697313/
+mv val/ILSVRC2012_val_00027646.JPEG n03400231/
+mv val/ILSVRC2012_val_00027647.JPEG n06874185/
+mv val/ILSVRC2012_val_00027648.JPEG n04591713/
+mv val/ILSVRC2012_val_00027649.JPEG n04509417/
+mv val/ILSVRC2012_val_00027650.JPEG n03255030/
+mv val/ILSVRC2012_val_00027651.JPEG n03404251/
+mv val/ILSVRC2012_val_00027652.JPEG n02268853/
+mv val/ILSVRC2012_val_00027653.JPEG n07613480/
+mv val/ILSVRC2012_val_00027654.JPEG n07768694/
+mv val/ILSVRC2012_val_00027655.JPEG n02321529/
+mv val/ILSVRC2012_val_00027656.JPEG n01818515/
+mv val/ILSVRC2012_val_00027657.JPEG n01877812/
+mv val/ILSVRC2012_val_00027658.JPEG n02895154/
+mv val/ILSVRC2012_val_00027659.JPEG n03485794/
+mv val/ILSVRC2012_val_00027660.JPEG n04553703/
+mv val/ILSVRC2012_val_00027661.JPEG n02364673/
+mv val/ILSVRC2012_val_00027662.JPEG n09229709/
+mv val/ILSVRC2012_val_00027663.JPEG n02916936/
+mv val/ILSVRC2012_val_00027664.JPEG n04235860/
+mv val/ILSVRC2012_val_00027665.JPEG n07932039/
+mv val/ILSVRC2012_val_00027666.JPEG n15075141/
+mv val/ILSVRC2012_val_00027667.JPEG n02006656/
+mv val/ILSVRC2012_val_00027668.JPEG n02487347/
+mv val/ILSVRC2012_val_00027669.JPEG n02087394/
+mv val/ILSVRC2012_val_00027670.JPEG n02480855/
+mv val/ILSVRC2012_val_00027671.JPEG n04372370/
+mv val/ILSVRC2012_val_00027672.JPEG n03733805/
+mv val/ILSVRC2012_val_00027673.JPEG n02979186/
+mv val/ILSVRC2012_val_00027674.JPEG n02033041/
+mv val/ILSVRC2012_val_00027675.JPEG n10565667/
+mv val/ILSVRC2012_val_00027676.JPEG n02006656/
+mv val/ILSVRC2012_val_00027677.JPEG n02099267/
+mv val/ILSVRC2012_val_00027678.JPEG n02108915/
+mv val/ILSVRC2012_val_00027679.JPEG n03930630/
+mv val/ILSVRC2012_val_00027680.JPEG n01728572/
+mv val/ILSVRC2012_val_00027681.JPEG n04552348/
+mv val/ILSVRC2012_val_00027682.JPEG n02090721/
+mv val/ILSVRC2012_val_00027683.JPEG n02870880/
+mv val/ILSVRC2012_val_00027684.JPEG n02951585/
+mv val/ILSVRC2012_val_00027685.JPEG n04259630/
+mv val/ILSVRC2012_val_00027686.JPEG n02328150/
+mv val/ILSVRC2012_val_00027687.JPEG n04435653/
+mv val/ILSVRC2012_val_00027688.JPEG n02843684/
+mv val/ILSVRC2012_val_00027689.JPEG n03788195/
+mv val/ILSVRC2012_val_00027690.JPEG n03887697/
+mv val/ILSVRC2012_val_00027691.JPEG n04335435/
+mv val/ILSVRC2012_val_00027692.JPEG n04228054/
+mv val/ILSVRC2012_val_00027693.JPEG n01608432/
+mv val/ILSVRC2012_val_00027694.JPEG n04355933/
+mv val/ILSVRC2012_val_00027695.JPEG n02123045/
+mv val/ILSVRC2012_val_00027696.JPEG n04589890/
+mv val/ILSVRC2012_val_00027697.JPEG n04086273/
+mv val/ILSVRC2012_val_00027698.JPEG n03832673/
+mv val/ILSVRC2012_val_00027699.JPEG n02111277/
+mv val/ILSVRC2012_val_00027700.JPEG n01704323/
+mv val/ILSVRC2012_val_00027701.JPEG n03599486/
+mv val/ILSVRC2012_val_00027702.JPEG n04254680/
+mv val/ILSVRC2012_val_00027703.JPEG n02086240/
+mv val/ILSVRC2012_val_00027704.JPEG n02817516/
+mv val/ILSVRC2012_val_00027705.JPEG n02487347/
+mv val/ILSVRC2012_val_00027706.JPEG n04592741/
+mv val/ILSVRC2012_val_00027707.JPEG n03272010/
+mv val/ILSVRC2012_val_00027708.JPEG n02018795/
+mv val/ILSVRC2012_val_00027709.JPEG n01930112/
+mv val/ILSVRC2012_val_00027710.JPEG n03223299/
+mv val/ILSVRC2012_val_00027711.JPEG n03388043/
+mv val/ILSVRC2012_val_00027712.JPEG n03888605/
+mv val/ILSVRC2012_val_00027713.JPEG n04040759/
+mv val/ILSVRC2012_val_00027714.JPEG n02169497/
+mv val/ILSVRC2012_val_00027715.JPEG n02793495/
+mv val/ILSVRC2012_val_00027716.JPEG n04376876/
+mv val/ILSVRC2012_val_00027717.JPEG n02177972/
+mv val/ILSVRC2012_val_00027718.JPEG n04485082/
+mv val/ILSVRC2012_val_00027719.JPEG n07717410/
+mv val/ILSVRC2012_val_00027720.JPEG n04081281/
+mv val/ILSVRC2012_val_00027721.JPEG n03109150/
+mv val/ILSVRC2012_val_00027722.JPEG n02090622/
+mv val/ILSVRC2012_val_00027723.JPEG n03482405/
+mv val/ILSVRC2012_val_00027724.JPEG n01664065/
+mv val/ILSVRC2012_val_00027725.JPEG n03032252/
+mv val/ILSVRC2012_val_00027726.JPEG n03355925/
+mv val/ILSVRC2012_val_00027727.JPEG n01910747/
+mv val/ILSVRC2012_val_00027728.JPEG n04536866/
+mv val/ILSVRC2012_val_00027729.JPEG n03000247/
+mv val/ILSVRC2012_val_00027730.JPEG n03527444/
+mv val/ILSVRC2012_val_00027731.JPEG n02025239/
+mv val/ILSVRC2012_val_00027732.JPEG n04254777/
+mv val/ILSVRC2012_val_00027733.JPEG n04141975/
+mv val/ILSVRC2012_val_00027734.JPEG n03793489/
+mv val/ILSVRC2012_val_00027735.JPEG n02979186/
+mv val/ILSVRC2012_val_00027736.JPEG n02127052/
+mv val/ILSVRC2012_val_00027737.JPEG n01847000/
+mv val/ILSVRC2012_val_00027738.JPEG n02328150/
+mv val/ILSVRC2012_val_00027739.JPEG n02909870/
+mv val/ILSVRC2012_val_00027740.JPEG n10565667/
+mv val/ILSVRC2012_val_00027741.JPEG n03709823/
+mv val/ILSVRC2012_val_00027742.JPEG n02992211/
+mv val/ILSVRC2012_val_00027743.JPEG n02093859/
+mv val/ILSVRC2012_val_00027744.JPEG n07747607/
+mv val/ILSVRC2012_val_00027745.JPEG n07717410/
+mv val/ILSVRC2012_val_00027746.JPEG n03249569/
+mv val/ILSVRC2012_val_00027747.JPEG n01734418/
+mv val/ILSVRC2012_val_00027748.JPEG n03944341/
+mv val/ILSVRC2012_val_00027749.JPEG n04344873/
+mv val/ILSVRC2012_val_00027750.JPEG n01677366/
+mv val/ILSVRC2012_val_00027751.JPEG n02108000/
+mv val/ILSVRC2012_val_00027752.JPEG n03876231/
+mv val/ILSVRC2012_val_00027753.JPEG n04461696/
+mv val/ILSVRC2012_val_00027754.JPEG n06596364/
+mv val/ILSVRC2012_val_00027755.JPEG n09428293/
+mv val/ILSVRC2012_val_00027756.JPEG n03482405/
+mv val/ILSVRC2012_val_00027757.JPEG n02088094/
+mv val/ILSVRC2012_val_00027758.JPEG n04136333/
+mv val/ILSVRC2012_val_00027759.JPEG n04204238/
+mv val/ILSVRC2012_val_00027760.JPEG n01697457/
+mv val/ILSVRC2012_val_00027761.JPEG n04074963/
+mv val/ILSVRC2012_val_00027762.JPEG n01514859/
+mv val/ILSVRC2012_val_00027763.JPEG n02106662/
+mv val/ILSVRC2012_val_00027764.JPEG n04252225/
+mv val/ILSVRC2012_val_00027765.JPEG n02117135/
+mv val/ILSVRC2012_val_00027766.JPEG n03476684/
+mv val/ILSVRC2012_val_00027767.JPEG n01770393/
+mv val/ILSVRC2012_val_00027768.JPEG n02795169/
+mv val/ILSVRC2012_val_00027769.JPEG n03733131/
+mv val/ILSVRC2012_val_00027770.JPEG n03676483/
+mv val/ILSVRC2012_val_00027771.JPEG n04133789/
+mv val/ILSVRC2012_val_00027772.JPEG n04435653/
+mv val/ILSVRC2012_val_00027773.JPEG n01728920/
+mv val/ILSVRC2012_val_00027774.JPEG n04033995/
+mv val/ILSVRC2012_val_00027775.JPEG n04355933/
+mv val/ILSVRC2012_val_00027776.JPEG n01675722/
+mv val/ILSVRC2012_val_00027777.JPEG n03717622/
+mv val/ILSVRC2012_val_00027778.JPEG n04428191/
+mv val/ILSVRC2012_val_00027779.JPEG n03535780/
+mv val/ILSVRC2012_val_00027780.JPEG n02105162/
+mv val/ILSVRC2012_val_00027781.JPEG n07753275/
+mv val/ILSVRC2012_val_00027782.JPEG n04483307/
+mv val/ILSVRC2012_val_00027783.JPEG n02917067/
+mv val/ILSVRC2012_val_00027784.JPEG n04118776/
+mv val/ILSVRC2012_val_00027785.JPEG n03000684/
+mv val/ILSVRC2012_val_00027786.JPEG n03000134/
+mv val/ILSVRC2012_val_00027787.JPEG n02281787/
+mv val/ILSVRC2012_val_00027788.JPEG n01770393/
+mv val/ILSVRC2012_val_00027789.JPEG n02326432/
+mv val/ILSVRC2012_val_00027790.JPEG n01753488/
+mv val/ILSVRC2012_val_00027791.JPEG n02167151/
+mv val/ILSVRC2012_val_00027792.JPEG n02808304/
+mv val/ILSVRC2012_val_00027793.JPEG n04392985/
+mv val/ILSVRC2012_val_00027794.JPEG n03197337/
+mv val/ILSVRC2012_val_00027795.JPEG n03100240/
+mv val/ILSVRC2012_val_00027796.JPEG n04286575/
+mv val/ILSVRC2012_val_00027797.JPEG n03127925/
+mv val/ILSVRC2012_val_00027798.JPEG n01945685/
+mv val/ILSVRC2012_val_00027799.JPEG n02536864/
+mv val/ILSVRC2012_val_00027800.JPEG n02799071/
+mv val/ILSVRC2012_val_00027801.JPEG n02783161/
+mv val/ILSVRC2012_val_00027802.JPEG n02346627/
+mv val/ILSVRC2012_val_00027803.JPEG n02264363/
+mv val/ILSVRC2012_val_00027804.JPEG n02088364/
+mv val/ILSVRC2012_val_00027805.JPEG n02093754/
+mv val/ILSVRC2012_val_00027806.JPEG n03617480/
+mv val/ILSVRC2012_val_00027807.JPEG n02105162/
+mv val/ILSVRC2012_val_00027808.JPEG n02966687/
+mv val/ILSVRC2012_val_00027809.JPEG n01795545/
+mv val/ILSVRC2012_val_00027810.JPEG n02091831/
+mv val/ILSVRC2012_val_00027811.JPEG n01537544/
+mv val/ILSVRC2012_val_00027812.JPEG n03041632/
+mv val/ILSVRC2012_val_00027813.JPEG n02834397/
+mv val/ILSVRC2012_val_00027814.JPEG n02699494/
+mv val/ILSVRC2012_val_00027815.JPEG n03404251/
+mv val/ILSVRC2012_val_00027816.JPEG n01860187/
+mv val/ILSVRC2012_val_00027817.JPEG n04550184/
+mv val/ILSVRC2012_val_00027818.JPEG n02992211/
+mv val/ILSVRC2012_val_00027819.JPEG n02437312/
+mv val/ILSVRC2012_val_00027820.JPEG n02098105/
+mv val/ILSVRC2012_val_00027821.JPEG n07590611/
+mv val/ILSVRC2012_val_00027822.JPEG n03527444/
+mv val/ILSVRC2012_val_00027823.JPEG n07583066/
+mv val/ILSVRC2012_val_00027824.JPEG n01748264/
+mv val/ILSVRC2012_val_00027825.JPEG n02966687/
+mv val/ILSVRC2012_val_00027826.JPEG n03803284/
+mv val/ILSVRC2012_val_00027827.JPEG n04366367/
+mv val/ILSVRC2012_val_00027828.JPEG n02119022/
+mv val/ILSVRC2012_val_00027829.JPEG n01740131/
+mv val/ILSVRC2012_val_00027830.JPEG n02099601/
+mv val/ILSVRC2012_val_00027831.JPEG n01534433/
+mv val/ILSVRC2012_val_00027832.JPEG n04606251/
+mv val/ILSVRC2012_val_00027833.JPEG n02099601/
+mv val/ILSVRC2012_val_00027834.JPEG n02488702/
+mv val/ILSVRC2012_val_00027835.JPEG n04336792/
+mv val/ILSVRC2012_val_00027836.JPEG n02391049/
+mv val/ILSVRC2012_val_00027837.JPEG n02086646/
+mv val/ILSVRC2012_val_00027838.JPEG n02086079/
+mv val/ILSVRC2012_val_00027839.JPEG n02110806/
+mv val/ILSVRC2012_val_00027840.JPEG n02110341/
+mv val/ILSVRC2012_val_00027841.JPEG n04447861/
+mv val/ILSVRC2012_val_00027842.JPEG n02119789/
+mv val/ILSVRC2012_val_00027843.JPEG n04162706/
+mv val/ILSVRC2012_val_00027844.JPEG n02259212/
+mv val/ILSVRC2012_val_00027845.JPEG n03124043/
+mv val/ILSVRC2012_val_00027846.JPEG n02101388/
+mv val/ILSVRC2012_val_00027847.JPEG n03630383/
+mv val/ILSVRC2012_val_00027848.JPEG n02980441/
+mv val/ILSVRC2012_val_00027849.JPEG n02494079/
+mv val/ILSVRC2012_val_00027850.JPEG n03602883/
+mv val/ILSVRC2012_val_00027851.JPEG n01695060/
+mv val/ILSVRC2012_val_00027852.JPEG n04141327/
+mv val/ILSVRC2012_val_00027853.JPEG n04266014/
+mv val/ILSVRC2012_val_00027854.JPEG n03047690/
+mv val/ILSVRC2012_val_00027855.JPEG n02097209/
+mv val/ILSVRC2012_val_00027856.JPEG n02113023/
+mv val/ILSVRC2012_val_00027857.JPEG n02174001/
+mv val/ILSVRC2012_val_00027858.JPEG n01669191/
+mv val/ILSVRC2012_val_00027859.JPEG n01667778/
+mv val/ILSVRC2012_val_00027860.JPEG n02096051/
+mv val/ILSVRC2012_val_00027861.JPEG n04251144/
+mv val/ILSVRC2012_val_00027862.JPEG n02112706/
+mv val/ILSVRC2012_val_00027863.JPEG n02988304/
+mv val/ILSVRC2012_val_00027864.JPEG n03461385/
+mv val/ILSVRC2012_val_00027865.JPEG n03447447/
+mv val/ILSVRC2012_val_00027866.JPEG n02077923/
+mv val/ILSVRC2012_val_00027867.JPEG n03887697/
+mv val/ILSVRC2012_val_00027868.JPEG n02342885/
+mv val/ILSVRC2012_val_00027869.JPEG n01641577/
+mv val/ILSVRC2012_val_00027870.JPEG n01616318/
+mv val/ILSVRC2012_val_00027871.JPEG n02007558/
+mv val/ILSVRC2012_val_00027872.JPEG n01698640/
+mv val/ILSVRC2012_val_00027873.JPEG n04033995/
+mv val/ILSVRC2012_val_00027874.JPEG n03804744/
+mv val/ILSVRC2012_val_00027875.JPEG n02110063/
+mv val/ILSVRC2012_val_00027876.JPEG n03355925/
+mv val/ILSVRC2012_val_00027877.JPEG n01667114/
+mv val/ILSVRC2012_val_00027878.JPEG n01914609/
+mv val/ILSVRC2012_val_00027879.JPEG n03804744/
+mv val/ILSVRC2012_val_00027880.JPEG n02669723/
+mv val/ILSVRC2012_val_00027881.JPEG n07836838/
+mv val/ILSVRC2012_val_00027882.JPEG n02412080/
+mv val/ILSVRC2012_val_00027883.JPEG n03743016/
+mv val/ILSVRC2012_val_00027884.JPEG n04336792/
+mv val/ILSVRC2012_val_00027885.JPEG n13052670/
+mv val/ILSVRC2012_val_00027886.JPEG n03791053/
+mv val/ILSVRC2012_val_00027887.JPEG n03776460/
+mv val/ILSVRC2012_val_00027888.JPEG n03017168/
+mv val/ILSVRC2012_val_00027889.JPEG n04404412/
+mv val/ILSVRC2012_val_00027890.JPEG n03777754/
+mv val/ILSVRC2012_val_00027891.JPEG n04037443/
+mv val/ILSVRC2012_val_00027892.JPEG n03796401/
+mv val/ILSVRC2012_val_00027893.JPEG n04404412/
+mv val/ILSVRC2012_val_00027894.JPEG n06596364/
+mv val/ILSVRC2012_val_00027895.JPEG n02105412/
+mv val/ILSVRC2012_val_00027896.JPEG n04023962/
+mv val/ILSVRC2012_val_00027897.JPEG n01734418/
+mv val/ILSVRC2012_val_00027898.JPEG n02328150/
+mv val/ILSVRC2012_val_00027899.JPEG n02101006/
+mv val/ILSVRC2012_val_00027900.JPEG n07684084/
+mv val/ILSVRC2012_val_00027901.JPEG n02002556/
+mv val/ILSVRC2012_val_00027902.JPEG n13133613/
+mv val/ILSVRC2012_val_00027903.JPEG n07248320/
+mv val/ILSVRC2012_val_00027904.JPEG n01753488/
+mv val/ILSVRC2012_val_00027905.JPEG n02107908/
+mv val/ILSVRC2012_val_00027906.JPEG n02123394/
+mv val/ILSVRC2012_val_00027907.JPEG n04154565/
+mv val/ILSVRC2012_val_00027908.JPEG n02504458/
+mv val/ILSVRC2012_val_00027909.JPEG n13052670/
+mv val/ILSVRC2012_val_00027910.JPEG n04008634/
+mv val/ILSVRC2012_val_00027911.JPEG n02916936/
+mv val/ILSVRC2012_val_00027912.JPEG n02107683/
+mv val/ILSVRC2012_val_00027913.JPEG n02134084/
+mv val/ILSVRC2012_val_00027914.JPEG n02443484/
+mv val/ILSVRC2012_val_00027915.JPEG n07720875/
+mv val/ILSVRC2012_val_00027916.JPEG n04493381/
+mv val/ILSVRC2012_val_00027917.JPEG n03761084/
+mv val/ILSVRC2012_val_00027918.JPEG n02102040/
+mv val/ILSVRC2012_val_00027919.JPEG n03089624/
+mv val/ILSVRC2012_val_00027920.JPEG n01985128/
+mv val/ILSVRC2012_val_00027921.JPEG n01753488/
+mv val/ILSVRC2012_val_00027922.JPEG n02137549/
+mv val/ILSVRC2012_val_00027923.JPEG n09835506/
+mv val/ILSVRC2012_val_00027924.JPEG n03443371/
+mv val/ILSVRC2012_val_00027925.JPEG n02346627/
+mv val/ILSVRC2012_val_00027926.JPEG n02002556/
+mv val/ILSVRC2012_val_00027927.JPEG n04589890/
+mv val/ILSVRC2012_val_00027928.JPEG n04562935/
+mv val/ILSVRC2012_val_00027929.JPEG n01632777/
+mv val/ILSVRC2012_val_00027930.JPEG n02317335/
+mv val/ILSVRC2012_val_00027931.JPEG n01632458/
+mv val/ILSVRC2012_val_00027932.JPEG n02493509/
+mv val/ILSVRC2012_val_00027933.JPEG n02398521/
+mv val/ILSVRC2012_val_00027934.JPEG n03970156/
+mv val/ILSVRC2012_val_00027935.JPEG n02667093/
+mv val/ILSVRC2012_val_00027936.JPEG n03825788/
+mv val/ILSVRC2012_val_00027937.JPEG n02086646/
+mv val/ILSVRC2012_val_00027938.JPEG n13044778/
+mv val/ILSVRC2012_val_00027939.JPEG n02088238/
+mv val/ILSVRC2012_val_00027940.JPEG n01776313/
+mv val/ILSVRC2012_val_00027941.JPEG n02481823/
+mv val/ILSVRC2012_val_00027942.JPEG n04423845/
+mv val/ILSVRC2012_val_00027943.JPEG n03047690/
+mv val/ILSVRC2012_val_00027944.JPEG n07749582/
+mv val/ILSVRC2012_val_00027945.JPEG n02977058/
+mv val/ILSVRC2012_val_00027946.JPEG n01796340/
+mv val/ILSVRC2012_val_00027947.JPEG n02110627/
+mv val/ILSVRC2012_val_00027948.JPEG n02910353/
+mv val/ILSVRC2012_val_00027949.JPEG n03201208/
+mv val/ILSVRC2012_val_00027950.JPEG n01728572/
+mv val/ILSVRC2012_val_00027951.JPEG n02114367/
+mv val/ILSVRC2012_val_00027952.JPEG n03980874/
+mv val/ILSVRC2012_val_00027953.JPEG n02776631/
+mv val/ILSVRC2012_val_00027954.JPEG n02165456/
+mv val/ILSVRC2012_val_00027955.JPEG n02437312/
+mv val/ILSVRC2012_val_00027956.JPEG n02364673/
+mv val/ILSVRC2012_val_00027957.JPEG n03764736/
+mv val/ILSVRC2012_val_00027958.JPEG n04041544/
+mv val/ILSVRC2012_val_00027959.JPEG n12998815/
+mv val/ILSVRC2012_val_00027960.JPEG n03388043/
+mv val/ILSVRC2012_val_00027961.JPEG n03803284/
+mv val/ILSVRC2012_val_00027962.JPEG n02113624/
+mv val/ILSVRC2012_val_00027963.JPEG n02102318/
+mv val/ILSVRC2012_val_00027964.JPEG n03424325/
+mv val/ILSVRC2012_val_00027965.JPEG n03250847/
+mv val/ILSVRC2012_val_00027966.JPEG n09288635/
+mv val/ILSVRC2012_val_00027967.JPEG n03924679/
+mv val/ILSVRC2012_val_00027968.JPEG n03956157/
+mv val/ILSVRC2012_val_00027969.JPEG n01910747/
+mv val/ILSVRC2012_val_00027970.JPEG n04560804/
+mv val/ILSVRC2012_val_00027971.JPEG n07714990/
+mv val/ILSVRC2012_val_00027972.JPEG n04542943/
+mv val/ILSVRC2012_val_00027973.JPEG n07716906/
+mv val/ILSVRC2012_val_00027974.JPEG n02128925/
+mv val/ILSVRC2012_val_00027975.JPEG n04487394/
+mv val/ILSVRC2012_val_00027976.JPEG n04399382/
+mv val/ILSVRC2012_val_00027977.JPEG n04044716/
+mv val/ILSVRC2012_val_00027978.JPEG n04465501/
+mv val/ILSVRC2012_val_00027979.JPEG n03854065/
+mv val/ILSVRC2012_val_00027980.JPEG n02398521/
+mv val/ILSVRC2012_val_00027981.JPEG n02823750/
+mv val/ILSVRC2012_val_00027982.JPEG n07583066/
+mv val/ILSVRC2012_val_00027983.JPEG n02107312/
+mv val/ILSVRC2012_val_00027984.JPEG n04584207/
+mv val/ILSVRC2012_val_00027985.JPEG n01829413/
+mv val/ILSVRC2012_val_00027986.JPEG n01833805/
+mv val/ILSVRC2012_val_00027987.JPEG n02417914/
+mv val/ILSVRC2012_val_00027988.JPEG n04081281/
+mv val/ILSVRC2012_val_00027989.JPEG n02088364/
+mv val/ILSVRC2012_val_00027990.JPEG n02113799/
+mv val/ILSVRC2012_val_00027991.JPEG n04376876/
+mv val/ILSVRC2012_val_00027992.JPEG n02093991/
+mv val/ILSVRC2012_val_00027993.JPEG n02730930/
+mv val/ILSVRC2012_val_00027994.JPEG n04133789/
+mv val/ILSVRC2012_val_00027995.JPEG n02442845/
+mv val/ILSVRC2012_val_00027996.JPEG n02018207/
+mv val/ILSVRC2012_val_00027997.JPEG n03930630/
+mv val/ILSVRC2012_val_00027998.JPEG n02910353/
+mv val/ILSVRC2012_val_00027999.JPEG n02730930/
+mv val/ILSVRC2012_val_00028000.JPEG n03776460/
+mv val/ILSVRC2012_val_00028001.JPEG n02088364/
+mv val/ILSVRC2012_val_00028002.JPEG n04264628/
+mv val/ILSVRC2012_val_00028003.JPEG n07714990/
+mv val/ILSVRC2012_val_00028004.JPEG n04461696/
+mv val/ILSVRC2012_val_00028005.JPEG n03372029/
+mv val/ILSVRC2012_val_00028006.JPEG n02090379/
+mv val/ILSVRC2012_val_00028007.JPEG n01819313/
+mv val/ILSVRC2012_val_00028008.JPEG n03657121/
+mv val/ILSVRC2012_val_00028009.JPEG n02106662/
+mv val/ILSVRC2012_val_00028010.JPEG n02109525/
+mv val/ILSVRC2012_val_00028011.JPEG n02500267/
+mv val/ILSVRC2012_val_00028012.JPEG n04376876/
+mv val/ILSVRC2012_val_00028013.JPEG n04483307/
+mv val/ILSVRC2012_val_00028014.JPEG n03843555/
+mv val/ILSVRC2012_val_00028015.JPEG n13037406/
+mv val/ILSVRC2012_val_00028016.JPEG n02097047/
+mv val/ILSVRC2012_val_00028017.JPEG n02403003/
+mv val/ILSVRC2012_val_00028018.JPEG n03290653/
+mv val/ILSVRC2012_val_00028019.JPEG n02690373/
+mv val/ILSVRC2012_val_00028020.JPEG n02536864/
+mv val/ILSVRC2012_val_00028021.JPEG n02091467/
+mv val/ILSVRC2012_val_00028022.JPEG n03843555/
+mv val/ILSVRC2012_val_00028023.JPEG n04044716/
+mv val/ILSVRC2012_val_00028024.JPEG n01537544/
+mv val/ILSVRC2012_val_00028025.JPEG n02037110/
+mv val/ILSVRC2012_val_00028026.JPEG n04146614/
+mv val/ILSVRC2012_val_00028027.JPEG n04612504/
+mv val/ILSVRC2012_val_00028028.JPEG n01484850/
+mv val/ILSVRC2012_val_00028029.JPEG n07684084/
+mv val/ILSVRC2012_val_00028030.JPEG n03220513/
+mv val/ILSVRC2012_val_00028031.JPEG n04326547/
+mv val/ILSVRC2012_val_00028032.JPEG n03127925/
+mv val/ILSVRC2012_val_00028033.JPEG n02971356/
+mv val/ILSVRC2012_val_00028034.JPEG n03476991/
+mv val/ILSVRC2012_val_00028035.JPEG n01774384/
+mv val/ILSVRC2012_val_00028036.JPEG n07565083/
+mv val/ILSVRC2012_val_00028037.JPEG n02672831/
+mv val/ILSVRC2012_val_00028038.JPEG n03967562/
+mv val/ILSVRC2012_val_00028039.JPEG n03998194/
+mv val/ILSVRC2012_val_00028040.JPEG n09229709/
+mv val/ILSVRC2012_val_00028041.JPEG n01641577/
+mv val/ILSVRC2012_val_00028042.JPEG n01682714/
+mv val/ILSVRC2012_val_00028043.JPEG n04204347/
+mv val/ILSVRC2012_val_00028044.JPEG n03160309/
+mv val/ILSVRC2012_val_00028045.JPEG n03478589/
+mv val/ILSVRC2012_val_00028046.JPEG n03792972/
+mv val/ILSVRC2012_val_00028047.JPEG n04458633/
+mv val/ILSVRC2012_val_00028048.JPEG n04392985/
+mv val/ILSVRC2012_val_00028049.JPEG n02480855/
+mv val/ILSVRC2012_val_00028050.JPEG n02099429/
+mv val/ILSVRC2012_val_00028051.JPEG n07714571/
+mv val/ILSVRC2012_val_00028052.JPEG n02098105/
+mv val/ILSVRC2012_val_00028053.JPEG n02963159/
+mv val/ILSVRC2012_val_00028054.JPEG n02777292/
+mv val/ILSVRC2012_val_00028055.JPEG n03529860/
+mv val/ILSVRC2012_val_00028056.JPEG n03706229/
+mv val/ILSVRC2012_val_00028057.JPEG n12057211/
+mv val/ILSVRC2012_val_00028058.JPEG n04612504/
+mv val/ILSVRC2012_val_00028059.JPEG n04554684/
+mv val/ILSVRC2012_val_00028060.JPEG n03590841/
+mv val/ILSVRC2012_val_00028061.JPEG n03661043/
+mv val/ILSVRC2012_val_00028062.JPEG n04065272/
+mv val/ILSVRC2012_val_00028063.JPEG n01531178/
+mv val/ILSVRC2012_val_00028064.JPEG n07614500/
+mv val/ILSVRC2012_val_00028065.JPEG n02017213/
+mv val/ILSVRC2012_val_00028066.JPEG n02859443/
+mv val/ILSVRC2012_val_00028067.JPEG n04235860/
+mv val/ILSVRC2012_val_00028068.JPEG n02256656/
+mv val/ILSVRC2012_val_00028069.JPEG n03481172/
+mv val/ILSVRC2012_val_00028070.JPEG n02110063/
+mv val/ILSVRC2012_val_00028071.JPEG n02281787/
+mv val/ILSVRC2012_val_00028072.JPEG n04579432/
+mv val/ILSVRC2012_val_00028073.JPEG n01985128/
+mv val/ILSVRC2012_val_00028074.JPEG n02363005/
+mv val/ILSVRC2012_val_00028075.JPEG n04317175/
+mv val/ILSVRC2012_val_00028076.JPEG n01737021/
+mv val/ILSVRC2012_val_00028077.JPEG n03216828/
+mv val/ILSVRC2012_val_00028078.JPEG n02095570/
+mv val/ILSVRC2012_val_00028079.JPEG n07714571/
+mv val/ILSVRC2012_val_00028080.JPEG n04525305/
+mv val/ILSVRC2012_val_00028081.JPEG n07565083/
+mv val/ILSVRC2012_val_00028082.JPEG n03494278/
+mv val/ILSVRC2012_val_00028083.JPEG n04525038/
+mv val/ILSVRC2012_val_00028084.JPEG n01494475/
+mv val/ILSVRC2012_val_00028085.JPEG n04404412/
+mv val/ILSVRC2012_val_00028086.JPEG n07718747/
+mv val/ILSVRC2012_val_00028087.JPEG n03903868/
+mv val/ILSVRC2012_val_00028088.JPEG n04376876/
+mv val/ILSVRC2012_val_00028089.JPEG n02088632/
+mv val/ILSVRC2012_val_00028090.JPEG n07720875/
+mv val/ILSVRC2012_val_00028091.JPEG n02111277/
+mv val/ILSVRC2012_val_00028092.JPEG n01728920/
+mv val/ILSVRC2012_val_00028093.JPEG n04311004/
+mv val/ILSVRC2012_val_00028094.JPEG n02877765/
+mv val/ILSVRC2012_val_00028095.JPEG n06785654/
+mv val/ILSVRC2012_val_00028096.JPEG n01978455/
+mv val/ILSVRC2012_val_00028097.JPEG n01729977/
+mv val/ILSVRC2012_val_00028098.JPEG n02906734/
+mv val/ILSVRC2012_val_00028099.JPEG n01601694/
+mv val/ILSVRC2012_val_00028100.JPEG n04429376/
+mv val/ILSVRC2012_val_00028101.JPEG n02676566/
+mv val/ILSVRC2012_val_00028102.JPEG n03733281/
+mv val/ILSVRC2012_val_00028103.JPEG n02106382/
+mv val/ILSVRC2012_val_00028104.JPEG n02817516/
+mv val/ILSVRC2012_val_00028105.JPEG n04039381/
+mv val/ILSVRC2012_val_00028106.JPEG n04356056/
+mv val/ILSVRC2012_val_00028107.JPEG n01514859/
+mv val/ILSVRC2012_val_00028108.JPEG n03791053/
+mv val/ILSVRC2012_val_00028109.JPEG n04376876/
+mv val/ILSVRC2012_val_00028110.JPEG n03630383/
+mv val/ILSVRC2012_val_00028111.JPEG n04252077/
+mv val/ILSVRC2012_val_00028112.JPEG n04417672/
+mv val/ILSVRC2012_val_00028113.JPEG n01641577/
+mv val/ILSVRC2012_val_00028114.JPEG n04141076/
+mv val/ILSVRC2012_val_00028115.JPEG n02025239/
+mv val/ILSVRC2012_val_00028116.JPEG n02992529/
+mv val/ILSVRC2012_val_00028117.JPEG n02672831/
+mv val/ILSVRC2012_val_00028118.JPEG n02088466/
+mv val/ILSVRC2012_val_00028119.JPEG n01797886/
+mv val/ILSVRC2012_val_00028120.JPEG n04501370/
+mv val/ILSVRC2012_val_00028121.JPEG n04149813/
+mv val/ILSVRC2012_val_00028122.JPEG n02172182/
+mv val/ILSVRC2012_val_00028123.JPEG n04336792/
+mv val/ILSVRC2012_val_00028124.JPEG n04417672/
+mv val/ILSVRC2012_val_00028125.JPEG n03944341/
+mv val/ILSVRC2012_val_00028126.JPEG n03961711/
+mv val/ILSVRC2012_val_00028127.JPEG n04493381/
+mv val/ILSVRC2012_val_00028128.JPEG n04258138/
+mv val/ILSVRC2012_val_00028129.JPEG n04523525/
+mv val/ILSVRC2012_val_00028130.JPEG n02423022/
+mv val/ILSVRC2012_val_00028131.JPEG n02102177/
+mv val/ILSVRC2012_val_00028132.JPEG n02865351/
+mv val/ILSVRC2012_val_00028133.JPEG n04507155/
+mv val/ILSVRC2012_val_00028134.JPEG n07930864/
+mv val/ILSVRC2012_val_00028135.JPEG n02097047/
+mv val/ILSVRC2012_val_00028136.JPEG n03916031/
+mv val/ILSVRC2012_val_00028137.JPEG n02892201/
+mv val/ILSVRC2012_val_00028138.JPEG n04254680/
+mv val/ILSVRC2012_val_00028139.JPEG n01608432/
+mv val/ILSVRC2012_val_00028140.JPEG n04461696/
+mv val/ILSVRC2012_val_00028141.JPEG n03483316/
+mv val/ILSVRC2012_val_00028142.JPEG n02500267/
+mv val/ILSVRC2012_val_00028143.JPEG n02916936/
+mv val/ILSVRC2012_val_00028144.JPEG n03452741/
+mv val/ILSVRC2012_val_00028145.JPEG n02892201/
+mv val/ILSVRC2012_val_00028146.JPEG n02113186/
+mv val/ILSVRC2012_val_00028147.JPEG n03775546/
+mv val/ILSVRC2012_val_00028148.JPEG n03478589/
+mv val/ILSVRC2012_val_00028149.JPEG n03633091/
+mv val/ILSVRC2012_val_00028150.JPEG n04599235/
+mv val/ILSVRC2012_val_00028151.JPEG n03065424/
+mv val/ILSVRC2012_val_00028152.JPEG n02097209/
+mv val/ILSVRC2012_val_00028153.JPEG n01873310/
+mv val/ILSVRC2012_val_00028154.JPEG n04604644/
+mv val/ILSVRC2012_val_00028155.JPEG n04418357/
+mv val/ILSVRC2012_val_00028156.JPEG n03794056/
+mv val/ILSVRC2012_val_00028157.JPEG n03179701/
+mv val/ILSVRC2012_val_00028158.JPEG n01440764/
+mv val/ILSVRC2012_val_00028159.JPEG n01806143/
+mv val/ILSVRC2012_val_00028160.JPEG n02093859/
+mv val/ILSVRC2012_val_00028161.JPEG n01496331/
+mv val/ILSVRC2012_val_00028162.JPEG n01669191/
+mv val/ILSVRC2012_val_00028163.JPEG n04367480/
+mv val/ILSVRC2012_val_00028164.JPEG n02971356/
+mv val/ILSVRC2012_val_00028165.JPEG n02114548/
+mv val/ILSVRC2012_val_00028166.JPEG n03249569/
+mv val/ILSVRC2012_val_00028167.JPEG n01796340/
+mv val/ILSVRC2012_val_00028168.JPEG n07613480/
+mv val/ILSVRC2012_val_00028169.JPEG n04505470/
+mv val/ILSVRC2012_val_00028170.JPEG n03804744/
+mv val/ILSVRC2012_val_00028171.JPEG n02950826/
+mv val/ILSVRC2012_val_00028172.JPEG n03743016/
+mv val/ILSVRC2012_val_00028173.JPEG n02777292/
+mv val/ILSVRC2012_val_00028174.JPEG n03089624/
+mv val/ILSVRC2012_val_00028175.JPEG n02110341/
+mv val/ILSVRC2012_val_00028176.JPEG n03485407/
+mv val/ILSVRC2012_val_00028177.JPEG n02480855/
+mv val/ILSVRC2012_val_00028178.JPEG n02356798/
+mv val/ILSVRC2012_val_00028179.JPEG n02910353/
+mv val/ILSVRC2012_val_00028180.JPEG n03662601/
+mv val/ILSVRC2012_val_00028181.JPEG n01601694/
+mv val/ILSVRC2012_val_00028182.JPEG n04141076/
+mv val/ILSVRC2012_val_00028183.JPEG n03384352/
+mv val/ILSVRC2012_val_00028184.JPEG n02492660/
+mv val/ILSVRC2012_val_00028185.JPEG n03376595/
+mv val/ILSVRC2012_val_00028186.JPEG n02776631/
+mv val/ILSVRC2012_val_00028187.JPEG n02025239/
+mv val/ILSVRC2012_val_00028188.JPEG n04065272/
+mv val/ILSVRC2012_val_00028189.JPEG n02033041/
+mv val/ILSVRC2012_val_00028190.JPEG n03417042/
+mv val/ILSVRC2012_val_00028191.JPEG n09332890/
+mv val/ILSVRC2012_val_00028192.JPEG n02097658/
+mv val/ILSVRC2012_val_00028193.JPEG n04552348/
+mv val/ILSVRC2012_val_00028194.JPEG n03447447/
+mv val/ILSVRC2012_val_00028195.JPEG n03781244/
+mv val/ILSVRC2012_val_00028196.JPEG n03000684/
+mv val/ILSVRC2012_val_00028197.JPEG n01749939/
+mv val/ILSVRC2012_val_00028198.JPEG n01677366/
+mv val/ILSVRC2012_val_00028199.JPEG n02094114/
+mv val/ILSVRC2012_val_00028200.JPEG n04465501/
+mv val/ILSVRC2012_val_00028201.JPEG n04372370/
+mv val/ILSVRC2012_val_00028202.JPEG n02281787/
+mv val/ILSVRC2012_val_00028203.JPEG n03196217/
+mv val/ILSVRC2012_val_00028204.JPEG n02277742/
+mv val/ILSVRC2012_val_00028205.JPEG n02701002/
+mv val/ILSVRC2012_val_00028206.JPEG n03290653/
+mv val/ILSVRC2012_val_00028207.JPEG n03452741/
+mv val/ILSVRC2012_val_00028208.JPEG n01806143/
+mv val/ILSVRC2012_val_00028209.JPEG n04037443/
+mv val/ILSVRC2012_val_00028210.JPEG n03825788/
+mv val/ILSVRC2012_val_00028211.JPEG n04266014/
+mv val/ILSVRC2012_val_00028212.JPEG n07716906/
+mv val/ILSVRC2012_val_00028213.JPEG n02123597/
+mv val/ILSVRC2012_val_00028214.JPEG n02110063/
+mv val/ILSVRC2012_val_00028215.JPEG n02981792/
+mv val/ILSVRC2012_val_00028216.JPEG n03804744/
+mv val/ILSVRC2012_val_00028217.JPEG n02134418/
+mv val/ILSVRC2012_val_00028218.JPEG n03970156/
+mv val/ILSVRC2012_val_00028219.JPEG n02483362/
+mv val/ILSVRC2012_val_00028220.JPEG n02486261/
+mv val/ILSVRC2012_val_00028221.JPEG n01514668/
+mv val/ILSVRC2012_val_00028222.JPEG n02134084/
+mv val/ILSVRC2012_val_00028223.JPEG n03970156/
+mv val/ILSVRC2012_val_00028224.JPEG n01558993/
+mv val/ILSVRC2012_val_00028225.JPEG n01644373/
+mv val/ILSVRC2012_val_00028226.JPEG n03692522/
+mv val/ILSVRC2012_val_00028227.JPEG n03804744/
+mv val/ILSVRC2012_val_00028228.JPEG n02804414/
+mv val/ILSVRC2012_val_00028229.JPEG n02108551/
+mv val/ILSVRC2012_val_00028230.JPEG n01560419/
+mv val/ILSVRC2012_val_00028231.JPEG n02490219/
+mv val/ILSVRC2012_val_00028232.JPEG n03710637/
+mv val/ILSVRC2012_val_00028233.JPEG n03673027/
+mv val/ILSVRC2012_val_00028234.JPEG n04552348/
+mv val/ILSVRC2012_val_00028235.JPEG n02094114/
+mv val/ILSVRC2012_val_00028236.JPEG n03967562/
+mv val/ILSVRC2012_val_00028237.JPEG n03776460/
+mv val/ILSVRC2012_val_00028238.JPEG n02447366/
+mv val/ILSVRC2012_val_00028239.JPEG n03733805/
+mv val/ILSVRC2012_val_00028240.JPEG n03127925/
+mv val/ILSVRC2012_val_00028241.JPEG n02279972/
+mv val/ILSVRC2012_val_00028242.JPEG n09428293/
+mv val/ILSVRC2012_val_00028243.JPEG n03089624/
+mv val/ILSVRC2012_val_00028244.JPEG n03938244/
+mv val/ILSVRC2012_val_00028245.JPEG n04041544/
+mv val/ILSVRC2012_val_00028246.JPEG n02113712/
+mv val/ILSVRC2012_val_00028247.JPEG n03594734/
+mv val/ILSVRC2012_val_00028248.JPEG n02206856/
+mv val/ILSVRC2012_val_00028249.JPEG n03485794/
+mv val/ILSVRC2012_val_00028250.JPEG n02256656/
+mv val/ILSVRC2012_val_00028251.JPEG n02981792/
+mv val/ILSVRC2012_val_00028252.JPEG n03347037/
+mv val/ILSVRC2012_val_00028253.JPEG n03026506/
+mv val/ILSVRC2012_val_00028254.JPEG n04356056/
+mv val/ILSVRC2012_val_00028255.JPEG n09332890/
+mv val/ILSVRC2012_val_00028256.JPEG n07565083/
+mv val/ILSVRC2012_val_00028257.JPEG n07760859/
+mv val/ILSVRC2012_val_00028258.JPEG n04286575/
+mv val/ILSVRC2012_val_00028259.JPEG n02790996/
+mv val/ILSVRC2012_val_00028260.JPEG n01873310/
+mv val/ILSVRC2012_val_00028261.JPEG n03337140/
+mv val/ILSVRC2012_val_00028262.JPEG n04483307/
+mv val/ILSVRC2012_val_00028263.JPEG n02281787/
+mv val/ILSVRC2012_val_00028264.JPEG n02114548/
+mv val/ILSVRC2012_val_00028265.JPEG n12057211/
+mv val/ILSVRC2012_val_00028266.JPEG n02971356/
+mv val/ILSVRC2012_val_00028267.JPEG n04591713/
+mv val/ILSVRC2012_val_00028268.JPEG n04371774/
+mv val/ILSVRC2012_val_00028269.JPEG n03841143/
+mv val/ILSVRC2012_val_00028270.JPEG n02229544/
+mv val/ILSVRC2012_val_00028271.JPEG n02794156/
+mv val/ILSVRC2012_val_00028272.JPEG n04270147/
+mv val/ILSVRC2012_val_00028273.JPEG n04090263/
+mv val/ILSVRC2012_val_00028274.JPEG n04592741/
+mv val/ILSVRC2012_val_00028275.JPEG n02120505/
+mv val/ILSVRC2012_val_00028276.JPEG n02120505/
+mv val/ILSVRC2012_val_00028277.JPEG n03532672/
+mv val/ILSVRC2012_val_00028278.JPEG n03062245/
+mv val/ILSVRC2012_val_00028279.JPEG n03089624/
+mv val/ILSVRC2012_val_00028280.JPEG n03710193/
+mv val/ILSVRC2012_val_00028281.JPEG n03792972/
+mv val/ILSVRC2012_val_00028282.JPEG n02085936/
+mv val/ILSVRC2012_val_00028283.JPEG n01924916/
+mv val/ILSVRC2012_val_00028284.JPEG n01692333/
+mv val/ILSVRC2012_val_00028285.JPEG n04428191/
+mv val/ILSVRC2012_val_00028286.JPEG n13044778/
+mv val/ILSVRC2012_val_00028287.JPEG n06359193/
+mv val/ILSVRC2012_val_00028288.JPEG n07693725/
+mv val/ILSVRC2012_val_00028289.JPEG n02916936/
+mv val/ILSVRC2012_val_00028290.JPEG n02488702/
+mv val/ILSVRC2012_val_00028291.JPEG n02489166/
+mv val/ILSVRC2012_val_00028292.JPEG n02102318/
+mv val/ILSVRC2012_val_00028293.JPEG n03980874/
+mv val/ILSVRC2012_val_00028294.JPEG n04265275/
+mv val/ILSVRC2012_val_00028295.JPEG n04429376/
+mv val/ILSVRC2012_val_00028296.JPEG n02480855/
+mv val/ILSVRC2012_val_00028297.JPEG n07873807/
+mv val/ILSVRC2012_val_00028298.JPEG n03478589/
+mv val/ILSVRC2012_val_00028299.JPEG n02071294/
+mv val/ILSVRC2012_val_00028300.JPEG n02097298/
+mv val/ILSVRC2012_val_00028301.JPEG n01734418/
+mv val/ILSVRC2012_val_00028302.JPEG n02123159/
+mv val/ILSVRC2012_val_00028303.JPEG n02951585/
+mv val/ILSVRC2012_val_00028304.JPEG n07714990/
+mv val/ILSVRC2012_val_00028305.JPEG n02859443/
+mv val/ILSVRC2012_val_00028306.JPEG n04447861/
+mv val/ILSVRC2012_val_00028307.JPEG n02096585/
+mv val/ILSVRC2012_val_00028308.JPEG n03902125/
+mv val/ILSVRC2012_val_00028309.JPEG n04525038/
+mv val/ILSVRC2012_val_00028310.JPEG n03028079/
+mv val/ILSVRC2012_val_00028311.JPEG n03866082/
+mv val/ILSVRC2012_val_00028312.JPEG n03891332/
+mv val/ILSVRC2012_val_00028313.JPEG n03220513/
+mv val/ILSVRC2012_val_00028314.JPEG n03207743/
+mv val/ILSVRC2012_val_00028315.JPEG n04589890/
+mv val/ILSVRC2012_val_00028316.JPEG n03871628/
+mv val/ILSVRC2012_val_00028317.JPEG n01774750/
+mv val/ILSVRC2012_val_00028318.JPEG n02125311/
+mv val/ILSVRC2012_val_00028319.JPEG n02747177/
+mv val/ILSVRC2012_val_00028320.JPEG n04153751/
+mv val/ILSVRC2012_val_00028321.JPEG n02101556/
+mv val/ILSVRC2012_val_00028322.JPEG n02095570/
+mv val/ILSVRC2012_val_00028323.JPEG n01629819/
+mv val/ILSVRC2012_val_00028324.JPEG n03042490/
+mv val/ILSVRC2012_val_00028325.JPEG n01872401/
+mv val/ILSVRC2012_val_00028326.JPEG n04311004/
+mv val/ILSVRC2012_val_00028327.JPEG n04228054/
+mv val/ILSVRC2012_val_00028328.JPEG n03983396/
+mv val/ILSVRC2012_val_00028329.JPEG n04456115/
+mv val/ILSVRC2012_val_00028330.JPEG n04070727/
+mv val/ILSVRC2012_val_00028331.JPEG n02490219/
+mv val/ILSVRC2012_val_00028332.JPEG n02093256/
+mv val/ILSVRC2012_val_00028333.JPEG n03710193/
+mv val/ILSVRC2012_val_00028334.JPEG n03742115/
+mv val/ILSVRC2012_val_00028335.JPEG n03841143/
+mv val/ILSVRC2012_val_00028336.JPEG n04285008/
+mv val/ILSVRC2012_val_00028337.JPEG n02074367/
+mv val/ILSVRC2012_val_00028338.JPEG n02526121/
+mv val/ILSVRC2012_val_00028339.JPEG n02116738/
+mv val/ILSVRC2012_val_00028340.JPEG n03666591/
+mv val/ILSVRC2012_val_00028341.JPEG n02363005/
+mv val/ILSVRC2012_val_00028342.JPEG n02910353/
+mv val/ILSVRC2012_val_00028343.JPEG n02219486/
+mv val/ILSVRC2012_val_00028344.JPEG n03063599/
+mv val/ILSVRC2012_val_00028345.JPEG n01955084/
+mv val/ILSVRC2012_val_00028346.JPEG n02104029/
+mv val/ILSVRC2012_val_00028347.JPEG n02114855/
+mv val/ILSVRC2012_val_00028348.JPEG n04023962/
+mv val/ILSVRC2012_val_00028349.JPEG n04376876/
+mv val/ILSVRC2012_val_00028350.JPEG n04275548/
+mv val/ILSVRC2012_val_00028351.JPEG n01682714/
+mv val/ILSVRC2012_val_00028352.JPEG n01641577/
+mv val/ILSVRC2012_val_00028353.JPEG n02676566/
+mv val/ILSVRC2012_val_00028354.JPEG n07892512/
+mv val/ILSVRC2012_val_00028355.JPEG n01775062/
+mv val/ILSVRC2012_val_00028356.JPEG n03457902/
+mv val/ILSVRC2012_val_00028357.JPEG n04486054/
+mv val/ILSVRC2012_val_00028358.JPEG n03457902/
+mv val/ILSVRC2012_val_00028359.JPEG n02843684/
+mv val/ILSVRC2012_val_00028360.JPEG n07768694/
+mv val/ILSVRC2012_val_00028361.JPEG n04026417/
+mv val/ILSVRC2012_val_00028362.JPEG n03355925/
+mv val/ILSVRC2012_val_00028363.JPEG n02025239/
+mv val/ILSVRC2012_val_00028364.JPEG n03781244/
+mv val/ILSVRC2012_val_00028365.JPEG n03947888/
+mv val/ILSVRC2012_val_00028366.JPEG n02280649/
+mv val/ILSVRC2012_val_00028367.JPEG n03450230/
+mv val/ILSVRC2012_val_00028368.JPEG n02098286/
+mv val/ILSVRC2012_val_00028369.JPEG n03776460/
+mv val/ILSVRC2012_val_00028370.JPEG n03594945/
+mv val/ILSVRC2012_val_00028371.JPEG n07734744/
+mv val/ILSVRC2012_val_00028372.JPEG n02276258/
+mv val/ILSVRC2012_val_00028373.JPEG n07720875/
+mv val/ILSVRC2012_val_00028374.JPEG n02988304/
+mv val/ILSVRC2012_val_00028375.JPEG n03595614/
+mv val/ILSVRC2012_val_00028376.JPEG n02951358/
+mv val/ILSVRC2012_val_00028377.JPEG n03764736/
+mv val/ILSVRC2012_val_00028378.JPEG n02939185/
+mv val/ILSVRC2012_val_00028379.JPEG n02091134/
+mv val/ILSVRC2012_val_00028380.JPEG n01978287/
+mv val/ILSVRC2012_val_00028381.JPEG n02268443/
+mv val/ILSVRC2012_val_00028382.JPEG n03127747/
+mv val/ILSVRC2012_val_00028383.JPEG n03814639/
+mv val/ILSVRC2012_val_00028384.JPEG n03874293/
+mv val/ILSVRC2012_val_00028385.JPEG n04081281/
+mv val/ILSVRC2012_val_00028386.JPEG n07768694/
+mv val/ILSVRC2012_val_00028387.JPEG n07715103/
+mv val/ILSVRC2012_val_00028388.JPEG n02790996/
+mv val/ILSVRC2012_val_00028389.JPEG n03160309/
+mv val/ILSVRC2012_val_00028390.JPEG n04525038/
+mv val/ILSVRC2012_val_00028391.JPEG n02013706/
+mv val/ILSVRC2012_val_00028392.JPEG n04540053/
+mv val/ILSVRC2012_val_00028393.JPEG n02105056/
+mv val/ILSVRC2012_val_00028394.JPEG n07715103/
+mv val/ILSVRC2012_val_00028395.JPEG n01860187/
+mv val/ILSVRC2012_val_00028396.JPEG n07920052/
+mv val/ILSVRC2012_val_00028397.JPEG n01687978/
+mv val/ILSVRC2012_val_00028398.JPEG n07590611/
+mv val/ILSVRC2012_val_00028399.JPEG n03394916/
+mv val/ILSVRC2012_val_00028400.JPEG n03947888/
+mv val/ILSVRC2012_val_00028401.JPEG n01945685/
+mv val/ILSVRC2012_val_00028402.JPEG n02110063/
+mv val/ILSVRC2012_val_00028403.JPEG n04074963/
+mv val/ILSVRC2012_val_00028404.JPEG n04606251/
+mv val/ILSVRC2012_val_00028405.JPEG n03594945/
+mv val/ILSVRC2012_val_00028406.JPEG n04254120/
+mv val/ILSVRC2012_val_00028407.JPEG n03187595/
+mv val/ILSVRC2012_val_00028408.JPEG n02110958/
+mv val/ILSVRC2012_val_00028409.JPEG n02977058/
+mv val/ILSVRC2012_val_00028410.JPEG n07930864/
+mv val/ILSVRC2012_val_00028411.JPEG n02099601/
+mv val/ILSVRC2012_val_00028412.JPEG n03590841/
+mv val/ILSVRC2012_val_00028413.JPEG n02441942/
+mv val/ILSVRC2012_val_00028414.JPEG n01806567/
+mv val/ILSVRC2012_val_00028415.JPEG n02643566/
+mv val/ILSVRC2012_val_00028416.JPEG n03874293/
+mv val/ILSVRC2012_val_00028417.JPEG n03255030/
+mv val/ILSVRC2012_val_00028418.JPEG n04487394/
+mv val/ILSVRC2012_val_00028419.JPEG n07760859/
+mv val/ILSVRC2012_val_00028420.JPEG n02112137/
+mv val/ILSVRC2012_val_00028421.JPEG n04486054/
+mv val/ILSVRC2012_val_00028422.JPEG n01496331/
+mv val/ILSVRC2012_val_00028423.JPEG n03337140/
+mv val/ILSVRC2012_val_00028424.JPEG n01882714/
+mv val/ILSVRC2012_val_00028425.JPEG n02113978/
+mv val/ILSVRC2012_val_00028426.JPEG n07615774/
+mv val/ILSVRC2012_val_00028427.JPEG n02168699/
+mv val/ILSVRC2012_val_00028428.JPEG n04465501/
+mv val/ILSVRC2012_val_00028429.JPEG n02086910/
+mv val/ILSVRC2012_val_00028430.JPEG n04136333/
+mv val/ILSVRC2012_val_00028431.JPEG n04254120/
+mv val/ILSVRC2012_val_00028432.JPEG n03530642/
+mv val/ILSVRC2012_val_00028433.JPEG n03187595/
+mv val/ILSVRC2012_val_00028434.JPEG n01770393/
+mv val/ILSVRC2012_val_00028435.JPEG n02422106/
+mv val/ILSVRC2012_val_00028436.JPEG n03709823/
+mv val/ILSVRC2012_val_00028437.JPEG n02910353/
+mv val/ILSVRC2012_val_00028438.JPEG n01855672/
+mv val/ILSVRC2012_val_00028439.JPEG n02361337/
+mv val/ILSVRC2012_val_00028440.JPEG n01580077/
+mv val/ILSVRC2012_val_00028441.JPEG n01694178/
+mv val/ILSVRC2012_val_00028442.JPEG n04120489/
+mv val/ILSVRC2012_val_00028443.JPEG n04517823/
+mv val/ILSVRC2012_val_00028444.JPEG n03775546/
+mv val/ILSVRC2012_val_00028445.JPEG n01773157/
+mv val/ILSVRC2012_val_00028446.JPEG n03775546/
+mv val/ILSVRC2012_val_00028447.JPEG n03777568/
+mv val/ILSVRC2012_val_00028448.JPEG n04355933/
+mv val/ILSVRC2012_val_00028449.JPEG n01784675/
+mv val/ILSVRC2012_val_00028450.JPEG n01498041/
+mv val/ILSVRC2012_val_00028451.JPEG n02422699/
+mv val/ILSVRC2012_val_00028452.JPEG n04447861/
+mv val/ILSVRC2012_val_00028453.JPEG n02177972/
+mv val/ILSVRC2012_val_00028454.JPEG n02319095/
+mv val/ILSVRC2012_val_00028455.JPEG n03935335/
+mv val/ILSVRC2012_val_00028456.JPEG n03980874/
+mv val/ILSVRC2012_val_00028457.JPEG n03976657/
+mv val/ILSVRC2012_val_00028458.JPEG n02442845/
+mv val/ILSVRC2012_val_00028459.JPEG n02085782/
+mv val/ILSVRC2012_val_00028460.JPEG n03976467/
+mv val/ILSVRC2012_val_00028461.JPEG n07583066/
+mv val/ILSVRC2012_val_00028462.JPEG n04461696/
+mv val/ILSVRC2012_val_00028463.JPEG n04467665/
+mv val/ILSVRC2012_val_00028464.JPEG n02105641/
+mv val/ILSVRC2012_val_00028465.JPEG n04501370/
+mv val/ILSVRC2012_val_00028466.JPEG n03777754/
+mv val/ILSVRC2012_val_00028467.JPEG n04065272/
+mv val/ILSVRC2012_val_00028468.JPEG n03447721/
+mv val/ILSVRC2012_val_00028469.JPEG n02206856/
+mv val/ILSVRC2012_val_00028470.JPEG n03459775/
+mv val/ILSVRC2012_val_00028471.JPEG n03947888/
+mv val/ILSVRC2012_val_00028472.JPEG n04111531/
+mv val/ILSVRC2012_val_00028473.JPEG n02807133/
+mv val/ILSVRC2012_val_00028474.JPEG n03481172/
+mv val/ILSVRC2012_val_00028475.JPEG n01983481/
+mv val/ILSVRC2012_val_00028476.JPEG n03733131/
+mv val/ILSVRC2012_val_00028477.JPEG n02105641/
+mv val/ILSVRC2012_val_00028478.JPEG n03841143/
+mv val/ILSVRC2012_val_00028479.JPEG n03976467/
+mv val/ILSVRC2012_val_00028480.JPEG n02391049/
+mv val/ILSVRC2012_val_00028481.JPEG n03196217/
+mv val/ILSVRC2012_val_00028482.JPEG n02422699/
+mv val/ILSVRC2012_val_00028483.JPEG n04462240/
+mv val/ILSVRC2012_val_00028484.JPEG n04328186/
+mv val/ILSVRC2012_val_00028485.JPEG n04310018/
+mv val/ILSVRC2012_val_00028486.JPEG n04417672/
+mv val/ILSVRC2012_val_00028487.JPEG n03018349/
+mv val/ILSVRC2012_val_00028488.JPEG n02965783/
+mv val/ILSVRC2012_val_00028489.JPEG n01629819/
+mv val/ILSVRC2012_val_00028490.JPEG n03207941/
+mv val/ILSVRC2012_val_00028491.JPEG n04311174/
+mv val/ILSVRC2012_val_00028492.JPEG n02226429/
+mv val/ILSVRC2012_val_00028493.JPEG n02363005/
+mv val/ILSVRC2012_val_00028494.JPEG n03041632/
+mv val/ILSVRC2012_val_00028495.JPEG n04033901/
+mv val/ILSVRC2012_val_00028496.JPEG n02410509/
+mv val/ILSVRC2012_val_00028497.JPEG n02112137/
+mv val/ILSVRC2012_val_00028498.JPEG n02747177/
+mv val/ILSVRC2012_val_00028499.JPEG n02825657/
+mv val/ILSVRC2012_val_00028500.JPEG n02097298/
+mv val/ILSVRC2012_val_00028501.JPEG n02992529/
+mv val/ILSVRC2012_val_00028502.JPEG n03032252/
+mv val/ILSVRC2012_val_00028503.JPEG n01734418/
+mv val/ILSVRC2012_val_00028504.JPEG n04090263/
+mv val/ILSVRC2012_val_00028505.JPEG n04201297/
+mv val/ILSVRC2012_val_00028506.JPEG n02094258/
+mv val/ILSVRC2012_val_00028507.JPEG n04111531/
+mv val/ILSVRC2012_val_00028508.JPEG n04265275/
+mv val/ILSVRC2012_val_00028509.JPEG n04065272/
+mv val/ILSVRC2012_val_00028510.JPEG n02676566/
+mv val/ILSVRC2012_val_00028511.JPEG n03388043/
+mv val/ILSVRC2012_val_00028512.JPEG n07930864/
+mv val/ILSVRC2012_val_00028513.JPEG n02423022/
+mv val/ILSVRC2012_val_00028514.JPEG n02108551/
+mv val/ILSVRC2012_val_00028515.JPEG n03424325/
+mv val/ILSVRC2012_val_00028516.JPEG n02815834/
+mv val/ILSVRC2012_val_00028517.JPEG n04228054/
+mv val/ILSVRC2012_val_00028518.JPEG n02097209/
+mv val/ILSVRC2012_val_00028519.JPEG n02137549/
+mv val/ILSVRC2012_val_00028520.JPEG n03314780/
+mv val/ILSVRC2012_val_00028521.JPEG n01608432/
+mv val/ILSVRC2012_val_00028522.JPEG n01820546/
+mv val/ILSVRC2012_val_00028523.JPEG n02109961/
+mv val/ILSVRC2012_val_00028524.JPEG n01580077/
+mv val/ILSVRC2012_val_00028525.JPEG n07579787/
+mv val/ILSVRC2012_val_00028526.JPEG n03788365/
+mv val/ILSVRC2012_val_00028527.JPEG n02749479/
+mv val/ILSVRC2012_val_00028528.JPEG n03930313/
+mv val/ILSVRC2012_val_00028529.JPEG n01806567/
+mv val/ILSVRC2012_val_00028530.JPEG n02927161/
+mv val/ILSVRC2012_val_00028531.JPEG n04447861/
+mv val/ILSVRC2012_val_00028532.JPEG n04548362/
+mv val/ILSVRC2012_val_00028533.JPEG n02259212/
+mv val/ILSVRC2012_val_00028534.JPEG n04252225/
+mv val/ILSVRC2012_val_00028535.JPEG n02105162/
+mv val/ILSVRC2012_val_00028536.JPEG n03345487/
+mv val/ILSVRC2012_val_00028537.JPEG n02727426/
+mv val/ILSVRC2012_val_00028538.JPEG n07584110/
+mv val/ILSVRC2012_val_00028539.JPEG n04005630/
+mv val/ILSVRC2012_val_00028540.JPEG n02096294/
+mv val/ILSVRC2012_val_00028541.JPEG n04273569/
+mv val/ILSVRC2012_val_00028542.JPEG n02422106/
+mv val/ILSVRC2012_val_00028543.JPEG n03534580/
+mv val/ILSVRC2012_val_00028544.JPEG n09288635/
+mv val/ILSVRC2012_val_00028545.JPEG n01795545/
+mv val/ILSVRC2012_val_00028546.JPEG n02397096/
+mv val/ILSVRC2012_val_00028547.JPEG n02730930/
+mv val/ILSVRC2012_val_00028548.JPEG n01806143/
+mv val/ILSVRC2012_val_00028549.JPEG n03661043/
+mv val/ILSVRC2012_val_00028550.JPEG n02807133/
+mv val/ILSVRC2012_val_00028551.JPEG n02277742/
+mv val/ILSVRC2012_val_00028552.JPEG n07613480/
+mv val/ILSVRC2012_val_00028553.JPEG n03297495/
+mv val/ILSVRC2012_val_00028554.JPEG n03761084/
+mv val/ILSVRC2012_val_00028555.JPEG n03109150/
+mv val/ILSVRC2012_val_00028556.JPEG n07716906/
+mv val/ILSVRC2012_val_00028557.JPEG n12267677/
+mv val/ILSVRC2012_val_00028558.JPEG n04204238/
+mv val/ILSVRC2012_val_00028559.JPEG n04204347/
+mv val/ILSVRC2012_val_00028560.JPEG n04596742/
+mv val/ILSVRC2012_val_00028561.JPEG n03710637/
+mv val/ILSVRC2012_val_00028562.JPEG n02481823/
+mv val/ILSVRC2012_val_00028563.JPEG n02669723/
+mv val/ILSVRC2012_val_00028564.JPEG n01491361/
+mv val/ILSVRC2012_val_00028565.JPEG n01629819/
+mv val/ILSVRC2012_val_00028566.JPEG n03982430/
+mv val/ILSVRC2012_val_00028567.JPEG n02869837/
+mv val/ILSVRC2012_val_00028568.JPEG n01843065/
+mv val/ILSVRC2012_val_00028569.JPEG n04311174/
+mv val/ILSVRC2012_val_00028570.JPEG n01820546/
+mv val/ILSVRC2012_val_00028571.JPEG n01677366/
+mv val/ILSVRC2012_val_00028572.JPEG n02108089/
+mv val/ILSVRC2012_val_00028573.JPEG n01807496/
+mv val/ILSVRC2012_val_00028574.JPEG n03710721/
+mv val/ILSVRC2012_val_00028575.JPEG n03063599/
+mv val/ILSVRC2012_val_00028576.JPEG n03498962/
+mv val/ILSVRC2012_val_00028577.JPEG n01729322/
+mv val/ILSVRC2012_val_00028578.JPEG n02769748/
+mv val/ILSVRC2012_val_00028579.JPEG n02268853/
+mv val/ILSVRC2012_val_00028580.JPEG n04081281/
+mv val/ILSVRC2012_val_00028581.JPEG n03983396/
+mv val/ILSVRC2012_val_00028582.JPEG n06359193/
+mv val/ILSVRC2012_val_00028583.JPEG n02127052/
+mv val/ILSVRC2012_val_00028584.JPEG n02107142/
+mv val/ILSVRC2012_val_00028585.JPEG n02488702/
+mv val/ILSVRC2012_val_00028586.JPEG n02006656/
+mv val/ILSVRC2012_val_00028587.JPEG n07831146/
+mv val/ILSVRC2012_val_00028588.JPEG n02676566/
+mv val/ILSVRC2012_val_00028589.JPEG n04277352/
+mv val/ILSVRC2012_val_00028590.JPEG n03527444/
+mv val/ILSVRC2012_val_00028591.JPEG n03372029/
+mv val/ILSVRC2012_val_00028592.JPEG n03314780/
+mv val/ILSVRC2012_val_00028593.JPEG n02114712/
+mv val/ILSVRC2012_val_00028594.JPEG n01978287/
+mv val/ILSVRC2012_val_00028595.JPEG n03337140/
+mv val/ILSVRC2012_val_00028596.JPEG n03538406/
+mv val/ILSVRC2012_val_00028597.JPEG n02917067/
+mv val/ILSVRC2012_val_00028598.JPEG n01756291/
+mv val/ILSVRC2012_val_00028599.JPEG n01667778/
+mv val/ILSVRC2012_val_00028600.JPEG n01795545/
+mv val/ILSVRC2012_val_00028601.JPEG n01631663/
+mv val/ILSVRC2012_val_00028602.JPEG n02088364/
+mv val/ILSVRC2012_val_00028603.JPEG n02808304/
+mv val/ILSVRC2012_val_00028604.JPEG n01797886/
+mv val/ILSVRC2012_val_00028605.JPEG n02104029/
+mv val/ILSVRC2012_val_00028606.JPEG n03201208/
+mv val/ILSVRC2012_val_00028607.JPEG n01558993/
+mv val/ILSVRC2012_val_00028608.JPEG n03967562/
+mv val/ILSVRC2012_val_00028609.JPEG n04428191/
+mv val/ILSVRC2012_val_00028610.JPEG n02494079/
+mv val/ILSVRC2012_val_00028611.JPEG n04162706/
+mv val/ILSVRC2012_val_00028612.JPEG n04515003/
+mv val/ILSVRC2012_val_00028613.JPEG n04040759/
+mv val/ILSVRC2012_val_00028614.JPEG n01774750/
+mv val/ILSVRC2012_val_00028615.JPEG n01943899/
+mv val/ILSVRC2012_val_00028616.JPEG n02098413/
+mv val/ILSVRC2012_val_00028617.JPEG n02099601/
+mv val/ILSVRC2012_val_00028618.JPEG n04270147/
+mv val/ILSVRC2012_val_00028619.JPEG n02417914/
+mv val/ILSVRC2012_val_00028620.JPEG n03065424/
+mv val/ILSVRC2012_val_00028621.JPEG n07734744/
+mv val/ILSVRC2012_val_00028622.JPEG n02007558/
+mv val/ILSVRC2012_val_00028623.JPEG n02119789/
+mv val/ILSVRC2012_val_00028624.JPEG n07695742/
+mv val/ILSVRC2012_val_00028625.JPEG n02364673/
+mv val/ILSVRC2012_val_00028626.JPEG n01689811/
+mv val/ILSVRC2012_val_00028627.JPEG n02672831/
+mv val/ILSVRC2012_val_00028628.JPEG n02124075/
+mv val/ILSVRC2012_val_00028629.JPEG n01644900/
+mv val/ILSVRC2012_val_00028630.JPEG n04335435/
+mv val/ILSVRC2012_val_00028631.JPEG n02086646/
+mv val/ILSVRC2012_val_00028632.JPEG n02095889/
+mv val/ILSVRC2012_val_00028633.JPEG n02105251/
+mv val/ILSVRC2012_val_00028634.JPEG n02391049/
+mv val/ILSVRC2012_val_00028635.JPEG n01955084/
+mv val/ILSVRC2012_val_00028636.JPEG n02480495/
+mv val/ILSVRC2012_val_00028637.JPEG n03032252/
+mv val/ILSVRC2012_val_00028638.JPEG n02808440/
+mv val/ILSVRC2012_val_00028639.JPEG n03637318/
+mv val/ILSVRC2012_val_00028640.JPEG n02877765/
+mv val/ILSVRC2012_val_00028641.JPEG n04597913/
+mv val/ILSVRC2012_val_00028642.JPEG n02112706/
+mv val/ILSVRC2012_val_00028643.JPEG n04590129/
+mv val/ILSVRC2012_val_00028644.JPEG n01910747/
+mv val/ILSVRC2012_val_00028645.JPEG n02895154/
+mv val/ILSVRC2012_val_00028646.JPEG n03062245/
+mv val/ILSVRC2012_val_00028647.JPEG n03775546/
+mv val/ILSVRC2012_val_00028648.JPEG n03372029/
+mv val/ILSVRC2012_val_00028649.JPEG n04228054/
+mv val/ILSVRC2012_val_00028650.JPEG n04258138/
+mv val/ILSVRC2012_val_00028651.JPEG n04074963/
+mv val/ILSVRC2012_val_00028652.JPEG n11879895/
+mv val/ILSVRC2012_val_00028653.JPEG n01986214/
+mv val/ILSVRC2012_val_00028654.JPEG n01943899/
+mv val/ILSVRC2012_val_00028655.JPEG n02138441/
+mv val/ILSVRC2012_val_00028656.JPEG n01806143/
+mv val/ILSVRC2012_val_00028657.JPEG n01983481/
+mv val/ILSVRC2012_val_00028658.JPEG n03478589/
+mv val/ILSVRC2012_val_00028659.JPEG n04389033/
+mv val/ILSVRC2012_val_00028660.JPEG n02951358/
+mv val/ILSVRC2012_val_00028661.JPEG n02102318/
+mv val/ILSVRC2012_val_00028662.JPEG n03763968/
+mv val/ILSVRC2012_val_00028663.JPEG n03594734/
+mv val/ILSVRC2012_val_00028664.JPEG n01689811/
+mv val/ILSVRC2012_val_00028665.JPEG n07753113/
+mv val/ILSVRC2012_val_00028666.JPEG n02074367/
+mv val/ILSVRC2012_val_00028667.JPEG n01819313/
+mv val/ILSVRC2012_val_00028668.JPEG n03467068/
+mv val/ILSVRC2012_val_00028669.JPEG n03393912/
+mv val/ILSVRC2012_val_00028670.JPEG n02056570/
+mv val/ILSVRC2012_val_00028671.JPEG n04008634/
+mv val/ILSVRC2012_val_00028672.JPEG n04254777/
+mv val/ILSVRC2012_val_00028673.JPEG n01644900/
+mv val/ILSVRC2012_val_00028674.JPEG n02106166/
+mv val/ILSVRC2012_val_00028675.JPEG n03891251/
+mv val/ILSVRC2012_val_00028676.JPEG n04435653/
+mv val/ILSVRC2012_val_00028677.JPEG n01773549/
+mv val/ILSVRC2012_val_00028678.JPEG n03729826/
+mv val/ILSVRC2012_val_00028679.JPEG n01770081/
+mv val/ILSVRC2012_val_00028680.JPEG n03529860/
+mv val/ILSVRC2012_val_00028681.JPEG n03110669/
+mv val/ILSVRC2012_val_00028682.JPEG n03841143/
+mv val/ILSVRC2012_val_00028683.JPEG n02091244/
+mv val/ILSVRC2012_val_00028684.JPEG n04067472/
+mv val/ILSVRC2012_val_00028685.JPEG n04371430/
+mv val/ILSVRC2012_val_00028686.JPEG n03796401/
+mv val/ILSVRC2012_val_00028687.JPEG n03782006/
+mv val/ILSVRC2012_val_00028688.JPEG n04238763/
+mv val/ILSVRC2012_val_00028689.JPEG n01784675/
+mv val/ILSVRC2012_val_00028690.JPEG n04019541/
+mv val/ILSVRC2012_val_00028691.JPEG n02097209/
+mv val/ILSVRC2012_val_00028692.JPEG n02259212/
+mv val/ILSVRC2012_val_00028693.JPEG n03956157/
+mv val/ILSVRC2012_val_00028694.JPEG n02112706/
+mv val/ILSVRC2012_val_00028695.JPEG n02111889/
+mv val/ILSVRC2012_val_00028696.JPEG n03527444/
+mv val/ILSVRC2012_val_00028697.JPEG n02167151/
+mv val/ILSVRC2012_val_00028698.JPEG n04442312/
+mv val/ILSVRC2012_val_00028699.JPEG n07695742/
+mv val/ILSVRC2012_val_00028700.JPEG n03710193/
+mv val/ILSVRC2012_val_00028701.JPEG n04074963/
+mv val/ILSVRC2012_val_00028702.JPEG n02099849/
+mv val/ILSVRC2012_val_00028703.JPEG n02134418/
+mv val/ILSVRC2012_val_00028704.JPEG n02825657/
+mv val/ILSVRC2012_val_00028705.JPEG n13037406/
+mv val/ILSVRC2012_val_00028706.JPEG n02085782/
+mv val/ILSVRC2012_val_00028707.JPEG n02417914/
+mv val/ILSVRC2012_val_00028708.JPEG n12620546/
+mv val/ILSVRC2012_val_00028709.JPEG n04275548/
+mv val/ILSVRC2012_val_00028710.JPEG n02804610/
+mv val/ILSVRC2012_val_00028711.JPEG n04146614/
+mv val/ILSVRC2012_val_00028712.JPEG n01514668/
+mv val/ILSVRC2012_val_00028713.JPEG n01443537/
+mv val/ILSVRC2012_val_00028714.JPEG n04509417/
+mv val/ILSVRC2012_val_00028715.JPEG n02892201/
+mv val/ILSVRC2012_val_00028716.JPEG n02088466/
+mv val/ILSVRC2012_val_00028717.JPEG n03065424/
+mv val/ILSVRC2012_val_00028718.JPEG n04254120/
+mv val/ILSVRC2012_val_00028719.JPEG n03792972/
+mv val/ILSVRC2012_val_00028720.JPEG n01924916/
+mv val/ILSVRC2012_val_00028721.JPEG n02037110/
+mv val/ILSVRC2012_val_00028722.JPEG n07697537/
+mv val/ILSVRC2012_val_00028723.JPEG n03394916/
+mv val/ILSVRC2012_val_00028724.JPEG n02101006/
+mv val/ILSVRC2012_val_00028725.JPEG n02110806/
+mv val/ILSVRC2012_val_00028726.JPEG n03146219/
+mv val/ILSVRC2012_val_00028727.JPEG n02814860/
+mv val/ILSVRC2012_val_00028728.JPEG n03649909/
+mv val/ILSVRC2012_val_00028729.JPEG n03127747/
+mv val/ILSVRC2012_val_00028730.JPEG n01980166/
+mv val/ILSVRC2012_val_00028731.JPEG n02092002/
+mv val/ILSVRC2012_val_00028732.JPEG n03787032/
+mv val/ILSVRC2012_val_00028733.JPEG n02133161/
+mv val/ILSVRC2012_val_00028734.JPEG n03874599/
+mv val/ILSVRC2012_val_00028735.JPEG n04201297/
+mv val/ILSVRC2012_val_00028736.JPEG n02106550/
+mv val/ILSVRC2012_val_00028737.JPEG n07615774/
+mv val/ILSVRC2012_val_00028738.JPEG n03710637/
+mv val/ILSVRC2012_val_00028739.JPEG n03527444/
+mv val/ILSVRC2012_val_00028740.JPEG n07714990/
+mv val/ILSVRC2012_val_00028741.JPEG n03017168/
+mv val/ILSVRC2012_val_00028742.JPEG n02111500/
+mv val/ILSVRC2012_val_00028743.JPEG n01744401/
+mv val/ILSVRC2012_val_00028744.JPEG n03950228/
+mv val/ILSVRC2012_val_00028745.JPEG n02410509/
+mv val/ILSVRC2012_val_00028746.JPEG n02483708/
+mv val/ILSVRC2012_val_00028747.JPEG n07583066/
+mv val/ILSVRC2012_val_00028748.JPEG n04589890/
+mv val/ILSVRC2012_val_00028749.JPEG n02655020/
+mv val/ILSVRC2012_val_00028750.JPEG n02259212/
+mv val/ILSVRC2012_val_00028751.JPEG n01990800/
+mv val/ILSVRC2012_val_00028752.JPEG n03457902/
+mv val/ILSVRC2012_val_00028753.JPEG n07920052/
+mv val/ILSVRC2012_val_00028754.JPEG n04505470/
+mv val/ILSVRC2012_val_00028755.JPEG n02111129/
+mv val/ILSVRC2012_val_00028756.JPEG n03216828/
+mv val/ILSVRC2012_val_00028757.JPEG n02892767/
+mv val/ILSVRC2012_val_00028758.JPEG n02095314/
+mv val/ILSVRC2012_val_00028759.JPEG n02092002/
+mv val/ILSVRC2012_val_00028760.JPEG n01664065/
+mv val/ILSVRC2012_val_00028761.JPEG n03944341/
+mv val/ILSVRC2012_val_00028762.JPEG n03495258/
+mv val/ILSVRC2012_val_00028763.JPEG n01737021/
+mv val/ILSVRC2012_val_00028764.JPEG n01677366/
+mv val/ILSVRC2012_val_00028765.JPEG n01806567/
+mv val/ILSVRC2012_val_00028766.JPEG n02097298/
+mv val/ILSVRC2012_val_00028767.JPEG n04532670/
+mv val/ILSVRC2012_val_00028768.JPEG n04522168/
+mv val/ILSVRC2012_val_00028769.JPEG n02708093/
+mv val/ILSVRC2012_val_00028770.JPEG n02066245/
+mv val/ILSVRC2012_val_00028771.JPEG n02971356/
+mv val/ILSVRC2012_val_00028772.JPEG n02906734/
+mv val/ILSVRC2012_val_00028773.JPEG n03492542/
+mv val/ILSVRC2012_val_00028774.JPEG n03930313/
+mv val/ILSVRC2012_val_00028775.JPEG n02396427/
+mv val/ILSVRC2012_val_00028776.JPEG n02037110/
+mv val/ILSVRC2012_val_00028777.JPEG n03297495/
+mv val/ILSVRC2012_val_00028778.JPEG n03017168/
+mv val/ILSVRC2012_val_00028779.JPEG n01773797/
+mv val/ILSVRC2012_val_00028780.JPEG n03786901/
+mv val/ILSVRC2012_val_00028781.JPEG n02910353/
+mv val/ILSVRC2012_val_00028782.JPEG n02102177/
+mv val/ILSVRC2012_val_00028783.JPEG n02730930/
+mv val/ILSVRC2012_val_00028784.JPEG n02480495/
+mv val/ILSVRC2012_val_00028785.JPEG n04562935/
+mv val/ILSVRC2012_val_00028786.JPEG n02109525/
+mv val/ILSVRC2012_val_00028787.JPEG n02988304/
+mv val/ILSVRC2012_val_00028788.JPEG n02091467/
+mv val/ILSVRC2012_val_00028789.JPEG n04204238/
+mv val/ILSVRC2012_val_00028790.JPEG n04476259/
+mv val/ILSVRC2012_val_00028791.JPEG n01532829/
+mv val/ILSVRC2012_val_00028792.JPEG n03208938/
+mv val/ILSVRC2012_val_00028793.JPEG n04532106/
+mv val/ILSVRC2012_val_00028794.JPEG n02165105/
+mv val/ILSVRC2012_val_00028795.JPEG n01677366/
+mv val/ILSVRC2012_val_00028796.JPEG n07715103/
+mv val/ILSVRC2012_val_00028797.JPEG n02795169/
+mv val/ILSVRC2012_val_00028798.JPEG n02127052/
+mv val/ILSVRC2012_val_00028799.JPEG n02098286/
+mv val/ILSVRC2012_val_00028800.JPEG n01728572/
+mv val/ILSVRC2012_val_00028801.JPEG n01833805/
+mv val/ILSVRC2012_val_00028802.JPEG n02445715/
+mv val/ILSVRC2012_val_00028803.JPEG n02259212/
+mv val/ILSVRC2012_val_00028804.JPEG n04209133/
+mv val/ILSVRC2012_val_00028805.JPEG n07711569/
+mv val/ILSVRC2012_val_00028806.JPEG n07860988/
+mv val/ILSVRC2012_val_00028807.JPEG n09421951/
+mv val/ILSVRC2012_val_00028808.JPEG n03125729/
+mv val/ILSVRC2012_val_00028809.JPEG n04141076/
+mv val/ILSVRC2012_val_00028810.JPEG n01742172/
+mv val/ILSVRC2012_val_00028811.JPEG n03063689/
+mv val/ILSVRC2012_val_00028812.JPEG n01704323/
+mv val/ILSVRC2012_val_00028813.JPEG n01748264/
+mv val/ILSVRC2012_val_00028814.JPEG n01770393/
+mv val/ILSVRC2012_val_00028815.JPEG n01955084/
+mv val/ILSVRC2012_val_00028816.JPEG n02894605/
+mv val/ILSVRC2012_val_00028817.JPEG n03792972/
+mv val/ILSVRC2012_val_00028818.JPEG n04141975/
+mv val/ILSVRC2012_val_00028819.JPEG n02672831/
+mv val/ILSVRC2012_val_00028820.JPEG n03018349/
+mv val/ILSVRC2012_val_00028821.JPEG n02971356/
+mv val/ILSVRC2012_val_00028822.JPEG n02859443/
+mv val/ILSVRC2012_val_00028823.JPEG n07749582/
+mv val/ILSVRC2012_val_00028824.JPEG n03792782/
+mv val/ILSVRC2012_val_00028825.JPEG n02398521/
+mv val/ILSVRC2012_val_00028826.JPEG n04254777/
+mv val/ILSVRC2012_val_00028827.JPEG n02326432/
+mv val/ILSVRC2012_val_00028828.JPEG n03877472/
+mv val/ILSVRC2012_val_00028829.JPEG n02123045/
+mv val/ILSVRC2012_val_00028830.JPEG n03623198/
+mv val/ILSVRC2012_val_00028831.JPEG n02342885/
+mv val/ILSVRC2012_val_00028832.JPEG n03187595/
+mv val/ILSVRC2012_val_00028833.JPEG n03884397/
+mv val/ILSVRC2012_val_00028834.JPEG n04330267/
+mv val/ILSVRC2012_val_00028835.JPEG n04266014/
+mv val/ILSVRC2012_val_00028836.JPEG n02138441/
+mv val/ILSVRC2012_val_00028837.JPEG n03538406/
+mv val/ILSVRC2012_val_00028838.JPEG n03000247/
+mv val/ILSVRC2012_val_00028839.JPEG n02363005/
+mv val/ILSVRC2012_val_00028840.JPEG n02883205/
+mv val/ILSVRC2012_val_00028841.JPEG n07753592/
+mv val/ILSVRC2012_val_00028842.JPEG n04371430/
+mv val/ILSVRC2012_val_00028843.JPEG n03871628/
+mv val/ILSVRC2012_val_00028844.JPEG n03633091/
+mv val/ILSVRC2012_val_00028845.JPEG n04023962/
+mv val/ILSVRC2012_val_00028846.JPEG n01740131/
+mv val/ILSVRC2012_val_00028847.JPEG n04251144/
+mv val/ILSVRC2012_val_00028848.JPEG n02870880/
+mv val/ILSVRC2012_val_00028849.JPEG n02009912/
+mv val/ILSVRC2012_val_00028850.JPEG n03461385/
+mv val/ILSVRC2012_val_00028851.JPEG n02328150/
+mv val/ILSVRC2012_val_00028852.JPEG n01945685/
+mv val/ILSVRC2012_val_00028853.JPEG n02280649/
+mv val/ILSVRC2012_val_00028854.JPEG n02012849/
+mv val/ILSVRC2012_val_00028855.JPEG n02112137/
+mv val/ILSVRC2012_val_00028856.JPEG n04326547/
+mv val/ILSVRC2012_val_00028857.JPEG n02117135/
+mv val/ILSVRC2012_val_00028858.JPEG n07930864/
+mv val/ILSVRC2012_val_00028859.JPEG n04136333/
+mv val/ILSVRC2012_val_00028860.JPEG n04370456/
+mv val/ILSVRC2012_val_00028861.JPEG n01737021/
+mv val/ILSVRC2012_val_00028862.JPEG n01817953/
+mv val/ILSVRC2012_val_00028863.JPEG n03888605/
+mv val/ILSVRC2012_val_00028864.JPEG n03452741/
+mv val/ILSVRC2012_val_00028865.JPEG n04330267/
+mv val/ILSVRC2012_val_00028866.JPEG n07932039/
+mv val/ILSVRC2012_val_00028867.JPEG n02398521/
+mv val/ILSVRC2012_val_00028868.JPEG n07930864/
+mv val/ILSVRC2012_val_00028869.JPEG n03787032/
+mv val/ILSVRC2012_val_00028870.JPEG n02112350/
+mv val/ILSVRC2012_val_00028871.JPEG n12267677/
+mv val/ILSVRC2012_val_00028872.JPEG n03494278/
+mv val/ILSVRC2012_val_00028873.JPEG n07693725/
+mv val/ILSVRC2012_val_00028874.JPEG n03857828/
+mv val/ILSVRC2012_val_00028875.JPEG n02815834/
+mv val/ILSVRC2012_val_00028876.JPEG n04376876/
+mv val/ILSVRC2012_val_00028877.JPEG n03874293/
+mv val/ILSVRC2012_val_00028878.JPEG n04371774/
+mv val/ILSVRC2012_val_00028879.JPEG n03929855/
+mv val/ILSVRC2012_val_00028880.JPEG n02841315/
+mv val/ILSVRC2012_val_00028881.JPEG n02090721/
+mv val/ILSVRC2012_val_00028882.JPEG n09468604/
+mv val/ILSVRC2012_val_00028883.JPEG n02488291/
+mv val/ILSVRC2012_val_00028884.JPEG n02106662/
+mv val/ILSVRC2012_val_00028885.JPEG n03461385/
+mv val/ILSVRC2012_val_00028886.JPEG n04485082/
+mv val/ILSVRC2012_val_00028887.JPEG n03995372/
+mv val/ILSVRC2012_val_00028888.JPEG n02493793/
+mv val/ILSVRC2012_val_00028889.JPEG n01914609/
+mv val/ILSVRC2012_val_00028890.JPEG n02002556/
+mv val/ILSVRC2012_val_00028891.JPEG n07711569/
+mv val/ILSVRC2012_val_00028892.JPEG n02098286/
+mv val/ILSVRC2012_val_00028893.JPEG n07693725/
+mv val/ILSVRC2012_val_00028894.JPEG n02422106/
+mv val/ILSVRC2012_val_00028895.JPEG n02110958/
+mv val/ILSVRC2012_val_00028896.JPEG n04613696/
+mv val/ILSVRC2012_val_00028897.JPEG n03692522/
+mv val/ILSVRC2012_val_00028898.JPEG n07920052/
+mv val/ILSVRC2012_val_00028899.JPEG n02799071/
+mv val/ILSVRC2012_val_00028900.JPEG n04037443/
+mv val/ILSVRC2012_val_00028901.JPEG n02113978/
+mv val/ILSVRC2012_val_00028902.JPEG n01530575/
+mv val/ILSVRC2012_val_00028903.JPEG n10565667/
+mv val/ILSVRC2012_val_00028904.JPEG n10148035/
+mv val/ILSVRC2012_val_00028905.JPEG n03773504/
+mv val/ILSVRC2012_val_00028906.JPEG n03347037/
+mv val/ILSVRC2012_val_00028907.JPEG n09193705/
+mv val/ILSVRC2012_val_00028908.JPEG n02113978/
+mv val/ILSVRC2012_val_00028909.JPEG n01882714/
+mv val/ILSVRC2012_val_00028910.JPEG n03527444/
+mv val/ILSVRC2012_val_00028911.JPEG n02979186/
+mv val/ILSVRC2012_val_00028912.JPEG n01877812/
+mv val/ILSVRC2012_val_00028913.JPEG n02111129/
+mv val/ILSVRC2012_val_00028914.JPEG n03417042/
+mv val/ILSVRC2012_val_00028915.JPEG n03461385/
+mv val/ILSVRC2012_val_00028916.JPEG n02114855/
+mv val/ILSVRC2012_val_00028917.JPEG n12768682/
+mv val/ILSVRC2012_val_00028918.JPEG n01950731/
+mv val/ILSVRC2012_val_00028919.JPEG n02667093/
+mv val/ILSVRC2012_val_00028920.JPEG n02011460/
+mv val/ILSVRC2012_val_00028921.JPEG n03290653/
+mv val/ILSVRC2012_val_00028922.JPEG n02108000/
+mv val/ILSVRC2012_val_00028923.JPEG n04229816/
+mv val/ILSVRC2012_val_00028924.JPEG n01930112/
+mv val/ILSVRC2012_val_00028925.JPEG n02486261/
+mv val/ILSVRC2012_val_00028926.JPEG n04542943/
+mv val/ILSVRC2012_val_00028927.JPEG n04235860/
+mv val/ILSVRC2012_val_00028928.JPEG n07768694/
+mv val/ILSVRC2012_val_00028929.JPEG n02403003/
+mv val/ILSVRC2012_val_00028930.JPEG n03786901/
+mv val/ILSVRC2012_val_00028931.JPEG n02396427/
+mv val/ILSVRC2012_val_00028932.JPEG n02109047/
+mv val/ILSVRC2012_val_00028933.JPEG n01968897/
+mv val/ILSVRC2012_val_00028934.JPEG n03388043/
+mv val/ILSVRC2012_val_00028935.JPEG n04258138/
+mv val/ILSVRC2012_val_00028936.JPEG n02112137/
+mv val/ILSVRC2012_val_00028937.JPEG n02607072/
+mv val/ILSVRC2012_val_00028938.JPEG n02134084/
+mv val/ILSVRC2012_val_00028939.JPEG n03837869/
+mv val/ILSVRC2012_val_00028940.JPEG n04200800/
+mv val/ILSVRC2012_val_00028941.JPEG n02071294/
+mv val/ILSVRC2012_val_00028942.JPEG n04141076/
+mv val/ILSVRC2012_val_00028943.JPEG n02085620/
+mv val/ILSVRC2012_val_00028944.JPEG n03218198/
+mv val/ILSVRC2012_val_00028945.JPEG n02098286/
+mv val/ILSVRC2012_val_00028946.JPEG n02099601/
+mv val/ILSVRC2012_val_00028947.JPEG n04099969/
+mv val/ILSVRC2012_val_00028948.JPEG n03216828/
+mv val/ILSVRC2012_val_00028949.JPEG n02892767/
+mv val/ILSVRC2012_val_00028950.JPEG n03482405/
+mv val/ILSVRC2012_val_00028951.JPEG n03838899/
+mv val/ILSVRC2012_val_00028952.JPEG n03018349/
+mv val/ILSVRC2012_val_00028953.JPEG n04487394/
+mv val/ILSVRC2012_val_00028954.JPEG n04141076/
+mv val/ILSVRC2012_val_00028955.JPEG n02106382/
+mv val/ILSVRC2012_val_00028956.JPEG n11939491/
+mv val/ILSVRC2012_val_00028957.JPEG n03100240/
+mv val/ILSVRC2012_val_00028958.JPEG n03908714/
+mv val/ILSVRC2012_val_00028959.JPEG n07831146/
+mv val/ILSVRC2012_val_00028960.JPEG n09256479/
+mv val/ILSVRC2012_val_00028961.JPEG n12267677/
+mv val/ILSVRC2012_val_00028962.JPEG n04152593/
+mv val/ILSVRC2012_val_00028963.JPEG n02093428/
+mv val/ILSVRC2012_val_00028964.JPEG n02791270/
+mv val/ILSVRC2012_val_00028965.JPEG n02099429/
+mv val/ILSVRC2012_val_00028966.JPEG n02105056/
+mv val/ILSVRC2012_val_00028967.JPEG n03223299/
+mv val/ILSVRC2012_val_00028968.JPEG n02643566/
+mv val/ILSVRC2012_val_00028969.JPEG n07720875/
+mv val/ILSVRC2012_val_00028970.JPEG n02124075/
+mv val/ILSVRC2012_val_00028971.JPEG n02699494/
+mv val/ILSVRC2012_val_00028972.JPEG n03888605/
+mv val/ILSVRC2012_val_00028973.JPEG n03249569/
+mv val/ILSVRC2012_val_00028974.JPEG n03584254/
+mv val/ILSVRC2012_val_00028975.JPEG n02981792/
+mv val/ILSVRC2012_val_00028976.JPEG n04133789/
+mv val/ILSVRC2012_val_00028977.JPEG n03534580/
+mv val/ILSVRC2012_val_00028978.JPEG n01518878/
+mv val/ILSVRC2012_val_00028979.JPEG n02704792/
+mv val/ILSVRC2012_val_00028980.JPEG n07747607/
+mv val/ILSVRC2012_val_00028981.JPEG n13037406/
+mv val/ILSVRC2012_val_00028982.JPEG n02488291/
+mv val/ILSVRC2012_val_00028983.JPEG n03538406/
+mv val/ILSVRC2012_val_00028984.JPEG n03627232/
+mv val/ILSVRC2012_val_00028985.JPEG n02099429/
+mv val/ILSVRC2012_val_00028986.JPEG n02704792/
+mv val/ILSVRC2012_val_00028987.JPEG n07684084/
+mv val/ILSVRC2012_val_00028988.JPEG n03733805/
+mv val/ILSVRC2012_val_00028989.JPEG n02397096/
+mv val/ILSVRC2012_val_00028990.JPEG n02114367/
+mv val/ILSVRC2012_val_00028991.JPEG n02319095/
+mv val/ILSVRC2012_val_00028992.JPEG n02086646/
+mv val/ILSVRC2012_val_00028993.JPEG n02094433/
+mv val/ILSVRC2012_val_00028994.JPEG n04133789/
+mv val/ILSVRC2012_val_00028995.JPEG n04483307/
+mv val/ILSVRC2012_val_00028996.JPEG n02504013/
+mv val/ILSVRC2012_val_00028997.JPEG n04525038/
+mv val/ILSVRC2012_val_00028998.JPEG n04265275/
+mv val/ILSVRC2012_val_00028999.JPEG n04209239/
+mv val/ILSVRC2012_val_00029000.JPEG n03967562/
+mv val/ILSVRC2012_val_00029001.JPEG n02129165/
+mv val/ILSVRC2012_val_00029002.JPEG n03777754/
+mv val/ILSVRC2012_val_00029003.JPEG n09835506/
+mv val/ILSVRC2012_val_00029004.JPEG n02727426/
+mv val/ILSVRC2012_val_00029005.JPEG n01693334/
+mv val/ILSVRC2012_val_00029006.JPEG n02457408/
+mv val/ILSVRC2012_val_00029007.JPEG n02128925/
+mv val/ILSVRC2012_val_00029008.JPEG n03903868/
+mv val/ILSVRC2012_val_00029009.JPEG n04409515/
+mv val/ILSVRC2012_val_00029010.JPEG n01950731/
+mv val/ILSVRC2012_val_00029011.JPEG n06359193/
+mv val/ILSVRC2012_val_00029012.JPEG n03187595/
+mv val/ILSVRC2012_val_00029013.JPEG n01950731/
+mv val/ILSVRC2012_val_00029014.JPEG n04041544/
+mv val/ILSVRC2012_val_00029015.JPEG n02892767/
+mv val/ILSVRC2012_val_00029016.JPEG n02363005/
+mv val/ILSVRC2012_val_00029017.JPEG n04355338/
+mv val/ILSVRC2012_val_00029018.JPEG n02277742/
+mv val/ILSVRC2012_val_00029019.JPEG n04090263/
+mv val/ILSVRC2012_val_00029020.JPEG n03314780/
+mv val/ILSVRC2012_val_00029021.JPEG n04285008/
+mv val/ILSVRC2012_val_00029022.JPEG n01847000/
+mv val/ILSVRC2012_val_00029023.JPEG n02094433/
+mv val/ILSVRC2012_val_00029024.JPEG n02098105/
+mv val/ILSVRC2012_val_00029025.JPEG n07892512/
+mv val/ILSVRC2012_val_00029026.JPEG n09229709/
+mv val/ILSVRC2012_val_00029027.JPEG n03527444/
+mv val/ILSVRC2012_val_00029028.JPEG n03530642/
+mv val/ILSVRC2012_val_00029029.JPEG n01774384/
+mv val/ILSVRC2012_val_00029030.JPEG n01773157/
+mv val/ILSVRC2012_val_00029031.JPEG n04366367/
+mv val/ILSVRC2012_val_00029032.JPEG n03676483/
+mv val/ILSVRC2012_val_00029033.JPEG n01930112/
+mv val/ILSVRC2012_val_00029034.JPEG n03933933/
+mv val/ILSVRC2012_val_00029035.JPEG n03877845/
+mv val/ILSVRC2012_val_00029036.JPEG n02104365/
+mv val/ILSVRC2012_val_00029037.JPEG n07697537/
+mv val/ILSVRC2012_val_00029038.JPEG n02444819/
+mv val/ILSVRC2012_val_00029039.JPEG n13037406/
+mv val/ILSVRC2012_val_00029040.JPEG n04296562/
+mv val/ILSVRC2012_val_00029041.JPEG n02457408/
+mv val/ILSVRC2012_val_00029042.JPEG n11879895/
+mv val/ILSVRC2012_val_00029043.JPEG n04120489/
+mv val/ILSVRC2012_val_00029044.JPEG n03958227/
+mv val/ILSVRC2012_val_00029045.JPEG n03187595/
+mv val/ILSVRC2012_val_00029046.JPEG n03930630/
+mv val/ILSVRC2012_val_00029047.JPEG n02277742/
+mv val/ILSVRC2012_val_00029048.JPEG n01774750/
+mv val/ILSVRC2012_val_00029049.JPEG n04550184/
+mv val/ILSVRC2012_val_00029050.JPEG n02837789/
+mv val/ILSVRC2012_val_00029051.JPEG n04479046/
+mv val/ILSVRC2012_val_00029052.JPEG n02500267/
+mv val/ILSVRC2012_val_00029053.JPEG n04317175/
+mv val/ILSVRC2012_val_00029054.JPEG n07875152/
+mv val/ILSVRC2012_val_00029055.JPEG n01687978/
+mv val/ILSVRC2012_val_00029056.JPEG n02088094/
+mv val/ILSVRC2012_val_00029057.JPEG n02814533/
+mv val/ILSVRC2012_val_00029058.JPEG n02109961/
+mv val/ILSVRC2012_val_00029059.JPEG n02117135/
+mv val/ILSVRC2012_val_00029060.JPEG n04579145/
+mv val/ILSVRC2012_val_00029061.JPEG n07880968/
+mv val/ILSVRC2012_val_00029062.JPEG n02190166/
+mv val/ILSVRC2012_val_00029063.JPEG n02396427/
+mv val/ILSVRC2012_val_00029064.JPEG n04542943/
+mv val/ILSVRC2012_val_00029065.JPEG n04357314/
+mv val/ILSVRC2012_val_00029066.JPEG n02114855/
+mv val/ILSVRC2012_val_00029067.JPEG n03920288/
+mv val/ILSVRC2012_val_00029068.JPEG n02120079/
+mv val/ILSVRC2012_val_00029069.JPEG n01776313/
+mv val/ILSVRC2012_val_00029070.JPEG n01847000/
+mv val/ILSVRC2012_val_00029071.JPEG n04447861/
+mv val/ILSVRC2012_val_00029072.JPEG n04019541/
+mv val/ILSVRC2012_val_00029073.JPEG n03201208/
+mv val/ILSVRC2012_val_00029074.JPEG n03857828/
+mv val/ILSVRC2012_val_00029075.JPEG n03404251/
+mv val/ILSVRC2012_val_00029076.JPEG n07754684/
+mv val/ILSVRC2012_val_00029077.JPEG n09256479/
+mv val/ILSVRC2012_val_00029078.JPEG n02442845/
+mv val/ILSVRC2012_val_00029079.JPEG n06794110/
+mv val/ILSVRC2012_val_00029080.JPEG n02917067/
+mv val/ILSVRC2012_val_00029081.JPEG n04592741/
+mv val/ILSVRC2012_val_00029082.JPEG n02389026/
+mv val/ILSVRC2012_val_00029083.JPEG n03444034/
+mv val/ILSVRC2012_val_00029084.JPEG n03724870/
+mv val/ILSVRC2012_val_00029085.JPEG n02895154/
+mv val/ILSVRC2012_val_00029086.JPEG n02165456/
+mv val/ILSVRC2012_val_00029087.JPEG n03804744/
+mv val/ILSVRC2012_val_00029088.JPEG n01742172/
+mv val/ILSVRC2012_val_00029089.JPEG n02037110/
+mv val/ILSVRC2012_val_00029090.JPEG n02087046/
+mv val/ILSVRC2012_val_00029091.JPEG n02865351/
+mv val/ILSVRC2012_val_00029092.JPEG n02025239/
+mv val/ILSVRC2012_val_00029093.JPEG n03887697/
+mv val/ILSVRC2012_val_00029094.JPEG n02814533/
+mv val/ILSVRC2012_val_00029095.JPEG n04133789/
+mv val/ILSVRC2012_val_00029096.JPEG n03891332/
+mv val/ILSVRC2012_val_00029097.JPEG n02483708/
+mv val/ILSVRC2012_val_00029098.JPEG n07714571/
+mv val/ILSVRC2012_val_00029099.JPEG n03982430/
+mv val/ILSVRC2012_val_00029100.JPEG n04579145/
+mv val/ILSVRC2012_val_00029101.JPEG n02127052/
+mv val/ILSVRC2012_val_00029102.JPEG n07932039/
+mv val/ILSVRC2012_val_00029103.JPEG n04238763/
+mv val/ILSVRC2012_val_00029104.JPEG n03710637/
+mv val/ILSVRC2012_val_00029105.JPEG n02825657/
+mv val/ILSVRC2012_val_00029106.JPEG n03977966/
+mv val/ILSVRC2012_val_00029107.JPEG n02321529/
+mv val/ILSVRC2012_val_00029108.JPEG n02493509/
+mv val/ILSVRC2012_val_00029109.JPEG n02219486/
+mv val/ILSVRC2012_val_00029110.JPEG n09193705/
+mv val/ILSVRC2012_val_00029111.JPEG n01950731/
+mv val/ILSVRC2012_val_00029112.JPEG n03457902/
+mv val/ILSVRC2012_val_00029113.JPEG n03908714/
+mv val/ILSVRC2012_val_00029114.JPEG n03980874/
+mv val/ILSVRC2012_val_00029115.JPEG n02113624/
+mv val/ILSVRC2012_val_00029116.JPEG n03393912/
+mv val/ILSVRC2012_val_00029117.JPEG n03379051/
+mv val/ILSVRC2012_val_00029118.JPEG n01688243/
+mv val/ILSVRC2012_val_00029119.JPEG n02971356/
+mv val/ILSVRC2012_val_00029120.JPEG n04243546/
+mv val/ILSVRC2012_val_00029121.JPEG n02510455/
+mv val/ILSVRC2012_val_00029122.JPEG n02092002/
+mv val/ILSVRC2012_val_00029123.JPEG n02116738/
+mv val/ILSVRC2012_val_00029124.JPEG n02391049/
+mv val/ILSVRC2012_val_00029125.JPEG n04111531/
+mv val/ILSVRC2012_val_00029126.JPEG n02128925/
+mv val/ILSVRC2012_val_00029127.JPEG n02097047/
+mv val/ILSVRC2012_val_00029128.JPEG n02071294/
+mv val/ILSVRC2012_val_00029129.JPEG n04462240/
+mv val/ILSVRC2012_val_00029130.JPEG n01748264/
+mv val/ILSVRC2012_val_00029131.JPEG n02086910/
+mv val/ILSVRC2012_val_00029132.JPEG n04326547/
+mv val/ILSVRC2012_val_00029133.JPEG n02107908/
+mv val/ILSVRC2012_val_00029134.JPEG n06874185/
+mv val/ILSVRC2012_val_00029135.JPEG n03773504/
+mv val/ILSVRC2012_val_00029136.JPEG n04039381/
+mv val/ILSVRC2012_val_00029137.JPEG n03874293/
+mv val/ILSVRC2012_val_00029138.JPEG n04482393/
+mv val/ILSVRC2012_val_00029139.JPEG n04371774/
+mv val/ILSVRC2012_val_00029140.JPEG n02088094/
+mv val/ILSVRC2012_val_00029141.JPEG n03887697/
+mv val/ILSVRC2012_val_00029142.JPEG n03452741/
+mv val/ILSVRC2012_val_00029143.JPEG n07802026/
+mv val/ILSVRC2012_val_00029144.JPEG n02509815/
+mv val/ILSVRC2012_val_00029145.JPEG n03347037/
+mv val/ILSVRC2012_val_00029146.JPEG n03983396/
+mv val/ILSVRC2012_val_00029147.JPEG n01774750/
+mv val/ILSVRC2012_val_00029148.JPEG n02879718/
+mv val/ILSVRC2012_val_00029149.JPEG n03888257/
+mv val/ILSVRC2012_val_00029150.JPEG n01796340/
+mv val/ILSVRC2012_val_00029151.JPEG n07717556/
+mv val/ILSVRC2012_val_00029152.JPEG n02112706/
+mv val/ILSVRC2012_val_00029153.JPEG n01742172/
+mv val/ILSVRC2012_val_00029154.JPEG n12998815/
+mv val/ILSVRC2012_val_00029155.JPEG n03271574/
+mv val/ILSVRC2012_val_00029156.JPEG n01775062/
+mv val/ILSVRC2012_val_00029157.JPEG n02112706/
+mv val/ILSVRC2012_val_00029158.JPEG n04153751/
+mv val/ILSVRC2012_val_00029159.JPEG n04350905/
+mv val/ILSVRC2012_val_00029160.JPEG n02481823/
+mv val/ILSVRC2012_val_00029161.JPEG n02487347/
+mv val/ILSVRC2012_val_00029162.JPEG n01950731/
+mv val/ILSVRC2012_val_00029163.JPEG n02667093/
+mv val/ILSVRC2012_val_00029164.JPEG n02089973/
+mv val/ILSVRC2012_val_00029165.JPEG n04592741/
+mv val/ILSVRC2012_val_00029166.JPEG n03393912/
+mv val/ILSVRC2012_val_00029167.JPEG n02840245/
+mv val/ILSVRC2012_val_00029168.JPEG n02006656/
+mv val/ILSVRC2012_val_00029169.JPEG n01498041/
+mv val/ILSVRC2012_val_00029170.JPEG n04548362/
+mv val/ILSVRC2012_val_00029171.JPEG n02782093/
+mv val/ILSVRC2012_val_00029172.JPEG n09193705/
+mv val/ILSVRC2012_val_00029173.JPEG n02443114/
+mv val/ILSVRC2012_val_00029174.JPEG n01773549/
+mv val/ILSVRC2012_val_00029175.JPEG n02093428/
+mv val/ILSVRC2012_val_00029176.JPEG n04116512/
+mv val/ILSVRC2012_val_00029177.JPEG n01770393/
+mv val/ILSVRC2012_val_00029178.JPEG n02128925/
+mv val/ILSVRC2012_val_00029179.JPEG n02939185/
+mv val/ILSVRC2012_val_00029180.JPEG n04133789/
+mv val/ILSVRC2012_val_00029181.JPEG n02777292/
+mv val/ILSVRC2012_val_00029182.JPEG n03976657/
+mv val/ILSVRC2012_val_00029183.JPEG n03876231/
+mv val/ILSVRC2012_val_00029184.JPEG n02443114/
+mv val/ILSVRC2012_val_00029185.JPEG n04590129/
+mv val/ILSVRC2012_val_00029186.JPEG n02114855/
+mv val/ILSVRC2012_val_00029187.JPEG n04335435/
+mv val/ILSVRC2012_val_00029188.JPEG n03372029/
+mv val/ILSVRC2012_val_00029189.JPEG n04418357/
+mv val/ILSVRC2012_val_00029190.JPEG n02109961/
+mv val/ILSVRC2012_val_00029191.JPEG n02088094/
+mv val/ILSVRC2012_val_00029192.JPEG n02279972/
+mv val/ILSVRC2012_val_00029193.JPEG n03657121/
+mv val/ILSVRC2012_val_00029194.JPEG n04482393/
+mv val/ILSVRC2012_val_00029195.JPEG n04229816/
+mv val/ILSVRC2012_val_00029196.JPEG n02264363/
+mv val/ILSVRC2012_val_00029197.JPEG n04136333/
+mv val/ILSVRC2012_val_00029198.JPEG n02027492/
+mv val/ILSVRC2012_val_00029199.JPEG n03617480/
+mv val/ILSVRC2012_val_00029200.JPEG n07753592/
+mv val/ILSVRC2012_val_00029201.JPEG n03459775/
+mv val/ILSVRC2012_val_00029202.JPEG n04154565/
+mv val/ILSVRC2012_val_00029203.JPEG n03425413/
+mv val/ILSVRC2012_val_00029204.JPEG n01955084/
+mv val/ILSVRC2012_val_00029205.JPEG n03127925/
+mv val/ILSVRC2012_val_00029206.JPEG n02017213/
+mv val/ILSVRC2012_val_00029207.JPEG n02437616/
+mv val/ILSVRC2012_val_00029208.JPEG n01774384/
+mv val/ILSVRC2012_val_00029209.JPEG n07760859/
+mv val/ILSVRC2012_val_00029210.JPEG n01818515/
+mv val/ILSVRC2012_val_00029211.JPEG n03000684/
+mv val/ILSVRC2012_val_00029212.JPEG n02128385/
+mv val/ILSVRC2012_val_00029213.JPEG n04487081/
+mv val/ILSVRC2012_val_00029214.JPEG n02105505/
+mv val/ILSVRC2012_val_00029215.JPEG n03376595/
+mv val/ILSVRC2012_val_00029216.JPEG n02130308/
+mv val/ILSVRC2012_val_00029217.JPEG n02108000/
+mv val/ILSVRC2012_val_00029218.JPEG n03042490/
+mv val/ILSVRC2012_val_00029219.JPEG n02992211/
+mv val/ILSVRC2012_val_00029220.JPEG n07718472/
+mv val/ILSVRC2012_val_00029221.JPEG n02417914/
+mv val/ILSVRC2012_val_00029222.JPEG n02701002/
+mv val/ILSVRC2012_val_00029223.JPEG n02058221/
+mv val/ILSVRC2012_val_00029224.JPEG n03888605/
+mv val/ILSVRC2012_val_00029225.JPEG n01694178/
+mv val/ILSVRC2012_val_00029226.JPEG n01855672/
+mv val/ILSVRC2012_val_00029227.JPEG n02168699/
+mv val/ILSVRC2012_val_00029228.JPEG n02676566/
+mv val/ILSVRC2012_val_00029229.JPEG n04507155/
+mv val/ILSVRC2012_val_00029230.JPEG n03777754/
+mv val/ILSVRC2012_val_00029231.JPEG n01704323/
+mv val/ILSVRC2012_val_00029232.JPEG n02088094/
+mv val/ILSVRC2012_val_00029233.JPEG n03444034/
+mv val/ILSVRC2012_val_00029234.JPEG n02883205/
+mv val/ILSVRC2012_val_00029235.JPEG n02909870/
+mv val/ILSVRC2012_val_00029236.JPEG n02787622/
+mv val/ILSVRC2012_val_00029237.JPEG n02102973/
+mv val/ILSVRC2012_val_00029238.JPEG n02514041/
+mv val/ILSVRC2012_val_00029239.JPEG n03085013/
+mv val/ILSVRC2012_val_00029240.JPEG n04328186/
+mv val/ILSVRC2012_val_00029241.JPEG n02494079/
+mv val/ILSVRC2012_val_00029242.JPEG n02093428/
+mv val/ILSVRC2012_val_00029243.JPEG n01986214/
+mv val/ILSVRC2012_val_00029244.JPEG n03594945/
+mv val/ILSVRC2012_val_00029245.JPEG n01847000/
+mv val/ILSVRC2012_val_00029246.JPEG n02110958/
+mv val/ILSVRC2012_val_00029247.JPEG n04252077/
+mv val/ILSVRC2012_val_00029248.JPEG n03041632/
+mv val/ILSVRC2012_val_00029249.JPEG n09421951/
+mv val/ILSVRC2012_val_00029250.JPEG n03776460/
+mv val/ILSVRC2012_val_00029251.JPEG n03676483/
+mv val/ILSVRC2012_val_00029252.JPEG n02804610/
+mv val/ILSVRC2012_val_00029253.JPEG n02112350/
+mv val/ILSVRC2012_val_00029254.JPEG n02096294/
+mv val/ILSVRC2012_val_00029255.JPEG n02108089/
+mv val/ILSVRC2012_val_00029256.JPEG n03690938/
+mv val/ILSVRC2012_val_00029257.JPEG n04372370/
+mv val/ILSVRC2012_val_00029258.JPEG n03877845/
+mv val/ILSVRC2012_val_00029259.JPEG n02111500/
+mv val/ILSVRC2012_val_00029260.JPEG n04476259/
+mv val/ILSVRC2012_val_00029261.JPEG n02104029/
+mv val/ILSVRC2012_val_00029262.JPEG n02085782/
+mv val/ILSVRC2012_val_00029263.JPEG n03424325/
+mv val/ILSVRC2012_val_00029264.JPEG n01943899/
+mv val/ILSVRC2012_val_00029265.JPEG n02443114/
+mv val/ILSVRC2012_val_00029266.JPEG n02865351/
+mv val/ILSVRC2012_val_00029267.JPEG n02129604/
+mv val/ILSVRC2012_val_00029268.JPEG n04487394/
+mv val/ILSVRC2012_val_00029269.JPEG n02493509/
+mv val/ILSVRC2012_val_00029270.JPEG n03026506/
+mv val/ILSVRC2012_val_00029271.JPEG n04136333/
+mv val/ILSVRC2012_val_00029272.JPEG n04507155/
+mv val/ILSVRC2012_val_00029273.JPEG n04356056/
+mv val/ILSVRC2012_val_00029274.JPEG n04039381/
+mv val/ILSVRC2012_val_00029275.JPEG n03944341/
+mv val/ILSVRC2012_val_00029276.JPEG n03947888/
+mv val/ILSVRC2012_val_00029277.JPEG n02098105/
+mv val/ILSVRC2012_val_00029278.JPEG n02133161/
+mv val/ILSVRC2012_val_00029279.JPEG n02841315/
+mv val/ILSVRC2012_val_00029280.JPEG n04251144/
+mv val/ILSVRC2012_val_00029281.JPEG n02094114/
+mv val/ILSVRC2012_val_00029282.JPEG n04505470/
+mv val/ILSVRC2012_val_00029283.JPEG n01829413/
+mv val/ILSVRC2012_val_00029284.JPEG n02493509/
+mv val/ILSVRC2012_val_00029285.JPEG n11879895/
+mv val/ILSVRC2012_val_00029286.JPEG n07875152/
+mv val/ILSVRC2012_val_00029287.JPEG n01983481/
+mv val/ILSVRC2012_val_00029288.JPEG n02500267/
+mv val/ILSVRC2012_val_00029289.JPEG n02085620/
+mv val/ILSVRC2012_val_00029290.JPEG n13040303/
+mv val/ILSVRC2012_val_00029291.JPEG n03902125/
+mv val/ILSVRC2012_val_00029292.JPEG n12620546/
+mv val/ILSVRC2012_val_00029293.JPEG n03599486/
+mv val/ILSVRC2012_val_00029294.JPEG n03891332/
+mv val/ILSVRC2012_val_00029295.JPEG n02102480/
+mv val/ILSVRC2012_val_00029296.JPEG n04118538/
+mv val/ILSVRC2012_val_00029297.JPEG n01807496/
+mv val/ILSVRC2012_val_00029298.JPEG n01860187/
+mv val/ILSVRC2012_val_00029299.JPEG n03444034/
+mv val/ILSVRC2012_val_00029300.JPEG n01491361/
+mv val/ILSVRC2012_val_00029301.JPEG n07831146/
+mv val/ILSVRC2012_val_00029302.JPEG n02666196/
+mv val/ILSVRC2012_val_00029303.JPEG n02892767/
+mv val/ILSVRC2012_val_00029304.JPEG n13040303/
+mv val/ILSVRC2012_val_00029305.JPEG n03032252/
+mv val/ILSVRC2012_val_00029306.JPEG n02125311/
+mv val/ILSVRC2012_val_00029307.JPEG n02168699/
+mv val/ILSVRC2012_val_00029308.JPEG n02117135/
+mv val/ILSVRC2012_val_00029309.JPEG n02395406/
+mv val/ILSVRC2012_val_00029310.JPEG n01537544/
+mv val/ILSVRC2012_val_00029311.JPEG n07753275/
+mv val/ILSVRC2012_val_00029312.JPEG n04428191/
+mv val/ILSVRC2012_val_00029313.JPEG n02109961/
+mv val/ILSVRC2012_val_00029314.JPEG n04235860/
+mv val/ILSVRC2012_val_00029315.JPEG n02417914/
+mv val/ILSVRC2012_val_00029316.JPEG n04584207/
+mv val/ILSVRC2012_val_00029317.JPEG n04070727/
+mv val/ILSVRC2012_val_00029318.JPEG n01873310/
+mv val/ILSVRC2012_val_00029319.JPEG n02749479/
+mv val/ILSVRC2012_val_00029320.JPEG n02769748/
+mv val/ILSVRC2012_val_00029321.JPEG n07714571/
+mv val/ILSVRC2012_val_00029322.JPEG n04367480/
+mv val/ILSVRC2012_val_00029323.JPEG n02012849/
+mv val/ILSVRC2012_val_00029324.JPEG n01665541/
+mv val/ILSVRC2012_val_00029325.JPEG n02167151/
+mv val/ILSVRC2012_val_00029326.JPEG n02088466/
+mv val/ILSVRC2012_val_00029327.JPEG n03527444/
+mv val/ILSVRC2012_val_00029328.JPEG n04409515/
+mv val/ILSVRC2012_val_00029329.JPEG n02013706/
+mv val/ILSVRC2012_val_00029330.JPEG n03325584/
+mv val/ILSVRC2012_val_00029331.JPEG n02441942/
+mv val/ILSVRC2012_val_00029332.JPEG n07613480/
+mv val/ILSVRC2012_val_00029333.JPEG n02101006/
+mv val/ILSVRC2012_val_00029334.JPEG n02088632/
+mv val/ILSVRC2012_val_00029335.JPEG n02129604/
+mv val/ILSVRC2012_val_00029336.JPEG n01685808/
+mv val/ILSVRC2012_val_00029337.JPEG n02966687/
+mv val/ILSVRC2012_val_00029338.JPEG n04367480/
+mv val/ILSVRC2012_val_00029339.JPEG n03908618/
+mv val/ILSVRC2012_val_00029340.JPEG n02977058/
+mv val/ILSVRC2012_val_00029341.JPEG n04111531/
+mv val/ILSVRC2012_val_00029342.JPEG n03042490/
+mv val/ILSVRC2012_val_00029343.JPEG n03717622/
+mv val/ILSVRC2012_val_00029344.JPEG n06785654/
+mv val/ILSVRC2012_val_00029345.JPEG n02980441/
+mv val/ILSVRC2012_val_00029346.JPEG n01968897/
+mv val/ILSVRC2012_val_00029347.JPEG n01843065/
+mv val/ILSVRC2012_val_00029348.JPEG n04554684/
+mv val/ILSVRC2012_val_00029349.JPEG n04523525/
+mv val/ILSVRC2012_val_00029350.JPEG n04417672/
+mv val/ILSVRC2012_val_00029351.JPEG n01855672/
+mv val/ILSVRC2012_val_00029352.JPEG n03873416/
+mv val/ILSVRC2012_val_00029353.JPEG n02100877/
+mv val/ILSVRC2012_val_00029354.JPEG n02105505/
+mv val/ILSVRC2012_val_00029355.JPEG n03492542/
+mv val/ILSVRC2012_val_00029356.JPEG n01833805/
+mv val/ILSVRC2012_val_00029357.JPEG n04116512/
+mv val/ILSVRC2012_val_00029358.JPEG n04487394/
+mv val/ILSVRC2012_val_00029359.JPEG n02105505/
+mv val/ILSVRC2012_val_00029360.JPEG n03297495/
+mv val/ILSVRC2012_val_00029361.JPEG n02119022/
+mv val/ILSVRC2012_val_00029362.JPEG n04392985/
+mv val/ILSVRC2012_val_00029363.JPEG n02108422/
+mv val/ILSVRC2012_val_00029364.JPEG n02098413/
+mv val/ILSVRC2012_val_00029365.JPEG n02012849/
+mv val/ILSVRC2012_val_00029366.JPEG n04487394/
+mv val/ILSVRC2012_val_00029367.JPEG n01990800/
+mv val/ILSVRC2012_val_00029368.JPEG n02817516/
+mv val/ILSVRC2012_val_00029369.JPEG n03216828/
+mv val/ILSVRC2012_val_00029370.JPEG n03187595/
+mv val/ILSVRC2012_val_00029371.JPEG n07871810/
+mv val/ILSVRC2012_val_00029372.JPEG n02669723/
+mv val/ILSVRC2012_val_00029373.JPEG n02229544/
+mv val/ILSVRC2012_val_00029374.JPEG n02966687/
+mv val/ILSVRC2012_val_00029375.JPEG n02113712/
+mv val/ILSVRC2012_val_00029376.JPEG n03930313/
+mv val/ILSVRC2012_val_00029377.JPEG n03417042/
+mv val/ILSVRC2012_val_00029378.JPEG n02389026/
+mv val/ILSVRC2012_val_00029379.JPEG n03249569/
+mv val/ILSVRC2012_val_00029380.JPEG n03633091/
+mv val/ILSVRC2012_val_00029381.JPEG n02096294/
+mv val/ILSVRC2012_val_00029382.JPEG n02110627/
+mv val/ILSVRC2012_val_00029383.JPEG n03916031/
+mv val/ILSVRC2012_val_00029384.JPEG n07920052/
+mv val/ILSVRC2012_val_00029385.JPEG n04146614/
+mv val/ILSVRC2012_val_00029386.JPEG n03207743/
+mv val/ILSVRC2012_val_00029387.JPEG n02325366/
+mv val/ILSVRC2012_val_00029388.JPEG n03954731/
+mv val/ILSVRC2012_val_00029389.JPEG n04133789/
+mv val/ILSVRC2012_val_00029390.JPEG n03788195/
+mv val/ILSVRC2012_val_00029391.JPEG n03982430/
+mv val/ILSVRC2012_val_00029392.JPEG n02112706/
+mv val/ILSVRC2012_val_00029393.JPEG n02017213/
+mv val/ILSVRC2012_val_00029394.JPEG n02492660/
+mv val/ILSVRC2012_val_00029395.JPEG n03976467/
+mv val/ILSVRC2012_val_00029396.JPEG n03792782/
+mv val/ILSVRC2012_val_00029397.JPEG n02123159/
+mv val/ILSVRC2012_val_00029398.JPEG n07754684/
+mv val/ILSVRC2012_val_00029399.JPEG n03444034/
+mv val/ILSVRC2012_val_00029400.JPEG n03063599/
+mv val/ILSVRC2012_val_00029401.JPEG n02326432/
+mv val/ILSVRC2012_val_00029402.JPEG n02009912/
+mv val/ILSVRC2012_val_00029403.JPEG n04154565/
+mv val/ILSVRC2012_val_00029404.JPEG n03492542/
+mv val/ILSVRC2012_val_00029405.JPEG n03649909/
+mv val/ILSVRC2012_val_00029406.JPEG n02101388/
+mv val/ILSVRC2012_val_00029407.JPEG n02091134/
+mv val/ILSVRC2012_val_00029408.JPEG n02892201/
+mv val/ILSVRC2012_val_00029409.JPEG n02077923/
+mv val/ILSVRC2012_val_00029410.JPEG n02168699/
+mv val/ILSVRC2012_val_00029411.JPEG n04239074/
+mv val/ILSVRC2012_val_00029412.JPEG n03899768/
+mv val/ILSVRC2012_val_00029413.JPEG n04461696/
+mv val/ILSVRC2012_val_00029414.JPEG n03124170/
+mv val/ILSVRC2012_val_00029415.JPEG n09428293/
+mv val/ILSVRC2012_val_00029416.JPEG n03000247/
+mv val/ILSVRC2012_val_00029417.JPEG n01558993/
+mv val/ILSVRC2012_val_00029418.JPEG n02104365/
+mv val/ILSVRC2012_val_00029419.JPEG n02093991/
+mv val/ILSVRC2012_val_00029420.JPEG n03837869/
+mv val/ILSVRC2012_val_00029421.JPEG n02169497/
+mv val/ILSVRC2012_val_00029422.JPEG n03492542/
+mv val/ILSVRC2012_val_00029423.JPEG n03706229/
+mv val/ILSVRC2012_val_00029424.JPEG n02129165/
+mv val/ILSVRC2012_val_00029425.JPEG n03216828/
+mv val/ILSVRC2012_val_00029426.JPEG n03662601/
+mv val/ILSVRC2012_val_00029427.JPEG n02444819/
+mv val/ILSVRC2012_val_00029428.JPEG n03930313/
+mv val/ILSVRC2012_val_00029429.JPEG n04039381/
+mv val/ILSVRC2012_val_00029430.JPEG n01601694/
+mv val/ILSVRC2012_val_00029431.JPEG n04228054/
+mv val/ILSVRC2012_val_00029432.JPEG n02788148/
+mv val/ILSVRC2012_val_00029433.JPEG n03133878/
+mv val/ILSVRC2012_val_00029434.JPEG n01983481/
+mv val/ILSVRC2012_val_00029435.JPEG n02093859/
+mv val/ILSVRC2012_val_00029436.JPEG n02106166/
+mv val/ILSVRC2012_val_00029437.JPEG n02102973/
+mv val/ILSVRC2012_val_00029438.JPEG n03982430/
+mv val/ILSVRC2012_val_00029439.JPEG n02667093/
+mv val/ILSVRC2012_val_00029440.JPEG n03891332/
+mv val/ILSVRC2012_val_00029441.JPEG n01592084/
+mv val/ILSVRC2012_val_00029442.JPEG n02172182/
+mv val/ILSVRC2012_val_00029443.JPEG n03404251/
+mv val/ILSVRC2012_val_00029444.JPEG n02259212/
+mv val/ILSVRC2012_val_00029445.JPEG n03250847/
+mv val/ILSVRC2012_val_00029446.JPEG n02817516/
+mv val/ILSVRC2012_val_00029447.JPEG n07747607/
+mv val/ILSVRC2012_val_00029448.JPEG n03063599/
+mv val/ILSVRC2012_val_00029449.JPEG n03935335/
+mv val/ILSVRC2012_val_00029450.JPEG n02085620/
+mv val/ILSVRC2012_val_00029451.JPEG n02092002/
+mv val/ILSVRC2012_val_00029452.JPEG n02999410/
+mv val/ILSVRC2012_val_00029453.JPEG n02504458/
+mv val/ILSVRC2012_val_00029454.JPEG n03100240/
+mv val/ILSVRC2012_val_00029455.JPEG n04392985/
+mv val/ILSVRC2012_val_00029456.JPEG n02105855/
+mv val/ILSVRC2012_val_00029457.JPEG n07718747/
+mv val/ILSVRC2012_val_00029458.JPEG n03721384/
+mv val/ILSVRC2012_val_00029459.JPEG n02483362/
+mv val/ILSVRC2012_val_00029460.JPEG n01629819/
+mv val/ILSVRC2012_val_00029461.JPEG n02107683/
+mv val/ILSVRC2012_val_00029462.JPEG n02951358/
+mv val/ILSVRC2012_val_00029463.JPEG n07920052/
+mv val/ILSVRC2012_val_00029464.JPEG n03733805/
+mv val/ILSVRC2012_val_00029465.JPEG n02483362/
+mv val/ILSVRC2012_val_00029466.JPEG n01798484/
+mv val/ILSVRC2012_val_00029467.JPEG n04418357/
+mv val/ILSVRC2012_val_00029468.JPEG n04251144/
+mv val/ILSVRC2012_val_00029469.JPEG n03197337/
+mv val/ILSVRC2012_val_00029470.JPEG n03908618/
+mv val/ILSVRC2012_val_00029471.JPEG n01978287/
+mv val/ILSVRC2012_val_00029472.JPEG n01817953/
+mv val/ILSVRC2012_val_00029473.JPEG n04486054/
+mv val/ILSVRC2012_val_00029474.JPEG n04127249/
+mv val/ILSVRC2012_val_00029475.JPEG n01945685/
+mv val/ILSVRC2012_val_00029476.JPEG n07711569/
+mv val/ILSVRC2012_val_00029477.JPEG n02088238/
+mv val/ILSVRC2012_val_00029478.JPEG n02105641/
+mv val/ILSVRC2012_val_00029479.JPEG n02910353/
+mv val/ILSVRC2012_val_00029480.JPEG n07892512/
+mv val/ILSVRC2012_val_00029481.JPEG n01484850/
+mv val/ILSVRC2012_val_00029482.JPEG n03657121/
+mv val/ILSVRC2012_val_00029483.JPEG n02859443/
+mv val/ILSVRC2012_val_00029484.JPEG n07860988/
+mv val/ILSVRC2012_val_00029485.JPEG n04141327/
+mv val/ILSVRC2012_val_00029486.JPEG n03868863/
+mv val/ILSVRC2012_val_00029487.JPEG n01768244/
+mv val/ILSVRC2012_val_00029488.JPEG n03657121/
+mv val/ILSVRC2012_val_00029489.JPEG n02102973/
+mv val/ILSVRC2012_val_00029490.JPEG n02111500/
+mv val/ILSVRC2012_val_00029491.JPEG n01632458/
+mv val/ILSVRC2012_val_00029492.JPEG n02319095/
+mv val/ILSVRC2012_val_00029493.JPEG n04328186/
+mv val/ILSVRC2012_val_00029494.JPEG n04311004/
+mv val/ILSVRC2012_val_00029495.JPEG n01558993/
+mv val/ILSVRC2012_val_00029496.JPEG n01773549/
+mv val/ILSVRC2012_val_00029497.JPEG n01622779/
+mv val/ILSVRC2012_val_00029498.JPEG n02442845/
+mv val/ILSVRC2012_val_00029499.JPEG n07768694/
+mv val/ILSVRC2012_val_00029500.JPEG n01632777/
+mv val/ILSVRC2012_val_00029501.JPEG n03733805/
+mv val/ILSVRC2012_val_00029502.JPEG n03133878/
+mv val/ILSVRC2012_val_00029503.JPEG n02012849/
+mv val/ILSVRC2012_val_00029504.JPEG n03496892/
+mv val/ILSVRC2012_val_00029505.JPEG n02066245/
+mv val/ILSVRC2012_val_00029506.JPEG n02094433/
+mv val/ILSVRC2012_val_00029507.JPEG n03271574/
+mv val/ILSVRC2012_val_00029508.JPEG n02128757/
+mv val/ILSVRC2012_val_00029509.JPEG n03792782/
+mv val/ILSVRC2012_val_00029510.JPEG n02018795/
+mv val/ILSVRC2012_val_00029511.JPEG n01630670/
+mv val/ILSVRC2012_val_00029512.JPEG n02101006/
+mv val/ILSVRC2012_val_00029513.JPEG n04067472/
+mv val/ILSVRC2012_val_00029514.JPEG n02100583/
+mv val/ILSVRC2012_val_00029515.JPEG n04317175/
+mv val/ILSVRC2012_val_00029516.JPEG n03602883/
+mv val/ILSVRC2012_val_00029517.JPEG n04141327/
+mv val/ILSVRC2012_val_00029518.JPEG n02102040/
+mv val/ILSVRC2012_val_00029519.JPEG n07875152/
+mv val/ILSVRC2012_val_00029520.JPEG n02892201/
+mv val/ILSVRC2012_val_00029521.JPEG n04127249/
+mv val/ILSVRC2012_val_00029522.JPEG n07753275/
+mv val/ILSVRC2012_val_00029523.JPEG n04355338/
+mv val/ILSVRC2012_val_00029524.JPEG n02236044/
+mv val/ILSVRC2012_val_00029525.JPEG n01749939/
+mv val/ILSVRC2012_val_00029526.JPEG n07717556/
+mv val/ILSVRC2012_val_00029527.JPEG n02317335/
+mv val/ILSVRC2012_val_00029528.JPEG n02606052/
+mv val/ILSVRC2012_val_00029529.JPEG n04483307/
+mv val/ILSVRC2012_val_00029530.JPEG n04435653/
+mv val/ILSVRC2012_val_00029531.JPEG n04264628/
+mv val/ILSVRC2012_val_00029532.JPEG n04347754/
+mv val/ILSVRC2012_val_00029533.JPEG n04179913/
+mv val/ILSVRC2012_val_00029534.JPEG n07583066/
+mv val/ILSVRC2012_val_00029535.JPEG n04146614/
+mv val/ILSVRC2012_val_00029536.JPEG n03478589/
+mv val/ILSVRC2012_val_00029537.JPEG n03599486/
+mv val/ILSVRC2012_val_00029538.JPEG n02676566/
+mv val/ILSVRC2012_val_00029539.JPEG n02264363/
+mv val/ILSVRC2012_val_00029540.JPEG n04371430/
+mv val/ILSVRC2012_val_00029541.JPEG n03782006/
+mv val/ILSVRC2012_val_00029542.JPEG n04604644/
+mv val/ILSVRC2012_val_00029543.JPEG n03180011/
+mv val/ILSVRC2012_val_00029544.JPEG n03045698/
+mv val/ILSVRC2012_val_00029545.JPEG n03887697/
+mv val/ILSVRC2012_val_00029546.JPEG n02085936/
+mv val/ILSVRC2012_val_00029547.JPEG n07614500/
+mv val/ILSVRC2012_val_00029548.JPEG n04296562/
+mv val/ILSVRC2012_val_00029549.JPEG n02074367/
+mv val/ILSVRC2012_val_00029550.JPEG n01729977/
+mv val/ILSVRC2012_val_00029551.JPEG n02018795/
+mv val/ILSVRC2012_val_00029552.JPEG n01735189/
+mv val/ILSVRC2012_val_00029553.JPEG n03777568/
+mv val/ILSVRC2012_val_00029554.JPEG n03775546/
+mv val/ILSVRC2012_val_00029555.JPEG n02091244/
+mv val/ILSVRC2012_val_00029556.JPEG n03838899/
+mv val/ILSVRC2012_val_00029557.JPEG n04357314/
+mv val/ILSVRC2012_val_00029558.JPEG n01945685/
+mv val/ILSVRC2012_val_00029559.JPEG n03788365/
+mv val/ILSVRC2012_val_00029560.JPEG n02441942/
+mv val/ILSVRC2012_val_00029561.JPEG n04429376/
+mv val/ILSVRC2012_val_00029562.JPEG n02119022/
+mv val/ILSVRC2012_val_00029563.JPEG n01945685/
+mv val/ILSVRC2012_val_00029564.JPEG n03627232/
+mv val/ILSVRC2012_val_00029565.JPEG n02056570/
+mv val/ILSVRC2012_val_00029566.JPEG n02437616/
+mv val/ILSVRC2012_val_00029567.JPEG n03590841/
+mv val/ILSVRC2012_val_00029568.JPEG n01491361/
+mv val/ILSVRC2012_val_00029569.JPEG n01871265/
+mv val/ILSVRC2012_val_00029570.JPEG n04442312/
+mv val/ILSVRC2012_val_00029571.JPEG n01833805/
+mv val/ILSVRC2012_val_00029572.JPEG n04596742/
+mv val/ILSVRC2012_val_00029573.JPEG n04553703/
+mv val/ILSVRC2012_val_00029574.JPEG n04487394/
+mv val/ILSVRC2012_val_00029575.JPEG n03763968/
+mv val/ILSVRC2012_val_00029576.JPEG n02514041/
+mv val/ILSVRC2012_val_00029577.JPEG n11879895/
+mv val/ILSVRC2012_val_00029578.JPEG n04525038/
+mv val/ILSVRC2012_val_00029579.JPEG n02510455/
+mv val/ILSVRC2012_val_00029580.JPEG n04275548/
+mv val/ILSVRC2012_val_00029581.JPEG n01531178/
+mv val/ILSVRC2012_val_00029582.JPEG n04162706/
+mv val/ILSVRC2012_val_00029583.JPEG n03240683/
+mv val/ILSVRC2012_val_00029584.JPEG n04589890/
+mv val/ILSVRC2012_val_00029585.JPEG n03871628/
+mv val/ILSVRC2012_val_00029586.JPEG n04443257/
+mv val/ILSVRC2012_val_00029587.JPEG n02655020/
+mv val/ILSVRC2012_val_00029588.JPEG n04264628/
+mv val/ILSVRC2012_val_00029589.JPEG n01843383/
+mv val/ILSVRC2012_val_00029590.JPEG n02138441/
+mv val/ILSVRC2012_val_00029591.JPEG n02091032/
+mv val/ILSVRC2012_val_00029592.JPEG n02281406/
+mv val/ILSVRC2012_val_00029593.JPEG n03272010/
+mv val/ILSVRC2012_val_00029594.JPEG n03775546/
+mv val/ILSVRC2012_val_00029595.JPEG n03345487/
+mv val/ILSVRC2012_val_00029596.JPEG n03532672/
+mv val/ILSVRC2012_val_00029597.JPEG n02814860/
+mv val/ILSVRC2012_val_00029598.JPEG n07714571/
+mv val/ILSVRC2012_val_00029599.JPEG n02423022/
+mv val/ILSVRC2012_val_00029600.JPEG n03187595/
+mv val/ILSVRC2012_val_00029601.JPEG n03992509/
+mv val/ILSVRC2012_val_00029602.JPEG n03933933/
+mv val/ILSVRC2012_val_00029603.JPEG n03956157/
+mv val/ILSVRC2012_val_00029604.JPEG n07920052/
+mv val/ILSVRC2012_val_00029605.JPEG n01981276/
+mv val/ILSVRC2012_val_00029606.JPEG n03710721/
+mv val/ILSVRC2012_val_00029607.JPEG n04201297/
+mv val/ILSVRC2012_val_00029608.JPEG n09472597/
+mv val/ILSVRC2012_val_00029609.JPEG n02097130/
+mv val/ILSVRC2012_val_00029610.JPEG n02111889/
+mv val/ILSVRC2012_val_00029611.JPEG n03929660/
+mv val/ILSVRC2012_val_00029612.JPEG n02804610/
+mv val/ILSVRC2012_val_00029613.JPEG n03961711/
+mv val/ILSVRC2012_val_00029614.JPEG n07613480/
+mv val/ILSVRC2012_val_00029615.JPEG n01755581/
+mv val/ILSVRC2012_val_00029616.JPEG n02277742/
+mv val/ILSVRC2012_val_00029617.JPEG n03452741/
+mv val/ILSVRC2012_val_00029618.JPEG n02396427/
+mv val/ILSVRC2012_val_00029619.JPEG n01514859/
+mv val/ILSVRC2012_val_00029620.JPEG n04590129/
+mv val/ILSVRC2012_val_00029621.JPEG n04116512/
+mv val/ILSVRC2012_val_00029622.JPEG n01631663/
+mv val/ILSVRC2012_val_00029623.JPEG n07711569/
+mv val/ILSVRC2012_val_00029624.JPEG n02134084/
+mv val/ILSVRC2012_val_00029625.JPEG n04332243/
+mv val/ILSVRC2012_val_00029626.JPEG n04517823/
+mv val/ILSVRC2012_val_00029627.JPEG n01558993/
+mv val/ILSVRC2012_val_00029628.JPEG n02817516/
+mv val/ILSVRC2012_val_00029629.JPEG n02088632/
+mv val/ILSVRC2012_val_00029630.JPEG n03457902/
+mv val/ILSVRC2012_val_00029631.JPEG n01775062/
+mv val/ILSVRC2012_val_00029632.JPEG n02328150/
+mv val/ILSVRC2012_val_00029633.JPEG n02804610/
+mv val/ILSVRC2012_val_00029634.JPEG n02077923/
+mv val/ILSVRC2012_val_00029635.JPEG n02129604/
+mv val/ILSVRC2012_val_00029636.JPEG n02095314/
+mv val/ILSVRC2012_val_00029637.JPEG n03388183/
+mv val/ILSVRC2012_val_00029638.JPEG n02536864/
+mv val/ILSVRC2012_val_00029639.JPEG n03134739/
+mv val/ILSVRC2012_val_00029640.JPEG n03014705/
+mv val/ILSVRC2012_val_00029641.JPEG n02423022/
+mv val/ILSVRC2012_val_00029642.JPEG n04254120/
+mv val/ILSVRC2012_val_00029643.JPEG n03776460/
+mv val/ILSVRC2012_val_00029644.JPEG n03788195/
+mv val/ILSVRC2012_val_00029645.JPEG n03637318/
+mv val/ILSVRC2012_val_00029646.JPEG n02112706/
+mv val/ILSVRC2012_val_00029647.JPEG n03777568/
+mv val/ILSVRC2012_val_00029648.JPEG n02089078/
+mv val/ILSVRC2012_val_00029649.JPEG n03838899/
+mv val/ILSVRC2012_val_00029650.JPEG n03661043/
+mv val/ILSVRC2012_val_00029651.JPEG n02687172/
+mv val/ILSVRC2012_val_00029652.JPEG n02097658/
+mv val/ILSVRC2012_val_00029653.JPEG n02395406/
+mv val/ILSVRC2012_val_00029654.JPEG n01820546/
+mv val/ILSVRC2012_val_00029655.JPEG n03788365/
+mv val/ILSVRC2012_val_00029656.JPEG n02963159/
+mv val/ILSVRC2012_val_00029657.JPEG n02097298/
+mv val/ILSVRC2012_val_00029658.JPEG n07717556/
+mv val/ILSVRC2012_val_00029659.JPEG n02114367/
+mv val/ILSVRC2012_val_00029660.JPEG n02219486/
+mv val/ILSVRC2012_val_00029661.JPEG n04442312/
+mv val/ILSVRC2012_val_00029662.JPEG n04536866/
+mv val/ILSVRC2012_val_00029663.JPEG n02979186/
+mv val/ILSVRC2012_val_00029664.JPEG n04458633/
+mv val/ILSVRC2012_val_00029665.JPEG n07584110/
+mv val/ILSVRC2012_val_00029666.JPEG n03633091/
+mv val/ILSVRC2012_val_00029667.JPEG n04501370/
+mv val/ILSVRC2012_val_00029668.JPEG n03000684/
+mv val/ILSVRC2012_val_00029669.JPEG n02417914/
+mv val/ILSVRC2012_val_00029670.JPEG n02093859/
+mv val/ILSVRC2012_val_00029671.JPEG n04228054/
+mv val/ILSVRC2012_val_00029672.JPEG n03478589/
+mv val/ILSVRC2012_val_00029673.JPEG n02112137/
+mv val/ILSVRC2012_val_00029674.JPEG n03642806/
+mv val/ILSVRC2012_val_00029675.JPEG n02113712/
+mv val/ILSVRC2012_val_00029676.JPEG n02817516/
+mv val/ILSVRC2012_val_00029677.JPEG n03980874/
+mv val/ILSVRC2012_val_00029678.JPEG n01644900/
+mv val/ILSVRC2012_val_00029679.JPEG n11879895/
+mv val/ILSVRC2012_val_00029680.JPEG n04347754/
+mv val/ILSVRC2012_val_00029681.JPEG n03788195/
+mv val/ILSVRC2012_val_00029682.JPEG n02825657/
+mv val/ILSVRC2012_val_00029683.JPEG n02119789/
+mv val/ILSVRC2012_val_00029684.JPEG n02128925/
+mv val/ILSVRC2012_val_00029685.JPEG n02129604/
+mv val/ILSVRC2012_val_00029686.JPEG n04523525/
+mv val/ILSVRC2012_val_00029687.JPEG n04162706/
+mv val/ILSVRC2012_val_00029688.JPEG n03000247/
+mv val/ILSVRC2012_val_00029689.JPEG n04347754/
+mv val/ILSVRC2012_val_00029690.JPEG n02447366/
+mv val/ILSVRC2012_val_00029691.JPEG n02096294/
+mv val/ILSVRC2012_val_00029692.JPEG n02002724/
+mv val/ILSVRC2012_val_00029693.JPEG n02098413/
+mv val/ILSVRC2012_val_00029694.JPEG n03467068/
+mv val/ILSVRC2012_val_00029695.JPEG n01582220/
+mv val/ILSVRC2012_val_00029696.JPEG n02002556/
+mv val/ILSVRC2012_val_00029697.JPEG n03063689/
+mv val/ILSVRC2012_val_00029698.JPEG n01855672/
+mv val/ILSVRC2012_val_00029699.JPEG n02971356/
+mv val/ILSVRC2012_val_00029700.JPEG n02086240/
+mv val/ILSVRC2012_val_00029701.JPEG n02817516/
+mv val/ILSVRC2012_val_00029702.JPEG n01930112/
+mv val/ILSVRC2012_val_00029703.JPEG n02490219/
+mv val/ILSVRC2012_val_00029704.JPEG n09428293/
+mv val/ILSVRC2012_val_00029705.JPEG n02091467/
+mv val/ILSVRC2012_val_00029706.JPEG n03710637/
+mv val/ILSVRC2012_val_00029707.JPEG n02917067/
+mv val/ILSVRC2012_val_00029708.JPEG n06596364/
+mv val/ILSVRC2012_val_00029709.JPEG n01532829/
+mv val/ILSVRC2012_val_00029710.JPEG n02056570/
+mv val/ILSVRC2012_val_00029711.JPEG n04560804/
+mv val/ILSVRC2012_val_00029712.JPEG n01735189/
+mv val/ILSVRC2012_val_00029713.JPEG n04557648/
+mv val/ILSVRC2012_val_00029714.JPEG n07711569/
+mv val/ILSVRC2012_val_00029715.JPEG n06785654/
+mv val/ILSVRC2012_val_00029716.JPEG n04118776/
+mv val/ILSVRC2012_val_00029717.JPEG n02860847/
+mv val/ILSVRC2012_val_00029718.JPEG n02007558/
+mv val/ILSVRC2012_val_00029719.JPEG n02356798/
+mv val/ILSVRC2012_val_00029720.JPEG n04070727/
+mv val/ILSVRC2012_val_00029721.JPEG n02489166/
+mv val/ILSVRC2012_val_00029722.JPEG n07714990/
+mv val/ILSVRC2012_val_00029723.JPEG n02104365/
+mv val/ILSVRC2012_val_00029724.JPEG n02007558/
+mv val/ILSVRC2012_val_00029725.JPEG n03649909/
+mv val/ILSVRC2012_val_00029726.JPEG n01667114/
+mv val/ILSVRC2012_val_00029727.JPEG n01641577/
+mv val/ILSVRC2012_val_00029728.JPEG n03028079/
+mv val/ILSVRC2012_val_00029729.JPEG n03494278/
+mv val/ILSVRC2012_val_00029730.JPEG n07880968/
+mv val/ILSVRC2012_val_00029731.JPEG n03775071/
+mv val/ILSVRC2012_val_00029732.JPEG n01632458/
+mv val/ILSVRC2012_val_00029733.JPEG n01990800/
+mv val/ILSVRC2012_val_00029734.JPEG n02442845/
+mv val/ILSVRC2012_val_00029735.JPEG n02119022/
+mv val/ILSVRC2012_val_00029736.JPEG n02006656/
+mv val/ILSVRC2012_val_00029737.JPEG n02701002/
+mv val/ILSVRC2012_val_00029738.JPEG n02483362/
+mv val/ILSVRC2012_val_00029739.JPEG n03124170/
+mv val/ILSVRC2012_val_00029740.JPEG n01531178/
+mv val/ILSVRC2012_val_00029741.JPEG n02704792/
+mv val/ILSVRC2012_val_00029742.JPEG n02099849/
+mv val/ILSVRC2012_val_00029743.JPEG n01873310/
+mv val/ILSVRC2012_val_00029744.JPEG n01735189/
+mv val/ILSVRC2012_val_00029745.JPEG n04462240/
+mv val/ILSVRC2012_val_00029746.JPEG n03065424/
+mv val/ILSVRC2012_val_00029747.JPEG n04398044/
+mv val/ILSVRC2012_val_00029748.JPEG n04120489/
+mv val/ILSVRC2012_val_00029749.JPEG n04330267/
+mv val/ILSVRC2012_val_00029750.JPEG n03967562/
+mv val/ILSVRC2012_val_00029751.JPEG n02099601/
+mv val/ILSVRC2012_val_00029752.JPEG n03388043/
+mv val/ILSVRC2012_val_00029753.JPEG n02100583/
+mv val/ILSVRC2012_val_00029754.JPEG n02093991/
+mv val/ILSVRC2012_val_00029755.JPEG n09399592/
+mv val/ILSVRC2012_val_00029756.JPEG n01773797/
+mv val/ILSVRC2012_val_00029757.JPEG n03761084/
+mv val/ILSVRC2012_val_00029758.JPEG n02342885/
+mv val/ILSVRC2012_val_00029759.JPEG n02206856/
+mv val/ILSVRC2012_val_00029760.JPEG n02098286/
+mv val/ILSVRC2012_val_00029761.JPEG n03207743/
+mv val/ILSVRC2012_val_00029762.JPEG n13040303/
+mv val/ILSVRC2012_val_00029763.JPEG n01629819/
+mv val/ILSVRC2012_val_00029764.JPEG n02927161/
+mv val/ILSVRC2012_val_00029765.JPEG n04125021/
+mv val/ILSVRC2012_val_00029766.JPEG n04554684/
+mv val/ILSVRC2012_val_00029767.JPEG n02328150/
+mv val/ILSVRC2012_val_00029768.JPEG n03476684/
+mv val/ILSVRC2012_val_00029769.JPEG n02114367/
+mv val/ILSVRC2012_val_00029770.JPEG n03793489/
+mv val/ILSVRC2012_val_00029771.JPEG n03633091/
+mv val/ILSVRC2012_val_00029772.JPEG n03930630/
+mv val/ILSVRC2012_val_00029773.JPEG n02871525/
+mv val/ILSVRC2012_val_00029774.JPEG n02097474/
+mv val/ILSVRC2012_val_00029775.JPEG n02113799/
+mv val/ILSVRC2012_val_00029776.JPEG n02408429/
+mv val/ILSVRC2012_val_00029777.JPEG n03899768/
+mv val/ILSVRC2012_val_00029778.JPEG n07831146/
+mv val/ILSVRC2012_val_00029779.JPEG n04525038/
+mv val/ILSVRC2012_val_00029780.JPEG n02808304/
+mv val/ILSVRC2012_val_00029781.JPEG n03724870/
+mv val/ILSVRC2012_val_00029782.JPEG n02033041/
+mv val/ILSVRC2012_val_00029783.JPEG n02110063/
+mv val/ILSVRC2012_val_00029784.JPEG n03063689/
+mv val/ILSVRC2012_val_00029785.JPEG n01855672/
+mv val/ILSVRC2012_val_00029786.JPEG n02395406/
+mv val/ILSVRC2012_val_00029787.JPEG n04254680/
+mv val/ILSVRC2012_val_00029788.JPEG n03063689/
+mv val/ILSVRC2012_val_00029789.JPEG n02487347/
+mv val/ILSVRC2012_val_00029790.JPEG n02640242/
+mv val/ILSVRC2012_val_00029791.JPEG n03457902/
+mv val/ILSVRC2012_val_00029792.JPEG n12267677/
+mv val/ILSVRC2012_val_00029793.JPEG n04482393/
+mv val/ILSVRC2012_val_00029794.JPEG n04009552/
+mv val/ILSVRC2012_val_00029795.JPEG n02174001/
+mv val/ILSVRC2012_val_00029796.JPEG n01990800/
+mv val/ILSVRC2012_val_00029797.JPEG n04209133/
+mv val/ILSVRC2012_val_00029798.JPEG n01950731/
+mv val/ILSVRC2012_val_00029799.JPEG n02113186/
+mv val/ILSVRC2012_val_00029800.JPEG n03095699/
+mv val/ILSVRC2012_val_00029801.JPEG n01770081/
+mv val/ILSVRC2012_val_00029802.JPEG n04127249/
+mv val/ILSVRC2012_val_00029803.JPEG n02971356/
+mv val/ILSVRC2012_val_00029804.JPEG n02490219/
+mv val/ILSVRC2012_val_00029805.JPEG n04044716/
+mv val/ILSVRC2012_val_00029806.JPEG n01667778/
+mv val/ILSVRC2012_val_00029807.JPEG n03710721/
+mv val/ILSVRC2012_val_00029808.JPEG n03141823/
+mv val/ILSVRC2012_val_00029809.JPEG n04099969/
+mv val/ILSVRC2012_val_00029810.JPEG n02325366/
+mv val/ILSVRC2012_val_00029811.JPEG n04599235/
+mv val/ILSVRC2012_val_00029812.JPEG n01978455/
+mv val/ILSVRC2012_val_00029813.JPEG n03599486/
+mv val/ILSVRC2012_val_00029814.JPEG n02090622/
+mv val/ILSVRC2012_val_00029815.JPEG n03630383/
+mv val/ILSVRC2012_val_00029816.JPEG n02117135/
+mv val/ILSVRC2012_val_00029817.JPEG n02037110/
+mv val/ILSVRC2012_val_00029818.JPEG n02219486/
+mv val/ILSVRC2012_val_00029819.JPEG n03297495/
+mv val/ILSVRC2012_val_00029820.JPEG n02105505/
+mv val/ILSVRC2012_val_00029821.JPEG n04263257/
+mv val/ILSVRC2012_val_00029822.JPEG n02442845/
+mv val/ILSVRC2012_val_00029823.JPEG n04266014/
+mv val/ILSVRC2012_val_00029824.JPEG n03393912/
+mv val/ILSVRC2012_val_00029825.JPEG n02115641/
+mv val/ILSVRC2012_val_00029826.JPEG n02883205/
+mv val/ILSVRC2012_val_00029827.JPEG n01729977/
+mv val/ILSVRC2012_val_00029828.JPEG n03047690/
+mv val/ILSVRC2012_val_00029829.JPEG n02361337/
+mv val/ILSVRC2012_val_00029830.JPEG n04560804/
+mv val/ILSVRC2012_val_00029831.JPEG n02106662/
+mv val/ILSVRC2012_val_00029832.JPEG n03876231/
+mv val/ILSVRC2012_val_00029833.JPEG n03041632/
+mv val/ILSVRC2012_val_00029834.JPEG n02098105/
+mv val/ILSVRC2012_val_00029835.JPEG n01560419/
+mv val/ILSVRC2012_val_00029836.JPEG n02089078/
+mv val/ILSVRC2012_val_00029837.JPEG n03218198/
+mv val/ILSVRC2012_val_00029838.JPEG n04153751/
+mv val/ILSVRC2012_val_00029839.JPEG n02123597/
+mv val/ILSVRC2012_val_00029840.JPEG n03584829/
+mv val/ILSVRC2012_val_00029841.JPEG n02930766/
+mv val/ILSVRC2012_val_00029842.JPEG n03781244/
+mv val/ILSVRC2012_val_00029843.JPEG n02264363/
+mv val/ILSVRC2012_val_00029844.JPEG n07711569/
+mv val/ILSVRC2012_val_00029845.JPEG n04418357/
+mv val/ILSVRC2012_val_00029846.JPEG n06596364/
+mv val/ILSVRC2012_val_00029847.JPEG n03345487/
+mv val/ILSVRC2012_val_00029848.JPEG n02835271/
+mv val/ILSVRC2012_val_00029849.JPEG n04467665/
+mv val/ILSVRC2012_val_00029850.JPEG n03450230/
+mv val/ILSVRC2012_val_00029851.JPEG n03692522/
+mv val/ILSVRC2012_val_00029852.JPEG n03929660/
+mv val/ILSVRC2012_val_00029853.JPEG n03935335/
+mv val/ILSVRC2012_val_00029854.JPEG n01630670/
+mv val/ILSVRC2012_val_00029855.JPEG n02120505/
+mv val/ILSVRC2012_val_00029856.JPEG n02172182/
+mv val/ILSVRC2012_val_00029857.JPEG n03777754/
+mv val/ILSVRC2012_val_00029858.JPEG n04209133/
+mv val/ILSVRC2012_val_00029859.JPEG n01687978/
+mv val/ILSVRC2012_val_00029860.JPEG n03481172/
+mv val/ILSVRC2012_val_00029861.JPEG n02088094/
+mv val/ILSVRC2012_val_00029862.JPEG n02112350/
+mv val/ILSVRC2012_val_00029863.JPEG n03982430/
+mv val/ILSVRC2012_val_00029864.JPEG n02124075/
+mv val/ILSVRC2012_val_00029865.JPEG n03854065/
+mv val/ILSVRC2012_val_00029866.JPEG n04141076/
+mv val/ILSVRC2012_val_00029867.JPEG n06785654/
+mv val/ILSVRC2012_val_00029868.JPEG n02981792/
+mv val/ILSVRC2012_val_00029869.JPEG n03207941/
+mv val/ILSVRC2012_val_00029870.JPEG n03028079/
+mv val/ILSVRC2012_val_00029871.JPEG n13133613/
+mv val/ILSVRC2012_val_00029872.JPEG n02423022/
+mv val/ILSVRC2012_val_00029873.JPEG n03777568/
+mv val/ILSVRC2012_val_00029874.JPEG n02328150/
+mv val/ILSVRC2012_val_00029875.JPEG n02037110/
+mv val/ILSVRC2012_val_00029876.JPEG n02092002/
+mv val/ILSVRC2012_val_00029877.JPEG n02655020/
+mv val/ILSVRC2012_val_00029878.JPEG n04443257/
+mv val/ILSVRC2012_val_00029879.JPEG n02963159/
+mv val/ILSVRC2012_val_00029880.JPEG n01687978/
+mv val/ILSVRC2012_val_00029881.JPEG n09193705/
+mv val/ILSVRC2012_val_00029882.JPEG n10148035/
+mv val/ILSVRC2012_val_00029883.JPEG n03065424/
+mv val/ILSVRC2012_val_00029884.JPEG n03792972/
+mv val/ILSVRC2012_val_00029885.JPEG n02013706/
+mv val/ILSVRC2012_val_00029886.JPEG n01494475/
+mv val/ILSVRC2012_val_00029887.JPEG n07860988/
+mv val/ILSVRC2012_val_00029888.JPEG n02099267/
+mv val/ILSVRC2012_val_00029889.JPEG n04355933/
+mv val/ILSVRC2012_val_00029890.JPEG n02457408/
+mv val/ILSVRC2012_val_00029891.JPEG n01943899/
+mv val/ILSVRC2012_val_00029892.JPEG n03733131/
+mv val/ILSVRC2012_val_00029893.JPEG n04252077/
+mv val/ILSVRC2012_val_00029894.JPEG n02978881/
+mv val/ILSVRC2012_val_00029895.JPEG n03868863/
+mv val/ILSVRC2012_val_00029896.JPEG n03544143/
+mv val/ILSVRC2012_val_00029897.JPEG n03692522/
+mv val/ILSVRC2012_val_00029898.JPEG n12768682/
+mv val/ILSVRC2012_val_00029899.JPEG n02088094/
+mv val/ILSVRC2012_val_00029900.JPEG n04023962/
+mv val/ILSVRC2012_val_00029901.JPEG n02793495/
+mv val/ILSVRC2012_val_00029902.JPEG n03840681/
+mv val/ILSVRC2012_val_00029903.JPEG n01773549/
+mv val/ILSVRC2012_val_00029904.JPEG n03843555/
+mv val/ILSVRC2012_val_00029905.JPEG n04482393/
+mv val/ILSVRC2012_val_00029906.JPEG n07753592/
+mv val/ILSVRC2012_val_00029907.JPEG n03673027/
+mv val/ILSVRC2012_val_00029908.JPEG n07930864/
+mv val/ILSVRC2012_val_00029909.JPEG n01685808/
+mv val/ILSVRC2012_val_00029910.JPEG n02037110/
+mv val/ILSVRC2012_val_00029911.JPEG n02787622/
+mv val/ILSVRC2012_val_00029912.JPEG n06596364/
+mv val/ILSVRC2012_val_00029913.JPEG n02033041/
+mv val/ILSVRC2012_val_00029914.JPEG n04204238/
+mv val/ILSVRC2012_val_00029915.JPEG n12267677/
+mv val/ILSVRC2012_val_00029916.JPEG n02321529/
+mv val/ILSVRC2012_val_00029917.JPEG n03404251/
+mv val/ILSVRC2012_val_00029918.JPEG n03000684/
+mv val/ILSVRC2012_val_00029919.JPEG n07753592/
+mv val/ILSVRC2012_val_00029920.JPEG n03804744/
+mv val/ILSVRC2012_val_00029921.JPEG n01514668/
+mv val/ILSVRC2012_val_00029922.JPEG n03594945/
+mv val/ILSVRC2012_val_00029923.JPEG n02110627/
+mv val/ILSVRC2012_val_00029924.JPEG n03793489/
+mv val/ILSVRC2012_val_00029925.JPEG n04243546/
+mv val/ILSVRC2012_val_00029926.JPEG n02490219/
+mv val/ILSVRC2012_val_00029927.JPEG n02817516/
+mv val/ILSVRC2012_val_00029928.JPEG n03291819/
+mv val/ILSVRC2012_val_00029929.JPEG n02100877/
+mv val/ILSVRC2012_val_00029930.JPEG n01440764/
+mv val/ILSVRC2012_val_00029931.JPEG n04209239/
+mv val/ILSVRC2012_val_00029932.JPEG n02088364/
+mv val/ILSVRC2012_val_00029933.JPEG n04590129/
+mv val/ILSVRC2012_val_00029934.JPEG n02110806/
+mv val/ILSVRC2012_val_00029935.JPEG n09229709/
+mv val/ILSVRC2012_val_00029936.JPEG n02447366/
+mv val/ILSVRC2012_val_00029937.JPEG n04606251/
+mv val/ILSVRC2012_val_00029938.JPEG n04562935/
+mv val/ILSVRC2012_val_00029939.JPEG n02128385/
+mv val/ILSVRC2012_val_00029940.JPEG n02837789/
+mv val/ILSVRC2012_val_00029941.JPEG n02363005/
+mv val/ILSVRC2012_val_00029942.JPEG n04133789/
+mv val/ILSVRC2012_val_00029943.JPEG n02165456/
+mv val/ILSVRC2012_val_00029944.JPEG n03649909/
+mv val/ILSVRC2012_val_00029945.JPEG n03661043/
+mv val/ILSVRC2012_val_00029946.JPEG n02107683/
+mv val/ILSVRC2012_val_00029947.JPEG n01688243/
+mv val/ILSVRC2012_val_00029948.JPEG n01843383/
+mv val/ILSVRC2012_val_00029949.JPEG n03891251/
+mv val/ILSVRC2012_val_00029950.JPEG n12620546/
+mv val/ILSVRC2012_val_00029951.JPEG n03832673/
+mv val/ILSVRC2012_val_00029952.JPEG n03452741/
+mv val/ILSVRC2012_val_00029953.JPEG n04074963/
+mv val/ILSVRC2012_val_00029954.JPEG n04228054/
+mv val/ILSVRC2012_val_00029955.JPEG n03982430/
+mv val/ILSVRC2012_val_00029956.JPEG n01795545/
+mv val/ILSVRC2012_val_00029957.JPEG n02877765/
+mv val/ILSVRC2012_val_00029958.JPEG n03196217/
+mv val/ILSVRC2012_val_00029959.JPEG n04435653/
+mv val/ILSVRC2012_val_00029960.JPEG n02105505/
+mv val/ILSVRC2012_val_00029961.JPEG n04467665/
+mv val/ILSVRC2012_val_00029962.JPEG n07695742/
+mv val/ILSVRC2012_val_00029963.JPEG n02672831/
+mv val/ILSVRC2012_val_00029964.JPEG n03690938/
+mv val/ILSVRC2012_val_00029965.JPEG n04456115/
+mv val/ILSVRC2012_val_00029966.JPEG n04125021/
+mv val/ILSVRC2012_val_00029967.JPEG n15075141/
+mv val/ILSVRC2012_val_00029968.JPEG n03761084/
+mv val/ILSVRC2012_val_00029969.JPEG n04487394/
+mv val/ILSVRC2012_val_00029970.JPEG n02108089/
+mv val/ILSVRC2012_val_00029971.JPEG n07932039/
+mv val/ILSVRC2012_val_00029972.JPEG n01806567/
+mv val/ILSVRC2012_val_00029973.JPEG n02089078/
+mv val/ILSVRC2012_val_00029974.JPEG n02028035/
+mv val/ILSVRC2012_val_00029975.JPEG n03623198/
+mv val/ILSVRC2012_val_00029976.JPEG n02108551/
+mv val/ILSVRC2012_val_00029977.JPEG n01632458/
+mv val/ILSVRC2012_val_00029978.JPEG n03445924/
+mv val/ILSVRC2012_val_00029979.JPEG n01739381/
+mv val/ILSVRC2012_val_00029980.JPEG n03887697/
+mv val/ILSVRC2012_val_00029981.JPEG n07836838/
+mv val/ILSVRC2012_val_00029982.JPEG n02364673/
+mv val/ILSVRC2012_val_00029983.JPEG n03355925/
+mv val/ILSVRC2012_val_00029984.JPEG n02113799/
+mv val/ILSVRC2012_val_00029985.JPEG n04476259/
+mv val/ILSVRC2012_val_00029986.JPEG n02437312/
+mv val/ILSVRC2012_val_00029987.JPEG n03534580/
+mv val/ILSVRC2012_val_00029988.JPEG n03841143/
+mv val/ILSVRC2012_val_00029989.JPEG n03131574/
+mv val/ILSVRC2012_val_00029990.JPEG n07697537/
+mv val/ILSVRC2012_val_00029991.JPEG n01818515/
+mv val/ILSVRC2012_val_00029992.JPEG n03929660/
+mv val/ILSVRC2012_val_00029993.JPEG n02093647/
+mv val/ILSVRC2012_val_00029994.JPEG n02892767/
+mv val/ILSVRC2012_val_00029995.JPEG n03916031/
+mv val/ILSVRC2012_val_00029996.JPEG n04081281/
+mv val/ILSVRC2012_val_00029997.JPEG n04443257/
+mv val/ILSVRC2012_val_00029998.JPEG n02441942/
+mv val/ILSVRC2012_val_00029999.JPEG n01534433/
+mv val/ILSVRC2012_val_00030000.JPEG n01843383/
+mv val/ILSVRC2012_val_00030001.JPEG n02951358/
+mv val/ILSVRC2012_val_00030002.JPEG n02089078/
+mv val/ILSVRC2012_val_00030003.JPEG n03874293/
+mv val/ILSVRC2012_val_00030004.JPEG n03127925/
+mv val/ILSVRC2012_val_00030005.JPEG n02094258/
+mv val/ILSVRC2012_val_00030006.JPEG n04366367/
+mv val/ILSVRC2012_val_00030007.JPEG n03485407/
+mv val/ILSVRC2012_val_00030008.JPEG n04597913/
+mv val/ILSVRC2012_val_00030009.JPEG n01755581/
+mv val/ILSVRC2012_val_00030010.JPEG n01795545/
+mv val/ILSVRC2012_val_00030011.JPEG n01601694/
+mv val/ILSVRC2012_val_00030012.JPEG n01944390/
+mv val/ILSVRC2012_val_00030013.JPEG n03124170/
+mv val/ILSVRC2012_val_00030014.JPEG n02395406/
+mv val/ILSVRC2012_val_00030015.JPEG n03594734/
+mv val/ILSVRC2012_val_00030016.JPEG n01685808/
+mv val/ILSVRC2012_val_00030017.JPEG n01582220/
+mv val/ILSVRC2012_val_00030018.JPEG n02110627/
+mv val/ILSVRC2012_val_00030019.JPEG n03991062/
+mv val/ILSVRC2012_val_00030020.JPEG n02699494/
+mv val/ILSVRC2012_val_00030021.JPEG n09472597/
+mv val/ILSVRC2012_val_00030022.JPEG n02500267/
+mv val/ILSVRC2012_val_00030023.JPEG n03476991/
+mv val/ILSVRC2012_val_00030024.JPEG n02963159/
+mv val/ILSVRC2012_val_00030025.JPEG n02089867/
+mv val/ILSVRC2012_val_00030026.JPEG n01697457/
+mv val/ILSVRC2012_val_00030027.JPEG n03347037/
+mv val/ILSVRC2012_val_00030028.JPEG n01806143/
+mv val/ILSVRC2012_val_00030029.JPEG n02074367/
+mv val/ILSVRC2012_val_00030030.JPEG n02699494/
+mv val/ILSVRC2012_val_00030031.JPEG n04090263/
+mv val/ILSVRC2012_val_00030032.JPEG n03763968/
+mv val/ILSVRC2012_val_00030033.JPEG n02422699/
+mv val/ILSVRC2012_val_00030034.JPEG n04070727/
+mv val/ILSVRC2012_val_00030035.JPEG n01694178/
+mv val/ILSVRC2012_val_00030036.JPEG n01797886/
+mv val/ILSVRC2012_val_00030037.JPEG n03459775/
+mv val/ILSVRC2012_val_00030038.JPEG n03977966/
+mv val/ILSVRC2012_val_00030039.JPEG n01751748/
+mv val/ILSVRC2012_val_00030040.JPEG n03803284/
+mv val/ILSVRC2012_val_00030041.JPEG n01950731/
+mv val/ILSVRC2012_val_00030042.JPEG n01532829/
+mv val/ILSVRC2012_val_00030043.JPEG n02454379/
+mv val/ILSVRC2012_val_00030044.JPEG n02051845/
+mv val/ILSVRC2012_val_00030045.JPEG n03976657/
+mv val/ILSVRC2012_val_00030046.JPEG n07248320/
+mv val/ILSVRC2012_val_00030047.JPEG n07753275/
+mv val/ILSVRC2012_val_00030048.JPEG n09332890/
+mv val/ILSVRC2012_val_00030049.JPEG n02002556/
+mv val/ILSVRC2012_val_00030050.JPEG n03602883/
+mv val/ILSVRC2012_val_00030051.JPEG n12057211/
+mv val/ILSVRC2012_val_00030052.JPEG n02123045/
+mv val/ILSVRC2012_val_00030053.JPEG n02950826/
+mv val/ILSVRC2012_val_00030054.JPEG n02219486/
+mv val/ILSVRC2012_val_00030055.JPEG n02115641/
+mv val/ILSVRC2012_val_00030056.JPEG n02085936/
+mv val/ILSVRC2012_val_00030057.JPEG n02951585/
+mv val/ILSVRC2012_val_00030058.JPEG n02111889/
+mv val/ILSVRC2012_val_00030059.JPEG n02102480/
+mv val/ILSVRC2012_val_00030060.JPEG n01443537/
+mv val/ILSVRC2012_val_00030061.JPEG n02105162/
+mv val/ILSVRC2012_val_00030062.JPEG n02794156/
+mv val/ILSVRC2012_val_00030063.JPEG n04479046/
+mv val/ILSVRC2012_val_00030064.JPEG n03047690/
+mv val/ILSVRC2012_val_00030065.JPEG n02105412/
+mv val/ILSVRC2012_val_00030066.JPEG n02692877/
+mv val/ILSVRC2012_val_00030067.JPEG n01739381/
+mv val/ILSVRC2012_val_00030068.JPEG n07930864/
+mv val/ILSVRC2012_val_00030069.JPEG n04552348/
+mv val/ILSVRC2012_val_00030070.JPEG n02835271/
+mv val/ILSVRC2012_val_00030071.JPEG n01531178/
+mv val/ILSVRC2012_val_00030072.JPEG n04120489/
+mv val/ILSVRC2012_val_00030073.JPEG n01582220/
+mv val/ILSVRC2012_val_00030074.JPEG n02840245/
+mv val/ILSVRC2012_val_00030075.JPEG n02422106/
+mv val/ILSVRC2012_val_00030076.JPEG n01697457/
+mv val/ILSVRC2012_val_00030077.JPEG n03075370/
+mv val/ILSVRC2012_val_00030078.JPEG n04136333/
+mv val/ILSVRC2012_val_00030079.JPEG n03874599/
+mv val/ILSVRC2012_val_00030080.JPEG n03492542/
+mv val/ILSVRC2012_val_00030081.JPEG n02389026/
+mv val/ILSVRC2012_val_00030082.JPEG n03207743/
+mv val/ILSVRC2012_val_00030083.JPEG n02089867/
+mv val/ILSVRC2012_val_00030084.JPEG n04136333/
+mv val/ILSVRC2012_val_00030085.JPEG n06359193/
+mv val/ILSVRC2012_val_00030086.JPEG n02106382/
+mv val/ILSVRC2012_val_00030087.JPEG n02101006/
+mv val/ILSVRC2012_val_00030088.JPEG n02091467/
+mv val/ILSVRC2012_val_00030089.JPEG n03325584/
+mv val/ILSVRC2012_val_00030090.JPEG n01616318/
+mv val/ILSVRC2012_val_00030091.JPEG n02804610/
+mv val/ILSVRC2012_val_00030092.JPEG n07717556/
+mv val/ILSVRC2012_val_00030093.JPEG n02111500/
+mv val/ILSVRC2012_val_00030094.JPEG n01608432/
+mv val/ILSVRC2012_val_00030095.JPEG n02007558/
+mv val/ILSVRC2012_val_00030096.JPEG n03887697/
+mv val/ILSVRC2012_val_00030097.JPEG n02107142/
+mv val/ILSVRC2012_val_00030098.JPEG n02641379/
+mv val/ILSVRC2012_val_00030099.JPEG n07734744/
+mv val/ILSVRC2012_val_00030100.JPEG n03710193/
+mv val/ILSVRC2012_val_00030101.JPEG n02231487/
+mv val/ILSVRC2012_val_00030102.JPEG n02028035/
+mv val/ILSVRC2012_val_00030103.JPEG n04296562/
+mv val/ILSVRC2012_val_00030104.JPEG n04009552/
+mv val/ILSVRC2012_val_00030105.JPEG n02977058/
+mv val/ILSVRC2012_val_00030106.JPEG n03710721/
+mv val/ILSVRC2012_val_00030107.JPEG n03884397/
+mv val/ILSVRC2012_val_00030108.JPEG n03775546/
+mv val/ILSVRC2012_val_00030109.JPEG n07892512/
+mv val/ILSVRC2012_val_00030110.JPEG n04254777/
+mv val/ILSVRC2012_val_00030111.JPEG n07697537/
+mv val/ILSVRC2012_val_00030112.JPEG n03792782/
+mv val/ILSVRC2012_val_00030113.JPEG n02102480/
+mv val/ILSVRC2012_val_00030114.JPEG n03000247/
+mv val/ILSVRC2012_val_00030115.JPEG n02117135/
+mv val/ILSVRC2012_val_00030116.JPEG n01796340/
+mv val/ILSVRC2012_val_00030117.JPEG n02892201/
+mv val/ILSVRC2012_val_00030118.JPEG n04254680/
+mv val/ILSVRC2012_val_00030119.JPEG n04040759/
+mv val/ILSVRC2012_val_00030120.JPEG n01773549/
+mv val/ILSVRC2012_val_00030121.JPEG n04040759/
+mv val/ILSVRC2012_val_00030122.JPEG n03124170/
+mv val/ILSVRC2012_val_00030123.JPEG n02790996/
+mv val/ILSVRC2012_val_00030124.JPEG n04037443/
+mv val/ILSVRC2012_val_00030125.JPEG n02033041/
+mv val/ILSVRC2012_val_00030126.JPEG n04509417/
+mv val/ILSVRC2012_val_00030127.JPEG n01484850/
+mv val/ILSVRC2012_val_00030128.JPEG n03697007/
+mv val/ILSVRC2012_val_00030129.JPEG n04208210/
+mv val/ILSVRC2012_val_00030130.JPEG n04209133/
+mv val/ILSVRC2012_val_00030131.JPEG n02497673/
+mv val/ILSVRC2012_val_00030132.JPEG n03840681/
+mv val/ILSVRC2012_val_00030133.JPEG n03785016/
+mv val/ILSVRC2012_val_00030134.JPEG n04086273/
+mv val/ILSVRC2012_val_00030135.JPEG n02085936/
+mv val/ILSVRC2012_val_00030136.JPEG n02134084/
+mv val/ILSVRC2012_val_00030137.JPEG n03404251/
+mv val/ILSVRC2012_val_00030138.JPEG n02098286/
+mv val/ILSVRC2012_val_00030139.JPEG n07734744/
+mv val/ILSVRC2012_val_00030140.JPEG n03998194/
+mv val/ILSVRC2012_val_00030141.JPEG n02086910/
+mv val/ILSVRC2012_val_00030142.JPEG n03250847/
+mv val/ILSVRC2012_val_00030143.JPEG n03983396/
+mv val/ILSVRC2012_val_00030144.JPEG n04336792/
+mv val/ILSVRC2012_val_00030145.JPEG n03457902/
+mv val/ILSVRC2012_val_00030146.JPEG n03026506/
+mv val/ILSVRC2012_val_00030147.JPEG n03980874/
+mv val/ILSVRC2012_val_00030148.JPEG n01818515/
+mv val/ILSVRC2012_val_00030149.JPEG n04507155/
+mv val/ILSVRC2012_val_00030150.JPEG n03933933/
+mv val/ILSVRC2012_val_00030151.JPEG n13037406/
+mv val/ILSVRC2012_val_00030152.JPEG n04235860/
+mv val/ILSVRC2012_val_00030153.JPEG n02504013/
+mv val/ILSVRC2012_val_00030154.JPEG n03297495/
+mv val/ILSVRC2012_val_00030155.JPEG n02802426/
+mv val/ILSVRC2012_val_00030156.JPEG n01491361/
+mv val/ILSVRC2012_val_00030157.JPEG n02916936/
+mv val/ILSVRC2012_val_00030158.JPEG n01755581/
+mv val/ILSVRC2012_val_00030159.JPEG n02727426/
+mv val/ILSVRC2012_val_00030160.JPEG n04228054/
+mv val/ILSVRC2012_val_00030161.JPEG n03584254/
+mv val/ILSVRC2012_val_00030162.JPEG n04317175/
+mv val/ILSVRC2012_val_00030163.JPEG n01667114/
+mv val/ILSVRC2012_val_00030164.JPEG n04486054/
+mv val/ILSVRC2012_val_00030165.JPEG n02110341/
+mv val/ILSVRC2012_val_00030166.JPEG n04465501/
+mv val/ILSVRC2012_val_00030167.JPEG n02974003/
+mv val/ILSVRC2012_val_00030168.JPEG n12768682/
+mv val/ILSVRC2012_val_00030169.JPEG n12998815/
+mv val/ILSVRC2012_val_00030170.JPEG n02111129/
+mv val/ILSVRC2012_val_00030171.JPEG n11879895/
+mv val/ILSVRC2012_val_00030172.JPEG n03775546/
+mv val/ILSVRC2012_val_00030173.JPEG n03496892/
+mv val/ILSVRC2012_val_00030174.JPEG n03791053/
+mv val/ILSVRC2012_val_00030175.JPEG n01768244/
+mv val/ILSVRC2012_val_00030176.JPEG n09421951/
+mv val/ILSVRC2012_val_00030177.JPEG n04192698/
+mv val/ILSVRC2012_val_00030178.JPEG n04517823/
+mv val/ILSVRC2012_val_00030179.JPEG n02514041/
+mv val/ILSVRC2012_val_00030180.JPEG n12985857/
+mv val/ILSVRC2012_val_00030181.JPEG n13054560/
+mv val/ILSVRC2012_val_00030182.JPEG n04330267/
+mv val/ILSVRC2012_val_00030183.JPEG n03388549/
+mv val/ILSVRC2012_val_00030184.JPEG n04254120/
+mv val/ILSVRC2012_val_00030185.JPEG n04423845/
+mv val/ILSVRC2012_val_00030186.JPEG n11879895/
+mv val/ILSVRC2012_val_00030187.JPEG n02776631/
+mv val/ILSVRC2012_val_00030188.JPEG n02137549/
+mv val/ILSVRC2012_val_00030189.JPEG n03495258/
+mv val/ILSVRC2012_val_00030190.JPEG n03355925/
+mv val/ILSVRC2012_val_00030191.JPEG n02486410/
+mv val/ILSVRC2012_val_00030192.JPEG n02749479/
+mv val/ILSVRC2012_val_00030193.JPEG n03187595/
+mv val/ILSVRC2012_val_00030194.JPEG n03388043/
+mv val/ILSVRC2012_val_00030195.JPEG n04005630/
+mv val/ILSVRC2012_val_00030196.JPEG n02100877/
+mv val/ILSVRC2012_val_00030197.JPEG n07714990/
+mv val/ILSVRC2012_val_00030198.JPEG n06359193/
+mv val/ILSVRC2012_val_00030199.JPEG n02096051/
+mv val/ILSVRC2012_val_00030200.JPEG n02105641/
+mv val/ILSVRC2012_val_00030201.JPEG n07579787/
+mv val/ILSVRC2012_val_00030202.JPEG n09472597/
+mv val/ILSVRC2012_val_00030203.JPEG n04355338/
+mv val/ILSVRC2012_val_00030204.JPEG n03680355/
+mv val/ILSVRC2012_val_00030205.JPEG n02730930/
+mv val/ILSVRC2012_val_00030206.JPEG n03874599/
+mv val/ILSVRC2012_val_00030207.JPEG n02730930/
+mv val/ILSVRC2012_val_00030208.JPEG n04552348/
+mv val/ILSVRC2012_val_00030209.JPEG n03535780/
+mv val/ILSVRC2012_val_00030210.JPEG n01753488/
+mv val/ILSVRC2012_val_00030211.JPEG n02012849/
+mv val/ILSVRC2012_val_00030212.JPEG n01704323/
+mv val/ILSVRC2012_val_00030213.JPEG n02097209/
+mv val/ILSVRC2012_val_00030214.JPEG n03908714/
+mv val/ILSVRC2012_val_00030215.JPEG n04589890/
+mv val/ILSVRC2012_val_00030216.JPEG n04372370/
+mv val/ILSVRC2012_val_00030217.JPEG n01443537/
+mv val/ILSVRC2012_val_00030218.JPEG n03457902/
+mv val/ILSVRC2012_val_00030219.JPEG n04238763/
+mv val/ILSVRC2012_val_00030220.JPEG n09246464/
+mv val/ILSVRC2012_val_00030221.JPEG n01739381/
+mv val/ILSVRC2012_val_00030222.JPEG n02488702/
+mv val/ILSVRC2012_val_00030223.JPEG n04026417/
+mv val/ILSVRC2012_val_00030224.JPEG n01530575/
+mv val/ILSVRC2012_val_00030225.JPEG n07749582/
+mv val/ILSVRC2012_val_00030226.JPEG n02102480/
+mv val/ILSVRC2012_val_00030227.JPEG n04557648/
+mv val/ILSVRC2012_val_00030228.JPEG n02096585/
+mv val/ILSVRC2012_val_00030229.JPEG n01740131/
+mv val/ILSVRC2012_val_00030230.JPEG n04389033/
+mv val/ILSVRC2012_val_00030231.JPEG n03314780/
+mv val/ILSVRC2012_val_00030232.JPEG n07875152/
+mv val/ILSVRC2012_val_00030233.JPEG n02492660/
+mv val/ILSVRC2012_val_00030234.JPEG n12057211/
+mv val/ILSVRC2012_val_00030235.JPEG n04371430/
+mv val/ILSVRC2012_val_00030236.JPEG n02099267/
+mv val/ILSVRC2012_val_00030237.JPEG n03495258/
+mv val/ILSVRC2012_val_00030238.JPEG n02096051/
+mv val/ILSVRC2012_val_00030239.JPEG n02105162/
+mv val/ILSVRC2012_val_00030240.JPEG n02105641/
+mv val/ILSVRC2012_val_00030241.JPEG n03016953/
+mv val/ILSVRC2012_val_00030242.JPEG n02808440/
+mv val/ILSVRC2012_val_00030243.JPEG n03598930/
+mv val/ILSVRC2012_val_00030244.JPEG n04542943/
+mv val/ILSVRC2012_val_00030245.JPEG n01855672/
+mv val/ILSVRC2012_val_00030246.JPEG n03733281/
+mv val/ILSVRC2012_val_00030247.JPEG n07717410/
+mv val/ILSVRC2012_val_00030248.JPEG n02504013/
+mv val/ILSVRC2012_val_00030249.JPEG n02091831/
+mv val/ILSVRC2012_val_00030250.JPEG n04133789/
+mv val/ILSVRC2012_val_00030251.JPEG n04356056/
+mv val/ILSVRC2012_val_00030252.JPEG n02879718/
+mv val/ILSVRC2012_val_00030253.JPEG n03891251/
+mv val/ILSVRC2012_val_00030254.JPEG n03379051/
+mv val/ILSVRC2012_val_00030255.JPEG n02113978/
+mv val/ILSVRC2012_val_00030256.JPEG n09288635/
+mv val/ILSVRC2012_val_00030257.JPEG n02444819/
+mv val/ILSVRC2012_val_00030258.JPEG n01945685/
+mv val/ILSVRC2012_val_00030259.JPEG n03980874/
+mv val/ILSVRC2012_val_00030260.JPEG n02526121/
+mv val/ILSVRC2012_val_00030261.JPEG n02101556/
+mv val/ILSVRC2012_val_00030262.JPEG n04040759/
+mv val/ILSVRC2012_val_00030263.JPEG n02009229/
+mv val/ILSVRC2012_val_00030264.JPEG n03837869/
+mv val/ILSVRC2012_val_00030265.JPEG n04311174/
+mv val/ILSVRC2012_val_00030266.JPEG n07583066/
+mv val/ILSVRC2012_val_00030267.JPEG n02777292/
+mv val/ILSVRC2012_val_00030268.JPEG n03950228/
+mv val/ILSVRC2012_val_00030269.JPEG n02129165/
+mv val/ILSVRC2012_val_00030270.JPEG n02114548/
+mv val/ILSVRC2012_val_00030271.JPEG n02100735/
+mv val/ILSVRC2012_val_00030272.JPEG n04590129/
+mv val/ILSVRC2012_val_00030273.JPEG n03400231/
+mv val/ILSVRC2012_val_00030274.JPEG n03868242/
+mv val/ILSVRC2012_val_00030275.JPEG n02074367/
+mv val/ILSVRC2012_val_00030276.JPEG n06874185/
+mv val/ILSVRC2012_val_00030277.JPEG n04141327/
+mv val/ILSVRC2012_val_00030278.JPEG n01833805/
+mv val/ILSVRC2012_val_00030279.JPEG n09288635/
+mv val/ILSVRC2012_val_00030280.JPEG n04070727/
+mv val/ILSVRC2012_val_00030281.JPEG n02795169/
+mv val/ILSVRC2012_val_00030282.JPEG n03944341/
+mv val/ILSVRC2012_val_00030283.JPEG n01560419/
+mv val/ILSVRC2012_val_00030284.JPEG n03187595/
+mv val/ILSVRC2012_val_00030285.JPEG n02092339/
+mv val/ILSVRC2012_val_00030286.JPEG n03388043/
+mv val/ILSVRC2012_val_00030287.JPEG n03255030/
+mv val/ILSVRC2012_val_00030288.JPEG n04532670/
+mv val/ILSVRC2012_val_00030289.JPEG n02120505/
+mv val/ILSVRC2012_val_00030290.JPEG n02894605/
+mv val/ILSVRC2012_val_00030291.JPEG n02101388/
+mv val/ILSVRC2012_val_00030292.JPEG n01608432/
+mv val/ILSVRC2012_val_00030293.JPEG n03995372/
+mv val/ILSVRC2012_val_00030294.JPEG n02259212/
+mv val/ILSVRC2012_val_00030295.JPEG n03908618/
+mv val/ILSVRC2012_val_00030296.JPEG n03223299/
+mv val/ILSVRC2012_val_00030297.JPEG n02107683/
+mv val/ILSVRC2012_val_00030298.JPEG n07932039/
+mv val/ILSVRC2012_val_00030299.JPEG n03063689/
+mv val/ILSVRC2012_val_00030300.JPEG n01629819/
+mv val/ILSVRC2012_val_00030301.JPEG n03982430/
+mv val/ILSVRC2012_val_00030302.JPEG n03188531/
+mv val/ILSVRC2012_val_00030303.JPEG n01748264/
+mv val/ILSVRC2012_val_00030304.JPEG n03877472/
+mv val/ILSVRC2012_val_00030305.JPEG n02115913/
+mv val/ILSVRC2012_val_00030306.JPEG n01748264/
+mv val/ILSVRC2012_val_00030307.JPEG n04350905/
+mv val/ILSVRC2012_val_00030308.JPEG n04070727/
+mv val/ILSVRC2012_val_00030309.JPEG n02643566/
+mv val/ILSVRC2012_val_00030310.JPEG n02966193/
+mv val/ILSVRC2012_val_00030311.JPEG n01770393/
+mv val/ILSVRC2012_val_00030312.JPEG n02672831/
+mv val/ILSVRC2012_val_00030313.JPEG n02494079/
+mv val/ILSVRC2012_val_00030314.JPEG n02930766/
+mv val/ILSVRC2012_val_00030315.JPEG n03259280/
+mv val/ILSVRC2012_val_00030316.JPEG n02442845/
+mv val/ILSVRC2012_val_00030317.JPEG n03903868/
+mv val/ILSVRC2012_val_00030318.JPEG n03710721/
+mv val/ILSVRC2012_val_00030319.JPEG n02690373/
+mv val/ILSVRC2012_val_00030320.JPEG n01531178/
+mv val/ILSVRC2012_val_00030321.JPEG n01496331/
+mv val/ILSVRC2012_val_00030322.JPEG n03710721/
+mv val/ILSVRC2012_val_00030323.JPEG n02088094/
+mv val/ILSVRC2012_val_00030324.JPEG n07717556/
+mv val/ILSVRC2012_val_00030325.JPEG n03920288/
+mv val/ILSVRC2012_val_00030326.JPEG n02089078/
+mv val/ILSVRC2012_val_00030327.JPEG n02109525/
+mv val/ILSVRC2012_val_00030328.JPEG n02808304/
+mv val/ILSVRC2012_val_00030329.JPEG n03447447/
+mv val/ILSVRC2012_val_00030330.JPEG n04548280/
+mv val/ILSVRC2012_val_00030331.JPEG n02906734/
+mv val/ILSVRC2012_val_00030332.JPEG n07716358/
+mv val/ILSVRC2012_val_00030333.JPEG n01774384/
+mv val/ILSVRC2012_val_00030334.JPEG n03637318/
+mv val/ILSVRC2012_val_00030335.JPEG n02909870/
+mv val/ILSVRC2012_val_00030336.JPEG n03788195/
+mv val/ILSVRC2012_val_00030337.JPEG n02699494/
+mv val/ILSVRC2012_val_00030338.JPEG n04355338/
+mv val/ILSVRC2012_val_00030339.JPEG n02095889/
+mv val/ILSVRC2012_val_00030340.JPEG n02606052/
+mv val/ILSVRC2012_val_00030341.JPEG n03623198/
+mv val/ILSVRC2012_val_00030342.JPEG n01641577/
+mv val/ILSVRC2012_val_00030343.JPEG n01669191/
+mv val/ILSVRC2012_val_00030344.JPEG n02457408/
+mv val/ILSVRC2012_val_00030345.JPEG n03627232/
+mv val/ILSVRC2012_val_00030346.JPEG n02769748/
+mv val/ILSVRC2012_val_00030347.JPEG n04311004/
+mv val/ILSVRC2012_val_00030348.JPEG n03584254/
+mv val/ILSVRC2012_val_00030349.JPEG n03220513/
+mv val/ILSVRC2012_val_00030350.JPEG n03530642/
+mv val/ILSVRC2012_val_00030351.JPEG n04285008/
+mv val/ILSVRC2012_val_00030352.JPEG n01644373/
+mv val/ILSVRC2012_val_00030353.JPEG n09421951/
+mv val/ILSVRC2012_val_00030354.JPEG n03733281/
+mv val/ILSVRC2012_val_00030355.JPEG n03047690/
+mv val/ILSVRC2012_val_00030356.JPEG n02808304/
+mv val/ILSVRC2012_val_00030357.JPEG n03720891/
+mv val/ILSVRC2012_val_00030358.JPEG n02437616/
+mv val/ILSVRC2012_val_00030359.JPEG n07684084/
+mv val/ILSVRC2012_val_00030360.JPEG n01749939/
+mv val/ILSVRC2012_val_00030361.JPEG n04409515/
+mv val/ILSVRC2012_val_00030362.JPEG n02494079/
+mv val/ILSVRC2012_val_00030363.JPEG n02948072/
+mv val/ILSVRC2012_val_00030364.JPEG n02110806/
+mv val/ILSVRC2012_val_00030365.JPEG n02077923/
+mv val/ILSVRC2012_val_00030366.JPEG n01924916/
+mv val/ILSVRC2012_val_00030367.JPEG n01496331/
+mv val/ILSVRC2012_val_00030368.JPEG n04604644/
+mv val/ILSVRC2012_val_00030369.JPEG n02667093/
+mv val/ILSVRC2012_val_00030370.JPEG n02107142/
+mv val/ILSVRC2012_val_00030371.JPEG n01692333/
+mv val/ILSVRC2012_val_00030372.JPEG n04277352/
+mv val/ILSVRC2012_val_00030373.JPEG n04254777/
+mv val/ILSVRC2012_val_00030374.JPEG n02676566/
+mv val/ILSVRC2012_val_00030375.JPEG n12144580/
+mv val/ILSVRC2012_val_00030376.JPEG n03630383/
+mv val/ILSVRC2012_val_00030377.JPEG n02095889/
+mv val/ILSVRC2012_val_00030378.JPEG n03666591/
+mv val/ILSVRC2012_val_00030379.JPEG n03937543/
+mv val/ILSVRC2012_val_00030380.JPEG n01498041/
+mv val/ILSVRC2012_val_00030381.JPEG n03272562/
+mv val/ILSVRC2012_val_00030382.JPEG n09472597/
+mv val/ILSVRC2012_val_00030383.JPEG n03223299/
+mv val/ILSVRC2012_val_00030384.JPEG n04456115/
+mv val/ILSVRC2012_val_00030385.JPEG n02099601/
+mv val/ILSVRC2012_val_00030386.JPEG n03000134/
+mv val/ILSVRC2012_val_00030387.JPEG n02951585/
+mv val/ILSVRC2012_val_00030388.JPEG n03717622/
+mv val/ILSVRC2012_val_00030389.JPEG n01910747/
+mv val/ILSVRC2012_val_00030390.JPEG n06596364/
+mv val/ILSVRC2012_val_00030391.JPEG n01820546/
+mv val/ILSVRC2012_val_00030392.JPEG n02018795/
+mv val/ILSVRC2012_val_00030393.JPEG n04264628/
+mv val/ILSVRC2012_val_00030394.JPEG n02096177/
+mv val/ILSVRC2012_val_00030395.JPEG n01944390/
+mv val/ILSVRC2012_val_00030396.JPEG n01978287/
+mv val/ILSVRC2012_val_00030397.JPEG n01818515/
+mv val/ILSVRC2012_val_00030398.JPEG n03125729/
+mv val/ILSVRC2012_val_00030399.JPEG n02093256/
+mv val/ILSVRC2012_val_00030400.JPEG n01855032/
+mv val/ILSVRC2012_val_00030401.JPEG n02009912/
+mv val/ILSVRC2012_val_00030402.JPEG n02097047/
+mv val/ILSVRC2012_val_00030403.JPEG n02113712/
+mv val/ILSVRC2012_val_00030404.JPEG n01883070/
+mv val/ILSVRC2012_val_00030405.JPEG n01774750/
+mv val/ILSVRC2012_val_00030406.JPEG n01665541/
+mv val/ILSVRC2012_val_00030407.JPEG n02093428/
+mv val/ILSVRC2012_val_00030408.JPEG n01980166/
+mv val/ILSVRC2012_val_00030409.JPEG n04392985/
+mv val/ILSVRC2012_val_00030410.JPEG n03947888/
+mv val/ILSVRC2012_val_00030411.JPEG n02690373/
+mv val/ILSVRC2012_val_00030412.JPEG n02090721/
+mv val/ILSVRC2012_val_00030413.JPEG n04023962/
+mv val/ILSVRC2012_val_00030414.JPEG n03476684/
+mv val/ILSVRC2012_val_00030415.JPEG n04389033/
+mv val/ILSVRC2012_val_00030416.JPEG n03729826/
+mv val/ILSVRC2012_val_00030417.JPEG n02910353/
+mv val/ILSVRC2012_val_00030418.JPEG n01632458/
+mv val/ILSVRC2012_val_00030419.JPEG n02167151/
+mv val/ILSVRC2012_val_00030420.JPEG n02676566/
+mv val/ILSVRC2012_val_00030421.JPEG n03045698/
+mv val/ILSVRC2012_val_00030422.JPEG n01770081/
+mv val/ILSVRC2012_val_00030423.JPEG n04238763/
+mv val/ILSVRC2012_val_00030424.JPEG n10148035/
+mv val/ILSVRC2012_val_00030425.JPEG n04344873/
+mv val/ILSVRC2012_val_00030426.JPEG n02481823/
+mv val/ILSVRC2012_val_00030427.JPEG n04467665/
+mv val/ILSVRC2012_val_00030428.JPEG n02013706/
+mv val/ILSVRC2012_val_00030429.JPEG n02088238/
+mv val/ILSVRC2012_val_00030430.JPEG n02877765/
+mv val/ILSVRC2012_val_00030431.JPEG n01833805/
+mv val/ILSVRC2012_val_00030432.JPEG n07718747/
+mv val/ILSVRC2012_val_00030433.JPEG n02091467/
+mv val/ILSVRC2012_val_00030434.JPEG n03627232/
+mv val/ILSVRC2012_val_00030435.JPEG n04141076/
+mv val/ILSVRC2012_val_00030436.JPEG n04209239/
+mv val/ILSVRC2012_val_00030437.JPEG n01950731/
+mv val/ILSVRC2012_val_00030438.JPEG n04467665/
+mv val/ILSVRC2012_val_00030439.JPEG n03976657/
+mv val/ILSVRC2012_val_00030440.JPEG n03729826/
+mv val/ILSVRC2012_val_00030441.JPEG n04398044/
+mv val/ILSVRC2012_val_00030442.JPEG n07754684/
+mv val/ILSVRC2012_val_00030443.JPEG n04465501/
+mv val/ILSVRC2012_val_00030444.JPEG n01776313/
+mv val/ILSVRC2012_val_00030445.JPEG n02111129/
+mv val/ILSVRC2012_val_00030446.JPEG n03207743/
+mv val/ILSVRC2012_val_00030447.JPEG n03201208/
+mv val/ILSVRC2012_val_00030448.JPEG n01847000/
+mv val/ILSVRC2012_val_00030449.JPEG n02085936/
+mv val/ILSVRC2012_val_00030450.JPEG n03710721/
+mv val/ILSVRC2012_val_00030451.JPEG n04599235/
+mv val/ILSVRC2012_val_00030452.JPEG n02817516/
+mv val/ILSVRC2012_val_00030453.JPEG n02807133/
+mv val/ILSVRC2012_val_00030454.JPEG n04389033/
+mv val/ILSVRC2012_val_00030455.JPEG n02840245/
+mv val/ILSVRC2012_val_00030456.JPEG n04423845/
+mv val/ILSVRC2012_val_00030457.JPEG n07718472/
+mv val/ILSVRC2012_val_00030458.JPEG n02356798/
+mv val/ILSVRC2012_val_00030459.JPEG n02167151/
+mv val/ILSVRC2012_val_00030460.JPEG n02966687/
+mv val/ILSVRC2012_val_00030461.JPEG n02790996/
+mv val/ILSVRC2012_val_00030462.JPEG n02840245/
+mv val/ILSVRC2012_val_00030463.JPEG n02342885/
+mv val/ILSVRC2012_val_00030464.JPEG n02437312/
+mv val/ILSVRC2012_val_00030465.JPEG n07716906/
+mv val/ILSVRC2012_val_00030466.JPEG n02233338/
+mv val/ILSVRC2012_val_00030467.JPEG n03379051/
+mv val/ILSVRC2012_val_00030468.JPEG n01990800/
+mv val/ILSVRC2012_val_00030469.JPEG n02443114/
+mv val/ILSVRC2012_val_00030470.JPEG n01498041/
+mv val/ILSVRC2012_val_00030471.JPEG n03337140/
+mv val/ILSVRC2012_val_00030472.JPEG n02165105/
+mv val/ILSVRC2012_val_00030473.JPEG n04525305/
+mv val/ILSVRC2012_val_00030474.JPEG n02226429/
+mv val/ILSVRC2012_val_00030475.JPEG n01558993/
+mv val/ILSVRC2012_val_00030476.JPEG n02110341/
+mv val/ILSVRC2012_val_00030477.JPEG n04069434/
+mv val/ILSVRC2012_val_00030478.JPEG n01644900/
+mv val/ILSVRC2012_val_00030479.JPEG n02096177/
+mv val/ILSVRC2012_val_00030480.JPEG n04347754/
+mv val/ILSVRC2012_val_00030481.JPEG n03127747/
+mv val/ILSVRC2012_val_00030482.JPEG n02106382/
+mv val/ILSVRC2012_val_00030483.JPEG n01608432/
+mv val/ILSVRC2012_val_00030484.JPEG n02412080/
+mv val/ILSVRC2012_val_00030485.JPEG n02134084/
+mv val/ILSVRC2012_val_00030486.JPEG n04486054/
+mv val/ILSVRC2012_val_00030487.JPEG n04026417/
+mv val/ILSVRC2012_val_00030488.JPEG n02437616/
+mv val/ILSVRC2012_val_00030489.JPEG n04081281/
+mv val/ILSVRC2012_val_00030490.JPEG n04417672/
+mv val/ILSVRC2012_val_00030491.JPEG n02018207/
+mv val/ILSVRC2012_val_00030492.JPEG n03018349/
+mv val/ILSVRC2012_val_00030493.JPEG n03595614/
+mv val/ILSVRC2012_val_00030494.JPEG n02120079/
+mv val/ILSVRC2012_val_00030495.JPEG n03388183/
+mv val/ILSVRC2012_val_00030496.JPEG n03902125/
+mv val/ILSVRC2012_val_00030497.JPEG n02403003/
+mv val/ILSVRC2012_val_00030498.JPEG n03933933/
+mv val/ILSVRC2012_val_00030499.JPEG n09193705/
+mv val/ILSVRC2012_val_00030500.JPEG n01872401/
+mv val/ILSVRC2012_val_00030501.JPEG n03534580/
+mv val/ILSVRC2012_val_00030502.JPEG n02129165/
+mv val/ILSVRC2012_val_00030503.JPEG n03710193/
+mv val/ILSVRC2012_val_00030504.JPEG n01981276/
+mv val/ILSVRC2012_val_00030505.JPEG n02259212/
+mv val/ILSVRC2012_val_00030506.JPEG n07873807/
+mv val/ILSVRC2012_val_00030507.JPEG n01843065/
+mv val/ILSVRC2012_val_00030508.JPEG n02457408/
+mv val/ILSVRC2012_val_00030509.JPEG n02837789/
+mv val/ILSVRC2012_val_00030510.JPEG n02177972/
+mv val/ILSVRC2012_val_00030511.JPEG n02951585/
+mv val/ILSVRC2012_val_00030512.JPEG n02101006/
+mv val/ILSVRC2012_val_00030513.JPEG n02965783/
+mv val/ILSVRC2012_val_00030514.JPEG n04482393/
+mv val/ILSVRC2012_val_00030515.JPEG n01616318/
+mv val/ILSVRC2012_val_00030516.JPEG n04465501/
+mv val/ILSVRC2012_val_00030517.JPEG n03485407/
+mv val/ILSVRC2012_val_00030518.JPEG n02086646/
+mv val/ILSVRC2012_val_00030519.JPEG n02085620/
+mv val/ILSVRC2012_val_00030520.JPEG n02361337/
+mv val/ILSVRC2012_val_00030521.JPEG n01753488/
+mv val/ILSVRC2012_val_00030522.JPEG n04579145/
+mv val/ILSVRC2012_val_00030523.JPEG n01682714/
+mv val/ILSVRC2012_val_00030524.JPEG n02105641/
+mv val/ILSVRC2012_val_00030525.JPEG n04065272/
+mv val/ILSVRC2012_val_00030526.JPEG n01968897/
+mv val/ILSVRC2012_val_00030527.JPEG n02102973/
+mv val/ILSVRC2012_val_00030528.JPEG n12144580/
+mv val/ILSVRC2012_val_00030529.JPEG n04372370/
+mv val/ILSVRC2012_val_00030530.JPEG n02127052/
+mv val/ILSVRC2012_val_00030531.JPEG n02690373/
+mv val/ILSVRC2012_val_00030532.JPEG n02895154/
+mv val/ILSVRC2012_val_00030533.JPEG n04049303/
+mv val/ILSVRC2012_val_00030534.JPEG n03676483/
+mv val/ILSVRC2012_val_00030535.JPEG n02268443/
+mv val/ILSVRC2012_val_00030536.JPEG n02869837/
+mv val/ILSVRC2012_val_00030537.JPEG n02206856/
+mv val/ILSVRC2012_val_00030538.JPEG n04201297/
+mv val/ILSVRC2012_val_00030539.JPEG n02091244/
+mv val/ILSVRC2012_val_00030540.JPEG n02101556/
+mv val/ILSVRC2012_val_00030541.JPEG n02843684/
+mv val/ILSVRC2012_val_00030542.JPEG n04380533/
+mv val/ILSVRC2012_val_00030543.JPEG n07753275/
+mv val/ILSVRC2012_val_00030544.JPEG n01534433/
+mv val/ILSVRC2012_val_00030545.JPEG n02027492/
+mv val/ILSVRC2012_val_00030546.JPEG n02971356/
+mv val/ILSVRC2012_val_00030547.JPEG n04118538/
+mv val/ILSVRC2012_val_00030548.JPEG n03384352/
+mv val/ILSVRC2012_val_00030549.JPEG n03444034/
+mv val/ILSVRC2012_val_00030550.JPEG n03676483/
+mv val/ILSVRC2012_val_00030551.JPEG n03495258/
+mv val/ILSVRC2012_val_00030552.JPEG n02666196/
+mv val/ILSVRC2012_val_00030553.JPEG n01756291/
+mv val/ILSVRC2012_val_00030554.JPEG n03482405/
+mv val/ILSVRC2012_val_00030555.JPEG n02098413/
+mv val/ILSVRC2012_val_00030556.JPEG n04355933/
+mv val/ILSVRC2012_val_00030557.JPEG n03841143/
+mv val/ILSVRC2012_val_00030558.JPEG n02120079/
+mv val/ILSVRC2012_val_00030559.JPEG n02417914/
+mv val/ILSVRC2012_val_00030560.JPEG n03857828/
+mv val/ILSVRC2012_val_00030561.JPEG n02114712/
+mv val/ILSVRC2012_val_00030562.JPEG n01729977/
+mv val/ILSVRC2012_val_00030563.JPEG n01770081/
+mv val/ILSVRC2012_val_00030564.JPEG n03733131/
+mv val/ILSVRC2012_val_00030565.JPEG n03793489/
+mv val/ILSVRC2012_val_00030566.JPEG n03590841/
+mv val/ILSVRC2012_val_00030567.JPEG n02088364/
+mv val/ILSVRC2012_val_00030568.JPEG n01847000/
+mv val/ILSVRC2012_val_00030569.JPEG n11939491/
+mv val/ILSVRC2012_val_00030570.JPEG n03724870/
+mv val/ILSVRC2012_val_00030571.JPEG n02025239/
+mv val/ILSVRC2012_val_00030572.JPEG n07717556/
+mv val/ILSVRC2012_val_00030573.JPEG n02119789/
+mv val/ILSVRC2012_val_00030574.JPEG n03016953/
+mv val/ILSVRC2012_val_00030575.JPEG n02129165/
+mv val/ILSVRC2012_val_00030576.JPEG n04033901/
+mv val/ILSVRC2012_val_00030577.JPEG n02790996/
+mv val/ILSVRC2012_val_00030578.JPEG n02012849/
+mv val/ILSVRC2012_val_00030579.JPEG n02099429/
+mv val/ILSVRC2012_val_00030580.JPEG n03691459/
+mv val/ILSVRC2012_val_00030581.JPEG n04330267/
+mv val/ILSVRC2012_val_00030582.JPEG n10148035/
+mv val/ILSVRC2012_val_00030583.JPEG n03888257/
+mv val/ILSVRC2012_val_00030584.JPEG n07584110/
+mv val/ILSVRC2012_val_00030585.JPEG n02096437/
+mv val/ILSVRC2012_val_00030586.JPEG n04515003/
+mv val/ILSVRC2012_val_00030587.JPEG n02804610/
+mv val/ILSVRC2012_val_00030588.JPEG n02096437/
+mv val/ILSVRC2012_val_00030589.JPEG n04418357/
+mv val/ILSVRC2012_val_00030590.JPEG n02033041/
+mv val/ILSVRC2012_val_00030591.JPEG n02092339/
+mv val/ILSVRC2012_val_00030592.JPEG n12620546/
+mv val/ILSVRC2012_val_00030593.JPEG n01669191/
+mv val/ILSVRC2012_val_00030594.JPEG n03160309/
+mv val/ILSVRC2012_val_00030595.JPEG n02112137/
+mv val/ILSVRC2012_val_00030596.JPEG n02172182/
+mv val/ILSVRC2012_val_00030597.JPEG n03110669/
+mv val/ILSVRC2012_val_00030598.JPEG n04380533/
+mv val/ILSVRC2012_val_00030599.JPEG n03673027/
+mv val/ILSVRC2012_val_00030600.JPEG n03347037/
+mv val/ILSVRC2012_val_00030601.JPEG n04201297/
+mv val/ILSVRC2012_val_00030602.JPEG n02492660/
+mv val/ILSVRC2012_val_00030603.JPEG n02110958/
+mv val/ILSVRC2012_val_00030604.JPEG n02783161/
+mv val/ILSVRC2012_val_00030605.JPEG n02483708/
+mv val/ILSVRC2012_val_00030606.JPEG n02110958/
+mv val/ILSVRC2012_val_00030607.JPEG n04120489/
+mv val/ILSVRC2012_val_00030608.JPEG n03908618/
+mv val/ILSVRC2012_val_00030609.JPEG n02423022/
+mv val/ILSVRC2012_val_00030610.JPEG n04350905/
+mv val/ILSVRC2012_val_00030611.JPEG n04153751/
+mv val/ILSVRC2012_val_00030612.JPEG n02444819/
+mv val/ILSVRC2012_val_00030613.JPEG n02114548/
+mv val/ILSVRC2012_val_00030614.JPEG n07747607/
+mv val/ILSVRC2012_val_00030615.JPEG n07614500/
+mv val/ILSVRC2012_val_00030616.JPEG n04070727/
+mv val/ILSVRC2012_val_00030617.JPEG n04074963/
+mv val/ILSVRC2012_val_00030618.JPEG n01616318/
+mv val/ILSVRC2012_val_00030619.JPEG n02112706/
+mv val/ILSVRC2012_val_00030620.JPEG n02096437/
+mv val/ILSVRC2012_val_00030621.JPEG n04228054/
+mv val/ILSVRC2012_val_00030622.JPEG n01644900/
+mv val/ILSVRC2012_val_00030623.JPEG n01756291/
+mv val/ILSVRC2012_val_00030624.JPEG n02442845/
+mv val/ILSVRC2012_val_00030625.JPEG n03980874/
+mv val/ILSVRC2012_val_00030626.JPEG n02441942/
+mv val/ILSVRC2012_val_00030627.JPEG n04149813/
+mv val/ILSVRC2012_val_00030628.JPEG n03950228/
+mv val/ILSVRC2012_val_00030629.JPEG n01843383/
+mv val/ILSVRC2012_val_00030630.JPEG n02910353/
+mv val/ILSVRC2012_val_00030631.JPEG n03207743/
+mv val/ILSVRC2012_val_00030632.JPEG n04263257/
+mv val/ILSVRC2012_val_00030633.JPEG n02099429/
+mv val/ILSVRC2012_val_00030634.JPEG n04486054/
+mv val/ILSVRC2012_val_00030635.JPEG n02606052/
+mv val/ILSVRC2012_val_00030636.JPEG n04238763/
+mv val/ILSVRC2012_val_00030637.JPEG n02099601/
+mv val/ILSVRC2012_val_00030638.JPEG n02177972/
+mv val/ILSVRC2012_val_00030639.JPEG n03584829/
+mv val/ILSVRC2012_val_00030640.JPEG n04356056/
+mv val/ILSVRC2012_val_00030641.JPEG n03673027/
+mv val/ILSVRC2012_val_00030642.JPEG n02086646/
+mv val/ILSVRC2012_val_00030643.JPEG n04485082/
+mv val/ILSVRC2012_val_00030644.JPEG n02692877/
+mv val/ILSVRC2012_val_00030645.JPEG n03761084/
+mv val/ILSVRC2012_val_00030646.JPEG n03249569/
+mv val/ILSVRC2012_val_00030647.JPEG n04252077/
+mv val/ILSVRC2012_val_00030648.JPEG n02092339/
+mv val/ILSVRC2012_val_00030649.JPEG n01770081/
+mv val/ILSVRC2012_val_00030650.JPEG n02877765/
+mv val/ILSVRC2012_val_00030651.JPEG n02129604/
+mv val/ILSVRC2012_val_00030652.JPEG n03032252/
+mv val/ILSVRC2012_val_00030653.JPEG n13044778/
+mv val/ILSVRC2012_val_00030654.JPEG n02607072/
+mv val/ILSVRC2012_val_00030655.JPEG n03498962/
+mv val/ILSVRC2012_val_00030656.JPEG n02120505/
+mv val/ILSVRC2012_val_00030657.JPEG n01534433/
+mv val/ILSVRC2012_val_00030658.JPEG n01491361/
+mv val/ILSVRC2012_val_00030659.JPEG n07730033/
+mv val/ILSVRC2012_val_00030660.JPEG n02098413/
+mv val/ILSVRC2012_val_00030661.JPEG n02793495/
+mv val/ILSVRC2012_val_00030662.JPEG n02017213/
+mv val/ILSVRC2012_val_00030663.JPEG n02100877/
+mv val/ILSVRC2012_val_00030664.JPEG n02948072/
+mv val/ILSVRC2012_val_00030665.JPEG n02398521/
+mv val/ILSVRC2012_val_00030666.JPEG n03498962/
+mv val/ILSVRC2012_val_00030667.JPEG n02494079/
+mv val/ILSVRC2012_val_00030668.JPEG n04026417/
+mv val/ILSVRC2012_val_00030669.JPEG n03259280/
+mv val/ILSVRC2012_val_00030670.JPEG n04209133/
+mv val/ILSVRC2012_val_00030671.JPEG n02094258/
+mv val/ILSVRC2012_val_00030672.JPEG n02028035/
+mv val/ILSVRC2012_val_00030673.JPEG n03627232/
+mv val/ILSVRC2012_val_00030674.JPEG n03529860/
+mv val/ILSVRC2012_val_00030675.JPEG n02077923/
+mv val/ILSVRC2012_val_00030676.JPEG n03843555/
+mv val/ILSVRC2012_val_00030677.JPEG n03873416/
+mv val/ILSVRC2012_val_00030678.JPEG n02116738/
+mv val/ILSVRC2012_val_00030679.JPEG n03995372/
+mv val/ILSVRC2012_val_00030680.JPEG n02104365/
+mv val/ILSVRC2012_val_00030681.JPEG n04347754/
+mv val/ILSVRC2012_val_00030682.JPEG n04590129/
+mv val/ILSVRC2012_val_00030683.JPEG n03657121/
+mv val/ILSVRC2012_val_00030684.JPEG n01774384/
+mv val/ILSVRC2012_val_00030685.JPEG n03937543/
+mv val/ILSVRC2012_val_00030686.JPEG n07836838/
+mv val/ILSVRC2012_val_00030687.JPEG n04127249/
+mv val/ILSVRC2012_val_00030688.JPEG n02391049/
+mv val/ILSVRC2012_val_00030689.JPEG n04296562/
+mv val/ILSVRC2012_val_00030690.JPEG n02492035/
+mv val/ILSVRC2012_val_00030691.JPEG n04254120/
+mv val/ILSVRC2012_val_00030692.JPEG n04201297/
+mv val/ILSVRC2012_val_00030693.JPEG n02115641/
+mv val/ILSVRC2012_val_00030694.JPEG n02094258/
+mv val/ILSVRC2012_val_00030695.JPEG n03729826/
+mv val/ILSVRC2012_val_00030696.JPEG n02090379/
+mv val/ILSVRC2012_val_00030697.JPEG n02165456/
+mv val/ILSVRC2012_val_00030698.JPEG n02107142/
+mv val/ILSVRC2012_val_00030699.JPEG n01518878/
+mv val/ILSVRC2012_val_00030700.JPEG n03649909/
+mv val/ILSVRC2012_val_00030701.JPEG n01558993/
+mv val/ILSVRC2012_val_00030702.JPEG n01843383/
+mv val/ILSVRC2012_val_00030703.JPEG n01695060/
+mv val/ILSVRC2012_val_00030704.JPEG n02134084/
+mv val/ILSVRC2012_val_00030705.JPEG n02101556/
+mv val/ILSVRC2012_val_00030706.JPEG n02123045/
+mv val/ILSVRC2012_val_00030707.JPEG n03929855/
+mv val/ILSVRC2012_val_00030708.JPEG n02110185/
+mv val/ILSVRC2012_val_00030709.JPEG n03291819/
+mv val/ILSVRC2012_val_00030710.JPEG n02099601/
+mv val/ILSVRC2012_val_00030711.JPEG n04443257/
+mv val/ILSVRC2012_val_00030712.JPEG n02487347/
+mv val/ILSVRC2012_val_00030713.JPEG n01795545/
+mv val/ILSVRC2012_val_00030714.JPEG n04458633/
+mv val/ILSVRC2012_val_00030715.JPEG n02229544/
+mv val/ILSVRC2012_val_00030716.JPEG n03325584/
+mv val/ILSVRC2012_val_00030717.JPEG n04086273/
+mv val/ILSVRC2012_val_00030718.JPEG n03017168/
+mv val/ILSVRC2012_val_00030719.JPEG n01729977/
+mv val/ILSVRC2012_val_00030720.JPEG n03388043/
+mv val/ILSVRC2012_val_00030721.JPEG n01675722/
+mv val/ILSVRC2012_val_00030722.JPEG n02009229/
+mv val/ILSVRC2012_val_00030723.JPEG n03126707/
+mv val/ILSVRC2012_val_00030724.JPEG n02117135/
+mv val/ILSVRC2012_val_00030725.JPEG n03873416/
+mv val/ILSVRC2012_val_00030726.JPEG n04332243/
+mv val/ILSVRC2012_val_00030727.JPEG n02486410/
+mv val/ILSVRC2012_val_00030728.JPEG n03394916/
+mv val/ILSVRC2012_val_00030729.JPEG n02480855/
+mv val/ILSVRC2012_val_00030730.JPEG n02837789/
+mv val/ILSVRC2012_val_00030731.JPEG n03018349/
+mv val/ILSVRC2012_val_00030732.JPEG n03998194/
+mv val/ILSVRC2012_val_00030733.JPEG n04317175/
+mv val/ILSVRC2012_val_00030734.JPEG n01819313/
+mv val/ILSVRC2012_val_00030735.JPEG n03291819/
+mv val/ILSVRC2012_val_00030736.JPEG n01664065/
+mv val/ILSVRC2012_val_00030737.JPEG n02128385/
+mv val/ILSVRC2012_val_00030738.JPEG n02417914/
+mv val/ILSVRC2012_val_00030739.JPEG n04040759/
+mv val/ILSVRC2012_val_00030740.JPEG n01440764/
+mv val/ILSVRC2012_val_00030741.JPEG n09468604/
+mv val/ILSVRC2012_val_00030742.JPEG n03240683/
+mv val/ILSVRC2012_val_00030743.JPEG n07248320/
+mv val/ILSVRC2012_val_00030744.JPEG n11939491/
+mv val/ILSVRC2012_val_00030745.JPEG n02971356/
+mv val/ILSVRC2012_val_00030746.JPEG n02096437/
+mv val/ILSVRC2012_val_00030747.JPEG n02101556/
+mv val/ILSVRC2012_val_00030748.JPEG n04467665/
+mv val/ILSVRC2012_val_00030749.JPEG n03983396/
+mv val/ILSVRC2012_val_00030750.JPEG n04146614/
+mv val/ILSVRC2012_val_00030751.JPEG n04252077/
+mv val/ILSVRC2012_val_00030752.JPEG n03476684/
+mv val/ILSVRC2012_val_00030753.JPEG n02777292/
+mv val/ILSVRC2012_val_00030754.JPEG n03617480/
+mv val/ILSVRC2012_val_00030755.JPEG n04004767/
+mv val/ILSVRC2012_val_00030756.JPEG n02102177/
+mv val/ILSVRC2012_val_00030757.JPEG n02088632/
+mv val/ILSVRC2012_val_00030758.JPEG n07749582/
+mv val/ILSVRC2012_val_00030759.JPEG n04264628/
+mv val/ILSVRC2012_val_00030760.JPEG n04487081/
+mv val/ILSVRC2012_val_00030761.JPEG n02808440/
+mv val/ILSVRC2012_val_00030762.JPEG n04399382/
+mv val/ILSVRC2012_val_00030763.JPEG n03961711/
+mv val/ILSVRC2012_val_00030764.JPEG n04229816/
+mv val/ILSVRC2012_val_00030765.JPEG n03977966/
+mv val/ILSVRC2012_val_00030766.JPEG n03133878/
+mv val/ILSVRC2012_val_00030767.JPEG n03877845/
+mv val/ILSVRC2012_val_00030768.JPEG n03995372/
+mv val/ILSVRC2012_val_00030769.JPEG n04131690/
+mv val/ILSVRC2012_val_00030770.JPEG n02093754/
+mv val/ILSVRC2012_val_00030771.JPEG n02110806/
+mv val/ILSVRC2012_val_00030772.JPEG n01872401/
+mv val/ILSVRC2012_val_00030773.JPEG n02106662/
+mv val/ILSVRC2012_val_00030774.JPEG n07836838/
+mv val/ILSVRC2012_val_00030775.JPEG n04553703/
+mv val/ILSVRC2012_val_00030776.JPEG n02095314/
+mv val/ILSVRC2012_val_00030777.JPEG n12620546/
+mv val/ILSVRC2012_val_00030778.JPEG n02231487/
+mv val/ILSVRC2012_val_00030779.JPEG n02277742/
+mv val/ILSVRC2012_val_00030780.JPEG n04456115/
+mv val/ILSVRC2012_val_00030781.JPEG n02643566/
+mv val/ILSVRC2012_val_00030782.JPEG n02317335/
+mv val/ILSVRC2012_val_00030783.JPEG n04008634/
+mv val/ILSVRC2012_val_00030784.JPEG n04476259/
+mv val/ILSVRC2012_val_00030785.JPEG n04550184/
+mv val/ILSVRC2012_val_00030786.JPEG n02107908/
+mv val/ILSVRC2012_val_00030787.JPEG n02125311/
+mv val/ILSVRC2012_val_00030788.JPEG n03355925/
+mv val/ILSVRC2012_val_00030789.JPEG n03769881/
+mv val/ILSVRC2012_val_00030790.JPEG n07615774/
+mv val/ILSVRC2012_val_00030791.JPEG n02443114/
+mv val/ILSVRC2012_val_00030792.JPEG n02167151/
+mv val/ILSVRC2012_val_00030793.JPEG n04590129/
+mv val/ILSVRC2012_val_00030794.JPEG n12620546/
+mv val/ILSVRC2012_val_00030795.JPEG n02177972/
+mv val/ILSVRC2012_val_00030796.JPEG n03866082/
+mv val/ILSVRC2012_val_00030797.JPEG n07718472/
+mv val/ILSVRC2012_val_00030798.JPEG n02102318/
+mv val/ILSVRC2012_val_00030799.JPEG n07697313/
+mv val/ILSVRC2012_val_00030800.JPEG n03384352/
+mv val/ILSVRC2012_val_00030801.JPEG n04330267/
+mv val/ILSVRC2012_val_00030802.JPEG n03874293/
+mv val/ILSVRC2012_val_00030803.JPEG n03895866/
+mv val/ILSVRC2012_val_00030804.JPEG n02444819/
+mv val/ILSVRC2012_val_00030805.JPEG n03908714/
+mv val/ILSVRC2012_val_00030806.JPEG n02395406/
+mv val/ILSVRC2012_val_00030807.JPEG n04355933/
+mv val/ILSVRC2012_val_00030808.JPEG n03220513/
+mv val/ILSVRC2012_val_00030809.JPEG n04147183/
+mv val/ILSVRC2012_val_00030810.JPEG n02099267/
+mv val/ILSVRC2012_val_00030811.JPEG n01983481/
+mv val/ILSVRC2012_val_00030812.JPEG n01770081/
+mv val/ILSVRC2012_val_00030813.JPEG n02095570/
+mv val/ILSVRC2012_val_00030814.JPEG n01695060/
+mv val/ILSVRC2012_val_00030815.JPEG n02115641/
+mv val/ILSVRC2012_val_00030816.JPEG n04355338/
+mv val/ILSVRC2012_val_00030817.JPEG n07584110/
+mv val/ILSVRC2012_val_00030818.JPEG n02843684/
+mv val/ILSVRC2012_val_00030819.JPEG n04023962/
+mv val/ILSVRC2012_val_00030820.JPEG n02102480/
+mv val/ILSVRC2012_val_00030821.JPEG n04116512/
+mv val/ILSVRC2012_val_00030822.JPEG n02094258/
+mv val/ILSVRC2012_val_00030823.JPEG n04326547/
+mv val/ILSVRC2012_val_00030824.JPEG n02951358/
+mv val/ILSVRC2012_val_00030825.JPEG n01784675/
+mv val/ILSVRC2012_val_00030826.JPEG n03494278/
+mv val/ILSVRC2012_val_00030827.JPEG n03935335/
+mv val/ILSVRC2012_val_00030828.JPEG n02106662/
+mv val/ILSVRC2012_val_00030829.JPEG n02256656/
+mv val/ILSVRC2012_val_00030830.JPEG n03944341/
+mv val/ILSVRC2012_val_00030831.JPEG n02105641/
+mv val/ILSVRC2012_val_00030832.JPEG n02666196/
+mv val/ILSVRC2012_val_00030833.JPEG n03982430/
+mv val/ILSVRC2012_val_00030834.JPEG n02814533/
+mv val/ILSVRC2012_val_00030835.JPEG n04204238/
+mv val/ILSVRC2012_val_00030836.JPEG n07730033/
+mv val/ILSVRC2012_val_00030837.JPEG n01807496/
+mv val/ILSVRC2012_val_00030838.JPEG n03042490/
+mv val/ILSVRC2012_val_00030839.JPEG n02963159/
+mv val/ILSVRC2012_val_00030840.JPEG n02504458/
+mv val/ILSVRC2012_val_00030841.JPEG n03535780/
+mv val/ILSVRC2012_val_00030842.JPEG n04355933/
+mv val/ILSVRC2012_val_00030843.JPEG n02009229/
+mv val/ILSVRC2012_val_00030844.JPEG n02423022/
+mv val/ILSVRC2012_val_00030845.JPEG n01582220/
+mv val/ILSVRC2012_val_00030846.JPEG n07614500/
+mv val/ILSVRC2012_val_00030847.JPEG n02321529/
+mv val/ILSVRC2012_val_00030848.JPEG n03272562/
+mv val/ILSVRC2012_val_00030849.JPEG n03642806/
+mv val/ILSVRC2012_val_00030850.JPEG n04251144/
+mv val/ILSVRC2012_val_00030851.JPEG n02115913/
+mv val/ILSVRC2012_val_00030852.JPEG n02107312/
+mv val/ILSVRC2012_val_00030853.JPEG n03924679/
+mv val/ILSVRC2012_val_00030854.JPEG n02699494/
+mv val/ILSVRC2012_val_00030855.JPEG n03908714/
+mv val/ILSVRC2012_val_00030856.JPEG n04522168/
+mv val/ILSVRC2012_val_00030857.JPEG n09246464/
+mv val/ILSVRC2012_val_00030858.JPEG n03617480/
+mv val/ILSVRC2012_val_00030859.JPEG n02231487/
+mv val/ILSVRC2012_val_00030860.JPEG n02127052/
+mv val/ILSVRC2012_val_00030861.JPEG n04335435/
+mv val/ILSVRC2012_val_00030862.JPEG n02804610/
+mv val/ILSVRC2012_val_00030863.JPEG n02437616/
+mv val/ILSVRC2012_val_00030864.JPEG n03249569/
+mv val/ILSVRC2012_val_00030865.JPEG n01682714/
+mv val/ILSVRC2012_val_00030866.JPEG n02790996/
+mv val/ILSVRC2012_val_00030867.JPEG n03742115/
+mv val/ILSVRC2012_val_00030868.JPEG n02112350/
+mv val/ILSVRC2012_val_00030869.JPEG n02837789/
+mv val/ILSVRC2012_val_00030870.JPEG n04371774/
+mv val/ILSVRC2012_val_00030871.JPEG n03443371/
+mv val/ILSVRC2012_val_00030872.JPEG n02992529/
+mv val/ILSVRC2012_val_00030873.JPEG n01688243/
+mv val/ILSVRC2012_val_00030874.JPEG n03733281/
+mv val/ILSVRC2012_val_00030875.JPEG n07875152/
+mv val/ILSVRC2012_val_00030876.JPEG n02105641/
+mv val/ILSVRC2012_val_00030877.JPEG n02110958/
+mv val/ILSVRC2012_val_00030878.JPEG n02018795/
+mv val/ILSVRC2012_val_00030879.JPEG n04482393/
+mv val/ILSVRC2012_val_00030880.JPEG n03063689/
+mv val/ILSVRC2012_val_00030881.JPEG n02328150/
+mv val/ILSVRC2012_val_00030882.JPEG n02109525/
+mv val/ILSVRC2012_val_00030883.JPEG n02071294/
+mv val/ILSVRC2012_val_00030884.JPEG n02808304/
+mv val/ILSVRC2012_val_00030885.JPEG n03530642/
+mv val/ILSVRC2012_val_00030886.JPEG n03970156/
+mv val/ILSVRC2012_val_00030887.JPEG n01860187/
+mv val/ILSVRC2012_val_00030888.JPEG n02102973/
+mv val/ILSVRC2012_val_00030889.JPEG n03220513/
+mv val/ILSVRC2012_val_00030890.JPEG n03032252/
+mv val/ILSVRC2012_val_00030891.JPEG n01797886/
+mv val/ILSVRC2012_val_00030892.JPEG n03792782/
+mv val/ILSVRC2012_val_00030893.JPEG n02085936/
+mv val/ILSVRC2012_val_00030894.JPEG n04487394/
+mv val/ILSVRC2012_val_00030895.JPEG n02790996/
+mv val/ILSVRC2012_val_00030896.JPEG n01773157/
+mv val/ILSVRC2012_val_00030897.JPEG n04367480/
+mv val/ILSVRC2012_val_00030898.JPEG n03290653/
+mv val/ILSVRC2012_val_00030899.JPEG n03478589/
+mv val/ILSVRC2012_val_00030900.JPEG n04542943/
+mv val/ILSVRC2012_val_00030901.JPEG n07579787/
+mv val/ILSVRC2012_val_00030902.JPEG n02190166/
+mv val/ILSVRC2012_val_00030903.JPEG n06785654/
+mv val/ILSVRC2012_val_00030904.JPEG n02002724/
+mv val/ILSVRC2012_val_00030905.JPEG n01740131/
+mv val/ILSVRC2012_val_00030906.JPEG n04033995/
+mv val/ILSVRC2012_val_00030907.JPEG n01978287/
+mv val/ILSVRC2012_val_00030908.JPEG n02011460/
+mv val/ILSVRC2012_val_00030909.JPEG n03937543/
+mv val/ILSVRC2012_val_00030910.JPEG n02096437/
+mv val/ILSVRC2012_val_00030911.JPEG n01534433/
+mv val/ILSVRC2012_val_00030912.JPEG n02978881/
+mv val/ILSVRC2012_val_00030913.JPEG n03445924/
+mv val/ILSVRC2012_val_00030914.JPEG n07716358/
+mv val/ILSVRC2012_val_00030915.JPEG n02093428/
+mv val/ILSVRC2012_val_00030916.JPEG n01776313/
+mv val/ILSVRC2012_val_00030917.JPEG n02704792/
+mv val/ILSVRC2012_val_00030918.JPEG n01687978/
+mv val/ILSVRC2012_val_00030919.JPEG n04550184/
+mv val/ILSVRC2012_val_00030920.JPEG n02102973/
+mv val/ILSVRC2012_val_00030921.JPEG n02165456/
+mv val/ILSVRC2012_val_00030922.JPEG n03347037/
+mv val/ILSVRC2012_val_00030923.JPEG n01755581/
+mv val/ILSVRC2012_val_00030924.JPEG n02111889/
+mv val/ILSVRC2012_val_00030925.JPEG n03967562/
+mv val/ILSVRC2012_val_00030926.JPEG n01491361/
+mv val/ILSVRC2012_val_00030927.JPEG n02437616/
+mv val/ILSVRC2012_val_00030928.JPEG n02089078/
+mv val/ILSVRC2012_val_00030929.JPEG n02123597/
+mv val/ILSVRC2012_val_00030930.JPEG n04507155/
+mv val/ILSVRC2012_val_00030931.JPEG n03110669/
+mv val/ILSVRC2012_val_00030932.JPEG n03868242/
+mv val/ILSVRC2012_val_00030933.JPEG n03874599/
+mv val/ILSVRC2012_val_00030934.JPEG n02120505/
+mv val/ILSVRC2012_val_00030935.JPEG n03930313/
+mv val/ILSVRC2012_val_00030936.JPEG n02165105/
+mv val/ILSVRC2012_val_00030937.JPEG n04604644/
+mv val/ILSVRC2012_val_00030938.JPEG n03445777/
+mv val/ILSVRC2012_val_00030939.JPEG n02099712/
+mv val/ILSVRC2012_val_00030940.JPEG n02009229/
+mv val/ILSVRC2012_val_00030941.JPEG n04389033/
+mv val/ILSVRC2012_val_00030942.JPEG n04371774/
+mv val/ILSVRC2012_val_00030943.JPEG n02437616/
+mv val/ILSVRC2012_val_00030944.JPEG n04243546/
+mv val/ILSVRC2012_val_00030945.JPEG n03794056/
+mv val/ILSVRC2012_val_00030946.JPEG n03775071/
+mv val/ILSVRC2012_val_00030947.JPEG n04479046/
+mv val/ILSVRC2012_val_00030948.JPEG n03796401/
+mv val/ILSVRC2012_val_00030949.JPEG n02892767/
+mv val/ILSVRC2012_val_00030950.JPEG n03929660/
+mv val/ILSVRC2012_val_00030951.JPEG n02133161/
+mv val/ILSVRC2012_val_00030952.JPEG n03944341/
+mv val/ILSVRC2012_val_00030953.JPEG n03884397/
+mv val/ILSVRC2012_val_00030954.JPEG n04589890/
+mv val/ILSVRC2012_val_00030955.JPEG n03590841/
+mv val/ILSVRC2012_val_00030956.JPEG n02071294/
+mv val/ILSVRC2012_val_00030957.JPEG n04263257/
+mv val/ILSVRC2012_val_00030958.JPEG n01768244/
+mv val/ILSVRC2012_val_00030959.JPEG n02410509/
+mv val/ILSVRC2012_val_00030960.JPEG n04465501/
+mv val/ILSVRC2012_val_00030961.JPEG n02098286/
+mv val/ILSVRC2012_val_00030962.JPEG n02747177/
+mv val/ILSVRC2012_val_00030963.JPEG n02105162/
+mv val/ILSVRC2012_val_00030964.JPEG n01667114/
+mv val/ILSVRC2012_val_00030965.JPEG n02999410/
+mv val/ILSVRC2012_val_00030966.JPEG n01560419/
+mv val/ILSVRC2012_val_00030967.JPEG n07749582/
+mv val/ILSVRC2012_val_00030968.JPEG n01968897/
+mv val/ILSVRC2012_val_00030969.JPEG n02130308/
+mv val/ILSVRC2012_val_00030970.JPEG n02110806/
+mv val/ILSVRC2012_val_00030971.JPEG n02106382/
+mv val/ILSVRC2012_val_00030972.JPEG n07590611/
+mv val/ILSVRC2012_val_00030973.JPEG n07697537/
+mv val/ILSVRC2012_val_00030974.JPEG n04591157/
+mv val/ILSVRC2012_val_00030975.JPEG n04462240/
+mv val/ILSVRC2012_val_00030976.JPEG n02988304/
+mv val/ILSVRC2012_val_00030977.JPEG n03126707/
+mv val/ILSVRC2012_val_00030978.JPEG n02727426/
+mv val/ILSVRC2012_val_00030979.JPEG n04127249/
+mv val/ILSVRC2012_val_00030980.JPEG n02843684/
+mv val/ILSVRC2012_val_00030981.JPEG n03179701/
+mv val/ILSVRC2012_val_00030982.JPEG n02443484/
+mv val/ILSVRC2012_val_00030983.JPEG n04344873/
+mv val/ILSVRC2012_val_00030984.JPEG n02280649/
+mv val/ILSVRC2012_val_00030985.JPEG n03216828/
+mv val/ILSVRC2012_val_00030986.JPEG n12985857/
+mv val/ILSVRC2012_val_00030987.JPEG n04548280/
+mv val/ILSVRC2012_val_00030988.JPEG n03602883/
+mv val/ILSVRC2012_val_00030989.JPEG n03447721/
+mv val/ILSVRC2012_val_00030990.JPEG n01694178/
+mv val/ILSVRC2012_val_00030991.JPEG n02415577/
+mv val/ILSVRC2012_val_00030992.JPEG n02699494/
+mv val/ILSVRC2012_val_00030993.JPEG n03085013/
+mv val/ILSVRC2012_val_00030994.JPEG n02895154/
+mv val/ILSVRC2012_val_00030995.JPEG n04371774/
+mv val/ILSVRC2012_val_00030996.JPEG n03495258/
+mv val/ILSVRC2012_val_00030997.JPEG n03791053/
+mv val/ILSVRC2012_val_00030998.JPEG n02641379/
+mv val/ILSVRC2012_val_00030999.JPEG n02980441/
+mv val/ILSVRC2012_val_00031000.JPEG n02950826/
+mv val/ILSVRC2012_val_00031001.JPEG n02110063/
+mv val/ILSVRC2012_val_00031002.JPEG n03788195/
+mv val/ILSVRC2012_val_00031003.JPEG n01693334/
+mv val/ILSVRC2012_val_00031004.JPEG n02606052/
+mv val/ILSVRC2012_val_00031005.JPEG n07742313/
+mv val/ILSVRC2012_val_00031006.JPEG n02113624/
+mv val/ILSVRC2012_val_00031007.JPEG n03874293/
+mv val/ILSVRC2012_val_00031008.JPEG n04209239/
+mv val/ILSVRC2012_val_00031009.JPEG n03388043/
+mv val/ILSVRC2012_val_00031010.JPEG n02927161/
+mv val/ILSVRC2012_val_00031011.JPEG n03944341/
+mv val/ILSVRC2012_val_00031012.JPEG n04579432/
+mv val/ILSVRC2012_val_00031013.JPEG n03759954/
+mv val/ILSVRC2012_val_00031014.JPEG n02101388/
+mv val/ILSVRC2012_val_00031015.JPEG n01978287/
+mv val/ILSVRC2012_val_00031016.JPEG n03443371/
+mv val/ILSVRC2012_val_00031017.JPEG n02129604/
+mv val/ILSVRC2012_val_00031018.JPEG n01693334/
+mv val/ILSVRC2012_val_00031019.JPEG n07742313/
+mv val/ILSVRC2012_val_00031020.JPEG n01770393/
+mv val/ILSVRC2012_val_00031021.JPEG n06785654/
+mv val/ILSVRC2012_val_00031022.JPEG n03126707/
+mv val/ILSVRC2012_val_00031023.JPEG n02058221/
+mv val/ILSVRC2012_val_00031024.JPEG n03721384/
+mv val/ILSVRC2012_val_00031025.JPEG n02093647/
+mv val/ILSVRC2012_val_00031026.JPEG n07684084/
+mv val/ILSVRC2012_val_00031027.JPEG n03775546/
+mv val/ILSVRC2012_val_00031028.JPEG n03494278/
+mv val/ILSVRC2012_val_00031029.JPEG n03131574/
+mv val/ILSVRC2012_val_00031030.JPEG n02823428/
+mv val/ILSVRC2012_val_00031031.JPEG n02111889/
+mv val/ILSVRC2012_val_00031032.JPEG n04208210/
+mv val/ILSVRC2012_val_00031033.JPEG n02190166/
+mv val/ILSVRC2012_val_00031034.JPEG n04228054/
+mv val/ILSVRC2012_val_00031035.JPEG n03888257/
+mv val/ILSVRC2012_val_00031036.JPEG n02169497/
+mv val/ILSVRC2012_val_00031037.JPEG n01770081/
+mv val/ILSVRC2012_val_00031038.JPEG n02974003/
+mv val/ILSVRC2012_val_00031039.JPEG n03637318/
+mv val/ILSVRC2012_val_00031040.JPEG n02089078/
+mv val/ILSVRC2012_val_00031041.JPEG n02117135/
+mv val/ILSVRC2012_val_00031042.JPEG n02457408/
+mv val/ILSVRC2012_val_00031043.JPEG n02606052/
+mv val/ILSVRC2012_val_00031044.JPEG n03877845/
+mv val/ILSVRC2012_val_00031045.JPEG n02776631/
+mv val/ILSVRC2012_val_00031046.JPEG n01882714/
+mv val/ILSVRC2012_val_00031047.JPEG n03325584/
+mv val/ILSVRC2012_val_00031048.JPEG n02095314/
+mv val/ILSVRC2012_val_00031049.JPEG n02102973/
+mv val/ILSVRC2012_val_00031050.JPEG n02236044/
+mv val/ILSVRC2012_val_00031051.JPEG n02090622/
+mv val/ILSVRC2012_val_00031052.JPEG n02797295/
+mv val/ILSVRC2012_val_00031053.JPEG n01775062/
+mv val/ILSVRC2012_val_00031054.JPEG n02098286/
+mv val/ILSVRC2012_val_00031055.JPEG n03498962/
+mv val/ILSVRC2012_val_00031056.JPEG n02128385/
+mv val/ILSVRC2012_val_00031057.JPEG n02783161/
+mv val/ILSVRC2012_val_00031058.JPEG n07768694/
+mv val/ILSVRC2012_val_00031059.JPEG n03337140/
+mv val/ILSVRC2012_val_00031060.JPEG n01751748/
+mv val/ILSVRC2012_val_00031061.JPEG n04447861/
+mv val/ILSVRC2012_val_00031062.JPEG n02172182/
+mv val/ILSVRC2012_val_00031063.JPEG n03743016/
+mv val/ILSVRC2012_val_00031064.JPEG n03599486/
+mv val/ILSVRC2012_val_00031065.JPEG n04380533/
+mv val/ILSVRC2012_val_00031066.JPEG n07892512/
+mv val/ILSVRC2012_val_00031067.JPEG n03598930/
+mv val/ILSVRC2012_val_00031068.JPEG n02085782/
+mv val/ILSVRC2012_val_00031069.JPEG n01685808/
+mv val/ILSVRC2012_val_00031070.JPEG n02879718/
+mv val/ILSVRC2012_val_00031071.JPEG n01491361/
+mv val/ILSVRC2012_val_00031072.JPEG n04273569/
+mv val/ILSVRC2012_val_00031073.JPEG n02441942/
+mv val/ILSVRC2012_val_00031074.JPEG n04553703/
+mv val/ILSVRC2012_val_00031075.JPEG n03649909/
+mv val/ILSVRC2012_val_00031076.JPEG n03141823/
+mv val/ILSVRC2012_val_00031077.JPEG n02115641/
+mv val/ILSVRC2012_val_00031078.JPEG n04372370/
+mv val/ILSVRC2012_val_00031079.JPEG n04265275/
+mv val/ILSVRC2012_val_00031080.JPEG n04493381/
+mv val/ILSVRC2012_val_00031081.JPEG n06596364/
+mv val/ILSVRC2012_val_00031082.JPEG n02825657/
+mv val/ILSVRC2012_val_00031083.JPEG n02480495/
+mv val/ILSVRC2012_val_00031084.JPEG n02097298/
+mv val/ILSVRC2012_val_00031085.JPEG n03532672/
+mv val/ILSVRC2012_val_00031086.JPEG n01531178/
+mv val/ILSVRC2012_val_00031087.JPEG n03843555/
+mv val/ILSVRC2012_val_00031088.JPEG n03770679/
+mv val/ILSVRC2012_val_00031089.JPEG n02346627/
+mv val/ILSVRC2012_val_00031090.JPEG n02127052/
+mv val/ILSVRC2012_val_00031091.JPEG n03297495/
+mv val/ILSVRC2012_val_00031092.JPEG n02869837/
+mv val/ILSVRC2012_val_00031093.JPEG n02106166/
+mv val/ILSVRC2012_val_00031094.JPEG n01440764/
+mv val/ILSVRC2012_val_00031095.JPEG n02510455/
+mv val/ILSVRC2012_val_00031096.JPEG n02095570/
+mv val/ILSVRC2012_val_00031097.JPEG n02177972/
+mv val/ILSVRC2012_val_00031098.JPEG n03347037/
+mv val/ILSVRC2012_val_00031099.JPEG n01978455/
+mv val/ILSVRC2012_val_00031100.JPEG n02488702/
+mv val/ILSVRC2012_val_00031101.JPEG n02791124/
+mv val/ILSVRC2012_val_00031102.JPEG n04229816/
+mv val/ILSVRC2012_val_00031103.JPEG n01675722/
+mv val/ILSVRC2012_val_00031104.JPEG n03630383/
+mv val/ILSVRC2012_val_00031105.JPEG n01930112/
+mv val/ILSVRC2012_val_00031106.JPEG n04005630/
+mv val/ILSVRC2012_val_00031107.JPEG n04039381/
+mv val/ILSVRC2012_val_00031108.JPEG n03950228/
+mv val/ILSVRC2012_val_00031109.JPEG n04592741/
+mv val/ILSVRC2012_val_00031110.JPEG n01914609/
+mv val/ILSVRC2012_val_00031111.JPEG n02129165/
+mv val/ILSVRC2012_val_00031112.JPEG n01871265/
+mv val/ILSVRC2012_val_00031113.JPEG n03902125/
+mv val/ILSVRC2012_val_00031114.JPEG n01689811/
+mv val/ILSVRC2012_val_00031115.JPEG n03534580/
+mv val/ILSVRC2012_val_00031116.JPEG n01945685/
+mv val/ILSVRC2012_val_00031117.JPEG n01773549/
+mv val/ILSVRC2012_val_00031118.JPEG n02089867/
+mv val/ILSVRC2012_val_00031119.JPEG n03788195/
+mv val/ILSVRC2012_val_00031120.JPEG n02788148/
+mv val/ILSVRC2012_val_00031121.JPEG n02113023/
+mv val/ILSVRC2012_val_00031122.JPEG n03534580/
+mv val/ILSVRC2012_val_00031123.JPEG n04592741/
+mv val/ILSVRC2012_val_00031124.JPEG n02797295/
+mv val/ILSVRC2012_val_00031125.JPEG n03017168/
+mv val/ILSVRC2012_val_00031126.JPEG n04355933/
+mv val/ILSVRC2012_val_00031127.JPEG n02097209/
+mv val/ILSVRC2012_val_00031128.JPEG n02167151/
+mv val/ILSVRC2012_val_00031129.JPEG n04026417/
+mv val/ILSVRC2012_val_00031130.JPEG n03271574/
+mv val/ILSVRC2012_val_00031131.JPEG n02105251/
+mv val/ILSVRC2012_val_00031132.JPEG n04004767/
+mv val/ILSVRC2012_val_00031133.JPEG n02108000/
+mv val/ILSVRC2012_val_00031134.JPEG n04350905/
+mv val/ILSVRC2012_val_00031135.JPEG n02106662/
+mv val/ILSVRC2012_val_00031136.JPEG n03201208/
+mv val/ILSVRC2012_val_00031137.JPEG n03126707/
+mv val/ILSVRC2012_val_00031138.JPEG n01443537/
+mv val/ILSVRC2012_val_00031139.JPEG n02837789/
+mv val/ILSVRC2012_val_00031140.JPEG n02165456/
+mv val/ILSVRC2012_val_00031141.JPEG n03796401/
+mv val/ILSVRC2012_val_00031142.JPEG n02870880/
+mv val/ILSVRC2012_val_00031143.JPEG n02641379/
+mv val/ILSVRC2012_val_00031144.JPEG n01622779/
+mv val/ILSVRC2012_val_00031145.JPEG n02113023/
+mv val/ILSVRC2012_val_00031146.JPEG n07880968/
+mv val/ILSVRC2012_val_00031147.JPEG n02165456/
+mv val/ILSVRC2012_val_00031148.JPEG n03840681/
+mv val/ILSVRC2012_val_00031149.JPEG n03372029/
+mv val/ILSVRC2012_val_00031150.JPEG n04044716/
+mv val/ILSVRC2012_val_00031151.JPEG n03840681/
+mv val/ILSVRC2012_val_00031152.JPEG n03692522/
+mv val/ILSVRC2012_val_00031153.JPEG n03992509/
+mv val/ILSVRC2012_val_00031154.JPEG n02085620/
+mv val/ILSVRC2012_val_00031155.JPEG n03530642/
+mv val/ILSVRC2012_val_00031156.JPEG n02113186/
+mv val/ILSVRC2012_val_00031157.JPEG n02086079/
+mv val/ILSVRC2012_val_00031158.JPEG n07614500/
+mv val/ILSVRC2012_val_00031159.JPEG n09468604/
+mv val/ILSVRC2012_val_00031160.JPEG n03602883/
+mv val/ILSVRC2012_val_00031161.JPEG n09468604/
+mv val/ILSVRC2012_val_00031162.JPEG n04270147/
+mv val/ILSVRC2012_val_00031163.JPEG n04146614/
+mv val/ILSVRC2012_val_00031164.JPEG n02892201/
+mv val/ILSVRC2012_val_00031165.JPEG n03958227/
+mv val/ILSVRC2012_val_00031166.JPEG n03832673/
+mv val/ILSVRC2012_val_00031167.JPEG n02268443/
+mv val/ILSVRC2012_val_00031168.JPEG n02236044/
+mv val/ILSVRC2012_val_00031169.JPEG n01494475/
+mv val/ILSVRC2012_val_00031170.JPEG n02009912/
+mv val/ILSVRC2012_val_00031171.JPEG n01532829/
+mv val/ILSVRC2012_val_00031172.JPEG n02093754/
+mv val/ILSVRC2012_val_00031173.JPEG n03404251/
+mv val/ILSVRC2012_val_00031174.JPEG n03770439/
+mv val/ILSVRC2012_val_00031175.JPEG n07734744/
+mv val/ILSVRC2012_val_00031176.JPEG n04252077/
+mv val/ILSVRC2012_val_00031177.JPEG n07714571/
+mv val/ILSVRC2012_val_00031178.JPEG n02120079/
+mv val/ILSVRC2012_val_00031179.JPEG n01665541/
+mv val/ILSVRC2012_val_00031180.JPEG n02123394/
+mv val/ILSVRC2012_val_00031181.JPEG n03240683/
+mv val/ILSVRC2012_val_00031182.JPEG n04264628/
+mv val/ILSVRC2012_val_00031183.JPEG n02457408/
+mv val/ILSVRC2012_val_00031184.JPEG n07614500/
+mv val/ILSVRC2012_val_00031185.JPEG n02124075/
+mv val/ILSVRC2012_val_00031186.JPEG n03425413/
+mv val/ILSVRC2012_val_00031187.JPEG n03133878/
+mv val/ILSVRC2012_val_00031188.JPEG n07930864/
+mv val/ILSVRC2012_val_00031189.JPEG n03160309/
+mv val/ILSVRC2012_val_00031190.JPEG n02484975/
+mv val/ILSVRC2012_val_00031191.JPEG n02086240/
+mv val/ILSVRC2012_val_00031192.JPEG n02978881/
+mv val/ILSVRC2012_val_00031193.JPEG n04404412/
+mv val/ILSVRC2012_val_00031194.JPEG n02643566/
+mv val/ILSVRC2012_val_00031195.JPEG n02494079/
+mv val/ILSVRC2012_val_00031196.JPEG n02749479/
+mv val/ILSVRC2012_val_00031197.JPEG n02114855/
+mv val/ILSVRC2012_val_00031198.JPEG n02106166/
+mv val/ILSVRC2012_val_00031199.JPEG n02114712/
+mv val/ILSVRC2012_val_00031200.JPEG n03662601/
+mv val/ILSVRC2012_val_00031201.JPEG n07583066/
+mv val/ILSVRC2012_val_00031202.JPEG n02396427/
+mv val/ILSVRC2012_val_00031203.JPEG n02108089/
+mv val/ILSVRC2012_val_00031204.JPEG n04335435/
+mv val/ILSVRC2012_val_00031205.JPEG n03017168/
+mv val/ILSVRC2012_val_00031206.JPEG n02113186/
+mv val/ILSVRC2012_val_00031207.JPEG n04493381/
+mv val/ILSVRC2012_val_00031208.JPEG n02909870/
+mv val/ILSVRC2012_val_00031209.JPEG n03075370/
+mv val/ILSVRC2012_val_00031210.JPEG n03627232/
+mv val/ILSVRC2012_val_00031211.JPEG n03794056/
+mv val/ILSVRC2012_val_00031212.JPEG n01734418/
+mv val/ILSVRC2012_val_00031213.JPEG n02951358/
+mv val/ILSVRC2012_val_00031214.JPEG n02457408/
+mv val/ILSVRC2012_val_00031215.JPEG n02883205/
+mv val/ILSVRC2012_val_00031216.JPEG n02917067/
+mv val/ILSVRC2012_val_00031217.JPEG n03250847/
+mv val/ILSVRC2012_val_00031218.JPEG n02804610/
+mv val/ILSVRC2012_val_00031219.JPEG n02110958/
+mv val/ILSVRC2012_val_00031220.JPEG n02088364/
+mv val/ILSVRC2012_val_00031221.JPEG n03891251/
+mv val/ILSVRC2012_val_00031222.JPEG n02641379/
+mv val/ILSVRC2012_val_00031223.JPEG n02098105/
+mv val/ILSVRC2012_val_00031224.JPEG n02113624/
+mv val/ILSVRC2012_val_00031225.JPEG n02027492/
+mv val/ILSVRC2012_val_00031226.JPEG n02066245/
+mv val/ILSVRC2012_val_00031227.JPEG n02168699/
+mv val/ILSVRC2012_val_00031228.JPEG n06359193/
+mv val/ILSVRC2012_val_00031229.JPEG n03627232/
+mv val/ILSVRC2012_val_00031230.JPEG n09229709/
+mv val/ILSVRC2012_val_00031231.JPEG n02749479/
+mv val/ILSVRC2012_val_00031232.JPEG n04355338/
+mv val/ILSVRC2012_val_00031233.JPEG n04252225/
+mv val/ILSVRC2012_val_00031234.JPEG n02939185/
+mv val/ILSVRC2012_val_00031235.JPEG n01632777/
+mv val/ILSVRC2012_val_00031236.JPEG n02395406/
+mv val/ILSVRC2012_val_00031237.JPEG n02219486/
+mv val/ILSVRC2012_val_00031238.JPEG n02988304/
+mv val/ILSVRC2012_val_00031239.JPEG n01518878/
+mv val/ILSVRC2012_val_00031240.JPEG n03891332/
+mv val/ILSVRC2012_val_00031241.JPEG n02114548/
+mv val/ILSVRC2012_val_00031242.JPEG n02892767/
+mv val/ILSVRC2012_val_00031243.JPEG n01491361/
+mv val/ILSVRC2012_val_00031244.JPEG n03933933/
+mv val/ILSVRC2012_val_00031245.JPEG n02795169/
+mv val/ILSVRC2012_val_00031246.JPEG n09472597/
+mv val/ILSVRC2012_val_00031247.JPEG n07579787/
+mv val/ILSVRC2012_val_00031248.JPEG n03032252/
+mv val/ILSVRC2012_val_00031249.JPEG n02093754/
+mv val/ILSVRC2012_val_00031250.JPEG n13054560/
+mv val/ILSVRC2012_val_00031251.JPEG n03891251/
+mv val/ILSVRC2012_val_00031252.JPEG n02105505/
+mv val/ILSVRC2012_val_00031253.JPEG n02132136/
+mv val/ILSVRC2012_val_00031254.JPEG n07873807/
+mv val/ILSVRC2012_val_00031255.JPEG n02640242/
+mv val/ILSVRC2012_val_00031256.JPEG n04461696/
+mv val/ILSVRC2012_val_00031257.JPEG n04613696/
+mv val/ILSVRC2012_val_00031258.JPEG n09468604/
+mv val/ILSVRC2012_val_00031259.JPEG n02113186/
+mv val/ILSVRC2012_val_00031260.JPEG n02493509/
+mv val/ILSVRC2012_val_00031261.JPEG n04553703/
+mv val/ILSVRC2012_val_00031262.JPEG n01968897/
+mv val/ILSVRC2012_val_00031263.JPEG n04296562/
+mv val/ILSVRC2012_val_00031264.JPEG n03467068/
+mv val/ILSVRC2012_val_00031265.JPEG n03763968/
+mv val/ILSVRC2012_val_00031266.JPEG n04209239/
+mv val/ILSVRC2012_val_00031267.JPEG n02219486/
+mv val/ILSVRC2012_val_00031268.JPEG n03888257/
+mv val/ILSVRC2012_val_00031269.JPEG n01871265/
+mv val/ILSVRC2012_val_00031270.JPEG n03325584/
+mv val/ILSVRC2012_val_00031271.JPEG n03272562/
+mv val/ILSVRC2012_val_00031272.JPEG n03854065/
+mv val/ILSVRC2012_val_00031273.JPEG n01558993/
+mv val/ILSVRC2012_val_00031274.JPEG n03670208/
+mv val/ILSVRC2012_val_00031275.JPEG n01665541/
+mv val/ILSVRC2012_val_00031276.JPEG n03325584/
+mv val/ILSVRC2012_val_00031277.JPEG n01695060/
+mv val/ILSVRC2012_val_00031278.JPEG n02457408/
+mv val/ILSVRC2012_val_00031279.JPEG n02797295/
+mv val/ILSVRC2012_val_00031280.JPEG n02950826/
+mv val/ILSVRC2012_val_00031281.JPEG n02099429/
+mv val/ILSVRC2012_val_00031282.JPEG n03291819/
+mv val/ILSVRC2012_val_00031283.JPEG n02939185/
+mv val/ILSVRC2012_val_00031284.JPEG n03976467/
+mv val/ILSVRC2012_val_00031285.JPEG n02120079/
+mv val/ILSVRC2012_val_00031286.JPEG n02879718/
+mv val/ILSVRC2012_val_00031287.JPEG n04579145/
+mv val/ILSVRC2012_val_00031288.JPEG n04120489/
+mv val/ILSVRC2012_val_00031289.JPEG n01632458/
+mv val/ILSVRC2012_val_00031290.JPEG n02009912/
+mv val/ILSVRC2012_val_00031291.JPEG n04328186/
+mv val/ILSVRC2012_val_00031292.JPEG n06874185/
+mv val/ILSVRC2012_val_00031293.JPEG n02398521/
+mv val/ILSVRC2012_val_00031294.JPEG n02488291/
+mv val/ILSVRC2012_val_00031295.JPEG n02107312/
+mv val/ILSVRC2012_val_00031296.JPEG n03026506/
+mv val/ILSVRC2012_val_00031297.JPEG n02119022/
+mv val/ILSVRC2012_val_00031298.JPEG n01843383/
+mv val/ILSVRC2012_val_00031299.JPEG n03657121/
+mv val/ILSVRC2012_val_00031300.JPEG n03062245/
+mv val/ILSVRC2012_val_00031301.JPEG n07584110/
+mv val/ILSVRC2012_val_00031302.JPEG n02091032/
+mv val/ILSVRC2012_val_00031303.JPEG n03476991/
+mv val/ILSVRC2012_val_00031304.JPEG n02013706/
+mv val/ILSVRC2012_val_00031305.JPEG n02607072/
+mv val/ILSVRC2012_val_00031306.JPEG n02113712/
+mv val/ILSVRC2012_val_00031307.JPEG n03788365/
+mv val/ILSVRC2012_val_00031308.JPEG n04355338/
+mv val/ILSVRC2012_val_00031309.JPEG n04428191/
+mv val/ILSVRC2012_val_00031310.JPEG n04442312/
+mv val/ILSVRC2012_val_00031311.JPEG n01753488/
+mv val/ILSVRC2012_val_00031312.JPEG n12620546/
+mv val/ILSVRC2012_val_00031313.JPEG n03417042/
+mv val/ILSVRC2012_val_00031314.JPEG n02108089/
+mv val/ILSVRC2012_val_00031315.JPEG n07871810/
+mv val/ILSVRC2012_val_00031316.JPEG n03930313/
+mv val/ILSVRC2012_val_00031317.JPEG n04019541/
+mv val/ILSVRC2012_val_00031318.JPEG n04074963/
+mv val/ILSVRC2012_val_00031319.JPEG n02408429/
+mv val/ILSVRC2012_val_00031320.JPEG n02817516/
+mv val/ILSVRC2012_val_00031321.JPEG n01955084/
+mv val/ILSVRC2012_val_00031322.JPEG n02747177/
+mv val/ILSVRC2012_val_00031323.JPEG n09472597/
+mv val/ILSVRC2012_val_00031324.JPEG n03866082/
+mv val/ILSVRC2012_val_00031325.JPEG n02099267/
+mv val/ILSVRC2012_val_00031326.JPEG n03782006/
+mv val/ILSVRC2012_val_00031327.JPEG n03998194/
+mv val/ILSVRC2012_val_00031328.JPEG n02823428/
+mv val/ILSVRC2012_val_00031329.JPEG n04487081/
+mv val/ILSVRC2012_val_00031330.JPEG n03956157/
+mv val/ILSVRC2012_val_00031331.JPEG n03854065/
+mv val/ILSVRC2012_val_00031332.JPEG n02002556/
+mv val/ILSVRC2012_val_00031333.JPEG n01440764/
+mv val/ILSVRC2012_val_00031334.JPEG n02093256/
+mv val/ILSVRC2012_val_00031335.JPEG n02229544/
+mv val/ILSVRC2012_val_00031336.JPEG n02109047/
+mv val/ILSVRC2012_val_00031337.JPEG n03160309/
+mv val/ILSVRC2012_val_00031338.JPEG n02825657/
+mv val/ILSVRC2012_val_00031339.JPEG n02423022/
+mv val/ILSVRC2012_val_00031340.JPEG n03016953/
+mv val/ILSVRC2012_val_00031341.JPEG n04179913/
+mv val/ILSVRC2012_val_00031342.JPEG n01860187/
+mv val/ILSVRC2012_val_00031343.JPEG n02107574/
+mv val/ILSVRC2012_val_00031344.JPEG n06359193/
+mv val/ILSVRC2012_val_00031345.JPEG n02088094/
+mv val/ILSVRC2012_val_00031346.JPEG n04065272/
+mv val/ILSVRC2012_val_00031347.JPEG n02088632/
+mv val/ILSVRC2012_val_00031348.JPEG n02130308/
+mv val/ILSVRC2012_val_00031349.JPEG n03769881/
+mv val/ILSVRC2012_val_00031350.JPEG n02966193/
+mv val/ILSVRC2012_val_00031351.JPEG n06794110/
+mv val/ILSVRC2012_val_00031352.JPEG n07590611/
+mv val/ILSVRC2012_val_00031353.JPEG n03924679/
+mv val/ILSVRC2012_val_00031354.JPEG n04153751/
+mv val/ILSVRC2012_val_00031355.JPEG n02112706/
+mv val/ILSVRC2012_val_00031356.JPEG n02509815/
+mv val/ILSVRC2012_val_00031357.JPEG n04335435/
+mv val/ILSVRC2012_val_00031358.JPEG n04579432/
+mv val/ILSVRC2012_val_00031359.JPEG n02815834/
+mv val/ILSVRC2012_val_00031360.JPEG n02361337/
+mv val/ILSVRC2012_val_00031361.JPEG n02123159/
+mv val/ILSVRC2012_val_00031362.JPEG n03133878/
+mv val/ILSVRC2012_val_00031363.JPEG n02457408/
+mv val/ILSVRC2012_val_00031364.JPEG n02092002/
+mv val/ILSVRC2012_val_00031365.JPEG n04347754/
+mv val/ILSVRC2012_val_00031366.JPEG n03775071/
+mv val/ILSVRC2012_val_00031367.JPEG n03498962/
+mv val/ILSVRC2012_val_00031368.JPEG n02101388/
+mv val/ILSVRC2012_val_00031369.JPEG n03447447/
+mv val/ILSVRC2012_val_00031370.JPEG n02443114/
+mv val/ILSVRC2012_val_00031371.JPEG n04039381/
+mv val/ILSVRC2012_val_00031372.JPEG n02791124/
+mv val/ILSVRC2012_val_00031373.JPEG n02104365/
+mv val/ILSVRC2012_val_00031374.JPEG n01776313/
+mv val/ILSVRC2012_val_00031375.JPEG n04442312/
+mv val/ILSVRC2012_val_00031376.JPEG n03584254/
+mv val/ILSVRC2012_val_00031377.JPEG n02094258/
+mv val/ILSVRC2012_val_00031378.JPEG n02086646/
+mv val/ILSVRC2012_val_00031379.JPEG n04370456/
+mv val/ILSVRC2012_val_00031380.JPEG n01797886/
+mv val/ILSVRC2012_val_00031381.JPEG n03724870/
+mv val/ILSVRC2012_val_00031382.JPEG n01775062/
+mv val/ILSVRC2012_val_00031383.JPEG n02687172/
+mv val/ILSVRC2012_val_00031384.JPEG n02091244/
+mv val/ILSVRC2012_val_00031385.JPEG n03124043/
+mv val/ILSVRC2012_val_00031386.JPEG n01632777/
+mv val/ILSVRC2012_val_00031387.JPEG n02787622/
+mv val/ILSVRC2012_val_00031388.JPEG n01930112/
+mv val/ILSVRC2012_val_00031389.JPEG n01664065/
+mv val/ILSVRC2012_val_00031390.JPEG n01734418/
+mv val/ILSVRC2012_val_00031391.JPEG n02110063/
+mv val/ILSVRC2012_val_00031392.JPEG n01818515/
+mv val/ILSVRC2012_val_00031393.JPEG n04336792/
+mv val/ILSVRC2012_val_00031394.JPEG n03793489/
+mv val/ILSVRC2012_val_00031395.JPEG n02097298/
+mv val/ILSVRC2012_val_00031396.JPEG n02017213/
+mv val/ILSVRC2012_val_00031397.JPEG n04273569/
+mv val/ILSVRC2012_val_00031398.JPEG n03485794/
+mv val/ILSVRC2012_val_00031399.JPEG n02002724/
+mv val/ILSVRC2012_val_00031400.JPEG n04507155/
+mv val/ILSVRC2012_val_00031401.JPEG n11879895/
+mv val/ILSVRC2012_val_00031402.JPEG n02087046/
+mv val/ILSVRC2012_val_00031403.JPEG n02486410/
+mv val/ILSVRC2012_val_00031404.JPEG n04033995/
+mv val/ILSVRC2012_val_00031405.JPEG n03345487/
+mv val/ILSVRC2012_val_00031406.JPEG n03692522/
+mv val/ILSVRC2012_val_00031407.JPEG n04347754/
+mv val/ILSVRC2012_val_00031408.JPEG n01986214/
+mv val/ILSVRC2012_val_00031409.JPEG n03873416/
+mv val/ILSVRC2012_val_00031410.JPEG n03483316/
+mv val/ILSVRC2012_val_00031411.JPEG n02101556/
+mv val/ILSVRC2012_val_00031412.JPEG n03425413/
+mv val/ILSVRC2012_val_00031413.JPEG n03000684/
+mv val/ILSVRC2012_val_00031414.JPEG n02114367/
+mv val/ILSVRC2012_val_00031415.JPEG n02113712/
+mv val/ILSVRC2012_val_00031416.JPEG n03535780/
+mv val/ILSVRC2012_val_00031417.JPEG n02454379/
+mv val/ILSVRC2012_val_00031418.JPEG n03788195/
+mv val/ILSVRC2012_val_00031419.JPEG n02086240/
+mv val/ILSVRC2012_val_00031420.JPEG n02095889/
+mv val/ILSVRC2012_val_00031421.JPEG n02422699/
+mv val/ILSVRC2012_val_00031422.JPEG n03400231/
+mv val/ILSVRC2012_val_00031423.JPEG n03690938/
+mv val/ILSVRC2012_val_00031424.JPEG n01494475/
+mv val/ILSVRC2012_val_00031425.JPEG n02099601/
+mv val/ILSVRC2012_val_00031426.JPEG n04612504/
+mv val/ILSVRC2012_val_00031427.JPEG n07753275/
+mv val/ILSVRC2012_val_00031428.JPEG n03814639/
+mv val/ILSVRC2012_val_00031429.JPEG n02165105/
+mv val/ILSVRC2012_val_00031430.JPEG n03314780/
+mv val/ILSVRC2012_val_00031431.JPEG n03478589/
+mv val/ILSVRC2012_val_00031432.JPEG n01796340/
+mv val/ILSVRC2012_val_00031433.JPEG n02105641/
+mv val/ILSVRC2012_val_00031434.JPEG n01847000/
+mv val/ILSVRC2012_val_00031435.JPEG n01877812/
+mv val/ILSVRC2012_val_00031436.JPEG n02447366/
+mv val/ILSVRC2012_val_00031437.JPEG n03929660/
+mv val/ILSVRC2012_val_00031438.JPEG n02992529/
+mv val/ILSVRC2012_val_00031439.JPEG n02088094/
+mv val/ILSVRC2012_val_00031440.JPEG n07745940/
+mv val/ILSVRC2012_val_00031441.JPEG n04522168/
+mv val/ILSVRC2012_val_00031442.JPEG n04069434/
+mv val/ILSVRC2012_val_00031443.JPEG n12620546/
+mv val/ILSVRC2012_val_00031444.JPEG n03673027/
+mv val/ILSVRC2012_val_00031445.JPEG n03998194/
+mv val/ILSVRC2012_val_00031446.JPEG n03028079/
+mv val/ILSVRC2012_val_00031447.JPEG n04252225/
+mv val/ILSVRC2012_val_00031448.JPEG n02033041/
+mv val/ILSVRC2012_val_00031449.JPEG n01843065/
+mv val/ILSVRC2012_val_00031450.JPEG n07720875/
+mv val/ILSVRC2012_val_00031451.JPEG n02099712/
+mv val/ILSVRC2012_val_00031452.JPEG n02939185/
+mv val/ILSVRC2012_val_00031453.JPEG n02098413/
+mv val/ILSVRC2012_val_00031454.JPEG n04296562/
+mv val/ILSVRC2012_val_00031455.JPEG n03796401/
+mv val/ILSVRC2012_val_00031456.JPEG n01729977/
+mv val/ILSVRC2012_val_00031457.JPEG n02859443/
+mv val/ILSVRC2012_val_00031458.JPEG n02105251/
+mv val/ILSVRC2012_val_00031459.JPEG n02860847/
+mv val/ILSVRC2012_val_00031460.JPEG n04209133/
+mv val/ILSVRC2012_val_00031461.JPEG n02108000/
+mv val/ILSVRC2012_val_00031462.JPEG n04235860/
+mv val/ILSVRC2012_val_00031463.JPEG n02782093/
+mv val/ILSVRC2012_val_00031464.JPEG n02814533/
+mv val/ILSVRC2012_val_00031465.JPEG n01614925/
+mv val/ILSVRC2012_val_00031466.JPEG n01484850/
+mv val/ILSVRC2012_val_00031467.JPEG n01669191/
+mv val/ILSVRC2012_val_00031468.JPEG n04525305/
+mv val/ILSVRC2012_val_00031469.JPEG n07716906/
+mv val/ILSVRC2012_val_00031470.JPEG n02119022/
+mv val/ILSVRC2012_val_00031471.JPEG n03721384/
+mv val/ILSVRC2012_val_00031472.JPEG n02259212/
+mv val/ILSVRC2012_val_00031473.JPEG n03976657/
+mv val/ILSVRC2012_val_00031474.JPEG n02415577/
+mv val/ILSVRC2012_val_00031475.JPEG n04392985/
+mv val/ILSVRC2012_val_00031476.JPEG n04023962/
+mv val/ILSVRC2012_val_00031477.JPEG n02793495/
+mv val/ILSVRC2012_val_00031478.JPEG n04592741/
+mv val/ILSVRC2012_val_00031479.JPEG n02233338/
+mv val/ILSVRC2012_val_00031480.JPEG n02777292/
+mv val/ILSVRC2012_val_00031481.JPEG n01514859/
+mv val/ILSVRC2012_val_00031482.JPEG n03127747/
+mv val/ILSVRC2012_val_00031483.JPEG n04548362/
+mv val/ILSVRC2012_val_00031484.JPEG n03947888/
+mv val/ILSVRC2012_val_00031485.JPEG n03792782/
+mv val/ILSVRC2012_val_00031486.JPEG n03445777/
+mv val/ILSVRC2012_val_00031487.JPEG n04592741/
+mv val/ILSVRC2012_val_00031488.JPEG n02165105/
+mv val/ILSVRC2012_val_00031489.JPEG n02105056/
+mv val/ILSVRC2012_val_00031490.JPEG n04525038/
+mv val/ILSVRC2012_val_00031491.JPEG n02395406/
+mv val/ILSVRC2012_val_00031492.JPEG n02129604/
+mv val/ILSVRC2012_val_00031493.JPEG n09399592/
+mv val/ILSVRC2012_val_00031494.JPEG n09229709/
+mv val/ILSVRC2012_val_00031495.JPEG n06785654/
+mv val/ILSVRC2012_val_00031496.JPEG n03045698/
+mv val/ILSVRC2012_val_00031497.JPEG n04380533/
+mv val/ILSVRC2012_val_00031498.JPEG n02835271/
+mv val/ILSVRC2012_val_00031499.JPEG n07715103/
+mv val/ILSVRC2012_val_00031500.JPEG n03692522/
+mv val/ILSVRC2012_val_00031501.JPEG n02950826/
+mv val/ILSVRC2012_val_00031502.JPEG n02259212/
+mv val/ILSVRC2012_val_00031503.JPEG n03773504/
+mv val/ILSVRC2012_val_00031504.JPEG n04560804/
+mv val/ILSVRC2012_val_00031505.JPEG n04355933/
+mv val/ILSVRC2012_val_00031506.JPEG n02167151/
+mv val/ILSVRC2012_val_00031507.JPEG n01695060/
+mv val/ILSVRC2012_val_00031508.JPEG n02091635/
+mv val/ILSVRC2012_val_00031509.JPEG n07745940/
+mv val/ILSVRC2012_val_00031510.JPEG n03958227/
+mv val/ILSVRC2012_val_00031511.JPEG n03642806/
+mv val/ILSVRC2012_val_00031512.JPEG n01537544/
+mv val/ILSVRC2012_val_00031513.JPEG n03733131/
+mv val/ILSVRC2012_val_00031514.JPEG n02028035/
+mv val/ILSVRC2012_val_00031515.JPEG n02667093/
+mv val/ILSVRC2012_val_00031516.JPEG n03617480/
+mv val/ILSVRC2012_val_00031517.JPEG n02443484/
+mv val/ILSVRC2012_val_00031518.JPEG n04532106/
+mv val/ILSVRC2012_val_00031519.JPEG n06874185/
+mv val/ILSVRC2012_val_00031520.JPEG n02730930/
+mv val/ILSVRC2012_val_00031521.JPEG n01632458/
+mv val/ILSVRC2012_val_00031522.JPEG n04067472/
+mv val/ILSVRC2012_val_00031523.JPEG n09246464/
+mv val/ILSVRC2012_val_00031524.JPEG n02264363/
+mv val/ILSVRC2012_val_00031525.JPEG n09229709/
+mv val/ILSVRC2012_val_00031526.JPEG n02708093/
+mv val/ILSVRC2012_val_00031527.JPEG n03804744/
+mv val/ILSVRC2012_val_00031528.JPEG n03042490/
+mv val/ILSVRC2012_val_00031529.JPEG n03347037/
+mv val/ILSVRC2012_val_00031530.JPEG n02120079/
+mv val/ILSVRC2012_val_00031531.JPEG n02098105/
+mv val/ILSVRC2012_val_00031532.JPEG n02092339/
+mv val/ILSVRC2012_val_00031533.JPEG n03017168/
+mv val/ILSVRC2012_val_00031534.JPEG n02099429/
+mv val/ILSVRC2012_val_00031535.JPEG n03160309/
+mv val/ILSVRC2012_val_00031536.JPEG n12267677/
+mv val/ILSVRC2012_val_00031537.JPEG n03642806/
+mv val/ILSVRC2012_val_00031538.JPEG n07579787/
+mv val/ILSVRC2012_val_00031539.JPEG n02817516/
+mv val/ILSVRC2012_val_00031540.JPEG n01770393/
+mv val/ILSVRC2012_val_00031541.JPEG n01667114/
+mv val/ILSVRC2012_val_00031542.JPEG n04417672/
+mv val/ILSVRC2012_val_00031543.JPEG n04515003/
+mv val/ILSVRC2012_val_00031544.JPEG n02091134/
+mv val/ILSVRC2012_val_00031545.JPEG n02090721/
+mv val/ILSVRC2012_val_00031546.JPEG n04428191/
+mv val/ILSVRC2012_val_00031547.JPEG n02086646/
+mv val/ILSVRC2012_val_00031548.JPEG n04536866/
+mv val/ILSVRC2012_val_00031549.JPEG n03000684/
+mv val/ILSVRC2012_val_00031550.JPEG n01692333/
+mv val/ILSVRC2012_val_00031551.JPEG n04591157/
+mv val/ILSVRC2012_val_00031552.JPEG n03967562/
+mv val/ILSVRC2012_val_00031553.JPEG n03743016/
+mv val/ILSVRC2012_val_00031554.JPEG n04579145/
+mv val/ILSVRC2012_val_00031555.JPEG n02110063/
+mv val/ILSVRC2012_val_00031556.JPEG n04040759/
+mv val/ILSVRC2012_val_00031557.JPEG n02074367/
+mv val/ILSVRC2012_val_00031558.JPEG n03100240/
+mv val/ILSVRC2012_val_00031559.JPEG n04552348/
+mv val/ILSVRC2012_val_00031560.JPEG n02916936/
+mv val/ILSVRC2012_val_00031561.JPEG n03485407/
+mv val/ILSVRC2012_val_00031562.JPEG n02489166/
+mv val/ILSVRC2012_val_00031563.JPEG n03271574/
+mv val/ILSVRC2012_val_00031564.JPEG n01677366/
+mv val/ILSVRC2012_val_00031565.JPEG n02457408/
+mv val/ILSVRC2012_val_00031566.JPEG n02966193/
+mv val/ILSVRC2012_val_00031567.JPEG n04152593/
+mv val/ILSVRC2012_val_00031568.JPEG n01491361/
+mv val/ILSVRC2012_val_00031569.JPEG n01748264/
+mv val/ILSVRC2012_val_00031570.JPEG n03530642/
+mv val/ILSVRC2012_val_00031571.JPEG n03840681/
+mv val/ILSVRC2012_val_00031572.JPEG n01768244/
+mv val/ILSVRC2012_val_00031573.JPEG n02226429/
+mv val/ILSVRC2012_val_00031574.JPEG n03642806/
+mv val/ILSVRC2012_val_00031575.JPEG n02002556/
+mv val/ILSVRC2012_val_00031576.JPEG n03598930/
+mv val/ILSVRC2012_val_00031577.JPEG n01631663/
+mv val/ILSVRC2012_val_00031578.JPEG n03787032/
+mv val/ILSVRC2012_val_00031579.JPEG n03954731/
+mv val/ILSVRC2012_val_00031580.JPEG n04462240/
+mv val/ILSVRC2012_val_00031581.JPEG n03680355/
+mv val/ILSVRC2012_val_00031582.JPEG n02013706/
+mv val/ILSVRC2012_val_00031583.JPEG n03271574/
+mv val/ILSVRC2012_val_00031584.JPEG n04357314/
+mv val/ILSVRC2012_val_00031585.JPEG n02397096/
+mv val/ILSVRC2012_val_00031586.JPEG n01697457/
+mv val/ILSVRC2012_val_00031587.JPEG n02441942/
+mv val/ILSVRC2012_val_00031588.JPEG n03661043/
+mv val/ILSVRC2012_val_00031589.JPEG n01985128/
+mv val/ILSVRC2012_val_00031590.JPEG n03658185/
+mv val/ILSVRC2012_val_00031591.JPEG n02099267/
+mv val/ILSVRC2012_val_00031592.JPEG n04522168/
+mv val/ILSVRC2012_val_00031593.JPEG n13037406/
+mv val/ILSVRC2012_val_00031594.JPEG n02108422/
+mv val/ILSVRC2012_val_00031595.JPEG n04111531/
+mv val/ILSVRC2012_val_00031596.JPEG n01728920/
+mv val/ILSVRC2012_val_00031597.JPEG n02085620/
+mv val/ILSVRC2012_val_00031598.JPEG n01644373/
+mv val/ILSVRC2012_val_00031599.JPEG n02101388/
+mv val/ILSVRC2012_val_00031600.JPEG n02795169/
+mv val/ILSVRC2012_val_00031601.JPEG n02100877/
+mv val/ILSVRC2012_val_00031602.JPEG n04509417/
+mv val/ILSVRC2012_val_00031603.JPEG n02088466/
+mv val/ILSVRC2012_val_00031604.JPEG n02769748/
+mv val/ILSVRC2012_val_00031605.JPEG n02965783/
+mv val/ILSVRC2012_val_00031606.JPEG n03649909/
+mv val/ILSVRC2012_val_00031607.JPEG n03179701/
+mv val/ILSVRC2012_val_00031608.JPEG n01742172/
+mv val/ILSVRC2012_val_00031609.JPEG n01877812/
+mv val/ILSVRC2012_val_00031610.JPEG n03769881/
+mv val/ILSVRC2012_val_00031611.JPEG n03000247/
+mv val/ILSVRC2012_val_00031612.JPEG n02106662/
+mv val/ILSVRC2012_val_00031613.JPEG n03888605/
+mv val/ILSVRC2012_val_00031614.JPEG n03937543/
+mv val/ILSVRC2012_val_00031615.JPEG n04346328/
+mv val/ILSVRC2012_val_00031616.JPEG n03976467/
+mv val/ILSVRC2012_val_00031617.JPEG n03187595/
+mv val/ILSVRC2012_val_00031618.JPEG n15075141/
+mv val/ILSVRC2012_val_00031619.JPEG n03062245/
+mv val/ILSVRC2012_val_00031620.JPEG n03710721/
+mv val/ILSVRC2012_val_00031621.JPEG n04009552/
+mv val/ILSVRC2012_val_00031622.JPEG n02447366/
+mv val/ILSVRC2012_val_00031623.JPEG n02107574/
+mv val/ILSVRC2012_val_00031624.JPEG n03970156/
+mv val/ILSVRC2012_val_00031625.JPEG n03991062/
+mv val/ILSVRC2012_val_00031626.JPEG n02098413/
+mv val/ILSVRC2012_val_00031627.JPEG n07892512/
+mv val/ILSVRC2012_val_00031628.JPEG n03529860/
+mv val/ILSVRC2012_val_00031629.JPEG n03935335/
+mv val/ILSVRC2012_val_00031630.JPEG n01531178/
+mv val/ILSVRC2012_val_00031631.JPEG n02835271/
+mv val/ILSVRC2012_val_00031632.JPEG n03787032/
+mv val/ILSVRC2012_val_00031633.JPEG n02101388/
+mv val/ILSVRC2012_val_00031634.JPEG n02085620/
+mv val/ILSVRC2012_val_00031635.JPEG n02701002/
+mv val/ILSVRC2012_val_00031636.JPEG n11939491/
+mv val/ILSVRC2012_val_00031637.JPEG n01698640/
+mv val/ILSVRC2012_val_00031638.JPEG n02233338/
+mv val/ILSVRC2012_val_00031639.JPEG n11879895/
+mv val/ILSVRC2012_val_00031640.JPEG n02101556/
+mv val/ILSVRC2012_val_00031641.JPEG n07753592/
+mv val/ILSVRC2012_val_00031642.JPEG n02441942/
+mv val/ILSVRC2012_val_00031643.JPEG n07871810/
+mv val/ILSVRC2012_val_00031644.JPEG n01914609/
+mv val/ILSVRC2012_val_00031645.JPEG n02132136/
+mv val/ILSVRC2012_val_00031646.JPEG n02097658/
+mv val/ILSVRC2012_val_00031647.JPEG n07720875/
+mv val/ILSVRC2012_val_00031648.JPEG n02259212/
+mv val/ILSVRC2012_val_00031649.JPEG n01560419/
+mv val/ILSVRC2012_val_00031650.JPEG n02510455/
+mv val/ILSVRC2012_val_00031651.JPEG n04200800/
+mv val/ILSVRC2012_val_00031652.JPEG n04254777/
+mv val/ILSVRC2012_val_00031653.JPEG n01616318/
+mv val/ILSVRC2012_val_00031654.JPEG n04522168/
+mv val/ILSVRC2012_val_00031655.JPEG n02100236/
+mv val/ILSVRC2012_val_00031656.JPEG n04356056/
+mv val/ILSVRC2012_val_00031657.JPEG n07615774/
+mv val/ILSVRC2012_val_00031658.JPEG n03160309/
+mv val/ILSVRC2012_val_00031659.JPEG n02666196/
+mv val/ILSVRC2012_val_00031660.JPEG n02169497/
+mv val/ILSVRC2012_val_00031661.JPEG n03207941/
+mv val/ILSVRC2012_val_00031662.JPEG n07831146/
+mv val/ILSVRC2012_val_00031663.JPEG n04131690/
+mv val/ILSVRC2012_val_00031664.JPEG n04136333/
+mv val/ILSVRC2012_val_00031665.JPEG n02895154/
+mv val/ILSVRC2012_val_00031666.JPEG n02002556/
+mv val/ILSVRC2012_val_00031667.JPEG n04311174/
+mv val/ILSVRC2012_val_00031668.JPEG n04243546/
+mv val/ILSVRC2012_val_00031669.JPEG n13052670/
+mv val/ILSVRC2012_val_00031670.JPEG n02895154/
+mv val/ILSVRC2012_val_00031671.JPEG n03527444/
+mv val/ILSVRC2012_val_00031672.JPEG n02090622/
+mv val/ILSVRC2012_val_00031673.JPEG n04429376/
+mv val/ILSVRC2012_val_00031674.JPEG n01667778/
+mv val/ILSVRC2012_val_00031675.JPEG n01871265/
+mv val/ILSVRC2012_val_00031676.JPEG n01608432/
+mv val/ILSVRC2012_val_00031677.JPEG n03424325/
+mv val/ILSVRC2012_val_00031678.JPEG n02111129/
+mv val/ILSVRC2012_val_00031679.JPEG n02094114/
+mv val/ILSVRC2012_val_00031680.JPEG n03706229/
+mv val/ILSVRC2012_val_00031681.JPEG n02883205/
+mv val/ILSVRC2012_val_00031682.JPEG n07590611/
+mv val/ILSVRC2012_val_00031683.JPEG n02948072/
+mv val/ILSVRC2012_val_00031684.JPEG n01770393/
+mv val/ILSVRC2012_val_00031685.JPEG n03290653/
+mv val/ILSVRC2012_val_00031686.JPEG n02128925/
+mv val/ILSVRC2012_val_00031687.JPEG n02110185/
+mv val/ILSVRC2012_val_00031688.JPEG n02110341/
+mv val/ILSVRC2012_val_00031689.JPEG n01796340/
+mv val/ILSVRC2012_val_00031690.JPEG n02342885/
+mv val/ILSVRC2012_val_00031691.JPEG n02487347/
+mv val/ILSVRC2012_val_00031692.JPEG n04310018/
+mv val/ILSVRC2012_val_00031693.JPEG n02091635/
+mv val/ILSVRC2012_val_00031694.JPEG n02708093/
+mv val/ILSVRC2012_val_00031695.JPEG n03016953/
+mv val/ILSVRC2012_val_00031696.JPEG n02264363/
+mv val/ILSVRC2012_val_00031697.JPEG n04372370/
+mv val/ILSVRC2012_val_00031698.JPEG n03272562/
+mv val/ILSVRC2012_val_00031699.JPEG n02089078/
+mv val/ILSVRC2012_val_00031700.JPEG n03764736/
+mv val/ILSVRC2012_val_00031701.JPEG n02963159/
+mv val/ILSVRC2012_val_00031702.JPEG n03874599/
+mv val/ILSVRC2012_val_00031703.JPEG n02641379/
+mv val/ILSVRC2012_val_00031704.JPEG n01984695/
+mv val/ILSVRC2012_val_00031705.JPEG n02802426/
+mv val/ILSVRC2012_val_00031706.JPEG n02346627/
+mv val/ILSVRC2012_val_00031707.JPEG n03773504/
+mv val/ILSVRC2012_val_00031708.JPEG n04273569/
+mv val/ILSVRC2012_val_00031709.JPEG n02111889/
+mv val/ILSVRC2012_val_00031710.JPEG n03498962/
+mv val/ILSVRC2012_val_00031711.JPEG n03141823/
+mv val/ILSVRC2012_val_00031712.JPEG n04350905/
+mv val/ILSVRC2012_val_00031713.JPEG n02095314/
+mv val/ILSVRC2012_val_00031714.JPEG n04335435/
+mv val/ILSVRC2012_val_00031715.JPEG n03388183/
+mv val/ILSVRC2012_val_00031716.JPEG n01537544/
+mv val/ILSVRC2012_val_00031717.JPEG n03947888/
+mv val/ILSVRC2012_val_00031718.JPEG n02106662/
+mv val/ILSVRC2012_val_00031719.JPEG n03854065/
+mv val/ILSVRC2012_val_00031720.JPEG n01484850/
+mv val/ILSVRC2012_val_00031721.JPEG n02086079/
+mv val/ILSVRC2012_val_00031722.JPEG n07714571/
+mv val/ILSVRC2012_val_00031723.JPEG n01768244/
+mv val/ILSVRC2012_val_00031724.JPEG n04070727/
+mv val/ILSVRC2012_val_00031725.JPEG n03494278/
+mv val/ILSVRC2012_val_00031726.JPEG n03584829/
+mv val/ILSVRC2012_val_00031727.JPEG n03837869/
+mv val/ILSVRC2012_val_00031728.JPEG n01945685/
+mv val/ILSVRC2012_val_00031729.JPEG n03733281/
+mv val/ILSVRC2012_val_00031730.JPEG n04429376/
+mv val/ILSVRC2012_val_00031731.JPEG n02099601/
+mv val/ILSVRC2012_val_00031732.JPEG n04554684/
+mv val/ILSVRC2012_val_00031733.JPEG n04509417/
+mv val/ILSVRC2012_val_00031734.JPEG n01943899/
+mv val/ILSVRC2012_val_00031735.JPEG n07565083/
+mv val/ILSVRC2012_val_00031736.JPEG n04515003/
+mv val/ILSVRC2012_val_00031737.JPEG n03777754/
+mv val/ILSVRC2012_val_00031738.JPEG n03594734/
+mv val/ILSVRC2012_val_00031739.JPEG n03777568/
+mv val/ILSVRC2012_val_00031740.JPEG n03840681/
+mv val/ILSVRC2012_val_00031741.JPEG n02536864/
+mv val/ILSVRC2012_val_00031742.JPEG n04442312/
+mv val/ILSVRC2012_val_00031743.JPEG n03127747/
+mv val/ILSVRC2012_val_00031744.JPEG n03445777/
+mv val/ILSVRC2012_val_00031745.JPEG n04579432/
+mv val/ILSVRC2012_val_00031746.JPEG n03063599/
+mv val/ILSVRC2012_val_00031747.JPEG n02113978/
+mv val/ILSVRC2012_val_00031748.JPEG n03787032/
+mv val/ILSVRC2012_val_00031749.JPEG n01742172/
+mv val/ILSVRC2012_val_00031750.JPEG n02487347/
+mv val/ILSVRC2012_val_00031751.JPEG n04486054/
+mv val/ILSVRC2012_val_00031752.JPEG n02093859/
+mv val/ILSVRC2012_val_00031753.JPEG n04162706/
+mv val/ILSVRC2012_val_00031754.JPEG n02328150/
+mv val/ILSVRC2012_val_00031755.JPEG n03482405/
+mv val/ILSVRC2012_val_00031756.JPEG n04517823/
+mv val/ILSVRC2012_val_00031757.JPEG n07615774/
+mv val/ILSVRC2012_val_00031758.JPEG n04192698/
+mv val/ILSVRC2012_val_00031759.JPEG n02808304/
+mv val/ILSVRC2012_val_00031760.JPEG n02037110/
+mv val/ILSVRC2012_val_00031761.JPEG n04254120/
+mv val/ILSVRC2012_val_00031762.JPEG n02490219/
+mv val/ILSVRC2012_val_00031763.JPEG n07684084/
+mv val/ILSVRC2012_val_00031764.JPEG n02094258/
+mv val/ILSVRC2012_val_00031765.JPEG n02814533/
+mv val/ILSVRC2012_val_00031766.JPEG n02174001/
+mv val/ILSVRC2012_val_00031767.JPEG n07753275/
+mv val/ILSVRC2012_val_00031768.JPEG n04033901/
+mv val/ILSVRC2012_val_00031769.JPEG n02481823/
+mv val/ILSVRC2012_val_00031770.JPEG n03770679/
+mv val/ILSVRC2012_val_00031771.JPEG n03134739/
+mv val/ILSVRC2012_val_00031772.JPEG n01560419/
+mv val/ILSVRC2012_val_00031773.JPEG n04275548/
+mv val/ILSVRC2012_val_00031774.JPEG n01667778/
+mv val/ILSVRC2012_val_00031775.JPEG n01737021/
+mv val/ILSVRC2012_val_00031776.JPEG n01806567/
+mv val/ILSVRC2012_val_00031777.JPEG n04456115/
+mv val/ILSVRC2012_val_00031778.JPEG n07613480/
+mv val/ILSVRC2012_val_00031779.JPEG n01737021/
+mv val/ILSVRC2012_val_00031780.JPEG n03761084/
+mv val/ILSVRC2012_val_00031781.JPEG n07753592/
+mv val/ILSVRC2012_val_00031782.JPEG n04461696/
+mv val/ILSVRC2012_val_00031783.JPEG n04336792/
+mv val/ILSVRC2012_val_00031784.JPEG n02137549/
+mv val/ILSVRC2012_val_00031785.JPEG n02100735/
+mv val/ILSVRC2012_val_00031786.JPEG n04005630/
+mv val/ILSVRC2012_val_00031787.JPEG n02112706/
+mv val/ILSVRC2012_val_00031788.JPEG n12144580/
+mv val/ILSVRC2012_val_00031789.JPEG n03785016/
+mv val/ILSVRC2012_val_00031790.JPEG n03372029/
+mv val/ILSVRC2012_val_00031791.JPEG n04486054/
+mv val/ILSVRC2012_val_00031792.JPEG n02117135/
+mv val/ILSVRC2012_val_00031793.JPEG n01667778/
+mv val/ILSVRC2012_val_00031794.JPEG n02927161/
+mv val/ILSVRC2012_val_00031795.JPEG n07760859/
+mv val/ILSVRC2012_val_00031796.JPEG n03924679/
+mv val/ILSVRC2012_val_00031797.JPEG n04040759/
+mv val/ILSVRC2012_val_00031798.JPEG n07742313/
+mv val/ILSVRC2012_val_00031799.JPEG n02106030/
+mv val/ILSVRC2012_val_00031800.JPEG n03388549/
+mv val/ILSVRC2012_val_00031801.JPEG n03950228/
+mv val/ILSVRC2012_val_00031802.JPEG n01768244/
+mv val/ILSVRC2012_val_00031803.JPEG n07734744/
+mv val/ILSVRC2012_val_00031804.JPEG n04479046/
+mv val/ILSVRC2012_val_00031805.JPEG n02791124/
+mv val/ILSVRC2012_val_00031806.JPEG n01807496/
+mv val/ILSVRC2012_val_00031807.JPEG n04357314/
+mv val/ILSVRC2012_val_00031808.JPEG n01484850/
+mv val/ILSVRC2012_val_00031809.JPEG n03888605/
+mv val/ILSVRC2012_val_00031810.JPEG n04277352/
+mv val/ILSVRC2012_val_00031811.JPEG n04326547/
+mv val/ILSVRC2012_val_00031812.JPEG n03876231/
+mv val/ILSVRC2012_val_00031813.JPEG n07584110/
+mv val/ILSVRC2012_val_00031814.JPEG n02092002/
+mv val/ILSVRC2012_val_00031815.JPEG n01667778/
+mv val/ILSVRC2012_val_00031816.JPEG n01682714/
+mv val/ILSVRC2012_val_00031817.JPEG n02091831/
+mv val/ILSVRC2012_val_00031818.JPEG n02108089/
+mv val/ILSVRC2012_val_00031819.JPEG n02951585/
+mv val/ILSVRC2012_val_00031820.JPEG n02219486/
+mv val/ILSVRC2012_val_00031821.JPEG n02090379/
+mv val/ILSVRC2012_val_00031822.JPEG n01950731/
+mv val/ILSVRC2012_val_00031823.JPEG n02089867/
+mv val/ILSVRC2012_val_00031824.JPEG n01828970/
+mv val/ILSVRC2012_val_00031825.JPEG n03837869/
+mv val/ILSVRC2012_val_00031826.JPEG n01978287/
+mv val/ILSVRC2012_val_00031827.JPEG n02092002/
+mv val/ILSVRC2012_val_00031828.JPEG n02814533/
+mv val/ILSVRC2012_val_00031829.JPEG n01664065/
+mv val/ILSVRC2012_val_00031830.JPEG n12768682/
+mv val/ILSVRC2012_val_00031831.JPEG n07930864/
+mv val/ILSVRC2012_val_00031832.JPEG n04357314/
+mv val/ILSVRC2012_val_00031833.JPEG n02802426/
+mv val/ILSVRC2012_val_00031834.JPEG n02089867/
+mv val/ILSVRC2012_val_00031835.JPEG n03063689/
+mv val/ILSVRC2012_val_00031836.JPEG n03535780/
+mv val/ILSVRC2012_val_00031837.JPEG n04591713/
+mv val/ILSVRC2012_val_00031838.JPEG n03796401/
+mv val/ILSVRC2012_val_00031839.JPEG n02877765/
+mv val/ILSVRC2012_val_00031840.JPEG n02823428/
+mv val/ILSVRC2012_val_00031841.JPEG n07717410/
+mv val/ILSVRC2012_val_00031842.JPEG n04612504/
+mv val/ILSVRC2012_val_00031843.JPEG n03642806/
+mv val/ILSVRC2012_val_00031844.JPEG n04033995/
+mv val/ILSVRC2012_val_00031845.JPEG n02095889/
+mv val/ILSVRC2012_val_00031846.JPEG n04074963/
+mv val/ILSVRC2012_val_00031847.JPEG n01855032/
+mv val/ILSVRC2012_val_00031848.JPEG n04270147/
+mv val/ILSVRC2012_val_00031849.JPEG n03110669/
+mv val/ILSVRC2012_val_00031850.JPEG n03255030/
+mv val/ILSVRC2012_val_00031851.JPEG n03530642/
+mv val/ILSVRC2012_val_00031852.JPEG n10148035/
+mv val/ILSVRC2012_val_00031853.JPEG n07745940/
+mv val/ILSVRC2012_val_00031854.JPEG n02490219/
+mv val/ILSVRC2012_val_00031855.JPEG n02074367/
+mv val/ILSVRC2012_val_00031856.JPEG n02097130/
+mv val/ILSVRC2012_val_00031857.JPEG n02106662/
+mv val/ILSVRC2012_val_00031858.JPEG n03891332/
+mv val/ILSVRC2012_val_00031859.JPEG n02089973/
+mv val/ILSVRC2012_val_00031860.JPEG n04209239/
+mv val/ILSVRC2012_val_00031861.JPEG n04548280/
+mv val/ILSVRC2012_val_00031862.JPEG n04154565/
+mv val/ILSVRC2012_val_00031863.JPEG n02037110/
+mv val/ILSVRC2012_val_00031864.JPEG n02113978/
+mv val/ILSVRC2012_val_00031865.JPEG n02115913/
+mv val/ILSVRC2012_val_00031866.JPEG n02018795/
+mv val/ILSVRC2012_val_00031867.JPEG n02823428/
+mv val/ILSVRC2012_val_00031868.JPEG n02091032/
+mv val/ILSVRC2012_val_00031869.JPEG n03874293/
+mv val/ILSVRC2012_val_00031870.JPEG n04146614/
+mv val/ILSVRC2012_val_00031871.JPEG n04560804/
+mv val/ILSVRC2012_val_00031872.JPEG n04522168/
+mv val/ILSVRC2012_val_00031873.JPEG n07717556/
+mv val/ILSVRC2012_val_00031874.JPEG n04311004/
+mv val/ILSVRC2012_val_00031875.JPEG n02105855/
+mv val/ILSVRC2012_val_00031876.JPEG n02109961/
+mv val/ILSVRC2012_val_00031877.JPEG n02134084/
+mv val/ILSVRC2012_val_00031878.JPEG n02930766/
+mv val/ILSVRC2012_val_00031879.JPEG n01855032/
+mv val/ILSVRC2012_val_00031880.JPEG n02480495/
+mv val/ILSVRC2012_val_00031881.JPEG n02509815/
+mv val/ILSVRC2012_val_00031882.JPEG n02100877/
+mv val/ILSVRC2012_val_00031883.JPEG n02795169/
+mv val/ILSVRC2012_val_00031884.JPEG n02125311/
+mv val/ILSVRC2012_val_00031885.JPEG n01734418/
+mv val/ILSVRC2012_val_00031886.JPEG n03124043/
+mv val/ILSVRC2012_val_00031887.JPEG n02165105/
+mv val/ILSVRC2012_val_00031888.JPEG n02840245/
+mv val/ILSVRC2012_val_00031889.JPEG n03759954/
+mv val/ILSVRC2012_val_00031890.JPEG n01622779/
+mv val/ILSVRC2012_val_00031891.JPEG n02442845/
+mv val/ILSVRC2012_val_00031892.JPEG n04328186/
+mv val/ILSVRC2012_val_00031893.JPEG n04152593/
+mv val/ILSVRC2012_val_00031894.JPEG n04554684/
+mv val/ILSVRC2012_val_00031895.JPEG n02965783/
+mv val/ILSVRC2012_val_00031896.JPEG n02510455/
+mv val/ILSVRC2012_val_00031897.JPEG n03445777/
+mv val/ILSVRC2012_val_00031898.JPEG n07615774/
+mv val/ILSVRC2012_val_00031899.JPEG n12998815/
+mv val/ILSVRC2012_val_00031900.JPEG n07717410/
+mv val/ILSVRC2012_val_00031901.JPEG n03742115/
+mv val/ILSVRC2012_val_00031902.JPEG n04264628/
+mv val/ILSVRC2012_val_00031903.JPEG n02165456/
+mv val/ILSVRC2012_val_00031904.JPEG n04074963/
+mv val/ILSVRC2012_val_00031905.JPEG n02098105/
+mv val/ILSVRC2012_val_00031906.JPEG n02132136/
+mv val/ILSVRC2012_val_00031907.JPEG n01872401/
+mv val/ILSVRC2012_val_00031908.JPEG n02441942/
+mv val/ILSVRC2012_val_00031909.JPEG n04560804/
+mv val/ILSVRC2012_val_00031910.JPEG n02422699/
+mv val/ILSVRC2012_val_00031911.JPEG n02802426/
+mv val/ILSVRC2012_val_00031912.JPEG n07768694/
+mv val/ILSVRC2012_val_00031913.JPEG n01518878/
+mv val/ILSVRC2012_val_00031914.JPEG n02096051/
+mv val/ILSVRC2012_val_00031915.JPEG n02786058/
+mv val/ILSVRC2012_val_00031916.JPEG n02483708/
+mv val/ILSVRC2012_val_00031917.JPEG n02099601/
+mv val/ILSVRC2012_val_00031918.JPEG n04435653/
+mv val/ILSVRC2012_val_00031919.JPEG n01630670/
+mv val/ILSVRC2012_val_00031920.JPEG n02177972/
+mv val/ILSVRC2012_val_00031921.JPEG n13052670/
+mv val/ILSVRC2012_val_00031922.JPEG n02028035/
+mv val/ILSVRC2012_val_00031923.JPEG n01978455/
+mv val/ILSVRC2012_val_00031924.JPEG n13054560/
+mv val/ILSVRC2012_val_00031925.JPEG n02165105/
+mv val/ILSVRC2012_val_00031926.JPEG n04317175/
+mv val/ILSVRC2012_val_00031927.JPEG n01739381/
+mv val/ILSVRC2012_val_00031928.JPEG n02168699/
+mv val/ILSVRC2012_val_00031929.JPEG n02483362/
+mv val/ILSVRC2012_val_00031930.JPEG n02342885/
+mv val/ILSVRC2012_val_00031931.JPEG n02007558/
+mv val/ILSVRC2012_val_00031932.JPEG n01798484/
+mv val/ILSVRC2012_val_00031933.JPEG n04579145/
+mv val/ILSVRC2012_val_00031934.JPEG n02361337/
+mv val/ILSVRC2012_val_00031935.JPEG n02643566/
+mv val/ILSVRC2012_val_00031936.JPEG n04147183/
+mv val/ILSVRC2012_val_00031937.JPEG n04208210/
+mv val/ILSVRC2012_val_00031938.JPEG n01798484/
+mv val/ILSVRC2012_val_00031939.JPEG n02488291/
+mv val/ILSVRC2012_val_00031940.JPEG n03773504/
+mv val/ILSVRC2012_val_00031941.JPEG n03662601/
+mv val/ILSVRC2012_val_00031942.JPEG n02483708/
+mv val/ILSVRC2012_val_00031943.JPEG n01986214/
+mv val/ILSVRC2012_val_00031944.JPEG n04005630/
+mv val/ILSVRC2012_val_00031945.JPEG n02165105/
+mv val/ILSVRC2012_val_00031946.JPEG n02009229/
+mv val/ILSVRC2012_val_00031947.JPEG n03814639/
+mv val/ILSVRC2012_val_00031948.JPEG n04462240/
+mv val/ILSVRC2012_val_00031949.JPEG n02090379/
+mv val/ILSVRC2012_val_00031950.JPEG n03786901/
+mv val/ILSVRC2012_val_00031951.JPEG n01734418/
+mv val/ILSVRC2012_val_00031952.JPEG n01770081/
+mv val/ILSVRC2012_val_00031953.JPEG n02814533/
+mv val/ILSVRC2012_val_00031954.JPEG n03445777/
+mv val/ILSVRC2012_val_00031955.JPEG n03196217/
+mv val/ILSVRC2012_val_00031956.JPEG n02747177/
+mv val/ILSVRC2012_val_00031957.JPEG n02493793/
+mv val/ILSVRC2012_val_00031958.JPEG n03970156/
+mv val/ILSVRC2012_val_00031959.JPEG n02165105/
+mv val/ILSVRC2012_val_00031960.JPEG n03930313/
+mv val/ILSVRC2012_val_00031961.JPEG n02169497/
+mv val/ILSVRC2012_val_00031962.JPEG n04204347/
+mv val/ILSVRC2012_val_00031963.JPEG n02113712/
+mv val/ILSVRC2012_val_00031964.JPEG n02979186/
+mv val/ILSVRC2012_val_00031965.JPEG n02085782/
+mv val/ILSVRC2012_val_00031966.JPEG n04265275/
+mv val/ILSVRC2012_val_00031967.JPEG n01694178/
+mv val/ILSVRC2012_val_00031968.JPEG n09229709/
+mv val/ILSVRC2012_val_00031969.JPEG n04317175/
+mv val/ILSVRC2012_val_00031970.JPEG n07760859/
+mv val/ILSVRC2012_val_00031971.JPEG n02865351/
+mv val/ILSVRC2012_val_00031972.JPEG n03841143/
+mv val/ILSVRC2012_val_00031973.JPEG n01601694/
+mv val/ILSVRC2012_val_00031974.JPEG n02128925/
+mv val/ILSVRC2012_val_00031975.JPEG n03908714/
+mv val/ILSVRC2012_val_00031976.JPEG n01775062/
+mv val/ILSVRC2012_val_00031977.JPEG n01770393/
+mv val/ILSVRC2012_val_00031978.JPEG n02877765/
+mv val/ILSVRC2012_val_00031979.JPEG n03902125/
+mv val/ILSVRC2012_val_00031980.JPEG n01744401/
+mv val/ILSVRC2012_val_00031981.JPEG n02094114/
+mv val/ILSVRC2012_val_00031982.JPEG n03271574/
+mv val/ILSVRC2012_val_00031983.JPEG n04372370/
+mv val/ILSVRC2012_val_00031984.JPEG n07697313/
+mv val/ILSVRC2012_val_00031985.JPEG n04229816/
+mv val/ILSVRC2012_val_00031986.JPEG n02692877/
+mv val/ILSVRC2012_val_00031987.JPEG n01537544/
+mv val/ILSVRC2012_val_00031988.JPEG n04153751/
+mv val/ILSVRC2012_val_00031989.JPEG n02490219/
+mv val/ILSVRC2012_val_00031990.JPEG n09193705/
+mv val/ILSVRC2012_val_00031991.JPEG n02951585/
+mv val/ILSVRC2012_val_00031992.JPEG n01986214/
+mv val/ILSVRC2012_val_00031993.JPEG n02865351/
+mv val/ILSVRC2012_val_00031994.JPEG n02105855/
+mv val/ILSVRC2012_val_00031995.JPEG n04392985/
+mv val/ILSVRC2012_val_00031996.JPEG n03825788/
+mv val/ILSVRC2012_val_00031997.JPEG n04265275/
+mv val/ILSVRC2012_val_00031998.JPEG n12267677/
+mv val/ILSVRC2012_val_00031999.JPEG n03787032/
+mv val/ILSVRC2012_val_00032000.JPEG n02088632/
+mv val/ILSVRC2012_val_00032001.JPEG n04507155/
+mv val/ILSVRC2012_val_00032002.JPEG n03481172/
+mv val/ILSVRC2012_val_00032003.JPEG n03868242/
+mv val/ILSVRC2012_val_00032004.JPEG n02797295/
+mv val/ILSVRC2012_val_00032005.JPEG n02500267/
+mv val/ILSVRC2012_val_00032006.JPEG n02480855/
+mv val/ILSVRC2012_val_00032007.JPEG n03956157/
+mv val/ILSVRC2012_val_00032008.JPEG n02948072/
+mv val/ILSVRC2012_val_00032009.JPEG n03792782/
+mv val/ILSVRC2012_val_00032010.JPEG n03478589/
+mv val/ILSVRC2012_val_00032011.JPEG n04590129/
+mv val/ILSVRC2012_val_00032012.JPEG n01729322/
+mv val/ILSVRC2012_val_00032013.JPEG n02105056/
+mv val/ILSVRC2012_val_00032014.JPEG n02837789/
+mv val/ILSVRC2012_val_00032015.JPEG n03393912/
+mv val/ILSVRC2012_val_00032016.JPEG n02319095/
+mv val/ILSVRC2012_val_00032017.JPEG n02100735/
+mv val/ILSVRC2012_val_00032018.JPEG n02093256/
+mv val/ILSVRC2012_val_00032019.JPEG n03782006/
+mv val/ILSVRC2012_val_00032020.JPEG n03388043/
+mv val/ILSVRC2012_val_00032021.JPEG n03891251/
+mv val/ILSVRC2012_val_00032022.JPEG n02391049/
+mv val/ILSVRC2012_val_00032023.JPEG n02167151/
+mv val/ILSVRC2012_val_00032024.JPEG n03045698/
+mv val/ILSVRC2012_val_00032025.JPEG n01534433/
+mv val/ILSVRC2012_val_00032026.JPEG n04067472/
+mv val/ILSVRC2012_val_00032027.JPEG n02105641/
+mv val/ILSVRC2012_val_00032028.JPEG n04423845/
+mv val/ILSVRC2012_val_00032029.JPEG n01983481/
+mv val/ILSVRC2012_val_00032030.JPEG n03160309/
+mv val/ILSVRC2012_val_00032031.JPEG n02802426/
+mv val/ILSVRC2012_val_00032032.JPEG n09428293/
+mv val/ILSVRC2012_val_00032033.JPEG n02106382/
+mv val/ILSVRC2012_val_00032034.JPEG n04325704/
+mv val/ILSVRC2012_val_00032035.JPEG n02444819/
+mv val/ILSVRC2012_val_00032036.JPEG n01755581/
+mv val/ILSVRC2012_val_00032037.JPEG n02895154/
+mv val/ILSVRC2012_val_00032038.JPEG n02129604/
+mv val/ILSVRC2012_val_00032039.JPEG n02910353/
+mv val/ILSVRC2012_val_00032040.JPEG n07873807/
+mv val/ILSVRC2012_val_00032041.JPEG n07716358/
+mv val/ILSVRC2012_val_00032042.JPEG n03325584/
+mv val/ILSVRC2012_val_00032043.JPEG n02104029/
+mv val/ILSVRC2012_val_00032044.JPEG n01883070/
+mv val/ILSVRC2012_val_00032045.JPEG n02408429/
+mv val/ILSVRC2012_val_00032046.JPEG n02992529/
+mv val/ILSVRC2012_val_00032047.JPEG n02111277/
+mv val/ILSVRC2012_val_00032048.JPEG n04141327/
+mv val/ILSVRC2012_val_00032049.JPEG n02098105/
+mv val/ILSVRC2012_val_00032050.JPEG n12998815/
+mv val/ILSVRC2012_val_00032051.JPEG n04133789/
+mv val/ILSVRC2012_val_00032052.JPEG n02837789/
+mv val/ILSVRC2012_val_00032053.JPEG n02321529/
+mv val/ILSVRC2012_val_00032054.JPEG n04041544/
+mv val/ILSVRC2012_val_00032055.JPEG n03131574/
+mv val/ILSVRC2012_val_00032056.JPEG n01968897/
+mv val/ILSVRC2012_val_00032057.JPEG n03721384/
+mv val/ILSVRC2012_val_00032058.JPEG n09428293/
+mv val/ILSVRC2012_val_00032059.JPEG n03637318/
+mv val/ILSVRC2012_val_00032060.JPEG n04536866/
+mv val/ILSVRC2012_val_00032061.JPEG n01641577/
+mv val/ILSVRC2012_val_00032062.JPEG n01828970/
+mv val/ILSVRC2012_val_00032063.JPEG n02794156/
+mv val/ILSVRC2012_val_00032064.JPEG n02105855/
+mv val/ILSVRC2012_val_00032065.JPEG n02825657/
+mv val/ILSVRC2012_val_00032066.JPEG n02100735/
+mv val/ILSVRC2012_val_00032067.JPEG n02487347/
+mv val/ILSVRC2012_val_00032068.JPEG n02281406/
+mv val/ILSVRC2012_val_00032069.JPEG n04550184/
+mv val/ILSVRC2012_val_00032070.JPEG n02804414/
+mv val/ILSVRC2012_val_00032071.JPEG n03594734/
+mv val/ILSVRC2012_val_00032072.JPEG n01806143/
+mv val/ILSVRC2012_val_00032073.JPEG n09256479/
+mv val/ILSVRC2012_val_00032074.JPEG n04204238/
+mv val/ILSVRC2012_val_00032075.JPEG n03544143/
+mv val/ILSVRC2012_val_00032076.JPEG n04350905/
+mv val/ILSVRC2012_val_00032077.JPEG n04380533/
+mv val/ILSVRC2012_val_00032078.JPEG n03459775/
+mv val/ILSVRC2012_val_00032079.JPEG n04509417/
+mv val/ILSVRC2012_val_00032080.JPEG n02480495/
+mv val/ILSVRC2012_val_00032081.JPEG n04204347/
+mv val/ILSVRC2012_val_00032082.JPEG n03967562/
+mv val/ILSVRC2012_val_00032083.JPEG n03666591/
+mv val/ILSVRC2012_val_00032084.JPEG n03481172/
+mv val/ILSVRC2012_val_00032085.JPEG n03179701/
+mv val/ILSVRC2012_val_00032086.JPEG n01728920/
+mv val/ILSVRC2012_val_00032087.JPEG n09835506/
+mv val/ILSVRC2012_val_00032088.JPEG n02509815/
+mv val/ILSVRC2012_val_00032089.JPEG n11939491/
+mv val/ILSVRC2012_val_00032090.JPEG n02125311/
+mv val/ILSVRC2012_val_00032091.JPEG n01774750/
+mv val/ILSVRC2012_val_00032092.JPEG n01924916/
+mv val/ILSVRC2012_val_00032093.JPEG n04380533/
+mv val/ILSVRC2012_val_00032094.JPEG n03496892/
+mv val/ILSVRC2012_val_00032095.JPEG n02510455/
+mv val/ILSVRC2012_val_00032096.JPEG n02808304/
+mv val/ILSVRC2012_val_00032097.JPEG n04328186/
+mv val/ILSVRC2012_val_00032098.JPEG n04009552/
+mv val/ILSVRC2012_val_00032099.JPEG n02105505/
+mv val/ILSVRC2012_val_00032100.JPEG n02454379/
+mv val/ILSVRC2012_val_00032101.JPEG n04507155/
+mv val/ILSVRC2012_val_00032102.JPEG n01592084/
+mv val/ILSVRC2012_val_00032103.JPEG n04118538/
+mv val/ILSVRC2012_val_00032104.JPEG n01644373/
+mv val/ILSVRC2012_val_00032105.JPEG n02965783/
+mv val/ILSVRC2012_val_00032106.JPEG n03742115/
+mv val/ILSVRC2012_val_00032107.JPEG n07715103/
+mv val/ILSVRC2012_val_00032108.JPEG n03733281/
+mv val/ILSVRC2012_val_00032109.JPEG n02268853/
+mv val/ILSVRC2012_val_00032110.JPEG n03967562/
+mv val/ILSVRC2012_val_00032111.JPEG n02107574/
+mv val/ILSVRC2012_val_00032112.JPEG n04597913/
+mv val/ILSVRC2012_val_00032113.JPEG n01798484/
+mv val/ILSVRC2012_val_00032114.JPEG n04562935/
+mv val/ILSVRC2012_val_00032115.JPEG n04584207/
+mv val/ILSVRC2012_val_00032116.JPEG n07717556/
+mv val/ILSVRC2012_val_00032117.JPEG n02110958/
+mv val/ILSVRC2012_val_00032118.JPEG n04597913/
+mv val/ILSVRC2012_val_00032119.JPEG n07693725/
+mv val/ILSVRC2012_val_00032120.JPEG n02086910/
+mv val/ILSVRC2012_val_00032121.JPEG n04136333/
+mv val/ILSVRC2012_val_00032122.JPEG n01843383/
+mv val/ILSVRC2012_val_00032123.JPEG n02794156/
+mv val/ILSVRC2012_val_00032124.JPEG n02101556/
+mv val/ILSVRC2012_val_00032125.JPEG n04192698/
+mv val/ILSVRC2012_val_00032126.JPEG n02389026/
+mv val/ILSVRC2012_val_00032127.JPEG n03250847/
+mv val/ILSVRC2012_val_00032128.JPEG n01817953/
+mv val/ILSVRC2012_val_00032129.JPEG n01682714/
+mv val/ILSVRC2012_val_00032130.JPEG n01491361/
+mv val/ILSVRC2012_val_00032131.JPEG n06874185/
+mv val/ILSVRC2012_val_00032132.JPEG n02093647/
+mv val/ILSVRC2012_val_00032133.JPEG n02483362/
+mv val/ILSVRC2012_val_00032134.JPEG n04435653/
+mv val/ILSVRC2012_val_00032135.JPEG n01667778/
+mv val/ILSVRC2012_val_00032136.JPEG n04548280/
+mv val/ILSVRC2012_val_00032137.JPEG n03133878/
+mv val/ILSVRC2012_val_00032138.JPEG n02840245/
+mv val/ILSVRC2012_val_00032139.JPEG n01950731/
+mv val/ILSVRC2012_val_00032140.JPEG n04229816/
+mv val/ILSVRC2012_val_00032141.JPEG n01817953/
+mv val/ILSVRC2012_val_00032142.JPEG n04346328/
+mv val/ILSVRC2012_val_00032143.JPEG n07871810/
+mv val/ILSVRC2012_val_00032144.JPEG n04493381/
+mv val/ILSVRC2012_val_00032145.JPEG n03476684/
+mv val/ILSVRC2012_val_00032146.JPEG n01882714/
+mv val/ILSVRC2012_val_00032147.JPEG n03100240/
+mv val/ILSVRC2012_val_00032148.JPEG n02105505/
+mv val/ILSVRC2012_val_00032149.JPEG n03623198/
+mv val/ILSVRC2012_val_00032150.JPEG n02128925/
+mv val/ILSVRC2012_val_00032151.JPEG n07749582/
+mv val/ILSVRC2012_val_00032152.JPEG n03124170/
+mv val/ILSVRC2012_val_00032153.JPEG n03042490/
+mv val/ILSVRC2012_val_00032154.JPEG n01531178/
+mv val/ILSVRC2012_val_00032155.JPEG n03180011/
+mv val/ILSVRC2012_val_00032156.JPEG n02276258/
+mv val/ILSVRC2012_val_00032157.JPEG n03538406/
+mv val/ILSVRC2012_val_00032158.JPEG n01843383/
+mv val/ILSVRC2012_val_00032159.JPEG n01833805/
+mv val/ILSVRC2012_val_00032160.JPEG n02109047/
+mv val/ILSVRC2012_val_00032161.JPEG n01735189/
+mv val/ILSVRC2012_val_00032162.JPEG n01514859/
+mv val/ILSVRC2012_val_00032163.JPEG n02396427/
+mv val/ILSVRC2012_val_00032164.JPEG n01537544/
+mv val/ILSVRC2012_val_00032165.JPEG n07920052/
+mv val/ILSVRC2012_val_00032166.JPEG n02077923/
+mv val/ILSVRC2012_val_00032167.JPEG n03661043/
+mv val/ILSVRC2012_val_00032168.JPEG n03445924/
+mv val/ILSVRC2012_val_00032169.JPEG n01514859/
+mv val/ILSVRC2012_val_00032170.JPEG n04418357/
+mv val/ILSVRC2012_val_00032171.JPEG n01630670/
+mv val/ILSVRC2012_val_00032172.JPEG n02256656/
+mv val/ILSVRC2012_val_00032173.JPEG n02980441/
+mv val/ILSVRC2012_val_00032174.JPEG n01985128/
+mv val/ILSVRC2012_val_00032175.JPEG n03787032/
+mv val/ILSVRC2012_val_00032176.JPEG n09399592/
+mv val/ILSVRC2012_val_00032177.JPEG n02096177/
+mv val/ILSVRC2012_val_00032178.JPEG n03095699/
+mv val/ILSVRC2012_val_00032179.JPEG n02791270/
+mv val/ILSVRC2012_val_00032180.JPEG n02002556/
+mv val/ILSVRC2012_val_00032181.JPEG n02099429/
+mv val/ILSVRC2012_val_00032182.JPEG n02687172/
+mv val/ILSVRC2012_val_00032183.JPEG n04487081/
+mv val/ILSVRC2012_val_00032184.JPEG n03775071/
+mv val/ILSVRC2012_val_00032185.JPEG n04120489/
+mv val/ILSVRC2012_val_00032186.JPEG n02100877/
+mv val/ILSVRC2012_val_00032187.JPEG n04131690/
+mv val/ILSVRC2012_val_00032188.JPEG n02111277/
+mv val/ILSVRC2012_val_00032189.JPEG n04008634/
+mv val/ILSVRC2012_val_00032190.JPEG n03796401/
+mv val/ILSVRC2012_val_00032191.JPEG n03690938/
+mv val/ILSVRC2012_val_00032192.JPEG n03496892/
+mv val/ILSVRC2012_val_00032193.JPEG n02487347/
+mv val/ILSVRC2012_val_00032194.JPEG n02098286/
+mv val/ILSVRC2012_val_00032195.JPEG n04398044/
+mv val/ILSVRC2012_val_00032196.JPEG n02281787/
+mv val/ILSVRC2012_val_00032197.JPEG n02641379/
+mv val/ILSVRC2012_val_00032198.JPEG n03179701/
+mv val/ILSVRC2012_val_00032199.JPEG n03110669/
+mv val/ILSVRC2012_val_00032200.JPEG n03314780/
+mv val/ILSVRC2012_val_00032201.JPEG n03388549/
+mv val/ILSVRC2012_val_00032202.JPEG n02441942/
+mv val/ILSVRC2012_val_00032203.JPEG n02091831/
+mv val/ILSVRC2012_val_00032204.JPEG n03933933/
+mv val/ILSVRC2012_val_00032205.JPEG n07584110/
+mv val/ILSVRC2012_val_00032206.JPEG n02510455/
+mv val/ILSVRC2012_val_00032207.JPEG n02437312/
+mv val/ILSVRC2012_val_00032208.JPEG n02417914/
+mv val/ILSVRC2012_val_00032209.JPEG n02110806/
+mv val/ILSVRC2012_val_00032210.JPEG n02667093/
+mv val/ILSVRC2012_val_00032211.JPEG n03384352/
+mv val/ILSVRC2012_val_00032212.JPEG n03529860/
+mv val/ILSVRC2012_val_00032213.JPEG n04209239/
+mv val/ILSVRC2012_val_00032214.JPEG n04254120/
+mv val/ILSVRC2012_val_00032215.JPEG n04310018/
+mv val/ILSVRC2012_val_00032216.JPEG n07615774/
+mv val/ILSVRC2012_val_00032217.JPEG n01984695/
+mv val/ILSVRC2012_val_00032218.JPEG n03188531/
+mv val/ILSVRC2012_val_00032219.JPEG n02701002/
+mv val/ILSVRC2012_val_00032220.JPEG n01749939/
+mv val/ILSVRC2012_val_00032221.JPEG n03494278/
+mv val/ILSVRC2012_val_00032222.JPEG n04317175/
+mv val/ILSVRC2012_val_00032223.JPEG n02480855/
+mv val/ILSVRC2012_val_00032224.JPEG n04553703/
+mv val/ILSVRC2012_val_00032225.JPEG n04591713/
+mv val/ILSVRC2012_val_00032226.JPEG n02093991/
+mv val/ILSVRC2012_val_00032227.JPEG n03496892/
+mv val/ILSVRC2012_val_00032228.JPEG n03498962/
+mv val/ILSVRC2012_val_00032229.JPEG n02870880/
+mv val/ILSVRC2012_val_00032230.JPEG n07734744/
+mv val/ILSVRC2012_val_00032231.JPEG n02090622/
+mv val/ILSVRC2012_val_00032232.JPEG n02095889/
+mv val/ILSVRC2012_val_00032233.JPEG n03089624/
+mv val/ILSVRC2012_val_00032234.JPEG n03814906/
+mv val/ILSVRC2012_val_00032235.JPEG n01443537/
+mv val/ILSVRC2012_val_00032236.JPEG n03775546/
+mv val/ILSVRC2012_val_00032237.JPEG n03895866/
+mv val/ILSVRC2012_val_00032238.JPEG n04254680/
+mv val/ILSVRC2012_val_00032239.JPEG n02093991/
+mv val/ILSVRC2012_val_00032240.JPEG n02094433/
+mv val/ILSVRC2012_val_00032241.JPEG n03709823/
+mv val/ILSVRC2012_val_00032242.JPEG n04133789/
+mv val/ILSVRC2012_val_00032243.JPEG n04356056/
+mv val/ILSVRC2012_val_00032244.JPEG n09421951/
+mv val/ILSVRC2012_val_00032245.JPEG n03781244/
+mv val/ILSVRC2012_val_00032246.JPEG n03970156/
+mv val/ILSVRC2012_val_00032247.JPEG n03709823/
+mv val/ILSVRC2012_val_00032248.JPEG n03873416/
+mv val/ILSVRC2012_val_00032249.JPEG n03950228/
+mv val/ILSVRC2012_val_00032250.JPEG n03425413/
+mv val/ILSVRC2012_val_00032251.JPEG n09229709/
+mv val/ILSVRC2012_val_00032252.JPEG n03141823/
+mv val/ILSVRC2012_val_00032253.JPEG n03290653/
+mv val/ILSVRC2012_val_00032254.JPEG n01675722/
+mv val/ILSVRC2012_val_00032255.JPEG n04259630/
+mv val/ILSVRC2012_val_00032256.JPEG n04613696/
+mv val/ILSVRC2012_val_00032257.JPEG n03838899/
+mv val/ILSVRC2012_val_00032258.JPEG n01443537/
+mv val/ILSVRC2012_val_00032259.JPEG n03617480/
+mv val/ILSVRC2012_val_00032260.JPEG n02112350/
+mv val/ILSVRC2012_val_00032261.JPEG n01774384/
+mv val/ILSVRC2012_val_00032262.JPEG n02108915/
+mv val/ILSVRC2012_val_00032263.JPEG n03876231/
+mv val/ILSVRC2012_val_00032264.JPEG n02099429/
+mv val/ILSVRC2012_val_00032265.JPEG n02226429/
+mv val/ILSVRC2012_val_00032266.JPEG n01770393/
+mv val/ILSVRC2012_val_00032267.JPEG n01694178/
+mv val/ILSVRC2012_val_00032268.JPEG n06794110/
+mv val/ILSVRC2012_val_00032269.JPEG n03220513/
+mv val/ILSVRC2012_val_00032270.JPEG n11879895/
+mv val/ILSVRC2012_val_00032271.JPEG n03124043/
+mv val/ILSVRC2012_val_00032272.JPEG n02105855/
+mv val/ILSVRC2012_val_00032273.JPEG n02486410/
+mv val/ILSVRC2012_val_00032274.JPEG n04004767/
+mv val/ILSVRC2012_val_00032275.JPEG n09835506/
+mv val/ILSVRC2012_val_00032276.JPEG n07745940/
+mv val/ILSVRC2012_val_00032277.JPEG n02097047/
+mv val/ILSVRC2012_val_00032278.JPEG n03721384/
+mv val/ILSVRC2012_val_00032279.JPEG n03133878/
+mv val/ILSVRC2012_val_00032280.JPEG n02093647/
+mv val/ILSVRC2012_val_00032281.JPEG n06794110/
+mv val/ILSVRC2012_val_00032282.JPEG n04317175/
+mv val/ILSVRC2012_val_00032283.JPEG n02134418/
+mv val/ILSVRC2012_val_00032284.JPEG n02692877/
+mv val/ILSVRC2012_val_00032285.JPEG n02128757/
+mv val/ILSVRC2012_val_00032286.JPEG n03794056/
+mv val/ILSVRC2012_val_00032287.JPEG n02727426/
+mv val/ILSVRC2012_val_00032288.JPEG n01484850/
+mv val/ILSVRC2012_val_00032289.JPEG n02514041/
+mv val/ILSVRC2012_val_00032290.JPEG n02106382/
+mv val/ILSVRC2012_val_00032291.JPEG n02097298/
+mv val/ILSVRC2012_val_00032292.JPEG n04613696/
+mv val/ILSVRC2012_val_00032293.JPEG n02701002/
+mv val/ILSVRC2012_val_00032294.JPEG n03770439/
+mv val/ILSVRC2012_val_00032295.JPEG n01855672/
+mv val/ILSVRC2012_val_00032296.JPEG n02328150/
+mv val/ILSVRC2012_val_00032297.JPEG n03944341/
+mv val/ILSVRC2012_val_00032298.JPEG n09468604/
+mv val/ILSVRC2012_val_00032299.JPEG n02281787/
+mv val/ILSVRC2012_val_00032300.JPEG n04554684/
+mv val/ILSVRC2012_val_00032301.JPEG n02098105/
+mv val/ILSVRC2012_val_00032302.JPEG n03179701/
+mv val/ILSVRC2012_val_00032303.JPEG n02174001/
+mv val/ILSVRC2012_val_00032304.JPEG n02109961/
+mv val/ILSVRC2012_val_00032305.JPEG n03742115/
+mv val/ILSVRC2012_val_00032306.JPEG n04562935/
+mv val/ILSVRC2012_val_00032307.JPEG n03729826/
+mv val/ILSVRC2012_val_00032308.JPEG n04133789/
+mv val/ILSVRC2012_val_00032309.JPEG n04086273/
+mv val/ILSVRC2012_val_00032310.JPEG n01514859/
+mv val/ILSVRC2012_val_00032311.JPEG n04597913/
+mv val/ILSVRC2012_val_00032312.JPEG n04476259/
+mv val/ILSVRC2012_val_00032313.JPEG n01914609/
+mv val/ILSVRC2012_val_00032314.JPEG n02095889/
+mv val/ILSVRC2012_val_00032315.JPEG n03125729/
+mv val/ILSVRC2012_val_00032316.JPEG n04366367/
+mv val/ILSVRC2012_val_00032317.JPEG n02443114/
+mv val/ILSVRC2012_val_00032318.JPEG n02098413/
+mv val/ILSVRC2012_val_00032319.JPEG n03599486/
+mv val/ILSVRC2012_val_00032320.JPEG n01614925/
+mv val/ILSVRC2012_val_00032321.JPEG n04483307/
+mv val/ILSVRC2012_val_00032322.JPEG n02105412/
+mv val/ILSVRC2012_val_00032323.JPEG n01631663/
+mv val/ILSVRC2012_val_00032324.JPEG n02500267/
+mv val/ILSVRC2012_val_00032325.JPEG n02095889/
+mv val/ILSVRC2012_val_00032326.JPEG n04264628/
+mv val/ILSVRC2012_val_00032327.JPEG n07753592/
+mv val/ILSVRC2012_val_00032328.JPEG n02123597/
+mv val/ILSVRC2012_val_00032329.JPEG n03884397/
+mv val/ILSVRC2012_val_00032330.JPEG n04579432/
+mv val/ILSVRC2012_val_00032331.JPEG n03938244/
+mv val/ILSVRC2012_val_00032332.JPEG n07831146/
+mv val/ILSVRC2012_val_00032333.JPEG n02101006/
+mv val/ILSVRC2012_val_00032334.JPEG n02092002/
+mv val/ILSVRC2012_val_00032335.JPEG n02006656/
+mv val/ILSVRC2012_val_00032336.JPEG n02106166/
+mv val/ILSVRC2012_val_00032337.JPEG n04596742/
+mv val/ILSVRC2012_val_00032338.JPEG n03770679/
+mv val/ILSVRC2012_val_00032339.JPEG n04149813/
+mv val/ILSVRC2012_val_00032340.JPEG n04599235/
+mv val/ILSVRC2012_val_00032341.JPEG n04332243/
+mv val/ILSVRC2012_val_00032342.JPEG n03379051/
+mv val/ILSVRC2012_val_00032343.JPEG n01776313/
+mv val/ILSVRC2012_val_00032344.JPEG n01806567/
+mv val/ILSVRC2012_val_00032345.JPEG n09468604/
+mv val/ILSVRC2012_val_00032346.JPEG n04554684/
+mv val/ILSVRC2012_val_00032347.JPEG n02747177/
+mv val/ILSVRC2012_val_00032348.JPEG n04243546/
+mv val/ILSVRC2012_val_00032349.JPEG n03838899/
+mv val/ILSVRC2012_val_00032350.JPEG n01855032/
+mv val/ILSVRC2012_val_00032351.JPEG n01917289/
+mv val/ILSVRC2012_val_00032352.JPEG n02226429/
+mv val/ILSVRC2012_val_00032353.JPEG n03706229/
+mv val/ILSVRC2012_val_00032354.JPEG n03843555/
+mv val/ILSVRC2012_val_00032355.JPEG n07615774/
+mv val/ILSVRC2012_val_00032356.JPEG n02268853/
+mv val/ILSVRC2012_val_00032357.JPEG n04141975/
+mv val/ILSVRC2012_val_00032358.JPEG n01728920/
+mv val/ILSVRC2012_val_00032359.JPEG n01531178/
+mv val/ILSVRC2012_val_00032360.JPEG n03838899/
+mv val/ILSVRC2012_val_00032361.JPEG n09472597/
+mv val/ILSVRC2012_val_00032362.JPEG n01847000/
+mv val/ILSVRC2012_val_00032363.JPEG n13133613/
+mv val/ILSVRC2012_val_00032364.JPEG n04522168/
+mv val/ILSVRC2012_val_00032365.JPEG n02088466/
+mv val/ILSVRC2012_val_00032366.JPEG n09193705/
+mv val/ILSVRC2012_val_00032367.JPEG n03445924/
+mv val/ILSVRC2012_val_00032368.JPEG n02092002/
+mv val/ILSVRC2012_val_00032369.JPEG n02640242/
+mv val/ILSVRC2012_val_00032370.JPEG n07742313/
+mv val/ILSVRC2012_val_00032371.JPEG n04612504/
+mv val/ILSVRC2012_val_00032372.JPEG n01986214/
+mv val/ILSVRC2012_val_00032373.JPEG n09229709/
+mv val/ILSVRC2012_val_00032374.JPEG n02488291/
+mv val/ILSVRC2012_val_00032375.JPEG n02643566/
+mv val/ILSVRC2012_val_00032376.JPEG n03891251/
+mv val/ILSVRC2012_val_00032377.JPEG n09468604/
+mv val/ILSVRC2012_val_00032378.JPEG n01983481/
+mv val/ILSVRC2012_val_00032379.JPEG n07920052/
+mv val/ILSVRC2012_val_00032380.JPEG n03770679/
+mv val/ILSVRC2012_val_00032381.JPEG n02097130/
+mv val/ILSVRC2012_val_00032382.JPEG n03769881/
+mv val/ILSVRC2012_val_00032383.JPEG n03498962/
+mv val/ILSVRC2012_val_00032384.JPEG n07697537/
+mv val/ILSVRC2012_val_00032385.JPEG n02422699/
+mv val/ILSVRC2012_val_00032386.JPEG n04254777/
+mv val/ILSVRC2012_val_00032387.JPEG n03452741/
+mv val/ILSVRC2012_val_00032388.JPEG n04152593/
+mv val/ILSVRC2012_val_00032389.JPEG n01616318/
+mv val/ILSVRC2012_val_00032390.JPEG n02259212/
+mv val/ILSVRC2012_val_00032391.JPEG n03690938/
+mv val/ILSVRC2012_val_00032392.JPEG n04501370/
+mv val/ILSVRC2012_val_00032393.JPEG n04355933/
+mv val/ILSVRC2012_val_00032394.JPEG n01498041/
+mv val/ILSVRC2012_val_00032395.JPEG n04023962/
+mv val/ILSVRC2012_val_00032396.JPEG n02488702/
+mv val/ILSVRC2012_val_00032397.JPEG n04443257/
+mv val/ILSVRC2012_val_00032398.JPEG n02091134/
+mv val/ILSVRC2012_val_00032399.JPEG n02978881/
+mv val/ILSVRC2012_val_00032400.JPEG n02091244/
+mv val/ILSVRC2012_val_00032401.JPEG n01756291/
+mv val/ILSVRC2012_val_00032402.JPEG n04120489/
+mv val/ILSVRC2012_val_00032403.JPEG n04141327/
+mv val/ILSVRC2012_val_00032404.JPEG n02504458/
+mv val/ILSVRC2012_val_00032405.JPEG n01667778/
+mv val/ILSVRC2012_val_00032406.JPEG n02108089/
+mv val/ILSVRC2012_val_00032407.JPEG n03843555/
+mv val/ILSVRC2012_val_00032408.JPEG n02951358/
+mv val/ILSVRC2012_val_00032409.JPEG n01807496/
+mv val/ILSVRC2012_val_00032410.JPEG n02102318/
+mv val/ILSVRC2012_val_00032411.JPEG n07745940/
+mv val/ILSVRC2012_val_00032412.JPEG n06794110/
+mv val/ILSVRC2012_val_00032413.JPEG n02363005/
+mv val/ILSVRC2012_val_00032414.JPEG n07753113/
+mv val/ILSVRC2012_val_00032415.JPEG n01644900/
+mv val/ILSVRC2012_val_00032416.JPEG n02363005/
+mv val/ILSVRC2012_val_00032417.JPEG n01484850/
+mv val/ILSVRC2012_val_00032418.JPEG n02105056/
+mv val/ILSVRC2012_val_00032419.JPEG n02107312/
+mv val/ILSVRC2012_val_00032420.JPEG n03482405/
+mv val/ILSVRC2012_val_00032421.JPEG n01945685/
+mv val/ILSVRC2012_val_00032422.JPEG n02823750/
+mv val/ILSVRC2012_val_00032423.JPEG n02090622/
+mv val/ILSVRC2012_val_00032424.JPEG n03710193/
+mv val/ILSVRC2012_val_00032425.JPEG n03379051/
+mv val/ILSVRC2012_val_00032426.JPEG n07873807/
+mv val/ILSVRC2012_val_00032427.JPEG n04263257/
+mv val/ILSVRC2012_val_00032428.JPEG n03062245/
+mv val/ILSVRC2012_val_00032429.JPEG n02088632/
+mv val/ILSVRC2012_val_00032430.JPEG n04208210/
+mv val/ILSVRC2012_val_00032431.JPEG n04141327/
+mv val/ILSVRC2012_val_00032432.JPEG n07932039/
+mv val/ILSVRC2012_val_00032433.JPEG n02951358/
+mv val/ILSVRC2012_val_00032434.JPEG n02790996/
+mv val/ILSVRC2012_val_00032435.JPEG n02777292/
+mv val/ILSVRC2012_val_00032436.JPEG n02804414/
+mv val/ILSVRC2012_val_00032437.JPEG n03970156/
+mv val/ILSVRC2012_val_00032438.JPEG n04501370/
+mv val/ILSVRC2012_val_00032439.JPEG n02641379/
+mv val/ILSVRC2012_val_00032440.JPEG n01774750/
+mv val/ILSVRC2012_val_00032441.JPEG n01498041/
+mv val/ILSVRC2012_val_00032442.JPEG n04116512/
+mv val/ILSVRC2012_val_00032443.JPEG n02233338/
+mv val/ILSVRC2012_val_00032444.JPEG n03706229/
+mv val/ILSVRC2012_val_00032445.JPEG n02097047/
+mv val/ILSVRC2012_val_00032446.JPEG n07697537/
+mv val/ILSVRC2012_val_00032447.JPEG n02444819/
+mv val/ILSVRC2012_val_00032448.JPEG n04153751/
+mv val/ILSVRC2012_val_00032449.JPEG n02398521/
+mv val/ILSVRC2012_val_00032450.JPEG n03908714/
+mv val/ILSVRC2012_val_00032451.JPEG n02088632/
+mv val/ILSVRC2012_val_00032452.JPEG n02113712/
+mv val/ILSVRC2012_val_00032453.JPEG n02132136/
+mv val/ILSVRC2012_val_00032454.JPEG n04258138/
+mv val/ILSVRC2012_val_00032455.JPEG n03425413/
+mv val/ILSVRC2012_val_00032456.JPEG n02397096/
+mv val/ILSVRC2012_val_00032457.JPEG n02443484/
+mv val/ILSVRC2012_val_00032458.JPEG n06785654/
+mv val/ILSVRC2012_val_00032459.JPEG n04367480/
+mv val/ILSVRC2012_val_00032460.JPEG n03717622/
+mv val/ILSVRC2012_val_00032461.JPEG n03721384/
+mv val/ILSVRC2012_val_00032462.JPEG n02981792/
+mv val/ILSVRC2012_val_00032463.JPEG n01955084/
+mv val/ILSVRC2012_val_00032464.JPEG n02090721/
+mv val/ILSVRC2012_val_00032465.JPEG n02879718/
+mv val/ILSVRC2012_val_00032466.JPEG n02113712/
+mv val/ILSVRC2012_val_00032467.JPEG n02417914/
+mv val/ILSVRC2012_val_00032468.JPEG n02093859/
+mv val/ILSVRC2012_val_00032469.JPEG n02009912/
+mv val/ILSVRC2012_val_00032470.JPEG n02006656/
+mv val/ILSVRC2012_val_00032471.JPEG n01770393/
+mv val/ILSVRC2012_val_00032472.JPEG n02701002/
+mv val/ILSVRC2012_val_00032473.JPEG n01818515/
+mv val/ILSVRC2012_val_00032474.JPEG n12998815/
+mv val/ILSVRC2012_val_00032475.JPEG n03532672/
+mv val/ILSVRC2012_val_00032476.JPEG n03666591/
+mv val/ILSVRC2012_val_00032477.JPEG n06794110/
+mv val/ILSVRC2012_val_00032478.JPEG n03110669/
+mv val/ILSVRC2012_val_00032479.JPEG n03220513/
+mv val/ILSVRC2012_val_00032480.JPEG n03976467/
+mv val/ILSVRC2012_val_00032481.JPEG n02396427/
+mv val/ILSVRC2012_val_00032482.JPEG n03888257/
+mv val/ILSVRC2012_val_00032483.JPEG n02514041/
+mv val/ILSVRC2012_val_00032484.JPEG n02837789/
+mv val/ILSVRC2012_val_00032485.JPEG n07711569/
+mv val/ILSVRC2012_val_00032486.JPEG n07613480/
+mv val/ILSVRC2012_val_00032487.JPEG n03075370/
+mv val/ILSVRC2012_val_00032488.JPEG n07684084/
+mv val/ILSVRC2012_val_00032489.JPEG n02708093/
+mv val/ILSVRC2012_val_00032490.JPEG n02099267/
+mv val/ILSVRC2012_val_00032491.JPEG n03131574/
+mv val/ILSVRC2012_val_00032492.JPEG n01843383/
+mv val/ILSVRC2012_val_00032493.JPEG n02091032/
+mv val/ILSVRC2012_val_00032494.JPEG n03796401/
+mv val/ILSVRC2012_val_00032495.JPEG n04243546/
+mv val/ILSVRC2012_val_00032496.JPEG n04389033/
+mv val/ILSVRC2012_val_00032497.JPEG n03014705/
+mv val/ILSVRC2012_val_00032498.JPEG n03868863/
+mv val/ILSVRC2012_val_00032499.JPEG n01883070/
+mv val/ILSVRC2012_val_00032500.JPEG n01744401/
+mv val/ILSVRC2012_val_00032501.JPEG n12267677/
+mv val/ILSVRC2012_val_00032502.JPEG n03876231/
+mv val/ILSVRC2012_val_00032503.JPEG n01847000/
+mv val/ILSVRC2012_val_00032504.JPEG n02219486/
+mv val/ILSVRC2012_val_00032505.JPEG n01955084/
+mv val/ILSVRC2012_val_00032506.JPEG n03089624/
+mv val/ILSVRC2012_val_00032507.JPEG n04350905/
+mv val/ILSVRC2012_val_00032508.JPEG n02119022/
+mv val/ILSVRC2012_val_00032509.JPEG n04004767/
+mv val/ILSVRC2012_val_00032510.JPEG n02793495/
+mv val/ILSVRC2012_val_00032511.JPEG n03404251/
+mv val/ILSVRC2012_val_00032512.JPEG n03014705/
+mv val/ILSVRC2012_val_00032513.JPEG n01677366/
+mv val/ILSVRC2012_val_00032514.JPEG n03690938/
+mv val/ILSVRC2012_val_00032515.JPEG n04162706/
+mv val/ILSVRC2012_val_00032516.JPEG n04552348/
+mv val/ILSVRC2012_val_00032517.JPEG n01985128/
+mv val/ILSVRC2012_val_00032518.JPEG n07873807/
+mv val/ILSVRC2012_val_00032519.JPEG n02526121/
+mv val/ILSVRC2012_val_00032520.JPEG n07932039/
+mv val/ILSVRC2012_val_00032521.JPEG n02102973/
+mv val/ILSVRC2012_val_00032522.JPEG n02108000/
+mv val/ILSVRC2012_val_00032523.JPEG n04493381/
+mv val/ILSVRC2012_val_00032524.JPEG n02097130/
+mv val/ILSVRC2012_val_00032525.JPEG n04086273/
+mv val/ILSVRC2012_val_00032526.JPEG n03832673/
+mv val/ILSVRC2012_val_00032527.JPEG n02088364/
+mv val/ILSVRC2012_val_00032528.JPEG n02119789/
+mv val/ILSVRC2012_val_00032529.JPEG n02113712/
+mv val/ILSVRC2012_val_00032530.JPEG n07716906/
+mv val/ILSVRC2012_val_00032531.JPEG n03792972/
+mv val/ILSVRC2012_val_00032532.JPEG n02097658/
+mv val/ILSVRC2012_val_00032533.JPEG n02226429/
+mv val/ILSVRC2012_val_00032534.JPEG n09428293/
+mv val/ILSVRC2012_val_00032535.JPEG n02116738/
+mv val/ILSVRC2012_val_00032536.JPEG n07753113/
+mv val/ILSVRC2012_val_00032537.JPEG n02777292/
+mv val/ILSVRC2012_val_00032538.JPEG n02017213/
+mv val/ILSVRC2012_val_00032539.JPEG n04209239/
+mv val/ILSVRC2012_val_00032540.JPEG n02077923/
+mv val/ILSVRC2012_val_00032541.JPEG n02509815/
+mv val/ILSVRC2012_val_00032542.JPEG n07716906/
+mv val/ILSVRC2012_val_00032543.JPEG n02843684/
+mv val/ILSVRC2012_val_00032544.JPEG n02417914/
+mv val/ILSVRC2012_val_00032545.JPEG n07920052/
+mv val/ILSVRC2012_val_00032546.JPEG n09288635/
+mv val/ILSVRC2012_val_00032547.JPEG n01980166/
+mv val/ILSVRC2012_val_00032548.JPEG n09193705/
+mv val/ILSVRC2012_val_00032549.JPEG n03124043/
+mv val/ILSVRC2012_val_00032550.JPEG n03944341/
+mv val/ILSVRC2012_val_00032551.JPEG n02219486/
+mv val/ILSVRC2012_val_00032552.JPEG n02127052/
+mv val/ILSVRC2012_val_00032553.JPEG n04147183/
+mv val/ILSVRC2012_val_00032554.JPEG n02106550/
+mv val/ILSVRC2012_val_00032555.JPEG n04550184/
+mv val/ILSVRC2012_val_00032556.JPEG n01728572/
+mv val/ILSVRC2012_val_00032557.JPEG n02102480/
+mv val/ILSVRC2012_val_00032558.JPEG n04371430/
+mv val/ILSVRC2012_val_00032559.JPEG n03983396/
+mv val/ILSVRC2012_val_00032560.JPEG n02815834/
+mv val/ILSVRC2012_val_00032561.JPEG n04264628/
+mv val/ILSVRC2012_val_00032562.JPEG n04356056/
+mv val/ILSVRC2012_val_00032563.JPEG n02096294/
+mv val/ILSVRC2012_val_00032564.JPEG n02106382/
+mv val/ILSVRC2012_val_00032565.JPEG n07579787/
+mv val/ILSVRC2012_val_00032566.JPEG n02536864/
+mv val/ILSVRC2012_val_00032567.JPEG n03630383/
+mv val/ILSVRC2012_val_00032568.JPEG n02114367/
+mv val/ILSVRC2012_val_00032569.JPEG n03781244/
+mv val/ILSVRC2012_val_00032570.JPEG n03271574/
+mv val/ILSVRC2012_val_00032571.JPEG n01739381/
+mv val/ILSVRC2012_val_00032572.JPEG n04008634/
+mv val/ILSVRC2012_val_00032573.JPEG n03594734/
+mv val/ILSVRC2012_val_00032574.JPEG n03201208/
+mv val/ILSVRC2012_val_00032575.JPEG n02058221/
+mv val/ILSVRC2012_val_00032576.JPEG n02134418/
+mv val/ILSVRC2012_val_00032577.JPEG n10148035/
+mv val/ILSVRC2012_val_00032578.JPEG n01631663/
+mv val/ILSVRC2012_val_00032579.JPEG n02526121/
+mv val/ILSVRC2012_val_00032580.JPEG n02002556/
+mv val/ILSVRC2012_val_00032581.JPEG n02095314/
+mv val/ILSVRC2012_val_00032582.JPEG n02098105/
+mv val/ILSVRC2012_val_00032583.JPEG n04509417/
+mv val/ILSVRC2012_val_00032584.JPEG n04612504/
+mv val/ILSVRC2012_val_00032585.JPEG n02497673/
+mv val/ILSVRC2012_val_00032586.JPEG n01580077/
+mv val/ILSVRC2012_val_00032587.JPEG n01697457/
+mv val/ILSVRC2012_val_00032588.JPEG n03109150/
+mv val/ILSVRC2012_val_00032589.JPEG n09468604/
+mv val/ILSVRC2012_val_00032590.JPEG n03874293/
+mv val/ILSVRC2012_val_00032591.JPEG n02109961/
+mv val/ILSVRC2012_val_00032592.JPEG n02110627/
+mv val/ILSVRC2012_val_00032593.JPEG n02892201/
+mv val/ILSVRC2012_val_00032594.JPEG n02088364/
+mv val/ILSVRC2012_val_00032595.JPEG n03100240/
+mv val/ILSVRC2012_val_00032596.JPEG n03532672/
+mv val/ILSVRC2012_val_00032597.JPEG n02892767/
+mv val/ILSVRC2012_val_00032598.JPEG n07860988/
+mv val/ILSVRC2012_val_00032599.JPEG n03337140/
+mv val/ILSVRC2012_val_00032600.JPEG n02951358/
+mv val/ILSVRC2012_val_00032601.JPEG n03691459/
+mv val/ILSVRC2012_val_00032602.JPEG n03134739/
+mv val/ILSVRC2012_val_00032603.JPEG n02422106/
+mv val/ILSVRC2012_val_00032604.JPEG n02788148/
+mv val/ILSVRC2012_val_00032605.JPEG n03814906/
+mv val/ILSVRC2012_val_00032606.JPEG n02444819/
+mv val/ILSVRC2012_val_00032607.JPEG n06785654/
+mv val/ILSVRC2012_val_00032608.JPEG n04612504/
+mv val/ILSVRC2012_val_00032609.JPEG n02123394/
+mv val/ILSVRC2012_val_00032610.JPEG n03042490/
+mv val/ILSVRC2012_val_00032611.JPEG n04116512/
+mv val/ILSVRC2012_val_00032612.JPEG n03527444/
+mv val/ILSVRC2012_val_00032613.JPEG n09288635/
+mv val/ILSVRC2012_val_00032614.JPEG n01983481/
+mv val/ILSVRC2012_val_00032615.JPEG n09332890/
+mv val/ILSVRC2012_val_00032616.JPEG n07715103/
+mv val/ILSVRC2012_val_00032617.JPEG n01828970/
+mv val/ILSVRC2012_val_00032618.JPEG n04037443/
+mv val/ILSVRC2012_val_00032619.JPEG n03089624/
+mv val/ILSVRC2012_val_00032620.JPEG n02504458/
+mv val/ILSVRC2012_val_00032621.JPEG n01917289/
+mv val/ILSVRC2012_val_00032622.JPEG n03223299/
+mv val/ILSVRC2012_val_00032623.JPEG n02119022/
+mv val/ILSVRC2012_val_00032624.JPEG n02206856/
+mv val/ILSVRC2012_val_00032625.JPEG n04252077/
+mv val/ILSVRC2012_val_00032626.JPEG n02012849/
+mv val/ILSVRC2012_val_00032627.JPEG n02037110/
+mv val/ILSVRC2012_val_00032628.JPEG n01751748/
+mv val/ILSVRC2012_val_00032629.JPEG n07930864/
+mv val/ILSVRC2012_val_00032630.JPEG n04131690/
+mv val/ILSVRC2012_val_00032631.JPEG n07697313/
+mv val/ILSVRC2012_val_00032632.JPEG n02841315/
+mv val/ILSVRC2012_val_00032633.JPEG n03950228/
+mv val/ILSVRC2012_val_00032634.JPEG n04254680/
+mv val/ILSVRC2012_val_00032635.JPEG n04141975/
+mv val/ILSVRC2012_val_00032636.JPEG n03983396/
+mv val/ILSVRC2012_val_00032637.JPEG n02124075/
+mv val/ILSVRC2012_val_00032638.JPEG n12998815/
+mv val/ILSVRC2012_val_00032639.JPEG n03709823/
+mv val/ILSVRC2012_val_00032640.JPEG n01689811/
+mv val/ILSVRC2012_val_00032641.JPEG n02966687/
+mv val/ILSVRC2012_val_00032642.JPEG n03590841/
+mv val/ILSVRC2012_val_00032643.JPEG n02002556/
+mv val/ILSVRC2012_val_00032644.JPEG n01770393/
+mv val/ILSVRC2012_val_00032645.JPEG n04532106/
+mv val/ILSVRC2012_val_00032646.JPEG n02109961/
+mv val/ILSVRC2012_val_00032647.JPEG n04286575/
+mv val/ILSVRC2012_val_00032648.JPEG n02910353/
+mv val/ILSVRC2012_val_00032649.JPEG n03785016/
+mv val/ILSVRC2012_val_00032650.JPEG n04125021/
+mv val/ILSVRC2012_val_00032651.JPEG n04370456/
+mv val/ILSVRC2012_val_00032652.JPEG n02115641/
+mv val/ILSVRC2012_val_00032653.JPEG n03874293/
+mv val/ILSVRC2012_val_00032654.JPEG n13054560/
+mv val/ILSVRC2012_val_00032655.JPEG n02480855/
+mv val/ILSVRC2012_val_00032656.JPEG n02105855/
+mv val/ILSVRC2012_val_00032657.JPEG n01773157/
+mv val/ILSVRC2012_val_00032658.JPEG n02108915/
+mv val/ILSVRC2012_val_00032659.JPEG n02108000/
+mv val/ILSVRC2012_val_00032660.JPEG n03764736/
+mv val/ILSVRC2012_val_00032661.JPEG n02231487/
+mv val/ILSVRC2012_val_00032662.JPEG n04507155/
+mv val/ILSVRC2012_val_00032663.JPEG n01744401/
+mv val/ILSVRC2012_val_00032664.JPEG n04325704/
+mv val/ILSVRC2012_val_00032665.JPEG n02526121/
+mv val/ILSVRC2012_val_00032666.JPEG n04371774/
+mv val/ILSVRC2012_val_00032667.JPEG n01582220/
+mv val/ILSVRC2012_val_00032668.JPEG n02088094/
+mv val/ILSVRC2012_val_00032669.JPEG n12267677/
+mv val/ILSVRC2012_val_00032670.JPEG n07880968/
+mv val/ILSVRC2012_val_00032671.JPEG n04266014/
+mv val/ILSVRC2012_val_00032672.JPEG n02417914/
+mv val/ILSVRC2012_val_00032673.JPEG n04270147/
+mv val/ILSVRC2012_val_00032674.JPEG n07684084/
+mv val/ILSVRC2012_val_00032675.JPEG n01443537/
+mv val/ILSVRC2012_val_00032676.JPEG n03866082/
+mv val/ILSVRC2012_val_00032677.JPEG n04179913/
+mv val/ILSVRC2012_val_00032678.JPEG n02422106/
+mv val/ILSVRC2012_val_00032679.JPEG n07697537/
+mv val/ILSVRC2012_val_00032680.JPEG n02687172/
+mv val/ILSVRC2012_val_00032681.JPEG n03803284/
+mv val/ILSVRC2012_val_00032682.JPEG n01692333/
+mv val/ILSVRC2012_val_00032683.JPEG n04192698/
+mv val/ILSVRC2012_val_00032684.JPEG n02481823/
+mv val/ILSVRC2012_val_00032685.JPEG n02115913/
+mv val/ILSVRC2012_val_00032686.JPEG n03404251/
+mv val/ILSVRC2012_val_00032687.JPEG n02138441/
+mv val/ILSVRC2012_val_00032688.JPEG n02999410/
+mv val/ILSVRC2012_val_00032689.JPEG n03388183/
+mv val/ILSVRC2012_val_00032690.JPEG n02317335/
+mv val/ILSVRC2012_val_00032691.JPEG n03759954/
+mv val/ILSVRC2012_val_00032692.JPEG n04335435/
+mv val/ILSVRC2012_val_00032693.JPEG n03814906/
+mv val/ILSVRC2012_val_00032694.JPEG n03692522/
+mv val/ILSVRC2012_val_00032695.JPEG n13052670/
+mv val/ILSVRC2012_val_00032696.JPEG n03729826/
+mv val/ILSVRC2012_val_00032697.JPEG n02790996/
+mv val/ILSVRC2012_val_00032698.JPEG n02012849/
+mv val/ILSVRC2012_val_00032699.JPEG n03935335/
+mv val/ILSVRC2012_val_00032700.JPEG n01667114/
+mv val/ILSVRC2012_val_00032701.JPEG n07836838/
+mv val/ILSVRC2012_val_00032702.JPEG n01580077/
+mv val/ILSVRC2012_val_00032703.JPEG n07615774/
+mv val/ILSVRC2012_val_00032704.JPEG n03535780/
+mv val/ILSVRC2012_val_00032705.JPEG n02226429/
+mv val/ILSVRC2012_val_00032706.JPEG n03903868/
+mv val/ILSVRC2012_val_00032707.JPEG n02999410/
+mv val/ILSVRC2012_val_00032708.JPEG n03532672/
+mv val/ILSVRC2012_val_00032709.JPEG n03498962/
+mv val/ILSVRC2012_val_00032710.JPEG n01531178/
+mv val/ILSVRC2012_val_00032711.JPEG n03868242/
+mv val/ILSVRC2012_val_00032712.JPEG n02128757/
+mv val/ILSVRC2012_val_00032713.JPEG n03793489/
+mv val/ILSVRC2012_val_00032714.JPEG n01755581/
+mv val/ILSVRC2012_val_00032715.JPEG n09332890/
+mv val/ILSVRC2012_val_00032716.JPEG n02087394/
+mv val/ILSVRC2012_val_00032717.JPEG n03920288/
+mv val/ILSVRC2012_val_00032718.JPEG n02128385/
+mv val/ILSVRC2012_val_00032719.JPEG n03495258/
+mv val/ILSVRC2012_val_00032720.JPEG n02114712/
+mv val/ILSVRC2012_val_00032721.JPEG n03976467/
+mv val/ILSVRC2012_val_00032722.JPEG n04259630/
+mv val/ILSVRC2012_val_00032723.JPEG n02794156/
+mv val/ILSVRC2012_val_00032724.JPEG n01774384/
+mv val/ILSVRC2012_val_00032725.JPEG n02091467/
+mv val/ILSVRC2012_val_00032726.JPEG n04467665/
+mv val/ILSVRC2012_val_00032727.JPEG n02091635/
+mv val/ILSVRC2012_val_00032728.JPEG n04579432/
+mv val/ILSVRC2012_val_00032729.JPEG n03599486/
+mv val/ILSVRC2012_val_00032730.JPEG n02328150/
+mv val/ILSVRC2012_val_00032731.JPEG n04147183/
+mv val/ILSVRC2012_val_00032732.JPEG n02486410/
+mv val/ILSVRC2012_val_00032733.JPEG n04252077/
+mv val/ILSVRC2012_val_00032734.JPEG n02395406/
+mv val/ILSVRC2012_val_00032735.JPEG n07584110/
+mv val/ILSVRC2012_val_00032736.JPEG n03075370/
+mv val/ILSVRC2012_val_00032737.JPEG n02138441/
+mv val/ILSVRC2012_val_00032738.JPEG n02105505/
+mv val/ILSVRC2012_val_00032739.JPEG n04311004/
+mv val/ILSVRC2012_val_00032740.JPEG n04086273/
+mv val/ILSVRC2012_val_00032741.JPEG n04435653/
+mv val/ILSVRC2012_val_00032742.JPEG n04467665/
+mv val/ILSVRC2012_val_00032743.JPEG n04201297/
+mv val/ILSVRC2012_val_00032744.JPEG n01689811/
+mv val/ILSVRC2012_val_00032745.JPEG n03345487/
+mv val/ILSVRC2012_val_00032746.JPEG n02090379/
+mv val/ILSVRC2012_val_00032747.JPEG n02776631/
+mv val/ILSVRC2012_val_00032748.JPEG n04023962/
+mv val/ILSVRC2012_val_00032749.JPEG n02114367/
+mv val/ILSVRC2012_val_00032750.JPEG n13044778/
+mv val/ILSVRC2012_val_00032751.JPEG n02917067/
+mv val/ILSVRC2012_val_00032752.JPEG n07711569/
+mv val/ILSVRC2012_val_00032753.JPEG n03452741/
+mv val/ILSVRC2012_val_00032754.JPEG n01734418/
+mv val/ILSVRC2012_val_00032755.JPEG n03272010/
+mv val/ILSVRC2012_val_00032756.JPEG n01744401/
+mv val/ILSVRC2012_val_00032757.JPEG n09399592/
+mv val/ILSVRC2012_val_00032758.JPEG n02114855/
+mv val/ILSVRC2012_val_00032759.JPEG n03594734/
+mv val/ILSVRC2012_val_00032760.JPEG n02860847/
+mv val/ILSVRC2012_val_00032761.JPEG n04141076/
+mv val/ILSVRC2012_val_00032762.JPEG n02133161/
+mv val/ILSVRC2012_val_00032763.JPEG n03804744/
+mv val/ILSVRC2012_val_00032764.JPEG n01924916/
+mv val/ILSVRC2012_val_00032765.JPEG n04532106/
+mv val/ILSVRC2012_val_00032766.JPEG n01770081/
+mv val/ILSVRC2012_val_00032767.JPEG n02096177/
+mv val/ILSVRC2012_val_00032768.JPEG n02797295/
+mv val/ILSVRC2012_val_00032769.JPEG n03188531/
+mv val/ILSVRC2012_val_00032770.JPEG n04204347/
+mv val/ILSVRC2012_val_00032771.JPEG n03063689/
+mv val/ILSVRC2012_val_00032772.JPEG n02841315/
+mv val/ILSVRC2012_val_00032773.JPEG n02276258/
+mv val/ILSVRC2012_val_00032774.JPEG n02086646/
+mv val/ILSVRC2012_val_00032775.JPEG n03775071/
+mv val/ILSVRC2012_val_00032776.JPEG n03947888/
+mv val/ILSVRC2012_val_00032777.JPEG n02137549/
+mv val/ILSVRC2012_val_00032778.JPEG n03063599/
+mv val/ILSVRC2012_val_00032779.JPEG n02074367/
+mv val/ILSVRC2012_val_00032780.JPEG n02051845/
+mv val/ILSVRC2012_val_00032781.JPEG n03832673/
+mv val/ILSVRC2012_val_00032782.JPEG n03982430/
+mv val/ILSVRC2012_val_00032783.JPEG n01776313/
+mv val/ILSVRC2012_val_00032784.JPEG n02102177/
+mv val/ILSVRC2012_val_00032785.JPEG n02106550/
+mv val/ILSVRC2012_val_00032786.JPEG n03929855/
+mv val/ILSVRC2012_val_00032787.JPEG n04201297/
+mv val/ILSVRC2012_val_00032788.JPEG n01592084/
+mv val/ILSVRC2012_val_00032789.JPEG n02906734/
+mv val/ILSVRC2012_val_00032790.JPEG n03124043/
+mv val/ILSVRC2012_val_00032791.JPEG n03598930/
+mv val/ILSVRC2012_val_00032792.JPEG n07590611/
+mv val/ILSVRC2012_val_00032793.JPEG n02091635/
+mv val/ILSVRC2012_val_00032794.JPEG n02128757/
+mv val/ILSVRC2012_val_00032795.JPEG n04204347/
+mv val/ILSVRC2012_val_00032796.JPEG n01698640/
+mv val/ILSVRC2012_val_00032797.JPEG n01955084/
+mv val/ILSVRC2012_val_00032798.JPEG n03891251/
+mv val/ILSVRC2012_val_00032799.JPEG n02823428/
+mv val/ILSVRC2012_val_00032800.JPEG n03417042/
+mv val/ILSVRC2012_val_00032801.JPEG n03666591/
+mv val/ILSVRC2012_val_00032802.JPEG n03958227/
+mv val/ILSVRC2012_val_00032803.JPEG n03895866/
+mv val/ILSVRC2012_val_00032804.JPEG n02690373/
+mv val/ILSVRC2012_val_00032805.JPEG n01667778/
+mv val/ILSVRC2012_val_00032806.JPEG n02692877/
+mv val/ILSVRC2012_val_00032807.JPEG n03532672/
+mv val/ILSVRC2012_val_00032808.JPEG n07920052/
+mv val/ILSVRC2012_val_00032809.JPEG n03924679/
+mv val/ILSVRC2012_val_00032810.JPEG n03085013/
+mv val/ILSVRC2012_val_00032811.JPEG n07697313/
+mv val/ILSVRC2012_val_00032812.JPEG n02444819/
+mv val/ILSVRC2012_val_00032813.JPEG n02992211/
+mv val/ILSVRC2012_val_00032814.JPEG n07248320/
+mv val/ILSVRC2012_val_00032815.JPEG n02950826/
+mv val/ILSVRC2012_val_00032816.JPEG n02077923/
+mv val/ILSVRC2012_val_00032817.JPEG n03786901/
+mv val/ILSVRC2012_val_00032818.JPEG n03016953/
+mv val/ILSVRC2012_val_00032819.JPEG n02111889/
+mv val/ILSVRC2012_val_00032820.JPEG n02892201/
+mv val/ILSVRC2012_val_00032821.JPEG n02786058/
+mv val/ILSVRC2012_val_00032822.JPEG n02106382/
+mv val/ILSVRC2012_val_00032823.JPEG n02877765/
+mv val/ILSVRC2012_val_00032824.JPEG n02687172/
+mv val/ILSVRC2012_val_00032825.JPEG n02747177/
+mv val/ILSVRC2012_val_00032826.JPEG n02105412/
+mv val/ILSVRC2012_val_00032827.JPEG n07753113/
+mv val/ILSVRC2012_val_00032828.JPEG n03207743/
+mv val/ILSVRC2012_val_00032829.JPEG n04418357/
+mv val/ILSVRC2012_val_00032830.JPEG n02009912/
+mv val/ILSVRC2012_val_00032831.JPEG n01580077/
+mv val/ILSVRC2012_val_00032832.JPEG n01616318/
+mv val/ILSVRC2012_val_00032833.JPEG n04273569/
+mv val/ILSVRC2012_val_00032834.JPEG n01945685/
+mv val/ILSVRC2012_val_00032835.JPEG n03706229/
+mv val/ILSVRC2012_val_00032836.JPEG n04326547/
+mv val/ILSVRC2012_val_00032837.JPEG n02105056/
+mv val/ILSVRC2012_val_00032838.JPEG n13037406/
+mv val/ILSVRC2012_val_00032839.JPEG n03459775/
+mv val/ILSVRC2012_val_00032840.JPEG n02526121/
+mv val/ILSVRC2012_val_00032841.JPEG n02837789/
+mv val/ILSVRC2012_val_00032842.JPEG n04346328/
+mv val/ILSVRC2012_val_00032843.JPEG n01819313/
+mv val/ILSVRC2012_val_00032844.JPEG n02321529/
+mv val/ILSVRC2012_val_00032845.JPEG n03916031/
+mv val/ILSVRC2012_val_00032846.JPEG n03026506/
+mv val/ILSVRC2012_val_00032847.JPEG n02105251/
+mv val/ILSVRC2012_val_00032848.JPEG n04599235/
+mv val/ILSVRC2012_val_00032849.JPEG n01518878/
+mv val/ILSVRC2012_val_00032850.JPEG n02110627/
+mv val/ILSVRC2012_val_00032851.JPEG n01984695/
+mv val/ILSVRC2012_val_00032852.JPEG n01943899/
+mv val/ILSVRC2012_val_00032853.JPEG n04069434/
+mv val/ILSVRC2012_val_00032854.JPEG n02113023/
+mv val/ILSVRC2012_val_00032855.JPEG n01531178/
+mv val/ILSVRC2012_val_00032856.JPEG n03947888/
+mv val/ILSVRC2012_val_00032857.JPEG n03733805/
+mv val/ILSVRC2012_val_00032858.JPEG n03873416/
+mv val/ILSVRC2012_val_00032859.JPEG n02087394/
+mv val/ILSVRC2012_val_00032860.JPEG n04273569/
+mv val/ILSVRC2012_val_00032861.JPEG n03690938/
+mv val/ILSVRC2012_val_00032862.JPEG n02281787/
+mv val/ILSVRC2012_val_00032863.JPEG n04515003/
+mv val/ILSVRC2012_val_00032864.JPEG n01630670/
+mv val/ILSVRC2012_val_00032865.JPEG n03445924/
+mv val/ILSVRC2012_val_00032866.JPEG n04317175/
+mv val/ILSVRC2012_val_00032867.JPEG n02395406/
+mv val/ILSVRC2012_val_00032868.JPEG n02018207/
+mv val/ILSVRC2012_val_00032869.JPEG n02128385/
+mv val/ILSVRC2012_val_00032870.JPEG n03255030/
+mv val/ILSVRC2012_val_00032871.JPEG n02169497/
+mv val/ILSVRC2012_val_00032872.JPEG n03717622/
+mv val/ILSVRC2012_val_00032873.JPEG n03602883/
+mv val/ILSVRC2012_val_00032874.JPEG n02488291/
+mv val/ILSVRC2012_val_00032875.JPEG n01622779/
+mv val/ILSVRC2012_val_00032876.JPEG n03992509/
+mv val/ILSVRC2012_val_00032877.JPEG n02877765/
+mv val/ILSVRC2012_val_00032878.JPEG n03873416/
+mv val/ILSVRC2012_val_00032879.JPEG n01855672/
+mv val/ILSVRC2012_val_00032880.JPEG n03478589/
+mv val/ILSVRC2012_val_00032881.JPEG n03404251/
+mv val/ILSVRC2012_val_00032882.JPEG n07584110/
+mv val/ILSVRC2012_val_00032883.JPEG n03980874/
+mv val/ILSVRC2012_val_00032884.JPEG n03476684/
+mv val/ILSVRC2012_val_00032885.JPEG n02138441/
+mv val/ILSVRC2012_val_00032886.JPEG n02977058/
+mv val/ILSVRC2012_val_00032887.JPEG n02105162/
+mv val/ILSVRC2012_val_00032888.JPEG n03485407/
+mv val/ILSVRC2012_val_00032889.JPEG n01616318/
+mv val/ILSVRC2012_val_00032890.JPEG n02051845/
+mv val/ILSVRC2012_val_00032891.JPEG n03793489/
+mv val/ILSVRC2012_val_00032892.JPEG n01768244/
+mv val/ILSVRC2012_val_00032893.JPEG n04209239/
+mv val/ILSVRC2012_val_00032894.JPEG n03930630/
+mv val/ILSVRC2012_val_00032895.JPEG n04532106/
+mv val/ILSVRC2012_val_00032896.JPEG n03259280/
+mv val/ILSVRC2012_val_00032897.JPEG n02841315/
+mv val/ILSVRC2012_val_00032898.JPEG n02966193/
+mv val/ILSVRC2012_val_00032899.JPEG n03980874/
+mv val/ILSVRC2012_val_00032900.JPEG n04532106/
+mv val/ILSVRC2012_val_00032901.JPEG n02981792/
+mv val/ILSVRC2012_val_00032902.JPEG n01776313/
+mv val/ILSVRC2012_val_00032903.JPEG n04355338/
+mv val/ILSVRC2012_val_00032904.JPEG n02110341/
+mv val/ILSVRC2012_val_00032905.JPEG n03697007/
+mv val/ILSVRC2012_val_00032906.JPEG n02454379/
+mv val/ILSVRC2012_val_00032907.JPEG n02655020/
+mv val/ILSVRC2012_val_00032908.JPEG n03841143/
+mv val/ILSVRC2012_val_00032909.JPEG n07584110/
+mv val/ILSVRC2012_val_00032910.JPEG n02123394/
+mv val/ILSVRC2012_val_00032911.JPEG n03255030/
+mv val/ILSVRC2012_val_00032912.JPEG n07711569/
+mv val/ILSVRC2012_val_00032913.JPEG n03724870/
+mv val/ILSVRC2012_val_00032914.JPEG n03110669/
+mv val/ILSVRC2012_val_00032915.JPEG n03133878/
+mv val/ILSVRC2012_val_00032916.JPEG n01641577/
+mv val/ILSVRC2012_val_00032917.JPEG n01644373/
+mv val/ILSVRC2012_val_00032918.JPEG n04049303/
+mv val/ILSVRC2012_val_00032919.JPEG n07768694/
+mv val/ILSVRC2012_val_00032920.JPEG n03075370/
+mv val/ILSVRC2012_val_00032921.JPEG n02823428/
+mv val/ILSVRC2012_val_00032922.JPEG n02640242/
+mv val/ILSVRC2012_val_00032923.JPEG n02104365/
+mv val/ILSVRC2012_val_00032924.JPEG n04009552/
+mv val/ILSVRC2012_val_00032925.JPEG n02129604/
+mv val/ILSVRC2012_val_00032926.JPEG n03733805/
+mv val/ILSVRC2012_val_00032927.JPEG n02281787/
+mv val/ILSVRC2012_val_00032928.JPEG n04208210/
+mv val/ILSVRC2012_val_00032929.JPEG n04067472/
+mv val/ILSVRC2012_val_00032930.JPEG n01514859/
+mv val/ILSVRC2012_val_00032931.JPEG n03384352/
+mv val/ILSVRC2012_val_00032932.JPEG n03544143/
+mv val/ILSVRC2012_val_00032933.JPEG n03355925/
+mv val/ILSVRC2012_val_00032934.JPEG n01694178/
+mv val/ILSVRC2012_val_00032935.JPEG n03950228/
+mv val/ILSVRC2012_val_00032936.JPEG n07717556/
+mv val/ILSVRC2012_val_00032937.JPEG n02317335/
+mv val/ILSVRC2012_val_00032938.JPEG n02113799/
+mv val/ILSVRC2012_val_00032939.JPEG n07583066/
+mv val/ILSVRC2012_val_00032940.JPEG n02999410/
+mv val/ILSVRC2012_val_00032941.JPEG n07760859/
+mv val/ILSVRC2012_val_00032942.JPEG n02410509/
+mv val/ILSVRC2012_val_00032943.JPEG n02013706/
+mv val/ILSVRC2012_val_00032944.JPEG n04285008/
+mv val/ILSVRC2012_val_00032945.JPEG n04296562/
+mv val/ILSVRC2012_val_00032946.JPEG n03196217/
+mv val/ILSVRC2012_val_00032947.JPEG n03000134/
+mv val/ILSVRC2012_val_00032948.JPEG n02110627/
+mv val/ILSVRC2012_val_00032949.JPEG n04442312/
+mv val/ILSVRC2012_val_00032950.JPEG n02787622/
+mv val/ILSVRC2012_val_00032951.JPEG n02443484/
+mv val/ILSVRC2012_val_00032952.JPEG n02137549/
+mv val/ILSVRC2012_val_00032953.JPEG n03337140/
+mv val/ILSVRC2012_val_00032954.JPEG n03594734/
+mv val/ILSVRC2012_val_00032955.JPEG n02879718/
+mv val/ILSVRC2012_val_00032956.JPEG n02415577/
+mv val/ILSVRC2012_val_00032957.JPEG n02092339/
+mv val/ILSVRC2012_val_00032958.JPEG n03450230/
+mv val/ILSVRC2012_val_00032959.JPEG n02102040/
+mv val/ILSVRC2012_val_00032960.JPEG n07747607/
+mv val/ILSVRC2012_val_00032961.JPEG n03085013/
+mv val/ILSVRC2012_val_00032962.JPEG n03026506/
+mv val/ILSVRC2012_val_00032963.JPEG n06874185/
+mv val/ILSVRC2012_val_00032964.JPEG n02493793/
+mv val/ILSVRC2012_val_00032965.JPEG n03532672/
+mv val/ILSVRC2012_val_00032966.JPEG n01644900/
+mv val/ILSVRC2012_val_00032967.JPEG n03792782/
+mv val/ILSVRC2012_val_00032968.JPEG n04004767/
+mv val/ILSVRC2012_val_00032969.JPEG n02966193/
+mv val/ILSVRC2012_val_00032970.JPEG n01784675/
+mv val/ILSVRC2012_val_00032971.JPEG n13037406/
+mv val/ILSVRC2012_val_00032972.JPEG n03481172/
+mv val/ILSVRC2012_val_00032973.JPEG n03775546/
+mv val/ILSVRC2012_val_00032974.JPEG n04033995/
+mv val/ILSVRC2012_val_00032975.JPEG n02101556/
+mv val/ILSVRC2012_val_00032976.JPEG n03666591/
+mv val/ILSVRC2012_val_00032977.JPEG n04317175/
+mv val/ILSVRC2012_val_00032978.JPEG n01882714/
+mv val/ILSVRC2012_val_00032979.JPEG n02640242/
+mv val/ILSVRC2012_val_00032980.JPEG n03063689/
+mv val/ILSVRC2012_val_00032981.JPEG n04560804/
+mv val/ILSVRC2012_val_00032982.JPEG n01860187/
+mv val/ILSVRC2012_val_00032983.JPEG n04376876/
+mv val/ILSVRC2012_val_00032984.JPEG n04523525/
+mv val/ILSVRC2012_val_00032985.JPEG n01833805/
+mv val/ILSVRC2012_val_00032986.JPEG n02169497/
+mv val/ILSVRC2012_val_00032987.JPEG n03314780/
+mv val/ILSVRC2012_val_00032988.JPEG n02988304/
+mv val/ILSVRC2012_val_00032989.JPEG n02168699/
+mv val/ILSVRC2012_val_00032990.JPEG n04044716/
+mv val/ILSVRC2012_val_00032991.JPEG n02109961/
+mv val/ILSVRC2012_val_00032992.JPEG n01770393/
+mv val/ILSVRC2012_val_00032993.JPEG n01531178/
+mv val/ILSVRC2012_val_00032994.JPEG n04152593/
+mv val/ILSVRC2012_val_00032995.JPEG n02106662/
+mv val/ILSVRC2012_val_00032996.JPEG n04389033/
+mv val/ILSVRC2012_val_00032997.JPEG n01735189/
+mv val/ILSVRC2012_val_00032998.JPEG n07871810/
+mv val/ILSVRC2012_val_00032999.JPEG n04277352/
+mv val/ILSVRC2012_val_00033000.JPEG n02077923/
+mv val/ILSVRC2012_val_00033001.JPEG n03347037/
+mv val/ILSVRC2012_val_00033002.JPEG n02111500/
+mv val/ILSVRC2012_val_00033003.JPEG n02088238/
+mv val/ILSVRC2012_val_00033004.JPEG n03534580/
+mv val/ILSVRC2012_val_00033005.JPEG n03314780/
+mv val/ILSVRC2012_val_00033006.JPEG n02791270/
+mv val/ILSVRC2012_val_00033007.JPEG n04548280/
+mv val/ILSVRC2012_val_00033008.JPEG n03109150/
+mv val/ILSVRC2012_val_00033009.JPEG n03944341/
+mv val/ILSVRC2012_val_00033010.JPEG n02137549/
+mv val/ILSVRC2012_val_00033011.JPEG n04523525/
+mv val/ILSVRC2012_val_00033012.JPEG n04592741/
+mv val/ILSVRC2012_val_00033013.JPEG n04266014/
+mv val/ILSVRC2012_val_00033014.JPEG n01978455/
+mv val/ILSVRC2012_val_00033015.JPEG n02091032/
+mv val/ILSVRC2012_val_00033016.JPEG n04398044/
+mv val/ILSVRC2012_val_00033017.JPEG n02113624/
+mv val/ILSVRC2012_val_00033018.JPEG n02408429/
+mv val/ILSVRC2012_val_00033019.JPEG n04417672/
+mv val/ILSVRC2012_val_00033020.JPEG n04009552/
+mv val/ILSVRC2012_val_00033021.JPEG n02231487/
+mv val/ILSVRC2012_val_00033022.JPEG n04599235/
+mv val/ILSVRC2012_val_00033023.JPEG n07248320/
+mv val/ILSVRC2012_val_00033024.JPEG n04086273/
+mv val/ILSVRC2012_val_00033025.JPEG n04606251/
+mv val/ILSVRC2012_val_00033026.JPEG n03532672/
+mv val/ILSVRC2012_val_00033027.JPEG n02112137/
+mv val/ILSVRC2012_val_00033028.JPEG n09256479/
+mv val/ILSVRC2012_val_00033029.JPEG n04523525/
+mv val/ILSVRC2012_val_00033030.JPEG n01697457/
+mv val/ILSVRC2012_val_00033031.JPEG n03662601/
+mv val/ILSVRC2012_val_00033032.JPEG n04070727/
+mv val/ILSVRC2012_val_00033033.JPEG n02098286/
+mv val/ILSVRC2012_val_00033034.JPEG n02017213/
+mv val/ILSVRC2012_val_00033035.JPEG n02177972/
+mv val/ILSVRC2012_val_00033036.JPEG n01689811/
+mv val/ILSVRC2012_val_00033037.JPEG n03697007/
+mv val/ILSVRC2012_val_00033038.JPEG n03874599/
+mv val/ILSVRC2012_val_00033039.JPEG n02110185/
+mv val/ILSVRC2012_val_00033040.JPEG n04417672/
+mv val/ILSVRC2012_val_00033041.JPEG n04310018/
+mv val/ILSVRC2012_val_00033042.JPEG n02130308/
+mv val/ILSVRC2012_val_00033043.JPEG n04252077/
+mv val/ILSVRC2012_val_00033044.JPEG n03534580/
+mv val/ILSVRC2012_val_00033045.JPEG n01860187/
+mv val/ILSVRC2012_val_00033046.JPEG n03814906/
+mv val/ILSVRC2012_val_00033047.JPEG n02442845/
+mv val/ILSVRC2012_val_00033048.JPEG n04487394/
+mv val/ILSVRC2012_val_00033049.JPEG n02090379/
+mv val/ILSVRC2012_val_00033050.JPEG n01930112/
+mv val/ILSVRC2012_val_00033051.JPEG n07860988/
+mv val/ILSVRC2012_val_00033052.JPEG n02869837/
+mv val/ILSVRC2012_val_00033053.JPEG n02231487/
+mv val/ILSVRC2012_val_00033054.JPEG n03956157/
+mv val/ILSVRC2012_val_00033055.JPEG n03482405/
+mv val/ILSVRC2012_val_00033056.JPEG n02489166/
+mv val/ILSVRC2012_val_00033057.JPEG n02107683/
+mv val/ILSVRC2012_val_00033058.JPEG n01677366/
+mv val/ILSVRC2012_val_00033059.JPEG n01806143/
+mv val/ILSVRC2012_val_00033060.JPEG n03775071/
+mv val/ILSVRC2012_val_00033061.JPEG n02825657/
+mv val/ILSVRC2012_val_00033062.JPEG n02783161/
+mv val/ILSVRC2012_val_00033063.JPEG n01622779/
+mv val/ILSVRC2012_val_00033064.JPEG n02268853/
+mv val/ILSVRC2012_val_00033065.JPEG n04044716/
+mv val/ILSVRC2012_val_00033066.JPEG n04540053/
+mv val/ILSVRC2012_val_00033067.JPEG n02107142/
+mv val/ILSVRC2012_val_00033068.JPEG n04487394/
+mv val/ILSVRC2012_val_00033069.JPEG n03376595/
+mv val/ILSVRC2012_val_00033070.JPEG n01496331/
+mv val/ILSVRC2012_val_00033071.JPEG n02815834/
+mv val/ILSVRC2012_val_00033072.JPEG n02099267/
+mv val/ILSVRC2012_val_00033073.JPEG n04229816/
+mv val/ILSVRC2012_val_00033074.JPEG n07615774/
+mv val/ILSVRC2012_val_00033075.JPEG n03272562/
+mv val/ILSVRC2012_val_00033076.JPEG n01855672/
+mv val/ILSVRC2012_val_00033077.JPEG n02804414/
+mv val/ILSVRC2012_val_00033078.JPEG n01818515/
+mv val/ILSVRC2012_val_00033079.JPEG n02704792/
+mv val/ILSVRC2012_val_00033080.JPEG n02483708/
+mv val/ILSVRC2012_val_00033081.JPEG n01629819/
+mv val/ILSVRC2012_val_00033082.JPEG n03393912/
+mv val/ILSVRC2012_val_00033083.JPEG n03794056/
+mv val/ILSVRC2012_val_00033084.JPEG n01644373/
+mv val/ILSVRC2012_val_00033085.JPEG n02951585/
+mv val/ILSVRC2012_val_00033086.JPEG n02497673/
+mv val/ILSVRC2012_val_00033087.JPEG n02415577/
+mv val/ILSVRC2012_val_00033088.JPEG n01871265/
+mv val/ILSVRC2012_val_00033089.JPEG n07718747/
+mv val/ILSVRC2012_val_00033090.JPEG n02966193/
+mv val/ILSVRC2012_val_00033091.JPEG n03017168/
+mv val/ILSVRC2012_val_00033092.JPEG n01530575/
+mv val/ILSVRC2012_val_00033093.JPEG n02319095/
+mv val/ILSVRC2012_val_00033094.JPEG n02090379/
+mv val/ILSVRC2012_val_00033095.JPEG n03297495/
+mv val/ILSVRC2012_val_00033096.JPEG n03388183/
+mv val/ILSVRC2012_val_00033097.JPEG n03825788/
+mv val/ILSVRC2012_val_00033098.JPEG n01798484/
+mv val/ILSVRC2012_val_00033099.JPEG n03814906/
+mv val/ILSVRC2012_val_00033100.JPEG n02027492/
+mv val/ILSVRC2012_val_00033101.JPEG n02111889/
+mv val/ILSVRC2012_val_00033102.JPEG n04118538/
+mv val/ILSVRC2012_val_00033103.JPEG n02356798/
+mv val/ILSVRC2012_val_00033104.JPEG n01983481/
+mv val/ILSVRC2012_val_00033105.JPEG n01986214/
+mv val/ILSVRC2012_val_00033106.JPEG n02808440/
+mv val/ILSVRC2012_val_00033107.JPEG n02486261/
+mv val/ILSVRC2012_val_00033108.JPEG n01751748/
+mv val/ILSVRC2012_val_00033109.JPEG n03777568/
+mv val/ILSVRC2012_val_00033110.JPEG n04335435/
+mv val/ILSVRC2012_val_00033111.JPEG n07720875/
+mv val/ILSVRC2012_val_00033112.JPEG n03633091/
+mv val/ILSVRC2012_val_00033113.JPEG n03534580/
+mv val/ILSVRC2012_val_00033114.JPEG n04141975/
+mv val/ILSVRC2012_val_00033115.JPEG n04162706/
+mv val/ILSVRC2012_val_00033116.JPEG n03998194/
+mv val/ILSVRC2012_val_00033117.JPEG n07579787/
+mv val/ILSVRC2012_val_00033118.JPEG n02676566/
+mv val/ILSVRC2012_val_00033119.JPEG n03483316/
+mv val/ILSVRC2012_val_00033120.JPEG n01693334/
+mv val/ILSVRC2012_val_00033121.JPEG n04238763/
+mv val/ILSVRC2012_val_00033122.JPEG n02071294/
+mv val/ILSVRC2012_val_00033123.JPEG n04493381/
+mv val/ILSVRC2012_val_00033124.JPEG n07875152/
+mv val/ILSVRC2012_val_00033125.JPEG n01753488/
+mv val/ILSVRC2012_val_00033126.JPEG n02091635/
+mv val/ILSVRC2012_val_00033127.JPEG n03314780/
+mv val/ILSVRC2012_val_00033128.JPEG n03291819/
+mv val/ILSVRC2012_val_00033129.JPEG n03924679/
+mv val/ILSVRC2012_val_00033130.JPEG n12768682/
+mv val/ILSVRC2012_val_00033131.JPEG n06794110/
+mv val/ILSVRC2012_val_00033132.JPEG n03291819/
+mv val/ILSVRC2012_val_00033133.JPEG n03544143/
+mv val/ILSVRC2012_val_00033134.JPEG n01698640/
+mv val/ILSVRC2012_val_00033135.JPEG n06785654/
+mv val/ILSVRC2012_val_00033136.JPEG n03782006/
+mv val/ILSVRC2012_val_00033137.JPEG n04154565/
+mv val/ILSVRC2012_val_00033138.JPEG n02012849/
+mv val/ILSVRC2012_val_00033139.JPEG n07930864/
+mv val/ILSVRC2012_val_00033140.JPEG n03017168/
+mv val/ILSVRC2012_val_00033141.JPEG n04133789/
+mv val/ILSVRC2012_val_00033142.JPEG n02138441/
+mv val/ILSVRC2012_val_00033143.JPEG n03769881/
+mv val/ILSVRC2012_val_00033144.JPEG n03773504/
+mv val/ILSVRC2012_val_00033145.JPEG n07930864/
+mv val/ILSVRC2012_val_00033146.JPEG n04589890/
+mv val/ILSVRC2012_val_00033147.JPEG n01806143/
+mv val/ILSVRC2012_val_00033148.JPEG n03207743/
+mv val/ILSVRC2012_val_00033149.JPEG n02097474/
+mv val/ILSVRC2012_val_00033150.JPEG n01582220/
+mv val/ILSVRC2012_val_00033151.JPEG n02939185/
+mv val/ILSVRC2012_val_00033152.JPEG n02640242/
+mv val/ILSVRC2012_val_00033153.JPEG n02981792/
+mv val/ILSVRC2012_val_00033154.JPEG n03657121/
+mv val/ILSVRC2012_val_00033155.JPEG n02106166/
+mv val/ILSVRC2012_val_00033156.JPEG n02666196/
+mv val/ILSVRC2012_val_00033157.JPEG n01751748/
+mv val/ILSVRC2012_val_00033158.JPEG n03188531/
+mv val/ILSVRC2012_val_00033159.JPEG n01768244/
+mv val/ILSVRC2012_val_00033160.JPEG n04429376/
+mv val/ILSVRC2012_val_00033161.JPEG n02690373/
+mv val/ILSVRC2012_val_00033162.JPEG n01806567/
+mv val/ILSVRC2012_val_00033163.JPEG n02319095/
+mv val/ILSVRC2012_val_00033164.JPEG n02107683/
+mv val/ILSVRC2012_val_00033165.JPEG n04550184/
+mv val/ILSVRC2012_val_00033166.JPEG n04350905/
+mv val/ILSVRC2012_val_00033167.JPEG n01797886/
+mv val/ILSVRC2012_val_00033168.JPEG n04447861/
+mv val/ILSVRC2012_val_00033169.JPEG n04485082/
+mv val/ILSVRC2012_val_00033170.JPEG n03443371/
+mv val/ILSVRC2012_val_00033171.JPEG n04229816/
+mv val/ILSVRC2012_val_00033172.JPEG n03443371/
+mv val/ILSVRC2012_val_00033173.JPEG n04579145/
+mv val/ILSVRC2012_val_00033174.JPEG n03125729/
+mv val/ILSVRC2012_val_00033175.JPEG n03942813/
+mv val/ILSVRC2012_val_00033176.JPEG n03649909/
+mv val/ILSVRC2012_val_00033177.JPEG n02119022/
+mv val/ILSVRC2012_val_00033178.JPEG n02105251/
+mv val/ILSVRC2012_val_00033179.JPEG n12144580/
+mv val/ILSVRC2012_val_00033180.JPEG n02992529/
+mv val/ILSVRC2012_val_00033181.JPEG n01518878/
+mv val/ILSVRC2012_val_00033182.JPEG n02977058/
+mv val/ILSVRC2012_val_00033183.JPEG n01968897/
+mv val/ILSVRC2012_val_00033184.JPEG n02233338/
+mv val/ILSVRC2012_val_00033185.JPEG n03642806/
+mv val/ILSVRC2012_val_00033186.JPEG n01833805/
+mv val/ILSVRC2012_val_00033187.JPEG n09421951/
+mv val/ILSVRC2012_val_00033188.JPEG n01985128/
+mv val/ILSVRC2012_val_00033189.JPEG n01824575/
+mv val/ILSVRC2012_val_00033190.JPEG n04286575/
+mv val/ILSVRC2012_val_00033191.JPEG n04330267/
+mv val/ILSVRC2012_val_00033192.JPEG n02106166/
+mv val/ILSVRC2012_val_00033193.JPEG n07875152/
+mv val/ILSVRC2012_val_00033194.JPEG n02094258/
+mv val/ILSVRC2012_val_00033195.JPEG n02123394/
+mv val/ILSVRC2012_val_00033196.JPEG n01537544/
+mv val/ILSVRC2012_val_00033197.JPEG n04493381/
+mv val/ILSVRC2012_val_00033198.JPEG n02102480/
+mv val/ILSVRC2012_val_00033199.JPEG n02086240/
+mv val/ILSVRC2012_val_00033200.JPEG n02085782/
+mv val/ILSVRC2012_val_00033201.JPEG n03786901/
+mv val/ILSVRC2012_val_00033202.JPEG n04254680/
+mv val/ILSVRC2012_val_00033203.JPEG n03721384/
+mv val/ILSVRC2012_val_00033204.JPEG n04311174/
+mv val/ILSVRC2012_val_00033205.JPEG n04487394/
+mv val/ILSVRC2012_val_00033206.JPEG n02099267/
+mv val/ILSVRC2012_val_00033207.JPEG n03207941/
+mv val/ILSVRC2012_val_00033208.JPEG n02883205/
+mv val/ILSVRC2012_val_00033209.JPEG n02672831/
+mv val/ILSVRC2012_val_00033210.JPEG n04008634/
+mv val/ILSVRC2012_val_00033211.JPEG n03868863/
+mv val/ILSVRC2012_val_00033212.JPEG n04251144/
+mv val/ILSVRC2012_val_00033213.JPEG n03529860/
+mv val/ILSVRC2012_val_00033214.JPEG n01608432/
+mv val/ILSVRC2012_val_00033215.JPEG n02093647/
+mv val/ILSVRC2012_val_00033216.JPEG n02028035/
+mv val/ILSVRC2012_val_00033217.JPEG n03982430/
+mv val/ILSVRC2012_val_00033218.JPEG n01687978/
+mv val/ILSVRC2012_val_00033219.JPEG n01632458/
+mv val/ILSVRC2012_val_00033220.JPEG n03125729/
+mv val/ILSVRC2012_val_00033221.JPEG n02389026/
+mv val/ILSVRC2012_val_00033222.JPEG n02085782/
+mv val/ILSVRC2012_val_00033223.JPEG n06359193/
+mv val/ILSVRC2012_val_00033224.JPEG n03459775/
+mv val/ILSVRC2012_val_00033225.JPEG n01773797/
+mv val/ILSVRC2012_val_00033226.JPEG n02093754/
+mv val/ILSVRC2012_val_00033227.JPEG n04275548/
+mv val/ILSVRC2012_val_00033228.JPEG n02120505/
+mv val/ILSVRC2012_val_00033229.JPEG n03450230/
+mv val/ILSVRC2012_val_00033230.JPEG n03854065/
+mv val/ILSVRC2012_val_00033231.JPEG n02096177/
+mv val/ILSVRC2012_val_00033232.JPEG n02112706/
+mv val/ILSVRC2012_val_00033233.JPEG n02089867/
+mv val/ILSVRC2012_val_00033234.JPEG n02138441/
+mv val/ILSVRC2012_val_00033235.JPEG n02504458/
+mv val/ILSVRC2012_val_00033236.JPEG n02865351/
+mv val/ILSVRC2012_val_00033237.JPEG n04479046/
+mv val/ILSVRC2012_val_00033238.JPEG n03180011/
+mv val/ILSVRC2012_val_00033239.JPEG n03223299/
+mv val/ILSVRC2012_val_00033240.JPEG n02804414/
+mv val/ILSVRC2012_val_00033241.JPEG n02134418/
+mv val/ILSVRC2012_val_00033242.JPEG n01751748/
+mv val/ILSVRC2012_val_00033243.JPEG n02483708/
+mv val/ILSVRC2012_val_00033244.JPEG n01692333/
+mv val/ILSVRC2012_val_00033245.JPEG n02992211/
+mv val/ILSVRC2012_val_00033246.JPEG n03404251/
+mv val/ILSVRC2012_val_00033247.JPEG n07716906/
+mv val/ILSVRC2012_val_00033248.JPEG n01924916/
+mv val/ILSVRC2012_val_00033249.JPEG n07695742/
+mv val/ILSVRC2012_val_00033250.JPEG n02112137/
+mv val/ILSVRC2012_val_00033251.JPEG n02692877/
+mv val/ILSVRC2012_val_00033252.JPEG n02423022/
+mv val/ILSVRC2012_val_00033253.JPEG n02860847/
+mv val/ILSVRC2012_val_00033254.JPEG n01877812/
+mv val/ILSVRC2012_val_00033255.JPEG n04326547/
+mv val/ILSVRC2012_val_00033256.JPEG n02051845/
+mv val/ILSVRC2012_val_00033257.JPEG n01855672/
+mv val/ILSVRC2012_val_00033258.JPEG n02667093/
+mv val/ILSVRC2012_val_00033259.JPEG n01829413/
+mv val/ILSVRC2012_val_00033260.JPEG n07760859/
+mv val/ILSVRC2012_val_00033261.JPEG n01630670/
+mv val/ILSVRC2012_val_00033262.JPEG n02869837/
+mv val/ILSVRC2012_val_00033263.JPEG n02086910/
+mv val/ILSVRC2012_val_00033264.JPEG n01740131/
+mv val/ILSVRC2012_val_00033265.JPEG n02398521/
+mv val/ILSVRC2012_val_00033266.JPEG n03016953/
+mv val/ILSVRC2012_val_00033267.JPEG n02091134/
+mv val/ILSVRC2012_val_00033268.JPEG n02096585/
+mv val/ILSVRC2012_val_00033269.JPEG n02093647/
+mv val/ILSVRC2012_val_00033270.JPEG n03220513/
+mv val/ILSVRC2012_val_00033271.JPEG n07716906/
+mv val/ILSVRC2012_val_00033272.JPEG n03188531/
+mv val/ILSVRC2012_val_00033273.JPEG n03627232/
+mv val/ILSVRC2012_val_00033274.JPEG n03690938/
+mv val/ILSVRC2012_val_00033275.JPEG n02788148/
+mv val/ILSVRC2012_val_00033276.JPEG n04254680/
+mv val/ILSVRC2012_val_00033277.JPEG n02493509/
+mv val/ILSVRC2012_val_00033278.JPEG n02098413/
+mv val/ILSVRC2012_val_00033279.JPEG n03532672/
+mv val/ILSVRC2012_val_00033280.JPEG n02111889/
+mv val/ILSVRC2012_val_00033281.JPEG n01843065/
+mv val/ILSVRC2012_val_00033282.JPEG n02666196/
+mv val/ILSVRC2012_val_00033283.JPEG n02457408/
+mv val/ILSVRC2012_val_00033284.JPEG n03785016/
+mv val/ILSVRC2012_val_00033285.JPEG n02097474/
+mv val/ILSVRC2012_val_00033286.JPEG n02704792/
+mv val/ILSVRC2012_val_00033287.JPEG n03868863/
+mv val/ILSVRC2012_val_00033288.JPEG n04540053/
+mv val/ILSVRC2012_val_00033289.JPEG n03529860/
+mv val/ILSVRC2012_val_00033290.JPEG n04238763/
+mv val/ILSVRC2012_val_00033291.JPEG n03658185/
+mv val/ILSVRC2012_val_00033292.JPEG n03970156/
+mv val/ILSVRC2012_val_00033293.JPEG n04285008/
+mv val/ILSVRC2012_val_00033294.JPEG n02526121/
+mv val/ILSVRC2012_val_00033295.JPEG n02096585/
+mv val/ILSVRC2012_val_00033296.JPEG n03814639/
+mv val/ILSVRC2012_val_00033297.JPEG n03180011/
+mv val/ILSVRC2012_val_00033298.JPEG n02480855/
+mv val/ILSVRC2012_val_00033299.JPEG n03594945/
+mv val/ILSVRC2012_val_00033300.JPEG n02101006/
+mv val/ILSVRC2012_val_00033301.JPEG n04517823/
+mv val/ILSVRC2012_val_00033302.JPEG n12985857/
+mv val/ILSVRC2012_val_00033303.JPEG n02104029/
+mv val/ILSVRC2012_val_00033304.JPEG n04111531/
+mv val/ILSVRC2012_val_00033305.JPEG n01729322/
+mv val/ILSVRC2012_val_00033306.JPEG n03773504/
+mv val/ILSVRC2012_val_00033307.JPEG n01580077/
+mv val/ILSVRC2012_val_00033308.JPEG n02098413/
+mv val/ILSVRC2012_val_00033309.JPEG n04065272/
+mv val/ILSVRC2012_val_00033310.JPEG n02085936/
+mv val/ILSVRC2012_val_00033311.JPEG n02093859/
+mv val/ILSVRC2012_val_00033312.JPEG n02104365/
+mv val/ILSVRC2012_val_00033313.JPEG n09472597/
+mv val/ILSVRC2012_val_00033314.JPEG n02865351/
+mv val/ILSVRC2012_val_00033315.JPEG n04254680/
+mv val/ILSVRC2012_val_00033316.JPEG n02951358/
+mv val/ILSVRC2012_val_00033317.JPEG n02281787/
+mv val/ILSVRC2012_val_00033318.JPEG n01496331/
+mv val/ILSVRC2012_val_00033319.JPEG n02093256/
+mv val/ILSVRC2012_val_00033320.JPEG n01910747/
+mv val/ILSVRC2012_val_00033321.JPEG n04509417/
+mv val/ILSVRC2012_val_00033322.JPEG n02417914/
+mv val/ILSVRC2012_val_00033323.JPEG n02389026/
+mv val/ILSVRC2012_val_00033324.JPEG n03666591/
+mv val/ILSVRC2012_val_00033325.JPEG n06794110/
+mv val/ILSVRC2012_val_00033326.JPEG n03786901/
+mv val/ILSVRC2012_val_00033327.JPEG n07695742/
+mv val/ILSVRC2012_val_00033328.JPEG n02133161/
+mv val/ILSVRC2012_val_00033329.JPEG n04540053/
+mv val/ILSVRC2012_val_00033330.JPEG n02782093/
+mv val/ILSVRC2012_val_00033331.JPEG n01871265/
+mv val/ILSVRC2012_val_00033332.JPEG n03690938/
+mv val/ILSVRC2012_val_00033333.JPEG n02028035/
+mv val/ILSVRC2012_val_00033334.JPEG n02106550/
+mv val/ILSVRC2012_val_00033335.JPEG n02494079/
+mv val/ILSVRC2012_val_00033336.JPEG n07831146/
+mv val/ILSVRC2012_val_00033337.JPEG n01498041/
+mv val/ILSVRC2012_val_00033338.JPEG n02130308/
+mv val/ILSVRC2012_val_00033339.JPEG n04483307/
+mv val/ILSVRC2012_val_00033340.JPEG n01820546/
+mv val/ILSVRC2012_val_00033341.JPEG n02105056/
+mv val/ILSVRC2012_val_00033342.JPEG n04487081/
+mv val/ILSVRC2012_val_00033343.JPEG n09332890/
+mv val/ILSVRC2012_val_00033344.JPEG n02437312/
+mv val/ILSVRC2012_val_00033345.JPEG n03692522/
+mv val/ILSVRC2012_val_00033346.JPEG n02871525/
+mv val/ILSVRC2012_val_00033347.JPEG n02326432/
+mv val/ILSVRC2012_val_00033348.JPEG n07749582/
+mv val/ILSVRC2012_val_00033349.JPEG n02992211/
+mv val/ILSVRC2012_val_00033350.JPEG n02497673/
+mv val/ILSVRC2012_val_00033351.JPEG n03544143/
+mv val/ILSVRC2012_val_00033352.JPEG n13052670/
+mv val/ILSVRC2012_val_00033353.JPEG n13133613/
+mv val/ILSVRC2012_val_00033354.JPEG n07714571/
+mv val/ILSVRC2012_val_00033355.JPEG n03868863/
+mv val/ILSVRC2012_val_00033356.JPEG n02606052/
+mv val/ILSVRC2012_val_00033357.JPEG n02111129/
+mv val/ILSVRC2012_val_00033358.JPEG n03874293/
+mv val/ILSVRC2012_val_00033359.JPEG n02190166/
+mv val/ILSVRC2012_val_00033360.JPEG n02226429/
+mv val/ILSVRC2012_val_00033361.JPEG n02363005/
+mv val/ILSVRC2012_val_00033362.JPEG n02443484/
+mv val/ILSVRC2012_val_00033363.JPEG n04579145/
+mv val/ILSVRC2012_val_00033364.JPEG n03425413/
+mv val/ILSVRC2012_val_00033365.JPEG n03018349/
+mv val/ILSVRC2012_val_00033366.JPEG n03452741/
+mv val/ILSVRC2012_val_00033367.JPEG n02791124/
+mv val/ILSVRC2012_val_00033368.JPEG n02346627/
+mv val/ILSVRC2012_val_00033369.JPEG n02128757/
+mv val/ILSVRC2012_val_00033370.JPEG n03998194/
+mv val/ILSVRC2012_val_00033371.JPEG n03530642/
+mv val/ILSVRC2012_val_00033372.JPEG n01592084/
+mv val/ILSVRC2012_val_00033373.JPEG n01917289/
+mv val/ILSVRC2012_val_00033374.JPEG n03764736/
+mv val/ILSVRC2012_val_00033375.JPEG n07615774/
+mv val/ILSVRC2012_val_00033376.JPEG n03977966/
+mv val/ILSVRC2012_val_00033377.JPEG n02877765/
+mv val/ILSVRC2012_val_00033378.JPEG n02089973/
+mv val/ILSVRC2012_val_00033379.JPEG n01986214/
+mv val/ILSVRC2012_val_00033380.JPEG n01872401/
+mv val/ILSVRC2012_val_00033381.JPEG n03942813/
+mv val/ILSVRC2012_val_00033382.JPEG n01689811/
+mv val/ILSVRC2012_val_00033383.JPEG n02834397/
+mv val/ILSVRC2012_val_00033384.JPEG n07714990/
+mv val/ILSVRC2012_val_00033385.JPEG n02486261/
+mv val/ILSVRC2012_val_00033386.JPEG n02397096/
+mv val/ILSVRC2012_val_00033387.JPEG n04467665/
+mv val/ILSVRC2012_val_00033388.JPEG n02909870/
+mv val/ILSVRC2012_val_00033389.JPEG n04517823/
+mv val/ILSVRC2012_val_00033390.JPEG n04131690/
+mv val/ILSVRC2012_val_00033391.JPEG n01728572/
+mv val/ILSVRC2012_val_00033392.JPEG n01729322/
+mv val/ILSVRC2012_val_00033393.JPEG n01797886/
+mv val/ILSVRC2012_val_00033394.JPEG n02108551/
+mv val/ILSVRC2012_val_00033395.JPEG n03866082/
+mv val/ILSVRC2012_val_00033396.JPEG n01677366/
+mv val/ILSVRC2012_val_00033397.JPEG n02979186/
+mv val/ILSVRC2012_val_00033398.JPEG n03710637/
+mv val/ILSVRC2012_val_00033399.JPEG n03933933/
+mv val/ILSVRC2012_val_00033400.JPEG n03930313/
+mv val/ILSVRC2012_val_00033401.JPEG n03899768/
+mv val/ILSVRC2012_val_00033402.JPEG n03763968/
+mv val/ILSVRC2012_val_00033403.JPEG n02326432/
+mv val/ILSVRC2012_val_00033404.JPEG n02107142/
+mv val/ILSVRC2012_val_00033405.JPEG n02066245/
+mv val/ILSVRC2012_val_00033406.JPEG n04099969/
+mv val/ILSVRC2012_val_00033407.JPEG n07860988/
+mv val/ILSVRC2012_val_00033408.JPEG n07695742/
+mv val/ILSVRC2012_val_00033409.JPEG n01924916/
+mv val/ILSVRC2012_val_00033410.JPEG n03895866/
+mv val/ILSVRC2012_val_00033411.JPEG n03788365/
+mv val/ILSVRC2012_val_00033412.JPEG n01632777/
+mv val/ILSVRC2012_val_00033413.JPEG n02787622/
+mv val/ILSVRC2012_val_00033414.JPEG n01768244/
+mv val/ILSVRC2012_val_00033415.JPEG n01768244/
+mv val/ILSVRC2012_val_00033416.JPEG n03146219/
+mv val/ILSVRC2012_val_00033417.JPEG n06785654/
+mv val/ILSVRC2012_val_00033418.JPEG n02110341/
+mv val/ILSVRC2012_val_00033419.JPEG n03400231/
+mv val/ILSVRC2012_val_00033420.JPEG n02123045/
+mv val/ILSVRC2012_val_00033421.JPEG n02025239/
+mv val/ILSVRC2012_val_00033422.JPEG n03670208/
+mv val/ILSVRC2012_val_00033423.JPEG n01784675/
+mv val/ILSVRC2012_val_00033424.JPEG n03982430/
+mv val/ILSVRC2012_val_00033425.JPEG n04485082/
+mv val/ILSVRC2012_val_00033426.JPEG n03208938/
+mv val/ILSVRC2012_val_00033427.JPEG n01990800/
+mv val/ILSVRC2012_val_00033428.JPEG n03930313/
+mv val/ILSVRC2012_val_00033429.JPEG n02708093/
+mv val/ILSVRC2012_val_00033430.JPEG n04597913/
+mv val/ILSVRC2012_val_00033431.JPEG n01796340/
+mv val/ILSVRC2012_val_00033432.JPEG n02100236/
+mv val/ILSVRC2012_val_00033433.JPEG n01608432/
+mv val/ILSVRC2012_val_00033434.JPEG n01828970/
+mv val/ILSVRC2012_val_00033435.JPEG n01614925/
+mv val/ILSVRC2012_val_00033436.JPEG n03400231/
+mv val/ILSVRC2012_val_00033437.JPEG n01631663/
+mv val/ILSVRC2012_val_00033438.JPEG n03759954/
+mv val/ILSVRC2012_val_00033439.JPEG n01872401/
+mv val/ILSVRC2012_val_00033440.JPEG n01917289/
+mv val/ILSVRC2012_val_00033441.JPEG n02690373/
+mv val/ILSVRC2012_val_00033442.JPEG n01664065/
+mv val/ILSVRC2012_val_00033443.JPEG n03016953/
+mv val/ILSVRC2012_val_00033444.JPEG n04376876/
+mv val/ILSVRC2012_val_00033445.JPEG n01664065/
+mv val/ILSVRC2012_val_00033446.JPEG n02950826/
+mv val/ILSVRC2012_val_00033447.JPEG n04557648/
+mv val/ILSVRC2012_val_00033448.JPEG n02793495/
+mv val/ILSVRC2012_val_00033449.JPEG n02111129/
+mv val/ILSVRC2012_val_00033450.JPEG n01968897/
+mv val/ILSVRC2012_val_00033451.JPEG n03781244/
+mv val/ILSVRC2012_val_00033452.JPEG n07871810/
+mv val/ILSVRC2012_val_00033453.JPEG n02641379/
+mv val/ILSVRC2012_val_00033454.JPEG n02097209/
+mv val/ILSVRC2012_val_00033455.JPEG n02109047/
+mv val/ILSVRC2012_val_00033456.JPEG n03065424/
+mv val/ILSVRC2012_val_00033457.JPEG n03838899/
+mv val/ILSVRC2012_val_00033458.JPEG n04501370/
+mv val/ILSVRC2012_val_00033459.JPEG n01753488/
+mv val/ILSVRC2012_val_00033460.JPEG n04049303/
+mv val/ILSVRC2012_val_00033461.JPEG n02097047/
+mv val/ILSVRC2012_val_00033462.JPEG n04311004/
+mv val/ILSVRC2012_val_00033463.JPEG n03538406/
+mv val/ILSVRC2012_val_00033464.JPEG n03666591/
+mv val/ILSVRC2012_val_00033465.JPEG n02017213/
+mv val/ILSVRC2012_val_00033466.JPEG n02093647/
+mv val/ILSVRC2012_val_00033467.JPEG n04409515/
+mv val/ILSVRC2012_val_00033468.JPEG n03207743/
+mv val/ILSVRC2012_val_00033469.JPEG n01843065/
+mv val/ILSVRC2012_val_00033470.JPEG n03697007/
+mv val/ILSVRC2012_val_00033471.JPEG n03291819/
+mv val/ILSVRC2012_val_00033472.JPEG n03197337/
+mv val/ILSVRC2012_val_00033473.JPEG n03000247/
+mv val/ILSVRC2012_val_00033474.JPEG n02443484/
+mv val/ILSVRC2012_val_00033475.JPEG n03891251/
+mv val/ILSVRC2012_val_00033476.JPEG n02085782/
+mv val/ILSVRC2012_val_00033477.JPEG n04033901/
+mv val/ILSVRC2012_val_00033478.JPEG n03658185/
+mv val/ILSVRC2012_val_00033479.JPEG n01819313/
+mv val/ILSVRC2012_val_00033480.JPEG n03388549/
+mv val/ILSVRC2012_val_00033481.JPEG n02606052/
+mv val/ILSVRC2012_val_00033482.JPEG n04612504/
+mv val/ILSVRC2012_val_00033483.JPEG n01582220/
+mv val/ILSVRC2012_val_00033484.JPEG n02883205/
+mv val/ILSVRC2012_val_00033485.JPEG n04467665/
+mv val/ILSVRC2012_val_00033486.JPEG n03535780/
+mv val/ILSVRC2012_val_00033487.JPEG n04326547/
+mv val/ILSVRC2012_val_00033488.JPEG n03895866/
+mv val/ILSVRC2012_val_00033489.JPEG n02095889/
+mv val/ILSVRC2012_val_00033490.JPEG n02123045/
+mv val/ILSVRC2012_val_00033491.JPEG n03777568/
+mv val/ILSVRC2012_val_00033492.JPEG n01631663/
+mv val/ILSVRC2012_val_00033493.JPEG n02999410/
+mv val/ILSVRC2012_val_00033494.JPEG n07717410/
+mv val/ILSVRC2012_val_00033495.JPEG n02837789/
+mv val/ILSVRC2012_val_00033496.JPEG n04461696/
+mv val/ILSVRC2012_val_00033497.JPEG n07720875/
+mv val/ILSVRC2012_val_00033498.JPEG n03141823/
+mv val/ILSVRC2012_val_00033499.JPEG n03216828/
+mv val/ILSVRC2012_val_00033500.JPEG n04589890/
+mv val/ILSVRC2012_val_00033501.JPEG n02105641/
+mv val/ILSVRC2012_val_00033502.JPEG n03196217/
+mv val/ILSVRC2012_val_00033503.JPEG n01797886/
+mv val/ILSVRC2012_val_00033504.JPEG n07742313/
+mv val/ILSVRC2012_val_00033505.JPEG n02396427/
+mv val/ILSVRC2012_val_00033506.JPEG n04532106/
+mv val/ILSVRC2012_val_00033507.JPEG n02655020/
+mv val/ILSVRC2012_val_00033508.JPEG n02437312/
+mv val/ILSVRC2012_val_00033509.JPEG n03028079/
+mv val/ILSVRC2012_val_00033510.JPEG n02037110/
+mv val/ILSVRC2012_val_00033511.JPEG n03788365/
+mv val/ILSVRC2012_val_00033512.JPEG n01978455/
+mv val/ILSVRC2012_val_00033513.JPEG n02483362/
+mv val/ILSVRC2012_val_00033514.JPEG n02444819/
+mv val/ILSVRC2012_val_00033515.JPEG n01580077/
+mv val/ILSVRC2012_val_00033516.JPEG n04347754/
+mv val/ILSVRC2012_val_00033517.JPEG n01728572/
+mv val/ILSVRC2012_val_00033518.JPEG n03063689/
+mv val/ILSVRC2012_val_00033519.JPEG n02106662/
+mv val/ILSVRC2012_val_00033520.JPEG n02672831/
+mv val/ILSVRC2012_val_00033521.JPEG n03895866/
+mv val/ILSVRC2012_val_00033522.JPEG n04560804/
+mv val/ILSVRC2012_val_00033523.JPEG n04540053/
+mv val/ILSVRC2012_val_00033524.JPEG n02233338/
+mv val/ILSVRC2012_val_00033525.JPEG n03777754/
+mv val/ILSVRC2012_val_00033526.JPEG n02788148/
+mv val/ILSVRC2012_val_00033527.JPEG n09472597/
+mv val/ILSVRC2012_val_00033528.JPEG n02484975/
+mv val/ILSVRC2012_val_00033529.JPEG n04404412/
+mv val/ILSVRC2012_val_00033530.JPEG n02087046/
+mv val/ILSVRC2012_val_00033531.JPEG n02089078/
+mv val/ILSVRC2012_val_00033532.JPEG n03255030/
+mv val/ILSVRC2012_val_00033533.JPEG n03095699/
+mv val/ILSVRC2012_val_00033534.JPEG n07714990/
+mv val/ILSVRC2012_val_00033535.JPEG n02641379/
+mv val/ILSVRC2012_val_00033536.JPEG n03218198/
+mv val/ILSVRC2012_val_00033537.JPEG n02481823/
+mv val/ILSVRC2012_val_00033538.JPEG n01514859/
+mv val/ILSVRC2012_val_00033539.JPEG n03337140/
+mv val/ILSVRC2012_val_00033540.JPEG n04399382/
+mv val/ILSVRC2012_val_00033541.JPEG n02641379/
+mv val/ILSVRC2012_val_00033542.JPEG n02129604/
+mv val/ILSVRC2012_val_00033543.JPEG n03982430/
+mv val/ILSVRC2012_val_00033544.JPEG n04127249/
+mv val/ILSVRC2012_val_00033545.JPEG n04125021/
+mv val/ILSVRC2012_val_00033546.JPEG n01774384/
+mv val/ILSVRC2012_val_00033547.JPEG n01740131/
+mv val/ILSVRC2012_val_00033548.JPEG n02325366/
+mv val/ILSVRC2012_val_00033549.JPEG n04041544/
+mv val/ILSVRC2012_val_00033550.JPEG n02667093/
+mv val/ILSVRC2012_val_00033551.JPEG n07836838/
+mv val/ILSVRC2012_val_00033552.JPEG n01739381/
+mv val/ILSVRC2012_val_00033553.JPEG n02108000/
+mv val/ILSVRC2012_val_00033554.JPEG n02277742/
+mv val/ILSVRC2012_val_00033555.JPEG n01950731/
+mv val/ILSVRC2012_val_00033556.JPEG n03777754/
+mv val/ILSVRC2012_val_00033557.JPEG n04310018/
+mv val/ILSVRC2012_val_00033558.JPEG n02917067/
+mv val/ILSVRC2012_val_00033559.JPEG n02835271/
+mv val/ILSVRC2012_val_00033560.JPEG n04515003/
+mv val/ILSVRC2012_val_00033561.JPEG n02119789/
+mv val/ILSVRC2012_val_00033562.JPEG n02966687/
+mv val/ILSVRC2012_val_00033563.JPEG n03085013/
+mv val/ILSVRC2012_val_00033564.JPEG n12144580/
+mv val/ILSVRC2012_val_00033565.JPEG n02071294/
+mv val/ILSVRC2012_val_00033566.JPEG n12998815/
+mv val/ILSVRC2012_val_00033567.JPEG n04162706/
+mv val/ILSVRC2012_val_00033568.JPEG n03028079/
+mv val/ILSVRC2012_val_00033569.JPEG n03218198/
+mv val/ILSVRC2012_val_00033570.JPEG n02895154/
+mv val/ILSVRC2012_val_00033571.JPEG n04562935/
+mv val/ILSVRC2012_val_00033572.JPEG n07613480/
+mv val/ILSVRC2012_val_00033573.JPEG n02128925/
+mv val/ILSVRC2012_val_00033574.JPEG n03649909/
+mv val/ILSVRC2012_val_00033575.JPEG n01629819/
+mv val/ILSVRC2012_val_00033576.JPEG n01883070/
+mv val/ILSVRC2012_val_00033577.JPEG n02098413/
+mv val/ILSVRC2012_val_00033578.JPEG n02002724/
+mv val/ILSVRC2012_val_00033579.JPEG n02106382/
+mv val/ILSVRC2012_val_00033580.JPEG n01530575/
+mv val/ILSVRC2012_val_00033581.JPEG n02113978/
+mv val/ILSVRC2012_val_00033582.JPEG n02124075/
+mv val/ILSVRC2012_val_00033583.JPEG n04332243/
+mv val/ILSVRC2012_val_00033584.JPEG n02655020/
+mv val/ILSVRC2012_val_00033585.JPEG n04239074/
+mv val/ILSVRC2012_val_00033586.JPEG n01910747/
+mv val/ILSVRC2012_val_00033587.JPEG n09399592/
+mv val/ILSVRC2012_val_00033588.JPEG n02096051/
+mv val/ILSVRC2012_val_00033589.JPEG n03930630/
+mv val/ILSVRC2012_val_00033590.JPEG n07693725/
+mv val/ILSVRC2012_val_00033591.JPEG n03933933/
+mv val/ILSVRC2012_val_00033592.JPEG n03187595/
+mv val/ILSVRC2012_val_00033593.JPEG n02281787/
+mv val/ILSVRC2012_val_00033594.JPEG n02892201/
+mv val/ILSVRC2012_val_00033595.JPEG n02108000/
+mv val/ILSVRC2012_val_00033596.JPEG n01687978/
+mv val/ILSVRC2012_val_00033597.JPEG n03803284/
+mv val/ILSVRC2012_val_00033598.JPEG n07892512/
+mv val/ILSVRC2012_val_00033599.JPEG n02074367/
+mv val/ILSVRC2012_val_00033600.JPEG n03891251/
+mv val/ILSVRC2012_val_00033601.JPEG n03384352/
+mv val/ILSVRC2012_val_00033602.JPEG n04409515/
+mv val/ILSVRC2012_val_00033603.JPEG n02107574/
+mv val/ILSVRC2012_val_00033604.JPEG n01860187/
+mv val/ILSVRC2012_val_00033605.JPEG n03529860/
+mv val/ILSVRC2012_val_00033606.JPEG n02280649/
+mv val/ILSVRC2012_val_00033607.JPEG n02860847/
+mv val/ILSVRC2012_val_00033608.JPEG n03325584/
+mv val/ILSVRC2012_val_00033609.JPEG n04409515/
+mv val/ILSVRC2012_val_00033610.JPEG n03692522/
+mv val/ILSVRC2012_val_00033611.JPEG n02089973/
+mv val/ILSVRC2012_val_00033612.JPEG n02782093/
+mv val/ILSVRC2012_val_00033613.JPEG n03208938/
+mv val/ILSVRC2012_val_00033614.JPEG n02980441/
+mv val/ILSVRC2012_val_00033615.JPEG n01693334/
+mv val/ILSVRC2012_val_00033616.JPEG n01773157/
+mv val/ILSVRC2012_val_00033617.JPEG n01729977/
+mv val/ILSVRC2012_val_00033618.JPEG n03063689/
+mv val/ILSVRC2012_val_00033619.JPEG n02865351/
+mv val/ILSVRC2012_val_00033620.JPEG n03459775/
+mv val/ILSVRC2012_val_00033621.JPEG n03637318/
+mv val/ILSVRC2012_val_00033622.JPEG n04263257/
+mv val/ILSVRC2012_val_00033623.JPEG n04604644/
+mv val/ILSVRC2012_val_00033624.JPEG n04311004/
+mv val/ILSVRC2012_val_00033625.JPEG n02120079/
+mv val/ILSVRC2012_val_00033626.JPEG n02112018/
+mv val/ILSVRC2012_val_00033627.JPEG n03196217/
+mv val/ILSVRC2012_val_00033628.JPEG n01871265/
+mv val/ILSVRC2012_val_00033629.JPEG n02804610/
+mv val/ILSVRC2012_val_00033630.JPEG n07892512/
+mv val/ILSVRC2012_val_00033631.JPEG n03124043/
+mv val/ILSVRC2012_val_00033632.JPEG n02219486/
+mv val/ILSVRC2012_val_00033633.JPEG n02089973/
+mv val/ILSVRC2012_val_00033634.JPEG n02109047/
+mv val/ILSVRC2012_val_00033635.JPEG n04040759/
+mv val/ILSVRC2012_val_00033636.JPEG n07711569/
+mv val/ILSVRC2012_val_00033637.JPEG n04458633/
+mv val/ILSVRC2012_val_00033638.JPEG n07720875/
+mv val/ILSVRC2012_val_00033639.JPEG n02277742/
+mv val/ILSVRC2012_val_00033640.JPEG n01675722/
+mv val/ILSVRC2012_val_00033641.JPEG n02119022/
+mv val/ILSVRC2012_val_00033642.JPEG n02106030/
+mv val/ILSVRC2012_val_00033643.JPEG n03763968/
+mv val/ILSVRC2012_val_00033644.JPEG n02105412/
+mv val/ILSVRC2012_val_00033645.JPEG n03017168/
+mv val/ILSVRC2012_val_00033646.JPEG n03857828/
+mv val/ILSVRC2012_val_00033647.JPEG n04346328/
+mv val/ILSVRC2012_val_00033648.JPEG n04005630/
+mv val/ILSVRC2012_val_00033649.JPEG n03492542/
+mv val/ILSVRC2012_val_00033650.JPEG n02480495/
+mv val/ILSVRC2012_val_00033651.JPEG n02090622/
+mv val/ILSVRC2012_val_00033652.JPEG n03814906/
+mv val/ILSVRC2012_val_00033653.JPEG n04004767/
+mv val/ILSVRC2012_val_00033654.JPEG n02992529/
+mv val/ILSVRC2012_val_00033655.JPEG n02692877/
+mv val/ILSVRC2012_val_00033656.JPEG n09332890/
+mv val/ILSVRC2012_val_00033657.JPEG n02979186/
+mv val/ILSVRC2012_val_00033658.JPEG n01770393/
+mv val/ILSVRC2012_val_00033659.JPEG n02129165/
+mv val/ILSVRC2012_val_00033660.JPEG n02391049/
+mv val/ILSVRC2012_val_00033661.JPEG n07871810/
+mv val/ILSVRC2012_val_00033662.JPEG n03355925/
+mv val/ILSVRC2012_val_00033663.JPEG n04398044/
+mv val/ILSVRC2012_val_00033664.JPEG n07860988/
+mv val/ILSVRC2012_val_00033665.JPEG n03961711/
+mv val/ILSVRC2012_val_00033666.JPEG n02089973/
+mv val/ILSVRC2012_val_00033667.JPEG n03404251/
+mv val/ILSVRC2012_val_00033668.JPEG n02395406/
+mv val/ILSVRC2012_val_00033669.JPEG n03063689/
+mv val/ILSVRC2012_val_00033670.JPEG n04070727/
+mv val/ILSVRC2012_val_00033671.JPEG n04552348/
+mv val/ILSVRC2012_val_00033672.JPEG n02112137/
+mv val/ILSVRC2012_val_00033673.JPEG n02110958/
+mv val/ILSVRC2012_val_00033674.JPEG n01753488/
+mv val/ILSVRC2012_val_00033675.JPEG n07697537/
+mv val/ILSVRC2012_val_00033676.JPEG n04389033/
+mv val/ILSVRC2012_val_00033677.JPEG n02783161/
+mv val/ILSVRC2012_val_00033678.JPEG n07693725/
+mv val/ILSVRC2012_val_00033679.JPEG n04286575/
+mv val/ILSVRC2012_val_00033680.JPEG n07753113/
+mv val/ILSVRC2012_val_00033681.JPEG n07716358/
+mv val/ILSVRC2012_val_00033682.JPEG n03394916/
+mv val/ILSVRC2012_val_00033683.JPEG n02093256/
+mv val/ILSVRC2012_val_00033684.JPEG n01737021/
+mv val/ILSVRC2012_val_00033685.JPEG n07836838/
+mv val/ILSVRC2012_val_00033686.JPEG n02268853/
+mv val/ILSVRC2012_val_00033687.JPEG n02130308/
+mv val/ILSVRC2012_val_00033688.JPEG n02906734/
+mv val/ILSVRC2012_val_00033689.JPEG n02134418/
+mv val/ILSVRC2012_val_00033690.JPEG n02108000/
+mv val/ILSVRC2012_val_00033691.JPEG n01560419/
+mv val/ILSVRC2012_val_00033692.JPEG n03131574/
+mv val/ILSVRC2012_val_00033693.JPEG n02133161/
+mv val/ILSVRC2012_val_00033694.JPEG n03000247/
+mv val/ILSVRC2012_val_00033695.JPEG n02279972/
+mv val/ILSVRC2012_val_00033696.JPEG n02951585/
+mv val/ILSVRC2012_val_00033697.JPEG n03733805/
+mv val/ILSVRC2012_val_00033698.JPEG n01677366/
+mv val/ILSVRC2012_val_00033699.JPEG n03976467/
+mv val/ILSVRC2012_val_00033700.JPEG n03535780/
+mv val/ILSVRC2012_val_00033701.JPEG n03938244/
+mv val/ILSVRC2012_val_00033702.JPEG n01644373/
+mv val/ILSVRC2012_val_00033703.JPEG n02109525/
+mv val/ILSVRC2012_val_00033704.JPEG n03649909/
+mv val/ILSVRC2012_val_00033705.JPEG n02190166/
+mv val/ILSVRC2012_val_00033706.JPEG n01692333/
+mv val/ILSVRC2012_val_00033707.JPEG n02910353/
+mv val/ILSVRC2012_val_00033708.JPEG n01807496/
+mv val/ILSVRC2012_val_00033709.JPEG n03982430/
+mv val/ILSVRC2012_val_00033710.JPEG n02974003/
+mv val/ILSVRC2012_val_00033711.JPEG n03950228/
+mv val/ILSVRC2012_val_00033712.JPEG n01978287/
+mv val/ILSVRC2012_val_00033713.JPEG n03720891/
+mv val/ILSVRC2012_val_00033714.JPEG n02892767/
+mv val/ILSVRC2012_val_00033715.JPEG n02504013/
+mv val/ILSVRC2012_val_00033716.JPEG n01855032/
+mv val/ILSVRC2012_val_00033717.JPEG n02483362/
+mv val/ILSVRC2012_val_00033718.JPEG n02025239/
+mv val/ILSVRC2012_val_00033719.JPEG n03868242/
+mv val/ILSVRC2012_val_00033720.JPEG n02094114/
+mv val/ILSVRC2012_val_00033721.JPEG n02109047/
+mv val/ILSVRC2012_val_00033722.JPEG n07749582/
+mv val/ILSVRC2012_val_00033723.JPEG n01669191/
+mv val/ILSVRC2012_val_00033724.JPEG n03785016/
+mv val/ILSVRC2012_val_00033725.JPEG n04041544/
+mv val/ILSVRC2012_val_00033726.JPEG n02087046/
+mv val/ILSVRC2012_val_00033727.JPEG n03272010/
+mv val/ILSVRC2012_val_00033728.JPEG n03447447/
+mv val/ILSVRC2012_val_00033729.JPEG n02783161/
+mv val/ILSVRC2012_val_00033730.JPEG n03976657/
+mv val/ILSVRC2012_val_00033731.JPEG n02087394/
+mv val/ILSVRC2012_val_00033732.JPEG n04548280/
+mv val/ILSVRC2012_val_00033733.JPEG n01860187/
+mv val/ILSVRC2012_val_00033734.JPEG n01689811/
+mv val/ILSVRC2012_val_00033735.JPEG n04584207/
+mv val/ILSVRC2012_val_00033736.JPEG n04251144/
+mv val/ILSVRC2012_val_00033737.JPEG n02113023/
+mv val/ILSVRC2012_val_00033738.JPEG n03977966/
+mv val/ILSVRC2012_val_00033739.JPEG n03792972/
+mv val/ILSVRC2012_val_00033740.JPEG n13054560/
+mv val/ILSVRC2012_val_00033741.JPEG n06785654/
+mv val/ILSVRC2012_val_00033742.JPEG n07734744/
+mv val/ILSVRC2012_val_00033743.JPEG n02115641/
+mv val/ILSVRC2012_val_00033744.JPEG n04606251/
+mv val/ILSVRC2012_val_00033745.JPEG n02277742/
+mv val/ILSVRC2012_val_00033746.JPEG n02794156/
+mv val/ILSVRC2012_val_00033747.JPEG n02137549/
+mv val/ILSVRC2012_val_00033748.JPEG n04479046/
+mv val/ILSVRC2012_val_00033749.JPEG n01753488/
+mv val/ILSVRC2012_val_00033750.JPEG n04485082/
+mv val/ILSVRC2012_val_00033751.JPEG n02100735/
+mv val/ILSVRC2012_val_00033752.JPEG n02869837/
+mv val/ILSVRC2012_val_00033753.JPEG n03534580/
+mv val/ILSVRC2012_val_00033754.JPEG n02879718/
+mv val/ILSVRC2012_val_00033755.JPEG n04525305/
+mv val/ILSVRC2012_val_00033756.JPEG n01829413/
+mv val/ILSVRC2012_val_00033757.JPEG n03792782/
+mv val/ILSVRC2012_val_00033758.JPEG n02109961/
+mv val/ILSVRC2012_val_00033759.JPEG n03443371/
+mv val/ILSVRC2012_val_00033760.JPEG n02009229/
+mv val/ILSVRC2012_val_00033761.JPEG n01744401/
+mv val/ILSVRC2012_val_00033762.JPEG n01728572/
+mv val/ILSVRC2012_val_00033763.JPEG n02098413/
+mv val/ILSVRC2012_val_00033764.JPEG n04311004/
+mv val/ILSVRC2012_val_00033765.JPEG n03272010/
+mv val/ILSVRC2012_val_00033766.JPEG n02095570/
+mv val/ILSVRC2012_val_00033767.JPEG n01632458/
+mv val/ILSVRC2012_val_00033768.JPEG n02783161/
+mv val/ILSVRC2012_val_00033769.JPEG n01644900/
+mv val/ILSVRC2012_val_00033770.JPEG n01601694/
+mv val/ILSVRC2012_val_00033771.JPEG n01608432/
+mv val/ILSVRC2012_val_00033772.JPEG n04335435/
+mv val/ILSVRC2012_val_00033773.JPEG n02086910/
+mv val/ILSVRC2012_val_00033774.JPEG n04418357/
+mv val/ILSVRC2012_val_00033775.JPEG n02097658/
+mv val/ILSVRC2012_val_00033776.JPEG n03124170/
+mv val/ILSVRC2012_val_00033777.JPEG n04228054/
+mv val/ILSVRC2012_val_00033778.JPEG n02494079/
+mv val/ILSVRC2012_val_00033779.JPEG n07754684/
+mv val/ILSVRC2012_val_00033780.JPEG n02493793/
+mv val/ILSVRC2012_val_00033781.JPEG n02165105/
+mv val/ILSVRC2012_val_00033782.JPEG n02133161/
+mv val/ILSVRC2012_val_00033783.JPEG n01847000/
+mv val/ILSVRC2012_val_00033784.JPEG n03394916/
+mv val/ILSVRC2012_val_00033785.JPEG n02105162/
+mv val/ILSVRC2012_val_00033786.JPEG n01950731/
+mv val/ILSVRC2012_val_00033787.JPEG n03970156/
+mv val/ILSVRC2012_val_00033788.JPEG n02233338/
+mv val/ILSVRC2012_val_00033789.JPEG n03045698/
+mv val/ILSVRC2012_val_00033790.JPEG n02099601/
+mv val/ILSVRC2012_val_00033791.JPEG n11939491/
+mv val/ILSVRC2012_val_00033792.JPEG n04467665/
+mv val/ILSVRC2012_val_00033793.JPEG n04346328/
+mv val/ILSVRC2012_val_00033794.JPEG n04347754/
+mv val/ILSVRC2012_val_00033795.JPEG n03063689/
+mv val/ILSVRC2012_val_00033796.JPEG n03100240/
+mv val/ILSVRC2012_val_00033797.JPEG n02127052/
+mv val/ILSVRC2012_val_00033798.JPEG n03887697/
+mv val/ILSVRC2012_val_00033799.JPEG n09428293/
+mv val/ILSVRC2012_val_00033800.JPEG n02361337/
+mv val/ILSVRC2012_val_00033801.JPEG n02606052/
+mv val/ILSVRC2012_val_00033802.JPEG n04590129/
+mv val/ILSVRC2012_val_00033803.JPEG n02692877/
+mv val/ILSVRC2012_val_00033804.JPEG n03796401/
+mv val/ILSVRC2012_val_00033805.JPEG n04532106/
+mv val/ILSVRC2012_val_00033806.JPEG n03538406/
+mv val/ILSVRC2012_val_00033807.JPEG n07747607/
+mv val/ILSVRC2012_val_00033808.JPEG n01978455/
+mv val/ILSVRC2012_val_00033809.JPEG n07717556/
+mv val/ILSVRC2012_val_00033810.JPEG n02894605/
+mv val/ILSVRC2012_val_00033811.JPEG n03134739/
+mv val/ILSVRC2012_val_00033812.JPEG n04243546/
+mv val/ILSVRC2012_val_00033813.JPEG n03903868/
+mv val/ILSVRC2012_val_00033814.JPEG n02879718/
+mv val/ILSVRC2012_val_00033815.JPEG n01824575/
+mv val/ILSVRC2012_val_00033816.JPEG n01877812/
+mv val/ILSVRC2012_val_00033817.JPEG n01770081/
+mv val/ILSVRC2012_val_00033818.JPEG n04525305/
+mv val/ILSVRC2012_val_00033819.JPEG n01773549/
+mv val/ILSVRC2012_val_00033820.JPEG n02099712/
+mv val/ILSVRC2012_val_00033821.JPEG n01774384/
+mv val/ILSVRC2012_val_00033822.JPEG n02823428/
+mv val/ILSVRC2012_val_00033823.JPEG n01860187/
+mv val/ILSVRC2012_val_00033824.JPEG n03461385/
+mv val/ILSVRC2012_val_00033825.JPEG n04366367/
+mv val/ILSVRC2012_val_00033826.JPEG n02167151/
+mv val/ILSVRC2012_val_00033827.JPEG n02454379/
+mv val/ILSVRC2012_val_00033828.JPEG n03777568/
+mv val/ILSVRC2012_val_00033829.JPEG n01833805/
+mv val/ILSVRC2012_val_00033830.JPEG n03761084/
+mv val/ILSVRC2012_val_00033831.JPEG n04542943/
+mv val/ILSVRC2012_val_00033832.JPEG n02504458/
+mv val/ILSVRC2012_val_00033833.JPEG n02033041/
+mv val/ILSVRC2012_val_00033834.JPEG n02095314/
+mv val/ILSVRC2012_val_00033835.JPEG n03527444/
+mv val/ILSVRC2012_val_00033836.JPEG n02280649/
+mv val/ILSVRC2012_val_00033837.JPEG n02123045/
+mv val/ILSVRC2012_val_00033838.JPEG n01644373/
+mv val/ILSVRC2012_val_00033839.JPEG n12998815/
+mv val/ILSVRC2012_val_00033840.JPEG n03792972/
+mv val/ILSVRC2012_val_00033841.JPEG n02480495/
+mv val/ILSVRC2012_val_00033842.JPEG n03417042/
+mv val/ILSVRC2012_val_00033843.JPEG n02091467/
+mv val/ILSVRC2012_val_00033844.JPEG n02415577/
+mv val/ILSVRC2012_val_00033845.JPEG n12985857/
+mv val/ILSVRC2012_val_00033846.JPEG n03544143/
+mv val/ILSVRC2012_val_00033847.JPEG n04370456/
+mv val/ILSVRC2012_val_00033848.JPEG n02110806/
+mv val/ILSVRC2012_val_00033849.JPEG n03676483/
+mv val/ILSVRC2012_val_00033850.JPEG n03602883/
+mv val/ILSVRC2012_val_00033851.JPEG n03538406/
+mv val/ILSVRC2012_val_00033852.JPEG n04201297/
+mv val/ILSVRC2012_val_00033853.JPEG n03929855/
+mv val/ILSVRC2012_val_00033854.JPEG n02504013/
+mv val/ILSVRC2012_val_00033855.JPEG n10565667/
+mv val/ILSVRC2012_val_00033856.JPEG n02097130/
+mv val/ILSVRC2012_val_00033857.JPEG n03950228/
+mv val/ILSVRC2012_val_00033858.JPEG n01675722/
+mv val/ILSVRC2012_val_00033859.JPEG n04523525/
+mv val/ILSVRC2012_val_00033860.JPEG n02966687/
+mv val/ILSVRC2012_val_00033861.JPEG n02504458/
+mv val/ILSVRC2012_val_00033862.JPEG n02089973/
+mv val/ILSVRC2012_val_00033863.JPEG n01641577/
+mv val/ILSVRC2012_val_00033864.JPEG n04330267/
+mv val/ILSVRC2012_val_00033865.JPEG n04146614/
+mv val/ILSVRC2012_val_00033866.JPEG n01631663/
+mv val/ILSVRC2012_val_00033867.JPEG n02978881/
+mv val/ILSVRC2012_val_00033868.JPEG n07802026/
+mv val/ILSVRC2012_val_00033869.JPEG n04039381/
+mv val/ILSVRC2012_val_00033870.JPEG n03485794/
+mv val/ILSVRC2012_val_00033871.JPEG n03825788/
+mv val/ILSVRC2012_val_00033872.JPEG n04265275/
+mv val/ILSVRC2012_val_00033873.JPEG n03141823/
+mv val/ILSVRC2012_val_00033874.JPEG n04033995/
+mv val/ILSVRC2012_val_00033875.JPEG n03179701/
+mv val/ILSVRC2012_val_00033876.JPEG n01986214/
+mv val/ILSVRC2012_val_00033877.JPEG n04604644/
+mv val/ILSVRC2012_val_00033878.JPEG n02730930/
+mv val/ILSVRC2012_val_00033879.JPEG n03920288/
+mv val/ILSVRC2012_val_00033880.JPEG n02799071/
+mv val/ILSVRC2012_val_00033881.JPEG n04399382/
+mv val/ILSVRC2012_val_00033882.JPEG n04023962/
+mv val/ILSVRC2012_val_00033883.JPEG n02951358/
+mv val/ILSVRC2012_val_00033884.JPEG n02114367/
+mv val/ILSVRC2012_val_00033885.JPEG n02074367/
+mv val/ILSVRC2012_val_00033886.JPEG n03992509/
+mv val/ILSVRC2012_val_00033887.JPEG n03000134/
+mv val/ILSVRC2012_val_00033888.JPEG n01824575/
+mv val/ILSVRC2012_val_00033889.JPEG n04525305/
+mv val/ILSVRC2012_val_00033890.JPEG n02119789/
+mv val/ILSVRC2012_val_00033891.JPEG n03899768/
+mv val/ILSVRC2012_val_00033892.JPEG n03617480/
+mv val/ILSVRC2012_val_00033893.JPEG n02012849/
+mv val/ILSVRC2012_val_00033894.JPEG n03814639/
+mv val/ILSVRC2012_val_00033895.JPEG n04347754/
+mv val/ILSVRC2012_val_00033896.JPEG n04597913/
+mv val/ILSVRC2012_val_00033897.JPEG n02113799/
+mv val/ILSVRC2012_val_00033898.JPEG n04562935/
+mv val/ILSVRC2012_val_00033899.JPEG n03777754/
+mv val/ILSVRC2012_val_00033900.JPEG n02687172/
+mv val/ILSVRC2012_val_00033901.JPEG n02066245/
+mv val/ILSVRC2012_val_00033902.JPEG n02704792/
+mv val/ILSVRC2012_val_00033903.JPEG n01751748/
+mv val/ILSVRC2012_val_00033904.JPEG n02090622/
+mv val/ILSVRC2012_val_00033905.JPEG n03857828/
+mv val/ILSVRC2012_val_00033906.JPEG n03777754/
+mv val/ILSVRC2012_val_00033907.JPEG n02130308/
+mv val/ILSVRC2012_val_00033908.JPEG n02606052/
+mv val/ILSVRC2012_val_00033909.JPEG n03483316/
+mv val/ILSVRC2012_val_00033910.JPEG n02808440/
+mv val/ILSVRC2012_val_00033911.JPEG n02114712/
+mv val/ILSVRC2012_val_00033912.JPEG n01774384/
+mv val/ILSVRC2012_val_00033913.JPEG n09468604/
+mv val/ILSVRC2012_val_00033914.JPEG n03045698/
+mv val/ILSVRC2012_val_00033915.JPEG n02107574/
+mv val/ILSVRC2012_val_00033916.JPEG n02112706/
+mv val/ILSVRC2012_val_00033917.JPEG n03777754/
+mv val/ILSVRC2012_val_00033918.JPEG n04209239/
+mv val/ILSVRC2012_val_00033919.JPEG n07745940/
+mv val/ILSVRC2012_val_00033920.JPEG n02690373/
+mv val/ILSVRC2012_val_00033921.JPEG n07584110/
+mv val/ILSVRC2012_val_00033922.JPEG n03388549/
+mv val/ILSVRC2012_val_00033923.JPEG n03977966/
+mv val/ILSVRC2012_val_00033924.JPEG n04584207/
+mv val/ILSVRC2012_val_00033925.JPEG n02279972/
+mv val/ILSVRC2012_val_00033926.JPEG n02443114/
+mv val/ILSVRC2012_val_00033927.JPEG n02493509/
+mv val/ILSVRC2012_val_00033928.JPEG n02494079/
+mv val/ILSVRC2012_val_00033929.JPEG n03063599/
+mv val/ILSVRC2012_val_00033930.JPEG n01774750/
+mv val/ILSVRC2012_val_00033931.JPEG n01968897/
+mv val/ILSVRC2012_val_00033932.JPEG n01695060/
+mv val/ILSVRC2012_val_00033933.JPEG n04380533/
+mv val/ILSVRC2012_val_00033934.JPEG n02128757/
+mv val/ILSVRC2012_val_00033935.JPEG n09256479/
+mv val/ILSVRC2012_val_00033936.JPEG n02909870/
+mv val/ILSVRC2012_val_00033937.JPEG n04501370/
+mv val/ILSVRC2012_val_00033938.JPEG n03935335/
+mv val/ILSVRC2012_val_00033939.JPEG n07693725/
+mv val/ILSVRC2012_val_00033940.JPEG n04591713/
+mv val/ILSVRC2012_val_00033941.JPEG n03787032/
+mv val/ILSVRC2012_val_00033942.JPEG n01498041/
+mv val/ILSVRC2012_val_00033943.JPEG n03042490/
+mv val/ILSVRC2012_val_00033944.JPEG n02086910/
+mv val/ILSVRC2012_val_00033945.JPEG n01855672/
+mv val/ILSVRC2012_val_00033946.JPEG n04596742/
+mv val/ILSVRC2012_val_00033947.JPEG n02445715/
+mv val/ILSVRC2012_val_00033948.JPEG n02859443/
+mv val/ILSVRC2012_val_00033949.JPEG n02804610/
+mv val/ILSVRC2012_val_00033950.JPEG n03709823/
+mv val/ILSVRC2012_val_00033951.JPEG n02488291/
+mv val/ILSVRC2012_val_00033952.JPEG n02410509/
+mv val/ILSVRC2012_val_00033953.JPEG n03393912/
+mv val/ILSVRC2012_val_00033954.JPEG n03498962/
+mv val/ILSVRC2012_val_00033955.JPEG n03131574/
+mv val/ILSVRC2012_val_00033956.JPEG n03791053/
+mv val/ILSVRC2012_val_00033957.JPEG n03763968/
+mv val/ILSVRC2012_val_00033958.JPEG n02097130/
+mv val/ILSVRC2012_val_00033959.JPEG n03042490/
+mv val/ILSVRC2012_val_00033960.JPEG n01641577/
+mv val/ILSVRC2012_val_00033961.JPEG n01677366/
+mv val/ILSVRC2012_val_00033962.JPEG n01828970/
+mv val/ILSVRC2012_val_00033963.JPEG n02096051/
+mv val/ILSVRC2012_val_00033964.JPEG n03888605/
+mv val/ILSVRC2012_val_00033965.JPEG n02094114/
+mv val/ILSVRC2012_val_00033966.JPEG n02892201/
+mv val/ILSVRC2012_val_00033967.JPEG n02486261/
+mv val/ILSVRC2012_val_00033968.JPEG n03983396/
+mv val/ILSVRC2012_val_00033969.JPEG n02133161/
+mv val/ILSVRC2012_val_00033970.JPEG n03602883/
+mv val/ILSVRC2012_val_00033971.JPEG n03065424/
+mv val/ILSVRC2012_val_00033972.JPEG n02749479/
+mv val/ILSVRC2012_val_00033973.JPEG n02791124/
+mv val/ILSVRC2012_val_00033974.JPEG n01968897/
+mv val/ILSVRC2012_val_00033975.JPEG n02797295/
+mv val/ILSVRC2012_val_00033976.JPEG n02877765/
+mv val/ILSVRC2012_val_00033977.JPEG n01843065/
+mv val/ILSVRC2012_val_00033978.JPEG n02892201/
+mv val/ILSVRC2012_val_00033979.JPEG n03786901/
+mv val/ILSVRC2012_val_00033980.JPEG n02174001/
+mv val/ILSVRC2012_val_00033981.JPEG n03133878/
+mv val/ILSVRC2012_val_00033982.JPEG n02107908/
+mv val/ILSVRC2012_val_00033983.JPEG n04136333/
+mv val/ILSVRC2012_val_00033984.JPEG n02437616/
+mv val/ILSVRC2012_val_00033985.JPEG n04592741/
+mv val/ILSVRC2012_val_00033986.JPEG n04044716/
+mv val/ILSVRC2012_val_00033987.JPEG n01773157/
+mv val/ILSVRC2012_val_00033988.JPEG n02130308/
+mv val/ILSVRC2012_val_00033989.JPEG n02325366/
+mv val/ILSVRC2012_val_00033990.JPEG n04591713/
+mv val/ILSVRC2012_val_00033991.JPEG n04090263/
+mv val/ILSVRC2012_val_00033992.JPEG n03902125/
+mv val/ILSVRC2012_val_00033993.JPEG n03670208/
+mv val/ILSVRC2012_val_00033994.JPEG n07753113/
+mv val/ILSVRC2012_val_00033995.JPEG n03866082/
+mv val/ILSVRC2012_val_00033996.JPEG n04201297/
+mv val/ILSVRC2012_val_00033997.JPEG n02093859/
+mv val/ILSVRC2012_val_00033998.JPEG n02410509/
+mv val/ILSVRC2012_val_00033999.JPEG n02823750/
+mv val/ILSVRC2012_val_00034000.JPEG n01740131/
+mv val/ILSVRC2012_val_00034001.JPEG n03417042/
+mv val/ILSVRC2012_val_00034002.JPEG n03874293/
+mv val/ILSVRC2012_val_00034003.JPEG n03710193/
+mv val/ILSVRC2012_val_00034004.JPEG n02871525/
+mv val/ILSVRC2012_val_00034005.JPEG n02091467/
+mv val/ILSVRC2012_val_00034006.JPEG n04254120/
+mv val/ILSVRC2012_val_00034007.JPEG n02109525/
+mv val/ILSVRC2012_val_00034008.JPEG n04404412/
+mv val/ILSVRC2012_val_00034009.JPEG n02094433/
+mv val/ILSVRC2012_val_00034010.JPEG n11939491/
+mv val/ILSVRC2012_val_00034011.JPEG n02107683/
+mv val/ILSVRC2012_val_00034012.JPEG n04356056/
+mv val/ILSVRC2012_val_00034013.JPEG n02002556/
+mv val/ILSVRC2012_val_00034014.JPEG n02168699/
+mv val/ILSVRC2012_val_00034015.JPEG n01945685/
+mv val/ILSVRC2012_val_00034016.JPEG n04376876/
+mv val/ILSVRC2012_val_00034017.JPEG n04033901/
+mv val/ILSVRC2012_val_00034018.JPEG n01530575/
+mv val/ILSVRC2012_val_00034019.JPEG n03838899/
+mv val/ILSVRC2012_val_00034020.JPEG n01776313/
+mv val/ILSVRC2012_val_00034021.JPEG n03028079/
+mv val/ILSVRC2012_val_00034022.JPEG n03658185/
+mv val/ILSVRC2012_val_00034023.JPEG n04310018/
+mv val/ILSVRC2012_val_00034024.JPEG n02090379/
+mv val/ILSVRC2012_val_00034025.JPEG n02109525/
+mv val/ILSVRC2012_val_00034026.JPEG n04376876/
+mv val/ILSVRC2012_val_00034027.JPEG n04418357/
+mv val/ILSVRC2012_val_00034028.JPEG n04409515/
+mv val/ILSVRC2012_val_00034029.JPEG n07583066/
+mv val/ILSVRC2012_val_00034030.JPEG n03841143/
+mv val/ILSVRC2012_val_00034031.JPEG n02837789/
+mv val/ILSVRC2012_val_00034032.JPEG n03494278/
+mv val/ILSVRC2012_val_00034033.JPEG n03457902/
+mv val/ILSVRC2012_val_00034034.JPEG n02497673/
+mv val/ILSVRC2012_val_00034035.JPEG n02504013/
+mv val/ILSVRC2012_val_00034036.JPEG n02110063/
+mv val/ILSVRC2012_val_00034037.JPEG n02835271/
+mv val/ILSVRC2012_val_00034038.JPEG n01491361/
+mv val/ILSVRC2012_val_00034039.JPEG n02807133/
+mv val/ILSVRC2012_val_00034040.JPEG n02085782/
+mv val/ILSVRC2012_val_00034041.JPEG n02088364/
+mv val/ILSVRC2012_val_00034042.JPEG n02607072/
+mv val/ILSVRC2012_val_00034043.JPEG n02120505/
+mv val/ILSVRC2012_val_00034044.JPEG n07718472/
+mv val/ILSVRC2012_val_00034045.JPEG n03781244/
+mv val/ILSVRC2012_val_00034046.JPEG n02389026/
+mv val/ILSVRC2012_val_00034047.JPEG n03026506/
+mv val/ILSVRC2012_val_00034048.JPEG n02769748/
+mv val/ILSVRC2012_val_00034049.JPEG n02096177/
+mv val/ILSVRC2012_val_00034050.JPEG n02840245/
+mv val/ILSVRC2012_val_00034051.JPEG n02606052/
+mv val/ILSVRC2012_val_00034052.JPEG n03857828/
+mv val/ILSVRC2012_val_00034053.JPEG n03837869/
+mv val/ILSVRC2012_val_00034054.JPEG n01735189/
+mv val/ILSVRC2012_val_00034055.JPEG n02093256/
+mv val/ILSVRC2012_val_00034056.JPEG n02112706/
+mv val/ILSVRC2012_val_00034057.JPEG n02749479/
+mv val/ILSVRC2012_val_00034058.JPEG n04525038/
+mv val/ILSVRC2012_val_00034059.JPEG n03982430/
+mv val/ILSVRC2012_val_00034060.JPEG n02510455/
+mv val/ILSVRC2012_val_00034061.JPEG n02410509/
+mv val/ILSVRC2012_val_00034062.JPEG n03680355/
+mv val/ILSVRC2012_val_00034063.JPEG n02105505/
+mv val/ILSVRC2012_val_00034064.JPEG n03017168/
+mv val/ILSVRC2012_val_00034065.JPEG n02120079/
+mv val/ILSVRC2012_val_00034066.JPEG n03532672/
+mv val/ILSVRC2012_val_00034067.JPEG n03992509/
+mv val/ILSVRC2012_val_00034068.JPEG n02009229/
+mv val/ILSVRC2012_val_00034069.JPEG n02106166/
+mv val/ILSVRC2012_val_00034070.JPEG n02105056/
+mv val/ILSVRC2012_val_00034071.JPEG n02422699/
+mv val/ILSVRC2012_val_00034072.JPEG n03770439/
+mv val/ILSVRC2012_val_00034073.JPEG n03794056/
+mv val/ILSVRC2012_val_00034074.JPEG n03777568/
+mv val/ILSVRC2012_val_00034075.JPEG n02110806/
+mv val/ILSVRC2012_val_00034076.JPEG n01950731/
+mv val/ILSVRC2012_val_00034077.JPEG n04371430/
+mv val/ILSVRC2012_val_00034078.JPEG n03417042/
+mv val/ILSVRC2012_val_00034079.JPEG n03743016/
+mv val/ILSVRC2012_val_00034080.JPEG n01729977/
+mv val/ILSVRC2012_val_00034081.JPEG n02669723/
+mv val/ILSVRC2012_val_00034082.JPEG n02094433/
+mv val/ILSVRC2012_val_00034083.JPEG n04251144/
+mv val/ILSVRC2012_val_00034084.JPEG n02119022/
+mv val/ILSVRC2012_val_00034085.JPEG n01697457/
+mv val/ILSVRC2012_val_00034086.JPEG n01682714/
+mv val/ILSVRC2012_val_00034087.JPEG n07614500/
+mv val/ILSVRC2012_val_00034088.JPEG n02127052/
+mv val/ILSVRC2012_val_00034089.JPEG n03042490/
+mv val/ILSVRC2012_val_00034090.JPEG n02113799/
+mv val/ILSVRC2012_val_00034091.JPEG n04399382/
+mv val/ILSVRC2012_val_00034092.JPEG n03794056/
+mv val/ILSVRC2012_val_00034093.JPEG n02963159/
+mv val/ILSVRC2012_val_00034094.JPEG n02730930/
+mv val/ILSVRC2012_val_00034095.JPEG n01592084/
+mv val/ILSVRC2012_val_00034096.JPEG n04067472/
+mv val/ILSVRC2012_val_00034097.JPEG n02815834/
+mv val/ILSVRC2012_val_00034098.JPEG n07753592/
+mv val/ILSVRC2012_val_00034099.JPEG n13052670/
+mv val/ILSVRC2012_val_00034100.JPEG n07875152/
+mv val/ILSVRC2012_val_00034101.JPEG n06785654/
+mv val/ILSVRC2012_val_00034102.JPEG n04509417/
+mv val/ILSVRC2012_val_00034103.JPEG n03977966/
+mv val/ILSVRC2012_val_00034104.JPEG n03345487/
+mv val/ILSVRC2012_val_00034105.JPEG n03223299/
+mv val/ILSVRC2012_val_00034106.JPEG n04277352/
+mv val/ILSVRC2012_val_00034107.JPEG n06794110/
+mv val/ILSVRC2012_val_00034108.JPEG n02389026/
+mv val/ILSVRC2012_val_00034109.JPEG n07920052/
+mv val/ILSVRC2012_val_00034110.JPEG n02100877/
+mv val/ILSVRC2012_val_00034111.JPEG n04435653/
+mv val/ILSVRC2012_val_00034112.JPEG n04239074/
+mv val/ILSVRC2012_val_00034113.JPEG n04069434/
+mv val/ILSVRC2012_val_00034114.JPEG n03617480/
+mv val/ILSVRC2012_val_00034115.JPEG n01494475/
+mv val/ILSVRC2012_val_00034116.JPEG n02672831/
+mv val/ILSVRC2012_val_00034117.JPEG n07831146/
+mv val/ILSVRC2012_val_00034118.JPEG n02097047/
+mv val/ILSVRC2012_val_00034119.JPEG n03814639/
+mv val/ILSVRC2012_val_00034120.JPEG n02514041/
+mv val/ILSVRC2012_val_00034121.JPEG n02091635/
+mv val/ILSVRC2012_val_00034122.JPEG n01687978/
+mv val/ILSVRC2012_val_00034123.JPEG n02116738/
+mv val/ILSVRC2012_val_00034124.JPEG n01630670/
+mv val/ILSVRC2012_val_00034125.JPEG n01695060/
+mv val/ILSVRC2012_val_00034126.JPEG n04204238/
+mv val/ILSVRC2012_val_00034127.JPEG n04090263/
+mv val/ILSVRC2012_val_00034128.JPEG n04081281/
+mv val/ILSVRC2012_val_00034129.JPEG n01819313/
+mv val/ILSVRC2012_val_00034130.JPEG n02132136/
+mv val/ILSVRC2012_val_00034131.JPEG n03787032/
+mv val/ILSVRC2012_val_00034132.JPEG n04044716/
+mv val/ILSVRC2012_val_00034133.JPEG n15075141/
+mv val/ILSVRC2012_val_00034134.JPEG n03954731/
+mv val/ILSVRC2012_val_00034135.JPEG n04389033/
+mv val/ILSVRC2012_val_00034136.JPEG n02002556/
+mv val/ILSVRC2012_val_00034137.JPEG n04591157/
+mv val/ILSVRC2012_val_00034138.JPEG n04133789/
+mv val/ILSVRC2012_val_00034139.JPEG n04277352/
+mv val/ILSVRC2012_val_00034140.JPEG n02641379/
+mv val/ILSVRC2012_val_00034141.JPEG n03733805/
+mv val/ILSVRC2012_val_00034142.JPEG n04417672/
+mv val/ILSVRC2012_val_00034143.JPEG n02403003/
+mv val/ILSVRC2012_val_00034144.JPEG n01580077/
+mv val/ILSVRC2012_val_00034145.JPEG n03920288/
+mv val/ILSVRC2012_val_00034146.JPEG n03673027/
+mv val/ILSVRC2012_val_00034147.JPEG n07697537/
+mv val/ILSVRC2012_val_00034148.JPEG n07836838/
+mv val/ILSVRC2012_val_00034149.JPEG n04243546/
+mv val/ILSVRC2012_val_00034150.JPEG n02977058/
+mv val/ILSVRC2012_val_00034151.JPEG n07684084/
+mv val/ILSVRC2012_val_00034152.JPEG n07697537/
+mv val/ILSVRC2012_val_00034153.JPEG n02132136/
+mv val/ILSVRC2012_val_00034154.JPEG n03131574/
+mv val/ILSVRC2012_val_00034155.JPEG n02093647/
+mv val/ILSVRC2012_val_00034156.JPEG n03443371/
+mv val/ILSVRC2012_val_00034157.JPEG n03134739/
+mv val/ILSVRC2012_val_00034158.JPEG n04550184/
+mv val/ILSVRC2012_val_00034159.JPEG n03891251/
+mv val/ILSVRC2012_val_00034160.JPEG n02087394/
+mv val/ILSVRC2012_val_00034161.JPEG n07697537/
+mv val/ILSVRC2012_val_00034162.JPEG n07583066/
+mv val/ILSVRC2012_val_00034163.JPEG n04522168/
+mv val/ILSVRC2012_val_00034164.JPEG n04493381/
+mv val/ILSVRC2012_val_00034165.JPEG n04065272/
+mv val/ILSVRC2012_val_00034166.JPEG n02097130/
+mv val/ILSVRC2012_val_00034167.JPEG n04467665/
+mv val/ILSVRC2012_val_00034168.JPEG n01614925/
+mv val/ILSVRC2012_val_00034169.JPEG n03961711/
+mv val/ILSVRC2012_val_00034170.JPEG n02802426/
+mv val/ILSVRC2012_val_00034171.JPEG n02089078/
+mv val/ILSVRC2012_val_00034172.JPEG n02018207/
+mv val/ILSVRC2012_val_00034173.JPEG n03947888/
+mv val/ILSVRC2012_val_00034174.JPEG n01748264/
+mv val/ILSVRC2012_val_00034175.JPEG n02280649/
+mv val/ILSVRC2012_val_00034176.JPEG n02002556/
+mv val/ILSVRC2012_val_00034177.JPEG n03709823/
+mv val/ILSVRC2012_val_00034178.JPEG n01494475/
+mv val/ILSVRC2012_val_00034179.JPEG n03485794/
+mv val/ILSVRC2012_val_00034180.JPEG n04479046/
+mv val/ILSVRC2012_val_00034181.JPEG n02108551/
+mv val/ILSVRC2012_val_00034182.JPEG n03325584/
+mv val/ILSVRC2012_val_00034183.JPEG n03188531/
+mv val/ILSVRC2012_val_00034184.JPEG n02091032/
+mv val/ILSVRC2012_val_00034185.JPEG n02259212/
+mv val/ILSVRC2012_val_00034186.JPEG n02033041/
+mv val/ILSVRC2012_val_00034187.JPEG n03290653/
+mv val/ILSVRC2012_val_00034188.JPEG n04033995/
+mv val/ILSVRC2012_val_00034189.JPEG n07614500/
+mv val/ILSVRC2012_val_00034190.JPEG n02169497/
+mv val/ILSVRC2012_val_00034191.JPEG n04553703/
+mv val/ILSVRC2012_val_00034192.JPEG n02268443/
+mv val/ILSVRC2012_val_00034193.JPEG n09288635/
+mv val/ILSVRC2012_val_00034194.JPEG n01843383/
+mv val/ILSVRC2012_val_00034195.JPEG n04428191/
+mv val/ILSVRC2012_val_00034196.JPEG n03717622/
+mv val/ILSVRC2012_val_00034197.JPEG n02268853/
+mv val/ILSVRC2012_val_00034198.JPEG n02012849/
+mv val/ILSVRC2012_val_00034199.JPEG n02894605/
+mv val/ILSVRC2012_val_00034200.JPEG n02134418/
+mv val/ILSVRC2012_val_00034201.JPEG n01751748/
+mv val/ILSVRC2012_val_00034202.JPEG n02823750/
+mv val/ILSVRC2012_val_00034203.JPEG n02177972/
+mv val/ILSVRC2012_val_00034204.JPEG n03424325/
+mv val/ILSVRC2012_val_00034205.JPEG n02397096/
+mv val/ILSVRC2012_val_00034206.JPEG n07753275/
+mv val/ILSVRC2012_val_00034207.JPEG n02417914/
+mv val/ILSVRC2012_val_00034208.JPEG n03379051/
+mv val/ILSVRC2012_val_00034209.JPEG n02096585/
+mv val/ILSVRC2012_val_00034210.JPEG n03814639/
+mv val/ILSVRC2012_val_00034211.JPEG n03355925/
+mv val/ILSVRC2012_val_00034212.JPEG n03127747/
+mv val/ILSVRC2012_val_00034213.JPEG n02264363/
+mv val/ILSVRC2012_val_00034214.JPEG n03733131/
+mv val/ILSVRC2012_val_00034215.JPEG n02481823/
+mv val/ILSVRC2012_val_00034216.JPEG n03447447/
+mv val/ILSVRC2012_val_00034217.JPEG n04409515/
+mv val/ILSVRC2012_val_00034218.JPEG n02066245/
+mv val/ILSVRC2012_val_00034219.JPEG n02102318/
+mv val/ILSVRC2012_val_00034220.JPEG n03028079/
+mv val/ILSVRC2012_val_00034221.JPEG n02107574/
+mv val/ILSVRC2012_val_00034222.JPEG n04026417/
+mv val/ILSVRC2012_val_00034223.JPEG n02058221/
+mv val/ILSVRC2012_val_00034224.JPEG n02106662/
+mv val/ILSVRC2012_val_00034225.JPEG n02607072/
+mv val/ILSVRC2012_val_00034226.JPEG n01641577/
+mv val/ILSVRC2012_val_00034227.JPEG n03376595/
+mv val/ILSVRC2012_val_00034228.JPEG n07892512/
+mv val/ILSVRC2012_val_00034229.JPEG n11939491/
+mv val/ILSVRC2012_val_00034230.JPEG n02488702/
+mv val/ILSVRC2012_val_00034231.JPEG n09421951/
+mv val/ILSVRC2012_val_00034232.JPEG n01910747/
+mv val/ILSVRC2012_val_00034233.JPEG n02364673/
+mv val/ILSVRC2012_val_00034234.JPEG n07248320/
+mv val/ILSVRC2012_val_00034235.JPEG n03908714/
+mv val/ILSVRC2012_val_00034236.JPEG n02939185/
+mv val/ILSVRC2012_val_00034237.JPEG n02099601/
+mv val/ILSVRC2012_val_00034238.JPEG n03680355/
+mv val/ILSVRC2012_val_00034239.JPEG n02095889/
+mv val/ILSVRC2012_val_00034240.JPEG n02917067/
+mv val/ILSVRC2012_val_00034241.JPEG n04380533/
+mv val/ILSVRC2012_val_00034242.JPEG n01592084/
+mv val/ILSVRC2012_val_00034243.JPEG n02109525/
+mv val/ILSVRC2012_val_00034244.JPEG n02123394/
+mv val/ILSVRC2012_val_00034245.JPEG n02236044/
+mv val/ILSVRC2012_val_00034246.JPEG n02346627/
+mv val/ILSVRC2012_val_00034247.JPEG n12057211/
+mv val/ILSVRC2012_val_00034248.JPEG n12620546/
+mv val/ILSVRC2012_val_00034249.JPEG n04346328/
+mv val/ILSVRC2012_val_00034250.JPEG n01531178/
+mv val/ILSVRC2012_val_00034251.JPEG n01735189/
+mv val/ILSVRC2012_val_00034252.JPEG n04152593/
+mv val/ILSVRC2012_val_00034253.JPEG n04487394/
+mv val/ILSVRC2012_val_00034254.JPEG n02123597/
+mv val/ILSVRC2012_val_00034255.JPEG n01768244/
+mv val/ILSVRC2012_val_00034256.JPEG n02129604/
+mv val/ILSVRC2012_val_00034257.JPEG n09193705/
+mv val/ILSVRC2012_val_00034258.JPEG n04131690/
+mv val/ILSVRC2012_val_00034259.JPEG n02085936/
+mv val/ILSVRC2012_val_00034260.JPEG n02088238/
+mv val/ILSVRC2012_val_00034261.JPEG n03538406/
+mv val/ILSVRC2012_val_00034262.JPEG n03131574/
+mv val/ILSVRC2012_val_00034263.JPEG n02110185/
+mv val/ILSVRC2012_val_00034264.JPEG n03124043/
+mv val/ILSVRC2012_val_00034265.JPEG n03000247/
+mv val/ILSVRC2012_val_00034266.JPEG n02107574/
+mv val/ILSVRC2012_val_00034267.JPEG n02110958/
+mv val/ILSVRC2012_val_00034268.JPEG n03018349/
+mv val/ILSVRC2012_val_00034269.JPEG n02930766/
+mv val/ILSVRC2012_val_00034270.JPEG n02229544/
+mv val/ILSVRC2012_val_00034271.JPEG n02483362/
+mv val/ILSVRC2012_val_00034272.JPEG n03887697/
+mv val/ILSVRC2012_val_00034273.JPEG n01773797/
+mv val/ILSVRC2012_val_00034274.JPEG n02264363/
+mv val/ILSVRC2012_val_00034275.JPEG n02088364/
+mv val/ILSVRC2012_val_00034276.JPEG n04127249/
+mv val/ILSVRC2012_val_00034277.JPEG n02113023/
+mv val/ILSVRC2012_val_00034278.JPEG n03146219/
+mv val/ILSVRC2012_val_00034279.JPEG n02114855/
+mv val/ILSVRC2012_val_00034280.JPEG n04536866/
+mv val/ILSVRC2012_val_00034281.JPEG n03770679/
+mv val/ILSVRC2012_val_00034282.JPEG n01796340/
+mv val/ILSVRC2012_val_00034283.JPEG n03866082/
+mv val/ILSVRC2012_val_00034284.JPEG n04380533/
+mv val/ILSVRC2012_val_00034285.JPEG n03764736/
+mv val/ILSVRC2012_val_00034286.JPEG n07749582/
+mv val/ILSVRC2012_val_00034287.JPEG n03658185/
+mv val/ILSVRC2012_val_00034288.JPEG n04579145/
+mv val/ILSVRC2012_val_00034289.JPEG n01784675/
+mv val/ILSVRC2012_val_00034290.JPEG n01644373/
+mv val/ILSVRC2012_val_00034291.JPEG n02110063/
+mv val/ILSVRC2012_val_00034292.JPEG n02971356/
+mv val/ILSVRC2012_val_00034293.JPEG n02494079/
+mv val/ILSVRC2012_val_00034294.JPEG n02361337/
+mv val/ILSVRC2012_val_00034295.JPEG n02490219/
+mv val/ILSVRC2012_val_00034296.JPEG n03803284/
+mv val/ILSVRC2012_val_00034297.JPEG n02113624/
+mv val/ILSVRC2012_val_00034298.JPEG n02106550/
+mv val/ILSVRC2012_val_00034299.JPEG n03814906/
+mv val/ILSVRC2012_val_00034300.JPEG n03180011/
+mv val/ILSVRC2012_val_00034301.JPEG n01872401/
+mv val/ILSVRC2012_val_00034302.JPEG n02730930/
+mv val/ILSVRC2012_val_00034303.JPEG n04548280/
+mv val/ILSVRC2012_val_00034304.JPEG n02814860/
+mv val/ILSVRC2012_val_00034305.JPEG n02105162/
+mv val/ILSVRC2012_val_00034306.JPEG n03676483/
+mv val/ILSVRC2012_val_00034307.JPEG n01871265/
+mv val/ILSVRC2012_val_00034308.JPEG n07716358/
+mv val/ILSVRC2012_val_00034309.JPEG n04476259/
+mv val/ILSVRC2012_val_00034310.JPEG n03887697/
+mv val/ILSVRC2012_val_00034311.JPEG n07697537/
+mv val/ILSVRC2012_val_00034312.JPEG n02514041/
+mv val/ILSVRC2012_val_00034313.JPEG n04004767/
+mv val/ILSVRC2012_val_00034314.JPEG n04371774/
+mv val/ILSVRC2012_val_00034315.JPEG n01855032/
+mv val/ILSVRC2012_val_00034316.JPEG n01518878/
+mv val/ILSVRC2012_val_00034317.JPEG n09835506/
+mv val/ILSVRC2012_val_00034318.JPEG n01943899/
+mv val/ILSVRC2012_val_00034319.JPEG n03908714/
+mv val/ILSVRC2012_val_00034320.JPEG n03400231/
+mv val/ILSVRC2012_val_00034321.JPEG n02129604/
+mv val/ILSVRC2012_val_00034322.JPEG n02492035/
+mv val/ILSVRC2012_val_00034323.JPEG n04252225/
+mv val/ILSVRC2012_val_00034324.JPEG n02107312/
+mv val/ILSVRC2012_val_00034325.JPEG n03443371/
+mv val/ILSVRC2012_val_00034326.JPEG n02950826/
+mv val/ILSVRC2012_val_00034327.JPEG n03814639/
+mv val/ILSVRC2012_val_00034328.JPEG n02951585/
+mv val/ILSVRC2012_val_00034329.JPEG n04265275/
+mv val/ILSVRC2012_val_00034330.JPEG n01806567/
+mv val/ILSVRC2012_val_00034331.JPEG n03482405/
+mv val/ILSVRC2012_val_00034332.JPEG n01882714/
+mv val/ILSVRC2012_val_00034333.JPEG n01580077/
+mv val/ILSVRC2012_val_00034334.JPEG n02091831/
+mv val/ILSVRC2012_val_00034335.JPEG n04266014/
+mv val/ILSVRC2012_val_00034336.JPEG n02895154/
+mv val/ILSVRC2012_val_00034337.JPEG n04532106/
+mv val/ILSVRC2012_val_00034338.JPEG n02999410/
+mv val/ILSVRC2012_val_00034339.JPEG n03729826/
+mv val/ILSVRC2012_val_00034340.JPEG n03345487/
+mv val/ILSVRC2012_val_00034341.JPEG n02105162/
+mv val/ILSVRC2012_val_00034342.JPEG n02690373/
+mv val/ILSVRC2012_val_00034343.JPEG n04597913/
+mv val/ILSVRC2012_val_00034344.JPEG n04325704/
+mv val/ILSVRC2012_val_00034345.JPEG n03461385/
+mv val/ILSVRC2012_val_00034346.JPEG n01695060/
+mv val/ILSVRC2012_val_00034347.JPEG n01818515/
+mv val/ILSVRC2012_val_00034348.JPEG n09472597/
+mv val/ILSVRC2012_val_00034349.JPEG n01806567/
+mv val/ILSVRC2012_val_00034350.JPEG n07754684/
+mv val/ILSVRC2012_val_00034351.JPEG n04326547/
+mv val/ILSVRC2012_val_00034352.JPEG n02093859/
+mv val/ILSVRC2012_val_00034353.JPEG n04049303/
+mv val/ILSVRC2012_val_00034354.JPEG n02641379/
+mv val/ILSVRC2012_val_00034355.JPEG n03196217/
+mv val/ILSVRC2012_val_00034356.JPEG n02088466/
+mv val/ILSVRC2012_val_00034357.JPEG n04376876/
+mv val/ILSVRC2012_val_00034358.JPEG n02009229/
+mv val/ILSVRC2012_val_00034359.JPEG n03929855/
+mv val/ILSVRC2012_val_00034360.JPEG n02025239/
+mv val/ILSVRC2012_val_00034361.JPEG n03814906/
+mv val/ILSVRC2012_val_00034362.JPEG n03291819/
+mv val/ILSVRC2012_val_00034363.JPEG n04612504/
+mv val/ILSVRC2012_val_00034364.JPEG n03000134/
+mv val/ILSVRC2012_val_00034365.JPEG n02837789/
+mv val/ILSVRC2012_val_00034366.JPEG n07718747/
+mv val/ILSVRC2012_val_00034367.JPEG n03459775/
+mv val/ILSVRC2012_val_00034368.JPEG n02281406/
+mv val/ILSVRC2012_val_00034369.JPEG n01693334/
+mv val/ILSVRC2012_val_00034370.JPEG n02219486/
+mv val/ILSVRC2012_val_00034371.JPEG n04266014/
+mv val/ILSVRC2012_val_00034372.JPEG n04399382/
+mv val/ILSVRC2012_val_00034373.JPEG n01774750/
+mv val/ILSVRC2012_val_00034374.JPEG n02980441/
+mv val/ILSVRC2012_val_00034375.JPEG n03062245/
+mv val/ILSVRC2012_val_00034376.JPEG n04418357/
+mv val/ILSVRC2012_val_00034377.JPEG n02841315/
+mv val/ILSVRC2012_val_00034378.JPEG n04239074/
+mv val/ILSVRC2012_val_00034379.JPEG n02117135/
+mv val/ILSVRC2012_val_00034380.JPEG n03908714/
+mv val/ILSVRC2012_val_00034381.JPEG n04429376/
+mv val/ILSVRC2012_val_00034382.JPEG n02089867/
+mv val/ILSVRC2012_val_00034383.JPEG n01641577/
+mv val/ILSVRC2012_val_00034384.JPEG n02444819/
+mv val/ILSVRC2012_val_00034385.JPEG n04277352/
+mv val/ILSVRC2012_val_00034386.JPEG n01443537/
+mv val/ILSVRC2012_val_00034387.JPEG n04522168/
+mv val/ILSVRC2012_val_00034388.JPEG n02137549/
+mv val/ILSVRC2012_val_00034389.JPEG n03770439/
+mv val/ILSVRC2012_val_00034390.JPEG n03697007/
+mv val/ILSVRC2012_val_00034391.JPEG n07248320/
+mv val/ILSVRC2012_val_00034392.JPEG n04523525/
+mv val/ILSVRC2012_val_00034393.JPEG n04141975/
+mv val/ILSVRC2012_val_00034394.JPEG n04442312/
+mv val/ILSVRC2012_val_00034395.JPEG n02979186/
+mv val/ILSVRC2012_val_00034396.JPEG n03929855/
+mv val/ILSVRC2012_val_00034397.JPEG n03160309/
+mv val/ILSVRC2012_val_00034398.JPEG n07613480/
+mv val/ILSVRC2012_val_00034399.JPEG n04154565/
+mv val/ILSVRC2012_val_00034400.JPEG n03452741/
+mv val/ILSVRC2012_val_00034401.JPEG n03063689/
+mv val/ILSVRC2012_val_00034402.JPEG n01983481/
+mv val/ILSVRC2012_val_00034403.JPEG n03884397/
+mv val/ILSVRC2012_val_00034404.JPEG n02687172/
+mv val/ILSVRC2012_val_00034405.JPEG n01622779/
+mv val/ILSVRC2012_val_00034406.JPEG n01774750/
+mv val/ILSVRC2012_val_00034407.JPEG n02096051/
+mv val/ILSVRC2012_val_00034408.JPEG n04074963/
+mv val/ILSVRC2012_val_00034409.JPEG n03207941/
+mv val/ILSVRC2012_val_00034410.JPEG n02107908/
+mv val/ILSVRC2012_val_00034411.JPEG n03180011/
+mv val/ILSVRC2012_val_00034412.JPEG n04557648/
+mv val/ILSVRC2012_val_00034413.JPEG n01491361/
+mv val/ILSVRC2012_val_00034414.JPEG n04209239/
+mv val/ILSVRC2012_val_00034415.JPEG n02091467/
+mv val/ILSVRC2012_val_00034416.JPEG n03930313/
+mv val/ILSVRC2012_val_00034417.JPEG n03417042/
+mv val/ILSVRC2012_val_00034418.JPEG n02395406/
+mv val/ILSVRC2012_val_00034419.JPEG n02112350/
+mv val/ILSVRC2012_val_00034420.JPEG n02108915/
+mv val/ILSVRC2012_val_00034421.JPEG n02123597/
+mv val/ILSVRC2012_val_00034422.JPEG n04125021/
+mv val/ILSVRC2012_val_00034423.JPEG n03777754/
+mv val/ILSVRC2012_val_00034424.JPEG n09288635/
+mv val/ILSVRC2012_val_00034425.JPEG n02066245/
+mv val/ILSVRC2012_val_00034426.JPEG n03196217/
+mv val/ILSVRC2012_val_00034427.JPEG n04118538/
+mv val/ILSVRC2012_val_00034428.JPEG n03733281/
+mv val/ILSVRC2012_val_00034429.JPEG n02106550/
+mv val/ILSVRC2012_val_00034430.JPEG n02111889/
+mv val/ILSVRC2012_val_00034431.JPEG n03720891/
+mv val/ILSVRC2012_val_00034432.JPEG n04604644/
+mv val/ILSVRC2012_val_00034433.JPEG n03016953/
+mv val/ILSVRC2012_val_00034434.JPEG n03249569/
+mv val/ILSVRC2012_val_00034435.JPEG n04039381/
+mv val/ILSVRC2012_val_00034436.JPEG n02100735/
+mv val/ILSVRC2012_val_00034437.JPEG n01582220/
+mv val/ILSVRC2012_val_00034438.JPEG n02423022/
+mv val/ILSVRC2012_val_00034439.JPEG n03764736/
+mv val/ILSVRC2012_val_00034440.JPEG n03109150/
+mv val/ILSVRC2012_val_00034441.JPEG n02028035/
+mv val/ILSVRC2012_val_00034442.JPEG n02510455/
+mv val/ILSVRC2012_val_00034443.JPEG n01735189/
+mv val/ILSVRC2012_val_00034444.JPEG n02666196/
+mv val/ILSVRC2012_val_00034445.JPEG n02992211/
+mv val/ILSVRC2012_val_00034446.JPEG n04356056/
+mv val/ILSVRC2012_val_00034447.JPEG n03240683/
+mv val/ILSVRC2012_val_00034448.JPEG n01978455/
+mv val/ILSVRC2012_val_00034449.JPEG n04579145/
+mv val/ILSVRC2012_val_00034450.JPEG n02963159/
+mv val/ILSVRC2012_val_00034451.JPEG n09288635/
+mv val/ILSVRC2012_val_00034452.JPEG n02442845/
+mv val/ILSVRC2012_val_00034453.JPEG n04606251/
+mv val/ILSVRC2012_val_00034454.JPEG n02087046/
+mv val/ILSVRC2012_val_00034455.JPEG n03344393/
+mv val/ILSVRC2012_val_00034456.JPEG n01883070/
+mv val/ILSVRC2012_val_00034457.JPEG n03697007/
+mv val/ILSVRC2012_val_00034458.JPEG n03891251/
+mv val/ILSVRC2012_val_00034459.JPEG n03662601/
+mv val/ILSVRC2012_val_00034460.JPEG n02138441/
+mv val/ILSVRC2012_val_00034461.JPEG n01753488/
+mv val/ILSVRC2012_val_00034462.JPEG n04613696/
+mv val/ILSVRC2012_val_00034463.JPEG n01950731/
+mv val/ILSVRC2012_val_00034464.JPEG n03485794/
+mv val/ILSVRC2012_val_00034465.JPEG n02110341/
+mv val/ILSVRC2012_val_00034466.JPEG n02892767/
+mv val/ILSVRC2012_val_00034467.JPEG n02492035/
+mv val/ILSVRC2012_val_00034468.JPEG n04273569/
+mv val/ILSVRC2012_val_00034469.JPEG n04008634/
+mv val/ILSVRC2012_val_00034470.JPEG n02095314/
+mv val/ILSVRC2012_val_00034471.JPEG n03794056/
+mv val/ILSVRC2012_val_00034472.JPEG n09472597/
+mv val/ILSVRC2012_val_00034473.JPEG n02802426/
+mv val/ILSVRC2012_val_00034474.JPEG n07716906/
+mv val/ILSVRC2012_val_00034475.JPEG n03792972/
+mv val/ILSVRC2012_val_00034476.JPEG n01872401/
+mv val/ILSVRC2012_val_00034477.JPEG n03673027/
+mv val/ILSVRC2012_val_00034478.JPEG n02279972/
+mv val/ILSVRC2012_val_00034479.JPEG n02910353/
+mv val/ILSVRC2012_val_00034480.JPEG n03933933/
+mv val/ILSVRC2012_val_00034481.JPEG n03938244/
+mv val/ILSVRC2012_val_00034482.JPEG n01558993/
+mv val/ILSVRC2012_val_00034483.JPEG n03908714/
+mv val/ILSVRC2012_val_00034484.JPEG n01914609/
+mv val/ILSVRC2012_val_00034485.JPEG n02101006/
+mv val/ILSVRC2012_val_00034486.JPEG n02672831/
+mv val/ILSVRC2012_val_00034487.JPEG n04067472/
+mv val/ILSVRC2012_val_00034488.JPEG n02526121/
+mv val/ILSVRC2012_val_00034489.JPEG n07836838/
+mv val/ILSVRC2012_val_00034490.JPEG n02817516/
+mv val/ILSVRC2012_val_00034491.JPEG n07742313/
+mv val/ILSVRC2012_val_00034492.JPEG n01828970/
+mv val/ILSVRC2012_val_00034493.JPEG n04286575/
+mv val/ILSVRC2012_val_00034494.JPEG n03649909/
+mv val/ILSVRC2012_val_00034495.JPEG n02107683/
+mv val/ILSVRC2012_val_00034496.JPEG n02988304/
+mv val/ILSVRC2012_val_00034497.JPEG n02165456/
+mv val/ILSVRC2012_val_00034498.JPEG n04560804/
+mv val/ILSVRC2012_val_00034499.JPEG n01629819/
+mv val/ILSVRC2012_val_00034500.JPEG n03814906/
+mv val/ILSVRC2012_val_00034501.JPEG n03782006/
+mv val/ILSVRC2012_val_00034502.JPEG n02264363/
+mv val/ILSVRC2012_val_00034503.JPEG n02909870/
+mv val/ILSVRC2012_val_00034504.JPEG n09246464/
+mv val/ILSVRC2012_val_00034505.JPEG n02328150/
+mv val/ILSVRC2012_val_00034506.JPEG n02730930/
+mv val/ILSVRC2012_val_00034507.JPEG n04596742/
+mv val/ILSVRC2012_val_00034508.JPEG n03095699/
+mv val/ILSVRC2012_val_00034509.JPEG n03146219/
+mv val/ILSVRC2012_val_00034510.JPEG n01824575/
+mv val/ILSVRC2012_val_00034511.JPEG n03977966/
+mv val/ILSVRC2012_val_00034512.JPEG n01807496/
+mv val/ILSVRC2012_val_00034513.JPEG n02500267/
+mv val/ILSVRC2012_val_00034514.JPEG n02098105/
+mv val/ILSVRC2012_val_00034515.JPEG n01796340/
+mv val/ILSVRC2012_val_00034516.JPEG n02113978/
+mv val/ILSVRC2012_val_00034517.JPEG n02948072/
+mv val/ILSVRC2012_val_00034518.JPEG n03089624/
+mv val/ILSVRC2012_val_00034519.JPEG n04550184/
+mv val/ILSVRC2012_val_00034520.JPEG n07565083/
+mv val/ILSVRC2012_val_00034521.JPEG n03529860/
+mv val/ILSVRC2012_val_00034522.JPEG n03544143/
+mv val/ILSVRC2012_val_00034523.JPEG n02791270/
+mv val/ILSVRC2012_val_00034524.JPEG n03775071/
+mv val/ILSVRC2012_val_00034525.JPEG n03710721/
+mv val/ILSVRC2012_val_00034526.JPEG n13044778/
+mv val/ILSVRC2012_val_00034527.JPEG n02504458/
+mv val/ILSVRC2012_val_00034528.JPEG n02514041/
+mv val/ILSVRC2012_val_00034529.JPEG n03743016/
+mv val/ILSVRC2012_val_00034530.JPEG n03483316/
+mv val/ILSVRC2012_val_00034531.JPEG n12985857/
+mv val/ILSVRC2012_val_00034532.JPEG n03709823/
+mv val/ILSVRC2012_val_00034533.JPEG n04465501/
+mv val/ILSVRC2012_val_00034534.JPEG n03028079/
+mv val/ILSVRC2012_val_00034535.JPEG n04209239/
+mv val/ILSVRC2012_val_00034536.JPEG n01807496/
+mv val/ILSVRC2012_val_00034537.JPEG n02859443/
+mv val/ILSVRC2012_val_00034538.JPEG n04398044/
+mv val/ILSVRC2012_val_00034539.JPEG n03337140/
+mv val/ILSVRC2012_val_00034540.JPEG n02783161/
+mv val/ILSVRC2012_val_00034541.JPEG n02500267/
+mv val/ILSVRC2012_val_00034542.JPEG n01644373/
+mv val/ILSVRC2012_val_00034543.JPEG n07711569/
+mv val/ILSVRC2012_val_00034544.JPEG n03888257/
+mv val/ILSVRC2012_val_00034545.JPEG n02655020/
+mv val/ILSVRC2012_val_00034546.JPEG n09399592/
+mv val/ILSVRC2012_val_00034547.JPEG n03197337/
+mv val/ILSVRC2012_val_00034548.JPEG n02007558/
+mv val/ILSVRC2012_val_00034549.JPEG n03961711/
+mv val/ILSVRC2012_val_00034550.JPEG n04542943/
+mv val/ILSVRC2012_val_00034551.JPEG n02116738/
+mv val/ILSVRC2012_val_00034552.JPEG n01580077/
+mv val/ILSVRC2012_val_00034553.JPEG n02088632/
+mv val/ILSVRC2012_val_00034554.JPEG n02096294/
+mv val/ILSVRC2012_val_00034555.JPEG n03388183/
+mv val/ILSVRC2012_val_00034556.JPEG n02099267/
+mv val/ILSVRC2012_val_00034557.JPEG n03445924/
+mv val/ILSVRC2012_val_00034558.JPEG n04133789/
+mv val/ILSVRC2012_val_00034559.JPEG n04332243/
+mv val/ILSVRC2012_val_00034560.JPEG n03201208/
+mv val/ILSVRC2012_val_00034561.JPEG n03032252/
+mv val/ILSVRC2012_val_00034562.JPEG n02504458/
+mv val/ILSVRC2012_val_00034563.JPEG n02979186/
+mv val/ILSVRC2012_val_00034564.JPEG n04584207/
+mv val/ILSVRC2012_val_00034565.JPEG n03535780/
+mv val/ILSVRC2012_val_00034566.JPEG n02229544/
+mv val/ILSVRC2012_val_00034567.JPEG n02111500/
+mv val/ILSVRC2012_val_00034568.JPEG n04525305/
+mv val/ILSVRC2012_val_00034569.JPEG n03197337/
+mv val/ILSVRC2012_val_00034570.JPEG n02398521/
+mv val/ILSVRC2012_val_00034571.JPEG n02088238/
+mv val/ILSVRC2012_val_00034572.JPEG n02364673/
+mv val/ILSVRC2012_val_00034573.JPEG n04146614/
+mv val/ILSVRC2012_val_00034574.JPEG n02113186/
+mv val/ILSVRC2012_val_00034575.JPEG n02391049/
+mv val/ILSVRC2012_val_00034576.JPEG n02098286/
+mv val/ILSVRC2012_val_00034577.JPEG n04548362/
+mv val/ILSVRC2012_val_00034578.JPEG n02009229/
+mv val/ILSVRC2012_val_00034579.JPEG n07802026/
+mv val/ILSVRC2012_val_00034580.JPEG n07716906/
+mv val/ILSVRC2012_val_00034581.JPEG n02111889/
+mv val/ILSVRC2012_val_00034582.JPEG n02730930/
+mv val/ILSVRC2012_val_00034583.JPEG n01632777/
+mv val/ILSVRC2012_val_00034584.JPEG n02099601/
+mv val/ILSVRC2012_val_00034585.JPEG n02981792/
+mv val/ILSVRC2012_val_00034586.JPEG n03637318/
+mv val/ILSVRC2012_val_00034587.JPEG n01735189/
+mv val/ILSVRC2012_val_00034588.JPEG n04049303/
+mv val/ILSVRC2012_val_00034589.JPEG n02129165/
+mv val/ILSVRC2012_val_00034590.JPEG n02443484/
+mv val/ILSVRC2012_val_00034591.JPEG n03770679/
+mv val/ILSVRC2012_val_00034592.JPEG n04149813/
+mv val/ILSVRC2012_val_00034593.JPEG n01622779/
+mv val/ILSVRC2012_val_00034594.JPEG n03110669/
+mv val/ILSVRC2012_val_00034595.JPEG n01945685/
+mv val/ILSVRC2012_val_00034596.JPEG n03937543/
+mv val/ILSVRC2012_val_00034597.JPEG n02977058/
+mv val/ILSVRC2012_val_00034598.JPEG n02457408/
+mv val/ILSVRC2012_val_00034599.JPEG n03041632/
+mv val/ILSVRC2012_val_00034600.JPEG n01694178/
+mv val/ILSVRC2012_val_00034601.JPEG n03095699/
+mv val/ILSVRC2012_val_00034602.JPEG n02085936/
+mv val/ILSVRC2012_val_00034603.JPEG n04252077/
+mv val/ILSVRC2012_val_00034604.JPEG n03529860/
+mv val/ILSVRC2012_val_00034605.JPEG n01978455/
+mv val/ILSVRC2012_val_00034606.JPEG n01768244/
+mv val/ILSVRC2012_val_00034607.JPEG n06359193/
+mv val/ILSVRC2012_val_00034608.JPEG n02107908/
+mv val/ILSVRC2012_val_00034609.JPEG n04162706/
+mv val/ILSVRC2012_val_00034610.JPEG n03494278/
+mv val/ILSVRC2012_val_00034611.JPEG n02009912/
+mv val/ILSVRC2012_val_00034612.JPEG n01740131/
+mv val/ILSVRC2012_val_00034613.JPEG n03717622/
+mv val/ILSVRC2012_val_00034614.JPEG n13054560/
+mv val/ILSVRC2012_val_00034615.JPEG n03014705/
+mv val/ILSVRC2012_val_00034616.JPEG n02087394/
+mv val/ILSVRC2012_val_00034617.JPEG n02093991/
+mv val/ILSVRC2012_val_00034618.JPEG n03063689/
+mv val/ILSVRC2012_val_00034619.JPEG n02113023/
+mv val/ILSVRC2012_val_00034620.JPEG n03733131/
+mv val/ILSVRC2012_val_00034621.JPEG n04493381/
+mv val/ILSVRC2012_val_00034622.JPEG n03825788/
+mv val/ILSVRC2012_val_00034623.JPEG n02643566/
+mv val/ILSVRC2012_val_00034624.JPEG n03495258/
+mv val/ILSVRC2012_val_00034625.JPEG n06794110/
+mv val/ILSVRC2012_val_00034626.JPEG n02280649/
+mv val/ILSVRC2012_val_00034627.JPEG n04065272/
+mv val/ILSVRC2012_val_00034628.JPEG n02110958/
+mv val/ILSVRC2012_val_00034629.JPEG n03452741/
+mv val/ILSVRC2012_val_00034630.JPEG n03314780/
+mv val/ILSVRC2012_val_00034631.JPEG n01828970/
+mv val/ILSVRC2012_val_00034632.JPEG n02871525/
+mv val/ILSVRC2012_val_00034633.JPEG n04447861/
+mv val/ILSVRC2012_val_00034634.JPEG n02815834/
+mv val/ILSVRC2012_val_00034635.JPEG n04417672/
+mv val/ILSVRC2012_val_00034636.JPEG n04328186/
+mv val/ILSVRC2012_val_00034637.JPEG n02134418/
+mv val/ILSVRC2012_val_00034638.JPEG n03788365/
+mv val/ILSVRC2012_val_00034639.JPEG n03877845/
+mv val/ILSVRC2012_val_00034640.JPEG n04487081/
+mv val/ILSVRC2012_val_00034641.JPEG n02500267/
+mv val/ILSVRC2012_val_00034642.JPEG n03372029/
+mv val/ILSVRC2012_val_00034643.JPEG n03837869/
+mv val/ILSVRC2012_val_00034644.JPEG n01968897/
+mv val/ILSVRC2012_val_00034645.JPEG n03443371/
+mv val/ILSVRC2012_val_00034646.JPEG n12768682/
+mv val/ILSVRC2012_val_00034647.JPEG n01685808/
+mv val/ILSVRC2012_val_00034648.JPEG n03584829/
+mv val/ILSVRC2012_val_00034649.JPEG n02814860/
+mv val/ILSVRC2012_val_00034650.JPEG n03485407/
+mv val/ILSVRC2012_val_00034651.JPEG n03670208/
+mv val/ILSVRC2012_val_00034652.JPEG n01817953/
+mv val/ILSVRC2012_val_00034653.JPEG n03026506/
+mv val/ILSVRC2012_val_00034654.JPEG n01440764/
+mv val/ILSVRC2012_val_00034655.JPEG n01685808/
+mv val/ILSVRC2012_val_00034656.JPEG n03691459/
+mv val/ILSVRC2012_val_00034657.JPEG n04141076/
+mv val/ILSVRC2012_val_00034658.JPEG n04179913/
+mv val/ILSVRC2012_val_00034659.JPEG n03670208/
+mv val/ILSVRC2012_val_00034660.JPEG n01755581/
+mv val/ILSVRC2012_val_00034661.JPEG n03958227/
+mv val/ILSVRC2012_val_00034662.JPEG n03388043/
+mv val/ILSVRC2012_val_00034663.JPEG n03223299/
+mv val/ILSVRC2012_val_00034664.JPEG n02504013/
+mv val/ILSVRC2012_val_00034665.JPEG n01773549/
+mv val/ILSVRC2012_val_00034666.JPEG n01694178/
+mv val/ILSVRC2012_val_00034667.JPEG n02112018/
+mv val/ILSVRC2012_val_00034668.JPEG n01739381/
+mv val/ILSVRC2012_val_00034669.JPEG n01695060/
+mv val/ILSVRC2012_val_00034670.JPEG n01980166/
+mv val/ILSVRC2012_val_00034671.JPEG n03788365/
+mv val/ILSVRC2012_val_00034672.JPEG n03187595/
+mv val/ILSVRC2012_val_00034673.JPEG n02277742/
+mv val/ILSVRC2012_val_00034674.JPEG n01669191/
+mv val/ILSVRC2012_val_00034675.JPEG n02892201/
+mv val/ILSVRC2012_val_00034676.JPEG n02123045/
+mv val/ILSVRC2012_val_00034677.JPEG n07747607/
+mv val/ILSVRC2012_val_00034678.JPEG n04604644/
+mv val/ILSVRC2012_val_00034679.JPEG n04149813/
+mv val/ILSVRC2012_val_00034680.JPEG n04074963/
+mv val/ILSVRC2012_val_00034681.JPEG n02111277/
+mv val/ILSVRC2012_val_00034682.JPEG n02101006/
+mv val/ILSVRC2012_val_00034683.JPEG n03961711/
+mv val/ILSVRC2012_val_00034684.JPEG n01978287/
+mv val/ILSVRC2012_val_00034685.JPEG n03127747/
+mv val/ILSVRC2012_val_00034686.JPEG n02129604/
+mv val/ILSVRC2012_val_00034687.JPEG n07717410/
+mv val/ILSVRC2012_val_00034688.JPEG n02264363/
+mv val/ILSVRC2012_val_00034689.JPEG n07802026/
+mv val/ILSVRC2012_val_00034690.JPEG n02089973/
+mv val/ILSVRC2012_val_00034691.JPEG n02096585/
+mv val/ILSVRC2012_val_00034692.JPEG n04243546/
+mv val/ILSVRC2012_val_00034693.JPEG n01688243/
+mv val/ILSVRC2012_val_00034694.JPEG n02817516/
+mv val/ILSVRC2012_val_00034695.JPEG n04596742/
+mv val/ILSVRC2012_val_00034696.JPEG n03673027/
+mv val/ILSVRC2012_val_00034697.JPEG n02797295/
+mv val/ILSVRC2012_val_00034698.JPEG n07753113/
+mv val/ILSVRC2012_val_00034699.JPEG n01685808/
+mv val/ILSVRC2012_val_00034700.JPEG n02871525/
+mv val/ILSVRC2012_val_00034701.JPEG n02093991/
+mv val/ILSVRC2012_val_00034702.JPEG n01984695/
+mv val/ILSVRC2012_val_00034703.JPEG n07760859/
+mv val/ILSVRC2012_val_00034704.JPEG n03032252/
+mv val/ILSVRC2012_val_00034705.JPEG n07711569/
+mv val/ILSVRC2012_val_00034706.JPEG n02280649/
+mv val/ILSVRC2012_val_00034707.JPEG n03761084/
+mv val/ILSVRC2012_val_00034708.JPEG n03160309/
+mv val/ILSVRC2012_val_00034709.JPEG n03891332/
+mv val/ILSVRC2012_val_00034710.JPEG n02883205/
+mv val/ILSVRC2012_val_00034711.JPEG n04372370/
+mv val/ILSVRC2012_val_00034712.JPEG n04041544/
+mv val/ILSVRC2012_val_00034713.JPEG n04552348/
+mv val/ILSVRC2012_val_00034714.JPEG n04264628/
+mv val/ILSVRC2012_val_00034715.JPEG n04041544/
+mv val/ILSVRC2012_val_00034716.JPEG n01910747/
+mv val/ILSVRC2012_val_00034717.JPEG n03950228/
+mv val/ILSVRC2012_val_00034718.JPEG n02666196/
+mv val/ILSVRC2012_val_00034719.JPEG n04204347/
+mv val/ILSVRC2012_val_00034720.JPEG n01560419/
+mv val/ILSVRC2012_val_00034721.JPEG n04204238/
+mv val/ILSVRC2012_val_00034722.JPEG n02236044/
+mv val/ILSVRC2012_val_00034723.JPEG n03131574/
+mv val/ILSVRC2012_val_00034724.JPEG n04487081/
+mv val/ILSVRC2012_val_00034725.JPEG n02018795/
+mv val/ILSVRC2012_val_00034726.JPEG n02843684/
+mv val/ILSVRC2012_val_00034727.JPEG n03000684/
+mv val/ILSVRC2012_val_00034728.JPEG n01667778/
+mv val/ILSVRC2012_val_00034729.JPEG n02115641/
+mv val/ILSVRC2012_val_00034730.JPEG n04548362/
+mv val/ILSVRC2012_val_00034731.JPEG n01943899/
+mv val/ILSVRC2012_val_00034732.JPEG n02100877/
+mv val/ILSVRC2012_val_00034733.JPEG n02093256/
+mv val/ILSVRC2012_val_00034734.JPEG n02018207/
+mv val/ILSVRC2012_val_00034735.JPEG n02112137/
+mv val/ILSVRC2012_val_00034736.JPEG n03141823/
+mv val/ILSVRC2012_val_00034737.JPEG n02093754/
+mv val/ILSVRC2012_val_00034738.JPEG n02174001/
+mv val/ILSVRC2012_val_00034739.JPEG n04476259/
+mv val/ILSVRC2012_val_00034740.JPEG n02480495/
+mv val/ILSVRC2012_val_00034741.JPEG n03887697/
+mv val/ILSVRC2012_val_00034742.JPEG n02769748/
+mv val/ILSVRC2012_val_00034743.JPEG n02002724/
+mv val/ILSVRC2012_val_00034744.JPEG n02113978/
+mv val/ILSVRC2012_val_00034745.JPEG n02110627/
+mv val/ILSVRC2012_val_00034746.JPEG n03874293/
+mv val/ILSVRC2012_val_00034747.JPEG n02107574/
+mv val/ILSVRC2012_val_00034748.JPEG n02109047/
+mv val/ILSVRC2012_val_00034749.JPEG n01855032/
+mv val/ILSVRC2012_val_00034750.JPEG n02794156/
+mv val/ILSVRC2012_val_00034751.JPEG n03134739/
+mv val/ILSVRC2012_val_00034752.JPEG n07742313/
+mv val/ILSVRC2012_val_00034753.JPEG n03124043/
+mv val/ILSVRC2012_val_00034754.JPEG n02486261/
+mv val/ILSVRC2012_val_00034755.JPEG n02992529/
+mv val/ILSVRC2012_val_00034756.JPEG n01734418/
+mv val/ILSVRC2012_val_00034757.JPEG n02321529/
+mv val/ILSVRC2012_val_00034758.JPEG n03047690/
+mv val/ILSVRC2012_val_00034759.JPEG n02879718/
+mv val/ILSVRC2012_val_00034760.JPEG n02025239/
+mv val/ILSVRC2012_val_00034761.JPEG n03131574/
+mv val/ILSVRC2012_val_00034762.JPEG n04347754/
+mv val/ILSVRC2012_val_00034763.JPEG n03216828/
+mv val/ILSVRC2012_val_00034764.JPEG n02264363/
+mv val/ILSVRC2012_val_00034765.JPEG n03041632/
+mv val/ILSVRC2012_val_00034766.JPEG n02071294/
+mv val/ILSVRC2012_val_00034767.JPEG n01914609/
+mv val/ILSVRC2012_val_00034768.JPEG n02497673/
+mv val/ILSVRC2012_val_00034769.JPEG n02172182/
+mv val/ILSVRC2012_val_00034770.JPEG n01667778/
+mv val/ILSVRC2012_val_00034771.JPEG n02106550/
+mv val/ILSVRC2012_val_00034772.JPEG n02814860/
+mv val/ILSVRC2012_val_00034773.JPEG n01773549/
+mv val/ILSVRC2012_val_00034774.JPEG n01986214/
+mv val/ILSVRC2012_val_00034775.JPEG n02236044/
+mv val/ILSVRC2012_val_00034776.JPEG n02009912/
+mv val/ILSVRC2012_val_00034777.JPEG n02487347/
+mv val/ILSVRC2012_val_00034778.JPEG n01755581/
+mv val/ILSVRC2012_val_00034779.JPEG n03623198/
+mv val/ILSVRC2012_val_00034780.JPEG n02445715/
+mv val/ILSVRC2012_val_00034781.JPEG n06794110/
+mv val/ILSVRC2012_val_00034782.JPEG n02085620/
+mv val/ILSVRC2012_val_00034783.JPEG n04482393/
+mv val/ILSVRC2012_val_00034784.JPEG n01820546/
+mv val/ILSVRC2012_val_00034785.JPEG n04579145/
+mv val/ILSVRC2012_val_00034786.JPEG n02326432/
+mv val/ILSVRC2012_val_00034787.JPEG n07754684/
+mv val/ILSVRC2012_val_00034788.JPEG n04111531/
+mv val/ILSVRC2012_val_00034789.JPEG n03724870/
+mv val/ILSVRC2012_val_00034790.JPEG n02093256/
+mv val/ILSVRC2012_val_00034791.JPEG n07711569/
+mv val/ILSVRC2012_val_00034792.JPEG n02017213/
+mv val/ILSVRC2012_val_00034793.JPEG n01688243/
+mv val/ILSVRC2012_val_00034794.JPEG n01669191/
+mv val/ILSVRC2012_val_00034795.JPEG n01664065/
+mv val/ILSVRC2012_val_00034796.JPEG n02092339/
+mv val/ILSVRC2012_val_00034797.JPEG n02108551/
+mv val/ILSVRC2012_val_00034798.JPEG n04525305/
+mv val/ILSVRC2012_val_00034799.JPEG n03950228/
+mv val/ILSVRC2012_val_00034800.JPEG n03929660/
+mv val/ILSVRC2012_val_00034801.JPEG n03956157/
+mv val/ILSVRC2012_val_00034802.JPEG n03891332/
+mv val/ILSVRC2012_val_00034803.JPEG n04493381/
+mv val/ILSVRC2012_val_00034804.JPEG n02102973/
+mv val/ILSVRC2012_val_00034805.JPEG n03255030/
+mv val/ILSVRC2012_val_00034806.JPEG n01990800/
+mv val/ILSVRC2012_val_00034807.JPEG n02500267/
+mv val/ILSVRC2012_val_00034808.JPEG n02281406/
+mv val/ILSVRC2012_val_00034809.JPEG n01824575/
+mv val/ILSVRC2012_val_00034810.JPEG n03032252/
+mv val/ILSVRC2012_val_00034811.JPEG n02129165/
+mv val/ILSVRC2012_val_00034812.JPEG n02356798/
+mv val/ILSVRC2012_val_00034813.JPEG n03538406/
+mv val/ILSVRC2012_val_00034814.JPEG n02009229/
+mv val/ILSVRC2012_val_00034815.JPEG n02097658/
+mv val/ILSVRC2012_val_00034816.JPEG n03095699/
+mv val/ILSVRC2012_val_00034817.JPEG n03786901/
+mv val/ILSVRC2012_val_00034818.JPEG n03743016/
+mv val/ILSVRC2012_val_00034819.JPEG n02980441/
+mv val/ILSVRC2012_val_00034820.JPEG n07742313/
+mv val/ILSVRC2012_val_00034821.JPEG n02106166/
+mv val/ILSVRC2012_val_00034822.JPEG n03314780/
+mv val/ILSVRC2012_val_00034823.JPEG n02097209/
+mv val/ILSVRC2012_val_00034824.JPEG n04037443/
+mv val/ILSVRC2012_val_00034825.JPEG n04086273/
+mv val/ILSVRC2012_val_00034826.JPEG n03394916/
+mv val/ILSVRC2012_val_00034827.JPEG n02037110/
+mv val/ILSVRC2012_val_00034828.JPEG n02112018/
+mv val/ILSVRC2012_val_00034829.JPEG n03379051/
+mv val/ILSVRC2012_val_00034830.JPEG n02951585/
+mv val/ILSVRC2012_val_00034831.JPEG n04501370/
+mv val/ILSVRC2012_val_00034832.JPEG n04355338/
+mv val/ILSVRC2012_val_00034833.JPEG n03874293/
+mv val/ILSVRC2012_val_00034834.JPEG n04153751/
+mv val/ILSVRC2012_val_00034835.JPEG n07930864/
+mv val/ILSVRC2012_val_00034836.JPEG n02930766/
+mv val/ILSVRC2012_val_00034837.JPEG n01496331/
+mv val/ILSVRC2012_val_00034838.JPEG n04265275/
+mv val/ILSVRC2012_val_00034839.JPEG n02256656/
+mv val/ILSVRC2012_val_00034840.JPEG n01667114/
+mv val/ILSVRC2012_val_00034841.JPEG n03630383/
+mv val/ILSVRC2012_val_00034842.JPEG n04591713/
+mv val/ILSVRC2012_val_00034843.JPEG n02704792/
+mv val/ILSVRC2012_val_00034844.JPEG n03207743/
+mv val/ILSVRC2012_val_00034845.JPEG n03854065/
+mv val/ILSVRC2012_val_00034846.JPEG n03720891/
+mv val/ILSVRC2012_val_00034847.JPEG n07873807/
+mv val/ILSVRC2012_val_00034848.JPEG n02120505/
+mv val/ILSVRC2012_val_00034849.JPEG n02099849/
+mv val/ILSVRC2012_val_00034850.JPEG n04152593/
+mv val/ILSVRC2012_val_00034851.JPEG n02100877/
+mv val/ILSVRC2012_val_00034852.JPEG n04560804/
+mv val/ILSVRC2012_val_00034853.JPEG n03792972/
+mv val/ILSVRC2012_val_00034854.JPEG n03733131/
+mv val/ILSVRC2012_val_00034855.JPEG n13133613/
+mv val/ILSVRC2012_val_00034856.JPEG n02114548/
+mv val/ILSVRC2012_val_00034857.JPEG n03000247/
+mv val/ILSVRC2012_val_00034858.JPEG n04146614/
+mv val/ILSVRC2012_val_00034859.JPEG n04398044/
+mv val/ILSVRC2012_val_00034860.JPEG n02325366/
+mv val/ILSVRC2012_val_00034861.JPEG n03633091/
+mv val/ILSVRC2012_val_00034862.JPEG n09256479/
+mv val/ILSVRC2012_val_00034863.JPEG n03617480/
+mv val/ILSVRC2012_val_00034864.JPEG n01530575/
+mv val/ILSVRC2012_val_00034865.JPEG n03633091/
+mv val/ILSVRC2012_val_00034866.JPEG n03018349/
+mv val/ILSVRC2012_val_00034867.JPEG n01768244/
+mv val/ILSVRC2012_val_00034868.JPEG n02871525/
+mv val/ILSVRC2012_val_00034869.JPEG n04040759/
+mv val/ILSVRC2012_val_00034870.JPEG n03658185/
+mv val/ILSVRC2012_val_00034871.JPEG n03272562/
+mv val/ILSVRC2012_val_00034872.JPEG n02447366/
+mv val/ILSVRC2012_val_00034873.JPEG n04392985/
+mv val/ILSVRC2012_val_00034874.JPEG n02797295/
+mv val/ILSVRC2012_val_00034875.JPEG n03903868/
+mv val/ILSVRC2012_val_00034876.JPEG n04548362/
+mv val/ILSVRC2012_val_00034877.JPEG n07714571/
+mv val/ILSVRC2012_val_00034878.JPEG n03884397/
+mv val/ILSVRC2012_val_00034879.JPEG n03888605/
+mv val/ILSVRC2012_val_00034880.JPEG n02105505/
+mv val/ILSVRC2012_val_00034881.JPEG n03666591/
+mv val/ILSVRC2012_val_00034882.JPEG n03063599/
+mv val/ILSVRC2012_val_00034883.JPEG n03530642/
+mv val/ILSVRC2012_val_00034884.JPEG n02097474/
+mv val/ILSVRC2012_val_00034885.JPEG n04483307/
+mv val/ILSVRC2012_val_00034886.JPEG n04554684/
+mv val/ILSVRC2012_val_00034887.JPEG n02978881/
+mv val/ILSVRC2012_val_00034888.JPEG n02492660/
+mv val/ILSVRC2012_val_00034889.JPEG n03692522/
+mv val/ILSVRC2012_val_00034890.JPEG n04589890/
+mv val/ILSVRC2012_val_00034891.JPEG n04579432/
+mv val/ILSVRC2012_val_00034892.JPEG n02127052/
+mv val/ILSVRC2012_val_00034893.JPEG n02112706/
+mv val/ILSVRC2012_val_00034894.JPEG n02804610/
+mv val/ILSVRC2012_val_00034895.JPEG n02190166/
+mv val/ILSVRC2012_val_00034896.JPEG n11939491/
+mv val/ILSVRC2012_val_00034897.JPEG n03000134/
+mv val/ILSVRC2012_val_00034898.JPEG n01697457/
+mv val/ILSVRC2012_val_00034899.JPEG n12620546/
+mv val/ILSVRC2012_val_00034900.JPEG n02256656/
+mv val/ILSVRC2012_val_00034901.JPEG n01968897/
+mv val/ILSVRC2012_val_00034902.JPEG n02950826/
+mv val/ILSVRC2012_val_00034903.JPEG n03127925/
+mv val/ILSVRC2012_val_00034904.JPEG n02939185/
+mv val/ILSVRC2012_val_00034905.JPEG n06596364/
+mv val/ILSVRC2012_val_00034906.JPEG n02091134/
+mv val/ILSVRC2012_val_00034907.JPEG n03877472/
+mv val/ILSVRC2012_val_00034908.JPEG n02113799/
+mv val/ILSVRC2012_val_00034909.JPEG n02102973/
+mv val/ILSVRC2012_val_00034910.JPEG n02027492/
+mv val/ILSVRC2012_val_00034911.JPEG n03498962/
+mv val/ILSVRC2012_val_00034912.JPEG n02834397/
+mv val/ILSVRC2012_val_00034913.JPEG n07248320/
+mv val/ILSVRC2012_val_00034914.JPEG n04286575/
+mv val/ILSVRC2012_val_00034915.JPEG n01735189/
+mv val/ILSVRC2012_val_00034916.JPEG n02417914/
+mv val/ILSVRC2012_val_00034917.JPEG n03690938/
+mv val/ILSVRC2012_val_00034918.JPEG n03404251/
+mv val/ILSVRC2012_val_00034919.JPEG n01739381/
+mv val/ILSVRC2012_val_00034920.JPEG n02099267/
+mv val/ILSVRC2012_val_00034921.JPEG n02219486/
+mv val/ILSVRC2012_val_00034922.JPEG n02108089/
+mv val/ILSVRC2012_val_00034923.JPEG n02206856/
+mv val/ILSVRC2012_val_00034924.JPEG n03208938/
+mv val/ILSVRC2012_val_00034925.JPEG n03127747/
+mv val/ILSVRC2012_val_00034926.JPEG n02279972/
+mv val/ILSVRC2012_val_00034927.JPEG n02281406/
+mv val/ILSVRC2012_val_00034928.JPEG n02113023/
+mv val/ILSVRC2012_val_00034929.JPEG n01601694/
+mv val/ILSVRC2012_val_00034930.JPEG n07715103/
+mv val/ILSVRC2012_val_00034931.JPEG n02107908/
+mv val/ILSVRC2012_val_00034932.JPEG n02120079/
+mv val/ILSVRC2012_val_00034933.JPEG n02102318/
+mv val/ILSVRC2012_val_00034934.JPEG n02096051/
+mv val/ILSVRC2012_val_00034935.JPEG n01990800/
+mv val/ILSVRC2012_val_00034936.JPEG n02917067/
+mv val/ILSVRC2012_val_00034937.JPEG n03372029/
+mv val/ILSVRC2012_val_00034938.JPEG n03538406/
+mv val/ILSVRC2012_val_00034939.JPEG n12267677/
+mv val/ILSVRC2012_val_00034940.JPEG n03314780/
+mv val/ILSVRC2012_val_00034941.JPEG n03903868/
+mv val/ILSVRC2012_val_00034942.JPEG n02009229/
+mv val/ILSVRC2012_val_00034943.JPEG n02100236/
+mv val/ILSVRC2012_val_00034944.JPEG n03759954/
+mv val/ILSVRC2012_val_00034945.JPEG n02277742/
+mv val/ILSVRC2012_val_00034946.JPEG n03804744/
+mv val/ILSVRC2012_val_00034947.JPEG n02966687/
+mv val/ILSVRC2012_val_00034948.JPEG n02102318/
+mv val/ILSVRC2012_val_00034949.JPEG n09835506/
+mv val/ILSVRC2012_val_00034950.JPEG n01484850/
+mv val/ILSVRC2012_val_00034951.JPEG n02097047/
+mv val/ILSVRC2012_val_00034952.JPEG n02795169/
+mv val/ILSVRC2012_val_00034953.JPEG n03673027/
+mv val/ILSVRC2012_val_00034954.JPEG n02169497/
+mv val/ILSVRC2012_val_00034955.JPEG n03532672/
+mv val/ILSVRC2012_val_00034956.JPEG n04067472/
+mv val/ILSVRC2012_val_00034957.JPEG n01944390/
+mv val/ILSVRC2012_val_00034958.JPEG n02786058/
+mv val/ILSVRC2012_val_00034959.JPEG n04019541/
+mv val/ILSVRC2012_val_00034960.JPEG n01665541/
+mv val/ILSVRC2012_val_00034961.JPEG n04162706/
+mv val/ILSVRC2012_val_00034962.JPEG n01695060/
+mv val/ILSVRC2012_val_00034963.JPEG n04116512/
+mv val/ILSVRC2012_val_00034964.JPEG n03680355/
+mv val/ILSVRC2012_val_00034965.JPEG n04548280/
+mv val/ILSVRC2012_val_00034966.JPEG n04517823/
+mv val/ILSVRC2012_val_00034967.JPEG n02883205/
+mv val/ILSVRC2012_val_00034968.JPEG n02869837/
+mv val/ILSVRC2012_val_00034969.JPEG n01871265/
+mv val/ILSVRC2012_val_00034970.JPEG n01737021/
+mv val/ILSVRC2012_val_00034971.JPEG n01496331/
+mv val/ILSVRC2012_val_00034972.JPEG n01773797/
+mv val/ILSVRC2012_val_00034973.JPEG n04562935/
+mv val/ILSVRC2012_val_00034974.JPEG n03617480/
+mv val/ILSVRC2012_val_00034975.JPEG n03930630/
+mv val/ILSVRC2012_val_00034976.JPEG n04033901/
+mv val/ILSVRC2012_val_00034977.JPEG n04270147/
+mv val/ILSVRC2012_val_00034978.JPEG n03388183/
+mv val/ILSVRC2012_val_00034979.JPEG n02823428/
+mv val/ILSVRC2012_val_00034980.JPEG n02090622/
+mv val/ILSVRC2012_val_00034981.JPEG n02504013/
+mv val/ILSVRC2012_val_00034982.JPEG n04356056/
+mv val/ILSVRC2012_val_00034983.JPEG n02510455/
+mv val/ILSVRC2012_val_00034984.JPEG n01860187/
+mv val/ILSVRC2012_val_00034985.JPEG n02492660/
+mv val/ILSVRC2012_val_00034986.JPEG n02879718/
+mv val/ILSVRC2012_val_00034987.JPEG n02669723/
+mv val/ILSVRC2012_val_00034988.JPEG n15075141/
+mv val/ILSVRC2012_val_00034989.JPEG n04263257/
+mv val/ILSVRC2012_val_00034990.JPEG n02422106/
+mv val/ILSVRC2012_val_00034991.JPEG n04350905/
+mv val/ILSVRC2012_val_00034992.JPEG n02105056/
+mv val/ILSVRC2012_val_00034993.JPEG n02102973/
+mv val/ILSVRC2012_val_00034994.JPEG n03776460/
+mv val/ILSVRC2012_val_00034995.JPEG n03857828/
+mv val/ILSVRC2012_val_00034996.JPEG n02120505/
+mv val/ILSVRC2012_val_00034997.JPEG n02105412/
+mv val/ILSVRC2012_val_00034998.JPEG n02643566/
+mv val/ILSVRC2012_val_00034999.JPEG n03291819/
+mv val/ILSVRC2012_val_00035000.JPEG n04447861/
+mv val/ILSVRC2012_val_00035001.JPEG n03938244/
+mv val/ILSVRC2012_val_00035002.JPEG n07717556/
+mv val/ILSVRC2012_val_00035003.JPEG n02423022/
+mv val/ILSVRC2012_val_00035004.JPEG n03450230/
+mv val/ILSVRC2012_val_00035005.JPEG n01770393/
+mv val/ILSVRC2012_val_00035006.JPEG n04254680/
+mv val/ILSVRC2012_val_00035007.JPEG n03530642/
+mv val/ILSVRC2012_val_00035008.JPEG n03476991/
+mv val/ILSVRC2012_val_00035009.JPEG n03710721/
+mv val/ILSVRC2012_val_00035010.JPEG n04116512/
+mv val/ILSVRC2012_val_00035011.JPEG n04398044/
+mv val/ILSVRC2012_val_00035012.JPEG n02930766/
+mv val/ILSVRC2012_val_00035013.JPEG n04370456/
+mv val/ILSVRC2012_val_00035014.JPEG n02231487/
+mv val/ILSVRC2012_val_00035015.JPEG n04019541/
+mv val/ILSVRC2012_val_00035016.JPEG n03476991/
+mv val/ILSVRC2012_val_00035017.JPEG n04366367/
+mv val/ILSVRC2012_val_00035018.JPEG n02930766/
+mv val/ILSVRC2012_val_00035019.JPEG n01728920/
+mv val/ILSVRC2012_val_00035020.JPEG n03908618/
+mv val/ILSVRC2012_val_00035021.JPEG n07615774/
+mv val/ILSVRC2012_val_00035022.JPEG n06794110/
+mv val/ILSVRC2012_val_00035023.JPEG n01744401/
+mv val/ILSVRC2012_val_00035024.JPEG n04153751/
+mv val/ILSVRC2012_val_00035025.JPEG n03187595/
+mv val/ILSVRC2012_val_00035026.JPEG n02009912/
+mv val/ILSVRC2012_val_00035027.JPEG n02096437/
+mv val/ILSVRC2012_val_00035028.JPEG n02018207/
+mv val/ILSVRC2012_val_00035029.JPEG n02363005/
+mv val/ILSVRC2012_val_00035030.JPEG n07717410/
+mv val/ILSVRC2012_val_00035031.JPEG n02939185/
+mv val/ILSVRC2012_val_00035032.JPEG n03495258/
+mv val/ILSVRC2012_val_00035033.JPEG n03787032/
+mv val/ILSVRC2012_val_00035034.JPEG n03920288/
+mv val/ILSVRC2012_val_00035035.JPEG n04392985/
+mv val/ILSVRC2012_val_00035036.JPEG n02109961/
+mv val/ILSVRC2012_val_00035037.JPEG n04325704/
+mv val/ILSVRC2012_val_00035038.JPEG n03240683/
+mv val/ILSVRC2012_val_00035039.JPEG n01773157/
+mv val/ILSVRC2012_val_00035040.JPEG n02317335/
+mv val/ILSVRC2012_val_00035041.JPEG n03929660/
+mv val/ILSVRC2012_val_00035042.JPEG n02493509/
+mv val/ILSVRC2012_val_00035043.JPEG n03920288/
+mv val/ILSVRC2012_val_00035044.JPEG n03447721/
+mv val/ILSVRC2012_val_00035045.JPEG n02486261/
+mv val/ILSVRC2012_val_00035046.JPEG n04562935/
+mv val/ILSVRC2012_val_00035047.JPEG n01829413/
+mv val/ILSVRC2012_val_00035048.JPEG n01930112/
+mv val/ILSVRC2012_val_00035049.JPEG n02104365/
+mv val/ILSVRC2012_val_00035050.JPEG n02992211/
+mv val/ILSVRC2012_val_00035051.JPEG n04033901/
+mv val/ILSVRC2012_val_00035052.JPEG n03710193/
+mv val/ILSVRC2012_val_00035053.JPEG n02797295/
+mv val/ILSVRC2012_val_00035054.JPEG n01847000/
+mv val/ILSVRC2012_val_00035055.JPEG n02100583/
+mv val/ILSVRC2012_val_00035056.JPEG n04483307/
+mv val/ILSVRC2012_val_00035057.JPEG n03874599/
+mv val/ILSVRC2012_val_00035058.JPEG n04275548/
+mv val/ILSVRC2012_val_00035059.JPEG n04540053/
+mv val/ILSVRC2012_val_00035060.JPEG n01558993/
+mv val/ILSVRC2012_val_00035061.JPEG n04560804/
+mv val/ILSVRC2012_val_00035062.JPEG n04542943/
+mv val/ILSVRC2012_val_00035063.JPEG n01773549/
+mv val/ILSVRC2012_val_00035064.JPEG n04317175/
+mv val/ILSVRC2012_val_00035065.JPEG n03935335/
+mv val/ILSVRC2012_val_00035066.JPEG n07717410/
+mv val/ILSVRC2012_val_00035067.JPEG n02165456/
+mv val/ILSVRC2012_val_00035068.JPEG n03832673/
+mv val/ILSVRC2012_val_00035069.JPEG n01692333/
+mv val/ILSVRC2012_val_00035070.JPEG n03788195/
+mv val/ILSVRC2012_val_00035071.JPEG n07831146/
+mv val/ILSVRC2012_val_00035072.JPEG n03590841/
+mv val/ILSVRC2012_val_00035073.JPEG n03840681/
+mv val/ILSVRC2012_val_00035074.JPEG n02277742/
+mv val/ILSVRC2012_val_00035075.JPEG n09472597/
+mv val/ILSVRC2012_val_00035076.JPEG n07614500/
+mv val/ILSVRC2012_val_00035077.JPEG n04548280/
+mv val/ILSVRC2012_val_00035078.JPEG n03443371/
+mv val/ILSVRC2012_val_00035079.JPEG n04532670/
+mv val/ILSVRC2012_val_00035080.JPEG n01774750/
+mv val/ILSVRC2012_val_00035081.JPEG n04486054/
+mv val/ILSVRC2012_val_00035082.JPEG n03127747/
+mv val/ILSVRC2012_val_00035083.JPEG n03676483/
+mv val/ILSVRC2012_val_00035084.JPEG n02669723/
+mv val/ILSVRC2012_val_00035085.JPEG n02017213/
+mv val/ILSVRC2012_val_00035086.JPEG n01945685/
+mv val/ILSVRC2012_val_00035087.JPEG n02219486/
+mv val/ILSVRC2012_val_00035088.JPEG n04599235/
+mv val/ILSVRC2012_val_00035089.JPEG n03530642/
+mv val/ILSVRC2012_val_00035090.JPEG n04254777/
+mv val/ILSVRC2012_val_00035091.JPEG n02111500/
+mv val/ILSVRC2012_val_00035092.JPEG n03125729/
+mv val/ILSVRC2012_val_00035093.JPEG n01631663/
+mv val/ILSVRC2012_val_00035094.JPEG n07880968/
+mv val/ILSVRC2012_val_00035095.JPEG n02111277/
+mv val/ILSVRC2012_val_00035096.JPEG n01817953/
+mv val/ILSVRC2012_val_00035097.JPEG n03776460/
+mv val/ILSVRC2012_val_00035098.JPEG n01622779/
+mv val/ILSVRC2012_val_00035099.JPEG n03240683/
+mv val/ILSVRC2012_val_00035100.JPEG n02906734/
+mv val/ILSVRC2012_val_00035101.JPEG n02391049/
+mv val/ILSVRC2012_val_00035102.JPEG n01695060/
+mv val/ILSVRC2012_val_00035103.JPEG n04023962/
+mv val/ILSVRC2012_val_00035104.JPEG n01514668/
+mv val/ILSVRC2012_val_00035105.JPEG n04133789/
+mv val/ILSVRC2012_val_00035106.JPEG n02871525/
+mv val/ILSVRC2012_val_00035107.JPEG n02277742/
+mv val/ILSVRC2012_val_00035108.JPEG n02090721/
+mv val/ILSVRC2012_val_00035109.JPEG n01693334/
+mv val/ILSVRC2012_val_00035110.JPEG n04074963/
+mv val/ILSVRC2012_val_00035111.JPEG n07693725/
+mv val/ILSVRC2012_val_00035112.JPEG n01873310/
+mv val/ILSVRC2012_val_00035113.JPEG n02279972/
+mv val/ILSVRC2012_val_00035114.JPEG n02971356/
+mv val/ILSVRC2012_val_00035115.JPEG n02071294/
+mv val/ILSVRC2012_val_00035116.JPEG n03991062/
+mv val/ILSVRC2012_val_00035117.JPEG n02088238/
+mv val/ILSVRC2012_val_00035118.JPEG n03538406/
+mv val/ILSVRC2012_val_00035119.JPEG n04552348/
+mv val/ILSVRC2012_val_00035120.JPEG n02112706/
+mv val/ILSVRC2012_val_00035121.JPEG n04229816/
+mv val/ILSVRC2012_val_00035122.JPEG n03126707/
+mv val/ILSVRC2012_val_00035123.JPEG n01518878/
+mv val/ILSVRC2012_val_00035124.JPEG n03903868/
+mv val/ILSVRC2012_val_00035125.JPEG n13054560/
+mv val/ILSVRC2012_val_00035126.JPEG n04149813/
+mv val/ILSVRC2012_val_00035127.JPEG n01828970/
+mv val/ILSVRC2012_val_00035128.JPEG n03197337/
+mv val/ILSVRC2012_val_00035129.JPEG n02443114/
+mv val/ILSVRC2012_val_00035130.JPEG n03255030/
+mv val/ILSVRC2012_val_00035131.JPEG n01558993/
+mv val/ILSVRC2012_val_00035132.JPEG n03529860/
+mv val/ILSVRC2012_val_00035133.JPEG n04069434/
+mv val/ILSVRC2012_val_00035134.JPEG n02396427/
+mv val/ILSVRC2012_val_00035135.JPEG n03197337/
+mv val/ILSVRC2012_val_00035136.JPEG n02356798/
+mv val/ILSVRC2012_val_00035137.JPEG n02504013/
+mv val/ILSVRC2012_val_00035138.JPEG n02641379/
+mv val/ILSVRC2012_val_00035139.JPEG n02017213/
+mv val/ILSVRC2012_val_00035140.JPEG n01882714/
+mv val/ILSVRC2012_val_00035141.JPEG n01514859/
+mv val/ILSVRC2012_val_00035142.JPEG n04429376/
+mv val/ILSVRC2012_val_00035143.JPEG n04366367/
+mv val/ILSVRC2012_val_00035144.JPEG n04443257/
+mv val/ILSVRC2012_val_00035145.JPEG n03075370/
+mv val/ILSVRC2012_val_00035146.JPEG n03782006/
+mv val/ILSVRC2012_val_00035147.JPEG n02927161/
+mv val/ILSVRC2012_val_00035148.JPEG n03899768/
+mv val/ILSVRC2012_val_00035149.JPEG n07715103/
+mv val/ILSVRC2012_val_00035150.JPEG n03980874/
+mv val/ILSVRC2012_val_00035151.JPEG n01514668/
+mv val/ILSVRC2012_val_00035152.JPEG n03761084/
+mv val/ILSVRC2012_val_00035153.JPEG n01773797/
+mv val/ILSVRC2012_val_00035154.JPEG n02120079/
+mv val/ILSVRC2012_val_00035155.JPEG n04131690/
+mv val/ILSVRC2012_val_00035156.JPEG n07248320/
+mv val/ILSVRC2012_val_00035157.JPEG n02133161/
+mv val/ILSVRC2012_val_00035158.JPEG n02096051/
+mv val/ILSVRC2012_val_00035159.JPEG n13052670/
+mv val/ILSVRC2012_val_00035160.JPEG n02979186/
+mv val/ILSVRC2012_val_00035161.JPEG n02113023/
+mv val/ILSVRC2012_val_00035162.JPEG n03594945/
+mv val/ILSVRC2012_val_00035163.JPEG n02123045/
+mv val/ILSVRC2012_val_00035164.JPEG n02120505/
+mv val/ILSVRC2012_val_00035165.JPEG n02119022/
+mv val/ILSVRC2012_val_00035166.JPEG n02493793/
+mv val/ILSVRC2012_val_00035167.JPEG n01728572/
+mv val/ILSVRC2012_val_00035168.JPEG n03482405/
+mv val/ILSVRC2012_val_00035169.JPEG n01980166/
+mv val/ILSVRC2012_val_00035170.JPEG n07745940/
+mv val/ILSVRC2012_val_00035171.JPEG n01773549/
+mv val/ILSVRC2012_val_00035172.JPEG n02123394/
+mv val/ILSVRC2012_val_00035173.JPEG n02093754/
+mv val/ILSVRC2012_val_00035174.JPEG n03534580/
+mv val/ILSVRC2012_val_00035175.JPEG n02174001/
+mv val/ILSVRC2012_val_00035176.JPEG n02641379/
+mv val/ILSVRC2012_val_00035177.JPEG n01693334/
+mv val/ILSVRC2012_val_00035178.JPEG n01983481/
+mv val/ILSVRC2012_val_00035179.JPEG n02793495/
+mv val/ILSVRC2012_val_00035180.JPEG n04456115/
+mv val/ILSVRC2012_val_00035181.JPEG n04141327/
+mv val/ILSVRC2012_val_00035182.JPEG n02096585/
+mv val/ILSVRC2012_val_00035183.JPEG n01855672/
+mv val/ILSVRC2012_val_00035184.JPEG n03223299/
+mv val/ILSVRC2012_val_00035185.JPEG n03544143/
+mv val/ILSVRC2012_val_00035186.JPEG n02321529/
+mv val/ILSVRC2012_val_00035187.JPEG n09193705/
+mv val/ILSVRC2012_val_00035188.JPEG n04409515/
+mv val/ILSVRC2012_val_00035189.JPEG n02105162/
+mv val/ILSVRC2012_val_00035190.JPEG n03775546/
+mv val/ILSVRC2012_val_00035191.JPEG n01990800/
+mv val/ILSVRC2012_val_00035192.JPEG n02128757/
+mv val/ILSVRC2012_val_00035193.JPEG n03769881/
+mv val/ILSVRC2012_val_00035194.JPEG n03314780/
+mv val/ILSVRC2012_val_00035195.JPEG n03598930/
+mv val/ILSVRC2012_val_00035196.JPEG n03452741/
+mv val/ILSVRC2012_val_00035197.JPEG n03388183/
+mv val/ILSVRC2012_val_00035198.JPEG n03958227/
+mv val/ILSVRC2012_val_00035199.JPEG n02236044/
+mv val/ILSVRC2012_val_00035200.JPEG n04208210/
+mv val/ILSVRC2012_val_00035201.JPEG n07693725/
+mv val/ILSVRC2012_val_00035202.JPEG n01945685/
+mv val/ILSVRC2012_val_00035203.JPEG n04579432/
+mv val/ILSVRC2012_val_00035204.JPEG n02486410/
+mv val/ILSVRC2012_val_00035205.JPEG n02791270/
+mv val/ILSVRC2012_val_00035206.JPEG n02099429/
+mv val/ILSVRC2012_val_00035207.JPEG n02074367/
+mv val/ILSVRC2012_val_00035208.JPEG n04208210/
+mv val/ILSVRC2012_val_00035209.JPEG n01981276/
+mv val/ILSVRC2012_val_00035210.JPEG n03240683/
+mv val/ILSVRC2012_val_00035211.JPEG n03425413/
+mv val/ILSVRC2012_val_00035212.JPEG n02115913/
+mv val/ILSVRC2012_val_00035213.JPEG n03124043/
+mv val/ILSVRC2012_val_00035214.JPEG n02002724/
+mv val/ILSVRC2012_val_00035215.JPEG n02667093/
+mv val/ILSVRC2012_val_00035216.JPEG n03724870/
+mv val/ILSVRC2012_val_00035217.JPEG n07730033/
+mv val/ILSVRC2012_val_00035218.JPEG n03733281/
+mv val/ILSVRC2012_val_00035219.JPEG n04522168/
+mv val/ILSVRC2012_val_00035220.JPEG n07717556/
+mv val/ILSVRC2012_val_00035221.JPEG n03977966/
+mv val/ILSVRC2012_val_00035222.JPEG n03788365/
+mv val/ILSVRC2012_val_00035223.JPEG n01484850/
+mv val/ILSVRC2012_val_00035224.JPEG n03482405/
+mv val/ILSVRC2012_val_00035225.JPEG n03623198/
+mv val/ILSVRC2012_val_00035226.JPEG n07892512/
+mv val/ILSVRC2012_val_00035227.JPEG n07711569/
+mv val/ILSVRC2012_val_00035228.JPEG n03710637/
+mv val/ILSVRC2012_val_00035229.JPEG n03376595/
+mv val/ILSVRC2012_val_00035230.JPEG n04141975/
+mv val/ILSVRC2012_val_00035231.JPEG n02981792/
+mv val/ILSVRC2012_val_00035232.JPEG n03804744/
+mv val/ILSVRC2012_val_00035233.JPEG n02107312/
+mv val/ILSVRC2012_val_00035234.JPEG n03733131/
+mv val/ILSVRC2012_val_00035235.JPEG n01739381/
+mv val/ILSVRC2012_val_00035236.JPEG n04252077/
+mv val/ILSVRC2012_val_00035237.JPEG n03445924/
+mv val/ILSVRC2012_val_00035238.JPEG n04599235/
+mv val/ILSVRC2012_val_00035239.JPEG n02422699/
+mv val/ILSVRC2012_val_00035240.JPEG n03637318/
+mv val/ILSVRC2012_val_00035241.JPEG n03673027/
+mv val/ILSVRC2012_val_00035242.JPEG n03425413/
+mv val/ILSVRC2012_val_00035243.JPEG n02442845/
+mv val/ILSVRC2012_val_00035244.JPEG n02325366/
+mv val/ILSVRC2012_val_00035245.JPEG n02410509/
+mv val/ILSVRC2012_val_00035246.JPEG n02641379/
+mv val/ILSVRC2012_val_00035247.JPEG n02165105/
+mv val/ILSVRC2012_val_00035248.JPEG n02769748/
+mv val/ILSVRC2012_val_00035249.JPEG n02859443/
+mv val/ILSVRC2012_val_00035250.JPEG n01806567/
+mv val/ILSVRC2012_val_00035251.JPEG n03527444/
+mv val/ILSVRC2012_val_00035252.JPEG n02099601/
+mv val/ILSVRC2012_val_00035253.JPEG n07715103/
+mv val/ILSVRC2012_val_00035254.JPEG n01531178/
+mv val/ILSVRC2012_val_00035255.JPEG n04599235/
+mv val/ILSVRC2012_val_00035256.JPEG n07697313/
+mv val/ILSVRC2012_val_00035257.JPEG n02091244/
+mv val/ILSVRC2012_val_00035258.JPEG n04317175/
+mv val/ILSVRC2012_val_00035259.JPEG n02823428/
+mv val/ILSVRC2012_val_00035260.JPEG n02096437/
+mv val/ILSVRC2012_val_00035261.JPEG n02236044/
+mv val/ILSVRC2012_val_00035262.JPEG n02190166/
+mv val/ILSVRC2012_val_00035263.JPEG n02948072/
+mv val/ILSVRC2012_val_00035264.JPEG n01728920/
+mv val/ILSVRC2012_val_00035265.JPEG n01728572/
+mv val/ILSVRC2012_val_00035266.JPEG n03000684/
+mv val/ILSVRC2012_val_00035267.JPEG n03133878/
+mv val/ILSVRC2012_val_00035268.JPEG n02017213/
+mv val/ILSVRC2012_val_00035269.JPEG n01978287/
+mv val/ILSVRC2012_val_00035270.JPEG n03775071/
+mv val/ILSVRC2012_val_00035271.JPEG n04479046/
+mv val/ILSVRC2012_val_00035272.JPEG n07720875/
+mv val/ILSVRC2012_val_00035273.JPEG n06785654/
+mv val/ILSVRC2012_val_00035274.JPEG n01843383/
+mv val/ILSVRC2012_val_00035275.JPEG n02108089/
+mv val/ILSVRC2012_val_00035276.JPEG n02606052/
+mv val/ILSVRC2012_val_00035277.JPEG n02794156/
+mv val/ILSVRC2012_val_00035278.JPEG n02100583/
+mv val/ILSVRC2012_val_00035279.JPEG n12620546/
+mv val/ILSVRC2012_val_00035280.JPEG n02412080/
+mv val/ILSVRC2012_val_00035281.JPEG n01677366/
+mv val/ILSVRC2012_val_00035282.JPEG n03710637/
+mv val/ILSVRC2012_val_00035283.JPEG n07753275/
+mv val/ILSVRC2012_val_00035284.JPEG n02417914/
+mv val/ILSVRC2012_val_00035285.JPEG n04019541/
+mv val/ILSVRC2012_val_00035286.JPEG n01697457/
+mv val/ILSVRC2012_val_00035287.JPEG n01806143/
+mv val/ILSVRC2012_val_00035288.JPEG n03759954/
+mv val/ILSVRC2012_val_00035289.JPEG n02115913/
+mv val/ILSVRC2012_val_00035290.JPEG n12985857/
+mv val/ILSVRC2012_val_00035291.JPEG n03530642/
+mv val/ILSVRC2012_val_00035292.JPEG n02133161/
+mv val/ILSVRC2012_val_00035293.JPEG n02086240/
+mv val/ILSVRC2012_val_00035294.JPEG n02782093/
+mv val/ILSVRC2012_val_00035295.JPEG n02259212/
+mv val/ILSVRC2012_val_00035296.JPEG n02110806/
+mv val/ILSVRC2012_val_00035297.JPEG n03733131/
+mv val/ILSVRC2012_val_00035298.JPEG n02096294/
+mv val/ILSVRC2012_val_00035299.JPEG n04229816/
+mv val/ILSVRC2012_val_00035300.JPEG n06794110/
+mv val/ILSVRC2012_val_00035301.JPEG n02699494/
+mv val/ILSVRC2012_val_00035302.JPEG n03761084/
+mv val/ILSVRC2012_val_00035303.JPEG n01592084/
+mv val/ILSVRC2012_val_00035304.JPEG n07695742/
+mv val/ILSVRC2012_val_00035305.JPEG n01631663/
+mv val/ILSVRC2012_val_00035306.JPEG n03017168/
+mv val/ILSVRC2012_val_00035307.JPEG n04350905/
+mv val/ILSVRC2012_val_00035308.JPEG n02256656/
+mv val/ILSVRC2012_val_00035309.JPEG n04285008/
+mv val/ILSVRC2012_val_00035310.JPEG n01984695/
+mv val/ILSVRC2012_val_00035311.JPEG n04275548/
+mv val/ILSVRC2012_val_00035312.JPEG n01883070/
+mv val/ILSVRC2012_val_00035313.JPEG n03047690/
+mv val/ILSVRC2012_val_00035314.JPEG n02445715/
+mv val/ILSVRC2012_val_00035315.JPEG n02088094/
+mv val/ILSVRC2012_val_00035316.JPEG n03223299/
+mv val/ILSVRC2012_val_00035317.JPEG n01729322/
+mv val/ILSVRC2012_val_00035318.JPEG n03837869/
+mv val/ILSVRC2012_val_00035319.JPEG n02102480/
+mv val/ILSVRC2012_val_00035320.JPEG n02088364/
+mv val/ILSVRC2012_val_00035321.JPEG n02102177/
+mv val/ILSVRC2012_val_00035322.JPEG n04265275/
+mv val/ILSVRC2012_val_00035323.JPEG n02319095/
+mv val/ILSVRC2012_val_00035324.JPEG n02229544/
+mv val/ILSVRC2012_val_00035325.JPEG n03759954/
+mv val/ILSVRC2012_val_00035326.JPEG n02869837/
+mv val/ILSVRC2012_val_00035327.JPEG n04209133/
+mv val/ILSVRC2012_val_00035328.JPEG n03291819/
+mv val/ILSVRC2012_val_00035329.JPEG n04371774/
+mv val/ILSVRC2012_val_00035330.JPEG n02138441/
+mv val/ILSVRC2012_val_00035331.JPEG n02417914/
+mv val/ILSVRC2012_val_00035332.JPEG n02128757/
+mv val/ILSVRC2012_val_00035333.JPEG n02098286/
+mv val/ILSVRC2012_val_00035334.JPEG n04591157/
+mv val/ILSVRC2012_val_00035335.JPEG n03443371/
+mv val/ILSVRC2012_val_00035336.JPEG n03902125/
+mv val/ILSVRC2012_val_00035337.JPEG n02422106/
+mv val/ILSVRC2012_val_00035338.JPEG n04423845/
+mv val/ILSVRC2012_val_00035339.JPEG n04465501/
+mv val/ILSVRC2012_val_00035340.JPEG n13052670/
+mv val/ILSVRC2012_val_00035341.JPEG n02087394/
+mv val/ILSVRC2012_val_00035342.JPEG n04367480/
+mv val/ILSVRC2012_val_00035343.JPEG n07742313/
+mv val/ILSVRC2012_val_00035344.JPEG n03538406/
+mv val/ILSVRC2012_val_00035345.JPEG n03492542/
+mv val/ILSVRC2012_val_00035346.JPEG n03868863/
+mv val/ILSVRC2012_val_00035347.JPEG n02088632/
+mv val/ILSVRC2012_val_00035348.JPEG n01582220/
+mv val/ILSVRC2012_val_00035349.JPEG n03876231/
+mv val/ILSVRC2012_val_00035350.JPEG n03770439/
+mv val/ILSVRC2012_val_00035351.JPEG n02977058/
+mv val/ILSVRC2012_val_00035352.JPEG n03457902/
+mv val/ILSVRC2012_val_00035353.JPEG n03874293/
+mv val/ILSVRC2012_val_00035354.JPEG n03902125/
+mv val/ILSVRC2012_val_00035355.JPEG n03929855/
+mv val/ILSVRC2012_val_00035356.JPEG n02391049/
+mv val/ILSVRC2012_val_00035357.JPEG n03180011/
+mv val/ILSVRC2012_val_00035358.JPEG n03956157/
+mv val/ILSVRC2012_val_00035359.JPEG n02790996/
+mv val/ILSVRC2012_val_00035360.JPEG n02099712/
+mv val/ILSVRC2012_val_00035361.JPEG n01980166/
+mv val/ILSVRC2012_val_00035362.JPEG n04041544/
+mv val/ILSVRC2012_val_00035363.JPEG n02033041/
+mv val/ILSVRC2012_val_00035364.JPEG n03976657/
+mv val/ILSVRC2012_val_00035365.JPEG n01751748/
+mv val/ILSVRC2012_val_00035366.JPEG n02127052/
+mv val/ILSVRC2012_val_00035367.JPEG n01494475/
+mv val/ILSVRC2012_val_00035368.JPEG n02128385/
+mv val/ILSVRC2012_val_00035369.JPEG n04204347/
+mv val/ILSVRC2012_val_00035370.JPEG n03690938/
+mv val/ILSVRC2012_val_00035371.JPEG n03759954/
+mv val/ILSVRC2012_val_00035372.JPEG n02412080/
+mv val/ILSVRC2012_val_00035373.JPEG n04204238/
+mv val/ILSVRC2012_val_00035374.JPEG n03662601/
+mv val/ILSVRC2012_val_00035375.JPEG n02114855/
+mv val/ILSVRC2012_val_00035376.JPEG n03788365/
+mv val/ILSVRC2012_val_00035377.JPEG n02104029/
+mv val/ILSVRC2012_val_00035378.JPEG n02101556/
+mv val/ILSVRC2012_val_00035379.JPEG n01737021/
+mv val/ILSVRC2012_val_00035380.JPEG n09288635/
+mv val/ILSVRC2012_val_00035381.JPEG n02096177/
+mv val/ILSVRC2012_val_00035382.JPEG n02492035/
+mv val/ILSVRC2012_val_00035383.JPEG n04238763/
+mv val/ILSVRC2012_val_00035384.JPEG n03393912/
+mv val/ILSVRC2012_val_00035385.JPEG n04149813/
+mv val/ILSVRC2012_val_00035386.JPEG n02398521/
+mv val/ILSVRC2012_val_00035387.JPEG n01742172/
+mv val/ILSVRC2012_val_00035388.JPEG n02130308/
+mv val/ILSVRC2012_val_00035389.JPEG n01534433/
+mv val/ILSVRC2012_val_00035390.JPEG n04404412/
+mv val/ILSVRC2012_val_00035391.JPEG n02107683/
+mv val/ILSVRC2012_val_00035392.JPEG n02708093/
+mv val/ILSVRC2012_val_00035393.JPEG n04209239/
+mv val/ILSVRC2012_val_00035394.JPEG n07715103/
+mv val/ILSVRC2012_val_00035395.JPEG n07718747/
+mv val/ILSVRC2012_val_00035396.JPEG n04462240/
+mv val/ILSVRC2012_val_00035397.JPEG n02510455/
+mv val/ILSVRC2012_val_00035398.JPEG n02098105/
+mv val/ILSVRC2012_val_00035399.JPEG n02277742/
+mv val/ILSVRC2012_val_00035400.JPEG n02096437/
+mv val/ILSVRC2012_val_00035401.JPEG n02802426/
+mv val/ILSVRC2012_val_00035402.JPEG n02486261/
+mv val/ILSVRC2012_val_00035403.JPEG n02091134/
+mv val/ILSVRC2012_val_00035404.JPEG n03272010/
+mv val/ILSVRC2012_val_00035405.JPEG n01491361/
+mv val/ILSVRC2012_val_00035406.JPEG n04604644/
+mv val/ILSVRC2012_val_00035407.JPEG n02640242/
+mv val/ILSVRC2012_val_00035408.JPEG n03692522/
+mv val/ILSVRC2012_val_00035409.JPEG n02229544/
+mv val/ILSVRC2012_val_00035410.JPEG n07720875/
+mv val/ILSVRC2012_val_00035411.JPEG n04606251/
+mv val/ILSVRC2012_val_00035412.JPEG n04201297/
+mv val/ILSVRC2012_val_00035413.JPEG n11939491/
+mv val/ILSVRC2012_val_00035414.JPEG n02088364/
+mv val/ILSVRC2012_val_00035415.JPEG n02655020/
+mv val/ILSVRC2012_val_00035416.JPEG n03657121/
+mv val/ILSVRC2012_val_00035417.JPEG n02112350/
+mv val/ILSVRC2012_val_00035418.JPEG n02326432/
+mv val/ILSVRC2012_val_00035419.JPEG n03445777/
+mv val/ILSVRC2012_val_00035420.JPEG n02028035/
+mv val/ILSVRC2012_val_00035421.JPEG n04326547/
+mv val/ILSVRC2012_val_00035422.JPEG n03400231/
+mv val/ILSVRC2012_val_00035423.JPEG n02091032/
+mv val/ILSVRC2012_val_00035424.JPEG n03710193/
+mv val/ILSVRC2012_val_00035425.JPEG n01742172/
+mv val/ILSVRC2012_val_00035426.JPEG n01806567/
+mv val/ILSVRC2012_val_00035427.JPEG n03485407/
+mv val/ILSVRC2012_val_00035428.JPEG n03450230/
+mv val/ILSVRC2012_val_00035429.JPEG n01735189/
+mv val/ILSVRC2012_val_00035430.JPEG n02319095/
+mv val/ILSVRC2012_val_00035431.JPEG n03467068/
+mv val/ILSVRC2012_val_00035432.JPEG n04458633/
+mv val/ILSVRC2012_val_00035433.JPEG n03394916/
+mv val/ILSVRC2012_val_00035434.JPEG n02500267/
+mv val/ILSVRC2012_val_00035435.JPEG n04525038/
+mv val/ILSVRC2012_val_00035436.JPEG n02112137/
+mv val/ILSVRC2012_val_00035437.JPEG n02107908/
+mv val/ILSVRC2012_val_00035438.JPEG n12768682/
+mv val/ILSVRC2012_val_00035439.JPEG n02119789/
+mv val/ILSVRC2012_val_00035440.JPEG n03662601/
+mv val/ILSVRC2012_val_00035441.JPEG n07860988/
+mv val/ILSVRC2012_val_00035442.JPEG n04584207/
+mv val/ILSVRC2012_val_00035443.JPEG n07932039/
+mv val/ILSVRC2012_val_00035444.JPEG n03062245/
+mv val/ILSVRC2012_val_00035445.JPEG n07745940/
+mv val/ILSVRC2012_val_00035446.JPEG n03085013/
+mv val/ILSVRC2012_val_00035447.JPEG n04465501/
+mv val/ILSVRC2012_val_00035448.JPEG n02483708/
+mv val/ILSVRC2012_val_00035449.JPEG n03379051/
+mv val/ILSVRC2012_val_00035450.JPEG n01631663/
+mv val/ILSVRC2012_val_00035451.JPEG n01773157/
+mv val/ILSVRC2012_val_00035452.JPEG n02364673/
+mv val/ILSVRC2012_val_00035453.JPEG n02917067/
+mv val/ILSVRC2012_val_00035454.JPEG n02488702/
+mv val/ILSVRC2012_val_00035455.JPEG n02105412/
+mv val/ILSVRC2012_val_00035456.JPEG n02423022/
+mv val/ILSVRC2012_val_00035457.JPEG n03868242/
+mv val/ILSVRC2012_val_00035458.JPEG n02018207/
+mv val/ILSVRC2012_val_00035459.JPEG n02113624/
+mv val/ILSVRC2012_val_00035460.JPEG n04041544/
+mv val/ILSVRC2012_val_00035461.JPEG n04548280/
+mv val/ILSVRC2012_val_00035462.JPEG n03483316/
+mv val/ILSVRC2012_val_00035463.JPEG n03444034/
+mv val/ILSVRC2012_val_00035464.JPEG n02125311/
+mv val/ILSVRC2012_val_00035465.JPEG n02281406/
+mv val/ILSVRC2012_val_00035466.JPEG n04041544/
+mv val/ILSVRC2012_val_00035467.JPEG n03223299/
+mv val/ILSVRC2012_val_00035468.JPEG n03602883/
+mv val/ILSVRC2012_val_00035469.JPEG n12144580/
+mv val/ILSVRC2012_val_00035470.JPEG n04192698/
+mv val/ILSVRC2012_val_00035471.JPEG n07831146/
+mv val/ILSVRC2012_val_00035472.JPEG n01748264/
+mv val/ILSVRC2012_val_00035473.JPEG n02096177/
+mv val/ILSVRC2012_val_00035474.JPEG n01798484/
+mv val/ILSVRC2012_val_00035475.JPEG n03075370/
+mv val/ILSVRC2012_val_00035476.JPEG n01807496/
+mv val/ILSVRC2012_val_00035477.JPEG n04479046/
+mv val/ILSVRC2012_val_00035478.JPEG n03457902/
+mv val/ILSVRC2012_val_00035479.JPEG n02504013/
+mv val/ILSVRC2012_val_00035480.JPEG n02097047/
+mv val/ILSVRC2012_val_00035481.JPEG n07583066/
+mv val/ILSVRC2012_val_00035482.JPEG n02979186/
+mv val/ILSVRC2012_val_00035483.JPEG n03595614/
+mv val/ILSVRC2012_val_00035484.JPEG n04286575/
+mv val/ILSVRC2012_val_00035485.JPEG n09246464/
+mv val/ILSVRC2012_val_00035486.JPEG n02981792/
+mv val/ILSVRC2012_val_00035487.JPEG n03220513/
+mv val/ILSVRC2012_val_00035488.JPEG n02090379/
+mv val/ILSVRC2012_val_00035489.JPEG n02037110/
+mv val/ILSVRC2012_val_00035490.JPEG n02009912/
+mv val/ILSVRC2012_val_00035491.JPEG n07860988/
+mv val/ILSVRC2012_val_00035492.JPEG n04435653/
+mv val/ILSVRC2012_val_00035493.JPEG n02486261/
+mv val/ILSVRC2012_val_00035494.JPEG n02129604/
+mv val/ILSVRC2012_val_00035495.JPEG n01491361/
+mv val/ILSVRC2012_val_00035496.JPEG n04579432/
+mv val/ILSVRC2012_val_00035497.JPEG n02165456/
+mv val/ILSVRC2012_val_00035498.JPEG n03259280/
+mv val/ILSVRC2012_val_00035499.JPEG n01860187/
+mv val/ILSVRC2012_val_00035500.JPEG n03796401/
+mv val/ILSVRC2012_val_00035501.JPEG n02356798/
+mv val/ILSVRC2012_val_00035502.JPEG n01828970/
+mv val/ILSVRC2012_val_00035503.JPEG n02206856/
+mv val/ILSVRC2012_val_00035504.JPEG n03983396/
+mv val/ILSVRC2012_val_00035505.JPEG n02783161/
+mv val/ILSVRC2012_val_00035506.JPEG n03134739/
+mv val/ILSVRC2012_val_00035507.JPEG n02823428/
+mv val/ILSVRC2012_val_00035508.JPEG n04371430/
+mv val/ILSVRC2012_val_00035509.JPEG n04118776/
+mv val/ILSVRC2012_val_00035510.JPEG n02106166/
+mv val/ILSVRC2012_val_00035511.JPEG n02988304/
+mv val/ILSVRC2012_val_00035512.JPEG n01770081/
+mv val/ILSVRC2012_val_00035513.JPEG n04465501/
+mv val/ILSVRC2012_val_00035514.JPEG n03447447/
+mv val/ILSVRC2012_val_00035515.JPEG n03976467/
+mv val/ILSVRC2012_val_00035516.JPEG n02977058/
+mv val/ILSVRC2012_val_00035517.JPEG n02058221/
+mv val/ILSVRC2012_val_00035518.JPEG n02280649/
+mv val/ILSVRC2012_val_00035519.JPEG n03445777/
+mv val/ILSVRC2012_val_00035520.JPEG n03884397/
+mv val/ILSVRC2012_val_00035521.JPEG n01797886/
+mv val/ILSVRC2012_val_00035522.JPEG n03240683/
+mv val/ILSVRC2012_val_00035523.JPEG n03485794/
+mv val/ILSVRC2012_val_00035524.JPEG n02974003/
+mv val/ILSVRC2012_val_00035525.JPEG n04548280/
+mv val/ILSVRC2012_val_00035526.JPEG n02168699/
+mv val/ILSVRC2012_val_00035527.JPEG n07716906/
+mv val/ILSVRC2012_val_00035528.JPEG n02002556/
+mv val/ILSVRC2012_val_00035529.JPEG n01632777/
+mv val/ILSVRC2012_val_00035530.JPEG n02111129/
+mv val/ILSVRC2012_val_00035531.JPEG n02492035/
+mv val/ILSVRC2012_val_00035532.JPEG n02123159/
+mv val/ILSVRC2012_val_00035533.JPEG n03424325/
+mv val/ILSVRC2012_val_00035534.JPEG n02231487/
+mv val/ILSVRC2012_val_00035535.JPEG n01641577/
+mv val/ILSVRC2012_val_00035536.JPEG n07873807/
+mv val/ILSVRC2012_val_00035537.JPEG n02363005/
+mv val/ILSVRC2012_val_00035538.JPEG n02100877/
+mv val/ILSVRC2012_val_00035539.JPEG n03777568/
+mv val/ILSVRC2012_val_00035540.JPEG n01530575/
+mv val/ILSVRC2012_val_00035541.JPEG n03998194/
+mv val/ILSVRC2012_val_00035542.JPEG n01829413/
+mv val/ILSVRC2012_val_00035543.JPEG n02480855/
+mv val/ILSVRC2012_val_00035544.JPEG n09288635/
+mv val/ILSVRC2012_val_00035545.JPEG n02321529/
+mv val/ILSVRC2012_val_00035546.JPEG n02509815/
+mv val/ILSVRC2012_val_00035547.JPEG n03482405/
+mv val/ILSVRC2012_val_00035548.JPEG n04493381/
+mv val/ILSVRC2012_val_00035549.JPEG n02319095/
+mv val/ILSVRC2012_val_00035550.JPEG n03223299/
+mv val/ILSVRC2012_val_00035551.JPEG n03388549/
+mv val/ILSVRC2012_val_00035552.JPEG n02113186/
+mv val/ILSVRC2012_val_00035553.JPEG n02093859/
+mv val/ILSVRC2012_val_00035554.JPEG n07718747/
+mv val/ILSVRC2012_val_00035555.JPEG n01855032/
+mv val/ILSVRC2012_val_00035556.JPEG n10148035/
+mv val/ILSVRC2012_val_00035557.JPEG n07753113/
+mv val/ILSVRC2012_val_00035558.JPEG n04154565/
+mv val/ILSVRC2012_val_00035559.JPEG n02423022/
+mv val/ILSVRC2012_val_00035560.JPEG n04179913/
+mv val/ILSVRC2012_val_00035561.JPEG n02486410/
+mv val/ILSVRC2012_val_00035562.JPEG n02106382/
+mv val/ILSVRC2012_val_00035563.JPEG n02033041/
+mv val/ILSVRC2012_val_00035564.JPEG n02483708/
+mv val/ILSVRC2012_val_00035565.JPEG n01537544/
+mv val/ILSVRC2012_val_00035566.JPEG n02123597/
+mv val/ILSVRC2012_val_00035567.JPEG n03240683/
+mv val/ILSVRC2012_val_00035568.JPEG n04026417/
+mv val/ILSVRC2012_val_00035569.JPEG n02108422/
+mv val/ILSVRC2012_val_00035570.JPEG n09399592/
+mv val/ILSVRC2012_val_00035571.JPEG n02104365/
+mv val/ILSVRC2012_val_00035572.JPEG n03794056/
+mv val/ILSVRC2012_val_00035573.JPEG n01776313/
+mv val/ILSVRC2012_val_00035574.JPEG n02787622/
+mv val/ILSVRC2012_val_00035575.JPEG n03854065/
+mv val/ILSVRC2012_val_00035576.JPEG n01729977/
+mv val/ILSVRC2012_val_00035577.JPEG n02127052/
+mv val/ILSVRC2012_val_00035578.JPEG n03942813/
+mv val/ILSVRC2012_val_00035579.JPEG n02109047/
+mv val/ILSVRC2012_val_00035580.JPEG n03133878/
+mv val/ILSVRC2012_val_00035581.JPEG n03775071/
+mv val/ILSVRC2012_val_00035582.JPEG n02268443/
+mv val/ILSVRC2012_val_00035583.JPEG n04118776/
+mv val/ILSVRC2012_val_00035584.JPEG n02009912/
+mv val/ILSVRC2012_val_00035585.JPEG n02111889/
+mv val/ILSVRC2012_val_00035586.JPEG n04542943/
+mv val/ILSVRC2012_val_00035587.JPEG n03759954/
+mv val/ILSVRC2012_val_00035588.JPEG n03633091/
+mv val/ILSVRC2012_val_00035589.JPEG n03124043/
+mv val/ILSVRC2012_val_00035590.JPEG n03016953/
+mv val/ILSVRC2012_val_00035591.JPEG n02133161/
+mv val/ILSVRC2012_val_00035592.JPEG n02106030/
+mv val/ILSVRC2012_val_00035593.JPEG n01773797/
+mv val/ILSVRC2012_val_00035594.JPEG n03887697/
+mv val/ILSVRC2012_val_00035595.JPEG n04501370/
+mv val/ILSVRC2012_val_00035596.JPEG n04120489/
+mv val/ILSVRC2012_val_00035597.JPEG n02096051/
+mv val/ILSVRC2012_val_00035598.JPEG n01682714/
+mv val/ILSVRC2012_val_00035599.JPEG n03133878/
+mv val/ILSVRC2012_val_00035600.JPEG n02992211/
+mv val/ILSVRC2012_val_00035601.JPEG n01795545/
+mv val/ILSVRC2012_val_00035602.JPEG n02033041/
+mv val/ILSVRC2012_val_00035603.JPEG n04285008/
+mv val/ILSVRC2012_val_00035604.JPEG n02113978/
+mv val/ILSVRC2012_val_00035605.JPEG n02006656/
+mv val/ILSVRC2012_val_00035606.JPEG n01768244/
+mv val/ILSVRC2012_val_00035607.JPEG n02837789/
+mv val/ILSVRC2012_val_00035608.JPEG n01622779/
+mv val/ILSVRC2012_val_00035609.JPEG n02091831/
+mv val/ILSVRC2012_val_00035610.JPEG n02992529/
+mv val/ILSVRC2012_val_00035611.JPEG n03929660/
+mv val/ILSVRC2012_val_00035612.JPEG n02493793/
+mv val/ILSVRC2012_val_00035613.JPEG n03447447/
+mv val/ILSVRC2012_val_00035614.JPEG n02013706/
+mv val/ILSVRC2012_val_00035615.JPEG n03478589/
+mv val/ILSVRC2012_val_00035616.JPEG n07615774/
+mv val/ILSVRC2012_val_00035617.JPEG n03530642/
+mv val/ILSVRC2012_val_00035618.JPEG n02410509/
+mv val/ILSVRC2012_val_00035619.JPEG n01968897/
+mv val/ILSVRC2012_val_00035620.JPEG n04252077/
+mv val/ILSVRC2012_val_00035621.JPEG n03976467/
+mv val/ILSVRC2012_val_00035622.JPEG n07871810/
+mv val/ILSVRC2012_val_00035623.JPEG n01697457/
+mv val/ILSVRC2012_val_00035624.JPEG n04200800/
+mv val/ILSVRC2012_val_00035625.JPEG n01806567/
+mv val/ILSVRC2012_val_00035626.JPEG n03998194/
+mv val/ILSVRC2012_val_00035627.JPEG n03721384/
+mv val/ILSVRC2012_val_00035628.JPEG n02107683/
+mv val/ILSVRC2012_val_00035629.JPEG n02950826/
+mv val/ILSVRC2012_val_00035630.JPEG n02834397/
+mv val/ILSVRC2012_val_00035631.JPEG n02978881/
+mv val/ILSVRC2012_val_00035632.JPEG n02106166/
+mv val/ILSVRC2012_val_00035633.JPEG n02098413/
+mv val/ILSVRC2012_val_00035634.JPEG n04204238/
+mv val/ILSVRC2012_val_00035635.JPEG n04328186/
+mv val/ILSVRC2012_val_00035636.JPEG n01943899/
+mv val/ILSVRC2012_val_00035637.JPEG n03494278/
+mv val/ILSVRC2012_val_00035638.JPEG n01798484/
+mv val/ILSVRC2012_val_00035639.JPEG n07714990/
+mv val/ILSVRC2012_val_00035640.JPEG n02105056/
+mv val/ILSVRC2012_val_00035641.JPEG n04033995/
+mv val/ILSVRC2012_val_00035642.JPEG n03207743/
+mv val/ILSVRC2012_val_00035643.JPEG n03459775/
+mv val/ILSVRC2012_val_00035644.JPEG n02704792/
+mv val/ILSVRC2012_val_00035645.JPEG n03379051/
+mv val/ILSVRC2012_val_00035646.JPEG n04372370/
+mv val/ILSVRC2012_val_00035647.JPEG n01855032/
+mv val/ILSVRC2012_val_00035648.JPEG n03124170/
+mv val/ILSVRC2012_val_00035649.JPEG n04039381/
+mv val/ILSVRC2012_val_00035650.JPEG n04355338/
+mv val/ILSVRC2012_val_00035651.JPEG n01774384/
+mv val/ILSVRC2012_val_00035652.JPEG n03016953/
+mv val/ILSVRC2012_val_00035653.JPEG n02486261/
+mv val/ILSVRC2012_val_00035654.JPEG n01632777/
+mv val/ILSVRC2012_val_00035655.JPEG n02319095/
+mv val/ILSVRC2012_val_00035656.JPEG n02106550/
+mv val/ILSVRC2012_val_00035657.JPEG n03476684/
+mv val/ILSVRC2012_val_00035658.JPEG n01644900/
+mv val/ILSVRC2012_val_00035659.JPEG n03729826/
+mv val/ILSVRC2012_val_00035660.JPEG n03047690/
+mv val/ILSVRC2012_val_00035661.JPEG n04179913/
+mv val/ILSVRC2012_val_00035662.JPEG n02437312/
+mv val/ILSVRC2012_val_00035663.JPEG n03769881/
+mv val/ILSVRC2012_val_00035664.JPEG n01664065/
+mv val/ILSVRC2012_val_00035665.JPEG n02107683/
+mv val/ILSVRC2012_val_00035666.JPEG n09835506/
+mv val/ILSVRC2012_val_00035667.JPEG n01784675/
+mv val/ILSVRC2012_val_00035668.JPEG n02483362/
+mv val/ILSVRC2012_val_00035669.JPEG n02089867/
+mv val/ILSVRC2012_val_00035670.JPEG n04356056/
+mv val/ILSVRC2012_val_00035671.JPEG n03666591/
+mv val/ILSVRC2012_val_00035672.JPEG n06359193/
+mv val/ILSVRC2012_val_00035673.JPEG n02277742/
+mv val/ILSVRC2012_val_00035674.JPEG n04456115/
+mv val/ILSVRC2012_val_00035675.JPEG n02099267/
+mv val/ILSVRC2012_val_00035676.JPEG n03657121/
+mv val/ILSVRC2012_val_00035677.JPEG n04149813/
+mv val/ILSVRC2012_val_00035678.JPEG n07579787/
+mv val/ILSVRC2012_val_00035679.JPEG n04372370/
+mv val/ILSVRC2012_val_00035680.JPEG n02095314/
+mv val/ILSVRC2012_val_00035681.JPEG n03496892/
+mv val/ILSVRC2012_val_00035682.JPEG n02483708/
+mv val/ILSVRC2012_val_00035683.JPEG n04417672/
+mv val/ILSVRC2012_val_00035684.JPEG n04447861/
+mv val/ILSVRC2012_val_00035685.JPEG n02804610/
+mv val/ILSVRC2012_val_00035686.JPEG n03126707/
+mv val/ILSVRC2012_val_00035687.JPEG n01704323/
+mv val/ILSVRC2012_val_00035688.JPEG n09332890/
+mv val/ILSVRC2012_val_00035689.JPEG n02090379/
+mv val/ILSVRC2012_val_00035690.JPEG n03837869/
+mv val/ILSVRC2012_val_00035691.JPEG n11939491/
+mv val/ILSVRC2012_val_00035692.JPEG n03866082/
+mv val/ILSVRC2012_val_00035693.JPEG n03733131/
+mv val/ILSVRC2012_val_00035694.JPEG n02165456/
+mv val/ILSVRC2012_val_00035695.JPEG n04443257/
+mv val/ILSVRC2012_val_00035696.JPEG n02281787/
+mv val/ILSVRC2012_val_00035697.JPEG n02398521/
+mv val/ILSVRC2012_val_00035698.JPEG n07718472/
+mv val/ILSVRC2012_val_00035699.JPEG n02106382/
+mv val/ILSVRC2012_val_00035700.JPEG n02066245/
+mv val/ILSVRC2012_val_00035701.JPEG n04428191/
+mv val/ILSVRC2012_val_00035702.JPEG n03527444/
+mv val/ILSVRC2012_val_00035703.JPEG n03085013/
+mv val/ILSVRC2012_val_00035704.JPEG n02112350/
+mv val/ILSVRC2012_val_00035705.JPEG n02094433/
+mv val/ILSVRC2012_val_00035706.JPEG n03942813/
+mv val/ILSVRC2012_val_00035707.JPEG n02398521/
+mv val/ILSVRC2012_val_00035708.JPEG n02865351/
+mv val/ILSVRC2012_val_00035709.JPEG n03908618/
+mv val/ILSVRC2012_val_00035710.JPEG n02229544/
+mv val/ILSVRC2012_val_00035711.JPEG n01981276/
+mv val/ILSVRC2012_val_00035712.JPEG n03208938/
+mv val/ILSVRC2012_val_00035713.JPEG n02236044/
+mv val/ILSVRC2012_val_00035714.JPEG n04542943/
+mv val/ILSVRC2012_val_00035715.JPEG n02804610/
+mv val/ILSVRC2012_val_00035716.JPEG n02843684/
+mv val/ILSVRC2012_val_00035717.JPEG n01687978/
+mv val/ILSVRC2012_val_00035718.JPEG n02447366/
+mv val/ILSVRC2012_val_00035719.JPEG n02099849/
+mv val/ILSVRC2012_val_00035720.JPEG n03017168/
+mv val/ILSVRC2012_val_00035721.JPEG n02999410/
+mv val/ILSVRC2012_val_00035722.JPEG n02013706/
+mv val/ILSVRC2012_val_00035723.JPEG n02102040/
+mv val/ILSVRC2012_val_00035724.JPEG n02825657/
+mv val/ILSVRC2012_val_00035725.JPEG n02091831/
+mv val/ILSVRC2012_val_00035726.JPEG n01833805/
+mv val/ILSVRC2012_val_00035727.JPEG n02117135/
+mv val/ILSVRC2012_val_00035728.JPEG n01910747/
+mv val/ILSVRC2012_val_00035729.JPEG n03724870/
+mv val/ILSVRC2012_val_00035730.JPEG n04209133/
+mv val/ILSVRC2012_val_00035731.JPEG n04328186/
+mv val/ILSVRC2012_val_00035732.JPEG n03761084/
+mv val/ILSVRC2012_val_00035733.JPEG n04509417/
+mv val/ILSVRC2012_val_00035734.JPEG n04612504/
+mv val/ILSVRC2012_val_00035735.JPEG n01537544/
+mv val/ILSVRC2012_val_00035736.JPEG n01748264/
+mv val/ILSVRC2012_val_00035737.JPEG n04542943/
+mv val/ILSVRC2012_val_00035738.JPEG n02892767/
+mv val/ILSVRC2012_val_00035739.JPEG n04332243/
+mv val/ILSVRC2012_val_00035740.JPEG n04591713/
+mv val/ILSVRC2012_val_00035741.JPEG n02116738/
+mv val/ILSVRC2012_val_00035742.JPEG n07714990/
+mv val/ILSVRC2012_val_00035743.JPEG n03782006/
+mv val/ILSVRC2012_val_00035744.JPEG n07697313/
+mv val/ILSVRC2012_val_00035745.JPEG n03692522/
+mv val/ILSVRC2012_val_00035746.JPEG n02776631/
+mv val/ILSVRC2012_val_00035747.JPEG n03197337/
+mv val/ILSVRC2012_val_00035748.JPEG n06874185/
+mv val/ILSVRC2012_val_00035749.JPEG n02089867/
+mv val/ILSVRC2012_val_00035750.JPEG n02790996/
+mv val/ILSVRC2012_val_00035751.JPEG n02979186/
+mv val/ILSVRC2012_val_00035752.JPEG n03938244/
+mv val/ILSVRC2012_val_00035753.JPEG n03028079/
+mv val/ILSVRC2012_val_00035754.JPEG n02823428/
+mv val/ILSVRC2012_val_00035755.JPEG n04133789/
+mv val/ILSVRC2012_val_00035756.JPEG n02794156/
+mv val/ILSVRC2012_val_00035757.JPEG n02815834/
+mv val/ILSVRC2012_val_00035758.JPEG n03063599/
+mv val/ILSVRC2012_val_00035759.JPEG n10148035/
+mv val/ILSVRC2012_val_00035760.JPEG n02486261/
+mv val/ILSVRC2012_val_00035761.JPEG n04435653/
+mv val/ILSVRC2012_val_00035762.JPEG n01943899/
+mv val/ILSVRC2012_val_00035763.JPEG n02391049/
+mv val/ILSVRC2012_val_00035764.JPEG n02090622/
+mv val/ILSVRC2012_val_00035765.JPEG n04542943/
+mv val/ILSVRC2012_val_00035766.JPEG n02058221/
+mv val/ILSVRC2012_val_00035767.JPEG n02089867/
+mv val/ILSVRC2012_val_00035768.JPEG n02115641/
+mv val/ILSVRC2012_val_00035769.JPEG n03930313/
+mv val/ILSVRC2012_val_00035770.JPEG n02105412/
+mv val/ILSVRC2012_val_00035771.JPEG n03691459/
+mv val/ILSVRC2012_val_00035772.JPEG n03781244/
+mv val/ILSVRC2012_val_00035773.JPEG n03721384/
+mv val/ILSVRC2012_val_00035774.JPEG n01484850/
+mv val/ILSVRC2012_val_00035775.JPEG n03201208/
+mv val/ILSVRC2012_val_00035776.JPEG n03710721/
+mv val/ILSVRC2012_val_00035777.JPEG n03384352/
+mv val/ILSVRC2012_val_00035778.JPEG n02410509/
+mv val/ILSVRC2012_val_00035779.JPEG n03787032/
+mv val/ILSVRC2012_val_00035780.JPEG n03970156/
+mv val/ILSVRC2012_val_00035781.JPEG n02105251/
+mv val/ILSVRC2012_val_00035782.JPEG n03958227/
+mv val/ILSVRC2012_val_00035783.JPEG n02690373/
+mv val/ILSVRC2012_val_00035784.JPEG n01729322/
+mv val/ILSVRC2012_val_00035785.JPEG n01518878/
+mv val/ILSVRC2012_val_00035786.JPEG n04254680/
+mv val/ILSVRC2012_val_00035787.JPEG n02988304/
+mv val/ILSVRC2012_val_00035788.JPEG n03670208/
+mv val/ILSVRC2012_val_00035789.JPEG n04033901/
+mv val/ILSVRC2012_val_00035790.JPEG n02018795/
+mv val/ILSVRC2012_val_00035791.JPEG n02749479/
+mv val/ILSVRC2012_val_00035792.JPEG n03447721/
+mv val/ILSVRC2012_val_00035793.JPEG n02093428/
+mv val/ILSVRC2012_val_00035794.JPEG n02099712/
+mv val/ILSVRC2012_val_00035795.JPEG n02094114/
+mv val/ILSVRC2012_val_00035796.JPEG n02814860/
+mv val/ILSVRC2012_val_00035797.JPEG n02167151/
+mv val/ILSVRC2012_val_00035798.JPEG n04525305/
+mv val/ILSVRC2012_val_00035799.JPEG n02483362/
+mv val/ILSVRC2012_val_00035800.JPEG n02105251/
+mv val/ILSVRC2012_val_00035801.JPEG n02817516/
+mv val/ILSVRC2012_val_00035802.JPEG n04125021/
+mv val/ILSVRC2012_val_00035803.JPEG n02979186/
+mv val/ILSVRC2012_val_00035804.JPEG n01829413/
+mv val/ILSVRC2012_val_00035805.JPEG n02097658/
+mv val/ILSVRC2012_val_00035806.JPEG n02909870/
+mv val/ILSVRC2012_val_00035807.JPEG n01558993/
+mv val/ILSVRC2012_val_00035808.JPEG n03216828/
+mv val/ILSVRC2012_val_00035809.JPEG n02280649/
+mv val/ILSVRC2012_val_00035810.JPEG n02051845/
+mv val/ILSVRC2012_val_00035811.JPEG n02115913/
+mv val/ILSVRC2012_val_00035812.JPEG n03938244/
+mv val/ILSVRC2012_val_00035813.JPEG n04522168/
+mv val/ILSVRC2012_val_00035814.JPEG n01632458/
+mv val/ILSVRC2012_val_00035815.JPEG n02106382/
+mv val/ILSVRC2012_val_00035816.JPEG n02939185/
+mv val/ILSVRC2012_val_00035817.JPEG n04111531/
+mv val/ILSVRC2012_val_00035818.JPEG n01693334/
+mv val/ILSVRC2012_val_00035819.JPEG n02268853/
+mv val/ILSVRC2012_val_00035820.JPEG n02109525/
+mv val/ILSVRC2012_val_00035821.JPEG n02125311/
+mv val/ILSVRC2012_val_00035822.JPEG n03617480/
+mv val/ILSVRC2012_val_00035823.JPEG n02437616/
+mv val/ILSVRC2012_val_00035824.JPEG n04146614/
+mv val/ILSVRC2012_val_00035825.JPEG n03832673/
+mv val/ILSVRC2012_val_00035826.JPEG n02870880/
+mv val/ILSVRC2012_val_00035827.JPEG n04554684/
+mv val/ILSVRC2012_val_00035828.JPEG n02071294/
+mv val/ILSVRC2012_val_00035829.JPEG n02971356/
+mv val/ILSVRC2012_val_00035830.JPEG n03775071/
+mv val/ILSVRC2012_val_00035831.JPEG n04326547/
+mv val/ILSVRC2012_val_00035832.JPEG n11879895/
+mv val/ILSVRC2012_val_00035833.JPEG n01531178/
+mv val/ILSVRC2012_val_00035834.JPEG n02667093/
+mv val/ILSVRC2012_val_00035835.JPEG n04317175/
+mv val/ILSVRC2012_val_00035836.JPEG n02027492/
+mv val/ILSVRC2012_val_00035837.JPEG n02002556/
+mv val/ILSVRC2012_val_00035838.JPEG n02206856/
+mv val/ILSVRC2012_val_00035839.JPEG n03527444/
+mv val/ILSVRC2012_val_00035840.JPEG n04557648/
+mv val/ILSVRC2012_val_00035841.JPEG n04467665/
+mv val/ILSVRC2012_val_00035842.JPEG n01742172/
+mv val/ILSVRC2012_val_00035843.JPEG n02100236/
+mv val/ILSVRC2012_val_00035844.JPEG n02096437/
+mv val/ILSVRC2012_val_00035845.JPEG n13054560/
+mv val/ILSVRC2012_val_00035846.JPEG n02389026/
+mv val/ILSVRC2012_val_00035847.JPEG n02098105/
+mv val/ILSVRC2012_val_00035848.JPEG n07871810/
+mv val/ILSVRC2012_val_00035849.JPEG n02488291/
+mv val/ILSVRC2012_val_00035850.JPEG n04251144/
+mv val/ILSVRC2012_val_00035851.JPEG n12057211/
+mv val/ILSVRC2012_val_00035852.JPEG n04483307/
+mv val/ILSVRC2012_val_00035853.JPEG n01917289/
+mv val/ILSVRC2012_val_00035854.JPEG n03637318/
+mv val/ILSVRC2012_val_00035855.JPEG n01950731/
+mv val/ILSVRC2012_val_00035856.JPEG n01955084/
+mv val/ILSVRC2012_val_00035857.JPEG n02869837/
+mv val/ILSVRC2012_val_00035858.JPEG n04037443/
+mv val/ILSVRC2012_val_00035859.JPEG n02099267/
+mv val/ILSVRC2012_val_00035860.JPEG n04254120/
+mv val/ILSVRC2012_val_00035861.JPEG n02493793/
+mv val/ILSVRC2012_val_00035862.JPEG n12144580/
+mv val/ILSVRC2012_val_00035863.JPEG n01968897/
+mv val/ILSVRC2012_val_00035864.JPEG n03770679/
+mv val/ILSVRC2012_val_00035865.JPEG n02910353/
+mv val/ILSVRC2012_val_00035866.JPEG n04146614/
+mv val/ILSVRC2012_val_00035867.JPEG n04154565/
+mv val/ILSVRC2012_val_00035868.JPEG n02128757/
+mv val/ILSVRC2012_val_00035869.JPEG n04380533/
+mv val/ILSVRC2012_val_00035870.JPEG n03530642/
+mv val/ILSVRC2012_val_00035871.JPEG n02640242/
+mv val/ILSVRC2012_val_00035872.JPEG n01530575/
+mv val/ILSVRC2012_val_00035873.JPEG n04325704/
+mv val/ILSVRC2012_val_00035874.JPEG n04562935/
+mv val/ILSVRC2012_val_00035875.JPEG n03838899/
+mv val/ILSVRC2012_val_00035876.JPEG n02692877/
+mv val/ILSVRC2012_val_00035877.JPEG n03692522/
+mv val/ILSVRC2012_val_00035878.JPEG n03916031/
+mv val/ILSVRC2012_val_00035879.JPEG n02486261/
+mv val/ILSVRC2012_val_00035880.JPEG n03724870/
+mv val/ILSVRC2012_val_00035881.JPEG n02099267/
+mv val/ILSVRC2012_val_00035882.JPEG n03207941/
+mv val/ILSVRC2012_val_00035883.JPEG n02128925/
+mv val/ILSVRC2012_val_00035884.JPEG n03461385/
+mv val/ILSVRC2012_val_00035885.JPEG n01950731/
+mv val/ILSVRC2012_val_00035886.JPEG n02492660/
+mv val/ILSVRC2012_val_00035887.JPEG n02102973/
+mv val/ILSVRC2012_val_00035888.JPEG n07749582/
+mv val/ILSVRC2012_val_00035889.JPEG n04310018/
+mv val/ILSVRC2012_val_00035890.JPEG n02110806/
+mv val/ILSVRC2012_val_00035891.JPEG n02105056/
+mv val/ILSVRC2012_val_00035892.JPEG n09428293/
+mv val/ILSVRC2012_val_00035893.JPEG n02087394/
+mv val/ILSVRC2012_val_00035894.JPEG n15075141/
+mv val/ILSVRC2012_val_00035895.JPEG n03141823/
+mv val/ILSVRC2012_val_00035896.JPEG n03709823/
+mv val/ILSVRC2012_val_00035897.JPEG n03930630/
+mv val/ILSVRC2012_val_00035898.JPEG n02280649/
+mv val/ILSVRC2012_val_00035899.JPEG n04069434/
+mv val/ILSVRC2012_val_00035900.JPEG n07718747/
+mv val/ILSVRC2012_val_00035901.JPEG n02480495/
+mv val/ILSVRC2012_val_00035902.JPEG n07754684/
+mv val/ILSVRC2012_val_00035903.JPEG n12985857/
+mv val/ILSVRC2012_val_00035904.JPEG n03602883/
+mv val/ILSVRC2012_val_00035905.JPEG n01665541/
+mv val/ILSVRC2012_val_00035906.JPEG n04465501/
+mv val/ILSVRC2012_val_00035907.JPEG n02788148/
+mv val/ILSVRC2012_val_00035908.JPEG n02114548/
+mv val/ILSVRC2012_val_00035909.JPEG n07753275/
+mv val/ILSVRC2012_val_00035910.JPEG n03788195/
+mv val/ILSVRC2012_val_00035911.JPEG n02814860/
+mv val/ILSVRC2012_val_00035912.JPEG n02090379/
+mv val/ILSVRC2012_val_00035913.JPEG n03425413/
+mv val/ILSVRC2012_val_00035914.JPEG n01751748/
+mv val/ILSVRC2012_val_00035915.JPEG n04311174/
+mv val/ILSVRC2012_val_00035916.JPEG n01796340/
+mv val/ILSVRC2012_val_00035917.JPEG n07613480/
+mv val/ILSVRC2012_val_00035918.JPEG n03445777/
+mv val/ILSVRC2012_val_00035919.JPEG n04404412/
+mv val/ILSVRC2012_val_00035920.JPEG n03124170/
+mv val/ILSVRC2012_val_00035921.JPEG n02364673/
+mv val/ILSVRC2012_val_00035922.JPEG n01829413/
+mv val/ILSVRC2012_val_00035923.JPEG n03134739/
+mv val/ILSVRC2012_val_00035924.JPEG n07730033/
+mv val/ILSVRC2012_val_00035925.JPEG n03379051/
+mv val/ILSVRC2012_val_00035926.JPEG n04485082/
+mv val/ILSVRC2012_val_00035927.JPEG n03250847/
+mv val/ILSVRC2012_val_00035928.JPEG n07730033/
+mv val/ILSVRC2012_val_00035929.JPEG n07714571/
+mv val/ILSVRC2012_val_00035930.JPEG n02790996/
+mv val/ILSVRC2012_val_00035931.JPEG n03160309/
+mv val/ILSVRC2012_val_00035932.JPEG n02268443/
+mv val/ILSVRC2012_val_00035933.JPEG n02093859/
+mv val/ILSVRC2012_val_00035934.JPEG n13052670/
+mv val/ILSVRC2012_val_00035935.JPEG n02086910/
+mv val/ILSVRC2012_val_00035936.JPEG n01632458/
+mv val/ILSVRC2012_val_00035937.JPEG n04259630/
+mv val/ILSVRC2012_val_00035938.JPEG n01806567/
+mv val/ILSVRC2012_val_00035939.JPEG n02094433/
+mv val/ILSVRC2012_val_00035940.JPEG n02093647/
+mv val/ILSVRC2012_val_00035941.JPEG n02111500/
+mv val/ILSVRC2012_val_00035942.JPEG n03876231/
+mv val/ILSVRC2012_val_00035943.JPEG n01883070/
+mv val/ILSVRC2012_val_00035944.JPEG n02098286/
+mv val/ILSVRC2012_val_00035945.JPEG n04483307/
+mv val/ILSVRC2012_val_00035946.JPEG n03344393/
+mv val/ILSVRC2012_val_00035947.JPEG n01592084/
+mv val/ILSVRC2012_val_00035948.JPEG n04579432/
+mv val/ILSVRC2012_val_00035949.JPEG n04152593/
+mv val/ILSVRC2012_val_00035950.JPEG n04579145/
+mv val/ILSVRC2012_val_00035951.JPEG n03998194/
+mv val/ILSVRC2012_val_00035952.JPEG n02093256/
+mv val/ILSVRC2012_val_00035953.JPEG n01616318/
+mv val/ILSVRC2012_val_00035954.JPEG n03085013/
+mv val/ILSVRC2012_val_00035955.JPEG n03527444/
+mv val/ILSVRC2012_val_00035956.JPEG n04116512/
+mv val/ILSVRC2012_val_00035957.JPEG n02514041/
+mv val/ILSVRC2012_val_00035958.JPEG n03627232/
+mv val/ILSVRC2012_val_00035959.JPEG n03376595/
+mv val/ILSVRC2012_val_00035960.JPEG n04443257/
+mv val/ILSVRC2012_val_00035961.JPEG n03095699/
+mv val/ILSVRC2012_val_00035962.JPEG n02403003/
+mv val/ILSVRC2012_val_00035963.JPEG n04589890/
+mv val/ILSVRC2012_val_00035964.JPEG n01910747/
+mv val/ILSVRC2012_val_00035965.JPEG n02978881/
+mv val/ILSVRC2012_val_00035966.JPEG n02727426/
+mv val/ILSVRC2012_val_00035967.JPEG n01985128/
+mv val/ILSVRC2012_val_00035968.JPEG n03482405/
+mv val/ILSVRC2012_val_00035969.JPEG n02132136/
+mv val/ILSVRC2012_val_00035970.JPEG n04277352/
+mv val/ILSVRC2012_val_00035971.JPEG n13133613/
+mv val/ILSVRC2012_val_00035972.JPEG n02033041/
+mv val/ILSVRC2012_val_00035973.JPEG n02100877/
+mv val/ILSVRC2012_val_00035974.JPEG n01806143/
+mv val/ILSVRC2012_val_00035975.JPEG n03733805/
+mv val/ILSVRC2012_val_00035976.JPEG n01748264/
+mv val/ILSVRC2012_val_00035977.JPEG n02483362/
+mv val/ILSVRC2012_val_00035978.JPEG n03776460/
+mv val/ILSVRC2012_val_00035979.JPEG n02105412/
+mv val/ILSVRC2012_val_00035980.JPEG n03887697/
+mv val/ILSVRC2012_val_00035981.JPEG n01773157/
+mv val/ILSVRC2012_val_00035982.JPEG n02056570/
+mv val/ILSVRC2012_val_00035983.JPEG n02808440/
+mv val/ILSVRC2012_val_00035984.JPEG n02007558/
+mv val/ILSVRC2012_val_00035985.JPEG n04146614/
+mv val/ILSVRC2012_val_00035986.JPEG n02097130/
+mv val/ILSVRC2012_val_00035987.JPEG n03888605/
+mv val/ILSVRC2012_val_00035988.JPEG n02412080/
+mv val/ILSVRC2012_val_00035989.JPEG n01806567/
+mv val/ILSVRC2012_val_00035990.JPEG n02457408/
+mv val/ILSVRC2012_val_00035991.JPEG n03935335/
+mv val/ILSVRC2012_val_00035992.JPEG n03775071/
+mv val/ILSVRC2012_val_00035993.JPEG n07697313/
+mv val/ILSVRC2012_val_00035994.JPEG n01774750/
+mv val/ILSVRC2012_val_00035995.JPEG n07873807/
+mv val/ILSVRC2012_val_00035996.JPEG n07749582/
+mv val/ILSVRC2012_val_00035997.JPEG n02091134/
+mv val/ILSVRC2012_val_00035998.JPEG n02871525/
+mv val/ILSVRC2012_val_00035999.JPEG n02117135/
+mv val/ILSVRC2012_val_00036000.JPEG n03657121/
+mv val/ILSVRC2012_val_00036001.JPEG n03661043/
+mv val/ILSVRC2012_val_00036002.JPEG n02088632/
+mv val/ILSVRC2012_val_00036003.JPEG n03776460/
+mv val/ILSVRC2012_val_00036004.JPEG n02120505/
+mv val/ILSVRC2012_val_00036005.JPEG n02165456/
+mv val/ILSVRC2012_val_00036006.JPEG n03089624/
+mv val/ILSVRC2012_val_00036007.JPEG n03485794/
+mv val/ILSVRC2012_val_00036008.JPEG n01534433/
+mv val/ILSVRC2012_val_00036009.JPEG n02835271/
+mv val/ILSVRC2012_val_00036010.JPEG n03240683/
+mv val/ILSVRC2012_val_00036011.JPEG n04251144/
+mv val/ILSVRC2012_val_00036012.JPEG n02086910/
+mv val/ILSVRC2012_val_00036013.JPEG n03447447/
+mv val/ILSVRC2012_val_00036014.JPEG n04200800/
+mv val/ILSVRC2012_val_00036015.JPEG n01582220/
+mv val/ILSVRC2012_val_00036016.JPEG n02655020/
+mv val/ILSVRC2012_val_00036017.JPEG n04458633/
+mv val/ILSVRC2012_val_00036018.JPEG n04371430/
+mv val/ILSVRC2012_val_00036019.JPEG n02097047/
+mv val/ILSVRC2012_val_00036020.JPEG n03970156/
+mv val/ILSVRC2012_val_00036021.JPEG n04418357/
+mv val/ILSVRC2012_val_00036022.JPEG n04243546/
+mv val/ILSVRC2012_val_00036023.JPEG n02098413/
+mv val/ILSVRC2012_val_00036024.JPEG n02992529/
+mv val/ILSVRC2012_val_00036025.JPEG n03384352/
+mv val/ILSVRC2012_val_00036026.JPEG n02640242/
+mv val/ILSVRC2012_val_00036027.JPEG n02894605/
+mv val/ILSVRC2012_val_00036028.JPEG n03920288/
+mv val/ILSVRC2012_val_00036029.JPEG n03250847/
+mv val/ILSVRC2012_val_00036030.JPEG n02607072/
+mv val/ILSVRC2012_val_00036031.JPEG n04326547/
+mv val/ILSVRC2012_val_00036032.JPEG n04485082/
+mv val/ILSVRC2012_val_00036033.JPEG n03868863/
+mv val/ILSVRC2012_val_00036034.JPEG n09472597/
+mv val/ILSVRC2012_val_00036035.JPEG n02027492/
+mv val/ILSVRC2012_val_00036036.JPEG n02692877/
+mv val/ILSVRC2012_val_00036037.JPEG n03388549/
+mv val/ILSVRC2012_val_00036038.JPEG n03874599/
+mv val/ILSVRC2012_val_00036039.JPEG n02096051/
+mv val/ILSVRC2012_val_00036040.JPEG n01847000/
+mv val/ILSVRC2012_val_00036041.JPEG n02328150/
+mv val/ILSVRC2012_val_00036042.JPEG n01534433/
+mv val/ILSVRC2012_val_00036043.JPEG n02910353/
+mv val/ILSVRC2012_val_00036044.JPEG n01829413/
+mv val/ILSVRC2012_val_00036045.JPEG n02107142/
+mv val/ILSVRC2012_val_00036046.JPEG n03977966/
+mv val/ILSVRC2012_val_00036047.JPEG n02090622/
+mv val/ILSVRC2012_val_00036048.JPEG n03444034/
+mv val/ILSVRC2012_val_00036049.JPEG n04418357/
+mv val/ILSVRC2012_val_00036050.JPEG n04254680/
+mv val/ILSVRC2012_val_00036051.JPEG n02692877/
+mv val/ILSVRC2012_val_00036052.JPEG n02002724/
+mv val/ILSVRC2012_val_00036053.JPEG n03535780/
+mv val/ILSVRC2012_val_00036054.JPEG n02108551/
+mv val/ILSVRC2012_val_00036055.JPEG n02112350/
+mv val/ILSVRC2012_val_00036056.JPEG n15075141/
+mv val/ILSVRC2012_val_00036057.JPEG n04141975/
+mv val/ILSVRC2012_val_00036058.JPEG n04507155/
+mv val/ILSVRC2012_val_00036059.JPEG n04509417/
+mv val/ILSVRC2012_val_00036060.JPEG n11939491/
+mv val/ILSVRC2012_val_00036061.JPEG n02112706/
+mv val/ILSVRC2012_val_00036062.JPEG n02110627/
+mv val/ILSVRC2012_val_00036063.JPEG n03125729/
+mv val/ILSVRC2012_val_00036064.JPEG n03680355/
+mv val/ILSVRC2012_val_00036065.JPEG n01644373/
+mv val/ILSVRC2012_val_00036066.JPEG n01644373/
+mv val/ILSVRC2012_val_00036067.JPEG n01756291/
+mv val/ILSVRC2012_val_00036068.JPEG n01753488/
+mv val/ILSVRC2012_val_00036069.JPEG n02098105/
+mv val/ILSVRC2012_val_00036070.JPEG n02342885/
+mv val/ILSVRC2012_val_00036071.JPEG n03759954/
+mv val/ILSVRC2012_val_00036072.JPEG n02110958/
+mv val/ILSVRC2012_val_00036073.JPEG n02797295/
+mv val/ILSVRC2012_val_00036074.JPEG n02006656/
+mv val/ILSVRC2012_val_00036075.JPEG n02111500/
+mv val/ILSVRC2012_val_00036076.JPEG n04033901/
+mv val/ILSVRC2012_val_00036077.JPEG n01784675/
+mv val/ILSVRC2012_val_00036078.JPEG n04277352/
+mv val/ILSVRC2012_val_00036079.JPEG n02489166/
+mv val/ILSVRC2012_val_00036080.JPEG n02481823/
+mv val/ILSVRC2012_val_00036081.JPEG n02398521/
+mv val/ILSVRC2012_val_00036082.JPEG n01739381/
+mv val/ILSVRC2012_val_00036083.JPEG n02823428/
+mv val/ILSVRC2012_val_00036084.JPEG n02939185/
+mv val/ILSVRC2012_val_00036085.JPEG n12985857/
+mv val/ILSVRC2012_val_00036086.JPEG n04275548/
+mv val/ILSVRC2012_val_00036087.JPEG n04127249/
+mv val/ILSVRC2012_val_00036088.JPEG n02087394/
+mv val/ILSVRC2012_val_00036089.JPEG n03920288/
+mv val/ILSVRC2012_val_00036090.JPEG n04482393/
+mv val/ILSVRC2012_val_00036091.JPEG n03100240/
+mv val/ILSVRC2012_val_00036092.JPEG n03000684/
+mv val/ILSVRC2012_val_00036093.JPEG n07248320/
+mv val/ILSVRC2012_val_00036094.JPEG n02454379/
+mv val/ILSVRC2012_val_00036095.JPEG n02361337/
+mv val/ILSVRC2012_val_00036096.JPEG n03218198/
+mv val/ILSVRC2012_val_00036097.JPEG n02106030/
+mv val/ILSVRC2012_val_00036098.JPEG n03544143/
+mv val/ILSVRC2012_val_00036099.JPEG n04456115/
+mv val/ILSVRC2012_val_00036100.JPEG n02165105/
+mv val/ILSVRC2012_val_00036101.JPEG n03188531/
+mv val/ILSVRC2012_val_00036102.JPEG n01641577/
+mv val/ILSVRC2012_val_00036103.JPEG n07742313/
+mv val/ILSVRC2012_val_00036104.JPEG n03761084/
+mv val/ILSVRC2012_val_00036105.JPEG n01518878/
+mv val/ILSVRC2012_val_00036106.JPEG n04376876/
+mv val/ILSVRC2012_val_00036107.JPEG n03782006/
+mv val/ILSVRC2012_val_00036108.JPEG n02422699/
+mv val/ILSVRC2012_val_00036109.JPEG n01773797/
+mv val/ILSVRC2012_val_00036110.JPEG n02106550/
+mv val/ILSVRC2012_val_00036111.JPEG n04590129/
+mv val/ILSVRC2012_val_00036112.JPEG n03902125/
+mv val/ILSVRC2012_val_00036113.JPEG n02823750/
+mv val/ILSVRC2012_val_00036114.JPEG n03393912/
+mv val/ILSVRC2012_val_00036115.JPEG n04090263/
+mv val/ILSVRC2012_val_00036116.JPEG n01737021/
+mv val/ILSVRC2012_val_00036117.JPEG n02129165/
+mv val/ILSVRC2012_val_00036118.JPEG n01498041/
+mv val/ILSVRC2012_val_00036119.JPEG n03792782/
+mv val/ILSVRC2012_val_00036120.JPEG n02966687/
+mv val/ILSVRC2012_val_00036121.JPEG n02504458/
+mv val/ILSVRC2012_val_00036122.JPEG n03838899/
+mv val/ILSVRC2012_val_00036123.JPEG n01689811/
+mv val/ILSVRC2012_val_00036124.JPEG n04347754/
+mv val/ILSVRC2012_val_00036125.JPEG n01608432/
+mv val/ILSVRC2012_val_00036126.JPEG n01817953/
+mv val/ILSVRC2012_val_00036127.JPEG n02536864/
+mv val/ILSVRC2012_val_00036128.JPEG n01729977/
+mv val/ILSVRC2012_val_00036129.JPEG n02096437/
+mv val/ILSVRC2012_val_00036130.JPEG n03924679/
+mv val/ILSVRC2012_val_00036131.JPEG n02096437/
+mv val/ILSVRC2012_val_00036132.JPEG n01798484/
+mv val/ILSVRC2012_val_00036133.JPEG n02869837/
+mv val/ILSVRC2012_val_00036134.JPEG n04336792/
+mv val/ILSVRC2012_val_00036135.JPEG n03485407/
+mv val/ILSVRC2012_val_00036136.JPEG n03868863/
+mv val/ILSVRC2012_val_00036137.JPEG n04376876/
+mv val/ILSVRC2012_val_00036138.JPEG n03602883/
+mv val/ILSVRC2012_val_00036139.JPEG n02128925/
+mv val/ILSVRC2012_val_00036140.JPEG n02102973/
+mv val/ILSVRC2012_val_00036141.JPEG n02447366/
+mv val/ILSVRC2012_val_00036142.JPEG n07716358/
+mv val/ILSVRC2012_val_00036143.JPEG n03857828/
+mv val/ILSVRC2012_val_00036144.JPEG n04517823/
+mv val/ILSVRC2012_val_00036145.JPEG n03837869/
+mv val/ILSVRC2012_val_00036146.JPEG n07749582/
+mv val/ILSVRC2012_val_00036147.JPEG n02105162/
+mv val/ILSVRC2012_val_00036148.JPEG n02281787/
+mv val/ILSVRC2012_val_00036149.JPEG n02769748/
+mv val/ILSVRC2012_val_00036150.JPEG n02085620/
+mv val/ILSVRC2012_val_00036151.JPEG n01751748/
+mv val/ILSVRC2012_val_00036152.JPEG n02093647/
+mv val/ILSVRC2012_val_00036153.JPEG n04423845/
+mv val/ILSVRC2012_val_00036154.JPEG n02488702/
+mv val/ILSVRC2012_val_00036155.JPEG n03485794/
+mv val/ILSVRC2012_val_00036156.JPEG n03908714/
+mv val/ILSVRC2012_val_00036157.JPEG n01498041/
+mv val/ILSVRC2012_val_00036158.JPEG n02231487/
+mv val/ILSVRC2012_val_00036159.JPEG n02108551/
+mv val/ILSVRC2012_val_00036160.JPEG n03179701/
+mv val/ILSVRC2012_val_00036161.JPEG n02786058/
+mv val/ILSVRC2012_val_00036162.JPEG n01855032/
+mv val/ILSVRC2012_val_00036163.JPEG n04147183/
+mv val/ILSVRC2012_val_00036164.JPEG n04254680/
+mv val/ILSVRC2012_val_00036165.JPEG n04557648/
+mv val/ILSVRC2012_val_00036166.JPEG n01728572/
+mv val/ILSVRC2012_val_00036167.JPEG n04325704/
+mv val/ILSVRC2012_val_00036168.JPEG n07860988/
+mv val/ILSVRC2012_val_00036169.JPEG n01847000/
+mv val/ILSVRC2012_val_00036170.JPEG n13044778/
+mv val/ILSVRC2012_val_00036171.JPEG n03445777/
+mv val/ILSVRC2012_val_00036172.JPEG n03447447/
+mv val/ILSVRC2012_val_00036173.JPEG n02169497/
+mv val/ILSVRC2012_val_00036174.JPEG n03290653/
+mv val/ILSVRC2012_val_00036175.JPEG n03376595/
+mv val/ILSVRC2012_val_00036176.JPEG n02094114/
+mv val/ILSVRC2012_val_00036177.JPEG n03854065/
+mv val/ILSVRC2012_val_00036178.JPEG n02422699/
+mv val/ILSVRC2012_val_00036179.JPEG n01796340/
+mv val/ILSVRC2012_val_00036180.JPEG n03459775/
+mv val/ILSVRC2012_val_00036181.JPEG n02091244/
+mv val/ILSVRC2012_val_00036182.JPEG n04399382/
+mv val/ILSVRC2012_val_00036183.JPEG n03476684/
+mv val/ILSVRC2012_val_00036184.JPEG n02951585/
+mv val/ILSVRC2012_val_00036185.JPEG n03207941/
+mv val/ILSVRC2012_val_00036186.JPEG n02174001/
+mv val/ILSVRC2012_val_00036187.JPEG n03445777/
+mv val/ILSVRC2012_val_00036188.JPEG n01950731/
+mv val/ILSVRC2012_val_00036189.JPEG n04562935/
+mv val/ILSVRC2012_val_00036190.JPEG n01728572/
+mv val/ILSVRC2012_val_00036191.JPEG n02089973/
+mv val/ILSVRC2012_val_00036192.JPEG n01945685/
+mv val/ILSVRC2012_val_00036193.JPEG n02791270/
+mv val/ILSVRC2012_val_00036194.JPEG n04090263/
+mv val/ILSVRC2012_val_00036195.JPEG n01665541/
+mv val/ILSVRC2012_val_00036196.JPEG n02264363/
+mv val/ILSVRC2012_val_00036197.JPEG n04228054/
+mv val/ILSVRC2012_val_00036198.JPEG n03345487/
+mv val/ILSVRC2012_val_00036199.JPEG n03947888/
+mv val/ILSVRC2012_val_00036200.JPEG n01944390/
+mv val/ILSVRC2012_val_00036201.JPEG n04153751/
+mv val/ILSVRC2012_val_00036202.JPEG n01664065/
+mv val/ILSVRC2012_val_00036203.JPEG n03223299/
+mv val/ILSVRC2012_val_00036204.JPEG n02930766/
+mv val/ILSVRC2012_val_00036205.JPEG n04404412/
+mv val/ILSVRC2012_val_00036206.JPEG n03992509/
+mv val/ILSVRC2012_val_00036207.JPEG n01877812/
+mv val/ILSVRC2012_val_00036208.JPEG n02977058/
+mv val/ILSVRC2012_val_00036209.JPEG n09835506/
+mv val/ILSVRC2012_val_00036210.JPEG n12267677/
+mv val/ILSVRC2012_val_00036211.JPEG n03127747/
+mv val/ILSVRC2012_val_00036212.JPEG n01980166/
+mv val/ILSVRC2012_val_00036213.JPEG n09835506/
+mv val/ILSVRC2012_val_00036214.JPEG n07753113/
+mv val/ILSVRC2012_val_00036215.JPEG n02860847/
+mv val/ILSVRC2012_val_00036216.JPEG n02840245/
+mv val/ILSVRC2012_val_00036217.JPEG n01748264/
+mv val/ILSVRC2012_val_00036218.JPEG n03891251/
+mv val/ILSVRC2012_val_00036219.JPEG n02484975/
+mv val/ILSVRC2012_val_00036220.JPEG n02095314/
+mv val/ILSVRC2012_val_00036221.JPEG n03063689/
+mv val/ILSVRC2012_val_00036222.JPEG n04372370/
+mv val/ILSVRC2012_val_00036223.JPEG n11879895/
+mv val/ILSVRC2012_val_00036224.JPEG n02447366/
+mv val/ILSVRC2012_val_00036225.JPEG n01795545/
+mv val/ILSVRC2012_val_00036226.JPEG n03201208/
+mv val/ILSVRC2012_val_00036227.JPEG n01797886/
+mv val/ILSVRC2012_val_00036228.JPEG n04548362/
+mv val/ILSVRC2012_val_00036229.JPEG n03028079/
+mv val/ILSVRC2012_val_00036230.JPEG n03201208/
+mv val/ILSVRC2012_val_00036231.JPEG n02109047/
+mv val/ILSVRC2012_val_00036232.JPEG n03804744/
+mv val/ILSVRC2012_val_00036233.JPEG n03417042/
+mv val/ILSVRC2012_val_00036234.JPEG n02111500/
+mv val/ILSVRC2012_val_00036235.JPEG n02109047/
+mv val/ILSVRC2012_val_00036236.JPEG n02415577/
+mv val/ILSVRC2012_val_00036237.JPEG n04456115/
+mv val/ILSVRC2012_val_00036238.JPEG n02486410/
+mv val/ILSVRC2012_val_00036239.JPEG n03976657/
+mv val/ILSVRC2012_val_00036240.JPEG n02109525/
+mv val/ILSVRC2012_val_00036241.JPEG n03602883/
+mv val/ILSVRC2012_val_00036242.JPEG n03937543/
+mv val/ILSVRC2012_val_00036243.JPEG n02492660/
+mv val/ILSVRC2012_val_00036244.JPEG n02127052/
+mv val/ILSVRC2012_val_00036245.JPEG n02641379/
+mv val/ILSVRC2012_val_00036246.JPEG n03146219/
+mv val/ILSVRC2012_val_00036247.JPEG n02091635/
+mv val/ILSVRC2012_val_00036248.JPEG n02110185/
+mv val/ILSVRC2012_val_00036249.JPEG n04389033/
+mv val/ILSVRC2012_val_00036250.JPEG n04330267/
+mv val/ILSVRC2012_val_00036251.JPEG n02165456/
+mv val/ILSVRC2012_val_00036252.JPEG n04152593/
+mv val/ILSVRC2012_val_00036253.JPEG n04548362/
+mv val/ILSVRC2012_val_00036254.JPEG n02094433/
+mv val/ILSVRC2012_val_00036255.JPEG n04372370/
+mv val/ILSVRC2012_val_00036256.JPEG n03208938/
+mv val/ILSVRC2012_val_00036257.JPEG n02356798/
+mv val/ILSVRC2012_val_00036258.JPEG n02666196/
+mv val/ILSVRC2012_val_00036259.JPEG n02279972/
+mv val/ILSVRC2012_val_00036260.JPEG n03661043/
+mv val/ILSVRC2012_val_00036261.JPEG n03187595/
+mv val/ILSVRC2012_val_00036262.JPEG n03131574/
+mv val/ILSVRC2012_val_00036263.JPEG n07742313/
+mv val/ILSVRC2012_val_00036264.JPEG n02104029/
+mv val/ILSVRC2012_val_00036265.JPEG n02172182/
+mv val/ILSVRC2012_val_00036266.JPEG n02090622/
+mv val/ILSVRC2012_val_00036267.JPEG n02085782/
+mv val/ILSVRC2012_val_00036268.JPEG n02123159/
+mv val/ILSVRC2012_val_00036269.JPEG n02105855/
+mv val/ILSVRC2012_val_00036270.JPEG n02422106/
+mv val/ILSVRC2012_val_00036271.JPEG n01667114/
+mv val/ILSVRC2012_val_00036272.JPEG n01943899/
+mv val/ILSVRC2012_val_00036273.JPEG n03692522/
+mv val/ILSVRC2012_val_00036274.JPEG n03788195/
+mv val/ILSVRC2012_val_00036275.JPEG n07718472/
+mv val/ILSVRC2012_val_00036276.JPEG n03146219/
+mv val/ILSVRC2012_val_00036277.JPEG n04553703/
+mv val/ILSVRC2012_val_00036278.JPEG n09472597/
+mv val/ILSVRC2012_val_00036279.JPEG n04447861/
+mv val/ILSVRC2012_val_00036280.JPEG n02790996/
+mv val/ILSVRC2012_val_00036281.JPEG n03673027/
+mv val/ILSVRC2012_val_00036282.JPEG n02102040/
+mv val/ILSVRC2012_val_00036283.JPEG n07565083/
+mv val/ILSVRC2012_val_00036284.JPEG n01532829/
+mv val/ILSVRC2012_val_00036285.JPEG n02276258/
+mv val/ILSVRC2012_val_00036286.JPEG n04141327/
+mv val/ILSVRC2012_val_00036287.JPEG n01817953/
+mv val/ILSVRC2012_val_00036288.JPEG n04118538/
+mv val/ILSVRC2012_val_00036289.JPEG n01990800/
+mv val/ILSVRC2012_val_00036290.JPEG n02123597/
+mv val/ILSVRC2012_val_00036291.JPEG n01751748/
+mv val/ILSVRC2012_val_00036292.JPEG n02025239/
+mv val/ILSVRC2012_val_00036293.JPEG n01644373/
+mv val/ILSVRC2012_val_00036294.JPEG n03355925/
+mv val/ILSVRC2012_val_00036295.JPEG n02177972/
+mv val/ILSVRC2012_val_00036296.JPEG n04286575/
+mv val/ILSVRC2012_val_00036297.JPEG n04009552/
+mv val/ILSVRC2012_val_00036298.JPEG n03899768/
+mv val/ILSVRC2012_val_00036299.JPEG n03857828/
+mv val/ILSVRC2012_val_00036300.JPEG n04613696/
+mv val/ILSVRC2012_val_00036301.JPEG n02120079/
+mv val/ILSVRC2012_val_00036302.JPEG n02007558/
+mv val/ILSVRC2012_val_00036303.JPEG n04311174/
+mv val/ILSVRC2012_val_00036304.JPEG n03594945/
+mv val/ILSVRC2012_val_00036305.JPEG n04355338/
+mv val/ILSVRC2012_val_00036306.JPEG n03325584/
+mv val/ILSVRC2012_val_00036307.JPEG n07590611/
+mv val/ILSVRC2012_val_00036308.JPEG n07831146/
+mv val/ILSVRC2012_val_00036309.JPEG n03899768/
+mv val/ILSVRC2012_val_00036310.JPEG n02165105/
+mv val/ILSVRC2012_val_00036311.JPEG n06359193/
+mv val/ILSVRC2012_val_00036312.JPEG n06874185/
+mv val/ILSVRC2012_val_00036313.JPEG n03657121/
+mv val/ILSVRC2012_val_00036314.JPEG n02056570/
+mv val/ILSVRC2012_val_00036315.JPEG n09428293/
+mv val/ILSVRC2012_val_00036316.JPEG n04597913/
+mv val/ILSVRC2012_val_00036317.JPEG n02114855/
+mv val/ILSVRC2012_val_00036318.JPEG n04548280/
+mv val/ILSVRC2012_val_00036319.JPEG n03065424/
+mv val/ILSVRC2012_val_00036320.JPEG n01986214/
+mv val/ILSVRC2012_val_00036321.JPEG n03623198/
+mv val/ILSVRC2012_val_00036322.JPEG n04485082/
+mv val/ILSVRC2012_val_00036323.JPEG n03888605/
+mv val/ILSVRC2012_val_00036324.JPEG n02114855/
+mv val/ILSVRC2012_val_00036325.JPEG n02917067/
+mv val/ILSVRC2012_val_00036326.JPEG n04067472/
+mv val/ILSVRC2012_val_00036327.JPEG n03457902/
+mv val/ILSVRC2012_val_00036328.JPEG n03775071/
+mv val/ILSVRC2012_val_00036329.JPEG n07579787/
+mv val/ILSVRC2012_val_00036330.JPEG n02509815/
+mv val/ILSVRC2012_val_00036331.JPEG n04458633/
+mv val/ILSVRC2012_val_00036332.JPEG n03347037/
+mv val/ILSVRC2012_val_00036333.JPEG n02098105/
+mv val/ILSVRC2012_val_00036334.JPEG n12985857/
+mv val/ILSVRC2012_val_00036335.JPEG n03691459/
+mv val/ILSVRC2012_val_00036336.JPEG n04525305/
+mv val/ILSVRC2012_val_00036337.JPEG n01817953/
+mv val/ILSVRC2012_val_00036338.JPEG n03393912/
+mv val/ILSVRC2012_val_00036339.JPEG n04251144/
+mv val/ILSVRC2012_val_00036340.JPEG n02088364/
+mv val/ILSVRC2012_val_00036341.JPEG n02526121/
+mv val/ILSVRC2012_val_00036342.JPEG n02444819/
+mv val/ILSVRC2012_val_00036343.JPEG n02088238/
+mv val/ILSVRC2012_val_00036344.JPEG n02051845/
+mv val/ILSVRC2012_val_00036345.JPEG n01667114/
+mv val/ILSVRC2012_val_00036346.JPEG n04487394/
+mv val/ILSVRC2012_val_00036347.JPEG n04125021/
+mv val/ILSVRC2012_val_00036348.JPEG n02883205/
+mv val/ILSVRC2012_val_00036349.JPEG n04162706/
+mv val/ILSVRC2012_val_00036350.JPEG n02085936/
+mv val/ILSVRC2012_val_00036351.JPEG n02807133/
+mv val/ILSVRC2012_val_00036352.JPEG n02978881/
+mv val/ILSVRC2012_val_00036353.JPEG n04350905/
+mv val/ILSVRC2012_val_00036354.JPEG n01843383/
+mv val/ILSVRC2012_val_00036355.JPEG n02906734/
+mv val/ILSVRC2012_val_00036356.JPEG n01608432/
+mv val/ILSVRC2012_val_00036357.JPEG n02950826/
+mv val/ILSVRC2012_val_00036358.JPEG n04131690/
+mv val/ILSVRC2012_val_00036359.JPEG n02823428/
+mv val/ILSVRC2012_val_00036360.JPEG n02106030/
+mv val/ILSVRC2012_val_00036361.JPEG n01818515/
+mv val/ILSVRC2012_val_00036362.JPEG n03840681/
+mv val/ILSVRC2012_val_00036363.JPEG n03443371/
+mv val/ILSVRC2012_val_00036364.JPEG n03447447/
+mv val/ILSVRC2012_val_00036365.JPEG n02492660/
+mv val/ILSVRC2012_val_00036366.JPEG n11879895/
+mv val/ILSVRC2012_val_00036367.JPEG n02981792/
+mv val/ILSVRC2012_val_00036368.JPEG n01514668/
+mv val/ILSVRC2012_val_00036369.JPEG n02701002/
+mv val/ILSVRC2012_val_00036370.JPEG n04192698/
+mv val/ILSVRC2012_val_00036371.JPEG n02106030/
+mv val/ILSVRC2012_val_00036372.JPEG n07717410/
+mv val/ILSVRC2012_val_00036373.JPEG n03492542/
+mv val/ILSVRC2012_val_00036374.JPEG n06794110/
+mv val/ILSVRC2012_val_00036375.JPEG n03977966/
+mv val/ILSVRC2012_val_00036376.JPEG n04008634/
+mv val/ILSVRC2012_val_00036377.JPEG n07768694/
+mv val/ILSVRC2012_val_00036378.JPEG n04515003/
+mv val/ILSVRC2012_val_00036379.JPEG n02111889/
+mv val/ILSVRC2012_val_00036380.JPEG n02363005/
+mv val/ILSVRC2012_val_00036381.JPEG n01930112/
+mv val/ILSVRC2012_val_00036382.JPEG n04447861/
+mv val/ILSVRC2012_val_00036383.JPEG n07684084/
+mv val/ILSVRC2012_val_00036384.JPEG n01883070/
+mv val/ILSVRC2012_val_00036385.JPEG n03250847/
+mv val/ILSVRC2012_val_00036386.JPEG n02825657/
+mv val/ILSVRC2012_val_00036387.JPEG n03793489/
+mv val/ILSVRC2012_val_00036388.JPEG n01616318/
+mv val/ILSVRC2012_val_00036389.JPEG n02110341/
+mv val/ILSVRC2012_val_00036390.JPEG n06596364/
+mv val/ILSVRC2012_val_00036391.JPEG n04456115/
+mv val/ILSVRC2012_val_00036392.JPEG n01749939/
+mv val/ILSVRC2012_val_00036393.JPEG n03180011/
+mv val/ILSVRC2012_val_00036394.JPEG n02690373/
+mv val/ILSVRC2012_val_00036395.JPEG n02088094/
+mv val/ILSVRC2012_val_00036396.JPEG n01984695/
+mv val/ILSVRC2012_val_00036397.JPEG n02493793/
+mv val/ILSVRC2012_val_00036398.JPEG n09428293/
+mv val/ILSVRC2012_val_00036399.JPEG n03888605/
+mv val/ILSVRC2012_val_00036400.JPEG n09229709/
+mv val/ILSVRC2012_val_00036401.JPEG n02128757/
+mv val/ILSVRC2012_val_00036402.JPEG n04239074/
+mv val/ILSVRC2012_val_00036403.JPEG n04040759/
+mv val/ILSVRC2012_val_00036404.JPEG n03062245/
+mv val/ILSVRC2012_val_00036405.JPEG n02168699/
+mv val/ILSVRC2012_val_00036406.JPEG n02977058/
+mv val/ILSVRC2012_val_00036407.JPEG n01773157/
+mv val/ILSVRC2012_val_00036408.JPEG n02101388/
+mv val/ILSVRC2012_val_00036409.JPEG n03459775/
+mv val/ILSVRC2012_val_00036410.JPEG n04532106/
+mv val/ILSVRC2012_val_00036411.JPEG n04026417/
+mv val/ILSVRC2012_val_00036412.JPEG n02870880/
+mv val/ILSVRC2012_val_00036413.JPEG n04179913/
+mv val/ILSVRC2012_val_00036414.JPEG n02115913/
+mv val/ILSVRC2012_val_00036415.JPEG n04525038/
+mv val/ILSVRC2012_val_00036416.JPEG n11939491/
+mv val/ILSVRC2012_val_00036417.JPEG n02165105/
+mv val/ILSVRC2012_val_00036418.JPEG n04258138/
+mv val/ILSVRC2012_val_00036419.JPEG n09472597/
+mv val/ILSVRC2012_val_00036420.JPEG n01491361/
+mv val/ILSVRC2012_val_00036421.JPEG n03706229/
+mv val/ILSVRC2012_val_00036422.JPEG n03937543/
+mv val/ILSVRC2012_val_00036423.JPEG n01855672/
+mv val/ILSVRC2012_val_00036424.JPEG n03673027/
+mv val/ILSVRC2012_val_00036425.JPEG n02443484/
+mv val/ILSVRC2012_val_00036426.JPEG n03706229/
+mv val/ILSVRC2012_val_00036427.JPEG n04149813/
+mv val/ILSVRC2012_val_00036428.JPEG n03599486/
+mv val/ILSVRC2012_val_00036429.JPEG n03272562/
+mv val/ILSVRC2012_val_00036430.JPEG n01704323/
+mv val/ILSVRC2012_val_00036431.JPEG n01537544/
+mv val/ILSVRC2012_val_00036432.JPEG n03424325/
+mv val/ILSVRC2012_val_00036433.JPEG n02085782/
+mv val/ILSVRC2012_val_00036434.JPEG n02190166/
+mv val/ILSVRC2012_val_00036435.JPEG n04592741/
+mv val/ILSVRC2012_val_00036436.JPEG n02504458/
+mv val/ILSVRC2012_val_00036437.JPEG n04086273/
+mv val/ILSVRC2012_val_00036438.JPEG n07754684/
+mv val/ILSVRC2012_val_00036439.JPEG n02443484/
+mv val/ILSVRC2012_val_00036440.JPEG n02086910/
+mv val/ILSVRC2012_val_00036441.JPEG n01756291/
+mv val/ILSVRC2012_val_00036442.JPEG n01873310/
+mv val/ILSVRC2012_val_00036443.JPEG n02096437/
+mv val/ILSVRC2012_val_00036444.JPEG n02870880/
+mv val/ILSVRC2012_val_00036445.JPEG n02106166/
+mv val/ILSVRC2012_val_00036446.JPEG n07613480/
+mv val/ILSVRC2012_val_00036447.JPEG n03018349/
+mv val/ILSVRC2012_val_00036448.JPEG n03447721/
+mv val/ILSVRC2012_val_00036449.JPEG n04335435/
+mv val/ILSVRC2012_val_00036450.JPEG n02114855/
+mv val/ILSVRC2012_val_00036451.JPEG n07760859/
+mv val/ILSVRC2012_val_00036452.JPEG n03825788/
+mv val/ILSVRC2012_val_00036453.JPEG n02107142/
+mv val/ILSVRC2012_val_00036454.JPEG n02095570/
+mv val/ILSVRC2012_val_00036455.JPEG n01697457/
+mv val/ILSVRC2012_val_00036456.JPEG n03837869/
+mv val/ILSVRC2012_val_00036457.JPEG n02018795/
+mv val/ILSVRC2012_val_00036458.JPEG n02113624/
+mv val/ILSVRC2012_val_00036459.JPEG n03781244/
+mv val/ILSVRC2012_val_00036460.JPEG n03942813/
+mv val/ILSVRC2012_val_00036461.JPEG n02445715/
+mv val/ILSVRC2012_val_00036462.JPEG n02111129/
+mv val/ILSVRC2012_val_00036463.JPEG n04372370/
+mv val/ILSVRC2012_val_00036464.JPEG n02115641/
+mv val/ILSVRC2012_val_00036465.JPEG n07802026/
+mv val/ILSVRC2012_val_00036466.JPEG n02137549/
+mv val/ILSVRC2012_val_00036467.JPEG n02099429/
+mv val/ILSVRC2012_val_00036468.JPEG n03998194/
+mv val/ILSVRC2012_val_00036469.JPEG n04162706/
+mv val/ILSVRC2012_val_00036470.JPEG n03208938/
+mv val/ILSVRC2012_val_00036471.JPEG n02486410/
+mv val/ILSVRC2012_val_00036472.JPEG n02536864/
+mv val/ILSVRC2012_val_00036473.JPEG n02437616/
+mv val/ILSVRC2012_val_00036474.JPEG n02128757/
+mv val/ILSVRC2012_val_00036475.JPEG n04604644/
+mv val/ILSVRC2012_val_00036476.JPEG n03016953/
+mv val/ILSVRC2012_val_00036477.JPEG n04404412/
+mv val/ILSVRC2012_val_00036478.JPEG n02096585/
+mv val/ILSVRC2012_val_00036479.JPEG n01494475/
+mv val/ILSVRC2012_val_00036480.JPEG n03657121/
+mv val/ILSVRC2012_val_00036481.JPEG n04259630/
+mv val/ILSVRC2012_val_00036482.JPEG n04423845/
+mv val/ILSVRC2012_val_00036483.JPEG n03388549/
+mv val/ILSVRC2012_val_00036484.JPEG n02640242/
+mv val/ILSVRC2012_val_00036485.JPEG n02988304/
+mv val/ILSVRC2012_val_00036486.JPEG n02165456/
+mv val/ILSVRC2012_val_00036487.JPEG n03924679/
+mv val/ILSVRC2012_val_00036488.JPEG n04086273/
+mv val/ILSVRC2012_val_00036489.JPEG n02492660/
+mv val/ILSVRC2012_val_00036490.JPEG n02113624/
+mv val/ILSVRC2012_val_00036491.JPEG n02093859/
+mv val/ILSVRC2012_val_00036492.JPEG n02089867/
+mv val/ILSVRC2012_val_00036493.JPEG n04192698/
+mv val/ILSVRC2012_val_00036494.JPEG n01944390/
+mv val/ILSVRC2012_val_00036495.JPEG n01632777/
+mv val/ILSVRC2012_val_00036496.JPEG n02966687/
+mv val/ILSVRC2012_val_00036497.JPEG n02107908/
+mv val/ILSVRC2012_val_00036498.JPEG n02098286/
+mv val/ILSVRC2012_val_00036499.JPEG n07831146/
+mv val/ILSVRC2012_val_00036500.JPEG n02007558/
+mv val/ILSVRC2012_val_00036501.JPEG n04536866/
+mv val/ILSVRC2012_val_00036502.JPEG n02808304/
+mv val/ILSVRC2012_val_00036503.JPEG n07718472/
+mv val/ILSVRC2012_val_00036504.JPEG n03930630/
+mv val/ILSVRC2012_val_00036505.JPEG n07754684/
+mv val/ILSVRC2012_val_00036506.JPEG n01774750/
+mv val/ILSVRC2012_val_00036507.JPEG n03980874/
+mv val/ILSVRC2012_val_00036508.JPEG n03384352/
+mv val/ILSVRC2012_val_00036509.JPEG n02104029/
+mv val/ILSVRC2012_val_00036510.JPEG n02769748/
+mv val/ILSVRC2012_val_00036511.JPEG n02058221/
+mv val/ILSVRC2012_val_00036512.JPEG n01695060/
+mv val/ILSVRC2012_val_00036513.JPEG n03929660/
+mv val/ILSVRC2012_val_00036514.JPEG n13040303/
+mv val/ILSVRC2012_val_00036515.JPEG n03089624/
+mv val/ILSVRC2012_val_00036516.JPEG n04443257/
+mv val/ILSVRC2012_val_00036517.JPEG n04428191/
+mv val/ILSVRC2012_val_00036518.JPEG n03775546/
+mv val/ILSVRC2012_val_00036519.JPEG n04517823/
+mv val/ILSVRC2012_val_00036520.JPEG n01945685/
+mv val/ILSVRC2012_val_00036521.JPEG n03216828/
+mv val/ILSVRC2012_val_00036522.JPEG n02965783/
+mv val/ILSVRC2012_val_00036523.JPEG n02088466/
+mv val/ILSVRC2012_val_00036524.JPEG n04133789/
+mv val/ILSVRC2012_val_00036525.JPEG n03838899/
+mv val/ILSVRC2012_val_00036526.JPEG n02123597/
+mv val/ILSVRC2012_val_00036527.JPEG n02128385/
+mv val/ILSVRC2012_val_00036528.JPEG n02486410/
+mv val/ILSVRC2012_val_00036529.JPEG n03124170/
+mv val/ILSVRC2012_val_00036530.JPEG n03530642/
+mv val/ILSVRC2012_val_00036531.JPEG n02500267/
+mv val/ILSVRC2012_val_00036532.JPEG n12768682/
+mv val/ILSVRC2012_val_00036533.JPEG n02128385/
+mv val/ILSVRC2012_val_00036534.JPEG n01592084/
+mv val/ILSVRC2012_val_00036535.JPEG n02526121/
+mv val/ILSVRC2012_val_00036536.JPEG n04356056/
+mv val/ILSVRC2012_val_00036537.JPEG n02137549/
+mv val/ILSVRC2012_val_00036538.JPEG n03854065/
+mv val/ILSVRC2012_val_00036539.JPEG n07684084/
+mv val/ILSVRC2012_val_00036540.JPEG n01855032/
+mv val/ILSVRC2012_val_00036541.JPEG n02992211/
+mv val/ILSVRC2012_val_00036542.JPEG n02484975/
+mv val/ILSVRC2012_val_00036543.JPEG n02106030/
+mv val/ILSVRC2012_val_00036544.JPEG n09421951/
+mv val/ILSVRC2012_val_00036545.JPEG n04367480/
+mv val/ILSVRC2012_val_00036546.JPEG n09256479/
+mv val/ILSVRC2012_val_00036547.JPEG n02119022/
+mv val/ILSVRC2012_val_00036548.JPEG n02493509/
+mv val/ILSVRC2012_val_00036549.JPEG n03803284/
+mv val/ILSVRC2012_val_00036550.JPEG n01685808/
+mv val/ILSVRC2012_val_00036551.JPEG n07697537/
+mv val/ILSVRC2012_val_00036552.JPEG n01807496/
+mv val/ILSVRC2012_val_00036553.JPEG n03733281/
+mv val/ILSVRC2012_val_00036554.JPEG n03417042/
+mv val/ILSVRC2012_val_00036555.JPEG n02219486/
+mv val/ILSVRC2012_val_00036556.JPEG n09229709/
+mv val/ILSVRC2012_val_00036557.JPEG n02526121/
+mv val/ILSVRC2012_val_00036558.JPEG n03908714/
+mv val/ILSVRC2012_val_00036559.JPEG n04204347/
+mv val/ILSVRC2012_val_00036560.JPEG n03527444/
+mv val/ILSVRC2012_val_00036561.JPEG n01740131/
+mv val/ILSVRC2012_val_00036562.JPEG n02492035/
+mv val/ILSVRC2012_val_00036563.JPEG n02094258/
+mv val/ILSVRC2012_val_00036564.JPEG n03769881/
+mv val/ILSVRC2012_val_00036565.JPEG n03026506/
+mv val/ILSVRC2012_val_00036566.JPEG n02804414/
+mv val/ILSVRC2012_val_00036567.JPEG n02489166/
+mv val/ILSVRC2012_val_00036568.JPEG n02883205/
+mv val/ILSVRC2012_val_00036569.JPEG n03482405/
+mv val/ILSVRC2012_val_00036570.JPEG n04366367/
+mv val/ILSVRC2012_val_00036571.JPEG n03868863/
+mv val/ILSVRC2012_val_00036572.JPEG n03891332/
+mv val/ILSVRC2012_val_00036573.JPEG n01797886/
+mv val/ILSVRC2012_val_00036574.JPEG n03447447/
+mv val/ILSVRC2012_val_00036575.JPEG n04399382/
+mv val/ILSVRC2012_val_00036576.JPEG n04146614/
+mv val/ILSVRC2012_val_00036577.JPEG n02423022/
+mv val/ILSVRC2012_val_00036578.JPEG n02268443/
+mv val/ILSVRC2012_val_00036579.JPEG n03250847/
+mv val/ILSVRC2012_val_00036580.JPEG n07753592/
+mv val/ILSVRC2012_val_00036581.JPEG n01984695/
+mv val/ILSVRC2012_val_00036582.JPEG n03709823/
+mv val/ILSVRC2012_val_00036583.JPEG n03884397/
+mv val/ILSVRC2012_val_00036584.JPEG n03630383/
+mv val/ILSVRC2012_val_00036585.JPEG n03814639/
+mv val/ILSVRC2012_val_00036586.JPEG n02834397/
+mv val/ILSVRC2012_val_00036587.JPEG n01737021/
+mv val/ILSVRC2012_val_00036588.JPEG n03786901/
+mv val/ILSVRC2012_val_00036589.JPEG n01775062/
+mv val/ILSVRC2012_val_00036590.JPEG n01883070/
+mv val/ILSVRC2012_val_00036591.JPEG n09428293/
+mv val/ILSVRC2012_val_00036592.JPEG n03977966/
+mv val/ILSVRC2012_val_00036593.JPEG n07754684/
+mv val/ILSVRC2012_val_00036594.JPEG n03384352/
+mv val/ILSVRC2012_val_00036595.JPEG n02794156/
+mv val/ILSVRC2012_val_00036596.JPEG n13054560/
+mv val/ILSVRC2012_val_00036597.JPEG n02132136/
+mv val/ILSVRC2012_val_00036598.JPEG n02769748/
+mv val/ILSVRC2012_val_00036599.JPEG n07718747/
+mv val/ILSVRC2012_val_00036600.JPEG n02950826/
+mv val/ILSVRC2012_val_00036601.JPEG n01930112/
+mv val/ILSVRC2012_val_00036602.JPEG n02086240/
+mv val/ILSVRC2012_val_00036603.JPEG n02125311/
+mv val/ILSVRC2012_val_00036604.JPEG n03947888/
+mv val/ILSVRC2012_val_00036605.JPEG n02840245/
+mv val/ILSVRC2012_val_00036606.JPEG n03220513/
+mv val/ILSVRC2012_val_00036607.JPEG n03720891/
+mv val/ILSVRC2012_val_00036608.JPEG n02791270/
+mv val/ILSVRC2012_val_00036609.JPEG n02802426/
+mv val/ILSVRC2012_val_00036610.JPEG n03866082/
+mv val/ILSVRC2012_val_00036611.JPEG n03825788/
+mv val/ILSVRC2012_val_00036612.JPEG n02487347/
+mv val/ILSVRC2012_val_00036613.JPEG n02169497/
+mv val/ILSVRC2012_val_00036614.JPEG n02860847/
+mv val/ILSVRC2012_val_00036615.JPEG n01728920/
+mv val/ILSVRC2012_val_00036616.JPEG n03535780/
+mv val/ILSVRC2012_val_00036617.JPEG n03710193/
+mv val/ILSVRC2012_val_00036618.JPEG n02091467/
+mv val/ILSVRC2012_val_00036619.JPEG n04243546/
+mv val/ILSVRC2012_val_00036620.JPEG n01616318/
+mv val/ILSVRC2012_val_00036621.JPEG n03942813/
+mv val/ILSVRC2012_val_00036622.JPEG n02128757/
+mv val/ILSVRC2012_val_00036623.JPEG n04049303/
+mv val/ILSVRC2012_val_00036624.JPEG n04417672/
+mv val/ILSVRC2012_val_00036625.JPEG n02127052/
+mv val/ILSVRC2012_val_00036626.JPEG n03838899/
+mv val/ILSVRC2012_val_00036627.JPEG n03729826/
+mv val/ILSVRC2012_val_00036628.JPEG n02909870/
+mv val/ILSVRC2012_val_00036629.JPEG n09421951/
+mv val/ILSVRC2012_val_00036630.JPEG n04515003/
+mv val/ILSVRC2012_val_00036631.JPEG n02165105/
+mv val/ILSVRC2012_val_00036632.JPEG n03146219/
+mv val/ILSVRC2012_val_00036633.JPEG n04423845/
+mv val/ILSVRC2012_val_00036634.JPEG n03602883/
+mv val/ILSVRC2012_val_00036635.JPEG n01930112/
+mv val/ILSVRC2012_val_00036636.JPEG n04208210/
+mv val/ILSVRC2012_val_00036637.JPEG n03887697/
+mv val/ILSVRC2012_val_00036638.JPEG n03761084/
+mv val/ILSVRC2012_val_00036639.JPEG n02268853/
+mv val/ILSVRC2012_val_00036640.JPEG n04392985/
+mv val/ILSVRC2012_val_00036641.JPEG n03649909/
+mv val/ILSVRC2012_val_00036642.JPEG n03447721/
+mv val/ILSVRC2012_val_00036643.JPEG n02692877/
+mv val/ILSVRC2012_val_00036644.JPEG n12267677/
+mv val/ILSVRC2012_val_00036645.JPEG n07715103/
+mv val/ILSVRC2012_val_00036646.JPEG n04392985/
+mv val/ILSVRC2012_val_00036647.JPEG n04509417/
+mv val/ILSVRC2012_val_00036648.JPEG n04041544/
+mv val/ILSVRC2012_val_00036649.JPEG n03538406/
+mv val/ILSVRC2012_val_00036650.JPEG n01664065/
+mv val/ILSVRC2012_val_00036651.JPEG n03179701/
+mv val/ILSVRC2012_val_00036652.JPEG n01820546/
+mv val/ILSVRC2012_val_00036653.JPEG n04204347/
+mv val/ILSVRC2012_val_00036654.JPEG n03929660/
+mv val/ILSVRC2012_val_00036655.JPEG n02102973/
+mv val/ILSVRC2012_val_00036656.JPEG n03903868/
+mv val/ILSVRC2012_val_00036657.JPEG n01742172/
+mv val/ILSVRC2012_val_00036658.JPEG n01770081/
+mv val/ILSVRC2012_val_00036659.JPEG n03109150/
+mv val/ILSVRC2012_val_00036660.JPEG n04273569/
+mv val/ILSVRC2012_val_00036661.JPEG n02123045/
+mv val/ILSVRC2012_val_00036662.JPEG n07590611/
+mv val/ILSVRC2012_val_00036663.JPEG n13037406/
+mv val/ILSVRC2012_val_00036664.JPEG n02102177/
+mv val/ILSVRC2012_val_00036665.JPEG n03000247/
+mv val/ILSVRC2012_val_00036666.JPEG n02410509/
+mv val/ILSVRC2012_val_00036667.JPEG n02088632/
+mv val/ILSVRC2012_val_00036668.JPEG n07768694/
+mv val/ILSVRC2012_val_00036669.JPEG n06785654/
+mv val/ILSVRC2012_val_00036670.JPEG n03393912/
+mv val/ILSVRC2012_val_00036671.JPEG n03496892/
+mv val/ILSVRC2012_val_00036672.JPEG n04275548/
+mv val/ILSVRC2012_val_00036673.JPEG n03854065/
+mv val/ILSVRC2012_val_00036674.JPEG n04355933/
+mv val/ILSVRC2012_val_00036675.JPEG n01807496/
+mv val/ILSVRC2012_val_00036676.JPEG n07720875/
+mv val/ILSVRC2012_val_00036677.JPEG n04584207/
+mv val/ILSVRC2012_val_00036678.JPEG n03792782/
+mv val/ILSVRC2012_val_00036679.JPEG n03208938/
+mv val/ILSVRC2012_val_00036680.JPEG n02666196/
+mv val/ILSVRC2012_val_00036681.JPEG n04149813/
+mv val/ILSVRC2012_val_00036682.JPEG n02107683/
+mv val/ILSVRC2012_val_00036683.JPEG n04049303/
+mv val/ILSVRC2012_val_00036684.JPEG n04118538/
+mv val/ILSVRC2012_val_00036685.JPEG n04418357/
+mv val/ILSVRC2012_val_00036686.JPEG n02877765/
+mv val/ILSVRC2012_val_00036687.JPEG n01883070/
+mv val/ILSVRC2012_val_00036688.JPEG n02509815/
+mv val/ILSVRC2012_val_00036689.JPEG n10565667/
+mv val/ILSVRC2012_val_00036690.JPEG n02497673/
+mv val/ILSVRC2012_val_00036691.JPEG n02115913/
+mv val/ILSVRC2012_val_00036692.JPEG n03837869/
+mv val/ILSVRC2012_val_00036693.JPEG n02190166/
+mv val/ILSVRC2012_val_00036694.JPEG n04592741/
+mv val/ILSVRC2012_val_00036695.JPEG n04285008/
+mv val/ILSVRC2012_val_00036696.JPEG n04606251/
+mv val/ILSVRC2012_val_00036697.JPEG n03075370/
+mv val/ILSVRC2012_val_00036698.JPEG n04125021/
+mv val/ILSVRC2012_val_00036699.JPEG n03796401/
+mv val/ILSVRC2012_val_00036700.JPEG n02091134/
+mv val/ILSVRC2012_val_00036701.JPEG n03792972/
+mv val/ILSVRC2012_val_00036702.JPEG n01824575/
+mv val/ILSVRC2012_val_00036703.JPEG n02086079/
+mv val/ILSVRC2012_val_00036704.JPEG n01855032/
+mv val/ILSVRC2012_val_00036705.JPEG n07742313/
+mv val/ILSVRC2012_val_00036706.JPEG n03393912/
+mv val/ILSVRC2012_val_00036707.JPEG n03958227/
+mv val/ILSVRC2012_val_00036708.JPEG n02137549/
+mv val/ILSVRC2012_val_00036709.JPEG n02113978/
+mv val/ILSVRC2012_val_00036710.JPEG n02356798/
+mv val/ILSVRC2012_val_00036711.JPEG n02808440/
+mv val/ILSVRC2012_val_00036712.JPEG n02105412/
+mv val/ILSVRC2012_val_00036713.JPEG n01797886/
+mv val/ILSVRC2012_val_00036714.JPEG n04204347/
+mv val/ILSVRC2012_val_00036715.JPEG n03837869/
+mv val/ILSVRC2012_val_00036716.JPEG n02111277/
+mv val/ILSVRC2012_val_00036717.JPEG n02777292/
+mv val/ILSVRC2012_val_00036718.JPEG n02129604/
+mv val/ILSVRC2012_val_00036719.JPEG n07930864/
+mv val/ILSVRC2012_val_00036720.JPEG n02489166/
+mv val/ILSVRC2012_val_00036721.JPEG n03459775/
+mv val/ILSVRC2012_val_00036722.JPEG n01644900/
+mv val/ILSVRC2012_val_00036723.JPEG n04149813/
+mv val/ILSVRC2012_val_00036724.JPEG n03854065/
+mv val/ILSVRC2012_val_00036725.JPEG n03125729/
+mv val/ILSVRC2012_val_00036726.JPEG n04141076/
+mv val/ILSVRC2012_val_00036727.JPEG n04505470/
+mv val/ILSVRC2012_val_00036728.JPEG n02089973/
+mv val/ILSVRC2012_val_00036729.JPEG n02172182/
+mv val/ILSVRC2012_val_00036730.JPEG n04266014/
+mv val/ILSVRC2012_val_00036731.JPEG n04606251/
+mv val/ILSVRC2012_val_00036732.JPEG n07768694/
+mv val/ILSVRC2012_val_00036733.JPEG n09472597/
+mv val/ILSVRC2012_val_00036734.JPEG n02134418/
+mv val/ILSVRC2012_val_00036735.JPEG n03623198/
+mv val/ILSVRC2012_val_00036736.JPEG n02793495/
+mv val/ILSVRC2012_val_00036737.JPEG n01484850/
+mv val/ILSVRC2012_val_00036738.JPEG n02276258/
+mv val/ILSVRC2012_val_00036739.JPEG n02095889/
+mv val/ILSVRC2012_val_00036740.JPEG n03733281/
+mv val/ILSVRC2012_val_00036741.JPEG n03535780/
+mv val/ILSVRC2012_val_00036742.JPEG n03983396/
+mv val/ILSVRC2012_val_00036743.JPEG n02640242/
+mv val/ILSVRC2012_val_00036744.JPEG n01818515/
+mv val/ILSVRC2012_val_00036745.JPEG n02051845/
+mv val/ILSVRC2012_val_00036746.JPEG n03544143/
+mv val/ILSVRC2012_val_00036747.JPEG n02092002/
+mv val/ILSVRC2012_val_00036748.JPEG n02906734/
+mv val/ILSVRC2012_val_00036749.JPEG n01518878/
+mv val/ILSVRC2012_val_00036750.JPEG n03769881/
+mv val/ILSVRC2012_val_00036751.JPEG n02087046/
+mv val/ILSVRC2012_val_00036752.JPEG n03891332/
+mv val/ILSVRC2012_val_00036753.JPEG n04392985/
+mv val/ILSVRC2012_val_00036754.JPEG n03485794/
+mv val/ILSVRC2012_val_00036755.JPEG n03445777/
+mv val/ILSVRC2012_val_00036756.JPEG n02115913/
+mv val/ILSVRC2012_val_00036757.JPEG n02321529/
+mv val/ILSVRC2012_val_00036758.JPEG n03633091/
+mv val/ILSVRC2012_val_00036759.JPEG n01984695/
+mv val/ILSVRC2012_val_00036760.JPEG n04590129/
+mv val/ILSVRC2012_val_00036761.JPEG n02268443/
+mv val/ILSVRC2012_val_00036762.JPEG n02676566/
+mv val/ILSVRC2012_val_00036763.JPEG n02134084/
+mv val/ILSVRC2012_val_00036764.JPEG n03658185/
+mv val/ILSVRC2012_val_00036765.JPEG n02091134/
+mv val/ILSVRC2012_val_00036766.JPEG n03733805/
+mv val/ILSVRC2012_val_00036767.JPEG n02488702/
+mv val/ILSVRC2012_val_00036768.JPEG n02869837/
+mv val/ILSVRC2012_val_00036769.JPEG n02640242/
+mv val/ILSVRC2012_val_00036770.JPEG n03160309/
+mv val/ILSVRC2012_val_00036771.JPEG n02443484/
+mv val/ILSVRC2012_val_00036772.JPEG n02441942/
+mv val/ILSVRC2012_val_00036773.JPEG n01775062/
+mv val/ILSVRC2012_val_00036774.JPEG n02825657/
+mv val/ILSVRC2012_val_00036775.JPEG n12144580/
+mv val/ILSVRC2012_val_00036776.JPEG n04591713/
+mv val/ILSVRC2012_val_00036777.JPEG n02783161/
+mv val/ILSVRC2012_val_00036778.JPEG n01882714/
+mv val/ILSVRC2012_val_00036779.JPEG n02815834/
+mv val/ILSVRC2012_val_00036780.JPEG n02814860/
+mv val/ILSVRC2012_val_00036781.JPEG n02102177/
+mv val/ILSVRC2012_val_00036782.JPEG n02988304/
+mv val/ILSVRC2012_val_00036783.JPEG n03376595/
+mv val/ILSVRC2012_val_00036784.JPEG n02165105/
+mv val/ILSVRC2012_val_00036785.JPEG n04081281/
+mv val/ILSVRC2012_val_00036786.JPEG n03495258/
+mv val/ILSVRC2012_val_00036787.JPEG n09193705/
+mv val/ILSVRC2012_val_00036788.JPEG n04493381/
+mv val/ILSVRC2012_val_00036789.JPEG n02815834/
+mv val/ILSVRC2012_val_00036790.JPEG n11939491/
+mv val/ILSVRC2012_val_00036791.JPEG n02883205/
+mv val/ILSVRC2012_val_00036792.JPEG n03063689/
+mv val/ILSVRC2012_val_00036793.JPEG n02095570/
+mv val/ILSVRC2012_val_00036794.JPEG n04033901/
+mv val/ILSVRC2012_val_00036795.JPEG n03937543/
+mv val/ILSVRC2012_val_00036796.JPEG n02107908/
+mv val/ILSVRC2012_val_00036797.JPEG n07742313/
+mv val/ILSVRC2012_val_00036798.JPEG n02114712/
+mv val/ILSVRC2012_val_00036799.JPEG n02971356/
+mv val/ILSVRC2012_val_00036800.JPEG n02906734/
+mv val/ILSVRC2012_val_00036801.JPEG n02814860/
+mv val/ILSVRC2012_val_00036802.JPEG n01692333/
+mv val/ILSVRC2012_val_00036803.JPEG n02808440/
+mv val/ILSVRC2012_val_00036804.JPEG n03706229/
+mv val/ILSVRC2012_val_00036805.JPEG n04335435/
+mv val/ILSVRC2012_val_00036806.JPEG n03791053/
+mv val/ILSVRC2012_val_00036807.JPEG n03742115/
+mv val/ILSVRC2012_val_00036808.JPEG n02099429/
+mv val/ILSVRC2012_val_00036809.JPEG n02877765/
+mv val/ILSVRC2012_val_00036810.JPEG n02321529/
+mv val/ILSVRC2012_val_00036811.JPEG n03814639/
+mv val/ILSVRC2012_val_00036812.JPEG n01592084/
+mv val/ILSVRC2012_val_00036813.JPEG n03272562/
+mv val/ILSVRC2012_val_00036814.JPEG n02786058/
+mv val/ILSVRC2012_val_00036815.JPEG n01667114/
+mv val/ILSVRC2012_val_00036816.JPEG n03947888/
+mv val/ILSVRC2012_val_00036817.JPEG n02100735/
+mv val/ILSVRC2012_val_00036818.JPEG n04409515/
+mv val/ILSVRC2012_val_00036819.JPEG n01601694/
+mv val/ILSVRC2012_val_00036820.JPEG n03777568/
+mv val/ILSVRC2012_val_00036821.JPEG n12620546/
+mv val/ILSVRC2012_val_00036822.JPEG n06794110/
+mv val/ILSVRC2012_val_00036823.JPEG n02483708/
+mv val/ILSVRC2012_val_00036824.JPEG n03666591/
+mv val/ILSVRC2012_val_00036825.JPEG n03759954/
+mv val/ILSVRC2012_val_00036826.JPEG n01871265/
+mv val/ILSVRC2012_val_00036827.JPEG n02790996/
+mv val/ILSVRC2012_val_00036828.JPEG n01955084/
+mv val/ILSVRC2012_val_00036829.JPEG n03868863/
+mv val/ILSVRC2012_val_00036830.JPEG n03026506/
+mv val/ILSVRC2012_val_00036831.JPEG n04070727/
+mv val/ILSVRC2012_val_00036832.JPEG n02233338/
+mv val/ILSVRC2012_val_00036833.JPEG n01983481/
+mv val/ILSVRC2012_val_00036834.JPEG n02640242/
+mv val/ILSVRC2012_val_00036835.JPEG n01819313/
+mv val/ILSVRC2012_val_00036836.JPEG n02794156/
+mv val/ILSVRC2012_val_00036837.JPEG n03017168/
+mv val/ILSVRC2012_val_00036838.JPEG n02486261/
+mv val/ILSVRC2012_val_00036839.JPEG n04118776/
+mv val/ILSVRC2012_val_00036840.JPEG n02769748/
+mv val/ILSVRC2012_val_00036841.JPEG n03250847/
+mv val/ILSVRC2012_val_00036842.JPEG n02113799/
+mv val/ILSVRC2012_val_00036843.JPEG n02105056/
+mv val/ILSVRC2012_val_00036844.JPEG n02108422/
+mv val/ILSVRC2012_val_00036845.JPEG n01806567/
+mv val/ILSVRC2012_val_00036846.JPEG n04229816/
+mv val/ILSVRC2012_val_00036847.JPEG n09256479/
+mv val/ILSVRC2012_val_00036848.JPEG n04141327/
+mv val/ILSVRC2012_val_00036849.JPEG n01692333/
+mv val/ILSVRC2012_val_00036850.JPEG n01644373/
+mv val/ILSVRC2012_val_00036851.JPEG n02493509/
+mv val/ILSVRC2012_val_00036852.JPEG n02892201/
+mv val/ILSVRC2012_val_00036853.JPEG n02346627/
+mv val/ILSVRC2012_val_00036854.JPEG n07747607/
+mv val/ILSVRC2012_val_00036855.JPEG n04120489/
+mv val/ILSVRC2012_val_00036856.JPEG n03032252/
+mv val/ILSVRC2012_val_00036857.JPEG n04081281/
+mv val/ILSVRC2012_val_00036858.JPEG n09468604/
+mv val/ILSVRC2012_val_00036859.JPEG n02108422/
+mv val/ILSVRC2012_val_00036860.JPEG n07753113/
+mv val/ILSVRC2012_val_00036861.JPEG n02441942/
+mv val/ILSVRC2012_val_00036862.JPEG n03775071/
+mv val/ILSVRC2012_val_00036863.JPEG n02319095/
+mv val/ILSVRC2012_val_00036864.JPEG n04579145/
+mv val/ILSVRC2012_val_00036865.JPEG n02097474/
+mv val/ILSVRC2012_val_00036866.JPEG n03697007/
+mv val/ILSVRC2012_val_00036867.JPEG n02769748/
+mv val/ILSVRC2012_val_00036868.JPEG n02129604/
+mv val/ILSVRC2012_val_00036869.JPEG n04141076/
+mv val/ILSVRC2012_val_00036870.JPEG n04476259/
+mv val/ILSVRC2012_val_00036871.JPEG n02442845/
+mv val/ILSVRC2012_val_00036872.JPEG n04442312/
+mv val/ILSVRC2012_val_00036873.JPEG n02012849/
+mv val/ILSVRC2012_val_00036874.JPEG n01806567/
+mv val/ILSVRC2012_val_00036875.JPEG n03337140/
+mv val/ILSVRC2012_val_00036876.JPEG n02097209/
+mv val/ILSVRC2012_val_00036877.JPEG n03207941/
+mv val/ILSVRC2012_val_00036878.JPEG n01632458/
+mv val/ILSVRC2012_val_00036879.JPEG n01818515/
+mv val/ILSVRC2012_val_00036880.JPEG n02233338/
+mv val/ILSVRC2012_val_00036881.JPEG n02088094/
+mv val/ILSVRC2012_val_00036882.JPEG n02727426/
+mv val/ILSVRC2012_val_00036883.JPEG n04239074/
+mv val/ILSVRC2012_val_00036884.JPEG n03095699/
+mv val/ILSVRC2012_val_00036885.JPEG n04606251/
+mv val/ILSVRC2012_val_00036886.JPEG n03902125/
+mv val/ILSVRC2012_val_00036887.JPEG n02099267/
+mv val/ILSVRC2012_val_00036888.JPEG n02086240/
+mv val/ILSVRC2012_val_00036889.JPEG n03337140/
+mv val/ILSVRC2012_val_00036890.JPEG n02085782/
+mv val/ILSVRC2012_val_00036891.JPEG n02412080/
+mv val/ILSVRC2012_val_00036892.JPEG n03637318/
+mv val/ILSVRC2012_val_00036893.JPEG n01734418/
+mv val/ILSVRC2012_val_00036894.JPEG n02113023/
+mv val/ILSVRC2012_val_00036895.JPEG n04251144/
+mv val/ILSVRC2012_val_00036896.JPEG n03764736/
+mv val/ILSVRC2012_val_00036897.JPEG n02114855/
+mv val/ILSVRC2012_val_00036898.JPEG n02799071/
+mv val/ILSVRC2012_val_00036899.JPEG n01675722/
+mv val/ILSVRC2012_val_00036900.JPEG n02843684/
+mv val/ILSVRC2012_val_00036901.JPEG n01756291/
+mv val/ILSVRC2012_val_00036902.JPEG n04417672/
+mv val/ILSVRC2012_val_00036903.JPEG n02835271/
+mv val/ILSVRC2012_val_00036904.JPEG n04141076/
+mv val/ILSVRC2012_val_00036905.JPEG n04389033/
+mv val/ILSVRC2012_val_00036906.JPEG n04482393/
+mv val/ILSVRC2012_val_00036907.JPEG n02087394/
+mv val/ILSVRC2012_val_00036908.JPEG n02115641/
+mv val/ILSVRC2012_val_00036909.JPEG n03017168/
+mv val/ILSVRC2012_val_00036910.JPEG n01753488/
+mv val/ILSVRC2012_val_00036911.JPEG n02514041/
+mv val/ILSVRC2012_val_00036912.JPEG n04509417/
+mv val/ILSVRC2012_val_00036913.JPEG n02089973/
+mv val/ILSVRC2012_val_00036914.JPEG n03075370/
+mv val/ILSVRC2012_val_00036915.JPEG n01644373/
+mv val/ILSVRC2012_val_00036916.JPEG n03791053/
+mv val/ILSVRC2012_val_00036917.JPEG n04265275/
+mv val/ILSVRC2012_val_00036918.JPEG n02111500/
+mv val/ILSVRC2012_val_00036919.JPEG n02097209/
+mv val/ILSVRC2012_val_00036920.JPEG n04458633/
+mv val/ILSVRC2012_val_00036921.JPEG n07802026/
+mv val/ILSVRC2012_val_00036922.JPEG n04141076/
+mv val/ILSVRC2012_val_00036923.JPEG n04597913/
+mv val/ILSVRC2012_val_00036924.JPEG n02281787/
+mv val/ILSVRC2012_val_00036925.JPEG n12057211/
+mv val/ILSVRC2012_val_00036926.JPEG n02277742/
+mv val/ILSVRC2012_val_00036927.JPEG n07716906/
+mv val/ILSVRC2012_val_00036928.JPEG n03920288/
+mv val/ILSVRC2012_val_00036929.JPEG n04326547/
+mv val/ILSVRC2012_val_00036930.JPEG n03127747/
+mv val/ILSVRC2012_val_00036931.JPEG n03404251/
+mv val/ILSVRC2012_val_00036932.JPEG n02108915/
+mv val/ILSVRC2012_val_00036933.JPEG n02127052/
+mv val/ILSVRC2012_val_00036934.JPEG n02391049/
+mv val/ILSVRC2012_val_00036935.JPEG n04229816/
+mv val/ILSVRC2012_val_00036936.JPEG n02837789/
+mv val/ILSVRC2012_val_00036937.JPEG n03314780/
+mv val/ILSVRC2012_val_00036938.JPEG n02089973/
+mv val/ILSVRC2012_val_00036939.JPEG n04296562/
+mv val/ILSVRC2012_val_00036940.JPEG n02791270/
+mv val/ILSVRC2012_val_00036941.JPEG n03000134/
+mv val/ILSVRC2012_val_00036942.JPEG n01644900/
+mv val/ILSVRC2012_val_00036943.JPEG n04209133/
+mv val/ILSVRC2012_val_00036944.JPEG n01669191/
+mv val/ILSVRC2012_val_00036945.JPEG n02107142/
+mv val/ILSVRC2012_val_00036946.JPEG n03908714/
+mv val/ILSVRC2012_val_00036947.JPEG n03045698/
+mv val/ILSVRC2012_val_00036948.JPEG n03485794/
+mv val/ILSVRC2012_val_00036949.JPEG n02108551/
+mv val/ILSVRC2012_val_00036950.JPEG n02807133/
+mv val/ILSVRC2012_val_00036951.JPEG n02892767/
+mv val/ILSVRC2012_val_00036952.JPEG n04525305/
+mv val/ILSVRC2012_val_00036953.JPEG n02493509/
+mv val/ILSVRC2012_val_00036954.JPEG n10148035/
+mv val/ILSVRC2012_val_00036955.JPEG n03201208/
+mv val/ILSVRC2012_val_00036956.JPEG n03690938/
+mv val/ILSVRC2012_val_00036957.JPEG n04505470/
+mv val/ILSVRC2012_val_00036958.JPEG n02206856/
+mv val/ILSVRC2012_val_00036959.JPEG n02098105/
+mv val/ILSVRC2012_val_00036960.JPEG n03478589/
+mv val/ILSVRC2012_val_00036961.JPEG n02123597/
+mv val/ILSVRC2012_val_00036962.JPEG n02783161/
+mv val/ILSVRC2012_val_00036963.JPEG n01667114/
+mv val/ILSVRC2012_val_00036964.JPEG n02106550/
+mv val/ILSVRC2012_val_00036965.JPEG n03733805/
+mv val/ILSVRC2012_val_00036966.JPEG n03424325/
+mv val/ILSVRC2012_val_00036967.JPEG n01882714/
+mv val/ILSVRC2012_val_00036968.JPEG n01855672/
+mv val/ILSVRC2012_val_00036969.JPEG n01855672/
+mv val/ILSVRC2012_val_00036970.JPEG n01983481/
+mv val/ILSVRC2012_val_00036971.JPEG n01695060/
+mv val/ILSVRC2012_val_00036972.JPEG n01847000/
+mv val/ILSVRC2012_val_00036973.JPEG n02799071/
+mv val/ILSVRC2012_val_00036974.JPEG n04428191/
+mv val/ILSVRC2012_val_00036975.JPEG n03223299/
+mv val/ILSVRC2012_val_00036976.JPEG n13052670/
+mv val/ILSVRC2012_val_00036977.JPEG n02101556/
+mv val/ILSVRC2012_val_00036978.JPEG n04265275/
+mv val/ILSVRC2012_val_00036979.JPEG n03016953/
+mv val/ILSVRC2012_val_00036980.JPEG n01775062/
+mv val/ILSVRC2012_val_00036981.JPEG n04033901/
+mv val/ILSVRC2012_val_00036982.JPEG n01753488/
+mv val/ILSVRC2012_val_00036983.JPEG n03146219/
+mv val/ILSVRC2012_val_00036984.JPEG n04235860/
+mv val/ILSVRC2012_val_00036985.JPEG n03759954/
+mv val/ILSVRC2012_val_00036986.JPEG n03788195/
+mv val/ILSVRC2012_val_00036987.JPEG n07749582/
+mv val/ILSVRC2012_val_00036988.JPEG n01829413/
+mv val/ILSVRC2012_val_00036989.JPEG n02093256/
+mv val/ILSVRC2012_val_00036990.JPEG n02231487/
+mv val/ILSVRC2012_val_00036991.JPEG n04536866/
+mv val/ILSVRC2012_val_00036992.JPEG n03146219/
+mv val/ILSVRC2012_val_00036993.JPEG n04004767/
+mv val/ILSVRC2012_val_00036994.JPEG n02493793/
+mv val/ILSVRC2012_val_00036995.JPEG n04371774/
+mv val/ILSVRC2012_val_00036996.JPEG n02395406/
+mv val/ILSVRC2012_val_00036997.JPEG n02114712/
+mv val/ILSVRC2012_val_00036998.JPEG n02747177/
+mv val/ILSVRC2012_val_00036999.JPEG n01560419/
+mv val/ILSVRC2012_val_00037000.JPEG n03814906/
+mv val/ILSVRC2012_val_00037001.JPEG n04141327/
+mv val/ILSVRC2012_val_00037002.JPEG n01833805/
+mv val/ILSVRC2012_val_00037003.JPEG n03825788/
+mv val/ILSVRC2012_val_00037004.JPEG n02128925/
+mv val/ILSVRC2012_val_00037005.JPEG n02120079/
+mv val/ILSVRC2012_val_00037006.JPEG n03658185/
+mv val/ILSVRC2012_val_00037007.JPEG n03935335/
+mv val/ILSVRC2012_val_00037008.JPEG n03530642/
+mv val/ILSVRC2012_val_00037009.JPEG n01968897/
+mv val/ILSVRC2012_val_00037010.JPEG n02114548/
+mv val/ILSVRC2012_val_00037011.JPEG n03873416/
+mv val/ILSVRC2012_val_00037012.JPEG n01985128/
+mv val/ILSVRC2012_val_00037013.JPEG n01514859/
+mv val/ILSVRC2012_val_00037014.JPEG n02669723/
+mv val/ILSVRC2012_val_00037015.JPEG n04311174/
+mv val/ILSVRC2012_val_00037016.JPEG n03141823/
+mv val/ILSVRC2012_val_00037017.JPEG n01872401/
+mv val/ILSVRC2012_val_00037018.JPEG n03920288/
+mv val/ILSVRC2012_val_00037019.JPEG n02927161/
+mv val/ILSVRC2012_val_00037020.JPEG n02397096/
+mv val/ILSVRC2012_val_00037021.JPEG n04357314/
+mv val/ILSVRC2012_val_00037022.JPEG n03535780/
+mv val/ILSVRC2012_val_00037023.JPEG n03127925/
+mv val/ILSVRC2012_val_00037024.JPEG n01807496/
+mv val/ILSVRC2012_val_00037025.JPEG n02895154/
+mv val/ILSVRC2012_val_00037026.JPEG n02794156/
+mv val/ILSVRC2012_val_00037027.JPEG n03666591/
+mv val/ILSVRC2012_val_00037028.JPEG n04004767/
+mv val/ILSVRC2012_val_00037029.JPEG n04039381/
+mv val/ILSVRC2012_val_00037030.JPEG n04179913/
+mv val/ILSVRC2012_val_00037031.JPEG n01828970/
+mv val/ILSVRC2012_val_00037032.JPEG n02128385/
+mv val/ILSVRC2012_val_00037033.JPEG n02095570/
+mv val/ILSVRC2012_val_00037034.JPEG n04592741/
+mv val/ILSVRC2012_val_00037035.JPEG n02793495/
+mv val/ILSVRC2012_val_00037036.JPEG n02096177/
+mv val/ILSVRC2012_val_00037037.JPEG n01631663/
+mv val/ILSVRC2012_val_00037038.JPEG n02111500/
+mv val/ILSVRC2012_val_00037039.JPEG n12057211/
+mv val/ILSVRC2012_val_00037040.JPEG n04356056/
+mv val/ILSVRC2012_val_00037041.JPEG n02894605/
+mv val/ILSVRC2012_val_00037042.JPEG n02226429/
+mv val/ILSVRC2012_val_00037043.JPEG n04482393/
+mv val/ILSVRC2012_val_00037044.JPEG n01950731/
+mv val/ILSVRC2012_val_00037045.JPEG n03452741/
+mv val/ILSVRC2012_val_00037046.JPEG n01632777/
+mv val/ILSVRC2012_val_00037047.JPEG n03197337/
+mv val/ILSVRC2012_val_00037048.JPEG n04505470/
+mv val/ILSVRC2012_val_00037049.JPEG n04599235/
+mv val/ILSVRC2012_val_00037050.JPEG n01484850/
+mv val/ILSVRC2012_val_00037051.JPEG n04501370/
+mv val/ILSVRC2012_val_00037052.JPEG n02095570/
+mv val/ILSVRC2012_val_00037053.JPEG n02276258/
+mv val/ILSVRC2012_val_00037054.JPEG n02410509/
+mv val/ILSVRC2012_val_00037055.JPEG n04037443/
+mv val/ILSVRC2012_val_00037056.JPEG n02276258/
+mv val/ILSVRC2012_val_00037057.JPEG n04418357/
+mv val/ILSVRC2012_val_00037058.JPEG n02892767/
+mv val/ILSVRC2012_val_00037059.JPEG n02099267/
+mv val/ILSVRC2012_val_00037060.JPEG n03791053/
+mv val/ILSVRC2012_val_00037061.JPEG n04599235/
+mv val/ILSVRC2012_val_00037062.JPEG n03642806/
+mv val/ILSVRC2012_val_00037063.JPEG n03530642/
+mv val/ILSVRC2012_val_00037064.JPEG n07718472/
+mv val/ILSVRC2012_val_00037065.JPEG n07693725/
+mv val/ILSVRC2012_val_00037066.JPEG n11939491/
+mv val/ILSVRC2012_val_00037067.JPEG n02793495/
+mv val/ILSVRC2012_val_00037068.JPEG n02988304/
+mv val/ILSVRC2012_val_00037069.JPEG n02096051/
+mv val/ILSVRC2012_val_00037070.JPEG n01514668/
+mv val/ILSVRC2012_val_00037071.JPEG n01616318/
+mv val/ILSVRC2012_val_00037072.JPEG n04243546/
+mv val/ILSVRC2012_val_00037073.JPEG n02808440/
+mv val/ILSVRC2012_val_00037074.JPEG n04270147/
+mv val/ILSVRC2012_val_00037075.JPEG n02106030/
+mv val/ILSVRC2012_val_00037076.JPEG n04344873/
+mv val/ILSVRC2012_val_00037077.JPEG n07930864/
+mv val/ILSVRC2012_val_00037078.JPEG n03444034/
+mv val/ILSVRC2012_val_00037079.JPEG n07860988/
+mv val/ILSVRC2012_val_00037080.JPEG n02119022/
+mv val/ILSVRC2012_val_00037081.JPEG n02108000/
+mv val/ILSVRC2012_val_00037082.JPEG n04562935/
+mv val/ILSVRC2012_val_00037083.JPEG n02105162/
+mv val/ILSVRC2012_val_00037084.JPEG n02492035/
+mv val/ILSVRC2012_val_00037085.JPEG n02823750/
+mv val/ILSVRC2012_val_00037086.JPEG n03481172/
+mv val/ILSVRC2012_val_00037087.JPEG n02108000/
+mv val/ILSVRC2012_val_00037088.JPEG n04310018/
+mv val/ILSVRC2012_val_00037089.JPEG n02107142/
+mv val/ILSVRC2012_val_00037090.JPEG n02226429/
+mv val/ILSVRC2012_val_00037091.JPEG n02074367/
+mv val/ILSVRC2012_val_00037092.JPEG n03785016/
+mv val/ILSVRC2012_val_00037093.JPEG n04553703/
+mv val/ILSVRC2012_val_00037094.JPEG n03495258/
+mv val/ILSVRC2012_val_00037095.JPEG n07579787/
+mv val/ILSVRC2012_val_00037096.JPEG n07745940/
+mv val/ILSVRC2012_val_00037097.JPEG n02111277/
+mv val/ILSVRC2012_val_00037098.JPEG n04476259/
+mv val/ILSVRC2012_val_00037099.JPEG n03476684/
+mv val/ILSVRC2012_val_00037100.JPEG n04487081/
+mv val/ILSVRC2012_val_00037101.JPEG n02091134/
+mv val/ILSVRC2012_val_00037102.JPEG n07714571/
+mv val/ILSVRC2012_val_00037103.JPEG n02105251/
+mv val/ILSVRC2012_val_00037104.JPEG n04404412/
+mv val/ILSVRC2012_val_00037105.JPEG n04398044/
+mv val/ILSVRC2012_val_00037106.JPEG n01924916/
+mv val/ILSVRC2012_val_00037107.JPEG n02487347/
+mv val/ILSVRC2012_val_00037108.JPEG n12620546/
+mv val/ILSVRC2012_val_00037109.JPEG n03255030/
+mv val/ILSVRC2012_val_00037110.JPEG n04325704/
+mv val/ILSVRC2012_val_00037111.JPEG n02093647/
+mv val/ILSVRC2012_val_00037112.JPEG n02814533/
+mv val/ILSVRC2012_val_00037113.JPEG n03125729/
+mv val/ILSVRC2012_val_00037114.JPEG n03000247/
+mv val/ILSVRC2012_val_00037115.JPEG n02492035/
+mv val/ILSVRC2012_val_00037116.JPEG n01530575/
+mv val/ILSVRC2012_val_00037117.JPEG n02108915/
+mv val/ILSVRC2012_val_00037118.JPEG n02114367/
+mv val/ILSVRC2012_val_00037119.JPEG n01796340/
+mv val/ILSVRC2012_val_00037120.JPEG n13044778/
+mv val/ILSVRC2012_val_00037121.JPEG n04522168/
+mv val/ILSVRC2012_val_00037122.JPEG n02443114/
+mv val/ILSVRC2012_val_00037123.JPEG n04589890/
+mv val/ILSVRC2012_val_00037124.JPEG n04201297/
+mv val/ILSVRC2012_val_00037125.JPEG n03733805/
+mv val/ILSVRC2012_val_00037126.JPEG n02168699/
+mv val/ILSVRC2012_val_00037127.JPEG n01616318/
+mv val/ILSVRC2012_val_00037128.JPEG n03594945/
+mv val/ILSVRC2012_val_00037129.JPEG n04479046/
+mv val/ILSVRC2012_val_00037130.JPEG n02391049/
+mv val/ILSVRC2012_val_00037131.JPEG n02892201/
+mv val/ILSVRC2012_val_00037132.JPEG n04447861/
+mv val/ILSVRC2012_val_00037133.JPEG n02134084/
+mv val/ILSVRC2012_val_00037134.JPEG n02096294/
+mv val/ILSVRC2012_val_00037135.JPEG n01484850/
+mv val/ILSVRC2012_val_00037136.JPEG n03930630/
+mv val/ILSVRC2012_val_00037137.JPEG n02090721/
+mv val/ILSVRC2012_val_00037138.JPEG n04118538/
+mv val/ILSVRC2012_val_00037139.JPEG n02445715/
+mv val/ILSVRC2012_val_00037140.JPEG n06596364/
+mv val/ILSVRC2012_val_00037141.JPEG n03599486/
+mv val/ILSVRC2012_val_00037142.JPEG n04579145/
+mv val/ILSVRC2012_val_00037143.JPEG n09468604/
+mv val/ILSVRC2012_val_00037144.JPEG n01986214/
+mv val/ILSVRC2012_val_00037145.JPEG n01820546/
+mv val/ILSVRC2012_val_00037146.JPEG n02526121/
+mv val/ILSVRC2012_val_00037147.JPEG n02408429/
+mv val/ILSVRC2012_val_00037148.JPEG n03854065/
+mv val/ILSVRC2012_val_00037149.JPEG n01855032/
+mv val/ILSVRC2012_val_00037150.JPEG n03272562/
+mv val/ILSVRC2012_val_00037151.JPEG n09288635/
+mv val/ILSVRC2012_val_00037152.JPEG n02106550/
+mv val/ILSVRC2012_val_00037153.JPEG n02095314/
+mv val/ILSVRC2012_val_00037154.JPEG n01667778/
+mv val/ILSVRC2012_val_00037155.JPEG n02137549/
+mv val/ILSVRC2012_val_00037156.JPEG n02483708/
+mv val/ILSVRC2012_val_00037157.JPEG n02804610/
+mv val/ILSVRC2012_val_00037158.JPEG n04125021/
+mv val/ILSVRC2012_val_00037159.JPEG n03769881/
+mv val/ILSVRC2012_val_00037160.JPEG n02814533/
+mv val/ILSVRC2012_val_00037161.JPEG n07718472/
+mv val/ILSVRC2012_val_00037162.JPEG n04263257/
+mv val/ILSVRC2012_val_00037163.JPEG n03877472/
+mv val/ILSVRC2012_val_00037164.JPEG n02107312/
+mv val/ILSVRC2012_val_00037165.JPEG n03042490/
+mv val/ILSVRC2012_val_00037166.JPEG n01697457/
+mv val/ILSVRC2012_val_00037167.JPEG n09468604/
+mv val/ILSVRC2012_val_00037168.JPEG n03146219/
+mv val/ILSVRC2012_val_00037169.JPEG n02799071/
+mv val/ILSVRC2012_val_00037170.JPEG n03764736/
+mv val/ILSVRC2012_val_00037171.JPEG n02493793/
+mv val/ILSVRC2012_val_00037172.JPEG n03787032/
+mv val/ILSVRC2012_val_00037173.JPEG n02808304/
+mv val/ILSVRC2012_val_00037174.JPEG n03485407/
+mv val/ILSVRC2012_val_00037175.JPEG n01740131/
+mv val/ILSVRC2012_val_00037176.JPEG n04589890/
+mv val/ILSVRC2012_val_00037177.JPEG n01914609/
+mv val/ILSVRC2012_val_00037178.JPEG n02883205/
+mv val/ILSVRC2012_val_00037179.JPEG n04254680/
+mv val/ILSVRC2012_val_00037180.JPEG n03777568/
+mv val/ILSVRC2012_val_00037181.JPEG n02280649/
+mv val/ILSVRC2012_val_00037182.JPEG n02102040/
+mv val/ILSVRC2012_val_00037183.JPEG n02823750/
+mv val/ILSVRC2012_val_00037184.JPEG n04147183/
+mv val/ILSVRC2012_val_00037185.JPEG n02091467/
+mv val/ILSVRC2012_val_00037186.JPEG n04069434/
+mv val/ILSVRC2012_val_00037187.JPEG n01729977/
+mv val/ILSVRC2012_val_00037188.JPEG n01818515/
+mv val/ILSVRC2012_val_00037189.JPEG n04023962/
+mv val/ILSVRC2012_val_00037190.JPEG n03584254/
+mv val/ILSVRC2012_val_00037191.JPEG n02095314/
+mv val/ILSVRC2012_val_00037192.JPEG n03983396/
+mv val/ILSVRC2012_val_00037193.JPEG n03956157/
+mv val/ILSVRC2012_val_00037194.JPEG n02097209/
+mv val/ILSVRC2012_val_00037195.JPEG n02095314/
+mv val/ILSVRC2012_val_00037196.JPEG n02825657/
+mv val/ILSVRC2012_val_00037197.JPEG n02107142/
+mv val/ILSVRC2012_val_00037198.JPEG n02219486/
+mv val/ILSVRC2012_val_00037199.JPEG n03796401/
+mv val/ILSVRC2012_val_00037200.JPEG n01687978/
+mv val/ILSVRC2012_val_00037201.JPEG n03944341/
+mv val/ILSVRC2012_val_00037202.JPEG n02097658/
+mv val/ILSVRC2012_val_00037203.JPEG n07718747/
+mv val/ILSVRC2012_val_00037204.JPEG n04552348/
+mv val/ILSVRC2012_val_00037205.JPEG n04263257/
+mv val/ILSVRC2012_val_00037206.JPEG n03942813/
+mv val/ILSVRC2012_val_00037207.JPEG n02037110/
+mv val/ILSVRC2012_val_00037208.JPEG n03787032/
+mv val/ILSVRC2012_val_00037209.JPEG n03642806/
+mv val/ILSVRC2012_val_00037210.JPEG n01689811/
+mv val/ILSVRC2012_val_00037211.JPEG n02102973/
+mv val/ILSVRC2012_val_00037212.JPEG n02480495/
+mv val/ILSVRC2012_val_00037213.JPEG n07684084/
+mv val/ILSVRC2012_val_00037214.JPEG n02408429/
+mv val/ILSVRC2012_val_00037215.JPEG n04356056/
+mv val/ILSVRC2012_val_00037216.JPEG n02117135/
+mv val/ILSVRC2012_val_00037217.JPEG n07584110/
+mv val/ILSVRC2012_val_00037218.JPEG n04265275/
+mv val/ILSVRC2012_val_00037219.JPEG n02493793/
+mv val/ILSVRC2012_val_00037220.JPEG n01682714/
+mv val/ILSVRC2012_val_00037221.JPEG n01981276/
+mv val/ILSVRC2012_val_00037222.JPEG n04592741/
+mv val/ILSVRC2012_val_00037223.JPEG n03976467/
+mv val/ILSVRC2012_val_00037224.JPEG n02948072/
+mv val/ILSVRC2012_val_00037225.JPEG n04086273/
+mv val/ILSVRC2012_val_00037226.JPEG n04277352/
+mv val/ILSVRC2012_val_00037227.JPEG n13054560/
+mv val/ILSVRC2012_val_00037228.JPEG n02480495/
+mv val/ILSVRC2012_val_00037229.JPEG n01983481/
+mv val/ILSVRC2012_val_00037230.JPEG n02085782/
+mv val/ILSVRC2012_val_00037231.JPEG n03598930/
+mv val/ILSVRC2012_val_00037232.JPEG n03345487/
+mv val/ILSVRC2012_val_00037233.JPEG n02017213/
+mv val/ILSVRC2012_val_00037234.JPEG n03179701/
+mv val/ILSVRC2012_val_00037235.JPEG n01984695/
+mv val/ILSVRC2012_val_00037236.JPEG n04296562/
+mv val/ILSVRC2012_val_00037237.JPEG n04507155/
+mv val/ILSVRC2012_val_00037238.JPEG n04328186/
+mv val/ILSVRC2012_val_00037239.JPEG n01534433/
+mv val/ILSVRC2012_val_00037240.JPEG n02494079/
+mv val/ILSVRC2012_val_00037241.JPEG n03916031/
+mv val/ILSVRC2012_val_00037242.JPEG n04376876/
+mv val/ILSVRC2012_val_00037243.JPEG n02093428/
+mv val/ILSVRC2012_val_00037244.JPEG n01843383/
+mv val/ILSVRC2012_val_00037245.JPEG n01924916/
+mv val/ILSVRC2012_val_00037246.JPEG n03207743/
+mv val/ILSVRC2012_val_00037247.JPEG n07747607/
+mv val/ILSVRC2012_val_00037248.JPEG n03785016/
+mv val/ILSVRC2012_val_00037249.JPEG n03388549/
+mv val/ILSVRC2012_val_00037250.JPEG n02113624/
+mv val/ILSVRC2012_val_00037251.JPEG n03961711/
+mv val/ILSVRC2012_val_00037252.JPEG n02086646/
+mv val/ILSVRC2012_val_00037253.JPEG n02134084/
+mv val/ILSVRC2012_val_00037254.JPEG n04606251/
+mv val/ILSVRC2012_val_00037255.JPEG n04493381/
+mv val/ILSVRC2012_val_00037256.JPEG n02096585/
+mv val/ILSVRC2012_val_00037257.JPEG n02992529/
+mv val/ILSVRC2012_val_00037258.JPEG n03891332/
+mv val/ILSVRC2012_val_00037259.JPEG n01616318/
+mv val/ILSVRC2012_val_00037260.JPEG n01496331/
+mv val/ILSVRC2012_val_00037261.JPEG n01694178/
+mv val/ILSVRC2012_val_00037262.JPEG n01695060/
+mv val/ILSVRC2012_val_00037263.JPEG n04026417/
+mv val/ILSVRC2012_val_00037264.JPEG n01695060/
+mv val/ILSVRC2012_val_00037265.JPEG n02117135/
+mv val/ILSVRC2012_val_00037266.JPEG n03584254/
+mv val/ILSVRC2012_val_00037267.JPEG n04336792/
+mv val/ILSVRC2012_val_00037268.JPEG n01698640/
+mv val/ILSVRC2012_val_00037269.JPEG n02177972/
+mv val/ILSVRC2012_val_00037270.JPEG n04532670/
+mv val/ILSVRC2012_val_00037271.JPEG n02859443/
+mv val/ILSVRC2012_val_00037272.JPEG n02095889/
+mv val/ILSVRC2012_val_00037273.JPEG n01682714/
+mv val/ILSVRC2012_val_00037274.JPEG n11879895/
+mv val/ILSVRC2012_val_00037275.JPEG n02114855/
+mv val/ILSVRC2012_val_00037276.JPEG n02484975/
+mv val/ILSVRC2012_val_00037277.JPEG n02097047/
+mv val/ILSVRC2012_val_00037278.JPEG n04204238/
+mv val/ILSVRC2012_val_00037279.JPEG n04604644/
+mv val/ILSVRC2012_val_00037280.JPEG n01775062/
+mv val/ILSVRC2012_val_00037281.JPEG n03775071/
+mv val/ILSVRC2012_val_00037282.JPEG n01773549/
+mv val/ILSVRC2012_val_00037283.JPEG n03956157/
+mv val/ILSVRC2012_val_00037284.JPEG n03792972/
+mv val/ILSVRC2012_val_00037285.JPEG n04404412/
+mv val/ILSVRC2012_val_00037286.JPEG n09835506/
+mv val/ILSVRC2012_val_00037287.JPEG n07717556/
+mv val/ILSVRC2012_val_00037288.JPEG n02037110/
+mv val/ILSVRC2012_val_00037289.JPEG n02361337/
+mv val/ILSVRC2012_val_00037290.JPEG n02105412/
+mv val/ILSVRC2012_val_00037291.JPEG n04447861/
+mv val/ILSVRC2012_val_00037292.JPEG n02835271/
+mv val/ILSVRC2012_val_00037293.JPEG n03240683/
+mv val/ILSVRC2012_val_00037294.JPEG n07613480/
+mv val/ILSVRC2012_val_00037295.JPEG n02422699/
+mv val/ILSVRC2012_val_00037296.JPEG n02488702/
+mv val/ILSVRC2012_val_00037297.JPEG n01776313/
+mv val/ILSVRC2012_val_00037298.JPEG n04579432/
+mv val/ILSVRC2012_val_00037299.JPEG n04116512/
+mv val/ILSVRC2012_val_00037300.JPEG n03857828/
+mv val/ILSVRC2012_val_00037301.JPEG n02676566/
+mv val/ILSVRC2012_val_00037302.JPEG n03063599/
+mv val/ILSVRC2012_val_00037303.JPEG n02397096/
+mv val/ILSVRC2012_val_00037304.JPEG n02977058/
+mv val/ILSVRC2012_val_00037305.JPEG n02089867/
+mv val/ILSVRC2012_val_00037306.JPEG n04429376/
+mv val/ILSVRC2012_val_00037307.JPEG n03018349/
+mv val/ILSVRC2012_val_00037308.JPEG n13037406/
+mv val/ILSVRC2012_val_00037309.JPEG n03998194/
+mv val/ILSVRC2012_val_00037310.JPEG n01693334/
+mv val/ILSVRC2012_val_00037311.JPEG n01770081/
+mv val/ILSVRC2012_val_00037312.JPEG n03991062/
+mv val/ILSVRC2012_val_00037313.JPEG n03141823/
+mv val/ILSVRC2012_val_00037314.JPEG n03691459/
+mv val/ILSVRC2012_val_00037315.JPEG n04039381/
+mv val/ILSVRC2012_val_00037316.JPEG n02894605/
+mv val/ILSVRC2012_val_00037317.JPEG n02096177/
+mv val/ILSVRC2012_val_00037318.JPEG n02093256/
+mv val/ILSVRC2012_val_00037319.JPEG n02917067/
+mv val/ILSVRC2012_val_00037320.JPEG n03791053/
+mv val/ILSVRC2012_val_00037321.JPEG n03976467/
+mv val/ILSVRC2012_val_00037322.JPEG n02795169/
+mv val/ILSVRC2012_val_00037323.JPEG n02112706/
+mv val/ILSVRC2012_val_00037324.JPEG n01692333/
+mv val/ILSVRC2012_val_00037325.JPEG n02111129/
+mv val/ILSVRC2012_val_00037326.JPEG n03110669/
+mv val/ILSVRC2012_val_00037327.JPEG n03803284/
+mv val/ILSVRC2012_val_00037328.JPEG n01592084/
+mv val/ILSVRC2012_val_00037329.JPEG n02514041/
+mv val/ILSVRC2012_val_00037330.JPEG n02104365/
+mv val/ILSVRC2012_val_00037331.JPEG n02089867/
+mv val/ILSVRC2012_val_00037332.JPEG n07860988/
+mv val/ILSVRC2012_val_00037333.JPEG n02093256/
+mv val/ILSVRC2012_val_00037334.JPEG n02403003/
+mv val/ILSVRC2012_val_00037335.JPEG n04522168/
+mv val/ILSVRC2012_val_00037336.JPEG n02837789/
+mv val/ILSVRC2012_val_00037337.JPEG n01855032/
+mv val/ILSVRC2012_val_00037338.JPEG n02793495/
+mv val/ILSVRC2012_val_00037339.JPEG n02093991/
+mv val/ILSVRC2012_val_00037340.JPEG n02437312/
+mv val/ILSVRC2012_val_00037341.JPEG n02980441/
+mv val/ILSVRC2012_val_00037342.JPEG n04116512/
+mv val/ILSVRC2012_val_00037343.JPEG n02120079/
+mv val/ILSVRC2012_val_00037344.JPEG n04371774/
+mv val/ILSVRC2012_val_00037345.JPEG n02104365/
+mv val/ILSVRC2012_val_00037346.JPEG n04153751/
+mv val/ILSVRC2012_val_00037347.JPEG n02091635/
+mv val/ILSVRC2012_val_00037348.JPEG n01775062/
+mv val/ILSVRC2012_val_00037349.JPEG n04310018/
+mv val/ILSVRC2012_val_00037350.JPEG n03529860/
+mv val/ILSVRC2012_val_00037351.JPEG n02105162/
+mv val/ILSVRC2012_val_00037352.JPEG n02814860/
+mv val/ILSVRC2012_val_00037353.JPEG n02088364/
+mv val/ILSVRC2012_val_00037354.JPEG n02116738/
+mv val/ILSVRC2012_val_00037355.JPEG n03630383/
+mv val/ILSVRC2012_val_00037356.JPEG n02229544/
+mv val/ILSVRC2012_val_00037357.JPEG n04111531/
+mv val/ILSVRC2012_val_00037358.JPEG n01882714/
+mv val/ILSVRC2012_val_00037359.JPEG n01917289/
+mv val/ILSVRC2012_val_00037360.JPEG n03877472/
+mv val/ILSVRC2012_val_00037361.JPEG n02346627/
+mv val/ILSVRC2012_val_00037362.JPEG n03476991/
+mv val/ILSVRC2012_val_00037363.JPEG n02115641/
+mv val/ILSVRC2012_val_00037364.JPEG n03110669/
+mv val/ILSVRC2012_val_00037365.JPEG n02799071/
+mv val/ILSVRC2012_val_00037366.JPEG n03272562/
+mv val/ILSVRC2012_val_00037367.JPEG n01729322/
+mv val/ILSVRC2012_val_00037368.JPEG n03599486/
+mv val/ILSVRC2012_val_00037369.JPEG n03445777/
+mv val/ILSVRC2012_val_00037370.JPEG n04099969/
+mv val/ILSVRC2012_val_00037371.JPEG n02536864/
+mv val/ILSVRC2012_val_00037372.JPEG n03026506/
+mv val/ILSVRC2012_val_00037373.JPEG n03899768/
+mv val/ILSVRC2012_val_00037374.JPEG n04485082/
+mv val/ILSVRC2012_val_00037375.JPEG n01440764/
+mv val/ILSVRC2012_val_00037376.JPEG n04370456/
+mv val/ILSVRC2012_val_00037377.JPEG n04125021/
+mv val/ILSVRC2012_val_00037378.JPEG n07565083/
+mv val/ILSVRC2012_val_00037379.JPEG n02012849/
+mv val/ILSVRC2012_val_00037380.JPEG n02437616/
+mv val/ILSVRC2012_val_00037381.JPEG n02281406/
+mv val/ILSVRC2012_val_00037382.JPEG n03141823/
+mv val/ILSVRC2012_val_00037383.JPEG n01440764/
+mv val/ILSVRC2012_val_00037384.JPEG n04548362/
+mv val/ILSVRC2012_val_00037385.JPEG n03584254/
+mv val/ILSVRC2012_val_00037386.JPEG n04366367/
+mv val/ILSVRC2012_val_00037387.JPEG n04069434/
+mv val/ILSVRC2012_val_00037388.JPEG n02108551/
+mv val/ILSVRC2012_val_00037389.JPEG n07697313/
+mv val/ILSVRC2012_val_00037390.JPEG n02916936/
+mv val/ILSVRC2012_val_00037391.JPEG n03124043/
+mv val/ILSVRC2012_val_00037392.JPEG n01697457/
+mv val/ILSVRC2012_val_00037393.JPEG n02095570/
+mv val/ILSVRC2012_val_00037394.JPEG n03016953/
+mv val/ILSVRC2012_val_00037395.JPEG n02441942/
+mv val/ILSVRC2012_val_00037396.JPEG n02106382/
+mv val/ILSVRC2012_val_00037397.JPEG n01833805/
+mv val/ILSVRC2012_val_00037398.JPEG n03045698/
+mv val/ILSVRC2012_val_00037399.JPEG n04404412/
+mv val/ILSVRC2012_val_00037400.JPEG n03888605/
+mv val/ILSVRC2012_val_00037401.JPEG n04259630/
+mv val/ILSVRC2012_val_00037402.JPEG n03075370/
+mv val/ILSVRC2012_val_00037403.JPEG n03124170/
+mv val/ILSVRC2012_val_00037404.JPEG n03534580/
+mv val/ILSVRC2012_val_00037405.JPEG n04277352/
+mv val/ILSVRC2012_val_00037406.JPEG n03717622/
+mv val/ILSVRC2012_val_00037407.JPEG n02526121/
+mv val/ILSVRC2012_val_00037408.JPEG n01797886/
+mv val/ILSVRC2012_val_00037409.JPEG n04133789/
+mv val/ILSVRC2012_val_00037410.JPEG n02105855/
+mv val/ILSVRC2012_val_00037411.JPEG n03530642/
+mv val/ILSVRC2012_val_00037412.JPEG n02130308/
+mv val/ILSVRC2012_val_00037413.JPEG n01980166/
+mv val/ILSVRC2012_val_00037414.JPEG n04192698/
+mv val/ILSVRC2012_val_00037415.JPEG n04336792/
+mv val/ILSVRC2012_val_00037416.JPEG n07742313/
+mv val/ILSVRC2012_val_00037417.JPEG n01692333/
+mv val/ILSVRC2012_val_00037418.JPEG n02279972/
+mv val/ILSVRC2012_val_00037419.JPEG n04371430/
+mv val/ILSVRC2012_val_00037420.JPEG n01592084/
+mv val/ILSVRC2012_val_00037421.JPEG n09332890/
+mv val/ILSVRC2012_val_00037422.JPEG n04332243/
+mv val/ILSVRC2012_val_00037423.JPEG n04392985/
+mv val/ILSVRC2012_val_00037424.JPEG n07720875/
+mv val/ILSVRC2012_val_00037425.JPEG n03478589/
+mv val/ILSVRC2012_val_00037426.JPEG n03291819/
+mv val/ILSVRC2012_val_00037427.JPEG n04560804/
+mv val/ILSVRC2012_val_00037428.JPEG n02106030/
+mv val/ILSVRC2012_val_00037429.JPEG n04049303/
+mv val/ILSVRC2012_val_00037430.JPEG n02927161/
+mv val/ILSVRC2012_val_00037431.JPEG n07753113/
+mv val/ILSVRC2012_val_00037432.JPEG n04065272/
+mv val/ILSVRC2012_val_00037433.JPEG n02835271/
+mv val/ILSVRC2012_val_00037434.JPEG n03047690/
+mv val/ILSVRC2012_val_00037435.JPEG n03538406/
+mv val/ILSVRC2012_val_00037436.JPEG n01582220/
+mv val/ILSVRC2012_val_00037437.JPEG n02113624/
+mv val/ILSVRC2012_val_00037438.JPEG n03792782/
+mv val/ILSVRC2012_val_00037439.JPEG n04116512/
+mv val/ILSVRC2012_val_00037440.JPEG n02093859/
+mv val/ILSVRC2012_val_00037441.JPEG n03961711/
+mv val/ILSVRC2012_val_00037442.JPEG n02109047/
+mv val/ILSVRC2012_val_00037443.JPEG n07831146/
+mv val/ILSVRC2012_val_00037444.JPEG n02825657/
+mv val/ILSVRC2012_val_00037445.JPEG n13054560/
+mv val/ILSVRC2012_val_00037446.JPEG n02951585/
+mv val/ILSVRC2012_val_00037447.JPEG n02442845/
+mv val/ILSVRC2012_val_00037448.JPEG n02817516/
+mv val/ILSVRC2012_val_00037449.JPEG n03874599/
+mv val/ILSVRC2012_val_00037450.JPEG n02093859/
+mv val/ILSVRC2012_val_00037451.JPEG n01755581/
+mv val/ILSVRC2012_val_00037452.JPEG n02860847/
+mv val/ILSVRC2012_val_00037453.JPEG n02167151/
+mv val/ILSVRC2012_val_00037454.JPEG n01537544/
+mv val/ILSVRC2012_val_00037455.JPEG n02099601/
+mv val/ILSVRC2012_val_00037456.JPEG n02111500/
+mv val/ILSVRC2012_val_00037457.JPEG n03670208/
+mv val/ILSVRC2012_val_00037458.JPEG n03179701/
+mv val/ILSVRC2012_val_00037459.JPEG n02093647/
+mv val/ILSVRC2012_val_00037460.JPEG n03444034/
+mv val/ILSVRC2012_val_00037461.JPEG n03131574/
+mv val/ILSVRC2012_val_00037462.JPEG n02111500/
+mv val/ILSVRC2012_val_00037463.JPEG n04069434/
+mv val/ILSVRC2012_val_00037464.JPEG n01744401/
+mv val/ILSVRC2012_val_00037465.JPEG n03220513/
+mv val/ILSVRC2012_val_00037466.JPEG n03393912/
+mv val/ILSVRC2012_val_00037467.JPEG n02486261/
+mv val/ILSVRC2012_val_00037468.JPEG n03372029/
+mv val/ILSVRC2012_val_00037469.JPEG n01728572/
+mv val/ILSVRC2012_val_00037470.JPEG n02422106/
+mv val/ILSVRC2012_val_00037471.JPEG n01833805/
+mv val/ILSVRC2012_val_00037472.JPEG n03594734/
+mv val/ILSVRC2012_val_00037473.JPEG n13044778/
+mv val/ILSVRC2012_val_00037474.JPEG n02074367/
+mv val/ILSVRC2012_val_00037475.JPEG n02391049/
+mv val/ILSVRC2012_val_00037476.JPEG n07873807/
+mv val/ILSVRC2012_val_00037477.JPEG n09468604/
+mv val/ILSVRC2012_val_00037478.JPEG n02799071/
+mv val/ILSVRC2012_val_00037479.JPEG n03832673/
+mv val/ILSVRC2012_val_00037480.JPEG n02361337/
+mv val/ILSVRC2012_val_00037481.JPEG n02111277/
+mv val/ILSVRC2012_val_00037482.JPEG n04204238/
+mv val/ILSVRC2012_val_00037483.JPEG n02172182/
+mv val/ILSVRC2012_val_00037484.JPEG n04562935/
+mv val/ILSVRC2012_val_00037485.JPEG n02100735/
+mv val/ILSVRC2012_val_00037486.JPEG n02007558/
+mv val/ILSVRC2012_val_00037487.JPEG n03630383/
+mv val/ILSVRC2012_val_00037488.JPEG n01484850/
+mv val/ILSVRC2012_val_00037489.JPEG n02484975/
+mv val/ILSVRC2012_val_00037490.JPEG n02096051/
+mv val/ILSVRC2012_val_00037491.JPEG n02206856/
+mv val/ILSVRC2012_val_00037492.JPEG n03770679/
+mv val/ILSVRC2012_val_00037493.JPEG n04265275/
+mv val/ILSVRC2012_val_00037494.JPEG n09246464/
+mv val/ILSVRC2012_val_00037495.JPEG n09835506/
+mv val/ILSVRC2012_val_00037496.JPEG n07614500/
+mv val/ILSVRC2012_val_00037497.JPEG n09472597/
+mv val/ILSVRC2012_val_00037498.JPEG n03379051/
+mv val/ILSVRC2012_val_00037499.JPEG n03457902/
+mv val/ILSVRC2012_val_00037500.JPEG n01855032/
+mv val/ILSVRC2012_val_00037501.JPEG n04201297/
+mv val/ILSVRC2012_val_00037502.JPEG n02951585/
+mv val/ILSVRC2012_val_00037503.JPEG n13133613/
+mv val/ILSVRC2012_val_00037504.JPEG n03770439/
+mv val/ILSVRC2012_val_00037505.JPEG n02172182/
+mv val/ILSVRC2012_val_00037506.JPEG n03992509/
+mv val/ILSVRC2012_val_00037507.JPEG n03617480/
+mv val/ILSVRC2012_val_00037508.JPEG n02802426/
+mv val/ILSVRC2012_val_00037509.JPEG n02676566/
+mv val/ILSVRC2012_val_00037510.JPEG n01687978/
+mv val/ILSVRC2012_val_00037511.JPEG n07711569/
+mv val/ILSVRC2012_val_00037512.JPEG n03690938/
+mv val/ILSVRC2012_val_00037513.JPEG n02869837/
+mv val/ILSVRC2012_val_00037514.JPEG n03942813/
+mv val/ILSVRC2012_val_00037515.JPEG n04332243/
+mv val/ILSVRC2012_val_00037516.JPEG n01491361/
+mv val/ILSVRC2012_val_00037517.JPEG n12768682/
+mv val/ILSVRC2012_val_00037518.JPEG n01910747/
+mv val/ILSVRC2012_val_00037519.JPEG n04179913/
+mv val/ILSVRC2012_val_00037520.JPEG n03627232/
+mv val/ILSVRC2012_val_00037521.JPEG n13037406/
+mv val/ILSVRC2012_val_00037522.JPEG n07745940/
+mv val/ILSVRC2012_val_00037523.JPEG n04152593/
+mv val/ILSVRC2012_val_00037524.JPEG n01806143/
+mv val/ILSVRC2012_val_00037525.JPEG n07565083/
+mv val/ILSVRC2012_val_00037526.JPEG n03627232/
+mv val/ILSVRC2012_val_00037527.JPEG n12267677/
+mv val/ILSVRC2012_val_00037528.JPEG n03837869/
+mv val/ILSVRC2012_val_00037529.JPEG n02094433/
+mv val/ILSVRC2012_val_00037530.JPEG n04238763/
+mv val/ILSVRC2012_val_00037531.JPEG n03496892/
+mv val/ILSVRC2012_val_00037532.JPEG n04612504/
+mv val/ILSVRC2012_val_00037533.JPEG n02807133/
+mv val/ILSVRC2012_val_00037534.JPEG n02106166/
+mv val/ILSVRC2012_val_00037535.JPEG n02484975/
+mv val/ILSVRC2012_val_00037536.JPEG n03208938/
+mv val/ILSVRC2012_val_00037537.JPEG n04065272/
+mv val/ILSVRC2012_val_00037538.JPEG n02107574/
+mv val/ILSVRC2012_val_00037539.JPEG n07715103/
+mv val/ILSVRC2012_val_00037540.JPEG n04517823/
+mv val/ILSVRC2012_val_00037541.JPEG n10565667/
+mv val/ILSVRC2012_val_00037542.JPEG n02807133/
+mv val/ILSVRC2012_val_00037543.JPEG n03717622/
+mv val/ILSVRC2012_val_00037544.JPEG n04557648/
+mv val/ILSVRC2012_val_00037545.JPEG n04591157/
+mv val/ILSVRC2012_val_00037546.JPEG n02326432/
+mv val/ILSVRC2012_val_00037547.JPEG n06874185/
+mv val/ILSVRC2012_val_00037548.JPEG n04442312/
+mv val/ILSVRC2012_val_00037549.JPEG n03042490/
+mv val/ILSVRC2012_val_00037550.JPEG n03188531/
+mv val/ILSVRC2012_val_00037551.JPEG n04487394/
+mv val/ILSVRC2012_val_00037552.JPEG n02006656/
+mv val/ILSVRC2012_val_00037553.JPEG n01729322/
+mv val/ILSVRC2012_val_00037554.JPEG n03929660/
+mv val/ILSVRC2012_val_00037555.JPEG n03425413/
+mv val/ILSVRC2012_val_00037556.JPEG n03216828/
+mv val/ILSVRC2012_val_00037557.JPEG n02346627/
+mv val/ILSVRC2012_val_00037558.JPEG n02526121/
+mv val/ILSVRC2012_val_00037559.JPEG n02089078/
+mv val/ILSVRC2012_val_00037560.JPEG n01669191/
+mv val/ILSVRC2012_val_00037561.JPEG n10565667/
+mv val/ILSVRC2012_val_00037562.JPEG n04376876/
+mv val/ILSVRC2012_val_00037563.JPEG n04258138/
+mv val/ILSVRC2012_val_00037564.JPEG n02489166/
+mv val/ILSVRC2012_val_00037565.JPEG n02493793/
+mv val/ILSVRC2012_val_00037566.JPEG n03584829/
+mv val/ILSVRC2012_val_00037567.JPEG n03379051/
+mv val/ILSVRC2012_val_00037568.JPEG n02094114/
+mv val/ILSVRC2012_val_00037569.JPEG n01514668/
+mv val/ILSVRC2012_val_00037570.JPEG n03770439/
+mv val/ILSVRC2012_val_00037571.JPEG n02231487/
+mv val/ILSVRC2012_val_00037572.JPEG n01855032/
+mv val/ILSVRC2012_val_00037573.JPEG n03180011/
+mv val/ILSVRC2012_val_00037574.JPEG n04606251/
+mv val/ILSVRC2012_val_00037575.JPEG n03916031/
+mv val/ILSVRC2012_val_00037576.JPEG n01774750/
+mv val/ILSVRC2012_val_00037577.JPEG n02087394/
+mv val/ILSVRC2012_val_00037578.JPEG n03297495/
+mv val/ILSVRC2012_val_00037579.JPEG n01968897/
+mv val/ILSVRC2012_val_00037580.JPEG n02105056/
+mv val/ILSVRC2012_val_00037581.JPEG n01491361/
+mv val/ILSVRC2012_val_00037582.JPEG n02114712/
+mv val/ILSVRC2012_val_00037583.JPEG n02097130/
+mv val/ILSVRC2012_val_00037584.JPEG n02692877/
+mv val/ILSVRC2012_val_00037585.JPEG n04125021/
+mv val/ILSVRC2012_val_00037586.JPEG n03476684/
+mv val/ILSVRC2012_val_00037587.JPEG n03658185/
+mv val/ILSVRC2012_val_00037588.JPEG n02966687/
+mv val/ILSVRC2012_val_00037589.JPEG n02259212/
+mv val/ILSVRC2012_val_00037590.JPEG n03355925/
+mv val/ILSVRC2012_val_00037591.JPEG n13133613/
+mv val/ILSVRC2012_val_00037592.JPEG n03394916/
+mv val/ILSVRC2012_val_00037593.JPEG n02107312/
+mv val/ILSVRC2012_val_00037594.JPEG n02788148/
+mv val/ILSVRC2012_val_00037595.JPEG n02109961/
+mv val/ILSVRC2012_val_00037596.JPEG n01440764/
+mv val/ILSVRC2012_val_00037597.JPEG n03124043/
+mv val/ILSVRC2012_val_00037598.JPEG n06359193/
+mv val/ILSVRC2012_val_00037599.JPEG n04133789/
+mv val/ILSVRC2012_val_00037600.JPEG n02500267/
+mv val/ILSVRC2012_val_00037601.JPEG n04209133/
+mv val/ILSVRC2012_val_00037602.JPEG n03344393/
+mv val/ILSVRC2012_val_00037603.JPEG n03494278/
+mv val/ILSVRC2012_val_00037604.JPEG n02977058/
+mv val/ILSVRC2012_val_00037605.JPEG n03710637/
+mv val/ILSVRC2012_val_00037606.JPEG n01622779/
+mv val/ILSVRC2012_val_00037607.JPEG n09421951/
+mv val/ILSVRC2012_val_00037608.JPEG n02790996/
+mv val/ILSVRC2012_val_00037609.JPEG n02089078/
+mv val/ILSVRC2012_val_00037610.JPEG n02256656/
+mv val/ILSVRC2012_val_00037611.JPEG n01531178/
+mv val/ILSVRC2012_val_00037612.JPEG n04479046/
+mv val/ILSVRC2012_val_00037613.JPEG n04141327/
+mv val/ILSVRC2012_val_00037614.JPEG n03000134/
+mv val/ILSVRC2012_val_00037615.JPEG n02504013/
+mv val/ILSVRC2012_val_00037616.JPEG n03627232/
+mv val/ILSVRC2012_val_00037617.JPEG n02114712/
+mv val/ILSVRC2012_val_00037618.JPEG n03325584/
+mv val/ILSVRC2012_val_00037619.JPEG n03773504/
+mv val/ILSVRC2012_val_00037620.JPEG n04004767/
+mv val/ILSVRC2012_val_00037621.JPEG n04266014/
+mv val/ILSVRC2012_val_00037622.JPEG n02977058/
+mv val/ILSVRC2012_val_00037623.JPEG n02125311/
+mv val/ILSVRC2012_val_00037624.JPEG n02281406/
+mv val/ILSVRC2012_val_00037625.JPEG n03291819/
+mv val/ILSVRC2012_val_00037626.JPEG n01675722/
+mv val/ILSVRC2012_val_00037627.JPEG n02138441/
+mv val/ILSVRC2012_val_00037628.JPEG n03804744/
+mv val/ILSVRC2012_val_00037629.JPEG n03000684/
+mv val/ILSVRC2012_val_00037630.JPEG n02114367/
+mv val/ILSVRC2012_val_00037631.JPEG n03187595/
+mv val/ILSVRC2012_val_00037632.JPEG n01943899/
+mv val/ILSVRC2012_val_00037633.JPEG n02125311/
+mv val/ILSVRC2012_val_00037634.JPEG n02113624/
+mv val/ILSVRC2012_val_00037635.JPEG n02823428/
+mv val/ILSVRC2012_val_00037636.JPEG n02233338/
+mv val/ILSVRC2012_val_00037637.JPEG n03110669/
+mv val/ILSVRC2012_val_00037638.JPEG n02500267/
+mv val/ILSVRC2012_val_00037639.JPEG n03594734/
+mv val/ILSVRC2012_val_00037640.JPEG n03347037/
+mv val/ILSVRC2012_val_00037641.JPEG n01990800/
+mv val/ILSVRC2012_val_00037642.JPEG n02074367/
+mv val/ILSVRC2012_val_00037643.JPEG n02396427/
+mv val/ILSVRC2012_val_00037644.JPEG n03954731/
+mv val/ILSVRC2012_val_00037645.JPEG n02687172/
+mv val/ILSVRC2012_val_00037646.JPEG n02883205/
+mv val/ILSVRC2012_val_00037647.JPEG n03127925/
+mv val/ILSVRC2012_val_00037648.JPEG n02111500/
+mv val/ILSVRC2012_val_00037649.JPEG n07718747/
+mv val/ILSVRC2012_val_00037650.JPEG n02447366/
+mv val/ILSVRC2012_val_00037651.JPEG n04286575/
+mv val/ILSVRC2012_val_00037652.JPEG n02930766/
+mv val/ILSVRC2012_val_00037653.JPEG n01664065/
+mv val/ILSVRC2012_val_00037654.JPEG n04153751/
+mv val/ILSVRC2012_val_00037655.JPEG n01687978/
+mv val/ILSVRC2012_val_00037656.JPEG n02422699/
+mv val/ILSVRC2012_val_00037657.JPEG n02791270/
+mv val/ILSVRC2012_val_00037658.JPEG n02835271/
+mv val/ILSVRC2012_val_00037659.JPEG n02504458/
+mv val/ILSVRC2012_val_00037660.JPEG n01917289/
+mv val/ILSVRC2012_val_00037661.JPEG n04252077/
+mv val/ILSVRC2012_val_00037662.JPEG n04548280/
+mv val/ILSVRC2012_val_00037663.JPEG n03089624/
+mv val/ILSVRC2012_val_00037664.JPEG n07590611/
+mv val/ILSVRC2012_val_00037665.JPEG n07754684/
+mv val/ILSVRC2012_val_00037666.JPEG n01739381/
+mv val/ILSVRC2012_val_00037667.JPEG n04483307/
+mv val/ILSVRC2012_val_00037668.JPEG n01914609/
+mv val/ILSVRC2012_val_00037669.JPEG n02087046/
+mv val/ILSVRC2012_val_00037670.JPEG n03697007/
+mv val/ILSVRC2012_val_00037671.JPEG n04039381/
+mv val/ILSVRC2012_val_00037672.JPEG n01820546/
+mv val/ILSVRC2012_val_00037673.JPEG n04355338/
+mv val/ILSVRC2012_val_00037674.JPEG n02100735/
+mv val/ILSVRC2012_val_00037675.JPEG n03032252/
+mv val/ILSVRC2012_val_00037676.JPEG n02091467/
+mv val/ILSVRC2012_val_00037677.JPEG n01728572/
+mv val/ILSVRC2012_val_00037678.JPEG n02002556/
+mv val/ILSVRC2012_val_00037679.JPEG n03874599/
+mv val/ILSVRC2012_val_00037680.JPEG n02859443/
+mv val/ILSVRC2012_val_00037681.JPEG n04146614/
+mv val/ILSVRC2012_val_00037682.JPEG n03534580/
+mv val/ILSVRC2012_val_00037683.JPEG n04532106/
+mv val/ILSVRC2012_val_00037684.JPEG n01981276/
+mv val/ILSVRC2012_val_00037685.JPEG n03814639/
+mv val/ILSVRC2012_val_00037686.JPEG n01689811/
+mv val/ILSVRC2012_val_00037687.JPEG n06359193/
+mv val/ILSVRC2012_val_00037688.JPEG n01675722/
+mv val/ILSVRC2012_val_00037689.JPEG n03888605/
+mv val/ILSVRC2012_val_00037690.JPEG n07714990/
+mv val/ILSVRC2012_val_00037691.JPEG n04476259/
+mv val/ILSVRC2012_val_00037692.JPEG n02536864/
+mv val/ILSVRC2012_val_00037693.JPEG n02492035/
+mv val/ILSVRC2012_val_00037694.JPEG n04265275/
+mv val/ILSVRC2012_val_00037695.JPEG n02948072/
+mv val/ILSVRC2012_val_00037696.JPEG n03804744/
+mv val/ILSVRC2012_val_00037697.JPEG n04380533/
+mv val/ILSVRC2012_val_00037698.JPEG n01518878/
+mv val/ILSVRC2012_val_00037699.JPEG n04005630/
+mv val/ILSVRC2012_val_00037700.JPEG n07590611/
+mv val/ILSVRC2012_val_00037701.JPEG n04417672/
+mv val/ILSVRC2012_val_00037702.JPEG n03709823/
+mv val/ILSVRC2012_val_00037703.JPEG n02105412/
+mv val/ILSVRC2012_val_00037704.JPEG n02363005/
+mv val/ILSVRC2012_val_00037705.JPEG n01494475/
+mv val/ILSVRC2012_val_00037706.JPEG n03680355/
+mv val/ILSVRC2012_val_00037707.JPEG n02951358/
+mv val/ILSVRC2012_val_00037708.JPEG n04597913/
+mv val/ILSVRC2012_val_00037709.JPEG n03998194/
+mv val/ILSVRC2012_val_00037710.JPEG n01855032/
+mv val/ILSVRC2012_val_00037711.JPEG n02018795/
+mv val/ILSVRC2012_val_00037712.JPEG n03271574/
+mv val/ILSVRC2012_val_00037713.JPEG n02167151/
+mv val/ILSVRC2012_val_00037714.JPEG n02009912/
+mv val/ILSVRC2012_val_00037715.JPEG n03825788/
+mv val/ILSVRC2012_val_00037716.JPEG n04482393/
+mv val/ILSVRC2012_val_00037717.JPEG n01774750/
+mv val/ILSVRC2012_val_00037718.JPEG n02500267/
+mv val/ILSVRC2012_val_00037719.JPEG n01514859/
+mv val/ILSVRC2012_val_00037720.JPEG n03908618/
+mv val/ILSVRC2012_val_00037721.JPEG n03761084/
+mv val/ILSVRC2012_val_00037722.JPEG n03633091/
+mv val/ILSVRC2012_val_00037723.JPEG n02096177/
+mv val/ILSVRC2012_val_00037724.JPEG n03729826/
+mv val/ILSVRC2012_val_00037725.JPEG n07717556/
+mv val/ILSVRC2012_val_00037726.JPEG n03670208/
+mv val/ILSVRC2012_val_00037727.JPEG n01773797/
+mv val/ILSVRC2012_val_00037728.JPEG n04554684/
+mv val/ILSVRC2012_val_00037729.JPEG n01697457/
+mv val/ILSVRC2012_val_00037730.JPEG n03691459/
+mv val/ILSVRC2012_val_00037731.JPEG n02138441/
+mv val/ILSVRC2012_val_00037732.JPEG n03764736/
+mv val/ILSVRC2012_val_00037733.JPEG n02123394/
+mv val/ILSVRC2012_val_00037734.JPEG n04192698/
+mv val/ILSVRC2012_val_00037735.JPEG n04120489/
+mv val/ILSVRC2012_val_00037736.JPEG n07615774/
+mv val/ILSVRC2012_val_00037737.JPEG n03929855/
+mv val/ILSVRC2012_val_00037738.JPEG n02494079/
+mv val/ILSVRC2012_val_00037739.JPEG n01669191/
+mv val/ILSVRC2012_val_00037740.JPEG n01498041/
+mv val/ILSVRC2012_val_00037741.JPEG n03250847/
+mv val/ILSVRC2012_val_00037742.JPEG n03924679/
+mv val/ILSVRC2012_val_00037743.JPEG n02356798/
+mv val/ILSVRC2012_val_00037744.JPEG n02823750/
+mv val/ILSVRC2012_val_00037745.JPEG n03447721/
+mv val/ILSVRC2012_val_00037746.JPEG n02058221/
+mv val/ILSVRC2012_val_00037747.JPEG n07930864/
+mv val/ILSVRC2012_val_00037748.JPEG n01530575/
+mv val/ILSVRC2012_val_00037749.JPEG n04428191/
+mv val/ILSVRC2012_val_00037750.JPEG n04372370/
+mv val/ILSVRC2012_val_00037751.JPEG n03840681/
+mv val/ILSVRC2012_val_00037752.JPEG n02027492/
+mv val/ILSVRC2012_val_00037753.JPEG n01498041/
+mv val/ILSVRC2012_val_00037754.JPEG n07718472/
+mv val/ILSVRC2012_val_00037755.JPEG n03954731/
+mv val/ILSVRC2012_val_00037756.JPEG n04099969/
+mv val/ILSVRC2012_val_00037757.JPEG n03954731/
+mv val/ILSVRC2012_val_00037758.JPEG n01770081/
+mv val/ILSVRC2012_val_00037759.JPEG n03445924/
+mv val/ILSVRC2012_val_00037760.JPEG n03045698/
+mv val/ILSVRC2012_val_00037761.JPEG n03527444/
+mv val/ILSVRC2012_val_00037762.JPEG n02840245/
+mv val/ILSVRC2012_val_00037763.JPEG n04201297/
+mv val/ILSVRC2012_val_00037764.JPEG n01735189/
+mv val/ILSVRC2012_val_00037765.JPEG n01986214/
+mv val/ILSVRC2012_val_00037766.JPEG n02002724/
+mv val/ILSVRC2012_val_00037767.JPEG n02113978/
+mv val/ILSVRC2012_val_00037768.JPEG n02177972/
+mv val/ILSVRC2012_val_00037769.JPEG n03908714/
+mv val/ILSVRC2012_val_00037770.JPEG n03888257/
+mv val/ILSVRC2012_val_00037771.JPEG n02100236/
+mv val/ILSVRC2012_val_00037772.JPEG n02437312/
+mv val/ILSVRC2012_val_00037773.JPEG n02236044/
+mv val/ILSVRC2012_val_00037774.JPEG n07871810/
+mv val/ILSVRC2012_val_00037775.JPEG n03775071/
+mv val/ILSVRC2012_val_00037776.JPEG n03947888/
+mv val/ILSVRC2012_val_00037777.JPEG n03933933/
+mv val/ILSVRC2012_val_00037778.JPEG n02066245/
+mv val/ILSVRC2012_val_00037779.JPEG n02128385/
+mv val/ILSVRC2012_val_00037780.JPEG n01491361/
+mv val/ILSVRC2012_val_00037781.JPEG n02493509/
+mv val/ILSVRC2012_val_00037782.JPEG n07717556/
+mv val/ILSVRC2012_val_00037783.JPEG n02865351/
+mv val/ILSVRC2012_val_00037784.JPEG n03187595/
+mv val/ILSVRC2012_val_00037785.JPEG n02666196/
+mv val/ILSVRC2012_val_00037786.JPEG n01917289/
+mv val/ILSVRC2012_val_00037787.JPEG n01770081/
+mv val/ILSVRC2012_val_00037788.JPEG n02788148/
+mv val/ILSVRC2012_val_00037789.JPEG n03661043/
+mv val/ILSVRC2012_val_00037790.JPEG n02481823/
+mv val/ILSVRC2012_val_00037791.JPEG n02085620/
+mv val/ILSVRC2012_val_00037792.JPEG n02799071/
+mv val/ILSVRC2012_val_00037793.JPEG n03590841/
+mv val/ILSVRC2012_val_00037794.JPEG n01749939/
+mv val/ILSVRC2012_val_00037795.JPEG n01614925/
+mv val/ILSVRC2012_val_00037796.JPEG n02950826/
+mv val/ILSVRC2012_val_00037797.JPEG n02088632/
+mv val/ILSVRC2012_val_00037798.JPEG n01498041/
+mv val/ILSVRC2012_val_00037799.JPEG n02105162/
+mv val/ILSVRC2012_val_00037800.JPEG n01737021/
+mv val/ILSVRC2012_val_00037801.JPEG n02690373/
+mv val/ILSVRC2012_val_00037802.JPEG n03584254/
+mv val/ILSVRC2012_val_00037803.JPEG n02791124/
+mv val/ILSVRC2012_val_00037804.JPEG n02088238/
+mv val/ILSVRC2012_val_00037805.JPEG n04328186/
+mv val/ILSVRC2012_val_00037806.JPEG n01582220/
+mv val/ILSVRC2012_val_00037807.JPEG n02231487/
+mv val/ILSVRC2012_val_00037808.JPEG n03717622/
+mv val/ILSVRC2012_val_00037809.JPEG n01751748/
+mv val/ILSVRC2012_val_00037810.JPEG n03721384/
+mv val/ILSVRC2012_val_00037811.JPEG n02108422/
+mv val/ILSVRC2012_val_00037812.JPEG n01669191/
+mv val/ILSVRC2012_val_00037813.JPEG n02980441/
+mv val/ILSVRC2012_val_00037814.JPEG n04243546/
+mv val/ILSVRC2012_val_00037815.JPEG n03982430/
+mv val/ILSVRC2012_val_00037816.JPEG n02422106/
+mv val/ILSVRC2012_val_00037817.JPEG n03014705/
+mv val/ILSVRC2012_val_00037818.JPEG n04371774/
+mv val/ILSVRC2012_val_00037819.JPEG n04125021/
+mv val/ILSVRC2012_val_00037820.JPEG n02090622/
+mv val/ILSVRC2012_val_00037821.JPEG n01930112/
+mv val/ILSVRC2012_val_00037822.JPEG n04552348/
+mv val/ILSVRC2012_val_00037823.JPEG n03764736/
+mv val/ILSVRC2012_val_00037824.JPEG n01582220/
+mv val/ILSVRC2012_val_00037825.JPEG n02056570/
+mv val/ILSVRC2012_val_00037826.JPEG n02089973/
+mv val/ILSVRC2012_val_00037827.JPEG n09399592/
+mv val/ILSVRC2012_val_00037828.JPEG n03450230/
+mv val/ILSVRC2012_val_00037829.JPEG n03770679/
+mv val/ILSVRC2012_val_00037830.JPEG n03445924/
+mv val/ILSVRC2012_val_00037831.JPEG n02007558/
+mv val/ILSVRC2012_val_00037832.JPEG n02268443/
+mv val/ILSVRC2012_val_00037833.JPEG n02396427/
+mv val/ILSVRC2012_val_00037834.JPEG n01440764/
+mv val/ILSVRC2012_val_00037835.JPEG n03062245/
+mv val/ILSVRC2012_val_00037836.JPEG n02134418/
+mv val/ILSVRC2012_val_00037837.JPEG n03594734/
+mv val/ILSVRC2012_val_00037838.JPEG n02094433/
+mv val/ILSVRC2012_val_00037839.JPEG n04264628/
+mv val/ILSVRC2012_val_00037840.JPEG n02992211/
+mv val/ILSVRC2012_val_00037841.JPEG n02093428/
+mv val/ILSVRC2012_val_00037842.JPEG n02100735/
+mv val/ILSVRC2012_val_00037843.JPEG n04367480/
+mv val/ILSVRC2012_val_00037844.JPEG n03764736/
+mv val/ILSVRC2012_val_00037845.JPEG n03041632/
+mv val/ILSVRC2012_val_00037846.JPEG n01443537/
+mv val/ILSVRC2012_val_00037847.JPEG n03476684/
+mv val/ILSVRC2012_val_00037848.JPEG n09229709/
+mv val/ILSVRC2012_val_00037849.JPEG n04355338/
+mv val/ILSVRC2012_val_00037850.JPEG n02128385/
+mv val/ILSVRC2012_val_00037851.JPEG n04550184/
+mv val/ILSVRC2012_val_00037852.JPEG n01806567/
+mv val/ILSVRC2012_val_00037853.JPEG n02098413/
+mv val/ILSVRC2012_val_00037854.JPEG n04086273/
+mv val/ILSVRC2012_val_00037855.JPEG n02090379/
+mv val/ILSVRC2012_val_00037856.JPEG n03958227/
+mv val/ILSVRC2012_val_00037857.JPEG n02091467/
+mv val/ILSVRC2012_val_00037858.JPEG n02108000/
+mv val/ILSVRC2012_val_00037859.JPEG n03658185/
+mv val/ILSVRC2012_val_00037860.JPEG n02843684/
+mv val/ILSVRC2012_val_00037861.JPEG n01440764/
+mv val/ILSVRC2012_val_00037862.JPEG n02981792/
+mv val/ILSVRC2012_val_00037863.JPEG n07892512/
+mv val/ILSVRC2012_val_00037864.JPEG n03297495/
+mv val/ILSVRC2012_val_00037865.JPEG n03692522/
+mv val/ILSVRC2012_val_00037866.JPEG n03937543/
+mv val/ILSVRC2012_val_00037867.JPEG n03691459/
+mv val/ILSVRC2012_val_00037868.JPEG n03240683/
+mv val/ILSVRC2012_val_00037869.JPEG n02977058/
+mv val/ILSVRC2012_val_00037870.JPEG n07730033/
+mv val/ILSVRC2012_val_00037871.JPEG n04591713/
+mv val/ILSVRC2012_val_00037872.JPEG n11939491/
+mv val/ILSVRC2012_val_00037873.JPEG n03902125/
+mv val/ILSVRC2012_val_00037874.JPEG n02783161/
+mv val/ILSVRC2012_val_00037875.JPEG n04355338/
+mv val/ILSVRC2012_val_00037876.JPEG n02281406/
+mv val/ILSVRC2012_val_00037877.JPEG n03538406/
+mv val/ILSVRC2012_val_00037878.JPEG n01608432/
+mv val/ILSVRC2012_val_00037879.JPEG n03935335/
+mv val/ILSVRC2012_val_00037880.JPEG n01983481/
+mv val/ILSVRC2012_val_00037881.JPEG n02730930/
+mv val/ILSVRC2012_val_00037882.JPEG n01968897/
+mv val/ILSVRC2012_val_00037883.JPEG n03769881/
+mv val/ILSVRC2012_val_00037884.JPEG n04493381/
+mv val/ILSVRC2012_val_00037885.JPEG n02112018/
+mv val/ILSVRC2012_val_00037886.JPEG n02391049/
+mv val/ILSVRC2012_val_00037887.JPEG n04389033/
+mv val/ILSVRC2012_val_00037888.JPEG n03775546/
+mv val/ILSVRC2012_val_00037889.JPEG n02172182/
+mv val/ILSVRC2012_val_00037890.JPEG n09399592/
+mv val/ILSVRC2012_val_00037891.JPEG n02093991/
+mv val/ILSVRC2012_val_00037892.JPEG n01806143/
+mv val/ILSVRC2012_val_00037893.JPEG n02226429/
+mv val/ILSVRC2012_val_00037894.JPEG n01669191/
+mv val/ILSVRC2012_val_00037895.JPEG n04125021/
+mv val/ILSVRC2012_val_00037896.JPEG n02113712/
+mv val/ILSVRC2012_val_00037897.JPEG n02860847/
+mv val/ILSVRC2012_val_00037898.JPEG n02074367/
+mv val/ILSVRC2012_val_00037899.JPEG n02447366/
+mv val/ILSVRC2012_val_00037900.JPEG n02783161/
+mv val/ILSVRC2012_val_00037901.JPEG n02454379/
+mv val/ILSVRC2012_val_00037902.JPEG n01984695/
+mv val/ILSVRC2012_val_00037903.JPEG n03721384/
+mv val/ILSVRC2012_val_00037904.JPEG n03633091/
+mv val/ILSVRC2012_val_00037905.JPEG n03376595/
+mv val/ILSVRC2012_val_00037906.JPEG n02120505/
+mv val/ILSVRC2012_val_00037907.JPEG n02105505/
+mv val/ILSVRC2012_val_00037908.JPEG n04517823/
+mv val/ILSVRC2012_val_00037909.JPEG n03372029/
+mv val/ILSVRC2012_val_00037910.JPEG n03527444/
+mv val/ILSVRC2012_val_00037911.JPEG n03786901/
+mv val/ILSVRC2012_val_00037912.JPEG n03478589/
+mv val/ILSVRC2012_val_00037913.JPEG n02066245/
+mv val/ILSVRC2012_val_00037914.JPEG n07892512/
+mv val/ILSVRC2012_val_00037915.JPEG n01491361/
+mv val/ILSVRC2012_val_00037916.JPEG n02108089/
+mv val/ILSVRC2012_val_00037917.JPEG n03325584/
+mv val/ILSVRC2012_val_00037918.JPEG n03717622/
+mv val/ILSVRC2012_val_00037919.JPEG n03773504/
+mv val/ILSVRC2012_val_00037920.JPEG n01582220/
+mv val/ILSVRC2012_val_00037921.JPEG n03676483/
+mv val/ILSVRC2012_val_00037922.JPEG n04540053/
+mv val/ILSVRC2012_val_00037923.JPEG n07248320/
+mv val/ILSVRC2012_val_00037924.JPEG n04118538/
+mv val/ILSVRC2012_val_00037925.JPEG n02095314/
+mv val/ILSVRC2012_val_00037926.JPEG n12267677/
+mv val/ILSVRC2012_val_00037927.JPEG n03602883/
+mv val/ILSVRC2012_val_00037928.JPEG n02815834/
+mv val/ILSVRC2012_val_00037929.JPEG n03379051/
+mv val/ILSVRC2012_val_00037930.JPEG n02172182/
+mv val/ILSVRC2012_val_00037931.JPEG n02107142/
+mv val/ILSVRC2012_val_00037932.JPEG n06874185/
+mv val/ILSVRC2012_val_00037933.JPEG n01776313/
+mv val/ILSVRC2012_val_00037934.JPEG n07714571/
+mv val/ILSVRC2012_val_00037935.JPEG n01775062/
+mv val/ILSVRC2012_val_00037936.JPEG n03452741/
+mv val/ILSVRC2012_val_00037937.JPEG n03916031/
+mv val/ILSVRC2012_val_00037938.JPEG n04118538/
+mv val/ILSVRC2012_val_00037939.JPEG n01580077/
+mv val/ILSVRC2012_val_00037940.JPEG n02497673/
+mv val/ILSVRC2012_val_00037941.JPEG n01518878/
+mv val/ILSVRC2012_val_00037942.JPEG n03673027/
+mv val/ILSVRC2012_val_00037943.JPEG n02101388/
+mv val/ILSVRC2012_val_00037944.JPEG n03187595/
+mv val/ILSVRC2012_val_00037945.JPEG n04350905/
+mv val/ILSVRC2012_val_00037946.JPEG n02408429/
+mv val/ILSVRC2012_val_00037947.JPEG n03417042/
+mv val/ILSVRC2012_val_00037948.JPEG n02514041/
+mv val/ILSVRC2012_val_00037949.JPEG n02116738/
+mv val/ILSVRC2012_val_00037950.JPEG n03476684/
+mv val/ILSVRC2012_val_00037951.JPEG n02497673/
+mv val/ILSVRC2012_val_00037952.JPEG n04285008/
+mv val/ILSVRC2012_val_00037953.JPEG n03126707/
+mv val/ILSVRC2012_val_00037954.JPEG n03544143/
+mv val/ILSVRC2012_val_00037955.JPEG n04147183/
+mv val/ILSVRC2012_val_00037956.JPEG n03481172/
+mv val/ILSVRC2012_val_00037957.JPEG n04041544/
+mv val/ILSVRC2012_val_00037958.JPEG n02268443/
+mv val/ILSVRC2012_val_00037959.JPEG n09472597/
+mv val/ILSVRC2012_val_00037960.JPEG n02085782/
+mv val/ILSVRC2012_val_00037961.JPEG n03400231/
+mv val/ILSVRC2012_val_00037962.JPEG n03954731/
+mv val/ILSVRC2012_val_00037963.JPEG n04074963/
+mv val/ILSVRC2012_val_00037964.JPEG n03782006/
+mv val/ILSVRC2012_val_00037965.JPEG n02281787/
+mv val/ILSVRC2012_val_00037966.JPEG n04023962/
+mv val/ILSVRC2012_val_00037967.JPEG n04008634/
+mv val/ILSVRC2012_val_00037968.JPEG n07875152/
+mv val/ILSVRC2012_val_00037969.JPEG n07716906/
+mv val/ILSVRC2012_val_00037970.JPEG n02109525/
+mv val/ILSVRC2012_val_00037971.JPEG n03995372/
+mv val/ILSVRC2012_val_00037972.JPEG n02096177/
+mv val/ILSVRC2012_val_00037973.JPEG n01981276/
+mv val/ILSVRC2012_val_00037974.JPEG n03884397/
+mv val/ILSVRC2012_val_00037975.JPEG n02509815/
+mv val/ILSVRC2012_val_00037976.JPEG n03529860/
+mv val/ILSVRC2012_val_00037977.JPEG n03584829/
+mv val/ILSVRC2012_val_00037978.JPEG n02268853/
+mv val/ILSVRC2012_val_00037979.JPEG n04141975/
+mv val/ILSVRC2012_val_00037980.JPEG n04599235/
+mv val/ILSVRC2012_val_00037981.JPEG n03759954/
+mv val/ILSVRC2012_val_00037982.JPEG n02894605/
+mv val/ILSVRC2012_val_00037983.JPEG n02454379/
+mv val/ILSVRC2012_val_00037984.JPEG n03014705/
+mv val/ILSVRC2012_val_00037985.JPEG n02786058/
+mv val/ILSVRC2012_val_00037986.JPEG n04505470/
+mv val/ILSVRC2012_val_00037987.JPEG n02172182/
+mv val/ILSVRC2012_val_00037988.JPEG n02979186/
+mv val/ILSVRC2012_val_00037989.JPEG n02091635/
+mv val/ILSVRC2012_val_00037990.JPEG n02007558/
+mv val/ILSVRC2012_val_00037991.JPEG n02797295/
+mv val/ILSVRC2012_val_00037992.JPEG n02817516/
+mv val/ILSVRC2012_val_00037993.JPEG n02233338/
+mv val/ILSVRC2012_val_00037994.JPEG n04099969/
+mv val/ILSVRC2012_val_00037995.JPEG n03250847/
+mv val/ILSVRC2012_val_00037996.JPEG n02950826/
+mv val/ILSVRC2012_val_00037997.JPEG n02124075/
+mv val/ILSVRC2012_val_00037998.JPEG n01484850/
+mv val/ILSVRC2012_val_00037999.JPEG n02096294/
+mv val/ILSVRC2012_val_00038000.JPEG n02965783/
+mv val/ILSVRC2012_val_00038001.JPEG n01943899/
+mv val/ILSVRC2012_val_00038002.JPEG n02028035/
+mv val/ILSVRC2012_val_00038003.JPEG n04486054/
+mv val/ILSVRC2012_val_00038004.JPEG n02417914/
+mv val/ILSVRC2012_val_00038005.JPEG n03445777/
+mv val/ILSVRC2012_val_00038006.JPEG n04009552/
+mv val/ILSVRC2012_val_00038007.JPEG n02125311/
+mv val/ILSVRC2012_val_00038008.JPEG n03770439/
+mv val/ILSVRC2012_val_00038009.JPEG n02018207/
+mv val/ILSVRC2012_val_00038010.JPEG n02219486/
+mv val/ILSVRC2012_val_00038011.JPEG n04111531/
+mv val/ILSVRC2012_val_00038012.JPEG n09288635/
+mv val/ILSVRC2012_val_00038013.JPEG n03825788/
+mv val/ILSVRC2012_val_00038014.JPEG n03223299/
+mv val/ILSVRC2012_val_00038015.JPEG n04606251/
+mv val/ILSVRC2012_val_00038016.JPEG n02396427/
+mv val/ILSVRC2012_val_00038017.JPEG n07717410/
+mv val/ILSVRC2012_val_00038018.JPEG n02111277/
+mv val/ILSVRC2012_val_00038019.JPEG n04515003/
+mv val/ILSVRC2012_val_00038020.JPEG n02643566/
+mv val/ILSVRC2012_val_00038021.JPEG n03733131/
+mv val/ILSVRC2012_val_00038022.JPEG n02093428/
+mv val/ILSVRC2012_val_00038023.JPEG n01807496/
+mv val/ILSVRC2012_val_00038024.JPEG n02480855/
+mv val/ILSVRC2012_val_00038025.JPEG n03527444/
+mv val/ILSVRC2012_val_00038026.JPEG n02099849/
+mv val/ILSVRC2012_val_00038027.JPEG n04482393/
+mv val/ILSVRC2012_val_00038028.JPEG n02361337/
+mv val/ILSVRC2012_val_00038029.JPEG n02107574/
+mv val/ILSVRC2012_val_00038030.JPEG n04201297/
+mv val/ILSVRC2012_val_00038031.JPEG n03633091/
+mv val/ILSVRC2012_val_00038032.JPEG n04033995/
+mv val/ILSVRC2012_val_00038033.JPEG n02641379/
+mv val/ILSVRC2012_val_00038034.JPEG n02790996/
+mv val/ILSVRC2012_val_00038035.JPEG n02190166/
+mv val/ILSVRC2012_val_00038036.JPEG n03127747/
+mv val/ILSVRC2012_val_00038037.JPEG n02483362/
+mv val/ILSVRC2012_val_00038038.JPEG n03126707/
+mv val/ILSVRC2012_val_00038039.JPEG n03590841/
+mv val/ILSVRC2012_val_00038040.JPEG n07717410/
+mv val/ILSVRC2012_val_00038041.JPEG n04033901/
+mv val/ILSVRC2012_val_00038042.JPEG n02676566/
+mv val/ILSVRC2012_val_00038043.JPEG n07875152/
+mv val/ILSVRC2012_val_00038044.JPEG n02100236/
+mv val/ILSVRC2012_val_00038045.JPEG n04584207/
+mv val/ILSVRC2012_val_00038046.JPEG n01737021/
+mv val/ILSVRC2012_val_00038047.JPEG n02493509/
+mv val/ILSVRC2012_val_00038048.JPEG n02105251/
+mv val/ILSVRC2012_val_00038049.JPEG n03930630/
+mv val/ILSVRC2012_val_00038050.JPEG n03873416/
+mv val/ILSVRC2012_val_00038051.JPEG n02396427/
+mv val/ILSVRC2012_val_00038052.JPEG n02493793/
+mv val/ILSVRC2012_val_00038053.JPEG n03250847/
+mv val/ILSVRC2012_val_00038054.JPEG n02088466/
+mv val/ILSVRC2012_val_00038055.JPEG n02814533/
+mv val/ILSVRC2012_val_00038056.JPEG n02108000/
+mv val/ILSVRC2012_val_00038057.JPEG n01443537/
+mv val/ILSVRC2012_val_00038058.JPEG n02988304/
+mv val/ILSVRC2012_val_00038059.JPEG n01944390/
+mv val/ILSVRC2012_val_00038060.JPEG n04285008/
+mv val/ILSVRC2012_val_00038061.JPEG n04356056/
+mv val/ILSVRC2012_val_00038062.JPEG n01930112/
+mv val/ILSVRC2012_val_00038063.JPEG n03630383/
+mv val/ILSVRC2012_val_00038064.JPEG n02281406/
+mv val/ILSVRC2012_val_00038065.JPEG n02346627/
+mv val/ILSVRC2012_val_00038066.JPEG n04493381/
+mv val/ILSVRC2012_val_00038067.JPEG n03709823/
+mv val/ILSVRC2012_val_00038068.JPEG n01755581/
+mv val/ILSVRC2012_val_00038069.JPEG n02018795/
+mv val/ILSVRC2012_val_00038070.JPEG n07802026/
+mv val/ILSVRC2012_val_00038071.JPEG n11939491/
+mv val/ILSVRC2012_val_00038072.JPEG n07836838/
+mv val/ILSVRC2012_val_00038073.JPEG n04429376/
+mv val/ILSVRC2012_val_00038074.JPEG n03967562/
+mv val/ILSVRC2012_val_00038075.JPEG n02113023/
+mv val/ILSVRC2012_val_00038076.JPEG n03724870/
+mv val/ILSVRC2012_val_00038077.JPEG n03792972/
+mv val/ILSVRC2012_val_00038078.JPEG n01753488/
+mv val/ILSVRC2012_val_00038079.JPEG n07875152/
+mv val/ILSVRC2012_val_00038080.JPEG n07753592/
+mv val/ILSVRC2012_val_00038081.JPEG n04357314/
+mv val/ILSVRC2012_val_00038082.JPEG n03642806/
+mv val/ILSVRC2012_val_00038083.JPEG n04131690/
+mv val/ILSVRC2012_val_00038084.JPEG n04258138/
+mv val/ILSVRC2012_val_00038085.JPEG n01667114/
+mv val/ILSVRC2012_val_00038086.JPEG n02782093/
+mv val/ILSVRC2012_val_00038087.JPEG n02493509/
+mv val/ILSVRC2012_val_00038088.JPEG n04465501/
+mv val/ILSVRC2012_val_00038089.JPEG n07583066/
+mv val/ILSVRC2012_val_00038090.JPEG n02256656/
+mv val/ILSVRC2012_val_00038091.JPEG n01532829/
+mv val/ILSVRC2012_val_00038092.JPEG n01872401/
+mv val/ILSVRC2012_val_00038093.JPEG n07684084/
+mv val/ILSVRC2012_val_00038094.JPEG n03763968/
+mv val/ILSVRC2012_val_00038095.JPEG n04579145/
+mv val/ILSVRC2012_val_00038096.JPEG n03492542/
+mv val/ILSVRC2012_val_00038097.JPEG n04417672/
+mv val/ILSVRC2012_val_00038098.JPEG n04350905/
+mv val/ILSVRC2012_val_00038099.JPEG n04069434/
+mv val/ILSVRC2012_val_00038100.JPEG n03866082/
+mv val/ILSVRC2012_val_00038101.JPEG n04311174/
+mv val/ILSVRC2012_val_00038102.JPEG n01756291/
+mv val/ILSVRC2012_val_00038103.JPEG n02797295/
+mv val/ILSVRC2012_val_00038104.JPEG n03642806/
+mv val/ILSVRC2012_val_00038105.JPEG n03676483/
+mv val/ILSVRC2012_val_00038106.JPEG n03697007/
+mv val/ILSVRC2012_val_00038107.JPEG n02087046/
+mv val/ILSVRC2012_val_00038108.JPEG n03207941/
+mv val/ILSVRC2012_val_00038109.JPEG n04201297/
+mv val/ILSVRC2012_val_00038110.JPEG n02074367/
+mv val/ILSVRC2012_val_00038111.JPEG n01608432/
+mv val/ILSVRC2012_val_00038112.JPEG n02111500/
+mv val/ILSVRC2012_val_00038113.JPEG n03633091/
+mv val/ILSVRC2012_val_00038114.JPEG n02804610/
+mv val/ILSVRC2012_val_00038115.JPEG n04562935/
+mv val/ILSVRC2012_val_00038116.JPEG n02093859/
+mv val/ILSVRC2012_val_00038117.JPEG n03935335/
+mv val/ILSVRC2012_val_00038118.JPEG n02051845/
+mv val/ILSVRC2012_val_00038119.JPEG n01990800/
+mv val/ILSVRC2012_val_00038120.JPEG n02799071/
+mv val/ILSVRC2012_val_00038121.JPEG n04228054/
+mv val/ILSVRC2012_val_00038122.JPEG n02100877/
+mv val/ILSVRC2012_val_00038123.JPEG n01755581/
+mv val/ILSVRC2012_val_00038124.JPEG n02129604/
+mv val/ILSVRC2012_val_00038125.JPEG n02727426/
+mv val/ILSVRC2012_val_00038126.JPEG n01860187/
+mv val/ILSVRC2012_val_00038127.JPEG n04326547/
+mv val/ILSVRC2012_val_00038128.JPEG n03776460/
+mv val/ILSVRC2012_val_00038129.JPEG n02206856/
+mv val/ILSVRC2012_val_00038130.JPEG n02093256/
+mv val/ILSVRC2012_val_00038131.JPEG n01968897/
+mv val/ILSVRC2012_val_00038132.JPEG n02326432/
+mv val/ILSVRC2012_val_00038133.JPEG n03770679/
+mv val/ILSVRC2012_val_00038134.JPEG n02509815/
+mv val/ILSVRC2012_val_00038135.JPEG n02978881/
+mv val/ILSVRC2012_val_00038136.JPEG n03018349/
+mv val/ILSVRC2012_val_00038137.JPEG n03394916/
+mv val/ILSVRC2012_val_00038138.JPEG n02977058/
+mv val/ILSVRC2012_val_00038139.JPEG n03891332/
+mv val/ILSVRC2012_val_00038140.JPEG n01665541/
+mv val/ILSVRC2012_val_00038141.JPEG n04141327/
+mv val/ILSVRC2012_val_00038142.JPEG n02233338/
+mv val/ILSVRC2012_val_00038143.JPEG n02092339/
+mv val/ILSVRC2012_val_00038144.JPEG n03388549/
+mv val/ILSVRC2012_val_00038145.JPEG n04548362/
+mv val/ILSVRC2012_val_00038146.JPEG n04296562/
+mv val/ILSVRC2012_val_00038147.JPEG n04067472/
+mv val/ILSVRC2012_val_00038148.JPEG n03014705/
+mv val/ILSVRC2012_val_00038149.JPEG n02747177/
+mv val/ILSVRC2012_val_00038150.JPEG n02441942/
+mv val/ILSVRC2012_val_00038151.JPEG n04081281/
+mv val/ILSVRC2012_val_00038152.JPEG n03290653/
+mv val/ILSVRC2012_val_00038153.JPEG n02066245/
+mv val/ILSVRC2012_val_00038154.JPEG n01983481/
+mv val/ILSVRC2012_val_00038155.JPEG n02085936/
+mv val/ILSVRC2012_val_00038156.JPEG n01518878/
+mv val/ILSVRC2012_val_00038157.JPEG n02085620/
+mv val/ILSVRC2012_val_00038158.JPEG n04346328/
+mv val/ILSVRC2012_val_00038159.JPEG n01601694/
+mv val/ILSVRC2012_val_00038160.JPEG n01532829/
+mv val/ILSVRC2012_val_00038161.JPEG n03992509/
+mv val/ILSVRC2012_val_00038162.JPEG n01694178/
+mv val/ILSVRC2012_val_00038163.JPEG n02437616/
+mv val/ILSVRC2012_val_00038164.JPEG n04612504/
+mv val/ILSVRC2012_val_00038165.JPEG n02666196/
+mv val/ILSVRC2012_val_00038166.JPEG n03950228/
+mv val/ILSVRC2012_val_00038167.JPEG n02093754/
+mv val/ILSVRC2012_val_00038168.JPEG n02123597/
+mv val/ILSVRC2012_val_00038169.JPEG n01817953/
+mv val/ILSVRC2012_val_00038170.JPEG n02190166/
+mv val/ILSVRC2012_val_00038171.JPEG n04067472/
+mv val/ILSVRC2012_val_00038172.JPEG n03933933/
+mv val/ILSVRC2012_val_00038173.JPEG n02398521/
+mv val/ILSVRC2012_val_00038174.JPEG n02097130/
+mv val/ILSVRC2012_val_00038175.JPEG n03444034/
+mv val/ILSVRC2012_val_00038176.JPEG n03792972/
+mv val/ILSVRC2012_val_00038177.JPEG n04418357/
+mv val/ILSVRC2012_val_00038178.JPEG n01871265/
+mv val/ILSVRC2012_val_00038179.JPEG n03208938/
+mv val/ILSVRC2012_val_00038180.JPEG n01768244/
+mv val/ILSVRC2012_val_00038181.JPEG n02174001/
+mv val/ILSVRC2012_val_00038182.JPEG n02219486/
+mv val/ILSVRC2012_val_00038183.JPEG n01774384/
+mv val/ILSVRC2012_val_00038184.JPEG n07742313/
+mv val/ILSVRC2012_val_00038185.JPEG n04355933/
+mv val/ILSVRC2012_val_00038186.JPEG n02129165/
+mv val/ILSVRC2012_val_00038187.JPEG n07742313/
+mv val/ILSVRC2012_val_00038188.JPEG n01697457/
+mv val/ILSVRC2012_val_00038189.JPEG n04310018/
+mv val/ILSVRC2012_val_00038190.JPEG n02669723/
+mv val/ILSVRC2012_val_00038191.JPEG n04367480/
+mv val/ILSVRC2012_val_00038192.JPEG n01592084/
+mv val/ILSVRC2012_val_00038193.JPEG n02105251/
+mv val/ILSVRC2012_val_00038194.JPEG n02113799/
+mv val/ILSVRC2012_val_00038195.JPEG n07565083/
+mv val/ILSVRC2012_val_00038196.JPEG n02091032/
+mv val/ILSVRC2012_val_00038197.JPEG n02011460/
+mv val/ILSVRC2012_val_00038198.JPEG n03773504/
+mv val/ILSVRC2012_val_00038199.JPEG n02445715/
+mv val/ILSVRC2012_val_00038200.JPEG n04275548/
+mv val/ILSVRC2012_val_00038201.JPEG n02112018/
+mv val/ILSVRC2012_val_00038202.JPEG n01632458/
+mv val/ILSVRC2012_val_00038203.JPEG n02486261/
+mv val/ILSVRC2012_val_00038204.JPEG n07714990/
+mv val/ILSVRC2012_val_00038205.JPEG n02106550/
+mv val/ILSVRC2012_val_00038206.JPEG n03478589/
+mv val/ILSVRC2012_val_00038207.JPEG n02963159/
+mv val/ILSVRC2012_val_00038208.JPEG n03743016/
+mv val/ILSVRC2012_val_00038209.JPEG n04146614/
+mv val/ILSVRC2012_val_00038210.JPEG n03970156/
+mv val/ILSVRC2012_val_00038211.JPEG n03874293/
+mv val/ILSVRC2012_val_00038212.JPEG n07749582/
+mv val/ILSVRC2012_val_00038213.JPEG n06874185/
+mv val/ILSVRC2012_val_00038214.JPEG n01950731/
+mv val/ILSVRC2012_val_00038215.JPEG n01498041/
+mv val/ILSVRC2012_val_00038216.JPEG n04090263/
+mv val/ILSVRC2012_val_00038217.JPEG n02077923/
+mv val/ILSVRC2012_val_00038218.JPEG n02106662/
+mv val/ILSVRC2012_val_00038219.JPEG n02786058/
+mv val/ILSVRC2012_val_00038220.JPEG n04591157/
+mv val/ILSVRC2012_val_00038221.JPEG n03481172/
+mv val/ILSVRC2012_val_00038222.JPEG n03924679/
+mv val/ILSVRC2012_val_00038223.JPEG n02500267/
+mv val/ILSVRC2012_val_00038224.JPEG n04258138/
+mv val/ILSVRC2012_val_00038225.JPEG n04540053/
+mv val/ILSVRC2012_val_00038226.JPEG n03160309/
+mv val/ILSVRC2012_val_00038227.JPEG n02087394/
+mv val/ILSVRC2012_val_00038228.JPEG n03494278/
+mv val/ILSVRC2012_val_00038229.JPEG n04325704/
+mv val/ILSVRC2012_val_00038230.JPEG n01669191/
+mv val/ILSVRC2012_val_00038231.JPEG n02108551/
+mv val/ILSVRC2012_val_00038232.JPEG n01980166/
+mv val/ILSVRC2012_val_00038233.JPEG n03314780/
+mv val/ILSVRC2012_val_00038234.JPEG n02808440/
+mv val/ILSVRC2012_val_00038235.JPEG n04447861/
+mv val/ILSVRC2012_val_00038236.JPEG n02281787/
+mv val/ILSVRC2012_val_00038237.JPEG n02095889/
+mv val/ILSVRC2012_val_00038238.JPEG n02489166/
+mv val/ILSVRC2012_val_00038239.JPEG n02114367/
+mv val/ILSVRC2012_val_00038240.JPEG n04344873/
+mv val/ILSVRC2012_val_00038241.JPEG n02058221/
+mv val/ILSVRC2012_val_00038242.JPEG n02444819/
+mv val/ILSVRC2012_val_00038243.JPEG n02988304/
+mv val/ILSVRC2012_val_00038244.JPEG n03495258/
+mv val/ILSVRC2012_val_00038245.JPEG n02002556/
+mv val/ILSVRC2012_val_00038246.JPEG n03874293/
+mv val/ILSVRC2012_val_00038247.JPEG n02085782/
+mv val/ILSVRC2012_val_00038248.JPEG n01695060/
+mv val/ILSVRC2012_val_00038249.JPEG n02870880/
+mv val/ILSVRC2012_val_00038250.JPEG n01608432/
+mv val/ILSVRC2012_val_00038251.JPEG n02948072/
+mv val/ILSVRC2012_val_00038252.JPEG n04067472/
+mv val/ILSVRC2012_val_00038253.JPEG n02098286/
+mv val/ILSVRC2012_val_00038254.JPEG n02093428/
+mv val/ILSVRC2012_val_00038255.JPEG n04009552/
+mv val/ILSVRC2012_val_00038256.JPEG n12267677/
+mv val/ILSVRC2012_val_00038257.JPEG n02085782/
+mv val/ILSVRC2012_val_00038258.JPEG n03376595/
+mv val/ILSVRC2012_val_00038259.JPEG n04335435/
+mv val/ILSVRC2012_val_00038260.JPEG n03891332/
+mv val/ILSVRC2012_val_00038261.JPEG n03733281/
+mv val/ILSVRC2012_val_00038262.JPEG n02264363/
+mv val/ILSVRC2012_val_00038263.JPEG n02132136/
+mv val/ILSVRC2012_val_00038264.JPEG n04263257/
+mv val/ILSVRC2012_val_00038265.JPEG n01698640/
+mv val/ILSVRC2012_val_00038266.JPEG n01753488/
+mv val/ILSVRC2012_val_00038267.JPEG n07714990/
+mv val/ILSVRC2012_val_00038268.JPEG n03417042/
+mv val/ILSVRC2012_val_00038269.JPEG n03259280/
+mv val/ILSVRC2012_val_00038270.JPEG n01737021/
+mv val/ILSVRC2012_val_00038271.JPEG n04118538/
+mv val/ILSVRC2012_val_00038272.JPEG n01773797/
+mv val/ILSVRC2012_val_00038273.JPEG n03124170/
+mv val/ILSVRC2012_val_00038274.JPEG n03874293/
+mv val/ILSVRC2012_val_00038275.JPEG n09421951/
+mv val/ILSVRC2012_val_00038276.JPEG n02747177/
+mv val/ILSVRC2012_val_00038277.JPEG n09288635/
+mv val/ILSVRC2012_val_00038278.JPEG n04136333/
+mv val/ILSVRC2012_val_00038279.JPEG n03956157/
+mv val/ILSVRC2012_val_00038280.JPEG n02093256/
+mv val/ILSVRC2012_val_00038281.JPEG n03729826/
+mv val/ILSVRC2012_val_00038282.JPEG n03538406/
+mv val/ILSVRC2012_val_00038283.JPEG n01774384/
+mv val/ILSVRC2012_val_00038284.JPEG n04355338/
+mv val/ILSVRC2012_val_00038285.JPEG n02105251/
+mv val/ILSVRC2012_val_00038286.JPEG n02403003/
+mv val/ILSVRC2012_val_00038287.JPEG n01697457/
+mv val/ILSVRC2012_val_00038288.JPEG n01828970/
+mv val/ILSVRC2012_val_00038289.JPEG n02892767/
+mv val/ILSVRC2012_val_00038290.JPEG n02018207/
+mv val/ILSVRC2012_val_00038291.JPEG n02134084/
+mv val/ILSVRC2012_val_00038292.JPEG n03733805/
+mv val/ILSVRC2012_val_00038293.JPEG n07930864/
+mv val/ILSVRC2012_val_00038294.JPEG n02097474/
+mv val/ILSVRC2012_val_00038295.JPEG n04507155/
+mv val/ILSVRC2012_val_00038296.JPEG n04344873/
+mv val/ILSVRC2012_val_00038297.JPEG n02950826/
+mv val/ILSVRC2012_val_00038298.JPEG n03721384/
+mv val/ILSVRC2012_val_00038299.JPEG n01943899/
+mv val/ILSVRC2012_val_00038300.JPEG n07920052/
+mv val/ILSVRC2012_val_00038301.JPEG n02319095/
+mv val/ILSVRC2012_val_00038302.JPEG n04149813/
+mv val/ILSVRC2012_val_00038303.JPEG n02364673/
+mv val/ILSVRC2012_val_00038304.JPEG n01742172/
+mv val/ILSVRC2012_val_00038305.JPEG n04428191/
+mv val/ILSVRC2012_val_00038306.JPEG n03450230/
+mv val/ILSVRC2012_val_00038307.JPEG n09399592/
+mv val/ILSVRC2012_val_00038308.JPEG n01689811/
+mv val/ILSVRC2012_val_00038309.JPEG n01978287/
+mv val/ILSVRC2012_val_00038310.JPEG n07716358/
+mv val/ILSVRC2012_val_00038311.JPEG n02074367/
+mv val/ILSVRC2012_val_00038312.JPEG n04557648/
+mv val/ILSVRC2012_val_00038313.JPEG n03062245/
+mv val/ILSVRC2012_val_00038314.JPEG n02105251/
+mv val/ILSVRC2012_val_00038315.JPEG n07716906/
+mv val/ILSVRC2012_val_00038316.JPEG n03623198/
+mv val/ILSVRC2012_val_00038317.JPEG n03125729/
+mv val/ILSVRC2012_val_00038318.JPEG n03876231/
+mv val/ILSVRC2012_val_00038319.JPEG n04509417/
+mv val/ILSVRC2012_val_00038320.JPEG n03041632/
+mv val/ILSVRC2012_val_00038321.JPEG n04347754/
+mv val/ILSVRC2012_val_00038322.JPEG n06359193/
+mv val/ILSVRC2012_val_00038323.JPEG n04118538/
+mv val/ILSVRC2012_val_00038324.JPEG n01806143/
+mv val/ILSVRC2012_val_00038325.JPEG n07749582/
+mv val/ILSVRC2012_val_00038326.JPEG n02105855/
+mv val/ILSVRC2012_val_00038327.JPEG n13052670/
+mv val/ILSVRC2012_val_00038328.JPEG n02094114/
+mv val/ILSVRC2012_val_00038329.JPEG n03775071/
+mv val/ILSVRC2012_val_00038330.JPEG n01873310/
+mv val/ILSVRC2012_val_00038331.JPEG n03788195/
+mv val/ILSVRC2012_val_00038332.JPEG n04311004/
+mv val/ILSVRC2012_val_00038333.JPEG n03018349/
+mv val/ILSVRC2012_val_00038334.JPEG n03089624/
+mv val/ILSVRC2012_val_00038335.JPEG n02087046/
+mv val/ILSVRC2012_val_00038336.JPEG n03379051/
+mv val/ILSVRC2012_val_00038337.JPEG n04493381/
+mv val/ILSVRC2012_val_00038338.JPEG n07714990/
+mv val/ILSVRC2012_val_00038339.JPEG n03895866/
+mv val/ILSVRC2012_val_00038340.JPEG n15075141/
+mv val/ILSVRC2012_val_00038341.JPEG n07684084/
+mv val/ILSVRC2012_val_00038342.JPEG n01755581/
+mv val/ILSVRC2012_val_00038343.JPEG n07715103/
+mv val/ILSVRC2012_val_00038344.JPEG n04285008/
+mv val/ILSVRC2012_val_00038345.JPEG n03476991/
+mv val/ILSVRC2012_val_00038346.JPEG n04049303/
+mv val/ILSVRC2012_val_00038347.JPEG n03496892/
+mv val/ILSVRC2012_val_00038348.JPEG n03041632/
+mv val/ILSVRC2012_val_00038349.JPEG n02403003/
+mv val/ILSVRC2012_val_00038350.JPEG n03832673/
+mv val/ILSVRC2012_val_00038351.JPEG n04131690/
+mv val/ILSVRC2012_val_00038352.JPEG n04479046/
+mv val/ILSVRC2012_val_00038353.JPEG n04479046/
+mv val/ILSVRC2012_val_00038354.JPEG n02259212/
+mv val/ILSVRC2012_val_00038355.JPEG n01734418/
+mv val/ILSVRC2012_val_00038356.JPEG n02002556/
+mv val/ILSVRC2012_val_00038357.JPEG n03179701/
+mv val/ILSVRC2012_val_00038358.JPEG n03992509/
+mv val/ILSVRC2012_val_00038359.JPEG n07932039/
+mv val/ILSVRC2012_val_00038360.JPEG n04467665/
+mv val/ILSVRC2012_val_00038361.JPEG n02099712/
+mv val/ILSVRC2012_val_00038362.JPEG n04456115/
+mv val/ILSVRC2012_val_00038363.JPEG n03690938/
+mv val/ILSVRC2012_val_00038364.JPEG n04367480/
+mv val/ILSVRC2012_val_00038365.JPEG n01729322/
+mv val/ILSVRC2012_val_00038366.JPEG n03961711/
+mv val/ILSVRC2012_val_00038367.JPEG n03841143/
+mv val/ILSVRC2012_val_00038368.JPEG n02963159/
+mv val/ILSVRC2012_val_00038369.JPEG n03476991/
+mv val/ILSVRC2012_val_00038370.JPEG n04074963/
+mv val/ILSVRC2012_val_00038371.JPEG n02077923/
+mv val/ILSVRC2012_val_00038372.JPEG n01532829/
+mv val/ILSVRC2012_val_00038373.JPEG n02865351/
+mv val/ILSVRC2012_val_00038374.JPEG n02966687/
+mv val/ILSVRC2012_val_00038375.JPEG n01694178/
+mv val/ILSVRC2012_val_00038376.JPEG n03017168/
+mv val/ILSVRC2012_val_00038377.JPEG n04429376/
+mv val/ILSVRC2012_val_00038378.JPEG n03935335/
+mv val/ILSVRC2012_val_00038379.JPEG n09246464/
+mv val/ILSVRC2012_val_00038380.JPEG n04004767/
+mv val/ILSVRC2012_val_00038381.JPEG n03208938/
+mv val/ILSVRC2012_val_00038382.JPEG n04111531/
+mv val/ILSVRC2012_val_00038383.JPEG n04389033/
+mv val/ILSVRC2012_val_00038384.JPEG n07760859/
+mv val/ILSVRC2012_val_00038385.JPEG n04326547/
+mv val/ILSVRC2012_val_00038386.JPEG n04209239/
+mv val/ILSVRC2012_val_00038387.JPEG n07697537/
+mv val/ILSVRC2012_val_00038388.JPEG n03785016/
+mv val/ILSVRC2012_val_00038389.JPEG n04367480/
+mv val/ILSVRC2012_val_00038390.JPEG n04037443/
+mv val/ILSVRC2012_val_00038391.JPEG n04311174/
+mv val/ILSVRC2012_val_00038392.JPEG n02814533/
+mv val/ILSVRC2012_val_00038393.JPEG n02113799/
+mv val/ILSVRC2012_val_00038394.JPEG n02825657/
+mv val/ILSVRC2012_val_00038395.JPEG n02672831/
+mv val/ILSVRC2012_val_00038396.JPEG n02114855/
+mv val/ILSVRC2012_val_00038397.JPEG n02090622/
+mv val/ILSVRC2012_val_00038398.JPEG n09399592/
+mv val/ILSVRC2012_val_00038399.JPEG n04482393/
+mv val/ILSVRC2012_val_00038400.JPEG n01910747/
+mv val/ILSVRC2012_val_00038401.JPEG n04417672/
+mv val/ILSVRC2012_val_00038402.JPEG n04162706/
+mv val/ILSVRC2012_val_00038403.JPEG n02098413/
+mv val/ILSVRC2012_val_00038404.JPEG n07717556/
+mv val/ILSVRC2012_val_00038405.JPEG n01580077/
+mv val/ILSVRC2012_val_00038406.JPEG n02092002/
+mv val/ILSVRC2012_val_00038407.JPEG n03014705/
+mv val/ILSVRC2012_val_00038408.JPEG n04370456/
+mv val/ILSVRC2012_val_00038409.JPEG n02835271/
+mv val/ILSVRC2012_val_00038410.JPEG n03047690/
+mv val/ILSVRC2012_val_00038411.JPEG n03944341/
+mv val/ILSVRC2012_val_00038412.JPEG n07613480/
+mv val/ILSVRC2012_val_00038413.JPEG n02361337/
+mv val/ILSVRC2012_val_00038414.JPEG n02356798/
+mv val/ILSVRC2012_val_00038415.JPEG n02835271/
+mv val/ILSVRC2012_val_00038416.JPEG n02011460/
+mv val/ILSVRC2012_val_00038417.JPEG n02096051/
+mv val/ILSVRC2012_val_00038418.JPEG n01843065/
+mv val/ILSVRC2012_val_00038419.JPEG n03498962/
+mv val/ILSVRC2012_val_00038420.JPEG n07583066/
+mv val/ILSVRC2012_val_00038421.JPEG n07734744/
+mv val/ILSVRC2012_val_00038422.JPEG n04277352/
+mv val/ILSVRC2012_val_00038423.JPEG n02088632/
+mv val/ILSVRC2012_val_00038424.JPEG n09835506/
+mv val/ILSVRC2012_val_00038425.JPEG n04141327/
+mv val/ILSVRC2012_val_00038426.JPEG n01820546/
+mv val/ILSVRC2012_val_00038427.JPEG n03218198/
+mv val/ILSVRC2012_val_00038428.JPEG n03825788/
+mv val/ILSVRC2012_val_00038429.JPEG n04310018/
+mv val/ILSVRC2012_val_00038430.JPEG n02099849/
+mv val/ILSVRC2012_val_00038431.JPEG n02025239/
+mv val/ILSVRC2012_val_00038432.JPEG n07753275/
+mv val/ILSVRC2012_val_00038433.JPEG n03876231/
+mv val/ILSVRC2012_val_00038434.JPEG n02099267/
+mv val/ILSVRC2012_val_00038435.JPEG n03794056/
+mv val/ILSVRC2012_val_00038436.JPEG n07590611/
+mv val/ILSVRC2012_val_00038437.JPEG n01740131/
+mv val/ILSVRC2012_val_00038438.JPEG n02091032/
+mv val/ILSVRC2012_val_00038439.JPEG n04200800/
+mv val/ILSVRC2012_val_00038440.JPEG n01770081/
+mv val/ILSVRC2012_val_00038441.JPEG n02869837/
+mv val/ILSVRC2012_val_00038442.JPEG n03379051/
+mv val/ILSVRC2012_val_00038443.JPEG n01833805/
+mv val/ILSVRC2012_val_00038444.JPEG n03929855/
+mv val/ILSVRC2012_val_00038445.JPEG n02749479/
+mv val/ILSVRC2012_val_00038446.JPEG n01644900/
+mv val/ILSVRC2012_val_00038447.JPEG n03445777/
+mv val/ILSVRC2012_val_00038448.JPEG n02110627/
+mv val/ILSVRC2012_val_00038449.JPEG n01630670/
+mv val/ILSVRC2012_val_00038450.JPEG n04273569/
+mv val/ILSVRC2012_val_00038451.JPEG n04483307/
+mv val/ILSVRC2012_val_00038452.JPEG n02138441/
+mv val/ILSVRC2012_val_00038453.JPEG n07892512/
+mv val/ILSVRC2012_val_00038454.JPEG n01983481/
+mv val/ILSVRC2012_val_00038455.JPEG n02108422/
+mv val/ILSVRC2012_val_00038456.JPEG n02948072/
+mv val/ILSVRC2012_val_00038457.JPEG n02094258/
+mv val/ILSVRC2012_val_00038458.JPEG n03141823/
+mv val/ILSVRC2012_val_00038459.JPEG n01632458/
+mv val/ILSVRC2012_val_00038460.JPEG n04517823/
+mv val/ILSVRC2012_val_00038461.JPEG n04380533/
+mv val/ILSVRC2012_val_00038462.JPEG n09472597/
+mv val/ILSVRC2012_val_00038463.JPEG n02165456/
+mv val/ILSVRC2012_val_00038464.JPEG n01930112/
+mv val/ILSVRC2012_val_00038465.JPEG n03018349/
+mv val/ILSVRC2012_val_00038466.JPEG n02268853/
+mv val/ILSVRC2012_val_00038467.JPEG n01770081/
+mv val/ILSVRC2012_val_00038468.JPEG n04141975/
+mv val/ILSVRC2012_val_00038469.JPEG n03998194/
+mv val/ILSVRC2012_val_00038470.JPEG n03384352/
+mv val/ILSVRC2012_val_00038471.JPEG n04147183/
+mv val/ILSVRC2012_val_00038472.JPEG n03045698/
+mv val/ILSVRC2012_val_00038473.JPEG n03791053/
+mv val/ILSVRC2012_val_00038474.JPEG n03944341/
+mv val/ILSVRC2012_val_00038475.JPEG n02536864/
+mv val/ILSVRC2012_val_00038476.JPEG n01829413/
+mv val/ILSVRC2012_val_00038477.JPEG n02088466/
+mv val/ILSVRC2012_val_00038478.JPEG n01694178/
+mv val/ILSVRC2012_val_00038479.JPEG n02106382/
+mv val/ILSVRC2012_val_00038480.JPEG n01748264/
+mv val/ILSVRC2012_val_00038481.JPEG n03759954/
+mv val/ILSVRC2012_val_00038482.JPEG n12985857/
+mv val/ILSVRC2012_val_00038483.JPEG n04254680/
+mv val/ILSVRC2012_val_00038484.JPEG n04465501/
+mv val/ILSVRC2012_val_00038485.JPEG n02795169/
+mv val/ILSVRC2012_val_00038486.JPEG n02096177/
+mv val/ILSVRC2012_val_00038487.JPEG n02444819/
+mv val/ILSVRC2012_val_00038488.JPEG n01558993/
+mv val/ILSVRC2012_val_00038489.JPEG n02115641/
+mv val/ILSVRC2012_val_00038490.JPEG n03445924/
+mv val/ILSVRC2012_val_00038491.JPEG n02701002/
+mv val/ILSVRC2012_val_00038492.JPEG n06359193/
+mv val/ILSVRC2012_val_00038493.JPEG n01773549/
+mv val/ILSVRC2012_val_00038494.JPEG n03637318/
+mv val/ILSVRC2012_val_00038495.JPEG n02437312/
+mv val/ILSVRC2012_val_00038496.JPEG n04332243/
+mv val/ILSVRC2012_val_00038497.JPEG n02865351/
+mv val/ILSVRC2012_val_00038498.JPEG n02088632/
+mv val/ILSVRC2012_val_00038499.JPEG n04067472/
+mv val/ILSVRC2012_val_00038500.JPEG n02092002/
+mv val/ILSVRC2012_val_00038501.JPEG n03956157/
+mv val/ILSVRC2012_val_00038502.JPEG n04326547/
+mv val/ILSVRC2012_val_00038503.JPEG n02786058/
+mv val/ILSVRC2012_val_00038504.JPEG n01784675/
+mv val/ILSVRC2012_val_00038505.JPEG n01847000/
+mv val/ILSVRC2012_val_00038506.JPEG n04146614/
+mv val/ILSVRC2012_val_00038507.JPEG n03666591/
+mv val/ILSVRC2012_val_00038508.JPEG n04310018/
+mv val/ILSVRC2012_val_00038509.JPEG n01914609/
+mv val/ILSVRC2012_val_00038510.JPEG n07695742/
+mv val/ILSVRC2012_val_00038511.JPEG n03404251/
+mv val/ILSVRC2012_val_00038512.JPEG n03891251/
+mv val/ILSVRC2012_val_00038513.JPEG n06874185/
+mv val/ILSVRC2012_val_00038514.JPEG n03062245/
+mv val/ILSVRC2012_val_00038515.JPEG n03355925/
+mv val/ILSVRC2012_val_00038516.JPEG n12267677/
+mv val/ILSVRC2012_val_00038517.JPEG n04254120/
+mv val/ILSVRC2012_val_00038518.JPEG n07714990/
+mv val/ILSVRC2012_val_00038519.JPEG n02233338/
+mv val/ILSVRC2012_val_00038520.JPEG n02804414/
+mv val/ILSVRC2012_val_00038521.JPEG n03062245/
+mv val/ILSVRC2012_val_00038522.JPEG n02018795/
+mv val/ILSVRC2012_val_00038523.JPEG n07720875/
+mv val/ILSVRC2012_val_00038524.JPEG n03075370/
+mv val/ILSVRC2012_val_00038525.JPEG n03530642/
+mv val/ILSVRC2012_val_00038526.JPEG n01980166/
+mv val/ILSVRC2012_val_00038527.JPEG n01667114/
+mv val/ILSVRC2012_val_00038528.JPEG n04553703/
+mv val/ILSVRC2012_val_00038529.JPEG n09468604/
+mv val/ILSVRC2012_val_00038530.JPEG n06794110/
+mv val/ILSVRC2012_val_00038531.JPEG n04367480/
+mv val/ILSVRC2012_val_00038532.JPEG n02963159/
+mv val/ILSVRC2012_val_00038533.JPEG n03710193/
+mv val/ILSVRC2012_val_00038534.JPEG n01980166/
+mv val/ILSVRC2012_val_00038535.JPEG n03000134/
+mv val/ILSVRC2012_val_00038536.JPEG n03938244/
+mv val/ILSVRC2012_val_00038537.JPEG n02231487/
+mv val/ILSVRC2012_val_00038538.JPEG n02493509/
+mv val/ILSVRC2012_val_00038539.JPEG n03447721/
+mv val/ILSVRC2012_val_00038540.JPEG n07583066/
+mv val/ILSVRC2012_val_00038541.JPEG n09472597/
+mv val/ILSVRC2012_val_00038542.JPEG n03877845/
+mv val/ILSVRC2012_val_00038543.JPEG n04147183/
+mv val/ILSVRC2012_val_00038544.JPEG n04229816/
+mv val/ILSVRC2012_val_00038545.JPEG n12998815/
+mv val/ILSVRC2012_val_00038546.JPEG n03877472/
+mv val/ILSVRC2012_val_00038547.JPEG n07718472/
+mv val/ILSVRC2012_val_00038548.JPEG n03063599/
+mv val/ILSVRC2012_val_00038549.JPEG n01665541/
+mv val/ILSVRC2012_val_00038550.JPEG n02111889/
+mv val/ILSVRC2012_val_00038551.JPEG n06596364/
+mv val/ILSVRC2012_val_00038552.JPEG n02094433/
+mv val/ILSVRC2012_val_00038553.JPEG n01817953/
+mv val/ILSVRC2012_val_00038554.JPEG n02091635/
+mv val/ILSVRC2012_val_00038555.JPEG n01755581/
+mv val/ILSVRC2012_val_00038556.JPEG n01740131/
+mv val/ILSVRC2012_val_00038557.JPEG n01592084/
+mv val/ILSVRC2012_val_00038558.JPEG n03673027/
+mv val/ILSVRC2012_val_00038559.JPEG n03467068/
+mv val/ILSVRC2012_val_00038560.JPEG n03924679/
+mv val/ILSVRC2012_val_00038561.JPEG n04467665/
+mv val/ILSVRC2012_val_00038562.JPEG n03733805/
+mv val/ILSVRC2012_val_00038563.JPEG n01833805/
+mv val/ILSVRC2012_val_00038564.JPEG n03089624/
+mv val/ILSVRC2012_val_00038565.JPEG n02091635/
+mv val/ILSVRC2012_val_00038566.JPEG n02489166/
+mv val/ILSVRC2012_val_00038567.JPEG n02112350/
+mv val/ILSVRC2012_val_00038568.JPEG n04192698/
+mv val/ILSVRC2012_val_00038569.JPEG n02102040/
+mv val/ILSVRC2012_val_00038570.JPEG n02823428/
+mv val/ILSVRC2012_val_00038571.JPEG n04074963/
+mv val/ILSVRC2012_val_00038572.JPEG n01872401/
+mv val/ILSVRC2012_val_00038573.JPEG n04579145/
+mv val/ILSVRC2012_val_00038574.JPEG n03788365/
+mv val/ILSVRC2012_val_00038575.JPEG n04086273/
+mv val/ILSVRC2012_val_00038576.JPEG n02009229/
+mv val/ILSVRC2012_val_00038577.JPEG n07753113/
+mv val/ILSVRC2012_val_00038578.JPEG n02504458/
+mv val/ILSVRC2012_val_00038579.JPEG n02002724/
+mv val/ILSVRC2012_val_00038580.JPEG n02097474/
+mv val/ILSVRC2012_val_00038581.JPEG n07754684/
+mv val/ILSVRC2012_val_00038582.JPEG n03134739/
+mv val/ILSVRC2012_val_00038583.JPEG n02113978/
+mv val/ILSVRC2012_val_00038584.JPEG n02403003/
+mv val/ILSVRC2012_val_00038585.JPEG n03998194/
+mv val/ILSVRC2012_val_00038586.JPEG n01688243/
+mv val/ILSVRC2012_val_00038587.JPEG n03891332/
+mv val/ILSVRC2012_val_00038588.JPEG n04133789/
+mv val/ILSVRC2012_val_00038589.JPEG n02111500/
+mv val/ILSVRC2012_val_00038590.JPEG n02916936/
+mv val/ILSVRC2012_val_00038591.JPEG n07248320/
+mv val/ILSVRC2012_val_00038592.JPEG n04404412/
+mv val/ILSVRC2012_val_00038593.JPEG n04209239/
+mv val/ILSVRC2012_val_00038594.JPEG n07590611/
+mv val/ILSVRC2012_val_00038595.JPEG n03673027/
+mv val/ILSVRC2012_val_00038596.JPEG n04008634/
+mv val/ILSVRC2012_val_00038597.JPEG n03272010/
+mv val/ILSVRC2012_val_00038598.JPEG n13040303/
+mv val/ILSVRC2012_val_00038599.JPEG n09399592/
+mv val/ILSVRC2012_val_00038600.JPEG n02007558/
+mv val/ILSVRC2012_val_00038601.JPEG n02488291/
+mv val/ILSVRC2012_val_00038602.JPEG n07716906/
+mv val/ILSVRC2012_val_00038603.JPEG n04009552/
+mv val/ILSVRC2012_val_00038604.JPEG n02111889/
+mv val/ILSVRC2012_val_00038605.JPEG n03658185/
+mv val/ILSVRC2012_val_00038606.JPEG n01980166/
+mv val/ILSVRC2012_val_00038607.JPEG n04367480/
+mv val/ILSVRC2012_val_00038608.JPEG n02892201/
+mv val/ILSVRC2012_val_00038609.JPEG n04423845/
+mv val/ILSVRC2012_val_00038610.JPEG n03131574/
+mv val/ILSVRC2012_val_00038611.JPEG n04041544/
+mv val/ILSVRC2012_val_00038612.JPEG n04266014/
+mv val/ILSVRC2012_val_00038613.JPEG n03825788/
+mv val/ILSVRC2012_val_00038614.JPEG n02033041/
+mv val/ILSVRC2012_val_00038615.JPEG n02002724/
+mv val/ILSVRC2012_val_00038616.JPEG n01871265/
+mv val/ILSVRC2012_val_00038617.JPEG n04099969/
+mv val/ILSVRC2012_val_00038618.JPEG n02321529/
+mv val/ILSVRC2012_val_00038619.JPEG n02666196/
+mv val/ILSVRC2012_val_00038620.JPEG n01698640/
+mv val/ILSVRC2012_val_00038621.JPEG n03709823/
+mv val/ILSVRC2012_val_00038622.JPEG n02356798/
+mv val/ILSVRC2012_val_00038623.JPEG n03089624/
+mv val/ILSVRC2012_val_00038624.JPEG n03873416/
+mv val/ILSVRC2012_val_00038625.JPEG n02097130/
+mv val/ILSVRC2012_val_00038626.JPEG n02108089/
+mv val/ILSVRC2012_val_00038627.JPEG n04258138/
+mv val/ILSVRC2012_val_00038628.JPEG n01667778/
+mv val/ILSVRC2012_val_00038629.JPEG n04456115/
+mv val/ILSVRC2012_val_00038630.JPEG n03492542/
+mv val/ILSVRC2012_val_00038631.JPEG n02363005/
+mv val/ILSVRC2012_val_00038632.JPEG n01871265/
+mv val/ILSVRC2012_val_00038633.JPEG n01950731/
+mv val/ILSVRC2012_val_00038634.JPEG n04153751/
+mv val/ILSVRC2012_val_00038635.JPEG n01984695/
+mv val/ILSVRC2012_val_00038636.JPEG n01614925/
+mv val/ILSVRC2012_val_00038637.JPEG n02110958/
+mv val/ILSVRC2012_val_00038638.JPEG n01824575/
+mv val/ILSVRC2012_val_00038639.JPEG n01981276/
+mv val/ILSVRC2012_val_00038640.JPEG n15075141/
+mv val/ILSVRC2012_val_00038641.JPEG n03814906/
+mv val/ILSVRC2012_val_00038642.JPEG n03874599/
+mv val/ILSVRC2012_val_00038643.JPEG n04118776/
+mv val/ILSVRC2012_val_00038644.JPEG n01675722/
+mv val/ILSVRC2012_val_00038645.JPEG n02939185/
+mv val/ILSVRC2012_val_00038646.JPEG n03742115/
+mv val/ILSVRC2012_val_00038647.JPEG n01697457/
+mv val/ILSVRC2012_val_00038648.JPEG n02326432/
+mv val/ILSVRC2012_val_00038649.JPEG n02090622/
+mv val/ILSVRC2012_val_00038650.JPEG n04532106/
+mv val/ILSVRC2012_val_00038651.JPEG n03983396/
+mv val/ILSVRC2012_val_00038652.JPEG n02415577/
+mv val/ILSVRC2012_val_00038653.JPEG n02412080/
+mv val/ILSVRC2012_val_00038654.JPEG n02102480/
+mv val/ILSVRC2012_val_00038655.JPEG n03459775/
+mv val/ILSVRC2012_val_00038656.JPEG n04380533/
+mv val/ILSVRC2012_val_00038657.JPEG n04254777/
+mv val/ILSVRC2012_val_00038658.JPEG n01631663/
+mv val/ILSVRC2012_val_00038659.JPEG n03404251/
+mv val/ILSVRC2012_val_00038660.JPEG n07871810/
+mv val/ILSVRC2012_val_00038661.JPEG n02123045/
+mv val/ILSVRC2012_val_00038662.JPEG n02226429/
+mv val/ILSVRC2012_val_00038663.JPEG n01871265/
+mv val/ILSVRC2012_val_00038664.JPEG n01820546/
+mv val/ILSVRC2012_val_00038665.JPEG n01688243/
+mv val/ILSVRC2012_val_00038666.JPEG n02825657/
+mv val/ILSVRC2012_val_00038667.JPEG n01689811/
+mv val/ILSVRC2012_val_00038668.JPEG n02095570/
+mv val/ILSVRC2012_val_00038669.JPEG n04019541/
+mv val/ILSVRC2012_val_00038670.JPEG n03777754/
+mv val/ILSVRC2012_val_00038671.JPEG n01748264/
+mv val/ILSVRC2012_val_00038672.JPEG n02123045/
+mv val/ILSVRC2012_val_00038673.JPEG n02129604/
+mv val/ILSVRC2012_val_00038674.JPEG n02105056/
+mv val/ILSVRC2012_val_00038675.JPEG n02125311/
+mv val/ILSVRC2012_val_00038676.JPEG n02089973/
+mv val/ILSVRC2012_val_00038677.JPEG n03649909/
+mv val/ILSVRC2012_val_00038678.JPEG n04540053/
+mv val/ILSVRC2012_val_00038679.JPEG n03670208/
+mv val/ILSVRC2012_val_00038680.JPEG n02097209/
+mv val/ILSVRC2012_val_00038681.JPEG n01819313/
+mv val/ILSVRC2012_val_00038682.JPEG n03110669/
+mv val/ILSVRC2012_val_00038683.JPEG n02124075/
+mv val/ILSVRC2012_val_00038684.JPEG n02437616/
+mv val/ILSVRC2012_val_00038685.JPEG n01843383/
+mv val/ILSVRC2012_val_00038686.JPEG n03935335/
+mv val/ILSVRC2012_val_00038687.JPEG n02782093/
+mv val/ILSVRC2012_val_00038688.JPEG n07753113/
+mv val/ILSVRC2012_val_00038689.JPEG n03791053/
+mv val/ILSVRC2012_val_00038690.JPEG n02111129/
+mv val/ILSVRC2012_val_00038691.JPEG n07614500/
+mv val/ILSVRC2012_val_00038692.JPEG n03761084/
+mv val/ILSVRC2012_val_00038693.JPEG n03676483/
+mv val/ILSVRC2012_val_00038694.JPEG n01978455/
+mv val/ILSVRC2012_val_00038695.JPEG n03857828/
+mv val/ILSVRC2012_val_00038696.JPEG n02488702/
+mv val/ILSVRC2012_val_00038697.JPEG n02165456/
+mv val/ILSVRC2012_val_00038698.JPEG n07734744/
+mv val/ILSVRC2012_val_00038699.JPEG n03991062/
+mv val/ILSVRC2012_val_00038700.JPEG n02860847/
+mv val/ILSVRC2012_val_00038701.JPEG n03954731/
+mv val/ILSVRC2012_val_00038702.JPEG n03045698/
+mv val/ILSVRC2012_val_00038703.JPEG n03944341/
+mv val/ILSVRC2012_val_00038704.JPEG n02111129/
+mv val/ILSVRC2012_val_00038705.JPEG n02092002/
+mv val/ILSVRC2012_val_00038706.JPEG n03891251/
+mv val/ILSVRC2012_val_00038707.JPEG n02130308/
+mv val/ILSVRC2012_val_00038708.JPEG n01945685/
+mv val/ILSVRC2012_val_00038709.JPEG n03188531/
+mv val/ILSVRC2012_val_00038710.JPEG n02457408/
+mv val/ILSVRC2012_val_00038711.JPEG n03085013/
+mv val/ILSVRC2012_val_00038712.JPEG n03796401/
+mv val/ILSVRC2012_val_00038713.JPEG n13052670/
+mv val/ILSVRC2012_val_00038714.JPEG n02398521/
+mv val/ILSVRC2012_val_00038715.JPEG n03743016/
+mv val/ILSVRC2012_val_00038716.JPEG n02229544/
+mv val/ILSVRC2012_val_00038717.JPEG n03160309/
+mv val/ILSVRC2012_val_00038718.JPEG n02276258/
+mv val/ILSVRC2012_val_00038719.JPEG n02276258/
+mv val/ILSVRC2012_val_00038720.JPEG n02504013/
+mv val/ILSVRC2012_val_00038721.JPEG n02281406/
+mv val/ILSVRC2012_val_00038722.JPEG n02877765/
+mv val/ILSVRC2012_val_00038723.JPEG n03649909/
+mv val/ILSVRC2012_val_00038724.JPEG n07697313/
+mv val/ILSVRC2012_val_00038725.JPEG n02058221/
+mv val/ILSVRC2012_val_00038726.JPEG n02077923/
+mv val/ILSVRC2012_val_00038727.JPEG n03394916/
+mv val/ILSVRC2012_val_00038728.JPEG n02256656/
+mv val/ILSVRC2012_val_00038729.JPEG n04328186/
+mv val/ILSVRC2012_val_00038730.JPEG n02009229/
+mv val/ILSVRC2012_val_00038731.JPEG n03476684/
+mv val/ILSVRC2012_val_00038732.JPEG n03388549/
+mv val/ILSVRC2012_val_00038733.JPEG n07714571/
+mv val/ILSVRC2012_val_00038734.JPEG n09193705/
+mv val/ILSVRC2012_val_00038735.JPEG n02396427/
+mv val/ILSVRC2012_val_00038736.JPEG n01806567/
+mv val/ILSVRC2012_val_00038737.JPEG n02090379/
+mv val/ILSVRC2012_val_00038738.JPEG n02100583/
+mv val/ILSVRC2012_val_00038739.JPEG n04483307/
+mv val/ILSVRC2012_val_00038740.JPEG n02120079/
+mv val/ILSVRC2012_val_00038741.JPEG n01914609/
+mv val/ILSVRC2012_val_00038742.JPEG n01630670/
+mv val/ILSVRC2012_val_00038743.JPEG n04259630/
+mv val/ILSVRC2012_val_00038744.JPEG n07695742/
+mv val/ILSVRC2012_val_00038745.JPEG n02106030/
+mv val/ILSVRC2012_val_00038746.JPEG n02883205/
+mv val/ILSVRC2012_val_00038747.JPEG n02398521/
+mv val/ILSVRC2012_val_00038748.JPEG n03995372/
+mv val/ILSVRC2012_val_00038749.JPEG n07590611/
+mv val/ILSVRC2012_val_00038750.JPEG n04099969/
+mv val/ILSVRC2012_val_00038751.JPEG n02110063/
+mv val/ILSVRC2012_val_00038752.JPEG n03785016/
+mv val/ILSVRC2012_val_00038753.JPEG n02669723/
+mv val/ILSVRC2012_val_00038754.JPEG n03125729/
+mv val/ILSVRC2012_val_00038755.JPEG n04442312/
+mv val/ILSVRC2012_val_00038756.JPEG n07920052/
+mv val/ILSVRC2012_val_00038757.JPEG n02497673/
+mv val/ILSVRC2012_val_00038758.JPEG n02454379/
+mv val/ILSVRC2012_val_00038759.JPEG n02091831/
+mv val/ILSVRC2012_val_00038760.JPEG n02454379/
+mv val/ILSVRC2012_val_00038761.JPEG n02088632/
+mv val/ILSVRC2012_val_00038762.JPEG n02115641/
+mv val/ILSVRC2012_val_00038763.JPEG n03761084/
+mv val/ILSVRC2012_val_00038764.JPEG n02606052/
+mv val/ILSVRC2012_val_00038765.JPEG n02264363/
+mv val/ILSVRC2012_val_00038766.JPEG n01843065/
+mv val/ILSVRC2012_val_00038767.JPEG n03623198/
+mv val/ILSVRC2012_val_00038768.JPEG n03445777/
+mv val/ILSVRC2012_val_00038769.JPEG n02481823/
+mv val/ILSVRC2012_val_00038770.JPEG n01773157/
+mv val/ILSVRC2012_val_00038771.JPEG n03109150/
+mv val/ILSVRC2012_val_00038772.JPEG n04458633/
+mv val/ILSVRC2012_val_00038773.JPEG n02165456/
+mv val/ILSVRC2012_val_00038774.JPEG n02190166/
+mv val/ILSVRC2012_val_00038775.JPEG n04111531/
+mv val/ILSVRC2012_val_00038776.JPEG n03197337/
+mv val/ILSVRC2012_val_00038777.JPEG n04542943/
+mv val/ILSVRC2012_val_00038778.JPEG n04507155/
+mv val/ILSVRC2012_val_00038779.JPEG n02089867/
+mv val/ILSVRC2012_val_00038780.JPEG n02342885/
+mv val/ILSVRC2012_val_00038781.JPEG n02099601/
+mv val/ILSVRC2012_val_00038782.JPEG n03787032/
+mv val/ILSVRC2012_val_00038783.JPEG n03483316/
+mv val/ILSVRC2012_val_00038784.JPEG n02454379/
+mv val/ILSVRC2012_val_00038785.JPEG n04041544/
+mv val/ILSVRC2012_val_00038786.JPEG n02086079/
+mv val/ILSVRC2012_val_00038787.JPEG n04485082/
+mv val/ILSVRC2012_val_00038788.JPEG n07831146/
+mv val/ILSVRC2012_val_00038789.JPEG n02106030/
+mv val/ILSVRC2012_val_00038790.JPEG n03445777/
+mv val/ILSVRC2012_val_00038791.JPEG n02398521/
+mv val/ILSVRC2012_val_00038792.JPEG n02666196/
+mv val/ILSVRC2012_val_00038793.JPEG n02009912/
+mv val/ILSVRC2012_val_00038794.JPEG n01534433/
+mv val/ILSVRC2012_val_00038795.JPEG n03126707/
+mv val/ILSVRC2012_val_00038796.JPEG n12057211/
+mv val/ILSVRC2012_val_00038797.JPEG n04355933/
+mv val/ILSVRC2012_val_00038798.JPEG n02025239/
+mv val/ILSVRC2012_val_00038799.JPEG n04336792/
+mv val/ILSVRC2012_val_00038800.JPEG n02906734/
+mv val/ILSVRC2012_val_00038801.JPEG n02002556/
+mv val/ILSVRC2012_val_00038802.JPEG n04487394/
+mv val/ILSVRC2012_val_00038803.JPEG n03291819/
+mv val/ILSVRC2012_val_00038804.JPEG n01614925/
+mv val/ILSVRC2012_val_00038805.JPEG n04235860/
+mv val/ILSVRC2012_val_00038806.JPEG n04270147/
+mv val/ILSVRC2012_val_00038807.JPEG n03291819/
+mv val/ILSVRC2012_val_00038808.JPEG n03837869/
+mv val/ILSVRC2012_val_00038809.JPEG n04192698/
+mv val/ILSVRC2012_val_00038810.JPEG n04120489/
+mv val/ILSVRC2012_val_00038811.JPEG n02930766/
+mv val/ILSVRC2012_val_00038812.JPEG n02128385/
+mv val/ILSVRC2012_val_00038813.JPEG n02837789/
+mv val/ILSVRC2012_val_00038814.JPEG n02105505/
+mv val/ILSVRC2012_val_00038815.JPEG n01704323/
+mv val/ILSVRC2012_val_00038816.JPEG n02481823/
+mv val/ILSVRC2012_val_00038817.JPEG n03384352/
+mv val/ILSVRC2012_val_00038818.JPEG n02167151/
+mv val/ILSVRC2012_val_00038819.JPEG n07753592/
+mv val/ILSVRC2012_val_00038820.JPEG n07614500/
+mv val/ILSVRC2012_val_00038821.JPEG n02134084/
+mv val/ILSVRC2012_val_00038822.JPEG n04515003/
+mv val/ILSVRC2012_val_00038823.JPEG n01729322/
+mv val/ILSVRC2012_val_00038824.JPEG n04033901/
+mv val/ILSVRC2012_val_00038825.JPEG n02134418/
+mv val/ILSVRC2012_val_00038826.JPEG n01514668/
+mv val/ILSVRC2012_val_00038827.JPEG n03942813/
+mv val/ILSVRC2012_val_00038828.JPEG n02101556/
+mv val/ILSVRC2012_val_00038829.JPEG n03642806/
+mv val/ILSVRC2012_val_00038830.JPEG n03733131/
+mv val/ILSVRC2012_val_00038831.JPEG n03290653/
+mv val/ILSVRC2012_val_00038832.JPEG n02174001/
+mv val/ILSVRC2012_val_00038833.JPEG n01784675/
+mv val/ILSVRC2012_val_00038834.JPEG n03777754/
+mv val/ILSVRC2012_val_00038835.JPEG n03942813/
+mv val/ILSVRC2012_val_00038836.JPEG n02802426/
+mv val/ILSVRC2012_val_00038837.JPEG n04049303/
+mv val/ILSVRC2012_val_00038838.JPEG n03535780/
+mv val/ILSVRC2012_val_00038839.JPEG n02492035/
+mv val/ILSVRC2012_val_00038840.JPEG n04070727/
+mv val/ILSVRC2012_val_00038841.JPEG n03075370/
+mv val/ILSVRC2012_val_00038842.JPEG n04372370/
+mv val/ILSVRC2012_val_00038843.JPEG n07860988/
+mv val/ILSVRC2012_val_00038844.JPEG n04367480/
+mv val/ILSVRC2012_val_00038845.JPEG n03786901/
+mv val/ILSVRC2012_val_00038846.JPEG n04562935/
+mv val/ILSVRC2012_val_00038847.JPEG n07590611/
+mv val/ILSVRC2012_val_00038848.JPEG n02102973/
+mv val/ILSVRC2012_val_00038849.JPEG n07248320/
+mv val/ILSVRC2012_val_00038850.JPEG n03095699/
+mv val/ILSVRC2012_val_00038851.JPEG n04009552/
+mv val/ILSVRC2012_val_00038852.JPEG n07614500/
+mv val/ILSVRC2012_val_00038853.JPEG n09288635/
+mv val/ILSVRC2012_val_00038854.JPEG n03724870/
+mv val/ILSVRC2012_val_00038855.JPEG n04258138/
+mv val/ILSVRC2012_val_00038856.JPEG n01698640/
+mv val/ILSVRC2012_val_00038857.JPEG n07753113/
+mv val/ILSVRC2012_val_00038858.JPEG n04263257/
+mv val/ILSVRC2012_val_00038859.JPEG n01755581/
+mv val/ILSVRC2012_val_00038860.JPEG n04447861/
+mv val/ILSVRC2012_val_00038861.JPEG n02666196/
+mv val/ILSVRC2012_val_00038862.JPEG n03733281/
+mv val/ILSVRC2012_val_00038863.JPEG n02051845/
+mv val/ILSVRC2012_val_00038864.JPEG n02058221/
+mv val/ILSVRC2012_val_00038865.JPEG n03958227/
+mv val/ILSVRC2012_val_00038866.JPEG n02403003/
+mv val/ILSVRC2012_val_00038867.JPEG n02097474/
+mv val/ILSVRC2012_val_00038868.JPEG n02099429/
+mv val/ILSVRC2012_val_00038869.JPEG n02484975/
+mv val/ILSVRC2012_val_00038870.JPEG n07836838/
+mv val/ILSVRC2012_val_00038871.JPEG n10565667/
+mv val/ILSVRC2012_val_00038872.JPEG n07720875/
+mv val/ILSVRC2012_val_00038873.JPEG n02486261/
+mv val/ILSVRC2012_val_00038874.JPEG n02321529/
+mv val/ILSVRC2012_val_00038875.JPEG n01755581/
+mv val/ILSVRC2012_val_00038876.JPEG n03100240/
+mv val/ILSVRC2012_val_00038877.JPEG n03063599/
+mv val/ILSVRC2012_val_00038878.JPEG n01664065/
+mv val/ILSVRC2012_val_00038879.JPEG n02783161/
+mv val/ILSVRC2012_val_00038880.JPEG n03803284/
+mv val/ILSVRC2012_val_00038881.JPEG n03110669/
+mv val/ILSVRC2012_val_00038882.JPEG n02086240/
+mv val/ILSVRC2012_val_00038883.JPEG n02487347/
+mv val/ILSVRC2012_val_00038884.JPEG n02097209/
+mv val/ILSVRC2012_val_00038885.JPEG n04310018/
+mv val/ILSVRC2012_val_00038886.JPEG n02012849/
+mv val/ILSVRC2012_val_00038887.JPEG n04120489/
+mv val/ILSVRC2012_val_00038888.JPEG n03482405/
+mv val/ILSVRC2012_val_00038889.JPEG n02447366/
+mv val/ILSVRC2012_val_00038890.JPEG n01749939/
+mv val/ILSVRC2012_val_00038891.JPEG n03478589/
+mv val/ILSVRC2012_val_00038892.JPEG n02963159/
+mv val/ILSVRC2012_val_00038893.JPEG n04428191/
+mv val/ILSVRC2012_val_00038894.JPEG n04285008/
+mv val/ILSVRC2012_val_00038895.JPEG n01530575/
+mv val/ILSVRC2012_val_00038896.JPEG n02111129/
+mv val/ILSVRC2012_val_00038897.JPEG n03109150/
+mv val/ILSVRC2012_val_00038898.JPEG n07697313/
+mv val/ILSVRC2012_val_00038899.JPEG n02802426/
+mv val/ILSVRC2012_val_00038900.JPEG n03690938/
+mv val/ILSVRC2012_val_00038901.JPEG n01914609/
+mv val/ILSVRC2012_val_00038902.JPEG n02481823/
+mv val/ILSVRC2012_val_00038903.JPEG n02259212/
+mv val/ILSVRC2012_val_00038904.JPEG n03538406/
+mv val/ILSVRC2012_val_00038905.JPEG n15075141/
+mv val/ILSVRC2012_val_00038906.JPEG n03649909/
+mv val/ILSVRC2012_val_00038907.JPEG n04483307/
+mv val/ILSVRC2012_val_00038908.JPEG n04613696/
+mv val/ILSVRC2012_val_00038909.JPEG n10565667/
+mv val/ILSVRC2012_val_00038910.JPEG n02488702/
+mv val/ILSVRC2012_val_00038911.JPEG n02094258/
+mv val/ILSVRC2012_val_00038912.JPEG n02096585/
+mv val/ILSVRC2012_val_00038913.JPEG n02127052/
+mv val/ILSVRC2012_val_00038914.JPEG n02391049/
+mv val/ILSVRC2012_val_00038915.JPEG n01734418/
+mv val/ILSVRC2012_val_00038916.JPEG n09332890/
+mv val/ILSVRC2012_val_00038917.JPEG n03379051/
+mv val/ILSVRC2012_val_00038918.JPEG n02133161/
+mv val/ILSVRC2012_val_00038919.JPEG n12144580/
+mv val/ILSVRC2012_val_00038920.JPEG n02099429/
+mv val/ILSVRC2012_val_00038921.JPEG n04447861/
+mv val/ILSVRC2012_val_00038922.JPEG n04120489/
+mv val/ILSVRC2012_val_00038923.JPEG n07860988/
+mv val/ILSVRC2012_val_00038924.JPEG n02129604/
+mv val/ILSVRC2012_val_00038925.JPEG n03065424/
+mv val/ILSVRC2012_val_00038926.JPEG n02095314/
+mv val/ILSVRC2012_val_00038927.JPEG n04154565/
+mv val/ILSVRC2012_val_00038928.JPEG n02655020/
+mv val/ILSVRC2012_val_00038929.JPEG n02165105/
+mv val/ILSVRC2012_val_00038930.JPEG n04275548/
+mv val/ILSVRC2012_val_00038931.JPEG n02415577/
+mv val/ILSVRC2012_val_00038932.JPEG n02786058/
+mv val/ILSVRC2012_val_00038933.JPEG n02091467/
+mv val/ILSVRC2012_val_00038934.JPEG n03444034/
+mv val/ILSVRC2012_val_00038935.JPEG n01498041/
+mv val/ILSVRC2012_val_00038936.JPEG n07590611/
+mv val/ILSVRC2012_val_00038937.JPEG n04554684/
+mv val/ILSVRC2012_val_00038938.JPEG n02109047/
+mv val/ILSVRC2012_val_00038939.JPEG n04552348/
+mv val/ILSVRC2012_val_00038940.JPEG n03814639/
+mv val/ILSVRC2012_val_00038941.JPEG n03125729/
+mv val/ILSVRC2012_val_00038942.JPEG n03888257/
+mv val/ILSVRC2012_val_00038943.JPEG n03950228/
+mv val/ILSVRC2012_val_00038944.JPEG n02089973/
+mv val/ILSVRC2012_val_00038945.JPEG n03967562/
+mv val/ILSVRC2012_val_00038946.JPEG n02749479/
+mv val/ILSVRC2012_val_00038947.JPEG n03729826/
+mv val/ILSVRC2012_val_00038948.JPEG n02018207/
+mv val/ILSVRC2012_val_00038949.JPEG n04487081/
+mv val/ILSVRC2012_val_00038950.JPEG n03017168/
+mv val/ILSVRC2012_val_00038951.JPEG n03976657/
+mv val/ILSVRC2012_val_00038952.JPEG n03938244/
+mv val/ILSVRC2012_val_00038953.JPEG n02769748/
+mv val/ILSVRC2012_val_00038954.JPEG n07836838/
+mv val/ILSVRC2012_val_00038955.JPEG n02002724/
+mv val/ILSVRC2012_val_00038956.JPEG n03100240/
+mv val/ILSVRC2012_val_00038957.JPEG n03598930/
+mv val/ILSVRC2012_val_00038958.JPEG n04479046/
+mv val/ILSVRC2012_val_00038959.JPEG n01644373/
+mv val/ILSVRC2012_val_00038960.JPEG n02708093/
+mv val/ILSVRC2012_val_00038961.JPEG n02134418/
+mv val/ILSVRC2012_val_00038962.JPEG n13054560/
+mv val/ILSVRC2012_val_00038963.JPEG n09332890/
+mv val/ILSVRC2012_val_00038964.JPEG n03133878/
+mv val/ILSVRC2012_val_00038965.JPEG n04554684/
+mv val/ILSVRC2012_val_00038966.JPEG n03041632/
+mv val/ILSVRC2012_val_00038967.JPEG n02869837/
+mv val/ILSVRC2012_val_00038968.JPEG n03014705/
+mv val/ILSVRC2012_val_00038969.JPEG n02510455/
+mv val/ILSVRC2012_val_00038970.JPEG n03954731/
+mv val/ILSVRC2012_val_00038971.JPEG n02788148/
+mv val/ILSVRC2012_val_00038972.JPEG n02859443/
+mv val/ILSVRC2012_val_00038973.JPEG n02640242/
+mv val/ILSVRC2012_val_00038974.JPEG n02087046/
+mv val/ILSVRC2012_val_00038975.JPEG n03891332/
+mv val/ILSVRC2012_val_00038976.JPEG n02124075/
+mv val/ILSVRC2012_val_00038977.JPEG n03476684/
+mv val/ILSVRC2012_val_00038978.JPEG n04270147/
+mv val/ILSVRC2012_val_00038979.JPEG n04542943/
+mv val/ILSVRC2012_val_00038980.JPEG n03916031/
+mv val/ILSVRC2012_val_00038981.JPEG n02051845/
+mv val/ILSVRC2012_val_00038982.JPEG n02104029/
+mv val/ILSVRC2012_val_00038983.JPEG n04270147/
+mv val/ILSVRC2012_val_00038984.JPEG n02422106/
+mv val/ILSVRC2012_val_00038985.JPEG n03692522/
+mv val/ILSVRC2012_val_00038986.JPEG n02115641/
+mv val/ILSVRC2012_val_00038987.JPEG n02447366/
+mv val/ILSVRC2012_val_00038988.JPEG n03710721/
+mv val/ILSVRC2012_val_00038989.JPEG n02112018/
+mv val/ILSVRC2012_val_00038990.JPEG n03000134/
+mv val/ILSVRC2012_val_00038991.JPEG n02105162/
+mv val/ILSVRC2012_val_00038992.JPEG n02097047/
+mv val/ILSVRC2012_val_00038993.JPEG n02356798/
+mv val/ILSVRC2012_val_00038994.JPEG n04037443/
+mv val/ILSVRC2012_val_00038995.JPEG n02071294/
+mv val/ILSVRC2012_val_00038996.JPEG n07892512/
+mv val/ILSVRC2012_val_00038997.JPEG n03924679/
+mv val/ILSVRC2012_val_00038998.JPEG n01687978/
+mv val/ILSVRC2012_val_00038999.JPEG n02098286/
+mv val/ILSVRC2012_val_00039000.JPEG n03345487/
+mv val/ILSVRC2012_val_00039001.JPEG n04254777/
+mv val/ILSVRC2012_val_00039002.JPEG n03680355/
+mv val/ILSVRC2012_val_00039003.JPEG n02963159/
+mv val/ILSVRC2012_val_00039004.JPEG n01582220/
+mv val/ILSVRC2012_val_00039005.JPEG n04090263/
+mv val/ILSVRC2012_val_00039006.JPEG n03761084/
+mv val/ILSVRC2012_val_00039007.JPEG n04604644/
+mv val/ILSVRC2012_val_00039008.JPEG n02097209/
+mv val/ILSVRC2012_val_00039009.JPEG n03109150/
+mv val/ILSVRC2012_val_00039010.JPEG n02088632/
+mv val/ILSVRC2012_val_00039011.JPEG n03937543/
+mv val/ILSVRC2012_val_00039012.JPEG n01943899/
+mv val/ILSVRC2012_val_00039013.JPEG n02093647/
+mv val/ILSVRC2012_val_00039014.JPEG n02093428/
+mv val/ILSVRC2012_val_00039015.JPEG n03461385/
+mv val/ILSVRC2012_val_00039016.JPEG n04270147/
+mv val/ILSVRC2012_val_00039017.JPEG n04389033/
+mv val/ILSVRC2012_val_00039018.JPEG n03534580/
+mv val/ILSVRC2012_val_00039019.JPEG n09468604/
+mv val/ILSVRC2012_val_00039020.JPEG n02107312/
+mv val/ILSVRC2012_val_00039021.JPEG n01797886/
+mv val/ILSVRC2012_val_00039022.JPEG n02090379/
+mv val/ILSVRC2012_val_00039023.JPEG n02871525/
+mv val/ILSVRC2012_val_00039024.JPEG n01667778/
+mv val/ILSVRC2012_val_00039025.JPEG n01773549/
+mv val/ILSVRC2012_val_00039026.JPEG n01755581/
+mv val/ILSVRC2012_val_00039027.JPEG n02093991/
+mv val/ILSVRC2012_val_00039028.JPEG n04350905/
+mv val/ILSVRC2012_val_00039029.JPEG n03995372/
+mv val/ILSVRC2012_val_00039030.JPEG n02280649/
+mv val/ILSVRC2012_val_00039031.JPEG n03933933/
+mv val/ILSVRC2012_val_00039032.JPEG n02226429/
+mv val/ILSVRC2012_val_00039033.JPEG n03207941/
+mv val/ILSVRC2012_val_00039034.JPEG n09399592/
+mv val/ILSVRC2012_val_00039035.JPEG n02106030/
+mv val/ILSVRC2012_val_00039036.JPEG n03590841/
+mv val/ILSVRC2012_val_00039037.JPEG n02966193/
+mv val/ILSVRC2012_val_00039038.JPEG n03787032/
+mv val/ILSVRC2012_val_00039039.JPEG n02115913/
+mv val/ILSVRC2012_val_00039040.JPEG n04099969/
+mv val/ILSVRC2012_val_00039041.JPEG n04273569/
+mv val/ILSVRC2012_val_00039042.JPEG n02037110/
+mv val/ILSVRC2012_val_00039043.JPEG n01917289/
+mv val/ILSVRC2012_val_00039044.JPEG n04254777/
+mv val/ILSVRC2012_val_00039045.JPEG n03888257/
+mv val/ILSVRC2012_val_00039046.JPEG n02807133/
+mv val/ILSVRC2012_val_00039047.JPEG n04589890/
+mv val/ILSVRC2012_val_00039048.JPEG n02091032/
+mv val/ILSVRC2012_val_00039049.JPEG n01685808/
+mv val/ILSVRC2012_val_00039050.JPEG n07714571/
+mv val/ILSVRC2012_val_00039051.JPEG n03777568/
+mv val/ILSVRC2012_val_00039052.JPEG n03379051/
+mv val/ILSVRC2012_val_00039053.JPEG n03028079/
+mv val/ILSVRC2012_val_00039054.JPEG n04275548/
+mv val/ILSVRC2012_val_00039055.JPEG n02395406/
+mv val/ILSVRC2012_val_00039056.JPEG n04040759/
+mv val/ILSVRC2012_val_00039057.JPEG n02109961/
+mv val/ILSVRC2012_val_00039058.JPEG n01872401/
+mv val/ILSVRC2012_val_00039059.JPEG n03825788/
+mv val/ILSVRC2012_val_00039060.JPEG n02112706/
+mv val/ILSVRC2012_val_00039061.JPEG n03692522/
+mv val/ILSVRC2012_val_00039062.JPEG n02086910/
+mv val/ILSVRC2012_val_00039063.JPEG n02321529/
+mv val/ILSVRC2012_val_00039064.JPEG n03131574/
+mv val/ILSVRC2012_val_00039065.JPEG n04311004/
+mv val/ILSVRC2012_val_00039066.JPEG n03929855/
+mv val/ILSVRC2012_val_00039067.JPEG n01514859/
+mv val/ILSVRC2012_val_00039068.JPEG n03804744/
+mv val/ILSVRC2012_val_00039069.JPEG n03417042/
+mv val/ILSVRC2012_val_00039070.JPEG n02794156/
+mv val/ILSVRC2012_val_00039071.JPEG n07730033/
+mv val/ILSVRC2012_val_00039072.JPEG n04120489/
+mv val/ILSVRC2012_val_00039073.JPEG n02342885/
+mv val/ILSVRC2012_val_00039074.JPEG n04041544/
+mv val/ILSVRC2012_val_00039075.JPEG n04366367/
+mv val/ILSVRC2012_val_00039076.JPEG n02116738/
+mv val/ILSVRC2012_val_00039077.JPEG n02992211/
+mv val/ILSVRC2012_val_00039078.JPEG n02276258/
+mv val/ILSVRC2012_val_00039079.JPEG n02895154/
+mv val/ILSVRC2012_val_00039080.JPEG n01984695/
+mv val/ILSVRC2012_val_00039081.JPEG n03661043/
+mv val/ILSVRC2012_val_00039082.JPEG n03207941/
+mv val/ILSVRC2012_val_00039083.JPEG n02025239/
+mv val/ILSVRC2012_val_00039084.JPEG n02123045/
+mv val/ILSVRC2012_val_00039085.JPEG n02117135/
+mv val/ILSVRC2012_val_00039086.JPEG n02107908/
+mv val/ILSVRC2012_val_00039087.JPEG n02815834/
+mv val/ILSVRC2012_val_00039088.JPEG n04355933/
+mv val/ILSVRC2012_val_00039089.JPEG n03598930/
+mv val/ILSVRC2012_val_00039090.JPEG n07742313/
+mv val/ILSVRC2012_val_00039091.JPEG n03876231/
+mv val/ILSVRC2012_val_00039092.JPEG n02259212/
+mv val/ILSVRC2012_val_00039093.JPEG n01775062/
+mv val/ILSVRC2012_val_00039094.JPEG n03617480/
+mv val/ILSVRC2012_val_00039095.JPEG n03840681/
+mv val/ILSVRC2012_val_00039096.JPEG n03902125/
+mv val/ILSVRC2012_val_00039097.JPEG n02930766/
+mv val/ILSVRC2012_val_00039098.JPEG n03633091/
+mv val/ILSVRC2012_val_00039099.JPEG n04404412/
+mv val/ILSVRC2012_val_00039100.JPEG n03825788/
+mv val/ILSVRC2012_val_00039101.JPEG n03337140/
+mv val/ILSVRC2012_val_00039102.JPEG n02018795/
+mv val/ILSVRC2012_val_00039103.JPEG n02447366/
+mv val/ILSVRC2012_val_00039104.JPEG n07613480/
+mv val/ILSVRC2012_val_00039105.JPEG n02493793/
+mv val/ILSVRC2012_val_00039106.JPEG n01694178/
+mv val/ILSVRC2012_val_00039107.JPEG n12620546/
+mv val/ILSVRC2012_val_00039108.JPEG n06874185/
+mv val/ILSVRC2012_val_00039109.JPEG n02443484/
+mv val/ILSVRC2012_val_00039110.JPEG n04209133/
+mv val/ILSVRC2012_val_00039111.JPEG n04515003/
+mv val/ILSVRC2012_val_00039112.JPEG n04540053/
+mv val/ILSVRC2012_val_00039113.JPEG n01796340/
+mv val/ILSVRC2012_val_00039114.JPEG n03623198/
+mv val/ILSVRC2012_val_00039115.JPEG n02108551/
+mv val/ILSVRC2012_val_00039116.JPEG n03763968/
+mv val/ILSVRC2012_val_00039117.JPEG n02410509/
+mv val/ILSVRC2012_val_00039118.JPEG n11879895/
+mv val/ILSVRC2012_val_00039119.JPEG n03832673/
+mv val/ILSVRC2012_val_00039120.JPEG n03930630/
+mv val/ILSVRC2012_val_00039121.JPEG n02490219/
+mv val/ILSVRC2012_val_00039122.JPEG n03937543/
+mv val/ILSVRC2012_val_00039123.JPEG n02111889/
+mv val/ILSVRC2012_val_00039124.JPEG n02096437/
+mv val/ILSVRC2012_val_00039125.JPEG n04154565/
+mv val/ILSVRC2012_val_00039126.JPEG n02971356/
+mv val/ILSVRC2012_val_00039127.JPEG n02865351/
+mv val/ILSVRC2012_val_00039128.JPEG n03776460/
+mv val/ILSVRC2012_val_00039129.JPEG n02777292/
+mv val/ILSVRC2012_val_00039130.JPEG n02190166/
+mv val/ILSVRC2012_val_00039131.JPEG n04612504/
+mv val/ILSVRC2012_val_00039132.JPEG n04081281/
+mv val/ILSVRC2012_val_00039133.JPEG n02747177/
+mv val/ILSVRC2012_val_00039134.JPEG n03777754/
+mv val/ILSVRC2012_val_00039135.JPEG n02445715/
+mv val/ILSVRC2012_val_00039136.JPEG n03857828/
+mv val/ILSVRC2012_val_00039137.JPEG n11939491/
+mv val/ILSVRC2012_val_00039138.JPEG n01981276/
+mv val/ILSVRC2012_val_00039139.JPEG n04041544/
+mv val/ILSVRC2012_val_00039140.JPEG n04458633/
+mv val/ILSVRC2012_val_00039141.JPEG n03447721/
+mv val/ILSVRC2012_val_00039142.JPEG n02106030/
+mv val/ILSVRC2012_val_00039143.JPEG n02834397/
+mv val/ILSVRC2012_val_00039144.JPEG n02097474/
+mv val/ILSVRC2012_val_00039145.JPEG n01877812/
+mv val/ILSVRC2012_val_00039146.JPEG n02085936/
+mv val/ILSVRC2012_val_00039147.JPEG n02096051/
+mv val/ILSVRC2012_val_00039148.JPEG n03272562/
+mv val/ILSVRC2012_val_00039149.JPEG n03793489/
+mv val/ILSVRC2012_val_00039150.JPEG n02099849/
+mv val/ILSVRC2012_val_00039151.JPEG n03649909/
+mv val/ILSVRC2012_val_00039152.JPEG n01882714/
+mv val/ILSVRC2012_val_00039153.JPEG n02860847/
+mv val/ILSVRC2012_val_00039154.JPEG n04039381/
+mv val/ILSVRC2012_val_00039155.JPEG n04264628/
+mv val/ILSVRC2012_val_00039156.JPEG n02484975/
+mv val/ILSVRC2012_val_00039157.JPEG n02167151/
+mv val/ILSVRC2012_val_00039158.JPEG n02074367/
+mv val/ILSVRC2012_val_00039159.JPEG n01773549/
+mv val/ILSVRC2012_val_00039160.JPEG n04367480/
+mv val/ILSVRC2012_val_00039161.JPEG n07718747/
+mv val/ILSVRC2012_val_00039162.JPEG n02841315/
+mv val/ILSVRC2012_val_00039163.JPEG n02910353/
+mv val/ILSVRC2012_val_00039164.JPEG n02106550/
+mv val/ILSVRC2012_val_00039165.JPEG n03602883/
+mv val/ILSVRC2012_val_00039166.JPEG n04153751/
+mv val/ILSVRC2012_val_00039167.JPEG n03992509/
+mv val/ILSVRC2012_val_00039168.JPEG n09468604/
+mv val/ILSVRC2012_val_00039169.JPEG n02129604/
+mv val/ILSVRC2012_val_00039170.JPEG n09229709/
+mv val/ILSVRC2012_val_00039171.JPEG n02056570/
+mv val/ILSVRC2012_val_00039172.JPEG n03594734/
+mv val/ILSVRC2012_val_00039173.JPEG n02111277/
+mv val/ILSVRC2012_val_00039174.JPEG n07590611/
+mv val/ILSVRC2012_val_00039175.JPEG n02704792/
+mv val/ILSVRC2012_val_00039176.JPEG n03868863/
+mv val/ILSVRC2012_val_00039177.JPEG n02115641/
+mv val/ILSVRC2012_val_00039178.JPEG n02444819/
+mv val/ILSVRC2012_val_00039179.JPEG n02808304/
+mv val/ILSVRC2012_val_00039180.JPEG n04355338/
+mv val/ILSVRC2012_val_00039181.JPEG n02281787/
+mv val/ILSVRC2012_val_00039182.JPEG n02138441/
+mv val/ILSVRC2012_val_00039183.JPEG n03814906/
+mv val/ILSVRC2012_val_00039184.JPEG n04409515/
+mv val/ILSVRC2012_val_00039185.JPEG n01739381/
+mv val/ILSVRC2012_val_00039186.JPEG n03495258/
+mv val/ILSVRC2012_val_00039187.JPEG n03627232/
+mv val/ILSVRC2012_val_00039188.JPEG n02085620/
+mv val/ILSVRC2012_val_00039189.JPEG n02190166/
+mv val/ILSVRC2012_val_00039190.JPEG n03355925/
+mv val/ILSVRC2012_val_00039191.JPEG n03188531/
+mv val/ILSVRC2012_val_00039192.JPEG n02100735/
+mv val/ILSVRC2012_val_00039193.JPEG n03961711/
+mv val/ILSVRC2012_val_00039194.JPEG n02823428/
+mv val/ILSVRC2012_val_00039195.JPEG n07860988/
+mv val/ILSVRC2012_val_00039196.JPEG n01740131/
+mv val/ILSVRC2012_val_00039197.JPEG n09229709/
+mv val/ILSVRC2012_val_00039198.JPEG n03777568/
+mv val/ILSVRC2012_val_00039199.JPEG n03908618/
+mv val/ILSVRC2012_val_00039200.JPEG n02108551/
+mv val/ILSVRC2012_val_00039201.JPEG n02177972/
+mv val/ILSVRC2012_val_00039202.JPEG n09288635/
+mv val/ILSVRC2012_val_00039203.JPEG n01693334/
+mv val/ILSVRC2012_val_00039204.JPEG n02106382/
+mv val/ILSVRC2012_val_00039205.JPEG n04026417/
+mv val/ILSVRC2012_val_00039206.JPEG n03388183/
+mv val/ILSVRC2012_val_00039207.JPEG n02002724/
+mv val/ILSVRC2012_val_00039208.JPEG n03208938/
+mv val/ILSVRC2012_val_00039209.JPEG n04517823/
+mv val/ILSVRC2012_val_00039210.JPEG n04336792/
+mv val/ILSVRC2012_val_00039211.JPEG n03658185/
+mv val/ILSVRC2012_val_00039212.JPEG n02097474/
+mv val/ILSVRC2012_val_00039213.JPEG n02690373/
+mv val/ILSVRC2012_val_00039214.JPEG n13044778/
+mv val/ILSVRC2012_val_00039215.JPEG n02281787/
+mv val/ILSVRC2012_val_00039216.JPEG n02641379/
+mv val/ILSVRC2012_val_00039217.JPEG n02130308/
+mv val/ILSVRC2012_val_00039218.JPEG n02704792/
+mv val/ILSVRC2012_val_00039219.JPEG n01582220/
+mv val/ILSVRC2012_val_00039220.JPEG n02027492/
+mv val/ILSVRC2012_val_00039221.JPEG n04525305/
+mv val/ILSVRC2012_val_00039222.JPEG n02119789/
+mv val/ILSVRC2012_val_00039223.JPEG n13054560/
+mv val/ILSVRC2012_val_00039224.JPEG n03724870/
+mv val/ILSVRC2012_val_00039225.JPEG n02488291/
+mv val/ILSVRC2012_val_00039226.JPEG n07697313/
+mv val/ILSVRC2012_val_00039227.JPEG n02132136/
+mv val/ILSVRC2012_val_00039228.JPEG n04336792/
+mv val/ILSVRC2012_val_00039229.JPEG n03983396/
+mv val/ILSVRC2012_val_00039230.JPEG n03944341/
+mv val/ILSVRC2012_val_00039231.JPEG n01774384/
+mv val/ILSVRC2012_val_00039232.JPEG n02027492/
+mv val/ILSVRC2012_val_00039233.JPEG n02091134/
+mv val/ILSVRC2012_val_00039234.JPEG n07860988/
+mv val/ILSVRC2012_val_00039235.JPEG n02106550/
+mv val/ILSVRC2012_val_00039236.JPEG n04357314/
+mv val/ILSVRC2012_val_00039237.JPEG n03662601/
+mv val/ILSVRC2012_val_00039238.JPEG n03868242/
+mv val/ILSVRC2012_val_00039239.JPEG n03804744/
+mv val/ILSVRC2012_val_00039240.JPEG n02112350/
+mv val/ILSVRC2012_val_00039241.JPEG n01774750/
+mv val/ILSVRC2012_val_00039242.JPEG n02088238/
+mv val/ILSVRC2012_val_00039243.JPEG n07718472/
+mv val/ILSVRC2012_val_00039244.JPEG n01742172/
+mv val/ILSVRC2012_val_00039245.JPEG n02992529/
+mv val/ILSVRC2012_val_00039246.JPEG n04404412/
+mv val/ILSVRC2012_val_00039247.JPEG n02089867/
+mv val/ILSVRC2012_val_00039248.JPEG n03345487/
+mv val/ILSVRC2012_val_00039249.JPEG n02437312/
+mv val/ILSVRC2012_val_00039250.JPEG n02930766/
+mv val/ILSVRC2012_val_00039251.JPEG n13133613/
+mv val/ILSVRC2012_val_00039252.JPEG n02206856/
+mv val/ILSVRC2012_val_00039253.JPEG n02486410/
+mv val/ILSVRC2012_val_00039254.JPEG n03843555/
+mv val/ILSVRC2012_val_00039255.JPEG n04476259/
+mv val/ILSVRC2012_val_00039256.JPEG n02094433/
+mv val/ILSVRC2012_val_00039257.JPEG n01843065/
+mv val/ILSVRC2012_val_00039258.JPEG n07714571/
+mv val/ILSVRC2012_val_00039259.JPEG n02389026/
+mv val/ILSVRC2012_val_00039260.JPEG n04099969/
+mv val/ILSVRC2012_val_00039261.JPEG n01843065/
+mv val/ILSVRC2012_val_00039262.JPEG n03180011/
+mv val/ILSVRC2012_val_00039263.JPEG n09472597/
+mv val/ILSVRC2012_val_00039264.JPEG n03670208/
+mv val/ILSVRC2012_val_00039265.JPEG n01751748/
+mv val/ILSVRC2012_val_00039266.JPEG n01807496/
+mv val/ILSVRC2012_val_00039267.JPEG n02229544/
+mv val/ILSVRC2012_val_00039268.JPEG n02101006/
+mv val/ILSVRC2012_val_00039269.JPEG n03188531/
+mv val/ILSVRC2012_val_00039270.JPEG n03290653/
+mv val/ILSVRC2012_val_00039271.JPEG n02403003/
+mv val/ILSVRC2012_val_00039272.JPEG n02699494/
+mv val/ILSVRC2012_val_00039273.JPEG n04266014/
+mv val/ILSVRC2012_val_00039274.JPEG n02708093/
+mv val/ILSVRC2012_val_00039275.JPEG n04399382/
+mv val/ILSVRC2012_val_00039276.JPEG n02804414/
+mv val/ILSVRC2012_val_00039277.JPEG n07747607/
+mv val/ILSVRC2012_val_00039278.JPEG n02749479/
+mv val/ILSVRC2012_val_00039279.JPEG n03424325/
+mv val/ILSVRC2012_val_00039280.JPEG n04522168/
+mv val/ILSVRC2012_val_00039281.JPEG n01843065/
+mv val/ILSVRC2012_val_00039282.JPEG n01682714/
+mv val/ILSVRC2012_val_00039283.JPEG n02138441/
+mv val/ILSVRC2012_val_00039284.JPEG n11879895/
+mv val/ILSVRC2012_val_00039285.JPEG n04355338/
+mv val/ILSVRC2012_val_00039286.JPEG n03662601/
+mv val/ILSVRC2012_val_00039287.JPEG n03658185/
+mv val/ILSVRC2012_val_00039288.JPEG n03483316/
+mv val/ILSVRC2012_val_00039289.JPEG n07718747/
+mv val/ILSVRC2012_val_00039290.JPEG n03476684/
+mv val/ILSVRC2012_val_00039291.JPEG n02110958/
+mv val/ILSVRC2012_val_00039292.JPEG n04040759/
+mv val/ILSVRC2012_val_00039293.JPEG n03814906/
+mv val/ILSVRC2012_val_00039294.JPEG n04461696/
+mv val/ILSVRC2012_val_00039295.JPEG n02492660/
+mv val/ILSVRC2012_val_00039296.JPEG n04044716/
+mv val/ILSVRC2012_val_00039297.JPEG n04596742/
+mv val/ILSVRC2012_val_00039298.JPEG n01770081/
+mv val/ILSVRC2012_val_00039299.JPEG n01806143/
+mv val/ILSVRC2012_val_00039300.JPEG n04589890/
+mv val/ILSVRC2012_val_00039301.JPEG n03016953/
+mv val/ILSVRC2012_val_00039302.JPEG n02493793/
+mv val/ILSVRC2012_val_00039303.JPEG n01983481/
+mv val/ILSVRC2012_val_00039304.JPEG n01484850/
+mv val/ILSVRC2012_val_00039305.JPEG n02981792/
+mv val/ILSVRC2012_val_00039306.JPEG n03710637/
+mv val/ILSVRC2012_val_00039307.JPEG n02104029/
+mv val/ILSVRC2012_val_00039308.JPEG n01498041/
+mv val/ILSVRC2012_val_00039309.JPEG n03976657/
+mv val/ILSVRC2012_val_00039310.JPEG n04009552/
+mv val/ILSVRC2012_val_00039311.JPEG n02790996/
+mv val/ILSVRC2012_val_00039312.JPEG n04235860/
+mv val/ILSVRC2012_val_00039313.JPEG n04447861/
+mv val/ILSVRC2012_val_00039314.JPEG n01910747/
+mv val/ILSVRC2012_val_00039315.JPEG n03481172/
+mv val/ILSVRC2012_val_00039316.JPEG n04090263/
+mv val/ILSVRC2012_val_00039317.JPEG n03929660/
+mv val/ILSVRC2012_val_00039318.JPEG n07248320/
+mv val/ILSVRC2012_val_00039319.JPEG n03271574/
+mv val/ILSVRC2012_val_00039320.JPEG n03661043/
+mv val/ILSVRC2012_val_00039321.JPEG n03954731/
+mv val/ILSVRC2012_val_00039322.JPEG n03016953/
+mv val/ILSVRC2012_val_00039323.JPEG n07614500/
+mv val/ILSVRC2012_val_00039324.JPEG n03920288/
+mv val/ILSVRC2012_val_00039325.JPEG n02091244/
+mv val/ILSVRC2012_val_00039326.JPEG n02676566/
+mv val/ILSVRC2012_val_00039327.JPEG n13044778/
+mv val/ILSVRC2012_val_00039328.JPEG n03843555/
+mv val/ILSVRC2012_val_00039329.JPEG n07871810/
+mv val/ILSVRC2012_val_00039330.JPEG n03832673/
+mv val/ILSVRC2012_val_00039331.JPEG n04252225/
+mv val/ILSVRC2012_val_00039332.JPEG n02174001/
+mv val/ILSVRC2012_val_00039333.JPEG n03832673/
+mv val/ILSVRC2012_val_00039334.JPEG n10148035/
+mv val/ILSVRC2012_val_00039335.JPEG n02280649/
+mv val/ILSVRC2012_val_00039336.JPEG n09229709/
+mv val/ILSVRC2012_val_00039337.JPEG n06874185/
+mv val/ILSVRC2012_val_00039338.JPEG n02823428/
+mv val/ILSVRC2012_val_00039339.JPEG n02692877/
+mv val/ILSVRC2012_val_00039340.JPEG n02823428/
+mv val/ILSVRC2012_val_00039341.JPEG n07753592/
+mv val/ILSVRC2012_val_00039342.JPEG n02782093/
+mv val/ILSVRC2012_val_00039343.JPEG n03459775/
+mv val/ILSVRC2012_val_00039344.JPEG n09288635/
+mv val/ILSVRC2012_val_00039345.JPEG n04204347/
+mv val/ILSVRC2012_val_00039346.JPEG n02483708/
+mv val/ILSVRC2012_val_00039347.JPEG n04461696/
+mv val/ILSVRC2012_val_00039348.JPEG n02791124/
+mv val/ILSVRC2012_val_00039349.JPEG n03710193/
+mv val/ILSVRC2012_val_00039350.JPEG n12768682/
+mv val/ILSVRC2012_val_00039351.JPEG n04435653/
+mv val/ILSVRC2012_val_00039352.JPEG n04204347/
+mv val/ILSVRC2012_val_00039353.JPEG n02669723/
+mv val/ILSVRC2012_val_00039354.JPEG n03657121/
+mv val/ILSVRC2012_val_00039355.JPEG n01518878/
+mv val/ILSVRC2012_val_00039356.JPEG n04026417/
+mv val/ILSVRC2012_val_00039357.JPEG n02319095/
+mv val/ILSVRC2012_val_00039358.JPEG n03791053/
+mv val/ILSVRC2012_val_00039359.JPEG n02110063/
+mv val/ILSVRC2012_val_00039360.JPEG n02281787/
+mv val/ILSVRC2012_val_00039361.JPEG n03197337/
+mv val/ILSVRC2012_val_00039362.JPEG n04152593/
+mv val/ILSVRC2012_val_00039363.JPEG n02025239/
+mv val/ILSVRC2012_val_00039364.JPEG n03633091/
+mv val/ILSVRC2012_val_00039365.JPEG n02259212/
+mv val/ILSVRC2012_val_00039366.JPEG n02423022/
+mv val/ILSVRC2012_val_00039367.JPEG n03891332/
+mv val/ILSVRC2012_val_00039368.JPEG n03874293/
+mv val/ILSVRC2012_val_00039369.JPEG n02071294/
+mv val/ILSVRC2012_val_00039370.JPEG n01773797/
+mv val/ILSVRC2012_val_00039371.JPEG n07711569/
+mv val/ILSVRC2012_val_00039372.JPEG n02007558/
+mv val/ILSVRC2012_val_00039373.JPEG n13133613/
+mv val/ILSVRC2012_val_00039374.JPEG n02017213/
+mv val/ILSVRC2012_val_00039375.JPEG n04270147/
+mv val/ILSVRC2012_val_00039376.JPEG n02113624/
+mv val/ILSVRC2012_val_00039377.JPEG n02916936/
+mv val/ILSVRC2012_val_00039378.JPEG n01675722/
+mv val/ILSVRC2012_val_00039379.JPEG n07614500/
+mv val/ILSVRC2012_val_00039380.JPEG n03673027/
+mv val/ILSVRC2012_val_00039381.JPEG n02109961/
+mv val/ILSVRC2012_val_00039382.JPEG n02950826/
+mv val/ILSVRC2012_val_00039383.JPEG n02966193/
+mv val/ILSVRC2012_val_00039384.JPEG n01685808/
+mv val/ILSVRC2012_val_00039385.JPEG n02804610/
+mv val/ILSVRC2012_val_00039386.JPEG n02095314/
+mv val/ILSVRC2012_val_00039387.JPEG n03929855/
+mv val/ILSVRC2012_val_00039388.JPEG n10565667/
+mv val/ILSVRC2012_val_00039389.JPEG n02013706/
+mv val/ILSVRC2012_val_00039390.JPEG n02123394/
+mv val/ILSVRC2012_val_00039391.JPEG n03590841/
+mv val/ILSVRC2012_val_00039392.JPEG n07711569/
+mv val/ILSVRC2012_val_00039393.JPEG n02113799/
+mv val/ILSVRC2012_val_00039394.JPEG n07860988/
+mv val/ILSVRC2012_val_00039395.JPEG n04367480/
+mv val/ILSVRC2012_val_00039396.JPEG n07873807/
+mv val/ILSVRC2012_val_00039397.JPEG n02096585/
+mv val/ILSVRC2012_val_00039398.JPEG n02002724/
+mv val/ILSVRC2012_val_00039399.JPEG n02134418/
+mv val/ILSVRC2012_val_00039400.JPEG n02398521/
+mv val/ILSVRC2012_val_00039401.JPEG n04033901/
+mv val/ILSVRC2012_val_00039402.JPEG n02110063/
+mv val/ILSVRC2012_val_00039403.JPEG n09468604/
+mv val/ILSVRC2012_val_00039404.JPEG n01990800/
+mv val/ILSVRC2012_val_00039405.JPEG n04423845/
+mv val/ILSVRC2012_val_00039406.JPEG n02177972/
+mv val/ILSVRC2012_val_00039407.JPEG n04447861/
+mv val/ILSVRC2012_val_00039408.JPEG n02096585/
+mv val/ILSVRC2012_val_00039409.JPEG n02442845/
+mv val/ILSVRC2012_val_00039410.JPEG n04265275/
+mv val/ILSVRC2012_val_00039411.JPEG n04317175/
+mv val/ILSVRC2012_val_00039412.JPEG n01807496/
+mv val/ILSVRC2012_val_00039413.JPEG n04366367/
+mv val/ILSVRC2012_val_00039414.JPEG n03814906/
+mv val/ILSVRC2012_val_00039415.JPEG n12998815/
+mv val/ILSVRC2012_val_00039416.JPEG n03482405/
+mv val/ILSVRC2012_val_00039417.JPEG n03884397/
+mv val/ILSVRC2012_val_00039418.JPEG n03673027/
+mv val/ILSVRC2012_val_00039419.JPEG n03673027/
+mv val/ILSVRC2012_val_00039420.JPEG n03793489/
+mv val/ILSVRC2012_val_00039421.JPEG n02443114/
+mv val/ILSVRC2012_val_00039422.JPEG n02988304/
+mv val/ILSVRC2012_val_00039423.JPEG n02422106/
+mv val/ILSVRC2012_val_00039424.JPEG n04326547/
+mv val/ILSVRC2012_val_00039425.JPEG n02992529/
+mv val/ILSVRC2012_val_00039426.JPEG n01860187/
+mv val/ILSVRC2012_val_00039427.JPEG n03895866/
+mv val/ILSVRC2012_val_00039428.JPEG n03180011/
+mv val/ILSVRC2012_val_00039429.JPEG n04118776/
+mv val/ILSVRC2012_val_00039430.JPEG n03461385/
+mv val/ILSVRC2012_val_00039431.JPEG n04275548/
+mv val/ILSVRC2012_val_00039432.JPEG n15075141/
+mv val/ILSVRC2012_val_00039433.JPEG n03761084/
+mv val/ILSVRC2012_val_00039434.JPEG n01944390/
+mv val/ILSVRC2012_val_00039435.JPEG n04317175/
+mv val/ILSVRC2012_val_00039436.JPEG n04152593/
+mv val/ILSVRC2012_val_00039437.JPEG n02927161/
+mv val/ILSVRC2012_val_00039438.JPEG n03956157/
+mv val/ILSVRC2012_val_00039439.JPEG n02085620/
+mv val/ILSVRC2012_val_00039440.JPEG n02727426/
+mv val/ILSVRC2012_val_00039441.JPEG n01667114/
+mv val/ILSVRC2012_val_00039442.JPEG n04493381/
+mv val/ILSVRC2012_val_00039443.JPEG n01729322/
+mv val/ILSVRC2012_val_00039444.JPEG n04081281/
+mv val/ILSVRC2012_val_00039445.JPEG n01484850/
+mv val/ILSVRC2012_val_00039446.JPEG n03124043/
+mv val/ILSVRC2012_val_00039447.JPEG n02841315/
+mv val/ILSVRC2012_val_00039448.JPEG n02108089/
+mv val/ILSVRC2012_val_00039449.JPEG n03345487/
+mv val/ILSVRC2012_val_00039450.JPEG n02892201/
+mv val/ILSVRC2012_val_00039451.JPEG n07875152/
+mv val/ILSVRC2012_val_00039452.JPEG n02093991/
+mv val/ILSVRC2012_val_00039453.JPEG n03697007/
+mv val/ILSVRC2012_val_00039454.JPEG n02119789/
+mv val/ILSVRC2012_val_00039455.JPEG n01739381/
+mv val/ILSVRC2012_val_00039456.JPEG n02319095/
+mv val/ILSVRC2012_val_00039457.JPEG n02361337/
+mv val/ILSVRC2012_val_00039458.JPEG n01883070/
+mv val/ILSVRC2012_val_00039459.JPEG n02492035/
+mv val/ILSVRC2012_val_00039460.JPEG n02107312/
+mv val/ILSVRC2012_val_00039461.JPEG n07715103/
+mv val/ILSVRC2012_val_00039462.JPEG n04264628/
+mv val/ILSVRC2012_val_00039463.JPEG n01843065/
+mv val/ILSVRC2012_val_00039464.JPEG n07860988/
+mv val/ILSVRC2012_val_00039465.JPEG n01795545/
+mv val/ILSVRC2012_val_00039466.JPEG n01592084/
+mv val/ILSVRC2012_val_00039467.JPEG n03676483/
+mv val/ILSVRC2012_val_00039468.JPEG n04254120/
+mv val/ILSVRC2012_val_00039469.JPEG n03223299/
+mv val/ILSVRC2012_val_00039470.JPEG n03220513/
+mv val/ILSVRC2012_val_00039471.JPEG n02108915/
+mv val/ILSVRC2012_val_00039472.JPEG n03873416/
+mv val/ILSVRC2012_val_00039473.JPEG n02128925/
+mv val/ILSVRC2012_val_00039474.JPEG n02389026/
+mv val/ILSVRC2012_val_00039475.JPEG n01698640/
+mv val/ILSVRC2012_val_00039476.JPEG n15075141/
+mv val/ILSVRC2012_val_00039477.JPEG n03028079/
+mv val/ILSVRC2012_val_00039478.JPEG n01644900/
+mv val/ILSVRC2012_val_00039479.JPEG n01694178/
+mv val/ILSVRC2012_val_00039480.JPEG n03761084/
+mv val/ILSVRC2012_val_00039481.JPEG n03873416/
+mv val/ILSVRC2012_val_00039482.JPEG n03710637/
+mv val/ILSVRC2012_val_00039483.JPEG n03924679/
+mv val/ILSVRC2012_val_00039484.JPEG n03627232/
+mv val/ILSVRC2012_val_00039485.JPEG n04542943/
+mv val/ILSVRC2012_val_00039486.JPEG n03095699/
+mv val/ILSVRC2012_val_00039487.JPEG n02100236/
+mv val/ILSVRC2012_val_00039488.JPEG n01784675/
+mv val/ILSVRC2012_val_00039489.JPEG n01744401/
+mv val/ILSVRC2012_val_00039490.JPEG n04153751/
+mv val/ILSVRC2012_val_00039491.JPEG n03770439/
+mv val/ILSVRC2012_val_00039492.JPEG n02107142/
+mv val/ILSVRC2012_val_00039493.JPEG n03297495/
+mv val/ILSVRC2012_val_00039494.JPEG n07753275/
+mv val/ILSVRC2012_val_00039495.JPEG n04008634/
+mv val/ILSVRC2012_val_00039496.JPEG n07615774/
+mv val/ILSVRC2012_val_00039497.JPEG n04550184/
+mv val/ILSVRC2012_val_00039498.JPEG n02110806/
+mv val/ILSVRC2012_val_00039499.JPEG n04404412/
+mv val/ILSVRC2012_val_00039500.JPEG n03976467/
+mv val/ILSVRC2012_val_00039501.JPEG n07715103/
+mv val/ILSVRC2012_val_00039502.JPEG n04525038/
+mv val/ILSVRC2012_val_00039503.JPEG n02776631/
+mv val/ILSVRC2012_val_00039504.JPEG n02099267/
+mv val/ILSVRC2012_val_00039505.JPEG n02095314/
+mv val/ILSVRC2012_val_00039506.JPEG n03028079/
+mv val/ILSVRC2012_val_00039507.JPEG n02100236/
+mv val/ILSVRC2012_val_00039508.JPEG n03930630/
+mv val/ILSVRC2012_val_00039509.JPEG n03188531/
+mv val/ILSVRC2012_val_00039510.JPEG n02094258/
+mv val/ILSVRC2012_val_00039511.JPEG n04554684/
+mv val/ILSVRC2012_val_00039512.JPEG n03887697/
+mv val/ILSVRC2012_val_00039513.JPEG n02116738/
+mv val/ILSVRC2012_val_00039514.JPEG n02007558/
+mv val/ILSVRC2012_val_00039515.JPEG n02102973/
+mv val/ILSVRC2012_val_00039516.JPEG n02130308/
+mv val/ILSVRC2012_val_00039517.JPEG n04328186/
+mv val/ILSVRC2012_val_00039518.JPEG n04141076/
+mv val/ILSVRC2012_val_00039519.JPEG n03220513/
+mv val/ILSVRC2012_val_00039520.JPEG n02444819/
+mv val/ILSVRC2012_val_00039521.JPEG n04458633/
+mv val/ILSVRC2012_val_00039522.JPEG n01735189/
+mv val/ILSVRC2012_val_00039523.JPEG n02701002/
+mv val/ILSVRC2012_val_00039524.JPEG n02071294/
+mv val/ILSVRC2012_val_00039525.JPEG n01498041/
+mv val/ILSVRC2012_val_00039526.JPEG n04070727/
+mv val/ILSVRC2012_val_00039527.JPEG n04423845/
+mv val/ILSVRC2012_val_00039528.JPEG n02089973/
+mv val/ILSVRC2012_val_00039529.JPEG n04141975/
+mv val/ILSVRC2012_val_00039530.JPEG n01729322/
+mv val/ILSVRC2012_val_00039531.JPEG n01824575/
+mv val/ILSVRC2012_val_00039532.JPEG n04251144/
+mv val/ILSVRC2012_val_00039533.JPEG n01692333/
+mv val/ILSVRC2012_val_00039534.JPEG n01484850/
+mv val/ILSVRC2012_val_00039535.JPEG n04208210/
+mv val/ILSVRC2012_val_00039536.JPEG n01667114/
+mv val/ILSVRC2012_val_00039537.JPEG n04458633/
+mv val/ILSVRC2012_val_00039538.JPEG n04141076/
+mv val/ILSVRC2012_val_00039539.JPEG n02058221/
+mv val/ILSVRC2012_val_00039540.JPEG n02088466/
+mv val/ILSVRC2012_val_00039541.JPEG n07760859/
+mv val/ILSVRC2012_val_00039542.JPEG n04560804/
+mv val/ILSVRC2012_val_00039543.JPEG n02099267/
+mv val/ILSVRC2012_val_00039544.JPEG n03000134/
+mv val/ILSVRC2012_val_00039545.JPEG n02481823/
+mv val/ILSVRC2012_val_00039546.JPEG n02788148/
+mv val/ILSVRC2012_val_00039547.JPEG n02097047/
+mv val/ILSVRC2012_val_00039548.JPEG n04487081/
+mv val/ILSVRC2012_val_00039549.JPEG n04286575/
+mv val/ILSVRC2012_val_00039550.JPEG n02233338/
+mv val/ILSVRC2012_val_00039551.JPEG n04344873/
+mv val/ILSVRC2012_val_00039552.JPEG n02490219/
+mv val/ILSVRC2012_val_00039553.JPEG n02123159/
+mv val/ILSVRC2012_val_00039554.JPEG n02120079/
+mv val/ILSVRC2012_val_00039555.JPEG n02114855/
+mv val/ILSVRC2012_val_00039556.JPEG n02088238/
+mv val/ILSVRC2012_val_00039557.JPEG n01775062/
+mv val/ILSVRC2012_val_00039558.JPEG n04136333/
+mv val/ILSVRC2012_val_00039559.JPEG n03344393/
+mv val/ILSVRC2012_val_00039560.JPEG n03535780/
+mv val/ILSVRC2012_val_00039561.JPEG n02074367/
+mv val/ILSVRC2012_val_00039562.JPEG n03782006/
+mv val/ILSVRC2012_val_00039563.JPEG n02487347/
+mv val/ILSVRC2012_val_00039564.JPEG n02134418/
+mv val/ILSVRC2012_val_00039565.JPEG n02500267/
+mv val/ILSVRC2012_val_00039566.JPEG n03208938/
+mv val/ILSVRC2012_val_00039567.JPEG n04162706/
+mv val/ILSVRC2012_val_00039568.JPEG n02410509/
+mv val/ILSVRC2012_val_00039569.JPEG n02091635/
+mv val/ILSVRC2012_val_00039570.JPEG n04417672/
+mv val/ILSVRC2012_val_00039571.JPEG n01537544/
+mv val/ILSVRC2012_val_00039572.JPEG n02951358/
+mv val/ILSVRC2012_val_00039573.JPEG n02116738/
+mv val/ILSVRC2012_val_00039574.JPEG n03594734/
+mv val/ILSVRC2012_val_00039575.JPEG n03775071/
+mv val/ILSVRC2012_val_00039576.JPEG n03594945/
+mv val/ILSVRC2012_val_00039577.JPEG n04532670/
+mv val/ILSVRC2012_val_00039578.JPEG n01695060/
+mv val/ILSVRC2012_val_00039579.JPEG n02277742/
+mv val/ILSVRC2012_val_00039580.JPEG n02123597/
+mv val/ILSVRC2012_val_00039581.JPEG n02883205/
+mv val/ILSVRC2012_val_00039582.JPEG n07932039/
+mv val/ILSVRC2012_val_00039583.JPEG n02497673/
+mv val/ILSVRC2012_val_00039584.JPEG n07754684/
+mv val/ILSVRC2012_val_00039585.JPEG n02112018/
+mv val/ILSVRC2012_val_00039586.JPEG n03538406/
+mv val/ILSVRC2012_val_00039587.JPEG n03895866/
+mv val/ILSVRC2012_val_00039588.JPEG n01494475/
+mv val/ILSVRC2012_val_00039589.JPEG n02177972/
+mv val/ILSVRC2012_val_00039590.JPEG n03197337/
+mv val/ILSVRC2012_val_00039591.JPEG n02105641/
+mv val/ILSVRC2012_val_00039592.JPEG n02992529/
+mv val/ILSVRC2012_val_00039593.JPEG n04070727/
+mv val/ILSVRC2012_val_00039594.JPEG n02109525/
+mv val/ILSVRC2012_val_00039595.JPEG n02125311/
+mv val/ILSVRC2012_val_00039596.JPEG n04456115/
+mv val/ILSVRC2012_val_00039597.JPEG n02980441/
+mv val/ILSVRC2012_val_00039598.JPEG n03841143/
+mv val/ILSVRC2012_val_00039599.JPEG n03938244/
+mv val/ILSVRC2012_val_00039600.JPEG n03661043/
+mv val/ILSVRC2012_val_00039601.JPEG n01756291/
+mv val/ILSVRC2012_val_00039602.JPEG n03794056/
+mv val/ILSVRC2012_val_00039603.JPEG n02018207/
+mv val/ILSVRC2012_val_00039604.JPEG n03126707/
+mv val/ILSVRC2012_val_00039605.JPEG n01614925/
+mv val/ILSVRC2012_val_00039606.JPEG n03992509/
+mv val/ILSVRC2012_val_00039607.JPEG n03127925/
+mv val/ILSVRC2012_val_00039608.JPEG n02115913/
+mv val/ILSVRC2012_val_00039609.JPEG n03773504/
+mv val/ILSVRC2012_val_00039610.JPEG n02776631/
+mv val/ILSVRC2012_val_00039611.JPEG n09472597/
+mv val/ILSVRC2012_val_00039612.JPEG n02177972/
+mv val/ILSVRC2012_val_00039613.JPEG n03532672/
+mv val/ILSVRC2012_val_00039614.JPEG n04476259/
+mv val/ILSVRC2012_val_00039615.JPEG n04517823/
+mv val/ILSVRC2012_val_00039616.JPEG n13052670/
+mv val/ILSVRC2012_val_00039617.JPEG n07753275/
+mv val/ILSVRC2012_val_00039618.JPEG n01685808/
+mv val/ILSVRC2012_val_00039619.JPEG n04120489/
+mv val/ILSVRC2012_val_00039620.JPEG n02120079/
+mv val/ILSVRC2012_val_00039621.JPEG n02123159/
+mv val/ILSVRC2012_val_00039622.JPEG n02087046/
+mv val/ILSVRC2012_val_00039623.JPEG n03598930/
+mv val/ILSVRC2012_val_00039624.JPEG n02487347/
+mv val/ILSVRC2012_val_00039625.JPEG n03065424/
+mv val/ILSVRC2012_val_00039626.JPEG n04517823/
+mv val/ILSVRC2012_val_00039627.JPEG n02797295/
+mv val/ILSVRC2012_val_00039628.JPEG n02804414/
+mv val/ILSVRC2012_val_00039629.JPEG n02843684/
+mv val/ILSVRC2012_val_00039630.JPEG n02018795/
+mv val/ILSVRC2012_val_00039631.JPEG n03976657/
+mv val/ILSVRC2012_val_00039632.JPEG n04005630/
+mv val/ILSVRC2012_val_00039633.JPEG n02699494/
+mv val/ILSVRC2012_val_00039634.JPEG n03814906/
+mv val/ILSVRC2012_val_00039635.JPEG n09332890/
+mv val/ILSVRC2012_val_00039636.JPEG n02493793/
+mv val/ILSVRC2012_val_00039637.JPEG n04442312/
+mv val/ILSVRC2012_val_00039638.JPEG n02100877/
+mv val/ILSVRC2012_val_00039639.JPEG n04532670/
+mv val/ILSVRC2012_val_00039640.JPEG n03047690/
+mv val/ILSVRC2012_val_00039641.JPEG n02077923/
+mv val/ILSVRC2012_val_00039642.JPEG n03733281/
+mv val/ILSVRC2012_val_00039643.JPEG n04266014/
+mv val/ILSVRC2012_val_00039644.JPEG n09835506/
+mv val/ILSVRC2012_val_00039645.JPEG n02492660/
+mv val/ILSVRC2012_val_00039646.JPEG n04330267/
+mv val/ILSVRC2012_val_00039647.JPEG n07716358/
+mv val/ILSVRC2012_val_00039648.JPEG n01601694/
+mv val/ILSVRC2012_val_00039649.JPEG n04579432/
+mv val/ILSVRC2012_val_00039650.JPEG n04380533/
+mv val/ILSVRC2012_val_00039651.JPEG n01749939/
+mv val/ILSVRC2012_val_00039652.JPEG n03444034/
+mv val/ILSVRC2012_val_00039653.JPEG n03400231/
+mv val/ILSVRC2012_val_00039654.JPEG n03584254/
+mv val/ILSVRC2012_val_00039655.JPEG n03710721/
+mv val/ILSVRC2012_val_00039656.JPEG n03895866/
+mv val/ILSVRC2012_val_00039657.JPEG n04591713/
+mv val/ILSVRC2012_val_00039658.JPEG n03903868/
+mv val/ILSVRC2012_val_00039659.JPEG n02088364/
+mv val/ILSVRC2012_val_00039660.JPEG n04141975/
+mv val/ILSVRC2012_val_00039661.JPEG n01774384/
+mv val/ILSVRC2012_val_00039662.JPEG n02112018/
+mv val/ILSVRC2012_val_00039663.JPEG n04485082/
+mv val/ILSVRC2012_val_00039664.JPEG n04259630/
+mv val/ILSVRC2012_val_00039665.JPEG n03041632/
+mv val/ILSVRC2012_val_00039666.JPEG n02097130/
+mv val/ILSVRC2012_val_00039667.JPEG n03775546/
+mv val/ILSVRC2012_val_00039668.JPEG n02093991/
+mv val/ILSVRC2012_val_00039669.JPEG n01742172/
+mv val/ILSVRC2012_val_00039670.JPEG n09193705/
+mv val/ILSVRC2012_val_00039671.JPEG n01984695/
+mv val/ILSVRC2012_val_00039672.JPEG n01924916/
+mv val/ILSVRC2012_val_00039673.JPEG n02190166/
+mv val/ILSVRC2012_val_00039674.JPEG n03706229/
+mv val/ILSVRC2012_val_00039675.JPEG n13037406/
+mv val/ILSVRC2012_val_00039676.JPEG n04604644/
+mv val/ILSVRC2012_val_00039677.JPEG n03602883/
+mv val/ILSVRC2012_val_00039678.JPEG n02504458/
+mv val/ILSVRC2012_val_00039679.JPEG n03467068/
+mv val/ILSVRC2012_val_00039680.JPEG n04536866/
+mv val/ILSVRC2012_val_00039681.JPEG n04398044/
+mv val/ILSVRC2012_val_00039682.JPEG n01986214/
+mv val/ILSVRC2012_val_00039683.JPEG n03777754/
+mv val/ILSVRC2012_val_00039684.JPEG n02066245/
+mv val/ILSVRC2012_val_00039685.JPEG n02346627/
+mv val/ILSVRC2012_val_00039686.JPEG n04370456/
+mv val/ILSVRC2012_val_00039687.JPEG n02108551/
+mv val/ILSVRC2012_val_00039688.JPEG n04204238/
+mv val/ILSVRC2012_val_00039689.JPEG n04371430/
+mv val/ILSVRC2012_val_00039690.JPEG n03792972/
+mv val/ILSVRC2012_val_00039691.JPEG n02441942/
+mv val/ILSVRC2012_val_00039692.JPEG n02096294/
+mv val/ILSVRC2012_val_00039693.JPEG n02699494/
+mv val/ILSVRC2012_val_00039694.JPEG n04589890/
+mv val/ILSVRC2012_val_00039695.JPEG n02085936/
+mv val/ILSVRC2012_val_00039696.JPEG n02105056/
+mv val/ILSVRC2012_val_00039697.JPEG n02415577/
+mv val/ILSVRC2012_val_00039698.JPEG n07734744/
+mv val/ILSVRC2012_val_00039699.JPEG n02098286/
+mv val/ILSVRC2012_val_00039700.JPEG n02113186/
+mv val/ILSVRC2012_val_00039701.JPEG n02096294/
+mv val/ILSVRC2012_val_00039702.JPEG n02871525/
+mv val/ILSVRC2012_val_00039703.JPEG n03873416/
+mv val/ILSVRC2012_val_00039704.JPEG n01784675/
+mv val/ILSVRC2012_val_00039705.JPEG n02788148/
+mv val/ILSVRC2012_val_00039706.JPEG n02051845/
+mv val/ILSVRC2012_val_00039707.JPEG n07930864/
+mv val/ILSVRC2012_val_00039708.JPEG n01692333/
+mv val/ILSVRC2012_val_00039709.JPEG n02111889/
+mv val/ILSVRC2012_val_00039710.JPEG n03662601/
+mv val/ILSVRC2012_val_00039711.JPEG n02097474/
+mv val/ILSVRC2012_val_00039712.JPEG n02165456/
+mv val/ILSVRC2012_val_00039713.JPEG n03595614/
+mv val/ILSVRC2012_val_00039714.JPEG n03452741/
+mv val/ILSVRC2012_val_00039715.JPEG n04606251/
+mv val/ILSVRC2012_val_00039716.JPEG n03796401/
+mv val/ILSVRC2012_val_00039717.JPEG n03452741/
+mv val/ILSVRC2012_val_00039718.JPEG n07693725/
+mv val/ILSVRC2012_val_00039719.JPEG n02112018/
+mv val/ILSVRC2012_val_00039720.JPEG n03388549/
+mv val/ILSVRC2012_val_00039721.JPEG n04562935/
+mv val/ILSVRC2012_val_00039722.JPEG n13133613/
+mv val/ILSVRC2012_val_00039723.JPEG n04461696/
+mv val/ILSVRC2012_val_00039724.JPEG n01796340/
+mv val/ILSVRC2012_val_00039725.JPEG n04270147/
+mv val/ILSVRC2012_val_00039726.JPEG n03187595/
+mv val/ILSVRC2012_val_00039727.JPEG n03666591/
+mv val/ILSVRC2012_val_00039728.JPEG n04120489/
+mv val/ILSVRC2012_val_00039729.JPEG n04522168/
+mv val/ILSVRC2012_val_00039730.JPEG n02111500/
+mv val/ILSVRC2012_val_00039731.JPEG n03976467/
+mv val/ILSVRC2012_val_00039732.JPEG n01729322/
+mv val/ILSVRC2012_val_00039733.JPEG n02364673/
+mv val/ILSVRC2012_val_00039734.JPEG n04356056/
+mv val/ILSVRC2012_val_00039735.JPEG n02797295/
+mv val/ILSVRC2012_val_00039736.JPEG n02114855/
+mv val/ILSVRC2012_val_00039737.JPEG n02749479/
+mv val/ILSVRC2012_val_00039738.JPEG n04357314/
+mv val/ILSVRC2012_val_00039739.JPEG n07565083/
+mv val/ILSVRC2012_val_00039740.JPEG n02676566/
+mv val/ILSVRC2012_val_00039741.JPEG n02088466/
+mv val/ILSVRC2012_val_00039742.JPEG n02823750/
+mv val/ILSVRC2012_val_00039743.JPEG n02093256/
+mv val/ILSVRC2012_val_00039744.JPEG n02256656/
+mv val/ILSVRC2012_val_00039745.JPEG n02119022/
+mv val/ILSVRC2012_val_00039746.JPEG n02883205/
+mv val/ILSVRC2012_val_00039747.JPEG n03584254/
+mv val/ILSVRC2012_val_00039748.JPEG n03775071/
+mv val/ILSVRC2012_val_00039749.JPEG n01682714/
+mv val/ILSVRC2012_val_00039750.JPEG n03124170/
+mv val/ILSVRC2012_val_00039751.JPEG n04201297/
+mv val/ILSVRC2012_val_00039752.JPEG n04044716/
+mv val/ILSVRC2012_val_00039753.JPEG n01629819/
+mv val/ILSVRC2012_val_00039754.JPEG n12998815/
+mv val/ILSVRC2012_val_00039755.JPEG n07584110/
+mv val/ILSVRC2012_val_00039756.JPEG n04532106/
+mv val/ILSVRC2012_val_00039757.JPEG n03825788/
+mv val/ILSVRC2012_val_00039758.JPEG n04501370/
+mv val/ILSVRC2012_val_00039759.JPEG n01560419/
+mv val/ILSVRC2012_val_00039760.JPEG n03065424/
+mv val/ILSVRC2012_val_00039761.JPEG n02106030/
+mv val/ILSVRC2012_val_00039762.JPEG n04229816/
+mv val/ILSVRC2012_val_00039763.JPEG n03623198/
+mv val/ILSVRC2012_val_00039764.JPEG n02280649/
+mv val/ILSVRC2012_val_00039765.JPEG n06785654/
+mv val/ILSVRC2012_val_00039766.JPEG n02342885/
+mv val/ILSVRC2012_val_00039767.JPEG n02488291/
+mv val/ILSVRC2012_val_00039768.JPEG n02606052/
+mv val/ILSVRC2012_val_00039769.JPEG n03271574/
+mv val/ILSVRC2012_val_00039770.JPEG n04070727/
+mv val/ILSVRC2012_val_00039771.JPEG n03717622/
+mv val/ILSVRC2012_val_00039772.JPEG n02447366/
+mv val/ILSVRC2012_val_00039773.JPEG n03065424/
+mv val/ILSVRC2012_val_00039774.JPEG n03527444/
+mv val/ILSVRC2012_val_00039775.JPEG n01943899/
+mv val/ILSVRC2012_val_00039776.JPEG n02095889/
+mv val/ILSVRC2012_val_00039777.JPEG n02132136/
+mv val/ILSVRC2012_val_00039778.JPEG n04204347/
+mv val/ILSVRC2012_val_00039779.JPEG n03026506/
+mv val/ILSVRC2012_val_00039780.JPEG n01749939/
+mv val/ILSVRC2012_val_00039781.JPEG n03742115/
+mv val/ILSVRC2012_val_00039782.JPEG n02105162/
+mv val/ILSVRC2012_val_00039783.JPEG n03733281/
+mv val/ILSVRC2012_val_00039784.JPEG n02006656/
+mv val/ILSVRC2012_val_00039785.JPEG n04552348/
+mv val/ILSVRC2012_val_00039786.JPEG n02493793/
+mv val/ILSVRC2012_val_00039787.JPEG n02992211/
+mv val/ILSVRC2012_val_00039788.JPEG n02089867/
+mv val/ILSVRC2012_val_00039789.JPEG n04111531/
+mv val/ILSVRC2012_val_00039790.JPEG n04590129/
+mv val/ILSVRC2012_val_00039791.JPEG n03982430/
+mv val/ILSVRC2012_val_00039792.JPEG n03495258/
+mv val/ILSVRC2012_val_00039793.JPEG n02640242/
+mv val/ILSVRC2012_val_00039794.JPEG n02099429/
+mv val/ILSVRC2012_val_00039795.JPEG n02132136/
+mv val/ILSVRC2012_val_00039796.JPEG n02444819/
+mv val/ILSVRC2012_val_00039797.JPEG n02056570/
+mv val/ILSVRC2012_val_00039798.JPEG n03494278/
+mv val/ILSVRC2012_val_00039799.JPEG n01773157/
+mv val/ILSVRC2012_val_00039800.JPEG n02137549/
+mv val/ILSVRC2012_val_00039801.JPEG n01534433/
+mv val/ILSVRC2012_val_00039802.JPEG n02018795/
+mv val/ILSVRC2012_val_00039803.JPEG n03630383/
+mv val/ILSVRC2012_val_00039804.JPEG n02281787/
+mv val/ILSVRC2012_val_00039805.JPEG n04120489/
+mv val/ILSVRC2012_val_00039806.JPEG n02104029/
+mv val/ILSVRC2012_val_00039807.JPEG n02098413/
+mv val/ILSVRC2012_val_00039808.JPEG n02488702/
+mv val/ILSVRC2012_val_00039809.JPEG n03379051/
+mv val/ILSVRC2012_val_00039810.JPEG n02807133/
+mv val/ILSVRC2012_val_00039811.JPEG n04591713/
+mv val/ILSVRC2012_val_00039812.JPEG n02110185/
+mv val/ILSVRC2012_val_00039813.JPEG n04209239/
+mv val/ILSVRC2012_val_00039814.JPEG n01558993/
+mv val/ILSVRC2012_val_00039815.JPEG n04325704/
+mv val/ILSVRC2012_val_00039816.JPEG n04264628/
+mv val/ILSVRC2012_val_00039817.JPEG n03291819/
+mv val/ILSVRC2012_val_00039818.JPEG n02793495/
+mv val/ILSVRC2012_val_00039819.JPEG n02133161/
+mv val/ILSVRC2012_val_00039820.JPEG n03908714/
+mv val/ILSVRC2012_val_00039821.JPEG n03584254/
+mv val/ILSVRC2012_val_00039822.JPEG n02091831/
+mv val/ILSVRC2012_val_00039823.JPEG n02099429/
+mv val/ILSVRC2012_val_00039824.JPEG n09835506/
+mv val/ILSVRC2012_val_00039825.JPEG n01798484/
+mv val/ILSVRC2012_val_00039826.JPEG n03041632/
+mv val/ILSVRC2012_val_00039827.JPEG n02808304/
+mv val/ILSVRC2012_val_00039828.JPEG n04136333/
+mv val/ILSVRC2012_val_00039829.JPEG n09428293/
+mv val/ILSVRC2012_val_00039830.JPEG n04465501/
+mv val/ILSVRC2012_val_00039831.JPEG n01688243/
+mv val/ILSVRC2012_val_00039832.JPEG n02093428/
+mv val/ILSVRC2012_val_00039833.JPEG n02129165/
+mv val/ILSVRC2012_val_00039834.JPEG n07749582/
+mv val/ILSVRC2012_val_00039835.JPEG n03197337/
+mv val/ILSVRC2012_val_00039836.JPEG n04392985/
+mv val/ILSVRC2012_val_00039837.JPEG n04367480/
+mv val/ILSVRC2012_val_00039838.JPEG n02484975/
+mv val/ILSVRC2012_val_00039839.JPEG n02607072/
+mv val/ILSVRC2012_val_00039840.JPEG n03089624/
+mv val/ILSVRC2012_val_00039841.JPEG n04116512/
+mv val/ILSVRC2012_val_00039842.JPEG n04286575/
+mv val/ILSVRC2012_val_00039843.JPEG n02233338/
+mv val/ILSVRC2012_val_00039844.JPEG n04118538/
+mv val/ILSVRC2012_val_00039845.JPEG n04254777/
+mv val/ILSVRC2012_val_00039846.JPEG n02410509/
+mv val/ILSVRC2012_val_00039847.JPEG n02091244/
+mv val/ILSVRC2012_val_00039848.JPEG n03016953/
+mv val/ILSVRC2012_val_00039849.JPEG n03026506/
+mv val/ILSVRC2012_val_00039850.JPEG n02113978/
+mv val/ILSVRC2012_val_00039851.JPEG n02091032/
+mv val/ILSVRC2012_val_00039852.JPEG n02096585/
+mv val/ILSVRC2012_val_00039853.JPEG n04179913/
+mv val/ILSVRC2012_val_00039854.JPEG n01775062/
+mv val/ILSVRC2012_val_00039855.JPEG n03903868/
+mv val/ILSVRC2012_val_00039856.JPEG n04277352/
+mv val/ILSVRC2012_val_00039857.JPEG n02841315/
+mv val/ILSVRC2012_val_00039858.JPEG n04597913/
+mv val/ILSVRC2012_val_00039859.JPEG n01614925/
+mv val/ILSVRC2012_val_00039860.JPEG n04067472/
+mv val/ILSVRC2012_val_00039861.JPEG n03876231/
+mv val/ILSVRC2012_val_00039862.JPEG n02095889/
+mv val/ILSVRC2012_val_00039863.JPEG n02100877/
+mv val/ILSVRC2012_val_00039864.JPEG n03444034/
+mv val/ILSVRC2012_val_00039865.JPEG n01484850/
+mv val/ILSVRC2012_val_00039866.JPEG n02490219/
+mv val/ILSVRC2012_val_00039867.JPEG n03272010/
+mv val/ILSVRC2012_val_00039868.JPEG n12057211/
+mv val/ILSVRC2012_val_00039869.JPEG n03980874/
+mv val/ILSVRC2012_val_00039870.JPEG n02097474/
+mv val/ILSVRC2012_val_00039871.JPEG n04270147/
+mv val/ILSVRC2012_val_00039872.JPEG n04429376/
+mv val/ILSVRC2012_val_00039873.JPEG n04111531/
+mv val/ILSVRC2012_val_00039874.JPEG n09399592/
+mv val/ILSVRC2012_val_00039875.JPEG n04005630/
+mv val/ILSVRC2012_val_00039876.JPEG n03595614/
+mv val/ILSVRC2012_val_00039877.JPEG n02123045/
+mv val/ILSVRC2012_val_00039878.JPEG n03657121/
+mv val/ILSVRC2012_val_00039879.JPEG n07892512/
+mv val/ILSVRC2012_val_00039880.JPEG n03840681/
+mv val/ILSVRC2012_val_00039881.JPEG n04296562/
+mv val/ILSVRC2012_val_00039882.JPEG n02807133/
+mv val/ILSVRC2012_val_00039883.JPEG n01806567/
+mv val/ILSVRC2012_val_00039884.JPEG n04258138/
+mv val/ILSVRC2012_val_00039885.JPEG n02114367/
+mv val/ILSVRC2012_val_00039886.JPEG n01675722/
+mv val/ILSVRC2012_val_00039887.JPEG n02794156/
+mv val/ILSVRC2012_val_00039888.JPEG n01698640/
+mv val/ILSVRC2012_val_00039889.JPEG n04296562/
+mv val/ILSVRC2012_val_00039890.JPEG n07717556/
+mv val/ILSVRC2012_val_00039891.JPEG n03476991/
+mv val/ILSVRC2012_val_00039892.JPEG n04005630/
+mv val/ILSVRC2012_val_00039893.JPEG n02099712/
+mv val/ILSVRC2012_val_00039894.JPEG n02099429/
+mv val/ILSVRC2012_val_00039895.JPEG n03721384/
+mv val/ILSVRC2012_val_00039896.JPEG n04277352/
+mv val/ILSVRC2012_val_00039897.JPEG n03127925/
+mv val/ILSVRC2012_val_00039898.JPEG n02256656/
+mv val/ILSVRC2012_val_00039899.JPEG n03201208/
+mv val/ILSVRC2012_val_00039900.JPEG n02088466/
+mv val/ILSVRC2012_val_00039901.JPEG n02086079/
+mv val/ILSVRC2012_val_00039902.JPEG n01632458/
+mv val/ILSVRC2012_val_00039903.JPEG n04376876/
+mv val/ILSVRC2012_val_00039904.JPEG n03998194/
+mv val/ILSVRC2012_val_00039905.JPEG n01440764/
+mv val/ILSVRC2012_val_00039906.JPEG n02704792/
+mv val/ILSVRC2012_val_00039907.JPEG n01855032/
+mv val/ILSVRC2012_val_00039908.JPEG n03095699/
+mv val/ILSVRC2012_val_00039909.JPEG n04355933/
+mv val/ILSVRC2012_val_00039910.JPEG n04465501/
+mv val/ILSVRC2012_val_00039911.JPEG n03841143/
+mv val/ILSVRC2012_val_00039912.JPEG n04501370/
+mv val/ILSVRC2012_val_00039913.JPEG n01558993/
+mv val/ILSVRC2012_val_00039914.JPEG n03042490/
+mv val/ILSVRC2012_val_00039915.JPEG n01950731/
+mv val/ILSVRC2012_val_00039916.JPEG n03935335/
+mv val/ILSVRC2012_val_00039917.JPEG n04584207/
+mv val/ILSVRC2012_val_00039918.JPEG n01984695/
+mv val/ILSVRC2012_val_00039919.JPEG n02747177/
+mv val/ILSVRC2012_val_00039920.JPEG n03775546/
+mv val/ILSVRC2012_val_00039921.JPEG n04525038/
+mv val/ILSVRC2012_val_00039922.JPEG n01632777/
+mv val/ILSVRC2012_val_00039923.JPEG n04485082/
+mv val/ILSVRC2012_val_00039924.JPEG n04116512/
+mv val/ILSVRC2012_val_00039925.JPEG n02486410/
+mv val/ILSVRC2012_val_00039926.JPEG n02096585/
+mv val/ILSVRC2012_val_00039927.JPEG n02096051/
+mv val/ILSVRC2012_val_00039928.JPEG n02110627/
+mv val/ILSVRC2012_val_00039929.JPEG n03272010/
+mv val/ILSVRC2012_val_00039930.JPEG n03775546/
+mv val/ILSVRC2012_val_00039931.JPEG n02123597/
+mv val/ILSVRC2012_val_00039932.JPEG n02992529/
+mv val/ILSVRC2012_val_00039933.JPEG n01632458/
+mv val/ILSVRC2012_val_00039934.JPEG n02089078/
+mv val/ILSVRC2012_val_00039935.JPEG n03954731/
+mv val/ILSVRC2012_val_00039936.JPEG n02437616/
+mv val/ILSVRC2012_val_00039937.JPEG n02120505/
+mv val/ILSVRC2012_val_00039938.JPEG n04507155/
+mv val/ILSVRC2012_val_00039939.JPEG n02114712/
+mv val/ILSVRC2012_val_00039940.JPEG n03532672/
+mv val/ILSVRC2012_val_00039941.JPEG n03983396/
+mv val/ILSVRC2012_val_00039942.JPEG n02108000/
+mv val/ILSVRC2012_val_00039943.JPEG n01514859/
+mv val/ILSVRC2012_val_00039944.JPEG n07802026/
+mv val/ILSVRC2012_val_00039945.JPEG n02951358/
+mv val/ILSVRC2012_val_00039946.JPEG n01882714/
+mv val/ILSVRC2012_val_00039947.JPEG n04505470/
+mv val/ILSVRC2012_val_00039948.JPEG n02231487/
+mv val/ILSVRC2012_val_00039949.JPEG n03388043/
+mv val/ILSVRC2012_val_00039950.JPEG n04482393/
+mv val/ILSVRC2012_val_00039951.JPEG n02112018/
+mv val/ILSVRC2012_val_00039952.JPEG n04008634/
+mv val/ILSVRC2012_val_00039953.JPEG n02606052/
+mv val/ILSVRC2012_val_00039954.JPEG n04273569/
+mv val/ILSVRC2012_val_00039955.JPEG n03594734/
+mv val/ILSVRC2012_val_00039956.JPEG n04532670/
+mv val/ILSVRC2012_val_00039957.JPEG n01855032/
+mv val/ILSVRC2012_val_00039958.JPEG n02342885/
+mv val/ILSVRC2012_val_00039959.JPEG n03950228/
+mv val/ILSVRC2012_val_00039960.JPEG n02093859/
+mv val/ILSVRC2012_val_00039961.JPEG n02841315/
+mv val/ILSVRC2012_val_00039962.JPEG n02025239/
+mv val/ILSVRC2012_val_00039963.JPEG n03930630/
+mv val/ILSVRC2012_val_00039964.JPEG n01797886/
+mv val/ILSVRC2012_val_00039965.JPEG n03240683/
+mv val/ILSVRC2012_val_00039966.JPEG n01775062/
+mv val/ILSVRC2012_val_00039967.JPEG n02321529/
+mv val/ILSVRC2012_val_00039968.JPEG n02342885/
+mv val/ILSVRC2012_val_00039969.JPEG n02108551/
+mv val/ILSVRC2012_val_00039970.JPEG n03216828/
+mv val/ILSVRC2012_val_00039971.JPEG n02281406/
+mv val/ILSVRC2012_val_00039972.JPEG n03710721/
+mv val/ILSVRC2012_val_00039973.JPEG n04201297/
+mv val/ILSVRC2012_val_00039974.JPEG n01950731/
+mv val/ILSVRC2012_val_00039975.JPEG n03216828/
+mv val/ILSVRC2012_val_00039976.JPEG n07880968/
+mv val/ILSVRC2012_val_00039977.JPEG n04208210/
+mv val/ILSVRC2012_val_00039978.JPEG n02514041/
+mv val/ILSVRC2012_val_00039979.JPEG n02123597/
+mv val/ILSVRC2012_val_00039980.JPEG n04517823/
+mv val/ILSVRC2012_val_00039981.JPEG n04553703/
+mv val/ILSVRC2012_val_00039982.JPEG n03482405/
+mv val/ILSVRC2012_val_00039983.JPEG n07697313/
+mv val/ILSVRC2012_val_00039984.JPEG n03690938/
+mv val/ILSVRC2012_val_00039985.JPEG n02444819/
+mv val/ILSVRC2012_val_00039986.JPEG n04049303/
+mv val/ILSVRC2012_val_00039987.JPEG n03085013/
+mv val/ILSVRC2012_val_00039988.JPEG n01843065/
+mv val/ILSVRC2012_val_00039989.JPEG n03709823/
+mv val/ILSVRC2012_val_00039990.JPEG n02117135/
+mv val/ILSVRC2012_val_00039991.JPEG n02787622/
+mv val/ILSVRC2012_val_00039992.JPEG n07579787/
+mv val/ILSVRC2012_val_00039993.JPEG n02099601/
+mv val/ILSVRC2012_val_00039994.JPEG n04229816/
+mv val/ILSVRC2012_val_00039995.JPEG n03776460/
+mv val/ILSVRC2012_val_00039996.JPEG n01644900/
+mv val/ILSVRC2012_val_00039997.JPEG n07579787/
+mv val/ILSVRC2012_val_00039998.JPEG n03733281/
+mv val/ILSVRC2012_val_00039999.JPEG n09472597/
+mv val/ILSVRC2012_val_00040000.JPEG n01797886/
+mv val/ILSVRC2012_val_00040001.JPEG n07802026/
+mv val/ILSVRC2012_val_00040002.JPEG n01806567/
+mv val/ILSVRC2012_val_00040003.JPEG n02108551/
+mv val/ILSVRC2012_val_00040004.JPEG n02093754/
+mv val/ILSVRC2012_val_00040005.JPEG n02132136/
+mv val/ILSVRC2012_val_00040006.JPEG n04254120/
+mv val/ILSVRC2012_val_00040007.JPEG n03877472/
+mv val/ILSVRC2012_val_00040008.JPEG n02480855/
+mv val/ILSVRC2012_val_00040009.JPEG n04285008/
+mv val/ILSVRC2012_val_00040010.JPEG n15075141/
+mv val/ILSVRC2012_val_00040011.JPEG n04325704/
+mv val/ILSVRC2012_val_00040012.JPEG n09332890/
+mv val/ILSVRC2012_val_00040013.JPEG n03947888/
+mv val/ILSVRC2012_val_00040014.JPEG n01828970/
+mv val/ILSVRC2012_val_00040015.JPEG n02106030/
+mv val/ILSVRC2012_val_00040016.JPEG n04501370/
+mv val/ILSVRC2012_val_00040017.JPEG n07730033/
+mv val/ILSVRC2012_val_00040018.JPEG n02113186/
+mv val/ILSVRC2012_val_00040019.JPEG n03026506/
+mv val/ILSVRC2012_val_00040020.JPEG n04266014/
+mv val/ILSVRC2012_val_00040021.JPEG n11939491/
+mv val/ILSVRC2012_val_00040022.JPEG n04270147/
+mv val/ILSVRC2012_val_00040023.JPEG n03777754/
+mv val/ILSVRC2012_val_00040024.JPEG n04522168/
+mv val/ILSVRC2012_val_00040025.JPEG n01860187/
+mv val/ILSVRC2012_val_00040026.JPEG n02443484/
+mv val/ILSVRC2012_val_00040027.JPEG n02835271/
+mv val/ILSVRC2012_val_00040028.JPEG n04125021/
+mv val/ILSVRC2012_val_00040029.JPEG n02794156/
+mv val/ILSVRC2012_val_00040030.JPEG n06596364/
+mv val/ILSVRC2012_val_00040031.JPEG n04265275/
+mv val/ILSVRC2012_val_00040032.JPEG n04136333/
+mv val/ILSVRC2012_val_00040033.JPEG n10565667/
+mv val/ILSVRC2012_val_00040034.JPEG n04483307/
+mv val/ILSVRC2012_val_00040035.JPEG n02277742/
+mv val/ILSVRC2012_val_00040036.JPEG n02094433/
+mv val/ILSVRC2012_val_00040037.JPEG n07716906/
+mv val/ILSVRC2012_val_00040038.JPEG n01514859/
+mv val/ILSVRC2012_val_00040039.JPEG n02397096/
+mv val/ILSVRC2012_val_00040040.JPEG n02102318/
+mv val/ILSVRC2012_val_00040041.JPEG n04442312/
+mv val/ILSVRC2012_val_00040042.JPEG n03680355/
+mv val/ILSVRC2012_val_00040043.JPEG n02086240/
+mv val/ILSVRC2012_val_00040044.JPEG n02174001/
+mv val/ILSVRC2012_val_00040045.JPEG n02277742/
+mv val/ILSVRC2012_val_00040046.JPEG n03832673/
+mv val/ILSVRC2012_val_00040047.JPEG n01768244/
+mv val/ILSVRC2012_val_00040048.JPEG n01739381/
+mv val/ILSVRC2012_val_00040049.JPEG n02361337/
+mv val/ILSVRC2012_val_00040050.JPEG n02607072/
+mv val/ILSVRC2012_val_00040051.JPEG n01843383/
+mv val/ILSVRC2012_val_00040052.JPEG n02091467/
+mv val/ILSVRC2012_val_00040053.JPEG n02090721/
+mv val/ILSVRC2012_val_00040054.JPEG n01756291/
+mv val/ILSVRC2012_val_00040055.JPEG n02099429/
+mv val/ILSVRC2012_val_00040056.JPEG n01806567/
+mv val/ILSVRC2012_val_00040057.JPEG n02966687/
+mv val/ILSVRC2012_val_00040058.JPEG n02094258/
+mv val/ILSVRC2012_val_00040059.JPEG n01986214/
+mv val/ILSVRC2012_val_00040060.JPEG n07697537/
+mv val/ILSVRC2012_val_00040061.JPEG n02909870/
+mv val/ILSVRC2012_val_00040062.JPEG n03967562/
+mv val/ILSVRC2012_val_00040063.JPEG n04296562/
+mv val/ILSVRC2012_val_00040064.JPEG n03388043/
+mv val/ILSVRC2012_val_00040065.JPEG n04482393/
+mv val/ILSVRC2012_val_00040066.JPEG n09421951/
+mv val/ILSVRC2012_val_00040067.JPEG n07614500/
+mv val/ILSVRC2012_val_00040068.JPEG n02865351/
+mv val/ILSVRC2012_val_00040069.JPEG n02089973/
+mv val/ILSVRC2012_val_00040070.JPEG n04557648/
+mv val/ILSVRC2012_val_00040071.JPEG n01537544/
+mv val/ILSVRC2012_val_00040072.JPEG n01819313/
+mv val/ILSVRC2012_val_00040073.JPEG n03929855/
+mv val/ILSVRC2012_val_00040074.JPEG n04136333/
+mv val/ILSVRC2012_val_00040075.JPEG n03977966/
+mv val/ILSVRC2012_val_00040076.JPEG n04099969/
+mv val/ILSVRC2012_val_00040077.JPEG n01675722/
+mv val/ILSVRC2012_val_00040078.JPEG n03832673/
+mv val/ILSVRC2012_val_00040079.JPEG n02643566/
+mv val/ILSVRC2012_val_00040080.JPEG n07749582/
+mv val/ILSVRC2012_val_00040081.JPEG n04275548/
+mv val/ILSVRC2012_val_00040082.JPEG n04005630/
+mv val/ILSVRC2012_val_00040083.JPEG n02074367/
+mv val/ILSVRC2012_val_00040084.JPEG n03623198/
+mv val/ILSVRC2012_val_00040085.JPEG n03495258/
+mv val/ILSVRC2012_val_00040086.JPEG n04296562/
+mv val/ILSVRC2012_val_00040087.JPEG n02437312/
+mv val/ILSVRC2012_val_00040088.JPEG n02113799/
+mv val/ILSVRC2012_val_00040089.JPEG n03874599/
+mv val/ILSVRC2012_val_00040090.JPEG n02454379/
+mv val/ILSVRC2012_val_00040091.JPEG n02877765/
+mv val/ILSVRC2012_val_00040092.JPEG n02109525/
+mv val/ILSVRC2012_val_00040093.JPEG n04270147/
+mv val/ILSVRC2012_val_00040094.JPEG n01729977/
+mv val/ILSVRC2012_val_00040095.JPEG n02950826/
+mv val/ILSVRC2012_val_00040096.JPEG n02110063/
+mv val/ILSVRC2012_val_00040097.JPEG n03216828/
+mv val/ILSVRC2012_val_00040098.JPEG n01484850/
+mv val/ILSVRC2012_val_00040099.JPEG n03062245/
+mv val/ILSVRC2012_val_00040100.JPEG n02128385/
+mv val/ILSVRC2012_val_00040101.JPEG n04228054/
+mv val/ILSVRC2012_val_00040102.JPEG n03179701/
+mv val/ILSVRC2012_val_00040103.JPEG n01796340/
+mv val/ILSVRC2012_val_00040104.JPEG n01694178/
+mv val/ILSVRC2012_val_00040105.JPEG n02088094/
+mv val/ILSVRC2012_val_00040106.JPEG n03942813/
+mv val/ILSVRC2012_val_00040107.JPEG n02869837/
+mv val/ILSVRC2012_val_00040108.JPEG n03770439/
+mv val/ILSVRC2012_val_00040109.JPEG n02097658/
+mv val/ILSVRC2012_val_00040110.JPEG n03047690/
+mv val/ILSVRC2012_val_00040111.JPEG n03742115/
+mv val/ILSVRC2012_val_00040112.JPEG n03724870/
+mv val/ILSVRC2012_val_00040113.JPEG n02966687/
+mv val/ILSVRC2012_val_00040114.JPEG n02098286/
+mv val/ILSVRC2012_val_00040115.JPEG n01687978/
+mv val/ILSVRC2012_val_00040116.JPEG n02100236/
+mv val/ILSVRC2012_val_00040117.JPEG n01616318/
+mv val/ILSVRC2012_val_00040118.JPEG n04442312/
+mv val/ILSVRC2012_val_00040119.JPEG n02396427/
+mv val/ILSVRC2012_val_00040120.JPEG n03998194/
+mv val/ILSVRC2012_val_00040121.JPEG n01773549/
+mv val/ILSVRC2012_val_00040122.JPEG n07747607/
+mv val/ILSVRC2012_val_00040123.JPEG n01944390/
+mv val/ILSVRC2012_val_00040124.JPEG n03891332/
+mv val/ILSVRC2012_val_00040125.JPEG n03045698/
+mv val/ILSVRC2012_val_00040126.JPEG n03877472/
+mv val/ILSVRC2012_val_00040127.JPEG n03207941/
+mv val/ILSVRC2012_val_00040128.JPEG n02494079/
+mv val/ILSVRC2012_val_00040129.JPEG n01819313/
+mv val/ILSVRC2012_val_00040130.JPEG n02093754/
+mv val/ILSVRC2012_val_00040131.JPEG n02088238/
+mv val/ILSVRC2012_val_00040132.JPEG n02168699/
+mv val/ILSVRC2012_val_00040133.JPEG n04515003/
+mv val/ILSVRC2012_val_00040134.JPEG n01675722/
+mv val/ILSVRC2012_val_00040135.JPEG n02018207/
+mv val/ILSVRC2012_val_00040136.JPEG n02690373/
+mv val/ILSVRC2012_val_00040137.JPEG n03777568/
+mv val/ILSVRC2012_val_00040138.JPEG n03026506/
+mv val/ILSVRC2012_val_00040139.JPEG n02342885/
+mv val/ILSVRC2012_val_00040140.JPEG n02102040/
+mv val/ILSVRC2012_val_00040141.JPEG n07583066/
+mv val/ILSVRC2012_val_00040142.JPEG n03961711/
+mv val/ILSVRC2012_val_00040143.JPEG n02916936/
+mv val/ILSVRC2012_val_00040144.JPEG n03958227/
+mv val/ILSVRC2012_val_00040145.JPEG n01698640/
+mv val/ILSVRC2012_val_00040146.JPEG n07714990/
+mv val/ILSVRC2012_val_00040147.JPEG n02483708/
+mv val/ILSVRC2012_val_00040148.JPEG n03680355/
+mv val/ILSVRC2012_val_00040149.JPEG n04141975/
+mv val/ILSVRC2012_val_00040150.JPEG n02085936/
+mv val/ILSVRC2012_val_00040151.JPEG n07930864/
+mv val/ILSVRC2012_val_00040152.JPEG n03691459/
+mv val/ILSVRC2012_val_00040153.JPEG n02892767/
+mv val/ILSVRC2012_val_00040154.JPEG n03770679/
+mv val/ILSVRC2012_val_00040155.JPEG n03450230/
+mv val/ILSVRC2012_val_00040156.JPEG n02165456/
+mv val/ILSVRC2012_val_00040157.JPEG n04560804/
+mv val/ILSVRC2012_val_00040158.JPEG n01614925/
+mv val/ILSVRC2012_val_00040159.JPEG n04458633/
+mv val/ILSVRC2012_val_00040160.JPEG n02500267/
+mv val/ILSVRC2012_val_00040161.JPEG n02190166/
+mv val/ILSVRC2012_val_00040162.JPEG n04380533/
+mv val/ILSVRC2012_val_00040163.JPEG n02950826/
+mv val/ILSVRC2012_val_00040164.JPEG n07860988/
+mv val/ILSVRC2012_val_00040165.JPEG n02346627/
+mv val/ILSVRC2012_val_00040166.JPEG n03814906/
+mv val/ILSVRC2012_val_00040167.JPEG n02494079/
+mv val/ILSVRC2012_val_00040168.JPEG n01817953/
+mv val/ILSVRC2012_val_00040169.JPEG n09421951/
+mv val/ILSVRC2012_val_00040170.JPEG n03041632/
+mv val/ILSVRC2012_val_00040171.JPEG n04371430/
+mv val/ILSVRC2012_val_00040172.JPEG n04371430/
+mv val/ILSVRC2012_val_00040173.JPEG n03743016/
+mv val/ILSVRC2012_val_00040174.JPEG n01630670/
+mv val/ILSVRC2012_val_00040175.JPEG n04074963/
+mv val/ILSVRC2012_val_00040176.JPEG n04326547/
+mv val/ILSVRC2012_val_00040177.JPEG n02894605/
+mv val/ILSVRC2012_val_00040178.JPEG n02086910/
+mv val/ILSVRC2012_val_00040179.JPEG n03935335/
+mv val/ILSVRC2012_val_00040180.JPEG n04461696/
+mv val/ILSVRC2012_val_00040181.JPEG n03476991/
+mv val/ILSVRC2012_val_00040182.JPEG n03697007/
+mv val/ILSVRC2012_val_00040183.JPEG n01818515/
+mv val/ILSVRC2012_val_00040184.JPEG n04263257/
+mv val/ILSVRC2012_val_00040185.JPEG n02088238/
+mv val/ILSVRC2012_val_00040186.JPEG n07697313/
+mv val/ILSVRC2012_val_00040187.JPEG n02110806/
+mv val/ILSVRC2012_val_00040188.JPEG n07747607/
+mv val/ILSVRC2012_val_00040189.JPEG n02108422/
+mv val/ILSVRC2012_val_00040190.JPEG n02641379/
+mv val/ILSVRC2012_val_00040191.JPEG n04507155/
+mv val/ILSVRC2012_val_00040192.JPEG n02124075/
+mv val/ILSVRC2012_val_00040193.JPEG n12985857/
+mv val/ILSVRC2012_val_00040194.JPEG n02342885/
+mv val/ILSVRC2012_val_00040195.JPEG n07697537/
+mv val/ILSVRC2012_val_00040196.JPEG n03742115/
+mv val/ILSVRC2012_val_00040197.JPEG n12998815/
+mv val/ILSVRC2012_val_00040198.JPEG n04591713/
+mv val/ILSVRC2012_val_00040199.JPEG n03450230/
+mv val/ILSVRC2012_val_00040200.JPEG n02110185/
+mv val/ILSVRC2012_val_00040201.JPEG n02091831/
+mv val/ILSVRC2012_val_00040202.JPEG n03424325/
+mv val/ILSVRC2012_val_00040203.JPEG n01795545/
+mv val/ILSVRC2012_val_00040204.JPEG n04507155/
+mv val/ILSVRC2012_val_00040205.JPEG n01616318/
+mv val/ILSVRC2012_val_00040206.JPEG n01704323/
+mv val/ILSVRC2012_val_00040207.JPEG n03887697/
+mv val/ILSVRC2012_val_00040208.JPEG n02128925/
+mv val/ILSVRC2012_val_00040209.JPEG n01824575/
+mv val/ILSVRC2012_val_00040210.JPEG n02099712/
+mv val/ILSVRC2012_val_00040211.JPEG n03498962/
+mv val/ILSVRC2012_val_00040212.JPEG n04273569/
+mv val/ILSVRC2012_val_00040213.JPEG n04090263/
+mv val/ILSVRC2012_val_00040214.JPEG n01775062/
+mv val/ILSVRC2012_val_00040215.JPEG n03970156/
+mv val/ILSVRC2012_val_00040216.JPEG n02480855/
+mv val/ILSVRC2012_val_00040217.JPEG n02730930/
+mv val/ILSVRC2012_val_00040218.JPEG n02326432/
+mv val/ILSVRC2012_val_00040219.JPEG n04355933/
+mv val/ILSVRC2012_val_00040220.JPEG n03355925/
+mv val/ILSVRC2012_val_00040221.JPEG n01734418/
+mv val/ILSVRC2012_val_00040222.JPEG n02107908/
+mv val/ILSVRC2012_val_00040223.JPEG n01978287/
+mv val/ILSVRC2012_val_00040224.JPEG n03874599/
+mv val/ILSVRC2012_val_00040225.JPEG n03478589/
+mv val/ILSVRC2012_val_00040226.JPEG n03788365/
+mv val/ILSVRC2012_val_00040227.JPEG n02325366/
+mv val/ILSVRC2012_val_00040228.JPEG n02445715/
+mv val/ILSVRC2012_val_00040229.JPEG n03180011/
+mv val/ILSVRC2012_val_00040230.JPEG n03792782/
+mv val/ILSVRC2012_val_00040231.JPEG n01667778/
+mv val/ILSVRC2012_val_00040232.JPEG n02490219/
+mv val/ILSVRC2012_val_00040233.JPEG n01882714/
+mv val/ILSVRC2012_val_00040234.JPEG n04005630/
+mv val/ILSVRC2012_val_00040235.JPEG n04118538/
+mv val/ILSVRC2012_val_00040236.JPEG n03775071/
+mv val/ILSVRC2012_val_00040237.JPEG n03792782/
+mv val/ILSVRC2012_val_00040238.JPEG n02123045/
+mv val/ILSVRC2012_val_00040239.JPEG n02264363/
+mv val/ILSVRC2012_val_00040240.JPEG n02776631/
+mv val/ILSVRC2012_val_00040241.JPEG n01773157/
+mv val/ILSVRC2012_val_00040242.JPEG n01614925/
+mv val/ILSVRC2012_val_00040243.JPEG n04548362/
+mv val/ILSVRC2012_val_00040244.JPEG n02009912/
+mv val/ILSVRC2012_val_00040245.JPEG n02487347/
+mv val/ILSVRC2012_val_00040246.JPEG n03272562/
+mv val/ILSVRC2012_val_00040247.JPEG n01685808/
+mv val/ILSVRC2012_val_00040248.JPEG n02835271/
+mv val/ILSVRC2012_val_00040249.JPEG n02110063/
+mv val/ILSVRC2012_val_00040250.JPEG n04153751/
+mv val/ILSVRC2012_val_00040251.JPEG n02123045/
+mv val/ILSVRC2012_val_00040252.JPEG n02417914/
+mv val/ILSVRC2012_val_00040253.JPEG n04208210/
+mv val/ILSVRC2012_val_00040254.JPEG n03476684/
+mv val/ILSVRC2012_val_00040255.JPEG n01768244/
+mv val/ILSVRC2012_val_00040256.JPEG n07697313/
+mv val/ILSVRC2012_val_00040257.JPEG n02100583/
+mv val/ILSVRC2012_val_00040258.JPEG n02504013/
+mv val/ILSVRC2012_val_00040259.JPEG n04040759/
+mv val/ILSVRC2012_val_00040260.JPEG n04067472/
+mv val/ILSVRC2012_val_00040261.JPEG n01798484/
+mv val/ILSVRC2012_val_00040262.JPEG n07248320/
+mv val/ILSVRC2012_val_00040263.JPEG n02094258/
+mv val/ILSVRC2012_val_00040264.JPEG n02483708/
+mv val/ILSVRC2012_val_00040265.JPEG n04557648/
+mv val/ILSVRC2012_val_00040266.JPEG n01828970/
+mv val/ILSVRC2012_val_00040267.JPEG n02172182/
+mv val/ILSVRC2012_val_00040268.JPEG n03658185/
+mv val/ILSVRC2012_val_00040269.JPEG n02493509/
+mv val/ILSVRC2012_val_00040270.JPEG n03991062/
+mv val/ILSVRC2012_val_00040271.JPEG n03494278/
+mv val/ILSVRC2012_val_00040272.JPEG n03291819/
+mv val/ILSVRC2012_val_00040273.JPEG n02410509/
+mv val/ILSVRC2012_val_00040274.JPEG n03733805/
+mv val/ILSVRC2012_val_00040275.JPEG n04579432/
+mv val/ILSVRC2012_val_00040276.JPEG n03124043/
+mv val/ILSVRC2012_val_00040277.JPEG n02966193/
+mv val/ILSVRC2012_val_00040278.JPEG n02190166/
+mv val/ILSVRC2012_val_00040279.JPEG n02526121/
+mv val/ILSVRC2012_val_00040280.JPEG n07753592/
+mv val/ILSVRC2012_val_00040281.JPEG n07753592/
+mv val/ILSVRC2012_val_00040282.JPEG n07768694/
+mv val/ILSVRC2012_val_00040283.JPEG n09246464/
+mv val/ILSVRC2012_val_00040284.JPEG n07711569/
+mv val/ILSVRC2012_val_00040285.JPEG n02018795/
+mv val/ILSVRC2012_val_00040286.JPEG n02105056/
+mv val/ILSVRC2012_val_00040287.JPEG n01669191/
+mv val/ILSVRC2012_val_00040288.JPEG n02268853/
+mv val/ILSVRC2012_val_00040289.JPEG n02488291/
+mv val/ILSVRC2012_val_00040290.JPEG n02793495/
+mv val/ILSVRC2012_val_00040291.JPEG n02101556/
+mv val/ILSVRC2012_val_00040292.JPEG n04476259/
+mv val/ILSVRC2012_val_00040293.JPEG n07584110/
+mv val/ILSVRC2012_val_00040294.JPEG n04542943/
+mv val/ILSVRC2012_val_00040295.JPEG n03670208/
+mv val/ILSVRC2012_val_00040296.JPEG n03929855/
+mv val/ILSVRC2012_val_00040297.JPEG n04204347/
+mv val/ILSVRC2012_val_00040298.JPEG n02094433/
+mv val/ILSVRC2012_val_00040299.JPEG n09472597/
+mv val/ILSVRC2012_val_00040300.JPEG n04479046/
+mv val/ILSVRC2012_val_00040301.JPEG n01667778/
+mv val/ILSVRC2012_val_00040302.JPEG n03459775/
+mv val/ILSVRC2012_val_00040303.JPEG n02056570/
+mv val/ILSVRC2012_val_00040304.JPEG n12620546/
+mv val/ILSVRC2012_val_00040305.JPEG n04286575/
+mv val/ILSVRC2012_val_00040306.JPEG n02795169/
+mv val/ILSVRC2012_val_00040307.JPEG n04209239/
+mv val/ILSVRC2012_val_00040308.JPEG n02101556/
+mv val/ILSVRC2012_val_00040309.JPEG n04532670/
+mv val/ILSVRC2012_val_00040310.JPEG n02009229/
+mv val/ILSVRC2012_val_00040311.JPEG n04584207/
+mv val/ILSVRC2012_val_00040312.JPEG n02795169/
+mv val/ILSVRC2012_val_00040313.JPEG n02112350/
+mv val/ILSVRC2012_val_00040314.JPEG n01667778/
+mv val/ILSVRC2012_val_00040315.JPEG n02939185/
+mv val/ILSVRC2012_val_00040316.JPEG n03908618/
+mv val/ILSVRC2012_val_00040317.JPEG n01753488/
+mv val/ILSVRC2012_val_00040318.JPEG n02841315/
+mv val/ILSVRC2012_val_00040319.JPEG n03388183/
+mv val/ILSVRC2012_val_00040320.JPEG n03218198/
+mv val/ILSVRC2012_val_00040321.JPEG n02776631/
+mv val/ILSVRC2012_val_00040322.JPEG n02363005/
+mv val/ILSVRC2012_val_00040323.JPEG n02130308/
+mv val/ILSVRC2012_val_00040324.JPEG n06596364/
+mv val/ILSVRC2012_val_00040325.JPEG n02814860/
+mv val/ILSVRC2012_val_00040326.JPEG n02110063/
+mv val/ILSVRC2012_val_00040327.JPEG n02117135/
+mv val/ILSVRC2012_val_00040328.JPEG n07684084/
+mv val/ILSVRC2012_val_00040329.JPEG n04254680/
+mv val/ILSVRC2012_val_00040330.JPEG n03109150/
+mv val/ILSVRC2012_val_00040331.JPEG n02408429/
+mv val/ILSVRC2012_val_00040332.JPEG n04389033/
+mv val/ILSVRC2012_val_00040333.JPEG n04483307/
+mv val/ILSVRC2012_val_00040334.JPEG n01797886/
+mv val/ILSVRC2012_val_00040335.JPEG n02095889/
+mv val/ILSVRC2012_val_00040336.JPEG n03958227/
+mv val/ILSVRC2012_val_00040337.JPEG n04548280/
+mv val/ILSVRC2012_val_00040338.JPEG n02410509/
+mv val/ILSVRC2012_val_00040339.JPEG n03837869/
+mv val/ILSVRC2012_val_00040340.JPEG n03720891/
+mv val/ILSVRC2012_val_00040341.JPEG n04435653/
+mv val/ILSVRC2012_val_00040342.JPEG n01498041/
+mv val/ILSVRC2012_val_00040343.JPEG n02749479/
+mv val/ILSVRC2012_val_00040344.JPEG n07718747/
+mv val/ILSVRC2012_val_00040345.JPEG n04461696/
+mv val/ILSVRC2012_val_00040346.JPEG n03388043/
+mv val/ILSVRC2012_val_00040347.JPEG n02133161/
+mv val/ILSVRC2012_val_00040348.JPEG n02165105/
+mv val/ILSVRC2012_val_00040349.JPEG n02817516/
+mv val/ILSVRC2012_val_00040350.JPEG n04532670/
+mv val/ILSVRC2012_val_00040351.JPEG n02013706/
+mv val/ILSVRC2012_val_00040352.JPEG n01682714/
+mv val/ILSVRC2012_val_00040353.JPEG n02102177/
+mv val/ILSVRC2012_val_00040354.JPEG n03290653/
+mv val/ILSVRC2012_val_00040355.JPEG n04086273/
+mv val/ILSVRC2012_val_00040356.JPEG n02090379/
+mv val/ILSVRC2012_val_00040357.JPEG n01797886/
+mv val/ILSVRC2012_val_00040358.JPEG n01440764/
+mv val/ILSVRC2012_val_00040359.JPEG n01818515/
+mv val/ILSVRC2012_val_00040360.JPEG n04562935/
+mv val/ILSVRC2012_val_00040361.JPEG n02782093/
+mv val/ILSVRC2012_val_00040362.JPEG n03793489/
+mv val/ILSVRC2012_val_00040363.JPEG n11879895/
+mv val/ILSVRC2012_val_00040364.JPEG n02814860/
+mv val/ILSVRC2012_val_00040365.JPEG n02669723/
+mv val/ILSVRC2012_val_00040366.JPEG n02974003/
+mv val/ILSVRC2012_val_00040367.JPEG n07693725/
+mv val/ILSVRC2012_val_00040368.JPEG n02104029/
+mv val/ILSVRC2012_val_00040369.JPEG n03372029/
+mv val/ILSVRC2012_val_00040370.JPEG n03045698/
+mv val/ILSVRC2012_val_00040371.JPEG n03100240/
+mv val/ILSVRC2012_val_00040372.JPEG n02127052/
+mv val/ILSVRC2012_val_00040373.JPEG n07579787/
+mv val/ILSVRC2012_val_00040374.JPEG n03874599/
+mv val/ILSVRC2012_val_00040375.JPEG n02504458/
+mv val/ILSVRC2012_val_00040376.JPEG n02132136/
+mv val/ILSVRC2012_val_00040377.JPEG n03692522/
+mv val/ILSVRC2012_val_00040378.JPEG n04517823/
+mv val/ILSVRC2012_val_00040379.JPEG n03223299/
+mv val/ILSVRC2012_val_00040380.JPEG n04418357/
+mv val/ILSVRC2012_val_00040381.JPEG n02110806/
+mv val/ILSVRC2012_val_00040382.JPEG n01728572/
+mv val/ILSVRC2012_val_00040383.JPEG n04259630/
+mv val/ILSVRC2012_val_00040384.JPEG n03930313/
+mv val/ILSVRC2012_val_00040385.JPEG n02321529/
+mv val/ILSVRC2012_val_00040386.JPEG n02105251/
+mv val/ILSVRC2012_val_00040387.JPEG n04317175/
+mv val/ILSVRC2012_val_00040388.JPEG n01491361/
+mv val/ILSVRC2012_val_00040389.JPEG n07753275/
+mv val/ILSVRC2012_val_00040390.JPEG n02028035/
+mv val/ILSVRC2012_val_00040391.JPEG n04476259/
+mv val/ILSVRC2012_val_00040392.JPEG n03742115/
+mv val/ILSVRC2012_val_00040393.JPEG n03032252/
+mv val/ILSVRC2012_val_00040394.JPEG n02328150/
+mv val/ILSVRC2012_val_00040395.JPEG n04591713/
+mv val/ILSVRC2012_val_00040396.JPEG n02088094/
+mv val/ILSVRC2012_val_00040397.JPEG n02190166/
+mv val/ILSVRC2012_val_00040398.JPEG n04067472/
+mv val/ILSVRC2012_val_00040399.JPEG n03134739/
+mv val/ILSVRC2012_val_00040400.JPEG n02102318/
+mv val/ILSVRC2012_val_00040401.JPEG n03026506/
+mv val/ILSVRC2012_val_00040402.JPEG n04371430/
+mv val/ILSVRC2012_val_00040403.JPEG n03535780/
+mv val/ILSVRC2012_val_00040404.JPEG n01614925/
+mv val/ILSVRC2012_val_00040405.JPEG n02111889/
+mv val/ILSVRC2012_val_00040406.JPEG n03977966/
+mv val/ILSVRC2012_val_00040407.JPEG n03131574/
+mv val/ILSVRC2012_val_00040408.JPEG n02071294/
+mv val/ILSVRC2012_val_00040409.JPEG n02110627/
+mv val/ILSVRC2012_val_00040410.JPEG n02109961/
+mv val/ILSVRC2012_val_00040411.JPEG n02412080/
+mv val/ILSVRC2012_val_00040412.JPEG n01580077/
+mv val/ILSVRC2012_val_00040413.JPEG n06359193/
+mv val/ILSVRC2012_val_00040414.JPEG n04209133/
+mv val/ILSVRC2012_val_00040415.JPEG n03775546/
+mv val/ILSVRC2012_val_00040416.JPEG n03630383/
+mv val/ILSVRC2012_val_00040417.JPEG n01753488/
+mv val/ILSVRC2012_val_00040418.JPEG n02672831/
+mv val/ILSVRC2012_val_00040419.JPEG n02092339/
+mv val/ILSVRC2012_val_00040420.JPEG n01644900/
+mv val/ILSVRC2012_val_00040421.JPEG n07730033/
+mv val/ILSVRC2012_val_00040422.JPEG n03124043/
+mv val/ILSVRC2012_val_00040423.JPEG n04065272/
+mv val/ILSVRC2012_val_00040424.JPEG n03697007/
+mv val/ILSVRC2012_val_00040425.JPEG n01616318/
+mv val/ILSVRC2012_val_00040426.JPEG n01558993/
+mv val/ILSVRC2012_val_00040427.JPEG n02107683/
+mv val/ILSVRC2012_val_00040428.JPEG n04044716/
+mv val/ILSVRC2012_val_00040429.JPEG n03877472/
+mv val/ILSVRC2012_val_00040430.JPEG n02786058/
+mv val/ILSVRC2012_val_00040431.JPEG n02087046/
+mv val/ILSVRC2012_val_00040432.JPEG n07717410/
+mv val/ILSVRC2012_val_00040433.JPEG n04019541/
+mv val/ILSVRC2012_val_00040434.JPEG n01622779/
+mv val/ILSVRC2012_val_00040435.JPEG n03337140/
+mv val/ILSVRC2012_val_00040436.JPEG n02978881/
+mv val/ILSVRC2012_val_00040437.JPEG n04131690/
+mv val/ILSVRC2012_val_00040438.JPEG n03887697/
+mv val/ILSVRC2012_val_00040439.JPEG n01582220/
+mv val/ILSVRC2012_val_00040440.JPEG n02536864/
+mv val/ILSVRC2012_val_00040441.JPEG n04065272/
+mv val/ILSVRC2012_val_00040442.JPEG n02977058/
+mv val/ILSVRC2012_val_00040443.JPEG n03825788/
+mv val/ILSVRC2012_val_00040444.JPEG n01687978/
+mv val/ILSVRC2012_val_00040445.JPEG n01756291/
+mv val/ILSVRC2012_val_00040446.JPEG n04486054/
+mv val/ILSVRC2012_val_00040447.JPEG n01737021/
+mv val/ILSVRC2012_val_00040448.JPEG n01968897/
+mv val/ILSVRC2012_val_00040449.JPEG n03047690/
+mv val/ILSVRC2012_val_00040450.JPEG n02106166/
+mv val/ILSVRC2012_val_00040451.JPEG n02259212/
+mv val/ILSVRC2012_val_00040452.JPEG n02326432/
+mv val/ILSVRC2012_val_00040453.JPEG n04476259/
+mv val/ILSVRC2012_val_00040454.JPEG n02115913/
+mv val/ILSVRC2012_val_00040455.JPEG n02006656/
+mv val/ILSVRC2012_val_00040456.JPEG n04254120/
+mv val/ILSVRC2012_val_00040457.JPEG n02871525/
+mv val/ILSVRC2012_val_00040458.JPEG n03220513/
+mv val/ILSVRC2012_val_00040459.JPEG n03769881/
+mv val/ILSVRC2012_val_00040460.JPEG n03692522/
+mv val/ILSVRC2012_val_00040461.JPEG n02730930/
+mv val/ILSVRC2012_val_00040462.JPEG n04235860/
+mv val/ILSVRC2012_val_00040463.JPEG n02112018/
+mv val/ILSVRC2012_val_00040464.JPEG n02107142/
+mv val/ILSVRC2012_val_00040465.JPEG n02834397/
+mv val/ILSVRC2012_val_00040466.JPEG n04008634/
+mv val/ILSVRC2012_val_00040467.JPEG n02100583/
+mv val/ILSVRC2012_val_00040468.JPEG n01729977/
+mv val/ILSVRC2012_val_00040469.JPEG n07714571/
+mv val/ILSVRC2012_val_00040470.JPEG n01629819/
+mv val/ILSVRC2012_val_00040471.JPEG n02028035/
+mv val/ILSVRC2012_val_00040472.JPEG n03724870/
+mv val/ILSVRC2012_val_00040473.JPEG n04355933/
+mv val/ILSVRC2012_val_00040474.JPEG n01614925/
+mv val/ILSVRC2012_val_00040475.JPEG n07714571/
+mv val/ILSVRC2012_val_00040476.JPEG n07584110/
+mv val/ILSVRC2012_val_00040477.JPEG n02870880/
+mv val/ILSVRC2012_val_00040478.JPEG n13054560/
+mv val/ILSVRC2012_val_00040479.JPEG n02727426/
+mv val/ILSVRC2012_val_00040480.JPEG n03877472/
+mv val/ILSVRC2012_val_00040481.JPEG n04263257/
+mv val/ILSVRC2012_val_00040482.JPEG n04127249/
+mv val/ILSVRC2012_val_00040483.JPEG n03630383/
+mv val/ILSVRC2012_val_00040484.JPEG n01978287/
+mv val/ILSVRC2012_val_00040485.JPEG n13044778/
+mv val/ILSVRC2012_val_00040486.JPEG n02509815/
+mv val/ILSVRC2012_val_00040487.JPEG n04251144/
+mv val/ILSVRC2012_val_00040488.JPEG n04141327/
+mv val/ILSVRC2012_val_00040489.JPEG n12620546/
+mv val/ILSVRC2012_val_00040490.JPEG n03388043/
+mv val/ILSVRC2012_val_00040491.JPEG n02951358/
+mv val/ILSVRC2012_val_00040492.JPEG n02412080/
+mv val/ILSVRC2012_val_00040493.JPEG n03110669/
+mv val/ILSVRC2012_val_00040494.JPEG n03937543/
+mv val/ILSVRC2012_val_00040495.JPEG n04044716/
+mv val/ILSVRC2012_val_00040496.JPEG n02101388/
+mv val/ILSVRC2012_val_00040497.JPEG n07716358/
+mv val/ILSVRC2012_val_00040498.JPEG n04462240/
+mv val/ILSVRC2012_val_00040499.JPEG n03933933/
+mv val/ILSVRC2012_val_00040500.JPEG n02840245/
+mv val/ILSVRC2012_val_00040501.JPEG n03485407/
+mv val/ILSVRC2012_val_00040502.JPEG n03461385/
+mv val/ILSVRC2012_val_00040503.JPEG n02119789/
+mv val/ILSVRC2012_val_00040504.JPEG n01944390/
+mv val/ILSVRC2012_val_00040505.JPEG n01924916/
+mv val/ILSVRC2012_val_00040506.JPEG n04127249/
+mv val/ILSVRC2012_val_00040507.JPEG n04209239/
+mv val/ILSVRC2012_val_00040508.JPEG n03908618/
+mv val/ILSVRC2012_val_00040509.JPEG n03133878/
+mv val/ILSVRC2012_val_00040510.JPEG n03992509/
+mv val/ILSVRC2012_val_00040511.JPEG n02410509/
+mv val/ILSVRC2012_val_00040512.JPEG n03796401/
+mv val/ILSVRC2012_val_00040513.JPEG n01798484/
+mv val/ILSVRC2012_val_00040514.JPEG n04557648/
+mv val/ILSVRC2012_val_00040515.JPEG n02088632/
+mv val/ILSVRC2012_val_00040516.JPEG n03000247/
+mv val/ILSVRC2012_val_00040517.JPEG n02971356/
+mv val/ILSVRC2012_val_00040518.JPEG n03840681/
+mv val/ILSVRC2012_val_00040519.JPEG n01776313/
+mv val/ILSVRC2012_val_00040520.JPEG n01773157/
+mv val/ILSVRC2012_val_00040521.JPEG n04366367/
+mv val/ILSVRC2012_val_00040522.JPEG n03325584/
+mv val/ILSVRC2012_val_00040523.JPEG n03873416/
+mv val/ILSVRC2012_val_00040524.JPEG n01807496/
+mv val/ILSVRC2012_val_00040525.JPEG n02790996/
+mv val/ILSVRC2012_val_00040526.JPEG n09421951/
+mv val/ILSVRC2012_val_00040527.JPEG n07734744/
+mv val/ILSVRC2012_val_00040528.JPEG n03000247/
+mv val/ILSVRC2012_val_00040529.JPEG n04597913/
+mv val/ILSVRC2012_val_00040530.JPEG n04332243/
+mv val/ILSVRC2012_val_00040531.JPEG n02408429/
+mv val/ILSVRC2012_val_00040532.JPEG n01677366/
+mv val/ILSVRC2012_val_00040533.JPEG n02229544/
+mv val/ILSVRC2012_val_00040534.JPEG n03891251/
+mv val/ILSVRC2012_val_00040535.JPEG n02110063/
+mv val/ILSVRC2012_val_00040536.JPEG n03532672/
+mv val/ILSVRC2012_val_00040537.JPEG n03937543/
+mv val/ILSVRC2012_val_00040538.JPEG n01558993/
+mv val/ILSVRC2012_val_00040539.JPEG n04540053/
+mv val/ILSVRC2012_val_00040540.JPEG n12057211/
+mv val/ILSVRC2012_val_00040541.JPEG n03388183/
+mv val/ILSVRC2012_val_00040542.JPEG n02841315/
+mv val/ILSVRC2012_val_00040543.JPEG n09399592/
+mv val/ILSVRC2012_val_00040544.JPEG n03933933/
+mv val/ILSVRC2012_val_00040545.JPEG n02823428/
+mv val/ILSVRC2012_val_00040546.JPEG n02102040/
+mv val/ILSVRC2012_val_00040547.JPEG n02690373/
+mv val/ILSVRC2012_val_00040548.JPEG n02895154/
+mv val/ILSVRC2012_val_00040549.JPEG n02085936/
+mv val/ILSVRC2012_val_00040550.JPEG n04458633/
+mv val/ILSVRC2012_val_00040551.JPEG n02415577/
+mv val/ILSVRC2012_val_00040552.JPEG n04579432/
+mv val/ILSVRC2012_val_00040553.JPEG n04557648/
+mv val/ILSVRC2012_val_00040554.JPEG n03630383/
+mv val/ILSVRC2012_val_00040555.JPEG n02009912/
+mv val/ILSVRC2012_val_00040556.JPEG n02113978/
+mv val/ILSVRC2012_val_00040557.JPEG n03000247/
+mv val/ILSVRC2012_val_00040558.JPEG n09246464/
+mv val/ILSVRC2012_val_00040559.JPEG n03498962/
+mv val/ILSVRC2012_val_00040560.JPEG n02992211/
+mv val/ILSVRC2012_val_00040561.JPEG n03249569/
+mv val/ILSVRC2012_val_00040562.JPEG n03930313/
+mv val/ILSVRC2012_val_00040563.JPEG n01632458/
+mv val/ILSVRC2012_val_00040564.JPEG n02086910/
+mv val/ILSVRC2012_val_00040565.JPEG n02097209/
+mv val/ILSVRC2012_val_00040566.JPEG n03032252/
+mv val/ILSVRC2012_val_00040567.JPEG n01496331/
+mv val/ILSVRC2012_val_00040568.JPEG n04118538/
+mv val/ILSVRC2012_val_00040569.JPEG n03272010/
+mv val/ILSVRC2012_val_00040570.JPEG n02095314/
+mv val/ILSVRC2012_val_00040571.JPEG n02930766/
+mv val/ILSVRC2012_val_00040572.JPEG n02112137/
+mv val/ILSVRC2012_val_00040573.JPEG n03697007/
+mv val/ILSVRC2012_val_00040574.JPEG n04127249/
+mv val/ILSVRC2012_val_00040575.JPEG n04141076/
+mv val/ILSVRC2012_val_00040576.JPEG n03376595/
+mv val/ILSVRC2012_val_00040577.JPEG n07613480/
+mv val/ILSVRC2012_val_00040578.JPEG n04023962/
+mv val/ILSVRC2012_val_00040579.JPEG n03958227/
+mv val/ILSVRC2012_val_00040580.JPEG n04515003/
+mv val/ILSVRC2012_val_00040581.JPEG n04596742/
+mv val/ILSVRC2012_val_00040582.JPEG n02108000/
+mv val/ILSVRC2012_val_00040583.JPEG n03874599/
+mv val/ILSVRC2012_val_00040584.JPEG n01776313/
+mv val/ILSVRC2012_val_00040585.JPEG n02088238/
+mv val/ILSVRC2012_val_00040586.JPEG n01950731/
+mv val/ILSVRC2012_val_00040587.JPEG n02086910/
+mv val/ILSVRC2012_val_00040588.JPEG n03384352/
+mv val/ILSVRC2012_val_00040589.JPEG n02093859/
+mv val/ILSVRC2012_val_00040590.JPEG n02088632/
+mv val/ILSVRC2012_val_00040591.JPEG n02749479/
+mv val/ILSVRC2012_val_00040592.JPEG n01631663/
+mv val/ILSVRC2012_val_00040593.JPEG n01955084/
+mv val/ILSVRC2012_val_00040594.JPEG n04275548/
+mv val/ILSVRC2012_val_00040595.JPEG n02493793/
+mv val/ILSVRC2012_val_00040596.JPEG n03690938/
+mv val/ILSVRC2012_val_00040597.JPEG n02802426/
+mv val/ILSVRC2012_val_00040598.JPEG n02110341/
+mv val/ILSVRC2012_val_00040599.JPEG n02906734/
+mv val/ILSVRC2012_val_00040600.JPEG n02124075/
+mv val/ILSVRC2012_val_00040601.JPEG n03991062/
+mv val/ILSVRC2012_val_00040602.JPEG n03584254/
+mv val/ILSVRC2012_val_00040603.JPEG n03444034/
+mv val/ILSVRC2012_val_00040604.JPEG n02979186/
+mv val/ILSVRC2012_val_00040605.JPEG n03888605/
+mv val/ILSVRC2012_val_00040606.JPEG n01534433/
+mv val/ILSVRC2012_val_00040607.JPEG n02129165/
+mv val/ILSVRC2012_val_00040608.JPEG n01614925/
+mv val/ILSVRC2012_val_00040609.JPEG n02397096/
+mv val/ILSVRC2012_val_00040610.JPEG n12985857/
+mv val/ILSVRC2012_val_00040611.JPEG n02123159/
+mv val/ILSVRC2012_val_00040612.JPEG n01984695/
+mv val/ILSVRC2012_val_00040613.JPEG n02097047/
+mv val/ILSVRC2012_val_00040614.JPEG n01616318/
+mv val/ILSVRC2012_val_00040615.JPEG n02117135/
+mv val/ILSVRC2012_val_00040616.JPEG n01682714/
+mv val/ILSVRC2012_val_00040617.JPEG n03814906/
+mv val/ILSVRC2012_val_00040618.JPEG n02105251/
+mv val/ILSVRC2012_val_00040619.JPEG n01877812/
+mv val/ILSVRC2012_val_00040620.JPEG n04367480/
+mv val/ILSVRC2012_val_00040621.JPEG n01770081/
+mv val/ILSVRC2012_val_00040622.JPEG n02099849/
+mv val/ILSVRC2012_val_00040623.JPEG n02328150/
+mv val/ILSVRC2012_val_00040624.JPEG n07590611/
+mv val/ILSVRC2012_val_00040625.JPEG n07734744/
+mv val/ILSVRC2012_val_00040626.JPEG n03673027/
+mv val/ILSVRC2012_val_00040627.JPEG n02129165/
+mv val/ILSVRC2012_val_00040628.JPEG n02111500/
+mv val/ILSVRC2012_val_00040629.JPEG n04090263/
+mv val/ILSVRC2012_val_00040630.JPEG n02129604/
+mv val/ILSVRC2012_val_00040631.JPEG n02894605/
+mv val/ILSVRC2012_val_00040632.JPEG n02128757/
+mv val/ILSVRC2012_val_00040633.JPEG n04238763/
+mv val/ILSVRC2012_val_00040634.JPEG n03720891/
+mv val/ILSVRC2012_val_00040635.JPEG n03793489/
+mv val/ILSVRC2012_val_00040636.JPEG n03424325/
+mv val/ILSVRC2012_val_00040637.JPEG n07716358/
+mv val/ILSVRC2012_val_00040638.JPEG n02493509/
+mv val/ILSVRC2012_val_00040639.JPEG n02099849/
+mv val/ILSVRC2012_val_00040640.JPEG n02091244/
+mv val/ILSVRC2012_val_00040641.JPEG n02097658/
+mv val/ILSVRC2012_val_00040642.JPEG n02138441/
+mv val/ILSVRC2012_val_00040643.JPEG n03047690/
+mv val/ILSVRC2012_val_00040644.JPEG n02093647/
+mv val/ILSVRC2012_val_00040645.JPEG n02108915/
+mv val/ILSVRC2012_val_00040646.JPEG n04263257/
+mv val/ILSVRC2012_val_00040647.JPEG n02129165/
+mv val/ILSVRC2012_val_00040648.JPEG n04335435/
+mv val/ILSVRC2012_val_00040649.JPEG n07760859/
+mv val/ILSVRC2012_val_00040650.JPEG n02091831/
+mv val/ILSVRC2012_val_00040651.JPEG n03445924/
+mv val/ILSVRC2012_val_00040652.JPEG n02280649/
+mv val/ILSVRC2012_val_00040653.JPEG n02640242/
+mv val/ILSVRC2012_val_00040654.JPEG n04613696/
+mv val/ILSVRC2012_val_00040655.JPEG n03527444/
+mv val/ILSVRC2012_val_00040656.JPEG n01798484/
+mv val/ILSVRC2012_val_00040657.JPEG n03995372/
+mv val/ILSVRC2012_val_00040658.JPEG n01728572/
+mv val/ILSVRC2012_val_00040659.JPEG n04004767/
+mv val/ILSVRC2012_val_00040660.JPEG n02099267/
+mv val/ILSVRC2012_val_00040661.JPEG n07920052/
+mv val/ILSVRC2012_val_00040662.JPEG n03709823/
+mv val/ILSVRC2012_val_00040663.JPEG n02095570/
+mv val/ILSVRC2012_val_00040664.JPEG n02018795/
+mv val/ILSVRC2012_val_00040665.JPEG n03642806/
+mv val/ILSVRC2012_val_00040666.JPEG n04074963/
+mv val/ILSVRC2012_val_00040667.JPEG n04141327/
+mv val/ILSVRC2012_val_00040668.JPEG n01917289/
+mv val/ILSVRC2012_val_00040669.JPEG n04131690/
+mv val/ILSVRC2012_val_00040670.JPEG n03250847/
+mv val/ILSVRC2012_val_00040671.JPEG n02104365/
+mv val/ILSVRC2012_val_00040672.JPEG n03602883/
+mv val/ILSVRC2012_val_00040673.JPEG n02093428/
+mv val/ILSVRC2012_val_00040674.JPEG n03109150/
+mv val/ILSVRC2012_val_00040675.JPEG n03240683/
+mv val/ILSVRC2012_val_00040676.JPEG n02086079/
+mv val/ILSVRC2012_val_00040677.JPEG n02114712/
+mv val/ILSVRC2012_val_00040678.JPEG n02093256/
+mv val/ILSVRC2012_val_00040679.JPEG n02102040/
+mv val/ILSVRC2012_val_00040680.JPEG n03495258/
+mv val/ILSVRC2012_val_00040681.JPEG n04584207/
+mv val/ILSVRC2012_val_00040682.JPEG n02870880/
+mv val/ILSVRC2012_val_00040683.JPEG n02916936/
+mv val/ILSVRC2012_val_00040684.JPEG n07875152/
+mv val/ILSVRC2012_val_00040685.JPEG n07583066/
+mv val/ILSVRC2012_val_00040686.JPEG n02730930/
+mv val/ILSVRC2012_val_00040687.JPEG n04019541/
+mv val/ILSVRC2012_val_00040688.JPEG n04254120/
+mv val/ILSVRC2012_val_00040689.JPEG n02666196/
+mv val/ILSVRC2012_val_00040690.JPEG n03141823/
+mv val/ILSVRC2012_val_00040691.JPEG n03063689/
+mv val/ILSVRC2012_val_00040692.JPEG n06596364/
+mv val/ILSVRC2012_val_00040693.JPEG n02906734/
+mv val/ILSVRC2012_val_00040694.JPEG n03445777/
+mv val/ILSVRC2012_val_00040695.JPEG n02971356/
+mv val/ILSVRC2012_val_00040696.JPEG n03891332/
+mv val/ILSVRC2012_val_00040697.JPEG n07892512/
+mv val/ILSVRC2012_val_00040698.JPEG n02442845/
+mv val/ILSVRC2012_val_00040699.JPEG n03527444/
+mv val/ILSVRC2012_val_00040700.JPEG n02667093/
+mv val/ILSVRC2012_val_00040701.JPEG n01806143/
+mv val/ILSVRC2012_val_00040702.JPEG n03902125/
+mv val/ILSVRC2012_val_00040703.JPEG n02457408/
+mv val/ILSVRC2012_val_00040704.JPEG n01693334/
+mv val/ILSVRC2012_val_00040705.JPEG n02799071/
+mv val/ILSVRC2012_val_00040706.JPEG n02814533/
+mv val/ILSVRC2012_val_00040707.JPEG n06874185/
+mv val/ILSVRC2012_val_00040708.JPEG n02088466/
+mv val/ILSVRC2012_val_00040709.JPEG n03825788/
+mv val/ILSVRC2012_val_00040710.JPEG n01484850/
+mv val/ILSVRC2012_val_00040711.JPEG n03355925/
+mv val/ILSVRC2012_val_00040712.JPEG n02095889/
+mv val/ILSVRC2012_val_00040713.JPEG n02086646/
+mv val/ILSVRC2012_val_00040714.JPEG n03942813/
+mv val/ILSVRC2012_val_00040715.JPEG n03425413/
+mv val/ILSVRC2012_val_00040716.JPEG n04550184/
+mv val/ILSVRC2012_val_00040717.JPEG n02817516/
+mv val/ILSVRC2012_val_00040718.JPEG n04049303/
+mv val/ILSVRC2012_val_00040719.JPEG n04483307/
+mv val/ILSVRC2012_val_00040720.JPEG n02097209/
+mv val/ILSVRC2012_val_00040721.JPEG n03388549/
+mv val/ILSVRC2012_val_00040722.JPEG n02815834/
+mv val/ILSVRC2012_val_00040723.JPEG n02487347/
+mv val/ILSVRC2012_val_00040724.JPEG n02074367/
+mv val/ILSVRC2012_val_00040725.JPEG n02113186/
+mv val/ILSVRC2012_val_00040726.JPEG n02536864/
+mv val/ILSVRC2012_val_00040727.JPEG n02114855/
+mv val/ILSVRC2012_val_00040728.JPEG n07697313/
+mv val/ILSVRC2012_val_00040729.JPEG n03938244/
+mv val/ILSVRC2012_val_00040730.JPEG n02492035/
+mv val/ILSVRC2012_val_00040731.JPEG n02085620/
+mv val/ILSVRC2012_val_00040732.JPEG n02085620/
+mv val/ILSVRC2012_val_00040733.JPEG n03223299/
+mv val/ILSVRC2012_val_00040734.JPEG n04273569/
+mv val/ILSVRC2012_val_00040735.JPEG n03496892/
+mv val/ILSVRC2012_val_00040736.JPEG n03866082/
+mv val/ILSVRC2012_val_00040737.JPEG n03065424/
+mv val/ILSVRC2012_val_00040738.JPEG n03877845/
+mv val/ILSVRC2012_val_00040739.JPEG n02871525/
+mv val/ILSVRC2012_val_00040740.JPEG n03404251/
+mv val/ILSVRC2012_val_00040741.JPEG n04462240/
+mv val/ILSVRC2012_val_00040742.JPEG n02113799/
+mv val/ILSVRC2012_val_00040743.JPEG n02093859/
+mv val/ILSVRC2012_val_00040744.JPEG n03742115/
+mv val/ILSVRC2012_val_00040745.JPEG n02123045/
+mv val/ILSVRC2012_val_00040746.JPEG n04487081/
+mv val/ILSVRC2012_val_00040747.JPEG n02107312/
+mv val/ILSVRC2012_val_00040748.JPEG n03938244/
+mv val/ILSVRC2012_val_00040749.JPEG n02966687/
+mv val/ILSVRC2012_val_00040750.JPEG n02342885/
+mv val/ILSVRC2012_val_00040751.JPEG n03781244/
+mv val/ILSVRC2012_val_00040752.JPEG n02493509/
+mv val/ILSVRC2012_val_00040753.JPEG n02134084/
+mv val/ILSVRC2012_val_00040754.JPEG n02749479/
+mv val/ILSVRC2012_val_00040755.JPEG n07749582/
+mv val/ILSVRC2012_val_00040756.JPEG n12144580/
+mv val/ILSVRC2012_val_00040757.JPEG n02114548/
+mv val/ILSVRC2012_val_00040758.JPEG n13052670/
+mv val/ILSVRC2012_val_00040759.JPEG n07753113/
+mv val/ILSVRC2012_val_00040760.JPEG n03777754/
+mv val/ILSVRC2012_val_00040761.JPEG n07615774/
+mv val/ILSVRC2012_val_00040762.JPEG n02483708/
+mv val/ILSVRC2012_val_00040763.JPEG n01784675/
+mv val/ILSVRC2012_val_00040764.JPEG n01978287/
+mv val/ILSVRC2012_val_00040765.JPEG n02536864/
+mv val/ILSVRC2012_val_00040766.JPEG n02443484/
+mv val/ILSVRC2012_val_00040767.JPEG n03877472/
+mv val/ILSVRC2012_val_00040768.JPEG n04074963/
+mv val/ILSVRC2012_val_00040769.JPEG n01632777/
+mv val/ILSVRC2012_val_00040770.JPEG n02815834/
+mv val/ILSVRC2012_val_00040771.JPEG n01669191/
+mv val/ILSVRC2012_val_00040772.JPEG n02104029/
+mv val/ILSVRC2012_val_00040773.JPEG n02093859/
+mv val/ILSVRC2012_val_00040774.JPEG n01883070/
+mv val/ILSVRC2012_val_00040775.JPEG n01774750/
+mv val/ILSVRC2012_val_00040776.JPEG n01667778/
+mv val/ILSVRC2012_val_00040777.JPEG n01728920/
+mv val/ILSVRC2012_val_00040778.JPEG n02219486/
+mv val/ILSVRC2012_val_00040779.JPEG n03124170/
+mv val/ILSVRC2012_val_00040780.JPEG n02123394/
+mv val/ILSVRC2012_val_00040781.JPEG n01740131/
+mv val/ILSVRC2012_val_00040782.JPEG n04228054/
+mv val/ILSVRC2012_val_00040783.JPEG n01592084/
+mv val/ILSVRC2012_val_00040784.JPEG n02128925/
+mv val/ILSVRC2012_val_00040785.JPEG n02281787/
+mv val/ILSVRC2012_val_00040786.JPEG n02093647/
+mv val/ILSVRC2012_val_00040787.JPEG n01667778/
+mv val/ILSVRC2012_val_00040788.JPEG n02128925/
+mv val/ILSVRC2012_val_00040789.JPEG n01978287/
+mv val/ILSVRC2012_val_00040790.JPEG n02130308/
+mv val/ILSVRC2012_val_00040791.JPEG n03065424/
+mv val/ILSVRC2012_val_00040792.JPEG n12620546/
+mv val/ILSVRC2012_val_00040793.JPEG n13052670/
+mv val/ILSVRC2012_val_00040794.JPEG n02480855/
+mv val/ILSVRC2012_val_00040795.JPEG n03376595/
+mv val/ILSVRC2012_val_00040796.JPEG n07734744/
+mv val/ILSVRC2012_val_00040797.JPEG n04019541/
+mv val/ILSVRC2012_val_00040798.JPEG n02536864/
+mv val/ILSVRC2012_val_00040799.JPEG n04350905/
+mv val/ILSVRC2012_val_00040800.JPEG n01773549/
+mv val/ILSVRC2012_val_00040801.JPEG n03782006/
+mv val/ILSVRC2012_val_00040802.JPEG n02111129/
+mv val/ILSVRC2012_val_00040803.JPEG n01806567/
+mv val/ILSVRC2012_val_00040804.JPEG n07753275/
+mv val/ILSVRC2012_val_00040805.JPEG n02256656/
+mv val/ILSVRC2012_val_00040806.JPEG n01984695/
+mv val/ILSVRC2012_val_00040807.JPEG n04443257/
+mv val/ILSVRC2012_val_00040808.JPEG n02410509/
+mv val/ILSVRC2012_val_00040809.JPEG n02092339/
+mv val/ILSVRC2012_val_00040810.JPEG n02115913/
+mv val/ILSVRC2012_val_00040811.JPEG n01806143/
+mv val/ILSVRC2012_val_00040812.JPEG n02815834/
+mv val/ILSVRC2012_val_00040813.JPEG n03908618/
+mv val/ILSVRC2012_val_00040814.JPEG n02279972/
+mv val/ILSVRC2012_val_00040815.JPEG n03691459/
+mv val/ILSVRC2012_val_00040816.JPEG n03216828/
+mv val/ILSVRC2012_val_00040817.JPEG n04370456/
+mv val/ILSVRC2012_val_00040818.JPEG n02676566/
+mv val/ILSVRC2012_val_00040819.JPEG n03710721/
+mv val/ILSVRC2012_val_00040820.JPEG n01629819/
+mv val/ILSVRC2012_val_00040821.JPEG n03967562/
+mv val/ILSVRC2012_val_00040822.JPEG n03482405/
+mv val/ILSVRC2012_val_00040823.JPEG n04487081/
+mv val/ILSVRC2012_val_00040824.JPEG n01744401/
+mv val/ILSVRC2012_val_00040825.JPEG n02454379/
+mv val/ILSVRC2012_val_00040826.JPEG n02007558/
+mv val/ILSVRC2012_val_00040827.JPEG n03201208/
+mv val/ILSVRC2012_val_00040828.JPEG n03793489/
+mv val/ILSVRC2012_val_00040829.JPEG n03902125/
+mv val/ILSVRC2012_val_00040830.JPEG n02672831/
+mv val/ILSVRC2012_val_00040831.JPEG n03447447/
+mv val/ILSVRC2012_val_00040832.JPEG n02749479/
+mv val/ILSVRC2012_val_00040833.JPEG n01440764/
+mv val/ILSVRC2012_val_00040834.JPEG n03538406/
+mv val/ILSVRC2012_val_00040835.JPEG n03794056/
+mv val/ILSVRC2012_val_00040836.JPEG n02097130/
+mv val/ILSVRC2012_val_00040837.JPEG n04332243/
+mv val/ILSVRC2012_val_00040838.JPEG n02814860/
+mv val/ILSVRC2012_val_00040839.JPEG n02488291/
+mv val/ILSVRC2012_val_00040840.JPEG n03032252/
+mv val/ILSVRC2012_val_00040841.JPEG n02137549/
+mv val/ILSVRC2012_val_00040842.JPEG n02281406/
+mv val/ILSVRC2012_val_00040843.JPEG n01494475/
+mv val/ILSVRC2012_val_00040844.JPEG n02749479/
+mv val/ILSVRC2012_val_00040845.JPEG n04458633/
+mv val/ILSVRC2012_val_00040846.JPEG n01847000/
+mv val/ILSVRC2012_val_00040847.JPEG n03825788/
+mv val/ILSVRC2012_val_00040848.JPEG n01819313/
+mv val/ILSVRC2012_val_00040849.JPEG n01847000/
+mv val/ILSVRC2012_val_00040850.JPEG n03908618/
+mv val/ILSVRC2012_val_00040851.JPEG n03444034/
+mv val/ILSVRC2012_val_00040852.JPEG n02483362/
+mv val/ILSVRC2012_val_00040853.JPEG n04254680/
+mv val/ILSVRC2012_val_00040854.JPEG n02123597/
+mv val/ILSVRC2012_val_00040855.JPEG n03838899/
+mv val/ILSVRC2012_val_00040856.JPEG n02104029/
+mv val/ILSVRC2012_val_00040857.JPEG n03633091/
+mv val/ILSVRC2012_val_00040858.JPEG n03775546/
+mv val/ILSVRC2012_val_00040859.JPEG n01807496/
+mv val/ILSVRC2012_val_00040860.JPEG n03692522/
+mv val/ILSVRC2012_val_00040861.JPEG n03721384/
+mv val/ILSVRC2012_val_00040862.JPEG n04208210/
+mv val/ILSVRC2012_val_00040863.JPEG n02892767/
+mv val/ILSVRC2012_val_00040864.JPEG n02086240/
+mv val/ILSVRC2012_val_00040865.JPEG n02492660/
+mv val/ILSVRC2012_val_00040866.JPEG n04049303/
+mv val/ILSVRC2012_val_00040867.JPEG n04238763/
+mv val/ILSVRC2012_val_00040868.JPEG n03793489/
+mv val/ILSVRC2012_val_00040869.JPEG n02107574/
+mv val/ILSVRC2012_val_00040870.JPEG n02364673/
+mv val/ILSVRC2012_val_00040871.JPEG n02134084/
+mv val/ILSVRC2012_val_00040872.JPEG n02092339/
+mv val/ILSVRC2012_val_00040873.JPEG n02906734/
+mv val/ILSVRC2012_val_00040874.JPEG n04371774/
+mv val/ILSVRC2012_val_00040875.JPEG n02097658/
+mv val/ILSVRC2012_val_00040876.JPEG n02102040/
+mv val/ILSVRC2012_val_00040877.JPEG n01968897/
+mv val/ILSVRC2012_val_00040878.JPEG n02090622/
+mv val/ILSVRC2012_val_00040879.JPEG n03916031/
+mv val/ILSVRC2012_val_00040880.JPEG n03658185/
+mv val/ILSVRC2012_val_00040881.JPEG n02536864/
+mv val/ILSVRC2012_val_00040882.JPEG n03697007/
+mv val/ILSVRC2012_val_00040883.JPEG n03924679/
+mv val/ILSVRC2012_val_00040884.JPEG n02325366/
+mv val/ILSVRC2012_val_00040885.JPEG n03337140/
+mv val/ILSVRC2012_val_00040886.JPEG n02999410/
+mv val/ILSVRC2012_val_00040887.JPEG n01983481/
+mv val/ILSVRC2012_val_00040888.JPEG n03141823/
+mv val/ILSVRC2012_val_00040889.JPEG n03662601/
+mv val/ILSVRC2012_val_00040890.JPEG n01729322/
+mv val/ILSVRC2012_val_00040891.JPEG n02676566/
+mv val/ILSVRC2012_val_00040892.JPEG n02992211/
+mv val/ILSVRC2012_val_00040893.JPEG n03089624/
+mv val/ILSVRC2012_val_00040894.JPEG n01632777/
+mv val/ILSVRC2012_val_00040895.JPEG n02443484/
+mv val/ILSVRC2012_val_00040896.JPEG n03534580/
+mv val/ILSVRC2012_val_00040897.JPEG n01847000/
+mv val/ILSVRC2012_val_00040898.JPEG n02102318/
+mv val/ILSVRC2012_val_00040899.JPEG n01855032/
+mv val/ILSVRC2012_val_00040900.JPEG n03961711/
+mv val/ILSVRC2012_val_00040901.JPEG n03895866/
+mv val/ILSVRC2012_val_00040902.JPEG n02892767/
+mv val/ILSVRC2012_val_00040903.JPEG n01601694/
+mv val/ILSVRC2012_val_00040904.JPEG n02443484/
+mv val/ILSVRC2012_val_00040905.JPEG n03930313/
+mv val/ILSVRC2012_val_00040906.JPEG n03062245/
+mv val/ILSVRC2012_val_00040907.JPEG n02988304/
+mv val/ILSVRC2012_val_00040908.JPEG n02090622/
+mv val/ILSVRC2012_val_00040909.JPEG n02107908/
+mv val/ILSVRC2012_val_00040910.JPEG n03290653/
+mv val/ILSVRC2012_val_00040911.JPEG n04542943/
+mv val/ILSVRC2012_val_00040912.JPEG n04296562/
+mv val/ILSVRC2012_val_00040913.JPEG n01986214/
+mv val/ILSVRC2012_val_00040914.JPEG n02233338/
+mv val/ILSVRC2012_val_00040915.JPEG n02093991/
+mv val/ILSVRC2012_val_00040916.JPEG n03482405/
+mv val/ILSVRC2012_val_00040917.JPEG n02966193/
+mv val/ILSVRC2012_val_00040918.JPEG n03786901/
+mv val/ILSVRC2012_val_00040919.JPEG n02027492/
+mv val/ILSVRC2012_val_00040920.JPEG n04392985/
+mv val/ILSVRC2012_val_00040921.JPEG n03376595/
+mv val/ILSVRC2012_val_00040922.JPEG n07714990/
+mv val/ILSVRC2012_val_00040923.JPEG n02504013/
+mv val/ILSVRC2012_val_00040924.JPEG n04606251/
+mv val/ILSVRC2012_val_00040925.JPEG n03724870/
+mv val/ILSVRC2012_val_00040926.JPEG n02093991/
+mv val/ILSVRC2012_val_00040927.JPEG n03933933/
+mv val/ILSVRC2012_val_00040928.JPEG n02804414/
+mv val/ILSVRC2012_val_00040929.JPEG n03063599/
+mv val/ILSVRC2012_val_00040930.JPEG n01698640/
+mv val/ILSVRC2012_val_00040931.JPEG n03498962/
+mv val/ILSVRC2012_val_00040932.JPEG n04252225/
+mv val/ILSVRC2012_val_00040933.JPEG n02013706/
+mv val/ILSVRC2012_val_00040934.JPEG n03026506/
+mv val/ILSVRC2012_val_00040935.JPEG n03787032/
+mv val/ILSVRC2012_val_00040936.JPEG n04536866/
+mv val/ILSVRC2012_val_00040937.JPEG n02100583/
+mv val/ILSVRC2012_val_00040938.JPEG n01582220/
+mv val/ILSVRC2012_val_00040939.JPEG n02500267/
+mv val/ILSVRC2012_val_00040940.JPEG n03388183/
+mv val/ILSVRC2012_val_00040941.JPEG n07693725/
+mv val/ILSVRC2012_val_00040942.JPEG n02033041/
+mv val/ILSVRC2012_val_00040943.JPEG n03908714/
+mv val/ILSVRC2012_val_00040944.JPEG n02219486/
+mv val/ILSVRC2012_val_00040945.JPEG n02730930/
+mv val/ILSVRC2012_val_00040946.JPEG n03710193/
+mv val/ILSVRC2012_val_00040947.JPEG n02108915/
+mv val/ILSVRC2012_val_00040948.JPEG n01749939/
+mv val/ILSVRC2012_val_00040949.JPEG n02817516/
+mv val/ILSVRC2012_val_00040950.JPEG n01729977/
+mv val/ILSVRC2012_val_00040951.JPEG n02086910/
+mv val/ILSVRC2012_val_00040952.JPEG n02107908/
+mv val/ILSVRC2012_val_00040953.JPEG n03450230/
+mv val/ILSVRC2012_val_00040954.JPEG n07565083/
+mv val/ILSVRC2012_val_00040955.JPEG n02128385/
+mv val/ILSVRC2012_val_00040956.JPEG n03141823/
+mv val/ILSVRC2012_val_00040957.JPEG n04259630/
+mv val/ILSVRC2012_val_00040958.JPEG n01914609/
+mv val/ILSVRC2012_val_00040959.JPEG n07697537/
+mv val/ILSVRC2012_val_00040960.JPEG n04447861/
+mv val/ILSVRC2012_val_00040961.JPEG n02099849/
+mv val/ILSVRC2012_val_00040962.JPEG n03126707/
+mv val/ILSVRC2012_val_00040963.JPEG n01943899/
+mv val/ILSVRC2012_val_00040964.JPEG n04118776/
+mv val/ILSVRC2012_val_00040965.JPEG n02791124/
+mv val/ILSVRC2012_val_00040966.JPEG n03763968/
+mv val/ILSVRC2012_val_00040967.JPEG n03492542/
+mv val/ILSVRC2012_val_00040968.JPEG n02094433/
+mv val/ILSVRC2012_val_00040969.JPEG n04366367/
+mv val/ILSVRC2012_val_00040970.JPEG n01614925/
+mv val/ILSVRC2012_val_00040971.JPEG n02007558/
+mv val/ILSVRC2012_val_00040972.JPEG n02128757/
+mv val/ILSVRC2012_val_00040973.JPEG n04019541/
+mv val/ILSVRC2012_val_00040974.JPEG n04612504/
+mv val/ILSVRC2012_val_00040975.JPEG n02841315/
+mv val/ILSVRC2012_val_00040976.JPEG n13044778/
+mv val/ILSVRC2012_val_00040977.JPEG n04147183/
+mv val/ILSVRC2012_val_00040978.JPEG n03933933/
+mv val/ILSVRC2012_val_00040979.JPEG n02110627/
+mv val/ILSVRC2012_val_00040980.JPEG n02226429/
+mv val/ILSVRC2012_val_00040981.JPEG n01631663/
+mv val/ILSVRC2012_val_00040982.JPEG n03676483/
+mv val/ILSVRC2012_val_00040983.JPEG n02487347/
+mv val/ILSVRC2012_val_00040984.JPEG n04507155/
+mv val/ILSVRC2012_val_00040985.JPEG n03216828/
+mv val/ILSVRC2012_val_00040986.JPEG n07718472/
+mv val/ILSVRC2012_val_00040987.JPEG n02058221/
+mv val/ILSVRC2012_val_00040988.JPEG n03127747/
+mv val/ILSVRC2012_val_00040989.JPEG n07745940/
+mv val/ILSVRC2012_val_00040990.JPEG n02102177/
+mv val/ILSVRC2012_val_00040991.JPEG n02113712/
+mv val/ILSVRC2012_val_00040992.JPEG n02965783/
+mv val/ILSVRC2012_val_00040993.JPEG n03840681/
+mv val/ILSVRC2012_val_00040994.JPEG n04310018/
+mv val/ILSVRC2012_val_00040995.JPEG n01774384/
+mv val/ILSVRC2012_val_00040996.JPEG n02177972/
+mv val/ILSVRC2012_val_00040997.JPEG n03063599/
+mv val/ILSVRC2012_val_00040998.JPEG n01697457/
+mv val/ILSVRC2012_val_00040999.JPEG n03759954/
+mv val/ILSVRC2012_val_00041000.JPEG n02085620/
+mv val/ILSVRC2012_val_00041001.JPEG n07753113/
+mv val/ILSVRC2012_val_00041002.JPEG n03393912/
+mv val/ILSVRC2012_val_00041003.JPEG n02692877/
+mv val/ILSVRC2012_val_00041004.JPEG n03868242/
+mv val/ILSVRC2012_val_00041005.JPEG n02403003/
+mv val/ILSVRC2012_val_00041006.JPEG n03249569/
+mv val/ILSVRC2012_val_00041007.JPEG n03884397/
+mv val/ILSVRC2012_val_00041008.JPEG n02396427/
+mv val/ILSVRC2012_val_00041009.JPEG n03457902/
+mv val/ILSVRC2012_val_00041010.JPEG n07718747/
+mv val/ILSVRC2012_val_00041011.JPEG n02167151/
+mv val/ILSVRC2012_val_00041012.JPEG n04154565/
+mv val/ILSVRC2012_val_00041013.JPEG n04147183/
+mv val/ILSVRC2012_val_00041014.JPEG n04118538/
+mv val/ILSVRC2012_val_00041015.JPEG n03124043/
+mv val/ILSVRC2012_val_00041016.JPEG n04372370/
+mv val/ILSVRC2012_val_00041017.JPEG n01667114/
+mv val/ILSVRC2012_val_00041018.JPEG n03998194/
+mv val/ILSVRC2012_val_00041019.JPEG n03995372/
+mv val/ILSVRC2012_val_00041020.JPEG n10565667/
+mv val/ILSVRC2012_val_00041021.JPEG n01798484/
+mv val/ILSVRC2012_val_00041022.JPEG n04591157/
+mv val/ILSVRC2012_val_00041023.JPEG n03127747/
+mv val/ILSVRC2012_val_00041024.JPEG n02105641/
+mv val/ILSVRC2012_val_00041025.JPEG n03485407/
+mv val/ILSVRC2012_val_00041026.JPEG n02102177/
+mv val/ILSVRC2012_val_00041027.JPEG n04461696/
+mv val/ILSVRC2012_val_00041028.JPEG n01824575/
+mv val/ILSVRC2012_val_00041029.JPEG n02066245/
+mv val/ILSVRC2012_val_00041030.JPEG n04317175/
+mv val/ILSVRC2012_val_00041031.JPEG n02107312/
+mv val/ILSVRC2012_val_00041032.JPEG n06874185/
+mv val/ILSVRC2012_val_00041033.JPEG n04465501/
+mv val/ILSVRC2012_val_00041034.JPEG n02939185/
+mv val/ILSVRC2012_val_00041035.JPEG n04019541/
+mv val/ILSVRC2012_val_00041036.JPEG n03459775/
+mv val/ILSVRC2012_val_00041037.JPEG n04548280/
+mv val/ILSVRC2012_val_00041038.JPEG n03047690/
+mv val/ILSVRC2012_val_00041039.JPEG n04325704/
+mv val/ILSVRC2012_val_00041040.JPEG n07871810/
+mv val/ILSVRC2012_val_00041041.JPEG n01819313/
+mv val/ILSVRC2012_val_00041042.JPEG n03782006/
+mv val/ILSVRC2012_val_00041043.JPEG n02086079/
+mv val/ILSVRC2012_val_00041044.JPEG n03584254/
+mv val/ILSVRC2012_val_00041045.JPEG n03929660/
+mv val/ILSVRC2012_val_00041046.JPEG n02492035/
+mv val/ILSVRC2012_val_00041047.JPEG n03670208/
+mv val/ILSVRC2012_val_00041048.JPEG n02412080/
+mv val/ILSVRC2012_val_00041049.JPEG n02109525/
+mv val/ILSVRC2012_val_00041050.JPEG n02397096/
+mv val/ILSVRC2012_val_00041051.JPEG n01582220/
+mv val/ILSVRC2012_val_00041052.JPEG n03188531/
+mv val/ILSVRC2012_val_00041053.JPEG n02105641/
+mv val/ILSVRC2012_val_00041054.JPEG n02033041/
+mv val/ILSVRC2012_val_00041055.JPEG n03992509/
+mv val/ILSVRC2012_val_00041056.JPEG n02328150/
+mv val/ILSVRC2012_val_00041057.JPEG n03000684/
+mv val/ILSVRC2012_val_00041058.JPEG n03126707/
+mv val/ILSVRC2012_val_00041059.JPEG n07590611/
+mv val/ILSVRC2012_val_00041060.JPEG n02102480/
+mv val/ILSVRC2012_val_00041061.JPEG n07684084/
+mv val/ILSVRC2012_val_00041062.JPEG n07590611/
+mv val/ILSVRC2012_val_00041063.JPEG n09421951/
+mv val/ILSVRC2012_val_00041064.JPEG n04285008/
+mv val/ILSVRC2012_val_00041065.JPEG n02930766/
+mv val/ILSVRC2012_val_00041066.JPEG n04604644/
+mv val/ILSVRC2012_val_00041067.JPEG n03584829/
+mv val/ILSVRC2012_val_00041068.JPEG n03447721/
+mv val/ILSVRC2012_val_00041069.JPEG n01693334/
+mv val/ILSVRC2012_val_00041070.JPEG n02910353/
+mv val/ILSVRC2012_val_00041071.JPEG n03532672/
+mv val/ILSVRC2012_val_00041072.JPEG n04127249/
+mv val/ILSVRC2012_val_00041073.JPEG n04154565/
+mv val/ILSVRC2012_val_00041074.JPEG n03014705/
+mv val/ILSVRC2012_val_00041075.JPEG n13052670/
+mv val/ILSVRC2012_val_00041076.JPEG n03483316/
+mv val/ILSVRC2012_val_00041077.JPEG n02817516/
+mv val/ILSVRC2012_val_00041078.JPEG n03759954/
+mv val/ILSVRC2012_val_00041079.JPEG n03733805/
+mv val/ILSVRC2012_val_00041080.JPEG n04204238/
+mv val/ILSVRC2012_val_00041081.JPEG n02110341/
+mv val/ILSVRC2012_val_00041082.JPEG n04147183/
+mv val/ILSVRC2012_val_00041083.JPEG n02007558/
+mv val/ILSVRC2012_val_00041084.JPEG n02268443/
+mv val/ILSVRC2012_val_00041085.JPEG n03133878/
+mv val/ILSVRC2012_val_00041086.JPEG n03255030/
+mv val/ILSVRC2012_val_00041087.JPEG n02442845/
+mv val/ILSVRC2012_val_00041088.JPEG n02018207/
+mv val/ILSVRC2012_val_00041089.JPEG n04069434/
+mv val/ILSVRC2012_val_00041090.JPEG n02667093/
+mv val/ILSVRC2012_val_00041091.JPEG n03866082/
+mv val/ILSVRC2012_val_00041092.JPEG n02113978/
+mv val/ILSVRC2012_val_00041093.JPEG n02108000/
+mv val/ILSVRC2012_val_00041094.JPEG n03832673/
+mv val/ILSVRC2012_val_00041095.JPEG n04039381/
+mv val/ILSVRC2012_val_00041096.JPEG n01677366/
+mv val/ILSVRC2012_val_00041097.JPEG n01955084/
+mv val/ILSVRC2012_val_00041098.JPEG n02113023/
+mv val/ILSVRC2012_val_00041099.JPEG n04371430/
+mv val/ILSVRC2012_val_00041100.JPEG n03134739/
+mv val/ILSVRC2012_val_00041101.JPEG n03840681/
+mv val/ILSVRC2012_val_00041102.JPEG n07714571/
+mv val/ILSVRC2012_val_00041103.JPEG n01955084/
+mv val/ILSVRC2012_val_00041104.JPEG n03785016/
+mv val/ILSVRC2012_val_00041105.JPEG n03924679/
+mv val/ILSVRC2012_val_00041106.JPEG n04443257/
+mv val/ILSVRC2012_val_00041107.JPEG n03709823/
+mv val/ILSVRC2012_val_00041108.JPEG n04204347/
+mv val/ILSVRC2012_val_00041109.JPEG n02086079/
+mv val/ILSVRC2012_val_00041110.JPEG n02361337/
+mv val/ILSVRC2012_val_00041111.JPEG n04317175/
+mv val/ILSVRC2012_val_00041112.JPEG n09229709/
+mv val/ILSVRC2012_val_00041113.JPEG n04270147/
+mv val/ILSVRC2012_val_00041114.JPEG n01518878/
+mv val/ILSVRC2012_val_00041115.JPEG n02105412/
+mv val/ILSVRC2012_val_00041116.JPEG n07720875/
+mv val/ILSVRC2012_val_00041117.JPEG n02177972/
+mv val/ILSVRC2012_val_00041118.JPEG n02098105/
+mv val/ILSVRC2012_val_00041119.JPEG n03534580/
+mv val/ILSVRC2012_val_00041120.JPEG n02492660/
+mv val/ILSVRC2012_val_00041121.JPEG n03954731/
+mv val/ILSVRC2012_val_00041122.JPEG n03874599/
+mv val/ILSVRC2012_val_00041123.JPEG n04243546/
+mv val/ILSVRC2012_val_00041124.JPEG n04344873/
+mv val/ILSVRC2012_val_00041125.JPEG n04252077/
+mv val/ILSVRC2012_val_00041126.JPEG n02009229/
+mv val/ILSVRC2012_val_00041127.JPEG n01774384/
+mv val/ILSVRC2012_val_00041128.JPEG n03843555/
+mv val/ILSVRC2012_val_00041129.JPEG n02988304/
+mv val/ILSVRC2012_val_00041130.JPEG n02422699/
+mv val/ILSVRC2012_val_00041131.JPEG n03045698/
+mv val/ILSVRC2012_val_00041132.JPEG n03775071/
+mv val/ILSVRC2012_val_00041133.JPEG n02098105/
+mv val/ILSVRC2012_val_00041134.JPEG n04099969/
+mv val/ILSVRC2012_val_00041135.JPEG n01582220/
+mv val/ILSVRC2012_val_00041136.JPEG n03026506/
+mv val/ILSVRC2012_val_00041137.JPEG n02099849/
+mv val/ILSVRC2012_val_00041138.JPEG n02814860/
+mv val/ILSVRC2012_val_00041139.JPEG n02980441/
+mv val/ILSVRC2012_val_00041140.JPEG n07875152/
+mv val/ILSVRC2012_val_00041141.JPEG n01873310/
+mv val/ILSVRC2012_val_00041142.JPEG n02117135/
+mv val/ILSVRC2012_val_00041143.JPEG n02510455/
+mv val/ILSVRC2012_val_00041144.JPEG n02108422/
+mv val/ILSVRC2012_val_00041145.JPEG n04599235/
+mv val/ILSVRC2012_val_00041146.JPEG n03450230/
+mv val/ILSVRC2012_val_00041147.JPEG n02105505/
+mv val/ILSVRC2012_val_00041148.JPEG n04239074/
+mv val/ILSVRC2012_val_00041149.JPEG n04131690/
+mv val/ILSVRC2012_val_00041150.JPEG n04033995/
+mv val/ILSVRC2012_val_00041151.JPEG n03445924/
+mv val/ILSVRC2012_val_00041152.JPEG n01558993/
+mv val/ILSVRC2012_val_00041153.JPEG n02791270/
+mv val/ILSVRC2012_val_00041154.JPEG n03770679/
+mv val/ILSVRC2012_val_00041155.JPEG n02480855/
+mv val/ILSVRC2012_val_00041156.JPEG n02134084/
+mv val/ILSVRC2012_val_00041157.JPEG n02098286/
+mv val/ILSVRC2012_val_00041158.JPEG n03478589/
+mv val/ILSVRC2012_val_00041159.JPEG n01744401/
+mv val/ILSVRC2012_val_00041160.JPEG n04532670/
+mv val/ILSVRC2012_val_00041161.JPEG n02105412/
+mv val/ILSVRC2012_val_00041162.JPEG n03874599/
+mv val/ILSVRC2012_val_00041163.JPEG n04125021/
+mv val/ILSVRC2012_val_00041164.JPEG n01682714/
+mv val/ILSVRC2012_val_00041165.JPEG n02747177/
+mv val/ILSVRC2012_val_00041166.JPEG n02992211/
+mv val/ILSVRC2012_val_00041167.JPEG n03710193/
+mv val/ILSVRC2012_val_00041168.JPEG n01514859/
+mv val/ILSVRC2012_val_00041169.JPEG n01687978/
+mv val/ILSVRC2012_val_00041170.JPEG n04418357/
+mv val/ILSVRC2012_val_00041171.JPEG n02017213/
+mv val/ILSVRC2012_val_00041172.JPEG n01677366/
+mv val/ILSVRC2012_val_00041173.JPEG n02281406/
+mv val/ILSVRC2012_val_00041174.JPEG n02138441/
+mv val/ILSVRC2012_val_00041175.JPEG n03594945/
+mv val/ILSVRC2012_val_00041176.JPEG n02106030/
+mv val/ILSVRC2012_val_00041177.JPEG n03017168/
+mv val/ILSVRC2012_val_00041178.JPEG n02105251/
+mv val/ILSVRC2012_val_00041179.JPEG n04273569/
+mv val/ILSVRC2012_val_00041180.JPEG n02488291/
+mv val/ILSVRC2012_val_00041181.JPEG n09332890/
+mv val/ILSVRC2012_val_00041182.JPEG n03873416/
+mv val/ILSVRC2012_val_00041183.JPEG n02895154/
+mv val/ILSVRC2012_val_00041184.JPEG n02494079/
+mv val/ILSVRC2012_val_00041185.JPEG n02437616/
+mv val/ILSVRC2012_val_00041186.JPEG n01692333/
+mv val/ILSVRC2012_val_00041187.JPEG n04311004/
+mv val/ILSVRC2012_val_00041188.JPEG n03218198/
+mv val/ILSVRC2012_val_00041189.JPEG n02110185/
+mv val/ILSVRC2012_val_00041190.JPEG n02256656/
+mv val/ILSVRC2012_val_00041191.JPEG n07880968/
+mv val/ILSVRC2012_val_00041192.JPEG n02666196/
+mv val/ILSVRC2012_val_00041193.JPEG n03337140/
+mv val/ILSVRC2012_val_00041194.JPEG n04399382/
+mv val/ILSVRC2012_val_00041195.JPEG n04265275/
+mv val/ILSVRC2012_val_00041196.JPEG n04254120/
+mv val/ILSVRC2012_val_00041197.JPEG n01798484/
+mv val/ILSVRC2012_val_00041198.JPEG n03602883/
+mv val/ILSVRC2012_val_00041199.JPEG n03825788/
+mv val/ILSVRC2012_val_00041200.JPEG n01833805/
+mv val/ILSVRC2012_val_00041201.JPEG n02704792/
+mv val/ILSVRC2012_val_00041202.JPEG n01734418/
+mv val/ILSVRC2012_val_00041203.JPEG n03594734/
+mv val/ILSVRC2012_val_00041204.JPEG n02701002/
+mv val/ILSVRC2012_val_00041205.JPEG n02085620/
+mv val/ILSVRC2012_val_00041206.JPEG n01582220/
+mv val/ILSVRC2012_val_00041207.JPEG n03623198/
+mv val/ILSVRC2012_val_00041208.JPEG n03000134/
+mv val/ILSVRC2012_val_00041209.JPEG n02992211/
+mv val/ILSVRC2012_val_00041210.JPEG n03691459/
+mv val/ILSVRC2012_val_00041211.JPEG n02526121/
+mv val/ILSVRC2012_val_00041212.JPEG n03998194/
+mv val/ILSVRC2012_val_00041213.JPEG n01990800/
+mv val/ILSVRC2012_val_00041214.JPEG n03933933/
+mv val/ILSVRC2012_val_00041215.JPEG n02950826/
+mv val/ILSVRC2012_val_00041216.JPEG n01748264/
+mv val/ILSVRC2012_val_00041217.JPEG n15075141/
+mv val/ILSVRC2012_val_00041218.JPEG n10565667/
+mv val/ILSVRC2012_val_00041219.JPEG n15075141/
+mv val/ILSVRC2012_val_00041220.JPEG n02116738/
+mv val/ILSVRC2012_val_00041221.JPEG n02643566/
+mv val/ILSVRC2012_val_00041222.JPEG n02837789/
+mv val/ILSVRC2012_val_00041223.JPEG n04005630/
+mv val/ILSVRC2012_val_00041224.JPEG n02091134/
+mv val/ILSVRC2012_val_00041225.JPEG n02071294/
+mv val/ILSVRC2012_val_00041226.JPEG n10148035/
+mv val/ILSVRC2012_val_00041227.JPEG n02951358/
+mv val/ILSVRC2012_val_00041228.JPEG n04127249/
+mv val/ILSVRC2012_val_00041229.JPEG n03866082/
+mv val/ILSVRC2012_val_00041230.JPEG n04579145/
+mv val/ILSVRC2012_val_00041231.JPEG n04239074/
+mv val/ILSVRC2012_val_00041232.JPEG n02492035/
+mv val/ILSVRC2012_val_00041233.JPEG n02107683/
+mv val/ILSVRC2012_val_00041234.JPEG n04239074/
+mv val/ILSVRC2012_val_00041235.JPEG n04004767/
+mv val/ILSVRC2012_val_00041236.JPEG n04550184/
+mv val/ILSVRC2012_val_00041237.JPEG n03961711/
+mv val/ILSVRC2012_val_00041238.JPEG n03201208/
+mv val/ILSVRC2012_val_00041239.JPEG n03207941/
+mv val/ILSVRC2012_val_00041240.JPEG n03134739/
+mv val/ILSVRC2012_val_00041241.JPEG n02892767/
+mv val/ILSVRC2012_val_00041242.JPEG n03394916/
+mv val/ILSVRC2012_val_00041243.JPEG n02398521/
+mv val/ILSVRC2012_val_00041244.JPEG n03868863/
+mv val/ILSVRC2012_val_00041245.JPEG n02486410/
+mv val/ILSVRC2012_val_00041246.JPEG n04487394/
+mv val/ILSVRC2012_val_00041247.JPEG n03394916/
+mv val/ILSVRC2012_val_00041248.JPEG n01496331/
+mv val/ILSVRC2012_val_00041249.JPEG n04418357/
+mv val/ILSVRC2012_val_00041250.JPEG n02168699/
+mv val/ILSVRC2012_val_00041251.JPEG n02097209/
+mv val/ILSVRC2012_val_00041252.JPEG n01537544/
+mv val/ILSVRC2012_val_00041253.JPEG n01687978/
+mv val/ILSVRC2012_val_00041254.JPEG n02799071/
+mv val/ILSVRC2012_val_00041255.JPEG n04009552/
+mv val/ILSVRC2012_val_00041256.JPEG n03345487/
+mv val/ILSVRC2012_val_00041257.JPEG n04346328/
+mv val/ILSVRC2012_val_00041258.JPEG n12057211/
+mv val/ILSVRC2012_val_00041259.JPEG n03485794/
+mv val/ILSVRC2012_val_00041260.JPEG n02443484/
+mv val/ILSVRC2012_val_00041261.JPEG n02229544/
+mv val/ILSVRC2012_val_00041262.JPEG n02840245/
+mv val/ILSVRC2012_val_00041263.JPEG n02415577/
+mv val/ILSVRC2012_val_00041264.JPEG n02104029/
+mv val/ILSVRC2012_val_00041265.JPEG n03792782/
+mv val/ILSVRC2012_val_00041266.JPEG n03888605/
+mv val/ILSVRC2012_val_00041267.JPEG n02128925/
+mv val/ILSVRC2012_val_00041268.JPEG n03045698/
+mv val/ILSVRC2012_val_00041269.JPEG n03837869/
+mv val/ILSVRC2012_val_00041270.JPEG n02749479/
+mv val/ILSVRC2012_val_00041271.JPEG n04033995/
+mv val/ILSVRC2012_val_00041272.JPEG n02422106/
+mv val/ILSVRC2012_val_00041273.JPEG n03404251/
+mv val/ILSVRC2012_val_00041274.JPEG n04208210/
+mv val/ILSVRC2012_val_00041275.JPEG n02113712/
+mv val/ILSVRC2012_val_00041276.JPEG n03459775/
+mv val/ILSVRC2012_val_00041277.JPEG n02514041/
+mv val/ILSVRC2012_val_00041278.JPEG n04371430/
+mv val/ILSVRC2012_val_00041279.JPEG n01644373/
+mv val/ILSVRC2012_val_00041280.JPEG n03447721/
+mv val/ILSVRC2012_val_00041281.JPEG n13052670/
+mv val/ILSVRC2012_val_00041282.JPEG n03492542/
+mv val/ILSVRC2012_val_00041283.JPEG n04366367/
+mv val/ILSVRC2012_val_00041284.JPEG n01968897/
+mv val/ILSVRC2012_val_00041285.JPEG n02033041/
+mv val/ILSVRC2012_val_00041286.JPEG n02114712/
+mv val/ILSVRC2012_val_00041287.JPEG n02804414/
+mv val/ILSVRC2012_val_00041288.JPEG n01796340/
+mv val/ILSVRC2012_val_00041289.JPEG n04009552/
+mv val/ILSVRC2012_val_00041290.JPEG n04597913/
+mv val/ILSVRC2012_val_00041291.JPEG n03141823/
+mv val/ILSVRC2012_val_00041292.JPEG n04612504/
+mv val/ILSVRC2012_val_00041293.JPEG n01729322/
+mv val/ILSVRC2012_val_00041294.JPEG n02492660/
+mv val/ILSVRC2012_val_00041295.JPEG n03792972/
+mv val/ILSVRC2012_val_00041296.JPEG n02130308/
+mv val/ILSVRC2012_val_00041297.JPEG n03400231/
+mv val/ILSVRC2012_val_00041298.JPEG n01632777/
+mv val/ILSVRC2012_val_00041299.JPEG n03085013/
+mv val/ILSVRC2012_val_00041300.JPEG n01729322/
+mv val/ILSVRC2012_val_00041301.JPEG n02095570/
+mv val/ILSVRC2012_val_00041302.JPEG n03970156/
+mv val/ILSVRC2012_val_00041303.JPEG n04009552/
+mv val/ILSVRC2012_val_00041304.JPEG n03950228/
+mv val/ILSVRC2012_val_00041305.JPEG n02086646/
+mv val/ILSVRC2012_val_00041306.JPEG n02108000/
+mv val/ILSVRC2012_val_00041307.JPEG n03196217/
+mv val/ILSVRC2012_val_00041308.JPEG n01580077/
+mv val/ILSVRC2012_val_00041309.JPEG n04275548/
+mv val/ILSVRC2012_val_00041310.JPEG n04599235/
+mv val/ILSVRC2012_val_00041311.JPEG n01774750/
+mv val/ILSVRC2012_val_00041312.JPEG n03498962/
+mv val/ILSVRC2012_val_00041313.JPEG n03457902/
+mv val/ILSVRC2012_val_00041314.JPEG n03930630/
+mv val/ILSVRC2012_val_00041315.JPEG n04590129/
+mv val/ILSVRC2012_val_00041316.JPEG n01968897/
+mv val/ILSVRC2012_val_00041317.JPEG n04462240/
+mv val/ILSVRC2012_val_00041318.JPEG n04554684/
+mv val/ILSVRC2012_val_00041319.JPEG n02840245/
+mv val/ILSVRC2012_val_00041320.JPEG n02804414/
+mv val/ILSVRC2012_val_00041321.JPEG n07614500/
+mv val/ILSVRC2012_val_00041322.JPEG n03482405/
+mv val/ILSVRC2012_val_00041323.JPEG n02871525/
+mv val/ILSVRC2012_val_00041324.JPEG n04192698/
+mv val/ILSVRC2012_val_00041325.JPEG n02699494/
+mv val/ILSVRC2012_val_00041326.JPEG n03388183/
+mv val/ILSVRC2012_val_00041327.JPEG n04153751/
+mv val/ILSVRC2012_val_00041328.JPEG n03733281/
+mv val/ILSVRC2012_val_00041329.JPEG n01797886/
+mv val/ILSVRC2012_val_00041330.JPEG n01689811/
+mv val/ILSVRC2012_val_00041331.JPEG n02777292/
+mv val/ILSVRC2012_val_00041332.JPEG n02389026/
+mv val/ILSVRC2012_val_00041333.JPEG n03788365/
+mv val/ILSVRC2012_val_00041334.JPEG n01514859/
+mv val/ILSVRC2012_val_00041335.JPEG n02102480/
+mv val/ILSVRC2012_val_00041336.JPEG n03942813/
+mv val/ILSVRC2012_val_00041337.JPEG n02111129/
+mv val/ILSVRC2012_val_00041338.JPEG n03017168/
+mv val/ILSVRC2012_val_00041339.JPEG n02105855/
+mv val/ILSVRC2012_val_00041340.JPEG n04328186/
+mv val/ILSVRC2012_val_00041341.JPEG n02115641/
+mv val/ILSVRC2012_val_00041342.JPEG n02093647/
+mv val/ILSVRC2012_val_00041343.JPEG n02415577/
+mv val/ILSVRC2012_val_00041344.JPEG n02536864/
+mv val/ILSVRC2012_val_00041345.JPEG n13044778/
+mv val/ILSVRC2012_val_00041346.JPEG n02113712/
+mv val/ILSVRC2012_val_00041347.JPEG n02123394/
+mv val/ILSVRC2012_val_00041348.JPEG n01735189/
+mv val/ILSVRC2012_val_00041349.JPEG n03085013/
+mv val/ILSVRC2012_val_00041350.JPEG n03127747/
+mv val/ILSVRC2012_val_00041351.JPEG n02105641/
+mv val/ILSVRC2012_val_00041352.JPEG n04606251/
+mv val/ILSVRC2012_val_00041353.JPEG n02814533/
+mv val/ILSVRC2012_val_00041354.JPEG n02980441/
+mv val/ILSVRC2012_val_00041355.JPEG n02910353/
+mv val/ILSVRC2012_val_00041356.JPEG n02098105/
+mv val/ILSVRC2012_val_00041357.JPEG n04380533/
+mv val/ILSVRC2012_val_00041358.JPEG n02098286/
+mv val/ILSVRC2012_val_00041359.JPEG n02018795/
+mv val/ILSVRC2012_val_00041360.JPEG n02788148/
+mv val/ILSVRC2012_val_00041361.JPEG n01807496/
+mv val/ILSVRC2012_val_00041362.JPEG n03908714/
+mv val/ILSVRC2012_val_00041363.JPEG n03388549/
+mv val/ILSVRC2012_val_00041364.JPEG n02100877/
+mv val/ILSVRC2012_val_00041365.JPEG n03982430/
+mv val/ILSVRC2012_val_00041366.JPEG n01986214/
+mv val/ILSVRC2012_val_00041367.JPEG n04201297/
+mv val/ILSVRC2012_val_00041368.JPEG n03347037/
+mv val/ILSVRC2012_val_00041369.JPEG n04008634/
+mv val/ILSVRC2012_val_00041370.JPEG n04557648/
+mv val/ILSVRC2012_val_00041371.JPEG n03445924/
+mv val/ILSVRC2012_val_00041372.JPEG n02980441/
+mv val/ILSVRC2012_val_00041373.JPEG n03131574/
+mv val/ILSVRC2012_val_00041374.JPEG n02948072/
+mv val/ILSVRC2012_val_00041375.JPEG n01797886/
+mv val/ILSVRC2012_val_00041376.JPEG n04005630/
+mv val/ILSVRC2012_val_00041377.JPEG n02111889/
+mv val/ILSVRC2012_val_00041378.JPEG n02325366/
+mv val/ILSVRC2012_val_00041379.JPEG n01728920/
+mv val/ILSVRC2012_val_00041380.JPEG n02129165/
+mv val/ILSVRC2012_val_00041381.JPEG n02168699/
+mv val/ILSVRC2012_val_00041382.JPEG n04465501/
+mv val/ILSVRC2012_val_00041383.JPEG n01728572/
+mv val/ILSVRC2012_val_00041384.JPEG n02105641/
+mv val/ILSVRC2012_val_00041385.JPEG n01774384/
+mv val/ILSVRC2012_val_00041386.JPEG n04418357/
+mv val/ILSVRC2012_val_00041387.JPEG n02325366/
+mv val/ILSVRC2012_val_00041388.JPEG n03888605/
+mv val/ILSVRC2012_val_00041389.JPEG n04149813/
+mv val/ILSVRC2012_val_00041390.JPEG n02281406/
+mv val/ILSVRC2012_val_00041391.JPEG n03599486/
+mv val/ILSVRC2012_val_00041392.JPEG n03124170/
+mv val/ILSVRC2012_val_00041393.JPEG n02100583/
+mv val/ILSVRC2012_val_00041394.JPEG n03956157/
+mv val/ILSVRC2012_val_00041395.JPEG n03788195/
+mv val/ILSVRC2012_val_00041396.JPEG n04286575/
+mv val/ILSVRC2012_val_00041397.JPEG n04136333/
+mv val/ILSVRC2012_val_00041398.JPEG n04344873/
+mv val/ILSVRC2012_val_00041399.JPEG n03743016/
+mv val/ILSVRC2012_val_00041400.JPEG n01494475/
+mv val/ILSVRC2012_val_00041401.JPEG n01910747/
+mv val/ILSVRC2012_val_00041402.JPEG n02787622/
+mv val/ILSVRC2012_val_00041403.JPEG n04562935/
+mv val/ILSVRC2012_val_00041404.JPEG n02909870/
+mv val/ILSVRC2012_val_00041405.JPEG n02974003/
+mv val/ILSVRC2012_val_00041406.JPEG n02111500/
+mv val/ILSVRC2012_val_00041407.JPEG n03388549/
+mv val/ILSVRC2012_val_00041408.JPEG n04550184/
+mv val/ILSVRC2012_val_00041409.JPEG n07745940/
+mv val/ILSVRC2012_val_00041410.JPEG n03673027/
+mv val/ILSVRC2012_val_00041411.JPEG n02727426/
+mv val/ILSVRC2012_val_00041412.JPEG n03207743/
+mv val/ILSVRC2012_val_00041413.JPEG n04487081/
+mv val/ILSVRC2012_val_00041414.JPEG n04009552/
+mv val/ILSVRC2012_val_00041415.JPEG n02130308/
+mv val/ILSVRC2012_val_00041416.JPEG n02105412/
+mv val/ILSVRC2012_val_00041417.JPEG n03476991/
+mv val/ILSVRC2012_val_00041418.JPEG n01632458/
+mv val/ILSVRC2012_val_00041419.JPEG n02790996/
+mv val/ILSVRC2012_val_00041420.JPEG n04505470/
+mv val/ILSVRC2012_val_00041421.JPEG n04380533/
+mv val/ILSVRC2012_val_00041422.JPEG n02108422/
+mv val/ILSVRC2012_val_00041423.JPEG n07920052/
+mv val/ILSVRC2012_val_00041424.JPEG n03467068/
+mv val/ILSVRC2012_val_00041425.JPEG n03249569/
+mv val/ILSVRC2012_val_00041426.JPEG n03633091/
+mv val/ILSVRC2012_val_00041427.JPEG n02124075/
+mv val/ILSVRC2012_val_00041428.JPEG n03763968/
+mv val/ILSVRC2012_val_00041429.JPEG n03710637/
+mv val/ILSVRC2012_val_00041430.JPEG n03100240/
+mv val/ILSVRC2012_val_00041431.JPEG n02256656/
+mv val/ILSVRC2012_val_00041432.JPEG n03461385/
+mv val/ILSVRC2012_val_00041433.JPEG n02869837/
+mv val/ILSVRC2012_val_00041434.JPEG n02948072/
+mv val/ILSVRC2012_val_00041435.JPEG n03991062/
+mv val/ILSVRC2012_val_00041436.JPEG n02091244/
+mv val/ILSVRC2012_val_00041437.JPEG n04476259/
+mv val/ILSVRC2012_val_00041438.JPEG n02099429/
+mv val/ILSVRC2012_val_00041439.JPEG n02346627/
+mv val/ILSVRC2012_val_00041440.JPEG n02782093/
+mv val/ILSVRC2012_val_00041441.JPEG n02457408/
+mv val/ILSVRC2012_val_00041442.JPEG n02009229/
+mv val/ILSVRC2012_val_00041443.JPEG n02910353/
+mv val/ILSVRC2012_val_00041444.JPEG n02087046/
+mv val/ILSVRC2012_val_00041445.JPEG n01877812/
+mv val/ILSVRC2012_val_00041446.JPEG n03787032/
+mv val/ILSVRC2012_val_00041447.JPEG n02281406/
+mv val/ILSVRC2012_val_00041448.JPEG n04461696/
+mv val/ILSVRC2012_val_00041449.JPEG n03782006/
+mv val/ILSVRC2012_val_00041450.JPEG n01924916/
+mv val/ILSVRC2012_val_00041451.JPEG n03223299/
+mv val/ILSVRC2012_val_00041452.JPEG n01768244/
+mv val/ILSVRC2012_val_00041453.JPEG n04023962/
+mv val/ILSVRC2012_val_00041454.JPEG n07717410/
+mv val/ILSVRC2012_val_00041455.JPEG n03062245/
+mv val/ILSVRC2012_val_00041456.JPEG n07875152/
+mv val/ILSVRC2012_val_00041457.JPEG n03393912/
+mv val/ILSVRC2012_val_00041458.JPEG n02364673/
+mv val/ILSVRC2012_val_00041459.JPEG n03937543/
+mv val/ILSVRC2012_val_00041460.JPEG n02101388/
+mv val/ILSVRC2012_val_00041461.JPEG n04548280/
+mv val/ILSVRC2012_val_00041462.JPEG n12620546/
+mv val/ILSVRC2012_val_00041463.JPEG n03584829/
+mv val/ILSVRC2012_val_00041464.JPEG n04606251/
+mv val/ILSVRC2012_val_00041465.JPEG n02776631/
+mv val/ILSVRC2012_val_00041466.JPEG n04443257/
+mv val/ILSVRC2012_val_00041467.JPEG n02788148/
+mv val/ILSVRC2012_val_00041468.JPEG n03838899/
+mv val/ILSVRC2012_val_00041469.JPEG n02051845/
+mv val/ILSVRC2012_val_00041470.JPEG n07768694/
+mv val/ILSVRC2012_val_00041471.JPEG n03498962/
+mv val/ILSVRC2012_val_00041472.JPEG n02100583/
+mv val/ILSVRC2012_val_00041473.JPEG n02102177/
+mv val/ILSVRC2012_val_00041474.JPEG n07716358/
+mv val/ILSVRC2012_val_00041475.JPEG n04589890/
+mv val/ILSVRC2012_val_00041476.JPEG n02128757/
+mv val/ILSVRC2012_val_00041477.JPEG n02489166/
+mv val/ILSVRC2012_val_00041478.JPEG n03417042/
+mv val/ILSVRC2012_val_00041479.JPEG n03355925/
+mv val/ILSVRC2012_val_00041480.JPEG n02111889/
+mv val/ILSVRC2012_val_00041481.JPEG n03297495/
+mv val/ILSVRC2012_val_00041482.JPEG n03180011/
+mv val/ILSVRC2012_val_00041483.JPEG n03196217/
+mv val/ILSVRC2012_val_00041484.JPEG n02859443/
+mv val/ILSVRC2012_val_00041485.JPEG n02321529/
+mv val/ILSVRC2012_val_00041486.JPEG n04443257/
+mv val/ILSVRC2012_val_00041487.JPEG n03089624/
+mv val/ILSVRC2012_val_00041488.JPEG n07730033/
+mv val/ILSVRC2012_val_00041489.JPEG n03874293/
+mv val/ILSVRC2012_val_00041490.JPEG n03594945/
+mv val/ILSVRC2012_val_00041491.JPEG n02423022/
+mv val/ILSVRC2012_val_00041492.JPEG n11879895/
+mv val/ILSVRC2012_val_00041493.JPEG n02104029/
+mv val/ILSVRC2012_val_00041494.JPEG n02916936/
+mv val/ILSVRC2012_val_00041495.JPEG n02403003/
+mv val/ILSVRC2012_val_00041496.JPEG n03709823/
+mv val/ILSVRC2012_val_00041497.JPEG n04467665/
+mv val/ILSVRC2012_val_00041498.JPEG n01833805/
+mv val/ILSVRC2012_val_00041499.JPEG n02119022/
+mv val/ILSVRC2012_val_00041500.JPEG n02687172/
+mv val/ILSVRC2012_val_00041501.JPEG n02492660/
+mv val/ILSVRC2012_val_00041502.JPEG n02877765/
+mv val/ILSVRC2012_val_00041503.JPEG n02099429/
+mv val/ILSVRC2012_val_00041504.JPEG n03942813/
+mv val/ILSVRC2012_val_00041505.JPEG n02105855/
+mv val/ILSVRC2012_val_00041506.JPEG n02168699/
+mv val/ILSVRC2012_val_00041507.JPEG n07565083/
+mv val/ILSVRC2012_val_00041508.JPEG n03895866/
+mv val/ILSVRC2012_val_00041509.JPEG n03126707/
+mv val/ILSVRC2012_val_00041510.JPEG n02346627/
+mv val/ILSVRC2012_val_00041511.JPEG n02606052/
+mv val/ILSVRC2012_val_00041512.JPEG n03670208/
+mv val/ILSVRC2012_val_00041513.JPEG n02114548/
+mv val/ILSVRC2012_val_00041514.JPEG n02109047/
+mv val/ILSVRC2012_val_00041515.JPEG n03916031/
+mv val/ILSVRC2012_val_00041516.JPEG n01871265/
+mv val/ILSVRC2012_val_00041517.JPEG n04523525/
+mv val/ILSVRC2012_val_00041518.JPEG n02690373/
+mv val/ILSVRC2012_val_00041519.JPEG n03014705/
+mv val/ILSVRC2012_val_00041520.JPEG n02356798/
+mv val/ILSVRC2012_val_00041521.JPEG n02128385/
+mv val/ILSVRC2012_val_00041522.JPEG n02133161/
+mv val/ILSVRC2012_val_00041523.JPEG n03884397/
+mv val/ILSVRC2012_val_00041524.JPEG n02108915/
+mv val/ILSVRC2012_val_00041525.JPEG n03759954/
+mv val/ILSVRC2012_val_00041526.JPEG n03630383/
+mv val/ILSVRC2012_val_00041527.JPEG n02106382/
+mv val/ILSVRC2012_val_00041528.JPEG n02256656/
+mv val/ILSVRC2012_val_00041529.JPEG n02085936/
+mv val/ILSVRC2012_val_00041530.JPEG n03197337/
+mv val/ILSVRC2012_val_00041531.JPEG n03661043/
+mv val/ILSVRC2012_val_00041532.JPEG n04590129/
+mv val/ILSVRC2012_val_00041533.JPEG n03958227/
+mv val/ILSVRC2012_val_00041534.JPEG n04525038/
+mv val/ILSVRC2012_val_00041535.JPEG n02037110/
+mv val/ILSVRC2012_val_00041536.JPEG n03956157/
+mv val/ILSVRC2012_val_00041537.JPEG n03717622/
+mv val/ILSVRC2012_val_00041538.JPEG n02326432/
+mv val/ILSVRC2012_val_00041539.JPEG n03249569/
+mv val/ILSVRC2012_val_00041540.JPEG n01631663/
+mv val/ILSVRC2012_val_00041541.JPEG n01687978/
+mv val/ILSVRC2012_val_00041542.JPEG n12144580/
+mv val/ILSVRC2012_val_00041543.JPEG n02277742/
+mv val/ILSVRC2012_val_00041544.JPEG n03692522/
+mv val/ILSVRC2012_val_00041545.JPEG n04507155/
+mv val/ILSVRC2012_val_00041546.JPEG n04389033/
+mv val/ILSVRC2012_val_00041547.JPEG n04548280/
+mv val/ILSVRC2012_val_00041548.JPEG n01914609/
+mv val/ILSVRC2012_val_00041549.JPEG n01776313/
+mv val/ILSVRC2012_val_00041550.JPEG n03125729/
+mv val/ILSVRC2012_val_00041551.JPEG n02096051/
+mv val/ILSVRC2012_val_00041552.JPEG n02769748/
+mv val/ILSVRC2012_val_00041553.JPEG n04131690/
+mv val/ILSVRC2012_val_00041554.JPEG n02669723/
+mv val/ILSVRC2012_val_00041555.JPEG n04376876/
+mv val/ILSVRC2012_val_00041556.JPEG n01818515/
+mv val/ILSVRC2012_val_00041557.JPEG n02091244/
+mv val/ILSVRC2012_val_00041558.JPEG n03207743/
+mv val/ILSVRC2012_val_00041559.JPEG n03134739/
+mv val/ILSVRC2012_val_00041560.JPEG n03838899/
+mv val/ILSVRC2012_val_00041561.JPEG n02641379/
+mv val/ILSVRC2012_val_00041562.JPEG n02666196/
+mv val/ILSVRC2012_val_00041563.JPEG n02397096/
+mv val/ILSVRC2012_val_00041564.JPEG n02009229/
+mv val/ILSVRC2012_val_00041565.JPEG n02410509/
+mv val/ILSVRC2012_val_00041566.JPEG n02276258/
+mv val/ILSVRC2012_val_00041567.JPEG n03062245/
+mv val/ILSVRC2012_val_00041568.JPEG n02097130/
+mv val/ILSVRC2012_val_00041569.JPEG n02093754/
+mv val/ILSVRC2012_val_00041570.JPEG n02123045/
+mv val/ILSVRC2012_val_00041571.JPEG n04357314/
+mv val/ILSVRC2012_val_00041572.JPEG n03089624/
+mv val/ILSVRC2012_val_00041573.JPEG n02091244/
+mv val/ILSVRC2012_val_00041574.JPEG n01685808/
+mv val/ILSVRC2012_val_00041575.JPEG n02412080/
+mv val/ILSVRC2012_val_00041576.JPEG n03841143/
+mv val/ILSVRC2012_val_00041577.JPEG n01807496/
+mv val/ILSVRC2012_val_00041578.JPEG n02098286/
+mv val/ILSVRC2012_val_00041579.JPEG n02124075/
+mv val/ILSVRC2012_val_00041580.JPEG n02086646/
+mv val/ILSVRC2012_val_00041581.JPEG n03627232/
+mv val/ILSVRC2012_val_00041582.JPEG n09468604/
+mv val/ILSVRC2012_val_00041583.JPEG n01768244/
+mv val/ILSVRC2012_val_00041584.JPEG n07920052/
+mv val/ILSVRC2012_val_00041585.JPEG n03976467/
+mv val/ILSVRC2012_val_00041586.JPEG n03534580/
+mv val/ILSVRC2012_val_00041587.JPEG n03617480/
+mv val/ILSVRC2012_val_00041588.JPEG n04467665/
+mv val/ILSVRC2012_val_00041589.JPEG n07584110/
+mv val/ILSVRC2012_val_00041590.JPEG n04040759/
+mv val/ILSVRC2012_val_00041591.JPEG n02090379/
+mv val/ILSVRC2012_val_00041592.JPEG n03393912/
+mv val/ILSVRC2012_val_00041593.JPEG n01945685/
+mv val/ILSVRC2012_val_00041594.JPEG n04482393/
+mv val/ILSVRC2012_val_00041595.JPEG n01537544/
+mv val/ILSVRC2012_val_00041596.JPEG n02231487/
+mv val/ILSVRC2012_val_00041597.JPEG n02137549/
+mv val/ILSVRC2012_val_00041598.JPEG n03045698/
+mv val/ILSVRC2012_val_00041599.JPEG n04346328/
+mv val/ILSVRC2012_val_00041600.JPEG n04597913/
+mv val/ILSVRC2012_val_00041601.JPEG n02114367/
+mv val/ILSVRC2012_val_00041602.JPEG n07613480/
+mv val/ILSVRC2012_val_00041603.JPEG n02892767/
+mv val/ILSVRC2012_val_00041604.JPEG n04209133/
+mv val/ILSVRC2012_val_00041605.JPEG n02097047/
+mv val/ILSVRC2012_val_00041606.JPEG n02100877/
+mv val/ILSVRC2012_val_00041607.JPEG n02480855/
+mv val/ILSVRC2012_val_00041608.JPEG n03259280/
+mv val/ILSVRC2012_val_00041609.JPEG n03272010/
+mv val/ILSVRC2012_val_00041610.JPEG n07684084/
+mv val/ILSVRC2012_val_00041611.JPEG n03743016/
+mv val/ILSVRC2012_val_00041612.JPEG n01773549/
+mv val/ILSVRC2012_val_00041613.JPEG n02708093/
+mv val/ILSVRC2012_val_00041614.JPEG n02939185/
+mv val/ILSVRC2012_val_00041615.JPEG n03617480/
+mv val/ILSVRC2012_val_00041616.JPEG n01753488/
+mv val/ILSVRC2012_val_00041617.JPEG n07880968/
+mv val/ILSVRC2012_val_00041618.JPEG n03218198/
+mv val/ILSVRC2012_val_00041619.JPEG n02871525/
+mv val/ILSVRC2012_val_00041620.JPEG n02093256/
+mv val/ILSVRC2012_val_00041621.JPEG n01798484/
+mv val/ILSVRC2012_val_00041622.JPEG n02417914/
+mv val/ILSVRC2012_val_00041623.JPEG n02108915/
+mv val/ILSVRC2012_val_00041624.JPEG n04125021/
+mv val/ILSVRC2012_val_00041625.JPEG n03126707/
+mv val/ILSVRC2012_val_00041626.JPEG n04285008/
+mv val/ILSVRC2012_val_00041627.JPEG n02526121/
+mv val/ILSVRC2012_val_00041628.JPEG n04111531/
+mv val/ILSVRC2012_val_00041629.JPEG n02089078/
+mv val/ILSVRC2012_val_00041630.JPEG n02927161/
+mv val/ILSVRC2012_val_00041631.JPEG n02971356/
+mv val/ILSVRC2012_val_00041632.JPEG n04553703/
+mv val/ILSVRC2012_val_00041633.JPEG n02442845/
+mv val/ILSVRC2012_val_00041634.JPEG n01945685/
+mv val/ILSVRC2012_val_00041635.JPEG n01491361/
+mv val/ILSVRC2012_val_00041636.JPEG n04347754/
+mv val/ILSVRC2012_val_00041637.JPEG n04371774/
+mv val/ILSVRC2012_val_00041638.JPEG n09428293/
+mv val/ILSVRC2012_val_00041639.JPEG n04370456/
+mv val/ILSVRC2012_val_00041640.JPEG n01682714/
+mv val/ILSVRC2012_val_00041641.JPEG n01664065/
+mv val/ILSVRC2012_val_00041642.JPEG n02085620/
+mv val/ILSVRC2012_val_00041643.JPEG n02114855/
+mv val/ILSVRC2012_val_00041644.JPEG n03255030/
+mv val/ILSVRC2012_val_00041645.JPEG n02130308/
+mv val/ILSVRC2012_val_00041646.JPEG n04200800/
+mv val/ILSVRC2012_val_00041647.JPEG n02447366/
+mv val/ILSVRC2012_val_00041648.JPEG n04127249/
+mv val/ILSVRC2012_val_00041649.JPEG n02110185/
+mv val/ILSVRC2012_val_00041650.JPEG n02793495/
+mv val/ILSVRC2012_val_00041651.JPEG n03944341/
+mv val/ILSVRC2012_val_00041652.JPEG n03196217/
+mv val/ILSVRC2012_val_00041653.JPEG n02096294/
+mv val/ILSVRC2012_val_00041654.JPEG n04133789/
+mv val/ILSVRC2012_val_00041655.JPEG n07754684/
+mv val/ILSVRC2012_val_00041656.JPEG n03384352/
+mv val/ILSVRC2012_val_00041657.JPEG n03459775/
+mv val/ILSVRC2012_val_00041658.JPEG n04579145/
+mv val/ILSVRC2012_val_00041659.JPEG n01682714/
+mv val/ILSVRC2012_val_00041660.JPEG n03041632/
+mv val/ILSVRC2012_val_00041661.JPEG n07860988/
+mv val/ILSVRC2012_val_00041662.JPEG n06596364/
+mv val/ILSVRC2012_val_00041663.JPEG n04296562/
+mv val/ILSVRC2012_val_00041664.JPEG n04152593/
+mv val/ILSVRC2012_val_00041665.JPEG n01698640/
+mv val/ILSVRC2012_val_00041666.JPEG n03792972/
+mv val/ILSVRC2012_val_00041667.JPEG n04067472/
+mv val/ILSVRC2012_val_00041668.JPEG n03394916/
+mv val/ILSVRC2012_val_00041669.JPEG n01728920/
+mv val/ILSVRC2012_val_00041670.JPEG n04597913/
+mv val/ILSVRC2012_val_00041671.JPEG n04090263/
+mv val/ILSVRC2012_val_00041672.JPEG n03445777/
+mv val/ILSVRC2012_val_00041673.JPEG n13040303/
+mv val/ILSVRC2012_val_00041674.JPEG n07717556/
+mv val/ILSVRC2012_val_00041675.JPEG n01914609/
+mv val/ILSVRC2012_val_00041676.JPEG n07730033/
+mv val/ILSVRC2012_val_00041677.JPEG n02108089/
+mv val/ILSVRC2012_val_00041678.JPEG n04597913/
+mv val/ILSVRC2012_val_00041679.JPEG n02786058/
+mv val/ILSVRC2012_val_00041680.JPEG n06785654/
+mv val/ILSVRC2012_val_00041681.JPEG n03956157/
+mv val/ILSVRC2012_val_00041682.JPEG n04584207/
+mv val/ILSVRC2012_val_00041683.JPEG n03697007/
+mv val/ILSVRC2012_val_00041684.JPEG n02114712/
+mv val/ILSVRC2012_val_00041685.JPEG n02749479/
+mv val/ILSVRC2012_val_00041686.JPEG n07248320/
+mv val/ILSVRC2012_val_00041687.JPEG n03673027/
+mv val/ILSVRC2012_val_00041688.JPEG n02090379/
+mv val/ILSVRC2012_val_00041689.JPEG n04501370/
+mv val/ILSVRC2012_val_00041690.JPEG n01917289/
+mv val/ILSVRC2012_val_00041691.JPEG n04265275/
+mv val/ILSVRC2012_val_00041692.JPEG n04515003/
+mv val/ILSVRC2012_val_00041693.JPEG n03710721/
+mv val/ILSVRC2012_val_00041694.JPEG n03495258/
+mv val/ILSVRC2012_val_00041695.JPEG n04532670/
+mv val/ILSVRC2012_val_00041696.JPEG n04040759/
+mv val/ILSVRC2012_val_00041697.JPEG n01829413/
+mv val/ILSVRC2012_val_00041698.JPEG n02840245/
+mv val/ILSVRC2012_val_00041699.JPEG n02699494/
+mv val/ILSVRC2012_val_00041700.JPEG n02106550/
+mv val/ILSVRC2012_val_00041701.JPEG n03089624/
+mv val/ILSVRC2012_val_00041702.JPEG n02105056/
+mv val/ILSVRC2012_val_00041703.JPEG n02860847/
+mv val/ILSVRC2012_val_00041704.JPEG n02487347/
+mv val/ILSVRC2012_val_00041705.JPEG n02085782/
+mv val/ILSVRC2012_val_00041706.JPEG n03888257/
+mv val/ILSVRC2012_val_00041707.JPEG n03691459/
+mv val/ILSVRC2012_val_00041708.JPEG n02398521/
+mv val/ILSVRC2012_val_00041709.JPEG n04398044/
+mv val/ILSVRC2012_val_00041710.JPEG n01687978/
+mv val/ILSVRC2012_val_00041711.JPEG n04371774/
+mv val/ILSVRC2012_val_00041712.JPEG n02777292/
+mv val/ILSVRC2012_val_00041713.JPEG n01664065/
+mv val/ILSVRC2012_val_00041714.JPEG n04476259/
+mv val/ILSVRC2012_val_00041715.JPEG n04548280/
+mv val/ILSVRC2012_val_00041716.JPEG n12144580/
+mv val/ILSVRC2012_val_00041717.JPEG n02669723/
+mv val/ILSVRC2012_val_00041718.JPEG n02095314/
+mv val/ILSVRC2012_val_00041719.JPEG n02877765/
+mv val/ILSVRC2012_val_00041720.JPEG n04429376/
+mv val/ILSVRC2012_val_00041721.JPEG n03400231/
+mv val/ILSVRC2012_val_00041722.JPEG n03729826/
+mv val/ILSVRC2012_val_00041723.JPEG n02825657/
+mv val/ILSVRC2012_val_00041724.JPEG n02802426/
+mv val/ILSVRC2012_val_00041725.JPEG n03733281/
+mv val/ILSVRC2012_val_00041726.JPEG n03124043/
+mv val/ILSVRC2012_val_00041727.JPEG n07871810/
+mv val/ILSVRC2012_val_00041728.JPEG n02169497/
+mv val/ILSVRC2012_val_00041729.JPEG n04263257/
+mv val/ILSVRC2012_val_00041730.JPEG n01689811/
+mv val/ILSVRC2012_val_00041731.JPEG n04485082/
+mv val/ILSVRC2012_val_00041732.JPEG n04099969/
+mv val/ILSVRC2012_val_00041733.JPEG n03902125/
+mv val/ILSVRC2012_val_00041734.JPEG n04371430/
+mv val/ILSVRC2012_val_00041735.JPEG n02091635/
+mv val/ILSVRC2012_val_00041736.JPEG n03344393/
+mv val/ILSVRC2012_val_00041737.JPEG n02815834/
+mv val/ILSVRC2012_val_00041738.JPEG n13044778/
+mv val/ILSVRC2012_val_00041739.JPEG n02100877/
+mv val/ILSVRC2012_val_00041740.JPEG n02130308/
+mv val/ILSVRC2012_val_00041741.JPEG n09246464/
+mv val/ILSVRC2012_val_00041742.JPEG n02843684/
+mv val/ILSVRC2012_val_00041743.JPEG n01735189/
+mv val/ILSVRC2012_val_00041744.JPEG n06874185/
+mv val/ILSVRC2012_val_00041745.JPEG n02100583/
+mv val/ILSVRC2012_val_00041746.JPEG n02100877/
+mv val/ILSVRC2012_val_00041747.JPEG n15075141/
+mv val/ILSVRC2012_val_00041748.JPEG n02109525/
+mv val/ILSVRC2012_val_00041749.JPEG n02486410/
+mv val/ILSVRC2012_val_00041750.JPEG n02950826/
+mv val/ILSVRC2012_val_00041751.JPEG n01871265/
+mv val/ILSVRC2012_val_00041752.JPEG n02823750/
+mv val/ILSVRC2012_val_00041753.JPEG n07583066/
+mv val/ILSVRC2012_val_00041754.JPEG n02051845/
+mv val/ILSVRC2012_val_00041755.JPEG n01751748/
+mv val/ILSVRC2012_val_00041756.JPEG n02483362/
+mv val/ILSVRC2012_val_00041757.JPEG n03908618/
+mv val/ILSVRC2012_val_00041758.JPEG n02977058/
+mv val/ILSVRC2012_val_00041759.JPEG n02111889/
+mv val/ILSVRC2012_val_00041760.JPEG n04447861/
+mv val/ILSVRC2012_val_00041761.JPEG n02114855/
+mv val/ILSVRC2012_val_00041762.JPEG n02095314/
+mv val/ILSVRC2012_val_00041763.JPEG n02804414/
+mv val/ILSVRC2012_val_00041764.JPEG n02489166/
+mv val/ILSVRC2012_val_00041765.JPEG n04277352/
+mv val/ILSVRC2012_val_00041766.JPEG n02236044/
+mv val/ILSVRC2012_val_00041767.JPEG n02408429/
+mv val/ILSVRC2012_val_00041768.JPEG n02655020/
+mv val/ILSVRC2012_val_00041769.JPEG n01693334/
+mv val/ILSVRC2012_val_00041770.JPEG n03447721/
+mv val/ILSVRC2012_val_00041771.JPEG n02093647/
+mv val/ILSVRC2012_val_00041772.JPEG n02791124/
+mv val/ILSVRC2012_val_00041773.JPEG n02077923/
+mv val/ILSVRC2012_val_00041774.JPEG n04536866/
+mv val/ILSVRC2012_val_00041775.JPEG n03291819/
+mv val/ILSVRC2012_val_00041776.JPEG n02093859/
+mv val/ILSVRC2012_val_00041777.JPEG n02115641/
+mv val/ILSVRC2012_val_00041778.JPEG n04254680/
+mv val/ILSVRC2012_val_00041779.JPEG n04501370/
+mv val/ILSVRC2012_val_00041780.JPEG n04019541/
+mv val/ILSVRC2012_val_00041781.JPEG n02795169/
+mv val/ILSVRC2012_val_00041782.JPEG n03459775/
+mv val/ILSVRC2012_val_00041783.JPEG n04209133/
+mv val/ILSVRC2012_val_00041784.JPEG n07860988/
+mv val/ILSVRC2012_val_00041785.JPEG n04553703/
+mv val/ILSVRC2012_val_00041786.JPEG n02484975/
+mv val/ILSVRC2012_val_00041787.JPEG n03530642/
+mv val/ILSVRC2012_val_00041788.JPEG n02906734/
+mv val/ILSVRC2012_val_00041789.JPEG n04325704/
+mv val/ILSVRC2012_val_00041790.JPEG n04008634/
+mv val/ILSVRC2012_val_00041791.JPEG n12057211/
+mv val/ILSVRC2012_val_00041792.JPEG n02342885/
+mv val/ILSVRC2012_val_00041793.JPEG n04344873/
+mv val/ILSVRC2012_val_00041794.JPEG n03794056/
+mv val/ILSVRC2012_val_00041795.JPEG n02107142/
+mv val/ILSVRC2012_val_00041796.JPEG n04090263/
+mv val/ILSVRC2012_val_00041797.JPEG n02009229/
+mv val/ILSVRC2012_val_00041798.JPEG n02971356/
+mv val/ILSVRC2012_val_00041799.JPEG n02504458/
+mv val/ILSVRC2012_val_00041800.JPEG n04273569/
+mv val/ILSVRC2012_val_00041801.JPEG n09399592/
+mv val/ILSVRC2012_val_00041802.JPEG n03272562/
+mv val/ILSVRC2012_val_00041803.JPEG n02277742/
+mv val/ILSVRC2012_val_00041804.JPEG n02279972/
+mv val/ILSVRC2012_val_00041805.JPEG n07930864/
+mv val/ILSVRC2012_val_00041806.JPEG n02917067/
+mv val/ILSVRC2012_val_00041807.JPEG n04004767/
+mv val/ILSVRC2012_val_00041808.JPEG n04392985/
+mv val/ILSVRC2012_val_00041809.JPEG n07718747/
+mv val/ILSVRC2012_val_00041810.JPEG n02089078/
+mv val/ILSVRC2012_val_00041811.JPEG n03903868/
+mv val/ILSVRC2012_val_00041812.JPEG n03208938/
+mv val/ILSVRC2012_val_00041813.JPEG n02133161/
+mv val/ILSVRC2012_val_00041814.JPEG n03376595/
+mv val/ILSVRC2012_val_00041815.JPEG n02978881/
+mv val/ILSVRC2012_val_00041816.JPEG n03201208/
+mv val/ILSVRC2012_val_00041817.JPEG n02834397/
+mv val/ILSVRC2012_val_00041818.JPEG n02443484/
+mv val/ILSVRC2012_val_00041819.JPEG n02085620/
+mv val/ILSVRC2012_val_00041820.JPEG n02111889/
+mv val/ILSVRC2012_val_00041821.JPEG n03532672/
+mv val/ILSVRC2012_val_00041822.JPEG n04263257/
+mv val/ILSVRC2012_val_00041823.JPEG n03661043/
+mv val/ILSVRC2012_val_00041824.JPEG n15075141/
+mv val/ILSVRC2012_val_00041825.JPEG n04200800/
+mv val/ILSVRC2012_val_00041826.JPEG n03786901/
+mv val/ILSVRC2012_val_00041827.JPEG n01873310/
+mv val/ILSVRC2012_val_00041828.JPEG n04423845/
+mv val/ILSVRC2012_val_00041829.JPEG n01737021/
+mv val/ILSVRC2012_val_00041830.JPEG n02951358/
+mv val/ILSVRC2012_val_00041831.JPEG n02116738/
+mv val/ILSVRC2012_val_00041832.JPEG n01798484/
+mv val/ILSVRC2012_val_00041833.JPEG n03980874/
+mv val/ILSVRC2012_val_00041834.JPEG n02834397/
+mv val/ILSVRC2012_val_00041835.JPEG n02398521/
+mv val/ILSVRC2012_val_00041836.JPEG n01531178/
+mv val/ILSVRC2012_val_00041837.JPEG n07734744/
+mv val/ILSVRC2012_val_00041838.JPEG n01847000/
+mv val/ILSVRC2012_val_00041839.JPEG n03841143/
+mv val/ILSVRC2012_val_00041840.JPEG n02110185/
+mv val/ILSVRC2012_val_00041841.JPEG n13044778/
+mv val/ILSVRC2012_val_00041842.JPEG n02727426/
+mv val/ILSVRC2012_val_00041843.JPEG n02799071/
+mv val/ILSVRC2012_val_00041844.JPEG n02107908/
+mv val/ILSVRC2012_val_00041845.JPEG n01806143/
+mv val/ILSVRC2012_val_00041846.JPEG n03770679/
+mv val/ILSVRC2012_val_00041847.JPEG n03967562/
+mv val/ILSVRC2012_val_00041848.JPEG n02086646/
+mv val/ILSVRC2012_val_00041849.JPEG n02892767/
+mv val/ILSVRC2012_val_00041850.JPEG n01855032/
+mv val/ILSVRC2012_val_00041851.JPEG n02165105/
+mv val/ILSVRC2012_val_00041852.JPEG n01514859/
+mv val/ILSVRC2012_val_00041853.JPEG n04037443/
+mv val/ILSVRC2012_val_00041854.JPEG n03877472/
+mv val/ILSVRC2012_val_00041855.JPEG n03729826/
+mv val/ILSVRC2012_val_00041856.JPEG n01728920/
+mv val/ILSVRC2012_val_00041857.JPEG n02676566/
+mv val/ILSVRC2012_val_00041858.JPEG n03627232/
+mv val/ILSVRC2012_val_00041859.JPEG n04069434/
+mv val/ILSVRC2012_val_00041860.JPEG n04192698/
+mv val/ILSVRC2012_val_00041861.JPEG n02486261/
+mv val/ILSVRC2012_val_00041862.JPEG n02795169/
+mv val/ILSVRC2012_val_00041863.JPEG n04033901/
+mv val/ILSVRC2012_val_00041864.JPEG n01824575/
+mv val/ILSVRC2012_val_00041865.JPEG n02105641/
+mv val/ILSVRC2012_val_00041866.JPEG n02444819/
+mv val/ILSVRC2012_val_00041867.JPEG n01824575/
+mv val/ILSVRC2012_val_00041868.JPEG n03908714/
+mv val/ILSVRC2012_val_00041869.JPEG n04239074/
+mv val/ILSVRC2012_val_00041870.JPEG n02102480/
+mv val/ILSVRC2012_val_00041871.JPEG n02264363/
+mv val/ILSVRC2012_val_00041872.JPEG n01498041/
+mv val/ILSVRC2012_val_00041873.JPEG n02930766/
+mv val/ILSVRC2012_val_00041874.JPEG n04355933/
+mv val/ILSVRC2012_val_00041875.JPEG n04125021/
+mv val/ILSVRC2012_val_00041876.JPEG n03481172/
+mv val/ILSVRC2012_val_00041877.JPEG n02123159/
+mv val/ILSVRC2012_val_00041878.JPEG n02099712/
+mv val/ILSVRC2012_val_00041879.JPEG n04209239/
+mv val/ILSVRC2012_val_00041880.JPEG n02111889/
+mv val/ILSVRC2012_val_00041881.JPEG n02002556/
+mv val/ILSVRC2012_val_00041882.JPEG n03690938/
+mv val/ILSVRC2012_val_00041883.JPEG n04429376/
+mv val/ILSVRC2012_val_00041884.JPEG n03814906/
+mv val/ILSVRC2012_val_00041885.JPEG n04525305/
+mv val/ILSVRC2012_val_00041886.JPEG n02107908/
+mv val/ILSVRC2012_val_00041887.JPEG n01692333/
+mv val/ILSVRC2012_val_00041888.JPEG n04127249/
+mv val/ILSVRC2012_val_00041889.JPEG n01914609/
+mv val/ILSVRC2012_val_00041890.JPEG n04201297/
+mv val/ILSVRC2012_val_00041891.JPEG n02807133/
+mv val/ILSVRC2012_val_00041892.JPEG n01985128/
+mv val/ILSVRC2012_val_00041893.JPEG n02979186/
+mv val/ILSVRC2012_val_00041894.JPEG n02088238/
+mv val/ILSVRC2012_val_00041895.JPEG n03594945/
+mv val/ILSVRC2012_val_00041896.JPEG n03388043/
+mv val/ILSVRC2012_val_00041897.JPEG n09468604/
+mv val/ILSVRC2012_val_00041898.JPEG n03729826/
+mv val/ILSVRC2012_val_00041899.JPEG n02704792/
+mv val/ILSVRC2012_val_00041900.JPEG n07930864/
+mv val/ILSVRC2012_val_00041901.JPEG n03355925/
+mv val/ILSVRC2012_val_00041902.JPEG n04554684/
+mv val/ILSVRC2012_val_00041903.JPEG n04131690/
+mv val/ILSVRC2012_val_00041904.JPEG n04026417/
+mv val/ILSVRC2012_val_00041905.JPEG n02437616/
+mv val/ILSVRC2012_val_00041906.JPEG n03769881/
+mv val/ILSVRC2012_val_00041907.JPEG n04330267/
+mv val/ILSVRC2012_val_00041908.JPEG n02091831/
+mv val/ILSVRC2012_val_00041909.JPEG n01797886/
+mv val/ILSVRC2012_val_00041910.JPEG n02687172/
+mv val/ILSVRC2012_val_00041911.JPEG n02906734/
+mv val/ILSVRC2012_val_00041912.JPEG n02091635/
+mv val/ILSVRC2012_val_00041913.JPEG n02814533/
+mv val/ILSVRC2012_val_00041914.JPEG n02114712/
+mv val/ILSVRC2012_val_00041915.JPEG n03770439/
+mv val/ILSVRC2012_val_00041916.JPEG n04099969/
+mv val/ILSVRC2012_val_00041917.JPEG n04033995/
+mv val/ILSVRC2012_val_00041918.JPEG n02085936/
+mv val/ILSVRC2012_val_00041919.JPEG n01644900/
+mv val/ILSVRC2012_val_00041920.JPEG n02930766/
+mv val/ILSVRC2012_val_00041921.JPEG n01917289/
+mv val/ILSVRC2012_val_00041922.JPEG n01704323/
+mv val/ILSVRC2012_val_00041923.JPEG n04515003/
+mv val/ILSVRC2012_val_00041924.JPEG n01950731/
+mv val/ILSVRC2012_val_00041925.JPEG n03888257/
+mv val/ILSVRC2012_val_00041926.JPEG n07836838/
+mv val/ILSVRC2012_val_00041927.JPEG n02687172/
+mv val/ILSVRC2012_val_00041928.JPEG n02102318/
+mv val/ILSVRC2012_val_00041929.JPEG n02106030/
+mv val/ILSVRC2012_val_00041930.JPEG n02676566/
+mv val/ILSVRC2012_val_00041931.JPEG n01749939/
+mv val/ILSVRC2012_val_00041932.JPEG n03314780/
+mv val/ILSVRC2012_val_00041933.JPEG n03690938/
+mv val/ILSVRC2012_val_00041934.JPEG n02823750/
+mv val/ILSVRC2012_val_00041935.JPEG n03344393/
+mv val/ILSVRC2012_val_00041936.JPEG n03666591/
+mv val/ILSVRC2012_val_00041937.JPEG n04458633/
+mv val/ILSVRC2012_val_00041938.JPEG n04398044/
+mv val/ILSVRC2012_val_00041939.JPEG n01440764/
+mv val/ILSVRC2012_val_00041940.JPEG n04482393/
+mv val/ILSVRC2012_val_00041941.JPEG n03075370/
+mv val/ILSVRC2012_val_00041942.JPEG n02701002/
+mv val/ILSVRC2012_val_00041943.JPEG n04023962/
+mv val/ILSVRC2012_val_00041944.JPEG n01558993/
+mv val/ILSVRC2012_val_00041945.JPEG n07716358/
+mv val/ILSVRC2012_val_00041946.JPEG n02325366/
+mv val/ILSVRC2012_val_00041947.JPEG n02106382/
+mv val/ILSVRC2012_val_00041948.JPEG n04590129/
+mv val/ILSVRC2012_val_00041949.JPEG n10148035/
+mv val/ILSVRC2012_val_00041950.JPEG n02236044/
+mv val/ILSVRC2012_val_00041951.JPEG n04252077/
+mv val/ILSVRC2012_val_00041952.JPEG n12144580/
+mv val/ILSVRC2012_val_00041953.JPEG n02110627/
+mv val/ILSVRC2012_val_00041954.JPEG n03000134/
+mv val/ILSVRC2012_val_00041955.JPEG n02086079/
+mv val/ILSVRC2012_val_00041956.JPEG n03032252/
+mv val/ILSVRC2012_val_00041957.JPEG n02408429/
+mv val/ILSVRC2012_val_00041958.JPEG n03394916/
+mv val/ILSVRC2012_val_00041959.JPEG n02871525/
+mv val/ILSVRC2012_val_00041960.JPEG n01806567/
+mv val/ILSVRC2012_val_00041961.JPEG n02127052/
+mv val/ILSVRC2012_val_00041962.JPEG n02879718/
+mv val/ILSVRC2012_val_00041963.JPEG n03032252/
+mv val/ILSVRC2012_val_00041964.JPEG n03935335/
+mv val/ILSVRC2012_val_00041965.JPEG n04482393/
+mv val/ILSVRC2012_val_00041966.JPEG n03710721/
+mv val/ILSVRC2012_val_00041967.JPEG n04522168/
+mv val/ILSVRC2012_val_00041968.JPEG n04371430/
+mv val/ILSVRC2012_val_00041969.JPEG n04579145/
+mv val/ILSVRC2012_val_00041970.JPEG n03967562/
+mv val/ILSVRC2012_val_00041971.JPEG n03201208/
+mv val/ILSVRC2012_val_00041972.JPEG n04355338/
+mv val/ILSVRC2012_val_00041973.JPEG n04328186/
+mv val/ILSVRC2012_val_00041974.JPEG n04111531/
+mv val/ILSVRC2012_val_00041975.JPEG n01968897/
+mv val/ILSVRC2012_val_00041976.JPEG n02115913/
+mv val/ILSVRC2012_val_00041977.JPEG n01518878/
+mv val/ILSVRC2012_val_00041978.JPEG n04344873/
+mv val/ILSVRC2012_val_00041979.JPEG n02814533/
+mv val/ILSVRC2012_val_00041980.JPEG n01697457/
+mv val/ILSVRC2012_val_00041981.JPEG n04371430/
+mv val/ILSVRC2012_val_00041982.JPEG n01855032/
+mv val/ILSVRC2012_val_00041983.JPEG n01806143/
+mv val/ILSVRC2012_val_00041984.JPEG n03598930/
+mv val/ILSVRC2012_val_00041985.JPEG n02971356/
+mv val/ILSVRC2012_val_00041986.JPEG n03372029/
+mv val/ILSVRC2012_val_00041987.JPEG n02101388/
+mv val/ILSVRC2012_val_00041988.JPEG n02963159/
+mv val/ILSVRC2012_val_00041989.JPEG n02391049/
+mv val/ILSVRC2012_val_00041990.JPEG n01560419/
+mv val/ILSVRC2012_val_00041991.JPEG n02114367/
+mv val/ILSVRC2012_val_00041992.JPEG n03933933/
+mv val/ILSVRC2012_val_00041993.JPEG n03259280/
+mv val/ILSVRC2012_val_00041994.JPEG n01756291/
+mv val/ILSVRC2012_val_00041995.JPEG n04479046/
+mv val/ILSVRC2012_val_00041996.JPEG n07583066/
+mv val/ILSVRC2012_val_00041997.JPEG n03792972/
+mv val/ILSVRC2012_val_00041998.JPEG n02100877/
+mv val/ILSVRC2012_val_00041999.JPEG n07768694/
+mv val/ILSVRC2012_val_00042000.JPEG n02007558/
+mv val/ILSVRC2012_val_00042001.JPEG n03937543/
+mv val/ILSVRC2012_val_00042002.JPEG n03666591/
+mv val/ILSVRC2012_val_00042003.JPEG n02104029/
+mv val/ILSVRC2012_val_00042004.JPEG n01910747/
+mv val/ILSVRC2012_val_00042005.JPEG n02095889/
+mv val/ILSVRC2012_val_00042006.JPEG n04417672/
+mv val/ILSVRC2012_val_00042007.JPEG n03769881/
+mv val/ILSVRC2012_val_00042008.JPEG n03929855/
+mv val/ILSVRC2012_val_00042009.JPEG n02641379/
+mv val/ILSVRC2012_val_00042010.JPEG n02229544/
+mv val/ILSVRC2012_val_00042011.JPEG n07614500/
+mv val/ILSVRC2012_val_00042012.JPEG n04311174/
+mv val/ILSVRC2012_val_00042013.JPEG n02361337/
+mv val/ILSVRC2012_val_00042014.JPEG n07753592/
+mv val/ILSVRC2012_val_00042015.JPEG n02206856/
+mv val/ILSVRC2012_val_00042016.JPEG n04090263/
+mv val/ILSVRC2012_val_00042017.JPEG n03444034/
+mv val/ILSVRC2012_val_00042018.JPEG n04525305/
+mv val/ILSVRC2012_val_00042019.JPEG n02281406/
+mv val/ILSVRC2012_val_00042020.JPEG n02526121/
+mv val/ILSVRC2012_val_00042021.JPEG n01807496/
+mv val/ILSVRC2012_val_00042022.JPEG n02096294/
+mv val/ILSVRC2012_val_00042023.JPEG n01667778/
+mv val/ILSVRC2012_val_00042024.JPEG n02480855/
+mv val/ILSVRC2012_val_00042025.JPEG n07711569/
+mv val/ILSVRC2012_val_00042026.JPEG n02009229/
+mv val/ILSVRC2012_val_00042027.JPEG n01697457/
+mv val/ILSVRC2012_val_00042028.JPEG n03271574/
+mv val/ILSVRC2012_val_00042029.JPEG n01687978/
+mv val/ILSVRC2012_val_00042030.JPEG n02100236/
+mv val/ILSVRC2012_val_00042031.JPEG n03908714/
+mv val/ILSVRC2012_val_00042032.JPEG n01531178/
+mv val/ILSVRC2012_val_00042033.JPEG n02364673/
+mv val/ILSVRC2012_val_00042034.JPEG n03773504/
+mv val/ILSVRC2012_val_00042035.JPEG n03000684/
+mv val/ILSVRC2012_val_00042036.JPEG n02981792/
+mv val/ILSVRC2012_val_00042037.JPEG n04485082/
+mv val/ILSVRC2012_val_00042038.JPEG n01797886/
+mv val/ILSVRC2012_val_00042039.JPEG n03498962/
+mv val/ILSVRC2012_val_00042040.JPEG n03538406/
+mv val/ILSVRC2012_val_00042041.JPEG n03530642/
+mv val/ILSVRC2012_val_00042042.JPEG n01872401/
+mv val/ILSVRC2012_val_00042043.JPEG n02342885/
+mv val/ILSVRC2012_val_00042044.JPEG n02457408/
+mv val/ILSVRC2012_val_00042045.JPEG n02480495/
+mv val/ILSVRC2012_val_00042046.JPEG n02480855/
+mv val/ILSVRC2012_val_00042047.JPEG n01770393/
+mv val/ILSVRC2012_val_00042048.JPEG n01560419/
+mv val/ILSVRC2012_val_00042049.JPEG n01665541/
+mv val/ILSVRC2012_val_00042050.JPEG n04540053/
+mv val/ILSVRC2012_val_00042051.JPEG n04346328/
+mv val/ILSVRC2012_val_00042052.JPEG n04485082/
+mv val/ILSVRC2012_val_00042053.JPEG n02091635/
+mv val/ILSVRC2012_val_00042054.JPEG n03733805/
+mv val/ILSVRC2012_val_00042055.JPEG n02120505/
+mv val/ILSVRC2012_val_00042056.JPEG n02988304/
+mv val/ILSVRC2012_val_00042057.JPEG n04049303/
+mv val/ILSVRC2012_val_00042058.JPEG n02607072/
+mv val/ILSVRC2012_val_00042059.JPEG n02488702/
+mv val/ILSVRC2012_val_00042060.JPEG n03026506/
+mv val/ILSVRC2012_val_00042061.JPEG n07718472/
+mv val/ILSVRC2012_val_00042062.JPEG n03627232/
+mv val/ILSVRC2012_val_00042063.JPEG n03388043/
+mv val/ILSVRC2012_val_00042064.JPEG n02403003/
+mv val/ILSVRC2012_val_00042065.JPEG n03627232/
+mv val/ILSVRC2012_val_00042066.JPEG n03877845/
+mv val/ILSVRC2012_val_00042067.JPEG n03388043/
+mv val/ILSVRC2012_val_00042068.JPEG n02487347/
+mv val/ILSVRC2012_val_00042069.JPEG n04005630/
+mv val/ILSVRC2012_val_00042070.JPEG n01682714/
+mv val/ILSVRC2012_val_00042071.JPEG n01818515/
+mv val/ILSVRC2012_val_00042072.JPEG n04311174/
+mv val/ILSVRC2012_val_00042073.JPEG n01664065/
+mv val/ILSVRC2012_val_00042074.JPEG n04509417/
+mv val/ILSVRC2012_val_00042075.JPEG n02086910/
+mv val/ILSVRC2012_val_00042076.JPEG n02219486/
+mv val/ILSVRC2012_val_00042077.JPEG n04392985/
+mv val/ILSVRC2012_val_00042078.JPEG n04344873/
+mv val/ILSVRC2012_val_00042079.JPEG n01685808/
+mv val/ILSVRC2012_val_00042080.JPEG n07717410/
+mv val/ILSVRC2012_val_00042081.JPEG n03384352/
+mv val/ILSVRC2012_val_00042082.JPEG n01728920/
+mv val/ILSVRC2012_val_00042083.JPEG n02027492/
+mv val/ILSVRC2012_val_00042084.JPEG n02012849/
+mv val/ILSVRC2012_val_00042085.JPEG n04336792/
+mv val/ILSVRC2012_val_00042086.JPEG n02481823/
+mv val/ILSVRC2012_val_00042087.JPEG n07565083/
+mv val/ILSVRC2012_val_00042088.JPEG n03868863/
+mv val/ILSVRC2012_val_00042089.JPEG n03179701/
+mv val/ILSVRC2012_val_00042090.JPEG n02109525/
+mv val/ILSVRC2012_val_00042091.JPEG n04330267/
+mv val/ILSVRC2012_val_00042092.JPEG n03982430/
+mv val/ILSVRC2012_val_00042093.JPEG n03272010/
+mv val/ILSVRC2012_val_00042094.JPEG n04005630/
+mv val/ILSVRC2012_val_00042095.JPEG n02112137/
+mv val/ILSVRC2012_val_00042096.JPEG n03770439/
+mv val/ILSVRC2012_val_00042097.JPEG n02088094/
+mv val/ILSVRC2012_val_00042098.JPEG n02114548/
+mv val/ILSVRC2012_val_00042099.JPEG n02091032/
+mv val/ILSVRC2012_val_00042100.JPEG n01728572/
+mv val/ILSVRC2012_val_00042101.JPEG n03240683/
+mv val/ILSVRC2012_val_00042102.JPEG n02808440/
+mv val/ILSVRC2012_val_00042103.JPEG n02486410/
+mv val/ILSVRC2012_val_00042104.JPEG n02930766/
+mv val/ILSVRC2012_val_00042105.JPEG n01737021/
+mv val/ILSVRC2012_val_00042106.JPEG n03733805/
+mv val/ILSVRC2012_val_00042107.JPEG n03110669/
+mv val/ILSVRC2012_val_00042108.JPEG n03016953/
+mv val/ILSVRC2012_val_00042109.JPEG n01748264/
+mv val/ILSVRC2012_val_00042110.JPEG n02325366/
+mv val/ILSVRC2012_val_00042111.JPEG n01748264/
+mv val/ILSVRC2012_val_00042112.JPEG n02364673/
+mv val/ILSVRC2012_val_00042113.JPEG n02017213/
+mv val/ILSVRC2012_val_00042114.JPEG n04252077/
+mv val/ILSVRC2012_val_00042115.JPEG n02860847/
+mv val/ILSVRC2012_val_00042116.JPEG n03124043/
+mv val/ILSVRC2012_val_00042117.JPEG n03461385/
+mv val/ILSVRC2012_val_00042118.JPEG n02090721/
+mv val/ILSVRC2012_val_00042119.JPEG n03998194/
+mv val/ILSVRC2012_val_00042120.JPEG n02095570/
+mv val/ILSVRC2012_val_00042121.JPEG n07753113/
+mv val/ILSVRC2012_val_00042122.JPEG n04423845/
+mv val/ILSVRC2012_val_00042123.JPEG n04044716/
+mv val/ILSVRC2012_val_00042124.JPEG n01695060/
+mv val/ILSVRC2012_val_00042125.JPEG n01632458/
+mv val/ILSVRC2012_val_00042126.JPEG n02643566/
+mv val/ILSVRC2012_val_00042127.JPEG n02167151/
+mv val/ILSVRC2012_val_00042128.JPEG n01860187/
+mv val/ILSVRC2012_val_00042129.JPEG n02403003/
+mv val/ILSVRC2012_val_00042130.JPEG n02840245/
+mv val/ILSVRC2012_val_00042131.JPEG n03658185/
+mv val/ILSVRC2012_val_00042132.JPEG n04116512/
+mv val/ILSVRC2012_val_00042133.JPEG n02096294/
+mv val/ILSVRC2012_val_00042134.JPEG n01735189/
+mv val/ILSVRC2012_val_00042135.JPEG n01514859/
+mv val/ILSVRC2012_val_00042136.JPEG n04131690/
+mv val/ILSVRC2012_val_00042137.JPEG n02978881/
+mv val/ILSVRC2012_val_00042138.JPEG n03461385/
+mv val/ILSVRC2012_val_00042139.JPEG n03944341/
+mv val/ILSVRC2012_val_00042140.JPEG n02441942/
+mv val/ILSVRC2012_val_00042141.JPEG n07753113/
+mv val/ILSVRC2012_val_00042142.JPEG n01693334/
+mv val/ILSVRC2012_val_00042143.JPEG n09399592/
+mv val/ILSVRC2012_val_00042144.JPEG n02105412/
+mv val/ILSVRC2012_val_00042145.JPEG n03400231/
+mv val/ILSVRC2012_val_00042146.JPEG n04550184/
+mv val/ILSVRC2012_val_00042147.JPEG n02823428/
+mv val/ILSVRC2012_val_00042148.JPEG n02112137/
+mv val/ILSVRC2012_val_00042149.JPEG n03920288/
+mv val/ILSVRC2012_val_00042150.JPEG n04509417/
+mv val/ILSVRC2012_val_00042151.JPEG n03785016/
+mv val/ILSVRC2012_val_00042152.JPEG n03534580/
+mv val/ILSVRC2012_val_00042153.JPEG n02066245/
+mv val/ILSVRC2012_val_00042154.JPEG n02807133/
+mv val/ILSVRC2012_val_00042155.JPEG n01924916/
+mv val/ILSVRC2012_val_00042156.JPEG n02017213/
+mv val/ILSVRC2012_val_00042157.JPEG n03796401/
+mv val/ILSVRC2012_val_00042158.JPEG n02090721/
+mv val/ILSVRC2012_val_00042159.JPEG n01981276/
+mv val/ILSVRC2012_val_00042160.JPEG n02497673/
+mv val/ILSVRC2012_val_00042161.JPEG n09399592/
+mv val/ILSVRC2012_val_00042162.JPEG n01749939/
+mv val/ILSVRC2012_val_00042163.JPEG n03344393/
+mv val/ILSVRC2012_val_00042164.JPEG n03344393/
+mv val/ILSVRC2012_val_00042165.JPEG n02490219/
+mv val/ILSVRC2012_val_00042166.JPEG n04335435/
+mv val/ILSVRC2012_val_00042167.JPEG n04065272/
+mv val/ILSVRC2012_val_00042168.JPEG n07873807/
+mv val/ILSVRC2012_val_00042169.JPEG n03314780/
+mv val/ILSVRC2012_val_00042170.JPEG n03530642/
+mv val/ILSVRC2012_val_00042171.JPEG n02783161/
+mv val/ILSVRC2012_val_00042172.JPEG n02114548/
+mv val/ILSVRC2012_val_00042173.JPEG n02319095/
+mv val/ILSVRC2012_val_00042174.JPEG n03018349/
+mv val/ILSVRC2012_val_00042175.JPEG n01498041/
+mv val/ILSVRC2012_val_00042176.JPEG n02859443/
+mv val/ILSVRC2012_val_00042177.JPEG n02096051/
+mv val/ILSVRC2012_val_00042178.JPEG n04251144/
+mv val/ILSVRC2012_val_00042179.JPEG n03042490/
+mv val/ILSVRC2012_val_00042180.JPEG n02167151/
+mv val/ILSVRC2012_val_00042181.JPEG n02096294/
+mv val/ILSVRC2012_val_00042182.JPEG n09246464/
+mv val/ILSVRC2012_val_00042183.JPEG n12985857/
+mv val/ILSVRC2012_val_00042184.JPEG n02100583/
+mv val/ILSVRC2012_val_00042185.JPEG n03240683/
+mv val/ILSVRC2012_val_00042186.JPEG n02236044/
+mv val/ILSVRC2012_val_00042187.JPEG n02356798/
+mv val/ILSVRC2012_val_00042188.JPEG n02317335/
+mv val/ILSVRC2012_val_00042189.JPEG n02859443/
+mv val/ILSVRC2012_val_00042190.JPEG n02510455/
+mv val/ILSVRC2012_val_00042191.JPEG n01945685/
+mv val/ILSVRC2012_val_00042192.JPEG n03792972/
+mv val/ILSVRC2012_val_00042193.JPEG n02011460/
+mv val/ILSVRC2012_val_00042194.JPEG n03220513/
+mv val/ILSVRC2012_val_00042195.JPEG n04141076/
+mv val/ILSVRC2012_val_00042196.JPEG n03662601/
+mv val/ILSVRC2012_val_00042197.JPEG n07745940/
+mv val/ILSVRC2012_val_00042198.JPEG n02747177/
+mv val/ILSVRC2012_val_00042199.JPEG n12998815/
+mv val/ILSVRC2012_val_00042200.JPEG n04209133/
+mv val/ILSVRC2012_val_00042201.JPEG n02097130/
+mv val/ILSVRC2012_val_00042202.JPEG n01685808/
+mv val/ILSVRC2012_val_00042203.JPEG n04273569/
+mv val/ILSVRC2012_val_00042204.JPEG n04515003/
+mv val/ILSVRC2012_val_00042205.JPEG n02094258/
+mv val/ILSVRC2012_val_00042206.JPEG n02109047/
+mv val/ILSVRC2012_val_00042207.JPEG n03028079/
+mv val/ILSVRC2012_val_00042208.JPEG n02408429/
+mv val/ILSVRC2012_val_00042209.JPEG n03777754/
+mv val/ILSVRC2012_val_00042210.JPEG n02113186/
+mv val/ILSVRC2012_val_00042211.JPEG n02500267/
+mv val/ILSVRC2012_val_00042212.JPEG n03891251/
+mv val/ILSVRC2012_val_00042213.JPEG n02112018/
+mv val/ILSVRC2012_val_00042214.JPEG n04487081/
+mv val/ILSVRC2012_val_00042215.JPEG n02927161/
+mv val/ILSVRC2012_val_00042216.JPEG n01664065/
+mv val/ILSVRC2012_val_00042217.JPEG n03534580/
+mv val/ILSVRC2012_val_00042218.JPEG n03729826/
+mv val/ILSVRC2012_val_00042219.JPEG n03187595/
+mv val/ILSVRC2012_val_00042220.JPEG n02105505/
+mv val/ILSVRC2012_val_00042221.JPEG n07718747/
+mv val/ILSVRC2012_val_00042222.JPEG n02802426/
+mv val/ILSVRC2012_val_00042223.JPEG n02226429/
+mv val/ILSVRC2012_val_00042224.JPEG n04116512/
+mv val/ILSVRC2012_val_00042225.JPEG n01756291/
+mv val/ILSVRC2012_val_00042226.JPEG n01817953/
+mv val/ILSVRC2012_val_00042227.JPEG n07714990/
+mv val/ILSVRC2012_val_00042228.JPEG n02457408/
+mv val/ILSVRC2012_val_00042229.JPEG n03109150/
+mv val/ILSVRC2012_val_00042230.JPEG n04026417/
+mv val/ILSVRC2012_val_00042231.JPEG n02437312/
+mv val/ILSVRC2012_val_00042232.JPEG n02124075/
+mv val/ILSVRC2012_val_00042233.JPEG n02113978/
+mv val/ILSVRC2012_val_00042234.JPEG n03109150/
+mv val/ILSVRC2012_val_00042235.JPEG n02389026/
+mv val/ILSVRC2012_val_00042236.JPEG n06785654/
+mv val/ILSVRC2012_val_00042237.JPEG n03089624/
+mv val/ILSVRC2012_val_00042238.JPEG n03444034/
+mv val/ILSVRC2012_val_00042239.JPEG n04149813/
+mv val/ILSVRC2012_val_00042240.JPEG n02091032/
+mv val/ILSVRC2012_val_00042241.JPEG n04376876/
+mv val/ILSVRC2012_val_00042242.JPEG n02606052/
+mv val/ILSVRC2012_val_00042243.JPEG n03492542/
+mv val/ILSVRC2012_val_00042244.JPEG n04579145/
+mv val/ILSVRC2012_val_00042245.JPEG n01496331/
+mv val/ILSVRC2012_val_00042246.JPEG n01592084/
+mv val/ILSVRC2012_val_00042247.JPEG n04141975/
+mv val/ILSVRC2012_val_00042248.JPEG n01580077/
+mv val/ILSVRC2012_val_00042249.JPEG n02112706/
+mv val/ILSVRC2012_val_00042250.JPEG n03388043/
+mv val/ILSVRC2012_val_00042251.JPEG n02256656/
+mv val/ILSVRC2012_val_00042252.JPEG n02087394/
+mv val/ILSVRC2012_val_00042253.JPEG n04179913/
+mv val/ILSVRC2012_val_00042254.JPEG n07930864/
+mv val/ILSVRC2012_val_00042255.JPEG n04355338/
+mv val/ILSVRC2012_val_00042256.JPEG n03874293/
+mv val/ILSVRC2012_val_00042257.JPEG n04033995/
+mv val/ILSVRC2012_val_00042258.JPEG n02088364/
+mv val/ILSVRC2012_val_00042259.JPEG n03535780/
+mv val/ILSVRC2012_val_00042260.JPEG n03476991/
+mv val/ILSVRC2012_val_00042261.JPEG n04336792/
+mv val/ILSVRC2012_val_00042262.JPEG n03888257/
+mv val/ILSVRC2012_val_00042263.JPEG n07836838/
+mv val/ILSVRC2012_val_00042264.JPEG n03028079/
+mv val/ILSVRC2012_val_00042265.JPEG n03877845/
+mv val/ILSVRC2012_val_00042266.JPEG n03982430/
+mv val/ILSVRC2012_val_00042267.JPEG n02116738/
+mv val/ILSVRC2012_val_00042268.JPEG n04596742/
+mv val/ILSVRC2012_val_00042269.JPEG n03843555/
+mv val/ILSVRC2012_val_00042270.JPEG n15075141/
+mv val/ILSVRC2012_val_00042271.JPEG n04325704/
+mv val/ILSVRC2012_val_00042272.JPEG n04398044/
+mv val/ILSVRC2012_val_00042273.JPEG n02134084/
+mv val/ILSVRC2012_val_00042274.JPEG n02132136/
+mv val/ILSVRC2012_val_00042275.JPEG n03602883/
+mv val/ILSVRC2012_val_00042276.JPEG n01955084/
+mv val/ILSVRC2012_val_00042277.JPEG n02268853/
+mv val/ILSVRC2012_val_00042278.JPEG n02490219/
+mv val/ILSVRC2012_val_00042279.JPEG n04044716/
+mv val/ILSVRC2012_val_00042280.JPEG n02492660/
+mv val/ILSVRC2012_val_00042281.JPEG n01770393/
+mv val/ILSVRC2012_val_00042282.JPEG n03447447/
+mv val/ILSVRC2012_val_00042283.JPEG n07871810/
+mv val/ILSVRC2012_val_00042284.JPEG n01739381/
+mv val/ILSVRC2012_val_00042285.JPEG n03933933/
+mv val/ILSVRC2012_val_00042286.JPEG n02110958/
+mv val/ILSVRC2012_val_00042287.JPEG n04517823/
+mv val/ILSVRC2012_val_00042288.JPEG n10565667/
+mv val/ILSVRC2012_val_00042289.JPEG n02087046/
+mv val/ILSVRC2012_val_00042290.JPEG n02909870/
+mv val/ILSVRC2012_val_00042291.JPEG n07747607/
+mv val/ILSVRC2012_val_00042292.JPEG n13037406/
+mv val/ILSVRC2012_val_00042293.JPEG n03743016/
+mv val/ILSVRC2012_val_00042294.JPEG n02113023/
+mv val/ILSVRC2012_val_00042295.JPEG n07716358/
+mv val/ILSVRC2012_val_00042296.JPEG n01828970/
+mv val/ILSVRC2012_val_00042297.JPEG n04579145/
+mv val/ILSVRC2012_val_00042298.JPEG n04482393/
+mv val/ILSVRC2012_val_00042299.JPEG n02169497/
+mv val/ILSVRC2012_val_00042300.JPEG n04371430/
+mv val/ILSVRC2012_val_00042301.JPEG n01751748/
+mv val/ILSVRC2012_val_00042302.JPEG n01632777/
+mv val/ILSVRC2012_val_00042303.JPEG n02106382/
+mv val/ILSVRC2012_val_00042304.JPEG n01697457/
+mv val/ILSVRC2012_val_00042305.JPEG n04074963/
+mv val/ILSVRC2012_val_00042306.JPEG n03062245/
+mv val/ILSVRC2012_val_00042307.JPEG n02607072/
+mv val/ILSVRC2012_val_00042308.JPEG n03868863/
+mv val/ILSVRC2012_val_00042309.JPEG n04409515/
+mv val/ILSVRC2012_val_00042310.JPEG n01829413/
+mv val/ILSVRC2012_val_00042311.JPEG n04254680/
+mv val/ILSVRC2012_val_00042312.JPEG n01728920/
+mv val/ILSVRC2012_val_00042313.JPEG n02802426/
+mv val/ILSVRC2012_val_00042314.JPEG n03666591/
+mv val/ILSVRC2012_val_00042315.JPEG n01984695/
+mv val/ILSVRC2012_val_00042316.JPEG n02708093/
+mv val/ILSVRC2012_val_00042317.JPEG n02090721/
+mv val/ILSVRC2012_val_00042318.JPEG n02089973/
+mv val/ILSVRC2012_val_00042319.JPEG n02099849/
+mv val/ILSVRC2012_val_00042320.JPEG n02134084/
+mv val/ILSVRC2012_val_00042321.JPEG n13133613/
+mv val/ILSVRC2012_val_00042322.JPEG n03733281/
+mv val/ILSVRC2012_val_00042323.JPEG n02268853/
+mv val/ILSVRC2012_val_00042324.JPEG n04347754/
+mv val/ILSVRC2012_val_00042325.JPEG n02115641/
+mv val/ILSVRC2012_val_00042326.JPEG n04346328/
+mv val/ILSVRC2012_val_00042327.JPEG n02769748/
+mv val/ILSVRC2012_val_00042328.JPEG n01665541/
+mv val/ILSVRC2012_val_00042329.JPEG n03961711/
+mv val/ILSVRC2012_val_00042330.JPEG n02391049/
+mv val/ILSVRC2012_val_00042331.JPEG n01675722/
+mv val/ILSVRC2012_val_00042332.JPEG n02017213/
+mv val/ILSVRC2012_val_00042333.JPEG n03045698/
+mv val/ILSVRC2012_val_00042334.JPEG n02356798/
+mv val/ILSVRC2012_val_00042335.JPEG n02977058/
+mv val/ILSVRC2012_val_00042336.JPEG n01873310/
+mv val/ILSVRC2012_val_00042337.JPEG n02276258/
+mv val/ILSVRC2012_val_00042338.JPEG n03692522/
+mv val/ILSVRC2012_val_00042339.JPEG n02107908/
+mv val/ILSVRC2012_val_00042340.JPEG n03954731/
+mv val/ILSVRC2012_val_00042341.JPEG n04389033/
+mv val/ILSVRC2012_val_00042342.JPEG n02226429/
+mv val/ILSVRC2012_val_00042343.JPEG n03676483/
+mv val/ILSVRC2012_val_00042344.JPEG n02107908/
+mv val/ILSVRC2012_val_00042345.JPEG n01484850/
+mv val/ILSVRC2012_val_00042346.JPEG n01774750/
+mv val/ILSVRC2012_val_00042347.JPEG n02979186/
+mv val/ILSVRC2012_val_00042348.JPEG n03761084/
+mv val/ILSVRC2012_val_00042349.JPEG n03623198/
+mv val/ILSVRC2012_val_00042350.JPEG n03445777/
+mv val/ILSVRC2012_val_00042351.JPEG n03770679/
+mv val/ILSVRC2012_val_00042352.JPEG n01728572/
+mv val/ILSVRC2012_val_00042353.JPEG n03495258/
+mv val/ILSVRC2012_val_00042354.JPEG n04613696/
+mv val/ILSVRC2012_val_00042355.JPEG n02441942/
+mv val/ILSVRC2012_val_00042356.JPEG n03594734/
+mv val/ILSVRC2012_val_00042357.JPEG n02114855/
+mv val/ILSVRC2012_val_00042358.JPEG n02883205/
+mv val/ILSVRC2012_val_00042359.JPEG n04311174/
+mv val/ILSVRC2012_val_00042360.JPEG n04532670/
+mv val/ILSVRC2012_val_00042361.JPEG n02134418/
+mv val/ILSVRC2012_val_00042362.JPEG n03717622/
+mv val/ILSVRC2012_val_00042363.JPEG n02859443/
+mv val/ILSVRC2012_val_00042364.JPEG n03930313/
+mv val/ILSVRC2012_val_00042365.JPEG n03126707/
+mv val/ILSVRC2012_val_00042366.JPEG n03977966/
+mv val/ILSVRC2012_val_00042367.JPEG n03983396/
+mv val/ILSVRC2012_val_00042368.JPEG n04456115/
+mv val/ILSVRC2012_val_00042369.JPEG n07760859/
+mv val/ILSVRC2012_val_00042370.JPEG n01532829/
+mv val/ILSVRC2012_val_00042371.JPEG n04208210/
+mv val/ILSVRC2012_val_00042372.JPEG n03991062/
+mv val/ILSVRC2012_val_00042373.JPEG n04131690/
+mv val/ILSVRC2012_val_00042374.JPEG n03649909/
+mv val/ILSVRC2012_val_00042375.JPEG n03425413/
+mv val/ILSVRC2012_val_00042376.JPEG n02017213/
+mv val/ILSVRC2012_val_00042377.JPEG n02974003/
+mv val/ILSVRC2012_val_00042378.JPEG n03958227/
+mv val/ILSVRC2012_val_00042379.JPEG n02408429/
+mv val/ILSVRC2012_val_00042380.JPEG n01614925/
+mv val/ILSVRC2012_val_00042381.JPEG n03884397/
+mv val/ILSVRC2012_val_00042382.JPEG n04429376/
+mv val/ILSVRC2012_val_00042383.JPEG n01749939/
+mv val/ILSVRC2012_val_00042384.JPEG n01756291/
+mv val/ILSVRC2012_val_00042385.JPEG n01498041/
+mv val/ILSVRC2012_val_00042386.JPEG n03992509/
+mv val/ILSVRC2012_val_00042387.JPEG n03532672/
+mv val/ILSVRC2012_val_00042388.JPEG n04286575/
+mv val/ILSVRC2012_val_00042389.JPEG n03376595/
+mv val/ILSVRC2012_val_00042390.JPEG n02108000/
+mv val/ILSVRC2012_val_00042391.JPEG n02108551/
+mv val/ILSVRC2012_val_00042392.JPEG n07565083/
+mv val/ILSVRC2012_val_00042393.JPEG n03792782/
+mv val/ILSVRC2012_val_00042394.JPEG n02089867/
+mv val/ILSVRC2012_val_00042395.JPEG n07684084/
+mv val/ILSVRC2012_val_00042396.JPEG n03404251/
+mv val/ILSVRC2012_val_00042397.JPEG n03871628/
+mv val/ILSVRC2012_val_00042398.JPEG n04311004/
+mv val/ILSVRC2012_val_00042399.JPEG n13040303/
+mv val/ILSVRC2012_val_00042400.JPEG n02111129/
+mv val/ILSVRC2012_val_00042401.JPEG n02422699/
+mv val/ILSVRC2012_val_00042402.JPEG n03733281/
+mv val/ILSVRC2012_val_00042403.JPEG n04153751/
+mv val/ILSVRC2012_val_00042404.JPEG n04179913/
+mv val/ILSVRC2012_val_00042405.JPEG n02268443/
+mv val/ILSVRC2012_val_00042406.JPEG n02443114/
+mv val/ILSVRC2012_val_00042407.JPEG n03485794/
+mv val/ILSVRC2012_val_00042408.JPEG n07579787/
+mv val/ILSVRC2012_val_00042409.JPEG n02110063/
+mv val/ILSVRC2012_val_00042410.JPEG n01616318/
+mv val/ILSVRC2012_val_00042411.JPEG n03871628/
+mv val/ILSVRC2012_val_00042412.JPEG n07697537/
+mv val/ILSVRC2012_val_00042413.JPEG n02114367/
+mv val/ILSVRC2012_val_00042414.JPEG n02091134/
+mv val/ILSVRC2012_val_00042415.JPEG n02883205/
+mv val/ILSVRC2012_val_00042416.JPEG n02814533/
+mv val/ILSVRC2012_val_00042417.JPEG n03871628/
+mv val/ILSVRC2012_val_00042418.JPEG n02105056/
+mv val/ILSVRC2012_val_00042419.JPEG n02865351/
+mv val/ILSVRC2012_val_00042420.JPEG n03991062/
+mv val/ILSVRC2012_val_00042421.JPEG n02104365/
+mv val/ILSVRC2012_val_00042422.JPEG n04275548/
+mv val/ILSVRC2012_val_00042423.JPEG n03929660/
+mv val/ILSVRC2012_val_00042424.JPEG n03814639/
+mv val/ILSVRC2012_val_00042425.JPEG n02834397/
+mv val/ILSVRC2012_val_00042426.JPEG n03792782/
+mv val/ILSVRC2012_val_00042427.JPEG n07730033/
+mv val/ILSVRC2012_val_00042428.JPEG n02445715/
+mv val/ILSVRC2012_val_00042429.JPEG n02804610/
+mv val/ILSVRC2012_val_00042430.JPEG n02119789/
+mv val/ILSVRC2012_val_00042431.JPEG n04040759/
+mv val/ILSVRC2012_val_00042432.JPEG n02415577/
+mv val/ILSVRC2012_val_00042433.JPEG n02206856/
+mv val/ILSVRC2012_val_00042434.JPEG n02114367/
+mv val/ILSVRC2012_val_00042435.JPEG n04493381/
+mv val/ILSVRC2012_val_00042436.JPEG n02276258/
+mv val/ILSVRC2012_val_00042437.JPEG n03991062/
+mv val/ILSVRC2012_val_00042438.JPEG n02236044/
+mv val/ILSVRC2012_val_00042439.JPEG n04332243/
+mv val/ILSVRC2012_val_00042440.JPEG n07760859/
+mv val/ILSVRC2012_val_00042441.JPEG n02504013/
+mv val/ILSVRC2012_val_00042442.JPEG n02090379/
+mv val/ILSVRC2012_val_00042443.JPEG n02445715/
+mv val/ILSVRC2012_val_00042444.JPEG n10565667/
+mv val/ILSVRC2012_val_00042445.JPEG n04487081/
+mv val/ILSVRC2012_val_00042446.JPEG n09472597/
+mv val/ILSVRC2012_val_00042447.JPEG n04398044/
+mv val/ILSVRC2012_val_00042448.JPEG n01873310/
+mv val/ILSVRC2012_val_00042449.JPEG n02087046/
+mv val/ILSVRC2012_val_00042450.JPEG n03788365/
+mv val/ILSVRC2012_val_00042451.JPEG n02097658/
+mv val/ILSVRC2012_val_00042452.JPEG n03467068/
+mv val/ILSVRC2012_val_00042453.JPEG n07717410/
+mv val/ILSVRC2012_val_00042454.JPEG n03642806/
+mv val/ILSVRC2012_val_00042455.JPEG n03063689/
+mv val/ILSVRC2012_val_00042456.JPEG n01914609/
+mv val/ILSVRC2012_val_00042457.JPEG n03792782/
+mv val/ILSVRC2012_val_00042458.JPEG n12267677/
+mv val/ILSVRC2012_val_00042459.JPEG n03220513/
+mv val/ILSVRC2012_val_00042460.JPEG n02119789/
+mv val/ILSVRC2012_val_00042461.JPEG n02950826/
+mv val/ILSVRC2012_val_00042462.JPEG n02113712/
+mv val/ILSVRC2012_val_00042463.JPEG n03697007/
+mv val/ILSVRC2012_val_00042464.JPEG n04009552/
+mv val/ILSVRC2012_val_00042465.JPEG n03876231/
+mv val/ILSVRC2012_val_00042466.JPEG n10148035/
+mv val/ILSVRC2012_val_00042467.JPEG n03590841/
+mv val/ILSVRC2012_val_00042468.JPEG n03461385/
+mv val/ILSVRC2012_val_00042469.JPEG n02814860/
+mv val/ILSVRC2012_val_00042470.JPEG n03729826/
+mv val/ILSVRC2012_val_00042471.JPEG n03255030/
+mv val/ILSVRC2012_val_00042472.JPEG n09288635/
+mv val/ILSVRC2012_val_00042473.JPEG n02094114/
+mv val/ILSVRC2012_val_00042474.JPEG n04550184/
+mv val/ILSVRC2012_val_00042475.JPEG n02115913/
+mv val/ILSVRC2012_val_00042476.JPEG n01990800/
+mv val/ILSVRC2012_val_00042477.JPEG n02112350/
+mv val/ILSVRC2012_val_00042478.JPEG n12998815/
+mv val/ILSVRC2012_val_00042479.JPEG n02672831/
+mv val/ILSVRC2012_val_00042480.JPEG n01860187/
+mv val/ILSVRC2012_val_00042481.JPEG n04493381/
+mv val/ILSVRC2012_val_00042482.JPEG n02979186/
+mv val/ILSVRC2012_val_00042483.JPEG n02441942/
+mv val/ILSVRC2012_val_00042484.JPEG n02128757/
+mv val/ILSVRC2012_val_00042485.JPEG n01883070/
+mv val/ILSVRC2012_val_00042486.JPEG n03803284/
+mv val/ILSVRC2012_val_00042487.JPEG n03417042/
+mv val/ILSVRC2012_val_00042488.JPEG n02992211/
+mv val/ILSVRC2012_val_00042489.JPEG n04462240/
+mv val/ILSVRC2012_val_00042490.JPEG n03759954/
+mv val/ILSVRC2012_val_00042491.JPEG n01984695/
+mv val/ILSVRC2012_val_00042492.JPEG n07584110/
+mv val/ILSVRC2012_val_00042493.JPEG n04118538/
+mv val/ILSVRC2012_val_00042494.JPEG n02105412/
+mv val/ILSVRC2012_val_00042495.JPEG n03218198/
+mv val/ILSVRC2012_val_00042496.JPEG n02835271/
+mv val/ILSVRC2012_val_00042497.JPEG n03314780/
+mv val/ILSVRC2012_val_00042498.JPEG n04070727/
+mv val/ILSVRC2012_val_00042499.JPEG n03325584/
+mv val/ILSVRC2012_val_00042500.JPEG n01742172/
+mv val/ILSVRC2012_val_00042501.JPEG n04266014/
+mv val/ILSVRC2012_val_00042502.JPEG n03447447/
+mv val/ILSVRC2012_val_00042503.JPEG n02701002/
+mv val/ILSVRC2012_val_00042504.JPEG n01877812/
+mv val/ILSVRC2012_val_00042505.JPEG n03062245/
+mv val/ILSVRC2012_val_00042506.JPEG n01592084/
+mv val/ILSVRC2012_val_00042507.JPEG n01924916/
+mv val/ILSVRC2012_val_00042508.JPEG n03781244/
+mv val/ILSVRC2012_val_00042509.JPEG n01798484/
+mv val/ILSVRC2012_val_00042510.JPEG n02730930/
+mv val/ILSVRC2012_val_00042511.JPEG n02417914/
+mv val/ILSVRC2012_val_00042512.JPEG n02791124/
+mv val/ILSVRC2012_val_00042513.JPEG n02412080/
+mv val/ILSVRC2012_val_00042514.JPEG n09256479/
+mv val/ILSVRC2012_val_00042515.JPEG n04008634/
+mv val/ILSVRC2012_val_00042516.JPEG n02493793/
+mv val/ILSVRC2012_val_00042517.JPEG n07753275/
+mv val/ILSVRC2012_val_00042518.JPEG n03980874/
+mv val/ILSVRC2012_val_00042519.JPEG n02280649/
+mv val/ILSVRC2012_val_00042520.JPEG n03400231/
+mv val/ILSVRC2012_val_00042521.JPEG n03476991/
+mv val/ILSVRC2012_val_00042522.JPEG n02787622/
+mv val/ILSVRC2012_val_00042523.JPEG n02086240/
+mv val/ILSVRC2012_val_00042524.JPEG n04041544/
+mv val/ILSVRC2012_val_00042525.JPEG n04370456/
+mv val/ILSVRC2012_val_00042526.JPEG n04591713/
+mv val/ILSVRC2012_val_00042527.JPEG n03062245/
+mv val/ILSVRC2012_val_00042528.JPEG n04254120/
+mv val/ILSVRC2012_val_00042529.JPEG n02125311/
+mv val/ILSVRC2012_val_00042530.JPEG n03920288/
+mv val/ILSVRC2012_val_00042531.JPEG n02088364/
+mv val/ILSVRC2012_val_00042532.JPEG n02002724/
+mv val/ILSVRC2012_val_00042533.JPEG n02107683/
+mv val/ILSVRC2012_val_00042534.JPEG n01498041/
+mv val/ILSVRC2012_val_00042535.JPEG n04550184/
+mv val/ILSVRC2012_val_00042536.JPEG n01984695/
+mv val/ILSVRC2012_val_00042537.JPEG n04584207/
+mv val/ILSVRC2012_val_00042538.JPEG n02971356/
+mv val/ILSVRC2012_val_00042539.JPEG n03961711/
+mv val/ILSVRC2012_val_00042540.JPEG n02447366/
+mv val/ILSVRC2012_val_00042541.JPEG n01855672/
+mv val/ILSVRC2012_val_00042542.JPEG n03126707/
+mv val/ILSVRC2012_val_00042543.JPEG n03481172/
+mv val/ILSVRC2012_val_00042544.JPEG n02640242/
+mv val/ILSVRC2012_val_00042545.JPEG n03376595/
+mv val/ILSVRC2012_val_00042546.JPEG n02814860/
+mv val/ILSVRC2012_val_00042547.JPEG n01498041/
+mv val/ILSVRC2012_val_00042548.JPEG n04442312/
+mv val/ILSVRC2012_val_00042549.JPEG n03776460/
+mv val/ILSVRC2012_val_00042550.JPEG n01882714/
+mv val/ILSVRC2012_val_00042551.JPEG n04485082/
+mv val/ILSVRC2012_val_00042552.JPEG n03201208/
+mv val/ILSVRC2012_val_00042553.JPEG n01978455/
+mv val/ILSVRC2012_val_00042554.JPEG n04456115/
+mv val/ILSVRC2012_val_00042555.JPEG n03467068/
+mv val/ILSVRC2012_val_00042556.JPEG n02086240/
+mv val/ILSVRC2012_val_00042557.JPEG n02256656/
+mv val/ILSVRC2012_val_00042558.JPEG n04517823/
+mv val/ILSVRC2012_val_00042559.JPEG n03291819/
+mv val/ILSVRC2012_val_00042560.JPEG n04263257/
+mv val/ILSVRC2012_val_00042561.JPEG n02106662/
+mv val/ILSVRC2012_val_00042562.JPEG n02823750/
+mv val/ILSVRC2012_val_00042563.JPEG n03527444/
+mv val/ILSVRC2012_val_00042564.JPEG n01807496/
+mv val/ILSVRC2012_val_00042565.JPEG n02112018/
+mv val/ILSVRC2012_val_00042566.JPEG n02860847/
+mv val/ILSVRC2012_val_00042567.JPEG n01980166/
+mv val/ILSVRC2012_val_00042568.JPEG n01514859/
+mv val/ILSVRC2012_val_00042569.JPEG n02879718/
+mv val/ILSVRC2012_val_00042570.JPEG n02128925/
+mv val/ILSVRC2012_val_00042571.JPEG n03944341/
+mv val/ILSVRC2012_val_00042572.JPEG n07831146/
+mv val/ILSVRC2012_val_00042573.JPEG n04049303/
+mv val/ILSVRC2012_val_00042574.JPEG n04004767/
+mv val/ILSVRC2012_val_00042575.JPEG n04254120/
+mv val/ILSVRC2012_val_00042576.JPEG n02108422/
+mv val/ILSVRC2012_val_00042577.JPEG n07871810/
+mv val/ILSVRC2012_val_00042578.JPEG n01775062/
+mv val/ILSVRC2012_val_00042579.JPEG n02808304/
+mv val/ILSVRC2012_val_00042580.JPEG n03929660/
+mv val/ILSVRC2012_val_00042581.JPEG n02667093/
+mv val/ILSVRC2012_val_00042582.JPEG n07716906/
+mv val/ILSVRC2012_val_00042583.JPEG n03697007/
+mv val/ILSVRC2012_val_00042584.JPEG n12057211/
+mv val/ILSVRC2012_val_00042585.JPEG n03196217/
+mv val/ILSVRC2012_val_00042586.JPEG n01855032/
+mv val/ILSVRC2012_val_00042587.JPEG n02097047/
+mv val/ILSVRC2012_val_00042588.JPEG n02444819/
+mv val/ILSVRC2012_val_00042589.JPEG n07711569/
+mv val/ILSVRC2012_val_00042590.JPEG n02071294/
+mv val/ILSVRC2012_val_00042591.JPEG n06596364/
+mv val/ILSVRC2012_val_00042592.JPEG n03584829/
+mv val/ILSVRC2012_val_00042593.JPEG n02025239/
+mv val/ILSVRC2012_val_00042594.JPEG n09256479/
+mv val/ILSVRC2012_val_00042595.JPEG n02484975/
+mv val/ILSVRC2012_val_00042596.JPEG n02840245/
+mv val/ILSVRC2012_val_00042597.JPEG n02814533/
+mv val/ILSVRC2012_val_00042598.JPEG n03188531/
+mv val/ILSVRC2012_val_00042599.JPEG n03891332/
+mv val/ILSVRC2012_val_00042600.JPEG n01560419/
+mv val/ILSVRC2012_val_00042601.JPEG n02110185/
+mv val/ILSVRC2012_val_00042602.JPEG n01685808/
+mv val/ILSVRC2012_val_00042603.JPEG n03207941/
+mv val/ILSVRC2012_val_00042604.JPEG n02096294/
+mv val/ILSVRC2012_val_00042605.JPEG n02672831/
+mv val/ILSVRC2012_val_00042606.JPEG n04311004/
+mv val/ILSVRC2012_val_00042607.JPEG n04265275/
+mv val/ILSVRC2012_val_00042608.JPEG n07730033/
+mv val/ILSVRC2012_val_00042609.JPEG n04296562/
+mv val/ILSVRC2012_val_00042610.JPEG n02167151/
+mv val/ILSVRC2012_val_00042611.JPEG n02110341/
+mv val/ILSVRC2012_val_00042612.JPEG n03832673/
+mv val/ILSVRC2012_val_00042613.JPEG n03709823/
+mv val/ILSVRC2012_val_00042614.JPEG n02115641/
+mv val/ILSVRC2012_val_00042615.JPEG n02510455/
+mv val/ILSVRC2012_val_00042616.JPEG n04325704/
+mv val/ILSVRC2012_val_00042617.JPEG n02129604/
+mv val/ILSVRC2012_val_00042618.JPEG n04296562/
+mv val/ILSVRC2012_val_00042619.JPEG n13037406/
+mv val/ILSVRC2012_val_00042620.JPEG n04554684/
+mv val/ILSVRC2012_val_00042621.JPEG n03706229/
+mv val/ILSVRC2012_val_00042622.JPEG n02500267/
+mv val/ILSVRC2012_val_00042623.JPEG n02101388/
+mv val/ILSVRC2012_val_00042624.JPEG n02206856/
+mv val/ILSVRC2012_val_00042625.JPEG n02111889/
+mv val/ILSVRC2012_val_00042626.JPEG n04442312/
+mv val/ILSVRC2012_val_00042627.JPEG n02102973/
+mv val/ILSVRC2012_val_00042628.JPEG n02098105/
+mv val/ILSVRC2012_val_00042629.JPEG n02906734/
+mv val/ILSVRC2012_val_00042630.JPEG n01770081/
+mv val/ILSVRC2012_val_00042631.JPEG n13054560/
+mv val/ILSVRC2012_val_00042632.JPEG n04325704/
+mv val/ILSVRC2012_val_00042633.JPEG n02909870/
+mv val/ILSVRC2012_val_00042634.JPEG n02927161/
+mv val/ILSVRC2012_val_00042635.JPEG n03976467/
+mv val/ILSVRC2012_val_00042636.JPEG n03014705/
+mv val/ILSVRC2012_val_00042637.JPEG n02483362/
+mv val/ILSVRC2012_val_00042638.JPEG n02012849/
+mv val/ILSVRC2012_val_00042639.JPEG n02321529/
+mv val/ILSVRC2012_val_00042640.JPEG n03841143/
+mv val/ILSVRC2012_val_00042641.JPEG n04389033/
+mv val/ILSVRC2012_val_00042642.JPEG n02094258/
+mv val/ILSVRC2012_val_00042643.JPEG n15075141/
+mv val/ILSVRC2012_val_00042644.JPEG n03733805/
+mv val/ILSVRC2012_val_00042645.JPEG n03958227/
+mv val/ILSVRC2012_val_00042646.JPEG n03792972/
+mv val/ILSVRC2012_val_00042647.JPEG n04542943/
+mv val/ILSVRC2012_val_00042648.JPEG n02979186/
+mv val/ILSVRC2012_val_00042649.JPEG n07614500/
+mv val/ILSVRC2012_val_00042650.JPEG n03666591/
+mv val/ILSVRC2012_val_00042651.JPEG n03929855/
+mv val/ILSVRC2012_val_00042652.JPEG n07802026/
+mv val/ILSVRC2012_val_00042653.JPEG n02974003/
+mv val/ILSVRC2012_val_00042654.JPEG n02319095/
+mv val/ILSVRC2012_val_00042655.JPEG n02804414/
+mv val/ILSVRC2012_val_00042656.JPEG n04325704/
+mv val/ILSVRC2012_val_00042657.JPEG n02109525/
+mv val/ILSVRC2012_val_00042658.JPEG n02999410/
+mv val/ILSVRC2012_val_00042659.JPEG n02120079/
+mv val/ILSVRC2012_val_00042660.JPEG n04404412/
+mv val/ILSVRC2012_val_00042661.JPEG n01871265/
+mv val/ILSVRC2012_val_00042662.JPEG n03871628/
+mv val/ILSVRC2012_val_00042663.JPEG n03337140/
+mv val/ILSVRC2012_val_00042664.JPEG n01667778/
+mv val/ILSVRC2012_val_00042665.JPEG n01819313/
+mv val/ILSVRC2012_val_00042666.JPEG n04532670/
+mv val/ILSVRC2012_val_00042667.JPEG n02319095/
+mv val/ILSVRC2012_val_00042668.JPEG n03457902/
+mv val/ILSVRC2012_val_00042669.JPEG n02978881/
+mv val/ILSVRC2012_val_00042670.JPEG n02119789/
+mv val/ILSVRC2012_val_00042671.JPEG n04026417/
+mv val/ILSVRC2012_val_00042672.JPEG n01693334/
+mv val/ILSVRC2012_val_00042673.JPEG n01744401/
+mv val/ILSVRC2012_val_00042674.JPEG n03825788/
+mv val/ILSVRC2012_val_00042675.JPEG n04273569/
+mv val/ILSVRC2012_val_00042676.JPEG n03942813/
+mv val/ILSVRC2012_val_00042677.JPEG n01984695/
+mv val/ILSVRC2012_val_00042678.JPEG n02727426/
+mv val/ILSVRC2012_val_00042679.JPEG n01820546/
+mv val/ILSVRC2012_val_00042680.JPEG n04487081/
+mv val/ILSVRC2012_val_00042681.JPEG n03956157/
+mv val/ILSVRC2012_val_00042682.JPEG n04465501/
+mv val/ILSVRC2012_val_00042683.JPEG n04579145/
+mv val/ILSVRC2012_val_00042684.JPEG n02117135/
+mv val/ILSVRC2012_val_00042685.JPEG n04447861/
+mv val/ILSVRC2012_val_00042686.JPEG n03085013/
+mv val/ILSVRC2012_val_00042687.JPEG n02134084/
+mv val/ILSVRC2012_val_00042688.JPEG n03769881/
+mv val/ILSVRC2012_val_00042689.JPEG n03717622/
+mv val/ILSVRC2012_val_00042690.JPEG n02105251/
+mv val/ILSVRC2012_val_00042691.JPEG n03761084/
+mv val/ILSVRC2012_val_00042692.JPEG n02088466/
+mv val/ILSVRC2012_val_00042693.JPEG n01872401/
+mv val/ILSVRC2012_val_00042694.JPEG n02807133/
+mv val/ILSVRC2012_val_00042695.JPEG n03775546/
+mv val/ILSVRC2012_val_00042696.JPEG n03590841/
+mv val/ILSVRC2012_val_00042697.JPEG n03617480/
+mv val/ILSVRC2012_val_00042698.JPEG n01677366/
+mv val/ILSVRC2012_val_00042699.JPEG n02119789/
+mv val/ILSVRC2012_val_00042700.JPEG n02226429/
+mv val/ILSVRC2012_val_00042701.JPEG n04409515/
+mv val/ILSVRC2012_val_00042702.JPEG n03995372/
+mv val/ILSVRC2012_val_00042703.JPEG n02013706/
+mv val/ILSVRC2012_val_00042704.JPEG n07697537/
+mv val/ILSVRC2012_val_00042705.JPEG n02025239/
+mv val/ILSVRC2012_val_00042706.JPEG n02114712/
+mv val/ILSVRC2012_val_00042707.JPEG n03394916/
+mv val/ILSVRC2012_val_00042708.JPEG n02494079/
+mv val/ILSVRC2012_val_00042709.JPEG n01968897/
+mv val/ILSVRC2012_val_00042710.JPEG n03977966/
+mv val/ILSVRC2012_val_00042711.JPEG n11879895/
+mv val/ILSVRC2012_val_00042712.JPEG n03492542/
+mv val/ILSVRC2012_val_00042713.JPEG n03843555/
+mv val/ILSVRC2012_val_00042714.JPEG n03742115/
+mv val/ILSVRC2012_val_00042715.JPEG n04208210/
+mv val/ILSVRC2012_val_00042716.JPEG n02423022/
+mv val/ILSVRC2012_val_00042717.JPEG n04515003/
+mv val/ILSVRC2012_val_00042718.JPEG n13054560/
+mv val/ILSVRC2012_val_00042719.JPEG n02483708/
+mv val/ILSVRC2012_val_00042720.JPEG n04507155/
+mv val/ILSVRC2012_val_00042721.JPEG n07717410/
+mv val/ILSVRC2012_val_00042722.JPEG n03255030/
+mv val/ILSVRC2012_val_00042723.JPEG n03133878/
+mv val/ILSVRC2012_val_00042724.JPEG n03877845/
+mv val/ILSVRC2012_val_00042725.JPEG n04344873/
+mv val/ILSVRC2012_val_00042726.JPEG n04540053/
+mv val/ILSVRC2012_val_00042727.JPEG n09399592/
+mv val/ILSVRC2012_val_00042728.JPEG n04517823/
+mv val/ILSVRC2012_val_00042729.JPEG n04086273/
+mv val/ILSVRC2012_val_00042730.JPEG n02978881/
+mv val/ILSVRC2012_val_00042731.JPEG n02115641/
+mv val/ILSVRC2012_val_00042732.JPEG n04461696/
+mv val/ILSVRC2012_val_00042733.JPEG n02102973/
+mv val/ILSVRC2012_val_00042734.JPEG n02277742/
+mv val/ILSVRC2012_val_00042735.JPEG n04399382/
+mv val/ILSVRC2012_val_00042736.JPEG n04330267/
+mv val/ILSVRC2012_val_00042737.JPEG n03661043/
+mv val/ILSVRC2012_val_00042738.JPEG n13037406/
+mv val/ILSVRC2012_val_00042739.JPEG n04604644/
+mv val/ILSVRC2012_val_00042740.JPEG n03958227/
+mv val/ILSVRC2012_val_00042741.JPEG n02397096/
+mv val/ILSVRC2012_val_00042742.JPEG n04125021/
+mv val/ILSVRC2012_val_00042743.JPEG n03445924/
+mv val/ILSVRC2012_val_00042744.JPEG n03492542/
+mv val/ILSVRC2012_val_00042745.JPEG n02092339/
+mv val/ILSVRC2012_val_00042746.JPEG n03787032/
+mv val/ILSVRC2012_val_00042747.JPEG n03791053/
+mv val/ILSVRC2012_val_00042748.JPEG n02804414/
+mv val/ILSVRC2012_val_00042749.JPEG n01753488/
+mv val/ILSVRC2012_val_00042750.JPEG n07754684/
+mv val/ILSVRC2012_val_00042751.JPEG n01496331/
+mv val/ILSVRC2012_val_00042752.JPEG n01990800/
+mv val/ILSVRC2012_val_00042753.JPEG n04356056/
+mv val/ILSVRC2012_val_00042754.JPEG n04065272/
+mv val/ILSVRC2012_val_00042755.JPEG n01756291/
+mv val/ILSVRC2012_val_00042756.JPEG n04136333/
+mv val/ILSVRC2012_val_00042757.JPEG n03662601/
+mv val/ILSVRC2012_val_00042758.JPEG n02006656/
+mv val/ILSVRC2012_val_00042759.JPEG n02326432/
+mv val/ILSVRC2012_val_00042760.JPEG n02018795/
+mv val/ILSVRC2012_val_00042761.JPEG n03777568/
+mv val/ILSVRC2012_val_00042762.JPEG n07932039/
+mv val/ILSVRC2012_val_00042763.JPEG n04265275/
+mv val/ILSVRC2012_val_00042764.JPEG n02268853/
+mv val/ILSVRC2012_val_00042765.JPEG n03649909/
+mv val/ILSVRC2012_val_00042766.JPEG n04548362/
+mv val/ILSVRC2012_val_00042767.JPEG n03538406/
+mv val/ILSVRC2012_val_00042768.JPEG n02104365/
+mv val/ILSVRC2012_val_00042769.JPEG n03062245/
+mv val/ILSVRC2012_val_00042770.JPEG n04131690/
+mv val/ILSVRC2012_val_00042771.JPEG n01955084/
+mv val/ILSVRC2012_val_00042772.JPEG n04606251/
+mv val/ILSVRC2012_val_00042773.JPEG n04037443/
+mv val/ILSVRC2012_val_00042774.JPEG n01990800/
+mv val/ILSVRC2012_val_00042775.JPEG n02892767/
+mv val/ILSVRC2012_val_00042776.JPEG n02113023/
+mv val/ILSVRC2012_val_00042777.JPEG n03873416/
+mv val/ILSVRC2012_val_00042778.JPEG n04254680/
+mv val/ILSVRC2012_val_00042779.JPEG n02444819/
+mv val/ILSVRC2012_val_00042780.JPEG n04606251/
+mv val/ILSVRC2012_val_00042781.JPEG n02091032/
+mv val/ILSVRC2012_val_00042782.JPEG n03623198/
+mv val/ILSVRC2012_val_00042783.JPEG n01693334/
+mv val/ILSVRC2012_val_00042784.JPEG n04162706/
+mv val/ILSVRC2012_val_00042785.JPEG n04476259/
+mv val/ILSVRC2012_val_00042786.JPEG n01773157/
+mv val/ILSVRC2012_val_00042787.JPEG n02510455/
+mv val/ILSVRC2012_val_00042788.JPEG n01616318/
+mv val/ILSVRC2012_val_00042789.JPEG n02782093/
+mv val/ILSVRC2012_val_00042790.JPEG n04209133/
+mv val/ILSVRC2012_val_00042791.JPEG n03777568/
+mv val/ILSVRC2012_val_00042792.JPEG n12998815/
+mv val/ILSVRC2012_val_00042793.JPEG n04417672/
+mv val/ILSVRC2012_val_00042794.JPEG n12620546/
+mv val/ILSVRC2012_val_00042795.JPEG n04517823/
+mv val/ILSVRC2012_val_00042796.JPEG n02259212/
+mv val/ILSVRC2012_val_00042797.JPEG n02727426/
+mv val/ILSVRC2012_val_00042798.JPEG n02797295/
+mv val/ILSVRC2012_val_00042799.JPEG n03062245/
+mv val/ILSVRC2012_val_00042800.JPEG n02794156/
+mv val/ILSVRC2012_val_00042801.JPEG n04347754/
+mv val/ILSVRC2012_val_00042802.JPEG n03417042/
+mv val/ILSVRC2012_val_00042803.JPEG n02123159/
+mv val/ILSVRC2012_val_00042804.JPEG n03530642/
+mv val/ILSVRC2012_val_00042805.JPEG n07715103/
+mv val/ILSVRC2012_val_00042806.JPEG n07716906/
+mv val/ILSVRC2012_val_00042807.JPEG n03874599/
+mv val/ILSVRC2012_val_00042808.JPEG n04179913/
+mv val/ILSVRC2012_val_00042809.JPEG n01877812/
+mv val/ILSVRC2012_val_00042810.JPEG n02101388/
+mv val/ILSVRC2012_val_00042811.JPEG n02233338/
+mv val/ILSVRC2012_val_00042812.JPEG n04141327/
+mv val/ILSVRC2012_val_00042813.JPEG n02666196/
+mv val/ILSVRC2012_val_00042814.JPEG n04131690/
+mv val/ILSVRC2012_val_00042815.JPEG n03032252/
+mv val/ILSVRC2012_val_00042816.JPEG n02114367/
+mv val/ILSVRC2012_val_00042817.JPEG n03045698/
+mv val/ILSVRC2012_val_00042818.JPEG n02090721/
+mv val/ILSVRC2012_val_00042819.JPEG n02815834/
+mv val/ILSVRC2012_val_00042820.JPEG n07873807/
+mv val/ILSVRC2012_val_00042821.JPEG n02965783/
+mv val/ILSVRC2012_val_00042822.JPEG n04429376/
+mv val/ILSVRC2012_val_00042823.JPEG n04604644/
+mv val/ILSVRC2012_val_00042824.JPEG n01855032/
+mv val/ILSVRC2012_val_00042825.JPEG n02018795/
+mv val/ILSVRC2012_val_00042826.JPEG n03729826/
+mv val/ILSVRC2012_val_00042827.JPEG n04404412/
+mv val/ILSVRC2012_val_00042828.JPEG n07615774/
+mv val/ILSVRC2012_val_00042829.JPEG n02013706/
+mv val/ILSVRC2012_val_00042830.JPEG n01955084/
+mv val/ILSVRC2012_val_00042831.JPEG n01774750/
+mv val/ILSVRC2012_val_00042832.JPEG n01644373/
+mv val/ILSVRC2012_val_00042833.JPEG n02096177/
+mv val/ILSVRC2012_val_00042834.JPEG n02114712/
+mv val/ILSVRC2012_val_00042835.JPEG n03891332/
+mv val/ILSVRC2012_val_00042836.JPEG n03482405/
+mv val/ILSVRC2012_val_00042837.JPEG n03916031/
+mv val/ILSVRC2012_val_00042838.JPEG n02099849/
+mv val/ILSVRC2012_val_00042839.JPEG n02480855/
+mv val/ILSVRC2012_val_00042840.JPEG n13044778/
+mv val/ILSVRC2012_val_00042841.JPEG n02226429/
+mv val/ILSVRC2012_val_00042842.JPEG n03670208/
+mv val/ILSVRC2012_val_00042843.JPEG n13133613/
+mv val/ILSVRC2012_val_00042844.JPEG n03670208/
+mv val/ILSVRC2012_val_00042845.JPEG n04125021/
+mv val/ILSVRC2012_val_00042846.JPEG n02276258/
+mv val/ILSVRC2012_val_00042847.JPEG n03131574/
+mv val/ILSVRC2012_val_00042848.JPEG n03929855/
+mv val/ILSVRC2012_val_00042849.JPEG n02687172/
+mv val/ILSVRC2012_val_00042850.JPEG n02443484/
+mv val/ILSVRC2012_val_00042851.JPEG n02101006/
+mv val/ILSVRC2012_val_00042852.JPEG n04367480/
+mv val/ILSVRC2012_val_00042853.JPEG n02109525/
+mv val/ILSVRC2012_val_00042854.JPEG n04049303/
+mv val/ILSVRC2012_val_00042855.JPEG n02096051/
+mv val/ILSVRC2012_val_00042856.JPEG n03929660/
+mv val/ILSVRC2012_val_00042857.JPEG n02776631/
+mv val/ILSVRC2012_val_00042858.JPEG n02027492/
+mv val/ILSVRC2012_val_00042859.JPEG n01795545/
+mv val/ILSVRC2012_val_00042860.JPEG n02109525/
+mv val/ILSVRC2012_val_00042861.JPEG n03584829/
+mv val/ILSVRC2012_val_00042862.JPEG n03595614/
+mv val/ILSVRC2012_val_00042863.JPEG n02992211/
+mv val/ILSVRC2012_val_00042864.JPEG n04243546/
+mv val/ILSVRC2012_val_00042865.JPEG n03404251/
+mv val/ILSVRC2012_val_00042866.JPEG n04023962/
+mv val/ILSVRC2012_val_00042867.JPEG n03085013/
+mv val/ILSVRC2012_val_00042868.JPEG n02128385/
+mv val/ILSVRC2012_val_00042869.JPEG n02111129/
+mv val/ILSVRC2012_val_00042870.JPEG n04613696/
+mv val/ILSVRC2012_val_00042871.JPEG n04152593/
+mv val/ILSVRC2012_val_00042872.JPEG n02978881/
+mv val/ILSVRC2012_val_00042873.JPEG n02909870/
+mv val/ILSVRC2012_val_00042874.JPEG n10565667/
+mv val/ILSVRC2012_val_00042875.JPEG n03467068/
+mv val/ILSVRC2012_val_00042876.JPEG n02280649/
+mv val/ILSVRC2012_val_00042877.JPEG n03763968/
+mv val/ILSVRC2012_val_00042878.JPEG n02056570/
+mv val/ILSVRC2012_val_00042879.JPEG n02504458/
+mv val/ILSVRC2012_val_00042880.JPEG n03958227/
+mv val/ILSVRC2012_val_00042881.JPEG n03874599/
+mv val/ILSVRC2012_val_00042882.JPEG n02133161/
+mv val/ILSVRC2012_val_00042883.JPEG n03871628/
+mv val/ILSVRC2012_val_00042884.JPEG n02099849/
+mv val/ILSVRC2012_val_00042885.JPEG n03179701/
+mv val/ILSVRC2012_val_00042886.JPEG n01985128/
+mv val/ILSVRC2012_val_00042887.JPEG n02112137/
+mv val/ILSVRC2012_val_00042888.JPEG n02098413/
+mv val/ILSVRC2012_val_00042889.JPEG n01945685/
+mv val/ILSVRC2012_val_00042890.JPEG n02105505/
+mv val/ILSVRC2012_val_00042891.JPEG n03796401/
+mv val/ILSVRC2012_val_00042892.JPEG n04152593/
+mv val/ILSVRC2012_val_00042893.JPEG n02410509/
+mv val/ILSVRC2012_val_00042894.JPEG n01665541/
+mv val/ILSVRC2012_val_00042895.JPEG n04147183/
+mv val/ILSVRC2012_val_00042896.JPEG n02655020/
+mv val/ILSVRC2012_val_00042897.JPEG n02233338/
+mv val/ILSVRC2012_val_00042898.JPEG n03297495/
+mv val/ILSVRC2012_val_00042899.JPEG n01776313/
+mv val/ILSVRC2012_val_00042900.JPEG n01945685/
+mv val/ILSVRC2012_val_00042901.JPEG n03710193/
+mv val/ILSVRC2012_val_00042902.JPEG n04462240/
+mv val/ILSVRC2012_val_00042903.JPEG n03956157/
+mv val/ILSVRC2012_val_00042904.JPEG n02229544/
+mv val/ILSVRC2012_val_00042905.JPEG n02782093/
+mv val/ILSVRC2012_val_00042906.JPEG n04355338/
+mv val/ILSVRC2012_val_00042907.JPEG n03000684/
+mv val/ILSVRC2012_val_00042908.JPEG n04542943/
+mv val/ILSVRC2012_val_00042909.JPEG n02111277/
+mv val/ILSVRC2012_val_00042910.JPEG n04505470/
+mv val/ILSVRC2012_val_00042911.JPEG n03196217/
+mv val/ILSVRC2012_val_00042912.JPEG n02112706/
+mv val/ILSVRC2012_val_00042913.JPEG n03590841/
+mv val/ILSVRC2012_val_00042914.JPEG n03197337/
+mv val/ILSVRC2012_val_00042915.JPEG n02526121/
+mv val/ILSVRC2012_val_00042916.JPEG n04522168/
+mv val/ILSVRC2012_val_00042917.JPEG n01877812/
+mv val/ILSVRC2012_val_00042918.JPEG n03617480/
+mv val/ILSVRC2012_val_00042919.JPEG n02870880/
+mv val/ILSVRC2012_val_00042920.JPEG n04591713/
+mv val/ILSVRC2012_val_00042921.JPEG n06359193/
+mv val/ILSVRC2012_val_00042922.JPEG n02110958/
+mv val/ILSVRC2012_val_00042923.JPEG n07892512/
+mv val/ILSVRC2012_val_00042924.JPEG n03796401/
+mv val/ILSVRC2012_val_00042925.JPEG n03047690/
+mv val/ILSVRC2012_val_00042926.JPEG n01518878/
+mv val/ILSVRC2012_val_00042927.JPEG n04263257/
+mv val/ILSVRC2012_val_00042928.JPEG n01910747/
+mv val/ILSVRC2012_val_00042929.JPEG n07753275/
+mv val/ILSVRC2012_val_00042930.JPEG n01882714/
+mv val/ILSVRC2012_val_00042931.JPEG n04033901/
+mv val/ILSVRC2012_val_00042932.JPEG n01784675/
+mv val/ILSVRC2012_val_00042933.JPEG n02489166/
+mv val/ILSVRC2012_val_00042934.JPEG n03534580/
+mv val/ILSVRC2012_val_00042935.JPEG n04447861/
+mv val/ILSVRC2012_val_00042936.JPEG n02403003/
+mv val/ILSVRC2012_val_00042937.JPEG n07717556/
+mv val/ILSVRC2012_val_00042938.JPEG n02027492/
+mv val/ILSVRC2012_val_00042939.JPEG n03710721/
+mv val/ILSVRC2012_val_00042940.JPEG n02281787/
+mv val/ILSVRC2012_val_00042941.JPEG n02807133/
+mv val/ILSVRC2012_val_00042942.JPEG n03124170/
+mv val/ILSVRC2012_val_00042943.JPEG n02396427/
+mv val/ILSVRC2012_val_00042944.JPEG n02981792/
+mv val/ILSVRC2012_val_00042945.JPEG n04613696/
+mv val/ILSVRC2012_val_00042946.JPEG n02481823/
+mv val/ILSVRC2012_val_00042947.JPEG n04522168/
+mv val/ILSVRC2012_val_00042948.JPEG n03930313/
+mv val/ILSVRC2012_val_00042949.JPEG n10565667/
+mv val/ILSVRC2012_val_00042950.JPEG n03776460/
+mv val/ILSVRC2012_val_00042951.JPEG n03180011/
+mv val/ILSVRC2012_val_00042952.JPEG n04235860/
+mv val/ILSVRC2012_val_00042953.JPEG n02397096/
+mv val/ILSVRC2012_val_00042954.JPEG n03016953/
+mv val/ILSVRC2012_val_00042955.JPEG n03838899/
+mv val/ILSVRC2012_val_00042956.JPEG n09193705/
+mv val/ILSVRC2012_val_00042957.JPEG n04404412/
+mv val/ILSVRC2012_val_00042958.JPEG n04336792/
+mv val/ILSVRC2012_val_00042959.JPEG n02978881/
+mv val/ILSVRC2012_val_00042960.JPEG n07720875/
+mv val/ILSVRC2012_val_00042961.JPEG n04286575/
+mv val/ILSVRC2012_val_00042962.JPEG n12985857/
+mv val/ILSVRC2012_val_00042963.JPEG n07613480/
+mv val/ILSVRC2012_val_00042964.JPEG n03063689/
+mv val/ILSVRC2012_val_00042965.JPEG n02206856/
+mv val/ILSVRC2012_val_00042966.JPEG n02011460/
+mv val/ILSVRC2012_val_00042967.JPEG n02769748/
+mv val/ILSVRC2012_val_00042968.JPEG n02317335/
+mv val/ILSVRC2012_val_00042969.JPEG n02749479/
+mv val/ILSVRC2012_val_00042970.JPEG n01770081/
+mv val/ILSVRC2012_val_00042971.JPEG n02422699/
+mv val/ILSVRC2012_val_00042972.JPEG n02088094/
+mv val/ILSVRC2012_val_00042973.JPEG n02906734/
+mv val/ILSVRC2012_val_00042974.JPEG n06785654/
+mv val/ILSVRC2012_val_00042975.JPEG n04152593/
+mv val/ILSVRC2012_val_00042976.JPEG n03916031/
+mv val/ILSVRC2012_val_00042977.JPEG n02113186/
+mv val/ILSVRC2012_val_00042978.JPEG n02115913/
+mv val/ILSVRC2012_val_00042979.JPEG n02791124/
+mv val/ILSVRC2012_val_00042980.JPEG n03764736/
+mv val/ILSVRC2012_val_00042981.JPEG n02356798/
+mv val/ILSVRC2012_val_00042982.JPEG n02979186/
+mv val/ILSVRC2012_val_00042983.JPEG n02749479/
+mv val/ILSVRC2012_val_00042984.JPEG n03630383/
+mv val/ILSVRC2012_val_00042985.JPEG n03259280/
+mv val/ILSVRC2012_val_00042986.JPEG n04023962/
+mv val/ILSVRC2012_val_00042987.JPEG n04026417/
+mv val/ILSVRC2012_val_00042988.JPEG n02909870/
+mv val/ILSVRC2012_val_00042989.JPEG n03404251/
+mv val/ILSVRC2012_val_00042990.JPEG n03868863/
+mv val/ILSVRC2012_val_00042991.JPEG n03495258/
+mv val/ILSVRC2012_val_00042992.JPEG n03899768/
+mv val/ILSVRC2012_val_00042993.JPEG n03733805/
+mv val/ILSVRC2012_val_00042994.JPEG n02823750/
+mv val/ILSVRC2012_val_00042995.JPEG n02086079/
+mv val/ILSVRC2012_val_00042996.JPEG n04356056/
+mv val/ILSVRC2012_val_00042997.JPEG n03196217/
+mv val/ILSVRC2012_val_00042998.JPEG n01806143/
+mv val/ILSVRC2012_val_00042999.JPEG n07718472/
+mv val/ILSVRC2012_val_00043000.JPEG n04335435/
+mv val/ILSVRC2012_val_00043001.JPEG n03937543/
+mv val/ILSVRC2012_val_00043002.JPEG n04070727/
+mv val/ILSVRC2012_val_00043003.JPEG n01631663/
+mv val/ILSVRC2012_val_00043004.JPEG n02643566/
+mv val/ILSVRC2012_val_00043005.JPEG n11879895/
+mv val/ILSVRC2012_val_00043006.JPEG n03690938/
+mv val/ILSVRC2012_val_00043007.JPEG n02093428/
+mv val/ILSVRC2012_val_00043008.JPEG n02105641/
+mv val/ILSVRC2012_val_00043009.JPEG n02091134/
+mv val/ILSVRC2012_val_00043010.JPEG n03131574/
+mv val/ILSVRC2012_val_00043011.JPEG n03485407/
+mv val/ILSVRC2012_val_00043012.JPEG n01677366/
+mv val/ILSVRC2012_val_00043013.JPEG n02099601/
+mv val/ILSVRC2012_val_00043014.JPEG n02123045/
+mv val/ILSVRC2012_val_00043015.JPEG n02443114/
+mv val/ILSVRC2012_val_00043016.JPEG n02134418/
+mv val/ILSVRC2012_val_00043017.JPEG n04370456/
+mv val/ILSVRC2012_val_00043018.JPEG n01883070/
+mv val/ILSVRC2012_val_00043019.JPEG n04141076/
+mv val/ILSVRC2012_val_00043020.JPEG n03467068/
+mv val/ILSVRC2012_val_00043021.JPEG n02105162/
+mv val/ILSVRC2012_val_00043022.JPEG n02226429/
+mv val/ILSVRC2012_val_00043023.JPEG n02397096/
+mv val/ILSVRC2012_val_00043024.JPEG n02692877/
+mv val/ILSVRC2012_val_00043025.JPEG n02447366/
+mv val/ILSVRC2012_val_00043026.JPEG n13037406/
+mv val/ILSVRC2012_val_00043027.JPEG n09332890/
+mv val/ILSVRC2012_val_00043028.JPEG n04482393/
+mv val/ILSVRC2012_val_00043029.JPEG n03877845/
+mv val/ILSVRC2012_val_00043030.JPEG n02102480/
+mv val/ILSVRC2012_val_00043031.JPEG n10565667/
+mv val/ILSVRC2012_val_00043032.JPEG n02791270/
+mv val/ILSVRC2012_val_00043033.JPEG n02669723/
+mv val/ILSVRC2012_val_00043034.JPEG n02808304/
+mv val/ILSVRC2012_val_00043035.JPEG n04548362/
+mv val/ILSVRC2012_val_00043036.JPEG n03658185/
+mv val/ILSVRC2012_val_00043037.JPEG n02489166/
+mv val/ILSVRC2012_val_00043038.JPEG n02098286/
+mv val/ILSVRC2012_val_00043039.JPEG n07615774/
+mv val/ILSVRC2012_val_00043040.JPEG n04532106/
+mv val/ILSVRC2012_val_00043041.JPEG n01807496/
+mv val/ILSVRC2012_val_00043042.JPEG n02992529/
+mv val/ILSVRC2012_val_00043043.JPEG n01694178/
+mv val/ILSVRC2012_val_00043044.JPEG n04428191/
+mv val/ILSVRC2012_val_00043045.JPEG n03445924/
+mv val/ILSVRC2012_val_00043046.JPEG n07742313/
+mv val/ILSVRC2012_val_00043047.JPEG n04037443/
+mv val/ILSVRC2012_val_00043048.JPEG n03887697/
+mv val/ILSVRC2012_val_00043049.JPEG n01630670/
+mv val/ILSVRC2012_val_00043050.JPEG n02099267/
+mv val/ILSVRC2012_val_00043051.JPEG n02123597/
+mv val/ILSVRC2012_val_00043052.JPEG n01981276/
+mv val/ILSVRC2012_val_00043053.JPEG n02825657/
+mv val/ILSVRC2012_val_00043054.JPEG n02106662/
+mv val/ILSVRC2012_val_00043055.JPEG n03657121/
+mv val/ILSVRC2012_val_00043056.JPEG n03249569/
+mv val/ILSVRC2012_val_00043057.JPEG n03218198/
+mv val/ILSVRC2012_val_00043058.JPEG n04152593/
+mv val/ILSVRC2012_val_00043059.JPEG n12985857/
+mv val/ILSVRC2012_val_00043060.JPEG n03160309/
+mv val/ILSVRC2012_val_00043061.JPEG n02939185/
+mv val/ILSVRC2012_val_00043062.JPEG n01817953/
+mv val/ILSVRC2012_val_00043063.JPEG n01773157/
+mv val/ILSVRC2012_val_00043064.JPEG n02999410/
+mv val/ILSVRC2012_val_00043065.JPEG n03482405/
+mv val/ILSVRC2012_val_00043066.JPEG n04200800/
+mv val/ILSVRC2012_val_00043067.JPEG n02488702/
+mv val/ILSVRC2012_val_00043068.JPEG n03272562/
+mv val/ILSVRC2012_val_00043069.JPEG n03992509/
+mv val/ILSVRC2012_val_00043070.JPEG n03544143/
+mv val/ILSVRC2012_val_00043071.JPEG n04141327/
+mv val/ILSVRC2012_val_00043072.JPEG n02099712/
+mv val/ILSVRC2012_val_00043073.JPEG n03016953/
+mv val/ILSVRC2012_val_00043074.JPEG n02107142/
+mv val/ILSVRC2012_val_00043075.JPEG n01751748/
+mv val/ILSVRC2012_val_00043076.JPEG n02009912/
+mv val/ILSVRC2012_val_00043077.JPEG n02087394/
+mv val/ILSVRC2012_val_00043078.JPEG n04355933/
+mv val/ILSVRC2012_val_00043079.JPEG n02117135/
+mv val/ILSVRC2012_val_00043080.JPEG n13054560/
+mv val/ILSVRC2012_val_00043081.JPEG n02006656/
+mv val/ILSVRC2012_val_00043082.JPEG n03733805/
+mv val/ILSVRC2012_val_00043083.JPEG n03710193/
+mv val/ILSVRC2012_val_00043084.JPEG n04141076/
+mv val/ILSVRC2012_val_00043085.JPEG n01608432/
+mv val/ILSVRC2012_val_00043086.JPEG n09835506/
+mv val/ILSVRC2012_val_00043087.JPEG n04398044/
+mv val/ILSVRC2012_val_00043088.JPEG n07579787/
+mv val/ILSVRC2012_val_00043089.JPEG n02099712/
+mv val/ILSVRC2012_val_00043090.JPEG n02123597/
+mv val/ILSVRC2012_val_00043091.JPEG n07836838/
+mv val/ILSVRC2012_val_00043092.JPEG n04131690/
+mv val/ILSVRC2012_val_00043093.JPEG n04090263/
+mv val/ILSVRC2012_val_00043094.JPEG n02981792/
+mv val/ILSVRC2012_val_00043095.JPEG n02018795/
+mv val/ILSVRC2012_val_00043096.JPEG n03602883/
+mv val/ILSVRC2012_val_00043097.JPEG n02074367/
+mv val/ILSVRC2012_val_00043098.JPEG n02443484/
+mv val/ILSVRC2012_val_00043099.JPEG n02871525/
+mv val/ILSVRC2012_val_00043100.JPEG n02457408/
+mv val/ILSVRC2012_val_00043101.JPEG n02799071/
+mv val/ILSVRC2012_val_00043102.JPEG n03764736/
+mv val/ILSVRC2012_val_00043103.JPEG n03804744/
+mv val/ILSVRC2012_val_00043104.JPEG n02190166/
+mv val/ILSVRC2012_val_00043105.JPEG n03769881/
+mv val/ILSVRC2012_val_00043106.JPEG n04399382/
+mv val/ILSVRC2012_val_00043107.JPEG n04553703/
+mv val/ILSVRC2012_val_00043108.JPEG n02058221/
+mv val/ILSVRC2012_val_00043109.JPEG n02981792/
+mv val/ILSVRC2012_val_00043110.JPEG n01692333/
+mv val/ILSVRC2012_val_00043111.JPEG n01631663/
+mv val/ILSVRC2012_val_00043112.JPEG n03868242/
+mv val/ILSVRC2012_val_00043113.JPEG n06785654/
+mv val/ILSVRC2012_val_00043114.JPEG n03977966/
+mv val/ILSVRC2012_val_00043115.JPEG n04423845/
+mv val/ILSVRC2012_val_00043116.JPEG n02791124/
+mv val/ILSVRC2012_val_00043117.JPEG n02128385/
+mv val/ILSVRC2012_val_00043118.JPEG n01664065/
+mv val/ILSVRC2012_val_00043119.JPEG n01756291/
+mv val/ILSVRC2012_val_00043120.JPEG n07802026/
+mv val/ILSVRC2012_val_00043121.JPEG n02979186/
+mv val/ILSVRC2012_val_00043122.JPEG n02814533/
+mv val/ILSVRC2012_val_00043123.JPEG n12768682/
+mv val/ILSVRC2012_val_00043124.JPEG n04201297/
+mv val/ILSVRC2012_val_00043125.JPEG n07742313/
+mv val/ILSVRC2012_val_00043126.JPEG n02489166/
+mv val/ILSVRC2012_val_00043127.JPEG n02120079/
+mv val/ILSVRC2012_val_00043128.JPEG n03743016/
+mv val/ILSVRC2012_val_00043129.JPEG n03482405/
+mv val/ILSVRC2012_val_00043130.JPEG n01795545/
+mv val/ILSVRC2012_val_00043131.JPEG n02108551/
+mv val/ILSVRC2012_val_00043132.JPEG n02096051/
+mv val/ILSVRC2012_val_00043133.JPEG n02951358/
+mv val/ILSVRC2012_val_00043134.JPEG n02169497/
+mv val/ILSVRC2012_val_00043135.JPEG n04532106/
+mv val/ILSVRC2012_val_00043136.JPEG n02268443/
+mv val/ILSVRC2012_val_00043137.JPEG n03676483/
+mv val/ILSVRC2012_val_00043138.JPEG n01798484/
+mv val/ILSVRC2012_val_00043139.JPEG n02113712/
+mv val/ILSVRC2012_val_00043140.JPEG n07697313/
+mv val/ILSVRC2012_val_00043141.JPEG n02112018/
+mv val/ILSVRC2012_val_00043142.JPEG n04525038/
+mv val/ILSVRC2012_val_00043143.JPEG n03982430/
+mv val/ILSVRC2012_val_00043144.JPEG n04239074/
+mv val/ILSVRC2012_val_00043145.JPEG n02123597/
+mv val/ILSVRC2012_val_00043146.JPEG n03063689/
+mv val/ILSVRC2012_val_00043147.JPEG n02091134/
+mv val/ILSVRC2012_val_00043148.JPEG n02138441/
+mv val/ILSVRC2012_val_00043149.JPEG n03255030/
+mv val/ILSVRC2012_val_00043150.JPEG n02012849/
+mv val/ILSVRC2012_val_00043151.JPEG n02879718/
+mv val/ILSVRC2012_val_00043152.JPEG n02111277/
+mv val/ILSVRC2012_val_00043153.JPEG n02088466/
+mv val/ILSVRC2012_val_00043154.JPEG n02105056/
+mv val/ILSVRC2012_val_00043155.JPEG n01776313/
+mv val/ILSVRC2012_val_00043156.JPEG n04584207/
+mv val/ILSVRC2012_val_00043157.JPEG n02095314/
+mv val/ILSVRC2012_val_00043158.JPEG n01806567/
+mv val/ILSVRC2012_val_00043159.JPEG n01770393/
+mv val/ILSVRC2012_val_00043160.JPEG n03271574/
+mv val/ILSVRC2012_val_00043161.JPEG n03599486/
+mv val/ILSVRC2012_val_00043162.JPEG n10148035/
+mv val/ILSVRC2012_val_00043163.JPEG n03627232/
+mv val/ILSVRC2012_val_00043164.JPEG n04275548/
+mv val/ILSVRC2012_val_00043165.JPEG n03063689/
+mv val/ILSVRC2012_val_00043166.JPEG n03016953/
+mv val/ILSVRC2012_val_00043167.JPEG n01990800/
+mv val/ILSVRC2012_val_00043168.JPEG n04141076/
+mv val/ILSVRC2012_val_00043169.JPEG n03131574/
+mv val/ILSVRC2012_val_00043170.JPEG n01968897/
+mv val/ILSVRC2012_val_00043171.JPEG n02093256/
+mv val/ILSVRC2012_val_00043172.JPEG n01774750/
+mv val/ILSVRC2012_val_00043173.JPEG n01855672/
+mv val/ILSVRC2012_val_00043174.JPEG n04435653/
+mv val/ILSVRC2012_val_00043175.JPEG n03127747/
+mv val/ILSVRC2012_val_00043176.JPEG n03657121/
+mv val/ILSVRC2012_val_00043177.JPEG n03529860/
+mv val/ILSVRC2012_val_00043178.JPEG n07730033/
+mv val/ILSVRC2012_val_00043179.JPEG n02837789/
+mv val/ILSVRC2012_val_00043180.JPEG n01828970/
+mv val/ILSVRC2012_val_00043181.JPEG n02002556/
+mv val/ILSVRC2012_val_00043182.JPEG n02132136/
+mv val/ILSVRC2012_val_00043183.JPEG n03873416/
+mv val/ILSVRC2012_val_00043184.JPEG n03424325/
+mv val/ILSVRC2012_val_00043185.JPEG n04259630/
+mv val/ILSVRC2012_val_00043186.JPEG n02097130/
+mv val/ILSVRC2012_val_00043187.JPEG n03272562/
+mv val/ILSVRC2012_val_00043188.JPEG n03496892/
+mv val/ILSVRC2012_val_00043189.JPEG n04525305/
+mv val/ILSVRC2012_val_00043190.JPEG n03916031/
+mv val/ILSVRC2012_val_00043191.JPEG n01644373/
+mv val/ILSVRC2012_val_00043192.JPEG n04591713/
+mv val/ILSVRC2012_val_00043193.JPEG n02504013/
+mv val/ILSVRC2012_val_00043194.JPEG n02091831/
+mv val/ILSVRC2012_val_00043195.JPEG n01847000/
+mv val/ILSVRC2012_val_00043196.JPEG n03000684/
+mv val/ILSVRC2012_val_00043197.JPEG n01770393/
+mv val/ILSVRC2012_val_00043198.JPEG n03763968/
+mv val/ILSVRC2012_val_00043199.JPEG n02093754/
+mv val/ILSVRC2012_val_00043200.JPEG n03063689/
+mv val/ILSVRC2012_val_00043201.JPEG n02085782/
+mv val/ILSVRC2012_val_00043202.JPEG n03290653/
+mv val/ILSVRC2012_val_00043203.JPEG n03777568/
+mv val/ILSVRC2012_val_00043204.JPEG n07718472/
+mv val/ILSVRC2012_val_00043205.JPEG n02090721/
+mv val/ILSVRC2012_val_00043206.JPEG n02089078/
+mv val/ILSVRC2012_val_00043207.JPEG n03792782/
+mv val/ILSVRC2012_val_00043208.JPEG n13037406/
+mv val/ILSVRC2012_val_00043209.JPEG n02111889/
+mv val/ILSVRC2012_val_00043210.JPEG n04550184/
+mv val/ILSVRC2012_val_00043211.JPEG n03063599/
+mv val/ILSVRC2012_val_00043212.JPEG n04229816/
+mv val/ILSVRC2012_val_00043213.JPEG n04238763/
+mv val/ILSVRC2012_val_00043214.JPEG n01693334/
+mv val/ILSVRC2012_val_00043215.JPEG n03743016/
+mv val/ILSVRC2012_val_00043216.JPEG n02108551/
+mv val/ILSVRC2012_val_00043217.JPEG n04604644/
+mv val/ILSVRC2012_val_00043218.JPEG n02281787/
+mv val/ILSVRC2012_val_00043219.JPEG n02119789/
+mv val/ILSVRC2012_val_00043220.JPEG n02808304/
+mv val/ILSVRC2012_val_00043221.JPEG n09332890/
+mv val/ILSVRC2012_val_00043222.JPEG n02106550/
+mv val/ILSVRC2012_val_00043223.JPEG n07802026/
+mv val/ILSVRC2012_val_00043224.JPEG n03249569/
+mv val/ILSVRC2012_val_00043225.JPEG n07836838/
+mv val/ILSVRC2012_val_00043226.JPEG n03775546/
+mv val/ILSVRC2012_val_00043227.JPEG n04204347/
+mv val/ILSVRC2012_val_00043228.JPEG n04592741/
+mv val/ILSVRC2012_val_00043229.JPEG n01498041/
+mv val/ILSVRC2012_val_00043230.JPEG n03929660/
+mv val/ILSVRC2012_val_00043231.JPEG n02077923/
+mv val/ILSVRC2012_val_00043232.JPEG n02108089/
+mv val/ILSVRC2012_val_00043233.JPEG n02094433/
+mv val/ILSVRC2012_val_00043234.JPEG n02107574/
+mv val/ILSVRC2012_val_00043235.JPEG n13133613/
+mv val/ILSVRC2012_val_00043236.JPEG n02749479/
+mv val/ILSVRC2012_val_00043237.JPEG n03249569/
+mv val/ILSVRC2012_val_00043238.JPEG n02641379/
+mv val/ILSVRC2012_val_00043239.JPEG n03804744/
+mv val/ILSVRC2012_val_00043240.JPEG n02321529/
+mv val/ILSVRC2012_val_00043241.JPEG n01797886/
+mv val/ILSVRC2012_val_00043242.JPEG n02690373/
+mv val/ILSVRC2012_val_00043243.JPEG n13054560/
+mv val/ILSVRC2012_val_00043244.JPEG n02950826/
+mv val/ILSVRC2012_val_00043245.JPEG n01737021/
+mv val/ILSVRC2012_val_00043246.JPEG n01689811/
+mv val/ILSVRC2012_val_00043247.JPEG n01664065/
+mv val/ILSVRC2012_val_00043248.JPEG n07693725/
+mv val/ILSVRC2012_val_00043249.JPEG n02342885/
+mv val/ILSVRC2012_val_00043250.JPEG n02169497/
+mv val/ILSVRC2012_val_00043251.JPEG n09288635/
+mv val/ILSVRC2012_val_00043252.JPEG n02087394/
+mv val/ILSVRC2012_val_00043253.JPEG n03376595/
+mv val/ILSVRC2012_val_00043254.JPEG n02120505/
+mv val/ILSVRC2012_val_00043255.JPEG n03938244/
+mv val/ILSVRC2012_val_00043256.JPEG n03345487/
+mv val/ILSVRC2012_val_00043257.JPEG n02500267/
+mv val/ILSVRC2012_val_00043258.JPEG n01797886/
+mv val/ILSVRC2012_val_00043259.JPEG n04443257/
+mv val/ILSVRC2012_val_00043260.JPEG n03492542/
+mv val/ILSVRC2012_val_00043261.JPEG n02094258/
+mv val/ILSVRC2012_val_00043262.JPEG n03721384/
+mv val/ILSVRC2012_val_00043263.JPEG n13044778/
+mv val/ILSVRC2012_val_00043264.JPEG n03868863/
+mv val/ILSVRC2012_val_00043265.JPEG n07711569/
+mv val/ILSVRC2012_val_00043266.JPEG n02236044/
+mv val/ILSVRC2012_val_00043267.JPEG n04081281/
+mv val/ILSVRC2012_val_00043268.JPEG n03838899/
+mv val/ILSVRC2012_val_00043269.JPEG n04596742/
+mv val/ILSVRC2012_val_00043270.JPEG n02111500/
+mv val/ILSVRC2012_val_00043271.JPEG n04251144/
+mv val/ILSVRC2012_val_00043272.JPEG n02100583/
+mv val/ILSVRC2012_val_00043273.JPEG n07714571/
+mv val/ILSVRC2012_val_00043274.JPEG n04238763/
+mv val/ILSVRC2012_val_00043275.JPEG n02105412/
+mv val/ILSVRC2012_val_00043276.JPEG n02443484/
+mv val/ILSVRC2012_val_00043277.JPEG n04019541/
+mv val/ILSVRC2012_val_00043278.JPEG n03394916/
+mv val/ILSVRC2012_val_00043279.JPEG n03776460/
+mv val/ILSVRC2012_val_00043280.JPEG n03000134/
+mv val/ILSVRC2012_val_00043281.JPEG n02109525/
+mv val/ILSVRC2012_val_00043282.JPEG n02109525/
+mv val/ILSVRC2012_val_00043283.JPEG n02870880/
+mv val/ILSVRC2012_val_00043284.JPEG n03393912/
+mv val/ILSVRC2012_val_00043285.JPEG n03197337/
+mv val/ILSVRC2012_val_00043286.JPEG n04081281/
+mv val/ILSVRC2012_val_00043287.JPEG n03763968/
+mv val/ILSVRC2012_val_00043288.JPEG n01688243/
+mv val/ILSVRC2012_val_00043289.JPEG n02110806/
+mv val/ILSVRC2012_val_00043290.JPEG n02834397/
+mv val/ILSVRC2012_val_00043291.JPEG n02939185/
+mv val/ILSVRC2012_val_00043292.JPEG n02279972/
+mv val/ILSVRC2012_val_00043293.JPEG n03888605/
+mv val/ILSVRC2012_val_00043294.JPEG n02268443/
+mv val/ILSVRC2012_val_00043295.JPEG n02988304/
+mv val/ILSVRC2012_val_00043296.JPEG n04310018/
+mv val/ILSVRC2012_val_00043297.JPEG n04285008/
+mv val/ILSVRC2012_val_00043298.JPEG n09246464/
+mv val/ILSVRC2012_val_00043299.JPEG n02389026/
+mv val/ILSVRC2012_val_00043300.JPEG n01558993/
+mv val/ILSVRC2012_val_00043301.JPEG n01955084/
+mv val/ILSVRC2012_val_00043302.JPEG n01930112/
+mv val/ILSVRC2012_val_00043303.JPEG n01644373/
+mv val/ILSVRC2012_val_00043304.JPEG n12620546/
+mv val/ILSVRC2012_val_00043305.JPEG n02093256/
+mv val/ILSVRC2012_val_00043306.JPEG n09256479/
+mv val/ILSVRC2012_val_00043307.JPEG n02002724/
+mv val/ILSVRC2012_val_00043308.JPEG n03160309/
+mv val/ILSVRC2012_val_00043309.JPEG n04204238/
+mv val/ILSVRC2012_val_00043310.JPEG n01753488/
+mv val/ILSVRC2012_val_00043311.JPEG n03393912/
+mv val/ILSVRC2012_val_00043312.JPEG n01641577/
+mv val/ILSVRC2012_val_00043313.JPEG n02100735/
+mv val/ILSVRC2012_val_00043314.JPEG n04584207/
+mv val/ILSVRC2012_val_00043315.JPEG n02100236/
+mv val/ILSVRC2012_val_00043316.JPEG n02879718/
+mv val/ILSVRC2012_val_00043317.JPEG n02988304/
+mv val/ILSVRC2012_val_00043318.JPEG n02105162/
+mv val/ILSVRC2012_val_00043319.JPEG n02110806/
+mv val/ILSVRC2012_val_00043320.JPEG n04258138/
+mv val/ILSVRC2012_val_00043321.JPEG n03590841/
+mv val/ILSVRC2012_val_00043322.JPEG n02927161/
+mv val/ILSVRC2012_val_00043323.JPEG n01498041/
+mv val/ILSVRC2012_val_00043324.JPEG n03720891/
+mv val/ILSVRC2012_val_00043325.JPEG n04515003/
+mv val/ILSVRC2012_val_00043326.JPEG n02134418/
+mv val/ILSVRC2012_val_00043327.JPEG n03014705/
+mv val/ILSVRC2012_val_00043328.JPEG n03344393/
+mv val/ILSVRC2012_val_00043329.JPEG n02783161/
+mv val/ILSVRC2012_val_00043330.JPEG n04443257/
+mv val/ILSVRC2012_val_00043331.JPEG n02492660/
+mv val/ILSVRC2012_val_00043332.JPEG n03218198/
+mv val/ILSVRC2012_val_00043333.JPEG n01755581/
+mv val/ILSVRC2012_val_00043334.JPEG n02090622/
+mv val/ILSVRC2012_val_00043335.JPEG n03179701/
+mv val/ILSVRC2012_val_00043336.JPEG n04252225/
+mv val/ILSVRC2012_val_00043337.JPEG n04417672/
+mv val/ILSVRC2012_val_00043338.JPEG n04037443/
+mv val/ILSVRC2012_val_00043339.JPEG n04065272/
+mv val/ILSVRC2012_val_00043340.JPEG n03721384/
+mv val/ILSVRC2012_val_00043341.JPEG n02089973/
+mv val/ILSVRC2012_val_00043342.JPEG n02091635/
+mv val/ILSVRC2012_val_00043343.JPEG n03804744/
+mv val/ILSVRC2012_val_00043344.JPEG n09288635/
+mv val/ILSVRC2012_val_00043345.JPEG n04613696/
+mv val/ILSVRC2012_val_00043346.JPEG n03796401/
+mv val/ILSVRC2012_val_00043347.JPEG n07714990/
+mv val/ILSVRC2012_val_00043348.JPEG n01770393/
+mv val/ILSVRC2012_val_00043349.JPEG n01742172/
+mv val/ILSVRC2012_val_00043350.JPEG n02128385/
+mv val/ILSVRC2012_val_00043351.JPEG n03492542/
+mv val/ILSVRC2012_val_00043352.JPEG n03916031/
+mv val/ILSVRC2012_val_00043353.JPEG n01883070/
+mv val/ILSVRC2012_val_00043354.JPEG n01739381/
+mv val/ILSVRC2012_val_00043355.JPEG n02980441/
+mv val/ILSVRC2012_val_00043356.JPEG n02966687/
+mv val/ILSVRC2012_val_00043357.JPEG n04486054/
+mv val/ILSVRC2012_val_00043358.JPEG n04443257/
+mv val/ILSVRC2012_val_00043359.JPEG n01984695/
+mv val/ILSVRC2012_val_00043360.JPEG n03026506/
+mv val/ILSVRC2012_val_00043361.JPEG n02808440/
+mv val/ILSVRC2012_val_00043362.JPEG n02977058/
+mv val/ILSVRC2012_val_00043363.JPEG n02114367/
+mv val/ILSVRC2012_val_00043364.JPEG n02094114/
+mv val/ILSVRC2012_val_00043365.JPEG n02326432/
+mv val/ILSVRC2012_val_00043366.JPEG n03016953/
+mv val/ILSVRC2012_val_00043367.JPEG n02106166/
+mv val/ILSVRC2012_val_00043368.JPEG n03710193/
+mv val/ILSVRC2012_val_00043369.JPEG n01644373/
+mv val/ILSVRC2012_val_00043370.JPEG n02091134/
+mv val/ILSVRC2012_val_00043371.JPEG n03259280/
+mv val/ILSVRC2012_val_00043372.JPEG n03018349/
+mv val/ILSVRC2012_val_00043373.JPEG n03791053/
+mv val/ILSVRC2012_val_00043374.JPEG n04008634/
+mv val/ILSVRC2012_val_00043375.JPEG n02095570/
+mv val/ILSVRC2012_val_00043376.JPEG n07718747/
+mv val/ILSVRC2012_val_00043377.JPEG n03376595/
+mv val/ILSVRC2012_val_00043378.JPEG n07717410/
+mv val/ILSVRC2012_val_00043379.JPEG n02894605/
+mv val/ILSVRC2012_val_00043380.JPEG n07583066/
+mv val/ILSVRC2012_val_00043381.JPEG n02281787/
+mv val/ILSVRC2012_val_00043382.JPEG n03483316/
+mv val/ILSVRC2012_val_00043383.JPEG n02105505/
+mv val/ILSVRC2012_val_00043384.JPEG n03837869/
+mv val/ILSVRC2012_val_00043385.JPEG n04591713/
+mv val/ILSVRC2012_val_00043386.JPEG n02749479/
+mv val/ILSVRC2012_val_00043387.JPEG n01514668/
+mv val/ILSVRC2012_val_00043388.JPEG n02090379/
+mv val/ILSVRC2012_val_00043389.JPEG n03424325/
+mv val/ILSVRC2012_val_00043390.JPEG n03642806/
+mv val/ILSVRC2012_val_00043391.JPEG n02089973/
+mv val/ILSVRC2012_val_00043392.JPEG n01532829/
+mv val/ILSVRC2012_val_00043393.JPEG n02105641/
+mv val/ILSVRC2012_val_00043394.JPEG n04591713/
+mv val/ILSVRC2012_val_00043395.JPEG n01819313/
+mv val/ILSVRC2012_val_00043396.JPEG n02127052/
+mv val/ILSVRC2012_val_00043397.JPEG n03124043/
+mv val/ILSVRC2012_val_00043398.JPEG n03649909/
+mv val/ILSVRC2012_val_00043399.JPEG n02113186/
+mv val/ILSVRC2012_val_00043400.JPEG n04067472/
+mv val/ILSVRC2012_val_00043401.JPEG n02114548/
+mv val/ILSVRC2012_val_00043402.JPEG n03791053/
+mv val/ILSVRC2012_val_00043403.JPEG n03792782/
+mv val/ILSVRC2012_val_00043404.JPEG n02093991/
+mv val/ILSVRC2012_val_00043405.JPEG n03530642/
+mv val/ILSVRC2012_val_00043406.JPEG n02397096/
+mv val/ILSVRC2012_val_00043407.JPEG n02281787/
+mv val/ILSVRC2012_val_00043408.JPEG n03661043/
+mv val/ILSVRC2012_val_00043409.JPEG n03495258/
+mv val/ILSVRC2012_val_00043410.JPEG n02174001/
+mv val/ILSVRC2012_val_00043411.JPEG n07880968/
+mv val/ILSVRC2012_val_00043412.JPEG n03459775/
+mv val/ILSVRC2012_val_00043413.JPEG n02100236/
+mv val/ILSVRC2012_val_00043414.JPEG n02727426/
+mv val/ILSVRC2012_val_00043415.JPEG n01820546/
+mv val/ILSVRC2012_val_00043416.JPEG n02988304/
+mv val/ILSVRC2012_val_00043417.JPEG n02112350/
+mv val/ILSVRC2012_val_00043418.JPEG n03476684/
+mv val/ILSVRC2012_val_00043419.JPEG n04238763/
+mv val/ILSVRC2012_val_00043420.JPEG n02028035/
+mv val/ILSVRC2012_val_00043421.JPEG n02120505/
+mv val/ILSVRC2012_val_00043422.JPEG n01704323/
+mv val/ILSVRC2012_val_00043423.JPEG n03047690/
+mv val/ILSVRC2012_val_00043424.JPEG n02268443/
+mv val/ILSVRC2012_val_00043425.JPEG n02443114/
+mv val/ILSVRC2012_val_00043426.JPEG n02112137/
+mv val/ILSVRC2012_val_00043427.JPEG n02879718/
+mv val/ILSVRC2012_val_00043428.JPEG n01697457/
+mv val/ILSVRC2012_val_00043429.JPEG n04264628/
+mv val/ILSVRC2012_val_00043430.JPEG n03314780/
+mv val/ILSVRC2012_val_00043431.JPEG n03649909/
+mv val/ILSVRC2012_val_00043432.JPEG n02133161/
+mv val/ILSVRC2012_val_00043433.JPEG n07730033/
+mv val/ILSVRC2012_val_00043434.JPEG n03670208/
+mv val/ILSVRC2012_val_00043435.JPEG n02835271/
+mv val/ILSVRC2012_val_00043436.JPEG n03584829/
+mv val/ILSVRC2012_val_00043437.JPEG n02326432/
+mv val/ILSVRC2012_val_00043438.JPEG n03916031/
+mv val/ILSVRC2012_val_00043439.JPEG n03485794/
+mv val/ILSVRC2012_val_00043440.JPEG n03314780/
+mv val/ILSVRC2012_val_00043441.JPEG n02342885/
+mv val/ILSVRC2012_val_00043442.JPEG n02105412/
+mv val/ILSVRC2012_val_00043443.JPEG n02321529/
+mv val/ILSVRC2012_val_00043444.JPEG n01669191/
+mv val/ILSVRC2012_val_00043445.JPEG n07742313/
+mv val/ILSVRC2012_val_00043446.JPEG n03045698/
+mv val/ILSVRC2012_val_00043447.JPEG n02510455/
+mv val/ILSVRC2012_val_00043448.JPEG n04201297/
+mv val/ILSVRC2012_val_00043449.JPEG n03710721/
+mv val/ILSVRC2012_val_00043450.JPEG n02966687/
+mv val/ILSVRC2012_val_00043451.JPEG n02094258/
+mv val/ILSVRC2012_val_00043452.JPEG n02109047/
+mv val/ILSVRC2012_val_00043453.JPEG n03376595/
+mv val/ILSVRC2012_val_00043454.JPEG n03017168/
+mv val/ILSVRC2012_val_00043455.JPEG n01924916/
+mv val/ILSVRC2012_val_00043456.JPEG n02017213/
+mv val/ILSVRC2012_val_00043457.JPEG n02086079/
+mv val/ILSVRC2012_val_00043458.JPEG n03666591/
+mv val/ILSVRC2012_val_00043459.JPEG n04465501/
+mv val/ILSVRC2012_val_00043460.JPEG n02981792/
+mv val/ILSVRC2012_val_00043461.JPEG n03832673/
+mv val/ILSVRC2012_val_00043462.JPEG n01806567/
+mv val/ILSVRC2012_val_00043463.JPEG n02793495/
+mv val/ILSVRC2012_val_00043464.JPEG n02110806/
+mv val/ILSVRC2012_val_00043465.JPEG n01833805/
+mv val/ILSVRC2012_val_00043466.JPEG n01622779/
+mv val/ILSVRC2012_val_00043467.JPEG n02493509/
+mv val/ILSVRC2012_val_00043468.JPEG n03495258/
+mv val/ILSVRC2012_val_00043469.JPEG n03485407/
+mv val/ILSVRC2012_val_00043470.JPEG n02051845/
+mv val/ILSVRC2012_val_00043471.JPEG n04141975/
+mv val/ILSVRC2012_val_00043472.JPEG n02909870/
+mv val/ILSVRC2012_val_00043473.JPEG n01698640/
+mv val/ILSVRC2012_val_00043474.JPEG n02096294/
+mv val/ILSVRC2012_val_00043475.JPEG n02009912/
+mv val/ILSVRC2012_val_00043476.JPEG n02097658/
+mv val/ILSVRC2012_val_00043477.JPEG n02018207/
+mv val/ILSVRC2012_val_00043478.JPEG n02804414/
+mv val/ILSVRC2012_val_00043479.JPEG n03095699/
+mv val/ILSVRC2012_val_00043480.JPEG n01665541/
+mv val/ILSVRC2012_val_00043481.JPEG n03532672/
+mv val/ILSVRC2012_val_00043482.JPEG n02102177/
+mv val/ILSVRC2012_val_00043483.JPEG n01806143/
+mv val/ILSVRC2012_val_00043484.JPEG n01847000/
+mv val/ILSVRC2012_val_00043485.JPEG n07693725/
+mv val/ILSVRC2012_val_00043486.JPEG n02268853/
+mv val/ILSVRC2012_val_00043487.JPEG n03530642/
+mv val/ILSVRC2012_val_00043488.JPEG n03908618/
+mv val/ILSVRC2012_val_00043489.JPEG n03781244/
+mv val/ILSVRC2012_val_00043490.JPEG n04286575/
+mv val/ILSVRC2012_val_00043491.JPEG n02111129/
+mv val/ILSVRC2012_val_00043492.JPEG n04273569/
+mv val/ILSVRC2012_val_00043493.JPEG n04590129/
+mv val/ILSVRC2012_val_00043494.JPEG n02100583/
+mv val/ILSVRC2012_val_00043495.JPEG n03916031/
+mv val/ILSVRC2012_val_00043496.JPEG n04404412/
+mv val/ILSVRC2012_val_00043497.JPEG n02708093/
+mv val/ILSVRC2012_val_00043498.JPEG n03160309/
+mv val/ILSVRC2012_val_00043499.JPEG n07579787/
+mv val/ILSVRC2012_val_00043500.JPEG n03476991/
+mv val/ILSVRC2012_val_00043501.JPEG n04204238/
+mv val/ILSVRC2012_val_00043502.JPEG n03344393/
+mv val/ILSVRC2012_val_00043503.JPEG n09193705/
+mv val/ILSVRC2012_val_00043504.JPEG n01665541/
+mv val/ILSVRC2012_val_00043505.JPEG n01968897/
+mv val/ILSVRC2012_val_00043506.JPEG n03180011/
+mv val/ILSVRC2012_val_00043507.JPEG n02948072/
+mv val/ILSVRC2012_val_00043508.JPEG n01871265/
+mv val/ILSVRC2012_val_00043509.JPEG n01843383/
+mv val/ILSVRC2012_val_00043510.JPEG n02494079/
+mv val/ILSVRC2012_val_00043511.JPEG n02105505/
+mv val/ILSVRC2012_val_00043512.JPEG n02356798/
+mv val/ILSVRC2012_val_00043513.JPEG n02769748/
+mv val/ILSVRC2012_val_00043514.JPEG n01955084/
+mv val/ILSVRC2012_val_00043515.JPEG n01990800/
+mv val/ILSVRC2012_val_00043516.JPEG n02113712/
+mv val/ILSVRC2012_val_00043517.JPEG n03976657/
+mv val/ILSVRC2012_val_00043518.JPEG n03633091/
+mv val/ILSVRC2012_val_00043519.JPEG n03937543/
+mv val/ILSVRC2012_val_00043520.JPEG n04252225/
+mv val/ILSVRC2012_val_00043521.JPEG n02442845/
+mv val/ILSVRC2012_val_00043522.JPEG n03461385/
+mv val/ILSVRC2012_val_00043523.JPEG n03014705/
+mv val/ILSVRC2012_val_00043524.JPEG n01644900/
+mv val/ILSVRC2012_val_00043525.JPEG n03924679/
+mv val/ILSVRC2012_val_00043526.JPEG n04152593/
+mv val/ILSVRC2012_val_00043527.JPEG n02974003/
+mv val/ILSVRC2012_val_00043528.JPEG n02804414/
+mv val/ILSVRC2012_val_00043529.JPEG n03290653/
+mv val/ILSVRC2012_val_00043530.JPEG n04344873/
+mv val/ILSVRC2012_val_00043531.JPEG n02326432/
+mv val/ILSVRC2012_val_00043532.JPEG n04371430/
+mv val/ILSVRC2012_val_00043533.JPEG n03485794/
+mv val/ILSVRC2012_val_00043534.JPEG n02107142/
+mv val/ILSVRC2012_val_00043535.JPEG n03483316/
+mv val/ILSVRC2012_val_00043536.JPEG n04330267/
+mv val/ILSVRC2012_val_00043537.JPEG n01883070/
+mv val/ILSVRC2012_val_00043538.JPEG n02105505/
+mv val/ILSVRC2012_val_00043539.JPEG n03062245/
+mv val/ILSVRC2012_val_00043540.JPEG n03924679/
+mv val/ILSVRC2012_val_00043541.JPEG n02326432/
+mv val/ILSVRC2012_val_00043542.JPEG n03761084/
+mv val/ILSVRC2012_val_00043543.JPEG n02104029/
+mv val/ILSVRC2012_val_00043544.JPEG n02074367/
+mv val/ILSVRC2012_val_00043545.JPEG n04023962/
+mv val/ILSVRC2012_val_00043546.JPEG n02123597/
+mv val/ILSVRC2012_val_00043547.JPEG n04264628/
+mv val/ILSVRC2012_val_00043548.JPEG n03902125/
+mv val/ILSVRC2012_val_00043549.JPEG n02077923/
+mv val/ILSVRC2012_val_00043550.JPEG n02927161/
+mv val/ILSVRC2012_val_00043551.JPEG n03272562/
+mv val/ILSVRC2012_val_00043552.JPEG n04399382/
+mv val/ILSVRC2012_val_00043553.JPEG n07875152/
+mv val/ILSVRC2012_val_00043554.JPEG n03478589/
+mv val/ILSVRC2012_val_00043555.JPEG n03680355/
+mv val/ILSVRC2012_val_00043556.JPEG n02093428/
+mv val/ILSVRC2012_val_00043557.JPEG n03903868/
+mv val/ILSVRC2012_val_00043558.JPEG n02396427/
+mv val/ILSVRC2012_val_00043559.JPEG n01753488/
+mv val/ILSVRC2012_val_00043560.JPEG n01914609/
+mv val/ILSVRC2012_val_00043561.JPEG n04487081/
+mv val/ILSVRC2012_val_00043562.JPEG n03372029/
+mv val/ILSVRC2012_val_00043563.JPEG n01753488/
+mv val/ILSVRC2012_val_00043564.JPEG n02096585/
+mv val/ILSVRC2012_val_00043565.JPEG n07747607/
+mv val/ILSVRC2012_val_00043566.JPEG n01601694/
+mv val/ILSVRC2012_val_00043567.JPEG n03146219/
+mv val/ILSVRC2012_val_00043568.JPEG n03733131/
+mv val/ILSVRC2012_val_00043569.JPEG n03124043/
+mv val/ILSVRC2012_val_00043570.JPEG n02090622/
+mv val/ILSVRC2012_val_00043571.JPEG n03063599/
+mv val/ILSVRC2012_val_00043572.JPEG n03599486/
+mv val/ILSVRC2012_val_00043573.JPEG n03976657/
+mv val/ILSVRC2012_val_00043574.JPEG n07880968/
+mv val/ILSVRC2012_val_00043575.JPEG n02086910/
+mv val/ILSVRC2012_val_00043576.JPEG n02494079/
+mv val/ILSVRC2012_val_00043577.JPEG n02100735/
+mv val/ILSVRC2012_val_00043578.JPEG n01693334/
+mv val/ILSVRC2012_val_00043579.JPEG n02966193/
+mv val/ILSVRC2012_val_00043580.JPEG n02089973/
+mv val/ILSVRC2012_val_00043581.JPEG n03866082/
+mv val/ILSVRC2012_val_00043582.JPEG n02640242/
+mv val/ILSVRC2012_val_00043583.JPEG n02094433/
+mv val/ILSVRC2012_val_00043584.JPEG n03947888/
+mv val/ILSVRC2012_val_00043585.JPEG n01592084/
+mv val/ILSVRC2012_val_00043586.JPEG n04039381/
+mv val/ILSVRC2012_val_00043587.JPEG n04263257/
+mv val/ILSVRC2012_val_00043588.JPEG n04326547/
+mv val/ILSVRC2012_val_00043589.JPEG n02841315/
+mv val/ILSVRC2012_val_00043590.JPEG n04009552/
+mv val/ILSVRC2012_val_00043591.JPEG n02099712/
+mv val/ILSVRC2012_val_00043592.JPEG n03271574/
+mv val/ILSVRC2012_val_00043593.JPEG n02701002/
+mv val/ILSVRC2012_val_00043594.JPEG n03791053/
+mv val/ILSVRC2012_val_00043595.JPEG n04252077/
+mv val/ILSVRC2012_val_00043596.JPEG n07717410/
+mv val/ILSVRC2012_val_00043597.JPEG n02027492/
+mv val/ILSVRC2012_val_00043598.JPEG n02097474/
+mv val/ILSVRC2012_val_00043599.JPEG n02113799/
+mv val/ILSVRC2012_val_00043600.JPEG n01773797/
+mv val/ILSVRC2012_val_00043601.JPEG n11939491/
+mv val/ILSVRC2012_val_00043602.JPEG n03494278/
+mv val/ILSVRC2012_val_00043603.JPEG n02971356/
+mv val/ILSVRC2012_val_00043604.JPEG n02509815/
+mv val/ILSVRC2012_val_00043605.JPEG n02107683/
+mv val/ILSVRC2012_val_00043606.JPEG n04328186/
+mv val/ILSVRC2012_val_00043607.JPEG n03998194/
+mv val/ILSVRC2012_val_00043608.JPEG n03938244/
+mv val/ILSVRC2012_val_00043609.JPEG n03721384/
+mv val/ILSVRC2012_val_00043610.JPEG n02089973/
+mv val/ILSVRC2012_val_00043611.JPEG n07684084/
+mv val/ILSVRC2012_val_00043612.JPEG n04613696/
+mv val/ILSVRC2012_val_00043613.JPEG n03476991/
+mv val/ILSVRC2012_val_00043614.JPEG n03444034/
+mv val/ILSVRC2012_val_00043615.JPEG n03272010/
+mv val/ILSVRC2012_val_00043616.JPEG n02219486/
+mv val/ILSVRC2012_val_00043617.JPEG n07613480/
+mv val/ILSVRC2012_val_00043618.JPEG n03899768/
+mv val/ILSVRC2012_val_00043619.JPEG n01770393/
+mv val/ILSVRC2012_val_00043620.JPEG n04532106/
+mv val/ILSVRC2012_val_00043621.JPEG n04264628/
+mv val/ILSVRC2012_val_00043622.JPEG n03314780/
+mv val/ILSVRC2012_val_00043623.JPEG n02422106/
+mv val/ILSVRC2012_val_00043624.JPEG n01689811/
+mv val/ILSVRC2012_val_00043625.JPEG n04154565/
+mv val/ILSVRC2012_val_00043626.JPEG n03991062/
+mv val/ILSVRC2012_val_00043627.JPEG n02088094/
+mv val/ILSVRC2012_val_00043628.JPEG n03384352/
+mv val/ILSVRC2012_val_00043629.JPEG n02088632/
+mv val/ILSVRC2012_val_00043630.JPEG n03146219/
+mv val/ILSVRC2012_val_00043631.JPEG n02017213/
+mv val/ILSVRC2012_val_00043632.JPEG n02123597/
+mv val/ILSVRC2012_val_00043633.JPEG n01806567/
+mv val/ILSVRC2012_val_00043634.JPEG n01740131/
+mv val/ILSVRC2012_val_00043635.JPEG n01829413/
+mv val/ILSVRC2012_val_00043636.JPEG n04004767/
+mv val/ILSVRC2012_val_00043637.JPEG n04355338/
+mv val/ILSVRC2012_val_00043638.JPEG n04044716/
+mv val/ILSVRC2012_val_00043639.JPEG n01735189/
+mv val/ILSVRC2012_val_00043640.JPEG n03218198/
+mv val/ILSVRC2012_val_00043641.JPEG n02108422/
+mv val/ILSVRC2012_val_00043642.JPEG n07831146/
+mv val/ILSVRC2012_val_00043643.JPEG n02110185/
+mv val/ILSVRC2012_val_00043644.JPEG n07932039/
+mv val/ILSVRC2012_val_00043645.JPEG n03658185/
+mv val/ILSVRC2012_val_00043646.JPEG n01773797/
+mv val/ILSVRC2012_val_00043647.JPEG n09288635/
+mv val/ILSVRC2012_val_00043648.JPEG n02133161/
+mv val/ILSVRC2012_val_00043649.JPEG n01820546/
+mv val/ILSVRC2012_val_00043650.JPEG n09332890/
+mv val/ILSVRC2012_val_00043651.JPEG n09468604/
+mv val/ILSVRC2012_val_00043652.JPEG n03935335/
+mv val/ILSVRC2012_val_00043653.JPEG n04562935/
+mv val/ILSVRC2012_val_00043654.JPEG n03908714/
+mv val/ILSVRC2012_val_00043655.JPEG n02167151/
+mv val/ILSVRC2012_val_00043656.JPEG n03216828/
+mv val/ILSVRC2012_val_00043657.JPEG n02497673/
+mv val/ILSVRC2012_val_00043658.JPEG n04493381/
+mv val/ILSVRC2012_val_00043659.JPEG n03452741/
+mv val/ILSVRC2012_val_00043660.JPEG n02117135/
+mv val/ILSVRC2012_val_00043661.JPEG n04131690/
+mv val/ILSVRC2012_val_00043662.JPEG n02120505/
+mv val/ILSVRC2012_val_00043663.JPEG n03743016/
+mv val/ILSVRC2012_val_00043664.JPEG n02364673/
+mv val/ILSVRC2012_val_00043665.JPEG n03980874/
+mv val/ILSVRC2012_val_00043666.JPEG n04462240/
+mv val/ILSVRC2012_val_00043667.JPEG n02804414/
+mv val/ILSVRC2012_val_00043668.JPEG n02051845/
+mv val/ILSVRC2012_val_00043669.JPEG n02808440/
+mv val/ILSVRC2012_val_00043670.JPEG n02172182/
+mv val/ILSVRC2012_val_00043671.JPEG n09428293/
+mv val/ILSVRC2012_val_00043672.JPEG n02093428/
+mv val/ILSVRC2012_val_00043673.JPEG n03220513/
+mv val/ILSVRC2012_val_00043674.JPEG n02699494/
+mv val/ILSVRC2012_val_00043675.JPEG n03803284/
+mv val/ILSVRC2012_val_00043676.JPEG n03804744/
+mv val/ILSVRC2012_val_00043677.JPEG n02514041/
+mv val/ILSVRC2012_val_00043678.JPEG n04099969/
+mv val/ILSVRC2012_val_00043679.JPEG n04296562/
+mv val/ILSVRC2012_val_00043680.JPEG n03388549/
+mv val/ILSVRC2012_val_00043681.JPEG n12998815/
+mv val/ILSVRC2012_val_00043682.JPEG n03933933/
+mv val/ILSVRC2012_val_00043683.JPEG n04208210/
+mv val/ILSVRC2012_val_00043684.JPEG n02410509/
+mv val/ILSVRC2012_val_00043685.JPEG n04482393/
+mv val/ILSVRC2012_val_00043686.JPEG n04487081/
+mv val/ILSVRC2012_val_00043687.JPEG n02486261/
+mv val/ILSVRC2012_val_00043688.JPEG n02113799/
+mv val/ILSVRC2012_val_00043689.JPEG n04228054/
+mv val/ILSVRC2012_val_00043690.JPEG n09835506/
+mv val/ILSVRC2012_val_00043691.JPEG n04067472/
+mv val/ILSVRC2012_val_00043692.JPEG n01664065/
+mv val/ILSVRC2012_val_00043693.JPEG n04428191/
+mv val/ILSVRC2012_val_00043694.JPEG n01740131/
+mv val/ILSVRC2012_val_00043695.JPEG n02493509/
+mv val/ILSVRC2012_val_00043696.JPEG n11939491/
+mv val/ILSVRC2012_val_00043697.JPEG n03042490/
+mv val/ILSVRC2012_val_00043698.JPEG n03584254/
+mv val/ILSVRC2012_val_00043699.JPEG n09468604/
+mv val/ILSVRC2012_val_00043700.JPEG n04120489/
+mv val/ILSVRC2012_val_00043701.JPEG n02483708/
+mv val/ILSVRC2012_val_00043702.JPEG n01498041/
+mv val/ILSVRC2012_val_00043703.JPEG n03786901/
+mv val/ILSVRC2012_val_00043704.JPEG n04523525/
+mv val/ILSVRC2012_val_00043705.JPEG n02165105/
+mv val/ILSVRC2012_val_00043706.JPEG n03888605/
+mv val/ILSVRC2012_val_00043707.JPEG n02115913/
+mv val/ILSVRC2012_val_00043708.JPEG n04201297/
+mv val/ILSVRC2012_val_00043709.JPEG n04501370/
+mv val/ILSVRC2012_val_00043710.JPEG n04037443/
+mv val/ILSVRC2012_val_00043711.JPEG n02172182/
+mv val/ILSVRC2012_val_00043712.JPEG n03793489/
+mv val/ILSVRC2012_val_00043713.JPEG n03724870/
+mv val/ILSVRC2012_val_00043714.JPEG n02391049/
+mv val/ILSVRC2012_val_00043715.JPEG n04069434/
+mv val/ILSVRC2012_val_00043716.JPEG n02807133/
+mv val/ILSVRC2012_val_00043717.JPEG n02056570/
+mv val/ILSVRC2012_val_00043718.JPEG n07584110/
+mv val/ILSVRC2012_val_00043719.JPEG n04398044/
+mv val/ILSVRC2012_val_00043720.JPEG n04398044/
+mv val/ILSVRC2012_val_00043721.JPEG n03854065/
+mv val/ILSVRC2012_val_00043722.JPEG n02655020/
+mv val/ILSVRC2012_val_00043723.JPEG n02107312/
+mv val/ILSVRC2012_val_00043724.JPEG n04366367/
+mv val/ILSVRC2012_val_00043725.JPEG n04086273/
+mv val/ILSVRC2012_val_00043726.JPEG n03485407/
+mv val/ILSVRC2012_val_00043727.JPEG n02104029/
+mv val/ILSVRC2012_val_00043728.JPEG n04251144/
+mv val/ILSVRC2012_val_00043729.JPEG n03627232/
+mv val/ILSVRC2012_val_00043730.JPEG n02132136/
+mv val/ILSVRC2012_val_00043731.JPEG n02979186/
+mv val/ILSVRC2012_val_00043732.JPEG n02317335/
+mv val/ILSVRC2012_val_00043733.JPEG n03201208/
+mv val/ILSVRC2012_val_00043734.JPEG n04479046/
+mv val/ILSVRC2012_val_00043735.JPEG n03452741/
+mv val/ILSVRC2012_val_00043736.JPEG n04258138/
+mv val/ILSVRC2012_val_00043737.JPEG n07590611/
+mv val/ILSVRC2012_val_00043738.JPEG n04149813/
+mv val/ILSVRC2012_val_00043739.JPEG n04355933/
+mv val/ILSVRC2012_val_00043740.JPEG n03207941/
+mv val/ILSVRC2012_val_00043741.JPEG n04479046/
+mv val/ILSVRC2012_val_00043742.JPEG n02441942/
+mv val/ILSVRC2012_val_00043743.JPEG n03866082/
+mv val/ILSVRC2012_val_00043744.JPEG n07583066/
+mv val/ILSVRC2012_val_00043745.JPEG n03445777/
+mv val/ILSVRC2012_val_00043746.JPEG n03017168/
+mv val/ILSVRC2012_val_00043747.JPEG n02672831/
+mv val/ILSVRC2012_val_00043748.JPEG n04204238/
+mv val/ILSVRC2012_val_00043749.JPEG n04326547/
+mv val/ILSVRC2012_val_00043750.JPEG n02113712/
+mv val/ILSVRC2012_val_00043751.JPEG n01514668/
+mv val/ILSVRC2012_val_00043752.JPEG n02415577/
+mv val/ILSVRC2012_val_00043753.JPEG n03706229/
+mv val/ILSVRC2012_val_00043754.JPEG n02981792/
+mv val/ILSVRC2012_val_00043755.JPEG n02840245/
+mv val/ILSVRC2012_val_00043756.JPEG n04389033/
+mv val/ILSVRC2012_val_00043757.JPEG n03992509/
+mv val/ILSVRC2012_val_00043758.JPEG n02403003/
+mv val/ILSVRC2012_val_00043759.JPEG n04005630/
+mv val/ILSVRC2012_val_00043760.JPEG n03637318/
+mv val/ILSVRC2012_val_00043761.JPEG n04371430/
+mv val/ILSVRC2012_val_00043762.JPEG n04347754/
+mv val/ILSVRC2012_val_00043763.JPEG n02100583/
+mv val/ILSVRC2012_val_00043764.JPEG n01518878/
+mv val/ILSVRC2012_val_00043765.JPEG n02319095/
+mv val/ILSVRC2012_val_00043766.JPEG n02492035/
+mv val/ILSVRC2012_val_00043767.JPEG n04597913/
+mv val/ILSVRC2012_val_00043768.JPEG n02206856/
+mv val/ILSVRC2012_val_00043769.JPEG n02025239/
+mv val/ILSVRC2012_val_00043770.JPEG n04591157/
+mv val/ILSVRC2012_val_00043771.JPEG n01773549/
+mv val/ILSVRC2012_val_00043772.JPEG n04081281/
+mv val/ILSVRC2012_val_00043773.JPEG n07697537/
+mv val/ILSVRC2012_val_00043774.JPEG n01682714/
+mv val/ILSVRC2012_val_00043775.JPEG n04069434/
+mv val/ILSVRC2012_val_00043776.JPEG n02085782/
+mv val/ILSVRC2012_val_00043777.JPEG n02655020/
+mv val/ILSVRC2012_val_00043778.JPEG n07714571/
+mv val/ILSVRC2012_val_00043779.JPEG n01614925/
+mv val/ILSVRC2012_val_00043780.JPEG n04008634/
+mv val/ILSVRC2012_val_00043781.JPEG n07873807/
+mv val/ILSVRC2012_val_00043782.JPEG n04131690/
+mv val/ILSVRC2012_val_00043783.JPEG n03680355/
+mv val/ILSVRC2012_val_00043784.JPEG n02422699/
+mv val/ILSVRC2012_val_00043785.JPEG n07753592/
+mv val/ILSVRC2012_val_00043786.JPEG n03840681/
+mv val/ILSVRC2012_val_00043787.JPEG n06785654/
+mv val/ILSVRC2012_val_00043788.JPEG n01530575/
+mv val/ILSVRC2012_val_00043789.JPEG n02096051/
+mv val/ILSVRC2012_val_00043790.JPEG n03764736/
+mv val/ILSVRC2012_val_00043791.JPEG n02108089/
+mv val/ILSVRC2012_val_00043792.JPEG n04044716/
+mv val/ILSVRC2012_val_00043793.JPEG n03384352/
+mv val/ILSVRC2012_val_00043794.JPEG n01818515/
+mv val/ILSVRC2012_val_00043795.JPEG n02056570/
+mv val/ILSVRC2012_val_00043796.JPEG n02097130/
+mv val/ILSVRC2012_val_00043797.JPEG n01665541/
+mv val/ILSVRC2012_val_00043798.JPEG n01688243/
+mv val/ILSVRC2012_val_00043799.JPEG n04131690/
+mv val/ILSVRC2012_val_00043800.JPEG n04606251/
+mv val/ILSVRC2012_val_00043801.JPEG n01616318/
+mv val/ILSVRC2012_val_00043802.JPEG n01688243/
+mv val/ILSVRC2012_val_00043803.JPEG n02113186/
+mv val/ILSVRC2012_val_00043804.JPEG n04613696/
+mv val/ILSVRC2012_val_00043805.JPEG n01737021/
+mv val/ILSVRC2012_val_00043806.JPEG n02776631/
+mv val/ILSVRC2012_val_00043807.JPEG n03995372/
+mv val/ILSVRC2012_val_00043808.JPEG n01806143/
+mv val/ILSVRC2012_val_00043809.JPEG n01753488/
+mv val/ILSVRC2012_val_00043810.JPEG n04037443/
+mv val/ILSVRC2012_val_00043811.JPEG n02879718/
+mv val/ILSVRC2012_val_00043812.JPEG n04009552/
+mv val/ILSVRC2012_val_00043813.JPEG n02110806/
+mv val/ILSVRC2012_val_00043814.JPEG n04332243/
+mv val/ILSVRC2012_val_00043815.JPEG n04560804/
+mv val/ILSVRC2012_val_00043816.JPEG n03884397/
+mv val/ILSVRC2012_val_00043817.JPEG n02110958/
+mv val/ILSVRC2012_val_00043818.JPEG n03888605/
+mv val/ILSVRC2012_val_00043819.JPEG n01685808/
+mv val/ILSVRC2012_val_00043820.JPEG n07565083/
+mv val/ILSVRC2012_val_00043821.JPEG n02883205/
+mv val/ILSVRC2012_val_00043822.JPEG n02492660/
+mv val/ILSVRC2012_val_00043823.JPEG n01798484/
+mv val/ILSVRC2012_val_00043824.JPEG n03100240/
+mv val/ILSVRC2012_val_00043825.JPEG n02088094/
+mv val/ILSVRC2012_val_00043826.JPEG n04229816/
+mv val/ILSVRC2012_val_00043827.JPEG n02098286/
+mv val/ILSVRC2012_val_00043828.JPEG n02841315/
+mv val/ILSVRC2012_val_00043829.JPEG n03017168/
+mv val/ILSVRC2012_val_00043830.JPEG n04120489/
+mv val/ILSVRC2012_val_00043831.JPEG n07718747/
+mv val/ILSVRC2012_val_00043832.JPEG n03933933/
+mv val/ILSVRC2012_val_00043833.JPEG n04355933/
+mv val/ILSVRC2012_val_00043834.JPEG n04483307/
+mv val/ILSVRC2012_val_00043835.JPEG n02107142/
+mv val/ILSVRC2012_val_00043836.JPEG n01744401/
+mv val/ILSVRC2012_val_00043837.JPEG n02093991/
+mv val/ILSVRC2012_val_00043838.JPEG n02112137/
+mv val/ILSVRC2012_val_00043839.JPEG n02085936/
+mv val/ILSVRC2012_val_00043840.JPEG n03929855/
+mv val/ILSVRC2012_val_00043841.JPEG n02051845/
+mv val/ILSVRC2012_val_00043842.JPEG n02091831/
+mv val/ILSVRC2012_val_00043843.JPEG n01740131/
+mv val/ILSVRC2012_val_00043844.JPEG n02948072/
+mv val/ILSVRC2012_val_00043845.JPEG n02112706/
+mv val/ILSVRC2012_val_00043846.JPEG n04584207/
+mv val/ILSVRC2012_val_00043847.JPEG n04070727/
+mv val/ILSVRC2012_val_00043848.JPEG n03584254/
+mv val/ILSVRC2012_val_00043849.JPEG n04235860/
+mv val/ILSVRC2012_val_00043850.JPEG n01749939/
+mv val/ILSVRC2012_val_00043851.JPEG n02086079/
+mv val/ILSVRC2012_val_00043852.JPEG n03424325/
+mv val/ILSVRC2012_val_00043853.JPEG n04485082/
+mv val/ILSVRC2012_val_00043854.JPEG n02165456/
+mv val/ILSVRC2012_val_00043855.JPEG n03259280/
+mv val/ILSVRC2012_val_00043856.JPEG n02132136/
+mv val/ILSVRC2012_val_00043857.JPEG n03445924/
+mv val/ILSVRC2012_val_00043858.JPEG n12768682/
+mv val/ILSVRC2012_val_00043859.JPEG n03325584/
+mv val/ILSVRC2012_val_00043860.JPEG n01644373/
+mv val/ILSVRC2012_val_00043861.JPEG n02361337/
+mv val/ILSVRC2012_val_00043862.JPEG n04523525/
+mv val/ILSVRC2012_val_00043863.JPEG n07753592/
+mv val/ILSVRC2012_val_00043864.JPEG n04067472/
+mv val/ILSVRC2012_val_00043865.JPEG n04579145/
+mv val/ILSVRC2012_val_00043866.JPEG n07880968/
+mv val/ILSVRC2012_val_00043867.JPEG n02231487/
+mv val/ILSVRC2012_val_00043868.JPEG n04486054/
+mv val/ILSVRC2012_val_00043869.JPEG n03658185/
+mv val/ILSVRC2012_val_00043870.JPEG n04429376/
+mv val/ILSVRC2012_val_00043871.JPEG n03126707/
+mv val/ILSVRC2012_val_00043872.JPEG n02085620/
+mv val/ILSVRC2012_val_00043873.JPEG n02104365/
+mv val/ILSVRC2012_val_00043874.JPEG n02692877/
+mv val/ILSVRC2012_val_00043875.JPEG n04557648/
+mv val/ILSVRC2012_val_00043876.JPEG n04606251/
+mv val/ILSVRC2012_val_00043877.JPEG n03888605/
+mv val/ILSVRC2012_val_00043878.JPEG n02105412/
+mv val/ILSVRC2012_val_00043879.JPEG n06785654/
+mv val/ILSVRC2012_val_00043880.JPEG n02101388/
+mv val/ILSVRC2012_val_00043881.JPEG n03393912/
+mv val/ILSVRC2012_val_00043882.JPEG n04370456/
+mv val/ILSVRC2012_val_00043883.JPEG n12985857/
+mv val/ILSVRC2012_val_00043884.JPEG n07871810/
+mv val/ILSVRC2012_val_00043885.JPEG n03742115/
+mv val/ILSVRC2012_val_00043886.JPEG n04238763/
+mv val/ILSVRC2012_val_00043887.JPEG n02101006/
+mv val/ILSVRC2012_val_00043888.JPEG n02090379/
+mv val/ILSVRC2012_val_00043889.JPEG n09399592/
+mv val/ILSVRC2012_val_00043890.JPEG n07930864/
+mv val/ILSVRC2012_val_00043891.JPEG n02123597/
+mv val/ILSVRC2012_val_00043892.JPEG n03494278/
+mv val/ILSVRC2012_val_00043893.JPEG n02363005/
+mv val/ILSVRC2012_val_00043894.JPEG n07892512/
+mv val/ILSVRC2012_val_00043895.JPEG n02776631/
+mv val/ILSVRC2012_val_00043896.JPEG n03785016/
+mv val/ILSVRC2012_val_00043897.JPEG n07930864/
+mv val/ILSVRC2012_val_00043898.JPEG n02123394/
+mv val/ILSVRC2012_val_00043899.JPEG n01855032/
+mv val/ILSVRC2012_val_00043900.JPEG n02883205/
+mv val/ILSVRC2012_val_00043901.JPEG n02091831/
+mv val/ILSVRC2012_val_00043902.JPEG n03868242/
+mv val/ILSVRC2012_val_00043903.JPEG n02930766/
+mv val/ILSVRC2012_val_00043904.JPEG n01945685/
+mv val/ILSVRC2012_val_00043905.JPEG n03594734/
+mv val/ILSVRC2012_val_00043906.JPEG n02493793/
+mv val/ILSVRC2012_val_00043907.JPEG n02398521/
+mv val/ILSVRC2012_val_00043908.JPEG n04501370/
+mv val/ILSVRC2012_val_00043909.JPEG n03417042/
+mv val/ILSVRC2012_val_00043910.JPEG n02815834/
+mv val/ILSVRC2012_val_00043911.JPEG n03710637/
+mv val/ILSVRC2012_val_00043912.JPEG n02100583/
+mv val/ILSVRC2012_val_00043913.JPEG n02497673/
+mv val/ILSVRC2012_val_00043914.JPEG n02894605/
+mv val/ILSVRC2012_val_00043915.JPEG n03895866/
+mv val/ILSVRC2012_val_00043916.JPEG n01756291/
+mv val/ILSVRC2012_val_00043917.JPEG n02091032/
+mv val/ILSVRC2012_val_00043918.JPEG n02120505/
+mv val/ILSVRC2012_val_00043919.JPEG n03980874/
+mv val/ILSVRC2012_val_00043920.JPEG n07745940/
+mv val/ILSVRC2012_val_00043921.JPEG n02769748/
+mv val/ILSVRC2012_val_00043922.JPEG n04208210/
+mv val/ILSVRC2012_val_00043923.JPEG n01990800/
+mv val/ILSVRC2012_val_00043924.JPEG n02397096/
+mv val/ILSVRC2012_val_00043925.JPEG n01692333/
+mv val/ILSVRC2012_val_00043926.JPEG n03814639/
+mv val/ILSVRC2012_val_00043927.JPEG n01855672/
+mv val/ILSVRC2012_val_00043928.JPEG n04154565/
+mv val/ILSVRC2012_val_00043929.JPEG n02317335/
+mv val/ILSVRC2012_val_00043930.JPEG n02815834/
+mv val/ILSVRC2012_val_00043931.JPEG n07693725/
+mv val/ILSVRC2012_val_00043932.JPEG n03720891/
+mv val/ILSVRC2012_val_00043933.JPEG n02110627/
+mv val/ILSVRC2012_val_00043934.JPEG n13037406/
+mv val/ILSVRC2012_val_00043935.JPEG n02391049/
+mv val/ILSVRC2012_val_00043936.JPEG n04131690/
+mv val/ILSVRC2012_val_00043937.JPEG n01930112/
+mv val/ILSVRC2012_val_00043938.JPEG n07760859/
+mv val/ILSVRC2012_val_00043939.JPEG n03770679/
+mv val/ILSVRC2012_val_00043940.JPEG n02111500/
+mv val/ILSVRC2012_val_00043941.JPEG n04252225/
+mv val/ILSVRC2012_val_00043942.JPEG n01877812/
+mv val/ILSVRC2012_val_00043943.JPEG n03180011/
+mv val/ILSVRC2012_val_00043944.JPEG n13044778/
+mv val/ILSVRC2012_val_00043945.JPEG n02492660/
+mv val/ILSVRC2012_val_00043946.JPEG n04273569/
+mv val/ILSVRC2012_val_00043947.JPEG n04004767/
+mv val/ILSVRC2012_val_00043948.JPEG n04238763/
+mv val/ILSVRC2012_val_00043949.JPEG n03706229/
+mv val/ILSVRC2012_val_00043950.JPEG n04357314/
+mv val/ILSVRC2012_val_00043951.JPEG n01641577/
+mv val/ILSVRC2012_val_00043952.JPEG n04311174/
+mv val/ILSVRC2012_val_00043953.JPEG n03109150/
+mv val/ILSVRC2012_val_00043954.JPEG n03866082/
+mv val/ILSVRC2012_val_00043955.JPEG n03933933/
+mv val/ILSVRC2012_val_00043956.JPEG n02412080/
+mv val/ILSVRC2012_val_00043957.JPEG n03207743/
+mv val/ILSVRC2012_val_00043958.JPEG n03218198/
+mv val/ILSVRC2012_val_00043959.JPEG n07716906/
+mv val/ILSVRC2012_val_00043960.JPEG n03218198/
+mv val/ILSVRC2012_val_00043961.JPEG n02667093/
+mv val/ILSVRC2012_val_00043962.JPEG n02799071/
+mv val/ILSVRC2012_val_00043963.JPEG n02346627/
+mv val/ILSVRC2012_val_00043964.JPEG n03874293/
+mv val/ILSVRC2012_val_00043965.JPEG n01537544/
+mv val/ILSVRC2012_val_00043966.JPEG n01728572/
+mv val/ILSVRC2012_val_00043967.JPEG n03804744/
+mv val/ILSVRC2012_val_00043968.JPEG n01855672/
+mv val/ILSVRC2012_val_00043969.JPEG n01744401/
+mv val/ILSVRC2012_val_00043970.JPEG n02747177/
+mv val/ILSVRC2012_val_00043971.JPEG n02939185/
+mv val/ILSVRC2012_val_00043972.JPEG n02676566/
+mv val/ILSVRC2012_val_00043973.JPEG n02950826/
+mv val/ILSVRC2012_val_00043974.JPEG n02097298/
+mv val/ILSVRC2012_val_00043975.JPEG n01819313/
+mv val/ILSVRC2012_val_00043976.JPEG n02276258/
+mv val/ILSVRC2012_val_00043977.JPEG n09428293/
+mv val/ILSVRC2012_val_00043978.JPEG n01682714/
+mv val/ILSVRC2012_val_00043979.JPEG n03710637/
+mv val/ILSVRC2012_val_00043980.JPEG n03920288/
+mv val/ILSVRC2012_val_00043981.JPEG n02672831/
+mv val/ILSVRC2012_val_00043982.JPEG n02447366/
+mv val/ILSVRC2012_val_00043983.JPEG n02860847/
+mv val/ILSVRC2012_val_00043984.JPEG n02412080/
+mv val/ILSVRC2012_val_00043985.JPEG n04254680/
+mv val/ILSVRC2012_val_00043986.JPEG n01692333/
+mv val/ILSVRC2012_val_00043987.JPEG n02807133/
+mv val/ILSVRC2012_val_00043988.JPEG n03394916/
+mv val/ILSVRC2012_val_00043989.JPEG n13133613/
+mv val/ILSVRC2012_val_00043990.JPEG n01806567/
+mv val/ILSVRC2012_val_00043991.JPEG n07720875/
+mv val/ILSVRC2012_val_00043992.JPEG n07836838/
+mv val/ILSVRC2012_val_00043993.JPEG n02088094/
+mv val/ILSVRC2012_val_00043994.JPEG n02102040/
+mv val/ILSVRC2012_val_00043995.JPEG n01580077/
+mv val/ILSVRC2012_val_00043996.JPEG n03775546/
+mv val/ILSVRC2012_val_00043997.JPEG n04238763/
+mv val/ILSVRC2012_val_00043998.JPEG n04118776/
+mv val/ILSVRC2012_val_00043999.JPEG n04540053/
+mv val/ILSVRC2012_val_00044000.JPEG n02096294/
+mv val/ILSVRC2012_val_00044001.JPEG n02441942/
+mv val/ILSVRC2012_val_00044002.JPEG n03781244/
+mv val/ILSVRC2012_val_00044003.JPEG n02093256/
+mv val/ILSVRC2012_val_00044004.JPEG n02988304/
+mv val/ILSVRC2012_val_00044005.JPEG n02423022/
+mv val/ILSVRC2012_val_00044006.JPEG n07871810/
+mv val/ILSVRC2012_val_00044007.JPEG n01704323/
+mv val/ILSVRC2012_val_00044008.JPEG n02132136/
+mv val/ILSVRC2012_val_00044009.JPEG n01560419/
+mv val/ILSVRC2012_val_00044010.JPEG n02206856/
+mv val/ILSVRC2012_val_00044011.JPEG n01833805/
+mv val/ILSVRC2012_val_00044012.JPEG n02980441/
+mv val/ILSVRC2012_val_00044013.JPEG n11879895/
+mv val/ILSVRC2012_val_00044014.JPEG n07875152/
+mv val/ILSVRC2012_val_00044015.JPEG n03930313/
+mv val/ILSVRC2012_val_00044016.JPEG n03042490/
+mv val/ILSVRC2012_val_00044017.JPEG n03954731/
+mv val/ILSVRC2012_val_00044018.JPEG n03933933/
+mv val/ILSVRC2012_val_00044019.JPEG n03126707/
+mv val/ILSVRC2012_val_00044020.JPEG n03461385/
+mv val/ILSVRC2012_val_00044021.JPEG n02114855/
+mv val/ILSVRC2012_val_00044022.JPEG n03929660/
+mv val/ILSVRC2012_val_00044023.JPEG n04550184/
+mv val/ILSVRC2012_val_00044024.JPEG n02783161/
+mv val/ILSVRC2012_val_00044025.JPEG n03944341/
+mv val/ILSVRC2012_val_00044026.JPEG n07693725/
+mv val/ILSVRC2012_val_00044027.JPEG n02123045/
+mv val/ILSVRC2012_val_00044028.JPEG n09288635/
+mv val/ILSVRC2012_val_00044029.JPEG n03196217/
+mv val/ILSVRC2012_val_00044030.JPEG n03297495/
+mv val/ILSVRC2012_val_00044031.JPEG n02091831/
+mv val/ILSVRC2012_val_00044032.JPEG n03670208/
+mv val/ILSVRC2012_val_00044033.JPEG n04487394/
+mv val/ILSVRC2012_val_00044034.JPEG n02105251/
+mv val/ILSVRC2012_val_00044035.JPEG n02454379/
+mv val/ILSVRC2012_val_00044036.JPEG n02099849/
+mv val/ILSVRC2012_val_00044037.JPEG n04409515/
+mv val/ILSVRC2012_val_00044038.JPEG n01592084/
+mv val/ILSVRC2012_val_00044039.JPEG n02092002/
+mv val/ILSVRC2012_val_00044040.JPEG n07590611/
+mv val/ILSVRC2012_val_00044041.JPEG n03992509/
+mv val/ILSVRC2012_val_00044042.JPEG n02412080/
+mv val/ILSVRC2012_val_00044043.JPEG n03075370/
+mv val/ILSVRC2012_val_00044044.JPEG n02447366/
+mv val/ILSVRC2012_val_00044045.JPEG n02669723/
+mv val/ILSVRC2012_val_00044046.JPEG n12985857/
+mv val/ILSVRC2012_val_00044047.JPEG n03584254/
+mv val/ILSVRC2012_val_00044048.JPEG n01753488/
+mv val/ILSVRC2012_val_00044049.JPEG n02708093/
+mv val/ILSVRC2012_val_00044050.JPEG n02497673/
+mv val/ILSVRC2012_val_00044051.JPEG n04069434/
+mv val/ILSVRC2012_val_00044052.JPEG n01484850/
+mv val/ILSVRC2012_val_00044053.JPEG n07873807/
+mv val/ILSVRC2012_val_00044054.JPEG n03492542/
+mv val/ILSVRC2012_val_00044055.JPEG n03457902/
+mv val/ILSVRC2012_val_00044056.JPEG n03670208/
+mv val/ILSVRC2012_val_00044057.JPEG n04376876/
+mv val/ILSVRC2012_val_00044058.JPEG n01697457/
+mv val/ILSVRC2012_val_00044059.JPEG n02101556/
+mv val/ILSVRC2012_val_00044060.JPEG n11879895/
+mv val/ILSVRC2012_val_00044061.JPEG n02071294/
+mv val/ILSVRC2012_val_00044062.JPEG n03710193/
+mv val/ILSVRC2012_val_00044063.JPEG n03961711/
+mv val/ILSVRC2012_val_00044064.JPEG n03930313/
+mv val/ILSVRC2012_val_00044065.JPEG n02793495/
+mv val/ILSVRC2012_val_00044066.JPEG n12768682/
+mv val/ILSVRC2012_val_00044067.JPEG n03657121/
+mv val/ILSVRC2012_val_00044068.JPEG n04596742/
+mv val/ILSVRC2012_val_00044069.JPEG n04204238/
+mv val/ILSVRC2012_val_00044070.JPEG n02093754/
+mv val/ILSVRC2012_val_00044071.JPEG n03961711/
+mv val/ILSVRC2012_val_00044072.JPEG n09472597/
+mv val/ILSVRC2012_val_00044073.JPEG n03379051/
+mv val/ILSVRC2012_val_00044074.JPEG n02417914/
+mv val/ILSVRC2012_val_00044075.JPEG n02107312/
+mv val/ILSVRC2012_val_00044076.JPEG n02489166/
+mv val/ILSVRC2012_val_00044077.JPEG n01828970/
+mv val/ILSVRC2012_val_00044078.JPEG n03884397/
+mv val/ILSVRC2012_val_00044079.JPEG n04251144/
+mv val/ILSVRC2012_val_00044080.JPEG n03792782/
+mv val/ILSVRC2012_val_00044081.JPEG n02782093/
+mv val/ILSVRC2012_val_00044082.JPEG n01820546/
+mv val/ILSVRC2012_val_00044083.JPEG n02981792/
+mv val/ILSVRC2012_val_00044084.JPEG n06359193/
+mv val/ILSVRC2012_val_00044085.JPEG n03443371/
+mv val/ILSVRC2012_val_00044086.JPEG n01735189/
+mv val/ILSVRC2012_val_00044087.JPEG n04501370/
+mv val/ILSVRC2012_val_00044088.JPEG n03673027/
+mv val/ILSVRC2012_val_00044089.JPEG n03770679/
+mv val/ILSVRC2012_val_00044090.JPEG n03085013/
+mv val/ILSVRC2012_val_00044091.JPEG n02112706/
+mv val/ILSVRC2012_val_00044092.JPEG n01978287/
+mv val/ILSVRC2012_val_00044093.JPEG n02794156/
+mv val/ILSVRC2012_val_00044094.JPEG n02087394/
+mv val/ILSVRC2012_val_00044095.JPEG n01443537/
+mv val/ILSVRC2012_val_00044096.JPEG n04286575/
+mv val/ILSVRC2012_val_00044097.JPEG n02123394/
+mv val/ILSVRC2012_val_00044098.JPEG n04264628/
+mv val/ILSVRC2012_val_00044099.JPEG n03337140/
+mv val/ILSVRC2012_val_00044100.JPEG n03710721/
+mv val/ILSVRC2012_val_00044101.JPEG n03947888/
+mv val/ILSVRC2012_val_00044102.JPEG n02514041/
+mv val/ILSVRC2012_val_00044103.JPEG n02328150/
+mv val/ILSVRC2012_val_00044104.JPEG n02110185/
+mv val/ILSVRC2012_val_00044105.JPEG n03992509/
+mv val/ILSVRC2012_val_00044106.JPEG n02965783/
+mv val/ILSVRC2012_val_00044107.JPEG n02096177/
+mv val/ILSVRC2012_val_00044108.JPEG n01824575/
+mv val/ILSVRC2012_val_00044109.JPEG n03929855/
+mv val/ILSVRC2012_val_00044110.JPEG n02815834/
+mv val/ILSVRC2012_val_00044111.JPEG n02643566/
+mv val/ILSVRC2012_val_00044112.JPEG n01744401/
+mv val/ILSVRC2012_val_00044113.JPEG n02672831/
+mv val/ILSVRC2012_val_00044114.JPEG n02447366/
+mv val/ILSVRC2012_val_00044115.JPEG n06874185/
+mv val/ILSVRC2012_val_00044116.JPEG n04325704/
+mv val/ILSVRC2012_val_00044117.JPEG n02317335/
+mv val/ILSVRC2012_val_00044118.JPEG n03126707/
+mv val/ILSVRC2012_val_00044119.JPEG n02056570/
+mv val/ILSVRC2012_val_00044120.JPEG n02457408/
+mv val/ILSVRC2012_val_00044121.JPEG n03443371/
+mv val/ILSVRC2012_val_00044122.JPEG n04125021/
+mv val/ILSVRC2012_val_00044123.JPEG n03866082/
+mv val/ILSVRC2012_val_00044124.JPEG n03127747/
+mv val/ILSVRC2012_val_00044125.JPEG n04311004/
+mv val/ILSVRC2012_val_00044126.JPEG n02134084/
+mv val/ILSVRC2012_val_00044127.JPEG n01910747/
+mv val/ILSVRC2012_val_00044128.JPEG n07716358/
+mv val/ILSVRC2012_val_00044129.JPEG n02134418/
+mv val/ILSVRC2012_val_00044130.JPEG n02071294/
+mv val/ILSVRC2012_val_00044131.JPEG n04335435/
+mv val/ILSVRC2012_val_00044132.JPEG n03594734/
+mv val/ILSVRC2012_val_00044133.JPEG n06359193/
+mv val/ILSVRC2012_val_00044134.JPEG n04336792/
+mv val/ILSVRC2012_val_00044135.JPEG n02097474/
+mv val/ILSVRC2012_val_00044136.JPEG n07717410/
+mv val/ILSVRC2012_val_00044137.JPEG n02092339/
+mv val/ILSVRC2012_val_00044138.JPEG n04376876/
+mv val/ILSVRC2012_val_00044139.JPEG n03785016/
+mv val/ILSVRC2012_val_00044140.JPEG n02087394/
+mv val/ILSVRC2012_val_00044141.JPEG n02825657/
+mv val/ILSVRC2012_val_00044142.JPEG n03208938/
+mv val/ILSVRC2012_val_00044143.JPEG n03720891/
+mv val/ILSVRC2012_val_00044144.JPEG n04366367/
+mv val/ILSVRC2012_val_00044145.JPEG n02480855/
+mv val/ILSVRC2012_val_00044146.JPEG n03124043/
+mv val/ILSVRC2012_val_00044147.JPEG n04067472/
+mv val/ILSVRC2012_val_00044148.JPEG n03180011/
+mv val/ILSVRC2012_val_00044149.JPEG n04049303/
+mv val/ILSVRC2012_val_00044150.JPEG n04243546/
+mv val/ILSVRC2012_val_00044151.JPEG n04423845/
+mv val/ILSVRC2012_val_00044152.JPEG n03127747/
+mv val/ILSVRC2012_val_00044153.JPEG n02259212/
+mv val/ILSVRC2012_val_00044154.JPEG n03697007/
+mv val/ILSVRC2012_val_00044155.JPEG n04136333/
+mv val/ILSVRC2012_val_00044156.JPEG n04590129/
+mv val/ILSVRC2012_val_00044157.JPEG n03942813/
+mv val/ILSVRC2012_val_00044158.JPEG n02268443/
+mv val/ILSVRC2012_val_00044159.JPEG n04008634/
+mv val/ILSVRC2012_val_00044160.JPEG n04254680/
+mv val/ILSVRC2012_val_00044161.JPEG n04125021/
+mv val/ILSVRC2012_val_00044162.JPEG n04040759/
+mv val/ILSVRC2012_val_00044163.JPEG n03924679/
+mv val/ILSVRC2012_val_00044164.JPEG n04485082/
+mv val/ILSVRC2012_val_00044165.JPEG n02410509/
+mv val/ILSVRC2012_val_00044166.JPEG n04259630/
+mv val/ILSVRC2012_val_00044167.JPEG n03584829/
+mv val/ILSVRC2012_val_00044168.JPEG n03196217/
+mv val/ILSVRC2012_val_00044169.JPEG n03776460/
+mv val/ILSVRC2012_val_00044170.JPEG n01774750/
+mv val/ILSVRC2012_val_00044171.JPEG n09421951/
+mv val/ILSVRC2012_val_00044172.JPEG n07802026/
+mv val/ILSVRC2012_val_00044173.JPEG n04399382/
+mv val/ILSVRC2012_val_00044174.JPEG n04536866/
+mv val/ILSVRC2012_val_00044175.JPEG n04525038/
+mv val/ILSVRC2012_val_00044176.JPEG n02091467/
+mv val/ILSVRC2012_val_00044177.JPEG n03902125/
+mv val/ILSVRC2012_val_00044178.JPEG n03544143/
+mv val/ILSVRC2012_val_00044179.JPEG n02791270/
+mv val/ILSVRC2012_val_00044180.JPEG n03888605/
+mv val/ILSVRC2012_val_00044181.JPEG n03376595/
+mv val/ILSVRC2012_val_00044182.JPEG n02397096/
+mv val/ILSVRC2012_val_00044183.JPEG n03777754/
+mv val/ILSVRC2012_val_00044184.JPEG n04592741/
+mv val/ILSVRC2012_val_00044185.JPEG n03047690/
+mv val/ILSVRC2012_val_00044186.JPEG n07693725/
+mv val/ILSVRC2012_val_00044187.JPEG n02113978/
+mv val/ILSVRC2012_val_00044188.JPEG n04398044/
+mv val/ILSVRC2012_val_00044189.JPEG n02783161/
+mv val/ILSVRC2012_val_00044190.JPEG n04596742/
+mv val/ILSVRC2012_val_00044191.JPEG n03785016/
+mv val/ILSVRC2012_val_00044192.JPEG n01582220/
+mv val/ILSVRC2012_val_00044193.JPEG n02791270/
+mv val/ILSVRC2012_val_00044194.JPEG n02791124/
+mv val/ILSVRC2012_val_00044195.JPEG n02129165/
+mv val/ILSVRC2012_val_00044196.JPEG n03404251/
+mv val/ILSVRC2012_val_00044197.JPEG n03670208/
+mv val/ILSVRC2012_val_00044198.JPEG n03903868/
+mv val/ILSVRC2012_val_00044199.JPEG n02978881/
+mv val/ILSVRC2012_val_00044200.JPEG n02094433/
+mv val/ILSVRC2012_val_00044201.JPEG n04252225/
+mv val/ILSVRC2012_val_00044202.JPEG n02096177/
+mv val/ILSVRC2012_val_00044203.JPEG n03496892/
+mv val/ILSVRC2012_val_00044204.JPEG n03000684/
+mv val/ILSVRC2012_val_00044205.JPEG n03983396/
+mv val/ILSVRC2012_val_00044206.JPEG n02111277/
+mv val/ILSVRC2012_val_00044207.JPEG n03720891/
+mv val/ILSVRC2012_val_00044208.JPEG n03782006/
+mv val/ILSVRC2012_val_00044209.JPEG n01829413/
+mv val/ILSVRC2012_val_00044210.JPEG n04153751/
+mv val/ILSVRC2012_val_00044211.JPEG n03271574/
+mv val/ILSVRC2012_val_00044212.JPEG n03538406/
+mv val/ILSVRC2012_val_00044213.JPEG n03970156/
+mv val/ILSVRC2012_val_00044214.JPEG n03924679/
+mv val/ILSVRC2012_val_00044215.JPEG n02088094/
+mv val/ILSVRC2012_val_00044216.JPEG n01806143/
+mv val/ILSVRC2012_val_00044217.JPEG n02113978/
+mv val/ILSVRC2012_val_00044218.JPEG n03207941/
+mv val/ILSVRC2012_val_00044219.JPEG n03347037/
+mv val/ILSVRC2012_val_00044220.JPEG n03633091/
+mv val/ILSVRC2012_val_00044221.JPEG n03404251/
+mv val/ILSVRC2012_val_00044222.JPEG n04579145/
+mv val/ILSVRC2012_val_00044223.JPEG n02276258/
+mv val/ILSVRC2012_val_00044224.JPEG n02086240/
+mv val/ILSVRC2012_val_00044225.JPEG n02799071/
+mv val/ILSVRC2012_val_00044226.JPEG n03871628/
+mv val/ILSVRC2012_val_00044227.JPEG n02087394/
+mv val/ILSVRC2012_val_00044228.JPEG n02264363/
+mv val/ILSVRC2012_val_00044229.JPEG n03478589/
+mv val/ILSVRC2012_val_00044230.JPEG n03788365/
+mv val/ILSVRC2012_val_00044231.JPEG n02097658/
+mv val/ILSVRC2012_val_00044232.JPEG n02093647/
+mv val/ILSVRC2012_val_00044233.JPEG n07920052/
+mv val/ILSVRC2012_val_00044234.JPEG n03788195/
+mv val/ILSVRC2012_val_00044235.JPEG n03720891/
+mv val/ILSVRC2012_val_00044236.JPEG n07717556/
+mv val/ILSVRC2012_val_00044237.JPEG n02113023/
+mv val/ILSVRC2012_val_00044238.JPEG n01855032/
+mv val/ILSVRC2012_val_00044239.JPEG n07802026/
+mv val/ILSVRC2012_val_00044240.JPEG n02037110/
+mv val/ILSVRC2012_val_00044241.JPEG n03832673/
+mv val/ILSVRC2012_val_00044242.JPEG n04350905/
+mv val/ILSVRC2012_val_00044243.JPEG n07613480/
+mv val/ILSVRC2012_val_00044244.JPEG n02814860/
+mv val/ILSVRC2012_val_00044245.JPEG n03777754/
+mv val/ILSVRC2012_val_00044246.JPEG n03218198/
+mv val/ILSVRC2012_val_00044247.JPEG n02441942/
+mv val/ILSVRC2012_val_00044248.JPEG n02115913/
+mv val/ILSVRC2012_val_00044249.JPEG n02109961/
+mv val/ILSVRC2012_val_00044250.JPEG n04347754/
+mv val/ILSVRC2012_val_00044251.JPEG n03841143/
+mv val/ILSVRC2012_val_00044252.JPEG n02786058/
+mv val/ILSVRC2012_val_00044253.JPEG n02690373/
+mv val/ILSVRC2012_val_00044254.JPEG n07697313/
+mv val/ILSVRC2012_val_00044255.JPEG n07613480/
+mv val/ILSVRC2012_val_00044256.JPEG n01873310/
+mv val/ILSVRC2012_val_00044257.JPEG n03874599/
+mv val/ILSVRC2012_val_00044258.JPEG n02113624/
+mv val/ILSVRC2012_val_00044259.JPEG n02992211/
+mv val/ILSVRC2012_val_00044260.JPEG n07871810/
+mv val/ILSVRC2012_val_00044261.JPEG n03388183/
+mv val/ILSVRC2012_val_00044262.JPEG n01644900/
+mv val/ILSVRC2012_val_00044263.JPEG n04067472/
+mv val/ILSVRC2012_val_00044264.JPEG n04039381/
+mv val/ILSVRC2012_val_00044265.JPEG n02361337/
+mv val/ILSVRC2012_val_00044266.JPEG n04039381/
+mv val/ILSVRC2012_val_00044267.JPEG n04370456/
+mv val/ILSVRC2012_val_00044268.JPEG n01843065/
+mv val/ILSVRC2012_val_00044269.JPEG n01877812/
+mv val/ILSVRC2012_val_00044270.JPEG n02488291/
+mv val/ILSVRC2012_val_00044271.JPEG n03692522/
+mv val/ILSVRC2012_val_00044272.JPEG n02669723/
+mv val/ILSVRC2012_val_00044273.JPEG n03018349/
+mv val/ILSVRC2012_val_00044274.JPEG n03207743/
+mv val/ILSVRC2012_val_00044275.JPEG n02096177/
+mv val/ILSVRC2012_val_00044276.JPEG n01514859/
+mv val/ILSVRC2012_val_00044277.JPEG n02105056/
+mv val/ILSVRC2012_val_00044278.JPEG n03495258/
+mv val/ILSVRC2012_val_00044279.JPEG n03207743/
+mv val/ILSVRC2012_val_00044280.JPEG n04523525/
+mv val/ILSVRC2012_val_00044281.JPEG n03259280/
+mv val/ILSVRC2012_val_00044282.JPEG n03127747/
+mv val/ILSVRC2012_val_00044283.JPEG n02988304/
+mv val/ILSVRC2012_val_00044284.JPEG n02096437/
+mv val/ILSVRC2012_val_00044285.JPEG n02087394/
+mv val/ILSVRC2012_val_00044286.JPEG n04370456/
+mv val/ILSVRC2012_val_00044287.JPEG n01882714/
+mv val/ILSVRC2012_val_00044288.JPEG n01644900/
+mv val/ILSVRC2012_val_00044289.JPEG n11879895/
+mv val/ILSVRC2012_val_00044290.JPEG n03814639/
+mv val/ILSVRC2012_val_00044291.JPEG n03763968/
+mv val/ILSVRC2012_val_00044292.JPEG n03788365/
+mv val/ILSVRC2012_val_00044293.JPEG n04579145/
+mv val/ILSVRC2012_val_00044294.JPEG n03837869/
+mv val/ILSVRC2012_val_00044295.JPEG n04429376/
+mv val/ILSVRC2012_val_00044296.JPEG n02219486/
+mv val/ILSVRC2012_val_00044297.JPEG n03983396/
+mv val/ILSVRC2012_val_00044298.JPEG n04591157/
+mv val/ILSVRC2012_val_00044299.JPEG n07693725/
+mv val/ILSVRC2012_val_00044300.JPEG n02281787/
+mv val/ILSVRC2012_val_00044301.JPEG n01829413/
+mv val/ILSVRC2012_val_00044302.JPEG n04606251/
+mv val/ILSVRC2012_val_00044303.JPEG n02795169/
+mv val/ILSVRC2012_val_00044304.JPEG n03467068/
+mv val/ILSVRC2012_val_00044305.JPEG n02486410/
+mv val/ILSVRC2012_val_00044306.JPEG n04505470/
+mv val/ILSVRC2012_val_00044307.JPEG n02488702/
+mv val/ILSVRC2012_val_00044308.JPEG n02108089/
+mv val/ILSVRC2012_val_00044309.JPEG n02783161/
+mv val/ILSVRC2012_val_00044310.JPEG n06596364/
+mv val/ILSVRC2012_val_00044311.JPEG n01558993/
+mv val/ILSVRC2012_val_00044312.JPEG n07871810/
+mv val/ILSVRC2012_val_00044313.JPEG n02655020/
+mv val/ILSVRC2012_val_00044314.JPEG n02256656/
+mv val/ILSVRC2012_val_00044315.JPEG n03290653/
+mv val/ILSVRC2012_val_00044316.JPEG n03131574/
+mv val/ILSVRC2012_val_00044317.JPEG n01829413/
+mv val/ILSVRC2012_val_00044318.JPEG n02930766/
+mv val/ILSVRC2012_val_00044319.JPEG n03529860/
+mv val/ILSVRC2012_val_00044320.JPEG n01871265/
+mv val/ILSVRC2012_val_00044321.JPEG n01675722/
+mv val/ILSVRC2012_val_00044322.JPEG n02840245/
+mv val/ILSVRC2012_val_00044323.JPEG n04392985/
+mv val/ILSVRC2012_val_00044324.JPEG n04286575/
+mv val/ILSVRC2012_val_00044325.JPEG n03404251/
+mv val/ILSVRC2012_val_00044326.JPEG n02823428/
+mv val/ILSVRC2012_val_00044327.JPEG n02951585/
+mv val/ILSVRC2012_val_00044328.JPEG n02077923/
+mv val/ILSVRC2012_val_00044329.JPEG n03000247/
+mv val/ILSVRC2012_val_00044330.JPEG n01843065/
+mv val/ILSVRC2012_val_00044331.JPEG n02804414/
+mv val/ILSVRC2012_val_00044332.JPEG n04525038/
+mv val/ILSVRC2012_val_00044333.JPEG n01749939/
+mv val/ILSVRC2012_val_00044334.JPEG n03095699/
+mv val/ILSVRC2012_val_00044335.JPEG n04552348/
+mv val/ILSVRC2012_val_00044336.JPEG n03532672/
+mv val/ILSVRC2012_val_00044337.JPEG n03527444/
+mv val/ILSVRC2012_val_00044338.JPEG n03947888/
+mv val/ILSVRC2012_val_00044339.JPEG n02667093/
+mv val/ILSVRC2012_val_00044340.JPEG n02346627/
+mv val/ILSVRC2012_val_00044341.JPEG n01667114/
+mv val/ILSVRC2012_val_00044342.JPEG n07749582/
+mv val/ILSVRC2012_val_00044343.JPEG n02128385/
+mv val/ILSVRC2012_val_00044344.JPEG n02093754/
+mv val/ILSVRC2012_val_00044345.JPEG n02092002/
+mv val/ILSVRC2012_val_00044346.JPEG n02782093/
+mv val/ILSVRC2012_val_00044347.JPEG n04310018/
+mv val/ILSVRC2012_val_00044348.JPEG n02104365/
+mv val/ILSVRC2012_val_00044349.JPEG n02134418/
+mv val/ILSVRC2012_val_00044350.JPEG n03769881/
+mv val/ILSVRC2012_val_00044351.JPEG n02776631/
+mv val/ILSVRC2012_val_00044352.JPEG n01984695/
+mv val/ILSVRC2012_val_00044353.JPEG n02097658/
+mv val/ILSVRC2012_val_00044354.JPEG n02095570/
+mv val/ILSVRC2012_val_00044355.JPEG n02321529/
+mv val/ILSVRC2012_val_00044356.JPEG n02108000/
+mv val/ILSVRC2012_val_00044357.JPEG n02098413/
+mv val/ILSVRC2012_val_00044358.JPEG n03623198/
+mv val/ILSVRC2012_val_00044359.JPEG n03100240/
+mv val/ILSVRC2012_val_00044360.JPEG n03109150/
+mv val/ILSVRC2012_val_00044361.JPEG n02168699/
+mv val/ILSVRC2012_val_00044362.JPEG n03017168/
+mv val/ILSVRC2012_val_00044363.JPEG n01819313/
+mv val/ILSVRC2012_val_00044364.JPEG n02117135/
+mv val/ILSVRC2012_val_00044365.JPEG n03871628/
+mv val/ILSVRC2012_val_00044366.JPEG n03924679/
+mv val/ILSVRC2012_val_00044367.JPEG n04399382/
+mv val/ILSVRC2012_val_00044368.JPEG n15075141/
+mv val/ILSVRC2012_val_00044369.JPEG n03884397/
+mv val/ILSVRC2012_val_00044370.JPEG n03425413/
+mv val/ILSVRC2012_val_00044371.JPEG n03584829/
+mv val/ILSVRC2012_val_00044372.JPEG n03976467/
+mv val/ILSVRC2012_val_00044373.JPEG n02979186/
+mv val/ILSVRC2012_val_00044374.JPEG n02124075/
+mv val/ILSVRC2012_val_00044375.JPEG n02869837/
+mv val/ILSVRC2012_val_00044376.JPEG n03998194/
+mv val/ILSVRC2012_val_00044377.JPEG n02025239/
+mv val/ILSVRC2012_val_00044378.JPEG n01558993/
+mv val/ILSVRC2012_val_00044379.JPEG n04044716/
+mv val/ILSVRC2012_val_00044380.JPEG n02107908/
+mv val/ILSVRC2012_val_00044381.JPEG n04404412/
+mv val/ILSVRC2012_val_00044382.JPEG n04266014/
+mv val/ILSVRC2012_val_00044383.JPEG n03944341/
+mv val/ILSVRC2012_val_00044384.JPEG n01751748/
+mv val/ILSVRC2012_val_00044385.JPEG n02025239/
+mv val/ILSVRC2012_val_00044386.JPEG n04040759/
+mv val/ILSVRC2012_val_00044387.JPEG n02102973/
+mv val/ILSVRC2012_val_00044388.JPEG n03930630/
+mv val/ILSVRC2012_val_00044389.JPEG n09246464/
+mv val/ILSVRC2012_val_00044390.JPEG n02174001/
+mv val/ILSVRC2012_val_00044391.JPEG n02389026/
+mv val/ILSVRC2012_val_00044392.JPEG n03764736/
+mv val/ILSVRC2012_val_00044393.JPEG n01795545/
+mv val/ILSVRC2012_val_00044394.JPEG n02790996/
+mv val/ILSVRC2012_val_00044395.JPEG n02526121/
+mv val/ILSVRC2012_val_00044396.JPEG n03133878/
+mv val/ILSVRC2012_val_00044397.JPEG n03124043/
+mv val/ILSVRC2012_val_00044398.JPEG n02979186/
+mv val/ILSVRC2012_val_00044399.JPEG n02093754/
+mv val/ILSVRC2012_val_00044400.JPEG n03598930/
+mv val/ILSVRC2012_val_00044401.JPEG n03250847/
+mv val/ILSVRC2012_val_00044402.JPEG n02134084/
+mv val/ILSVRC2012_val_00044403.JPEG n03733281/
+mv val/ILSVRC2012_val_00044404.JPEG n02226429/
+mv val/ILSVRC2012_val_00044405.JPEG n04019541/
+mv val/ILSVRC2012_val_00044406.JPEG n02105855/
+mv val/ILSVRC2012_val_00044407.JPEG n02256656/
+mv val/ILSVRC2012_val_00044408.JPEG n02787622/
+mv val/ILSVRC2012_val_00044409.JPEG n04435653/
+mv val/ILSVRC2012_val_00044410.JPEG n03599486/
+mv val/ILSVRC2012_val_00044411.JPEG n03733131/
+mv val/ILSVRC2012_val_00044412.JPEG n02325366/
+mv val/ILSVRC2012_val_00044413.JPEG n03259280/
+mv val/ILSVRC2012_val_00044414.JPEG n03028079/
+mv val/ILSVRC2012_val_00044415.JPEG n03476684/
+mv val/ILSVRC2012_val_00044416.JPEG n03133878/
+mv val/ILSVRC2012_val_00044417.JPEG n03590841/
+mv val/ILSVRC2012_val_00044418.JPEG n03197337/
+mv val/ILSVRC2012_val_00044419.JPEG n04525038/
+mv val/ILSVRC2012_val_00044420.JPEG n03494278/
+mv val/ILSVRC2012_val_00044421.JPEG n04270147/
+mv val/ILSVRC2012_val_00044422.JPEG n01860187/
+mv val/ILSVRC2012_val_00044423.JPEG n02086910/
+mv val/ILSVRC2012_val_00044424.JPEG n02457408/
+mv val/ILSVRC2012_val_00044425.JPEG n03627232/
+mv val/ILSVRC2012_val_00044426.JPEG n03133878/
+mv val/ILSVRC2012_val_00044427.JPEG n03947888/
+mv val/ILSVRC2012_val_00044428.JPEG n02823428/
+mv val/ILSVRC2012_val_00044429.JPEG n02097298/
+mv val/ILSVRC2012_val_00044430.JPEG n02108000/
+mv val/ILSVRC2012_val_00044431.JPEG n04540053/
+mv val/ILSVRC2012_val_00044432.JPEG n03141823/
+mv val/ILSVRC2012_val_00044433.JPEG n03201208/
+mv val/ILSVRC2012_val_00044434.JPEG n03476991/
+mv val/ILSVRC2012_val_00044435.JPEG n02113023/
+mv val/ILSVRC2012_val_00044436.JPEG n03777754/
+mv val/ILSVRC2012_val_00044437.JPEG n03854065/
+mv val/ILSVRC2012_val_00044438.JPEG n02415577/
+mv val/ILSVRC2012_val_00044439.JPEG n02974003/
+mv val/ILSVRC2012_val_00044440.JPEG n01820546/
+mv val/ILSVRC2012_val_00044441.JPEG n02087046/
+mv val/ILSVRC2012_val_00044442.JPEG n04149813/
+mv val/ILSVRC2012_val_00044443.JPEG n04332243/
+mv val/ILSVRC2012_val_00044444.JPEG n02090379/
+mv val/ILSVRC2012_val_00044445.JPEG n04509417/
+mv val/ILSVRC2012_val_00044446.JPEG n07760859/
+mv val/ILSVRC2012_val_00044447.JPEG n03637318/
+mv val/ILSVRC2012_val_00044448.JPEG n02672831/
+mv val/ILSVRC2012_val_00044449.JPEG n03141823/
+mv val/ILSVRC2012_val_00044450.JPEG n03538406/
+mv val/ILSVRC2012_val_00044451.JPEG n03201208/
+mv val/ILSVRC2012_val_00044452.JPEG n04286575/
+mv val/ILSVRC2012_val_00044453.JPEG n02097658/
+mv val/ILSVRC2012_val_00044454.JPEG n03873416/
+mv val/ILSVRC2012_val_00044455.JPEG n04515003/
+mv val/ILSVRC2012_val_00044456.JPEG n09193705/
+mv val/ILSVRC2012_val_00044457.JPEG n02939185/
+mv val/ILSVRC2012_val_00044458.JPEG n03933933/
+mv val/ILSVRC2012_val_00044459.JPEG n01749939/
+mv val/ILSVRC2012_val_00044460.JPEG n03483316/
+mv val/ILSVRC2012_val_00044461.JPEG n02098105/
+mv val/ILSVRC2012_val_00044462.JPEG n02107908/
+mv val/ILSVRC2012_val_00044463.JPEG n02130308/
+mv val/ILSVRC2012_val_00044464.JPEG n02105641/
+mv val/ILSVRC2012_val_00044465.JPEG n04458633/
+mv val/ILSVRC2012_val_00044466.JPEG n03692522/
+mv val/ILSVRC2012_val_00044467.JPEG n02777292/
+mv val/ILSVRC2012_val_00044468.JPEG n07565083/
+mv val/ILSVRC2012_val_00044469.JPEG n02708093/
+mv val/ILSVRC2012_val_00044470.JPEG n02783161/
+mv val/ILSVRC2012_val_00044471.JPEG n04037443/
+mv val/ILSVRC2012_val_00044472.JPEG n04259630/
+mv val/ILSVRC2012_val_00044473.JPEG n02112706/
+mv val/ILSVRC2012_val_00044474.JPEG n07802026/
+mv val/ILSVRC2012_val_00044475.JPEG n01729977/
+mv val/ILSVRC2012_val_00044476.JPEG n02168699/
+mv val/ILSVRC2012_val_00044477.JPEG n04192698/
+mv val/ILSVRC2012_val_00044478.JPEG n04209133/
+mv val/ILSVRC2012_val_00044479.JPEG n07590611/
+mv val/ILSVRC2012_val_00044480.JPEG n01729322/
+mv val/ILSVRC2012_val_00044481.JPEG n02028035/
+mv val/ILSVRC2012_val_00044482.JPEG n04579432/
+mv val/ILSVRC2012_val_00044483.JPEG n01518878/
+mv val/ILSVRC2012_val_00044484.JPEG n02443484/
+mv val/ILSVRC2012_val_00044485.JPEG n07742313/
+mv val/ILSVRC2012_val_00044486.JPEG n04376876/
+mv val/ILSVRC2012_val_00044487.JPEG n04019541/
+mv val/ILSVRC2012_val_00044488.JPEG n02791270/
+mv val/ILSVRC2012_val_00044489.JPEG n02906734/
+mv val/ILSVRC2012_val_00044490.JPEG n02264363/
+mv val/ILSVRC2012_val_00044491.JPEG n02233338/
+mv val/ILSVRC2012_val_00044492.JPEG n06874185/
+mv val/ILSVRC2012_val_00044493.JPEG n04069434/
+mv val/ILSVRC2012_val_00044494.JPEG n13044778/
+mv val/ILSVRC2012_val_00044495.JPEG n02981792/
+mv val/ILSVRC2012_val_00044496.JPEG n02117135/
+mv val/ILSVRC2012_val_00044497.JPEG n03775071/
+mv val/ILSVRC2012_val_00044498.JPEG n03249569/
+mv val/ILSVRC2012_val_00044499.JPEG n04239074/
+mv val/ILSVRC2012_val_00044500.JPEG n03868242/
+mv val/ILSVRC2012_val_00044501.JPEG n02099267/
+mv val/ILSVRC2012_val_00044502.JPEG n03467068/
+mv val/ILSVRC2012_val_00044503.JPEG n02791270/
+mv val/ILSVRC2012_val_00044504.JPEG n01632777/
+mv val/ILSVRC2012_val_00044505.JPEG n01817953/
+mv val/ILSVRC2012_val_00044506.JPEG n04325704/
+mv val/ILSVRC2012_val_00044507.JPEG n01582220/
+mv val/ILSVRC2012_val_00044508.JPEG n04081281/
+mv val/ILSVRC2012_val_00044509.JPEG n03838899/
+mv val/ILSVRC2012_val_00044510.JPEG n02865351/
+mv val/ILSVRC2012_val_00044511.JPEG n02445715/
+mv val/ILSVRC2012_val_00044512.JPEG n04009552/
+mv val/ILSVRC2012_val_00044513.JPEG n02089867/
+mv val/ILSVRC2012_val_00044514.JPEG n02256656/
+mv val/ILSVRC2012_val_00044515.JPEG n01860187/
+mv val/ILSVRC2012_val_00044516.JPEG n02815834/
+mv val/ILSVRC2012_val_00044517.JPEG n04447861/
+mv val/ILSVRC2012_val_00044518.JPEG n03786901/
+mv val/ILSVRC2012_val_00044519.JPEG n04120489/
+mv val/ILSVRC2012_val_00044520.JPEG n03584254/
+mv val/ILSVRC2012_val_00044521.JPEG n03255030/
+mv val/ILSVRC2012_val_00044522.JPEG n02006656/
+mv val/ILSVRC2012_val_00044523.JPEG n03187595/
+mv val/ILSVRC2012_val_00044524.JPEG n04152593/
+mv val/ILSVRC2012_val_00044525.JPEG n03467068/
+mv val/ILSVRC2012_val_00044526.JPEG n03942813/
+mv val/ILSVRC2012_val_00044527.JPEG n03947888/
+mv val/ILSVRC2012_val_00044528.JPEG n07831146/
+mv val/ILSVRC2012_val_00044529.JPEG n02090721/
+mv val/ILSVRC2012_val_00044530.JPEG n04532670/
+mv val/ILSVRC2012_val_00044531.JPEG n03018349/
+mv val/ILSVRC2012_val_00044532.JPEG n02093991/
+mv val/ILSVRC2012_val_00044533.JPEG n01917289/
+mv val/ILSVRC2012_val_00044534.JPEG n01729322/
+mv val/ILSVRC2012_val_00044535.JPEG n02108422/
+mv val/ILSVRC2012_val_00044536.JPEG n03197337/
+mv val/ILSVRC2012_val_00044537.JPEG n02951585/
+mv val/ILSVRC2012_val_00044538.JPEG n04263257/
+mv val/ILSVRC2012_val_00044539.JPEG n07932039/
+mv val/ILSVRC2012_val_00044540.JPEG n01537544/
+mv val/ILSVRC2012_val_00044541.JPEG n03495258/
+mv val/ILSVRC2012_val_00044542.JPEG n01755581/
+mv val/ILSVRC2012_val_00044543.JPEG n02096051/
+mv val/ILSVRC2012_val_00044544.JPEG n01737021/
+mv val/ILSVRC2012_val_00044545.JPEG n04120489/
+mv val/ILSVRC2012_val_00044546.JPEG n02111500/
+mv val/ILSVRC2012_val_00044547.JPEG n03895866/
+mv val/ILSVRC2012_val_00044548.JPEG n02106166/
+mv val/ILSVRC2012_val_00044549.JPEG n04350905/
+mv val/ILSVRC2012_val_00044550.JPEG n04081281/
+mv val/ILSVRC2012_val_00044551.JPEG n02791124/
+mv val/ILSVRC2012_val_00044552.JPEG n04501370/
+mv val/ILSVRC2012_val_00044553.JPEG n02115913/
+mv val/ILSVRC2012_val_00044554.JPEG n02088466/
+mv val/ILSVRC2012_val_00044555.JPEG n07614500/
+mv val/ILSVRC2012_val_00044556.JPEG n02410509/
+mv val/ILSVRC2012_val_00044557.JPEG n01740131/
+mv val/ILSVRC2012_val_00044558.JPEG n03483316/
+mv val/ILSVRC2012_val_00044559.JPEG n02701002/
+mv val/ILSVRC2012_val_00044560.JPEG n03792782/
+mv val/ILSVRC2012_val_00044561.JPEG n03995372/
+mv val/ILSVRC2012_val_00044562.JPEG n03016953/
+mv val/ILSVRC2012_val_00044563.JPEG n02536864/
+mv val/ILSVRC2012_val_00044564.JPEG n12144580/
+mv val/ILSVRC2012_val_00044565.JPEG n02011460/
+mv val/ILSVRC2012_val_00044566.JPEG n04355933/
+mv val/ILSVRC2012_val_00044567.JPEG n02423022/
+mv val/ILSVRC2012_val_00044568.JPEG n03658185/
+mv val/ILSVRC2012_val_00044569.JPEG n03344393/
+mv val/ILSVRC2012_val_00044570.JPEG n02096177/
+mv val/ILSVRC2012_val_00044571.JPEG n03692522/
+mv val/ILSVRC2012_val_00044572.JPEG n04423845/
+mv val/ILSVRC2012_val_00044573.JPEG n02110185/
+mv val/ILSVRC2012_val_00044574.JPEG n02177972/
+mv val/ILSVRC2012_val_00044575.JPEG n03197337/
+mv val/ILSVRC2012_val_00044576.JPEG n03924679/
+mv val/ILSVRC2012_val_00044577.JPEG n01749939/
+mv val/ILSVRC2012_val_00044578.JPEG n02229544/
+mv val/ILSVRC2012_val_00044579.JPEG n03000247/
+mv val/ILSVRC2012_val_00044580.JPEG n01744401/
+mv val/ILSVRC2012_val_00044581.JPEG n02321529/
+mv val/ILSVRC2012_val_00044582.JPEG n03874293/
+mv val/ILSVRC2012_val_00044583.JPEG n03481172/
+mv val/ILSVRC2012_val_00044584.JPEG n01872401/
+mv val/ILSVRC2012_val_00044585.JPEG n02112018/
+mv val/ILSVRC2012_val_00044586.JPEG n02492035/
+mv val/ILSVRC2012_val_00044587.JPEG n03670208/
+mv val/ILSVRC2012_val_00044588.JPEG n04372370/
+mv val/ILSVRC2012_val_00044589.JPEG n01697457/
+mv val/ILSVRC2012_val_00044590.JPEG n02788148/
+mv val/ILSVRC2012_val_00044591.JPEG n01796340/
+mv val/ILSVRC2012_val_00044592.JPEG n03272562/
+mv val/ILSVRC2012_val_00044593.JPEG n02098286/
+mv val/ILSVRC2012_val_00044594.JPEG n03781244/
+mv val/ILSVRC2012_val_00044595.JPEG n03666591/
+mv val/ILSVRC2012_val_00044596.JPEG n13037406/
+mv val/ILSVRC2012_val_00044597.JPEG n04532670/
+mv val/ILSVRC2012_val_00044598.JPEG n03394916/
+mv val/ILSVRC2012_val_00044599.JPEG n01744401/
+mv val/ILSVRC2012_val_00044600.JPEG n02114855/
+mv val/ILSVRC2012_val_00044601.JPEG n04542943/
+mv val/ILSVRC2012_val_00044602.JPEG n02860847/
+mv val/ILSVRC2012_val_00044603.JPEG n02268443/
+mv val/ILSVRC2012_val_00044604.JPEG n04254120/
+mv val/ILSVRC2012_val_00044605.JPEG n02088466/
+mv val/ILSVRC2012_val_00044606.JPEG n11939491/
+mv val/ILSVRC2012_val_00044607.JPEG n03788195/
+mv val/ILSVRC2012_val_00044608.JPEG n07860988/
+mv val/ILSVRC2012_val_00044609.JPEG n03832673/
+mv val/ILSVRC2012_val_00044610.JPEG n02134084/
+mv val/ILSVRC2012_val_00044611.JPEG n02092339/
+mv val/ILSVRC2012_val_00044612.JPEG n02797295/
+mv val/ILSVRC2012_val_00044613.JPEG n04252077/
+mv val/ILSVRC2012_val_00044614.JPEG n04591713/
+mv val/ILSVRC2012_val_00044615.JPEG n02096177/
+mv val/ILSVRC2012_val_00044616.JPEG n03134739/
+mv val/ILSVRC2012_val_00044617.JPEG n03982430/
+mv val/ILSVRC2012_val_00044618.JPEG n02107574/
+mv val/ILSVRC2012_val_00044619.JPEG n02233338/
+mv val/ILSVRC2012_val_00044620.JPEG n07697313/
+mv val/ILSVRC2012_val_00044621.JPEG n03891332/
+mv val/ILSVRC2012_val_00044622.JPEG n03325584/
+mv val/ILSVRC2012_val_00044623.JPEG n03208938/
+mv val/ILSVRC2012_val_00044624.JPEG n01518878/
+mv val/ILSVRC2012_val_00044625.JPEG n02509815/
+mv val/ILSVRC2012_val_00044626.JPEG n03710721/
+mv val/ILSVRC2012_val_00044627.JPEG n04487394/
+mv val/ILSVRC2012_val_00044628.JPEG n03014705/
+mv val/ILSVRC2012_val_00044629.JPEG n02099429/
+mv val/ILSVRC2012_val_00044630.JPEG n02834397/
+mv val/ILSVRC2012_val_00044631.JPEG n04141975/
+mv val/ILSVRC2012_val_00044632.JPEG n01978455/
+mv val/ILSVRC2012_val_00044633.JPEG n03891332/
+mv val/ILSVRC2012_val_00044634.JPEG n02870880/
+mv val/ILSVRC2012_val_00044635.JPEG n04265275/
+mv val/ILSVRC2012_val_00044636.JPEG n02497673/
+mv val/ILSVRC2012_val_00044637.JPEG n01955084/
+mv val/ILSVRC2012_val_00044638.JPEG n02963159/
+mv val/ILSVRC2012_val_00044639.JPEG n02099712/
+mv val/ILSVRC2012_val_00044640.JPEG n02793495/
+mv val/ILSVRC2012_val_00044641.JPEG n03691459/
+mv val/ILSVRC2012_val_00044642.JPEG n02085782/
+mv val/ILSVRC2012_val_00044643.JPEG n03991062/
+mv val/ILSVRC2012_val_00044644.JPEG n02088094/
+mv val/ILSVRC2012_val_00044645.JPEG n07711569/
+mv val/ILSVRC2012_val_00044646.JPEG n02346627/
+mv val/ILSVRC2012_val_00044647.JPEG n07695742/
+mv val/ILSVRC2012_val_00044648.JPEG n03218198/
+mv val/ILSVRC2012_val_00044649.JPEG n01784675/
+mv val/ILSVRC2012_val_00044650.JPEG n02799071/
+mv val/ILSVRC2012_val_00044651.JPEG n03944341/
+mv val/ILSVRC2012_val_00044652.JPEG n03179701/
+mv val/ILSVRC2012_val_00044653.JPEG n02415577/
+mv val/ILSVRC2012_val_00044654.JPEG n04370456/
+mv val/ILSVRC2012_val_00044655.JPEG n04443257/
+mv val/ILSVRC2012_val_00044656.JPEG n04254777/
+mv val/ILSVRC2012_val_00044657.JPEG n01496331/
+mv val/ILSVRC2012_val_00044658.JPEG n02699494/
+mv val/ILSVRC2012_val_00044659.JPEG n01677366/
+mv val/ILSVRC2012_val_00044660.JPEG n02514041/
+mv val/ILSVRC2012_val_00044661.JPEG n02086240/
+mv val/ILSVRC2012_val_00044662.JPEG n02107908/
+mv val/ILSVRC2012_val_00044663.JPEG n11879895/
+mv val/ILSVRC2012_val_00044664.JPEG n03770679/
+mv val/ILSVRC2012_val_00044665.JPEG n02749479/
+mv val/ILSVRC2012_val_00044666.JPEG n03803284/
+mv val/ILSVRC2012_val_00044667.JPEG n04485082/
+mv val/ILSVRC2012_val_00044668.JPEG n03201208/
+mv val/ILSVRC2012_val_00044669.JPEG n03045698/
+mv val/ILSVRC2012_val_00044670.JPEG n03944341/
+mv val/ILSVRC2012_val_00044671.JPEG n01930112/
+mv val/ILSVRC2012_val_00044672.JPEG n02113186/
+mv val/ILSVRC2012_val_00044673.JPEG n04286575/
+mv val/ILSVRC2012_val_00044674.JPEG n03706229/
+mv val/ILSVRC2012_val_00044675.JPEG n02871525/
+mv val/ILSVRC2012_val_00044676.JPEG n01774384/
+mv val/ILSVRC2012_val_00044677.JPEG n01855032/
+mv val/ILSVRC2012_val_00044678.JPEG n02109047/
+mv val/ILSVRC2012_val_00044679.JPEG n02114548/
+mv val/ILSVRC2012_val_00044680.JPEG n12998815/
+mv val/ILSVRC2012_val_00044681.JPEG n03218198/
+mv val/ILSVRC2012_val_00044682.JPEG n03216828/
+mv val/ILSVRC2012_val_00044683.JPEG n04371774/
+mv val/ILSVRC2012_val_00044684.JPEG n02114712/
+mv val/ILSVRC2012_val_00044685.JPEG n04548280/
+mv val/ILSVRC2012_val_00044686.JPEG n02276258/
+mv val/ILSVRC2012_val_00044687.JPEG n04033995/
+mv val/ILSVRC2012_val_00044688.JPEG n03393912/
+mv val/ILSVRC2012_val_00044689.JPEG n03980874/
+mv val/ILSVRC2012_val_00044690.JPEG n04389033/
+mv val/ILSVRC2012_val_00044691.JPEG n07583066/
+mv val/ILSVRC2012_val_00044692.JPEG n01704323/
+mv val/ILSVRC2012_val_00044693.JPEG n03445924/
+mv val/ILSVRC2012_val_00044694.JPEG n02018795/
+mv val/ILSVRC2012_val_00044695.JPEG n03445777/
+mv val/ILSVRC2012_val_00044696.JPEG n02098286/
+mv val/ILSVRC2012_val_00044697.JPEG n03838899/
+mv val/ILSVRC2012_val_00044698.JPEG n01689811/
+mv val/ILSVRC2012_val_00044699.JPEG n03666591/
+mv val/ILSVRC2012_val_00044700.JPEG n03000247/
+mv val/ILSVRC2012_val_00044701.JPEG n02099712/
+mv val/ILSVRC2012_val_00044702.JPEG n03483316/
+mv val/ILSVRC2012_val_00044703.JPEG n04505470/
+mv val/ILSVRC2012_val_00044704.JPEG n02490219/
+mv val/ILSVRC2012_val_00044705.JPEG n04239074/
+mv val/ILSVRC2012_val_00044706.JPEG n01531178/
+mv val/ILSVRC2012_val_00044707.JPEG n02116738/
+mv val/ILSVRC2012_val_00044708.JPEG n01950731/
+mv val/ILSVRC2012_val_00044709.JPEG n02113624/
+mv val/ILSVRC2012_val_00044710.JPEG n04204238/
+mv val/ILSVRC2012_val_00044711.JPEG n02276258/
+mv val/ILSVRC2012_val_00044712.JPEG n07715103/
+mv val/ILSVRC2012_val_00044713.JPEG n03026506/
+mv val/ILSVRC2012_val_00044714.JPEG n02108551/
+mv val/ILSVRC2012_val_00044715.JPEG n02127052/
+mv val/ILSVRC2012_val_00044716.JPEG n02088466/
+mv val/ILSVRC2012_val_00044717.JPEG n02093256/
+mv val/ILSVRC2012_val_00044718.JPEG n02102040/
+mv val/ILSVRC2012_val_00044719.JPEG n03976657/
+mv val/ILSVRC2012_val_00044720.JPEG n04532670/
+mv val/ILSVRC2012_val_00044721.JPEG n03776460/
+mv val/ILSVRC2012_val_00044722.JPEG n03220513/
+mv val/ILSVRC2012_val_00044723.JPEG n03903868/
+mv val/ILSVRC2012_val_00044724.JPEG n03792972/
+mv val/ILSVRC2012_val_00044725.JPEG n03529860/
+mv val/ILSVRC2012_val_00044726.JPEG n02009229/
+mv val/ILSVRC2012_val_00044727.JPEG n02113624/
+mv val/ILSVRC2012_val_00044728.JPEG n02447366/
+mv val/ILSVRC2012_val_00044729.JPEG n03461385/
+mv val/ILSVRC2012_val_00044730.JPEG n02102318/
+mv val/ILSVRC2012_val_00044731.JPEG n04263257/
+mv val/ILSVRC2012_val_00044732.JPEG n02114855/
+mv val/ILSVRC2012_val_00044733.JPEG n02676566/
+mv val/ILSVRC2012_val_00044734.JPEG n03425413/
+mv val/ILSVRC2012_val_00044735.JPEG n03538406/
+mv val/ILSVRC2012_val_00044736.JPEG n03666591/
+mv val/ILSVRC2012_val_00044737.JPEG n03272010/
+mv val/ILSVRC2012_val_00044738.JPEG n07768694/
+mv val/ILSVRC2012_val_00044739.JPEG n04392985/
+mv val/ILSVRC2012_val_00044740.JPEG n04330267/
+mv val/ILSVRC2012_val_00044741.JPEG n03026506/
+mv val/ILSVRC2012_val_00044742.JPEG n07730033/
+mv val/ILSVRC2012_val_00044743.JPEG n02094258/
+mv val/ILSVRC2012_val_00044744.JPEG n04515003/
+mv val/ILSVRC2012_val_00044745.JPEG n04265275/
+mv val/ILSVRC2012_val_00044746.JPEG n13044778/
+mv val/ILSVRC2012_val_00044747.JPEG n02965783/
+mv val/ILSVRC2012_val_00044748.JPEG n02120505/
+mv val/ILSVRC2012_val_00044749.JPEG n02058221/
+mv val/ILSVRC2012_val_00044750.JPEG n03314780/
+mv val/ILSVRC2012_val_00044751.JPEG n02793495/
+mv val/ILSVRC2012_val_00044752.JPEG n02708093/
+mv val/ILSVRC2012_val_00044753.JPEG n03633091/
+mv val/ILSVRC2012_val_00044754.JPEG n03014705/
+mv val/ILSVRC2012_val_00044755.JPEG n01665541/
+mv val/ILSVRC2012_val_00044756.JPEG n02526121/
+mv val/ILSVRC2012_val_00044757.JPEG n04067472/
+mv val/ILSVRC2012_val_00044758.JPEG n04428191/
+mv val/ILSVRC2012_val_00044759.JPEG n07836838/
+mv val/ILSVRC2012_val_00044760.JPEG n02177972/
+mv val/ILSVRC2012_val_00044761.JPEG n01817953/
+mv val/ILSVRC2012_val_00044762.JPEG n04296562/
+mv val/ILSVRC2012_val_00044763.JPEG n04099969/
+mv val/ILSVRC2012_val_00044764.JPEG n03956157/
+mv val/ILSVRC2012_val_00044765.JPEG n02114367/
+mv val/ILSVRC2012_val_00044766.JPEG n02091635/
+mv val/ILSVRC2012_val_00044767.JPEG n02113978/
+mv val/ILSVRC2012_val_00044768.JPEG n03838899/
+mv val/ILSVRC2012_val_00044769.JPEG n02437616/
+mv val/ILSVRC2012_val_00044770.JPEG n04370456/
+mv val/ILSVRC2012_val_00044771.JPEG n02423022/
+mv val/ILSVRC2012_val_00044772.JPEG n02112706/
+mv val/ILSVRC2012_val_00044773.JPEG n02096585/
+mv val/ILSVRC2012_val_00044774.JPEG n02497673/
+mv val/ILSVRC2012_val_00044775.JPEG n04505470/
+mv val/ILSVRC2012_val_00044776.JPEG n02098286/
+mv val/ILSVRC2012_val_00044777.JPEG n02319095/
+mv val/ILSVRC2012_val_00044778.JPEG n04560804/
+mv val/ILSVRC2012_val_00044779.JPEG n03976657/
+mv val/ILSVRC2012_val_00044780.JPEG n04330267/
+mv val/ILSVRC2012_val_00044781.JPEG n02481823/
+mv val/ILSVRC2012_val_00044782.JPEG n04532670/
+mv val/ILSVRC2012_val_00044783.JPEG n12057211/
+mv val/ILSVRC2012_val_00044784.JPEG n03584254/
+mv val/ILSVRC2012_val_00044785.JPEG n04065272/
+mv val/ILSVRC2012_val_00044786.JPEG n04596742/
+mv val/ILSVRC2012_val_00044787.JPEG n02823428/
+mv val/ILSVRC2012_val_00044788.JPEG n01494475/
+mv val/ILSVRC2012_val_00044789.JPEG n03133878/
+mv val/ILSVRC2012_val_00044790.JPEG n07579787/
+mv val/ILSVRC2012_val_00044791.JPEG n04141975/
+mv val/ILSVRC2012_val_00044792.JPEG n03794056/
+mv val/ILSVRC2012_val_00044793.JPEG n03000684/
+mv val/ILSVRC2012_val_00044794.JPEG n04067472/
+mv val/ILSVRC2012_val_00044795.JPEG n02108422/
+mv val/ILSVRC2012_val_00044796.JPEG n04254777/
+mv val/ILSVRC2012_val_00044797.JPEG n01616318/
+mv val/ILSVRC2012_val_00044798.JPEG n03814906/
+mv val/ILSVRC2012_val_00044799.JPEG n03444034/
+mv val/ILSVRC2012_val_00044800.JPEG n04277352/
+mv val/ILSVRC2012_val_00044801.JPEG n04612504/
+mv val/ILSVRC2012_val_00044802.JPEG n02917067/
+mv val/ILSVRC2012_val_00044803.JPEG n03729826/
+mv val/ILSVRC2012_val_00044804.JPEG n02095314/
+mv val/ILSVRC2012_val_00044805.JPEG n03796401/
+mv val/ILSVRC2012_val_00044806.JPEG n04486054/
+mv val/ILSVRC2012_val_00044807.JPEG n03637318/
+mv val/ILSVRC2012_val_00044808.JPEG n02786058/
+mv val/ILSVRC2012_val_00044809.JPEG n03661043/
+mv val/ILSVRC2012_val_00044810.JPEG n03400231/
+mv val/ILSVRC2012_val_00044811.JPEG n02112350/
+mv val/ILSVRC2012_val_00044812.JPEG n03980874/
+mv val/ILSVRC2012_val_00044813.JPEG n04251144/
+mv val/ILSVRC2012_val_00044814.JPEG n01978287/
+mv val/ILSVRC2012_val_00044815.JPEG n03483316/
+mv val/ILSVRC2012_val_00044816.JPEG n03633091/
+mv val/ILSVRC2012_val_00044817.JPEG n04597913/
+mv val/ILSVRC2012_val_00044818.JPEG n02093647/
+mv val/ILSVRC2012_val_00044819.JPEG n02097474/
+mv val/ILSVRC2012_val_00044820.JPEG n02097130/
+mv val/ILSVRC2012_val_00044821.JPEG n03998194/
+mv val/ILSVRC2012_val_00044822.JPEG n01689811/
+mv val/ILSVRC2012_val_00044823.JPEG n04482393/
+mv val/ILSVRC2012_val_00044824.JPEG n02231487/
+mv val/ILSVRC2012_val_00044825.JPEG n04328186/
+mv val/ILSVRC2012_val_00044826.JPEG n03188531/
+mv val/ILSVRC2012_val_00044827.JPEG n02490219/
+mv val/ILSVRC2012_val_00044828.JPEG n04579432/
+mv val/ILSVRC2012_val_00044829.JPEG n09256479/
+mv val/ILSVRC2012_val_00044830.JPEG n03770439/
+mv val/ILSVRC2012_val_00044831.JPEG n07697537/
+mv val/ILSVRC2012_val_00044832.JPEG n02389026/
+mv val/ILSVRC2012_val_00044833.JPEG n04252225/
+mv val/ILSVRC2012_val_00044834.JPEG n03594945/
+mv val/ILSVRC2012_val_00044835.JPEG n04310018/
+mv val/ILSVRC2012_val_00044836.JPEG n01978455/
+mv val/ILSVRC2012_val_00044837.JPEG n03803284/
+mv val/ILSVRC2012_val_00044838.JPEG n03063689/
+mv val/ILSVRC2012_val_00044839.JPEG n01924916/
+mv val/ILSVRC2012_val_00044840.JPEG n03240683/
+mv val/ILSVRC2012_val_00044841.JPEG n03837869/
+mv val/ILSVRC2012_val_00044842.JPEG n02114712/
+mv val/ILSVRC2012_val_00044843.JPEG n02999410/
+mv val/ILSVRC2012_val_00044844.JPEG n04371774/
+mv val/ILSVRC2012_val_00044845.JPEG n03676483/
+mv val/ILSVRC2012_val_00044846.JPEG n02091467/
+mv val/ILSVRC2012_val_00044847.JPEG n03196217/
+mv val/ILSVRC2012_val_00044848.JPEG n03347037/
+mv val/ILSVRC2012_val_00044849.JPEG n04487081/
+mv val/ILSVRC2012_val_00044850.JPEG n03888257/
+mv val/ILSVRC2012_val_00044851.JPEG n03787032/
+mv val/ILSVRC2012_val_00044852.JPEG n01631663/
+mv val/ILSVRC2012_val_00044853.JPEG n03447721/
+mv val/ILSVRC2012_val_00044854.JPEG n02086079/
+mv val/ILSVRC2012_val_00044855.JPEG n01644373/
+mv val/ILSVRC2012_val_00044856.JPEG n09468604/
+mv val/ILSVRC2012_val_00044857.JPEG n07613480/
+mv val/ILSVRC2012_val_00044858.JPEG n04356056/
+mv val/ILSVRC2012_val_00044859.JPEG n04493381/
+mv val/ILSVRC2012_val_00044860.JPEG n06785654/
+mv val/ILSVRC2012_val_00044861.JPEG n03179701/
+mv val/ILSVRC2012_val_00044862.JPEG n01675722/
+mv val/ILSVRC2012_val_00044863.JPEG n04429376/
+mv val/ILSVRC2012_val_00044864.JPEG n02966193/
+mv val/ILSVRC2012_val_00044865.JPEG n03584254/
+mv val/ILSVRC2012_val_00044866.JPEG n03673027/
+mv val/ILSVRC2012_val_00044867.JPEG n03223299/
+mv val/ILSVRC2012_val_00044868.JPEG n03443371/
+mv val/ILSVRC2012_val_00044869.JPEG n02106382/
+mv val/ILSVRC2012_val_00044870.JPEG n04125021/
+mv val/ILSVRC2012_val_00044871.JPEG n03786901/
+mv val/ILSVRC2012_val_00044872.JPEG n04467665/
+mv val/ILSVRC2012_val_00044873.JPEG n03498962/
+mv val/ILSVRC2012_val_00044874.JPEG n03662601/
+mv val/ILSVRC2012_val_00044875.JPEG n02088632/
+mv val/ILSVRC2012_val_00044876.JPEG n02510455/
+mv val/ILSVRC2012_val_00044877.JPEG n12998815/
+mv val/ILSVRC2012_val_00044878.JPEG n02747177/
+mv val/ILSVRC2012_val_00044879.JPEG n04252077/
+mv val/ILSVRC2012_val_00044880.JPEG n12267677/
+mv val/ILSVRC2012_val_00044881.JPEG n04501370/
+mv val/ILSVRC2012_val_00044882.JPEG n02113978/
+mv val/ILSVRC2012_val_00044883.JPEG n03141823/
+mv val/ILSVRC2012_val_00044884.JPEG n01817953/
+mv val/ILSVRC2012_val_00044885.JPEG n03126707/
+mv val/ILSVRC2012_val_00044886.JPEG n03110669/
+mv val/ILSVRC2012_val_00044887.JPEG n02910353/
+mv val/ILSVRC2012_val_00044888.JPEG n03417042/
+mv val/ILSVRC2012_val_00044889.JPEG n09193705/
+mv val/ILSVRC2012_val_00044890.JPEG n02102318/
+mv val/ILSVRC2012_val_00044891.JPEG n01807496/
+mv val/ILSVRC2012_val_00044892.JPEG n02268443/
+mv val/ILSVRC2012_val_00044893.JPEG n01632777/
+mv val/ILSVRC2012_val_00044894.JPEG n02814533/
+mv val/ILSVRC2012_val_00044895.JPEG n07875152/
+mv val/ILSVRC2012_val_00044896.JPEG n01484850/
+mv val/ILSVRC2012_val_00044897.JPEG n02092339/
+mv val/ILSVRC2012_val_00044898.JPEG n02791124/
+mv val/ILSVRC2012_val_00044899.JPEG n04417672/
+mv val/ILSVRC2012_val_00044900.JPEG n03160309/
+mv val/ILSVRC2012_val_00044901.JPEG n02134418/
+mv val/ILSVRC2012_val_00044902.JPEG n03483316/
+mv val/ILSVRC2012_val_00044903.JPEG n01829413/
+mv val/ILSVRC2012_val_00044904.JPEG n02095889/
+mv val/ILSVRC2012_val_00044905.JPEG n07693725/
+mv val/ILSVRC2012_val_00044906.JPEG n04579145/
+mv val/ILSVRC2012_val_00044907.JPEG n03942813/
+mv val/ILSVRC2012_val_00044908.JPEG n02091134/
+mv val/ILSVRC2012_val_00044909.JPEG n04209239/
+mv val/ILSVRC2012_val_00044910.JPEG n07584110/
+mv val/ILSVRC2012_val_00044911.JPEG n04590129/
+mv val/ILSVRC2012_val_00044912.JPEG n03873416/
+mv val/ILSVRC2012_val_00044913.JPEG n02105056/
+mv val/ILSVRC2012_val_00044914.JPEG n02488291/
+mv val/ILSVRC2012_val_00044915.JPEG n04136333/
+mv val/ILSVRC2012_val_00044916.JPEG n01855032/
+mv val/ILSVRC2012_val_00044917.JPEG n04525305/
+mv val/ILSVRC2012_val_00044918.JPEG n04039381/
+mv val/ILSVRC2012_val_00044919.JPEG n02025239/
+mv val/ILSVRC2012_val_00044920.JPEG n03476991/
+mv val/ILSVRC2012_val_00044921.JPEG n01614925/
+mv val/ILSVRC2012_val_00044922.JPEG n01735189/
+mv val/ILSVRC2012_val_00044923.JPEG n02894605/
+mv val/ILSVRC2012_val_00044924.JPEG n04505470/
+mv val/ILSVRC2012_val_00044925.JPEG n02127052/
+mv val/ILSVRC2012_val_00044926.JPEG n12267677/
+mv val/ILSVRC2012_val_00044927.JPEG n02865351/
+mv val/ILSVRC2012_val_00044928.JPEG n03481172/
+mv val/ILSVRC2012_val_00044929.JPEG n02445715/
+mv val/ILSVRC2012_val_00044930.JPEG n02892767/
+mv val/ILSVRC2012_val_00044931.JPEG n02974003/
+mv val/ILSVRC2012_val_00044932.JPEG n03249569/
+mv val/ILSVRC2012_val_00044933.JPEG n01860187/
+mv val/ILSVRC2012_val_00044934.JPEG n01687978/
+mv val/ILSVRC2012_val_00044935.JPEG n03733805/
+mv val/ILSVRC2012_val_00044936.JPEG n03445777/
+mv val/ILSVRC2012_val_00044937.JPEG n02676566/
+mv val/ILSVRC2012_val_00044938.JPEG n07734744/
+mv val/ILSVRC2012_val_00044939.JPEG n03544143/
+mv val/ILSVRC2012_val_00044940.JPEG n03676483/
+mv val/ILSVRC2012_val_00044941.JPEG n03877845/
+mv val/ILSVRC2012_val_00044942.JPEG n03372029/
+mv val/ILSVRC2012_val_00044943.JPEG n03977966/
+mv val/ILSVRC2012_val_00044944.JPEG n02090721/
+mv val/ILSVRC2012_val_00044945.JPEG n03676483/
+mv val/ILSVRC2012_val_00044946.JPEG n02655020/
+mv val/ILSVRC2012_val_00044947.JPEG n02134418/
+mv val/ILSVRC2012_val_00044948.JPEG n02364673/
+mv val/ILSVRC2012_val_00044949.JPEG n02110627/
+mv val/ILSVRC2012_val_00044950.JPEG n03527444/
+mv val/ILSVRC2012_val_00044951.JPEG n04317175/
+mv val/ILSVRC2012_val_00044952.JPEG n02280649/
+mv val/ILSVRC2012_val_00044953.JPEG n02788148/
+mv val/ILSVRC2012_val_00044954.JPEG n02119789/
+mv val/ILSVRC2012_val_00044955.JPEG n02804610/
+mv val/ILSVRC2012_val_00044956.JPEG n04435653/
+mv val/ILSVRC2012_val_00044957.JPEG n02120505/
+mv val/ILSVRC2012_val_00044958.JPEG n02802426/
+mv val/ILSVRC2012_val_00044959.JPEG n02606052/
+mv val/ILSVRC2012_val_00044960.JPEG n07717410/
+mv val/ILSVRC2012_val_00044961.JPEG n03290653/
+mv val/ILSVRC2012_val_00044962.JPEG n03017168/
+mv val/ILSVRC2012_val_00044963.JPEG n02087046/
+mv val/ILSVRC2012_val_00044964.JPEG n02093647/
+mv val/ILSVRC2012_val_00044965.JPEG n04259630/
+mv val/ILSVRC2012_val_00044966.JPEG n01819313/
+mv val/ILSVRC2012_val_00044967.JPEG n03467068/
+mv val/ILSVRC2012_val_00044968.JPEG n02113712/
+mv val/ILSVRC2012_val_00044969.JPEG n03935335/
+mv val/ILSVRC2012_val_00044970.JPEG n02927161/
+mv val/ILSVRC2012_val_00044971.JPEG n02113186/
+mv val/ILSVRC2012_val_00044972.JPEG n03673027/
+mv val/ILSVRC2012_val_00044973.JPEG n04200800/
+mv val/ILSVRC2012_val_00044974.JPEG n04192698/
+mv val/ILSVRC2012_val_00044975.JPEG n01518878/
+mv val/ILSVRC2012_val_00044976.JPEG n03417042/
+mv val/ILSVRC2012_val_00044977.JPEG n02093754/
+mv val/ILSVRC2012_val_00044978.JPEG n02088364/
+mv val/ILSVRC2012_val_00044979.JPEG n02749479/
+mv val/ILSVRC2012_val_00044980.JPEG n01688243/
+mv val/ILSVRC2012_val_00044981.JPEG n04070727/
+mv val/ILSVRC2012_val_00044982.JPEG n04604644/
+mv val/ILSVRC2012_val_00044983.JPEG n02457408/
+mv val/ILSVRC2012_val_00044984.JPEG n06874185/
+mv val/ILSVRC2012_val_00044985.JPEG n04483307/
+mv val/ILSVRC2012_val_00044986.JPEG n02422106/
+mv val/ILSVRC2012_val_00044987.JPEG n01692333/
+mv val/ILSVRC2012_val_00044988.JPEG n02834397/
+mv val/ILSVRC2012_val_00044989.JPEG n03485794/
+mv val/ILSVRC2012_val_00044990.JPEG n02219486/
+mv val/ILSVRC2012_val_00044991.JPEG n01950731/
+mv val/ILSVRC2012_val_00044992.JPEG n02028035/
+mv val/ILSVRC2012_val_00044993.JPEG n01644900/
+mv val/ILSVRC2012_val_00044994.JPEG n03125729/
+mv val/ILSVRC2012_val_00044995.JPEG n12144580/
+mv val/ILSVRC2012_val_00044996.JPEG n01682714/
+mv val/ILSVRC2012_val_00044997.JPEG n03843555/
+mv val/ILSVRC2012_val_00044998.JPEG n03602883/
+mv val/ILSVRC2012_val_00044999.JPEG n02018795/
+mv val/ILSVRC2012_val_00045000.JPEG n03447447/
+mv val/ILSVRC2012_val_00045001.JPEG n02865351/
+mv val/ILSVRC2012_val_00045002.JPEG n03223299/
+mv val/ILSVRC2012_val_00045003.JPEG n03355925/
+mv val/ILSVRC2012_val_00045004.JPEG n04592741/
+mv val/ILSVRC2012_val_00045005.JPEG n02106662/
+mv val/ILSVRC2012_val_00045006.JPEG n02033041/
+mv val/ILSVRC2012_val_00045007.JPEG n01820546/
+mv val/ILSVRC2012_val_00045008.JPEG n03761084/
+mv val/ILSVRC2012_val_00045009.JPEG n02165105/
+mv val/ILSVRC2012_val_00045010.JPEG n02397096/
+mv val/ILSVRC2012_val_00045011.JPEG n02101556/
+mv val/ILSVRC2012_val_00045012.JPEG n04328186/
+mv val/ILSVRC2012_val_00045013.JPEG n03933933/
+mv val/ILSVRC2012_val_00045014.JPEG n03355925/
+mv val/ILSVRC2012_val_00045015.JPEG n04328186/
+mv val/ILSVRC2012_val_00045016.JPEG n03950228/
+mv val/ILSVRC2012_val_00045017.JPEG n03134739/
+mv val/ILSVRC2012_val_00045018.JPEG n03535780/
+mv val/ILSVRC2012_val_00045019.JPEG n01748264/
+mv val/ILSVRC2012_val_00045020.JPEG n04330267/
+mv val/ILSVRC2012_val_00045021.JPEG n02699494/
+mv val/ILSVRC2012_val_00045022.JPEG n01985128/
+mv val/ILSVRC2012_val_00045023.JPEG n02978881/
+mv val/ILSVRC2012_val_00045024.JPEG n04141327/
+mv val/ILSVRC2012_val_00045025.JPEG n02403003/
+mv val/ILSVRC2012_val_00045026.JPEG n02120079/
+mv val/ILSVRC2012_val_00045027.JPEG n07579787/
+mv val/ILSVRC2012_val_00045028.JPEG n02317335/
+mv val/ILSVRC2012_val_00045029.JPEG n02509815/
+mv val/ILSVRC2012_val_00045030.JPEG n04146614/
+mv val/ILSVRC2012_val_00045031.JPEG n01944390/
+mv val/ILSVRC2012_val_00045032.JPEG n04467665/
+mv val/ILSVRC2012_val_00045033.JPEG n02927161/
+mv val/ILSVRC2012_val_00045034.JPEG n12620546/
+mv val/ILSVRC2012_val_00045035.JPEG n02098286/
+mv val/ILSVRC2012_val_00045036.JPEG n01914609/
+mv val/ILSVRC2012_val_00045037.JPEG n02486410/
+mv val/ILSVRC2012_val_00045038.JPEG n02963159/
+mv val/ILSVRC2012_val_00045039.JPEG n03085013/
+mv val/ILSVRC2012_val_00045040.JPEG n04525305/
+mv val/ILSVRC2012_val_00045041.JPEG n04141076/
+mv val/ILSVRC2012_val_00045042.JPEG n01742172/
+mv val/ILSVRC2012_val_00045043.JPEG n01798484/
+mv val/ILSVRC2012_val_00045044.JPEG n02102480/
+mv val/ILSVRC2012_val_00045045.JPEG n01729322/
+mv val/ILSVRC2012_val_00045046.JPEG n03938244/
+mv val/ILSVRC2012_val_00045047.JPEG n02096585/
+mv val/ILSVRC2012_val_00045048.JPEG n04099969/
+mv val/ILSVRC2012_val_00045049.JPEG n02437616/
+mv val/ILSVRC2012_val_00045050.JPEG n03729826/
+mv val/ILSVRC2012_val_00045051.JPEG n01829413/
+mv val/ILSVRC2012_val_00045052.JPEG n03527444/
+mv val/ILSVRC2012_val_00045053.JPEG n04086273/
+mv val/ILSVRC2012_val_00045054.JPEG n02013706/
+mv val/ILSVRC2012_val_00045055.JPEG n03594734/
+mv val/ILSVRC2012_val_00045056.JPEG n02105855/
+mv val/ILSVRC2012_val_00045057.JPEG n04536866/
+mv val/ILSVRC2012_val_00045058.JPEG n02489166/
+mv val/ILSVRC2012_val_00045059.JPEG n02093991/
+mv val/ILSVRC2012_val_00045060.JPEG n02109525/
+mv val/ILSVRC2012_val_00045061.JPEG n01930112/
+mv val/ILSVRC2012_val_00045062.JPEG n01580077/
+mv val/ILSVRC2012_val_00045063.JPEG n02457408/
+mv val/ILSVRC2012_val_00045064.JPEG n04328186/
+mv val/ILSVRC2012_val_00045065.JPEG n01751748/
+mv val/ILSVRC2012_val_00045066.JPEG n03026506/
+mv val/ILSVRC2012_val_00045067.JPEG n04235860/
+mv val/ILSVRC2012_val_00045068.JPEG n02113023/
+mv val/ILSVRC2012_val_00045069.JPEG n03063689/
+mv val/ILSVRC2012_val_00045070.JPEG n01882714/
+mv val/ILSVRC2012_val_00045071.JPEG n03930630/
+mv val/ILSVRC2012_val_00045072.JPEG n03710721/
+mv val/ILSVRC2012_val_00045073.JPEG n04264628/
+mv val/ILSVRC2012_val_00045074.JPEG n04081281/
+mv val/ILSVRC2012_val_00045075.JPEG n04116512/
+mv val/ILSVRC2012_val_00045076.JPEG n04044716/
+mv val/ILSVRC2012_val_00045077.JPEG n01697457/
+mv val/ILSVRC2012_val_00045078.JPEG n04330267/
+mv val/ILSVRC2012_val_00045079.JPEG n02860847/
+mv val/ILSVRC2012_val_00045080.JPEG n02107908/
+mv val/ILSVRC2012_val_00045081.JPEG n04399382/
+mv val/ILSVRC2012_val_00045082.JPEG n03873416/
+mv val/ILSVRC2012_val_00045083.JPEG n04509417/
+mv val/ILSVRC2012_val_00045084.JPEG n03792972/
+mv val/ILSVRC2012_val_00045085.JPEG n02102318/
+mv val/ILSVRC2012_val_00045086.JPEG n01883070/
+mv val/ILSVRC2012_val_00045087.JPEG n07742313/
+mv val/ILSVRC2012_val_00045088.JPEG n02033041/
+mv val/ILSVRC2012_val_00045089.JPEG n12620546/
+mv val/ILSVRC2012_val_00045090.JPEG n03995372/
+mv val/ILSVRC2012_val_00045091.JPEG n02086646/
+mv val/ILSVRC2012_val_00045092.JPEG n03485794/
+mv val/ILSVRC2012_val_00045093.JPEG n07747607/
+mv val/ILSVRC2012_val_00045094.JPEG n02098413/
+mv val/ILSVRC2012_val_00045095.JPEG n03877472/
+mv val/ILSVRC2012_val_00045096.JPEG n02106550/
+mv val/ILSVRC2012_val_00045097.JPEG n04263257/
+mv val/ILSVRC2012_val_00045098.JPEG n02134418/
+mv val/ILSVRC2012_val_00045099.JPEG n04263257/
+mv val/ILSVRC2012_val_00045100.JPEG n04606251/
+mv val/ILSVRC2012_val_00045101.JPEG n01630670/
+mv val/ILSVRC2012_val_00045102.JPEG n02280649/
+mv val/ILSVRC2012_val_00045103.JPEG n02504013/
+mv val/ILSVRC2012_val_00045104.JPEG n02871525/
+mv val/ILSVRC2012_val_00045105.JPEG n04081281/
+mv val/ILSVRC2012_val_00045106.JPEG n03782006/
+mv val/ILSVRC2012_val_00045107.JPEG n01514668/
+mv val/ILSVRC2012_val_00045108.JPEG n02396427/
+mv val/ILSVRC2012_val_00045109.JPEG n02093428/
+mv val/ILSVRC2012_val_00045110.JPEG n02979186/
+mv val/ILSVRC2012_val_00045111.JPEG n04254777/
+mv val/ILSVRC2012_val_00045112.JPEG n04009552/
+mv val/ILSVRC2012_val_00045113.JPEG n03602883/
+mv val/ILSVRC2012_val_00045114.JPEG n07747607/
+mv val/ILSVRC2012_val_00045115.JPEG n04562935/
+mv val/ILSVRC2012_val_00045116.JPEG n02033041/
+mv val/ILSVRC2012_val_00045117.JPEG n04505470/
+mv val/ILSVRC2012_val_00045118.JPEG n02906734/
+mv val/ILSVRC2012_val_00045119.JPEG n03045698/
+mv val/ILSVRC2012_val_00045120.JPEG n01629819/
+mv val/ILSVRC2012_val_00045121.JPEG n04613696/
+mv val/ILSVRC2012_val_00045122.JPEG n07717556/
+mv val/ILSVRC2012_val_00045123.JPEG n02487347/
+mv val/ILSVRC2012_val_00045124.JPEG n01917289/
+mv val/ILSVRC2012_val_00045125.JPEG n01817953/
+mv val/ILSVRC2012_val_00045126.JPEG n07753275/
+mv val/ILSVRC2012_val_00045127.JPEG n02457408/
+mv val/ILSVRC2012_val_00045128.JPEG n02992529/
+mv val/ILSVRC2012_val_00045129.JPEG n01742172/
+mv val/ILSVRC2012_val_00045130.JPEG n03950228/
+mv val/ILSVRC2012_val_00045131.JPEG n03584254/
+mv val/ILSVRC2012_val_00045132.JPEG n02526121/
+mv val/ILSVRC2012_val_00045133.JPEG n01494475/
+mv val/ILSVRC2012_val_00045134.JPEG n02085936/
+mv val/ILSVRC2012_val_00045135.JPEG n02391049/
+mv val/ILSVRC2012_val_00045136.JPEG n04355933/
+mv val/ILSVRC2012_val_00045137.JPEG n03950228/
+mv val/ILSVRC2012_val_00045138.JPEG n03584829/
+mv val/ILSVRC2012_val_00045139.JPEG n02128385/
+mv val/ILSVRC2012_val_00045140.JPEG n01872401/
+mv val/ILSVRC2012_val_00045141.JPEG n02091467/
+mv val/ILSVRC2012_val_00045142.JPEG n03481172/
+mv val/ILSVRC2012_val_00045143.JPEG n04204347/
+mv val/ILSVRC2012_val_00045144.JPEG n03899768/
+mv val/ILSVRC2012_val_00045145.JPEG n02107312/
+mv val/ILSVRC2012_val_00045146.JPEG n02692877/
+mv val/ILSVRC2012_val_00045147.JPEG n04606251/
+mv val/ILSVRC2012_val_00045148.JPEG n03770679/
+mv val/ILSVRC2012_val_00045149.JPEG n07749582/
+mv val/ILSVRC2012_val_00045150.JPEG n01558993/
+mv val/ILSVRC2012_val_00045151.JPEG n02099712/
+mv val/ILSVRC2012_val_00045152.JPEG n03792782/
+mv val/ILSVRC2012_val_00045153.JPEG n03791053/
+mv val/ILSVRC2012_val_00045154.JPEG n04317175/
+mv val/ILSVRC2012_val_00045155.JPEG n02086079/
+mv val/ILSVRC2012_val_00045156.JPEG n02480855/
+mv val/ILSVRC2012_val_00045157.JPEG n01682714/
+mv val/ILSVRC2012_val_00045158.JPEG n04509417/
+mv val/ILSVRC2012_val_00045159.JPEG n03792972/
+mv val/ILSVRC2012_val_00045160.JPEG n02108551/
+mv val/ILSVRC2012_val_00045161.JPEG n02606052/
+mv val/ILSVRC2012_val_00045162.JPEG n03995372/
+mv val/ILSVRC2012_val_00045163.JPEG n04336792/
+mv val/ILSVRC2012_val_00045164.JPEG n02490219/
+mv val/ILSVRC2012_val_00045165.JPEG n07695742/
+mv val/ILSVRC2012_val_00045166.JPEG n12998815/
+mv val/ILSVRC2012_val_00045167.JPEG n03759954/
+mv val/ILSVRC2012_val_00045168.JPEG n04265275/
+mv val/ILSVRC2012_val_00045169.JPEG n02971356/
+mv val/ILSVRC2012_val_00045170.JPEG n03661043/
+mv val/ILSVRC2012_val_00045171.JPEG n02120505/
+mv val/ILSVRC2012_val_00045172.JPEG n01530575/
+mv val/ILSVRC2012_val_00045173.JPEG n03690938/
+mv val/ILSVRC2012_val_00045174.JPEG n02422106/
+mv val/ILSVRC2012_val_00045175.JPEG n02120079/
+mv val/ILSVRC2012_val_00045176.JPEG n07873807/
+mv val/ILSVRC2012_val_00045177.JPEG n04579432/
+mv val/ILSVRC2012_val_00045178.JPEG n03930313/
+mv val/ILSVRC2012_val_00045179.JPEG n09288635/
+mv val/ILSVRC2012_val_00045180.JPEG n02509815/
+mv val/ILSVRC2012_val_00045181.JPEG n03998194/
+mv val/ILSVRC2012_val_00045182.JPEG n03791053/
+mv val/ILSVRC2012_val_00045183.JPEG n01930112/
+mv val/ILSVRC2012_val_00045184.JPEG n03991062/
+mv val/ILSVRC2012_val_00045185.JPEG n02125311/
+mv val/ILSVRC2012_val_00045186.JPEG n02909870/
+mv val/ILSVRC2012_val_00045187.JPEG n07718747/
+mv val/ILSVRC2012_val_00045188.JPEG n01729322/
+mv val/ILSVRC2012_val_00045189.JPEG n02133161/
+mv val/ILSVRC2012_val_00045190.JPEG n03763968/
+mv val/ILSVRC2012_val_00045191.JPEG n03944341/
+mv val/ILSVRC2012_val_00045192.JPEG n01943899/
+mv val/ILSVRC2012_val_00045193.JPEG n02445715/
+mv val/ILSVRC2012_val_00045194.JPEG n04443257/
+mv val/ILSVRC2012_val_00045195.JPEG n02109047/
+mv val/ILSVRC2012_val_00045196.JPEG n04141327/
+mv val/ILSVRC2012_val_00045197.JPEG n03041632/
+mv val/ILSVRC2012_val_00045198.JPEG n01592084/
+mv val/ILSVRC2012_val_00045199.JPEG n02906734/
+mv val/ILSVRC2012_val_00045200.JPEG n01828970/
+mv val/ILSVRC2012_val_00045201.JPEG n03388549/
+mv val/ILSVRC2012_val_00045202.JPEG n01917289/
+mv val/ILSVRC2012_val_00045203.JPEG n02859443/
+mv val/ILSVRC2012_val_00045204.JPEG n02110958/
+mv val/ILSVRC2012_val_00045205.JPEG n03956157/
+mv val/ILSVRC2012_val_00045206.JPEG n02797295/
+mv val/ILSVRC2012_val_00045207.JPEG n02100583/
+mv val/ILSVRC2012_val_00045208.JPEG n02776631/
+mv val/ILSVRC2012_val_00045209.JPEG n03485407/
+mv val/ILSVRC2012_val_00045210.JPEG n04285008/
+mv val/ILSVRC2012_val_00045211.JPEG n03623198/
+mv val/ILSVRC2012_val_00045212.JPEG n01753488/
+mv val/ILSVRC2012_val_00045213.JPEG n03146219/
+mv val/ILSVRC2012_val_00045214.JPEG n03535780/
+mv val/ILSVRC2012_val_00045215.JPEG n12768682/
+mv val/ILSVRC2012_val_00045216.JPEG n12768682/
+mv val/ILSVRC2012_val_00045217.JPEG n02100583/
+mv val/ILSVRC2012_val_00045218.JPEG n03976657/
+mv val/ILSVRC2012_val_00045219.JPEG n04251144/
+mv val/ILSVRC2012_val_00045220.JPEG n03444034/
+mv val/ILSVRC2012_val_00045221.JPEG n03980874/
+mv val/ILSVRC2012_val_00045222.JPEG n02066245/
+mv val/ILSVRC2012_val_00045223.JPEG n01692333/
+mv val/ILSVRC2012_val_00045224.JPEG n03223299/
+mv val/ILSVRC2012_val_00045225.JPEG n04461696/
+mv val/ILSVRC2012_val_00045226.JPEG n09835506/
+mv val/ILSVRC2012_val_00045227.JPEG n02206856/
+mv val/ILSVRC2012_val_00045228.JPEG n13040303/
+mv val/ILSVRC2012_val_00045229.JPEG n02088094/
+mv val/ILSVRC2012_val_00045230.JPEG n02487347/
+mv val/ILSVRC2012_val_00045231.JPEG n03781244/
+mv val/ILSVRC2012_val_00045232.JPEG n03832673/
+mv val/ILSVRC2012_val_00045233.JPEG n02917067/
+mv val/ILSVRC2012_val_00045234.JPEG n01806567/
+mv val/ILSVRC2012_val_00045235.JPEG n03776460/
+mv val/ILSVRC2012_val_00045236.JPEG n04208210/
+mv val/ILSVRC2012_val_00045237.JPEG n04462240/
+mv val/ILSVRC2012_val_00045238.JPEG n02093428/
+mv val/ILSVRC2012_val_00045239.JPEG n02123045/
+mv val/ILSVRC2012_val_00045240.JPEG n03047690/
+mv val/ILSVRC2012_val_00045241.JPEG n04201297/
+mv val/ILSVRC2012_val_00045242.JPEG n02895154/
+mv val/ILSVRC2012_val_00045243.JPEG n04252225/
+mv val/ILSVRC2012_val_00045244.JPEG n03837869/
+mv val/ILSVRC2012_val_00045245.JPEG n01877812/
+mv val/ILSVRC2012_val_00045246.JPEG n03961711/
+mv val/ILSVRC2012_val_00045247.JPEG n01753488/
+mv val/ILSVRC2012_val_00045248.JPEG n02105505/
+mv val/ILSVRC2012_val_00045249.JPEG n02112018/
+mv val/ILSVRC2012_val_00045250.JPEG n02110627/
+mv val/ILSVRC2012_val_00045251.JPEG n02389026/
+mv val/ILSVRC2012_val_00045252.JPEG n02782093/
+mv val/ILSVRC2012_val_00045253.JPEG n02099712/
+mv val/ILSVRC2012_val_00045254.JPEG n03742115/
+mv val/ILSVRC2012_val_00045255.JPEG n04141076/
+mv val/ILSVRC2012_val_00045256.JPEG n01735189/
+mv val/ILSVRC2012_val_00045257.JPEG n02879718/
+mv val/ILSVRC2012_val_00045258.JPEG n03594734/
+mv val/ILSVRC2012_val_00045259.JPEG n04462240/
+mv val/ILSVRC2012_val_00045260.JPEG n02788148/
+mv val/ILSVRC2012_val_00045261.JPEG n02106166/
+mv val/ILSVRC2012_val_00045262.JPEG n03991062/
+mv val/ILSVRC2012_val_00045263.JPEG n01820546/
+mv val/ILSVRC2012_val_00045264.JPEG n04259630/
+mv val/ILSVRC2012_val_00045265.JPEG n04310018/
+mv val/ILSVRC2012_val_00045266.JPEG n15075141/
+mv val/ILSVRC2012_val_00045267.JPEG n03717622/
+mv val/ILSVRC2012_val_00045268.JPEG n03595614/
+mv val/ILSVRC2012_val_00045269.JPEG n03598930/
+mv val/ILSVRC2012_val_00045270.JPEG n02132136/
+mv val/ILSVRC2012_val_00045271.JPEG n03630383/
+mv val/ILSVRC2012_val_00045272.JPEG n03692522/
+mv val/ILSVRC2012_val_00045273.JPEG n04591157/
+mv val/ILSVRC2012_val_00045274.JPEG n04154565/
+mv val/ILSVRC2012_val_00045275.JPEG n02346627/
+mv val/ILSVRC2012_val_00045276.JPEG n02687172/
+mv val/ILSVRC2012_val_00045277.JPEG n07693725/
+mv val/ILSVRC2012_val_00045278.JPEG n02514041/
+mv val/ILSVRC2012_val_00045279.JPEG n02128757/
+mv val/ILSVRC2012_val_00045280.JPEG n02095314/
+mv val/ILSVRC2012_val_00045281.JPEG n01855032/
+mv val/ILSVRC2012_val_00045282.JPEG n03942813/
+mv val/ILSVRC2012_val_00045283.JPEG n03485407/
+mv val/ILSVRC2012_val_00045284.JPEG n13133613/
+mv val/ILSVRC2012_val_00045285.JPEG n03062245/
+mv val/ILSVRC2012_val_00045286.JPEG n03447447/
+mv val/ILSVRC2012_val_00045287.JPEG n02895154/
+mv val/ILSVRC2012_val_00045288.JPEG n04380533/
+mv val/ILSVRC2012_val_00045289.JPEG n02364673/
+mv val/ILSVRC2012_val_00045290.JPEG n03146219/
+mv val/ILSVRC2012_val_00045291.JPEG n02109961/
+mv val/ILSVRC2012_val_00045292.JPEG n02113799/
+mv val/ILSVRC2012_val_00045293.JPEG n02859443/
+mv val/ILSVRC2012_val_00045294.JPEG n01558993/
+mv val/ILSVRC2012_val_00045295.JPEG n02119789/
+mv val/ILSVRC2012_val_00045296.JPEG n01930112/
+mv val/ILSVRC2012_val_00045297.JPEG n04275548/
+mv val/ILSVRC2012_val_00045298.JPEG n03602883/
+mv val/ILSVRC2012_val_00045299.JPEG n02497673/
+mv val/ILSVRC2012_val_00045300.JPEG n02037110/
+mv val/ILSVRC2012_val_00045301.JPEG n03026506/
+mv val/ILSVRC2012_val_00045302.JPEG n07930864/
+mv val/ILSVRC2012_val_00045303.JPEG n04330267/
+mv val/ILSVRC2012_val_00045304.JPEG n02480495/
+mv val/ILSVRC2012_val_00045305.JPEG n02107683/
+mv val/ILSVRC2012_val_00045306.JPEG n03786901/
+mv val/ILSVRC2012_val_00045307.JPEG n01917289/
+mv val/ILSVRC2012_val_00045308.JPEG n03133878/
+mv val/ILSVRC2012_val_00045309.JPEG n04532670/
+mv val/ILSVRC2012_val_00045310.JPEG n01775062/
+mv val/ILSVRC2012_val_00045311.JPEG n03633091/
+mv val/ILSVRC2012_val_00045312.JPEG n03777568/
+mv val/ILSVRC2012_val_00045313.JPEG n01945685/
+mv val/ILSVRC2012_val_00045314.JPEG n03109150/
+mv val/ILSVRC2012_val_00045315.JPEG n03792972/
+mv val/ILSVRC2012_val_00045316.JPEG n02895154/
+mv val/ILSVRC2012_val_00045317.JPEG n04548362/
+mv val/ILSVRC2012_val_00045318.JPEG n02114855/
+mv val/ILSVRC2012_val_00045319.JPEG n03775071/
+mv val/ILSVRC2012_val_00045320.JPEG n07717556/
+mv val/ILSVRC2012_val_00045321.JPEG n02483362/
+mv val/ILSVRC2012_val_00045322.JPEG n02909870/
+mv val/ILSVRC2012_val_00045323.JPEG n02027492/
+mv val/ILSVRC2012_val_00045324.JPEG n07584110/
+mv val/ILSVRC2012_val_00045325.JPEG n03594734/
+mv val/ILSVRC2012_val_00045326.JPEG n03642806/
+mv val/ILSVRC2012_val_00045327.JPEG n03877845/
+mv val/ILSVRC2012_val_00045328.JPEG n03379051/
+mv val/ILSVRC2012_val_00045329.JPEG n02927161/
+mv val/ILSVRC2012_val_00045330.JPEG n04417672/
+mv val/ILSVRC2012_val_00045331.JPEG n04009552/
+mv val/ILSVRC2012_val_00045332.JPEG n04004767/
+mv val/ILSVRC2012_val_00045333.JPEG n02799071/
+mv val/ILSVRC2012_val_00045334.JPEG n03874599/
+mv val/ILSVRC2012_val_00045335.JPEG n01883070/
+mv val/ILSVRC2012_val_00045336.JPEG n03933933/
+mv val/ILSVRC2012_val_00045337.JPEG n03450230/
+mv val/ILSVRC2012_val_00045338.JPEG n01698640/
+mv val/ILSVRC2012_val_00045339.JPEG n03146219/
+mv val/ILSVRC2012_val_00045340.JPEG n02113023/
+mv val/ILSVRC2012_val_00045341.JPEG n03379051/
+mv val/ILSVRC2012_val_00045342.JPEG n03160309/
+mv val/ILSVRC2012_val_00045343.JPEG n01968897/
+mv val/ILSVRC2012_val_00045344.JPEG n03976467/
+mv val/ILSVRC2012_val_00045345.JPEG n04328186/
+mv val/ILSVRC2012_val_00045346.JPEG n02018207/
+mv val/ILSVRC2012_val_00045347.JPEG n02123597/
+mv val/ILSVRC2012_val_00045348.JPEG n02791124/
+mv val/ILSVRC2012_val_00045349.JPEG n01729977/
+mv val/ILSVRC2012_val_00045350.JPEG n04228054/
+mv val/ILSVRC2012_val_00045351.JPEG n02966687/
+mv val/ILSVRC2012_val_00045352.JPEG n02094258/
+mv val/ILSVRC2012_val_00045353.JPEG n03425413/
+mv val/ILSVRC2012_val_00045354.JPEG n01819313/
+mv val/ILSVRC2012_val_00045355.JPEG n02100236/
+mv val/ILSVRC2012_val_00045356.JPEG n02389026/
+mv val/ILSVRC2012_val_00045357.JPEG n02108551/
+mv val/ILSVRC2012_val_00045358.JPEG n02085620/
+mv val/ILSVRC2012_val_00045359.JPEG n03791053/
+mv val/ILSVRC2012_val_00045360.JPEG n03916031/
+mv val/ILSVRC2012_val_00045361.JPEG n01871265/
+mv val/ILSVRC2012_val_00045362.JPEG n01698640/
+mv val/ILSVRC2012_val_00045363.JPEG n02100877/
+mv val/ILSVRC2012_val_00045364.JPEG n03146219/
+mv val/ILSVRC2012_val_00045365.JPEG n03903868/
+mv val/ILSVRC2012_val_00045366.JPEG n03803284/
+mv val/ILSVRC2012_val_00045367.JPEG n04204238/
+mv val/ILSVRC2012_val_00045368.JPEG n04037443/
+mv val/ILSVRC2012_val_00045369.JPEG n02128925/
+mv val/ILSVRC2012_val_00045370.JPEG n03131574/
+mv val/ILSVRC2012_val_00045371.JPEG n02823428/
+mv val/ILSVRC2012_val_00045372.JPEG n09421951/
+mv val/ILSVRC2012_val_00045373.JPEG n03884397/
+mv val/ILSVRC2012_val_00045374.JPEG n07742313/
+mv val/ILSVRC2012_val_00045375.JPEG n03871628/
+mv val/ILSVRC2012_val_00045376.JPEG n01770081/
+mv val/ILSVRC2012_val_00045377.JPEG n04540053/
+mv val/ILSVRC2012_val_00045378.JPEG n03000134/
+mv val/ILSVRC2012_val_00045379.JPEG n02443114/
+mv val/ILSVRC2012_val_00045380.JPEG n04476259/
+mv val/ILSVRC2012_val_00045381.JPEG n04317175/
+mv val/ILSVRC2012_val_00045382.JPEG n02091032/
+mv val/ILSVRC2012_val_00045383.JPEG n07248320/
+mv val/ILSVRC2012_val_00045384.JPEG n04146614/
+mv val/ILSVRC2012_val_00045385.JPEG n04532106/
+mv val/ILSVRC2012_val_00045386.JPEG n07920052/
+mv val/ILSVRC2012_val_00045387.JPEG n02484975/
+mv val/ILSVRC2012_val_00045388.JPEG n04612504/
+mv val/ILSVRC2012_val_00045389.JPEG n01530575/
+mv val/ILSVRC2012_val_00045390.JPEG n03929660/
+mv val/ILSVRC2012_val_00045391.JPEG n04540053/
+mv val/ILSVRC2012_val_00045392.JPEG n01796340/
+mv val/ILSVRC2012_val_00045393.JPEG n01828970/
+mv val/ILSVRC2012_val_00045394.JPEG n04162706/
+mv val/ILSVRC2012_val_00045395.JPEG n03481172/
+mv val/ILSVRC2012_val_00045396.JPEG n03983396/
+mv val/ILSVRC2012_val_00045397.JPEG n02777292/
+mv val/ILSVRC2012_val_00045398.JPEG n02018795/
+mv val/ILSVRC2012_val_00045399.JPEG n02869837/
+mv val/ILSVRC2012_val_00045400.JPEG n02835271/
+mv val/ILSVRC2012_val_00045401.JPEG n03201208/
+mv val/ILSVRC2012_val_00045402.JPEG n01518878/
+mv val/ILSVRC2012_val_00045403.JPEG n12057211/
+mv val/ILSVRC2012_val_00045404.JPEG n03787032/
+mv val/ILSVRC2012_val_00045405.JPEG n02641379/
+mv val/ILSVRC2012_val_00045406.JPEG n04554684/
+mv val/ILSVRC2012_val_00045407.JPEG n02791124/
+mv val/ILSVRC2012_val_00045408.JPEG n01819313/
+mv val/ILSVRC2012_val_00045409.JPEG n02389026/
+mv val/ILSVRC2012_val_00045410.JPEG n04090263/
+mv val/ILSVRC2012_val_00045411.JPEG n03908618/
+mv val/ILSVRC2012_val_00045412.JPEG n03792972/
+mv val/ILSVRC2012_val_00045413.JPEG n02484975/
+mv val/ILSVRC2012_val_00045414.JPEG n07590611/
+mv val/ILSVRC2012_val_00045415.JPEG n01530575/
+mv val/ILSVRC2012_val_00045416.JPEG n12985857/
+mv val/ILSVRC2012_val_00045417.JPEG n09229709/
+mv val/ILSVRC2012_val_00045418.JPEG n01755581/
+mv val/ILSVRC2012_val_00045419.JPEG n03627232/
+mv val/ILSVRC2012_val_00045420.JPEG n02123159/
+mv val/ILSVRC2012_val_00045421.JPEG n03775546/
+mv val/ILSVRC2012_val_00045422.JPEG n04596742/
+mv val/ILSVRC2012_val_00045423.JPEG n04346328/
+mv val/ILSVRC2012_val_00045424.JPEG n02669723/
+mv val/ILSVRC2012_val_00045425.JPEG n07753592/
+mv val/ILSVRC2012_val_00045426.JPEG n07613480/
+mv val/ILSVRC2012_val_00045427.JPEG n03884397/
+mv val/ILSVRC2012_val_00045428.JPEG n02892201/
+mv val/ILSVRC2012_val_00045429.JPEG n01924916/
+mv val/ILSVRC2012_val_00045430.JPEG n04467665/
+mv val/ILSVRC2012_val_00045431.JPEG n02488291/
+mv val/ILSVRC2012_val_00045432.JPEG n03868242/
+mv val/ILSVRC2012_val_00045433.JPEG n02356798/
+mv val/ILSVRC2012_val_00045434.JPEG n04265275/
+mv val/ILSVRC2012_val_00045435.JPEG n02077923/
+mv val/ILSVRC2012_val_00045436.JPEG n02102973/
+mv val/ILSVRC2012_val_00045437.JPEG n03457902/
+mv val/ILSVRC2012_val_00045438.JPEG n02190166/
+mv val/ILSVRC2012_val_00045439.JPEG n03259280/
+mv val/ILSVRC2012_val_00045440.JPEG n02105162/
+mv val/ILSVRC2012_val_00045441.JPEG n02091831/
+mv val/ILSVRC2012_val_00045442.JPEG n02256656/
+mv val/ILSVRC2012_val_00045443.JPEG n01872401/
+mv val/ILSVRC2012_val_00045444.JPEG n02493793/
+mv val/ILSVRC2012_val_00045445.JPEG n02408429/
+mv val/ILSVRC2012_val_00045446.JPEG n02106550/
+mv val/ILSVRC2012_val_00045447.JPEG n03929660/
+mv val/ILSVRC2012_val_00045448.JPEG n03325584/
+mv val/ILSVRC2012_val_00045449.JPEG n04332243/
+mv val/ILSVRC2012_val_00045450.JPEG n04270147/
+mv val/ILSVRC2012_val_00045451.JPEG n01630670/
+mv val/ILSVRC2012_val_00045452.JPEG n03250847/
+mv val/ILSVRC2012_val_00045453.JPEG n02114367/
+mv val/ILSVRC2012_val_00045454.JPEG n02106166/
+mv val/ILSVRC2012_val_00045455.JPEG n03134739/
+mv val/ILSVRC2012_val_00045456.JPEG n02814860/
+mv val/ILSVRC2012_val_00045457.JPEG n02110063/
+mv val/ILSVRC2012_val_00045458.JPEG n03903868/
+mv val/ILSVRC2012_val_00045459.JPEG n02395406/
+mv val/ILSVRC2012_val_00045460.JPEG n04311174/
+mv val/ILSVRC2012_val_00045461.JPEG n03532672/
+mv val/ILSVRC2012_val_00045462.JPEG n02840245/
+mv val/ILSVRC2012_val_00045463.JPEG n01986214/
+mv val/ILSVRC2012_val_00045464.JPEG n04429376/
+mv val/ILSVRC2012_val_00045465.JPEG n02119022/
+mv val/ILSVRC2012_val_00045466.JPEG n03218198/
+mv val/ILSVRC2012_val_00045467.JPEG n02783161/
+mv val/ILSVRC2012_val_00045468.JPEG n03770439/
+mv val/ILSVRC2012_val_00045469.JPEG n02089867/
+mv val/ILSVRC2012_val_00045470.JPEG n02966687/
+mv val/ILSVRC2012_val_00045471.JPEG n03658185/
+mv val/ILSVRC2012_val_00045472.JPEG n09193705/
+mv val/ILSVRC2012_val_00045473.JPEG n03085013/
+mv val/ILSVRC2012_val_00045474.JPEG n02971356/
+mv val/ILSVRC2012_val_00045475.JPEG n04049303/
+mv val/ILSVRC2012_val_00045476.JPEG n11939491/
+mv val/ILSVRC2012_val_00045477.JPEG n02105641/
+mv val/ILSVRC2012_val_00045478.JPEG n03494278/
+mv val/ILSVRC2012_val_00045479.JPEG n02364673/
+mv val/ILSVRC2012_val_00045480.JPEG n01534433/
+mv val/ILSVRC2012_val_00045481.JPEG n01735189/
+mv val/ILSVRC2012_val_00045482.JPEG n02105855/
+mv val/ILSVRC2012_val_00045483.JPEG n03743016/
+mv val/ILSVRC2012_val_00045484.JPEG n07718472/
+mv val/ILSVRC2012_val_00045485.JPEG n02113799/
+mv val/ILSVRC2012_val_00045486.JPEG n04443257/
+mv val/ILSVRC2012_val_00045487.JPEG n02096294/
+mv val/ILSVRC2012_val_00045488.JPEG n02128925/
+mv val/ILSVRC2012_val_00045489.JPEG n02264363/
+mv val/ILSVRC2012_val_00045490.JPEG n03796401/
+mv val/ILSVRC2012_val_00045491.JPEG n02444819/
+mv val/ILSVRC2012_val_00045492.JPEG n03770679/
+mv val/ILSVRC2012_val_00045493.JPEG n02093647/
+mv val/ILSVRC2012_val_00045494.JPEG n03483316/
+mv val/ILSVRC2012_val_00045495.JPEG n02107574/
+mv val/ILSVRC2012_val_00045496.JPEG n04127249/
+mv val/ILSVRC2012_val_00045497.JPEG n02978881/
+mv val/ILSVRC2012_val_00045498.JPEG n13054560/
+mv val/ILSVRC2012_val_00045499.JPEG n02823750/
+mv val/ILSVRC2012_val_00045500.JPEG n03794056/
+mv val/ILSVRC2012_val_00045501.JPEG n03000684/
+mv val/ILSVRC2012_val_00045502.JPEG n01496331/
+mv val/ILSVRC2012_val_00045503.JPEG n01807496/
+mv val/ILSVRC2012_val_00045504.JPEG n02791270/
+mv val/ILSVRC2012_val_00045505.JPEG n01860187/
+mv val/ILSVRC2012_val_00045506.JPEG n03218198/
+mv val/ILSVRC2012_val_00045507.JPEG n02364673/
+mv val/ILSVRC2012_val_00045508.JPEG n03498962/
+mv val/ILSVRC2012_val_00045509.JPEG n04153751/
+mv val/ILSVRC2012_val_00045510.JPEG n01688243/
+mv val/ILSVRC2012_val_00045511.JPEG n03388183/
+mv val/ILSVRC2012_val_00045512.JPEG n01968897/
+mv val/ILSVRC2012_val_00045513.JPEG n02172182/
+mv val/ILSVRC2012_val_00045514.JPEG n02112018/
+mv val/ILSVRC2012_val_00045515.JPEG n02883205/
+mv val/ILSVRC2012_val_00045516.JPEG n03854065/
+mv val/ILSVRC2012_val_00045517.JPEG n12267677/
+mv val/ILSVRC2012_val_00045518.JPEG n02094258/
+mv val/ILSVRC2012_val_00045519.JPEG n04254120/
+mv val/ILSVRC2012_val_00045520.JPEG n01855672/
+mv val/ILSVRC2012_val_00045521.JPEG n02100877/
+mv val/ILSVRC2012_val_00045522.JPEG n03344393/
+mv val/ILSVRC2012_val_00045523.JPEG n07693725/
+mv val/ILSVRC2012_val_00045524.JPEG n02669723/
+mv val/ILSVRC2012_val_00045525.JPEG n02264363/
+mv val/ILSVRC2012_val_00045526.JPEG n03763968/
+mv val/ILSVRC2012_val_00045527.JPEG n03637318/
+mv val/ILSVRC2012_val_00045528.JPEG n04447861/
+mv val/ILSVRC2012_val_00045529.JPEG n01984695/
+mv val/ILSVRC2012_val_00045530.JPEG n12267677/
+mv val/ILSVRC2012_val_00045531.JPEG n04335435/
+mv val/ILSVRC2012_val_00045532.JPEG n02120505/
+mv val/ILSVRC2012_val_00045533.JPEG n02104365/
+mv val/ILSVRC2012_val_00045534.JPEG n03450230/
+mv val/ILSVRC2012_val_00045535.JPEG n04286575/
+mv val/ILSVRC2012_val_00045536.JPEG n03207941/
+mv val/ILSVRC2012_val_00045537.JPEG n02106166/
+mv val/ILSVRC2012_val_00045538.JPEG n03325584/
+mv val/ILSVRC2012_val_00045539.JPEG n03793489/
+mv val/ILSVRC2012_val_00045540.JPEG n03788365/
+mv val/ILSVRC2012_val_00045541.JPEG n03877845/
+mv val/ILSVRC2012_val_00045542.JPEG n02190166/
+mv val/ILSVRC2012_val_00045543.JPEG n02051845/
+mv val/ILSVRC2012_val_00045544.JPEG n02100583/
+mv val/ILSVRC2012_val_00045545.JPEG n02104029/
+mv val/ILSVRC2012_val_00045546.JPEG n06359193/
+mv val/ILSVRC2012_val_00045547.JPEG n01514859/
+mv val/ILSVRC2012_val_00045548.JPEG n02106550/
+mv val/ILSVRC2012_val_00045549.JPEG n02165456/
+mv val/ILSVRC2012_val_00045550.JPEG n02276258/
+mv val/ILSVRC2012_val_00045551.JPEG n01514859/
+mv val/ILSVRC2012_val_00045552.JPEG n03485407/
+mv val/ILSVRC2012_val_00045553.JPEG n01632777/
+mv val/ILSVRC2012_val_00045554.JPEG n02408429/
+mv val/ILSVRC2012_val_00045555.JPEG n03124043/
+mv val/ILSVRC2012_val_00045556.JPEG n03717622/
+mv val/ILSVRC2012_val_00045557.JPEG n04252225/
+mv val/ILSVRC2012_val_00045558.JPEG n04517823/
+mv val/ILSVRC2012_val_00045559.JPEG n03425413/
+mv val/ILSVRC2012_val_00045560.JPEG n04310018/
+mv val/ILSVRC2012_val_00045561.JPEG n03017168/
+mv val/ILSVRC2012_val_00045562.JPEG n03832673/
+mv val/ILSVRC2012_val_00045563.JPEG n01770081/
+mv val/ILSVRC2012_val_00045564.JPEG n03127925/
+mv val/ILSVRC2012_val_00045565.JPEG n02089867/
+mv val/ILSVRC2012_val_00045566.JPEG n03461385/
+mv val/ILSVRC2012_val_00045567.JPEG n03485407/
+mv val/ILSVRC2012_val_00045568.JPEG n01592084/
+mv val/ILSVRC2012_val_00045569.JPEG n02256656/
+mv val/ILSVRC2012_val_00045570.JPEG n03146219/
+mv val/ILSVRC2012_val_00045571.JPEG n01795545/
+mv val/ILSVRC2012_val_00045572.JPEG n03947888/
+mv val/ILSVRC2012_val_00045573.JPEG n07693725/
+mv val/ILSVRC2012_val_00045574.JPEG n04483307/
+mv val/ILSVRC2012_val_00045575.JPEG n02002556/
+mv val/ILSVRC2012_val_00045576.JPEG n04532670/
+mv val/ILSVRC2012_val_00045577.JPEG n04049303/
+mv val/ILSVRC2012_val_00045578.JPEG n02892201/
+mv val/ILSVRC2012_val_00045579.JPEG n03857828/
+mv val/ILSVRC2012_val_00045580.JPEG n01494475/
+mv val/ILSVRC2012_val_00045581.JPEG n01601694/
+mv val/ILSVRC2012_val_00045582.JPEG n04131690/
+mv val/ILSVRC2012_val_00045583.JPEG n02666196/
+mv val/ILSVRC2012_val_00045584.JPEG n02098286/
+mv val/ILSVRC2012_val_00045585.JPEG n02641379/
+mv val/ILSVRC2012_val_00045586.JPEG n04228054/
+mv val/ILSVRC2012_val_00045587.JPEG n03980874/
+mv val/ILSVRC2012_val_00045588.JPEG n04590129/
+mv val/ILSVRC2012_val_00045589.JPEG n01616318/
+mv val/ILSVRC2012_val_00045590.JPEG n03690938/
+mv val/ILSVRC2012_val_00045591.JPEG n04127249/
+mv val/ILSVRC2012_val_00045592.JPEG n03345487/
+mv val/ILSVRC2012_val_00045593.JPEG n02113023/
+mv val/ILSVRC2012_val_00045594.JPEG n01749939/
+mv val/ILSVRC2012_val_00045595.JPEG n04229816/
+mv val/ILSVRC2012_val_00045596.JPEG n02927161/
+mv val/ILSVRC2012_val_00045597.JPEG n03956157/
+mv val/ILSVRC2012_val_00045598.JPEG n02111500/
+mv val/ILSVRC2012_val_00045599.JPEG n01756291/
+mv val/ILSVRC2012_val_00045600.JPEG n02492035/
+mv val/ILSVRC2012_val_00045601.JPEG n02119022/
+mv val/ILSVRC2012_val_00045602.JPEG n02443114/
+mv val/ILSVRC2012_val_00045603.JPEG n02950826/
+mv val/ILSVRC2012_val_00045604.JPEG n02319095/
+mv val/ILSVRC2012_val_00045605.JPEG n04346328/
+mv val/ILSVRC2012_val_00045606.JPEG n02128757/
+mv val/ILSVRC2012_val_00045607.JPEG n03998194/
+mv val/ILSVRC2012_val_00045608.JPEG n02667093/
+mv val/ILSVRC2012_val_00045609.JPEG n01943899/
+mv val/ILSVRC2012_val_00045610.JPEG n04467665/
+mv val/ILSVRC2012_val_00045611.JPEG n01530575/
+mv val/ILSVRC2012_val_00045612.JPEG n01614925/
+mv val/ILSVRC2012_val_00045613.JPEG n04346328/
+mv val/ILSVRC2012_val_00045614.JPEG n02093754/
+mv val/ILSVRC2012_val_00045615.JPEG n03733805/
+mv val/ILSVRC2012_val_00045616.JPEG n03742115/
+mv val/ILSVRC2012_val_00045617.JPEG n03197337/
+mv val/ILSVRC2012_val_00045618.JPEG n02107908/
+mv val/ILSVRC2012_val_00045619.JPEG n01737021/
+mv val/ILSVRC2012_val_00045620.JPEG n02281787/
+mv val/ILSVRC2012_val_00045621.JPEG n03141823/
+mv val/ILSVRC2012_val_00045622.JPEG n04254120/
+mv val/ILSVRC2012_val_00045623.JPEG n01532829/
+mv val/ILSVRC2012_val_00045624.JPEG n02526121/
+mv val/ILSVRC2012_val_00045625.JPEG n02966687/
+mv val/ILSVRC2012_val_00045626.JPEG n02484975/
+mv val/ILSVRC2012_val_00045627.JPEG n03832673/
+mv val/ILSVRC2012_val_00045628.JPEG n02113799/
+mv val/ILSVRC2012_val_00045629.JPEG n03958227/
+mv val/ILSVRC2012_val_00045630.JPEG n04350905/
+mv val/ILSVRC2012_val_00045631.JPEG n03623198/
+mv val/ILSVRC2012_val_00045632.JPEG n06874185/
+mv val/ILSVRC2012_val_00045633.JPEG n03337140/
+mv val/ILSVRC2012_val_00045634.JPEG n02097658/
+mv val/ILSVRC2012_val_00045635.JPEG n04311174/
+mv val/ILSVRC2012_val_00045636.JPEG n04201297/
+mv val/ILSVRC2012_val_00045637.JPEG n03908714/
+mv val/ILSVRC2012_val_00045638.JPEG n01740131/
+mv val/ILSVRC2012_val_00045639.JPEG n03929855/
+mv val/ILSVRC2012_val_00045640.JPEG n02509815/
+mv val/ILSVRC2012_val_00045641.JPEG n03903868/
+mv val/ILSVRC2012_val_00045642.JPEG n03658185/
+mv val/ILSVRC2012_val_00045643.JPEG n01843065/
+mv val/ILSVRC2012_val_00045644.JPEG n04557648/
+mv val/ILSVRC2012_val_00045645.JPEG n04392985/
+mv val/ILSVRC2012_val_00045646.JPEG n02454379/
+mv val/ILSVRC2012_val_00045647.JPEG n02493793/
+mv val/ILSVRC2012_val_00045648.JPEG n04275548/
+mv val/ILSVRC2012_val_00045649.JPEG n03220513/
+mv val/ILSVRC2012_val_00045650.JPEG n02606052/
+mv val/ILSVRC2012_val_00045651.JPEG n04118776/
+mv val/ILSVRC2012_val_00045652.JPEG n02514041/
+mv val/ILSVRC2012_val_00045653.JPEG n07684084/
+mv val/ILSVRC2012_val_00045654.JPEG n03388183/
+mv val/ILSVRC2012_val_00045655.JPEG n02794156/
+mv val/ILSVRC2012_val_00045656.JPEG n01632777/
+mv val/ILSVRC2012_val_00045657.JPEG n04238763/
+mv val/ILSVRC2012_val_00045658.JPEG n04372370/
+mv val/ILSVRC2012_val_00045659.JPEG n03876231/
+mv val/ILSVRC2012_val_00045660.JPEG n02948072/
+mv val/ILSVRC2012_val_00045661.JPEG n02096437/
+mv val/ILSVRC2012_val_00045662.JPEG n02497673/
+mv val/ILSVRC2012_val_00045663.JPEG n03843555/
+mv val/ILSVRC2012_val_00045664.JPEG n07565083/
+mv val/ILSVRC2012_val_00045665.JPEG n02097130/
+mv val/ILSVRC2012_val_00045666.JPEG n04509417/
+mv val/ILSVRC2012_val_00045667.JPEG n03255030/
+mv val/ILSVRC2012_val_00045668.JPEG n02129165/
+mv val/ILSVRC2012_val_00045669.JPEG n01682714/
+mv val/ILSVRC2012_val_00045670.JPEG n07753275/
+mv val/ILSVRC2012_val_00045671.JPEG n09472597/
+mv val/ILSVRC2012_val_00045672.JPEG n02134418/
+mv val/ILSVRC2012_val_00045673.JPEG n02219486/
+mv val/ILSVRC2012_val_00045674.JPEG n02097047/
+mv val/ILSVRC2012_val_00045675.JPEG n03063689/
+mv val/ILSVRC2012_val_00045676.JPEG n02091467/
+mv val/ILSVRC2012_val_00045677.JPEG n03781244/
+mv val/ILSVRC2012_val_00045678.JPEG n02807133/
+mv val/ILSVRC2012_val_00045679.JPEG n03814906/
+mv val/ILSVRC2012_val_00045680.JPEG n04355338/
+mv val/ILSVRC2012_val_00045681.JPEG n04579145/
+mv val/ILSVRC2012_val_00045682.JPEG n03272010/
+mv val/ILSVRC2012_val_00045683.JPEG n02086646/
+mv val/ILSVRC2012_val_00045684.JPEG n02106662/
+mv val/ILSVRC2012_val_00045685.JPEG n03956157/
+mv val/ILSVRC2012_val_00045686.JPEG n02783161/
+mv val/ILSVRC2012_val_00045687.JPEG n02112137/
+mv val/ILSVRC2012_val_00045688.JPEG n03188531/
+mv val/ILSVRC2012_val_00045689.JPEG n03126707/
+mv val/ILSVRC2012_val_00045690.JPEG n01608432/
+mv val/ILSVRC2012_val_00045691.JPEG n03337140/
+mv val/ILSVRC2012_val_00045692.JPEG n01847000/
+mv val/ILSVRC2012_val_00045693.JPEG n04125021/
+mv val/ILSVRC2012_val_00045694.JPEG n04147183/
+mv val/ILSVRC2012_val_00045695.JPEG n07720875/
+mv val/ILSVRC2012_val_00045696.JPEG n02319095/
+mv val/ILSVRC2012_val_00045697.JPEG n02510455/
+mv val/ILSVRC2012_val_00045698.JPEG n04311174/
+mv val/ILSVRC2012_val_00045699.JPEG n03584254/
+mv val/ILSVRC2012_val_00045700.JPEG n04542943/
+mv val/ILSVRC2012_val_00045701.JPEG n02102480/
+mv val/ILSVRC2012_val_00045702.JPEG n02114712/
+mv val/ILSVRC2012_val_00045703.JPEG n02268443/
+mv val/ILSVRC2012_val_00045704.JPEG n07718472/
+mv val/ILSVRC2012_val_00045705.JPEG n03792972/
+mv val/ILSVRC2012_val_00045706.JPEG n03724870/
+mv val/ILSVRC2012_val_00045707.JPEG n04239074/
+mv val/ILSVRC2012_val_00045708.JPEG n02091134/
+mv val/ILSVRC2012_val_00045709.JPEG n02129604/
+mv val/ILSVRC2012_val_00045710.JPEG n03127925/
+mv val/ILSVRC2012_val_00045711.JPEG n02086646/
+mv val/ILSVRC2012_val_00045712.JPEG n03207941/
+mv val/ILSVRC2012_val_00045713.JPEG n01819313/
+mv val/ILSVRC2012_val_00045714.JPEG n04522168/
+mv val/ILSVRC2012_val_00045715.JPEG n03271574/
+mv val/ILSVRC2012_val_00045716.JPEG n04487394/
+mv val/ILSVRC2012_val_00045717.JPEG n03710193/
+mv val/ILSVRC2012_val_00045718.JPEG n02105855/
+mv val/ILSVRC2012_val_00045719.JPEG n03131574/
+mv val/ILSVRC2012_val_00045720.JPEG n02105251/
+mv val/ILSVRC2012_val_00045721.JPEG n02095889/
+mv val/ILSVRC2012_val_00045722.JPEG n03384352/
+mv val/ILSVRC2012_val_00045723.JPEG n07880968/
+mv val/ILSVRC2012_val_00045724.JPEG n02259212/
+mv val/ILSVRC2012_val_00045725.JPEG n04069434/
+mv val/ILSVRC2012_val_00045726.JPEG n01669191/
+mv val/ILSVRC2012_val_00045727.JPEG n03710193/
+mv val/ILSVRC2012_val_00045728.JPEG n01855672/
+mv val/ILSVRC2012_val_00045729.JPEG n13037406/
+mv val/ILSVRC2012_val_00045730.JPEG n01484850/
+mv val/ILSVRC2012_val_00045731.JPEG n04476259/
+mv val/ILSVRC2012_val_00045732.JPEG n03871628/
+mv val/ILSVRC2012_val_00045733.JPEG n01774750/
+mv val/ILSVRC2012_val_00045734.JPEG n02108551/
+mv val/ILSVRC2012_val_00045735.JPEG n02090622/
+mv val/ILSVRC2012_val_00045736.JPEG n03733281/
+mv val/ILSVRC2012_val_00045737.JPEG n03724870/
+mv val/ILSVRC2012_val_00045738.JPEG n03976657/
+mv val/ILSVRC2012_val_00045739.JPEG n02099267/
+mv val/ILSVRC2012_val_00045740.JPEG n04127249/
+mv val/ILSVRC2012_val_00045741.JPEG n02097474/
+mv val/ILSVRC2012_val_00045742.JPEG n02056570/
+mv val/ILSVRC2012_val_00045743.JPEG n01795545/
+mv val/ILSVRC2012_val_00045744.JPEG n07714571/
+mv val/ILSVRC2012_val_00045745.JPEG n02107142/
+mv val/ILSVRC2012_val_00045746.JPEG n01608432/
+mv val/ILSVRC2012_val_00045747.JPEG n02113023/
+mv val/ILSVRC2012_val_00045748.JPEG n04486054/
+mv val/ILSVRC2012_val_00045749.JPEG n03876231/
+mv val/ILSVRC2012_val_00045750.JPEG n04270147/
+mv val/ILSVRC2012_val_00045751.JPEG n03461385/
+mv val/ILSVRC2012_val_00045752.JPEG n13040303/
+mv val/ILSVRC2012_val_00045753.JPEG n02102318/
+mv val/ILSVRC2012_val_00045754.JPEG n02910353/
+mv val/ILSVRC2012_val_00045755.JPEG n02094114/
+mv val/ILSVRC2012_val_00045756.JPEG n02786058/
+mv val/ILSVRC2012_val_00045757.JPEG n02992211/
+mv val/ILSVRC2012_val_00045758.JPEG n02396427/
+mv val/ILSVRC2012_val_00045759.JPEG n04344873/
+mv val/ILSVRC2012_val_00045760.JPEG n02097130/
+mv val/ILSVRC2012_val_00045761.JPEG n01443537/
+mv val/ILSVRC2012_val_00045762.JPEG n04325704/
+mv val/ILSVRC2012_val_00045763.JPEG n02093428/
+mv val/ILSVRC2012_val_00045764.JPEG n04258138/
+mv val/ILSVRC2012_val_00045765.JPEG n07584110/
+mv val/ILSVRC2012_val_00045766.JPEG n03443371/
+mv val/ILSVRC2012_val_00045767.JPEG n03481172/
+mv val/ILSVRC2012_val_00045768.JPEG n02110341/
+mv val/ILSVRC2012_val_00045769.JPEG n04141975/
+mv val/ILSVRC2012_val_00045770.JPEG n02226429/
+mv val/ILSVRC2012_val_00045771.JPEG n02281406/
+mv val/ILSVRC2012_val_00045772.JPEG n04141327/
+mv val/ILSVRC2012_val_00045773.JPEG n04118538/
+mv val/ILSVRC2012_val_00045774.JPEG n02037110/
+mv val/ILSVRC2012_val_00045775.JPEG n02226429/
+mv val/ILSVRC2012_val_00045776.JPEG n01692333/
+mv val/ILSVRC2012_val_00045777.JPEG n03916031/
+mv val/ILSVRC2012_val_00045778.JPEG n02787622/
+mv val/ILSVRC2012_val_00045779.JPEG n03594945/
+mv val/ILSVRC2012_val_00045780.JPEG n07860988/
+mv val/ILSVRC2012_val_00045781.JPEG n03729826/
+mv val/ILSVRC2012_val_00045782.JPEG n04515003/
+mv val/ILSVRC2012_val_00045783.JPEG n04612504/
+mv val/ILSVRC2012_val_00045784.JPEG n02007558/
+mv val/ILSVRC2012_val_00045785.JPEG n01560419/
+mv val/ILSVRC2012_val_00045786.JPEG n02951358/
+mv val/ILSVRC2012_val_00045787.JPEG n02837789/
+mv val/ILSVRC2012_val_00045788.JPEG n04456115/
+mv val/ILSVRC2012_val_00045789.JPEG n04239074/
+mv val/ILSVRC2012_val_00045790.JPEG n02094433/
+mv val/ILSVRC2012_val_00045791.JPEG n04553703/
+mv val/ILSVRC2012_val_00045792.JPEG n03045698/
+mv val/ILSVRC2012_val_00045793.JPEG n03874599/
+mv val/ILSVRC2012_val_00045794.JPEG n03595614/
+mv val/ILSVRC2012_val_00045795.JPEG n02514041/
+mv val/ILSVRC2012_val_00045796.JPEG n03876231/
+mv val/ILSVRC2012_val_00045797.JPEG n04467665/
+mv val/ILSVRC2012_val_00045798.JPEG n04146614/
+mv val/ILSVRC2012_val_00045799.JPEG n02089973/
+mv val/ILSVRC2012_val_00045800.JPEG n04005630/
+mv val/ILSVRC2012_val_00045801.JPEG n04266014/
+mv val/ILSVRC2012_val_00045802.JPEG n04074963/
+mv val/ILSVRC2012_val_00045803.JPEG n03527444/
+mv val/ILSVRC2012_val_00045804.JPEG n04355338/
+mv val/ILSVRC2012_val_00045805.JPEG n09246464/
+mv val/ILSVRC2012_val_00045806.JPEG n03980874/
+mv val/ILSVRC2012_val_00045807.JPEG n01990800/
+mv val/ILSVRC2012_val_00045808.JPEG n03697007/
+mv val/ILSVRC2012_val_00045809.JPEG n13133613/
+mv val/ILSVRC2012_val_00045810.JPEG n07613480/
+mv val/ILSVRC2012_val_00045811.JPEG n02655020/
+mv val/ILSVRC2012_val_00045812.JPEG n03240683/
+mv val/ILSVRC2012_val_00045813.JPEG n04111531/
+mv val/ILSVRC2012_val_00045814.JPEG n01871265/
+mv val/ILSVRC2012_val_00045815.JPEG n01695060/
+mv val/ILSVRC2012_val_00045816.JPEG n03478589/
+mv val/ILSVRC2012_val_00045817.JPEG n04265275/
+mv val/ILSVRC2012_val_00045818.JPEG n02094433/
+mv val/ILSVRC2012_val_00045819.JPEG n02009229/
+mv val/ILSVRC2012_val_00045820.JPEG n02708093/
+mv val/ILSVRC2012_val_00045821.JPEG n03447447/
+mv val/ILSVRC2012_val_00045822.JPEG n03216828/
+mv val/ILSVRC2012_val_00045823.JPEG n04371430/
+mv val/ILSVRC2012_val_00045824.JPEG n03991062/
+mv val/ILSVRC2012_val_00045825.JPEG n02607072/
+mv val/ILSVRC2012_val_00045826.JPEG n02481823/
+mv val/ILSVRC2012_val_00045827.JPEG n02102318/
+mv val/ILSVRC2012_val_00045828.JPEG n09256479/
+mv val/ILSVRC2012_val_00045829.JPEG n02123597/
+mv val/ILSVRC2012_val_00045830.JPEG n02927161/
+mv val/ILSVRC2012_val_00045831.JPEG n01737021/
+mv val/ILSVRC2012_val_00045832.JPEG n01675722/
+mv val/ILSVRC2012_val_00045833.JPEG n11939491/
+mv val/ILSVRC2012_val_00045834.JPEG n03937543/
+mv val/ILSVRC2012_val_00045835.JPEG n03729826/
+mv val/ILSVRC2012_val_00045836.JPEG n01820546/
+mv val/ILSVRC2012_val_00045837.JPEG n01847000/
+mv val/ILSVRC2012_val_00045838.JPEG n02112137/
+mv val/ILSVRC2012_val_00045839.JPEG n01675722/
+mv val/ILSVRC2012_val_00045840.JPEG n04613696/
+mv val/ILSVRC2012_val_00045841.JPEG n02974003/
+mv val/ILSVRC2012_val_00045842.JPEG n03384352/
+mv val/ILSVRC2012_val_00045843.JPEG n03627232/
+mv val/ILSVRC2012_val_00045844.JPEG n04429376/
+mv val/ILSVRC2012_val_00045845.JPEG n01756291/
+mv val/ILSVRC2012_val_00045846.JPEG n03496892/
+mv val/ILSVRC2012_val_00045847.JPEG n02398521/
+mv val/ILSVRC2012_val_00045848.JPEG n02168699/
+mv val/ILSVRC2012_val_00045849.JPEG n03000247/
+mv val/ILSVRC2012_val_00045850.JPEG n01739381/
+mv val/ILSVRC2012_val_00045851.JPEG n04371430/
+mv val/ILSVRC2012_val_00045852.JPEG n04335435/
+mv val/ILSVRC2012_val_00045853.JPEG n03532672/
+mv val/ILSVRC2012_val_00045854.JPEG n02441942/
+mv val/ILSVRC2012_val_00045855.JPEG n03400231/
+mv val/ILSVRC2012_val_00045856.JPEG n03793489/
+mv val/ILSVRC2012_val_00045857.JPEG n01795545/
+mv val/ILSVRC2012_val_00045858.JPEG n01740131/
+mv val/ILSVRC2012_val_00045859.JPEG n02110806/
+mv val/ILSVRC2012_val_00045860.JPEG n03063599/
+mv val/ILSVRC2012_val_00045861.JPEG n02095314/
+mv val/ILSVRC2012_val_00045862.JPEG n04579432/
+mv val/ILSVRC2012_val_00045863.JPEG n04591157/
+mv val/ILSVRC2012_val_00045864.JPEG n02321529/
+mv val/ILSVRC2012_val_00045865.JPEG n03661043/
+mv val/ILSVRC2012_val_00045866.JPEG n01440764/
+mv val/ILSVRC2012_val_00045867.JPEG n04228054/
+mv val/ILSVRC2012_val_00045868.JPEG n04462240/
+mv val/ILSVRC2012_val_00045869.JPEG n03877472/
+mv val/ILSVRC2012_val_00045870.JPEG n03720891/
+mv val/ILSVRC2012_val_00045871.JPEG n02514041/
+mv val/ILSVRC2012_val_00045872.JPEG n03272562/
+mv val/ILSVRC2012_val_00045873.JPEG n01601694/
+mv val/ILSVRC2012_val_00045874.JPEG n02091467/
+mv val/ILSVRC2012_val_00045875.JPEG n04041544/
+mv val/ILSVRC2012_val_00045876.JPEG n03796401/
+mv val/ILSVRC2012_val_00045877.JPEG n03594734/
+mv val/ILSVRC2012_val_00045878.JPEG n02089078/
+mv val/ILSVRC2012_val_00045879.JPEG n02493793/
+mv val/ILSVRC2012_val_00045880.JPEG n01440764/
+mv val/ILSVRC2012_val_00045881.JPEG n09399592/
+mv val/ILSVRC2012_val_00045882.JPEG n03775071/
+mv val/ILSVRC2012_val_00045883.JPEG n04296562/
+mv val/ILSVRC2012_val_00045884.JPEG n02099849/
+mv val/ILSVRC2012_val_00045885.JPEG n02804610/
+mv val/ILSVRC2012_val_00045886.JPEG n03384352/
+mv val/ILSVRC2012_val_00045887.JPEG n02088632/
+mv val/ILSVRC2012_val_00045888.JPEG n04026417/
+mv val/ILSVRC2012_val_00045889.JPEG n02794156/
+mv val/ILSVRC2012_val_00045890.JPEG n01968897/
+mv val/ILSVRC2012_val_00045891.JPEG n02133161/
+mv val/ILSVRC2012_val_00045892.JPEG n03777754/
+mv val/ILSVRC2012_val_00045893.JPEG n02494079/
+mv val/ILSVRC2012_val_00045894.JPEG n02107142/
+mv val/ILSVRC2012_val_00045895.JPEG n03710193/
+mv val/ILSVRC2012_val_00045896.JPEG n02640242/
+mv val/ILSVRC2012_val_00045897.JPEG n04209133/
+mv val/ILSVRC2012_val_00045898.JPEG n02443114/
+mv val/ILSVRC2012_val_00045899.JPEG n03259280/
+mv val/ILSVRC2012_val_00045900.JPEG n02172182/
+mv val/ILSVRC2012_val_00045901.JPEG n02089078/
+mv val/ILSVRC2012_val_00045902.JPEG n04049303/
+mv val/ILSVRC2012_val_00045903.JPEG n02093647/
+mv val/ILSVRC2012_val_00045904.JPEG n06785654/
+mv val/ILSVRC2012_val_00045905.JPEG n03733131/
+mv val/ILSVRC2012_val_00045906.JPEG n03476991/
+mv val/ILSVRC2012_val_00045907.JPEG n04259630/
+mv val/ILSVRC2012_val_00045908.JPEG n01768244/
+mv val/ILSVRC2012_val_00045909.JPEG n13037406/
+mv val/ILSVRC2012_val_00045910.JPEG n02168699/
+mv val/ILSVRC2012_val_00045911.JPEG n02013706/
+mv val/ILSVRC2012_val_00045912.JPEG n02089078/
+mv val/ILSVRC2012_val_00045913.JPEG n01817953/
+mv val/ILSVRC2012_val_00045914.JPEG n02280649/
+mv val/ILSVRC2012_val_00045915.JPEG n02877765/
+mv val/ILSVRC2012_val_00045916.JPEG n04273569/
+mv val/ILSVRC2012_val_00045917.JPEG n02097209/
+mv val/ILSVRC2012_val_00045918.JPEG n06785654/
+mv val/ILSVRC2012_val_00045919.JPEG n02104365/
+mv val/ILSVRC2012_val_00045920.JPEG n02107908/
+mv val/ILSVRC2012_val_00045921.JPEG n02484975/
+mv val/ILSVRC2012_val_00045922.JPEG n02906734/
+mv val/ILSVRC2012_val_00045923.JPEG n09468604/
+mv val/ILSVRC2012_val_00045924.JPEG n01632777/
+mv val/ILSVRC2012_val_00045925.JPEG n01494475/
+mv val/ILSVRC2012_val_00045926.JPEG n01983481/
+mv val/ILSVRC2012_val_00045927.JPEG n04372370/
+mv val/ILSVRC2012_val_00045928.JPEG n02364673/
+mv val/ILSVRC2012_val_00045929.JPEG n02730930/
+mv val/ILSVRC2012_val_00045930.JPEG n02100583/
+mv val/ILSVRC2012_val_00045931.JPEG n04127249/
+mv val/ILSVRC2012_val_00045932.JPEG n03355925/
+mv val/ILSVRC2012_val_00045933.JPEG n02108089/
+mv val/ILSVRC2012_val_00045934.JPEG n03197337/
+mv val/ILSVRC2012_val_00045935.JPEG n03857828/
+mv val/ILSVRC2012_val_00045936.JPEG n01496331/
+mv val/ILSVRC2012_val_00045937.JPEG n02110341/
+mv val/ILSVRC2012_val_00045938.JPEG n04074963/
+mv val/ILSVRC2012_val_00045939.JPEG n02087046/
+mv val/ILSVRC2012_val_00045940.JPEG n03000684/
+mv val/ILSVRC2012_val_00045941.JPEG n03485794/
+mv val/ILSVRC2012_val_00045942.JPEG n02500267/
+mv val/ILSVRC2012_val_00045943.JPEG n02105162/
+mv val/ILSVRC2012_val_00045944.JPEG n03425413/
+mv val/ILSVRC2012_val_00045945.JPEG n01944390/
+mv val/ILSVRC2012_val_00045946.JPEG n02112018/
+mv val/ILSVRC2012_val_00045947.JPEG n04005630/
+mv val/ILSVRC2012_val_00045948.JPEG n01582220/
+mv val/ILSVRC2012_val_00045949.JPEG n04275548/
+mv val/ILSVRC2012_val_00045950.JPEG n07754684/
+mv val/ILSVRC2012_val_00045951.JPEG n02011460/
+mv val/ILSVRC2012_val_00045952.JPEG n02132136/
+mv val/ILSVRC2012_val_00045953.JPEG n01748264/
+mv val/ILSVRC2012_val_00045954.JPEG n04228054/
+mv val/ILSVRC2012_val_00045955.JPEG n02980441/
+mv val/ILSVRC2012_val_00045956.JPEG n02113624/
+mv val/ILSVRC2012_val_00045957.JPEG n04597913/
+mv val/ILSVRC2012_val_00045958.JPEG n02123159/
+mv val/ILSVRC2012_val_00045959.JPEG n02027492/
+mv val/ILSVRC2012_val_00045960.JPEG n04590129/
+mv val/ILSVRC2012_val_00045961.JPEG n02114548/
+mv val/ILSVRC2012_val_00045962.JPEG n03208938/
+mv val/ILSVRC2012_val_00045963.JPEG n02099267/
+mv val/ILSVRC2012_val_00045964.JPEG n03538406/
+mv val/ILSVRC2012_val_00045965.JPEG n03218198/
+mv val/ILSVRC2012_val_00045966.JPEG n04254120/
+mv val/ILSVRC2012_val_00045967.JPEG n03337140/
+mv val/ILSVRC2012_val_00045968.JPEG n02089078/
+mv val/ILSVRC2012_val_00045969.JPEG n02701002/
+mv val/ILSVRC2012_val_00045970.JPEG n02086240/
+mv val/ILSVRC2012_val_00045971.JPEG n02088632/
+mv val/ILSVRC2012_val_00045972.JPEG n01943899/
+mv val/ILSVRC2012_val_00045973.JPEG n13052670/
+mv val/ILSVRC2012_val_00045974.JPEG n04606251/
+mv val/ILSVRC2012_val_00045975.JPEG n09229709/
+mv val/ILSVRC2012_val_00045976.JPEG n01687978/
+mv val/ILSVRC2012_val_00045977.JPEG n03929660/
+mv val/ILSVRC2012_val_00045978.JPEG n02093754/
+mv val/ILSVRC2012_val_00045979.JPEG n01729322/
+mv val/ILSVRC2012_val_00045980.JPEG n02107908/
+mv val/ILSVRC2012_val_00045981.JPEG n07715103/
+mv val/ILSVRC2012_val_00045982.JPEG n03773504/
+mv val/ILSVRC2012_val_00045983.JPEG n04592741/
+mv val/ILSVRC2012_val_00045984.JPEG n02107908/
+mv val/ILSVRC2012_val_00045985.JPEG n02264363/
+mv val/ILSVRC2012_val_00045986.JPEG n04154565/
+mv val/ILSVRC2012_val_00045987.JPEG n02098105/
+mv val/ILSVRC2012_val_00045988.JPEG n03485794/
+mv val/ILSVRC2012_val_00045989.JPEG n02791270/
+mv val/ILSVRC2012_val_00045990.JPEG n06874185/
+mv val/ILSVRC2012_val_00045991.JPEG n02488702/
+mv val/ILSVRC2012_val_00045992.JPEG n03014705/
+mv val/ILSVRC2012_val_00045993.JPEG n03657121/
+mv val/ILSVRC2012_val_00045994.JPEG n03854065/
+mv val/ILSVRC2012_val_00045995.JPEG n02107574/
+mv val/ILSVRC2012_val_00045996.JPEG n02669723/
+mv val/ILSVRC2012_val_00045997.JPEG n03950228/
+mv val/ILSVRC2012_val_00045998.JPEG n02317335/
+mv val/ILSVRC2012_val_00045999.JPEG n04133789/
+mv val/ILSVRC2012_val_00046000.JPEG n01685808/
+mv val/ILSVRC2012_val_00046001.JPEG n03933933/
+mv val/ILSVRC2012_val_00046002.JPEG n02097047/
+mv val/ILSVRC2012_val_00046003.JPEG n02011460/
+mv val/ILSVRC2012_val_00046004.JPEG n01819313/
+mv val/ILSVRC2012_val_00046005.JPEG n03982430/
+mv val/ILSVRC2012_val_00046006.JPEG n01784675/
+mv val/ILSVRC2012_val_00046007.JPEG n03670208/
+mv val/ILSVRC2012_val_00046008.JPEG n03220513/
+mv val/ILSVRC2012_val_00046009.JPEG n04118538/
+mv val/ILSVRC2012_val_00046010.JPEG n02782093/
+mv val/ILSVRC2012_val_00046011.JPEG n02783161/
+mv val/ILSVRC2012_val_00046012.JPEG n03496892/
+mv val/ILSVRC2012_val_00046013.JPEG n02107574/
+mv val/ILSVRC2012_val_00046014.JPEG n04040759/
+mv val/ILSVRC2012_val_00046015.JPEG n02013706/
+mv val/ILSVRC2012_val_00046016.JPEG n02777292/
+mv val/ILSVRC2012_val_00046017.JPEG n01775062/
+mv val/ILSVRC2012_val_00046018.JPEG n01748264/
+mv val/ILSVRC2012_val_00046019.JPEG n03018349/
+mv val/ILSVRC2012_val_00046020.JPEG n04111531/
+mv val/ILSVRC2012_val_00046021.JPEG n02089867/
+mv val/ILSVRC2012_val_00046022.JPEG n09246464/
+mv val/ILSVRC2012_val_00046023.JPEG n04548280/
+mv val/ILSVRC2012_val_00046024.JPEG n07734744/
+mv val/ILSVRC2012_val_00046025.JPEG n03291819/
+mv val/ILSVRC2012_val_00046026.JPEG n04552348/
+mv val/ILSVRC2012_val_00046027.JPEG n03871628/
+mv val/ILSVRC2012_val_00046028.JPEG n07753113/
+mv val/ILSVRC2012_val_00046029.JPEG n01729322/
+mv val/ILSVRC2012_val_00046030.JPEG n07715103/
+mv val/ILSVRC2012_val_00046031.JPEG n04596742/
+mv val/ILSVRC2012_val_00046032.JPEG n02128385/
+mv val/ILSVRC2012_val_00046033.JPEG n03976467/
+mv val/ILSVRC2012_val_00046034.JPEG n04548280/
+mv val/ILSVRC2012_val_00046035.JPEG n02497673/
+mv val/ILSVRC2012_val_00046036.JPEG n02134418/
+mv val/ILSVRC2012_val_00046037.JPEG n02105251/
+mv val/ILSVRC2012_val_00046038.JPEG n03970156/
+mv val/ILSVRC2012_val_00046039.JPEG n01749939/
+mv val/ILSVRC2012_val_00046040.JPEG n01795545/
+mv val/ILSVRC2012_val_00046041.JPEG n01855032/
+mv val/ILSVRC2012_val_00046042.JPEG n02395406/
+mv val/ILSVRC2012_val_00046043.JPEG n02098413/
+mv val/ILSVRC2012_val_00046044.JPEG n02111500/
+mv val/ILSVRC2012_val_00046045.JPEG n02895154/
+mv val/ILSVRC2012_val_00046046.JPEG n07565083/
+mv val/ILSVRC2012_val_00046047.JPEG n03742115/
+mv val/ILSVRC2012_val_00046048.JPEG n02108089/
+mv val/ILSVRC2012_val_00046049.JPEG n02321529/
+mv val/ILSVRC2012_val_00046050.JPEG n02971356/
+mv val/ILSVRC2012_val_00046051.JPEG n02437616/
+mv val/ILSVRC2012_val_00046052.JPEG n03208938/
+mv val/ILSVRC2012_val_00046053.JPEG n01667114/
+mv val/ILSVRC2012_val_00046054.JPEG n02226429/
+mv val/ILSVRC2012_val_00046055.JPEG n03877845/
+mv val/ILSVRC2012_val_00046056.JPEG n02910353/
+mv val/ILSVRC2012_val_00046057.JPEG n04070727/
+mv val/ILSVRC2012_val_00046058.JPEG n04152593/
+mv val/ILSVRC2012_val_00046059.JPEG n01883070/
+mv val/ILSVRC2012_val_00046060.JPEG n02870880/
+mv val/ILSVRC2012_val_00046061.JPEG n02504458/
+mv val/ILSVRC2012_val_00046062.JPEG n04243546/
+mv val/ILSVRC2012_val_00046063.JPEG n02096051/
+mv val/ILSVRC2012_val_00046064.JPEG n03899768/
+mv val/ILSVRC2012_val_00046065.JPEG n02321529/
+mv val/ILSVRC2012_val_00046066.JPEG n03877845/
+mv val/ILSVRC2012_val_00046067.JPEG n03450230/
+mv val/ILSVRC2012_val_00046068.JPEG n03290653/
+mv val/ILSVRC2012_val_00046069.JPEG n01664065/
+mv val/ILSVRC2012_val_00046070.JPEG n03908714/
+mv val/ILSVRC2012_val_00046071.JPEG n01537544/
+mv val/ILSVRC2012_val_00046072.JPEG n02088238/
+mv val/ILSVRC2012_val_00046073.JPEG n01882714/
+mv val/ILSVRC2012_val_00046074.JPEG n01773549/
+mv val/ILSVRC2012_val_00046075.JPEG n04418357/
+mv val/ILSVRC2012_val_00046076.JPEG n02727426/
+mv val/ILSVRC2012_val_00046077.JPEG n01872401/
+mv val/ILSVRC2012_val_00046078.JPEG n02106382/
+mv val/ILSVRC2012_val_00046079.JPEG n03991062/
+mv val/ILSVRC2012_val_00046080.JPEG n02017213/
+mv val/ILSVRC2012_val_00046081.JPEG n02018207/
+mv val/ILSVRC2012_val_00046082.JPEG n04370456/
+mv val/ILSVRC2012_val_00046083.JPEG n02219486/
+mv val/ILSVRC2012_val_00046084.JPEG n02669723/
+mv val/ILSVRC2012_val_00046085.JPEG n01694178/
+mv val/ILSVRC2012_val_00046086.JPEG n01784675/
+mv val/ILSVRC2012_val_00046087.JPEG n03443371/
+mv val/ILSVRC2012_val_00046088.JPEG n02114548/
+mv val/ILSVRC2012_val_00046089.JPEG n01806567/
+mv val/ILSVRC2012_val_00046090.JPEG n04090263/
+mv val/ILSVRC2012_val_00046091.JPEG n07932039/
+mv val/ILSVRC2012_val_00046092.JPEG n01608432/
+mv val/ILSVRC2012_val_00046093.JPEG n02281406/
+mv val/ILSVRC2012_val_00046094.JPEG n04238763/
+mv val/ILSVRC2012_val_00046095.JPEG n01664065/
+mv val/ILSVRC2012_val_00046096.JPEG n02028035/
+mv val/ILSVRC2012_val_00046097.JPEG n01917289/
+mv val/ILSVRC2012_val_00046098.JPEG n03793489/
+mv val/ILSVRC2012_val_00046099.JPEG n04209239/
+mv val/ILSVRC2012_val_00046100.JPEG n03042490/
+mv val/ILSVRC2012_val_00046101.JPEG n03400231/
+mv val/ILSVRC2012_val_00046102.JPEG n02356798/
+mv val/ILSVRC2012_val_00046103.JPEG n03065424/
+mv val/ILSVRC2012_val_00046104.JPEG n04335435/
+mv val/ILSVRC2012_val_00046105.JPEG n01664065/
+mv val/ILSVRC2012_val_00046106.JPEG n01692333/
+mv val/ILSVRC2012_val_00046107.JPEG n07880968/
+mv val/ILSVRC2012_val_00046108.JPEG n03297495/
+mv val/ILSVRC2012_val_00046109.JPEG n02841315/
+mv val/ILSVRC2012_val_00046110.JPEG n03095699/
+mv val/ILSVRC2012_val_00046111.JPEG n07697313/
+mv val/ILSVRC2012_val_00046112.JPEG n09399592/
+mv val/ILSVRC2012_val_00046113.JPEG n01917289/
+mv val/ILSVRC2012_val_00046114.JPEG n03724870/
+mv val/ILSVRC2012_val_00046115.JPEG n13133613/
+mv val/ILSVRC2012_val_00046116.JPEG n03787032/
+mv val/ILSVRC2012_val_00046117.JPEG n02493793/
+mv val/ILSVRC2012_val_00046118.JPEG n03843555/
+mv val/ILSVRC2012_val_00046119.JPEG n01629819/
+mv val/ILSVRC2012_val_00046120.JPEG n03843555/
+mv val/ILSVRC2012_val_00046121.JPEG n04461696/
+mv val/ILSVRC2012_val_00046122.JPEG n01669191/
+mv val/ILSVRC2012_val_00046123.JPEG n03976657/
+mv val/ILSVRC2012_val_00046124.JPEG n02097047/
+mv val/ILSVRC2012_val_00046125.JPEG n03773504/
+mv val/ILSVRC2012_val_00046126.JPEG n02951585/
+mv val/ILSVRC2012_val_00046127.JPEG n04398044/
+mv val/ILSVRC2012_val_00046128.JPEG n03599486/
+mv val/ILSVRC2012_val_00046129.JPEG n03250847/
+mv val/ILSVRC2012_val_00046130.JPEG n03796401/
+mv val/ILSVRC2012_val_00046131.JPEG n01737021/
+mv val/ILSVRC2012_val_00046132.JPEG n02776631/
+mv val/ILSVRC2012_val_00046133.JPEG n03599486/
+mv val/ILSVRC2012_val_00046134.JPEG n02110806/
+mv val/ILSVRC2012_val_00046135.JPEG n04254680/
+mv val/ILSVRC2012_val_00046136.JPEG n02138441/
+mv val/ILSVRC2012_val_00046137.JPEG n02483362/
+mv val/ILSVRC2012_val_00046138.JPEG n02747177/
+mv val/ILSVRC2012_val_00046139.JPEG n03733805/
+mv val/ILSVRC2012_val_00046140.JPEG n04118538/
+mv val/ILSVRC2012_val_00046141.JPEG n01829413/
+mv val/ILSVRC2012_val_00046142.JPEG n02112137/
+mv val/ILSVRC2012_val_00046143.JPEG n02102318/
+mv val/ILSVRC2012_val_00046144.JPEG n02097474/
+mv val/ILSVRC2012_val_00046145.JPEG n02119789/
+mv val/ILSVRC2012_val_00046146.JPEG n04136333/
+mv val/ILSVRC2012_val_00046147.JPEG n04579432/
+mv val/ILSVRC2012_val_00046148.JPEG n02493509/
+mv val/ILSVRC2012_val_00046149.JPEG n01667778/
+mv val/ILSVRC2012_val_00046150.JPEG n02442845/
+mv val/ILSVRC2012_val_00046151.JPEG n02097209/
+mv val/ILSVRC2012_val_00046152.JPEG n03404251/
+mv val/ILSVRC2012_val_00046153.JPEG n02488291/
+mv val/ILSVRC2012_val_00046154.JPEG n02091032/
+mv val/ILSVRC2012_val_00046155.JPEG n01882714/
+mv val/ILSVRC2012_val_00046156.JPEG n04081281/
+mv val/ILSVRC2012_val_00046157.JPEG n02963159/
+mv val/ILSVRC2012_val_00046158.JPEG n02088632/
+mv val/ILSVRC2012_val_00046159.JPEG n01491361/
+mv val/ILSVRC2012_val_00046160.JPEG n04380533/
+mv val/ILSVRC2012_val_00046161.JPEG n04423845/
+mv val/ILSVRC2012_val_00046162.JPEG n01629819/
+mv val/ILSVRC2012_val_00046163.JPEG n03956157/
+mv val/ILSVRC2012_val_00046164.JPEG n04548362/
+mv val/ILSVRC2012_val_00046165.JPEG n02804610/
+mv val/ILSVRC2012_val_00046166.JPEG n04310018/
+mv val/ILSVRC2012_val_00046167.JPEG n04251144/
+mv val/ILSVRC2012_val_00046168.JPEG n07860988/
+mv val/ILSVRC2012_val_00046169.JPEG n02692877/
+mv val/ILSVRC2012_val_00046170.JPEG n03938244/
+mv val/ILSVRC2012_val_00046171.JPEG n01484850/
+mv val/ILSVRC2012_val_00046172.JPEG n04325704/
+mv val/ILSVRC2012_val_00046173.JPEG n01560419/
+mv val/ILSVRC2012_val_00046174.JPEG n02916936/
+mv val/ILSVRC2012_val_00046175.JPEG n02442845/
+mv val/ILSVRC2012_val_00046176.JPEG n03998194/
+mv val/ILSVRC2012_val_00046177.JPEG n04330267/
+mv val/ILSVRC2012_val_00046178.JPEG n03425413/
+mv val/ILSVRC2012_val_00046179.JPEG n07932039/
+mv val/ILSVRC2012_val_00046180.JPEG n01984695/
+mv val/ILSVRC2012_val_00046181.JPEG n03345487/
+mv val/ILSVRC2012_val_00046182.JPEG n03259280/
+mv val/ILSVRC2012_val_00046183.JPEG n07768694/
+mv val/ILSVRC2012_val_00046184.JPEG n02444819/
+mv val/ILSVRC2012_val_00046185.JPEG n01675722/
+mv val/ILSVRC2012_val_00046186.JPEG n02328150/
+mv val/ILSVRC2012_val_00046187.JPEG n04070727/
+mv val/ILSVRC2012_val_00046188.JPEG n04423845/
+mv val/ILSVRC2012_val_00046189.JPEG n03729826/
+mv val/ILSVRC2012_val_00046190.JPEG n07684084/
+mv val/ILSVRC2012_val_00046191.JPEG n03485794/
+mv val/ILSVRC2012_val_00046192.JPEG n03498962/
+mv val/ILSVRC2012_val_00046193.JPEG n01753488/
+mv val/ILSVRC2012_val_00046194.JPEG n03958227/
+mv val/ILSVRC2012_val_00046195.JPEG n02895154/
+mv val/ILSVRC2012_val_00046196.JPEG n03100240/
+mv val/ILSVRC2012_val_00046197.JPEG n02110806/
+mv val/ILSVRC2012_val_00046198.JPEG n04118776/
+mv val/ILSVRC2012_val_00046199.JPEG n02105056/
+mv val/ILSVRC2012_val_00046200.JPEG n03874293/
+mv val/ILSVRC2012_val_00046201.JPEG n04037443/
+mv val/ILSVRC2012_val_00046202.JPEG n03496892/
+mv val/ILSVRC2012_val_00046203.JPEG n07745940/
+mv val/ILSVRC2012_val_00046204.JPEG n03871628/
+mv val/ILSVRC2012_val_00046205.JPEG n03372029/
+mv val/ILSVRC2012_val_00046206.JPEG n02100735/
+mv val/ILSVRC2012_val_00046207.JPEG n02132136/
+mv val/ILSVRC2012_val_00046208.JPEG n03623198/
+mv val/ILSVRC2012_val_00046209.JPEG n03666591/
+mv val/ILSVRC2012_val_00046210.JPEG n02823750/
+mv val/ILSVRC2012_val_00046211.JPEG n01735189/
+mv val/ILSVRC2012_val_00046212.JPEG n02106382/
+mv val/ILSVRC2012_val_00046213.JPEG n07697537/
+mv val/ILSVRC2012_val_00046214.JPEG n02454379/
+mv val/ILSVRC2012_val_00046215.JPEG n04311004/
+mv val/ILSVRC2012_val_00046216.JPEG n03110669/
+mv val/ILSVRC2012_val_00046217.JPEG n04009552/
+mv val/ILSVRC2012_val_00046218.JPEG n02074367/
+mv val/ILSVRC2012_val_00046219.JPEG n02442845/
+mv val/ILSVRC2012_val_00046220.JPEG n02099601/
+mv val/ILSVRC2012_val_00046221.JPEG n09246464/
+mv val/ILSVRC2012_val_00046222.JPEG n03814906/
+mv val/ILSVRC2012_val_00046223.JPEG n04049303/
+mv val/ILSVRC2012_val_00046224.JPEG n01749939/
+mv val/ILSVRC2012_val_00046225.JPEG n03803284/
+mv val/ILSVRC2012_val_00046226.JPEG n02667093/
+mv val/ILSVRC2012_val_00046227.JPEG n03908714/
+mv val/ILSVRC2012_val_00046228.JPEG n04409515/
+mv val/ILSVRC2012_val_00046229.JPEG n03290653/
+mv val/ILSVRC2012_val_00046230.JPEG n07730033/
+mv val/ILSVRC2012_val_00046231.JPEG n02268443/
+mv val/ILSVRC2012_val_00046232.JPEG n03028079/
+mv val/ILSVRC2012_val_00046233.JPEG n02514041/
+mv val/ILSVRC2012_val_00046234.JPEG n04592741/
+mv val/ILSVRC2012_val_00046235.JPEG n07720875/
+mv val/ILSVRC2012_val_00046236.JPEG n02988304/
+mv val/ILSVRC2012_val_00046237.JPEG n02606052/
+mv val/ILSVRC2012_val_00046238.JPEG n03877472/
+mv val/ILSVRC2012_val_00046239.JPEG n01798484/
+mv val/ILSVRC2012_val_00046240.JPEG n03742115/
+mv val/ILSVRC2012_val_00046241.JPEG n04461696/
+mv val/ILSVRC2012_val_00046242.JPEG n02917067/
+mv val/ILSVRC2012_val_00046243.JPEG n01629819/
+mv val/ILSVRC2012_val_00046244.JPEG n04486054/
+mv val/ILSVRC2012_val_00046245.JPEG n04548362/
+mv val/ILSVRC2012_val_00046246.JPEG n02860847/
+mv val/ILSVRC2012_val_00046247.JPEG n02107683/
+mv val/ILSVRC2012_val_00046248.JPEG n01944390/
+mv val/ILSVRC2012_val_00046249.JPEG n03786901/
+mv val/ILSVRC2012_val_00046250.JPEG n04044716/
+mv val/ILSVRC2012_val_00046251.JPEG n01824575/
+mv val/ILSVRC2012_val_00046252.JPEG n01440764/
+mv val/ILSVRC2012_val_00046253.JPEG n02279972/
+mv val/ILSVRC2012_val_00046254.JPEG n01914609/
+mv val/ILSVRC2012_val_00046255.JPEG n03272562/
+mv val/ILSVRC2012_val_00046256.JPEG n07590611/
+mv val/ILSVRC2012_val_00046257.JPEG n01728572/
+mv val/ILSVRC2012_val_00046258.JPEG n01687978/
+mv val/ILSVRC2012_val_00046259.JPEG n03791053/
+mv val/ILSVRC2012_val_00046260.JPEG n01518878/
+mv val/ILSVRC2012_val_00046261.JPEG n02950826/
+mv val/ILSVRC2012_val_00046262.JPEG n03982430/
+mv val/ILSVRC2012_val_00046263.JPEG n02966193/
+mv val/ILSVRC2012_val_00046264.JPEG n03841143/
+mv val/ILSVRC2012_val_00046265.JPEG n02672831/
+mv val/ILSVRC2012_val_00046266.JPEG n02787622/
+mv val/ILSVRC2012_val_00046267.JPEG n02165105/
+mv val/ILSVRC2012_val_00046268.JPEG n04525038/
+mv val/ILSVRC2012_val_00046269.JPEG n03662601/
+mv val/ILSVRC2012_val_00046270.JPEG n12057211/
+mv val/ILSVRC2012_val_00046271.JPEG n04522168/
+mv val/ILSVRC2012_val_00046272.JPEG n04613696/
+mv val/ILSVRC2012_val_00046273.JPEG n02088632/
+mv val/ILSVRC2012_val_00046274.JPEG n01985128/
+mv val/ILSVRC2012_val_00046275.JPEG n09472597/
+mv val/ILSVRC2012_val_00046276.JPEG n03271574/
+mv val/ILSVRC2012_val_00046277.JPEG n01687978/
+mv val/ILSVRC2012_val_00046278.JPEG n04147183/
+mv val/ILSVRC2012_val_00046279.JPEG n07875152/
+mv val/ILSVRC2012_val_00046280.JPEG n01580077/
+mv val/ILSVRC2012_val_00046281.JPEG n03393912/
+mv val/ILSVRC2012_val_00046282.JPEG n03903868/
+mv val/ILSVRC2012_val_00046283.JPEG n04074963/
+mv val/ILSVRC2012_val_00046284.JPEG n03788365/
+mv val/ILSVRC2012_val_00046285.JPEG n01843065/
+mv val/ILSVRC2012_val_00046286.JPEG n03690938/
+mv val/ILSVRC2012_val_00046287.JPEG n02105056/
+mv val/ILSVRC2012_val_00046288.JPEG n04525305/
+mv val/ILSVRC2012_val_00046289.JPEG n01631663/
+mv val/ILSVRC2012_val_00046290.JPEG n02097047/
+mv val/ILSVRC2012_val_00046291.JPEG n02486410/
+mv val/ILSVRC2012_val_00046292.JPEG n04152593/
+mv val/ILSVRC2012_val_00046293.JPEG n02879718/
+mv val/ILSVRC2012_val_00046294.JPEG n04443257/
+mv val/ILSVRC2012_val_00046295.JPEG n02102040/
+mv val/ILSVRC2012_val_00046296.JPEG n02093859/
+mv val/ILSVRC2012_val_00046297.JPEG n02127052/
+mv val/ILSVRC2012_val_00046298.JPEG n09332890/
+mv val/ILSVRC2012_val_00046299.JPEG n01770393/
+mv val/ILSVRC2012_val_00046300.JPEG n03527444/
+mv val/ILSVRC2012_val_00046301.JPEG n03697007/
+mv val/ILSVRC2012_val_00046302.JPEG n04515003/
+mv val/ILSVRC2012_val_00046303.JPEG n07873807/
+mv val/ILSVRC2012_val_00046304.JPEG n04429376/
+mv val/ILSVRC2012_val_00046305.JPEG n03991062/
+mv val/ILSVRC2012_val_00046306.JPEG n03085013/
+mv val/ILSVRC2012_val_00046307.JPEG n01828970/
+mv val/ILSVRC2012_val_00046308.JPEG n01608432/
+mv val/ILSVRC2012_val_00046309.JPEG n03930313/
+mv val/ILSVRC2012_val_00046310.JPEG n02105641/
+mv val/ILSVRC2012_val_00046311.JPEG n01756291/
+mv val/ILSVRC2012_val_00046312.JPEG n02500267/
+mv val/ILSVRC2012_val_00046313.JPEG n04039381/
+mv val/ILSVRC2012_val_00046314.JPEG n02168699/
+mv val/ILSVRC2012_val_00046315.JPEG n03259280/
+mv val/ILSVRC2012_val_00046316.JPEG n01855032/
+mv val/ILSVRC2012_val_00046317.JPEG n10565667/
+mv val/ILSVRC2012_val_00046318.JPEG n02115641/
+mv val/ILSVRC2012_val_00046319.JPEG n04515003/
+mv val/ILSVRC2012_val_00046320.JPEG n02669723/
+mv val/ILSVRC2012_val_00046321.JPEG n02988304/
+mv val/ILSVRC2012_val_00046322.JPEG n03825788/
+mv val/ILSVRC2012_val_00046323.JPEG n02025239/
+mv val/ILSVRC2012_val_00046324.JPEG n03706229/
+mv val/ILSVRC2012_val_00046325.JPEG n01914609/
+mv val/ILSVRC2012_val_00046326.JPEG n03344393/
+mv val/ILSVRC2012_val_00046327.JPEG n04049303/
+mv val/ILSVRC2012_val_00046328.JPEG n03259280/
+mv val/ILSVRC2012_val_00046329.JPEG n02091244/
+mv val/ILSVRC2012_val_00046330.JPEG n02514041/
+mv val/ILSVRC2012_val_00046331.JPEG n03065424/
+mv val/ILSVRC2012_val_00046332.JPEG n12057211/
+mv val/ILSVRC2012_val_00046333.JPEG n02027492/
+mv val/ILSVRC2012_val_00046334.JPEG n04118538/
+mv val/ILSVRC2012_val_00046335.JPEG n04141076/
+mv val/ILSVRC2012_val_00046336.JPEG n03899768/
+mv val/ILSVRC2012_val_00046337.JPEG n04462240/
+mv val/ILSVRC2012_val_00046338.JPEG n02096051/
+mv val/ILSVRC2012_val_00046339.JPEG n02978881/
+mv val/ILSVRC2012_val_00046340.JPEG n02114855/
+mv val/ILSVRC2012_val_00046341.JPEG n04509417/
+mv val/ILSVRC2012_val_00046342.JPEG n04505470/
+mv val/ILSVRC2012_val_00046343.JPEG n03201208/
+mv val/ILSVRC2012_val_00046344.JPEG n01986214/
+mv val/ILSVRC2012_val_00046345.JPEG n02417914/
+mv val/ILSVRC2012_val_00046346.JPEG n01677366/
+mv val/ILSVRC2012_val_00046347.JPEG n07747607/
+mv val/ILSVRC2012_val_00046348.JPEG n04409515/
+mv val/ILSVRC2012_val_00046349.JPEG n01685808/
+mv val/ILSVRC2012_val_00046350.JPEG n04599235/
+mv val/ILSVRC2012_val_00046351.JPEG n03187595/
+mv val/ILSVRC2012_val_00046352.JPEG n03657121/
+mv val/ILSVRC2012_val_00046353.JPEG n15075141/
+mv val/ILSVRC2012_val_00046354.JPEG n04372370/
+mv val/ILSVRC2012_val_00046355.JPEG n02966687/
+mv val/ILSVRC2012_val_00046356.JPEG n01820546/
+mv val/ILSVRC2012_val_00046357.JPEG n03344393/
+mv val/ILSVRC2012_val_00046358.JPEG n03476991/
+mv val/ILSVRC2012_val_00046359.JPEG n03763968/
+mv val/ILSVRC2012_val_00046360.JPEG n04070727/
+mv val/ILSVRC2012_val_00046361.JPEG n03041632/
+mv val/ILSVRC2012_val_00046362.JPEG n01877812/
+mv val/ILSVRC2012_val_00046363.JPEG n07248320/
+mv val/ILSVRC2012_val_00046364.JPEG n07875152/
+mv val/ILSVRC2012_val_00046365.JPEG n02892767/
+mv val/ILSVRC2012_val_00046366.JPEG n03355925/
+mv val/ILSVRC2012_val_00046367.JPEG n01685808/
+mv val/ILSVRC2012_val_00046368.JPEG n04228054/
+mv val/ILSVRC2012_val_00046369.JPEG n03843555/
+mv val/ILSVRC2012_val_00046370.JPEG n01755581/
+mv val/ILSVRC2012_val_00046371.JPEG n04347754/
+mv val/ILSVRC2012_val_00046372.JPEG n02277742/
+mv val/ILSVRC2012_val_00046373.JPEG n03000247/
+mv val/ILSVRC2012_val_00046374.JPEG n07742313/
+mv val/ILSVRC2012_val_00046375.JPEG n07875152/
+mv val/ILSVRC2012_val_00046376.JPEG n03075370/
+mv val/ILSVRC2012_val_00046377.JPEG n02799071/
+mv val/ILSVRC2012_val_00046378.JPEG n03133878/
+mv val/ILSVRC2012_val_00046379.JPEG n06596364/
+mv val/ILSVRC2012_val_00046380.JPEG n01806143/
+mv val/ILSVRC2012_val_00046381.JPEG n03930313/
+mv val/ILSVRC2012_val_00046382.JPEG n03930313/
+mv val/ILSVRC2012_val_00046383.JPEG n02730930/
+mv val/ILSVRC2012_val_00046384.JPEG n01773797/
+mv val/ILSVRC2012_val_00046385.JPEG n03902125/
+mv val/ILSVRC2012_val_00046386.JPEG n03721384/
+mv val/ILSVRC2012_val_00046387.JPEG n02951358/
+mv val/ILSVRC2012_val_00046388.JPEG n02119022/
+mv val/ILSVRC2012_val_00046389.JPEG n01744401/
+mv val/ILSVRC2012_val_00046390.JPEG n02112706/
+mv val/ILSVRC2012_val_00046391.JPEG n02396427/
+mv val/ILSVRC2012_val_00046392.JPEG n03633091/
+mv val/ILSVRC2012_val_00046393.JPEG n01514668/
+mv val/ILSVRC2012_val_00046394.JPEG n03791053/
+mv val/ILSVRC2012_val_00046395.JPEG n02395406/
+mv val/ILSVRC2012_val_00046396.JPEG n04370456/
+mv val/ILSVRC2012_val_00046397.JPEG n03657121/
+mv val/ILSVRC2012_val_00046398.JPEG n02096585/
+mv val/ILSVRC2012_val_00046399.JPEG n02107312/
+mv val/ILSVRC2012_val_00046400.JPEG n03970156/
+mv val/ILSVRC2012_val_00046401.JPEG n03126707/
+mv val/ILSVRC2012_val_00046402.JPEG n02105251/
+mv val/ILSVRC2012_val_00046403.JPEG n02442845/
+mv val/ILSVRC2012_val_00046404.JPEG n04461696/
+mv val/ILSVRC2012_val_00046405.JPEG n07715103/
+mv val/ILSVRC2012_val_00046406.JPEG n03873416/
+mv val/ILSVRC2012_val_00046407.JPEG n01677366/
+mv val/ILSVRC2012_val_00046408.JPEG n02012849/
+mv val/ILSVRC2012_val_00046409.JPEG n03527444/
+mv val/ILSVRC2012_val_00046410.JPEG n01798484/
+mv val/ILSVRC2012_val_00046411.JPEG n04562935/
+mv val/ILSVRC2012_val_00046412.JPEG n02279972/
+mv val/ILSVRC2012_val_00046413.JPEG n02423022/
+mv val/ILSVRC2012_val_00046414.JPEG n03992509/
+mv val/ILSVRC2012_val_00046415.JPEG n01592084/
+mv val/ILSVRC2012_val_00046416.JPEG n03788195/
+mv val/ILSVRC2012_val_00046417.JPEG n02259212/
+mv val/ILSVRC2012_val_00046418.JPEG n04462240/
+mv val/ILSVRC2012_val_00046419.JPEG n03929660/
+mv val/ILSVRC2012_val_00046420.JPEG n02090622/
+mv val/ILSVRC2012_val_00046421.JPEG n04254120/
+mv val/ILSVRC2012_val_00046422.JPEG n01592084/
+mv val/ILSVRC2012_val_00046423.JPEG n02109961/
+mv val/ILSVRC2012_val_00046424.JPEG n03769881/
+mv val/ILSVRC2012_val_00046425.JPEG n02268443/
+mv val/ILSVRC2012_val_00046426.JPEG n02909870/
+mv val/ILSVRC2012_val_00046427.JPEG n01641577/
+mv val/ILSVRC2012_val_00046428.JPEG n04550184/
+mv val/ILSVRC2012_val_00046429.JPEG n04507155/
+mv val/ILSVRC2012_val_00046430.JPEG n01630670/
+mv val/ILSVRC2012_val_00046431.JPEG n04152593/
+mv val/ILSVRC2012_val_00046432.JPEG n02090379/
+mv val/ILSVRC2012_val_00046433.JPEG n01983481/
+mv val/ILSVRC2012_val_00046434.JPEG n09421951/
+mv val/ILSVRC2012_val_00046435.JPEG n04517823/
+mv val/ILSVRC2012_val_00046436.JPEG n01744401/
+mv val/ILSVRC2012_val_00046437.JPEG n07745940/
+mv val/ILSVRC2012_val_00046438.JPEG n01843383/
+mv val/ILSVRC2012_val_00046439.JPEG n03476684/
+mv val/ILSVRC2012_val_00046440.JPEG n01735189/
+mv val/ILSVRC2012_val_00046441.JPEG n03930313/
+mv val/ILSVRC2012_val_00046442.JPEG n03916031/
+mv val/ILSVRC2012_val_00046443.JPEG n02093991/
+mv val/ILSVRC2012_val_00046444.JPEG n03207743/
+mv val/ILSVRC2012_val_00046445.JPEG n02787622/
+mv val/ILSVRC2012_val_00046446.JPEG n02106166/
+mv val/ILSVRC2012_val_00046447.JPEG n04398044/
+mv val/ILSVRC2012_val_00046448.JPEG n04428191/
+mv val/ILSVRC2012_val_00046449.JPEG n04209133/
+mv val/ILSVRC2012_val_00046450.JPEG n02085620/
+mv val/ILSVRC2012_val_00046451.JPEG n09835506/
+mv val/ILSVRC2012_val_00046452.JPEG n01871265/
+mv val/ILSVRC2012_val_00046453.JPEG n03459775/
+mv val/ILSVRC2012_val_00046454.JPEG n02089973/
+mv val/ILSVRC2012_val_00046455.JPEG n02643566/
+mv val/ILSVRC2012_val_00046456.JPEG n02481823/
+mv val/ILSVRC2012_val_00046457.JPEG n02123159/
+mv val/ILSVRC2012_val_00046458.JPEG n07875152/
+mv val/ILSVRC2012_val_00046459.JPEG n04557648/
+mv val/ILSVRC2012_val_00046460.JPEG n03196217/
+mv val/ILSVRC2012_val_00046461.JPEG n04033995/
+mv val/ILSVRC2012_val_00046462.JPEG n02037110/
+mv val/ILSVRC2012_val_00046463.JPEG n01955084/
+mv val/ILSVRC2012_val_00046464.JPEG n03089624/
+mv val/ILSVRC2012_val_00046465.JPEG n01751748/
+mv val/ILSVRC2012_val_00046466.JPEG n02099429/
+mv val/ILSVRC2012_val_00046467.JPEG n03325584/
+mv val/ILSVRC2012_val_00046468.JPEG n03445777/
+mv val/ILSVRC2012_val_00046469.JPEG n03902125/
+mv val/ILSVRC2012_val_00046470.JPEG n02116738/
+mv val/ILSVRC2012_val_00046471.JPEG n02799071/
+mv val/ILSVRC2012_val_00046472.JPEG n02843684/
+mv val/ILSVRC2012_val_00046473.JPEG n03109150/
+mv val/ILSVRC2012_val_00046474.JPEG n02869837/
+mv val/ILSVRC2012_val_00046475.JPEG n06794110/
+mv val/ILSVRC2012_val_00046476.JPEG n03908618/
+mv val/ILSVRC2012_val_00046477.JPEG n02105251/
+mv val/ILSVRC2012_val_00046478.JPEG n02790996/
+mv val/ILSVRC2012_val_00046479.JPEG n02966687/
+mv val/ILSVRC2012_val_00046480.JPEG n09256479/
+mv val/ILSVRC2012_val_00046481.JPEG n02939185/
+mv val/ILSVRC2012_val_00046482.JPEG n04417672/
+mv val/ILSVRC2012_val_00046483.JPEG n02113624/
+mv val/ILSVRC2012_val_00046484.JPEG n04266014/
+mv val/ILSVRC2012_val_00046485.JPEG n02174001/
+mv val/ILSVRC2012_val_00046486.JPEG n02483362/
+mv val/ILSVRC2012_val_00046487.JPEG n03127925/
+mv val/ILSVRC2012_val_00046488.JPEG n03717622/
+mv val/ILSVRC2012_val_00046489.JPEG n01744401/
+mv val/ILSVRC2012_val_00046490.JPEG n01739381/
+mv val/ILSVRC2012_val_00046491.JPEG n02606052/
+mv val/ILSVRC2012_val_00046492.JPEG n03290653/
+mv val/ILSVRC2012_val_00046493.JPEG n04330267/
+mv val/ILSVRC2012_val_00046494.JPEG n02486410/
+mv val/ILSVRC2012_val_00046495.JPEG n02457408/
+mv val/ILSVRC2012_val_00046496.JPEG n04355338/
+mv val/ILSVRC2012_val_00046497.JPEG n01498041/
+mv val/ILSVRC2012_val_00046498.JPEG n02134418/
+mv val/ILSVRC2012_val_00046499.JPEG n01440764/
+mv val/ILSVRC2012_val_00046500.JPEG n04552348/
+mv val/ILSVRC2012_val_00046501.JPEG n02319095/
+mv val/ILSVRC2012_val_00046502.JPEG n03781244/
+mv val/ILSVRC2012_val_00046503.JPEG n07730033/
+mv val/ILSVRC2012_val_00046504.JPEG n04525038/
+mv val/ILSVRC2012_val_00046505.JPEG n02018795/
+mv val/ILSVRC2012_val_00046506.JPEG n03494278/
+mv val/ILSVRC2012_val_00046507.JPEG n04589890/
+mv val/ILSVRC2012_val_00046508.JPEG n01829413/
+mv val/ILSVRC2012_val_00046509.JPEG n04456115/
+mv val/ILSVRC2012_val_00046510.JPEG n04118776/
+mv val/ILSVRC2012_val_00046511.JPEG n02687172/
+mv val/ILSVRC2012_val_00046512.JPEG n02992529/
+mv val/ILSVRC2012_val_00046513.JPEG n07932039/
+mv val/ILSVRC2012_val_00046514.JPEG n03075370/
+mv val/ILSVRC2012_val_00046515.JPEG n04557648/
+mv val/ILSVRC2012_val_00046516.JPEG n01728920/
+mv val/ILSVRC2012_val_00046517.JPEG n01688243/
+mv val/ILSVRC2012_val_00046518.JPEG n02443484/
+mv val/ILSVRC2012_val_00046519.JPEG n03843555/
+mv val/ILSVRC2012_val_00046520.JPEG n03786901/
+mv val/ILSVRC2012_val_00046521.JPEG n03016953/
+mv val/ILSVRC2012_val_00046522.JPEG n02536864/
+mv val/ILSVRC2012_val_00046523.JPEG n04125021/
+mv val/ILSVRC2012_val_00046524.JPEG n01514668/
+mv val/ILSVRC2012_val_00046525.JPEG n04461696/
+mv val/ILSVRC2012_val_00046526.JPEG n01983481/
+mv val/ILSVRC2012_val_00046527.JPEG n02493509/
+mv val/ILSVRC2012_val_00046528.JPEG n07614500/
+mv val/ILSVRC2012_val_00046529.JPEG n01776313/
+mv val/ILSVRC2012_val_00046530.JPEG n02091467/
+mv val/ILSVRC2012_val_00046531.JPEG n02106030/
+mv val/ILSVRC2012_val_00046532.JPEG n02814860/
+mv val/ILSVRC2012_val_00046533.JPEG n02002556/
+mv val/ILSVRC2012_val_00046534.JPEG n01818515/
+mv val/ILSVRC2012_val_00046535.JPEG n03160309/
+mv val/ILSVRC2012_val_00046536.JPEG n02092339/
+mv val/ILSVRC2012_val_00046537.JPEG n02013706/
+mv val/ILSVRC2012_val_00046538.JPEG n01753488/
+mv val/ILSVRC2012_val_00046539.JPEG n01739381/
+mv val/ILSVRC2012_val_00046540.JPEG n02981792/
+mv val/ILSVRC2012_val_00046541.JPEG n01753488/
+mv val/ILSVRC2012_val_00046542.JPEG n02704792/
+mv val/ILSVRC2012_val_00046543.JPEG n09332890/
+mv val/ILSVRC2012_val_00046544.JPEG n02317335/
+mv val/ILSVRC2012_val_00046545.JPEG n03255030/
+mv val/ILSVRC2012_val_00046546.JPEG n04201297/
+mv val/ILSVRC2012_val_00046547.JPEG n02093256/
+mv val/ILSVRC2012_val_00046548.JPEG n01688243/
+mv val/ILSVRC2012_val_00046549.JPEG n03792782/
+mv val/ILSVRC2012_val_00046550.JPEG n03028079/
+mv val/ILSVRC2012_val_00046551.JPEG n01944390/
+mv val/ILSVRC2012_val_00046552.JPEG n02107908/
+mv val/ILSVRC2012_val_00046553.JPEG n03803284/
+mv val/ILSVRC2012_val_00046554.JPEG n03775546/
+mv val/ILSVRC2012_val_00046555.JPEG n02128757/
+mv val/ILSVRC2012_val_00046556.JPEG n04542943/
+mv val/ILSVRC2012_val_00046557.JPEG n04560804/
+mv val/ILSVRC2012_val_00046558.JPEG n02514041/
+mv val/ILSVRC2012_val_00046559.JPEG n04204347/
+mv val/ILSVRC2012_val_00046560.JPEG n02916936/
+mv val/ILSVRC2012_val_00046561.JPEG n03344393/
+mv val/ILSVRC2012_val_00046562.JPEG n02364673/
+mv val/ILSVRC2012_val_00046563.JPEG n03942813/
+mv val/ILSVRC2012_val_00046564.JPEG n01614925/
+mv val/ILSVRC2012_val_00046565.JPEG n02494079/
+mv val/ILSVRC2012_val_00046566.JPEG n04542943/
+mv val/ILSVRC2012_val_00046567.JPEG n07742313/
+mv val/ILSVRC2012_val_00046568.JPEG n02490219/
+mv val/ILSVRC2012_val_00046569.JPEG n03843555/
+mv val/ILSVRC2012_val_00046570.JPEG n02281406/
+mv val/ILSVRC2012_val_00046571.JPEG n02493793/
+mv val/ILSVRC2012_val_00046572.JPEG n02123597/
+mv val/ILSVRC2012_val_00046573.JPEG n04613696/
+mv val/ILSVRC2012_val_00046574.JPEG n01796340/
+mv val/ILSVRC2012_val_00046575.JPEG n07753592/
+mv val/ILSVRC2012_val_00046576.JPEG n03384352/
+mv val/ILSVRC2012_val_00046577.JPEG n03916031/
+mv val/ILSVRC2012_val_00046578.JPEG n03908714/
+mv val/ILSVRC2012_val_00046579.JPEG n03992509/
+mv val/ILSVRC2012_val_00046580.JPEG n04201297/
+mv val/ILSVRC2012_val_00046581.JPEG n03637318/
+mv val/ILSVRC2012_val_00046582.JPEG n02977058/
+mv val/ILSVRC2012_val_00046583.JPEG n02091032/
+mv val/ILSVRC2012_val_00046584.JPEG n02494079/
+mv val/ILSVRC2012_val_00046585.JPEG n03673027/
+mv val/ILSVRC2012_val_00046586.JPEG n04548362/
+mv val/ILSVRC2012_val_00046587.JPEG n01950731/
+mv val/ILSVRC2012_val_00046588.JPEG n03721384/
+mv val/ILSVRC2012_val_00046589.JPEG n02999410/
+mv val/ILSVRC2012_val_00046590.JPEG n02483362/
+mv val/ILSVRC2012_val_00046591.JPEG n02111277/
+mv val/ILSVRC2012_val_00046592.JPEG n03709823/
+mv val/ILSVRC2012_val_00046593.JPEG n02087046/
+mv val/ILSVRC2012_val_00046594.JPEG n03929660/
+mv val/ILSVRC2012_val_00046595.JPEG n07930864/
+mv val/ILSVRC2012_val_00046596.JPEG n03954731/
+mv val/ILSVRC2012_val_00046597.JPEG n03063599/
+mv val/ILSVRC2012_val_00046598.JPEG n03692522/
+mv val/ILSVRC2012_val_00046599.JPEG n02018207/
+mv val/ILSVRC2012_val_00046600.JPEG n03788195/
+mv val/ILSVRC2012_val_00046601.JPEG n04040759/
+mv val/ILSVRC2012_val_00046602.JPEG n02011460/
+mv val/ILSVRC2012_val_00046603.JPEG n07871810/
+mv val/ILSVRC2012_val_00046604.JPEG n03690938/
+mv val/ILSVRC2012_val_00046605.JPEG n04486054/
+mv val/ILSVRC2012_val_00046606.JPEG n01986214/
+mv val/ILSVRC2012_val_00046607.JPEG n04591713/
+mv val/ILSVRC2012_val_00046608.JPEG n04127249/
+mv val/ILSVRC2012_val_00046609.JPEG n01807496/
+mv val/ILSVRC2012_val_00046610.JPEG n02095570/
+mv val/ILSVRC2012_val_00046611.JPEG n01981276/
+mv val/ILSVRC2012_val_00046612.JPEG n02128925/
+mv val/ILSVRC2012_val_00046613.JPEG n02992529/
+mv val/ILSVRC2012_val_00046614.JPEG n02815834/
+mv val/ILSVRC2012_val_00046615.JPEG n01698640/
+mv val/ILSVRC2012_val_00046616.JPEG n01632458/
+mv val/ILSVRC2012_val_00046617.JPEG n02492660/
+mv val/ILSVRC2012_val_00046618.JPEG n02319095/
+mv val/ILSVRC2012_val_00046619.JPEG n03938244/
+mv val/ILSVRC2012_val_00046620.JPEG n03876231/
+mv val/ILSVRC2012_val_00046621.JPEG n01798484/
+mv val/ILSVRC2012_val_00046622.JPEG n03666591/
+mv val/ILSVRC2012_val_00046623.JPEG n02110806/
+mv val/ILSVRC2012_val_00046624.JPEG n03782006/
+mv val/ILSVRC2012_val_00046625.JPEG n01943899/
+mv val/ILSVRC2012_val_00046626.JPEG n02643566/
+mv val/ILSVRC2012_val_00046627.JPEG n04120489/
+mv val/ILSVRC2012_val_00046628.JPEG n04399382/
+mv val/ILSVRC2012_val_00046629.JPEG n02085782/
+mv val/ILSVRC2012_val_00046630.JPEG n04389033/
+mv val/ILSVRC2012_val_00046631.JPEG n07714571/
+mv val/ILSVRC2012_val_00046632.JPEG n01614925/
+mv val/ILSVRC2012_val_00046633.JPEG n03494278/
+mv val/ILSVRC2012_val_00046634.JPEG n04141076/
+mv val/ILSVRC2012_val_00046635.JPEG n03388043/
+mv val/ILSVRC2012_val_00046636.JPEG n04118776/
+mv val/ILSVRC2012_val_00046637.JPEG n03291819/
+mv val/ILSVRC2012_val_00046638.JPEG n02389026/
+mv val/ILSVRC2012_val_00046639.JPEG n04209133/
+mv val/ILSVRC2012_val_00046640.JPEG n01685808/
+mv val/ILSVRC2012_val_00046641.JPEG n03769881/
+mv val/ILSVRC2012_val_00046642.JPEG n04074963/
+mv val/ILSVRC2012_val_00046643.JPEG n04458633/
+mv val/ILSVRC2012_val_00046644.JPEG n04532670/
+mv val/ILSVRC2012_val_00046645.JPEG n02484975/
+mv val/ILSVRC2012_val_00046646.JPEG n07579787/
+mv val/ILSVRC2012_val_00046647.JPEG n02058221/
+mv val/ILSVRC2012_val_00046648.JPEG n03000134/
+mv val/ILSVRC2012_val_00046649.JPEG n01704323/
+mv val/ILSVRC2012_val_00046650.JPEG n04044716/
+mv val/ILSVRC2012_val_00046651.JPEG n03000684/
+mv val/ILSVRC2012_val_00046652.JPEG n03179701/
+mv val/ILSVRC2012_val_00046653.JPEG n07716906/
+mv val/ILSVRC2012_val_00046654.JPEG n01518878/
+mv val/ILSVRC2012_val_00046655.JPEG n02497673/
+mv val/ILSVRC2012_val_00046656.JPEG n03445924/
+mv val/ILSVRC2012_val_00046657.JPEG n02093647/
+mv val/ILSVRC2012_val_00046658.JPEG n02410509/
+mv val/ILSVRC2012_val_00046659.JPEG n03026506/
+mv val/ILSVRC2012_val_00046660.JPEG n04153751/
+mv val/ILSVRC2012_val_00046661.JPEG n04141076/
+mv val/ILSVRC2012_val_00046662.JPEG n03532672/
+mv val/ILSVRC2012_val_00046663.JPEG n04201297/
+mv val/ILSVRC2012_val_00046664.JPEG n07836838/
+mv val/ILSVRC2012_val_00046665.JPEG n03188531/
+mv val/ILSVRC2012_val_00046666.JPEG n02486410/
+mv val/ILSVRC2012_val_00046667.JPEG n04275548/
+mv val/ILSVRC2012_val_00046668.JPEG n02133161/
+mv val/ILSVRC2012_val_00046669.JPEG n03394916/
+mv val/ILSVRC2012_val_00046670.JPEG n02098105/
+mv val/ILSVRC2012_val_00046671.JPEG n04376876/
+mv val/ILSVRC2012_val_00046672.JPEG n02106382/
+mv val/ILSVRC2012_val_00046673.JPEG n03483316/
+mv val/ILSVRC2012_val_00046674.JPEG n02490219/
+mv val/ILSVRC2012_val_00046675.JPEG n03032252/
+mv val/ILSVRC2012_val_00046676.JPEG n03770439/
+mv val/ILSVRC2012_val_00046677.JPEG n02025239/
+mv val/ILSVRC2012_val_00046678.JPEG n03840681/
+mv val/ILSVRC2012_val_00046679.JPEG n03496892/
+mv val/ILSVRC2012_val_00046680.JPEG n03633091/
+mv val/ILSVRC2012_val_00046681.JPEG n02837789/
+mv val/ILSVRC2012_val_00046682.JPEG n03126707/
+mv val/ILSVRC2012_val_00046683.JPEG n02104365/
+mv val/ILSVRC2012_val_00046684.JPEG n04584207/
+mv val/ILSVRC2012_val_00046685.JPEG n04347754/
+mv val/ILSVRC2012_val_00046686.JPEG n04243546/
+mv val/ILSVRC2012_val_00046687.JPEG n02110185/
+mv val/ILSVRC2012_val_00046688.JPEG n02865351/
+mv val/ILSVRC2012_val_00046689.JPEG n02167151/
+mv val/ILSVRC2012_val_00046690.JPEG n02871525/
+mv val/ILSVRC2012_val_00046691.JPEG n02088466/
+mv val/ILSVRC2012_val_00046692.JPEG n02138441/
+mv val/ILSVRC2012_val_00046693.JPEG n02804610/
+mv val/ILSVRC2012_val_00046694.JPEG n03935335/
+mv val/ILSVRC2012_val_00046695.JPEG n02782093/
+mv val/ILSVRC2012_val_00046696.JPEG n01744401/
+mv val/ILSVRC2012_val_00046697.JPEG n09472597/
+mv val/ILSVRC2012_val_00046698.JPEG n03445924/
+mv val/ILSVRC2012_val_00046699.JPEG n01737021/
+mv val/ILSVRC2012_val_00046700.JPEG n02102480/
+mv val/ILSVRC2012_val_00046701.JPEG n02086646/
+mv val/ILSVRC2012_val_00046702.JPEG n02137549/
+mv val/ILSVRC2012_val_00046703.JPEG n02481823/
+mv val/ILSVRC2012_val_00046704.JPEG n02107574/
+mv val/ILSVRC2012_val_00046705.JPEG n02096437/
+mv val/ILSVRC2012_val_00046706.JPEG n02701002/
+mv val/ILSVRC2012_val_00046707.JPEG n03272562/
+mv val/ILSVRC2012_val_00046708.JPEG n02978881/
+mv val/ILSVRC2012_val_00046709.JPEG n01737021/
+mv val/ILSVRC2012_val_00046710.JPEG n01824575/
+mv val/ILSVRC2012_val_00046711.JPEG n03887697/
+mv val/ILSVRC2012_val_00046712.JPEG n02097298/
+mv val/ILSVRC2012_val_00046713.JPEG n03692522/
+mv val/ILSVRC2012_val_00046714.JPEG n02437312/
+mv val/ILSVRC2012_val_00046715.JPEG n03814639/
+mv val/ILSVRC2012_val_00046716.JPEG n02236044/
+mv val/ILSVRC2012_val_00046717.JPEG n02094433/
+mv val/ILSVRC2012_val_00046718.JPEG n07742313/
+mv val/ILSVRC2012_val_00046719.JPEG n04398044/
+mv val/ILSVRC2012_val_00046720.JPEG n03255030/
+mv val/ILSVRC2012_val_00046721.JPEG n04258138/
+mv val/ILSVRC2012_val_00046722.JPEG n02422106/
+mv val/ILSVRC2012_val_00046723.JPEG n06785654/
+mv val/ILSVRC2012_val_00046724.JPEG n02319095/
+mv val/ILSVRC2012_val_00046725.JPEG n03692522/
+mv val/ILSVRC2012_val_00046726.JPEG n04350905/
+mv val/ILSVRC2012_val_00046727.JPEG n04252077/
+mv val/ILSVRC2012_val_00046728.JPEG n03804744/
+mv val/ILSVRC2012_val_00046729.JPEG n03131574/
+mv val/ILSVRC2012_val_00046730.JPEG n02107312/
+mv val/ILSVRC2012_val_00046731.JPEG n07583066/
+mv val/ILSVRC2012_val_00046732.JPEG n02006656/
+mv val/ILSVRC2012_val_00046733.JPEG n01608432/
+mv val/ILSVRC2012_val_00046734.JPEG n04428191/
+mv val/ILSVRC2012_val_00046735.JPEG n04346328/
+mv val/ILSVRC2012_val_00046736.JPEG n02493793/
+mv val/ILSVRC2012_val_00046737.JPEG n04040759/
+mv val/ILSVRC2012_val_00046738.JPEG n03733281/
+mv val/ILSVRC2012_val_00046739.JPEG n02093754/
+mv val/ILSVRC2012_val_00046740.JPEG n01677366/
+mv val/ILSVRC2012_val_00046741.JPEG n02481823/
+mv val/ILSVRC2012_val_00046742.JPEG n11939491/
+mv val/ILSVRC2012_val_00046743.JPEG n13044778/
+mv val/ILSVRC2012_val_00046744.JPEG n04070727/
+mv val/ILSVRC2012_val_00046745.JPEG n02500267/
+mv val/ILSVRC2012_val_00046746.JPEG n03347037/
+mv val/ILSVRC2012_val_00046747.JPEG n03942813/
+mv val/ILSVRC2012_val_00046748.JPEG n03218198/
+mv val/ILSVRC2012_val_00046749.JPEG n02747177/
+mv val/ILSVRC2012_val_00046750.JPEG n04286575/
+mv val/ILSVRC2012_val_00046751.JPEG n01530575/
+mv val/ILSVRC2012_val_00046752.JPEG n02437312/
+mv val/ILSVRC2012_val_00046753.JPEG n02090379/
+mv val/ILSVRC2012_val_00046754.JPEG n04447861/
+mv val/ILSVRC2012_val_00046755.JPEG n01843383/
+mv val/ILSVRC2012_val_00046756.JPEG n01629819/
+mv val/ILSVRC2012_val_00046757.JPEG n01871265/
+mv val/ILSVRC2012_val_00046758.JPEG n02077923/
+mv val/ILSVRC2012_val_00046759.JPEG n02105162/
+mv val/ILSVRC2012_val_00046760.JPEG n03873416/
+mv val/ILSVRC2012_val_00046761.JPEG n02106662/
+mv val/ILSVRC2012_val_00046762.JPEG n02096437/
+mv val/ILSVRC2012_val_00046763.JPEG n02132136/
+mv val/ILSVRC2012_val_00046764.JPEG n03000684/
+mv val/ILSVRC2012_val_00046765.JPEG n01917289/
+mv val/ILSVRC2012_val_00046766.JPEG n02777292/
+mv val/ILSVRC2012_val_00046767.JPEG n02077923/
+mv val/ILSVRC2012_val_00046768.JPEG n02110063/
+mv val/ILSVRC2012_val_00046769.JPEG n02027492/
+mv val/ILSVRC2012_val_00046770.JPEG n02124075/
+mv val/ILSVRC2012_val_00046771.JPEG n04467665/
+mv val/ILSVRC2012_val_00046772.JPEG n04192698/
+mv val/ILSVRC2012_val_00046773.JPEG n04525305/
+mv val/ILSVRC2012_val_00046774.JPEG n12057211/
+mv val/ILSVRC2012_val_00046775.JPEG n02894605/
+mv val/ILSVRC2012_val_00046776.JPEG n02108551/
+mv val/ILSVRC2012_val_00046777.JPEG n04392985/
+mv val/ILSVRC2012_val_00046778.JPEG n01742172/
+mv val/ILSVRC2012_val_00046779.JPEG n02825657/
+mv val/ILSVRC2012_val_00046780.JPEG n04336792/
+mv val/ILSVRC2012_val_00046781.JPEG n04265275/
+mv val/ILSVRC2012_val_00046782.JPEG n02172182/
+mv val/ILSVRC2012_val_00046783.JPEG n02483362/
+mv val/ILSVRC2012_val_00046784.JPEG n02168699/
+mv val/ILSVRC2012_val_00046785.JPEG n02088094/
+mv val/ILSVRC2012_val_00046786.JPEG n02128925/
+mv val/ILSVRC2012_val_00046787.JPEG n03764736/
+mv val/ILSVRC2012_val_00046788.JPEG n02113712/
+mv val/ILSVRC2012_val_00046789.JPEG n03197337/
+mv val/ILSVRC2012_val_00046790.JPEG n03393912/
+mv val/ILSVRC2012_val_00046791.JPEG n03804744/
+mv val/ILSVRC2012_val_00046792.JPEG n07697313/
+mv val/ILSVRC2012_val_00046793.JPEG n03770679/
+mv val/ILSVRC2012_val_00046794.JPEG n02795169/
+mv val/ILSVRC2012_val_00046795.JPEG n02104365/
+mv val/ILSVRC2012_val_00046796.JPEG n10148035/
+mv val/ILSVRC2012_val_00046797.JPEG n01534433/
+mv val/ILSVRC2012_val_00046798.JPEG n03089624/
+mv val/ILSVRC2012_val_00046799.JPEG n10565667/
+mv val/ILSVRC2012_val_00046800.JPEG n04536866/
+mv val/ILSVRC2012_val_00046801.JPEG n02259212/
+mv val/ILSVRC2012_val_00046802.JPEG n01828970/
+mv val/ILSVRC2012_val_00046803.JPEG n01667114/
+mv val/ILSVRC2012_val_00046804.JPEG n02110958/
+mv val/ILSVRC2012_val_00046805.JPEG n03841143/
+mv val/ILSVRC2012_val_00046806.JPEG n03325584/
+mv val/ILSVRC2012_val_00046807.JPEG n03450230/
+mv val/ILSVRC2012_val_00046808.JPEG n04423845/
+mv val/ILSVRC2012_val_00046809.JPEG n04149813/
+mv val/ILSVRC2012_val_00046810.JPEG n02802426/
+mv val/ILSVRC2012_val_00046811.JPEG n03876231/
+mv val/ILSVRC2012_val_00046812.JPEG n03868242/
+mv val/ILSVRC2012_val_00046813.JPEG n07614500/
+mv val/ILSVRC2012_val_00046814.JPEG n04356056/
+mv val/ILSVRC2012_val_00046815.JPEG n02128925/
+mv val/ILSVRC2012_val_00046816.JPEG n03379051/
+mv val/ILSVRC2012_val_00046817.JPEG n02099712/
+mv val/ILSVRC2012_val_00046818.JPEG n02870880/
+mv val/ILSVRC2012_val_00046819.JPEG n02085936/
+mv val/ILSVRC2012_val_00046820.JPEG n13044778/
+mv val/ILSVRC2012_val_00046821.JPEG n03388043/
+mv val/ILSVRC2012_val_00046822.JPEG n02113712/
+mv val/ILSVRC2012_val_00046823.JPEG n02113624/
+mv val/ILSVRC2012_val_00046824.JPEG n03141823/
+mv val/ILSVRC2012_val_00046825.JPEG n02110627/
+mv val/ILSVRC2012_val_00046826.JPEG n03394916/
+mv val/ILSVRC2012_val_00046827.JPEG n04548362/
+mv val/ILSVRC2012_val_00046828.JPEG n02927161/
+mv val/ILSVRC2012_val_00046829.JPEG n01914609/
+mv val/ILSVRC2012_val_00046830.JPEG n04275548/
+mv val/ILSVRC2012_val_00046831.JPEG n03271574/
+mv val/ILSVRC2012_val_00046832.JPEG n03527444/
+mv val/ILSVRC2012_val_00046833.JPEG n01530575/
+mv val/ILSVRC2012_val_00046834.JPEG n03775546/
+mv val/ILSVRC2012_val_00046835.JPEG n02965783/
+mv val/ILSVRC2012_val_00046836.JPEG n02105505/
+mv val/ILSVRC2012_val_00046837.JPEG n03982430/
+mv val/ILSVRC2012_val_00046838.JPEG n04258138/
+mv val/ILSVRC2012_val_00046839.JPEG n03201208/
+mv val/ILSVRC2012_val_00046840.JPEG n07684084/
+mv val/ILSVRC2012_val_00046841.JPEG n02437616/
+mv val/ILSVRC2012_val_00046842.JPEG n03388043/
+mv val/ILSVRC2012_val_00046843.JPEG n04389033/
+mv val/ILSVRC2012_val_00046844.JPEG n02841315/
+mv val/ILSVRC2012_val_00046845.JPEG n03250847/
+mv val/ILSVRC2012_val_00046846.JPEG n02480495/
+mv val/ILSVRC2012_val_00046847.JPEG n01749939/
+mv val/ILSVRC2012_val_00046848.JPEG n12998815/
+mv val/ILSVRC2012_val_00046849.JPEG n02114712/
+mv val/ILSVRC2012_val_00046850.JPEG n02056570/
+mv val/ILSVRC2012_val_00046851.JPEG n03602883/
+mv val/ILSVRC2012_val_00046852.JPEG n02281406/
+mv val/ILSVRC2012_val_00046853.JPEG n02086079/
+mv val/ILSVRC2012_val_00046854.JPEG n03769881/
+mv val/ILSVRC2012_val_00046855.JPEG n03791053/
+mv val/ILSVRC2012_val_00046856.JPEG n02165456/
+mv val/ILSVRC2012_val_00046857.JPEG n02747177/
+mv val/ILSVRC2012_val_00046858.JPEG n13040303/
+mv val/ILSVRC2012_val_00046859.JPEG n04023962/
+mv val/ILSVRC2012_val_00046860.JPEG n02948072/
+mv val/ILSVRC2012_val_00046861.JPEG n04243546/
+mv val/ILSVRC2012_val_00046862.JPEG n02690373/
+mv val/ILSVRC2012_val_00046863.JPEG n04442312/
+mv val/ILSVRC2012_val_00046864.JPEG n03837869/
+mv val/ILSVRC2012_val_00046865.JPEG n04417672/
+mv val/ILSVRC2012_val_00046866.JPEG n13054560/
+mv val/ILSVRC2012_val_00046867.JPEG n02106166/
+mv val/ILSVRC2012_val_00046868.JPEG n01776313/
+mv val/ILSVRC2012_val_00046869.JPEG n02667093/
+mv val/ILSVRC2012_val_00046870.JPEG n07565083/
+mv val/ILSVRC2012_val_00046871.JPEG n13133613/
+mv val/ILSVRC2012_val_00046872.JPEG n07730033/
+mv val/ILSVRC2012_val_00046873.JPEG n02488291/
+mv val/ILSVRC2012_val_00046874.JPEG n04423845/
+mv val/ILSVRC2012_val_00046875.JPEG n03623198/
+mv val/ILSVRC2012_val_00046876.JPEG n03977966/
+mv val/ILSVRC2012_val_00046877.JPEG n03866082/
+mv val/ILSVRC2012_val_00046878.JPEG n02100735/
+mv val/ILSVRC2012_val_00046879.JPEG n02834397/
+mv val/ILSVRC2012_val_00046880.JPEG n04461696/
+mv val/ILSVRC2012_val_00046881.JPEG n02089078/
+mv val/ILSVRC2012_val_00046882.JPEG n01694178/
+mv val/ILSVRC2012_val_00046883.JPEG n01944390/
+mv val/ILSVRC2012_val_00046884.JPEG n03706229/
+mv val/ILSVRC2012_val_00046885.JPEG n03223299/
+mv val/ILSVRC2012_val_00046886.JPEG n03980874/
+mv val/ILSVRC2012_val_00046887.JPEG n03991062/
+mv val/ILSVRC2012_val_00046888.JPEG n04004767/
+mv val/ILSVRC2012_val_00046889.JPEG n04201297/
+mv val/ILSVRC2012_val_00046890.JPEG n03761084/
+mv val/ILSVRC2012_val_00046891.JPEG n03443371/
+mv val/ILSVRC2012_val_00046892.JPEG n02033041/
+mv val/ILSVRC2012_val_00046893.JPEG n02138441/
+mv val/ILSVRC2012_val_00046894.JPEG n01924916/
+mv val/ILSVRC2012_val_00046895.JPEG n04133789/
+mv val/ILSVRC2012_val_00046896.JPEG n06359193/
+mv val/ILSVRC2012_val_00046897.JPEG n02091032/
+mv val/ILSVRC2012_val_00046898.JPEG n02981792/
+mv val/ILSVRC2012_val_00046899.JPEG n03180011/
+mv val/ILSVRC2012_val_00046900.JPEG n04522168/
+mv val/ILSVRC2012_val_00046901.JPEG n04317175/
+mv val/ILSVRC2012_val_00046902.JPEG n02106662/
+mv val/ILSVRC2012_val_00046903.JPEG n01847000/
+mv val/ILSVRC2012_val_00046904.JPEG n12768682/
+mv val/ILSVRC2012_val_00046905.JPEG n03496892/
+mv val/ILSVRC2012_val_00046906.JPEG n02892767/
+mv val/ILSVRC2012_val_00046907.JPEG n07684084/
+mv val/ILSVRC2012_val_00046908.JPEG n01877812/
+mv val/ILSVRC2012_val_00046909.JPEG n03345487/
+mv val/ILSVRC2012_val_00046910.JPEG n03495258/
+mv val/ILSVRC2012_val_00046911.JPEG n03661043/
+mv val/ILSVRC2012_val_00046912.JPEG n01990800/
+mv val/ILSVRC2012_val_00046913.JPEG n03417042/
+mv val/ILSVRC2012_val_00046914.JPEG n04330267/
+mv val/ILSVRC2012_val_00046915.JPEG n01443537/
+mv val/ILSVRC2012_val_00046916.JPEG n02397096/
+mv val/ILSVRC2012_val_00046917.JPEG n01582220/
+mv val/ILSVRC2012_val_00046918.JPEG n01910747/
+mv val/ILSVRC2012_val_00046919.JPEG n02025239/
+mv val/ILSVRC2012_val_00046920.JPEG n03724870/
+mv val/ILSVRC2012_val_00046921.JPEG n02787622/
+mv val/ILSVRC2012_val_00046922.JPEG n02892201/
+mv val/ILSVRC2012_val_00046923.JPEG n02086079/
+mv val/ILSVRC2012_val_00046924.JPEG n04417672/
+mv val/ILSVRC2012_val_00046925.JPEG n04550184/
+mv val/ILSVRC2012_val_00046926.JPEG n04525305/
+mv val/ILSVRC2012_val_00046927.JPEG n03877845/
+mv val/ILSVRC2012_val_00046928.JPEG n07718472/
+mv val/ILSVRC2012_val_00046929.JPEG n04266014/
+mv val/ILSVRC2012_val_00046930.JPEG n02396427/
+mv val/ILSVRC2012_val_00046931.JPEG n01773797/
+mv val/ILSVRC2012_val_00046932.JPEG n02009912/
+mv val/ILSVRC2012_val_00046933.JPEG n01795545/
+mv val/ILSVRC2012_val_00046934.JPEG n02120079/
+mv val/ILSVRC2012_val_00046935.JPEG n02105505/
+mv val/ILSVRC2012_val_00046936.JPEG n04252077/
+mv val/ILSVRC2012_val_00046937.JPEG n07734744/
+mv val/ILSVRC2012_val_00046938.JPEG n02793495/
+mv val/ILSVRC2012_val_00046939.JPEG n04372370/
+mv val/ILSVRC2012_val_00046940.JPEG n02667093/
+mv val/ILSVRC2012_val_00046941.JPEG n01629819/
+mv val/ILSVRC2012_val_00046942.JPEG n02493793/
+mv val/ILSVRC2012_val_00046943.JPEG n02640242/
+mv val/ILSVRC2012_val_00046944.JPEG n01748264/
+mv val/ILSVRC2012_val_00046945.JPEG n02134418/
+mv val/ILSVRC2012_val_00046946.JPEG n04335435/
+mv val/ILSVRC2012_val_00046947.JPEG n02966687/
+mv val/ILSVRC2012_val_00046948.JPEG n01608432/
+mv val/ILSVRC2012_val_00046949.JPEG n03325584/
+mv val/ILSVRC2012_val_00046950.JPEG n02013706/
+mv val/ILSVRC2012_val_00046951.JPEG n02364673/
+mv val/ILSVRC2012_val_00046952.JPEG n02791124/
+mv val/ILSVRC2012_val_00046953.JPEG n02979186/
+mv val/ILSVRC2012_val_00046954.JPEG n04493381/
+mv val/ILSVRC2012_val_00046955.JPEG n03045698/
+mv val/ILSVRC2012_val_00046956.JPEG n03032252/
+mv val/ILSVRC2012_val_00046957.JPEG n02092339/
+mv val/ILSVRC2012_val_00046958.JPEG n01806143/
+mv val/ILSVRC2012_val_00046959.JPEG n03535780/
+mv val/ILSVRC2012_val_00046960.JPEG n02319095/
+mv val/ILSVRC2012_val_00046961.JPEG n04562935/
+mv val/ILSVRC2012_val_00046962.JPEG n01873310/
+mv val/ILSVRC2012_val_00046963.JPEG n02279972/
+mv val/ILSVRC2012_val_00046964.JPEG n02124075/
+mv val/ILSVRC2012_val_00046965.JPEG n03482405/
+mv val/ILSVRC2012_val_00046966.JPEG n02056570/
+mv val/ILSVRC2012_val_00046967.JPEG n02823750/
+mv val/ILSVRC2012_val_00046968.JPEG n02823428/
+mv val/ILSVRC2012_val_00046969.JPEG n01443537/
+mv val/ILSVRC2012_val_00046970.JPEG n02860847/
+mv val/ILSVRC2012_val_00046971.JPEG n02690373/
+mv val/ILSVRC2012_val_00046972.JPEG n03825788/
+mv val/ILSVRC2012_val_00046973.JPEG n04461696/
+mv val/ILSVRC2012_val_00046974.JPEG n02106030/
+mv val/ILSVRC2012_val_00046975.JPEG n01983481/
+mv val/ILSVRC2012_val_00046976.JPEG n01632777/
+mv val/ILSVRC2012_val_00046977.JPEG n04562935/
+mv val/ILSVRC2012_val_00046978.JPEG n01847000/
+mv val/ILSVRC2012_val_00046979.JPEG n03661043/
+mv val/ILSVRC2012_val_00046980.JPEG n03272010/
+mv val/ILSVRC2012_val_00046981.JPEG n02113978/
+mv val/ILSVRC2012_val_00046982.JPEG n04550184/
+mv val/ILSVRC2012_val_00046983.JPEG n02699494/
+mv val/ILSVRC2012_val_00046984.JPEG n04505470/
+mv val/ILSVRC2012_val_00046985.JPEG n01629819/
+mv val/ILSVRC2012_val_00046986.JPEG n03944341/
+mv val/ILSVRC2012_val_00046987.JPEG n03792782/
+mv val/ILSVRC2012_val_00046988.JPEG n02071294/
+mv val/ILSVRC2012_val_00046989.JPEG n02114367/
+mv val/ILSVRC2012_val_00046990.JPEG n04536866/
+mv val/ILSVRC2012_val_00046991.JPEG n02910353/
+mv val/ILSVRC2012_val_00046992.JPEG n03355925/
+mv val/ILSVRC2012_val_00046993.JPEG n03908618/
+mv val/ILSVRC2012_val_00046994.JPEG n02786058/
+mv val/ILSVRC2012_val_00046995.JPEG n02097047/
+mv val/ILSVRC2012_val_00046996.JPEG n02088094/
+mv val/ILSVRC2012_val_00046997.JPEG n02089867/
+mv val/ILSVRC2012_val_00046998.JPEG n04356056/
+mv val/ILSVRC2012_val_00046999.JPEG n02095570/
+mv val/ILSVRC2012_val_00047000.JPEG n01756291/
+mv val/ILSVRC2012_val_00047001.JPEG n02441942/
+mv val/ILSVRC2012_val_00047002.JPEG n04208210/
+mv val/ILSVRC2012_val_00047003.JPEG n07693725/
+mv val/ILSVRC2012_val_00047004.JPEG n02088094/
+mv val/ILSVRC2012_val_00047005.JPEG n06596364/
+mv val/ILSVRC2012_val_00047006.JPEG n02992529/
+mv val/ILSVRC2012_val_00047007.JPEG n04081281/
+mv val/ILSVRC2012_val_00047008.JPEG n03467068/
+mv val/ILSVRC2012_val_00047009.JPEG n01847000/
+mv val/ILSVRC2012_val_00047010.JPEG n01693334/
+mv val/ILSVRC2012_val_00047011.JPEG n03680355/
+mv val/ILSVRC2012_val_00047012.JPEG n04501370/
+mv val/ILSVRC2012_val_00047013.JPEG n03763968/
+mv val/ILSVRC2012_val_00047014.JPEG n01917289/
+mv val/ILSVRC2012_val_00047015.JPEG n02669723/
+mv val/ILSVRC2012_val_00047016.JPEG n01924916/
+mv val/ILSVRC2012_val_00047017.JPEG n02110958/
+mv val/ILSVRC2012_val_00047018.JPEG n04041544/
+mv val/ILSVRC2012_val_00047019.JPEG n02110806/
+mv val/ILSVRC2012_val_00047020.JPEG n02134084/
+mv val/ILSVRC2012_val_00047021.JPEG n02130308/
+mv val/ILSVRC2012_val_00047022.JPEG n02443484/
+mv val/ILSVRC2012_val_00047023.JPEG n02843684/
+mv val/ILSVRC2012_val_00047024.JPEG n01968897/
+mv val/ILSVRC2012_val_00047025.JPEG n01855672/
+mv val/ILSVRC2012_val_00047026.JPEG n02113799/
+mv val/ILSVRC2012_val_00047027.JPEG n03584829/
+mv val/ILSVRC2012_val_00047028.JPEG n12768682/
+mv val/ILSVRC2012_val_00047029.JPEG n01531178/
+mv val/ILSVRC2012_val_00047030.JPEG n03197337/
+mv val/ILSVRC2012_val_00047031.JPEG n01784675/
+mv val/ILSVRC2012_val_00047032.JPEG n03075370/
+mv val/ILSVRC2012_val_00047033.JPEG n04252077/
+mv val/ILSVRC2012_val_00047034.JPEG n03935335/
+mv val/ILSVRC2012_val_00047035.JPEG n02999410/
+mv val/ILSVRC2012_val_00047036.JPEG n07716358/
+mv val/ILSVRC2012_val_00047037.JPEG n04238763/
+mv val/ILSVRC2012_val_00047038.JPEG n07753275/
+mv val/ILSVRC2012_val_00047039.JPEG n02279972/
+mv val/ILSVRC2012_val_00047040.JPEG n02666196/
+mv val/ILSVRC2012_val_00047041.JPEG n02007558/
+mv val/ILSVRC2012_val_00047042.JPEG n02105251/
+mv val/ILSVRC2012_val_00047043.JPEG n02226429/
+mv val/ILSVRC2012_val_00047044.JPEG n01751748/
+mv val/ILSVRC2012_val_00047045.JPEG n02127052/
+mv val/ILSVRC2012_val_00047046.JPEG n04579145/
+mv val/ILSVRC2012_val_00047047.JPEG n02051845/
+mv val/ILSVRC2012_val_00047048.JPEG n02445715/
+mv val/ILSVRC2012_val_00047049.JPEG n02102177/
+mv val/ILSVRC2012_val_00047050.JPEG n03759954/
+mv val/ILSVRC2012_val_00047051.JPEG n03179701/
+mv val/ILSVRC2012_val_00047052.JPEG n02007558/
+mv val/ILSVRC2012_val_00047053.JPEG n03649909/
+mv val/ILSVRC2012_val_00047054.JPEG n03992509/
+mv val/ILSVRC2012_val_00047055.JPEG n03447721/
+mv val/ILSVRC2012_val_00047056.JPEG n02916936/
+mv val/ILSVRC2012_val_00047057.JPEG n03196217/
+mv val/ILSVRC2012_val_00047058.JPEG n01883070/
+mv val/ILSVRC2012_val_00047059.JPEG n01983481/
+mv val/ILSVRC2012_val_00047060.JPEG n03000684/
+mv val/ILSVRC2012_val_00047061.JPEG n01756291/
+mv val/ILSVRC2012_val_00047062.JPEG n02111277/
+mv val/ILSVRC2012_val_00047063.JPEG n03857828/
+mv val/ILSVRC2012_val_00047064.JPEG n04479046/
+mv val/ILSVRC2012_val_00047065.JPEG n02177972/
+mv val/ILSVRC2012_val_00047066.JPEG n04067472/
+mv val/ILSVRC2012_val_00047067.JPEG n03444034/
+mv val/ILSVRC2012_val_00047068.JPEG n03854065/
+mv val/ILSVRC2012_val_00047069.JPEG n03720891/
+mv val/ILSVRC2012_val_00047070.JPEG n04208210/
+mv val/ILSVRC2012_val_00047071.JPEG n01740131/
+mv val/ILSVRC2012_val_00047072.JPEG n04423845/
+mv val/ILSVRC2012_val_00047073.JPEG n01855672/
+mv val/ILSVRC2012_val_00047074.JPEG n03388549/
+mv val/ILSVRC2012_val_00047075.JPEG n02206856/
+mv val/ILSVRC2012_val_00047076.JPEG n04606251/
+mv val/ILSVRC2012_val_00047077.JPEG n03887697/
+mv val/ILSVRC2012_val_00047078.JPEG n02865351/
+mv val/ILSVRC2012_val_00047079.JPEG n04579145/
+mv val/ILSVRC2012_val_00047080.JPEG n01496331/
+mv val/ILSVRC2012_val_00047081.JPEG n02804414/
+mv val/ILSVRC2012_val_00047082.JPEG n02787622/
+mv val/ILSVRC2012_val_00047083.JPEG n04004767/
+mv val/ILSVRC2012_val_00047084.JPEG n02097047/
+mv val/ILSVRC2012_val_00047085.JPEG n02490219/
+mv val/ILSVRC2012_val_00047086.JPEG n03529860/
+mv val/ILSVRC2012_val_00047087.JPEG n03680355/
+mv val/ILSVRC2012_val_00047088.JPEG n03942813/
+mv val/ILSVRC2012_val_00047089.JPEG n01632458/
+mv val/ILSVRC2012_val_00047090.JPEG n03733281/
+mv val/ILSVRC2012_val_00047091.JPEG n03584829/
+mv val/ILSVRC2012_val_00047092.JPEG n02797295/
+mv val/ILSVRC2012_val_00047093.JPEG n02966687/
+mv val/ILSVRC2012_val_00047094.JPEG n01824575/
+mv val/ILSVRC2012_val_00047095.JPEG n07831146/
+mv val/ILSVRC2012_val_00047096.JPEG n04366367/
+mv val/ILSVRC2012_val_00047097.JPEG n03666591/
+mv val/ILSVRC2012_val_00047098.JPEG n03788195/
+mv val/ILSVRC2012_val_00047099.JPEG n02966193/
+mv val/ILSVRC2012_val_00047100.JPEG n03042490/
+mv val/ILSVRC2012_val_00047101.JPEG n06874185/
+mv val/ILSVRC2012_val_00047102.JPEG n03345487/
+mv val/ILSVRC2012_val_00047103.JPEG n02123597/
+mv val/ILSVRC2012_val_00047104.JPEG n02895154/
+mv val/ILSVRC2012_val_00047105.JPEG n01664065/
+mv val/ILSVRC2012_val_00047106.JPEG n01819313/
+mv val/ILSVRC2012_val_00047107.JPEG n12985857/
+mv val/ILSVRC2012_val_00047108.JPEG n01855672/
+mv val/ILSVRC2012_val_00047109.JPEG n02095314/
+mv val/ILSVRC2012_val_00047110.JPEG n02102973/
+mv val/ILSVRC2012_val_00047111.JPEG n02966193/
+mv val/ILSVRC2012_val_00047112.JPEG n02115913/
+mv val/ILSVRC2012_val_00047113.JPEG n03590841/
+mv val/ILSVRC2012_val_00047114.JPEG n02093991/
+mv val/ILSVRC2012_val_00047115.JPEG n02169497/
+mv val/ILSVRC2012_val_00047116.JPEG n02814860/
+mv val/ILSVRC2012_val_00047117.JPEG n02089078/
+mv val/ILSVRC2012_val_00047118.JPEG n02138441/
+mv val/ILSVRC2012_val_00047119.JPEG n02113712/
+mv val/ILSVRC2012_val_00047120.JPEG n02883205/
+mv val/ILSVRC2012_val_00047121.JPEG n01601694/
+mv val/ILSVRC2012_val_00047122.JPEG n01774384/
+mv val/ILSVRC2012_val_00047123.JPEG n04111531/
+mv val/ILSVRC2012_val_00047124.JPEG n03000134/
+mv val/ILSVRC2012_val_00047125.JPEG n02088364/
+mv val/ILSVRC2012_val_00047126.JPEG n02489166/
+mv val/ILSVRC2012_val_00047127.JPEG n01914609/
+mv val/ILSVRC2012_val_00047128.JPEG n04009552/
+mv val/ILSVRC2012_val_00047129.JPEG n03680355/
+mv val/ILSVRC2012_val_00047130.JPEG n03843555/
+mv val/ILSVRC2012_val_00047131.JPEG n03950228/
+mv val/ILSVRC2012_val_00047132.JPEG n03680355/
+mv val/ILSVRC2012_val_00047133.JPEG n04597913/
+mv val/ILSVRC2012_val_00047134.JPEG n04347754/
+mv val/ILSVRC2012_val_00047135.JPEG n04116512/
+mv val/ILSVRC2012_val_00047136.JPEG n02747177/
+mv val/ILSVRC2012_val_00047137.JPEG n01514668/
+mv val/ILSVRC2012_val_00047138.JPEG n02840245/
+mv val/ILSVRC2012_val_00047139.JPEG n03483316/
+mv val/ILSVRC2012_val_00047140.JPEG n07715103/
+mv val/ILSVRC2012_val_00047141.JPEG n04153751/
+mv val/ILSVRC2012_val_00047142.JPEG n02500267/
+mv val/ILSVRC2012_val_00047143.JPEG n03998194/
+mv val/ILSVRC2012_val_00047144.JPEG n15075141/
+mv val/ILSVRC2012_val_00047145.JPEG n03930313/
+mv val/ILSVRC2012_val_00047146.JPEG n02112706/
+mv val/ILSVRC2012_val_00047147.JPEG n03888257/
+mv val/ILSVRC2012_val_00047148.JPEG n02110063/
+mv val/ILSVRC2012_val_00047149.JPEG n02108000/
+mv val/ILSVRC2012_val_00047150.JPEG n02102973/
+mv val/ILSVRC2012_val_00047151.JPEG n02483708/
+mv val/ILSVRC2012_val_00047152.JPEG n02097474/
+mv val/ILSVRC2012_val_00047153.JPEG n02011460/
+mv val/ILSVRC2012_val_00047154.JPEG n02492035/
+mv val/ILSVRC2012_val_00047155.JPEG n02814860/
+mv val/ILSVRC2012_val_00047156.JPEG n02009229/
+mv val/ILSVRC2012_val_00047157.JPEG n03877845/
+mv val/ILSVRC2012_val_00047158.JPEG n06596364/
+mv val/ILSVRC2012_val_00047159.JPEG n07248320/
+mv val/ILSVRC2012_val_00047160.JPEG n04344873/
+mv val/ILSVRC2012_val_00047161.JPEG n04536866/
+mv val/ILSVRC2012_val_00047162.JPEG n02823750/
+mv val/ILSVRC2012_val_00047163.JPEG n03291819/
+mv val/ILSVRC2012_val_00047164.JPEG n01770081/
+mv val/ILSVRC2012_val_00047165.JPEG n02892767/
+mv val/ILSVRC2012_val_00047166.JPEG n03481172/
+mv val/ILSVRC2012_val_00047167.JPEG n02066245/
+mv val/ILSVRC2012_val_00047168.JPEG n04370456/
+mv val/ILSVRC2012_val_00047169.JPEG n02264363/
+mv val/ILSVRC2012_val_00047170.JPEG n03670208/
+mv val/ILSVRC2012_val_00047171.JPEG n02397096/
+mv val/ILSVRC2012_val_00047172.JPEG n03075370/
+mv val/ILSVRC2012_val_00047173.JPEG n02087394/
+mv val/ILSVRC2012_val_00047174.JPEG n02536864/
+mv val/ILSVRC2012_val_00047175.JPEG n04599235/
+mv val/ILSVRC2012_val_00047176.JPEG n03982430/
+mv val/ILSVRC2012_val_00047177.JPEG n04523525/
+mv val/ILSVRC2012_val_00047178.JPEG n04522168/
+mv val/ILSVRC2012_val_00047179.JPEG n13052670/
+mv val/ILSVRC2012_val_00047180.JPEG n03633091/
+mv val/ILSVRC2012_val_00047181.JPEG n04067472/
+mv val/ILSVRC2012_val_00047182.JPEG n02988304/
+mv val/ILSVRC2012_val_00047183.JPEG n04486054/
+mv val/ILSVRC2012_val_00047184.JPEG n01677366/
+mv val/ILSVRC2012_val_00047185.JPEG n02492660/
+mv val/ILSVRC2012_val_00047186.JPEG n03127747/
+mv val/ILSVRC2012_val_00047187.JPEG n02112350/
+mv val/ILSVRC2012_val_00047188.JPEG n04336792/
+mv val/ILSVRC2012_val_00047189.JPEG n03417042/
+mv val/ILSVRC2012_val_00047190.JPEG n13133613/
+mv val/ILSVRC2012_val_00047191.JPEG n01608432/
+mv val/ILSVRC2012_val_00047192.JPEG n02865351/
+mv val/ILSVRC2012_val_00047193.JPEG n02129165/
+mv val/ILSVRC2012_val_00047194.JPEG n01773157/
+mv val/ILSVRC2012_val_00047195.JPEG n04258138/
+mv val/ILSVRC2012_val_00047196.JPEG n04041544/
+mv val/ILSVRC2012_val_00047197.JPEG n04252077/
+mv val/ILSVRC2012_val_00047198.JPEG n03197337/
+mv val/ILSVRC2012_val_00047199.JPEG n03794056/
+mv val/ILSVRC2012_val_00047200.JPEG n03877845/
+mv val/ILSVRC2012_val_00047201.JPEG n04346328/
+mv val/ILSVRC2012_val_00047202.JPEG n02086910/
+mv val/ILSVRC2012_val_00047203.JPEG n01694178/
+mv val/ILSVRC2012_val_00047204.JPEG n03445924/
+mv val/ILSVRC2012_val_00047205.JPEG n04532670/
+mv val/ILSVRC2012_val_00047206.JPEG n03781244/
+mv val/ILSVRC2012_val_00047207.JPEG n04141975/
+mv val/ILSVRC2012_val_00047208.JPEG n03124170/
+mv val/ILSVRC2012_val_00047209.JPEG n03874293/
+mv val/ILSVRC2012_val_00047210.JPEG n03498962/
+mv val/ILSVRC2012_val_00047211.JPEG n01739381/
+mv val/ILSVRC2012_val_00047212.JPEG n02791270/
+mv val/ILSVRC2012_val_00047213.JPEG n07892512/
+mv val/ILSVRC2012_val_00047214.JPEG n03444034/
+mv val/ILSVRC2012_val_00047215.JPEG n02105162/
+mv val/ILSVRC2012_val_00047216.JPEG n01734418/
+mv val/ILSVRC2012_val_00047217.JPEG n04070727/
+mv val/ILSVRC2012_val_00047218.JPEG n02916936/
+mv val/ILSVRC2012_val_00047219.JPEG n03840681/
+mv val/ILSVRC2012_val_00047220.JPEG n04399382/
+mv val/ILSVRC2012_val_00047221.JPEG n07749582/
+mv val/ILSVRC2012_val_00047222.JPEG n02480495/
+mv val/ILSVRC2012_val_00047223.JPEG n04515003/
+mv val/ILSVRC2012_val_00047224.JPEG n01688243/
+mv val/ILSVRC2012_val_00047225.JPEG n02107142/
+mv val/ILSVRC2012_val_00047226.JPEG n01914609/
+mv val/ILSVRC2012_val_00047227.JPEG n01742172/
+mv val/ILSVRC2012_val_00047228.JPEG n07753113/
+mv val/ILSVRC2012_val_00047229.JPEG n01828970/
+mv val/ILSVRC2012_val_00047230.JPEG n01797886/
+mv val/ILSVRC2012_val_00047231.JPEG n04606251/
+mv val/ILSVRC2012_val_00047232.JPEG n03062245/
+mv val/ILSVRC2012_val_00047233.JPEG n03400231/
+mv val/ILSVRC2012_val_00047234.JPEG n03483316/
+mv val/ILSVRC2012_val_00047235.JPEG n02978881/
+mv val/ILSVRC2012_val_00047236.JPEG n02109047/
+mv val/ILSVRC2012_val_00047237.JPEG n02795169/
+mv val/ILSVRC2012_val_00047238.JPEG n01728920/
+mv val/ILSVRC2012_val_00047239.JPEG n03530642/
+mv val/ILSVRC2012_val_00047240.JPEG n04209133/
+mv val/ILSVRC2012_val_00047241.JPEG n02105641/
+mv val/ILSVRC2012_val_00047242.JPEG n02111277/
+mv val/ILSVRC2012_val_00047243.JPEG n01737021/
+mv val/ILSVRC2012_val_00047244.JPEG n02092339/
+mv val/ILSVRC2012_val_00047245.JPEG n04589890/
+mv val/ILSVRC2012_val_00047246.JPEG n02454379/
+mv val/ILSVRC2012_val_00047247.JPEG n12267677/
+mv val/ILSVRC2012_val_00047248.JPEG n03627232/
+mv val/ILSVRC2012_val_00047249.JPEG n01990800/
+mv val/ILSVRC2012_val_00047250.JPEG n02109047/
+mv val/ILSVRC2012_val_00047251.JPEG n03314780/
+mv val/ILSVRC2012_val_00047252.JPEG n01798484/
+mv val/ILSVRC2012_val_00047253.JPEG n03691459/
+mv val/ILSVRC2012_val_00047254.JPEG n02669723/
+mv val/ILSVRC2012_val_00047255.JPEG n03781244/
+mv val/ILSVRC2012_val_00047256.JPEG n03467068/
+mv val/ILSVRC2012_val_00047257.JPEG n01770081/
+mv val/ILSVRC2012_val_00047258.JPEG n01796340/
+mv val/ILSVRC2012_val_00047259.JPEG n03930313/
+mv val/ILSVRC2012_val_00047260.JPEG n02226429/
+mv val/ILSVRC2012_val_00047261.JPEG n02514041/
+mv val/ILSVRC2012_val_00047262.JPEG n02356798/
+mv val/ILSVRC2012_val_00047263.JPEG n07880968/
+mv val/ILSVRC2012_val_00047264.JPEG n04131690/
+mv val/ILSVRC2012_val_00047265.JPEG n02807133/
+mv val/ILSVRC2012_val_00047266.JPEG n03841143/
+mv val/ILSVRC2012_val_00047267.JPEG n02346627/
+mv val/ILSVRC2012_val_00047268.JPEG n02397096/
+mv val/ILSVRC2012_val_00047269.JPEG n02963159/
+mv val/ILSVRC2012_val_00047270.JPEG n02641379/
+mv val/ILSVRC2012_val_00047271.JPEG n02093428/
+mv val/ILSVRC2012_val_00047272.JPEG n01537544/
+mv val/ILSVRC2012_val_00047273.JPEG n02814860/
+mv val/ILSVRC2012_val_00047274.JPEG n04074963/
+mv val/ILSVRC2012_val_00047275.JPEG n02109525/
+mv val/ILSVRC2012_val_00047276.JPEG n02085782/
+mv val/ILSVRC2012_val_00047277.JPEG n02102973/
+mv val/ILSVRC2012_val_00047278.JPEG n02319095/
+mv val/ILSVRC2012_val_00047279.JPEG n02437616/
+mv val/ILSVRC2012_val_00047280.JPEG n02395406/
+mv val/ILSVRC2012_val_00047281.JPEG n02488291/
+mv val/ILSVRC2012_val_00047282.JPEG n03777568/
+mv val/ILSVRC2012_val_00047283.JPEG n03710193/
+mv val/ILSVRC2012_val_00047284.JPEG n09421951/
+mv val/ILSVRC2012_val_00047285.JPEG n03838899/
+mv val/ILSVRC2012_val_00047286.JPEG n04004767/
+mv val/ILSVRC2012_val_00047287.JPEG n02011460/
+mv val/ILSVRC2012_val_00047288.JPEG n02526121/
+mv val/ILSVRC2012_val_00047289.JPEG n02112018/
+mv val/ILSVRC2012_val_00047290.JPEG n02687172/
+mv val/ILSVRC2012_val_00047291.JPEG n02825657/
+mv val/ILSVRC2012_val_00047292.JPEG n01882714/
+mv val/ILSVRC2012_val_00047293.JPEG n01968897/
+mv val/ILSVRC2012_val_00047294.JPEG n03196217/
+mv val/ILSVRC2012_val_00047295.JPEG n02101556/
+mv val/ILSVRC2012_val_00047296.JPEG n04389033/
+mv val/ILSVRC2012_val_00047297.JPEG n04127249/
+mv val/ILSVRC2012_val_00047298.JPEG n04254680/
+mv val/ILSVRC2012_val_00047299.JPEG n03063689/
+mv val/ILSVRC2012_val_00047300.JPEG n04125021/
+mv val/ILSVRC2012_val_00047301.JPEG n01689811/
+mv val/ILSVRC2012_val_00047302.JPEG n04325704/
+mv val/ILSVRC2012_val_00047303.JPEG n02137549/
+mv val/ILSVRC2012_val_00047304.JPEG n10565667/
+mv val/ILSVRC2012_val_00047305.JPEG n02391049/
+mv val/ILSVRC2012_val_00047306.JPEG n07836838/
+mv val/ILSVRC2012_val_00047307.JPEG n04584207/
+mv val/ILSVRC2012_val_00047308.JPEG n02423022/
+mv val/ILSVRC2012_val_00047309.JPEG n02088364/
+mv val/ILSVRC2012_val_00047310.JPEG n03961711/
+mv val/ILSVRC2012_val_00047311.JPEG n02457408/
+mv val/ILSVRC2012_val_00047312.JPEG n03535780/
+mv val/ILSVRC2012_val_00047313.JPEG n02412080/
+mv val/ILSVRC2012_val_00047314.JPEG n03017168/
+mv val/ILSVRC2012_val_00047315.JPEG n02979186/
+mv val/ILSVRC2012_val_00047316.JPEG n02676566/
+mv val/ILSVRC2012_val_00047317.JPEG n01860187/
+mv val/ILSVRC2012_val_00047318.JPEG n02423022/
+mv val/ILSVRC2012_val_00047319.JPEG n03891332/
+mv val/ILSVRC2012_val_00047320.JPEG n01494475/
+mv val/ILSVRC2012_val_00047321.JPEG n01704323/
+mv val/ILSVRC2012_val_00047322.JPEG n04423845/
+mv val/ILSVRC2012_val_00047323.JPEG n03976467/
+mv val/ILSVRC2012_val_00047324.JPEG n02091831/
+mv val/ILSVRC2012_val_00047325.JPEG n02101006/
+mv val/ILSVRC2012_val_00047326.JPEG n01491361/
+mv val/ILSVRC2012_val_00047327.JPEG n03063689/
+mv val/ILSVRC2012_val_00047328.JPEG n01910747/
+mv val/ILSVRC2012_val_00047329.JPEG n01784675/
+mv val/ILSVRC2012_val_00047330.JPEG n03967562/
+mv val/ILSVRC2012_val_00047331.JPEG n02094114/
+mv val/ILSVRC2012_val_00047332.JPEG n04065272/
+mv val/ILSVRC2012_val_00047333.JPEG n01534433/
+mv val/ILSVRC2012_val_00047334.JPEG n04372370/
+mv val/ILSVRC2012_val_00047335.JPEG n02879718/
+mv val/ILSVRC2012_val_00047336.JPEG n02871525/
+mv val/ILSVRC2012_val_00047337.JPEG n02168699/
+mv val/ILSVRC2012_val_00047338.JPEG n01784675/
+mv val/ILSVRC2012_val_00047339.JPEG n03492542/
+mv val/ILSVRC2012_val_00047340.JPEG n02101388/
+mv val/ILSVRC2012_val_00047341.JPEG n07718472/
+mv val/ILSVRC2012_val_00047342.JPEG n02110185/
+mv val/ILSVRC2012_val_00047343.JPEG n12998815/
+mv val/ILSVRC2012_val_00047344.JPEG n03127925/
+mv val/ILSVRC2012_val_00047345.JPEG n03207743/
+mv val/ILSVRC2012_val_00047346.JPEG n12057211/
+mv val/ILSVRC2012_val_00047347.JPEG n07565083/
+mv val/ILSVRC2012_val_00047348.JPEG n04525038/
+mv val/ILSVRC2012_val_00047349.JPEG n04118776/
+mv val/ILSVRC2012_val_00047350.JPEG n01616318/
+mv val/ILSVRC2012_val_00047351.JPEG n02965783/
+mv val/ILSVRC2012_val_00047352.JPEG n02206856/
+mv val/ILSVRC2012_val_00047353.JPEG n03899768/
+mv val/ILSVRC2012_val_00047354.JPEG n01687978/
+mv val/ILSVRC2012_val_00047355.JPEG n03379051/
+mv val/ILSVRC2012_val_00047356.JPEG n02104029/
+mv val/ILSVRC2012_val_00047357.JPEG n04229816/
+mv val/ILSVRC2012_val_00047358.JPEG n03124170/
+mv val/ILSVRC2012_val_00047359.JPEG n02281406/
+mv val/ILSVRC2012_val_00047360.JPEG n03032252/
+mv val/ILSVRC2012_val_00047361.JPEG n02101556/
+mv val/ILSVRC2012_val_00047362.JPEG n02980441/
+mv val/ILSVRC2012_val_00047363.JPEG n03485794/
+mv val/ILSVRC2012_val_00047364.JPEG n04366367/
+mv val/ILSVRC2012_val_00047365.JPEG n02492035/
+mv val/ILSVRC2012_val_00047366.JPEG n03599486/
+mv val/ILSVRC2012_val_00047367.JPEG n04548362/
+mv val/ILSVRC2012_val_00047368.JPEG n03764736/
+mv val/ILSVRC2012_val_00047369.JPEG n07760859/
+mv val/ILSVRC2012_val_00047370.JPEG n01978287/
+mv val/ILSVRC2012_val_00047371.JPEG n04505470/
+mv val/ILSVRC2012_val_00047372.JPEG n02488291/
+mv val/ILSVRC2012_val_00047373.JPEG n02782093/
+mv val/ILSVRC2012_val_00047374.JPEG n03417042/
+mv val/ILSVRC2012_val_00047375.JPEG n02486261/
+mv val/ILSVRC2012_val_00047376.JPEG n03843555/
+mv val/ILSVRC2012_val_00047377.JPEG n02319095/
+mv val/ILSVRC2012_val_00047378.JPEG n02493509/
+mv val/ILSVRC2012_val_00047379.JPEG n01798484/
+mv val/ILSVRC2012_val_00047380.JPEG n03857828/
+mv val/ILSVRC2012_val_00047381.JPEG n03950228/
+mv val/ILSVRC2012_val_00047382.JPEG n02791124/
+mv val/ILSVRC2012_val_00047383.JPEG n03207941/
+mv val/ILSVRC2012_val_00047384.JPEG n01751748/
+mv val/ILSVRC2012_val_00047385.JPEG n03916031/
+mv val/ILSVRC2012_val_00047386.JPEG n04074963/
+mv val/ILSVRC2012_val_00047387.JPEG n03724870/
+mv val/ILSVRC2012_val_00047388.JPEG n13133613/
+mv val/ILSVRC2012_val_00047389.JPEG n03937543/
+mv val/ILSVRC2012_val_00047390.JPEG n03255030/
+mv val/ILSVRC2012_val_00047391.JPEG n04372370/
+mv val/ILSVRC2012_val_00047392.JPEG n02168699/
+mv val/ILSVRC2012_val_00047393.JPEG n03920288/
+mv val/ILSVRC2012_val_00047394.JPEG n02514041/
+mv val/ILSVRC2012_val_00047395.JPEG n02112350/
+mv val/ILSVRC2012_val_00047396.JPEG n01443537/
+mv val/ILSVRC2012_val_00047397.JPEG n01807496/
+mv val/ILSVRC2012_val_00047398.JPEG n04070727/
+mv val/ILSVRC2012_val_00047399.JPEG n01675722/
+mv val/ILSVRC2012_val_00047400.JPEG n01518878/
+mv val/ILSVRC2012_val_00047401.JPEG n03599486/
+mv val/ILSVRC2012_val_00047402.JPEG n04162706/
+mv val/ILSVRC2012_val_00047403.JPEG n04147183/
+mv val/ILSVRC2012_val_00047404.JPEG n01795545/
+mv val/ILSVRC2012_val_00047405.JPEG n01698640/
+mv val/ILSVRC2012_val_00047406.JPEG n01873310/
+mv val/ILSVRC2012_val_00047407.JPEG n07718472/
+mv val/ILSVRC2012_val_00047408.JPEG n04033995/
+mv val/ILSVRC2012_val_00047409.JPEG n04418357/
+mv val/ILSVRC2012_val_00047410.JPEG n04429376/
+mv val/ILSVRC2012_val_00047411.JPEG n02110806/
+mv val/ILSVRC2012_val_00047412.JPEG n01944390/
+mv val/ILSVRC2012_val_00047413.JPEG n09835506/
+mv val/ILSVRC2012_val_00047414.JPEG n02092339/
+mv val/ILSVRC2012_val_00047415.JPEG n02948072/
+mv val/ILSVRC2012_val_00047416.JPEG n01978455/
+mv val/ILSVRC2012_val_00047417.JPEG n02100236/
+mv val/ILSVRC2012_val_00047418.JPEG n03710193/
+mv val/ILSVRC2012_val_00047419.JPEG n04517823/
+mv val/ILSVRC2012_val_00047420.JPEG n04154565/
+mv val/ILSVRC2012_val_00047421.JPEG n03761084/
+mv val/ILSVRC2012_val_00047422.JPEG n02346627/
+mv val/ILSVRC2012_val_00047423.JPEG n02672831/
+mv val/ILSVRC2012_val_00047424.JPEG n02422106/
+mv val/ILSVRC2012_val_00047425.JPEG n01664065/
+mv val/ILSVRC2012_val_00047426.JPEG n04125021/
+mv val/ILSVRC2012_val_00047427.JPEG n03450230/
+mv val/ILSVRC2012_val_00047428.JPEG n03980874/
+mv val/ILSVRC2012_val_00047429.JPEG n03642806/
+mv val/ILSVRC2012_val_00047430.JPEG n03866082/
+mv val/ILSVRC2012_val_00047431.JPEG n01494475/
+mv val/ILSVRC2012_val_00047432.JPEG n01910747/
+mv val/ILSVRC2012_val_00047433.JPEG n02229544/
+mv val/ILSVRC2012_val_00047434.JPEG n01770393/
+mv val/ILSVRC2012_val_00047435.JPEG n02114367/
+mv val/ILSVRC2012_val_00047436.JPEG n07920052/
+mv val/ILSVRC2012_val_00047437.JPEG n01872401/
+mv val/ILSVRC2012_val_00047438.JPEG n02109047/
+mv val/ILSVRC2012_val_00047439.JPEG n03884397/
+mv val/ILSVRC2012_val_00047440.JPEG n02704792/
+mv val/ILSVRC2012_val_00047441.JPEG n07716906/
+mv val/ILSVRC2012_val_00047442.JPEG n03843555/
+mv val/ILSVRC2012_val_00047443.JPEG n03095699/
+mv val/ILSVRC2012_val_00047444.JPEG n04532106/
+mv val/ILSVRC2012_val_00047445.JPEG n02093754/
+mv val/ILSVRC2012_val_00047446.JPEG n02879718/
+mv val/ILSVRC2012_val_00047447.JPEG n04515003/
+mv val/ILSVRC2012_val_00047448.JPEG n07718747/
+mv val/ILSVRC2012_val_00047449.JPEG n02094258/
+mv val/ILSVRC2012_val_00047450.JPEG n03838899/
+mv val/ILSVRC2012_val_00047451.JPEG n03126707/
+mv val/ILSVRC2012_val_00047452.JPEG n07730033/
+mv val/ILSVRC2012_val_00047453.JPEG n03085013/
+mv val/ILSVRC2012_val_00047454.JPEG n03680355/
+mv val/ILSVRC2012_val_00047455.JPEG n02123045/
+mv val/ILSVRC2012_val_00047456.JPEG n02279972/
+mv val/ILSVRC2012_val_00047457.JPEG n02086240/
+mv val/ILSVRC2012_val_00047458.JPEG n02134418/
+mv val/ILSVRC2012_val_00047459.JPEG n03388549/
+mv val/ILSVRC2012_val_00047460.JPEG n03637318/
+mv val/ILSVRC2012_val_00047461.JPEG n03345487/
+mv val/ILSVRC2012_val_00047462.JPEG n04517823/
+mv val/ILSVRC2012_val_00047463.JPEG n03476991/
+mv val/ILSVRC2012_val_00047464.JPEG n07734744/
+mv val/ILSVRC2012_val_00047465.JPEG n03602883/
+mv val/ILSVRC2012_val_00047466.JPEG n04371774/
+mv val/ILSVRC2012_val_00047467.JPEG n04229816/
+mv val/ILSVRC2012_val_00047468.JPEG n03249569/
+mv val/ILSVRC2012_val_00047469.JPEG n02676566/
+mv val/ILSVRC2012_val_00047470.JPEG n02011460/
+mv val/ILSVRC2012_val_00047471.JPEG n02916936/
+mv val/ILSVRC2012_val_00047472.JPEG n01806567/
+mv val/ILSVRC2012_val_00047473.JPEG n02814533/
+mv val/ILSVRC2012_val_00047474.JPEG n01560419/
+mv val/ILSVRC2012_val_00047475.JPEG n03970156/
+mv val/ILSVRC2012_val_00047476.JPEG n01978455/
+mv val/ILSVRC2012_val_00047477.JPEG n02823750/
+mv val/ILSVRC2012_val_00047478.JPEG n02883205/
+mv val/ILSVRC2012_val_00047479.JPEG n02110627/
+mv val/ILSVRC2012_val_00047480.JPEG n03787032/
+mv val/ILSVRC2012_val_00047481.JPEG n10148035/
+mv val/ILSVRC2012_val_00047482.JPEG n04596742/
+mv val/ILSVRC2012_val_00047483.JPEG n04033995/
+mv val/ILSVRC2012_val_00047484.JPEG n02444819/
+mv val/ILSVRC2012_val_00047485.JPEG n03954731/
+mv val/ILSVRC2012_val_00047486.JPEG n04311174/
+mv val/ILSVRC2012_val_00047487.JPEG n02095889/
+mv val/ILSVRC2012_val_00047488.JPEG n01914609/
+mv val/ILSVRC2012_val_00047489.JPEG n03710193/
+mv val/ILSVRC2012_val_00047490.JPEG n02782093/
+mv val/ILSVRC2012_val_00047491.JPEG n01820546/
+mv val/ILSVRC2012_val_00047492.JPEG n02091134/
+mv val/ILSVRC2012_val_00047493.JPEG n04355933/
+mv val/ILSVRC2012_val_00047494.JPEG n02389026/
+mv val/ILSVRC2012_val_00047495.JPEG n04090263/
+mv val/ILSVRC2012_val_00047496.JPEG n04254120/
+mv val/ILSVRC2012_val_00047497.JPEG n01820546/
+mv val/ILSVRC2012_val_00047498.JPEG n01641577/
+mv val/ILSVRC2012_val_00047499.JPEG n02106550/
+mv val/ILSVRC2012_val_00047500.JPEG n02326432/
+mv val/ILSVRC2012_val_00047501.JPEG n03532672/
+mv val/ILSVRC2012_val_00047502.JPEG n03065424/
+mv val/ILSVRC2012_val_00047503.JPEG n07836838/
+mv val/ILSVRC2012_val_00047504.JPEG n02786058/
+mv val/ILSVRC2012_val_00047505.JPEG n04235860/
+mv val/ILSVRC2012_val_00047506.JPEG n04264628/
+mv val/ILSVRC2012_val_00047507.JPEG n02091244/
+mv val/ILSVRC2012_val_00047508.JPEG n03773504/
+mv val/ILSVRC2012_val_00047509.JPEG n02013706/
+mv val/ILSVRC2012_val_00047510.JPEG n04458633/
+mv val/ILSVRC2012_val_00047511.JPEG n04270147/
+mv val/ILSVRC2012_val_00047512.JPEG n07711569/
+mv val/ILSVRC2012_val_00047513.JPEG n04325704/
+mv val/ILSVRC2012_val_00047514.JPEG n03017168/
+mv val/ILSVRC2012_val_00047515.JPEG n02112350/
+mv val/ILSVRC2012_val_00047516.JPEG n04192698/
+mv val/ILSVRC2012_val_00047517.JPEG n02769748/
+mv val/ILSVRC2012_val_00047518.JPEG n02096051/
+mv val/ILSVRC2012_val_00047519.JPEG n04149813/
+mv val/ILSVRC2012_val_00047520.JPEG n02483708/
+mv val/ILSVRC2012_val_00047521.JPEG n04040759/
+mv val/ILSVRC2012_val_00047522.JPEG n04265275/
+mv val/ILSVRC2012_val_00047523.JPEG n02071294/
+mv val/ILSVRC2012_val_00047524.JPEG n07873807/
+mv val/ILSVRC2012_val_00047525.JPEG n02488702/
+mv val/ILSVRC2012_val_00047526.JPEG n04200800/
+mv val/ILSVRC2012_val_00047527.JPEG n02134084/
+mv val/ILSVRC2012_val_00047528.JPEG n04418357/
+mv val/ILSVRC2012_val_00047529.JPEG n04552348/
+mv val/ILSVRC2012_val_00047530.JPEG n02999410/
+mv val/ILSVRC2012_val_00047531.JPEG n02817516/
+mv val/ILSVRC2012_val_00047532.JPEG n01981276/
+mv val/ILSVRC2012_val_00047533.JPEG n02233338/
+mv val/ILSVRC2012_val_00047534.JPEG n02504458/
+mv val/ILSVRC2012_val_00047535.JPEG n02116738/
+mv val/ILSVRC2012_val_00047536.JPEG n03633091/
+mv val/ILSVRC2012_val_00047537.JPEG n03372029/
+mv val/ILSVRC2012_val_00047538.JPEG n07714990/
+mv val/ILSVRC2012_val_00047539.JPEG n04552348/
+mv val/ILSVRC2012_val_00047540.JPEG n02504458/
+mv val/ILSVRC2012_val_00047541.JPEG n02172182/
+mv val/ILSVRC2012_val_00047542.JPEG n03691459/
+mv val/ILSVRC2012_val_00047543.JPEG n02089078/
+mv val/ILSVRC2012_val_00047544.JPEG n03594734/
+mv val/ILSVRC2012_val_00047545.JPEG n02643566/
+mv val/ILSVRC2012_val_00047546.JPEG n01665541/
+mv val/ILSVRC2012_val_00047547.JPEG n01818515/
+mv val/ILSVRC2012_val_00047548.JPEG n02802426/
+mv val/ILSVRC2012_val_00047549.JPEG n03662601/
+mv val/ILSVRC2012_val_00047550.JPEG n03495258/
+mv val/ILSVRC2012_val_00047551.JPEG n01773797/
+mv val/ILSVRC2012_val_00047552.JPEG n02206856/
+mv val/ILSVRC2012_val_00047553.JPEG n03710721/
+mv val/ILSVRC2012_val_00047554.JPEG n04442312/
+mv val/ILSVRC2012_val_00047555.JPEG n02137549/
+mv val/ILSVRC2012_val_00047556.JPEG n03657121/
+mv val/ILSVRC2012_val_00047557.JPEG n04311004/
+mv val/ILSVRC2012_val_00047558.JPEG n03775071/
+mv val/ILSVRC2012_val_00047559.JPEG n03630383/
+mv val/ILSVRC2012_val_00047560.JPEG n02412080/
+mv val/ILSVRC2012_val_00047561.JPEG n01443537/
+mv val/ILSVRC2012_val_00047562.JPEG n03874293/
+mv val/ILSVRC2012_val_00047563.JPEG n03874599/
+mv val/ILSVRC2012_val_00047564.JPEG n07590611/
+mv val/ILSVRC2012_val_00047565.JPEG n04162706/
+mv val/ILSVRC2012_val_00047566.JPEG n02108551/
+mv val/ILSVRC2012_val_00047567.JPEG n07749582/
+mv val/ILSVRC2012_val_00047568.JPEG n02804414/
+mv val/ILSVRC2012_val_00047569.JPEG n03777754/
+mv val/ILSVRC2012_val_00047570.JPEG n03584829/
+mv val/ILSVRC2012_val_00047571.JPEG n02699494/
+mv val/ILSVRC2012_val_00047572.JPEG n02097298/
+mv val/ILSVRC2012_val_00047573.JPEG n03661043/
+mv val/ILSVRC2012_val_00047574.JPEG n01774750/
+mv val/ILSVRC2012_val_00047575.JPEG n03594945/
+mv val/ILSVRC2012_val_00047576.JPEG n04005630/
+mv val/ILSVRC2012_val_00047577.JPEG n07697313/
+mv val/ILSVRC2012_val_00047578.JPEG n02009229/
+mv val/ILSVRC2012_val_00047579.JPEG n03529860/
+mv val/ILSVRC2012_val_00047580.JPEG n04355933/
+mv val/ILSVRC2012_val_00047581.JPEG n03899768/
+mv val/ILSVRC2012_val_00047582.JPEG n03337140/
+mv val/ILSVRC2012_val_00047583.JPEG n02110958/
+mv val/ILSVRC2012_val_00047584.JPEG n02092339/
+mv val/ILSVRC2012_val_00047585.JPEG n02097130/
+mv val/ILSVRC2012_val_00047586.JPEG n03337140/
+mv val/ILSVRC2012_val_00047587.JPEG n01818515/
+mv val/ILSVRC2012_val_00047588.JPEG n03345487/
+mv val/ILSVRC2012_val_00047589.JPEG n01496331/
+mv val/ILSVRC2012_val_00047590.JPEG n03124043/
+mv val/ILSVRC2012_val_00047591.JPEG n02095570/
+mv val/ILSVRC2012_val_00047592.JPEG n01558993/
+mv val/ILSVRC2012_val_00047593.JPEG n03814906/
+mv val/ILSVRC2012_val_00047594.JPEG n03216828/
+mv val/ILSVRC2012_val_00047595.JPEG n03930630/
+mv val/ILSVRC2012_val_00047596.JPEG n06874185/
+mv val/ILSVRC2012_val_00047597.JPEG n02113799/
+mv val/ILSVRC2012_val_00047598.JPEG n07720875/
+mv val/ILSVRC2012_val_00047599.JPEG n03887697/
+mv val/ILSVRC2012_val_00047600.JPEG n03697007/
+mv val/ILSVRC2012_val_00047601.JPEG n02231487/
+mv val/ILSVRC2012_val_00047602.JPEG n02669723/
+mv val/ILSVRC2012_val_00047603.JPEG n02480855/
+mv val/ILSVRC2012_val_00047604.JPEG n04366367/
+mv val/ILSVRC2012_val_00047605.JPEG n03706229/
+mv val/ILSVRC2012_val_00047606.JPEG n03529860/
+mv val/ILSVRC2012_val_00047607.JPEG n03924679/
+mv val/ILSVRC2012_val_00047608.JPEG n03527444/
+mv val/ILSVRC2012_val_00047609.JPEG n01770393/
+mv val/ILSVRC2012_val_00047610.JPEG n04493381/
+mv val/ILSVRC2012_val_00047611.JPEG n04532670/
+mv val/ILSVRC2012_val_00047612.JPEG n02883205/
+mv val/ILSVRC2012_val_00047613.JPEG n04192698/
+mv val/ILSVRC2012_val_00047614.JPEG n02129604/
+mv val/ILSVRC2012_val_00047615.JPEG n02669723/
+mv val/ILSVRC2012_val_00047616.JPEG n04259630/
+mv val/ILSVRC2012_val_00047617.JPEG n02091831/
+mv val/ILSVRC2012_val_00047618.JPEG n09332890/
+mv val/ILSVRC2012_val_00047619.JPEG n01883070/
+mv val/ILSVRC2012_val_00047620.JPEG n04026417/
+mv val/ILSVRC2012_val_00047621.JPEG n03485407/
+mv val/ILSVRC2012_val_00047622.JPEG n01877812/
+mv val/ILSVRC2012_val_00047623.JPEG n01644900/
+mv val/ILSVRC2012_val_00047624.JPEG n09256479/
+mv val/ILSVRC2012_val_00047625.JPEG n04286575/
+mv val/ILSVRC2012_val_00047626.JPEG n01601694/
+mv val/ILSVRC2012_val_00047627.JPEG n04428191/
+mv val/ILSVRC2012_val_00047628.JPEG n03065424/
+mv val/ILSVRC2012_val_00047629.JPEG n03770439/
+mv val/ILSVRC2012_val_00047630.JPEG n02174001/
+mv val/ILSVRC2012_val_00047631.JPEG n02110341/
+mv val/ILSVRC2012_val_00047632.JPEG n02916936/
+mv val/ILSVRC2012_val_00047633.JPEG n04086273/
+mv val/ILSVRC2012_val_00047634.JPEG n03393912/
+mv val/ILSVRC2012_val_00047635.JPEG n02701002/
+mv val/ILSVRC2012_val_00047636.JPEG n03991062/
+mv val/ILSVRC2012_val_00047637.JPEG n01608432/
+mv val/ILSVRC2012_val_00047638.JPEG n04273569/
+mv val/ILSVRC2012_val_00047639.JPEG n04522168/
+mv val/ILSVRC2012_val_00047640.JPEG n07760859/
+mv val/ILSVRC2012_val_00047641.JPEG n02493793/
+mv val/ILSVRC2012_val_00047642.JPEG n02804414/
+mv val/ILSVRC2012_val_00047643.JPEG n02229544/
+mv val/ILSVRC2012_val_00047644.JPEG n04009552/
+mv val/ILSVRC2012_val_00047645.JPEG n03874599/
+mv val/ILSVRC2012_val_00047646.JPEG n03649909/
+mv val/ILSVRC2012_val_00047647.JPEG n07614500/
+mv val/ILSVRC2012_val_00047648.JPEG n02094433/
+mv val/ILSVRC2012_val_00047649.JPEG n02097298/
+mv val/ILSVRC2012_val_00047650.JPEG n03662601/
+mv val/ILSVRC2012_val_00047651.JPEG n03450230/
+mv val/ILSVRC2012_val_00047652.JPEG n02093256/
+mv val/ILSVRC2012_val_00047653.JPEG n04033995/
+mv val/ILSVRC2012_val_00047654.JPEG n02113023/
+mv val/ILSVRC2012_val_00047655.JPEG n09246464/
+mv val/ILSVRC2012_val_00047656.JPEG n01704323/
+mv val/ILSVRC2012_val_00047657.JPEG n02488702/
+mv val/ILSVRC2012_val_00047658.JPEG n02096294/
+mv val/ILSVRC2012_val_00047659.JPEG n04536866/
+mv val/ILSVRC2012_val_00047660.JPEG n07873807/
+mv val/ILSVRC2012_val_00047661.JPEG n03770439/
+mv val/ILSVRC2012_val_00047662.JPEG n04409515/
+mv val/ILSVRC2012_val_00047663.JPEG n04532106/
+mv val/ILSVRC2012_val_00047664.JPEG n04542943/
+mv val/ILSVRC2012_val_00047665.JPEG n07584110/
+mv val/ILSVRC2012_val_00047666.JPEG n02808304/
+mv val/ILSVRC2012_val_00047667.JPEG n03903868/
+mv val/ILSVRC2012_val_00047668.JPEG n03888605/
+mv val/ILSVRC2012_val_00047669.JPEG n02051845/
+mv val/ILSVRC2012_val_00047670.JPEG n02115641/
+mv val/ILSVRC2012_val_00047671.JPEG n02099267/
+mv val/ILSVRC2012_val_00047672.JPEG n03452741/
+mv val/ILSVRC2012_val_00047673.JPEG n03498962/
+mv val/ILSVRC2012_val_00047674.JPEG n01945685/
+mv val/ILSVRC2012_val_00047675.JPEG n01692333/
+mv val/ILSVRC2012_val_00047676.JPEG n03930630/
+mv val/ILSVRC2012_val_00047677.JPEG n02794156/
+mv val/ILSVRC2012_val_00047678.JPEG n04311004/
+mv val/ILSVRC2012_val_00047679.JPEG n03482405/
+mv val/ILSVRC2012_val_00047680.JPEG n04540053/
+mv val/ILSVRC2012_val_00047681.JPEG n09256479/
+mv val/ILSVRC2012_val_00047682.JPEG n02607072/
+mv val/ILSVRC2012_val_00047683.JPEG n02281406/
+mv val/ILSVRC2012_val_00047684.JPEG n03991062/
+mv val/ILSVRC2012_val_00047685.JPEG n02056570/
+mv val/ILSVRC2012_val_00047686.JPEG n04243546/
+mv val/ILSVRC2012_val_00047687.JPEG n03100240/
+mv val/ILSVRC2012_val_00047688.JPEG n01532829/
+mv val/ILSVRC2012_val_00047689.JPEG n03127747/
+mv val/ILSVRC2012_val_00047690.JPEG n02119022/
+mv val/ILSVRC2012_val_00047691.JPEG n02666196/
+mv val/ILSVRC2012_val_00047692.JPEG n03379051/
+mv val/ILSVRC2012_val_00047693.JPEG n04417672/
+mv val/ILSVRC2012_val_00047694.JPEG n07920052/
+mv val/ILSVRC2012_val_00047695.JPEG n03617480/
+mv val/ILSVRC2012_val_00047696.JPEG n01818515/
+mv val/ILSVRC2012_val_00047697.JPEG n03998194/
+mv val/ILSVRC2012_val_00047698.JPEG n03388183/
+mv val/ILSVRC2012_val_00047699.JPEG n02113799/
+mv val/ILSVRC2012_val_00047700.JPEG n04344873/
+mv val/ILSVRC2012_val_00047701.JPEG n03590841/
+mv val/ILSVRC2012_val_00047702.JPEG n04228054/
+mv val/ILSVRC2012_val_00047703.JPEG n04228054/
+mv val/ILSVRC2012_val_00047704.JPEG n02231487/
+mv val/ILSVRC2012_val_00047705.JPEG n03888257/
+mv val/ILSVRC2012_val_00047706.JPEG n04086273/
+mv val/ILSVRC2012_val_00047707.JPEG n02090622/
+mv val/ILSVRC2012_val_00047708.JPEG n03933933/
+mv val/ILSVRC2012_val_00047709.JPEG n02422106/
+mv val/ILSVRC2012_val_00047710.JPEG n03720891/
+mv val/ILSVRC2012_val_00047711.JPEG n02093991/
+mv val/ILSVRC2012_val_00047712.JPEG n04347754/
+mv val/ILSVRC2012_val_00047713.JPEG n01630670/
+mv val/ILSVRC2012_val_00047714.JPEG n03843555/
+mv val/ILSVRC2012_val_00047715.JPEG n03729826/
+mv val/ILSVRC2012_val_00047716.JPEG n01644900/
+mv val/ILSVRC2012_val_00047717.JPEG n02264363/
+mv val/ILSVRC2012_val_00047718.JPEG n03126707/
+mv val/ILSVRC2012_val_00047719.JPEG n12057211/
+mv val/ILSVRC2012_val_00047720.JPEG n04461696/
+mv val/ILSVRC2012_val_00047721.JPEG n02098286/
+mv val/ILSVRC2012_val_00047722.JPEG n02276258/
+mv val/ILSVRC2012_val_00047723.JPEG n04552348/
+mv val/ILSVRC2012_val_00047724.JPEG n01514668/
+mv val/ILSVRC2012_val_00047725.JPEG n04243546/
+mv val/ILSVRC2012_val_00047726.JPEG n02871525/
+mv val/ILSVRC2012_val_00047727.JPEG n02106382/
+mv val/ILSVRC2012_val_00047728.JPEG n02100583/
+mv val/ILSVRC2012_val_00047729.JPEG n02085936/
+mv val/ILSVRC2012_val_00047730.JPEG n04487081/
+mv val/ILSVRC2012_val_00047731.JPEG n03995372/
+mv val/ILSVRC2012_val_00047732.JPEG n01601694/
+mv val/ILSVRC2012_val_00047733.JPEG n02279972/
+mv val/ILSVRC2012_val_00047734.JPEG n03444034/
+mv val/ILSVRC2012_val_00047735.JPEG n07730033/
+mv val/ILSVRC2012_val_00047736.JPEG n02011460/
+mv val/ILSVRC2012_val_00047737.JPEG n02099601/
+mv val/ILSVRC2012_val_00047738.JPEG n04536866/
+mv val/ILSVRC2012_val_00047739.JPEG n03014705/
+mv val/ILSVRC2012_val_00047740.JPEG n02486261/
+mv val/ILSVRC2012_val_00047741.JPEG n04590129/
+mv val/ILSVRC2012_val_00047742.JPEG n04265275/
+mv val/ILSVRC2012_val_00047743.JPEG n03447447/
+mv val/ILSVRC2012_val_00047744.JPEG n02102177/
+mv val/ILSVRC2012_val_00047745.JPEG n03388043/
+mv val/ILSVRC2012_val_00047746.JPEG n01665541/
+mv val/ILSVRC2012_val_00047747.JPEG n03924679/
+mv val/ILSVRC2012_val_00047748.JPEG n06874185/
+mv val/ILSVRC2012_val_00047749.JPEG n03018349/
+mv val/ILSVRC2012_val_00047750.JPEG n02403003/
+mv val/ILSVRC2012_val_00047751.JPEG n03196217/
+mv val/ILSVRC2012_val_00047752.JPEG n02132136/
+mv val/ILSVRC2012_val_00047753.JPEG n01514859/
+mv val/ILSVRC2012_val_00047754.JPEG n02397096/
+mv val/ILSVRC2012_val_00047755.JPEG n02113186/
+mv val/ILSVRC2012_val_00047756.JPEG n03924679/
+mv val/ILSVRC2012_val_00047757.JPEG n02096437/
+mv val/ILSVRC2012_val_00047758.JPEG n07831146/
+mv val/ILSVRC2012_val_00047759.JPEG n04584207/
+mv val/ILSVRC2012_val_00047760.JPEG n03777568/
+mv val/ILSVRC2012_val_00047761.JPEG n02276258/
+mv val/ILSVRC2012_val_00047762.JPEG n02108915/
+mv val/ILSVRC2012_val_00047763.JPEG n04540053/
+mv val/ILSVRC2012_val_00047764.JPEG n03874293/
+mv val/ILSVRC2012_val_00047765.JPEG n02033041/
+mv val/ILSVRC2012_val_00047766.JPEG n04270147/
+mv val/ILSVRC2012_val_00047767.JPEG n02114367/
+mv val/ILSVRC2012_val_00047768.JPEG n07730033/
+mv val/ILSVRC2012_val_00047769.JPEG n02342885/
+mv val/ILSVRC2012_val_00047770.JPEG n03929660/
+mv val/ILSVRC2012_val_00047771.JPEG n03032252/
+mv val/ILSVRC2012_val_00047772.JPEG n02992211/
+mv val/ILSVRC2012_val_00047773.JPEG n03658185/
+mv val/ILSVRC2012_val_00047774.JPEG n02777292/
+mv val/ILSVRC2012_val_00047775.JPEG n02879718/
+mv val/ILSVRC2012_val_00047776.JPEG n02319095/
+mv val/ILSVRC2012_val_00047777.JPEG n07760859/
+mv val/ILSVRC2012_val_00047778.JPEG n03888257/
+mv val/ILSVRC2012_val_00047779.JPEG n02910353/
+mv val/ILSVRC2012_val_00047780.JPEG n03868863/
+mv val/ILSVRC2012_val_00047781.JPEG n04133789/
+mv val/ILSVRC2012_val_00047782.JPEG n04136333/
+mv val/ILSVRC2012_val_00047783.JPEG n04356056/
+mv val/ILSVRC2012_val_00047784.JPEG n02028035/
+mv val/ILSVRC2012_val_00047785.JPEG n03000134/
+mv val/ILSVRC2012_val_00047786.JPEG n03355925/
+mv val/ILSVRC2012_val_00047787.JPEG n04326547/
+mv val/ILSVRC2012_val_00047788.JPEG n02494079/
+mv val/ILSVRC2012_val_00047789.JPEG n04099969/
+mv val/ILSVRC2012_val_00047790.JPEG n02966193/
+mv val/ILSVRC2012_val_00047791.JPEG n04147183/
+mv val/ILSVRC2012_val_00047792.JPEG n02966193/
+mv val/ILSVRC2012_val_00047793.JPEG n07697313/
+mv val/ILSVRC2012_val_00047794.JPEG n03877472/
+mv val/ILSVRC2012_val_00047795.JPEG n02486261/
+mv val/ILSVRC2012_val_00047796.JPEG n02510455/
+mv val/ILSVRC2012_val_00047797.JPEG n07720875/
+mv val/ILSVRC2012_val_00047798.JPEG n03764736/
+mv val/ILSVRC2012_val_00047799.JPEG n04239074/
+mv val/ILSVRC2012_val_00047800.JPEG n02443484/
+mv val/ILSVRC2012_val_00047801.JPEG n07720875/
+mv val/ILSVRC2012_val_00047802.JPEG n02840245/
+mv val/ILSVRC2012_val_00047803.JPEG n03782006/
+mv val/ILSVRC2012_val_00047804.JPEG n02119789/
+mv val/ILSVRC2012_val_00047805.JPEG n04328186/
+mv val/ILSVRC2012_val_00047806.JPEG n02417914/
+mv val/ILSVRC2012_val_00047807.JPEG n03216828/
+mv val/ILSVRC2012_val_00047808.JPEG n02108551/
+mv val/ILSVRC2012_val_00047809.JPEG n02013706/
+mv val/ILSVRC2012_val_00047810.JPEG n01734418/
+mv val/ILSVRC2012_val_00047811.JPEG n03729826/
+mv val/ILSVRC2012_val_00047812.JPEG n01689811/
+mv val/ILSVRC2012_val_00047813.JPEG n04522168/
+mv val/ILSVRC2012_val_00047814.JPEG n02422106/
+mv val/ILSVRC2012_val_00047815.JPEG n04004767/
+mv val/ILSVRC2012_val_00047816.JPEG n12620546/
+mv val/ILSVRC2012_val_00047817.JPEG n04041544/
+mv val/ILSVRC2012_val_00047818.JPEG n04116512/
+mv val/ILSVRC2012_val_00047819.JPEG n03478589/
+mv val/ILSVRC2012_val_00047820.JPEG n02174001/
+mv val/ILSVRC2012_val_00047821.JPEG n04486054/
+mv val/ILSVRC2012_val_00047822.JPEG n02107142/
+mv val/ILSVRC2012_val_00047823.JPEG n02422699/
+mv val/ILSVRC2012_val_00047824.JPEG n03400231/
+mv val/ILSVRC2012_val_00047825.JPEG n07930864/
+mv val/ILSVRC2012_val_00047826.JPEG n04200800/
+mv val/ILSVRC2012_val_00047827.JPEG n01582220/
+mv val/ILSVRC2012_val_00047828.JPEG n07753592/
+mv val/ILSVRC2012_val_00047829.JPEG n02690373/
+mv val/ILSVRC2012_val_00047830.JPEG n07880968/
+mv val/ILSVRC2012_val_00047831.JPEG n03958227/
+mv val/ILSVRC2012_val_00047832.JPEG n01665541/
+mv val/ILSVRC2012_val_00047833.JPEG n01847000/
+mv val/ILSVRC2012_val_00047834.JPEG n12768682/
+mv val/ILSVRC2012_val_00047835.JPEG n03478589/
+mv val/ILSVRC2012_val_00047836.JPEG n02091467/
+mv val/ILSVRC2012_val_00047837.JPEG n02787622/
+mv val/ILSVRC2012_val_00047838.JPEG n02776631/
+mv val/ILSVRC2012_val_00047839.JPEG n03000247/
+mv val/ILSVRC2012_val_00047840.JPEG n04074963/
+mv val/ILSVRC2012_val_00047841.JPEG n03743016/
+mv val/ILSVRC2012_val_00047842.JPEG n03325584/
+mv val/ILSVRC2012_val_00047843.JPEG n09246464/
+mv val/ILSVRC2012_val_00047844.JPEG n03871628/
+mv val/ILSVRC2012_val_00047845.JPEG n01740131/
+mv val/ILSVRC2012_val_00047846.JPEG n09288635/
+mv val/ILSVRC2012_val_00047847.JPEG n02730930/
+mv val/ILSVRC2012_val_00047848.JPEG n03884397/
+mv val/ILSVRC2012_val_00047849.JPEG n03775546/
+mv val/ILSVRC2012_val_00047850.JPEG n02114712/
+mv val/ILSVRC2012_val_00047851.JPEG n07718472/
+mv val/ILSVRC2012_val_00047852.JPEG n01728920/
+mv val/ILSVRC2012_val_00047853.JPEG n02494079/
+mv val/ILSVRC2012_val_00047854.JPEG n01774750/
+mv val/ILSVRC2012_val_00047855.JPEG n03967562/
+mv val/ILSVRC2012_val_00047856.JPEG n07718747/
+mv val/ILSVRC2012_val_00047857.JPEG n02906734/
+mv val/ILSVRC2012_val_00047858.JPEG n03444034/
+mv val/ILSVRC2012_val_00047859.JPEG n02408429/
+mv val/ILSVRC2012_val_00047860.JPEG n02319095/
+mv val/ILSVRC2012_val_00047861.JPEG n04330267/
+mv val/ILSVRC2012_val_00047862.JPEG n02113624/
+mv val/ILSVRC2012_val_00047863.JPEG n02231487/
+mv val/ILSVRC2012_val_00047864.JPEG n04141076/
+mv val/ILSVRC2012_val_00047865.JPEG n04552348/
+mv val/ILSVRC2012_val_00047866.JPEG n03759954/
+mv val/ILSVRC2012_val_00047867.JPEG n04120489/
+mv val/ILSVRC2012_val_00047868.JPEG n02869837/
+mv val/ILSVRC2012_val_00047869.JPEG n03838899/
+mv val/ILSVRC2012_val_00047870.JPEG n02268443/
+mv val/ILSVRC2012_val_00047871.JPEG n02321529/
+mv val/ILSVRC2012_val_00047872.JPEG n04023962/
+mv val/ILSVRC2012_val_00047873.JPEG n03843555/
+mv val/ILSVRC2012_val_00047874.JPEG n04525038/
+mv val/ILSVRC2012_val_00047875.JPEG n02361337/
+mv val/ILSVRC2012_val_00047876.JPEG n03924679/
+mv val/ILSVRC2012_val_00047877.JPEG n02236044/
+mv val/ILSVRC2012_val_00047878.JPEG n01530575/
+mv val/ILSVRC2012_val_00047879.JPEG n02877765/
+mv val/ILSVRC2012_val_00047880.JPEG n01980166/
+mv val/ILSVRC2012_val_00047881.JPEG n03777568/
+mv val/ILSVRC2012_val_00047882.JPEG n04008634/
+mv val/ILSVRC2012_val_00047883.JPEG n04579145/
+mv val/ILSVRC2012_val_00047884.JPEG n07873807/
+mv val/ILSVRC2012_val_00047885.JPEG n03207743/
+mv val/ILSVRC2012_val_00047886.JPEG n03970156/
+mv val/ILSVRC2012_val_00047887.JPEG n04254680/
+mv val/ILSVRC2012_val_00047888.JPEG n03345487/
+mv val/ILSVRC2012_val_00047889.JPEG n02454379/
+mv val/ILSVRC2012_val_00047890.JPEG n03110669/
+mv val/ILSVRC2012_val_00047891.JPEG n01980166/
+mv val/ILSVRC2012_val_00047892.JPEG n02536864/
+mv val/ILSVRC2012_val_00047893.JPEG n04285008/
+mv val/ILSVRC2012_val_00047894.JPEG n07684084/
+mv val/ILSVRC2012_val_00047895.JPEG n01924916/
+mv val/ILSVRC2012_val_00047896.JPEG n02108915/
+mv val/ILSVRC2012_val_00047897.JPEG n04074963/
+mv val/ILSVRC2012_val_00047898.JPEG n03837869/
+mv val/ILSVRC2012_val_00047899.JPEG n01882714/
+mv val/ILSVRC2012_val_00047900.JPEG n03873416/
+mv val/ILSVRC2012_val_00047901.JPEG n02169497/
+mv val/ILSVRC2012_val_00047902.JPEG n02687172/
+mv val/ILSVRC2012_val_00047903.JPEG n02268853/
+mv val/ILSVRC2012_val_00047904.JPEG n02906734/
+mv val/ILSVRC2012_val_00047905.JPEG n03018349/
+mv val/ILSVRC2012_val_00047906.JPEG n04310018/
+mv val/ILSVRC2012_val_00047907.JPEG n02978881/
+mv val/ILSVRC2012_val_00047908.JPEG n01693334/
+mv val/ILSVRC2012_val_00047909.JPEG n04542943/
+mv val/ILSVRC2012_val_00047910.JPEG n03770679/
+mv val/ILSVRC2012_val_00047911.JPEG n02123045/
+mv val/ILSVRC2012_val_00047912.JPEG n02974003/
+mv val/ILSVRC2012_val_00047913.JPEG n02086646/
+mv val/ILSVRC2012_val_00047914.JPEG n01530575/
+mv val/ILSVRC2012_val_00047915.JPEG n03786901/
+mv val/ILSVRC2012_val_00047916.JPEG n03710193/
+mv val/ILSVRC2012_val_00047917.JPEG n03388183/
+mv val/ILSVRC2012_val_00047918.JPEG n02112350/
+mv val/ILSVRC2012_val_00047919.JPEG n02113186/
+mv val/ILSVRC2012_val_00047920.JPEG n01883070/
+mv val/ILSVRC2012_val_00047921.JPEG n04552348/
+mv val/ILSVRC2012_val_00047922.JPEG n04344873/
+mv val/ILSVRC2012_val_00047923.JPEG n01773157/
+mv val/ILSVRC2012_val_00047924.JPEG n02109961/
+mv val/ILSVRC2012_val_00047925.JPEG n02123159/
+mv val/ILSVRC2012_val_00047926.JPEG n04404412/
+mv val/ILSVRC2012_val_00047927.JPEG n01917289/
+mv val/ILSVRC2012_val_00047928.JPEG n02169497/
+mv val/ILSVRC2012_val_00047929.JPEG n03899768/
+mv val/ILSVRC2012_val_00047930.JPEG n03697007/
+mv val/ILSVRC2012_val_00047931.JPEG n03874599/
+mv val/ILSVRC2012_val_00047932.JPEG n02669723/
+mv val/ILSVRC2012_val_00047933.JPEG n07717556/
+mv val/ILSVRC2012_val_00047934.JPEG n04147183/
+mv val/ILSVRC2012_val_00047935.JPEG n03424325/
+mv val/ILSVRC2012_val_00047936.JPEG n03498962/
+mv val/ILSVRC2012_val_00047937.JPEG n07715103/
+mv val/ILSVRC2012_val_00047938.JPEG n01632777/
+mv val/ILSVRC2012_val_00047939.JPEG n02264363/
+mv val/ILSVRC2012_val_00047940.JPEG n03018349/
+mv val/ILSVRC2012_val_00047941.JPEG n01669191/
+mv val/ILSVRC2012_val_00047942.JPEG n04204238/
+mv val/ILSVRC2012_val_00047943.JPEG n01829413/
+mv val/ILSVRC2012_val_00047944.JPEG n03785016/
+mv val/ILSVRC2012_val_00047945.JPEG n01871265/
+mv val/ILSVRC2012_val_00047946.JPEG n02992529/
+mv val/ILSVRC2012_val_00047947.JPEG n04127249/
+mv val/ILSVRC2012_val_00047948.JPEG n01774384/
+mv val/ILSVRC2012_val_00047949.JPEG n13040303/
+mv val/ILSVRC2012_val_00047950.JPEG n02090721/
+mv val/ILSVRC2012_val_00047951.JPEG n07615774/
+mv val/ILSVRC2012_val_00047952.JPEG n02231487/
+mv val/ILSVRC2012_val_00047953.JPEG n03126707/
+mv val/ILSVRC2012_val_00047954.JPEG n04399382/
+mv val/ILSVRC2012_val_00047955.JPEG n02127052/
+mv val/ILSVRC2012_val_00047956.JPEG n02480495/
+mv val/ILSVRC2012_val_00047957.JPEG n04357314/
+mv val/ILSVRC2012_val_00047958.JPEG n04597913/
+mv val/ILSVRC2012_val_00047959.JPEG n04311174/
+mv val/ILSVRC2012_val_00047960.JPEG n04376876/
+mv val/ILSVRC2012_val_00047961.JPEG n03344393/
+mv val/ILSVRC2012_val_00047962.JPEG n04146614/
+mv val/ILSVRC2012_val_00047963.JPEG n01622779/
+mv val/ILSVRC2012_val_00047964.JPEG n04325704/
+mv val/ILSVRC2012_val_00047965.JPEG n03527444/
+mv val/ILSVRC2012_val_00047966.JPEG n07753275/
+mv val/ILSVRC2012_val_00047967.JPEG n02422699/
+mv val/ILSVRC2012_val_00047968.JPEG n03759954/
+mv val/ILSVRC2012_val_00047969.JPEG n01824575/
+mv val/ILSVRC2012_val_00047970.JPEG n01704323/
+mv val/ILSVRC2012_val_00047971.JPEG n04067472/
+mv val/ILSVRC2012_val_00047972.JPEG n01872401/
+mv val/ILSVRC2012_val_00047973.JPEG n02114712/
+mv val/ILSVRC2012_val_00047974.JPEG n02979186/
+mv val/ILSVRC2012_val_00047975.JPEG n07615774/
+mv val/ILSVRC2012_val_00047976.JPEG n02094433/
+mv val/ILSVRC2012_val_00047977.JPEG n02106550/
+mv val/ILSVRC2012_val_00047978.JPEG n01930112/
+mv val/ILSVRC2012_val_00047979.JPEG n02086079/
+mv val/ILSVRC2012_val_00047980.JPEG n07754684/
+mv val/ILSVRC2012_val_00047981.JPEG n02088238/
+mv val/ILSVRC2012_val_00047982.JPEG n03764736/
+mv val/ILSVRC2012_val_00047983.JPEG n02077923/
+mv val/ILSVRC2012_val_00047984.JPEG n01770081/
+mv val/ILSVRC2012_val_00047985.JPEG n03763968/
+mv val/ILSVRC2012_val_00047986.JPEG n03544143/
+mv val/ILSVRC2012_val_00047987.JPEG n03777568/
+mv val/ILSVRC2012_val_00047988.JPEG n03706229/
+mv val/ILSVRC2012_val_00047989.JPEG n07871810/
+mv val/ILSVRC2012_val_00047990.JPEG n02100583/
+mv val/ILSVRC2012_val_00047991.JPEG n02096585/
+mv val/ILSVRC2012_val_00047992.JPEG n03538406/
+mv val/ILSVRC2012_val_00047993.JPEG n02794156/
+mv val/ILSVRC2012_val_00047994.JPEG n04325704/
+mv val/ILSVRC2012_val_00047995.JPEG n04127249/
+mv val/ILSVRC2012_val_00047996.JPEG n02277742/
+mv val/ILSVRC2012_val_00047997.JPEG n03314780/
+mv val/ILSVRC2012_val_00047998.JPEG n13037406/
+mv val/ILSVRC2012_val_00047999.JPEG n02607072/
+mv val/ILSVRC2012_val_00048000.JPEG n07720875/
+mv val/ILSVRC2012_val_00048001.JPEG n02277742/
+mv val/ILSVRC2012_val_00048002.JPEG n02412080/
+mv val/ILSVRC2012_val_00048003.JPEG n13054560/
+mv val/ILSVRC2012_val_00048004.JPEG n02865351/
+mv val/ILSVRC2012_val_00048005.JPEG n03467068/
+mv val/ILSVRC2012_val_00048006.JPEG n03891251/
+mv val/ILSVRC2012_val_00048007.JPEG n02089973/
+mv val/ILSVRC2012_val_00048008.JPEG n02002724/
+mv val/ILSVRC2012_val_00048009.JPEG n02017213/
+mv val/ILSVRC2012_val_00048010.JPEG n02917067/
+mv val/ILSVRC2012_val_00048011.JPEG n01665541/
+mv val/ILSVRC2012_val_00048012.JPEG n07714990/
+mv val/ILSVRC2012_val_00048013.JPEG n03372029/
+mv val/ILSVRC2012_val_00048014.JPEG n03584254/
+mv val/ILSVRC2012_val_00048015.JPEG n03662601/
+mv val/ILSVRC2012_val_00048016.JPEG n03337140/
+mv val/ILSVRC2012_val_00048017.JPEG n02692877/
+mv val/ILSVRC2012_val_00048018.JPEG n02110627/
+mv val/ILSVRC2012_val_00048019.JPEG n04201297/
+mv val/ILSVRC2012_val_00048020.JPEG n04154565/
+mv val/ILSVRC2012_val_00048021.JPEG n03637318/
+mv val/ILSVRC2012_val_00048022.JPEG n03255030/
+mv val/ILSVRC2012_val_00048023.JPEG n07745940/
+mv val/ILSVRC2012_val_00048024.JPEG n02056570/
+mv val/ILSVRC2012_val_00048025.JPEG n03895866/
+mv val/ILSVRC2012_val_00048026.JPEG n02169497/
+mv val/ILSVRC2012_val_00048027.JPEG n01818515/
+mv val/ILSVRC2012_val_00048028.JPEG n04493381/
+mv val/ILSVRC2012_val_00048029.JPEG n03041632/
+mv val/ILSVRC2012_val_00048030.JPEG n02110627/
+mv val/ILSVRC2012_val_00048031.JPEG n04553703/
+mv val/ILSVRC2012_val_00048032.JPEG n02099429/
+mv val/ILSVRC2012_val_00048033.JPEG n09428293/
+mv val/ILSVRC2012_val_00048034.JPEG n03495258/
+mv val/ILSVRC2012_val_00048035.JPEG n02483708/
+mv val/ILSVRC2012_val_00048036.JPEG n04336792/
+mv val/ILSVRC2012_val_00048037.JPEG n02825657/
+mv val/ILSVRC2012_val_00048038.JPEG n03891251/
+mv val/ILSVRC2012_val_00048039.JPEG n01860187/
+mv val/ILSVRC2012_val_00048040.JPEG n09472597/
+mv val/ILSVRC2012_val_00048041.JPEG n01753488/
+mv val/ILSVRC2012_val_00048042.JPEG n04540053/
+mv val/ILSVRC2012_val_00048043.JPEG n02895154/
+mv val/ILSVRC2012_val_00048044.JPEG n02321529/
+mv val/ILSVRC2012_val_00048045.JPEG n03259280/
+mv val/ILSVRC2012_val_00048046.JPEG n01630670/
+mv val/ILSVRC2012_val_00048047.JPEG n03000134/
+mv val/ILSVRC2012_val_00048048.JPEG n03866082/
+mv val/ILSVRC2012_val_00048049.JPEG n01514859/
+mv val/ILSVRC2012_val_00048050.JPEG n07873807/
+mv val/ILSVRC2012_val_00048051.JPEG n02105056/
+mv val/ILSVRC2012_val_00048052.JPEG n01978455/
+mv val/ILSVRC2012_val_00048053.JPEG n02009912/
+mv val/ILSVRC2012_val_00048054.JPEG n03794056/
+mv val/ILSVRC2012_val_00048055.JPEG n03720891/
+mv val/ILSVRC2012_val_00048056.JPEG n03995372/
+mv val/ILSVRC2012_val_00048057.JPEG n02869837/
+mv val/ILSVRC2012_val_00048058.JPEG n02169497/
+mv val/ILSVRC2012_val_00048059.JPEG n03425413/
+mv val/ILSVRC2012_val_00048060.JPEG n04355338/
+mv val/ILSVRC2012_val_00048061.JPEG n02977058/
+mv val/ILSVRC2012_val_00048062.JPEG n02916936/
+mv val/ILSVRC2012_val_00048063.JPEG n03840681/
+mv val/ILSVRC2012_val_00048064.JPEG n04560804/
+mv val/ILSVRC2012_val_00048065.JPEG n03042490/
+mv val/ILSVRC2012_val_00048066.JPEG n07734744/
+mv val/ILSVRC2012_val_00048067.JPEG n03706229/
+mv val/ILSVRC2012_val_00048068.JPEG n01774384/
+mv val/ILSVRC2012_val_00048069.JPEG n03530642/
+mv val/ILSVRC2012_val_00048070.JPEG n02346627/
+mv val/ILSVRC2012_val_00048071.JPEG n02105251/
+mv val/ILSVRC2012_val_00048072.JPEG n02229544/
+mv val/ILSVRC2012_val_00048073.JPEG n04522168/
+mv val/ILSVRC2012_val_00048074.JPEG n03535780/
+mv val/ILSVRC2012_val_00048075.JPEG n02105505/
+mv val/ILSVRC2012_val_00048076.JPEG n02168699/
+mv val/ILSVRC2012_val_00048077.JPEG n02138441/
+mv val/ILSVRC2012_val_00048078.JPEG n04131690/
+mv val/ILSVRC2012_val_00048079.JPEG n02172182/
+mv val/ILSVRC2012_val_00048080.JPEG n02111129/
+mv val/ILSVRC2012_val_00048081.JPEG n02776631/
+mv val/ILSVRC2012_val_00048082.JPEG n03785016/
+mv val/ILSVRC2012_val_00048083.JPEG n03895866/
+mv val/ILSVRC2012_val_00048084.JPEG n02457408/
+mv val/ILSVRC2012_val_00048085.JPEG n03146219/
+mv val/ILSVRC2012_val_00048086.JPEG n02134084/
+mv val/ILSVRC2012_val_00048087.JPEG n02097130/
+mv val/ILSVRC2012_val_00048088.JPEG n02361337/
+mv val/ILSVRC2012_val_00048089.JPEG n07720875/
+mv val/ILSVRC2012_val_00048090.JPEG n01871265/
+mv val/ILSVRC2012_val_00048091.JPEG n02231487/
+mv val/ILSVRC2012_val_00048092.JPEG n07717556/
+mv val/ILSVRC2012_val_00048093.JPEG n04328186/
+mv val/ILSVRC2012_val_00048094.JPEG n04317175/
+mv val/ILSVRC2012_val_00048095.JPEG n03065424/
+mv val/ILSVRC2012_val_00048096.JPEG n02442845/
+mv val/ILSVRC2012_val_00048097.JPEG n03729826/
+mv val/ILSVRC2012_val_00048098.JPEG n02892201/
+mv val/ILSVRC2012_val_00048099.JPEG n02489166/
+mv val/ILSVRC2012_val_00048100.JPEG n03721384/
+mv val/ILSVRC2012_val_00048101.JPEG n02096437/
+mv val/ILSVRC2012_val_00048102.JPEG n02093647/
+mv val/ILSVRC2012_val_00048103.JPEG n03376595/
+mv val/ILSVRC2012_val_00048104.JPEG n01692333/
+mv val/ILSVRC2012_val_00048105.JPEG n02134084/
+mv val/ILSVRC2012_val_00048106.JPEG n01978287/
+mv val/ILSVRC2012_val_00048107.JPEG n01592084/
+mv val/ILSVRC2012_val_00048108.JPEG n02504458/
+mv val/ILSVRC2012_val_00048109.JPEG n03544143/
+mv val/ILSVRC2012_val_00048110.JPEG n04039381/
+mv val/ILSVRC2012_val_00048111.JPEG n02690373/
+mv val/ILSVRC2012_val_00048112.JPEG n01756291/
+mv val/ILSVRC2012_val_00048113.JPEG n03814639/
+mv val/ILSVRC2012_val_00048114.JPEG n03443371/
+mv val/ILSVRC2012_val_00048115.JPEG n03633091/
+mv val/ILSVRC2012_val_00048116.JPEG n02066245/
+mv val/ILSVRC2012_val_00048117.JPEG n03868242/
+mv val/ILSVRC2012_val_00048118.JPEG n02133161/
+mv val/ILSVRC2012_val_00048119.JPEG n01496331/
+mv val/ILSVRC2012_val_00048120.JPEG n02108915/
+mv val/ILSVRC2012_val_00048121.JPEG n03325584/
+mv val/ILSVRC2012_val_00048122.JPEG n03372029/
+mv val/ILSVRC2012_val_00048123.JPEG n02085782/
+mv val/ILSVRC2012_val_00048124.JPEG n04026417/
+mv val/ILSVRC2012_val_00048125.JPEG n02111500/
+mv val/ILSVRC2012_val_00048126.JPEG n03482405/
+mv val/ILSVRC2012_val_00048127.JPEG n04149813/
+mv val/ILSVRC2012_val_00048128.JPEG n02108551/
+mv val/ILSVRC2012_val_00048129.JPEG n03337140/
+mv val/ILSVRC2012_val_00048130.JPEG n03970156/
+mv val/ILSVRC2012_val_00048131.JPEG n02443484/
+mv val/ILSVRC2012_val_00048132.JPEG n03657121/
+mv val/ILSVRC2012_val_00048133.JPEG n03633091/
+mv val/ILSVRC2012_val_00048134.JPEG n01675722/
+mv val/ILSVRC2012_val_00048135.JPEG n02965783/
+mv val/ILSVRC2012_val_00048136.JPEG n03908714/
+mv val/ILSVRC2012_val_00048137.JPEG n03777754/
+mv val/ILSVRC2012_val_00048138.JPEG n03394916/
+mv val/ILSVRC2012_val_00048139.JPEG n06794110/
+mv val/ILSVRC2012_val_00048140.JPEG n02492660/
+mv val/ILSVRC2012_val_00048141.JPEG n02099429/
+mv val/ILSVRC2012_val_00048142.JPEG n01828970/
+mv val/ILSVRC2012_val_00048143.JPEG n04404412/
+mv val/ILSVRC2012_val_00048144.JPEG n01532829/
+mv val/ILSVRC2012_val_00048145.JPEG n02109047/
+mv val/ILSVRC2012_val_00048146.JPEG n07768694/
+mv val/ILSVRC2012_val_00048147.JPEG n02104365/
+mv val/ILSVRC2012_val_00048148.JPEG n01632777/
+mv val/ILSVRC2012_val_00048149.JPEG n02794156/
+mv val/ILSVRC2012_val_00048150.JPEG n02807133/
+mv val/ILSVRC2012_val_00048151.JPEG n07615774/
+mv val/ILSVRC2012_val_00048152.JPEG n01532829/
+mv val/ILSVRC2012_val_00048153.JPEG n13040303/
+mv val/ILSVRC2012_val_00048154.JPEG n04149813/
+mv val/ILSVRC2012_val_00048155.JPEG n01828970/
+mv val/ILSVRC2012_val_00048156.JPEG n03345487/
+mv val/ILSVRC2012_val_00048157.JPEG n02096585/
+mv val/ILSVRC2012_val_00048158.JPEG n03291819/
+mv val/ILSVRC2012_val_00048159.JPEG n07754684/
+mv val/ILSVRC2012_val_00048160.JPEG n02123597/
+mv val/ILSVRC2012_val_00048161.JPEG n04266014/
+mv val/ILSVRC2012_val_00048162.JPEG n02114855/
+mv val/ILSVRC2012_val_00048163.JPEG n02018207/
+mv val/ILSVRC2012_val_00048164.JPEG n04532106/
+mv val/ILSVRC2012_val_00048165.JPEG n04579432/
+mv val/ILSVRC2012_val_00048166.JPEG n09246464/
+mv val/ILSVRC2012_val_00048167.JPEG n02088364/
+mv val/ILSVRC2012_val_00048168.JPEG n07615774/
+mv val/ILSVRC2012_val_00048169.JPEG n04487394/
+mv val/ILSVRC2012_val_00048170.JPEG n04612504/
+mv val/ILSVRC2012_val_00048171.JPEG n07613480/
+mv val/ILSVRC2012_val_00048172.JPEG n02058221/
+mv val/ILSVRC2012_val_00048173.JPEG n03980874/
+mv val/ILSVRC2012_val_00048174.JPEG n02134418/
+mv val/ILSVRC2012_val_00048175.JPEG n01622779/
+mv val/ILSVRC2012_val_00048176.JPEG n04209239/
+mv val/ILSVRC2012_val_00048177.JPEG n02692877/
+mv val/ILSVRC2012_val_00048178.JPEG n01560419/
+mv val/ILSVRC2012_val_00048179.JPEG n02870880/
+mv val/ILSVRC2012_val_00048180.JPEG n03445924/
+mv val/ILSVRC2012_val_00048181.JPEG n02117135/
+mv val/ILSVRC2012_val_00048182.JPEG n04356056/
+mv val/ILSVRC2012_val_00048183.JPEG n02097047/
+mv val/ILSVRC2012_val_00048184.JPEG n02281406/
+mv val/ILSVRC2012_val_00048185.JPEG n04243546/
+mv val/ILSVRC2012_val_00048186.JPEG n02129604/
+mv val/ILSVRC2012_val_00048187.JPEG n02395406/
+mv val/ILSVRC2012_val_00048188.JPEG n02089973/
+mv val/ILSVRC2012_val_00048189.JPEG n09332890/
+mv val/ILSVRC2012_val_00048190.JPEG n07747607/
+mv val/ILSVRC2012_val_00048191.JPEG n09246464/
+mv val/ILSVRC2012_val_00048192.JPEG n04417672/
+mv val/ILSVRC2012_val_00048193.JPEG n02859443/
+mv val/ILSVRC2012_val_00048194.JPEG n02105251/
+mv val/ILSVRC2012_val_00048195.JPEG n02012849/
+mv val/ILSVRC2012_val_00048196.JPEG n03724870/
+mv val/ILSVRC2012_val_00048197.JPEG n04562935/
+mv val/ILSVRC2012_val_00048198.JPEG n02790996/
+mv val/ILSVRC2012_val_00048199.JPEG n02825657/
+mv val/ILSVRC2012_val_00048200.JPEG n02510455/
+mv val/ILSVRC2012_val_00048201.JPEG n03884397/
+mv val/ILSVRC2012_val_00048202.JPEG n04069434/
+mv val/ILSVRC2012_val_00048203.JPEG n01843383/
+mv val/ILSVRC2012_val_00048204.JPEG n01440764/
+mv val/ILSVRC2012_val_00048205.JPEG n02909870/
+mv val/ILSVRC2012_val_00048206.JPEG n04344873/
+mv val/ILSVRC2012_val_00048207.JPEG n13054560/
+mv val/ILSVRC2012_val_00048208.JPEG n03976657/
+mv val/ILSVRC2012_val_00048209.JPEG n04270147/
+mv val/ILSVRC2012_val_00048210.JPEG n02804610/
+mv val/ILSVRC2012_val_00048211.JPEG n03792972/
+mv val/ILSVRC2012_val_00048212.JPEG n01704323/
+mv val/ILSVRC2012_val_00048213.JPEG n01689811/
+mv val/ILSVRC2012_val_00048214.JPEG n03908714/
+mv val/ILSVRC2012_val_00048215.JPEG n03062245/
+mv val/ILSVRC2012_val_00048216.JPEG n03376595/
+mv val/ILSVRC2012_val_00048217.JPEG n02442845/
+mv val/ILSVRC2012_val_00048218.JPEG n04589890/
+mv val/ILSVRC2012_val_00048219.JPEG n02114855/
+mv val/ILSVRC2012_val_00048220.JPEG n04465501/
+mv val/ILSVRC2012_val_00048221.JPEG n01664065/
+mv val/ILSVRC2012_val_00048222.JPEG n07711569/
+mv val/ILSVRC2012_val_00048223.JPEG n02457408/
+mv val/ILSVRC2012_val_00048224.JPEG n02165105/
+mv val/ILSVRC2012_val_00048225.JPEG n02389026/
+mv val/ILSVRC2012_val_00048226.JPEG n03207743/
+mv val/ILSVRC2012_val_00048227.JPEG n04081281/
+mv val/ILSVRC2012_val_00048228.JPEG n04458633/
+mv val/ILSVRC2012_val_00048229.JPEG n01843065/
+mv val/ILSVRC2012_val_00048230.JPEG n04335435/
+mv val/ILSVRC2012_val_00048231.JPEG n03444034/
+mv val/ILSVRC2012_val_00048232.JPEG n04311174/
+mv val/ILSVRC2012_val_00048233.JPEG n02128385/
+mv val/ILSVRC2012_val_00048234.JPEG n01819313/
+mv val/ILSVRC2012_val_00048235.JPEG n02098413/
+mv val/ILSVRC2012_val_00048236.JPEG n02110341/
+mv val/ILSVRC2012_val_00048237.JPEG n06874185/
+mv val/ILSVRC2012_val_00048238.JPEG n02098413/
+mv val/ILSVRC2012_val_00048239.JPEG n02007558/
+mv val/ILSVRC2012_val_00048240.JPEG n02077923/
+mv val/ILSVRC2012_val_00048241.JPEG n04461696/
+mv val/ILSVRC2012_val_00048242.JPEG n01514859/
+mv val/ILSVRC2012_val_00048243.JPEG n03388549/
+mv val/ILSVRC2012_val_00048244.JPEG n03447721/
+mv val/ILSVRC2012_val_00048245.JPEG n03207743/
+mv val/ILSVRC2012_val_00048246.JPEG n02443114/
+mv val/ILSVRC2012_val_00048247.JPEG n01664065/
+mv val/ILSVRC2012_val_00048248.JPEG n03825788/
+mv val/ILSVRC2012_val_00048249.JPEG n02799071/
+mv val/ILSVRC2012_val_00048250.JPEG n01753488/
+mv val/ILSVRC2012_val_00048251.JPEG n03642806/
+mv val/ILSVRC2012_val_00048252.JPEG n01847000/
+mv val/ILSVRC2012_val_00048253.JPEG n09421951/
+mv val/ILSVRC2012_val_00048254.JPEG n02086910/
+mv val/ILSVRC2012_val_00048255.JPEG n02441942/
+mv val/ILSVRC2012_val_00048256.JPEG n03141823/
+mv val/ILSVRC2012_val_00048257.JPEG n01664065/
+mv val/ILSVRC2012_val_00048258.JPEG n03642806/
+mv val/ILSVRC2012_val_00048259.JPEG n02364673/
+mv val/ILSVRC2012_val_00048260.JPEG n03884397/
+mv val/ILSVRC2012_val_00048261.JPEG n02033041/
+mv val/ILSVRC2012_val_00048262.JPEG n04019541/
+mv val/ILSVRC2012_val_00048263.JPEG n04266014/
+mv val/ILSVRC2012_val_00048264.JPEG n07749582/
+mv val/ILSVRC2012_val_00048265.JPEG n01818515/
+mv val/ILSVRC2012_val_00048266.JPEG n02415577/
+mv val/ILSVRC2012_val_00048267.JPEG n02804414/
+mv val/ILSVRC2012_val_00048268.JPEG n04599235/
+mv val/ILSVRC2012_val_00048269.JPEG n01910747/
+mv val/ILSVRC2012_val_00048270.JPEG n02965783/
+mv val/ILSVRC2012_val_00048271.JPEG n04111531/
+mv val/ILSVRC2012_val_00048272.JPEG n03794056/
+mv val/ILSVRC2012_val_00048273.JPEG n02088364/
+mv val/ILSVRC2012_val_00048274.JPEG n03733805/
+mv val/ILSVRC2012_val_00048275.JPEG n02497673/
+mv val/ILSVRC2012_val_00048276.JPEG n04296562/
+mv val/ILSVRC2012_val_00048277.JPEG n01983481/
+mv val/ILSVRC2012_val_00048278.JPEG n04041544/
+mv val/ILSVRC2012_val_00048279.JPEG n07892512/
+mv val/ILSVRC2012_val_00048280.JPEG n02085936/
+mv val/ILSVRC2012_val_00048281.JPEG n03929855/
+mv val/ILSVRC2012_val_00048282.JPEG n02396427/
+mv val/ILSVRC2012_val_00048283.JPEG n03854065/
+mv val/ILSVRC2012_val_00048284.JPEG n02802426/
+mv val/ILSVRC2012_val_00048285.JPEG n01751748/
+mv val/ILSVRC2012_val_00048286.JPEG n01632458/
+mv val/ILSVRC2012_val_00048287.JPEG n03207941/
+mv val/ILSVRC2012_val_00048288.JPEG n02110627/
+mv val/ILSVRC2012_val_00048289.JPEG n04554684/
+mv val/ILSVRC2012_val_00048290.JPEG n03729826/
+mv val/ILSVRC2012_val_00048291.JPEG n02480495/
+mv val/ILSVRC2012_val_00048292.JPEG n01914609/
+mv val/ILSVRC2012_val_00048293.JPEG n04200800/
+mv val/ILSVRC2012_val_00048294.JPEG n02480495/
+mv val/ILSVRC2012_val_00048295.JPEG n01630670/
+mv val/ILSVRC2012_val_00048296.JPEG n03825788/
+mv val/ILSVRC2012_val_00048297.JPEG n04458633/
+mv val/ILSVRC2012_val_00048298.JPEG n07754684/
+mv val/ILSVRC2012_val_00048299.JPEG n01756291/
+mv val/ILSVRC2012_val_00048300.JPEG n02807133/
+mv val/ILSVRC2012_val_00048301.JPEG n02099712/
+mv val/ILSVRC2012_val_00048302.JPEG n03223299/
+mv val/ILSVRC2012_val_00048303.JPEG n03394916/
+mv val/ILSVRC2012_val_00048304.JPEG n02100735/
+mv val/ILSVRC2012_val_00048305.JPEG n04548362/
+mv val/ILSVRC2012_val_00048306.JPEG n01774750/
+mv val/ILSVRC2012_val_00048307.JPEG n03085013/
+mv val/ILSVRC2012_val_00048308.JPEG n02974003/
+mv val/ILSVRC2012_val_00048309.JPEG n04004767/
+mv val/ILSVRC2012_val_00048310.JPEG n02111129/
+mv val/ILSVRC2012_val_00048311.JPEG n02113799/
+mv val/ILSVRC2012_val_00048312.JPEG n02963159/
+mv val/ILSVRC2012_val_00048313.JPEG n04275548/
+mv val/ILSVRC2012_val_00048314.JPEG n06874185/
+mv val/ILSVRC2012_val_00048315.JPEG n02105855/
+mv val/ILSVRC2012_val_00048316.JPEG n03710193/
+mv val/ILSVRC2012_val_00048317.JPEG n02916936/
+mv val/ILSVRC2012_val_00048318.JPEG n03125729/
+mv val/ILSVRC2012_val_00048319.JPEG n04209239/
+mv val/ILSVRC2012_val_00048320.JPEG n04033995/
+mv val/ILSVRC2012_val_00048321.JPEG n07930864/
+mv val/ILSVRC2012_val_00048322.JPEG n03443371/
+mv val/ILSVRC2012_val_00048323.JPEG n04604644/
+mv val/ILSVRC2012_val_00048324.JPEG n03788195/
+mv val/ILSVRC2012_val_00048325.JPEG n04238763/
+mv val/ILSVRC2012_val_00048326.JPEG n02174001/
+mv val/ILSVRC2012_val_00048327.JPEG n03637318/
+mv val/ILSVRC2012_val_00048328.JPEG n07615774/
+mv val/ILSVRC2012_val_00048329.JPEG n04200800/
+mv val/ILSVRC2012_val_00048330.JPEG n02107142/
+mv val/ILSVRC2012_val_00048331.JPEG n03709823/
+mv val/ILSVRC2012_val_00048332.JPEG n03786901/
+mv val/ILSVRC2012_val_00048333.JPEG n02086079/
+mv val/ILSVRC2012_val_00048334.JPEG n03201208/
+mv val/ILSVRC2012_val_00048335.JPEG n03000684/
+mv val/ILSVRC2012_val_00048336.JPEG n04099969/
+mv val/ILSVRC2012_val_00048337.JPEG n02102480/
+mv val/ILSVRC2012_val_00048338.JPEG n01950731/
+mv val/ILSVRC2012_val_00048339.JPEG n07753113/
+mv val/ILSVRC2012_val_00048340.JPEG n02013706/
+mv val/ILSVRC2012_val_00048341.JPEG n04536866/
+mv val/ILSVRC2012_val_00048342.JPEG n02423022/
+mv val/ILSVRC2012_val_00048343.JPEG n02687172/
+mv val/ILSVRC2012_val_00048344.JPEG n04208210/
+mv val/ILSVRC2012_val_00048345.JPEG n04596742/
+mv val/ILSVRC2012_val_00048346.JPEG n02051845/
+mv val/ILSVRC2012_val_00048347.JPEG n01833805/
+mv val/ILSVRC2012_val_00048348.JPEG n02058221/
+mv val/ILSVRC2012_val_00048349.JPEG n03344393/
+mv val/ILSVRC2012_val_00048350.JPEG n03857828/
+mv val/ILSVRC2012_val_00048351.JPEG n01978287/
+mv val/ILSVRC2012_val_00048352.JPEG n04118538/
+mv val/ILSVRC2012_val_00048353.JPEG n03976657/
+mv val/ILSVRC2012_val_00048354.JPEG n03717622/
+mv val/ILSVRC2012_val_00048355.JPEG n02097130/
+mv val/ILSVRC2012_val_00048356.JPEG n09399592/
+mv val/ILSVRC2012_val_00048357.JPEG n01768244/
+mv val/ILSVRC2012_val_00048358.JPEG n02317335/
+mv val/ILSVRC2012_val_00048359.JPEG n04204238/
+mv val/ILSVRC2012_val_00048360.JPEG n01580077/
+mv val/ILSVRC2012_val_00048361.JPEG n02097298/
+mv val/ILSVRC2012_val_00048362.JPEG n03673027/
+mv val/ILSVRC2012_val_00048363.JPEG n02013706/
+mv val/ILSVRC2012_val_00048364.JPEG n02105251/
+mv val/ILSVRC2012_val_00048365.JPEG n07697313/
+mv val/ILSVRC2012_val_00048366.JPEG n03980874/
+mv val/ILSVRC2012_val_00048367.JPEG n02804610/
+mv val/ILSVRC2012_val_00048368.JPEG n02125311/
+mv val/ILSVRC2012_val_00048369.JPEG n03781244/
+mv val/ILSVRC2012_val_00048370.JPEG n02095570/
+mv val/ILSVRC2012_val_00048371.JPEG n03344393/
+mv val/ILSVRC2012_val_00048372.JPEG n02408429/
+mv val/ILSVRC2012_val_00048373.JPEG n02110627/
+mv val/ILSVRC2012_val_00048374.JPEG n02807133/
+mv val/ILSVRC2012_val_00048375.JPEG n02129604/
+mv val/ILSVRC2012_val_00048376.JPEG n04332243/
+mv val/ILSVRC2012_val_00048377.JPEG n04398044/
+mv val/ILSVRC2012_val_00048378.JPEG n13044778/
+mv val/ILSVRC2012_val_00048379.JPEG n02098413/
+mv val/ILSVRC2012_val_00048380.JPEG n02129604/
+mv val/ILSVRC2012_val_00048381.JPEG n03763968/
+mv val/ILSVRC2012_val_00048382.JPEG n03028079/
+mv val/ILSVRC2012_val_00048383.JPEG n02108000/
+mv val/ILSVRC2012_val_00048384.JPEG n03825788/
+mv val/ILSVRC2012_val_00048385.JPEG n02116738/
+mv val/ILSVRC2012_val_00048386.JPEG n04344873/
+mv val/ILSVRC2012_val_00048387.JPEG n03924679/
+mv val/ILSVRC2012_val_00048388.JPEG n02486261/
+mv val/ILSVRC2012_val_00048389.JPEG n02667093/
+mv val/ILSVRC2012_val_00048390.JPEG n03584254/
+mv val/ILSVRC2012_val_00048391.JPEG n04554684/
+mv val/ILSVRC2012_val_00048392.JPEG n07932039/
+mv val/ILSVRC2012_val_00048393.JPEG n01872401/
+mv val/ILSVRC2012_val_00048394.JPEG n02128757/
+mv val/ILSVRC2012_val_00048395.JPEG n02966687/
+mv val/ILSVRC2012_val_00048396.JPEG n02101556/
+mv val/ILSVRC2012_val_00048397.JPEG n03207941/
+mv val/ILSVRC2012_val_00048398.JPEG n04476259/
+mv val/ILSVRC2012_val_00048399.JPEG n07684084/
+mv val/ILSVRC2012_val_00048400.JPEG n02109525/
+mv val/ILSVRC2012_val_00048401.JPEG n02268443/
+mv val/ILSVRC2012_val_00048402.JPEG n03793489/
+mv val/ILSVRC2012_val_00048403.JPEG n02106662/
+mv val/ILSVRC2012_val_00048404.JPEG n04335435/
+mv val/ILSVRC2012_val_00048405.JPEG n03146219/
+mv val/ILSVRC2012_val_00048406.JPEG n01774384/
+mv val/ILSVRC2012_val_00048407.JPEG n03980874/
+mv val/ILSVRC2012_val_00048408.JPEG n01930112/
+mv val/ILSVRC2012_val_00048409.JPEG n03485794/
+mv val/ILSVRC2012_val_00048410.JPEG n03710193/
+mv val/ILSVRC2012_val_00048411.JPEG n04525305/
+mv val/ILSVRC2012_val_00048412.JPEG n03916031/
+mv val/ILSVRC2012_val_00048413.JPEG n07565083/
+mv val/ILSVRC2012_val_00048414.JPEG n02264363/
+mv val/ILSVRC2012_val_00048415.JPEG n03676483/
+mv val/ILSVRC2012_val_00048416.JPEG n04235860/
+mv val/ILSVRC2012_val_00048417.JPEG n02808304/
+mv val/ILSVRC2012_val_00048418.JPEG n03796401/
+mv val/ILSVRC2012_val_00048419.JPEG n12620546/
+mv val/ILSVRC2012_val_00048420.JPEG n02098286/
+mv val/ILSVRC2012_val_00048421.JPEG n02091831/
+mv val/ILSVRC2012_val_00048422.JPEG n02319095/
+mv val/ILSVRC2012_val_00048423.JPEG n02264363/
+mv val/ILSVRC2012_val_00048424.JPEG n04317175/
+mv val/ILSVRC2012_val_00048425.JPEG n04120489/
+mv val/ILSVRC2012_val_00048426.JPEG n02788148/
+mv val/ILSVRC2012_val_00048427.JPEG n02110341/
+mv val/ILSVRC2012_val_00048428.JPEG n04252077/
+mv val/ILSVRC2012_val_00048429.JPEG n07715103/
+mv val/ILSVRC2012_val_00048430.JPEG n04540053/
+mv val/ILSVRC2012_val_00048431.JPEG n03016953/
+mv val/ILSVRC2012_val_00048432.JPEG n02091244/
+mv val/ILSVRC2012_val_00048433.JPEG n02640242/
+mv val/ILSVRC2012_val_00048434.JPEG n04612504/
+mv val/ILSVRC2012_val_00048435.JPEG n03000134/
+mv val/ILSVRC2012_val_00048436.JPEG n02112706/
+mv val/ILSVRC2012_val_00048437.JPEG n01532829/
+mv val/ILSVRC2012_val_00048438.JPEG n02115913/
+mv val/ILSVRC2012_val_00048439.JPEG n02101556/
+mv val/ILSVRC2012_val_00048440.JPEG n02119789/
+mv val/ILSVRC2012_val_00048441.JPEG n04252225/
+mv val/ILSVRC2012_val_00048442.JPEG n03492542/
+mv val/ILSVRC2012_val_00048443.JPEG n03272010/
+mv val/ILSVRC2012_val_00048444.JPEG n03770679/
+mv val/ILSVRC2012_val_00048445.JPEG n01629819/
+mv val/ILSVRC2012_val_00048446.JPEG n04517823/
+mv val/ILSVRC2012_val_00048447.JPEG n04366367/
+mv val/ILSVRC2012_val_00048448.JPEG n02410509/
+mv val/ILSVRC2012_val_00048449.JPEG n03623198/
+mv val/ILSVRC2012_val_00048450.JPEG n03777754/
+mv val/ILSVRC2012_val_00048451.JPEG n03899768/
+mv val/ILSVRC2012_val_00048452.JPEG n04367480/
+mv val/ILSVRC2012_val_00048453.JPEG n04525305/
+mv val/ILSVRC2012_val_00048454.JPEG n03208938/
+mv val/ILSVRC2012_val_00048455.JPEG n02951358/
+mv val/ILSVRC2012_val_00048456.JPEG n03110669/
+mv val/ILSVRC2012_val_00048457.JPEG n04483307/
+mv val/ILSVRC2012_val_00048458.JPEG n04517823/
+mv val/ILSVRC2012_val_00048459.JPEG n02422699/
+mv val/ILSVRC2012_val_00048460.JPEG n04509417/
+mv val/ILSVRC2012_val_00048461.JPEG n03590841/
+mv val/ILSVRC2012_val_00048462.JPEG n09332890/
+mv val/ILSVRC2012_val_00048463.JPEG n01629819/
+mv val/ILSVRC2012_val_00048464.JPEG n04557648/
+mv val/ILSVRC2012_val_00048465.JPEG n09421951/
+mv val/ILSVRC2012_val_00048466.JPEG n13052670/
+mv val/ILSVRC2012_val_00048467.JPEG n01677366/
+mv val/ILSVRC2012_val_00048468.JPEG n02058221/
+mv val/ILSVRC2012_val_00048469.JPEG n02102318/
+mv val/ILSVRC2012_val_00048470.JPEG n03126707/
+mv val/ILSVRC2012_val_00048471.JPEG n04548280/
+mv val/ILSVRC2012_val_00048472.JPEG n03187595/
+mv val/ILSVRC2012_val_00048473.JPEG n02966687/
+mv val/ILSVRC2012_val_00048474.JPEG n03938244/
+mv val/ILSVRC2012_val_00048475.JPEG n02486261/
+mv val/ILSVRC2012_val_00048476.JPEG n02096177/
+mv val/ILSVRC2012_val_00048477.JPEG n02165105/
+mv val/ILSVRC2012_val_00048478.JPEG n02979186/
+mv val/ILSVRC2012_val_00048479.JPEG n04310018/
+mv val/ILSVRC2012_val_00048480.JPEG n01669191/
+mv val/ILSVRC2012_val_00048481.JPEG n04356056/
+mv val/ILSVRC2012_val_00048482.JPEG n01644373/
+mv val/ILSVRC2012_val_00048483.JPEG n03676483/
+mv val/ILSVRC2012_val_00048484.JPEG n04311174/
+mv val/ILSVRC2012_val_00048485.JPEG n03617480/
+mv val/ILSVRC2012_val_00048486.JPEG n02107908/
+mv val/ILSVRC2012_val_00048487.JPEG n04310018/
+mv val/ILSVRC2012_val_00048488.JPEG n02100236/
+mv val/ILSVRC2012_val_00048489.JPEG n03623198/
+mv val/ILSVRC2012_val_00048490.JPEG n03841143/
+mv val/ILSVRC2012_val_00048491.JPEG n02488702/
+mv val/ILSVRC2012_val_00048492.JPEG n04507155/
+mv val/ILSVRC2012_val_00048493.JPEG n02097130/
+mv val/ILSVRC2012_val_00048494.JPEG n02769748/
+mv val/ILSVRC2012_val_00048495.JPEG n03781244/
+mv val/ILSVRC2012_val_00048496.JPEG n02441942/
+mv val/ILSVRC2012_val_00048497.JPEG n03240683/
+mv val/ILSVRC2012_val_00048498.JPEG n02115641/
+mv val/ILSVRC2012_val_00048499.JPEG n02117135/
+mv val/ILSVRC2012_val_00048500.JPEG n02137549/
+mv val/ILSVRC2012_val_00048501.JPEG n02113023/
+mv val/ILSVRC2012_val_00048502.JPEG n02129165/
+mv val/ILSVRC2012_val_00048503.JPEG n04532106/
+mv val/ILSVRC2012_val_00048504.JPEG n04118538/
+mv val/ILSVRC2012_val_00048505.JPEG n01774750/
+mv val/ILSVRC2012_val_00048506.JPEG n02917067/
+mv val/ILSVRC2012_val_00048507.JPEG n03394916/
+mv val/ILSVRC2012_val_00048508.JPEG n04458633/
+mv val/ILSVRC2012_val_00048509.JPEG n01704323/
+mv val/ILSVRC2012_val_00048510.JPEG n04399382/
+mv val/ILSVRC2012_val_00048511.JPEG n02410509/
+mv val/ILSVRC2012_val_00048512.JPEG n02111277/
+mv val/ILSVRC2012_val_00048513.JPEG n02102177/
+mv val/ILSVRC2012_val_00048514.JPEG n03000247/
+mv val/ILSVRC2012_val_00048515.JPEG n02107683/
+mv val/ILSVRC2012_val_00048516.JPEG n04037443/
+mv val/ILSVRC2012_val_00048517.JPEG n03445777/
+mv val/ILSVRC2012_val_00048518.JPEG n04296562/
+mv val/ILSVRC2012_val_00048519.JPEG n02971356/
+mv val/ILSVRC2012_val_00048520.JPEG n04418357/
+mv val/ILSVRC2012_val_00048521.JPEG n02730930/
+mv val/ILSVRC2012_val_00048522.JPEG n03841143/
+mv val/ILSVRC2012_val_00048523.JPEG n01774384/
+mv val/ILSVRC2012_val_00048524.JPEG n03271574/
+mv val/ILSVRC2012_val_00048525.JPEG n02443114/
+mv val/ILSVRC2012_val_00048526.JPEG n12144580/
+mv val/ILSVRC2012_val_00048527.JPEG n02097298/
+mv val/ILSVRC2012_val_00048528.JPEG n02948072/
+mv val/ILSVRC2012_val_00048529.JPEG n04179913/
+mv val/ILSVRC2012_val_00048530.JPEG n02105251/
+mv val/ILSVRC2012_val_00048531.JPEG n03888605/
+mv val/ILSVRC2012_val_00048532.JPEG n03208938/
+mv val/ILSVRC2012_val_00048533.JPEG n04265275/
+mv val/ILSVRC2012_val_00048534.JPEG n09421951/
+mv val/ILSVRC2012_val_00048535.JPEG n02408429/
+mv val/ILSVRC2012_val_00048536.JPEG n02101388/
+mv val/ILSVRC2012_val_00048537.JPEG n02105056/
+mv val/ILSVRC2012_val_00048538.JPEG n07836838/
+mv val/ILSVRC2012_val_00048539.JPEG n04591713/
+mv val/ILSVRC2012_val_00048540.JPEG n02011460/
+mv val/ILSVRC2012_val_00048541.JPEG n04532106/
+mv val/ILSVRC2012_val_00048542.JPEG n01698640/
+mv val/ILSVRC2012_val_00048543.JPEG n04330267/
+mv val/ILSVRC2012_val_00048544.JPEG n04039381/
+mv val/ILSVRC2012_val_00048545.JPEG n04542943/
+mv val/ILSVRC2012_val_00048546.JPEG n02317335/
+mv val/ILSVRC2012_val_00048547.JPEG n02504013/
+mv val/ILSVRC2012_val_00048548.JPEG n01704323/
+mv val/ILSVRC2012_val_00048549.JPEG n01829413/
+mv val/ILSVRC2012_val_00048550.JPEG n04357314/
+mv val/ILSVRC2012_val_00048551.JPEG n04252077/
+mv val/ILSVRC2012_val_00048552.JPEG n01601694/
+mv val/ILSVRC2012_val_00048553.JPEG n02006656/
+mv val/ILSVRC2012_val_00048554.JPEG n03124043/
+mv val/ILSVRC2012_val_00048555.JPEG n02965783/
+mv val/ILSVRC2012_val_00048556.JPEG n02814533/
+mv val/ILSVRC2012_val_00048557.JPEG n03347037/
+mv val/ILSVRC2012_val_00048558.JPEG n03920288/
+mv val/ILSVRC2012_val_00048559.JPEG n03874599/
+mv val/ILSVRC2012_val_00048560.JPEG n02364673/
+mv val/ILSVRC2012_val_00048561.JPEG n03496892/
+mv val/ILSVRC2012_val_00048562.JPEG n01978455/
+mv val/ILSVRC2012_val_00048563.JPEG n03544143/
+mv val/ILSVRC2012_val_00048564.JPEG n04252077/
+mv val/ILSVRC2012_val_00048565.JPEG n03630383/
+mv val/ILSVRC2012_val_00048566.JPEG n03717622/
+mv val/ILSVRC2012_val_00048567.JPEG n03141823/
+mv val/ILSVRC2012_val_00048568.JPEG n04259630/
+mv val/ILSVRC2012_val_00048569.JPEG n03785016/
+mv val/ILSVRC2012_val_00048570.JPEG n02174001/
+mv val/ILSVRC2012_val_00048571.JPEG n02869837/
+mv val/ILSVRC2012_val_00048572.JPEG n04335435/
+mv val/ILSVRC2012_val_00048573.JPEG n02687172/
+mv val/ILSVRC2012_val_00048574.JPEG n01729977/
+mv val/ILSVRC2012_val_00048575.JPEG n02018795/
+mv val/ILSVRC2012_val_00048576.JPEG n01494475/
+mv val/ILSVRC2012_val_00048577.JPEG n03529860/
+mv val/ILSVRC2012_val_00048578.JPEG n02106166/
+mv val/ILSVRC2012_val_00048579.JPEG n04553703/
+mv val/ILSVRC2012_val_00048580.JPEG n04523525/
+mv val/ILSVRC2012_val_00048581.JPEG n02445715/
+mv val/ILSVRC2012_val_00048582.JPEG n03891332/
+mv val/ILSVRC2012_val_00048583.JPEG n02747177/
+mv val/ILSVRC2012_val_00048584.JPEG n03676483/
+mv val/ILSVRC2012_val_00048585.JPEG n02667093/
+mv val/ILSVRC2012_val_00048586.JPEG n07920052/
+mv val/ILSVRC2012_val_00048587.JPEG n02910353/
+mv val/ILSVRC2012_val_00048588.JPEG n02097209/
+mv val/ILSVRC2012_val_00048589.JPEG n03991062/
+mv val/ILSVRC2012_val_00048590.JPEG n04204238/
+mv val/ILSVRC2012_val_00048591.JPEG n02110341/
+mv val/ILSVRC2012_val_00048592.JPEG n02089867/
+mv val/ILSVRC2012_val_00048593.JPEG n01776313/
+mv val/ILSVRC2012_val_00048594.JPEG n02328150/
+mv val/ILSVRC2012_val_00048595.JPEG n03180011/
+mv val/ILSVRC2012_val_00048596.JPEG n07717410/
+mv val/ILSVRC2012_val_00048597.JPEG n03047690/
+mv val/ILSVRC2012_val_00048598.JPEG n04505470/
+mv val/ILSVRC2012_val_00048599.JPEG n03014705/
+mv val/ILSVRC2012_val_00048600.JPEG n01518878/
+mv val/ILSVRC2012_val_00048601.JPEG n01807496/
+mv val/ILSVRC2012_val_00048602.JPEG n04591713/
+mv val/ILSVRC2012_val_00048603.JPEG n02999410/
+mv val/ILSVRC2012_val_00048604.JPEG n04254777/
+mv val/ILSVRC2012_val_00048605.JPEG n02870880/
+mv val/ILSVRC2012_val_00048606.JPEG n02002556/
+mv val/ILSVRC2012_val_00048607.JPEG n02095889/
+mv val/ILSVRC2012_val_00048608.JPEG n02487347/
+mv val/ILSVRC2012_val_00048609.JPEG n03944341/
+mv val/ILSVRC2012_val_00048610.JPEG n03770679/
+mv val/ILSVRC2012_val_00048611.JPEG n03794056/
+mv val/ILSVRC2012_val_00048612.JPEG n03759954/
+mv val/ILSVRC2012_val_00048613.JPEG n02093991/
+mv val/ILSVRC2012_val_00048614.JPEG n01968897/
+mv val/ILSVRC2012_val_00048615.JPEG n03743016/
+mv val/ILSVRC2012_val_00048616.JPEG n03388183/
+mv val/ILSVRC2012_val_00048617.JPEG n03775546/
+mv val/ILSVRC2012_val_00048618.JPEG n02437312/
+mv val/ILSVRC2012_val_00048619.JPEG n04120489/
+mv val/ILSVRC2012_val_00048620.JPEG n03642806/
+mv val/ILSVRC2012_val_00048621.JPEG n02808440/
+mv val/ILSVRC2012_val_00048622.JPEG n04099969/
+mv val/ILSVRC2012_val_00048623.JPEG n03891332/
+mv val/ILSVRC2012_val_00048624.JPEG n03958227/
+mv val/ILSVRC2012_val_00048625.JPEG n02113799/
+mv val/ILSVRC2012_val_00048626.JPEG n03998194/
+mv val/ILSVRC2012_val_00048627.JPEG n02104029/
+mv val/ILSVRC2012_val_00048628.JPEG n03250847/
+mv val/ILSVRC2012_val_00048629.JPEG n02100877/
+mv val/ILSVRC2012_val_00048630.JPEG n07714990/
+mv val/ILSVRC2012_val_00048631.JPEG n03110669/
+mv val/ILSVRC2012_val_00048632.JPEG n02676566/
+mv val/ILSVRC2012_val_00048633.JPEG n03347037/
+mv val/ILSVRC2012_val_00048634.JPEG n03530642/
+mv val/ILSVRC2012_val_00048635.JPEG n10565667/
+mv val/ILSVRC2012_val_00048636.JPEG n02108000/
+mv val/ILSVRC2012_val_00048637.JPEG n03110669/
+mv val/ILSVRC2012_val_00048638.JPEG n03690938/
+mv val/ILSVRC2012_val_00048639.JPEG n02095314/
+mv val/ILSVRC2012_val_00048640.JPEG n02012849/
+mv val/ILSVRC2012_val_00048641.JPEG n02277742/
+mv val/ILSVRC2012_val_00048642.JPEG n01532829/
+mv val/ILSVRC2012_val_00048643.JPEG n04553703/
+mv val/ILSVRC2012_val_00048644.JPEG n02051845/
+mv val/ILSVRC2012_val_00048645.JPEG n04456115/
+mv val/ILSVRC2012_val_00048646.JPEG n03998194/
+mv val/ILSVRC2012_val_00048647.JPEG n02417914/
+mv val/ILSVRC2012_val_00048648.JPEG n03594734/
+mv val/ILSVRC2012_val_00048649.JPEG n01775062/
+mv val/ILSVRC2012_val_00048650.JPEG n02105855/
+mv val/ILSVRC2012_val_00048651.JPEG n03903868/
+mv val/ILSVRC2012_val_00048652.JPEG n02096294/
+mv val/ILSVRC2012_val_00048653.JPEG n04371774/
+mv val/ILSVRC2012_val_00048654.JPEG n02927161/
+mv val/ILSVRC2012_val_00048655.JPEG n03657121/
+mv val/ILSVRC2012_val_00048656.JPEG n03937543/
+mv val/ILSVRC2012_val_00048657.JPEG n04532106/
+mv val/ILSVRC2012_val_00048658.JPEG n01883070/
+mv val/ILSVRC2012_val_00048659.JPEG n01537544/
+mv val/ILSVRC2012_val_00048660.JPEG n02667093/
+mv val/ILSVRC2012_val_00048661.JPEG n02104029/
+mv val/ILSVRC2012_val_00048662.JPEG n02487347/
+mv val/ILSVRC2012_val_00048663.JPEG n02104365/
+mv val/ILSVRC2012_val_00048664.JPEG n02051845/
+mv val/ILSVRC2012_val_00048665.JPEG n04243546/
+mv val/ILSVRC2012_val_00048666.JPEG n02006656/
+mv val/ILSVRC2012_val_00048667.JPEG n02808304/
+mv val/ILSVRC2012_val_00048668.JPEG n04251144/
+mv val/ILSVRC2012_val_00048669.JPEG n02356798/
+mv val/ILSVRC2012_val_00048670.JPEG n02391049/
+mv val/ILSVRC2012_val_00048671.JPEG n07753275/
+mv val/ILSVRC2012_val_00048672.JPEG n02974003/
+mv val/ILSVRC2012_val_00048673.JPEG n03482405/
+mv val/ILSVRC2012_val_00048674.JPEG n09193705/
+mv val/ILSVRC2012_val_00048675.JPEG n01694178/
+mv val/ILSVRC2012_val_00048676.JPEG n02168699/
+mv val/ILSVRC2012_val_00048677.JPEG n12768682/
+mv val/ILSVRC2012_val_00048678.JPEG n03272562/
+mv val/ILSVRC2012_val_00048679.JPEG n03710193/
+mv val/ILSVRC2012_val_00048680.JPEG n03843555/
+mv val/ILSVRC2012_val_00048681.JPEG n03126707/
+mv val/ILSVRC2012_val_00048682.JPEG n03196217/
+mv val/ILSVRC2012_val_00048683.JPEG n06785654/
+mv val/ILSVRC2012_val_00048684.JPEG n04350905/
+mv val/ILSVRC2012_val_00048685.JPEG n07873807/
+mv val/ILSVRC2012_val_00048686.JPEG n04310018/
+mv val/ILSVRC2012_val_00048687.JPEG n02264363/
+mv val/ILSVRC2012_val_00048688.JPEG n02492660/
+mv val/ILSVRC2012_val_00048689.JPEG n10565667/
+mv val/ILSVRC2012_val_00048690.JPEG n04275548/
+mv val/ILSVRC2012_val_00048691.JPEG n04147183/
+mv val/ILSVRC2012_val_00048692.JPEG n04366367/
+mv val/ILSVRC2012_val_00048693.JPEG n02114855/
+mv val/ILSVRC2012_val_00048694.JPEG n02100236/
+mv val/ILSVRC2012_val_00048695.JPEG n04154565/
+mv val/ILSVRC2012_val_00048696.JPEG n02276258/
+mv val/ILSVRC2012_val_00048697.JPEG n03424325/
+mv val/ILSVRC2012_val_00048698.JPEG n03777568/
+mv val/ILSVRC2012_val_00048699.JPEG n03494278/
+mv val/ILSVRC2012_val_00048700.JPEG n01806143/
+mv val/ILSVRC2012_val_00048701.JPEG n03459775/
+mv val/ILSVRC2012_val_00048702.JPEG n03598930/
+mv val/ILSVRC2012_val_00048703.JPEG n03967562/
+mv val/ILSVRC2012_val_00048704.JPEG n03775546/
+mv val/ILSVRC2012_val_00048705.JPEG n04418357/
+mv val/ILSVRC2012_val_00048706.JPEG n02412080/
+mv val/ILSVRC2012_val_00048707.JPEG n04591157/
+mv val/ILSVRC2012_val_00048708.JPEG n01770081/
+mv val/ILSVRC2012_val_00048709.JPEG n03877472/
+mv val/ILSVRC2012_val_00048710.JPEG n01531178/
+mv val/ILSVRC2012_val_00048711.JPEG n03794056/
+mv val/ILSVRC2012_val_00048712.JPEG n04485082/
+mv val/ILSVRC2012_val_00048713.JPEG n03786901/
+mv val/ILSVRC2012_val_00048714.JPEG n01773797/
+mv val/ILSVRC2012_val_00048715.JPEG n04254680/
+mv val/ILSVRC2012_val_00048716.JPEG n02128925/
+mv val/ILSVRC2012_val_00048717.JPEG n02128757/
+mv val/ILSVRC2012_val_00048718.JPEG n02442845/
+mv val/ILSVRC2012_val_00048719.JPEG n02606052/
+mv val/ILSVRC2012_val_00048720.JPEG n02099429/
+mv val/ILSVRC2012_val_00048721.JPEG n04442312/
+mv val/ILSVRC2012_val_00048722.JPEG n01807496/
+mv val/ILSVRC2012_val_00048723.JPEG n02107312/
+mv val/ILSVRC2012_val_00048724.JPEG n03710637/
+mv val/ILSVRC2012_val_00048725.JPEG n02027492/
+mv val/ILSVRC2012_val_00048726.JPEG n03016953/
+mv val/ILSVRC2012_val_00048727.JPEG n02017213/
+mv val/ILSVRC2012_val_00048728.JPEG n12768682/
+mv val/ILSVRC2012_val_00048729.JPEG n04192698/
+mv val/ILSVRC2012_val_00048730.JPEG n02747177/
+mv val/ILSVRC2012_val_00048731.JPEG n04532106/
+mv val/ILSVRC2012_val_00048732.JPEG n01537544/
+mv val/ILSVRC2012_val_00048733.JPEG n04254777/
+mv val/ILSVRC2012_val_00048734.JPEG n03259280/
+mv val/ILSVRC2012_val_00048735.JPEG n02025239/
+mv val/ILSVRC2012_val_00048736.JPEG n09835506/
+mv val/ILSVRC2012_val_00048737.JPEG n02096437/
+mv val/ILSVRC2012_val_00048738.JPEG n04372370/
+mv val/ILSVRC2012_val_00048739.JPEG n02797295/
+mv val/ILSVRC2012_val_00048740.JPEG n03871628/
+mv val/ILSVRC2012_val_00048741.JPEG n02481823/
+mv val/ILSVRC2012_val_00048742.JPEG n03837869/
+mv val/ILSVRC2012_val_00048743.JPEG n02268443/
+mv val/ILSVRC2012_val_00048744.JPEG n04522168/
+mv val/ILSVRC2012_val_00048745.JPEG n03690938/
+mv val/ILSVRC2012_val_00048746.JPEG n04550184/
+mv val/ILSVRC2012_val_00048747.JPEG n03657121/
+mv val/ILSVRC2012_val_00048748.JPEG n02105251/
+mv val/ILSVRC2012_val_00048749.JPEG n01833805/
+mv val/ILSVRC2012_val_00048750.JPEG n01755581/
+mv val/ILSVRC2012_val_00048751.JPEG n07734744/
+mv val/ILSVRC2012_val_00048752.JPEG n01873310/
+mv val/ILSVRC2012_val_00048753.JPEG n03538406/
+mv val/ILSVRC2012_val_00048754.JPEG n01688243/
+mv val/ILSVRC2012_val_00048755.JPEG n03452741/
+mv val/ILSVRC2012_val_00048756.JPEG n02120505/
+mv val/ILSVRC2012_val_00048757.JPEG n02412080/
+mv val/ILSVRC2012_val_00048758.JPEG n04254120/
+mv val/ILSVRC2012_val_00048759.JPEG n04019541/
+mv val/ILSVRC2012_val_00048760.JPEG n02112706/
+mv val/ILSVRC2012_val_00048761.JPEG n02100735/
+mv val/ILSVRC2012_val_00048762.JPEG n03201208/
+mv val/ILSVRC2012_val_00048763.JPEG n03134739/
+mv val/ILSVRC2012_val_00048764.JPEG n02514041/
+mv val/ILSVRC2012_val_00048765.JPEG n04065272/
+mv val/ILSVRC2012_val_00048766.JPEG n02165105/
+mv val/ILSVRC2012_val_00048767.JPEG n04443257/
+mv val/ILSVRC2012_val_00048768.JPEG n04149813/
+mv val/ILSVRC2012_val_00048769.JPEG n03871628/
+mv val/ILSVRC2012_val_00048770.JPEG n02100236/
+mv val/ILSVRC2012_val_00048771.JPEG n02412080/
+mv val/ILSVRC2012_val_00048772.JPEG n02992211/
+mv val/ILSVRC2012_val_00048773.JPEG n02951358/
+mv val/ILSVRC2012_val_00048774.JPEG n03776460/
+mv val/ILSVRC2012_val_00048775.JPEG n02666196/
+mv val/ILSVRC2012_val_00048776.JPEG n03000134/
+mv val/ILSVRC2012_val_00048777.JPEG n12144580/
+mv val/ILSVRC2012_val_00048778.JPEG n03141823/
+mv val/ILSVRC2012_val_00048779.JPEG n02110341/
+mv val/ILSVRC2012_val_00048780.JPEG n02094114/
+mv val/ILSVRC2012_val_00048781.JPEG n02504458/
+mv val/ILSVRC2012_val_00048782.JPEG n04389033/
+mv val/ILSVRC2012_val_00048783.JPEG n02085936/
+mv val/ILSVRC2012_val_00048784.JPEG n04553703/
+mv val/ILSVRC2012_val_00048785.JPEG n03594734/
+mv val/ILSVRC2012_val_00048786.JPEG n09468604/
+mv val/ILSVRC2012_val_00048787.JPEG n03980874/
+mv val/ILSVRC2012_val_00048788.JPEG n07831146/
+mv val/ILSVRC2012_val_00048789.JPEG n03141823/
+mv val/ILSVRC2012_val_00048790.JPEG n13054560/
+mv val/ILSVRC2012_val_00048791.JPEG n01704323/
+mv val/ILSVRC2012_val_00048792.JPEG n02356798/
+mv val/ILSVRC2012_val_00048793.JPEG n03970156/
+mv val/ILSVRC2012_val_00048794.JPEG n02071294/
+mv val/ILSVRC2012_val_00048795.JPEG n06794110/
+mv val/ILSVRC2012_val_00048796.JPEG n02860847/
+mv val/ILSVRC2012_val_00048797.JPEG n03970156/
+mv val/ILSVRC2012_val_00048798.JPEG n11879895/
+mv val/ILSVRC2012_val_00048799.JPEG n04389033/
+mv val/ILSVRC2012_val_00048800.JPEG n01770393/
+mv val/ILSVRC2012_val_00048801.JPEG n02104365/
+mv val/ILSVRC2012_val_00048802.JPEG n02033041/
+mv val/ILSVRC2012_val_00048803.JPEG n07754684/
+mv val/ILSVRC2012_val_00048804.JPEG n02666196/
+mv val/ILSVRC2012_val_00048805.JPEG n03658185/
+mv val/ILSVRC2012_val_00048806.JPEG n03447447/
+mv val/ILSVRC2012_val_00048807.JPEG n03840681/
+mv val/ILSVRC2012_val_00048808.JPEG n01990800/
+mv val/ILSVRC2012_val_00048809.JPEG n03992509/
+mv val/ILSVRC2012_val_00048810.JPEG n02319095/
+mv val/ILSVRC2012_val_00048811.JPEG n04540053/
+mv val/ILSVRC2012_val_00048812.JPEG n04141975/
+mv val/ILSVRC2012_val_00048813.JPEG n03026506/
+mv val/ILSVRC2012_val_00048814.JPEG n02009229/
+mv val/ILSVRC2012_val_00048815.JPEG n07880968/
+mv val/ILSVRC2012_val_00048816.JPEG n03459775/
+mv val/ILSVRC2012_val_00048817.JPEG n02488291/
+mv val/ILSVRC2012_val_00048818.JPEG n02108551/
+mv val/ILSVRC2012_val_00048819.JPEG n03793489/
+mv val/ILSVRC2012_val_00048820.JPEG n03041632/
+mv val/ILSVRC2012_val_00048821.JPEG n03887697/
+mv val/ILSVRC2012_val_00048822.JPEG n12057211/
+mv val/ILSVRC2012_val_00048823.JPEG n07875152/
+mv val/ILSVRC2012_val_00048824.JPEG n01828970/
+mv val/ILSVRC2012_val_00048825.JPEG n01796340/
+mv val/ILSVRC2012_val_00048826.JPEG n03494278/
+mv val/ILSVRC2012_val_00048827.JPEG n02281787/
+mv val/ILSVRC2012_val_00048828.JPEG n01698640/
+mv val/ILSVRC2012_val_00048829.JPEG n01537544/
+mv val/ILSVRC2012_val_00048830.JPEG n02110185/
+mv val/ILSVRC2012_val_00048831.JPEG n04209133/
+mv val/ILSVRC2012_val_00048832.JPEG n02536864/
+mv val/ILSVRC2012_val_00048833.JPEG n07714990/
+mv val/ILSVRC2012_val_00048834.JPEG n02100236/
+mv val/ILSVRC2012_val_00048835.JPEG n04317175/
+mv val/ILSVRC2012_val_00048836.JPEG n04265275/
+mv val/ILSVRC2012_val_00048837.JPEG n01983481/
+mv val/ILSVRC2012_val_00048838.JPEG n01833805/
+mv val/ILSVRC2012_val_00048839.JPEG n02808440/
+mv val/ILSVRC2012_val_00048840.JPEG n01443537/
+mv val/ILSVRC2012_val_00048841.JPEG n07697313/
+mv val/ILSVRC2012_val_00048842.JPEG n02109525/
+mv val/ILSVRC2012_val_00048843.JPEG n03935335/
+mv val/ILSVRC2012_val_00048844.JPEG n03903868/
+mv val/ILSVRC2012_val_00048845.JPEG n04074963/
+mv val/ILSVRC2012_val_00048846.JPEG n01807496/
+mv val/ILSVRC2012_val_00048847.JPEG n03729826/
+mv val/ILSVRC2012_val_00048848.JPEG n04111531/
+mv val/ILSVRC2012_val_00048849.JPEG n07860988/
+mv val/ILSVRC2012_val_00048850.JPEG n04133789/
+mv val/ILSVRC2012_val_00048851.JPEG n03873416/
+mv val/ILSVRC2012_val_00048852.JPEG n03991062/
+mv val/ILSVRC2012_val_00048853.JPEG n03028079/
+mv val/ILSVRC2012_val_00048854.JPEG n03207743/
+mv val/ILSVRC2012_val_00048855.JPEG n02487347/
+mv val/ILSVRC2012_val_00048856.JPEG n03207941/
+mv val/ILSVRC2012_val_00048857.JPEG n03920288/
+mv val/ILSVRC2012_val_00048858.JPEG n02100735/
+mv val/ILSVRC2012_val_00048859.JPEG n02105855/
+mv val/ILSVRC2012_val_00048860.JPEG n03544143/
+mv val/ILSVRC2012_val_00048861.JPEG n02071294/
+mv val/ILSVRC2012_val_00048862.JPEG n03496892/
+mv val/ILSVRC2012_val_00048863.JPEG n03461385/
+mv val/ILSVRC2012_val_00048864.JPEG n01443537/
+mv val/ILSVRC2012_val_00048865.JPEG n04239074/
+mv val/ILSVRC2012_val_00048866.JPEG n03956157/
+mv val/ILSVRC2012_val_00048867.JPEG n04553703/
+mv val/ILSVRC2012_val_00048868.JPEG n04371430/
+mv val/ILSVRC2012_val_00048869.JPEG n12057211/
+mv val/ILSVRC2012_val_00048870.JPEG n04118776/
+mv val/ILSVRC2012_val_00048871.JPEG n02793495/
+mv val/ILSVRC2012_val_00048872.JPEG n02808304/
+mv val/ILSVRC2012_val_00048873.JPEG n03709823/
+mv val/ILSVRC2012_val_00048874.JPEG n02099267/
+mv val/ILSVRC2012_val_00048875.JPEG n03063599/
+mv val/ILSVRC2012_val_00048876.JPEG n03018349/
+mv val/ILSVRC2012_val_00048877.JPEG n02009912/
+mv val/ILSVRC2012_val_00048878.JPEG n03467068/
+mv val/ILSVRC2012_val_00048879.JPEG n03637318/
+mv val/ILSVRC2012_val_00048880.JPEG n12998815/
+mv val/ILSVRC2012_val_00048881.JPEG n04153751/
+mv val/ILSVRC2012_val_00048882.JPEG n03063599/
+mv val/ILSVRC2012_val_00048883.JPEG n02132136/
+mv val/ILSVRC2012_val_00048884.JPEG n02879718/
+mv val/ILSVRC2012_val_00048885.JPEG n02835271/
+mv val/ILSVRC2012_val_00048886.JPEG n03089624/
+mv val/ILSVRC2012_val_00048887.JPEG n01734418/
+mv val/ILSVRC2012_val_00048888.JPEG n02027492/
+mv val/ILSVRC2012_val_00048889.JPEG n04133789/
+mv val/ILSVRC2012_val_00048890.JPEG n01491361/
+mv val/ILSVRC2012_val_00048891.JPEG n03041632/
+mv val/ILSVRC2012_val_00048892.JPEG n02361337/
+mv val/ILSVRC2012_val_00048893.JPEG n03710637/
+mv val/ILSVRC2012_val_00048894.JPEG n02169497/
+mv val/ILSVRC2012_val_00048895.JPEG n02268443/
+mv val/ILSVRC2012_val_00048896.JPEG n03291819/
+mv val/ILSVRC2012_val_00048897.JPEG n02492660/
+mv val/ILSVRC2012_val_00048898.JPEG n04069434/
+mv val/ILSVRC2012_val_00048899.JPEG n03457902/
+mv val/ILSVRC2012_val_00048900.JPEG n04200800/
+mv val/ILSVRC2012_val_00048901.JPEG n04429376/
+mv val/ILSVRC2012_val_00048902.JPEG n01945685/
+mv val/ILSVRC2012_val_00048903.JPEG n02910353/
+mv val/ILSVRC2012_val_00048904.JPEG n02096177/
+mv val/ILSVRC2012_val_00048905.JPEG n04204347/
+mv val/ILSVRC2012_val_00048906.JPEG n03347037/
+mv val/ILSVRC2012_val_00048907.JPEG n01806567/
+mv val/ILSVRC2012_val_00048908.JPEG n02002724/
+mv val/ILSVRC2012_val_00048909.JPEG n01675722/
+mv val/ILSVRC2012_val_00048910.JPEG n04404412/
+mv val/ILSVRC2012_val_00048911.JPEG n03476684/
+mv val/ILSVRC2012_val_00048912.JPEG n03868242/
+mv val/ILSVRC2012_val_00048913.JPEG n01773157/
+mv val/ILSVRC2012_val_00048914.JPEG n02102040/
+mv val/ILSVRC2012_val_00048915.JPEG n02088094/
+mv val/ILSVRC2012_val_00048916.JPEG n02797295/
+mv val/ILSVRC2012_val_00048917.JPEG n07831146/
+mv val/ILSVRC2012_val_00048918.JPEG n03764736/
+mv val/ILSVRC2012_val_00048919.JPEG n03000684/
+mv val/ILSVRC2012_val_00048920.JPEG n02536864/
+mv val/ILSVRC2012_val_00048921.JPEG n01983481/
+mv val/ILSVRC2012_val_00048922.JPEG n02106550/
+mv val/ILSVRC2012_val_00048923.JPEG n04065272/
+mv val/ILSVRC2012_val_00048924.JPEG n01685808/
+mv val/ILSVRC2012_val_00048925.JPEG n02090622/
+mv val/ILSVRC2012_val_00048926.JPEG n04579432/
+mv val/ILSVRC2012_val_00048927.JPEG n04204238/
+mv val/ILSVRC2012_val_00048928.JPEG n13054560/
+mv val/ILSVRC2012_val_00048929.JPEG n03016953/
+mv val/ILSVRC2012_val_00048930.JPEG n03937543/
+mv val/ILSVRC2012_val_00048931.JPEG n04229816/
+mv val/ILSVRC2012_val_00048932.JPEG n02492660/
+mv val/ILSVRC2012_val_00048933.JPEG n03445924/
+mv val/ILSVRC2012_val_00048934.JPEG n11939491/
+mv val/ILSVRC2012_val_00048935.JPEG n03544143/
+mv val/ILSVRC2012_val_00048936.JPEG n02894605/
+mv val/ILSVRC2012_val_00048937.JPEG n07697537/
+mv val/ILSVRC2012_val_00048938.JPEG n04153751/
+mv val/ILSVRC2012_val_00048939.JPEG n02483362/
+mv val/ILSVRC2012_val_00048940.JPEG n02134084/
+mv val/ILSVRC2012_val_00048941.JPEG n04208210/
+mv val/ILSVRC2012_val_00048942.JPEG n03197337/
+mv val/ILSVRC2012_val_00048943.JPEG n01753488/
+mv val/ILSVRC2012_val_00048944.JPEG n03680355/
+mv val/ILSVRC2012_val_00048945.JPEG n03938244/
+mv val/ILSVRC2012_val_00048946.JPEG n03857828/
+mv val/ILSVRC2012_val_00048947.JPEG n03761084/
+mv val/ILSVRC2012_val_00048948.JPEG n02105162/
+mv val/ILSVRC2012_val_00048949.JPEG n03742115/
+mv val/ILSVRC2012_val_00048950.JPEG n02536864/
+mv val/ILSVRC2012_val_00048951.JPEG n02930766/
+mv val/ILSVRC2012_val_00048952.JPEG n01514668/
+mv val/ILSVRC2012_val_00048953.JPEG n03876231/
+mv val/ILSVRC2012_val_00048954.JPEG n02493509/
+mv val/ILSVRC2012_val_00048955.JPEG n02095314/
+mv val/ILSVRC2012_val_00048956.JPEG n04517823/
+mv val/ILSVRC2012_val_00048957.JPEG n01729977/
+mv val/ILSVRC2012_val_00048958.JPEG n04442312/
+mv val/ILSVRC2012_val_00048959.JPEG n11939491/
+mv val/ILSVRC2012_val_00048960.JPEG n01614925/
+mv val/ILSVRC2012_val_00048961.JPEG n03496892/
+mv val/ILSVRC2012_val_00048962.JPEG n02281787/
+mv val/ILSVRC2012_val_00048963.JPEG n02095570/
+mv val/ILSVRC2012_val_00048964.JPEG n02105505/
+mv val/ILSVRC2012_val_00048965.JPEG n04127249/
+mv val/ILSVRC2012_val_00048966.JPEG n04579432/
+mv val/ILSVRC2012_val_00048967.JPEG n03804744/
+mv val/ILSVRC2012_val_00048968.JPEG n04613696/
+mv val/ILSVRC2012_val_00048969.JPEG n01440764/
+mv val/ILSVRC2012_val_00048970.JPEG n04133789/
+mv val/ILSVRC2012_val_00048971.JPEG n02115641/
+mv val/ILSVRC2012_val_00048972.JPEG n02099849/
+mv val/ILSVRC2012_val_00048973.JPEG n04493381/
+mv val/ILSVRC2012_val_00048974.JPEG n02102480/
+mv val/ILSVRC2012_val_00048975.JPEG n11939491/
+mv val/ILSVRC2012_val_00048976.JPEG n07565083/
+mv val/ILSVRC2012_val_00048977.JPEG n03425413/
+mv val/ILSVRC2012_val_00048978.JPEG n01756291/
+mv val/ILSVRC2012_val_00048979.JPEG n02132136/
+mv val/ILSVRC2012_val_00048980.JPEG n02109525/
+mv val/ILSVRC2012_val_00048981.JPEG n03995372/
+mv val/ILSVRC2012_val_00048982.JPEG n12057211/
+mv val/ILSVRC2012_val_00048983.JPEG n07697537/
+mv val/ILSVRC2012_val_00048984.JPEG n04023962/
+mv val/ILSVRC2012_val_00048985.JPEG n03690938/
+mv val/ILSVRC2012_val_00048986.JPEG n03676483/
+mv val/ILSVRC2012_val_00048987.JPEG n03868863/
+mv val/ILSVRC2012_val_00048988.JPEG n04147183/
+mv val/ILSVRC2012_val_00048989.JPEG n02895154/
+mv val/ILSVRC2012_val_00048990.JPEG n01773549/
+mv val/ILSVRC2012_val_00048991.JPEG n01667114/
+mv val/ILSVRC2012_val_00048992.JPEG n12267677/
+mv val/ILSVRC2012_val_00048993.JPEG n04507155/
+mv val/ILSVRC2012_val_00048994.JPEG n03658185/
+mv val/ILSVRC2012_val_00048995.JPEG n01644373/
+mv val/ILSVRC2012_val_00048996.JPEG n06785654/
+mv val/ILSVRC2012_val_00048997.JPEG n02114548/
+mv val/ILSVRC2012_val_00048998.JPEG n04065272/
+mv val/ILSVRC2012_val_00048999.JPEG n04118538/
+mv val/ILSVRC2012_val_00049000.JPEG n01491361/
+mv val/ILSVRC2012_val_00049001.JPEG n03792782/
+mv val/ILSVRC2012_val_00049002.JPEG n03773504/
+mv val/ILSVRC2012_val_00049003.JPEG n07831146/
+mv val/ILSVRC2012_val_00049004.JPEG n02092002/
+mv val/ILSVRC2012_val_00049005.JPEG n02808304/
+mv val/ILSVRC2012_val_00049006.JPEG n04330267/
+mv val/ILSVRC2012_val_00049007.JPEG n02437312/
+mv val/ILSVRC2012_val_00049008.JPEG n03481172/
+mv val/ILSVRC2012_val_00049009.JPEG n03706229/
+mv val/ILSVRC2012_val_00049010.JPEG n02100583/
+mv val/ILSVRC2012_val_00049011.JPEG n04347754/
+mv val/ILSVRC2012_val_00049012.JPEG n02666196/
+mv val/ILSVRC2012_val_00049013.JPEG n04074963/
+mv val/ILSVRC2012_val_00049014.JPEG n03976467/
+mv val/ILSVRC2012_val_00049015.JPEG n02090721/
+mv val/ILSVRC2012_val_00049016.JPEG n02002556/
+mv val/ILSVRC2012_val_00049017.JPEG n01728572/
+mv val/ILSVRC2012_val_00049018.JPEG n02129165/
+mv val/ILSVRC2012_val_00049019.JPEG n02483362/
+mv val/ILSVRC2012_val_00049020.JPEG n01910747/
+mv val/ILSVRC2012_val_00049021.JPEG n03887697/
+mv val/ILSVRC2012_val_00049022.JPEG n02422106/
+mv val/ILSVRC2012_val_00049023.JPEG n04039381/
+mv val/ILSVRC2012_val_00049024.JPEG n02356798/
+mv val/ILSVRC2012_val_00049025.JPEG n04350905/
+mv val/ILSVRC2012_val_00049026.JPEG n02871525/
+mv val/ILSVRC2012_val_00049027.JPEG n02086079/
+mv val/ILSVRC2012_val_00049028.JPEG n04485082/
+mv val/ILSVRC2012_val_00049029.JPEG n04116512/
+mv val/ILSVRC2012_val_00049030.JPEG n02346627/
+mv val/ILSVRC2012_val_00049031.JPEG n02840245/
+mv val/ILSVRC2012_val_00049032.JPEG n03345487/
+mv val/ILSVRC2012_val_00049033.JPEG n04336792/
+mv val/ILSVRC2012_val_00049034.JPEG n03777568/
+mv val/ILSVRC2012_val_00049035.JPEG n02797295/
+mv val/ILSVRC2012_val_00049036.JPEG n02093428/
+mv val/ILSVRC2012_val_00049037.JPEG n04037443/
+mv val/ILSVRC2012_val_00049038.JPEG n03188531/
+mv val/ILSVRC2012_val_00049039.JPEG n03538406/
+mv val/ILSVRC2012_val_00049040.JPEG n02108089/
+mv val/ILSVRC2012_val_00049041.JPEG n02268853/
+mv val/ILSVRC2012_val_00049042.JPEG n02219486/
+mv val/ILSVRC2012_val_00049043.JPEG n02415577/
+mv val/ILSVRC2012_val_00049044.JPEG n02113978/
+mv val/ILSVRC2012_val_00049045.JPEG n04367480/
+mv val/ILSVRC2012_val_00049046.JPEG n02111277/
+mv val/ILSVRC2012_val_00049047.JPEG n07754684/
+mv val/ILSVRC2012_val_00049048.JPEG n03207941/
+mv val/ILSVRC2012_val_00049049.JPEG n02708093/
+mv val/ILSVRC2012_val_00049050.JPEG n02791124/
+mv val/ILSVRC2012_val_00049051.JPEG n04239074/
+mv val/ILSVRC2012_val_00049052.JPEG n01872401/
+mv val/ILSVRC2012_val_00049053.JPEG n03124043/
+mv val/ILSVRC2012_val_00049054.JPEG n02788148/
+mv val/ILSVRC2012_val_00049055.JPEG n03933933/
+mv val/ILSVRC2012_val_00049056.JPEG n01798484/
+mv val/ILSVRC2012_val_00049057.JPEG n03065424/
+mv val/ILSVRC2012_val_00049058.JPEG n03658185/
+mv val/ILSVRC2012_val_00049059.JPEG n09421951/
+mv val/ILSVRC2012_val_00049060.JPEG n03000247/
+mv val/ILSVRC2012_val_00049061.JPEG n02669723/
+mv val/ILSVRC2012_val_00049062.JPEG n04592741/
+mv val/ILSVRC2012_val_00049063.JPEG n02097130/
+mv val/ILSVRC2012_val_00049064.JPEG n02105641/
+mv val/ILSVRC2012_val_00049065.JPEG n01629819/
+mv val/ILSVRC2012_val_00049066.JPEG n02793495/
+mv val/ILSVRC2012_val_00049067.JPEG n03954731/
+mv val/ILSVRC2012_val_00049068.JPEG n04141327/
+mv val/ILSVRC2012_val_00049069.JPEG n02966687/
+mv val/ILSVRC2012_val_00049070.JPEG n02769748/
+mv val/ILSVRC2012_val_00049071.JPEG n02281787/
+mv val/ILSVRC2012_val_00049072.JPEG n01687978/
+mv val/ILSVRC2012_val_00049073.JPEG n04229816/
+mv val/ILSVRC2012_val_00049074.JPEG n04009552/
+mv val/ILSVRC2012_val_00049075.JPEG n04418357/
+mv val/ILSVRC2012_val_00049076.JPEG n04461696/
+mv val/ILSVRC2012_val_00049077.JPEG n02006656/
+mv val/ILSVRC2012_val_00049078.JPEG n03770439/
+mv val/ILSVRC2012_val_00049079.JPEG n02017213/
+mv val/ILSVRC2012_val_00049080.JPEG n07716358/
+mv val/ILSVRC2012_val_00049081.JPEG n02445715/
+mv val/ILSVRC2012_val_00049082.JPEG n02389026/
+mv val/ILSVRC2012_val_00049083.JPEG n02948072/
+mv val/ILSVRC2012_val_00049084.JPEG n06785654/
+mv val/ILSVRC2012_val_00049085.JPEG n02268443/
+mv val/ILSVRC2012_val_00049086.JPEG n03457902/
+mv val/ILSVRC2012_val_00049087.JPEG n04118776/
+mv val/ILSVRC2012_val_00049088.JPEG n12768682/
+mv val/ILSVRC2012_val_00049089.JPEG n02095314/
+mv val/ILSVRC2012_val_00049090.JPEG n01518878/
+mv val/ILSVRC2012_val_00049091.JPEG n04275548/
+mv val/ILSVRC2012_val_00049092.JPEG n02894605/
+mv val/ILSVRC2012_val_00049093.JPEG n01843383/
+mv val/ILSVRC2012_val_00049094.JPEG n02840245/
+mv val/ILSVRC2012_val_00049095.JPEG n07697313/
+mv val/ILSVRC2012_val_00049096.JPEG n07930864/
+mv val/ILSVRC2012_val_00049097.JPEG n02690373/
+mv val/ILSVRC2012_val_00049098.JPEG n02788148/
+mv val/ILSVRC2012_val_00049099.JPEG n04081281/
+mv val/ILSVRC2012_val_00049100.JPEG n03127925/
+mv val/ILSVRC2012_val_00049101.JPEG n03706229/
+mv val/ILSVRC2012_val_00049102.JPEG n03721384/
+mv val/ILSVRC2012_val_00049103.JPEG n01632458/
+mv val/ILSVRC2012_val_00049104.JPEG n04265275/
+mv val/ILSVRC2012_val_00049105.JPEG n01924916/
+mv val/ILSVRC2012_val_00049106.JPEG n02979186/
+mv val/ILSVRC2012_val_00049107.JPEG n01872401/
+mv val/ILSVRC2012_val_00049108.JPEG n04235860/
+mv val/ILSVRC2012_val_00049109.JPEG n04476259/
+mv val/ILSVRC2012_val_00049110.JPEG n07697537/
+mv val/ILSVRC2012_val_00049111.JPEG n02488702/
+mv val/ILSVRC2012_val_00049112.JPEG n03920288/
+mv val/ILSVRC2012_val_00049113.JPEG n03670208/
+mv val/ILSVRC2012_val_00049114.JPEG n04493381/
+mv val/ILSVRC2012_val_00049115.JPEG n02113712/
+mv val/ILSVRC2012_val_00049116.JPEG n01682714/
+mv val/ILSVRC2012_val_00049117.JPEG n03271574/
+mv val/ILSVRC2012_val_00049118.JPEG n03018349/
+mv val/ILSVRC2012_val_00049119.JPEG n01641577/
+mv val/ILSVRC2012_val_00049120.JPEG n02422699/
+mv val/ILSVRC2012_val_00049121.JPEG n02807133/
+mv val/ILSVRC2012_val_00049122.JPEG n02749479/
+mv val/ILSVRC2012_val_00049123.JPEG n02749479/
+mv val/ILSVRC2012_val_00049124.JPEG n02480495/
+mv val/ILSVRC2012_val_00049125.JPEG n02120505/
+mv val/ILSVRC2012_val_00049126.JPEG n02277742/
+mv val/ILSVRC2012_val_00049127.JPEG n03935335/
+mv val/ILSVRC2012_val_00049128.JPEG n03759954/
+mv val/ILSVRC2012_val_00049129.JPEG n02113186/
+mv val/ILSVRC2012_val_00049130.JPEG n02100236/
+mv val/ILSVRC2012_val_00049131.JPEG n03126707/
+mv val/ILSVRC2012_val_00049132.JPEG n04458633/
+mv val/ILSVRC2012_val_00049133.JPEG n02281406/
+mv val/ILSVRC2012_val_00049134.JPEG n01775062/
+mv val/ILSVRC2012_val_00049135.JPEG n04204347/
+mv val/ILSVRC2012_val_00049136.JPEG n02116738/
+mv val/ILSVRC2012_val_00049137.JPEG n03388043/
+mv val/ILSVRC2012_val_00049138.JPEG n04418357/
+mv val/ILSVRC2012_val_00049139.JPEG n02100583/
+mv val/ILSVRC2012_val_00049140.JPEG n03584829/
+mv val/ILSVRC2012_val_00049141.JPEG n01592084/
+mv val/ILSVRC2012_val_00049142.JPEG n04456115/
+mv val/ILSVRC2012_val_00049143.JPEG n01728920/
+mv val/ILSVRC2012_val_00049144.JPEG n02091635/
+mv val/ILSVRC2012_val_00049145.JPEG n03637318/
+mv val/ILSVRC2012_val_00049146.JPEG n02105056/
+mv val/ILSVRC2012_val_00049147.JPEG n02110627/
+mv val/ILSVRC2012_val_00049148.JPEG n02776631/
+mv val/ILSVRC2012_val_00049149.JPEG n03788365/
+mv val/ILSVRC2012_val_00049150.JPEG n03179701/
+mv val/ILSVRC2012_val_00049151.JPEG n02009912/
+mv val/ILSVRC2012_val_00049152.JPEG n02219486/
+mv val/ILSVRC2012_val_00049153.JPEG n04179913/
+mv val/ILSVRC2012_val_00049154.JPEG n07590611/
+mv val/ILSVRC2012_val_00049155.JPEG n03903868/
+mv val/ILSVRC2012_val_00049156.JPEG n04560804/
+mv val/ILSVRC2012_val_00049157.JPEG n01917289/
+mv val/ILSVRC2012_val_00049158.JPEG n04133789/
+mv val/ILSVRC2012_val_00049159.JPEG n02085620/
+mv val/ILSVRC2012_val_00049160.JPEG n03259280/
+mv val/ILSVRC2012_val_00049161.JPEG n02484975/
+mv val/ILSVRC2012_val_00049162.JPEG n01744401/
+mv val/ILSVRC2012_val_00049163.JPEG n07836838/
+mv val/ILSVRC2012_val_00049164.JPEG n07753592/
+mv val/ILSVRC2012_val_00049165.JPEG n03673027/
+mv val/ILSVRC2012_val_00049166.JPEG n01494475/
+mv val/ILSVRC2012_val_00049167.JPEG n01728572/
+mv val/ILSVRC2012_val_00049168.JPEG n02174001/
+mv val/ILSVRC2012_val_00049169.JPEG n07873807/
+mv val/ILSVRC2012_val_00049170.JPEG n02058221/
+mv val/ILSVRC2012_val_00049171.JPEG n04252225/
+mv val/ILSVRC2012_val_00049172.JPEG n03782006/
+mv val/ILSVRC2012_val_00049173.JPEG n04133789/
+mv val/ILSVRC2012_val_00049174.JPEG n15075141/
+mv val/ILSVRC2012_val_00049175.JPEG n02106662/
+mv val/ILSVRC2012_val_00049176.JPEG n02346627/
+mv val/ILSVRC2012_val_00049177.JPEG n03769881/
+mv val/ILSVRC2012_val_00049178.JPEG n03630383/
+mv val/ILSVRC2012_val_00049179.JPEG n03871628/
+mv val/ILSVRC2012_val_00049180.JPEG n01984695/
+mv val/ILSVRC2012_val_00049181.JPEG n01514668/
+mv val/ILSVRC2012_val_00049182.JPEG n01749939/
+mv val/ILSVRC2012_val_00049183.JPEG n03457902/
+mv val/ILSVRC2012_val_00049184.JPEG n04347754/
+mv val/ILSVRC2012_val_00049185.JPEG n04370456/
+mv val/ILSVRC2012_val_00049186.JPEG n02892201/
+mv val/ILSVRC2012_val_00049187.JPEG n01693334/
+mv val/ILSVRC2012_val_00049188.JPEG n03109150/
+mv val/ILSVRC2012_val_00049189.JPEG n02102973/
+mv val/ILSVRC2012_val_00049190.JPEG n02098413/
+mv val/ILSVRC2012_val_00049191.JPEG n01930112/
+mv val/ILSVRC2012_val_00049192.JPEG n02834397/
+mv val/ILSVRC2012_val_00049193.JPEG n02091032/
+mv val/ILSVRC2012_val_00049194.JPEG n02489166/
+mv val/ILSVRC2012_val_00049195.JPEG n12985857/
+mv val/ILSVRC2012_val_00049196.JPEG n02092339/
+mv val/ILSVRC2012_val_00049197.JPEG n03995372/
+mv val/ILSVRC2012_val_00049198.JPEG n02089078/
+mv val/ILSVRC2012_val_00049199.JPEG n03709823/
+mv val/ILSVRC2012_val_00049200.JPEG n02111500/
+mv val/ILSVRC2012_val_00049201.JPEG n02268443/
+mv val/ILSVRC2012_val_00049202.JPEG n02410509/
+mv val/ILSVRC2012_val_00049203.JPEG n01798484/
+mv val/ILSVRC2012_val_00049204.JPEG n03720891/
+mv val/ILSVRC2012_val_00049205.JPEG n03868863/
+mv val/ILSVRC2012_val_00049206.JPEG n02092002/
+mv val/ILSVRC2012_val_00049207.JPEG n03018349/
+mv val/ILSVRC2012_val_00049208.JPEG n04487394/
+mv val/ILSVRC2012_val_00049209.JPEG n03240683/
+mv val/ILSVRC2012_val_00049210.JPEG n03803284/
+mv val/ILSVRC2012_val_00049211.JPEG n07579787/
+mv val/ILSVRC2012_val_00049212.JPEG n02804414/
+mv val/ILSVRC2012_val_00049213.JPEG n03887697/
+mv val/ILSVRC2012_val_00049214.JPEG n04542943/
+mv val/ILSVRC2012_val_00049215.JPEG n02113023/
+mv val/ILSVRC2012_val_00049216.JPEG n02607072/
+mv val/ILSVRC2012_val_00049217.JPEG n01882714/
+mv val/ILSVRC2012_val_00049218.JPEG n02102040/
+mv val/ILSVRC2012_val_00049219.JPEG n07697537/
+mv val/ILSVRC2012_val_00049220.JPEG n02443114/
+mv val/ILSVRC2012_val_00049221.JPEG n01986214/
+mv val/ILSVRC2012_val_00049222.JPEG n02777292/
+mv val/ILSVRC2012_val_00049223.JPEG n02939185/
+mv val/ILSVRC2012_val_00049224.JPEG n02009229/
+mv val/ILSVRC2012_val_00049225.JPEG n03769881/
+mv val/ILSVRC2012_val_00049226.JPEG n04554684/
+mv val/ILSVRC2012_val_00049227.JPEG n02037110/
+mv val/ILSVRC2012_val_00049228.JPEG n02817516/
+mv val/ILSVRC2012_val_00049229.JPEG n02089078/
+mv val/ILSVRC2012_val_00049230.JPEG n03691459/
+mv val/ILSVRC2012_val_00049231.JPEG n03680355/
+mv val/ILSVRC2012_val_00049232.JPEG n04591713/
+mv val/ILSVRC2012_val_00049233.JPEG n03804744/
+mv val/ILSVRC2012_val_00049234.JPEG n03617480/
+mv val/ILSVRC2012_val_00049235.JPEG n01795545/
+mv val/ILSVRC2012_val_00049236.JPEG n02865351/
+mv val/ILSVRC2012_val_00049237.JPEG n02840245/
+mv val/ILSVRC2012_val_00049238.JPEG n02909870/
+mv val/ILSVRC2012_val_00049239.JPEG n02101006/
+mv val/ILSVRC2012_val_00049240.JPEG n04208210/
+mv val/ILSVRC2012_val_00049241.JPEG n04487081/
+mv val/ILSVRC2012_val_00049242.JPEG n02111889/
+mv val/ILSVRC2012_val_00049243.JPEG n04264628/
+mv val/ILSVRC2012_val_00049244.JPEG n01629819/
+mv val/ILSVRC2012_val_00049245.JPEG n02111129/
+mv val/ILSVRC2012_val_00049246.JPEG n12768682/
+mv val/ILSVRC2012_val_00049247.JPEG n03134739/
+mv val/ILSVRC2012_val_00049248.JPEG n03075370/
+mv val/ILSVRC2012_val_00049249.JPEG n13037406/
+mv val/ILSVRC2012_val_00049250.JPEG n02100735/
+mv val/ILSVRC2012_val_00049251.JPEG n04330267/
+mv val/ILSVRC2012_val_00049252.JPEG n04540053/
+mv val/ILSVRC2012_val_00049253.JPEG n01498041/
+mv val/ILSVRC2012_val_00049254.JPEG n03874599/
+mv val/ILSVRC2012_val_00049255.JPEG n03874599/
+mv val/ILSVRC2012_val_00049256.JPEG n04485082/
+mv val/ILSVRC2012_val_00049257.JPEG n03095699/
+mv val/ILSVRC2012_val_00049258.JPEG n04252225/
+mv val/ILSVRC2012_val_00049259.JPEG n02172182/
+mv val/ILSVRC2012_val_00049260.JPEG n01667114/
+mv val/ILSVRC2012_val_00049261.JPEG n04557648/
+mv val/ILSVRC2012_val_00049262.JPEG n02119022/
+mv val/ILSVRC2012_val_00049263.JPEG n02091467/
+mv val/ILSVRC2012_val_00049264.JPEG n04350905/
+mv val/ILSVRC2012_val_00049265.JPEG n01817953/
+mv val/ILSVRC2012_val_00049266.JPEG n01985128/
+mv val/ILSVRC2012_val_00049267.JPEG n04067472/
+mv val/ILSVRC2012_val_00049268.JPEG n02504013/
+mv val/ILSVRC2012_val_00049269.JPEG n04476259/
+mv val/ILSVRC2012_val_00049270.JPEG n09229709/
+mv val/ILSVRC2012_val_00049271.JPEG n02865351/
+mv val/ILSVRC2012_val_00049272.JPEG n02105251/
+mv val/ILSVRC2012_val_00049273.JPEG n03255030/
+mv val/ILSVRC2012_val_00049274.JPEG n02325366/
+mv val/ILSVRC2012_val_00049275.JPEG n04200800/
+mv val/ILSVRC2012_val_00049276.JPEG n03065424/
+mv val/ILSVRC2012_val_00049277.JPEG n04330267/
+mv val/ILSVRC2012_val_00049278.JPEG n02403003/
+mv val/ILSVRC2012_val_00049279.JPEG n02123159/
+mv val/ILSVRC2012_val_00049280.JPEG n02326432/
+mv val/ILSVRC2012_val_00049281.JPEG n02097130/
+mv val/ILSVRC2012_val_00049282.JPEG n02966687/
+mv val/ILSVRC2012_val_00049283.JPEG n04591157/
+mv val/ILSVRC2012_val_00049284.JPEG n03538406/
+mv val/ILSVRC2012_val_00049285.JPEG n02107908/
+mv val/ILSVRC2012_val_00049286.JPEG n02009912/
+mv val/ILSVRC2012_val_00049287.JPEG n01644900/
+mv val/ILSVRC2012_val_00049288.JPEG n02356798/
+mv val/ILSVRC2012_val_00049289.JPEG n04201297/
+mv val/ILSVRC2012_val_00049290.JPEG n04235860/
+mv val/ILSVRC2012_val_00049291.JPEG n02110185/
+mv val/ILSVRC2012_val_00049292.JPEG n03544143/
+mv val/ILSVRC2012_val_00049293.JPEG n02787622/
+mv val/ILSVRC2012_val_00049294.JPEG n04296562/
+mv val/ILSVRC2012_val_00049295.JPEG n02804414/
+mv val/ILSVRC2012_val_00049296.JPEG n02114367/
+mv val/ILSVRC2012_val_00049297.JPEG n02894605/
+mv val/ILSVRC2012_val_00049298.JPEG n02119022/
+mv val/ILSVRC2012_val_00049299.JPEG n02965783/
+mv val/ILSVRC2012_val_00049300.JPEG n03837869/
+mv val/ILSVRC2012_val_00049301.JPEG n01955084/
+mv val/ILSVRC2012_val_00049302.JPEG n02701002/
+mv val/ILSVRC2012_val_00049303.JPEG n02137549/
+mv val/ILSVRC2012_val_00049304.JPEG n03794056/
+mv val/ILSVRC2012_val_00049305.JPEG n03759954/
+mv val/ILSVRC2012_val_00049306.JPEG n03956157/
+mv val/ILSVRC2012_val_00049307.JPEG n03461385/
+mv val/ILSVRC2012_val_00049308.JPEG n02939185/
+mv val/ILSVRC2012_val_00049309.JPEG n07892512/
+mv val/ILSVRC2012_val_00049310.JPEG n07715103/
+mv val/ILSVRC2012_val_00049311.JPEG n01742172/
+mv val/ILSVRC2012_val_00049312.JPEG n04350905/
+mv val/ILSVRC2012_val_00049313.JPEG n01817953/
+mv val/ILSVRC2012_val_00049314.JPEG n02865351/
+mv val/ILSVRC2012_val_00049315.JPEG n02002556/
+mv val/ILSVRC2012_val_00049316.JPEG n01644900/
+mv val/ILSVRC2012_val_00049317.JPEG n02795169/
+mv val/ILSVRC2012_val_00049318.JPEG n03617480/
+mv val/ILSVRC2012_val_00049319.JPEG n03207743/
+mv val/ILSVRC2012_val_00049320.JPEG n02403003/
+mv val/ILSVRC2012_val_00049321.JPEG n03109150/
+mv val/ILSVRC2012_val_00049322.JPEG n03590841/
+mv val/ILSVRC2012_val_00049323.JPEG n02480855/
+mv val/ILSVRC2012_val_00049324.JPEG n02091032/
+mv val/ILSVRC2012_val_00049325.JPEG n07584110/
+mv val/ILSVRC2012_val_00049326.JPEG n02102318/
+mv val/ILSVRC2012_val_00049327.JPEG n02111277/
+mv val/ILSVRC2012_val_00049328.JPEG n02692877/
+mv val/ILSVRC2012_val_00049329.JPEG n04604644/
+mv val/ILSVRC2012_val_00049330.JPEG n03793489/
+mv val/ILSVRC2012_val_00049331.JPEG n01877812/
+mv val/ILSVRC2012_val_00049332.JPEG n02412080/
+mv val/ILSVRC2012_val_00049333.JPEG n01698640/
+mv val/ILSVRC2012_val_00049334.JPEG n02110806/
+mv val/ILSVRC2012_val_00049335.JPEG n04019541/
+mv val/ILSVRC2012_val_00049336.JPEG n04476259/
+mv val/ILSVRC2012_val_00049337.JPEG n04584207/
+mv val/ILSVRC2012_val_00049338.JPEG n02012849/
+mv val/ILSVRC2012_val_00049339.JPEG n03720891/
+mv val/ILSVRC2012_val_00049340.JPEG n04311174/
+mv val/ILSVRC2012_val_00049341.JPEG n03459775/
+mv val/ILSVRC2012_val_00049342.JPEG n03781244/
+mv val/ILSVRC2012_val_00049343.JPEG n09428293/
+mv val/ILSVRC2012_val_00049344.JPEG n02106550/
+mv val/ILSVRC2012_val_00049345.JPEG n02132136/
+mv val/ILSVRC2012_val_00049346.JPEG n03630383/
+mv val/ILSVRC2012_val_00049347.JPEG n02128925/
+mv val/ILSVRC2012_val_00049348.JPEG n03903868/
+mv val/ILSVRC2012_val_00049349.JPEG n03814639/
+mv val/ILSVRC2012_val_00049350.JPEG n01630670/
+mv val/ILSVRC2012_val_00049351.JPEG n02106550/
+mv val/ILSVRC2012_val_00049352.JPEG n01855672/
+mv val/ILSVRC2012_val_00049353.JPEG n01807496/
+mv val/ILSVRC2012_val_00049354.JPEG n02088364/
+mv val/ILSVRC2012_val_00049355.JPEG n03290653/
+mv val/ILSVRC2012_val_00049356.JPEG n02109525/
+mv val/ILSVRC2012_val_00049357.JPEG n03902125/
+mv val/ILSVRC2012_val_00049358.JPEG n07583066/
+mv val/ILSVRC2012_val_00049359.JPEG n04542943/
+mv val/ILSVRC2012_val_00049360.JPEG n03937543/
+mv val/ILSVRC2012_val_00049361.JPEG n07583066/
+mv val/ILSVRC2012_val_00049362.JPEG n04008634/
+mv val/ILSVRC2012_val_00049363.JPEG n04532670/
+mv val/ILSVRC2012_val_00049364.JPEG n02095314/
+mv val/ILSVRC2012_val_00049365.JPEG n04118538/
+mv val/ILSVRC2012_val_00049366.JPEG n07584110/
+mv val/ILSVRC2012_val_00049367.JPEG n02747177/
+mv val/ILSVRC2012_val_00049368.JPEG n03929855/
+mv val/ILSVRC2012_val_00049369.JPEG n01950731/
+mv val/ILSVRC2012_val_00049370.JPEG n07742313/
+mv val/ILSVRC2012_val_00049371.JPEG n03649909/
+mv val/ILSVRC2012_val_00049372.JPEG n02319095/
+mv val/ILSVRC2012_val_00049373.JPEG n01697457/
+mv val/ILSVRC2012_val_00049374.JPEG n02092339/
+mv val/ILSVRC2012_val_00049375.JPEG n09332890/
+mv val/ILSVRC2012_val_00049376.JPEG n04347754/
+mv val/ILSVRC2012_val_00049377.JPEG n02480495/
+mv val/ILSVRC2012_val_00049378.JPEG n03478589/
+mv val/ILSVRC2012_val_00049379.JPEG n07880968/
+mv val/ILSVRC2012_val_00049380.JPEG n03935335/
+mv val/ILSVRC2012_val_00049381.JPEG n03976657/
+mv val/ILSVRC2012_val_00049382.JPEG n02835271/
+mv val/ILSVRC2012_val_00049383.JPEG n04367480/
+mv val/ILSVRC2012_val_00049384.JPEG n02177972/
+mv val/ILSVRC2012_val_00049385.JPEG n04070727/
+mv val/ILSVRC2012_val_00049386.JPEG n04277352/
+mv val/ILSVRC2012_val_00049387.JPEG n04125021/
+mv val/ILSVRC2012_val_00049388.JPEG n03134739/
+mv val/ILSVRC2012_val_00049389.JPEG n02128757/
+mv val/ILSVRC2012_val_00049390.JPEG n02504013/
+mv val/ILSVRC2012_val_00049391.JPEG n04111531/
+mv val/ILSVRC2012_val_00049392.JPEG n04152593/
+mv val/ILSVRC2012_val_00049393.JPEG n04591713/
+mv val/ILSVRC2012_val_00049394.JPEG n03400231/
+mv val/ILSVRC2012_val_00049395.JPEG n01704323/
+mv val/ILSVRC2012_val_00049396.JPEG n12768682/
+mv val/ILSVRC2012_val_00049397.JPEG n02110806/
+mv val/ILSVRC2012_val_00049398.JPEG n04418357/
+mv val/ILSVRC2012_val_00049399.JPEG n02536864/
+mv val/ILSVRC2012_val_00049400.JPEG n04409515/
+mv val/ILSVRC2012_val_00049401.JPEG n04542943/
+mv val/ILSVRC2012_val_00049402.JPEG n03763968/
+mv val/ILSVRC2012_val_00049403.JPEG n03662601/
+mv val/ILSVRC2012_val_00049404.JPEG n02490219/
+mv val/ILSVRC2012_val_00049405.JPEG n02086240/
+mv val/ILSVRC2012_val_00049406.JPEG n04404412/
+mv val/ILSVRC2012_val_00049407.JPEG n07718747/
+mv val/ILSVRC2012_val_00049408.JPEG n02096051/
+mv val/ILSVRC2012_val_00049409.JPEG n04599235/
+mv val/ILSVRC2012_val_00049410.JPEG n01944390/
+mv val/ILSVRC2012_val_00049411.JPEG n01990800/
+mv val/ILSVRC2012_val_00049412.JPEG n04152593/
+mv val/ILSVRC2012_val_00049413.JPEG n02807133/
+mv val/ILSVRC2012_val_00049414.JPEG n02086910/
+mv val/ILSVRC2012_val_00049415.JPEG n03347037/
+mv val/ILSVRC2012_val_00049416.JPEG n01847000/
+mv val/ILSVRC2012_val_00049417.JPEG n02107683/
+mv val/ILSVRC2012_val_00049418.JPEG n02279972/
+mv val/ILSVRC2012_val_00049419.JPEG n04019541/
+mv val/ILSVRC2012_val_00049420.JPEG n01695060/
+mv val/ILSVRC2012_val_00049421.JPEG n02087046/
+mv val/ILSVRC2012_val_00049422.JPEG n03891251/
+mv val/ILSVRC2012_val_00049423.JPEG n04154565/
+mv val/ILSVRC2012_val_00049424.JPEG n04398044/
+mv val/ILSVRC2012_val_00049425.JPEG n02504013/
+mv val/ILSVRC2012_val_00049426.JPEG n02138441/
+mv val/ILSVRC2012_val_00049427.JPEG n04285008/
+mv val/ILSVRC2012_val_00049428.JPEG n03942813/
+mv val/ILSVRC2012_val_00049429.JPEG n04239074/
+mv val/ILSVRC2012_val_00049430.JPEG n02704792/
+mv val/ILSVRC2012_val_00049431.JPEG n03794056/
+mv val/ILSVRC2012_val_00049432.JPEG n04476259/
+mv val/ILSVRC2012_val_00049433.JPEG n04483307/
+mv val/ILSVRC2012_val_00049434.JPEG n03982430/
+mv val/ILSVRC2012_val_00049435.JPEG n02109047/
+mv val/ILSVRC2012_val_00049436.JPEG n11939491/
+mv val/ILSVRC2012_val_00049437.JPEG n04335435/
+mv val/ILSVRC2012_val_00049438.JPEG n02727426/
+mv val/ILSVRC2012_val_00049439.JPEG n03781244/
+mv val/ILSVRC2012_val_00049440.JPEG n01978455/
+mv val/ILSVRC2012_val_00049441.JPEG n03887697/
+mv val/ILSVRC2012_val_00049442.JPEG n02268853/
+mv val/ILSVRC2012_val_00049443.JPEG n02607072/
+mv val/ILSVRC2012_val_00049444.JPEG n02009229/
+mv val/ILSVRC2012_val_00049445.JPEG n04371774/
+mv val/ILSVRC2012_val_00049446.JPEG n07892512/
+mv val/ILSVRC2012_val_00049447.JPEG n04523525/
+mv val/ILSVRC2012_val_00049448.JPEG n01748264/
+mv val/ILSVRC2012_val_00049449.JPEG n03924679/
+mv val/ILSVRC2012_val_00049450.JPEG n04200800/
+mv val/ILSVRC2012_val_00049451.JPEG n04026417/
+mv val/ILSVRC2012_val_00049452.JPEG n04208210/
+mv val/ILSVRC2012_val_00049453.JPEG n04548362/
+mv val/ILSVRC2012_val_00049454.JPEG n04389033/
+mv val/ILSVRC2012_val_00049455.JPEG n04152593/
+mv val/ILSVRC2012_val_00049456.JPEG n02910353/
+mv val/ILSVRC2012_val_00049457.JPEG n07697313/
+mv val/ILSVRC2012_val_00049458.JPEG n03196217/
+mv val/ILSVRC2012_val_00049459.JPEG n04200800/
+mv val/ILSVRC2012_val_00049460.JPEG n02279972/
+mv val/ILSVRC2012_val_00049461.JPEG n01917289/
+mv val/ILSVRC2012_val_00049462.JPEG n02488291/
+mv val/ILSVRC2012_val_00049463.JPEG n02808304/
+mv val/ILSVRC2012_val_00049464.JPEG n03992509/
+mv val/ILSVRC2012_val_00049465.JPEG n02804414/
+mv val/ILSVRC2012_val_00049466.JPEG n01774750/
+mv val/ILSVRC2012_val_00049467.JPEG n04442312/
+mv val/ILSVRC2012_val_00049468.JPEG n03535780/
+mv val/ILSVRC2012_val_00049469.JPEG n02802426/
+mv val/ILSVRC2012_val_00049470.JPEG n04044716/
+mv val/ILSVRC2012_val_00049471.JPEG n02128385/
+mv val/ILSVRC2012_val_00049472.JPEG n07697313/
+mv val/ILSVRC2012_val_00049473.JPEG n04179913/
+mv val/ILSVRC2012_val_00049474.JPEG n03400231/
+mv val/ILSVRC2012_val_00049475.JPEG n03095699/
+mv val/ILSVRC2012_val_00049476.JPEG n03871628/
+mv val/ILSVRC2012_val_00049477.JPEG n02129165/
+mv val/ILSVRC2012_val_00049478.JPEG n01773797/
+mv val/ILSVRC2012_val_00049479.JPEG n03691459/
+mv val/ILSVRC2012_val_00049480.JPEG n02018795/
+mv val/ILSVRC2012_val_00049481.JPEG n04116512/
+mv val/ILSVRC2012_val_00049482.JPEG n03089624/
+mv val/ILSVRC2012_val_00049483.JPEG n02127052/
+mv val/ILSVRC2012_val_00049484.JPEG n02111129/
+mv val/ILSVRC2012_val_00049485.JPEG n02093256/
+mv val/ILSVRC2012_val_00049486.JPEG n03742115/
+mv val/ILSVRC2012_val_00049487.JPEG n04429376/
+mv val/ILSVRC2012_val_00049488.JPEG n02009229/
+mv val/ILSVRC2012_val_00049489.JPEG n02815834/
+mv val/ILSVRC2012_val_00049490.JPEG n07747607/
+mv val/ILSVRC2012_val_00049491.JPEG n03481172/
+mv val/ILSVRC2012_val_00049492.JPEG n03220513/
+mv val/ILSVRC2012_val_00049493.JPEG n03495258/
+mv val/ILSVRC2012_val_00049494.JPEG n02974003/
+mv val/ILSVRC2012_val_00049495.JPEG n01704323/
+mv val/ILSVRC2012_val_00049496.JPEG n04277352/
+mv val/ILSVRC2012_val_00049497.JPEG n07684084/
+mv val/ILSVRC2012_val_00049498.JPEG n02107574/
+mv val/ILSVRC2012_val_00049499.JPEG n02276258/
+mv val/ILSVRC2012_val_00049500.JPEG n12998815/
+mv val/ILSVRC2012_val_00049501.JPEG n03617480/
+mv val/ILSVRC2012_val_00049502.JPEG n03721384/
+mv val/ILSVRC2012_val_00049503.JPEG n02992529/
+mv val/ILSVRC2012_val_00049504.JPEG n02321529/
+mv val/ILSVRC2012_val_00049505.JPEG n03933933/
+mv val/ILSVRC2012_val_00049506.JPEG n03764736/
+mv val/ILSVRC2012_val_00049507.JPEG n03764736/
+mv val/ILSVRC2012_val_00049508.JPEG n02317335/
+mv val/ILSVRC2012_val_00049509.JPEG n04235860/
+mv val/ILSVRC2012_val_00049510.JPEG n02808440/
+mv val/ILSVRC2012_val_00049511.JPEG n02110341/
+mv val/ILSVRC2012_val_00049512.JPEG n04542943/
+mv val/ILSVRC2012_val_00049513.JPEG n02442845/
+mv val/ILSVRC2012_val_00049514.JPEG n02869837/
+mv val/ILSVRC2012_val_00049515.JPEG n01742172/
+mv val/ILSVRC2012_val_00049516.JPEG n02088632/
+mv val/ILSVRC2012_val_00049517.JPEG n02120079/
+mv val/ILSVRC2012_val_00049518.JPEG n04259630/
+mv val/ILSVRC2012_val_00049519.JPEG n03447447/
+mv val/ILSVRC2012_val_00049520.JPEG n03876231/
+mv val/ILSVRC2012_val_00049521.JPEG n02037110/
+mv val/ILSVRC2012_val_00049522.JPEG n01914609/
+mv val/ILSVRC2012_val_00049523.JPEG n02102040/
+mv val/ILSVRC2012_val_00049524.JPEG n13054560/
+mv val/ILSVRC2012_val_00049525.JPEG n03930630/
+mv val/ILSVRC2012_val_00049526.JPEG n03759954/
+mv val/ILSVRC2012_val_00049527.JPEG n07584110/
+mv val/ILSVRC2012_val_00049528.JPEG n04259630/
+mv val/ILSVRC2012_val_00049529.JPEG n03291819/
+mv val/ILSVRC2012_val_00049530.JPEG n07697537/
+mv val/ILSVRC2012_val_00049531.JPEG n01614925/
+mv val/ILSVRC2012_val_00049532.JPEG n03814906/
+mv val/ILSVRC2012_val_00049533.JPEG n04540053/
+mv val/ILSVRC2012_val_00049534.JPEG n02116738/
+mv val/ILSVRC2012_val_00049535.JPEG n01776313/
+mv val/ILSVRC2012_val_00049536.JPEG n03954731/
+mv val/ILSVRC2012_val_00049537.JPEG n04479046/
+mv val/ILSVRC2012_val_00049538.JPEG n03658185/
+mv val/ILSVRC2012_val_00049539.JPEG n04357314/
+mv val/ILSVRC2012_val_00049540.JPEG n03763968/
+mv val/ILSVRC2012_val_00049541.JPEG n01755581/
+mv val/ILSVRC2012_val_00049542.JPEG n01749939/
+mv val/ILSVRC2012_val_00049543.JPEG n02981792/
+mv val/ILSVRC2012_val_00049544.JPEG n03485407/
+mv val/ILSVRC2012_val_00049545.JPEG n02442845/
+mv val/ILSVRC2012_val_00049546.JPEG n04548280/
+mv val/ILSVRC2012_val_00049547.JPEG n07880968/
+mv val/ILSVRC2012_val_00049548.JPEG n02825657/
+mv val/ILSVRC2012_val_00049549.JPEG n09332890/
+mv val/ILSVRC2012_val_00049550.JPEG n04596742/
+mv val/ILSVRC2012_val_00049551.JPEG n04596742/
+mv val/ILSVRC2012_val_00049552.JPEG n02930766/
+mv val/ILSVRC2012_val_00049553.JPEG n01843383/
+mv val/ILSVRC2012_val_00049554.JPEG n03532672/
+mv val/ILSVRC2012_val_00049555.JPEG n13133613/
+mv val/ILSVRC2012_val_00049556.JPEG n02963159/
+mv val/ILSVRC2012_val_00049557.JPEG n03759954/
+mv val/ILSVRC2012_val_00049558.JPEG n02098413/
+mv val/ILSVRC2012_val_00049559.JPEG n04367480/
+mv val/ILSVRC2012_val_00049560.JPEG n02643566/
+mv val/ILSVRC2012_val_00049561.JPEG n04254777/
+mv val/ILSVRC2012_val_00049562.JPEG n02415577/
+mv val/ILSVRC2012_val_00049563.JPEG n04560804/
+mv val/ILSVRC2012_val_00049564.JPEG n04485082/
+mv val/ILSVRC2012_val_00049565.JPEG n03781244/
+mv val/ILSVRC2012_val_00049566.JPEG n04597913/
+mv val/ILSVRC2012_val_00049567.JPEG n04482393/
+mv val/ILSVRC2012_val_00049568.JPEG n01530575/
+mv val/ILSVRC2012_val_00049569.JPEG n03250847/
+mv val/ILSVRC2012_val_00049570.JPEG n02108089/
+mv val/ILSVRC2012_val_00049571.JPEG n04404412/
+mv val/ILSVRC2012_val_00049572.JPEG n02687172/
+mv val/ILSVRC2012_val_00049573.JPEG n03786901/
+mv val/ILSVRC2012_val_00049574.JPEG n02108000/
+mv val/ILSVRC2012_val_00049575.JPEG n02687172/
+mv val/ILSVRC2012_val_00049576.JPEG n02317335/
+mv val/ILSVRC2012_val_00049577.JPEG n02606052/
+mv val/ILSVRC2012_val_00049578.JPEG n02165105/
+mv val/ILSVRC2012_val_00049579.JPEG n03045698/
+mv val/ILSVRC2012_val_00049580.JPEG n03218198/
+mv val/ILSVRC2012_val_00049581.JPEG n02415577/
+mv val/ILSVRC2012_val_00049582.JPEG n04069434/
+mv val/ILSVRC2012_val_00049583.JPEG n04482393/
+mv val/ILSVRC2012_val_00049584.JPEG n01806143/
+mv val/ILSVRC2012_val_00049585.JPEG n01443537/
+mv val/ILSVRC2012_val_00049586.JPEG n02100735/
+mv val/ILSVRC2012_val_00049587.JPEG n04153751/
+mv val/ILSVRC2012_val_00049588.JPEG n04254777/
+mv val/ILSVRC2012_val_00049589.JPEG n02091467/
+mv val/ILSVRC2012_val_00049590.JPEG n03482405/
+mv val/ILSVRC2012_val_00049591.JPEG n02794156/
+mv val/ILSVRC2012_val_00049592.JPEG n07754684/
+mv val/ILSVRC2012_val_00049593.JPEG n03495258/
+mv val/ILSVRC2012_val_00049594.JPEG n04542943/
+mv val/ILSVRC2012_val_00049595.JPEG n01797886/
+mv val/ILSVRC2012_val_00049596.JPEG n03085013/
+mv val/ILSVRC2012_val_00049597.JPEG n03792972/
+mv val/ILSVRC2012_val_00049598.JPEG n01980166/
+mv val/ILSVRC2012_val_00049599.JPEG n02782093/
+mv val/ILSVRC2012_val_00049600.JPEG n03920288/
+mv val/ILSVRC2012_val_00049601.JPEG n03666591/
+mv val/ILSVRC2012_val_00049602.JPEG n01695060/
+mv val/ILSVRC2012_val_00049603.JPEG n02486410/
+mv val/ILSVRC2012_val_00049604.JPEG n02088364/
+mv val/ILSVRC2012_val_00049605.JPEG n02389026/
+mv val/ILSVRC2012_val_00049606.JPEG n07753592/
+mv val/ILSVRC2012_val_00049607.JPEG n07248320/
+mv val/ILSVRC2012_val_00049608.JPEG n03355925/
+mv val/ILSVRC2012_val_00049609.JPEG n01737021/
+mv val/ILSVRC2012_val_00049610.JPEG n04266014/
+mv val/ILSVRC2012_val_00049611.JPEG n02167151/
+mv val/ILSVRC2012_val_00049612.JPEG n03930630/
+mv val/ILSVRC2012_val_00049613.JPEG n02133161/
+mv val/ILSVRC2012_val_00049614.JPEG n02107142/
+mv val/ILSVRC2012_val_00049615.JPEG n03180011/
+mv val/ILSVRC2012_val_00049616.JPEG n04023962/
+mv val/ILSVRC2012_val_00049617.JPEG n01443537/
+mv val/ILSVRC2012_val_00049618.JPEG n02443114/
+mv val/ILSVRC2012_val_00049619.JPEG n02892201/
+mv val/ILSVRC2012_val_00049620.JPEG n03109150/
+mv val/ILSVRC2012_val_00049621.JPEG n01872401/
+mv val/ILSVRC2012_val_00049622.JPEG n07565083/
+mv val/ILSVRC2012_val_00049623.JPEG n02815834/
+mv val/ILSVRC2012_val_00049624.JPEG n02206856/
+mv val/ILSVRC2012_val_00049625.JPEG n03729826/
+mv val/ILSVRC2012_val_00049626.JPEG n10565667/
+mv val/ILSVRC2012_val_00049627.JPEG n02111129/
+mv val/ILSVRC2012_val_00049628.JPEG n02704792/
+mv val/ILSVRC2012_val_00049629.JPEG n02117135/
+mv val/ILSVRC2012_val_00049630.JPEG n03000247/
+mv val/ILSVRC2012_val_00049631.JPEG n02129604/
+mv val/ILSVRC2012_val_00049632.JPEG n04550184/
+mv val/ILSVRC2012_val_00049633.JPEG n03089624/
+mv val/ILSVRC2012_val_00049634.JPEG n03785016/
+mv val/ILSVRC2012_val_00049635.JPEG n01689811/
+mv val/ILSVRC2012_val_00049636.JPEG n02441942/
+mv val/ILSVRC2012_val_00049637.JPEG n01641577/
+mv val/ILSVRC2012_val_00049638.JPEG n02229544/
+mv val/ILSVRC2012_val_00049639.JPEG n01622779/
+mv val/ILSVRC2012_val_00049640.JPEG n02089973/
+mv val/ILSVRC2012_val_00049641.JPEG n02791270/
+mv val/ILSVRC2012_val_00049642.JPEG n02102177/
+mv val/ILSVRC2012_val_00049643.JPEG n02114855/
+mv val/ILSVRC2012_val_00049644.JPEG n13040303/
+mv val/ILSVRC2012_val_00049645.JPEG n03944341/
+mv val/ILSVRC2012_val_00049646.JPEG n01667114/
+mv val/ILSVRC2012_val_00049647.JPEG n04149813/
+mv val/ILSVRC2012_val_00049648.JPEG n03792972/
+mv val/ILSVRC2012_val_00049649.JPEG n02869837/
+mv val/ILSVRC2012_val_00049650.JPEG n02112706/
+mv val/ILSVRC2012_val_00049651.JPEG n13044778/
+mv val/ILSVRC2012_val_00049652.JPEG n01688243/
+mv val/ILSVRC2012_val_00049653.JPEG n02097658/
+mv val/ILSVRC2012_val_00049654.JPEG n02109961/
+mv val/ILSVRC2012_val_00049655.JPEG n03791053/
+mv val/ILSVRC2012_val_00049656.JPEG n04286575/
+mv val/ILSVRC2012_val_00049657.JPEG n01985128/
+mv val/ILSVRC2012_val_00049658.JPEG n03014705/
+mv val/ILSVRC2012_val_00049659.JPEG n04265275/
+mv val/ILSVRC2012_val_00049660.JPEG n04467665/
+mv val/ILSVRC2012_val_00049661.JPEG n01985128/
+mv val/ILSVRC2012_val_00049662.JPEG n04344873/
+mv val/ILSVRC2012_val_00049663.JPEG n04335435/
+mv val/ILSVRC2012_val_00049664.JPEG n02676566/
+mv val/ILSVRC2012_val_00049665.JPEG n01806143/
+mv val/ILSVRC2012_val_00049666.JPEG n04599235/
+mv val/ILSVRC2012_val_00049667.JPEG n02093859/
+mv val/ILSVRC2012_val_00049668.JPEG n04486054/
+mv val/ILSVRC2012_val_00049669.JPEG n01601694/
+mv val/ILSVRC2012_val_00049670.JPEG n02966193/
+mv val/ILSVRC2012_val_00049671.JPEG n02965783/
+mv val/ILSVRC2012_val_00049672.JPEG n02099712/
+mv val/ILSVRC2012_val_00049673.JPEG n02808440/
+mv val/ILSVRC2012_val_00049674.JPEG n03785016/
+mv val/ILSVRC2012_val_00049675.JPEG n04285008/
+mv val/ILSVRC2012_val_00049676.JPEG n04141076/
+mv val/ILSVRC2012_val_00049677.JPEG n07760859/
+mv val/ILSVRC2012_val_00049678.JPEG n03717622/
+mv val/ILSVRC2012_val_00049679.JPEG n01917289/
+mv val/ILSVRC2012_val_00049680.JPEG n03942813/
+mv val/ILSVRC2012_val_00049681.JPEG n04409515/
+mv val/ILSVRC2012_val_00049682.JPEG n01819313/
+mv val/ILSVRC2012_val_00049683.JPEG n03255030/
+mv val/ILSVRC2012_val_00049684.JPEG n02328150/
+mv val/ILSVRC2012_val_00049685.JPEG n07590611/
+mv val/ILSVRC2012_val_00049686.JPEG n01985128/
+mv val/ILSVRC2012_val_00049687.JPEG n03998194/
+mv val/ILSVRC2012_val_00049688.JPEG n12985857/
+mv val/ILSVRC2012_val_00049689.JPEG n03014705/
+mv val/ILSVRC2012_val_00049690.JPEG n02823428/
+mv val/ILSVRC2012_val_00049691.JPEG n03127747/
+mv val/ILSVRC2012_val_00049692.JPEG n02825657/
+mv val/ILSVRC2012_val_00049693.JPEG n03935335/
+mv val/ILSVRC2012_val_00049694.JPEG n02793495/
+mv val/ILSVRC2012_val_00049695.JPEG n04509417/
+mv val/ILSVRC2012_val_00049696.JPEG n02655020/
+mv val/ILSVRC2012_val_00049697.JPEG n07873807/
+mv val/ILSVRC2012_val_00049698.JPEG n02906734/
+mv val/ILSVRC2012_val_00049699.JPEG n03720891/
+mv val/ILSVRC2012_val_00049700.JPEG n04037443/
+mv val/ILSVRC2012_val_00049701.JPEG n04254120/
+mv val/ILSVRC2012_val_00049702.JPEG n07614500/
+mv val/ILSVRC2012_val_00049703.JPEG n01667114/
+mv val/ILSVRC2012_val_00049704.JPEG n02415577/
+mv val/ILSVRC2012_val_00049705.JPEG n03710637/
+mv val/ILSVRC2012_val_00049706.JPEG n02361337/
+mv val/ILSVRC2012_val_00049707.JPEG n04081281/
+mv val/ILSVRC2012_val_00049708.JPEG n04070727/
+mv val/ILSVRC2012_val_00049709.JPEG n03649909/
+mv val/ILSVRC2012_val_00049710.JPEG n07720875/
+mv val/ILSVRC2012_val_00049711.JPEG n02011460/
+mv val/ILSVRC2012_val_00049712.JPEG n01443537/
+mv val/ILSVRC2012_val_00049713.JPEG n04525305/
+mv val/ILSVRC2012_val_00049714.JPEG n02894605/
+mv val/ILSVRC2012_val_00049715.JPEG n02113712/
+mv val/ILSVRC2012_val_00049716.JPEG n09229709/
+mv val/ILSVRC2012_val_00049717.JPEG n04367480/
+mv val/ILSVRC2012_val_00049718.JPEG n04266014/
+mv val/ILSVRC2012_val_00049719.JPEG n02105056/
+mv val/ILSVRC2012_val_00049720.JPEG n09421951/
+mv val/ILSVRC2012_val_00049721.JPEG n02814860/
+mv val/ILSVRC2012_val_00049722.JPEG n02167151/
+mv val/ILSVRC2012_val_00049723.JPEG n01744401/
+mv val/ILSVRC2012_val_00049724.JPEG n02808304/
+mv val/ILSVRC2012_val_00049725.JPEG n02106030/
+mv val/ILSVRC2012_val_00049726.JPEG n02074367/
+mv val/ILSVRC2012_val_00049727.JPEG n02536864/
+mv val/ILSVRC2012_val_00049728.JPEG n04485082/
+mv val/ILSVRC2012_val_00049729.JPEG n03538406/
+mv val/ILSVRC2012_val_00049730.JPEG n02108915/
+mv val/ILSVRC2012_val_00049731.JPEG n02114548/
+mv val/ILSVRC2012_val_00049732.JPEG n01698640/
+mv val/ILSVRC2012_val_00049733.JPEG n04286575/
+mv val/ILSVRC2012_val_00049734.JPEG n02797295/
+mv val/ILSVRC2012_val_00049735.JPEG n02124075/
+mv val/ILSVRC2012_val_00049736.JPEG n02927161/
+mv val/ILSVRC2012_val_00049737.JPEG n02747177/
+mv val/ILSVRC2012_val_00049738.JPEG n02641379/
+mv val/ILSVRC2012_val_00049739.JPEG n02325366/
+mv val/ILSVRC2012_val_00049740.JPEG n02536864/
+mv val/ILSVRC2012_val_00049741.JPEG n03697007/
+mv val/ILSVRC2012_val_00049742.JPEG n02281406/
+mv val/ILSVRC2012_val_00049743.JPEG n03017168/
+mv val/ILSVRC2012_val_00049744.JPEG n02090721/
+mv val/ILSVRC2012_val_00049745.JPEG n03776460/
+mv val/ILSVRC2012_val_00049746.JPEG n02037110/
+mv val/ILSVRC2012_val_00049747.JPEG n03100240/
+mv val/ILSVRC2012_val_00049748.JPEG n04398044/
+mv val/ILSVRC2012_val_00049749.JPEG n02871525/
+mv val/ILSVRC2012_val_00049750.JPEG n03792782/
+mv val/ILSVRC2012_val_00049751.JPEG n02787622/
+mv val/ILSVRC2012_val_00049752.JPEG n03180011/
+mv val/ILSVRC2012_val_00049753.JPEG n04522168/
+mv val/ILSVRC2012_val_00049754.JPEG n04266014/
+mv val/ILSVRC2012_val_00049755.JPEG n03218198/
+mv val/ILSVRC2012_val_00049756.JPEG n02088094/
+mv val/ILSVRC2012_val_00049757.JPEG n02097298/
+mv val/ILSVRC2012_val_00049758.JPEG n04548362/
+mv val/ILSVRC2012_val_00049759.JPEG n03196217/
+mv val/ILSVRC2012_val_00049760.JPEG n02095889/
+mv val/ILSVRC2012_val_00049761.JPEG n01873310/
+mv val/ILSVRC2012_val_00049762.JPEG n02088466/
+mv val/ILSVRC2012_val_00049763.JPEG n01968897/
+mv val/ILSVRC2012_val_00049764.JPEG n04548280/
+mv val/ILSVRC2012_val_00049765.JPEG n04604644/
+mv val/ILSVRC2012_val_00049766.JPEG n02090379/
+mv val/ILSVRC2012_val_00049767.JPEG n03787032/
+mv val/ILSVRC2012_val_00049768.JPEG n04229816/
+mv val/ILSVRC2012_val_00049769.JPEG n03891251/
+mv val/ILSVRC2012_val_00049770.JPEG n02356798/
+mv val/ILSVRC2012_val_00049771.JPEG n04350905/
+mv val/ILSVRC2012_val_00049772.JPEG n03782006/
+mv val/ILSVRC2012_val_00049773.JPEG n01664065/
+mv val/ILSVRC2012_val_00049774.JPEG n03950228/
+mv val/ILSVRC2012_val_00049775.JPEG n01601694/
+mv val/ILSVRC2012_val_00049776.JPEG n01558993/
+mv val/ILSVRC2012_val_00049777.JPEG n02777292/
+mv val/ILSVRC2012_val_00049778.JPEG n02091134/
+mv val/ILSVRC2012_val_00049779.JPEG n02088632/
+mv val/ILSVRC2012_val_00049780.JPEG n02442845/
+mv val/ILSVRC2012_val_00049781.JPEG n02137549/
+mv val/ILSVRC2012_val_00049782.JPEG n01669191/
+mv val/ILSVRC2012_val_00049783.JPEG n02007558/
+mv val/ILSVRC2012_val_00049784.JPEG n03782006/
+mv val/ILSVRC2012_val_00049785.JPEG n03692522/
+mv val/ILSVRC2012_val_00049786.JPEG n02916936/
+mv val/ILSVRC2012_val_00049787.JPEG n04357314/
+mv val/ILSVRC2012_val_00049788.JPEG n02132136/
+mv val/ILSVRC2012_val_00049789.JPEG n03930630/
+mv val/ILSVRC2012_val_00049790.JPEG n04019541/
+mv val/ILSVRC2012_val_00049791.JPEG n04005630/
+mv val/ILSVRC2012_val_00049792.JPEG n02102480/
+mv val/ILSVRC2012_val_00049793.JPEG n03443371/
+mv val/ILSVRC2012_val_00049794.JPEG n04523525/
+mv val/ILSVRC2012_val_00049795.JPEG n03814906/
+mv val/ILSVRC2012_val_00049796.JPEG n07693725/
+mv val/ILSVRC2012_val_00049797.JPEG n04371774/
+mv val/ILSVRC2012_val_00049798.JPEG n04209239/
+mv val/ILSVRC2012_val_00049799.JPEG n03720891/
+mv val/ILSVRC2012_val_00049800.JPEG n02086079/
+mv val/ILSVRC2012_val_00049801.JPEG n02071294/
+mv val/ILSVRC2012_val_00049802.JPEG n01774384/
+mv val/ILSVRC2012_val_00049803.JPEG n01560419/
+mv val/ILSVRC2012_val_00049804.JPEG n04204238/
+mv val/ILSVRC2012_val_00049805.JPEG n02101556/
+mv val/ILSVRC2012_val_00049806.JPEG n03998194/
+mv val/ILSVRC2012_val_00049807.JPEG n04486054/
+mv val/ILSVRC2012_val_00049808.JPEG n04505470/
+mv val/ILSVRC2012_val_00049809.JPEG n02089867/
+mv val/ILSVRC2012_val_00049810.JPEG n04179913/
+mv val/ILSVRC2012_val_00049811.JPEG n02112018/
+mv val/ILSVRC2012_val_00049812.JPEG n04201297/
+mv val/ILSVRC2012_val_00049813.JPEG n03673027/
+mv val/ILSVRC2012_val_00049814.JPEG n03908714/
+mv val/ILSVRC2012_val_00049815.JPEG n02105056/
+mv val/ILSVRC2012_val_00049816.JPEG n02791270/
+mv val/ILSVRC2012_val_00049817.JPEG n03775071/
+mv val/ILSVRC2012_val_00049818.JPEG n03785016/
+mv val/ILSVRC2012_val_00049819.JPEG n02088238/
+mv val/ILSVRC2012_val_00049820.JPEG n04376876/
+mv val/ILSVRC2012_val_00049821.JPEG n03272562/
+mv val/ILSVRC2012_val_00049822.JPEG n02132136/
+mv val/ILSVRC2012_val_00049823.JPEG n01748264/
+mv val/ILSVRC2012_val_00049824.JPEG n02939185/
+mv val/ILSVRC2012_val_00049825.JPEG n03485794/
+mv val/ILSVRC2012_val_00049826.JPEG n02105412/
+mv val/ILSVRC2012_val_00049827.JPEG n02814860/
+mv val/ILSVRC2012_val_00049828.JPEG n03527444/
+mv val/ILSVRC2012_val_00049829.JPEG n03803284/
+mv val/ILSVRC2012_val_00049830.JPEG n02396427/
+mv val/ILSVRC2012_val_00049831.JPEG n03877845/
+mv val/ILSVRC2012_val_00049832.JPEG n07614500/
+mv val/ILSVRC2012_val_00049833.JPEG n01514859/
+mv val/ILSVRC2012_val_00049834.JPEG n02105056/
+mv val/ILSVRC2012_val_00049835.JPEG n03047690/
+mv val/ILSVRC2012_val_00049836.JPEG n04254120/
+mv val/ILSVRC2012_val_00049837.JPEG n03218198/
+mv val/ILSVRC2012_val_00049838.JPEG n02910353/
+mv val/ILSVRC2012_val_00049839.JPEG n04328186/
+mv val/ILSVRC2012_val_00049840.JPEG n03776460/
+mv val/ILSVRC2012_val_00049841.JPEG n02109961/
+mv val/ILSVRC2012_val_00049842.JPEG n03467068/
+mv val/ILSVRC2012_val_00049843.JPEG n02704792/
+mv val/ILSVRC2012_val_00049844.JPEG n04136333/
+mv val/ILSVRC2012_val_00049845.JPEG n02169497/
+mv val/ILSVRC2012_val_00049846.JPEG n02094114/
+mv val/ILSVRC2012_val_00049847.JPEG n03837869/
+mv val/ILSVRC2012_val_00049848.JPEG n03131574/
+mv val/ILSVRC2012_val_00049849.JPEG n02090622/
+mv val/ILSVRC2012_val_00049850.JPEG n04238763/
+mv val/ILSVRC2012_val_00049851.JPEG n01682714/
+mv val/ILSVRC2012_val_00049852.JPEG n03388043/
+mv val/ILSVRC2012_val_00049853.JPEG n04493381/
+mv val/ILSVRC2012_val_00049854.JPEG n04040759/
+mv val/ILSVRC2012_val_00049855.JPEG n02099601/
+mv val/ILSVRC2012_val_00049856.JPEG n03803284/
+mv val/ILSVRC2012_val_00049857.JPEG n02101388/
+mv val/ILSVRC2012_val_00049858.JPEG n13044778/
+mv val/ILSVRC2012_val_00049859.JPEG n04483307/
+mv val/ILSVRC2012_val_00049860.JPEG n03404251/
+mv val/ILSVRC2012_val_00049861.JPEG n02090622/
+mv val/ILSVRC2012_val_00049862.JPEG n12768682/
+mv val/ILSVRC2012_val_00049863.JPEG n04367480/
+mv val/ILSVRC2012_val_00049864.JPEG n03134739/
+mv val/ILSVRC2012_val_00049865.JPEG n02356798/
+mv val/ILSVRC2012_val_00049866.JPEG n02408429/
+mv val/ILSVRC2012_val_00049867.JPEG n02974003/
+mv val/ILSVRC2012_val_00049868.JPEG n02101388/
+mv val/ILSVRC2012_val_00049869.JPEG n03124170/
+mv val/ILSVRC2012_val_00049870.JPEG n04435653/
+mv val/ILSVRC2012_val_00049871.JPEG n02105855/
+mv val/ILSVRC2012_val_00049872.JPEG n07920052/
+mv val/ILSVRC2012_val_00049873.JPEG n03272010/
+mv val/ILSVRC2012_val_00049874.JPEG n03180011/
+mv val/ILSVRC2012_val_00049875.JPEG n07717556/
+mv val/ILSVRC2012_val_00049876.JPEG n04235860/
+mv val/ILSVRC2012_val_00049877.JPEG n07716358/
+mv val/ILSVRC2012_val_00049878.JPEG n02088094/
+mv val/ILSVRC2012_val_00049879.JPEG n07873807/
+mv val/ILSVRC2012_val_00049880.JPEG n03775071/
+mv val/ILSVRC2012_val_00049881.JPEG n02110341/
+mv val/ILSVRC2012_val_00049882.JPEG n02817516/
+mv val/ILSVRC2012_val_00049883.JPEG n03146219/
+mv val/ILSVRC2012_val_00049884.JPEG n02113186/
+mv val/ILSVRC2012_val_00049885.JPEG n09246464/
+mv val/ILSVRC2012_val_00049886.JPEG n02119022/
+mv val/ILSVRC2012_val_00049887.JPEG n03240683/
+mv val/ILSVRC2012_val_00049888.JPEG n03706229/
+mv val/ILSVRC2012_val_00049889.JPEG n02701002/
+mv val/ILSVRC2012_val_00049890.JPEG n04154565/
+mv val/ILSVRC2012_val_00049891.JPEG n03467068/
+mv val/ILSVRC2012_val_00049892.JPEG n03843555/
+mv val/ILSVRC2012_val_00049893.JPEG n02107683/
+mv val/ILSVRC2012_val_00049894.JPEG n02088094/
+mv val/ILSVRC2012_val_00049895.JPEG n02108915/
+mv val/ILSVRC2012_val_00049896.JPEG n02786058/
+mv val/ILSVRC2012_val_00049897.JPEG n02326432/
+mv val/ILSVRC2012_val_00049898.JPEG n01629819/
+mv val/ILSVRC2012_val_00049899.JPEG n01614925/
+mv val/ILSVRC2012_val_00049900.JPEG n12267677/
+mv val/ILSVRC2012_val_00049901.JPEG n02108422/
+mv val/ILSVRC2012_val_00049902.JPEG n02481823/
+mv val/ILSVRC2012_val_00049903.JPEG n02892201/
+mv val/ILSVRC2012_val_00049904.JPEG n02877765/
+mv val/ILSVRC2012_val_00049905.JPEG n01955084/
+mv val/ILSVRC2012_val_00049906.JPEG n12057211/
+mv val/ILSVRC2012_val_00049907.JPEG n03063689/
+mv val/ILSVRC2012_val_00049908.JPEG n02113978/
+mv val/ILSVRC2012_val_00049909.JPEG n02777292/
+mv val/ILSVRC2012_val_00049910.JPEG n03717622/
+mv val/ILSVRC2012_val_00049911.JPEG n02787622/
+mv val/ILSVRC2012_val_00049912.JPEG n02437312/
+mv val/ILSVRC2012_val_00049913.JPEG n03992509/
+mv val/ILSVRC2012_val_00049914.JPEG n01930112/
+mv val/ILSVRC2012_val_00049915.JPEG n02500267/
+mv val/ILSVRC2012_val_00049916.JPEG n03627232/
+mv val/ILSVRC2012_val_00049917.JPEG n04505470/
+mv val/ILSVRC2012_val_00049918.JPEG n03250847/
+mv val/ILSVRC2012_val_00049919.JPEG n03400231/
+mv val/ILSVRC2012_val_00049920.JPEG n02977058/
+mv val/ILSVRC2012_val_00049921.JPEG n04554684/
+mv val/ILSVRC2012_val_00049922.JPEG n04456115/
+mv val/ILSVRC2012_val_00049923.JPEG n04147183/
+mv val/ILSVRC2012_val_00049924.JPEG n03676483/
+mv val/ILSVRC2012_val_00049925.JPEG n04465501/
+mv val/ILSVRC2012_val_00049926.JPEG n02094114/
+mv val/ILSVRC2012_val_00049927.JPEG n04532106/
+mv val/ILSVRC2012_val_00049928.JPEG n07892512/
+mv val/ILSVRC2012_val_00049929.JPEG n04557648/
+mv val/ILSVRC2012_val_00049930.JPEG n03482405/
+mv val/ILSVRC2012_val_00049931.JPEG n02088238/
+mv val/ILSVRC2012_val_00049932.JPEG n03991062/
+mv val/ILSVRC2012_val_00049933.JPEG n01751748/
+mv val/ILSVRC2012_val_00049934.JPEG n02104029/
+mv val/ILSVRC2012_val_00049935.JPEG n03733281/
+mv val/ILSVRC2012_val_00049936.JPEG n02536864/
+mv val/ILSVRC2012_val_00049937.JPEG n01860187/
+mv val/ILSVRC2012_val_00049938.JPEG n03133878/
+mv val/ILSVRC2012_val_00049939.JPEG n02110627/
+mv val/ILSVRC2012_val_00049940.JPEG n03208938/
+mv val/ILSVRC2012_val_00049941.JPEG n04192698/
+mv val/ILSVRC2012_val_00049942.JPEG n02106166/
+mv val/ILSVRC2012_val_00049943.JPEG n03028079/
+mv val/ILSVRC2012_val_00049944.JPEG n04515003/
+mv val/ILSVRC2012_val_00049945.JPEG n03787032/
+mv val/ILSVRC2012_val_00049946.JPEG n04317175/
+mv val/ILSVRC2012_val_00049947.JPEG n03447721/
+mv val/ILSVRC2012_val_00049948.JPEG n02326432/
+mv val/ILSVRC2012_val_00049949.JPEG n03535780/
+mv val/ILSVRC2012_val_00049950.JPEG n03998194/
+mv val/ILSVRC2012_val_00049951.JPEG n04560804/
+mv val/ILSVRC2012_val_00049952.JPEG n04507155/
+mv val/ILSVRC2012_val_00049953.JPEG n03134739/
+mv val/ILSVRC2012_val_00049954.JPEG n01697457/
+mv val/ILSVRC2012_val_00049955.JPEG n04270147/
+mv val/ILSVRC2012_val_00049956.JPEG n02107683/
+mv val/ILSVRC2012_val_00049957.JPEG n04525305/
+mv val/ILSVRC2012_val_00049958.JPEG n02410509/
+mv val/ILSVRC2012_val_00049959.JPEG n02099712/
+mv val/ILSVRC2012_val_00049960.JPEG n02132136/
+mv val/ILSVRC2012_val_00049961.JPEG n02268853/
+mv val/ILSVRC2012_val_00049962.JPEG n01817953/
+mv val/ILSVRC2012_val_00049963.JPEG n03929855/
+mv val/ILSVRC2012_val_00049964.JPEG n07615774/
+mv val/ILSVRC2012_val_00049965.JPEG n02100735/
+mv val/ILSVRC2012_val_00049966.JPEG n01833805/
+mv val/ILSVRC2012_val_00049967.JPEG n03207743/
+mv val/ILSVRC2012_val_00049968.JPEG n04584207/
+mv val/ILSVRC2012_val_00049969.JPEG n04266014/
+mv val/ILSVRC2012_val_00049970.JPEG n07248320/
+mv val/ILSVRC2012_val_00049971.JPEG n03467068/
+mv val/ILSVRC2012_val_00049972.JPEG n03908618/
+mv val/ILSVRC2012_val_00049973.JPEG n02133161/
+mv val/ILSVRC2012_val_00049974.JPEG n02486410/
+mv val/ILSVRC2012_val_00049975.JPEG n01755581/
+mv val/ILSVRC2012_val_00049976.JPEG n02445715/
+mv val/ILSVRC2012_val_00049977.JPEG n01914609/
+mv val/ILSVRC2012_val_00049978.JPEG n02841315/
+mv val/ILSVRC2012_val_00049979.JPEG n02877765/
+mv val/ILSVRC2012_val_00049980.JPEG n01697457/
+mv val/ILSVRC2012_val_00049981.JPEG n01981276/
+mv val/ILSVRC2012_val_00049982.JPEG n06794110/
+mv val/ILSVRC2012_val_00049983.JPEG n04485082/
+mv val/ILSVRC2012_val_00049984.JPEG n02119022/
+mv val/ILSVRC2012_val_00049985.JPEG n02481823/
+mv val/ILSVRC2012_val_00049986.JPEG n02802426/
+mv val/ILSVRC2012_val_00049987.JPEG n01689811/
+mv val/ILSVRC2012_val_00049988.JPEG n01796340/
+mv val/ILSVRC2012_val_00049989.JPEG n02667093/
+mv val/ILSVRC2012_val_00049990.JPEG n01622779/
+mv val/ILSVRC2012_val_00049991.JPEG n01980166/
+mv val/ILSVRC2012_val_00049992.JPEG n02442845/
+mv val/ILSVRC2012_val_00049993.JPEG n04328186/
+mv val/ILSVRC2012_val_00049994.JPEG n01871265/
+mv val/ILSVRC2012_val_00049995.JPEG n03729826/
+mv val/ILSVRC2012_val_00049996.JPEG n02123394/
+mv val/ILSVRC2012_val_00049997.JPEG n01630670/
+mv val/ILSVRC2012_val_00049998.JPEG n02106166/
+mv val/ILSVRC2012_val_00049999.JPEG n10148035/
+mv val/ILSVRC2012_val_00050000.JPEG n02437616/
diff --git a/PaddleCV/image_classification/fast_imagenet/torchvision_reader.py b/PaddleCV/image_classification/fast_imagenet/torchvision_reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..c1b0fb95b04aa126263e18dc558184ea57ba048c
--- /dev/null
+++ b/PaddleCV/image_classification/fast_imagenet/torchvision_reader.py
@@ -0,0 +1,157 @@
+import os
+
+import numpy as np
+import math
+import random
+import torch
+import torch.utils.data
+from torch.utils.data.distributed import DistributedSampler
+import torchvision.transforms as transforms
+import torchvision.datasets as datasets
+
+from torch.utils.data.sampler import Sampler
+import torchvision
+import pickle
+from tqdm import tqdm
+import time
+import multiprocessing
+
+FINISH_EVENT = "FINISH_EVENT"
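+# PaddleDataLoader bridges a torchvision dataset to a Paddle reader: each
+# worker process decodes its slice of the dataset and pushes (image, label)
+# tuples into a shared multiprocessing.Queue; the reader drains the queue
+# until every worker has posted the FINISH_EVENT sentinel.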
+class PaddleDataLoader(object):
+ def __init__(self, torch_dataset, indices=None, concurrent=24, queue_size=3072, shuffle=True, shuffle_seed=0):
+ self.torch_dataset = torch_dataset
+ self.data_queue = multiprocessing.Queue(queue_size)
+ self.indices = indices
+ self.concurrent = concurrent
+ self.shuffle = shuffle
+        self.shuffle_seed = shuffle_seed
+
+ def _worker_loop(self, dataset, worker_indices, worker_id):
+ cnt = 0
+ for idx in worker_indices:
+ cnt += 1
+ img, label = self.torch_dataset[idx]
+ img = np.array(img).astype('uint8').transpose((2, 0, 1))
+ self.data_queue.put((img, label))
+ print("worker: [%d] read [%d] samples. " % (worker_id, cnt))
+ self.data_queue.put(FINISH_EVENT)
+
+ def reader(self):
+ def _reader_creator():
+ worker_processes = []
+ total_img = len(self.torch_dataset)
+ print("total image: ", total_img)
+ if self.shuffle:
+                self.indices = list(range(total_img))
+ random.seed(self.shuffle_seed)
+ random.shuffle(self.indices)
+ print("shuffle indices: %s ..." % self.indices[:10])
+
+ imgs_per_worker = int(math.ceil(total_img / self.concurrent))
+            for i in range(self.concurrent):
+ start = i * imgs_per_worker
+ end = (i + 1) * imgs_per_worker if i != self.concurrent - 1 else None
+ sliced_indices = self.indices[start:end]
+ w = multiprocessing.Process(
+ target=self._worker_loop,
+ args=(self.torch_dataset, sliced_indices, i)
+ )
+ w.daemon = True
+ w.start()
+ worker_processes.append(w)
+ finish_workers = 0
+ worker_cnt = len(worker_processes)
+ while finish_workers < worker_cnt:
+ sample = self.data_queue.get()
+ if sample == FINISH_EVENT:
+ finish_workers += 1
+ else:
+ yield sample
+
+ return _reader_creator
+
+def train(traindir, sz, min_scale=0.08, shuffle_seed=0):
+ train_tfms = [
+ transforms.RandomResizedCrop(sz, scale=(min_scale, 1.0)),
+ transforms.RandomHorizontalFlip()
+ ]
+ train_dataset = datasets.ImageFolder(traindir, transforms.Compose(train_tfms))
+ return PaddleDataLoader(train_dataset, shuffle_seed=shuffle_seed).reader()
+
+def test(valdir, bs, sz, rect_val=False):
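+    # When rect_val is True, validation images are sorted by aspect ratio and
+    # cropped to a per-batch rectangle (see CropArTfm below) instead of the
+    # usual square center crop, so less of each image is discarded.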
+ if rect_val:
+ idx_ar_sorted = sort_ar(valdir)
+ idx_sorted, _ = zip(*idx_ar_sorted)
+ idx2ar = map_idx2ar(idx_ar_sorted, bs)
+
+        ar_tfms = [transforms.Resize(int(sz * 1.14)), CropArTfm(idx2ar, sz)]
+ val_dataset = ValDataset(valdir, transform=ar_tfms)
+ return PaddleDataLoader(val_dataset, concurrent=1, indices=idx_sorted, shuffle=False).reader()
+
+    val_tfms = [transforms.Resize(int(sz * 1.14)), transforms.CenterCrop(sz)]
+ val_dataset = datasets.ImageFolder(valdir, transforms.Compose(val_tfms))
+
+ return PaddleDataLoader(val_dataset).reader()
+
+
+class ValDataset(datasets.ImageFolder):
+ def __init__(self, root, transform=None, target_transform=None):
+ super(ValDataset, self).__init__(root, transform, target_transform)
+
+ def __getitem__(self, index):
+ path, target = self.imgs[index]
+ sample = self.loader(path)
+ if self.transform is not None:
+ for tfm in self.transform:
+ if isinstance(tfm, CropArTfm):
+ sample = tfm(sample, index)
+ else:
+ sample = tfm(sample)
+ if self.target_transform is not None:
+ target = self.target_transform(target)
+
+ return sample, target
+
+
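+# CropArTfm center-crops an image to its batch's mean aspect ratio (looked up
+# through idx2ar); the longer side is rounded down to a multiple of 8.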
+class CropArTfm(object):
+ def __init__(self, idx2ar, target_size):
+ self.idx2ar, self.target_size = idx2ar, target_size
+
+ def __call__(self, img, idx):
+ target_ar = self.idx2ar[idx]
+ if target_ar < 1:
+ w = int(self.target_size / target_ar)
+ size = (w // 8 * 8, self.target_size)
+ else:
+ h = int(self.target_size * target_ar)
+ size = (self.target_size, h // 8 * 8)
+ return torchvision.transforms.functional.center_crop(img, size)
+
+
+def sort_ar(valdir):
+ idx2ar_file = valdir + '/../sorted_idxar.p'
+ if os.path.isfile(idx2ar_file):
+ return pickle.load(open(idx2ar_file, 'rb'))
+    print('Creating AR indexes. Please be patient; this may take a couple of minutes...')
+ val_dataset = datasets.ImageFolder(valdir) # AS: TODO: use Image.open instead of looping through dataset
+ sizes = [img[0].size for img in tqdm(val_dataset, total=len(val_dataset))]
+ idx_ar = [(i, round(s[0] * 1.0/ s[1], 5)) for i, s in enumerate(sizes)]
+ sorted_idxar = sorted(idx_ar, key=lambda x: x[1])
+ pickle.dump(sorted_idxar, open(idx2ar_file, 'wb'))
+ print('Done')
+ return sorted_idxar
+
+def chunks(l, n):
+ n = max(1, n)
+ return (l[i:i + n] for i in range(0, len(l), n))
+
+
+def map_idx2ar(idx_ar_sorted, batch_size):
+ ar_chunks = list(chunks(idx_ar_sorted, batch_size))
+ idx2ar = {}
+ for chunk in ar_chunks:
+ idxs, ars = list(zip(*chunk))
+ mean = round(np.mean(ars), 5)
+ for idx in idxs:
+ idx2ar[idx] = mean
+ return idx2ar
diff --git a/PaddleCV/image_classification/fast_imagenet/train.py b/PaddleCV/image_classification/fast_imagenet/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..858e5107a8fed8352603d8b1c9f204d653ceab9b
--- /dev/null
+++ b/PaddleCV/image_classification/fast_imagenet/train.py
@@ -0,0 +1,350 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import argparse
+import cProfile
+import time
+import os
+import traceback
+
+import numpy as np
+import torchvision_reader
+import torch
+import paddle
+import paddle.fluid as fluid
+import paddle.fluid.core as core
+import paddle.fluid.profiler as profiler
+import paddle.fluid.transpiler.distribute_transpiler as distribute_transpiler
+
+import sys
+sys.path.append("..")
+from utility import add_arguments, print_arguments
+import functools
+from models.fast_imagenet import FastImageNet, lr_decay
+import utils
+
+def parse_args():
+ parser = argparse.ArgumentParser(description=__doc__)
+ add_arg = functools.partial(add_arguments, argparser=parser)
+ # yapf: disable
+ add_arg('total_images', int, 1281167, "Training image number.")
+ add_arg('num_epochs', int, 120, "number of epochs.")
+ add_arg('image_shape', str, "3,224,224", "input image size")
+ add_arg('model_save_dir', str, "output", "model save directory")
+ add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
+ add_arg('checkpoint', str, None, "Whether to resume checkpoint.")
+ add_arg('lr', float, 1.0, "set learning rate.")
+ add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
+ add_arg('data_dir', str, "./data/ILSVRC2012", "The ImageNet dataset root dir.")
+    add_arg('model_category', str, "models", "Which models package to use; valid values: 'models', 'models_name'.")
+ add_arg('fp16', bool, False, "Enable half precision training with fp16." )
+ add_arg('scale_loss', float, 1.0, "Scale loss for fp16." )
+ # for distributed
+ add_arg('start_test_pass', int, 0, "Start test after x passes.")
+ add_arg('num_threads', int, 8, "Use num_threads to run the fluid program.")
+ add_arg('reduce_strategy', str, "allreduce", "Choose from reduce or allreduce.")
+    add_arg('log_period', int, 30, "Print period; default is 30.")
+    add_arg('memory_optimize', bool, True, "Whether to enable memory optimization.")
+ add_arg('best_acc5', float, 0.93, "The best acc5, default is 93%.")
+ # yapf: enable
+ args = parser.parse_args()
+ return args
+
+def get_device_num():
+ import subprocess
+ visible_device = os.getenv('CUDA_VISIBLE_DEVICES')
+ if visible_device:
+ device_num = len(visible_device.split(','))
+ else:
+ device_num = subprocess.check_output(
+ ['nvidia-smi', '-L']).decode().count('\n')
+ return device_num
+
+DEVICE_NUM = get_device_num()
+
+def test_parallel(exe, test_args, args, test_reader, feeder, bs):
+ acc_evaluators = []
+    for i in range(len(test_args[2])):
+ acc_evaluators.append(fluid.metrics.Accuracy())
+
+ to_fetch = [v.name for v in test_args[2]]
+ batch_id = 0
+ start_ts = time.time()
+ for batch_id, data in enumerate(test_reader()):
+ acc_rets = exe.run(fetch_list=to_fetch, feed=feeder.feed(data))
+ ret_result = [np.mean(np.array(ret)) for ret in acc_rets]
+ print("Test batch: [%d], acc_rets: [%s]" % (batch_id, ret_result))
+ for i, e in enumerate(acc_evaluators):
+ e.update(
+ value=np.array(acc_rets[i]), weight=bs)
+ num_samples = batch_id * bs * DEVICE_NUM
+ print_train_time(start_ts, time.time(), num_samples)
+
+ return [e.eval() for e in acc_evaluators]
+
+
+def build_program(args, is_train, main_prog, startup_prog, py_reader_startup_prog, sz, trn_dir, bs, min_scale, rect_val=False):
+
+    dshape = [3, sz, sz]
+    class_dim = 1000
+ pyreader = None
+ with fluid.program_guard(main_prog, startup_prog):
+ with fluid.unique_name.guard():
+ if is_train:
+ with fluid.program_guard(main_prog, py_reader_startup_prog):
+ with fluid.unique_name.guard():
+ pyreader = fluid.layers.py_reader(
+ capacity=bs * DEVICE_NUM,
+ shapes=([-1] + dshape, (-1, 1)),
+ dtypes=('uint8', 'int64'),
+ name="train_reader_" + str(sz) if is_train else "test_reader_" + str(sz),
+ use_double_buffer=True)
+ input, label = fluid.layers.read_file(pyreader)
+ else:
+ input = fluid.layers.data(name="image", shape=[3, 244, 244], dtype="uint8")
+ label = fluid.layers.data(name="label", shape=[1], dtype="int64")
+ cast_img_type = "float16" if args.fp16 else "float32"
+ cast = fluid.layers.cast(input, cast_img_type)
+ img_mean = fluid.layers.create_global_var([3, 1, 1], 0.0, cast_img_type, name="img_mean", persistable=True)
+ img_std = fluid.layers.create_global_var([3, 1, 1], 0.0, cast_img_type, name="img_std", persistable=True)
+ # image = (image - (mean * 255.0)) / (std * 255.0)
+ t1 = fluid.layers.elementwise_sub(cast, img_mean, axis=1)
+ t2 = fluid.layers.elementwise_div(t1, img_std, axis=1)
+
+ model = FastImageNet(is_train=is_train)
+ predict = model.net(t2, class_dim=class_dim, img_size=sz)
+ cost, pred = fluid.layers.softmax_with_cross_entropy(predict, label, return_softmax=True)
+ if args.scale_loss > 1:
+ avg_cost = fluid.layers.mean(x=cost) * float(args.scale_loss)
+ else:
+ avg_cost = fluid.layers.mean(x=cost)
+
+ batch_acc1 = fluid.layers.accuracy(input=pred, label=label, k=1)
+ batch_acc5 = fluid.layers.accuracy(input=pred, label=label, k=5)
+
+ # configure optimize
+ optimizer = None
+ if is_train:
+ total_images = args.total_images
+ lr = args.lr
+
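+                # lr boundaries below pair epoch ranges with the global batch
+                # size used in each phase; the rates are rescaled by the ratio
+                # of a phase's batch size to the first phase's, keeping the
+                # effective step size comparable as the batch size shrinks.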
+                epochs = [(0, 7), (7, 13), (13, 22), (22, 25), (25, 28)]
+                bs_epoch = [bs * DEVICE_NUM for bs in [224, 224, 96, 96, 50]]
+                bs_scale = [bs * 1.0 / bs_epoch[0] for bs in bs_epoch]
+                lrs = [(lr, lr * 2), (lr * 2, lr / 4),
+                       (lr * bs_scale[2], lr / 10 * bs_scale[2]),
+                       (lr / 10 * bs_scale[2], lr / 100 * bs_scale[2]),
+                       (lr / 100 * bs_scale[4], lr / 1000 * bs_scale[4]),
+                       lr / 1000 * bs_scale[4]]
+
+ boundaries, values = lr_decay(lrs, epochs, bs_epoch, total_images)
+
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=fluid.layers.piecewise_decay(boundaries=boundaries, values=values),
+ momentum=0.9)
+ if args.fp16:
+ params_grads = optimizer.backward(avg_cost)
+ master_params_grads = utils.create_master_params_grads(
+ params_grads, main_prog, startup_prog, args.scale_loss)
+ optimizer.apply_gradients(master_params_grads)
+ utils.master_param_to_train_param(master_params_grads, params_grads, main_prog)
+ else:
+ optimizer.minimize(avg_cost)
+
+ if args.memory_optimize:
+ fluid.memory_optimize(main_prog, skip_grads=True)
+
+ return avg_cost, optimizer, [batch_acc1, batch_acc5], pyreader
+
+
+def refresh_program(args, epoch, sz, trn_dir, bs, val_bs, need_update_start_prog=False, min_scale=0.08, rect_val=False):
+ print('refresh program: epoch: [%d], image size: [%d], trn_dir: [%s], batch_size:[%d]' % (epoch, sz, trn_dir, bs))
+ train_prog = fluid.Program()
+ test_prog = fluid.Program()
+ startup_prog = fluid.Program()
+ py_reader_startup_prog = fluid.Program()
+
+ train_args = build_program(args, True, train_prog, startup_prog, py_reader_startup_prog, sz, trn_dir, bs, min_scale)
+ test_args = build_program(args, False, test_prog, startup_prog, py_reader_startup_prog, sz, trn_dir, val_bs, min_scale, rect_val=rect_val)
+
+ place = core.CUDAPlace(0)
+ startup_exe = fluid.Executor(place)
+ startup_exe.run(py_reader_startup_prog)
+
+ if need_update_start_prog:
+ startup_exe.run(startup_prog)
+ conv2d_w_vars = [var for var in startup_prog.global_block().vars.values() if var.name.startswith('conv2d_')]
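+        # initialize conv weights with PyTorch's kaiming_normal_ and copy the
+        # values into the Fluid parameters (cast to fp16 when requested)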
+ for var in conv2d_w_vars:
+ torch_w = torch.empty(var.shape)
+ kaiming_np = torch.nn.init.kaiming_normal_(torch_w, mode='fan_out', nonlinearity='relu').numpy()
+ tensor = fluid.global_scope().find_var(var.name).get_tensor()
+ if args.fp16:
+ tensor.set(np.array(kaiming_np, dtype="float16").view(np.uint16), place)
+ else:
+ tensor.set(np.array(kaiming_np, dtype="float32"), place)
+
+ np_tensors = {}
+ np_tensors["img_mean"] = np.array([0.485 * 255.0, 0.456 * 255.0, 0.406 * 255.0]).astype("float16" if args.fp16 else "float32").reshape((3, 1, 1))
+ np_tensors["img_std"] = np.array([0.229 * 255.0, 0.224 * 255.0, 0.225 * 255.0]).astype("float16" if args.fp16 else "float32").reshape((3, 1, 1))
+ for vname, np_tensor in np_tensors.items():
+ var = fluid.global_scope().find_var(vname)
+ if args.fp16:
+ var.get_tensor().set(np_tensor.view(np.uint16), place)
+ else:
+ var.get_tensor().set(np_tensor, place)
+
+
+ strategy = fluid.ExecutionStrategy()
+ strategy.num_threads = args.num_threads
+ strategy.allow_op_delay = False
+ strategy.num_iteration_per_drop_scope = 1
+ build_strategy = fluid.BuildStrategy()
+    build_strategy.reduce_strategy = fluid.BuildStrategy.ReduceStrategy.AllReduce
+
+ avg_loss = train_args[0]
+ train_exe = fluid.ParallelExecutor(
+ True,
+ avg_loss.name,
+ main_program=train_prog,
+ exec_strategy=strategy,
+ build_strategy=build_strategy)
+ test_exe = fluid.ParallelExecutor(
+ True, main_program=test_prog, share_vars_from=train_exe)
+
+ return train_args, test_args, test_prog, train_exe, test_exe
+
+def prepare_reader(epoch_id, train_py_reader, train_bs, val_bs, trn_dir, img_dim, min_scale, rect_val):
+ train_reader = torchvision_reader.train(
+ traindir="/data/imagenet/%strain" % trn_dir, sz=img_dim, min_scale=min_scale, shuffle_seed=epoch_id+1)
+ train_py_reader.decorate_paddle_reader(paddle.batch(train_reader, batch_size=train_bs))
+
+ test_reader = torchvision_reader.test(
+ valdir="/data/imagenet/%svalidation" % trn_dir, bs=val_bs*DEVICE_NUM, sz=img_dim, rect_val=rect_val)
+ test_batched_reader = paddle.batch(test_reader, batch_size=val_bs * DEVICE_NUM)
+
+ return test_batched_reader
+
+
+def train_parallel(args):
+ over_all_start = time.time()
+ test_prog = fluid.Program()
+
+ exe = None
+ test_exe = None
+ train_args = None
+ test_args = None
+ ## dynamic batch size, image size...
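+    # The schedule below follows fast.ai-style progressive resizing: epochs
+    # 0-12 train at 128px with batch size 224 per device, epochs 13-24 at
+    # 224px with batch size 96, and epochs 25+ at 288px with batch size 50
+    # plus rectangular validation.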
+ bs = 224
+ val_bs = 64
+ trn_dir = "sz/160/"
+    img_dim = 128
+    min_scale = 0.08
+    rect_val = False
+ for epoch_id in range(args.num_epochs):
+ # refresh program
+ if epoch_id == 0:
+ train_args, test_args, test_prog, exe, test_exe = refresh_program(args, epoch_id, sz=img_dim, trn_dir=trn_dir, bs=bs, val_bs=val_bs, need_update_start_prog=True)
+        elif epoch_id == 13:
+            bs = 96
+            trn_dir = "sz/352/"
+            img_dim = 224
+            min_scale = 0.087
+ train_args, test_args, test_prog, exe, test_exe = refresh_program(args, epoch_id, sz=img_dim, trn_dir=trn_dir, bs=bs, val_bs=val_bs, min_scale=min_scale)
+        elif epoch_id == 25:
+            bs = 50
+            val_bs = 8
+            trn_dir = ""
+            img_dim = 288
+            min_scale = 0.5
+            rect_val = True
+ train_args, test_args, test_prog, exe, test_exe = refresh_program(args, epoch_id, sz=img_dim, trn_dir=trn_dir, bs=bs, val_bs=val_bs, min_scale=min_scale, rect_val=rect_val)
+ else:
+ pass
+
+ avg_loss = train_args[0]
+ num_samples = 0
+ iters = 0
+ start_time = time.time()
+ train_py_reader = train_args[3]
+ test_reader = prepare_reader(epoch_id, train_py_reader, bs, val_bs, trn_dir, img_dim=img_dim, min_scale=min_scale, rect_val=rect_val)
+ train_py_reader.start() # start pyreader
+ batch_start_time = time.time()
+ while True:
+ fetch_list = [avg_loss.name]
+ acc_name_list = [v.name for v in train_args[2]]
+ fetch_list.extend(acc_name_list)
+ fetch_list.append("learning_rate")
+            should_print = (iters % args.log_period == 0)
+
+ fetch_ret = []
+ try:
+ if should_print:
+ fetch_ret = exe.run(fetch_list)
+ else:
+ exe.run([])
+            except fluid.core.EOFException:
+ print("Finish current epoch, will reset pyreader...")
+ train_py_reader.reset()
+ break
+            except fluid.core.EnforceNotMet:
+ traceback.print_exc()
+ exit(1)
+
+ num_samples += bs * DEVICE_NUM
+
+ if should_print:
+ fetched_data = [np.mean(np.array(d)) for d in fetch_ret]
+ print("Epoch %d, batch %d, loss %s, accucacys: %s, learning_rate %s, py_reader queue_size: %d, avg batch time: %0.4f secs" %
+ (epoch_id, iters, fetched_data[0], fetched_data[1:-1], fetched_data[-1], train_py_reader.queue.size(), (time.time() - batch_start_time)*1.0/args.log_period))
+ batch_start_time = time.time()
+ iters += 1
+
+ print_train_time(start_time, time.time(), num_samples)
+ feed_list = [test_prog.global_block().var(varname) for varname in ("image", "label")]
+ test_feeder = fluid.DataFeeder(feed_list=feed_list, place=fluid.CUDAPlace(0))
+ test_ret = test_parallel(test_exe, test_args, args, test_reader, test_feeder, val_bs)
+ test_acc1, test_acc5 = [np.mean(np.array(v)) for v in test_ret]
+ print("Epoch: %d, Test Accuracy: %s, Spend %.2f hours\n" %
+ (epoch_id, [test_acc1, test_acc5], (time.time() - over_all_start) / 3600))
+ if np.mean(np.array(test_ret[1])) > args.best_acc5:
+ print("Achieve the best top-1 acc %f, top-5 acc: %f" % (test_acc1, test_acc5))
+ break
+
+ print("total train time: ", time.time() - over_all_start)
+
+
+def print_train_time(start_time, end_time, num_samples):
+ train_elapsed = end_time - start_time
+ examples_per_sec = num_samples / train_elapsed
+    print('\nTotal examples: %d, total time: %.5f, %.5f examples/sec\n' %
+ (num_samples, train_elapsed, examples_per_sec))
+
+
+def print_paddle_envs():
+ print('----------- Configuration envs -----------')
+ print("DEVICE_NUM: %d" % DEVICE_NUM)
+ for k in os.environ:
+ if "PADDLE_" in k:
+ print("ENV %s:%s" % (k, os.environ[k]))
+ print('------------------------------------------------')
+
+
+def main():
+ args = parse_args()
+ print_arguments(args)
+ print_paddle_envs()
+ train_parallel(args)
+
+
+if __name__ == "__main__":
+ main()
\ No newline at end of file
diff --git a/PaddleCV/image_classification/images/alexnet_imagenet1k_acc1.png b/PaddleCV/image_classification/images/alexnet_imagenet1k_acc1.png
new file mode 100644
index 0000000000000000000000000000000000000000..d9dde696f7eb15b9eda6352b488749fba76d9f5a
Binary files /dev/null and b/PaddleCV/image_classification/images/alexnet_imagenet1k_acc1.png differ
diff --git a/fluid/PaddleCV/image_classification/images/curve.jpg b/PaddleCV/image_classification/images/curve.jpg
similarity index 100%
rename from fluid/PaddleCV/image_classification/images/curve.jpg
rename to PaddleCV/image_classification/images/curve.jpg
diff --git a/PaddleCV/image_classification/images/imagenet_dist_performance.png b/PaddleCV/image_classification/images/imagenet_dist_performance.png
new file mode 100644
index 0000000000000000000000000000000000000000..d6be7f5f3a9f79c495a684c0b5f4ed459d465c5e
Binary files /dev/null and b/PaddleCV/image_classification/images/imagenet_dist_performance.png differ
diff --git a/PaddleCV/image_classification/images/imagenet_dist_speedup.png b/PaddleCV/image_classification/images/imagenet_dist_speedup.png
new file mode 100644
index 0000000000000000000000000000000000000000..de834c95ff7e4f0f8aec780ebc7cafdb77872e4d
Binary files /dev/null and b/PaddleCV/image_classification/images/imagenet_dist_speedup.png differ
diff --git a/PaddleCV/image_classification/images/mobielenetv1_imagenet1k_acc1.png b/PaddleCV/image_classification/images/mobielenetv1_imagenet1k_acc1.png
new file mode 100644
index 0000000000000000000000000000000000000000..7b41b61244810ec3da67ebd76cb85c244cd61300
Binary files /dev/null and b/PaddleCV/image_classification/images/mobielenetv1_imagenet1k_acc1.png differ
diff --git a/PaddleCV/image_classification/images/resnet101_imagenet1k_acc1.png b/PaddleCV/image_classification/images/resnet101_imagenet1k_acc1.png
new file mode 100644
index 0000000000000000000000000000000000000000..f6436324bc67f81d5eec5309afba9edefadfc483
Binary files /dev/null and b/PaddleCV/image_classification/images/resnet101_imagenet1k_acc1.png differ
diff --git a/fluid/PaddleCV/image_classification/images/resnet50_32gpus-acc1.png b/PaddleCV/image_classification/images/resnet50_32gpus-acc1.png
similarity index 100%
rename from fluid/PaddleCV/image_classification/images/resnet50_32gpus-acc1.png
rename to PaddleCV/image_classification/images/resnet50_32gpus-acc1.png
diff --git a/PaddleCV/image_classification/images/resnet50_imagenet1k_acc1.png b/PaddleCV/image_classification/images/resnet50_imagenet1k_acc1.png
new file mode 100644
index 0000000000000000000000000000000000000000..c23f2f34af0aca4b41d7537314c60f6105a88bc0
Binary files /dev/null and b/PaddleCV/image_classification/images/resnet50_imagenet1k_acc1.png differ
diff --git a/PaddleCV/image_classification/images/vgg11_imagenet1k_acc1.png b/PaddleCV/image_classification/images/vgg11_imagenet1k_acc1.png
new file mode 100644
index 0000000000000000000000000000000000000000..691071c7acc5f5aa7f807f4d51748fc2b91df356
Binary files /dev/null and b/PaddleCV/image_classification/images/vgg11_imagenet1k_acc1.png differ
diff --git a/PaddleCV/image_classification/infer.py b/PaddleCV/image_classification/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..e6e126f259c15dfe62acba167a04fe5caed58ad7
--- /dev/null
+++ b/PaddleCV/image_classification/infer.py
@@ -0,0 +1,102 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import numpy as np
+import time
+import sys
+import paddle
+import paddle.fluid as fluid
+import reader
+import argparse
+import functools
+import models
+import utils
+from utils.utility import add_arguments, print_arguments
+import math
+
+parser = argparse.ArgumentParser(description=__doc__)
+# yapf: disable
+add_arg = functools.partial(add_arguments, argparser=parser)
+add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+add_arg('class_dim', int, 1000, "Class number.")
+add_arg('image_shape', str, "3,224,224", "Input image size")
+add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
+add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
+add_arg('model', str, "SE_ResNeXt50_32x4d", "Set the network to use.")
+add_arg('save_inference', bool, False, "Whether to save inference model or not")
+# yapf: enable
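+# Example invocation (paths are illustrative):
+#   python infer.py --model=ResNet50 --pretrained_model=./ResNet50_pretrained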
+
+def infer(args):
+ # parameters from arguments
+ class_dim = args.class_dim
+ model_name = args.model
+ save_inference = args.save_inference
+ pretrained_model = args.pretrained_model
+ with_memory_optimization = args.with_mem_opt
+ image_shape = [int(m) for m in args.image_shape.split(",")]
+ model_list = [m for m in dir(models) if "__" not in m]
+ assert model_name in model_list, "{} is not in lists: {}".format(args.model,
+ model_list)
+
+ image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
+
+ # model definition
+ model = models.__dict__[model_name]()
+ if model_name == "GoogleNet":
+ out, _, _ = model.net(input=image, class_dim=class_dim)
+ else:
+ out = model.net(input=image, class_dim=class_dim)
+
+ test_program = fluid.default_main_program().clone(for_test=True)
+
+ fetch_list = [out.name]
+ if with_memory_optimization and not save_inference:
+ fluid.memory_optimize(
+ fluid.default_main_program(), skip_opt_set=set(fetch_list))
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ if pretrained_model:
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(pretrained_model, var.name))
+
+ fluid.io.load_vars(exe, pretrained_model, predicate=if_exist)
+ if save_inference:
+ fluid.io.save_inference_model(
+ dirname=model_name,
+ feeded_var_names=['image'],
+ main_program=test_program,
+ target_vars=out,
+ executor=exe,
+ model_filename='model',
+ params_filename='params')
+ print("model: ",model_name," is already saved")
+ exit(0)
+ test_batch_size = 1
+ test_reader = paddle.batch(reader.test(), batch_size=test_batch_size)
+ feeder = fluid.DataFeeder(place=place, feed_list=[image])
+
+ TOPK = 1
+ for batch_id, data in enumerate(test_reader()):
+ result = exe.run(test_program,
+ fetch_list=fetch_list,
+ feed=feeder.feed(data))
+ result = result[0][0]
+ pred_label = np.argsort(result)[::-1][:TOPK]
+ print("Test-{0}-score: {1}, class {2}"
+ .format(batch_id, result[pred_label], pred_label))
+ sys.stdout.flush()
+
+
+def main():
+ args = parser.parse_args()
+ print_arguments(args)
+ infer(args)
+
+
+if __name__ == '__main__':
+ main()
diff --git a/PaddleCV/image_classification/legacy/README.md b/PaddleCV/image_classification/legacy/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..e4603a62da2bc15e0f834a34d403e94b332fd41b
--- /dev/null
+++ b/PaddleCV/image_classification/legacy/README.md
@@ -0,0 +1,10 @@
+For historical reasons, we keep "no name" models here: their parameters are saved under auto-generated names, unlike the "specified name" models elsewhere in this repository.
+
+**NOTE: Training the models in the legacy folder will generate models without specified parameter names.**
+
+- **Released models (parameter names not specified)**
+
+|model | top-1/top-5 accuracy (PIL)| top-1/top-5 accuracy (CV2) |
+|:-: |:-: |:-:|
+|[ResNet152](http://paddle-imagenet-models.bj.bcebos.com/ResNet152_pretrained.zip) | 78.18%/93.93% | 78.11%/94.04% |
+|[SE_ResNeXt50_32x4d](http://paddle-imagenet-models.bj.bcebos.com/se_resnext_50_model.tar) | 78.32%/93.96% | 77.58%/93.73% |
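+
+A minimal sketch of restoring one of these legacy models (paths are illustrative; the network must be built exactly as released so the auto-generated parameter names line up):
+
+```python
+import paddle.fluid as fluid
+import models  # the legacy models package in this folder
+
+image = fluid.layers.data(name='image', shape=[3, 224, 224], dtype='float32')
+out = models.ResNet152().net(input=image, class_dim=1000)
+
+place = fluid.CUDAPlace(0)
+exe = fluid.Executor(place)
+exe.run(fluid.default_startup_program())
+# parameters were saved without explicit names, so loading relies on the
+# default generated names matching the graph built above
+fluid.io.load_params(exe, './ResNet152_pretrained')
+```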
diff --git a/PaddleCV/image_classification/legacy/models/__init__.py b/PaddleCV/image_classification/legacy/models/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..9659029482ed6f51c9a63ac330ad9d8fd3b8b98f
--- /dev/null
+++ b/PaddleCV/image_classification/legacy/models/__init__.py
@@ -0,0 +1,12 @@
+from .alexnet import AlexNet
+from .mobilenet import MobileNet
+from .mobilenet_v2 import MobileNetV2
+from .googlenet import GoogleNet
+from .vgg import VGG11, VGG13, VGG16, VGG19
+from .resnet import ResNet50, ResNet101, ResNet152
+from .resnet_dist import DistResNet
+from .inception_v4 import InceptionV4
+from .se_resnext import SE_ResNeXt50_32x4d, SE_ResNeXt101_32x4d, SE_ResNeXt152_32x4d
+from .dpn import DPN68, DPN92, DPN98, DPN107, DPN131
+from .shufflenet_v2 import ShuffleNetV2_x0_5, ShuffleNetV2_x1_0, ShuffleNetV2_x1_5, ShuffleNetV2_x2_0
+from .fast_imagenet import FastImageNet
diff --git a/PaddleCV/image_classification/legacy/models/alexnet.py b/PaddleCV/image_classification/legacy/models/alexnet.py
new file mode 100644
index 0000000000000000000000000000000000000000..abe3b92965b1c16312e4ddf68809f6a4c93183fa
--- /dev/null
+++ b/PaddleCV/image_classification/legacy/models/alexnet.py
@@ -0,0 +1,149 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import math
+
+__all__ = ['AlexNet']
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [40, 70, 100],
+ "steps": [0.01, 0.001, 0.0001, 0.00001]
+ }
+}
+
+
+class AlexNet():
+ def __init__(self):
+ self.params = train_parameters
+
+ def net(self, input, class_dim=1000):
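+        # Caffe/AlexNet-style uniform init: each conv/fc layer samples weights
+        # and biases from U(-stdv, stdv) with stdv = 1/sqrt(fan_in), recomputed
+        # from the previous layer's output shape.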
+ stdv = 1.0 / math.sqrt(input.shape[1] * 11 * 11)
+ conv1 = fluid.layers.conv2d(
+ input=input,
+ num_filters=64,
+ filter_size=11,
+ stride=4,
+ padding=2,
+ groups=1,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)))
+ pool1 = fluid.layers.pool2d(
+ input=conv1,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=0,
+ pool_type='max')
+
+ stdv = 1.0 / math.sqrt(pool1.shape[1] * 5 * 5)
+ conv2 = fluid.layers.conv2d(
+ input=pool1,
+ num_filters=192,
+ filter_size=5,
+ stride=1,
+ padding=2,
+ groups=1,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)))
+ pool2 = fluid.layers.pool2d(
+ input=conv2,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=0,
+ pool_type='max')
+
+ stdv = 1.0 / math.sqrt(pool2.shape[1] * 3 * 3)
+ conv3 = fluid.layers.conv2d(
+ input=pool2,
+ num_filters=384,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ groups=1,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)))
+
+ stdv = 1.0 / math.sqrt(conv3.shape[1] * 3 * 3)
+ conv4 = fluid.layers.conv2d(
+ input=conv3,
+ num_filters=256,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ groups=1,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)))
+
+ stdv = 1.0 / math.sqrt(conv4.shape[1] * 3 * 3)
+ conv5 = fluid.layers.conv2d(
+ input=conv4,
+ num_filters=256,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ groups=1,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)))
+ pool5 = fluid.layers.pool2d(
+ input=conv5,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=0,
+ pool_type='max')
+
+ drop6 = fluid.layers.dropout(x=pool5, dropout_prob=0.5)
+
+ stdv = 1.0 / math.sqrt(drop6.shape[1] * drop6.shape[2] *
+ drop6.shape[3] * 1.0)
+ fc6 = fluid.layers.fc(
+ input=drop6,
+ size=4096,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)))
+
+ drop7 = fluid.layers.dropout(x=fc6, dropout_prob=0.5)
+
+ stdv = 1.0 / math.sqrt(drop7.shape[1] * 1.0)
+ fc7 = fluid.layers.fc(
+ input=drop7,
+ size=4096,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)))
+
+ stdv = 1.0 / math.sqrt(fc7.shape[1] * 1.0)
+ out = fluid.layers.fc(
+ input=fc7,
+ size=class_dim,
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)))
+ return out
diff --git a/PaddleCV/image_classification/legacy/models/dpn.py b/PaddleCV/image_classification/legacy/models/dpn.py
new file mode 100644
index 0000000000000000000000000000000000000000..316e96ac2cebd2dec4a60bf1748ed321fa651590
--- /dev/null
+++ b/PaddleCV/image_classification/legacy/models/dpn.py
@@ -0,0 +1,278 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import numpy as np
+import time
+import sys
+import paddle.fluid as fluid
+import math
+
+__all__ = ["DPN", "DPN68", "DPN92", "DPN98", "DPN107", "DPN131"]
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class DPN(object):
+ def __init__(self, layers=68):
+ self.params = train_parameters
+ self.layers = layers
+
+ def net(self, input, class_dim=1000):
+ # get network args
+ args = self.get_net_args(self.layers)
+ bws = args['bw']
+ inc_sec = args['inc_sec']
+        rs = args['r']
+ k_r = args['k_r']
+ k_sec = args['k_sec']
+ G = args['G']
+ init_num_filter = args['init_num_filter']
+ init_filter_size = args['init_filter_size']
+ init_padding = args['init_padding']
+
+ ## define Dual Path Network
+
+ # conv1
+ conv1_x_1 = fluid.layers.conv2d(
+ input=input,
+ num_filters=init_num_filter,
+ filter_size=init_filter_size,
+ stride=2,
+ padding=init_padding,
+ groups=1,
+ act=None,
+ bias_attr=False)
+ conv1_x_1 = fluid.layers.batch_norm(
+ input=conv1_x_1, act='relu', is_test=False)
+ convX_x_x = fluid.layers.pool2d(
+ input=conv1_x_1,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+
+ for gc in range(4):
+ bw = bws[gc]
+ inc = inc_sec[gc]
+ R = (k_r * bw) // rs[gc]
+ if gc == 0:
+ _type1 = 'proj'
+ _type2 = 'normal'
+ else:
+ _type1 = 'down'
+ _type2 = 'normal'
+ convX_x_x = self.dual_path_factory(convX_x_x, R, R, bw, inc, G,
+ _type1)
+ for i_ly in range(2, k_sec[gc] + 1):
+ convX_x_x = self.dual_path_factory(convX_x_x, R, R, bw, inc, G,
+ _type2)
+
+ conv5_x_x = fluid.layers.concat(convX_x_x, axis=1)
+ conv5_x_x = fluid.layers.batch_norm(
+ input=conv5_x_x, act='relu', is_test=False)
+ pool5 = fluid.layers.pool2d(
+ input=conv5_x_x,
+ pool_size=7,
+ pool_stride=1,
+ pool_padding=0,
+ pool_type='avg')
+
+ #stdv = 1.0 / math.sqrt(pool5.shape[1] * 1.0)
+ stdv = 0.01
+ param_attr = fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv))
+ fc6 = fluid.layers.fc(input=pool5,
+ size=class_dim,
+ param_attr=param_attr)
+
+ return fc6
+
+ def get_net_args(self, layers):
+ if layers == 68:
+ k_r = 128
+ G = 32
+ k_sec = [3, 4, 12, 3]
+ inc_sec = [16, 32, 32, 64]
+ bw = [64, 128, 256, 512]
+ r = [64, 64, 64, 64]
+ init_num_filter = 10
+ init_filter_size = 3
+ init_padding = 1
+ elif layers == 92:
+ k_r = 96
+ G = 32
+ k_sec = [3, 4, 20, 3]
+ inc_sec = [16, 32, 24, 128]
+ bw = [256, 512, 1024, 2048]
+ r = [256, 256, 256, 256]
+ init_num_filter = 64
+ init_filter_size = 7
+ init_padding = 3
+ elif layers == 98:
+ k_r = 160
+ G = 40
+ k_sec = [3, 6, 20, 3]
+ inc_sec = [16, 32, 32, 128]
+ bw = [256, 512, 1024, 2048]
+ r = [256, 256, 256, 256]
+ init_num_filter = 96
+ init_filter_size = 7
+ init_padding = 3
+ elif layers == 107:
+ k_r = 200
+ G = 50
+ k_sec = [4, 8, 20, 3]
+ inc_sec = [20, 64, 64, 128]
+ bw = [256, 512, 1024, 2048]
+ r = [256, 256, 256, 256]
+ init_num_filter = 128
+ init_filter_size = 7
+ init_padding = 3
+ elif layers == 131:
+ k_r = 160
+ G = 40
+ k_sec = [4, 8, 28, 3]
+ inc_sec = [16, 32, 32, 128]
+ bw = [256, 512, 1024, 2048]
+ r = [256, 256, 256, 256]
+ init_num_filter = 128
+ init_filter_size = 7
+ init_padding = 3
+ else:
+ raise NotImplementedError
+ net_arg = {
+ 'k_r': k_r,
+ 'G': G,
+ 'k_sec': k_sec,
+ 'inc_sec': inc_sec,
+ 'bw': bw,
+ 'r': r
+ }
+ net_arg['init_num_filter'] = init_num_filter
+ net_arg['init_filter_size'] = init_filter_size
+ net_arg['init_padding'] = init_padding
+
+ return net_arg
+
+ def dual_path_factory(self,
+ data,
+ num_1x1_a,
+ num_3x3_b,
+ num_1x1_c,
+ inc,
+ G,
+ _type='normal'):
+ kw = 3
+ kh = 3
+ pw = (kw - 1) // 2
+ ph = (kh - 1) // 2
+
+ # block type: 'proj' and 'down' add a projection branch; 'down' also strides by 2
+ if _type == 'proj':
+ key_stride = 1
+ has_proj = True
+ elif _type == 'down':
+ key_stride = 2
+ has_proj = True
+ else: # 'normal'
+ key_stride = 1
+ has_proj = False
+
+ # PROJ
+ if isinstance(data, list):
+ data_in = fluid.layers.concat([data[0], data[1]], axis=1)
+ else:
+ data_in = data
+
+ if has_proj:
+ c1x1_w = self.bn_ac_conv(
+ data=data_in,
+ num_filter=(num_1x1_c + 2 * inc),
+ kernel=(1, 1),
+ pad=(0, 0),
+ stride=(key_stride, key_stride))
+ data_o1, data_o2 = fluid.layers.split(
+ c1x1_w, num_or_sections=[num_1x1_c, 2 * inc], dim=1)
+ else:
+ data_o1 = data[0]
+ data_o2 = data[1]
+
+ # MAIN
+ c1x1_a = self.bn_ac_conv(
+ data=data_in, num_filter=num_1x1_a, kernel=(1, 1), pad=(0, 0))
+ c3x3_b = self.bn_ac_conv(
+ data=c1x1_a,
+ num_filter=num_3x3_b,
+ kernel=(kw, kh),
+ pad=(pw, ph),
+ stride=(key_stride, key_stride),
+ num_group=G)
+ c1x1_c = self.bn_ac_conv(
+ data=c3x3_b,
+ num_filter=(num_1x1_c + inc),
+ kernel=(1, 1),
+ pad=(0, 0))
+
+ c1x1_c1, c1x1_c2 = fluid.layers.split(
+ c1x1_c, num_or_sections=[num_1x1_c, inc], dim=1)
+
+ # OUTPUTS
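+ # summ is the residual (ResNet-style) path via element-wise add; dense is
+ # the densely connected (DenseNet-style) path via channel concat -- DPN
+ # carries both forward as a [summ, dense] pair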
+ summ = fluid.layers.elementwise_add(x=data_o1, y=c1x1_c1)
+ dense = fluid.layers.concat([data_o2, c1x1_c2], axis=1)
+
+ return [summ, dense]
+
+ def bn_ac_conv(self,
+ data,
+ num_filter,
+ kernel,
+ pad,
+ stride=(1, 1),
+ num_group=1):
+ bn_ac = fluid.layers.batch_norm(input=data, act='relu', is_test=False)
+ bn_ac_conv = fluid.layers.conv2d(
+ input=bn_ac,
+ num_filters=num_filter,
+ filter_size=kernel,
+ stride=stride,
+ padding=pad,
+ groups=num_group,
+ act=None,
+ bias_attr=False)
+ return bn_ac_conv
+
+
+def DPN68():
+ model = DPN(layers=68)
+ return model
+
+
+def DPN92():
+ model = DPN(layers=92)
+ return model
+
+
+def DPN98():
+ model = DPN(layers=98)
+ return model
+
+
+def DPN107():
+ model = DPN(layers=107)
+ return model
+
+
+def DPN131():
+ model = DPN(layers=131)
+ return model
diff --git a/PaddleCV/image_classification/legacy/models/googlenet.py b/PaddleCV/image_classification/legacy/models/googlenet.py
new file mode 100644
index 0000000000000000000000000000000000000000..be52ed96fcb801cc4a7d69d61470dd5732ff044c
--- /dev/null
+++ b/PaddleCV/image_classification/legacy/models/googlenet.py
@@ -0,0 +1,167 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+
+__all__ = ['GoogleNet']
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 70, 100],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class GoogleNet():
+ def __init__(self):
+ self.params = train_parameters
+
+ def conv_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None):
+ channels = input.shape[1]
+ stdv = (3.0 / (filter_size**2 * channels))**0.5
+ param_attr = fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv))
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ act=act,
+ param_attr=param_attr,
+ bias_attr=False)
+ return conv
+
+ def xavier(self, channels, filter_size):
+ stdv = (3.0 / (filter_size**2 * channels))**0.5
+ param_attr = fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv))
+ return param_attr
+
+ def inception(self, name, input, channels, filter1, filter3R, filter3,
+ filter5R, filter5, proj):
+ conv1 = self.conv_layer(
+ input=input, num_filters=filter1, filter_size=1, stride=1, act=None)
+ conv3r = self.conv_layer(
+ input=input,
+ num_filters=filter3R,
+ filter_size=1,
+ stride=1,
+ act=None)
+ conv3 = self.conv_layer(
+ input=conv3r,
+ num_filters=filter3,
+ filter_size=3,
+ stride=1,
+ act=None)
+ conv5r = self.conv_layer(
+ input=input,
+ num_filters=filter5R,
+ filter_size=1,
+ stride=1,
+ act=None)
+ conv5 = self.conv_layer(
+ input=conv5r,
+ num_filters=filter5,
+ filter_size=5,
+ stride=1,
+ act=None)
+ pool = fluid.layers.pool2d(
+ input=input,
+ pool_size=3,
+ pool_stride=1,
+ pool_padding=1,
+ pool_type='max')
+ convprj = fluid.layers.conv2d(
+ input=pool, filter_size=1, num_filters=proj, stride=1, padding=0)
+ cat = fluid.layers.concat(input=[conv1, conv3, conv5, convprj], axis=1)
+ cat = fluid.layers.relu(cat)
+ return cat
+
+ def net(self, input, class_dim=1000):
+ conv = self.conv_layer(
+ input=input, num_filters=64, filter_size=7, stride=2, act=None)
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=3, pool_type='max', pool_stride=2)
+
+ conv = self.conv_layer(
+ input=pool, num_filters=64, filter_size=1, stride=1, act=None)
+ conv = self.conv_layer(
+ input=conv, num_filters=192, filter_size=3, stride=1, act=None)
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=3, pool_type='max', pool_stride=2)
+
+ ince3a = self.inception("ince3a", pool, 192, 64, 96, 128, 16, 32, 32)
+ ince3b = self.inception("ince3b", ince3a, 256, 128, 128, 192, 32, 96,
+ 64)
+ pool3 = fluid.layers.pool2d(
+ input=ince3b, pool_size=3, pool_type='max', pool_stride=2)
+
+ ince4a = self.inception("ince4a", pool3, 480, 192, 96, 208, 16, 48, 64)
+ ince4b = self.inception("ince4b", ince4a, 512, 160, 112, 224, 24, 64,
+ 64)
+ ince4c = self.inception("ince4c", ince4b, 512, 128, 128, 256, 24, 64,
+ 64)
+ ince4d = self.inception("ince4d", ince4c, 512, 112, 144, 288, 32, 64,
+ 64)
+ ince4e = self.inception("ince4e", ince4d, 528, 256, 160, 320, 32, 128,
+ 128)
+ pool4 = fluid.layers.pool2d(
+ input=ince4e, pool_size=3, pool_type='max', pool_stride=2)
+
+ ince5a = self.inception("ince5a", pool4, 832, 256, 160, 320, 32, 128,
+ 128)
+ ince5b = self.inception("ince5b", ince5a, 832, 384, 192, 384, 48, 128,
+ 128)
+ pool5 = fluid.layers.pool2d(
+ input=ince5b, pool_size=7, pool_type='avg', pool_stride=7)
+ dropout = fluid.layers.dropout(x=pool5, dropout_prob=0.4)
+ out = fluid.layers.fc(input=dropout,
+ size=class_dim,
+ act='softmax',
+ param_attr=self.xavier(1024, 1))
+
+ pool_o1 = fluid.layers.pool2d(
+ input=ince4a, pool_size=5, pool_type='avg', pool_stride=3)
+ conv_o1 = self.conv_layer(
+ input=pool_o1, num_filters=128, filter_size=1, stride=1, act=None)
+ fc_o1 = fluid.layers.fc(input=conv_o1,
+ size=1024,
+ act='relu',
+ param_attr=self.xavier(2048, 1))
+ dropout_o1 = fluid.layers.dropout(x=fc_o1, dropout_prob=0.7)
+ out1 = fluid.layers.fc(input=dropout_o1,
+ size=class_dim,
+ act='softmax',
+ param_attr=self.xavier(1024, 1))
+
+ pool_o2 = fluid.layers.pool2d(
+ input=ince4d, pool_size=5, pool_type='avg', pool_stride=3)
+ conv_o2 = self.conv_layer(
+ input=pool_o2, num_filters=128, filter_size=1, stride=1, act=None)
+ fc_o2 = fluid.layers.fc(input=conv_o2,
+ size=1024,
+ act='relu',
+ param_attr=self.xavier(2048, 1))
+ dropout_o2 = fluid.layers.dropout(x=fc_o2, dropout_prob=0.7)
+ out2 = fluid.layers.fc(input=dropout_o2,
+ size=class_dim,
+ act='softmax',
+ param_attr=self.xavier(1024, 1))
+
+ # last fc layer is "out"
+ return out, out1, out2
diff --git a/PaddleCV/image_classification/legacy/models/inception_v4.py b/PaddleCV/image_classification/legacy/models/inception_v4.py
new file mode 100644
index 0000000000000000000000000000000000000000..1520375477ade6e61f0a5584278b13e40ab541eb
--- /dev/null
+++ b/PaddleCV/image_classification/legacy/models/inception_v4.py
@@ -0,0 +1,206 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import math
+
+__all__ = ['InceptionV4']
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class InceptionV4():
+ def __init__(self):
+ self.params = train_parameters
+
+ def net(self, input, class_dim=1000):
+ x = self.inception_stem(input)
+
+ for i in range(4):
+ x = self.inceptionA(x)
+ x = self.reductionA(x)
+
+ for i in range(7):
+ x = self.inceptionB(x)
+ x = self.reductionB(x)
+
+ for i in range(3):
+ x = self.inceptionC(x)
+
+ pool = fluid.layers.pool2d(
+ input=x, pool_size=8, pool_type='avg', global_pooling=True)
+
+ drop = fluid.layers.dropout(x=pool, dropout_prob=0.2)
+
+ stdv = 1.0 / math.sqrt(drop.shape[1] * 1.0)
+ out = fluid.layers.fc(
+ input=drop,
+ size=class_dim,
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)))
+ return out
+
+ def conv_bn_layer(self,
+ data,
+ num_filters,
+ filter_size,
+ stride=1,
+ padding=0,
+ groups=1,
+ act='relu'):
+ conv = fluid.layers.conv2d(
+ input=data,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ groups=groups,
+ act=None,
+ bias_attr=False)
+ return fluid.layers.batch_norm(input=conv, act=act)
+
+ def inception_stem(self, data):
+ conv = self.conv_bn_layer(data, 32, 3, stride=2, act='relu')
+ conv = self.conv_bn_layer(conv, 32, 3, act='relu')
+ conv = self.conv_bn_layer(conv, 64, 3, padding=1, act='relu')
+
+ pool1 = fluid.layers.pool2d(
+ input=conv, pool_size=3, pool_stride=2, pool_type='max')
+ conv2 = self.conv_bn_layer(conv, 96, 3, stride=2, act='relu')
+ concat = fluid.layers.concat([pool1, conv2], axis=1)
+
+ conv1 = self.conv_bn_layer(concat, 64, 1, act='relu')
+ conv1 = self.conv_bn_layer(conv1, 96, 3, act='relu')
+
+ conv2 = self.conv_bn_layer(concat, 64, 1, act='relu')
+ conv2 = self.conv_bn_layer(
+ conv2, 64, (7, 1), padding=(3, 0), act='relu')
+ conv2 = self.conv_bn_layer(
+ conv2, 64, (1, 7), padding=(0, 3), act='relu')
+ conv2 = self.conv_bn_layer(conv2, 96, 3, act='relu')
+
+ concat = fluid.layers.concat([conv1, conv2], axis=1)
+
+ conv1 = self.conv_bn_layer(concat, 192, 3, stride=2, act='relu')
+ pool1 = fluid.layers.pool2d(
+ input=concat, pool_size=3, pool_stride=2, pool_type='max')
+
+ concat = fluid.layers.concat([conv1, pool1], axis=1)
+
+ return concat
+
+ def inceptionA(self, data):
+ pool1 = fluid.layers.pool2d(
+ input=data, pool_size=3, pool_padding=1, pool_type='avg')
+ conv1 = self.conv_bn_layer(pool1, 96, 1, act='relu')
+
+ conv2 = self.conv_bn_layer(data, 96, 1, act='relu')
+
+ conv3 = self.conv_bn_layer(data, 64, 1, act='relu')
+ conv3 = self.conv_bn_layer(conv3, 96, 3, padding=1, act='relu')
+
+ conv4 = self.conv_bn_layer(data, 64, 1, act='relu')
+ conv4 = self.conv_bn_layer(conv4, 96, 3, padding=1, act='relu')
+ conv4 = self.conv_bn_layer(conv4, 96, 3, padding=1, act='relu')
+
+ concat = fluid.layers.concat([conv1, conv2, conv3, conv4], axis=1)
+
+ return concat
+
+ def reductionA(self, data):
+ pool1 = fluid.layers.pool2d(
+ input=data, pool_size=3, pool_stride=2, pool_type='max')
+
+ conv2 = self.conv_bn_layer(data, 384, 3, stride=2, act='relu')
+
+ conv3 = self.conv_bn_layer(data, 192, 1, act='relu')
+ conv3 = self.conv_bn_layer(conv3, 224, 3, padding=1, act='relu')
+ conv3 = self.conv_bn_layer(conv3, 256, 3, stride=2, act='relu')
+
+ concat = fluid.layers.concat([pool1, conv2, conv3], axis=1)
+
+ return concat
+
+ def inceptionB(self, data):
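+ # 7x7 convolutions are factorized into (1, 7) and (7, 1) pairs, cutting
+ # parameters while keeping the same receptive field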
+ pool1 = fluid.layers.pool2d(
+ input=data, pool_size=3, pool_padding=1, pool_type='avg')
+ conv1 = self.conv_bn_layer(pool1, 128, 1, act='relu')
+
+ conv2 = self.conv_bn_layer(data, 384, 1, act='relu')
+
+ conv3 = self.conv_bn_layer(data, 192, 1, act='relu')
+ conv3 = self.conv_bn_layer(
+ conv3, 224, (1, 7), padding=(0, 3), act='relu')
+ conv3 = self.conv_bn_layer(
+ conv3, 256, (7, 1), padding=(3, 0), act='relu')
+
+ conv4 = self.conv_bn_layer(data, 192, 1, act='relu')
+ conv4 = self.conv_bn_layer(
+ conv4, 192, (1, 7), padding=(0, 3), act='relu')
+ conv4 = self.conv_bn_layer(
+ conv4, 224, (7, 1), padding=(3, 0), act='relu')
+ conv4 = self.conv_bn_layer(
+ conv4, 224, (1, 7), padding=(0, 3), act='relu')
+ conv4 = self.conv_bn_layer(
+ conv4, 256, (7, 1), padding=(3, 0), act='relu')
+
+ concat = fluid.layers.concat([conv1, conv2, conv3, conv4], axis=1)
+
+ return concat
+
+ def reductionB(self, data):
+ pool1 = fluid.layers.pool2d(
+ input=data, pool_size=3, pool_stride=2, pool_type='max')
+
+ conv2 = self.conv_bn_layer(data, 192, 1, act='relu')
+ conv2 = self.conv_bn_layer(conv2, 192, 3, stride=2, act='relu')
+
+ conv3 = self.conv_bn_layer(data, 256, 1, act='relu')
+ conv3 = self.conv_bn_layer(
+ conv3, 256, (1, 7), padding=(0, 3), act='relu')
+ conv3 = self.conv_bn_layer(
+ conv3, 320, (7, 1), padding=(3, 0), act='relu')
+ conv3 = self.conv_bn_layer(conv3, 320, 3, stride=2, act='relu')
+
+ concat = fluid.layers.concat([pool1, conv2, conv3], axis=1)
+
+ return concat
+
+ def inceptionC(self, data):
+ pool1 = fluid.layers.pool2d(
+ input=data, pool_size=3, pool_padding=1, pool_type='avg')
+ conv1 = self.conv_bn_layer(pool1, 256, 1, act='relu')
+
+ conv2 = self.conv_bn_layer(data, 256, 1, act='relu')
+
+ conv3 = self.conv_bn_layer(data, 384, 1, act='relu')
+ conv3_1 = self.conv_bn_layer(
+ conv3, 256, (1, 3), padding=(0, 1), act='relu')
+ conv3_2 = self.conv_bn_layer(
+ conv3, 256, (3, 1), padding=(1, 0), act='relu')
+
+ conv4 = self.conv_bn_layer(data, 384, 1, act='relu')
+ conv4 = self.conv_bn_layer(
+ conv4, 448, (1, 3), padding=(0, 1), act='relu')
+ conv4 = self.conv_bn_layer(
+ conv4, 512, (3, 1), padding=(1, 0), act='relu')
+ conv4_1 = self.conv_bn_layer(
+ conv4, 256, (1, 3), padding=(0, 1), act='relu')
+ conv4_2 = self.conv_bn_layer(
+ conv4, 256, (3, 1), padding=(1, 0), act='relu')
+
+ concat = fluid.layers.concat(
+ [conv1, conv2, conv3_1, conv3_2, conv4_1, conv4_2], axis=1)
+
+ return concat
diff --git a/PaddleCV/image_classification/legacy/models/mobilenet.py b/PaddleCV/image_classification/legacy/models/mobilenet.py
new file mode 100644
index 0000000000000000000000000000000000000000..d0b419e8b4083104ba529c9f886284aa724953e6
--- /dev/null
+++ b/PaddleCV/image_classification/legacy/models/mobilenet.py
@@ -0,0 +1,166 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle.fluid as fluid
+from paddle.fluid.initializer import MSRA
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = ['MobileNet']
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class MobileNet():
+ def __init__(self):
+ self.params = train_parameters
+
+ def net(self, input, class_dim=1000, scale=1.0):
+ # conv1: 112x112
+ input = self.conv_bn_layer(
+ input,
+ filter_size=3,
+ channels=3,
+ num_filters=int(32 * scale),
+ stride=2,
+ padding=1)
+
+ # 56x56
+ input = self.depthwise_separable(
+ input,
+ num_filters1=32,
+ num_filters2=64,
+ num_groups=32,
+ stride=1,
+ scale=scale)
+
+ input = self.depthwise_separable(
+ input,
+ num_filters1=64,
+ num_filters2=128,
+ num_groups=64,
+ stride=2,
+ scale=scale)
+
+ # 28x28
+ input = self.depthwise_separable(
+ input,
+ num_filters1=128,
+ num_filters2=128,
+ num_groups=128,
+ stride=1,
+ scale=scale)
+
+ input = self.depthwise_separable(
+ input,
+ num_filters1=128,
+ num_filters2=256,
+ num_groups=128,
+ stride=2,
+ scale=scale)
+
+ # 14x14
+ input = self.depthwise_separable(
+ input,
+ num_filters1=256,
+ num_filters2=256,
+ num_groups=256,
+ stride=1,
+ scale=scale)
+
+ input = self.depthwise_separable(
+ input,
+ num_filters1=256,
+ num_filters2=512,
+ num_groups=256,
+ stride=2,
+ scale=scale)
+
+ # 14x14
+ for i in range(5):
+ input = self.depthwise_separable(
+ input,
+ num_filters1=512,
+ num_filters2=512,
+ num_groups=512,
+ stride=1,
+ scale=scale)
+ # 7x7
+ input = self.depthwise_separable(
+ input,
+ num_filters1=512,
+ num_filters2=1024,
+ num_groups=512,
+ stride=2,
+ scale=scale)
+
+ input = self.depthwise_separable(
+ input,
+ num_filters1=1024,
+ num_filters2=1024,
+ num_groups=1024,
+ stride=1,
+ scale=scale)
+
+ input = fluid.layers.pool2d(
+ input=input,
+ pool_size=0,
+ pool_stride=1,
+ pool_type='avg',
+ global_pooling=True)
+
+ output = fluid.layers.fc(input=input,
+ size=class_dim,
+ param_attr=ParamAttr(initializer=MSRA()))
+ return output
+
+ def conv_bn_layer(self,
+ input,
+ filter_size,
+ num_filters,
+ stride,
+ padding,
+ channels=None,
+ num_groups=1,
+ act='relu',
+ use_cudnn=True):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ groups=num_groups,
+ act=None,
+ use_cudnn=use_cudnn,
+ param_attr=ParamAttr(initializer=MSRA()),
+ bias_attr=False)
+ return fluid.layers.batch_norm(input=conv, act=act)
+
+ def depthwise_separable(self, input, num_filters1, num_filters2, num_groups,
+ stride, scale):
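+ # depthwise 3x3 (groups == input channels) followed by a pointwise 1x1;
+ # this factorization is what makes MobileNet cheap relative to dense 3x3 convs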
+ depthwise_conv = self.conv_bn_layer(
+ input=input,
+ filter_size=3,
+ num_filters=int(num_filters1 * scale),
+ stride=stride,
+ padding=1,
+ num_groups=int(num_groups * scale),
+ use_cudnn=False)
+
+ pointwise_conv = self.conv_bn_layer(
+ input=depthwise_conv,
+ filter_size=1,
+ num_filters=int(num_filters2 * scale),
+ stride=1,
+ padding=0)
+ return pointwise_conv
diff --git a/PaddleCV/image_classification/legacy/models/mobilenet_v2.py b/PaddleCV/image_classification/legacy/models/mobilenet_v2.py
new file mode 100644
index 0000000000000000000000000000000000000000..c219b1bf5a7260fbb07627bc3fa039f4b2833092
--- /dev/null
+++ b/PaddleCV/image_classification/legacy/models/mobilenet_v2.py
@@ -0,0 +1,168 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle.fluid as fluid
+from paddle.fluid.initializer import MSRA
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = ['MobileNetV2']
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class MobileNetV2():
+ def __init__(self):
+ self.params = train_parameters
+
+ def net(self, input, class_dim=1000, scale=1.0):
+
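+ # one tuple per stage: (expansion factor t, output channels c,
+ # repeat count n, first-block stride s)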
+ bottleneck_params_list = [
+ (1, 16, 1, 1),
+ (6, 24, 2, 2),
+ (6, 32, 3, 2),
+ (6, 64, 4, 2),
+ (6, 96, 3, 1),
+ (6, 160, 3, 2),
+ (6, 320, 1, 1),
+ ]
+
+ input = self.conv_bn_layer(
+ input,
+ num_filters=int(32 * scale),
+ filter_size=3,
+ stride=2,
+ padding=1,
+ if_act=True)
+
+ in_c = int(32 * scale)
+ for layer_setting in bottleneck_params_list:
+ t, c, n, s = layer_setting
+ input = self.invresi_blocks(
+ input=input,
+ in_c=in_c,
+ t=t,
+ c=int(c * scale),
+ n=n,
+ s=s, )
+ in_c = int(c * scale)
+
+ input = self.conv_bn_layer(
+ input=input,
+ num_filters=int(1280 * scale) if scale > 1.0 else 1280,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ if_act=True)
+
+ input = fluid.layers.pool2d(
+ input=input,
+ pool_size=7,
+ pool_stride=1,
+ pool_type='avg',
+ global_pooling=True)
+
+ output = fluid.layers.fc(input=input,
+ size=class_dim,
+ param_attr=ParamAttr(initializer=MSRA()))
+ return output
+
+ def conv_bn_layer(self,
+ input,
+ filter_size,
+ num_filters,
+ stride,
+ padding,
+ channels=None,
+ num_groups=1,
+ use_cudnn=True,
+ if_act=True):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ groups=num_groups,
+ act=None,
+ use_cudnn=use_cudnn,
+ param_attr=ParamAttr(initializer=MSRA()),
+ bias_attr=False)
+ bn = fluid.layers.batch_norm(input=conv)
+ if if_act:
+ return fluid.layers.relu6(bn)
+ else:
+ return bn
+
+ def shortcut(self, input, data_residual):
+ return fluid.layers.elementwise_add(input, data_residual)
+
+ def inverted_residual_unit(self, input, num_in_filter, num_filters,
+ ifshortcut, stride, filter_size, padding,
+ expansion_factor):
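+ # 1x1 expansion (by expansion_factor), depthwise 3x3 carrying the stride,
+ # then a linear 1x1 projection; ifshortcut adds the input back, which is
+ # used for the stride-1 repeat blocks where shapes match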
+ num_expfilter = int(round(num_in_filter * expansion_factor))
+ channel_expand = self.conv_bn_layer(
+ input=input,
+ num_filters=num_expfilter,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=True)
+ bottleneck_conv = self.conv_bn_layer(
+ input=channel_expand,
+ num_filters=num_expfilter,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ num_groups=num_expfilter,
+ if_act=True,
+ use_cudnn=False)
+ linear_out = self.conv_bn_layer(
+ input=bottleneck_conv,
+ num_filters=num_filters,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=False)
+ if ifshortcut:
+ out = self.shortcut(input=input, data_residual=linear_out)
+ return out
+ else:
+ return linear_out
+
+ def invresi_blocks(self, input, in_c, t, c, n, s):
+ first_block = self.inverted_residual_unit(
+ input=input,
+ num_in_filter=in_c,
+ num_filters=c,
+ ifshortcut=False,
+ stride=s,
+ filter_size=3,
+ padding=1,
+ expansion_factor=t)
+
+ last_residual_block = first_block
+ last_c = c
+
+ for i in range(1, n):
+ last_residual_block = self.inverted_residual_unit(
+ input=last_residual_block,
+ num_in_filter=last_c,
+ num_filters=c,
+ ifshortcut=True,
+ stride=1,
+ filter_size=3,
+ padding=1,
+ expansion_factor=t)
+ return last_residual_block
diff --git a/PaddleCV/image_classification/legacy/models/resnet.py b/PaddleCV/image_classification/legacy/models/resnet.py
new file mode 100644
index 0000000000000000000000000000000000000000..def99db6d84673b77582cf93374f4cb2f00e9ac5
--- /dev/null
+++ b/PaddleCV/image_classification/legacy/models/resnet.py
@@ -0,0 +1,122 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import math
+
+__all__ = ["ResNet", "ResNet50", "ResNet101", "ResNet152"]
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class ResNet():
+ def __init__(self, layers=50):
+ self.params = train_parameters
+ self.layers = layers
+
+ def net(self, input, class_dim=1000):
+ layers = self.layers
+ supported_layers = [50, 101, 152]
+ assert layers in supported_layers, \
+ "supported layers are {} but input layer is {}".format(supported_layers, layers)
+
+ if layers == 50:
+ depth = [3, 4, 6, 3]
+ elif layers == 101:
+ depth = [3, 4, 23, 3]
+ elif layers == 152:
+ depth = [3, 8, 36, 3]
+ num_filters = [64, 128, 256, 512]
+
+ conv = self.conv_bn_layer(
+ input=input, num_filters=64, filter_size=7, stride=2, act='relu')
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+
+ for block in range(len(depth)):
+ for i in range(depth[block]):
+ conv = self.bottleneck_block(
+ input=conv,
+ num_filters=num_filters[block],
+ stride=2 if i == 0 and block != 0 else 1)
+
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=7, pool_type='avg', global_pooling=True)
+ stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
+ out = fluid.layers.fc(input=pool,
+ size=class_dim,
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv,
+ stdv)))
+ return out
+
+ def conv_bn_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ act=None,
+ bias_attr=False)
+ return fluid.layers.batch_norm(input=conv, act=act)
+
+ def shortcut(self, input, ch_out, stride):
+ ch_in = input.shape[1]
+ if ch_in != ch_out or stride != 1:
+ return self.conv_bn_layer(input, ch_out, 1, stride)
+ else:
+ return input
+
+ def bottleneck_block(self, input, num_filters, stride):
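+ # 1x1 reduce -> 3x3 (carries the stride) -> 1x1 expand (4x), plus a
+ # projection shortcut whenever the shape changes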
+ conv0 = self.conv_bn_layer(
+ input=input, num_filters=num_filters, filter_size=1, act='relu')
+ conv1 = self.conv_bn_layer(
+ input=conv0,
+ num_filters=num_filters,
+ filter_size=3,
+ stride=stride,
+ act='relu')
+ conv2 = self.conv_bn_layer(
+ input=conv1, num_filters=num_filters * 4, filter_size=1, act=None)
+
+ short = self.shortcut(input, num_filters * 4, stride)
+
+ return fluid.layers.elementwise_add(x=short, y=conv2, act='relu')
+
+
+def ResNet50():
+ model = ResNet(layers=50)
+ return model
+
+
+def ResNet101():
+ model = ResNet(layers=101)
+ return model
+
+
+def ResNet152():
+ model = ResNet(layers=152)
+ return model
diff --git a/PaddleCV/image_classification/legacy/models/se_resnext.py b/PaddleCV/image_classification/legacy/models/se_resnext.py
new file mode 100644
index 0000000000000000000000000000000000000000..ac50bd87b5070000a018949e777a897427c3e5a5
--- /dev/null
+++ b/PaddleCV/image_classification/legacy/models/se_resnext.py
@@ -0,0 +1,199 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import math
+
+__all__ = [
+ "SE_ResNeXt", "SE_ResNeXt50_32x4d", "SE_ResNeXt101_32x4d",
+ "SE_ResNeXt152_32x4d"
+]
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "dropout_seed": None,
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class SE_ResNeXt():
+ def __init__(self, layers=50):
+ self.params = train_parameters
+ self.layers = layers
+
+ def net(self, input, class_dim=1000):
+ layers = self.layers
+ supported_layers = [50, 101, 152]
+ assert layers in supported_layers, \
+ "supported layers are {} but input layer is {}".format(supported_layers, layers)
+ if layers == 50:
+ cardinality = 32
+ reduction_ratio = 16
+ depth = [3, 4, 6, 3]
+ num_filters = [128, 256, 512, 1024]
+
+ conv = self.conv_bn_layer(
+ input=input,
+ num_filters=64,
+ filter_size=7,
+ stride=2,
+ act='relu')
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+ elif layers == 101:
+ cardinality = 32
+ reduction_ratio = 16
+ depth = [3, 4, 23, 3]
+ num_filters = [128, 256, 512, 1024]
+
+ conv = self.conv_bn_layer(
+ input=input,
+ num_filters=64,
+ filter_size=7,
+ stride=2,
+ act='relu')
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+ elif layers == 152:
+ cardinality = 64
+ reduction_ratio = 16
+ depth = [3, 8, 36, 3]
+ num_filters = [128, 256, 512, 1024]
+
+ conv = self.conv_bn_layer(
+ input=input,
+ num_filters=64,
+ filter_size=3,
+ stride=2,
+ act='relu')
+ conv = self.conv_bn_layer(
+ input=conv, num_filters=64, filter_size=3, stride=1, act='relu')
+ conv = self.conv_bn_layer(
+ input=conv,
+ num_filters=128,
+ filter_size=3,
+ stride=1,
+ act='relu')
+ conv = fluid.layers.pool2d(
+ input=conv, pool_size=3, pool_stride=2, pool_padding=1, \
+ pool_type='max')
+
+ for block in range(len(depth)):
+ for i in range(depth[block]):
+ conv = self.bottleneck_block(
+ input=conv,
+ num_filters=num_filters[block],
+ stride=2 if i == 0 and block != 0 else 1,
+ cardinality=cardinality,
+ reduction_ratio=reduction_ratio)
+
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=7, pool_type='avg', global_pooling=True)
+ drop = fluid.layers.dropout(
+ x=pool, dropout_prob=0.5, seed=self.params['dropout_seed'])
+ stdv = 1.0 / math.sqrt(drop.shape[1] * 1.0)
+ out = fluid.layers.fc(input=drop,
+ size=class_dim,
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv,
+ stdv)))
+ return out
+
+ def shortcut(self, input, ch_out, stride):
+ ch_in = input.shape[1]
+ if ch_in != ch_out or stride != 1:
+ filter_size = 1
+ return self.conv_bn_layer(input, ch_out, filter_size, stride)
+ else:
+ return input
+
+ def bottleneck_block(self, input, num_filters, stride, cardinality,
+ reduction_ratio):
+ conv0 = self.conv_bn_layer(
+ input=input, num_filters=num_filters, filter_size=1, act='relu')
+ conv1 = self.conv_bn_layer(
+ input=conv0,
+ num_filters=num_filters,
+ filter_size=3,
+ stride=stride,
+ groups=cardinality,
+ act='relu')
+ conv2 = self.conv_bn_layer(
+ input=conv1, num_filters=num_filters * 2, filter_size=1, act=None)
+ scale = self.squeeze_excitation(
+ input=conv2,
+ num_channels=num_filters * 2,
+ reduction_ratio=reduction_ratio)
+
+ short = self.shortcut(input, num_filters * 2, stride)
+
+ return fluid.layers.elementwise_add(x=short, y=scale, act='relu')
+
+ def conv_bn_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ act=None,
+ bias_attr=False)
+ return fluid.layers.batch_norm(input=conv, act=act)
+
+ def squeeze_excitation(self, input, num_channels, reduction_ratio):
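+ # SE block: global-average 'squeeze' to a per-channel vector, an FC that
+ # reduces by reduction_ratio, an FC that restores the width with a sigmoid,
+ # then a channel-wise rescale of the input feature map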
+ pool = fluid.layers.pool2d(
+ input=input, pool_size=0, pool_type='avg', global_pooling=True)
+ stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
+ squeeze = fluid.layers.fc(input=pool,
+ size=num_channels // reduction_ratio,
+ act='relu',
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ -stdv, stdv)))
+ stdv = 1.0 / math.sqrt(squeeze.shape[1] * 1.0)
+ excitation = fluid.layers.fc(input=squeeze,
+ size=num_channels,
+ act='sigmoid',
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ -stdv, stdv)))
+ scale = fluid.layers.elementwise_mul(x=input, y=excitation, axis=0)
+ return scale
+
+
+def SE_ResNeXt50_32x4d():
+ model = SE_ResNeXt(layers=50)
+ return model
+
+
+def SE_ResNeXt101_32x4d():
+ model = SE_ResNeXt(layers=101)
+ return model
+
+
+def SE_ResNeXt152_32x4d():
+ model = SE_ResNeXt(layers=152)
+ return model
diff --git a/PaddleCV/image_classification/legacy/models/shufflenet_v2.py b/PaddleCV/image_classification/legacy/models/shufflenet_v2.py
new file mode 100644
index 0000000000000000000000000000000000000000..6db88aa769dd6b3b3e2987fcac6d8054319a2a56
--- /dev/null
+++ b/PaddleCV/image_classification/legacy/models/shufflenet_v2.py
@@ -0,0 +1,252 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle.fluid as fluid
+from paddle.fluid.initializer import MSRA
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = [
+ 'ShuffleNetV2', 'ShuffleNetV2_x0_5', 'ShuffleNetV2_x1_0',
+ 'ShuffleNetV2_x1_5', 'ShuffleNetV2_x2_0'
+]
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class ShuffleNetV2():
+ def __init__(self, scale=1.0):
+ self.params = train_parameters
+ self.scale = scale
+
+ def net(self, input, class_dim=1000):
+ scale = self.scale
+ stage_repeats = [4, 8, 4]
+
+ if scale == 0.5:
+ stage_out_channels = [-1, 24, 48, 96, 192, 1024]
+ elif scale == 1.0:
+ stage_out_channels = [-1, 24, 116, 232, 464, 1024]
+ elif scale == 1.5:
+ stage_out_channels = [-1, 24, 176, 352, 704, 1024]
+ elif scale == 2.0:
+ stage_out_channels = [-1, 24, 244, 488, 976, 2048]
+ else:
+ raise ValueError("""{} groups is not supported for
+ 1x1 Grouped Convolutions""".format(num_groups))
+
+ #conv1
+
+ input_channel = stage_out_channels[1]
+ conv1 = self.conv_bn_layer(
+ input=input,
+ filter_size=3,
+ num_filters=input_channel,
+ padding=1,
+ stride=2)
+ pool1 = fluid.layers.pool2d(
+ input=conv1,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+ conv = pool1
+ # bottleneck sequences
+ for idxstage in range(len(stage_repeats)):
+ numrepeat = stage_repeats[idxstage]
+ output_channel = stage_out_channels[idxstage + 2]
+ for i in range(numrepeat):
+ if i == 0:
+ conv = self.inverted_residual_unit(
+ input=conv,
+ num_filters=output_channel,
+ stride=2,
+ benchmodel=2)
+ else:
+ conv = self.inverted_residual_unit(
+ input=conv,
+ num_filters=output_channel,
+ stride=1,
+ benchmodel=1)
+
+ conv_last = self.conv_bn_layer(
+ input=conv,
+ filter_size=1,
+ num_filters=stage_out_channels[-1],
+ padding=0,
+ stride=1)
+ pool_last = fluid.layers.pool2d(
+ input=conv_last,
+ pool_size=7,
+ pool_stride=7,
+ pool_padding=0,
+ pool_type='avg')
+
+ output = fluid.layers.fc(input=pool_last,
+ size=class_dim,
+ param_attr=ParamAttr(initializer=MSRA()))
+ return output
+
+ def conv_bn_layer(self,
+ input,
+ filter_size,
+ num_filters,
+ stride,
+ padding,
+ num_groups=1,
+ use_cudnn=True,
+ if_act=True):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ groups=num_groups,
+ act=None,
+ use_cudnn=use_cudnn,
+ param_attr=ParamAttr(initializer=MSRA()),
+ bias_attr=False)
+ if if_act:
+ return fluid.layers.batch_norm(input=conv, act='relu')
+ else:
+ return fluid.layers.batch_norm(input=conv)
+
+ def channel_shuffle(self, x, groups):
+ batchsize, num_channels, height, width = x.shape
+ channels_per_group = num_channels // groups
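+ # e.g. num_channels=8, groups=2: [0..3 | 4..7] becomes [0,4,1,5,2,6,3,7]
+ # after the reshape/transpose/reshape below, interleaving the two groups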
+
+ # reshape
+ x = fluid.layers.reshape(
+ x=x, shape=[batchsize, groups, channels_per_group, height, width])
+
+ x = fluid.layers.transpose(x=x, perm=[0, 2, 1, 3, 4])
+
+ # flatten
+ x = fluid.layers.reshape(
+ x=x, shape=[batchsize, num_channels, height, width])
+
+ return x
+
+ def inverted_residual_unit(self, input, num_filters, stride, benchmodel):
+ assert stride in [1, 2], \
+ "supported stride are {} but your stride is {}".format([1,2], stride)
+
+ oup_inc = num_filters // 2
+ inp = input.shape[1]
+
+ if benchmodel == 1:
+ x1, x2 = fluid.layers.split(
+ input,
+ num_or_sections=[input.shape[1] // 2, input.shape[1] // 2],
+ dim=1)
+
+ conv_pw = self.conv_bn_layer(
+ input=x2,
+ num_filters=oup_inc,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=True)
+
+ conv_dw = self.conv_bn_layer(
+ input=conv_pw,
+ num_filters=oup_inc,
+ filter_size=3,
+ stride=stride,
+ padding=1,
+ num_groups=oup_inc,
+ if_act=False)
+
+ conv_linear = self.conv_bn_layer(
+ input=conv_dw,
+ num_filters=oup_inc,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=True)
+
+ out = fluid.layers.concat([x1, conv_linear], axis=1)
+
+ else:
+ #branch1
+ conv_dw = self.conv_bn_layer(
+ input=input,
+ num_filters=inp,
+ filter_size=3,
+ stride=stride,
+ padding=1,
+ num_groups=inp,
+ if_act=False)
+
+ conv_linear_1 = self.conv_bn_layer(
+ input=conv_dw,
+ num_filters=oup_inc,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=True)
+
+ #branch2
+ conv_pw = self.conv_bn_layer(
+ input=input,
+ num_filters=oup_inc,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=True)
+
+ conv_dw = self.conv_bn_layer(
+ input=conv_pw,
+ num_filters=oup_inc,
+ filter_size=3,
+ stride=stride,
+ padding=1,
+ num_groups=oup_inc,
+ if_act=False)
+
+ conv_linear_2 = self.conv_bn_layer(
+ input=conv_dw,
+ num_filters=oup_inc,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=True)
+ out = fluid.layers.concat([conv_linear_1, conv_linear_2], axis=1)
+
+ return self.channel_shuffle(out, 2)
+
+
+def ShuffleNetV2_x0_5():
+ model = ShuffleNetV2(scale=0.5)
+ return model
+
+
+def ShuffleNetV2_x1_0():
+ model = ShuffleNetV2(scale=1.0)
+ return model
+
+
+def ShuffleNetV2_x1_5():
+ model = ShuffleNetV2(scale=1.5)
+ return model
+
+
+def ShuffleNetV2_x2_0():
+ model = ShuffleNetV2(scale=2.0)
+ return model
diff --git a/PaddleCV/image_classification/legacy/models/vgg.py b/PaddleCV/image_classification/legacy/models/vgg.py
new file mode 100644
index 0000000000000000000000000000000000000000..7f559982334575c7c2bc778e1be8a4ebf69549fc
--- /dev/null
+++ b/PaddleCV/image_classification/legacy/models/vgg.py
@@ -0,0 +1,109 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+
+__all__ = ["VGGNet", "VGG11", "VGG13", "VGG16", "VGG19"]
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class VGGNet():
+ def __init__(self, layers=16):
+ self.params = train_parameters
+ self.layers = layers
+
+ def net(self, input, class_dim=1000):
+ layers = self.layers
+ vgg_spec = {
+ 11: ([1, 1, 2, 2, 2]),
+ 13: ([2, 2, 2, 2, 2]),
+ 16: ([2, 2, 3, 3, 3]),
+ 19: ([2, 2, 4, 4, 4])
+ }
+ assert layers in vgg_spec.keys(), \
+ "supported layers are {} but input layer is {}".format(vgg_spec.keys(), layers)
+
+ nums = vgg_spec[layers]
+ conv1 = self.conv_block(input, 64, nums[0])
+ conv2 = self.conv_block(conv1, 128, nums[1])
+ conv3 = self.conv_block(conv2, 256, nums[2])
+ conv4 = self.conv_block(conv3, 512, nums[3])
+ conv5 = self.conv_block(conv4, 512, nums[4])
+
+ fc_dim = 4096
+ fc1 = fluid.layers.fc(
+ input=conv5,
+ size=fc_dim,
+ act='relu',
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Normal(scale=0.005)),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Constant(value=0.1)))
+ fc1 = fluid.layers.dropout(x=fc1, dropout_prob=0.5)
+ fc2 = fluid.layers.fc(
+ input=fc1,
+ size=fc_dim,
+ act='relu',
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Normal(scale=0.005)),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Constant(value=0.1)))
+ fc2 = fluid.layers.dropout(x=fc2, dropout_prob=0.5)
+ out = fluid.layers.fc(
+ input=fc2,
+ size=class_dim,
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Normal(scale=0.005)),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Constant(value=0.1)))
+
+ return out
+
+ def conv_block(self, input, num_filter, groups):
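+ # 'groups' stacked 3x3 convs at a fixed width, then a 2x2 max-pool that
+ # halves the spatial size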
+ conv = input
+ for i in range(groups):
+ conv = fluid.layers.conv2d(
+ input=conv,
+ num_filters=num_filter,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ act='relu',
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Normal(scale=0.01)),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Constant(value=0.0)))
+ return fluid.layers.pool2d(
+ input=conv, pool_size=2, pool_type='max', pool_stride=2)
+
+
+def VGG11():
+ model = VGGNet(layers=11)
+ return model
+
+
+def VGG13():
+ model = VGGNet(layers=13)
+ return model
+
+
+def VGG16():
+ model = VGGNet(layers=16)
+ return model
+
+
+def VGG19():
+ model = VGGNet(layers=19)
+ return model
diff --git a/PaddleCV/image_classification/models/__init__.py b/PaddleCV/image_classification/models/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..458991ca732a22f3774568ffbfa84514ddadfe5c
--- /dev/null
+++ b/PaddleCV/image_classification/models/__init__.py
@@ -0,0 +1,12 @@
+from .alexnet import AlexNet
+from .mobilenet import MobileNet
+from .mobilenet_v2 import MobileNetV2
+from .googlenet import GoogleNet
+from .vgg import VGG11, VGG13, VGG16, VGG19
+from .resnet import ResNet18, ResNet34, ResNet50, ResNet101, ResNet152
+from .resnet_dist import DistResNet
+from .inception_v4 import InceptionV4
+from .se_resnext import SE_ResNeXt50_32x4d, SE_ResNeXt101_32x4d, SE_ResNeXt152_32x4d
+from .dpn import DPN68, DPN92, DPN98, DPN107, DPN131
+from .shufflenet_v2 import ShuffleNetV2, ShuffleNetV2_x0_5_swish, ShuffleNetV2_x1_0_swish, ShuffleNetV2_x1_5_swish, ShuffleNetV2_x2_0_swish, ShuffleNetV2_x8_0_swish
+from .fast_imagenet import FastImageNet
diff --git a/PaddleCV/image_classification/models/alexnet.py b/PaddleCV/image_classification/models/alexnet.py
new file mode 100644
index 0000000000000000000000000000000000000000..f063c4d6deb88905aaa5f8a5eba59903f58293e8
--- /dev/null
+++ b/PaddleCV/image_classification/models/alexnet.py
@@ -0,0 +1,168 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import math
+
+__all__ = ['AlexNet']
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [40, 70, 100],
+ "steps": [0.01, 0.001, 0.0001, 0.00001]
+ }
+}
+
+
+class AlexNet():
+ def __init__(self):
+ self.params = train_parameters
+
+ def net(self, input, class_dim=1000):
+ stdv = 1.0 / math.sqrt(input.shape[1] * 11 * 11)
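+ # init scale follows 1/sqrt(fan_in) of the upcoming layer
+ # (channels * kernel_h * kernel_w); recomputed before each conv/fc below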
+ layer_name = [
+ "conv1", "conv2", "conv3", "conv4", "conv5", "fc6", "fc7", "fc8"
+ ]
+ conv1 = fluid.layers.conv2d(
+ input=input,
+ num_filters=64,
+ filter_size=11,
+ stride=4,
+ padding=2,
+ groups=1,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[0] + "_offset"),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[0] + "_weights"))
+ pool1 = fluid.layers.pool2d(
+ input=conv1,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=0,
+ pool_type='max')
+
+ stdv = 1.0 / math.sqrt(pool1.shape[1] * 5 * 5)
+ conv2 = fluid.layers.conv2d(
+ input=pool1,
+ num_filters=192,
+ filter_size=5,
+ stride=1,
+ padding=2,
+ groups=1,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[1] + "_offset"),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[1] + "_weights"))
+ pool2 = fluid.layers.pool2d(
+ input=conv2,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=0,
+ pool_type='max')
+
+ stdv = 1.0 / math.sqrt(pool2.shape[1] * 3 * 3)
+ conv3 = fluid.layers.conv2d(
+ input=pool2,
+ num_filters=384,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ groups=1,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[2] + "_offset"),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[2] + "_weights"))
+
+ stdv = 1.0 / math.sqrt(conv3.shape[1] * 3 * 3)
+ conv4 = fluid.layers.conv2d(
+ input=conv3,
+ num_filters=256,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ groups=1,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[3] + "_offset"),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[3] + "_weights"))
+
+ stdv = 1.0 / math.sqrt(conv4.shape[1] * 3 * 3)
+ conv5 = fluid.layers.conv2d(
+ input=conv4,
+ num_filters=256,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ groups=1,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[4] + "_offset"),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[4] + "_weights"))
+ pool5 = fluid.layers.pool2d(
+ input=conv5,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=0,
+ pool_type='max')
+
+ drop6 = fluid.layers.dropout(x=pool5, dropout_prob=0.5)
+ stdv = 1.0 / math.sqrt(drop6.shape[1] * drop6.shape[2] *
+ drop6.shape[3] * 1.0)
+
+ fc6 = fluid.layers.fc(
+ input=drop6,
+ size=4096,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[5] + "_offset"),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[5] + "_weights"))
+
+ drop7 = fluid.layers.dropout(x=fc6, dropout_prob=0.5)
+ stdv = 1.0 / math.sqrt(drop7.shape[1] * 1.0)
+
+ fc7 = fluid.layers.fc(
+ input=drop7,
+ size=4096,
+ act='relu',
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[6] + "_offset"),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[6] + "_weights"))
+
+ stdv = 1.0 / math.sqrt(fc7.shape[1] * 1.0)
+ out = fluid.layers.fc(
+ input=fc7,
+ size=class_dim,
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[7] + "_offset"),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=layer_name[7] + "_weights"))
+ return out
diff --git a/PaddleCV/image_classification/models/dpn.py b/PaddleCV/image_classification/models/dpn.py
new file mode 100644
index 0000000000000000000000000000000000000000..7f759b3bb6bfa9c866e129ac93ab2c6a9cf4168c
--- /dev/null
+++ b/PaddleCV/image_classification/models/dpn.py
@@ -0,0 +1,333 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import numpy as np
+import time
+import sys
+import paddle.fluid as fluid
+import math
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = ["DPN", "DPN68", "DPN92", "DPN98", "DPN107", "DPN131"]
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class DPN(object):
+ def __init__(self, layers=68):
+ self.params = train_parameters
+ self.layers = layers
+
+ def net(self, input, class_dim=1000):
+ # get network args
+ args = self.get_net_args(self.layers)
+ bws = args['bw']
+ inc_sec = args['inc_sec']
+ rs = args['r'] # note: 'r', not 'bw' -- with 'bw' the width R collapses to k_r for every stage
+ k_r = args['k_r']
+ k_sec = args['k_sec']
+ G = args['G']
+ init_num_filter = args['init_num_filter']
+ init_filter_size = args['init_filter_size']
+ init_padding = args['init_padding']
+
+ ## define Dual Path Network
+
+ # conv1
+ conv1_x_1 = fluid.layers.conv2d(
+ input=input,
+ num_filters=init_num_filter,
+ filter_size=init_filter_size,
+ stride=2,
+ padding=init_padding,
+ groups=1,
+ act=None,
+ bias_attr=False,
+ name="conv1",
+ param_attr=ParamAttr(name="conv1_weights"), )
+
+ conv1_x_1 = fluid.layers.batch_norm(
+ input=conv1_x_1,
+ act='relu',
+ is_test=False,
+ name="conv1_bn",
+ param_attr=ParamAttr(name='conv1_bn_scale'),
+ bias_attr=ParamAttr('conv1_bn_offset'),
+ moving_mean_name='conv1_bn_mean',
+ moving_variance_name='conv1_bn_variance', )
+
+ convX_x_x = fluid.layers.pool2d(
+ input=conv1_x_1,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max',
+ name="pool1")
+
+ #conv2 - conv5
+ match_list, num = [], 0
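+ # blocks are named dpn<N> from a single global counter; match_list records
+ # the index reserved for each stage's first (projection) block so the
+ # 'normal' blocks skip over it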
+ for gc in range(4):
+ bw = bws[gc]
+ inc = inc_sec[gc]
+ R = (k_r * bw) // rs[gc]
+ if gc == 0:
+ _type1 = 'proj'
+ _type2 = 'normal'
+ match = 1
+ else:
+ _type1 = 'down'
+ _type2 = 'normal'
+ match = match + k_sec[gc - 1]
+ match_list.append(match)
+
+ convX_x_x = self.dual_path_factory(
+ convX_x_x, R, R, bw, inc, G, _type1, name="dpn" + str(match))
+ for i_ly in range(2, k_sec[gc] + 1):
+ num += 1
+ if num in match_list:
+ num += 1
+ convX_x_x = self.dual_path_factory(
+ convX_x_x, R, R, bw, inc, G, _type2, name="dpn" + str(num))
+
+ conv5_x_x = fluid.layers.concat(convX_x_x, axis=1)
+ conv5_x_x = fluid.layers.batch_norm(
+ input=conv5_x_x,
+ act='relu',
+ is_test=False,
+ name="final_concat_bn",
+ param_attr=ParamAttr(name='final_concat_bn_scale'),
+ bias_attr=ParamAttr('final_concat_bn_offset'),
+ moving_mean_name='final_concat_bn_mean',
+ moving_variance_name='final_concat_bn_variance', )
+ pool5 = fluid.layers.pool2d(
+ input=conv5_x_x,
+ pool_size=7,
+ pool_stride=1,
+ pool_padding=0,
+ pool_type='avg', )
+
+ stdv = 0.01
+ param_attr = fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv))
+ fc6 = fluid.layers.fc(input=pool5,
+ size=class_dim,
+ param_attr=param_attr,
+ name="fc6")
+
+ return fc6
+
+ def get_net_args(self, layers):
+ if layers == 68:
+ k_r = 128
+ G = 32
+ k_sec = [3, 4, 12, 3]
+ inc_sec = [16, 32, 32, 64]
+ bw = [64, 128, 256, 512]
+ r = [64, 64, 64, 64]
+ init_num_filter = 10
+ init_filter_size = 3
+ init_padding = 1
+ elif layers == 92:
+ k_r = 96
+ G = 32
+ k_sec = [3, 4, 20, 3]
+ inc_sec = [16, 32, 24, 128]
+ bw = [256, 512, 1024, 2048]
+ r = [256, 256, 256, 256]
+ init_num_filter = 64
+ init_filter_size = 7
+ init_padding = 3
+ elif layers == 98:
+ k_r = 160
+ G = 40
+ k_sec = [3, 6, 20, 3]
+ inc_sec = [16, 32, 32, 128]
+ bw = [256, 512, 1024, 2048]
+ r = [256, 256, 256, 256]
+ init_num_filter = 96
+ init_filter_size = 7
+ init_padding = 3
+ elif layers == 107:
+ k_r = 200
+ G = 50
+ k_sec = [4, 8, 20, 3]
+ inc_sec = [20, 64, 64, 128]
+ bw = [256, 512, 1024, 2048]
+ r = [256, 256, 256, 256]
+ init_num_filter = 128
+ init_filter_size = 7
+ init_padding = 3
+ elif layers == 131:
+ k_r = 160
+ G = 40
+ k_sec = [4, 8, 28, 3]
+ inc_sec = [16, 32, 32, 128]
+ bw = [256, 512, 1024, 2048]
+ r = [256, 256, 256, 256]
+ init_num_filter = 128
+ init_filter_size = 7
+ init_padding = 3
+ else:
+ raise NotImplementedError
+ net_arg = {
+ 'k_r': k_r,
+ 'G': G,
+ 'k_sec': k_sec,
+ 'inc_sec': inc_sec,
+ 'bw': bw,
+ 'r': r
+ }
+ net_arg['init_num_filter'] = init_num_filter
+ net_arg['init_filter_size'] = init_filter_size
+ net_arg['init_padding'] = init_padding
+
+ return net_arg
+
+ def dual_path_factory(self,
+ data,
+ num_1x1_a,
+ num_3x3_b,
+ num_1x1_c,
+ inc,
+ G,
+ _type='normal',
+ name=None):
+ kw = 3
+ kh = 3
+ pw = (kw - 1) // 2
+ ph = (kh - 1) // 2
+
+ # block type: 'proj' and 'down' add a projection branch; 'down' also strides by 2
+ if _type == 'proj':
+ key_stride = 1
+ has_proj = True
+ elif _type == 'down':
+ key_stride = 2
+ has_proj = True
+ else: # 'normal'
+ key_stride = 1
+ has_proj = False
+
+ # PROJ
+ if isinstance(data, list):
+ data_in = fluid.layers.concat([data[0], data[1]], axis=1)
+ else:
+ data_in = data
+
+ if has_proj:
+ c1x1_w = self.bn_ac_conv(
+ data=data_in,
+ num_filter=(num_1x1_c + 2 * inc),
+ kernel=(1, 1),
+ pad=(0, 0),
+ stride=(key_stride, key_stride),
+ name=name + "_match")
+ data_o1, data_o2 = fluid.layers.split(
+ c1x1_w,
+ num_or_sections=[num_1x1_c, 2 * inc],
+ dim=1,
+ name=name + "_match_conv_Slice")
+ else:
+ data_o1 = data[0]
+ data_o2 = data[1]
+
+ # MAIN
+ c1x1_a = self.bn_ac_conv(
+ data=data_in,
+ num_filter=num_1x1_a,
+ kernel=(1, 1),
+ pad=(0, 0),
+ name=name + "_conv1")
+ c3x3_b = self.bn_ac_conv(
+ data=c1x1_a,
+ num_filter=num_3x3_b,
+ kernel=(kw, kh),
+ pad=(pw, ph),
+ stride=(key_stride, key_stride),
+ num_group=G,
+ name=name + "_conv2")
+ c1x1_c = self.bn_ac_conv(
+ data=c3x3_b,
+ num_filter=(num_1x1_c + inc),
+ kernel=(1, 1),
+ pad=(0, 0),
+ name=name + "_conv3")
+
+ c1x1_c1, c1x1_c2 = fluid.layers.split(
+ c1x1_c,
+ num_or_sections=[num_1x1_c, inc],
+ dim=1,
+ name=name + "_conv3_Slice")
+
+ # OUTPUTS
+ summ = fluid.layers.elementwise_add(
+ x=data_o1, y=c1x1_c1, name=name + "_elewise")
+ dense = fluid.layers.concat(
+ [data_o2, c1x1_c2], axis=1, name=name + "_concat")
+
+ return [summ, dense]
+
+ def bn_ac_conv(self,
+ data,
+ num_filter,
+ kernel,
+ pad,
+ stride=(1, 1),
+ num_group=1,
+ name=None):
+ bn_ac = fluid.layers.batch_norm(
+ input=data,
+ act='relu',
+ is_test=False,
+ name=name + '.output.1',
+ param_attr=ParamAttr(name=name + '_bn_scale'),
+ bias_attr=ParamAttr(name + '_bn_offset'),
+ moving_mean_name=name + '_bn_mean',
+ moving_variance_name=name + '_bn_variance', )
+ bn_ac_conv = fluid.layers.conv2d(
+ input=bn_ac,
+ num_filters=num_filter,
+ filter_size=kernel,
+ stride=stride,
+ padding=pad,
+ groups=num_group,
+ act=None,
+ bias_attr=False,
+ param_attr=ParamAttr(name=name + "_weights"))
+ return bn_ac_conv
+
+
+def DPN68():
+ model = DPN(layers=68)
+ return model
+
+
+def DPN92():
+ model = DPN(layers=92)
+ return model
+
+
+def DPN98():
+ model = DPN(layers=98)
+ return model
+
+
+def DPN107():
+ model = DPN(layers=107)
+ return model
+
+
+def DPN131():
+ model = DPN(layers=131)
+ return model
diff --git a/PaddleCV/image_classification/models/fast_imagenet.py b/PaddleCV/image_classification/models/fast_imagenet.py
new file mode 100644
index 0000000000000000000000000000000000000000..d59937511e11c183c68c6806ed801b1b96d3b525
--- /dev/null
+++ b/PaddleCV/image_classification/models/fast_imagenet.py
@@ -0,0 +1,146 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import functools
+import numpy as np
+import time
+import os
+import math
+
+import paddle
+import paddle.fluid as fluid
+import paddle.fluid.core as core
+import paddle.fluid.profiler as profiler
+import utils
+
+__all__ = ["FastImageNet"]
+
+class FastImageNet():
+ def __init__(self, layers=50, is_train=True):
+ self.layers = layers
+ self.is_train = is_train
+
+ def net(self, input, class_dim=1000, img_size=224, is_train=True):
+ layers = self.layers
+ supported_layers = [50, 101, 152]
+ assert layers in supported_layers, \
+ "supported layers are {} but input layer is {}".format(supported_layers, layers)
+
+ if layers == 50:
+ depth = [3, 4, 6, 3]
+ elif layers == 101:
+ depth = [3, 4, 23, 3]
+ elif layers == 152:
+ depth = [3, 8, 36, 3]
+ num_filters = [64, 128, 256, 512]
+
+ conv = self.conv_bn_layer(
+ input=input, num_filters=64, filter_size=7, stride=2, act='relu')
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+
+ for block in range(len(depth)):
+ for i in range(depth[block]):
+ conv = self.bottleneck_block(
+ input=conv,
+ num_filters=num_filters[block],
+ stride=2 if i == 0 and block != 0 else 1)
+ pool_size = int(img_size / 32)
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=pool_size, pool_type='avg', global_pooling=True)
+ out = fluid.layers.fc(input=pool,
+ size=class_dim,
+ act=None,
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.NormalInitializer(0.0, 0.01),
+ regularizer=fluid.regularizer.L2Decay(1e-4)),
+ bias_attr=fluid.ParamAttr(
+ regularizer=fluid.regularizer.L2Decay(1e-4)))
+ return out
+
+ def conv_bn_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None,
+ bn_init_value=1.0):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ act=None,
+ bias_attr=False,
+ param_attr=fluid.ParamAttr(regularizer=fluid.regularizer.L2Decay(1e-4)))
+ return fluid.layers.batch_norm(input=conv, act=act, is_test=not self.is_train,
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Constant(bn_init_value),
+ regularizer=None))
+
+ def shortcut(self, input, ch_out, stride):
+ ch_in = input.shape[1]
+ if ch_in != ch_out or stride != 1:
+ return self.conv_bn_layer(input, ch_out, 1, stride)
+ else:
+ return input
+
+ def bottleneck_block(self, input, num_filters, stride):
+ conv0 = self.conv_bn_layer(
+ input=input, num_filters=num_filters, filter_size=1, act='relu')
+ conv1 = self.conv_bn_layer(
+ input=conv0,
+ num_filters=num_filters,
+ filter_size=3,
+ stride=stride,
+ act='relu')
+ # init bn-weight0
+ conv2 = self.conv_bn_layer(
+ input=conv1, num_filters=num_filters * 4, filter_size=1, act=None, bn_init_value=0.0)
+
+ short = self.shortcut(input, num_filters * 4, stride)
+
+ return fluid.layers.elementwise_add(x=short, y=conv2, act='relu')
+
+def lr_decay(lrs, epochs, bs, total_image):
+ boundaries = []
+ values = []
+ for idx, epoch in enumerate(epochs):
+ step = total_image // bs[idx]
+ if step * bs[idx] < total_image:
+ step += 1
+ ratio = (lrs[idx][1] - lrs[idx][0])*1.0 / (epoch[1] - epoch[0])
+ lr_base = lrs[idx][0]
+ for s in range(epoch[0], epoch[1]):
+ if boundaries:
+ boundaries.append(boundaries[-1] + step + 1)
+ else:
+ boundaries = [step]
+ lr = lr_base + ratio * (s - epoch[0])
+ values.append(lr)
+ print("epoch: [%d], steps: [%d], lr: [%f]" % (s, boundaries[-1], values[-1]))
+ values.append(lrs[-1])
+ print("epoch: [%d:], steps: [%d:], lr:[%f]" % (epochs[-1][-1], boundaries[-1], values[-1]))
+ return boundaries, values
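+# Usage sketch (illustrative values, not defined in this repo): `lrs` holds one
+# [start, end] pair per segment plus a trailing scalar lr, `epochs` the matching
+# [begin, end) epoch ranges, and `bs` the per-segment batch sizes.
+#
+#   boundaries, values = lr_decay(
+#       lrs=[[0.1, 0.4], [0.4, 0.01], [0.01, 0.001], 0.0001],
+#       epochs=[[0, 5], [5, 30], [30, 60]],
+#       bs=[256, 256, 256],
+#       total_image=1281167)
+#   lr = fluid.layers.piecewise_decay(boundaries=boundaries, values=values)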
diff --git a/PaddleCV/image_classification/models/googlenet.py b/PaddleCV/image_classification/models/googlenet.py
new file mode 100644
index 0000000000000000000000000000000000000000..bd9040c53e61a48d9f5bff6683bec961d3f95583
--- /dev/null
+++ b/PaddleCV/image_classification/models/googlenet.py
@@ -0,0 +1,233 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = ['GoogleNet']
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 70, 100],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class GoogleNet():
+ def __init__(self):
+ self.params = train_parameters
+
+ def conv_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None,
+ name=None):
+ channels = input.shape[1]
+ stdv = (3.0 / (filter_size**2 * channels))**0.5
+ param_attr = ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=name + "_weights")
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ act=act,
+ param_attr=param_attr,
+ bias_attr=False,
+ name=name)
+ return conv
+
+ def xavier(self, channels, filter_size, name):
+ stdv = (3.0 / (filter_size**2 * channels))**0.5
+ param_attr = ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=name + "_weights")
+
+ return param_attr
+
+ def inception(self,
+ input,
+ channels,
+ filter1,
+ filter3R,
+ filter3,
+ filter5R,
+ filter5,
+ proj,
+ name=None):
+ conv1 = self.conv_layer(
+ input=input,
+ num_filters=filter1,
+ filter_size=1,
+ stride=1,
+ act=None,
+ name="inception_" + name + "_1x1")
+ conv3r = self.conv_layer(
+ input=input,
+ num_filters=filter3R,
+ filter_size=1,
+ stride=1,
+ act=None,
+ name="inception_" + name + "_3x3_reduce")
+ conv3 = self.conv_layer(
+ input=conv3r,
+ num_filters=filter3,
+ filter_size=3,
+ stride=1,
+ act=None,
+ name="inception_" + name + "_3x3")
+ conv5r = self.conv_layer(
+ input=input,
+ num_filters=filter5R,
+ filter_size=1,
+ stride=1,
+ act=None,
+ name="inception_" + name + "_5x5_reduce")
+ conv5 = self.conv_layer(
+ input=conv5r,
+ num_filters=filter5,
+ filter_size=5,
+ stride=1,
+ act=None,
+ name="inception_" + name + "_5x5")
+ pool = fluid.layers.pool2d(
+ input=input,
+ pool_size=3,
+ pool_stride=1,
+ pool_padding=1,
+ pool_type='max')
+ convprj = fluid.layers.conv2d(
+ input=pool,
+ filter_size=1,
+ num_filters=proj,
+ stride=1,
+ padding=0,
+ name="inception_" + name + "_3x3_proj",
+ param_attr=ParamAttr(
+ name="inception_" + name + "_3x3_proj_weights"),
+ bias_attr=False)
+ cat = fluid.layers.concat(input=[conv1, conv3, conv5, convprj], axis=1)
+ cat = fluid.layers.relu(cat)
+ return cat
+
+ def net(self, input, class_dim=1000):
+ conv = self.conv_layer(
+ input=input,
+ num_filters=64,
+ filter_size=7,
+ stride=2,
+ act=None,
+ name="conv1")
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=3, pool_type='max', pool_stride=2)
+
+ conv = self.conv_layer(
+ input=pool,
+ num_filters=64,
+ filter_size=1,
+ stride=1,
+ act=None,
+ name="conv2_1x1")
+ conv = self.conv_layer(
+ input=conv,
+ num_filters=192,
+ filter_size=3,
+ stride=1,
+ act=None,
+ name="conv2_3x3")
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=3, pool_type='max', pool_stride=2)
+
+ ince3a = self.inception(pool, 192, 64, 96, 128, 16, 32, 32, "ince3a")
+ ince3b = self.inception(ince3a, 256, 128, 128, 192, 32, 96, 64,
+ "ince3b")
+ pool3 = fluid.layers.pool2d(
+ input=ince3b, pool_size=3, pool_type='max', pool_stride=2)
+
+ ince4a = self.inception(pool3, 480, 192, 96, 208, 16, 48, 64, "ince4a")
+ ince4b = self.inception(ince4a, 512, 160, 112, 224, 24, 64, 64,
+ "ince4b")
+ ince4c = self.inception(ince4b, 512, 128, 128, 256, 24, 64, 64,
+ "ince4c")
+ ince4d = self.inception(ince4c, 512, 112, 144, 288, 32, 64, 64,
+ "ince4d")
+ ince4e = self.inception(ince4d, 528, 256, 160, 320, 32, 128, 128,
+ "ince4e")
+ pool4 = fluid.layers.pool2d(
+ input=ince4e, pool_size=3, pool_type='max', pool_stride=2)
+
+ ince5a = self.inception(pool4, 832, 256, 160, 320, 32, 128, 128,
+ "ince5a")
+ ince5b = self.inception(ince5a, 832, 384, 192, 384, 48, 128, 128,
+ "ince5b")
+ pool5 = fluid.layers.pool2d(
+ input=ince5b, pool_size=7, pool_type='avg', pool_stride=7)
+ dropout = fluid.layers.dropout(x=pool5, dropout_prob=0.4)
+ out = fluid.layers.fc(input=dropout,
+ size=class_dim,
+ act='softmax',
+ param_attr=self.xavier(1024, 1, "out"),
+ name="out",
+ bias_attr=ParamAttr(name="out_offset"))
+
+ pool_o1 = fluid.layers.pool2d(
+ input=ince4a, pool_size=5, pool_type='avg', pool_stride=3)
+ conv_o1 = self.conv_layer(
+ input=pool_o1,
+ num_filters=128,
+ filter_size=1,
+ stride=1,
+ act=None,
+ name="conv_o1")
+ fc_o1 = fluid.layers.fc(input=conv_o1,
+ size=1024,
+ act='relu',
+ param_attr=self.xavier(2048, 1, "fc_o1"),
+ name="fc_o1",
+ bias_attr=ParamAttr(name="fc_o1_offset"))
+ dropout_o1 = fluid.layers.dropout(x=fc_o1, dropout_prob=0.7)
+ out1 = fluid.layers.fc(input=dropout_o1,
+ size=class_dim,
+ act='softmax',
+ param_attr=self.xavier(1024, 1, "out1"),
+ name="out1",
+ bias_attr=ParamAttr(name="out1_offset"))
+
+ pool_o2 = fluid.layers.pool2d(
+ input=ince4d, pool_size=5, pool_type='avg', pool_stride=3)
+ conv_o2 = self.conv_layer(
+ input=pool_o2,
+ num_filters=128,
+ filter_size=1,
+ stride=1,
+ act=None,
+ name="conv_o2")
+ fc_o2 = fluid.layers.fc(input=conv_o2,
+ size=1024,
+ act='relu',
+ param_attr=self.xavier(2048, 1, "fc_o2"),
+ name="fc_o2",
+ bias_attr=ParamAttr(name="fc_o2_offset"))
+ dropout_o2 = fluid.layers.dropout(x=fc_o2, dropout_prob=0.7)
+ out2 = fluid.layers.fc(input=dropout_o2,
+ size=class_dim,
+ act='softmax',
+ param_attr=self.xavier(1024, 1, "out2"),
+ name="out2",
+ bias_attr=ParamAttr(name="out2_offset"))
+
+ # "out" is the main classifier; out1 and out2 are the auxiliary classifiers on ince4a and ince4d
+ return out, out1, out2
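+# Training sketch (assumption: the 0.3 auxiliary-loss weights follow the
+# GoogLeNet paper; this file only returns the three heads):
+#   out, out1, out2 = GoogleNet().net(image, class_dim=1000)
+#   loss = fluid.layers.cross_entropy(out, label)
+#   loss1 = fluid.layers.cross_entropy(out1, label)
+#   loss2 = fluid.layers.cross_entropy(out2, label)
+#   avg_loss = fluid.layers.mean(loss + 0.3 * loss1 + 0.3 * loss2)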
diff --git a/PaddleCV/image_classification/models/inception_v4.py b/PaddleCV/image_classification/models/inception_v4.py
new file mode 100644
index 0000000000000000000000000000000000000000..8c6c0dbb129f903b4f0b849f930a520b5f17e5db
--- /dev/null
+++ b/PaddleCV/image_classification/models/inception_v4.py
@@ -0,0 +1,340 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import math
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = ['InceptionV4']
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class InceptionV4():
+ def __init__(self):
+ self.params = train_parameters
+
+ def net(self, input, class_dim=1000):
+ x = self.inception_stem(input)
+
+ for i in range(4):
+ x = self.inceptionA(x, name=str(i + 1))
+ x = self.reductionA(x)
+
+ for i in range(7):
+ x = self.inceptionB(x, name=str(i + 1))
+ x = self.reductionB(x)
+
+ for i in range(3):
+ x = self.inceptionC(x, name=str(i + 1))
+
+ pool = fluid.layers.pool2d(
+ input=x, pool_size=8, pool_type='avg', global_pooling=True)
+
+ drop = fluid.layers.dropout(x=pool, dropout_prob=0.2)
+
+ stdv = 1.0 / math.sqrt(drop.shape[1] * 1.0)
+ out = fluid.layers.fc(
+ input=drop,
+ size=class_dim,
+ param_attr=ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name="final_fc_weights"),
+ bias_attr=ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name="final_fc_offset"))
+ return out
+
+ def conv_bn_layer(self,
+ data,
+ num_filters,
+ filter_size,
+ stride=1,
+ padding=0,
+ groups=1,
+ act='relu',
+ name=None):
+ conv = fluid.layers.conv2d(
+ input=data,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ groups=groups,
+ act=None,
+ param_attr=ParamAttr(name=name + "_weights"),
+ bias_attr=False,
+ name=name)
+ bn_name = name + "_bn"
+ return fluid.layers.batch_norm(
+ input=conv,
+ act=act,
+ name=bn_name,
+ param_attr=ParamAttr(name=bn_name + "_scale"),
+ bias_attr=ParamAttr(name=bn_name + "_offset"),
+ moving_mean_name=bn_name + '_mean',
+ moving_variance_name=bn_name + '_variance')
+
+ def inception_stem(self, data, name=None):
+ conv = self.conv_bn_layer(
+ data, 32, 3, stride=2, act='relu', name="conv1_3x3_s2")
+ conv = self.conv_bn_layer(conv, 32, 3, act='relu', name="conv2_3x3_s1")
+ conv = self.conv_bn_layer(
+ conv, 64, 3, padding=1, act='relu', name="conv3_3x3_s1")
+
+ pool1 = fluid.layers.pool2d(
+ input=conv, pool_size=3, pool_stride=2, pool_type='max')
+ conv2 = self.conv_bn_layer(
+ conv, 96, 3, stride=2, act='relu', name="inception_stem1_3x3_s2")
+ concat = fluid.layers.concat([pool1, conv2], axis=1)
+
+ conv1 = self.conv_bn_layer(
+ concat, 64, 1, act='relu', name="inception_stem2_3x3_reduce")
+ conv1 = self.conv_bn_layer(
+ conv1, 96, 3, act='relu', name="inception_stem2_3x3")
+
+ conv2 = self.conv_bn_layer(
+ concat, 64, 1, act='relu', name="inception_stem2_1x7_reduce")
+ conv2 = self.conv_bn_layer(
+ conv2,
+ 64, (7, 1),
+ padding=(3, 0),
+ act='relu',
+ name="inception_stem2_1x7")
+ conv2 = self.conv_bn_layer(
+ conv2,
+ 64, (1, 7),
+ padding=(0, 3),
+ act='relu',
+ name="inception_stem2_7x1")
+ conv2 = self.conv_bn_layer(
+ conv2, 96, 3, act='relu', name="inception_stem2_3x3_2")
+
+ concat = fluid.layers.concat([conv1, conv2], axis=1)
+
+ conv1 = self.conv_bn_layer(
+ concat, 192, 3, stride=2, act='relu', name="inception_stem3_3x3_s2")
+ pool1 = fluid.layers.pool2d(
+ input=concat, pool_size=3, pool_stride=2, pool_type='max')
+
+ concat = fluid.layers.concat([conv1, pool1], axis=1)
+
+ return concat
+
+ def inceptionA(self, data, name=None):
+ pool1 = fluid.layers.pool2d(
+ input=data, pool_size=3, pool_padding=1, pool_type='avg')
+ conv1 = self.conv_bn_layer(
+ pool1, 96, 1, act='relu', name="inception_a" + name + "_1x1")
+
+ conv2 = self.conv_bn_layer(
+ data, 96, 1, act='relu', name="inception_a" + name + "_1x1_2")
+
+ conv3 = self.conv_bn_layer(
+ data, 64, 1, act='relu', name="inception_a" + name + "_3x3_reduce")
+ conv3 = self.conv_bn_layer(
+ conv3,
+ 96,
+ 3,
+ padding=1,
+ act='relu',
+ name="inception_a" + name + "_3x3")
+
+ conv4 = self.conv_bn_layer(
+ data,
+ 64,
+ 1,
+ act='relu',
+ name="inception_a" + name + "_3x3_2_reduce")
+ conv4 = self.conv_bn_layer(
+ conv4,
+ 96,
+ 3,
+ padding=1,
+ act='relu',
+ name="inception_a" + name + "_3x3_2")
+ conv4 = self.conv_bn_layer(
+ conv4,
+ 96,
+ 3,
+ padding=1,
+ act='relu',
+ name="inception_a" + name + "_3x3_3")
+
+ concat = fluid.layers.concat([conv1, conv2, conv3, conv4], axis=1)
+
+ return concat
+
+ def reductionA(self, data, name=None):
+ pool1 = fluid.layers.pool2d(
+ input=data, pool_size=3, pool_stride=2, pool_type='max')
+
+ conv2 = self.conv_bn_layer(
+ data, 384, 3, stride=2, act='relu', name="reduction_a_3x3")
+
+ conv3 = self.conv_bn_layer(
+ data, 192, 1, act='relu', name="reduction_a_3x3_2_reduce")
+ conv3 = self.conv_bn_layer(
+ conv3, 224, 3, padding=1, act='relu', name="reduction_a_3x3_2")
+ conv3 = self.conv_bn_layer(
+ conv3, 256, 3, stride=2, act='relu', name="reduction_a_3x3_3")
+
+ concat = fluid.layers.concat([pool1, conv2, conv3], axis=1)
+
+ return concat
+
+ def inceptionB(self, data, name=None):
+ pool1 = fluid.layers.pool2d(
+ input=data, pool_size=3, pool_padding=1, pool_type='avg')
+ conv1 = self.conv_bn_layer(
+ pool1, 128, 1, act='relu', name="inception_b" + name + "_1x1")
+
+ conv2 = self.conv_bn_layer(
+ data, 384, 1, act='relu', name="inception_b" + name + "_1x1_2")
+
+ conv3 = self.conv_bn_layer(
+ data, 192, 1, act='relu', name="inception_b" + name + "_1x7_reduce")
+ conv3 = self.conv_bn_layer(
+ conv3,
+ 224, (1, 7),
+ padding=(0, 3),
+ act='relu',
+ name="inception_b" + name + "_1x7")
+ conv3 = self.conv_bn_layer(
+ conv3,
+ 256, (7, 1),
+ padding=(3, 0),
+ act='relu',
+ name="inception_b" + name + "_7x1")
+
+ conv4 = self.conv_bn_layer(
+ data,
+ 192,
+ 1,
+ act='relu',
+ name="inception_b" + name + "_7x1_2_reduce")
+ conv4 = self.conv_bn_layer(
+ conv4,
+ 192, (1, 7),
+ padding=(0, 3),
+ act='relu',
+ name="inception_b" + name + "_1x7_2")
+ conv4 = self.conv_bn_layer(
+ conv4,
+ 224, (7, 1),
+ padding=(3, 0),
+ act='relu',
+ name="inception_b" + name + "_7x1_2")
+ conv4 = self.conv_bn_layer(
+ conv4,
+ 224, (1, 7),
+ padding=(0, 3),
+ act='relu',
+ name="inception_b" + name + "_1x7_3")
+ conv4 = self.conv_bn_layer(
+ conv4,
+ 256, (7, 1),
+ padding=(3, 0),
+ act='relu',
+ name="inception_b" + name + "_7x1_3")
+
+ concat = fluid.layers.concat([conv1, conv2, conv3, conv4], axis=1)
+
+ return concat
+
+ def reductionB(self, data, name=None):
+ pool1 = fluid.layers.pool2d(
+ input=data, pool_size=3, pool_stride=2, pool_type='max')
+
+ conv2 = self.conv_bn_layer(
+ data, 192, 1, act='relu', name="reduction_b_3x3_reduce")
+ conv2 = self.conv_bn_layer(
+ conv2, 192, 3, stride=2, act='relu', name="reduction_b_3x3")
+
+ conv3 = self.conv_bn_layer(
+ data, 256, 1, act='relu', name="reduction_b_1x7_reduce")
+ conv3 = self.conv_bn_layer(
+ conv3,
+ 256, (1, 7),
+ padding=(0, 3),
+ act='relu',
+ name="reduction_b_1x7")
+ conv3 = self.conv_bn_layer(
+ conv3,
+ 320, (7, 1),
+ padding=(3, 0),
+ act='relu',
+ name="reduction_b_7x1")
+ conv3 = self.conv_bn_layer(
+ conv3, 320, 3, stride=2, act='relu', name="reduction_b_3x3_2")
+
+ concat = fluid.layers.concat([pool1, conv2, conv3], axis=1)
+
+ return concat
+
+ def inceptionC(self, data, name=None):
+ pool1 = fluid.layers.pool2d(
+ input=data, pool_size=3, pool_padding=1, pool_type='avg')
+ conv1 = self.conv_bn_layer(
+ pool1, 256, 1, act='relu', name="inception_c" + name + "_1x1")
+
+ conv2 = self.conv_bn_layer(
+ data, 256, 1, act='relu', name="inception_c" + name + "_1x1_2")
+
+ conv3 = self.conv_bn_layer(
+ data, 384, 1, act='relu', name="inception_c" + name + "_1x1_3")
+ conv3_1 = self.conv_bn_layer(
+ conv3,
+ 256, (1, 3),
+ padding=(0, 1),
+ act='relu',
+ name="inception_c" + name + "_1x3")
+ conv3_2 = self.conv_bn_layer(
+ conv3,
+ 256, (3, 1),
+ padding=(1, 0),
+ act='relu',
+ name="inception_c" + name + "_3x1")
+
+ conv4 = self.conv_bn_layer(
+ data, 384, 1, act='relu', name="inception_c" + name + "_1x1_4")
+ conv4 = self.conv_bn_layer(
+ conv4,
+ 448, (1, 3),
+ padding=(0, 1),
+ act='relu',
+ name="inception_c" + name + "_1x3_2")
+ conv4 = self.conv_bn_layer(
+ conv4,
+ 512, (3, 1),
+ padding=(1, 0),
+ act='relu',
+ name="inception_c" + name + "_3x1_2")
+ conv4_1 = self.conv_bn_layer(
+ conv4,
+ 256, (1, 3),
+ padding=(0, 1),
+ act='relu',
+ name="inception_c" + name + "_1x3_3")
+ conv4_2 = self.conv_bn_layer(
+ conv4,
+ 256, (3, 1),
+ padding=(1, 0),
+ act='relu',
+ name="inception_c" + name + "_3x1_3")
+
+ concat = fluid.layers.concat(
+ [conv1, conv2, conv3_1, conv3_2, conv4_1, conv4_2], axis=1)
+
+ return concat
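+# Note: the (1, 7)/(7, 1) and (1, 3)/(3, 1) pairs above factorize a k x k
+# convolution into two asymmetric ones, cutting its cost from k*k to roughly
+# 2*k multiplies per output element (the Inception-v3/v4 design).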
diff --git a/PaddleCV/image_classification/models/mobilenet.py b/PaddleCV/image_classification/models/mobilenet.py
new file mode 100644
index 0000000000000000000000000000000000000000..d242bc946a7b4bec9c9d2e34da2496c0901ba870
--- /dev/null
+++ b/PaddleCV/image_classification/models/mobilenet.py
@@ -0,0 +1,195 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle.fluid as fluid
+from paddle.fluid.initializer import MSRA
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = ['MobileNet']
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class MobileNet():
+ def __init__(self):
+ self.params = train_parameters
+
+ def net(self, input, class_dim=1000, scale=1.0):
+ # conv1: 112x112
+ input = self.conv_bn_layer(
+ input,
+ filter_size=3,
+ channels=3,
+ num_filters=int(32 * scale),
+ stride=2,
+ padding=1,
+ name="conv1")
+
+ # 56x56
+ input = self.depthwise_separable(
+ input,
+ num_filters1=32,
+ num_filters2=64,
+ num_groups=32,
+ stride=1,
+ scale=scale,
+ name="conv2_1")
+
+ input = self.depthwise_separable(
+ input,
+ num_filters1=64,
+ num_filters2=128,
+ num_groups=64,
+ stride=2,
+ scale=scale,
+ name="conv2_2")
+
+ # 28x28
+ input = self.depthwise_separable(
+ input,
+ num_filters1=128,
+ num_filters2=128,
+ num_groups=128,
+ stride=1,
+ scale=scale,
+ name="conv3_1")
+
+ input = self.depthwise_separable(
+ input,
+ num_filters1=128,
+ num_filters2=256,
+ num_groups=128,
+ stride=2,
+ scale=scale,
+ name="conv3_2")
+
+ # 14x14
+ input = self.depthwise_separable(
+ input,
+ num_filters1=256,
+ num_filters2=256,
+ num_groups=256,
+ stride=1,
+ scale=scale,
+ name="conv4_1")
+
+ input = self.depthwise_separable(
+ input,
+ num_filters1=256,
+ num_filters2=512,
+ num_groups=256,
+ stride=2,
+ scale=scale,
+ name="conv4_2")
+
+ # 14x14
+ for i in range(5):
+ input = self.depthwise_separable(
+ input,
+ num_filters1=512,
+ num_filters2=512,
+ num_groups=512,
+ stride=1,
+ scale=scale,
+ name="conv5" + "_" + str(i + 1))
+ # 7x7
+ input = self.depthwise_separable(
+ input,
+ num_filters1=512,
+ num_filters2=1024,
+ num_groups=512,
+ stride=2,
+ scale=scale,
+ name="conv5_6")
+
+ input = self.depthwise_separable(
+ input,
+ num_filters1=1024,
+ num_filters2=1024,
+ num_groups=1024,
+ stride=1,
+ scale=scale,
+ name="conv6")
+
+ input = fluid.layers.pool2d(
+ input=input,
+ pool_size=0,
+ pool_stride=1,
+ pool_type='avg',
+ global_pooling=True)
+
+ output = fluid.layers.fc(input=input,
+ size=class_dim,
+ param_attr=ParamAttr(
+ initializer=MSRA(), name="fc7_weights"),
+ bias_attr=ParamAttr(name="fc7_offset"))
+ return output
+
+ def conv_bn_layer(self,
+ input,
+ filter_size,
+ num_filters,
+ stride,
+ padding,
+ channels=None,
+ num_groups=1,
+ act='relu',
+ use_cudnn=True,
+ name=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ groups=num_groups,
+ act=None,
+ use_cudnn=use_cudnn,
+ param_attr=ParamAttr(
+ initializer=MSRA(), name=name + "_weights"),
+ bias_attr=False)
+ bn_name = name + "_bn"
+ return fluid.layers.batch_norm(
+ input=conv,
+ act=act,
+ param_attr=ParamAttr(name=bn_name + "_scale"),
+ bias_attr=ParamAttr(name=bn_name + "_offset"),
+ moving_mean_name=bn_name + '_mean',
+ moving_variance_name=bn_name + '_variance')
+
+ def depthwise_separable(self,
+ input,
+ num_filters1,
+ num_filters2,
+ num_groups,
+ stride,
+ scale,
+ name=None):
+ depthwise_conv = self.conv_bn_layer(
+ input=input,
+ filter_size=3,
+ num_filters=int(num_filters1 * scale),
+ stride=stride,
+ padding=1,
+ num_groups=int(num_groups * scale),
+ use_cudnn=False,
+ name=name + "_dw")
+
+ pointwise_conv = self.conv_bn_layer(
+ input=depthwise_conv,
+ filter_size=1,
+ num_filters=int(num_filters2 * scale),
+ stride=1,
+ padding=0,
+ name=name + "_sep")
+ return pointwise_conv
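+# Cost note: a 3x3 depthwise conv plus a 1x1 pointwise conv costs roughly
+# (9*Cin + Cin*Cout)*H*W multiplies versus 9*Cin*Cout*H*W for a dense 3x3
+# conv, i.e. about an 8-9x reduction at typical channel counts.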
diff --git a/PaddleCV/image_classification/models/mobilenet_v2.py b/PaddleCV/image_classification/models/mobilenet_v2.py
new file mode 100644
index 0000000000000000000000000000000000000000..77d88c7da625c0c953c75d229148868f0481f2a2
--- /dev/null
+++ b/PaddleCV/image_classification/models/mobilenet_v2.py
@@ -0,0 +1,198 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle.fluid as fluid
+from paddle.fluid.initializer import MSRA
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = ['MobileNetV2']
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class MobileNetV2():
+ def __init__(self):
+ self.params = train_parameters
+
+ def net(self, input, class_dim=1000, scale=1.0):
+
+ bottleneck_params_list = [
+ (1, 16, 1, 1),
+ (6, 24, 2, 2),
+ (6, 32, 3, 2),
+ (6, 64, 4, 2),
+ (6, 96, 3, 1),
+ (6, 160, 3, 2),
+ (6, 320, 1, 1),
+ ]
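+ # each tuple is (t, c, n, s): expansion factor, output channels,
+ # number of repeats, and stride of the block's first unit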
+
+ #conv1
+ input = self.conv_bn_layer(
+ input,
+ num_filters=int(32 * scale),
+ filter_size=3,
+ stride=2,
+ padding=1,
+ if_act=True,
+ name='conv1_1')
+
+ # bottleneck sequences
+ i = 1
+ in_c = int(32 * scale)
+ for layer_setting in bottleneck_params_list:
+ t, c, n, s = layer_setting
+ i += 1
+ input = self.invresi_blocks(
+ input=input,
+ in_c=in_c,
+ t=t,
+ c=int(c * scale),
+ n=n,
+ s=s,
+ name='conv' + str(i))
+ in_c = int(c * scale)
+ #last_conv
+ input = self.conv_bn_layer(
+ input=input,
+ num_filters=int(1280 * scale) if scale > 1.0 else 1280,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ if_act=True,
+ name='conv9')
+
+ input = fluid.layers.pool2d(
+ input=input,
+ pool_size=7,
+ pool_stride=1,
+ pool_type='avg',
+ global_pooling=True)
+
+ output = fluid.layers.fc(input=input,
+ size=class_dim,
+ param_attr=ParamAttr(name='fc10_weights'),
+ bias_attr=ParamAttr(name='fc10_offset'))
+ return output
+
+ def conv_bn_layer(self,
+ input,
+ filter_size,
+ num_filters,
+ stride,
+ padding,
+ channels=None,
+ num_groups=1,
+ if_act=True,
+ name=None,
+ use_cudnn=True):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ groups=num_groups,
+ act=None,
+ use_cudnn=use_cudnn,
+ param_attr=ParamAttr(name=name + '_weights'),
+ bias_attr=False)
+ bn_name = name + '_bn'
+ bn = fluid.layers.batch_norm(
+ input=conv,
+ param_attr=ParamAttr(name=bn_name + "_scale"),
+ bias_attr=ParamAttr(name=bn_name + "_offset"),
+ moving_mean_name=bn_name + '_mean',
+ moving_variance_name=bn_name + '_variance')
+ if if_act:
+ return fluid.layers.relu6(bn)
+ else:
+ return bn
+
+ def shortcut(self, input, data_residual):
+ return fluid.layers.elementwise_add(input, data_residual)
+
+ def inverted_residual_unit(self,
+ input,
+ num_in_filter,
+ num_filters,
+ ifshortcut,
+ stride,
+ filter_size,
+ padding,
+ expansion_factor,
+ name=None):
+ num_expfilter = int(round(num_in_filter * expansion_factor))
+
+ channel_expand = self.conv_bn_layer(
+ input=input,
+ num_filters=num_expfilter,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=True,
+ name=name + '_expand')
+
+ bottleneck_conv = self.conv_bn_layer(
+ input=channel_expand,
+ num_filters=num_expfilter,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ num_groups=num_expfilter,
+ if_act=True,
+ name=name + '_dwise',
+ use_cudnn=False)
+
+ linear_out = self.conv_bn_layer(
+ input=bottleneck_conv,
+ num_filters=num_filters,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=False,
+ name=name + '_linear')
+ if ifshortcut:
+ out = self.shortcut(input=input, data_residual=linear_out)
+ return out
+ else:
+ return linear_out
+
+ def invresi_blocks(self, input, in_c, t, c, n, s, name=None):
+ first_block = self.inverted_residual_unit(
+ input=input,
+ num_in_filter=in_c,
+ num_filters=c,
+ ifshortcut=False,
+ stride=s,
+ filter_size=3,
+ padding=1,
+ expansion_factor=t,
+ name=name + '_1')
+
+ last_residual_block = first_block
+ last_c = c
+
+ for i in range(1, n):
+ last_residual_block = self.inverted_residual_unit(
+ input=last_residual_block,
+ num_in_filter=last_c,
+ num_filters=c,
+ ifshortcut=True,
+ stride=1,
+ filter_size=3,
+ padding=1,
+ expansion_factor=t,
+ name=name + '_' + str(i + 1))
+ return last_residual_block
diff --git a/PaddleCV/image_classification/models/resnet.py b/PaddleCV/image_classification/models/resnet.py
new file mode 100644
index 0000000000000000000000000000000000000000..b44a6192e798e7513fed0af2ee17023b4ef2aec6
--- /dev/null
+++ b/PaddleCV/image_classification/models/resnet.py
@@ -0,0 +1,182 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import math
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = ["ResNet", "ResNet18", "ResNet34", "ResNet50", "ResNet101", "ResNet152"]
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class ResNet():
+ def __init__(self, layers=50):
+ self.params = train_parameters
+ self.layers = layers
+
+ def net(self, input, class_dim=1000):
+ layers = self.layers
+ supported_layers = [18, 34, 50, 101, 152]
+ assert layers in supported_layers, \
+ "supported layers are {} but input layer is {}".format(supported_layers, layers)
+
+ if layers == 18:
+ depth = [2, 2, 2, 2]
+ elif layers == 34 or layers == 50:
+ depth = [3, 4, 6, 3]
+ elif layers == 101:
+ depth = [3, 4, 23, 3]
+ elif layers == 152:
+ depth = [3, 8, 36, 3]
+ num_filters = [64, 128, 256, 512]
+
+ conv = self.conv_bn_layer(
+ input=input, num_filters=64, filter_size=7, stride=2, act='relu',name="conv1")
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+ if layers >= 50:
+ for block in range(len(depth)):
+ for i in range(depth[block]):
+ if layers in [101, 152] and block == 2:
+ if i == 0:
+ conv_name="res"+str(block+2)+"a"
+ else:
+ conv_name="res"+str(block+2)+"b"+str(i)
+ else:
+ conv_name="res"+str(block+2)+chr(97+i)
+ conv = self.bottleneck_block(
+ input=conv,
+ num_filters=num_filters[block],
+ stride=2 if i == 0 and block != 0 else 1, name=conv_name)
+
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=7, pool_type='avg', global_pooling=True)
+ stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
+ out = fluid.layers.fc(input=pool,
+ size=class_dim,
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)))
+ else:
+ for block in range(len(depth)):
+ for i in range(depth[block]):
+ conv_name="res"+str(block+2)+chr(97+i)
+ conv = self.basic_block(
+ input=conv,
+ num_filters=num_filters[block],
+ stride=2 if i == 0 and block != 0 else 1,
+ is_first=block==i==0,
+ name=conv_name)
+
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=7, pool_type='avg', global_pooling=True)
+ stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
+ out = fluid.layers.fc(input=pool,
+ size=class_dim,
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv)))
+ return out
+
+ def conv_bn_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None,
+ name=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ act=None,
+ param_attr=ParamAttr(name=name + "_weights"),
+ bias_attr=False,
+ name=name + '.conv2d.output.1')
+
+ if name == "conv1":
+ bn_name = "bn_" + name
+ else:
+ bn_name = "bn" + name[3:]
+ return fluid.layers.batch_norm(input=conv,
+ act=act,
+ name=bn_name+'.output.1',
+ param_attr=ParamAttr(name=bn_name + '_scale'),
+ bias_attr=ParamAttr(bn_name + '_offset'),
+ moving_mean_name=bn_name + '_mean',
+ moving_variance_name=bn_name + '_variance',)
+
+ def shortcut(self, input, ch_out, stride, is_first, name):
+ ch_in = input.shape[1]
+ if ch_in != ch_out or stride != 1 or is_first:
+ return self.conv_bn_layer(input, ch_out, 1, stride, name=name)
+ else:
+ return input
+
+ def bottleneck_block(self, input, num_filters, stride, name):
+ conv0 = self.conv_bn_layer(
+ input=input, num_filters=num_filters, filter_size=1, act='relu',name=name+"_branch2a")
+ conv1 = self.conv_bn_layer(
+ input=conv0,
+ num_filters=num_filters,
+ filter_size=3,
+ stride=stride,
+ act='relu',
+ name=name+"_branch2b")
+ conv2 = self.conv_bn_layer(
+ input=conv1, num_filters=num_filters * 4, filter_size=1, act=None, name=name+"_branch2c")
+
+ short = self.shortcut(input, num_filters * 4, stride, is_first=False, name=name + "_branch1")
+
+ return fluid.layers.elementwise_add(x=short, y=conv2, act='relu',name=name+".add.output.5")
+
+ def basic_block(self, input, num_filters, stride, is_first, name):
+ conv0 = self.conv_bn_layer(input=input, num_filters=num_filters, filter_size=3, act='relu', stride=stride,
+ name=name+"_branch2a")
+ conv1 = self.conv_bn_layer(input=conv0, num_filters=num_filters, filter_size=3, act=None,
+ name=name+"_branch2b")
+ short = self.shortcut(input, num_filters, stride, is_first, name=name + "_branch1")
+ return fluid.layers.elementwise_add(x=short, y=conv1, act='relu')
+
+
+def ResNet18():
+ model = ResNet(layers=18)
+ return model
+
+
+def ResNet34():
+ model = ResNet(layers=34)
+ return model
+
+
+def ResNet50():
+ model = ResNet(layers=50)
+ return model
+
+
+def ResNet101():
+ model = ResNet(layers=101)
+ return model
+
+
+def ResNet152():
+ model = ResNet(layers=152)
+ return model
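+# Usage sketch (inputs assumed NCHW float32; the final fc has no activation,
+# so pair it with softmax/cross-entropy externally):
+#   model = ResNet50()
+#   logits = model.net(image, class_dim=1000)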
diff --git a/PaddleCV/image_classification/models/resnet_dist.py b/PaddleCV/image_classification/models/resnet_dist.py
new file mode 100644
index 0000000000000000000000000000000000000000..3420d790c25534b4a73ea660b2d880ff899ee62f
--- /dev/null
+++ b/PaddleCV/image_classification/models/resnet_dist.py
@@ -0,0 +1,122 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import math
+
+__all__ = ["DistResNet"]
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 80],
+ "steps": [0.1, 0.01, 0.001, 0.0001],
+ "warmup_passes": 5
+ }
+}
+
+
+class DistResNet():
+ def __init__(self, layers=50, is_train=True):
+ self.params = train_parameters
+ self.layers = layers
+ self.is_train = is_train
+ self.weight_decay = 1e-4
+
+ def net(self, input, class_dim=1000):
+ layers = self.layers
+ supported_layers = [50, 101, 152]
+ assert layers in supported_layers, \
+ "supported layers are {} but input layer is {}".format(supported_layers, layers)
+
+ if layers == 50:
+ depth = [3, 4, 6, 3]
+ elif layers == 101:
+ depth = [3, 4, 23, 3]
+ elif layers == 152:
+ depth = [3, 8, 36, 3]
+ num_filters = [64, 128, 256, 512]
+
+ conv = self.conv_bn_layer(
+ input=input, num_filters=64, filter_size=7, stride=2, act='relu')
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+
+ for block in range(len(depth)):
+ for i in range(depth[block]):
+ conv = self.bottleneck_block(
+ input=conv,
+ num_filters=num_filters[block],
+ stride=2 if i == 0 and block != 0 else 1)
+
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=7, pool_type='avg', global_pooling=True)
+ stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
+ out = fluid.layers.fc(input=pool,
+ size=class_dim,
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv,
+ stdv),
+ regularizer=fluid.regularizer.L2Decay(self.weight_decay)),
+ bias_attr=fluid.ParamAttr(
+ regularizer=fluid.regularizer.L2Decay(self.weight_decay))
+ )
+ return out
+
+ def conv_bn_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None,
+ bn_init_value=1.0):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ act=None,
+ bias_attr=False,
+ param_attr=fluid.ParamAttr(regularizer=fluid.regularizer.L2Decay(self.weight_decay)))
+ return fluid.layers.batch_norm(
+ input=conv, act=act, is_test=not self.is_train,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Constant(bn_init_value),
+ regularizer=None))
+
+ def shortcut(self, input, ch_out, stride):
+ ch_in = input.shape[1]
+ if ch_in != ch_out or stride != 1:
+ return self.conv_bn_layer(input, ch_out, 1, stride)
+ else:
+ return input
+
+ def bottleneck_block(self, input, num_filters, stride):
+ conv0 = self.conv_bn_layer(
+ input=input, num_filters=num_filters, filter_size=1, act='relu')
+ conv1 = self.conv_bn_layer(
+ input=conv0,
+ num_filters=num_filters,
+ filter_size=3,
+ stride=stride,
+ act='relu')
+ # zero-init the last BN's scale (gamma) so the residual branch starts as
+ # identity; the BN bias (beta) already defaults to 0.0
+ conv2 = self.conv_bn_layer(
+ input=conv1, num_filters=num_filters * 4, filter_size=1, act=None, bn_init_value=0.0)
+
+ short = self.shortcut(input, num_filters * 4, stride)
+
+ return fluid.layers.elementwise_add(x=short, y=conv2, act='relu')
+
diff --git a/PaddleCV/image_classification/models/se_resnext.py b/PaddleCV/image_classification/models/se_resnext.py
new file mode 100644
index 0000000000000000000000000000000000000000..0ae3d66fddbe2d1b9da5e2f52fe80d15931d256d
--- /dev/null
+++ b/PaddleCV/image_classification/models/se_resnext.py
@@ -0,0 +1,246 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import math
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = [
+ "SE_ResNeXt", "SE_ResNeXt50_32x4d", "SE_ResNeXt101_32x4d",
+ "SE_ResNeXt152_32x4d"
+]
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "dropout_seed": None,
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [40, 80, 100],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class SE_ResNeXt():
+ def __init__(self, layers=50):
+ self.params = train_parameters
+ self.layers = layers
+
+ def net(self, input, class_dim=1000):
+ layers = self.layers
+ supported_layers = [50, 101, 152]
+ assert layers in supported_layers, \
+ "supported layers are {} but input layer is {}".format(supported_layers, layers)
+ if layers == 50:
+ cardinality = 32
+ reduction_ratio = 16
+ depth = [3, 4, 6, 3]
+ num_filters = [128, 256, 512, 1024]
+
+ conv = self.conv_bn_layer(
+ input=input,
+ num_filters=64,
+ filter_size=7,
+ stride=2,
+ act='relu',
+ name='conv1', )
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+ elif layers == 101:
+ cardinality = 32
+ reduction_ratio = 16
+ depth = [3, 4, 23, 3]
+ num_filters = [128, 256, 512, 1024]
+
+ conv = self.conv_bn_layer(
+ input=input,
+ num_filters=64,
+ filter_size=7,
+ stride=2,
+ act='relu',
+ name="conv1", )
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+ elif layers == 152:
+ cardinality = 64
+ reduction_ratio = 16
+ depth = [3, 8, 36, 3]
+ num_filters = [128, 256, 512, 1024]
+
+ conv = self.conv_bn_layer(
+ input=input,
+ num_filters=64,
+ filter_size=3,
+ stride=2,
+ act='relu',
+ name='conv1')
+ conv = self.conv_bn_layer(
+ input=conv,
+ num_filters=64,
+ filter_size=3,
+ stride=1,
+ act='relu',
+ name='conv2')
+ conv = self.conv_bn_layer(
+ input=conv,
+ num_filters=128,
+ filter_size=3,
+ stride=1,
+ act='relu',
+ name='conv3')
+ conv = fluid.layers.pool2d(
+ input=conv, pool_size=3, pool_stride=2, pool_padding=1, \
+ pool_type='max')
+ n = 1 if layers == 50 or layers == 101 else 3
+ for block in range(len(depth)):
+ n += 1
+ for i in range(depth[block]):
+ conv = self.bottleneck_block(
+ input=conv,
+ num_filters=num_filters[block],
+ stride=2 if i == 0 and block != 0 else 1,
+ cardinality=cardinality,
+ reduction_ratio=reduction_ratio,
+ name=str(n) + '_' + str(i + 1))
+
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=7, pool_type='avg', global_pooling=True)
+ drop = fluid.layers.dropout(
+ x=pool, dropout_prob=0.5, seed=self.params['dropout_seed'])
+ stdv = 1.0 / math.sqrt(drop.shape[1] * 1.0)
+ out = fluid.layers.fc(
+ input=drop,
+ size=class_dim,
+ param_attr=ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name='fc6_weights'),
+ bias_attr=ParamAttr(name='fc6_offset'))
+ return out
+
+ def shortcut(self, input, ch_out, stride, name):
+ ch_in = input.shape[1]
+ if ch_in != ch_out or stride != 1:
+ filter_size = 1
+ return self.conv_bn_layer(
+ input, ch_out, filter_size, stride, name='conv' + name + '_prj')
+ else:
+ return input
+
+ def bottleneck_block(self,
+ input,
+ num_filters,
+ stride,
+ cardinality,
+ reduction_ratio,
+ name=None):
+ conv0 = self.conv_bn_layer(
+ input=input,
+ num_filters=num_filters,
+ filter_size=1,
+ act='relu',
+ name='conv' + name + '_x1')
+ conv1 = self.conv_bn_layer(
+ input=conv0,
+ num_filters=num_filters,
+ filter_size=3,
+ stride=stride,
+ groups=cardinality,
+ act='relu',
+ name='conv' + name + '_x2')
+ conv2 = self.conv_bn_layer(
+ input=conv1,
+ num_filters=num_filters * 2,
+ filter_size=1,
+ act=None,
+ name='conv' + name + '_x3')
+ scale = self.squeeze_excitation(
+ input=conv2,
+ num_channels=num_filters * 2,
+ reduction_ratio=reduction_ratio,
+ name='fc' + name)
+
+ short = self.shortcut(input, num_filters * 2, stride, name=name)
+
+ return fluid.layers.elementwise_add(x=short, y=scale, act='relu')
+
+ def conv_bn_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None,
+ name=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ act=None,
+ bias_attr=False,
+ param_attr=ParamAttr(name=name + '_weights'), )
+ bn_name = name + "_bn"
+ return fluid.layers.batch_norm(
+ input=conv,
+ act=act,
+ param_attr=ParamAttr(name=bn_name + '_scale'),
+ bias_attr=ParamAttr(bn_name + '_offset'),
+ moving_mean_name=bn_name + '_mean',
+ moving_variance_name=bn_name + '_variance')
+
+ def squeeze_excitation(self,
+ input,
+ num_channels,
+ reduction_ratio,
+ name=None):
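+ # SE recalibration, as implemented below:
+ # s = sigmoid(W2 . relu(W1 . global_avg_pool(x))), output = x * s per channel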
+ pool = fluid.layers.pool2d(
+ input=input, pool_size=0, pool_type='avg', global_pooling=True)
+ stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
+ squeeze = fluid.layers.fc(
+ input=pool,
+ size=num_channels // reduction_ratio,
+ act='relu',
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=name + '_sqz_weights'),
+ bias_attr=ParamAttr(name=name + '_sqz_offset'))
+ stdv = 1.0 / math.sqrt(squeeze.shape[1] * 1.0)
+ excitation = fluid.layers.fc(
+ input=squeeze,
+ size=num_channels,
+ act='sigmoid',
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv, stdv),
+ name=name + '_exc_weights'),
+ bias_attr=ParamAttr(name=name + '_exc_offset'))
+ scale = fluid.layers.elementwise_mul(x=input, y=excitation, axis=0)
+ return scale
+
+
+def SE_ResNeXt50_32x4d():
+ model = SE_ResNeXt(layers=50)
+ return model
+
+
+def SE_ResNeXt101_32x4d():
+ model = SE_ResNeXt(layers=101)
+ return model
+
+
+def SE_ResNeXt152_32x4d():
+ model = SE_ResNeXt(layers=152)
+ return model
diff --git a/PaddleCV/image_classification/models/shufflenet_v2.py b/PaddleCV/image_classification/models/shufflenet_v2.py
new file mode 100644
index 0000000000000000000000000000000000000000..c0f3d0d6e08454d0d216e758ff5328ee4dee3151
--- /dev/null
+++ b/PaddleCV/image_classification/models/shufflenet_v2.py
@@ -0,0 +1,255 @@
+import paddle.fluid as fluid
+from paddle.fluid.initializer import MSRA
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = ['ShuffleNetV2', 'ShuffleNetV2_x0_5_swish', 'ShuffleNetV2_x1_0_swish', 'ShuffleNetV2_x1_5_swish',
+ 'ShuffleNetV2_x2_0_swish', 'ShuffleNetV2_x8_0_swish']
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class ShuffleNetV2():
+ def __init__(self, scale=1.0):
+ self.params = train_parameters
+ self.scale = scale
+
+ def net(self, input, class_dim=1000):
+ scale = self.scale
+ stage_repeats = [4, 8, 4]
+
+ if scale == 0.5:
+ stage_out_channels = [-1, 24, 48, 96, 192, 1024]
+ elif scale == 1.0:
+ stage_out_channels = [-1, 24, 116, 232, 464, 1024]
+ elif scale == 1.5:
+ stage_out_channels = [-1, 24, 176, 352, 704, 1024]
+ elif scale == 2.0:
+ stage_out_channels = [-1, 24, 224, 488, 976, 2048]
+ elif scale == 8.0:
+ stage_out_channels = [-1, 48, 896, 1952, 3904, 8192]
+ else:
+ raise ValueError(
+ "unsupported scale {}, expected one of "
+ "[0.5, 1.0, 1.5, 2.0, 8.0]".format(scale))
+
+ #conv1
+
+ input_channel = stage_out_channels[1]
+ conv1 = self.conv_bn_layer(input=input, filter_size=3, num_filters=input_channel, padding=1, stride=2,name='stage1_conv')
+ pool1 = fluid.layers.pool2d(input=conv1, pool_size=3, pool_stride=2, pool_padding=1, pool_type='max')
+ conv = pool1
+ # bottleneck sequences
+ for idxstage in range(len(stage_repeats)):
+ numrepeat = stage_repeats[idxstage]
+ output_channel = stage_out_channels[idxstage+2]
+ for i in range(numrepeat):
+ if i == 0:
+ conv = self.inverted_residual_unit(input=conv, num_filters=output_channel, stride=2,
+ benchmodel=2,name=str(idxstage+2)+'_'+str(i+1))
+ else:
+ conv = self.inverted_residual_unit(input=conv, num_filters=output_channel, stride=1,
+ benchmodel=1,name=str(idxstage+2)+'_'+str(i+1))
+
+ conv_last = self.conv_bn_layer(input=conv, filter_size=1, num_filters=stage_out_channels[-1],
+ padding=0, stride=1, name='conv5')
+ pool_last = fluid.layers.pool2d(input=conv_last, pool_size=7, pool_stride=1, pool_padding=0, pool_type='avg')
+
+
+ output = fluid.layers.fc(input=pool_last,
+ size=class_dim,
+ param_attr=ParamAttr(initializer=MSRA(),name='fc6_weights'),
+ bias_attr=ParamAttr(name='fc6_offset'))
+ return output
+
+
+ def conv_bn_layer(self,
+ input,
+ filter_size,
+ num_filters,
+ stride,
+ padding,
+ num_groups=1,
+ use_cudnn=True,
+ if_act=True,
+ name=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ groups=num_groups,
+ act=None,
+ use_cudnn=use_cudnn,
+ param_attr=ParamAttr(initializer=MSRA(),name=name+'_weights'),
+ bias_attr=False)
+ bn_name = name + '_bn'
+ if if_act:
+ return fluid.layers.batch_norm(input=conv, act='swish',
+ param_attr = ParamAttr(name=bn_name+"_scale"),
+ bias_attr=ParamAttr(name=bn_name+"_offset"),
+ moving_mean_name=bn_name + '_mean',
+ moving_variance_name=bn_name + '_variance')
+ else:
+ return fluid.layers.batch_norm(input=conv,
+ param_attr = ParamAttr(name=bn_name+"_scale"),
+ bias_attr=ParamAttr(name=bn_name+"_offset"),
+ moving_mean_name=bn_name + '_mean',
+ moving_variance_name=bn_name + '_variance')
+
+
+ def channel_shuffle(self, x, groups):
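+ # Interleaves channels across `groups` so the two branches exchange
+ # information after concat. Example: with groups=2, channel order
+ # [a0, a1, b0, b1] becomes [a0, b0, a1, b1].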
+ batchsize, num_channels, height, width = x.shape[0], x.shape[1], x.shape[2], x.shape[3]
+ channels_per_group = num_channels // groups
+
+ # reshape
+ x = fluid.layers.reshape(x=x, shape=[batchsize, groups, channels_per_group, height, width])
+
+ x = fluid.layers.transpose(x=x, perm=[0,2,1,3,4])
+
+ # flatten
+ x = fluid.layers.reshape(x=x, shape=[batchsize, num_channels, height, width])
+
+ return x
+
+
+ def inverted_residual_unit(self, input, num_filters, stride, benchmodel, name=None):
+ assert stride in [1, 2], \
+ "supported stride are {} but your stride is {}".format([1,2], stride)
+
+ oup_inc = num_filters//2
+ inp = input.shape[1]
+
+ if benchmodel == 1:
+ x1, x2 = fluid.layers.split(
+ input, num_or_sections=[input.shape[1]//2, input.shape[1]//2], dim=1)
+
+ conv_pw = self.conv_bn_layer(
+ input=x2,
+ num_filters=oup_inc,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=True,
+ name='stage_'+name+'_conv1')
+
+ conv_dw = self.conv_bn_layer(
+ input=conv_pw,
+ num_filters=oup_inc,
+ filter_size=3,
+ stride=stride,
+ padding=1,
+ num_groups=oup_inc,
+ if_act=False,
+ use_cudnn=False,
+ name='stage_'+name+'_conv2')
+
+ conv_linear = self.conv_bn_layer(
+ input=conv_dw,
+ num_filters=oup_inc,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=True,
+ name='stage_'+name+'_conv3')
+
+ out = fluid.layers.concat([x1, conv_linear], axis=1)
+
+
+ else:
+ #branch1
+ conv_dw_1 = self.conv_bn_layer(
+ input=input,
+ num_filters=inp,
+ filter_size=3,
+ stride=stride,
+ padding=1,
+ num_groups=inp,
+ if_act=False,
+ use_cudnn=False,
+ name='stage_'+name+'_conv4')
+
+ conv_linear_1 = self.conv_bn_layer(
+ input=conv_dw_1,
+ num_filters=oup_inc,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=True,
+ name='stage_'+name+'_conv5')
+
+ #branch2
+ conv_pw_2 = self.conv_bn_layer(
+ input=input,
+ num_filters=oup_inc,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=True,
+ name='stage_'+name+'_conv1')
+
+ conv_dw_2 = self.conv_bn_layer(
+ input=conv_pw_2,
+ num_filters=oup_inc,
+ filter_size=3,
+ stride=stride,
+ padding=1,
+ num_groups=oup_inc,
+ if_act=False,
+ use_cudnn=False,
+ name='stage_'+name+'_conv2')
+
+ conv_linear_2 = self.conv_bn_layer(
+ input=conv_dw_2,
+ num_filters=oup_inc,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ num_groups=1,
+ if_act=True,
+ name='stage_'+name+'_conv3')
+ out = fluid.layers.concat([conv_linear_1, conv_linear_2], axis=1)
+
+ return self.channel_shuffle(out, 2)
+
+def ShuffleNetV2_x0_5_swish():
+ model = ShuffleNetV2(scale=0.5)
+ return model
+
+def ShuffleNetV2_x1_0_swish():
+ model = ShuffleNetV2(scale=1.0)
+ return model
+
+def ShuffleNetV2_x1_5_swish():
+ model = ShuffleNetV2(scale=1.5)
+ return model
+
+def ShuffleNetV2_x2_0_swish():
+ model = ShuffleNetV2(scale=2.0)
+ return model
+
+def ShuffleNetV2_x8_0_swish():
+ model = ShuffleNetV2(scale=8.0)
+ return model
+
+
diff --git a/PaddleCV/image_classification/models/vgg.py b/PaddleCV/image_classification/models/vgg.py
new file mode 100644
index 0000000000000000000000000000000000000000..8fcd2d9f1c397a428685cfb7bd264f18c0d0a7e7
--- /dev/null
+++ b/PaddleCV/image_classification/models/vgg.py
@@ -0,0 +1,104 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+
+__all__ = ["VGGNet", "VGG11", "VGG13", "VGG16", "VGG19"]
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class VGGNet():
+ def __init__(self, layers=16):
+ self.params = train_parameters
+ self.layers = layers
+
+ def net(self, input, class_dim=1000):
+ layers = self.layers
+ vgg_spec = {
+ 11: ([1, 1, 2, 2, 2]),
+ 13: ([2, 2, 2, 2, 2]),
+ 16: ([2, 2, 3, 3, 3]),
+ 19: ([2, 2, 4, 4, 4])
+ }
+ assert layers in vgg_spec.keys(), \
+ "supported layers are {} but input layer is {}".format(vgg_spec.keys(), layers)
+
+ nums = vgg_spec[layers]
+ conv1 = self.conv_block(input, 64, nums[0], name="conv1_")
+ conv2 = self.conv_block(conv1, 128, nums[1], name="conv2_")
+ conv3 = self.conv_block(conv2, 256, nums[2], name="conv3_")
+ conv4 = self.conv_block(conv3, 512, nums[3], name="conv4_")
+ conv5 = self.conv_block(conv4, 512, nums[4], name="conv5_")
+
+ fc_dim = 4096
+ fc_name = ["fc6", "fc7", "fc8"]
+ fc1 = fluid.layers.fc(
+ input=conv5,
+ size=fc_dim,
+ act='relu',
+ param_attr=fluid.param_attr.ParamAttr(name=fc_name[0] + "_weights"),
+ bias_attr=fluid.param_attr.ParamAttr(name=fc_name[0] + "_offset"))
+ fc1 = fluid.layers.dropout(x=fc1, dropout_prob=0.5)
+ fc2 = fluid.layers.fc(
+ input=fc1,
+ size=fc_dim,
+ act='relu',
+ param_attr=fluid.param_attr.ParamAttr(name=fc_name[1] + "_weights"),
+ bias_attr=fluid.param_attr.ParamAttr(name=fc_name[1] + "_offset"))
+ fc2 = fluid.layers.dropout(x=fc2, dropout_prob=0.5)
+ out = fluid.layers.fc(
+ input=fc2,
+ size=class_dim,
+ param_attr=fluid.param_attr.ParamAttr(name=fc_name[2] + "_weights"),
+ bias_attr=fluid.param_attr.ParamAttr(name=fc_name[2] + "_offset"))
+
+ return out
+
+ def conv_block(self, input, num_filter, groups, name=None):
+ conv = input
+ for i in range(groups):
+ conv = fluid.layers.conv2d(
+ input=conv,
+ num_filters=num_filter,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ act='relu',
+ param_attr=fluid.param_attr.ParamAttr(
+ name=name + str(i + 1) + "_weights"),
+ bias_attr=fluid.param_attr.ParamAttr(
+ name=name + str(i + 1) + "_offset"))
+ return fluid.layers.pool2d(
+ input=conv, pool_size=2, pool_type='max', pool_stride=2)
+
+
+def VGG11():
+ model = VGGNet(layers=11)
+ return model
+
+
+def VGG13():
+ model = VGGNet(layers=13)
+ return model
+
+
+def VGG16():
+ model = VGGNet(layers=16)
+ return model
+
+
+def VGG19():
+ model = VGGNet(layers=19)
+ return model
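+# Usage sketch: the last fc returns raw scores, so apply a softmax loss
+# externally, e.g.
+#   logits = VGG16().net(image, class_dim=1000)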
diff --git a/PaddleCV/image_classification/reader.py b/PaddleCV/image_classification/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..d8aa9da49b9e0caf28f72261965814c5cdca914d
--- /dev/null
+++ b/PaddleCV/image_classification/reader.py
@@ -0,0 +1,194 @@
+import os
+import math
+import random
+import functools
+import numpy as np
+import paddle
+from PIL import Image, ImageEnhance
+
+random.seed(0)
+np.random.seed(0)
+
+DATA_DIM = 224
+
+THREAD = 8
+BUF_SIZE = 102400
+
+DATA_DIR = 'data/ILSVRC2012'
+
+img_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))
+img_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))
+
+
+def resize_short(img, target_size):
+ percent = float(target_size) / min(img.size[0], img.size[1])
+ resized_width = int(round(img.size[0] * percent))
+ resized_height = int(round(img.size[1] * percent))
+ img = img.resize((resized_width, resized_height), Image.LANCZOS)
+ return img
+
+
+def crop_image(img, target_size, center):
+ width, height = img.size
+ size = target_size
+ if center:
+ w_start = (width - size) // 2
+ h_start = (height - size) // 2
+ else:
+ w_start = np.random.randint(0, width - size + 1)
+ h_start = np.random.randint(0, height - size + 1)
+ w_end = w_start + size
+ h_end = h_start + size
+ img = img.crop((w_start, h_start, w_end, h_end))
+ return img
+
+
+def random_crop(img, size, scale=[0.08, 1.0], ratio=[3. / 4., 4. / 3.]):
+ aspect_ratio = math.sqrt(np.random.uniform(*ratio))
+ w = 1. * aspect_ratio
+ h = 1. / aspect_ratio
+
+ bound = min((float(img.size[0]) / img.size[1]) / (w**2),
+ (float(img.size[1]) / img.size[0]) / (h**2))
+ scale_max = min(scale[1], bound)
+ scale_min = min(scale[0], bound)
+
+ target_area = img.size[0] * img.size[1] * np.random.uniform(scale_min,
+ scale_max)
+ target_size = math.sqrt(target_area)
+ w = int(target_size * w)
+ h = int(target_size * h)
+
+ i = np.random.randint(0, img.size[0] - w + 1)
+ j = np.random.randint(0, img.size[1] - h + 1)
+
+ img = img.crop((i, j, i + w, j + h))
+ img = img.resize((size, size), Image.LANCZOS)
+ return img
+
+
+def rotate_image(img):
+ angle = np.random.randint(-10, 11)
+ img = img.rotate(angle)
+ return img
+
+
+def distort_color(img):
+ def random_brightness(img, lower=0.5, upper=1.5):
+ e = np.random.uniform(lower, upper)
+ return ImageEnhance.Brightness(img).enhance(e)
+
+ def random_contrast(img, lower=0.5, upper=1.5):
+ e = np.random.uniform(lower, upper)
+ return ImageEnhance.Contrast(img).enhance(e)
+
+ def random_color(img, lower=0.5, upper=1.5):
+ e = np.random.uniform(lower, upper)
+ return ImageEnhance.Color(img).enhance(e)
+
+ ops = [random_brightness, random_contrast, random_color]
+ np.random.shuffle(ops)
+
+ img = ops[0](img)
+ img = ops[1](img)
+ img = ops[2](img)
+
+ return img
+
+
+def process_image(sample, mode, color_jitter, rotate):
+ img_path = sample[0]
+
+ img = Image.open(img_path)
+ if mode == 'train':
+ if rotate: img = rotate_image(img)
+ img = random_crop(img, DATA_DIM)
+ else:
+ img = resize_short(img, target_size=256)
+ img = crop_image(img, target_size=DATA_DIM, center=True)
+ if mode == 'train':
+ if color_jitter:
+ img = distort_color(img)
+ if np.random.randint(0, 2) == 1:
+ img = img.transpose(Image.FLIP_LEFT_RIGHT)
+
+ if img.mode != 'RGB':
+ img = img.convert('RGB')
+
+ img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255
+ img -= img_mean
+ img /= img_std
+
+ if mode == 'train' or mode == 'val':
+ return img, sample[1]
+ elif mode == 'test':
+ return [img]
+
+
+def _reader_creator(file_list,
+ mode,
+ shuffle=False,
+ color_jitter=False,
+ rotate=False,
+ data_dir=DATA_DIR,
+ pass_id_as_seed=0):
+ def reader():
+ with open(file_list) as flist:
+ full_lines = [line.strip() for line in flist]
+ if shuffle:
+ if pass_id_as_seed:
+ np.random.seed(pass_id_as_seed)
+ np.random.shuffle(full_lines)
+ if mode == 'train' and os.getenv('PADDLE_TRAINING_ROLE'):
+ # distributed mode if the env var `PADDLE_TRAINING_ROLE` exists
+ trainer_id = int(os.getenv("PADDLE_TRAINER_ID", "0"))
+ trainer_count = int(os.getenv("PADDLE_TRAINERS_NUM", "1"))
+ per_node_lines = len(full_lines) // trainer_count
+ lines = full_lines[trainer_id * per_node_lines:(trainer_id + 1)
+ * per_node_lines]
+ print(
+ "read images from %d, length: %d, lines length: %d, total: %d"
+ % (trainer_id * per_node_lines, per_node_lines, len(lines),
+ len(full_lines)))
+ else:
+ lines = full_lines
+
+ for line in lines:
+ if mode == 'train' or mode == 'val':
+ img_path, label = line.split()
+ img_path = os.path.join(data_dir, img_path)
+ yield img_path, int(label)
+ elif mode == 'test':
+ img_path, label = line.split()
+ img_path = os.path.join(data_dir, img_path)
+
+ yield [img_path]
+
+ mapper = functools.partial(
+ process_image, mode=mode, color_jitter=color_jitter, rotate=rotate)
+
+ return paddle.reader.xmap_readers(mapper, reader, THREAD, BUF_SIZE)
+
+
+def train(data_dir=DATA_DIR, pass_id_as_seed=0):
+ file_list = os.path.join(data_dir, 'train_list.txt')
+ return _reader_creator(
+ file_list,
+ 'train',
+ shuffle=True,
+ color_jitter=False,
+ rotate=False,
+ data_dir=data_dir,
+ pass_id_as_seed=pass_id_as_seed)
+
+
+def val(data_dir=DATA_DIR):
+ file_list = os.path.join(data_dir, 'val_list.txt')
+ return _reader_creator(file_list, 'val', shuffle=False,
+ data_dir=data_dir)
+
+
+def test(data_dir=DATA_DIR):
+ file_list = os.path.join(data_dir, 'val_list.txt')
+ return _reader_creator(file_list, 'test', shuffle=False,
+ data_dir=data_dir)
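+# Usage sketch (batch size illustrative; each sample is a (3, 224, 224)
+# float32 CHW image plus an int label):
+#   train_reader = paddle.batch(train(data_dir=DATA_DIR), batch_size=256)
+#   for batch in train_reader():
+#       pass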
diff --git a/PaddleCV/image_classification/reader_cv2.py b/PaddleCV/image_classification/reader_cv2.py
new file mode 100644
index 0000000000000000000000000000000000000000..7be5baa8014562d44371372318e4c8c81303c5fe
--- /dev/null
+++ b/PaddleCV/image_classification/reader_cv2.py
@@ -0,0 +1,211 @@
+import os
+import math
+import random
+import functools
+import numpy as np
+import paddle
+import cv2
+import io
+
+random.seed(0)
+np.random.seed(0)
+
+DATA_DIM = 224
+
+THREAD = 8
+BUF_SIZE = 102400
+
+DATA_DIR = 'data/ILSVRC2012'
+img_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))
+img_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))
+
+
+def rotate_image(img):
+ """ rotate_image """
+ (h, w) = img.shape[:2]
+ center = (w / 2, h / 2)
+ angle = np.random.randint(-10, 11)
+ M = cv2.getRotationMatrix2D(center, angle, 1.0)
+ rotated = cv2.warpAffine(img, M, (w, h))
+ return rotated
+
+
+def random_crop(img, size, scale=None, ratio=None):
+ """ random_crop """
+ scale = [0.08, 1.0] if scale is None else scale
+ ratio = [3. / 4., 4. / 3.] if ratio is None else ratio
+
+ aspect_ratio = math.sqrt(np.random.uniform(*ratio))
+ w = 1. * aspect_ratio
+ h = 1. / aspect_ratio
+
+
+ bound = min((float(img.shape[0]) / img.shape[1]) / (w**2),
+ (float(img.shape[1]) / img.shape[0]) / (h**2))
+ scale_max = min(scale[1], bound)
+ scale_min = min(scale[0], bound)
+
+ target_area = img.shape[0] * img.shape[1] * np.random.uniform(scale_min,
+ scale_max)
+ target_size = math.sqrt(target_area)
+ w = int(target_size * w)
+ h = int(target_size * h)
+    i = np.random.randint(0, img.shape[0] - w + 1)
+    j = np.random.randint(0, img.shape[1] - h + 1)
+
+    # note: `w` spans axis 0 (rows) and `h` axis 1 (cols); the names are
+    # swapped relative to convention but are used consistently throughout.
+    img = img[i:i + w, j:j + h, :]
+
+ resized = cv2.resize(img, (size, size), interpolation=cv2.INTER_LANCZOS4)
+ return resized
+
+def distort_color(img):
+    # color jitter is currently a no-op placeholder: images pass through
+    # unchanged even when color_jitter is enabled.
+    return img
+
+
+def resize_short(img, target_size):
+ """ resize_short """
+ percent = float(target_size) / min(img.shape[0], img.shape[1])
+ resized_width = int(round(img.shape[1] * percent))
+ resized_height = int(round(img.shape[0] * percent))
+ resized = cv2.resize(img, (resized_width, resized_height), interpolation=cv2.INTER_LANCZOS4)
+ return resized
+
+
+def crop_image(img, target_size, center):
+ """ crop_image """
+ height, width = img.shape[:2]
+ size = target_size
+    if center:
+        # integer division keeps the slice bounds ints under Python 3
+        w_start = (width - size) // 2
+        h_start = (height - size) // 2
+ else:
+ w_start = np.random.randint(0, width - size + 1)
+ h_start = np.random.randint(0, height - size + 1)
+ w_end = w_start + size
+ h_end = h_start + size
+ img = img[h_start:h_end, w_start:w_end, :]
+ return img
+
+
+def process_image(sample,
+ mode,
+ color_jitter,
+ rotate,
+ crop_size=224,
+ mean=None,
+ std=None):
+ """ process_image """
+
+ mean = [0.485, 0.456, 0.406] if mean is None else mean
+ std = [0.229, 0.224, 0.225] if std is None else std
+
+ img_path = sample[0]
+ img = cv2.imread(img_path)
+
+ if mode == 'train':
+ if rotate:
+ img = rotate_image(img)
+ if crop_size > 0:
+ img = random_crop(img, crop_size)
+ if color_jitter:
+ img = distort_color(img)
+ if np.random.randint(0, 2) == 1:
+ img = img[:, ::-1, :]
+ else:
+ if crop_size > 0:
+ img = resize_short(img, crop_size)
+
+ img = crop_image(img, target_size=crop_size, center=True)
+
+    # BGR (cv2 default) -> RGB, then HWC -> CHW, scaled to [0, 1]
+    img = img[:, :, ::-1].astype('float32').transpose((2, 0, 1)) / 255
+ img_mean = np.array(mean).reshape((3, 1, 1))
+ img_std = np.array(std).reshape((3, 1, 1))
+ img -= img_mean
+ img /= img_std
+
+ if mode == 'train' or mode == 'val':
+ return (img, sample[1])
+ elif mode == 'test':
+ return (img, )
+
+
+def image_mapper(**kwargs):
+ """ image_mapper """
+ return functools.partial(process_image, **kwargs)
+
+
+def _reader_creator(file_list,
+ mode,
+ shuffle=False,
+ color_jitter=False,
+ rotate=False,
+ data_dir=DATA_DIR,
+ pass_id_as_seed=0):
+ def reader():
+ with open(file_list) as flist:
+ full_lines = [line.strip() for line in flist]
+ if shuffle:
+ if pass_id_as_seed:
+ np.random.seed(pass_id_as_seed)
+ np.random.shuffle(full_lines)
+ if mode == 'train' and os.getenv('PADDLE_TRAINING_ROLE'):
+            # distributed mode if the env var `PADDLE_TRAINING_ROLE` exists
+ trainer_id = int(os.getenv("PADDLE_TRAINER_ID", "0"))
+ trainer_count = int(os.getenv("PADDLE_TRAINERS_NUM", "1"))
+ per_node_lines = len(full_lines) // trainer_count
+ lines = full_lines[trainer_id * per_node_lines:(trainer_id + 1)
+ * per_node_lines]
+ print(
+ "read images from %d, length: %d, lines length: %d, total: %d"
+ % (trainer_id * per_node_lines, per_node_lines, len(lines),
+ len(full_lines)))
+ else:
+ lines = full_lines
+
+ for line in lines:
+ if mode == 'train' or mode == 'val':
+ img_path, label = line.split()
+ img_path = img_path.replace("JPEG", "jpeg")
+ img_path = os.path.join(data_dir, img_path)
+ yield img_path, int(label)
+ elif mode == 'test':
+ img_path, label = line.split()
+ img_path = img_path.replace("JPEG", "jpeg")
+ img_path = os.path.join(data_dir, img_path)
+
+ yield [img_path]
+
+    # avoid shadowing the module-level image_mapper helper
+    mapper = functools.partial(
+        process_image,
+        mode=mode,
+        color_jitter=color_jitter,
+        rotate=rotate,
+        crop_size=224)
+    return paddle.reader.xmap_readers(mapper, reader, THREAD, BUF_SIZE,
+                                      order=False)
+
+
+def train(data_dir=DATA_DIR, pass_id_as_seed=0):
+    file_list = os.path.join(data_dir, 'train_list.txt')
+ return _reader_creator(
+ file_list,
+ 'train',
+ shuffle=True,
+ color_jitter=False,
+ rotate=False,
+ data_dir=data_dir,
+ pass_id_as_seed=pass_id_as_seed)
+
+
+def val(data_dir=DATA_DIR):
+ file_list = os.path.join(data_dir, 'val_list.txt')
+ return _reader_creator(file_list, 'val', shuffle=False,
+ data_dir=data_dir)
+
+
+def test(data_dir=DATA_DIR):
+ file_list = os.path.join(data_dir, 'val_list.txt')
+ return _reader_creator(file_list, 'test', shuffle=False,
+ data_dir=data_dir)
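+
+
+# Editor's sketch of the trainer-sharding rule used in _reader_creator
+# above: each trainer reads a contiguous, equally sized slice of the file
+# list, selected via the PADDLE_TRAINER_ID / PADDLE_TRAINERS_NUM env vars.
+# The helper below is illustrative only and not part of the original module.
+def _shard(lines, trainer_id, trainer_count):
+    per_node_lines = len(lines) // trainer_count
+    start = trainer_id * per_node_lines
+    # remainder lines at the tail are dropped, exactly as in the reader
+    return lines[start:start + per_node_lines]
+
+
+if __name__ == '__main__':
+    # 10 samples over 3 trainers -> 3 samples each; sample 9 is dropped
+    print([_shard(list(range(10)), i, 3) for i in range(3)])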
diff --git a/PaddleCV/image_classification/run.sh b/PaddleCV/image_classification/run.sh
new file mode 100755
index 0000000000000000000000000000000000000000..cc516a677771c2c22bdf702d0ae77916a1bd8f06
--- /dev/null
+++ b/PaddleCV/image_classification/run.sh
@@ -0,0 +1,254 @@
+# Hyperparameter configuration
+# Example: SE_ResNeXt50_32x4d
+python train.py \
+ --model=SE_ResNeXt50_32x4d \
+ --batch_size=400 \
+ --total_images=1281167 \
+ --class_dim=1000 \
+ --image_shape=3,224,224 \
+ --model_save_dir=output/ \
+ --with_mem_opt=True \
+ --lr_strategy=cosine_decay \
+ --lr=0.1 \
+ --num_epochs=200 \
+       --l2_decay=1.2e-4
+# >log_SE_ResNeXt50_32x4d.txt 2>&1 &
+#AlexNet:
+#python train.py \
+# --model=AlexNet \
+# --batch_size=256 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --model_save_dir=output/ \
+# --with_mem_opt=True \
+# --lr_strategy=piecewise_decay \
+# --num_epochs=120 \
+# --lr=0.01 \
+# --l2_decay=1e-4
+
+#MobileNet v1:
+#python train.py \
+# --model=MobileNet \
+# --batch_size=256 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --model_save_dir=output/ \
+# --with_mem_opt=True \
+# --lr_strategy=piecewise_decay \
+# --num_epochs=120 \
+# --lr=0.1 \
+# --l2_decay=3e-5
+
+#python train.py \
+# --model=MobileNetV2 \
+# --batch_size=500 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --model_save_dir=output/ \
+# --with_mem_opt=True \
+# --lr_strategy=cosine_decay \
+# --num_epochs=240 \
+# --lr=0.1 \
+# --l2_decay=4e-5
+#ResNet18:
+#python train.py \
+# --model=ResNet18 \
+# --batch_size=256 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --model_save_dir=output/ \
+# --with_mem_opt=True \
+# --lr_strategy=cosine_decay \
+# --lr=0.1 \
+# --num_epochs=120 \
+# --l2_decay=1e-4
+#ResNet34:
+#python train.py \
+# --model=ResNet34 \
+# --batch_size=256 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --model_save_dir=output/ \
+# --with_mem_opt=True \
+# --lr_strategy=cosine_decay \
+# --lr=0.1 \
+# --num_epochs=120 \
+# --l2_decay=1e-4
+#ShuffleNetV2:
+#python train.py \
+# --model=ShuffleNetV2 \
+# --batch_size=1024 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --model_save_dir=output/ \
+# --with_mem_opt=True \
+# --lr_strategy=cosine_decay_with_warmup \
+# --lr=0.5 \
+# --num_epochs=240 \
+# --l2_decay=4e-5
+#GoogleNet:
+#python train.py \
+# --model=GoogleNet \
+# --batch_size=256 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --model_save_dir=output/ \
+# --with_mem_opt=True \
+# --lr_strategy=cosine_decay \
+# --lr=0.01 \
+# --num_epochs=200 \
+# --l2_decay=1e-4
+#ResNet50:
+#python train.py \
+# --model=ResNet50 \
+# --batch_size=256 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --model_save_dir=output/ \
+# --with_mem_opt=True \
+# --lr_strategy=piecewise_decay \
+# --num_epochs=120 \
+# --lr=0.1 \
+# --l2_decay=1e-4
+
+#ResNet101:
+#python train.py \
+# --model=ResNet101 \
+# --batch_size=256 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --model_save_dir=output/ \
+# --with_mem_opt=True \
+# --lr_strategy=piecewise_decay \
+# --num_epochs=120 \
+# --lr=0.1 \
+# --l2_decay=1e-4
+
+#ResNet152:
+#python train.py \
+# --model=ResNet152 \
+# --batch_size=256 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --model_save_dir=output/ \
+# --lr_strategy=piecewise_decay \
+# --with_mem_opt=True \
+# --lr=0.1 \
+# --num_epochs=120 \
+# --l2_decay=1e-4
+
+
+#SE_ResNeXt50_32x4d:
+#python train.py \
+# --model=SE_ResNeXt50_32x4d \
+# --batch_size=400 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --lr_strategy=cosine_decay \
+# --model_save_dir=output/ \
+# --lr=0.1 \
+# --num_epochs=200 \
+# --with_mem_opt=True \
+# --l2_decay=1.2e-4
+
+#SE_ResNeXt101_32x4d:
+#python train.py \
+# --model=SE_ResNeXt101_32x4d \
+# --batch_size=400 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --lr_strategy=cosine_decay \
+# --model_save_dir=output/ \
+# --lr=0.1 \
+# --num_epochs=200 \
+# --with_mem_opt=True \
+# --l2_decay=1.5e-5
+
+#VGG11:
+#python train.py \
+# --model=VGG11 \
+# --batch_size=512 \
+# --total_images=1281167 \
+# --image_shape=3,224,224 \
+# --lr_strategy=cosine_decay \
+# --class_dim=1000 \
+# --model_save_dir=output/ \
+# --lr=0.1 \
+# --num_epochs=90 \
+# --with_mem_opt=True \
+# --l2_decay=2e-4
+
+#VGG13:
+#python train.py \
+# --model=VGG13 \
+# --batch_size=256 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --lr_strategy=cosine_decay \
+# --lr=0.01 \
+# --num_epochs=90 \
+# --model_save_dir=output/ \
+# --with_mem_opt=True \
+# --l2_decay=3e-4
+
+#VGG16:
+#python train.py \
+# --model=VGG16 \
+# --batch_size=256 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --lr_strategy=cosine_decay \
+# --image_shape=3,224,224 \
+# --model_save_dir=output/ \
+# --lr=0.01 \
+# --num_epochs=90 \
+# --with_mem_opt=True \
+# --l2_decay=3e-4
+
+#VGG19:
+#python train.py \
+# --model=VGG19 \
+# --batch_size=256 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --lr_strategy=cosine_decay \
+# --lr=0.01 \
+# --num_epochs=90 \
+# --with_mem_opt=True \
+# --model_save_dir=output/ \
+# --l2_decay=3e-4
+
+#ResNet50 nGraph:
+# Training:
+#OMP_NUM_THREADS=`nproc` FLAGS_use_ngraph=true python train.py \
+# --model=ResNet50 \
+# --batch_size=128 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --lr=0.001 \
+# --num_epochs=120 \
+# --with_mem_opt=False \
+# --model_save_dir=output/ \
+# --lr_strategy=adam \
+# --use_gpu=False
+# Inference:
+#OMP_NUM_THREADS=`nproc` FLAGS_use_ngraph=true python infer.py \
+# --use_gpu=false \
+# --model=ResNet50 \
+# --pretrained_model=ResNet50_pretrained
+
diff --git a/PaddleCV/image_classification/train.py b/PaddleCV/image_classification/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..7707bf4a723f8cba0f406acb8c4b280643c0ddb7
--- /dev/null
+++ b/PaddleCV/image_classification/train.py
@@ -0,0 +1,461 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import numpy as np
+import time
+import sys
+import functools
+import math
+import paddle
+import paddle.fluid as fluid
+import paddle.dataset.flowers as flowers
+import reader
+import argparse
+import subprocess
+import utils
+import models
+from utils.fp16_utils import create_master_params_grads, master_param_to_train_param
+from utils.utility import add_arguments, print_arguments
+from utils.learning_rate import cosine_decay_with_warmup
+
+IMAGENET1000 = 1281167
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('batch_size', int, 256, "Minibatch size.")
+add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+add_arg('total_images', int, 1281167, "Training image number.")
+add_arg('num_epochs', int, 120, "number of epochs.")
+add_arg('class_dim', int, 1000, "Class number.")
+add_arg('image_shape', str, "3,224,224", "input image size")
+add_arg('model_save_dir', str, "output", "model save directory")
+add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
+add_arg('pretrained_model', str, None, "Path to a pretrained model.")
+add_arg('checkpoint', str, None, "Path to a checkpoint to resume from.")
+add_arg('lr', float, 0.1, "set learning rate.")
+add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
+add_arg('model', str, "SE_ResNeXt50_32x4d", "Set the network to use.")
+add_arg('enable_ce', bool, False, "If set True, enable continuous evaluation job.")
+add_arg('data_dir', str, "./data/ILSVRC2012", "The ImageNet dataset root dir.")
+add_arg('fp16', bool, False, "Enable half precision training with fp16." )
+add_arg('scale_loss', float, 1.0, "Scale loss for fp16." )
+add_arg('l2_decay', float, 1e-4, "L2_decay parameter.")
+add_arg('momentum_rate', float, 0.9, "momentum_rate.")
+# yapf: enable
+
+def optimizer_setting(params):
+ ls = params["learning_strategy"]
+ l2_decay = params["l2_decay"]
+ momentum_rate = params["momentum_rate"]
+ if ls["name"] == "piecewise_decay":
+ if "total_images" not in params:
+ total_images = IMAGENET1000
+ else:
+ total_images = params["total_images"]
+ batch_size = ls["batch_size"]
+ step = int(math.ceil(float(total_images) / batch_size))
+ bd = [step * e for e in ls["epochs"]]
+ base_lr = params["lr"]
+        lr = [base_lr * (0.1**i) for i in range(len(bd) + 1)]
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=fluid.layers.piecewise_decay(
+ boundaries=bd, values=lr),
+ momentum=momentum_rate,
+ regularization=fluid.regularizer.L2Decay(l2_decay))
+
+ elif ls["name"] == "cosine_decay":
+ if "total_images" not in params:
+ total_images = IMAGENET1000
+ else:
+ total_images = params["total_images"]
+ batch_size = ls["batch_size"]
+ l2_decay = params["l2_decay"]
+ momentum_rate = params["momentum_rate"]
+ step = int(math.ceil(float(total_images) / batch_size))
+ lr = params["lr"]
+ num_epochs = params["num_epochs"]
+
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=fluid.layers.cosine_decay(
+ learning_rate=lr, step_each_epoch=step, epochs=num_epochs),
+ momentum=momentum_rate,
+ regularization=fluid.regularizer.L2Decay(l2_decay))
+
+ elif ls["name"] == "cosine_warmup_decay":
+ if "total_images" not in params:
+ total_images = IMAGENET1000
+ else:
+ total_images = params["total_images"]
+ batch_size = ls["batch_size"]
+ l2_decay = params["l2_decay"]
+ momentum_rate = params["momentum_rate"]
+ step = int(math.ceil(float(total_images) / batch_size))
+ lr = params["lr"]
+ num_epochs = params["num_epochs"]
+
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=cosine_decay_with_warmup(
+ learning_rate=lr, step_each_epoch=step, epochs=num_epochs),
+ momentum=momentum_rate,
+ regularization=fluid.regularizer.L2Decay(l2_decay))
+
+ elif ls["name"] == "linear_decay":
+ if "total_images" not in params:
+ total_images = IMAGENET1000
+ else:
+ total_images = params["total_images"]
+ batch_size = ls["batch_size"]
+ num_epochs = params["num_epochs"]
+ start_lr = params["lr"]
+ l2_decay = params["l2_decay"]
+ momentum_rate = params["momentum_rate"]
+ end_lr = 0
+ total_step = int((total_images / batch_size) * num_epochs)
+ lr = fluid.layers.polynomial_decay(
+ start_lr, total_step, end_lr, power=1)
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=lr,
+ momentum=momentum_rate,
+ regularization=fluid.regularizer.L2Decay(l2_decay))
+ elif ls["name"] == "adam":
+ lr = params["lr"]
+ optimizer = fluid.optimizer.Adam(learning_rate=lr)
+ elif ls["name"] == "rmsprop_cosine":
+ if "total_images" not in params:
+ total_images = IMAGENET1000
+ else:
+ total_images = params["total_images"]
+ batch_size = ls["batch_size"]
+ l2_decay = params["l2_decay"]
+ momentum_rate = params["momentum_rate"]
+ step = int(math.ceil(float(total_images) / batch_size))
+ lr = params["lr"]
+ num_epochs = params["num_epochs"]
+ optimizer = fluid.optimizer.RMSProp(
+ learning_rate=fluid.layers.cosine_decay(
+ learning_rate=lr, step_each_epoch=step, epochs=num_epochs),
+ momentum=momentum_rate,
+ regularization=fluid.regularizer.L2Decay(l2_decay),
+ # RMSProp Optimizer: Apply epsilon=1 on ImageNet.
+ epsilon=1
+ )
+ else:
+ lr = params["lr"]
+ l2_decay = params["l2_decay"]
+ momentum_rate = params["momentum_rate"]
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=lr,
+ momentum=momentum_rate,
+ regularization=fluid.regularizer.L2Decay(l2_decay))
+
+ return optimizer
+
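+# Editor's worked example of the piecewise_decay branch above, assuming the
+# per-model epoch boundaries are [30, 60, 90] (they come from each model's
+# params, which are not shown here): with total_images=1281167 and
+# batch_size=256, step = ceil(1281167 / 256) = 5005, so the step boundaries
+# are bd = [150150, 300300, 450450] and, with base_lr=0.1, the lr values
+# are [0.1, 0.01, 0.001, 0.0001].
+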
+def net_config(image, label, model, args):
+ model_list = [m for m in dir(models) if "__" not in m]
+    assert args.model in model_list, "{} is not in the model list: {}".format(
+        args.model, model_list)
+
+ class_dim = args.class_dim
+ model_name = args.model
+
+ if args.enable_ce:
+ assert model_name == "SE_ResNeXt50_32x4d"
+ model.params["dropout_seed"] = 100
+ class_dim = 102
+
+ if model_name == "GoogleNet":
+ out0, out1, out2 = model.net(input=image, class_dim=class_dim)
+ cost0 = fluid.layers.cross_entropy(input=out0, label=label)
+ cost1 = fluid.layers.cross_entropy(input=out1, label=label)
+ cost2 = fluid.layers.cross_entropy(input=out2, label=label)
+ avg_cost0 = fluid.layers.mean(x=cost0)
+ avg_cost1 = fluid.layers.mean(x=cost1)
+ avg_cost2 = fluid.layers.mean(x=cost2)
+
+ avg_cost = avg_cost0 + 0.3 * avg_cost1 + 0.3 * avg_cost2
+ acc_top1 = fluid.layers.accuracy(input=out0, label=label, k=1)
+ acc_top5 = fluid.layers.accuracy(input=out0, label=label, k=5)
+ else:
+ out = model.net(input=image, class_dim=class_dim)
+ cost, pred = fluid.layers.softmax_with_cross_entropy(
+ out, label, return_softmax=True)
+ if args.scale_loss > 1:
+ avg_cost = fluid.layers.mean(x=cost) * float(args.scale_loss)
+ else:
+ avg_cost = fluid.layers.mean(x=cost)
+
+ acc_top1 = fluid.layers.accuracy(input=pred, label=label, k=1)
+ acc_top5 = fluid.layers.accuracy(input=pred, label=label, k=5)
+
+ return avg_cost, acc_top1, acc_top5
+
+
+def build_program(is_train, main_prog, startup_prog, args):
+ image_shape = [int(m) for m in args.image_shape.split(",")]
+ model_name = args.model
+ model_list = [m for m in dir(models) if "__" not in m]
+    assert model_name in model_list, "{} is not in the model list: {}".format(
+        args.model, model_list)
+ model = models.__dict__[model_name]()
+ with fluid.program_guard(main_prog, startup_prog):
+ py_reader = fluid.layers.py_reader(
+ capacity=16,
+ shapes=[[-1] + image_shape, [-1, 1]],
+ lod_levels=[0, 0],
+ dtypes=["float32", "int64"],
+ use_double_buffer=True)
+ with fluid.unique_name.guard():
+ image, label = fluid.layers.read_file(py_reader)
+ if args.fp16:
+ image = fluid.layers.cast(image, "float16")
+ avg_cost, acc_top1, acc_top5 = net_config(image, label, model, args)
+ avg_cost.persistable = True
+ acc_top1.persistable = True
+ acc_top5.persistable = True
+ if is_train:
+ params = model.params
+ params["total_images"] = args.total_images
+ params["lr"] = args.lr
+ params["num_epochs"] = args.num_epochs
+ params["learning_strategy"]["batch_size"] = args.batch_size
+ params["learning_strategy"]["name"] = args.lr_strategy
+ params["l2_decay"] = args.l2_decay
+ params["momentum_rate"] = args.momentum_rate
+
+ optimizer = optimizer_setting(params)
+ if args.fp16:
+ params_grads = optimizer.backward(avg_cost)
+ master_params_grads = create_master_params_grads(
+ params_grads, main_prog, startup_prog, args.scale_loss)
+ optimizer.apply_gradients(master_params_grads)
+ master_param_to_train_param(master_params_grads,
+ params_grads, main_prog)
+ else:
+ optimizer.minimize(avg_cost)
+ global_lr = optimizer._global_learning_rate()
+
+ if is_train:
+ return py_reader, avg_cost, acc_top1, acc_top5, global_lr
+ else:
+ return py_reader, avg_cost, acc_top1, acc_top5
+
+def get_device_num():
+ visible_device = os.getenv('CUDA_VISIBLE_DEVICES')
+ if visible_device:
+ device_num = len(visible_device.split(','))
+ else:
+ device_num = subprocess.check_output(['nvidia-smi','-L']).decode().count('\n')
+ return device_num
+
+def train(args):
+ # parameters from arguments
+ model_name = args.model
+ checkpoint = args.checkpoint
+ pretrained_model = args.pretrained_model
+ with_memory_optimization = args.with_mem_opt
+ model_save_dir = args.model_save_dir
+
+ startup_prog = fluid.Program()
+ train_prog = fluid.Program()
+ test_prog = fluid.Program()
+ if args.enable_ce:
+ startup_prog.random_seed = 1000
+ train_prog.random_seed = 1000
+
+ train_py_reader, train_cost, train_acc1, train_acc5, global_lr = build_program(
+ is_train=True,
+ main_prog=train_prog,
+ startup_prog=startup_prog,
+ args=args)
+ test_py_reader, test_cost, test_acc1, test_acc5 = build_program(
+ is_train=False,
+ main_prog=test_prog,
+ startup_prog=startup_prog,
+ args=args)
+ test_prog = test_prog.clone(for_test=True)
+
+ if with_memory_optimization:
+ fluid.memory_optimize(train_prog)
+ fluid.memory_optimize(test_prog)
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(startup_prog)
+
+ if checkpoint is not None:
+ fluid.io.load_persistables(exe, checkpoint, main_program=train_prog)
+
+ if pretrained_model:
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(pretrained_model, var.name))
+
+ fluid.io.load_vars(
+ exe, pretrained_model, main_program=train_prog, predicate=if_exist)
+
+ if args.use_gpu:
+ device_num = get_device_num()
+ else:
+ device_num = 1
+    # per-device batch size (integer division)
+    train_batch_size = args.batch_size // device_num
+
+ test_batch_size = 16
+ if not args.enable_ce:
+ train_reader = paddle.batch(
+ reader.train(), batch_size=train_batch_size, drop_last=True)
+ test_reader = paddle.batch(reader.val(), batch_size=test_batch_size)
+ else:
+        # Use the flowers dataset for CE, with use_xmap=False to keep the
+        # sample order deterministic. This is slow; a faster dataset would
+        # be needed for better speed.
+ import random
+ random.seed(0)
+ np.random.seed(0)
+ train_reader = paddle.batch(
+ flowers.train(use_xmap=False),
+ batch_size=train_batch_size,
+ drop_last=True)
+ test_reader = paddle.batch(
+ flowers.test(use_xmap=False), batch_size=test_batch_size)
+
+ train_py_reader.decorate_paddle_reader(train_reader)
+ test_py_reader.decorate_paddle_reader(test_reader)
+
+ use_ngraph = os.getenv('FLAGS_use_ngraph')
+ if not use_ngraph:
+ train_exe = fluid.ParallelExecutor(
+ main_program=train_prog,
+ use_cuda=bool(args.use_gpu),
+ loss_name=train_cost.name)
+ else:
+ train_exe = exe
+
+ train_fetch_list = [
+ train_cost.name, train_acc1.name, train_acc5.name, global_lr.name
+ ]
+ test_fetch_list = [test_cost.name, test_acc1.name, test_acc5.name]
+
+ params = models.__dict__[args.model]().params
+ for pass_id in range(params["num_epochs"]):
+
+ train_py_reader.start()
+
+ train_info = [[], [], []]
+ test_info = [[], [], []]
+ train_time = []
+ batch_id = 0
+ try:
+ while True:
+ t1 = time.time()
+
+ if use_ngraph:
+ loss, acc1, acc5, lr = train_exe.run(
+ train_prog, fetch_list=train_fetch_list)
+ else:
+ loss, acc1, acc5, lr = train_exe.run(
+ fetch_list=train_fetch_list)
+ t2 = time.time()
+ period = t2 - t1
+ loss = np.mean(np.array(loss))
+ acc1 = np.mean(np.array(acc1))
+ acc5 = np.mean(np.array(acc5))
+ train_info[0].append(loss)
+ train_info[1].append(acc1)
+ train_info[2].append(acc5)
+ lr = np.mean(np.array(lr))
+ train_time.append(period)
+
+                if batch_id % 10 == 0:
+                    print("Pass {0}, trainbatch {1}, loss {2}, "
+                          "acc1 {3}, acc5 {4}, lr {5}, time {6}"
+                          .format(pass_id, batch_id, "%.5f" % loss,
+                                  "%.5f" % acc1, "%.5f" % acc5,
+                                  "%.5f" % lr, "%2.2f sec" % period))
+ sys.stdout.flush()
+ batch_id += 1
+ except fluid.core.EOFException:
+ train_py_reader.reset()
+
+ train_loss = np.array(train_info[0]).mean()
+ train_acc1 = np.array(train_info[1]).mean()
+ train_acc5 = np.array(train_info[2]).mean()
+        # mean seconds per processed image across all devices
+        train_speed = np.array(train_time).mean() / (train_batch_size *
+                                                     device_num)
+
+ test_py_reader.start()
+
+ test_batch_id = 0
+ try:
+ while True:
+ t1 = time.time()
+ loss, acc1, acc5 = exe.run(program=test_prog,
+ fetch_list=test_fetch_list)
+ t2 = time.time()
+ period = t2 - t1
+ loss = np.mean(loss)
+ acc1 = np.mean(acc1)
+ acc5 = np.mean(acc5)
+ test_info[0].append(loss)
+ test_info[1].append(acc1)
+ test_info[2].append(acc5)
+                if test_batch_id % 10 == 0:
+                    print("Pass {0}, testbatch {1}, loss {2}, "
+                          "acc1 {3}, acc5 {4}, time {5}"
+                          .format(pass_id, test_batch_id, "%.5f" % loss,
+                                  "%.5f" % acc1, "%.5f" % acc5,
+                                  "%2.2f sec" % period))
+ sys.stdout.flush()
+ test_batch_id += 1
+ except fluid.core.EOFException:
+ test_py_reader.reset()
+
+ test_loss = np.array(test_info[0]).mean()
+ test_acc1 = np.array(test_info[1]).mean()
+ test_acc5 = np.array(test_info[2]).mean()
+
+ print("End pass {0}, train_loss {1}, train_acc1 {2}, train_acc5 {3}, "
+ "test_loss {4}, test_acc1 {5}, test_acc5 {6}".format(
+ pass_id, "%.5f"%train_loss, "%.5f"%train_acc1, "%.5f"%train_acc5, "%.5f"%test_loss,
+ "%.5f"%test_acc1, "%.5f"%test_acc5))
+ sys.stdout.flush()
+
+        model_path = os.path.join(model_save_dir, model_name, str(pass_id))
+ if not os.path.isdir(model_path):
+ os.makedirs(model_path)
+ fluid.io.save_persistables(exe, model_path, main_program=train_prog)
+
+ # This is for continuous evaluation only
+ if args.enable_ce and pass_id == args.num_epochs - 1:
+ if device_num == 1:
+ # Use the mean cost/acc for training
+ print("kpis train_cost %s" % train_loss)
+ print("kpis train_acc_top1 %s" % train_acc1)
+ print("kpis train_acc_top5 %s" % train_acc5)
+ # Use the mean cost/acc for testing
+ print("kpis test_cost %s" % test_loss)
+ print("kpis test_acc_top1 %s" % test_acc1)
+ print("kpis test_acc_top5 %s" % test_acc5)
+ print("kpis train_speed %s" % train_speed)
+ else:
+ # Use the mean cost/acc for training
+ print("kpis train_cost_card%s %s" % (device_num, train_loss))
+ print("kpis train_acc_top1_card%s %s" %
+ (device_num, train_acc1))
+ print("kpis train_acc_top5_card%s %s" %
+ (device_num, train_acc5))
+ # Use the mean cost/acc for testing
+ print("kpis test_cost_card%s %s" % (device_num, test_loss))
+ print("kpis test_acc_top1_card%s %s" % (device_num, test_acc1))
+ print("kpis test_acc_top5_card%s %s" % (device_num, test_acc5))
+ print("kpis train_speed_card%s %s" % (device_num, train_speed))
+
+
+def main():
+ args = parser.parse_args()
+ print_arguments(args)
+ train(args)
+
+
+if __name__ == '__main__':
+ main()
diff --git a/fluid/PaddleCV/image_classification/utility.py b/PaddleCV/image_classification/utility.py
similarity index 100%
rename from fluid/PaddleCV/image_classification/utility.py
rename to PaddleCV/image_classification/utility.py
diff --git a/PaddleCV/image_classification/utils/__init__.py b/PaddleCV/image_classification/utils/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..1e025483d26b01c32ccf13127c5f1c5078737a17
--- /dev/null
+++ b/PaddleCV/image_classification/utils/__init__.py
@@ -0,0 +1,3 @@
+from .learning_rate import cosine_decay, lr_warmup
+from .fp16_utils import create_master_params_grads, master_param_to_train_param
+from .utility import add_arguments, print_arguments
diff --git a/PaddleCV/image_classification/utils/fp16_utils.py b/PaddleCV/image_classification/utils/fp16_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..939ac59db2441af7b190604895f8a467d9844294
--- /dev/null
+++ b/PaddleCV/image_classification/utils/fp16_utils.py
@@ -0,0 +1,110 @@
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import paddle.fluid.core as core
+
+def cast_fp16_to_fp32(i, o, prog):
+ prog.global_block().append_op(
+ type="cast",
+ inputs={"X": i},
+ outputs={"Out": o},
+ attrs={
+ "in_dtype": fluid.core.VarDesc.VarType.FP16,
+ "out_dtype": fluid.core.VarDesc.VarType.FP32
+ }
+ )
+
+def cast_fp32_to_fp16(i, o, prog):
+ prog.global_block().append_op(
+ type="cast",
+ inputs={"X": i},
+ outputs={"Out": o},
+ attrs={
+ "in_dtype": fluid.core.VarDesc.VarType.FP32,
+ "out_dtype": fluid.core.VarDesc.VarType.FP16
+ }
+ )
+
+def copy_to_master_param(p, block):
+ v = block.vars.get(p.name, None)
+ if v is None:
+        raise ValueError("no parameter named %s found!" % p.name)
+ new_p = fluid.framework.Parameter(
+ block=block,
+ shape=v.shape,
+ dtype=fluid.core.VarDesc.VarType.FP32,
+ type=v.type,
+ lod_level=v.lod_level,
+ stop_gradient=p.stop_gradient,
+ trainable=p.trainable,
+ optimize_attr=p.optimize_attr,
+ regularizer=p.regularizer,
+ gradient_clip_attr=p.gradient_clip_attr,
+ error_clip=p.error_clip,
+ name=v.name + ".master")
+ return new_p
+
+
+def _update_role_var_grad(prog, params_grads):
+ BACKWARD = core.op_proto_and_checker_maker.OpRole.Backward
+ gradname_to_paramname = dict()
+ for p, g in params_grads:
+ gradname_to_paramname[g.name] = p.name
+ for op in prog.global_block().ops:
+ role = op.attr("op_role")
+ if role & int(BACKWARD) and op.has_attr("op_role_var"):
+            # ops already marked backward should not keep op_role_var; drop it
+ op.desc.remove_attr("op_role_var")
+ for op in prog.global_block().ops:
+ if op.type == "allreduce":
+ allreduce_role_var = []
+ for input_varname in op.input_arg_names:
+ if input_varname in gradname_to_paramname:
+ allreduce_role_var.append(gradname_to_paramname[input_varname])
+ allreduce_role_var.append(input_varname)
+ print("updating role var: ", allreduce_role_var)
+ op._set_attr("op_role_var", allreduce_role_var)
+
+def create_master_params_grads(params_grads, main_prog, startup_prog, scale_loss, reduce_master_grad=True):
+ master_params_grads = [] # master p, g on local device
+ params_grads_to_apply = [] # master p, g after allreduced, if reduce_master_grad is enabled
+ tmp_role = main_prog._current_role
+ OpRole = fluid.core.op_proto_and_checker_maker.OpRole
+ main_prog._current_role = OpRole.Backward
+ for p, g in params_grads:
+ # create master parameters
+ master_param = copy_to_master_param(p, main_prog.global_block())
+ startup_master_param = startup_prog.global_block()._clone_variable(master_param)
+ startup_p = startup_prog.global_block().var(p.name)
+ cast_fp16_to_fp32(startup_p, startup_master_param, startup_prog)
+ # cast fp16 gradients to fp32 before apply gradients
+ if g.name.startswith("batch_norm"):
+ if scale_loss > 1:
+ scaled_g = g / float(scale_loss)
+ else:
+ scaled_g = g
+ master_params_grads.append([p, scaled_g])
+ continue
+
+ master_grad = fluid.layers.cast(g, "float32")
+ if scale_loss > 1:
+ master_grad = master_grad / float(scale_loss)
+ master_params_grads.append([p, master_grad])
+ if reduce_master_grad:
+ reduced_master_grad = fluid.layers.collective._allreduce(master_grad)
+ else:
+ reduced_master_grad = master_grad
+ params_grads_to_apply.append([master_param, reduced_master_grad])
+
+    # update the program's op_role_var according to master grads before allreduce.
+ _update_role_var_grad(main_prog, master_params_grads)
+ main_prog._current_role = tmp_role
+ return params_grads_to_apply
+
+def master_param_to_train_param(master_params_grads, params_grads, main_prog):
+ for idx, m_p_g in enumerate(master_params_grads):
+ train_p, _ = params_grads[idx]
+ if train_p.name.startswith("batch_norm"):
+ continue
+ with main_prog._optimized_guard([m_p_g[0], m_p_g[1]]):
+ cast_fp32_to_fp16(m_p_g[0], train_p, main_prog)
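+
+
+# Editor's summary of the flow implemented above (descriptive only):
+# 1. copy_to_master_param keeps an FP32 "master" copy of every FP16
+#    parameter; the startup program initializes it by casting the FP16
+#    value to FP32.
+# 2. create_master_params_grads casts each FP16 gradient to FP32 and
+#    divides it by scale_loss, undoing the loss scaling applied in the
+#    forward pass (batch_norm parameters skip the cast and are only
+#    rescaled).
+# 3. The optimizer then updates the FP32 master parameters, and
+#    master_param_to_train_param casts them back onto the FP16 training
+#    parameters for the next step.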
diff --git a/PaddleCV/image_classification/utils/learning_rate.py b/PaddleCV/image_classification/utils/learning_rate.py
new file mode 100644
index 0000000000000000000000000000000000000000..f505d79e670b1625472061a3a6de38b0a5fd4e93
--- /dev/null
+++ b/PaddleCV/image_classification/utils/learning_rate.py
@@ -0,0 +1,78 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import paddle.fluid.layers.ops as ops
+from paddle.fluid.initializer import init_on_cpu
+from paddle.fluid.layers.learning_rate_scheduler import _decay_step_counter
+import math
+
+
+def cosine_decay(learning_rate, step_each_epoch, epochs=120):
+    """Applies cosine decay to the learning rate:
+    lr = learning_rate * (cos(epoch * pi / epochs) + 1) / 2
+    """
+ global_step = _decay_step_counter()
+
+ with init_on_cpu():
+ epoch = ops.floor(global_step / step_each_epoch)
+ decayed_lr = learning_rate * \
+ (ops.cos(epoch * (math.pi / epochs)) + 1)/2
+ return decayed_lr
+
+def cosine_decay_with_warmup(learning_rate, step_each_epoch, epochs=120):
+    """Applies cosine decay with linear warmup to the learning rate:
+    the lr ramps up linearly during the first 5 epochs, then follows
+    lr = learning_rate * (cos(step * pi / total_steps) + 1) / 2,
+    decreased every mini-batch.
+    """
+ global_step = _decay_step_counter()
+ lr = fluid.layers.tensor.create_global_var(
+ shape=[1],
+ value=0.0,
+ dtype='float32',
+ persistable=True,
+ name="learning_rate")
+
+ warmup_epoch = fluid.layers.fill_constant(
+ shape=[1], dtype='float32', value=float(5), force_cpu=True)
+
+ with init_on_cpu():
+ epoch = ops.floor(global_step / step_each_epoch)
+        with fluid.layers.control_flow.Switch() as switch:
+ with switch.case(epoch < warmup_epoch):
+ decayed_lr = learning_rate * (global_step / (step_each_epoch * warmup_epoch))
+ fluid.layers.tensor.assign(input=decayed_lr, output=lr)
+ with switch.default():
+ decayed_lr = learning_rate * \
+ (ops.cos((global_step - warmup_epoch * step_each_epoch) * (math.pi / (epochs * step_each_epoch))) + 1)/2
+ fluid.layers.tensor.assign(input=decayed_lr, output=lr)
+ return lr
+
+def lr_warmup(learning_rate, warmup_steps, start_lr, end_lr):
+    """ Applies linear learning rate warmup for distributed training.
+    For the first warmup_steps steps,
+        lr = start_lr + (end_lr - start_lr) * global_step / warmup_steps,
+    after which learning_rate (a float or a Variable) is used as-is.
+    """
+ assert (isinstance(end_lr, float))
+ assert (isinstance(start_lr, float))
+ linear_step = end_lr - start_lr
+ with fluid.default_main_program()._lr_schedule_guard():
+ lr = fluid.layers.tensor.create_global_var(
+ shape=[1],
+ value=0.0,
+ dtype='float32',
+ persistable=True,
+ name="learning_rate_warmup")
+
+ global_step = fluid.layers.learning_rate_scheduler._decay_step_counter()
+
+ with fluid.layers.control_flow.Switch() as switch:
+ with switch.case(global_step < warmup_steps):
+ decayed_lr = start_lr + linear_step * (global_step /
+ warmup_steps)
+ fluid.layers.tensor.assign(decayed_lr, lr)
+ with switch.default():
+ fluid.layers.tensor.assign(learning_rate, lr)
+
+ return lr
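+
+
+if __name__ == '__main__':
+    # Editor's sketch: evaluate the schedule from cosine_decay in plain
+    # Python to show its shape (base lr 0.1 over 120 epochs); the fluid
+    # ops above build the same expression symbolically at every step.
+    base_lr, epochs = 0.1, 120
+    for epoch in (0, 30, 60, 90, 119):
+        decayed = base_lr * (math.cos(epoch * math.pi / epochs) + 1) / 2
+        print("epoch %3d -> lr %.5f" % (epoch, decayed))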
diff --git a/PaddleCV/image_classification/utils/utility.py b/PaddleCV/image_classification/utils/utility.py
new file mode 100644
index 0000000000000000000000000000000000000000..c28646da24cb0fb42a91a2cbff92c20db307da81
--- /dev/null
+++ b/PaddleCV/image_classification/utils/utility.py
@@ -0,0 +1,63 @@
+"""Contains common utility functions."""
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import distutils.util
+import numpy as np
+import six
+from paddle.fluid import core
+
+
+def print_arguments(args):
+ """Print argparse's arguments.
+
+ Usage:
+
+ .. code-block:: python
+
+ parser = argparse.ArgumentParser()
+ parser.add_argument("name", default="Jonh", type=str, help="User name.")
+ args = parser.parse_args()
+ print_arguments(args)
+
+ :param args: Input argparse.Namespace for printing.
+ :type args: argparse.Namespace
+ """
+ print("------------- Configuration Arguments -------------")
+ for arg, value in sorted(six.iteritems(vars(args))):
+ print("%25s : %s" % (arg, value))
+ print("----------------------------------------------------")
+
+
+def add_arguments(argname, type, default, help, argparser, **kwargs):
+ """Add argparse's argument.
+
+ Usage:
+
+ .. code-block:: python
+
+ parser = argparse.ArgumentParser()
+ add_argument("name", str, "Jonh", "User name.", parser)
+ args = parser.parse_args()
+ """
+ type = distutils.util.strtobool if type == bool else type
+ argparser.add_argument(
+ "--" + argname,
+ default=default,
+ type=type,
+ help=help + ' Default: %(default)s.',
+ **kwargs)
diff --git a/PaddleCV/metric_learning/README.md b/PaddleCV/metric_learning/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..605194946200e26ad05af09b0ba9d20f37a10e4e
--- /dev/null
+++ b/PaddleCV/metric_learning/README.md
@@ -0,0 +1,113 @@
+# Deep Metric Learning
+Metric learning is a family of methods that learn discriminative features for each sample, so that intra-class samples have small distances while inter-class samples have large distances in the learned space. With the development of deep learning, metric learning methods have been combined with deep neural networks to boost the performance of traditional tasks such as face recognition/verification, person re-identification and image retrieval. This page introduces how to implement deep metric learning using PaddlePaddle Fluid, covering [data preparation](#data-preparation), [training](#training-metric-learning-models), [finetuning](#finetuning), [evaluation](#evaluation), [inference](#inference) and [performances](#performances).
+
+---
+## Table of Contents
+- [Installation](#installation)
+- [Data preparation](#data-preparation)
+- [Training metric learning models](#training-metric-learning-models)
+- [Finetuning](#finetuning)
+- [Evaluation](#evaluation)
+- [Inference](#inference)
+- [Performances](#performances)
+
+## Installation
+
+Running the sample code in this directory requires PaddlePaddle Fluid v0.14.0 or later. If the PaddlePaddle version on your device is lower than this, please follow the instructions in the [installation document](http://paddlepaddle.org/documentation/docs/zh/1.3/beginners_guide/install/index_cn.html) and update it.
+
+## Data preparation
+
+The Stanford Online Products (SOP) dataset contains 120,053 images of 22,634 products downloaded from eBay.com. We use it for the metric learning experiments: 59,551 images from 11,318 classes are used for training, and 60,502 images from the remaining 11,316 classes are held out for testing. First of all, the SOP data can be prepared as follows:
+```
+cd data/
+sh download_sop.sh
+```
+
+## Training metric learning models
+
+To train a metric learning model, one needs to choose a neural network as the backbone and a metric loss function to optimize. You can download [ResNet50](http://paddle-imagenet-models-name.bj.bcebos.com/ResNet50_pretrained.zip) pretrained on the ImageNet dataset as the backbone. We first train the model with softmax or arcmargin loss, and then fine-tune it with another metric learning loss such as triplet, quadruplet or eml loss. An example of training with arcmargin loss is shown below; a numpy sketch of the arcmargin logit adjustment follows the parameter list.
+
+
+```
+python train_elem.py \
+ --model=ResNet50 \
+ --train_batch_size=256 \
+ --test_batch_size=50 \
+ --lr=0.01 \
+ --total_iter_num=30000 \
+ --use_gpu=True \
+ --pretrained_model=${path_to_pretrain_imagenet_model} \
+ --model_save_dir=${output_model_path} \
+ --loss_name=arcmargin \
+ --arc_scale=80.0 \
+ --arc_margin=0.15 \
+ --arc_easy_margin=False
+```
+**Parameter introduction:**
+* **model**: name of the model to use. Default: "ResNet50".
+* **train_batch_size**: the size of each training mini-batch. Default: 256.
+* **test_batch_size**: the size of each testing mini-batch. Default: 50.
+* **lr**: initial learning rate. Default: 0.01.
+* **total_iter_num**: total number of training iterations. Default: 30000.
+* **use_gpu**: whether to use GPU or not. Default: True.
+* **pretrained_model**: path of the pretrained model. Default: None.
+* **model_save_dir**: the directory to save the trained model. Default: "output".
+* **loss_name**: loss used to train the model. Default: "softmax".
+* **arc_scale**: parameter of arcmargin loss. Default: 80.0.
+* **arc_margin**: parameter of arcmargin loss. Default: 0.15.
+* **arc_easy_margin**: parameter of arcmargin loss. Default: False.
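+
+The core of the arcmargin (ArcFace) loss is a logit adjustment on L2-normalized features and class weights. The following numpy sketch is an editor's illustration of that adjustment, not the repository's API; the function name and shapes are assumptions, and `scale`/`margin` correspond to `arc_scale`/`arc_margin` above:
+
+```
+import numpy as np
+
+def arcmargin_logits(embeddings, weights, labels, scale=80.0, margin=0.15):
+    # cosine similarity between normalized features and class weights
+    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
+    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
+    cos_theta = e.dot(w)  # shape: (batch, num_classes)
+    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
+    # enlarge the target-class angle by `margin`, then rescale by `scale`
+    one_hot = np.eye(w.shape[1], dtype=bool)[labels]
+    cos_theta[one_hot] = np.cos(theta[one_hot] + margin)
+    return scale * cos_theta  # fed to softmax + cross entropy
+```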
+
+## Finetuning
+
+Finetuning loads pretrained weights and continues training them on a specific task. After training the model with softmax or arcmargin loss, one can fine-tune it with triplet, quadruplet or eml loss. An example of fine-tuning with eml loss is shown below:
+
+```
+python train_pair.py \
+ --model=ResNet50 \
+ --train_batch_size=160 \
+ --test_batch_size=50 \
+ --lr=0.0001 \
+ --total_iter_num=100000 \
+ --use_gpu=True \
+ --pretrained_model=${path_to_pretrain_arcmargin_model} \
+ --model_save_dir=${output_model_path} \
+ --loss_name=eml \
+ --samples_each_class=2
+```
+
+## Evaluation
+Evaluation measures the performance of a trained model. Set ```path_to_pretrain_model``` to the trained model's path; Recall@Rank-1 can then be obtained by running the following command:
+```
+python eval.py \
+ --model=ResNet50 \
+ --batch_size=50 \
+    --pretrained_model=${path_to_pretrain_model}
+```
+
+## Inference
+Inference is used to get prediction scores or image features from trained models.
+```
+python infer.py \
+ --model=ResNet50 \
+ --batch_size=1 \
+ --pretrained_model=${path_to_pretrain_model}
+```
+
+## Performances
+
+For comparison, metric learning models with different neural networks and loss functions were trained using empirically chosen hyperparameters. Recall@Rank-1 is used as the evaluation metric, and the results are listed in the table below.
+
+|pretraining loss | softmax | arcmargin
+|- | - | -:
+|without finetuning | 77.42% | 78.11%
+|finetuned with triplet | 78.37% | 79.21%
+|finetuned with quadruplet | 78.10% | 79.59%
+|finetuned with eml | 79.32% | 80.11%
+|finetuned with npairs | - | 79.81%
+
+## Reference
+
+- ArcFace: Additive Angular Margin Loss for Deep Face Recognition [link](https://arxiv.org/abs/1801.07698)
+- Margin Sample Mining Loss: A Deep Learning Based Method for Person Re-identification [link](https://arxiv.org/abs/1710.00478)
+- Large Scale Strongly Supervised Ensemble Metric Learning, with Applications to Face Verification and Retrieval [link](https://arxiv.org/abs/1212.6094)
+- Improved Deep Metric Learning with Multi-class N-pair Loss Objective [link](http://www.nec-labs.com/uploads/images/Department-Images/MediaAnalytics/papers/nips16_npairmetriclearning.pdf)
diff --git a/PaddleCV/metric_learning/README_cn.md b/PaddleCV/metric_learning/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..2bb67ef8170e887d1e961fd3933de820543b0ee4
--- /dev/null
+++ b/PaddleCV/metric_learning/README_cn.md
@@ -0,0 +1,113 @@
+# Deep Metric Learning
+Metric learning is a way to learn discriminative features for samples, so that in the feature space samples of the same class have small feature distances while samples of different classes have large ones. With the development of deep learning, metric learning methods based on deep neural networks have greatly improved performance on many vision tasks, such as face recognition, face verification, person re-identification and image retrieval. This section introduces several metric learning methods implemented in PaddlePaddle Fluid and how to use them, covering [data preparation](#data-preparation), [model training](#model-training), [model finetuning](#model-finetuning), [model evaluation](#model-evaluation) and [model inference](#model-inference).
+
+---
+## Table of Contents
+- [Installation](#installation)
+- [Data Preparation](#data-preparation)
+- [Model Training](#model-training)
+- [Model Finetuning](#model-finetuning)
+- [Model Evaluation](#model-evaluation)
+- [Model Inference](#model-inference)
+- [Model Performance](#model-performance)
+
+## Installation
+
+Running the code in this section requires PaddlePaddle Fluid v0.14.0 or later. If the PaddlePaddle version on your device is lower than v0.14.0, please follow this [installation document](http://paddlepaddle.org/documentation/docs/zh/1.3/beginners_guide/install/index_cn.html) to install or update it.
+
+## Data Preparation
+
+The Stanford Online Products (SOP) dataset, downloaded from eBay, contains 120,053 product images in 22,634 classes. We use this dataset for the experiments: 59,551 images from 11,318 classes for training, and 60,502 images from 11,316 classes for testing. First, the SOP dataset can be downloaded with the following script:
+```
+cd data/
+sh download_sop.sh
+```
+
+## Model Training
+
+To train a metric learning model, we need a neural network as the backbone (e.g. [ResNet50](http://paddle-imagenet-models-name.bj.bcebos.com/ResNet50_pretrained.zip)) and a metric loss function to optimize. We first train with softmax or arcmargin, and then fine-tune with other loss functions such as triplet, quadruplet and eml. Below is an example of training with arcmargin:
+
+
+```
+python train_elem.py \
+ --model=ResNet50 \
+ --train_batch_size=256 \
+ --test_batch_size=50 \
+ --lr=0.01 \
+ --total_iter_num=30000 \
+ --use_gpu=True \
+ --pretrained_model=${path_to_pretrain_imagenet_model} \
+ --model_save_dir=${output_model_path} \
+ --loss_name=arcmargin \
+ --arc_scale=80.0 \
+ --arc_margin=0.15 \
+ --arc_easy_margin=False
+```
+**Parameter introduction:**
+* **model**: name of the model to use. Default: "ResNet50".
+* **train_batch_size**: training mini-batch size. Default: 256.
+* **test_batch_size**: testing mini-batch size. Default: 50.
+* **lr**: initial learning rate. Default: 0.01.
+* **total_iter_num**: total number of training iterations. Default: 30000.
+* **use_gpu**: whether to use GPU. Default: True.
+* **pretrained_model**: path of the pretrained model. Default: None.
+* **model_save_dir**: directory to save the model. Default: "output".
+* **loss_name**: loss function to optimize. Default: "softmax".
+* **arc_scale**: arcmargin parameter. Default: 80.0.
+* **arc_margin**: arcmargin parameter. Default: 0.15.
+* **arc_easy_margin**: arcmargin parameter. Default: False.
+
+## Model Finetuning
+
+Finetuning loads an existing model and continues training it on a specific task. After training the network with softmax or arcmargin, you can continue to fine-tune it with triplet, quadruplet or eml. Below is an example of finetuning with eml:
+
+```
+python train_pair.py \
+ --model=ResNet50 \
+ --train_batch_size=160 \
+ --test_batch_size=50 \
+ --lr=0.0001 \
+ --total_iter_num=100000 \
+ --use_gpu=True \
+ --pretrained_model=${path_to_pretrain_arcmargin_model} \
+ --model_save_dir=${output_model_path} \
+ --loss_name=eml \
+ --samples_each_class=2
+```
+
+## Model Evaluation
+Model evaluation measures the retrieval performance of a trained model. Here ```path_to_pretrain_model``` needs to be set. Recall@Rank-1 can then be computed with the following command:
+```
+python eval.py \
+ --model=ResNet50 \
+ --batch_size=50 \
+    --pretrained_model=${path_to_pretrain_model}
+```
+
+## Model Inference
+Model inference extracts features of image data with a trained network. Below is an example:
+```
+python infer.py \
+ --model=ResNet50 \
+ --batch_size=1 \
+ --pretrained_model=${path_to_pretrain_model}
+```
+
+## Model Performance
+
+The table below lists the retrieval results of several metric learning loss functions on the SOP dataset, evaluated with Recall@Rank-1.
+
+|pretraining loss | softmax | arcmargin
+|- | - | -:
+|without finetuning | 77.42% | 78.11%
+|finetuned with triplet | 78.37% | 79.21%
+|finetuned with quadruplet | 78.10% | 79.59%
+|finetuned with eml | 79.32% | 80.11%
+|finetuned with npairs | - | 79.81%
+
+## Reference
+
+- ArcFace: Additive Angular Margin Loss for Deep Face Recognition [link](https://arxiv.org/abs/1801.07698)
+- Margin Sample Mining Loss: A Deep Learning Based Method for Person Re-identification [link](https://arxiv.org/abs/1710.00478)
+- Large Scale Strongly Supervised Ensemble Metric Learning, with Applications to Face Verification and Retrieval [link](https://arxiv.org/abs/1212.6094)
+- Improved Deep Metric Learning with Multi-class N-pair Loss Objective [link](http://www.nec-labs.com/uploads/images/Department-Images/MediaAnalytics/papers/nips16_npairmetriclearning.pdf)
diff --git a/fluid/PaddleCV/metric_learning/_ce.py b/PaddleCV/metric_learning/_ce.py
similarity index 100%
rename from fluid/PaddleCV/metric_learning/_ce.py
rename to PaddleCV/metric_learning/_ce.py
diff --git a/fluid/PaddleCV/metric_learning/data/download_sop.sh b/PaddleCV/metric_learning/data/download_sop.sh
similarity index 100%
rename from fluid/PaddleCV/metric_learning/data/download_sop.sh
rename to PaddleCV/metric_learning/data/download_sop.sh
diff --git a/fluid/PaddleCV/metric_learning/eval.py b/PaddleCV/metric_learning/eval.py
similarity index 100%
rename from fluid/PaddleCV/metric_learning/eval.py
rename to PaddleCV/metric_learning/eval.py
diff --git a/fluid/PaddleCV/metric_learning/imgtool.py b/PaddleCV/metric_learning/imgtool.py
similarity index 100%
rename from fluid/PaddleCV/metric_learning/imgtool.py
rename to PaddleCV/metric_learning/imgtool.py
diff --git a/PaddleCV/metric_learning/infer.py b/PaddleCV/metric_learning/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..2ee6cfb29eba935811ae576e3a450420b6c6be64
--- /dev/null
+++ b/PaddleCV/metric_learning/infer.py
@@ -0,0 +1,84 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import os
+import sys
+import math
+import time
+import argparse
+import functools
+import numpy as np
+import paddle
+import paddle.fluid as fluid
+import models
+import reader
+from utility import add_arguments, print_arguments
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('model', str, "ResNet50", "Set the network to use.")
+add_arg('embedding_size', int, 0, "Embedding size.")
+add_arg('batch_size', int, 1, "Minibatch size.")
+add_arg('image_shape', str, "3,224,224", "Input image size.")
+add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+add_arg('with_mem_opt', bool, False, "Whether to use memory optimization or not.")
+add_arg('pretrained_model', str, None, "Path to a pretrained model.")
+# yapf: enable
+
+model_list = [m for m in dir(models) if "__" not in m]
+
+
+def infer(args):
+ # parameters from arguments
+ model_name = args.model
+ pretrained_model = args.pretrained_model
+ with_memory_optimization = args.with_mem_opt
+ image_shape = [int(m) for m in args.image_shape.split(",")]
+
+    assert model_name in model_list, "{} is not in the model list: {}".format(
+        args.model, model_list)
+
+ image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
+
+ # model definition
+ model = models.__dict__[model_name]()
+ out = model.net(input=image, embedding_size=args.embedding_size)
+
+ test_program = fluid.default_main_program().clone(for_test=True)
+
+ if with_memory_optimization:
+ fluid.memory_optimize(fluid.default_main_program())
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ if pretrained_model:
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(pretrained_model, var.name))
+
+ fluid.io.load_vars(exe, pretrained_model, predicate=if_exist)
+
+ infer_reader = paddle.batch(reader.infer(args), batch_size=args.batch_size, drop_last=False)
+ feeder = fluid.DataFeeder(place=place, feed_list=[image])
+
+ fetch_list = [out.name]
+
+ for batch_id, data in enumerate(infer_reader()):
+ result = exe.run(test_program, fetch_list=fetch_list, feed=feeder.feed(data))
+ result = result[0][0].reshape(-1)
+ print("Test-{0}-feature: {1}".format(batch_id, result[:5]))
+ sys.stdout.flush()
+
+
+def main():
+ args = parser.parse_args()
+ print_arguments(args)
+ infer(args)
+
+
+if __name__ == '__main__':
+ main()
diff --git a/PaddleCV/metric_learning/losses/__init__.py b/PaddleCV/metric_learning/losses/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..5da4327d35f07bedcaa11615a1d24d89784947dc
--- /dev/null
+++ b/PaddleCV/metric_learning/losses/__init__.py
@@ -0,0 +1,9 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from .softmaxloss import SoftmaxLoss
+from .arcmarginloss import ArcMarginLoss
+from .tripletloss import TripletLoss
+from .quadrupletloss import QuadrupletLoss
+from .emlloss import EmlLoss
+from .npairsloss import NpairsLoss
diff --git a/fluid/PaddleCV/metric_learning/losses/arcmarginloss.py b/PaddleCV/metric_learning/losses/arcmarginloss.py
similarity index 100%
rename from fluid/PaddleCV/metric_learning/losses/arcmarginloss.py
rename to PaddleCV/metric_learning/losses/arcmarginloss.py
diff --git a/fluid/PaddleCV/metric_learning/losses/commonfunc.py b/PaddleCV/metric_learning/losses/commonfunc.py
similarity index 100%
rename from fluid/PaddleCV/metric_learning/losses/commonfunc.py
rename to PaddleCV/metric_learning/losses/commonfunc.py
diff --git a/PaddleCV/metric_learning/losses/emlloss.py b/PaddleCV/metric_learning/losses/emlloss.py
new file mode 100644
index 0000000000000000000000000000000000000000..25c4658c3c4521491790cd31bbcfd403236a4ba5
--- /dev/null
+++ b/PaddleCV/metric_learning/losses/emlloss.py
@@ -0,0 +1,66 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import math
+import paddle.fluid as fluid
+from utility import get_gpu_num
+from .commonfunc import calculate_order_dist_matrix
+
+class EmlLoss():
+ def __init__(self, train_batch_size = 40, samples_each_class=2):
+ self.samples_each_class = samples_each_class
+ self.train_batch_size = train_batch_size
+ num_gpus = get_gpu_num()
+ assert(train_batch_size % num_gpus == 0)
+ self.cal_loss_batch_size = train_batch_size // num_gpus
+ assert(self.cal_loss_batch_size % samples_each_class == 0)
+
+ def surrogate_function(self, beta, theta, bias):
+ x = theta * fluid.layers.exp(bias)
+ output = fluid.layers.log(1+beta*x)/math.log(1+beta)
+ return output
+
+ def surrogate_function_approximate(self, beta, theta, bias):
+ output = (fluid.layers.log(theta) + bias + math.log(beta))/math.log(1+beta)
+ return output
+
+ def surrogate_function_stable(self, beta, theta, target, thresh):
+ max_gap = fluid.layers.fill_constant([1], dtype='float32', value=thresh)
+ max_gap.stop_gradient = True
+
+ target_max = fluid.layers.elementwise_max(target, max_gap)
+ target_min = fluid.layers.elementwise_min(target, max_gap)
+
+ loss1 = self.surrogate_function(beta, theta, target_min)
+ loss2 = self.surrogate_function_approximate(beta, theta, target_max)
+ bias = self.surrogate_function(beta, theta, max_gap)
+ loss = loss1 + loss2 - bias
+ return loss
+
+ def loss(self, input, label=None):
+ samples_each_class = self.samples_each_class
+ batch_size = self.cal_loss_batch_size
+ #input = fluid.layers.l2_normalize(input, axis=1)
+ #input_norm = fluid.layers.sqrt(fluid.layers.reduce_sum(fluid.layers.square(input), dim=1))
+ #input = fluid.layers.elementwise_div(input, input_norm, axis=0)
+ d = calculate_order_dist_matrix(input, self.cal_loss_batch_size, self.samples_each_class)
+ ignore, pos, neg = fluid.layers.split(d, num_or_sections= [1,
+ samples_each_class-1, batch_size-samples_each_class], dim=1)
+ ignore.stop_gradient = True
+
+ pos_max = fluid.layers.reduce_max(pos, dim=1)
+ pos_max = fluid.layers.reshape(pos_max, shape=[-1, 1])
+ pos = fluid.layers.exp(pos - pos_max)
+ pos_mean = fluid.layers.reduce_mean(pos, dim=1)
+
+ neg_min = fluid.layers.reduce_min(neg, dim=1)
+ neg_min = fluid.layers.reshape(neg_min, shape=[-1, 1])
+ neg = fluid.layers.exp(-1*(neg-neg_min))
+ neg_mean = fluid.layers.reduce_mean(neg, dim=1)
+ bias = pos_max - neg_min
+ theta = fluid.layers.reshape(neg_mean * pos_mean, shape=[-1,1])
+ thresh = 20.0
+ beta = 100000
+ loss = self.surrogate_function_stable(beta, theta, bias, thresh)
+ return loss
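+
+# Editor's note on surrogate_function_stable: for a large bias,
+# log(1 + beta*theta*exp(bias)) ~= log(beta) + log(theta) + bias, which is
+# exactly surrogate_function_approximate, and evaluating exp(bias) directly
+# would overflow float32. Targets below `thresh` therefore use the exact
+# form, targets above it use the approximation, and subtracting the exact
+# value at the threshold removes the double-counted contribution so the
+# two branches join continuously.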
diff --git a/PaddleCV/metric_learning/losses/npairsloss.py b/PaddleCV/metric_learning/losses/npairsloss.py
new file mode 100644
index 0000000000000000000000000000000000000000..21086312c2b9c9f6ba63929d83ae06582fefec5f
--- /dev/null
+++ b/PaddleCV/metric_learning/losses/npairsloss.py
@@ -0,0 +1,58 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import paddle.fluid as fluid
+from utility import get_gpu_num
+
+class NpairsLoss():
+ def __init__(self,
+ train_batch_size = 160,
+ samples_each_class=2,
+ reg_lambda=0.01):
+ self.samples_each_class = samples_each_class
+ assert(self.samples_each_class == 2)
+ self.train_batch_size = train_batch_size
+ num_gpus = get_gpu_num()
+ assert(train_batch_size % num_gpus == 0)
+ self.cal_loss_batch_size = train_batch_size // num_gpus
+ assert(self.cal_loss_batch_size % samples_each_class == 0)
+ self.reg_lambda = reg_lambda
+
+ def loss(self, input, label=None):
+ reg_lambda = self.reg_lambda
+ samples_each_class = self.samples_each_class
+ batch_size = self.cal_loss_batch_size
+ num_class = batch_size // samples_each_class
+ fea_dim = input.shape[1]
+
+ input = fluid.layers.reshape(input, shape = [-1, fea_dim])
+ feature = fluid.layers.reshape(input, shape = [-1, samples_each_class, fea_dim])
+ label = fluid.layers.reshape(label, shape = [-1, samples_each_class])
+ label = fluid.layers.cast(label, dtype='float32')
+ if samples_each_class == 2:
+ anchor_fea, positive_fea = fluid.layers.split(feature, num_or_sections = 2, dim = 1)
+ anchor_lab, positive_lab = fluid.layers.split(label, num_or_sections = 2, dim = 1)
+ else:
+ anchor_fea, positive_fea = fluid.layers.split(feature, num_or_sections = [1, samples_each_class-1], dim = 1)
+ anchor_lab, positive_lab = fluid.layers.split(label, num_or_sections = [1, samples_each_class-1], dim = 1)
+
+ anchor_fea = fluid.layers.reshape(anchor_fea, shape = [-1, fea_dim])
+ positive_fea = fluid.layers.reshape(positive_fea, shape = [-1, fea_dim])
+ positive_fea_trans = fluid.layers.transpose(positive_fea, perm = [1, 0])
+ similarity_matrix = fluid.layers.mul(anchor_fea, positive_fea_trans)
+
+ anchor_lab = fluid.layers.expand(x=anchor_lab, expand_times=[1, batch_size-num_class])
+ positive_lab_tran = fluid.layers.transpose(positive_lab, perm = [1, 0])
+ positive_lab_tran = fluid.layers.expand(x=positive_lab_tran, expand_times=[num_class, 1])
+ label_remapped = fluid.layers.equal(anchor_lab, positive_lab_tran)
+ label_remapped = fluid.layers.cast(label_remapped, dtype='float32') / (samples_each_class-1)
+ label_remapped.stop_gradient = True
+
+ out = fluid.layers.softmax(input=similarity_matrix, use_cudnn=False)
+ xentloss = fluid.layers.cross_entropy(input=out, label=label_remapped, soft_label=True)
+ xentloss = fluid.layers.mean(x=xentloss)
+
+ reg = fluid.layers.reduce_mean(fluid.layers.reduce_sum(fluid.layers.square(input), dim=1))
+ l2loss = 0.5 * reg_lambda * reg
+ return xentloss + l2loss
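+
+# Editor's note: label_remapped above is a soft-target matrix whose (i, j)
+# entry is 1/(samples_each_class - 1) when anchor i and positive j share a
+# class and 0 otherwise, so the cross entropy pulls each anchor toward its
+# own positives across the whole batch. The l2loss term regularizes the
+# unnormalized embeddings, following the N-pairs paper cited in the README.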
diff --git a/PaddleCV/metric_learning/losses/quadrupletloss.py b/PaddleCV/metric_learning/losses/quadrupletloss.py
new file mode 100644
index 0000000000000000000000000000000000000000..4af34caf50b5b082de0934cfe51879f11aad9c95
--- /dev/null
+++ b/PaddleCV/metric_learning/losses/quadrupletloss.py
@@ -0,0 +1,40 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import paddle.fluid as fluid
+from utility import get_gpu_num
+from .commonfunc import calculate_order_dist_matrix
+
+class QuadrupletLoss():
+ def __init__(self,
+ train_batch_size = 80,
+ samples_each_class = 2,
+ margin = 0.1):
+ self.margin = margin
+ self.samples_each_class = samples_each_class
+ self.train_batch_size = train_batch_size
+ num_gpus = get_gpu_num()
+ assert(train_batch_size % num_gpus == 0)
+ self.cal_loss_batch_size = train_batch_size // num_gpus
+ assert(self.cal_loss_batch_size % samples_each_class == 0)
+
+ def loss(self, input, label=None):
+ #input = fluid.layers.l2_normalize(input, axis=1)
+ input_norm = fluid.layers.sqrt(fluid.layers.reduce_sum(fluid.layers.square(input), dim=1))
+ input = fluid.layers.elementwise_div(input, input_norm, axis=0)
+
+ samples_each_class = self.samples_each_class
+ batch_size = self.cal_loss_batch_size
+ margin = self.margin
+ d = calculate_order_dist_matrix(input, self.cal_loss_batch_size, self.samples_each_class)
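+ # split each row of the ordered distance matrix into the self distance (ignored),
+ # intra-class distances (pos) and inter-class distances (neg)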
+ ignore, pos, neg = fluid.layers.split(d, num_or_sections= [1,
+ samples_each_class-1, batch_size-samples_each_class], dim=1)
+ ignore.stop_gradient = True
+ pos_max = fluid.layers.reduce_max(pos)
+ neg_min = fluid.layers.reduce_min(neg)
+ #pos_max = fluid.layers.sqrt(pos_max + 1e-6)
+ #neg_min = fluid.layers.sqrt(neg_min + 1e-6)
+ loss = fluid.layers.relu(pos_max - neg_min + margin)
+ return loss
+
diff --git a/fluid/PaddleCV/metric_learning/losses/softmaxloss.py b/PaddleCV/metric_learning/losses/softmaxloss.py
similarity index 100%
rename from fluid/PaddleCV/metric_learning/losses/softmaxloss.py
rename to PaddleCV/metric_learning/losses/softmaxloss.py
diff --git a/PaddleCV/metric_learning/losses/tripletloss.py b/PaddleCV/metric_learning/losses/tripletloss.py
new file mode 100644
index 0000000000000000000000000000000000000000..e5bb22ef913abfe60e421934217f71fbfad39c95
--- /dev/null
+++ b/PaddleCV/metric_learning/losses/tripletloss.py
@@ -0,0 +1,32 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import paddle.fluid as fluid
+
+class TripletLoss():
+ def __init__(self, margin=0.1):
+ self.margin = margin
+
+ def loss(self, input, label=None):
+ margin = self.margin
+ fea_dim = input.shape[1] # number of channels
+ #input = fluid.layers.l2_normalize(input, axis=1)
+ input_norm = fluid.layers.sqrt(fluid.layers.reduce_sum(fluid.layers.square(input), dim=1))
+ input = fluid.layers.elementwise_div(input, input_norm, axis=0)
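+ # consecutive groups of 3 samples form (anchor, positive, negative) triplets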
+ output = fluid.layers.reshape(input, shape = [-1, 3, fea_dim])
+
+ anchor, positive, negative = fluid.layers.split(output, num_or_sections = 3, dim = 1)
+
+ anchor = fluid.layers.reshape(anchor, shape = [-1, fea_dim])
+ positive = fluid.layers.reshape(positive, shape = [-1, fea_dim])
+ negative = fluid.layers.reshape(negative, shape = [-1, fea_dim])
+
+ a_p = fluid.layers.square(anchor - positive)
+ a_n = fluid.layers.square(anchor - negative)
+ a_p = fluid.layers.reduce_sum(a_p, dim = 1)
+ a_n = fluid.layers.reduce_sum(a_n, dim = 1)
+ #a_p = fluid.layers.sqrt(a_p + 1e-6)
+ #a_n = fluid.layers.sqrt(a_n + 1e-6)
+ loss = fluid.layers.relu(a_p + margin - a_n)
+ return loss
diff --git a/fluid/PaddleCV/metric_learning/models/__init__.py b/PaddleCV/metric_learning/models/__init__.py
similarity index 100%
rename from fluid/PaddleCV/metric_learning/models/__init__.py
rename to PaddleCV/metric_learning/models/__init__.py
diff --git a/fluid/PaddleCV/metric_learning/models/resnet_embedding.py b/PaddleCV/metric_learning/models/resnet_embedding.py
similarity index 100%
rename from fluid/PaddleCV/metric_learning/models/resnet_embedding.py
rename to PaddleCV/metric_learning/models/resnet_embedding.py
diff --git a/PaddleCV/metric_learning/reader.py b/PaddleCV/metric_learning/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..ac8f257ecbadbb08454ffc616b1c587455cc92b6
--- /dev/null
+++ b/PaddleCV/metric_learning/reader.py
@@ -0,0 +1,189 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import os
+import math
+import random
+import functools
+import numpy as np
+import paddle
+from imgtool import process_image
+
+random.seed(0)
+
+DATA_DIR = "./data/Stanford_Online_Products/"
+TRAIN_LIST = './data/Stanford_Online_Products/Ebay_train.txt'
+VAL_LIST = './data/Stanford_Online_Products/Ebay_test.txt'
+
+
+def init_sop(mode):
+ if mode == 'train':
+ train_data = {}
+ train_image_list = []
+ train_list = open(TRAIN_LIST, "r").readlines()
+ for i, item in enumerate(train_list):
+ items = item.strip().split()
+ if items[0] == 'image_id':
+ continue
+ path = items[3]
+ label = int(items[1]) - 1
+ train_image_list.append((path, label))
+ if label not in train_data:
+ train_data[label] = []
+ train_data[label].append(path)
+ random.shuffle(train_image_list)
+ print("{} dataset size: {}".format(mode, len(train_data)))
+ return train_data, train_image_list
+ else:
+ val_data = {}
+ val_image_list = []
+ test_image_list = []
+ val_list = open(VAL_LIST, "r").readlines()
+ for i, item in enumerate(val_list):
+ items = item.strip().split()
+ if items[0] == 'image_id':
+ continue
+ path = items[3]
+ label = int(items[1])
+ val_image_list.append((path, label))
+ test_image_list.append(path)
+ if label not in val_data:
+ val_data[label] = []
+ val_data[label].append(path)
+ print("{} dataset size: {}".format(mode, len(val_data)))
+ if mode == 'val':
+ return val_data, val_image_list
+ else:
+ return test_image_list
+
+def common_iterator(data, settings):
+ batch_size = settings.train_batch_size
+ samples_each_class = settings.samples_each_class
+ assert (batch_size % samples_each_class == 0)
+ class_num = batch_size // samples_each_class
+ def train_iterator():
+ count = 0
+ labs = list(data.keys())
+ lab_num = len(labs)
+ ind = list(range(0, lab_num))
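+        # each pass picks class_num random classes and yields samples_each_class images per class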
+ while True:
+ random.shuffle(ind)
+ ind_sample = ind[:class_num]
+ for ind_i in ind_sample:
+ lab = labs[ind_i]
+ data_list = data[lab]
+ data_ind = list(range(0, len(data_list)))
+ random.shuffle(data_ind)
+ anchor_ind = data_ind[:samples_each_class]
+
+ for anchor_ind_i in anchor_ind:
+ anchor_path = DATA_DIR + data_list[anchor_ind_i]
+ yield anchor_path, lab
+ count += 1
+ if count >= settings.total_iter_num + 1:
+ return
+
+ return train_iterator
+
+def triplet_iterator(data, settings):
+ batch_size = settings.train_batch_size
+ assert (batch_size % 3 == 0)
+ def train_iterator():
+ total_count = settings.train_batch_size * (settings.total_iter_num + 1)
+ count = 0
+ labs = list(data.keys())
+ lab_num = len(labs)
+ ind = list(range(0, lab_num))
+ while True:
+ random.shuffle(ind)
+ ind_pos, ind_neg = ind[:2]
+ lab_pos = labs[ind_pos]
+ pos_data_list = data[lab_pos]
+ data_ind = list(range(0, len(pos_data_list)))
+ random.shuffle(data_ind)
+ anchor_ind, pos_ind = data_ind[:2]
+
+ lab_neg = labs[ind_neg]
+ neg_data_list = data[lab_neg]
+ neg_ind = random.randint(0, len(neg_data_list) - 1)
+
+ anchor_path = DATA_DIR + pos_data_list[anchor_ind]
+ yield anchor_path, lab_pos
+ pos_path = DATA_DIR + pos_data_list[pos_ind]
+ yield pos_path, lab_pos
+ neg_path = DATA_DIR + neg_data_list[neg_ind]
+ yield neg_path, lab_neg
+ count += 3
+ if count >= total_count:
+ return
+
+ return train_iterator
+
+def arcmargin_iterator(data, settings):
+ def train_iterator():
+ total_count = settings.train_batch_size * (settings.total_iter_num + 1)
+ count = 0
+ while True:
+ for items in data:
+ path, label = items
+ path = DATA_DIR + path
+ yield path, label
+ count += 1
+ if count >= total_count:
+ return
+ return train_iterator
+
+def image_iterator(data, mode):
+ def val_iterator():
+ for items in data:
+ path, label = items
+ path = DATA_DIR + path
+ yield path, label
+ def test_iterator():
+ for item in data:
+ path = item
+ path = DATA_DIR + path
+ yield [path]
+ if mode == 'val':
+ return val_iterator
+ else:
+ return test_iterator
+
+def createreader(settings, mode):
+ def metric_reader():
+ if mode == 'train':
+ train_data, train_image_list = init_sop('train')
+ loss_name = settings.loss_name
+ if loss_name in ["softmax", "arcmargin"]:
+ return arcmargin_iterator(train_image_list, settings)()
+ elif loss_name == 'triplet':
+ return triplet_iterator(train_data, settings)()
+ else:
+ return common_iterator(train_data, settings)()
+ elif mode == 'val':
+ val_data, val_image_list = init_sop('val')
+ return image_iterator(val_image_list, 'val')()
+ else:
+ test_image_list = init_sop('test')
+ return image_iterator(test_image_list, 'test')()
+
+ image_shape = settings.image_shape.split(',')
+ assert(image_shape[1] == image_shape[2])
+ image_size = int(image_shape[2])
+    # pair-based losses rely on the sampling order produced by the reader,
+    # so keep the reader output ordered only for them
+    keep_order = mode == 'train' and settings.loss_name not in ['softmax', 'arcmargin']
+ image_mapper = functools.partial(process_image,
+ mode=mode, color_jitter=False, rotate=False, crop_size=image_size)
+ reader = paddle.reader.xmap_readers(
+ image_mapper, metric_reader, 8, 1000, order=keep_order)
+ return reader
+
+
+def train(settings):
+ return createreader(settings, "train")
+
+def test(settings):
+ return createreader(settings, "val")
+
+def infer(settings):
+ return createreader(settings, "test")
diff --git a/PaddleCV/metric_learning/train_elem.py b/PaddleCV/metric_learning/train_elem.py
new file mode 100644
index 0000000000000000000000000000000000000000..7a4feb236b20e56a13907251e6d9f5fb86533ef7
--- /dev/null
+++ b/PaddleCV/metric_learning/train_elem.py
@@ -0,0 +1,290 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import os
+import sys
+import math
+import time
+import logging
+import argparse
+import functools
+import threading
+import subprocess
+import numpy as np
+import paddle
+import paddle.fluid as fluid
+import models
+import reader
+from losses import SoftmaxLoss
+from losses import ArcMarginLoss
+from utility import add_arguments, print_arguments
+from utility import fmt_time, recall_topk, get_gpu_num
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('model', str, "ResNet50", "Set the network to use.")
+add_arg('embedding_size', int, 0, "Embedding size.")
+add_arg('train_batch_size', int, 256, "Minibatch size.")
+add_arg('test_batch_size', int, 50, "Test minibatch size.")
+add_arg('image_shape', str, "3,224,224", "input image size")
+add_arg('class_dim', int, 11318 , "Class number.")
+add_arg('lr', float, 0.01, "set learning rate.")
+add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
+add_arg('lr_steps', str, "15000,25000", "step of lr")
+add_arg('total_iter_num', int, 30000, "total_iter_num")
+add_arg('display_iter_step', int, 10, "display_iter_step.")
+add_arg('test_iter_step', int, 1000, "test_iter_step.")
+add_arg('save_iter_step', int, 1000, "save_iter_step.")
+add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
+add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
+add_arg('checkpoint', str, None, "Whether to resume checkpoint.")
+add_arg('model_save_dir', str, "output", "model save directory")
+add_arg('loss_name', str, "softmax", "Set the loss type to use.")
+add_arg('arc_scale', float, 80.0, "arc scale.")
+add_arg('arc_margin', float, 0.15, "arc margin.")
+add_arg('arc_easy_margin', bool, False, "arc easy margin.")
+add_arg('enable_ce', bool, False, "If set True, enable continuous evaluation job.")
+# yapf: enable
+
+model_list = [m for m in dir(models) if "__" not in m]
+
+def optimizer_setting(params):
+ ls = params["learning_strategy"]
+ assert ls["name"] == "piecewise_decay", \
+ "learning rate strategy must be {}, \
+ but got {}".format("piecewise_decay", lr["name"])
+
+ bd = [int(e) for e in ls["lr_steps"].split(',')]
+ base_lr = params["lr"]
+ lr = [base_lr * (0.1 ** i) for i in range(len(bd) + 1)]
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=fluid.layers.piecewise_decay(
+ boundaries=bd, values=lr),
+ momentum=0.9,
+ regularization=fluid.regularizer.L2Decay(1e-4))
+ return optimizer
+
+
+def net_config(image, label, model, args, is_train):
+ assert args.model in model_list, "{} is not in lists: {}".format(
+ args.model, model_list)
+
+ out = model.net(input=image, embedding_size=args.embedding_size)
+ if not is_train:
+ return None, None, None, out
+
+ if args.loss_name == "softmax":
+ metricloss = SoftmaxLoss(
+ class_dim=args.class_dim,
+ )
+ elif args.loss_name == "arcmargin":
+ metricloss = ArcMarginLoss(
+ class_dim = args.class_dim,
+ margin = args.arc_margin,
+ scale = args.arc_scale,
+ easy_margin = args.arc_easy_margin,
+ )
+ cost, logit = metricloss.loss(out, label)
+ avg_cost = fluid.layers.mean(x=cost)
+ acc_top1 = fluid.layers.accuracy(input=logit, label=label, k=1)
+ acc_top5 = fluid.layers.accuracy(input=logit, label=label, k=5)
+ return avg_cost, acc_top1, acc_top5, out
+
+def build_program(is_train, main_prog, startup_prog, args):
+ image_shape = [int(m) for m in args.image_shape.split(",")]
+ model = models.__dict__[args.model]()
+ with fluid.program_guard(main_prog, startup_prog):
+ if is_train:
+ queue_capacity = 64
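+            # py_reader feeds batches asynchronously through a queue of
+            # queue_capacity elements, overlapping data loading with training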
+ py_reader = fluid.layers.py_reader(
+ capacity=queue_capacity,
+ shapes=[[-1] + image_shape, [-1, 1]],
+ lod_levels=[0, 0],
+ dtypes=["float32", "int64"],
+ use_double_buffer=True)
+ image, label = fluid.layers.read_file(py_reader)
+ else:
+ image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
+ label = fluid.layers.data(name='label', shape=[1], dtype='int64')
+
+ with fluid.unique_name.guard():
+ avg_cost, acc_top1, acc_top5, out = net_config(image, label, model, args, is_train)
+ if is_train:
+ params = model.params
+ params["lr"] = args.lr
+ params["learning_strategy"]["lr_steps"] = args.lr_steps
+ params["learning_strategy"]["name"] = args.lr_strategy
+ optimizer = optimizer_setting(params)
+ optimizer.minimize(avg_cost)
+ global_lr = optimizer._global_learning_rate()
+ """
+ if not is_train:
+ main_prog = main_prog.clone(for_test=True)
+ """
+ if is_train:
+ return py_reader, avg_cost, acc_top1, acc_top5, global_lr
+ else:
+ return out, image, label
+
+
+def train_async(args):
+ # parameters from arguments
+
+ logging.debug('enter train')
+ model_name = args.model
+ checkpoint = args.checkpoint
+ pretrained_model = args.pretrained_model
+ model_save_dir = args.model_save_dir
+
+ startup_prog = fluid.Program()
+ train_prog = fluid.Program()
+ tmp_prog = fluid.Program()
+
+ if args.enable_ce:
+ assert args.model == "ResNet50"
+ assert args.loss_name == "arcmargin"
+ np.random.seed(0)
+ startup_prog.random_seed = 1000
+ train_prog.random_seed = 1000
+ tmp_prog.random_seed = 1000
+
+ train_py_reader, train_cost, train_acc1, train_acc5, global_lr = build_program(
+ is_train=True,
+ main_prog=train_prog,
+ startup_prog=startup_prog,
+ args=args)
+ test_feas, image, label = build_program(
+ is_train=False,
+ main_prog=tmp_prog,
+ startup_prog=startup_prog,
+ args=args)
+ test_prog = tmp_prog.clone(for_test=True)
+
+ train_fetch_list = [global_lr.name, train_cost.name, train_acc1.name, train_acc5.name]
+ test_fetch_list = [test_feas.name]
+
+ if args.with_mem_opt:
+ fluid.memory_optimize(train_prog, skip_opt_set=set(train_fetch_list))
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+
+ exe.run(startup_prog)
+
+ logging.debug('after run startup program')
+
+ if checkpoint is not None:
+ fluid.io.load_persistables(exe, checkpoint, main_program=train_prog)
+
+ if pretrained_model:
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(pretrained_model, var.name))
+
+ fluid.io.load_vars(
+ exe, pretrained_model, main_program=train_prog, predicate=if_exist)
+
+ devicenum = get_gpu_num()
+ assert (args.train_batch_size % devicenum) == 0
+ train_batch_size = args.train_batch_size // devicenum
+ test_batch_size = args.test_batch_size
+
+ train_reader = paddle.batch(reader.train(args), batch_size=train_batch_size, drop_last=True)
+ test_reader = paddle.batch(reader.test(args), batch_size=test_batch_size, drop_last=False)
+ test_feeder = fluid.DataFeeder(place=place, feed_list=[image, label])
+ train_py_reader.decorate_paddle_reader(train_reader)
+
+ train_exe = fluid.ParallelExecutor(
+ main_program=train_prog,
+ use_cuda=args.use_gpu,
+ loss_name=train_cost.name)
+
+ totalruntime = 0
+ train_py_reader.start()
+ iter_no = 0
+ train_info = [0, 0, 0, 0]
+ while iter_no <= args.total_iter_num:
+ t1 = time.time()
+ lr, loss, acc1, acc5 = train_exe.run(fetch_list=train_fetch_list)
+ t2 = time.time()
+ period = t2 - t1
+ lr = np.mean(np.array(lr))
+ train_info[0] += np.mean(np.array(loss))
+ train_info[1] += np.mean(np.array(acc1))
+ train_info[2] += np.mean(np.array(acc5))
+ train_info[3] += 1
+ if iter_no % args.display_iter_step == 0:
+ avgruntime = totalruntime / args.display_iter_step
+ avg_loss = train_info[0] / train_info[3]
+ avg_acc1 = train_info[1] / train_info[3]
+ avg_acc5 = train_info[2] / train_info[3]
+ print("[%s] trainbatch %d, lr %.6f, loss %.6f, "\
+ "acc1 %.4f, acc5 %.4f, time %2.2f sec" % \
+ (fmt_time(), iter_no, lr, avg_loss, avg_acc1, avg_acc5, avgruntime))
+ sys.stdout.flush()
+ totalruntime = 0
+ if iter_no % 1000 == 0:
+ train_info = [0, 0, 0, 0]
+
+ totalruntime += period
+
+ if iter_no % args.test_iter_step == 0 and iter_no != 0:
+ f, l = [], []
+ for batch_id, data in enumerate(test_reader()):
+ t1 = time.time()
+ [feas] = exe.run(test_prog, fetch_list = test_fetch_list, feed=test_feeder.feed(data))
+ label = np.asarray([x[1] for x in data])
+ f.append(feas)
+ l.append(label)
+
+ t2 = time.time()
+ period = t2 - t1
+ if batch_id % 20 == 0:
+ print("[%s] testbatch %d, time %2.2f sec" % \
+ (fmt_time(), batch_id, period))
+
+ f = np.vstack(f)
+ l = np.hstack(l)
+ recall = recall_topk(f, l, k=1)
+ print("[%s] test_img_num %d, trainbatch %d, test_recall %.5f" % \
+ (fmt_time(), len(f), iter_no, recall))
+ sys.stdout.flush()
+
+ if iter_no % args.save_iter_step == 0 and iter_no != 0:
+            model_path = os.path.join(model_save_dir, model_name, str(iter_no))
+ if not os.path.isdir(model_path):
+ os.makedirs(model_path)
+ fluid.io.save_persistables(exe, model_path, main_program=train_prog)
+
+ iter_no += 1
+
+ # This is for continuous evaluation only
+ if args.enable_ce:
+ # Use the mean cost/acc for training
+ print("kpis train_cost %s" % (avg_loss))
+ print("kpis test_recall %s" % (recall))
+
+
+def initlogging():
+ for handler in logging.root.handlers[:]:
+ logging.root.removeHandler(handler)
+ loglevel = logging.DEBUG
+ logging.basicConfig(
+ level=loglevel,
+ # logger.BASIC_FORMAT,
+ format=
+ "%(levelname)s:%(filename)s[%(lineno)s] %(name)s:%(funcName)s->%(message)s",
+ datefmt='%a, %d %b %Y %H:%M:%S')
+
+def main():
+ args = parser.parse_args()
+ print_arguments(args)
+ train_async(args)
+
+
+if __name__ == '__main__':
+ main()
diff --git a/PaddleCV/metric_learning/train_pair.py b/PaddleCV/metric_learning/train_pair.py
new file mode 100644
index 0000000000000000000000000000000000000000..0ae7b5a241869672d1c0d52a4c6fa30b6cbce860
--- /dev/null
+++ b/PaddleCV/metric_learning/train_pair.py
@@ -0,0 +1,282 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import os
+import sys
+import math
+import time
+import logging
+import argparse
+import functools
+import threading
+import subprocess
+import numpy as np
+import paddle
+import paddle.fluid as fluid
+import models
+import reader
+from losses import TripletLoss
+from losses import QuadrupletLoss
+from losses import EmlLoss
+from losses import NpairsLoss
+from utility import add_arguments, print_arguments
+from utility import fmt_time, recall_topk, get_gpu_num
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('model', str, "ResNet50", "Set the network to use.")
+add_arg('embedding_size', int, 0, "Embedding size.")
+add_arg('train_batch_size', int, 120, "Minibatch size.")
+add_arg('test_batch_size', int, 50, "Test minibatch size.")
+add_arg('image_shape', str, "3,224,224", "input image size")
+add_arg('class_dim', int, 11318, "Class number.")
+add_arg('lr', float, 0.0001, "set learning rate.")
+add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
+add_arg('lr_steps', str, "100000", "step of lr")
+add_arg('total_iter_num', int, 100000, "total_iter_num")
+add_arg('display_iter_step', int, 10, "display_iter_step.")
+add_arg('test_iter_step', int, 5000, "test_iter_step.")
+add_arg('save_iter_step', int, 5000, "save_iter_step.")
+add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
+add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
+add_arg('checkpoint', str, None, "Whether to resume checkpoint.")
+add_arg('model_save_dir', str, "output", "model save directory")
+add_arg('loss_name', str, "triplet", "Set the loss type to use.")
+add_arg('samples_each_class', int, 2, "samples_each_class.")
+add_arg('margin', float, 0.1, "margin.")
+add_arg('npairs_reg_lambda', float, 0.01, "npairs reg lambda.")
+# yapf: enable
+
+model_list = [m for m in dir(models) if "__" not in m]
+
+def optimizer_setting(params):
+ ls = params["learning_strategy"]
+ assert ls["name"] == "piecewise_decay", \
+ "learning rate strategy must be {}, \
+ but got {}".format("piecewise_decay", lr["name"])
+
+ bd = [int(e) for e in ls["lr_steps"].split(',')]
+ base_lr = params["lr"]
+ lr = [base_lr * (0.1 ** i) for i in range(len(bd) + 1)]
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=fluid.layers.piecewise_decay(
+ boundaries=bd, values=lr),
+ momentum=0.9,
+ regularization=fluid.regularizer.L2Decay(1e-4))
+ return optimizer
+
+
+def net_config(image, label, model, args, is_train):
+ assert args.model in model_list, "{} is not in lists: {}".format(
+ args.model, model_list)
+
+ out = model.net(input=image, embedding_size=args.embedding_size)
+ if not is_train:
+ return None, out
+
+ if args.loss_name == "triplet":
+ metricloss = TripletLoss(
+ margin=args.margin,
+ )
+ elif args.loss_name == "quadruplet":
+ metricloss = QuadrupletLoss(
+ train_batch_size = args.train_batch_size,
+ samples_each_class = args.samples_each_class,
+ margin=args.margin,
+ )
+ elif args.loss_name == "eml":
+ metricloss = EmlLoss(
+ train_batch_size = args.train_batch_size,
+ samples_each_class = args.samples_each_class,
+ )
+ elif args.loss_name == "npairs":
+ metricloss = NpairsLoss(
+ train_batch_size = args.train_batch_size,
+ samples_each_class = args.samples_each_class,
+ reg_lambda = args.npairs_reg_lambda,
+ )
+ cost = metricloss.loss(out, label)
+ avg_cost = fluid.layers.mean(x=cost)
+ return avg_cost, out
+
+def build_program(is_train, main_prog, startup_prog, args):
+ image_shape = [int(m) for m in args.image_shape.split(",")]
+ model = models.__dict__[args.model]()
+ with fluid.program_guard(main_prog, startup_prog):
+ if is_train:
+ queue_capacity = 64
+ py_reader = fluid.layers.py_reader(
+ capacity=queue_capacity,
+ shapes=[[-1] + image_shape, [-1, 1]],
+ lod_levels=[0, 0],
+ dtypes=["float32", "int64"],
+ use_double_buffer=True)
+ image, label = fluid.layers.read_file(py_reader)
+ else:
+ image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
+ label = fluid.layers.data(name='label', shape=[1], dtype='int64')
+
+ with fluid.unique_name.guard():
+ avg_cost, out = net_config(image, label, model, args, is_train)
+ if is_train:
+ params = model.params
+ params["lr"] = args.lr
+ params["learning_strategy"]["lr_steps"] = args.lr_steps
+ params["learning_strategy"]["name"] = args.lr_strategy
+ optimizer = optimizer_setting(params)
+ optimizer.minimize(avg_cost)
+ global_lr = optimizer._global_learning_rate()
+ """
+ if not is_train:
+ main_prog = main_prog.clone(for_test=True)
+ """
+ if is_train:
+ return py_reader, avg_cost, global_lr, out, label
+ else:
+ return out, image, label
+
+
+def train_async(args):
+ # parameters from arguments
+
+ logging.debug('enter train')
+ model_name = args.model
+ checkpoint = args.checkpoint
+ pretrained_model = args.pretrained_model
+ model_save_dir = args.model_save_dir
+
+ startup_prog = fluid.Program()
+ train_prog = fluid.Program()
+ tmp_prog = fluid.Program()
+
+ train_py_reader, train_cost, global_lr, train_feas, train_label = build_program(
+ is_train=True,
+ main_prog=train_prog,
+ startup_prog=startup_prog,
+ args=args)
+ test_feas, image, label = build_program(
+ is_train=False,
+ main_prog=tmp_prog,
+ startup_prog=startup_prog,
+ args=args)
+ test_prog = tmp_prog.clone(for_test=True)
+
+ train_fetch_list = [global_lr.name, train_cost.name, train_feas.name, train_label.name]
+ test_fetch_list = [test_feas.name]
+
+ if args.with_mem_opt:
+ fluid.memory_optimize(train_prog, skip_opt_set=set(train_fetch_list))
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+
+ exe.run(startup_prog)
+
+ logging.debug('after run startup program')
+
+ if checkpoint is not None:
+ fluid.io.load_persistables(exe, checkpoint, main_program=train_prog)
+
+ if pretrained_model:
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(pretrained_model, var.name))
+
+ fluid.io.load_vars(
+ exe, pretrained_model, main_program=train_prog, predicate=if_exist)
+
+ devicenum = get_gpu_num()
+ assert (args.train_batch_size % devicenum) == 0
+    train_batch_size = args.train_batch_size // devicenum
+ test_batch_size = args.test_batch_size
+
+ train_reader = paddle.batch(reader.train(args), batch_size=train_batch_size, drop_last=True)
+ test_reader = paddle.batch(reader.test(args), batch_size=test_batch_size, drop_last=False)
+ test_feeder = fluid.DataFeeder(place=place, feed_list=[image, label])
+ train_py_reader.decorate_paddle_reader(train_reader)
+
+ train_exe = fluid.ParallelExecutor(
+ main_program=train_prog,
+ use_cuda=args.use_gpu,
+ loss_name=train_cost.name)
+
+ totalruntime = 0
+ train_py_reader.start()
+ iter_no = 0
+ train_info = [0, 0, 0]
+ while iter_no <= args.total_iter_num:
+ t1 = time.time()
+ lr, loss, feas, label = train_exe.run(fetch_list=train_fetch_list)
+ t2 = time.time()
+ period = t2 - t1
+ lr = np.mean(np.array(lr))
+ train_info[0] += np.mean(np.array(loss))
+ train_info[1] += recall_topk(feas, label, k=1)
+ train_info[2] += 1
+ if iter_no % args.display_iter_step == 0:
+ avgruntime = totalruntime / args.display_iter_step
+ avg_loss = train_info[0] / train_info[2]
+ avg_recall = train_info[1] / train_info[2]
+ print("[%s] trainbatch %d, lr %.6f, loss %.6f, "\
+ "recall %.4f, time %2.2f sec" % \
+ (fmt_time(), iter_no, lr, avg_loss, avg_recall, avgruntime))
+ sys.stdout.flush()
+ totalruntime = 0
+ if iter_no % 1000 == 0:
+ train_info = [0, 0, 0]
+
+ totalruntime += period
+
+ if iter_no % args.test_iter_step == 0 and iter_no != 0:
+ f, l = [], []
+ for batch_id, data in enumerate(test_reader()):
+ t1 = time.time()
+ [feas] = exe.run(test_prog, fetch_list = test_fetch_list, feed=test_feeder.feed(data))
+ label = np.asarray([x[1] for x in data])
+ f.append(feas)
+ l.append(label)
+
+ t2 = time.time()
+ period = t2 - t1
+ if batch_id % 20 == 0:
+ print("[%s] testbatch %d, time %2.2f sec" % \
+ (fmt_time(), batch_id, period))
+
+ f = np.vstack(f)
+ l = np.hstack(l)
+ recall = recall_topk(f, l, k=1)
+ print("[%s] test_img_num %d, trainbatch %d, test_recall %.5f" % \
+ (fmt_time(), len(f), iter_no, recall))
+ sys.stdout.flush()
+
+ if iter_no % args.save_iter_step == 0 and iter_no != 0:
+            model_path = os.path.join(model_save_dir, model_name, str(iter_no))
+ if not os.path.isdir(model_path):
+ os.makedirs(model_path)
+ fluid.io.save_persistables(exe, model_path, main_program=train_prog)
+
+ iter_no += 1
+
+def initlogging():
+ for handler in logging.root.handlers[:]:
+ logging.root.removeHandler(handler)
+ loglevel = logging.DEBUG
+ logging.basicConfig(
+ level=loglevel,
+ # logger.BASIC_FORMAT,
+ format=
+ "%(levelname)s:%(filename)s[%(lineno)s] %(name)s:%(funcName)s->%(message)s",
+ datefmt='%a, %d %b %Y %H:%M:%S')
+
+def main():
+ args = parser.parse_args()
+ print_arguments(args)
+ train_async(args)
+
+
+if __name__ == '__main__':
+ main()
diff --git a/fluid/PaddleCV/metric_learning/utility.py b/PaddleCV/metric_learning/utility.py
similarity index 100%
rename from fluid/PaddleCV/metric_learning/utility.py
rename to PaddleCV/metric_learning/utility.py
diff --git a/fluid/PaddleCV/object_detection/.gitignore b/PaddleCV/object_detection/.gitignore
similarity index 100%
rename from fluid/PaddleCV/object_detection/.gitignore
rename to PaddleCV/object_detection/.gitignore
diff --git a/fluid/PaddleCV/object_detection/.run_ce.sh b/PaddleCV/object_detection/.run_ce.sh
similarity index 100%
rename from fluid/PaddleCV/object_detection/.run_ce.sh
rename to PaddleCV/object_detection/.run_ce.sh
diff --git a/PaddleCV/object_detection/README.md b/PaddleCV/object_detection/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..2466ba96577c7cb1e2bb335a0b8b5c74edbb92fd
--- /dev/null
+++ b/PaddleCV/object_detection/README.md
@@ -0,0 +1,96 @@
+## SSD Object Detection
+
+## Table of Contents
+- [Introduction](#introduction)
+- [Data Preparation](#data-preparation)
+- [Train](#train)
+- [Evaluate](#evaluate)
+- [Infer and Visualize](#infer-and-visualize)
+- [Released Model](#released-model)
+
+### Introduction
+
+The [Single Shot MultiBox Detector (SSD)](https://arxiv.org/abs/1512.02325) framework for object detection can be categorized as a single-stage detector. A single-stage detector casts object detection as a regression problem, directly predicting the bounding boxes and class probabilities without region proposals. SSD further improves on this by producing predictions of different scales from different layers, as shown below. Predictions are made at six levels, on feature maps of six different scales. Each feature map is processed by two 3x3 convolutional layers, which predict the category and the shape offset relative to the prior box (also called anchor), respectively. Thus, we get 38x38x4 + 19x19x6 + 10x10x6 + 5x5x6 + 3x3x4 + 1x1x4 = 8732 detections per class.
+
+
+The Single Shot MultiBox Detector (SSD)
+
+
+SSD is readily pluggable into a wide variety of standard convolutional networks, such as VGG, ResNet, or MobileNet, which serve as the base network, or backbone. In this tutorial we use [MobileNet](https://arxiv.org/abs/1704.04861).
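+
+As a quick check of the prior-box arithmetic in the Introduction above (illustrative only, not part of the model code; the per-cell counts are those quoted there), the 8732 figure can be reproduced as follows:
+
+```python
+# feature map sizes and prior boxes predicted per cell at each of the six levels
+feature_map_sizes = [38, 19, 10, 5, 3, 1]
+priors_per_cell = [4, 6, 6, 6, 4, 4]
+
+total = sum(s * s * p for s, p in zip(feature_map_sizes, priors_per_cell))
+print(total)  # 8732 detections per class
+```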
+
+
+### Data Preparation
+
+Please download the [PASCAL VOC dataset](http://host.robots.ox.ac.uk/pascal/VOC/) first; skip this step if you already have it.
+
+```bash
+cd data/pascalvoc
+./download.sh
+```
+
+The `download.sh` script will also create the training and testing file lists.
+
+### Train
+
+#### Download the Pre-trained Model.
+
+We provide two pre-trained models. The first is a MobileNet-v1 SSD trained on the COCO dataset, with the COCO-specific convolutional predictors removed. This model can be used to initialize training on other datasets, such as PASCAL VOC. The other is a MobileNet-v1 trained on the ImageNet 2012 dataset, with the weights and bias of the last fully-connected layer removed. Download the MobileNet-v1 SSD:
+
+ ```bash
+ ./pretrained/download_coco.sh
+ ```
+
+Declaration: the MobileNet-v1 SSD model is converted from the [TensorFlow model](https://github.com/tensorflow/models/blob/f87a58cd96d45de73c9a8330a06b2ab56749a7fa/research/object_detection/g3doc/detection_model_zoo.md).
+
+
+#### Train on PASCAL VOC
+
+`train.py` is the main entry point of the training module. An example of usage is shown below.
+ ```bash
+ python -u train.py --batch_size=64 --dataset='pascalvoc' --pretrained_model='pretrained/ssd_mobilenet_v1_coco/'
+ ```
+ - Set ```export CUDA_VISIBLE_DEVICES=0,1``` to specify which GPUs to use.
+ - For more help on arguments:
+
+ ```bash
+ python train.py --help
+ ```
+
+Data reading is defined in `reader.py`. All images are resized to 300x300. In the training stage, images are randomly distorted, expanded, cropped, and flipped:
+ - distort: distort brightness, contrast, saturation, and hue.
+ - expand: paste the original image into a larger canvas initialized with the image mean.
+ - crop: crop the image with respect to different scales, aspect ratios, and overlaps.
+ - flip: flip horizontally.
+
+We train MobileNet-SSD with the RMSProp optimizer and a mini-batch size of 64. The initial learning rate is 0.001 and is decayed at epochs 40, 60, 80, and 100 with multipliers 0.5, 0.25, 0.1, and 0.01, respectively. The weight decay is 0.00005. After 120 epochs we achieve 73.32% mAP under the 11point metric.
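+
+As a sketch, this schedule maps onto `fluid.layers.piecewise_decay` (the API used by the training scripts in this repository) roughly as follows; `steps_per_epoch` is a placeholder that depends on dataset size and batch size (VOC07+12 trainval has ~16.5k images, so roughly 259 steps at batch size 64):
+
+```python
+import paddle.fluid as fluid
+
+steps_per_epoch = 259   # placeholder: ~16.5k images / batch size 64
+base_lr = 0.001
+
+boundaries = [e * steps_per_epoch for e in [40, 60, 80, 100]]
+values = [base_lr * m for m in [1.0, 0.5, 0.25, 0.1, 0.01]]
+learning_rate = fluid.layers.piecewise_decay(boundaries=boundaries, values=values)
+```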
+
+### Evaluate
+
+You can evaluate your trained model with different metrics, such as 11point and integral, on both the PASCAL VOC and COCO datasets. Note that we set the default test list to the dataset's test/val list; you can use your own test list by setting the ```--test_list``` argument.
+
+`eval.py` is the main entry point of the evaluation module. An example of usage is shown below.
+```bash
+python eval.py --dataset='pascalvoc' --model_dir='train_pascal_model/best_model' --data_dir='data/pascalvoc' --test_list='test.txt' --ap_version='11point' --nms_threshold=0.45
+```
+
+### Infer and Visualize
+`infer.py` is the main entry point of the inference module. An example of usage is shown below.
+```bash
+python infer.py --dataset='pascalvoc' --nms_threshold=0.45 --model_dir='train_pascal_model/best_model' --image_path='./data/pascalvoc/VOCdevkit/VOC2007/JPEGImages/009963.jpg'
+```
+Below are examples of running inference and visualizing the model's results.
+
+
+
+
+
+MobileNet-v1-SSD 300x300 Visualization Examples
+
+
+
+### Released Model
+
+
+| Model | Pre-trained Model | Training data | Test data | mAP |
+|:------------------------:|:------------------:|:----------------:|:------------:|:----:|
+|[MobileNet-v1-SSD 300x300](http://paddlemodels.bj.bcebos.com/ssd_mobilenet_v1_pascalvoc.tar.gz) | COCO MobileNet SSD | VOC07+12 trainval| VOC07 test | 73.32% |
diff --git a/PaddleCV/object_detection/README_cn.md b/PaddleCV/object_detection/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..8c4cecab28e49c10820e092d3a521facf4be68ea
--- /dev/null
+++ b/PaddleCV/object_detection/README_cn.md
@@ -0,0 +1,99 @@
+## SSD Object Detection
+
+## Table of Contents
+- [Introduction](#introduction)
+- [Data Preparation](#data-preparation)
+- [Training](#training)
+- [Evaluation](#evaluation)
+- [Inference and Visualization](#inference-and-visualization)
+- [Released Model](#released-model)
+
+### Introduction
+
+[Single Shot MultiBox Detector (SSD)](https://arxiv.org/abs/1512.02325) is a single-stage object detector. Unlike two-stage methods, single-stage detection does not generate region proposals; it regresses the objects' bounding boxes and class probabilities directly from feature maps. SSD applies this single-stage idea and improves on it by detecting objects of matching scales on feature maps of different scales. As shown below, SSD makes predictions at six levels, on feature maps of six scales. At each level, two 3x3 convolutional layers regress the object category and the bounding-box offset, respectively. Therefore, for each class, the six levels of SSD together produce 38x38x4 + 19x19x6 + 10x10x6 + 5x5x6 + 3x3x4 + 1x1x4 = 8732 detections.
+
+
+The SSD object detection model
+
+
+SSD can be conveniently plugged into any standard convolutional network, such as VGG, ResNet, or MobileNet; these networks serve as the detector's base network. In this example we use [MobileNet](https://arxiv.org/abs/1704.04861).
+
+
+### Data Preparation
+
+
+Please download the [PASCAL VOC dataset](http://host.robots.ox.ac.uk/pascal/VOC/) with the following commands first:
+
+```bash
+cd data/pascalvoc
+./download.sh
+```
+
+The `download.sh` script also automatically creates the training and testing file lists.
+
+
+### Training
+
+#### Download the Pre-trained Model
+
+We provide two pre-trained models. The first is a MobileNet-v1 SSD pre-trained on the COCO dataset, with its prediction heads removed so that it can be trained on datasets other than COCO. The second is a MobileNet-v1 pre-trained on the ImageNet 2012 dataset, with its last fully-connected layer removed for object detection training. Download the MobileNet-v1 SSD:
+
+ ```bash
+ ./pretrained/download_coco.sh
+ ```
+
+Declaration: the MobileNet-v1 SSD model is converted from the [TensorFlow model](https://github.com/tensorflow/models/blob/f87a58cd96d45de73c9a8330a06b2ab56749a7fa/research/object_detection/g3doc/detection_model_zoo.md). The MobileNet-v1 model is converted from [Caffe](https://github.com/shicai/MobileNet-Caffe).
+
+
+#### Train
+
+`train.py` is the main entry point of the training module; an example invocation is shown below:
+ ```bash
+ python -u train.py --batch_size=64 --dataset='pascalvoc' --pretrained_model='pretrained/ssd_mobilenet_v1_coco/'
+ ```
+ - Set ```export CUDA_VISIBLE_DEVICES=0,1``` to specify which GPUs to use.
+ - For more optional arguments, see:
+
+ ```bash
+ python train.py --help
+ ```
+
+Data reading is defined in `reader.py`; all images are resized to 300x300. During training, images are also augmented with random distortion, expansion, flipping, and cropping:
+ - distort: perturb the image's brightness, contrast, saturation, and hue.
+ - expand: paste the original image into a larger canvas filled with the pixel mean (which is subtracted later in the mean-subtraction step), then crop, resize, and flip the result.
+ - flip: flip horizontally.
+ - crop: generate candidate boxes from scale and aspect-ratio parameters, then select crops whose intersection-over-union (IoU) with the ground-truth boxes meets the requirement.
+
+We train MobileNet-SSD with the RMSProp optimizer, a batch size of 64, and a weight decay of 0.00005. The initial learning rate is 0.001, decayed with multipliers 0.5, 0.25, 0.1, and 0.01 at epochs 40, 60, 80, and 100, respectively. After 120 epochs of training, the mAP under the 11point metric is 73.32%.
+
+### Evaluation
+
+You can evaluate the trained model on the PASCAL VOC dataset with metrics such as 11point and integral. Without loss of generality, the sample code defaults to the corresponding dataset's test list; you can specify your own list of test samples by setting ```--test_list```.
+
+`eval.py` is the main entry point of the evaluation module; an example invocation is shown below:
+```bash
+python eval.py --dataset='pascalvoc' --model_dir='train_pascal_model/best_model' --data_dir='data/pascalvoc' --test_list='test.txt' --ap_version='11point' --nms_threshold=0.45
+```
+
+### Inference and Visualization
+
+`infer.py` is the main entry point of the inference and visualization module; an example invocation is shown below:
+```bash
+python infer.py --dataset='pascalvoc' --nms_threshold=0.45 --model_dir='train_pascal_model/best_model' --image_path='./data/pascalvoc/VOCdevkit/VOC2007/JPEGImages/009963.jpg'
+```
+The figure below visualizes the model's predictions:
+
+
+
+
+
+MobileNet-v1-SSD 300x300 inference visualization
+
+
+
+### Released Model
+
+
+| Model | Pre-trained Model | Training data | Test data | mAP |
+|:------------------------:|:------------------:|:----------------:|:------------:|:----:|
+|[MobileNet-v1-SSD 300x300](http://paddlemodels.bj.bcebos.com/ssd_mobilenet_v1_pascalvoc.tar.gz) | COCO MobileNet SSD | VOC07+12 trainval| VOC07 test | 73.32% |
diff --git a/PaddleCV/object_detection/README_quant.md b/PaddleCV/object_detection/README_quant.md
new file mode 100644
index 0000000000000000000000000000000000000000..7ea7f7bd79d21ba34c84d1a1b48a5298837939ac
--- /dev/null
+++ b/PaddleCV/object_detection/README_quant.md
@@ -0,0 +1,146 @@
+## Quantization-aware training for SSD
+
+### Introduction
+
+The quantization-aware training used in these experiments is introduced in the [fixed-point quantization design](https://github.com/PaddlePaddle/FluidDoc/blob/develop/doc/fluid/design/quantization/fixed_point_quantization.md). Since quantization-aware training is still an active area of research and experimentation, here we just give a simple quantization-aware training recipe in Fluid based on the MobileNet-SSD model. More experiments are still needed, for example quantization-aware training that considers fusing batch normalization and convolution/fully-connected layers, channel-wise quantization of weights, and so on.
+
+
+A Python transpiler is used to rewrite Fluid training program or evaluation program for quantization-aware training:
+
+```python
+
+ #startup_prog = fluid.Program()
+ #train_prog = fluid.Program()
+ #loss = build_program(
+ # main_prog=train_prog,
+ # startup_prog=startup_prog,
+ # is_train=True)
+ #build_program(
+ # main_prog=test_prog,
+ # startup_prog=startup_prog,
+ # is_train=False)
+ #test_prog = test_prog.clone(for_test=True)
+    # above is pseudo code
+
+ transpiler = fluid.contrib.QuantizeTranspiler(
+ weight_bits=8,
+ activation_bits=8,
+ activation_quantize_type='abs_max', # or 'range_abs_max'
+ weight_quantize_type='abs_max')
+    # note: transpiler.training_transpile rewrites train_prog in place;
+    # startup_prog is needed because the transpiler inserts and initializes
+    # some state variables
+ transpiler.training_transpile(train_prog, startup_prog)
+ transpiler.training_transpile(test_prog, startup_prog)
+```
+
+ According to the above design, this transpiler inserts fake quantization and de-quantization operations for each convolution operation (including depthwise convolution) and each fully-connected operation. These quantizations take effect on weights and activations.
+
+ In the design, we introduce dynamic and static quantization strategies for activations. In the experiments, setting `activation_quantize_type` to `abs_max` selects dynamic quantization: the quantization scale (the maximum of absolute values) of each activation is calculated per mini-batch during inference. Setting `activation_quantize_type` to `range_abs_max` selects static quantization: a quantization scale for the inference period is calculated during training. The following part introduces how to train.
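+
+ To make the `abs_max` scheme concrete, here is a minimal numpy sketch of symmetric 8-bit fake quantization (illustrative only; the real operators are inserted by the transpiler):
+
+ ```python
+ import numpy as np
+
+ def fake_quant_abs_max(x, bits=8):
+     # dynamic quantization: the scale is recomputed from the current tensor
+     scale = np.abs(x).max()
+     qmax = 2 ** (bits - 1) - 1        # 127 for 8 bits
+     q = np.round(x / scale * qmax)    # integers in [-qmax, qmax]
+     return q / qmax * scale           # de-quantize back to float
+
+ x = np.array([0.3, -1.2, 0.05])
+ print(fake_quant_abs_max(x))          # close to x, up to quantization error
+ ```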
+
+### Quantization-aware training
+
+ The training fine-tunes the well-trained MobileNet-SSD model, so download that model first:
+
+ ```
+ wget http://paddlemodels.bj.bcebos.com/ssd_mobilenet_v1_pascalvoc.tar.gz
+ ```
+
+- dynamic quantization:
+
+  ```bash
+ python main_quant.py \
+ --data_dir=$PascalVOC_DIR$ \
+ --mode='train' \
+ --init_model=ssd_mobilenet_v1_pascalvoc \
+ --act_quant_type='abs_max' \
+ --epoc_num=20 \
+ --learning_rate=0.0001 \
+ --batch_size=64 \
+ --model_save_dir=$OUTPUT_DIR$
+ ```
+  Since we fine-tune a well-trained model, we use a small initial learning rate of 0.0001 and train for 20 epochs.
+
+- static quantization:
+  ```bash
+ python main_quant.py \
+ --data_dir=$PascalVOC_DIR$ \
+ --mode='train' \
+ --init_model=ssd_mobilenet_v1_pascalvoc \
+ --act_quant_type='range_abs_max' \
+ --epoc_num=80 \
+ --learning_rate=0.001 \
+ --lr_epochs=30,60 \
+ --lr_decay_rates=1,0.1,0.01 \
+ --batch_size=64 \
+ --model_save_dir=$OUTPUT_DIR$
+ ```
+  Here we train for 80 epochs; the learning rate decays by a factor of 0.1 at epochs 30 and 60. Users can adjust these hyper-parameters.
+
+### Convert to inference model
+
+ As described in the design documentation, the inference graph differs slightly from the training graph: the de-quantization operation is moved to the other side of the conv/fc operation. During training the two orders are equivalent, because conv/fc and de-quantization are both linear operations and therefore commute. For inference, however, the graph needs to be converted, and `fluid.contrib.QuantizeTranspiler.freeze_program` is used to do this:
+
+ ```python
+ #startup_prog = fluid.Program()
+ #test_prog = fluid.Program()
+ #test_py_reader, map_eval, nmsed_out, image = build_program(
+ # main_prog=test_prog,
+ # startup_prog=startup_prog,
+ # train_params=configs,
+ # is_train=False)
+ #test_prog = test_prog.clone(for_test=True)
+ #transpiler = fluid.contrib.QuantizeTranspiler(weight_bits=8,
+ # activation_bits=8,
+ # activation_quantize_type=act_quant_type,
+ # weight_quantize_type='abs_max')
+ #transpiler.training_transpile(test_prog, startup_prog)
+ #place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
+ #exe = fluid.Executor(place)
+ #exe.run(startup_prog)
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(init_model, var.name))
+ fluid.io.load_vars(exe, init_model, main_program=test_prog,
+ predicate=if_exist)
+    # freeze the rewritten training program;
+    # freeze after loading parameters, since freezing quantizes the weights
+ transpiler.freeze_program(test_prog, place)
+ ```
+
+ Users can evaluate the converted model by:
+
+ ```
+ python main_quant.py \
+ --data_dir=$PascalVOC_DIR$ \
+ --mode='test' \
+ --init_model=$MODLE_DIR$ \
+ --model_save_dir=$MobileNet_SSD_8BIT_MODEL$
+ ```
+
+ You can also check the 8-bit model with the inference script:
+
+ ```
+ python main_quant.py \
+ --mode='infer' \
+ --init_model=$MobileNet_SSD_8BIT_MODEL$ \
+ --confs_threshold=0.5 \
+ --image_path='/data/PascalVOC/VOCdevkit/VOC2007/JPEGImages/002271.jpg'
+ ```
+ See 002271.jpg for the visualized image with bounding boxes.
+
+
+ **Note**: if you want to convert the model to true 8-bit weights, call `fluid.contrib.QuantizeTranspiler.convert_to_int8`. However, Paddle currently cannot load an 8-bit model for inference.
+
+### Results
+
+Results of the MobileNet-v1-SSD 300x300 model on the PASCAL VOC dataset.
+
+| Model | mAP |
+|:---------------------------------------:|:------------------:|
+|Floating point: 32bit | 73.32% |
+|Fixed point: 8bit, dynamic quantization | 72.77% |
+|Fixed point: 8bit, static quantization | 72.45% |
+
+ As mentioned above, other experiments are still needed, such as quantization-aware training that fuses batch normalization with convolution/fully-connected layers, channel-wise quantization of weights, and storing quantized weights as uint8 instead of int8.
diff --git a/PaddleCV/object_detection/_ce.py b/PaddleCV/object_detection/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..6f300e162b1c1940a2c8f1463953f0bcbeaa0a78
--- /dev/null
+++ b/PaddleCV/object_detection/_ce.py
@@ -0,0 +1,72 @@
+#### This file is only used for continuous evaluation tests!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi, DurationKpi, AccKpi
+
+#### NOTE: kpi.py should be shared across models in some way!
+
+train_cost_kpi = CostKpi('train_cost', 0.02, 0, actived=True)
+test_acc_kpi = AccKpi('test_acc', 0.01, 0, actived=False)
+train_speed_kpi = DurationKpi('train_speed', 0.1, 0, actived=True, unit_repr="s/epoch")
+train_cost_card4_kpi = CostKpi('train_cost_card4', 0.02, 0, actived=True)
+test_acc_card4_kpi = AccKpi('test_acc_card4', 0.01, 0, actived=False)
+train_speed_card4_kpi = DurationKpi('train_speed_card4', 0.1, 0, actived=True, unit_repr="s/epoch")
+
+tracking_kpis = [
+ train_cost_kpi,
+ test_acc_kpi,
+ train_speed_kpi,
+ train_cost_card4_kpi,
+ test_acc_card4_kpi,
+ train_speed_card4_kpi,
+]
+
+
+def parse_log(log):
+ '''
+ This method should be implemented by model developers.
+
+ The suggestion:
+
+ each line in the log should be key, value, for example:
+
+ "
+ train_cost\t1.0
+ test_cost\t1.0
+ train_cost\t1.0
+ train_cost\t1.0
+ train_acc\t1.2
+ "
+ '''
+ #kpi_map = {}
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ print("-----%s" % fs)
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ #kpi_map[kpi_name] = kpi_value
+ yield kpi_name, kpi_value
+ #return kpi_map
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ print("*****")
+ print(log)
+ print("****")
+ log_to_ce(log)
diff --git a/fluid/PaddleCV/faster_rcnn/dataset/coco/download.sh b/PaddleCV/object_detection/data/coco/download.sh
similarity index 100%
rename from fluid/PaddleCV/faster_rcnn/dataset/coco/download.sh
rename to PaddleCV/object_detection/data/coco/download.sh
diff --git a/fluid/PaddleCV/object_detection/data/pascalvoc/create_list.py b/PaddleCV/object_detection/data/pascalvoc/create_list.py
similarity index 100%
rename from fluid/PaddleCV/object_detection/data/pascalvoc/create_list.py
rename to PaddleCV/object_detection/data/pascalvoc/create_list.py
diff --git a/fluid/PaddleCV/object_detection/data/pascalvoc/download.sh b/PaddleCV/object_detection/data/pascalvoc/download.sh
similarity index 100%
rename from fluid/PaddleCV/object_detection/data/pascalvoc/download.sh
rename to PaddleCV/object_detection/data/pascalvoc/download.sh
diff --git a/fluid/PaddleCV/object_detection/data/pascalvoc/label_list b/PaddleCV/object_detection/data/pascalvoc/label_list
similarity index 100%
rename from fluid/PaddleCV/object_detection/data/pascalvoc/label_list
rename to PaddleCV/object_detection/data/pascalvoc/label_list
diff --git a/PaddleCV/object_detection/eval.py b/PaddleCV/object_detection/eval.py
new file mode 100644
index 0000000000000000000000000000000000000000..157384b04f40ab2e3023fa57269267219b16d62d
--- /dev/null
+++ b/PaddleCV/object_detection/eval.py
@@ -0,0 +1,138 @@
+import os
+import time
+import numpy as np
+import argparse
+import functools
+import math
+
+import paddle
+import paddle.fluid as fluid
+import reader
+from mobilenet_ssd import mobile_net
+from utility import add_arguments, print_arguments
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('dataset', str, 'pascalvoc', "coco2014, coco2017, and pascalvoc.")
+add_arg('batch_size', int, 32, "Minibatch size.")
+add_arg('use_gpu', bool, True, "Whether use GPU.")
+add_arg('data_dir', str, '', "The data root path.")
+add_arg('test_list', str, '', "The testing data lists.")
+add_arg('model_dir', str, '', "The model path.")
+add_arg('nms_threshold', float, 0.45, "NMS threshold.")
+add_arg('ap_version', str, '11point', "integral, 11point.")
+add_arg('resize_h', int, 300, "The resized image height.")
+add_arg('resize_w',         int,   300,        "The resized image width.")
+add_arg('mean_value_B', float, 127.5, "Mean value for B channel which will be subtracted.") #123.68
+add_arg('mean_value_G', float, 127.5, "Mean value for G channel which will be subtracted.") #116.78
+add_arg('mean_value_R', float, 127.5, "Mean value for R channel which will be subtracted.") #103.94
+# yapf: enable
+
+
+def build_program(main_prog, startup_prog, args, data_args):
+ image_shape = [3, data_args.resize_h, data_args.resize_w]
+ if 'coco' in data_args.dataset:
+ num_classes = 91
+ elif 'pascalvoc' in data_args.dataset:
+ num_classes = 21
+
+ with fluid.program_guard(main_prog, startup_prog):
+ py_reader = fluid.layers.py_reader(
+ capacity=64,
+ shapes=[[-1] + image_shape, [-1, 4], [-1, 1], [-1, 1]],
+ lod_levels=[0, 1, 1, 1],
+ dtypes=["float32", "float32", "int32", "int32"],
+ use_double_buffer=True)
+ with fluid.unique_name.guard():
+ image, gt_box, gt_label, difficult = fluid.layers.read_file(
+ py_reader)
+ locs, confs, box, box_var = mobile_net(num_classes, image,
+ image_shape)
+ nmsed_out = fluid.layers.detection_output(
+ locs, confs, box, box_var, nms_threshold=args.nms_threshold)
+ with fluid.program_guard(main_prog):
+ map = fluid.metrics.DetectionMAP(
+ nmsed_out,
+ gt_label,
+ gt_box,
+ difficult,
+ num_classes,
+ overlap_threshold=0.5,
+ evaluate_difficult=False,
+ ap_version=args.ap_version)
+ return py_reader, map
+
+
+def eval(args, data_args, test_list, batch_size, model_dir=None):
+ startup_prog = fluid.Program()
+ test_prog = fluid.Program()
+
+ test_py_reader, map_eval = build_program(
+ main_prog=test_prog,
+ startup_prog=startup_prog,
+ args=args,
+ data_args=data_args)
+ test_prog = test_prog.clone(for_test=True)
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(startup_prog)
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(model_dir, var.name))
+
+ fluid.io.load_vars(
+ exe, model_dir, main_program=test_prog, predicate=if_exist)
+
+ test_reader = reader.test(data_args, test_list, batch_size=batch_size)
+ test_py_reader.decorate_paddle_reader(test_reader)
+
+ _, accum_map = map_eval.get_map_var()
+ map_eval.reset(exe)
+ test_py_reader.start()
+ try:
+ batch_id = 0
+ while True:
+ test_map, = exe.run(test_prog, fetch_list=[accum_map])
+ if batch_id % 10 == 0:
+ print("Batch {0}, map {1}".format(batch_id, test_map))
+ batch_id += 1
+ except (fluid.core.EOFException, StopIteration):
+ test_py_reader.reset()
+ print("Test model {0}, map {1}".format(model_dir, test_map))
+
+
+if __name__ == '__main__':
+ args = parser.parse_args()
+ print_arguments(args)
+
+ data_dir = 'data/pascalvoc'
+ test_list = 'test.txt'
+ label_file = 'label_list'
+
+ if not os.path.exists(args.model_dir):
+ raise ValueError("The model path [%s] does not exist." %
+ (args.model_dir))
+ if 'coco' in args.dataset:
+ data_dir = 'data/coco'
+ if '2014' in args.dataset:
+ test_list = 'annotations/instances_val2014.json'
+ elif '2017' in args.dataset:
+ test_list = 'annotations/instances_val2017.json'
+
+ data_args = reader.Settings(
+ dataset=args.dataset,
+ data_dir=args.data_dir if len(args.data_dir) > 0 else data_dir,
+ label_file=label_file,
+ resize_h=args.resize_h,
+ resize_w=args.resize_w,
+ mean_value=[args.mean_value_B, args.mean_value_G, args.mean_value_R],
+ apply_distort=False,
+ apply_expand=False,
+ ap_version=args.ap_version)
+ eval(
+ args,
+ data_args=data_args,
+ test_list=args.test_list if len(args.test_list) > 0 else test_list,
+ batch_size=args.batch_size,
+ model_dir=args.model_dir)
diff --git a/PaddleCV/object_detection/eval_coco_map.py b/PaddleCV/object_detection/eval_coco_map.py
new file mode 100644
index 0000000000000000000000000000000000000000..3e4d4ab8b3460263221b90d0dce787439f014f5b
--- /dev/null
+++ b/PaddleCV/object_detection/eval_coco_map.py
@@ -0,0 +1,155 @@
+import os
+import time
+import numpy as np
+import argparse
+import functools
+
+import paddle
+import paddle.fluid as fluid
+import reader
+from mobilenet_ssd import mobile_net
+from utility import add_arguments, print_arguments
+
+# A special mAP metric for COCO dataset, which averages AP in different IoUs.
+# To use this eval_coco_map.py, [cocoapi](https://github.com/cocodataset/cocoapi) is needed.
+import json
+from pycocotools.coco import COCO
+from pycocotools.cocoeval import COCOeval
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('dataset', str, 'coco2014', "coco2014, coco2017.")
+add_arg('batch_size', int, 32, "Minibatch size.")
+add_arg('use_gpu', bool, True, "Whether use GPU.")
+add_arg('data_dir', str, '', "The data root path.")
+add_arg('test_list', str, '', "The testing data lists.")
+add_arg('model_dir', str, '', "The model path.")
+add_arg('nms_threshold', float, 0.5, "NMS threshold.")
+add_arg('ap_version', str, 'cocoMAP', "cocoMAP.")
+add_arg('resize_h', int, 300, "The resized image height.")
+add_arg('resize_w',         int,   300,     "The resized image width.")
+add_arg('mean_value_B', float, 127.5, "Mean value for B channel which will be subtracted.") #123.68
+add_arg('mean_value_G', float, 127.5, "Mean value for G channel which will be subtracted.") #116.78
+add_arg('mean_value_R', float, 127.5, "Mean value for R channel which will be subtracted.") #103.94
+# yapf: enable
+
+
+def eval(args, data_args, test_list, batch_size, model_dir=None):
+ image_shape = [3, data_args.resize_h, data_args.resize_w]
+ num_classes = 91
+
+ image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
+ gt_box = fluid.layers.data(
+ name='gt_box', shape=[4], dtype='float32', lod_level=1)
+ gt_label = fluid.layers.data(
+ name='gt_label', shape=[1], dtype='int32', lod_level=1)
+ gt_iscrowd = fluid.layers.data(
+ name='gt_iscrowd', shape=[1], dtype='int32', lod_level=1)
+ gt_image_info = fluid.layers.data(
+ name='gt_image_id', shape=[3], dtype='int32')
+
+ locs, confs, box, box_var = mobile_net(num_classes, image, image_shape)
+ nmsed_out = fluid.layers.detection_output(
+ locs, confs, box, box_var, nms_threshold=args.nms_threshold)
+ loss = fluid.layers.ssd_loss(locs, confs, gt_box, gt_label, box, box_var)
+ loss = fluid.layers.reduce_sum(loss)
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+ # yapf: disable
+ if model_dir:
+ def if_exist(var):
+ return os.path.exists(os.path.join(model_dir, var.name))
+ fluid.io.load_vars(exe, model_dir, predicate=if_exist)
+ # yapf: enable
+ test_reader = reader.test(data_args, test_list, batch_size)
+ feeder = fluid.DataFeeder(
+ place=place,
+ feed_list=[image, gt_box, gt_label, gt_iscrowd, gt_image_info])
+
+ def get_dt_res(nmsed_out_v, data):
+ dts_res = []
+ lod = nmsed_out_v[0].lod()[0]
+ nmsed_out_v = np.array(nmsed_out_v[0])
+ real_batch_size = min(batch_size, len(data))
+ assert (len(lod) == real_batch_size + 1), \
+ "Error Lod Tensor offset dimension. Lod({}) vs. batch_size({})".format(len(lod), batch_size)
+ k = 0
+ for i in range(real_batch_size):
+ dt_num_this_img = lod[i + 1] - lod[i]
+ image_id = int(data[i][4][0])
+ image_width = int(data[i][4][1])
+ image_height = int(data[i][4][2])
+ for j in range(dt_num_this_img):
+ dt = nmsed_out_v[k]
+ k = k + 1
+ category_id, score, xmin, ymin, xmax, ymax = dt.tolist()
+ xmin = max(min(xmin, 1.0), 0.0) * image_width
+ ymin = max(min(ymin, 1.0), 0.0) * image_height
+ xmax = max(min(xmax, 1.0), 0.0) * image_width
+ ymax = max(min(ymax, 1.0), 0.0) * image_height
+ w = xmax - xmin
+ h = ymax - ymin
+ bbox = [xmin, ymin, w, h]
+ dt_res = {
+ 'image_id': image_id,
+ 'category_id': category_id,
+ 'bbox': bbox,
+ 'score': score
+ }
+ dts_res.append(dt_res)
+ return dts_res
+
+ def test():
+ dts_res = []
+
+ for batch_id, data in enumerate(test_reader()):
+ nmsed_out_v = exe.run(fluid.default_main_program(),
+ feed=feeder.feed(data),
+ fetch_list=[nmsed_out],
+ return_numpy=False)
+ if batch_id % 20 == 0:
+ print("Batch {0}".format(batch_id))
+ dts_res += get_dt_res(nmsed_out_v, data)
+
+ with open("detection_result.json", 'w') as outfile:
+ json.dump(dts_res, outfile)
+ print("start evaluate using coco api")
+ cocoGt = COCO(os.path.join(data_args.data_dir, test_list))
+ cocoDt = cocoGt.loadRes("detection_result.json")
+ cocoEval = COCOeval(cocoGt, cocoDt, "bbox")
+ cocoEval.evaluate()
+ cocoEval.accumulate()
+ cocoEval.summarize()
+
+ test()
+
+
+if __name__ == '__main__':
+ args = parser.parse_args()
+ print_arguments(args)
+
+ data_dir = './data/coco'
+ if '2014' in args.dataset:
+ test_list = 'annotations/instances_val2014.json'
+ elif '2017' in args.dataset:
+ test_list = 'annotations/instances_val2017.json'
+
+ data_args = reader.Settings(
+ dataset=args.dataset,
+ data_dir=args.data_dir if len(args.data_dir) > 0 else data_dir,
+ label_file='',
+ resize_h=args.resize_h,
+ resize_w=args.resize_w,
+ mean_value=[args.mean_value_B, args.mean_value_G, args.mean_value_R],
+ apply_distort=False,
+ apply_expand=False,
+ ap_version=args.ap_version)
+ eval(
+ args,
+ data_args=data_args,
+ test_list=args.test_list if len(args.test_list) > 0 else test_list,
+ batch_size=args.batch_size,
+ model_dir=args.model_dir)
diff --git a/fluid/PaddleCV/object_detection/image_util.py b/PaddleCV/object_detection/image_util.py
similarity index 100%
rename from fluid/PaddleCV/object_detection/image_util.py
rename to PaddleCV/object_detection/image_util.py
diff --git a/fluid/PaddleCV/object_detection/images/009943.jpg b/PaddleCV/object_detection/images/009943.jpg
similarity index 100%
rename from fluid/PaddleCV/object_detection/images/009943.jpg
rename to PaddleCV/object_detection/images/009943.jpg
diff --git a/fluid/PaddleCV/object_detection/images/009956.jpg b/PaddleCV/object_detection/images/009956.jpg
similarity index 100%
rename from fluid/PaddleCV/object_detection/images/009956.jpg
rename to PaddleCV/object_detection/images/009956.jpg
diff --git a/fluid/PaddleCV/object_detection/images/009960.jpg b/PaddleCV/object_detection/images/009960.jpg
similarity index 100%
rename from fluid/PaddleCV/object_detection/images/009960.jpg
rename to PaddleCV/object_detection/images/009960.jpg
diff --git a/fluid/PaddleCV/object_detection/images/009962.jpg b/PaddleCV/object_detection/images/009962.jpg
similarity index 100%
rename from fluid/PaddleCV/object_detection/images/009962.jpg
rename to PaddleCV/object_detection/images/009962.jpg
diff --git a/fluid/PaddleCV/object_detection/images/COCO_val2014_000000000139.jpg b/PaddleCV/object_detection/images/COCO_val2014_000000000139.jpg
similarity index 100%
rename from fluid/PaddleCV/object_detection/images/COCO_val2014_000000000139.jpg
rename to PaddleCV/object_detection/images/COCO_val2014_000000000139.jpg
diff --git a/fluid/PaddleCV/object_detection/images/COCO_val2014_000000000785.jpg b/PaddleCV/object_detection/images/COCO_val2014_000000000785.jpg
similarity index 100%
rename from fluid/PaddleCV/object_detection/images/COCO_val2014_000000000785.jpg
rename to PaddleCV/object_detection/images/COCO_val2014_000000000785.jpg
diff --git a/fluid/PaddleCV/object_detection/images/COCO_val2014_000000000885.jpg b/PaddleCV/object_detection/images/COCO_val2014_000000000885.jpg
similarity index 100%
rename from fluid/PaddleCV/object_detection/images/COCO_val2014_000000000885.jpg
rename to PaddleCV/object_detection/images/COCO_val2014_000000000885.jpg
diff --git a/fluid/PaddleCV/object_detection/images/COCO_val2014_000000142324.jpg b/PaddleCV/object_detection/images/COCO_val2014_000000142324.jpg
similarity index 100%
rename from fluid/PaddleCV/object_detection/images/COCO_val2014_000000142324.jpg
rename to PaddleCV/object_detection/images/COCO_val2014_000000142324.jpg
diff --git a/fluid/PaddleCV/object_detection/images/COCO_val2014_000000144003.jpg b/PaddleCV/object_detection/images/COCO_val2014_000000144003.jpg
similarity index 100%
rename from fluid/PaddleCV/object_detection/images/COCO_val2014_000000144003.jpg
rename to PaddleCV/object_detection/images/COCO_val2014_000000144003.jpg
diff --git a/fluid/PaddleCV/object_detection/images/SSD_paper_figure.jpg b/PaddleCV/object_detection/images/SSD_paper_figure.jpg
similarity index 100%
rename from fluid/PaddleCV/object_detection/images/SSD_paper_figure.jpg
rename to PaddleCV/object_detection/images/SSD_paper_figure.jpg
diff --git a/fluid/PaddleCV/object_detection/infer.py b/PaddleCV/object_detection/infer.py
similarity index 100%
rename from fluid/PaddleCV/object_detection/infer.py
rename to PaddleCV/object_detection/infer.py
diff --git a/PaddleCV/object_detection/main_quant.py b/PaddleCV/object_detection/main_quant.py
new file mode 100644
index 0000000000000000000000000000000000000000..2927858a1ea34fbc3adc6eb996ac596c50846fa4
--- /dev/null
+++ b/PaddleCV/object_detection/main_quant.py
@@ -0,0 +1,285 @@
+import os
+import time
+import numpy as np
+import argparse
+import functools
+import shutil
+import math
+import multiprocessing
+
+import paddle
+import paddle.fluid as fluid
+import reader
+from mobilenet_ssd import mobile_net
+from utility import add_arguments, print_arguments
+from train import build_program
+from train import train_parameters
+from infer import draw_bounding_box_on_image
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('learning_rate', float, 0.0001, "Learning rate.")
+add_arg('batch_size', int, 64, "Minibatch size.")
+add_arg('epoc_num', int, 20, "Epoch number.")
+add_arg('use_gpu', bool, True, "Whether to use GPU.")
+add_arg('parallel', bool, True, "Whether to train in parallel on multiple devices.")
+add_arg('model_save_dir', str, 'quant_model', "The path to save model.")
+add_arg('init_model', str, 'ssd_mobilenet_v1_pascalvoc', "The init model path.")
+add_arg('ap_version', str, '11point', "mAP version can be integral or 11point.")
+add_arg('image_shape', str, '3,300,300', "Input image shape.")
+add_arg('mean_BGR', str, '127.5,127.5,127.5', "Mean value for B,G,R channel which will be subtracted.")
+add_arg('lr_epochs', str, '30,60', "The learning decay steps.")
+add_arg('lr_decay_rates', str, '1,0.1,0.01', "The learning decay rates for each step.")
+add_arg('data_dir', str, 'data/pascalvoc', "Data directory")
+add_arg('act_quant_type', str, 'abs_max', "Quantize type of activation, which can be abs_max or range_abs_max.")
+add_arg('image_path', str, '', "The image used to inference and visualize.")
+add_arg('confs_threshold', float, 0.5, "Confidence threshold to draw bbox.")
+add_arg('mode', str, 'train', "Job mode can be one of ['train', 'test', 'infer'].")
+# yapf: enable
+
+def test(exe, test_prog, map_eval, test_py_reader):
+ _, accum_map = map_eval.get_map_var()
+ map_eval.reset(exe)
+ test_py_reader.start()
+ try:
+ batch = 0
+ while True:
+ test_map, = exe.run(test_prog, fetch_list=[accum_map])
+ if batch % 10 == 0:
+ print("Batch {0}, map {1}".format(batch, test_map))
+ batch += 1
+ except fluid.core.EOFException:
+ pass
+ finally:
+ test_py_reader.reset()
+ print("Test map {0}".format(test_map))
+ return test_map
+
+
+def save_model(exe, main_prog, model_save_dir, postfix):
+ model_path = os.path.join(model_save_dir, postfix)
+ if os.path.isdir(model_path):
+ shutil.rmtree(model_path)
+ fluid.io.save_persistables(exe, model_path, main_program=main_prog)
+
+
+def train(args,
+ data_args,
+ train_params,
+ train_file_list,
+ val_file_list):
+
+ model_save_dir = args.model_save_dir
+ init_model = args.init_model
+ epoc_num = args.epoc_num
+ use_gpu = args.use_gpu
+ parallel = args.parallel
+ is_shuffle = True
+ act_quant_type = args.act_quant_type
+
+ if use_gpu:
+ devices_num = fluid.core.get_cuda_device_count()
+ else:
+ devices_num = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
+
+ batch_size = train_params['batch_size']
+ batch_size_per_device = batch_size // devices_num
+ num_workers = 4
+
+ startup_prog = fluid.Program()
+ train_prog = fluid.Program()
+ test_prog = fluid.Program()
+
+ train_py_reader, loss = build_program(
+ main_prog=train_prog,
+ startup_prog=startup_prog,
+ train_params=train_params,
+ is_train=True)
+ test_py_reader, map_eval, _, _ = build_program(
+ main_prog=test_prog,
+ startup_prog=startup_prog,
+ train_params=train_params,
+ is_train=False)
+
+ test_prog = test_prog.clone(for_test=True)
+
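+ # The transpiler rewrites both programs in place, inserting fake
+ # quantize/dequantize ops so that training simulates 8-bit inference
+ # (quantization-aware training).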
+ transpiler = fluid.contrib.QuantizeTranspiler(weight_bits=8,
+ activation_bits=8,
+ activation_quantize_type=act_quant_type,
+ weight_quantize_type='abs_max')
+
+ transpiler.training_transpile(train_prog, startup_prog)
+ transpiler.training_transpile(test_prog, startup_prog)
+
+ place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(startup_prog)
+
+ if init_model:
+ print('Load init model %s.' % init_model)
+ def if_exist(var):
+ return os.path.exists(os.path.join(init_model, var.name))
+ fluid.io.load_vars(exe, init_model, main_program=train_prog,
+ predicate=if_exist)
+ else:
+ print('There is no init model.')
+
+ if parallel:
+ train_exe = fluid.ParallelExecutor(main_program=train_prog,
+ use_cuda=True if use_gpu else False, loss_name=loss.name)
+
+ train_reader = reader.train(data_args,
+ train_file_list,
+ batch_size_per_device,
+ shuffle=is_shuffle,
+ num_workers=num_workers)
+ test_reader = reader.test(data_args, val_file_list, batch_size)
+ train_py_reader.decorate_paddle_reader(train_reader)
+ test_py_reader.decorate_paddle_reader(test_reader)
+
+ train_py_reader.start()
+ best_map = 0.
+ for epoc in range(epoc_num):
+ if epoc == 0:
+ # test quantized model without quantization-aware training.
+ test_map = test(exe, test_prog, map_eval, test_py_reader)
+ batch = 0
+ train_py_reader.start()
+ while True:
+ try:
+ # train
+ start_time = time.time()
+ if parallel:
+ outs = train_exe.run(fetch_list=[loss.name])
+ else:
+ outs = exe.run(train_prog, fetch_list=[loss])
+ end_time = time.time()
+ avg_loss = np.mean(np.array(outs[0]))
+ if batch % 10 == 0:
+ print("Epoc {:d}, batch {:d}, loss {:.6f}, time {:.5f}".format(
+ epoc , batch, avg_loss, end_time - start_time))
+ except (fluid.core.EOFException, StopIteration):
+ train_reader().close()
+ train_py_reader.reset()
+ break
+ test_map = test(exe, test_prog, map_eval, test_py_reader)
+ save_model(exe, train_prog, model_save_dir, str(epoc))
+ if test_map > best_map:
+ best_map = test_map
+ save_model(exe, train_prog, model_save_dir, 'best_map')
+ print("Best test map {0}".format(best_map))
+
+
+def eval(args, data_args, configs, val_file_list):
+ init_model = args.init_model
+ use_gpu = args.use_gpu
+ act_quant_type = args.act_quant_type
+ model_save_dir = args.model_save_dir
+
+ batch_size = configs['batch_size']
+ batch_size_per_device = batch_size
+
+ startup_prog = fluid.Program()
+ test_prog = fluid.Program()
+ test_py_reader, map_eval, nmsed_out, image = build_program(
+ main_prog=test_prog,
+ startup_prog=startup_prog,
+ train_params=configs,
+ is_train=False)
+ test_prog = test_prog.clone(for_test=True)
+
+ transpiler = fluid.contrib.QuantizeTranspiler(weight_bits=8,
+ activation_bits=8,
+ activation_quantize_type=act_quant_type,
+ weight_quantize_type='abs_max')
+ transpiler.training_transpile(test_prog, startup_prog)
+
+ place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(startup_prog)
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(init_model, var.name))
+ fluid.io.load_vars(exe, init_model, main_program=test_prog,
+ predicate=if_exist)
+
+ # freeze after load parameters
+ transpiler.freeze_program(test_prog, place)
+
+ test_reader = reader.test(data_args, val_file_list, batch_size)
+ test_py_reader.decorate_paddle_reader(test_reader)
+
+ test_map = test(exe, test_prog, map_eval, test_py_reader)
+ print("Test model {0}, map {1}".format(init_model, test_map))
+ # The model could be converted to int8 before saving, but Paddle cannot
+ # load an int8 model for inference yet, so the conversion is skipped.
+ # transpiler.convert_to_int8(test_prog, place)
+ fluid.io.save_inference_model(model_save_dir, [image.name],
+ [nmsed_out], exe, test_prog)
+
+
+def infer(args, data_args):
+ model_dir = args.init_model
+ image_path = args.image_path
+ confs_threshold = args.confs_threshold
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ [inference_program, feed , fetch] = fluid.io.load_inference_model(
+ dirname=model_dir,
+ executor=exe,
+ model_filename='__model__')
+
+ #print(np.array(fluid.global_scope().find_var('conv2d_20.w_0').get_tensor()))
+ #print(np.max(np.array(fluid.global_scope().find_var('conv2d_20.w_0').get_tensor())))
+ infer_reader = reader.infer(data_args, image_path)
+ data = infer_reader()
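+ # reader.infer returns a single CHW image; prepend the batch dimension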
+ data = data.reshape((1,) + data.shape)
+ outs = exe.run(inference_program,
+ feed={feed[0]: data},
+ fetch_list=fetch,
+ return_numpy=False)
+ out = np.array(outs[0])
+ draw_bounding_box_on_image(image_path, out, confs_threshold,
+ data_args.label_list)
+
+
+if __name__ == '__main__':
+ args = parser.parse_args()
+ print_arguments(args)
+
+ # for pascalvoc
+ label_file = 'label_list'
+ train_list = 'trainval.txt'
+ val_list = 'test.txt'
+ dataset = 'pascalvoc'
+
+ mean_BGR = [float(m) for m in args.mean_BGR.split(",")]
+ image_shape = [int(m) for m in args.image_shape.split(",")]
+ lr_epochs = [int(m) for m in args.lr_epochs.split(",")]
+ lr_rates = [float(m) for m in args.lr_decay_rates.split(",")]
+ train_parameters[dataset]['image_shape'] = image_shape
+ train_parameters[dataset]['batch_size'] = args.batch_size
+ train_parameters[dataset]['lr'] = args.learning_rate
+ train_parameters[dataset]['epoc_num'] = args.epoc_num
+ train_parameters[dataset]['ap_version'] = args.ap_version
+ train_parameters[dataset]['lr_epochs'] = lr_epochs
+ train_parameters[dataset]['lr_decay'] = lr_rates
+
+ data_args = reader.Settings(
+ dataset=dataset,
+ data_dir=args.data_dir,
+ label_file=label_file,
+ resize_h=image_shape[1],
+ resize_w=image_shape[2],
+ mean_value=mean_BGR,
+ apply_distort=True,
+ apply_expand=True,
+ ap_version = args.ap_version)
+ if args.mode == 'train':
+ train(args, data_args, train_parameters[dataset], train_list, val_list)
+ elif args.mode == 'test':
+ eval(args, data_args, train_parameters[dataset], val_list)
+ else:
+ infer(args, data_args)
diff --git a/fluid/PaddleCV/object_detection/mobilenet_ssd.py b/PaddleCV/object_detection/mobilenet_ssd.py
similarity index 100%
rename from fluid/PaddleCV/object_detection/mobilenet_ssd.py
rename to PaddleCV/object_detection/mobilenet_ssd.py
diff --git a/fluid/PaddleCV/object_detection/pretrained/download_coco.sh b/PaddleCV/object_detection/pretrained/download_coco.sh
similarity index 100%
rename from fluid/PaddleCV/object_detection/pretrained/download_coco.sh
rename to PaddleCV/object_detection/pretrained/download_coco.sh
diff --git a/fluid/PaddleCV/object_detection/pretrained/download_imagenet.sh b/PaddleCV/object_detection/pretrained/download_imagenet.sh
similarity index 100%
rename from fluid/PaddleCV/object_detection/pretrained/download_imagenet.sh
rename to PaddleCV/object_detection/pretrained/download_imagenet.sh
diff --git a/PaddleCV/object_detection/reader.py b/PaddleCV/object_detection/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..3559591c4ed5741d52f44bd92f4398d133b2e104
--- /dev/null
+++ b/PaddleCV/object_detection/reader.py
@@ -0,0 +1,360 @@
+# Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import xml.etree.ElementTree
+import os
+import time
+import copy
+import six
+import math
+import numpy as np
+from PIL import Image
+from PIL import ImageDraw
+import image_util
+import paddle
+
+
+class Settings(object):
+ def __init__(self,
+ dataset=None,
+ data_dir=None,
+ label_file=None,
+ resize_h=300,
+ resize_w=300,
+ mean_value=[127.5, 127.5, 127.5],
+ apply_distort=True,
+ apply_expand=True,
+ ap_version='11point'):
+ self._dataset = dataset
+ self._ap_version = ap_version
+ self._data_dir = data_dir
+ if 'pascalvoc' in dataset:
+ self._label_list = []
+ label_fpath = os.path.join(data_dir, label_file)
+ for line in open(label_fpath):
+ self._label_list.append(line.strip())
+
+ self._apply_distort = apply_distort
+ self._apply_expand = apply_expand
+ self._resize_height = resize_h
+ self._resize_width = resize_w
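+ # keep the mean with shape (3, 1, 1) so it broadcasts over CHW images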
+ self._img_mean = np.array(mean_value)[:, np.newaxis, np.newaxis].astype(
+ 'float32')
+ self._expand_prob = 0.5
+ self._expand_max_ratio = 4
+ self._hue_prob = 0.5
+ self._hue_delta = 18
+ self._contrast_prob = 0.5
+ self._contrast_delta = 0.5
+ self._saturation_prob = 0.5
+ self._saturation_delta = 0.5
+ self._brightness_prob = 0.5
+ self._brightness_delta = 0.125
+
+ @property
+ def dataset(self):
+ return self._dataset
+
+ @property
+ def ap_version(self):
+ return self._ap_version
+
+ @property
+ def apply_expand(self):
+ return self._apply_expand
+
+ @property
+ def apply_distort(self):
+ return self._apply_distort
+
+ @property
+ def data_dir(self):
+ return self._data_dir
+
+ @data_dir.setter
+ def data_dir(self, data_dir):
+ self._data_dir = data_dir
+
+ @property
+ def label_list(self):
+ return self._label_list
+
+ @property
+ def resize_h(self):
+ return self._resize_height
+
+ @property
+ def resize_w(self):
+ return self._resize_width
+
+ @property
+ def img_mean(self):
+ return self._img_mean
+
+
+def preprocess(img, bbox_labels, mode, settings):
+ img_width, img_height = img.size
+ sampled_labels = bbox_labels
+ if mode == 'train':
+ if settings._apply_distort:
+ img = image_util.distort_image(img, settings)
+ if settings._apply_expand:
+ img, bbox_labels, img_width, img_height = image_util.expand_image(
+ img, bbox_labels, img_width, img_height, settings)
+ # sampling
+ batch_sampler = []
+ # hard-coded SSD batch samplers; the positional arguments are
+ # (max_sample, max_trial, min_scale, max_scale, min_aspect_ratio,
+ # max_aspect_ratio, min_jaccard_overlap, max_jaccard_overlap)
+ batch_sampler.append(
+ image_util.sampler(1, 1, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0))
+ batch_sampler.append(
+ image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.1, 0.0))
+ batch_sampler.append(
+ image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.3, 0.0))
+ batch_sampler.append(
+ image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.5, 0.0))
+ batch_sampler.append(
+ image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.7, 0.0))
+ batch_sampler.append(
+ image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.9, 0.0))
+ batch_sampler.append(
+ image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.0, 1.0))
+ sampled_bbox = image_util.generate_batch_samples(batch_sampler,
+ bbox_labels)
+
+ img = np.array(img)
+ if len(sampled_bbox) > 0:
+ idx = int(np.random.uniform(0, len(sampled_bbox)))
+ img, sampled_labels = image_util.crop_image(
+ img, bbox_labels, sampled_bbox[idx], img_width, img_height)
+
+ img = Image.fromarray(img)
+ img = img.resize((settings.resize_w, settings.resize_h), Image.ANTIALIAS)
+ img = np.array(img)
+
+ if mode == 'train':
+ mirror = int(np.random.uniform(0, 2))
+ if mirror == 1:
+ img = img[:, ::-1, :]
+ for i in six.moves.xrange(len(sampled_labels)):
+ tmp = sampled_labels[i][1]
+ sampled_labels[i][1] = 1 - sampled_labels[i][3]
+ sampled_labels[i][3] = 1 - tmp
+ # HWC to CHW
+ if len(img.shape) == 3:
+ img = np.swapaxes(img, 1, 2)
+ img = np.swapaxes(img, 1, 0)
+ # RGB to BGR
+ img = img[[2, 1, 0], :, :]
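+ # normalize: subtract the per-channel mean, then multiply by 0.007843
+ # (about 1 / 127.5) so pixel values fall roughly in [-1, 1]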
+ img = img.astype('float32')
+ img -= settings.img_mean
+ img = img * 0.007843
+ return img, sampled_labels
+
+
+def coco(settings, coco_api, file_list, mode, batch_size, shuffle, data_dir):
+ from pycocotools.coco import COCO
+
+ def reader():
+ if mode == 'train' and shuffle:
+ np.random.shuffle(file_list)
+ batch_out = []
+ for image in file_list:
+ image_name = image['file_name']
+ image_path = os.path.join(data_dir, image_name)
+ if not os.path.exists(image_path):
+ raise ValueError("%s is not exist, you should specify "
+ "data path correctly." % image_path)
+ im = Image.open(image_path)
+ if im.mode == 'L':
+ im = im.convert('RGB')
+ im_width, im_height = im.size
+ im_id = image['id']
+
+ # layout: category_id | xmin | ymin | xmax | ymax | iscrowd
+ bbox_labels = []
+ annIds = coco_api.getAnnIds(imgIds=image['id'])
+ anns = coco_api.loadAnns(annIds)
+ for ann in anns:
+ bbox_sample = []
+ # start from 1, leave 0 to background
+ bbox_sample.append(float(ann['category_id']))
+ bbox = ann['bbox']
+ xmin, ymin, w, h = bbox
+ xmax = xmin + w
+ ymax = ymin + h
+ bbox_sample.append(float(xmin) / im_width)
+ bbox_sample.append(float(ymin) / im_height)
+ bbox_sample.append(float(xmax) / im_width)
+ bbox_sample.append(float(ymax) / im_height)
+ bbox_sample.append(float(ann['iscrowd']))
+ bbox_labels.append(bbox_sample)
+ im, sample_labels = preprocess(im, bbox_labels, mode, settings)
+ sample_labels = np.array(sample_labels)
+ if len(sample_labels) == 0: continue
+ im = im.astype('float32')
+ boxes = sample_labels[:, 1:5]
+ lbls = sample_labels[:, 0].astype('int32')
+ iscrowd = sample_labels[:, -1].astype('int32')
+ if 'cocoMAP' in settings.ap_version:
+ batch_out.append((im, boxes, lbls, iscrowd,
+ [im_id, im_width, im_height]))
+ else:
+ batch_out.append((im, boxes, lbls, iscrowd))
+
+ if len(batch_out) == batch_size:
+ yield batch_out
+ batch_out = []
+
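+ # flush the remainder in test mode (note: a leftover batch holding a
+ # single sample is dropped)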
+ if mode == 'test' and len(batch_out) > 1:
+ yield batch_out
+ batch_out = []
+
+ return reader
+
+
+def pascalvoc(settings, file_list, mode, batch_size, shuffle):
+ def reader():
+ if mode == 'train' and shuffle:
+ np.random.shuffle(file_list)
+ batch_out = []
+ cnt = 0
+ for image in file_list:
+ image_path, label_path = image.split()
+ image_path = os.path.join(settings.data_dir, image_path)
+ label_path = os.path.join(settings.data_dir, label_path)
+ if not os.path.exists(image_path):
+ raise ValueError("%s is not exist, you should specify "
+ "data path correctly." % image_path)
+ im = Image.open(image_path)
+ if im.mode == 'L':
+ im = im.convert('RGB')
+ im_width, im_height = im.size
+
+ # layout: label | xmin | ymin | xmax | ymax | difficult
+ bbox_labels = []
+ root = xml.etree.ElementTree.parse(label_path).getroot()
+ for obj in root.findall('object'):
+ bbox_sample = []
+ # label index starts from 1; 0 is left for background
+ bbox_sample.append(
+ float(settings.label_list.index(obj.find('name').text)))
+ bbox = obj.find('bndbox')
+ difficult = float(obj.find('difficult').text)
+ bbox_sample.append(float(bbox.find('xmin').text) / im_width)
+ bbox_sample.append(float(bbox.find('ymin').text) / im_height)
+ bbox_sample.append(float(bbox.find('xmax').text) / im_width)
+ bbox_sample.append(float(bbox.find('ymax').text) / im_height)
+ bbox_sample.append(difficult)
+ bbox_labels.append(bbox_sample)
+ im, sample_labels = preprocess(im, bbox_labels, mode, settings)
+ sample_labels = np.array(sample_labels)
+ if len(sample_labels) == 0: continue
+ im = im.astype('float32')
+ boxes = sample_labels[:, 1:5]
+ lbls = sample_labels[:, 0].astype('int32')
+ difficults = sample_labels[:, -1].astype('int32')
+
+ batch_out.append((im, boxes, lbls, difficults))
+ if len(batch_out) == batch_size:
+ yield batch_out
+ cnt += len(batch_out)
+ batch_out = []
+
+ if mode == 'test' and len(batch_out) > 1:
+ yield batch_out
+ cnt += len(batch_out)
+ batch_out = []
+
+ return reader
+
+
+def train(settings,
+ file_list,
+ batch_size,
+ shuffle=True,
+ num_workers=8,
+ enable_ce=False):
+ file_path = os.path.join(settings.data_dir, file_list)
+ readers = []
+ if 'coco' in settings.dataset:
+ # cocoapi
+ from pycocotools.coco import COCO
+ coco_api = COCO(file_path)
+ image_ids = coco_api.getImgIds()
+ images = coco_api.loadImgs(image_ids)
+ # ceil-divide so that every image is assigned to one of the shards
+ n = int(math.ceil(len(images) / float(num_workers)))
+ image_lists = [images[i:i + n] for i in range(0, len(images), n)]
+
+ if '2014' in file_list:
+ sub_dir = "train2014"
+ elif '2017' in file_list:
+ sub_dir = "train2017"
+ data_dir = os.path.join(settings.data_dir, sub_dir)
+ for l in image_lists:
+ readers.append(
+ coco(settings, coco_api, l, 'train', batch_size, shuffle,
+ data_dir))
+ else:
+ images = [line.strip() for line in open(file_path)]
+ n = int(math.ceil(len(images) / float(num_workers)))
+ image_lists = [images[i:i + n] for i in range(0, len(images), n)]
+ for l in image_lists:
+ readers.append(pascalvoc(settings, l, 'train', batch_size, shuffle))
+
+ return paddle.reader.multiprocess_reader(readers, False)
+
+
+def test(settings, file_list, batch_size):
+ file_list = os.path.join(settings.data_dir, file_list)
+ if 'coco' in settings.dataset:
+ from pycocotools.coco import COCO
+ coco_api = COCO(file_list)
+ image_ids = coco_api.getImgIds()
+ images = coco_api.loadImgs(image_ids)
+ if '2014' in file_list:
+ sub_dir = "val2014"
+ elif '2017' in file_list:
+ sub_dir = "val2017"
+ data_dir = os.path.join(settings.data_dir, sub_dir)
+ return coco(settings, coco_api, images, 'test', batch_size, False,
+ data_dir)
+ else:
+ image_list = [line.strip() for line in open(file_list)]
+ return pascalvoc(settings, image_list, 'test', batch_size, False)
+
+
+def infer(settings, image_path):
+ def reader():
+ if not os.path.exists(image_path):
+ raise ValueError("%s is not exist, you should specify "
+ "data path correctly." % image_path)
+ img = Image.open(image_path)
+ if img.mode == 'L':
+ img = img.convert('RGB')
+ im_width, im_height = img.size
+ img = img.resize((settings.resize_w, settings.resize_h),
+ Image.ANTIALIAS)
+ img = np.array(img)
+ # HWC to CHW
+ if len(img.shape) == 3:
+ img = np.swapaxes(img, 1, 2)
+ img = np.swapaxes(img, 1, 0)
+ # RGB to BGR
+ img = img[[2, 1, 0], :, :]
+ img = img.astype('float32')
+ img -= settings.img_mean
+ img = img * 0.007843
+ return img
+
+ return reader
diff --git a/PaddleCV/object_detection/train.py b/PaddleCV/object_detection/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..ebf969ef0655557feca405c3eb0e558c9588276d
--- /dev/null
+++ b/PaddleCV/object_detection/train.py
@@ -0,0 +1,320 @@
+import os
+import time
+import numpy as np
+import argparse
+import functools
+import shutil
+import math
+import multiprocessing
+
+import paddle
+import paddle.fluid as fluid
+import reader
+from mobilenet_ssd import mobile_net
+from utility import add_arguments, print_arguments
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('learning_rate', float, 0.001, "Learning rate.")
+add_arg('batch_size', int, 64, "Minibatch size of all devices.")
+add_arg('epoc_num', int, 120, "Epoch number.")
+add_arg('use_gpu', bool, True, "Whether to use GPU.")
+add_arg('parallel', bool, True, "Whether to train in parallel on multiple devices.")
+add_arg('dataset', str, 'pascalvoc', "dataset can be coco2014, coco2017 or pascalvoc.")
+add_arg('model_save_dir', str, 'model', "The path to save model.")
+add_arg('pretrained_model', str, 'pretrained/ssd_mobilenet_v1_coco/', "The init model path.")
+add_arg('ap_version', str, '11point', "mAP version can be integral or 11point.")
+add_arg('image_shape', str, '3,300,300', "Input image shape.")
+add_arg('mean_BGR', str, '127.5,127.5,127.5', "Mean value for B,G,R channel which will be subtracted.")
+add_arg('data_dir', str, 'data/pascalvoc', "Data directory.")
+add_arg('enable_ce', bool, False, "Whether to use CE to evaluate the model.")
+# yapf: enable
+
+train_parameters = {
+ "pascalvoc": {
+ "train_images": 16551,
+ "image_shape": [3, 300, 300],
+ "class_num": 21,
+ "batch_size": 64,
+ "lr": 0.001,
+ "lr_epochs": [40, 60, 80, 100],
+ "lr_decay": [1, 0.5, 0.25, 0.1, 0.01],
+ "ap_version": '11point',
+ },
+ "coco2014": {
+ "train_images": 82783,
+ "image_shape": [3, 300, 300],
+ "class_num": 91,
+ "batch_size": 64,
+ "lr": 0.001,
+ "lr_epochs": [12, 19],
+ "lr_decay": [1, 0.5, 0.25],
+ "ap_version": 'integral', # should use eval_coco_map.py to test model
+ },
+ "coco2017": {
+ "train_images": 118287,
+ "image_shape": [3, 300, 300],
+ "class_num": 91,
+ "batch_size": 64,
+ "lr": 0.001,
+ "lr_epochs": [12, 19],
+ "lr_decay": [1, 0.5, 0.25],
+ "ap_version": 'integral', # should use eval_coco_map.py to test model
+ }
+}
+
+def optimizer_setting(train_params):
+ batch_size = train_params["batch_size"]
+ iters = train_params["train_images"] // batch_size
+ lr = train_params["lr"]
+ boundaries = [i * iters for i in train_params["lr_epochs"]]
+ values = [i * lr for i in train_params["lr_decay"]]
+
+ optimizer = fluid.optimizer.RMSProp(
+ learning_rate=fluid.layers.piecewise_decay(boundaries, values),
+ regularization=fluid.regularizer.L2Decay(0.00005), )
+
+ return optimizer
+
+
+def build_program(main_prog, startup_prog, train_params, is_train):
+ image_shape = train_params['image_shape']
+ class_num = train_params['class_num']
+ ap_version = train_params['ap_version']
+ outs = []
+ with fluid.program_guard(main_prog, startup_prog):
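+ # py_reader asynchronously feeds (image, gt_box, gt_label, difficult);
+ # lod_levels mark the last three inputs as variable-length (LoD level 1)
+ # sequences, since each image has a different number of boxes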
+ py_reader = fluid.layers.py_reader(
+ capacity=64,
+ shapes=[[-1] + image_shape, [-1, 4], [-1, 1], [-1, 1]],
+ lod_levels=[0, 1, 1, 1],
+ dtypes=["float32", "float32", "int32", "int32"],
+ use_double_buffer=True)
+ with fluid.unique_name.guard():
+ image, gt_box, gt_label, difficult = fluid.layers.read_file(py_reader)
+ locs, confs, box, box_var = mobile_net(class_num, image, image_shape)
+ if is_train:
+ with fluid.unique_name.guard("train"):
+ loss = fluid.layers.ssd_loss(locs, confs, gt_box, gt_label, box,
+ box_var)
+ loss = fluid.layers.reduce_sum(loss)
+ optimizer = optimizer_setting(train_params)
+ optimizer.minimize(loss)
+ outs = [py_reader, loss]
+ else:
+ with fluid.unique_name.guard("inference"):
+ nmsed_out = fluid.layers.detection_output(
+ locs, confs, box, box_var, nms_threshold=0.45)
+ map_eval = fluid.metrics.DetectionMAP(
+ nmsed_out,
+ gt_label,
+ gt_box,
+ difficult,
+ class_num,
+ overlap_threshold=0.5,
+ evaluate_difficult=False,
+ ap_version=ap_version)
+ # nmsed_out and image are used when saving the model for inference
+ outs = [py_reader, map_eval, nmsed_out, image]
+ return outs
+
+
+def train(args,
+ data_args,
+ train_params,
+ train_file_list,
+ val_file_list):
+
+ model_save_dir = args.model_save_dir
+ pretrained_model = args.pretrained_model
+ use_gpu = args.use_gpu
+ parallel = args.parallel
+ enable_ce = args.enable_ce
+ is_shuffle = True
+
+ if not use_gpu:
+ devices_num = int(os.environ.get('CPU_NUM',
+ multiprocessing.cpu_count()))
+ else:
+ devices_num = fluid.core.get_cuda_device_count()
+
+ batch_size = train_params['batch_size']
+ epoc_num = train_params['epoc_num']
+ batch_size_per_device = batch_size // devices_num
+ num_workers = 8
+
+ startup_prog = fluid.Program()
+ train_prog = fluid.Program()
+ test_prog = fluid.Program()
+
+ if enable_ce:
+ import random
+ random.seed(0)
+ np.random.seed(0)
+ is_shuffle = False
+ startup_prog.random_seed = 111
+ train_prog.random_seed = 111
+ test_prog.random_seed = 111
+
+ train_py_reader, loss = build_program(
+ main_prog=train_prog,
+ startup_prog=startup_prog,
+ train_params=train_params,
+ is_train=True)
+ test_py_reader, map_eval, _, _ = build_program(
+ main_prog=test_prog,
+ startup_prog=startup_prog,
+ train_params=train_params,
+ is_train=False)
+
+ test_prog = test_prog.clone(for_test=True)
+ place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(startup_prog)
+
+ if pretrained_model:
+ def if_exist(var):
+ return os.path.exists(os.path.join(pretrained_model, var.name))
+ fluid.io.load_vars(exe, pretrained_model, main_program=train_prog,
+ predicate=if_exist)
+
+ if parallel:
+ loss.persistable = True
+ build_strategy = fluid.BuildStrategy()
+ build_strategy.enable_inplace = True
+ build_strategy.memory_optimize = True
+ train_exe = fluid.ParallelExecutor(main_program=train_prog,
+ use_cuda=use_gpu, loss_name=loss.name, build_strategy=build_strategy)
+ train_reader = reader.train(data_args,
+ train_file_list,
+ batch_size_per_device,
+ shuffle=is_shuffle,
+ num_workers=num_workers,
+ enable_ce=enable_ce)
+ test_reader = reader.test(data_args, val_file_list, batch_size)
+ train_py_reader.decorate_paddle_reader(train_reader)
+ test_py_reader.decorate_paddle_reader(test_reader)
+
+ def save_model(postfix, main_prog):
+ model_path = os.path.join(model_save_dir, postfix)
+ if os.path.isdir(model_path):
+ shutil.rmtree(model_path)
+ print('save models to %s' % (model_path))
+ fluid.io.save_persistables(exe, model_path, main_program=main_prog)
+
+ best_map = 0.
+ def test(epoc_id, best_map):
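+ # one full pass over the validation set; accum_map accumulates mAP
+ # across batches, so the last fetched value is the epoch-level mAP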
+ _, accum_map = map_eval.get_map_var()
+ map_eval.reset(exe)
+ every_epoc_map=[] # for CE
+ test_py_reader.start()
+ try:
+ batch_id = 0
+ while True:
+ test_map, = exe.run(test_prog, fetch_list=[accum_map])
+ if batch_id % 10 == 0:
+ every_epoc_map.append(test_map)
+ print("Batch {0}, map {1}".format(batch_id, test_map))
+ batch_id += 1
+ except fluid.core.EOFException:
+ test_py_reader.reset()
+ mean_map = np.mean(every_epoc_map)
+ print("Epoc {0}, test map {1}".format(epoc_id, test_map[0]))
+ if test_map[0] > best_map:
+ best_map = test_map[0]
+ save_model('best_model', test_prog)
+ return best_map, mean_map
+
+
+ total_time = 0.0
+ for epoc_id in range(epoc_num):
+ epoch_idx = epoc_id + 1
+ start_time = time.time()
+ prev_start_time = start_time
+ every_epoc_loss = []
+ batch_id = 0
+ train_py_reader.start()
+ while True:
+ try:
+ prev_start_time = start_time
+ start_time = time.time()
+ if parallel:
+ loss_v, = train_exe.run(fetch_list=[loss.name])
+ else:
+ loss_v, = exe.run(train_prog, fetch_list=[loss])
+ loss_v = np.mean(np.array(loss_v))
+ every_epoc_loss.append(loss_v)
+ if batch_id % 10 == 0:
+ print("Epoc {:d}, batch {:d}, loss {:.6f}, time {:.5f}".format(
+ epoc_id, batch_id, loss_v, start_time - prev_start_time))
+ batch_id += 1
+ except (fluid.core.EOFException, StopIteration):
+ train_reader().close()
+ train_py_reader.reset()
+ break
+
+ end_time = time.time()
+ total_time += end_time - start_time
+ best_map, mean_map = test(epoc_id, best_map)
+ print("Best test map {0}".format(best_map))
+ if epoc_id % 10 == 0 or epoc_id == epoc_num - 1:
+ save_model(str(epoc_id), train_prog)
+
+ if enable_ce:
+ train_avg_loss = np.mean(every_epoc_loss)
+ if devices_num == 1:
+ print("kpis train_cost %s" % train_avg_loss)
+ print("kpis test_acc %s" % mean_map)
+ print("kpis train_speed %s" % (total_time / epoch_idx))
+ else:
+ print("kpis train_cost_card%s %s" %
+ (devices_num, train_avg_loss))
+ print("kpis test_acc_card%s %s" %
+ (devices_num, mean_map))
+ print("kpis train_speed_card%s %f" %
+ (devices_num, total_time / epoch_idx))
+
+
+if __name__ == '__main__':
+ args = parser.parse_args()
+ print_arguments(args)
+
+ data_dir = args.data_dir
+ dataset = args.dataset
+ assert dataset in ['pascalvoc', 'coco2014', 'coco2017']
+
+ # for pascalvoc
+ label_file = 'label_list'
+ train_file_list = 'trainval.txt'
+ val_file_list = 'test.txt'
+
+ if dataset == 'coco2014':
+ train_file_list = 'annotations/instances_train2014.json'
+ val_file_list = 'annotations/instances_val2014.json'
+ elif dataset == 'coco2017':
+ train_file_list = 'annotations/instances_train2017.json'
+ val_file_list = 'annotations/instances_val2017.json'
+
+ mean_BGR = [float(m) for m in args.mean_BGR.split(",")]
+ image_shape = [int(m) for m in args.image_shape.split(",")]
+ train_parameters[dataset]['image_shape'] = image_shape
+ train_parameters[dataset]['batch_size'] = args.batch_size
+ train_parameters[dataset]['lr'] = args.learning_rate
+ train_parameters[dataset]['epoc_num'] = args.epoc_num
+ train_parameters[dataset]['ap_version'] = args.ap_version
+
+ data_args = reader.Settings(
+ dataset=args.dataset,
+ data_dir=data_dir,
+ label_file=label_file,
+ resize_h=image_shape[1],
+ resize_w=image_shape[2],
+ mean_value=mean_BGR,
+ apply_distort=True,
+ apply_expand=True,
+ ap_version = args.ap_version)
+ train(args,
+ data_args,
+ train_parameters[dataset],
+ train_file_list=train_file_list,
+ val_file_list=val_file_list)
diff --git a/fluid/PaddleCV/object_detection/utility.py b/PaddleCV/object_detection/utility.py
similarity index 100%
rename from fluid/PaddleCV/object_detection/utility.py
rename to PaddleCV/object_detection/utility.py
diff --git a/fluid/PaddleCV/ocr_recognition/.run_ce.sh b/PaddleCV/ocr_recognition/.run_ce.sh
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/.run_ce.sh
rename to PaddleCV/ocr_recognition/.run_ce.sh
diff --git a/PaddleCV/ocr_recognition/README.md b/PaddleCV/ocr_recognition/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..1c9553993e84d10376441407704088ec4dd66c0c
--- /dev/null
+++ b/PaddleCV/ocr_recognition/README.md
@@ -0,0 +1,206 @@
+
+
+运行本目录下的程序示例需要使用PaddlePaddle develop最新版本。如果您的PaddlePaddle安装版本低于此要求,请按照[安装文档](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html)中的说明更新PaddlePaddle安装版本。
+
+## 代码结构
+```
+├── data_reader.py # 下载、读取、处理数据。
+├── crnn_ctc_model.py # 定义了OCR CTC model的网络结构。
+├── attention_model.py # 定义了OCR attention model的网络结构。
+├── train.py # 用于模型的训练。
+├── infer.py # 加载训练好的模型文件,对新数据进行预测。
+├── eval.py # 评估模型在指定数据集上的效果。
+└── utils.py # 定义通用的函数。
+```
+
+
+## 简介
+
+本章的任务是识别图片中单行英文字符,这里我们分别使用CTC model和attention model两种不同的模型来完成该任务。
+
+这两种模型有相同的编码部分:首先采用卷积网络将图片转为特征图,然后使用`im2sequence op`将特征图转为序列,再通过双向GRU学习序列特征。
+
+两种模型的解码部分和使用的损失函数区别如下:
+
+- CTC model: 训练过程选用的损失函数为CTC(Connectionist Temporal Classification) loss, 预测阶段采用的是贪婪策略和CTC解码策略。
+- Attention model: 训练过程选用的是带注意力机制的解码策略和交叉熵损失函数,预测阶段采用的是柱搜索策略。
+
+训练以上两种模型的评估指标为样本级别的错误率。
+
+## 数据
+
+数据的下载和简单预处理都在`data_reader.py`中实现。
+
+### 数据示例
+
+我们使用的训练和测试数据如`图1`所示,每张图片包含单行不定长的英文字符串,这些图片都是经过检测算法进行预框选处理的。
+
+
+
+图 1
+
+
+在训练集中,每张图片对应的label是图片中各字符在词典中的索引。 `图1` 对应的label如下所示:
+```
+80,84,68,82,83,72,78,77,68,67
+```
+在上边这个label中,`80` 表示字符`Q`的索引,`67` 表示英文字符`D`的索引。
+
+
+### 数据准备
+
+**A. 训练集**
+
+我们需要把所有参与训练的图片放入同一个文件夹,暂且记为`train_images`。然后用一个list文件存放每张图片的信息,包括图片大小、图片名称和对应的label,这里暂记该list文件为`train_list`,其格式如下所示:
+
+```
+185 48 00508_0215.jpg 7740,5332,2369,3201,4162
+48 48 00197_1893.jpg 6569
+338 48 00007_0219.jpg 4590,4788,3015,1994,3402,999,4553
+150 48 00107_4517.jpg 5936,3382,1437,3382
+...
+157 48 00387_0622.jpg 2397,1707,5919,1278
+```
+
+文件train_list
+
+上述文件中的每一行表示一张图片,每行被空格分为四列,前两列分别表示图片的宽和高,第三列表示图片的名称,第四列表示该图片对应的sequence label。
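+
+下面给出一段解析该list文件单行内容的示意代码(仅作说明,并非`data_reader.py`中的实际实现):
+
+```
+# 按空格切分出宽、高、文件名与label序列
+line = "185 48 00508_0215.jpg 7740,5332,2369,3201,4162"
+width, height, name, label_str = line.split(" ")
+labels = [int(i) for i in label_str.split(",")]
+print(int(width), int(height), name, labels)
+```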
+最终我们应有以下类似文件结构:
+
+```
+|-train_data
+ |- train_list
+ |- train_images
+ |- 00508_0215.jpg
+ |- 00197_1893.jpg
+ |- 00007_0219.jpg
+ | ...
+```
+
+在训练时,我们通过选项`--train_images` 和 `--train_list` 分别设置准备好的`train_images` 和`train_list`。
+
+
+>**注:** 如果`--train_images` 和 `--train_list`都未设置或设置为None, `data_reader.py`会自动下载使用[示例数据](http://paddle-ocr-data.bj.bcebos.com/data.tar.gz),并将其缓存到`$HOME/.cache/paddle/dataset/ctc_data/data/` 路径下。
+
+
+**B. 测试集和评估集**
+
+测试集、评估集的准备方式与训练集相同。
+在训练阶段,测试集的路径通过train.py的选项`--test_images` 和 `--test_list` 来设置。
+在评估时,评估集的路径通过eval.py的选项`--input_images_dir` 和`--input_images_list` 来设置。
+
+**C. 待预测数据集**
+
+预测支持三种形式的输入:
+
+第一种:设置`--input_images_dir`和`--input_images_list`, 与训练集类似, 只不过list文件中的最后一列可以放任意占位字符或字符串,如下所示:
+
+```
+185 48 00508_0215.jpg s
+48 48 00197_1893.jpg s
+338 48 00007_0219.jpg s
+...
+```
+
+第二种:仅设置`--input_images_list`, 其中list文件中只需放图片的完整路径,如下所示:
+
+```
+data/test_images/00000.jpg
+data/test_images/00001.jpg
+data/test_images/00003.jpg
+```
+
+第三种:从stdin读入一张图片的path,然后进行一次inference.
+
+## 模型训练与预测
+
+### 训练
+
+使用默认数据在GPU单卡上训练:
+
+```
+env CUDA_VISIBLE_DEVICES=0 python train.py
+```
+使用默认数据在CPU上训练:
+```
+env OMP_NUM_THREADS=<num_of_physical_cores> python train.py --use_gpu False --parallel=False
+```
+
+使用默认数据在GPU多卡上训练:
+
+```
+env CUDA_VISIBLE_DEVICES=0,1,2,3 python train.py --parallel=True
+```
+
+默认使用的是`CTC model`, 可以通过选项`--model="attention"`切换为`attention model`。
+
+执行`python train.py --help`可查看更多使用方式和参数详细说明。
+
+图2为使用默认参数在默认数据集上训练`CTC model`的收敛曲线,其中横坐标轴为训练迭代次数,纵轴为样本级错误率。其中,蓝线为训练集上的样本错误率,红线为测试集上的样本错误率。测试集上最低错误率为22.0%.
+
+
+
+图 2
+
+
+图3为使用默认参数在默认数据集上训练`attention model`的收敛曲线,其中横坐标轴为训练迭代次数,纵轴为样本级错误率。其中,蓝线为训练集上的样本错误率,红线为测试集上的样本错误率。测试集上最低错误率为16.25%.
+
+
+
+图 3
+
+
+
+### 测试
+
+通过以下命令调用评估脚本,在指定数据集上对模型进行评估:
+
+```
+env CUDA_VISIBLE_DEVICES=0 python eval.py \
+ --model_path="./models/model_0" \
+ --input_images_dir="./eval_data/images/" \
+ --input_images_list="./eval_data/eval_list"
+```
+
+执行`python eval.py --help`可查看参数详细说明。
+
+
+### 预测
+
+从标准输入读取一张图片的路径,并对其进行预测:
+
+```
+env CUDA_VISIBLE_DEVICES=0 python infer.py \
+ --model_path="models/model_00044_15000"
+```
+
+执行上述命令进行预测的效果如下:
+
+```
+----------- Configuration Arguments -----------
+use_gpu: True
+input_images_dir: None
+input_images_list: None
+model_path: /home/work/models/fluid/ocr_recognition/models/model_00052_15000
+------------------------------------------------
+Init model from: ./models/model_00052_15000.
+Please input the path of image: ./test_images/00001_0060.jpg
+result: [3298 2371 4233 6514 2378 3298 2363]
+Please input the path of image: ./test_images/00001_0429.jpg
+result: [2067 2067 8187 8477 5027 7191 2431 1462]
+```
+
+从文件中批量读取图片路径,并对其进行预测:
+
+```
+env CUDA_VISIBLE_DEVICES=0 python infer.py \
+ --model_path="models/model_00044_15000" \
+ --input_images_list="data/test.list"
+```
+
+## 预训练模型
+
+|模型| 错误率|
+| :--- | :---: |
+|[ocr_ctc_params](https://paddle-ocr-models.bj.bcebos.com/ocr_ctc.zip) | 22.3% |
+|[ocr_attention_params](https://paddle-ocr-models.bj.bcebos.com/ocr_attention.zip) | 15.8%|
diff --git a/fluid/PaddleCV/ocr_recognition/_ce.py b/PaddleCV/ocr_recognition/_ce.py
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/_ce.py
rename to PaddleCV/ocr_recognition/_ce.py
diff --git a/fluid/PaddleCV/ocr_recognition/attention_model.py b/PaddleCV/ocr_recognition/attention_model.py
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/attention_model.py
rename to PaddleCV/ocr_recognition/attention_model.py
diff --git a/fluid/PaddleCV/ocr_recognition/crnn_ctc_model.py b/PaddleCV/ocr_recognition/crnn_ctc_model.py
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/crnn_ctc_model.py
rename to PaddleCV/ocr_recognition/crnn_ctc_model.py
diff --git a/fluid/PaddleCV/ocr_recognition/data_reader.py b/PaddleCV/ocr_recognition/data_reader.py
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/data_reader.py
rename to PaddleCV/ocr_recognition/data_reader.py
diff --git a/fluid/PaddleCV/ocr_recognition/eval.py b/PaddleCV/ocr_recognition/eval.py
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/eval.py
rename to PaddleCV/ocr_recognition/eval.py
diff --git a/fluid/PaddleCV/ocr_recognition/images/demo.jpg b/PaddleCV/ocr_recognition/images/demo.jpg
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/images/demo.jpg
rename to PaddleCV/ocr_recognition/images/demo.jpg
diff --git a/fluid/PaddleCV/ocr_recognition/images/train.jpg b/PaddleCV/ocr_recognition/images/train.jpg
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/images/train.jpg
rename to PaddleCV/ocr_recognition/images/train.jpg
diff --git a/fluid/PaddleCV/ocr_recognition/images/train_attention.jpg b/PaddleCV/ocr_recognition/images/train_attention.jpg
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/images/train_attention.jpg
rename to PaddleCV/ocr_recognition/images/train_attention.jpg
diff --git a/fluid/PaddleCV/ocr_recognition/infer.py b/PaddleCV/ocr_recognition/infer.py
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/infer.py
rename to PaddleCV/ocr_recognition/infer.py
diff --git a/fluid/PaddleCV/ocr_recognition/scripts/README.md b/PaddleCV/ocr_recognition/scripts/README.md
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/scripts/README.md
rename to PaddleCV/ocr_recognition/scripts/README.md
diff --git a/fluid/PaddleCV/ocr_recognition/scripts/infer.sh b/PaddleCV/ocr_recognition/scripts/infer.sh
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/scripts/infer.sh
rename to PaddleCV/ocr_recognition/scripts/infer.sh
diff --git a/fluid/PaddleCV/ocr_recognition/scripts/train.sh b/PaddleCV/ocr_recognition/scripts/train.sh
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/scripts/train.sh
rename to PaddleCV/ocr_recognition/scripts/train.sh
diff --git a/fluid/PaddleCV/ocr_recognition/train.py b/PaddleCV/ocr_recognition/train.py
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/train.py
rename to PaddleCV/ocr_recognition/train.py
diff --git a/fluid/PaddleCV/ocr_recognition/utility.py b/PaddleCV/ocr_recognition/utility.py
similarity index 100%
rename from fluid/PaddleCV/ocr_recognition/utility.py
rename to PaddleCV/ocr_recognition/utility.py
diff --git a/fluid/PaddleCV/faster_rcnn/.gitignore b/PaddleCV/rcnn/.gitignore
similarity index 100%
rename from fluid/PaddleCV/faster_rcnn/.gitignore
rename to PaddleCV/rcnn/.gitignore
diff --git a/PaddleCV/rcnn/.run_ce.sh b/PaddleCV/rcnn/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..15023b716a24c82d7a2ee282a7f0291cc4fa781e
--- /dev/null
+++ b/PaddleCV/rcnn/.run_ce.sh
@@ -0,0 +1,17 @@
+#!/bin/bash
+
+export MKL_NUM_THREADS=1
+export OMP_NUM_THREADS=1
+
+
+cudaid=${face_detection:=0} # use 0-th card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py --model_save_dir=output/ --data_dir=dataset/coco/ --max_iter=100 --enable_ce --pretrained_model=./imagenet_resnet50_fusebn | python _ce.py
+
+
+cudaid=${face_detection_m:=0,1,2,3} # use 0,1,2,3 card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py --model_save_dir=output/ --data_dir=dataset/coco/ --max_iter=100 --enable_ce --pretrained_model=./imagenet_resnet50_fusebn | python _ce.py
+
diff --git a/PaddleCV/rcnn/README.md b/PaddleCV/rcnn/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..97d1736f2b25bd6baa5ab5142be18544c9d63b85
--- /dev/null
+++ b/PaddleCV/rcnn/README.md
@@ -0,0 +1,209 @@
+# RCNN Object Detection
+
+---
+## Table of Contents
+
+- [Installation](#installation)
+- [Introduction](#introduction)
+- [Data preparation](#data-preparation)
+- [Training](#training)
+- [Evaluation](#evaluation)
+- [Inference and Visualization](#inference-and-visualization)
+
+## Installation
+
+Running the sample code in this directory requires PaddlePaddle Fluid v1.3.0 or later. If your PaddlePaddle version is lower than this requirement, please follow the instructions in the [installation document](http://paddlepaddle.org/documentation/docs/en/1.3/beginners_guide/install/index_en.html) to update it.
+
+## Introduction
+
+Region Convolutional Neural Network (RCNN) models are two-stage detectors: they first generate region proposals and extract features, then predict the class of each proposal and refine its location.
+The RCNN family currently includes two representative models: Faster RCNN and Mask RCNN.
+
+[Faster RCNN](https://arxiv.org/abs/1506.01497): the overall network can be divided into four parts:
+
+1. Base conv layer. As a CNN-based object detector, Faster RCNN first extracts feature maps with a basic convolutional network. The feature maps are then shared by the RPN and the fc layers. This sample uses [ResNet-50](https://arxiv.org/abs/1512.03385) as the base conv layer.
+2. Region Proposal Network (RPN). RPN generates proposals for detection. This block generates anchors from a set of sizes and aspect ratios, classifies each anchor as foreground or background with softmax, and then refines the anchors by box regression to obtain more precise proposals.
+3. RoI Align. This layer takes the feature maps and proposals as input, maps each proposal onto the feature maps, and pools it to a fixed size. The output is sent to fc layers for classification and regression. Either RoIPool or RoIAlign can be used for this layer, selected via roi\_func in config.py.
+4. Detection layer. Two fc layers use the output of RoI pooling to compute the class and location of each proposal.
+
+[Mask RCNN](https://arxiv.org/abs/1703.06870) is a classical instance segmentation model and an extension of Faster RCNN.
+
+Mask RCNN is a two-stage model as well. The first stage generates proposals from the input image; the second stage predicts the class and bounding box of each proposal and, through a segmentation branch added on top of the original Faster RCNN model, a mask. This decouples mask prediction from classification.
+
+## Data preparation
+
+Train the model on the [MS-COCO dataset](http://cocodataset.org/#download); download the dataset as follows:
+
+ cd dataset/coco
+ ./download.sh
+
+The data catalog structure is as follows:
+
+ ```
+ data/coco/
+ ├── annotations
+ │ ├── instances_train2014.json
+ │ ├── instances_train2017.json
+ │ ├── instances_val2014.json
+ │ ├── instances_val2017.json
+ | ...
+ ├── train2017
+ │ ├── 000000000009.jpg
+ │ ├── 000000580008.jpg
+ | ...
+ ├── val2017
+ │ ├── 000000000139.jpg
+ │ ├── 000000000285.jpg
+ | ...
+ ```
+
+## Training
+
+**download the pre-trained model:** This sample provides a ResNet-50 pre-trained model converted from Caffe, with the parameters of the batch normalization layers fused. One can download the pre-trained model as:
+
+ sh ./pretrained/download.sh
+
+Set `pretrained_model` to load the pre-trained model. This parameter is also used to load a trained model when finetuning.
+Please make sure the pretrained model is downloaded and loaded correctly, otherwise the loss may become NaN during training.
+
+**Install the [cocoapi](https://github.com/cocodataset/cocoapi):**
+
+To train the model, [cocoapi](https://github.com/cocodataset/cocoapi) is needed. Install the cocoapi:
+
+ git clone https://github.com/cocodataset/cocoapi.git
+ cd cocoapi/PythonAPI
+ # if cython is not installed
+ pip install Cython
+ # Install into global site-packages
+ make install
+ # Alternatively, if you do not have permissions or prefer
+ # not to install the COCO API into global site-packages
+ python2 setup.py install --user
+
+After data preparation, one can start the training step by:
+
+- Faster RCNN
+
+ ```
+ python train.py \
+ --model_save_dir=output/ \
+ --pretrained_model=${path_to_pretrain_model} \
+ --data_dir=${path_to_data} \
+ --MASK_ON=False
+ ```
+
+- Mask RCNN
+
+ ```
+ python train.py \
+ --model_save_dir=output/ \
+ --pretrained_model=${path_to_pretrain_model} \
+ --data_dir=${path_to_data} \
+ --MASK_ON=True
+ ```
+
+ - Set ```export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7``` to specify 8 GPUs for training.
+ - Set ```MASK_ON``` to False to train Faster RCNN, or to True to train Mask RCNN.
+ - For more help on arguments:
+
+ python train.py --help
+
+**data reader introduction:**
+
+* Data reader is defined in `reader.py`.
+* The short side of each image is scaled to `scales`; if the long side then exceeds `max_size`, the long side is scaled down to `max_size` instead (see the sketch below).
+* In the training stage, images are flipped horizontally.
+* Images in the same batch can be padded to the same size.
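+
+A minimal sketch of the resize rule above (a hypothetical helper for illustration, not the exact code in reader.py):
+
+```
+def get_scale(height, width, target_size, max_size):
+    # scale the short side to target_size, but cap the long side at max_size
+    scale = float(target_size) / min(height, width)
+    if scale * max(height, width) > max_size:
+        scale = float(max_size) / max(height, width)
+    return scale
+```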
+
+**model configuration:**
+
+* Either RoIAlign or RoIPool can be used.
+* NMS threshold=0.7. During training, pre\_nms=12000, post\_nms=2000; during test, pre\_nms=6000, post\_nms=1000.
+* In generating proposal labels, fg\_fraction=0.25, fg\_thresh=0.5, bg\_thresh\_hi=0.5, bg\_thresh\_lo=0.0.
+* In RPN target assignment, rpn\_fg\_fraction=0.5, rpn\_positive\_overlap=0.7, rpn\_negative\_overlap=0.3.
+
+**training strategy:**
+
+* Use momentum optimizer with momentum=0.9.
+* Weight decay is 0.0001.
+* In the first 500 iterations, the learning rate increases linearly from 0.00333 to 0.01. It is then decayed at iterations 120000 and 160000 with multipliers 0.1 and 0.01, and the maximum iteration is 180000. We also released a 2x model trained for 360000 iterations, with the learning rate decayed at 240000 and 320000. These values can be set through max_iter and lr_steps in config.py (see the sketch below).
+* The learning rate of biases in non-basic convolutional layers is set to twice the global learning rate.
+* In the basic convolutional layers, the parameters of the affine layers and the res body are not updated.
+
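+A minimal Python sketch of the warmup-plus-step-decay schedule described above (values taken from the defaults listed; an illustration, not the exact Fluid implementation):
+
+```
+warmup_iter, start_lr, base_lr = 500, 0.00333, 0.01
+lr_steps, gamma = [120000, 160000], 0.1
+
+def learning_rate(it):
+    if it < warmup_iter:
+        # linear warmup from start_lr to base_lr
+        return start_lr + (base_lr - start_lr) * float(it) / warmup_iter
+    # step decay: multiply by gamma at each boundary passed
+    decay = sum(1 for s in lr_steps if it >= s)
+    return base_lr * gamma ** decay
+```
+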
+## Evaluation
+
+Evaluation measures the performance of a trained model. This sample provides `eval_coco_map.py`, which uses the COCO-specific mAP metric defined by the [COCO committee](http://cocodataset.org/#detections-eval).
+
+`eval_coco_map.py` is the main executor for evaluation; one can start the evaluation step by:
+
+- Faster RCNN
+
+ ```
+ python eval_coco_map.py \
+ --dataset=coco2017 \
+ --pretrained_model=${path_to_trained_model} \
+ --MASK_ON=False
+ ```
+
+- Mask RCNN
+
+ ```
+ python eval_coco_map.py \
+ --dataset=coco2017 \
+ --pretrained_model=${path_to_trained_model} \
+ --MASK_ON=True
+ ```
+
+ - Set ```--pretrained_model=${path_to_trained_model}``` to specify the trained model, not the initialized model.
+ - Set ```export CUDA_VISIBLE_DEVICES=0``` to specify a single GPU for evaluation.
+ - Set ```MASK_ON``` to False to evaluate Faster RCNN, or to True to evaluate Mask RCNN.
+
+Evaluation results are shown below:
+
+Faster RCNN:
+
+| Model | RoI function | Batch size | Max iteration | mAP |
+| :--------------- | :--------: | :------------: | :------------------: |------: |
+| [Fluid RoIPool minibatch padding](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_pool_minibatch_padding.tar.gz) | RoIPool | 8 | 180000 | 0.316 |
+| [Fluid RoIPool no padding](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_pool_no_padding.tar.gz) | RoIPool | 8 | 180000 | 0.318 |
+| [Fluid RoIAlign no padding](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_align_no_padding.tar.gz) | RoIAlign | 8 | 180000 | 0.348 |
+| [Fluid RoIAlign no padding 2x](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_align_no_padding_2x.tar.gz) | RoIAlign | 8 | 360000 | 0.367 |
+
+* Fluid RoIPool minibatch padding: Use RoIPool; images in one batch are padded to the same size. This method is the same as Detectron's.
+* Fluid RoIPool no padding: Images without padding.
+* Fluid RoIAlign no padding: Images without padding.
+* Fluid RoIAlign no padding 2x: Images without padding, train for 360000 iterations, learning rate is decayed at 240000, 320000.
+
+Mask RCNN:
+
+| Model | Batch size | Max iteration | box mAP | mask mAP |
+| :--------------- | :--------: | :------------: | :--------: |------: |
+| [Fluid mask no padding](https://paddlemodels.bj.bcebos.com/faster_rcnn/Fluid_mask_no_padding.tar.gz) | 8 | 180000 | 0.359 | 0.314 |
+
+* Fluid mask no padding: Use RoIAlign. Images without padding.
+
+## Inference and Visualization
+
+Inference is used to get prediction scores or image features from trained models. `infer.py` is the main executor for inference; one can start the inference step by:
+
+```
+python infer.py \
+ --pretrained_model=${path_to_trained_model} \
+ --image_path=dataset/coco/val2017/000000000139.jpg \
+ --draw_threshold=0.6
+```
+
+Please set the model path and image path correctly. The GPU device is used by default; you can set `--use_gpu=False` to switch to the CPU. You can also tune `draw_threshold` to adjust the score threshold that controls the number of output detection boxes.
+
+Visualization of infer result is shown as below:
+
+
+
+Faster RCNN Visualization Examples
+
+
+
+
+
+Mask RCNN Visualization Examples
+
diff --git a/PaddleCV/rcnn/README_cn.md b/PaddleCV/rcnn/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..3d45de1c5845727d0b942f9ba5a4bf5216985af9
--- /dev/null
+++ b/PaddleCV/rcnn/README_cn.md
@@ -0,0 +1,207 @@
+# RCNN 系列目标检测
+
+---
+## 内容
+
+- [安装](#安装)
+- [简介](#简介)
+- [数据准备](#数据准备)
+- [模型训练](#模型训练)
+- [模型评估](#模型评估)
+- [模型推断及可视化](#模型推断及可视化)
+
+## 安装
+
+在当前目录下运行样例代码需要PaddlePaddle Fluid v1.3.0或以上的版本。如果你的运行环境中的PaddlePaddle低于此版本,请根据[安装文档](http://www.paddlepaddle.org/)中的说明来更新PaddlePaddle。
+
+## 简介
+区域卷积神经网络(RCNN)系列模型为两阶段目标检测器。通过对图像生成候选区域,提取特征,判别特征类别并修正候选框位置。
+RCNN系列目前包含两个代表模型:Faster RCNN,Mask RCNN
+
+[Faster RCNN](https://arxiv.org/abs/1506.01497) 整体网络可以分为4个主要内容:
+
+1. 基础卷积层。作为一种卷积神经网络目标检测方法,Faster RCNN首先使用一组基础的卷积网络提取图像的特征图。特征图被后续RPN层和全连接层共享。本示例采用[ResNet-50](https://arxiv.org/abs/1512.03385)作为基础卷积层。
+2. 区域生成网络(RPN)。RPN网络用于生成候选区域(proposals)。该层通过一组固定的尺寸和比例得到一组锚点(anchors), 通过softmax判断锚点属于前景或者背景,再利用区域回归修正锚点从而获得精确的候选区域。
+3. RoI Align。该层收集输入的特征图和候选区域,将候选区域映射到特征图中并池化为统一大小的区域特征图,送入全连接层判定目标类别, 该层可选用RoIPool和RoIAlign两种方式,在config.py中设置roi\_func。
+4. 检测层。利用区域特征图计算候选区域的类别,同时再次通过区域回归获得检测框最终的精确位置。
+
+[Mask RCNN](https://arxiv.org/abs/1703.06870) 扩展自Faster RCNN,是经典的实例分割模型。
+
+Mask RCNN同样为两阶段框架,第一阶段扫描图像生成候选框;第二阶段根据候选框得到分类结果和边界框,同时在原有Faster RCNN模型基础上添加分割分支,得到掩码结果,实现了掩码和类别预测关系的解耦。
+
+
+## 数据准备
+
+在[MS-COCO数据集](http://cocodataset.org/#download)上进行训练,通过如下方式下载数据集。
+
+ cd dataset/coco
+ ./download.sh
+
+数据目录结构如下:
+
+```
+data/coco/
+├── annotations
+│ ├── instances_train2014.json
+│ ├── instances_train2017.json
+│ ├── instances_val2014.json
+│ ├── instances_val2017.json
+| ...
+├── train2017
+│ ├── 000000000009.jpg
+│ ├── 000000580008.jpg
+| ...
+├── val2017
+│ ├── 000000000139.jpg
+│ ├── 000000000285.jpg
+| ...
+
+```
+
+## 模型训练
+
+**下载预训练模型:** 本示例提供ResNet-50预训练模型,该模型转换自Caffe,并对批标准化层(Batch Normalization Layer)进行了参数融合。采用如下命令下载预训练模型:
+
+ sh ./pretrained/download.sh
+
+通过初始化`pretrained_model` 加载预训练模型。同时在参数微调时也采用该设置加载已训练模型。
+请在训练前确认预训练模型下载与加载正确,否则训练过程中损失可能会出现NAN。
+
+**安装[cocoapi](https://github.com/cocodataset/cocoapi):**
+
+训练前需要首先下载[cocoapi](https://github.com/cocodataset/cocoapi):
+
+ git clone https://github.com/cocodataset/cocoapi.git
+ cd cocoapi/PythonAPI
+ # if cython is not installed
+ pip install Cython
+ # Install into global site-packages
+ make install
+ # Alternatively, if you do not have permissions or prefer
+ # not to install the COCO API into global site-packages
+ python2 setup.py install --user
+
+After the data is prepared, training can be started as follows:
+
+- Faster RCNN
+
+ ```
+ python train.py \
+ --model_save_dir=output/ \
+ --pretrained_model=${path_to_pretrain_model} \
+ --data_dir=${path_to_data} \
+ --MASK_ON=False
+ ```
+
+- Mask RCNN
+
+ ```
+ python train.py \
+ --model_save_dir=output/ \
+ --pretrained_model=${path_to_pretrain_model} \
+ --data_dir=${path_to_data} \
+ --MASK_ON=True
+ ```
+
+    - Set export CUDA\_VISIBLE\_DEVICES=0,1,2,3,4,5,6,7 to train on 8 GPUs.
+    - Set ```MASK_ON``` to choose between the Faster RCNN and Mask RCNN models.
+    - For optional arguments, see:
+
+ python train.py --help
+
+**Data reader note:** The data reader is defined in reader.py. All images are resized so that the short side equals `scales`; if the long side then exceeds `max_size`, the image is rescaled again so that the long side equals `max_size` (a short sketch of this rule follows below). During training, images are flipped horizontally. Padding the images within a batch to the same size is supported.
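+
+A minimal sketch of this resize rule (mirroring `prep_im_for_blob` in `data_utils.py`; the function name here is illustrative):
+
+```
+import numpy as np
+
+def compute_im_scale(im_h, im_w, target_size=800, max_size=1333):
+    # Scale the short side to target_size, then cap the long side at max_size.
+    im_scale = float(target_size) / min(im_h, im_w)
+    if np.round(im_scale * max(im_h, im_w)) > max_size:
+        im_scale = float(max_size) / max(im_h, im_w)
+    return im_scale
+
+print(compute_im_scale(480, 640))   # 1.6667: 480 -> 800, 640 -> 1067 <= 1333
+print(compute_im_scale(400, 1200))  # 1.1108: capped so that 1200 -> 1333
+```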
+
+**Model settings:**
+
+* Both RoIAlign and RoIPool are supported.
+* During training, pre\_nms=12000 and post\_nms=2000; during testing, pre\_nms=6000 and post\_nms=1000. The NMS threshold is 0.7.
+* When the RPN assigns labels, fg\_fraction=0.25, fg\_thresh=0.5, bg\_thresh\_hi=0.5, bg\_thresh\_lo=0.0.
+* When the RPN samples anchors, rpn\_fg\_fraction=0.5, rpn\_positive\_overlap=0.7, rpn\_negative\_overlap=0.3.
+
+
+**Training strategy:**
+
+* Training uses the momentum optimizer with momentum=0.9.
+* The weight decay coefficient is 0.0001. Over the first 500 iterations the learning rate increases linearly from 0.00333 to 0.01; it is then decayed with multipliers 0.1 and 0.01 at iterations 120000 and 160000, and training runs for at most 180000 iterations (see the sketch after this list). We also provide a 2x model trained with more iterations: 360000 in total, with the learning rate decayed at 240000 and 320000 and all other settings unchanged. The maximum number of iterations and the learning-rate schedule can be configured via max_iter and lr_steps in config.py.
+* The learning rate of the convolution biases outside the backbone is twice the overall learning rate.
+* In the backbone, the affine_layers parameters are not updated, and the res2 parameters are not updated.
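+
+A pure-Python sketch of this learning-rate schedule using the config.py defaults (the function name is illustrative, not part of the codebase):
+
+```
+def learning_rate_at(it, base_lr=0.01, warm_up_iter=500,
+                     warm_up_factor=1. / 3., lr_steps=(120000, 160000),
+                     lr_gamma=0.1):
+    # Linear warmup from base_lr * warm_up_factor (0.00333) up to base_lr.
+    if it < warm_up_iter:
+        alpha = it / float(warm_up_iter)
+        return base_lr * (warm_up_factor * (1.0 - alpha) + alpha)
+    # Piecewise decay: multiply by lr_gamma at each step in lr_steps.
+    lr = base_lr
+    for step in lr_steps:
+        if it >= step:
+            lr *= lr_gamma
+    return lr
+
+print(learning_rate_at(0))       # 0.00333...
+print(learning_rate_at(150000))  # 0.001
+print(learning_rate_at(170000))  # 0.0001
+```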
+
+## Evaluation
+
+Evaluation measures the performance of a trained model on various metrics. This example uses the [official COCO evaluation](http://cocodataset.org/#detections-eval).
+
+`eval_coco_map.py` is the main executor of the evaluation module and can be invoked as follows:
+
+- Faster RCNN
+
+ ```
+ python eval_coco_map.py \
+ --dataset=coco2017 \
+ --pretrained_model=${path_to_trained_model} \
+ --MASK_ON=False
+ ```
+
+- Mask RCNN
+
+ ```
+ python eval_coco_map.py \
+ --dataset=coco2017 \
+ --pretrained_model=${path_to_trained_model} \
+ --MASK_ON=True
+ ```
+
+    - Set `--pretrained_model=${path_to_trained_model}` to specify the trained model to evaluate; note that this is not an initialization model.
+    - Set `export CUDA_VISIBLE_DEVICES=0` to evaluate on a single GPU.
+    - Set ```MASK_ON``` to choose between the Faster RCNN and Mask RCNN models.
+
+The table below shows the evaluation results:
+
+Faster RCNN
+
+| Model | RoI function | Batch size | Max iteration | mAP |
+| :--------------- | :--------: | :------------: | :------------------: |------: |
+| [Fluid RoIPool minibatch padding](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_pool_minibatch_padding.tar.gz) | RoIPool | 8 | 180000 | 0.316 |
+| [Fluid RoIPool no padding](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_pool_no_padding.tar.gz) | RoIPool | 8 | 180000 | 0.318 |
+| [Fluid RoIAlign no padding](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_align_no_padding.tar.gz) | RoIAlign | 8 | 180000 | 0.348 |
+| [Fluid RoIAlign no padding 2x](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_align_no_padding_2x.tar.gz) | RoIAlign | 8 | 360000 | 0.367 |
+
+
+
+* Fluid RoIPool minibatch padding: Uses RoIPool; images within a batch are padded to the same size. This matches Detectron's preprocessing.
+* Fluid RoIPool no padding: Uses RoIPool; images are not padded.
+* Fluid RoIAlign no padding: Uses RoIAlign; images are not padded.
+* Fluid RoIAlign no padding 2x: Uses RoIAlign; images are not padded. Trained for 360000 iterations with the learning rate decayed at iterations 240000 and 320000.
+
+Mask RCNN:
+
+| Model | Batch size | Max iteration | box mAP | mask mAP |
+| :--------------- | :--------: | :------------: | :--------: |------: |
+| [Fluid mask no padding](https://paddlemodels.bj.bcebos.com/faster_rcnn/Fluid_mask_no_padding.tar.gz) | 8 | 180000 | 0.359 | 0.314 |
+
+* Fluid mask no padding: Uses RoIAlign; images are not padded.
+
+## Inference and Visualization
+
+Inference retrieves the objects in an image together with their corresponding classes. `infer.py` is the main executor and can be invoked as follows:
+
+```
+python infer.py \
+ --pretrained_model=${path_to_trained_model} \
+ --image_path=dataset/coco/val2017/000000000139.jpg \
+ --draw_threshold=0.6
+```
+
+Note: please set the model path `${path_to_trained_model}` and the image path correctly. The GPU device is used by default; set `--use_gpu=False` to use the CPU instead. Set `draw_threshold` to tune the score threshold that controls the number of detection boxes.
+
+The visualized predictions are shown below:
+
+
+
+Faster RCNN Visualization Examples
+
+
+
+
+
+Mask RCNN Visualization Examples
+
diff --git a/fluid/PaddleCV/image_classification/__init__.py b/PaddleCV/rcnn/__init__.py
similarity index 100%
rename from fluid/PaddleCV/image_classification/__init__.py
rename to PaddleCV/rcnn/__init__.py
diff --git a/PaddleCV/rcnn/_ce.py b/PaddleCV/rcnn/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..e331d1bb7cccce5ac914dfa3417fe9090bd9cf99
--- /dev/null
+++ b/PaddleCV/rcnn/_ce.py
@@ -0,0 +1,62 @@
+# this file is only used for continuous evaluation test!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi
+from kpi import DurationKpi
+
+each_pass_duration_card1_kpi = DurationKpi(
+ 'each_pass_duration_card1', 0.08, 0, actived=True)
+train_loss_card1_kpi = CostKpi('train_loss_card1', 0.08, 0)
+each_pass_duration_card4_kpi = DurationKpi(
+ 'each_pass_duration_card4', 0.08, 0, actived=True)
+train_loss_card4_kpi = CostKpi('train_loss_card4', 0.08, 0)
+
+tracking_kpis = [
+ each_pass_duration_card1_kpi,
+ train_loss_card1_kpi,
+ each_pass_duration_card4_kpi,
+ train_loss_card4_kpi,
+]
+
+
+def parse_log(log):
+    '''
+    Parse KPI records from a training log.
+
+    Each KPI line is expected to contain three tab-separated fields, the
+    first being the literal marker "kpis", for example:
+
+    "
+    kpis\ttrain_loss_card1\t1.0
+    kpis\teach_pass_duration_card1\t0.5
+    "
+    '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
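+    '''Feed each (kpi_name, kpi_value) record parsed from the log into its
+    tracker and persist it.'''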
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ log_to_ce(log)
diff --git a/PaddleCV/rcnn/box_utils.py b/PaddleCV/rcnn/box_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..bb3fe9c8f0cb261004578abba651ad7210518a22
--- /dev/null
+++ b/PaddleCV/rcnn/box_utils.py
@@ -0,0 +1,144 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# Based on:
+# --------------------------------------------------------
+# Detectron
+# Copyright (c) 2017-present, Facebook, Inc.
+# Licensed under the Apache License, Version 2.0;
+# Written by Ross Girshick
+# --------------------------------------------------------
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+import numpy as np
+
+
+def xywh_to_xyxy(xywh):
+ """Convert [x1 y1 w h] box format to [x1 y1 x2 y2] format."""
+ if isinstance(xywh, (list, tuple)):
+ # Single box given as a list of coordinates
+ assert len(xywh) == 4
+ x1, y1 = xywh[0], xywh[1]
+ x2 = x1 + np.maximum(0., xywh[2] - 1.)
+ y2 = y1 + np.maximum(0., xywh[3] - 1.)
+ return (x1, y1, x2, y2)
+ elif isinstance(xywh, np.ndarray):
+ # Multiple boxes given as a 2D ndarray
+ return np.hstack(
+ (xywh[:, 0:2], xywh[:, 0:2] + np.maximum(0, xywh[:, 2:4] - 1)))
+ else:
+ raise TypeError('Argument xywh must be a list, tuple, or numpy array.')
+
+
+def xyxy_to_xywh(xyxy):
+ """Convert [x1 y1 x2 y2] box format to [x1 y1 w h] format."""
+ if isinstance(xyxy, (list, tuple)):
+ # Single box given as a list of coordinates
+ assert len(xyxy) == 4
+ x1, y1 = xyxy[0], xyxy[1]
+ w = xyxy[2] - x1 + 1
+ h = xyxy[3] - y1 + 1
+ return (x1, y1, w, h)
+ elif isinstance(xyxy, np.ndarray):
+ # Multiple boxes given as a 2D ndarray
+ return np.hstack((xyxy[:, 0:2], xyxy[:, 2:4] - xyxy[:, 0:2] + 1))
+ else:
+ raise TypeError('Argument xyxy must be a list, tuple, or numpy array.')
+
+
+def clip_xyxy_to_image(x1, y1, x2, y2, height, width):
+ """Clip coordinates to an image with the given height and width."""
+ x1 = np.minimum(width - 1., np.maximum(0., x1))
+ y1 = np.minimum(height - 1., np.maximum(0., y1))
+ x2 = np.minimum(width - 1., np.maximum(0., x2))
+ y2 = np.minimum(height - 1., np.maximum(0., y2))
+ return x1, y1, x2, y2
+
+
+def nms(dets, thresh):
+ """Apply classic DPM-style greedy NMS."""
+ if dets.shape[0] == 0:
+ return []
+ x1 = dets[:, 0]
+ y1 = dets[:, 1]
+ x2 = dets[:, 2]
+ y2 = dets[:, 3]
+ scores = dets[:, 4]
+
+ areas = (x2 - x1 + 1) * (y2 - y1 + 1)
+ order = scores.argsort()[::-1]
+
+ ndets = dets.shape[0]
+    suppressed = np.zeros((ndets), dtype=np.int32)
+
+ # nominal indices
+ # _i, _j
+ # sorted indices
+ # i, j
+ # temp variables for box i's (the box currently under consideration)
+ # ix1, iy1, ix2, iy2, iarea
+
+ # variables for computing overlap with box j (lower scoring box)
+ # xx1, yy1, xx2, yy2
+ # w, h
+ # inter, ovr
+
+ for _i in range(ndets):
+ i = order[_i]
+ if suppressed[i] == 1:
+ continue
+ ix1 = x1[i]
+ iy1 = y1[i]
+ ix2 = x2[i]
+ iy2 = y2[i]
+ iarea = areas[i]
+ for _j in range(_i + 1, ndets):
+ j = order[_j]
+ if suppressed[j] == 1:
+ continue
+ xx1 = max(ix1, x1[j])
+ yy1 = max(iy1, y1[j])
+ xx2 = min(ix2, x2[j])
+ yy2 = min(iy2, y2[j])
+ w = max(0.0, xx2 - xx1 + 1)
+ h = max(0.0, yy2 - yy1 + 1)
+ inter = w * h
+ ovr = inter / (iarea + areas[j] - inter)
+ if ovr >= thresh:
+ suppressed[j] = 1
+
+ return np.where(suppressed == 0)[0]
+
+
+def expand_boxes(boxes, scale):
+ """Expand an array of boxes by a given scale."""
+ w_half = (boxes[:, 2] - boxes[:, 0]) * .5
+ h_half = (boxes[:, 3] - boxes[:, 1]) * .5
+ x_c = (boxes[:, 2] + boxes[:, 0]) * .5
+ y_c = (boxes[:, 3] + boxes[:, 1]) * .5
+
+ w_half *= scale
+ h_half *= scale
+
+ boxes_exp = np.zeros(boxes.shape)
+ boxes_exp[:, 0] = x_c - w_half
+ boxes_exp[:, 2] = x_c + w_half
+ boxes_exp[:, 1] = y_c - h_half
+ boxes_exp[:, 3] = y_c + h_half
+
+ return boxes_exp
diff --git a/PaddleCV/rcnn/colormap.py b/PaddleCV/rcnn/colormap.py
new file mode 100644
index 0000000000000000000000000000000000000000..8c2447794fc2e9841b30c2cdf11e8fc70d20d764
--- /dev/null
+++ b/PaddleCV/rcnn/colormap.py
@@ -0,0 +1,61 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+#
+# Based on:
+# --------------------------------------------------------
+# Detectron
+# Copyright (c) 2017-present, Facebook, Inc.
+# Licensed under the Apache License, Version 2.0;
+# Written by Ross Girshick
+# --------------------------------------------------------
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+import numpy as np
+
+
+def colormap(rgb=False):
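+    """Return the Detectron color palette as an (N, 3) array scaled to
+    0-255, in BGR order by default or RGB when rgb=True."""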
+ color_list = np.array([
+ 0.000, 0.447, 0.741, 0.850, 0.325, 0.098, 0.929, 0.694, 0.125, 0.494,
+ 0.184, 0.556, 0.466, 0.674, 0.188, 0.301, 0.745, 0.933, 0.635, 0.078,
+ 0.184, 0.300, 0.300, 0.300, 0.600, 0.600, 0.600, 1.000, 0.000, 0.000,
+ 1.000, 0.500, 0.000, 0.749, 0.749, 0.000, 0.000, 1.000, 0.000, 0.000,
+ 0.000, 1.000, 0.667, 0.000, 1.000, 0.333, 0.333, 0.000, 0.333, 0.667,
+ 0.000, 0.333, 1.000, 0.000, 0.667, 0.333, 0.000, 0.667, 0.667, 0.000,
+ 0.667, 1.000, 0.000, 1.000, 0.333, 0.000, 1.000, 0.667, 0.000, 1.000,
+ 1.000, 0.000, 0.000, 0.333, 0.500, 0.000, 0.667, 0.500, 0.000, 1.000,
+ 0.500, 0.333, 0.000, 0.500, 0.333, 0.333, 0.500, 0.333, 0.667, 0.500,
+ 0.333, 1.000, 0.500, 0.667, 0.000, 0.500, 0.667, 0.333, 0.500, 0.667,
+ 0.667, 0.500, 0.667, 1.000, 0.500, 1.000, 0.000, 0.500, 1.000, 0.333,
+ 0.500, 1.000, 0.667, 0.500, 1.000, 1.000, 0.500, 0.000, 0.333, 1.000,
+ 0.000, 0.667, 1.000, 0.000, 1.000, 1.000, 0.333, 0.000, 1.000, 0.333,
+ 0.333, 1.000, 0.333, 0.667, 1.000, 0.333, 1.000, 1.000, 0.667, 0.000,
+ 1.000, 0.667, 0.333, 1.000, 0.667, 0.667, 1.000, 0.667, 1.000, 1.000,
+ 1.000, 0.000, 1.000, 1.000, 0.333, 1.000, 1.000, 0.667, 1.000, 0.167,
+ 0.000, 0.000, 0.333, 0.000, 0.000, 0.500, 0.000, 0.000, 0.667, 0.000,
+ 0.000, 0.833, 0.000, 0.000, 1.000, 0.000, 0.000, 0.000, 0.167, 0.000,
+ 0.000, 0.333, 0.000, 0.000, 0.500, 0.000, 0.000, 0.667, 0.000, 0.000,
+ 0.833, 0.000, 0.000, 1.000, 0.000, 0.000, 0.000, 0.167, 0.000, 0.000,
+ 0.333, 0.000, 0.000, 0.500, 0.000, 0.000, 0.667, 0.000, 0.000, 0.833,
+ 0.000, 0.000, 1.000, 0.000, 0.000, 0.000, 0.143, 0.143, 0.143, 0.286,
+ 0.286, 0.286, 0.429, 0.429, 0.429, 0.571, 0.571, 0.571, 0.714, 0.714,
+ 0.714, 0.857, 0.857, 0.857, 1.000, 1.000, 1.000
+ ]).astype(np.float32)
+ color_list = color_list.reshape((-1, 3)) * 255
+ if not rgb:
+ color_list = color_list[:, ::-1]
+ return color_list
diff --git a/PaddleCV/rcnn/config.py b/PaddleCV/rcnn/config.py
new file mode 100644
index 0000000000000000000000000000000000000000..2a8ebdf7c1871f5863facd6e2138993ed4d7ffd1
--- /dev/null
+++ b/PaddleCV/rcnn/config.py
@@ -0,0 +1,238 @@
+# Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved.
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+# http://www.apache.org/licenses/LICENSE-2.0
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+from edict import AttrDict
+import six
+import numpy as np
+
+_C = AttrDict()
+cfg = _C
+
+#
+# Training options
+#
+_C.TRAIN = AttrDict()
+
+# scales an image's shortest side
+_C.TRAIN.scales = [800]
+
+# max size of longest side
+_C.TRAIN.max_size = 1333
+
+# images per GPU in minibatch
+_C.TRAIN.im_per_batch = 1
+
+# roi minibatch size per image
+_C.TRAIN.batch_size_per_im = 512
+
+# target fraction of foreground roi minibatch
+_C.TRAIN.fg_fraction = 0.25
+
+# overlap threshold for a foreground roi
+_C.TRAIN.fg_thresh = 0.5
+
+# overlap threshold for a background roi
+_C.TRAIN.bg_thresh_hi = 0.5
+_C.TRAIN.bg_thresh_lo = 0.0
+
+# If False, only resize image and not pad, image shape is different between
+# GPUs in one mini-batch. If True, image shape is the same in one mini-batch.
+_C.TRAIN.padding_minibatch = False
+
+# Snapshot period
+_C.TRAIN.snapshot_iter = 10000
+
+# number of RPN proposals to keep before NMS
+_C.TRAIN.rpn_pre_nms_top_n = 12000
+
+# number of RPN proposals to keep after NMS
+_C.TRAIN.rpn_post_nms_top_n = 2000
+
+# NMS threshold used on RPN proposals
+_C.TRAIN.rpn_nms_thresh = 0.7
+
+# min size in RPN proposals
+_C.TRAIN.rpn_min_size = 0.0
+
+# eta for adaptive NMS in RPN
+_C.TRAIN.rpn_eta = 1.0
+
+# number of RPN examples per image
+_C.TRAIN.rpn_batch_size_per_im = 256
+
+# remove anchors out of the image
+_C.TRAIN.rpn_straddle_thresh = 0.
+
+# target fraction of foreground examples per RPN minibatch
+_C.TRAIN.rpn_fg_fraction = 0.5
+
+# min overlap between anchor and gt box to be a positive example
+_C.TRAIN.rpn_positive_overlap = 0.7
+
+# max overlap between anchor and gt box to be a negative example
+_C.TRAIN.rpn_negative_overlap = 0.3
+
+# stopgrad at a specified stage
+_C.TRAIN.freeze_at = 2
+
+# min area of ground truth box
+_C.TRAIN.gt_min_area = -1
+
+# Use horizontally-flipped images during training?
+_C.TRAIN.use_flipped = True
+
+#
+# Inference options
+#
+_C.TEST = AttrDict()
+
+# scales an image's shortest side
+_C.TEST.scales = [800]
+
+# max size of longest side
+_C.TEST.max_size = 1333
+
+# eta for adaptive NMS in RPN
+_C.TEST.rpn_eta = 1.0
+
+# min score threshold to infer
+_C.TEST.score_thresh = 0.05
+
+# overlap threshold used for NMS
+_C.TEST.nms_thresh = 0.5
+
+# number of RPN proposals to keep before NMS
+_C.TEST.rpn_pre_nms_top_n = 6000
+
+# number of RPN proposals to keep after NMS
+_C.TEST.rpn_post_nms_top_n = 1000
+
+# min size in RPN proposals
+_C.TEST.rpn_min_size = 0.0
+
+# max number of detections
+_C.TEST.detections_per_im = 100
+
+# NMS threshold used on RPN proposals
+_C.TEST.rpn_nms_thresh = 0.7
+
+#
+# Model options
+#
+
+# Whether to use the Mask RCNN head
+_C.MASK_ON = True
+
+# weight for bbox regression targets
+_C.bbox_reg_weights = [0.1, 0.1, 0.2, 0.2]
+
+# RPN anchor sizes
+_C.anchor_sizes = [32, 64, 128, 256, 512]
+
+# RPN anchor ratio
+_C.aspect_ratio = [0.5, 1, 2]
+
+# variance of anchors
+_C.variances = [1., 1., 1., 1.]
+
+# stride of feature map
+_C.rpn_stride = [16.0, 16.0]
+
+# Use roi pool or roi align, 'RoIPool' or 'RoIAlign'
+_C.roi_func = 'RoIAlign'
+
+# sampling ratio for roi align
+_C.sampling_ratio = 0
+
+# pooled width and pooled height
+_C.roi_resolution = 14
+
+# spatial scale
+_C.spatial_scale = 1. / 16.
+
+# resolution to represent mask labels
+_C.resolution = 14
+
+# Number of channels in the mask head
+_C.dim_reduced = 256
+
+# Threshold for converting soft masks to hard masks
+_C.mrcnn_thresh_binarize = 0.5
+
+#
+# SOLVER options
+#
+
+# base learning rate, from which the final learning rate is derived.
+_C.learning_rate = 0.01
+
+# maximum number of iterations, 1x: 180000, 2x:360000
+_C.max_iter = 180000
+#_C.max_iter = 360000
+
+# warm up to learning rate
+_C.warm_up_iter = 500
+_C.warm_up_factor = 1. / 3.
+
+# lr steps_with_decay, 1x: [120000, 160000], 2x: [240000, 320000]
+_C.lr_steps = [120000, 160000]
+#_C.lr_steps = [240000, 320000]
+_C.lr_gamma = 0.1
+
+# L2 regularization hyperparameter
+_C.weight_decay = 0.0001
+
+# momentum with SGD
+_C.momentum = 0.9
+
+#
+# ENV options
+#
+
+# support both CPU and GPU
+_C.use_gpu = True
+
+# Whether to run in parallel across devices
+_C.parallel = True
+
+# Class number
+_C.class_num = 81
+
+# support pyreader
+_C.use_pyreader = True
+
+# pixel mean values
+_C.pixel_means = [102.9801, 115.9465, 122.7717]
+
+# clip box to prevent overflowing
+_C.bbox_clip = np.log(1000. / 16.)
+
+
+def merge_cfg_from_args(args, mode):
+ """Merge config keys, values in args into the global config."""
+ if mode == 'train':
+ sub_d = _C.TRAIN
+ else:
+ sub_d = _C.TEST
+ for k, v in sorted(six.iteritems(vars(args))):
+ d = _C
+        # Interpret the string value as a Python literal (number, list,
+        # bool, ...) when possible; otherwise keep the raw string.
+        try:
+            value = eval(v)
+        except Exception:
+            value = v
+ if k in sub_d:
+ sub_d[k] = value
+ else:
+ d[k] = value
diff --git a/PaddleCV/rcnn/data_utils.py b/PaddleCV/rcnn/data_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..86be7f1d49762c7e57180304edfde0810374449d
--- /dev/null
+++ b/PaddleCV/rcnn/data_utils.py
@@ -0,0 +1,104 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# Based on:
+# --------------------------------------------------------
+# Detectron
+# Copyright (c) 2017-present, Facebook, Inc.
+# Licensed under the Apache License, Version 2.0;
+# Written by Ross Girshick
+# --------------------------------------------------------
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+import cv2
+import numpy as np
+from config import cfg
+import os
+
+
+class DatasetPath(object):
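+    """Resolve the image directory and annotation file path for a COCO
+    train/val split."""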
+ def __init__(self, mode):
+ self.mode = mode
+ mode_name = 'train' if mode == 'train' else 'val'
+ if cfg.dataset != 'coco2014' and cfg.dataset != 'coco2017':
+ raise NotImplementedError('Dataset {} not supported'.format(
+ cfg.dataset))
+ self.sub_name = mode_name + cfg.dataset[-4:]
+
+ def get_data_dir(self):
+ return os.path.join(cfg.data_dir, self.sub_name)
+
+ def get_file_list(self):
+ sfile_list = 'annotations/instances_' + self.sub_name + '.json'
+ return os.path.join(cfg.data_dir, sfile_list)
+
+
+def get_image_blob(roidb, mode):
+ """Builds an input blob from the images in the roidb at the specified
+ scales.
+ """
+ if mode == 'train':
+ scales = cfg.TRAIN.scales
+ scale_ind = np.random.randint(0, high=len(scales))
+ target_size = scales[scale_ind]
+ max_size = cfg.TRAIN.max_size
+ else:
+ target_size = cfg.TEST.scales[0]
+ max_size = cfg.TEST.max_size
+ im = cv2.imread(roidb['image'])
+    if im is None:
+        print('Failed to read image \'{}\''.format(roidb['image']))
+        os._exit(0)
+ if roidb['flipped']:
+ im = im[:, ::-1, :]
+ im, im_scale = prep_im_for_blob(im, cfg.pixel_means, target_size, max_size)
+
+ return im, im_scale
+
+
+def prep_im_for_blob(im, pixel_means, target_size, max_size):
+ """Prepare an image for use as a network input blob. Specially:
+ - Subtract per-channel pixel mean
+ - Convert to float32
+ - Rescale to each of the specified target size (capped at max_size)
+ Returns a list of transformed images, one for each target size. Also returns
+ the scale factors that were used to compute each returned image.
+ """
+ im = im.astype(np.float32, copy=False)
+ im -= pixel_means
+
+ im_shape = im.shape
+ im_size_min = np.min(im_shape[0:2])
+ im_size_max = np.max(im_shape[0:2])
+ im_scale = float(target_size) / float(im_size_min)
+ # Prevent the biggest axis from being more than max_size
+ if np.round(im_scale * im_size_max) > max_size:
+ im_scale = float(max_size) / float(im_size_max)
+ im = cv2.resize(
+ im,
+ None,
+ None,
+ fx=im_scale,
+ fy=im_scale,
+ interpolation=cv2.INTER_LINEAR)
+ im_height, im_width, channel = im.shape
+ channel_swap = (2, 0, 1) #(batch, channel, height, width)
+ im = im.transpose(channel_swap)
+ return im, im_scale
diff --git a/fluid/PaddleCV/object_detection/data/coco/download.sh b/PaddleCV/rcnn/dataset/coco/download.sh
similarity index 100%
rename from fluid/PaddleCV/object_detection/data/coco/download.sh
rename to PaddleCV/rcnn/dataset/coco/download.sh
diff --git a/fluid/PaddleCV/faster_rcnn/edict.py b/PaddleCV/rcnn/edict.py
similarity index 100%
rename from fluid/PaddleCV/faster_rcnn/edict.py
rename to PaddleCV/rcnn/edict.py
diff --git a/PaddleCV/rcnn/eval_coco_map.py b/PaddleCV/rcnn/eval_coco_map.py
new file mode 100644
index 0000000000000000000000000000000000000000..6b01da55ade6e9b5dabf370c30208c465028df9f
--- /dev/null
+++ b/PaddleCV/rcnn/eval_coco_map.py
@@ -0,0 +1,142 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import time
+import numpy as np
+from eval_helper import *
+import paddle
+import paddle.fluid as fluid
+import reader
+from utility import print_arguments, parse_args
+import models.model_builder as model_builder
+import models.resnet as resnet
+import json
+from pycocotools.coco import COCO
+from pycocotools.cocoeval import COCOeval, Params
+from config import cfg
+from data_utils import DatasetPath
+
+
+def eval():
+
+ data_path = DatasetPath('val')
+ test_list = data_path.get_file_list()
+
+ image_shape = [3, cfg.TEST.max_size, cfg.TEST.max_size]
+ class_nums = cfg.class_num
+ devices = os.getenv("CUDA_VISIBLE_DEVICES") or ""
+ devices_num = len(devices.split(","))
+ total_batch_size = devices_num * cfg.TRAIN.im_per_batch
+ cocoGt = COCO(test_list)
+ num_id_to_cat_id_map = {i + 1: v for i, v in enumerate(cocoGt.getCatIds())}
+ category_ids = cocoGt.getCatIds()
+ label_list = {
+ item['id']: item['name']
+ for item in cocoGt.loadCats(category_ids)
+ }
+    label_list[0] = 'background'
+
+ model = model_builder.RCNN(
+ add_conv_body_func=resnet.add_ResNet50_conv4_body,
+ add_roi_box_head_func=resnet.add_ResNet_roi_conv5_head,
+ use_pyreader=False,
+ mode='val')
+ model.build_model(image_shape)
+ pred_boxes = model.eval_bbox_out()
+ if cfg.MASK_ON:
+ masks = model.eval_mask_out()
+ place = fluid.CUDAPlace(0) if cfg.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+ # yapf: disable
+ if cfg.pretrained_model:
+ def if_exist(var):
+ return os.path.exists(os.path.join(cfg.pretrained_model, var.name))
+ fluid.io.load_vars(exe, cfg.pretrained_model, predicate=if_exist)
+
+ # yapf: enable
+ test_reader = reader.test(total_batch_size)
+ feeder = fluid.DataFeeder(place=place, feed_list=model.feeds())
+
+ dts_res = []
+ segms_res = []
+ if cfg.MASK_ON:
+ fetch_list = [pred_boxes, masks]
+ else:
+ fetch_list = [pred_boxes]
+ eval_start = time.time()
+ for batch_id, batch_data in enumerate(test_reader()):
+ start = time.time()
+ im_info = []
+ for data in batch_data:
+ im_info.append(data[1])
+ results = exe.run(fetch_list=[v.name for v in fetch_list],
+ feed=feeder.feed(batch_data),
+ return_numpy=False)
+
+ pred_boxes_v = results[0]
+ if cfg.MASK_ON:
+ masks_v = results[1]
+
+ new_lod = pred_boxes_v.lod()
+ nmsed_out = pred_boxes_v
+
+ dts_res += get_dt_res(total_batch_size, new_lod[0], nmsed_out,
+ batch_data, num_id_to_cat_id_map)
+
+ if cfg.MASK_ON and np.array(masks_v).shape != (1, 1):
+ segms_out = segm_results(nmsed_out, masks_v, im_info)
+ segms_res += get_segms_res(total_batch_size, new_lod[0], segms_out,
+ batch_data, num_id_to_cat_id_map)
+ end = time.time()
+ print('batch id: {}, time: {}'.format(batch_id, end - start))
+ eval_end = time.time()
+ total_time = eval_end - eval_start
+ print('average time of eval is: {}'.format(total_time / (batch_id + 1)))
+    assert len(dts_res) > 0, "The number of valid detected bboxes is zero.\n \
+        Please use a reasonable model and check the input data."
+
+ if cfg.MASK_ON:
+        assert len(
+            segms_res) > 0, "The number of valid detected masks is zero.\n \
+            Please use a reasonable model and check the input data."
+
+ with open("detection_bbox_result.json", 'w') as outfile:
+ json.dump(dts_res, outfile)
+ print("start evaluate bbox using coco api")
+ cocoDt = cocoGt.loadRes("detection_bbox_result.json")
+ cocoEval = COCOeval(cocoGt, cocoDt, 'bbox')
+ cocoEval.evaluate()
+ cocoEval.accumulate()
+ cocoEval.summarize()
+
+ if cfg.MASK_ON:
+ with open("detection_segms_result.json", 'w') as outfile:
+ json.dump(segms_res, outfile)
+ print("start evaluate mask using coco api")
+ cocoDt = cocoGt.loadRes("detection_segms_result.json")
+ cocoEval = COCOeval(cocoGt, cocoDt, 'segm')
+ cocoEval.evaluate()
+ cocoEval.accumulate()
+ cocoEval.summarize()
+
+
+if __name__ == '__main__':
+ args = parse_args()
+ print_arguments(args)
+ eval()
diff --git a/PaddleCV/rcnn/eval_helper.py b/PaddleCV/rcnn/eval_helper.py
new file mode 100644
index 0000000000000000000000000000000000000000..dba67f6bbed2c87b5794dcc9c01a36205381e0d1
--- /dev/null
+++ b/PaddleCV/rcnn/eval_helper.py
@@ -0,0 +1,386 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import numpy as np
+import paddle.fluid as fluid
+import math
+import box_utils
+from PIL import Image
+from PIL import ImageDraw
+from PIL import ImageFont
+from config import cfg
+import pycocotools.mask as mask_util
+import six
+from colormap import colormap
+import cv2
+
+
+def box_decoder(deltas, boxes, weights):
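+    """Decode bounding-box regression deltas into predicted boxes.
+
+    `boxes` holds proposals in (x1, y1, x2, y2) form, `deltas` the
+    predicted per-class (dx, dy, dw, dh) values, and `weights` the
+    per-coordinate bbox_reg_weights applied to the raw deltas.
+    """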
+ if boxes.shape[0] == 0:
+ return np.zeros((0, deltas.shape[1]), dtype=deltas.dtype)
+
+ boxes = boxes.astype(deltas.dtype, copy=False)
+
+ widths = boxes[:, 2] - boxes[:, 0] + 1.0
+ heights = boxes[:, 3] - boxes[:, 1] + 1.0
+ ctr_x = boxes[:, 0] + 0.5 * widths
+ ctr_y = boxes[:, 1] + 0.5 * heights
+
+ wx, wy, ww, wh = weights
+ dx = deltas[:, 0::4] * wx
+ dy = deltas[:, 1::4] * wy
+ dw = deltas[:, 2::4] * ww
+ dh = deltas[:, 3::4] * wh
+
+ # Prevent sending too large values into np.exp()
+ dw = np.minimum(dw, cfg.bbox_clip)
+ dh = np.minimum(dh, cfg.bbox_clip)
+
+ pred_ctr_x = dx * widths[:, np.newaxis] + ctr_x[:, np.newaxis]
+ pred_ctr_y = dy * heights[:, np.newaxis] + ctr_y[:, np.newaxis]
+ pred_w = np.exp(dw) * widths[:, np.newaxis]
+ pred_h = np.exp(dh) * heights[:, np.newaxis]
+
+ pred_boxes = np.zeros(deltas.shape, dtype=deltas.dtype)
+ # x1
+ pred_boxes[:, 0::4] = pred_ctr_x - 0.5 * pred_w
+ # y1
+ pred_boxes[:, 1::4] = pred_ctr_y - 0.5 * pred_h
+ # x2 (note: "- 1" is correct; don't be fooled by the asymmetry)
+ pred_boxes[:, 2::4] = pred_ctr_x + 0.5 * pred_w - 1
+ # y2 (note: "- 1" is correct; don't be fooled by the asymmetry)
+ pred_boxes[:, 3::4] = pred_ctr_y + 0.5 * pred_h - 1
+
+ return pred_boxes
+
+
+def clip_tiled_boxes(boxes, im_shape):
+ """Clip boxes to image boundaries. im_shape is [height, width] and boxes
+ has shape (N, 4 * num_tiled_boxes)."""
+ assert boxes.shape[1] % 4 == 0, \
+ 'boxes.shape[1] is {:d}, but must be divisible by 4.'.format(
+ boxes.shape[1]
+ )
+ # x1 >= 0
+ boxes[:, 0::4] = np.maximum(np.minimum(boxes[:, 0::4], im_shape[1] - 1), 0)
+ # y1 >= 0
+ boxes[:, 1::4] = np.maximum(np.minimum(boxes[:, 1::4], im_shape[0] - 1), 0)
+ # x2 < im_shape[1]
+ boxes[:, 2::4] = np.maximum(np.minimum(boxes[:, 2::4], im_shape[1] - 1), 0)
+ # y2 < im_shape[0]
+ boxes[:, 3::4] = np.maximum(np.minimum(boxes[:, 3::4], im_shape[0] - 1), 0)
+ return boxes
+
+
+def get_nmsed_box(rpn_rois, confs, locs, class_nums, im_info):
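+    """Decode the RPN proposals with the predicted regression deltas, clip
+    them to the image, and apply per-class NMS, keeping at most
+    cfg.TEST.detections_per_im detections per image. Returns the new LoD
+    offsets and the stacked detection results for the batch."""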
+ lod = rpn_rois.lod()[0]
+ rpn_rois_v = np.array(rpn_rois)
+ variance_v = np.array(cfg.bbox_reg_weights)
+ confs_v = np.array(confs)
+ locs_v = np.array(locs)
+ im_results = [[] for _ in range(len(lod) - 1)]
+ new_lod = [0]
+ for i in range(len(lod) - 1):
+ start = lod[i]
+ end = lod[i + 1]
+ if start == end:
+ continue
+ locs_n = locs_v[start:end, :]
+ rois_n = rpn_rois_v[start:end, :]
+ rois_n = rois_n / im_info[i][2]
+ rois_n = box_decoder(locs_n, rois_n, variance_v)
+ rois_n = clip_tiled_boxes(rois_n, im_info[i][:2] / im_info[i][2])
+
+ cls_boxes = [[] for _ in range(class_nums)]
+ scores_n = confs_v[start:end, :]
+ for j in range(1, class_nums):
+ inds = np.where(scores_n[:, j] > cfg.TEST.score_thresh)[0]
+ scores_j = scores_n[inds, j]
+ rois_j = rois_n[inds, j * 4:(j + 1) * 4]
+ dets_j = np.hstack((scores_j[:, np.newaxis], rois_j)).astype(
+ np.float32, copy=False)
+ keep = box_utils.nms(dets_j, cfg.TEST.nms_thresh)
+ nms_dets = dets_j[keep, :]
+ #add labels
+ label = np.array([j for _ in range(len(keep))])
+ nms_dets = np.hstack((nms_dets, label[:, np.newaxis])).astype(
+ np.float32, copy=False)
+ cls_boxes[j] = nms_dets
+ # Limit to max_per_image detections **over all classes**
+ image_scores = np.hstack(
+ [cls_boxes[j][:, 1] for j in range(1, class_nums)])
+ if len(image_scores) > cfg.TEST.detections_per_im:
+ image_thresh = np.sort(image_scores)[-cfg.TEST.detections_per_im]
+ for j in range(1, class_nums):
+ keep = np.where(cls_boxes[j][:, 1] >= image_thresh)[0]
+ cls_boxes[j] = cls_boxes[j][keep, :]
+
+ im_results_n = np.vstack([cls_boxes[j] for j in range(1, class_nums)])
+ im_results[i] = im_results_n
+ new_lod.append(len(im_results_n) + new_lod[-1])
+ boxes = im_results_n[:, 2:]
+ scores = im_results_n[:, 1]
+ labels = im_results_n[:, 0]
+ im_results = np.vstack([im_results[k] for k in range(len(lod) - 1)])
+ return new_lod, im_results
+
+
+def get_dt_res(batch_size, lod, nmsed_out, data, num_id_to_cat_id_map):
+ dts_res = []
+ nmsed_out_v = np.array(nmsed_out)
+    if nmsed_out_v.shape == (1, 1):
+ return dts_res
+ assert (len(lod) == batch_size + 1), \
+ "Error Lod Tensor offset dimension. Lod({}) vs. batch_size({})"\
+ .format(len(lod), batch_size)
+ k = 0
+ for i in range(batch_size):
+ dt_num_this_img = lod[i + 1] - lod[i]
+ image_id = int(data[i][-1])
+ image_width = int(data[i][1][1])
+ image_height = int(data[i][1][2])
+ for j in range(dt_num_this_img):
+ dt = nmsed_out_v[k]
+ k = k + 1
+ num_id, score, xmin, ymin, xmax, ymax = dt.tolist()
+ category_id = num_id_to_cat_id_map[num_id]
+ w = xmax - xmin + 1
+ h = ymax - ymin + 1
+ bbox = [xmin, ymin, w, h]
+ dt_res = {
+ 'image_id': image_id,
+ 'category_id': category_id,
+ 'bbox': bbox,
+ 'score': score
+ }
+ dts_res.append(dt_res)
+ return dts_res
+
+
+def get_segms_res(batch_size, lod, segms_out, data, num_id_to_cat_id_map):
+ segms_res = []
+ segms_out_v = np.array(segms_out)
+ k = 0
+ for i in range(batch_size):
+ dt_num_this_img = lod[i + 1] - lod[i]
+ image_id = int(data[i][-1])
+ for j in range(dt_num_this_img):
+ dt = segms_out_v[k]
+ k = k + 1
+ segm, num_id, score = dt.tolist()
+ cat_id = num_id_to_cat_id_map[num_id]
+ if six.PY3:
+ if 'counts' in segm:
+ segm['counts'] = segm['counts'].decode("utf8")
+ segm_res = {
+ 'image_id': image_id,
+ 'category_id': cat_id,
+ 'segmentation': segm,
+ 'score': score
+ }
+ segms_res.append(segm_res)
+ return segms_res
+
+
+def draw_bounding_box_on_image(image_path,
+ nms_out,
+ draw_threshold,
+ labels_map,
+ image=None):
+ if image is None:
+ image = Image.open(image_path)
+ draw = ImageDraw.Draw(image)
+ im_width, im_height = image.size
+
+ for dt in np.array(nms_out):
+ num_id, score, xmin, ymin, xmax, ymax = dt.tolist()
+ if score < draw_threshold:
+ continue
+ draw.line(
+ [(xmin, ymin), (xmin, ymax), (xmax, ymax), (xmax, ymin),
+ (xmin, ymin)],
+ width=2,
+ fill='red')
+ if image.mode == 'RGB':
+ draw.text((xmin, ymin), labels_map[num_id], (255, 255, 0))
+ image_name = image_path.split('/')[-1]
+ print("image with bbox drawed saved as {}".format(image_name))
+ image.save(image_name)
+
+
+def draw_mask_on_image(image_path, segms_out, draw_threshold, alpha=0.7):
+ image = Image.open(image_path)
+ draw = ImageDraw.Draw(image)
+ im_width, im_height = image.size
+ mask_color_id = 0
+ w_ratio = .4
+ image = np.array(image).astype('float32')
+ for dt in np.array(segms_out):
+ segm, num_id, score = dt.tolist()
+ if score < draw_threshold:
+ continue
+ mask = mask_util.decode(segm) * 255
+ color_list = colormap(rgb=True)
+ color_mask = color_list[mask_color_id % len(color_list), 0:3]
+ mask_color_id += 1
+ for c in range(3):
+ color_mask[c] = color_mask[c] * (1 - w_ratio) + w_ratio * 255
+ idx = np.nonzero(mask)
+ image[idx[0], idx[1], :] *= 1.0 - alpha
+ image[idx[0], idx[1], :] += alpha * color_mask
+ image = Image.fromarray(image.astype('uint8'))
+ return image
+
+
+def segm_results(im_results, masks, im_info):
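+    """Paste per-RoI mask predictions back into the full-image frame and
+    RLE-encode them. Each box is expanded by (M + 2) / M to account for
+    the one-pixel padding around the M x M mask prediction."""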
+ im_results = np.array(im_results)
+ class_num = cfg.class_num
+ M = cfg.resolution
+ scale = (M + 2.0) / M
+ lod = masks.lod()[0]
+ masks_v = np.array(masks)
+ boxes = im_results[:, 2:]
+ labels = im_results[:, 0]
+ segms_results = [[] for _ in range(len(lod) - 1)]
+ sum = 0
+ for i in range(len(lod) - 1):
+ im_results_n = im_results[lod[i]:lod[i + 1]]
+ cls_segms = []
+ masks_n = masks_v[lod[i]:lod[i + 1]]
+ boxes_n = boxes[lod[i]:lod[i + 1]]
+ labels_n = labels[lod[i]:lod[i + 1]]
+ im_h = int(round(im_info[i][0] / im_info[i][2]))
+ im_w = int(round(im_info[i][1] / im_info[i][2]))
+ boxes_n = box_utils.expand_boxes(boxes_n, scale)
+ boxes_n = boxes_n.astype(np.int32)
+ padded_mask = np.zeros((M + 2, M + 2), dtype=np.float32)
+ for j in range(len(im_results_n)):
+ class_id = int(labels_n[j])
+ padded_mask[1:-1, 1:-1] = masks_n[j, class_id, :, :]
+
+ ref_box = boxes_n[j, :]
+ w = ref_box[2] - ref_box[0] + 1
+ h = ref_box[3] - ref_box[1] + 1
+ w = np.maximum(w, 1)
+ h = np.maximum(h, 1)
+
+ mask = cv2.resize(padded_mask, (w, h))
+ mask = np.array(mask > cfg.mrcnn_thresh_binarize, dtype=np.uint8)
+ im_mask = np.zeros((im_h, im_w), dtype=np.uint8)
+
+ x_0 = max(ref_box[0], 0)
+ x_1 = min(ref_box[2] + 1, im_w)
+ y_0 = max(ref_box[1], 0)
+ y_1 = min(ref_box[3] + 1, im_h)
+ im_mask[y_0:y_1, x_0:x_1] = mask[(y_0 - ref_box[1]):(y_1 - ref_box[
+ 1]), (x_0 - ref_box[0]):(x_1 - ref_box[0])]
+ sum += im_mask.sum()
+ rle = mask_util.encode(
+ np.array(
+ im_mask[:, :, np.newaxis], order='F'))[0]
+ cls_segms.append(rle)
+ segms_results[i] = np.array(cls_segms)[:, np.newaxis]
+ segms_results = np.vstack([segms_results[k] for k in range(len(lod) - 1)])
+ im_results = np.hstack([segms_results, im_results])
+ return im_results[:, :3]
+
+
+def coco17_labels():
+ labels_map = {
+ 0: 'background',
+ 1: 'person',
+ 2: 'bicycle',
+ 3: 'car',
+ 4: 'motorcycle',
+ 5: 'airplane',
+ 6: 'bus',
+ 7: 'train',
+ 8: 'truck',
+ 9: 'boat',
+ 10: 'traffic light',
+ 11: 'fire hydrant',
+ 12: 'stop sign',
+ 13: 'parking meter',
+ 14: 'bench',
+ 15: 'bird',
+ 16: 'cat',
+ 17: 'dog',
+ 18: 'horse',
+ 19: 'sheep',
+ 20: 'cow',
+ 21: 'elephant',
+ 22: 'bear',
+ 23: 'zebra',
+ 24: 'giraffe',
+ 25: 'backpack',
+ 26: 'umbrella',
+ 27: 'handbag',
+ 28: 'tie',
+ 29: 'suitcase',
+ 30: 'frisbee',
+ 31: 'skis',
+ 32: 'snowboard',
+ 33: 'sports ball',
+ 34: 'kite',
+ 35: 'baseball bat',
+ 36: 'baseball glove',
+ 37: 'skateboard',
+ 38: 'surfboard',
+ 39: 'tennis racket',
+ 40: 'bottle',
+ 41: 'wine glass',
+ 42: 'cup',
+ 43: 'fork',
+ 44: 'knife',
+ 45: 'spoon',
+ 46: 'bowl',
+ 47: 'banana',
+ 48: 'apple',
+ 49: 'sandwich',
+ 50: 'orange',
+ 51: 'broccoli',
+ 52: 'carrot',
+ 53: 'hot dog',
+ 54: 'pizza',
+ 55: 'donut',
+ 56: 'cake',
+ 57: 'chair',
+ 58: 'couch',
+ 59: 'potted plant',
+ 60: 'bed',
+ 61: 'dining table',
+ 62: 'toilet',
+ 63: 'tv',
+ 64: 'laptop',
+ 65: 'mouse',
+ 66: 'remote',
+ 67: 'keyboard',
+ 68: 'cell phone',
+ 69: 'microwave',
+ 70: 'oven',
+ 71: 'toaster',
+ 72: 'sink',
+ 73: 'refrigerator',
+ 74: 'book',
+ 75: 'clock',
+ 76: 'vase',
+ 77: 'scissors',
+ 78: 'teddy bear',
+ 79: 'hair drier',
+ 80: 'toothbrush'
+ }
+ return labels_map
diff --git a/fluid/PaddleCV/faster_rcnn/image/000000000139.jpg b/PaddleCV/rcnn/image/000000000139.jpg
similarity index 100%
rename from fluid/PaddleCV/faster_rcnn/image/000000000139.jpg
rename to PaddleCV/rcnn/image/000000000139.jpg
diff --git a/PaddleCV/rcnn/image/000000000139_mask.jpg b/PaddleCV/rcnn/image/000000000139_mask.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..47dfa9a435bf81c8585e8100413cfc0d6719754c
Binary files /dev/null and b/PaddleCV/rcnn/image/000000000139_mask.jpg differ
diff --git a/fluid/PaddleCV/faster_rcnn/image/000000127517.jpg b/PaddleCV/rcnn/image/000000127517.jpg
similarity index 100%
rename from fluid/PaddleCV/faster_rcnn/image/000000127517.jpg
rename to PaddleCV/rcnn/image/000000127517.jpg
diff --git a/PaddleCV/rcnn/image/000000127517_mask.jpg b/PaddleCV/rcnn/image/000000127517_mask.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..c0284591deadf6010bf780acf16124231c42d677
Binary files /dev/null and b/PaddleCV/rcnn/image/000000127517_mask.jpg differ
diff --git a/fluid/PaddleCV/faster_rcnn/image/000000203864.jpg b/PaddleCV/rcnn/image/000000203864.jpg
similarity index 100%
rename from fluid/PaddleCV/faster_rcnn/image/000000203864.jpg
rename to PaddleCV/rcnn/image/000000203864.jpg
diff --git a/fluid/PaddleCV/faster_rcnn/image/000000515077.jpg b/PaddleCV/rcnn/image/000000515077.jpg
similarity index 100%
rename from fluid/PaddleCV/faster_rcnn/image/000000515077.jpg
rename to PaddleCV/rcnn/image/000000515077.jpg
diff --git a/PaddleCV/rcnn/infer.py b/PaddleCV/rcnn/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..0c66ec08cf63cbf25df0959a5c082838965d7597
--- /dev/null
+++ b/PaddleCV/rcnn/infer.py
@@ -0,0 +1,95 @@
+import os
+import time
+import numpy as np
+from eval_helper import *
+import paddle
+import paddle.fluid as fluid
+import reader
+from utility import print_arguments, parse_args
+import models.model_builder as model_builder
+import models.resnet as resnet
+from config import cfg
+from data_utils import DatasetPath
+
+
+def infer():
+
+ try:
+ from pycocotools.coco import COCO
+ from pycocotools.cocoeval import COCOeval, Params
+
+ data_path = DatasetPath('val')
+ test_list = data_path.get_file_list()
+ coco_api = COCO(test_list)
+ cid = coco_api.getCatIds()
+ cat_id_to_num_id_map = {
+ v: i + 1
+ for i, v in enumerate(coco_api.getCatIds())
+ }
+ category_ids = coco_api.getCatIds()
+ labels_map = {
+ cat_id_to_num_id_map[item['id']]: item['name']
+ for item in coco_api.loadCats(category_ids)
+ }
+ labels_map[0] = 'background'
+    except Exception:
+        print("The COCO dataset or COCO API is not available; using the default "
+              "mapping from class index to category name for COCO17.")
+ assert cfg.dataset == 'coco2017'
+ labels_map = coco17_labels()
+
+ image_shape = [3, cfg.TEST.max_size, cfg.TEST.max_size]
+ class_nums = cfg.class_num
+
+ model = model_builder.RCNN(
+ add_conv_body_func=resnet.add_ResNet50_conv4_body,
+ add_roi_box_head_func=resnet.add_ResNet_roi_conv5_head,
+ use_pyreader=False,
+ mode='infer')
+ model.build_model(image_shape)
+ pred_boxes = model.eval_bbox_out()
+ if cfg.MASK_ON:
+ masks = model.eval_mask_out()
+ place = fluid.CUDAPlace(0) if cfg.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ # yapf: disable
+ if not os.path.exists(cfg.pretrained_model):
+ raise ValueError("Model path [%s] does not exist." % (cfg.pretrained_model))
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(cfg.pretrained_model, var.name))
+ fluid.io.load_vars(exe, cfg.pretrained_model, predicate=if_exist)
+ # yapf: enable
+ infer_reader = reader.infer(cfg.image_path)
+ feeder = fluid.DataFeeder(place=place, feed_list=model.feeds())
+
+ dts_res = []
+ segms_res = []
+ if cfg.MASK_ON:
+ fetch_list = [pred_boxes, masks]
+ else:
+ fetch_list = [pred_boxes]
+ data = next(infer_reader())
+ im_info = [data[0][1]]
+ result = exe.run(fetch_list=[v.name for v in fetch_list],
+ feed=feeder.feed(data),
+ return_numpy=False)
+ pred_boxes_v = result[0]
+ if cfg.MASK_ON:
+ masks_v = result[1]
+ new_lod = pred_boxes_v.lod()
+ nmsed_out = pred_boxes_v
+ image = None
+ if cfg.MASK_ON:
+ segms_out = segm_results(nmsed_out, masks_v, im_info)
+ image = draw_mask_on_image(cfg.image_path, segms_out,
+ cfg.draw_threshold)
+
+ draw_bounding_box_on_image(cfg.image_path, nmsed_out, cfg.draw_threshold,
+ labels_map, image)
+
+
+if __name__ == '__main__':
+ args = parse_args()
+ print_arguments(args)
+ infer()
diff --git a/fluid/PaddleCV/faster_rcnn/learning_rate.py b/PaddleCV/rcnn/learning_rate.py
similarity index 100%
rename from fluid/PaddleCV/faster_rcnn/learning_rate.py
rename to PaddleCV/rcnn/learning_rate.py
diff --git a/fluid/PaddleCV/image_classification/dist_train/__init__.py b/PaddleCV/rcnn/models/__init__.py
similarity index 100%
rename from fluid/PaddleCV/image_classification/dist_train/__init__.py
rename to PaddleCV/rcnn/models/__init__.py
diff --git a/PaddleCV/rcnn/models/model_builder.py b/PaddleCV/rcnn/models/model_builder.py
new file mode 100644
index 0000000000000000000000000000000000000000..f201e03b657ec7027693b456867bd39ad67b5236
--- /dev/null
+++ b/PaddleCV/rcnn/models/model_builder.py
@@ -0,0 +1,441 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle.fluid as fluid
+from paddle.fluid.param_attr import ParamAttr
+from paddle.fluid.initializer import Constant
+from paddle.fluid.initializer import Normal
+from paddle.fluid.initializer import MSRA
+from paddle.fluid.regularizer import L2Decay
+from config import cfg
+
+
+class RCNN(object):
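+    """Builder for Faster RCNN and Mask RCNN networks.
+
+    `add_conv_body_func` builds the backbone and `add_roi_box_head_func`
+    builds the RoI box head; the mask branch is added on top when
+    cfg.MASK_ON is set.
+    """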
+ def __init__(self,
+ add_conv_body_func=None,
+ add_roi_box_head_func=None,
+ mode='train',
+ use_pyreader=True,
+ use_random=True):
+ self.add_conv_body_func = add_conv_body_func
+ self.add_roi_box_head_func = add_roi_box_head_func
+ self.mode = mode
+ self.use_pyreader = use_pyreader
+ self.use_random = use_random
+
+ def build_model(self, image_shape):
+ self.build_input(image_shape)
+ body_conv = self.add_conv_body_func(self.image)
+ # RPN
+ self.rpn_heads(body_conv)
+ # Fast RCNN
+ self.fast_rcnn_heads(body_conv)
+ if self.mode != 'train':
+ self.eval_bbox()
+ # Mask RCNN
+ if cfg.MASK_ON:
+ self.mask_rcnn_heads(body_conv)
+
+ def loss(self):
+ losses = []
+ # Fast RCNN loss
+ loss_cls, loss_bbox = self.fast_rcnn_loss()
+ # RPN loss
+ rpn_cls_loss, rpn_reg_loss = self.rpn_loss()
+ losses = [loss_cls, loss_bbox, rpn_cls_loss, rpn_reg_loss]
+ rkeys = ['loss', 'loss_cls', 'loss_bbox', \
+ 'loss_rpn_cls', 'loss_rpn_bbox',]
+ if cfg.MASK_ON:
+ loss_mask = self.mask_rcnn_loss()
+ losses = losses + [loss_mask]
+ rkeys = rkeys + ["loss_mask"]
+ loss = fluid.layers.sum(losses)
+ rloss = [loss] + losses
+ return rloss, rkeys
+
+ def eval_mask_out(self):
+ return self.mask_fcn_logits
+
+ def eval_bbox_out(self):
+ return self.pred_result
+
+ def build_input(self, image_shape):
+ if self.use_pyreader:
+ in_shapes = [[-1] + image_shape, [-1, 4], [-1, 1], [-1, 1],
+ [-1, 3], [-1, 1]]
+ lod_levels = [0, 1, 1, 1, 0, 0]
+ dtypes = [
+ 'float32', 'float32', 'int32', 'int32', 'float32', 'int32'
+ ]
+ if cfg.MASK_ON:
+ in_shapes.append([-1, 2])
+ lod_levels.append(3)
+ dtypes.append('float32')
+ self.py_reader = fluid.layers.py_reader(
+ capacity=64,
+ shapes=in_shapes,
+ lod_levels=lod_levels,
+ dtypes=dtypes,
+ use_double_buffer=True)
+ ins = fluid.layers.read_file(self.py_reader)
+ self.image = ins[0]
+ self.gt_box = ins[1]
+ self.gt_label = ins[2]
+ self.is_crowd = ins[3]
+ self.im_info = ins[4]
+ self.im_id = ins[5]
+ if cfg.MASK_ON:
+ self.gt_masks = ins[6]
+ else:
+ self.image = fluid.layers.data(
+ name='image', shape=image_shape, dtype='float32')
+ self.gt_box = fluid.layers.data(
+ name='gt_box', shape=[4], dtype='float32', lod_level=1)
+ self.gt_label = fluid.layers.data(
+ name='gt_label', shape=[1], dtype='int32', lod_level=1)
+ self.is_crowd = fluid.layers.data(
+ name='is_crowd', shape=[1], dtype='int32', lod_level=1)
+ self.im_info = fluid.layers.data(
+ name='im_info', shape=[3], dtype='float32')
+ self.im_id = fluid.layers.data(
+ name='im_id', shape=[1], dtype='int32')
+ if cfg.MASK_ON:
+ self.gt_masks = fluid.layers.data(
+ name='gt_masks', shape=[2], dtype='float32', lod_level=3)
+
+ def feeds(self):
+ if self.mode == 'infer':
+ return [self.image, self.im_info]
+ if self.mode == 'val':
+ return [self.image, self.im_info, self.im_id]
+ if not cfg.MASK_ON:
+ return [
+ self.image, self.gt_box, self.gt_label, self.is_crowd,
+ self.im_info, self.im_id
+ ]
+ return [
+ self.image, self.gt_box, self.gt_label, self.is_crowd, self.im_info,
+ self.im_id, self.gt_masks
+ ]
+
+ def eval_bbox(self):
+ self.im_scale = fluid.layers.slice(
+ self.im_info, [1], starts=[2], ends=[3])
+ im_scale_lod = fluid.layers.sequence_expand(self.im_scale,
+ self.rpn_rois)
+ boxes = self.rpn_rois / im_scale_lod
+ cls_prob = fluid.layers.softmax(self.cls_score, use_cudnn=False)
+ bbox_pred_reshape = fluid.layers.reshape(self.bbox_pred,
+ (-1, cfg.class_num, 4))
+ decoded_box = fluid.layers.box_coder(
+ prior_box=boxes,
+ prior_box_var=cfg.bbox_reg_weights,
+ target_box=bbox_pred_reshape,
+ code_type='decode_center_size',
+ box_normalized=False,
+ axis=1)
+ cliped_box = fluid.layers.box_clip(
+ input=decoded_box, im_info=self.im_info)
+ self.pred_result = fluid.layers.multiclass_nms(
+ bboxes=cliped_box,
+ scores=cls_prob,
+ score_threshold=cfg.TEST.score_thresh,
+ nms_top_k=-1,
+ nms_threshold=cfg.TEST.nms_thresh,
+ keep_top_k=cfg.TEST.detections_per_im,
+ normalized=False)
+
+ def rpn_heads(self, rpn_input):
+ # RPN hidden representation
+ dim_out = rpn_input.shape[1]
+ rpn_conv = fluid.layers.conv2d(
+ input=rpn_input,
+ num_filters=dim_out,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ act='relu',
+ name='conv_rpn',
+ param_attr=ParamAttr(
+ name="conv_rpn_w", initializer=Normal(
+ loc=0., scale=0.01)),
+ bias_attr=ParamAttr(
+ name="conv_rpn_b", learning_rate=2., regularizer=L2Decay(0.)))
+ self.anchor, self.var = fluid.layers.anchor_generator(
+ input=rpn_conv,
+ anchor_sizes=cfg.anchor_sizes,
+ aspect_ratios=cfg.aspect_ratio,
+ variance=cfg.variances,
+ stride=cfg.rpn_stride)
+ num_anchor = self.anchor.shape[2]
+ # Proposal classification scores
+ self.rpn_cls_score = fluid.layers.conv2d(
+ rpn_conv,
+ num_filters=num_anchor,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ act=None,
+ name='rpn_cls_score',
+ param_attr=ParamAttr(
+ name="rpn_cls_logits_w", initializer=Normal(
+ loc=0., scale=0.01)),
+ bias_attr=ParamAttr(
+ name="rpn_cls_logits_b",
+ learning_rate=2.,
+ regularizer=L2Decay(0.)))
+ # Proposal bbox regression deltas
+ self.rpn_bbox_pred = fluid.layers.conv2d(
+ rpn_conv,
+ num_filters=4 * num_anchor,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ act=None,
+ name='rpn_bbox_pred',
+ param_attr=ParamAttr(
+ name="rpn_bbox_pred_w", initializer=Normal(
+ loc=0., scale=0.01)),
+ bias_attr=ParamAttr(
+ name="rpn_bbox_pred_b",
+ learning_rate=2.,
+ regularizer=L2Decay(0.)))
+
+ rpn_cls_score_prob = fluid.layers.sigmoid(
+ self.rpn_cls_score, name='rpn_cls_score_prob')
+
+ param_obj = cfg.TRAIN if self.mode == 'train' else cfg.TEST
+ pre_nms_top_n = param_obj.rpn_pre_nms_top_n
+ post_nms_top_n = param_obj.rpn_post_nms_top_n
+ nms_thresh = param_obj.rpn_nms_thresh
+ min_size = param_obj.rpn_min_size
+ eta = param_obj.rpn_eta
+ self.rpn_rois, self.rpn_roi_probs = fluid.layers.generate_proposals(
+ scores=rpn_cls_score_prob,
+ bbox_deltas=self.rpn_bbox_pred,
+ im_info=self.im_info,
+ anchors=self.anchor,
+ variances=self.var,
+ pre_nms_top_n=pre_nms_top_n,
+ post_nms_top_n=post_nms_top_n,
+ nms_thresh=nms_thresh,
+ min_size=min_size,
+ eta=eta)
+ if self.mode == 'train':
+ outs = fluid.layers.generate_proposal_labels(
+ rpn_rois=self.rpn_rois,
+ gt_classes=self.gt_label,
+ is_crowd=self.is_crowd,
+ gt_boxes=self.gt_box,
+ im_info=self.im_info,
+ batch_size_per_im=cfg.TRAIN.batch_size_per_im,
+                fg_fraction=cfg.TRAIN.fg_fraction,
+ fg_thresh=cfg.TRAIN.fg_thresh,
+ bg_thresh_hi=cfg.TRAIN.bg_thresh_hi,
+ bg_thresh_lo=cfg.TRAIN.bg_thresh_lo,
+ bbox_reg_weights=cfg.bbox_reg_weights,
+ class_nums=cfg.class_num,
+ use_random=self.use_random)
+
+ self.rois = outs[0]
+ self.labels_int32 = outs[1]
+ self.bbox_targets = outs[2]
+ self.bbox_inside_weights = outs[3]
+ self.bbox_outside_weights = outs[4]
+
+ if cfg.MASK_ON:
+ mask_out = fluid.layers.generate_mask_labels(
+ im_info=self.im_info,
+ gt_classes=self.gt_label,
+ is_crowd=self.is_crowd,
+ gt_segms=self.gt_masks,
+ rois=self.rois,
+ labels_int32=self.labels_int32,
+ num_classes=cfg.class_num,
+ resolution=cfg.resolution)
+ self.mask_rois = mask_out[0]
+ self.roi_has_mask_int32 = mask_out[1]
+ self.mask_int32 = mask_out[2]
+
+ def fast_rcnn_heads(self, roi_input):
+ if self.mode == 'train':
+ pool_rois = self.rois
+ else:
+ pool_rois = self.rpn_rois
+ self.res5_2_sum = self.add_roi_box_head_func(roi_input, pool_rois)
+ rcnn_out = fluid.layers.pool2d(
+ self.res5_2_sum, pool_type='avg', pool_size=7, name='res5_pool')
+ self.cls_score = fluid.layers.fc(input=rcnn_out,
+ size=cfg.class_num,
+ act=None,
+ name='cls_score',
+ param_attr=ParamAttr(
+ name='cls_score_w',
+ initializer=Normal(
+ loc=0.0, scale=0.001)),
+ bias_attr=ParamAttr(
+ name='cls_score_b',
+ learning_rate=2.,
+ regularizer=L2Decay(0.)))
+ self.bbox_pred = fluid.layers.fc(input=rcnn_out,
+ size=4 * cfg.class_num,
+ act=None,
+ name='bbox_pred',
+ param_attr=ParamAttr(
+ name='bbox_pred_w',
+ initializer=Normal(
+ loc=0.0, scale=0.01)),
+ bias_attr=ParamAttr(
+ name='bbox_pred_b',
+ learning_rate=2.,
+ regularizer=L2Decay(0.)))
+
+ def SuffixNet(self, conv5):
+ mask_out = fluid.layers.conv2d_transpose(
+ input=conv5,
+ num_filters=cfg.dim_reduced,
+ filter_size=2,
+ stride=2,
+ act='relu',
+ param_attr=ParamAttr(
+ name='conv5_mask_w', initializer=MSRA(uniform=False)),
+ bias_attr=ParamAttr(
+ name='conv5_mask_b', learning_rate=2., regularizer=L2Decay(0.)))
+ act_func = None
+ if self.mode != 'train':
+ act_func = 'sigmoid'
+ mask_fcn_logits = fluid.layers.conv2d(
+ input=mask_out,
+ num_filters=cfg.class_num,
+ filter_size=1,
+ act=act_func,
+ param_attr=ParamAttr(
+ name='mask_fcn_logits_w', initializer=MSRA(uniform=False)),
+ bias_attr=ParamAttr(
+ name="mask_fcn_logits_b",
+ learning_rate=2.,
+ regularizer=L2Decay(0.)))
+
+ if self.mode != 'train':
+ mask_fcn_logits = fluid.layers.lod_reset(mask_fcn_logits,
+ self.pred_result)
+ return mask_fcn_logits
+
+ def mask_rcnn_heads(self, mask_input):
+ if self.mode == 'train':
+ conv5 = fluid.layers.gather(self.res5_2_sum,
+ self.roi_has_mask_int32)
+ self.mask_fcn_logits = self.SuffixNet(conv5)
+ else:
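+            # A pred_result with a single element marks an empty detection
+            # set; the IfElse below passes it through unchanged and only
+            # runs the mask head when real detections exist.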
+ pred_res_shape = fluid.layers.shape(self.pred_result)
+ shape = fluid.layers.reduce_prod(pred_res_shape)
+ shape = fluid.layers.reshape(shape, [1, 1])
+ ones = fluid.layers.fill_constant([1, 1], value=1, dtype='int32')
+ cond = fluid.layers.equal(x=shape, y=ones)
+ ie = fluid.layers.IfElse(cond)
+
+ with ie.true_block():
+ pred_res_null = ie.input(self.pred_result)
+ ie.output(pred_res_null)
+ with ie.false_block():
+ pred_res = ie.input(self.pred_result)
+ pred_boxes = fluid.layers.slice(
+ pred_res, [1], starts=[2], ends=[6])
+ im_scale_lod = fluid.layers.sequence_expand(self.im_scale,
+ pred_boxes)
+ mask_rois = pred_boxes * im_scale_lod
+ conv5 = self.add_roi_box_head_func(mask_input, mask_rois)
+ mask_fcn = self.SuffixNet(conv5)
+ ie.output(mask_fcn)
+ self.mask_fcn_logits = ie()[0]
+
+ def mask_rcnn_loss(self):
+ mask_label = fluid.layers.cast(x=self.mask_int32, dtype='float32')
+ reshape_dim = cfg.class_num * cfg.resolution * cfg.resolution
+ mask_fcn_logits_reshape = fluid.layers.reshape(self.mask_fcn_logits,
+ (-1, reshape_dim))
+
+ loss_mask = fluid.layers.sigmoid_cross_entropy_with_logits(
+ x=mask_fcn_logits_reshape,
+ label=mask_label,
+ ignore_index=-1,
+ normalize=True)
+ loss_mask = fluid.layers.reduce_sum(loss_mask, name='loss_mask')
+ return loss_mask
+
+ def fast_rcnn_loss(self):
+ labels_int64 = fluid.layers.cast(x=self.labels_int32, dtype='int64')
+ labels_int64.stop_gradient = True
+ loss_cls = fluid.layers.softmax_with_cross_entropy(
+ logits=self.cls_score,
+ label=labels_int64,
+ numeric_stable_mode=True, )
+ loss_cls = fluid.layers.reduce_mean(loss_cls)
+ loss_bbox = fluid.layers.smooth_l1(
+ x=self.bbox_pred,
+ y=self.bbox_targets,
+ inside_weight=self.bbox_inside_weights,
+ outside_weight=self.bbox_outside_weights,
+ sigma=1.0)
+ loss_bbox = fluid.layers.reduce_mean(loss_bbox)
+ return loss_cls, loss_bbox
+
+ def rpn_loss(self):
+ rpn_cls_score_reshape = fluid.layers.transpose(
+ self.rpn_cls_score, perm=[0, 2, 3, 1])
+ rpn_bbox_pred_reshape = fluid.layers.transpose(
+ self.rpn_bbox_pred, perm=[0, 2, 3, 1])
+
+ anchor_reshape = fluid.layers.reshape(self.anchor, shape=(-1, 4))
+ var_reshape = fluid.layers.reshape(self.var, shape=(-1, 4))
+
+ rpn_cls_score_reshape = fluid.layers.reshape(
+ x=rpn_cls_score_reshape, shape=(0, -1, 1))
+ rpn_bbox_pred_reshape = fluid.layers.reshape(
+ x=rpn_bbox_pred_reshape, shape=(0, -1, 4))
+ score_pred, loc_pred, score_tgt, loc_tgt, bbox_weight = \
+ fluid.layers.rpn_target_assign(
+ bbox_pred=rpn_bbox_pred_reshape,
+ cls_logits=rpn_cls_score_reshape,
+ anchor_box=anchor_reshape,
+ anchor_var=var_reshape,
+ gt_boxes=self.gt_box,
+ is_crowd=self.is_crowd,
+ im_info=self.im_info,
+ rpn_batch_size_per_im=cfg.TRAIN.rpn_batch_size_per_im,
+ rpn_straddle_thresh=cfg.TRAIN.rpn_straddle_thresh,
+ rpn_fg_fraction=cfg.TRAIN.rpn_fg_fraction,
+ rpn_positive_overlap=cfg.TRAIN.rpn_positive_overlap,
+ rpn_negative_overlap=cfg.TRAIN.rpn_negative_overlap,
+ use_random=self.use_random)
+ score_tgt = fluid.layers.cast(x=score_tgt, dtype='float32')
+ rpn_cls_loss = fluid.layers.sigmoid_cross_entropy_with_logits(
+ x=score_pred, label=score_tgt)
+ rpn_cls_loss = fluid.layers.reduce_mean(
+ rpn_cls_loss, name='loss_rpn_cls')
+
+ rpn_reg_loss = fluid.layers.smooth_l1(
+ x=loc_pred,
+ y=loc_tgt,
+ sigma=3.0,
+ inside_weight=bbox_weight,
+ outside_weight=bbox_weight)
+ rpn_reg_loss = fluid.layers.reduce_sum(
+ rpn_reg_loss, name='loss_rpn_bbox')
+ score_shape = fluid.layers.shape(score_tgt)
+ score_shape = fluid.layers.cast(x=score_shape, dtype='float32')
+ norm = fluid.layers.reduce_prod(score_shape)
+ norm.stop_gradient = True
+ rpn_reg_loss = rpn_reg_loss / norm
+ return rpn_cls_loss, rpn_reg_loss
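For readers tracing the RPN loss math above: the classification term averages an elementwise sigmoid cross-entropy over the sampled anchors, while the regression term sums a smooth-L1 penalty and divides it by the number of sampled scores. A minimal NumPy sketch of that combination follows; the two helpers only approximate the corresponding `fluid.layers` ops (inside/outside weights omitted), so treat it as an illustration rather than a drop-in replacement.

```python
import numpy as np

def sigmoid_ce_with_logits(x, label):
    # numerically stable elementwise sigmoid cross-entropy
    return np.maximum(x, 0) - x * label + np.log1p(np.exp(-np.abs(x)))

def smooth_l1(x, y, sigma):
    # elementwise smooth-L1 with the sigma^2 switch point used above
    d = np.abs(x - y)
    s2 = sigma ** 2
    return np.where(d < 1.0 / s2, 0.5 * s2 * d ** 2, d - 0.5 / s2)

rng = np.random.RandomState(0)
score_pred = rng.randn(256).astype(np.float32)         # sampled anchor logits
score_tgt = (rng.rand(256) > 0.5).astype(np.float32)   # 0/1 anchor labels
loc_pred = rng.randn(256, 4).astype(np.float32)
loc_tgt = rng.randn(256, 4).astype(np.float32)

rpn_cls_loss = sigmoid_ce_with_logits(score_pred, score_tgt).mean()
norm = float(score_tgt.size)  # same role as reduce_prod(shape(score_tgt))
rpn_reg_loss = smooth_l1(loc_pred, loc_tgt, sigma=3.0).sum() / norm
print(rpn_cls_loss, rpn_reg_loss)
```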
diff --git a/PaddleCV/rcnn/models/resnet.py b/PaddleCV/rcnn/models/resnet.py
new file mode 100644
index 0000000000000000000000000000000000000000..8093470241b3297c44a2e42b5162e25cac1514be
--- /dev/null
+++ b/PaddleCV/rcnn/models/resnet.py
@@ -0,0 +1,181 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle.fluid as fluid
+from paddle.fluid.param_attr import ParamAttr
+from paddle.fluid.initializer import Constant
+from paddle.fluid.regularizer import L2Decay
+from config import cfg
+
+
+def conv_bn_layer(input,
+ ch_out,
+ filter_size,
+ stride,
+ padding,
+ act='relu',
+ name=None):
+ conv1 = fluid.layers.conv2d(
+ input=input,
+ num_filters=ch_out,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ act=None,
+ param_attr=ParamAttr(name=name + "_weights"),
+ bias_attr=ParamAttr(name=name + "_biases"),
+ name=name + '.conv2d.output.1')
+ if name == "conv1":
+ bn_name = "bn_" + name
+ else:
+ bn_name = "bn" + name[3:]
+
+ return fluid.layers.batch_norm(
+ input=conv1,
+ act=act,
+ name=bn_name + '.output.1',
+ param_attr=ParamAttr(name=bn_name + '_scale'),
+ bias_attr=ParamAttr(bn_name + '_offset'),
+ moving_mean_name=bn_name + '_mean',
+ moving_variance_name=bn_name + '_variance',
+ is_test=True)
+
+
+def conv_affine_layer(input,
+ ch_out,
+ filter_size,
+ stride,
+ padding,
+ act='relu',
+ name=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=ch_out,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ act=None,
+ param_attr=ParamAttr(name=name + "_weights"),
+ bias_attr=False,
+ name=name + '.conv2d.output.1')
+ if name == "conv1":
+ bn_name = "bn_" + name
+ else:
+ bn_name = "bn" + name[3:]
+
+ scale = fluid.layers.create_parameter(
+ shape=[conv.shape[1]],
+ dtype=conv.dtype,
+ attr=ParamAttr(
+ name=bn_name + '_scale', learning_rate=0.),
+ default_initializer=Constant(1.))
+ scale.stop_gradient = True
+ bias = fluid.layers.create_parameter(
+ shape=[conv.shape[1]],
+ dtype=conv.dtype,
+ attr=ParamAttr(
+ bn_name + '_offset', learning_rate=0.),
+ default_initializer=Constant(0.))
+ bias.stop_gradient = True
+
+ out = fluid.layers.affine_channel(x=conv, scale=scale, bias=bias)
+ if act == 'relu':
+ out = fluid.layers.relu(x=out)
+ return out
+
+
+def shortcut(input, ch_out, stride, name):
+ ch_in = input.shape[1] # if args.data_format == 'NCHW' else input.shape[-1]
+ if ch_in != ch_out:
+ return conv_affine_layer(input, ch_out, 1, stride, 0, None, name=name)
+ else:
+ return input
+
+
+def basicblock(input, ch_out, stride, name):
+    # distinct sub-names keep conv/affine parameter names unique per block
+    short = shortcut(input, ch_out, stride, name=name + "_branch1")
+    conv1 = conv_affine_layer(
+        input, ch_out, 3, stride, 1, name=name + "_branch2a")
+    conv2 = conv_affine_layer(
+        conv1, ch_out, 3, 1, 1, act=None, name=name + "_branch2b")
+ return fluid.layers.elementwise_add(x=short, y=conv2, act='relu', name=name)
+
+
+def bottleneck(input, ch_out, stride, name):
+ short = shortcut(input, ch_out * 4, stride, name=name + "_branch1")
+ conv1 = conv_affine_layer(
+ input, ch_out, 1, stride, 0, name=name + "_branch2a")
+ conv2 = conv_affine_layer(conv1, ch_out, 3, 1, 1, name=name + "_branch2b")
+ conv3 = conv_affine_layer(
+ conv2, ch_out * 4, 1, 1, 0, act=None, name=name + "_branch2c")
+ return fluid.layers.elementwise_add(
+ x=short, y=conv3, act='relu', name=name + ".add.output.5")
+
+
+def layer_warp(block_func, input, ch_out, count, stride, name):
+ res_out = block_func(input, ch_out, stride, name=name + "a")
+ for i in range(1, count):
+ res_out = block_func(res_out, ch_out, 1, name=name + chr(ord("a") + i))
+ return res_out
+
+
+ResNet_cfg = {
+    18: ([2, 2, 2, 2], basicblock),
+ 34: ([3, 4, 6, 3], basicblock),
+ 50: ([3, 4, 6, 3], bottleneck),
+ 101: ([3, 4, 23, 3], bottleneck),
+ 152: ([3, 8, 36, 3], bottleneck)
+}
+
+
+def add_ResNet50_conv4_body(body_input):
+ stages, block_func = ResNet_cfg[50]
+ stages = stages[0:3]
+ conv1 = conv_affine_layer(
+ body_input, ch_out=64, filter_size=7, stride=2, padding=3, name="conv1")
+ pool1 = fluid.layers.pool2d(
+ input=conv1,
+ pool_type='max',
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1)
+ res2 = layer_warp(block_func, pool1, 64, stages[0], 1, name="res2")
+ if cfg.TRAIN.freeze_at == 2:
+ res2.stop_gradient = True
+ res3 = layer_warp(block_func, res2, 128, stages[1], 2, name="res3")
+ if cfg.TRAIN.freeze_at == 3:
+ res3.stop_gradient = True
+ res4 = layer_warp(block_func, res3, 256, stages[2], 2, name="res4")
+ if cfg.TRAIN.freeze_at == 4:
+ res4.stop_gradient = True
+ return res4
+
+
+def add_ResNet_roi_conv5_head(head_input, rois):
+ if cfg.roi_func == 'RoIPool':
+ pool = fluid.layers.roi_pool(
+ input=head_input,
+ rois=rois,
+ pooled_height=cfg.roi_resolution,
+ pooled_width=cfg.roi_resolution,
+ spatial_scale=cfg.spatial_scale)
+ elif cfg.roi_func == 'RoIAlign':
+ pool = fluid.layers.roi_align(
+ input=head_input,
+ rois=rois,
+ pooled_height=cfg.roi_resolution,
+ pooled_width=cfg.roi_resolution,
+ spatial_scale=cfg.spatial_scale,
+ sampling_ratio=cfg.sampling_ratio)
+
+ res5 = layer_warp(bottleneck, pool, 512, 3, 2, name="res5")
+ return res5
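A quick way to see what `add_ResNet50_conv4_body` constructs: the stage counts come from `ResNet_cfg[50]` (only the first three stages form the conv4 body; `res5` is added later by `add_ResNet_roi_conv5_head`), and `layer_warp` names blocks `res<stage><letter>`. The pure-Python sketch below just reproduces that naming, no Paddle required.

```python
stages = [3, 4, 6]  # ResNet_cfg[50][:3]; res5 is built later by the RoI head
for stage_idx, count in enumerate(stages, start=2):
    print(["res{}{}".format(stage_idx, chr(ord("a") + i)) for i in range(count)])
# ['res2a', 'res2b', 'res2c']
# ['res3a', 'res3b', 'res3c', 'res3d']
# ['res4a', 'res4b', 'res4c', 'res4d', 'res4e', 'res4f']
```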
diff --git a/fluid/PaddleCV/faster_rcnn/pretrained/download.sh b/PaddleCV/rcnn/pretrained/download.sh
similarity index 100%
rename from fluid/PaddleCV/faster_rcnn/pretrained/download.sh
rename to PaddleCV/rcnn/pretrained/download.sh
diff --git a/PaddleCV/rcnn/profile.py b/PaddleCV/rcnn/profile.py
new file mode 100644
index 0000000000000000000000000000000000000000..92f089b4238a545595723bd8251c6a2e715a59d3
--- /dev/null
+++ b/PaddleCV/rcnn/profile.py
@@ -0,0 +1,167 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import time
+import numpy as np
+import argparse
+from utility import parse_args, add_arguments, print_arguments
+
+import paddle
+import paddle.fluid as fluid
+import reader
+import paddle.fluid.profiler as profiler
+
+import models.model_builder as model_builder
+import models.resnet as resnet
+from learning_rate import exponential_with_warmup_decay
+from config import cfg
+
+
+def train():
+ learning_rate = cfg.learning_rate
+ image_shape = [3, cfg.TRAIN.max_size, cfg.TRAIN.max_size]
+ num_iterations = cfg.max_iter
+
+ devices = os.getenv("CUDA_VISIBLE_DEVICES") or ""
+ devices_num = len(devices.split(","))
+ total_batch_size = devices_num * cfg.TRAIN.im_per_batch
+ model = model_builder.RCNN(
+ add_conv_body_func=resnet.add_ResNet50_conv4_body,
+ add_roi_box_head_func=resnet.add_ResNet_roi_conv5_head,
+ use_pyreader=cfg.use_pyreader,
+ use_random=False)
+ model.build_model(image_shape)
+ losses, keys = model.loss()
+ loss = losses[0]
+ fetch_list = [loss]
+
+ boundaries = cfg.lr_steps
+ gamma = cfg.lr_gamma
+ step_num = len(cfg.lr_steps)
+ values = [learning_rate * (gamma**i) for i in range(step_num + 1)]
+
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=exponential_with_warmup_decay(
+ learning_rate=learning_rate,
+ boundaries=boundaries,
+ values=values,
+ warmup_iter=500,
+ warmup_factor=1.0 / 3.0),
+ regularization=fluid.regularizer.L2Decay(0.0001),
+ momentum=0.9)
+ optimizer.minimize(loss)
+
+ fluid.memory_optimize(fluid.default_main_program())
+
+ place = fluid.CUDAPlace(0) if cfg.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ if cfg.pretrained_model:
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(cfg.pretrained_model, var.name))
+
+ fluid.io.load_vars(exe, cfg.pretrained_model, predicate=if_exist)
+
+ if cfg.parallel:
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=bool(cfg.use_gpu), loss_name=loss.name)
+
+ if cfg.use_pyreader:
+ train_reader = reader.train(
+ batch_size=cfg.TRAIN.im_per_batch,
+ total_batch_size=total_batch_size,
+ padding_total=cfg.TRAIN.padding_minibatch,
+ shuffle=False)
+ py_reader = model.py_reader
+ py_reader.decorate_paddle_reader(train_reader)
+ else:
+ train_reader = reader.train(batch_size=total_batch_size, shuffle=False)
+ feeder = fluid.DataFeeder(place=place, feed_list=model.feeds())
+
+ def run(iterations):
+ reader_time = []
+ run_time = []
+ total_images = 0
+
+        # create the batch generator once; calling train_reader() inside the
+        # loop would restart it and profile the same first batch every time
+        batches = train_reader()
+        for batch_id in range(iterations):
+            start_time = time.time()
+            data = next(batches)
+ end_time = time.time()
+ reader_time.append(end_time - start_time)
+ start_time = time.time()
+ if cfg.parallel:
+ outs = train_exe.run(fetch_list=[v.name for v in fetch_list],
+ feed=feeder.feed(data))
+ else:
+ outs = exe.run(fluid.default_main_program(),
+ fetch_list=[v.name for v in fetch_list],
+ feed=feeder.feed(data))
+ end_time = time.time()
+ run_time.append(end_time - start_time)
+ total_images += len(data)
+ print("Batch {:d}, loss {:.6f} ".format(batch_id, np.mean(outs[0])))
+ return reader_time, run_time, total_images
+
+ def run_pyreader(iterations):
+ reader_time = [0]
+ run_time = []
+ total_images = 0
+
+ py_reader.start()
+ try:
+ for batch_id in range(iterations):
+ start_time = time.time()
+ if cfg.parallel:
+ outs = train_exe.run(
+ fetch_list=[v.name for v in fetch_list])
+ else:
+ outs = exe.run(fluid.default_main_program(),
+ fetch_list=[v.name for v in fetch_list])
+ end_time = time.time()
+ run_time.append(end_time - start_time)
+ total_images += devices_num
+ print("Batch {:d}, loss {:.6f} ".format(batch_id,
+ np.mean(outs[0])))
+ except fluid.core.EOFException:
+ py_reader.reset()
+
+ return reader_time, run_time, total_images
+
+ run_func = run if not cfg.use_pyreader else run_pyreader
+
+ # warm-up
+ run_func(2)
+ # profiling
+ start = time.time()
+ if cfg.use_profile:
+ with profiler.profiler('GPU', 'total', '/tmp/profile_file'):
+ reader_time, run_time, total_images = run_func(num_iterations)
+ else:
+ reader_time, run_time, total_images = run_func(num_iterations)
+
+ end = time.time()
+ total_time = end - start
+ print("Total time: {0}, reader time: {1} s, run time: {2} s, images/s: {3}".
+ format(total_time,
+ np.sum(reader_time),
+ np.sum(run_time), total_images / total_time))
+
+
+if __name__ == '__main__':
+ args = parse_args()
+ print_arguments(args)
+ train()
diff --git a/PaddleCV/rcnn/reader.py b/PaddleCV/rcnn/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..d2d5abbca8f0175fb1d1ba50ffcd62b2b5ed31da
--- /dev/null
+++ b/PaddleCV/rcnn/reader.py
@@ -0,0 +1,181 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import random
+import numpy as np
+import xml.etree.ElementTree
+import os
+import time
+import copy
+import six
+import cv2
+from collections import deque
+
+from roidbs import JsonDataset
+import data_utils
+from config import cfg
+import segm_utils
+
+
+def roidb_reader(roidb, mode):
+ im, im_scales = data_utils.get_image_blob(roidb, mode)
+ im_id = roidb['id']
+ im_height = np.round(roidb['height'] * im_scales)
+ im_width = np.round(roidb['width'] * im_scales)
+ im_info = np.array([im_height, im_width, im_scales], dtype=np.float32)
+ if mode == 'val':
+ return im, im_info, im_id
+
+ gt_boxes = roidb['gt_boxes'].astype('float32')
+ gt_classes = roidb['gt_classes'].astype('int32')
+ is_crowd = roidb['is_crowd'].astype('int32')
+ segms = roidb['segms']
+
+ outs = (im, gt_boxes, gt_classes, is_crowd, im_info, im_id)
+
+ if cfg.MASK_ON:
+ gt_masks = []
+ valid = True
+ segms = roidb['segms']
+ assert len(segms) == is_crowd.shape[0]
+ for i in range(len(roidb['segms'])):
+ segm, iscrowd = segms[i], is_crowd[i]
+ gt_segm = []
+ if iscrowd:
+ gt_segm.append([[0, 0]])
+ else:
+ for poly in segm:
+ if len(poly) == 0:
+ valid = False
+ break
+ gt_segm.append(np.array(poly).reshape(-1, 2))
+ if (not valid) or len(gt_segm) == 0:
+ break
+ gt_masks.append(gt_segm)
+ outs = outs + (gt_masks, )
+ return outs
+
+
+def coco(mode,
+ batch_size=None,
+ total_batch_size=None,
+ padding_total=False,
+ shuffle=False):
+ total_batch_size = total_batch_size if total_batch_size else batch_size
+ assert total_batch_size % batch_size == 0
+ json_dataset = JsonDataset(mode)
+ roidbs = json_dataset.get_roidb()
+
+ print("{} on {} with {} roidbs".format(mode, cfg.dataset, len(roidbs)))
+
+ def padding_minibatch(batch_data):
+ if len(batch_data) == 1:
+ return batch_data
+
+ max_shape = np.array([data[0].shape for data in batch_data]).max(axis=0)
+
+ padding_batch = []
+ for data in batch_data:
+ im_c, im_h, im_w = data[0].shape[:]
+ padding_im = np.zeros(
+ (im_c, max_shape[1], max_shape[2]), dtype=np.float32)
+ padding_im[:, :im_h, :im_w] = data[0]
+ padding_batch.append((padding_im, ) + data[1:])
+ return padding_batch
+
+ def reader():
+ if mode == "train":
+ if shuffle:
+ roidb_perm = deque(np.random.permutation(roidbs))
+ else:
+ roidb_perm = deque(roidbs)
+ roidb_cur = 0
+ count = 0
+ batch_out = []
+        device_num = total_batch_size // batch_size  # int for range() below
+ while True:
+ roidb = roidb_perm[0]
+ roidb_cur += 1
+ roidb_perm.rotate(-1)
+ if roidb_cur >= len(roidbs):
+ if shuffle:
+ roidb_perm = deque(np.random.permutation(roidbs))
+ else:
+ roidb_perm = deque(roidbs)
+ roidb_cur = 0
+ # im, gt_boxes, gt_classes, is_crowd, im_info, im_id, gt_masks
+ datas = roidb_reader(roidb, mode)
+ if datas[1].shape[0] == 0:
+ continue
+ if cfg.MASK_ON:
+ if len(datas[-1]) != datas[1].shape[0]:
+ continue
+ batch_out.append(datas)
+ if not padding_total:
+ if len(batch_out) == batch_size:
+ yield padding_minibatch(batch_out)
+ count += 1
+ batch_out = []
+ else:
+ if len(batch_out) == total_batch_size:
+ batch_out = padding_minibatch(batch_out)
+ for i in range(device_num):
+ sub_batch_out = []
+ for j in range(batch_size):
+ sub_batch_out.append(batch_out[i * batch_size +
+ j])
+ yield sub_batch_out
+ count += 1
+ sub_batch_out = []
+ batch_out = []
+ iter_id = count // device_num
+ if iter_id >= cfg.max_iter:
+ return
+ elif mode == "val":
+ batch_out = []
+ for roidb in roidbs:
+ im, im_info, im_id = roidb_reader(roidb, mode)
+ batch_out.append((im, im_info, im_id))
+ if len(batch_out) == batch_size:
+ yield batch_out
+ batch_out = []
+ if len(batch_out) != 0:
+ yield batch_out
+
+ return reader
+
+
+def train(batch_size, total_batch_size=None, padding_total=False, shuffle=True):
+ return coco(
+ 'train', batch_size, total_batch_size, padding_total, shuffle=shuffle)
+
+
+def test(batch_size, total_batch_size=None, padding_total=False):
+    return coco('val', batch_size, total_batch_size, padding_total,
+                shuffle=False)
+
+
+def infer(file_path):
+ def reader():
+ if not os.path.exists(file_path):
+ raise ValueError("Image path [%s] does not exist." % (file_path))
+ im = cv2.imread(file_path)
+ im = im.astype(np.float32, copy=False)
+ im -= cfg.pixel_means
+ im_height, im_width, channel = im.shape
+ channel_swap = (2, 0, 1) #(channel, height, width)
+ im = im.transpose(channel_swap)
+ im_info = np.array([im_height, im_width, 1.0], dtype=np.float32)
+ yield [(im, im_info)]
+
+ return reader
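To make the shape handling in `padding_minibatch` concrete, here is a standalone NumPy sketch with two hypothetical CHW images of different sizes; each one is zero-padded up to the per-batch maximum height and width, exactly as the reader does before yielding a batch.

```python
import numpy as np

# two hypothetical CHW images of different spatial sizes, as produced by
# roidb_reader; each tuple would also carry gt boxes, im_info, etc.
batch = [(np.ones((3, 600, 800), dtype=np.float32),),
         (np.ones((3, 750, 640), dtype=np.float32),)]

# pad every image with zeros up to the per-batch max height and width
max_shape = np.array([d[0].shape for d in batch]).max(axis=0)  # [3, 750, 800]
padded = []
for data in batch:
    c, h, w = data[0].shape
    canvas = np.zeros((c, max_shape[1], max_shape[2]), dtype=np.float32)
    canvas[:, :h, :w] = data[0]
    padded.append((canvas,) + data[1:])

print([d[0].shape for d in padded])  # [(3, 750, 800), (3, 750, 800)]
```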
diff --git a/PaddleCV/rcnn/roidbs.py b/PaddleCV/rcnn/roidbs.py
new file mode 100644
index 0000000000000000000000000000000000000000..6b5386161c3d270c3eb93e60b579727dc397e535
--- /dev/null
+++ b/PaddleCV/rcnn/roidbs.py
@@ -0,0 +1,216 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# Based on:
+# --------------------------------------------------------
+# Detectron
+# Copyright (c) 2017-present, Facebook, Inc.
+# Licensed under the Apache License, Version 2.0;
+# Written by Ross Girshick
+# --------------------------------------------------------
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+import copy
+import logging
+import numpy as np
+import os
+import scipy.sparse
+import random
+import time
+import matplotlib
+matplotlib.use('Agg')
+from pycocotools.coco import COCO
+import box_utils
+import segm_utils
+from config import cfg
+from data_utils import DatasetPath
+
+logger = logging.getLogger(__name__)
+
+
+class JsonDataset(object):
+ """A class representing a COCO json dataset."""
+
+ def __init__(self, mode):
+ print('Creating: {}'.format(cfg.dataset))
+ self.name = cfg.dataset
+ self.is_train = mode == 'train'
+ data_path = DatasetPath(mode)
+ data_dir = data_path.get_data_dir()
+ file_list = data_path.get_file_list()
+ self.image_directory = data_dir
+ self.COCO = COCO(file_list)
+ # Set up dataset classes
+ category_ids = self.COCO.getCatIds()
+ categories = [c['name'] for c in self.COCO.loadCats(category_ids)]
+ self.category_to_id_map = dict(zip(categories, category_ids))
+ self.classes = ['__background__'] + categories
+ self.num_classes = len(self.classes)
+ self.json_category_id_to_contiguous_id = {
+ v: i + 1
+ for i, v in enumerate(self.COCO.getCatIds())
+ }
+ self.contiguous_category_id_to_json_id = {
+ v: k
+ for k, v in self.json_category_id_to_contiguous_id.items()
+ }
+
+ def get_roidb(self):
+ """Return an roidb corresponding to the json dataset. Optionally:
+ - include ground truth boxes in the roidb
+ - add proposals specified in a proposals file
+ - filter proposals based on a minimum side length
+ - filter proposals that intersect with crowd regions
+ """
+ image_ids = self.COCO.getImgIds()
+ image_ids.sort()
+ roidb = copy.deepcopy(self.COCO.loadImgs(image_ids))
+ for entry in roidb:
+ self._prep_roidb_entry(entry)
+ if self.is_train:
+ # Include ground-truth object annotations
+ start_time = time.time()
+ for entry in roidb:
+ self._add_gt_annotations(entry)
+ end_time = time.time()
+ print('_add_gt_annotations took {:.3f}s'.format(end_time -
+ start_time))
+ if cfg.TRAIN.use_flipped:
+ print('Appending horizontally-flipped training examples...')
+ self._extend_with_flipped_entries(roidb)
+ print('Loaded dataset: {:s}'.format(self.name))
+ print('{:d} roidb entries'.format(len(roidb)))
+ if self.is_train:
+            roidb = self._filter_for_training(roidb)
+ return roidb
+
+ def _prep_roidb_entry(self, entry):
+ """Adds empty metadata fields to an roidb entry."""
+ # Make file_name an abs path
+ im_path = os.path.join(self.image_directory, entry['file_name'])
+ #assert os.path.exists(im_path), 'Image \'{}\' not found'.format(im_path)
+ entry['image'] = im_path
+ entry['flipped'] = False
+ # Empty placeholders
+ entry['gt_boxes'] = np.empty((0, 4), dtype=np.float32)
+ entry['gt_classes'] = np.empty((0), dtype=np.int32)
+ entry['gt_id'] = np.empty((0), dtype=np.int32)
+ entry['is_crowd'] = np.empty((0), dtype=np.bool)
+ entry['segms'] = []
+ # Remove unwanted fields that come from the json file (if they exist)
+ for k in ['date_captured', 'url', 'license', 'file_name']:
+ if k in entry:
+ del entry[k]
+
+ def _add_gt_annotations(self, entry):
+ """Add ground truth annotation metadata to an roidb entry."""
+ count = 0
+ #for k in self.category_to_id_map:
+ # imgs = self.COCO.getImgIds(catIds=(self.category_to_id_map[k]))
+ # count += len(imgs)
+ ann_ids = self.COCO.getAnnIds(imgIds=entry['id'], iscrowd=None)
+ objs = self.COCO.loadAnns(ann_ids)
+ # Sanitize bboxes -- some are invalid
+ valid_objs = []
+ valid_segms = []
+ width = entry['width']
+ height = entry['height']
+ for obj in objs:
+ if isinstance(obj['segmentation'], list):
+ # Valid polygons have >= 3 points, so require >= 6 coordinates
+ obj['segmentation'] = [
+ p for p in obj['segmentation'] if len(p) >= 6
+ ]
+ if obj['area'] < cfg.TRAIN.gt_min_area:
+ continue
+ if 'ignore' in obj and obj['ignore'] == 1:
+ continue
+            # Convert from (x1, y1, w, h) to (x1, y1, x2, y2)
+ x1, y1, x2, y2 = box_utils.xywh_to_xyxy(obj['bbox'])
+ x1, y1, x2, y2 = box_utils.clip_xyxy_to_image(x1, y1, x2, y2,
+ height, width)
+ # Require non-zero seg area and more than 1x1 box size
+ if obj['area'] > 0 and x2 > x1 and y2 > y1:
+ obj['clean_bbox'] = [x1, y1, x2, y2]
+ valid_objs.append(obj)
+ valid_segms.append(obj['segmentation'])
+
+ num_valid_objs = len(valid_objs)
+
+ gt_boxes = np.zeros((num_valid_objs, 4), dtype=entry['gt_boxes'].dtype)
+ gt_id = np.zeros((num_valid_objs), dtype=np.int32)
+ gt_classes = np.zeros((num_valid_objs), dtype=entry['gt_classes'].dtype)
+ is_crowd = np.zeros((num_valid_objs), dtype=entry['is_crowd'].dtype)
+ for ix, obj in enumerate(valid_objs):
+ cls = self.json_category_id_to_contiguous_id[obj['category_id']]
+ gt_boxes[ix, :] = obj['clean_bbox']
+ gt_classes[ix] = cls
+ gt_id[ix] = np.int32(obj['id'])
+ is_crowd[ix] = obj['iscrowd']
+
+ entry['gt_boxes'] = np.append(entry['gt_boxes'], gt_boxes, axis=0)
+ entry['gt_classes'] = np.append(entry['gt_classes'], gt_classes)
+ entry['gt_id'] = np.append(entry['gt_id'], gt_id)
+ entry['is_crowd'] = np.append(entry['is_crowd'], is_crowd)
+ entry['segms'].extend(valid_segms)
+
+ def _extend_with_flipped_entries(self, roidb):
+ """Flip each entry in the given roidb and return a new roidb that is the
+ concatenation of the original roidb and the flipped entries.
+ "Flipping" an entry means that that image and associated metadata (e.g.,
+ ground truth boxes and object proposals) are horizontally flipped.
+ """
+ flipped_roidb = []
+ for entry in roidb:
+ width = entry['width']
+ gt_boxes = entry['gt_boxes'].copy()
+ oldx1 = gt_boxes[:, 0].copy()
+ oldx2 = gt_boxes[:, 2].copy()
+ gt_boxes[:, 0] = width - oldx2 - 1
+ gt_boxes[:, 2] = width - oldx1 - 1
+ assert (gt_boxes[:, 2] >= gt_boxes[:, 0]).all()
+ flipped_entry = {}
+ dont_copy = ('gt_boxes', 'flipped', 'segms')
+ for k, v in entry.items():
+ if k not in dont_copy:
+ flipped_entry[k] = v
+ flipped_entry['gt_boxes'] = gt_boxes
+ flipped_entry['segms'] = segm_utils.flip_segms(
+ entry['segms'], entry['height'], entry['width'])
+ flipped_entry['flipped'] = True
+ flipped_roidb.append(flipped_entry)
+ roidb.extend(flipped_roidb)
+
+ def _filter_for_training(self, roidb):
+ """Remove roidb entries that have no usable RoIs based on config settings.
+ """
+
+ def is_valid(entry):
+ # Valid images have:
+ # (1) At least one groundtruth RoI OR
+ # (2) At least one background RoI
+ gt_boxes = entry['gt_boxes']
+ # image is only valid if such boxes exist
+ valid = len(gt_boxes) > 0
+ return valid
+
+ num = len(roidb)
+ filtered_roidb = [entry for entry in roidb if is_valid(entry)]
+ num_after = len(filtered_roidb)
+        print('Filtered {} roidb entries: {} -> {}'.format(num - num_after, num,
+                                                           num_after))
+        return filtered_roidb
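`_add_gt_annotations` leans on `box_utils.xywh_to_xyxy` and `box_utils.clip_xyxy_to_image`, which are not part of this diff. The sketch below shows their assumed Detectron-style behavior, where COCO boxes are `(x1, y1, w, h)` in inclusive pixel coordinates (hence the `- 1` on the far edge); the two functions here are illustrative stand-ins.

```python
import numpy as np

def xywh_to_xyxy(box):
    # assumed behavior: inclusive pixel coordinates, Detectron convention
    x1, y1, w, h = box
    return x1, y1, x1 + np.maximum(w - 1, 0), y1 + np.maximum(h - 1, 0)

def clip_xyxy_to_image(x1, y1, x2, y2, height, width):
    # clamp every corner into the image extent
    return (np.clip(x1, 0, width - 1), np.clip(y1, 0, height - 1),
            np.clip(x2, 0, width - 1), np.clip(y2, 0, height - 1))

print(clip_xyxy_to_image(*xywh_to_xyxy([630.0, 20.0, 20.0, 30.0]), 480, 640))
# (630.0, 20.0, 639.0, 49.0) -- the box is clipped at the right image border
```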
diff --git a/PaddleCV/rcnn/scripts/eval.sh b/PaddleCV/rcnn/scripts/eval.sh
new file mode 100644
index 0000000000000000000000000000000000000000..922380acf52e594931506e791990319d152d9260
--- /dev/null
+++ b/PaddleCV/rcnn/scripts/eval.sh
@@ -0,0 +1,17 @@
+#!/bin/bash
+export CUDA_VISIBLE_DEVICES=0
+
+model=$1 # faster_rcnn, mask_rcnn
+if [ "$model" = "faster_rcnn" ]; then
+ mask_on="--MASK_ON False"
+elif [ "$model" = "mask_rcnn" ]; then
+ mask_on="--MASK_ON True"
+else
+ echo "Invalid model provided. Please use one of {faster_rcnn, mask_rcnn}"
+ exit 1
+fi
+
+python -u ../eval_coco_map.py \
+ $mask_on \
+ --pretrained_model=../output/model_iter179999 \
+    --data_dir=../dataset/coco/
diff --git a/PaddleCV/rcnn/scripts/infer.sh b/PaddleCV/rcnn/scripts/infer.sh
new file mode 100644
index 0000000000000000000000000000000000000000..6f0e02730b9db07568c31a280825f75e321eab64
--- /dev/null
+++ b/PaddleCV/rcnn/scripts/infer.sh
@@ -0,0 +1,19 @@
+#!/bin/bash
+export CUDA_VISIBLE_DEVICES=0
+
+model=$1 # faster_rcnn, mask_rcnn
+if [ "$model" = "faster_rcnn" ]; then
+ mask_on="--MASK_ON False"
+elif [ "$model" = "mask_rcnn" ]; then
+ mask_on="--MASK_ON True"
+else
+ echo "Invalid model provided. Please use one of {faster_rcnn, mask_rcnn}"
+ exit 1
+fi
+
+python -u ../infer.py \
+ $mask_on \
+ --pretrained_model=../output/model_iter179999 \
+ --image_path=../dataset/coco/val2017/ \
+ --image_name=000000000139.jpg \
+ --draw_threshold=0.6
diff --git a/PaddleCV/rcnn/scripts/train.sh b/PaddleCV/rcnn/scripts/train.sh
new file mode 100755
index 0000000000000000000000000000000000000000..83c67e6c39121c0fecec5cd7c037d14ab53c619d
--- /dev/null
+++ b/PaddleCV/rcnn/scripts/train.sh
@@ -0,0 +1,19 @@
+#!/bin/bash
+export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7
+
+model=$1 # faster_rcnn, mask_rcnn
+if [ "$model" = "faster_rcnn" ]; then
+ mask_on="--MASK_ON False"
+elif [ "$model" = "mask_rcnn" ]; then
+ mask_on="--MASK_ON True"
+else
+ echo "Invalid model provided. Please use one of {faster_rcnn, mask_rcnn}"
+ exit 1
+fi
+
+python -u ../train.py \
+ $mask_on \
+ --model_save_dir=../output/ \
+ --pretrained_model=../imagenet_resnet50_fusebn/ \
+    --data_dir=../dataset/coco/
diff --git a/PaddleCV/rcnn/segm_utils.py b/PaddleCV/rcnn/segm_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..17b72228bc4284dc5936d4a3fda5c2422c4aa958
--- /dev/null
+++ b/PaddleCV/rcnn/segm_utils.py
@@ -0,0 +1,88 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# Based on:
+# --------------------------------------------------------
+# Detectron
+# Copyright (c) 2017-present, Facebook, Inc.
+# Licensed under the Apache License, Version 2.0;
+# Written by Ross Girshick
+# --------------------------------------------------------
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+import numpy as np
+import pycocotools.mask as mask_util
+import cv2
+
+
+def is_poly(segm):
+ """Determine if segm is a polygon. Valid segm expected (polygon or RLE)."""
+ assert isinstance(segm, (list, dict)), \
+ 'Invalid segm type: {}'.format(type(segm))
+ return isinstance(segm, list)
+
+
+def segms_to_rle(segms, height, width):
+ rle = segms
+ if isinstance(segms, list):
+ # polygon -- a single object might consist of multiple parts
+ # we merge all parts into one mask rle code
+ rles = mask_util.frPyObjects(segms, height, width)
+ rle = mask_util.merge(rles)
+ elif isinstance(segms['counts'], list):
+ # uncompressed RLE
+ rle = mask_util.frPyObjects(segms, height, width)
+ return rle
+
+
+def segms_to_mask(segms, iscrowd, height, width):
+ if iscrowd:
+ return [[0 for i in range(width)] for j in range(height)]
+ rle = segms_to_rle(segms, height, width)
+ mask = mask_util.decode(rle)
+ return mask
+
+
+def flip_segms(segms, height, width):
+ """Left/right flip each mask in a list of masks."""
+
+ def _flip_poly(poly, width):
+ flipped_poly = np.array(poly)
+ flipped_poly[0::2] = width - np.array(poly[0::2]) - 1
+ return flipped_poly.tolist()
+
+ def _flip_rle(rle, height, width):
+ if 'counts' in rle and type(rle['counts']) == list:
+ # Magic RLE format handling painfully discovered by looking at the
+ # COCO API showAnns function.
+ rle = mask_util.frPyObjects([rle], height, width)
+ mask = mask_util.decode(rle)
+ mask = mask[:, ::-1, :]
+ rle = mask_util.encode(np.array(mask, order='F', dtype=np.uint8))
+ return rle
+
+ flipped_segms = []
+ for segm in segms:
+ if is_poly(segm):
+ # Polygon format
+ flipped_segms.append([_flip_poly(poly, width) for poly in segm])
+ else:
+ # RLE format
+ flipped_segms.append(_flip_rle(segm, height, width))
+ return flipped_segms
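A tiny runnable illustration of what `_flip_poly` does: the x coordinates of a flat COCO polygon `[x0, y0, x1, y1, ...]` are mirrored about the image width while the y coordinates are left untouched.

```python
import numpy as np

def flip_poly(poly, width):
    # mirror every x coordinate; y coordinates (odd indices) stay put
    flipped = np.array(poly, dtype=np.float64)
    flipped[0::2] = width - flipped[0::2] - 1
    return flipped.tolist()

print(flip_poly([10, 5, 50, 5, 30, 40], width=100))
# [89.0, 5.0, 49.0, 5.0, 69.0, 40.0]
```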
diff --git a/PaddleCV/rcnn/train.py b/PaddleCV/rcnn/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..8404de31d0be066fb41e0cbd44166bd53787c7ee
--- /dev/null
+++ b/PaddleCV/rcnn/train.py
@@ -0,0 +1,200 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import sys
+import numpy as np
+import time
+import shutil
+from utility import parse_args, print_arguments, SmoothedValue, TrainingStats, now_time
+import collections
+
+import paddle
+import paddle.fluid as fluid
+import reader
+import models.model_builder as model_builder
+import models.resnet as resnet
+from learning_rate import exponential_with_warmup_decay
+from config import cfg
+
+
+def train():
+ learning_rate = cfg.learning_rate
+ image_shape = [3, cfg.TRAIN.max_size, cfg.TRAIN.max_size]
+
+ if cfg.enable_ce:
+ fluid.default_startup_program().random_seed = 1000
+ fluid.default_main_program().random_seed = 1000
+ import random
+ random.seed(0)
+ np.random.seed(0)
+
+ devices = os.getenv("CUDA_VISIBLE_DEVICES") or ""
+ devices_num = len(devices.split(","))
+ total_batch_size = devices_num * cfg.TRAIN.im_per_batch
+
+ use_random = True
+ if cfg.enable_ce:
+ use_random = False
+ model = model_builder.RCNN(
+ add_conv_body_func=resnet.add_ResNet50_conv4_body,
+ add_roi_box_head_func=resnet.add_ResNet_roi_conv5_head,
+ use_pyreader=cfg.use_pyreader,
+ use_random=use_random)
+ model.build_model(image_shape)
+ losses, keys = model.loss()
+ loss = losses[0]
+ fetch_list = losses
+
+ boundaries = cfg.lr_steps
+ gamma = cfg.lr_gamma
+ step_num = len(cfg.lr_steps)
+ values = [learning_rate * (gamma**i) for i in range(step_num + 1)]
+
+ lr = exponential_with_warmup_decay(
+ learning_rate=learning_rate,
+ boundaries=boundaries,
+ values=values,
+ warmup_iter=cfg.warm_up_iter,
+ warmup_factor=cfg.warm_up_factor)
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=lr,
+ regularization=fluid.regularizer.L2Decay(cfg.weight_decay),
+ momentum=cfg.momentum)
+ optimizer.minimize(loss)
+ fetch_list = fetch_list + [lr]
+
+ fluid.memory_optimize(
+ fluid.default_main_program(), skip_opt_set=set(fetch_list))
+
+ place = fluid.CUDAPlace(0) if cfg.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ if cfg.pretrained_model:
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(cfg.pretrained_model, var.name))
+
+ fluid.io.load_vars(exe, cfg.pretrained_model, predicate=if_exist)
+
+ if cfg.parallel:
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=bool(cfg.use_gpu), loss_name=loss.name)
+
+ shuffle = True
+ if cfg.enable_ce:
+ shuffle = False
+ if cfg.use_pyreader:
+ train_reader = reader.train(
+ batch_size=cfg.TRAIN.im_per_batch,
+ total_batch_size=total_batch_size,
+ padding_total=cfg.TRAIN.padding_minibatch,
+ shuffle=shuffle)
+ py_reader = model.py_reader
+ py_reader.decorate_paddle_reader(train_reader)
+ else:
+ train_reader = reader.train(
+ batch_size=total_batch_size, shuffle=shuffle)
+ feeder = fluid.DataFeeder(place=place, feed_list=model.feeds())
+
+ def save_model(postfix):
+ model_path = os.path.join(cfg.model_save_dir, postfix)
+ if os.path.isdir(model_path):
+ shutil.rmtree(model_path)
+ fluid.io.save_persistables(exe, model_path)
+
+ def train_loop_pyreader():
+ py_reader.start()
+ train_stats = TrainingStats(cfg.log_window, keys)
+ try:
+ start_time = time.time()
+ prev_start_time = start_time
+ for iter_id in range(cfg.max_iter):
+ prev_start_time = start_time
+ start_time = time.time()
+ outs = train_exe.run(fetch_list=[v.name for v in fetch_list])
+ stats = {k: np.array(v).mean() for k, v in zip(keys, outs[:-1])}
+ train_stats.update(stats)
+ logs = train_stats.log()
+ strs = '{}, iter: {}, lr: {:.5f}, {}, time: {:.3f}'.format(
+ now_time(), iter_id,
+ np.mean(outs[-1]), logs, start_time - prev_start_time)
+ print(strs)
+ sys.stdout.flush()
+ if (iter_id + 1) % cfg.TRAIN.snapshot_iter == 0:
+ save_model("model_iter{}".format(iter_id))
+ end_time = time.time()
+ total_time = end_time - start_time
+ last_loss = np.array(outs[0]).mean()
+ if cfg.enable_ce:
+ gpu_num = devices_num
+ epoch_idx = iter_id + 1
+ loss = last_loss
+ print("kpis\teach_pass_duration_card%s\t%s" %
+ (gpu_num, total_time / epoch_idx))
+ print("kpis\ttrain_loss_card%s\t%s" % (gpu_num, loss))
+ except (StopIteration, fluid.core.EOFException):
+ py_reader.reset()
+
+ def train_loop():
+ start_time = time.time()
+ prev_start_time = start_time
+ start = start_time
+ train_stats = TrainingStats(cfg.log_window, keys)
+ for iter_id, data in enumerate(train_reader()):
+ prev_start_time = start_time
+ start_time = time.time()
+ outs = train_exe.run(fetch_list=[v.name for v in fetch_list],
+ feed=feeder.feed(data))
+ stats = {k: np.array(v).mean() for k, v in zip(keys, outs[:-1])}
+ train_stats.update(stats)
+ logs = train_stats.log()
+ strs = '{}, iter: {}, lr: {:.5f}, {}, time: {:.3f}'.format(
+ now_time(), iter_id,
+ np.mean(outs[-1]), logs, start_time - prev_start_time)
+ print(strs)
+ sys.stdout.flush()
+ if (iter_id + 1) % cfg.TRAIN.snapshot_iter == 0:
+ save_model("model_iter{}".format(iter_id))
+ if (iter_id + 1) == cfg.max_iter:
+ break
+ end_time = time.time()
+        total_time = end_time - start  # measured from the start of the loop
+ last_loss = np.array(outs[0]).mean()
+ # only for ce
+ if cfg.enable_ce:
+ gpu_num = devices_num
+ epoch_idx = iter_id + 1
+ loss = last_loss
+ print("kpis\teach_pass_duration_card%s\t%s" %
+ (gpu_num, total_time / epoch_idx))
+ print("kpis\ttrain_loss_card%s\t%s" % (gpu_num, loss))
+
+        return last_loss
+
+ if cfg.use_pyreader:
+ train_loop_pyreader()
+ else:
+ train_loop()
+ save_model('model_final')
+
+
+if __name__ == '__main__':
+ args = parse_args()
+ print_arguments(args)
+ train()
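The optimizer above is driven by `exponential_with_warmup_decay`, whose source is not included in this diff. Judging from how `boundaries` and `values` are built in `train()`, it presumably implements a linear warmup followed by piecewise-constant decay; the plain-Python sketch below models that assumed schedule (the helper and its defaults are illustrative, borrowing `warmup_iter=500` and `warmup_factor=1/3` from profile.py).

```python
def lr_at(step, base_lr=0.01, gamma=0.1, boundaries=(120000, 160000),
          warmup_iter=500, warmup_factor=1.0 / 3.0):
    # linear warmup from warmup_factor * base_lr up to base_lr
    if step < warmup_iter:
        alpha = step / float(warmup_iter)
        return base_lr * (warmup_factor * (1 - alpha) + alpha)
    # then piecewise-constant decay: base_lr * gamma**i after each boundary
    values = [base_lr * gamma ** i for i in range(len(boundaries) + 1)]
    for b, v in zip(boundaries, values):
        if step < b:
            return v
    return values[-1]

for s in (0, 250, 500, 100000, 130000, 170000):
    print(s, lr_at(s))
# 0 -> 0.00333..., 250 -> 0.00666..., 500 -> 0.01, 100000 -> 0.01,
# 130000 -> 0.001, 170000 -> 0.0001
```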
diff --git a/PaddleCV/rcnn/utility.py b/PaddleCV/rcnn/utility.py
new file mode 100644
index 0000000000000000000000000000000000000000..fcad5566fd082904c37953b09918f2bf5e0aa273
--- /dev/null
+++ b/PaddleCV/rcnn/utility.py
@@ -0,0 +1,174 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+"""
+Contains common utility functions.
+"""
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import sys
+import distutils.util
+import numpy as np
+import six
+import collections
+from collections import deque
+import datetime
+from paddle.fluid import core
+import argparse
+import functools
+from config import *
+
+
+def print_arguments(args):
+ """Print argparse's arguments.
+
+ Usage:
+
+ .. code-block:: python
+
+ parser = argparse.ArgumentParser()
+ parser.add_argument("name", default="Jonh", type=str, help="User name.")
+ args = parser.parse_args()
+ print_arguments(args)
+
+ :param args: Input argparse.Namespace for printing.
+ :type args: argparse.Namespace
+ """
+ print("----------- Configuration Arguments -----------")
+ for arg, value in sorted(six.iteritems(vars(args))):
+ print("%s: %s" % (arg, value))
+ print("------------------------------------------------")
+
+
+def add_arguments(argname, type, default, help, argparser, **kwargs):
+ """Add argparse's argument.
+
+ Usage:
+
+ .. code-block:: python
+
+ parser = argparse.ArgumentParser()
+ add_argument("name", str, "Jonh", "User name.", parser)
+ args = parser.parse_args()
+ """
+ type = distutils.util.strtobool if type == bool else type
+ argparser.add_argument(
+ "--" + argname,
+ default=default,
+ type=type,
+ help=help + ' Default: %(default)s.',
+ **kwargs)
+
+
+class SmoothedValue(object):
+ """Track a series of values and provide access to smoothed values over a
+ window or the global series average.
+ """
+
+ def __init__(self, window_size):
+ self.deque = deque(maxlen=window_size)
+
+ def add_value(self, value):
+ self.deque.append(value)
+
+ def get_median_value(self):
+ return np.median(self.deque)
+
+
+def now_time():
+ return datetime.datetime.now().strftime('%Y-%m-%d %H:%M:%S.%f')
+
+
+class TrainingStats(object):
+ def __init__(self, window_size, stats_keys):
+ self.smoothed_losses_and_metrics = {
+ key: SmoothedValue(window_size)
+ for key in stats_keys
+ }
+
+ def update(self, stats):
+ for k, v in self.smoothed_losses_and_metrics.items():
+ v.add_value(stats[k])
+
+ def get(self, extras=None):
+ stats = collections.OrderedDict()
+ if extras:
+ for k, v in extras.items():
+ stats[k] = v
+ for k, v in self.smoothed_losses_and_metrics.items():
+ stats[k] = round(v.get_median_value(), 3)
+
+ return stats
+
+ def log(self, extras=None):
+ d = self.get(extras)
+ strs = ', '.join(str(dict({x: y})).strip('{}') for x, y in d.items())
+ return strs
+
+
+def parse_args():
+ """return all args
+ """
+ parser = argparse.ArgumentParser(description=__doc__)
+ add_arg = functools.partial(add_arguments, argparser=parser)
+ # yapf: disable
+ # ENV
+ add_arg('parallel', bool, True, "Whether use parallel.")
+ add_arg('use_gpu', bool, True, "Whether use GPU.")
+ add_arg('model_save_dir', str, 'output', "The path to save model.")
+ add_arg('pretrained_model', str, 'imagenet_resnet50_fusebn', "The init model path.")
+ add_arg('dataset', str, 'coco2017', "coco2014, coco2017.")
+ add_arg('class_num', int, 81, "Class number.")
+ add_arg('data_dir', str, 'dataset/coco', "The data root path.")
+ add_arg('use_pyreader', bool, True, "Use pyreader.")
+ add_arg('use_profile', bool, False, "Whether use profiler.")
+ add_arg('padding_minibatch',bool, False,
+ "If False, only resize image and not pad, image shape is different between"
+ " GPUs in one mini-batch. If True, image shape is the same in one mini-batch.")
+ #SOLVER
+ add_arg('learning_rate', float, 0.01, "Learning rate.")
+ add_arg('max_iter', int, 180000, "Iter number.")
+ add_arg('log_window', int, 20, "Log smooth window, set 1 for debug, set 20 for train.")
+ # RCNN
+ # RPN
+ add_arg('anchor_sizes', int, [32,64,128,256,512], "The size of anchors.")
+ add_arg('aspect_ratios', float, [0.5,1.0,2.0], "The ratio of anchors.")
+ add_arg('variance', float, [1.,1.,1.,1.], "The variance of anchors.")
+    add_arg('rpn_stride',       float, [16.,16.],    "Stride of the feature map that RPN is attached to.")
+    add_arg('rpn_nms_thresh',   float, 0.7,          "NMS threshold used on RPN proposals.")
+ # TRAIN VAL INFER
+ add_arg('MASK_ON', bool, False, "Option for different models. If False, choose faster_rcnn. If True, choose mask_rcnn")
+ add_arg('im_per_batch', int, 1, "Minibatch size.")
+    add_arg('max_size',         int,   1333,    "The max length of the longer image side after resizing.")
+    add_arg('scales',           int,   [800],    "The target length of the shorter image side after resizing.")
+ add_arg('batch_size_per_im',int, 512, "fast rcnn head batch size")
+ add_arg('pixel_means', float, [102.9801, 115.9465, 122.7717], "pixel mean")
+ add_arg('nms_thresh', float, 0.5, "NMS threshold.")
+ add_arg('score_thresh', float, 0.05, "score threshold for NMS.")
+ add_arg('snapshot_stride', int, 10000, "save model every snapshot stride.")
+ # SINGLE EVAL AND DRAW
+ add_arg('draw_threshold', float, 0.8, "Confidence threshold to draw bbox.")
+ add_arg('image_path', str, 'dataset/coco/val2017', "The image path used to inference and visualize.")
+ # ce
+ parser.add_argument(
+ '--enable_ce', action='store_true', help='If set, run the task with continuous evaluation logs.')
+ # yapf: enable
+ args = parser.parse_args()
+ file_name = sys.argv[0]
+ if 'train' in file_name or 'profile' in file_name:
+ merge_cfg_from_args(args, 'train')
+ else:
+ merge_cfg_from_args(args, 'val')
+ return args
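The smoothing behind `TrainingStats` is simply a sliding median over the last `log_window` values, which keeps a single spiky iteration from dominating the printed loss. A self-contained illustration (a 3-entry window instead of the default 20):

```python
import collections
import numpy as np

# a 3-entry window for illustration; the trainer uses cfg.log_window (20)
window = collections.deque(maxlen=3)
for v in [0.9, 0.8, 5.0, 0.7]:  # 5.0 is an outlier iteration
    window.append(v)
    print(round(float(np.median(window)), 3))
# 0.9, 0.85, 0.9, 0.8 -- the outlier barely moves the reported value
```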
diff --git a/PaddleCV/video/.gitignore b/PaddleCV/video/.gitignore
new file mode 100644
index 0000000000000000000000000000000000000000..7052bdda1c76c2ab1adebd204bdef9ebf1a39755
--- /dev/null
+++ b/PaddleCV/video/.gitignore
@@ -0,0 +1,5 @@
+checkpoints
+output*
+*.pyc
+*.swp
+*_result
diff --git a/PaddleCV/video/README.md b/PaddleCV/video/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..eedeb79d45449ccfc9b18a001a2495e5c1b02f61
--- /dev/null
+++ b/PaddleCV/video/README.md
@@ -0,0 +1,134 @@
+
+## Introduction
+This repository aims to provide developers with convenient and efficient PaddlePaddle-based deep learning models for video understanding, video editing, video generation, and related tasks. It currently covers video classification models and will be extended to more scenarios over time.
+
+Video classification models currently included:
+
+| Model | Category | Description |
+| :--------------- | :--------: | :------------: |
+| [Attention Cluster](./models/attention_cluster/README.md) | Video classification | Attention-cluster fusion of multimodal video features, proposed at CVPR'18 |
+| [Attention LSTM](./models/attention_lstm/README.md) | Video classification | Widely used model, fast with high accuracy |
+| [NeXtVLAD](./models/nextvlad/README.md) | Video classification | Best single model in the 2nd YouTube-8M challenge |
+| [StNet](./models/stnet/README.md) | Video classification | Joint spatial-temporal video modeling method, proposed at AAAI'19 |
+| [TSM](./models/tsm/README.md) | Video classification | Simple and efficient spatial-temporal modeling based on temporal shift |
+| [TSN](./models/tsn/README.md) | Video classification | Classic 2D-CNN-based solution, proposed at ECCV'16 |
+| [Non-local](./models/nonlocal_model/README.md) | Video classification | Non-local relation modeling for video |
+
+### Highlights
+
+- Includes several leading video classification models. Attention LSTM, Attention Cluster, and NeXtVLAD are popular feature-sequence models, while Non-local, TSN, TSM, and StNet are end-to-end video classification models. Attention LSTM is fast with high accuracy; NeXtVLAD was the best single model in the 2nd YouTube-8M challenge; TSN is the classic 2D-CNN-based solution; TSM is a simple and efficient temporal-shift-based spatial-temporal modeling method; Non-local introduces non-local relation modeling for video. Attention Cluster and StNet were developed by Baidu, published at CVPR 2018 and AAAI 2019 respectively, and were used in the first-place solution of the Kinetics-600 challenge.
+
+- Provides a common skeleton for video classification tasks, so users can configure, train, and evaluate a model with a single command.
+
+## Installation
+
+Running the sample code in this repository requires PaddlePaddle Fluid v1.4.0 or later. If the PaddlePaddle version in your environment is lower, please update it following the [installation document](http://www.paddlepaddle.org/documentation/docs/zh/1.4/beginners_guide/install/index_cn.html).
+
+## Data preparation
+
+The video models use the YouTube-8M and Kinetics datasets; see the [data instructions](./dataset/README.md) for how to prepare them.
+
+## Quick start
+
+The repository provides a common train/test/infer framework: training and prediction can be launched with a single command by passing the model name and configuration to `train.py/test.py/infer.py`.
+
+Taking the StNet model as an example:
+
+Single-GPU training:
+
+``` bash
+export CUDA_VISIBLE_DEVICES=0
+python train.py --model_name=STNET \
+                --config=./configs/stnet.txt \
+                --save_dir=checkpoints
+```
+
+Multi-GPU training:
+
+``` bash
+export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7
+python train.py --model_name=STNET \
+                --config=./configs/stnet.txt \
+                --save_dir=checkpoints
+```
+
+Quick training scripts are also provided under `scripts/train`; for example:
+
+``` bash
+bash scripts/train/train_stnet.sh
+```
+
+- Please adjust `num_gpus` and `batch_size` in the `config` file to match the number of GPUs specified by `CUDA_VISIBLE_DEVICES`.
+
+## Repository structure
+
+### Code structure
+
+```
+configs/
+  stnet.txt
+  tsn.txt
+  ...
+dataset/
+  youtube/
+  kinetics/
+datareader/
+  feature_reader.py
+  kinetics_reader.py
+  ...
+metrics/
+  kinetics/
+  youtube8m/
+  ...
+models/
+  stnet/
+  tsn/
+  ...
+scripts/
+  train/
+  test/
+train.py
+test.py
+infer.py
+```
+
+- `configs`: configuration file templates for each model
+- `datareader`: data readers for the YouTube-8M and Kinetics datasets
+- `metrics`: evaluation scripts for the YouTube-8M and Kinetics datasets
+- `models`: network definitions of each model
+- `scripts`: quick training and evaluation scripts for each model
+- `train.py`: one-command training script; start training by specifying the model name and configuration file
+- `test.py`: one-command evaluation script; start evaluation by specifying the model name, configuration file, and model weights
+- `infer.py`: one-command inference script; start inference by specifying the model name, configuration file, model weights, and the file list to run inference on
+
+## Model Zoo
+
+- Models trained on the YouTube-8M dataset:
+
+| Model | Batch Size | Hardware | cuDNN version | GAP | Download |
+| :-------: | :---: | :---------: | :-----: | :----: | :----------: |
+| Attention Cluster | 2048 | 8x P40 | 7.1 | 0.84 | [model](https://paddlemodels.bj.bcebos.com/video_classification/attention_cluster_youtube8m.tar.gz) |
+| Attention LSTM | 1024 | 8x P40 | 7.1 | 0.86 | [model](https://paddlemodels.bj.bcebos.com/video_classification/attention_lstm_youtube8m.tar.gz) |
+| NeXtVLAD | 160 | 4x P40 | 7.1 | 0.87 | [model](https://paddlemodels.bj.bcebos.com/video_classification/nextvlad_youtube8m.tar.gz) |
+
+- Models trained on the Kinetics dataset:
+
+| Model | Batch Size | Hardware | cuDNN version | Top-1 | Download |
+| :-------: | :---: | :---------: | :----: | :----: | :----------: |
+| StNet | 128 | 8x P40 | 7.1 | 0.69 | [model](https://paddlemodels.bj.bcebos.com/video_classification/stnet_kinetics.tar.gz) |
+| TSN | 256 | 8x P40 | 7.1 | 0.67 | [model](https://paddlemodels.bj.bcebos.com/video_classification/tsn_kinetics.tar.gz) |
+| TSM | 128 | 8x P40 | 7.1 | 0.70 | [model](https://paddlemodels.bj.bcebos.com/video_classification/tsm_kinetics.tar.gz) |
+| Non-local | 64 | 8x P40 | 7.1 | 0.74 | [model](https://paddlemodels.bj.bcebos.com/video_classification/nonlocal_kinetics.tar.gz) |
+
+## References
+
+- [Attention Clusters: Purely Attention Based Local Feature Integration for Video Classification](https://arxiv.org/abs/1711.09550), Xiang Long, Chuang Gan, Gerard de Melo, Jiajun Wu, Xiao Liu, Shilei Wen
+- [Beyond Short Snippets: Deep Networks for Video Classification](https://arxiv.org/abs/1503.08909), Joe Yue-Hei Ng, Matthew Hausknecht, Sudheendra Vijayanarasimhan, Oriol Vinyals, Rajat Monga, George Toderici
+- [NeXtVLAD: An Efficient Neural Network to Aggregate Frame-level Features for Large-scale Video Classification](https://arxiv.org/abs/1811.05014), Rongcheng Lin, Jing Xiao, Jianping Fan
+- [StNet: Local and Global Spatial-Temporal Modeling for Human Action Recognition](https://arxiv.org/abs/1811.01549), Dongliang He, Zhichao Zhou, Chuang Gan, Fu Li, Xiao Liu, Yandong Li, Limin Wang, Shilei Wen
+- [Temporal Segment Networks: Towards Good Practices for Deep Action Recognition](https://arxiv.org/abs/1608.00859), Limin Wang, Yuanjun Xiong, Zhe Wang, Yu Qiao, Dahua Lin, Xiaoou Tang, Luc Van Gool
+- [Temporal Shift Module for Efficient Video Understanding](https://arxiv.org/abs/1811.08383v1), Ji Lin, Chuang Gan, Song Han
+- [Non-local Neural Networks](https://arxiv.org/abs/1711.07971v1), Xiaolong Wang, Ross Girshick, Abhinav Gupta, Kaiming He
+
+## Release notes
+
+- 3/2019: Added the model zoo and released five video classification models: Attention Cluster, Attention LSTM, NeXtVLAD, StNet, and TSN.
+- 4/2019: Released two more video classification models: Non-local and TSM.
diff --git a/PaddleCV/video/config.py b/PaddleCV/video/config.py
new file mode 100644
index 0000000000000000000000000000000000000000..d2983170039e4614d87210282dd221af570acd99
--- /dev/null
+++ b/PaddleCV/video/config.py
@@ -0,0 +1,70 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+try:
+ from configparser import ConfigParser
+except ImportError:
+ from ConfigParser import ConfigParser
+
+from utils import AttrDict
+
+import logging
+logger = logging.getLogger(__name__)
+
+CONFIG_SECS = [
+ 'train',
+ 'valid',
+ 'test',
+ 'infer',
+]
+
+
+def parse_config(cfg_file):
+ parser = ConfigParser()
+ cfg = AttrDict()
+ parser.read(cfg_file)
+ for sec in parser.sections():
+ sec_dict = AttrDict()
+ for k, v in parser.items(sec):
+ try:
+ v = eval(v)
+            except Exception:
+ pass
+ setattr(sec_dict, k, v)
+ setattr(cfg, sec.upper(), sec_dict)
+
+ return cfg
+
+
+def merge_configs(cfg, sec, args_dict):
+ assert sec in CONFIG_SECS, "invalid config section {}".format(sec)
+ sec_dict = getattr(cfg, sec.upper())
+ for k, v in args_dict.items():
+ if v is None:
+ continue
+ try:
+ if hasattr(sec_dict, k):
+ setattr(sec_dict, k, v)
+        except Exception:
+ pass
+ return cfg
+
+
+def print_configs(cfg, mode):
+ logger.info("---------------- {:>5} Arguments ----------------".format(mode))
+ for sec, sec_items in cfg.items():
+ logger.info("{}:".format(sec))
+ for k, v in sec_items.items():
+ logger.info(" {}:{}".format(k, v))
+ logger.info("-------------------------------------------------")
diff --git a/PaddleCV/video/configs/attention_cluster.txt b/PaddleCV/video/configs/attention_cluster.txt
new file mode 100644
index 0000000000000000000000000000000000000000..0ce7c4b213fe4008e9f02beeb124d11a0ec1f785
--- /dev/null
+++ b/PaddleCV/video/configs/attention_cluster.txt
@@ -0,0 +1,33 @@
+[MODEL]
+name = "AttentionCluster"
+dataset = "YouTube-8M"
+bone_network = None
+drop_rate = 0.5
+feature_num = 2
+feature_names = ['rgb', 'audio']
+feature_dims = [1024, 128]
+seg_num = 100
+cluster_nums = [32, 32]
+num_classes = 3862
+topk = 20
+
+[TRAIN]
+epoch = 5
+learning_rate = 0.001
+pretrain_base = None
+batch_size = 2048
+use_gpu = True
+num_gpus = 8
+filelist = "dataset/youtube8m/train.list"
+
+[VALID]
+batch_size = 2048
+filelist = "dataset/youtube8m/val.list"
+
+[TEST]
+batch_size = 256
+filelist = "dataset/youtube8m/test.list"
+
+[INFER]
+batch_size = 1
+filelist = "dataset/youtube8m/infer.list"
diff --git a/PaddleCV/video/configs/attention_lstm.txt b/PaddleCV/video/configs/attention_lstm.txt
new file mode 100644
index 0000000000000000000000000000000000000000..9154fe2c17282e1066f248a797b50ece080994e7
--- /dev/null
+++ b/PaddleCV/video/configs/attention_lstm.txt
@@ -0,0 +1,37 @@
+[MODEL]
+name = "AttentionLSTM"
+dataset = "YouTube-8M"
+bone_network = None
+drop_rate = 0.5
+feature_num = 2
+feature_names = ['rgb', 'audio']
+feature_dims = [1024, 128]
+embedding_size = 512
+lstm_size = 1024
+num_classes = 3862
+topk = 20
+
+[TRAIN]
+epoch = 10
+learning_rate = 0.001
+decay_epochs = [5]
+decay_gamma = 0.1
+weight_decay = 0.0008
+num_samples = 5000000
+pretrain_base = None
+batch_size = 1024
+use_gpu = True
+num_gpus = 8
+filelist = "dataset/youtube8m/train.list"
+
+[VALID]
+batch_size = 1024
+filelist = "dataset/youtube8m/val.list"
+
+[TEST]
+batch_size = 128
+filelist = "dataset/youtube8m/test.list"
+
+[INFER]
+batch_size = 1
+filelist = "dataset/youtube8m/infer.list"
diff --git a/PaddleCV/video/configs/nextvlad.txt b/PaddleCV/video/configs/nextvlad.txt
new file mode 100644
index 0000000000000000000000000000000000000000..18779b1f2eaf78cf9db3c25d5fbd991e16e2ed54
--- /dev/null
+++ b/PaddleCV/video/configs/nextvlad.txt
@@ -0,0 +1,39 @@
+[MODEL]
+name = "NEXTVLAD"
+num_classes = 3862
+topk = 20
+video_feature_size = 1024
+audio_feature_size = 128
+cluster_size = 128
+hidden_size = 2048
+groups = 8
+expansion = 2
+drop_rate = 0.5
+gating_reduction = 8
+eigen_file = "./dataset/youtube8m/yt8m_pca/eigenvals.npy"
+
+[TRAIN]
+epoch = 6
+learning_rate = 0.0002
+lr_boundary_examples = 2000000
+max_iter = 700000
+learning_rate_decay = 0.8
+l2_penalty = 1e-5
+gradient_clip_norm = 1.0
+use_gpu = True
+num_gpus = 4
+batch_size = 160
+filelist = "./dataset/youtube8m/train.list"
+
+[VALID]
+batch_size = 160
+filelist = "./dataset/youtube8m/val.list"
+
+[TEST]
+batch_size = 40
+filelist = "./dataset/youtube8m/test.list"
+
+[INFER]
+batch_size = 1
+filelist = "./dataset/youtube8m/infer.list"
+
diff --git a/PaddleCV/video/configs/nonlocal.txt b/PaddleCV/video/configs/nonlocal.txt
new file mode 100644
index 0000000000000000000000000000000000000000..ff7dc81c221130c34da842cead6cbdb85735ba5d
--- /dev/null
+++ b/PaddleCV/video/configs/nonlocal.txt
@@ -0,0 +1,92 @@
+[MODEL]
+name = "NONLOCAL"
+num_classes = 400
+image_mean = 114.75
+image_std = 57.375
+depth = 50
+dataset = 'kinetics400'
+video_arc_choice = 1
+use_affine = False
+fc_init_std = 0.01
+bn_momentum = 0.9
+bn_epsilon = 1.0e-5
+bn_init_gamma = 0.
+
+[RESNETS]
+num_groups = 1
+width_per_group = 64
+trans_func = bottleneck_transformation_3d
+
+[NONLOCAL]
+bn_momentum = 0.9
+bn_epsilon = 1.0e-5
+bn_init_gamma = 0.0
+layer_mod = 2
+conv3_nonlocal = True
+conv4_nonlocal = True
+conv_init_std = 0.01
+no_bias = 0
+use_maxpool = True
+use_softmax = True
+use_scale = True
+use_zero_init_conv = False
+use_bn = True
+use_affine = False
+
+[TRAIN]
+num_reader_threads = 8
+batch_size = 64
+num_gpus = 8
+filelist = './dataset/nonlocal/trainlist.txt'
+crop_size = 224
+sample_rate = 8
+video_length = 8
+jitter_scales = [256, 320]
+
+dropout_rate = 0.5
+
+learning_rate = 0.01
+learning_rate_decay = 0.1
+step_sizes = [150000, 150000, 100000]
+max_iter = 400000
+
+weight_decay = 0.0001
+weight_decay_bn = 0.0
+momentum = 0.9
+nesterov = True
+scale_momentum = True
+
+[VALID]
+num_reader_threads = 8
+batch_size = 64
+filelist = './dataset/nonlocal/vallist.txt'
+crop_size = 224
+sample_rate = 8
+video_length = 8
+jitter_scales = [256, 320]
+
+[TEST]
+num_reader_threads = 8
+batch_size = 4
+filelist = 'dataset/nonlocal/testlist.txt'
+filename_gt = 'dataset/nonlocal/vallist.txt'
+checkpoint_dir = './output'
+crop_size = 256
+sample_rate = 8
+video_length = 8
+jitter_scales = [256, 256]
+num_test_clips = 30
+dataset_size = 19761
+use_multi_crop = 1
+
+[INFER]
+num_reader_threads = 8
+batch_size = 1
+filelist = 'dataset/nonlocal/inferlist.txt'
+crop_size = 256
+sample_rate = 8
+video_length = 8
+jitter_scales = [256, 256]
+num_test_clips = 30
+use_multi_crop = 1
+
diff --git a/PaddleCV/video/configs/stnet.txt b/PaddleCV/video/configs/stnet.txt
new file mode 100644
index 0000000000000000000000000000000000000000..7be17834d18ec157aec5738b11b5d68871430892
--- /dev/null
+++ b/PaddleCV/video/configs/stnet.txt
@@ -0,0 +1,53 @@
+[MODEL]
+name = "STNET"
+format = "pkl"
+num_classes = 400
+seg_num = 7
+seglen = 5
+image_mean = [0.485, 0.456, 0.406]
+image_std = [0.229, 0.224, 0.225]
+num_layers = 50
+
+[TRAIN]
+epoch = 60
+short_size = 256
+target_size = 224
+num_reader_threads = 12
+buf_size = 1024
+batch_size = 128
+num_gpus = 8
+use_gpu = True
+filelist = "./dataset/kinetics/train.list"
+learning_rate = 0.01
+learning_rate_decay = 0.1
+l2_weight_decay = 1e-4
+momentum = 0.9
+total_videos = 224684
+pretrain_base = "./dataset/pretrained/ResNet50_pretrained"
+
+[VALID]
+short_size = 256
+target_size = 224
+num_reader_threads = 12
+buf_size = 1024
+batch_size = 128
+filelist = "./dataset/kinetics/val.list"
+
+[TEST]
+seg_num = 25
+short_size = 256
+target_size = 256
+num_reader_threads = 12
+buf_size = 1024
+batch_size = 4
+filelist = "./dataset/kinetics/test.list"
+
+[INFER]
+seg_num = 25
+short_size = 256
+target_size = 256
+num_reader_threads = 12
+buf_size = 1024
+batch_size = 1
+filelist = "./dataset/kinetics/infer.list"
+
diff --git a/PaddleCV/video/configs/tsm.txt b/PaddleCV/video/configs/tsm.txt
new file mode 100755
index 0000000000000000000000000000000000000000..6410bee576074b4756b6617d96a8ec97133a7ef9
--- /dev/null
+++ b/PaddleCV/video/configs/tsm.txt
@@ -0,0 +1,51 @@
+[MODEL]
+name = "TSM"
+format = "pkl"
+num_classes = 400
+seg_num = 8
+seglen = 1
+image_mean = [0.485, 0.456, 0.406]
+image_std = [0.229, 0.224, 0.225]
+num_layers = 50
+
+[TRAIN]
+epoch = 65
+short_size = 256
+target_size = 224
+num_reader_threads = 12
+buf_size = 1024
+batch_size = 128
+use_gpu = True
+num_gpus = 8
+filelist = "./dataset/kinetics/train.list"
+learning_rate = 0.01
+learning_rate_decay = 0.1
+decay_epochs = [40, 60]
+l2_weight_decay = 1e-4
+momentum = 0.9
+total_videos = 239781
+
+[VALID]
+short_size = 256
+target_size = 224
+num_reader_threads = 12
+buf_size = 1024
+batch_size = 128
+filelist = "./dataset/kinetics/val.list"
+
+[TEST]
+short_size = 256
+target_size = 224
+num_reader_threads = 12
+buf_size = 1024
+batch_size = 16
+filelist = "./dataset/kinetics/test.list"
+
+[INFER]
+short_size = 256
+target_size = 224
+num_reader_threads = 12
+buf_size = 1024
+batch_size = 1
+filelist = "./dataset/kinetics/infer.list"
+
diff --git a/PaddleCV/video/configs/tsn.txt b/PaddleCV/video/configs/tsn.txt
new file mode 100644
index 0000000000000000000000000000000000000000..d19353228f7c779092665552d8ae945c666d4882
--- /dev/null
+++ b/PaddleCV/video/configs/tsn.txt
@@ -0,0 +1,51 @@
+[MODEL]
+name = "TSN"
+format = "pkl"
+num_classes = 400
+seg_num = 3
+seglen = 1
+image_mean = [0.485, 0.456, 0.406]
+image_std = [0.229, 0.224, 0.225]
+num_layers = 50
+
+[TRAIN]
+epoch = 45
+short_size = 256
+target_size = 224
+num_reader_threads = 12
+buf_size = 1024
+batch_size = 256
+use_gpu = True
+num_gpus = 8
+filelist = "./dataset/kinetics/train.list"
+learning_rate = 0.01
+learning_rate_decay = 0.1
+l2_weight_decay = 1e-4
+momentum = 0.9
+total_videos = 224684
+
+[VALID]
+short_size = 256
+target_size = 224
+num_reader_threads = 12
+buf_size = 1024
+batch_size = 256
+filelist = "./dataset/kinetics/val.list"
+
+[TEST]
+seg_num = 7
+short_size = 256
+target_size = 224
+num_reader_threads = 12
+buf_size = 1024
+batch_size = 16
+filelist = "./dataset/kinetics/test.list"
+
+[INFER]
+short_size = 256
+target_size = 224
+num_reader_threads = 12
+buf_size = 1024
+batch_size = 1
+filelist = "./dataset/kinetics/infer.list"
+
diff --git a/PaddleCV/video/datareader/__init__.py b/PaddleCV/video/datareader/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..ee898672c9a6f7ed57f73a324da3c23c48d994c1
--- /dev/null
+++ b/PaddleCV/video/datareader/__init__.py
@@ -0,0 +1,13 @@
+from .reader_utils import regist_reader, get_reader
+from .feature_reader import FeatureReader
+from .kinetics_reader import KineticsReader
+from .nonlocal_reader import NonlocalReader
+
+# register readers, sorted alphabetically
+regist_reader("ATTENTIONCLUSTER", FeatureReader)
+regist_reader("ATTENTIONLSTM", FeatureReader)
+regist_reader("NEXTVLAD", FeatureReader)
+regist_reader("NONLOCAL", NonlocalReader)
+regist_reader("STNET", KineticsReader)
+regist_reader("TSM", KineticsReader)
+regist_reader("TSN", KineticsReader)
diff --git a/PaddleCV/video/datareader/feature_reader.py b/PaddleCV/video/datareader/feature_reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..9f465c09474446529dd6804a26d3c71204b2fcfa
--- /dev/null
+++ b/PaddleCV/video/datareader/feature_reader.py
@@ -0,0 +1,135 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import sys
+from .reader_utils import DataReader
+try:
+ import cPickle as pickle
+ from cStringIO import StringIO
+except ImportError:
+ import pickle
+ from io import BytesIO
+import numpy as np
+import random
+
+python_ver = sys.version_info
+
+
+class FeatureReader(DataReader):
+ """
+    Data reader for the YouTube-8M dataset, which is stored as features extracted by prior networks.
+    This is for three models: lstm, attention cluster and nextvlad.
+
+ dataset cfg: num_classes
+ batch_size
+ list
+ NextVlad only: eigen_file
+ """
+
+ def __init__(self, name, mode, cfg):
+ self.name = name
+ self.mode = mode
+ self.num_classes = cfg.MODEL.num_classes
+
+ # set batch size and file list
+ self.batch_size = cfg[mode.upper()]['batch_size']
+ self.filelist = cfg[mode.upper()]['filelist']
+ self.eigen_file = cfg.MODEL.get('eigen_file', None)
+ self.seg_num = cfg.MODEL.get('seg_num', None)
+
+ def create_reader(self):
+ fl = open(self.filelist).readlines()
+ fl = [line.strip() for line in fl if line.strip() != '']
+ if self.mode == 'train':
+ random.shuffle(fl)
+
+ def reader():
+ batch_out = []
+ for filepath in fl:
+ if python_ver < (3, 0):
+ data = pickle.load(open(filepath, 'rb'))
+ else:
+ data = pickle.load(open(filepath, 'rb'), encoding='bytes')
+ indexes = list(range(len(data)))
+ if self.mode == 'train':
+ random.shuffle(indexes)
+ for i in indexes:
+ record = data[i]
+ nframes = record[b'nframes']
+ rgb = record[b'feature'].astype(float)
+ audio = record[b'audio'].astype(float)
+ if self.mode != 'infer':
+ label = record[b'label']
+ one_hot_label = make_one_hot(label, self.num_classes)
+ video = record[b'video']
+
+ rgb = rgb[0:nframes, :]
+ audio = audio[0:nframes, :]
+
+ rgb = dequantize(
+ rgb, max_quantized_value=2., min_quantized_value=-2.)
+ audio = dequantize(
+ audio, max_quantized_value=2, min_quantized_value=-2)
+
+ if self.name == 'NEXTVLAD':
+ # add the effect of eigen values
+ eigen_file = self.eigen_file
+ eigen_val = np.sqrt(np.load(eigen_file)
+ [:1024, 0]).astype(np.float32)
+ eigen_val = eigen_val + 1e-4
+ rgb = (rgb - 4. / 512) * eigen_val
+ if self.name == 'ATTENTIONCLUSTER':
+ sample_inds = generate_random_idx(rgb.shape[0],
+ self.seg_num)
+ rgb = rgb[sample_inds]
+ audio = audio[sample_inds]
+ if self.mode != 'infer':
+ batch_out.append((rgb, audio, one_hot_label))
+ else:
+ batch_out.append((rgb, audio, video))
+ if len(batch_out) == self.batch_size:
+ yield batch_out
+ batch_out = []
+
+ return reader
+
+
+def dequantize(feat_vector, max_quantized_value=2., min_quantized_value=-2.):
+ """
+ Dequantize the feature from the byte format to the float format
+ """
+
+ assert max_quantized_value > min_quantized_value
+ quantized_range = max_quantized_value - min_quantized_value
+ scalar = quantized_range / 255.0
+ bias = (quantized_range / 512.0) + min_quantized_value
+
+ return feat_vector * scalar + bias
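+
+# Worked example: with the default range [-2., 2.], scalar = 4/255 and
+# bias = 4/512 - 2, so byte 0 maps to about -1.992 and byte 255 to about 2.008.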
+
+
+def make_one_hot(label, dim=3862):
+ one_hot_label = np.zeros(dim)
+ one_hot_label = one_hot_label.astype(float)
+ for ind in label:
+ one_hot_label[int(ind)] = 1
+ return one_hot_label
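+
+# Example: make_one_hot([3, 5], dim=8) -> [0., 0., 0., 1., 0., 1., 0., 0.]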
+
+
+def generate_random_idx(feature_len, seg_num):
+ idxs = []
+ stride = float(feature_len) / seg_num
+ for i in range(seg_num):
+ pos = (i + np.random.random()) * stride
+ idxs.append(min(feature_len - 1, int(pos)))
+ return idxs
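+
+# Example: feature_len=30, seg_num=3 gives stride 10.0, so one random index
+# is drawn from each of [0, 10), [10, 20) and [20, 30).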
diff --git a/PaddleCV/video/datareader/kinetics_reader.py b/PaddleCV/video/datareader/kinetics_reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..236ff8205c15b0d51a9251e619473509cd31b443
--- /dev/null
+++ b/PaddleCV/video/datareader/kinetics_reader.py
@@ -0,0 +1,420 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import sys
+import math
+import random
+import functools
+try:
+ import cPickle as pickle
+ from cStringIO import StringIO
+except ImportError:
+ import pickle
+ from io import BytesIO
+import numpy as np
+import cv2  # needed by mp4_loader below
+import paddle
+from PIL import Image, ImageEnhance
+import logging
+
+from .reader_utils import DataReader
+
+logger = logging.getLogger(__name__)
+python_ver = sys.version_info
+
+
+class KineticsReader(DataReader):
+ """
+    Data reader for the Kinetics dataset, supporting two formats: mp4 and pkl.
+    1. mp4, the original format of Kinetics400
+    2. pkl, the mp4 decoded in advance and stored as pkl
+    In both cases, the reader loads the data and returns the frame data as a numpy array together with an integer label.
+ dataset cfg: format
+ num_classes
+ seg_num
+ short_size
+ target_size
+ num_reader_threads
+ buf_size
+ image_mean
+ image_std
+ batch_size
+ list
+ """
+
+ def __init__(self, name, mode, cfg):
+ super(KineticsReader, self).__init__(name, mode, cfg)
+ self.format = cfg.MODEL.format
+ self.num_classes = self.get_config_from_sec('model', 'num_classes')
+ self.seg_num = self.get_config_from_sec('model', 'seg_num')
+ self.seglen = self.get_config_from_sec('model', 'seglen')
+
+ self.seg_num = self.get_config_from_sec(mode, 'seg_num', self.seg_num)
+ self.short_size = self.get_config_from_sec(mode, 'short_size')
+ self.target_size = self.get_config_from_sec(mode, 'target_size')
+ self.num_reader_threads = self.get_config_from_sec(mode, 'num_reader_threads')
+ self.buf_size = self.get_config_from_sec(mode, 'buf_size')
+
+ self.img_mean = np.array(cfg.MODEL.image_mean).reshape(
+ [3, 1, 1]).astype(np.float32)
+ self.img_std = np.array(cfg.MODEL.image_std).reshape(
+ [3, 1, 1]).astype(np.float32)
+ # set batch size and file list
+ self.batch_size = cfg[mode.upper()]['batch_size']
+ self.filelist = cfg[mode.upper()]['filelist']
+
+ def create_reader(self):
+ _reader = self._reader_creator(self.filelist, self.mode, seg_num=self.seg_num, seglen = self.seglen, \
+ short_size = self.short_size, target_size = self.target_size, \
+ img_mean = self.img_mean, img_std = self.img_std, \
+ shuffle = (self.mode == 'train'), \
+ num_threads = self.num_reader_threads, \
+ buf_size = self.buf_size, format = self.format)
+
+ def _batch_reader():
+ batch_out = []
+ for imgs, label in _reader():
+ if imgs is None:
+ continue
+ batch_out.append((imgs, label))
+ if len(batch_out) == self.batch_size:
+ yield batch_out
+ batch_out = []
+
+ return _batch_reader
+
+
+ def _reader_creator(self,
+ pickle_list,
+ mode,
+ seg_num,
+ seglen,
+ short_size,
+ target_size,
+ img_mean,
+ img_std,
+ shuffle=False,
+ num_threads=1,
+ buf_size=1024,
+ format='pkl'):
+ def decode_mp4(sample, mode, seg_num, seglen, short_size, target_size, img_mean,
+ img_std):
+ sample = sample[0].split(' ')
+ mp4_path = sample[0]
+ # when infer, we store vid as label
+ label = int(sample[1])
+ try:
+ imgs = mp4_loader(mp4_path, seg_num, seglen, mode)
+ if len(imgs) < 1:
+ logger.error('{} frame length {} less than 1.'.format(mp4_path,
+ len(imgs)))
+ return None, None
+            except Exception:
+                logger.error('Error when loading {}'.format(mp4_path))
+                return None, None
+
+ return imgs_transform(imgs, label, mode, seg_num, seglen, \
+ short_size, target_size, img_mean, img_std)
+
+
+ def decode_pickle(sample, mode, seg_num, seglen, short_size, target_size,
+ img_mean, img_std):
+ pickle_path = sample[0]
+ try:
+ if python_ver < (3, 0):
+ data_loaded = pickle.load(open(pickle_path, 'rb'))
+ else:
+ data_loaded = pickle.load(open(pickle_path, 'rb'), encoding='bytes')
+
+ vid, label, frames = data_loaded
+ if len(frames) < 1:
+ logger.error('{} frame length {} less than 1.'.format(pickle_path,
+ len(frames)))
+ return None, None
+            except Exception:
+                logger.error('Error when loading {}'.format(pickle_path))
+                return None, None
+
+            if mode in ('train', 'valid', 'test'):
+                ret_label = label
+            elif mode == 'infer':
+                ret_label = vid
+
+ imgs = video_loader(frames, seg_num, seglen, mode)
+ return imgs_transform(imgs, ret_label, mode, seg_num, seglen, \
+ short_size, target_size, img_mean, img_std)
+
+
+ def imgs_transform(imgs, label, mode, seg_num, seglen, short_size, target_size,
+ img_mean, img_std):
+ imgs = group_scale(imgs, short_size)
+
+ if mode == 'train':
+ if self.name == "TSM":
+ imgs = group_multi_scale_crop(imgs, short_size)
+ imgs = group_random_crop(imgs, target_size)
+ imgs = group_random_flip(imgs)
+ else:
+ imgs = group_center_crop(imgs, target_size)
+
+ np_imgs = (np.array(imgs[0]).astype('float32').transpose(
+ (2, 0, 1))).reshape(1, 3, target_size, target_size) / 255
+ for i in range(len(imgs) - 1):
+ img = (np.array(imgs[i + 1]).astype('float32').transpose(
+ (2, 0, 1))).reshape(1, 3, target_size, target_size) / 255
+ np_imgs = np.concatenate((np_imgs, img))
+ imgs = np_imgs
+ imgs -= img_mean
+ imgs /= img_std
+ imgs = np.reshape(imgs, (seg_num, seglen * 3, target_size, target_size))
+
+ return imgs, label
+
+
+ def reader():
+ with open(pickle_list) as flist:
+ lines = [line.strip() for line in flist]
+ if shuffle:
+ random.shuffle(lines)
+ for line in lines:
+ pickle_path = line.strip()
+ yield [pickle_path]
+
+ if format == 'pkl':
+ decode_func = decode_pickle
+ elif format == 'mp4':
+ decode_func = decode_mp4
+ else:
+ raise "Not implemented format {}".format(format)
+
+ mapper = functools.partial(
+ decode_func,
+ mode=mode,
+ seg_num=seg_num,
+ seglen=seglen,
+ short_size=short_size,
+ target_size=target_size,
+ img_mean=img_mean,
+ img_std=img_std)
+
+ return paddle.reader.xmap_readers(mapper, reader, num_threads, buf_size)
+
+
+def group_multi_scale_crop(img_group, target_size, scales=None, \
+ max_distort=1, fix_crop=True, more_fix_crop=True):
+ scales = scales if scales is not None else [1, .875, .75, .66]
+ input_size = [target_size, target_size]
+
+ im_size = img_group[0].size
+
+ # get random crop offset
+ def _sample_crop_size(im_size):
+ image_w, image_h = im_size[0], im_size[1]
+
+ base_size = min(image_w, image_h)
+ crop_sizes = [int(base_size * x) for x in scales]
+ crop_h = [input_size[1] if abs(x - input_size[1]) < 3 else x for x in crop_sizes]
+ crop_w = [input_size[0] if abs(x - input_size[0]) < 3 else x for x in crop_sizes]
+
+ pairs = []
+ for i, h in enumerate(crop_h):
+ for j, w in enumerate(crop_w):
+ if abs(i - j) <= max_distort:
+ pairs.append((w, h))
+
+ crop_pair = random.choice(pairs)
+ if not fix_crop:
+ w_offset = random.randint(0, image_w - crop_pair[0])
+ h_offset = random.randint(0, image_h - crop_pair[1])
+ else:
+            w_step = (image_w - crop_pair[0]) // 4
+            h_step = (image_h - crop_pair[1]) // 4
+
+ ret = list()
+ ret.append((0, 0)) # upper left
+ if w_step != 0:
+ ret.append((4 * w_step, 0)) # upper right
+ if h_step != 0:
+ ret.append((0, 4 * h_step)) # lower left
+ if h_step != 0 and w_step != 0:
+ ret.append((4 * w_step, 4 * h_step)) # lower right
+ if h_step != 0 or w_step != 0:
+ ret.append((2 * w_step, 2 * h_step)) # center
+
+ if more_fix_crop:
+ ret.append((0, 2 * h_step)) # center left
+ ret.append((4 * w_step, 2 * h_step)) # center right
+ ret.append((2 * w_step, 4 * h_step)) # lower center
+ ret.append((2 * w_step, 0 * h_step)) # upper center
+
+ ret.append((1 * w_step, 1 * h_step)) # upper left quarter
+ ret.append((3 * w_step, 1 * h_step)) # upper right quarter
+ ret.append((1 * w_step, 3 * h_step)) # lower left quarter
+                    ret.append((3 * w_step, 3 * h_step)) # lower right quarter
+
+ w_offset, h_offset = random.choice(ret)
+
+ return crop_pair[0], crop_pair[1], w_offset, h_offset
+
+ crop_w, crop_h, offset_w, offset_h = _sample_crop_size(im_size)
+ crop_img_group = [img.crop((offset_w, offset_h, offset_w + crop_w, offset_h + crop_h)) for img in img_group]
+ ret_img_group = [img.resize((input_size[0], input_size[1]), Image.BILINEAR) for img in crop_img_group]
+
+ return ret_img_group
+
+
+def group_random_crop(img_group, target_size):
+ w, h = img_group[0].size
+ th, tw = target_size, target_size
+
+ assert (w >= target_size) and (h >= target_size), \
+ "image width({}) and height({}) should be larger than crop size".format(w, h, target_size)
+
+ out_images = []
+ x1 = random.randint(0, w - tw)
+ y1 = random.randint(0, h - th)
+
+ for img in img_group:
+ if w == tw and h == th:
+ out_images.append(img)
+ else:
+ out_images.append(img.crop((x1, y1, x1 + tw, y1 + th)))
+
+ return out_images
+
+
+def group_random_flip(img_group):
+ v = random.random()
+ if v < 0.5:
+ ret = [img.transpose(Image.FLIP_LEFT_RIGHT) for img in img_group]
+ return ret
+ else:
+ return img_group
+
+
+def group_center_crop(img_group, target_size):
+ img_crop = []
+ for img in img_group:
+ w, h = img.size
+ th, tw = target_size, target_size
+ assert (w >= target_size) and (h >= target_size), \
+ "image width({}) and height({}) should be larger than crop size".format(w, h, target_size)
+ x1 = int(round((w - tw) / 2.))
+ y1 = int(round((h - th) / 2.))
+ img_crop.append(img.crop((x1, y1, x1 + tw, y1 + th)))
+
+ return img_crop
+
+
+def group_scale(imgs, target_size):
+ resized_imgs = []
+ for i in range(len(imgs)):
+ img = imgs[i]
+ w, h = img.size
+ if (w <= h and w == target_size) or (h <= w and h == target_size):
+ resized_imgs.append(img)
+ continue
+
+ if w < h:
+ ow = target_size
+ oh = int(target_size * 4.0 / 3.0)
+ resized_imgs.append(img.resize((ow, oh), Image.BILINEAR))
+ else:
+ oh = target_size
+ ow = int(target_size * 4.0 / 3.0)
+ resized_imgs.append(img.resize((ow, oh), Image.BILINEAR))
+
+ return resized_imgs
+
+
+def imageloader(buf):
+ if isinstance(buf, str):
+ img = Image.open(StringIO(buf))
+ else:
+ img = Image.open(BytesIO(buf))
+
+ return img.convert('RGB')
+
+
+def video_loader(frames, nsample, seglen, mode):
+ videolen = len(frames)
+ average_dur = int(videolen / nsample)
+
+ imgs = []
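+    # one window of seglen consecutive frames is drawn from each of the
+    # nsample segments (each about average_dur frames long): training picks a
+    # random offset inside the segment, valid/test use the center offset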
+ for i in range(nsample):
+ idx = 0
+ if mode == 'train':
+ if average_dur >= seglen:
+ idx = random.randint(0, average_dur - seglen)
+ idx += i * average_dur
+ elif average_dur >= 1:
+ idx += i * average_dur
+ else:
+ idx = i
+ else:
+ if average_dur >= seglen:
+ idx = (average_dur - seglen) // 2
+ idx += i * average_dur
+ elif average_dur >= 1:
+ idx += i * average_dur
+ else:
+ idx = i
+
+ for jj in range(idx, idx + seglen):
+ imgbuf = frames[int(jj % videolen)]
+ img = imageloader(imgbuf)
+ imgs.append(img)
+
+ return imgs
+
+
+def mp4_loader(filepath, nsample, seglen, mode):
+ cap = cv2.VideoCapture(filepath)
+ videolen = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
+ average_dur = int(videolen / nsample)
+ sampledFrames = []
+ for i in range(videolen):
+ ret, frame = cap.read()
+        # frames at the head of the video may fail to decode
+        if not ret:
+            continue
+ img = frame[:, :, ::-1]
+ sampledFrames.append(img)
+
+ imgs = []
+ for i in range(nsample):
+ idx = 0
+ if mode == 'train':
+ if average_dur >= seglen:
+ idx = random.randint(0, average_dur - seglen)
+ idx += i * average_dur
+ elif average_dur >= 1:
+ idx += i * average_dur
+ else:
+ idx = i
+ else:
+ if average_dur >= seglen:
+ idx = (average_dur - 1) // 2
+ idx += i * average_dur
+ elif average_dur >= 1:
+ idx += i * average_dur
+ else:
+ idx = i
+
+ for jj in range(idx, idx + seglen):
+ imgbuf = sampledFrames[int(jj % videolen)]
+ img = Image.fromarray(imgbuf, mode='RGB')
+ imgs.append(img)
+
+ return imgs
diff --git a/PaddleCV/video/datareader/nonlocal_reader.py b/PaddleCV/video/datareader/nonlocal_reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..8e13c994dc4e16f0ec0a0ad33ac8b811b94682ec
--- /dev/null
+++ b/PaddleCV/video/datareader/nonlocal_reader.py
@@ -0,0 +1,338 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import random
+import time
+import multiprocessing
+import numpy as np
+import cv2
+import logging
+
+from .reader_utils import DataReader
+
+logger = logging.getLogger(__name__)
+
+
+class NonlocalReader(DataReader):
+ """
+    Data reader for the Kinetics dataset, which reads mp4 files and decodes them into numpy arrays.
+    This is for the Non-local neural network model.
+ cfg: num_classes
+ num_reader_threads
+ image_mean
+ image_std
+ batch_size
+ filelist
+ crop_size
+ sample_rate
+ video_length
+ jitter_scales
+ Test only cfg: num_test_clips
+ use_multi_crop
+ """
+
+ def __init__(self, name, mode, cfg):
+ self.name = name
+ self.mode = mode
+ self.cfg = cfg
+
+ def create_reader(self):
+ cfg = self.cfg
+ mode = self.mode
+ num_reader_threads = cfg[mode.upper()]['num_reader_threads']
+ assert num_reader_threads >=1, \
+ "number of reader threads({}) should be a positive integer".format(num_reader_threads)
+ if num_reader_threads == 1:
+ reader_func = make_reader
+ else:
+ reader_func = make_multi_reader
+
+ dataset_args = {}
+ dataset_args['image_mean'] = cfg.MODEL.image_mean
+ dataset_args['image_std'] = cfg.MODEL.image_std
+ dataset_args['crop_size'] = cfg[mode.upper()]['crop_size']
+ dataset_args['sample_rate'] = cfg[mode.upper()]['sample_rate']
+ dataset_args['video_length'] = cfg[mode.upper()]['video_length']
+ dataset_args['min_size'] = cfg[mode.upper()]['jitter_scales'][0]
+ dataset_args['max_size'] = cfg[mode.upper()]['jitter_scales'][1]
+ dataset_args['num_reader_threads'] = num_reader_threads
+ filelist = cfg[mode.upper()]['filelist']
+ batch_size = cfg[mode.upper()]['batch_size']
+
+ if self.mode == 'train':
+ sample_times = 1
+ return reader_func(filelist, batch_size, sample_times, True, True,
+ **dataset_args)
+ elif self.mode == 'valid':
+ sample_times = 1
+ return reader_func(filelist, batch_size, sample_times, False, False,
+ **dataset_args)
+ elif self.mode == 'test' or self.mode == 'infer':
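+            # each temporal sampling is combined with 3 (use_multi_crop=1) or
+            # 6 (use_multi_crop=2) spatial crops, so num_test_clips is divided
+            # by the crop count; e.g. 30 clips = 10 temporal samplings x 3 crops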
+ sample_times = cfg['TEST']['num_test_clips']
+ if cfg['TEST']['use_multi_crop'] == 1:
+ sample_times = int(sample_times / 3)
+ if cfg['TEST']['use_multi_crop'] == 2:
+ sample_times = int(sample_times / 6)
+ return reader_func(filelist, batch_size, sample_times, False, False,
+ **dataset_args)
+ else:
+ logger.info('Not implemented')
+ raise NotImplementedError
+
+
+def video_fast_get_frame(video_path,
+ sampling_rate=1,
+ length=64,
+ start_frm=-1,
+ sample_times=1):
+ cap = cv2.VideoCapture(video_path)
+ frame_cnt = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
+ width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
+ height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
+
+ sampledFrames = []
+
+ video_output = np.ndarray(shape=[length, height, width, 3], dtype=np.uint8)
+
+ use_start_frm = start_frm
+ if start_frm < 0:
+ if (frame_cnt - length * sampling_rate > 0):
+ use_start_frm = random.randint(0,
+ frame_cnt - length * sampling_rate)
+ else:
+ use_start_frm = 0
+ else:
+ frame_gaps = float(frame_cnt) / float(sample_times)
+ use_start_frm = int(frame_gaps * start_frm) % frame_cnt
+
+ for i in range(frame_cnt):
+ ret, frame = cap.read()
+        # frames at the head of the video may fail to decode
+        if not ret:
+            continue
+ img = frame[:, :, ::-1]
+ sampledFrames.append(img)
+
+ for idx in range(length):
+ i = use_start_frm + idx * sampling_rate
+ i = i % len(sampledFrames)
+ video_output[idx] = sampledFrames[i]
+
+ cap.release()
+ return video_output
+
+
+def apply_resize(rgbdata, min_size, max_size):
+ length, height, width, channel = rgbdata.shape
+ ratio = 1.0
+ # generate random scale between [min_size, max_size]
+ if min_size == max_size:
+ side_length = min_size
+ else:
+ side_length = np.random.randint(min_size, max_size)
+ if height > width:
+ ratio = float(side_length) / float(width)
+ else:
+ ratio = float(side_length) / float(height)
+ out_height = int(round(height * ratio))
+ out_width = int(round(width * ratio))
+ outdata = np.zeros(
+ (length, out_height, out_width, channel), dtype=rgbdata.dtype)
+ for i in range(length):
+ outdata[i] = cv2.resize(rgbdata[i], (out_width, out_height))
+ return outdata
+
+
+def crop_mirror_transform(rgbdata,
+ mean,
+ std,
+ cropsize=224,
+ use_mirror=True,
+ center_crop=False,
+ spatial_pos=-1):
+ channel, length, height, width = rgbdata.shape
+ assert height >= cropsize, "crop size should not be larger than video height"
+ assert width >= cropsize, "crop size should not be larger than video width"
+ # crop to specific scale
+ if center_crop:
+ h_off = int((height - cropsize) / 2)
+ w_off = int((width - cropsize) / 2)
+ if spatial_pos >= 0:
+ now_pos = spatial_pos % 3
+ if h_off > 0:
+ h_off = h_off * now_pos
+ else:
+ w_off = w_off * now_pos
+ else:
+ h_off = np.random.randint(0, height - cropsize)
+ w_off = np.random.randint(0, width - cropsize)
+ outdata = np.zeros(
+ (channel, length, cropsize, cropsize), dtype=rgbdata.dtype)
+ outdata[:, :, :, :] = rgbdata[:, :, h_off:h_off + cropsize, w_off:w_off +
+ cropsize]
+ # apply mirror
+ mirror_indicator = (np.random.rand() > 0.5)
+ mirror_me = use_mirror and mirror_indicator
+ if spatial_pos > 0:
+ mirror_me = (int(spatial_pos / 3) > 0)
+ if mirror_me:
+ outdata = outdata[:, :, :, ::-1]
+ # substract mean and divide std
+ outdata = outdata.astype(np.float32)
+ outdata = (outdata - mean) / std
+ return outdata
+
+
+def make_reader(filelist, batch_size, sample_times, is_training, shuffle,
+ **dataset_args):
+ def reader():
+ fl = open(filelist).readlines()
+ fl = [line.strip() for line in fl if line.strip() != '']
+
+ if shuffle:
+ random.shuffle(fl)
+
+ batch_out = []
+ for line in fl:
+ line_items = line.split(' ')
+ fn = line_items[0]
+ label = int(line_items[1])
+ if len(line_items) > 2:
+ start_frm = int(line_items[2])
+ spatial_pos = int(line_items[3])
+ in_sample_times = sample_times
+ else:
+ start_frm = -1
+ spatial_pos = -1
+ in_sample_times = 1
+ label = np.array([label]).astype(np.int64)
+ # 1, get rgb data for fixed length of frames
+ try:
+ rgbdata = video_fast_get_frame(fn, \
+ sampling_rate = dataset_args['sample_rate'], length = dataset_args['video_length'], \
+ start_frm = start_frm, sample_times = in_sample_times)
+ except:
+ logger.info('Error when loading {}, just skip this file'.format(
+ fn))
+ continue
+            # apply preprocessing
+            # 2, resize so the short side is randomly scaled into [min_size, max_size] when training, or to the fixed TEST jitter_scales at test time
+ min_size = dataset_args['min_size']
+ max_size = dataset_args['max_size']
+ rgbdata = apply_resize(rgbdata, min_size, max_size)
+ # transform [length, height, width, channel] to [channel, length, height, width]
+ rgbdata = np.transpose(rgbdata, [3, 0, 1, 2])
+
+ # 3 crop, mirror and transform
+ rgbdata = crop_mirror_transform(rgbdata, mean = dataset_args['image_mean'], \
+ std = dataset_args['image_std'], cropsize = dataset_args['crop_size'], \
+ use_mirror = is_training, center_crop = (not is_training), \
+ spatial_pos = spatial_pos)
+
+ batch_out.append((rgbdata, label))
+ if len(batch_out) == batch_size:
+ yield batch_out
+ batch_out = []
+
+ return reader
+
+
+def make_multi_reader(filelist, batch_size, sample_times, is_training, shuffle,
+ **dataset_args):
+ def read_into_queue(flq, queue):
+ batch_out = []
+ for line in flq:
+ line_items = line.split(' ')
+ fn = line_items[0]
+ label = int(line_items[1])
+ if len(line_items) > 2:
+ start_frm = int(line_items[2])
+ spatial_pos = int(line_items[3])
+ in_sample_times = sample_times
+ else:
+ start_frm = -1
+ spatial_pos = -1
+ in_sample_times = 1
+ label = np.array([label]).astype(np.int64)
+ # 1, get rgb data for fixed length of frames
+ try:
+ rgbdata = video_fast_get_frame(fn, \
+ sampling_rate = dataset_args['sample_rate'], length = dataset_args['video_length'], \
+ start_frm = start_frm, sample_times = in_sample_times)
+ except:
+ logger.info('Error when loading {}, just skip this file'.format(
+ fn))
+ continue
+            # apply preprocessing
+            # 2, resize so the short side is randomly scaled into [min_size, max_size] when training, or to the fixed TEST jitter_scales at test time
+ min_size = dataset_args['min_size']
+ max_size = dataset_args['max_size']
+ rgbdata = apply_resize(rgbdata, min_size, max_size)
+ # transform [length, height, width, channel] to [channel, length, height, width]
+ rgbdata = np.transpose(rgbdata, [3, 0, 1, 2])
+
+ # 3 crop, mirror and transform
+ rgbdata = crop_mirror_transform(rgbdata, mean = dataset_args['image_mean'], \
+ std = dataset_args['image_std'], cropsize = dataset_args['crop_size'], \
+ use_mirror = is_training, center_crop = (not is_training), \
+ spatial_pos = spatial_pos)
+
+ batch_out.append((rgbdata, label))
+ if len(batch_out) == batch_size:
+ queue.put(batch_out)
+ batch_out = []
+ queue.put(None)
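+        # the None above is a sentinel: queue_reader counts one per worker
+        # process to detect when all of them have finished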
+
+ def queue_reader():
+ # split file list and shuffle
+ fl = open(filelist).readlines()
+ fl = [line.strip() for line in fl if line.strip() != '']
+
+ if shuffle:
+ random.shuffle(fl)
+
+ n = dataset_args['num_reader_threads']
+ queue_size = 20
+ reader_lists = [None] * n
+ file_num = int(len(fl) // n)
+ for i in range(n):
+ if i < len(reader_lists) - 1:
+ tmp_list = fl[i * file_num:(i + 1) * file_num]
+ else:
+ tmp_list = fl[i * file_num:]
+ reader_lists[i] = tmp_list
+
+ queue = multiprocessing.Queue(queue_size)
+ p_list = [None] * len(reader_lists)
+ # for reader_list in reader_lists:
+ for i in range(len(reader_lists)):
+ reader_list = reader_lists[i]
+ p_list[i] = multiprocessing.Process(
+ target=read_into_queue, args=(reader_list, queue))
+ p_list[i].start()
+ reader_num = len(reader_lists)
+ finish_num = 0
+ while finish_num < reader_num:
+ sample = queue.get()
+ if sample is None:
+ finish_num += 1
+ else:
+ yield sample
+ for i in range(len(p_list)):
+ p_list[i].terminate()
+ p_list[i].join()
+
+ return queue_reader
diff --git a/PaddleCV/video/datareader/reader_utils.py b/PaddleCV/video/datareader/reader_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..b21bed3df012f0f64cc4a4af9296c3d569925bc9
--- /dev/null
+++ b/PaddleCV/video/datareader/reader_utils.py
@@ -0,0 +1,82 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import pickle
+import cv2
+import numpy as np
+import random
+
+
+class ReaderNotFoundError(Exception):
+ "Error: reader not found"
+
+ def __init__(self, reader_name, avail_readers):
+ super(ReaderNotFoundError, self).__init__()
+ self.reader_name = reader_name
+ self.avail_readers = avail_readers
+
+ def __str__(self):
+ msg = "Reader {} Not Found.\nAvailiable readers:\n".format(
+ self.reader_name)
+ for reader in self.avail_readers:
+ msg += " {}\n".format(reader)
+ return msg
+
+
+class DataReader(object):
+ """data reader for video input"""
+
+ def __init__(self, model_name, mode, cfg):
+ self.name = model_name
+ self.mode = mode
+ self.cfg = cfg
+
+    def create_reader(self):
+        """Not implemented in the base class; subclasses must override."""
+        raise NotImplementedError
+
+ def get_config_from_sec(self, sec, item, default=None):
+ if sec.upper() not in self.cfg:
+ return default
+ return self.cfg[sec.upper()].get(item, default)
+
+
+
+class ReaderZoo(object):
+ def __init__(self):
+ self.reader_zoo = {}
+
+ def regist(self, name, reader):
+        assert reader.__base__ == DataReader, "Unknown reader type {}".format(
+            reader)
+ self.reader_zoo[name] = reader
+
+ def get(self, name, mode, cfg):
+ for k, v in self.reader_zoo.items():
+ if k == name:
+ return v(name, mode, cfg)
+ raise ReaderNotFoundError(name, self.reader_zoo.keys())
+
+
+# singleton reader_zoo
+reader_zoo = ReaderZoo()
+
+
+def regist_reader(name, reader):
+ reader_zoo.regist(name, reader)
+
+
+def get_reader(name, mode, cfg):
+ reader_model = reader_zoo.get(name, mode, cfg)
+ return reader_model.create_reader()
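+
+
+# A minimal usage sketch (names from this module; cfg is a parsed config):
+#   regist_reader("TSN", KineticsReader)      # done in datareader/__init__.py
+#   reader = get_reader("TSN", "train", cfg)  # a generator yielding batches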
diff --git a/PaddleCV/video/dataset/README.md b/PaddleCV/video/dataset/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..dd8d00480a084076de407e0887a18d9fe83be7e1
--- /dev/null
+++ b/PaddleCV/video/dataset/README.md
@@ -0,0 +1,123 @@
+# Data Preparation
+
+- [Youtube-8M](#youtube-8m-dataset)
+- [Kinetics](#kinetics-dataset)
+- [Non-local](#non-local)
+
+## Youtube-8M Dataset
+The YouTube-8M dataset used here is the version updated in 2018. We use the official dataset and convert the TFRecord files into pickle files for PaddlePaddle. YouTube-8M officially provides both frame-level and video-level features; only the frame-level features are needed here.
+
+### Downloading the data
+Please use the official YouTube-8M links to download the [training set](http://us.data.yt8m.org/2/frame/train/index.html) and the [validation set](http://us.data.yt8m.org/2/frame/validate/index.html). Each link provides download addresses for 3844 files; you can also use the official [download script](https://research.google.com/youtube8m/download.html). After the download finishes, you will have 3844 training files and 3844 validation files (in TFRecord format).
+Let Code\_Root denote the root directory of the video models code, and enter the dataset/youtube8m directory
+
+    cd dataset/youtube8m
+
+Create the directories tf/train and tf/val under youtube8m
+
+    mkdir tf && cd tf
+
+    mkdir train && mkdir val
+
+and place the downloaded train and validate data in them, respectively.
+
+### Converting the data format
+
+For PaddlePaddle training, the downloaded TFRecord files need to be converted offline into pickle format; please use the conversion script [dataset/youtube8m/tf2pkl.py](./youtube8m/tf2pkl.py).
+
+Create the directories pkl/train and pkl/val under dataset/youtube8m
+
+    cd dataset/youtube8m
+
+    mkdir pkl && cd pkl
+
+    mkdir train && mkdir val
+
+
+To convert the file format (TFRecord -> pkl), enter the dataset/youtube8m directory and run
+
+    python tf2pkl.py ./tf/train ./pkl/train
+
+and
+
+    python tf2pkl.py ./tf/val ./pkl/val
+
+to convert the train and validation sets, respectively. tf2pkl.py takes two arguments: the directory holding the source tf files and the directory for the converted pkl files.
+
+Note: reading TFRecord files requires Tensorflow, so please install Tensorflow first, or convert the data in an environment with Tensorflow installed and then copy it into the dataset/youtube8m/pkl directory. To avoid conflicts with the PaddlePaddle environment, it is recommended to do the conversion elsewhere and copy the data over afterwards.
+
+### Generating the file lists
+
+Enter the dataset/youtube8m directory
+
+    ls $Code_Root/dataset/youtube8m/pkl/train/* > train.list
+
+    ls $Code_Root/dataset/youtube8m/pkl/val/* > val.list
+
+This generates two files, train.list and val.list, in the dataset/youtube8m directory; each line stores the absolute path of one pkl file.
+
+## Kinetics Dataset
+
+Kinetics is a large-scale video action recognition dataset released by DeepMind, available in two versions, Kinetics400 and Kinetics600. Kinetics400 is used here; the preprocessing steps are as follows.
+
+### Downloading the mp4 videos
+Create the directories under Code\_Root
+
+    cd $Code_Root/dataset && mkdir kinetics
+
+    cd kinetics && mkdir data_k400 && cd data_k400
+
+    mkdir train_mp4 && mkdir val_mp4
+
+ActivityNet provides an official download tool for Kinetics; see its [official repo](https://github.com/activitynet/ActivityNet/tree/master/Crawler/Kinetics) to download the Kinetics400 mp4 video collection. Download the Kinetics400 training and validation sets into dataset/kinetics/data\_k400/train\_mp4 and dataset/kinetics/data\_k400/val\_mp4, respectively.
+
+### Preprocessing the mp4 files
+
+To speed up data reading, the mp4 files are decoded into frames and packed into pickle files in advance, and the dataloader reads the data from each video's pkl file (this method uses more storage). Each pkl file packs (video-id, [frame1, frame2, ..., frameN], label).
+
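+A minimal sketch of reading one of these pkl files back in Python 3 (this mirrors decode_pickle in datareader/kinetics_reader.py; the file name below is a placeholder):
+
+    import pickle
+
+    with open('some_video.pkl', 'rb') as f:
+        vid, label, frames = pickle.load(f, encoding='bytes')
+    # vid: video id, label: integer class id,
+    # frames: list of JPEG-encoded byte strings
+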
+Create the directories train\_pkl and val\_pkl under dataset/kinetics/data\_k400
+
+    cd $Code_Root/dataset/kinetics/data_k400
+
+    mkdir train_pkl && mkdir val_pkl
+
+Enter the $Code\_Root/dataset/kinetics directory and convert the data with the video2pkl.py script. First download the file lists of the [train](https://github.com/activitynet/ActivityNet/tree/master/Crawler/Kinetics/data/kinetics-400_train.csv) and [validation](https://github.com/activitynet/ActivityNet/tree/master/Crawler/Kinetics/data/kinetics-400_val.csv) sets.
+
+First generate the dataset label file needed for preprocessing
+
+    python generate_label.py kinetics-400_train.csv kinetics400_label.txt
+
+Then run the following (using 8 processes as an example):
+
+    python video2pkl.py kinetics-400_train.csv $Source_dir $Target_dir 8
+
+- This script depends on `ffmpeg`; please install `ffmpeg` beforehand.
+
+For the train data,
+
+    Source_dir = $Code_Root/dataset/kinetics/data_k400/train_mp4
+
+    Target_dir = $Code_Root/dataset/kinetics/data_k400/train_pkl
+
+For the val data,
+
+    Source_dir = $Code_Root/dataset/kinetics/data_k400/val_mp4
+
+    Target_dir = $Code_Root/dataset/kinetics/data_k400/val_pkl
+
+This decodes the mp4 files and saves them as pkl files.
+
+### Generating the train and validation lists
+
+    cd $Code_Root/dataset/kinetics
+
+    ls $Code_Root/dataset/kinetics/data_k400/train_pkl/* > train.list
+
+    ls $Code_Root/dataset/kinetics/data_k400/val_pkl/* > val.list
+
+
+This generates the file lists; each line of train.list and val.list is the absolute path of one pkl file.
+
+## Non-local
+
+The Non-local model also uses the Kinetics dataset, but its data processing differs from the other models; see the [Non-local data notes](./nonlocal/README.md) for details.
diff --git a/PaddleCV/video/dataset/kinetics/generate_label.py b/PaddleCV/video/dataset/kinetics/generate_label.py
new file mode 100644
index 0000000000000000000000000000000000000000..4f7c504c56821527cde57bacf7e9a2d07c666c8f
--- /dev/null
+++ b/PaddleCV/video/dataset/kinetics/generate_label.py
@@ -0,0 +1,31 @@
+import sys
+
+# kinetics-400_train.csv should be downloaded first and set as sys.argv[1]
+# sys.argv[2] can be set as kinetics400_label.txt
+# python generate_label.py kinetics-400_train.csv kinetics400_label.txt
+
+num_classes = 400
+
+fname = sys.argv[1]
+outname = sys.argv[2]
+fl = open(fname).readlines()
+fl = fl[1:]
+outf = open(outname, 'w')
+
+label_list = []
+for line in fl:
+ label = line.strip().split(',')[0].strip('"')
+ if label in label_list:
+ continue
+ else:
+ label_list.append(label)
+
+assert len(
+    label_list
+) == num_classes, "there should be {} labels in list, but got {}".format(
+    num_classes, len(label_list))
+
+label_list.sort()
+for i in range(num_classes):
+ outf.write('{} {}'.format(label_list[i], i) + '\n')
+
+outf.close()
diff --git a/PaddleCV/video/dataset/kinetics/video2pkl.py b/PaddleCV/video/dataset/kinetics/video2pkl.py
new file mode 100644
index 0000000000000000000000000000000000000000..881857c40c4ece2f192e681526e2622ef1ce2f81
--- /dev/null
+++ b/PaddleCV/video/dataset/kinetics/video2pkl.py
@@ -0,0 +1,84 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import sys
+import glob
+try:
+    import cPickle as pickle
+except ImportError:  # Python 3
+    import pickle
+from multiprocessing import Pool
+
+# example command line: python video2pkl.py kinetics-400_train.csv $Source_dir $Target_dir 8
+#
+# kinetics-400_train.csv is the training set file of K400 official release
+# each line contains label,youtube_id,time_start,time_end,split,is_cc
+
+assert (len(sys.argv) == 5)
+
+f = open(sys.argv[1])
+source_dir = sys.argv[2]
+target_dir = sys.argv[3]
+num_threads = int(sys.argv[4])
+all_video_entries = [x.strip().split(',') for x in f.readlines()]
+all_video_entries = all_video_entries[1:]
+f.close()
+
+category_label_map = {}
+f = open('kinetics400_label.txt')
+for line in f:
+ ens = line.strip().split(' ')
+ category = " ".join(ens[0:-1])
+ label = int(ens[-1])
+ category_label_map[category] = label
+f.close()
+
+
+def generate_pkl(entry):
+ mode = entry[4]
+ category = entry[0].strip('"')
+ category_dir = category
+ video_path = os.path.join(
+ './',
+ entry[1] + "_%06d" % int(entry[2]) + "_%06d" % int(entry[3]) + ".mp4")
+ video_path = os.path.join(source_dir, category_dir, video_path)
+ label = category_label_map[category]
+
+ vid = './' + video_path.split('/')[-1].split('.')[0]
+ if os.path.exists(video_path):
+ if not os.path.exists(vid):
+ os.makedirs(vid)
+ os.system('ffmpeg -i ' + video_path + ' -q 0 ' + vid + '/%06d.jpg')
+ else:
+ print("File not exists {}".format(video_path))
+ return
+
+ images = sorted(glob.glob(vid + '/*.jpg'))
+ ims = []
+    for img in images:
+        # read the raw JPEG bytes; binary mode works on both Python 2 and 3
+        f = open(img, 'rb')
+        ims.append(f.read())
+        f.close()
+
+ output_pkl = vid + ".pkl"
+ output_pkl = os.path.join(target_dir, output_pkl)
+    f = open(output_pkl, 'wb')
+    pickle.dump((vid, label, ims), f, -1)
+ f.close()
+
+ os.system('rm -rf %s' % vid)
+
+
+pool = Pool(processes=num_threads)
+pool.map(generate_pkl, all_video_entries)
+pool.close()
+pool.join()
diff --git a/PaddleCV/video/dataset/nonlocal/README.md b/PaddleCV/video/dataset/nonlocal/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..52b5d73afde083610be595c4b1f4ff529c81cbd5
--- /dev/null
+++ b/PaddleCV/video/dataset/nonlocal/README.md
@@ -0,0 +1,19 @@
+# Non-local Model Data Notes
+
+In the Non-local model, the input data are mp4 files. The datareader code uses OpenCV to decode and sample the mp4 videos. For train and valid data, the starting frame position is chosen at random and each frame is randomly augmented: the short side is scaled to a random size in [256, 320], the long side is computed from the aspect ratio, and a 224x224 region is cropped. At test time, each video uses 10 different starting-frame positions and 3 different spatial positions as crop origins, so each video is sampled 10x3 times; the predicted probabilities of these 30 samples are summed, and the class with the highest probability is taken as the final prediction.
+
+## Downloading the data
+
+Download the Kinetics400 data; see the Kinetics section of the [data notes](../README.md) for details. Assume the downloaded mp4 files are stored under the DATADIR directory, with the train and validation data in $DATADIR/train and $DATADIR/valid, respectively. When downloading, scale the height of all videos to 256 and compute the width from the aspect ratio.
+
+## Downloading the official file lists
+
+Download the official dataset tables [kinetics-400\_train.csv](https://github.com/activitynet/ActivityNet/tree/master/Crawler/Kinetics/data/kinetics-400_train.csv) and [kinetics-400\_val.csv](https://github.com/activitynet/ActivityNet/tree/master/Crawler/Kinetics/data/kinetics-400_val.csv) into this directory.
+
+## Generating the file lists
+
+Open generate\_list.sh, set TRAIN\_DIR and VALID\_DIR to the paths where your mp4 files are stored, and run the script
+
+    bash generate_list.sh
+
+to generate trainlist.txt, vallist.txt and testlist.txt.
diff --git a/PaddleCV/video/dataset/nonlocal/change_filelist.py b/PaddleCV/video/dataset/nonlocal/change_filelist.py
new file mode 100644
index 0000000000000000000000000000000000000000..0426164fb2ee45a0ada550e0e7fa0b1e452ffff8
--- /dev/null
+++ b/PaddleCV/video/dataset/nonlocal/change_filelist.py
@@ -0,0 +1,37 @@
+# Copyright (c) 2017-present, Facebook, Inc.
+# All rights reserved.
+#
+# This source code is licensed under the license found in the
+# LICENSE file in the root directory of this source tree.
+#
+
+import sys
+
+# src = 'trainlist_download.txt'
+# outlist = 'trainlist.txt'
+# original_folder = '/nfs.yoda/xiaolonw/kinetics/data/train'
+# replace_folder = '/scratch/xiaolonw/kinetics/data/compress/train_256'
+assert (len(sys.argv) == 5)
+
+src = sys.argv[1]
+outlist = sys.argv[2]
+original_folder = sys.argv[3]
+replace_folder = sys.argv[4]
+
+f = open(src, 'r')
+flist = []
+for line in f:
+ flist.append(line)
+f.close()
+
+f2 = open(outlist, 'w')
+
+listlen = len(flist)
+for i in range(listlen):
+ line = flist[i]
+ line = line.replace(original_folder, replace_folder)
+ f2.write(line)
+
+f2.close()
diff --git a/PaddleCV/video/dataset/nonlocal/generate_filelist.py b/PaddleCV/video/dataset/nonlocal/generate_filelist.py
new file mode 100644
index 0000000000000000000000000000000000000000..c69ac95270da7b707da4eb5dd28187ad4b007b45
--- /dev/null
+++ b/PaddleCV/video/dataset/nonlocal/generate_filelist.py
@@ -0,0 +1,65 @@
+import os
+import numpy as np
+import sys
+
+num_classes = 400
+replace_space_by_underliner = True # whether to replace space by '_' in labels
+
+fn = sys.argv[1] #'trainlist_download400.txt'
+train_dir = sys.argv[
+ 2] #'/docker_mount/data/k400/Kinetics_trimmed_processed_train'
+val_dir = sys.argv[3] #'/docker_mount/data/k400/Kinetics_trimmed_processed_val'
+trainlist = sys.argv[4] #'trainlist.txt'
+vallist = sys.argv[5] #'vallist.txt'
+
+fl = open(fn).readlines()
+fl = [line.strip() for line in fl if line.strip() != '']
+action_list = []
+
+for line in fl[1:]:
+ act = line.split(',')[0].strip('\"')
+ action_list.append(act)
+
+action_set = set(action_list)
+action_list = list(action_set)
+action_list.sort()
+if replace_space_by_underliner:
+ action_list = [item.replace(' ', '_') for item in action_list]
+
+# assign integer label to each category, abseiling is labeled as 0,
+# zumba labeled as 399 and so on, sorted by the category name
+action_label_dict = {}
+for i in range(len(action_list)):
+ key = action_list[i]
+ action_label_dict[key] = i
+
+assert len(action_label_dict.keys(
+)) == num_classes, "action num should be {}".format(num_classes)
+
+
+def generate_file(Faction_label_dict, Ftrain_dir, Ftrainlist, Fnum_classes):
+ trainactions = os.listdir(Ftrain_dir)
+ trainactions.sort()
+ assert len(
+ trainactions) == Fnum_classes, "train action num should be {}".format(
+ Fnum_classes)
+
+ train_items = []
+ trainlist_outfile = open(Ftrainlist, 'w')
+ for trainaction in trainactions:
+ assert trainaction in Faction_label_dict.keys(
+ ), "action {} should be in action_dict".format(trainaction)
+ trainaction_dir = os.path.join(Ftrain_dir, trainaction)
+ trainaction_label = Faction_label_dict[trainaction]
+ trainaction_files = os.listdir(trainaction_dir)
+ for f in trainaction_files:
+ fn = os.path.join(trainaction_dir, f)
+ item = fn + ' ' + str(trainaction_label)
+ train_items.append(item)
+ trainlist_outfile.write(item + '\n')
+ trainlist_outfile.flush()
+ trainlist_outfile.close()
+
+
+generate_file(action_label_dict, train_dir, trainlist, num_classes)
+generate_file(action_label_dict, val_dir, vallist, num_classes)
diff --git a/PaddleCV/video/dataset/nonlocal/generate_list.sh b/PaddleCV/video/dataset/nonlocal/generate_list.sh
new file mode 100644
index 0000000000000000000000000000000000000000..d179f855e227eeaa6c4bda0c8ef703fa620e9042
--- /dev/null
+++ b/PaddleCV/video/dataset/nonlocal/generate_list.sh
@@ -0,0 +1,12 @@
+# name of the downloaded train list csv
+TRAINLIST_DOWNLOAD="kinetics-400_train.csv"
+
+# path of the train and valid data
+TRAIN_DIR=YOUR_TRAIN_DATA_DIR # replace this with your train data dir
+VALID_DIR=YOUR_VALID_DATA_DIR # replace this with your valid data dir
+
+python generate_filelist.py $TRAINLIST_DOWNLOAD $TRAIN_DIR $VALID_DIR trainlist.txt vallist.txt
+
+# generate test list
+python generate_testlist_multicrop.py
+
diff --git a/PaddleCV/video/dataset/nonlocal/generate_testlist_multicrop.py b/PaddleCV/video/dataset/nonlocal/generate_testlist_multicrop.py
new file mode 100644
index 0000000000000000000000000000000000000000..f2d9b86c86fd75b3ca296772d2b05bdf6c353766
--- /dev/null
+++ b/PaddleCV/video/dataset/nonlocal/generate_testlist_multicrop.py
@@ -0,0 +1,21 @@
+import os
+
+vallist = 'vallist.txt'
+testlist = 'testlist.txt'
+sampling_times = 10
+cropping_times = 3
+
+fl = open(vallist).readlines()
+fl = [line.strip() for line in fl if line.strip() != '']
+f_test = open(testlist, 'w')
+
+for i in range(len(fl)):
+ line = fl[i].split(' ')
+ fn = line[0]
+ label = line[1]
+ for j in range(sampling_times):
+ for k in range(cropping_times):
+ test_item = fn + ' ' + str(i) + ' ' + str(j) + ' ' + str(k) + '\n'
+ f_test.write(test_item)
+
+f_test.close()
diff --git a/PaddleCV/video/dataset/youtube8m/tf2pkl.py b/PaddleCV/video/dataset/youtube8m/tf2pkl.py
new file mode 100644
index 0000000000000000000000000000000000000000..3b32e3b41a705d6e294581ca3b92c911d238798f
--- /dev/null
+++ b/PaddleCV/video/dataset/youtube8m/tf2pkl.py
@@ -0,0 +1,278 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+"""Provides readers configured for different datasets."""
+import os, sys
+import numpy as np
+import tensorflow as tf
+from tensorflow import logging
+try:
+    import cPickle as pickle
+except ImportError:  # Python 3
+    import pickle
+
+from tensorflow.python.platform import gfile
+
+assert (len(sys.argv) == 3)
+source_dir = sys.argv[1]
+target_dir = sys.argv[2]
+
+
+def Dequantize(feat_vector, max_quantized_value=2, min_quantized_value=-2):
+ """Dequantize the feature from the byte format to the float format.
+
+ Args:
+ feat_vector: the input 1-d vector.
+ max_quantized_value: the maximum of the quantized value.
+ min_quantized_value: the minimum of the quantized value.
+
+ Returns:
+ A float vector which has the same shape as feat_vector.
+ """
+ assert max_quantized_value > min_quantized_value
+ quantized_range = max_quantized_value - min_quantized_value
+ scalar = quantized_range / 255.0
+ bias = (quantized_range / 512.0) + min_quantized_value
+ return feat_vector * scalar + bias
+
+
+def resize_axis(tensor, axis, new_size, fill_value=0):
+ """Truncates or pads a tensor to new_size on on a given axis.
+
+ Truncate or extend tensor such that tensor.shape[axis] == new_size. If the
+ size increases, the padding will be performed at the end, using fill_value.
+
+ Args:
+ tensor: The tensor to be resized.
+ axis: An integer representing the dimension to be sliced.
+ new_size: An integer or 0d tensor representing the new value for
+ tensor.shape[axis].
+ fill_value: Value to use to fill any new entries in the tensor. Will be
+ cast to the type of tensor.
+
+ Returns:
+ The resized tensor.
+ """
+ tensor = tf.convert_to_tensor(tensor)
+ shape = tf.unstack(tf.shape(tensor))
+
+ pad_shape = shape[:]
+ pad_shape[axis] = tf.maximum(0, new_size - shape[axis])
+
+ shape[axis] = tf.minimum(shape[axis], new_size)
+ shape = tf.stack(shape)
+
+ resized = tf.concat([
+ tf.slice(tensor, tf.zeros_like(shape), shape),
+ tf.fill(tf.stack(pad_shape), tf.cast(fill_value, tensor.dtype))
+ ], axis)
+
+ # Update shape.
+ new_shape = tensor.get_shape().as_list() # A copy is being made.
+ new_shape[axis] = new_size
+ resized.set_shape(new_shape)
+ return resized
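+
+# Example: resize_axis(t, axis=0, new_size=300) on a [5, 1024] tensor pads 295
+# zero rows at the end, while new_size=3 truncates to the first 3 rows.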
+
+
+class BaseReader(object):
+ """Inherit from this class when implementing new readers."""
+
+ def prepare_reader(self, unused_filename_queue):
+ """Create a thread for generating prediction and label tensors."""
+ raise NotImplementedError()
+
+
+class YT8MFrameFeatureReader(BaseReader):
+ """Reads TFRecords of SequenceExamples.
+
+  The TFRecords must contain SequenceExamples with the sparse int64 'labels'
+ context feature and a fixed length byte-quantized feature vector, obtained
+ from the features in 'feature_names'. The quantized features will be mapped
+ back into a range between min_quantized_value and max_quantized_value.
+ """
+
+ def __init__(self,
+ num_classes=3862,
+ feature_sizes=[1024],
+ feature_names=["inc3"],
+ max_frames=300):
+ """Construct a YT8MFrameFeatureReader.
+
+ Args:
+ num_classes: a positive integer for the number of classes.
+ feature_sizes: positive integer(s) for the feature dimensions as a list.
+ feature_names: the feature name(s) in the tensorflow record as a list.
+ max_frames: the maximum number of frames to process.
+ """
+
+ assert len(feature_names) == len(feature_sizes), \
+ "length of feature_names (={}) != length of feature_sizes (={})".format( \
+ len(feature_names), len(feature_sizes))
+
+ self.num_classes = num_classes
+ self.feature_sizes = feature_sizes
+ self.feature_names = feature_names
+ self.max_frames = max_frames
+
+ def get_video_matrix(self, features, feature_size, max_frames,
+ max_quantized_value, min_quantized_value):
+ """Decodes features from an input string and quantizes it.
+
+ Args:
+ features: raw feature values
+ feature_size: length of each frame feature vector
+ max_frames: number of frames (rows) in the output feature_matrix
+ max_quantized_value: the maximum of the quantized value.
+ min_quantized_value: the minimum of the quantized value.
+
+ Returns:
+ feature_matrix: matrix of all frame-features
+ num_frames: number of frames in the sequence
+ """
+ decoded_features = tf.reshape(
+ tf.cast(tf.decode_raw(features, tf.uint8), tf.float32),
+ [-1, feature_size])
+
+ num_frames = tf.minimum(tf.shape(decoded_features)[0], max_frames)
+
+ feature_matrix = decoded_features
+
+ return feature_matrix, num_frames
+
+ def prepare_reader(self,
+ filename_queue,
+ max_quantized_value=2,
+ min_quantized_value=-2):
+ """Creates a single reader thread for YouTube8M SequenceExamples.
+
+ Args:
+ filename_queue: A tensorflow queue of filename locations.
+ max_quantized_value: the maximum of the quantized value.
+ min_quantized_value: the minimum of the quantized value.
+
+ Returns:
+ A tuple of video indexes, video features, labels, and padding data.
+ """
+ reader = tf.TFRecordReader()
+ _, serialized_example = reader.read(filename_queue)
+
+ contexts, features = tf.parse_single_sequence_example(
+ serialized_example,
+ context_features={
+ "id": tf.FixedLenFeature([], tf.string),
+ "labels": tf.VarLenFeature(tf.int64)
+ },
+ sequence_features={
+ feature_name: tf.FixedLenSequenceFeature(
+ [], dtype=tf.string)
+ for feature_name in self.feature_names
+ })
+
+ # read ground truth labels
+ labels = (tf.cast(
+ tf.sparse_to_dense(
+ contexts["labels"].values, (self.num_classes, ),
+ 1,
+ validate_indices=False),
+ tf.bool))
+
+ # loads (potentially) different types of features and concatenates them
+ num_features = len(self.feature_names)
+ assert num_features > 0, "No feature selected: feature_names is empty!"
+
+ assert len(self.feature_names) == len(self.feature_sizes), \
+ "length of feature_names (={}) != length of feature_sizes (={})".format( \
+ len(self.feature_names), len(self.feature_sizes))
+
+ num_frames = -1 # the number of frames in the video
+ feature_matrices = [None
+ ] * num_features # an array of different features
+
+ for feature_index in range(num_features):
+ feature_matrix, num_frames_in_this_feature = self.get_video_matrix(
+ features[self.feature_names[feature_index]],
+ self.feature_sizes[feature_index], self.max_frames,
+ max_quantized_value, min_quantized_value)
+ if num_frames == -1:
+ num_frames = num_frames_in_this_feature
+ #else:
+ # tf.assert_equal(num_frames, num_frames_in_this_feature)
+
+ feature_matrices[feature_index] = feature_matrix
+
+ # cap the number of frames at self.max_frames
+ num_frames = tf.minimum(num_frames, self.max_frames)
+
+ # concatenate different features
+ video_matrix = feature_matrices[0]
+ audio_matrix = feature_matrices[1]
+
+ return contexts["id"], video_matrix, audio_matrix, labels, num_frames
+
+
+def main(files_pattern):
+ data_files = gfile.Glob(files_pattern)
+ filename_queue = tf.train.string_input_producer(
+ data_files, num_epochs=1, shuffle=False)
+
+ reader = YT8MFrameFeatureReader(
+ feature_sizes=[1024, 128], feature_names=["rgb", "audio"])
+ vals = reader.prepare_reader(filename_queue)
+
+ with tf.Session() as sess:
+ sess.run(tf.initialize_local_variables())
+ sess.run(tf.initialize_all_variables())
+ coord = tf.train.Coordinator()
+ threads = tf.train.start_queue_runners(sess=sess, coord=coord)
+
+ vid_num = 0
+ all_data = []
+ try:
+ while not coord.should_stop():
+ vid, features, audios, labels, nframes = sess.run(vals)
+                label_index = np.where(labels)[0].tolist()
+ vid_num += 1
+
+ features_int = features.astype(np.uint8)
+ audios_int = audios.astype(np.uint8)
+
+ value_dict = {}
+ value_dict['video'] = vid
+ value_dict['feature'] = features_int
+ value_dict['audio'] = audios_int
+ value_dict['label'] = label_index
+ value_dict['nframes'] = nframes
+ all_data.append(value_dict)
+
+ except tf.errors.OutOfRangeError:
+ print('Finished extracting.')
+
+ finally:
+ coord.request_stop()
+ coord.join(threads)
+
+    print(vid_num)
+
+ record_name = files_pattern.split('/')[-1].split('.')[0]
+ outputdir = target_dir
+ fn = '%s.pkl' % record_name
+ outp = open(os.path.join(outputdir, fn), 'wb')
+    pickle.dump(all_data, outp, protocol=pickle.HIGHEST_PROTOCOL)
+ outp.close()
+
+
+if __name__ == '__main__':
+ record_dir = source_dir
+ record_files = os.listdir(record_dir)
+ for f in record_files:
+ record_path = os.path.join(record_dir, f)
+ main(record_path)
diff --git a/PaddleCV/video/dataset/youtube8m/yt8m_pca/eigenvals.npy b/PaddleCV/video/dataset/youtube8m/yt8m_pca/eigenvals.npy
new file mode 100644
index 0000000000000000000000000000000000000000..632506b9ad68f030d64643cc8100868b21c3eb98
Binary files /dev/null and b/PaddleCV/video/dataset/youtube8m/yt8m_pca/eigenvals.npy differ
diff --git a/PaddleCV/video/images/StNet.png b/PaddleCV/video/images/StNet.png
new file mode 100644
index 0000000000000000000000000000000000000000..fde8d77f8e76a5ede7c7cb9f9af96850fea137a1
Binary files /dev/null and b/PaddleCV/video/images/StNet.png differ
diff --git a/PaddleCV/video/images/attention_cluster.png b/PaddleCV/video/images/attention_cluster.png
new file mode 100644
index 0000000000000000000000000000000000000000..f4c1dd6e9a233de68f66b937a24765a8420f7e4b
Binary files /dev/null and b/PaddleCV/video/images/attention_cluster.png differ
diff --git a/PaddleCV/video/images/nonlocal_instantiation.png b/PaddleCV/video/images/nonlocal_instantiation.png
new file mode 100644
index 0000000000000000000000000000000000000000..e6c7309697cfe67e834677e4b0bf0fb12965e384
Binary files /dev/null and b/PaddleCV/video/images/nonlocal_instantiation.png differ
diff --git a/PaddleCV/video/images/temporal_shift.png b/PaddleCV/video/images/temporal_shift.png
new file mode 100644
index 0000000000000000000000000000000000000000..7679c4459d2b0ee37134b99fe1e8177b1a69f8b0
Binary files /dev/null and b/PaddleCV/video/images/temporal_shift.png differ
diff --git a/PaddleCV/video/infer.py b/PaddleCV/video/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..5a7f81f423c5a3e6400811eaa568da5bdbd19e17
--- /dev/null
+++ b/PaddleCV/video/infer.py
@@ -0,0 +1,150 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import sys
+import time
+import logging
+import argparse
+import numpy as np
+try:
+ import cPickle as pickle
+except:
+ import pickle
+import paddle.fluid as fluid
+
+from config import *
+import models
+from datareader import get_reader
+
+logging.root.handlers = []
+FORMAT = '[%(levelname)s: %(filename)s: %(lineno)4d]: %(message)s'
+logging.basicConfig(level=logging.DEBUG, format=FORMAT, stream=sys.stdout)
+logger = logging.getLogger(__name__)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser()
+ parser.add_argument(
+ '--model_name',
+ type=str,
+ default='AttentionCluster',
+ help='name of model to train.')
+ parser.add_argument(
+ '--config',
+ type=str,
+ default='configs/attention_cluster.txt',
+ help='path to config file of model')
+ parser.add_argument(
+        '--use_gpu', type=bool, default=True, help='whether to use gpu.')
+ parser.add_argument(
+ '--weights',
+ type=str,
+ default=None,
+ help='weight path, None to use weights from Paddle.')
+ parser.add_argument(
+ '--batch_size',
+ type=int,
+ default=1,
+ help='sample number in a batch for inference.')
+ parser.add_argument(
+ '--filelist',
+ type=str,
+ default=None,
+        help='path to the inference data file list.')
+ parser.add_argument(
+ '--log_interval',
+ type=int,
+ default=1,
+ help='mini-batch interval to log.')
+ parser.add_argument(
+ '--infer_topk',
+ type=int,
+ default=20,
+        help='number of top-k predictions to save.')
+ parser.add_argument(
+ '--save_dir', type=str, default='./', help='directory to store results')
+ args = parser.parse_args()
+ return args
+
+
+def infer(args):
+ # parse config
+ config = parse_config(args.config)
+ infer_config = merge_configs(config, 'infer', vars(args))
+ print_configs(infer_config, "Infer")
+ infer_model = models.get_model(args.model_name, infer_config, mode='infer')
+ infer_model.build_input(use_pyreader=False)
+ infer_model.build_model()
+ infer_feeds = infer_model.feeds()
+ infer_outputs = infer_model.outputs()
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+
+ filelist = args.filelist or infer_config.INFER.filelist
+    assert os.path.exists(filelist), "{} does not exist.".format(filelist)
+
+ # get infer reader
+ infer_reader = get_reader(args.model_name.upper(), 'infer', infer_config)
+
+ if args.weights:
+        assert os.path.exists(
+            args.weights), "Given weights path {} does not exist.".format(args.weights)
+ # if no weight files specified, download weights from paddle
+ weights = args.weights or infer_model.get_weights()
+
+ infer_model.load_test_weights(exe, weights,
+ fluid.default_main_program(), place)
+
+ infer_feeder = fluid.DataFeeder(place=place, feed_list=infer_feeds)
+ fetch_list = [x.name for x in infer_outputs]
+
+ periods = []
+ results = []
+ cur_time = time.time()
+ for infer_iter, data in enumerate(infer_reader()):
+ data_feed_in = [items[:-1] for items in data]
+ video_id = [items[-1] for items in data]
+ infer_outs = exe.run(fetch_list=fetch_list,
+ feed=infer_feeder.feed(data_feed_in))
+ predictions = np.array(infer_outs[0])
+ for i in range(len(predictions)):
+            topk_inds = predictions[i].argsort()[-args.infer_topk:]
+ topk_inds = topk_inds[::-1]
+ preds = predictions[i][topk_inds]
+ results.append((video_id[i], preds.tolist(), topk_inds.tolist()))
+ prev_time = cur_time
+ cur_time = time.time()
+ period = cur_time - prev_time
+ periods.append(period)
+ if args.log_interval > 0 and infer_iter % args.log_interval == 0:
+            logger.info('Processed {} samples'.format((infer_iter + 1) * len(
+                predictions)))
+
+ logger.info('[INFER] infer finished. average time: {}'.format(
+ np.mean(periods)))
+
+ if not os.path.isdir(args.save_dir):
+ os.mkdir(args.save_dir)
+ result_file_name = os.path.join(args.save_dir,
+ "{}_infer_result".format(args.model_name))
+ pickle.dump(results, open(result_file_name, 'wb'))
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ logger.info(args)
+
+ infer(args)
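+
+# Example invocation (the paths below are placeholders, not shipped defaults):
+#   python infer.py --model_name=AttentionCluster \
+#       --config=configs/attention_cluster.txt \
+#       --weights=./checkpoints/attention_cluster \
+#       --filelist=./data/youtube8m/infer.list \
+#       --save_dir=./output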
diff --git a/PaddleCV/video/metrics/__init__.py b/PaddleCV/video/metrics/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..0d1df762bdf3d3b920fc1e00d15a3a2ecdcdbe55
--- /dev/null
+++ b/PaddleCV/video/metrics/__init__.py
@@ -0,0 +1 @@
+from .metrics_util import get_metrics
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/utils/__init__.py b/PaddleCV/video/metrics/kinetics/__init__.py
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/utils/__init__.py
rename to PaddleCV/video/metrics/kinetics/__init__.py
diff --git a/PaddleCV/video/metrics/kinetics/accuracy_metrics.py b/PaddleCV/video/metrics/kinetics/accuracy_metrics.py
new file mode 100644
index 0000000000000000000000000000000000000000..d79bf2ee18ca8de7d5219dd5d1ab6452aec3fe5f
--- /dev/null
+++ b/PaddleCV/video/metrics/kinetics/accuracy_metrics.py
@@ -0,0 +1,107 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import unicode_literals
+from __future__ import print_function
+from __future__ import division
+
+import numpy as np
+import datetime
+import logging
+
+logger = logging.getLogger(__name__)
+
+
+class MetricsCalculator():
+ def __init__(self, name, mode):
+ self.name = name
+ self.mode = mode # 'train', 'val', 'test'
+ self.reset()
+
+ def reset(self):
+ logger.info('Resetting {} metrics...'.format(self.mode))
+ self.aggr_acc1 = 0.0
+ self.aggr_acc5 = 0.0
+ self.aggr_loss = 0.0
+ self.aggr_batch_size = 0
+
+ def finalize_metrics(self):
+ self.avg_acc1 = self.aggr_acc1 / self.aggr_batch_size
+ self.avg_acc5 = self.aggr_acc5 / self.aggr_batch_size
+ self.avg_loss = self.aggr_loss / self.aggr_batch_size
+
+ def get_computed_metrics(self):
+ json_stats = {}
+ json_stats['avg_loss'] = self.avg_loss
+ json_stats['avg_acc1'] = self.avg_acc1
+ json_stats['avg_acc5'] = self.avg_acc5
+ return json_stats
+
+ def calculate_metrics(self, loss, softmax, labels):
+ accuracy1 = compute_topk_accuracy(softmax, labels, top_k=1) * 100.
+ accuracy5 = compute_topk_accuracy(softmax, labels, top_k=5) * 100.
+ return accuracy1, accuracy5
+
+ def accumulate(self, loss, softmax, labels):
+ cur_batch_size = softmax.shape[0]
+ # if returned loss is None for e.g. test, just set loss to be 0.
+ if loss is None:
+ cur_loss = 0.
+ else:
+            cur_loss = np.mean(np.array(loss))
+ self.aggr_batch_size += cur_batch_size
+ self.aggr_loss += cur_loss * cur_batch_size
+
+ accuracy1 = compute_topk_accuracy(softmax, labels, top_k=1) * 100.
+ accuracy5 = compute_topk_accuracy(softmax, labels, top_k=5) * 100.
+ self.aggr_acc1 += accuracy1 * cur_batch_size
+ self.aggr_acc5 += accuracy5 * cur_batch_size
+
+ return
+
+
+# ----------------------------------------------
+# other utils
+# ----------------------------------------------
+def compute_topk_correct_hits(top_k, preds, labels):
+    '''Compute the number of correct hits.'''
+ batch_size = preds.shape[0]
+
+ top_k_preds = np.zeros((batch_size, top_k), dtype=np.float32)
+ for i in range(batch_size):
+ top_k_preds[i, :] = np.argsort(-preds[i, :])[:top_k]
+
+ correctness = np.zeros(batch_size, dtype=np.int32)
+ for i in range(batch_size):
+ if labels[i] in top_k_preds[i, :].astype(np.int32).tolist():
+ correctness[i] = 1
+ correct_hits = sum(correctness)
+
+ return correct_hits
+
+
+def compute_topk_accuracy(softmax, labels, top_k):
+    assert labels.shape[0] == softmax.shape[0], "Batch size mismatch."
+    aggr_batch_size = labels.shape[0]
+    aggr_top_k_correct_hits = compute_topk_correct_hits(top_k, softmax, labels)
+
+    # normalize results
+    return float(aggr_top_k_correct_hits) / aggr_batch_size
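+
+# A quick self-check sketch (hypothetical values) for the helpers above:
+#   softmax = np.array([[0.1, 0.7, 0.2], [0.5, 0.3, 0.2]])
+#   labels = np.array([1, 2])
+#   compute_topk_accuracy(softmax, labels, top_k=1)  # -> 0.5 (only sample 0 hits)
+#   compute_topk_accuracy(softmax, labels, top_k=3)  # -> 1.0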
diff --git a/PaddleCV/video/metrics/metrics_util.py b/PaddleCV/video/metrics/metrics_util.py
new file mode 100644
index 0000000000000000000000000000000000000000..4db704c24fea85fa54ca286a35669becf916af56
--- /dev/null
+++ b/PaddleCV/video/metrics/metrics_util.py
@@ -0,0 +1,197 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import unicode_literals
+from __future__ import print_function
+from __future__ import division
+
+import logging
+
+import numpy as np
+from metrics.youtube8m import eval_util as youtube8m_metrics
+from metrics.kinetics import accuracy_metrics as kinetics_metrics
+from metrics.multicrop_test import multicrop_test_metrics as multicrop_test_metrics
+
+logger = logging.getLogger(__name__)
+
+
+class Metrics(object):
+ def __init__(self, name, mode, metrics_args):
+ """Not implemented"""
+ pass
+
+ def calculate_and_log_out(self, loss, pred, label, info=''):
+ """Not implemented"""
+ pass
+
+ def accumulate(self, loss, pred, label, info=''):
+ """Not implemented"""
+ pass
+
+ def finalize_and_log_out(self, info=''):
+ """Not implemented"""
+ pass
+
+ def reset(self):
+ """Not implemented"""
+ pass
+
+
+class Youtube8mMetrics(Metrics):
+ def __init__(self, name, mode, metrics_args):
+ self.name = name
+ self.mode = mode
+ self.num_classes = metrics_args['MODEL']['num_classes']
+ self.topk = metrics_args['MODEL']['topk']
+ self.calculator = youtube8m_metrics.EvaluationMetrics(self.num_classes,
+ self.topk)
+
+ def calculate_and_log_out(self, loss, pred, label, info=''):
+ loss = np.mean(np.array(loss))
+ hit_at_one = youtube8m_metrics.calculate_hit_at_one(pred, label)
+ perr = youtube8m_metrics.calculate_precision_at_equal_recall_rate(pred,
+ label)
+ gap = youtube8m_metrics.calculate_gap(pred, label)
+ logger.info(info + ' , loss = {0}, Hit@1 = {1}, PERR = {2}, GAP = {3}'.format(\
+ '%.6f' % loss, '%.2f' % hit_at_one, '%.2f' % perr, '%.2f' % gap))
+
+ def accumulate(self, loss, pred, label, info=''):
+ self.calculator.accumulate(loss, pred, label)
+
+ def finalize_and_log_out(self, info=''):
+ epoch_info_dict = self.calculator.get()
+ logger.info(info + '\tavg_hit_at_one: {0},\tavg_perr: {1},\tavg_loss :{2},\taps: {3},\tgap:{4}'\
+ .format(epoch_info_dict['avg_hit_at_one'], epoch_info_dict['avg_perr'], \
+ epoch_info_dict['avg_loss'], epoch_info_dict['aps'], epoch_info_dict['gap']))
+
+ def reset(self):
+ self.calculator.clear()
+
+
+class Kinetics400Metrics(Metrics):
+ def __init__(self, name, mode, metrics_args):
+ self.name = name
+ self.mode = mode
+ self.calculator = kinetics_metrics.MetricsCalculator(name, mode.lower())
+
+ def calculate_and_log_out(self, loss, pred, label, info=''):
+ if loss is not None:
+ loss = np.mean(np.array(loss))
+ else:
+ loss = 0.
+ acc1, acc5 = self.calculator.calculate_metrics(loss, pred, label)
+ logger.info(info + '\tLoss: {},\ttop1_acc: {}, \ttop5_acc: {}'.format('%.6f' % loss, \
+ '%.2f' % acc1, '%.2f' % acc5))
+
+ def accumulate(self, loss, pred, label, info=''):
+ self.calculator.accumulate(loss, pred, label)
+
+ def finalize_and_log_out(self, info=''):
+ self.calculator.finalize_metrics()
+ metrics_dict = self.calculator.get_computed_metrics()
+ loss = metrics_dict['avg_loss']
+ acc1 = metrics_dict['avg_acc1']
+ acc5 = metrics_dict['avg_acc5']
+ logger.info(info + '\tLoss: {},\ttop1_acc: {}, \ttop5_acc: {}'.format('%.6f' % loss, \
+ '%.2f' % acc1, '%.2f' % acc5))
+
+ def reset(self):
+ self.calculator.reset()
+
+
+class MulticropMetrics(Metrics):
+ def __init__(self, name, mode, metrics_args):
+ self.name = name
+ self.mode = mode
+ if mode == 'test':
+ args = {}
+ args['num_test_clips'] = metrics_args.TEST.num_test_clips
+ args['dataset_size'] = metrics_args.TEST.dataset_size
+ args['filename_gt'] = metrics_args.TEST.filename_gt
+ args['checkpoint_dir'] = metrics_args.TEST.checkpoint_dir
+ args['num_classes'] = metrics_args.MODEL.num_classes
+ self.calculator = multicrop_test_metrics.MetricsCalculator(
+ name, mode.lower(), **args)
+ else:
+ self.calculator = kinetics_metrics.MetricsCalculator(name,
+ mode.lower())
+
+ def calculate_and_log_out(self, loss, pred, label, info=''):
+ if self.mode == 'test':
+ pass
+ else:
+ if loss is not None:
+ loss = np.mean(np.array(loss))
+ else:
+ loss = 0.
+ acc1, acc5 = self.calculator.calculate_metrics(loss, pred, label)
+ logger.info(info + '\tLoss: {},\ttop1_acc: {}, \ttop5_acc: {}'.format('%.6f' % loss, \
+ '%.2f' % acc1, '%.2f' % acc5))
+
+    def accumulate(self, loss, pred, label, info=''):
+ self.calculator.accumulate(loss, pred, label)
+
+    def finalize_and_log_out(self, info=''):
+        self.calculator.finalize_metrics()
+        if self.mode != 'test':
+            metrics_dict = self.calculator.get_computed_metrics()
+            loss = metrics_dict['avg_loss']
+            acc1 = metrics_dict['avg_acc1']
+            acc5 = metrics_dict['avg_acc5']
+            logger.info(info + '\tLoss: {},\ttop1_acc: {}, \ttop5_acc: {}'.format('%.6f' % loss, \
+                       '%.2f' % acc1, '%.2f' % acc5))
+
+ def reset(self):
+ self.calculator.reset()
+
+
+class MetricsZoo(object):
+ def __init__(self):
+ self.metrics_zoo = {}
+
+ def regist(self, name, metrics):
+        assert metrics.__base__ == Metrics, "Unknown metrics type {}".format(
+            type(metrics))
+ self.metrics_zoo[name] = metrics
+
+    def get(self, name, mode, cfg):
+        if name in self.metrics_zoo:
+            return self.metrics_zoo[name](name, mode, cfg)
+        raise KeyError("Unknown metrics {}, available metrics are: {}".format(
+            name, list(self.metrics_zoo.keys())))
+
+
+# singleton metrics_zoo
+metrics_zoo = MetricsZoo()
+
+
+def regist_metrics(name, metrics):
+ metrics_zoo.regist(name, metrics)
+
+
+def get_metrics(name, mode, cfg):
+ return metrics_zoo.get(name, mode, cfg)
+
+
+# sort by alphabet
+regist_metrics("ATTENTIONCLUSTER", Youtube8mMetrics)
+regist_metrics("ATTENTIONLSTM", Youtube8mMetrics)
+regist_metrics("NEXTVLAD", Youtube8mMetrics)
+regist_metrics("NONLOCAL", MulticropMetrics)
+regist_metrics("TSM", Kinetics400Metrics)
+regist_metrics("TSN", Kinetics400Metrics)
+regist_metrics("STNET", Kinetics400Metrics)
diff --git a/fluid/PaddleRec/__init__.py b/PaddleCV/video/metrics/multicrop_test/__init__.py
similarity index 100%
rename from fluid/PaddleRec/__init__.py
rename to PaddleCV/video/metrics/multicrop_test/__init__.py
diff --git a/PaddleCV/video/metrics/multicrop_test/multicrop_test_metrics.py b/PaddleCV/video/metrics/multicrop_test/multicrop_test_metrics.py
new file mode 100644
index 0000000000000000000000000000000000000000..fcb7c9544019ea8ed4fe1c2d27798bea0546f334
--- /dev/null
+++ b/PaddleCV/video/metrics/multicrop_test/multicrop_test_metrics.py
@@ -0,0 +1,194 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import unicode_literals
+from __future__ import print_function
+from __future__ import division
+
+import sys
+import os
+import numpy as np
+import datetime
+import logging
+from collections import defaultdict
+import pickle
+
+logger = logging.getLogger(__name__)
+
+
+class MetricsCalculator():
+ def __init__(self, name, mode, **metrics_args):
+ """
+ metrics args:
+ num_test_clips, number of clips of each video when test
+ dataset_size, total number of videos in the dataset
+        filename_gt, a file in which each line stores the ground truth of one video
+ checkpoint_dir, dir where to store the test results
+ num_classes, number of classes of the dataset
+ """
+ self.name = name
+ self.mode = mode # 'train', 'val', 'test'
+ self.metrics_args = metrics_args
+
+ self.num_test_clips = metrics_args['num_test_clips']
+ self.dataset_size = metrics_args['dataset_size']
+ self.filename_gt = metrics_args['filename_gt']
+ self.checkpoint_dir = metrics_args['checkpoint_dir']
+ self.num_classes = metrics_args['num_classes']
+ self.reset()
+
+ def reset(self):
+ logger.info('Resetting {} metrics...'.format(self.mode))
+ self.aggr_acc1 = 0.0
+ self.aggr_acc5 = 0.0
+ self.aggr_loss = 0.0
+ self.aggr_batch_size = 0
+ self.seen_inds = defaultdict(int)
+ self.results = []
+
+ def calculate_metrics(self, loss, pred, labels):
+ pass
+
+ def accumulate(self, loss, pred, labels):
+ labels = labels.astype(int)
+ labels = labels[:, 0]
+ for i in range(pred.shape[0]):
+ probs = pred[i, :].tolist()
+ vid = labels[i]
+ self.seen_inds[vid] += 1
+ if self.seen_inds[vid] > self.num_test_clips:
+                logger.warning(
+                    'Video id {} has already been seen. Skip.'.format(vid))
+ continue
+ save_pairs = [vid, probs]
+ self.results.append(save_pairs)
+ logger.info("({0} / {1}) videos".format(\
+ len(self.seen_inds), self.dataset_size))
+
+ def finalize_metrics(self):
+ if self.filename_gt is not None:
+ evaluate_results(self.results, self.filename_gt, self.dataset_size, \
+ self.num_classes, self.num_test_clips)
+ # save temporary file
+ if not os.path.isdir(self.checkpoint_dir):
+ os.makedirs(self.checkpoint_dir)
+ pkl_path = os.path.join(self.checkpoint_dir, "results_probs.pkl")
+
+        with open(pkl_path, 'wb') as f:
+ pickle.dump(self.results, f)
+ logger.info('Temporary file saved to: {}'.format(pkl_path))
+
+
+def read_groundtruth(filename_gt):
+ f = open(filename_gt, 'r')
+ labels = []
+ for line in f:
+ rows = line.split()
+ labels.append(int(rows[1]))
+ f.close()
+ return labels
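+
+# The ground truth file is assumed to hold one "<video_name> <integer_label>"
+# pair per line, e.g. (hypothetical entries):
+#   abseiling/video_001.mp4 0
+#   zumba/video_042.mp4 399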
+
+
+def evaluate_results(results, filename_gt, test_dataset_size, num_classes,
+ num_test_clips):
+ gt_labels = read_groundtruth(filename_gt)
+ sample_num = test_dataset_size
+ class_num = num_classes
+ sample_video_times = num_test_clips
+ counts = np.zeros(sample_num, dtype=np.int32)
+ probs = np.zeros((sample_num, class_num))
+
+ assert (len(gt_labels) == sample_num)
+ """
+ clip_accuracy: the (e.g.) 10*19761 clips' average accuracy
+ clip1_accuracy: the 1st clip's accuracy (starting from frame 0)
+ """
+ clip_accuracy = 0
+ clip1_accuracy = 0
+ clip1_count = 0
+ seen_inds = defaultdict(int)
+
+ # evaluate
+ for entry in results:
+ vid = entry[0]
+ prob = np.array(entry[1])
+ probs[vid] += prob[0:class_num]
+ counts[vid] += 1
+
+ idx = prob.argmax()
+ if idx == gt_labels[vid]:
+ # clip accuracy
+ clip_accuracy += 1
+
+ # clip1 accuracy
+ seen_inds[vid] += 1
+ if seen_inds[vid] == 1:
+ clip1_count += 1
+ if idx == gt_labels[vid]:
+ clip1_accuracy += 1
+
+    # sanity check
+ max_clips = 0
+ min_clips = sys.maxsize
+ count_empty = 0
+ count_corrupted = 0
+ for i in range(sample_num):
+ max_clips = max(max_clips, counts[i])
+ min_clips = min(min_clips, counts[i])
+ if counts[i] != sample_video_times:
+ count_corrupted += 1
+ logger.warning('Id: {} count: {}'.format(i, counts[i]))
+ if counts[i] == 0:
+ count_empty += 1
+
+ logger.info('Num of empty videos: {}'.format(count_empty))
+ logger.info('Num of corrupted videos: {}'.format(count_corrupted))
+ logger.info('Max num of clips in a video: {}'.format(max_clips))
+ logger.info('Min num of clips in a video: {}'.format(min_clips))
+
+    # clip1 accuracy for sanity (printed first as it is the lowest)
+ logger.info('Clip1 accuracy: {:.2f} percent ({}/{})'.format(
+ 100. * clip1_accuracy / clip1_count, clip1_accuracy, clip1_count))
+
+ # clip accuracy for sanity
+ logger.info('Clip accuracy: {:.2f} percent ({}/{})'.format(
+ 100. * clip_accuracy / len(results), clip_accuracy, len(results)))
+
+ # compute accuracy
+ accuracy = 0
+ accuracy_top5 = 0
+ for i in range(sample_num):
+ prob = probs[i]
+
+ # top-1
+ idx = prob.argmax()
+ if idx == gt_labels[i] and counts[i] > 0:
+ accuracy = accuracy + 1
+
+ ids = np.argsort(prob)[::-1]
+ for j in range(5):
+ if ids[j] == gt_labels[i] and counts[i] > 0:
+ accuracy_top5 = accuracy_top5 + 1
+ break
+
+ accuracy = float(accuracy) / float(sample_num)
+ accuracy_top5 = float(accuracy_top5) / float(sample_num)
+
+ logger.info('-' * 80)
+ logger.info('top-1 accuracy: {:.2f} percent'.format(accuracy * 100))
+ logger.info('top-5 accuracy: {:.2f} percent'.format(accuracy_top5 * 100))
+ logger.info('-' * 80)
+
+ return
diff --git a/fluid/PaddleRec/ctr/__init__.py b/PaddleCV/video/metrics/youtube8m/__init__.py
similarity index 100%
rename from fluid/PaddleRec/ctr/__init__.py
rename to PaddleCV/video/metrics/youtube8m/__init__.py
diff --git a/PaddleCV/video/metrics/youtube8m/average_precision_calculator.py b/PaddleCV/video/metrics/youtube8m/average_precision_calculator.py
new file mode 100644
index 0000000000000000000000000000000000000000..9bad69dd0aff1906e3548fb0322203f0bc5b408d
--- /dev/null
+++ b/PaddleCV/video/metrics/youtube8m/average_precision_calculator.py
@@ -0,0 +1,275 @@
+# Copyright 2016 Google Inc. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS-IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+"""Calculate or keep track of the interpolated average precision.
+
+It provides an interface for calculating interpolated average precision for an
+entire list or the top-n ranked items. For the definition of the
+(non-)interpolated average precision:
+http://trec.nist.gov/pubs/trec15/appendices/CE.MEASURES06.pdf
+
+Example usages:
+1) Use it as a static function call to directly calculate average precision for
+a short ranked list in the memory.
+
+```
+import random
+import numpy as np
+
+p = np.array([random.random() for _ in range(10)])
+a = np.array([random.choice([0, 1]) for _ in range(10)])
+
+ap = average_precision_calculator.AveragePrecisionCalculator.ap(p, a)
+```
+
+2) Use it as an object for long ranked list that cannot be stored in memory or
+the case where partial predictions can be observed at a time (Tensorflow
+predictions). In this case, we first call the function accumulate many times
+to process parts of the ranked list. After processing all the parts, we call
+peek_interpolated_ap_at_n.
+```
+p1 = np.array([random.random() for _ in range(5)])
+a1 = np.array([random.choice([0, 1]) for _ in range(5)])
+p2 = np.array([random.random() for _ in range(5)])
+a2 = np.array([random.choice([0, 1]) for _ in range(5)])
+
+# average precision at 10
+calculator = average_precision_calculator.AveragePrecisionCalculator(10)
+calculator.accumulate(p1, a1)
+calculator.accumulate(p2, a2)
+ap3 = calculator.peek_ap_at_n()
+```
+"""
+
+import heapq
+import random
+import numbers
+
+import numpy
+
+
+class AveragePrecisionCalculator(object):
+ """Calculate the average precision and average precision at n."""
+
+ def __init__(self, top_n=None):
+ """Construct an AveragePrecisionCalculator to calculate average precision.
+
+ This class is used to calculate the average precision for a single label.
+
+ Args:
+ top_n: A positive Integer specifying the average precision at n, or
+ None to use all provided data points.
+
+ Raises:
+ ValueError: An error occurred when the top_n is not a positive integer.
+ """
+        if not ((isinstance(top_n, int) and top_n > 0) or top_n is None):
+            raise ValueError("top_n must be a positive integer or None.")
+
+ self._top_n = top_n # average precision at n
+        self._total_positives = 0  # total number of positives seen so far
+        self._heap = []  # min heap of (prediction, actual)
+
+ @property
+ def heap_size(self):
+ """Gets the heap size maintained in the class."""
+ return len(self._heap)
+
+ @property
+ def num_accumulated_positives(self):
+ """Gets the number of positive samples that have been accumulated."""
+ return self._total_positives
+
+ def accumulate(self, predictions, actuals, num_positives=None):
+ """Accumulate the predictions and their ground truth labels.
+
+ After the function call, we may call peek_ap_at_n to actually calculate
+ the average precision.
+ Note predictions and actuals must have the same shape.
+
+ Args:
+ predictions: a list storing the prediction scores.
+ actuals: a list storing the ground truth labels. Any value
+ larger than 0 will be treated as positives, otherwise as negatives.
+      num_positives: If the 'predictions' and 'actuals' inputs aren't complete,
+ then it's possible some true positives were missed in them. In that case,
+ you can provide 'num_positives' in order to accurately track recall.
+
+ Raises:
+ ValueError: An error occurred when the format of the input is not the
+ numpy 1-D array or the shape of predictions and actuals does not match.
+ """
+ if len(predictions) != len(actuals):
+ raise ValueError(
+ "the shape of predictions and actuals does not match.")
+
+        if num_positives is not None:
+            if not isinstance(num_positives,
+                              numbers.Number) or num_positives < 0:
+                raise ValueError(
+                    "'num_positives' was provided but it was not a non-negative number."
+                )
+            self._total_positives += num_positives
+        else:
+            self._total_positives += numpy.size(numpy.where(actuals > 0))
+ topk = self._top_n
+ heap = self._heap
+
+ for i in range(numpy.size(predictions)):
+ if topk is None or len(heap) < topk:
+ heapq.heappush(heap, (predictions[i], actuals[i]))
+ else:
+ if predictions[i] > heap[0][0]: # heap[0] is the smallest
+ heapq.heappop(heap)
+ heapq.heappush(heap, (predictions[i], actuals[i]))
+
+ def clear(self):
+ """Clear the accumulated predictions."""
+ self._heap = []
+ self._total_positives = 0
+
+ def peek_ap_at_n(self):
+ """Peek the non-interpolated average precision at n.
+
+ Returns:
+ The non-interpolated average precision at n (default 0).
+ If n is larger than the length of the ranked list,
+ the average precision will be returned.
+ """
+ if self.heap_size <= 0:
+ return 0
+ predlists = numpy.array(list(zip(*self._heap)))
+
+ ap = self.ap_at_n(
+ predlists[0],
+ predlists[1],
+ n=self._top_n,
+ total_num_positives=self._total_positives)
+ return ap
+
+ @staticmethod
+ def ap(predictions, actuals):
+ """Calculate the non-interpolated average precision.
+
+ Args:
+ predictions: a numpy 1-D array storing the sparse prediction scores.
+ actuals: a numpy 1-D array storing the ground truth labels. Any value
+ larger than 0 will be treated as positives, otherwise as negatives.
+
+ Returns:
+ The non-interpolated average precision at n.
+ If n is larger than the length of the ranked list,
+ the average precision will be returned.
+
+ Raises:
+ ValueError: An error occurred when the format of the input is not the
+ numpy 1-D array or the shape of predictions and actuals does not match.
+ """
+ return AveragePrecisionCalculator.ap_at_n(predictions, actuals, n=None)
+
+ @staticmethod
+ def ap_at_n(predictions, actuals, n=20, total_num_positives=None):
+ """Calculate the non-interpolated average precision.
+
+ Args:
+ predictions: a numpy 1-D array storing the sparse prediction scores.
+ actuals: a numpy 1-D array storing the ground truth labels. Any value
+ larger than 0 will be treated as positives, otherwise as negatives.
+ n: the top n items to be considered in ap@n.
+      total_num_positives: (optional) the total number of positives in the
+        list. If specified, it will be used in the calculation.
+
+ Returns:
+ The non-interpolated average precision at n.
+ If n is larger than the length of the ranked list,
+ the average precision will be returned.
+
+ Raises:
+ ValueError: An error occurred when
+ 1) the format of the input is not the numpy 1-D array;
+ 2) the shape of predictions and actuals does not match;
+ 3) the input n is not a positive integer.
+ """
+ if len(predictions) != len(actuals):
+ raise ValueError(
+ "the shape of predictions and actuals does not match.")
+
+ if n is not None:
+ if not isinstance(n, int) or n <= 0:
+ raise ValueError("n must be 'None' or a positive integer."
+ " It was '%s'." % n)
+
+ ap = 0.0
+
+ predictions = numpy.array(predictions)
+ actuals = numpy.array(actuals)
+
+ # add a shuffler to avoid overestimating the ap
+ predictions, actuals = AveragePrecisionCalculator._shuffle(predictions,
+ actuals)
+ sortidx = sorted(
+ range(len(predictions)), key=lambda k: predictions[k], reverse=True)
+
+ if total_num_positives is None:
+ numpos = numpy.size(numpy.where(actuals > 0))
+ else:
+ numpos = total_num_positives
+
+ if numpos == 0:
+ return 0
+
+ if n is not None:
+ numpos = min(numpos, n)
+ delta_recall = 1.0 / numpos
+ poscount = 0.0
+
+ # calculate the ap
+ r = len(sortidx)
+ if n is not None:
+ r = min(r, n)
+ for i in range(r):
+ if actuals[sortidx[i]] > 0:
+ poscount += 1
+ ap += poscount / (i + 1) * delta_recall
+ return ap
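+
+    # Worked example (hypothetical scores): predictions [0.9, 0.8, 0.3] with
+    # actuals [1, 0, 1] and n=None give numpos=2, delta_recall=0.5 and hits
+    # at ranks 1 and 3, hence ap = (1/1)*0.5 + (2/3)*0.5 = 0.8333...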
+
+ @staticmethod
+ def _shuffle(predictions, actuals):
+ random.seed(0)
+ suffidx = random.sample(range(len(predictions)), len(predictions))
+ predictions = predictions[suffidx]
+ actuals = actuals[suffidx]
+ return predictions, actuals
+
+ @staticmethod
+ def _zero_one_normalize(predictions, epsilon=1e-7):
+ """Normalize the predictions to the range between 0.0 and 1.0.
+
+ For some predictions like SVM predictions, we need to normalize them before
+ calculate the interpolated average precision. The normalization will not
+ change the rank in the original list and thus won't change the average
+ precision.
+
+ Args:
+ predictions: a numpy 1-D array storing the sparse prediction scores.
+ epsilon: a small constant to avoid denominator being zero.
+
+ Returns:
+ The normalized prediction.
+ """
+        denominator = numpy.max(predictions) - numpy.min(predictions)
+        ret = (predictions - numpy.min(predictions)) / numpy.maximum(
+            denominator, epsilon)
+        return ret
diff --git a/PaddleCV/video/metrics/youtube8m/eval_util.py b/PaddleCV/video/metrics/youtube8m/eval_util.py
new file mode 100644
index 0000000000000000000000000000000000000000..f7742236f1176073eae84fdc7c3a3a1a2e294fe0
--- /dev/null
+++ b/PaddleCV/video/metrics/youtube8m/eval_util.py
@@ -0,0 +1,245 @@
+# Copyright 2016 Google Inc. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS-IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+"""Provides functions to help with evaluating models."""
+import datetime
+import numpy
+
+from . import mean_average_precision_calculator as map_calculator
+from . import average_precision_calculator as ap_calculator
+
+
+def flatten(l):
+ """ Merges a list of lists into a single list. """
+ return [item for sublist in l for item in sublist]
+
+
+def calculate_hit_at_one(predictions, actuals):
+ """Performs a local (numpy) calculation of the hit at one.
+
+ Args:
+ predictions: Matrix containing the outputs of the model.
+ Dimensions are 'batch' x 'num_classes'.
+ actuals: Matrix containing the ground truth labels.
+ Dimensions are 'batch' x 'num_classes'.
+
+ Returns:
+ float: The average hit at one across the entire batch.
+ """
+ top_prediction = numpy.argmax(predictions, 1)
+ hits = actuals[numpy.arange(actuals.shape[0]), top_prediction]
+ return numpy.average(hits)
+
+
+def calculate_precision_at_equal_recall_rate(predictions, actuals):
+ """Performs a local (numpy) calculation of the PERR.
+
+ Args:
+ predictions: Matrix containing the outputs of the model.
+ Dimensions are 'batch' x 'num_classes'.
+ actuals: Matrix containing the ground truth labels.
+ Dimensions are 'batch' x 'num_classes'.
+
+ Returns:
+ float: The average precision at equal recall rate across the entire batch.
+ """
+ aggregated_precision = 0.0
+ num_videos = actuals.shape[0]
+ for row in numpy.arange(num_videos):
+ num_labels = int(numpy.sum(actuals[row]))
+ top_indices = numpy.argpartition(predictions[row],
+ -num_labels)[-num_labels:]
+ item_precision = 0.0
+ for label_index in top_indices:
+ if predictions[row][label_index] > 0:
+ item_precision += actuals[row][label_index]
+ item_precision /= top_indices.size
+ aggregated_precision += item_precision
+ aggregated_precision /= num_videos
+ return aggregated_precision
+
+
+def calculate_gap(predictions, actuals, top_k=20):
+ """Performs a local (numpy) calculation of the global average precision.
+
+ Only the top_k predictions are taken for each of the videos.
+
+ Args:
+ predictions: Matrix containing the outputs of the model.
+ Dimensions are 'batch' x 'num_classes'.
+ actuals: Matrix containing the ground truth labels.
+ Dimensions are 'batch' x 'num_classes'.
+ top_k: How many predictions to use per video.
+
+ Returns:
+ float: The global average precision.
+ """
+ gap_calculator = ap_calculator.AveragePrecisionCalculator()
+ sparse_predictions, sparse_labels, num_positives = top_k_by_class(
+ predictions, actuals, top_k)
+ gap_calculator.accumulate(
+ flatten(sparse_predictions), flatten(sparse_labels), sum(num_positives))
+ return gap_calculator.peek_ap_at_n()
+
+
+def top_k_by_class(predictions, labels, k=20):
+ """Extracts the top k predictions for each video, sorted by class.
+
+ Args:
+    predictions: A numpy matrix containing the outputs of the model.
+      Dimensions are 'batch' x 'num_classes'.
+    labels: A numpy matrix containing the ground truth labels.
+      Dimensions are 'batch' x 'num_classes'.
+    k: the top k non-zero entries to preserve in each prediction.
+
+ Returns:
+ A tuple (predictions,labels, true_positives). 'predictions' and 'labels'
+ are lists of lists of floats. 'true_positives' is a list of scalars. The
+ length of the lists are equal to the number of classes. The entries in the
+ predictions variable are probability predictions, and
+ the corresponding entries in the labels variable are the ground truth for
+ those predictions. The entries in 'true_positives' are the number of true
+ positives for each class in the ground truth.
+
+ Raises:
+ ValueError: An error occurred when the k is not a positive integer.
+ """
+ if k <= 0:
+ raise ValueError("k must be a positive integer.")
+ k = min(k, predictions.shape[1])
+ num_classes = predictions.shape[1]
+ prediction_triplets = []
+ for video_index in range(predictions.shape[0]):
+ prediction_triplets.extend(
+ top_k_triplets(predictions[video_index], labels[video_index], k))
+ out_predictions = [[] for v in range(num_classes)]
+ out_labels = [[] for v in range(num_classes)]
+ for triplet in prediction_triplets:
+ out_predictions[triplet[0]].append(triplet[1])
+ out_labels[triplet[0]].append(triplet[2])
+ out_true_positives = [numpy.sum(labels[:, i]) for i in range(num_classes)]
+
+ return out_predictions, out_labels, out_true_positives
+
+
+def top_k_triplets(predictions, labels, k=20):
+ """Get the top_k for a 1-d numpy array. Returns a sparse list of tuples in
+ (prediction, class) format"""
+ m = len(predictions)
+ k = min(k, m)
+ indices = numpy.argpartition(predictions, -k)[-k:]
+ return [(index, predictions[index], labels[index]) for index in indices]
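+
+# For example (hypothetical values), top_k_triplets(numpy.array([0.1, 0.9, 0.4]),
+# numpy.array([0, 1, 0]), k=2) returns [(2, 0.4, 0), (1, 0.9, 1)]; argpartition
+# does not guarantee the order of the returned top-k entries.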
+
+
+class EvaluationMetrics(object):
+ """A class to store the evaluation metrics."""
+
+ def __init__(self, num_class, top_k):
+ """Construct an EvaluationMetrics object to store the evaluation metrics.
+
+ Args:
+ num_class: A positive integer specifying the number of classes.
+ top_k: A positive integer specifying how many predictions are considered per video.
+
+ Raises:
+      ValueError: An error occurred when MeanAveragePrecisionCalculator
+        cannot be constructed.
+ """
+ self.sum_hit_at_one = 0.0
+ self.sum_perr = 0.0
+ self.sum_loss = 0.0
+ self.map_calculator = map_calculator.MeanAveragePrecisionCalculator(
+ num_class)
+ self.global_ap_calculator = ap_calculator.AveragePrecisionCalculator()
+ self.top_k = top_k
+ self.num_examples = 0
+
+ def accumulate(self, loss, predictions, labels):
+ """Accumulate the metrics calculated locally for this mini-batch.
+
+ Args:
+ predictions: A numpy matrix containing the outputs of the model.
+ Dimensions are 'batch' x 'num_classes'.
+ labels: A numpy matrix containing the ground truth labels.
+ Dimensions are 'batch' x 'num_classes'.
+ loss: A numpy array containing the loss for each sample.
+
+ Returns:
+ dictionary: A dictionary storing the metrics for the mini-batch.
+
+ Raises:
+ ValueError: An error occurred when the shape of predictions and actuals
+ does not match.
+ """
+ batch_size = labels.shape[0]
+ mean_hit_at_one = calculate_hit_at_one(predictions, labels)
+ mean_perr = calculate_precision_at_equal_recall_rate(predictions,
+ labels)
+ mean_loss = numpy.mean(loss)
+
+ # Take the top 20 predictions.
+ sparse_predictions, sparse_labels, num_positives = top_k_by_class(
+ predictions, labels, self.top_k)
+ self.map_calculator.accumulate(sparse_predictions, sparse_labels,
+ num_positives)
+ self.global_ap_calculator.accumulate(
+ flatten(sparse_predictions),
+ flatten(sparse_labels), sum(num_positives))
+
+ self.num_examples += batch_size
+ self.sum_hit_at_one += mean_hit_at_one * batch_size
+ self.sum_perr += mean_perr * batch_size
+ self.sum_loss += mean_loss * batch_size
+
+ return {
+ "hit_at_one": mean_hit_at_one,
+ "perr": mean_perr,
+ "loss": mean_loss
+ }
+
+ def get(self):
+ """Calculate the evaluation metrics for the whole epoch.
+
+ Raises:
+ ValueError: If no examples were accumulated.
+
+ Returns:
+ dictionary: a dictionary storing the evaluation metrics for the epoch. The
+ dictionary has the fields: avg_hit_at_one, avg_perr, avg_loss, and
+ aps (default nan).
+ """
+ if self.num_examples <= 0:
+ raise ValueError("total_sample must be positive.")
+ avg_hit_at_one = self.sum_hit_at_one / self.num_examples
+ avg_perr = self.sum_perr / self.num_examples
+ avg_loss = self.sum_loss / self.num_examples
+
+ aps = self.map_calculator.peek_map_at_n()
+ gap = self.global_ap_calculator.peek_ap_at_n()
+
+ return {
+ "avg_hit_at_one": avg_hit_at_one,
+ "avg_perr": avg_perr,
+ "avg_loss": avg_loss,
+ "aps": aps,
+ "gap": gap
+ }
+
+ def clear(self):
+ """Clear the evaluation metrics and reset the EvaluationMetrics object."""
+ self.sum_hit_at_one = 0.0
+ self.sum_perr = 0.0
+ self.sum_loss = 0.0
+ self.map_calculator.clear()
+ self.global_ap_calculator.clear()
+ self.num_examples = 0
diff --git a/PaddleCV/video/metrics/youtube8m/mean_average_precision_calculator.py b/PaddleCV/video/metrics/youtube8m/mean_average_precision_calculator.py
new file mode 100644
index 0000000000000000000000000000000000000000..0ae8b0ed3717aba13b7ed35b4af025be40423967
--- /dev/null
+++ b/PaddleCV/video/metrics/youtube8m/mean_average_precision_calculator.py
@@ -0,0 +1,114 @@
+# Copyright 2016 Google Inc. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS-IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+"""Calculate the mean average precision.
+
+It provides an interface for calculating mean average precision
+for an entire list or the top-n ranked items.
+
+Example usages:
+We first call the function accumulate many times to process parts of the ranked
+list. After processing all the parts, we call peek_map_at_n
+to calculate the mean average precision.
+
+```
+import random
+import numpy as np
+
+p = np.array([[random.random() for _ in range(50)] for _ in range(1000)])
+a = np.array([[random.choice([0, 1]) for _ in range(50)]
+              for _ in range(1000)])
+
+# mean average precision for 50 classes.
+calculator = mean_average_precision_calculator.MeanAveragePrecisionCalculator(
+ num_class=50)
+calculator.accumulate(p, a)
+aps = calculator.peek_map_at_n()
+```
+"""
+
+import numpy
+from . import average_precision_calculator
+
+
+class MeanAveragePrecisionCalculator(object):
+ """This class is to calculate mean average precision.
+ """
+
+ def __init__(self, num_class):
+ """Construct a calculator to calculate the (macro) average precision.
+
+        Args:
+            num_class: An integer larger than 1 specifying the number of classes.
+
+        Raises:
+            ValueError: An error occurred when num_class is not an integer
+                larger than 1.
+        """
+        if not isinstance(num_class, int) or num_class <= 1:
+            raise ValueError("num_class must be an integer larger than 1.")
+
+        self._ap_calculators = []  # one AveragePrecisionCalculator per class
+        self._num_class = num_class  # total number of classes
+ for i in range(num_class):
+ self._ap_calculators.append(
+ average_precision_calculator.AveragePrecisionCalculator())
+
+ def accumulate(self, predictions, actuals, num_positives=None):
+ """Accumulate the predictions and their ground truth labels.
+
+ Args:
+ predictions: A list of lists storing the prediction scores. The outer
+ dimension corresponds to classes.
+ actuals: A list of lists storing the ground truth labels. The dimensions
+ should correspond to the predictions input. Any value
+ larger than 0 will be treated as positives, otherwise as negatives.
+ num_positives: If provided, it is a list of numbers representing the
+ number of true positives for each class. If not provided, the number of
+ true positives will be inferred from the 'actuals' array.
+
+ Raises:
+ ValueError: An error occurred when the shape of predictions and actuals
+ does not match.
+ """
+        if not num_positives:
+            num_positives = [None] * len(predictions)
+
+ calculators = self._ap_calculators
+ for i in range(len(predictions)):
+ calculators[i].accumulate(predictions[i], actuals[i],
+ num_positives[i])
+
+ def clear(self):
+ for calculator in self._ap_calculators:
+ calculator.clear()
+
+ def is_empty(self):
+ return ([calculator.heap_size for calculator in self._ap_calculators] ==
+ [0 for _ in range(self._num_class)])
+
+ def peek_map_at_n(self):
+ """Peek the non-interpolated mean average precision at n.
+
+ Returns:
+ An array of non-interpolated average precision at n (default 0) for each
+ class.
+ """
+ aps = [
+ self._ap_calculators[i].peek_ap_at_n()
+ for i in range(self._num_class)
+ ]
+ return aps
diff --git a/PaddleCV/video/models/__init__.py b/PaddleCV/video/models/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..72ee303f343a5e6ceb39917599fc905b145acb94
--- /dev/null
+++ b/PaddleCV/video/models/__init__.py
@@ -0,0 +1,17 @@
+from .model import regist_model, get_model
+from .attention_cluster import AttentionCluster
+from .attention_lstm import AttentionLSTM
+from .nextvlad import NEXTVLAD
+from .nonlocal_model import NonLocal
+from .tsm import TSM
+from .tsn import TSN
+from .stnet import STNET
+
+# regist models, sort by alphabet
+regist_model("AttentionCluster", AttentionCluster)
+regist_model("AttentionLSTM", AttentionLSTM)
+regist_model("NEXTVLAD", NEXTVLAD)
+regist_model("NONLOCAL", NonLocal)
+regist_model("TSM", TSM)
+regist_model("TSN", TSN)
+regist_model("STNET", STNET)
diff --git a/PaddleCV/video/models/attention_cluster/README.md b/PaddleCV/video/models/attention_cluster/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..6314f2d027430e8466b5d9aeefecab2772a91a79
--- /dev/null
+++ b/PaddleCV/video/models/attention_cluster/README.md
@@ -0,0 +1,104 @@
+# Attention Cluster Video Classification Model
+
+---
+## Table of Contents
+
+- [Introduction](#introduction)
+- [Data preparation](#data-preparation)
+- [Training](#training)
+- [Evaluation](#evaluation)
+- [Inference](#inference)
+- [Reference](#reference)
+
+
+## Introduction
+
+Attention Cluster was the best sequence model in the ActivityNet Kinetics Challenge 2017. It processes pre-extracted RGB, flow and audio features with attention clusters equipped with a shifting operation; the structure of the attention cluster is shown below.
+
+
+
+Multimodal Attention Cluster with Shifting Operation
+
+
+The shifting operation applies an independent learnable linear transformation, followed by L2-normalization, to the output of each attention unit. This encourages the attention units to focus on different components of the features, so that the attention cluster can better model data from different distributions and the representational power of the whole network improves.
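+
+In symbols (a sketch that follows the implementation in `models/attention_cluster/shifting_attention.py` added later in this change, where `w_kt` are the attention weights of unit `k`, `a_k` and `b_k` its learnable shift parameters, `x_t` the frame features, and `C` the number of attention units):
+
+    v_k = L2Normalize(a_k * sum_t(w_kt * x_t) + b_k) / sqrt(C)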
+
+For details please refer to [Attention Clusters: Purely Attention Based Local Feature Integration for Video Classification](https://arxiv.org/abs/1711.09550).
+
+## Data preparation
+
+Attention Cluster uses the 2nd-YouTube-8M dataset. For downloading and preparing the data, please refer to the [data instructions](../../dataset/README.md).
+
+## Training
+
+Once the data is ready, training can be started in either of the following two ways:
+
+    python train.py --model_name=AttentionCluster \
+                    --config=./configs/attention_cluster.txt \
+                    --save_dir=checkpoints \
+                    --log_interval=10 \
+                    --valid_interval=1
+
+    bash scripts/train/train_attention_cluster.sh
+
+- You can download the published [model](https://paddlemodels.bj.bcebos.com/video_classification/attention_cluster_youtube8m.tar.gz) and pass the path of the weights via `--resume` for finetuning and other development.
+
+**Data reader:** The model reads the pre-extracted `rgb` and `audio` features of the YouTube-8M dataset. For each video, 100 frames are sampled uniformly; this value is set by the `seg_num` parameter in the config file.
+
+**Model settings:** The main configurable parameters are `cluster_nums` and `seg_num`. With `cluster_nums` set to 32 and `seg_num` set to 100, a single Nvidia Tesla P40 card can run with `batch_size=256`.
+
+**Training strategy:**
+
+* Adam optimizer with an initial learning\_rate of 0.001.
+* No weight decay is used during training.
+* Parameters are mainly initialized with MSRA initialization.
+
+## Evaluation
+
+The model can be evaluated in either of the following two ways:
+
+    python test.py --model_name=AttentionCluster \
+                   --config=configs/attention_cluster.txt \
+                   --log_interval=1 \
+                   --weights=$PATH_TO_WEIGHTS
+
+    bash scripts/test/test_attention_cluster.sh
+
+- When evaluating with `scripts/test/test_attention_cluster.sh`, modify the `--weights` argument in the script to point to the weights to be evaluated.
+
+- If `--weights` is not specified, the script downloads the published [model](https://paddlemodels.bj.bcebos.com/video_classification/attention_cluster_youtube8m.tar.gz) and evaluates it.
+
+With the following settings:
+
+| Parameter | Value |
+| :---------: | :----: |
+| cluster\_nums | 32 |
+| seg\_num | 100 |
+| batch\_size | 2048 |
+| nums\_gpu | 7 |
+
+the evaluation accuracy on the 2nd-YouTube-8M dataset is as follows:
+
+
+| Metric | Accuracy |
+| :---------: | :----: |
+| Hit@1 | 0.87 |
+| PERR | 0.78 |
+| GAP | 0.84 |
+
+## Inference
+
+Run inference with the following command:
+
+    python infer.py --model_name=AttentionCluster \
+                    --config=configs/attention_cluster.txt \
+                    --log_interval=1 \
+                    --weights=$PATH_TO_WEIGHTS \
+                    --filelist=$FILELIST
+
+- Inference results are stored in `AttentionCluster_infer_result` in `pickle` format.
+
+- If `--weights` is not specified, the script downloads the published [model](https://paddlemodels.bj.bcebos.com/video_classification/attention_cluster_youtube8m.tar.gz) for inference.
+
+## Reference
+
+- [Attention Clusters: Purely Attention Based Local Feature Integration for Video Classification](https://arxiv.org/abs/1711.09550), Xiang Long, Chuang Gan, Gerard de Melo, Jiajun Wu, Xiao Liu, Shilei Wen
diff --git a/PaddleCV/video/models/attention_cluster/__init__.py b/PaddleCV/video/models/attention_cluster/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..bd7ef3d595d3b3d286d765a24a0c44c4b29dc6c6
--- /dev/null
+++ b/PaddleCV/video/models/attention_cluster/__init__.py
@@ -0,0 +1,3 @@
+from __future__ import absolute_import
+
+from .attention_cluster import *
diff --git a/PaddleCV/video/models/attention_cluster/attention_cluster.py b/PaddleCV/video/models/attention_cluster/attention_cluster.py
new file mode 100755
index 0000000000000000000000000000000000000000..84282544c95b21231a043e22f2c8dadd25579e8f
--- /dev/null
+++ b/PaddleCV/video/models/attention_cluster/attention_cluster.py
@@ -0,0 +1,139 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle.fluid as fluid
+from paddle.fluid import ParamAttr
+
+from ..model import ModelBase
+from .shifting_attention import ShiftingAttentionModel
+from .logistic_model import LogisticModel
+
+__all__ = ["AttentionCluster"]
+
+
+class AttentionCluster(ModelBase):
+ def __init__(self, name, cfg, mode='train'):
+ super(AttentionCluster, self).__init__(name, cfg, mode)
+ self.get_config()
+
+ def get_config(self):
+ # get model configs
+ self.feature_num = self.cfg.MODEL.feature_num
+ self.feature_names = self.cfg.MODEL.feature_names
+ self.feature_dims = self.cfg.MODEL.feature_dims
+ self.cluster_nums = self.cfg.MODEL.cluster_nums
+ self.seg_num = self.cfg.MODEL.seg_num
+ self.class_num = self.cfg.MODEL.num_classes
+ self.drop_rate = self.cfg.MODEL.drop_rate
+
+ if self.mode == 'train':
+ self.learning_rate = self.get_config_from_sec('train',
+ 'learning_rate', 1e-3)
+
+ def build_input(self, use_pyreader):
+ if use_pyreader:
+            assert self.mode != 'infer', \
+                'pyreader is not recommended for inference, please set use_pyreader to False.'
+ shapes = []
+ for dim in self.feature_dims:
+ shapes.append([-1, self.seg_num, dim])
+ shapes.append([-1, self.class_num]) # label
+ self.py_reader = fluid.layers.py_reader(
+ capacity=1024,
+ shapes=shapes,
+ lod_levels=[0] * (self.feature_num + 1),
+ dtypes=['float32'] * (self.feature_num + 1),
+ name='train_py_reader'
+ if self.is_training else 'test_py_reader',
+ use_double_buffer=True)
+ inputs = fluid.layers.read_file(self.py_reader)
+ self.feature_input = inputs[:self.feature_num]
+ self.label_input = inputs[-1]
+ else:
+ self.feature_input = []
+ for name, dim in zip(self.feature_names, self.feature_dims):
+ self.feature_input.append(
+ fluid.layers.data(
+ shape=[self.seg_num, dim], dtype='float32', name=name))
+ if self.mode == 'infer':
+ self.label_input = None
+ else:
+ self.label_input = fluid.layers.data(
+ shape=[self.class_num], dtype='float32', name='label')
+
+ def build_model(self):
+ att_outs = []
+ for i, (input_dim, cluster_num, feature) in enumerate(
+ zip(self.feature_dims, self.cluster_nums, self.feature_input)):
+ att = ShiftingAttentionModel(input_dim, self.seg_num, cluster_num,
+ "satt{}".format(i))
+ att_out = att.forward(feature)
+ att_outs.append(att_out)
+ out = fluid.layers.concat(att_outs, axis=1)
+
+ if self.drop_rate > 0.:
+ out = fluid.layers.dropout(
+ out, self.drop_rate, is_test=(not self.is_training))
+
+ fc1 = fluid.layers.fc(
+ out,
+ size=1024,
+ act='tanh',
+ param_attr=ParamAttr(
+ name="fc1.weights",
+ initializer=fluid.initializer.MSRA(uniform=False)),
+ bias_attr=ParamAttr(
+ name="fc1.bias", initializer=fluid.initializer.MSRA()))
+ fc2 = fluid.layers.fc(
+ fc1,
+ size=4096,
+ act='tanh',
+ param_attr=ParamAttr(
+ name="fc2.weights",
+ initializer=fluid.initializer.MSRA(uniform=False)),
+ bias_attr=ParamAttr(
+ name="fc2.bias", initializer=fluid.initializer.MSRA()))
+
+ aggregate_model = LogisticModel()
+
+ self.output, self.logit = aggregate_model.build_model(
+ model_input=fc2,
+ vocab_size=self.class_num,
+ is_training=self.is_training)
+
+ def optimizer(self):
+ assert self.mode == 'train', "optimizer only can be get in train mode"
+ return fluid.optimizer.AdamOptimizer(self.learning_rate)
+
+ def loss(self):
+ assert self.mode != 'infer', "invalid loss calculationg in infer mode"
+ cost = fluid.layers.sigmoid_cross_entropy_with_logits(
+ x=self.logit, label=self.label_input)
+ cost = fluid.layers.reduce_sum(cost, dim=-1)
+ self.loss_ = fluid.layers.mean(x=cost)
+ return self.loss_
+
+ def outputs(self):
+ return [self.output, self.logit]
+
+ def feeds(self):
+ return self.feature_input if self.mode == 'infer' else self.feature_input + [
+ self.label_input
+ ]
+
+ def weights_info(self):
+ return (
+ "attention_cluster_youtube8m",
+ "https://paddlemodels.bj.bcebos.com/video_classification/attention_cluster_youtube8m.tar.gz"
+ )
diff --git a/PaddleCV/video/models/attention_cluster/logistic_model.py b/PaddleCV/video/models/attention_cluster/logistic_model.py
new file mode 100755
index 0000000000000000000000000000000000000000..6fad2a44ffc7df2049eeb04341a88b9c342c70ce
--- /dev/null
+++ b/PaddleCV/video/models/attention_cluster/logistic_model.py
@@ -0,0 +1,47 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle
+import paddle.fluid as fluid
+
+
+class LogisticModel(object):
+ """Logistic model."""
+ def build_model(self,
+ model_input,
+ vocab_size,
+ **unused_params):
+ """Creates a logistic model.
+
+ Args:
+ model_input: 'batch' x 'num_features' matrix of input features.
+ vocab_size: The number of classes in the dataset.
+
+        Returns:
+            A tuple (output, logit), where 'output' holds the sigmoid
+            probability predictions of the model and 'logit' the raw
+            pre-activation scores. The dimensions of both tensors are
+            batch_size x num_classes."""
+ logit = fluid.layers.fc(
+ input=model_input,
+ size=vocab_size,
+ act=None,
+ name='logits_clf',
+ param_attr=fluid.ParamAttr(
+ name='logistic.weights',
+ initializer=fluid.initializer.MSRA(uniform=False)),
+ bias_attr=fluid.ParamAttr(
+ name='logistic.bias',
+ initializer=fluid.initializer.MSRA(uniform=False)))
+ output = fluid.layers.sigmoid(logit)
+ return output, logit
diff --git a/PaddleCV/video/models/attention_cluster/shifting_attention.py b/PaddleCV/video/models/attention_cluster/shifting_attention.py
new file mode 100755
index 0000000000000000000000000000000000000000..e27ad8dd58b882eb96fbb9763eecccc36ddfe28a
--- /dev/null
+++ b/PaddleCV/video/models/attention_cluster/shifting_attention.py
@@ -0,0 +1,95 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle.fluid as fluid
+from paddle.fluid import ParamAttr
+import numpy as np
+
+
+class ShiftingAttentionModel(object):
+ """Shifting Attention Model"""
+
+ def __init__(self, input_dim, seg_num, n_att, name):
+ self.n_att = n_att
+ self.input_dim = input_dim
+ self.seg_num = seg_num
+ self.name = name
+ self.gnorm = np.sqrt(n_att)
+
+ def softmax_m1(self, x):
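+        # softmax over the segment axis: flatten to [N*C, L], apply softmax,
+        # then restore the original (possibly dynamic) shape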
+ x_shape = fluid.layers.shape(x)
+ x_shape.stop_gradient = True
+ flat_x = fluid.layers.reshape(x, shape=(-1, self.seg_num))
+ flat_softmax = fluid.layers.softmax(flat_x)
+ return fluid.layers.reshape(
+ flat_softmax, shape=x.shape, actual_shape=x_shape)
+
+ def glorot(self, n):
+ return np.sqrt(1.0 / np.sqrt(n))
+
+ def forward(self, x):
+ """Forward shifting attention model.
+
+ Args:
+ x: input features in shape of [N, L, F].
+
+ Returns:
+ out: output features in shape of [N, F * C]
+ """
+
+ trans_x = fluid.layers.transpose(x, perm=[0, 2, 1])
+ # scores and weight in shape [N, C, L], sum(weights, -1) = 1
+ trans_x = fluid.layers.unsqueeze(trans_x, [-1])
+ scores = fluid.layers.conv2d(
+ trans_x,
+ self.n_att,
+ filter_size=1,
+ param_attr=ParamAttr(
+ name=self.name + ".conv.weight",
+ initializer=fluid.initializer.MSRA(uniform=False)),
+ bias_attr=ParamAttr(
+ name=self.name + ".conv.bias",
+ initializer=fluid.initializer.MSRA()))
+ scores = fluid.layers.squeeze(scores, [-1])
+ weights = self.softmax_m1(scores)
+
+ glrt = self.glorot(self.n_att)
+ self.w = fluid.layers.create_parameter(
+ shape=(self.n_att, ),
+ dtype=x.dtype,
+ attr=ParamAttr(self.name + ".shift_w"),
+ default_initializer=fluid.initializer.Normal(0.0, glrt))
+ self.b = fluid.layers.create_parameter(
+ shape=(self.n_att, ),
+ dtype=x.dtype,
+ attr=ParamAttr(name=self.name + ".shift_b"),
+ default_initializer=fluid.initializer.Normal(0.0, glrt))
+
+ outs = []
+ for i in range(self.n_att):
+ # slice weight and expand to shape [N, L, C]
+ weight = fluid.layers.slice(
+ weights, axes=[1], starts=[i], ends=[i + 1])
+ weight = fluid.layers.transpose(weight, perm=[0, 2, 1])
+ weight = fluid.layers.expand(weight, [1, 1, self.input_dim])
+
+ w_i = fluid.layers.slice(self.w, axes=[0], starts=[i], ends=[i + 1])
+ b_i = fluid.layers.slice(self.b, axes=[0], starts=[i], ends=[i + 1])
+ shift = fluid.layers.reduce_sum(x * weight, dim=1) * w_i + b_i
+
+ l2_norm = fluid.layers.l2_normalize(shift, axis=-1)
+ outs.append(l2_norm / self.gnorm)
+
+ out = fluid.layers.concat(outs, axis=1)
+ return out
diff --git a/PaddleCV/video/models/attention_lstm/README.md b/PaddleCV/video/models/attention_lstm/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..50a835b7d7e9736d5ee23f17967d187392257303
--- /dev/null
+++ b/PaddleCV/video/models/attention_lstm/README.md
@@ -0,0 +1,92 @@
+# AttentionLSTM Video Classification Model
+
+---
+## Table of Contents
+
+- [Model Overview](#model-overview)
+- [Data Preparation](#data-preparation)
+- [Training](#training)
+- [Evaluation](#evaluation)
+- [Inference](#inference)
+- [Reference Papers](#reference-papers)
+
+## Model Overview
+
+Recurrent neural networks (RNNs) are a standard tool for sequence data and can model the temporal information across consecutive video frames, which makes them a common baseline for video classification. This model encodes all frame features of a video with a bidirectional long short-term memory network (LSTM). Unlike traditional approaches that directly take the LSTM output at the last time step, it adds an attention layer: the hidden state at every time step receives an adaptive weight, and the final feature vector is their linear weighted combination. The paper implements a two-layer LSTM, while this code implements a bidirectional LSTM with attention; for the attention layer, see [AttentionCluster](https://arxiv.org/abs/1711.09550).
+
+For details, please refer to [Beyond Short Snippets: Deep Networks for Video Classification](https://arxiv.org/abs/1503.08909).
+
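+The attention pooling at the core of this model can be written in a few lines. Below is a minimal numpy sketch of the idea; the sizes are hypothetical and the random arrays stand in for learned parameters (the real model applies this to bi-LSTM hidden states):
+
+    import numpy as np
+
+    def softmax(x):
+        e = np.exp(x - x.max())
+        return e / e.sum()
+
+    # attention pooling over per-step hidden states
+    T, H = 120, 2048                  # T time steps, H = hidden size (hypothetical)
+    h = np.random.randn(T, H)         # hidden state of every LSTM step
+    w = np.random.randn(H)            # attention projection (learned in practice)
+    alpha = softmax(h @ w)            # one adaptive weight per time step
+    video_feature = alpha @ h         # weighted sum -> final feature vector
+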
+## Data Preparation
+
+The AttentionLSTM model uses the 2nd-YouTube-8M dataset. For data preparation, please refer to the [data instructions](../../dataset/README.md)
+
+## Training
+
+### Training from random initialization
+
+Once the data is ready, training can be started in either of the following two ways:
+
+ python train.py --model_name=AttentionLSTM
+ --config=./configs/attention_lstm.txt
+ --save_dir=checkpoints
+ --log_interval=10
+ --valid_interval=1
+
+ bash scripts/train/train_attention_lstm.sh
+
+- The AttentionLSTM model was trained on 8 Nvidia Tesla P40 cards with a total batch size of 1024.
+
+### Finetuning from a pretrained model
+Please download the provided [model](https://paddlemodels.bj.bcebos.com/video_classification/attention_lstm_youtube8m.tar.gz) first, then add `--resume` in the scripts above, pointing it to the path where the pretrained model is stored.
+
+## Evaluation
+The model can be evaluated in either of the following two ways:
+
+ python test.py --model_name=AttentionLSTM
+ --config=configs/attention_lstm.txt
+ --log_interval=1
+ --weights=$PATH_TO_WEIGHTS
+
+ bash scripts/test/test_attention_lstm.sh
+
+- When evaluating with `scripts/test/test_attention_lstm.sh`, modify the `--weights` argument in the script to point to the weights to be evaluated.
+
+- If `--weights` is not specified, the script downloads the released [model](https://paddlemodels.bj.bcebos.com/video_classification/attention_lstm_youtube8m.tar.gz) and evaluates it.
+
+The model parameters are listed below:
+
+| Parameter | Value |
+| :---------: | :----: |
+| embedding\_size | 512 |
+| lstm\_size | 1024 |
+| drop\_rate | 0.5 |
+
+
+The evaluation metrics are listed below:
+
+| Metric | Accuracy |
+| :---------: | :----: |
+| Hit@1 | 0.8885 |
+| PERR | 0.8012 |
+| GAP | 0.8594 |
+
+
+## Inference
+
+Inference can be run with the following command:
+
+ python infer.py --model_name=attention_lstm
+ --config=configs/attention_lstm.txt
+ --log_interval=1
+ --weights=$PATH_TO_WEIGHTS
+ --filelist=$FILELIST
+
+- Inference results are saved to `AttentionLSTM_infer_result` in `pickle` format.
+
+- If `--weights` is not specified, the script downloads the released [model](https://paddlemodels.bj.bcebos.com/video_classification/attention_lstm_youtube8m.tar.gz) and uses it for inference.
+
+## Reference Papers
+
+- [Beyond Short Snippets: Deep Networks for Video Classification](https://arxiv.org/abs/1503.08909) Joe Yue-Hei Ng, Matthew Hausknecht, Sudheendra Vijayanarasimhan, Oriol Vinyals, Rajat Monga, George Toderici
+
+- [Attention Clusters: Purely Attention Based Local Feature Integration for Video Classification](https://arxiv.org/abs/1711.09550), Xiang Long, Chuang Gan, Gerard de Melo, Jiajun Wu, Xiao Liu, Shilei Wen
diff --git a/PaddleCV/video/models/attention_lstm/__init__.py b/PaddleCV/video/models/attention_lstm/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..cb872f0e43ab52054b42970896e5791a0eeb691d
--- /dev/null
+++ b/PaddleCV/video/models/attention_lstm/__init__.py
@@ -0,0 +1 @@
+from .attention_lstm import *
diff --git a/PaddleCV/video/models/attention_lstm/attention_lstm.py b/PaddleCV/video/models/attention_lstm/attention_lstm.py
new file mode 100755
index 0000000000000000000000000000000000000000..5d28dc47e297ba76e2b07fd69de2efa3ae6ccb0f
--- /dev/null
+++ b/PaddleCV/video/models/attention_lstm/attention_lstm.py
@@ -0,0 +1,151 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle.fluid as fluid
+from paddle.fluid import ParamAttr
+
+from ..model import ModelBase
+from .lstm_attention import LSTMAttentionModel
+
+__all__ = ["AttentionLSTM"]
+
+
+class AttentionLSTM(ModelBase):
+ def __init__(self, name, cfg, mode='train'):
+ super(AttentionLSTM, self).__init__(name, cfg, mode)
+ self.get_config()
+
+ def get_config(self):
+ # get model configs
+ self.feature_num = self.cfg.MODEL.feature_num
+ self.feature_names = self.cfg.MODEL.feature_names
+ self.feature_dims = self.cfg.MODEL.feature_dims
+ self.num_classes = self.cfg.MODEL.num_classes
+ self.embedding_size = self.cfg.MODEL.embedding_size
+ self.lstm_size = self.cfg.MODEL.lstm_size
+ self.drop_rate = self.cfg.MODEL.drop_rate
+
+ # get mode configs
+ self.batch_size = self.get_config_from_sec(self.mode, 'batch_size', 1)
+ self.num_gpus = self.get_config_from_sec(self.mode, 'num_gpus', 1)
+
+ if self.mode == 'train':
+ self.learning_rate = self.get_config_from_sec('train',
+ 'learning_rate', 1e-3)
+ self.weight_decay = self.get_config_from_sec('train',
+ 'weight_decay', 8e-4)
+ self.num_samples = self.get_config_from_sec('train', 'num_samples',
+ 5000000)
+ self.decay_epochs = self.get_config_from_sec('train',
+ 'decay_epochs', [5])
+ self.decay_gamma = self.get_config_from_sec('train', 'decay_gamma',
+ 0.1)
+
+ def build_input(self, use_pyreader):
+ if use_pyreader:
+ assert self.mode != 'infer', \
+                'pyreader is not recommended for inference, please set use_pyreader to False.'
+ shapes = []
+ for dim in self.feature_dims:
+ shapes.append([-1, dim])
+ shapes.append([-1, self.num_classes]) # label
+ self.py_reader = fluid.layers.py_reader(
+ capacity=1024,
+ shapes=shapes,
+ lod_levels=[1] * self.feature_num + [0],
+ dtypes=['float32'] * (self.feature_num + 1),
+ name='train_py_reader'
+ if self.is_training else 'test_py_reader',
+ use_double_buffer=True)
+ inputs = fluid.layers.read_file(self.py_reader)
+ self.feature_input = inputs[:self.feature_num]
+ self.label_input = inputs[-1]
+ else:
+ self.feature_input = []
+ for name, dim in zip(self.feature_names, self.feature_dims):
+ self.feature_input.append(
+ fluid.layers.data(
+ shape=[dim], lod_level=1, dtype='float32', name=name))
+ if self.mode == 'infer':
+ self.label_input = None
+ else:
+ self.label_input = fluid.layers.data(
+ shape=[self.num_classes], dtype='float32', name='label')
+
+ def build_model(self):
+ att_outs = []
+ for i, (input_dim, feature
+ ) in enumerate(zip(self.feature_dims, self.feature_input)):
+ att = LSTMAttentionModel(input_dim, self.embedding_size,
+ self.lstm_size, self.drop_rate)
+ att_out = att.forward(feature, is_training=(self.mode == 'train'))
+ att_outs.append(att_out)
+ out = fluid.layers.concat(att_outs, axis=1)
+
+ fc1 = fluid.layers.fc(
+ input=out,
+ size=8192,
+ act='relu',
+ bias_attr=ParamAttr(
+ regularizer=fluid.regularizer.L2Decay(0.0),
+ initializer=fluid.initializer.NormalInitializer(scale=0.0)))
+ fc2 = fluid.layers.fc(
+ input=fc1,
+ size=4096,
+ act='tanh',
+ bias_attr=ParamAttr(
+ regularizer=fluid.regularizer.L2Decay(0.0),
+ initializer=fluid.initializer.NormalInitializer(scale=0.0)))
+
+ self.logit = fluid.layers.fc(input=fc2, size=self.num_classes, act=None, \
+ bias_attr=ParamAttr(regularizer=fluid.regularizer.L2Decay(0.0),
+ initializer=fluid.initializer.NormalInitializer(scale=0.0)))
+
+ self.output = fluid.layers.sigmoid(self.logit)
+
+ def optimizer(self):
+        assert self.mode == 'train', "optimizer can only be obtained in train mode"
+ values = [
+ self.learning_rate * (self.decay_gamma**i)
+ for i in range(len(self.decay_epochs) + 1)
+ ]
+        iter_per_epoch = self.num_samples // self.batch_size
+        boundaries = [e * iter_per_epoch for e in self.decay_epochs]
+ return fluid.optimizer.RMSProp(
+ learning_rate=fluid.layers.piecewise_decay(
+ values=values, boundaries=boundaries),
+ centered=True,
+ regularization=fluid.regularizer.L2Decay(self.weight_decay))
+
+ def loss(self):
+        assert self.mode != 'infer', "loss calculation is invalid in infer mode"
+ cost = fluid.layers.sigmoid_cross_entropy_with_logits(
+ x=self.logit, label=self.label_input)
+ cost = fluid.layers.reduce_sum(cost, dim=-1)
+ sum_cost = fluid.layers.reduce_sum(cost)
+ self.loss_ = fluid.layers.scale(
+ sum_cost, scale=self.num_gpus, bias_after_scale=False)
+ return self.loss_
+
+ def outputs(self):
+ return [self.output, self.logit]
+
+ def feeds(self):
+ return self.feature_input if self.mode == 'infer' else self.feature_input + [
+ self.label_input
+ ]
+
+ def weights_info(self):
+ return ('attention_lstm_youtube8m',
+ 'https://paddlemodels.bj.bcebos.com/video_classification/attention_lstm_youtube8m.tar.gz')
diff --git a/PaddleCV/video/models/attention_lstm/lstm_attention.py b/PaddleCV/video/models/attention_lstm/lstm_attention.py
new file mode 100755
index 0000000000000000000000000000000000000000..5fce85c5c75ea176a3bf371de5f4eea5f02f25b3
--- /dev/null
+++ b/PaddleCV/video/models/attention_lstm/lstm_attention.py
@@ -0,0 +1,78 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle.fluid as fluid
+from paddle.fluid import ParamAttr
+import numpy as np
+
+
+class LSTMAttentionModel(object):
+ """LSTM Attention Model"""
+
+    def __init__(self,
+                 input_dim,
+                 embedding_size=512,
+                 lstm_size=1024,
+                 drop_rate=0.5):
+        self.input_dim = input_dim
+        self.lstm_size = lstm_size
+        self.embedding_size = embedding_size
+        self.drop_rate = drop_rate
+
+ def forward(self, input, is_training):
+ input_fc = fluid.layers.fc(
+ input=input,
+ size=self.embedding_size,
+ act='tanh',
+ bias_attr=ParamAttr(
+ regularizer=fluid.regularizer.L2Decay(0.0),
+ initializer=fluid.initializer.NormalInitializer(scale=0.0)))
+
+ lstm_forward_fc = fluid.layers.fc(
+ input=input_fc,
+ size=self.lstm_size * 4,
+ act=None,
+ bias_attr=ParamAttr(
+ regularizer=fluid.regularizer.L2Decay(0.0),
+ initializer=fluid.initializer.NormalInitializer(scale=0.0)))
+ lstm_forward, _ = fluid.layers.dynamic_lstm(
+ input=lstm_forward_fc, size=self.lstm_size * 4, is_reverse=False)
+
+        lstm_backward_fc = fluid.layers.fc(
+ input=input_fc,
+ size=self.lstm_size * 4,
+ act=None,
+ bias_attr=ParamAttr(
+ regularizer=fluid.regularizer.L2Decay(0.0),
+ initializer=fluid.initializer.NormalInitializer(scale=0.0)))
+ lstm_backward, _ = fluid.layers.dynamic_lstm(
+            input=lstm_backward_fc, size=self.lstm_size * 4, is_reverse=True)
+
+ lstm_concat = fluid.layers.concat(
+ input=[lstm_forward, lstm_backward], axis=1)
+
+ lstm_dropout = fluid.layers.dropout(
+ x=lstm_concat, dropout_prob=self.drop_rate, is_test=(not is_training))
+
+ lstm_weight = fluid.layers.fc(
+ input=lstm_dropout,
+ size=1,
+ act='sequence_softmax',
+ bias_attr=ParamAttr(
+ regularizer=fluid.regularizer.L2Decay(0.0),
+ initializer=fluid.initializer.NormalInitializer(scale=0.0)))
+ scaled = fluid.layers.elementwise_mul(
+ x=lstm_dropout, y=lstm_weight, axis=0)
+ lstm_pool = fluid.layers.sequence_pool(input=scaled, pool_type='sum')
+
+ return lstm_pool
diff --git a/PaddleCV/video/models/model.py b/PaddleCV/video/models/model.py
new file mode 100644
index 0000000000000000000000000000000000000000..bf5947f44f05c96d47b912edfbdc9b4c28f6321b
--- /dev/null
+++ b/PaddleCV/video/models/model.py
@@ -0,0 +1,178 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import logging
+try:
+ from configparser import ConfigParser
+except:
+ from ConfigParser import ConfigParser
+
+import paddle.fluid as fluid
+from datareader import get_reader
+from metrics import get_metrics
+from .utils import download, AttrDict
+
+WEIGHT_DIR = os.path.expanduser("~/.paddle/weights")
+
+logger = logging.getLogger(__name__)
+
+
+class NotImplementError(Exception):
+    "Error: model function not implemented"
+
+ def __init__(self, model, function):
+ super(NotImplementError, self).__init__()
+ self.model = model.__class__.__name__
+ self.function = function.__name__
+
+ def __str__(self):
+ return "Function {}() is not implemented in model {}".format(
+ self.function, self.model)
+
+
+class ModelNotFoundError(Exception):
+ "Error: model not found"
+
+ def __init__(self, model_name, avail_models):
+ super(ModelNotFoundError, self).__init__()
+ self.model_name = model_name
+ self.avail_models = avail_models
+
+ def __str__(self):
+        msg = "Model {} Not Found.\nAvailable models:\n".format(
+ self.model_name)
+ for model in self.avail_models:
+ msg += " {}\n".format(model)
+ return msg
+
+
+class ModelBase(object):
+ def __init__(self, name, cfg, mode='train'):
+ assert mode in ['train', 'valid', 'test', 'infer'], \
+ "Unknown mode type {}".format(mode)
+ self.name = name
+ self.is_training = (mode == 'train')
+ self.mode = mode
+ self.cfg = cfg
+ self.py_reader = None
+
+ def build_model(self):
+ "build model struct"
+ raise NotImplementError(self, self.build_model)
+
+ def build_input(self, use_pyreader):
+ "build input Variable"
+ raise NotImplementError(self, self.build_input)
+
+ def optimizer(self):
+ "get model optimizer"
+ raise NotImplementError(self, self.optimizer)
+
+    def outputs(self):
+        "get output variable"
+        raise NotImplementError(self, self.outputs)
+
+    def loss(self):
+        "get loss variable"
+        raise NotImplementError(self, self.loss)
+
+ def feeds(self):
+ "get feed inputs list"
+ raise NotImplementError(self, self.feeds)
+
+ def weights_info(self):
+ "get model weight default path and download url"
+ raise NotImplementError(self, self.weights_info)
+
+ def get_weights(self):
+ "get model weight file path, download weight from Paddle if not exist"
+ path, url = self.weights_info()
+ path = os.path.join(WEIGHT_DIR, path)
+ if os.path.exists(path):
+ return path
+
+ logger.info("Download weights of {} from {}".format(self.name, url))
+ download(url, path)
+ return path
+
+ def pyreader(self):
+ return self.py_reader
+
+ def epoch_num(self):
+ "get train epoch num"
+ return self.cfg.TRAIN.epoch
+
+ def pretrain_info(self):
+ "get pretrain base model directory"
+ return (None, None)
+
+ def get_pretrain_weights(self):
+ "get model weight file path, download weight from Paddle if not exist"
+ path, url = self.pretrain_info()
+ if not path:
+ return None
+
+ path = os.path.join(WEIGHT_DIR, path)
+ if os.path.exists(path):
+ return path
+
+ logger.info("Download pretrain weights of {} from {}".format(self.name,
+ url))
+ download(url, path)
+ return path
+
+ def load_pretrain_params(self, exe, pretrain, prog, place):
+ logger.info("Load pretrain weights from {}".format(pretrain))
+ fluid.io.load_params(exe, pretrain, main_program=prog)
+
+ def load_test_weights(self, exe, weights, prog, place):
+ def if_exist(var):
+ return os.path.exists(os.path.join(weights, var.name))
+
+ fluid.io.load_vars(exe, weights, predicate=if_exist)
+
+ def get_config_from_sec(self, sec, item, default=None):
+ if sec.upper() not in self.cfg:
+ return default
+ return self.cfg[sec.upper()].get(item, default)
+
+
+class ModelZoo(object):
+ def __init__(self):
+ self.model_zoo = {}
+
+ def regist(self, name, model):
+        assert model.__base__ == ModelBase, "Unknown model type {}".format(
+ type(model))
+ self.model_zoo[name] = model
+
+ def get(self, name, cfg, mode='train'):
+ for k, v in self.model_zoo.items():
+ if k.upper() == name.upper():
+ return v(name, cfg, mode)
+ raise ModelNotFoundError(name, self.model_zoo.keys())
+
+
+# singleton model_zoo
+model_zoo = ModelZoo()
+
+
+def regist_model(name, model):
+ model_zoo.regist(name, model)
+
+
+def get_model(name, cfg, mode='train'):
+ return model_zoo.get(name, cfg, mode)
diff --git a/PaddleCV/video/models/nextvlad/README.md b/PaddleCV/video/models/nextvlad/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..61b634e7fd52e2910913a5ca31c0413ce43e5021
--- /dev/null
+++ b/PaddleCV/video/models/nextvlad/README.md
@@ -0,0 +1,82 @@
+# NeXtVLAD Video Classification Model
+
+---
+## Table of Contents
+
+- [Model Overview](#model-overview)
+- [Data Preparation](#data-preparation)
+- [Training](#training)
+- [Evaluation](#evaluation)
+- [Inference](#inference)
+- [Reference Papers](#reference-papers)
+
+
+## Model Overview
+NeXtVLAD was the best-performing single model in the 2nd YouTube-8M video understanding challenge, reaching a GAP above 0.87 with fewer than 80M parameters. It provides a way to convert and compress frame-level video features into a compact feature vector suitable for classifying large-scale video files. The basic idea builds on NetVLAD: the high-dimensional features are first split into groups, and an attention mechanism is introduced to aggregate information along the temporal dimension; this achieves high accuracy with a much smaller parameter count. For details, please refer to [NeXtVLAD: An Efficient Neural Network to Aggregate Frame-level Features for Large-scale Video Classification](https://arxiv.org/abs/1811.05014). A rough sketch of the grouping idea is given below.
+
+This implementation follows the single-model structure of the paper, trained on the train split of 2nd-YouTube-8M and tested on the val split.
+
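+The following numpy sketch illustrates the grouped, attention-gated aggregation; the sizes are hypothetical, the random matrices stand in for learned weights, and the cluster-center subtraction and normalization of the full model are omitted:
+
+    import numpy as np
+
+    def softmax(x, axis=-1):
+        e = np.exp(x - x.max(axis=axis, keepdims=True))
+        return e / e.sum(axis=axis, keepdims=True)
+
+    def sigmoid(x):
+        return 1.0 / (1.0 + np.exp(-x))
+
+    T, F, expansion, G, K = 100, 1024, 2, 8, 128  # frames, dims, groups, clusters
+    x = np.random.randn(T, F) @ np.random.randn(F, expansion * F)  # fc expansion
+    attn = sigmoid(x @ np.random.randn(expansion * F, G))          # [T, G] gates
+    assign = softmax((x @ np.random.randn(expansion * F, G * K))
+                     .reshape(T, G, K), axis=-1)                   # soft clusters
+    assign = assign * attn[:, :, None]                             # gate each group
+    grouped = x.reshape(T, G, expansion * F // G)                  # split into groups
+    # aggregate grouped features per cluster and flatten, VLAD-style
+    vlad = np.einsum('tgk,tgf->kf', assign, grouped).reshape(-1)
+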
+## Data Preparation
+
+The NeXtVLAD model uses the 2nd-YouTube-8M dataset. For downloading and preparing the data, please refer to the [data instructions](../../dataset/README.md)
+
+## Training
+
+### Training from random initialization
+Run the following script in the video directory:
+
+ bash ./scripts/train/train_nextvlad.sh
+
+### Finetuning from a pretrained model
+
+Please download the provided pretrained [model](https://paddlemodels.bj.bcebos.com/video_classification/nextvlad_youtube8m.tar.gz) first, then add --resume in the script above, pointing it to the path where the pretrained model is stored.
+
+Training uses 4 Nvidia Tesla P40 cards with a total batch size of 160.
+
+### Training strategy
+
+* Adam optimizer with an initial learning\_rate=0.0002
+* The learning rate is decayed by learning\_rate\_decay = 0.8 every 2,000,000 samples (see the sketch after this list)
+* L2 regularization with l2\_weight\_decay = 1e-5
+
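+A rough sketch of this schedule using fluid's piecewise decay, mirroring the settings above (the number of decay points shown is arbitrary):
+
+    import paddle.fluid as fluid
+
+    base_lr, decay, decay_examples, batch_size = 0.0002, 0.8, 2000000, 160
+    step = decay_examples // batch_size           # iterations between decays
+    bounds = [i * step for i in range(1, 4)]      # 3 decay points, for example
+    values = [base_lr * decay ** i for i in range(4)]
+    lr = fluid.layers.piecewise_decay(boundaries=bounds, values=values)
+    optimizer = fluid.optimizer.AdamOptimizer(learning_rate=lr)
+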
+## Evaluation
+
+You can download the pretrained model parameters or use parameters trained by yourself; in ./scripts/test/test\_nextvlad.sh,
+set the --weights argument to the directory where the model parameters are saved. Then run
+
+ bash ./scripts/test/test_nextvlad.sh
+
+Since the test split provided by YouTube-8M has no ground-truth labels, the validation split is used for testing here.
+
+The model parameters are listed below:
+
+| Parameter | Value |
+| :---------: | :----: |
+| cluster\_size | 128 |
+| hidden\_size | 2048 |
+| groups | 8 |
+| expansion | 2 |
+| drop\_rate | 0.5 |
+| gating\_reduction | 8 |
+
+The evaluation metrics are listed below:
+
+| Metric | Accuracy |
+| :---------: | :----: |
+| Hit@1 | 0.8960 |
+| PERR | 0.8132 |
+| GAP | 0.8709 |
+
+## Inference
+
+You can download the pretrained model parameters or use parameters trained by yourself; in ./scripts/infer/infer\_nextvlad.sh,
+set the --weights argument to the directory where the model parameters are saved, then run the following script
+
+    bash ./scripts/infer/infer_nextvlad.sh
+
+Inference results are saved to the NEXTVLAD\_infer\_result file in pickle format.
+
+## Reference Papers
+
+- [NeXtVLAD: An Efficient Neural Network to Aggregate Frame-level Features for Large-scale Video Classification](https://arxiv.org/abs/1811.05014), Rongcheng Lin, Jing Xiao, Jianping Fan
+
diff --git a/PaddleCV/video/models/nextvlad/__init__.py b/PaddleCV/video/models/nextvlad/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..d9a233374a1ac7069280801413872e9227f820a8
--- /dev/null
+++ b/PaddleCV/video/models/nextvlad/__init__.py
@@ -0,0 +1,3 @@
+from __future__ import absolute_import
+
+from .nextvlad import *
diff --git a/PaddleCV/video/models/nextvlad/clf_model.py b/PaddleCV/video/models/nextvlad/clf_model.py
new file mode 100644
index 0000000000000000000000000000000000000000..70728dfb1139a1d32e6a5d921629ba018ed6cea9
--- /dev/null
+++ b/PaddleCV/video/models/nextvlad/clf_model.py
@@ -0,0 +1,50 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle
+import paddle.fluid as fluid
+
+
+class LogisticModel(object):
+ """Logistic model with L2 regularization."""
+
+ def create_model(self,
+ model_input,
+ vocab_size,
+ l2_penalty=None,
+ **unused_params):
+ """Creates a logistic model.
+
+ Args:
+ model_input: 'batch' x 'num_features' matrix of input features.
+ vocab_size: The number of classes in the dataset.
+
+ Returns:
+ A dictionary with a tensor containing the probability predictions of the
+ model in the 'predictions' key. The dimensions of the tensor are
+ batch_size x num_classes."""
+ logits = fluid.layers.fc(
+ input=model_input,
+ size=vocab_size,
+ act=None,
+ name='logits_clf',
+ param_attr=fluid.ParamAttr(
+ name='logits_clf_weights',
+ initializer=fluid.initializer.MSRA(uniform=False),
+ regularizer=fluid.regularizer.L2DecayRegularizer(l2_penalty)),
+ bias_attr=fluid.ParamAttr(
+ name='logits_clf_bias',
+ regularizer=fluid.regularizer.L2DecayRegularizer(l2_penalty)))
+ output = fluid.layers.sigmoid(logits)
+ return {'predictions': output, 'logits': logits}
diff --git a/PaddleCV/video/models/nextvlad/nextvlad.py b/PaddleCV/video/models/nextvlad/nextvlad.py
new file mode 100755
index 0000000000000000000000000000000000000000..62a96b5d1d61e3447699b5ec974662566c2e45f0
--- /dev/null
+++ b/PaddleCV/video/models/nextvlad/nextvlad.py
@@ -0,0 +1,167 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle.fluid as fluid
+from paddle.fluid import ParamAttr
+
+from ..model import ModelBase
+from .clf_model import LogisticModel
+from . import nextvlad_model
+
+__all__ = ["NEXTVLAD"]
+
+
+class NEXTVLAD(ModelBase):
+ def __init__(self, name, cfg, mode='train'):
+ super(NEXTVLAD, self).__init__(name, cfg, mode=mode)
+ self.get_config()
+
+ def get_config(self):
+ # model params
+ self.num_classes = self.get_config_from_sec('model', 'num_classes')
+ self.video_feature_size = self.get_config_from_sec('model',
+ 'video_feature_size')
+ self.audio_feature_size = self.get_config_from_sec('model',
+ 'audio_feature_size')
+ self.cluster_size = self.get_config_from_sec('model', 'cluster_size')
+ self.hidden_size = self.get_config_from_sec('model', 'hidden_size')
+ self.groups = self.get_config_from_sec('model', 'groups')
+ self.expansion = self.get_config_from_sec('model', 'expansion')
+ self.drop_rate = self.get_config_from_sec('model', 'drop_rate')
+ self.gating_reduction = self.get_config_from_sec('model',
+ 'gating_reduction')
+ self.eigen_file = self.get_config_from_sec('model', 'eigen_file')
+ # training params
+ self.base_learning_rate = self.get_config_from_sec('train',
+ 'learning_rate')
+ self.lr_boundary_examples = self.get_config_from_sec(
+ 'train', 'lr_boundary_examples')
+ self.max_iter = self.get_config_from_sec('train', 'max_iter')
+ self.learning_rate_decay = self.get_config_from_sec(
+ 'train', 'learning_rate_decay')
+ self.l2_penalty = self.get_config_from_sec('train', 'l2_penalty')
+ self.gradient_clip_norm = self.get_config_from_sec('train',
+ 'gradient_clip_norm')
+ self.use_gpu = self.get_config_from_sec('train', 'use_gpu')
+ self.num_gpus = self.get_config_from_sec('train', 'num_gpus')
+
+ # other params
+ self.batch_size = self.get_config_from_sec(self.mode, 'batch_size')
+
+ def build_input(self, use_pyreader=True):
+ rgb_shape = [self.video_feature_size]
+ audio_shape = [self.audio_feature_size]
+ label_shape = [self.num_classes]
+ if use_pyreader:
+ assert self.mode != 'infer', \
+                'pyreader is not recommended for inference, please set use_pyreader to False.'
+ py_reader = fluid.layers.py_reader(
+ capacity=100,
+ shapes=[[-1] + rgb_shape, [-1] + audio_shape,
+ [-1] + label_shape],
+ lod_levels=[1, 1, 0],
+ dtypes=['float32', 'float32', 'float32'],
+ name='train_py_reader'
+ if self.is_training else 'test_py_reader',
+ use_double_buffer=True)
+ rgb, audio, label = fluid.layers.read_file(py_reader)
+ self.py_reader = py_reader
+ else:
+ rgb = fluid.layers.data(
+ name='train_rgb' if self.is_training else 'test_rgb',
+ shape=rgb_shape,
+ dtype='float32',
+ lod_level=1)
+ audio = fluid.layers.data(
+ name='train_audio' if self.is_training else 'test_audio',
+ shape=audio_shape,
+ dtype='float32',
+ lod_level=1)
+ if self.mode == 'infer':
+ label = None
+ else:
+ label = fluid.layers.data(
+ name='train_label' if self.is_training else 'test_label',
+ shape=label_shape,
+ dtype='float32')
+ self.feature_input = [rgb, audio]
+ self.label_input = label
+
+ def create_model_args(self):
+ model_args = {}
+ model_args['class_dim'] = self.num_classes
+ model_args['cluster_size'] = self.cluster_size
+ model_args['hidden_size'] = self.hidden_size
+ model_args['groups'] = self.groups
+ model_args['expansion'] = self.expansion
+ model_args['drop_rate'] = self.drop_rate
+ model_args['gating_reduction'] = self.gating_reduction
+ model_args['l2_penalty'] = self.l2_penalty
+ return model_args
+
+ def build_model(self):
+ model_args = self.create_model_args()
+ videomodel = nextvlad_model.NeXtVLADModel()
+ rgb = self.feature_input[0]
+ audio = self.feature_input[1]
+ out = videomodel.create_model(
+ rgb, audio, is_training=(self.mode == 'train'), **model_args)
+ self.logits = out['logits']
+ self.predictions = out['predictions']
+ self.network_outputs = [out['predictions']]
+
+ def optimizer(self):
+        assert self.mode == 'train', "optimizer can only be obtained in train mode"
+ im_per_batch = self.batch_size
+ lr_bounds, lr_values = get_learning_rate_decay_list(
+ self.base_learning_rate, self.learning_rate_decay, self.max_iter,
+ self.lr_boundary_examples, im_per_batch)
+ return fluid.optimizer.AdamOptimizer(
+ learning_rate=fluid.layers.piecewise_decay(
+ boundaries=lr_bounds, values=lr_values))
+
+ def loss(self):
+        assert self.mode != 'infer', "loss calculation is invalid in infer mode"
+ cost = fluid.layers.sigmoid_cross_entropy_with_logits(
+ x=self.logits, label=self.label_input)
+ cost = fluid.layers.reduce_sum(cost, dim=-1)
+ self.loss_ = fluid.layers.mean(x=cost)
+ return self.loss_
+
+ def outputs(self):
+ return self.network_outputs
+
+ def feeds(self):
+ return self.feature_input if self.mode == 'infer' else self.feature_input + [
+ self.label_input
+ ]
+
+ def weights_info(self):
+ return ('nextvlad_youtube8m',
+ 'https://paddlemodels.bj.bcebos.com/video_classification/nextvlad_youtube8m.tar.gz')
+
+
+def get_learning_rate_decay_list(base_learning_rate, decay, max_iter,
+ decay_examples, total_batch_size):
+ decay_step = decay_examples // total_batch_size
+ lr_bounds = []
+ lr_values = [base_learning_rate]
+ i = 1
+ while True:
+ if i * decay_step >= max_iter:
+ break
+ lr_bounds.append(i * decay_step)
+ lr_values.append(base_learning_rate * (decay**i))
+ i += 1
+ return lr_bounds, lr_values
diff --git a/PaddleCV/video/models/nextvlad/nextvlad_model.py b/PaddleCV/video/models/nextvlad/nextvlad_model.py
new file mode 100644
index 0000000000000000000000000000000000000000..9e9efe83fd936989b0b94ff2fadaf487c37c86b7
--- /dev/null
+++ b/PaddleCV/video/models/nextvlad/nextvlad_model.py
@@ -0,0 +1,231 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import numpy as np
+import paddle
+import paddle.fluid as fluid
+from . import clf_model
+
+
+class NeXtVLAD(object):
+ """
+ This is a paddlepaddle implementation of the NeXtVLAD model. For more
+ information, please refer to the paper,
+ https://static.googleusercontent.com/media/research.google.com/zh-CN//youtube8m/workshop2018/p_c03.pdf
+ """
+
+ def __init__(self,
+ feature_size,
+ cluster_size,
+ is_training=True,
+ expansion=2,
+ groups=None,
+ inputname='video'):
+ self.feature_size = feature_size
+ self.cluster_size = cluster_size
+ self.is_training = is_training
+ self.expansion = expansion
+ self.groups = groups
+ self.name = inputname + '_'
+
+ def forward(self, input):
+ input = fluid.layers.fc(
+ input=input,
+ size=self.expansion * self.feature_size,
+ act=None,
+ name=self.name + 'fc_expansion',
+ param_attr=fluid.ParamAttr(
+ name=self.name + 'fc_expansion_w',
+ initializer=fluid.initializer.MSRA(uniform=False)),
+ bias_attr=fluid.ParamAttr(
+ name=self.name + 'fc_expansion_b',
+ initializer=fluid.initializer.Constant(value=0.)))
+
+        # attention factor for each group
+ attention = fluid.layers.fc(
+ input=input,
+ size=self.groups,
+ act='sigmoid',
+ name=self.name + 'fc_group_attention',
+ param_attr=fluid.ParamAttr(
+ name=self.name + 'fc_group_attention_w',
+ initializer=fluid.initializer.MSRA(uniform=False)),
+ bias_attr=fluid.ParamAttr(
+ name=self.name + 'fc_group_attention_b',
+ initializer=fluid.initializer.Constant(value=0.)))
+
+        # calculate the activation factor for each group and cluster
+ feature_size = self.feature_size * self.expansion // self.groups
+ cluster_weights = fluid.layers.create_parameter(
+ shape=[
+ self.expansion * self.feature_size,
+ self.groups * self.cluster_size
+ ],
+ dtype=input.dtype,
+ attr=fluid.ParamAttr(name=self.name + 'cluster_weights'),
+ default_initializer=fluid.initializer.MSRA(uniform=False))
+
+ activation = fluid.layers.matmul(input, cluster_weights)
+ activation = fluid.layers.batch_norm(
+ activation, is_test=(not self.is_training))
+
+ # reshape of activation
+ activation = fluid.layers.reshape(activation,
+ [-1, self.groups, self.cluster_size])
+ # softmax on per cluster
+ activation = fluid.layers.softmax(activation)
+ activation = fluid.layers.elementwise_mul(activation, attention, axis=0)
+ a_sum = fluid.layers.sequence_pool(activation, 'sum')
+ a_sum = fluid.layers.reduce_sum(a_sum, dim=1)
+
+ # create cluster_weights2
+ cluster_weights2 = fluid.layers.create_parameter(
+ shape=[self.cluster_size, feature_size],
+ dtype=input.dtype,
+ attr=fluid.ParamAttr(name=self.name + 'cluster_weights2'),
+ default_initializer=fluid.initializer.MSRA(uniform=False))
+
+ # expand a_sum dimension from [-1, self.cluster_size] to be [-1, self.cluster_size, feature_size]
+ a_sum = fluid.layers.reshape(a_sum, [-1, self.cluster_size, 1])
+ a_sum = fluid.layers.expand(a_sum, [1, 1, feature_size])
+
+ # element wise multiply a_sum and cluster_weights2
+ a = fluid.layers.elementwise_mul(
+ a_sum, cluster_weights2,
+ axis=1) # output shape [-1, self.cluster_size, feature_size]
+
+ # transpose activation from [-1, self.groups, self.cluster_size] to [-1, self.cluster_size, self.groups]
+ activation2 = fluid.layers.transpose(activation, perm=[0, 2, 1])
+        # the transpose op clears the LoD information, so it must be reset
+ activation = fluid.layers.lod_reset(activation2, activation)
+
+ # reshape input from [-1, self.expansion * self.feature_size] to [-1, self.groups, feature_size]
+ reshaped_input = fluid.layers.reshape(input,
+ [-1, self.groups, feature_size])
+ # mat multiply activation and reshaped_input
+ vlad = fluid.layers.matmul(
+ activation,
+ reshaped_input) # output shape [-1, self.cluster_size, feature_size]
+ vlad = fluid.layers.sequence_pool(vlad, 'sum')
+ vlad = fluid.layers.elementwise_sub(vlad, a)
+
+ # l2_normalization
+ vlad = fluid.layers.transpose(vlad, [0, 2, 1])
+ vlad = fluid.layers.l2_normalize(vlad, axis=1)
+
+ # reshape and batch norm
+ vlad = fluid.layers.reshape(vlad,
+ [-1, self.cluster_size * feature_size])
+ vlad = fluid.layers.batch_norm(vlad, is_test=(not self.is_training))
+
+ return vlad
+
+
+class NeXtVLADModel(object):
+    """
+    Creates a NeXtVLAD based video classification model.
+
+    Args (of create_model):
+        video_input: a LoDTensor of shape [-1, video_feature_size] with frame-level features.
+        audio_input: a LoDTensor of shape [-1, audio_feature_size] with audio features.
+        class_dim: the number of classes in the dataset.
+    """
+
+ def __init__(self):
+ pass
+
+ def create_model(self,
+ video_input,
+ audio_input,
+ is_training=True,
+ class_dim=None,
+ cluster_size=None,
+ hidden_size=None,
+ groups=None,
+ expansion=None,
+ drop_rate=None,
+ gating_reduction=None,
+ l2_penalty=None,
+ **unused_params):
+
+        # calculate the VLAD descriptors of video and audio
+ video_nextvlad = NeXtVLAD(
+ 1024,
+ cluster_size,
+ is_training,
+ expansion=expansion,
+ groups=groups,
+ inputname='video')
+ audio_nextvlad = NeXtVLAD(
+ 128,
+ cluster_size,
+ is_training,
+ expansion=expansion,
+ groups=groups,
+ inputname='audio')
+ vlad_video = video_nextvlad.forward(video_input)
+ vlad_audio = audio_nextvlad.forward(audio_input)
+
+ # concat video and audio
+ vlad = fluid.layers.concat([vlad_video, vlad_audio], axis=1)
+
+ # drop out
+ if drop_rate > 0.:
+ vlad = fluid.layers.dropout(
+ vlad, drop_rate, is_test=(not is_training))
+
+ # add fc
+ activation = fluid.layers.fc(
+ input=vlad,
+ size=hidden_size,
+ act=None,
+ name='hidden1_fc',
+ param_attr=fluid.ParamAttr(
+ name='hidden1_fc_weights',
+ initializer=fluid.initializer.MSRA(uniform=False)),
+ bias_attr=False)
+ activation = fluid.layers.batch_norm(
+ activation, is_test=(not is_training))
+
+ # add fc, gate 1
+ gates = fluid.layers.fc(
+ input=activation,
+ size=hidden_size // gating_reduction,
+ act=None,
+ name='gating_fc1',
+ param_attr=fluid.ParamAttr(
+ name='gating_fc1_weights',
+ initializer=fluid.initializer.MSRA(uniform=False)),
+ bias_attr=False)
+ gates = fluid.layers.batch_norm(
+ gates, is_test=(not is_training), act='relu')
+
+ # add fc, gate 2
+ gates = fluid.layers.fc(
+ input=gates,
+ size=hidden_size,
+ act='sigmoid',
+ name='gating_fc2',
+ param_attr=fluid.ParamAttr(
+ name='gating_fc2_weights',
+ initializer=fluid.initializer.MSRA(uniform=False)),
+ bias_attr=False)
+
+ activation = fluid.layers.elementwise_mul(activation, gates)
+ aggregate_model = clf_model.LogisticModel # set classification model
+
+ return aggregate_model().create_model(
+ model_input=activation,
+ vocab_size=class_dim,
+ l2_penalty=l2_penalty,
+ is_training=is_training,
+ **unused_params)
diff --git a/PaddleCV/video/models/nonlocal_model/README.md b/PaddleCV/video/models/nonlocal_model/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..ec5411bc65c32979eba5579b179ef163e709d3c4
--- /dev/null
+++ b/PaddleCV/video/models/nonlocal_model/README.md
@@ -0,0 +1,165 @@
+# Non-local Neural Networks Video Classification Model
+
+---
+## Table of Contents
+
+- [Model Overview](#model-overview)
+- [Data Preparation](#data-preparation)
+- [Training](#training)
+- [Evaluation](#evaluation)
+- [Inference](#inference)
+- [Reference Papers](#reference-papers)
+
+
+## Model Overview
+
+Non-local Neural Networks were proposed by Xiaolong Wang et al. in 2017. Their key feature is a non-local operation that models the dependencies between pixels that are far apart. Capturing long-range dependencies between data points has always been an important problem: for sequential data such as speech and video, the mainstream approach is the recurrent neural network (RNN), while for images the convolutional neural network (CNN) is typically used to extract dependencies between pixels. However, both CNNs and RNNs only extract features within a small spatial or temporal neighborhood, which makes it hard to capture dependencies between distant positions. Borrowing the idea of the non-local means filter from classical computer vision and extending it to neural networks, the non-local operation defines a pairwise function between each output position and all input positions, yielding an operation with global connectivity: every position of the output feature map is influenced by every position of the input feature map. In a CNN, an output pixel after one convolution can only see its receptive field, and more context requires stacking many convolutions; in the non-local operation, the receptive field of every output point is the whole input feature map, so it extracts more global information than CNNs and RNNs.
+
+For details, please refer to the paper [Non-local Neural Networks](https://arxiv.org/abs/1711.07971v1)
+
+### The Non-local Operation
+
+The non-local operation is defined as
+
+$$y_i = \frac{1}{C(x)} \sum_{\forall j} f(x_i, x_j) \, g(x_j)$$
+
+In the formula above, x is the input feature map and y the output feature map; i indexes a position in the output feature map and j a position in the input feature map. f(x_i, x_j) describes the dependency between output position i and every input position j, and C(x) is a normalization factor chosen according to f. g(x_j) is a transformation of the input feature map, usually a simple linear one. f(x_i, x_j) can take different forms; the usual choices are the following:
+
+#### Gaussian
+
+$$f(x_i, x_j) = e^{x_i^T x_j}, \qquad C(x) = \sum_{\forall j} f(x_i, x_j)$$
+
+#### Embedded Gaussian
+
+$$f(x_i, x_j) = e^{\theta(x_i)^T \phi(x_j)}, \qquad C(x) = \sum_{\forall j} f(x_i, x_j)$$
+
+#### Dot product
+
+$$f(x_i, x_j) = \theta(x_i)^T \phi(x_j), \qquad C(x) = N$$
+
+#### Concatenation
+
+$$f(x_i, x_j) = \mathrm{ReLU}\left(w_f^T \left[\theta(x_i), \phi(x_j)\right]\right), \qquad C(x) = N$$
+
+where the embeddings are linear transformations
+
+$$\theta(x_i) = W_\theta x_i, \qquad \phi(x_j) = W_\phi x_j$$
+
+The parameters of these functions can be randomly initialized and then learned end-to-end during training.
+
+### The Non-local Block
+
+Analogous to the residual structure of ResNet, the non-local block is defined as
+
+$$z_i = W_z y_i + x_i$$
+
+The non-local term enters just like the residual term in ResNet, so a non-local block can be conveniently inserted anywhere in a network while the rest of the network can still be initialized from the original pretrained model. If W_z is initialized to zero, the network is initially equivalent to one without the non-local block.
+
+### Implementation
+
+The implementation of a non-local block with the embedded Gaussian pairwise function is illustrated in the paper's figure:
+
+Figure: a non-local block using the embedded Gaussian pairwise function
+
+g(x_j) is a linear transformation of the input feature map, implemented as a 1x1x1 convolution; theta and phi are also linear transformations, likewise implemented as 1x1x1 convolutions. As the figure shows, the non-local operation only needs common operators such as convolution, matrix multiplication, addition and softmax; no new operator has to be added, so the network is easy to assemble. A minimal sketch of the computation follows.
+
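+Below is a minimal numpy sketch of the embedded-Gaussian non-local block over flattened positions; the matrices W_theta, W_phi, W_g and W_z stand in for the 1x1x1 convolutions, and the sizes are hypothetical:
+
+    import numpy as np
+
+    def softmax(x, axis=-1):
+        e = np.exp(x - x.max(axis=axis, keepdims=True))
+        return e / e.sum(axis=axis, keepdims=True)
+
+    N, C = 784, 1024                      # N = T*H*W positions, C channels
+    x = np.random.randn(N, C)
+    W_theta, W_phi, W_g = (np.random.randn(C, C // 2) for _ in range(3))
+    W_z = np.zeros((C // 2, C))           # zero init -> block starts as identity
+
+    theta, phi, g = x @ W_theta, x @ W_phi, x @ W_g
+    attn = softmax(theta @ phi.T)         # [N, N] affinity, normalized over j
+    y = attn @ g                          # every output position sees every input
+    z = y @ W_z + x                       # residual connection
+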
+### Results
+
+The original paper reports that the non-local model works well for video classification: adding non-local blocks to a ResNet-50 backbone gives better classification accuracy than ResNet-101, with TOP-1 accuracy higher by 1~2 percentage points. Clear improvements are also reported for image classification and object detection.
+
+## Data Preparation
+
+The non-local model is trained on the Kinetics-400 action recognition dataset released by DeepMind. For downloading and preparing the data, please refer to the [data instructions](../../dataset/nonlocal/README.md)
+
+## Training
+
+Once the data is ready, training can be started in either of the following two ways:
+
+ python train.py --model_name=NONLOCAL
+ --config=./configs/nonlocal.txt
+ --save_dir=checkpoints
+ --log_interval=10
+ --valid_interval=1
+
+ bash scripts/train/train_nonlocal.sh
+
+- You can download the released [model](https://paddlemodels.bj.bcebos.com/video_classification/nonlocal_kinetics.tar.gz) and pass its path via `--resume` for finetuning and further development.
+
+**Data reader note:** the model reads `mp4` videos from the Kinetics-400 dataset. A starting frame is chosen at random according to the video length and the sampling rate, and `video_length` frames are extracted from each video. Each frame is randomly augmented: its short side is rescaled to a random value in [256, 320] (the long side follows the aspect ratio), and a 224x224 region is then cropped as the network input. A sketch of this sampling is given below.
+
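+The following is a minimal sketch of the sampling and augmentation just described, assuming the frames are decoded into HxWx3 numpy arrays and that `cv2` is available for resizing:
+
+    import random
+    import numpy as np
+    import cv2
+
+    def sample_clip(frames, video_length, sample_rate):
+        need = video_length * sample_rate
+        start = random.randint(0, max(len(frames) - need, 0))
+        clip = frames[start:start + need:sample_rate]
+        out = []
+        for img in clip:
+            short = random.randint(256, 320)           # random short-side scale
+            h, w = img.shape[:2]
+            ratio = short / float(min(h, w))
+            img = cv2.resize(img, (int(w * ratio), int(h * ratio)))
+            y = random.randint(0, img.shape[0] - 224)  # random 224x224 crop
+            x = random.randint(0, img.shape[1] - 224)
+            out.append(img[y:y + 224, x:x + 224])
+        return np.stack(out)
+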
+**Training strategy:**
+
+* Momentum optimizer with momentum=0.9
+* L2 regularization, with a weight decay of 1e-4 for conv and fc layers and 0 for bn layers
+* Initial learning rate base\_learning\_rate=0.01, decayed by a factor of 0.1 at 150,000 and 300,000 iterations
+
+
+## Evaluation
+
+Data preprocessing at test time differs from training: the crop size is 256x256 rather than the 224x224 used in training, so the fully connected layer that produces predictions during training has to be converted into a 1x1x1 convolution. When extracting frames from each video, 10 different temporal start positions and 3 different spatial crop positions are used, i.e. 10x3 samples per video; the predictions of these 30 samples are summed, and the class with the highest probability is taken as the final prediction. A sketch of this aggregation is given below.
+
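+A minimal sketch of the 10x3 multi-crop evaluation; `model` and `sample_crop` are hypothetical helpers standing in for the real pipeline:
+
+    import numpy as np
+
+    def predict_video(model, video, num_classes=400):
+        scores = np.zeros(num_classes)
+        for t_idx in range(10):        # 10 temporal start positions
+            for s_idx in range(3):     # 3 spatial crop positions
+                clip = sample_crop(video, t_idx, s_idx)
+                scores += model(clip)  # sum the 30 clip predictions
+        return scores.argmax()         # class with the highest summed score
+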
+The model can be evaluated in either of the following two ways:
+
+ python test.py --model_name=NONLOCAL
+ --config=configs/nonlocal.txt
+ --log_interval=1
+ --weights=$PATH_TO_WEIGHTS
+
+ bash scripts/test/test_nonlocal.sh
+
+- When evaluating with `scripts/test/test_nonlocal.sh`, modify the `--weights` argument in the script to point to the weights to be evaluated.
+
+- If `--weights` is not specified, the script downloads the released [model](https://paddlemodels.bj.bcebos.com/video_classification/nonlocal_kinetics.tar.gz) and evaluates it.
+
+
+With the following settings:
+
+| Parameter | Value |
+| :---------: | :----: |
+| backbone | ResNet-50 |
+| conv type | c2d |
+| sample rate | 8 |
+| video length | 8 |
+
+the evaluation accuracy on the Kinetics-400 validation set is:
+
+| Metric | Accuracy |
+| :---------: | :----: |
+| TOP\_1 | 0.739 |
+
+### Remarks
+
+Because some videos have been removed from YouTube, only 234,619 of the 246,535 videos in the original Kinetics-400 dataset could be downloaded, which may cause a slight drop in accuracy.
+
+## Inference
+
+Inference can be run with the following command:
+
+ python infer.py --model_name=NONLOCAL
+ --config=configs/nonlocal.txt
+ --log_interval=1
+ --weights=$PATH_TO_WEIGHTS
+ --filelist=$FILELIST
+
+- Inference results are saved to `NONLOCAL_infer_result` in `pickle` format.
+
+- If `--weights` is not specified, the script downloads the released [model](https://paddlemodels.bj.bcebos.com/video_classification/nonlocal_kinetics.tar.gz) and uses it for inference.
+
+
+## Reference Papers
+
+- [Non-local Neural Networks](https://arxiv.org/abs/1711.07971v1), Xiaolong Wang, Ross Girshick, Abhinav Gupta, Kaiming He
+
diff --git a/PaddleCV/video/models/nonlocal_model/__init__.py b/PaddleCV/video/models/nonlocal_model/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..0a127553f0500669188bd52cacdbb307324abf58
--- /dev/null
+++ b/PaddleCV/video/models/nonlocal_model/__init__.py
@@ -0,0 +1 @@
+from .nonlocal_model import *
diff --git a/PaddleCV/video/models/nonlocal_model/nonlocal_helper.py b/PaddleCV/video/models/nonlocal_model/nonlocal_helper.py
new file mode 100644
index 0000000000000000000000000000000000000000..a02603e474efa5fc0560e8c2d172ce09bec5bd00
--- /dev/null
+++ b/PaddleCV/video/models/nonlocal_model/nonlocal_helper.py
@@ -0,0 +1,254 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+import paddle
+import paddle.fluid as fluid
+from paddle.fluid import ParamAttr
+
+
+# 3d spacetime nonlocal (v1, spatial downsample)
+def spacetime_nonlocal(blob_in, dim_in, dim_out, batch_size, prefix, dim_inner, cfg, \
+ test_mode = False, max_pool_stride = 2):
+ cur = blob_in
+ # we do projection to convert each spacetime location to a feature
+ # theta original size
+ # e.g., (8, 1024, 4, 14, 14) => (8, 1024, 4, 14, 14)
+ theta = fluid.layers.conv3d(
+ input=cur,
+ num_filters=dim_inner,
+ filter_size=[1, 1, 1],
+ stride=[1, 1, 1],
+ padding=[0, 0, 0],
+ param_attr=ParamAttr(
+ name=prefix + '_theta' + "_w",
+ initializer=fluid.initializer.Normal(
+ loc=0.0, scale=cfg.NONLOCAL.conv_init_std)),
+ bias_attr=ParamAttr(
+ name=prefix + '_theta' + "_b",
+ initializer=fluid.initializer.Constant(value=0.))
+ if (cfg.NONLOCAL.no_bias == 0) else False,
+ name=prefix + '_theta')
+ theta_shape = theta.shape
+
+ # phi and g: half spatial size
+ # e.g., (8, 1024, 4, 14, 14) => (8, 1024, 4, 7, 7)
+ if cfg.NONLOCAL.use_maxpool:
+ max_pool = fluid.layers.pool3d(
+ input=cur,
+ pool_size=[1, max_pool_stride, max_pool_stride],
+ pool_type='max',
+ pool_stride=[1, max_pool_stride, max_pool_stride],
+ pool_padding=[0, 0, 0],
+ name=prefix + '_pool')
+ else:
+ max_pool = cur
+
+ phi = fluid.layers.conv3d(
+ input=max_pool,
+ num_filters=dim_inner,
+ filter_size=[1, 1, 1],
+ stride=[1, 1, 1],
+ padding=[0, 0, 0],
+ param_attr=ParamAttr(
+ name=prefix + '_phi' + "_w",
+ initializer=fluid.initializer.Normal(
+ loc=0.0, scale=cfg.NONLOCAL.conv_init_std)),
+ bias_attr=ParamAttr(
+ name=prefix + '_phi' + "_b",
+ initializer=fluid.initializer.Constant(value=0.))
+ if (cfg.NONLOCAL.no_bias == 0) else False,
+ name=prefix + '_phi')
+ phi_shape = phi.shape
+ g = fluid.layers.conv3d(
+ input=max_pool,
+ num_filters=dim_inner,
+ filter_size=[1, 1, 1],
+ stride=[1, 1, 1],
+ padding=[0, 0, 0],
+ param_attr=ParamAttr(
+ name=prefix + '_g' + "_w",
+ initializer=fluid.initializer.Normal(
+ loc=0.0, scale=cfg.NONLOCAL.conv_init_std)),
+ bias_attr=ParamAttr(
+ name=prefix + '_g' + "_b",
+ initializer=fluid.initializer.Constant(value=0.))
+ if (cfg.NONLOCAL.no_bias == 0) else False,
+ name=prefix + '_g')
+ g_shape = g.shape
+
+ # we have to use explicit batch size (to support arbitrary spacetime size)
+ # e.g. (8, 1024, 4, 14, 14) => (8, 1024, 784)
+ theta = fluid.layers.reshape(
+ theta, [-1, 0, theta_shape[2] * theta_shape[3] * theta_shape[4]])
+ theta = fluid.layers.transpose(theta, [0, 2, 1])
+ phi = fluid.layers.reshape(
+ phi, [-1, 0, phi_shape[2] * phi_shape[3] * phi_shape[4]])
+ theta_phi = fluid.layers.matmul(theta, phi, name=prefix + '_affinity')
+ g = fluid.layers.reshape(g, [-1, 0, g_shape[2] * g_shape[3] * g_shape[4]])
+ if cfg.NONLOCAL.use_softmax:
+ if cfg.NONLOCAL.use_scale is True:
+ theta_phi_sc = fluid.layers.scale(theta_phi, scale=dim_inner**-.5)
+ else:
+ theta_phi_sc = theta_phi
+ p = fluid.layers.softmax(
+ theta_phi_sc, name=prefix + '_affinity' + '_prob')
+ else:
+        # the reference implementation is unclear here; only the softmax
+        # normalization of the affinity matrix is supported
+        raise NotImplementedError(
+            "non-softmax normalization of the affinity is not implemented")
+
+ # note g's axis[2] corresponds to p's axis[2]
+ # e.g. g(8, 1024, 784_2) * p(8, 784_1, 784_2) => (8, 1024, 784_1)
+ p = fluid.layers.transpose(p, [0, 2, 1])
+ t = fluid.layers.matmul(g, p, name=prefix + '_y')
+
+ # reshape back
+ # e.g. (8, 1024, 784) => (8, 1024, 4, 14, 14)
+ t_shape = t.shape
+ t_re = fluid.layers.reshape(t, shape=list(theta_shape))
+ blob_out = t_re
+
+ blob_out = fluid.layers.conv3d(
+ input=blob_out,
+ num_filters=dim_out,
+ filter_size=[1, 1, 1],
+ stride=[1, 1, 1],
+ padding=[0, 0, 0],
+ param_attr=ParamAttr(
+ name=prefix + '_out' + "_w",
+ initializer=fluid.initializer.Constant(value=0.)
+ if cfg.NONLOCAL.use_zero_init_conv else fluid.initializer.Normal(
+ loc=0.0, scale=cfg.NONLOCAL.conv_init_std)),
+ bias_attr=ParamAttr(
+ name=prefix + '_out' + "_b",
+ initializer=fluid.initializer.Constant(value=0.))
+ if (cfg.NONLOCAL.no_bias == 0) else False,
+ name=prefix + '_out')
+ blob_out_shape = blob_out.shape
+
+ if cfg.NONLOCAL.use_bn is True:
+ bn_name = prefix + "_bn"
+ blob_out = fluid.layers.batch_norm(
+ blob_out,
+ is_test=test_mode,
+ momentum=cfg.NONLOCAL.bn_momentum,
+ epsilon=cfg.NONLOCAL.bn_epsilon,
+ name=bn_name,
+ param_attr=ParamAttr(
+ name=bn_name + "_scale",
+ initializer=fluid.initializer.Constant(
+ value=cfg.NONLOCAL.bn_init_gamma),
+ regularizer=fluid.regularizer.L2Decay(
+ cfg.TRAIN.weight_decay_bn)),
+ bias_attr=ParamAttr(
+ name=bn_name + "_offset",
+ regularizer=fluid.regularizer.L2Decay(
+ cfg.TRAIN.weight_decay_bn)),
+ moving_mean_name=bn_name + "_mean",
+ moving_variance_name=bn_name + "_variance") # add bn
+
+ if cfg.NONLOCAL.use_affine is True:
+ affine_scale = fluid.layers.create_parameter(
+ shape=[blob_out_shape[1]],
+ dtype=blob_out.dtype,
+ attr=ParamAttr(name=prefix + '_affine' + '_s'),
+ default_initializer=fluid.initializer.Constant(value=1.))
+ affine_bias = fluid.layers.create_parameter(
+ shape=[blob_out_shape[1]],
+ dtype=blob_out.dtype,
+ attr=ParamAttr(name=prefix + '_affine' + '_b'),
+ default_initializer=fluid.initializer.Constant(value=0.))
+ blob_out = fluid.layers.affine_channel(
+ blob_out,
+ scale=affine_scale,
+ bias=affine_bias,
+ name=prefix + '_affine') # add affine
+
+ return blob_out
+
+
+def add_nonlocal(blob_in,
+ dim_in,
+ dim_out,
+ batch_size,
+ prefix,
+ dim_inner,
+ cfg,
+ test_mode=False):
+ blob_out = spacetime_nonlocal(blob_in, \
+ dim_in, dim_out, batch_size, prefix, dim_inner, cfg, test_mode = test_mode)
+ blob_out = fluid.layers.elementwise_add(
+ blob_out, blob_in, name=prefix + '_sum')
+ return blob_out
+
+
+# this is to reduce memory usage if the feature maps are big:
+# divide the feature maps into groups along the temporal dimension,
+# and perform non-local operations inside each group.
+def add_nonlocal_group(blob_in,
+ dim_in,
+ dim_out,
+ batch_size,
+ pool_stride,
+ height,
+ width,
+ group_size,
+ prefix,
+ dim_inner,
+ cfg,
+ test_mode=False):
+ group_num = int(pool_stride / group_size)
+ assert (pool_stride % group_size == 0), \
+ 'nonlocal block {}: pool_stride({}) should be divided by group size({})'.format(prefix, pool_stride, group_size)
+
+ if group_num > 1:
+ blob_in = fluid.layers.transpose(
+ blob_in, [0, 2, 1, 3, 4], name=prefix + '_pre_trans1')
+ blob_in = fluid.layers.reshape(
+ blob_in,
+ [batch_size * group_num, group_size, dim_in, height, width],
+ name=prefix + '_pre_reshape1')
+ blob_in = fluid.layers.transpose(
+ blob_in, [0, 2, 1, 3, 4], name=prefix + '_pre_trans2')
+
+ blob_out = spacetime_nonlocal(
+ blob_in,
+ dim_in,
+ dim_out,
+ batch_size,
+ prefix,
+ dim_inner,
+ cfg,
+ test_mode=test_mode)
+ blob_out = fluid.layers.elementwise_add(
+ blob_out, blob_in, name=prefix + '_sum')
+
+ if group_num > 1:
+ blob_out = fluid.layers.transpose(
+ blob_out, [0, 2, 1, 3, 4], name=prefix + '_post_trans1')
+ blob_out = fluid.layers.reshape(
+ blob_out,
+ [batch_size, group_num * group_size, dim_out, height, width],
+ name=prefix + '_post_reshape1')
+ blob_out = fluid.layers.transpose(
+ blob_out, [0, 2, 1, 3, 4], name=prefix + '_post_trans2')
+
+ return blob_out
diff --git a/PaddleCV/video/models/nonlocal_model/nonlocal_model.py b/PaddleCV/video/models/nonlocal_model/nonlocal_model.py
new file mode 100644
index 0000000000000000000000000000000000000000..689106c11d07eb6170e9f2ee3a5a35ec47fa96df
--- /dev/null
+++ b/PaddleCV/video/models/nonlocal_model/nonlocal_model.py
@@ -0,0 +1,155 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import numpy as np
+import paddle.fluid as fluid
+
+from ..model import ModelBase
+from . import resnet_video
+from .nonlocal_utils import load_params_from_file
+
+import logging
+logger = logging.getLogger(__name__)
+
+__all__ = ["NonLocal"]
+
+
+class NonLocal(ModelBase):
+ def __init__(self, name, cfg, mode='train'):
+ super(NonLocal, self).__init__(name, cfg, mode=mode)
+ self.get_config()
+
+ def get_config(self):
+ # video_length
+ self.video_length = self.get_config_from_sec(self.mode, 'video_length')
+ # crop size
+ self.crop_size = self.get_config_from_sec(self.mode, 'crop_size')
+
+ def build_input(self, use_pyreader=True):
+ input_shape = [3, self.video_length, self.crop_size, self.crop_size]
+ label_shape = [1]
+ py_reader = None
+ if use_pyreader:
+ assert self.mode != 'infer', \
+                'pyreader is not recommended for inference, please set use_pyreader to False.'
+ py_reader = fluid.layers.py_reader(
+ capacity=20,
+ shapes=[[-1] + input_shape, [-1] + label_shape],
+ dtypes=['float32', 'int64'],
+ name='train_py_reader'
+ if self.is_training else 'test_py_reader',
+ use_double_buffer=True)
+ data, label = fluid.layers.read_file(py_reader)
+ self.py_reader = py_reader
+ else:
+ data = fluid.layers.data(
+ name='train_data' if self.is_training else 'test_data',
+ shape=input_shape,
+ dtype='float32')
+ if self.mode != 'infer':
+ label = fluid.layers.data(
+ name='train_label' if self.is_training else 'test_label',
+ shape=label_shape,
+ dtype='int64')
+ else:
+ label = None
+ self.feature_input = [data]
+ self.label_input = label
+
+ def create_model_args(self):
+ return None
+
+ def build_model(self):
+ pred, loss = resnet_video.create_model(
+ data=self.feature_input[0],
+ label=self.label_input,
+ cfg=self.cfg,
+ is_training=self.is_training,
+ mode=self.mode)
+ if loss is not None:
+ loss = fluid.layers.mean(loss)
+ self.network_outputs = [pred]
+ self.loss_ = loss
+
+ def optimizer(self):
+ base_lr = self.get_config_from_sec('TRAIN', 'learning_rate')
+ lr_decay = self.get_config_from_sec('TRAIN', 'learning_rate_decay')
+ step_sizes = self.get_config_from_sec('TRAIN', 'step_sizes')
+ lr_bounds, lr_values = get_learning_rate_decay_list(base_lr, lr_decay,
+ step_sizes)
+ learning_rate = fluid.layers.piecewise_decay(
+ boundaries=lr_bounds, values=lr_values)
+
+ momentum = self.get_config_from_sec('TRAIN', 'momentum')
+ use_nesterov = self.get_config_from_sec('TRAIN', 'nesterov')
+ l2_weight_decay = self.get_config_from_sec('TRAIN', 'weight_decay')
+ logger.info(
+ 'Build up optimizer, \ntype: {}, \nmomentum: {}, \nnesterov: {}, \
+ \nregularization: L2 {}, \nlr_values: {}, lr_bounds: {}'
+ .format('Momentum', momentum, use_nesterov, l2_weight_decay,
+ lr_values, lr_bounds))
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=learning_rate,
+ momentum=momentum,
+ use_nesterov=use_nesterov,
+ regularization=fluid.regularizer.L2Decay(l2_weight_decay))
+ return optimizer
+
+ def loss(self):
+ return self.loss_
+
+ def outputs(self):
+ return self.network_outputs
+
+ def feeds(self):
+ return self.feature_input if self.mode == 'infer' else \
+ self.feature_input + [self.label_input]
+
+ def pretrain_info(self):
+ return None, None
+
+ def weights_info(self):
+ pass
+
+ def load_pretrain_params(self, exe, pretrain, prog, place):
+ load_params_from_file(exe, prog, pretrain, place)
+
+ def load_test_weights(self, exe, weights, prog, place):
+ super(NonLocal, self).load_test_weights(exe, weights, prog, place)
+ pred_w = fluid.global_scope().find_var('pred_w').get_tensor()
+ pred_array = np.array(pred_w)
+ pred_w_shape = pred_array.shape
+ if len(pred_w_shape) == 2:
+ logger.info('reshape for pred_w when test')
+ pred_array = np.transpose(pred_array, (1, 0))
+ pred_w_shape = pred_array.shape
+ pred_array = np.reshape(
+ pred_array, [pred_w_shape[0], pred_w_shape[1], 1, 1, 1])
+ pred_w.set(pred_array.astype('float32'), place)
+
+
+def get_learning_rate_decay_list(base_learning_rate, lr_decay, step_lists):
+ lr_bounds = []
+ lr_values = [base_learning_rate * 1]
+ cur_step = 0
+ for i in range(len(step_lists)):
+ cur_step += step_lists[i]
+ lr_bounds.append(cur_step)
+ decay_rate = lr_decay**(i + 1)
+ lr_values.append(base_learning_rate * decay_rate)
+
+ return lr_bounds, lr_values
diff --git a/PaddleCV/video/models/nonlocal_model/nonlocal_utils.py b/PaddleCV/video/models/nonlocal_model/nonlocal_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..2b6db0831bd328d8c9b3dcf6566a3c5ae6ee3145
--- /dev/null
+++ b/PaddleCV/video/models/nonlocal_model/nonlocal_utils.py
@@ -0,0 +1,98 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import numpy as np
+import paddle.fluid as fluid
+import logging
+logger = logging.getLogger(__name__)
+
+
+def load_params_from_file(exe, prog, pretrained_file, place):
+ logger.info('load params from {}'.format(pretrained_file))
+ if os.path.isdir(pretrained_file):
+ param_list = prog.block(0).all_parameters()
+ param_name_list = [p.name for p in param_list]
+ param_shape = {}
+ for name in param_name_list:
+ param_tensor = fluid.global_scope().find_var(name).get_tensor()
+ param_shape[name] = np.array(param_tensor).shape
+
+ param_name_from_file = os.listdir(pretrained_file)
+ common_names = get_common_names(param_name_list, param_name_from_file)
+
+ logger.info('-------- loading params -----------')
+
+ # load params from file
+        def is_parameter(var):
+            return isinstance(var, fluid.framework.Parameter) and \
+                   os.path.exists(os.path.join(pretrained_file, var.name))
+
+ logger.info("Load pretrain weights from file {}".format(
+ pretrained_file))
+ vars = filter(is_parameter, prog.list_vars())
+ fluid.io.load_vars(exe, pretrained_file, vars=vars, main_program=prog)
+
+ # reset params if necessary
+ for name in common_names:
+ t = fluid.global_scope().find_var(name).get_tensor()
+ t_array = np.array(t)
+ origin_shape = param_shape[name]
+ if t_array.shape == origin_shape:
+ logger.info("load param {}".format(name))
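+            # a 2D (image) kernel that matches the 3D kernel except in the
+            # temporal axis is "inflated": replicated num_inflate times along
+            # axis 2 and rescaled so the filter response keeps its magnitude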
+ elif (t_array.shape[:2] == origin_shape[:2]) and (
+ t_array.shape[-2:] == origin_shape[-2:]):
+ num_inflate = origin_shape[2]
+ stack_t_array = np.stack(
+ [t_array] * num_inflate, axis=2) / float(num_inflate)
+ assert origin_shape == stack_t_array.shape, "inflated shape should be the same with tensor {}".format(
+ name)
+ t.set(stack_t_array.astype('float32'), place)
+ logger.info("load inflated({}) param {}".format(num_inflate,
+ name))
+            else:
+                logger.info("Invalid case for name: {}".format(name))
+                raise ValueError(
+                    "shape of param {} in the pretrained file does not match "
+                    "the program".format(name))
+ logger.info("finished loading params from resnet pretrained model")
+ else:
+        logger.info(
+            "pretrained file {} is not a directory, params not loaded".format(
+                pretrained_file))
+
+
+def get_common_names(param_name_list, param_name_from_file):
+ # name check and return common names both in param_name_list and file
+ common_names = []
+ paddle_only_names = []
+ file_only_names = []
+    logger.info('-------- common params -----------')
+ for name in param_name_list:
+ if name in param_name_from_file:
+ common_names.append(name)
+ logger.info(name)
+ else:
+ paddle_only_names.append(name)
+ logger.info('-------- paddle only params ----------')
+ for name in paddle_only_names:
+ logger.info(name)
+ logger.info('-------- file only params -----------')
+ for name in param_name_from_file:
+ if name in param_name_list:
+ assert name in common_names
+ else:
+ file_only_names.append(name)
+ logger.info(name)
+ return common_names
diff --git a/PaddleCV/video/models/nonlocal_model/resnet_helper.py b/PaddleCV/video/models/nonlocal_model/resnet_helper.py
new file mode 100644
index 0000000000000000000000000000000000000000..1fb5c747ac5a9ecc3dd072444e4ae2cd475b931a
--- /dev/null
+++ b/PaddleCV/video/models/nonlocal_model/resnet_helper.py
@@ -0,0 +1,356 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import unicode_literals
+from __future__ import print_function
+from __future__ import division
+
+import paddle
+import paddle.fluid as fluid
+from paddle.fluid import ParamAttr
+
+import numpy as np
+from . import nonlocal_helper
+
+
+def Conv3dAffine(blob_in,
+ prefix,
+ dim_in,
+ dim_out,
+ filter_size,
+ stride,
+ padding,
+ cfg,
+ group=1,
+ test_mode=False,
+ bn_init=None):
+ blob_out = fluid.layers.conv3d(
+ input=blob_in,
+ num_filters=dim_out,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ groups=group,
+ param_attr=ParamAttr(
+ name=prefix + "_weights", initializer=fluid.initializer.MSRA()),
+ bias_attr=False,
+ name=prefix + "_conv")
+ blob_out_shape = blob_out.shape
+
+ affine_name = "bn" + prefix[3:]
+
+ affine_scale = fluid.layers.create_parameter(
+ shape=[blob_out_shape[1]],
+ dtype=blob_out.dtype,
+ attr=ParamAttr(name=affine_name + '_scale'),
+ default_initializer=fluid.initializer.Constant(value=1.))
+ affine_bias = fluid.layers.create_parameter(
+ shape=[blob_out_shape[1]],
+ dtype=blob_out.dtype,
+ attr=ParamAttr(name=affine_name + '_offset'),
+ default_initializer=fluid.initializer.Constant(value=0.))
+ blob_out = fluid.layers.affine_channel(
+ blob_out, scale=affine_scale, bias=affine_bias, name=affine_name)
+
+ return blob_out
+
+
+def Conv3dBN(blob_in,
+ prefix,
+ dim_in,
+ dim_out,
+ filter_size,
+ stride,
+ padding,
+ cfg,
+ group=1,
+ test_mode=False,
+ bn_init=None):
+ blob_out = fluid.layers.conv3d(
+ input=blob_in,
+ num_filters=dim_out,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ groups=group,
+ param_attr=ParamAttr(
+ name=prefix + "_weights", initializer=fluid.initializer.MSRA()),
+ bias_attr=False,
+ name=prefix + "_conv")
+
+ bn_name = "bn" + prefix[3:]
+
+ blob_out = fluid.layers.batch_norm(
+ blob_out,
+ is_test=test_mode,
+ momentum=cfg.MODEL.bn_momentum,
+ epsilon=cfg.MODEL.bn_epsilon,
+ name=bn_name,
+ param_attr=ParamAttr(
+ name=bn_name + "_scale",
+            initializer=fluid.initializer.Constant(
+                value=bn_init if (bn_init is not None) else 1.),
+ regularizer=fluid.regularizer.L2Decay(cfg.TRAIN.weight_decay_bn)),
+ bias_attr=ParamAttr(
+ name=bn_name + "_offset",
+ regularizer=fluid.regularizer.L2Decay(cfg.TRAIN.weight_decay_bn)),
+ moving_mean_name=bn_name + "_mean",
+ moving_variance_name=bn_name + "_variance")
+ return blob_out
+
+
+# 3d bottleneck
+def bottleneck_transformation_3d(blob_in,
+ dim_in,
+ dim_out,
+ stride,
+ prefix,
+ dim_inner,
+ cfg,
+ group=1,
+ use_temp_conv=1,
+ temp_stride=1,
+ test_mode=False):
+ conv_op = Conv3dAffine if cfg.MODEL.use_affine else Conv3dBN
+
+    # first 1x1x1 branch (temporal kernel becomes 3x1x1 when use_temp_conv == 1)
+ blob_out = conv_op(
+ blob_in,
+ prefix + "_branch2a",
+ dim_in,
+ dim_inner, [1 + use_temp_conv * 2, 1, 1], [temp_stride, 1, 1],
+ [use_temp_conv, 0, 0],
+ cfg,
+ test_mode=test_mode)
+ blob_out = fluid.layers.relu(blob_out, name=prefix + "_branch2a" + "_relu")
+
+ # 3x3 layer
+ blob_out = conv_op(
+ blob_out,
+ prefix + '_branch2b',
+ dim_inner,
+ dim_inner, [1, 3, 3], [1, stride, stride], [0, 1, 1],
+ cfg,
+ group=group,
+ test_mode=test_mode)
+ blob_out = fluid.layers.relu(blob_out, name=prefix + "_branch2b" + "_relu")
+
+ # 1x1 layer, no relu
+ blob_out = conv_op(
+ blob_out,
+ prefix + '_branch2c',
+ dim_inner,
+ dim_out, [1, 1, 1], [1, 1, 1], [0, 0, 0],
+ cfg,
+ test_mode=test_mode,
+ bn_init=cfg.MODEL.bn_init_gamma)
+
+ return blob_out
+
+
+def _add_shortcut_3d(blob_in,
+ prefix,
+ dim_in,
+ dim_out,
+ stride,
+ cfg,
+ temp_stride=1,
+ test_mode=False):
+ if ((dim_in == dim_out) and (temp_stride == 1) and (stride == 1)):
+ # identity mapping (do nothing)
+ return blob_in
+ else:
+ # when dim changes
+ conv_op = Conv3dAffine if cfg.MODEL.use_affine else Conv3dBN
+ blob_out = conv_op(
+ blob_in,
+ prefix,
+ dim_in,
+ dim_out, [1, 1, 1], [temp_stride, stride, stride], [0, 0, 0],
+ cfg,
+ test_mode=test_mode)
+
+ return blob_out
+
+
+# residual block abstraction
+def _generic_residual_block_3d(blob_in,
+ dim_in,
+ dim_out,
+ stride,
+ prefix,
+ dim_inner,
+ cfg,
+ group=1,
+ use_temp_conv=0,
+ temp_stride=1,
+ trans_func=None,
+ test_mode=False):
+ # transformation branch (e.g. 1x1-3x3-1x1, or 3x3-3x3), namely "F(x)"
+ if trans_func is None:
+ trans_func = globals()[cfg.RESNETS.trans_func]
+
+ tr_blob = trans_func(
+ blob_in,
+ dim_in,
+ dim_out,
+ stride,
+ prefix,
+ dim_inner,
+ cfg,
+ group=group,
+ use_temp_conv=use_temp_conv,
+ temp_stride=temp_stride,
+ test_mode=test_mode)
+
+    # create the shortcut branch, namely "x"
+ sc_blob = _add_shortcut_3d(
+ blob_in,
+ prefix + "_branch1",
+ dim_in,
+ dim_out,
+ stride,
+ cfg,
+ temp_stride=temp_stride,
+ test_mode=test_mode)
+
+ # addition, namely, "x + F(x)", and relu
+ sum_blob = fluid.layers.elementwise_add(
+ tr_blob, sc_blob, act='relu', name=prefix + '_sum')
+
+ return sum_blob
+
+
+def res_stage_nonlocal(block_fn,
+ blob_in,
+ dim_in,
+ dim_out,
+ stride,
+ num_blocks,
+ prefix,
+ cfg,
+ dim_inner=None,
+ group=None,
+ use_temp_convs=None,
+ temp_strides=None,
+ batch_size=None,
+ nonlocal_name=None,
+ nonlocal_mod=1000,
+ test_mode=False):
+ # prefix is something like: res2, res3, etc.
+ # each res layer has num_blocks stacked.
+
+ # check dtype and format of use_temp_convs and temp_strides
+ if use_temp_convs is None:
+ use_temp_convs = np.zeros(num_blocks).astype(int)
+ if temp_strides is None:
+ temp_strides = np.ones(num_blocks).astype(int)
+
+ if len(use_temp_convs) < num_blocks:
+ for _ in range(num_blocks - len(use_temp_convs)):
+ use_temp_convs.append(0)
+ temp_strides.append(1)
+
+ for idx in range(num_blocks):
+ block_prefix = '{}{}'.format(prefix, chr(idx + 97))
+ block_stride = 2 if ((idx == 0) and (stride == 2)) else 1
+ blob_in = _generic_residual_block_3d(
+ blob_in,
+ dim_in,
+ dim_out,
+ block_stride,
+ block_prefix,
+ dim_inner,
+ cfg,
+ group=group,
+ use_temp_conv=use_temp_convs[idx],
+ temp_stride=temp_strides[idx],
+ test_mode=test_mode)
+ dim_in = dim_out
+
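+        # insert a non-local block after every nonlocal_mod-th residual block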
+ if idx % nonlocal_mod == nonlocal_mod - 1:
+ blob_in = nonlocal_helper.add_nonlocal(
+ blob_in,
+ dim_in,
+ dim_in,
+ batch_size,
+ nonlocal_name + '_{}'.format(idx),
+ int(dim_in / 2),
+ cfg,
+ test_mode=test_mode)
+
+ return blob_in, dim_in
+
+
+def res_stage_nonlocal_group(block_fn,
+ blob_in,
+ dim_in,
+ dim_out,
+ stride,
+ num_blocks,
+ prefix,
+ cfg,
+ dim_inner=None,
+ group=None,
+ use_temp_convs=None,
+ temp_strides=None,
+ batch_size=None,
+ pool_stride=None,
+ spatial_dim=None,
+ group_size=None,
+ nonlocal_name=None,
+ nonlocal_mod=1000,
+ test_mode=False):
+ # prefix is something like res2, res3, etc.
+ # each res layer has num_blocks stacked
+
+ # check dtype and format of use_temp_convs and temp_strides
+ if use_temp_convs is None:
+ use_temp_convs = np.zeros(num_blocks).astype(int)
+ if temp_strides is None:
+ temp_strides = np.ones(num_blocks).astype(int)
+
+ for idx in range(num_blocks):
+ block_prefix = "{}{}".format(prefix, chr(idx + 97))
+ block_stride = 2 if (idx == 0 and stride == 2) else 1
+ blob_in = _generic_residual_block_3d(
+ blob_in,
+ dim_in,
+ dim_out,
+ block_stride,
+ block_prefix,
+ dim_inner,
+ cfg,
+ group=group,
+ use_temp_conv=use_temp_convs[idx],
+ temp_stride=temp_strides[idx],
+ test_mode=test_mode)
+ dim_in = dim_out
+
+ if idx % nonlocal_mod == nonlocal_mod - 1:
+ blob_in = nonlocal_helper.add_nonlocal_group(
+ blob_in,
+ dim_in,
+ dim_in,
+ batch_size,
+ pool_stride,
+ spatial_dim,
+ spatial_dim,
+ group_size,
+ nonlocal_name + "_{}".format(idx),
+ int(dim_in / 2),
+ cfg,
+ test_mode=test_mode)
+
+ return blob_in, dim_in
diff --git a/PaddleCV/video/models/nonlocal_model/resnet_video.py b/PaddleCV/video/models/nonlocal_model/resnet_video.py
new file mode 100644
index 0000000000000000000000000000000000000000..6a0717566f491f7f0b472ab666aea5dff9392471
--- /dev/null
+++ b/PaddleCV/video/models/nonlocal_model/resnet_video.py
@@ -0,0 +1,371 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import unicode_literals
+from __future__ import print_function
+from __future__ import division
+
+import paddle.fluid as fluid
+from paddle.fluid.param_attr import ParamAttr
+
+from . import resnet_helper
+import logging
+
+logger = logging.getLogger(__name__)
+
+# For more depths, add the block config here
+BLOCK_CONFIG = {
+ 50: (3, 4, 6, 3),
+ 101: (3, 4, 23, 3),
+}
+
+
+# ------------------------------------------------------------------------
+# obtain_arc defines the temporal kernel radius and temporal strides for
+# each layer's residual blocks in a ResNet.
+# e.g. use_temp_convs = 1 means a temporal kernel of 3 is used.
+# In ResNet50, it has (3, 4, 6, 3) blocks in conv2, 3, 4, 5,
+# so the lengths of the corresponding lists are (3, 4, 6, 3).
+# ------------------------------------------------------------------------
+def obtain_arc(arc_type, video_length):
+
+ pool_stride = 1
+
+ # c2d, ResNet50
+ if arc_type == 1:
+ use_temp_convs_1 = [0]
+ temp_strides_1 = [1]
+ use_temp_convs_2 = [0, 0, 0]
+ temp_strides_2 = [1, 1, 1]
+ use_temp_convs_3 = [0, 0, 0, 0]
+ temp_strides_3 = [1, 1, 1, 1]
+ use_temp_convs_4 = [0, ] * 6
+ temp_strides_4 = [1, ] * 6
+ use_temp_convs_5 = [0, 0, 0]
+ temp_strides_5 = [1, 1, 1]
+
+ pool_stride = int(video_length / 2)
+
+ # i3d, ResNet50
+ if arc_type == 2:
+ use_temp_convs_1 = [2]
+ temp_strides_1 = [1]
+ use_temp_convs_2 = [1, 1, 1]
+ temp_strides_2 = [1, 1, 1]
+ use_temp_convs_3 = [1, 0, 1, 0]
+ temp_strides_3 = [1, 1, 1, 1]
+ use_temp_convs_4 = [1, 0, 1, 0, 1, 0]
+ temp_strides_4 = [1, 1, 1, 1, 1, 1]
+ use_temp_convs_5 = [0, 1, 0]
+ temp_strides_5 = [1, 1, 1]
+
+ pool_stride = int(video_length / 2)
+
+ # c2d, ResNet101
+ if arc_type == 3:
+ use_temp_convs_1 = [0]
+ temp_strides_1 = [1]
+ use_temp_convs_2 = [0, 0, 0]
+ temp_strides_2 = [1, 1, 1]
+ use_temp_convs_3 = [0, 0, 0, 0]
+ temp_strides_3 = [1, 1, 1, 1]
+ use_temp_convs_4 = [0, ] * 23
+ temp_strides_4 = [1, ] * 23
+ use_temp_convs_5 = [0, 0, 0]
+ temp_strides_5 = [1, 1, 1]
+
+ pool_stride = int(video_length / 2)
+
+ # i3d, ResNet101
+ if arc_type == 4:
+ use_temp_convs_1 = [2]
+ temp_strides_1 = [1]
+ use_temp_convs_2 = [1, 1, 1]
+ temp_strides_2 = [1, 1, 1]
+ use_temp_convs_3 = [1, 0, 1, 0]
+ temp_strides_3 = [1, 1, 1, 1]
+        use_temp_convs_4 = [1 if i % 2 == 0 else 0 for i in range(23)]
+
+ temp_strides_4 = [1] * 23
+ use_temp_convs_5 = [0, 1, 0]
+ temp_strides_5 = [1, 1, 1]
+
+ pool_stride = int(video_length / 2)
+
+ use_temp_convs_set = [
+ use_temp_convs_1, use_temp_convs_2, use_temp_convs_3, use_temp_convs_4,
+ use_temp_convs_5
+ ]
+ temp_strides_set = [
+ temp_strides_1, temp_strides_2, temp_strides_3, temp_strides_4,
+ temp_strides_5
+ ]
+
+ return use_temp_convs_set, temp_strides_set, pool_stride
+
+
+def create_model(data, label, cfg, is_training=True, mode='train'):
+ group = cfg.RESNETS.num_groups
+ width_per_group = cfg.RESNETS.width_per_group
+ batch_size = int(cfg.TRAIN.batch_size / cfg.TRAIN.num_gpus)
+
+ logger.info('--------------- ResNet-{} {}x{}d-{}, {} ---------------'.
+ format(cfg.MODEL.depth, group, width_per_group,
+ cfg.RESNETS.trans_func, cfg.MODEL.dataset))
+
+ assert cfg.MODEL.depth in BLOCK_CONFIG.keys(), \
+ "Block config is not defined for specified model depth."
+ (n1, n2, n3, n4) = BLOCK_CONFIG[cfg.MODEL.depth]
+
+ res_block = resnet_helper._generic_residual_block_3d
+ dim_inner = group * width_per_group
+
+ use_temp_convs_set, temp_strides_set, pool_stride = obtain_arc(
+ cfg.MODEL.video_arc_choice, cfg[mode.upper()]['video_length'])
+ logger.info(use_temp_convs_set)
+ logger.info(temp_strides_set)
+ conv_blob = fluid.layers.conv3d(
+ input=data,
+ num_filters=64,
+ filter_size=[1 + use_temp_convs_set[0][0] * 2, 7, 7],
+ stride=[temp_strides_set[0][0], 2, 2],
+ padding=[use_temp_convs_set[0][0], 3, 3],
+ param_attr=ParamAttr(
+ name='conv1' + "_weights", initializer=fluid.initializer.MSRA()),
+ bias_attr=False,
+ name='conv1')
+
+    test_mode = (mode != 'train')
+ if cfg.MODEL.use_affine is False:
+ # use bn
+ bn_name = 'bn_conv1'
+ bn_blob = fluid.layers.batch_norm(
+ conv_blob,
+ is_test=test_mode,
+ momentum=cfg.MODEL.bn_momentum,
+ epsilon=cfg.MODEL.bn_epsilon,
+ name=bn_name,
+ param_attr=ParamAttr(
+ name=bn_name + "_scale",
+ regularizer=fluid.regularizer.L2Decay(
+ cfg.TRAIN.weight_decay_bn)),
+ bias_attr=ParamAttr(
+ name=bn_name + "_offset",
+ regularizer=fluid.regularizer.L2Decay(
+ cfg.TRAIN.weight_decay_bn)),
+ moving_mean_name=bn_name + "_mean",
+ moving_variance_name=bn_name + "_variance")
+ else:
+ # use affine
+ affine_name = 'bn_conv1'
+ conv_blob_shape = conv_blob.shape
+ affine_scale = fluid.layers.create_parameter(
+ shape=[conv_blob_shape[1]],
+ dtype=conv_blob.dtype,
+ attr=ParamAttr(name=affine_name + '_scale'),
+ default_initializer=fluid.initializer.Constant(value=1.))
+ affine_bias = fluid.layers.create_parameter(
+ shape=[conv_blob_shape[1]],
+ dtype=conv_blob.dtype,
+ attr=ParamAttr(name=affine_name + '_offset'),
+ default_initializer=fluid.initializer.Constant(value=0.))
+ bn_blob = fluid.layers.affine_channel(
+ conv_blob, scale=affine_scale, bias=affine_bias, name=affine_name)
+
+ # relu
+ relu_blob = fluid.layers.relu(bn_blob, name='res_conv1_bn_relu')
+ # max pool
+ max_pool = fluid.layers.pool3d(
+ input=relu_blob,
+ pool_size=[1, 3, 3],
+ pool_type='max',
+ pool_stride=[1, 2, 2],
+ pool_padding=[0, 0, 0],
+ name='pool1')
+
+ # building res block
+ if cfg.MODEL.depth in [50, 101]:
+ blob_in, dim_in = resnet_helper.res_stage_nonlocal(
+ res_block,
+ max_pool,
+ 64,
+ 256,
+ stride=1,
+ num_blocks=n1,
+ prefix='res2',
+ cfg=cfg,
+ dim_inner=dim_inner,
+ group=group,
+ use_temp_convs=use_temp_convs_set[1],
+ temp_strides=temp_strides_set[1],
+ test_mode=test_mode)
+
+ layer_mod = cfg.NONLOCAL.layer_mod
+ if cfg.MODEL.depth == 101:
+ layer_mod = 2
+ if cfg.NONLOCAL.conv3_nonlocal is False:
+ layer_mod = 1000
+
+ blob_in = fluid.layers.pool3d(
+ blob_in,
+ pool_size=[2, 1, 1],
+ pool_type='max',
+ pool_stride=[2, 1, 1],
+ pool_padding=[0, 0, 0],
+ name='pool2')
+
+ if cfg.MODEL.use_affine is False:
+ blob_in, dim_in = resnet_helper.res_stage_nonlocal(
+ res_block,
+ blob_in,
+ dim_in,
+ 512,
+ stride=2,
+ num_blocks=n2,
+ prefix='res3',
+ cfg=cfg,
+ dim_inner=dim_inner * 2,
+ group=group,
+ use_temp_convs=use_temp_convs_set[2],
+ temp_strides=temp_strides_set[2],
+ batch_size=batch_size,
+ nonlocal_name="nonlocal_conv3",
+ nonlocal_mod=layer_mod,
+ test_mode=test_mode)
+ else:
+ crop_size = cfg[mode.upper()]['crop_size']
+ blob_in, dim_in = resnet_helper.res_stage_nonlocal_group(
+ res_block,
+ blob_in,
+ dim_in,
+ 512,
+ stride=2,
+ num_blocks=n2,
+ prefix='res3',
+ cfg=cfg,
+ dim_inner=dim_inner * 2,
+ group=group,
+ use_temp_convs=use_temp_convs_set[2],
+ temp_strides=temp_strides_set[2],
+ batch_size=batch_size,
+ pool_stride=pool_stride,
+ spatial_dim=int(crop_size / 8),
+ group_size=4,
+ nonlocal_name="nonlocal_conv3_group",
+ nonlocal_mod=layer_mod,
+ test_mode=test_mode)
+
+ layer_mod = cfg.NONLOCAL.layer_mod
+ if cfg.MODEL.depth == 101:
+ layer_mod = layer_mod * 4 - 1
+ if cfg.NONLOCAL.conv4_nonlocal is False:
+ layer_mod = 1000
+
+ blob_in, dim_in = resnet_helper.res_stage_nonlocal(
+ res_block,
+ blob_in,
+ dim_in,
+ 1024,
+ stride=2,
+ num_blocks=n3,
+ prefix='res4',
+ cfg=cfg,
+ dim_inner=dim_inner * 4,
+ group=group,
+ use_temp_convs=use_temp_convs_set[3],
+ temp_strides=temp_strides_set[3],
+ batch_size=batch_size,
+ nonlocal_name="nonlocal_conv4",
+ nonlocal_mod=layer_mod,
+ test_mode=test_mode)
+
+ blob_in, dim_in = resnet_helper.res_stage_nonlocal(
+ res_block,
+ blob_in,
+ dim_in,
+ 2048,
+ stride=2,
+ num_blocks=n4,
+ prefix='res5',
+ cfg=cfg,
+ dim_inner=dim_inner * 8,
+ group=group,
+ use_temp_convs=use_temp_convs_set[4],
+ temp_strides=temp_strides_set[4],
+ test_mode=test_mode)
+
+ else:
+ raise Exception("Unsupported network settings.")
+
+ blob_out = fluid.layers.pool3d(
+ blob_in,
+ pool_size=[pool_stride, 7, 7],
+ pool_type='avg',
+ pool_stride=[1, 1, 1],
+ pool_padding=[0, 0, 0],
+ name='pool5')
+
+ if (cfg.TRAIN.dropout_rate > 0) and (test_mode is False):
+ blob_out = fluid.layers.dropout(
+ blob_out, cfg.TRAIN.dropout_rate, is_test=test_mode)
+
+ if mode in ['train', 'valid']:
+ blob_out = fluid.layers.fc(
+ blob_out,
+ cfg.MODEL.num_classes,
+ param_attr=ParamAttr(
+ name='pred' + "_w",
+ initializer=fluid.initializer.Normal(
+ loc=0.0, scale=cfg.MODEL.fc_init_std)),
+ bias_attr=ParamAttr(
+ name='pred' + "_b",
+ initializer=fluid.initializer.Constant(value=0.)),
+ name='pred')
+ elif mode in ['test', 'infer']:
+ blob_out = fluid.layers.conv3d(
+ input=blob_out,
+ num_filters=cfg.MODEL.num_classes,
+ filter_size=[1, 1, 1],
+ stride=[1, 1, 1],
+ padding=[0, 0, 0],
+ param_attr=ParamAttr(
+ name='pred' + "_w", initializer=fluid.initializer.MSRA()),
+ bias_attr=ParamAttr(
+ name='pred' + "_b",
+ initializer=fluid.initializer.Constant(value=0.)),
+ name='pred')
+
+ if (mode == 'train') or (mode == 'valid'):
+ softmax = fluid.layers.softmax(blob_out)
+ loss = fluid.layers.cross_entropy(
+ softmax, label, soft_label=False, ignore_index=-100)
+
+ elif (mode == 'test') or (mode == 'infer'):
+ # fully convolutional testing, when loading test model,
+ # params should be copied from train_prog fc layer named pred
+ blob_out = fluid.layers.transpose(
+ blob_out, [0, 2, 3, 4, 1], name='pred_tr')
+ blob_out = fluid.layers.softmax(blob_out, name='softmax_conv')
+ softmax = fluid.layers.reduce_mean(
+ blob_out, dim=[1, 2, 3], keep_dim=False, name='softmax')
+ loss = None
+ else:
+        raise NotImplementedError("mode {} is not supported".format(mode))
+
+ return softmax, loss
diff --git a/PaddleCV/video/models/stnet/README.md b/PaddleCV/video/models/stnet/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..15fe1ef3b458678b016167e197a00fc7fa3d8239
--- /dev/null
+++ b/PaddleCV/video/models/stnet/README.md
@@ -0,0 +1,109 @@
+# StNet Video Classification Model
+
+---
+## Table of Contents
+
+- [Model introduction](#model-introduction)
+- [Data preparation](#data-preparation)
+- [Model training](#model-training)
+- [Model evaluation](#model-evaluation)
+- [Model inference](#model-inference)
+- [Reference papers](#reference-papers)
+
+
+## Model introduction
+
+The StNet framework is the base network that won the ActivityNet Kinetics Challenge 2018. This release is the StNet model implemented on ResNet-50; variants based on other backbone networks can be configured in the same way. The model introduces the concept of a "super-image": 2D convolutions are performed on super-images to model local spatial-temporal correlations in the video. A temporal modeling block further models the global spatial-temporal dependencies of the video, and finally a temporal Xception block performs long-range temporal modeling on the extracted feature sequence. The main StNet architecture is shown below:
+
+
+
+StNet Framework Overview
+
+
+For details, please refer to the AAAI 2019 paper [StNet: Local and Global Spatial-Temporal Modeling for Human Action Recognition](https://arxiv.org/abs/1811.01549).
+
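+Below is a minimal numpy sketch (hypothetical shapes, not the released code) of how a batch of sampled frame groups is folded into super-images before the 2D backbone:
+
+    import numpy as np
+
+    # hypothetical input: batch=2, seg_num=7 segments, seglen=5 RGB frames each
+    B, seg_num, seglen, C, H, W = 2, 7, 5, 3, 224, 224
+    frames = np.random.rand(B, seg_num, seglen * C, H, W).astype('float32')
+
+    # fold the segment dimension into the batch dimension so that each
+    # "super-image" of seglen stacked frames goes through an ordinary 2D CNN
+    super_images = frames.reshape(B * seg_num, seglen * C, H, W)
+    print(super_images.shape)  # (14, 15, 224, 224)
+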
+## Data preparation
+
+StNet is trained on the Kinetics-400 action recognition dataset released by DeepMind. For data download and preparation, please refer to the [data instructions](../../dataset/README.md).
+
+## Model training
+
+After the data is prepared, training can be started in either of the following two ways:
+
+ python train.py --model_name=STNET
+ --config=./configs/stnet.txt
+ --save_dir=checkpoints
+ --log_interval=10
+ --valid_interval=1
+
+ bash scripts/train/train_stnet.sh
+
+- You can download the released [model](https://paddlemodels.bj.bcebos.com/video_classification/stnet_kinetics.tar.gz) and point `--resume` at the weight path for finetuning and further development.
+
+**Data reader:** the model reads `mp4` data from the Kinetics-400 dataset, samples `seg_num` segments per video and `seg_len` frames per segment, applies random augmentation to each frame, and rescales it to `target_size`.
+
+**Training strategy:**
+
+* Momentum optimizer with momentum=0.9
+* Weight decay coefficient of 1e-4
+* The learning rate is decayed by a factor of 0.1 at 1/3 and 2/3 of the total number of training epochs (see the sketch below)
+
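+As an illustration of this schedule, a minimal sketch with hypothetical config values (120 epochs, 1000 iterations per epoch) of the piecewise boundaries and learning-rate values:
+
+    num_epochs, step, base_lr, lr_decay = 120, 1000, 0.01, 0.1
+    bd = [num_epochs // 3 * step, num_epochs * 2 // 3 * step]
+    lr = [base_lr, base_lr * lr_decay, base_lr * lr_decay * lr_decay]
+    # bd = [40000, 80000], lr = [0.01, 0.001, 0.0001]
+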
+**Notes:**
+
+* StNet was trained with PaddlePaddle Fluid 1.3 + cudnn5.1. With cudnn 7.0 or later, batchnorm produces abnormal moving mean and moving average values; this issue is still being fixed. It is recommended to specify the cudnn version when installing PaddlePaddle,
+
+    pip install paddlepaddle\_gpu==1.3.0.post85
+
+or download and install the whl package for cuda8.0\_cudnn5\_avx\_mkl from the PaddlePaddle whl package [download page](http://paddlepaddle.org/documentation/docs/zh/1.3/beginners_guide/install/Tables.html/#permalink-4--whl-release).
+For detailed instructions on installing PaddlePaddle, please refer to the [installation document](http://www.paddlepaddle.org/documentation/docs/zh/1.3/beginners_guide/install/index_cn.html).
+
+
+## Model evaluation
+
+The model can be evaluated in either of the following two ways:
+
+ python test.py --model_name=STNET
+ --config=configs/stnet.txt
+ --log_interval=1
+ --weights=$PATH_TO_WEIGHTS
+
+    bash scripts/test/test_stnet.sh
+
+- When evaluating with `scripts/test/test_stnet.sh`, modify the `--weights` argument in the script to specify the weights to be evaluated.
+
+- If `--weights` is not specified, the script downloads the released [model](https://paddlemodels.bj.bcebos.com/video_classification/stnet_kinetics.tar.gz) for evaluation.
+
+With the following parameters:
+
+| Parameter | Value |
+| :---------: | :----: |
+| seg\_num | 25 |
+| seglen | 5 |
+| target\_size | 256 |
+
+the evaluation accuracy on the Kinetics-400 validation set is:
+
+| Metric | Accuracy |
+| :---------: | :----: |
+| TOP\_1 | 0.69 |
+
+
+## Model inference
+
+Model inference can be run with the following command:
+
+ python infer.py --model_name=stnet
+ --config=configs/stnet.txt
+ --log_interval=1
+ --weights=$PATH_TO_WEIGHTS
+ --filelist=$FILELIST
+
+- Inference results are stored in `STNET_infer_result` in `pickle` format.
+
+- If `--weights` is not specified, the script downloads the released [model](https://paddlemodels.bj.bcebos.com/video_classification/stnet_kinetics.tar.gz) for inference.
+
+
+## Reference papers
+
+- [StNet: Local and Global Spatial-Temporal Modeling for Human Action Recognition](https://arxiv.org/abs/1811.01549), Dongliang He, Zhichao Zhou, Chuang Gan, Fu Li, Xiao Liu, Yandong Li, Limin Wang, Shilei Wen
+
diff --git a/PaddleCV/video/models/stnet/__init__.py b/PaddleCV/video/models/stnet/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..db952550a12b34853556fa42bba04c823bc7cbe4
--- /dev/null
+++ b/PaddleCV/video/models/stnet/__init__.py
@@ -0,0 +1 @@
+from .stnet import *
diff --git a/PaddleCV/video/models/stnet/stnet.py b/PaddleCV/video/models/stnet/stnet.py
new file mode 100644
index 0000000000000000000000000000000000000000..f82bba50a5f0fd8b565d0906946c9322a45f9002
--- /dev/null
+++ b/PaddleCV/video/models/stnet/stnet.py
@@ -0,0 +1,151 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+import numpy as np
+import paddle.fluid as fluid
+
+from ..model import ModelBase
+from .stnet_res_model import StNet_ResNet
+
+import logging
+logger = logging.getLogger(__name__)
+
+__all__ = ["STNET"]
+
+
+class STNET(ModelBase):
+ def __init__(self, name, cfg, mode='train'):
+ super(STNET, self).__init__(name, cfg, mode=mode)
+ self.get_config()
+
+ def get_config(self):
+ self.num_classes = self.get_config_from_sec('model', 'num_classes')
+ self.seg_num = self.get_config_from_sec('model', 'seg_num')
+ self.seglen = self.get_config_from_sec('model', 'seglen')
+ self.image_mean = self.get_config_from_sec('model', 'image_mean')
+ self.image_std = self.get_config_from_sec('model', 'image_std')
+ self.num_layers = self.get_config_from_sec('model', 'num_layers')
+
+ self.num_epochs = self.get_config_from_sec('train', 'epoch')
+ self.total_videos = self.get_config_from_sec('train', 'total_videos')
+ self.base_learning_rate = self.get_config_from_sec('train',
+ 'learning_rate')
+ self.learning_rate_decay = self.get_config_from_sec(
+ 'train', 'learning_rate_decay')
+ self.l2_weight_decay = self.get_config_from_sec('train',
+ 'l2_weight_decay')
+ self.momentum = self.get_config_from_sec('train', 'momentum')
+
+ self.seg_num = self.get_config_from_sec(self.mode, 'seg_num', self.seg_num)
+ self.target_size = self.get_config_from_sec(self.mode, 'target_size')
+ self.batch_size = self.get_config_from_sec(self.mode, 'batch_size')
+
+ def build_input(self, use_pyreader=True):
+ image_shape = [3, self.target_size, self.target_size]
+ image_shape[0] = image_shape[0] * self.seglen
+ image_shape = [self.seg_num] + image_shape
+ self.use_pyreader = use_pyreader
+ if use_pyreader:
+ assert self.mode != 'infer', \
+                'pyreader is not recommended for infer mode, please set use_pyreader to False.'
+ py_reader = fluid.layers.py_reader(
+ capacity=100,
+ shapes=[[-1] + image_shape, [-1] + [1]],
+ dtypes=['float32', 'int64'],
+ name='train_py_reader'
+ if self.is_training else 'test_py_reader',
+ use_double_buffer=True)
+ image, label = fluid.layers.read_file(py_reader)
+ self.py_reader = py_reader
+ else:
+ image = fluid.layers.data(
+ name='image', shape=image_shape, dtype='float32')
+ if self.mode != 'infer':
+ label = fluid.layers.data(
+ name='label', shape=[1], dtype='int64')
+ else:
+ label = None
+ self.feature_input = [image]
+ self.label_input = label
+
+ def create_model_args(self):
+ cfg = {}
+ cfg['layers'] = self.num_layers
+ cfg['class_dim'] = self.num_classes
+ cfg['seg_num'] = self.seg_num
+ cfg['seglen'] = self.seglen
+ return cfg
+
+ def build_model(self):
+ cfg = self.create_model_args()
+ videomodel = StNet_ResNet(layers = cfg['layers'], seg_num = cfg['seg_num'], \
+ seglen = cfg['seglen'], is_training = (self.mode == 'train'))
+ out = videomodel.net(input=self.feature_input[0],
+ class_dim=cfg['class_dim'])
+ self.network_outputs = [out]
+
+ def optimizer(self):
+ epoch_points = [self.num_epochs / 3, self.num_epochs * 2 / 3]
+ total_videos = self.total_videos
+ step = int(total_videos / self.batch_size + 1)
+ bd = [e * step for e in epoch_points]
+ base_lr = self.base_learning_rate
+ lr_decay = self.learning_rate_decay
+ lr = [base_lr, base_lr * lr_decay, base_lr * lr_decay * lr_decay]
+ l2_weight_decay = self.l2_weight_decay
+ momentum = self.momentum
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=fluid.layers.piecewise_decay(
+ boundaries=bd, values=lr),
+ momentum=momentum,
+ regularization=fluid.regularizer.L2Decay(l2_weight_decay))
+
+ return optimizer
+
+ def loss(self):
+ cost = fluid.layers.cross_entropy(input=self.network_outputs[0], \
+ label=self.label_input, ignore_index=-1)
+ self.loss_ = fluid.layers.mean(x=cost)
+ return self.loss_
+
+ def outputs(self):
+ return self.network_outputs
+
+ def feeds(self):
+ return self.feature_input if self.mode == 'infer' else self.feature_input + [
+ self.label_input
+ ]
+
+ def pretrain_info(self):
+ return ('ResNet50_pretrained', 'https://paddlemodels.bj.bcebos.com/video_classification/ResNet50_pretrained.tar.gz')
+
+ def weights_info(self):
+ return ('stnet_kinetics',
+ 'https://paddlemodels.bj.bcebos.com/video_classification/stnet_kinetics.tar.gz')
+
+ def load_pretrain_params(self, exe, pretrain, prog, place):
+ def is_parameter(var):
+ if isinstance(var, fluid.framework.Parameter):
+ return isinstance(var, fluid.framework.Parameter) and (not ("fc_0" in var.name)) \
+ and (not ("batch_norm" in var.name)) and (not ("xception" in var.name)) and (not ("conv3d" in var.name))
+
+ logger.info("Load pretrain weights from {}, exclude fc, batch_norm, xception, conv3d layers.".format(pretrain))
+ vars = filter(is_parameter, prog.list_vars())
+ fluid.io.load_vars(exe, pretrain, vars=vars, main_program=prog)
+
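+        # adapt conv1 weights pretrained on single RGB frames to seglen stacked
+        # frames: average over the 3 input channels, divide by seglen, then
+        # tile to 3 * seglen channels so the activation magnitude is preserved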
+ param_tensor = fluid.global_scope().find_var(
+ "conv1_weights").get_tensor()
+ param_numpy = np.array(param_tensor)
+ param_numpy = np.mean(param_numpy, axis=1, keepdims=True) / self.seglen
+ param_numpy = np.repeat(param_numpy, 3 * self.seglen, axis=1)
+ param_tensor.set(param_numpy.astype(np.float32), place)
diff --git a/PaddleCV/video/models/stnet/stnet_res_model.py b/PaddleCV/video/models/stnet/stnet_res_model.py
new file mode 100644
index 0000000000000000000000000000000000000000..71a22c4f869161a8b92ec5c79a69b85bc68d4c86
--- /dev/null
+++ b/PaddleCV/video/models/stnet/stnet_res_model.py
@@ -0,0 +1,312 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import time
+import sys
+import paddle.fluid as fluid
+import math
+
+
+class StNet_ResNet():
+ def __init__(self, layers=50, seg_num=7, seglen=5, is_training=True):
+ self.layers = layers
+ self.seglen = seglen
+ self.seg_num = seg_num
+ self.is_training = is_training
+
+ def temporal_conv_bn(
+ self,
+ input, #(B*seg_num, c, h, w)
+ num_filters,
+ filter_size=(3, 1, 1),
+ padding=(1, 0, 0)):
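+        # temporal modeling block: lift the (B*seg_num, C, H, W) feature map
+        # to 5D, apply a 3x1x1 temporal conv + BN, and add a residual link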
+ #(B, seg_num, c, h, w)
+ in_reshape = fluid.layers.reshape(
+ x=input,
+ shape=[
+ -1, self.seg_num, input.shape[-3], input.shape[-2],
+ input.shape[-1]
+ ])
+ in_transpose = fluid.layers.transpose(in_reshape, perm=[0, 2, 1, 3, 4])
+
+ conv = fluid.layers.conv3d(
+ input=in_transpose,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=1,
+ groups=1,
+ padding=padding,
+ act='relu',
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.MSRAInitializer()),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=0.0)))
+
+ out = fluid.layers.batch_norm(
+ input=conv,
+ act=None,
+ is_test=(not self.is_training),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=1.0)),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=0.0)))
+ out = out + in_transpose
+ out = fluid.layers.transpose(out, perm=[0, 2, 1, 3, 4])
+ out = fluid.layers.reshape(x=out, shape=input.shape)
+ return out
+
+ def xception(self, input): #(B, C, seg_num,1)
+ bn = fluid.layers.batch_norm(
+ input=input,
+ act=None,
+ name="xception_bn",
+ is_test=(not self.is_training),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=1.0)),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=0.0)))
+ att_conv = fluid.layers.conv2d(
+ input=bn,
+ num_filters=2048,
+ filter_size=[3, 1],
+ stride=[1, 1],
+ padding=[1, 0],
+ groups=2048,
+ name="xception_att_conv",
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.MSRAInitializer()),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=0)))
+ att_2 = fluid.layers.conv2d(
+ input=att_conv,
+ num_filters=1024,
+ filter_size=[1, 1],
+ stride=[1, 1],
+ name="xception_att_2",
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.MSRAInitializer()),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=0)))
+ bndw = fluid.layers.batch_norm(
+ input=att_2,
+ act="relu",
+ name="xception_bndw",
+ is_test=(not self.is_training),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=1.0)),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=0.0)))
+ att1 = fluid.layers.conv2d(
+ input=bndw,
+ num_filters=1024,
+ filter_size=[3, 1],
+ stride=[1, 1],
+ padding=[1, 0],
+ groups=1024,
+ name="xception_att1",
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.MSRAInitializer()),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=0)))
+ att1_2 = fluid.layers.conv2d(
+ input=att1,
+ num_filters=1024,
+ filter_size=[1, 1],
+ stride=[1, 1],
+ name="xception_att1_2",
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.MSRAInitializer()),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=0)))
+ dw = fluid.layers.conv2d(
+ input=bn,
+ num_filters=1024,
+ filter_size=[1, 1],
+ stride=[1, 1],
+ name="xception_dw",
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.MSRAInitializer()),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=0)))
+ add_to = dw + att1_2
+ bn2 = fluid.layers.batch_norm(
+ input=add_to,
+ act=None,
+ name='xception_bn2',
+ is_test=(not self.is_training),
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=1.0)),
+ bias_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.ConstantInitializer(value=0.0)))
+ return fluid.layers.relu(bn2)
+
+ def conv_bn_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None,
+ name=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ act=None,
+ param_attr=fluid.param_attr.ParamAttr(name=name + "_weights"),
+ bias_attr=False,
+ #name = name+".conv2d.output.1"
+ )
+ if name == "conv1":
+ bn_name = "bn_" + name
+ else:
+ bn_name = "bn" + name[3:]
+ return fluid.layers.batch_norm(
+ input=conv,
+ act=act,
+ is_test=(not self.is_training),
+ #name=bn_name+'.output.1',
+ param_attr=fluid.param_attr.ParamAttr(name=bn_name + "_scale"),
+ bias_attr=fluid.param_attr.ParamAttr(bn_name + '_offset'),
+ moving_mean_name=bn_name + "_mean",
+ moving_variance_name=bn_name + '_variance')
+
+ def shortcut(self, input, ch_out, stride, name):
+ ch_in = input.shape[1]
+ if ch_in != ch_out or stride != 1:
+ return self.conv_bn_layer(input, ch_out, 1, stride, name=name)
+ else:
+ return input
+
+ def bottleneck_block(self, input, num_filters, stride, name):
+ conv0 = self.conv_bn_layer(
+ input=input,
+ num_filters=num_filters,
+ filter_size=1,
+ act='relu',
+ name=name + "_branch2a")
+ conv1 = self.conv_bn_layer(
+ input=conv0,
+ num_filters=num_filters,
+ filter_size=3,
+ stride=stride,
+ act='relu',
+ name=name + "_branch2b")
+ conv2 = self.conv_bn_layer(
+ input=conv1,
+ num_filters=num_filters * 4,
+ filter_size=1,
+ act=None,
+ name=name + "_branch2c")
+
+ short = self.shortcut(
+ input, num_filters * 4, stride, name=name + "_branch1")
+
+ return fluid.layers.elementwise_add(
+ x=short,
+ y=conv2,
+ act='relu',
+ #name=".add.output.5"
+ )
+
+ def net(self, input, class_dim=101):
+ layers = self.layers
+ seg_num = self.seg_num
+ seglen = self.seglen
+
+ supported_layers = [50, 101, 152]
+ if layers not in supported_layers:
+            raise ValueError("supported layers are {}, but the input layer "
+                             "is {}".format(supported_layers, layers))
+
+ # reshape input
+ # [B, seg_num, seglen*c, H, W] --> [B*seg_num, seglen*c, H, W]
+ channels = input.shape[2]
+ short_size = input.shape[3]
+ input = fluid.layers.reshape(
+ x=input, shape=[-1, channels, short_size, short_size])
+
+ if layers == 50:
+ depth = [3, 4, 6, 3]
+ elif layers == 101:
+ depth = [3, 4, 23, 3]
+ elif layers == 152:
+ depth = [3, 8, 36, 3]
+ num_filters = [64, 128, 256, 512]
+
+ conv = self.conv_bn_layer(
+ input=input,
+ num_filters=64,
+ filter_size=7,
+ stride=2,
+ act='relu',
+ name='conv1')
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+
+ for block in range(len(depth)):
+ for i in range(depth[block]):
+ if layers in [101, 152] and block == 2:
+ if i == 0:
+ conv_name = "res" + str(block + 2) + "a"
+ else:
+ conv_name = "res" + str(block + 2) + "b" + str(i)
+ else:
+ conv_name = "res" + str(block + 2) + chr(97 + i)
+
+ conv = self.bottleneck_block(
+ input=conv,
+ num_filters=num_filters[block],
+ stride=2 if i == 0 and block != 0 else 1,
+ name=conv_name)
+ if block == 1:
+ #insert the first temporal modeling block
+ conv = self.temporal_conv_bn(input=conv, num_filters=512)
+ if block == 2:
+ #insert the second temporal modeling block
+ conv = self.temporal_conv_bn(input=conv, num_filters=1024)
+
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=7, pool_type='avg', global_pooling=True)
+
+ feature = fluid.layers.reshape(
+ x=pool, shape=[-1, seg_num, pool.shape[1], 1])
+ feature = fluid.layers.transpose(feature, perm=[0, 2, 1, 3])
+
+ #append the temporal Xception block
+ xfeat = self.xception(feature) #(B, 1024, seg_num, 1)
+ out = fluid.layers.pool2d(
+ input=xfeat,
+ pool_size=(seg_num, 1),
+ pool_type='max',
+ global_pooling=True)
+ out = fluid.layers.reshape(x=out, shape=[-1, 1024])
+
+ stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
+ out = fluid.layers.fc(input=out,
+ size=class_dim,
+ act='softmax',
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv,
+ stdv)))
+ return out
diff --git a/PaddleCV/video/models/tsm/README.md b/PaddleCV/video/models/tsm/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..0fe81f7d1019f86988366eea15b10d514b82fa0c
--- /dev/null
+++ b/PaddleCV/video/models/tsm/README.md
@@ -0,0 +1,91 @@
+# TSM Video Classification Model
+
+---
+## Contents
+
+- [Model introduction](#model-introduction)
+- [Data preparation](#data-preparation)
+- [Model training](#model-training)
+- [Model evaluation](#model-evaluation)
+- [Model inference](#model-inference)
+- [Reference papers](#reference-papers)
+
+
+## Model introduction
+
+The Temporal Shift Module, proposed by Ji Lin, Chuang Gan et al. of MIT and the IBM Watson AI Lab, improves a network's video understanding ability through temporal shifting; its shift operation is illustrated below.
+
+
+
+Temporal shift module
+
+
+In the figure above, the matrix represents the temporal and channel dimensions of a feature map. Part of the channels are shifted one step forward along the temporal dimension and another part one step backward, with the vacated positions padded with zeros. This introduces context interaction along the temporal dimension into the feature map and improves video understanding over time.
+
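+A minimal numpy sketch of the shift operation (hypothetical tensor sizes; in this model the shift is performed by `fluid.layers.temporal_shift` with a shift ratio of 1/8):
+
+    import numpy as np
+
+    N, T, C, H, W = 1, 4, 8, 1, 1  # hypothetical sizes
+    x = np.random.rand(N, T, C, H, W).astype('float32')
+    out = np.zeros_like(x)
+    fold = C // 8
+    out[:, 1:, :fold] = x[:, :-1, :fold]                  # shift forward in time
+    out[:, :-1, fold:2 * fold] = x[:, 1:, fold:2 * fold]  # shift backward in time
+    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]             # remaining channels stay
+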
+The TSM model inserts the Temporal Shift Module into a ResNet network to build a video classification model; the version implemented in this model zoo uses ResNet-50 as the backbone.
+
+For details, please refer to the paper [Temporal Shift Module for Efficient Video Understanding](https://arxiv.org/abs/1811.08383v1).
+
+## Data preparation
+
+TSM is trained on the Kinetics-400 action recognition dataset released by DeepMind. For data download and preparation, please refer to the [data instructions](../../dataset/README.md).
+
+## Model training
+
+After the data is prepared, training can be started in either of the following two ways:
+
+ python train.py --model_name=TSM
+ --config=./configs/tsm.txt
+ --save_dir=checkpoints
+ --log_interval=10
+ --valid_interval=1
+
+ bash scripts/train/train_tsm.sh
+
+- You can download the released [model](https://paddlemodels.bj.bcebos.com/video_classification/tsm_kinetics.tar.gz) and point `--resume` at the weight path for finetuning and further development.
+
+**Data reader:** the model reads `mp4` data from the Kinetics-400 dataset, samples `seg_num` segments per video and 1 frame per segment, applies random augmentation to each frame, and rescales it to `target_size`.
+
+**Training strategy:**
+
+* Momentum optimizer with momentum=0.9
+* Weight decay coefficient of 1e-4
+
+## Model evaluation
+
+The model can be evaluated in either of the following two ways:
+
+ python test.py --model_name=TSM
+ --config=configs/tsm.txt
+ --log_interval=1
+ --weights=$PATH_TO_WEIGHTS
+
+ bash scripts/test/test_tsm.sh
+
+- When evaluating with `scripts/test/test_tsm.sh`, modify the `--weights` argument in the script to specify the weights to be evaluated.
+
+- If `--weights` is not specified, the script downloads the released [model](https://paddlemodels.bj.bcebos.com/video_classification/tsm_kinetics.tar.gz) for evaluation.
+
+With the following parameters, the evaluation accuracy on the Kinetics-400 validation set is:
+
+| seg\_num | target\_size | Top-1 |
+| :------: | :----------: | :----: |
+| 8 | 224 | 0.70 |
+
+## Model inference
+
+Model inference can be run with the following command:
+
+ python infer.py --model_name=TSM
+ --config=configs/tsm.txt
+ --log_interval=1
+ --weights=$PATH_TO_WEIGHTS
+ --filelist=$FILELIST
+
+- Inference results are stored in `TSM_infer_result` in `pickle` format.
+
+- If `--weights` is not specified, the script downloads the released [model](https://paddlemodels.bj.bcebos.com/video_classification/tsm_kinetics.tar.gz) for inference.
+
+## Reference papers
+
+- [Temporal Shift Module for Efficient Video Understanding](https://arxiv.org/abs/1811.08383v1), Ji Lin, Chuang Gan, Song Han
diff --git a/PaddleCV/video/models/tsm/__init__.py b/PaddleCV/video/models/tsm/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..0a939d369b6980ad977f46827a7dd330e33e7bbe
--- /dev/null
+++ b/PaddleCV/video/models/tsm/__init__.py
@@ -0,0 +1 @@
+from .tsm import *
diff --git a/PaddleCV/video/models/tsm/tsm.py b/PaddleCV/video/models/tsm/tsm.py
new file mode 100644
index 0000000000000000000000000000000000000000..7d8dcc77661414319597858d801176cc855262e8
--- /dev/null
+++ b/PaddleCV/video/models/tsm/tsm.py
@@ -0,0 +1,138 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle.fluid as fluid
+from paddle.fluid import ParamAttr
+
+from ..model import ModelBase
+from .tsm_res_model import TSM_ResNet
+
+import logging
+logger = logging.getLogger(__name__)
+
+__all__ = ["TSM"]
+
+
+class TSM(ModelBase):
+ def __init__(self, name, cfg, mode='train'):
+ super(TSM, self).__init__(name, cfg, mode=mode)
+ self.get_config()
+
+ def get_config(self):
+ self.num_classes = self.get_config_from_sec('model', 'num_classes')
+ self.seg_num = self.get_config_from_sec('model', 'seg_num')
+ self.seglen = self.get_config_from_sec('model', 'seglen')
+ self.image_mean = self.get_config_from_sec('model', 'image_mean')
+ self.image_std = self.get_config_from_sec('model', 'image_std')
+ self.num_layers = self.get_config_from_sec('model', 'num_layers')
+
+ self.num_epochs = self.get_config_from_sec('train', 'epoch')
+ self.total_videos = self.get_config_from_sec('train', 'total_videos')
+ self.base_learning_rate = self.get_config_from_sec('train',
+ 'learning_rate')
+ self.learning_rate_decay = self.get_config_from_sec(
+ 'train', 'learning_rate_decay')
+ self.decay_epochs = self.get_config_from_sec('train', 'decay_epochs')
+ self.l2_weight_decay = self.get_config_from_sec('train',
+ 'l2_weight_decay')
+ self.momentum = self.get_config_from_sec('train', 'momentum')
+
+ self.target_size = self.get_config_from_sec(self.mode, 'target_size')
+ self.batch_size = self.get_config_from_sec(self.mode, 'batch_size')
+
+ def build_input(self, use_pyreader=True):
+ image_shape = [3, self.target_size, self.target_size]
+ image_shape[0] = image_shape[0] * self.seglen
+ image_shape = [self.seg_num] + image_shape
+ self.use_pyreader = use_pyreader
+ if use_pyreader:
+ assert self.mode != 'infer', \
+                'pyreader is not recommended for infer mode, please set use_pyreader to False.'
+ py_reader = fluid.layers.py_reader(
+ capacity=100,
+ shapes=[[-1] + image_shape, [-1] + [1]],
+ dtypes=['float32', 'int64'],
+ name='train_py_reader'
+ if self.is_training else 'test_py_reader',
+ use_double_buffer=True)
+ image, label = fluid.layers.read_file(py_reader)
+ self.py_reader = py_reader
+ else:
+ image = fluid.layers.data(
+ name='image', shape=image_shape, dtype='float32')
+ if self.mode != 'infer':
+ label = fluid.layers.data(
+ name='label', shape=[1], dtype='int64')
+ else:
+ label = None
+ self.feature_input = [image]
+ self.label_input = label
+
+ def build_model(self):
+ videomodel = TSM_ResNet(
+ layers=self.num_layers,
+ seg_num=self.seg_num,
+ is_training=self.is_training)
+ out = videomodel.net(input=self.feature_input[0],
+ class_dim=self.num_classes)
+ self.network_outputs = [out]
+
+ def optimizer(self):
+        assert self.mode == 'train', "optimizer can only be built in train mode"
+ total_videos = self.total_videos
+ step = int(total_videos / self.batch_size + 1)
+ bd = [e * step for e in self.decay_epochs]
+ base_lr = self.base_learning_rate
+ lr_decay = self.learning_rate_decay
+ lr = [base_lr, base_lr * lr_decay, base_lr * lr_decay * lr_decay]
+ l2_weight_decay = self.l2_weight_decay
+ momentum = self.momentum
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=fluid.layers.piecewise_decay(
+ boundaries=bd, values=lr),
+ momentum=momentum,
+ regularization=fluid.regularizer.L2Decay(l2_weight_decay))
+
+ return optimizer
+
+ def loss(self):
+        assert self.mode != 'infer', "loss cannot be calculated in infer mode"
+ cost = fluid.layers.cross_entropy(input=self.network_outputs[0], \
+ label=self.label_input, ignore_index=-1)
+ self.loss_ = fluid.layers.mean(x=cost)
+ return self.loss_
+
+ def outputs(self):
+ return self.network_outputs
+
+ def feeds(self):
+ return self.feature_input if self.mode == 'infer' else self.feature_input + [
+ self.label_input
+ ]
+
+ def pretrain_info(self):
+ return ('ResNet50_pretrained', 'https://paddlemodels.bj.bcebos.com/video_classification/ResNet50_pretrained.tar.gz')
+
+ def weights_info(self):
+ return ('tsm_kinetics',
+ 'https://paddlemodels.bj.bcebos.com/video_classification/tsm_kinetics.tar.gz')
+
+ def load_pretrain_params(self, exe, pretrain, prog, place):
+ def is_parameter(var):
+ return isinstance(var, fluid.framework.Parameter) and (not ("fc_0" in var.name))
+
+ logger.info("Load pretrain weights from {}, exclude fc layer.".format(pretrain))
+ vars = filter(is_parameter, prog.list_vars())
+ fluid.io.load_vars(exe, pretrain, vars=vars, main_program=prog)
+
diff --git a/PaddleCV/video/models/tsm/tsm_res_model.py b/PaddleCV/video/models/tsm/tsm_res_model.py
new file mode 100644
index 0000000000000000000000000000000000000000..f40a0771da0fc2c112994cdb116fb5d298d0467f
--- /dev/null
+++ b/PaddleCV/video/models/tsm/tsm_res_model.py
@@ -0,0 +1,154 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import time
+import sys
+import paddle.fluid as fluid
+import math
+
+
+class TSM_ResNet():
+ def __init__(self, layers=50, seg_num=8, is_training=False):
+ self.layers = layers
+ self.seg_num = seg_num
+ self.is_training = is_training
+
+ def shift_module(self, input):
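+        # shift 1/8 of the channels one step forward and 1/8 one step backward
+        # along the temporal (seg_num) dimension; vacated slots are zero-padded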
+ output = fluid.layers.temporal_shift(input, self.seg_num, 1.0 / 8)
+ return output
+
+ def conv_bn_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None,
+ name=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ act=None,
+ param_attr=fluid.param_attr.ParamAttr(name=name+"_weights"),
+ bias_attr=False)
+ if name == "conv1":
+ bn_name = "bn_" + name
+ else:
+ bn_name = "bn" + name[3:]
+
+ return fluid.layers.batch_norm(input=conv, act=act,
+ is_test=(not self.is_training),
+ param_attr=fluid.param_attr.ParamAttr(name=bn_name+"_scale"),
+ bias_attr=fluid.param_attr.ParamAttr(bn_name+'_offset'),
+ moving_mean_name=bn_name+"_mean",
+ moving_variance_name=bn_name+'_variance')
+
+ def shortcut(self, input, ch_out, stride, name):
+ ch_in = input.shape[1]
+ if ch_in != ch_out or stride != 1:
+ return self.conv_bn_layer(input, ch_out, 1, stride, name=name)
+ else:
+ return input
+
+ def bottleneck_block(self, input, num_filters, stride, name):
+ shifted = self.shift_module(input)
+
+ conv0 = self.conv_bn_layer(
+ input=shifted, num_filters=num_filters, filter_size=1, act='relu',
+ name=name+"_branch2a")
+ conv1 = self.conv_bn_layer(
+ input=conv0,
+ num_filters=num_filters,
+ filter_size=3,
+ stride=stride,
+ act='relu', name=name+"_branch2b")
+ conv2 = self.conv_bn_layer(
+ input=conv1, num_filters=num_filters * 4, filter_size=1, act=None, name=name+"_branch2c")
+
+ short = self.shortcut(input, num_filters * 4, stride, name=name+"_branch1")
+
+ return fluid.layers.elementwise_add(x=short, y=conv2, act='relu')
+
+ def net(self, input, class_dim=101):
+ layers = self.layers
+ seg_num = self.seg_num
+ supported_layers = [50, 101, 152]
+ if layers not in supported_layers:
+            raise ValueError("supported layers are {}, but the input layer "
+                             "is {}".format(supported_layers, layers))
+
+ # reshape input
+ channels = input.shape[2]
+ short_size = input.shape[3]
+ input = fluid.layers.reshape(
+ x=input, shape=[-1, channels, short_size, short_size])
+
+ if layers == 50:
+ depth = [3, 4, 6, 3]
+ elif layers == 101:
+ depth = [3, 4, 23, 3]
+ elif layers == 152:
+ depth = [3, 8, 36, 3]
+ num_filters = [64, 128, 256, 512]
+
+ conv = self.conv_bn_layer(
+ input=input, num_filters=64, filter_size=7, stride=2, act='relu', name='conv1')
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+
+ for block in range(len(depth)):
+ for i in range(depth[block]):
+ if layers in [101, 152] and block == 2:
+ if i == 0:
+ conv_name = "res" + str(block+2) + "a"
+ else:
+ conv_name = "res" + str(block+2) + "b" + str(i)
+ else:
+ conv_name = "res" + str(block+2) + chr(97+i)
+
+ conv = self.bottleneck_block(
+ input=conv,
+ num_filters=num_filters[block],
+ stride=2 if i == 0 and block != 0 else 1,
+ name=conv_name)
+
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=7, pool_type='avg', global_pooling=True)
+
+ dropout = fluid.layers.dropout(x=pool, dropout_prob=0.5, is_test=(not self.is_training))
+
+ feature = fluid.layers.reshape(
+ x=dropout, shape=[-1, seg_num, pool.shape[1]])
+ out = fluid.layers.reduce_mean(feature, dim=1)
+
+ stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
+ out = fluid.layers.fc(input=out,
+ size=class_dim,
+ act='softmax',
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv,
+ stdv)),
+ bias_attr=fluid.param_attr.ParamAttr(learning_rate=2.0,
+ regularizer=fluid.regularizer.L2Decay(0.)))
+ return out
diff --git a/PaddleCV/video/models/tsn/README.md b/PaddleCV/video/models/tsn/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..800fdf95fd9d16282420ace1dc71085ff83e6091
--- /dev/null
+++ b/PaddleCV/video/models/tsn/README.md
@@ -0,0 +1,85 @@
+# TSN Video Classification Model
+
+---
+## Contents
+
+- [Model introduction](#model-introduction)
+- [Data preparation](#data-preparation)
+- [Model training](#model-training)
+- [Model evaluation](#model-evaluation)
+- [Model inference](#model-inference)
+- [Reference papers](#reference-papers)
+
+
+## Model introduction
+
+Temporal Segment Network (TSN) is a classic 2D-CNN-based solution for video classification. It targets long-range action recognition in videos by sparsely sampling frames instead of dense sampling, capturing global video information while removing redundancy and reducing computation. The per-frame features are finally fused by averaging to obtain the video-level feature used for classification. The model implemented here is the single-stream RGB TSN with a ResNet-50 backbone.
+
+For details, please refer to the ECCV 2016 paper [Temporal Segment Networks: Towards Good Practices for Deep Action Recognition](https://arxiv.org/abs/1608.00859).
+
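+A minimal numpy sketch (hypothetical shapes) of the sparse-sampling-then-average consensus described above:
+
+    import numpy as np
+
+    seg_num, feat_dim = 3, 2048  # hypothetical sizes
+    # features of seg_num sparsely sampled frames from one video, each
+    # produced by the shared 2D ResNet-50 backbone
+    seg_feats = np.random.rand(seg_num, feat_dim).astype('float32')
+    video_feat = seg_feats.mean(axis=0)  # average fusion over segments
+    print(video_feat.shape)              # (2048,) -> fed to the classifier
+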
+## Data preparation
+
+TSN is trained on the Kinetics-400 action recognition dataset released by DeepMind. For data download and preparation, please refer to the [data instructions](../../dataset/README.md).
+
+## Model training
+
+After the data is prepared, training can be started in either of the following two ways:
+
+ python train.py --model_name=TSN
+ --config=./configs/tsn.txt
+ --save_dir=checkpoints
+ --log_interval=10
+ --valid_interval=1
+
+ bash scripts/train/train_tsn.sh
+
+- You can download the released [model](https://paddlemodels.bj.bcebos.com/video_classification/tsn_kinetics.tar.gz) and point `--resume` at the weight path for finetuning and further development.
+
+**Data reader:** the model reads `mp4` data from the Kinetics-400 dataset, samples `seg_num` segments per video and 1 frame per segment, applies random augmentation to each frame, and rescales it to `target_size`.
+
+**Training strategy:**
+
+* Momentum optimizer with momentum=0.9
+* Weight decay coefficient of 1e-4
+* The learning rate is decayed by a factor of 0.1 at 1/3 and 2/3 of the total number of training epochs
+
+## Model evaluation
+
+The model can be evaluated in either of the following two ways:
+
+ python test.py --model_name=TSN
+ --config=configs/tsn.txt
+ --log_interval=1
+ --weights=$PATH_TO_WEIGHTS
+
+ bash scripts/test/test_tsn.sh
+
+- When evaluating with `scripts/test/test_tsn.sh`, modify the `--weights` argument in the script to specify the weights to be evaluated.
+
+- If `--weights` is not specified, the script downloads the released [model](https://paddlemodels.bj.bcebos.com/video_classification/tsn_kinetics.tar.gz) for evaluation.
+
+With the following parameters, the evaluation accuracy on the Kinetics-400 validation set is:
+
+| seg\_num | target\_size | Top-1 |
+| :------: | :----------: | :----: |
+| 3 | 224 | 0.66 |
+| 7 | 224 | 0.67 |
+
+## Model inference
+
+Model inference can be run with the following command:
+
+ python infer.py --model_name=TSN
+ --config=configs/tsn.txt
+ --log_interval=1
+ --weights=$PATH_TO_WEIGHTS
+ --filelist=$FILELIST
+
+- Inference results are stored in `TSN_infer_result` in `pickle` format.
+
+- If `--weights` is not specified, the script downloads the released [model](https://paddlemodels.bj.bcebos.com/video_classification/tsn_kinetics.tar.gz) for inference.
+
+## Reference papers
+
+- [Temporal Segment Networks: Towards Good Practices for Deep Action Recognition](https://arxiv.org/abs/1608.00859), Limin Wang, Yuanjun Xiong, Zhe Wang, Yu Qiao, Dahua Lin, Xiaoou Tang, Luc Van Gool
+
diff --git a/PaddleCV/video/models/tsn/__init__.py b/PaddleCV/video/models/tsn/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..bd57d2687bc948e63dd88306e9d435bbbb5a7978
--- /dev/null
+++ b/PaddleCV/video/models/tsn/__init__.py
@@ -0,0 +1 @@
+from .tsn import *
diff --git a/PaddleCV/video/models/tsn/tsn.py b/PaddleCV/video/models/tsn/tsn.py
new file mode 100644
index 0000000000000000000000000000000000000000..5c18d0d7d34e53e102a72e13b2d1061d3b51b6c9
--- /dev/null
+++ b/PaddleCV/video/models/tsn/tsn.py
@@ -0,0 +1,147 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle.fluid as fluid
+from paddle.fluid import ParamAttr
+
+from ..model import ModelBase
+from .tsn_res_model import TSN_ResNet
+
+import logging
+logger = logging.getLogger(__name__)
+
+__all__ = ["TSN"]
+
+
+class TSN(ModelBase):
+ def __init__(self, name, cfg, mode='train'):
+ super(TSN, self).__init__(name, cfg, mode=mode)
+ self.get_config()
+
+ def get_config(self):
+ self.num_classes = self.get_config_from_sec('model', 'num_classes')
+ self.seg_num = self.get_config_from_sec('model', 'seg_num')
+ self.seglen = self.get_config_from_sec('model', 'seglen')
+ self.image_mean = self.get_config_from_sec('model', 'image_mean')
+ self.image_std = self.get_config_from_sec('model', 'image_std')
+ self.num_layers = self.get_config_from_sec('model', 'num_layers')
+
+ self.num_epochs = self.get_config_from_sec('train', 'epoch')
+ self.total_videos = self.get_config_from_sec('train', 'total_videos')
+ self.base_learning_rate = self.get_config_from_sec('train',
+ 'learning_rate')
+ self.learning_rate_decay = self.get_config_from_sec(
+ 'train', 'learning_rate_decay')
+ self.l2_weight_decay = self.get_config_from_sec('train',
+ 'l2_weight_decay')
+ self.momentum = self.get_config_from_sec('train', 'momentum')
+
+ self.seg_num = self.get_config_from_sec(self.mode, 'seg_num', self.seg_num)
+ self.target_size = self.get_config_from_sec(self.mode, 'target_size')
+ self.batch_size = self.get_config_from_sec(self.mode, 'batch_size')
+
+ def build_input(self, use_pyreader=True):
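+        # Input layout is [seg_num, 3 * seglen, target_size, target_size];
+        # the network later folds seg_num into the batch dimension.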
+ image_shape = [3, self.target_size, self.target_size]
+ image_shape[0] = image_shape[0] * self.seglen
+ image_shape = [self.seg_num] + image_shape
+ self.use_pyreader = use_pyreader
+ if use_pyreader:
+            assert self.mode != 'infer', \
+                'pyreader is not recommended in infer mode, please set use_pyreader to False.'
+ py_reader = fluid.layers.py_reader(
+ capacity=100,
+ shapes=[[-1] + image_shape, [-1] + [1]],
+ dtypes=['float32', 'int64'],
+ name='train_py_reader'
+ if self.is_training else 'test_py_reader',
+ use_double_buffer=True)
+ image, label = fluid.layers.read_file(py_reader)
+ self.py_reader = py_reader
+ else:
+ image = fluid.layers.data(
+ name='image', shape=image_shape, dtype='float32')
+ if self.mode != 'infer':
+ label = fluid.layers.data(
+ name='label', shape=[1], dtype='int64')
+ else:
+ label = None
+ self.feature_input = [image]
+ self.label_input = label
+
+ def create_model_args(self):
+ cfg = {}
+ cfg['layers'] = self.num_layers
+ cfg['class_dim'] = self.num_classes
+ cfg['seg_num'] = self.seg_num
+ return cfg
+
+ def build_model(self):
+ cfg = self.create_model_args()
+ videomodel = TSN_ResNet(
+ layers=cfg['layers'],
+ seg_num=cfg['seg_num'],
+ is_training=(self.mode == 'train'))
+ out = videomodel.net(input=self.feature_input[0],
+ class_dim=cfg['class_dim'])
+ self.network_outputs = [out]
+
+ def optimizer(self):
+        assert self.mode == 'train', "optimizer can only be obtained in train mode"
+        epoch_points = [self.num_epochs // 3, self.num_epochs * 2 // 3]
+ total_videos = self.total_videos
+ step = int(total_videos / self.batch_size + 1)
+ bd = [e * step for e in epoch_points]
+ base_lr = self.base_learning_rate
+ lr_decay = self.learning_rate_decay
+ lr = [base_lr, base_lr * lr_decay, base_lr * lr_decay * lr_decay]
+ l2_weight_decay = self.l2_weight_decay
+ momentum = self.momentum
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=fluid.layers.piecewise_decay(
+ boundaries=bd, values=lr),
+ momentum=momentum,
+ regularization=fluid.regularizer.L2Decay(l2_weight_decay))
+
+ return optimizer
+
+ def loss(self):
+        assert self.mode != 'infer', "loss cannot be calculated in infer mode"
+ cost = fluid.layers.cross_entropy(input=self.network_outputs[0], \
+ label=self.label_input, ignore_index=-1)
+ self.loss_ = fluid.layers.mean(x=cost)
+ return self.loss_
+
+ def outputs(self):
+ return self.network_outputs
+
+ def feeds(self):
+ return self.feature_input if self.mode == 'infer' else self.feature_input + [
+ self.label_input
+ ]
+
+ def pretrain_info(self):
+ return ('ResNet50_pretrained', 'https://paddlemodels.bj.bcebos.com/video_classification/ResNet50_pretrained.tar.gz')
+
+ def weights_info(self):
+ return ('tsn_kinetics',
+ 'https://paddlemodels.bj.bcebos.com/video_classification/tsn_kinetics.tar.gz')
+
+ def load_pretrain_params(self, exe, pretrain, prog, place):
+ def is_parameter(var):
+            return isinstance(var, fluid.framework.Parameter) and "fc_0" not in var.name
+
+ logger.info("Load pretrain weights from {}, exclude fc layer.".format(pretrain))
+        vars = list(filter(is_parameter, prog.list_vars()))
+ fluid.io.load_vars(exe, pretrain, vars=vars, main_program=prog)
+
diff --git a/PaddleCV/video/models/tsn/tsn_res_model.py b/PaddleCV/video/models/tsn/tsn_res_model.py
new file mode 100644
index 0000000000000000000000000000000000000000..09dc54893f3305a0a1a94fe6e73aff32680915d9
--- /dev/null
+++ b/PaddleCV/video/models/tsn/tsn_res_model.py
@@ -0,0 +1,158 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import time
+import sys
+import paddle.fluid as fluid
+import math
+
+
+class TSN_ResNet():
+ def __init__(self, layers=50, seg_num=7, is_training=True):
+ self.layers = layers
+ self.seg_num = seg_num
+ self.is_training = is_training
+
+ def conv_bn_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None,
+ name=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ act=None,
+ param_attr=fluid.param_attr.ParamAttr(name=name + "_weights"),
+ bias_attr=False)
+ if name == "conv1":
+ bn_name = "bn_" + name
+ else:
+ bn_name = "bn" + name[3:]
+
+ return fluid.layers.batch_norm(
+ input=conv,
+ act=act,
+ is_test=(not self.is_training),
+ param_attr=fluid.param_attr.ParamAttr(name=bn_name + "_scale"),
+ bias_attr=fluid.param_attr.ParamAttr(bn_name + '_offset'),
+ moving_mean_name=bn_name + "_mean",
+ moving_variance_name=bn_name + '_variance')
+
+ def shortcut(self, input, ch_out, stride, name):
+ ch_in = input.shape[1]
+ if ch_in != ch_out or stride != 1:
+ return self.conv_bn_layer(input, ch_out, 1, stride, name=name)
+ else:
+ return input
+
+ def bottleneck_block(self, input, num_filters, stride, name):
+ conv0 = self.conv_bn_layer(
+ input=input,
+ num_filters=num_filters,
+ filter_size=1,
+ act='relu',
+ name=name + "_branch2a")
+ conv1 = self.conv_bn_layer(
+ input=conv0,
+ num_filters=num_filters,
+ filter_size=3,
+ stride=stride,
+ act='relu',
+ name=name + "_branch2b")
+ conv2 = self.conv_bn_layer(
+ input=conv1,
+ num_filters=num_filters * 4,
+ filter_size=1,
+ act=None,
+ name=name + "_branch2c")
+
+ short = self.shortcut(
+ input, num_filters * 4, stride, name=name + "_branch1")
+
+ return fluid.layers.elementwise_add(x=short, y=conv2, act='relu')
+
+ def net(self, input, class_dim=101):
+ layers = self.layers
+ seg_num = self.seg_num
+ supported_layers = [50, 101, 152]
+        assert layers in supported_layers, \
+            "supported layers are {} but given layers is {}".format(supported_layers, layers)
+
+ # reshape input
+ channels = input.shape[2]
+ short_size = input.shape[3]
+ input = fluid.layers.reshape(
+ x=input, shape=[-1, channels, short_size, short_size])
+
+ if layers == 50:
+ depth = [3, 4, 6, 3]
+ elif layers == 101:
+ depth = [3, 4, 23, 3]
+ elif layers == 152:
+ depth = [3, 8, 36, 3]
+ num_filters = [64, 128, 256, 512]
+
+ conv = self.conv_bn_layer(
+ input=input,
+ num_filters=64,
+ filter_size=7,
+ stride=2,
+ act='relu',
+ name='conv1')
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+
+ for block in range(len(depth)):
+ for i in range(depth[block]):
+ if layers in [101, 152] and block == 2:
+ if i == 0:
+ conv_name = "res" + str(block + 2) + "a"
+ else:
+ conv_name = "res" + str(block + 2) + "b" + str(i)
+ else:
+ conv_name = "res" + str(block + 2) + chr(97 + i)
+
+ conv = self.bottleneck_block(
+ input=conv,
+ num_filters=num_filters[block],
+ stride=2 if i == 0 and block != 0 else 1,
+ name=conv_name)
+
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=7, pool_type='avg', global_pooling=True)
+
+ feature = fluid.layers.reshape(
+ x=pool, shape=[-1, seg_num, pool.shape[1]])
+ out = fluid.layers.reduce_mean(feature, dim=1)
+
+ stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
+ out = fluid.layers.fc(input=out,
+ size=class_dim,
+ act='softmax',
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv,
+ stdv)))
+ return out
diff --git a/PaddleCV/video/models/utils.py b/PaddleCV/video/models/utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..b02abfdf134c869fe4805f4a746d7357efd0b7b1
--- /dev/null
+++ b/PaddleCV/video/models/utils.py
@@ -0,0 +1,47 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import wget
+import tarfile
+
+__all__ = ['decompress', 'download', 'AttrDict']
+
+
+def decompress(path):
+ t = tarfile.open(path)
+ t.extractall(path='/'.join(path.split('/')[:-1]))
+ t.close()
+ os.remove(path)
+
+
+def download(url, path):
+ weight_dir = '/'.join(path.split('/')[:-1])
+ if not os.path.exists(weight_dir):
+ os.makedirs(weight_dir)
+
+ path = path + ".tar.gz"
+ wget.download(url, path)
+ decompress(path)
+
+
+class AttrDict(dict):
+ def __getattr__(self, key):
+ return self[key]
+
+ def __setattr__(self, key, value):
+ if key in self.__dict__:
+ self.__dict__[key] = value
+ else:
+ self[key] = value
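+
+
+# Illustrative usage: config entries can be read either as keys or attributes,
+# e.g. cfg = AttrDict(batch_size=128); cfg.batch_size == cfg['batch_size']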
diff --git a/PaddleCV/video/scripts/infer/infer_attention_cluster.sh b/PaddleCV/video/scripts/infer/infer_attention_cluster.sh
new file mode 100644
index 0000000000000000000000000000000000000000..6a122dc2cecd6382d97cb7b8b97439588bbbaf85
--- /dev/null
+++ b/PaddleCV/video/scripts/infer/infer_attention_cluster.sh
@@ -0,0 +1,4 @@
+python infer.py --model_name="AttentionCluster" --config=./configs/attention_cluster.txt \
+ --filelist=./dataset/youtube8m/infer.list \
+ --weights=./checkpoints/AttentionCluster_epoch0 \
+ --save_dir="./save"
diff --git a/PaddleCV/video/scripts/infer/infer_attention_lstm.sh b/PaddleCV/video/scripts/infer/infer_attention_lstm.sh
new file mode 100644
index 0000000000000000000000000000000000000000..659ca891896a21be5561d38dbef2de94a4a68b0b
--- /dev/null
+++ b/PaddleCV/video/scripts/infer/infer_attention_lstm.sh
@@ -0,0 +1,4 @@
+python infer.py --model_name="AttentionLSTM" --config=./configs/attention_lstm.txt \
+ --filelist=./dataset/youtube8m/infer.list \
+ --weights=./checkpoints/AttentionLSTM_epoch0 \
+ --save_dir="./save"
diff --git a/PaddleCV/video/scripts/infer/infer_nextvlad.sh b/PaddleCV/video/scripts/infer/infer_nextvlad.sh
new file mode 100644
index 0000000000000000000000000000000000000000..db383b81a638fd388d77b872c30441f743142abe
--- /dev/null
+++ b/PaddleCV/video/scripts/infer/infer_nextvlad.sh
@@ -0,0 +1,3 @@
+python infer.py --model_name="NEXTVLAD" --config=./configs/nextvlad.txt --filelist=./dataset/youtube8m/infer.list \
+ --weights=./checkpoints/NEXTVLAD_epoch0 \
+ --save_dir="./save"
diff --git a/PaddleCV/video/scripts/infer/infer_nonlocal.sh b/PaddleCV/video/scripts/infer/infer_nonlocal.sh
new file mode 100644
index 0000000000000000000000000000000000000000..73ed47dec55a9330dc1b04486a3a8afd008c6c65
--- /dev/null
+++ b/PaddleCV/video/scripts/infer/infer_nonlocal.sh
@@ -0,0 +1,2 @@
+python infer.py --model_name="NONLOCAL" --config=./configs/nonlocal.txt --filelist=./dataset/nonlocal/inferlist.txt \
+ --log_interval=10 --weights=./checkpoints/NONLOCAL_epoch0 --save_dir=./save
diff --git a/PaddleCV/video/scripts/infer/infer_stnet.sh b/PaddleCV/video/scripts/infer/infer_stnet.sh
new file mode 100644
index 0000000000000000000000000000000000000000..fb598fde81f050ed61ba7dc931a4a2cf02e0f088
--- /dev/null
+++ b/PaddleCV/video/scripts/infer/infer_stnet.sh
@@ -0,0 +1,2 @@
+python infer.py --model_name="STNET" --config=./configs/stnet.txt --filelist=./dataset/kinetics/infer.list \
+ --log_interval=10 --weights=./checkpoints/STNET_epoch0 --save_dir=./save
diff --git a/PaddleCV/video/scripts/infer/infer_tsm.sh b/PaddleCV/video/scripts/infer/infer_tsm.sh
new file mode 100644
index 0000000000000000000000000000000000000000..cdcc30dfbdbddd090f0604b590be18eb81bb7509
--- /dev/null
+++ b/PaddleCV/video/scripts/infer/infer_tsm.sh
@@ -0,0 +1,2 @@
+python infer.py --model_name="TSM" --config=./configs/tsm.txt --filelist=./dataset/kinetics/infer.list \
+ --log_interval=10 --weights=./checkpoints/TSM_epoch0 --save_dir=./save
diff --git a/PaddleCV/video/scripts/infer/infer_tsn.sh b/PaddleCV/video/scripts/infer/infer_tsn.sh
new file mode 100644
index 0000000000000000000000000000000000000000..03dd1f0a1ed6e51b1872ea525ded34f82f481562
--- /dev/null
+++ b/PaddleCV/video/scripts/infer/infer_tsn.sh
@@ -0,0 +1,2 @@
+python infer.py --model_name="TSN" --config=./configs/tsn.txt --filelist=./dataset/kinetics/infer.list \
+ --log_interval=10 --weights=./checkpoints/TSN_epoch0 --save_dir=./save
diff --git a/PaddleCV/video/scripts/test/test_attention_cluster.sh b/PaddleCV/video/scripts/test/test_attention_cluster.sh
new file mode 100644
index 0000000000000000000000000000000000000000..1bdc5acfa7d7b44bf9b74c746e40c1557cfc8f93
--- /dev/null
+++ b/PaddleCV/video/scripts/test/test_attention_cluster.sh
@@ -0,0 +1,2 @@
+python test.py --model_name="AttentionCluster" --config=./configs/attention_cluster.txt \
+ --log_interval=5 --weights=./checkpoints/AttentionCluster_epoch0
diff --git a/PaddleCV/video/scripts/test/test_attention_lstm.sh b/PaddleCV/video/scripts/test/test_attention_lstm.sh
new file mode 100644
index 0000000000000000000000000000000000000000..27bff350a84683ea452ae0053105c89c55afeedd
--- /dev/null
+++ b/PaddleCV/video/scripts/test/test_attention_lstm.sh
@@ -0,0 +1,2 @@
+python test.py --model_name="AttentionLSTM" --config=./configs/attention_lstm.txt \
+ --log_interval=5 --weights=./checkpoints/AttentionLSTM_epoch0
diff --git a/PaddleCV/video/scripts/test/test_nextvlad.sh b/PaddleCV/video/scripts/test/test_nextvlad.sh
new file mode 100644
index 0000000000000000000000000000000000000000..4d390a0b2d1fc8eb32c16b862b34a47c7b05c2fd
--- /dev/null
+++ b/PaddleCV/video/scripts/test/test_nextvlad.sh
@@ -0,0 +1,2 @@
+python test.py --model_name="NEXTVLAD" --config=./configs/nextvlad.txt \
+ --log_interval=10 --weights=./checkpoints/NEXTVLAD_epoch0
diff --git a/PaddleCV/video/scripts/test/test_nonlocal.sh b/PaddleCV/video/scripts/test/test_nonlocal.sh
new file mode 100644
index 0000000000000000000000000000000000000000..7a42bb05fda5a1343c673e59cf8832d338b7d25e
--- /dev/null
+++ b/PaddleCV/video/scripts/test/test_nonlocal.sh
@@ -0,0 +1,2 @@
+python test.py --model_name="NONLOCAL" --config=./configs/nonlocal.txt \
+ --log_interval=1 --weights=./checkpoints/NONLOCAL_epoch0
diff --git a/PaddleCV/video/scripts/test/test_stnet.sh b/PaddleCV/video/scripts/test/test_stnet.sh
new file mode 100644
index 0000000000000000000000000000000000000000..0b471ed9fa4fdd7bf055638835cdb3307a1a260c
--- /dev/null
+++ b/PaddleCV/video/scripts/test/test_stnet.sh
@@ -0,0 +1,2 @@
+python test.py --model_name="STNET" --config=./configs/stnet.txt \
+ --log_interval=10 --weights=./checkpoints/STNET_epoch0
diff --git a/PaddleCV/video/scripts/test/test_tsm.sh b/PaddleCV/video/scripts/test/test_tsm.sh
new file mode 100644
index 0000000000000000000000000000000000000000..ffebc4774cf239ab0501f64aa895c3a7dea439f6
--- /dev/null
+++ b/PaddleCV/video/scripts/test/test_tsm.sh
@@ -0,0 +1,2 @@
+python test.py --model_name="TSM" --config=./configs/tsm.txt \
+ --log_interval=10 --weights=./checkpoints/TSM_epoch0
diff --git a/PaddleCV/video/scripts/test/test_tsn.sh b/PaddleCV/video/scripts/test/test_tsn.sh
new file mode 100644
index 0000000000000000000000000000000000000000..ffe0ff51470e9223b45f37242360dd461e28e678
--- /dev/null
+++ b/PaddleCV/video/scripts/test/test_tsn.sh
@@ -0,0 +1,2 @@
+python test.py --model_name="TSN" --config=./configs/tsn.txt \
+ --log_interval=10 --weights=./checkpoints/TSN_epoch0
diff --git a/PaddleCV/video/scripts/train/train_attention_cluster.sh b/PaddleCV/video/scripts/train/train_attention_cluster.sh
new file mode 100644
index 0000000000000000000000000000000000000000..68c1509a1bc5a301c242e08004b2ddcd9cd4415a
--- /dev/null
+++ b/PaddleCV/video/scripts/train/train_attention_cluster.sh
@@ -0,0 +1,3 @@
+export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7
+python train.py --model_name="AttentionCluster" --config=./configs/attention_cluster.txt --epoch_num=5 \
+ --valid_interval=1 --log_interval=10
diff --git a/PaddleCV/video/scripts/train/train_attention_lstm.sh b/PaddleCV/video/scripts/train/train_attention_lstm.sh
new file mode 100644
index 0000000000000000000000000000000000000000..3faa1dba8aa48bc53fa80cbb4cde65c2ec8fc9d7
--- /dev/null
+++ b/PaddleCV/video/scripts/train/train_attention_lstm.sh
@@ -0,0 +1,3 @@
+export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7
+python train.py --model_name="AttentionLSTM" --config=./configs/attention_lstm.txt --epoch_num=10 \
+ --valid_interval=1 --log_interval=10
diff --git a/PaddleCV/video/scripts/train/train_nextvlad.sh b/PaddleCV/video/scripts/train/train_nextvlad.sh
new file mode 100644
index 0000000000000000000000000000000000000000..dd8e5c5766ba546e081c8f12cb1ae0ea6795e6b7
--- /dev/null
+++ b/PaddleCV/video/scripts/train/train_nextvlad.sh
@@ -0,0 +1,3 @@
+export CUDA_VISIBLE_DEVICES=0,1,2,3
+python train.py --model_name="NEXTVLAD" --config=./configs/nextvlad.txt --epoch_num=6 \
+ --valid_interval=1 --log_interval=10
diff --git a/PaddleCV/video/scripts/train/train_nonlocal.sh b/PaddleCV/video/scripts/train/train_nonlocal.sh
new file mode 100644
index 0000000000000000000000000000000000000000..67b8efb199c3cd7af59d7fb54ac732080ee5f155
--- /dev/null
+++ b/PaddleCV/video/scripts/train/train_nonlocal.sh
@@ -0,0 +1,3 @@
+python train.py --model_name="NONLOCAL" --config=./configs/nonlocal.txt --epoch_num=120 \
+ --valid_interval=1 --log_interval=1 \
+ --pretrain=./pretrained/ResNet50_pretrained
diff --git a/PaddleCV/video/scripts/train/train_stnet.sh b/PaddleCV/video/scripts/train/train_stnet.sh
new file mode 100644
index 0000000000000000000000000000000000000000..8d021566ac3ee625c7fe7c7ee3eca9fd1205ea6f
--- /dev/null
+++ b/PaddleCV/video/scripts/train/train_stnet.sh
@@ -0,0 +1,3 @@
+export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7
+python train.py --model_name="STNET" --config=./configs/stnet.txt --epoch_num=60 \
+ --valid_interval=1 --log_interval=10
diff --git a/PaddleCV/video/scripts/train/train_tsm.sh b/PaddleCV/video/scripts/train/train_tsm.sh
new file mode 100644
index 0000000000000000000000000000000000000000..d09096637211f70889d00b5a68f1dcd5df010cc2
--- /dev/null
+++ b/PaddleCV/video/scripts/train/train_tsm.sh
@@ -0,0 +1,3 @@
+export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7
+python train.py --model_name="TSM" --config=./configs/tsm.txt --epoch_num=65 \
+ --valid_interval=1 --log_interval=10
diff --git a/PaddleCV/video/scripts/train/train_tsn.sh b/PaddleCV/video/scripts/train/train_tsn.sh
new file mode 100644
index 0000000000000000000000000000000000000000..80c0e4f890af77bcfac1cfd5c1f390068e090ffe
--- /dev/null
+++ b/PaddleCV/video/scripts/train/train_tsn.sh
@@ -0,0 +1,3 @@
+export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7
+python train.py --model_name="TSN" --config=./configs/tsn.txt --epoch_num=45 \
+ --valid_interval=1 --log_interval=10
diff --git a/PaddleCV/video/test.py b/PaddleCV/video/test.py
new file mode 100644
index 0000000000000000000000000000000000000000..2b314bfdf3cf9041494c53d2a0ed7f6d1b028060
--- /dev/null
+++ b/PaddleCV/video/test.py
@@ -0,0 +1,130 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import sys
+import time
+import logging
+import argparse
+import numpy as np
+import paddle.fluid as fluid
+
+from config import *
+import models
+from datareader import get_reader
+from metrics import get_metrics
+
+logging.root.handlers = []
+FORMAT = '[%(levelname)s: %(filename)s: %(lineno)4d]: %(message)s'
+logging.basicConfig(level=logging.INFO, format=FORMAT, stream=sys.stdout)
+logger = logging.getLogger(__name__)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser()
+ parser.add_argument(
+ '--model_name',
+ type=str,
+ default='AttentionCluster',
+        help='name of model to test.')
+ parser.add_argument(
+ '--config',
+ type=str,
+ default='configs/attention_cluster.txt',
+ help='path to config file of model')
+ parser.add_argument(
+ '--batch_size',
+ type=int,
+ default=None,
+ help='test batch size. None to use config file setting.')
+ parser.add_argument(
+ '--use_gpu', type=bool, default=True, help='default use gpu.')
+ parser.add_argument(
+ '--weights',
+ type=str,
+ default=None,
+ help='weight path, None to use weights from Paddle.')
+ parser.add_argument(
+ '--log_interval',
+ type=int,
+ default=1,
+ help='mini-batch interval to log.')
+ args = parser.parse_args()
+ return args
+
+
+def test(args):
+ # parse config
+ config = parse_config(args.config)
+ test_config = merge_configs(config, 'test', vars(args))
+ print_configs(test_config, "Test")
+
+ # build model
+ test_model = models.get_model(args.model_name, test_config, mode='test')
+ test_model.build_input(use_pyreader=False)
+ test_model.build_model()
+ test_feeds = test_model.feeds()
+ test_outputs = test_model.outputs()
+ test_loss = test_model.loss()
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+
+ if args.weights:
+ assert os.path.exists(
+ args.weights), "Given weight dir {} not exist.".format(args.weights)
+ weights = args.weights or test_model.get_weights()
+
+ test_model.load_test_weights(exe, weights,
+ fluid.default_main_program(), place)
+
+ # get reader and metrics
+ test_reader = get_reader(args.model_name.upper(), 'test', test_config)
+ test_metrics = get_metrics(args.model_name.upper(), 'test', test_config)
+
+ test_feeder = fluid.DataFeeder(place=place, feed_list=test_feeds)
+ if test_loss is None:
+ fetch_list = [x.name for x in test_outputs] + [test_feeds[-1].name]
+ else:
+ fetch_list = [test_loss.name] + [x.name for x in test_outputs
+ ] + [test_feeds[-1].name]
+
+ epoch_period = []
+ for test_iter, data in enumerate(test_reader()):
+ cur_time = time.time()
+ test_outs = exe.run(fetch_list=fetch_list, feed=test_feeder.feed(data))
+ period = time.time() - cur_time
+ epoch_period.append(period)
+ if test_loss is None:
+ loss = np.zeros(1, ).astype('float32')
+ pred = np.array(test_outs[0])
+ label = np.array(test_outs[-1])
+ else:
+ loss = np.array(test_outs[0])
+ pred = np.array(test_outs[1])
+ label = np.array(test_outs[-1])
+ test_metrics.accumulate(loss, pred, label)
+
+ # metric here
+ if args.log_interval > 0 and test_iter % args.log_interval == 0:
+ info_str = '[EVAL] Batch {}'.format(test_iter)
+ test_metrics.calculate_and_log_out(loss, pred, label, info_str)
+ test_metrics.finalize_and_log_out("[EVAL] eval finished. ")
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ logger.info(args)
+
+ test(args)
diff --git a/fluid/PaddleRec/multiview_simnet/__init__.py b/PaddleCV/video/tools/__init__.py
similarity index 100%
rename from fluid/PaddleRec/multiview_simnet/__init__.py
rename to PaddleCV/video/tools/__init__.py
diff --git a/PaddleCV/video/tools/train_utils.py b/PaddleCV/video/tools/train_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..8b7becb7c315d08ccfb29123e601637e0d4061a2
--- /dev/null
+++ b/PaddleCV/video/tools/train_utils.py
@@ -0,0 +1,146 @@
+import os
+import time
+import numpy as np
+import paddle
+import paddle.fluid as fluid
+import logging
+import shutil
+
+logger = logging.getLogger(__name__)
+
+
+def test_without_pyreader(test_exe,
+ test_reader,
+ test_feeder,
+ test_fetch_list,
+ test_metrics,
+ log_interval=0):
+ test_metrics.reset()
+ for test_iter, data in enumerate(test_reader()):
+ test_outs = test_exe.run(test_fetch_list, feed=test_feeder.feed(data))
+ loss = np.array(test_outs[0])
+ pred = np.array(test_outs[1])
+ label = np.array(test_outs[-1])
+ test_metrics.accumulate(loss, pred, label)
+ if log_interval > 0 and test_iter % log_interval == 0:
+ test_metrics.calculate_and_log_out(loss, pred, label, \
+ info = '[TEST] test_iter {} '.format(test_iter))
+ test_metrics.finalize_and_log_out("[TEST] Finish")
+
+
+def test_with_pyreader(test_exe,
+ test_pyreader,
+ test_fetch_list,
+ test_metrics,
+ log_interval=0):
+ if not test_pyreader:
+ logger.error("[TEST] get pyreader failed.")
+ test_pyreader.start()
+ test_metrics.reset()
+ test_iter = 0
+ try:
+ while True:
+ test_outs = test_exe.run(fetch_list=test_fetch_list)
+ loss = np.array(test_outs[0])
+ pred = np.array(test_outs[1])
+ label = np.array(test_outs[-1])
+ test_metrics.accumulate(loss, pred, label)
+ if log_interval > 0 and test_iter % log_interval == 0:
+ test_metrics.calculate_and_log_out(loss, pred, label, \
+ info = '[TEST] test_iter {} '.format(test_iter))
+ test_iter += 1
+ except fluid.core.EOFException:
+ test_metrics.finalize_and_log_out("[TEST] Finish")
+ finally:
+ test_pyreader.reset()
+
+
+def train_without_pyreader(exe, train_prog, train_exe, train_reader, train_feeder, \
+ train_fetch_list, train_metrics, epochs = 10, \
+ log_interval = 0, valid_interval = 0, save_dir = './', \
+ save_model_name = 'model', test_exe = None, test_reader = None, \
+ test_feeder = None, test_fetch_list = None, test_metrics = None):
+ for epoch in range(epochs):
+ lr = fluid.global_scope().find_var("learning_rate").get_tensor()
+ lr_count = fluid.global_scope().find_var(
+ "@LR_DECAY_COUNTER@").get_tensor()
+ logger.info("------- learning rate {}, learning rate counter {} -----"
+ .format(np.array(lr), np.array(lr_count)))
+ epoch_periods = []
+ for train_iter, data in enumerate(train_reader()):
+ cur_time = time.time()
+ train_outs = train_exe.run(train_fetch_list,
+ feed=train_feeder.feed(data))
+ period = time.time() - cur_time
+ epoch_periods.append(period)
+ loss = np.array(train_outs[0])
+ pred = np.array(train_outs[1])
+ label = np.array(train_outs[-1])
+ if log_interval > 0 and (train_iter % log_interval == 0):
+ # eval here
+ train_metrics.calculate_and_log_out(loss, pred, label, \
+ info = '[TRAIN] Epoch {}, iter {} '.format(epoch, train_iter))
+ logger.info('[TRAIN] Epoch {} training finished, average time: {}'.
+ format(epoch, np.mean(epoch_periods)))
+ save_model(exe, train_prog, save_dir, save_model_name,
+ "_epoch{}".format(epoch))
+ if test_exe and valid_interval > 0 and (epoch + 1
+ ) % valid_interval == 0:
+ test_without_pyreader(test_exe, test_reader, test_feeder,
+ test_fetch_list, test_metrics, log_interval)
+
+
+
+def train_with_pyreader(exe, train_prog, train_exe, train_pyreader, \
+ train_fetch_list, train_metrics, epochs = 10, \
+ log_interval = 0, valid_interval = 0, \
+ save_dir = './', save_model_name = 'model', \
+ test_exe = None, test_pyreader = None, \
+ test_fetch_list = None, test_metrics = None):
+ if not train_pyreader:
+ logger.error("[TRAIN] get pyreader failed.")
+ for epoch in range(epochs):
+ lr = fluid.global_scope().find_var("learning_rate").get_tensor()
+ lr_count = fluid.global_scope().find_var(
+ "@LR_DECAY_COUNTER@").get_tensor()
+ logger.info("------- learning rate {}, learning rate counter {} -----"
+ .format(np.array(lr), np.array(lr_count)))
+ train_pyreader.start()
+ train_metrics.reset()
+ try:
+ train_iter = 0
+ epoch_periods = []
+ while True:
+ cur_time = time.time()
+ train_outs = train_exe.run(fetch_list=train_fetch_list)
+ period = time.time() - cur_time
+ epoch_periods.append(period)
+ loss = np.array(train_outs[0])
+ pred = np.array(train_outs[1])
+ label = np.array(train_outs[-1])
+ if log_interval > 0 and (train_iter % log_interval == 0):
+ # eval here
+ train_metrics.calculate_and_log_out(loss, pred, label, \
+ info = '[TRAIN] Epoch {}, iter {} '.format(epoch, train_iter))
+ train_iter += 1
+ except fluid.core.EOFException:
+ # eval here
+ logger.info('[TRAIN] Epoch {} training finished, average time: {}'.
+ format(epoch, np.mean(epoch_periods)))
+ save_model(exe, train_prog, save_dir, save_model_name,
+ "_epoch{}".format(epoch))
+ if test_exe and valid_interval > 0 and (epoch + 1
+ ) % valid_interval == 0:
+ test_with_pyreader(test_exe, test_pyreader, test_fetch_list,
+ test_metrics, log_interval)
+        finally:
+            train_pyreader.reset()
+
+
+def save_model(exe, program, save_dir, model_name, postfix=None):
+ model_path = os.path.join(save_dir, model_name + postfix)
+ if os.path.isdir(model_path):
+ shutil.rmtree(model_path)
+ fluid.io.save_persistables(exe, model_path, main_program=program)
diff --git a/PaddleCV/video/train.py b/PaddleCV/video/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..115d586df6f657062ac6bd6b656b2b17b6badc60
--- /dev/null
+++ b/PaddleCV/video/train.py
@@ -0,0 +1,251 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import sys
+import time
+import argparse
+import logging
+import numpy as np
+import paddle.fluid as fluid
+
+from tools.train_utils import train_with_pyreader, train_without_pyreader
+import models
+from config import *
+from datareader import get_reader
+from metrics import get_metrics
+
+logging.root.handlers = []
+FORMAT = '[%(levelname)s: %(filename)s: %(lineno)4d]: %(message)s'
+logging.basicConfig(level=logging.INFO, format=FORMAT, stream=sys.stdout)
+logger = logging.getLogger(__name__)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("Paddle Video train script")
+ parser.add_argument(
+ '--model_name',
+ type=str,
+ default='AttentionCluster',
+ help='name of model to train.')
+ parser.add_argument(
+ '--config',
+ type=str,
+ default='configs/attention_cluster.txt',
+ help='path to config file of model')
+ parser.add_argument(
+ '--batch_size',
+ type=int,
+ default=None,
+ help='training batch size. None to use config file setting.')
+ parser.add_argument(
+ '--learning_rate',
+ type=float,
+ default=None,
+        help='learning rate used for training. None to use config file setting.')
+ parser.add_argument(
+ '--pretrain',
+ type=str,
+ default=None,
+ help='path to pretrain weights. None to use default weights path in ~/.paddle/weights.'
+ )
+ parser.add_argument(
+ '--resume',
+ type=str,
+ default=None,
+ help='path to resume training based on previous checkpoints. '
+ 'None for not resuming any checkpoints.')
+ parser.add_argument(
+ '--use_gpu', type=bool, default=True, help='default use gpu.')
+ parser.add_argument(
+ '--no_use_pyreader',
+ action='store_true',
+ default=False,
+        help='do not use pyreader for data feeding')
+ parser.add_argument(
+ '--no_memory_optimize',
+ action='store_true',
+ default=False,
+        help='do not use memory optimization during training')
+ parser.add_argument(
+ '--epoch_num',
+ type=int,
+ default=0,
+ help='epoch number, 0 for read from config file')
+ parser.add_argument(
+ '--valid_interval',
+ type=int,
+ default=1,
+ help='validation epoch interval, 0 for no validation.')
+ parser.add_argument(
+ '--save_dir',
+ type=str,
+ default='checkpoints',
+        help='directory name to save training snapshots')
+ parser.add_argument(
+ '--log_interval',
+ type=int,
+ default=10,
+ help='mini-batch interval to log.')
+ args = parser.parse_args()
+ return args
+
+
+def train(args):
+ # parse config
+ config = parse_config(args.config)
+ train_config = merge_configs(config, 'train', vars(args))
+ valid_config = merge_configs(config, 'valid', vars(args))
+ print_configs(train_config, 'Train')
+ train_model = models.get_model(args.model_name, train_config, mode='train')
+ valid_model = models.get_model(args.model_name, valid_config, mode='valid')
+
+ # build model
+ startup = fluid.Program()
+ train_prog = fluid.Program()
+ with fluid.program_guard(train_prog, startup):
+ with fluid.unique_name.guard():
+ train_model.build_input(not args.no_use_pyreader)
+ train_model.build_model()
+            # the input has the form [data1, data2, ..., label],
+            # so train_feeds[-1] is the label
+ train_feeds = train_model.feeds()
+ train_feeds[-1].persistable = True
+            # the output of a classification model has the form [pred]
+ train_outputs = train_model.outputs()
+ for output in train_outputs:
+ output.persistable = True
+ train_loss = train_model.loss()
+ train_loss.persistable = True
+            # outputs, loss and label will be fetched, so mark them persistable
+ optimizer = train_model.optimizer()
+ optimizer.minimize(train_loss)
+ train_pyreader = train_model.pyreader()
+
+ if not args.no_memory_optimize:
+ fluid.memory_optimize(train_prog)
+
+ valid_prog = fluid.Program()
+ with fluid.program_guard(valid_prog, startup):
+ with fluid.unique_name.guard():
+ valid_model.build_input(not args.no_use_pyreader)
+ valid_model.build_model()
+ valid_feeds = valid_model.feeds()
+ valid_outputs = valid_model.outputs()
+ valid_loss = valid_model.loss()
+ valid_pyreader = valid_model.pyreader()
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(startup)
+
+ if args.resume:
+ # if resume weights is given, load resume weights directly
+ assert os.path.exists(args.resume), \
+ "Given resume weight dir {} not exist.".format(args.resume)
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(args.resume, var.name))
+
+ fluid.io.load_vars(
+ exe, args.resume, predicate=if_exist, main_program=train_prog)
+ else:
+ # if not in resume mode, load pretrain weights
+ if args.pretrain:
+ assert os.path.exists(args.pretrain), \
+ "Given pretrain weight dir {} not exist.".format(args.pretrain)
+ pretrain = args.pretrain or train_model.get_pretrain_weights()
+ if pretrain:
+ train_model.load_pretrain_params(exe, pretrain, train_prog, place)
+
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=args.use_gpu,
+ loss_name=train_loss.name,
+ main_program=train_prog)
+ valid_exe = fluid.ParallelExecutor(
+ use_cuda=args.use_gpu,
+ share_vars_from=train_exe,
+ main_program=valid_prog)
+
+ # get reader
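+    # the configured batch_size is the total across devices; when feeding with
+    # pyreader on GPU each device consumes its own mini-batch, so divide by the
+    # configured GPU count (TRAIN.num_gpus)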
+ bs_denominator = 1
+ if (not args.no_use_pyreader) and args.use_gpu:
+ bs_denominator = train_config.TRAIN.num_gpus
+ train_config.TRAIN.batch_size = int(train_config.TRAIN.batch_size /
+ bs_denominator)
+ valid_config.VALID.batch_size = int(valid_config.VALID.batch_size /
+ bs_denominator)
+ train_reader = get_reader(args.model_name.upper(), 'train', train_config)
+ valid_reader = get_reader(args.model_name.upper(), 'valid', valid_config)
+
+ # get metrics
+ train_metrics = get_metrics(args.model_name.upper(), 'train', train_config)
+ valid_metrics = get_metrics(args.model_name.upper(), 'valid', valid_config)
+
+ train_fetch_list = [train_loss.name] + [x.name for x in train_outputs
+ ] + [train_feeds[-1].name]
+ valid_fetch_list = [valid_loss.name] + [x.name for x in valid_outputs
+ ] + [valid_feeds[-1].name]
+
+ epochs = args.epoch_num or train_model.epoch_num()
+
+ if args.no_use_pyreader:
+ train_feeder = fluid.DataFeeder(place=place, feed_list=train_feeds)
+ valid_feeder = fluid.DataFeeder(place=place, feed_list=valid_feeds)
+ train_without_pyreader(
+ exe,
+ train_prog,
+ train_exe,
+ train_reader,
+ train_feeder,
+ train_fetch_list,
+ train_metrics,
+ epochs=epochs,
+ log_interval=args.log_interval,
+ valid_interval=args.valid_interval,
+ save_dir=args.save_dir,
+ save_model_name=args.model_name,
+ test_exe=valid_exe,
+ test_reader=valid_reader,
+ test_feeder=valid_feeder,
+ test_fetch_list=valid_fetch_list,
+ test_metrics=valid_metrics)
+ else:
+ train_pyreader.decorate_paddle_reader(train_reader)
+ valid_pyreader.decorate_paddle_reader(valid_reader)
+ train_with_pyreader(
+ exe,
+ train_prog,
+ train_exe,
+ train_pyreader,
+ train_fetch_list,
+ train_metrics,
+ epochs=epochs,
+ log_interval=args.log_interval,
+ valid_interval=args.valid_interval,
+ save_dir=args.save_dir,
+ save_model_name=args.model_name,
+ test_exe=valid_exe,
+ test_pyreader=valid_pyreader,
+ test_fetch_list=valid_fetch_list,
+ test_metrics=valid_metrics)
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ logger.info(args)
+
+ if not os.path.exists(args.save_dir):
+ os.makedirs(args.save_dir)
+
+ train(args)
diff --git a/PaddleCV/video/utils.py b/PaddleCV/video/utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..11681ca9e3faa6e0fdf8cd946a3a48d5e2d76fdc
--- /dev/null
+++ b/PaddleCV/video/utils.py
@@ -0,0 +1,26 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+__all__ = ['AttrDict']
+
+
+class AttrDict(dict):
+ def __getattr__(self, key):
+ return self[key]
+
+ def __setattr__(self, key, value):
+ if key in self.__dict__:
+ self.__dict__[key] = value
+ else:
+ self[key] = value
diff --git a/PaddleCV/video_classification/README.md b/PaddleCV/video_classification/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..822c3ccf64cb1c5567e574425229974524a34471
--- /dev/null
+++ b/PaddleCV/video_classification/README.md
@@ -0,0 +1,140 @@
+# Video Classification Based on Temporal Segment Network
+
+Video classification has drawn significant attention in the past few years. This page introduces how to perform video classification with PaddlePaddle Fluid on the public UCF-101 dataset, based on the state-of-the-art Temporal Segment Network (TSN) method.
+
+______________________________________________________________________________
+
+## Table of Contents
+- [Installation](#installation)
+- [Data preparation](#data-preparation)
+- [Training](#training)
+- [Evaluation](#evaluation)
+- [Inference](#inference)
+- [Performance](#performance)
+
+### Installation
+Running the sample code in this directory requires PaddlePaddle Fluid v0.13.0 or later. If the PaddlePaddle on your device is lower than this version, please follow the instructions in the installation document and make an update.
+
+### Data preparation
+
+#### download UCF-101 dataset
+Users can download the UCF-101 dataset with the provided script data/download.sh.
+
+#### decode videos into frames
+To avoid decoding videos during network training, we decode them into frames offline and save the frames in pickle format, which is easily readable in Python.
+
+Users can refer to the script data/video_decode.py for video decoding.
+
+#### split data into train and test
+We follow split 1 of the UCF-101 dataset. After splitting, users get 9537 videos for training and 3783 videos for validation. The reference script is data/split_data.py.
+
+#### save pickle for training
+As stated above, we save all data in pickle format for training. All information of each video is saved into one pickle file, including the video id, frame binaries, and label. Please refer to the script data/generate_train_data.py.
+After this operation, one gets two directories containing training and testing data in pickle format, plus two files, train.list and test.list, with the fields of each line separated by spaces.
+
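+A minimal sketch of writing one such pickle record (the field layout here is an assumption based on the description above; data/generate_train_data.py is the authoritative reference):
+```
+import glob
+import pickle
+
+video_id = 'v_ApplyEyeMakeup_g01_c01'  # hypothetical video id
+frames = [open(p, 'rb').read()         # raw frame bytes, one entry per frame
+          for p in sorted(glob.glob('frames/%s/*.jpg' % video_id))]
+label = 0
+with open('%s.pkl' % video_id, 'wb') as f:
+    pickle.dump((video_id, frames, label), f, protocol=2)
+```
+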
+### Training
+After data preparation, users can start the PaddlePaddle Fluid training by:
+```
+python train.py \
+ --batch_size=128 \
+ --total_videos=9537 \
+ --class_dim=101 \
+ --num_epochs=60 \
+ --image_shape=3,224,224 \
+ --model_save_dir=output/ \
+ --with_mem_opt=True \
+ --lr_init=0.01 \
+ --num_layers=50 \
+ --seg_num=7 \
+ --pretrained_model={path_to_pretrained_model}
+```
+
+parameter introduction:
+
+* batch_size: the size of each mini-batch.
+* total_videos: the total number of videos in the training set.
+* class_dim: the number of classes of the classification task.
+* num_epochs: the number of epochs.
+* image_shape: the input size of the network.
+* model_save_dir: the directory to save the trained model.
+* with_mem_opt: whether to use memory optimization.
+* lr_init: the initial learning rate.
+* num_layers: the number of layers of the ResNet backbone.
+* seg_num: the number of segments in TSN.
+* pretrained_model: the path of the model used for pretraining.
+
+
+data reader introduction:
+The data reader is defined in reader.py. Note that all frames of one video are grouped together into a single sample.
+
+
+training:
+The training log looks like:
+```
+[TRAIN] Pass: 0 trainbatch: 0 loss: 4.630959 acc1: 0.0 acc5: 0.0390625 time: 3.09 sec
+[TRAIN] Pass: 0 trainbatch: 10 loss: 4.559069 acc1: 0.0546875 acc5: 0.1171875 time: 3.91 sec
+[TRAIN] Pass: 0 trainbatch: 20 loss: 4.040092 acc1: 0.09375 acc5: 0.3515625 time: 3.88 sec
+[TRAIN] Pass: 0 trainbatch: 30 loss: 3.478214 acc1: 0.3203125 acc5: 0.5546875 time: 3.32 sec
+[TRAIN] Pass: 0 trainbatch: 40 loss: 3.005404 acc1: 0.3515625 acc5: 0.6796875 time: 3.33 sec
+[TRAIN] Pass: 0 trainbatch: 50 loss: 2.585245 acc1: 0.4609375 acc5: 0.7265625 time: 3.13 sec
+[TRAIN] Pass: 0 trainbatch: 60 loss: 2.151489 acc1: 0.4921875 acc5: 0.8203125 time: 3.35 sec
+[TRAIN] Pass: 0 trainbatch: 70 loss: 1.981680 acc1: 0.578125 acc5: 0.8359375 time: 3.30 sec
+```
+
+### Evaluation
+Evaluation measures the performance of a trained model. One can download a pretrained model and set its path to path_to_pretrain_model. Top-1/Top-5 accuracy can then be obtained by running the following command:
+```
+python eval.py \
+ --batch_size=128 \
+ --class_dim=101 \
+ --image_shape=3,224,224 \
+ --with_mem_opt=True \
+ --num_layers=50 \
+ --seg_num=7 \
+ --test_model={path_to_pretrained_model}
+```
+
+With the evaluation configuration above, the output log looks like:
+```
+[TEST] Pass: 0 testbatch: 0 loss: 0.011551 acc1: 1.0 acc5: 1.0 time: 0.48 sec
+[TEST] Pass: 0 testbatch: 10 loss: 0.710330 acc1: 0.75 acc5: 1.0 time: 0.49 sec
+[TEST] Pass: 0 testbatch: 20 loss: 0.000547 acc1: 1.0 acc5: 1.0 time: 0.48 sec
+[TEST] Pass: 0 testbatch: 30 loss: 0.036623 acc1: 1.0 acc5: 1.0 time: 0.48 sec
+[TEST] Pass: 0 testbatch: 40 loss: 0.138705 acc1: 1.0 acc5: 1.0 time: 0.48 sec
+[TEST] Pass: 0 testbatch: 50 loss: 0.056909 acc1: 1.0 acc5: 1.0 time: 0.49 sec
+[TEST] Pass: 0 testbatch: 60 loss: 0.742937 acc1: 0.75 acc5: 1.0 time: 0.49 sec
+[TEST] Pass: 0 testbatch: 70 loss: 1.720186 acc1: 0.5 acc5: 0.875 time: 0.48 sec
+[TEST] Pass: 0 testbatch: 80 loss: 0.199669 acc1: 0.875 acc5: 1.0 time: 0.48 sec
+[TEST] Pass: 0 testbatch: 90 loss: 0.195510 acc1: 1.0 acc5: 1.0 time: 0.48 sec
+```
+
+### Inference
+Inference is used to obtain prediction scores or video features from a trained model.
+```
+python infer.py \
+ --class_dim=101 \
+ --image_shape=3,224,224 \
+ --with_mem_opt=True \
+ --num_layers=50 \
+ --seg_num=7 \
+ --test_model={path_to_pretrained_model}
+```
+
+The output contains prediction results, including the maximum score (before softmax) and the corresponding predicted label.
+```
+Test sample: PlayingGuitar_g01_c03, score: [21.418629], class [62]
+Test sample: SalsaSpin_g05_c06, score: [13.238657], class [76]
+Test sample: TrampolineJumping_g04_c01, score: [21.722862], class [93]
+Test sample: JavelinThrow_g01_c04, score: [16.27892], class [44]
+Test sample: PlayingTabla_g01_c01, score: [15.366951], class [65]
+Test sample: ParallelBars_g04_c07, score: [18.42596], class [56]
+Test sample: PlayingCello_g05_c05, score: [18.795723], class [58]
+Test sample: LongJump_g03_c04, score: [7.100088], class [50]
+Test sample: SkyDiving_g06_c03, score: [15.144707], class [82]
+Test sample: UnevenBars_g07_c04, score: [22.114838], class [95]
+```
+
+### Performance
+Configuration | Top-1 acc
+------------- | ---------------:
+seg=7, size=224 | 0.859
+seg=10, size=224 | 0.863
diff --git a/fluid/PaddleCV/video_classification/data/download.sh b/PaddleCV/video_classification/data/download.sh
similarity index 100%
rename from fluid/PaddleCV/video_classification/data/download.sh
rename to PaddleCV/video_classification/data/download.sh
diff --git a/fluid/PaddleCV/video_classification/data/generate_train_data.py b/PaddleCV/video_classification/data/generate_train_data.py
similarity index 100%
rename from fluid/PaddleCV/video_classification/data/generate_train_data.py
rename to PaddleCV/video_classification/data/generate_train_data.py
diff --git a/fluid/PaddleCV/video_classification/data/split_data.py b/PaddleCV/video_classification/data/split_data.py
similarity index 100%
rename from fluid/PaddleCV/video_classification/data/split_data.py
rename to PaddleCV/video_classification/data/split_data.py
diff --git a/fluid/PaddleCV/video_classification/data/video_decode.py b/PaddleCV/video_classification/data/video_decode.py
similarity index 100%
rename from fluid/PaddleCV/video_classification/data/video_decode.py
rename to PaddleCV/video_classification/data/video_decode.py
diff --git a/fluid/PaddleCV/video_classification/eval.py b/PaddleCV/video_classification/eval.py
similarity index 100%
rename from fluid/PaddleCV/video_classification/eval.py
rename to PaddleCV/video_classification/eval.py
diff --git a/fluid/PaddleCV/video_classification/infer.py b/PaddleCV/video_classification/infer.py
similarity index 100%
rename from fluid/PaddleCV/video_classification/infer.py
rename to PaddleCV/video_classification/infer.py
diff --git a/fluid/PaddleCV/video_classification/reader.py b/PaddleCV/video_classification/reader.py
similarity index 100%
rename from fluid/PaddleCV/video_classification/reader.py
rename to PaddleCV/video_classification/reader.py
diff --git a/fluid/PaddleCV/video_classification/resnet.py b/PaddleCV/video_classification/resnet.py
similarity index 100%
rename from fluid/PaddleCV/video_classification/resnet.py
rename to PaddleCV/video_classification/resnet.py
diff --git a/fluid/PaddleCV/video_classification/train.py b/PaddleCV/video_classification/train.py
similarity index 100%
rename from fluid/PaddleCV/video_classification/train.py
rename to PaddleCV/video_classification/train.py
diff --git a/fluid/PaddleCV/video_classification/utility.py b/PaddleCV/video_classification/utility.py
similarity index 100%
rename from fluid/PaddleCV/video_classification/utility.py
rename to PaddleCV/video_classification/utility.py
diff --git a/PaddleCV/yolov3/.gitignore b/PaddleCV/yolov3/.gitignore
new file mode 100644
index 0000000000000000000000000000000000000000..c8fdc82b116a140c5d09cbf2d76468df195885ec
--- /dev/null
+++ b/PaddleCV/yolov3/.gitignore
@@ -0,0 +1,11 @@
+*.log
+*.json
+*.jpg
+*.png
+output/
+checkpoints/
+weights/
+!weights/*.sh
+dataset/coco/
+log*
+output*
diff --git a/PaddleCV/yolov3/README.md b/PaddleCV/yolov3/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..8b37aded4c21d7e9f50f1d79965bce4bb567b15c
--- /dev/null
+++ b/PaddleCV/yolov3/README.md
@@ -0,0 +1,152 @@
+# YOLOv3 Object Detection
+
+---
+## Table of Contents
+
+- [Installation](#installation)
+- [Introduction](#introduction)
+- [Data preparation](#data-preparation)
+- [Training](#training)
+- [Evaluation](#evaluation)
+- [Inference and Visualization](#inference-and-visualization)
+- [Appendix](#appendix)
+
+## Installation
+
+Running the sample code in this directory requires PaddlePaddle Fluid v1.4 or later. If the PaddlePaddle on your device is lower than this version, please follow the instructions in the [installation document](http://www.paddlepaddle.org/documentation/docs/zh/1.4/beginners_guide/install/install_doc.html#paddlepaddle) and make an update.
+
+## Introduction
+
+[YOLOv3](https://arxiv.org/abs/1804.02767) is a one-stage, end-to-end detector. Its detection principle is as follows:
+
+
+YOLOv3 detection principle
+
+
+YOLOv3 divides the input image into S\*S grid cells and predicts B bounding boxes in each cell. The prediction for each box includes its location (x, y, w, h), a confidence score, and probabilities for C classes, so the YOLOv3 output contains S\*S\*B\*(5 + C) predictions in total. The YOLOv3 loss consists of three parts: location loss, confidence loss, and classification loss.
+The backbone network of YOLOv3 is DarkNet53; the structure of YOLOv3 is as follows:
+
+
+YOLOv3 structure
+
+
+YOLOv3 networks are composed of base feature extraction network, multi-scale feature fusion layers, and output layers.
+
+1. Feature extraction network: YOLOv3 uses [DarkNet53](https://arxiv.org/abs/1612.08242) for feature extraction. DarkNet53 uses a fully convolutional structure, replacing pooling layers with convolutions of stride 2 and adding residual blocks to avoid vanishing gradients when the network gets very deep.
+
+2. Feature fusion layers: to address the insensitivity of earlier YOLO versions to small objects, YOLOv3 performs detection on feature maps of three different scales, 13\*13, 26\*26, and 52\*52, for detecting large, medium, and small objects respectively. The fusion layers take the three scales of feature maps produced by DarkNet53 as input and, following the idea of FPN (feature pyramid networks), fuse the feature maps across scales through a series of convolutional layers and upsampling.
+
+3. Output layers: the output layers also use a fully convolutional structure. The last convolutional layer has 255 kernels: 3\*(80+4+1)=255, where 3 is the number of bounding boxes per grid cell, 4 is the number of box coordinates, 1 is the confidence score, and 80 is the number of categories in the COCO dataset.
+
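+The channel count follows directly from this prediction layout:
+
+    num_anchors_per_cell = 3                                     # boxes per grid cell
+    num_classes = 80                                             # COCO categories
+    out_channels = num_anchors_per_cell * (num_classes + 4 + 1)  # 4 coords + 1 confidence
+    assert out_channels == 255
+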
+## Data preparation
+
+Train the model on the [MS-COCO dataset](http://cocodataset.org/#download); download the dataset as follows:
+
+ cd dataset/coco
+ ./download.sh
+
+
+## Training
+
+After data preparation, one can start the training step by:
+
+ python train.py \
+ --model_save_dir=output/ \
+        --pretrain=${path_to_pretrain_model} \
+ --data_dir=${path_to_data}
+
+- Set ```export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7``` to specify 8 GPUs for training.
+- For more help on arguments:
+
+ python train.py --help
+
+**download the pre-trained model:** This sample provides a DarkNet53 pre-trained model, converted from the DarkNet53 weights pre-trained on ImageNet released by the author. One can download the pre-trained model as:
+
+ sh ./weights/download.sh
+
+Set `pretrain` to load the pre-trained model; the same argument is also used to load a trained model when finetuning.
+Please make sure that the pre-trained model is downloaded and loaded correctly, otherwise the loss may be NaN during training.
+
+**Install the [cocoapi](https://github.com/cocodataset/cocoapi):**
+
+To train the model, [cocoapi](https://github.com/cocodataset/cocoapi) is needed. Install the cocoapi:
+
+ git clone https://github.com/cocodataset/cocoapi.git
+    cd cocoapi/PythonAPI
+ # if cython is not installed
+ pip install Cython
+ # Install into global site-packages
+ make install
+ # Alternatively, if you do not have permissions or prefer
+ # not to install the COCO API into global site-packages
+ python2 setup.py install --user
+
+**data reader introduction:**
+
+* Data reader is defined in `reader.py` .
+
+**model configuration:**
+
+* The model uses 9 anchors generated from the COCO dataset: 10x13, 16x30, 33x23, 30x61, 62x45, 59x119, 116x90, 156x198 and 373x326 (written out as data below).
+
+* NMS settings: nms_thresh=0.45, valid (score) threshold=0.005, nms_topk=400, nms_posk=100
+
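+The anchors and the usual YOLOv3 assignment of anchors to the three detection scales, written out as data (the mask grouping is the standard YOLOv3 convention and an assumption about this particular implementation):
+
+    # (width, height) pairs clustered from COCO, smallest to largest
+    anchors = [(10, 13), (16, 30), (33, 23), (30, 61), (62, 45),
+               (59, 119), (116, 90), (156, 198), (373, 326)]
+    # the 13x13 head takes the largest anchors, the 52x52 head the smallest
+    anchor_masks = [[6, 7, 8], [3, 4, 5], [0, 1, 2]]
+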
+**training strategy:**
+
+* Use momentum optimizer with momentum=0.9.
+* For the first 4000 iterations, the learning rate increases linearly from 0.0 to 0.001. It is then decayed at iterations 400000 and 450000 by multipliers 0.1 and 0.01, respectively. The maximum number of iterations is 500000 (see the sketch below).
+
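+A sketch of this schedule with Fluid's built-in schedulers (assuming `fluid.layers.linear_lr_warmup` is available, as it is in Fluid 1.4+):
+
+    import paddle.fluid as fluid
+
+    lr = fluid.layers.piecewise_decay(
+        boundaries=[400000, 450000], values=[0.001, 0.0001, 0.00001])
+    lr = fluid.layers.linear_lr_warmup(
+        learning_rate=lr, warmup_steps=4000, start_lr=0.0, end_lr=0.001)
+    optimizer = fluid.optimizer.Momentum(learning_rate=lr, momentum=0.9)
+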
+The training result is shown below:
+
+
+Train Loss
+
+
+## Evaluation
+
+Evaluation measures the performance of a trained model. This sample provides `eval.py`, which uses the COCO-specific mAP metric defined by the [COCO committee](http://cocodataset.org/#detections-eval).
+
+`eval.py` is the main executor for evaluation; one can start the evaluation step by:
+
+ python eval.py \
+ --dataset=coco2017 \
+        --weights=${path_to_weights}
+
+- Set ```export CUDA_VISIBLE_DEVICES=0``` to specify one GPU for evaluation.
+
+The evaluation results are shown below:
+
+| input size | mAP(IoU=0.50:0.95) | mAP(IoU=0.50) | mAP(IoU=0.75) |
+| :------: | :------: | :------: | :------: |
+| 608x608| 37.7 | 59.8 | 40.8 |
+| 416x416 | 36.5 | 58.2 | 39.1 |
+| 320x320 | 34.1 | 55.4 | 36.3 |
+
+## Inference and Visualization
+
+Inference is used to obtain prediction scores or image features from a trained model. `infer.py` is the main executor for inference; one can start the inference step by:
+
+ python infer.py \
+ --dataset=coco2017 \
+ --weights=${path_to_weights} \
+ --image_path=data/COCO17/val2017/ \
+ --image_name=000000000139.jpg \
+ --draw_threshold=0.5
+
+Inference speed:
+
+
+| input size | 608x608 | 416x416 | 320x320 |
+|:-------------:| :-----: | :-----: | :-----: |
+| infer speed | 50 ms/frame | 29 ms/frame |24 ms/frame |
+
+
+Visualization of the inference results is shown below:
+
+
+
+
+
+YOLOv3 Visualization Examples
+
+
diff --git a/PaddleCV/yolov3/README_cn.md b/PaddleCV/yolov3/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..2c5c7ebb7d2ba8475f37df112996a2f2b6e64112
--- /dev/null
+++ b/PaddleCV/yolov3/README_cn.md
@@ -0,0 +1,154 @@
+# YOLO V3 目标检测
+
+---
+## 内容
+
+- [安装](#安装)
+- [简介](#简介)
+- [数据准备](#数据准备)
+- [模型训练](#模型训练)
+- [模型评估](#模型评估)
+- [模型推断及可视化](#模型推断及可视化)
+- [附录](#附录)
+
+## 安装
+
+在当前目录下运行样例代码需要PaddlePaddle Fluid v1.4或以上的版本。如果你的运行环境中的PaddlePaddle低于此版本,请根据[安装文档](http://www.paddlepaddle.org/documentation/docs/zh/1.4/beginners_guide/install/install_doc.html#paddlepaddle)中的说明来更新PaddlePaddle。
+
+## 简介
+
+[YOLOv3](https://arxiv.org/abs/1804.02767) 是一阶段End2End的目标检测器。其目标检测原理如下图所示:
+
+
+YOLOv3检测原理
+
+
+YOLOv3将输入图像分成S\*S个格子,每个格子预测B个bounding box,每个bounding box预测内容包括: Location(x, y, w, h)、Confidence Score和C个类别的概率,因此YOLOv3输出层的channel数为S\*S\*B\*(5 + C)。YOLOv3的loss函数由三部分组成:Location误差,Confidence误差和分类误差。
+
+The YOLOv3 network architecture is shown below:
+
+
+YOLOv3 network architecture
+
+
+The YOLOv3 network consists of a base feature-extraction network, multi-scale feature-fusion layers and output layers.
+
+1. Feature-extraction network. YOLOv3 uses [DarkNet53](https://arxiv.org/abs/1612.08242) to extract features. DarkNet53 is essentially fully convolutional: it replaces pooling layers with stride-2 convolutions and adds residual units to avoid vanishing gradients when the network becomes very deep.
+
+2. Feature-fusion layers. To fix the insensitivity of earlier YOLO versions to small objects, YOLOv3 detects on feature maps of three scales, 13\*13, 26\*26 and 52\*52, for large, medium and small objects respectively. The fusion layers take the three feature-map scales produced by DarkNet as input and, borrowing the idea of FPN (feature pyramid networks), fuse the feature maps across scales through a series of convolution and upsampling layers.
+
+3. Output layers. These are also fully convolutional; the final convolution layer has 255 kernels, since 3\*(80+4+1)=255: 3 bounding boxes per grid cell, 4 box coordinates, 1 confidence score and 80 COCO class probabilities. A quick check of this arithmetic follows.
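+
+For the 80-class COCO setting:
+
+    num_anchors_per_cell = 3
+    num_classes = 80
+    # 4 box coordinates + 1 confidence score per anchor
+    channels = num_anchors_per_cell * (num_classes + 4 + 1)
+    assert channels == 255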
+
+
+## Data preparation
+
+Training runs on the [MS-COCO dataset](http://cocodataset.org/#download), which can be downloaded as follows:
+
+ cd dataset/coco
+ ./download.sh
+
+
+## Training
+
+Once the data is ready, training can be started as follows:
+
+ python train.py \
+ --model_save_dir=output/ \
+        --pretrain=${path_to_pretrain_model} \
+ --data_dir=${path_to_data}
+
+- Set ```export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7``` to train on 8 GPUs.
+- For the list of optional arguments, run:
+
+ python train.py --help
+
+**Download the pre-trained model:** This sample provides a DarkNet53 pre-trained model, converted from the author's DarkNet53 weights pre-trained on ImageNet. Download it with:
+
+ sh ./weights/download.sh
+
+Set `pretrain` to load the pre-trained model. The same setting is used to load an already-trained model for fine-tuning.
+Please make sure the pre-trained model is downloaded and loaded correctly before training, otherwise the loss may become NaN during training.
+
+**Install the [cocoapi](https://github.com/cocodataset/cocoapi):**
+
+The [cocoapi](https://github.com/cocodataset/cocoapi) must be downloaded before training:
+
+ git clone https://github.com/cocodataset/cocoapi.git
+    cd cocoapi/PythonAPI
+ # if cython is not installed
+ pip install Cython
+ # Install into global site-packages
+ make install
+ # Alternatively, if you do not have permissions or prefer
+ # not to install the COCO API into global site-packages
+ python2 setup.py install --user
+
+**data reader introduction:**
+
+* The data reader is defined in `reader.py`.
+
+**model configuration:**
+
+* The model uses 9 anchor boxes generated from the COCO dataset: 10x13, 16x30, 33x23, 30x61, 62x45, 59x119, 116x90, 156x198, 373x326.
+* In detection, nms_topk=400, nms_posk=100 and nms_thresh=0.45.
+
+**training strategy:**
+
+* YOLOv3 is trained with the momentum optimizer, momentum=0.9.
+* The learning rate uses warmup: it increases linearly from 0.0 to 0.001 over the first 4000 iterations, is then decayed at iterations 400000 and 450000 with multipliers 0.1 and 0.01, and training runs for at most 500000 iterations.
+
+The training result is shown below:
+
+
+Train Loss
+
+
+## Evaluation
+
+Evaluation measures the performance metrics of a trained model. This sample uses the [official COCO evaluation](http://cocodataset.org/#detections-eval).
+
+`eval.py` is the main executor for evaluation; a sample invocation:
+
+ python eval.py \
+ --dataset=coco2017 \
+ --weights=${path_to_weights} \
+
+- Set ```export CUDA_VISIBLE_DEVICES=0``` to evaluate on a single GPU.
+
+Evaluation results:
+
+| input size | mAP(IoU=0.50:0.95) | mAP(IoU=0.50) | mAP(IoU=0.75) |
+| :------: | :------: | :------: | :------: |
+| 608x608| 37.7 | 59.8 | 40.8 |
+| 416x416 | 36.5 | 58.2 | 39.1 |
+| 320x320 | 34.1 | 55.4 | 36.3 |
+
+
+
+## Inference and Visualization
+
+Inference retrieves the objects in an image together with their classes; `infer.py` is the main executor, invoked for example as:
+
+ python infer.py \
+ --dataset=coco2017 \
+ --weights=${path_to_weights} \
+ --image_path=data/COCO17/val2017/ \
+ --image_name=000000000139.jpg \
+ --draw_threshold=0.5
+
+Inference speed:
+
+
+| input size | 608x608 | 416x416 | 320x320 |
+|:-------------:| :-----: | :-----: | :-----: |
+| infer speed | 50 ms/frame | 29 ms/frame | 24 ms/frame |
+
+Visualized inference results are shown below:
+
+
+
+
+
+YOLOv3 Inference Visualization
+
+
diff --git a/PaddleCV/yolov3/box_utils.py b/PaddleCV/yolov3/box_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..fb8918dc802b3126d3e4e623290b94786d5bf937
--- /dev/null
+++ b/PaddleCV/yolov3/box_utils.py
@@ -0,0 +1,178 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+import numpy as np
+
+import matplotlib
+matplotlib.use('Agg')
+from matplotlib import pyplot as plt
+from PIL import Image
+
+
+def coco_anno_box_to_center_relative(box, img_height, img_width):
+ """
+ Convert COCO annotations box with format [x1, y1, w, h] to
+ center mode [center_x, center_y, w, h] and divide image width
+ and height to get relative value in range[0, 1]
+ """
+ assert len(box) == 4, "box should be a len(4) list or tuple"
+ x, y, w, h = box
+
+ x1 = max(x, 0)
+ x2 = min(x + w - 1, img_width - 1)
+ y1 = max(y, 0)
+ y2 = min(y + h - 1, img_height - 1)
+
+ x = (x1 + x2) / 2 / img_width
+ y = (y1 + y2) / 2 / img_height
+ w = (x2 - x1) / img_width
+ h = (y2 - y1) / img_height
+
+ return np.array([x, y, w, h])
+
+def clip_relative_box_in_image(x, y, w, h):
+    """Clip relative box coordinates x, y, w, h to [0, 1]"""
+    x1 = max(x - w / 2, 0.)
+    x2 = min(x + w / 2, 1.)
+    y1 = max(y - h / 2, 0.)
+    y2 = min(y + h / 2, 1.)
+    x = (x1 + x2) / 2
+    y = (y1 + y2) / 2
+    w = x2 - x1
+    h = y2 - y1
+    return x, y, w, h
+
+def box_xywh_to_xyxy(box):
+ shape = box.shape
+ assert shape[-1] == 4, "Box shape[-1] should be 4."
+
+ box = box.reshape((-1, 4))
+ box[:, 0], box[:, 2] = box[:, 0] - box[:, 2] / 2, box[:, 0] + box[:, 2] / 2
+ box[:, 1], box[:, 3] = box[:, 1] - box[:, 3] / 2, box[:, 1] + box[:, 3] / 2
+ box = box.reshape(shape)
+ return box
+
+def box_iou_xywh(box1, box2):
+ assert box1.shape[-1] == 4, "Box1 shape[-1] should be 4."
+ assert box2.shape[-1] == 4, "Box2 shape[-1] should be 4."
+
+ b1_x1, b1_x2 = box1[:, 0] - box1[:, 2] / 2, box1[:, 0] + box1[:, 2] / 2
+ b1_y1, b1_y2 = box1[:, 1] - box1[:, 3] / 2, box1[:, 1] + box1[:, 3] / 2
+ b2_x1, b2_x2 = box2[:, 0] - box2[:, 2] / 2, box2[:, 0] + box2[:, 2] / 2
+ b2_y1, b2_y2 = box2[:, 1] - box2[:, 3] / 2, box2[:, 1] + box2[:, 3] / 2
+
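+    # intersection rectangle; the +1 terms below follow the inclusive-endpoint
+    # pixel convention for box width and height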
+ inter_x1 = np.maximum(b1_x1, b2_x1)
+ inter_x2 = np.minimum(b1_x2, b2_x2)
+ inter_y1 = np.maximum(b1_y1, b2_y1)
+ inter_y2 = np.minimum(b1_y2, b2_y2)
+ inter_w = inter_x2 - inter_x1 + 1
+ inter_h = inter_y2 - inter_y1 + 1
+ inter_w[inter_w < 0] = 0
+ inter_h[inter_h < 0] = 0
+
+ inter_area = inter_w * inter_h
+ b1_area = (b1_x2 - b1_x1 + 1) * (b1_y2 - b1_y1 + 1)
+ b2_area = (b2_x2 - b2_x1 + 1) * (b2_y2 - b2_y1 + 1)
+
+ return inter_area / (b1_area + b2_area - inter_area)
+
+def box_iou_xyxy(box1, box2):
+ assert box1.shape[-1] == 4, "Box1 shape[-1] should be 4."
+ assert box2.shape[-1] == 4, "Box2 shape[-1] should be 4."
+
+ b1_x1, b1_y1, b1_x2, b1_y2 = box1[:, 0], box1[:, 1], box1[:, 2], box1[:, 3]
+ b2_x1, b2_y1, b2_x2, b2_y2 = box2[:, 0], box2[:, 1], box2[:, 2], box2[:, 3]
+
+ inter_x1 = np.maximum(b1_x1, b2_x1)
+ inter_x2 = np.minimum(b1_x2, b2_x2)
+ inter_y1 = np.maximum(b1_y1, b2_y1)
+ inter_y2 = np.minimum(b1_y2, b2_y2)
+ inter_w = inter_x2 - inter_x1
+ inter_h = inter_y2 - inter_y1
+ inter_w[inter_w < 0] = 0
+ inter_h[inter_h < 0] = 0
+
+ inter_area = inter_w * inter_h
+ b1_area = (b1_x2 - b1_x1) * (b1_y2 - b1_y1)
+ b2_area = (b2_x2 - b2_x1) * (b2_y2 - b2_y1)
+
+ return inter_area / (b1_area + b2_area - inter_area)
+
+def box_crop(boxes, labels, scores, crop, img_shape):
+ x, y, w, h = map(float, crop)
+ im_w, im_h = map(float, img_shape)
+
+ boxes = boxes.copy()
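+    # convert relative [cx, cy, w, h] boxes to absolute [x1, y1, x2, y2] pixels;
+    # boxes are then kept only if their center lies inside the crop window,
+    # clipped to the crop, and finally converted back to relative center form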
+ boxes[:, 0], boxes[:, 2] = (boxes[:, 0] - boxes[:, 2] / 2) * im_w, (boxes[:, 0] + boxes[:, 2] / 2) * im_w
+ boxes[:, 1], boxes[:, 3] = (boxes[:, 1] - boxes[:, 3] / 2) * im_h, (boxes[:, 1] + boxes[:, 3] / 2) * im_h
+
+ crop_box = np.array([x, y, x + w, y + h])
+ centers = (boxes[:, :2] + boxes[:, 2:]) / 2.0
+ mask = np.logical_and(crop_box[:2] <= centers, centers <= crop_box[2:]).all(axis=1)
+
+ boxes[:, :2] = np.maximum(boxes[:, :2], crop_box[:2])
+ boxes[:, 2:] = np.minimum(boxes[:, 2:], crop_box[2:])
+ boxes[:, :2] -= crop_box[:2]
+ boxes[:, 2:] -= crop_box[:2]
+
+ mask = np.logical_and(mask, (boxes[:, :2] < boxes[:, 2:]).all(axis=1))
+ boxes = boxes * np.expand_dims(mask.astype('float32'), axis=1)
+ labels = labels * mask.astype('float32')
+ scores = scores * mask.astype('float32')
+ boxes[:, 0], boxes[:, 2] = (boxes[:, 0] + boxes[:, 2]) / 2 / w, (boxes[:, 2] - boxes[:, 0]) / w
+ boxes[:, 1], boxes[:, 3] = (boxes[:, 1] + boxes[:, 3]) / 2 / h, (boxes[:, 3] - boxes[:, 1]) / h
+
+ return boxes, labels, scores, mask.sum()
+
+def draw_boxes_on_image(image_path, boxes, scores, labels, label_names, score_thresh=0.5):
+ image = np.array(Image.open(image_path))
+ plt.figure()
+ _, ax = plt.subplots(1)
+ ax.imshow(image)
+
+ image_name = image_path.split('/')[-1]
+ print("Image {} detect: ".format(image_name))
+ colors = {}
+ for box, score, label in zip(boxes, scores, labels):
+ if score < score_thresh:
+ continue
+ if box[2] <= box[0] or box[3] <= box[1]:
+ continue
+ label = int(label)
+ if label not in colors:
+ colors[label] = plt.get_cmap('hsv')(label / len(label_names))
+ x1, y1, x2, y2 = box[0], box[1], box[2], box[3]
+ rect = plt.Rectangle((x1, y1), x2 - x1, y2 - y1,
+ fill=False, linewidth=2.0,
+ edgecolor=colors[label])
+ ax.add_patch(rect)
+ ax.text(x1, y1, '{} {:.4f}'.format(label_names[label], score),
+ verticalalignment='bottom', horizontalalignment='left',
+ bbox={'facecolor': colors[label], 'alpha': 0.5, 'pad': 0},
+ fontsize=8, color='white')
+        print("\t {:15s} at {:25} score: {:.5f}".format(label_names[int(label)], str(list(map(int, box))), score))
+ image_name = image_name.replace('jpg', 'png')
+ plt.axis('off')
+ plt.gca().xaxis.set_major_locator(plt.NullLocator())
+ plt.gca().yaxis.set_major_locator(plt.NullLocator())
+ plt.savefig("./output/{}".format(image_name), bbox_inches='tight', pad_inches=0.0)
+ print("Detect result save at ./output/{}\n".format(image_name))
+ plt.cla()
+ plt.close('all')
+
diff --git a/PaddleCV/yolov3/config.py b/PaddleCV/yolov3/config.py
new file mode 100644
index 0000000000000000000000000000000000000000..b7e1eb1c7bf0fe0a2024ed8ef29270687fae5a3a
--- /dev/null
+++ b/PaddleCV/yolov3/config.py
@@ -0,0 +1,128 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+# http://www.apache.org/licenses/LICENSE-2.0
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+from edict import AttrDict
+import six
+import numpy as np
+
+_C = AttrDict()
+cfg = _C
+
+#
+# Training options
+#
+
+# Snapshot period
+_C.snapshot_iter = 2000
+
+# min valid area for gt boxes
+_C.gt_min_area = -1
+
+# max target box number in an image
+_C.max_box_num = 50
+
+
+#
+# Inference options
+#
+
+# valid score threshold to include boxes
+_C.valid_thresh = 0.005
+
+# threshold value for box non-max suppression
+_C.nms_thresh = 0.45
+
+# the number of top k boxes to perform nms
+_C.nms_topk = 400
+
+# the number of output boxes after nms
+_C.nms_posk = 100
+
+# score threshold for draw box in debug mode
+_C.draw_thresh = 0.5
+
+
+#
+# Model options
+#
+
+# pixel mean values
+_C.pixel_means = [0.485, 0.456, 0.406]
+
+# pixel std values
+_C.pixel_stds = [0.229, 0.224, 0.225]
+
+# anchor box width and height
+_C.anchors = [10, 13, 16, 30, 33, 23, 30, 61, 62, 45, 59, 119, 116, 90, 156, 198, 373, 326]
+
+# anchor mask of each yolo layer
+_C.anchor_masks = [[6, 7, 8], [3, 4, 5], [0, 1, 2]]
+
+# IoU threshold to ignore objectness loss of pred box
+_C.ignore_thresh = .7
+
+
+#
+# SOLVER options
+#
+
+# batch size
+_C.batch_size = 8
+
+# base learning rate used to derive the final learning rate
+_C.learning_rate = 0.001
+
+# maximum number of iterations
+_C.max_iter = 500200
+
+# warm up to learning rate
+_C.warm_up_iter = 4000
+_C.warm_up_factor = 0.
+
+# lr steps_with_decay
+_C.lr_steps = [400000, 450000]
+_C.lr_gamma = 0.1
+
+# L2 regularization hyperparameter
+_C.weight_decay = 0.0005
+
+# momentum with SGD
+_C.momentum = 0.9
+
+#
+# ENV options
+#
+
+# support both CPU and GPU
+_C.use_gpu = True
+
+# Class number
+_C.class_num = 80
+
+# dataset path
+_C.train_file_list = 'annotations/instances_train2017.json'
+_C.train_data_dir = 'train2017'
+_C.val_file_list = 'annotations/instances_val2017.json'
+_C.val_data_dir = 'val2017'
+
+
+def merge_cfg_from_args(args):
+ """Merge config keys, values in args into the global config."""
+ for k, v in sorted(six.iteritems(vars(args))):
+        try:
+            # parse numeric / list literals passed as strings, else keep the raw string
+            value = eval(v)
+        except Exception:
+            value = v
+ _C[k] = value
diff --git a/PaddleCV/yolov3/data_utils.py b/PaddleCV/yolov3/data_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..c43e040af8d46f0fc57ee9862da4ee749768e7c6
--- /dev/null
+++ b/PaddleCV/yolov3/data_utils.py
@@ -0,0 +1,151 @@
+"""
+This code is based on https://github.com/fchollet/keras/blob/master/keras/utils/data_utils.py
+"""
+
+import time
+import numpy as np
+import threading
+import multiprocessing
+try:
+ import queue
+except ImportError:
+ import Queue as queue
+
+
+class GeneratorEnqueuer(object):
+ """
+ Builds a queue out of a data generator.
+
+ Args:
+ generator: a generator function which endlessly yields data
+ use_multiprocessing (bool): use multiprocessing if True,
+ otherwise use threading.
+ wait_time (float): time to sleep in-between calls to `put()`.
+        random_seed (int): initial seed for the workers,
+            will be incremented by one for each worker.
+ """
+
+ def __init__(self,
+ generator,
+ use_multiprocessing=False,
+ wait_time=0.05,
+ random_seed=None):
+ self.wait_time = wait_time
+ self._generator = generator
+ self._use_multiprocessing = use_multiprocessing
+ self._threads = []
+ self._stop_event = None
+ self.queue = None
+ self._manager = None
+ self.seed = random_seed
+
+ def start(self, workers=1, max_queue_size=10):
+ """
+ Start worker threads which add data from the generator into the queue.
+
+ Args:
+ workers (int): number of worker threads
+ max_queue_size (int): queue size
+ (when full, threads could block on `put()`)
+ """
+
+ def data_generator_task():
+ """
+ Data generator task.
+ """
+
+ def task():
+ if (self.queue is not None and
+ self.queue.qsize() < max_queue_size):
+ generator_output = next(self._generator)
+ self.queue.put((generator_output))
+ else:
+ time.sleep(self.wait_time)
+
+ if not self._use_multiprocessing:
+ while not self._stop_event.is_set():
+ with self.genlock:
+ try:
+ task()
+ except Exception:
+ self._stop_event.set()
+ break
+ else:
+ while not self._stop_event.is_set():
+ try:
+ task()
+ except Exception:
+ self._stop_event.set()
+ break
+
+ try:
+ if self._use_multiprocessing:
+ self._manager = multiprocessing.Manager()
+ self.queue = self._manager.Queue(maxsize=max_queue_size)
+ self._stop_event = multiprocessing.Event()
+ else:
+ self.genlock = threading.Lock()
+ self.queue = queue.Queue()
+ self._stop_event = threading.Event()
+ for _ in range(workers):
+ if self._use_multiprocessing:
+ # Reset random seed else all children processes
+ # share the same seed
+ np.random.seed(self.seed)
+ thread = multiprocessing.Process(target=data_generator_task)
+ thread.daemon = True
+ if self.seed is not None:
+ self.seed += 1
+ else:
+ thread = threading.Thread(target=data_generator_task)
+ self._threads.append(thread)
+ thread.start()
+ except:
+ self.stop()
+ raise
+
+ def is_running(self):
+ """
+ Returns:
+            bool: Whether the worker threads are running.
+ """
+ return self._stop_event is not None and not self._stop_event.is_set()
+
+ def stop(self, timeout=None):
+ """
+        Stops the running threads and waits for them to exit, if necessary.
+ Should be called by the same thread which called `start()`.
+
+ Args:
+ timeout(int|None): maximum time to wait on `thread.join()`.
+ """
+ if self.is_running():
+ self._stop_event.set()
+ for thread in self._threads:
+ if self._use_multiprocessing:
+ if thread.is_alive():
+ thread.join(timeout)
+ else:
+ thread.join(timeout)
+ if self._manager:
+ self._manager.shutdown()
+
+ self._threads = []
+ self._stop_event = None
+ self.queue = None
+
+ def get(self):
+ """
+ Creates a generator to extract data from the queue.
+ Skip the data if it is `None`.
+
+ # Yields
+ tuple of data in the queue.
+ """
+ while self.is_running():
+ if not self.queue.empty():
+ inputs = self.queue.get()
+ if inputs is not None:
+ yield inputs
+ else:
+ time.sleep(self.wait_time)
diff --git a/PaddleCV/yolov3/dataset/coco/download.sh b/PaddleCV/yolov3/dataset/coco/download.sh
new file mode 100644
index 0000000000000000000000000000000000000000..6f262ccebb635e993b35349890a793430d9ad597
--- /dev/null
+++ b/PaddleCV/yolov3/dataset/coco/download.sh
@@ -0,0 +1,20 @@
+DIR="$( cd "$(dirname "$0")" ; pwd -P )"
+cd "$DIR"
+
+# Download the data.
+echo "Downloading..."
+wget http://images.cocodataset.org/zips/train2014.zip
+wget http://images.cocodataset.org/zips/val2014.zip
+wget http://images.cocodataset.org/zips/train2017.zip
+wget http://images.cocodataset.org/zips/val2017.zip
+wget http://images.cocodataset.org/annotations/annotations_trainval2014.zip
+wget http://images.cocodataset.org/annotations/annotations_trainval2017.zip
+# Extract the data.
+echo "Extracting..."
+unzip train2014.zip
+unzip val2014.zip
+unzip train2017.zip
+unzip val2017.zip
+unzip annotations_trainval2014.zip
+unzip annotations_trainval2017.zip
+
diff --git a/PaddleCV/yolov3/edict.py b/PaddleCV/yolov3/edict.py
new file mode 100644
index 0000000000000000000000000000000000000000..552ede8e4006b5d4e90dd85d566749fd624c26d1
--- /dev/null
+++ b/PaddleCV/yolov3/edict.py
@@ -0,0 +1,37 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+
+class AttrDict(dict):
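+    """A dict subclass that also allows attribute-style access: d.key == d['key']."""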
+ def __init__(self, *args, **kwargs):
+ super(AttrDict, self).__init__(*args, **kwargs)
+
+ def __getattr__(self, name):
+ if name in self.__dict__:
+ return self.__dict__[name]
+ elif name in self:
+ return self[name]
+ else:
+ raise AttributeError(name)
+
+ def __setattr__(self, name, value):
+ if name in self.__dict__:
+ self.__dict__[name] = value
+ else:
+ self[name] = value
diff --git a/PaddleCV/yolov3/eval.py b/PaddleCV/yolov3/eval.py
new file mode 100644
index 0000000000000000000000000000000000000000..9a0ceae2fd1c0e3b10c3ebd88ce205abe2f2b157
--- /dev/null
+++ b/PaddleCV/yolov3/eval.py
@@ -0,0 +1,124 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import time
+import json
+import numpy as np
+import paddle
+import paddle.fluid as fluid
+import reader
+from models.yolov3 import YOLOv3
+from utility import print_arguments, parse_args
+from pycocotools.coco import COCO
+from pycocotools.cocoeval import COCOeval, Params
+from config import cfg
+
+
+def eval():
+ if '2014' in cfg.dataset:
+ test_list = 'annotations/instances_val2014.json'
+ elif '2017' in cfg.dataset:
+ test_list = 'annotations/instances_val2017.json'
+
+ if cfg.debug:
+ if not os.path.exists('output'):
+ os.mkdir('output')
+
+ model = YOLOv3(is_train=False)
+ model.build_model()
+ outputs = model.get_pred()
+ place = fluid.CUDAPlace(0) if cfg.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ # yapf: disable
+ if cfg.weights:
+ def if_exist(var):
+ return os.path.exists(os.path.join(cfg.weights, var.name))
+ fluid.io.load_vars(exe, cfg.weights, predicate=if_exist)
+ # yapf: enable
+ input_size = cfg.input_size
+ test_reader = reader.test(input_size, 1)
+ label_names, label_ids = reader.get_label_infos()
+ if cfg.debug:
+ print("Load in labels {} with ids {}".format(label_names, label_ids))
+ feeder = fluid.DataFeeder(place=place, feed_list=model.feeds())
+
+ def get_pred_result(boxes, scores, labels, im_id):
+ result = []
+ for box, score, label in zip(boxes, scores, labels):
+ x1, y1, x2, y2 = box
+ w = x2 - x1 + 1
+ h = y2 - y1 + 1
+ bbox = [x1, y1, w, h]
+
+ res = {
+ 'image_id': im_id,
+ 'category_id': label_ids[int(label)],
+                'bbox': list(map(float, bbox)),
+ 'score': float(score)
+ }
+ result.append(res)
+ return result
+
+ dts_res = []
+ fetch_list = [outputs]
+ total_time = 0
+ for batch_id, batch_data in enumerate(test_reader()):
+ start_time = time.time()
+ batch_outputs = exe.run(
+ fetch_list=[v.name for v in fetch_list],
+ feed=feeder.feed(batch_data),
+ return_numpy=False,
+ use_program_cache=True)
+ lod = batch_outputs[0].lod()[0]
+ nmsed_boxes = np.array(batch_outputs[0])
+ if nmsed_boxes.shape[1] != 6:
+ continue
+ for i in range(len(lod) - 1):
+ im_id = batch_data[i][1]
+ start = lod[i]
+ end = lod[i + 1]
+ if start == end:
+ continue
+ nmsed_box = nmsed_boxes[start:end, :]
+ labels = nmsed_box[:, 0]
+ scores = nmsed_box[:, 1]
+ boxes = nmsed_box[:, 2:6]
+ dts_res += get_pred_result(boxes, scores, labels, im_id)
+
+ end_time = time.time()
+ print("batch id: {}, time: {}".format(batch_id, end_time - start_time))
+ total_time += end_time - start_time
+
+ with open("yolov3_result.json", 'w') as outfile:
+ json.dump(dts_res, outfile)
+ print("start evaluate detection result with coco api")
+ coco = COCO(os.path.join(cfg.data_dir, test_list))
+ cocoDt = coco.loadRes("yolov3_result.json")
+ cocoEval = COCOeval(coco, cocoDt, 'bbox')
+ cocoEval.evaluate()
+ cocoEval.accumulate()
+ cocoEval.summarize()
+ print("evaluate done.")
+
+    print("Time per batch: {}".format(total_time / (batch_id + 1)))
+
+
+if __name__ == '__main__':
+ args = parse_args()
+ print_arguments(args)
+ eval()
diff --git a/PaddleCV/yolov3/image/000000000139.png b/PaddleCV/yolov3/image/000000000139.png
new file mode 100644
index 0000000000000000000000000000000000000000..a2e3d5d0cd9f6c05ecef83794486410949b53762
Binary files /dev/null and b/PaddleCV/yolov3/image/000000000139.png differ
diff --git a/PaddleCV/yolov3/image/000000127517.png b/PaddleCV/yolov3/image/000000127517.png
new file mode 100644
index 0000000000000000000000000000000000000000..ef04630142bccf1fe8be78f73c4000c02209f3e4
Binary files /dev/null and b/PaddleCV/yolov3/image/000000127517.png differ
diff --git a/PaddleCV/yolov3/image/000000203864.png b/PaddleCV/yolov3/image/000000203864.png
new file mode 100644
index 0000000000000000000000000000000000000000..8067fd8065c272f86952cd289418b4d3d1d44643
Binary files /dev/null and b/PaddleCV/yolov3/image/000000203864.png differ
diff --git a/PaddleCV/yolov3/image/000000515077.png b/PaddleCV/yolov3/image/000000515077.png
new file mode 100644
index 0000000000000000000000000000000000000000..70bbbe6f640fad5394da02e217f52f6912ee3dd3
Binary files /dev/null and b/PaddleCV/yolov3/image/000000515077.png differ
diff --git a/PaddleCV/yolov3/image/YOLOv3.jpg b/PaddleCV/yolov3/image/YOLOv3.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..06b81f545247c1d542fd661f947eb0cf3edc480e
Binary files /dev/null and b/PaddleCV/yolov3/image/YOLOv3.jpg differ
diff --git a/PaddleCV/yolov3/image/YOLOv3_structure.jpg b/PaddleCV/yolov3/image/YOLOv3_structure.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..57ed3298893c10c19c5a25c108237ff9d2933852
Binary files /dev/null and b/PaddleCV/yolov3/image/YOLOv3_structure.jpg differ
diff --git a/PaddleCV/yolov3/image/dog.jpg b/PaddleCV/yolov3/image/dog.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..77b0381222eaed50867643f4166092c781e56d5b
Binary files /dev/null and b/PaddleCV/yolov3/image/dog.jpg differ
diff --git a/PaddleCV/yolov3/image/eagle.jpg b/PaddleCV/yolov3/image/eagle.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..8b7509505b01a766bbf637dcbb1e2c5f24903ac5
Binary files /dev/null and b/PaddleCV/yolov3/image/eagle.jpg differ
diff --git a/PaddleCV/yolov3/image/giraffe.jpg b/PaddleCV/yolov3/image/giraffe.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..a93e8b88398d94a7454f201372317a9414344c7c
Binary files /dev/null and b/PaddleCV/yolov3/image/giraffe.jpg differ
diff --git a/PaddleCV/yolov3/image/horses.jpg b/PaddleCV/yolov3/image/horses.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..3a761f46ba08ed459af026b59f6b91b6fa597dd1
Binary files /dev/null and b/PaddleCV/yolov3/image/horses.jpg differ
diff --git a/PaddleCV/yolov3/image/kite.jpg b/PaddleCV/yolov3/image/kite.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..9eb325ac5fc375cb2513380087dd713be9be19d8
Binary files /dev/null and b/PaddleCV/yolov3/image/kite.jpg differ
diff --git a/PaddleCV/yolov3/image/person.jpg b/PaddleCV/yolov3/image/person.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..61d377fff94d48c365b0cf18edcd4de38b229465
Binary files /dev/null and b/PaddleCV/yolov3/image/person.jpg differ
diff --git a/PaddleCV/yolov3/image/scream.jpg b/PaddleCV/yolov3/image/scream.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..43f2c36a8d4df72c4f8621b377944e05f6c1fa08
Binary files /dev/null and b/PaddleCV/yolov3/image/scream.jpg differ
diff --git a/PaddleCV/yolov3/image/train_loss.png b/PaddleCV/yolov3/image/train_loss.png
new file mode 100644
index 0000000000000000000000000000000000000000..f16728e95d781d996639a35b54a944e91af6b640
Binary files /dev/null and b/PaddleCV/yolov3/image/train_loss.png differ
diff --git a/PaddleCV/yolov3/image_utils.py b/PaddleCV/yolov3/image_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..a35e20be71fe10cce2c4629cb64ea4c5e74cfe27
--- /dev/null
+++ b/PaddleCV/yolov3/image_utils.py
@@ -0,0 +1,218 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+import numpy as np
+import cv2
+from PIL import Image, ImageEnhance
+import random
+
+import box_utils
+
+
+def random_distort(img):
+ def random_brightness(img, lower=0.5, upper=1.5):
+ e = np.random.uniform(lower, upper)
+ return ImageEnhance.Brightness(img).enhance(e)
+
+ def random_contrast(img, lower=0.5, upper=1.5):
+ e = np.random.uniform(lower, upper)
+ return ImageEnhance.Contrast(img).enhance(e)
+
+ def random_color(img, lower=0.5, upper=1.5):
+ e = np.random.uniform(lower, upper)
+ return ImageEnhance.Color(img).enhance(e)
+
+ ops = [random_brightness, random_contrast, random_color]
+ np.random.shuffle(ops)
+
+ img = Image.fromarray(img)
+ img = ops[0](img)
+ img = ops[1](img)
+ img = ops[2](img)
+ img = np.asarray(img)
+
+ return img
+
+
+def random_crop(img, boxes, labels, scores, scales=[0.3, 1.0], max_ratio=2.0, constraints=None, max_trial=50):
+    if len(boxes) == 0:
+        return img, boxes, labels, scores
+
+ if not constraints:
+ constraints = [
+ (0.1, 1.0),
+ (0.3, 1.0),
+ (0.5, 1.0),
+ (0.7, 1.0),
+ (0.9, 1.0),
+ (0.0, 1.0)]
+
+ img = Image.fromarray(img)
+ w, h = img.size
+ crops = [(0, 0, w, h)]
+ for min_iou, max_iou in constraints:
+ for _ in range(max_trial):
+ scale = random.uniform(scales[0], scales[1])
+ aspect_ratio = random.uniform(max(1 / max_ratio, scale * scale), \
+ min(max_ratio, 1 / scale / scale))
+ crop_h = int(h * scale / np.sqrt(aspect_ratio))
+ crop_w = int(w * scale * np.sqrt(aspect_ratio))
+ crop_x = random.randrange(w - crop_w)
+ crop_y = random.randrange(h - crop_h)
+ crop_box = np.array([[
+ (crop_x + crop_w / 2.0) / w,
+ (crop_y + crop_h / 2.0) / h,
+ crop_w / float(w),
+                crop_h / float(h)
+ ]])
+
+ iou = box_utils.box_iou_xywh(crop_box, boxes)
+ if min_iou <= iou.min() and max_iou >= iou.max():
+ crops.append((crop_x, crop_y, crop_w, crop_h))
+ break
+
+ while crops:
+ crop = crops.pop(np.random.randint(0, len(crops)))
+ crop_boxes, crop_labels, crop_scores, box_num = box_utils.box_crop(boxes, labels, scores, crop, (w, h))
+ if box_num < 1:
+ continue
+ img = img.crop((crop[0], crop[1], crop[0] + crop[2], crop[1] + crop[3])).resize(img.size, Image.LANCZOS)
+ img = np.asarray(img)
+ return img, crop_boxes, crop_labels, crop_scores
+ img = np.asarray(img)
+ return img, boxes, labels, scores
+
+def random_flip(img, gtboxes, thresh=0.5):
+ if random.random() > thresh:
+ img = img[:, ::-1, :]
+ gtboxes[:, 0] = 1.0 - gtboxes[:, 0]
+ return img, gtboxes
+
+def random_interp(img, size, interp=None):
+ interp_method = [
+ cv2.INTER_NEAREST,
+ cv2.INTER_LINEAR,
+ cv2.INTER_AREA,
+ cv2.INTER_CUBIC,
+ cv2.INTER_LANCZOS4,
+ ]
+ if not interp or interp not in interp_method:
+ interp = interp_method[random.randint(0, len(interp_method) - 1)]
+ h, w, _ = img.shape
+ im_scale_x = size / float(w)
+ im_scale_y = size / float(h)
+ img = cv2.resize(img, None, None, fx=im_scale_x, fy=im_scale_y, interpolation=interp)
+ return img
+
+def random_expand(img, gtboxes, max_ratio=4., fill=None, keep_ratio=True, thresh=0.5):
+ if random.random() > thresh:
+ return img, gtboxes
+
+ if max_ratio < 1.0:
+ return img, gtboxes
+
+ h, w, c = img.shape
+ ratio_x = random.uniform(1, max_ratio)
+ if keep_ratio:
+ ratio_y = ratio_x
+ else:
+ ratio_y = random.uniform(1, max_ratio)
+ oh = int(h * ratio_y)
+ ow = int(w * ratio_x)
+    off_x = random.randint(0, ow - w)
+    off_y = random.randint(0, oh - h)
+
+ out_img = np.zeros((oh, ow, c))
+ if fill and len(fill) == c:
+ for i in range(c):
+ out_img[:, :, i] = fill[i] * 255.0
+
+ out_img[off_y: off_y + h, off_x: off_x + w, :] = img
+ gtboxes[:, 0] = ((gtboxes[:, 0] * w) + off_x) / float(ow)
+ gtboxes[:, 1] = ((gtboxes[:, 1] * h) + off_y) / float(oh)
+ gtboxes[:, 2] = gtboxes[:, 2] / ratio_x
+ gtboxes[:, 3] = gtboxes[:, 3] / ratio_y
+
+ return out_img.astype('uint8'), gtboxes
+
+def shuffle_gtbox(gtbox, gtlabel, gtscore):
+ gt = np.concatenate([gtbox, gtlabel[:, np.newaxis], gtscore[:, np.newaxis]], axis=1)
+ idx = np.arange(gt.shape[0])
+ np.random.shuffle(idx)
+ gt = gt[idx, :]
+ return gt[:, :4], gt[:, 4], gt[:, 5]
+
+def image_mixup(img1, gtboxes1, gtlabels1, gtscores1, img2, gtboxes2, gtlabels2, gtscores2):
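+    # mixup augmentation (Zhang et al. 2017, "mixup: Beyond Empirical Risk
+    # Minimization"): blend the two images with a Beta(1.5, 1.5) factor and
+    # weight each image's ground-truth scores by its blending factor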
+ factor = np.random.beta(1.5, 1.5)
+ factor = max(0.0, min(1.0, factor))
+    if factor >= 1.0:
+        return img1, gtboxes1, gtlabels1, gtscores1
+    if factor <= 0.0:
+        return img2, gtboxes2, gtlabels2, gtscores2
+ gtscores1 = gtscores1 * factor
+ gtscores2 = gtscores2 * (1.0 - factor)
+
+ h = max(img1.shape[0], img2.shape[0])
+ w = max(img1.shape[1], img2.shape[1])
+ img = np.zeros((h, w, img1.shape[2]), 'float32')
+ img[:img1.shape[0], :img1.shape[1], :] = img1.astype('float32') * factor
+ img[:img2.shape[0], :img2.shape[1], :] += img2.astype('float32') * (1.0 - factor)
+ gtboxes = np.zeros_like(gtboxes1)
+ gtlabels = np.zeros_like(gtlabels1)
+ gtscores = np.zeros_like(gtscores1)
+
+ gt_valid_mask1 = np.logical_and(gtboxes1[:, 2] > 0, gtboxes1[:, 3] > 0)
+ gtboxes1 = gtboxes1[gt_valid_mask1]
+ gtlabels1 = gtlabels1[gt_valid_mask1]
+ gtscores1 = gtscores1[gt_valid_mask1]
+ gtboxes1[:, 0] = gtboxes1[:, 0] * img1.shape[1] / w
+ gtboxes1[:, 1] = gtboxes1[:, 1] * img1.shape[0] / h
+ gtboxes1[:, 2] = gtboxes1[:, 2] * img1.shape[1] / w
+ gtboxes1[:, 3] = gtboxes1[:, 3] * img1.shape[0] / h
+
+ gt_valid_mask2 = np.logical_and(gtboxes2[:, 2] > 0, gtboxes2[:, 3] > 0)
+ gtboxes2 = gtboxes2[gt_valid_mask2]
+ gtlabels2 = gtlabels2[gt_valid_mask2]
+ gtscores2 = gtscores2[gt_valid_mask2]
+ gtboxes2[:, 0] = gtboxes2[:, 0] * img2.shape[1] / w
+ gtboxes2[:, 1] = gtboxes2[:, 1] * img2.shape[0] / h
+ gtboxes2[:, 2] = gtboxes2[:, 2] * img2.shape[1] / w
+ gtboxes2[:, 3] = gtboxes2[:, 3] * img2.shape[0] / h
+
+ gtboxes_all = np.concatenate((gtboxes1, gtboxes2), axis=0)
+ gtlabels_all = np.concatenate((gtlabels1, gtlabels2), axis=0)
+ gtscores_all = np.concatenate((gtscores1, gtscores2), axis=0)
+ gt_num = min(len(gtboxes), len(gtboxes_all))
+ gtboxes[:gt_num] = gtboxes_all[:gt_num]
+ gtlabels[:gt_num] = gtlabels_all[:gt_num]
+ gtscores[:gt_num] = gtscores_all[:gt_num]
+ return img.astype('uint8'), gtboxes, gtlabels, gtscores
+
+def image_augment(img, gtboxes, gtlabels, gtscores, size, means=None):
+ img = random_distort(img)
+ img, gtboxes = random_expand(img, gtboxes, fill=means)
+ img, gtboxes, gtlabels, gtscores = random_crop(img, gtboxes, gtlabels, gtscores)
+ img = random_interp(img, size)
+ img, gtboxes = random_flip(img, gtboxes)
+ gtboxes, gtlabels, gtscores = shuffle_gtbox(gtboxes, gtlabels, gtscores)
+
+ return img.astype('float32'), gtboxes.astype('float32'), \
+ gtlabels.astype('int32'), gtscores.astype('float32')
+
diff --git a/PaddleCV/yolov3/infer.py b/PaddleCV/yolov3/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..58615ccfcf986dfdba385b3c6456df30d63c3890
--- /dev/null
+++ b/PaddleCV/yolov3/infer.py
@@ -0,0 +1,80 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+import os
+import time
+import numpy as np
+import paddle
+import paddle.fluid as fluid
+import box_utils
+import reader
+from utility import print_arguments, parse_args
+from models.yolov3 import YOLOv3
+from pycocotools.coco import COCO
+from pycocotools.cocoeval import COCOeval, Params
+from config import cfg
+
+
+def infer():
+
+ if not os.path.exists('output'):
+ os.mkdir('output')
+
+ model = YOLOv3(is_train=False)
+ model.build_model()
+ outputs = model.get_pred()
+ input_size = cfg.input_size
+ place = fluid.CUDAPlace(0) if cfg.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ # yapf: disable
+ if cfg.weights:
+ def if_exist(var):
+ return os.path.exists(os.path.join(cfg.weights, var.name))
+ fluid.io.load_vars(exe, cfg.weights, predicate=if_exist)
+ # yapf: enable
+ feeder = fluid.DataFeeder(place=place, feed_list=model.feeds())
+ fetch_list = [outputs]
+ image_names = []
+ if cfg.image_name is not None:
+ image_names.append(cfg.image_name)
+ else:
+ for image_name in os.listdir(cfg.image_path):
+ if image_name.split('.')[-1] in ['jpg', 'png']:
+ image_names.append(image_name)
+ for image_name in image_names:
+ infer_reader = reader.infer(input_size, os.path.join(cfg.image_path, image_name))
+ label_names, _ = reader.get_label_infos()
+ data = next(infer_reader())
+ im_shape = data[0][2]
+ outputs = exe.run(
+ fetch_list=[v.name for v in fetch_list],
+ feed=feeder.feed(data),
+ return_numpy=False)
+ bboxes = np.array(outputs[0])
+ if bboxes.shape[1] != 6:
+ print("No object found in {}".format(image_name))
+ continue
+ labels = bboxes[:, 0].astype('int32')
+ scores = bboxes[:, 1].astype('float32')
+ boxes = bboxes[:, 2:].astype('float32')
+
+ path = os.path.join(cfg.image_path, image_name)
+ box_utils.draw_boxes_on_image(path, boxes, scores, labels, label_names, cfg.draw_thresh)
+
+
+if __name__ == '__main__':
+ args = parse_args()
+ print_arguments(args)
+ infer()
diff --git a/PaddleCV/yolov3/learning_rate.py b/PaddleCV/yolov3/learning_rate.py
new file mode 100644
index 0000000000000000000000000000000000000000..d712832d31463cc054b99aa924bf9ca84f976634
--- /dev/null
+++ b/PaddleCV/yolov3/learning_rate.py
@@ -0,0 +1,61 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import paddle.fluid as fluid
+import paddle.fluid.layers.learning_rate_scheduler as lr_scheduler
+from paddle.fluid.layers import control_flow
+
+
+def exponential_with_warmup_decay(learning_rate, boundaries, values,
+ warmup_iter, warmup_factor):
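+    """Piecewise-constant decay at `boundaries`, preceded by a linear warmup
+    over the first `warmup_iter` steps that ramps the learning rate from
+    warmup_factor * learning_rate up to learning_rate.
+    """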
+ global_step = lr_scheduler._decay_step_counter()
+
+ lr = fluid.layers.create_global_var(
+ shape=[1],
+ value=0.0,
+ dtype='float32',
+ persistable=True,
+ name="learning_rate")
+
+ warmup_iter_var = fluid.layers.fill_constant(
+ shape=[1], dtype='float32', value=float(warmup_iter), force_cpu=True)
+
+ with control_flow.Switch() as switch:
+ with switch.case(global_step < warmup_iter_var):
+ alpha = global_step / warmup_iter_var
+ factor = warmup_factor * (1 - alpha) + alpha
+ decayed_lr = learning_rate * factor
+ fluid.layers.assign(decayed_lr, lr)
+
+ for i in range(len(boundaries)):
+ boundary_val = fluid.layers.fill_constant(
+ shape=[1],
+ dtype='float32',
+ value=float(boundaries[i]),
+ force_cpu=True)
+ value_var = fluid.layers.fill_constant(
+ shape=[1], dtype='float32', value=float(values[i]))
+ with switch.case(global_step < boundary_val):
+ fluid.layers.assign(value_var, lr)
+
+ last_value_var = fluid.layers.fill_constant(
+ shape=[1], dtype='float32', value=float(values[len(values) - 1]))
+ with switch.default():
+ fluid.layers.assign(last_value_var, lr)
+
+ return lr
diff --git a/fluid/PaddleRec/ssr/__init__.py b/PaddleCV/yolov3/models/__init__.py
similarity index 100%
rename from fluid/PaddleRec/ssr/__init__.py
rename to PaddleCV/yolov3/models/__init__.py
diff --git a/PaddleCV/yolov3/models/darknet.py b/PaddleCV/yolov3/models/darknet.py
new file mode 100644
index 0000000000000000000000000000000000000000..6fd89d425960afe1c2af1674d1ed37ca7fd47271
--- /dev/null
+++ b/PaddleCV/yolov3/models/darknet.py
@@ -0,0 +1,96 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle.fluid as fluid
+from paddle.fluid.param_attr import ParamAttr
+from paddle.fluid.initializer import Constant
+from paddle.fluid.regularizer import L2Decay
+
+def conv_bn_layer(input,
+ ch_out,
+ filter_size,
+ stride,
+ padding,
+ act='leaky',
+ is_test=True,
+ name=None):
+ conv1 = fluid.layers.conv2d(
+ input=input,
+ num_filters=ch_out,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ act=None,
+ param_attr=ParamAttr(initializer=fluid.initializer.Normal(0., 0.02),
+ name=name+".conv.weights"),
+ bias_attr=False)
+
+ bn_name = name + ".bn"
+ out = fluid.layers.batch_norm(
+ input=conv1,
+ act=None,
+ is_test=is_test,
+ param_attr=ParamAttr(
+ initializer=fluid.initializer.Normal(0., 0.02),
+ regularizer=L2Decay(0.),
+ name=bn_name + '.scale'),
+ bias_attr=ParamAttr(
+ initializer=fluid.initializer.Constant(0.0),
+ regularizer=L2Decay(0.),
+ name=bn_name + '.offset'),
+ moving_mean_name=bn_name + '.mean',
+ moving_variance_name=bn_name + '.var')
+ if act == 'leaky':
+ out = fluid.layers.leaky_relu(x=out, alpha=0.1)
+ return out
+
+def downsample(input, ch_out, filter_size=3, stride=2, padding=1, is_test=True, name=None):
+ return conv_bn_layer(input,
+ ch_out=ch_out,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ is_test=is_test,
+ name=name)
+
+def basicblock(input, ch_out, is_test=True, name=None):
+ conv1 = conv_bn_layer(input, ch_out, 1, 1, 0, is_test=is_test, name=name+".0")
+ conv2 = conv_bn_layer(conv1, ch_out*2, 3, 1, 1, is_test=is_test, name=name+".1")
+ out = fluid.layers.elementwise_add(x=input, y=conv2, act=None)
+ return out
+
+def layer_warp(block_func, input, ch_out, count, is_test=True, name=None):
+ res_out = block_func(input, ch_out, is_test=is_test, name='{}.0'.format(name))
+ for j in range(1, count):
+ res_out = block_func(res_out, ch_out, is_test=is_test, name='{}.{}'.format(name, j))
+ return res_out
+
+DarkNet_cfg = {
+ 53: ([1,2,8,8,4],basicblock)
+}
+
+def add_DarkNet53_conv_body(body_input, is_test=True):
+ stages, block_func = DarkNet_cfg[53]
+ stages = stages[0:5]
+ conv1 = conv_bn_layer(
+ body_input, ch_out=32, filter_size=3, stride=1, padding=1, is_test=is_test, name="yolo_input")
+ downsample_ = downsample(conv1, ch_out=conv1.shape[1]*2, is_test=is_test, name="yolo_input.downsample")
+ blocks = []
+ for i, stage in enumerate(stages):
+ block = layer_warp(block_func, downsample_, 32 *(2**i), stage, is_test=is_test, name="stage.{}".format(i))
+ blocks.append(block)
+        if i < len(stages) - 1:  # do not downsample in the last stage
+ downsample_ = downsample(block, ch_out=block.shape[1]*2, is_test=is_test, name="stage.{}.downsample".format(i))
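+    # return the outputs of the last three stages, deepest first (strides 32,
+    # 16 and 8 w.r.t. the input), to serve the multi-scale detection heads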
+ return blocks[-1:-4:-1]
+
diff --git a/PaddleCV/yolov3/models/yolov3.py b/PaddleCV/yolov3/models/yolov3.py
new file mode 100644
index 0000000000000000000000000000000000000000..55c0667b16097b4424c948798a5faccc8ad2e366
--- /dev/null
+++ b/PaddleCV/yolov3/models/yolov3.py
@@ -0,0 +1,187 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import division
+from __future__ import print_function
+
+import paddle.fluid as fluid
+from paddle.fluid.param_attr import ParamAttr
+from paddle.fluid.initializer import Constant
+from paddle.fluid.initializer import Normal
+from paddle.fluid.regularizer import L2Decay
+
+from config import cfg
+
+from .darknet import add_DarkNet53_conv_body
+from .darknet import conv_bn_layer
+
+def yolo_detection_block(input, channel, is_test=True, name=None):
+ assert channel % 2 == 0, "channel {} cannot be divided by 2".format(channel)
+ conv = input
+ for j in range(2):
+ conv = conv_bn_layer(conv, channel, filter_size=1, stride=1, padding=0, is_test=is_test, name='{}.{}.0'.format(name, j))
+ conv = conv_bn_layer(conv, channel*2, filter_size=3, stride=1, padding=1, is_test=is_test, name='{}.{}.1'.format(name, j))
+ route = conv_bn_layer(conv, channel, filter_size=1, stride=1, padding=0, is_test=is_test, name='{}.2'.format(name))
+ tip = conv_bn_layer(route,channel*2, filter_size=3, stride=1, padding=1, is_test=is_test, name='{}.tip'.format(name))
+ return route, tip
+
+def upsample(input, scale=2, name=None):
+ # get dynamic upsample output shape
+ shape_nchw = fluid.layers.shape(input)
+ shape_hw = fluid.layers.slice(shape_nchw, axes=[0], starts=[2], ends=[4])
+ shape_hw.stop_gradient = True
+ in_shape = fluid.layers.cast(shape_hw, dtype='int32')
+ out_shape = in_shape * scale
+ out_shape.stop_gradient = True
+
+    # resize by actual_shape
+ out = fluid.layers.resize_nearest(
+ input=input,
+ scale=scale,
+ actual_shape=out_shape,
+ name=name)
+ return out
+
+class YOLOv3(object):
+ def __init__(self,
+ is_train=True,
+ use_random=True):
+ self.is_train = is_train
+ self.use_random = use_random
+ self.outputs = []
+ self.losses = []
+ self.downsample = 32
+
+ def build_input(self):
+ self.image_shape = [3, cfg.input_size, cfg.input_size]
+ if self.is_train:
+ self.py_reader = fluid.layers.py_reader(
+ capacity=64,
+ shapes = [[-1] + self.image_shape, [-1, cfg.max_box_num, 4], [-1, cfg.max_box_num], [-1, cfg.max_box_num]],
+ lod_levels=[0, 0, 0, 0],
+ dtypes=['float32'] * 2 + ['int32'] + ['float32'],
+ use_double_buffer=True)
+ self.image, self.gtbox, self.gtlabel, self.gtscore = fluid.layers.read_file(self.py_reader)
+ else:
+ self.image = fluid.layers.data(
+ name='image', shape=self.image_shape, dtype='float32'
+ )
+ self.im_shape = fluid.layers.data(
+ name="im_shape", shape=[2], dtype='int32')
+ self.im_id = fluid.layers.data(
+ name="im_id", shape=[1], dtype='int32')
+
+ def feeds(self):
+ if not self.is_train:
+ return [self.image, self.im_id, self.im_shape]
+ return [self.image, self.gtbox, self.gtlabel, self.gtscore]
+
+ def build_model(self):
+ self.build_input()
+
+ self.outputs = []
+ self.boxes = []
+ self.scores = []
+
+ blocks = add_DarkNet53_conv_body(self.image, not self.is_train)
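+        # blocks are the stride-32, 16 and 8 DarkNet feature maps, deepest first;
+        # each scale gets its own detection head, and route features from deeper
+        # heads are upsampled and concatenated into the next finer block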
+ for i, block in enumerate(blocks):
+ if i > 0:
+ block = fluid.layers.concat(
+ input=[route, block],
+ axis=1)
+ route, tip = yolo_detection_block(block, channel=512//(2**i),
+ is_test=(not self.is_train),
+ name="yolo_block.{}".format(i))
+
+ # out channel number = mask_num * (5 + class_num)
+ num_filters = len(cfg.anchor_masks[i]) * (cfg.class_num + 5)
+ block_out = fluid.layers.conv2d(
+ input=tip,
+ num_filters=num_filters,
+ filter_size=1,
+ stride=1,
+ padding=0,
+ act=None,
+ param_attr=ParamAttr(initializer=fluid.initializer.Normal(0., 0.02),
+ name="yolo_output.{}.conv.weights".format(i)),
+ bias_attr=ParamAttr(initializer=fluid.initializer.Constant(0.0),
+ regularizer=L2Decay(0.),
+ name="yolo_output.{}.conv.bias".format(i)))
+ self.outputs.append(block_out)
+
+ if i < len(blocks) - 1:
+ route = conv_bn_layer(
+ input=route,
+ ch_out=256//(2**i),
+ filter_size=1,
+ stride=1,
+ padding=0,
+ is_test=(not self.is_train),
+ name="yolo_transition.{}".format(i))
+ # upsample
+ route = upsample(route)
+
+
+ for i, out in enumerate(self.outputs):
+ anchor_mask = cfg.anchor_masks[i]
+
+ if self.is_train:
+ loss = fluid.layers.yolov3_loss(
+ x=out,
+ gtbox=self.gtbox,
+ gtlabel=self.gtlabel,
+ gtscore=self.gtscore,
+ anchors=cfg.anchors,
+ anchor_mask=anchor_mask,
+ class_num=cfg.class_num,
+ ignore_thresh=cfg.ignore_thresh,
+ downsample_ratio=self.downsample,
+ use_label_smooth=cfg.label_smooth,
+ name="yolo_loss"+str(i))
+ self.losses.append(fluid.layers.reduce_mean(loss))
+ else:
+ mask_anchors=[]
+ for m in anchor_mask:
+ mask_anchors.append(cfg.anchors[2 * m])
+ mask_anchors.append(cfg.anchors[2 * m + 1])
+ boxes, scores = fluid.layers.yolo_box(
+ x=out,
+ img_size=self.im_shape,
+ anchors=mask_anchors,
+ class_num=cfg.class_num,
+ conf_thresh=cfg.valid_thresh,
+ downsample_ratio=self.downsample,
+ name="yolo_box"+str(i))
+ self.boxes.append(boxes)
+ self.scores.append(fluid.layers.transpose(scores, perm=[0, 2, 1]))
+
+ self.downsample //= 2
+
+
+ def loss(self):
+ return sum(self.losses)
+
+ def get_pred(self):
+ yolo_boxes = fluid.layers.concat(self.boxes, axis=1)
+ yolo_scores = fluid.layers.concat(self.scores, axis=2)
+ return fluid.layers.multiclass_nms(
+ bboxes=yolo_boxes,
+ scores=yolo_scores,
+ score_threshold=cfg.valid_thresh,
+ nms_top_k=cfg.nms_topk,
+ keep_top_k=cfg.nms_posk,
+ nms_threshold=cfg.nms_thresh,
+ background_label=-1,
+ name="multiclass_nms")
+
diff --git a/PaddleCV/yolov3/reader.py b/PaddleCV/yolov3/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..24e830dfc6487b90b266c70effe8984d5a661276
--- /dev/null
+++ b/PaddleCV/yolov3/reader.py
@@ -0,0 +1,303 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+import numpy as np
+import os
+import random
+import time
+import copy
+import cv2
+import box_utils
+import image_utils
+from pycocotools.coco import COCO
+from data_utils import GeneratorEnqueuer
+from config import cfg
+
+
+class DataSetReader(object):
+    """A class for parsing and reading the COCO dataset"""
+
+ def __init__(self):
+        self.has_parsed_category = False
+
+ def _parse_dataset_dir(self, mode):
+ if 'coco2014' in cfg.dataset:
+ cfg.train_file_list = 'annotations/instances_train2014.json'
+ cfg.train_data_dir = 'train2014'
+ cfg.val_file_list = 'annotations/instances_val2014.json'
+ cfg.val_data_dir = 'val2014'
+ elif 'coco2017' in cfg.dataset:
+ cfg.train_file_list = 'annotations/instances_train2017.json'
+ cfg.train_data_dir = 'train2017'
+ cfg.val_file_list = 'annotations/instances_val2017.json'
+ cfg.val_data_dir = 'val2017'
+ else:
+ raise NotImplementedError('Dataset {} not supported'.format(
+ cfg.dataset))
+
+ if mode == 'train':
+ cfg.train_file_list = os.path.join(cfg.data_dir, cfg.train_file_list)
+ cfg.train_data_dir = os.path.join(cfg.data_dir, cfg.train_data_dir)
+ self.COCO = COCO(cfg.train_file_list)
+ self.img_dir = cfg.train_data_dir
+ elif mode == 'test' or mode == 'infer':
+ cfg.val_file_list = os.path.join(cfg.data_dir, cfg.val_file_list)
+ cfg.val_data_dir = os.path.join(cfg.data_dir, cfg.val_data_dir)
+ self.COCO = COCO(cfg.val_file_list)
+ self.img_dir = cfg.val_data_dir
+
+
+    def _parse_dataset_category(self):
+ self.categories = self.COCO.loadCats(self.COCO.getCatIds())
+ self.num_category = len(self.categories)
+ self.label_names = []
+ self.label_ids = []
+ for category in self.categories:
+ self.label_names.append(category['name'])
+ self.label_ids.append(int(category['id']))
+ self.category_to_id_map = {
+ v: i
+ for i, v in enumerate(self.label_ids)
+ }
+        print("Loaded {} categories.".format(self.num_category))
+        self.has_parsed_category = True
+
+ def get_label_infos(self):
+        if not self.has_parsed_category:
+            self._parse_dataset_dir("test")
+            self._parse_dataset_category()
+ return (self.label_names, self.label_ids)
+
+ def _parse_gt_annotations(self, img):
+ img_height = img['height']
+ img_width = img['width']
+ anno = self.COCO.loadAnns(self.COCO.getAnnIds(imgIds=img['id'], iscrowd=None))
+ gt_index = 0
+ for target in anno:
+ if target['area'] < cfg.gt_min_area:
+ continue
+ if 'ignore' in target and target['ignore']:
+ continue
+
+ box = box_utils.coco_anno_box_to_center_relative(target['bbox'], img_height, img_width)
+            if box[2] <= 0 or box[3] <= 0:
+ continue
+
+ img['gt_id'][gt_index] = np.int32(target['id'])
+ img['gt_boxes'][gt_index] = box
+ img['gt_labels'][gt_index] = self.category_to_id_map[target['category_id']]
+ gt_index += 1
+ if gt_index >= cfg.max_box_num:
+ break
+
+ def _parse_images(self, is_train):
+ image_ids = self.COCO.getImgIds()
+ image_ids.sort()
+ imgs = copy.deepcopy(self.COCO.loadImgs(image_ids))
+ for img in imgs:
+ img['image'] = os.path.join(self.img_dir, img['file_name'])
+ assert os.path.exists(img['image']), \
+ "image {} not found.".format(img['image'])
+ box_num = cfg.max_box_num
+ img['gt_id'] = np.zeros((cfg.max_box_num), dtype=np.int32)
+ img['gt_boxes'] = np.zeros((cfg.max_box_num, 4), dtype=np.float32)
+ img['gt_labels'] = np.zeros((cfg.max_box_num), dtype=np.int32)
+ for k in ['date_captured', 'url', 'license', 'file_name']:
+ if k in img:
+ del img[k]
+
+ if is_train:
+ self._parse_gt_annotations(img)
+
+ print("Loaded {0} images from {1}.".format(len(imgs), cfg.dataset))
+
+ return imgs
+
+ def _parse_images_by_mode(self, mode):
+ if mode == 'infer':
+ return []
+ else:
+ return self._parse_images(is_train=(mode=='train'))
+
+ def get_reader(self, mode, size=416, batch_size=None, shuffle=False, mixup_iter=0, random_sizes=[], image=None):
+        assert mode in ['train', 'test', 'infer'], "Unknown mode type!"
+        if mode != 'infer':
+            assert batch_size is not None, "batch size cannot be None in mode {}".format(mode)
+ self._parse_dataset_dir(mode)
+        self._parse_dataset_category()
+
+ def img_reader(img, size, mean, std):
+ im_path = img['image']
+ im = cv2.imread(im_path).astype('float32')
+ im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)
+
+ h, w, _ = im.shape
+ im_scale_x = size / float(w)
+ im_scale_y = size / float(h)
+ out_img = cv2.resize(im, None, None, fx=im_scale_x, fy=im_scale_y, interpolation=cv2.INTER_CUBIC)
+ mean = np.array(mean).reshape((1, 1, -1))
+ std = np.array(std).reshape((1, 1, -1))
+ out_img = (out_img / 255.0 - mean) / std
+ out_img = out_img.transpose((2, 0, 1))
+
+ return (out_img, int(img['id']), (h, w))
+
+ def img_reader_with_augment(img, size, mean, std, mixup_img):
+ im_path = img['image']
+ im = cv2.imread(im_path)
+ im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)
+ gt_boxes = img['gt_boxes'].copy()
+ gt_labels = img['gt_labels'].copy()
+ gt_scores = np.ones_like(gt_labels)
+
+ if mixup_img:
+ mixup_im = cv2.imread(mixup_img['image'])
+ mixup_im = cv2.cvtColor(mixup_im, cv2.COLOR_BGR2RGB)
+ mixup_gt_boxes = np.array(mixup_img['gt_boxes']).copy()
+ mixup_gt_labels = np.array(mixup_img['gt_labels']).copy()
+ mixup_gt_scores = np.ones_like(mixup_gt_labels)
+ im, gt_boxes, gt_labels, gt_scores = image_utils.image_mixup(im, gt_boxes, \
+ gt_labels, gt_scores, mixup_im, mixup_gt_boxes, mixup_gt_labels, \
+ mixup_gt_scores)
+
+ im, gt_boxes, gt_labels, gt_scores = image_utils.image_augment(im, gt_boxes, gt_labels, gt_scores, size, mean)
+
+ mean = np.array(mean).reshape((1, 1, -1))
+ std = np.array(std).reshape((1, 1, -1))
+ out_img = (im / 255.0 - mean) / std
+ out_img = out_img.astype('float32').transpose((2, 0, 1))
+
+ return (out_img, gt_boxes, gt_labels, gt_scores)
+
+ def get_img_size(size, random_sizes=[]):
+ if len(random_sizes):
+ return np.random.choice(random_sizes)
+ return size
+
+ def get_mixup_img(imgs, mixup_iter, total_iter, read_cnt):
+ if total_iter >= mixup_iter:
+ return None
+
+ mixup_idx = np.random.randint(1, len(imgs))
+ mixup_img = imgs[(read_cnt + mixup_idx) % len(imgs)]
+ return mixup_img
+
+ def reader():
+ if mode == 'train':
+ imgs = self._parse_images_by_mode(mode)
+ if shuffle:
+ np.random.shuffle(imgs)
+ read_cnt = 0
+ total_iter = 0
+ batch_out = []
+ img_size = get_img_size(size, random_sizes)
+ while True:
+ img = imgs[read_cnt % len(imgs)]
+ mixup_img = get_mixup_img(imgs, mixup_iter, total_iter, read_cnt)
+ read_cnt += 1
+ if read_cnt % len(imgs) == 0 and shuffle:
+ np.random.shuffle(imgs)
+ im, gt_boxes, gt_labels, gt_scores = img_reader_with_augment(img, img_size, cfg.pixel_means, cfg.pixel_stds, mixup_img)
+ batch_out.append([im, gt_boxes, gt_labels, gt_scores])
+
+ if len(batch_out) == batch_size:
+ yield batch_out
+ batch_out = []
+ total_iter += 1
+ img_size = get_img_size(size, random_sizes)
+
+ elif mode == 'test':
+ imgs = self._parse_images_by_mode(mode)
+ batch_out = []
+ for img in imgs:
+ im, im_id, im_shape = img_reader(img, size, cfg.pixel_means, cfg.pixel_stds)
+ batch_out.append((im, im_id, im_shape))
+ if len(batch_out) == batch_size:
+ yield batch_out
+ batch_out = []
+ if len(batch_out) != 0:
+ yield batch_out
+ else:
+ img = {}
+ img['image'] = image
+ img['id'] = 0
+ im, im_id, im_shape = img_reader(img, size, cfg.pixel_means, cfg.pixel_stds)
+ batch_out = [(im, im_id, im_shape)]
+ yield batch_out
+
+ return reader
+
+
+dsr = DataSetReader()
+
+def train(size=416,
+ batch_size=64,
+ shuffle=True,
+ total_iter=0,
+ mixup_iter=0,
+ random_sizes=[],
+ num_workers=8,
+ max_queue=32,
+ use_multiprocessing=True):
+ generator = dsr.get_reader('train', size, batch_size, shuffle, int(mixup_iter/num_workers), random_sizes)
+
+ if not use_multiprocessing:
+ return generator
+
+ def infinite_reader():
+ while True:
+ for data in generator():
+ yield data
+
+ def reader():
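+ # Pull batches from the background workers started by GeneratorEnqueuer:
+ # poll the queue until a batch is ready, and stop the workers once
+ # total_iter batches have been yielded.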
+ cnt = 0
+ enqueuer = None
+ try:
+ enqueuer = GeneratorEnqueuer(
+ infinite_reader(), use_multiprocessing=use_multiprocessing)
+ enqueuer.start(max_queue_size=max_queue, workers=num_workers)
+ generator_out = None
+ while True:
+ while enqueuer.is_running():
+ if not enqueuer.queue.empty():
+ generator_out = enqueuer.queue.get()
+ break
+ else:
+ time.sleep(0.02)
+ yield generator_out
+ cnt += 1
+ if cnt >= total_iter:
+ enqueuer.stop()
+ return
+ generator_out = None
+ finally:
+ if enqueuer is not None:
+ enqueuer.stop()
+
+ return reader
+
+def test(size=416, batch_size=1):
+ return dsr.get_reader('test', size, batch_size)
+
+def infer(size=416, image=None):
+ return dsr.get_reader('infer', size, image=image)
+
+def get_label_infos():
+ return dsr.get_label_infos()
+
diff --git a/PaddleCV/yolov3/train.py b/PaddleCV/yolov3/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..01394c80c7765de1b3baa78e9fc1868e1a6fae73
--- /dev/null
+++ b/PaddleCV/yolov3/train.py
@@ -0,0 +1,141 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import sys
+import numpy as np
+import random
+import time
+import shutil
+from utility import parse_args, print_arguments, SmoothedValue
+
+import paddle
+import paddle.fluid as fluid
+import reader
+from models.yolov3 import YOLOv3
+from learning_rate import exponential_with_warmup_decay
+from config import cfg
+
+
+def train():
+
+ if cfg.debug:
+ fluid.default_startup_program().random_seed = 1000
+ fluid.default_main_program().random_seed = 1000
+ random.seed(0)
+ np.random.seed(0)
+
+ if not os.path.exists(cfg.model_save_dir):
+ os.makedirs(cfg.model_save_dir)
+
+ model = YOLOv3()
+ model.build_model()
+ input_size = cfg.input_size
+ loss = model.loss()
+ loss.persistable = True
+
+ devices = os.getenv("CUDA_VISIBLE_DEVICES") or ""
+ devices_num = len(devices.split(","))
+ print("Found {} CUDA devices.".format(devices_num))
+
+ learning_rate = cfg.learning_rate
+ boundaries = cfg.lr_steps
+ gamma = cfg.lr_gamma
+ step_num = len(cfg.lr_steps)
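+ # piecewise decay: start from the base learning rate and multiply by
+ # gamma at each boundary in cfg.lr_steps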
+ values = [learning_rate * (gamma**i) for i in range(step_num + 1)]
+
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=exponential_with_warmup_decay(
+ learning_rate=learning_rate,
+ boundaries=boundaries,
+ values=values,
+ warmup_iter=cfg.warm_up_iter,
+ warmup_factor=cfg.warm_up_factor),
+ regularization=fluid.regularizer.L2Decay(cfg.weight_decay),
+ momentum=cfg.momentum)
+ optimizer.minimize(loss)
+
+ place = fluid.CUDAPlace(0) if cfg.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ if cfg.pretrain:
+ if not os.path.exists(cfg.pretrain):
+ print("Pretrain weights not found: {}".format(cfg.pretrain))
+ def if_exist(var):
+ return os.path.exists(os.path.join(cfg.pretrain, var.name))
+ fluid.io.load_vars(exe, cfg.pretrain, predicate=if_exist)
+
+ compile_program = fluid.compiler.CompiledProgram(
+ fluid.default_main_program()).with_data_parallel(
+ loss_name=loss.name)
+
+ random_sizes = [cfg.input_size]
+ if cfg.random_shape:
+ random_sizes = [32 * i for i in range(10, 20)]
+
+ total_iter = cfg.max_iter - cfg.start_iter
+ mixup_iter = total_iter - cfg.no_mixup_iter
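+ # scale the reader's iteration counts by devices_num: with data
+ # parallelism each device consumes one batch per step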
+ train_reader = reader.train(input_size, batch_size=cfg.batch_size, shuffle=True, total_iter=total_iter*devices_num, mixup_iter=mixup_iter*devices_num, random_sizes=random_sizes, use_multiprocessing=cfg.use_multiprocess)
+ py_reader = model.py_reader
+ py_reader.decorate_paddle_reader(train_reader)
+
+ def save_model(postfix):
+ model_path = os.path.join(cfg.model_save_dir, postfix)
+ if os.path.isdir(model_path):
+ shutil.rmtree(model_path)
+ fluid.io.save_persistables(exe, model_path)
+
+ fetch_list = [loss]
+
+ py_reader.start()
+ smoothed_loss = SmoothedValue()
+ try:
+ start_time = time.time()
+ prev_start_time = start_time
+ snapshot_loss = 0
+ snapshot_time = 0
+ for iter_id in range(cfg.start_iter, cfg.max_iter):
+ prev_start_time = start_time
+ start_time = time.time()
+ losses = exe.run(compile_program, fetch_list=[v.name for v in fetch_list])
+ smoothed_loss.add_value(np.mean(np.array(losses[0])))
+ snapshot_loss += np.mean(np.array(losses[0]))
+ snapshot_time += start_time - prev_start_time
+ lr = np.array(fluid.global_scope().find_var('learning_rate')
+ .get_tensor())
+ print("Iter {:d}, lr {:.6f}, loss {:.6f}, time {:.5f}".format(
+ iter_id, lr[0],
+ smoothed_loss.get_mean_value(), start_time - prev_start_time))
+ sys.stdout.flush()
+ if (iter_id + 1) % cfg.snapshot_iter == 0:
+ save_model("model_iter{}".format(iter_id))
+ print("Snapshot {} saved, average loss: {}, average time: {}".format(
+ iter_id + 1, snapshot_loss / float(cfg.snapshot_iter),
+ snapshot_time / float(cfg.snapshot_iter)))
+ snapshot_loss = 0
+ snapshot_time = 0
+ except fluid.core.EOFException:
+ py_reader.reset()
+
+ save_model('model_final')
+
+
+if __name__ == '__main__':
+ args = parse_args()
+ print_arguments(args)
+ train()
diff --git a/PaddleCV/yolov3/utility.py b/PaddleCV/yolov3/utility.py
new file mode 100644
index 0000000000000000000000000000000000000000..d28f6a862d43e9e81f7b9b260fd25dd02e2d996c
--- /dev/null
+++ b/PaddleCV/yolov3/utility.py
@@ -0,0 +1,129 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+"""
+Contains common utility functions.
+"""
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import sys
+import distutils.util
+import numpy as np
+import six
+from collections import deque
+from paddle.fluid import core
+import argparse
+import functools
+from config import *
+
+
+def print_arguments(args):
+ """Print argparse's arguments.
+
+ Usage:
+
+ .. code-block:: python
+
+ parser = argparse.ArgumentParser()
+ parser.add_argument("name", default="Jonh", type=str, help="User name.")
+ args = parser.parse_args()
+ print_arguments(args)
+
+ :param args: Input argparse.Namespace for printing.
+ :type args: argparse.Namespace
+ """
+ print("----------- Configuration Arguments -----------")
+ for arg, value in sorted(six.iteritems(vars(args))):
+ print("%s: %s" % (arg, value))
+ print("------------------------------------------------")
+
+
+def add_arguments(argname, type, default, help, argparser, **kwargs):
+ """Add argparse's argument.
+
+ Usage:
+
+ .. code-block:: python
+
+ parser = argparse.ArgumentParser()
+ add_argument("name", str, "Jonh", "User name.", parser)
+ args = parser.parse_args()
+ """
+ type = distutils.util.strtobool if type == bool else type
+ argparser.add_argument(
+ "--" + argname,
+ default=default,
+ type=type,
+ help=help + ' Default: %(default)s.',
+ **kwargs)
+
+
+class SmoothedValue(object):
+ """Track a series of values and provide access to smoothed values over a
+ window or the global series average.
+ """
+
+ def __init__(self):
+ self.loss_sum = 0.0
+ self.iter_cnt = 0
+
+ def add_value(self, value):
+ self.loss_sum += np.mean(value)
+ self.iter_cnt += 1
+
+ def get_mean_value(self):
+ return self.loss_sum / self.iter_cnt
+
+
+def parse_args():
+ """return all args
+ """
+ parser = argparse.ArgumentParser(description=__doc__)
+ add_arg = functools.partial(add_arguments, argparser=parser)
+ # yapf: disable
+ # ENV
+ add_arg('use_gpu', bool, True, "Whether to use GPU.")
+ add_arg('model_save_dir', str, 'checkpoints', "The path to save model.")
+ add_arg('pretrain', str, 'weights/darknet53', "The pretrain model path.")
+ add_arg('weights', str, 'weights/yolov3', "The weights path.")
+ add_arg('dataset', str, 'coco2017', "Dataset: coco2014, coco2017.")
+ add_arg('class_num', int, 80, "Class number.")
+ add_arg('data_dir', str, 'dataset/coco', "The data root path.")
+ add_arg('start_iter', int, 0, "Start iteration.")
+ add_arg('use_multiprocess', bool, True, "Whether to use a multiprocess reader.")
+ #SOLVER
+ add_arg('batch_size', int, 8, "Mini-batch size per device.")
+ add_arg('learning_rate', float, 0.001, "Learning rate.")
+ add_arg('max_iter', int, 500200, "Iter number.")
+ add_arg('snapshot_iter', int, 2000, "Save a checkpoint every snapshot_iter iterations.")
+ add_arg('label_smooth', bool, True, "Use label smooth in class label.")
+ add_arg('no_mixup_iter', int, 40000, "Disable mixup in last N iter.")
+ # TRAIN TEST INFER
+ add_arg('input_size', int, 608, "Image input size of YOLOv3.")
+ add_arg('random_shape', bool, True, "Resize to random shape for train reader.")
+ add_arg('valid_thresh', float, 0.005, "Valid confidence score for NMS.")
+ add_arg('nms_thresh', float, 0.45, "NMS threshold.")
+ add_arg('nms_topk', int, 400, "The number of boxes to perform NMS.")
+ add_arg('nms_posk', int, 100, "The number of boxes of NMS output.")
+ add_arg('debug', bool, False, "Debug mode")
+ # SINGLE EVAL AND DRAW
+ add_arg('image_path', str, 'image', "The image path used to inference and visualize.")
+ add_arg('image_name', str, None, "The single image used for inference and visualization. None to infer all images in image_path.")
+ add_arg('draw_thresh', float, 0.5, "Confidence score threshold for drawing prediction boxes in debug mode.")
+ # yapf: enable
+ args = parser.parse_args()
+ file_name = sys.argv[0]
+ merge_cfg_from_args(args)
+ return args
diff --git a/PaddleCV/yolov3/weights/download.sh b/PaddleCV/yolov3/weights/download.sh
new file mode 100644
index 0000000000000000000000000000000000000000..44295ab53ef428ba664bf0b73a61e469c643690a
--- /dev/null
+++ b/PaddleCV/yolov3/weights/download.sh
@@ -0,0 +1,10 @@
+DIR="$( cd "$(dirname "$0")" ; pwd -P )"
+cd "$DIR"
+
+# Download the pretrained weights.
+echo "Downloading..."
+wget https://paddlemodels.bj.bcebos.com/yolo/darknet53.tar.gz
+wget https://paddlemodels.bj.bcebos.com/yolo/yolov3.tar.gz
+echo "Extracting..."
+tar -xf darknet53.tar.gz
+tar -xf yolov3.tar.gz
diff --git a/PaddleNLP/ELMO/README.md b/PaddleNLP/ELMO/README.md
new file mode 100755
index 0000000000000000000000000000000000000000..e945c7c26fe9ceab21ab5842da32438f4f56223b
--- /dev/null
+++ b/PaddleNLP/ELMO/README.md
@@ -0,0 +1,132 @@
+# ELMO
+
+## Introduction
+
+ELMo (Embeddings from Language Models) is a deep contextualized word representation that models complex characteristics of words (such as syntax and semantics) and how their usage varies across linguistic contexts (i.e., it models polysemy). As a word representation, ELMo solves two important problems: (1) the complex characteristics of word usage, such as syntax and grammar; and (2) how a word is used in a specific context, e.g., disambiguating polysemous words.
+
+ELMo trains a bidirectional LSTM on a large corpus with a language-model objective, uses the LSTM to produce word representations, and fine-tunes them for downstream NLP tasks such as question answering, classification, and named entity recognition.
+
+Highlights of this release:
+1. Complete code for pretraining the model.
+2. Multi-GPU training support, about twice as fast as mainstream implementations.
+3. A [pretrained Chinese ELMo model](https://dureader.gz.bcebos.com/elmo/elmo_chinese_checkpoint.tar.gz),
+trained on about 3.8 GB of Chinese encyclopedia data.
+4. Fine-tuning steps and sample code based on ELMo, validated to improve the F1 score by 0.68% on the Chinese lexical analysis task LAC.
+
+
+## Requirements and Dependencies
+
+Python==2.7
+
+PaddlePaddle (latest version)
+
+numpy==1.15.1
+
+six==1.11.0
+
+glob
+
+
+## Pretraining
+
+1. Split the document files into sentences, and tokenize each sentence against the vocabulary (see [`data/vocabulary_min5k.txt`](data/vocabulary_min5k.txt)). Split the files into a training set (trainset) and a test set (testset); see [`data/train`](data/train) for sample training data and [`data/dev`](data/dev) for sample test data.
+A training/test ratio of 5:1 is recommended.
+
+```
+本 书 介绍 了 中国 经济 发展 的 内外 平衡 问题 、 亚洲 金融 危机 十 周年 回顾 与 反思 、 实践 中 的 城乡 统筹 发展 、 未来 十 年 中国 需要 研究 的 重大 课题 、 科学 发展 与 新型 工业 化 等 方面 。
+```
+```
+吴 敬 琏 曾经 提出 中国 股市 “ 赌场 论 ” , 主张 维护 市场 规则 , 保护 草根 阶层 生计 , 被 誉 为 “ 中国 经济 学界 良心 ” , 是 媒体 和 公众 眼中 的 学术 明星
+```
+
+2. Train the model:
+
+```shell
+sh run.sh
+```
+
+3. Write the checkpoint results to files.
+
+
+## Single-Machine Multi-GPU Training
+
+The model supports single-machine multi-GPU training. Specify the GPUs by exporting CUDA_VISIBLE_DEVICES in [`run.sh`](run.sh), as shown below:
+```shell
+export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7
+```
+
+## Fine-Tuning with ELMo
+
+In deep learning, training from scratch (for example, in image recognition) consumes a lot of time and resources, and when the dataset is small the model may fail to fit. Transfer learning addresses this: initializing the network to be trained with an already-trained model speeds up convergence and can also improve accuracy. The initializing model is one trained to full convergence on a large dataset; ideally it shares the same architecture as the network to be trained, so that as many layers as possible can be initialized.
+
+Fine-tuning with ELMo differs from the BERT approach: the ELMo part is treated as pretrained word vectors that are plugged into the downstream NLP task.
+
+The usage recommended in the original paper is to directly concatenate the ELMo output vectors with the embedding layer of the downstream task. The ELMo part loads the pretrained model parameters directly (via the fluid.io.load_vars API in PaddlePaddle), and these parameters are kept fixed in the downstream task (implemented in PaddlePaddle via stop_gradient = True; see the one-line sketch below).
+
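+For example, keeping the pretrained parameters fixed amounts to cutting the gradient at the ELMo output (a minimal sketch; `elmo_enc` denotes the ELMo encoder output used in step 5 below):
+
+```python
+elmo_enc.stop_gradient = True  # no gradients flow back into the pretrained ELMo layers
+```
+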
+The key points of an ELMo fine-tuning task are:
+
+1) Download the parameter files of the pretrained model.
+
+2) Include bilm.py, the ELMo network definition.
+
+3) Load the pretrained model when the network starts up.
+
+4) Tokenize the input against the ELMo vocabulary and convert the tokens to ids.
+
+5) Concatenate the ELMo word vectors with the network's embedding layer.
+
+The concrete steps:
+1. Download the official ELMo Paddle checkpoint file, pretrained on about 3.8 GB of Chinese encyclopedia data:
+[official PaddlePaddle checkpoint download](https://dureader.gz.bcebos.com/elmo/elmo_chinese_checkpoint.tar.gz)
+
+2. Load the ELMo checkpoint file during network initialization. The parameter-loading call (fluid.io.load_vars) can be placed right after parameter initialization (exe.run(fluid.default_startup_program())).
+
+```python
+# create an executor (GPU here; use the commented CPUPlace line to run on CPU)
+place = fluid.CUDAPlace(0)
+# place = fluid.CPUPlace()
+exe = fluid.Executor(place)
+# initialize the parameters
+exe.run(fluid.default_startup_program())
+
+```
+
+```python
+src_pretrain_model_path = '490001' # 490001 is the ELMo checkpoint directory
+def if_exist(var):
+ path = os.path.join(src_pretrain_model_path, var.name)
+ exist = os.path.exists(path)
+ if exist:
+ print('Load model: %s' % path)
+ return exist
+
+fluid.io.load_vars(executor=exe, dirname=src_pretrain_model_path, predicate=if_exist, main_program=main_program)
+```
+
+3. Add [`bilm.py`](bilm.py), the ELMo network definition, to the downstream NLP task code.
+
+4. Tokenize the input sentences or paragraphs against the ELMo vocabulary (see [`data/vocabulary_min5k.txt`](data/vocabulary_min5k.txt)), convert the tokens to ids, and put them into feed_dict, as in the sketch below.
+
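+A minimal sketch of this step, assuming the `Vocabulary` class from [`data.py`](data.py) and a network input named `word` (both names are illustrative; variable-length inputs also need to be wrapped as a LoDTensor when fed):
+
+```python
+from data import Vocabulary
+
+vocab = Vocabulary('data/vocabulary_min5k.txt')
+# The input is whitespace-tokenized text, matching the pretraining data format.
+sentence = u'本 书 介绍 了 中国 经济 发展 的 内外 平衡 问题'
+word_ids = vocab.encode(sentence)  # adds <S>/</S>, maps OOV tokens to <UNK>
+feed_dict = {'word': word_ids.reshape((-1, 1))}
+```
+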
+5. In the downstream task's network definition, hook the ELMo network into the embedding part:
+
+```python
+# import the embedding and encoder parts from bilm.py
+from bilm import elmo_encoder
+from bilm import emb
+
+# word holds the ids of the tokenized input for the ELMo part
+elmo_embedding = emb(word)
+elmo_enc = elmo_encoder(elmo_embedding)
+
+# concatenate with the word_embedding produced by the downstream NLP task
+word_embedding = layers.concat(input=[elmo_enc, word_embedding], axis=1)
+
+```
+
+
+## Reference
+[Deep contextualized word representations](https://arxiv.org/abs/1802.05365)
+
+
+## Contributors
+This project was completed jointly by the PaddlePaddle team of Baidu's Deep Learning Technology Platform Department and Baidu's Natural Language Processing Department. Code contributions and issue reports are welcome.
diff --git a/PaddleNLP/ELMO/args.py b/PaddleNLP/ELMO/args.py
new file mode 100755
index 0000000000000000000000000000000000000000..53bc0191280718d5ab44b8d482495b335d5d1e96
--- /dev/null
+++ b/PaddleNLP/ELMO/args.py
@@ -0,0 +1,107 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import argparse
+
+
+def parse_args():
+ parser = argparse.ArgumentParser(description=__doc__)
+ parser.add_argument(
+ "--load_dir",
+ type=str,
+ default="",
+ help="Specify the path to load trained models.")
+ parser.add_argument(
+ "--load_pretraining_params",
+ type=str,
+ default="",
+ help="Specify the path to load pretrained model parameters, NOT including moment and learning_rate")
+ parser.add_argument(
+ "--batch_size",
+ type=int,
+ default=128,
+ help="The sequence number of a mini-batch data. (default: %(default)d)")
+ parser.add_argument(
+ "--embed_size",
+ type=int,
+ default=512,
+ help="The dimension of embedding table. (default: %(default)d)")
+ parser.add_argument(
+ "--hidden_size",
+ type=int,
+ default=4096,
+ help="The size of rnn hidden unit. (default: %(default)d)")
+ parser.add_argument(
+ "--num_layers",
+ type=int,
+ default=2,
+ help="The size of rnn layers. (default: %(default)d)")
+ parser.add_argument(
+ "--num_steps",
+ type=int,
+ default=20,
+ help="The size of sequence len. (default: %(default)d)")
+ parser.add_argument(
+ "--data_path", type=str, help="all the data for train,valid,test")
+ parser.add_argument("--vocab_path", type=str, help="vocab file path")
+ parser.add_argument(
+ '--use_gpu', type=bool, default=False, help='whether using gpu')
+ parser.add_argument('--enable_ce', action='store_true')
+ parser.add_argument('--test_nccl', action='store_true')
+ parser.add_argument('--optim', default='adagrad', help='optimizer type')
+ parser.add_argument('--sample_softmax', action='store_true')
+ parser.add_argument(
+ "--learning_rate",
+ type=float,
+ default=0.2,
+ help="Learning rate used to train the model. (default: %(default)f)")
+ parser.add_argument(
+ "--log_interval",
+ type=int,
+ default=100,
+ help="log the train loss every n batches."
+ "(default: %(default)d)")
+ parser.add_argument(
+ "--save_interval",
+ type=int,
+ default=10000,
+ help="log the train loss every n batches."
+ "(default: %(default)d)")
+ parser.add_argument(
+ "--dev_interval",
+ type=int,
+ default=10000,
+ help="cal dev loss every n batches."
+ "(default: %(default)d)")
+ parser.add_argument('--dropout', type=float, default=0.1)
+ parser.add_argument('--max_grad_norm', type=float, default=10.0)
+ parser.add_argument('--proj_clip', type=float, default=3.0)
+ parser.add_argument('--cell_clip', type=float, default=3.0)
+ parser.add_argument('--max_epoch', type=float, default=10)
+ parser.add_argument('--local', type=bool, default=False)
+ parser.add_argument('--shuffle', type=bool, default=False)
+ parser.add_argument('--use_custom_samples', type=bool, default=False)
+ parser.add_argument('--para_save_dir', type=str, default='model_new')
+ parser.add_argument('--train_path', type=str, default='')
+ parser.add_argument('--test_path', type=str, default='')
+ parser.add_argument('--update_method', type=str, default='nccl2')
+ parser.add_argument('--random_seed', type=int, default=0)
+ parser.add_argument('--n_negative_samples_batch', type=int, default=8000)
+ args = parser.parse_args()
+
+ return args
diff --git a/PaddleNLP/ELMO/bilm.py b/PaddleNLP/ELMO/bilm.py
new file mode 100755
index 0000000000000000000000000000000000000000..3f517694373abec3ef543e11b3dc1befed9eb10f
--- /dev/null
+++ b/PaddleNLP/ELMO/bilm.py
@@ -0,0 +1,141 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# This file is used for fine-tuning.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import numpy
+import paddle.fluid.layers as layers
+import paddle.fluid as fluid
+import numpy as np
+
+# If you use our released pretrained weights, do not change these settings.
+cell_clip = 3.0
+proj_clip = 3.0
+hidden_size = 4096
+vocab_size = 52445
+embed_size = 512
+# According to the original paper, dropout needs to be adjusted for fine-tuning.
+modify_dropout = 1
+proj_size = 512
+num_layers = 2
+random_seed = 0
+dropout_rate = 0.5
+
+
+def dropout(input):
+ return layers.dropout(
+ input,
+ dropout_prob=dropout_rate,
+ dropout_implementation="upscale_in_train",
+ seed=random_seed,
+ is_test=False)
+
+def lstmp_encoder(input_seq, gate_size, h_0, c_0, para_name):
+ # An LSTM encoder implementation with projection.
+ # The linear transformation for the input gate, output gate, forget gate
+ # and cell activation vectors needs to be done outside of dynamic_lstm,
+ # so the output size is 4 times gate_size.
+
+ input_proj = layers.fc(input=input_seq,
+ param_attr=fluid.ParamAttr(
+ name=para_name + '_gate_w', initializer=None),
+ size=gate_size * 4,
+ act=None,
+ bias_attr=False)
+ hidden, cell = layers.dynamic_lstmp(
+ input=input_proj,
+ size=gate_size * 4,
+ proj_size=proj_size,
+ h_0=h_0,
+ c_0=c_0,
+ use_peepholes=False,
+ proj_clip=proj_clip,
+ cell_clip=cell_clip,
+ proj_activation="identity",
+ param_attr=fluid.ParamAttr(initializer=None),
+ bias_attr=fluid.ParamAttr(initializer=None))
+ return hidden, cell, input_proj
+
+
+def encoder(x_emb,
+ init_hidden=None,
+ init_cell=None,
+ para_name=''):
+ rnn_input = x_emb
+ rnn_outs = []
+ rnn_outs_ori = []
+ cells = []
+ projs = []
+ for i in range(num_layers):
+ if init_hidden and init_cell:
+ h0 = layers.squeeze(
+ layers.slice(
+ init_hidden, axes=[0], starts=[i], ends=[i + 1]),
+ axes=[0])
+ c0 = layers.squeeze(
+ layers.slice(
+ init_cell, axes=[0], starts=[i], ends=[i + 1]),
+ axes=[0])
+ else:
+ h0 = c0 = None
+ rnn_out, cell, input_proj = lstmp_encoder(
+ rnn_input, hidden_size, h0, c0,
+ para_name + 'layer{}'.format(i + 1))
+ rnn_out_ori = rnn_out
+ if i > 0:
+ rnn_out = rnn_out + rnn_input
+ rnn_out.stop_gradient = True
+ rnn_outs.append(rnn_out)
+ rnn_outs_ori.append(rnn_out_ori)
+ # add learnable scalar weights (ELMo layer mixing) for fine-tuning
+ a1 = layers.create_parameter(
+ [1], dtype="float32", name="gamma1")
+ a2 = layers.create_parameter(
+ [1], dtype="float32", name="gamma2")
+ rnn_outs[0].stop_gradient = True
+ rnn_outs[1].stop_gradient = True
+ num_layer1 = rnn_outs[0] * a1
+ num_layer2 = rnn_outs[1] * a2
+ output_layer = num_layer1 * 0.5 + num_layer2 * 0.5
+ return output_layer, rnn_outs_ori
+
+
+def emb(x):
+ x_emb = layers.embedding(
+ input=x,
+ size=[vocab_size, embed_size],
+ dtype='float32',
+ is_sparse=False,
+ param_attr=fluid.ParamAttr(name='embedding_para'))
+ return x_emb
+
+
+def elmo_encoder(x_emb):
+ x_emb_r = fluid.layers.sequence_reverse(x_emb, name=None)
+ fw_hiddens, fw_hiddens_ori = encoder(
+ x_emb,
+ para_name='fw_')
+ bw_hiddens, bw_hiddens_ori = encoder(
+ x_emb_r,
+ para_name='bw_')
+ embedding = layers.concat(input=[fw_hiddens, bw_hiddens], axis=1)
+ # apply dropout during fine-tuning
+ embedding = dropout(embedding)
+ a = layers.create_parameter(
+ [1], dtype="float32", name="gamma")
+ embedding = embedding * a
+ return embedding
diff --git a/PaddleNLP/ELMO/data.py b/PaddleNLP/ELMO/data.py
new file mode 100755
index 0000000000000000000000000000000000000000..4bb67a1cd0bd9cdf268d02bcd93ee21c009ccb32
--- /dev/null
+++ b/PaddleNLP/ELMO/data.py
@@ -0,0 +1,509 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import glob
+import random
+
+import numpy as np
+import io
+import six
+
+class Vocabulary(object):
+ '''
+ A token vocabulary. Holds a map from token to ids and provides
+ a method for encoding text to a sequence of ids.
+ '''
+
+ def __init__(self, filename, validate_file=False):
+ '''
+ filename = the vocabulary file. It is a flat text file with one
+ (normalized) token per line. In addition, the file should also
+ contain the special tokens <S>, </S>, <UNK> (case sensitive).
+ '''
+ self._id_to_word = []
+ self._word_to_id = {}
+ self._unk = -1
+ self._bos = -1
+ self._eos = -1
+
+ with io.open(filename, 'r', encoding='utf-8') as f:
+ idx = 0
+ for line in f:
+ word_name = line.strip()
+ if word_name == '<S>':
+ self._bos = idx
+ elif word_name == '</S>':
+ self._eos = idx
+ elif word_name == '<UNK>':
+ self._unk = idx
+ if word_name == '!!!MAXTERMID':
+ continue
+
+ self._id_to_word.append(word_name)
+ self._word_to_id[word_name] = idx
+ idx += 1
+
+ # check to ensure file has special tokens
+ if validate_file:
+ if self._bos == -1 or self._eos == -1 or self._unk == -1:
+ raise ValueError("Ensure the vocabulary file has "
+ ", , tokens")
+
+ @property
+ def bos(self):
+ return self._bos
+
+ @property
+ def eos(self):
+ return self._eos
+
+ @property
+ def unk(self):
+ return self._unk
+
+ @property
+ def size(self):
+ return len(self._id_to_word)
+
+ def word_to_id(self, word):
+ if word in self._word_to_id:
+ return self._word_to_id[word]
+ return self.unk
+
+ def id_to_word(self, cur_id):
+ return self._id_to_word[cur_id]
+
+ def decode(self, cur_ids):
+ """Convert a list of ids to a sentence, with space inserted."""
+ return ' '.join([self.id_to_word(cur_id) for cur_id in cur_ids])
+
+ def encode(self, sentence, reverse=False, split=True):
+ """Convert a sentence to a list of ids, with special tokens added.
+ Sentence is a single string with tokens separated by whitespace.
+
+ If reverse, then the sentence is assumed to be reversed, and
+ this method will swap the BOS/EOS tokens appropriately."""
+
+ if split:
+ word_ids = [
+ self.word_to_id(cur_word) for cur_word in sentence.split()
+ ]
+ else:
+ word_ids = [self.word_to_id(cur_word) for cur_word in sentence]
+
+ if reverse:
+ return np.array([self.eos] + word_ids + [self.bos], dtype=np.int32)
+ else:
+ return np.array([self.bos] + word_ids + [self.eos], dtype=np.int32)
+
+
+class UnicodeCharsVocabulary(Vocabulary):
+ """Vocabulary containing character-level and word level information.
+
+ Has a word vocabulary that is used to lookup word ids and
+ a character id that is used to map words to arrays of character ids.
+
+ The character ids are defined by ord(c) for c in word.encode('utf-8').
+ This limits the total number of possible char ids to 256.
+ To this we add 5 additional special ids: begin sentence, end sentence,
+ begin word, end word and padding.
+
+ WARNING: for prediction, we add +1 to the output ids from this
+ class to create a special padding id (=0). As a result, we suggest
+ you use the `Batcher`, `TokenBatcher`, and `LMDataset` classes instead
+ of this lower level class. If you are using this lower level class,
+ then be sure to add the +1 appropriately, otherwise embeddings computed
+ from the pre-trained model will be useless.
+ """
+
+ def __init__(self, filename, max_word_length, **kwargs):
+ super(UnicodeCharsVocabulary, self).__init__(filename, **kwargs)
+ self._max_word_length = max_word_length
+
+ # char ids 0-255 come from utf-8 encoding bytes
+ # assign 256-300 to special chars
+ self.bos_char = 256 # <begin sentence>
+ self.eos_char = 257 # <end sentence>
+ self.bow_char = 258 # <begin word>
+ self.eow_char = 259 # <end word>
+ self.pad_char = 260 # <padding>
+
+ num_words = len(self._id_to_word)
+
+ self._word_char_ids = np.zeros(
+ [num_words, max_word_length], dtype=np.int32)
+
+ # the character representation of the begin/end of sentence characters
+ def _make_bos_eos(c):
+ r = np.zeros([self.max_word_length], dtype=np.int32)
+ r[:] = self.pad_char
+ r[0] = self.bow_char
+ r[1] = c
+ r[2] = self.eow_char
+ return r
+
+ self.bos_chars = _make_bos_eos(self.bos_char)
+ self.eos_chars = _make_bos_eos(self.eos_char)
+
+ for i, word in enumerate(self._id_to_word):
+ self._word_char_ids[i] = self._convert_word_to_char_ids(word)
+
+ self._word_char_ids[self.bos] = self.bos_chars
+ self._word_char_ids[self.eos] = self.eos_chars
+
+ @property
+ def word_char_ids(self):
+ return self._word_char_ids
+
+ @property
+ def max_word_length(self):
+ return self._max_word_length
+
+ def _convert_word_to_char_ids(self, word):
+ code = np.zeros([self.max_word_length], dtype=np.int32)
+ code[:] = self.pad_char
+
+ word_encoded = word.encode('utf-8',
+ 'ignore')[:(self.max_word_length - 2)]
+ code[0] = self.bow_char
+ for k, chr_id in enumerate(word_encoded, start=1):
+ code[k] = ord(chr_id)
+ code[k + 1] = self.eow_char
+
+ return code
+
+ def word_to_char_ids(self, word):
+ if word in self._word_to_id:
+ return self._word_char_ids[self._word_to_id[word]]
+ else:
+ return self._convert_word_to_char_ids(word)
+
+ def encode_chars(self, sentence, reverse=False, split=True):
+ '''
+ Encode the sentence as a white space delimited string of tokens.
+ '''
+ if split:
+ chars_ids = [
+ self.word_to_char_ids(cur_word)
+ for cur_word in sentence.split()
+ ]
+ else:
+ chars_ids = [
+ self.word_to_char_ids(cur_word) for cur_word in sentence
+ ]
+ if reverse:
+ return np.vstack([self.eos_chars] + chars_ids + [self.bos_chars])
+ else:
+ return np.vstack([self.bos_chars] + chars_ids + [self.eos_chars])
+
+
+class Batcher(object):
+ '''
+ Batch sentences of tokenized text into character id matrices.
+ '''
+
+ # def __init__(self, lm_vocab_file: str, max_token_length: int):
+ def __init__(self, lm_vocab_file, max_token_length):
+ '''
+ lm_vocab_file = the language model vocabulary file (one line per
+ token)
+ max_token_length = the maximum number of characters in each token
+ '''
+ max_token_length = int(max_token_length)
+ self._lm_vocab = UnicodeCharsVocabulary(lm_vocab_file,
+ max_token_length)
+ self._max_token_length = max_token_length
+
+ # def batch_sentences(self, sentences: List[List[str]]):
+ def batch_sentences(self, sentences):
+ '''
+ Batch the sentences as character ids
+ Each sentence is a list of tokens without <S> or </S>, e.g.
+ [['The', 'first', 'sentence', '.'], ['Second', '.']]
+ '''
+ n_sentences = len(sentences)
+ max_length = max(len(sentence) for sentence in sentences) + 2
+
+ X_char_ids = np.zeros(
+ (n_sentences, max_length, self._max_token_length), dtype=np.int64)
+
+ for k, sent in enumerate(sentences):
+ length = len(sent) + 2
+ char_ids_without_mask = self._lm_vocab.encode_chars(
+ sent, split=False)
+ # add one so that 0 is the mask value
+ X_char_ids[k, :length, :] = char_ids_without_mask + 1
+
+ return X_char_ids
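+
+# Hypothetical usage sketch (vocab path assumed):
+# batcher = Batcher('data/vocabulary_min5k.txt', max_token_length=50)
+# char_ids = batcher.batch_sentences([['The', 'first', 'sentence', '.'], ['Second', '.']])
+# char_ids.shape -> (2, 6, 50); ids are shifted by +1 so that 0 is the mask value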
+
+
+class TokenBatcher(object):
+ '''
+ Batch sentences of tokenized text into token id matrices.
+ '''
+
+ def __init__(self, lm_vocab_file):
+ # def __init__(self, lm_vocab_file: str):
+ '''
+ lm_vocab_file = the language model vocabulary file (one line per
+ token)
+ '''
+ self._lm_vocab = Vocabulary(lm_vocab_file)
+
+ # def batch_sentences(self, sentences: List[List[str]]):
+ def batch_sentences(self, sentences):
+ '''
+ Batch the sentences as token ids
+ Each sentence is a list of tokens without <S> or </S>, e.g.
+ [['The', 'first', 'sentence', '.'], ['Second', '.']]
+ '''
+ n_sentences = len(sentences)
+ max_length = max(len(sentence) for sentence in sentences) + 2
+
+ X_ids = np.zeros((n_sentences, max_length), dtype=np.int64)
+
+ for k, sent in enumerate(sentences):
+ length = len(sent) + 2
+ ids_without_mask = self._lm_vocab.encode(sent, split=False)
+ # add one so that 0 is the mask value
+ X_ids[k, :length] = ids_without_mask + 1
+
+ return X_ids
+
+
+##### for training
+def _get_batch(generator, batch_size, num_steps, max_word_length):
+ """Read batches of input."""
+ cur_stream = [None] * batch_size
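+ # cur_stream[i] holds the remaining (token_ids, char_ids) of the sentence
+ # currently being consumed by batch row i; rows are refilled from the
+ # generator as they run out.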
+
+ no_more_data = False
+ while True:
+ inputs = np.zeros([batch_size, num_steps], np.int32)
+ if max_word_length is not None:
+ char_inputs = np.zeros([batch_size, num_steps, max_word_length],
+ np.int32)
+ else:
+ char_inputs = None
+ targets = np.zeros([batch_size, num_steps], np.int32)
+
+ for i in range(batch_size):
+ cur_pos = 0
+
+ while cur_pos < num_steps:
+ if cur_stream[i] is None or len(cur_stream[i][0]) <= 1:
+ try:
+ cur_stream[i] = list(next(generator))
+ except StopIteration:
+ # No more data, exhaust current streams and quit
+ no_more_data = True
+ break
+
+ how_many = min(len(cur_stream[i][0]) - 1, num_steps - cur_pos)
+ next_pos = cur_pos + how_many
+
+ inputs[i, cur_pos:next_pos] = cur_stream[i][0][:how_many]
+ if max_word_length is not None:
+ char_inputs[i, cur_pos:next_pos] = cur_stream[i][
+ 1][:how_many]
+ targets[i, cur_pos:next_pos] = cur_stream[i][0][1:how_many + 1]
+
+ cur_pos = next_pos
+
+ cur_stream[i][0] = cur_stream[i][0][how_many:]
+ if max_word_length is not None:
+ cur_stream[i][1] = cur_stream[i][1][how_many:]
+
+ if no_more_data:
+ # There is no more data. Note: this will not return data
+ # for the incomplete batch
+ break
+
+ X = {
+ 'token_ids': inputs,
+ 'tokens_characters': char_inputs,
+ 'next_token_id': targets
+ }
+
+ yield X
+
+
+class LMDataset(object):
+ """
+ Hold a language model dataset.
+
+ A dataset is a list of tokenized files. Each file contains one sentence
+ per line. Each sentence is pre-tokenized and white space joined.
+ """
+
+ def __init__(self,
+ filepattern,
+ vocab,
+ reverse=False,
+ test=False,
+ shuffle_on_load=False):
+ '''
+ filepattern = a glob string that specifies the list of files.
+ vocab = an instance of Vocabulary or UnicodeCharsVocabulary
+ reverse = if True, then iterate over tokens in each sentence in reverse
+ test = if True, then iterate through all data once then stop.
+ Otherwise, iterate forever.
+ shuffle_on_load = if True, then shuffle the sentences after loading.
+ '''
+ self._vocab = vocab
+ self._all_shards = glob.glob(filepattern)
+ print('Found %d shards at %s' % (len(self._all_shards), filepattern))
+ if test:
+ self._all_shards = list(np.random.choice(self._all_shards, size=4))
+ print('sampled %d shards at %s' % (len(self._all_shards), filepattern))
+ self._shards_to_choose = []
+
+ self._reverse = reverse
+ self._test = test
+ self._shuffle_on_load = shuffle_on_load
+ self._use_char_inputs = hasattr(vocab, 'encode_chars')
+
+ self._ids = self._load_random_shard()
+
+ def _choose_random_shard(self):
+ if len(self._shards_to_choose) == 0:
+ self._shards_to_choose = list(self._all_shards)
+ random.shuffle(self._shards_to_choose)
+ shard_name = self._shards_to_choose.pop()
+ return shard_name
+
+ def _load_random_shard(self):
+ """Randomly select a file and read it."""
+ if self._test:
+ if len(self._all_shards) == 0:
+ # we've loaded all the data
+ # this will propagate up to the generator in get_batch
+ # and stop iterating
+ raise StopIteration
+ else:
+ shard_name = self._all_shards.pop()
+ else:
+ # just pick a random shard
+ shard_name = self._choose_random_shard()
+
+ ids = self._load_shard(shard_name)
+ self._i = 0
+ self._nids = len(ids)
+ return ids
+
+ def _load_shard(self, shard_name):
+ """Read one file and convert to ids.
+
+ Args:
+ shard_name: file path.
+
+ Returns:
+ list of (id, char_id) tuples.
+ """
+ print('Loading data from: %s' % shard_name)
+ with io.open(shard_name, 'r', encoding='utf-8') as f:
+ sentences_raw = f.readlines()
+
+ if self._reverse:
+ sentences = []
+ for sentence in sentences_raw:
+ splitted = sentence.split()
+ splitted.reverse()
+ sentences.append(' '.join(splitted))
+ else:
+ sentences = sentences_raw
+
+ if self._shuffle_on_load:
+ print('shuffle sentences')
+ random.shuffle(sentences)
+
+ ids = [
+ self.vocab.encode(sentence, self._reverse)
+ for sentence in sentences
+ ]
+ if self._use_char_inputs:
+ chars_ids = [
+ self.vocab.encode_chars(sentence, self._reverse)
+ for sentence in sentences
+ ]
+ else:
+ chars_ids = [None] * len(ids)
+
+ print('Loaded %d sentences.' % len(ids))
+ print('Finished loading')
+ return list(zip(ids, chars_ids))
+
+ def get_sentence(self):
+ while True:
+ if self._i == self._nids:
+ self._ids = self._load_random_shard()
+ ret = self._ids[self._i]
+ self._i += 1
+ yield ret
+
+ @property
+ def max_word_length(self):
+ if self._use_char_inputs:
+ return self._vocab.max_word_length
+ else:
+ return None
+
+ def iter_batches(self, batch_size, num_steps):
+ for X in _get_batch(self.get_sentence(), batch_size, num_steps,
+ self.max_word_length):
+
+ # token_ids = (batch_size, num_steps)
+ # char_inputs = (batch_size, num_steps, 50) of character ids
+ # targets = word ID of next word (batch_size, num_steps)
+ yield X
+
+ @property
+ def vocab(self):
+ return self._vocab
+
+
+class BidirectionalLMDataset(object):
+ def __init__(self, filepattern, vocab, test=False, shuffle_on_load=False):
+ '''
+ bidirectional version of LMDataset
+ '''
+ self._data_forward = LMDataset(
+ filepattern,
+ vocab,
+ reverse=False,
+ test=test,
+ shuffle_on_load=shuffle_on_load)
+ self._data_reverse = LMDataset(
+ filepattern,
+ vocab,
+ reverse=True,
+ test=test,
+ shuffle_on_load=shuffle_on_load)
+
+ def iter_batches(self, batch_size, num_steps):
+ max_word_length = self._data_forward.max_word_length
+
+ for X, Xr in six.moves.zip(
+ _get_batch(self._data_forward.get_sentence(), batch_size,
+ num_steps, max_word_length),
+ _get_batch(self._data_reverse.get_sentence(), batch_size,
+ num_steps, max_word_length)):
+
+ for k, v in Xr.items():
+ X[k + '_reverse'] = v
+
+ yield X
+
+
+class InvalidNumberOfCharacters(Exception):
+ pass
diff --git a/PaddleNLP/ELMO/data/dev/sentence_file_3.txt b/PaddleNLP/ELMO/data/dev/sentence_file_3.txt
new file mode 100644
index 0000000000000000000000000000000000000000..b4e4c4010d1530d39a2b49dce43e39edd15d692d
--- /dev/null
+++ b/PaddleNLP/ELMO/data/dev/sentence_file_3.txt
@@ -0,0 +1,1000 @@
+考核 军官 , 应当 实行 领导 和 群众 相 结合 , 根据 军官 的 基本 条件 和 中央 军事 委员 会 规定 的 军官 考核 标准 、 程序 、 方法 , 以 工作 实绩 为主 , 全面 考核 。
+考核 结果 分 为 优秀 、 称职 、 不 称职 三 个 等次 , 并 作为 任免 军官 职务 的 主要 依据 。
+考核 结果 应当 告知 本人 。
+任免 军官 职务 , 应当 先 经 考核 ; 未经 考核 不 得 任免 。
+第 十二 条 军官 职务 的 任免 权限 : ( 一 ) 总 参谋 长 、 总 政治 部 主任 至正 师职 军官 职务 , 由 中央 军事 委员 会 主席 任免 ; ( 二 ) 副 师职 ( 正 旅 职 ) 、 正 团职 ( 副 旅 职 ) 军官 职务 和 高级 专业 技术 军官 职务 , 由 总 参谋 长 、 总 政治 部 主任 、 总 后勤 部 部长 和 政治 委员 、 总 装备 部 部长 和 政治 委员 、 大军 区 及 军 兵种 或者 相当 大军 区级 单位 的 正职 首长 任免 , 副 大军 区级 单位 的 正 团职 ( 副 旅 职 ) 军官 职务 由 副 大军 区级 单位 的 正职 首长 任免 ; ( 三 ) 副 团职 、 正 营 职 军官 职务 和 中级 专业 技术 军官 职务 , 由 集团 军 或者 其他 有 任免 权 的 军级 单位 的 正职 首长 任免 , 独立 师 的 正 营 职 军官 职务 由 独立 师 的 正职 首长 任免 ; ( 四 ) 副 营 职 以下 军官 职务 和 初级 专业 技术 军官 职务 , 由 师 ( 旅 ) 或者 其他 有 任免 权 的 师 ( 旅 ) 级 单位 的 正职 首长 任免 。
+前款 所 列 军官 职务 的 任免 , 按照 中央 军事 委员 会 规定 的 程序 办理 。
+第 十三 条 在 执行 作战 、 抢险 救灾 等 紧急 任务 时 , 上级 首长 有 权 暂时 免去 违抗 命令 、 不 履行 职责 或者 不 称职 的 所 属 军官 的 职务 , 并 可以 临时 指派 其他 军人 代理 ; 因 其他 原因 , 军官 职务 出现 空缺 时 , 上级 首长 也 可以 临时 指派 军人 代理 。
+依照 前款 规定 暂时 免去 或者 临时 指派 军人 代理 军官 职务 , 应当 尽快 报请 有 任免 权 的 上级 审核 决定 , 履行 任免 手续 。
+第 十四 条 作战 部队 的 军事 、 政治 、 后勤 、 装备 军官 平时 任职 的 最 高 年龄 分别 为 : ( 一 ) 担任 排 级 职务 的 , 三十 岁 ; ( 二 ) 担任 连 级 职务 的 , 三十五 岁 ; ( 三 ) 担任 营 级 职务 的 , 四十 岁 ; ( 四 ) 担任 团级 职务 的 , 四十五 岁 ; ( 五 ) 担任 师级 职务 的 , 五十岁 ; ( 六 ) 担任 军级 职务 的 , 五十五 岁 ; ( 七 ) 担任 大军 区级 职务 的 , 副职 六十三 岁 , 正职 六十五 岁 。
+在 舰艇 上 服役 的 营 级 和 团级 职务 军官 , 任职 的 最 高 年龄 分别 为 四十五 岁 和 五十岁 ; 从事 飞行 的 团级 职务 军官 , 任职 的 最 高 年龄 为 五十岁 。
+作战 部队 的 师级 和 军级 职务 军官 , 少数 工作 需要 的 , 按照 任免 权限 经过 批准 , 任职 的 最 高 年龄 可以 适当 延长 , 但是 师级 和 正 军职 军官 延长 的 年龄 最 多 不 得 超过 五 岁 , 副 军职 军官 延长 的 年龄 最 多 不 得 超过 三 岁 。
+第 十五 条 作战 部队 以外 单位 的 副 团职 以下 军官 和 大军 区级 职务 军官 , 任职 的 最 高 年龄 依照 本 法 第 十四 条 第 一款 的 相应 规定 执行 ; 正 团职 军官 , 任职 的 最 高 年龄 为 五十岁 ; 师级 职务 军官 , 任职 的 最 高 年龄 为 五十五 岁 ; 副 军职 和 正 军职 军官 , 任职 的 最 高 年龄 分别 为 五十八 岁 和 六十 岁 。
+第 十六 条 专业 技术 军官 平时 任职 的 最 高 年龄 分别 为 : ( 一 ) 担任 初级 专业 技术 职务 的 , 四十 岁 ; ( 二 ) 担任 中级 专业 技术 职务 的 , 五十岁 ; ( 三 ) 担任 高级 专业 技术 职务 的 , 六十 岁 。
+担任 高级 专业 技术 职务 的 军官 , 少数 工作 需要 的 , 按照 任免 权限 经过 批准 , 任职 的 最 高 年龄 可以 适当 延长 , 但是 延长 的 年龄 最 多 不 得 超过 五 岁 。
+第 十七 条 担任 排 、 连 、 营 、 团 、 师 ( 旅 ) 、 军级 主官 职务 的 军官 , 平时 任职 的 最 低 年限 分别 为 三年 。
+第 十八 条 机关 和 院校 的 股长 、 科长 、 处长 、 局长 、 部长 及 相当 领导 职务 的 军官 , 任职 的 最 低 年限 参照 本 法 第 十七 条 的 规定 执行 。
+机关 和 院校 的 参谋 、 干事 、 秘书 、 助理 员 、 教员 等 军官 , 每个 职务 等级 任职 的 最 低 年限 为 三年 。
+第 十九 条 专业 技术 军官 平时 任职 的 最 低 年限 , 按照 中央 军事 委员 会 的 有 关 规定 执行 。
+第 二十 条 军官 任职 满 最 低 年限 后 , 才能 根据 编制 缺额 和 本人 德才 条件 逐 职 晋升 。
+军官 德才 优秀 、 实绩 显著 、 工作 需要 的 , 可以 提前 晋升 ; 特别 优秀 的 , 可以 越 职 晋升 。
+第 二十一 条 军官 晋升 职务 , 应当 具备 拟 任 职务 所 要求 的 任职 经历 、 文化 程度 、 院校 培训 等 资格 。
+具体 条件 由 中央 军事 委员 会 规定 。
+第 二十二 条 军官 职务 应当 按照 编制 员额 和 编制 职务 等级 任命 。
+第 二十三 条 军 官 经 考核 不 称职 的 , 应当 调任 下级 职务 或者 改 做 其他 工作 , 并 按照 新任 职务 确定 待遇 。
+第 二十四 条 担任 师 、 军 、 大军 区级 职务 的 军官 , 正职 和 副职 平时 任职 的 最 高 年限 分别 为 十 年 。
+任职 满 最 高 年限 的 , 应当 免去 现任 职务 。
+第 二十五 条 根据 国防 建设 的 需要 , 军队 可以 向 非 军事 部门 派遣 军官 , 执行 军队 交付 的 任务 。
+第 二十六 条 军官 可以 按照 中央 军事 委员 会 的 规定 改任 文职 干部 。
+第 四 章 军官 的 交流 和 回避 第 二十七 条 军官 应当 在 不 同 岗位 或者 不 同 单位 之间 进行 交流 , 具体 办法 由 中央 军事 委员 会 根据 本 法 规定 。
+第 二十八 条 军官 在 一 个 岗位 任职 满 下列 年限 的 , 应当 交流 : ( 一 ) 作战 部队 担任 师级 以下 主官 职务 的 , 四 年 ; 担任 军级 主官 职务 的 , 五年 ; ( 二 ) 作战 部队 以外 单位 担任 军级 以下 主官 职务 的 , 五年 ; ( 三 ) 机关 担任 股长 、 科长 、 处长 及 相当 领导 职务 的 , 四 年 ; 担任 局长 、 部长 及 相当 领导 职务 的 , 五年 ; 但是 少数 专业 性 强 和 工作 特别 需要 的 除外 。
+担任 师级 和 军级 领导 职务 的 军官 , 在 本 单位 连续 工作 分别 满 二十五 年 和 三十 年 的 , 应当 交流 。
+担任 其他 职务 的 军官 , 也 应当 根据 需要 进行 交流 。
+第 二十九 条 在 艰苦 地区 工作 的 军官 向 其他 地区 交流 , 按照 中央 军事 委员 会 的 有 关 规定 执行 。
+第 三十 条 军官 之间 有 夫妻 关系 、 直系 血亲 关系 、 三 代 以内 旁系 血亲 关系 以及 近 姻亲 关系 的 , 不 得 担任 有 直接 上 下级 或者 间隔 一 级 领导 关系 的 职务 , 不 得 在 同 一 单位 担任 双方 直接 隶属 于 同 一 首长 的 职务 , 也 不 得 在 担任 领导 职务 一 方 的 机关 任职 。
+第 三十一 条 军官 不 得 在 其 原籍 所在 地 的 军 分区 ( 师级 警备 区 ) 和县 、 市 、 市 辖区 的 人民武装 部 担任 主官 职务 , 但是 工作 特别 需要 的 除外 。
+第 三十二 条 军官 在 执行 职务 时 , 涉及 本人 或者 涉及 与 本人 有 本 法 第 三十 条 所 列 亲属 关系 人员 的 利益 关系 的 , 应当 回避 , 但是 执行 作战 任务 和 其他 紧急 任务 的 除外 。
+第 五 章 军官 的 奖励 和 处分 第 三十三 条 军官 在 作战 和 军队 建设 中 做出 突出 贡献 或者 取得 显著 成绩 , 以及 为 国家 和 人民 做出 其他 较大 贡献 的 , 按照 中央 军事 委员 会 的 规定 给予 奖励 。
+奖励 分 为 : 嘉奖 、 三 等 功 、 二 等 功 、 一 等 功 、 荣誉 称号 以及 中央 军事 委员 会 规定 的 其他 奖励 。
+第 三十四 条 军官 违反 军纪 的 , 按照 中央 军事 委员 会 的 规定 给予 处分 。
+处分 分 为 : 警告 、 严重 警告 、 记过 、 记大过 、 降职 ( 级 ) 或者 降 衔 、 撤职 、 开除 军籍 以及 中央 军事 委员 会 规定 的 其他 处分 。
+第 三十五 条 对 被 撤职 的 军官 , 根据 其 所 犯 错误 的 具体 情况 , 任命 新 的 职务 ; 未 任命 新 的 职务 的 , 应当 确定 职务 等级 待遇 。
+第 三十六 条 军官 违反 法律 , 构成 犯罪 的 , 依法 追究 刑事 责任 。
+第 六 章 军官 的 待遇 第 三十七 条 军官 实行 职务 军衔 等级 工资 制 和 定期 增资 制度 , 按照 国家 和 军队 的 有 关 规定 享受 津贴 和 补贴 , 并 随着 国民 经济 的 发展 适时 调整 。
+具体 标准 和 办法 由 中央 军事 委员 会 规定 。
+军官 按照 规定 离职 培训 、 休假 、 治病 疗养 以及 免职 待 分配 期间 , 工资 照发 。
+第 三十八 条 军官 享受 公费 医疗 待遇 。
+有 关 部门 应当 做好 军官 的 医疗 保健 工作 , 妥善 安排 军官 的 治病 和 疗养 。
+军官 按照 国家 和 军队 的 有 关 规定 享受 军人 保险 待遇 。
+第 三十九 条 军官 住房 实行 公寓 住房 与 自有 住房 相 结合 的 保障 制度 。
+军官 按照 规定 住 用 公寓 住房 或者 购买 自有 住房 , 享受 相应 的 住房 补贴 和 优惠 待遇 。
+第 四十 条 军官 享受 休假 待遇 。
+上级 首长 应当 每年 按照 规定 安排 军官 休假 。
+执行 作战 任务 部队 的 军官 停止 休假 。
+国家 发布 动员 令 后 , 按照 动员 令 应当 返回 部队 的 正在 休假 的 军官 , 应当 自动 结束 休假 , 立即 返回 本 部 。
+第 四十一 条 军官 的 家属 随军 、 就业 、 工作 调动 和 子女 教育 , 享受 国家 和 社会 优待 。
+军官 具备 家属 随军 条件 的 , 经 师 ( 旅 ) 级 以上 单位 的 政治 机关 批准 , 其 配偶 和 未 成年 子女 、 无 独立 生活 能力 的 子女 可以 随军 , 是 农村 户口 的 , 转 为 城镇 户口 。
+部队 移防 或者 军官 工作 调动 的 , 随军 家属 可以 随调 。
+军官 年满 五十岁 、 身边 无 子女 的 , 可以 调 一名 有 工作 的 子女 到 军官 所在 地 。
+所 调 子女 已婚 的 , 其 配偶 和 未 成年 子女 、 无 独立 生活 能力 的 子女 可以 随调 。
+随军 的 军官 家属 、 随调 的 军官 子女 及其 配偶 的 就业 和 工作 调动 , 按照 国务院 和 中央 军事 委员 会 的 有 关 规定 办理 。
+第 四十二 条 军官 牺牲 、 病故 后 , 其 随军 家属 移交 政府 安置 管理 。
+具体 办法 由 国务院 和 中央 军事 委员 会 规定 。
+第 七 章 军官 退出 现役 第 四十三 条 军事 、 政治 、 后勤 、 装备 军官 平时 服 现役 的 最 低 年限 分别 为 : ( 一 ) 担任 排 级 职务 的 , 八 年 ; ( 二 ) 担任 连 级 职务 的 , 副职 十 年 , 正职 十二 年 ; ( 三 ) 担任 营 级 职务 的 , 副职 十四 年 , 正职 十六 年 ; ( 四 ) 担任 团级 职务 的 , 副职 十八 年 , 正职 二十 年 。
+第 四十四 条 专业 技术 军官 平时 服 现役 的 最 低 年限 分别 为 : ( 一 ) 担任 初级 专业 技术 职务 的 , 十二 年 ; ( 二 ) 担任 中级 专业 技术 职务 的 , 十六 年 ; ( 三 ) 担任 高级 专业 技术 职务 的 , 二十 年 。
+第 四十五 条 军官 未 达到 平时 服 现役 的 最 低 年限 的 , 不 得 退出 现役 。
+但是 有 下列 情形 之 一 的 , 应当 提前 退出 现役 : ( 一 ) 伤病 残 不 能 坚持 正常 工作 的 ; ( 二 ) 经 考核 不 称职 又 不 宜 作 其他 安排 的 ; ( 三 ) 犯有 严重 错误 不 适合 继续 服 现役 的 ; ( 四 ) 调离 军队 , 到 非 军事 部门 工作 的 ; ( 五 ) 因 军队 体制 编制 调整 精简 需要 退出 现役 的 。
+军官 未 达到 平时 服 现役 的 最 低 年限 , 要求 提前 退出 现役 未获 批准 , 经 教育 仍 坚持 退出 现役 的 , 给予 降职 ( 级 ) 处分 或者 取消 其 军官 身份 后 , 可以 作出 退出 现役 处理 。
+第 四十六 条 军官 达到 平时 服 现役 的 最 高 年龄 的 , 应当 退出 现役 。
+军官 平时 服 现役 的 最 高 年龄 分别 为 : ( 一 ) 担任 正 团职 职务 的 , 五十岁 ; ( 二 ) 担任 师级 职务 的 , 五十五 岁 ; ( 三 ) 担任 军级 职务 的 , 副职 五十八 岁 , 正职 六十 岁 ; ( 四 ) 担任 其他 职务 的 , 服 现役 的 最 高 年龄 与 任职 的 最 高 年龄 相同 。
+第 四十七 条 军官 未 达到 平时 服 现役 的 最 高 年龄 , 有 下列 情形 之 一 的 , 应当 退出 现役 : ( 一 ) 任职 满 最 高 年限 后 需要 退出 现役 的 ; ( 二 ) 伤病 残 不 能 坚持 正常 工作 的 ; ( 三 ) 受 军队 编制 员额 限制 , 不 能 调整 使用 的 ; ( 四 ) 调离 军队 , 到 非 军事 部门 工作 的 ; ( 五 ) 有 其他 原因 需要 退出 现役 的 。
+第 四十八 条 军官 退出 现役 的 批准 权限 与 军官 职务 的 任免 权限 相同 。
+第 四十九 条 军官 退出 现役 后 , 采取 转业 由 政府 安排 工作 和 职务 , 或者 由 政府 协助 就业 、 发给 退役 金 的 方式 安置 ; 有 的 也 可以 采取 复员 或者 退休 的 方式 安置 。
+担任 师级 以上 职务 和 高级 专业 技术 职务 的 军官 , 退出 现役 后作 退休 安置 , 有 的 也 可以 作 转业 安置 或者 其他 安置 。
+担任 团级 以下 职务 和 初级 、 中级 专业 技术 职务 的 军官 , 退出 现役 后作 转业 安置 或者 其他 安置 。
+对 退出 现役 由 政府 安排 工作 和 职务 以及 由 政府 协助 就业 、 发给 退役 金 的 军官 , 政府 应当 根据 需要 进行 职业 培训 。
+未 达到 服 现役 的 最 高 年龄 , 基本 丧失 工作 能力 的 军官 , 退出 现役 后作 退休 安置 。
+服 现役 满 三十 年 以上 或者 服 现役 和 参加 工作 满 三十 年 以上 , 或者 年满 五十岁 以上 的 军官 , 担任 师级 以上 职务 , 本人 提出 申请 , 经 组织 批准 的 , 退出 现役 后 可以 作 退休 安置 ; 担任 团级 职务 , 不 宜 作 转业 或者 其他 安置 的 , 可以 由 组织 批准 退出 现役 后作 退休 安置 。
+第 五十 条 军官 达到 服 现役 的 最 高 年龄 , 符合 国家 规定 的 离休 条件 的 , 可以 离职 休养 。
+因 工作 需要 或者 其他 原因 , 经过 批准 , 可以 提前 或者 推迟 离休 。
+第 五十一 条 军官 退出 现役 后 的 安置 管理 具体 办法 由 国务院 和 中央 军事 委员 会 规定 。
+军官 离职 休养 和 军级 以上 职务 军官 退休 后 , 按照 国务院 和 中央 军事 委员 会 的 有 关 规定 安置 管理 。
+第 八 章 附则 第 五十二 条 人民 解放 军 总 政治 部 根据 本 法 制定 实施 办法 , 报 国务院 和 中央 军事 委员 会 批准 后 施行 。
+第 五十三 条 中国 人民 武装 警察 部队 现役 警官 适用 本 法 , 具体 办法 由 国务院 和 中央 军事 委员 会 规定 。
+第 五十四 条 本 法 ( 原 称 《 中国 人民 解放 军 现役 军官 服役 条例 》 ) 自 1989 年 1 月 1 日起 施行 。
+1978 年 8 月 18 日 第 五 届 全国 人民 代表 大会 常务 委员 会 批准 、 1978 年 8 月 19 日 国务院 和 中央 军事 委员 会 颁布 的 《 中国 人民 解放 军 干部 服役 条例 》 同时 废止 。
+发布 部门 全国 人大 常委会 发布 日期 2000 年 12 月 28 日 实施 日期 1989 年 01 月 01 日 时效 性 现行 有效 效力 级别 法律 法规 类别 军官 军衔
+Don ' t Take The Girl 《 Don & amp ; apos ; tTakeTheGirl 》 是 TimMc Graw 的 音乐 作品 , 收录 在 《 NotAMomentTooSoon 》 专辑 中 。
+歌词 0Don ' t Take The Girl - TimMc GrawJohnny ' s daddy wastaking himfishing Whe nhe waseight yearsoldA little girl came through the front gate holding a fishing poleHis dad looked down and smiled , said we cant leave herbe hindSon I know youdont wanthertogo but someday you ' ll change your mind And Johnny said take Jimmy Johnson , take Tommy Thompson , take my best friend BoTake anybody that you want , aslong as she dont goTake anyboy in the world Daddy please , don ' t take the girl Same old boySame sweet girlTen years down the road He held hertight and kissed herlips Infront of the picture showS tranger came and pulled agunG rabbed herby the arm , saidif you do whatItell you to there wont beany harmAnd Johnny said take my money , take my wallet , take mycred it cards Said here ' s the watch that my grand pagave meHere ' s the keytomy carMister give it a whirl But pleased onttake the girl Same old boySame sweet girl Five years down ther oad There ' s going to beal ittle one and she says it ' s time to go Doctor says the baby ' s fine but you ' ll have to leave ' cause his momma ' s fading fast and johnny hithis knees and the rehe prayed Taketh every breath you gave me Take the heart from mychest I ' ll gladly takeher place if you ' llletme Make this my last request Take me out of this world God , pleased onttake the girl Said Johnny ' s daddy Wastaking himfishing Whe nhe waseight years old 所 属 专辑 NotA Moment TooSoon 音乐 时 长 4 分 9 秒
+恒 参 信道 恒 参 信道 的 特性 ( 参数 ) 不 随 时间 变化 。
+如果 实际 信道 的 性质 ( 参数 ) 不 随 时间 变化 , 或者 基本 不 随 时间 变化 , 或者 变化 极 慢 , 则 可以 认为 是 恒 参 信道 。
+一般 的 有 线 信道 可以 看作 是 恒 参 信道 , 部分 无 线 信道 可 看作 是 恒 参 信道 。
+特性 0 特性 幅 频 特性 和 相 频 特性 1 恒 参 信道 可以 看作 是 一 个 非 时 变 线性 网络 线性 网络 的 特性 可以 用 其 单位 冲 激 响应 h ( t ) ( 时域 ) 和 传输 特性 ( 频域 ) 表征 。
+其中 , 为 线性 网络 的 幅 频 特性 , 为 线性 网络 的 相 频 特性 。
+满足 任意 信号 经过 信道 不 会 失真 的 条件 为 : ( 1 ) 幅 频 特性 为 与 频率 无 关 的 常数 ; ( 2 ) 相 频 特性 为 频率 的 过 原点 的 线性 函数 。
+特性 时延 特性 2 时延 特性 表示 不 同 频率 的 正弦 型 信号 经过 信道 后 的 时延 与其 频率 的 关系 , 定义 为 时延 特性 为 常数 时 , 信号 传输 不 会 引起 波形 的 失真 。
+特性 群 时延 特性 3 所谓 群 时延 特性 就 是 相 频 特性 对角 频率 的 导数 , 定义 为 可以 用 群 时延 特性 来 衡量 网络 的 相 频 特性 。
+当 群 时延 特性 为 常数 时 , 信号 传输 不 会 引起 信号 包络 的 失真 。
+典型 的 恒 参 信道 0 典型 的 恒 参 信道 有 线 信道 1 一般 的 有 线 信道 均可 看作 是 恒 参 信道 。
+常见 的 有 线 信道 有 : 明线 、 对称 电缆 和 同轴 电缆 等 。
+明线 是 平行 而 相互 绝缘 的 架空 线路 , 其 传输 损耗 较小 , 通 频带 在 0 . 3 ~ 27 kHz 之间 ; 对称 电缆 是 在 同 一 保护 套 内 有 许多 对 相互 绝缘 的 双 导线 的 电缆 , 其 传输 损耗 比 明线 大 得 多 , 通 频带 在 12 ~ 250 kHz 之间 ; 同轴 电缆 由 同轴 的 两 个 导体 组成 , 外 导体 是 一 个 圆柱 形 的 空管 , 通常 由 金属 丝 编织 而成 。
+内 导体 是 金属 芯 线 。
+内外 导体 之间 填充 着 介质 。
+通常 在 一 个 大 的 保护 套 内 安装 若干 根 同 轴线 管 芯 , 还 装入 一些 二 芯 绞线 或 四 芯 线 组 用作 传输 控制 信号 。
+同 轴线 的 外 导体 是 接地 的 , 对 外界 干扰 起 到 屏蔽 作用 。
+同轴 电缆 分 小 同轴 电缆 和 中 同轴 电缆 。
+小 同轴 电缆 的 通 频带 在 60 ~ 4100 kHz 之间 , 增 音 段 长度 约 为 8km 和 4km , 中 同轴 电缆 的 通 频带 在 300 ~ 60000 kHz 之间 , 增 音 段 长度 约 为 6km 、 4 . 5km 和 1 . 5km 。
+典型 的 恒 参 信道 光纤 信道 2 光线 信道 是 以 光导 纤维 为 传输 媒质 、 以 光波 为 载波 的 信道 , 具有 极 宽 的 通 频带 , 能够 提供 极大 的 传输 容量 。
+光纤 的 特点 是 : 损耗 低 、 通 频 带宽 、 重量 轻 、 不 怕 腐蚀 以及 不 受 电磁 干扰 等 。
+典型 的 恒 参 信道 无 线 电视 距 中继 信道 3 无 线 电视 距 中继 通信 工作 在 超短波 和 微波 波段 , 利用 定向 天线 实现 视距 直线 传播 。
+由于 直线 视距 一般 在 40 ~ 50 km , 因此 需要 中继 方式 实现 长 距离 通信 。
+相邻 中继 站 间 距离 为 直线 视距 40 ~ 50 km 。
+由于 中继 站 之间 采用 定向 天线 实现 点 对 点 的 传输 , 并且 距离 较 短 , 因此 传输 条件 比较 稳定 , 可以 看作 是 恒 参 信道 。
+这种 系统 具有 传输 容量 大 、 发射 功率 小 、 通信 可靠 稳定 等 特点 。
+典型 的 恒 参 信道 卫星 中继 信道 4 卫星 通信 是 利用 人造 地球 卫星 作为 中继 转发 站 实现 的 通信 。
+当 人造 地球 卫星 的 运行 轨道 在 赤道 平面 上 、 距离 地面 35860 km 时 , 其 绕 地球 一 周 的 时间 为 24h , 在 地球 上 看到 的 该 卫星 是 相对 静止 的 , 因此 称 其 为 地球 同步 卫星 。
+利用 它 作为 中继 站 , 一颗 同步 卫星 能以 零 仰角 覆盖 全球 表面 的 42 % 。
+采用 三 颗 经度 相差 的 同步 卫星 作 中继 站 就 可以 几乎 覆盖 全球 范围 ( 除 南北 两极 盲区 外 ) 。
+由于 同步 卫星 通信 的 电磁 波 直线 传播 , 因此 其 信道 传播 性能 稳定 可靠 、 传输 距离 远 、 容量 大 、 覆盖 地域 广 , 广泛 用于 传输 多路 电话 、 数据 和 电视 节目 , 还 支持 Internet 业务 。
+中文 名 恒 参 信道 外文 名 parametric stabilization channel
+山药 紫 薯 酸奶 泥 用料 0 < table > 材料 用量 铁棍 山药 四分 之 一 自己 喜欢 的 酸奶 100 g < / table > 做法 0 < table > 1 . 1 ⃣ ️ 山药 蒸熟 冷却 加 酸奶 打 成 泥 。
+2 ⃣ ️ 装入 裱花 袋 造型 。
+3 ⃣ ️ 准备 就绪 可以 开动 啦 ^ _ ^ 2 . 换 成 紫 薯 也 OK 中文 名 山药 紫 薯 酸奶 泥 主要 食材 铁棍 山药 , 自己 喜欢 的 酸奶
+中国 商务 管理 创新 研究 2011 《 中国 商务 管理 创新 研究 ( 2011 ) 》 , 本 书 汇集 了 全国 高校 商务 管理 研究 会 第 26 次 年会 各 理事 单位 的 41 篇 论文 , 内容 涉及 扩大 内需 背景 下 居民 投资 与 消费 、 家族 企业 控制 权 代际 传承 、 创业 研究 、 “ 三网 融合 ” 进程 下 的 对策 、 团购 网站 的 商业 模式 及 日本 3 . 11 大 地震 引发 的 对 我 国 应急 管理 体系 建设 与 完善 的 思考 , 探讨 地方 商会 面临 的 困难 及 改善 对策 等 。
+0 《 中国 商务 管理 创新 研究 ( 2011 ) 》 , 本 书 汇集 了 全国 高校 商务 管理 研究 会 第 26 次 年会 各 理事 单位 的 41 篇 论文 , 内容 涉及 扩大 内需 背景 下 居民 投资 与 消费 、 家族 企业 控制 权 代际 传承 、 创业 研究 、 “ 三网 融合 ” 进程 下 的 对策 、 团购 网站 的 商业 模式 及 日本 3 . 11 大 地震 引发 的 对 我 国 应急 管理 体系 建设 与 完善 的 思考 , 探讨 地方 商会 面临 的 困难 及 改善 对策 等 。
+书名 中国 商务 管理 创新 研究 2011 出版 社 兰州 大学 出版 社 页数 352 页 定价 48 . 00 作者 廖 进球 王 学军 出版 日期 2011 年 7 月 1 日 ISB N73110 37077
+农用 化学 品 制造 技术 《 农用 化学 品 制造 技术 》 是 2012 年 科学 技术 文献 出版 社 出版 的 图书 , 作者 是 宋 小平 、 韩 长日 。
+10 < b > 内容 简介 < / b > < b > < / b > 《 精细 化工 品 实用 生产 技术 手册 》 是 一 部 有 关 精细 化学 品 的 技术 性 系列 丛书 。
+它 包括 有 机 化学 品 、 无机 化学 品 和 复配 型 化学 品 , 按照 印染 与 橡塑 助剂 、 日用 化工 品 、 涂料 、 药物 、 农药 、 香料 与 食品 添加 剂 、 染料 、 颜料 与 色料 、 电子 化学 品 与 信息 化学 品 、 胶粘 剂 、 精细 有 机 中间 体 、 洗涤 剂 、 表面 活性 剂 和 表面 处理 剂 、 皮革 纺织 及 造纸 化学 品 、 建筑 用 化学 品 、 化妆 品 、 精细 无机 化学 品 、 饲料 添加 剂 、 石油 化学 助剂 及 石油 产品 、 农用 化学 品 等 20 个 分册 出版 。
+本 书 为 农用 化学 品 分册 , 介绍 了 76 种 植物 生长 调节 齐 1 ] 、 78 种 杀虫 剂 、 58 种 杀菌 剂 、 25 种 除草 剂 、 31 种 化学 肥料 、 22 种 农用 薄膜 、 36 种 农副 化工 产品 和 67 种 其他 农用 化学 品 的 制造 技术 。
+对 每个 品种 的 产品 性能 、 生产 方法 、 生产 配方 、 生产 流程 、 生产 工艺 、 产品 标准 、 产品 用途 都 做 了 全面 系统 的 阐述 。
+是 一 本 内容 丰富 、 资料 翔实 、 实用 性 很 强 的 生产 技 工具 书 。
+本 书 在 编写 过程 中 , 参阅 和 引用 了 大量 国内 外 专利 及 技术 资料 , 书 末 列出 了 一些 主要 参考 文献 , 部分 产品 中 还 列出 了 相应 的 原始 的 研究 文献 , 书名 < a href = " # " > 农用 化学 品 制造 技术 < / a > 作者 宋 小平 、 韩 长日 IS BN 978750 2370282 类别 工程 技术 定价 48 . 00 元 出版 社 科学 技术 文献 出版 社 出版 时间 2012 年 03 月 装帧 平装 开本 32
+张 传玲 张 传玲 , 女 , 硕士 , 副 教授 。
+1983 年 7 月 毕业 于 黑龙江 商 学院 , 2008 年 获取 哈尔滨 理工 大学 硕士 学位 人物 经历 01983 年 7 月 毕业 于 黑龙江 商 学院 , 2008 年 获取 哈尔滨 理工 大学 硕士 学位 。
+自 1983 年 7 月 至今 从事 教学 科研 一 线 工作 。
+现 具有 电子 商务 师 、 ERP 管理 咨询 师 职业 资格 。
+主讲 课程 0 先后 多次 担任 《 网络 营销 》 、 《 电子 商务 概论 》 、 《 电子 商务 网站 运营 与 管理 》 、 《 电子 商务 实训 》 等 课程 的 理论 教学 、 实践 教学 和 毕业 论文 指导 工作 。
+现 为 电子 商务 专业 负责 人 , 省级 精品 课程 《 电子 商务 网站 运营 与 管理 》 课程 负责 人 。
+研究 方向 0 自 工作 以来 一直 致力 于 电子 商务 、 计算机 应用 方向 的 研究 。
+主要 贡献 0 主持 申报 计算机 应用 、 计算机 软件 、 电子 商务 等 多 个 专业 , 近 几 年 主持 《 高职 教育 工学 结合 模式 可 持续 发展 的 研究 》 、 《 高职 高专 面向 “ 三农 ” 的 电子 商务 应用 型 人才 培养 模式 的 研究 》 等 国家 级 科研 项目 三项 , 参与 省级 重点 科研 项目 《 济南 市 现代 物流 中心 和 物流 体系 建设 研究 》 的 科研 任务 两 项 。
+先后 在 核心 期刊 《 商业 研究 》 、 《 现代 企业 教育 》 、 《 中国 成人 教育 》 、 《 中国 校外 教育 》 等 省 部级 以上 刊物 发表 论文 多 篇 。
+获奖 记录 0 其中 《 从 培养 职业 能力 出发 探索 职业 教育 教学 方法 》 获 中国 职业 技术 教育 协会 二 等 奖 , 《 谈 职业 教育 毕业 生 择业 》 获 中国 职业 技术 教育 协会 三 等 奖 。
+2003 年 评 为 山东 省 供销 社 系统 优秀 教师 , 多次 被 评 为 院 级 优秀 教师 、 优秀 共产 党员 。
+中文 名 张 传玲 国籍 中国 民族 汉 出生 地 中国
+杨 家河 村 杨 家河 村 隶属 于 前卫 镇 石 河 村委 会 。
+位于 前卫 镇 西北 边 , 距离 前卫 镇 政府 4 公里 , 道路 为 水泥 路 , 交通 方便 。
+全村 国土 面积 5107 . 8 亩 。
+农业 0 海拔 1882 米 , 年 平均 气温 15 . 6 ℃ , 适合 种植 烤烟 、 洋芋 、 水稻 等 农 作物 。
+全村 耕地 面积 145 亩 , 林地 3450 亩 , 人均 耕地 0 . 59 亩 , 共 有 经济 林 果 地 14 亩 。
+全村 有 农户 71 户 , 共 253 人 , 其中 农业 人口 总数 为 253 人 , 劳动 力 161 人 。
+2010 年 全村 经济 总 收入 193 万元 , 农民 人均 纯 收入 4023 元 , 农民 收入 以 种 、 养殖 业 为主 。
+杨 家河 村 今后 发展 的 重点 : 发展 特色 产业 , 扩大 二 、 三 产业 。
+基础 设施 0 该 村 目前 已 实现 水 、 电 、 路 、 电视 、 电话 五 通 。
+全村 有 71 户 通 自来水 , 71 户 通电 , 拥有 电视 机 农户 71 户 ; 安装 固定 电话 或 拥有 移动 电话 的 农户 数 68 户 , 其中 拥有 移动 电话 农户 数 23 户 。
+该 村 进村 道路 为 水泥 路面 , 村内 主干 道 均 为 硬化 的 水泥 路面 。
+装有 太阳 能 15 户 , 建 有 小 水 窑 8 口 ; 耕地 有效 灌溉 面积 为 65 亩 。
+农村 经济 02010 年 杨 家河 村 农村 经济 总 收入 193 万元 , 其中 : 种植 业 收入 110 万元 , 占 农村 经济 总 收入 的 57 % ; 畜牧 业 收入 21 万元 , 占 农村 经济 总 收入 的 11 % ; 渔业 收入 12 万元 , 占 农村 经济 总 收入 的 6 % ; 林业 收入 11 万元 , 占 农村 经济 总 收入 的 6 % 。
+农民 人均 总 收入 7628 元 , 人均 纯 收入 4023 元 。
+该 村 村民 的 住房 以 土木 结构 住房 居住 为主 , 随着 农村 经济 的 发展 , 农民 群众 生活 水平 的 提高 , 杨 家河 村 目前 已有 22 户 居住 砖 ( 钢 ) 混 结构 住房 , 居住 于 土木 结构 住房 的 农户 49 户 。
+2010 年底 全村 拥有 汽车 5 辆 , 拖拉 机 37 辆 , 摩托 车 20 辆 。
+中文 名称 杨 家河 村 行政 区 类别 行政 村 所 属 地区 云南 江川 县 前卫 镇 电话 区号 0877 邮政 区码 652600 地理 位置 前卫 镇 西北 边 面积 5107 . 8 亩 人口 253 人 气候 条件 温带 气候 著名 景点 孤山 岛 、 明星 鱼洞 火车 站 江川 火车 站 车牌 代码 云 F 海拔 1882 米
+伦敦 巴 金 普瑞 米尔 酒店 伦敦 巴 金 普瑞 米尔 酒店 位于 伦敦 , 是 家 3 星级 酒店 。
+简介 0 伦敦 巴 金 普瑞 米尔 酒店 提供 餐厅 服务 , 提供 免费 的 停车 服务 。
+酒店 提供 儿童 设施 , 残疾 人 服务 等 。
+伦敦 巴 金 普瑞 米尔 酒店 酒店 还会 提供 一项 住宿 的 免费 政策 : 所有 15 岁 以下 的 儿童 在 使用 现有 的 床铺 时 不 需 付费 。
+周围 景观 0 West field Strat ford 购物 中心 、 格林威治 遗产 中心 、 伦敦 城市 机场 酒店 地址 0 High bridgeRoad , 纽 汉姆 , 伦敦 , IG117BA 交通 信息 0 酒店 交通 比较 方便 , 以 各 主要 景点 的 距离 如下 : West field Strat ford 购物 中心 : 直线 距离 约 5 公里 格林威治 遗产 中心 : 直线 距离 约 5 公里 伦敦 城市 机场 : 直线 距离 约 4 公里 房型 及 房价 0 房型 及 房价 房型 介绍 1 酒店 的 房型 有 多种 选择 , 提供 了 双 床 间 、 家庭 间 ( 2 位 成人 + 2 名 儿童 ) 、 每个 客房 都 配有 书桌 , 暖气 , 铺设 了 地毯 的 地板 , 吹 风机 , 免费 洗浴 用品 , 卫生 间 , 浴室 , 浴缸 或 淋浴 , 电视 , 电话 , 卫星 频道 , 沏茶 / 咖啡 设备 。
+房型 及 房价 房价 信息 2 双 床 间 1228 元 家庭 间 ( 2 位 成人 + 2 名 儿童 ) 885 元 设施 与 服务 0 电视 、 暖气 、 残疾 人 设施 、 无 烟 客房 、 携带 宠物 、 酒吧 、 餐厅 相关 条款 0 入住 时间 从 14 : 00 时 退房 时间 12 : 00 时 之 前 预订 取消 / 预付 政策 不 同 类型 的 客房 附带 不 同 的 取消 预订 和 预先 付费 政策 选择 上述 客房 时 , 请 参阅 客房 条款 。
+儿童 和 加床 所有 15 岁 以下 的 儿童 在 使用 现有 的 床铺 时 不 需 付费 。
+最 多 容纳 : 0 张 加床 。
+最 多 容纳 : 一 间 客房 增加 1 张 幼儿 床 。
+所 提出 的 任何 加床 或 婴儿 床 的 要求 均 需 获得 酒店 的 确认 。
+宠物 不 允许 携带 宠物 . 团体 当 预订 超过 5 间 客房 , 将 实施 不 同 的 政策 和 额外 补充 规定 。
+中文 名称 伦敦 巴 金 普瑞 米尔 酒店 英文 名称 Premier Inn Barking 房间 数量 88 酒店 星级 3星级
+宝贝 阿良 《 宝贝 阿良 》 是 在 云中 书城 连载 的 影视 同人 小说 , 作者 是 寒江 渔夫 。
+小说 类型 0 影视 同人 内容 简介 0 流浪 狗 阿良 偶遇 农村 小 青年 牛 刚 , 从此 他们 便 产生 了 割舍 不 掉 的 感情 。
+牛 刚 是 结婚 不 久 的 农村 青年 , 犹 由于 对 狗 的 喜爱 , 把 阿良 带回 了 家 , 可是 却 遭到 了 妻子 的 反对 。
+牛 刚 顶住 妻子 的 压力 , 还是 把 阿良 留在 了 家中 。
+. 阿良 凭借 着 自己 的 灵性 和 对 主人 的 忠诚 , 终于 打动 了 他们 , 成为 牛 刚 家 的 一员 … … 中文 名 宝贝 阿良 作者 寒江 渔夫 小说 进度 连载 连载 网站 云中 书城
+欧拉 旅馆 欧拉 旅馆 位于 圣保罗 。
+酒店 地址 0Rua Alce u Wamosy , 40 , Liberda de , 圣保罗 , CEP 04105 - 040 , 巴西 房型 介绍 0 酒店 的 房型 有 多种 选择 , 提供 了 6 床位 混合 宿舍 的 单人 床 、 4 床 女性 宿舍 间 的 1 张 床铺 、 双 床 间 - 带 共用 浴室 和 迷你 吧 、 双 人间 - 带 私人 浴室 、 双 床 间 - 带 共用 浴室 、 每个 客房 都 配有 阳台 , 电视 , 有 线 频道 , 电 风扇 , 木质 / 镶 木 地板 , 衣柜 / 衣橱 , 淋浴 , 卫生 间 , 浴室 。
+设施 与 服务 0 电视 、 洗衣 间 、 携带 宠物 、 儿童 加床 、 语言 酒店 提供 免费 wifi 。
+欧拉 旅馆 酒店 还会 提供 一项 住宿 的 免费 政策 : 不 接待 儿童 相关 条款 0 入住 时间 从 13 : 00 时 退房 时间 11 : 00 时 之 前 预订 取消 / 预付 政策 不 同 类型 的 客房 附带 不 同 的 取消 预订 和 预先 付费 政策 选择 上述 客房 时 , 请 参阅 客房 条款 。
+儿童 和 加床 酒店 不 接待 儿童 入住 。
+没 有 加床 。
+最 多 容纳 : 0 张 额外 的 床 / 婴儿 床 。
+宠物 不 允许 携带 宠物 . 团体 当 预订 超过 4 间 客房 , 将 实施 不 同 的 政策 和 额外 补充 规定 。
+中文 名称 欧拉 旅馆 英文 名称 Olah Hostel 房间 数量 8
+单据 托管 单据 托管 是 中国 银行 根据 客户 申请 , 为其 提供 单据 保管 , 到期 协助 收款 、 支付 等 服务 。
+特点 0 免去 保管 实物 单据 风险 单据 到期 后 按 客户 要求 自动 收款 、 支付 适用 范围 0 单位 客户 在 境内 开户 行 办理 单据 保管 时 , 可 使用 单据 托管 。
+业务 范围 0 托管 定期 存款 证实 书 、 托管 国债 凭证 、 托管 电汇 凭证 、 托管 银行 汇票 申请 书 、 托管 银行 本 票 申请 书 、 托管 各类 业务 回单 。
+办理 流程 01 、 中国 银行 客户 申请 办理 单据 托管 业务 时 , 应 填制 业务 申请 表 , 加盖 预留 银行 印 签 , 提交 开户 行 。
+经 开户 行 审核 后 , 交付 客户 加以 编号 的 托管 申请 表 回单 。
+2 、 中国 银行 客户 申请 解除 单据 托管 业务 时 , 应 填制 解除 业务 申请 表 , 加盖 预留 银行 印 签 , 提交 开户 行 。
+经 开户 行 审核 后 , 返还 客户 单据 实物 与 解除 申请 表 回单 。
+3 、 中国 银行 客户 申请 托管 的 定期 存款 证实 书 、 国债 凭证 , 到期 自动 办理 收款 入账 ; 托管 的 电汇 凭证 、 银行 汇票 / 银行 本 票 申请 书 到期 自动 办理 汇兑 或 签发 银行 汇票 / 银行 本 票 。
+提示 0 定期 存款 证实 书 、 国债 凭证 记载 的 存款 人名 称 应 与 托管 客户 名称 一致 。
+电汇 凭证 、 银行 汇票 / 银行 本 票 申请 书 等 填写 的 汇款 人 / 申请 人名 称 应 与 托管 客户 名称 一致 , 凭证 或 申请 书 上 应 明确 填写 要求 银行 到期 协助 支付 的 日期 及 支付 指令 。
+中文 名 单据 托管 受理 方 < a href = " # " > 中国 银行 < / a > 受理 内容 协助 < a href = " # " > 收款 < / a > 、 支付 等 委托 人 客户
+金汉斯 ( 大兴 店 ) 餐馆 类型 0 西 餐厅 适宜 商务 宴请 、 休闲 时光 、 休闲 小憩 、 随便 吃 吃 、 情侣 约会 、 家庭 聚会 、 朋友 聚餐 餐馆 简介 0 金汉斯 是 一 家 拉美 烤肉 自助 餐厅 , 为 食客 提供 包括 牛肉 、 羊肉 、 猪肉 、 禽类 、 海鲜 在内 的 几十 种 丰富 高 质量 的 自取 烤肉 ; 新鲜 蔬菜 、 西式 甜品 、 水果 沙拉 、 中式 冷菜 、 韩式 汤 羹 泡菜 、 日式 铁板 烧 、 粤 式 小吃 、 国际 饮料 、 新鲜 水果 及 口味 超 棒 的 德国 鲜 酿 啤酒 。
+特色 服务 : 是 老 字号 , 可以 刷卡 , 有 包厢 , 有 停车 位 , 有 表演 , 提供 早餐 , 生日 优惠 。
+推荐 菜 0 < table > 蜜汁 梅 肉 烤鸡 心 烤 翅 烤 腊肠 啤酒 鸡翅 烤鸭 胸 沙拉 烤肠 烤鸡 腿 梅 汁 肉 烤 鱿鱼 冰 虾 培根 烤 玉米 烤 牛舌 培根 金针 菇 烤 大虾 烤 虾 蜜汁 烤肉 酒 香肠 烤 羊腿 蜀 香 肥牛 ( V ) 果 味 烤肉 大虾 蒜 香 猪肉 西 冷 牛肉 炭火 烤 猪 肘 慕尼黑 烤肠 美式 烤 羊腿 圣保罗 鸡腿 额 吉 蒙 羊肉 南美 羊腿 精 烤 烤 玉米 腌肉 卷 < / table > 地址 0 北京 市 大兴 区 兴华 大街 三段 3 号 百联 清城 国际 购物 中心 3 层 餐馆 名称 金汉斯 ( 大兴 店 ) 地区 北京 市 人均 价格 53 营业 时间 11 : 00 - 14 : 00180022 - 00
+福建 民企 投资 有限 公司 福建 民企 投资 有限 公司 于 2013 年 8 月 15 日 在 福建 省 工商 行政 管理 局 登记 成立 。
+法定 代表 人 林 霖 , 公司 经营 范围 包括 对 农业 、 林业 、 工业 、 矿业 、 房地 产业 、 建筑 业 等 。
+企业 信息 0 公司 类型 有限 责任 公司 登记 机关 福建 省 工商 行政 管理 局 成立 时间 2013 年 8 月 15 日发 照 时间 2016 年 8 月 1 2 日
+生化 时代 的 战争 《 生化 时代 的 战争 》 是 百度 文学 旗 下 纵横 中文 网 签约 作家 金 蝎子 创作 的 一 部 架空 历史 小说 , 小说 已 于 2012 - 12 - 20 正式 发布 。
+0 < b > 内容 简介 < / b > < b > < / b > 金 博士 本事 从事 于 M 国 的 生物 进化 研究 , 用于 改造 地球 环境 等 服务 于 民众 的 好事 。
+然而 他 没 有 想到 的 是 , 背后 支持 他 的 人 却是 要 把 生物 的 研究 变成 生化 武器 用于 战争 。
+于是 。
+金 博士 , 带 着 他 的 实现 品 逃离 。
+在 逃离 的 路 上 。
+由于 逃跑 的 汽车 被 高压 电 击中 。
+产生 磁场 回到 了 1937 年 。
+现在 只能 看 用 生化 武器 拯救 人民 。
+中文 名称 生化 时代 的 战争 作者 金 蝎子 类型 架空 历史 连载 平台 纵横 中文 网 出版 日期 2012 - 12 - 20
+甜甜 圈 生活 社区 甜甜 圈 生活 社区 是 一款 网络 社区 类 软件 , 适用 于 mobile 平台 。
+运行 环境 0 支持 Android 2 . 1 应用 类型 0 网络 社区 类 软件 应用 介绍 0 甜甜 圈 是 一 个 神奇 的 生活 互动 工具 , 让 你 可以 与 任何 地点 的 人群 直接 对话 !
+说出 你 想 的 , 分享 你 有 的 , 更有 各种 达人 为 你 指点 生活 的 乐趣 。
+- 不 想 宅 在家 , 约 人 KTV ?
+@ 一 下 ‘ 附近 ‘ : 谁 去 一同 K歌 ?
+K 友 蜂拥 而至 !
+- 租房 买房 , 中介 挡路 ?
+@ 一 下 ‘ xx 小区 ‘ : 谁 的 房子 要 出租 ?
+房主 立马 出现 !
+- 工作 加班 , 懒得 翻 外卖 单 ?
+@ 一 下 ‘ 公司 周边 ‘ : 我 想来 份 大 排 饭 - 小吃 店 老板 迅猛 响应 !
+语音 视频 文字 图片 , 各种 强大 的 聊天 分享 功能 , 让 你 与 生活 更 贴近 。
+功能 特点 : 1 . 锁定 兴趣 发现 , 圈点 你 的 生活 网络 信息 多 到 爆炸 , 却 找 不 到 有 价值 的 ?
+甜甜 圈 让 你 的 兴趣 与 地理 相联 , 关心 的 话题 一 个 也 不 少 。
+2 . 你 的 地盘 你 做主 , 用户 商家 零 距离 在 甜甜 圈 点 击 ‘ 热点 ‘ 即可 看到 锁定 区域 附近 的 各类 消费 , 帮 你 玩转 身边 !
+还 可以 与 商家 用户 直接 聊天 。
+3 . 随时 随地 针对 性 发布 生活 点滴 甜甜 圈 与 LBS 作 了 无 缝 对接 与 融合 , 让 你 不仅 可以 分享 信息 给 私有 圈 的 好友 , 更 可 在 自 定义 的 关注 区域 有 针对 性 地 发布 。
+4 . 交互 式 畅 聊 , 轻松 相视 而 见 开启 甜甜 圈 聊天 , 私 聊 或 是 群 聊 让 你 随时 随地 眉目 传情 、 暗 送 秋波 !
+不 只是 文字 , 当然 也 可以 插入 图片 、 拍照 , 更 可用 语音 、 视频 来 互动 。
+5 . 拖拽 式 智能 化 分组 建 圈 甜甜 圈 支持 智能 匹配 通讯 录 和 微 博 好友 , 还 可以 通过 拖拽 式 的 圈子 管理 操作 , 将 好友 便捷 分类 。
+软件 名称 甜甜 圈 生活 社区 软件 平台 mobile 软件 版本 1 . 0 . 4
+错误 信息 632 为 电脑 中 错误 代码 632 The structuresize is in correct . 结构 大小 是 正确 的 . 需要 您 检查 计算机 的 设备 是否 运行 正常 和 网络 配置 是否 完好
+童 叶 翔童 叶 翔 , 1963 年 12 月 生 , 浙江 遂昌 人 , 博士 , 教授 , 博士 生 导师 , 广东 省 化学 会 秘书 长 、 中国 化学 会 化学 教育 委员 会 委员 等 。
+人物 履历 0 · 1981 . 9 - 1985 . 7 中山 大学 化学 系 化学 专业 获 学士 学位 · 1985 . 9 - 1988 . 7 中山 大学 化学 系 物理 化学 专业 获 硕士 学位 · 1996 . 9 - 1999 . 12 中山 大学 化 学院 有 机 化学 专业 获 博士 学位 · 1988 . 7 - 1991 . 4 中山 大学 化学 系 , 助教 · 1991 . 4 - 1995 . 12 中山 大学 化学 系 , 讲师 · 1995 . 12 - 2001 . 6 中山 大学 化学 系 , 副 教授 · 1996 . 2 遴选 为 中山 大学 重点 培养 教师 · 1996 . 6 遴 选 为 物化 专业 硕士 生 导师 · 1996 . 9 入选 广东 省 千 、 百 、 十 工程 计划 · 2000 . 1 – 2008 . 7 担任 中山 大学 化学 与 化工 学院 副 院长 , 分管 本科 教学 工作 · 2001 . 6 - 至今 中山 大学 化学 学院 , 教授 , 博士 生 导师 · 2000 . 3 - 至今 广东 省 表面 工程 协会 理事 和 广州 市 电镀 协会 理事 · 2000 . 3 - 至今 中国 化学 会 化学 教育 委员 会 委员 · 2003 . 8 - 至今 《 大学 化学 》 编委 。
+· 2006 . 6 - 至今 广东 省 稀土 协会 理事 。
+· 2006 . 6 - 至今 中国 金属 学会 冶金 物化 分会 委员 · 2007 . 11 - 至今 中国 化学 会 电 化学 专业 委员 会 委员 · 2009 . 3 - 至今 广东 省 化学 会 秘书 长 。
+讲授 课程 0 物理 化学 ( 本科 生 基础 课 ) 2 . 现代 物理 化学 专题 ( 硕士 生 学位 课 ) 3 . 化学 与 社会 ( 全校 公共 选修 课 ) 科研 项目 0 承担 教学 研究 项目 : 1 . 中山 大学 化学 基地 , 国家 基础 科学 人才 培养 项目 [ 条件 建设 ] ( J 0630428 ) , 2007 . 1 - 2009 . 122 . 以 化学 基地 为 载体 , 培养 创新 人才 的 研究 与 实践 , 2006 年度 广东 省 高等 教育 教学 改革 工程 项目 ( BKY BJG 20060203 ) , 2007 . 5 - 2009 . 123 . 中山 大学 化学 基地 , 国家 基础 科学 人才 培养 项目 [ 能力 提高 ] ( J 0730420 ) , 2008 . 1 - 2010 . 124 . 教育 部 “ 高等 学校 特色 专业 建设 点 ” 项目 , ( TS 10443 ) , 2008 . 1 - 2010 . 125 . 依托 “ 国家 级 实验 教学 示范 中心 ” , 提高 创新 人才 培养 质量 的 研究 与 实践 , 广东 省 本科 高等 教育 教学 改革 项目 ( BKJG 200702 ) , 2007 - 2009 。
+6 . 21 世纪 化学 创新 人才 培养 和 教学 改革 , 全国 高等 教育 科学 十五 规划 重点 课题 , 2003 . 1 - 2004 . 12 , 7 . 物理 化学 , 国家 理科 基地 名牌 课程 创建 优秀 项目 , 2002 . 12 - 2003 . 12 , 8 . 主持 国家 理科 基地 创建 名牌 课程 项目 物理 化学 课程 体系 改革 和 CAI 的 研究 2000 . 3 . ~ 2001 . 39 . 物理 化学 CAI 软件 的 研制 和 课程 体系 改革 , 国家 理科 基地 创建 名牌 课程 项目 , 1999 . 12 - 2000 . 12 , 10 . 化学 创新 人才 培养 研究 与 实践 , 新 世纪 广东 省 高等 教育 教学 改革 工程 项目 , 2002 . 1 - 2003 . 12 , 11 . 化学 专业 资源 库 建设 , 广东 省 教育 厅 151 工程 项目 , 2002 . 6 - 2003 . 12 , 课题 负责 人 近年 承担 科研 项目 : 1 . 广东 省 自然 科学 基金 重点 项目 , 2009 . 10 - 2012 . 10 . 2 . 国家 自然 科学 基金 重大 研究 计划 ( 90923008 ) , 2010 . 1 - 2012 . 12 . 3 . 国家 自然 科学 基金 ( 20873184 ) , 2009 . 1 - 2011 . 12 . 4 . 广东 省 科技 计划 项目 ( 2008 B 010600040 ) , 2008 . 9 - 2011 . 9 . 5 . 国家 自然 科学 基金 ( 20573136 ) , 2006 . 1 - 2008 . 12 . 6 . 广东 省 自然 科学 基金 ( 06023099 ) , 2006 . 9 - 2008 . 10 . 7 . 广东 省 自然 科学 基金 研究 团队 项目 ( 04205405 ) , 2005 . 1 - 2008 . 12 . 8 . 广东 省 科技 计划 项目 ( 2003 C 104030 ) , 2004 . 1 - 2006 . 12 . 9 . 广东 省 自然 科学 基金 ( 031584 ) , 2004 . 1 - 2005 . 12 . 获奖 情况 01 . 稀土 合金 熔 盐 电解 制备 的 基础 研究 获 1990 年 广东 省 自然 科学 三 等 奖 ( 排名 第 5 ) 。
+2 . 熔 盐 电 化学 方法 制备 稀土 金属 及 合金 的 研究 获 1997 年 广东 省 自然 科学 一等奖 ( 排名 第 3 ) 。
+3 . 含 羧基 与 多 胺基 的 金属 酶 模型 化合 物 的 研究 获 1998 年 广东 省 自然 科学 一等奖 ( 排名 第 2 ) 。
+4 . 获 1998 年 广东 省 南粤 教 坛 新秀 称号 。
+5 . 2002 年 被 教育 部 和 国家 基金 委 评 为 国家 理科 基地 建设 实施 先进 工作 者 6 . 2002 年 被 评 为 宝钢 教育 基金 优秀 教师 7 . 2003 年 被 评 为 中山 大学 教学 名师 8 . 化学 创新 人才 培养 的 研究 与 实践 2005 年 获 第 五 届 广东 省 高等 教育 教学 成果 一等奖 ( 排名 第 2 ) 9 . 化学 类 专业 《 化学 生物 学 》 课程 新 体系 的 建设 与 实践 2005 年 获 第 五 届 广东 省 高等 教育 教学 成果 一等奖 ( 排名 第 5 ) 10 . 金属 基 固体 化合 物 在 微 纳 空间 的 结构 与 合成 化学 研究 获 2007 年 中国 高校 自然 科学 一等奖 ( 排名 第 3 ) 11 . 化学 创新 人才 培养 体系 的 构建 与 实践 2009 年 获 第 六 届 广东 省 高等 教育 教学 成果 一等奖 ( 排名 第 1 ) 12 . 2009 年 获 广东 省 第 四 届 教学 名师 奖 主要 论著 01 . Dun linQu , Fangy anXie , HuiMeng , § LiGong , Wei hongZhang , JianChen , * GaorenLi , Peng Liu , and Yexiang Tong * : Pre paration and Characterization of Nanocry stalline CeO 2 - Tb2O 3 Films Obtained by Electrochemical Deposition MethodJ . Phys . Chem . C 2010 , 114 , 1424 – 1429 . 2 . Gao - RenLi , * Zhan - Ping Feng , Yan - NanOu , Ding caiWu , Ruo wenFu , and Yexiang Tong * : MesoporousM nO 2 / CarbonA erogel Composites asPromising Elect rodeMaterials for High - Performance Super capacitors . Lang muir , 2010 , 26 ( 4 ) , 2209 – 2213 . 3 . Gao - RenLi * , Dun - LinQu , Zi - Long Wang , Cheng - YongSu , Ye - Xiang Tong * and LaurentArurault : Ceria – terbia solid solutionna no belts with high catalyticactivities for COoxidation . Chem . Commun . , 2009 , 7557 - 7559 . 4 . Gao - RenLi * , Zi - Shou Zhang , Cheng - YongSu , andYe - Xiang Tong * : Electrochemical Synthes is of Tb - CoAlloyNa no particle Agg regates and Their Magnetic Properties . J . Phys . Chem . C , 2009 , 113 , 1227 – 1234 . 5 . Gao - RenLi , * Dun - LinQu , LaurentArurault , andYe - Xiang Tong * : Hierar chically Porous Gd 3 - Doped CeO 2 Nanostructures fortheRe marka bleEn hancement of Optical and Magnetic Properties . . J . Phys . Chem . C , 2009 , 113 , 1235 – 1241 . 6 . Gao - RenLi , * Wen - XiaZhao , QiongBu , andYe - Xiang Tong * : A novel electrochemical deposition route for the pre paration of Zn 1 - xCdxOn anorods with controllable optical properties . Electrochemistry Communications , 2009 , 11 , 282 – 285 . 7 . Gao - RenLi , * Ci - RenDawa , Xi - HongLu , Xiao - LanYu , andYe - Xiang Tong * : Use of Additives in the Electro deposition of Nanostructure dEu 3 / ZnOFilms for Photolumine scent Devices . Lang muir , 2009 , 25 , 2378 - 2384 . 8 . Gao - RenLi , * Dun - LinQu , andYe - Xiang Tong * : Electrochemical Grow thofCe 1 - xZrxO 2 Solid Solution Flower - Like Nanostructures and The irOptical and Magnetic Properties . J . Phys . Chem . C , 2009 , 113 ( 7 ) , 2704 – 27099 . Gao - RenLi , * QiongBu , Fu - LinZheng , Cheng - YongSu , andYe - Xiang Tong * : Electrochemically Controlla bleGrow thand Tunable Optical Properties of Zn 1 - xCdx OAlloy Nanostructures . Crystal Growth . 10 . Gao - RenLi * , Dun - LinQuan dYe - Xiang Tong * : Facile Fabrication of Magnetic Single - Crystalline CeriaNa no belts . Electrochemistry Communications , 2008 , 10 ( 1 ) , 80 - 84 . 11 . Gao - RenLi * , Xi - HongLuan dYe - Xiang Tong * : Electrochemical Reduction Synthes is of In - Sb Nanoropes and Terraced Micropy ramids . Electrochemistry Communications , 2008 , 10 ( 1 ) , 127 - 130 . 12 . Gao - RenLi * , Xi - HongLu , Cheng - YongSu , andYe - Xiang Tong * : Facile Synthes is of Hierar chical Z nO : Tb3 Nanorod Bundles and The irOptical and Magnetic Properties . J . Phys . Chem . C , 2008 , 112 , 2927 - 2933 . 13 . Gao - RenLi * , Dun - LinQu , Xiao - LanYuan dYe - Xiang Tong * : MicrostructuralE volution of CeO 2 fromPorous Structuresto Clusters of Nanosheet Array sAssisted by Gas Bubble svia Electro deposition . Lang muir , 2008 , 24 , 4254 – 4259 . 14 . 
Gao - RenLi * , Fu - LinZheng and Ye - Xiang Tong * : Controllable Synthes is of Bi2Te 3 Inter metallic Compounds with Hierar chical Nanostructures via Electrochemi calDe position Route . Cryst . Grow thDes . ; 2008 , 8 ( 4 ) , 1226 – 1232 . 15 . Gao - RenLi * , Xi - HongLu , Wen - Xia , Cheng - Yong SuandYe - Xiang Tong * : Controllable Electrochemical Synthes is of Ce 4 - Doped ZnONa no structures fromNa notubestoN anorods and Nanocages . Cryst . Grow thDes . ; 2008 , 8 ( 4 ) , 1276 – 1281 . 16 . Gao - RenLi , * Chen - Zhong Yao , Xi - HongLu , Fu - LinZheng , Zhan - Ping Feng , Xiao - LaYu , Cheng - YongSu , andYe - Xiang Tong * : Facile and EfficientElectrochemical Synthes is of PbTeD endritic Structures . Chem . Mater . 2008 , 20 , 3306 – 3314
+不 停 跳跃 《 不 停 跳跃 》 是 一款 敏捷 小 游戏 , 游戏 大小 为 10 20K 。
+基本 信息 0 不 停 跳跃 分类 : 敏捷 小 游戏 大小 : 10 20K 日期 : 2015 - 07 - 14 英文 名 : Lectro 专题 : H5 游戏 游戏 介绍 0 一颗 跳跃 的 圆球 渴望 旅行 , 于是 一 场 四处 跳跃 的 旅程 就 这样 开始 了 。
+游戏 中 , 玩家 控制 圆球 的 方向 , 发射 跳跃 到 另外 的 圆球 上面 。
+要 注意 的 是 一旦 错过 游戏 就 结束 了 !
+操作 指南 0 鼠标 操作 鼠标 点 击 , 发射 圆球 , 跳跃 到 另外 的 圆球 上面 。
+如何 开始 0 游戏 加载 完毕 点 击 方向 键 按钮 开始 游戏 游戏 目标 0 合理 发射 圆珠 , 成功 撞 上 获取 高分 。
+
+辽宁 达能 电气 股份 有限 公司 辽宁 达能 电气 股份 有限 公司 于 2002 年 08 月 27 日 在 沈阳 市 工商 行政 管理 局 登记 成立 。
+法定 代表 人 高 艺 , 公司 经营 范围 包括 承 装 电力 设施 ( 承 装 三级 、 承修 三级 ) ; 建筑 工程 等 。
+企业 信息 0 公司 类型 股份 有限 公司 ( 非上市 、 自然 人 投资 或 控股 ) 登记 机关 沈阳 市 工商 行政 管理 局 成立 时间 2002 年 08 月 27 日发 照 时间 2016 年 10 月 19 日
+中国 人 为什么 活 得 累 作者 : 杨 连宁 著 出版 社 : 东方 出版 社 出版 时间 : 2014 - 08 基本 信息 0ISBN : 978 - 7 - 5060 - 4966 - 5 页数 : 238 页 价格 : 39 . 00 元 内容 简介 0 有 网友 曾 用 一 个 字 总结 中国 人 的 生活 : 物价 一 个 字 : 涨 ; 房子 一 个 字 : 贵 ; 交通 一 个 字 : 堵 ; 空气 一 个 字 , 毒 ; 没 钱 一 个 字 : 怂 ; 有 权 一 个 字 : 牛 ; 信用 一 个 字 : 缺 ; 感情 一 个 字 : 乱 ; 归根 结底 一 个 字 : 累 。
+而且 , 不管 你 有 钱 没 钱 , 有 权 没 权 , 有 道德 没 道德 , 只是 累 的 方式 不 同 , 累 的 感觉 都 一 样 。
+中国 人 为什么 活 得 累 ? 我们 都 在 寻找 答案 。
+但 类似 于 “ 雾霾 天 能 做 的 就 是 关 上 门窗 , 尽量 不 让 雾霾 进 到 家 里 ” 这样 的 心灵 鸡汤 看多 了 , 不仅 没 有 让 我们 活 得 更 轻松 , 反而 越来 越 腻味 。
+这本 书 虽然 外表 秀色 可 餐 , 但 内容 却 不 是 心灵 鸡汤 , 而是 用 犀利 又 不 失 幽默 , 通俗 又 不 失 深刻 的 笔法 , 不仅 将 这些 我们 感 同 身受 的 “ 累 ” 事 一一 道 出 , 而且 从 经济 、 社会 、 历史 、 道德 等 角度 , 深挖 了 “ 累 ” 的 根源 , 为 我们 拉直 “ 为什么 活 得 累 ” 这个 大 问号 。
+明白 了 艰辛 生活 的 社会 源头 , 虽然 不 是 获得 轻松 的 全部 , 但 却是 第 一 步 。
+如果 您 想 活 得 不 累 , 请 先 “ 累 ” 一 次 读完 本 书 。
+目录 0 < b > A 篇 成果 不 能 分享 , 利益 冲突 就 多 < / b > < b > < / b > “ 五子 登科 ” 变成 了 “ 五大 累 ” 你 的 “ 财富 之 母 ” 被 侵权 了 高 房价 、 高 油价 = 高 征税 谁 是 “ 无 壳 蜗牛 ” “ 无 腿 螃蟹 ” ? 五 座 “ 大山 ” , 压力 山大 弱 权 化 生存 PK 特权 化 生存 民 穷国 富 , 不 是 真富 不 怕 吃苦 , 就怕 吃亏 < b > B 篇 通胀 是 全民 加税 , 印钞 要 全民 还债 < / b > < b > < / b > 为什么 乍 富 还 穷 ? 靠 倒腾 房子 能 过 上 好 日子 吗 ? 最 怕 “ 一 顿 吃 伤 , 三 顿 喝汤 ” 穷 日子 能 当 富 日子 过 吗 ? 被 通胀 偷 了 钱 , 你 没 处 报案 也 没 处 说理 印钞 印 成 了 全民 加税 , 全民 还债 纸面 财富 是 镀金 的 “ 货币 幻觉 ” 是 “ 诈 和 ” 天机 不 可 泄露 的 货币 双 发 < b > C 篇 拼 爹 OR 近亲 退化 , 弱 肉 强 食 OR 返祖 < / b > < b > < / b > 拼 爹 导致 近亲 退化 内亲 外 疏 , 肥水 不 落 外人 田 别 把 市场 经济 变成 官场 经济 镀金 时代 也是 扒 粪 时代 傍 官 傍 权 的 食 腐 动物 太多 啦 垄断 与 投机 大行 其 道 不 想 茹 毛 饮 血 , 就 不 能 弱 肉 强 食 < b > D 篇 道德 越 低 成本 越 高 , 良知 越 少 灾祸 越多 < / b > < b > < / b > 凭 什么 道德 越 少 牟利 越多 ? 道德 雪崩 时 , 哪 片 雪花 有 负 罪 感 ? 医治 “ 傅立叶 变态 心理 综合 征 ” 德福 一致 还是 德福 悖 反 ? 幸福 不仅 在 丰衣 足 食 上 做人 = 装 人 吗 ? 善 欲 人 见 , 不 是 真 善 ; 恶 恐 人知 , 才 是 大 恶 不 做 权力 动物 和 金钱 动物 找回 人性 才能 找回 人格 尊严 别 再 “ 假装 智慧 ” 了 中文 名 中国 人 为什么 活 得 累
+狂 性 暴君 《 狂 性 暴君 》 是 一 部 小说 , 作者 是 孟 琴 2 , 连载 网站 是 3G 书城 。
+小说 类型 0 综合 其它 内容 简介 0 五年 前 , 寻找 目标 , 他 强暴 了 她 。
+五年 后 , 重 寻 目标 , 目的 是 带回 她 生 下 的 胎儿 。
+五年 前 , 莫名 其妙 , 她 被 强暴 。
+五年 后 , 匪 夷 所 思 , 新婚 之 日 与 儿子 一 起 遭遇 强 掳 。
+他 脾气 暴躁 狂 性 十足 , 将 她 丢 进 堡 内 只许 进 、 不 许 出 。
+她 性情 温柔 善 解 人意 , 不 愿 服 他 的 专 暴 携子 摸黑 偷逃 。
+他 一 路 追杀 再次 将 她 丢 进 牢笼 , 她 反抗 不 从 又 被 强暴 了 好 几 次 。
+本 不 爱 , 一旦 爱 上 他 对 她 的 占有 强烈 到 极点 , 令人 窒息 、 令人 心慌 意乱 。
+“ 我 让 你 再 反抗 ! ” 暴戾 的 撕裂 她 的 衣裳 , 健 躯 直 压 , 一 场 烈火 翻云 的 强暴 。
+“ 你 这 辈子 都 休想 逃开 我 ! ” 醋意 蔓延 扩散 , 钳住 她 的 下巴 狠狠 的 吻 住 了 她 颤抖 拒绝 的 唇瓣 。
+儿子 站 在 娘亲 一边 , 拥有 与 父 相同 暴燥 性情 。
+双狮 争斗 嘶 咬 , 总能 将 父 气 得 火 冒 三 丈 狂 性 大发 想 杀人 ! 夹 在 两人 中间 , 她 既 头痛 又 心痛 … … 中文 名 狂 性 暴君 作者 孟 琴 2 小说 进度 已 完结 连载 网站 3G 书城
+通俗 日报 《 通俗 日报 》 原 在 郑州 出版 , 名 为 《 郑州 通俗 日报 》 。
+1938 年 6 月 该 报 由 河南 郑州 迁至 陕西 宝鸡 出版 , 更名 为 《 通俗 日报 》 ( 系 4 开 4 版 ) 。
+发行 人 兼 社长 孟 紫萍 , 副 社长 兼 总 编辑 杜 健英 , 经理 刘 亚亭 。
+记者 有 孙 冀春 、 肖 风 、 黄 宗勋 、 梁 儒 等 。
+日 销 数 1000 余 份 。
+0 《 通俗 日报 》 主要 转载 政治 新闻 及 战事 消息 。
+其 值得 一 提 的 是 署名 “ 通俗 社 记者 ” 的 长篇 通讯 , 文笔 清新 , 叙事 周 祥 , 对 宝鸡 抗战 后期 的 政治 、 经济 、 工业 建设 、 文化 事业 作 了 全面 的 介绍 。
+被 收入 1946 年 出版 的 《 最近 宝鸡 乡土 志 》 , 保存 了 不 少 珍贵 的 民国 时期 地方 史料 。
+《 通俗 日报 》 出刊 期间 , 因 在 新闻 报道 中 触及 教育 界 人士 , 曾 招致 报社 两度 被 捣毁 。
+第 1 次 是 在 1946 年 春 , 一 位 记者 在 采访 宝鸡 县 参议 会后 所 写 的 新闻 稿 《 会场 侧 听 记 》 中 , 描述 “ 小学 教师 白天 打牌 , 输光 了 作 小偷 ” , 激起 部分 学校 教师 强烈 不 满 。
+事后 不 久 , 复兴 小学 及 实验 小学 师生 遂 乘 一 次 集体 游行 示威 结束 之 机 , 对 《 通俗 日报 》 社 设在 东关 大街 的 经理 部 进行 突袭 , 该 部 的 橱窗 、 架 阁 、 电灯 等 各种 用具 均 被 彻底 捣毁 。
+事发 后 , 《 通俗 日报 》 当即 在 头版 登载 《 呼吁 书 》 , 并 由 社长 孟 紫萍 向 宝鸡 警备 司令 部 、 九 区 专员 公署 等 军政 机关 多次 呼吁 , 要求 严厉 制裁 肇事 者 , 但 终 因 宝鸡 县长 杨 炳南 有 意 偏袒 闹事 校方 , 使 这 一 事件 拖延 至 最后 不 了 了 之 。
+《 通俗 日报 》 社 再次 被 捣毁 , 是 在 1947 年 冬初 。
+报纸 在 采用 宝鸡 警备 司令 刘 某之 子 ( 系 宝鸡 县立 中学 学生 ) 批评 学术 管理 制度 的 稿件 时 另 制 标题 , 题为 : “ 县 中 教师 是 木 刀 , 学生 是 木偶 ” 。
+稿件 见报 后 , 舆论 大 哗 , 县 中 师生 群情 激愤 , 怒 不 可 遏 。
+没 过 几 天 , 县 中学生 即 在 学校 大 礼堂 集合 , 选派 代表 前往 报社 质问 。
+紧接 着 , 大批 学生 又 随后 赶 至 报社 大院 , 学生 到达 报社 编辑 部 时 , 恰 遇 社长 孟 紫萍 , 双方 展开 争辩 , 孟 所 戴 礼帽 被 学生 戳 掉 于 地 , 用 脚 乱 踩 , 报社 副刊 编辑 同时 遭到 质问 、 围攻 。
+霎时 间 其余 学生 将 报社 社 牌 , 办公 室 的 电话 机 、 吊灯 、 沙发 、 桌椅 等 砸 得 稀烂 。
+撤离 途中 , 想起 排字 房 还 安然 无 恙 , 又 返回 报社 , 将 全部 铅字 推 乱 , 丢 之 满地 , 使 该 报 彻底 失去 出刊 能力 。
+待 宪 、 警 闻讯 赶赴 现场 时 , 肇事 者 已 无 踪影 。
+事件 发生 后 , 社长 孟 紫萍 立即 向 国民党 陕西 省 政府 、 九 区 专署 、 宝鸡 县 党政 机关 接连 呼救 , 同时 还 向 国民党 “ 中央 社 ” 求援 ( 孟 是 中央 社 特约 通讯 员 ) 。
+少数 为 首 的 肇事 学生 , 见 事件 发生 当天 傍晚 , 学校 门口 军警 密布 , 均 连夜 逃 之 夭夭 。
+嗣后 , 第 九 区 行政 督察 专员 兼 宝鸡 县长 孙 宗复 亲自 出面 干预 , 将 县 中 校长 尹 培业 撤职 。
+报社 所 遭受 损失 , 由 宝鸡 县 商会 理事 长 李 生润 ( 孟 的 同乡 ) 在 商界 捐助 部分 款物 弥补 。
+《 通俗 日报 》 在 停刊 5 天后 , 始 正式 复刊 。
+中文 名 通俗 日报 含义 原 在 郑州 出版 类型 报刊 地点 郑州
+海清 寺 海清 寺 法 脉 为 临济 正宗 , 始建 于 唐代 , 曾 是 历史 上 苏北 、 鲁南 地区 规模 和 影响 最 大 的 佛教 寺院 之 一 。
+鼎盛 时期 高僧 云集 , 法音 远 达 , 人 天 普济 。
+据 载 : 宋 天圣 二年 正月 设 供 僧 850 人 , 宋 天圣 三年 设 供 僧 900 人 , 足见 其 香火 鼎盛 。
+可惜 的 是 在 58 年 修建 大 村 水库 时 , 该 寺 被 完全 拆毁 。
+景点 介绍 0 海清 寺 法 脉 为 临济 正宗 , 始建 于 唐代 , 曾 是 历史 上 苏北 、 鲁南 地区 规模 和 影响 最 大 的 佛教 寺院 之 一 。
+鼎盛 时期 高僧 云集 , 法音 远 达 , 人 天 普济 。
+据 载 : 宋 天圣 二年 正月 设 供 僧 850 人 , 宋 天圣 三年 设 供 僧 900 人 , 足见 其 香火 鼎盛 。
+可惜 的 是 在 58 年 修建 大 村 水库 时 , 该 寺 被 完全 拆毁 。
+据传 过去 海青 寺 是 花果 山 三 元 宫 的 下院 , 不管 是 到 这 游山 观景 者 、 还是 到 三 元 宫 朝山 进香 者 , 他们 都 是 先由 南城 渡过 风 急 浪 高 的 恬 风 渡 , 然后 到 海青 寺 歇脚 留宿 , 养 足 精神 第 二 天 开始 进 山 赏玩 。
+即使 在 云 台山 与 陆地 相连 以后 , 人们 仍旧 把 海青 寺 作为 理想 的 歇息 处所 , 从 明朝 的 海州 判官 唐 伯元 到 民国 期间 走遍 名山 古寺 的 高 鹤年 居士 都 莫 不 如此 。
+高 鹤年 居士 在 他 的 《 名山 游 访 记 》 中 曾 对 这里 的 境况 作 了 详细 的 记述 “ 大 村长 桥 飞瀑 , 有 王 朗 墓 , 老君堂 。
+上海 清 寺 宿 。
+寺 前 有 大塔 九级 。
+高 十馀 丈 。
+远近 瞻仰 。
+相传 傍 有 三 元祖 墓 。
+前人 昕 称 桃花 源 · · · · · · ” 。
+地址 0 花果 山
+矮人 与 公主 白雪 公主 与 邻国 王子 过 了 一 段 快乐 幸福 的 日子 。
+可是 好 景 不 长 , 王子 带兵 出征 、 恶毒 王后 卷 土 重来 使得 白雪 公主 返回 森林 。
+白雪 公主 与 小 矮人 一 起 召集 森林 中 散居 的 矮人 建设 村落 抵抗 恶毒 王后 。
+他们 建造 房屋 、 种植 农 作物 、 修建 桥梁 , 巡逻 森林 , 每天 的 生活 都 很 充实 快乐 。
+恶毒 的 王后 很 不 甘心 , 一 次 又 一 次 的 打乱 这 快乐 的 生活 … … 游戏 简介 0 这 是 一款 装扮 类 SNS 游戏 游戏 简介 游戏 名称 : 1 中文 : 公主 与 矮人 英文 : The Snow White andDwarfs 游戏 简介 游戏 类型 : 2 装扮 , 经营 游戏 简介 故事 背景 : 3 白雪 公主 与 邻国 王子 过 了 一 段 快乐 幸福 的 日子 , 因为 边疆 战事 吃紧 , 王子 亲自 带兵 出征 , 白雪 公主 独自 生活 在 宫殿 里 。
+坠 下 山崖 的 恶毒 王后 并 没 有 被 摔死 , 不幸 的 是 她 变得 更 丑 了 , 于是 嫉妒 变成 了 仇恨 , 她 发誓 一定 要 置 公主 于 死地 。
+侥幸 逃脱 魔掌 的 白雪 公主 无 处 可 去 , 最终 又 回到 森林 里 , 小 矮 人们 真诚 而又 热情 的 欢迎 了 她 , 并 发誓 要 保护 美丽 善良 的 公主 。
+白雪 公主 非常 感动 , 决定 和 矮 人们 一 起 好好 保护 自己 。
+经过 商议 , 他们 决定 召集 森林 中 散居 的 矮人 聚居 成 村落 , 一 起 生活 , 抵抗 恶毒 王后 。
+于是 矮 人们 在 公主 的 指挥 下 , 开始 在 原来 的 房子 周围 建造 新 房屋 、 种植 蔬菜 水果 等 作物 、 修建 桥梁 , 并 派出 矮人 到 森 里 中 寻找 其他 矮人 。
+慢慢 的 , 矮人 越来 越多 , 他们 形成 了 一 个 个 村落 , 每天 快乐 的 生活 着 , 恶毒 的 王后 很 不 甘心 , 一 次 又 一 次 的 打乱 这 快乐 的 生活 … … 创意 来源 0 创意 来源 于 《 白雪 公主 》 。
+白雪 公主 与 七 个 小 矮人 相信 是 在 我们 儿时 被 经常 提起 的 童话 故事 。
+《 公主 与 矮人 》 这款 游戏 讲述 的 是 那 童话 故事 之后 的 故事 。
+中文 名 矮人 与 公主 原版 名称 The Snow White andDwarfs 游戏 类型 装扮 , 经营 类别 SNS 游戏
+虎门 东方 小学 学校 建于 1985 年 , 前身 是 虎门 镇 中心 小学 , 1991 年 易名 为 “ 虎门 镇 东方 小学 ” 。
+2000 年 镇 政府 重点 投资 建设 后 , 新建 2 号 教学 楼 , 装修 1 号 旧 楼 , 成为 一 所有 一定 规模 、 有 一定 档次 的 城区 小学 。
+学校 介绍 0 学校 现有 教学 班 29 个 , 学生 1355 人 。
+教 职工 89 人 , 中共 党员 21 人 , 共青 团员 16 人 , 大专 学历 以上 76 人 , 本科 学历 63 人 , 高级 教师 64 人 , 一 级 教师 19 人 。
+学校 占地 面积 11712 平方 米 , 建筑 面积 11633 平方 米 。
+两 幢 教学 楼 共 有 课室 31 间 , 功能 室 32 间 , 一 个 宽阔 的 大 球场 ( 200 米 环行 塑胶 跑道 、 3 个 篮球 场 ) 、 一 个 内 球场 、 雨天 运动 场 。
+篮球 场 、 羽毛 球场 、 乒乓 球台 及 有 关 体育 器械 教学 设备 基本 符合 标准 。
+校园 文化 氛围 浓郁 , 各类 宣传 栏 、 壁挂 画框 等 集 教育 性 、 知识 性 、 艺术 性 于 一体 。
+校园 整洁 有 序 , 干净 清爽 , 两 个 生物 园 , 以及 周围 花圃 草地 、 绿树 红花 , 使 校园 更 绿 、 更 美 、 更有 希望 、 更有 生命 力 , 是 孩子 们 健康 成长 的 乐园 。
+学校 办学 思路 清晰 , 行政 、 新 老 教师 严谨 敬业 、 团结 向上 , 老师 们 重视 理论 学习 , 不 断 更新 观念 , 用好 电 教化 辅助 教学 设备 , 提高 教学 效率 , 全体 教师 抓 安全 、 抓 风纪 , 争取 正确 的 教育 活动 和 良好 的 教育 方式 , 后进 生 得到 进步 , “ 德育 为 首 、 五 育 并举 ” , 推进 素质 教育 , 与 时 俱 进 , 不 断 创新 。
+教育 教学 质量 得到 家长 、 社会 的 认同 , 是 市 的 文明 学校 。
+校歌 0 < b > 要 做 新 世纪 的 主人 < / b > < b > < / b > —— 东方 小学 校歌 全体 教师 词 平 波 执笔 魏 金培 曲 五爱 塑造 了 我们 的 灵魂 四有 铸就 了 我们 的 身心 五星 火炬 前头 照 雏鹰 展翅 雏鹰 展翅 奔向 光明 爱 学 求真 东方 育英 诚实 有 礼 活泼 创新 努力 攀登 科学 高 峰 要 做 新 世纪 的 主人 五爱 塑造 了 我们 的 灵魂 四有 铸就 了 我们 的 身心 五星 火炬 前头 照 雏鹰 展翅 雏鹰 展翅 奔向 光明 团结 向上 东方 精神 面向 世界 面向 未来 面向 现代 化 要 做 新 世纪 的 主人 校训 0 爱 学 求真 、 诚实 有 礼 、 活泼 健康 校风 0 严谨 、 敬业 、 团结 、 向上 教 风 0 诲 人 不 倦 、 全面 育人 学风 0 自主 、 合作 、 探究 校徽 0 东方 小学 校徽 说明 : 校徽 图案 由 “ 东方 ” 二 字 拼音 首 字母 变形 造型 , 可 使人 联想 到 展翅 的 雏鹰 , 充满 生气 的 幼苗 , 托起 明天 的 太阳 。
+发展 规划 0 < b > 发展 背景 < / b > < b > < / b > < b > 学校 概况 < / b > < b > < / b > 虎门 镇 东方 小学 成立 于 1991 年 , 经 2000 年 镇 政府 重点 投资 后 , 学校 占地 11712 平方 米 , 建筑 面积 11633 平方 米 。
+两 栋 教学 楼 共 有 课室 38 间 , 功能 室 30 间 。
+一 个 塑胶 运动 场 ( 含 200 米 跑道 ) 、 两 个 内 球场 、 雨天 运动 场 、 羽毛 球场 、 排球 场 、 乒乓 球台 及 有 关 体育 器械 等 教学 设备 基本 符合 标准 。
+学校 现有 教学 班 29 个 , 学生 1355 人 。
+教 职工 89 人 , 中共 党员 21 人 , 共青 团员 16 人 , 大专 学历 以上 76 人 , 本科 学历 63 人 , 高级 教师 64 人 , 一 级 教师 19 人 。
+校园 文化 氛围 浓厚 。
+53 米 历史 长廊 , 34 米 绿色 长廊 , 四 道 楼梯 墙壁 上 成语 镜框 、 走廊 墙上 的 书法 镜框 、 喷涂 镜框 、 名人 画像 及 室内 标语 镜框 等 大小 镜框 200 多 个 , 集 教育 性 、 知识 性 、 艺术 性 于 一体 。
+整洁 、 有 序 、 干净 、 清爽 的 校园 内 两 个 小型 生物 园 以及 周围 花圃 、 草地 、 绿树 红花 , 使 校园 更 绿 、 更 美 、 更有 希望 、 更具 生命 力 , 是 孩子 们 健康 成长 的 乐园 。
+< b > 发展 优势 < / b > < b > < / b > 1 、 正逢 教育 盛世 , 学校 发展 大有 作为 。
+随着 社会 经济 的 不 断 增长 , 人民 群众 越来 越 关注 教育 的 发展 , 人民 群众 对 优质 教育 的 渴望 也 越来 越 强烈 。
+最近 几 年 , 从 素质 教育 的 提出 到 新 课程 改革 的 实施 , 再 到 新 《 义务 教育 法 》 的 颁布 , 大 教育 、 终身 教育 等 观念 的 提出 , 证明 教育 改革 的 春天 已经 来到 。
+而 吴 书记 在 领导 干部 会议 上 的 重要 讲话 , 为 虎门 的 未来 描绘 了 美好 蓝图 , 更 指明 了 教育 事业 的 发展 方向 。
+这些 为 学校 的 特色 办学 、 课题 实验 、 德育 研究 等 提供 了 广阔 的 天地 。
+2 、 学校 定位 明确 , 学校 发展 前景 良好 。
+学校 办学 思路 清晰 , 全体 行政 、 教师 严谨 敬业 、 团结 向上 。
+一年 一 个 台阶 , 按照 “ 绿色 学校 ” 、 “ 市 一 级 学校 ” 、 “ 文化 先进 学校 ” 、 “ 省 综合 实践 样本 学校 ” 和 “ 现代 教育 技术 实验 学校 ” 的 标准 , 在 各级 领导 的 高度 重视 和 支持 下 竭尽 全力 把 学校 办好 , 教育 、 教学 质量 充分 得到 家长 和 社会 的 认同 与 肯定 。
+大 的 氛围 很 利于 学校 的 发展 。
+3 、 全体 教师 精诚 团结 , 敬业 爱岗 , 教学 设施 设备 先进 。
+学校 的 领导 班子 团结 、 勤政 、 务实 、 精干 、 尽责 , 班子 的 以身 立教 、 同心 同德 之 撑 着 “ 东方 ” 这个 大 家庭 。
+教师 们 团结 协作 、 谦虚 谨慎 , 相互 学习 、 相互 帮助 、 关心 集体 、 尽职 尽责 、 教书 育人 , 注意 培养 学生 具有 良好 的 思想 品德 , 认真 备课 、 上课 、 批改 作业 , 不 敷衍 塞责 , 不 传播 有 害 学生 身心 健康 的 思想 。
+2006 年 镇 政府 投资 为 学校 增建 一 个 多 功能 电脑 室 和 48 个 多 媒体 电化 教学 平台 , 2008 年 再 增建 的 校园 网 、 电子 阅览 室 和 广播 系统 等 , 为 学校 现代 化 教育 教学 提供 了 有 力 的 保障 。
+< b > 发展 理念 与 目标 < / b > < b > < / b > < b > 指导 思想 < / b > < b > < / b > 以 邓 小平 “ 三 个 面向 ” 和 “ 三 个 代表 ” 重要 思想 为 指导 。
+结合 我 镇 的 “ 快乐 教育 ” 和 学校 自身 特点 , 以 科学 的 发展 观 , 不 断 整合 教育 资源 , 不 断 改革 创新 , 不 断 超越 自我 。
+以 教育 规律 办事 , 走 依法 治校 之 路 , 走 科研 兴 校 之 路 , 走 质量 立 校 之 路 。
+努力 打造 优秀 的 教师 队伍 , 营造 和谐 的 校园 文化 , 构建 新型 的 教育 网络 , 为 孩子 的 健康 成长 提供 最 优质 的 教育 。
+< b > 发展 理念 < / b > < b > < / b > < b > 办学 理念 : < / b > < b > 以 现代 科技 为 手段 , 以 人文 精神 为 核心 , 创建 “ 塑造 人 · 可 发展 ” 的 教育 模式 。
+< / b > < b > < / b > < b > 办学 目标 : < / b > < b > 坚持 教育 “ 三 个 面向 ” , 全面 贯彻 党 的 教育 方针 , 坚持 改革 , 以 法治 校 , 民主 管理 。
+努力 改善 办学 条件 , 调动 全体 师生 积极 性 , 以 “ 德育 为 首 , 五 育 并举 ” , 实践 教书 育人 、 管理 育人 、 环境 育人 、 保障 育人 的 教育 新 概念 。
+< / b > < b > < / b > < b > 校训 : < / b > < b > 爱 学 求真 诚实 有 礼 活泼 健康 < / b > < b > < / b > < b > 校风 : < / b > < b > 严谨 敬业 团结 向上 < / b > < b > < / b > < b > 教 风 : < / b > < b > 诲 人 不 倦 全面 育人 < / b > < b > < / b > < b > 学风 : < / b > < b > 自主 合作 探究 < / b > < b > < / b > < b > 五年 发展 目标 < / b > < b > < / b > 以 人为 本 , 扎实 推进 素质 教育 。
+经过 五年 的 努力 , 把 学校 办 成 教育 理念 先进 , 教师 队伍 优良 , 学校 管理 民主 , 教学 质量 过硬 , 师生 关系 和谐 , 校园 环境 优美 , 人民 群众 满意 , 学生 得以 自信 、 健康 成长 , 在 镇 内 具有 自身 办学 特色 的 实验 学校 。
+< b > 阶段 目标 < / b > < b > < / b > 第 一 阶段 : 准备 与 试行 阶段 ( 2009 年 2 月 —— 2009 年 8 月 ) 确定 思路 、 制订 计划 、 贯彻 任务 、 初步 施行 第 二 阶段 : 实施 阶段 ( 2009 年 9 月 —— 2013 年 8 月 ) 完成 发展 规划 的 各项 指标 , 逐步 积累 材料 第 三 阶段 : 总结 阶段 ( 2013 年 9 月 —— 2013 年 12 月 ) 总结 各项 成绩 , 提炼 经验 成果 , 大力 推广 应用 < b > 发展 措施 < / b > < b > < / b > 今后 五年 , 我们 将 着力 于 以下 八 方面 的 努力 , 以此 推动 学校 整体 发展 。
+< b > 学校 管理 < / b > < b > < / b > 1 、 工作 目标 以 规范 化 、 精细 化 、 民主 化 为 学校 管理 目标 , 建立 “ 规范 化 、 精细 化 、 民主 化 ” 的 内部 管理 体制 。
+2 、 工作 措施 ( 1 ) 通过 制定 《 东方 小学 学校 常规 管理 流程 图 》 , 使 学校 事事 有 人 管 , 人人 有 事 干 ; 通过 各项 规章 制度 的 不 断 完善 , 实现 管理 的 规范 化 。
+( 2 ) 关注 现有 规章 制度 的 落实 情况 , 关注 管理 的 细节 。
+制定 现有 制度 落实 机制 , 提高 现有 制度 的 执行 力 , 实现 管理 的 精细 化 。
+( 3 ) 组织 领导 班子 参加 管理 培训 , 提高 管理 的 能力 ; 创造 条件 让 全体 教师 参与 到 学校 管理 , 提高 学校 的 凝聚 力 和 战斗 力 ; 完善 教代 会 的 监督 机制 , 确保 学校 管理 的 民主 。
+< b > 教师 队伍 < / b > < b > < / b > 1 、 工作 目标 教师 整体 水平 得到 提高 , 各 学科 骨干 教师 群体 真正 形成 。
+一支 “ 师德 高尚 、 业务 精良 、 身心 健康 、 团结 协作 ” 的 教师 队伍 基本 建成 。
+力争 五年 内 有 一定 数量 的 镇 教 坛 新秀 、 骨干 教师 、 名师 、 高级 教师 出现 。
+2 、 工作 措施 ( 1 ) 加强 师德 师风 建设 。
+政治 思想 工作 是 一切 工作 的 保证 , 政治 工作 做好 了 , 人 的 积极 性 和 创造 性 发挥 起来 , 各项 工作 就 可以 做好 。
+五年 里 , 学校 要 把 “ 爱 与 责任 ” 作为 师德 教育 主线 , 把 政治 学校 制度 化 、 常规 化 ; 要 充分 发挥 党 支部 的 作用 , 不 断 发展 党员 队伍 , 用 榜样 的 力量 教育 人 ; 工会 要 利用 自身 优势 , 通过 各种 形式 的 活动 确保 教师 身心 健康 。
+( 2 ) 加强 学校 领导 班子 队伍 建设 。
+俗话 说 : 赢 在 中层 。
+我 校 领导 班子 , 年轻 富有 朝气 , 班子 内 团结 、 协作 , 但 经验 不 足 、 管理 理论 欠缺 的 问题 同样 突出 。
+领导 班子 的 全体 成员 要 把 “ 忠于 职守 、 甘于 奉献 、 团结 协作 、 廉洁 奉 公 、 以身 作 则 ” 作为 工作 信条 。
+学校 争取 给 领导 班子 每 一 位 成员 提供 外出 学习 培训 的 机会 , 提高 他们 科学 管理 水平 。
+( 3 ) 加强 校本 教研 建设 。
+学校 制订 《 教师 学习 制度 》 , 各 学科 都 要 形成 较 系统 的 校本 培训 方案 , 通过 扎扎 实实 的 校本 教研 活动 , 提高 教师 的 专业 水平 。
+每 学期 坚持 举办 一 次 有 一定 规模 的 学科 研讨 活动 ; 每 学年 坚持 “ 请 近来 、 走出 去 ” 为 教师 提供 学习 机会 。
+( 4 ) 加强 青年 教师 和 骨干 教师 队伍 建设 。
+学校 以 “ 学科 骨干 —— 学科 带头 人 —— 名 教师 —— 专家 型 教师 ” 的 培养 模式 , 制定 科学 的 骨干 教师 成长 规划 , 对 年轻 教师 建立 专业 成长 档案 袋 。
+学校 每 学年 有 明确 的 青年 教师 和 骨干 教师 培养 人选 , 并 有 组织 、 有 计划 、 有 层次 的 进行 培养 。
+( 5 ) 建立 科学 的 教师 评价 制度 建设 。
+通过 《 教师 奖励 考核 条例 》 、 《 教师 专业 发展 性 评价 》 等 制度 的 制订 , 对 教师 进行 定性 和 定量 相 结合 的 考评 。
+在 校园 内 努力 营造 一 种 “ 互相 信任 、 互相 支持 、 互相 谅解 、 互相 帮助 ” 的 健康 人际 关系 。
+< b > 德育 工作 < / b > < b > < / b > 1 、 工作 目标 坚持 素质 教育 , 坚持 以 人为 本 , 坚持 联系 实际 。
+开展 以 快乐 第 二 课堂 活动 、 养成 教育 、 感恩 教育 为 载体 的 育人 模式 的 探究 。
+让 学生 在 活动 中 提高 素质 、 体验 成功 、 享受 幸福 , 在 人格 上 获得 全面 发展 。
+2 、 工作 措施 教育 工作 除了 抓好 日常 工作 以外 , 要 突出 主线 , 重点 搞好 三 个 “ 载体 ” 的 建设 。
+充分 挖掘 三 个 “ 载体 ” 所 蕴涵 的 德育 内涵 , 关注 学生 个性 成长 。
+( 1 ) 开展 快乐 第 二 课堂 活动 口号 : 享受 快乐 , 锻炼 体魄 , 培养 品质 目标 : 让 学生 在 享受 快乐 的 同时 提高 全面 素质 , 培养 良好 品质 。
+① 快乐 第 二 课堂 活动 以 大 课间 活动 为 依托 。
+开展 适合 学校 实际 的 篮球 、 书法 、 象棋 、 即席 作文 、 奥数 、 等 兴趣 小组 。
+② 活动 要 坚持 做到 三 有 : 学生 人人 有 项目 , 教师 人人 有 辅导 , 活动 有 固定 地点 。
+③ 各 兴趣 小组 要 善于 挖掘 活动 的 内涵 , 把 活动 和 德育 联系 起来 。
+让 学生 在 活动 中 学会 本领 , 在 活动 中 培养 品质 , 在 活动 中 明白 道理 。
+( 2 ) 注重 学生 养成 教育 口号 : 造就 君子 性格 , 培养 君子 素质 目标 : 培养 学生 养成 良好 的 行为 习惯 , 形成 良好 的 学风 、 班风 。
+① 行为 教育 要 紧紧 抓住 学校 的 校风 、 学风 , 以 学生 良好 行为 习惯 的 养成 为 重点 。
+② 活动 设计 要 根据 学生 特点 , 要 紧扣 主题 。
+活动 的 过程 要 注重 实效 性 和 可 操作 性 , 对 学生 、 班级 的 评价 要 注重 真实 性 和 可 延续 性 。
+( 3 ) 对 学生 进行 感恩 教育 口号 : 学会 感恩 , 学会 奉献 目标 : 探索 出 富有 学校 特色 的 感恩 教育 模式 , 形成 以 传统 节日 和 教育 周 为 基本 阵地 。
+让 学生 在 活动 中 学会 感恩 , 学会 奉献 。
+措施 : ① 活动 要 抓住 二 条 主线 。
+一 是 以 传统 节日 活动 为 主线 , 把 “ 妇女 节 ” 、 “ 清明 节 ” 、 “ 儿童 节 ” 、 “ 国庆 节 ” 、 “ 母亲 节 ” 等 传统 节日 教育 和 感恩 教育 有 机 结合 。
+二 是 明确 每 学期 一 次 的 “ 感恩 周 ” 和 “ 奉献 周 ” , 展示 学校 感恩 教育 成果 , 同时 把 活动 推向 高潮 。
+② 在 抓好 二 条 主线 的 同时 , 要 善于 挖掘 活动 的 内涵 , 不 断 拓展 活动 的 形式 。
+< b > 课堂 教学 < / b > < b > < / b > 1 、 工作 目标 树立 正确 的 教育 观 , 以 质量 意识 为 中心 , 以 课堂 教学 为 重点 , 以 改革 创新 为 手段 。
+注重 校本 教研 、 注重 小 课题 研究 , 积极 探索 “ 关注 个体 、 注重 实效 ” 的 课堂 教学 途径 , 通过 五年 的 努力 , 使 学校 教学 质量 得以 稳步 提升 。
+2 、 工作 措施 ( 1 ) 严肃 执行 课程 计划 , 严格 按 国家 规定 开 足 、 开 齐 、 开 好 各 门 课程 。
+修订 《 东方 小学 课堂 教学 六大 原则 》 , 加强 教学 过程 管理 , 重视 课堂 40 分钟 效率 。
+( 2 ) 加强 校本 教研 建设 。
+校本 教研 以 “ 关注 个体 、 注重 实效 ” 的 课堂 教学 研究 为 重点 , 积极 倡导 以 “ 学习 、 培训 、 科研 ” 为 主要 内容 的 校本 研究 , 逐步 形成 学校 校本 教研 特色 。
+( 3 ) 进行 校本 课程 开发 。
+语 、 数 、 英 学科 要 带头 挖掘 学科 资源 , 找准 适合 学校 特点 的 校本 课程 。
+各 教研 组 都 要 以 争创 品牌 学科 为 努力 目标 , 每 学期 进行 一 次 主题 明确 的 研讨 活动 , 经过 三年 努力 , 初步 形成 各自 的 学科 特色 。
+( 4 ) 课题 研究 必须 和 学生 的 教学 质量 相 联系 , 和 学生 的 个性 发展 相 联系 。
+学校 对 各 年级 的 教学 质量 重点 突出 抓 合格 率 和 优秀 率 。
+同时 对 教学 质量 进行 动态 管理 , 学校 坚持 每 学年 进行 一 次 教学 质量 调研 , 每 学期 进行 一 次 年级 质量 分析 。
+( 5 ) 坚持 学生 评价 制度 的 改革 , 建立 科学 、 合理 的 学生 质量 评价 体系 ; 坚持 学生 质量 评价 的 多元化 , 评价 过程 的 个性化 。
+< b > 教 科研 工作 < / b > < b > < / b > 1 、 工作 目标 树立 “ 科研 兴 校 , 科研 促 教 ” 的 思想 , 建立 科研 管理 机制 。
+通过 学校 中心 课题 的 研究 和 辐射 , 营造 学校 的 科研 氛围 , 促进 学校 的 个性 发展 , 提高 教师 的 科研 水平 。
+2 、 工作 措施 ( 1 ) 成立 学校 教 科研 领导 小组 , 统一 规划 学校 教 科研 工作 。
+( 2 ) 保证 每年 有 区级 以上 立项 课题 , 人人 有 一 个 校级 以上 立项 课题 , 每 学期 组织 一 次 校内 论文 、 案例 评比 , 每 学年 有 校级 论文 、 案例 汇编 成册 , 每 学期 邀请 专家 进行 指导 。
+( 3 ) 利用 教 科研 队伍 , 促进 学校 校本 课程 、 综合 实践 活动 的 开发 , 形成 学校 特色 课程 。
+< b > 办学 特色 < / b > < b > < / b > 1 、 工作 目标 通过 五年 努力 , 综合 实践 学科 教学 特色 在 镇 内 有 较大 的 知名 度 , 学校 初步 摸索 出 适合 学校 实际 的 办学 特色 之 路 。
+2 、 工作 措施 ( 1 ) 根据 学校 办学 理念 、 办学 目标 , 在 实践 中 形成 学校 的 办学 特色 。
+( 2 ) 各 学科 也 要 努力 争创 自己 的 特色 , 要 与 学校 实际 相 结合 , 要 与 校本 课程 开发 相 结合 , 争取 在 五年 内 有 1 - 2 个 在 镇 内 有 一定 影响 的 学科 特色 。
+( 3 ) 要 重视 项目 的 过程 管理 , 及时 进行 分析 , 综合 多种 因素 确定 学校 特色 。
+( 4 ) 建立 学校 特色 领导 小组 , 制订 学校 特色 实施 办法 , 稳步 推进 学校 特色 。
+< b > 校园 文化 < / b > < b > < / b > 1 、 工作 目标 坚持 环境 育人 的 思想 , 以 学校 办学 理念 为 校园 文化 建设 的 核心 , 构筑 “ 洋溢 诗韵 、 充满 书香 ” 的 校园 文化 。
+2 、 工作 措施 ( 1 ) 重视 校园 走廊 文化 和 班级 文化 的 营造 。
+重视 校训 、 教 风 、 学风 的 建设 。
+重视 黑板 报 、 宣传 窗 、 红 领巾 演播 厅 等 宣传 阵地 的 建设 。
+( 2 ) 大力 开展 校园 文化 活动 , 通过 学校 体育 节 、 读书 节 、 艺术 节 、 感恩节 等 活动 的 开办 , 逐步 形成 学习 传统 文化 。
+( 3 ) 利用 校园 网 等 手段 , 宣传 学校 教育 思想 , 寻求 形成 教育 合力 。
+( 4 ) 重视 制度 文化 建设 , 促进 师 与 师 、 师 与 生 、 生 与 生 的 和谐 发展 。
+< b > 办学 条件 < / b > < b > < / b > 1 、 工作 目标 经过 五年 努力 , 使 学校 布局 更 合理 , 校 容 校 貌 更 优美 , 办学 条件 更 现代 。
+2 、 工作 措施 ( 1 ) 加强 平安 校园 建设 。
+做到 制度 保证 , 检查 落实 , 教育 到位 , 重点 突出 , 保证 五年 内 不 发生 重大 事故 。
+( 2 ) 积极 争取 上级 领导 及 社会 各界 的 重视 、 关心 和 支持 , 多方 筹集 资金 改善 办学 条件 。
+( 3 ) 重视 学校 图书 室 、 阅览 室 、 展览 室 等 专用 教室 的 布置 。
+重视 学校 网站 的 建设 。
+( 4 ) 健全 专用 教室 现代 化 设备 的 管理 、 使用 和 维修 制度 。
+2009 年 2 月 2 日 中文 名 虎门 东方 小学 校训 爱 学 求真 诚实 有 礼 活泼 健康 创办 时间 1985 类别 公立 小学 现任 校长 曾 灿基 教 职工 89 人 学校 占地 面积 11712 平方 米
+饥饿 末日 《 饥饿 末日 》 是 连载 于 起点 中文 网 的 网络 小说 , 作者 是 不 死 鸟 - 天堂 。
+小说 类型 0 科幻 幻想 内容 简介 0 面对 饥饿 有 多少 人 能 保留 理智 。
+假如 末日 真 的 来临 , 有 多少 人 还会 祈求 活 下去 … … 中文 名 饥饿 末日 作者 不 死 鸟 - 天堂 小说 进度 连载 连载 网站 起点 中文 网
+莫塞斯 · 亚当斯 莫塞斯 · 亚当斯 ( Mose sAdams , 1988 年 7 月 21 日 - ) , 比利时 足球 运动 员 。
+0 莫塞斯 · 亚当斯 ( Mose sAdams ) 国籍 比利时 粤语 名 阿当 出生 日期 1988 - 07 - 21 出生 地点 比利时 身高 180 . 0CM 体重 73 . 0K G 场 上 位置 中场 现 效力 球队 韦斯 达路 球衣 号码 20 号
+典藏 奶油 巧克力 蛋糕 用料 0 < table > 材料 用量 鸡蛋 5 个 细 砂糖 80g 低 粉 125 g 可可 粉 25g 黄油 30g 动物 鲜奶 油 250 g ( 裱花 ) 糖粉 适量 ( 裱花 ) 水 50g ( 酒 糖液 ) 细 砂糖 25g ( 酒 糖液 ) 朗姆酒 25g ( 酒 糖液 ) 巧克力 58g 动物 鲜奶 油 58g 黄油 24g < / table > 做法 0 < table > 1 . 全 蛋 海绵 法 : 将 鸡蛋 磕 入 盆 中 , 打散 , 加入 细 砂糖 , 隔 水 加热 至 35 左右 , 继续 用 电动 打 蛋 器 搅 打 至 蛋 糊 体积 膨胀 明显 , 纹路 明显 , 用 刮刀 盛 起 面糊 滴 落后 , 纹路 留在 盆 中 的 形状 不 变 , 不 消失 , 此时 全 蛋 已经 打 好 2 . 低 粉 可可 粉 混合 , 分 5 - 6 次 筛 入 蛋 糊 中 , 每次 切 拌 均匀 后再 筛 下 一 次粉 3 . 黄油 隔 水 融化 , 稍 凉 后 缓缓 倒 在 橡皮 刮刀 上 , 用 刮刀 快速 的 将 黄油 和 面糊 翻 拌 均匀 , 倒入 模具 中 4 . 烤箱 预热 170 ℃ , 中 下层 , 烘烤 35 mins 左右 , 出炉 倒 扣 晾 凉 5 、 蛋糕 胚 脱 模 后 , 均分 3 ~ 5 等份 , 备用 5 . 酒 糖液 : 50g 热水 融化 25g 白糖 后 , 倒入 朗姆酒 , 放 凉 备用 6 . 巧克力 奶油 馅 : 所有 材料 混合 , 加热 全部 融化 , 搅拌 均匀 后 , 稍稍 冷冻 至 凝结 备用 7 . 胚 体 组合 : 取 一 片 蛋糕 片 , 刷 上 酒 糖液 , 抹 上 一 层 巧克力 奶油 馅 , 再铺 上 一 片 蛋糕 片 , 依次 类推 8 . 奶油 + 糖粉 坐 冰 打发 , 抹 在 组合 好 的 蛋糕 体 上 , 冷藏 , 即可 中文 名典 藏 奶油 巧克力 蛋糕 主要 食材 鸡蛋 , 细 砂糖 分类 蛋糕 , 下午 茶
+现代 HYT - 03 ( 4GB ) 数字 电视 现代 HYT - 03 ( 4GB ) 数字 电视 是 一款 电子 产品 , 参数 是 屏幕 尺寸 为 3 . 5 英寸 320 * 240 像素 。
+重要 参数 0 续航 时间 : 视频 播放 : 4 小时 MP3 播放 : 6 小时 . . . 视频 格式 : AVI / RM / RMVB / FLV 等 格式 音乐 格式 : MP 3 / WMA / WAV / FLAC 录音 : WAV 图片 . . . 扩容 卡 槽 : 支持 MicroSD ( TF ) 卡 录音 功能 : 支持 信噪 比 : 85 输出 功率 : ( L ) 5mW + ( R ) 5mW ( 32 Ohm ) 支持 语言 : 18 种 语言 操作 系统 : WindowsSE / ME / 2000 / XP 现代 HYT - 03 ( 4GB ) 数字 电视 详细 参数 0 现代 HYT - 03 ( 4GB ) 数字 电视 详细 参数 显示 系统 1 屏幕 尺寸 : 3 . 5 英寸 屏幕 分辨率 : 320 * 240 像素 数字 电视 接收 标准 CMMB : 支持 现代 HYT - 03 ( 4GB ) 数字 电视 详细 参数 视频 性能 2 视频 格式 : AVI / RM / RMVB / FLV 等 格式 现代 HYT - 03 ( 4GB ) 数字 电视 详细 参数 音频 性能 3 设计 类型 : 录放 音乐 格式 : MP 3 / WMA / WAV / FLAC 录音 : WAV 图片 : JPEG / GIF 录音 功能 : 支持 信噪 比 : 85 输出 功率 : ( L ) 5mW + ( R ) 5mW ( 32 Ohm ) 现代 HYT - 03 ( 4GB ) 数字 电视 详细 参数 文本 阅读 4 支持 语言 : 18 种 语言 现代 HYT - 03 ( 4GB ) 数字 电视 详细 参数 存储 性能 5 存储 容量 : 4GB 扩容 卡 槽 : 支持 MicroSD ( TF ) 卡 现代 HYT - 03 ( 4GB ) 数字 电视 详细 参数 电池 功耗 6 电池 类型 : 聚合 物 电池 续航 时间 : 视频 播放 : 4 小时 MP3 播放 : 6 小时 系统 / 接口 操作 系统 : WindowsSE / ME / 2000 / XP 接口 方式 : USB 2 . 0 现代 HYT - 03 ( 4GB ) 数字 电视 详细 参数 产品 特性 7 产品 特性 1 : 工作 温度 : 5 度 - 40 度 中文 名 现代 HYT - 03 ( 4GB ) 数字 电视 存储 容量 4GB 屏幕 尺寸 3 . 5 英寸 320 * 240 像素 电池 类型 聚合 物 电池
+北京 中海 沃邦 能源 投资 有限 公司 北京 中海 沃邦 能源 投资 有限 公司 于 2007 - 06 - 07 在 西城 分局 登记 成立 。
+法定 代表 人 陈 晓峰 , 公司 经营 范围 包括 项目 投资 ; 投资 管理 ; 投资 咨询 ; 技术 咨询 、 服务 等 。
+企业 信息 0 公司 类型 其他 有限 责任 公司 登记 机关 西城 分局 成立 时间 2007 - 06 - 07 发 照 时间 2016 - 08 - 30
+伸缩 租税 伸缩 租税 是 指 美国 著名 凯恩斯 主义 者 汉森 提出 的 一 种 税收 理论 。
+其 基本 内容 是 : 为 使 税收 能 自动 地 适应 经济 的 周期 变化 , 从而 减少 或 消除 经济 波动 所 带来 的 消极 影响 , 应 改变 税率 的 固定 性 而 增大 税率 的 灵活 性 。
+例如 : 对 工资 税 和 销售 税 即可 实施 某种 自动 税率 。
+具体 做法 是 : 由 国会 制定 两 套 不 同 的 税率 : 一 种 是 基本 税率 , 一 种 是 预备 税率 。
+两种 税率 的 择 一 使用 取决 于 特定 的 经济 指标 。
+10 当 经济 指标 在 规定 的 幅度 内 变化 时 适用 基本 税率 , 当 经济 指标 超出 规定 的 变化 幅度 时 则 采用 预备 税率 , 使 税率 随 经济 指标 的 变动 而 自动 地 降低 或 升高 , 借此 来 达到 保证 个人 可 支配 收入 , 维护 消费 支出 的 目的 。
+
+多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 ( 修改 稿 ) 根据 新 义务 教育 法 的 有 关 要求 , 结合 我 县 近 两年 撤 乡 并 镇 , 学校 布局 调整 等 实际 。
+本 着 进一步 健全 学籍 管理 制度 , 规范 学校 管理 的 原则 , 经 研究 决定 , 对 现行 的 《 多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 ( 试行 ) 》 进行 修改 , 现 将 《 多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 ( 修改 稿 ) 》 印发 给 你们 , 请 认真 遵照 执行 。
+发文 通知 0 < b > 多 教 办 ( 2008 ) 20 号 < / b > < b > < / b > < b > 关于 修改 《 多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 ( 试行 ) 》 的 通知 < / b > < b > < / b > 各 有 关 学校 : 附件 : 多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 ( 修改 稿 ) 二 00 八 年 四月 二十一 日 多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 0 < b > ( 修改 稿 ) < / b > 多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 总则 1 第 一 条 根据 《 内蒙古 自治 区 实施 〈 中华 人民 共和 国 义务 教育 法 〉 办法 》 的 规定 , 为 规范 学籍 管理 , 控制 学生 流动 的 随意 性 , 保证 所有 适龄 儿童 少年 按时 接受 九 年 义务 教育 。
+结合 全县 撤 乡 并 镇 、 中小 学 布局 调整 等 实际 , 本 着 就近 入学 的 原则 , 调整 全县 义务 教育 阶段 中小 学 入学 范围 , 制定 义务 教育 阶段 中小 学 学籍 管理 办法 。
+第 二 条 本 《 办法 》 适用 于 全县 义务 教育 阶段 各 中小 学 。
+多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 入学 2 第 三 条 根据 《 内蒙古 自治 区 普及 九 年 义务 教育 和 扫除 青 壮年 文盲 评估 验收 办法 》 规定 , 小学 在 籍 年龄 为 7 — 12 周岁 , 初中 为 13 — 15 周岁 。
+结合 新 《 中华 人民 共和 国 义务 教育 法 》 第 十一 条 , 凡 年满 六 周岁 的 儿童 , 其 父母 或 其他 法定 监护 人 应当 送 其 入学 接受 并 完成 义务 教育 ; 条件 不 具备 的 地区 的 儿童 , 可以 推迟 到 七 周岁 。
+第 四 条 小学 严格 遵照 划片 招生 的 原则 , 招收 符合 入学 年龄 的 儿童 入学 。
+学校 在 新 学年 开始 前 两 个 月 将 辖区 内 适龄 儿童 核实 造册 。
+经 当地 人民 政府 ( 或 教育 行政 部门 ) 授权 的 小学 , 最 迟 在 新 学年 开学 前 20 天 , 将 本 辖区 盖有 乡镇 人民 政府 和 教育 行政 部门 公章 的 入学 通知 书 发放 给 其 父母 或 其他 监护 人 。
+第 五 条 凡 年满 7 周岁 的 儿童 , 凭 户口 薄 和 入学 通知 书 到 教育 行政 部门 规定 的 小学 办理 入学 手续 后 , 即可 取得 学籍 。
+学校 要 保证 本 辖区 每 一 位 学龄 儿童 都 能 进入 学校 就读 , 享受 九 年 义务 教育 , 不 得以 任何 借口 拒收 本 辖区 学龄 儿童 , 不 得 跨 区域 招生 。
+凡 收 到 入学 通知 书 的 适龄 儿童 , 没 有 特殊 原因 一 周 内 不 报到 入学 者 , 由 学校 将 名单 报 乡镇 政府 , 乡镇 政府 依法 督促 其 入学 。
+第 六条 适龄 儿童 需要 免 入学 、 缓 入学 的 , 由 其 父母 或 其他 监护 人 提出 申请 , 经 乡镇 人民 政府 和 教育 行政 部门 批准 。
+因 身体 原因 申请 免 缓 学 的 , 应 附 县 以上 医院 的 诊断 证明 。
+缓 学期 满 仍 不 能 就读 的 , 应当 重新 提出 缓 入学 申请 。
+第 七 条 小学 应 创造 条件 接收 视力 、 听力 、 智力 等 轻度 残疾 儿童 随 班 就读 , 并 努力 为其 创造 良好 的 学习 环境 。
+残疾 儿童 随 班 就读 的 条件 与 正常 儿童 入学 条件 相同 , 特殊 情况 的 可 适当 放宽 。
+第 八 条 小学 入学 分 班 应 采取 随机 分配 的 办法 进行 , 严禁 通过 任何 面试 、 笔试 等 方式 为 学生 分配 班级 。
+小学 新生 入学 后 , 学校 要 为 学生 建立 《 新生 入学 花 名册 》 、 《 学籍 表 》 等 学籍 档案 。
+第 九 条 外来 经商 和 进城 务工 人员 的 子女 , 教育 行政 部门 应 按照 国家 有 关 规定 妥善 安排 就近 入学 , 学校 应 给予 学籍 编排 学号 。
+对 本 地 户口 回流 的 学生 , 执 有效 证件 , 按 户籍 所在 地 就近 入学 , 学校 不 得 拒收 。
+第 十条 小学 毕业 生 升 初中 不 举行 升学 考试 , 按照 划片 就近 入学 原则 , 由 教育 行政 部门 统一 分配 , 全部 进入 初中 。
+经 当地 政府 和 教育 行政 部门 授权 的 初中 , 按照 教育 行政 部门 分配 的 名单 , 提前 20 天 向 初一 新生 发放 盖有 乡镇 政府 和 教育 行政 部门 公章 的 入学 通知 书 。
+第 十一 条 初一 新生 凭 入学 通知 书 到 指定 初中 入学 后 , 即 取得 学籍 。
+已 领到 通知 书 在 开学 一 周 内 不 报到 入学 者 , 学校 查明 原因 后 , 报 乡镇 政府 和 教育 行政 部门 。
+属 无 正当 理由 不 入 学者 , 乡镇 政府 依法 督促 其 入学 。
+第 十二 条 学籍 一经 确定 , 除 转学 、 毕业 或 死亡 外 , 任何 学校 和 个人 不 得以 任何 理由 取消 其 学籍 。
+任何 学校 不 允许 存在 无 学籍 学生 。
+学籍 档案 由 学生 就读 学校 负责 管理 , 不 得 出现 生 、 籍 分离 或 不 符 的 现象 。
+学籍 表 由 班主任 负责 填写 完整 , 填写 应 实事 求是 , 如实 反映 学生 真实 情况 。
+每 学期 末 由 班主任 交 教导 处 保管 。
+第 十三 条 中小 学 入学 范围 、 入学 通知 书 编号 、 义教 证书 编号 < b > 一 、 入学 范围 < / b > < b > < / b > ( 一 ) 小学 1 、 第 一 小学 招生 范围 为 : 兴盛 社区 、 中 村 、 南村 、 南 林场 。
+2 、 第 二 小学 招生 范围 为 : 福 盛 社区 。
+3 、 第 三 小学 招生 范围 为 : 善 因 社区 、 向阳 社区 、 西村 、 富泉 村 、 苗圃 。
+4 、 第 四 小学 招生 范围 为 : 大 仓 社区 、 东 仓 村 、 新仓 村 、 盆 窑 村 、 水泉 村 、 小 营盘 村 、 北村 、 二 道 洼 村 、 第 一 建材 厂 。
+5 、 十 五号 小学 招生 范围 为 : 十 五号 村 、 十七 号 村 、 白沙 梁 村 。
+6 、 五号 小学 招生 范围 为 : 五号 村 、 九 号 村 、 南山 根 村 、 白 音坤 兑 村 。
+7 、 小 石 砬 小学 招生 范围 为 : 小 石 砬村 、 大石 砬 村 、 平 甸 沟 村 。
+8 、 西 干沟 小学 招生 范围 为 : 大 官场 村 、 小 官场 村 、 河槽 子 村 、 牛槽 洼 村 、 马 群 后 沟 村 。
+9 、 耗 来 沟 小学 招生 范围 为 : 牛 眼睛 村 、 牛心 山 村 、 大耗 来 沟 村 、 羊 盘 沟 村 、 小河 村 。
+10 、 大北 沟 小学 招生 范围 为 : 大北 沟 村 、 学 田地 村 、 黑 山头 村 、 西山 根 村 、 蒙古 营 村 、 北 石门 村 、 花 塘 沟 村 、 白 石头 沟 村 。
+11 、 黑山 嘴 小学 招生 范围 为 : 黑 山嘴 村 、 团结 村 、 双井 子 村 、 胜利 村 、 新民 村 。
+12 、 三 道 沟 小学 招生 范围 为 : 三 道 沟 村 、 榆树 林 村 、 黄羊 沟 村 。
+13 、 大 河口 小学 招生 范围 为 : 前 九 号 村 、 温塘 河 村 、 大孤 山 村 、 大 河口 村 、 公 吉 诺 村 、 滦河 村 。
+14 、 上 都 河 小学 招生 范围 为 : 白 音 卜 罗村 、 白 城子 村 、 老 北 沟 村 、 上 都 河 村 、 炮台 村 、 砧子 山村 、 光明 村 。
+15 、 蔡 木 山 小学 招生 范围 为 : 一 家 河 村 、 黑 风河 村 、 铁 公 泡子 村 、 青龙 背 村 。
+( 二 ) 初中 1 、 第 二 中学 招生 范围 为 : 第 一 小学 、 第 二 小学 、 第 三 小学 、 第 四 小学 的 全部 走读 学生 。
+2 、 第 四 中学 招生 范围 为 : 十 五号 小学 、 五号 小学 、 小 石 砬 小学 、 大北 沟 小学 、 西 干沟 小学 、 耗 来 沟 小学 、 黑山 嘴 小学 、 三 道 沟 小学 、 大 河口 小学 、 上 都 河 小学 、 蔡 木 山 小学 及第 一 小学 、 第 二 小学 、 第 三 小学 校外 住宿 生 和 第 四 小学 校内 住宿 生 。
+< b > 二 、 入学 通知 书 编号 < / b > < b > < / b > 1 、 以 村 、 社区 为 单位 编号 ( ) Ⅹ Ⅹ ( ) ― Ⅹ Ⅹ Ⅹ 乡镇 代号 年号 后 两位 村 、 社区 代号 序号 例 : C 0819 — 012 代表 大 北 沟 镇 2008 年 羊 盘 沟 村 第 12 号 2 、 各 乡镇 、 村 ( 社区 ) 代号 编排 表 < table > 乡镇 名称 代号 村 、 社区 名称 及 编号 大 河口 乡 A 前 九 号 01 温塘 河 02 大孤 山 03 大 河口 04 公 吉 诺 05 滦河 06 三 道 沟 07 榆树 林 08 黄羊 沟 09 蔡 木 山乡 B 白 音 卜罗 01 白城 子 02 老 北 沟 03 上 都 河 04 炮台 05 砧子 山 06 光明 07 一 家 河 08 黑 风河 09 铁 公 泡子 10 青龙 背 11 大北 沟镇 C 十 五号 01 十七 号 02 白沙 梁 03 五号 04 九 号 05 南山 根 06 白 音坤 兑 07 小 石 砬 08 大石 砬 09 平 甸 沟 10大 官场 11 小 官场 12 河 槽子 13 牛槽 洼 14 马 群 后 沟 15 牛 眼睛 16 牛心 山 17 大耗 来 沟 18 羊 盘 沟 19 小河 20 大北 沟 21 学 田地 22 黑山 头 23 西山 根 24 蒙古 营 25 北 石门 26 花 塘 沟 27 白 石头 沟 28 多伦 淖尔 镇 D 兴盛 社区 01 中 村 02 南村 03 南 林场 04 福 盛 社区 05 善 因 社区 06 向阳 社区 07 西村 08 富泉 村 09 苗圃 10大 仓 社区 11 东 仓 村 12 新仓 村 13 盆 窑 村 14 水泉 村 15 小 营盘 村 16 北村 17 二 道 洼 村 18 第 一 建材 厂 19 黑 山嘴 村 20 团结 村 21 双井 子 村 22 胜利 村 23 新民 村 24 < / table > < b > 三 、 义务 教育 证书 编号 < / b > < b > < / b > 大 河口 乡 : 大 河口 小学 11 × × ― ― × × × 三 道 沟 小学 12 × × ― ― × × × 蔡 木 山 乡 : 上 都 河 小学 21 × × ― ― × × × 蔡 木 山 小学 22 × × ― ― × × × 大 北 沟 镇 : 十 五号 小学 31 × × ― ― × × × 五号 小学 32 × × ― ― × × × 小 石 砬 小学 33 × × ― ― × × × 大北 沟 小学 34 × × ― ― × × × 西 干沟 小学 35 × × ― ― × × × 耗 来 沟 小学 36 × × ― ― × × × 多伦 淖尔 镇 : 第 一 小学 41 × × ― ― × × × 第 二 小学 42 × × ― ― × × × 第 三 小学 43 × × ― ― × × × 第 四 小学 44 × × ― ― × × × 黑山 嘴 小学 45 × × ― ― × × × 多伦 二中 50 × × ― ― × × × 多伦 四中 60 × × ― ― × × × 注 : 前 × × 为 年号 后 两位 , 后 × × × 为 序号 多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 考勤 和 综合 素质 评定 3 第 十四 条 学生 上课 、 自习 、 参加 劳动 实践 、 社会 实践 等 各项 活动 都 实行 考勤 。
+学生 在校 期间 应 按 规定 时间 到校 和 离校 。
+因故 不 能 按时 上课 或 参加 学校 组织 的 各项 活动 , 必须 请假 , 否则 按 旷课 处理 。
+对 旷课 和 经常 迟到 、 早退 的 学生 , 学校 应 及时 了解 情况 , 进行 批评 教育 。
+寄宿 制 学校 应 加强 对 寄宿 学生 的 管理 , 实行 寄宿 生 在校 考勤 制度 , 确保 寄宿 生 在校 安全 。
+寄宿 制 学校 要 按 国家 和 上级 有 关 规定 , 实行 寄宿 生 作息 制度 , 确保 学生 足够 的 休息 和 睡眠 时间 。
+寄宿 制 学校 要 结合 本 校 实际 和 学生 年龄 特点 , 开展 丰富 多彩 的 课余 文化 生活 , 使 学生 全面 健康 成长 。
+第 十五 条 学校 应 按照 课程 计划 和 新 课程 标准 的 要求 , 遵循 促进 学生 全面 发展 的 原则 , 通过 多种 形式 , 测评 学生 素质 发展 状况 。
+综合 素质 评定 包括 学业 成绩 的 考核 和 操行 的 评定 。
+采用 “ 等级 + 特长 + 激励 性 评语 ” 的 评价 方式 , 实行 素质 报告 单 制度 。
+学校 要 建立 学生 成长 记录 档案 , 按 学期 向 家长 或 其他 监护 人 报告 学生 素质 发展 的 全面 情况 , 并 征求 对 学校 工作 的 意见 , 记入 学生 成长 档案 。
+第 十六 条 德育 考察 主要 根据 学生 本人 的 思想 品德 和 行为 习惯 , 以及 遵守 学生 守则 和 行为 规范 的 情况 做出 全面 的 鉴定 ; 文化 课 考试 侧重 学生 掌握 课程 标准 或 教学 大纲 所 规定 的 基础 知识 、 基本 技能 的 情况 和 自学 能力 、 分析 解决 问题 的 能力 ; 课外 阅读 单独 设项 考查 , 主要 考查 阅读 量 ; 体育 考查 侧重 学生 身体 素质 、 健康 状况 、 体育 课 和 课外 体育 活动 的 质量 ; 劳动 课 考查 主要 看 学生 的 劳动 态度 、 劳动 纪律 及 掌握 劳动 知识 和 技能 情况 ; 研究 性 学习 和 综合 实践 活动 主要 考查 学生 的 创新 精神 , 动手 能力 和 实际 处理 问题 的 能力 。
+学校 每 学年 要 进行 一 次 学生 体质 健康 检查 。
+第 十七 条 学生 操行 一般 用 评语 的 方式 评定 。
+评语 以 激励 性 评价 为主 , 积极 采用 赏识 教育 的 思想 教育 学生 。
+用 全面 的 、 发展 的 观点 反映 学生 德 、 智 、 体 等 方面 的 情况 , 并 对 学生 具有 教育 、 指导 意义 。
+评语 由 班主任 拟稿 , 征求 任课 教师 、 少先 队 干部 意见 , 学校 领导 审定 。
+操行 评定 的 结果 应 填入 素质 发展 报告 单 。
+通知 学生 及其 家长 。
+第 十八 条 学生 的 学业 成绩 和 操行 要 记入 学生 档案 。
+多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 转学 和 借读 4 第 十九 条 学生 不 得以 择校 为 目的 随意 转学 。
+确 因 家长 工作 调动 、 家庭 搬迁 、 进城 务工 或 有 其他 特殊 原因 需 转学 的 , 由 学生 家长 提出 书面 申请 , 出示 有效 证件 , 经 转出 学校 同意 , 家长 持 转出 学校 的 转学 申请 表 与 转入 学校 联系 , 转入 学校 同意 接收 后 , 到 教育 局 教育 股 审批 登记 。
+经 教育 局 审批 登记 , 转出 学校 开具 转学 证 , 同时 索要 转入 学校 的 接收 回执 。
+对 未经 教育 局 审批 登记 , 学校 私自 接收 的 转入 学生 , 在 学年 初 ( 学期 初 ) 核定 人数 时 , 不 计入 本 校 学生 总数 , 同时 也 不 予 核算 生均 公用 经费 。
+第 二十 条 学校 不 得 接收 未 办理 转学 手续 的 插班 生 , 要 严格 控制 班 容量 , 对 符合 转入 条件 的 学生 应 及时 安排 插班 学习 。
+对 因 转入 学校 班 额 已 满 转入 有 困难 的 学生 , 由 教育 局 统一 安排 , 凡 转学 学生 不 能 转入 他 校 者 , 原 学校 应 允许 该 生 回校 学习 。
+第 二十一 条 学校 接收 的 转学 生 , 要 按 原 学校 提供 的 年级 编入 相应 年级 就读 , 不 允许 跨校 留级 。
+第 二十二 条 正常 转学 的 学生 , 由 转出 学校 给 学生 提供 学籍 档案 资料 , 转入 学校 收取 学生 档案 , 为其 建立 新 学籍 。
+第 二十三 条 毕业 班 学生 不 允许 办理 转学 手续 , 学生 在 休学 期间 不 准 转学 。
+第 二十四 条 转学 手续 一般 在 学期 结束 前 或 开学 后 一 周 内 办理 。
+中途 无 特殊 情况 一般 不 予 办理 。
+第 二十五 条 义务 教育 学校 一般 不 收 借读 生 。
+借读 系 指 学生 到 非 户口 所在 地 ( 指 跨 旗县 ) 申请 就读 。
+如 学校 班 额 许可 , 学生 在 服务 区内 有 居住 条件 ( 寄宿 制 学校 能 解决 住宿 ) 和 监护 人 。
+并 具备 下列 条件 之 一 者 , 可 准予 借读 : 1 、 父母 双方 从事 野外 或 流动 性 较大 的 工作 , 需 由 亲属 照管 的 。
+2 、 父母 双方 或 一 方 不 在 学生 户口 所在 地 工作 , 需 随 父母 居住 的 。
+3 、 父母 双方 均 无法 履行 或 一 方 无法 履行 监护 人 职责 , 需 由 亲属 抚养 监护 的 。
+4 、 儿童 随 父母 或 其他 监护 人 在 流入 地 居住 并 取得 公安 部门 颁发 的 暂住 证 , 同时 父母 或 其他 监护 人 取得 就业 证明 或 工商 部门 颁发 的 营业 执照 的 。
+第 二十六 条 外 旗县 适龄 儿童 、 少年 在 本 县 新 居住 地 借读 入学 , 经 教育 局 审批 注册 后 , 所在 学校 为其 建立 临时 学籍 。
+学校 为 借读 学生 开具 回执 , 由 家长 或 其他 监护 人 交 其 户籍 所在 地 学校 做 为 已 入学 凭证 。
+第 二十七 条 借读 学生 完成 初等 或 初级 中等 义务 教育 , 可 在 借读 学校 领取 义务 教育 证书 。
+多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 休学 和 复学 5 第 二十八 条 因 病 需 治疗 、 休养 和 发生 意外 伤害 的 学生 , 由 监护 人 凭 县 以上 医院 证明 , 向 所在 学校 提出 书面 申请 , 学校 核实 同意 后 , 经 教育 局 审批 登记 方可 休学 。
+休学 期限 一般 不 超过 下 一 学年 开学 日期 , 届时 不 能 复学 的 , 应 再 办理 审批 手续 。
+第 二十九 条 在 休学 期间 , 其 学籍 予以 保留 , 但 不 得 转入 其他 学校 。
+休 学期 满 或 休 学期 未 满 请求 复 学者 , 学校 要 准予 复学 。
+到期 不 复 学者 , 学校 应 按时 动员 其 复学 。
+复学 时 学校 可 据 其 实际 学历 程度 , 并 征求 本人 及 家长 意见 后 编入 相 年级 。
+多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 退学 和 辍学 6 第 三十 条 义务 教育 阶段 , 除 因 病 或 因故 丧失 学习 能力 者 外 , 不 允许 学生 退学 。
+丧失 学习 能力 必须 退学 的 , 其 退学 手续 一律 由 教育 局 办理 。
+第 三十一 条 学校 要 时刻 防止 义务 教育 阶段 学生 辍学 。
+在校生 未经 学校 准假 不 到 校 , 班主任 应 连续 进行 两次 家访 , 查明 原因 , 耐心 做 学生 思想 工作 , 敦促 其 到校 。
+经 班主任 两次 督促 后 , 三日 内 仍 不 到 校 , 班主任 应 及时 报告 学校 。
+学校 应 与其 所在 地 村委 会 或 居委 会 联系 , 共同 做 第 三 次 动员 , 并 采取 有效 措施 , 解决 实际 问题 。
+学生 一 周 内 还 不 到 校 , 学校 应 报告 乡镇 人民 政府 和 教育 行政 部门 , 并 配合 有 关 部门 依法 使 其 复学 。
+学生 复学 后 学校 要 做好 巩固 工作 。
+第 三十二 条 每个 月底 学校 以 月报 的 形式 向 教育 局 报告 一 次 学生 辍学 情况 。
+多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 升级 和 毕业 7 第 三十三 条 义务 教育 阶段 实行 年限 教育 , 学生 每 一 学 年末 自然 升入 高一 年级 。
+第 三十四 条 义务 教育 阶段 取消 留级 制度 , 随 班 就读 的 残疾 儿童 应 列 为 特殊 教育 学生 , 可 根据 本人 情况 安排 到 相应 年级 学习 。
+第 三十五 条 对 修 完 规定 年限 和 全部 课程 的 义务 教育 阶段 学生 , 学校 填写 初等 或 初级 中等 义务 教育 证书 , 经 教育 局 验印 后发 给 学生 。
+多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 奖励 和 批评 8 第 三十六 条 对 德 、 智 、 体 全面 发展 或 某 一方面 有 突出 成绩 的 学生 , 应 给予 奖励 。
+奖励 等级 分 为 班级 奖 、 校级 奖 和 上级 主管 部门 奖 。
+学生 受到 校级 以上 奖励 应 记入 学籍 表 。
+学校 每 学年 评选 一 次德 、 智 、 体 等 方面 全面 发展 的 “ 三好 学生 ” 、 “ 优秀 学生 干部 ” 和 “ 先进 班 集体 ” 。
+第 三十七 条 学生 违反 《 学生 守则 》 、 《 行为 规范 》 和 学校 有 关 规章 制度 , 要 进行 批评 教育 , 教育 方法 要 得当 , 严禁 教师 体罚 和 变相 体罚 学生 。
+对 屡 教 不 改 的 学生 , 应 给予 处分 。
+处分 学生 需 由 班主任 提出 , 经 校务 会 讨论 , 校长 批准 , 报 教育 局 备案 。
+处分 分 为 : 警告 、 严重 警告 、 记过 三种 。
+学校 不 得 开除 或 劝退 学生 。
+对 学生 处分 要 事实 清楚 , 允许 学生 申辩 。
+学生 在 受 处分 后 满 一 学期 , 确定 已 改正 错误 , 进步 显著 者 , 经 校务 会议 讨论 , 校长 批准 , 可以 撤消 处分 。
+处分 撤消 后 , 应 及时 将 处分 记录 从 学生 个人 学籍 档案 中 撤出 。
+多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 管理 职能 9 第 三十八 条 义务 教育 阶段 学籍 管理 由县 教育 局 负责 组织 、 指导 、 检查 、 处理 。
+由 学校 校长 负责 实施 , 各 中小 学校 要 有 专人 负责 此项 工作 。
+学籍 资料 以 学校 为 单位 建档 、 管理 。
+第 三十九 条 凡 弄 虚 作假 , 乱 开 转学 手续 , 乱发 假 义教 证书 , 乱 评 假 “ 三好 学生 ” 、 “ 优秀 学生 干部 ” 、 “ 先进 班 集体 ” , 随意 涂改 学籍 档案 的 , 对 直接 责任 人 应 给予 纪律 处分 。
+第 四十 条 在 执行 本 办法 的 过程 中 , 各校 可 根据 实际 情况 制定 相应 的 辅助 性 措施 。
+但 不 能 与 《 中华 人民 共和 国 义务 教育 法 》 和 《 中小 学 管理 规程 》 相 抵触 , 也 不 得 违背 本 规定 的 基本 原则 。
+第 四十一 条 教育 局 对 学校 的 学籍 管理 工作 要 加强 管理 , 及时 总结 分析 , 纠正 和 处理 学籍 管理 中 出现 的 问题 。
+多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 附则 10 第 四十二 条 本 办法 的 解释 权属 教育 局 教育 股 。
+第 四十三 条 本 办法 自 2008 年 秋季 新 学年 开始 起 执行 。
+二 00 八 年 四月 十四 日 中文 名 < a href = " # " > 多伦 县 义务 教育 阶段 中小 学 学籍 管理 办法 ( 修改 稿 ) < / a > 发布 时间 二 00 八 年 四月 二十一 日 地点 多伦 县 属性 管理 条例
+马 六塘 自然 村 马 六塘 自然 村 隶属 于 鱼塘 彝族 乡 挖 鲁 行政 村 , 属于 山区 。
+位于 鱼塘 乡 西南 边 , 距离 挖 鲁 村委 会 5 公里 , 距离 鱼塘 乡 30 公里 。
+全村 有 耕地 总 面积 336 亩 。
+村 情 概况 0 国土 面积 2 . 07 亩 , 海拔 1050 米 , 年 平均 气温 20 ℃ , 年 降水 量 1200 毫米 , 适宜 种植 水稻 、 玉米 等 农 作物 。
+有 耕地 336 亩 , 其中 人均 耕地 6 . 34 亩 ; 有 林地 621 亩 。
+全村 辖 1 个 村民 小组 , 有 农户 14 户 , 有 乡村 人口 53 人 , 其中 农业 人口 53 人 , 劳动 力 24 人 , 其中 从事 第 一 产业 人数 24 人 。
+2009 年 全村 经济 总 收入 9 . 02 万元 , 农民 人均 纯 收入 877 元 。
+该 村 农民 收入 主要 以 花生 为主 。
+自然 资源 0 全村 有 耕地 总 面积 336 亩 ( 其中 : 田 12 亩 , 地 324 亩 ) , 人均 耕地 6 . 3 亩 , 主要 种植 粮食 等 作物 ; 拥有 林地 621 亩 , 其中 经济 林 果 地 100 亩 , 人均 经济 林果 地 1 . 89 亩 , 主要 种植 橡胶 、 紫胶 等 经济 林果 ; 水面 面积 0 亩 , 其中 养殖 面积 0 亩 ; 草地 0 亩 ; 荒山 荒地 1950 亩 , 其他 面积 143 . 75 亩 。
+基础 设施 0 该 村 截止 2009 年底 , 已 实现 通水 、 通电 、 通路 , 无 路灯 。
+全村 有 14 户 通 自来水 , 有 0 户 饮用 井水 , 有 0 户 还 存在 饮水 困难 或 水质 未 达标 ( 占 农户 总数 的 0 % ) 。
+有 14 户 通电 , 有 0 户 通 有 线 电视 , 拥有 电视 机 农户 13 户 ( 分别 占 农户 总数 的 100 % 、 0 % 和 93 % ) ; 安装 固定 电话 或 拥有 移动 电话 的 农户 数 14 户 , 其中 拥有 移动 电话 农户 数 7 户 ( 分别 占 总数 的 29 % 和 50 % ) 。
+该 村 到 乡 已 通路 ; 进村 道路 为 土路 路面 ; 村内 主干 道 均 为 未 硬化 的 路面 ; 距离 最近 的 车站 30 公里 , 距离 最近 的 集贸 市场 30 公里 。
+全村 共 拥有 汽车 0 辆 , 拖拉 机 1 辆 , 摩托 车 7 辆 。
+全村 建 有 沼气 池 农户 10 户 ; 装有 太阳 能 农户 8 户 ; 建 有 小 水窖 0 口 ; 已 完成 “ 一 池 三改 ” 的 农户 0 户 。
+耕地 有效 灌溉 面积 为 0 亩 , 有效 灌溉 率 为 0 % , 其中 有 高稳产 农田 地 面积 41 亩 , 人均 高稳产 农田 地 面积 0 . 77 亩 。
+该 村 到 2009 年底 , 农户 住房 以 土木 结构 住房 为主 , 其中 有 0 户 居住 砖混 结构 住房 ; 有 0 户 居住 砖木 结构 住房 ; 有 14 户 居住 于 土木 结构 住房 , 还有 0 户 居住 于 其他 结构 的 住房 。
+中文 名称 马 六塘 自然 村 行政 区 类别 自然 村 所 属 地区 鱼塘 彝族 乡 挖 鲁 行政 村 面积 2 . 07 亩
+幸福 公开 课 《 幸福 公开 课 》 以 心理 学 最 前沿 的 研究 成果 为 依据 , 围绕 “ 幸福 理解 ” “ 幸福 养生 ” “ 幸福 修炼 ” “ 劳动 创造 幸福 ” 等 命题 做 了 深入 浅出 的 阐述 。
+0 < b > 内容 介绍 < / b > < b > < / b > 《 幸福 公开 课 》 以 心理 学 最 前沿 的 研究 成果 为 依据 , 围绕 “ 幸福 理解 ” “ 幸福 养生 ” “ 幸福 修炼 ” “ 劳动 创造 幸福 ” 等 命题 做 了 深入 浅出 的 阐述 。
+同时 , 《 幸福 公开 课 》 还 特意 安排 了 一些 有 趣 的 小 专栏 , 如 “ 幸福 百 昧 ” “ 幸福 测 测 看 ” “ 幸福 养生 小 妙方 ” 等 , 既 有 理论 的 援引 , 也有 操作 的 方法 , 力求 能 将 “ 有 益 ” 与 “ 有效 ” 成功 地 结合 起来 。
+幸福 是 一 种 彻悟 , 而 将 幸福 植入 心中 、 与 它 终身 同行 , 则 需要 一定 的 技巧 。
+幸福 , 对 儿童 来讲 , 是 一 种 本能 ; 对 成人 来讲 , 则 是 一 种 技能 。
+IS BN 97870 40356373 页数 214 定价 28 . 00 元 出版 时间 2012 - 5
+Pro / ENGINEER Wild fire 4 . 0 中文 版 标准 实例 教程 《 Pro / ENGINEERWildfire 4 . 0 中文 版 标准 实例 教程 》 是 2009 年 机械 工业 出版 社 出版 的 图书 。
+0 < b > 内容 介绍 < / b > < b > < / b > 《 Pro / ENGINEERWildfire 4 . 0 中文 版 标准 实例 教程 》 一 书 分 为 11 章 , 第 1 章 介绍 Pro / Engineer 4 . 0 入门 ; 第 2 章 介绍 基本 操作 : 第 3 章 介绍 草图 绘制 : 第 4 章 介绍 标准 特征 ; 第 5 章 主要 介绍 基础 特征 设计 ; 第 6 章 主要 介绍 工程 特征 设计 : 第 7 章 介绍 高级 特征 改 计 ; 第 8 章 介绍 实体 特征 操作 ; 第 9 章 介绍 曲面 设计 ; 第 10 章 介绍 装配 设计 ; 第 11 章 介绍 2D 工程 图 。
+章节 的 安排 次序 采用 由 浅 入 深 、 前后 呼应 的 原则 。
+《 Pro / ENGINEERWildfire 4 . 0 中文 版 标准 实例 教程 》 一 书 除 利用 传统 的 纸面 讲解 外 , 随 书 配送 了 多 功能 学习 光盘 。
+光盘 中 包含 全书 讲解 实例 和 练习 实例 的 源 文件 索 材 以及 教师 教学 使用 的 PowerPoint 电子 教案 。
+并 制作 了 全部 实例 动态 语音 讲解 同步 AVI 文件 。
+利用 作者 精心 驶 计 的 多 媒体 界面 , 读者 可以 随心 所 欲 , 如 看 电影 一 样 轻松 愉悦 地 学习 。
+《 Pro / ENGINEERWildfire 4 . 0 中文 版 标准 实例 教程 》 一 书 突出 了 基础 性 以及 实用 性 , 使 学习 者 可以 很快 地 掌握 Pro / ENGINEER 中 的 知识 点 和 技巧 , 适合 广大 技术 人员 和 机械 工程 专业 的 学生 学习 使用 , 也 可以 作为 各 大 中专 学校 的 教学 参考 书 。
+IS BN 9787111255611 页数 306 定价 38 . 00 元 出版 时间 2009 - 1
+韩 范 换装 《 韩 范 换装 》 是 一款 装扮 小 游戏 , 游戏 大小 为 540 K 。
+背景 设定 0 姑娘 喜欢 逛街 , 最 喜欢 韩 范 的 衣服 , 这家 韩 范 店 能否 搭配 出 她 满意 的 衣服 呢 ?
+快 来 推荐 一 下 。
+操作 指南 0 鼠标 拖动 衣服 到 人物 身 上 换装 。
+游戏 目标 0 给 女孩 换装 。
+中文 名 韩 范 换装 类型 装扮 类 主体 姑娘 游戏 大小 540 K
+Drive Clone 一款 硬盘 备份 克隆 工具 . 可以 备份 或者 克隆 整个 硬盘 或者 分区 . 它 可以 创建 一 个 压缩 归档 文件 以 包含 所有 存储 在 硬盘 或者 分区 中 的 文件 、 分区 信息 和 安全 信息 . 0 < b > 运行 环境 < / b > < b > < / b > 支持 Win9x / Me / NT / 2000 / XP / 2003 软件 名称 Drive Clone 软件 平台 pc 软件 版本 5 . 1 软件 语言 英语 软件 大小 86 . 21MB 软件 授权 免费
+天然 腐女 游 古代 《 天然 腐女 游 古代 》 是 断桥 、 雨 樱 创作 的 网络 小说 , 发表 于 晋江 文学 网 。
+作者 0 断桥 、 雨 樱 作品 简介 0 她 , 铭 , 14 岁 , 要 身材 没 身材 , 要 长相 没 长相 , 扎 在 人海 堆 都 看 不 见 , 唯独 有 一 份 ‘ 天大 地大 , 没 有 我 大 ’ 的 勇气 , 为了 抓住 一 个 ‘ 该死 ’ 的 同学 , 曾经 怒 闯 男 厕所 。
+最 大 的 喜好 是 读 BL 小说 、 看 BL 漫画 , 然后 再 看 几 场 BL 真人 H , 把 班 里 的 学生 带 成 支持 BL 的 腐 男 腐女 ( 喂 喂 !
+) 。
+好 吧 , 饶 了 她 吧 , 她 一 没 招 他们 , 二 没 惹 他们 , 天下 好 女人 多 的 是 , 干 啥 还 非得 在 她 这 颗 小 歪 树 上 吊死 啊 , 而且 还是 不 少 人 自愿 吊死 !
+拜托 !
+你们 去 搞 BL 吧 , 不 要 再 缠 着 我 了 , 我 不 支持 BG 啊 混蛋 !
+而且 还是 被 那个 一 脸 阳光 样 内心 是 腹黑 健 气 攻 的 混蛋 XXOO 了 !
+老娘 是 攻 啊 老娘 是 攻 !
+你 丫 的 , 你 叫 炎 是 吧 ?
+老娘 跟 你 杠 上 了 !
+赌 上 我 S 的 名誉 !
+我 管 你 什么 皇亲 贵族 、 医圣 药 仙 , 我 要 不 把 你 教 好 , 我 就 把 宇 灭 了 !
+( 宇 : 跟 我 有 什么 关系 啊 喂 !
+) 中文 名称 天然 腐女 游 古代 作者 断桥 、 雨 樱 连载 平台 晋江 文学 网
+成帝 刘 骜 皇后 许氏 汉 成帝 皇后 许氏 0 许 娥 ( ?
+— 前 8 年 ) , 昌邑 ( 今 山东 金乡 ) 人 , 许 平君 的 侄女 。
+汉 成帝 第 一 位 皇后 。
+汉 元帝 为了 补偿 早年 丧 母 之 痛 , 将 自己 的 表妹 许氏 许配 给 太子 刘 骜 为 妃 。
+她 在 刘 骜 登基 后 成为 皇后 。
+出身 名门 0 许 皇后 出身 名门 , 色 艺 俱佳 , 犹 擅 文章 , < b > 致使 十数 年间 < / b > < b > 汉 成帝 < / b > < b > 专 宠 皇后 < / b > < b > , 其他 嫔妃 难得 临幸 。
+但 其 年长 之后 , 色 衰 爱 弛 , 所 生 子女 又 皆 早夭 。
+< / b > 赵 飞燕 姐妹 入 宫 后 许 皇后 地位 更加 不 稳固 , 鸿 嘉 三年 ( 前 18 年 ) , 赵 飞燕 诬陷 许 皇后 和 她 的 姐姐 侯 夫人 < b > 许 谒 < / b > < b > 及 < / b > 班 婕妤 以 巫蛊 诅咒 宫中 怀孕 的 宫女 和 大 将军 王 凤 , 结果 许 谒 处斩 , 许 皇后 被 废 , 囚禁 于 昭台 宫 。
+许氏 家族 被 遣返 老家 昌邑 。
+姐姐 许 靡 0 许 皇后 的 另 一 个 姐姐 < b > 许 靡 < / b > < b > , 其 丈夫 < / b > 韩 宝 去世 后 , 在家 寡居 。
+与 王 太后 的 侄儿 、 宫廷 卫尉 、 侍中 淳于 长 私通 。
+之后 , 将 许 靡 娶 来 作 妾 。
+淳于 长 挥霍 无 度 , 便 通过 许 靡 向 许 皇后 表示 , 只要 有 钱财 , 他 有 办法 让 许 皇后 复出 当 “ < b > 左 皇后 < / b > < b > ” 。
+许 皇后 竟 信 以为 真 , 陆陆 续 续 送 去 千余两 银子 , < / b > < b > 并 表示 只要 能 做 个 < / b > < b > 婕妤 < / b > < b > , 便 已 足矣 < / b > < b > 。
+但 淳于 长 并 没 做 任何 努力 , 反而 和 许 靡 一 起 嘲笑 许 皇后 异想 天开 。
+< / b > 许 皇后 服毒 自杀 0 绥和 元年 ( 公元 前 8 年 ) , 当权 的 王 根 已 老 , 淳于 长 和 王 莽 暗自 竞争 。
+王 莽 去 王 根 病榻 前 挑拨 , 说出 了 淳于 长 、 许 靡 、 许 皇后 之间 的 事 。
+王 根 大怒 , 禀报 王 太后 。
+王 太后 又 报 成帝 。
+成帝 下令 将 淳于 长 免职 , 遣送 回 封 邑 定陵 ( 今 河南 舞阳 ) 。
+淳于 长 将 自己 的 车骑 并 一批 珠宝 送给 王 立 的 儿子 王 融 ( 淳于 长 是 王 立 的 表弟 ) , 求 王 融 说情 。
+王 融 找 父亲 王 立 , 王 立 则 向 成帝 上 奏 , 希望 成帝 收回 成命 。
+成帝 就此 起疑 , 展开 调查 , 不 久 真相 大白 , 下令 逮捕 王 融 , 王 立 逼迫 王 融 服毒 自杀 。
+成帝 大 惊 , 下令 逮捕 淳于 长 。
+淳于 长 当时 已 逃 至 洛阳 , 就 在 洛阳 被捕 , 囚 入 洛阳 监狱 。
+淳于 长 就 在 洛阳 监狱 中 被 绞死 , 家属 放逐 到 合浦 ( 今 广西 合浦 ) 。
+同时 , 成帝 派 廷尉 孔 光 持 诏 前往 冷宫 。
+许 皇后 服毒 自杀 。
+史料 记载 0 《 汉书 》 卷 九十七 孝 成 许 皇后 , 大 司马 车骑 将军 平恩 侯 嘉 女 也 。
+元帝 悼 伤 母 恭 哀 后 居 位 日 浅 而 遭 霍氏 之 辜 , 故 选 嘉 女 以 配 皇 太子 。
+初 入 太 了 家 , 上 令 中常 侍 黄门 亲近 者 侍 送 , 还 白 太子 欢 说 状 , 元帝 喜 谓 左右 : “ 酌酒 贺 我 !
+” 左右 皆 称 万岁 。
+久 之 , 有 一 男 , 失 之 。
+乃 成帝 即位 , 立 许 妃 为 皇后 , 复生 一 女 , 失 之 。
+初 , 后父 嘉 自 元帝 时 为 大 司马 车骑 将军 辅政 , 已 八九 年 矣 。
+及 成帝 立 , 复 以 元 舅 阳平 侯 王 凤 为 大 司马 、 大 将军 , 与 嘉 并 。
+杜 钦 以为 故事 后父 重于 帝 舅 , 乃 说 凤 曰 : “ 车骑 将军 至 贵 , 将军 宜 尊 之 敬 之 , 无 失 其意 。
+盖 轻 细微 眇 之 渐 , 必 生 乖 忤 之 患 , 不 可 不 慎 。
+卫 将军 之 日盛 于 盖 侯 , 近世 之 事 , 语 尚 在于 长老 之 耳 , 唯 将军 察 焉 。
+” 久 之 , 上 欲 专 委任 凤 , 乃 策 嘉 曰 : “ 将军 家 重 身 尊 , 不 宜 以 吏 职 自 累 。
+赐 黄金 二 百斤 , 以 特 进 侯 就 朝 位 。
+” 后 岁 余 薨 , 谥 曰 恭 侯 。
+后 聪慧 , 善 史书 , 自 为 妃 至 即位 , 常 宠 于 上 , 后宫 希得 进见 。
+皇 太后 及 帝 诸 舅 忧 上 无 继嗣 , 时 又 数 有 灾异 , 刘向 、 谷 永 等 皆 陈 其 咎 在于 后宫 。
+上 然 其言 , 于是 省 减 椒 房 掖 廷 用度 。
+皇后 及 上 疏 曰 : 妾 夸布 服 粝 粮 , 加以 幼稚 愚 惑 , 不 明 义理 , 幸 得 免 离 茅屋 之 下 , 备 后宫 扫除 。
+蒙 过误 之 宠 , 居 非命 所当托 , 污秽 不 修 , 旷职 尸 官 , 数 逆 至 法 , 逾越 制度 , 当 伏 放流 之 诛 , 不 足以 塞责 。
+乃 壬寅 日 大 长 秋 受 诏 : “ 椒 房 仪 法 , 御 服 舆 驾 , 所 发 诸 官署 , 及 所 造作 , 遗 赐 外家 群 臣 妾 , 皆 如 竟 宁 以前 故事 。
+” 妾 伏 自 念 , 入 椒 房 以来 , 遗 赐 外家 未尝 逾 故事 , 每 辄 决 上 , 可 复 问 也 。
+今 诚 时世 异 制 , 长短 相 补 , 不 出 汉 制 而已 , 纤 微 之间 , 未必 可 同 。
+若 竟 宁 前 与 黄龙 前 , 岂 相 放 哉 ?
+家 吏 不 晓 , 今 一 受 诏 如此 , 且 使 妾 摇手 不 得 。
+今 言 无 得 发 取 诸 官 , 殆 谓 未央 官 不 属 妾 , 不 宜 独 取 也 。
+言 妾 家府 亦 不 当 得 , 妾 窃 惑 焉 。
+幸 得 赐 汤沐 邑 以 自 奉养 , 亦 小 发 取 其中 , 何 害 于 谊 而 不 可 哉 ?
+又 诏书 言 服 御 所 造 , 皆 如 竟 宁 前 , 吏 诚 不 能 揆 其意 , 即 且 令 妾 被服 所 为 不 得 不 如 前 。
+设 妾 欲 作 某 屏风 张 于 某 所 , 曰 故事 无 有 , 或 不 能 得 , 则 必 绳 妾 以 诏书 矣 。
+此 二 事 诚 不 可行 , 唯 陛下 省察 。
+宦 吏 忮 佷 , 必 欲 自 胜 。
+幸 妾 尚 贵 时 , 犹 以 不 急事 操 人 , 况 今 日日 益 侵 , 又 获 此 诏 , 其 操 约 人 , 岂 有 所 诉 ?
+陛下 见 妾 在 椒 房 , 终 不 肯 给 妾 纤 微 内 邪 ?
+若 不 私 府 小 取 , 将 安 所 仰 乎 ?
+旧故 , 中官 乃 私 夺 左右 之 贱 缯 , 乃 发 乘舆 服 缯 , 言 为 待 诏 补 , 已而 贸易 其中 。
+左右 多 窃 怨 者 , 甚 耻 为 之 。
+又 故事 以 特 牛 祠大 父母 , 戴 侯 、 敬 侯 皆得 蒙 恩 以太 牢 祠 , 今 当 率 如 故事 , 唯 陛下 哀 之 !
+今 吏 甫 受 诏 读 记 , 直 豫 言 使 后 知 之 , 非 可 复 若 私 府 有 所 取 也 。
+其 萌 牙 所以 约制 妾 者 , 恐 失 人 理 。
+今 但 损 车驾 , 及 毋 若 未央 官 有 所 发 , 遗 赐 衣服 如 故事 , 则 可 矣 。
+其 余 诚 太 迫 急 , 奈何 ?
+妾 薄命 , 端 遇 竟 宁 前 , 竟 宁 前 于 今世 而 比 之 , 岂可 邪 ?
+故 时 酒肉 有 所 赐 外家 , 辄 上 表 乃 决 。
+又 故 杜 陵 梁 美人 岁时 遗 酒 一石 , 肉 百斤 耳 。
+妾 甚少 之 , 遗 田 八子 诚 不 可 若 是 。
+事 率 众多 , 不 可 胜 以 文 陈 。
+俟 自 见 , 索 言 之 , 唯 陛下 深 察 焉 !
+上 于是 采 刘向 、 谷 永之 言 以 报 曰 : 皇帝 向 皇后 , 所 言 事 闻 之 。
+夫 日 者 众 阳 之 宗 , 天光 之 贵 , 王者 之 象 , 人 君 之 位 也 。
+夫 以 阴 而 侵 阳 , 亏 其 正体 , 是非 下 陵 上 , 妻 乘 夫 , 贱 逾 贵 之 变 与 ?
+春秋 二百四十二 年 , 变异 为 众 , 莫若 日蚀 大 。
+自 汉兴 , 日蚀 亦 为 吕 、 霍 之 属 见 。
+以 今 揆 之 , 岂 有 此 等 之 效 与 ?
+诸侯 拘 迫 汉 制 , 牧 相 执 持 之 也 , 又 安 获 齐 、 赵 七国 之 难 ?
+将 相 大臣 怀 诚 秉 忠 , 唯 义 是 从 , 又 恶 有 上官 、 博 陆 、 宣 成 之 谋 ?
+若 乃 徒步 豪 桀 , 非 有 陈 胜 、 项 梁 之群 也 ; 匈奴 、 夷狄 , 非 有 冒顿 、 郅 支之 伦 也 。
+方外 内乡 , 百 蛮 宾服 , 殊 俗 慕 义 , 八 州 怀 德 , 虽 使 其 怀 挟 邪 意 , 狄 不 足 忧 , 又 况 其 无 乎 ?
+求 于 夷狄 无 有 , 求 于 臣 下 无 有 , 微 后 官 也 当 , 何以 塞 之 ?
+日 者 , 建始 元年 正月 , 白 气 出于 营 室 。
+营 室 者 , 天子 之后 官 也 。
+正月 于 《 尚书 》 为 皇 极 。
+皇 极 者 , 王 气之 极 也 。
+白 者 西方 之 气 , 其 于 春 当 废 。
+今 正 于 皇 极 之 月 , 兴 废气 于 后宫 , 视 后 妾 无 能 怀 任 保全 者 , 以 著 继嗣 之 微 , 贱人 将 起 也 。
+至 其 九月 , 流星 如 瓜 , 出于 文昌 , 贯 紫 宫 , 尾 委曲 如龙 , 临 于 钩 陈 , 此 又 章 显 前 尤 , 著 在内 也 。
+其后 则 有 北 宫 井溢 , 南 流 逆 理 , 数 郡 水 出 , 流 杀 人民 。
+后 则 讹 言传 相 惊 震 , 女童 入 殿 , 咸 莫 觉知 。
+夫 河 者 水 阴 , 四 渎 之 长 , 今 乃 大 决 , 没 漂 陵 邑 , 斯 昭 阴 盛 盈 溢 , 违 经 绝 纪 之 应 也 。
+乃 昔 之 月 , 鼠 巢 于 树 , 野 鹊 变色 。
+五月 庚子 , 鸟 焚 其 巢 太山 之 域 。
+《 易 》 曰 : “ 鸟 焚 其 巢 , 旅人 先 笑 后 号 啕 。
+丧 牛 于 易 , 凶 。
+” 言 王者 处 民 上 , 如 鸟 之 处 巢 也 , 不 顾恤 百姓 , 百姓 畔 而去 之 , 若 鸟 之 自焚 也 , 虽 先 快意 说笑 , 其后 必 号 而 无 及 也 。
+百姓 丧 其 君 , 若 牛 亡 其 毛 也 , 故称 凶 。
+泰山 , 王者 易 姓 告 代 之 处 , 今 正 于 岱宗 之 山 , 甚 可 惧 也 。
+三月 癸未 , 大风 自西 摇 祖宗 寝 庙 , 扬 裂 帷 席 , 折 拔 树木 , 顿 僵 车 辇 , 毁坏 槛 屋 , 灾 及 宗庙 , 足 为 寒心 !
+四月 己亥 , 日蚀 东 井 , 转 旅 且 索 , 与 既 无 异 。
+己 犹 戊 也 , 亥 复 水 也 , 明 阴 盛 , 咎 在内 。
+于 戊 己 , 亏 君 体 , 著 绝世 于 皇 极 , 显 祸 败 及 京都 。
+于 东 井 , 变 怪 众 备 , 末 重 益 大 , 来数 益 甚 。
+成形 之 祸 月 以 迫切 , 不 救 之 患 日 寝 屡 深 , 咎 败 灼灼 若此 , 岂 可以 忽 哉 !
+《 书 》 云 : “ 高宗 肜 日 , 粤 有 雊 雉 。
+祖 己 曰 : ‘ 惟 先 假 王 正 厥 事 。
+’ ” 又曰 : “ 虽 休 勿 休 , 惟 敬 五刑 , 以 成 三德 。
+” 即 饬 椒 房 及 掖庭 耳 。
+今 皇后 有 所 疑 , 便 不便 , 其 条 刺 , 使 大 长 秋 来 白 之 。
+吏 拘 于 法 , 亦 安 足 过 ?
+盖 矫 枉 者 过 直 , 古今 同 之 。
+且 财 币 之 省 , 特 牛 之 祠 , 其 于 皇后 , 所以 扶助 德美 , 为 华 宠 也 。
+咎 根 不 除 , 灾变 相 袭 , 祖宗 且 不 血 食 , 何 戴 侯 也 !
+传 不 云 乎 !
+“ 以 纳 失 之 者 鲜 。
+” 审 皇后 欲 从 其 奢 与 ?
+朕 亦 当 法 孝 武 皇帝 也 , 如此 则 甘泉 、 建 章 可 复兴 矣 。
+世俗 岁 殊 , 时 变 日化 , 遭事 制宜 , 因 时而 移 , 旧 之 非 者 , 何 可放 焉 !
+郡 子 之 道 , 乐 因循 而 重 改作 。
+昔 鲁 人 为 长 府 , 闵 子骞 曰 : “ 仍 旧贯 如 之 何 ?
+何必 改作 !
+” 盖 恶 之 也 。
+《 诗 》 云 : “ 虽无 老 成人 , 尚有 典 刑 , 曾 是 莫 听 , 大命 以 倾 。
+” 孝文 皇帝 , 朕 之 师 也 。
+皇 太后 , 皇后 成法 也 。
+假使 太后 在 彼时 不 如 职 , 今 见 亲 厚 , 又 恶 可以 逾 乎 !
+皇后 其 刻 心 秉 德 , 毋 违 先后 之 制度 , 力 谊 勉 行 , 称 顺 妇道 , 减 省 群 事 , 谦 约 为 右 , 其 孝 东宫 , 毋 厥 朔望 , 推诚 永 究 , 爰 何 不 臧 !
+养 名 显 行 , 以 息 众 讠 雚 , 垂 则 列 妾 , 使 有 法 焉 。
+皇后 深 惟 毋 忽 !
+是 时 , 大 将军 凤 用事 , 威权 尤 盛 。
+其后 , 比 三年 日蚀 , 言 事 者 颇 归咎 于 凤 矣 。
+而 谷 永 等 遂 著 之 许氏 , 许氏 自知 为 凤 所 不 佑 。
+久 之 , 皇后 宠 亦 益 衰 , 而 后宫 多 新 爱 。
+后 姊 平安 刚 侯 夫人 谒 等 为 媚 道 祝 诅 后宫 有 身 者 王 美人 及 凤 等 , 事 发觉 , 太后 大怒 , 下 吏 考问 , 谒 等 诛 死 , 许 后坐 废 处 昭 台 宫 , 亲属 皆 归 故 郡 山阳 , 后 弟子 平恩 侯 旦 就 国 。
+凡 立 十四 年 而 废 , 在 昭 台 岁 余 , 还 徙 长 定 宫 。
+后 九 年 , 上 怜 许氏 , 下 诏 曰 : “ 盖 闻 仁 不 遗 远 , 谊 不 忘 亲 。
+前 平安 刚 侯 夫人 谒 坐 大逆罪 , 家属 幸 蒙 赦 令 , 归 故 郡 。
+朕 惟 平恩 戴 侯 , 先 帝 外 祖 , 魂 神 废弃 , 莫 奉 祭祀 , 念 之 未尝 忘 于心 。
+其 还 平恩 侯 旦 及 亲属 在 山阳 郡 者 。
+” 是 岁 , 废后 败 。
+先 是 , 废后 姊 孊 寡居 , 与 定陵 侯 淳于 长 私通 , 因为 之 小 妻 。
+长 绐 之 曰 : “ 我 能 白 东宫 , 复 立 许 后 为 左 皇后 。
+” 废后 因 孊 私 赂 遗 长 , 数 通 书记 相 报 谢 。
+长 书 有 悖 谩 , 发觉 , 天子 使 廷尉 孔 光 持节 赐 废后 药 , 自杀 , 葬 延陵 交道 厩 西 。
+影视 作品 0 《 汉宫 飞燕 》 : 戴 春荣 饰 《 母 仪 天下 》 : 练 束 梅 饰
+重庆 市 第 二十六 中学 重庆 市 第 26 中学 创办 于 1953 年 , 有 近 半 个 世纪 的 办学 历史 。
+学校 地处 西南 最 大 商品 集散 地 —— 朝天 门 、 望 龙门 地区 , 是 下半 城 该 地区 唯一 一 所 中学 , 位于 朝天 门 码头 —— 菜园坝 火车 站 的 主干 道 上 , 交通 便利 ; 与 市 第 一 人民 医院 、 望 龙门 派出所 毗邻 , 社会 治安 良好 , 有 很好 育人 环境 。
+既 有 利于 该 地区 小学 毕业 生 的 入学 , 又 为 外地 来 朝天 门 地区 工作 的 家长 们 解除 了 子女 就学 的 后顾 之 忧 。
+学校 设施 0 学校 现有 办公 楼 、 实验 楼 , 教学 楼 。
+学 校内 教学 设施 完备 , 教学 仪器 先进 , 一 流 的 电 教室 、 微机 室 、 语音 室 、 理化 生 实验 室 的 设备 全部 按 国家 一类 标准 配置 , 为 学生 提供 了 动手 动脑 的 天地 ; 多 媒体 教学 室 也 为 师生 提供 了 崭新 的 教学 舞台 。
+在 教委 大力 支持 下 , 九月 我 校 将 落成 一 栋 设施 一 流 的 综合 教学 楼 , 现代 化 的 电视 教学 网络 和 崭新 的 塑胶 操场 , 将 为 学生 的 全面 发展 创造 更 优越 的 条件 . 学校 校风 良好 , 学生 举止 文明 、 规范 , 是 重庆 市 文明 单位 ; 重庆 市 首批 文明 礼仪 示范 学校 ; 市 、 区 德育 先进 集体 ; 地区 综合 治理 模范 单位 ; 渝中 区 法制 教育 示范 学校 。
+并 与 交警 队 、 消防 队 成立 了 警民 、 军民 共建 单位 , 保证 了 学生 有 一 个 良好 而 有 序 的 周边 环境 。
+师资 队伍 0 重庆 26 中 拥有 一支 实力 雄厚 的 师资 队伍 , 教师 75 人 , 大学 本科 学历 占 66 % , 中 高级 教师 40 人 , 占 55 % 。
+他们 中 有 从教 30 余年 , 经验 丰富 的 老 教师 ; 也有 年富 力强 、 风华 正茂 的 青年 教师 。
+更有 重庆 市 优秀 教师 、 师德 楷模 , 区 “ 十佳 ” 班主任 、 区 优秀 教师 、 区 百佳 优秀 青年 教师 、 区 优质 课 竞赛 一 、 二 、 三 等 奖 获得 者 。
+学校 现有 16 个 教学 班 , 600 多名 学生 。
+教师 们 精诚 团结 、 勇于 创新 、 奋力 拼搏 、 乐于 奉献 , 教学 质量 高 。
+曾 连续 三年 获 渝中 区 教学 质量 综合 奖 , 并 获 渝中 区 办学 水平 一等奖 。
+学生 年年 都 有 数十 人 考 上 市属 重点 高中 ( 除 保送 市属 重点 高中 的 名额 以外 ) 。
+在 狠抓 教学 质量 的 同时 , 学校 也 注意 对 学生 素质 的 培养 , 素质 教育 也 结 出 了 硕果 , 曹 庭 、 肖 禾 同学 获得 重庆 市 航模 比赛 一等奖 ; 还有 学生 获得 区 中学生 汉字 录入 比赛 三 等 奖 ; 全国 少儿 书画 大赛 银奖 ; 在 区 运 会 获得 了 多项 名次 … … 四十八 年 过去 了 , 弹指 一 挥 间 , 办学 质量 和 效益 的 提高 , 为 26 中 再创 辉煌 打 下 了 坚实 的 基础 。
+面对 21 世纪 的 挑战 , 我们 充满 着 前程 似锦 、 大有 可为 的 豪情 , 我们 更 具有 励精 图 治 、 再创 辉煌 的 信心 , 我们 相信 , 26 中 的 明天 一定 会 更 美好 !
+中文 名 重庆 市 第 二十六 中学 简称 二十六 中 所 属 地区 重庆 市 渝中 区
+新 自由 主义 全球 化 《 新 自由 主义 全球 化 : 资本 主义 危机 抑或 全球 美国 化 ? 》 分析 了 当今 自由 资本 主义 市场 主导 的 全球 化 给 世界 经济 、 社会 和 文化 所 带来 的 影响 , 尤其 是 给 非 发达 国家 以及 社会 底层 民众 带来 的 影响 。
+这些 影响 不仅 充斥 经济 领域 , 而且 在 世界 范围 内 渗透 到 了 教育 和 文化 领域 。
+作者 在 研究 中用 大量 的 数据 和 事实 , 以及 著名 学者 和 权威 机构 的 研究 成果 , 分析 了 对 全球 化 的 进程 、 方向 产生 重大 影响 的 政治 和 意识 形态 章程 , 以及 为了 实现 这些 章程 , 如何 运用 国际 和 国内 制度 、 机制 和 杠杆 。
+0 全球 化 从 哪里 来 ?
+要 带 我们 到 哪里 去 ?
+全球 化 的 实质 是 什么 ?
+它 有 哪些 弊端 和 陷阱 ?
+俄罗斯 著名 学者 、 教育 理论 家 谢马 · 阿马扎 斯波 维奇 · 坦 基扬 利用 自己 在 联合 国教 科文 组织 长达 13 年 担任 重要 职务 的 经历 , 掌握 和 查阅 了 大量 各 方面 的 研究 报告 和 权威 学者 的 著作 , 从 独到 的 视角 , 著 就 了 《 新 自由 主义 全球 化 》 一 书 。
+作者 坦 基扬 IS BN 9787504139573 页数 136 定价 17 . 00 元 出版 时间 2008 - 6
+明基 EW 2430 V 产品 特色 0 荣获 国际 iF 设计 大奖 , 拥有 “ 精 钢 侠 ” 美誉 的 24 英寸 LED 背光 液晶 显示 器 新品 BenQEW 2430 V , 采用 世界 领先 的 广 视角 VALED 黑 锐 丽 屏 液晶 面板 , 不仅 融入 集团 企业 AUO ( 友达 光电 ) 之 独家 AMVA 面板 技术 , 更 兼具 绿色 节能 的 先进 LED 背光 设计 , 以 特有 的 Smart Focus 智能 聚焦 视频 画面 功能 , 有效 抑 除 视频 背景 图文 , 令 网络 视频 的 观赏 体验 大 为 改善 。
+与此 同时 , BenQEW 2430 V 不仅 以 原 材料 展现 金属 之 美 , 且 拥有 11 个 接口 、 PIP 子母 画面 及 PBP 双 画面 并排 显示 等 一 系列 强大 功能 , 无 缝 整合 音 视频 娱乐 空间 , 甚至 配备 真正 8 bit 色彩 , 呈现 流畅 的 视觉 效果 , 为 您 在 家中 营造 完整 的 影院 体验 , 把 液晶 娱乐 引向 一 个 新 境界 。
+明基 EW 2430 V 专门 为 您 的 个性化 音 视频 娱乐 量 身 打造 !
+产品 特点 0 产品 特点 简洁 成就 精确 与 完美 1 简洁 的 钻石 切割 线条 透露 出 自然 美 与 金属 铬 纹理 。
+精湛 的 工艺 将 完美 铭记 于心 , 每 一根 线条 都 经过 精心 设计 , EW 2430 V 制造 功能 与 美丽 的 完美 和谐 。
+产品 特点 超凡 的 音 视频 体验 2 < b > 3000 : 1 超高 原生 对比 度 < / b > < b > < / b > 超高 原生 对比 度 可 改进 细节 表现 和 色彩 对比 , 产生 最 精美 真实 的 画面 。
+< b > 真正 8 bit 面板 呈现 < / b > < b > < / b > 真正 8 bit 色彩 可 产生 167 万 种 颜色 , 使 每 一 幅 画面 都 呈现 盎然 生机 和 完美 平衡 。
+< b > 最 大 程度 减少 漏光 , 无 懈 可 击 的 深邃 画面 < / b > < b > < / b > VA 面板 漏光 更 少 , 不 再 有 亮点 , 最 真实 地 呈现 每 一 幅 画面 , 美 得 令人 惊叹 。
+< b > 智能 聚焦 < / b > < b > < / b > 智能 聚焦 可 突出 显示 选定 的 窗口 或 区域 , 减少 其他 区域 的 干扰 , 让 用户 可以 将 注意 力 集中 在 主要 的 观看 内容 上 。
+产品 特点 专注 的 研究 , 卓越 的 性能 3 < b > 超高 分辨率 < / b > < b > < / b > 提高 画面 清晰 度 , 准备 捕捉 细节 并 增强 画质 。
+< b > 3D 降噪 < / b > < b > < / b > 自动 去除 来自 信号 源 的 可见 噪声 , 创造 高 质量 输出 。
+< b > 3D 去 隔 行 < / b > < b > < / b > 减少 HDMI 和 其他 分量 造成 的 交叉 干扰 闪烁 , 始终 如 一 地 提供 流畅 高 质量 视觉 呈现 。
+< b > PIP < / b > < b > PBP < / b > < b > < / b > 可以 从 2 路 信号 源 接收 信号 , 可 在 窗口 间 轻松 切换 , 让 娱乐 达到 极限 。
+< b > 24P 影院 视角 支持 < / b > < b > < / b > 支持 24P 视频 格式 , 产生 无 比 流畅 和 分毫 毕 现 的 观看 质量 , 完美 重建 场景 。
+产品 特点 专注 的 研究 , 卓越 的 性能 4 只需 插入 一 下 , EW 2430 就可 支持 所有 娱乐 功能 。
+2 个 HDMI 端口 : 支持 光盘 播放 机 、 PS 3 、 XBOX 以及 各种 游戏 设备 。
+4 个 USB 端口 : 用于 闪存 盘 、 USB 读 卡 器 和 任何 其他 便携 式 设备 。
+1 个 分量 端口 ( RCA 插孔 ) : 支持 DVD 播放 机 等 普通 家庭 娱乐 设备 。
+1 个 D - Sub / 1 个 DVI - D 接口 : 支持 模拟 信号 和 数字 信号 , 支持 台式 机 、 笔记 本 电脑 和 其他 输入 。
+产品 特点 Sensey e5reg ; 显 彩 技术 BenQSensey ereg ; 是 BenQ 为人 眼 在 视频 画面 中 的 真正 潜能 给出 的 答案 。
+它 可以 改进 色彩 、 提高 清晰 度 和 对比 度 。
+电影 模式 优化 深色 、 丰富 、 精致 的 画面 提供 完美 的 观看 效果 。
+荣获 奖项 0 · 天极 网 2011 年度 创新 显示 器 产品 奖 · 泡泡 网 技术 创新 奖 · PC Magazine 2011 年度 最 佳 · 微 电脑 世界 年度 产品 奖 · 数码 精品 世界 2011 年度 最 佳 产品 · ZOL 年度 卓越 产品 奖 基本 参数 0 < b > 屏幕 比例 < / b > < b > : 16 : 9 ( 宽屏 ) < / b > < b > < / b > < b > 最 大 分辨率 < / b > < b > : 1920 x 1080 < / b > < b > < / b > < b > 最 佳 分辨率 < / b > < b > : 1920 x 1080 < / b > < b > < / b > < b > 高清 标准 < / b > < b > : < / b > 1080 p ( 全 高清 ) < b > 面板 类型 < / b > < b > : < / b > MVA 面板 < b > 背光 类型 < / b > < b > : < / b > LED 背光 < b > 动态 对比 度 < / b > < b > : 2000 万 : 1 < / b > < b > < / b > < b > 静态 对比 度 < / b > < b > : 3000 : 1 < / b > < b > < / b > < b > 灰阶 响应 时间 < / b > < b > : 8ms < / b > 显示 参数 0 < b > 点距 < / b > < b > : 0 . 276 mm < / b > < b > < / b > < b > 亮度 < / b > < b > : 250 cd / ㎡ < / b > < b > < / b > < b > 可视 角度 < / b > < b > : 178 / 178 ° < / b > < b > < / b > < b > 显示 颜色 < / b > < b > : 16 . 7M < / b > 面板 控制 0 < b > 控制 方式 < / b > < b > : 按键 < / b > < b > < / b > < b > 语言 菜单 < / b > < b > : 英文 , 德语 , 法语 , 意大利 语 , 西班牙 语 , 俄语 , 葡萄牙 语 , 土耳其 语 , 简体 中文 < / b > 接口 0 < b > 视频 接口 < / b > < b > : D - Sub ( < / b > VGA ) , DVI - D , 2 × HDMI < b > 其它 接口 < / b > < b > : 4 × USB 2 . 0 < / b > 外观 设计 0 < b > 机身 颜色 < / b > < b > : 黑色 < / b > < b > 外观 设计 < / b > < b > : 简洁 的 钻石 切割 线条 透露 出 自然 美 与 金属 铬 纹理 。
+精湛 的 工艺 将 完美 铭记 于心 , 每 一根 线条 都 经过 精心 设计 < / b > < b > < / b > < b > 产品 尺寸 < / b > < b > : < / b > < b > < / b > 367 . 8 × 579 . 2 × 61 . 8mm ( 不 含 底座 ) 433 × 579 × 169 mm ( 包含 底座 ) < b > 产品 重量 < / b > < b > : < / b > < b > < / b > 5 . 7kg ( 净重 ) 8 . 0kg ( 毛重 ) < b > 底座 功能 < / b > < b > : 倾斜 < / b > < b > < / b > < b > 音箱 < / b > < b > : 内置 音箱 ( 2 × 2W ) < / b > 其它 0 < b > HDCP < / b > < b > : 支持 HDCP < / b > < b > 消耗 功率 < / b > < b > : 典型 : 65 W < / b > < b > < / b > < b > 待机 < / b > < b > : 1W < / b > < b > < / b > < b > 安 规 认证 < / b > < b > : CCC < / b > < b > < / b > < b > 其它 性能 < / b > < b > : Senseye 显 彩 科技 : Senseye 3 < / b > < b > < / b > < b > AMA 疾 彩 引擎 < / b > < b > : 是 < / b > < b > < / b > < b > 垂直 角度 调整 < / b > < b > : - 5 - 20 ° < / b > < b > < / b > < b > 其它 特点 < / b > < b > : < / b > < b > < / b > 支持 画中画 功能 具备 超 解像 3D 降噪 智能 集中 优化 3D 隔 行 扫描 < b > 上市 时间 < / b > < b > : 2011 年 01 月 < / b > 显示 器 附件 0 < b > 包装 清单 : < / b > < b > < / b > 显示 器 主机 × 1 底座 × 1 电源 线 × 1 VGA 线 × 1 三包 卡 × 1 光盘 × 1 快速 入门 指南 × 1 中文 名 明基 EW 2430 V 外文 名 BenQEW 2430 V 产品 类型 < a href = " # " > LED < / a > 显示 器 , 广 视角 显示 器 产品 定位 娱乐 影音 , 游戏 竞技 < a href = " # " > 屏幕 尺寸 < / a > 24 英寸 最 大 分辨率 1920 x1080
+定南 大华 新 材料 资源 有限 公司 定南 大华 新 材料 资源 有限 公司 ( 以下 简称 DAMR ) 成立 于 2004 年 , 2006 年 10 正式 投产 。
+定南 大华 简介 0 定南 大华 新 材料 资源 有限 公司 ( 以下 简称 DAMR ) 成立 于 2004 年 , 2006 年 10 正式 投产 。
+2008 年 7 月 , 在 当前 国家 稀土 产业 政策 的 影响 下 , 总 经理 魏 建中 先生 用 靠 大 做 强 的 做法 , 凭着 前 赡 性 的 眼光 与 中国 五矿 集团 、 赣县 红金 稀土 有限 公司 三 方 发起 设立 五矿 稀土 ( 赣州 ) 股份 有限 公司 , 公司 注册 资本 8 . 37 亿元 , 年 稀土 分离 能力 为 8500 吨 , 是 目前 国内 最 大 的 离子 型 稀土 分离 企业集团 。
+DMAR 位于 定南 县 太湖 工业 区 , 县城 东 环 路 , 占地 面积 158 亩 , 拥有 现代 化 厂房 和 各类 高 素质 的 管理 、 技术 和 生产 人员 360 多 人 。
+具有 完善 的 基础 设施 和一 流 的 生产 设备 , 厂区 环境 优美 , 各 车间 布局 合理 , 生产 能力 为 年 分离 3500 吨 南方 离子 型 稀土 矿 。
+产品 0DA MR 目前 生产 高纯 的 单一 稀土 氧化 物 及 共 沉 物 产品 , 80 % 以上 产品 的 纯度 大于 99 . 99 % , 其中 高纯 氧化 镧 、 高纯 氧化 钇 ( 纯度 ≥ 99 . 9999 % ) 为 公司 的 “ 拳头 ” 产品 。
+本 着 “ 强化 管理 、 质量 第 一 、 顾客 至上 、 创新 发展 ” 的 质量 方针 , 产品 的 质量 和 市场 信誉 越来 越 高 。
+DAMR 产品 销售 立足 于 国际 市场 , 已 形成 自已 独特 的 销售 网络 , 并 可以 应 客户 要求 研制 不 同 规格 的 新 产品 。
+产品 取得 了 客户 的 信赖 , 深受 客户 的 好评 。
+荣誉 02007 年 , 销售 额 达 3 亿元 , 上缴 税收 5000 余万 元 , 2008 年 销售 额 4 亿元 , 上缴 税收 6000 余万 元 。
+2006 年 评 为 “ 全市 发展 非公有制 经济 先进 企业 ” , 2007 年 评 为 赣州 市 消防 工作 先进 单位 、 B 级 纳税 信用 企业 、 金融 信用 单位 、 “ 五 个 十百亿 工程 ” 先进 企业 、 突出 贡献 企业 , 在 赣州 50 强 排名 第 37 名 , 并 获得 了 江西 省 高新 技术 企业 的 荣誉 称号 。
+产品 质量 、 经济 效益 在 江西 稀土 行业 中 处 双 领先 的 地位 。
+各项 技术 经济 指标 居 全市 行业 前列 。
+质量 、 环保 0DA MR 拥有 独特 的 排放 物 处理 技术 , 环保 处理 系统 于 2007 年 3 月 29 日 通过 省市 两 级 环保 验收 。
+环保 工艺 主要 是 将 所有 工业 污水 进行 处理 , 使 污水 中 含有 的 稀土 、 盐酸 、 氨 等 资源 的 98 % 以上 得以 回收 利用 , 既 达标 排放 又 节约 成本 。
+DAMR 分析 实验 室 拥有 先进 的 分析 检测 仪器 和 经验 丰富 的 检测 人员 , 可 进行 产品 全 方位 的 物化 性能 分析 , 目前 已 上报 申请 国家 二级 实验 室 资格 的 认证 。
+DAMR 自 创立 以来 , 始终 坚持 “ 质量 第 一 , 持续 发展 ” 的 经营 理念 , 以 优质 的 产品 占领 市场 , 以 优质 的 服务 争取 客户 。
+2007 年 8 月 通过 了 美国 UL 公司 ISO 9000 质量 体系 的 认证 , 2008 年 12 月 ISO 14000 环境 保护 体系 和 OH SAS 18000 职业 健康 安全 管理 体系 的 论证 。
+DAMR 将 以 不 断 开拓 的 精神 , 实现 对 客户 的 承诺 , 满足 并 超越 客户 的 要求 , 实现 企业 更新 的 腾飞 。
+
+湖 上 皇宫 湖 上 皇宫 建造 于 1746 年 , 位于 印度 乌代普 , 是 世界 上 最 奢华 的 酒店 之 一 。
+简介 0 背景 : 这栋 白色 宫殿 建造 于 1746 年 , 是 贾 哥 - 辛格 二世 在位 时 用 纯白 大理石 和 花岗岩 修造 的 夏季 行宫 。
+如今 , 它 已 成为 世界 上 最 奢华 的 酒店 之 一 , 是 印度 贵族 和 来自 全球 富豪 们 最 喜爱 的 宴会 与 派对 场所 。
+奇特 之 处 0 这 座 豪华 的 宫殿 坐落 于 皮丘 拉 湖 中心 的 一 个 小岛 上 。
+它 的 外 墙体 是 用 纯白 大理石 所 建造 , 这种 刺目 的 白色 与 周围 的 环境 形成 了 鲜明 的 对比 。
+这里 有 独特 的 大 拱门 、 精雕 细 琢 的 大理石 入口 , 以及 刻有 浮雕 的 屏风 和 用 壁画 装饰 的 天花板 。
+整 座 宫殿 只有 83 间 房间 , 间 间 可以 看 得到 湖景 。
+科利 奥 普洛斯 点评 道 , “ 一般 的 设计 理念 , 都 是 考虑 到 与 周边 事物 的 协调 。
+而 这栋 建筑 恰恰 相反 , 通体 耀眼 的 白色 大理石 让 它 有 如 鹤立 鸡 群 , 这 也 恰恰 体现 了 它 的 怪异 设计 风格 。
+” 中文 名 湖 上 皇宫 地点 印度 乌代普 建造 于 1746 年 称号 世界 上 最 奢华 的 酒店 之 一
+星际 之 门 之 亚特兰蒂斯 第 五 季 影片 讲述 一 组 由 科学 家 和 军人 组成 的 国际 探险 队伍 , 在 飞马 座 银河 星系 发现 了 一 种 星际 门 系统 , 进而 要 面对 一 种名 为 “ 幽灵 ” 的 威力 强大 的 新型 外星 异形 的 奇幻 冒险 经历 。
+基本 信息 0 【 原名 】 : Stargate . Season 5 【 译名 】 : 星际 之 门 第 五 季 【 演员 】 : Richard Dean Anderson - Colonel / Briga dierGeneral Jona thanJ . JackONe ill DonS . Davis - MajorGeneral GeorgeS . Hammond ( Seasons 1 - 7 ) Christopher Judge - Teal cCorinNe mec - Jona sQuinn ( Season 6 ) Michael Shanks - Dr . Daniel Jackson ( Seasons 1 - 5 , 7 , 8 ) AmandaT apping - Captain / Major / Lt . Colonel Samanth a Carter 【 片长 】 : 每 集 45 分钟 左右 【 首播 】 : 2008 年 剧情 简介 0 数百 万年 前 , 南极 洲 的 某 个 地方 , 一座 巨大 无 比 的 城市 腾空 而 起 , 带 着 眩目 的 闪光 向 浩瀚 的 星河 深处 飞去 , 气雾 腾腾 , 雪花 四溅 。
+《 星际 之 门 : 亚特兰蒂斯 》 是 《 星际 之 门 》 系列 电视 剧 的 第 三 部 作品 , 使用 现代 化 的 科幻 手段 来 重新 解释 和 演绎 广 为 流传 的 亚特兰蒂斯 ( 大 西 洲 ) 的 传说 这个 千古 之迷 。
+曾经 , 一 个 高度 发达 的 先进 种族 Ancient 以 地球 为 基地 , 几百万 年 前 , 不 知 什么 原因 , 他们 离开 地球 去 了 某地 , 在 南极 只 留 下 一 个 前哨 站 。
+亚特兰蒂斯 的 故事 紧接 着 《 星际 之 门 : SG - 1 》 的 第 七 季 , 在 JackONe ill 将 军用 前哨 站 的 一 种 Ancient 武器 消灭 前来 攻击 地区 的 Go auld 人 的 舰队 之后 , 各国 合作 建立 了 南极 联盟 , 为 研究 前哨 站里 的 Ancient 科技 、 促进 国际 合作 与 和平 而 共同 努力 。
+在 研究 过程 中 , 他们 发现 了 Ancient 路 开 地球 后 所 去 的 地方 —— 飞马 座 星系 。
+于是 , 为了 发掘 被 埋没 的 历史 、 探索 未知 的 世界 , 各国 组建 了 亚特兰蒂斯 探险 团队 , 通过 星际 之 门 到达 了 几百万 年 前 从 地球 飞走 的 亚特兰蒂斯 城 , 从而 揭开 了 传说 中 的 亚特兰蒂斯 之迷 。
+但 不幸 的 是 , 地球 人 的 到来 惊醒 了 一 群 统治 飞马 座 星系 的 怪物 —— Warith , 他们 技术 先进 、 体格 健壮 , 性情 粗野 残暴 , 不 讲 信义 , 而且 以 人类 为 牲畜 , 吸 人 生命 。
+一 万年 前 他们 打败 了 Ancient 并 迫使 其 逃 回 地球 , 而 探险 队 也 发现 了 一 个 惊人 的 事实 : Warith 是 Ancient 试验 的 产物 !
+为了 对抗 强大 的 Warith , 亚特兰蒂斯 的 小 分队 不 得 不 探索 飞马 座 星系 诸 星球 , 与 飞马 的 原 住 民 联系 , 试图 找到 抵御 Warith 的 方法 , 经历 了 一 次次 险象 环生 的 旅程 。
+《 星际 之 门 : 亚特兰蒂斯 》 当前 进入 第 三 季 , 在 以 牺牲 一艘 战舰 为 代价 之后 , 亚特兰蒂斯 城 的 人们 终于 阻止 了 Warith 入侵 地球 、 企图 把 地球 变成 一 个 大 屠宰 场 的 计划 , 但 他们 仍然 需要 继续 寻找 御 敌 之 术 , 但 就 在 寻找 的 过程 中 , 他们 不幸 又 碰到 了 一 种 更 先进 的 敌人 、 同样 也是 Ancient 实验 产物 的 Asurans , 与 Warith 一 样 , Asurans 也 向 亚特兰蒂斯 发起 了 猛烈 的 进攻 , 只 不过 , 这次 双方 力量 对比 更加 悬殊 , 更加 没 有 可比 性 , 亚特兰蒂斯 城 处于 危急 存亡 的 关头 . . . . . . . . . . . . . . . . . . . 本剧 画面 精美 , 特技 特效 颇有 水准 , 是 近年 来 收视 率 较高 的 美剧 之 一 。
+本剧 字幕 由 星际 之 门 中文 网 - SGT 字幕 组 真诚 为 您 制作 , 重 效率 、 重 质量 、 坚持 原创 字幕 是 我们 的 一贯 宗旨 , 星际 之 门 中文 网 是 所有 星 门 粉丝 的 共同 家园 , 热诚 欢迎 您 的 到来 。
+演员 表 0 参考 资料 中文 名 星际 之 门 : 亚特兰蒂斯 外文 名 Stargate : Atlant is 其它 译名 Ported & # 39 ; Atlant is , La 出品 时间 2004 年 出品 公司 米高梅 家庭 娱乐 公司 制片 地区 加拿大 / 美国 首播 时间 2008 - 07 - 11 ( 美国 ) 导演 Neill Fearnley 、 Martin Wood 编剧 Brad Wright 、 RobertC . Cooper 主演 杰森 · 莫 玛 Jason Momoa , 乔 · 弗拉尼甘 JoeFlanigan , 托 蕊 · 海 金 森 Torri Higginson , Rachel Luttrell , Rainbow Francks , 大卫 · 休 莱特 David Hew lett 集数 20 集 类型 动作 / 剧情 / 科幻 / 冒险 上映 时间 2008 年 在线 播放 平台 土豆
+上海 子 伟 物流 设备 有限 公司 上海 子 伟 物流 设备 有限 公司 于 2008 年 9 月 17 日 在 嘉定 区 市场 监管 局 登记 成立 。
+法定 代表 人 余 芸 付 , 公司 经营 范围 包括 自有 物流 设备 、 自有 托盘 租赁 ( 不 得 从事 金融 租赁 ) 等 。
+企业 信息 0 公司 类型 有限 责任 公司 登记 机关 嘉定 区 市场 监管 局 成立 时间 2008 年 9 月 17 日发 照 时间 2008 年 9 月 17 日
+玉 奴 的 天下 《 玉 奴 的 天下 》 是 连载 在 晋江 文学 城 的 小说 , 作者 是 闲庭 晚 雪 。
+小说 类型 0 综合 其它 内容 简介 0 她 是 三 个 男人 的 奴隶 。
+占据 她 身体 和 自由 的 , 她 用 美色 和 财富 煅 铸 利刃 。
+段 苍璧 , 她 的 主子 和 义父 , 非 死 不 可 ! 占据 她 前 半生 情感 的 , 她 用 馀 生 祭奠 他 的 亡灵 。
+北野 辰 飞 , 可 他 戴 着 面具 活着 , 爱 她 恨 她 骗 她 ! 当 真相 大白 , 她 还 爱 他 吗 ? 那个 曾经 因 她 而死 为 她 而生 的 王朝 新贵 ? 占据 她 残 命 残 情 的 , 她 拼 尽 全力 靠近 他 。
+叶 见琛 , 西域 之 王 , 亡国 王孙 , 当 他 愿意 用 半壁 江山 续 她 残 命 , 她 却 惨笑 , 说 , 地老 天荒 对 她 而言 , 终究 是 一 个 谎言 … … 中文 名 玉 奴 的 天下 作者 闲庭 晚 雪 小说 进度 连载 连载 网站 晋江 文学 城
+汤米 · 瓦尔 汤米 · 瓦尔 , 现 效力 于 乌德 勒 支 俱乐部 , 担任 球队 的 左 中场 位置 个人 简介 0 出生 地 : 澳大利亚 黄金 海岸 位置 : 左 中场 俱乐部 : 乌德 勒 支 职业 生涯 011 , 汤米 · 瓦尔 还是 一 个 只有 铁杆 澳大利亚 球迷 才 知道 的 名字 。
+不过 , 凭借 着 在 世界 杯 预选 赛 中 的 出色 表现 , 小伙子 已经 搭 上 了 去 往 南非 的 班机 。
+对于 平均 年龄 偏 大 的 澳大利亚 队 而言 , 以 瓦尔 为 代表 的 少数 新鲜 血液 将 是 球队 至关 重要 的 补充 。
+现年 18 岁 的 瓦尔 司 职 左 前卫 , 他 最 拿手 的 就 是 用 精湛 的 控 球 技艺 戏耍 可怜 的 防守 队员 。
+作为 传统 的 边路 球员 , 瓦尔 拥有 的 这 一 特性 显得 更加 弥 足 珍贵 , 同时 , 小伙子 也是 主帅 维尔 贝克 手头 为数 不 多 的 左脚 将 。
+在 澳大利亚 国家 队 的 现有 阵容 中 , 斯科特 · 齐 佩 菲尔德 以及 大卫 · 卡尼 将会 为 左 后卫 而 展开 激烈 竞争 ; 虽然 时常 扮演 影子 前锋 的 角色 , 但 老将 科威尔 还是 左 前卫 的 首选 ; 像 瓦尔 这样 朝气 蓬勃 的 小将 们 则 在 排 兵 布阵 时 , 丰富 了 主帅 维尔 贝克 的 选择 。
+在前 几 年 12 月满 18 岁 的 瓦尔 代表 澳大利亚 队 参加 了 09 年 的 埃及 U20 世青赛 。
+作为 星 工场 布里斯班 海啸 俱乐部 的 又 一 杰出 产品 , 瓦尔 在 三 场 比赛 中 均 以 主力 身份 亮相 。
+尽管 在 海啸 队 的 处子 秀 直到 09 年 3 月份 才 姗姗 来迟 , 但 瓦尔 依然 用 优异 的 表现 当选 年度 联赛 最 佳 新人 。
+08 年 1 月 , 澳大利亚 与 印度尼西亚 的 2011 年 亚洲 杯 预选 赛 中 , 瓦尔 首次 代表 国家 队 出场 就 赢得 了 教练 与 媒体 的 一致 好评 。
+就 连 向来 沉默 寡言 的 主帅 维尔 贝克 都 忍 不 住 表扬 了 小伙子 一 番 : “ 以前 我们 总 是 认为 只有 经验 丰富 的 球员 才 能够 在 比赛 中 有 着 出色 的 表现 , 现在 看来 , 这个 想法 得 变 了 。
+他 ( 瓦尔 ) 是 个 很 出色 的 球员 , 有 技术 , 有 速度 , 有 一脚 准确 、 稳定 的 传 中 , 另外 , 他 在 一 对 一 时 有 着 惊人 的 成功 率 。
+” 瓦尔 在 那 场 比赛 中 的 表现 的确 配 得 上 这位 荷兰 籍 主帅 的 夸奖 。
+同时 , 小伙子 还 打动 了 在 场边 观战 的 英格兰 传奇 右 后卫 保罗 - 帕克 , “ 我 给 他 ( 瓦尔 ) 打 10 分 , 满分 !
+” 瓦尔 充满 创造 力 且 激情 四射 的 踢法 不 禁 让 人们 想起 了 他 的 前辈 —— 哈里 · 科威尔 , 而 小伙子 也是 唯一 一 位 得到 如此 殊荣 的 年轻 球员 。
+不过 , 爱才 心切 的 主帅 维尔 贝克 并 不 赞成 这样 的 比较 , “ 他 才 18 岁 , 应该 让 他 找到 自己 的 风格 , 而 不 是 单纯 地 模仿 。
+他 表现 地 很 不 错 , 不过 , 只有 通过 不 断 地 努力 , 他 才能 保持 这种 进步 的 速度 与 状态 。
+” 和 谨慎 的 荷兰 人 相比 , 两年 前 为 海啸 队 签 下 瓦尔 的 澳大利亚 主帅 法里纳 毫 不 掩饰 地 表达 了 对 小伙子 的 赞美 之 词 , “ 瓦尔 被 维尔 贝克 招 去 踢 世界 杯 , 我 一 点 都 不 惊讶 。
+他 很 年轻 , 又 能 脚踏 实地 地 提高 自己 。
+更 重要 的 是 , 他 具备 一切 成为 伟大 球员 的 条件 !
+” 梦幻 般 的 国家 队 处子 秀 过后 , 瓦尔 于 海啸 队 的 队友 , 亚当 - 萨罗 塔 以及 迈克尔 - 祖 罗 一 道 转会 去 了 荷兰 乌德 勒 支 俱乐部 。
+这次 转会 看似 很 成功 , 不过 瓦尔 并 没 有 被 成功 冲 昏 了 头脑 。
+尽管 已经 有 无 数 人 替 自己 对 未来 作 了 无 数 种 猜想 , 但 他 依然 全身 心地 投入 训练 与 比赛 , 正如 他 自己 所 说 : “ 我 才 踢 了 20 场 联赛 , 国家 队 出场 也 只有 一 次 , 所以 , 我 必须 提高 自己 , 我 必须 不 断 进步 。
+” 姓名 汤米 · 瓦尔 出生 日期 1991 . 12 . 10 运动 项目 足球 性别 男
+化脓 性 滑 囊 炎 切开 引流 术 桡 侧滑 囊 和 尺 侧滑 囊 互相 沟通 , 炎症 可 互相 蔓延 , 二者 近 端 在 尺 、 桡骨 茎突 上 2cm 处 与 屈肌 后 间隙 相邻 。
+故 化脓 性 滑 囊 炎 应 及时 治疗 , 以 防 扩散 。
+简介 0 < b > 化脓 性 滑 囊 炎 切开 引流 术 < / b > 适应 症 0 化脓 性 滑 囊 炎 明显 肿胀 , 穿刺 有 脓 时 , 应 及早 切开 减压 、 引流 。
+手术 过程 0 手术 过程 术 前 准备 11 . 根据 病情 合理 选用 抗生 素 。
+2 . 对 严重 手 部 感染 , 全身 情况 衰弱 者 , 应 注意 改善 全身 情况 , 提高 身体 抵抗 力 。
+3 . 手 部 较 深 脓肿 切开 时 , 宜 用 止血 带 控制 止血 , 使 手术 野 清晰 , 保 症 手术 安全 。
+手术 过程 麻醉 21 . 脓 性 指头 炎 切开 引流 术 或 甲 下 积脓 拔 甲状 , 一般 采用 指 根 神经 阻滞 麻醉 。
+麻醉 剂 内 不 可 加 用 肾上腺 素 , 以免 小 动脉 痉挛 , 造成 手指 血 运 障碍 。
+2 . 掌 间隙 脓肿 、 化脓 性 腱鞘 炎 或 手 部 滑 囊 炎 切开 引流 时 , 采用 臂 丛 神经 或 腕 部 神经 阻滞 麻醉 ; 也 可 采用 氯胺酮 静脉 麻醉 。
+手术 过程 手术 步骤 3 尺 侧 滑 囊 炎 可 沿 小鱼 际 肌 的 桡 侧 , 从 远 侧 掌 横纹 至 腕 横 韧带 平面 作 纵 切口 。
+向 两侧 拉开 切口 , 在 第 5 掌骨 掌 面 即可 看到 肿胀 的 尺 侧滑 囊 , 予以 切开 后 扩大 引流 , 排出 脓 液 。
+然后 冲洗 脓 腔 。
+囊 外 放 凡士林 纱 布条 或 胶皮 片 引流 1 。
+尺 侧 滑 囊 炎 可 合并 掌中 间隙 感染 2 , 需 同时 切开 引流 。
+桡 侧 滑 囊 炎 用 大 鱼际 肌 尺 侧 缘 切口 , 即 在 近 侧 掌 横纹 远 半 段 的 桡 侧 切开 皮肤 、 皮 下 组织 及 桡 侧滑 囊 , 进行 引流 。
+注意 事项 0 桡 侧 滑 囊 炎 切开 时 , 切口 宜 在 大 鱼际 肌 尺 侧 缘 远 侧 半 段 , 因 近 侧 半 段 有 正中 神经 返 支 ( 运动 支 ) 存在 , 如 受 损伤 将会 丧失 重要 的 拇指 对 掌 功能 3 。
+中文 名 化脓 性 滑 囊 炎 切开 引流 术 简介 化脓 性 滑 囊 炎 切开 引流 术 适应 症 化脓 性 滑 囊 炎 明显 肿胀 穿刺 有 脓 时 注意 事项 切口 宜 在 大 鱼际 肌 尺 侧 缘 远 侧 半 段
+应用 新闻 学 原理 与 案例 第 一 卷 新闻 资源 开发 设计 《 应用 新闻 学 原理 与 案例 第 一 卷 新闻 资源 开发 设计 》 是 2007 年 中国 人民 大学 出版 社 出版 的 图书 , 作者 是 蔡 雯 、 甘露 。
+1 内容 简介 0 《 应用 新闻 学 · 原理 与 案例 》 是 我 国 新闻 传播 学界 第 一 套 全面 包容 各种 类型 媒介 新闻 传播 业务 的 应用 新闻 学 案例 教材 。
+编者 采用 案例 与 原理 相互 配合 的 方式 , 系统 阐述 报纸 、 广播 、 电视 和 网络 新闻 的 设计 、 采集 、 编制 原理 与 技巧 。
+案例 素材 收集 全面 , 包括 与 案例 有 关 的 媒介 的 基本 情况 简介 、 案例 社会 背景 分析 、 对 案例 主要 当事 人 的 访谈 、 案例 代表 性 成果 展示 、 案例 研究 等 。
+本 教材 各 卷 均 配送 多 媒体 光盘 , 为 读者 学习 和 研究 案例 提供 最 直观 的 资料 。
+《 新闻 资源 开发 设计 》 是 本 套 教材 的 第 一 卷 , 主要 说明 各 类型 新闻 媒介 进行 市场 定位 、 媒介 产品 设计 和 新闻 报道 . . . 图书 目录 0 第 一 章 新闻 资源 开发 与 新闻 产品 定位 第 一 节 新闻 资源 概论 第 二节 新闻 产品 定位 第 三 节 新闻 产品 定位 案 例案 例 一 《 华西 都市 报 》 的 创刊 定位 案例 二 北京 交通 台 的 定位 第 二 章 新闻 产品 设计 第 一 节 新闻 产品 设计 基本 原理 第 二节 新闻 报刊 产品 设计 案 例案 例 一 《 京华 时报 》 的 创刊 设计 案例 二 《 南方 日报 》 改版 设计 第 三 节 新闻 广播 产品 设计 案例 案例 中央 人民 广播 电台 “ 中国 之 声 ” 设 第 四 节 新闻 电视 产品 设计 案例 案例 中央 电视 台 新闻 频道 设计 第 五 节 . . . 作者 简介 0 蔡 雯 , 中国 人民 大学 新闻 学院 副 院长 、 教授 、 博士 生 导师 , 中国 人民 大学 新闻 与 社会 发展 研究 中心 研究 员 。
+任 全国 高等 教育 自考 委员 会 新闻 类 委员 、 中国 科技 新闻 协会 理事 、 北京 市 新闻 工作 者 协会 理事 ; 担任 “ 中国 新闻 奖 ” 评委 及 多家 报社 的 顾问 ; 2001 年 入选 北京 市 “ 新 世纪 社科 理论 人才 百人 工程 ” ; 2004 年 入选 教育 部 “ 新 世纪 优秀 人才 资助 计划 ” 。
+是 新闻 传播 学界 第 一 位 “ 全国 优秀 博士 学位 论文 ” 获得 者 ; 全国 精品 课 “ 新闻 编辑 ” 主持 人 ; 曾 获得 “ 北京 市 优秀 新闻 工作 者 ” 、 “ 北京 市 优秀 青年 骨干 教师 ” 等 多项 荣誉 和 嘉奖 。
+主持 多项 教育 部 人文 社会 科 . . . 书名 应用 新闻 学 原理 与 案例 第 一 卷 新闻 资源 开发 设计 作者 < a href = " # " > 蔡 雯 、 甘露 < / a > & nbsp ; ISBN 9787300081281 ; 7300081282 页数 444 定价 ¥ 38 . 00 出版 社 < a href = " # " > 中国 人民 大学 出版 社 < / a > 出版 时间 2007 年 06 月
+最 美 年华 的 遇见 你 《 最 美 年华 的 遇见 你 》 是 鑫 雨 创作 的 网络 小说 , 发表 于 17K 小说 网 。
+0 < b > 作品 简介 < / b > < b > < / b > 把 过往 写成 小说 , 对于 过往 认真 已经 不 重要 了 , 可是 对于 某些 人 来说 , 一 提到 过往 便 痛 得 认真 , 让 人 无法 窒息 , 最 美 年华 遇见 了 你 , 是 我 之 幸运 。
+中文 名称 最 美 年华 的 遇见 你 作者 鑫 雨 连载 平台 17K 小说 网 小说 类型 网络 小说
+Rebirth of Fortune 2 《 RebirthofFortune 2 》 将 拥有 超过 25 种 战斗 职业 , 每个 职业 都 有 不 同 的 技能 和 能力 , 在 战斗 过程 中 出战 单位 遭到 攻击 和 发动 攻击 都 可以 积攒 技能 点 , 蓄 满 之后 便 能 发动 专属 的 特殊 技能 。
+随着 战斗 获胜 , 玩家 可以 用 奖金 来 升级 战斗 单位 , 解锁 新 的 技能 , 或者 转职 成 更 高级 的 职业 。
+开发 商 承诺 新作 比 前 作 更具 战略 性 和 更 完善 的 战斗 系统 。
+0 《 RebirthofFortune 2 》 将 拥有 超过 25 种 战斗 职业 , 每个 职业 都 有 不 同 的 技能 和 能力 , 在 战斗 过程 中 出战 单位 遭到 攻击 和 发动 攻击 都 可以 积攒 技能 点 , 蓄 满 之后 便 能 发动 专属 的 特殊 技能 。
+随着 战斗 获胜 , 玩家 可以 用 奖金 来 升级 战斗 单位 , 解锁 新 的 技能 , 或者 转职 成 更 高级 的 职业 。
+开发 商 承诺 新作 比 前 作 更具 战略 性 和 更 完善 的 战斗 系统 。
+游戏 提供 全新 的 故事 剧情 , 将 玩家 带入 一 个 剑 与 魔法 的 世界 , 玩家 还 可以 通过 苹果 游戏 中心 进行 PVP 挑战 , 享受 与 来自 世界 各地 高手 竞争 的 乐趣 。
+中文 名 Rebirth of Fortune 2 游戏 类型 策略 语言 英文 平台 IOS
+皇家 美孚 ( 燕儿 岛 路 店 ) 餐馆 类型 0 蛋糕 西点 餐馆 简介 0 “ 爱吃 甜点 ” 的 人 大概 都会 喜欢 这里 。
+无 论 是 蛋糕 、 面包 , 还是 蛋挞 之类 的 小 点心 , 口感 都 “ 很好 ” , 价位 “ 中档 ” , 性价 比 “ 至高 ” 。
+分店 “ 挺 多 的 ” , “ 各 大超 . . . 地址 0 市南区 燕儿 岛 路 52 号 ( 燕儿 岛 路 与 逍遥 二 路 交叉 口 北 30 米 ) 餐馆 名称 皇家 美孚 ( 燕儿 岛 路 店 ) 地区 青岛 市 营业 时间 07 : 00 - 21 : 00
+M 型 变星 “ M 型 变星 ” 是 天文 学 专有 名词 。
+来自 中国 天文 学 名词 审定 委员 会 审定 发布 的 天文 学 专有 名词 中文 译名 , 词条 译名 和 中英 文 解释 数据 版权 由 天文 学 名词委 所有 。
+内容 简介 0 “ M 型 变星 ” 是 天文 学 专有 名词 。
+来自 中国 天文 学 名词 审定 委员 会 审定 发布 的 天文 学 专有 名词 中文 译名 , 词条 译名 和 中英 文 解释 数据 版权 由 天文 学 名词委 所有 。
+< table > 中文 译名 M 型 变星 英文 原名 / 注释 Mvariable < / table > 补充 说明 0 “ 英汉 天文 学 名词 数据 库 ” ( 以下 简称 “ 天文 名 词库 ” ) 是 由 中国 天文 学会 天文 学 名词 审定 委员 会 ( 以下 简称 “ 名词委 ” ) 编纂 和 维护 的 天文 学 专业 名词 数据 库 。
+该 数据 库 的 所有 权 归 中国 天文 学会 所有 。
+中文 名 M 型 变星 外文 名 Mvariable
+徐州 铜山 农村 商业 银行 股份 有限 公司 食品 城 支行 诉 李 延东 等 金融 借款 合同 纠纷 案 徐州 铜山 农村 商业 银行 股份 有限 公司 食品 城 支行 诉 李 延东 等 金融 借款 合同 纠纷 案 是 2017 年 2 月 4 日 在 江苏 省 徐州 市 云龙 区 人民 法院 审理 的 案件 。
+案由 0 民事 、 合同 、 无 因 管理 、 不 当 得利 纠纷 、 合同 纠纷 、 借款 合同 纠纷 、 金融 借款 合同 纠纷 、 执行 权责 关键 词 0 民 商事 、 权责 情节 、 民事 责任 规定 、 合同 、 诉讼 关键 词 、 执行 、 强制 执行 案例 原文 0 徐州 市 云龙 区 人民 法院 执行 裁定 书 申请 执行 人 : 徐州 铜山 农村 商业 银行 股份 有限 公司 食品 城 支行 。
+负责 人 赵 志明 , 该 行 行长 。
+被 执行 人 : 李 延东 。
+被 执行 人 : 王 云 。
+被 执行 人 : 袁 兆刚 。
+被 执行 人 : 李 延江 。
+申请 执行 人 徐州 铜山 农村 商业 银行 股份 有限 公司 食品 城 支行 与 被 执行 人 李 延东 、 王 云 、 袁 兆刚 、 李 延江 金融 借款 合同 纠纷 一 案 , 本 院 作出 的 ( 2016 ) 苏 0303 民初 17 号 民事 判决 书 已经 发生 法律 效力 。
+因 被 执行 人 在 规定 的 限期 内 拒 不 履行 法律 文书 所 确定 的 义务 , 申请 执行 人 向 本 院 申请 强制 执行 。
+本 院 于 2016 年 10 月 28 日 立案 执行 , 立案 标的 额 为 391468 . 67 元 。
+本 院 在 执行 过程 中 采取 了 以下 执行 措施 : 1 、 2016 年 11 月 2 日 , 依法 向 被 执行 人 发出 执行 通知 书 、 报告 财产 令 等 法律 文书 , 被 执行 人 既未 到庭 说明 情况 亦 未 主动 履行 义务 。
+2 、 2016 年 11 月份 至今 , 本 院 多次 依法 查询 被 执行 人 银行 、 车辆 、 房产 等 财产 信息 , 经查 被 执行 人 暂无 财产 可 供 执行 。
diff --git a/PaddleNLP/ELMO/data/train/sentence_file_1.txt b/PaddleNLP/ELMO/data/train/sentence_file_1.txt
new file mode 100644
index 0000000000000000000000000000000000000000..51e3d36aae308c5f0a5da281b9c4d1970eefb7bc
--- /dev/null
+++ b/PaddleNLP/ELMO/data/train/sentence_file_1.txt
@@ -0,0 +1,1000 @@
+从 上 幼儿 园 开始 , 渠 鹏 就 养成 了 自立 的 好 习惯 , 在 乡里 做 公务员 的 父母 也 有 意 让 他 处理 自己 的 事情 。
+“ 上 小学 时 , 我 妈妈 还 过问 我 学习 的 事 , 从 初中 开始 , 父母 就 对 我 完全 放开 了 , 全 是 我 自己 处理 , ” 渠 鹏 对 自己 的 自立 、 自理 能力 非常 满意 , 就 是 北大 老师 动员 他 读 北大 , 他 也 没 让 父母 跟着 , 完全 是 自己 一 个人 做主 。
+关于 高考 0 渠 鹏 总 分 为 70 4 分 ( 裸 分 694 ) , 语文 126 分 , 数学 142 分 , 英语 133 分 , 理综 295 分 , 河北 省 优秀 干部 的 殊荣 更 是 为 他 获得 了 10 分 的 加分 , 成为 河北 理科 总分 第 一 的 折桂 者 。
+这位 颇有 “ 东坡 风范 ” 的 小伙子 出生 于 91 年 5 月 16 日 , 今年 18 岁 。
+金牛 座 赋予 他 踏实 稳重 而又 坚毅 达观 的 气度 。
+和 许 许多 多 同龄 人 一 样 , 他 热爱 篮球 和 乒乓 球 , 这 两 项 运动 也是 他 的 拿手 好戏 。
+在 运动 之 余 , 他 “ 脑力 娱乐 与 体力 磨练 相 结合 ” , 偏好 各种 棋类 活动 , 经常 和 同学 切磋 象棋 和 围棋 ; 在 丰富 多彩 的 班级 活动 中 , 渠 鹏 谦虚 地 坦言 自己 “ 经常 是 参与 者 , 偶尔 是 领导 者 ” 。
+除了 这 两 项 之外 , 计算机 网络 也是 他 的 拿手 好戏 。
+渠 鹏 对于 许多 高三 学生 翘首 以 盼 的 暑假 , 并 不 打算 放松 , 而是 张 弛 结合 : 扩大 英语 词汇 量 , 继续 自学 计算机 , 闲暇 时间 会 抽空 去 山东 旅游 一趟 , 力求 践行 “ 读 万 卷 书 , 行 万 里路 ” 。
+“ 操 千 曲 而后 晓 声 , 观 千 剑 而后 识 器 ” , 渠 鹏 表示 , 五彩 缤纷 的 课余 活动 让 他 扩展 了 视野 , 更 是 发展 了 独立 思维 能力 。
+谈及 高考 备战 , 他 坦言 自己 的 最 大 秘诀 就 是 “ 顺 其 自然 , 随 性 为 之 ” , 从不 强迫 自己 。
+他 在 精神 状态 好时 , 往往 会 超额 多 学 ; 而 感觉 自己 不 在 状态 时 , 往往 会 让 自己 放松 一 下 , 而 不 是 “ 明知 山 有 虎 , 偏向 虎山 行 ” 。
+由于 自控 能力 很 强 , 渠 鹏 能够 轻松 自如 的 在 两种 状态 中 切换 , 而 不 会 过于 放纵 自己 。
+在 备战 高考 之 时 , 区别 于 其他 同学 的 懵懵 懂 懂 , 渠 鹏 对于 自己 的 未来 , 已经 有 了 较 为 明晰 的 规划 。
+他 总结 道 , 自己 是 一 个 统筹 规划 , 注重 全局 的 人 , 对于 未来 有 着 满满 的 期待 和 把握 。
+在 北大 清华 甚至 香港 学校 纷纷 向 他 伸出 橄榄 枝 的 之 时 , 渠 鹏 表示 , 自己 比较 青睐 清华 大学 , 尤其 向往 电子 计算机 , 土木 工程 和 生物 化学 研究 专业 , 希望 自己 在 相关 领域 能 有 所 发展 。
+渠 鹏 的 父母 在 他 的 成长 期间 给与 了 他 相当 多 的 支持 。
+双亲 均 为 乡镇 干部 , 这样 的 家庭 背景 让 渠 鹏 更 添 理性 思维 气质 , 父亲 的 思维 缜密 , 行事 沉稳 ; 母亲 乐观 积极 , 勤勉 持家 。
+这样 传统 的 中国 家庭 下 成长 的 渠 鹏 无 疑 汲取 了 最 多 的 中国 式 教育 精髓 。
+渠 鹏 出 类 拔萃 的 学习 实力 使得 他 对 自己 大学 的 学习 无 疑 是 自信 的 , 对于 大学 生活 展望 , 他 指出 , 理科 学生 在 高中 往往 沉浸 于 书山 题 海 而 “ 乐 不 思 蜀 ” , 进入 大学 后 , 应该 多多 参加 社会 活动 , 拓展 人脉 交往 。
+大学 生涯 的 开始 , 不 是 学习 的 结束 , 而 往往 是 学习 的 又 一 轮 攻坚 战 , 知识 的 拓展 毋庸 置疑 , 而 为人 处事 的 能力 以及 独立 思维 才 是 使 你 从众 多 考生 中 脱 颖 而 出 的 秘诀 。
+中文 名 渠 鹏 < a href = " # " > < / a > 国籍 中国 出生 日期 1991 年 5 月 16 日 毕业 院校 定州 中学 主要 成就 省 优秀 班 干部 荣誉 理科 状元
+奶油 圆 蛋糕 奶油 圆 蛋糕 < / b > < b > ( 法语 : kougl of , 又 作 kugel opf 或 kougel hopf , 德语 : Gugel hupf ) 是 法国 阿尔萨斯 地区 及 德国 南部 特产 的 甜 食品 。
+其 特点 在于 其 用 一 种 中空 螺旋 纹 形 的 模子 作成 皇冠 形 蛋糕 。
+< / b > < b > < / b > 面粉 奶油 白 砂糖 葡萄 干 酵母 鸡蛋 牛奶 杏仁
+罗 孟郊 罗 孟郊 ( 1092 - 1153 年 ) , 字 耕 甫 , 号 休 休 , 广东 循 州府 兴 宁县 刁 坊 镇 罗 坝 村人 。
+罗 孟郊 早年 丧 父 , 侍 母 很 孝顺 。
+少时 家贫 却 聪颖 好学 , 精通 经史 。
+二十 岁 时 , 他 在 兴宁 神光 山 附近 的 罗 岭 草 舍里 , 勤奋 攻读 , 留 下 了 “ 神光 映 读 ” 一 段 佳话 。
+宣和 五年 ( 1123 年 ) 他 考 中 举人 。
+宣和 六 年 ( 1124 年 ) 考 中 进士 , 为 一 榜 ( 亦称 一 甲 ) 第 三名 探花 , 始 就 太学 博士 职 。
+后 累 官 至 谏议 大夫 、 翰林 院 学士 、 掌 制诰 。
+人物 简介 0 罗 孟郊 ( 1092 ~ 1153 年 ) , 广东 兴宁 宁 新镇 文星 村人 。
+从小 勤奋 攻读 , 博览 经史 。
+北宋 宣和 五年 ( 1123 ) 中举 , 翌年 中 进士 。
+随即 参加 殿试 , 考 中一 甲 第 三名 —— 探花 , 授职 谏议 大夫 、 翰林 院 学士 。
+其时 , 奸臣 蔡 京 在 朝廷 任 太师 , 以 恢复 新法 为名 , 加重 剥削 , 排除 异已 , 使 朝政 日 非 。
+罗 孟郊 疾恶 如仇 , 主使 太 学生 陈 东 上书 , 揭露 蔡 京 为 首 六大 奸臣 之 罪行 , 称 他们 为 “ 六 贼 ” 。
+不 久 , 钦宗 便 把 “ 六 贼 ” 中 的 王 黼 、 朱 勔 等 治罪 。
+当 金人 南 侵 时 , 钦宗 慑于 金人 的 气焰 , 想 罢免 力主 抗 金 的 尚书 右 丞相 李 纲 的 职务 , 以 讨好 金人 。
+罗 孟郊 联络 太 学生 陈 东 等 发动 京都 市民 万余 人 上书 请愿 , 要求 留用 李 纲 , 为国 效力 , 挽救 民族 命运 。
+及 后 , 宋 廷南 迁 临安 ( 杭州 ) , 奸臣 秦 桧 当权 , 阴谋 与 金人 议和 , 罗 孟郊 与 叶 三省 等人 极力 反对 , 秦 桧 恨 之 入骨 , 命 御史 罗 汝楫 奏 谤 罗 孟郊 “ 饰 非 横议 ” 。
+南宋 高宗 绍兴 二十一 年 ( 1152 ) , 罗 孟郊 被 贬谪 到 湖北 兴 国军 ( 行政 区 名 , 今 湖 北阳 新县 ) 。
+罗 孟郊 看到 宋 主 昏庸 , 奸臣 误国 , 外敌 侵凌 , 山河 日 蹙 , 自己 一 片 忠贞 报国 之 志 不 得 酬 , 徒 唤 奈何 。
+到 湖北 后 , 他 放 怀 山水 , 不 问 世事 。
+次年 , 便 在 贬 所 逝世 , 终年 62 岁 。
+过 了 两年 , 秦 桧 不 知 罗 孟郊 已 逝世 , 阴谋 使人 陷害 罗 孟郊 , 以 忤逆 罪 欲 置 他 于 死地 。
+秦 桧 死后 , 高宗 下 诏 复用 罗 孟郊 , 得悉 罗 已 去世 , 遂 追赠 他 为 礼部 尚书 , 敕 葬 兴 国军 甘棠 山 。
+今 兴宁 “ 神光 山 ” 、 “ 墨池 寺 ” 等 处 , 留 有 与 罗 孟郊 有 关 的 事迹 和 传说 。
+生平 0 罗 孟郊 任职 太学 博士 期间 , 正值 奸臣 蔡 京 专权 , 朝政 日 非 。
+罗 孟郊 疾恶 如仇 , 令 太 学生 陈 东 等 上书 , 揭露 蔡 京 等人 罪行 , 称 他们 为 “ 六 贼 ” 。
+迫于 压力 , 不 久 , 宋 钦宗 便 把 “ 六 贼 ” 中 的 王 黼 等 治 了 罪 。
+当 金人 南 侵 时 , 钦宗 慑于 金人 的 气焰 , 欲 罢免 主战 派 右 丞相 李 纲 的 职务 , 来 讨好 金人 。
+罗 孟郊 再次 令 陈 东 上书 , 要求 留用 李 纲 为国 效力 。
+绍兴 十八 年 ( 1148 年 ) 罗 孟郊 入 翰林 , 又 值 秦 桧 阴谋 与 金人 和议 之 际 , 罗 孟郊 与 龙 图 学士 叶 三省 等 人力 抵 和议 。
+秦 桧 恨 之 入骨 , 便 指使 御史 罗 汝 揖 上书 高宗 皇帝 。
+他们 在 奏 书 中 编造 罪名 , 大肆 诽谤 罗 孟郊 等人 , 称 他们 是 “ 饰 非 横议 ” 。
+南宋 绍兴 二十二 年 ( 1152 年 ) , 罗 孟郊 、 叶 三省 等人 含冤 遭 贬 , 罗 孟郊 被 贬谪 到 当时 的 兴国 军 即 现在 的 阳新 县 。
+动身 赴 贬 所 时 , 朝廷 诸 大夫 洒泪 为其 饯行 。
+罗 孟郊 答谢 说 : “ 吾 本 豫章 柏林 族 也 , 兴国 军 乃 故乡 。
+天下 事 至此 , 为 之 奈何 !
+吾 今 无 意 于 人世 矣 。
+诸公 勉力 以 报国 , 不必 为 吾 偷生 者 注 念 。
+” 看到 宋 主 昏庸 , 奸臣 误国 , 外敌 侵凌 , 山河 日 蹙 , 而 自己 的 境遇 却 又 如此 , 一 片 忠贞 报国 之 志 不 得 酬 , 罗 孟郊 很 是 无 奈 , 对 自己 的 前途 已 心 灰 意 冷 到 了 极点 。
+然而 , 罗 孟郊 要 表白 的 不 仅仅 如此 , 此番 话语 , 更多 的 却是 表明 了 他 置 自己 的 前途 、 安危 、 冷暖 于 不 顾 , 而且 , 尽管 身处 此 境 , 他 仍 不 忘 勉励 诸君 报国 勿 偷生 , 足见 其一 片 忠贞 为国 之 心 !
+罗 孟郊 到 兴国 军 任 后 , 他 不 理 军事 , 选择 甘棠 山 山谷 , 在 山脚 下 朝阳 的 一 面 定居 下来 。
+他 建 屋 数 间 , 在 正屋 大门 上方 悬挂 一 横匾 , 题 曰 “ 翰林 堂 ” 。
+他 又 叫 人 开挖 了 一 个 一 亩 见方 的 水池 , 名 之 “ 洗 砚池 ” 。
+自此 , 他 过 起 了 “ 放 怀 山水 , 屏绝 人迹 ” 的 隐居 生活 。
+后来 , 很多 人 登门 拜访 , 他 一律 不 予 接待 , 统统 拒 之 门外 。
+即便 其时 的 兴国 军 知 军 薛 朋 龟 、 周 紫芝 等 贤能 之 人 登门 造访 , 罗 孟郊 都 闭门 不 见 。
+当时 薛 朋 龟 、 周 紫芝 等人 带来 了 酥 米 等 食物 赠送 给 罗 孟郊 , 同时 还 赠 荷叶 诗 一 首 于罗 孟郊 , 诗云 : 湖边 老 守 湖 是 家 , 湖光 十 里 皆 荷花 。
+我 来 不 见 花 如 锦 , 但 闻 荷叶 飞 天涯 。
+罗 孟郊 答 云 : “ 仆 得 托 身 草 莱 , 随意 耕 钓 , 皆 君 惠 耳 。
+辱 磁 多 议 , 何以 堪 之 ?
+” 原来 他 的 志向 , 早 就 是 打算 辞官 归隐 , 寄情 山水 之间 了 。
+南宋 绍兴 二十三 年 ( 1153 年 ) , 罗 孟郊 在 贬 所 兴国 军 即 现 阳新 这里 去世 。
+一 代 忠贞 爱国 之 人 就 这样 悄 无 声息 地 撒手 人寰 , 然而 朝廷 对此 一点儿 不 知道 。
+绍兴 二十五 年 ( 1155 年 ) , 罪 大 恶 极 的 秦 桧 终于 死去 。
+就 在 秦 桧 死后 不 久 , 高宗 不 知 是 良心 发现 还是 另有 原因 , 他 下 诏 复用 罗 孟郊 等 53 人 。
+而 此时 , 罗 孟郊 去世 已有 两年 多 了 。
+朝廷 后来 得 悉罗 孟郊 已 去世 , 于是 追赠 他 为 礼部 尚书 , 敕 葬 在 现 阳新 甘棠 山 , 并 于 绍兴 二十六 年 ( 1156 年 ) 丙子 岁 正月 为其 墓 立 《 神道碑 》 。
+时 为 《 神道碑 》 撰文 的 是 赐 进士 、 知 庐州 的 豫章 人 张 文 举 , 书写 的 是 宣教 右 侍 郎中 、 权 知 兴国 军事 的 江州 人 周 紫芝 , 篆 额 的 是 奉 礼部 、 知 舞阳 县 事 的 希 寿 。
+罗 孟郊 虽 未能 挽 宋 室 于 沉沦 , 但 他 不 畏 权奸 的 凛然 正气 和 爱国 精神 , 永远 值得 世人 崇敬 !
+作品 0 一自 题名 后 , 思归 何日 归 。
+虽然 着 宫 锦 , 不 及 无 斑 衣 。
+故里 桑 榆晚 , 他 乡 雨雪 霏 。
+庭 前 停 玉 轸 , 目送 雁南 飞 。
+这 是 北宋 时 , 在 朝中 做官 的 罗 孟郊 厌倦 官场 勾 心 斗 角 、 尔 虞 我 诈 、 因循 守旧 和 花天 酒地 , 而 写 下 的 一 首 诗 《 京 中 怀 归 》 , 以此 表达 他 想念 家乡 的 思归 之 情 。
+< b > 罗 孟郊 墓 < / b > < b > 神道碑 < / b > < b > 碑文 < / b > < b > < / b > 这 篇 碑文 , 是 清代 同治 年 ( 1865 ) 楚 北 兴国 州 罗 有文 ( 孟郊 后裔 ) 录 下 寄给 兴宁 罗 仲期 的 。
+随 文 附一 信 : 自 公 谪居 兴国 寄籍 , 由 宁 历元 、 明 , 以至 本 朝 , 未尝 他 徙 , 后 考 《 广 舆 记 》 等 书 , 始 知 公 为 宋 循州 兴宁 人 , 特 询 原籍 有 无 公 裔 。
+《 神道碑 》 碑文 : 绍兴 二十三 年 , 翰林 学士 罗 公 卒 于 兴 国军 , 盖 谪 地 也 。
+公 讳 孟郊 , 字 耕 甫 , 休 休 其 别号 。
+幼 负 美 质 。
+居 乡 校 时 , 值 元 佑 禁锢 诸 贤 , 初 不 悦 曰 : “ 事 势 如此 , 是 可以 爵禄 为 哉 ?
+” 乡 先生 韦 公闻 之 叹服 。
+及 闻 文正 公立 朝 风采 , 辄 语 人 曰 : “ 此 正 大 丈夫 流 也 。
+孟郊 肯甘 为 草木 , 漫 度 春秋 已 耶 ?
+” 与 胡 铨 、 杨 时 、 蒋 之 奇数 子 友 , 慷慨 激烈 , 博通 经史 , 为 一 时 名士 首 称 。
+宣和 五年 , 领 乡 荐 ; 明年 , 廷试 。
+策 语 有 云 : “ 躬耕 而 亲 蚕 , 已 喜 示 斯 民 之 大端 大本 ; 辽 衰 而 金盛 , 更 望 振 天下 之 大 纪 大纲 。
+” 掌 枢密 院 事 谭 正 , 一见 而 大 奇 之 。
+登 沈 晦 榜 进士 , 始 就 太学 博士 职 。
+是 时 蔡 京 擅 权 、 天下 侧目 。
+公 令 太 学生 陈 东 上书 , 乞 斩 六 贼 , 徽 庙 嘉 之 。
+复 令 太 学生 留 李 纲 、 吴 敏 , 乞 用 老成 , 以 靖 太学 , 杨 时 始 擢 国 子 祭酒 。
+靖康 间 , 因 适 南 剑 见 罗 从彦 , 从 吴 国华 游 , 以 同族 劝 就 杨 时 讲道 , 以 承 濂 洛 之 绪 。
+绍兴 十八 年 入 翰林 , 正 秦 桧 纷纷 和议 之 际 。
+与 龙 图 学士 叶 三省 、 从政 郎 杨 炜 , 各 以 书 通 赵 鼎 、 王 庶 , 力 底 和议 。
+桧 讽 御史 罗 汝楫 , 奏 以 饰 非 横议 , 贬 叶 三省 于 筠州 , 贬 公 于 兴 国军 , 贬 炜 于 万 安军 , 诸 大夫 祖饯 为 洒泪 。
+公 谢 曰 : “ 吾 本 豫章 柏林 族 也 。
+兴国 军 乃 故乡 。
+天下 事 至此 , 为 之 奈何 !
+吾 今 无 意 于 人世 矣 。
+诸公 勉力 以 报国 , 不必 为 吾 偷生 者 注 念 。
+” 盖 二十二 年事 也 。
+公 抵 兴国 , 不 理 军事 , 择 朝阳 甘棠 之 谷 而 居 之 。
+结 屋 数 间 , 匾 曰 “ 翰林 堂 ” 。
+凿 池 一 亩 , 曰 “ 洗 砚池 ” 。
+放 怀 山水 , 屏绝 人迹 。
+虽 知 军 薛 ? 、 周 紫芝 之 贤 , 求 不 得 见 。
+时 以 酥 米 馈 之 , 赠 以 荷叶 诗云 : “ 湖边 老 守 湖 是 家 , 湖光 十 里 皆 荷花 。
+我 来 不 见 花 如 锦 , 但 闻 荷叶 飞 天涯 。
+” 公 答 云 : “ 仆 得 ? 身 草 莱 , 随意 耕 钓 , 皆 君 惠 耳 。
+辱 兹 多 议 , 何以 堪 之 ?
+” 原 其 志向 , 既而 归 籍 之 计 矣 。
+明年 春 , 霍山 崩裂 。
+至 秋 , 公 一 日 沐浴 衣冠 , 北面 稽首 毕 , 命 其 子 钧 曰 : “ 吾 气数 止 矣 !
+尔 竭力 事 母 , 不必 还 籍 , 甘棠 是 葬地 也 。
+知 我 者 , 无 如 张 文 举 , 可以 铭 事 ? 之 。
+” 言 未 毕 而 气 已 绝 。
+上 距 元 佑 七年 , 寿 六十 有 二 。
+越 二年 , 桧 谋 使 赵 汾 诬 张 浚 、 胡 铨 等 五十三 人大 逆 , 欲 置 之 死 , 公 与 焉 。
+狱 几 成 而 桧 暴卒 , 朝野 共 称 天 鉴 。
+诏 复 列 用 五十三 人 , 而 公 已故 二 载 矣 。
+有 司 以 闻 , 特 追 赠礼 部 尚书 职 , 敕 葬 甘棠 山 。
+丧祭 如 制 。
+呜呼 !
+若 公 者 , 可谓 屈 之 以 位 而 伸 之 以 道 , 枉 之 以 生 而 直 之 以 死者 欤 !
+令子 以 进士 吴 彦夔 状 公行 实 , 乞 予 铭 之 。
+予 忝 知 公 , 义 不 容 辞 。
+按 罗氏 , 世 居 豫章 。
+五 代 时 , 昌 儒 为 循 州 剌 史 , 遂 家 焉 。
+历数 世 , 至 清 臣 , 生 孝 忠 , 孝 忠 生 居敬 , 知 汀州 , 娶 谢氏 , 感 星 瑞生 公 于 官舍 , 故 公 负 美 质 , 为 忠良 , 承 道学 , 岂 偶然 耶 ?
+公 娶 周氏 , 涉猎 书史 , 有 令 仪 , 生 二 男 訇 、 钧 。
+铡 室 何 氏 。
+公 来 兴国 , 氏 卒 于 道 。
+訇 娶 赵氏 , 仍 原籍 。
+钧 娶 胡 氏 , 枢密 院 编修 铨 之 女 也 , 侍 公 于 兴国 , 生子 三 : 尧 明 、 文 、 思 , 皆 琳琅 之 器 , 将 发 公 之 幽光 而 有 待 者 耳 。
+铭 曰 : 霍山 维 灵 , 富 水 为 清 。
+灵 不 可 味 , 清 不 可 尘 。
+于 戏 君子 , 邦 家 之 珍 。
+志 动 金石 , 心 质 鬼神 。
+公 今 已 矣 , 庶乎 致辞 身 。
+吾 其 祭 也 , 廓 族 有 征 。
+此 碑 立 石 署名 除 其 两子 外 , 撰文 的 是 豫章 人 张 文 举 ( 赐 进士 知 庐 川 ) , 书写 的 为 江州 周 紫芝 ( 宣教 右 侍郎 中 权 知 兴国 军事 ) , 篆 额 的 是 希 寿 ( 奉 礼部 知 舞阳 县 事 ) 。
+立碑 年代 为 绍兴 二十六 年 丙子 岁 正月 。
+史料 记载 0 罗 孟郊 , 其 先祖 南昌 人 , 五 代 间 有 官 兴宁 者 , 因 家 焉 。
+至 宋 景德 中 , 孟郊 生 , 生 而 颖异 , 早 丧 父 , 事 母 孝 。
+儿时 牧 牛 长 陂 庄 , 坐 读书 , 有 山人 过 , 奇 之 , 与 语 , 孟郊 告 以 父 丧 , 贫 未 葬 , 山人 指示 其 地 , 遂 葬 焉 。
+弱冠 结 庐 罗 岭 以 学 , 乡 子弟 从 之 , 孟郊 指授 笃 焉 , 至 邑 始 多 学者 。
+尝 夜 读书 山 上 , 有 光 烛 之 , 又 箸 书 , 涤 砚 于 池 , 池水 尽 黑 , 今 志 所 传 墨池 者 是 也 。
+天圣 八 年 , 举 进士 第 三人 , 累 官 谏议 大夫 , 翰林 学士 , 掌 制诰 , 文 词 温雅 , 特 承 顾问 。
+乞 归 养母 , 茅 荜 萧然 。
+母 冬月 思 鲙 , 孟郊 下 池 取 鱼 以 供 , 乡人 目 其 地 为 曾 子湖 。
+年 七十 而 终 , 众 立 祠 于 读书 处 祀 之 。
+( 本 阮 省志 、 施 志 , 参 用 王 州志 及 仲 志 。
+仲 志 改 取 鱼 为 少时 事 , 未知 何 据 , 今 依 诸 志叙 于归 养 后 。
+) 本名 罗 孟郊 字号 字 耕 甫 , 号 休 休 所 处 时代 南宋 靖康 之 乱 到 建康 年间 民族 族群 汉族 出生 地 广东 < a href = " # " > 循 州 < / a > 府 出生 时间 公元 1092 年 去世 时间 公元 1153 年 主要 成就 官 至 < a href = " # " > 谏议 大夫 < / a > 、 < a href = " # " > 翰林 院 < / a > 学士 、 掌 制诰 政治 抨击 秦 桧 为 首 的 投降 派 , 力主 抗 金
+吴 毅瑧 吴 毅瑧 , 男 , 中国 人 , 职位 足球 运动 员 个人 简介 0 姓名 : 吴 毅瑧 英文 名 : Wu Yizhen 生日 : 1994 - 05 - 26 场 上 位置 : 前卫 身高 : 185 厘米 体重 : 73 公斤 惯用 脚 : 右脚 出生 地 : 未知 ( 中国 ) 国籍 : 中国 代表 国家 队 : 出场 0 次 , 进 0 球 欧洲 三大 杯 : 出场 0 次 , 进 0 球 欧洲 冠军 联赛 : 出场 0 次 , 进 0 球 职业 生涯 0 < table > 赛季 俱乐部 号码 出场 进球 国家 联赛 等级 排名 2012 上海 中 邦 20 中国 3 中文 名 吴 毅瑧 外 文名 Wu Yizhen 国籍 中国 出生 地 中国 职业 足球 运动 员
+拉拉 认知 书 : 颜色 形状 《 拉拉 认知 书 : 颜色 形状 》 是 第 1 版 ( 2011 年 1 月 1 日 ) 北方 妇女 儿童 出版 社 出版 的 图书 。
+图书 信息 0 读者 对象 : 0 - 2 岁 正文 语种 : 简体 中文 IS BN : 9787538551822 条形码 : 9787538551822 尺寸 : 25 x 20 . 8x 0 . 6cm 重量 : 82g 内容 简介 0 《 拉拉 认知 书 : 颜色 形状 》 是 《 拉拉 认知 书 》 丛书 之 一 。
+《 拉拉 认知 书 》 这 套书 内容 丰富 , 分 为 “ 动物 ” 、 “ 动物 2 ” 、 “ 比较 数字 ” 、 “ 认 车 ” 、 “ 认 物 ” 、 “ 蔬菜 ” 、 “ 水果 ” 、 “ 颜色 形状 ” 8 册 , 以 孩子 最 熟悉 的 植物 , 动物 , 孩子 最 喜爱 的 玩具 , 零食 为 素材 , 生动 形象 的 让 孩子 从 中学 到 知识 。
+书名 拉拉 认知 书 : 颜色 形状 页数 7 页 出版 社 吉林 出版 集团 , 北方 妇女 儿童 出版 社 出版 时间 第 1 版 ( 2011 年 1 月 1 日 ) 装帧 平装 开本 4
+对角 巷 魔秀 桌面 主题 《 对角 巷 魔秀 桌面 主题 》 是 一款 Android 平台 的 应用 。
+应用 介绍 0 魔秀 主题 桌面 — 国内 主题 桌面 先驱 者 魔秀 主题 桌面 团队 数百 名 优秀 设计 师 亲自 为 你 打造 属于 你 的 主题 桌面 , 万余 种 免费 主题 桌面 任 你 挑选 , 彰显 手机 个性 。
+千万 用户 的 选择 , 用户 心中 “ 最 潮 ” “ 最 酷 ” “ 最 时尚 ” 的 主题 桌面 。
+最 丰富 的 内容 : 海量 精品 主题 桌面 免费 下载 , 总 有 一款 适合 你 。
+最 干净 的 桌面 : 无 任何 广告 插件 干扰 , 主题 桌面 应用 中 耗用 内存 最 少 。
+最 时尚 的 界面 : 简洁 , 明快 , 更 适合 东方 人 的 操作 方式 。
+最 便捷 的 功能 : 主题 桌面 管理 , 随时 下载 保存 最 新 主题 桌面 , 应用 一键 添加 , 方便 快捷 。
+最 精彩 的 专题 : 资深 小 编 每日 推荐 百余 种 个性 主题 桌面 , 让 你 永远 紧跟 潮流 。
+—— 魔秀 是 一 种 时尚 的 选择 更多 精彩 请 关注 — 魔秀 主题 桌面 支持 版本 0 Android 2 . 1 . x 以上 软件 名称 对角 巷 魔秀 桌面 主题 软件 平台 Android 软件 大小 5 . 83M 价格 免费
+月球 保卫 战 《 月球 保卫 战 》 是 一款 休闲 小 游戏 , 游戏 目标 是 合理 操作 , 一 起来 保卫 月球 殖民 地 吧 。
+背景 设定 0 月球 保卫 战 是 一款 有 趣 的 休闲 小 游戏 , 你 的 任务 是 保卫 月球 不 让 外星 人 入侵 , 把 来犯 的 敌人 飞船 上 的 字 输入 正确 就 可以 消灭 他们 。
+别 让 他们 靠 得太 近 咯 !
+操作 指南 0 操作 指南 如何 开始 1 游戏 载入 后 , 点 击 START , 选择 ENGLISH 。
+操作 指南 操作 方法 2 键盘 输入 对应 字母 , 按 回车 键 。
+游戏 目标 0 消灭 外星 入侵 者 。
+中文 名 月球 保卫 战 游戏 英文 名 MoonType 游戏 类型 < a href = " # " > 休闲 小 游戏 < / a > 游戏 目标 消灭 外星 入侵 者 操作 方法 键盘 输入 对应 字母 , 按 回车 键
+七夜 杀 《 七夜 杀 》 是 连 不 归 创作 的 网络 小说 , 发表 于 晋江 文学 网 。
+作者 0 连 不 归 作品 简介 0 曾经 幻想 , 暗夜 之 中 , 你 就 是 我 唯一 的 光明 , 无 论 健康 或 是 疾病 , 贫穷 或 是 富有 , 有 你 相伴 , 我们 便 可以 携手 走过 一生 。
+到头来 , 一切 都 只是 自己 的 一 厢 情愿 。
+还 记得 那段 宣言 吗 ?
+" With thishand , I will lift your sorrows . Your cupwill neve rempty , for I will be your wine . With this candle , I will light your way ind arkness . With this ring , Iask you to be mine . " 中文 名称 七夜 杀 作者 连 不 归 连载 平台 晋江 文学 网
+杀手 大亨 《 杀手 大亨 》 是 连载 于 起点 中文 网 的 小说 , 作者 是 莫 尐雨 。
+小说 类型 0 异术 异能 内容 简介 0 一 个 身 怀古 武 精髓 的 杀手 神兵 轩辕 的 主人 , 拥有 着 世界 上 最 强大 的 杀手 集团 , 看 猪脚 如何 运用 手中 的 权利 创造 一 个 属于 自己 的 黑暗 王朝 … … 中文 名 杀手 大亨 作者 莫 尐雨 小说 进度 连载 连载 网站 起点 中文 网
+cs 我 是 玩家 《 cs 我 是 玩家 》 是 连载 于 17k 小说 网 的 小说 , 作者 是 月光 男孩 。
+小说 类型 0 游戏 网游 内容 简介 0 陈 雨 一 个 CS 潜能 很大 的 CSRE 在 一 次 战队 选拔 赛 中 , 侥幸 加入 , 后 俩 因为 队长 桦 磊 的 指导 , 在 CS 道 上 越 走 越 高 … … 中文 名 cs 我 是 玩家 作者 月光 男孩 小说 进度 连载 连载 网站 17k 小说 网
+[ 山 / 弄 ] 岗 耳 叶马 蓝山 / 弄 ] 岗 耳 叶 马蓝 , Perileptalong gangensis ( D . FangetH . S . Lo ) C . Y . WuetC . C . Hu , 爵 床 科 耳 叶 马蓝 属 植物 。
+产 广西 ( 龙州 ) 。
+生 石灰 岩 山坡 。
+形态 特征 0 亚 灌木 , 高 逾 40 厘米 。
+除 花 外 无 毛 。
+叶 不 等 大 , 狭 椭圆 形 , 稀 卵 状 椭圆 形 , 大叶 长 6 - 11 . 5 厘米 , 小叶 长 ( 1 . 5 - ) 3 - 8 . 5 厘米 , 基 部下 延 , 边 有 浅 齿 , 侧 脉 每 边 4 - 6 条 。
+穗状 花序 腋生 , 长 2 - 4 厘米 , 不 分枝 , 着 花 1 - 5 朵 ; 苞 片 对生 , 疏离 , 狭 三角 形 , 长 2 - 4 毫米 , 先端 急 尖 1 枚 不 孕 , 宿 存 ; 花萼 长 约 5 毫米 , 2 唇形 , 上唇 3 浅 裂 , 下唇 2 裂 至 基 部 , 裂片 2 毫米 ; 花冠 紫色 , 长 逾 2 . 5 厘米 , 外面 无 毛 , 冠 檐 裂片 5 , 长 约 4 毫米 , 有 凹 缺 ; 雄蕊 4 , 2 强 , 花丝 全 被 疏 柔毛 ; 子房 无 毛 , 花柱 被 硬 毛 。
+( 仅见 模式 标本 照片 ) 生长 环境 0 生 石灰 岩 山坡 。
+分布 范围 0 国内 分布 : 产 广西 ( 龙州 ) 中国 植物 志 : 70 : 119 中 文学 名山 / 弄 岗 耳 叶 马蓝 拉丁 学名 Perileptalong gangensis ( D . FangetH . S . Lo ) C . Y . WuetC . C . Hu 界 植物 界 门 < a href = " # " > 被子 植物 门 < / a > 纲 < a href = " # " > 双子 叶 植物 纲 < / a > 亚纲 < a href = " # " > 合瓣 花 亚纲 < / a > 目 < a href = " # " > 管状花 目 < / a > 科 < a href = " # " > 爵 床 科 < / a > 亚科 < a href = " # " > 爵 床 亚科 < / a > 族 芦 莉 花 族 , 马蓝 亚 族 属 < a href = " # " > 耳 叶马 蓝 属 < / a > 种 山 / 弄 岗 耳 叶 马蓝
+流量 车 流量 车 是 一款 互联 网 电子 商务 营销 优化 工具 。
+通过 网络 互访 , 共同 贡献 , 共同 分享 的 原理 ; 独创 鼠标 键盘 驱动 , 详细 浏览 页面 , 免费 为 用户 提供 网站 流量 提升 、 电子 商务 营销 优化 的 服务 。
+软件 原理 0 流量 车 利用 所有 用户 进行 网络 互访 , 共同 贡献 , 共同 分享 的 原理 ; 由于 用户 的 IP 不 同 、 接入 商 线路 不 同 、 地理 位置 不 同 、 访问 时间 不 同 , 用户 即可 轻松 获得 真实 有效 的 流量 访问 , 独创 鼠标 键盘 驱动 , 完全 模拟 真人 买家 操作 电脑 详细 浏览 宝贝 页面 。
+软件 特点 0 真实 有效 1 . 软件 独创 的 模拟 鼠标 键盘 自动 运行 , 完全 百分 百 还原 真人 浏览 , 让 鼠标 键盘 自己 动 起来 ; 2 . 用户 可 自行 设置 搜索 条件 , 价格 区间 、 地区 等 。
+可 设定 流量 数量 , 精准 控制 ; 3 . 软件 基于 网络 互访 的 随机 性 , 设置 了 不 同 的 访问 模式 , 会 对 店铺 内 交易 记录 、 评价 管理 , 以及 主页 等 相关 宝贝 随机 浏览 , 做到 百分 百 真实 ; 4 . 软件 使用 基于 IE 内核 以及 webkit 内核 自主 研发 的 嵌入 式 浏览 器 , 对 访问 效果 百分 百 真实 有效 。
+简单 易 用 1 . 使用 简单 , 发布 任务 后 即可 全 自动 开始 获得 优质 流量 , 无 需 人工 值守 ; 2 . 可 同时 发布 多 条 任务 , 随意 设定 所 需 流量 的 数量 ; 3 . 软件 界面 清爽 , 并 根据 用户 反馈 意见 , 持续 进行 改进 升级 ; 4 . 纯 绿色 软件 , 全面 支持 Windows 2000 , XP , 2003 , Vista , Win 7 操作 系统 。
+安全 可靠 1 . 采用 恶意 网址 智能 鉴定 技术 , 自动 收集 、 过滤 恶意 网址 ; 2 . 采用 深层 防 木马 技术 , 有效 阻止 各类 病毒 木马 从 网页 入侵 ; 3 . 软件 采用 基于 IE 内核 以及 webkit 自主 研发 的 嵌入 式 浏览 器 , 能 有效 拦截 各类 病毒 木马 程序 下载 ; 4 . 软件 自带 的 反 淘宝 识别 系统 , 每次 运行 之 前 会 自动 清除 cookies 等 信息 , 保护 客户 隐私 安全 。
+中文 名 流量 车 软件 特点 1 真实 有效 软件 特点 2 简单 易 用 软件 特点 3 安全 可靠
+李 锦春 李 锦春 , 男 , 生于 1965 年 4 月 , 研究 生 学历 , 常州 大学 材料 科学 与 工程 学院 教授 , 硕士 生 导师 。
+东风 汽车 工艺 所 客座 研究 员 , 常州 聚合 物 改 性 与 应用 技术 研究 院 特聘 院长 。
+1 在 研 项目 0 ① 江苏 省 科技 厅 “ 无 卤 阻燃 热塑 性 聚合 物 基 复合 材料 关键 技术 及 装备 ” 产学研 前瞻 性 项目 ; ② 江苏 省 科技 厅 “ 高 性能 纤维 增强 热塑 性 聚合 物 基 复合 材料 关键 技术 的 研究 与 产业 化 ” ; ③ 常州 市 “ 无 卤 阻燃 长 纤维 增强 热塑 性 聚合 物 基 复合 材料 的 产业 化 ” ; ④ 企业 委托 项目 4 项 。
+科研 成果 01992 ~ 1999 , 先后 在 高 分子 材料 功能 化 改 性 领域 主持 和 以 主要 完成 人 参与 省 部级 项目 4 项 , 企业 委托 项目 4 项 。
+2000 ~ 今 , 先后 在 功能 化 改 性 以及 阻燃 材料 、 复合 材料 领域 主持 并 完成 省 部级 基金 4 项 , 3 项 通过 省级 鉴定 , 2 项 通过 验收 。
+以 第 二 完成 人 参与 省级 应用 基金 、 省 十五 招标 子 课题 各 1 项 , 并 分别 通过 省级 鉴定 。
+先后 获得 教育 部 技术 发明 一等奖 、 江苏 省 科技 进步 二 等 奖 、 三 等 奖 、 四等 奖 各 一 次 、 中国 石化 总 公司 科技 进步 三 等 、 中国 包装 总 公司 科技 进步 三 等 奖 各 一 次 。
+申请 发明 专利 16 件 。
+中文 名 李 锦春 < a href = " # " > < / a > 国籍 中国 出生 日期 1965 性别 男
+微动 测 探 微动 ( Microse is 或 Microtre mor ) 是 地球 表面 日常 微小 的 颤动 , 它 区别 于 有 特定 震源 和 发 震 事件 的 微震 , 在 任何 事件 和 地点 均 可以 观测 到 。
+微动 位移 幅度 一般 为 几 个 微米 , 频率 变化 范围 在 0 . 3 - 5 . 0 Hz 之间 。
+利用 空间 自 相关 ( SAC ) 法 , 从 微动 中 可以 提取 到 瑞 雷波 频 散 , 通过 计算 求 出 波速 在 垂 向上 的 分布 , 依据 不 同 时代 不 同 岩性 地层 的 波速 差异 , 即可 推测 出 地下 各种 地层 的 分布 情况 。
+
+明 林 良 山茶 白羽 图 《 明 林 良 山茶 白羽 图 》 是 林 良 在 明代 所 创作 的 中国 古画 , 绢本 , 设色 , 纵 152 . 3 厘米 , 横 77 . 2 厘米 , 文物 原属 民间 收藏 , 现 藏 于 上海 博物 馆 。
+作品 介绍 0 图 写 山野 一隅 , 屹 岸 上 伫立 着 一 双 神态 俊逸 的 雄 白 雉 , 雌 白 雉 在 崖 下 过 步 。
+石 后山 茶树 枝叶 疏落 有 致 , 纷 开 粉红 色 的 花朵 。
+一 对 喜鹊 在 树 斡 上 跳跃 喳 叫 。
+洋溢 着 观 腾 的 气氛 。
+作品 赏析 0 此 图 用 精细 的 工笔 勾勒 , 敷 色 鲜 妍 雅丽 , 发扬 了 宋代 院体画 周密 不 苟 的 写生 传统 , 崖 石 和 树 斡 则 以 劲健 纵 放 的 得寸 进尺 墨 皴 染 , 形成 了 工 写 结合 、 刚柔 相 济 的 艺术 神采 , 是 较 工整 的 代表 作 。
+图 上 署 款 “ 林 良 ” 。
+作者 简介 0 林 良 ( 约 公元 1416 — 1480 年 ) , 字 以 善 , 南海 ( 今 广州 ) 人 。
+弘治 拜 工 部 营 善 所 丞 , 直 仁智 殿 改 锦衣 卫 百户 , 与 四明 吕 纪 先后 供奉 内廷 。
+善 画 花果 、 翎毛 , 着色 简 淡 , 备 见 精巧 。
+其 水墨 禽鸟 、 树 石 , 继承 南宋 院 体 画派 放纵 简括 笔法 , 遒劲 飞动 , 有 类 草书 , 墨色 灵活 , 为 明代 院 体 花鸟 画 变 格 的 代表 作家 , 也是 明代 水墨 写意 画派 的 开创 者 。
+中文 名 明 林 良 山茶 白羽 图 作者 < a href = " # " > 林 良 < / a > 类别 中国 古画 规格 纵 152 . 3 厘米 , 横 77 . 2 厘米
+城 山 村 城 山 村 位于 建德 市 的 东南 部 , 距 市府 约 37 公里 , 大 长线 穿 村 而 过 , 村 的 北面 和 东面 与 本 镇 的 丰 畈 、 溪口 村 接壤 , 南 倚 上 马 溪 , 西面 与 万 兴 村 相邻 , 全村 总 面积 约 4 . 2 平方 公里 , 康庄 环 村 水泥 路 1 . 8 公里 ; 现有 自然 村 5 个 , 村民 小组 10 个 , 居民 387 户 , 总 人口 1242 人 ; 全村 现有 耕地 面积 745 亩 , 山林 面积 560 亩 ; 02008 年 村 人均 收入 5461 元 ; 我 村 目前 发展 以 养殖 、 蚕桑 为 农业 的 主导 产业 , 拥有 畜禽 生态 养殖 大户 两家 , 蛋鸡 4 . 5 万 羽 , 母猪 50 余头 , 生态 桑 果园 基地 和 冬枣 种植 基地 各 一 个 , 现 其他 村民 都 已 开始 向 种 养殖 业 发展 。
+中文 名称 城 山 村 行政 区 类别 村 地理 位置 浙江 省 杭州 市 建德 市 大同 镇 人口 1242 人
+西安 今 正 广告 有限 责任 公司 西安 今 正 广告 有限 责任 公司 于 2013 年 05 月 14 日 成立 。
+法定 代表 人 朱 云龙 , 公司 经营 范围 包括 : 一般 经营 项目 : 广告 的 设计 、 制作 、 代理 、 发布 ; 房 地产 营销 策划 ; 商业 运营 管理 等 。
+10 公司 性质 有限 责任 公司 ( 自然 人 投资 或 控股 ) 成立 时间 2013 年 05 月 14 日 登记 机关 西安 市 工商 行政 管理 局 曲江 分局
+Visual Autor unAuto run 是 一款 应用 软件 , 适用 于 PC 平台 , 软件 版本 为 v3 . 3 。
+运行 环境 0 支持 WinXp , Win 2000 , Win 2003 软件 介绍 0 Visual Auto run 是 一款 免费 试用 的 数据 光盘 自动 播放 程序 . 通过 它 , 用户 可以 创建 出 和 商业 光盘 一 样 的 交互 式 效果 . 用户 可以 用 它 创建 光盘 自动 播放 菜单 , 或者 自动 打开 某一 文件 , 或者 显示 幻灯 片 等 . 另外 , 用户 也 可以 通过 安装 播放 器 的 方式 , 实现 自动 播放 多 媒体 文件 的 功能 . 软件 名称 Visual Auto run 软件 平台 pc 软件 版本 v3 . 3
+2012 坦帕 2012 Tampa 《 2012 坦帕 2012 Tampa 》 是 一款 IOS 平台 的 应用 。
+应用 介绍 0 My 2012 的 坦帕 移动 应用 程序 增强 你 的 2012 年 共和 党 全国 代表 大会 ( RNC ) 的 经验 。
+管理 所有 在 手掌 邀请 / 票 , 建立 你 的 日历 , 你 的 那 一 周 , 在 夏洛特 。
+有 多余 的 票 在 与 My 2012 坦帕 飞 他们 转移 。
+将 通过 Face book 和 Linked In 与 朋友 和 同事 知道 他们 是否 出席 事件 , 甚至 得到 警报 , 当 他们 到达 。
+该 应用 程序 包括 Yelp 和 流 公约 从 彭 博 新闻 由 坦帕 城市 指南 。
+它 也有 在 2012 年 RNC 和 叽叽 喳喳 的 谈话 围绕 每个 事件 发生 的 所有 事件 的 完整 列表 。
+这 是 RNC 做 的 权利 。
+Brought to you by Event Farmand Bloomberg Government . The My 2012 Tampamobile appen hances your 2012 RepublicanNational Convention ( RNC ) experience . Manage all of your invitations / tickets in the palm of your hand , build your calendar for your week in Charlotte . Have extraticket sTransfer themon the fly with My 2012 Tampa . Connect with friends and colleagues viaFace book and Linked In and know if they areattending you revents , evengetalerts when they arrive . The appin cludes aT ampacity guide powered by Yelpands treaming convention news from Bloomberg . Ital sohas a full listing of all events happeningat the 2012 RNCand the twitter conversationssur rounding eachev ent . Thisis theRNCdone right . 支持 版本 0iOS 4 . 0 及 以上 软件 名称 2012 坦帕 2012 Tampa 软件 平台 IOS 软件 大小 4 . 14 MB
+陈 诚 墓 陈 诚 墓 位于 中华 人民 共和 国 台湾 省 台北 泰山 乡 墓 址 , 在 一 块 海拔 400 米 的 山腰 平台 上 。
+这里 四周 是 连绵 的 山峰 , 山坡 上 植 满 了 郁郁 葱葱 的 修竹 和 相思 树 。
+1965 年 3 月 5 日 陈 诚 去世 , 台湾 各 大 新闻 媒体 进行 了 充分 报道 。
+蒋 介石 手书 挽 匾 、 挽联 登 在 显目 位置 。
+挽 匾 上书 : “ 党 国 精华 ” 。
+挽联 是 : “ 光复 志节 已 至 最后 奋斗 关头 , 那 堪 吊 此 国殇 , 果 有 数 耶 !
+革命 事业 尚 在 共同 完成 阶段 , 竟 忍 夺 我 元 辅 , 岂 无 天 乎 ?
+” 公祭 之 日 , 蒋 介石 亲率 文武 官员 致祭 , 备 极 哀荣 。
+0 < b > 人物 生平 < / b > < b > < / b > 陈 诚 去世 后 , 国民党 “ 副 总裁 ” 一 职 从此 取消 。
+陈 诚 , 中华 民国 国民 革命 军 一 级 上将 , 历任 台湾 省 政府 主席 , 台湾 当局 “ 行政 院 院长 ” , 台湾 当局 “ 副 总统 ” 等 职 。
+陈 诚 主政 台湾 期间 , 对 稳定 国民党 在 台 统治 作用 甚 大 。
+陈 诚 是 蒋 中正 的 亲信 , 也是 自 黄埔 成立 后 蒋 中正 执政 的 心腹 之 一 , 有 “ 小 委员 长 ” 之 称 。
+国民 革命 军 内部 由 陈 诚 领导 的 派系 亦 有 土木 系 之 称 。
+中文 名 陈 诚 墓 位置 台湾 省 台北 泰山 乡 相关 人物 陈 诚 人物 去世 时间 1965 年 3 月 5 日
+李 蕴辉 李 蕴辉 , 女 , 1963 年 生 , 现 为 山东 政法 学院 法学 教授 , 刑法 教研 室 主任 , 青岛 大学 硕士 生 导师 , 山东 省 法学 会 刑法 学 研究 会 秘书 长 、 常务 理事 。
+被 评 为 山东 政法 学院 首届 “ 教学 名师 ” , 山东 政法 学院 刑法 学 省级 精品 课程 负责 人 。
+多次 获得 山东 政法 学院 “ 优秀 教学 奖 ” 、 “ 科研 工作 优秀 奖 ” 、 “ 优秀 教师 ” 和 “ 优秀 共产 党员 ” 称号 。
+个人 简介 0 教育 背景 01985 年 毕业 于 中国 政法 大学 法律 系 , 2002 年 获 中国 政法 大学 法学 硕士 学位 。
+工作 经历 01985 至于 1988 年 在 黑龙江 省 人民 检察 院 刑事 检察 处 工作 。
+主讲 过 《 刑法 学 》 、 《 刑事 案例 研究 》 、 《 刑事 法律 专题 》 、 《 刑事 诉讼 法学 》 、 《 证据 法学 》 等 课程 。
+科研 成果 0 在 《 法学 》 、 《 中央 政法 管理 干部 学院 学报 》 、 《 法学 论坛 》 、 《 法制 与 社会 发展 》 、 《 政法 论丛 》 等 法学 学术 刊物 上 发表 论文 近 30 篇 。
+编写 《 刑事 法治 与 人权 保障 》 、 《 刑罚 之 道 》 ( 副 主编 ) ; 参与 编写 《 刑事 诉讼 法学 》 、 《 证据 学 》 、 《 刑事 案例 研究 》 等 教材 。
+代表 作品 0 《 公诉 权 滥用 与 制约 研究 》 , 《 法学 论坛 》 2004 年 第 3 期 ; 《 公司 治理 结构 的 立法 原则 》 , 《 法制 与 社会 发展 》 2004 年 第 6 期 ; 《 从 刑法 的 功能 论 当前 严打 刑事 政策 的 理念 》 , 《 政法 论丛 》 2004 年 第 6 期 。
+承担 课题 0 山东 省 社科 规划 重点 课题 《 中国 严打 刑事 政策 研究 》 , 主要 成员 ; 山东 省 教育 厅 课题 《 刑事 政策 与 社会 政策 研究 》 , 主持 人 ; 山东 政法 学院 重点 研究 课题 《 刑事 案例 研究 》 , 主要 成员 ; 山东 政法 学院 教改 项目 《 刑事 案例 教学 法 研究 》 , 主持 人 。
+科研 获奖 0 参与 的 山东 省级 重点 课题 《 中国 严打 刑事 政策 研究 》 获 山东 省 社科 优秀 成果 三 等 奖 ; 个人 撰写 论文 曾 获 山东 省 法学 优秀 成果 二 、 三 等 奖 3 项 ; 个人 论文 及 参编 教材 及 课题 曾 获 山东 政法 学院 优秀 科研 成果 一 、 二 、 三 等 奖 4 项 。
+中文 名 李 蕴辉 出生 日期 1963 年 毕业 院校 中国 政法 大学 性别 女
+冰川 世纪 冰川 世纪 是 指 地球 气候 酷寒 , 高 纬度 地方 的 广阔 区域 为 大陆 冰川 所 覆盖 的 时期 。
+最近 的 冰川 期 在 更新 世 , 据 在 欧洲 和 北美 研究 的 结果 , 认为 共 有 六 次 冰川 期 , 五 次 间 冰川 期 。
+0 这 是 指 地球 气候 酷寒 , 高 纬度 地方 的 广阔 区域 为 大陆 冰川 ( continental glacier ) 所 覆盖 的 时期 。
+最近 的 冰川 期 在 更新 世 , 据 在 欧洲 和 北美 研究 的 结果 , 认为 共 有 六 次 冰川 期 , 五 次 间 冰川 期 。
+在 日本 根据 分析 冰 斗 地形 ( 围 谷 地形 , kar ) 地形 发现 有 两次 冰川 期 。
+最 显著 的 冰川 期 是 在 石炭纪 一二 迭 纪 , 冰川 的 遗迹 残留 于 冈瓦纳 大陆 。
+除 上述 两大 冰川 期 外 , 在 欧洲 和 美洲 还 发现 有 前 寒 武 纪 、 中生代 和 第 三 纪 的 冰川 遗迹 , 但 都 不 太 显著 。
+中文 名 冰川 世纪 历史 次数 六 次 遗迹 残留 于 冈瓦纳 大陆 成因 地球 气候 酷寒
+保定 市 尤耐特 电气 有限 公司 保定 市 尤耐特 电气 有限 公司 成立 于 2003 年 , 坐落 于 河北 省 保定 国家 高新 技术 产业 开发 区 内 , 是 国家 认定 的 高新 技术 企业 。
+专业 从事 以 电力 自动 化 技术 及 电力 电子 技术 为 核心 的 软 、 硬件 产品 的 研发 、 生产 、 销售 和 服务 。
+经过 尤耐特 全体 员工 的 共同 努力 , 主力 产品 的 市场 占有 率 居 领先 地位 基本 信息 0 企业 logo 公司 简介 0 。
+科学 完善 的 技术 创新 体系 和 庞大 健全 的 营销 服务 网络 使 尤耐特 走 在 国内 同 行业 的 前列 。
+拥有 自主 知识 产权 的 高新 技术 产品 二十余 项 , 主要 产品 有 中 高压 固态 软 起动 柜 、 高压 无 功 补偿 装置 ( TCR 型 SVC 静止 式 动态 无 功 补偿 装置 , 并联 电容 器 投 切 型 无 功 补偿 装置 和 有 载 调 压 型 无 功 补偿 装置 ) 、 厂 用电 监控 系统 及 继电 保护 装置 系列 产品 。
+公司 于 2006 年 6 月 成立 “ 保定 国家 高新 技术 产业 开发 区 工厂 自动 化 及 现场 总线 应用 技术 研发 中心 ” , 在 技术 研发 水平 上 再 上 新 的 台阶 。
+公司 营销 网络 遍及 全国 , 产品 广泛 的 应用 于 火力 发电 、 水电 、 输 配电 、 冶金 、 化工 、 矿山 、 建筑 等 行业 。
+公司 严格 控制 产品 质量 , 并 通过 了 ISO 9001 质量 管理 体系 认证 。
+多年 来 尤耐特 人 艰苦 奋斗 , 拼搏 进取 , 企业 获得 了 大量 的 荣誉 , 被 评 为 国家 高新 技术 企业 、 全国 守 合同 重 信用 企业 , 其中 中 高压 固态 软 起动 装置 被 列 为 2010 年 河北 省 重点 项目 , 配电 网 无 功 优化 及 用电 监测 系统 被 列 为 2009 年 保定 市 重点 项目 。
+公司 致力 于 电力 自动 化 技术 及 电力 电子 技术 的 研究 与 应用 , 大力 开发 光伏 逆变 、 柔性 交流 输电 、 智能 保护 、 监控 等 关键 产品 , 努力 打造 国内 一 流 品牌 。
+公司 理念 0 公司 本 着 “ 以 德 为本 , 任人 唯 贤 , 发扬 团队 精神 , 帮助 员工 成功 ” 的 先进 用人 理念 , 凝聚 了 一批 优秀 的 人才 , 铸造 了 一批 高 科技 团队 , 80 % 以上 员工 具有 本科 学历 。
+公司 注重 人才 培养 , 每年 出 巨资 用于 人员 培训 , 为 每 一名 优秀 员工 提供 展示 自 我 的 舞台 。
+尤耐特 把 企业 文化 作为 企业 灵魂 , 以 独具 特色 的 价值 理念 体系 和 丰富 多彩 的 文化 生活 , 塑造 “ 敬业 、 创新 、 诚信 、 高效 ” 的 企业 精神 , 获得 行业 认同 和 广大 员工 的 归属 感 。
+公司 以 事业 留 魂 、 感情 留心 、 待遇 留 人 创造 和谐 的 环境 , 让 各类 人才 放飞 理想 , 大展 宏图 。
+发展 目标 0 公司 始终 以 “ 创 名牌 产品 , 争 行业 第 一 ” 作为 公司 的 战略 发展 目标 , 以 “ 科技 创新 为 动力 , 提高 企业 核心 竞争 力 ” 作为 公司 发展 战略 。
+发展 历程 02003 年 公司 成立 2004 年 公司 推出 了 保护 监控 类 产品 2005 年 公司 的 ECS 厂 用电 监控 系统 应用 于 ( 首 个 超 超 临界 机组 ) 江苏 阚 山 电厂 , 同年 通过 河北 省 软件 技术 企业 认证 2006 年 公司 的 首 台 高压 固态 软 启动 产品 开始 正式 投运 , 同年 公司 通过 ISO 9001 认证 , 并 被 评 为 “ 明星 企业 ” 成立 保定 国家 高新 区 挂牌 的 “ 工厂 自动 化 及 现场 总线 应用 技术 研发 中心 ” 2007 年 公司 的 厂 用电 监控 系统 成为 中石化 的 推荐 产品 , 同年 公司 被 评 为 “ 守 合同 、 重 信用 ” 企业 , 获得 科技 部 技术 创新 基金 的 支持 2008 年 公司 的 首套 SVC 无 功 补偿 产品 正式 投运 , 同年 被 评 为 河北 省 首批 “ 高新 技术 企业 ” 2009 年 公司 的 高压 固态 软 启动 项目 被 列 为 河北 省 的 重点 项目 2010 年 公司 开始 建成 “ 尤耐特 ” 工业 园 , 同年 被 评 为 河北 省 “ 第 四 届 优秀 软件 企业 ” , 成为 省 青少年 科技 创新 教育 基地 公司 名称 保定 市 尤耐特 电气 有限 公司 总部 地点 河北 省 保定 国家 高新 技术 产业 开发 区 内 成立 时间 2003 年 经营 范围 认定 的 高新 技术 企业
+直接 印刷 UV 板 简介 0 直接 印刷 UV 板 : 直接 喷 印 , 一 次 成像 , 速度 快 , 印刷 效果 永 不 掉色 , 刮 不 掉 , 防水 , 环保 , 具有 很高 的 表面 耐 抗 性能 , 非常 高 光泽 的 效果 , 还 具有 抗 划痕 抗 抓挠 的 性能 综合 效果 , 增加 产品 亮度 , 保护 产品 表面 , 其 硬度 高 , 耐 腐蚀 摩擦 , 不 易 出现 划痕 等 。
+分类 0 直接 印刷 UV 板 分 为 : 单色 直接 印刷 UV 板 , 木纹 直接 印刷 UV 板
+山阳 银花 镇 上 店 子 小学 山阳 银花 镇 上 电子 小学 是 一 所 农村 普通 完全 小学 , 位于 陕西 省 山阳 县 东 5 公里 的 银花 镇 上 店 子 村村 , 创建 于 1985 年 3 月 , 始 称 “ 山阳 二小 ” 。
+学校 北 依 烟墩 岭 , 南邻 商 南河 , 西 与 山阳 名 景 狮子 山 相望 , 山 郭 公路 从 校园 脚下 穿过 , 交通 便利 , 民 殷 物丰 , 实为 教化 育人 之 佳境 。
+学校 现有 25 个 教学 班 , 教 职工 40 人 , 学生 560 人 , 服务 银花 镇 东 片 3 个 乡镇 。
+近年 来 , 学校 先后 被 市县 授予 “ 德育 教育 先进 集体 ” 、 “ 治安 模范 学校 ” 、 “ 优秀 考点 ” 、 “ 民主 管理 示范 学校 ” 、 “ 提高 普通 中学 教育 质量 先进 单位 ” 、 “ 高考 旗帜 奖 ” 等 。
+中文 名 山阳 银花 镇 上 店 子 小学 类别 小学 地址 陕西 省 山阳 县 东 5 公里 的 银花 镇 上 店 子 村 在校 学生 560 人
+Bryan Smith 电影 作品 < / b > < b > < / b > < table > 上映 时间 剧 名 扮演 角色 导演 主演 担任 职务 1993 Star Michael Miller 詹妮 · 加斯 , 克雷格 · 比尔克 演员 1993 超速 魔 侠 2 : 魔界 再现 Jimmy Anthony Hick ox 朱利安 · 山 德斯 , 乔安娜 · 帕库 拉 演员 1993 情 碎 海伦娜 Russell 詹妮弗 · 林奇 朱利安 · 山 德斯 , 雪琳 · 芬 演员 1991 巴 格 西 Chick Hill 巴 瑞 · 莱文森 沃伦 · 比蒂 , 安妮特 · 贝宁 演员
+盾 眼 新疆 螨 盾 眼 新疆 螨 是 属 、 沙 螨 、 沙 螨 、 下 一 个 物种 。
+0 < table > 编号 : 15294 中文 科名 : 沙 螨 科 拉丁 科名 : Trombiculidae 中文 亚科 : 沙 螨 亚科 拉丁 亚科 : Trombiculinae 中文 属 名 : 新疆 螨 属 拉丁 属 名 : Xinjiangsha 拉丁 种名 : scu tocularis 定 名人 : We netal . 年代 : 1984 中文 名 : 盾 眼 新疆 螨 模式 产地 : 新疆 生境 : 寄主 : 鼹 形 田鼠 国内 分布 : 新疆 中文 学名 盾 眼 新疆 螨 界 动物 界 科沙 螨 亚科 沙 螨
+润 禾 汽车 销售 《 润 禾 汽车 销售 》 是 一款 Android 平台 的 应用 。
+应用 介绍 0 汽车 销售 是 根据 市场 需求 , 为 广大 的 浏览 者 提供 汽车 销售 产品 行业 的 最 新 的 新闻 动态 、 产品 展示 、 企业 荣誉 、 企业 相册 等 消息 的 综合 性 客户 端 。
+支持 版本 02 . 1 . x 以上 软件 名称 润 禾 汽车 销售 软件 平台 Android 软件 大小 5 . 22M 软件 语言 中文
+王 连甲 王 连甲 ( 1870 ~ 1939 ) , 即 农历 庚午 年 ( 清 穆宗 同治 九 年 ) 至 农历 己卯 年 ( 民国 二十八 年 ) , 享年 69 岁 。
+现 甘肃 景泰 县 人 , 当地 有 名 的 儒医 。
+“ 王氏 鼻 愈 疗法 ” 首创 者 。
+精通 中医 鼻 科 、 儒学 。
+生平 简介 0 王 连甲 先 先 ( 1870 - 1939 ) , 甘肃 景泰 县 人 。
+好 读书 , 尤 好 医书 。
+常 白日 忙于 农务 , 夜晚 挑灯 研习 医学 。
+王 连甲 先生 生平 涉猎 广泛 , 包括 儒学 、 中医 、 周易 、 历史 、 地理 、 书法 等 。
+其中 医学 与 儒学 更 是 对其 产生 了 深远 的 影响 。
+王 连甲 先生 对于 医学 孜孜 不 倦 、 废 寝 忘 食 的 态度 , 终 使 其 医术 得以 大成 。
+其 “ 妙手 回春 ” 的 医术 加 之 “ 慈善 仁德 ” 的 作风 , 深受 当地 乡亲 好评 , 患者 治愈 后 喜 笑 欢颜 的 表情 成 了 王 连甲 先生 最 大 的 享受 。
+先生 从 医方 数 年 , 其 声名 已 传播 开来 , 至此 方圆 几十 里 的 人们 患病 都 要 找 王 连甲 先生 医治 。
+而 王 连甲 先生 所 到 之 处 , 皆 受 乡亲 们 的 爱戴 , 常有 富者 馈赠 家 资 , 皆 被 先生 婉拒 。
+主要 贡献 0 先生 从医 数十 年 , 常有 鼻塞 、 流涕 、 打 喷嚏 、 头痛 者 就诊 , 先生 辩 为 鼻炎 鼻窦 炎 症状 。
+因 “ 肺 开窍 于 鼻 ” 、 “ 肺脏 宣 畅 , 清道 自利 ” , 故而 先生 尝 以 “ 宣 肺 通窍 ” 之 法 , 配 以 当地 特产 植物 , 患者 皆 不 久 而 愈 。
+此 便 是 “ 王氏 鼻 愈 疗法 ” 的 雏形 , 后 经 四代 传承 , “ 王氏 鼻 愈 疗法 ” 得以 成 其 体系 , 并 为 中医 呼吸 道 疾病 的 治疗 做出 了 巨大 贡献 。
+10 月 中旬 , “ 王氏 鼻 愈 疗法 ” 第 四代 传人 王 生 梁 先生 介绍 央视 网 健康 有 约 栏目 专访 , 解读 。
+王氏 鼻 愈 疗法 家族 传承 顺序 : 王 连甲 、 王 怀儒 、 王 中 相 、 王 生梁 中文 名 王 连甲 国籍 中国 民族 汉 出生 地 现 甘肃 景泰 县 出生 日期 1870 逝世 日期 1939
+浙江 桐 童 文化 发展 有限 公司 浙江 桐 童 文化 发展 有限 公司 ( 简称 桐 童 文化 ) , 正式 注册 于 2017 年 4 月 , 总部 位于 杭州 , 是 一 家 致力 于 文化 艺术 交流 、 艺术 创作 、 文化 创意 的 发展 型 公司 。
+公司 不 断 践行 文化 艺术 与 人才 双赢 的 理念 , 逐步 实现 文化 与 商业 的 结合 , 以 与 时 俱 进 的 理念 , 不 断 整合 各 专业 艺术 文化 人才 、 商业 人才 及 探索 , 力求 多元化 发展 , 通过 专业 的 文化 创意 , 让 文化 艺术 在 经济 发展 中 得到 弘扬 与 尊重 , 促进 文化 产业 的 发展 。
+公司 主 营 : 工艺 美术 品 设计 、 文化 创意 策划 , 会展 承办 、 艺术 品 零售 等 , 服务 : 依托 自身 的 资源 优势 为 各 企事 业 单位 提供 特色 的 品牌 、 营销 策划 、 商务 咨询 服务 及 投资 管理 . 。
+
+Highway Chile 《 HighwayChile 》 是 JimiHendrix 的 音乐 作品 , 收录 在 《 TheJimiHendrixExperience [ MCABox ] 》 专辑 中 。
+歌词 0Yeah , his guitar s lung acrosshis back . His dusty bootsishi sCadillac . Flamin ' hair justa blowin in the windA in ' t seenabed insolong it ' sasin . Heleft home whe nhe wass eventeen The rest of the worl dhe had longed to see . But every body knows the Boss . Arolling sto new hogathers nomoss . But you ' d probably call himat ramp . Butitgo esa little deeper thantha tHe ' sa . . . high way chile , yeah . Now some people say he hada girl back homeW homessed around and did himpretty wrong Theytell me it kinda hurt himbad , Kindamade himfeel pretty sad . Icouldn ' tsay what went through hism ind Anyway , heleft the world behind . But every body knows the same old story , In love and war you can ' t loseinglory . Now you ' d probably call himat ramp . ButIkno witgo esa little deeper thantha tHe ' sa . . . high way chile Walkon brother , yeah . Hisold guitar s lung acrosshis back , His dusty bootsishi sCadillac . Flamin ' hair justa blowin in the wind , Ain ' t seenabed insolong it ' sasin . Now you may call himat ramp . ButIkno witgo esa little deeper than that , He ' sa . . . high way chile Walkon brother , yeah . Don ' t let noonestop you . Highway Chile . Yeah yeah yeah ! 所 属 专辑 The JimiHendrix Experience [ MCABox ] 发行 时间 2000 - 09 - 12 音乐 时 长 3 分 40 秒
+神 之 眷属 之 规则 之 手 《 神 之 眷属 之 规则 之 手 》 是 连载 于 17k 小说 网 的 一 部 奇幻 魔法 类 小说 , 作者 是 简约 派 。
+小说 类型 0 奇幻 魔法 内容 简介 0 百年 难得 一见 的 绝顶 无 属性 废 材 , 有 是 有 怎样 的 奇遇 , 使 他 一 步步 踏 上 了 强者 的 巅峰 , 又 是 怎样 在 这个 过程 中 邂逅 了 成 神 后 的 眷属 呢 … … 中文 名 神 之 眷属 之 规则 之 手 作者 简约 派 小说 进度 连载 连载 网站 17k 小说 网
+要素 异质 视角 的 技术 选择 与 区域 发展 效率 研究 《 要素 异质 视角 的 技术 选择 与 区域 发展 效率 研究 》 是 2017 年 厦门 大学 出版 社 出版 的 图书 , 作者 是 张 月玲 , 。
+0 < b > 内容 简介 < / b > < b > < / b > 本 书 基于 劳动 异质 视角 的 要素 替代 弹性 分析 突破 了 既 有 文献 局限 于 资本 - 劳动 替代 弹性 及其 增长 效应 的 研究 , 为 进一步 探讨 区域 潜在 增长 动力 转换 机制 提供 了 一 个 要素 替代 约束 分析 的 新思路 。
+研究 表明 , 我 国 丰裕 的 劳动 力 资源 禀赋 比较 优势 存在 着 动态 变化 , 人力 资本 结构 与 技术 选择 之间 的 适配 性 对 区域 发展 效率 具有 决定 性 作用 。
+书名 要素 异质 视角 的 技术 选择 与 区域 发展 效率 研究 作者 张 月玲 定价 & nbsp ; ¥ 58 出版 社 厦门 大学 出版 社 出版 时间 2017 - 06 - 23 开本 18 开
+二 代 微光 瞄准 镜 11 16 式 二 代 微光 瞄准 镜 是 由 荷兰 尤斯 发 信号 设备 公司 制造 的 单兵 武器 夜间 瞄准 镜 。
+它 使用 北约 标准 瞄准 具 支座 , 可 装 在 所有 目前 服役 的 步兵 轻 武器 和 重 机枪 上 。
+该 瞄准 镜 已 装备 北约 国家 和 中东 一些 国家 的 军队 。
+0 产品 名称 : UA 1 1 16 式 二 代 微光 瞄准 镜 U A1116 Second - generation Night Sight SIGNAAL - Us faBV , NL 结构 特点 UA 1 1 16 式 二 代 微光 瞄准 镜 质量 小 , 体积 小 。
+它 采用 的 第 二 代 微 通道 板 像 增强 器 , 自带 供电 电路 和 亮度 自动 控制 及 防 强 光电 路 , 物镜 和 目镜 均 可调 , 且 有 亮 分 划 。
+瞄准 镜 的 精度 不 受 射击 振动 和 反复 拆装 的 影响 。
+性能 数据 放大 倍率 - - - - 4 . 2 # + [ × ] 视场 - - - - 9 ° 30 ′ 角 分辨率 ( 星光 条件 下 , 目标 对比 度 为 30 % 时 ) - - - - 1 密 位 调焦 范围 - - - - 25 m ~ ∞ 电源 - - - - 3VD C 质量 - - - - 1 . 6kg 中文 名称 二 代 微光 瞄准 镜 生产 单位 荷兰 尤斯 发 信号 设备 公司 现状 生产 用途 单兵 武器 瞄准
+灵活 的 圣诞 老人 《 灵活 的 圣诞 老人 》 是 一款 休闲 类 小 游戏 , 大小 为 1292K 。
+背景 设定 0 小 朋友 们 收 到 圣诞 礼物 了 吧 ?
+那么 你 知道 圣诞 老人 的 礼物 是 怎么 得到 的 吗 ?
+这 可 不 是 件 容易 的 活 , 看看 圣诞 老人 灵敏 的 身手 你 就 知道 了 。
+操作 指南 0 操作 指南 如何 开始 1 游戏 加载 完毕 点 击 PLAY 即可 开始 游戏 。
+操作 指南 操作 按键 2 键盘 方向 键 ← → 控制 圣诞 老人 移动 , 要 躲避 上方 的 火球 哦 。
+游戏 目标 0 帮助 圣诞 老人 躲避 火球 , 收集 更多 的 礼物 , 挑战 高分 吧 。
+中文 名 灵活 的 圣诞 老人 & nbsp ; 游戏 类型 休闲 类 游戏 平台 pc 玩家 人数 单人 游戏 引擎 flash 主要 角色 圣诞 老人
+HP 黑色 诈欺 师 《 HP 黑色 诈欺 师 》 是 disy 创作 的 网络 小说 , 发表 于 晋江 文学 网 。
+作者 0disy 作品 简介 0 流星 街 的 黑色 诈欺 师 黑 鹭 , 性格 反复 无 常 , 有 严重 的 异 装 癖 , 嗜好 化妆 , 喜欢 看 【 哗 】 片 。
+兴趣 是 捕捉 有 成长 价值 或者 强大 的 猎物 , 用 美貌 诱惑 目标 , 最后 给予 对方 死神 KISS 。
+死神 KISS 是 一 种 通过 接吻 吸取 别人 生命 的 BT 能力 。
+“ 黑 鹭 先生 , 我 要 以 诈欺 罪 逮捕 你 , 因为 你 偷走 了 我 的 心 。
+” 黑 鹭 攻 , CP 卢修斯 , 1 V1 中文 名称 HP 黑色 诈欺 师 作者 disy 连载 状态 已 完结 连载 平台 晋江 文学 网 更新 时间 2012 - 11 - 28
+黄 富 一年 参加 革命 , 任 宜 五区 八 乡 赤卫 队员 , 1931 年 在 辽 市 石头 公 下 被 敌 杀害 , 时年 30 岁 。
+
+深圳 高 优 电子 有限 公司 深圳 高 优 电子 有限 公司 于 2016 年 02 月 03 日 在 深圳 市 市场 监督 管理 局 登记 成立 。
+法定 代表 人 吕 友 锋 , 公司 经营 范围 包括 安防 智能 设备 的 研发 、 销售 ; 建筑 智能 化 系统 等 。
+企业 信息 0 公司 类型 有限 责任 公司 登记 机关 深圳 市 市场 监督 管理 局 成立 时间 2016 年 02 月 03 日发 照 时间 2016 年 02 月 03 日
+Kristian Bernard Bernard 是 一名 演员 , 代表 作品 有 《 森林 凶 物 》 、 《 贝尔纳克尔兹 》 等 。
+主要 作品 0 主要 作品 电影 作品 1 < table > 上映 时间 剧 名 扮演 角色 导演 主演 担任 职务 2011 森林 凶 物 Jason Horton 格伦 · 帕默尔 , Kristian Bernard 演员 制作 人 2010 贝尔纳克尔兹 BooBooScuba 埃里克 · 艾特 巴里 Martin Kove , JeanetteR oxborough 演员 2009 琼斯 复仇 DJ Rough Knuckles 塞德里克 · 凯尔斯 凯尔 · 米歇尔 , 塞德里克 · 凯尔斯 演员 2009 布鲁克林 警察 Thugball player ( uncre dited ) 安东尼 · 福奎阿 理查 · 基尔 , 唐 · 钱德尔 演员 2006 黑色 十字 架 Jean - Claude La Marre Jean - Claude La Marre , John Jean 制作 人 2005 孤注 一掷 2 Mr . Um fu fu John My rick 格伦 · 普拉默 , Jared Day 演员 2004 美发 沙龙 Jerry LaMothe 塔 缇 娜 · 阿里 , Kristian Bernard 演员 2003 Vegas Vampires Private investigator 弗雷德 · 威廉森 Lorna Baez , 丹尼尔 · 鲍德温 演员 2003 Play asBall Waiter Jennifer Harper 艾伦 · 佩恩 , 伊莉丝 · 尼尔 演员 < / table > 主要 作品 电视 剧 作品 2 < table > 首映 时间 剧 名 扮演 角色 导演 主演 担任 职务 2000 SexWar sHim self ( 1 episode ) Kristian Bernard , Adam Hatley 演员 < / table > 人物 关系 0 < table > 合作 关系 人物 名称 合作 作品 合作 两次 以上 的 影人 John Jean 合作 作品 ( 3 ) : 《 黑色 十字 架 》 , 《 孤注 一掷 2 》 , 《 美发 沙龙 》 Jean - Claude La Marre 合作 作品 ( 3 ) : 《 黑色 十字 架 》 , 《 孤注 一掷 2 》 , 《 美发 沙龙 》 格伦 · 帕默尔 Glenn Plummer 合作 作品 ( 2 ) : 《 VegasVampires 》 , 《 森林 凶 物 》 埃里克 · 艾特 巴里 EricE tebari 合作 作品 ( 2 ) : 《 贝尔纳克尔兹 》 , 《 VegasVampires 》 Chris Broughton 合作 作品 ( 2 ) : 《 VegasVampires 》 , 《 PlayasBall 》 LilaA viv 合作 作品 ( 2 ) : 《 黑色 十字 架 》 , 《 美发 沙龙 》 AlbertC . Che valier 合作 作品 ( 2 ) : 《 黑色 十字 架 》 , 《 孤注 一掷 2 》 LizLa Marre 合作 作品 ( 2 ) : 《 孤注 一掷 2 》 , 《 美发 沙龙 》
+红 凯贝尔 法国 红 凯贝尔 ( 国际 ) 集团 有限 公司 1999 年 创立 , 以 领先 的 公司 运营 管理 和 务实 、 高效 、 以 人为 本 的 企业 精神 , 坚持 与 时尚 同行 、 锐意 创新 , 使得 红 凯贝尔 企业 稳步 发展 , 已 形成 稳固 的 销售 体系 。
+近年 来 , 红 凯贝尔 加快 国际 化 运营 步伐 , 由 法国 总部 在 中国 内地 设立 品牌 运营 中心 , 并 在 中国 花城 广州 设立 了 生产 、 研发 及 物流 基地 , 市场 网络 迅速 扩 伸 至 中国 内地 、 东南亚 等 地区 , 至今 全国 已 覆盖 20 多 个 省 , 拥有 百 个 销售 终端 。
+红 凯贝尔 与 国际 时尚 接轨 , 以 非凡 的 品味 和 鲜明 的 风格 得到 女装 界 的 高度 瞩目 , 它 独特 的 设计 理念 体现 出 西式 文化 潮流 与 东方 传统 魅力 的 结合 , 款式 优雅 而 充满 激情 , 深受 东方 时尚 女性 青睐 。
+企业 介绍 0 法国 红 凯贝尔 ( 国际 ) 集团 有限 公司 创立 于 1999 年 , 是 一 家 集 品牌 研发 、 品牌 策划 、 品牌 生产 、 品牌 销售 为 一体 化 的 高 端 集团 公司 。
+公司 大 中华 区 总部 设立 在 美丽 的 花城 广州 , 拥有 厂房 5000 平米 及 全套 美 、 日 进口 作业 设备 , 共 有 员工 800 多 人 , 每年 向 市场 推出 2000 多 个 适销 款式 , 年 生产 能力 50 万 件 以上 , 同时 拥有 多 个 伙伴 合作 工厂 , 年 生产 能力 累计 可达 80 万 件 以上 。
+强大 的 生产 能力 保证 货品 的 快速 及时 供应 , 保证 质量 及 补货 时间 , 满足 市场 需求 。
+集团 大楼 内 设有 : 办公 大楼 、 服装 展示 厅 、 产品 研发 中心 、 生产 中心 、 商品 中心 、 物流 配送 中心 。
+长期 以来 公司 秉承 服务 于 市场 、 服务 于 大众 , 展现 女性 魅力 和 不 同 生活 态度 的 经营 理念 , 为 消费 者 传递 着 不 同 时期 、 不 同 场合 的 国际 时尚 流行 趋势 , 以 人为 本 , 致力 于 打造 全国 最 优秀 的 女装 品牌 , 已 在 全国 建立 起 20 多 个 总 代理 商 , 已 形成 稳固 的 销售 体系 。
+品牌 介绍 0 品牌 介绍 品牌 简介 1 上 个 世纪 , 欧美 时尚 向 中国 市场 发起 全面 冲击 , 源自 法国 的 HOK ABR 红 凯贝尔 女装 品牌 于 1999 年 正式 进军 中国 市场 , 并 以其 独特 的 魅力 风靡 于 全国 各 大 城市 。
+她 不仅 继承 了 法国 时装 的 浓郁 浪漫 , 还 吸收 了 欧陆 服饰 文化 的 大气 精粹 。
+在 风格 上 , 沿袭 国际 大牌 时装 风范 、 融合 欧 韩 潮流 元素 , 引入 独特 的 设计 精髓 , 利用 精湛 的 剪裁 艺术 铸造 出 出 类 拔萃 的 时尚 美感 , 满足 女性 消费 者 追逐 浪漫 时尚 的 情怀 。
+优雅 、 浪漫 、 富有 贵族 气息 , 红 凯贝尔 始终 以 满足 消费 者 需求 、 引领 国际 时装 潮流 为 己任 , 为 天下 女人 寻觅 时尚 和 浪漫 , 创造 新鲜 、 优质 的 新 时代 生活 方式 。
+< b > 品牌 诠释 < / b > < b > < / b > 红 凯贝尔 抒写 优雅 、 自信 女人 … … 红 凯贝尔 源自 浪漫 之 都 法国 。
+她 优雅 , 贵族 , 崇尚 时尚 浪漫 主义 , 于 千万 人 之 中 轻易 展露 , 大 放 异彩 ; 她 自信 , 智慧 , 具有 独到 的 品位 , 在 纷扰 杂乱 间 如 一颗 珍珠 , 超凡 脱俗 ; 她 集 现代 灵感 与 传统 元素 与 一 身 , 是 梦想 与 现实 碰撞 而 出 的 火花 。
+优雅 与 含蓄 , 品质 与 时尚 的 结合 , 永恒 演 译 着 现代 都市 女性 典雅 含蓄 , 温柔 婉约 的 特征 , 魅惑 、 耀眼 、 炫丽 、 多姿 , 红 凯贝尔 将 最 完美 的 时尚 呈现 给 有 品味 有 知识 函 养 的 时尚 女性 。
+品牌 介绍 年龄 定位 2 < b > 20 - 40 岁 的 都会 白领 女性 < / b > 追求 时尚 品味 , 优雅 , 女人 味 的 时尚 白领 一族 , 机关 单位 , 银行 职员 、 教师 、 医生 、 公务员 、 自由 职业 者 等 。
+品牌 介绍 风格 定位 3 < b > 简约 + 优雅 + 时尚 + 且 带 女人 味 < / b > < b > < / b > 简约 珠花 、 蝴蝶 结 、 蕾丝 等 精致 的 细节 点染 出 廓 形 的 极致 简约 为 职场 拼搏 的 女性 塑 出 利落 干练 的 形象 优雅 优雅 而 灵动 、 知性 而 婉约 的 女子 红 凯贝尔 以 发现 的 视角 捕捉 特殊 的 灵韵 塑造 出 一 个 个 优雅 的 都市 名媛 时尚 红 凯贝尔 从来 不 盲从 于 流行 但 又 不 忘 从 巴黎 、 米兰 等 时尚 之 都 汲取 养分 融入 自己 独有 的 语言 形成 职业 装 中 独特 的 时尚 女人 味 作为 女人 , 尤其 不 能 少 了 那 一 点 专属 女人 的 韵致 在 职场 、 在 酒会 Patry 上 、 在 度假 海滩 … … 女人 味 都 是 女人 最 精彩 的 点缀 品牌 介绍 发展 理念 4 以 合理 的 价格 创造 优秀 的 设计 与 优质 的 产品 以 最 优 性价 比 成为 代理 加盟 的 首选 合作 伙伴 公司 名称 法国 红 凯贝尔 ( 国际 ) 集团 有限 公司 总部 地点 广州 市 海珠 区 逸景 路 珠江 国际 轻纺 城 自编 A1 栋 13 楼 经营 范围 品牌 研发 、 策划 、 生产 、 销售 员 工 数 800 人 创立 时间 1999 年
+极致 拼图 1904 期 《 极致 拼图 1904 期 》 是 一款 Android 平台 的 应用 。
+应用 介绍 0 一款 很 耐玩 的 手机 拼图 游戏 , 拼图 游戏 是 一款 很 经典 的 益智 类 游戏 , 它 变化 多端 , 给 你 带来 无限 的 想象 与 乐趣 。
+支持 版本 01 . 6 以上 软件 名称 极致 拼图 1904 期 软件 平台 Android 软件 大小 2 . 36M 软件 语言 中文
+西安 九宫 格 电子 信息 技术 有限 公司 西安 九宫 格 电子 信息 技术 有限 公司 是 一 家 专注 于 旅游 景区 微 信 公众 平台 建设 以及 企业 业务 系统 延伸 手机 端 的 高 科技 公司 。
+总部 位于 陕西 省 西安 市 西安 九宫 格 电子 信息 技术 有限 公司 主要 产品 有 针对 旅游 景区 的 第 三 方 微 信 公众 平台 建设 及 服务 。
+针对 企业 的 业务 系统 手机 移动 化 解决 方案 。
+
+广东 泰 新 信息 科技 有限 公司 广东 泰 新 信息 科技 有限 公司 是 中国 广州 的 一 家 高 科技 公司 , 位于 广州 市 天河 区 东圃 一 横 路 118 号 东 泷 创意 园 G02 - 03 室 。
+泰 新 公司 致力 于 互联 网 平台 的 开发 、 互联 网 商品 销售 、 软件 开发 、 批发 和 零售 、 信息 系统 集成 、 信息 技术 咨询 及 广告 等 的 综合 服务 。
+泰 新 的 业务 包括 核心 电商 、 云 计算 、 创新 项目 “ 互联 网 + 包装 ” 以及 其他 业务 等 。
+公司 名称 广东 泰 新 信息 科技 有限 公司 外文 名称 Guang dong taixin information technology co . , LTD 总部 地点 中国 广州 成立 时间 2015 年 12 月 24 日 经营 范围 & nbsp ; 软件 开发 、 信息 系统 集成 服务 、 软件 批发 、 互联 网 商品 销售 公司 性质 有限 责任 公司 ( 自然 人 投资 或 控股 ) 法定 代表 人 李 剑平 注册 资本 1000 万 注册 号 91440101 MA59B9AE7Y
+兆基 光明 城 兆基 光明 城 项目 位于 泉州 市 丰泽 区 清源 山 余脉 , 东岳 山南 少林 寺 脚下 , 东 临 刺桐 北路 , 南 靠 泉州 市 体育 中心 , 南侧 为 沿 山 规划 路 , 西 接 东岳 前街 。
+项目 简介 0 项目 用地 119357 平方 米 , 总 建筑 面积 35 万 平方 米 , 其中 商业 16 万 平方 米 , 地下 车库 6 万 平方 米 , 住宅 6 万 平方 米 , 其余 为 影院 、 酒店 、 公寓 、 幼儿 园 等 。
+总 停车 数 1688 辆 , 属 中 大型 城市 综合 体 。
+项目 北侧 与 南 少林 寺 、 清源 山 风景 区 毗邻 , 用地 范围 内 保留 两 颗 百年 老 榕树 及 一 块 清代 急 公 尚义 牌坊 , 依托 名城 、 名山 、 名寺 、 名牌 坊 , 打造 “ 泉州 清源 山 至 东海 生态 绿 道 ” 的 最 重要 景观 节点 之 一 。
+该 项目 深厚 的 历史 文化 积淀 , 也 使 本 项目 成为 拥有 “ 山海 江城 ” 之 说 的 泉州 浓墨 重彩 的 一 笔 。
+本 项目 聘请 戴德梁行 房 地产 顾问 ( 深圳 ) 有限 公司 、 深圳 市 朗图 广告 有限 公司 、 厦门 聚贤庄 房 地产 营销 代理 有限 公司 等 知名 企业 , 结合 项目 自身 北 高 南 低 的 有 利 地势 , 北面 最 高 为 海拔 29 米 , 南面 最 低 为 海拔 9 . 2 米 , 规划 “ 坡地 地 式 情景 体验 街区 ” 最 佳 商业 形态 。
+引进 国际 知名 百货 、 影院 、 大型 宴会 、 “ 泉州 的 兰 桂 坊 ” , 成就 国际 、 文化 、 时尚 气息 并重 的 都市 中心 。
+奉献 给 泉州 的 是 —— 南 中国 . 首席 都市 文 旅 商业 公园 。
+交通 信息 0 < b > 项目 位置 : < / b > < b > 泉州 市 丰泽 区 南 少林 寺 南侧 , 刺桐 路 西侧 < / b > 配套 信息 0 < b > 绿化 率 : < / b > < b > 35 % < / b > < b > < / b > < b > 容积 率 : < / b > < b > 1 . 4 < / b > < b > < / b > < b > 车位 < / b > < b > 1688 < / b > < b > < / b > 楼盘 名 兆基 光明 城 物业 类别 普通 住宅 , 公寓 , 别墅 , 写字 楼 建筑 面积 35 万 平方 米 占地 面积 119357 平方 米
+梅 酱 中国 传统 食品 。
+【 清 】 顾 仲 《 养 小 录 》 有 : 【 原文 】 三伏 取 熟 梅 捣烂 , 不 见 水 , 不 加盐 , 晒 十日 。
+去 核 及 皮 , 加 紫苏 , 再 晒 十日 收 贮 。
+用 时或 盐 或 糖 , 代 醋 亦 精 。
+【 译文 】 三 伏天 取 熟 梅 捣烂 , 不 沾 水 , 不 加盐 , 晒 十天 。
+去掉 核 及 皮 , 加 紫苏 , 再 晒 十天 收藏 。
+用 的 时候 , 又 当 盐 又 当 糖 , 当 醋 用 也 行 。
+
+天津 摩 屋 科技 有限 公司 摩 屋 公司 ( Mohut ) 是 一 家 高 科技 公司 , 总部 位于 北京 融 创 动力 产业 园区 。
+摩 屋 公司 主要 主要 提供 : 自助 学琴 — 24 小时 营业 — 包月 服务 — 线 上 购买 — 选择 课程 — 摩 屋 上课 。
+摩 屋 公司 在 2017 年 获得 50 万 种子 轮 融资 。
+证明 材料 来自 蚂蚁 天使
+武 破 天地 复仇 《 武 破 天地 复仇 》 是 连载 于 起点 中文 网 的 小说 , 作者 是 千 风 乱 。
+小说 类型 0 玄幻 小说 内容 简介 0 这 是 一 个 注定 充满 恨 意 的 人 , 一生 中 的 跌宕 起伏 , 只 为 一 个 复仇 梦想 , 少年 时 的 悲苦 令 得 他 对 这个 世界 充满 恨 意 , 一心 只 想 改变 这个 世界 的 规矩 , 走 自己 的 路 … … 中文 名 武 破 天地 复仇 作者 千 风 乱 小说 进度 连载 连载 网站 起点 中文 网
+青 苹果 宝宝 智力 丛书 : 益智 贴 一 贴 1 《 青 苹果 宝宝 智力 丛书 : 益智 贴 一 贴 1 》 是 2010 年 1 月 1 日 中国 人口 出版 社 出版 的 图书 。
+图书 信息 0 出版 社 : 中国 人口 出版 社 ; 第 1 版 ( 2010 年 1 月 1 日 ) 12 页 读者 对象 : 3 - 6 岁 IS BN : 978751010 2707 条形码 : 978751010 2707 尺寸 : 27 . 8x24x 0 . 1cm 重量 : 159 g 内容 简介 0 《 青 苹果 宝宝 智力 丛书 : 益智 贴 一 贴 1 》 是 一 套 融合 智力 开发 与 贴纸 游戏 为 一体 的 启蒙 教育 丛书 。
+《 青 苹果 宝宝 智力 丛书 : 益智 贴 一 贴 1 》 根据 不 同 年龄 阶段 宝宝 的 特点 , 通过 大量 构思 巧妙 , 趣味 盎然 的 益智 游戏 , 由 浅 入 深 、 循序 渐进 地 让 宝宝 一边 贴纸 , 一边 学习 第 二 章 的 数字 、 加减 法 、 颜色 、 形状 、 认识 常见 的 物品 、 小 动物 、 了解 基本 生活 常识 。
+书名 青 苹果 宝宝 智力 丛书 : 益智 贴 一 贴 1 装帧 平装 开本 12 语言 简体 中文
+绢 毛 蝇子 草 绢 毛 蝇子 草 , 多年生 草本 , 高 20 - 45 厘米 。
+根 圆柱 状 。
+茎 疏 丛生 , 直立 , 不 分枝 , 无 毛 , 常 具 不 育 茎 。
+形态 特征 0 基生 叶 叶片 线状 倒 披针形 或 线形 , 长 5 - 8 厘米 , 宽 1 - 2 . 5 毫米 , 基 部 渐 狭 成长 柄 状 , 微 合生 , 顶端 尖 , 边缘 下部 具 缘 毛 , 余 均 无 毛 , 中脉 明显 ; 茎 生叶 少数 。
+总状 花序 , 稀 圆锥 式 总状 花序 ; 花 对生 , 花梗 长 5 - 10 毫米 , 无 毛 ; 苞 片 披针形 , 长 3 - 7 毫米 , 具 缘 毛 ; 花萼 狭 钟形 , 长 10 - 12 毫米 , 直径 3 - 3 . 5 毫米 , 无 毛 或 仅 萼 齿 被 绢 毛 , 余 均 无 毛 , 纵 脉 紫色 , 脉 端 连合 , 萼 齿 卵形 或 三角 状 卵形 , 长 1 - 1 . 5 毫米 , 顶端 圆形 或 钝 头 , 稀 具 凸 尖 , 被 绢 毛 , 边缘 具 缘 毛 ; 雌雄 蕊 柄 被 短毛 , 长 3 - 4 毫米 ; 花瓣 淡 紫色 ( ?
+) , 爪 匙 状 倒 披针形 , 长 10 - 12 毫米 , 露出 花萼 1 - 2 毫米 , 无 毛 和 无明 显 耳 , 瓣 片 轮廓 狭 倒卵形 , 长 5 - 6 毫米 , 深 2 裂 达 瓣 片 的 2 / 3 或 更 深 , 裂片 倒 披 针状 , 全 缘 ; 副 花冠 片 狭长 圆形 或 披针形 , 长 约 1 毫米 , 全 缘 或 微 缺 ; 雄蕊 显著 外露 , 长 12 - 14 毫米 , 无 毛 ; 花柱 明显 外露 , 长 约 12 毫米 。
+蒴果 卵形 , 长 6 - 8 毫米 , 比 宿 存萼 短 ; 种子 肾 形 , 长 约 1 . 2 毫米 , 暗 褐色 。
+花期 7 - 8 月 , 果 期 9 月 。
+生长 环境 0 生长 于 海拔 2300 - 3000 米 的 山地 草丛 。
+分布 范围 0 产 于 甘肃 、 宁夏 。
+中文 学名 绢 毛 蝇子 草 拉丁 学名 Silene & nbsp ; sericata & nbsp ; C . L . Tang 界 植物 界 门 < a href = " # " > 被子 植物 门 < / a > 纲 < a href = " # " > 双子 叶 植物 纲 < / a > 亚纲 < a href = " # " > 原始 花被 亚纲 < / a > 目 < a href = " # " > 中央 种子 目 < / a > 科 < a href = " # " > 石竹 科 < / a > & nbsp ; 亚科 < a href = " # " > 石竹 亚科 < / a > 族 < a href = " # " > 剪秋罗 族 < / a > < a href = " # " > 蝇子 草 亚 族 < / a > 属 < a href = " # " > 蝇子 草属 < / a > 组 < adata - lemmaid = " 12607822 " > 禾 叶 组 < / a >
+吴 国彬 吴 国彬 1 , 1952 年 生 , 江苏 新沂 人 , 中共 党员 , 省 八 届 政协 委员 、 科技 委 副 主任 , 省 九届 、 十 届 政协 常委 , 科技 委 副 主任 , 大学 学历 。
+01977 年 参加 工作 。
+历任 徐州 地区 教育 进修 学院 团 总支 书记 、 徐州 市 科协 办公 室 副 主任 、 徐州 市委 党史 办公 室 副 主任 、 江苏 省 科协 办公 室 副 主任 、 主任 , 1998 年 任 江苏 省 科协 党组 成员 、 秘书 长 、 办公 室 主任 , 2000 年 12 月 起 任 江苏 省 科协 副 主席 、 党组 成员 。
+中文 名 吴 国彬 出生 地 江苏 新沂 出生 日期 1952 年 职业 江苏 省 科协 副 主席 、 党组 成员
+淡 积云 淡 积云 即 空气 对流 运动 不 很 强 ( 一般 垂直 速度 不 超过 5m / s ) 时 形成 的 积云 , 为 对 流云 的 初期 阶段 。
+淡 积云 ( Cuhum ) 呈 孤立 分散 的 小 云块 , 底部 较 平 , 顶部 呈 圆弧 形 凸起 , 像 小 土包 , 云 体 的 垂直 厚度 小于 水平 宽度 。
+在 阳光 下 成 白色 , 厚 的 云块 中部 有 淡 影 , 晴天 常见 。
+1 成因 0 对 流云 顾 名 思 义 , 就 是 由 对流 而 产生 的 云 , 这种 云 多 产生 在 夏季 。
+如果 低 层 中 的 空气 温度 明显 高于 高层 的 空气 温度 ( 温度 随 高度 的 垂直 递减 率 过大 ) 时 , 大气 就 处在 一 种 不 稳定 状态 , 低 层 暖 空气 就会 做 上升 运动 , 从而 形成 对流 。
+上升 运动 的 暖 空气 温度 不 断 降低 , 当 达到 凝结 高度 后 , 水汽 就会 凝结 形成 云 。
+这种 云 就 是 对 流云 。
+对 流云 一般 根据 对流 强度 分 为 三 个 阶段 , 初期 形成 的 为 “ 淡 积云 ” 。
+这种 云 的 特点 是 云 体 较 松散 , 云顶 向上 凸起 , 有 一 个 相对 平坦 的 底部 , 这时 对流 相对 较 弱 。
+淡 积云 成 因为 热力 对流 , 在 夏季 , 地面 受到 太阳 强烈 辐射 , 地温 很高 , 进一步 加热 了 近 地面 气层 , 由于 地表 的 不 均匀 性 , 有 的 地方 空气 加热 得 厉害 一些 , 有 的 地方 空气 湿 一些 , 这样 在 近 地面 气层 中 就 生成 了 大大 小小 和 外界 温度 、 湿度 、 密度 稍 有 不 同 的 气 块 。
+当 达到 凝结 高度 以上 就 形成 了 对流 单体 , 又 逐步 发展 成 淡 积云 。
+从 外形 看淡 积云 的 积云 特征 是 比较 典型 的 孤立 分散 , 底部 平坦 与 顶部 凸起 , 厚度 为 几百 米 至 2000 m , 云顶 一般 在 0 ° 等温 线 高度 以下 , 云 体 由 水滴 组成 。
+由于 对流 所 及 高度 比 凝结 高度 多 不 了 多少 , 所以 云 向上 发展 较 弱 , 造成 形体 扁平 , 顶部 略有 拱 起 , 云 的 水平 宽度 大于 垂直 厚度 。
+厚 的 淡 积云 中间 有 阴影 。
+薄 的 云块 呈 白色 。
+南方 淡 积云 由于 水汽 较多 , 轮廓 不 如 北方 淡 积云 清晰 。
+淡 积云 单体 分散 或 成群 成群 分布 在 空中 , 晴天 多 见 。
+在 高原 地带 , 由于 地形 特殊 , 淡 积云 早晚 形成 的 云 较 低 , 午时 可 达到 2000 m , 云 底 高度 一般 在 500 ~ 1200 m 。
+预示 天气 0 淡 积云 出现 一般 为 晴好 天气 , 不 会 产生 雨 、 雪 现象 , 特别 是 到 了 下午 , 天空 还 只是 淡 积云 , 这 表明 空气 比较 稳定 , 淡 积云 又 叫 晴天 积云 , 是 连 晴天 的 预兆 。
+淡 积云 多数 在 天空 晴朗 的 时候 孤立 分散 地 出现 , 它 的 出现 标志 着 在 云团 上方 出现 稳定 的 气层 , 表明 至少 在 未来 的 几 个 小时 内 的 天气 都 是 不 错 的 。
+组成 部分 淡 积云 是 由 直径 5 - 30 微米 的 小 水滴 组成 , 而 北方 和 青藏 高原 地区 冬季 的 淡 积云 是 由 过 冷水 滴 或 冰晶 组成 , 有 时 会 降 零星 雨雪 。
+在 广州 观测 到 的 淡 积云 , 也有 预兆 坏 天气 的 。
+如 预兆 当天 有 地方 性 阵雨 的 淡 积云 , 这种 淡 积云 的 特点 是 : 淡 积云 生成 的 时间 比 睛 天时 早 ; 云块 大小 不 一 , 天顶 的 比 四周 大 得 多 ; 云顶 圆弧 形 突起 重叠 较多 , 中心 垂直 上 伸 很快 , 从 其 厚度 比 宽度 略 小时 起 , 10 分钟 左右 能 发展 为 浓 积云 。
+中文 名 淡 积云 外文 名 Cuhum 含义 空气 对流 运动 不 很 强 时 形成 的 积云 外形 特征 孤立 分散 的 小 云块 颜色 颜色 白色 常见 于 晴朗天 午后
+撒旦 前妻 , 别 想 跑 《 撒旦 前妻 , 别 想 跑 》 是 沐 水 云杉 写 的 网络 小说 连载 于 红袖 添香 网 。
+小说 类型 0 言情 小说 内容 简介 0 她 是 商界 少有 的 美女 强人 , 他 是 国际 第 二 大 集团 的 首席 总裁 , 他们 各 怀 心事 , 叱咤 商场 , 却 因为 一 y è 情 而 定下 一桩 看似 可笑 的 政治 婚姻 。
+他们 的 婚礼 在 梦幻 中 举行 , 可 婚后 , 他 却 三 个 月 都 没 再 碰 过 她 。
+当 她 终于 爆发 来 拷问 他 时 , 他 的 回答 却是 : “ 我 只是 想 等到 你 爱 上 我 的 那 一天 。
+” 时光 流逝 , 当 她 发现 自己 已 爱 上 他 并 怀孕 时 , 他 的 怀抱 里 却 躺着 另 一 个 女人 , 她 的 好 朋友 。
+她 苦笑 着 问 他 为什么 , 迎来 的 却是 一 场 策划 已 久 的 阴谋 。
+悲愤 中 , 她 撞 向 急速 驶 来 的 车子 , 而 那 一 刻 , 她 告诉 自己 : 她 会 将 他们 还 未 成形 的 小 生命 和 对 他 的 怨恨 一同 带走 。
+五年 后 , 她 归国 , 她 在 一 场 股东 大会 上 极其 顺利 的 拿 下 他 的 集团 的 最 多 股份 , 她 变相 的 报复 他 , 侮辱 他 的 人格 , 甚至 将 他 的 公司 变 为 她 复仇 的 工具 , 可 急功 近利 的 她 , 却 因为 一 件 事情 触怒 了 黑帮 老大 , 无 奈 , 她 只能 嫁给 了 黑帮 老大 以 赎 清 罪孽 , 留 他 一人 守望 。
+一年 后 , 他们 又 再次 重逢 , 此时 的 她 , 是 选择 报复 , 抑 是 重生 … … 中文 名 撒旦 前妻 , 别 想 跑 作者 沐 水 云杉 小说 进度 连载 连载 网站 红袖 添香 网
+幽幽 子心 小说 名称 : 幽幽 子心 小说 书号 : 30780 小说 类型 : 耽美 同人 小说 作者 : 驰 驴 小说 状态 : 完结 首发 网站 : 凤鸣 轩 小说 网 总共 字数 : 100425 内容 简介 0 苏 子幽 还 记得 九 岁时 , 妈妈 牵 着 他 的 手 , 指 着眼 前面 带 笑容 的 男人 说 : 叫 爸爸 。
+对 陌生 的 人 叫 爸爸 ?
+虽然 原先 那个 爸爸 对 自己 不 是 打 就 骂 , 子 幽 不 要 爸爸 , 不 要 。
+不管 苏 子幽 接 不 接受 , 子 幽 妈妈 还是 嫁给 了 这个 男人 ; 这个 新 爸爸 对子 幽 母子 很好 , 最 起码 没 让 他们 饿 着 , 时而 还 带子 幽 去 公园 玩 , 给 子 幽 买 新 玩具 , 可是 子 幽 就 是 不 肯 叫 男人 爸爸 , 总 是 学 妈妈 叫 这 男人 : 井 泽 、 井 泽 。
+每当 子 幽 叫 男人 井 泽 时 都 被 妈妈 呵斥 , 男人 都会 帮子 幽 解围 。
+子 幽 十二 岁时 , 妈妈 在 一 次 出差 时 意外 身亡 , 井 泽 为了 让 子 幽 过 什么 有 母爱 的 生活 , 决定 再婚 , 怎 知 子 幽 离家 出走 被 找 回来 后 不 和 他 说 一句 话 , 连平 时 最 爱 吃 的 菜 都 摔 了 ; 史 井泽 眼看 儿子 越来 越 虚弱 , 一心 急 就 拿 鞭子 来 吓吓 子 幽 , 没 想到 子 幽 倔强 地 咬 紧 最 含 住 泪 用 惶恐 不安 的 眼睛 瞪 史 井泽 ; 从此 那 时候 子 幽 不 再 和 井 泽 说 一句 话 。
+过年 时 井 泽 带子 幽 回家 过年 见人 也 不 叫 , 史 家人 对子 幽 也 没 好 印象 , 都 劝 井 泽 趁 着 年轻 再娶 一 个 , 生 个 自己 小孩 , 这 小 白眼 狼 长大 后 也 不 可靠 ; 不料 这些 被 门外 的 子 幽 听到 。
+那时 起 , 井 泽 没 有 再娶 , 子 幽 对 这事 也 不 闻 不 问 ; 平日 里 一 回家 就 关 在 房间 里 , 吃饭 也 只是 吃 素菜 。
+子 幽 十八 岁 生日 , 井 泽 开车 去 子 幽 学校 接 子 幽 , 想 借此 机会 改善 一 下 父子 之间 的 关系 , 没 想到 , 在 学校 旁边 的 小 巷子 里 , 跟 子 幽 同 一 年级 的 男生 抱住 子 幽 , 两人 口舌 交缠 在 一 起 , 发出 啧啧 响声 。
+回到 家 , 子 幽 向 井 泽 表达 爱意 却 被 井 泽 无 情 的 拒绝 , 从此 两人 上演 了 曲折 的 爱情 故事 。
+章节 目录 0 第 一 章 第 二 章 第 三章 第 四 章 第 五 章 第 六 章 第 七 章 第 八 章 第 九章 第 十 章 第 十一 章 第 十二 章 第 十二 章 第 十 三章 第 十四 章 第 十五 章 第 十六 章 第 十七 章 第 十八 章 第 十九 章 第 二十 章 第 二十一 章 第 二十二 章 第 二十三 章 第 二十四 章 第 二十五 章 第 二十六 章 第 二十七 章 第 二十八 章 第 二十九 章 第 三十 章 第 三十一 章 第 三十二 章 中文 名称 幽幽 子心 作者 驰 驴 类型 耽美 同人 连载 状态 完结
+你 我 不 曾 到 的 远方 《 你 我 不 曾 到 的 远方 》 是 连载 于 红袖 添香 网 的 小说 , 作者 萧 子 虞 . 0 < b > 作品 简介 < / b > < b > < / b > 一 次 任务 变成 一 次 历练 , 一 次 历练 变成 一 次 邂逅 。
+善良 ?
+狠辣 ?
+心机 ?
+单纯 ?
+哪 个 才 是 她 的 真 面目 , 她 的 背后 藏 着 什么 秘密 。
+夜 子 玄 : “ 只要 你 喜欢 , 只要 我 能 办到 , 我 就会 尽 一切 所 能 去 满足 你 , 哪 怕 会 豁出 生命 。
+” 江 琛熙 : “ 我 无 所谓 的 , 你 若 是 真 的 喜欢 就要 顺从 自己 的 心意 , 不 要 让 自己 迷茫 和 受伤 。
+” 他们 说 了 , 可 最后 一 个 都 没 有 剩 下 , 都 离开 了 , 留 下 她 孤身 一人 藏 于 暗夜 。
+她 想 回家 , 可是 … … 哪里 才 是 她 的 家 … …
+北京 欧艺 诗 家具 有限 公司 北京 欧艺 诗 家具 有限 公司 于 2010 年 04 月 07 日 成立 。
+法定 代表 人 郭 红梅 , 公司 经营 范围 包括 : 销售 家具 、 装饰 材料 、 五金 交 电 ; 家具 设计 ; 产品 设计 ; 室内 装饰 工程 设计 ; 技术 开发 、 技术 咨询 、 技术 服务 等 。
+10 公司 性质 有限 责任 公司 ( 自然 人 投资 或 控股 ) 成立 时间 2010 年 04 月 07 日 登记 机关 丰台 分局
+朱 瑶盛 朱 瑶盛 , 男 , 1962 年 生 , 浙江 仙居 人 , 孙 建国 弟子 。
+0 朱 瑶盛 朱 瑶 盛传 承 人 证书
+厦门 景 迅 网络 科技 有限 公司 厦门 景 迅 网络 科技 有限 公司 于 2014 年 09 月 26 日 在 厦门 市 思明 区 市场 监督 管理 局 登记 成立 。
+法定 代表 人 邱 锦章 , 公司 经营 范围 包括 软件 开发 ; 其他 未 列 明 科技 推广 和 应用 服务 业 等 。
+企业 信息 0 公司 类型 有限 责任 公司 ( 自然 人 独资 ) 登记 机关 厦门 市 思明 区 市场 监督 管理 局 成立 时间 2014 年 09 月 26 日发 照 时间 2016 年 03 月 31 日
+历险 的 故事 1 《 历险 的 故事 1 》 是 2005 年 7 月 1 日 中国 社会 出版 社 出版 的 图书 。
+图书 信息 0 出版 社 : 中国 社会 出版 社 ; 第 2 版 ( 2005 年 7 月 1 日 ) 平装 : 316 页 开本 : 16 开 IS BN : 7508 705882 条形码 : 9787508 705880 商品 尺寸 : 23 . 2x 16 . 6x 1 . 4cm 商品 重量 : 399 gASIN : B0011BV VTG 内容 简介 0 四 个 孩子 和 一只 鹦鹉 巧遇 一 起 , 由此 开始 了 他们 一 段 难忘 的 历险 。
+在 古堡 对面 的 海上 , 有 一座 小岛 , 当地 人 叫 它 死 岛 。
+孩子 们 发现 死 岛 上 有 种种 奇怪 的 现象 , 这 大大 激发 了 他们 的 好奇 心 , 于是 , 他们 决定 要 去 探 个 究竟 。
+书名 历险 的 故事 1 ISB N7508 705882 出版 社 中国 社会 出版 社 ; 第 2 版 ( 2005 年 7 月 1 日 ) 装帧 平装 开本 16 开 商品 尺寸 23 . 2x 16 . 6x 1 . 4cm
+ZakHil ditch 电影 作品 < / b > < b > < / b > < table > 上映 时间 剧 名 扮演 角色 导演 主演 担任 职务 2013 最终 时刻 ZakHil ditch 内森 · 菲利普斯 , AngourieRice 导演 编剧
+西式 红烧 牛尾 西式 红烧 牛尾 是 一 道 美食 , 主要 食料 有 牛尾 、 土豆 。
+原料 0 牛尾 500 克 、 土豆 1 个 、 、 洋葱 半 个 、 杏鲍 菇 1 个 、 番茄 沙司 30 克 、 蚝油 10 克 、 盐 、 生抽 、 葱 、 姜 、 调料 包 1 个 ( 没 有 可 放 : 大料 、 桂皮 、 香叶 ) 做法 01 、 牛肉 尽 可能 先 泡 干净 血水 , 然后 用 开水 氽 烫 一 下 , 去除 血 沫 。
+2 、 土豆 、 杏鲍 菇 切块 过 油煎 熟 备用 。
+3 、 沙锅 中 加入 沸水 , 放入 葱 、 姜 、 调料 包 、 蚝油 、 生抽 , 再 下 入 牛尾 , 小 火 炖 , 不 要 放 盐 。
+4 、 根据 牛肉 的 老 嫩 程度 , 炖 到 一定 时候 也 就 差 不 多 了 , 一般 小 火 40 分钟 , 用 高压 锅 就会 快 得 多 。
+5 、 炒锅 , 加一 点 橄榄 油 , 洋葱 炝 锅 , 加一 点 清水 , 连 汤 倒入 炖 好 的 牛尾 , 然后 再 加入 土豆 、 杏鲍 菇 、 番茄 沙司 。
+6 、 加入 盐 , 把 汤汁 收 至 浓 稠 即可 中文 名 西式 红烧 牛尾 主要 食材 牛尾 、 土豆 调料 耗油 、 盐 、 生抽 工艺 炖 、 煎 、 烧
+成都 道森 电器 有限 公司 诉 四川 信华 电力 工程 有限 公司 买卖 合同 纠纷 案 成都 道森 电器 有限 公司 诉 四川 信华 电力 工程 有限 公司 买卖 合同 纠纷 案 是 2017 年 3 月 13 日 在 四川 省 成都 市 高新 技术 产业 开发 区 人民 法院 审理 的 案件 。
+案由 0 民事 、 合同 、 无 因 管理 、 不 当 得利 纠纷 、 合同 纠纷 、 买卖 合同 纠纷 权责 关键 词 0 民 商事 、 权责 情节 、 代理 、 委托 代理 、 民事 责任 规定 、 合同 、 免 责 事由 、 合同 约定 、 诉讼 关键 词 、 诉讼 参加 人 ( 当事 人 和 诉讼 代理 人 ) 、 一般 代理 、 证据 、 审判 程序 、 诉讼 请求 、 简易 程序 案例 原文 0 成都 市 新都 区 人民 法院 民事 判决 书 原告 : 成都 道森 电器 有限 公司 。
+法定 代表 人 : 李 皞 , 总 经理 。
+委托 代理 人 杨 剑 , 四川 中 奥 律师 事务 所 律师 , 一般 代理 。
+委托 代理 人 谢 成 , 四川 中 奥 律师 事务 所 律师 , 一般 代理 。
+被告 : 四川 信华 电力 工程 有限 公司 。
+法定 代表 人 : 易 俊 名 。
+原告 成都 道森 电器 有限 公司 与 被告 四川 信华 电力 工程 有限 公司 买卖 合同 纠纷 一 案 , 本 院 于 2017 年 1 月 10 日 立案 受理 后 , 依法 适用 简易 程序 由 审判 员 陈 善元 独 任 审理 , 于 2017 年 2 月 21 日 在 成都 市 新都 区 人民 法院 泰兴 人民 法庭 公开 开庭 进行 了 审理 。
+原告 成都 道森 电器 有限 公司 的 委托 诉讼 代理 人 杨 剑 到庭 参加 诉讼 , 被告 四川 信华 电力 工程 有限 公司 经 本 院 合法 传唤 无 正当 理由 拒 不 到庭 参加 诉讼 。
+本 案 现已 审理 终结 。
+原告 成都 道森 电器 有限 公司 向 本 院 提出 诉讼 请求 : 1 . 请求 判令 被告 四川 信华 电力 工程 有限 公司 立即 支付 欠款 80000 元 及 利息 ( 按 中国 人民 银行 同期 同类 贷款 利率 计算 , 自 2014 年 12 月 1 日起 至 清偿 之 日 止 , 利息 暂 计算 至 2016 年 10 月 3 日 计 7975 元 , 共计 87975 元 ; 2 . 请求 判令 被告 四川 信华 电力 工程 有限 公司 承担 本 案 全部 诉讼 费用 。
+事实 与 理由 : 原 、 被告 双方 签订 了 关于 箱式 变 电站 zbw 2 某 800 kva / 10 kv / 0 . 4kv 的 买卖 合同 , 货物 价款 为 300000 元 。
+原告 成都 道森 电器 有限 公司 于 2014 年 10 月 1 日 将 货物 送 至 被告 四川 信华 电力 工程 有限 公司 西充 项目 部 , 被告 四川 信华 电力 工程 有限 公司 工作 人员 于 2014 年 10 月 2 日 签收 , 原告 成都 道森 电器 有限 公司 已 履行 合同 约定 义务 , 按照 合同 约定 , 被告 四川 信华 电力 工程 有限 公司 应 在 收 货 之 日起 2 个 月 内 付清 货款 , 但 被告 四川 信华 电力 工程 有限 公司 至今 仍 拖欠 货款 80000 元 , 经 原告 成都 道森 电器 有限 公司 多次 催收 未果 , 特 向 人民 法院 提起 诉讼 , 请求 解决 。
+被告 四川 信华 电力 工程 有限 公司 未 到庭 答辩 。
+本 院 经 审理 认定 事实 如下 : 对于 当事 人 双方 没 有 争议 的 事实 , 本 院 予以 确认 。
+原 、 被告 双方 签订 的 《 购销 合同 》 系 双方 真实 意思 表示 , 未 违反 法律 规定 , 本 院 予以 确认 。
+截止 庭审 当日 , 被告 四川 信华 电力 工程 有限 公司 尚 欠 原告 成都 道森 电器 有限 公司 货款 80000 元 , 本 院 予以 确认 并 在 卷 为证 。
+关于 利息 问题 , 根据 原 、 被告 双方 签订 的 《 购销 合同 》 约定 的 第 九 条 付款 日期 及 结算 方式 : 设备 到 现场 即 支付 至 合同 总 金额 的 85 % , 余款 在 2 个 月 内 支付 完毕 。
+从 送货 单 载明 的 送货 时间 为 2014 年 10 月 1 日 , 故 被告 四川 信华 电力 工程 有限 公司 应 于 2014 年 12 月 31 日前 支付 剩余 货款 。
+故 利息 应从 2015 年 1 月 1 日起 至 货款 付清 之 日 止 , 按 中国 人民 银行 公布 的 人民 币 同期 同类 的 贷款 利率 计付 。
+本 案 中 , 被告 四川 信华 电力 工程 有限 公司 未 到庭 参加 诉讼 , 应 视 为 其 对 原告 成都 道森 电器 有限 公司 所 举 证据 放弃 了 抗辩 的 权利 。
+综上 所 述 , 依照 《 中华 人民 共和 国 民事 诉讼 法 》 第 一百 三十四 条 第 一款 、 第 一百 四十四 条 , 《 中华 人民 共和 国 合同 法 》 第 一百 零七 条 之 规定 , 判决 如下 : 一 、 被告 四川 信华 电力 工程 有限 公司 本 判决 生效 之 日起 十日 内向 原告 成都 道森 电器 有限 公司 支付 货款 人民 币 80000 元 及 利息 ( 以 本金 80000 元 为 基数 , 从 2015 年 1 月 1 日起 至 货款 付清 之 日 止 按 中国 人民 银行 公布 的 人民 币 同期 同类 的 贷款 利率 计付 ) ; 二 、 驳回 原告 成都 道森 电器 有限 公司 的 其它 诉讼 请求 。
+如果 未 按 本 判决 指定 的 期间 履行 给付 金钱 义务 , 应当 依照 《 中华 人民 共和 国 民事 诉讼 法 》 第 二百五十三 条 的 规定 , 加倍 支付 迟延 履行 期间 的 债务 利息 。
+案件 受理 费 1000 元 , 由 被告 四川 信华 电力 工程 有限 公司 负担 ( 此 款 原告 成都 道森 电器 有限 公司 已 垫付 , 被告 四川 信华 电力 工程 有限 公司 在 本 判决 生效 后 十日 内向 原告 成都 道森 电器 有限 公司 给付 ) 。
+如 不 服 本 判决 , 可 在 判决 书 送达 之 日起 十五 日内 , 向 本 院 递交 上 诉状 , 并 按 对方 当事 人 的 人数 提出 副本 , 上诉 于 四川 省 成都 市 中级 人民 法院 。
+审判 员 陈 善元 二 〇 一七 年 三月 十三 日 书记 员 袁 萍萍 审结 日期 2017 年 3 月 13 日 文件 类型 判决 书 审理 程序 一审
+送 宝贝 的 芭蕾 舞 裙 送 宝贝 的 芭蕾 舞 裙 分类 : 装扮 | 大小 : 3684 K | 日期 : 2016 - 09 - 06 专题 : 制作 芭比 娃娃 详细 介绍 : 1 . 按键 操作 mouse left 道具 帮助 人物 做 裙子 和 给 宝贝 打扮 2 . 如何 开始 游戏 加载 完成 后 , 点 击 播放 按钮 , 再 点 击 【 play 】 , 接 着 点 【 next 】 , 然后 点 击 图标 , 进入 游戏 。
+3 . 游戏 目标 帮助 人物 做 裙子 和 给 宝贝 打扮 。
+
+杭州 卫尔 靓 保洁 服务 有限 公司 诉 杭州 麒 杨 文化 娱乐 有限 公司 等 买卖 合同 纠纷 案 杭州 卫尔 靓 保洁 服务 有限 公司 诉 杭州 麒 杨 文化 娱乐 有限 公司 等 买卖 合同 纠纷 案 是 2017 年 1 月 13 日 在 浙江 省 杭州 市 下 城区 人民 法院 审理 的 案件 。
+案由 0 民事 、 合同 、 无 因 管理 、 不 当 得利 纠纷 、 合同 纠纷 、 买卖 合同 纠纷 权责 关键 词 0 民 商事 、 权责 情节 、 代理 、 无 权 代理 、 民事 责任 规定 、 合同 、 诉讼 关键 词 、 证据 、 证据 种类 、 书证 、 视听 资料 、 证据 分类 、 反证 、 证明 、 证据 不 足 、 关联 性 、 合法 性 、 质证 、 证明 对象 、 审判 程序 、 诉讼 请求 、 陪审 、 简易 程序 、 执行 案例 原文 0 杭州 市 下 城区 人民 法院 民事 判决 书 原告 : 杭州 卫尔 靓 保洁 服务 有限 公司 。
+法定 代表 人 : 黄 志武 , 执行 董事 兼 总 经理 。
+委托 诉讼 代理 人 : 赵 娟利 、 祝 敏 , 浙江 一 墨 律师 事务 所 律师 。
+被告 : 杭州 麒 杨 文化 娱乐 有限 公司 。
+法定 代表 人 : 张 雪 , 执行 董事 兼 总 经理 。
+委托 诉讼 代理 人 : 郑 乐 、 张 启龙 , 浙江 泽 大 律师 事务 所 律师 。
+被告 : 郭 万惠 。
+委托 诉讼 代理 人 : 谷 延飞 , 浙江 裕丰 律师 事务 所 律师 。
+原告 杭州 卫尔 靓 保洁 服务 有限 公司 ( 以下 简称 卫尔 靓 保洁 服务 公司 ) 诉 被告 杭州 麒 杨 文化 娱乐 有限 公司 ( 以下 简称 麒 杨 文化 娱乐 公司 ) 、 郭 万惠 买卖 合同 纠纷 一 案 , 本 院 于 2016 年 6 月 17 日 受理 后 , 在 审理 过程 中 发现 该 案 不 宜 适用 简易 程序 , 依法 转 为 适用 普通 程序 , 由 审判 员 王 晓芳 任 审判 长 , 与 人民 陪审 员 赵 招 娣 、 张 帷 组成 合议 庭 , 于 2016 年 11 月 8 日 公开 开庭 进行 了 审理 。
+原告 的 委托 诉讼 代理 人 赵 娟利 、 祝 敏 , 被告 麒 杨 文化 娱乐 公司 的 委托 诉讼 代理 人 郑 乐 、 被告 郭 万惠 的 委托 诉讼 代理 人 谷 延飞 到庭 参加 诉讼 。
+本 案 现已 审理 终结 。
+原告 卫尔 靓 保洁 服务 公司 向 本 院 提出 诉讼 请求 : 1 . 判令 被告 麒 杨 文化 娱乐 公司 向 原告 支付 货款 34524 元 , 赔偿 逾期 付款 损失 2982 . 8 元 ( 暂 自 2016 年 2 月 1 日起 分期 计算 至 起诉 之 日 止 , 此后 按 银行 同期 同类 贷款 利率 为 基础 、 参照 逾期 罚息 标准 计算 至 实际 支付 之 日 止 ) ; 2 . 判令 被告 郭 万惠 对 上述 债务 承担 连带 清偿 责任 ; 3 . 本 案 的 诉讼 费用 ( 案件 受理 费 、 保全 费 等 ) 由 两 名 被告 承担 。
+事实 和 理由 : 原告 自 2014 年 6 月 起 向 被告 麒 杨 文化 娱乐 公司 供应 纸巾 、 精油 用品 等 货物 , 双方 口头 约定 货款 月 结 , 于 每 月初 支付 上月 货款 , 麒 杨 文化 娱乐 公司 仓库 管理 员 等人 在 原告 送货 单 、 每月 供货 清单 上 签字 确认 , 并 出具 入库 单 。
+相应 的 货款 均 由 被告 郭 万惠 向 原告 支付 。
+2016 年 1 月 至 4 月 , 原告 向 两 被告 供应 纸巾 、 精油 用品 等 货物 共计 34524 元 , 经 多次 催讨 , 两 被告 仍未 支付 。
+被告 郭 万惠 的 个人 财产 与 被告 麒 杨 文化 娱乐 公司 的 财产 混同 , 严重 损害 原告 利益 , 参照 《 公司 法 》 相关 规定 , 被告 郭 万惠 应对 被告 麒 杨 文化 娱乐 公司 的 债务 承担 连带 清偿 责任 , 原告 故 诉至 本 院 。
+被告 麒 杨 文化 娱乐 公司 答辩 称 : 双方 的 买卖 行为 被告 并 不 清楚 , 原告 提交 的 证据 中 没 有 被告 公司 的 盖章 , 证据 材料 中 郭 万惠 等 个人 的 行为 也 没 有 得到 被告 公司 的 授权 , 原 、 被告 之间 并非 通过 被告 的 账户 进行 交易 , 故 请求 驳回 原告 的 诉讼 请求 。
+被告 郭 万惠 答 辩称 : 本 案 没 有 合同 , 无法 看出 被告 是 本 案 合同 的 相对 人 。
+即便 存在 合同 关系 , 也是 原告 与 被告 麒 杨 文化 娱乐 公司 的 合同 关系 , 与 被告 郭 万惠 个人 无 关 。
+原告 的 证据 无法 证明 是 原告 公司 与 被告 麒 杨 文化 娱乐 公司 存在 合同 往来 关系 , 仅 证明 黄 志芳 与 相关 人员 的 经济 往来 关系 。
+被告 郭 万惠 与 被告 麒 杨 文化 娱乐 公司 之间 不 存在 财产 混同 的 情况 , 其 也 不 是 被告 麒 杨 文化 娱乐 公司 的 股东 、 法定 代表 人 或 实际 控制 人 , 故 原告 起诉 被告 郭 万惠 缺乏 法律 依据 。
+原告 卫尔 靓 保洁 服务 公司 为 证明 主张 的 事实 , 向 本 院 提交 如下 证据 : 1 . 环球 盛典 预订 平台 官方 微 信 公众 号 注册 的 详细 资料 截屏 1 份 ; 2 . 环球 盛典 服务 中心 微 信 公众 号 注册 主体 的 截屏 1 份 ; 3 . 《 厨房 承包 合同 》 1 份 ; 4 . 公司 文件 1 份 ; 5 . 环球 盛典 行政 人员 通讯 录 1 份 。
+证据 1 - 5 共同 证明 : 环球 盛典 系 被告 麒 杨 文化 娱乐 公司 的 商号 。
+被告 郭 万惠 在 被告 麒 杨 文化 娱乐 公司 处 任 董事 长 一 职 , 系 被告 公司 的 实际 控制 人 。
+6 . 情况 说明 1 份 以及 黄 志芳 的 证言 1 份 , 证明 黄 志芳 系 原告 员工 , 与 被告 麒 杨 文化 娱乐 公司 、 被告 郭 万惠 之间 的 单据 签收 及 款项 往来 系 代表 原告 所 作出 的 职务 行为 。
+7 . 领 付款 凭证 4 份 ; 8 . 货物 供应 明细 4 份 ; 9 . 送货 单 4 份 ; 10 . 入库 单 8 份 ; 11 . 中国 农业 银行 银行 卡 活期 子 账户 交易 明细 3 份 。
+证据 7 - 11 共同 证明 : 1 . 2016 年 1 月 至 4 月 期间 , 原告 向 被告 麒 杨 文化 娱乐 公司 供货 , 款项 共计 34524 元 ; 2 . 原告 向 被告 麒 杨 文化 娱乐 公司 供货 的 相关 的 文件 及 单据 由 被告 一 仓库 管理 员 、 财务 、 总 经理 、 董事 长 签字 确认 ; 3 . 原告 向 被告 麒 杨 文化 娱乐 公司 供货 的 款项 由 被告 郭 万惠 支付 ; 4 . 被告 麒 杨 文化 娱乐 公司 、 被告 郭 万惠 之间 人格 混同 。
+12 . 浙江 省 社会 保险 参保 证明 2 份 , 证明 郭 万惠 、 董 琳 、 杜 丽娟 系 被告 麒 杨 文化 娱乐 公司 的 工作 人员 , 其 对外 签署 文件 的 行为 系 代表 被告 麒 杨 文化 娱乐 公司 做出 。
+13 . 被告 的 经营 场所 及 员工 的 视频 1 份 , 证明 : 1 . 李 子龙 系 被告 麒 杨 文化 娱乐 公司 楼 面部 经理 , 张 振 系 被告 麒 杨 文化 娱乐 公司 楼 面部 主管 , 李 建华 系 仓库 管理 人员 , 三人 对外 签署 文件 的 行为 系 代表 被告 麒 杨 文化 娱乐 公司 做出 ; 2 . 秦 英杰 、 郭 万惠 系 被告 公司 的 管理 层 人员 , 签署 大量 公司 内部 管理 文件 。
+其 二人 有 权限 在 原告 领 付款 凭证 中 签字 并且 该 签字 真实 ; 3 . 环球 盛典 入库 单 系 被告 麒 杨 文化 娱乐 公司 日常 使用 单据 。
+14 . 郭 万惠 与 原告 员工 黄 志芳 的 短信 记录 1 份 , 证明 : 1 . 被告 郭 万惠 确认 其 在 被告 麒 杨 文化 娱乐 公司 处 任职 期间 原告 向 被告 麒 杨 文化 娱乐 公司 供货 的 事实 。
+2 . 被告 郭 万惠 确认 在 2016 年 1 - 4 月 有 共计 34524 元 的 货款 未 向 原告 支付 。
+上述 证据 经 原告 、 被告 当庭 举证 、 质证 , 本 院 认证 如下 : 两 名 被告 对 原告 提交 的 证据 1 、 2 的 真实 性 以及 证明 对象 均有 异议 , 认为 微 信 公众 号 截图 与 被告 麒 杨 文化 娱乐 公司 的 实际 公众 号 不 符 , 本 院 经 核实 环球 盛典 预订 平台 与 环球 盛典 服务 中心 两 个 公众 号 的 信息 , 虽然 目前 的 信息 与 原告 提供 的 信息 不 完全 一致 , 但 因 微 信 公众 号 的 内容 系 可 变更 删除 信息 , 因此 信息 不 一致 的 情况 客观 存在 , 但 目前 公众 号 上 环球 盛典 系 被告 麒 杨 文化 娱乐 公司 的 商号 的 事实 可以 确认 , 故 本 院 予以 确认 上述 证据 的 证明 效力 。
+两 名 被告 对 原告 提交 的 证据 3 、 4 、 5 的 真实 性 、 合法 性 、 关联 性 以及 证明 对象 均有 异议 , 认为 证据 3 合同 上 的 公章 与 被告 麒 杨 文化 娱乐 公司 目前 使用 的 公章 不 相符 , 证据 4 、 5 没 有 被告 麒 杨 文化 娱乐 公司 的 公章 , 本 院 认为 上述 证据 的 原件 在 被告 处 保管 , 被告 虽 有 异议 但 未能 提交 反驳 的 原件 或 相反 证据 , 故 对 上述 证据 的 真实 性 本 院 予以 确认 , 对 被告 郭 万惠 系 被告 公司 工作 人员 的 事实 本 院 予以 确认 , 对 原告 的 其他 证明 对象 本 院 不 予 确认 。
+被告 麒 杨 文化 娱乐 公司 对 原告 提交 的 证据 6 提出 异议 , 认为 无法 说明 黄 志芳 与 被告 麒 杨 文化 娱乐 公司 发生 单据 签收 及 款项 往来 的 情况 。
+被告 郭 万惠 提出 与 黄 志芳 之间 不 存在 货物 往来 , 黄 志芳 作为 原告 员工 缺少 劳动 合同 , 真实 性 存疑 。
+本 院 经 向 黄 志芳 核实 , 结合 黄 志芳 提交 的 社保 参保 证明 以及 原告 提交 的 证据 7 、 8 、 9 、 10 、 11中 黄 志芳 的 签字 , 可以 认定 黄 志芳 系 代表 原告 公司 与 被告 之间 进行 买卖 , 故 对 原告 的 证明 效力 本 院 予以 确认 。
+被告 麒 杨 文化 娱乐 公司 对 原告 提交 的 证据 7 无 异议 , 被告 郭 万惠 对 证据 7 提出 只有 2016 年 2 月 25 日 的 签字 是 被告 郭 万惠 作为 被告 麒 杨 文化 娱乐 公司 工作 人员 的 签字 , 其余 领 付款 凭证 与 被告 郭 万惠 无 关 ; 本 院 对 证据 7 的 真实 性 予以 确认 。
+两 名 被告 对 原告 提交 的 证据 8 无 异议 , 本 院 予以 确认 。
+两 名 被告 对 原告 提交 的 证据 9 中 2016 年 2 月 29 日 的 送货 单 有 异议 , 本 院 认为 被告 未能 提交 足以 反驳 的 相反 证据 , 故 对 该 份 书证 的 真实 性 予以 确认 。
+两 名 被告 对 原告 提交 的 证据 10 中 与 其它 入库 单 不 同 的 两 张 单据 提出 异议 , 本 院 认为 入库 单 虽然 颜色 不 同 , 但 入库 单 上 均 盖有 环球 盛典 仓库 专用 章 , 两 被告 也 未能 提交 相反 证据 反驳 , 故 对 该 证据 的 真实 性 本 院 予以 确认 。
+被告 麒 杨 文化 娱乐 公司 对 原告 提交 的 证据 11 提出 该 证据 无法 证明 原告 与 被告 麒 杨 文化 娱乐 公司 直接 发生 交易 , 所有 款项 系 被告 郭 万惠 汇出 , 被告 郭 万惠 对 该 份 证据 提出 其 代表 被告 麒 杨 文化 娱乐 公司 支付 , 本 院 认为 该 份 证据 系 书证 , 对 真实 性 予以 确认 。
+被告 麒 杨 文化 娱乐 公司 对 证据 12 提出 异议 , 认为 与 本 案 无 关联 性 , 无法 证明 郭 万惠 、 董 琳 、 杜 丽娟 系 麒 杨 文化 娱乐 公司 员工 。
+被告 郭 万惠 对 该 证据 无 异议 , 但 称 自己 没 有 签署 文件 的 资格 。
+本 院 认为 被告 麒 杨 文化 娱乐 公司 为 郭 万惠 、 董 琳 、 杜 丽娟 缴纳 社会 保险 , 可以 证明 上述 人员 系 麒 杨 文化 娱乐 公司 的 工作 人员 , 故 对 该 证据 本 院 予以 确认 。
+被告 麒 杨 文化 娱乐 公司 对 原告 提交 的 证据 13 提出 该 场地 系 被告 郭 万惠 实际 经营 , 被告 麒 杨 文化 娱乐 公司 不 知情 , 李 子龙 、 张 振 、 李 建华 均 非 被告 麒 杨 文化 娱乐 公司 员工 。
+被告 郭 万惠 对 该 证据 的 真实 性 、 合法 性 与 证明 对象 均有 异议 , 认为 不 清楚 是否 在 被告 麒 杨 文化 娱乐 公司 的 场所 拍摄 , 且 拍摄 时 也 没 有 被告 麒 杨 文化 娱乐 公司 的 工作 人员 在场 。
+公司 文件 中 无法 看出 被告 郭 万惠 系 董事 长 , 不 能 证明 原告 的 主张 。
+本 院 认为 原告 提交 的 该 份 视听 资料 以 合法 手段 取得 , 与 原告 提交 的 其他 书证 能 相互 佐证 , 本 院 予以 确认 该 证据 的 真实 性 以及 证明 效力 。
+被告 麒 杨 文化 娱乐 公司 对 原告 提交 的 证据 14 提出 该 证据 证明 系 原告 与 被告 郭 万惠 直接 发生 交易 , 与 被告 麒 杨 文化 娱乐 公司 无 关 ; 被告 郭 万惠 对 证据 14 的 真实 性 与 证明 对象 均有 异议 , 本 院 认为 原告 提交 的 证据 14 与 其他 证据 能 相互 佐证 , 具有 真实 性 , 与 本 案 相 关联 , 本 院 予以 确认 。
+综上 有效 证据 及 原告 、 被告 的 陈述 , 本 院 认定 案件 事实 如下 : 原告 卫尔 靓 保洁 服务 公司 向 被告 麒 杨 文化 娱乐 公司 供应 纸巾 、 精油 用品 等 货物 , 双方 口头 约定 货款 月 结 , 于 每 月初 支付 上月 货款 , 2015 年 双方 之间 发生 相应 的 货款 均 通过 被告 麒 杨 文化 娱乐 公司 的 工作 人员 即 被告 郭 万惠 的 银行 账户 打入 原告 卫尔 靓 保洁 服务 公司 的 工作 人员 黄 志芳 的 账户 进行 支付 。
+2016 年 1 月 至 4 月 , 原告 向 被告 麒 杨 文化 娱乐 公司 供应 纸巾 、 精油 用品 等 货物 共计 34524 元 , 并 开具 了 送货 单 、 每月 供货 清单 , 被告 麒 杨 文化 娱乐 公司 的 工作 人员 在 原告 的 送货 单 、 每月 供货 清单 上 签字 确认 , 并 出具 入库 单 , 但 之后 被告 麒 杨 文化 娱乐 公司 均 未 支付 货款 , 原告 经 多次 向 被告 公司 以及 被告 郭 万惠 催讨 未果 , 故 诉至 本 院 。
+本 院 认为 , 公民 、 法人 可以 通过 代理 人 实施 民事 法律 行为 。
+未经 授权 的 无 权 代理 人 因 其 与 本人 之间 的 关系 , 具有 外表 授权 的 特征 , 致使 相对 人 有 理由 相信 行为 人 有 代理 权 而 与其 进行 民事 法律 行为 , 该 无 权 代理 人 的 法律 后果 由 本人 承担 。
+本 案 中 , 按照 原告 卫尔 靓 保洁 服务 公司 与 被告 麒 杨 文化 娱乐 公司 之间 的 交易 习惯 , 原告 向 被告 公司 所在 地 供应 纸巾 、 精油 用品 , 由 被告 公司 的 工作 人员 在 原告 的 送货 单 、 每月 供货 清单 上 签字 确认 , 被告 郭 万惠 通过 银行 账户 向 原告 的 工作 人员 黄 志芳 的 账户 支付 货款 的 方式 进行 。
+郭 万惠 、 李 建华 、 董 琳 、 杜 丽娟 等 作为 在 被告 公司 经营 场所 工作 的 人员 , 使 原告 有 理由 相信 被告 公司 的 工作 人员 在 送货 单 、 入库 单 、 每月 供货 清单 上 的 签字 行为 是 依 被告 麒 杨 文化 娱乐 公司 的 授权 所 为 。
+因此 , 被告 公司 工作 人员 签收 的 货物 应 视 为 已 由 被告 麒 杨 文化 娱乐 公司 收取 , 被告 麒 杨 文化 娱乐 公司 应当 支付 原告 货款 34524 元 。
+被告 麒 杨 文化 娱乐 公司 辩称 系 个人 行为 与 公司 无 关 的 意见 , 理由 不 能 成立 。
+原告 公司 为 黄 志芳 交纳 了 社保 , 同时 确认 黄 志芳 的 行为 系 原告 公司 授权 的 职务 行为 , 故 黄 志芳 向 被告 公司 送货 、 向 郭 万惠 收取 货款 系 黄 志芳 代表 原告 公司 实施 的 民事 法律 行为 。
+因此 , 原告 卫尔 靓 保洁 服务 公司 有 权 向 被告 麒 杨 文化 娱乐 公司 主张 未 支付 的 货款 34524 元 。
+关于 原告 要求 两 被告 赔偿 逾期 付款 损失 的 诉讼 请求 , 因 双方 并无 订立 书面 协议 对 该 内容 作出 明确 约定 , 故 本 院 不 予 支持 。
+原告 要求 被告 郭 万惠 承担 连带 清偿 责任 的 意见 , 因 原告 提交 的 现有 证据 不 足以 证明 被告 郭 万惠 系 被告 麒 杨 文化 娱乐 公司 的 控股 股东 或 实际 控制 人 , 也 不 能 证明 被告 郭 万惠 的 个人 财产 与 被告 麒 杨 文化 娱乐 公司 的 财产 混同 , 故 对 原告 的 该 诉讼 请求 本 院 不 予 支持 。
+依照 《 中华 人民 共和 国 合同 法 》 第 十条 、 第 四十九 条 、 第 一百 五十九 条 、 一百零七 条 以及 《 中华 人民 共和 国 民事 诉讼 法 》 第 六十四 条 之 规定 , 判决 如下 : 一 、 被告 杭州 麒 杨 文化 娱乐 有限 公司 与 本 判决 生效 之 日起 十日 内 支付 原告 杭州 卫尔 靓 保洁 服务 有限 公司 货款 34524 元 ; 二 、 驳回 原告 杭州 卫尔 靓 保洁 服务 有限 公司 的 其他 诉讼 请求 。
+如果 未 按 本 判决 指定 的 期间 履行 给付 金钱 义务 , 应当 依照 《 中华 人民 共和 国 民事 诉讼 法 》 第 二百五十三 条 之 规定 , 加倍 支付 迟延 履行 期间 的 债务 利息 。
+案件 受理 费 738 元 , 由 原告 杭州 卫尔 靓 保洁 服务 有限 公司 负担 75 元 , 被告 杭州 麒 杨 文化 娱乐 有限 公司 负担 663 元 。
+原告 杭州 卫尔 靓 保洁 服务 有限 公司 于 本 判决 生效 之 日起 十五 日 内向 本 院 申请 退费 ; 被告 杭州 麒 杨 文化 娱乐 有限 公司 于 本 判决 生效 之 日起 七日 内 , 向 本 院 交纳 应 负担 的 诉讼 费 。
+如 不 服 本 判决 , 可以 在 判决 书 送达 之 日起 十五 日内 , 向 本 院 递交 上 诉状 , 并 按照 对方 当事 人 或者 代表 人 的 人数 提出 副本 , 上诉 于 浙江 省 杭州 市 中级 人民 法院 。
+审判 长 王 晓芳 人民 陪审 员 赵 招 娣 人民 陪审 员 张 帷二 〇 一七 年 一月 十三 日 代 书记 员 宋 圆圆 审结 日期 2017 年 1 月 13 日 审理 法院 浙江 省 杭州 市 下 城区 人民 法院 文件 类型 判决 书 审理 程序 一审
+青岛 盛 文 集团 中韩 合资 青岛 盛 文 集团 , 成立 于 2005 年 , 注册 资金 2700 万 美元 。
+盛 文 集团 依托 自身 地产 开发 的 专业 经验 , 致力 于 商业 地产 发展 与 运营 。
+1 公司 架构 0 青岛 盛 文 集团 , 成立 于 2005 年 , 注册 资金 2700 万 美元 , 主 营 地产 开发 、 商业 运营 、 休闲 旅游 、 酒店 餐饮 、 物业 管理 等 业务 。
+地产 开发 方面 , 2006 年 于 青岛 黑龙江 中路 成功 开发 城郊 商业 综合 体 , 累计 投资 逾 20 亿元 ( 人民 币 ) , 包括 青岛 国际 工艺 品 城 、 盛 文 新港 奥特 莱斯 、 盛 文 国贸 大厦 等 主力 项目 , 从而 将 308 国道 夏 庄 段 从 原始 土地 状态 , 直接 提升 为 国际 化 的 商业 区域 , 提升 了 周边 商业 价值 , 凝聚 了 一批 优势 的 商业 项目 。
+另外 , 目前 正在 规划 建设 威海 盛 文 韩国 新城 。
+商业 运营 方面 , 青岛 盛 文 集团 成功 的 运营 管理 了 盛 文 奥特 莱斯 、 青岛 国际 工艺 品 城 等 近 10 万 平米 的 商场 , 年 客流 量 突破 7000 万 人次 , 交易 额 逾 19 亿元 ( 人民 币 ) , 其中 盛 文 奥特 莱斯 为 山东 首 个 精致 奥特 莱斯 , 青岛 国际 工艺 品 城 是 “ 国家 AAAA 级 旅游 景区 ” 、 山东 省级 文化 产业 示范 基地 。
+休闲 旅游 方面 , 已经 投资 800 万元 ( 人民 币 ) 成功 运营 管理 国内 最 大 的 室内 真人 CS 实战 游戏 项目 - 盛 文 镭 战 天下 ; 规划 投资 5 . 5 亿元 ( 人民 币 ) 开发 建设 休闲 农业 项目 —— 盛 文 金钱 鼎 都市 农场 , 以 富有 特色 的 休闲 旅游 文化 打造 一 个 极富 竞争 力 的 朝阳 产业 。
+酒店 餐饮 方面 , 投资 1 . 5 亿元 ( 人民 币 ) , 成功 运营 管理 金 富悦 精品 酒店 等 。
+发现 土地 价值 , 引导 城市 发展 。
+在 过去 的 地产 项目 中 , 盛 文 集团 坚持 整体 规划 、 专业 发展 , 将 每 一 个 项目 都 置于 国内 外 经济 环境 中 , 使得 每 一 个 项目 都 具备 最 大 的 竞争 力 , 满足 城市 综合 发展 需求 。
+盛 文 集团 坚持 以 人为 本 的 发展 原则 , 挖掘 每 一 个 项目 的 最 大 价值 , 让 每 一 个 项目 都 成为 市场 经典 。
+业务 范围 0 地产 开发 、 商业 运营 、 休闲 旅游 、 酒店 餐饮 、 物业 管理 等 业务 。
+发展 历史 02004 年 8 月 , 青岛 盛 文 集团 针对 工艺 品 城 P . J 事宜 , 与 政府 部门 进行 首次 协商 , 10 月 , 提出 事业 提案 书 。
+12 月 , 政府 部门 确定 青岛 国际 工艺 品 城 事业 提案 。
+2005 年 9 月 20 日 , 青岛 国际 工艺 品 城 奠基 开工 。
+项目 总 投资 2700 万 美元 , 占地 面积 64 亩 , 建筑 面积 99 , 351 平方 米 。
+2006 年 2 月 19 日 , 盛 文 国贸 大厦 奠基 开工 , 写字 楼 由 3 个 船型 塔楼 ( 1 ~ 2 层 联体 ) 组成 , 建筑 面积 43 , 614 平方 米 。
+2007 年 3 月 28 日 , 青岛 国际 工艺 品 城 开业 。
+2007 年 3 月 28 日 , 盛 文 奥特 莱斯 奠基 开工 。
+2007 年 4 月 16 日 , 青岛 国际 工艺 品 城 生产 基地 奠基 开工 , 并 于 11 月 30 主体 建成 。
+2008 年 1 月 16 日 , 盛 文 奥特 莱斯 主体 建成 竣工 。
+2008 年 8 月 , 青岛 国际 工艺 品 城 生产 基地 投入 使用 。
+2008 年 9 月 27 日 , 2008 中国 ( 青岛 ) 国际 饰品 节暨 中国 ( 青岛 ) 国际 饰品 博览 会 在 青岛 国际 工艺 品 城 开幕 。
+2009 年 8 月 28 日 , 2009 中国 ( 青岛 ) 国际 饰品 节暨 中国 ( 青岛 ) 国际 饰品 博览 会 在 青岛 国际 工艺 品 城 开幕 。
+2009 年 9 月 , 盛 文 奥特 莱斯 商业 广场 项目 正式 启动 。
+2010 年 8 月 27 日 , 盛 文 奥特 莱斯 开业 。
+2010 年 8 月 27 日 , 2010 中国 ( 青岛 ) 国际 饰品 节暨 服饰 文化 博览 会 在 青岛 国际 工艺 品 城 开幕 。
+2011 年 4 月 , 盛 文 镭 战 天下 项目 开工 建设 , 7 月 23 日盛 文 镭 战 天下 正式 启动 运营 。
+2011 年 , 金 富悦 精品 酒店 开始 筹建 。
+2011 年 8 月 26 日 , 2011 中国 ( 青岛 ) 国际 饰品 节暨 服饰 文化 博览 会 在 青岛 国际 工艺 品 城 开幕 。
+2012 年 8 月 23 日 , 盛 文 珠宝 饰品 创意 文化 产业 园 奠基 开工 。
+2012 年 8 月 23 日 , 2012 中国 ( 青岛 ) 国际 饰品 节暨 服饰 文化 博览 会 在 青岛 国际 工艺 品 城 开幕 。
+2012 年 , 盛 文 金钱 鼎 都市 农场 开工 建设 。
+2012 年 8 月 23 日 , 金 富悦 精品 酒店 隆重 开业 , 拥有 各类 房型 110 间 , 是 青岛 市 城阳 区 首家 精品 酒店 。
+2013 年 , 盛 文 珠宝 饰品 创意 文化 产业 园 主体 建成 完工 , 进入 装修 阶段 。
+2013 年 8 月 23 日 , 2013 中国 ( 青岛 ) 国际 饰品 节暨 时尚 文化 博览 会 在 青岛 国际 工艺 品 城 开幕 。
+2013 年 9 月 28 日 , 盛 文 · 特雅 5D 艺术 馆 开业 。
+2013 年 , 盛 文 金钱 鼎 都市 农场 工程 进展 顺利 , 一期 工程 基本 建成 , 并 于 2014 年 5 月份 开园 纳 客 。
+公益 活动 02007 年 , 青岛 盛 文 集团 斥资 1600 万 , 以 “ 青岛 国际 工艺 品 城 ” 冠名 青岛 中 能 队 一 个 赛季 。
+2007 年 , 青岛 盛 文 集团 向 奥 帆 委 捐赠 价值 20 万元 的 嘉宾 纪念 品 。
+2007 年 , 青岛 盛 文 集团 向 青岛 市 红十字 会 捐赠 400 万元 , 成立 “ 盛 文 集团 · 红十字 救助 金 ” 。
+2008 年 , 盛 文 集团 继续 冠名 中 能 花费 1600 万 人民 币 。
+2008 年 , 盛 文 集团 出资 赞助 的 青岛 市 首部 本土 原创 三维 科普 动画 长片 《 轻 轻松 松 看 奥 帆 》 。
+2008 年 , 盛 文 集团 出资 20 万 人民 币 作为 韩国 珠宝 产 学 珠宝 研究 协会 的 饰品 研究 资金 。
+2009 年 , 青岛 盛 文 集团 捐资 200 万元 , 在 城阳 区 设立 30 处 盛 文 益民 书屋 。
+企业 荣誉 02007 年 , 青岛 国际 工艺 品 城 荣获 “ 质量 、 服务 、 信誉 AAA 级 市场 试验 基地 ” 荣誉 称号 。
+2008 年 , 青岛 国际 工艺 品 城 荣获 “ 山东 省级 文化 产业 示范 基地 ” 称号 。
+2008 年 , 青岛 国际 工艺 品 城 荣获 “ AAAA 国家 级 旅游 景区 ” 。
+2009 年 , 青岛 国际 工艺 品 城 荣获 青岛 市 文化 创意 产业 “ 十大 重点 市场 ” 荣誉 称号 。
+2009 年 , 青岛 国际 工艺 品 城 荣获 “ 共青团 ( 青岛 市委 ) 创业 孵化 基地 ” 荣誉 称号 。
+2010 年 , 青岛 国际 工艺 品 城 荣获 “ 山东 省 大型 综合 旅游 购物 场所 十佳 品牌 ” 。
+2010 年 , 青岛 国际 工艺 品 城 荣获 青岛 市 “ 诚信 企业 ” 荣誉 称号 。
+2011 年 , 青岛 国际 工艺 品 城 荣膺 “ 青岛 市 外贸 公共 服务 平台 ” 。
+2012 年 , 青岛 盛 文 集团 荣获 “ 山东 省 十佳 民营 文化 企业 ” 荣誉 称号 。
+2012 年 , 盛 文 珠宝 饰品 创意 文化 产业 园 荣获 “ 青岛 市 小 企业 创业 ( 孵化 ) 基地 ” 荣誉 称号 。
+2013 年 , 盛 文 集团 总裁 王 裕涛 被 选拔 为 首批 “ 齐鲁 文化 之 星 ” 。
+2013 年 , 盛 文 珠宝 创意 文化 产业 园 荣获 “ 青岛 市 ‘ 千万 平米 ’ 重点 工程 ” 、 “ 青岛 市 文化 产业 示范 园区 ” 等 荣誉 。
+公司 总部 0 中国 · 青岛 公司 名称 青岛 盛 文 集团 外文 名称 Qingdao Sung wonGroup 总部 地点 中国 · 青岛 成立 时间 2005 年 经营 范围 地产 开发 & nbsp ; 商业 运营 & nbsp ; & nbsp ; 休闲 旅游 & nbsp ; 酒店 餐饮 & nbsp ; & nbsp ; 物业 管理 公司 性质 中韩 合资 企业
+华 灿 光电 简介 0 武汉 华 灿 光电 有限 公司 致力 于 研发 、 生产 、 销售 以 GaN 基 蓝 、 绿光 系列 产品 为主 的 高 质量 LED ( LightE mitting Diodes , 发光 二极 管 ) 外延 材料 与 芯片 , 拥有 国际 领先 的 技术 研发 能力 和 成熟 的 生产 工艺 。
+华 灿 光电 2005 年底 在 中国 光谷 —— 武汉 东湖 新 技术 开发 区 注册 成立 。
+2008 年 3 月 , 美国 著名 风险 投资 公司 —— IDG 技术 创业 基金 ( IDGVC ) 投资 华 灿 光电 。
+IDG VC 不仅 为 华 灿 光电 提供 了 扩增 产能 所 需 的 资金 , 也 给 企业 的 发展 注入 了 国际 化 的 经营 理念 。
+华 灿 光电 项目 得到 了 湖北 省 、 武汉 市 和 东湖 新 技术 开发 区 等 各 方面 的 有 力 支持 , 厂区 占地 160 多 亩 , 项目 第 一期 建设 的 固定 资产 投资 1 . 5 亿元 人民 币 , 7000 多 平米 的 一号 厂房 、 包括 MOCVD 在内 的 世界 先进 水平 的 全套 LED 外延 和 芯片 生产 线 已 形成 生产 规模 , 在 LED 显示 屏 用 蓝 绿光 芯片 的 全面 品质 和 白光 用 蓝光 芯片 的 亮度 、 寿命 等 方面 均 处于 国内 领先 水平 , 正 迅速 形成 每月 生产 并 处理 10000 片 2 吋 外延 的 一期 生产 能力 。
+项目 二期 建设 也 将 于 2008 年 7 月 开始 启动 。
+华 灿 光电 以 技术 为 先导 , 汇集 国际 技术 力量 。
+技术 团队 的 主体 由 多位 具有 优秀 化合 物 半 导体 专业 背景 和 丰富 实践 经验 的 归国 博士 、 台湾 及 外籍 专家 组成 , 具有 国际 领先 水平 的 基础 技术 研究 和 产品 开发 、 应用 能力 。
+公司 研发 团队 与 国际 知名 实验 室 同步 进行 技术 研发 , 具备 持续 的 技术 自主 创新 能力 。
+拥有 多项 LED 领域 的 核心 专利 和 先进 的 II I 族 氮化物 基高 性能 LED 发光 管 材料 生长 、 器件 设计 与 生产 制造 的 核心 技术 。
+团队 技术 能力 全面 、 互补 , 确保 公司 的 LED 芯片 技术 研发 能力 保持 领先 水平 。
+企业 历程 2005 年 11 月 武汉 华 灿 光电 有限 公司 成立 于 中国 光谷 —— 武汉 东湖 新 技术 开发 区 , 初期 注册 资金 4000 万 人民 币 。
+2006 年 4 月 一号 厂房 开工 奠基 , 开始 总 投资 1 . 5 亿元 的 一期 项目 建设 。
+2007 年 3 月 一号 厂房 投入 使用 , 公司 正式 迁入 滨湖 路 8 号 , 厂区 总 占地 面积 约 160 亩 。
+2007 年 7 月华 灿 光电 第 一颗 芯片 点亮 。
+2007 年 8 月华 灿 光电 首次 获得 “ 高新 技术 企业 ” 认证 。
+2007 年 9 月华 灿 光电 系列 高 品质 产品 首次 亮相 , 获得 行业 和 媒体 的 高度 关注 。
+2008 年 3 月 IDG 资本 等 投资 机构 参股 华 灿 光电 , 华 灿 光电 在 接受 1150 万 美元 的 境外 投资 后 改制 为 中外 合资 企业 。
+2008 年 7 月华 灿 光电 通过 ISO 9001 质量 管理 体系 认证 。
+2008 年 12 月华 灿 光电 被 评 为 年度 湖北 省 “ 十佳 雇主 ” 。
+2009 年 6 月华 灿 光电 完成 项目 一期 建设 并 启动 二期 项目 扩 产 建设 , 计划 累计 投资 总额 超过 4 . 5 亿元 。
+2009 年 8 月华 灿 光电 通过 ISO 14001 环境 管理 体系 、 OH SAS 18001 职业 健康 安全 管理 体系 认证 。
+2009 年 9 月华 灿 光电 再次 通过 高新 技术 企业 认证 。
+2009 年 12 月华 灿 光电 荣获 最 高级 AAA 级 信用 等级 评价 , 被 授予 “ 武汉 市 信用 企业 ” 称号 。
+2009 年 12 月华 灿 光电 年度 销售 过 亿元 , 技术 和 品质 在 国内 处于 领先 地位 , 获得 良好 的 业内 口碑 。
+2010 年 3 月 公司 再次 获得 开 投 基金 、 IDG 资本 等 机构 注资 折合 人民 币 1 . 5 亿元 。
+上市 之 路 0 华 灿 光电 ( 300323 ) 首次 公开 发行 不 超过 5000 万 股 的 申请 已 获得 中国 证监 会 核准 , 公司 定于 2012 年 5 月 21 日 举行 网 上路 演 。
+华 灿 光电 的 首发 募 投 项目 将 在 公司 现有 产能 基础 上将 新增 产能 10 . 44 万 片 / 月 , 增长 2 . 61 倍 , 为 公司 进入 照明 领域 做好 战略 储备
+学长 在 一 起 吧 《 学长 在 一 起 吧 》 是 汐 默然 创作 的 网络 小说 , 发表 于 小说 阅读 网 。
+作者 0 汐 默然 作品 简介 0 学长 , 如果 我 说 喜欢 你 , 你 会 不 会 以为 我 有 病 啊 。
+学长 , 我 喜欢 你 啊 , 当 你 和 别 的 女生 在 一 起 时 我 会 吃醋 啊 。
+学长 , 我们 在 一 起 吧 。
+【 BL 文 , 清水 向 】 中文 名称 学长 在 一 起 吧 作者 汐 默然 连载 平台 小说 阅读 网
+雷 坦 毕业 于 上海 戏剧 学院 舞台 美术 系 。
+1971 年 参加 中国 人民 解放 军 , 从事 美术 工作 。
+1985 年 转业 到 广州 画院 。
+现 为 中国 美协 会员 、 广州 画院 专职 画家 、 一 级 美术 师 、 广州 画院 “ 特 美国 画廊 ” 艺术 顾问 。
+0 从 1959 年 开始 发表 作品 及 参加 省市 、 全国 性 以及 国际 性 展览 。
+作品 曾 十六 次 入选 全国 性 美展 , 在 《 美术 》 上 刊登 过 七次 。
+代表 《 飞 夺 泸定 桥 》 、 《 故乡 行 》 、 《 钦 马 长江 》 多 件 作品 获 全军 、 全省 优秀 作品 奖 , 巨幅 油画 创作 《 庐山 风云 》 获 广东 省 美术 作品 展览 金牌 奖 , 有 八 件 油画 由 国家 收藏 。
+出版 个人 作品 集 三 册 。
+在 广州 、 香港 、 台湾 等 地 举办 个人 画展 九 次 , 电视 “ 艺术 长廊 ” 、 刊物 作 过 多次 专题 采访 介绍 。
+1993 年 以来 油画 作品 在 香港 太古 佳 土德 公司 举办 的 “ 中国 当代 油画 拍卖 会 ” 中 六 次 取得 成功 。
+1993 年 受 邀 前往 前 国 ART CENTRE COLLEGE OF DESIGN 讲学 。
+论文 著作 有 《 论 色彩 表现 的 张 力 》 、 《 让 色彩 世界 更 班 斓 》 等 。
+< b > 代表 作品 : < / b > < b > < / b > 中文 名 雷 坦 外文 名 LeiTan 国籍 中华 人民 共和 国 出生 地 广州 市 出生 日期 1939 逝世 日期 1999 原名 < a > 雷 志 能 < / a >
+美丽 的 舞蹈 家 美丽 的 舞蹈 家 是 一款 女生 类 小 游戏 , 大小 是 329 K 。
+背景 设定 0 这 是 一 位 美丽 的 舞蹈 家 , 一 起来 帮 她 打扮 一 下 吧 !
+操作 指南 0 如何 开始 : 游戏 加载 完毕 点 击 两次 [ PLAY ] 即可 开始 游戏 。
+操作 方法 : 游戏 中 使用 鼠标 操作 , 拖动 服饰 给 人物 打扮 。
+游戏 目标 0 把 她 打扮 成 舞者 的 样子 吧 。
+中文 名 美丽 的 舞蹈 家 类型 女生 小 游戏 大小 329K 操作 使用 鼠标 操作
+科法斯 世界 贸易 信用 风险 手册 《 科法斯 世界 贸易 信用 风险 手册 》 是 2005 年 8 月 1 日 中国 商务 出版 社 出版 的 一 本 图书 , 作者 是 科法斯 ( Coface ) 。
+作者 简介 0 科法斯 ( Coface ) , 科法斯 集团 为 全球 出口 信用 保险 业 翘楚 , 在 全球 93 个 国家 拥有 超过 85000 家 企业 客户 。
+并 拥有 全球 首 个 网 上 贸易 信用 评级 保障 系统 - @ 评估 方案 。
+同时 , 依托 遍布 全球 93 个 国家 的 分支 公司 、 伙伴 机构 以及 拥有 4400 万家 公司 信息 实时 更新 的 庞大 数据 库 , 科法斯 为 全球 提供 一 流 的 企业 信用 信息 。
+近 60 年 来 科法斯 集团 一直 不懈 努力 , 致力 于 推动 国际 贸易 发展 。
+内容 简介 0 《 科法斯 世界 贸易 信用 风险 手册 ( 2005 - 2006 ) 》 旨在 为 所有 从事 国际 贸易 的 企业 或 机构 提供 帮助 与 信息 。
+本 手册 提供 全球 151 个 国家 和 地区 最 新 的 贸易 与 投资 风险 分析 , 并 对 每个 国家 / 地区 作 了 相应 的 贸易 信用 风险 评级 。
+对 公司 财务 部门 来说 , 该 手册 强调 了 各国 / 地区 的 关键 经济 指标 , 更 对 各国 / 地区 企业 发生 欠款 的 平均 概率 作 了 评估 对 业务 拓展 及 投资 人士 来说 , 该 书 对 各 个 新兴 市场 的 风险 与 机遇 逐一 衡量 , 并 精辟 分析 了 发达 国家 和 发展 中 国家 主要 产业 部门 的 业绩 状况 。
+目录 0 鸣谢 前言 增长 … … 及 付款 所 面临 的 风险 科法斯 董事 长 戴 伟 译者 的 话 康帕斯 ( 中国 ) 国际 信息 服务 有限 公 绪论 展望 2005 和 2006 年 乔纳森 · 鲁 维德 , 高级 编辑 , 环球 市场 简报 出版 公司 @ Rating 信用 评估 系统 @ Rating 国家 风险 评级 定义 行业 概况 西 尔维亚 · 格莱斯曼 与 多米尼克 · 芙 罗彻 , 科法斯 国别 风险 与 经济 研究 部 , 巴黎 展望 2005 年 世界 经济 将 遭遇 美元 疲软 的 风雨 牛津 分析 社 专家 , 牛津 2005 年 的 石油 行业 油价 会 向 30 美元 漂移 牛津 分析 社 专家 , 牛津 欧洲 欧元 区 经济 令人 失望 牛津 分析 社 专家 , 牛津 中东 欧 8国 经济 稳定 增长 牛津 分析 社 专家 , 牛津 欧洲 国家 风险 评级 范 闸 西 尔维亚 · 格莱斯曼 , 让 · 路易 剪 . 多蒂尔 , 多米尼克 · 芙 罗彻 及 伊夫斯兹洛托夫 斯基 , 科法斯 国别 风险 与 经济 研究 部 , 巴黎 阿尔巴尼亚 亚美尼亚 奥地利 阿塞拜疆 白俄罗斯 比利时 波黑 保加利亚 克罗地亚 塞浦路斯 捷克 丹麦 爱沙尼亚 芬兰 法国 格鲁吉亚 德国 希腊 匈牙利 冰岛 爱尔兰 意大利 哈萨克斯坦 吉尔吉斯斯坦 拉脱维亚 立陶宛 卢森堡 马其顿 马耳他 摩尔多瓦 荷兰 挪威 波兰 葡萄牙 罗马尼亚 俄罗斯 塞尔维亚 和 黑山 斯洛 伐克斯 洛 文尼 亚 西班牙 瑞典 瑞士 塔吉克斯坦 土库曼斯坦 乌克兰 英国 乌兹别克斯坦 美洲 平静 的 通货 膨胀 允许 美 114 增长 牛津 分析 社 专家 , 牛津 阿根廷 的 风险 上升 年 牛津 分析 社 专家 , 牛津 荚 洲 国家 风险 评级 范喇西尔维亚 · 格莱斯曼 与 奥利弗 · 欧 奇林 , 科法斯 国别 风险 与 经济 研究 部 巴黎 阿根廷 玻利维亚 巴西 加拿大 智利 哥伦比亚 哥斯达黎加 古巴 多米尼加 共和 国 厄瓜多尔 萨尔瓦多 危地马拉 海地 洪都拉斯 牙买加 墨西哥 尼 加 拉 瓜 巴拿马 巴拉圭 秘鲁 美国 乌拉圭 委内瑞拉 亚洲 中国 经济 的 持续 增长 牛津 分析 社 专家 , 牛津 弧 洲 陶家 / 地区 风险 评级 范 阳西 尔维亚 · 格莱斯曼 与 皮埃尔 · 帕格纳利 , 科法斯 国别 风险 与 经济 研究 部 巴黎 澳大利亚 孟加拉 国 柬埔寨 中国 内地 中国 香港 印度 印度尼西亚 日本 马来西亚 蒙古 缅甸 尼泊尔 新西兰 巴基斯坦 巴布亚 新几内亚 菲律宾 新加坡 韩国 斯里兰卡 中国 台湾 泰国 越南 … … 书名 科法斯 世界 贸易 信用 风险 手册 作者 科法斯 IS BN 9787801814364 页数 424 出版 社 中国 商务 出版 社 出版 时间 2005 年 8 月 1 日 装帧 平装 开本 16
+K - F 环环 , 即 角膜 色素 环 。
+是 肝 豆 状 核 变 性病 的 最 重要 体征 , 95 % - 98 % 患者 有 K - F 环 。
+由 铜 沉积 于 角膜 后 弹力 层 所致 , 绝 大 多数 见于 双眼 , 个别 见于 单眼 。
+大多 出现 神经 症 状 时 就可 发现 此 环 , 位于 角膜 与 巩膜 的 内 表面 上 , 呈 绿 宰 色 或 金 褐色 , 宽 约 1 . 3MM , 光线 斜 照 角膜 时 看 得 最 清楚 , 但 早期 常 须 用 裂隙 灯 检查 方可 发现 。
+0 少数 可 伴 晶体 浑浊 , 白 内障 , 暗 适应 下降 及 瞳孔 对光 反应 迟钝 等 。
+
+段 干 越 人 谓 新城 君 《 段 干 越 人 谓 新城 君 》 》 是 一篇 散文 , 出自 西汉 文学 家 刘向 编 的 《 战 国策 》 。
+主要 讲 叫 上级 重用 自己 , 总 不 是 一 个 该 直截 了当 陈说 的 事情 。
+尤其 在 主张 谦虚 内敛 的 中国 , 就 更 应该 讲 点 技巧 来 推重 自己 。
+作品 提要 0 叫 上级 重用 自己 , 总 不 是 一 个 该 直截 了当 陈说 的 事情 。
+尤其 在 主张 谦虚 内敛 的 中国 , 就 更 应该 讲 点 技巧 来 推重 自己 。
+作品 原文 0 段 干 越 人 谓 新城 君 曰 : “ 王 良之 弟子 驾 , 云 取 千里马 , 遇 造 父 之 弟子 。
+造 父 之 弟子 曰 : ‘ 马 不 千里 。
+’ 王 良 弟子 曰 : ‘ 马 , 千里 之 马 也 ; 服 , 千里 之 服 也 。
+而 不 能 取 千里 , 何 也 ? ’ 曰 : ‘ 子 繿 牵 长 。
+故 繿 牵 于 事 , 万分 之 一 也 , 而 难 千里 之 行 。
+’ 今 臣 虽 不 肖 , 于 秦 亦 万分 之 一 也 , 而 相国 见 臣 不 释塞 者 , 是 繿 牵 长 也 。
+” 作品 注释 0 ① 段 干 越 人 : 段 干 , 本 为 地名 , 此人 以此 地名 为 姓 , 名 越 人 。
+② 王 良 : 人名 , 为 赵 简子 驾车 的 人 。
+③ 造 父 : 人名 , 为 周 穆王 驾车 的 人 。
+④ 服 : 古 时候 一辆 车 有 四 匹马 , 旁边 的 两 匹马 称 为 骖 , 中间 的 两 匹马 称 为 服 。
+⑤ 纆 ( m ò ) : 驾驭 马 时 所 用 的 缰绳 。
+⑥ 释 : 释放 , 打开 。
+作品 译文 0 段 干 越 人 对 新城 君 说 : “ 王 良 的 弟子 驾车 , 说 是 要 日 行 千里 , 他 遇见 了 造 父 的 弟子 。
+造 父 的 弟子 说 : ‘ 你 的 马 不 能 跑 千里 。
+’ 王 良 的 弟子 说 : ‘ 我 的 边 马 是 千里 之 马 , 辕马 也是 千里 之 马 , 却说 我 不 能 日 行 千里 , 为什么 呢 ? ’ 造 父 的 弟子 说 : ‘ 你 的 缰绳 拉 得太 长 了 。
+缰绳 的 长短 对于 驾御 来说 , 其 作用 不过 万分 之 一 , 却 妨碍 千里 之 行 。
+’ 现在 我 虽然 不 才 , 但 对 秦国 的 作用 多少 也有 那么 万分 之 一 吧 , 您 见到 我 却 不 高兴 , 这 也 正 是 缰绳 拉 得太 长 了 的 缘故 吧 。
+” 作品 评析 0 段 干 越 人 通过 马跑 千里 与 缰绳 拉 得太 长 的 关系 的 言说 , 指出 如果 不 重用 自己 , 秦国 就 不 会 有 大 的 发展 。
+他 充分 运用 了 类比 的 方法 , 避免 了 直接 自荐 的 卤莽 和 直白 , 曲折 形象 地 说出 了 自己 的 心中 所 想 , 完全 达到 了 预期 的 效果 。
+作品 出处 0 《 段 干 越 人 谓 新城 君 》 出自 《 战 国策 》 《 战 国策 》 是 中国 古代 的 一 部 历史 学 名著 。
+它 是 一 部 国别 体 史书 ( 《 国语 》 是 第 一 部 ) 又 称 《 国策 》 。
+是 战国 时期 游说 之 士 的 著作 。
+主要 记载 战国 时期 谋臣 策士 纵横 捭阖 ( b ǎ ih é ) 的 斗争 。
+全书 按 东周 、 西周 、 秦国 、 齐国 、 楚国 、 赵 国 、 魏 国 、 韩国 、 燕 国 、 宋 国 、 卫国 、 中山 国 依次 分 国 编写 , 分 为 12 策 , 33 卷 , 共 497 篇 , 约 12 万字 。
+所 记载 的 历史 , 上 起 公元 前 490 年 智 伯 灭 范 氏 , 下 至 公元 前 221 年 高 渐 离 以 筑 击 秦 始皇 。
+是 先秦 历史 散文 成就 最 高 , 影响 最 大 的 著作 之 一 。
+《 战 国策 》 是 我 国 古代 记载 战国 时期 政治 斗争 的 一 部 最 完整 的 著作 。
+它 实际 上 是 当时 纵横 家 〔 即 策士 〕 游说 之 辞 的 汇编 , 而 当时 七国 的 风云 变幻 , 合纵 连横 , 战争 绵延 , 政权 更迭 , 都 与 谋士 献策 、 智 士 论辩 有 关 , 因而 具有 重要 的 史料 价值 。
+该 书 文辞 优美 , 语言 生动 , 富于 雄辩 与 运筹 的 机智 , 描写 人物 绘 声 绘 色 , 常用 寓言 阐述 道理 , 著名 的 寓言 有 “ 画 蛇 添 足 ” “ 亡羊 补牢 ” “ 狡兔 三窟 ” “ 狐 假 虎威 ” “ 南辕 北辙 ” 等 。
+这部 书 有 文辞 之 胜 , 在 我 国 古典 文学 史 上 亦 占有 重要 地位 。
+《 战 国策 》 是 我 国 一 部 优秀 散文 集 , 它 文笔 恣肆 , 语言 流畅 , 论 事 透辟 , 写 人 传神 , 还 善于 运用 寓言 故事 和 新奇 的 比喻 来 说明 抽象 的 道理 , 具有 浓厚 的 艺术 魅力 和 文学 趣味 。
+《 战 国策 》 对 我 国 两汉 以来 史 传文 政 论文 的 发展 都 产生 过 积极 影响 。
+编者 简介 0 刘向 ( 约 前 77 — 前 6 ) 又 名 刘 更生 , 字 子 政 。
+西汉 经 学家 、 目录 学家 、 文学 家 。
+沛县 ( 今 属 江苏 ) 人 。
+楚 元 王 刘 交 四世 孙 。
+汉 宣 帝 时 , 为 谏 大夫 。
+汉 元帝 时 , 任 宗正 。
+以 反对 宦官 弘 恭 、 石 显 下 狱 , 旋 得 释 。
+后 又 以 反对 恭 、 显 下 狱 , 免 为 庶人 。
+汉 成帝 即位 后 , 得 进用 , 任 光禄 大夫 , 改名 为 “ 向 ” , 官 至 中 垒 校 慰 。
+曾 奉命 领 校 秘书 , 所 撰 《 别录 》 , 为 中国 最 早 的 图书 公 类 目录 。
+治 《 春秋 彀 梁 传 》 。
+著 《 九 叹 》 等 辞赋 三十三 篇 , 大多 亡佚 。
+今 存 《 新序 》 、 《 说 苑 》 、 《 列女传 》 等 书 , 《 五经 通义 》 有 清人 马 国翰 辑 本 。
+原有 集 , 已 佚 , 明人 辑 为 《 刘 中垒 集 》 。
+生平 事迹 见 《 汉书 》 卷 三十六 。
+中文 名 段 干 越 人 谓 新城 君 作品 出处 < a > 《 战 国策 》 < / a > 文学 体裁 < a > 散文 < / a > 作品 年代 西汉 编者 刘向
+何 塞 · 马托斯 简介 0 何 塞 · 马托斯 , 是 一名 西班牙 职业 足球 运动 员 , 司 职 后卫 。
+现在 效力 于 西班牙 足球 甲级 联赛 的 塞维利亚 足球 俱乐部 。
+参加 比赛 0 < table > 比赛 日期 比赛 性质 代表 球队 对手 球队 主客 场 比分 胜负 状态 出场 时间 进球 助攻 得 牌 详情 2018 - 03 - 25 西乙 塞维利亚 B 萨拉戈萨 1 : 0 首发 24 - - 2018 - 03 - 19 西乙 塞维利亚 B 阿尔巴塞特 1 : 2 首发 90 - - 2018 - 02 - 26 西乙 塞维利亚 B 雷乌斯 0 : 2 替补 23 - - 2018 - 02 - 18 西乙 塞维利亚 B 阿尔梅里亚 0 : 3 首发 90 - - 2018 - 02 - 12 西乙 塞维利亚 B 瓦莱 卡诺 0 : 2 首发 90 - - 2018 - 01 - 21 西乙 塞维利亚 B 巴拉多利德 0 : 1 首发 90 - - 2018 - 01 - 15 西乙 塞维利亚 B 奥 萨 苏 纳 0 : 1 首发 90 - - 2018 - 01 - 07 西乙 塞维利亚 B 洛尔卡 3 : 2 首发 901 - 2017 - 12 - 22 西乙 塞维利亚 B 努 曼 西亚 0 : 3 首发 90 - - 2017 - 12 - 17 西乙 塞维利亚 B 奥维耶多 0 : 1 首发 90 - - 2017 - 11 - 20 西乙 塞维利亚 B 科尔多巴 1 : 1 首发 51 - - 2017 - 11 - 12 西乙 塞维利亚 B 格拉纳达 2 : 1 首发 90 - - 2017 - 11 - 06 西乙 塞维利亚 B 特 内里 费 1 : 1 首发 90 - - 2017 - 10 - 29 西乙 塞维利亚 B 巴塞罗那 B1 : 1 首发 90 - - 2017 - 10 - 22 西乙 塞维利亚 B 萨拉戈萨 2 : 2 首发 90 - -
+魔女 王妃 你 别 装 乖乖 让 我 抱 《 魔女 王妃 你 别 装 乖乖 让 我 抱 》 是 在 17k 小说 网 连载 的 一 部 小说 , 作者 是 UIHOH 小说 类型 0 穿越 架空 内容 简介 0 啊 ! 格 昕 雅 醒来 后 , 竟 发现 自己 在 一 个 充满 着 古典 韵味 的 屋子 里 。
+这 是 哪里 ? 格 昕 雅 抓住 旁边 的 一 个 婢女 就 询问 起来 。
+姑 · · 姑娘 , 这 是 王爷 府 啊 。
+那 婢女 战 战兢兢 地 回答 。
+穿越 了 ! 穿越 了 ! 我 怎么 会 穿越 ? 一 场 穿梭 时空 的 爱恋 , 只 不过 是 一 场 梦 。
+一觉 醒来 , 再 回到 他 的 身边 , 是否 还能 记得 ? 临终 前 的 遗言 , 是 梦 , 还是 现实 ? 一切 , 只 为 一 场 , 白头 偕老 … … 中文 名 魔女 王妃 你 别 装 乖乖 让 我 抱 作者 UIHOH 小说 进度 连载 连载 网站 17k 小说 网
+周 磊 周 磊 , 男 , 汉族 , 中共 党员 , 国家 一 级 武术 运动 员 , 武术 五 段 ; 国家 二级 武术 裁判 。
+1987 年 11 月 3 日 出生 于 山东 省 济宁 市 , 北京 体育 大学 武术 学院 06 级 武术 套路 专业 学生 。
+在校 曾 任 军训 院 旗手 、 团 支书 、 武术 协会 会长 等 职务 , “ 东影 工作 室 ” 创建 人 之 一 。
+运动 生涯 0 周 磊 在 网络 视频 新 媒体 创作 上 获得 累累 硕果 , 他 曾 策 划过 “ 功夫 福娃 ” 、 “ 猫耳 宝贝 ” 、 “ 酷 武 ” 武术 经典 案例 , 其 策划 监制 的 作品 《 伊 战 年代 》 获 2006 年 首届 酷 溜 微 视频 大赛 特等 奖 、 《 520 》 获 三 等 奖 ; 《 偷 福 的 女贼 》 获 “ 互联 星空 ” 首届 新浪 中国 播客 大奖 赛 “ 原创 作品 组 ” 一等奖 和 德国 之 声 “ 2007 年 国际 博客 大奖 赛 ” 最 佳 视频 博客 公众 奖 ; “ 猫耳 宝贝 ” 系列 视频 播客 获 新浪 ? 蒙 牛 2007 网络 盛典 “ 年度 原创 播客 金奖 ” ; 《 男女 合租 》 获 2007 年 中国 视频 榜 颁奖 盛典 大奖 “ 2007 年度 十大 网络 视频 红人 ” 和 “ 2007 年度 十大 网络 视频 制作 人 ; 《 我 最 喜欢 》 获 2008 年 土豆 电影 节 最 感动 奖 和 首届 三 分钟 奥运 短片 动画 大赛 二 等 奖 ; 《 盾 》 获 2008 “ 高清 世界 奥运 之 美 ” PANASONIC 高清 影像 大赛 三 等 奖 ; 《 中国 老人 》 已 入围 “ 第 四 届 传媒 影像 力 ” 北京 高校 原创 影像 大赛 。
+周 磊 还是 一名 尽职 尽责 的 志愿 者 , 参加 “ 好运 北京 ” 皮 划艇 静水 项目 赛场 管理 志愿 者 , “ 好运 北京 ” 世界 武术 锦标 赛 竞赛 组织 内部 运行 志愿 者 , 第 二十九 届 北京 奥运 会 皮 划艇 静水 项目 国际 技术 官员 服务 志愿 者 。
+< b > 敢于 创新 —— 让 武术 搭 上 新 媒体 的 < / b > 快车 周 磊 在 山东 郓城 宋江 武术 学院 读书 期间 , 一直 是 学校 武术 队 队长 , 并 参加 了 中央 电视 台 、 港 澳 回归 等 国内 外 大型 文艺 演出 以及 奥运 宣传 片 拍摄 。
+在校 的 优秀 表现 使 周 磊 在 高中 就 光荣 入党 。
+2006 年 进入 大学 后 , 周 磊 也 积极 发挥 自己 的 武术 节目 策划 编排 特长 。
+借助 网络 视频 新 媒体 的 出现 , 他 和 赵 春梁 等人 组建 “ 东影 工作 室 ” 团队 , 创作 了 一批 优秀 的 网络 小 电影 及 宣传 片 , 相继 获得 网络 营销 年 网络 广告 特等 奖 和 视频 榜 等 奖项 。
+作为 北京 体育 大学 武术 协会 会长 的 周 磊 , 师从 北京 体育 大学 武术 学院 张 明廷 院长 和 北京 体育 大学 武术 学院 民族 民间 教研 室 主任 吕 韶钧 教授 。
+通过 和 一些 武术 前辈 的 学习 交流 , 思维 活跃 的 周 磊 , 对 当今 武术 发展 现状 有 了 新 的 认识 。
+借助 网络 媒体 的 迅速 发展 , 和 针对 当今 武术 发展 现状 , 以 “ 武术 中 的 生活 , 生活 中 的 武术 ” 为 理念 , 于 2007 年 初 策划 推出 “ 猫耳 宝贝 ” 武术 系列 视频 , 14 个 视频 两 个 月 点 击 总量 达 80000000 左右 , 把 武术 最 本质 的 一 面 展示 在 网友 眼前 , 让 武术 不 再 留 有 神秘 感 。
+其 系列 视频 《 偷 福 的 女贼 》 获 “ 互联 星空 ” 首届 新浪 中国 播客 大奖 赛 “ 原创 作品 组 ” 一等奖 和 德国 之 声 “ 2007 年 国际 博客 大奖 赛 ” 最 佳 视频 博客 公众 奖 , “ 猫耳 宝贝 ” 系列 视频 播客 获 新浪 · 蒙 牛 2007 网络 盛典 “ 年度 原创 播客 金奖 ” 。
+< b > 志愿 服务 —— I ’ mChine seI ’ m Volunteer < / b > < b > < / b > 2007 年 8 月 周 磊 同学 作为 “ 好运 北京 ” 皮 划艇 静水 项目 赛场 管理 志愿 者 , 同 IOC ( 国际 技术 官员 ) 负责 运动 员 的 检录 及 复检 。
+11 月 作为 “ 好运 北京 ” 世界 武术 锦标 赛 内部 运行 志愿 者 , 周 磊 负责 国际 武协 官员 及 88 个 参赛 国 代表 团 接待 , 以及 参加 开幕 式 的 奥 组委 领导 和 嘉宾 接待 。
+由于 参赛 国 代表 队 比较 多 , 而 志愿 者 又 比较 少 , 再 加 上 航班 的 不 准 时 晚点 , 工作 量 特别 大 。
+有 次 周 磊 上 的 是 夜班 , 从 奥体 中心 体育 馆 志愿 者 回来 已 下午 一 点 多 了 , 疲惫 不 堪 的 他 却 在校 门口 就 被 一 行 墨西哥 友人 叫 住 。
+他 用 很 “ poor ” 的 英语 和 他们 交流 后 , 好 不 容易 才 明白 他们 是 来 参加 “ 好运 北京 ” 世界 武术 锦标 赛 的 , 这次 是 慕名 来 北京 体育 大学 武术 学院 买 武术 器材 。
+周 磊 抛掉 所有 的 困乏 , 带领 他们 这支 小 分队 开始 了 “ 淘宝 ” 之 旅 。
+他们 买到 满意 的 物品 后 , 高兴 地 给 周 磊 小费 , 但 周 磊 毫 不 犹豫 的 拒绝 了 , 并 礼貌 微笑 着 说 : “ I ’ mChinese , I ’ m Volunteer . ” 这 也许 是 他 那么 “ poor ” 的 英语 说 的 最 完整 , 也 最 完美 的 一句 话 , 一句 代表 所有 志愿 者 心声 的 话 。
+与此 同时 周 磊 和 他 团队 创作 的 《 我 最 喜欢 》 先后 获 “ 三 分钟 奥运 短片 动画 大赛 三 等 奖 ” 和 2008 年 土豆 电影 节 最 感动 奖 。
+< b > 勇于 实践 —— 让 武术 走进 生活 , 走向 世界 < / b > < b > < / b > 周 磊 就读 于 民族 传统 体育 专业 , 平日 扎实 学习 传统 武术 理论 、 关注 社会 发展 现状 , 注重 将 理论 和 现实 相 结合 , 活 学 活用 。
+2008 年 的 奥运 吉祥 物 , 把 中国 的 金 、 木 、 水 、 火 、 土 的 五行 哲学 思想 , 与 奥运 五环 相 匹配 。
+奥运 、 福娃 、 五行 、 武术 四 者 文化 之间 有 着 密 不 可 分 的 相生 关系 , 武术 作为 中国 本土 体育 项目 , 在 漫长 的 历史 发展 中 , 中华 民族 精神 早已 浓缩 于 中国 武术 精神 之 中 。
+第 二十九 届 北京 奥运 会 即将 盛大 开幕 之 际 , 周 磊 策划 编排 出 集 喜 、 怒 、 哀 、 乐 、 武术 摔打 动作 等 元素 为 一体 的 “ 功夫 福娃 / 牛 ” ( 由 北京 体育 大学 武术 套路 教研 室 教师 、 奥 组委 体育 展示 导演 朱 宏 执行 编导 ) 。
+“ 功夫 福娃 ” 是 对 奥林匹克 精神 和 中华 武术 精神 相 包容 的 体现 , 这 也许 是 奥运 精神 和 中国 文化 最 精彩 的 结合 , 向 世界 展示 了 光辉 灿烂 的 中华 文化 。
+该 节目 在 鸟巢 、 国家 体育 馆 、 五 课 松 体育 馆 等 七 个 奥运 场馆 演出 , 成 了 奥运 赛场 上 的 一 道 “ 民族 文化 ” 风景 线 , 在 赛场 引起 观众 、 媒体 热烈 反响 , 并 先后 被 中央 电视 台 、 新华 社 、 BBC 等 国内 外 十多 家 知名 媒体 报道 。
+为 纪录 北京 奥运 会 , 周 磊 策划 的 奥运 短片 作品 《 盾 》 获得 2008 高清 影像 大赛 三 等 奖 ; 《 中国 老人 》 已 入围 “ 第 四 届 传媒 影像 力 ” 北京 高校 原创 影像 大赛 。
+现在 , 周 磊 依然 忙碌 着 , 11 月 他 又 带领 团队 成员 策划 创作 《 朋友 , 勇敢 去 闯 》 和 改编 《 功夫 之 王 》 武术 节目 , 参加 北京 体育 大学 武术 学院 建院 50 周年 和 北京 体育 大学 建校 55 周年 庆典 演出 。
+数据 生涯 02008 中国 大 学生 年度 人物 ( 提名 评选 中 ) 提名 首届 中国 大 学生 自强 之 星 第 二 届 全国 传统 武术 拳种 赛 三 等 奖 获 2008 年 新东方 自强 奖学 金 获 2006 年 北京 体育 大学 军训 优秀 标兵 “ 功夫 福娃 ” 表演 团队 获 北京 奥运 会 、 残 奥 会 体育 展示 突出 贡献 奖 中文 名 周 磊 民族 汉族 出生 日期 1987 年 职业 运动 员 性别 男
+呼和浩特 惠 则 恒 汽车 租赁 有限 责任 公司 呼和浩特 惠 则 恒 汽车 租赁 有限 责任 公司 于 2014 年 05 月 15 日 在 呼和浩特 市 工商 行政 管理 局 登记 成立 。
+法定 代表 人 李 春泽 , 公司 经营 范围 包括 许可 经营 项目 : 无 。
+一般 经营 项目 : 汽车 租赁 等 。
+企业 信息 0 公司 类型 有限 责任 公司 ( 法人 独资 ) 登记 机关 呼和浩特 市 工商 行政 管理 局 成立 时间 2014 年 05 月 15 日发 照 时间 2016 年 02 月 19 日
+乙酰 色 氨酸 乙酰 色 氨酸 又 称 Na 一 乙 酞 氨基 足 叫 噪 丙酸 DL 型 为 白色 片状 结晶 。
+熔点 204 . - 205 ℃ , 微 溶 于水 和 乙醇 。
+0 < b > 特性 < / b > < b > < / b > L 型 为 白色 片状 结晶 , 溶 于水 和 乙醇 , 熔点 189 ~ l90C : } D 型 为 白色 片状 结晶 , 溶 于水 和 乙醇 , 熔 . 氨 189 一 飞 9lC : 。
+以 L 一 或 D 一 色 氨酸 为 原料 制 得 。
+用于 生化 研究 。
+
+比茜 · 图 诺克 比茜 · 图 诺克 , 1981 年 1 月 19 日 生于 美国 加利福尼亚 州 的 圣地亚哥 , 演员 。
+参演 过 《 格林 》 、 《 艺术 家 》 、 《 青年 危机 》 等 影视 作品 。
+早年 经历 0 比茜 · 图 诺克 是 一名 美国 演员 , 1981 年 1 月 19 日 生于 加利福尼亚 州 的 圣地亚哥 , 却 在 西班牙 , 乌拉圭 和 阿根廷 度过 童年 。
+她 的 祖父 在 二战 时 是 一名 轰炸 机 飞行 员 , Bitsie 不 是 Elizabeth 的 缩写 , 而是 她 这位 祖父 的 外号 。
+她 父亲 Andrew Tulloch 在 拉美 银行 上班 , 所以 她 从小 在 那些 地方 生活 。
+她 回到 美国 , 在 纽约 州 的 贝德福德 上 了 初中 和 高中 , 然后 进入 哈佛 大学 , 最后 以 特别 荣誉 生 的 身份 获得 英语 与 美国 文学 , 视觉 与 环境 研究 两 门 主修 的 学位 。
+她 的 毕业 作品 展 曾 在 剑桥 ( 此 剑桥 非彼 剑桥 ) 的 哈佛 校内 的 卡朋特 中心 画廊 展出 。
+2003 年 , 她 的 父亲 曾 作为 高级 银行 顾问 为 伊拉克 的 临时 联合 政府 服务 , 是 伊拉克 战后 金融 重建 的 核心 人员 。
+她 的 血统 来自 父 方 的 苏格兰 血统 和 母 方 的 西班牙 血统 。
+演艺 经历 0 她 的 表演 处女作 是 乔治 卢卡斯 为 星战 中 的 机器 人 R2 - D2 拍摄 的 纪录 片 R2 - D 2 : beneath the dome , 在 其中 饰演 了 女 主 —— R 2 - D2 的 女友 , 表演 受到 洛杉矶 时报 的 高度 赞扬 。
+之后 她 相继 出演 了 作品 合法 入侵 ( lakeview terrace ) , 爱情 喜剧 losing control , 并 为 3D 动画 丛林 有 情 狼 ( aloha and domega ) 配音 , 2010 年 她 参演 了 the artist , 这部 影片 在 2011 年 戛纳 电影 节 获得 了 最 佳 男 演员 的 称号 , 韦恩 斯坦 公司 即将 在 年末 时 发行 这部 影片 。
+她 正在 拍摄 一 部 自己 参与 制片 的 独立 电影 , 而 手头 还有 两部 电影 tyanny 和 riding the pine 在 前期 制作 中 。
+她 演艺 生涯 的 的 转折 点 来自 于 2008 年 NBC 播出 的 电视 剧 青年 危机 ( quarter life ) , 在 其中 的 表演 受到 主流 媒体 和 评论 界 的 赞赏 , 洛杉矶 时报 称赞 她 注定 会 闪光 , 纽约 杂志 将 她 比作 下 一 个 克莱尔 丹恩斯 和 伊 万 蕾切尔 伍德 。
+因此 她 还 获得 了 第 28 届 洛杉矶 周报 戏剧 奖 的 提名 她 参演 的 电视 剧 有 《 豪斯 医生 》 , 《 铁证 悬案 》 , 《 白宫 风云 》 , 《 血色 月光 》 和 《 法 外 狂徒 》 等 。
+2007 年 , 她 在 迷失 中 获得 了 一 个 角色 。
+2009 年 , 她 作为 三 主演 之 一 拍摄 了 HBO 和 莎拉 杰西卡 帕克 联合 出品 的 喜剧 华盛顿 宝贝 ( washing tonienne ) 的 首播 集 失 中 获得 了 一 个 角色 , 但是 由于 与 青年 危机 的 拍摄 冲突 , 她 的 戏 份 又 全部 换 成 别人 重新 拍摄 。
+她 是 NBC 新剧 Grimm 的 常任 演员 , 出演 男 主角 格林 的 女友 。
+主要 作品 0 主要 作品 参演 电影 1 主要 作品 参演 电视 剧 2 中文 名 比茜 · 图 诺克 外文 名 BitsieTulloch 国籍 美国 星座 摩羯 座 出生 地 美国 , 纽约 出生 日期 1981 - 01 - 19 职业 演员 毕业 院校 哈佛 大学 代表 作品 电视 剧 《 格林 》 电影 《 艺术 家 》
+杨梅 镇 概况 0 杨梅 市 旧 名 杨梅 坜 庄 , 清 康熙 时 就 有 移民 进入 杨梅 坜 , 至 乾隆 年间 大 规模 开发 。
+民国 9 年 时 改 为 杨梅 庄 , 光复 后 改 为 杨梅 镇 迄今 。
+杨梅 镇 位于 台湾 省 桃园 县 的 西 南方 , 面积 为 89 . 13 平方 公里 , 约 有 三 分 之 一 的 疆界 与 新竹 县 接壤 ; 东 与 平 镇市 、 龙潭 乡 连接 , 北边 与 中 坜 、 新 屋 衔接 , 西 与 新竹 县 湖口 乡 交界 , 南 与 新竹 县 新 埔 镇 接壤 。
+当地 的 原 住 民 为 平埔 族 , 最 大 住 民族 群 为 客 家族 群 。
+该 镇 亦 为 台湾 人口 最 多 的 镇 之 一 。
+交通 0 杨梅 镇 交通 有 纵贯 公路 、 纵贯 铁路 、 中山 高速 公路 等 贯穿 该 镇 , 杨梅 交流 道 与 幼狮 交流 道 分设 在 铁路 两旁 , 又 有 杨梅 、 埔 心 、 富冈 等 3 个 火车 站 , 交通 极 为 便利 , 东西 向 快速 道路 建成 后 , 交通 更 为 便捷 。
+教育 0 杨梅 市 设有 小学 14 所 , 国 中 8 所 , 高中 职 4 所 。
+比较 著名 的 有 国立 杨梅 高中 、 私立 治平 高中 、 私立 大华 高中 。
+旅游 0 杨梅 镇 地理 位置 特殊 , 交通 便利 , 环境 优越 , 旅游 资源 丰富 。
+例如 , 鹭鸶 园 、 杨梅 观光 茶园 、 杨梅 贵 山 公园 、 马奇 园 、 味全 埔 心 牧场 、 儿 八 公园 。
+< b > 味全 埔 心 牧场 < / b > < b > 简介 < / b > < b > < / b > 味全 埔 心 牧场 位于 桃园 县 杨梅 镇 , 从 民国 46 年 起 即 配合 政府 发展 台湾 酪农 产业 , 将 原本 一 片 荒野 林地 , 开发 垦 成 北 台湾 第 一座 牧场 , 更 是 北 台湾 一 个 休闲 渡假 、 户外 活动 的 最 佳 场所 ; 味全 埔 心 牧场 还 贩卖 新鲜 牛乳 、 牧场 牛奶 面包 等 乳制 副 食品 , 让 您 有 得 玩 还有 得 吃 !
+< b > 味全 埔 心 牧场 < / b > < b > 园区 介绍 < / b > < b > < / b > 味全 埔 心 牧场 占地 56 公顷 , 精致 化 的 综合 性 游乐 区 , 使 埔 心 牧场 成为 知性 、 感性 活动 的 最 爱 一趟 美 、 知性 与 欢乐 的 旅程 , 尽 在 味全 埔 心 牧场 ; 味全 埔 心 牧场 园 区分 为 湖山 景致 区 、 欧式 花园 、 体能 训练 场 、 露营 烤肉 区 、 各式 游乐 区 等 , 游乐 设施 皆 十分 完善 , 牧场 还 开放 了 榨 乳 室 、 乳业 展览 馆 等 , 供 游客 参观 , 亲自 挤 牛乳 , 亲手 喂 牛 , 小 朋友 可 一边 亲手 挤 牛奶 , 一边 专业 人员 乳牛 生态 解说 , 带 您 及 小 朋友 进入 牛乳 生产 过程 的 感性 与 知性 的 世界 , 并 开放 牧场 乳牛 区 、 乳业 展览 馆 供 人 参观 , 让 民众 在 旅游 的 过程 中 也 能 增加 知识 , 更 了解 乳牛 的 世界 。
+< b > 味全 埔 心 牧场 < / b > < b > 设施 完善 < / b > < b > < / b > 味全 埔 心 牧场 内 设施 有 『 绿草 广场 』 辽阔 的 草原 让 小 朋友 自由 自在 的 奔跑 、 『 露营 烤肉 区 』 设备 完善 且 专业 , 您 不必 准备 繁杂 的 烤肉 用具 及 食物 , 备有 生鲜 超市 都 有 、 『 跑马 场 』 、 『 科学 探索 馆 』 、 『 游泳 池 』 与 『 航天 飞机 游乐 区 』 等 设施 ; 味全 埔 心 牧场 还有 异国 魔幻 杂技 秀 、 骑 迷你 马 、 天鹅 脚踏 船 、 亲子 协力 车 等 , 是 您 享受 亲子 之 乐 最 好 的 选择 ; 味全 埔 心 牧场 还有 很多 种 漂亮 的 休闲 小 木屋 , 提供 餐饮 的 餐饮 服务 中心 、 卡拉 OK 及 容纳 20 ~ 100 人 的 大 、 小 会议 厅 , 不 再 只是 一座 饲养 乳牛 的 牧场 , 更 是 北 台湾 一 个 休闲 渡假 好 去处 。
+来去 杨梅 玩乐 味全 埔 心 牧场 交通 十分 方便 , 下 杨梅 交流 道 后 不 久 即可 到达 , 因此 成为 民众 在 假日 休闲 时 的 好 去处 ; 此外 , 来到 杨梅 也 可以 前往 杨梅 贵 山 公园 , 杨梅 贵 山 公园 位于 省道 上 , 交通 十分 便利 , 指针 亦 清晰 可见 , 是 杨梅 镇 一处 清幽 的 登山 休闲 去处 , 贵 山 公园 上方 最 顶 处 奉 有 观 世 音 巨 佛像 , 前 有 一 尊 龙龟 与 水池 , 您 可 在此 眺望 各处 山 下 的 美景 , 每 一 个 观 景点 皆 有 详尽 解说 , 让 您 不 须 此行 哦 ! 此外 , 来到 杨梅 也 要 尝尝 道地 的 客家 美食 , 客家 小炒 、 姜 丝 大肠 等 招牌 客家 料理 更 是 不 容错 过 喔 !
+地址 : 桃园 县 杨梅 市 高 荣里 3 号 之 1 营业 ( 开放 ) 时间 : 上午 09 : 00 至 下午 05 : 00 。
+味全 埔 心 牧场 / 洽询 电话 03 - 4644131 、 0800 - 268286 费用 简介 : 1 、 个人 票 NT $ 300 元 ; 适用 一般 民众 。
+2 、 优待 票 NT $ 250 元 ; 适用 持证 之 65 岁 以上 年长 者 、 学生 、 军警 和 身高 110 公分 以下 儿童 。
+3 、 团体 一般 票 NT $ 240 元 ; 适用 30 人 以上 一般 团体 。
+4 、 团体 优待 票 NT $ 200 元 ; 适用 30 人 以上 优待 团体 。
+5 、 镇民 票 NT $ 150 元 ; 适用 持证 之 杨梅 市民 。
+6 、 身心 残障 票 NT $ 150 元 ; 适用 持证 之 身心 残障 者 和 陪同 者 一名 。
+
+HD 环球 网 运行 环境 0 支持 Android 1 . 6 应用 类型 0 书籍 阅读 类 软件 应用 介绍 0 HD 环球 网 设有 国际 、 中国 、 台海 、 军事 、 海外 看 中国 、 财经 、 娱乐 等 频道 , 并 开 设有 环球 论坛 、 环球 博客 、 手机 报 、 电子 报 等 板块 , 为 读者 提供 全 方位 手机 内容 服务 。
+应用 名称 HD 环球 网 应用 平台 mobile 应用 版本 1 . 53
+柳 公权 书 : 玄 秘 塔碑 临 习 指南 图书 信息 0 出版 社 : 辽宁 美术 出版 社 ; 第 1 版 ( 1993 年 12 月 1 日 ) 丛书 名 : 名牌 名帖 临 习 指南 系列 平装 : 143 页 正文 语种 : 简体 中文 开本 : 16 ISBN : 9787531410263 , 7531410265 条形码 : 9787531410 263 尺寸 : 25 . 4x 18 . 6x1cm 重量 : 281 g 内容 简介 0 《 柳 公权 书 : 玄 秘 塔碑 临 习 指南 》 主要 内容 : 集 联中 名言 名句 、 成语 联语 的 内容 均 来自 《 中国 古代 名句 辞典 》 、 《 中国 成语 大 辞典 》 、 《 古诗 常用 句 类 编 》 、 《 警语 名句 辞典 》 以及 书法 创作 中 积累 的 佳句 秀 语 , 凡 二 言 、 三言 、 四 言 、 五言 、 六 言 句 均 为 原 大 , 七 言及 碑文 选 句 经 照相 缩小 , 上述 集 字 全部 由 原帖 中 撷取 , 极 个 别字 属 拼成 , 但 力求 得体 , 不 失 柳 字 风貌 。
+目录 0 柳 公权 书 玄 秘 塔碑 ( 全 幅 ? 局部 ) 柳 公权 ? 柳体 ? 玄 秘 塔碑 柳 公权 书 玄 秘 塔碑 《 柳 公权 书 玄 秘 塔碑 》 笔法 举要 《 柳 公权 书 玄 秘 塔碑 》 构 法 举要 《 柳 公权 书 玄 秘 塔碑 》 临 法 举要 《 柳 公权 书 玄 秘 塔碑 》 集 名言 名句 成语 联语 ( 附 : 碑文 选 句 ) 后记
+中文 故事 绘 · 丽丽 的 幻想 世界 : 一 双 好 眼睛 中文 故事 绘 · 丽丽 的 幻想 世界 : 一 双 好 眼睛 是 北京 大学 出版 社 出版 的 一 本 书 0 内容 简介 《 中文 故事 绘 ? 丽丽 的 幻想 世界 : 一 双 好 眼睛 ( 彩图 注音 版 ) 》 为 中文 故事 绘 , 读 故事 , 看 绘本 , 学 中文 。
+“ 丽丽 的 幻想 世界 ” 通过 20 个 绘本 小 故事 , 集中 展示 了 可爱 小 女孩 丽丽 的 生活 幻想 , 文字 浅显 , 情节 有 趣 , 绘画 精美 , 整套 书 充满 了 童真 与 童趣 , 不 公 如此 , 灵活 地 运用 绘本 中 设计 的 各种 元素 , 孩子 们 的 语言 能力 也 就 在 潜 移 默 化 中 提高 了 。
+元素 1 : 汉语 拼音 。
+借助 拼音 的 力量 , 孩子 们 可以 简单 地 朗读 故事 。
+元素 2 : 八格 漫画 。
+依据 漫画 的 情节 提示 , 孩子 们 可 “ 轻松 地 复述 故事 , 也 可以 自行 设计 新 的 故事 对白 。
+元素 3 : 提问 。
+顺着 提问 的 线索 , 孩子 们 完全 可以 展开 自己 的 想象 力 , 编 出 自己 的 幻想 故事 。
+元素 4 : 词语 卡片 。
+图文 并茂 , 孩子 们 可以 搜索 卡片 , 然后 和 朋友 们 一 起 列 猜词 游戏 , 在 游戏 中 丰富 词汇 量 。
+书名 中文 故事 绘 · 丽丽 的 幻想 世界 : 一 双 好 眼睛 又 名 Apair of SharpEyes 作者 李 延祜 译者 ( 澳大利亚 ) YvonneY ungIS BN & nbsp ; 9787301189665 页数 23 页 出版 社 北京 大学 出版 社 出版 时间 第 1 版 ( 2011 年 6 月 1 日 ) 装帧 平装 开本 32 读者 对象 & nbsp ; 3 - 6 岁 IS BN 9787301189665 尺寸 & nbsp ; 20 . 4x 13 . 6x 0 . 6cm 重量 82g 正文 语种 简体 中文
+东莞 市 艾 贝 莎 网络 科技 有限 公司 东莞 市 艾 贝 莎 网络 科技 有限 公司 成为 与 2016 年 , 总 公司 设在 世界 工厂 的 东莞 , 是 中国 专注 于 服务 消费 者 日常 生活 所 需 的 平价 电子 商务 平台 , 专注 为 消费 者 提供 平价 商品 和 更 好 购物 体验 . 目前 公司 旗 下 开办 : 海豚 800 网 , 艾 贝 莎 官网 作为 服务 消费 者 日常 生活 所 需 的 平价 电子 商务 平台 , 海豚 800 倡导 价格 与 品质 的 平衡 , 主打 平价 、 品质 、 生活 , 在 国内 电商 行业 中 创新 “ 平价 零售 ” 模式 。
+为 消费 者 提供 服饰 、 居家 、 母婴 等 平价 优质 商品 。
+海豚 800 商城 : 购物 即 享 返利 , 返利 商城 几乎 涵盖 所有 知名 电商 , 包括 天猫 、 淘宝 、 京 东 、 苏 宁 易购 、 携程 、 一号 店 、 亚马逊 、 聚 美 优品 等 400 多家 电商 网站 超级 返 : 精选 知名 品牌 特卖 , 每日 10 点 和 20 点 上 新 , 最 高 返利 81 % 。
+确保 正品 、 确保 特价 、 限量 抢购 。
+9 块 9 : 每日 9 点 上 新 , 全场 包 邮 , 全 网 抄底 。
+
+新 古典 政治 经济 学 《 新 古典 政治 经济 学 》 是 2005 年 长春 出版 社 出版 的 图书 , 作者 是 ( 美 ) 柯 兰德 。
+图书 信息 0 书名 : 新 古典 政治 经济 学 作者 : ( 美 ) 柯兰 德 编 出版 社 : 长春 出版 社 出版 时间 : 2005 - 5 - 1ISBN : 9787806649619 开本 : 16 开 定价 : 35 . 00 元 内容 简介 0 本 书 第 一 个 把 新 古典 政治 经济 学 的 主要 文献 集 合成 一体 。
+它 挑战 了 传统 的 新 古典 主义 讨论 , 并且 为 新 政治 / 经济 均衡 绘制 了 一 幅 令人 兴奋 的 蓝图 。
+作者 简介 0 大卫 · 柯 兰德 目前 是 米德尔 伯里 大学 、 克里斯蒂安 · A · 约翰逊 大学 的 经济 学 教授 , 他 在 选择 宏观 经济 稳定 政策 方面 是 个 专家 , 他 还 与 阿巴 · 勒纳 合著 《 反 通货 膨胀 计划 划 行情 》 。
+图书 目录 0 《 新 政治 经济 学 译丛 》 总 序 鸣谢 导论 第 一篇 寻租 : 新 政治 经济 学 第 1 章 DUP 活动 和 经济 理论 —— 贾 迪 什 · N · 巴 格瓦提理查德 · A · 布雷奇尔 T · N · 斯里尼瓦桑第 2 章 制度 研究 的 三种 方法 —— 道格拉斯 · 诺思 第 3 章 内生 关税 理论 : 一 个 综述 —— 斯蒂芬 · 迈吉 第 二 篇 寻租 和 DUP 行动 $ tLN 第 4 章 寻租 理论 批判 —— 沃伦 · J · 赛明思 尼古拉 · 莫卡洛 第 5 章 自利 的 限度 —— 道德 在 经济 生活 中 的 作用 —— 迈克尔 · 麦克弗森 第 三 篇 理论 模型 第 6 章 一 个 内生 寻租 模型 —— 罗纳德 · 费 恩德 莱斯坦 尼斯 劳 · 卫理 思 兹 第 7 章 购买 垄断 —— 哈罗德 · 德姆塞兹 第 8 章 分 利 集团 与 宏观 经济 学 —— 大卫 · 柯 兰德 曼 柯 尔 · 奥尔森 第 9 章 有 寻 收入 的 最 优 关税 —— 对 DUP 活动 理论 的 一 个 贡献 —— 伊利 亚斯 · 狄 诺波 洛斯 第 四 篇 应用 和 经验 检验 第 10 章 发展 中 国家 的 保护 和 寻租 —— 斯坦 尼斯 劳 · 卫理 思 兹 罗纳德 · 费 恩德 莱 第 11 章 寻租 与 各国 的 增长 和 波动 —— 对 近来 一些 假说 的 经验 检验 . —— 弗里德 里克 · L · 普赖尔 第 12 章 看 不 见 的 脚 和 国民 浪费 —— 再 分配 与 经济 增长 —— 威廉 · A · 布洛克 斯蒂芬 · P · 迈吉 第 13 章 对 英国 工厂 法 的 一 种 寻租 解释 —— 加利 · M · 安德森 罗伯特 D · 托利森 第 五 篇 改革 的 可能 性 第 14 章 到 IIN 寻租 者 —— 肯尼思 J · 柯 福 大卫 · C · 柯 兰德 第 15 章 政治 企业 家 和 寻租 社会 的 改革 —— 詹姆士 · 贝内特 托马斯 · 狄 洛伦佐 第 16 章 如何 既 做 好事 又 得 好报 —— 戈登 · 图洛克 中文 名 新 古典 政治 经济 学 作者 : < a href = " # " > ( 美 ) 柯兰 德 编 < / a > < a href = " # " > 出版 社 < / a > : & nbsp ; < a href = " # " > 长春 出版 社 < / a > 出版 时间 : & nbsp ; 2005 - 5 - 1
+诡 图 ǐ t ú ㄍ ㄨ ㄟ ˇ ㄊ ㄨ ˊ < b > 诡 图 ( 诡 图 ) < / b > < b > < / b > 诡诈 的 计谋 。
+唐 陈 子 昂 《 为 副 大 总管 屯 营 大 将军 谢 表 》 : “ 臣 等 仁 亏 圣略 , 智 昧 诡 图 , 遂以 熊 罴之 师 , 挫 於 犬 羊 之 旅 。
+” 吴 组缃 《 山洪 》 三十 : “ 在 这里 , 参谋 官 一再 指点 出来 , 说 对 敌人 的 这些 诡 图 , 我 军 已经 作 了 妥善 的 布置 , 一天 果然 来犯 , 我们 有 把握 叫 他 吃 个 大亏 。
+”
+卫星 导航 仪 卫星 导航 仪 是 指 具有 GPS 全球 卫星 定位 系统 功能 的 车用 工具 , 并且 利用 语音 提示 的 方式 来 引导 驾驶 员 开车 。
+简介 0G PS 是 英文 Global Positioning System ( 全球 定位 系统 ) 的 简称 , 而 其 中文 简称 为 “ 球 位 系 ” 。
+GPS 是 20 世纪 70 年代 由 美国 陆海 空 三军 联合 研制 的 新 一 代 空间 卫星 导航 定位 系统 。
+其 主要 目的 是 为 陆 、 海 、 空 三大 领域 提供 实时 、 全 天候 和 全球 性 的 导航 服务 , 并 用于 情报 收集 、 核爆 监测 和 应急 通讯 等 一些 军事 目的 , 是 美国 独霸 全球 战略 的 重要 组成 。
+经过 20 余年 的 研究 实验 , 耗资 300 亿 美元 , 到 1994 年 3 月 , 全球 覆盖 率 高 达 98 % 的 24 颗 GPS 卫星 星座 己 布设 完成 。
+在 机械 领域 GPS 则 有 另外 一 种 含义 : 产品 几何 技术 规范 ( Geometrical Product Specifications ) - 简称 GPS 。
+功能 0 其中 , 使用 于 车 上 的 GPS , 是 指 具有 GPS 全球 卫星 定位 系统 功能 , 让 您 在 驾驶 汽车 时 随时 随地 知晓 自己 的 确切 位置 。
+其 具有 的 自动 语音 导航 、 最 佳 路径 搜索 等 功能 让 您 一 路 捷径 、 畅行 无 阻 , 集成 的 办公 、 娱乐 功能 让 您 轻松 行驶 、 高效 出行 。
+一般 的 车载 导航 的 功能 都 有 DVD 播放 器 、 收音 接收 、 蓝牙 免提 、 触摸 屏 、 选配 功能 、 智能 轨迹 倒车 、 胎 压 检测 功能 、 虚拟 六 碟 , 后台 控制 功能 。
+中文 名 卫星 导航 仪 性质 车用 工具 特点 具有 < ahref = " # " > GPS < / a > 全球 卫星 定位 系统 功能 方式 利用 语音 提示 的 方式
+蒙山 春节 祈福 庙会 是 山东 省 十大 祈福 庙会 之 一 的 节 会 庆典 , 包括 蒙山 庙会 和 祈福 法会 两大 部分 。
+其中 , 蒙山 庙会 以 民族 传统 文化 讴歌 沂蒙 山 的 风土 民情 , 主要 有 皇帝 巡游 、 “ 年 的 味道 ” 之 民俗 表 、 非遗 表演 、 猜 灯谜 、 曲艺 演出 、 沂蒙 美食 汇 等 民俗 特色 活动 ; 祈福 法会 以 蒙山 千年 祈福 拜寿 文化 为 依托 , 正月 初一 至 初七 相继 在 始建 于 北宋 的 万寿宫 、 蒙山 之 巅 玉皇 殿 和 拜寿 台 进行 祈福 迎祥 大法 会 。
+0 < b > 时间 < / b > < b > : 农历 正月 初一 到 正月 初七 < / b > < b > < / b > < b > 地点 < / b > < b > : 临沂 蒙山 龟蒙 景区 < / b > < b > < / b > < b > 民俗 活动 < / b > < b > : 道家 祈福 拜寿 法会 、 皇帝 巡游 、 “ 年 的 味道 ” 之 民俗 表 、 非遗 高手 聚 蒙山 、 猜 灯谜 、 沂蒙 美食 汇 等 。
+< / b > < b > < / b > < b > 推荐 理由 < / b > < b > : 蒙山 群峰 嵯峨 , 重 峦 叠翠 , 森林 茂密 , 风光 秀丽 , 以 雄伟 壮丽 名著 华夏 , 素有 “ 九州 之 巨 镇 , 巍然 敦 大观 ” 之 称 ; 身着 古装 的 皇帝 、 嫔妃 、 商贩 、 衙役 , 古色 古香 的 摊点 , 糖人 、 泥塑 、 风车 、 皮影 戏 等 非 物质 文化 遗产 商品 的 叫卖 声 让 游客 仿佛 进入 时光 隧道 , 穿越 千年 一 下子 走进 古都 的 繁华 街市 , 一 步 迈入 画卷 中 的 清明 上 河 园 市井 。
+庄严 肃穆 的 道家 祈福 拜寿 法会 给 人 以 心灵 的 洗礼 和 震撼 。
+< / b > < b > 交通 : < / b > < b > < / b > 外围 交通 : 国道 327 、 国道 205 、 G2 京沪 高速 、 G3 京 福 高速 、 G25 长 深 高速 、 G1511 日兰 高速 、 兖 石 铁路 贯穿 蒙山 , 蒙山 距离 临沂 市区 90 公里 , 距离 临沂 机场 90 公里 , 距离 济南 平遥 机场 180 公里 , 距离 曲阜 东 高 铁 70 公里 , 距离 泰安 南高 铁 110 公里 。
+公共 交通 : 火车 : 临沂 火车 站 下 , 乘坐 公交 至 新 客运 总站 , 乘坐 临沂 蒙山 专线 直通 车 直达 景区 , 早 7 : 30 隆达 换乘 中心 、 8 : 00 新 客运 总站 发车 , 9 : 30 直达 , 天天 发车 。
+平邑 火车 站 下 , 乘坐 1 路 公交 至 东城 换乘 中心 , 换乘 蒙山 专线 , 约计 40 分钟 , 营运 时间 : 7 : 00 — 17 : 00 流水 发车 。
+高 铁 : 曲阜 东 高 铁 站 下 , 换乘 曲阜 至 平邑 客车 , 平邑 新 汽车 站 换乘 至 蒙山 客车 , 营运 时间 : 7 : 00 — 17 : 00 流水 发车 。
+飞机 : 临沂 机场 下 , 乘坐 临沂 蒙山 专线 直通 车 直达 景区 , 早 7 : 30 隆达 换乘 中心 、 8 : 00 新 客运 总站 发车 , 9 : 30 直达 , 天天 发车 。
+自 驾 路线 : 1 . 北京 、 天津 、 济南 、 泰安 、 德州 、 枣庄 、 徐州 、 淮北 — G3 京 福 - 经 曲阜 转 G1511 日兰 高速 蒙山 ( 平邑 东 ) 出口 下 。
+2 . 西安 、 郑州 、 菏泽 、 济宁 、 临沂 、 日照 —— G1511 日兰 - - 蒙山 ( 平邑 东 ) 出口 下 。
+3 . 东营 、 滨州 、 淄博 、 莱芜 — G2 京沪 ( 蒙阴 ) 出口 下 左转 , 335 省道 至 龟蒙 景区 。
+4 . 青岛 、 潍坊 、 烟台 、 威海 —— G15 同 三 经 日照 转 G1511 日兰 蒙山 ( 平邑 东 ) 出口 下 。
+5 . 上海 、 南京 、 苏州 、 扬州 、 淮安 —— G2 京沪 - 经 竹园 立交 - 转 G1511 日兰 蒙山 ( 平邑 东 ) 出口 下 。
+< b > 举办 地 : < / b > < b > < / b > 沂蒙 山 旅游 区 龟蒙 景区 , 古 称 东蒙 、 东山 , 位于 临沂 市 西北 部 , 现 为 国家 5A 级 旅游 景区 、 中国 首座 生态 名山 、 国家 森林 公园 、 国家 地质 公园 、 中国 最 美 地质 公园 、 省级 风景 名胜 区 、 世界 养生 长寿 圣地 。
+龟蒙 景区 是 沂蒙 山 旅游 区 主峰 所在 地 , 主峰 龟 蒙顶 海拔 1156 米 , 为 山东 省 第 二 、 沂蒙 山区 最 高 峰 。
+区内 植被 茂密 , 负 氧 离子 含量 极高 , 被 誉 为 “ 天然 氧 仓 ” 。
+这里 拥有 世界 最 大 山体 雕刻 — 蒙山 寿 仙 、 江北 最 长 栈道 — 蒙山 悬崖 栈道 、 临沂 对外 宣传 标志 性 旅游 景点 —— 鹰 峰 奇观 、 江北 最 大 道观 — 万寿宫 等 百余 处 自然 人文 景观 。
+佳 山 、 秀水 、 幽 林 、 清气 , 得 天 独 厚 的 自然 条件 , 孕育 了 著名 的 养身 长寿 圣地 和 休闲 度假 胜地 。
+这里 还是 东夷 文化 的 中心 发源 地 , 夏 商周 时期 由 太昊 氏 后裔 颛 臾 王 建立 颛 臾 古国 , 专门 祭祀 蒙山 。
+蒙山 的 秀美 风光 和 灿烂 文明 曾 吸引 无 数 圣哲 贤士 、 文人 墨客 、 帝王 将 相 流连 忘返 , 孔 子 登 龟蒙 而 慨叹 “ 登 东山 而 小 鲁 ” ; 李白 、 杜 甫 、 苏 轼 游历 此地 , 吟 留 千古 佳句 ; 康熙 、 乾隆 大帝 巡游 至此 , 叹 为 观止 , 御笔 题词 … … 龟蒙 景区 同时 拥有 悠久 的 佛道 宗教 文化 , 尤其 以 道 教养 生 长寿 文化 为 特色 。
+蒙山 春节 祈福 庙会 现状 与 文化 内涵 0 蒙山 群峰 嵯峨 , 重 峦 叠翠 , 森林 茂密 , 风光 秀丽 , 以 雄伟 壮丽 名著 华夏 , 素有 “ 九州 之 巨 镇 , 巍然 敦 大观 ” 之 称 。
+主峰 龟 蒙顶 海拔 1156 米 , 秀 出 云表 , 耸 翠 天际 , 因 其 状如 神龟 伏卧 云端 而 得名 , 为 山东 第 二 高 峰 , 与 泰山 遥遥 相望 , 被 誉 为 “ 岱宗 之 亚 ” , 素有 “ 齐鲁 第 二 高 峰 ” 之 称 。
+龟 蒙顶 为 蒙山 主峰 , 海拔 1156 米 , 是 山东 省 第 二 高 峰 。
+峰 体 呈 穹隆 形 , 犹如 一只 巨大 的 神龟 伏卧 于 云端 天际 而 得名 。
+《 孟 子 》 中 记载 : “ 孔 子 登 东山 而 小 鲁 , 登 泰山 而 小 天下 。
+” 其中 的 “ 东山 ” 即 指 蒙山 。
+唐宋 以来 , 蒙山 一直 为 文人 骚客 、 帝王 将 相 所 瞩目 。
+李白 、 杜 甫 曾 结伴 游 蒙山 , 杜 甫 写 下 “ 余 亦东 蒙 客 , 怜 君如 弟兄 。
+醉 眠 秋 共 被 , 携手 同日 行 ” 的 佳句 ; 唐 玄宗 曾 率 群臣 登临 蒙山 ; 苏 轼 登 蒙山 写 有 “ 不 惊 渤海 桑田 变 , 来看 龟蒙 漏 泽 春 ” 的 名句 ; 康熙 大帝 冬 游 蒙山 留 下 “ 马蹄 踏 碎 琼瑶 路 , 隔断 蒙 山顶 上峰 ” 的 诗篇 ; 乾隆 皇帝 南巡 中 游历 蒙山 则 留 有 “ 山 灵 盖 不 违 尧 命 , 示 我 诗情 在 玉峰 ” 的 赞美 诗篇 , 都 对 蒙山 颂扬 备至 。
+道教 早期 的 重要 人物 春秋 时期 的 老莱子 、 战国 时期 纵横 家 的 鼻祖 鬼谷子 、 汉朝 史 学家 蔡 邕 等 曾 隐居 此 山 。
+蒙山 以 道教 最 为 兴盛 , 道佛 共修 , 向 有 “ 三十六 洞天 , 七十二 古刹 ” 之 说 。
+蒙山 钟 灵 毓秀 , 孕育 了 诸如 孔 子 弟子 仲由 、 “ 算 圣 ” 刘 洪 、 “ 智圣 ” 诸葛 亮 、 “ 书圣 ” 王 羲之 、 书法 家 颜 真 卿 家族 等 贤 圣人 杰 。
+除 此 而外 , 该 地区 还是 著名 的 红色 旅游 胜地 , 著名 的 孟良崮 战役 遗址 就 在 此地 。
+蒙山 山区 , 自古 当地 居民 就 有 春节 登 蒙山 主峰 祭 玉皇 , 拜寿 星 , 祈福 寿 平安 的 习俗 。
+山里 人 原先 游猎 为生 , 逐渐 过渡 到 种 桑 养蚕 , 纺线 织布 到 种田 。
+种 瓜 得 瓜 , 种 豆 得 豆 , 六畜 兴旺 , 五谷 丰登 。
+蒙山 山区 百姓 感激 苍天 玉皇 大帝 , 逢 年 过节 , 便 自发 地 行动 起来 , 到 龟 蒙顶 上去 祭祀 , 以 酬谢 天恩 。
+后来 便 修筑 了 玉皇 殿 , 雕塑 玉皇 神像 牌位 , 专供 人们 去 蒙山 山顶 上去 报恩 祭祀 。
+逐渐 形成 蒙山 祭祀 的 山 会 。
+随着 生产 生活 的 发展 , 蒙山 祭祀 狂欢 盛会 , 一直 被 人们 延续 了 下来 , 民众 性 的 祭祀 活动 也 一直 流传 下来 。
+山 下 的 人们 扶 老 携 幼 , 徒步 上山 , 各 带 纸 香 和 供品 , 到 龟 蒙顶 玉皇 殿 里 的 神像 前 磕头 叩拜 , 烧香 许愿 。
+在 玉皇 神像 前 , 人人 尽 表 自己 的 崇敬 心愿 与 心情 。
+蒙山 山 会 祭祀 , 人人 都 说 灵验 的 迷信 说法 , 则 变成 了 灵验 消息 , 越 传 越 远 、 赵 传越 神 , 祭祀 的 人们 更 是 越来 愈多 。
+天南 地 北 的 民众 纷纷 前来 祭祀 , 观摩 参与 狂欢 。
+祭祀 的 人们 不 惜 千里 遥远 的 跋涉 之 苦 , 徒步 上山 与 山区 民众 一 起 祭祀 , 痛 说 自身 的 心愿 与 苦恼 , 乞求 上天 玉皇 大帝 保佑 降 福 。
+蒙山 龟蒙 景区 在 当地 原有 的 玉皇 等 信仰 基础 上 , 于 2014 年 正月 初一 到 初七 举办 了 蒙山 春节 祈福 庙会 。
+庙会 含 道家 祈福 拜寿 法会 、 皇帝 巡游 、 “ 年 的 味道 ” 之 民俗 表 、 非遗 高手 聚 蒙山 、 猜 灯谜 、 沂蒙 美食 汇 等 民俗 特色 活动 和 亲子 游戏 比拼 、 卡通 人物 大 狂欢 等 潮流 节目 , 精彩 纷呈 。
+身着 古装 的 皇帝 、 嫔妃 、 商贩 、 衙役 , 古色 古香 的 摊点 , 糖人 、 泥塑 、 风车 、 皮影 戏 等 非 物质 文化 遗产 商品 的 叫卖 声 让 游客 仿佛 进入 时光 隧道 , 穿越 千年 一 下子 走进 古都 的 繁华 街市 , 一 步 迈入 画卷 中 的 清明 上 河 园 市井 。
+庄严 肃穆 的 道家 祈福 法会 、 祈 寿 法会 给 人 以 心灵 的 洗礼 和 震撼 , 亲子 游戏 、 儿童 节目 、 才艺 表演 让 萌 娃 与 父母 演绎 一 段 情感 的 交融 , 真正 打造 了 蒙山 龟蒙 景区 祈福 祈 寿 、 圆 梦 之 旅 的 旅游 品牌 。
+蒙山 春节 祈福 庙会 详情 0 蒙山 春节 祈福 庙会 详情 第 一 届 蒙山 春节 祈福 庙会 ( 2013 年 ) 1 此次 庙会 包括 “ 穿越 千年 、 梦回 大 宋 ” 首届 蒙山 庙会 和 蒙山 祈福 迎祥 法会 两 项 内容 。
+庙会 的 节目 内容 也是 精彩 纷呈 。
+旱船 、 骑 毛驴 、 二人转 等 民间 民俗 节目 循环 演出 , 歌舞 、 舞蹈 、 小品 等 也 在 综艺 舞台 上 轮番 上演 。
+捏 面人 , 空竹 等 非 物质 遗产 独具 特色 , 更有 武 大郎 、 潘 金莲 与 游客 互动 表演 。
+大 宋 皇帝 携 妃子 、 侍卫 出游 , 苏 轼 游 蒙山 , 全真 七 子 现场 演绎 更 为 本 次 庙会 平添 了 几 分 特色 。
+全国 各地 的 特色 小吃 也 在 蒙山 庙会 聚集 , 亲身 体验 舌尖 上 的 福地 。
+与此 同时 , 祈福 法会 , 每天 有 两场 法事 , 每场 法事 一 个 半 小时 左右 , 游客 可以 更 好 的 祈福 寿 康 宁 , 保 事事 平安 。
+蒙山 春节 祈福 庙会 详情 第 二 届 蒙山 祈福 迎祥 法会 ( 2014 年 ) 2 农历 正月 初一 、 初三 、 初七 , 祈福 拜寿 大法 会 。
+天 腊 之 辰 , 焚香 敬 天地 , 古人 畏惧 天 , 崇敬 天 , 祭拜 天 , 希望 得到 上苍 的 庇护 和 福佑 , 祈求 风 调 雨 顺 , 事业 兴旺 , 福寿康 宁 , 国泰 民安 。
+蒙山 自古 以来 就 是 福寿 之 山 , 也是 山东 省 十二 大 祈福 圣地 之 一 。
+自 周 朝颛 臾 王 代表 周 天子 在 蒙山 祭天 。
+千百 年 来 , 各地 信 众 就 有 在 蒙山 ( 沂蒙 主峰 ) 龟 蒙顶 燃 香 敬 天 祈福 习俗 。
+蒙山 春节 祈福 庙会 详情 第 三届 蒙山 春节 祈福 庙会 ( 2015 年 ) 3 包括 蒙山 庙会 和 祈福 法会 两大 部分 。
+其中 , 蒙山 庙会 以 民族 传统 文化 讴歌 沂蒙 山 的 风土 民情 , 主要 有 皇帝 巡游 、 “ 年 的 味道 ” 之 民俗 表 、 非遗 表演 、 猜 灯谜 、 曲艺 演出 、 沂蒙 美食 汇 等 民俗 特色 活动 ; 祈福 法会 以 蒙山 千年 祈福 拜寿 文化 为 依托 , 正月 初一 至 初七 相继 在 始建 于 北宋 的 万寿宫 、 蒙山 之 巅 玉皇 殿 和 拜寿 台 进行 祈福 迎祥 大法 会 。
+本 次 庙会 围绕 四 个 主题 展开 , 主题 一 主打 喜气 “ 羊 羊 ” 过 大年 , 寻找 逝去 的 年 味 。
+主题 二 主打 大 宋 文化 , 让 大家 一 起 体验 一 场 穿越 盛宴 。
+主题 三 主 打非 遗 类 表演 , 杂耍 , 弘扬 祖国 传统 文化 , 丰富 群众 文化 生活 。
+主题 四 主打 祈福 法会 , 拜 蒙山 寿星 、 祈福 寿 康 宁 。
+蒙山 春节 祈福 庙会 详情 第 四 届 蒙山 春节 祈福 庙会 ( 2016 年 ) 4 依旧 延续 了 “ 逛 蒙山 庙会 , 祈福 寿 康 宁 ” 的 节 会 主题 , 既 传承 往届 蒙山 庙会 带有 沂蒙 风土 人情 的 传统 民俗 , 也 吸收 借鉴 了 观赏 性 强 、 参与 度 高 的 优秀 节目 。
+舞龙 舞狮 、 高跷 秧歌 、 民俗 非遗 、 乐队 曲艺 、 快板 评书 等 传统 节目 轮番 上演 ; 猴 年 主题 的 美 猴王 也 携手 观音 财神 、 福禄寿 、 十二 生肖 等 众多 模仿 秀 参与 到 庙会 中 来 , 引得 大小 朋友 争相 合影 , 其乐 融融 ; 祭 山 文化 演出 最 大 程度 还原 了 周 天子 时期 颛 臾 王 祭拜 蒙山 , 祈求 风 调 雨 顺 、 国泰 民安 的 宏伟 场面 , 再现 了 颛 臾 王 祭 山 仪式 的 盛况 … … 此外 , 微 信 现场 摇 红包 、 分享 信息 送 福袋 、 刮刮乐 中 大奖 等 活动 , 赢得 了 游客 普遍 好评 。
+蒙山 是 “ 东夷 文化 ” 的 发祥 地 之 一 。
diff --git a/PaddleNLP/ELMO/data/train/sentence_file_2.txt b/PaddleNLP/ELMO/data/train/sentence_file_2.txt
new file mode 100644
index 0000000000000000000000000000000000000000..533137d8b124cc27049b3fb7b956f24e994d3a94
--- /dev/null
+++ b/PaddleNLP/ELMO/data/train/sentence_file_2.txt
@@ -0,0 +1,1000 @@
+1999 年 被 确定 为 湖南 省 跨 世纪 学术 与 技术 带头 人 后备 人选 , 同年 还 被 评 为 省级 骨干 教师 。
+2003 年 晋升 为 教授 , 主要 从事 矿 产品 加工 、 钾 长石 综合 利用 工艺 的 研究 。
+0 曾经 参加 和 主持 省级 科研 课题 11 项 , 在 《 高校 化学 工程 学报 》 、 《 物理 化学 学报 》 、 《 过程 工程 学报 》 等 刊物 发表 多 篇 论文 , 部分 论文 被 SCI 、 EI 收录 。
+1996 —— 1998 年 主持 由 湖南 省 教委 资助 的 课题 “ 钾 长石 的确 提 钾 及 综合 利用 ” , 1997 —— 1998 年 曾 参加 “ 电镀 铝 及 铝 合金 工艺 研究 ” 、 “ 用于 电镀 铝 及 铝 合金 的 电镀 液 的 研究 ” 、 “ 固体 酸 催化 精 留 的 研究 ” 、 “ 用 细胞 组织 培养 技术 合成 柠檬 醛 的 研究 ” 、 “ 磷肥 生产 中 三废 的 综合 利用 ” 等 省级 课题 。
+2000 —— 2001 年 主持 湖南 省 资助 的 课题 “ 金属 卟啉 催化 闪锌矿 氧化 的 研究 ” , 2002 —— 2003 年 主持 由 湖南 省 教委 资助 的 课题 “ 金属 卟啉 催化 空气 氧化 烷烃 为 醇 、 醛 、 酮 的 研究 ” , 2003 —— 2004 年 主持 了 省 教委 资助 的 “ 佛 碱 双核 配合 物 模拟 甲烷 单 加 氧 酶 的 研究 ” 的 课题 。
+同时 还 主持 了 由 湖南 省 科技 厅 资助 的 “ 纳米 二氧化锰 的 研究 ” 。
+在 化工 技术 方面 有 较 深 的 造诣 , 曾 获 97 全国 ( HN ) 双 新会 金奖 。
+中文 名 彭 清静 国籍 中国 民族 汉 性别 男
+万古 霉素 试剂 盒 万古 霉素 试剂 盒 是 一 种 药品 , 产品 适用 范围 适用 于 西门子 公司 生产 的 ADVIA Centaur 系列 自动 化学 免疫 分析 仪 及 ACS : 180 化学 发光 免疫 分析 仪 , 进行 血清 / 血浆 中 万古 霉素 浓度 的 定量 测定 。
+0 注册 号 国 食 药监 械 ( 进 ) 字 2005 第 3401383 号 ( 更 ) 生产 厂商 名称 ( 中文 ) 产品 性能 结构 及 组成 万古 霉素 试剂 盒 试剂 盒 主要 是 由 万古 霉素 标记 试剂 、 固相 试剂 、 测定 标准 曲线 卡 组成 。
+标记 试剂 主要 成分 为 : 吖啶 酯 标记 的 结合 牛 血清 白 蛋白 的 万古 霉素 ( ~ 6ng / ml ) 置于 含 叠 氮 钠 ( ﹤ 0 . 1 % ) 、 防腐 剂 的 缓冲 盐 溶液 中 ; 固相 试剂 主要 成分 为 : 与 顺 磁性 粒子 共价 结合 的 鼠 单 克隆 抗 万古 霉素 抗体 ( ~ 6 . 7ug / ml ) 置于 含 叠 氮 钠 ( ﹤ 0 . 1 % ) 、 防腐 剂 的 缓冲 盐 溶液 中 。
+注册 代理 北京 西门子 医疗 诊断 设备 有限 公司 售后 服务 机构 北京 西门子 医疗 诊断 设备 有限 公司 批准 日期 2005 . 05 . 23 有效 期 截止 日 2009 . 05 . 23 备注 生产 者 名称 由 Bayer Health CareLL C ( Bayer CorporationDiagnostic sDivision ) 变更 为 Siemens Medical Solutions Diagnostics ; 生产 者 地址 与 生产 场所 地址 由 Bayer Corporation , 511 BenedictA venue , Tarry town , NY 10591 - 5097 USA 变更 为 Siemens Medical Solutions Diagnostics , 511 BenedictA venue , Tarry town , NY 10591 - 5097 USA ; 注册 代理 与 售后 服务 机构 由 拜耳 医药 保健 有限 公司 变更 为 北京 西门子 医疗 诊断 设备 有限 公司 ; 产品 适用 范围 中 拜耳 公司 变更 为 西门子 公司 ; 注册 证 由 国 食 药监 械 ( 进 ) 字 2005 第 3401383 号 变更 为国 食 药监 械 ( 进 ) 字 2005 第 3401383 号 ( 更 ) , 原 证 自发 证 之 日起 作废 。
+变更 日期 2008 . 04 . 18 生产 厂商 名称 ( 英文 ) Siemens Medical Solutions Diagnostics 生产 厂 地址 ( 中文 ) Siemens Medical Solutions Diagnostics , 511 BenedictA venue , Tarry town , NY 10591 - 5097 USA 生产 场所 Siemens Medical Solutions Diagnostics , 511 BenedictA venue , Tarry town , NY 10591 - 5097 USA 生产 国 ( 中文 ) 美国 产品 名称 ( 中文 ) 万古 霉素 试剂 盒 产品 名称 ( 英文 ) Vancomy cin ( VANC ) 规格 型号 121197 : 250 个 测试 ; 118451 : 50 个 测试 ; 672360 : 300 个 测试 ; 672361 ; 50 个 测试 产品 标准 进口 产品 注册 标准 YZB / USA 1144 - 40 - 2004 《 万古 霉素 试剂 盒 》 中文 名 万古 霉素 试剂 盒 属性 药品 注册 号 2005 第 3401383 相关 领域 医学
+男女 柔术 男女 柔术 的 最 高 境界 。
+10 广州 军区 战士 杂技 员 女 演员 : 张 婉 男 演员 : 李 童 代表 作品 : < b > 1 . < / b > < b > 《 春花 烂漫 》 < / b > 2013 军委 慰问 老 干部 晚会 编导 : 韩 真 贾 娟 作曲 : 王 喆 教练 : 高 文成 广州 军区 战士 杂技 团 < b > 2 . < / b > < b > 《 < / b > 梦蝶 》 2014 央视 春晚 编导 : 韩 真 贾 娟 作曲 : 蔡 东真 艺术 指导 : 李 亚萍
+朱 子 家训 - 增广 贤文 《 朱 子 家训 》 是 清代 学者 朱 用纯 ( 字 致 一 , 自 号 柏 庐 》 根据 自己 一生 的 研究 , 以 儒家 “ 修身 ” 、 “ 齐家 ” 的 核心 思想 为 宗旨 , 广 采 儒家 的 为人 处世 经验 、 方法 编撰 而成 。
+可说 是 一 部 详尽 的 “ 治家 ” 规范 书 。
+它 的 最 妙 之 处 就 在于 , 它 能够 采用 循 循 善 诱 、 步步 深入 的 方式 , 从 如何 “ 治家 ” 说起 , 先 联系 “ 修 人 ” , 再 推广 到 治理 国家 、 立身 处事 等 , 内容 深入 浅出 , 语言 浅显 明白 , 说理 中肯 易懂 。
+0 < b > 编者 推荐 < / b > < b > < / b > 《 增广 贤文 》 是 与 《 朱 子 家训 》 有 着 异 曲 同工 之 妙 的 另 一 部 书 。
+这部 书 可说 是 一 本 非常 优秀 的 民间 谚语 集 , 其 内容 大多 都 是 教导 人 如何 在 现实 生活 中 立身 处世 、 生存 发展 的 。
+这部 书 基本 上 反映 了 我 国 古代 老 百姓 的 一些 生活 原则 , 还 包括 一些 道教 或 佛教 的 处世 方式 。
+因此 , 我们 也 要 一 分 为 二 、 有 所 辨别 地 进行 吸收 和 摒弃 。
+当然 , 其中 大 多数 的 谚语 还是 很 有 哲理 、 耐 人 寻味 的 , 值得 我们 现代 人 借鉴 。
+作者 朱 用纯 IS BN 9787537122832 页数 157 定价 10 . 0 出版 社 第 1 版 ( 2006 年 5 月 1 日 ) 出版 时间 2006 - 5 装帧 平装
+我 和 医院 的 约定 《 我 和 医院 的 约定 》 是 新浪 读书 连载 的 都市 小说 , 作者 是 玉 琉璃 66 。
+小说 类型 0 都市 小说 内容 简介 0 我 和 医院 的 约定 , 是 以 现实 医院 的 生活 作为 写照 。
+丁 雨薇 是 医学 院校 的 风云 人物 , “ 不 忘 初 心 , 方 得 始终 ” , 怀揣 着 求医 梦想 , 踏 寻 在 医学 的 道路 上 。
+她 和 医院 的 不 解 之 缘 , 源自 她 的 双亲 , 她 的 父母 已经 远离 了 她 的 世界 , 但是 他们 走过 的 足迹 , 却 像 北极 星 一 样 引导 着 她 继续 走 上 寻医 的 这 条 路 , 在 这 条 路 上 究竟 有 什么 在 等待 着 她 执着 探索 下去 … … 中文 名 我 和 医院 的 约定 作者 玉 琉璃 66 小说 进度 连载 连载 网站 新浪 读书
+长征 4 号 核 潜艇 长征 4 号 核 潜艇 , 即 404 艇 , “ 汉 ” 级 ( 091 型 ) 攻击 型 核 潜艇 , 1987 年 下水 。
+技术 参数 0 主机 : 核 动力 , 涡轮 - 电力 推进 ; 1 座 压 水 堆 , 90 兆瓦 , 单 轴 编制 : 75 名 导弹 : C801 反舰 导弹 ( 发射 筒 安装 在 指挥 台 围 壳 后 ) 鱼雷 : 6 具 533 毫米 发射 管 声纳 : 可能 包括 法国 的 DUUx - 5 结构 特征 0 采用 锥 尾 、 单 轴 单 桨 , 围 壳 舵 与 十字 型 尾 操作 面 布局 , 双层 船 壳 , 配备 4 个 尾 舵 。
+艇 上 设 7 个 水 密 舱 , 艇 上 主要 设备 都 位于 甲板 的 第 二 层 , 包括 潜望 镜 、 舰载 雷达 、 通讯 和 卫星 导航 设备 等 。
+简氏 防务 周刊 认为 , 自 403 号 艇 开始 , 该 型 艇 的 船身 延长 了 8 米 , 以便 安装 潜 射 型 鹰击 - 82 反舰 导弹 和 配套 的 火控 系统 , 鹰击 - 82 通过 533 毫米 鱼雷 发射 管 发射 。
+北约 代号 “ 汉 ” 级 。
+中文 名称 长征 4 号 核 潜艇 国家 中国 下水 时间 1987 年 导弹 C801 反舰 导弹
+黄桷 古道 黄桷 古道 位于 重庆 市 南岸 区 南山 北面 , 从 上 新 街 到 黄桷 垭 。
+0 黄桷 古道 位于 重庆 市 南岸 区 南山 北面 , 从 上 新 街 到 黄桷 垭 。
+是 一 条 被 称 为 “ 老君 坡 ” 的 古籍 板 古道 , 为 当地 一 景 。
+古道 的 下端 起于 两 处 , 一 是 位于 上 新 街 前驱 路 的 左侧 , 是 一 条 顺 坡 蜿蜒 的 青 石板 铺垫 的 中 大路 , 是 重庆 城 连接 龙门 浩 渡口 的 通道 , 也是 通往 川黔 的 主干 道 ; 一处 起于 海棠 溪 , 紧邻 老君 洞 的 一 条 石磴 盘曲 的 天梯 云 栈 。
+两 条 路 在 黄桷 垭口 前 汇合 。
+据 考证 , 该 路 始于 唐宋 , 自此 人群 熙攘 , 热闹 非凡 。
+黄桷 古道 的 精彩 段落 , 也是 历代 名人 常 经 之 地 , 曾 留 下 丰富 的 题刻 和 诗词 。
+抗战 期间 , 入 缅 作战 的 部分 军队 也 经 此 古道 而 过 , 足见 其 当年 重要 的 交通 作用 。
+在 黄龙 公路 通车 之 前 , 也是 重庆 市区 与 黄桷 垭 的 重要 联系 道路 。
+漫步 在 青 石板 路 上 , 青 石板 山径 两侧 一 路数 百年 古老 的 黄桷 树 , 虬 枝 粗 干 。
+树荫 下 青石 径 被 往来 人 磨 得 水 亮 。
+春 来 绿意 盎然 , 嫩芽 拂 黄 遍 身 轻 , 夏 来风 清 四散 , 行人 如织 好 乘凉 。
+秋日 树 果 遍地 , 一径 秋色 随 山 转 , 冬日 残叶 枝 苍 , 衰草 枯 意 烟 笼 纱 , 意韵 天成 。
+其中 , 又 有 旧 名 为 杨家 山 的 山峰 孤峰 簇 起 , 如 巨大 的 刀刃 斜倚 南 山脚 下 , 登临 此 峰 , 回望 南山 数 峰 壁立 如画 , 远眺 江 天 一色 , 脚下 是 游丝 般 的 川黔 高速 路 。
+置身 此处 , 心中 顿感 壮阔 。
+黄桷 古道 复兴 工程 , 是 南岸 区 挖掘 旅游 资源 潜力 , 实施 重点 开发 的 项目 。
+修复 中 的 黄桷 古道 , 像 一根 蜿蜒 的 绳 线 , 把 散落 在 南山 这 一侧 的 黄桷 浓荫 、 古朴 民居 、 杨家 山 天生 的 观景 台 、 白 莎 墓 、 老君 洞 、 德国 大使 馆 、 文峰 塔 等 美丽 的 珠子 串连 在 一 起 , 让 它们 相映 成趣 , 相映 生辉 , 共同 构成 了 重庆 近郊 这 一 片 新 的 富有 吸引 力 的 风景 区 。
+中文 名 黄桷 古道 地点 重庆 市 南岸 区 又 称 老君 坡 时代 唐宋
+江苏 润 信 投资 股份 有限 公司 江苏 润 信 投资 股份 有限 公司 于 2014 年 07 月 21 日 在 南京 市 工商 行政 管理 局 登记 成立 。
+法定 代表 人 周 毅 , 公司 经营 范围 包括 实业 投资 ; 预 包装 食品 批发 与 零售 ; 百货 、 服装 等 。
+企业 信息 0 公司 类型 股份 有限 公司 ( 非上市 ) 登记 机关 南京 市 工商 行政 管理 局 成立 时间 2014 年 07 月 21 日发 照 时间 2016 年 08 月 19 日
+MCIS 职业 测评 “ MCIS 职业 测评 ” 是 目前 全美 高中 阶段 学生 必经 的 个性化 测评 , 被 誉 为 青年 生涯 发展 规划 的 “ 航标 灯 ” 。
+该 测评 从 个人 品评 与 智能 配对 、 兴趣 志向 等 全 方位 测试 , 对 参与 者 进行 学业 背景 、 综合 素质 、 个人 需求 等 推出 精准 性 服务 , 使 其 个性化 特点 与 职业 规划 发展 匹配 。
+简介 0 在 中国 , “ MCIS 职业 测评 ” 由 “ 亿 通 国际 ” 率先 自 美国 引入 , 参与 测评 的 对象 主要 为 有 志 于 海外 留学 的 高中 学生 。
+据 统计 , 受 评 的 留学 生 反应 与 测评 结果 匹配 率 达到 98 % , 成为 一项 辅助 学生 职业 选择 的 专业 性 测评 。
+理念 0 个人 所 从事 的 行业 和 工作 , 是否 真正 地 适合 个人 , 是否 有 利于 个人 在 较 长 时间 内 充分 发挥 自身 的 优势 ?
+基于 这 一 对 个人 职业 生涯 甚至 是 整个 人生 都 至关 重要 的 一 环 , 美国 开发 出 “ MCIS 职业 测评 ” 系统 , 准确 地 对 参评 者 自身 的 兴趣 、 性格 、 能力 等 特征 进行 了解 、 分析 , 发现 潜在 竞争 优势 和 能 充分 发挥 自身 优势 的 职业 , 从而 使 参评 者 能够 更 为 准确 地 迈出 人生 中 最 重要 、 最 关键 的 一 步 。
+哪些 人 适合 MCIS 职业 测评 ?
+0 除了 高中 择校 阶段 , 青年 职业 规划 也 符合 MCIS 测评 要求 。
+通过 测评 找出 个人 的 技能 偏好 和 计算 出 人格 类型 , 由 职业 技能 表 比对 技能 偏好 , 通过 测评 帮助 个人 确定 利益 价值 , 从而 提供 最 相 匹配 的 择业 建议 。
+测评 亮点 0 结合 实时 需求 , 提供 实质 性 信息 服务 全美 超过 500 个 职业 的 分布 区域 、 所 需 技能 、 薪酬 前景 等 ; 同时 列出 超过 800 个 的 教育 和 培训 信息 、 研究 方案 、 入学 和 课程 计划 类别 。
+通过 互动 工具 , 搜寻 出 符合 个人 的 生活 方式 、 有 关 职业 、 教育 方面 的 信息 , 从而 引导 和 启发 个人 对 职业 / 留学 专业 的 选择 思考 , 让 参评 者 更 从容 选择 。
+中文 名 MCIS 职业 测评 属性 全美 高中 学生 必经 的 个性化 测评 适合 人群 高中 择校 阶段 , 青年 职业 规划 亮点 结合 实时 需求 提供 实质 性 信息 服务
+简单 计算 器 Solver 《 简单 计算 器 Solver 》 是 一款 Android 平台 的 应用 。
+应用 介绍 0 简单 的 TVM 计算 器 是 一 金融 货币 的 时间 价值 计算 器 。
+它 将 计算 贷款 和 投资 的 本金 和 最终 总额 。
+简单 的 TVM 计算 器 也 能够 计算 复利 周期 或 付款 的 数量 、 每次 付款 的 金额 以及 利率 。
+这个 计算 器 允许 你 使用 不 同 的 支付 频率 和 复利 频率 。
+支持 版本 0 Android 2 . 1 及 以上 软件 名称 简单 计算 器 Solver 软件 平台 Android 软件 大小 804 . 55 KB
+客家 盐 酒 浸 鸡 用料 0 材料 用量 鸡腿 3 只 客家 酒酿 一听 老姜 一 块 ( 稍微 大 些 , 不 去皮 的 ) 枸杞 少许 做法 0 1 . 鸡腿 斩断 、 老姜 不 去皮 拍 碎 , 枸杞 用 冷水 浸泡 清洗 片刻 2 . 将 鸡腿 放入 锅 仔 中 , 放 上 老姜 , 适量 的 盐 , 倒入 一听 的 客家 酒酿 , ( 这个 量 刚好 能 淹没 到 鸡腿 肉 ) 腌制 2 小时 左右 3 . 蒸 之 前 放 上 少许 的 枸杞 , 盖盖 , 入 热水 锅 中 蒸煮 25 分钟 左右 就 可以 了 中文 名 客家 盐 酒 浸 鸡 主要 食材 鸡腿 , 客家 酒酿 分类 清蒸
+诗 华 日报 《 诗 华 日报 》 , 马来西亚 的 一 家 华文 综合 性 报纸 , 总社 设在 诗 巫 , 在 斗 湖 、 美 里 、 古晋 、 亚 庇 有 分社 , 是 东 马 较 具 影响 力 的 华文 报纸 。
+根据 马来西亚 ABC ( 马来西亚 刊物 稽查 局 ) 统计 , 每日 报份 净 销量 为 东 马 之 冠 。
+0 诗 华 日报 是 大马 华文 报 中 历史 较 久 的 一 家 报纸 , 1952 年 4 月 1 日 在 诗 巫 创刊 时 ( 一说 在 古晋 创刊 ) 仅 属于 地方 性 报纸 , 每日 出版 两大 张 。
+后来 逐渐 扩充 内容 和 张 数 , 且 发行 范围 也 涵盖 到 砂 劳 越 、 沙巴 及 汶莱 , 由 一 家 地方 性 报纸 发展 成为 婆罗 洲 的 报纸 。
+现在 每日 发行 12 大张 共 48 版 , 涵盖 政治 、 经济 、 论坛 、 体育 、 影视 、 副刊 等 , 是 一 家 综合 性 报纸 [ 1 ] [ 2 ] 。
+2001 年 由 启 德行 集团 接手 後 , 除了 出版 中文 版 外 , 还 出版 英文 版 《 马来 日报 》 、 马来 文 版 《 婆罗 洲 邮报 》 , 成为 面向 全 社会 、 跨 族群 的 媒体 [ 3 ] 。
+诗 华 报业 周边 产品 : 《 小 乐园 》 儿童 月刊 、 《 豆苗 》 学生 周刊 、 《 自然 与 健康 》 / 《 婆罗 洲 风采 》 周刊 、 《 乐 》 周刊 、 《 足球 王 》 周刊 、 诗 华 资讯 ( 电子 网站 ) 。
+中文 名 诗 华 日报 类型 马来西亚 的 一 家 华文 综合 性 报纸 总社 诗 巫 地区 马来西亚
+恋 绝 尘 《 恋 绝 尘 》 是 董 飞 创作 的 网络 小说 , 发表 于 晋江 文学 网 。
+0 怎么 了 , 我 爱 上 就 是 爱 上 了 ?
+可是 你 为什么 要 对 我 这么 好 , 想 忘记 你 真 的 很 难 啊 。
+你 说 经过 烈火 的 洗礼 才能 变成 凤凰 , 你 说 只有 无 情 才会 无 懈 可 击 , 你 说 得 每 一句 话 我 都 记得 , 你 什么 时候 会 回来 啊 ?
+还是 需要 我 去找 你 !
+书桓 !
+中文 名称 恋 绝 尘 作者 董 飞 类型 网络 小说 连载 平台 晋江 文学 网
+仙 风月 情 《 仙 风月 情 》 是 潇洒 月亮 创作 的 网络 小说 , 发表 于 起点 网 。
+作者 0 潇洒 月亮 作品 简介 0 她 如 一弯 清 月 , 阴 差 阳 错 陷入 天宫 , 感动 着 逍遥 王 和善 薇 彼此 的 那 份 随 君 而生 , 伴君 一世 , 坚 如 磐石 的 爱情 , 在 帮助 他们 的 同时 , 她 那 轻柔 的 月光 不 由 自主 痴痴 洒 在 了 本 不 属于 她 的 逍遥 王 身 上 , 纠结 痛苦 的 爱恋 让 她 情愿 为了 成全 他们 的 爱情 而 与 阴晴 不 定 , 腹黑 霸道 的 冥王 虚假 爱恋 。
+可是 一切 如风 如月 , 事事 仿佛 无 着落 , 她 猜 不 透 那 风 是否 柔情 刮过 她 身 ?
+更加 猜 不 透 仙 风月 情 一切 的 结局 。
+中文 名称 仙 风月 情 作者 潇洒 月亮 连载 平台 起点 网
+Cr - Mn - N 奥 氏 体 不锈钢 锰 和 镍 都 是 奥 氏 体 形成 元素 , 且 都 可以 与 铁 形成 无限 固 溶 体 。
+当 含 ω ( C ) = 0 . 1 % 时 , 由于 碳 稳定 奥 氏 体 的 作用 , 获得 单相 奥 氏 体 的 铬 量 可 提高 到 15 % , 但是 铬 量 大于 15 % 后 , 锰 量 再 增加 , 也 不 能 得到 单相 奥 氏 体 , 而且 钢 中 出现 了 δ 相 。
+所以 , Cr - Mn 系 奥 氏 体 不锈钢 不 能 用于 耐热 腐蚀 的 部件 。
+利用 氮 稳定 奥 氏 体 的 作用 , 开发 了 Cr - Mn - N 系 奥 氏 体 不锈钢 。
+10 Cr - Ni 奥 氏 体 不锈钢 的 应用 广泛 , 但 镍 是 比较 紧缺 的 元素 , 为了 减少 镍 的 消耗 , 国内 外 进行 了 大量 的 研究 , 发展 了 许多 少 镍 和 无 镍 的 奥 氏 体 不锈钢 。
+主要 有 3 种 类型 : Cr - Mn 系 奥 氏 体 不锈钢 , Cr - Mn - N 系 奥 氏 体 不锈钢 和 Cr - Mn - Ni 、 Cr - Mn - Ni - N 系 奥 氏 体 不锈钢 。
+氮 能 抑制 δ 相 的 形成 , 稳定 奥 氏 体 组织 的 作用 也 比较 大 , 能 有效 提高 钢 的 强度 而 不 降低 室温 韧性 , 并且 对 耐 腐蚀 性 无 影响 。
+但是 , 氮 含量 受到 溶解 度 的 限制 , 一般 的 氮 含量 在 0 . 3 % ~ 0 . 5 % 以下 。
+
+钟 宏 钟 宏 , 女 , 中共 党员 , 副 研究 员 。
+1989 年 毕业 , 1993 年 7 月 由 锦州 师专 调至 渤海 大学 。
+0 先后 在 党政 办 、 外国 语 学院 、 中文 系 、 数学 系 等 部门 工作 。
+现任 渤海 大学 数理 学院 党 总支 副 书记 兼 副 院长 。
+在 任职 期间 , 曾 发表 《 高校 办公 室 工作 刍议 》 、 《 创新 教育 是 高校 教学 改革 的 重要 内容 》 、 《 创新 教育 环境 下 的 教书 育人 工作 》 、 《 试论 高等 教育 素质 教育 的 实施 》 、 《 创新 教育 呼唤 创新 型 教师 》 等 论文 。
+中文 名 钟 宏 职务 渤海 大学 数理 学院 副 院长 性别 女 政治 面貌 中共 党员
+Captain Menegos Studios Menegos Studios 位于 帕罗斯 , 是 家 2 星级 酒店 。
+简介 0 Captain Menegos Studios 位于 帕罗斯 。
+酒店 地址 0 PeraMeria , 帕罗奇亚 , 84400 , 希腊 相关 条款 0 入住 时间 10 : 00 - 23 : 30 时 退房 时间 14 : 00 时 之 前 预订 取消 / 预付 政策 取消 和 预付 款 政策 根据 各种 公寓 类型 而 有 所 不 同 。
+选择 上述 公寓 时 , 请 参阅 公寓 条款 。
+儿童 和 加床 最 多 容纳 : 0 张 加床 。
+宠物 酒店 允许 客人 带 宠物 入住 , 但 需 事先 提出 请求 。
+不 收 取 额外 费用 。
+仅 接受 现金 酒店 仅 接受 客人 使用 现金 支付 。
+英文 名称 Captain Menegos Studios 房间 数量 5 酒店 星级 2 星级
+广州 市 她 他 会 酒店 公寓 广州 她 他 会 酒店 公寓 ( 邦泰 公寓 ) 位于 广州 市 海珠 区 新港 东路 33 号 自编 101 号 酒店 介绍 0 广州 她 他 会 酒店 公寓 ( 邦泰 公寓 ) 位于 广州 市 海珠 区 新港 东路 33 号 自编 101 号 , 邻近 会展 中心 , 公寓 倡导 浪漫 的 理想 主义 旅游 观 , 提供 的 是 自助 式 的 住宿 。
+公寓 拥有 豪华 双 床 房 、 复式 双 床 房 、 复式 套房 、 豪华 大床 房 等 多种 房型 , 满足 不 同 宾客 的 入住 需求 , 是 商旅 宾客 理想 居停 之 所 。
+公寓 开业 时间 2010 年 03 月 01 日 , 主楼 高 30 层 , 客房 总数 50 间 ( 套 ) 。
+酒店 信息 0 酒店 地址 : 广州 市 海珠 区 新港 东路 33 号 自编 101 号 交通 信息 : - 距离 地铁 8 号 线 磨 碟 沙 站 B 口 200 米 , 步行 约 5 分钟 ; - 距离 白云 国际 机场 27 公里 , 乘坐 出租 车 约 30 - 40 分钟 ; - 距离 广州 火车 东站 7 公里 , 乘坐 出租 车 约 20 分钟 ; - 距离 广州 火车 总站 15 公里 , 乘坐 出租 车 约 20 分钟 ; - 距离 广州 火车 南站 20 公里 , 乘坐 出租 车 约 30 分钟 。
+- 距离 琶洲 会展 中心 2 . 2 公里 , 乘坐 出租 车 约 5 - 10 分钟 ; - 距离 广州 新 电视 塔 3 . 4 公里 , 乘坐 出租 车 约 10 分钟 ; - 距离 珠江 新城 5 . 7 公里 , 乘坐 出租 车 约 15 分钟 ; - 距离 广州 美术 馆 6 公里 , 乘坐 出租 车 约 15 分钟 ; - 距离 星海 音乐 厅 6 . 5 公里 , 乘坐 出租 车 约 15 - 20 分钟 。
+中文 名 广州 市 她 他 会 酒店 公寓 酒店 星级 三 星级 商业 区域 琶洲 交易 会馆 附近 地点 广州 市 海珠 区
+建筑 施工 教程 本 教材 根据 国家 颁布 的 规范 标准 , 重点 介绍 了 建筑 施工 的 基础 理论 、 基础 知识 和 基本 施工 方法 , 同时 也 增加 了 国内 外 在 建筑 施工 方面 的 新 工艺 、 新 材料 、 新 技术 , 力求 反映 建筑 施工 的 先进 水平 。
+基本 信息 0 书名 : 建筑 施工 教程 图书 编号 : 913771 出版 社 : 中国 建材 工业 出版 社 定价 : 48 . 0 ISBN : 780159556 作者 : 贾 晓弟 出版 日期 : 2005 - 10 - 01 版次 : 1 开本 : 小 16 开 简介 0 为 适应 市场 经济 发展 的 需求 , 本 教程 在 施工 组织 部分 , 按 现行 市场 经济 规律 , 体现 了 相关 法律 法规 的 要求 和 先进 的 管理 理念 、 手段 和 方法 。
+本 教程 力求 体现 : 1 、 创新 性 : 吸收 近年 来 教学 改革 成果 , 基础 理论 的 教学 以 应用 为 目的 , 以 必须 够用 为 度 , 专业 教学 加强 针对 性 和 实用 性 , 力求 编 出 新意 。
+2 、 普遍 适用 性 : 本 教程 以 培养 高等 工程 技术 人才 为 目标 , 适用 于 普通 高等 学校 , 也 适用 于 函授 大学 及 专科 教育 。
+同时 对于 从事 土木 工程 施工 建设 的 技术 与 管理 人员 也 应有 较好 的 参考 价值 。
+3 、 实用 性 : 加强 针对 性 , 考虑 到 近年 来 建筑 工程 的 施工 发展 方向 及 新 修订 的 规范 , 增加 了 相应 的 内容 。
+目录 0 第 一 章 土方 工程 第 一 节 概述 第 二节 场地 平整 第 三 节 基坑 开挖 第 四 节 土方 的 填筑 与 压实 第 二 章 基础 工程 第 一 节 浅 基础 的 施工 第 二节 地基 处理 第 三 节 桩 基础 第 四 节 地下 连续 墙 第 五 节 墩 基础 第 六 节 沉井 基础 第 三章 砌筑 工程 第 一 节 脚手 架 工程 第 二节 砌体 结构 的 材料 第 三 节 砌体 施工 工艺 第 四 节 砌体 工程 冬季 施工 第 四 章 混凝土 结构 工程 第 一 节 混凝土 结构 工程 施工 概述 第 二节 钢筋 工程 第 三 节 模板 工程 第 四 节 混凝土 工程 第 五 节 液压 滑 升 模板 施工 第 六 节 高强 混凝土 施工 第 五 章 预应力 混凝土 工程 第 一 节 先 张 法 第 二节 后 张 法 第 六 章 结构 安装工程 第 一 节 起重 机械 第 二节 索具 和 设备 第 三 节 单层 工业 厂房 结构 安装 第 四 节 多层 装配 式 结构 安装 第 五 节 钢 结构 及 空间 结构 吊装 第 七 章 防水 工程 第 一 节 屋面 防水 工程 第 二节 地下 防水 工程 第 八 章 装饰 工程 第 一 节 门窗 工程 第 二节 吊顶 、 隔墙 工程 第 三 节 抹灰 工程 第 四 节 饰面 工程 第 五 节 楼 地面 工程 第 六 节 涂料 、 刷 浆 、 裱糊 工程 第 九章 施工 组织 概论 第 一 节 工程 项目 施工 组织 原则 第 二节 施工 准备 工作 第 三 节 施工 组织 设计 第 四 节 施工 组织 设计 的 内容 和 编制 程序 第 十 章 流水 施工 第 一 节 流水 施工 的 基本 概念 第 二节 流水 施工 参数 第 三 节 流水 施工 的 组织 方法 第 十一 章 网络 计划 技术 … … 第 十二 章 施工 组织 总 设计 第 十三 章 单位 工程 施工 组织 设计 第 十四 章 工程 项目 管理 与 施工 项目 管理 书名 建筑 施工 教程 ISBN 780159556 定价 48 . 0 出版 社 中国 建材 工业 出版 社 出版 时间 2005 - 10 - 01 开本 小 16 开
+信 地 城市 广场 公寓 信 地 城市 广场 公寓 位于 合肥 市 临泉 东路 与 铜陵 路 交叉 口 。
+楼盘 详细 0 1 、 城区 : 合肥 市 新 站区 2 、 地址 : 3 、 售楼 地址 : 合肥 市 临泉 东路 与 铜陵 路 交叉 口 信 地 城市 广场 展示 中心 4 、 均价 : 11000 元 / 平方 米 5 、 价格 详情 : 11000 元 / 平米 6 、 户型 面积 : 37 — 71 平米 7 、 建筑 类型 : 框架 8 、 物业 类别 : 公寓 , 商铺 9 、 建筑 面积 : 950000 10 、 占地 面积 : 279300 11 、 开盘 时间 : 2010 年 04 月 25 日 12 、 入住 时间 : 2011 年 11 月 13 、 装修 情况 : 毛坯 14 、 产权 年限 : 70 15 、 付款 方式 : 银行 按揭 贷款 , 一 次性 付款 16 、 容积 率 : 2 . 73 17 、 绿化 率 : 40 周边 配套 0 1 、 小区 内部 配套 : 商务 面积 58 万 平方 米 , 由 Shopping Mall , 家具 Mall , 服装 Mall , 星级 酒店 , 写字 楼 , 步行 街 和 高档 公寓 等 组成 , 配套 有 24 万 平方 米 的 花园 式 住宅 和 12 . 79 万 平方 米 综合 停车 场 2 、 中小 学 : 十 一中 3 、 综合 商场 : 安徽 大 市场 , 元 一 时代 广场 等 4 、 邮局 : 火车 站 邮局 5 、 银行 : 建行 、 工行 、 徽 行 、 招行 等 6 、 医院 : 新站 二院 楼盘 简介 0 信 地 城市 广场 的 规划 设计 由 设计 公司 RTKL 主创 。
+RTKL 在 设计 上 引进 了 国际 时尚 流行 元素 , 充分 考虑 了 车流 、 人流 、 商流 之间 的 相互 协调 。
+一 个 占地 约 2 . 5 公顷 的 景观 主题 花园 式 休闲 广场 , 配合 入口 大 草坪 和 主题 水景 花园 形成 了 城市 广场 入口 别 具 特色 的 门户 景观 , 为 消费 的 人群 提供 了 一 个 优美 愉悦 的 绿色 休闲 环境 。
+广场 不仅 设有 水系 、 凉亭 等 活动 场所 , 内部 还 设有 5000 平方 米 的 休闲 中心 、 酒吧 、 西餐 等 小型 餐饮 功能 ; 在 停车 库 的 配套 上 , 首次 引进 国际 一 流 的 设计 要素 , 建设 了 5 . 28 万 平方 米 的 立体 停车 库 , 7 . 51 万 平方 米 的 地下 停车 库 , 完全 做到 了 人 车 分流 及 每 层 皆 首 层 的 立体 效果 。
+周边 交通 状况 0 距 合肥 火车 站 850 余 米 , 步行 5 至 10 分钟 。
+信 地 城市 广场 5 号 楼 距 中绿 广场 直线 距离 约 150 米 。
+中绿 广场 枢纽 站 公交 线路 有 公交 1 路 、 10 路 、 11 路 、 22 路 、 111 路 、 119 路 、 129 路 、 149 路 、 152 路 、 226 路 、 301 路 、 504 路 、 701 路 、 801 路 、 902 路 、 滨湖 专线 等 16 条 公交 线路 。
+另外 , 目前 在 5 号 乐 富 公寓 门口 修建 地铁 及 高架 桥 。
+楼盘 名 信 地 城市 广场 公寓 建筑 面积 950000 占地 面积 279300 容积 率 2 . 73
+西安 贝 鑫诺 信息 科技 有限 公司 西安 贝 鑫诺 信息 科技 有限 公司 于 2017 年 12 月 15 日 成立 。
+法定 代表 人 黄 波 , 公司 经营 范围 包括 : 软件 开发 ; 计算机 信息 系统 集成 服务 ( 除 审批 ) 、 数据 处理 和 存储 服务 ; 计算机 机房 维护 ; 集成 电路 设计 ; 数字 动漫 制作 ; 游戏 软件 设计 制作 ; 地理 信息 数据 处理 ; 企业 管理 咨询 ; 广告 设计 、 制作 、 代理 及 发布 等 。
+10 公司 性质 有限 责任 公司 ( 自然 人 投资 或 控股 ) 成立 时间 2017 年 12 月 15 日 登记 机关 西安 市 工商 行政 管理 局 雁塔 分局
+残魂 问鼎 《 残魂 问鼎 》 是 住持 师兄 创作 的 网络 小说 , 发表 于 起点 网 。
+0 作品 简介 陈 枫 , 原 是 上界 天仙 。
+一 次 意外 被 人 生生 炼 去 天 、 人 二 魂 。
+禁 喜 、 乐 、 爱 三 情 。
+魂魄 融 于 一 怨 婴 体内 。
+重生 修仙 , 依稀 获得 了 生前 的 记忆 。
+一 步步 解开 了 远古 密事 !
+【 新人 , 求 点 击 , 求 收藏 , 求 推荐 。
+先 谢谢 了 !
+】 中文 名称 残魂 问鼎 作者 住持 师兄 类型 网络 小说 连载 平台 起点 网
+白菜 海带 豆腐 汤 白菜 海带 豆腐 汤 是 一 道 由 海带 、 白菜 等 做成 的 美食 。
+原料 列表 0 海带 150 . 0g 大 白菜 200 . 0g 豆腐 ( 南 ) 100 . 0g 大葱 10 . 0g 饮用 水 300 . 0g 详细 介绍 0 所有 的 东西 都 切 好 放 一 起 煮 , 我 喜欢 把 葱 也 煮 烂 呵呵 一般 我 都 加一 个 八角 , 这样 比较 好 喝 。
+中文 名 白菜 海带 豆腐 汤 主要 食材 海带 , 大 白菜 , 豆腐 分类 汤 类 佐料 食盐
+上海 尚 元菲 雪 制冷 设备 有限 公司 上海 尚 元菲 雪 制冷 设备 有限 公司 是 专业 设计 、 生产 、 制造 高 低温 试验 设备 、 装配 式 冷库 、 空调 设备 、 工艺 冷冻 ( 冷却 ) 设备 和 非 标 冷冻 设备 等 产品 的 民营 股份 制 企业 。
+简介 0 本 公司 主要 产品 有 各种 温度 的 装配 式 冷库 、 高 低温 试验 箱 、 湿热 试验 箱 ( 室 ) 、 高 低温 试验 室 、 工艺 冷冻 ( 冷却 ) 水 设备 、 冷 饮水 机 、 厨房 冰箱 , 同时 设计 并 安装 中央 空调 。
+产品 广泛 应用 于 工矿 企业 、 文教 科研 、 国防 工程 、 大 卖场 、 超级 市场 、 宾馆 大楼 、 医院 、 船舶 、 铁路 运输 部门 。
+企业 文化 0 上海 尚 元菲 雪 制冷 设备 有限 公司 本 着 “ 战略 前瞻 创新 无限 信守 承诺 质量 一 流 ” 的 经营 理念 , 秉承 原 上海 制冷 设备 厂 五十 年 专业 从事 制冷 系统 、 超 低温 装置 的 开发 、 设计 、 制造 、 安装 的 丰富 经验 和 良好 快速 的 售后 服务 , 利用 原 上海 制冷 设备 厂 在 低温 冷冻 及 装配 式 冷库 等 方面 的 技术 优势 , 积极 适应 市场 的 需求 , 努力 致力 于 产品 的 开发 研制 。
+上海 尚 元菲 雪 制冷 设备 有限 公司 作为 原 上海 冷气 机 厂 制冷 设备 公司 改制 后 的 企业 , 将 继续 真诚 服务 于 新 老 客户 , 对于 原来 由 上海 冷气 机 厂 制冷 设备 公司 生产 的 设备 和 安装 的 工程 , 我们 将 继续 快速 良好 的 售后 服务 , 将 “ 您 的 需求 , 我 的 追求 ” 作为 我们 的 服务 宗旨 , 竭力 使 广大 的 客户 无 后顾 之 忧 , 满意 我们 的 服务 。
+公司 名称 上海 尚 元菲 雪 制冷 设备 有限 公司 总部 地点 上海 经营 范围 专业 设计 、 生产 、 制造 高 低温 试验 设备 公司 性质 民营 公司 口号 “ 战略 前瞻 创新 无限 ”
+鼓励 童星 出身 的 方 顺吉 曾 举办 反毒 演唱 会 , 并 鼓励 年轻 人 反毒 , 但 也 难以 拒绝 毒品 诱惑 , 沦 为 摇头 族 , 为了 鼓励 日前 因 嗑 摇头 丸 被 抓 的 年轻 歌手 方 顺吉 重新 站 起来 , 并 推出 了 这 张 台语 专辑 !
+简介 0 童星 出身 的 方 顺吉 曾 举办 反毒 演唱 会 , 并 鼓励 年轻 人 反毒 , 但 也 难以 拒绝 毒品 诱惑 , 沦 为 摇头 族 , 为了 鼓励 日前 因 嗑 摇头 丸 被 抓 的 年轻 歌手 方 顺吉 重新 站 起来 , 并 推出 了 这 张 台语 专辑 !
+在 专辑 封面 , 印 了 二 行 他 的 心声 “ 我 会 认真 我 会 打拼 勇敢 搁 再 行 争取 好 名声 ” , 希望 自己 能 再次 积极 面对 人生 !
+专辑 目录 0 [ 01 ] 挂念 [ 02 ] 少年 仔 的 心声 ( 音乐 版 ) [ 03 ] 鼓励 ( 音乐 版 ) [ 04 ] 少年 仔 的 心声 [ 05 ] 鼓励 下载 点歌 [ 06 ] 当作 一 场 梦 - 方 顺吉 / 于 美玲 [ 07 ] 为 伊 心 茫茫 [ 08 ] 怎样 放 袂 离 [ 09 ] 真心 的 知己 [ 10 ] 心 波浪 [ 11 ] 人生 路 [ 12 ] 望 你 在 回头 中文 名 鼓励 歌手 方 顺吉 语言 普通 话 日期 2003 . 07 . 29 心声 我 会 认真 我 会 打拼 诠释 鼓励 年轻 人 反毒 , 以 拒绝 毒品 诱惑
+彭 树桢 彭 树桢 ( 1917 — 1940 ) 莱州 市 后 上庄 村人 。
+1938 年 3 月 , 参加 了 掖县 玉皇 顶 起义 , 随之 加入 胶东 抗日 游击 第 三 支队 , 同年 加入 中国 共产党 , 任 三 支队 大队 政治 指导 员 。
+所在 部队 合编 后 , 1940 年 任 八路 军 山东 纵队 五 旅 十三 团 政委 。
+后 在 招远 齐 山 反 “ 扫荡 ” 中 牺牲 。
+中文 名 彭 树桢 国籍 中国 民族 汉族 出生 地 莱州 市 后 上庄 村 出生 日期 1917 逝世 日期 1940 职业 军人
+掌控 云霄 小说 类型 0 东方 玄幻 内容 简介 0 一 个 被 所有 人 都 认为 是 家族 废 材 的 影 寂 , 意外 得到 了 一 个 鼎 , 从此 开始 他 的 武 道 巅峰 之 路 , 看 他 如何 踩 着 世间 强者 。
+掌控 风云 , 气冲 云霄 。
+一 步 一 步 创造 属于 自己 的 天地 … … 中文 名 掌控 云霄 作者 瞬间 失忆 小说 进度 连载 连载 网站 17k 小说 网
+不 争 春 像 一朵 野花 , 开 在 春天 的 野地 里 。
+野花 是 花中 的 凡夫 俗子 , 不 能 交通 神明 , 连 梦都 没 有 一 个 儿 , 故 不 争 春 。
+这 便 是 作者 程 耀恺 《 不 争 春 》 的 由来 。 《 不 争 春 》 是 一 部 当代 散文 集 , 共 分 三 部分 : 望乡 、 蜉蝣 和 虚实 。
+基本 介绍 0 基本 介绍 内容 简介 1 睿智 的 人 看 得 透 , 故 不 争 ; 豁达 的 人 想 得 开 , 故 不 斗 ; 明理 的 人 放 得 下 , 故 不 痴 ; 重 义 的 人 交 天下 , 故 不 孤 ; 浓情 的 人 淡 名利 , 故 不 独 ; 宁 静 的 人行 深远 , 故 不 折 ; 知足 的 人 常 快乐 , 故 不 老 … … 于是 便 有 了 作者 这本 《 不 争 春 》 。
+基本 介绍 作者 简介 2 程 耀恺 , 1942 年 生于 安徽 六安 东乡 。
+1964 年 大学 毕业 , 先后 在 阜阳 四清 工作 总 团 、 霍邱 县 农林 局 、 皖西 报社 、 安徽 省 粮油 科学 研究 所 从事 技术 与 科研 工作 , 退休 后 到 南方 做 房 地产 策划 。
+谋生 之 余 , 潜心 散文 创作 , 作品 散见 于 海 内外 报纸 副刊 及 文学 刊物 。
+草根 人生 。
+边缘 写作 。
+秋水 文章 文摘 0 春雨 我 寄居 的 城市 , 位于 江淮 之间 , 每 至 早春 二月 , 会 有 一 场 或 几 场 春雨 , 年年 如此 。
+上午 进城 , 到 三 孝 口 的 爱知 书店 , 买 新版 《 河童 杂记 本 》 和 当月 《 万象 》 , 付 完 书款 , 缘 长江 路 往回 走 。
+刚 到 大 西门 , 一 阵风 起 , 雨 接踵 而来 。
+多亏 老伴 叮嘱 , 行前 备 了 伞 。
+急忙 撑开 , 伞 面 上 响起 淅 淅沥 沥 的 音乐 , 雨声 淹没 了 杂乱 无 章 的 市声 , 真 开心 。
+我 想 , 该 是 春 之 神 在 我 的 伞 上 跳舞 吧 。
+西去 不 远 , 要 经过 一 所 校园 , 那 是 我 的 母校 , 沿街 一 排 紫叶 李 , 几 株 白玉兰 。
+紫叶 李 含苞 , 白玉兰 盛放 , 一 眼 望 去 , 恰似 19 世纪 印象派 大师 莫 奈 笔 下 的 画境 , 亲切 而 迷人 。
+每次 路过 这里 , 下 意识 就会 放慢 脚步 , 或 略作 停顿 。
+正 是 这 所 学校 , 把 我 和 这 座 城市 , 牵扯 到 了 一 起 , 否则 , 我 也许 别 有 一 种 生存 环境 和 人生 境遇 。
+立 在 伞 下 , 漫天 的 雨声 , 似乎 要 把 我 送 上 一座 时空 秋千 架 , 任由 我 在 四十七 年 的 跨度 中 , 来回 荡漾 。
+跨入 这 座 校园 之 前 , 我 的 天地 是 乡村 。
+那 乡村 地处 大别 山 北麓 , 尽管 有 山 不 高 , 有 水 不 长 , 但 寒暑 交替 , 日月 浮沉 , 叶 吐 而 燕子 归来 , 花 尽 而 果实 满 枝 。
+乡村 世界 , 犹如 这 雨中 风景 , 隐隐 约约 散发 出 忧郁 的 美 , 承载 着 朦胧 的 梦 。
+总之 , 乡下 的 一切 , 又 寂寞 又 美好 。
+虽 是 乡下 人 后代 , 但 毕竟 是 书生 , 自幼 便 从 书 中 知道 , 人们 对 “ 城市 ” , 历来 有 不 同 的 理解 和 祈愿 : 在 史家 眼里 , 城市 是 记载 人类 文明 历程 的 史书 ; 在 文人 心中 , 城市 是 演绎 悲欢 离合 故事 的 舞台 ; 在 能工 巧匠 命运 里 , 城市 是 大展 手脚 的 用武 之 地 ; 在 市井 百姓 日子 里 , 城市 是 安身 立命 的 家园 。
+书本 还 告诉 我 : 在 中国 , 城 与 人 的 关系 , 一言 难 尽 , 当官 的 , 爱城 不 恋 城 , 为 民 的 , 恋 城 却 难以 爱城 。
+毕业 之后 , 几经 周折 , 终 在 省城 定居 下 , 摇 身 一 变 而 为 城 里 人 。
+但 城市 带给 我 什么 呢 ? 近 半 个 世纪 了 , 错乱 的 城市 生活 , 如梦 如幻 。
+命运 时而 令 我 陷入 无 可 奈何 的 绝望 , 时而 重 燃 似 有 若 无 的 希望 。
+数十 年 来 , 行走 在 这 迷宫 般 街巷 里 , 总 有 一 种 莫名 的 不安 , 因为 你 永远 无 从 知道 , 将 发生 什么 , 会 错过 什么 。
+各种 宿命 与 追寻 , 各样 巧合 和 错失 , 就 隐藏 在 楼群 与 树丛 之间 , 显得 那么 忧郁 和 诡异 。
+以往 住 乡下 , 一年 中 的 行事 , 总 是 踏着 二十四 节气 的 拍 节 , 与 大 自然 保持 同步 。
+而 在 充满 傲慢 与 偏见 的 城市 , 不 是 漠视 季节 , 就 是 反 季节 , 即使 艳阳 当空 , 万 里 无 云 , 媒体 与 商家 , 却 在 共谋 什么 “ 飓风 行动 ” 、 “ 零 利 风暴 ” , 一 场 又 一 场 。
+在 乡下 , 春雨 贵 如 油 , 而 在 城 里 , 除非 天 降 金币 , 余 则 事 不 关 己 。
+不 错 , 城市 正在 日趋 繁荣 , 而 这种 繁荣 , 归根 到底 , 实 与 无 数 小 人物 胼 手 胝 足 的 努力 息 息 相关 , 但 城市 愈 发展 , 普通 人 就 愈 显 渺小 与 无 奈 。
+城市 的 光彩 , 照亮 了 高层 建筑 , 照 红 了 精英 人物 , 却 顾及 不 了 所有 的 角落 。
+芸芸 众生 所 能 感受 到 的 , 今天 拆 了 一 大片 , 明天 堆 起 一大 群 , 日新 月异 的 同时 , 面目 全非 。
+岂止 是 面目 全非 , 不 断 制造 富豪 与 时尚 的 城市 , 原有 那 一 点 诸如 “ 春雨 楼 头 尺 八 箫 ” 、 “ 龙 池柳 色 雨中 深 ” 的 色调 , 早已 被 名号 各异 的 “ 飓风 ” 、 “ 风暴 ” , 扫荡 殆尽 。
+离开 校 门口 , 沿 立交 桥 继续 西行 。
+没 走 几 步 , 蓦然 记起 昨晚 读 过 的 一 本 书 。
+书 上 说 : 在 一座 我 没 去过 的 城市 里 , 住 着 一 位 年轻 画家 , 他 的 城市 , 雨水 丰沛 , 花木 繁茂 , 他 居然 说 , 他 的 生活 就 是 “ 假装 ” : 假装 是 鸟 , 可以 飞 ; 假装 是 鱼 , 可以 潜水 ; 假装 是 狗 , 可以 追捕 野兔 ; 假装 是 猫 , 可以 叫春 撒野 ; 假装 是 天使 , 可以 纯洁 神圣 。
+他 的 异想 天开 , 略带 一 丝 酸楚 : 我 将 自己 种 进 花盆 , 假装 是 一朵花 。
+… … 尚未 学会 绽放 , 就 已 习 于 凋零 。
+读 这 番 话 , 让 我 惊愕 得 彻夜 难 眠 。
+我们 这些 生活 在 城市 里 的 人 , 干嘛 要 以 “ 假装 ” 为 能事 ? 可 现在 , 就 在 春雨 中 , 我 多少 悟 出 点 头绪 来 : 对 城市 而言 , 一心 向 富 , 想必 不 得 不 忽略 一些 该 忽略 的 人 和 事 ; 以 个人 角度 看 , 对于 自己 长期 生活 的 城市 , … … 序言 0 瞧 我 这 凡夫 俗子 , 不 能 交通 神明 , 所以 连 梦都 没 有 一 个 儿 —— 《 红楼 梦 》 第 一百 零九 回 上 这 句 话 , 差 不 多 说 的 就 是 我 。
+作为 凡夫 俗子 , 我 只 合 以 凡 眼看 世界 , 凡 心 想 问题 。
+至于 凡 手写 文章 , 只是 偶尔 为 之 。
+虽说 偶尔 为 之 , 然而 雪 泥 鸿爪 , 毕竟 记录 下 了 往日 崎岖 与 一生 冷暖 。
+人 总会 老 , 总会 有 提 不 动 笔 敲 不 动 键盘 的 那 一天 , 倘若 一 任 这些 凡文 “ 空 阶 滴 到 明 ” , 也许 真 的 会 落得 “ 坏 壁 无 由 见 旧 题 ” 那样 的 情景 。
+于是 , 在 冬天 的 寒夜 里 , 我 把 历年 零星 发表 过 的 凡文 , 归并 为 望乡 、 蜉蝣 、 虚实 三 部分 , 汇集 成 书 。
+书名 不 好 起 。
+曾经 想 叫 : 淡淡 心思 , 只 说 三 分 —— 这 当然 多少 合乎 我 的 写作 心态 , 然而 这 八 个 字 太 过 朦胧 , 说 不 定 会 被 误读 为 花边 文学 什么 的 。
+元旦 清晨 , 文友 赵 昂 发 来 贺 岁 短信 : “ 睿智 的 人 看 得 透 , 故 不 争 ; 豁达 的 人 想 得 开 , 故 不 斗 ; 明理 的 人 放 得 下 , 故 不 痴 ; 重 义 的 人 交 天下 , 故 不 孤 ; 浓情 的 人 淡 名利 , 故 不 独 ; 宁 静 的 人行 深远 , 故 不 折 ; 知足 的 人 常 快乐 , 故 不 老 … … ” 一连 用 了 七 个 “ 不 ” , 蓦然 之 间 , 我 的 情绪 被 感染 、 思绪 被 激活 —— 何 不 把 书名 定 为 : 不 争 春 。
+编纂 这本 散文 、 随笔 集 的 当儿 , 正值 山 寒 水 瘦 , 交付 出版 之 时 , 春 气 萌动 。
+我 期盼 它 在 2013 年 的 春天 里面 世 , 祝愿 它 像 一朵 野花 , 开 在 春天 的 野地 里 。
+野花 是 花中 的 凡夫 俗子 , 不 能 交通 神明 , 连 梦都 没 有 一 个 儿 , 故 不 争 春 。
+书名 不 争 春 出版 社 合肥 工业 大学 出版 社 页数 327 页 开本 16 品牌 合肥 工业 大学 出版 社 作者 程 耀恺 出版 日期 2013 年 1 月 10 日 语种 简体 中文 IS BN 9787565012242
+成都 运 司 西园 亭 诗 · 雪峰 楼 《 成都 运 司 西园 亭 诗 · 雪峰 楼 》 是 宋代 诗人 许 将 的 作品 之 一 。
+作者 0 宋 许 将 诗词 正文 0 重楼 起 城 阴 , 乘 高 望 四极 。
+列 峰 横 青天 , 飞雪 千里 积 。
+疑 是 空 素 山 , 冬夏 海 中 白 。
+莫 怪 频 东 向 , 上 有 思 归客 。
+注释 0 作品 名称 成都 运 司 西园 亭 诗 · 雪峰 楼 创作 年代 宋 文学 体裁 五言 诗 作者 许 将
+宝 小 戈 求学 记 《 宝 小 戈 求学 记 》 是 老 末 越冬 创作 的 网络 小说 , 发表 于 晋江 文学 网 。
+0 见到 李 沁之 前 , 宝 小 戈 不 知道 自己 会 对 一 个 男孩 子 的 外表 如此 着迷 。
+他 大骂 自己 肤浅 。
+认识 李 沁 之后 , 宝 小 戈 不 知道 自己 将 被 一 个 男孩 子 把 生活 搅 得 天翻 地覆 。
+他 大骂 自己 贱 。
+缘分 是 什么 ?
+缘分 原来 就 是 一 个 贱人 屡 败 屡 战 的 血泪 史 。
+高中 是 一 个 如此 奇妙 的 地方 , 让 大家 一遍 遍地 在 痛苦 中 寻找 快感 , 悲哀 的 是 , 还 真 的 有 人 找到 了 。
+想 看 宝 小 戈 的 爆笑 求学 故事 , 请 关注 《 宝 小 戈 求学 记 》 。
+中文 名称 宝 小 戈 求学 记 作者 老 末 越冬 类型 纯爱 连载 平台 晋江 文学 网
+民国 清流 《 民国 清流 : 那些 远去 的 大师 们 》 是 国内 首部 立体 全景 式 再现 民国 大师 们 的 史诗 级 集体 传记 —— “ 民国 清流 ” 系列 之 一 , 入选 《 作家 文摘 》 2015 年度 十大 非 虚构 好书 , 百 道 网 中国 好书 榜 2015 年度 人文 类 好书 等 年度 好书 榜 。
+系 著名 作家 、 编辑 家 , 原 《 当代 》 副 主编 汪 兆骞 先生 的 心血 力作 。
+著名 作家 、 中国 作协 副 主席 张 抗 抗 、 叶 辛 、 何 建明 , 著名 作家 王 跃 文 , 著名 文化 学者 叶 廷芳 、 张 颐 武 、 李 建军 等 众多 名家 倾 情 推荐 !
+内容 简介 0 《 民国 清流 : 那些 远去 的 大师 们 》 是 国内 首部 立体 全景 式 再现 民国 大师 们 的 史诗 级 集体 传记 —— “ 民国 清流 ” 系列 之 一 , 入选 《 作家 文摘 》 2015 年度 十大 非 虚构 好书 , 百 道 网 中国 好书 榜 2015 年度 人文 类 好书 等 年度 好书 榜 。
+系 著名 作家 、 编辑 家 , 原 《 当代 》 副 主编 汪 兆骞 先生 的 心血 力作 。
+著名 作家 、 中国 作协 副 主席 张 抗 抗 、 叶 辛 、 何 建明 , 著名 作家 王 跃 文 , 著名 文化 学者 叶 廷芳 、 张 颐 武 、 李 建军 等 众多 名家 倾 情 推荐 !
+民国 六 年 至 民国 十六 年 ( 1917 至 1927 年 ) , 中国 呈现 了 与 春秋 战国 时期 的 “ 百家 争鸣 ” 相比 肩 的 思想 最 活跃 、 文化 最 灿烂 的 局面 。
+继承 了 传统 “ 士 ” 担当 精神 的 一 代 民国 清流 , “ 以 天下 为 己任 ” , 为 民族 自尊 、 学术 自尊 , 在 风云 动荡 的 时局 中 扮演 着 先锋 角色 。
+他们 学问 超绝 而又 狷介 不 羁 ; 相互 间 道义 学问 相 砥砺 , 却 在 时代 大潮 的 冲击 下 不 断 被 分化 , 一时间 龙吟 虎啸 , 各 领 风骚 , 折射 了 一 个 大 时代 的 风 起 云涌 、 雪 雨 阴晴 , 写就 了 中国 “ 士 ” 的 短暂 与 辉煌 。
+本 书 第 一 次 系统 地 讲述 了 那些 特 立 独行 的 民国 文化 大师 们 在 思想 、 学术 、 政见 、 工作 和 生活 各 个 层面 , 互相 交流 、 碰撞 、 交锋 过程 中 的 友谊 、 恩怨 、 是非 、 因缘 与 分 合 , 生动 地 再现 了 他们 在 近代 中国 的 重大 转折 时期 的 生活 图景 , 刻画 了 他们 各自 独特 的 人格 魅力 和 文化 品格 , 忠实 讲述 了 他们 的 伟大 与 卑微 , 崇高 与 缺陷 , 描绘 了 一 轴 无 比 辉煌 的 民国 文化 巨星 画卷 。
+作者 简介 0 汪 兆骞 : 著名 编辑 家 、 作家 , 生于 1941 年 , 人民 文学 出版 社 编审 , 原 《 当代 》 副 主编 兼 《 文学 故事 报 》 主编 。
+中国 作协 会员 。
+著有 《 民国 清流 1 : 那些 远去 的 大师 们 》 、 《 民国 清流 2 : 大师 们 的 “ 战国 ” 时代 》 、 《 民国 清流 3 : 大师 们 的 中兴 时代 》 、 《 文坛 亦 江湖 : 大师 们 的 相 重 与 相 轻 》 、 《 往事 流光 》 、 《 春 明 门 内 客 》 、 《 记忆 飘逝 》 、 《 紫塞 烟云 》 、 《 张 骞 》 等 。
+名家 推荐 0 著名 作家 王 跃 文 : 汪 兆骞 老师 深谙 《 左传 》 笔法 , 其 《 民国 清流 》 依照 编年 剪裁 民国 历史 , 将 人物 置于 云 诡 波 谲 的 大 事件 大冲 突 中 摹 形 刻画 , 以 史家 手眼 钩沉 实录 , 以 文学 笔墨 传神 写照 , 于 人物 书写 中 别 嫌疑 , 于 叙事 中富 褒贬 , 明 是非 , 定 犹豫 , 善 善恶 恶 , 援 史 明志 , 其 为 士人 清流 招魂 之 深情 苦心 , 令人 感佩 。
+著名 作家 、 中国 作协 副 主席 何 建明 : 我 给 这本 书 定 了 几 个 关键 词 , 这个 是 一 个 美文 传记 , 这个 文字 特别 优美 , 汪 老师 一直 是 大 编辑 家 , 所以 他 的 文字 不 用 说 , 他 对 我们 这 作家 的 要求 他 是 很 严 , 他 自己 的 写作 是 用尽 了 他 的 才华 , 所以 我 觉得 他 是 一 个 美文 传记 , 他 跟 其它 传记 不 太 一 样 。
+第 二 是 文学 的 传记 , 他 是 用 文学 表现 的 手法 , 很多 情节 大家 看 了 , 这些 伟人 , 这些 重要 的 人物 在 其它 历史 书 上 都 可以 看到 , 这 一 事件 大概 我们 多 多少 少 知道 一 点 , 但是 在 汪 老师 的 笔 下 , 那种 叙述 的 方法 是 一 种 文学 的 叙述 方法 , 是 一 个 一 个 的 情节 , 也许 我们 在 另 一 本 书 当中 , 另 一 个 世界 当中 , 包括 影视 作品 当中 , 我们 可能 有 闪过 这样 的 经历 , 但是 汪 老师 用 文学 的 方式 勾 连 得 特别 精美 。
+第 三 , 思想 传记 , 是 民国 时期 一些 中国 的 思想 的 精英 , 文化 的 精英 , 他们 的 活动 , 他们 经历 的 那段 历史 都 充满 这种 伟大 的 思想 , 这 一批 人 都 是 我们 中国 现 当代 最 杰出 的 一些 思想 家 、 政治 家 , 这些 领袖 界 人物 , 所以 这本 书 当中 , 在 情节 推动 的 时候 , 给 我们 的 很多 都 是 思想 的 东西 。
+著名 文化 学者 张 颐 武 : 过去 我们 对 大师 文人 我们 都 了解 , 因为 都 熟悉 , 因为 大家 都 研究 过 , 对 那段 历史 有 所 了解 , 中国 现代 的 源头 是 怎么 回事 , 那段 时间 最 关键 。
+但是 那个 时候 看 , 你 就会 发现 , 我们 把 它 作为 伟人 敬仰 去 看 。
+汪 先生 就 好像 是 穿行 在 伟人 之间 , 好像 穿越 小说 一 样 , 一 下 穿越 过去 了 , 穿越 过去 跟 这些 人 聊 一 聊 , 把 这个 书稿 拿 过来 , 又 把 这个 鲁 迅 书稿 拿 一 本 , 黄侃 书稿 拿 一 个 , 然后 写出 他 自己 的 书 来 。
+这个 都 读 过 了 , 这些 史料 下 了 很大 的 工夫 , 非常 不 容易 , 这个 书 最 好 的 地方 就 是 现场 感 很 强 , 很 鲜活 。
+因为 汪 先生 有 一 个 好处 , 中国 现代 大 文人 , 像 王蒙 开始 都 有 来往 , 他 对 当代 大 文人 都 熟悉 。
+用 跟 当代 大 文人 交往 的 方式 跟 民国 的 大 文人 对话 、 沟通 , 这个 路子 我 觉得 特别 有 意思 , 这 是 一 个 好处 , 现场 感 强 。
+目录 0 第 一 章 民国 六 年 ( 1917 年 ) / 001 陈 独秀 、 胡 适应 北京 大学 校长 蔡 元 培 之 邀 , 分别 出任 该 校 文科 学长 和 教授 。
+二人 先后 在 《 新 青年 》 发表 《 文学 改良 刍议 》 、 《 文学 革命 论 》 , 率先 举起 文学 革命 的 大旗 , 一 个 崭新 的 《 新 青年 》 时代 呼 之 欲 出 。
+暮气 沉沉 的 北大 逐渐 成为 新 文化 运动 的 精神 高地 。
+李 大 钊 、 钱 玄 同 、 刘 半农 、 周氏 兄弟 等 一 代 新型 知识 精英 , 也 开始 登 上 了 历史 舞台 , 开创 了 以 现代 文明 为 核心 的 新 文化 运动 。
+第 二 章 民国 七年 ( 1918 年 ) / 063 就 在 辛亥 革命 步履 艰难 之 时 , 北京 的 新 文化 运动 , 却 有 一抹 春色 。
+《 新 青年 》 改组 为 同人 刊物 , 由 陈 独秀 、 胡 适 、 李 大 钊 、 钱 玄 同 、 沈 尹 默 、 高一 涵 六人 轮流 编辑 , 请 撰稿 人 周氏 兄弟 、 刘 半农 等人 协助 办刊 。
+周 树人 以 鲁 迅 笔名 发表 《 狂人 日记 》 等 小说 , 我们 民族 文学 的 面貌 、 气象 、 精神 、 灵魂 , 焕然 一新 。
+胡 适 、 鲁 迅 成为 中国 现代 思想 史 、 文学 史 上 并 峙 的 双峰 。
+毛 泽东 创办 《 湘江 评论 》 , 写 《 民众 大 联合 》 , 胡 适 高度 评价 云 : “ 一篇 大 文章 , 眼光 很 远大 , 议论 也 很 痛快 , 确 是 现今 重要 文字 。
+” 李 大 钊 与 胡 适 开展 “ 问题 与 主义 ” 的 学术 讨论 , 陈 独秀 、 毛 泽东 、 鲁 迅 旗帜 鲜明 地 站 在 胡 适 一边 。
+陈 独秀 、 李 大 钊 办 《 每周 评论 》 , 有 良知 和 正义 感 的 报人 章 士 钊 、 邵 飘萍 等 办 《 京报 》 、 《 时报 》 等 , 掀起 舆论 波澜 , 给 中国 言论 史 留 下 一 笔 丰厚 的 遗产 。
+第 三章 民国 八 年 ( 1919 年 ) / 095 民国 八 年 , 中国 依然 是 独裁 者 横行 的 屠场 和 炼狱 。
+新 文化 运动 合乎 逻辑 地 催生 了 “ 五四 ” 爱国 学生 运动 。
+那些 从 黑暗 中 突围 出来 的 知识 分子 , 集体 亮相 , 以 启蒙 者 和 革命 家 的 胆魄 , 继续 奋力 开启 新 时代 的 闸门 , 一 路 高歌 猛进 , 为 20 世纪 的 中国 历史 谱写 了 新 的 序言 。
+第 四 章 民国 九 年 ( 1920 年 ) / 141 民国 九 年 , 比起 波澜 壮阔 的 民国 八 年 , 少 了 些 红火 , 但 并 不 沉寂 , “ 五四 ” 余 波 仍 在 荡漾 。
+各种 政治 派别 博弈 的 同时 , 知识 分子 关于 新旧 之 争 , 也 日趋 激烈 。
+陈 独秀 、 李 大 钊 等 倾向 政治 革命 , 宣传 马克思 主义 、 秘密 酝酿 成立 共产党 。
+胡 适 出版 新诗 《 尝试 集 》 , 成为 “ 新诗 老 祖宗 ” , 他 还 与 高一 涵 等人 发起 《 争 自由 的 宣言 》 , 为 争 自由 而战 。
+鲁 迅 进 北大 当 讲师 , 讲授 《 中国 小说 史略 》 。
+他 对 盲目 引进 “ 俄国 思潮 ” , 有 振 聋 发 聩 的 回答 : “ 中国 人 无 感染 性 , 他 国 的 思潮 , 甚难 移 殖 ( 《 致 宋 崇义 》 1920 年 5 月 4 日 ) 。
+” 周 作人 作 《 人 的 文学 》 、 《 平民 的 文学 》 、 《 思想 革命 》 , 提倡 “ 为 人生 的 文学 ” , 赢得 远远 超过 鲁 迅 的 声誉 。
+道 不 同 , 不 相 为 谋 , 《 新 青年 》 时代 接近 尾声 。
+第 五 章 民国 十 年 ( 1921 年 ) / 171 民国 十 年 , 是 中国 社会 剧烈 震动 的 一年 , 也是 民国 清流 激荡 分化 的 岁月 。
+大 多数 知识 分子 继续 高举 “ 个性 解放 与 自由 ” 的 旗帜 , 而 部分 人 放弃 “ 五四 ” 个性 主义 , 走向 无 产 阶级 战斗 集体 主义 , 创立 了 中国 共产党 。
+民国 清流 各自 扮演 了 不 同 的 历史 角色 , 青史 留名 。
+第 六 章 民国 十 一年 ( 1922 年 ) / 191 民国 十 一年 , 是 个 天灾 兵 祸 频仍 的 年头 。
+1 月 19 日 , 武汉 酷寒 , 天 降 大雪 , 冻死 很多 人 。
+4 月 直 奉 战 争 爆发 , 在 这 场 血战 中 , 百姓 生命 惨遭 屠戮 。
+6 月 16 日 南方 陈 炯明 兵变 , 炮轰 总统 府 , 局势 动荡 混乱 。
+同时 , 文化 界 和 舆论 界 , 争取 言论 自由 的 斗争 , 却是 这 一年 的 主题 。
+一 个 以 欧美 派 留学 生 为主 的 文人 集体 , 在 他们 的 精神 领袖 群体 的 带领 下 , 继续 向 武人 的 专制 统治 , 庄严 地 宣告 他们 的 政治 主张 。
+第 七 章 民国 十二 年 ( 1923 年 ) / 223 民国 十二 年 的 第 一天 , 孙 中山 发表 《 中国 国民党 宣言 》 , 发 国民党 改组 之 先声 。
+二十六 天后 , 苏俄 表示 倾力 支持 孙 中山 。
+孙 中山 与 苏俄 代表 联合 会 发表 声明 表示 : 共产 组织 及 苏维埃 制度 均 不 能 引用 于 中国 , 双方 认为 , “ 中国 最 要 最 急 之 问题 , 乃 在 民国 的 统一 之 成功 , 与 完全 国家 的 独立 之 获得 ” 。
+6 月 12 日 在 中共 第 三 次 代表 大会 上 , 中共 决定 与 国民党 合作 。
+10 月 28 日 , 孙 中山 任命 廖 仲恺 、 李 大 钊 、 汪 精卫 、 张 继 、 戴 季陶 五人 为 国民党 改组 委员 , 着手 筹备 改组 国民党 。
+北洋 政府 的 反动 统治 更加 黑暗 且 风雨 飘摇 。
+其 通过 “ 取缔 新 思想 ” 议案 , 控制 舆论 。
+2 月 7 日 , 吴 佩 孚 武力 镇压 京汉 铁路 工人 大 罢工 , 杀害 共产党 人 施洋 等 五十余 人 , 制造 了 血腥 的 “ 二七 ” 惨案 。
+惨案 发生 第 二 天 , 吴 佩 孚 就 以 “ 鼓动 罢工 , 扰乱 社会 秩序 ” 罪名 , 查封 了 汉口 《 真 报 》 。
+2 月 16 日 , 中共 的 《 向导 》 遭 查封 。
+四月 因 林 白水 在 《 社会 日报 》 揭露 曹 锟 贿选 总统 丑闻 , 该 报 被 封 , 林 被 监禁 三 个 月 。
+8 月 27 日 , 天津 的 《 京津 泰晤士 报 》 也 因 反对 曹 锟 贿选 , 被 禁止 在 租界 外 发行 。
+10 月 , 上海 的 《 时报 》 因 发表 反对 直系 军阀 的 新闻 被 禁 邮 … … 1923 年 , 知识 分子 与 当局 的 “ 不 合作 宣言 ” , 成 了 主 旋律 。
+第 八 章 民国 十三 年 ( 1924 年 ) / 257 民国 十三 年 ( 1924 年 ) , 大小 军阀 为 继续 分割 山河 而 使得 烽烟 四 起 —— 齐 卢 之 战 , 直 奉 火拼 … … 百姓 饱受 战乱 之 苦 。
+民国 的 缔造 者 孙 中山 , 经历 了 漫长 痛苦 的 奋斗 和 摸索 之后 , 局势 出现 了 新 的 转机 。
+他 在 苏俄 和 中国 共产党 的 帮助 下 , 终于 改组 了 国民党 , 确定 了 “ 新 三民 主义 ” 和 联 俄 、 联共 、 扶助 农工 的 三大 政策 , 并 在 年 初 顺利 地 召开 了 国民党 的 “ 一大 ” , 咄咄 逼人 地 在 南方 崛起 。
+是 年 的 民国 清流 , 除 共产党 人 陈 独秀 、 李 大 钊 、 瞿 秋 白 等 有 明确 政治 理想 外 , 更多 人 也 怀 着 希冀 , 以 文学 为 武器 , 继续 寻求 各自 的 改造 社会 之 路 。
+他们 没 有 忘记 自己 的 理性 良知 和 社会 责任 。
+可悲 的 是 , 因为 各 有 不 同 的 精神 追求 和 不 同 的 价值 选择 , 他们 却 陷入 了 无 休止 的 混战 。
+第 九章 民国 十四 年 ( 1925 年 ) / 279 1924 年 12 月 31 日 , 孙 中山 扶病 入京 , 受到 包括 李 大 钊 在内 的 北京 十万 各界 群众 的 热烈 欢迎 。
+其 入京 后 发表 《 入京 宣言 》 , 称 “ 乃 为 救国 ” , 但 并 没 给 段 祺 瑞 控制 下 的 北京 带来 变化 , 如 遭到 国民党 反对 的 “ 善后 会议 ” 如期 举行 。
+3 月 12 日 孙 中山 逝世 , 国民党 成立 治丧 委员 会 , 李 大 钊 担任 秘书 股 中文 主稿 。
+巨星 陨落 , 举国 哀恸 。
+反动 军阀 依然 我 行 我 素 。
+4 月 , 京师 警 厅 发布 新 制定 的 《 管理 新闻 营业 规则 》 控制 言论 , 遭到 胡 适 、 陈 西滢 、 钱 玄 同 等 十八 位 知识 分子 抵制 。
+他们 联名 致函 司法 总长 章 士 钊 , 提出 阁 议 撤销 这 一 规则 , 以 维护 言论 出版 自由 。
+是 年 5 月 30 日 , 爆发 “ 五卅 惨案 ” , 激发 了 汹涌 澎湃 的 民族 浪潮 。
+6 月 3 日 , 郑 振铎 、 茅 盾 、 叶 圣陶 、 胡 愈之 等 创办 《 公理 日报 》 。
+瞿 秋 白 主编 的 《 热血 日报 》 面世 , 邵 飘萍 也 从 6 月 起 在 《 京报 》 推出 清华 学子 王 造 时 主编 的 《 京报 副刊 》 , 连续 报道 了 “ 五卅 惨案 ” , 进行 反帝 宣传 。
+知识 分子 反 专制 , 争 自由 , 仍是 该 年 思想 文化 战线 的 主流 。
+而 5 月 11 日 发生 的 北京 女 师大 驱逐 校长 杨 荫榆 出 校 运动 和 该 年底 的 11 月 29 日 , “ 认定 改造 人心 —— 思想 革命 , 是 急务 中 的 急务 ” 的 《 晨报 》 , 被 暴徒 纵火 焚毁 疑案 , 让 知识 界 变得 扑 朔 迷离 。
+当 我们 怅 望 民国 十四 年 的 纷乱 的 文化 星空 , 你 会 悲哀 地 发现 , 透过 曾经 同 是 新 文化 的 主将 , 陈 独秀 、 胡 适 、 周 作人 等人 围绕 着 上述 两 案 的 纷争 及其 关于 政治 、 思想 、 文化 的 纷争 , 已 深刻 地 反映 出 知识 分子 内部 分裂 已 愈来 愈 严重 。
+第 十 章 民国 十 五年 ( 1926 年 ) / 309 民国 十 五年 依然 是 北洋 军阀 横行 肆虐 、 充满 肃杀 之 气 的 一年 。
+惨案 继续 不 断 发生 。
+3 月 18 日 , 段 祺 瑞 悍然 枪杀 47 名 、 伤 200 多名 和平 请愿 的 学生 和 民众 。
+“ 三 · 一八 ” 惨案 发生 后 , 邵 飘萍 、 成 舍 我 主办 的 《 京报 》 《 世界 日报 》 等 纷纷 报道 事件 真相 , 严厉 谴责 这 场 杀戮 。
+朱 自 清 、 鲁 迅 、 周 作人 、 林 语 堂 等 作家 也 纷纷 发表 文章 抨击 反动 政权 ; 王 世杰 、 高一 涵 、 许 士康 、 陈 翰生 等 在 《 现代 评论 》 周刊 发出 抗议 声讨 之 声 。
+上海 的 叶 圣陶 、 郑 振铎 等 以 文学 形式 发出 怒吼 。
+张 作 霖 、 张 宗 昌 等 在 段 祺 瑞 垮台 后 , 杀气 腾腾 入京 , 先后 杀死 民初 新闻 史 上 熠熠 闪光 的 邵 飘萍 和 林 白水 。
+《 京报 》 、 《 社会 日报 》 随之 被 查封 。
+鲁 迅 、 蒋 梦麟 、 李 大 钊 等 都 上 了 黑 名单 。
+共产党 人 李 大 钊 面对 黑暗 和 杀戮 , 勇敢 地 率领 民众 冲锋 陷阵 , 无 私 无 畏 。
+知识 分子 站 在 民众 一边 , 表达 道义 立场 。
+梁 启 超 、 胡 适 、 徐 志摩 、 陈 寅 恪 等 于 春秋 更替 、 风雨 晨昏 中 , 继续 追逐 新 文化 之 梦 … … 第 十一 章 民国 十六 年 ( 1927 年 ) / 339 1927 年 , 从 北京 到 南方 , 中华 大地 弥漫 着 血腥 狰狞 和 死亡 的 气息 。
+义士 喋血 , 大师 殒命 。
+在 军阀 张 作 霖 统治 下 的 北京 , 共产党 人 李 大 钊 被 推 上 绞刑 架 , 英勇 就义 ; 白发 书生 王 国 维 , 自沉 昆明 湖 , 国 之 魂 消 ; 戊戌 变法 首领 康 有 为 病死 青岛 。
+三 位 文人 精魂 美 魄 , 如火 如 炬 。
+空前 屠杀 , 鲜血 成 河 。
+北伐 军 席卷 江南 。
+“ 四一二 ” , 上海 宝山 路 再次 被 鲜血 染红 。
+国民党 军队 悍然 向 徒手 民众 开枪 , “ 伤 毙 至 百余 人 ” 。
+郑 振铎 、 胡 愈之 、 章 锡琛 等 联名 致信 国民党 元老 蔡 元 培 、 吴 稚晖 等 , 发出 抗议 之 声 , 并 于 4 月 15 日 在 《 商报 》 发表 。
+广州 的 共产党 人 萧 楚 女 4 月 被 杀 于 狱中 , 赵 世炎 、 陈 延年 被 杀 于 上海 , 共产党 人 李 汉俊 、 《 大江 报 》 创始 者 詹 大悲 遇难 于 武汉 。
+作家 郭 沫若 、 郁 达夫 、 成 舍 我 、 章 乃器 、 周 作人 都 以 笔 为 武器 , 批评 时政 , 激 浊 扬 清 。
+4 月 26 日 , 商务印书馆 的 高 梦旦 写信 给 胡 适 : “ 时局 混乱 已 极 , 国共 与 北方 鼎足 而 三 , 兵 祸 党 祸 , 几 成 恐怖 世界 , 言论 尤 不 能 自由 。
+” 是 年 , 中国 政局 发生 巨变 , “ 北伐 ” 成功 已 成 定局 , 国共 两党 分裂 也 成 事实 , 国民党 通过 军事 手段 建立 了 一党 专政 的 南京 政府 。
+而 文化 界 却 依然 是 纷纷 乱象 , 创造 社 、 太阳 社 与 鲁 迅 的 论战 , 鲁 迅 与 其他 学者 、 作家 的 争斗 也 从未 停止 。
+跋 / 361 书名 民国 清流 又 名 那些 远去 的 大师 们 作者 汪 兆骞 ISBN 9787514338126 页数 372 定价 49 . 80 元 出版 社 现代 出版 社 出版 时间 2015 - 8 装帧 平装
+今日 热卖 今日 热卖 是 一款 由 WentaoYan 开发 的 大小 为 3 . 5M 的 网购 生活 类 软件 , 目前 的 版本 是 1 . 8 , 能 与 iPhone 、 iPad 、 iPod touch 这些 设备 相 兼容 基本 信息 0 类别 : 网购 生活 开发 商 : WentaoYan 系统 要求 : 与 iPhone 、 iPad 、 iPod touch 兼容 兼容 设备 : iPhone 、 iPad 、 iPod touch 应用 简介 0 百里 挑 一 聚 便宜 !
+精选 聚合 最 划算 的 宝贝 , 更有 9 . 9 包 邮 秒杀 火爆 活动 每日 进行 中 。
+应用 特色 0 - 折扣 力度 大 。
+推荐 商品 折扣 前 所 未有 , 让 最 in 最 实惠 的 商品 主动 出现 在 您 眼前 。
+- 每天 都 更新 。
+每天 都 有 至少 百 款 商品 超 低 价格 等 您 抢购 。
+折扣 , 每 一天 都 是 新 的 ! - 查找 更 容易 。
+实用 详尽 的 商品 类别 划分 , 动动 手指 , 即可 轻松 找到 想要 的 商品 。
+每日 收集 淘宝 网 最 新 , 最 全 , 最 高 性价 比 的 限时 折扣 商品 , 只 为 用户 手机 购物 的 全新 体验 。
+每日 为 您 提供 八大 分类 , 近 百 个 小 分类 , 数万 种 淘宝 商品 的 最 新 折扣 信息 。
+无 论 是 等 公车 , 坐 地铁 , 还是 窝 在 被窝 里 , 您 都 可以 淘 最 低价 的 商品 啦 ! 中文 名 今日 热卖 更新 2014 - 03 - 13 版本 1 . 8 大小 3 . 50 M
+大风 如 歌 一九三九 年 , 抗日 战争 进入 相持 阶段 , 国民党 顽固 派 发起 第 二 次 反共 高潮 。
+时任 中原 局 书记 的 刘 少 奇 同志 临危 受命 , 肩负 起 发展 华中 根据 地 壮大 新四军 的 历史 使命 。
+基本 信息 0 许可 证 号 : ( 苏 ) 字 第 066 号 内容 提要 0 少 奇 带领 中原 局 机关 躲过 顽 军 的 追击 。
+他 组织 召开 中原 局 扩大 会议 , 并 迅速 打开 皖 东 局面 , 震惊 了 蒋 介石 。
+随后 取得 了 半塔 集 保卫 战 胜利 、 黄 桥 决战 胜利 。
+经 中央 同意 , 少 奇 将 中原 局 改 为 华中 局 , 在 盐城 重建 新四军 军部 , 制定 出 巩固 苏北 根据 地 的 一 系列 措施 。
+曹 甸 战役 失利 , 少 奇 向 中央 请罪 , 并 及时 总结 教训 。
+他 还 组建 了 华中 局 党校 。
+少 奇 与 陈毅 、 粟 裕 等 一 起 指挥 反 扫荡 取得 重大 胜利 。
+少 奇 写 了 《 战争 和 战略 问题 》 、 《 反对 党 内 各种 不 良 倾向 》 等 文 , 为 华中 局 党 的 组织 、 军队 建设 、 军事 斗争 奠定 了 基础 , 为 抗日 斗争 和 解放 事业 做出 巨大 贡献 。
+中文 名 大风 如 歌 制片 地区 中国 大陆 首播 时间 2011 年 集数 20 类型 革命 历史 上映 时间 2011 年 在线 播放 平台 百度 影音 题材 重大 革命 历史 题材 制作 机构 江苏 文化 产业 集团
+汤 竹庭 广西 平南 人 。
+1983 年 毕业 于 广西 大学 中文 系 新闻 专业 。
+现任 广西 社会 科学 界 联合 会 副 书记 、 副 主席 , 高级 记者 人物 简介 0 汤 竹庭 ( 1954 . 1 — ) , 男 , 广西 平南 人 。
+1983 年 毕业 于 广西 大学 中文 系 新闻 专业 。
+现任 广西 社会 科学 界 联合 会 副 书记 、 副 主席 , 高级 记者 。
+历任 广西 人民 广播 电台 台长 助理 、 新闻 部 ( 台 ) 主任 , 广西 人民 广播 电台 副 台长 、 台长 、 总 编辑 、 自治 区 广播 电视 厅 ( 局 ) 党组 成员 。
+2000 年 7 月 到 2006 年 8 月 任 中共 梧州 市委 副 书记 , 分管 科教 文卫 体 及 旅游 等 方面 工作 。
+有 30 多 篇 作品 获 全省 、 全区 新闻 奖 和 广播 电视 优秀 作品 奖 。
+学术 论文 《 录音 报道 发展 趋向 浅析 》 、 《 重 谈 新闻 的 倾向 性 》 等 被 《 新闻 潮 》 、 《 中国 广播 》 等 刊物 刊载 , 多次 获 广西 新闻 学会 和 广西 广播 电视 学会 优秀 学术 论文 奖 。
+《 跨越 低谷 —— 新 时期 广播 问题 研究 》 获 第 三届 中国 广播 电视 优秀 专著 一等奖 。
+在 广西 新闻 界 第 三 次 评选 活动 中 获 “ 广西 优秀 新闻 工作 者 ” 称号 。
+1996 年 获 “ 广西 优秀 专家 ” 称号 。
+妻 涉 高利贷 事件 0 首创 “ 订单 办学 ” 闻名 全国 的 广西 壮族 自治 区 梧州 市 电子 工业 学校 校长 赖 志坚 称 , 因为 曾经 被 某 高官 夫人 以 帮助 解决 新 校区 困难 为名 , 胁迫 自己 写 下 3 张 累计 金额 789 . 3 万元 的 借条 , 并 因 无法 偿还 , 而 被 起诉 至 法庭 。
+2010 年 12 月 15 日 , 广西 壮族 自治 区 高级 人民 法院 作出 终审 判决 , 支持 覃 清 等人 要求 赖 志坚 偿还 789 . 3 万元 的 欠款 主张 。
+覃 清 的 身份 非同 寻常 , 她 是 原 梧州 市委 副 书记 、 现任 广西 壮族 自治 区 社科 联 党组 副 书记 、 副 主席 汤 竹庭 的 夫人 。
+中文 名 汤 竹庭 出生 地 广西 平南 出生 日期 1951 . 1 职业 高级 记者 代表 作品 《 录音 报道 发展 趋向 浅析 》 、 《 重 谈 新闻 的 倾向 性 》 性别 男
+NACHI 54211 轴承 54211 轴承 是 一款 NACHI 品牌 轴承 , 型号 为 54211 。
+尺寸 参数 0 型号 : 54211 品牌 : NACHI 系列 : 推力 球 轴承 d : 55 mm OD : 90 mm B : 50 mm 样本 图片 0 中文 名 NACHI 54211 轴承 型号 54211 品牌 NACHI 系列 推力 球 轴承
+少女 王子 革命 史 《 少女 王子 革命 史 》 是 在 小说 阅读 网 上 连载 的 小说 , 作者 是 半月 草 。
+小说 类型 0 东方 玄幻 内容 简介 0 天 呐 ? 谁 来 告诉 她 , 为什么 一觉 醒来 , 就 到 了 一 个 完全 陌生 的 世界 , 先 是 被 狼人 囚禁 在 山洞 , 后 又 被 人 给 拐卖 到 众 人口 中 惟恐 避 之 不 及 的 海盗 . . . 终于 再度 逃脱 时 , 竟 又 被 诱拐 到 京城 , 去 冒充 什么 什么 王子 , 天 . 她 可是 一 个 名 符 其实 的 女生 耶 , 虽然 长 得 平凡 了 点 , 也 不 至于 被 人 误 认为 男生 呀 . 承担 起 那么 沉重 的 责任 呀 , 甚至 还 莫名 其妙 地 被 卷入 了 一 场 四 个 皇子 一 个 帝王 一 个 丞相 的 可怕 的 阴谋 里 . . . 本 文 女 主 , 是 一 个 很 普通 甚至 有 点 怯 懦 的 高中 生 , 一 次 又 一 次 地 被 欺负 , 被 背叛 , 被 伤害 , 也 总 是 一 次 又 一 次 地 陷入 无 助 的 孤独 里 , 一切 的 一切 , 让 她 明白 哭 和 软弱 什么 也 不 能 改变 , 只能 擦干 眼泪 , 学会 在 逆境 中 成长 , 在 成长 中 坚强 , 在 坚强 中 变得 成熟 … … 中文 名 少女 王子 革命 史 作者 半月 草 小说 进度 连载 连载 网站 小说 阅读 网
+分解 和弦 分解 和弦 常 作为 伴奏 织体 , 在 主调 音乐 中 应用 。
+18 世纪 上 半 叶 , 早期 钢琴 音乐 中 著名 的 阿尔 贝 蒂 低音 , 即 为 分解 和弦 伴奏 音型 。
+也 可 作为 曲调 或 声部 进行 的 一 种 成分 , 具有 旋律 作用 , 并 表示 出 和声 内涵 。
+18 世纪 以后 到 19 世纪 浪漫 派 作家 的 作品 中 , 分解 和弦 的 应用 极 为 普遍 , 并 愈 趋 复杂 , 为 和声 的 一 种 重要 织体 类型 。
+历史 0 有 的 作品 仅 用 分解 和弦 构成 , 如 J . S . 巴赫 《 平均 律 钢琴 曲 集 》 第 1 册 第 1 首 的 前奏 曲 。
+上例 左手 部分 为 分解 和弦 伴奏 音型 。
+分解 和弦 在 民谣 伴奏 中 使用 较多 。
+弹奏 0 一般 地 , 先 用 P 指弹 低音 , 然后 用 i 、 m 、 a 三 指 重复 或 不 重复 地 弹出 其余 的 和弦 音 。
+分解 和弦 与 琶音 的 共同 点 是 依次 弹出 和弦 内 各 音 , 不 同 之 处 是 琶音 弹奏 时值 较 短 , 往往 要 在 一 拍 内 完成 ; 而 分解 和弦 的 弹奏 时值 较 长 , 可以 在 4 拍 或 更 长 的 时值 才 完成 整个 和弦 的 分解 弹奏 。
+例如 C 和弦 是 由 1 ( do ) 、 3 ( mi ) 、 5 ( sol ) 三 个 音 组成 的 。
+分解 和弦 就 是 要 把 这 三 个 音 依次 弹 响 , ( 注意 : 不 是 同时 弹 响 !
+而是 依次 , 一 个 一 个 的 弹 响 ) 不管 是 按 什么 顺序 , 这 就叫 分解 和弦 ( 顾 名 思 义 , 把 和弦 拆分 依次 演奏 ) 。
+弹奏 吉他 演奏 1 从 吉他 演奏 的 角度 讲 , 就 是 比如 : 你 按住 C 和弦 , 右手 可以 先 弹 五 弦 3 品 、 然后 弹 三弦 空 弦 、 然后 二弦 1 品 、 然后 三弦 空 弦 、 然后 一 弦 空 弦 、 然后 三弦 空 弦 、 然后 二弦 1 品 、 最后 三弦 空 弦 。
+也 就 是 通常 所 说 的 “ 53231323 ” 。
+弹奏 钢琴 演奏 2 所谓 分解 和弦 就 是 把 音 顺次 弹出 的 和弦 , 柱式 和弦 就 是 三 个 或 四 个 音 一 起 弹 的 和弦 , 对于 钢琴 伴奏 , 最 普通 且 好听 的 就 是 根据 右手 弹 的 旋律 相应 的 弹出 左手 的 1351 的 伴奏 , 主音 和 右手 的 第 一 个 音 相同 , 如 对应 不 上 可以 等 左手 弹 完 了 再 弹 右手 。
+实践 01 , 先 学习 五线谱 。
+2 , 学习 五线谱 和 吉他 音位 的 对照 。
+3 , 学习 音名 、 唱 名 以及 音 级 。
+4 , 训练 各 调子 的 音阶 。
+5 , 训练 各 调子 的 常用 和弦 。
+6 , 弹奏 单音 旋律 , 学习 视唱 练耳 , 掌握 音准 和 节奏 。
+7 , 弹奏 带 简单 和弦 的 独奏 曲 , 分清 楚 旋律 和 伴奏 的 声部 。
+8 , 学习 和声 知识 , 复调 知识 , 9 , 弹奏 古典 吉他 名曲 。
+10 , 弹唱 学习 分开 训练 , 先 抓住 主 旋律 的 唱 , 单独 训练 转换 和弦 , 最后 是 弹唱 。
+11 , 训练 音阶 的 速度 和 准确 , 把 音阶 速度 训练 到 疯狂 速度 , 提高 技巧 。
+中文 名 分解 和弦 外文 名 broken chord 定义 把 和弦 的 各 和弦 音 顺次 弹出 的 奏 法 应用 领域 音乐 目的 为 和弦 的 装饰 性 用处 作为 伴奏 织体
+中国 化学 [ 601117 ] 要点 一 : 所 属 板块 机构 重仓 板块 , 预亏 预 减 板块 , 沪 股 通 板块 , 证 金 持股 板块 , 一带 一 路 板块 , 上证 180 _ 板块 , 北京 板块 , 工程 建设 板块 , HS 300 _ 板块 , 中 字头 板块 , 融资 融券 板块 。
+要点 二 : 经营 范围 许可 经营 项目 : 对外 派遣 实施 与其 实力 、 规模 、 业绩 相 适应 的 国外 工程 项目 所 需 的 劳务 人员 。
+一般 经营 项目 : 建筑 工程 、 基础 设施 工程 和 境外 工程 的 承包 ; 化工 、 石油 、 医药 、 电力 、 煤炭 工业 工程 的 承包 ; 工程 咨询 、 勘察 、 设计 、 施工 及 项目 管理 和 服务 ; 环境 治理 ; 技术 研发 及 成果 推广 ; 管线 、 线路 及 设备 成套 的 制造 安装 ; 进出 口 业务 ; 房 地产 开发 经营 ; 工业 装置 和 基础 设施 的 投资 和 管理 。
+要点 三 : 化学 工业 工程 领域 的 领先 者 公司 是 一 家 集 勘察 , 设计 , 施工 为 一体 , 知识 技术 相对 密集 的 工业 工程 公司 , 是 我 国 化学 工业 工程 领域 内 资质 最 为 齐全 , 功能 最 为 完备 , 业务 链 最 为 完整 的 工业 工程 公司 , 是 行业 内 具有 突出 优势 的 领导 企业 , 我 国 化学 工业 工程 领域 的 领先 者 。
+公司 是 我 国 最 早 进行 专业 化 经营 , 市场 化 程度 及 业务 一体 化 程度 最 高 的 工业 工程 公司 之 一 , 业务 范围 涵盖 工程 承包 , 勘察 , 设计 及 服务 , 以及 其他 三大 板块 , 公司 拥有 工程 设计 综合 甲级 资质 4 项 , 工程 勘察 综合 甲级 资质 3 项 , 行业 工程 勘察 , 工程 设计 甲级 资质 17 项 , 一 级 施工 总 承包 资质 27 项 , 一 级 施工 专业 资质 32 项 。
+要点 四 : 重大 合同 - 中标 4 . 35 亿 美元 合同 2014 年 12 月 18 日 公告 , 近日 , 公司 所 属 全资 子 公司 中国 五环 工程 有限 公司 与 印尼 PKG 有限 责任 公司 ( 国营 肥料 厂 ) 签署 了 印尼 PKG 合成 氨 尿素 工程 总 承包 项目 , 合同 工期 预计 为 34 个 月 , 合同 总 金额 4 . 35 亿 美元 , 折合 人民 币 约 26 . 66 亿元 , 占 公司 2013 年 营业 收入 的 4 . 32 % 。
+要点 五 : 签署 重大 合同 2015 年 12 月 14 日 晚间 公告 称 , 公司 近日 与 哈萨克斯坦 石油 化工 工业 公司 签署 了 哈萨克斯坦 天然 气 化工 综合 体 项目 工程 总 承包 合同 。
+合同 工期 42 个 月 , 合同 总 金额 18 . 65 亿 美元 , 折合 人民 币 120 . 10 亿元 , 占 公司 2014 年 营业 收入 的 17 . 34 % 。
+同时 公告 称 , 该 项目 的 开工 需要 以 项目 合作 方 获得 中国 境内 银行 80 % 的 项目 融资 贷款 为 前提 , 正式 贷款 协议 的 最终 签署 情况 将 影响 到 此次 项目 合同 的 执行 , 该 合同 能否 最终 执行 还 存在 不 确定 性 。
+要点 六 : 重大 合同 - 子 公司 签 35 . 6 亿 工程 承包 合同 2014 年 11 月 20 日 公告 称 , 公司 所 属 全资 子 公司 中国 化学 工程 第 十三 建设 有限 公司 近日 与 内蒙古 中佳 能源 有限 责任 公司 签署 了 30 万 吨/年 乙烯 - 乙烯 醇 树脂 项目 工程 总 承包 合同 , 合同 工期 预计 为 36 个 月 , 合同 总 金额 约 35 . 60 亿元 , 约 占 公司 2013 年 营业 收入 的 5 . 76 % 。
+中国 化学 11 月 20 日 报收 6 . 59 元 , 下跌 0 . 30 % 。
+要点 七 : 重大 合同 2012 年 5 月 , 公司 与 迪拜 政府 建设 主管 部门 EO ( Engineer ' s Office ) 签订 了 迪拜 国民 安居 项目 施工 总 承包 合同 ( National House Scheme In Dubai ) 。
+合同 工期 60 个 月 , 合同 总 金额 约 29 . 5 亿 美元 , 折合 人民 币 约 186 . 44 亿元 , 约 占 本 公司 2011 年 营业 收入 的 42 . 82 % 。
+要点 八 : 聚酯 及 尼龙 新 材料 项目 公司 计划 在 四川 南充 化工 园区 投资 建设 聚酯 及 尼龙 新 材料 项目 , 项目 总 投资 101 . 3 亿元 , 项目 将 以 设立 项目 公司 运营 的 方式 完成 。
+公司 子 公司 成达 工程 拟 持有 该 项目 公司 31 . 5 % 的 股权 , 投资 约 31 . 91 亿元 。
+项目 包括 50 万 吨/年 聚酯 ( PET ) , 22 万 吨/年 己二酸 , 10 万 吨/年 尼龙 6 等 8 套 装置 。
+投资 建设 期 36 个 月 , 项目 建成 后 , 预计 年 增 营业 收入 141 . 69 亿元 , 年 实现 净 利润 10 . 20 亿元 , 预计 总 投资 收益 率 14 . 78 % 。
+要点 九 : 重大 合同 4 2013 年 6 月 , 公司 所 属 全资 控股 子 公司 华陆 工程 科技 有限 责任 公司 与 新疆 国泰 新华 矿业 股份 有限 公司 签署 了 煤 基 精细 化工 循环 经济 工业 园 一期 项目 主体 工艺 生产 装置 、 全厂 性 公用 工程 及 生产 辅助 设施 工程 承包 合同 。
+合同 工期 预计 为 21 个 月 , 合同 总 金额 约合 40 亿元 人民 币 , 占 本 公司 2012 年 营业 收入 的 7 . 4 % 。
+要点 十 : 签 23 . 3 亿 芳烃 承包 合同 2013 年 10 月 , 公司 全资 子 公司 赛 鼎 工程 有限 公司 与 双鸭 山 龙 煤 天泰 煤 化工 有限 公司 签署 了 双鸭 山 龙 煤 天泰 煤 制 10 万 吨/年 芳烃 项目 工程 承包 合同 。
+合同 工期 预计 为 36 个 月 , 合同 总 金额 约 23 . 3 亿元 , 约 占 公司 2012 年 营业 收入 的 4 . 3 % 。
+要点 十一 : 大 订单 概念 公司 2010 年 全年 新 签 合同 2454 份 , 合同 额 513 . 3 亿元 , 同比 增长 22 . 6 % 。
+大型 项目 所 占 比例 增大 , 单项 建造 规模 大 幅度 上升 , 海外 工程 签约 量 回升 , 经营 层次 不 断 得以 提升 。
+公司 2011 年 上 半年 新 签 合同 额 总计 约 590 亿元 , 其中 6 家 工程 公司 新 签 合同 额 总计 约 370 亿元 。
+要点 十二 : 对外 投资 2011 年 8 月 公司 拟 在 江苏 启东 吕 四 海洋 经济 开发 区 投资 建设 中国 化学 启东 新 材料 产业 园区 ( 以下 简称 园区 ) 项目 。
+该 投资 项目 的 实施 , 将 有 利于 公司 实现 跨越 式 发展 和 战略 转型 。
+一期 工程 按照 供需 平衡 、 结构 优化 、 集约 高效 的 原则 , 实现 吕 四港 海洋 开发 区 港口 资源 、 原料 资源 、 土地 资源 、 水 资源 等 的 合理 利用 和 合理 配置 , 实现 资源 优势 向 经济 优势 转化 , 最 大 限度 地 提高 资源 综合 利用 效率 , 实现 可 持续 发展 。
+一期 工程 建成 后 将 具备 可观 的 经济 效益 , 公司 还 可 在此 基础 上 开发 下游 产品 , 提高 产品 附加 值 , 为 公司 创造 更多 的 经济 利益 , 确保 股东 价值 最 大 化 。
+要点 十三 : 100 万 吨 / PTA 项目 公司 计划 在 四川 南充 化工 园区 投资 建设 100 万 吨/年 精 对苯二甲酸 ( PTA ) 项目 , 项目 总 投资 41 . 93 亿元 , 公司 拟 持有 PTA 项目 公司 45 % 的 股权 , 投资 约 18 . 87 亿元 , 其中 以 超 募 资金 出资 5 . 66 亿元 , 子 公司 成达 工程 公司 拟 持有 PTA 项目 公司 35 % 的 股权 , 投资 约 14 . 67 亿元 , 另外 20 % 的 股权 由 中国 石油 四川 石化 持有 。
+本 项目 是 中 西部 地区 第 二 套 大型 PTA 装置 , 投资 建设 期 30 个 月 , 项目 建成 后 , 预计 年 增 营业 收入 67 . 12 亿元 , 年 实现 净 利润 3 . 43 亿元 , 预计 总 投资 收益 率 18 . 75 % 。
+要点 十四 : 工程 承包 业务 公司 工程 承包 板块 主要 服务 于 工业 建筑 市场 , 作为 国家 基础 建设 的 骨干 力量 之 一 , 公司 50 年 来 先后 建成 了 吉林 , 大庆 , 兰州 , 齐鲁 , 南京 , 上海 , 大连 , 太原 , 乌鲁木齐 等 大型 化工 , 石油 化工 基地 , 是 我 国 化工 和 石油 化工 工程 领域 领先 的 工业 工程 公司 , 公司 在 国内 煤 化工 , 多晶硅 等 新兴 产业 的 工程 建设 领域 处于 技术 领先 地位 , 承建 了 国内 大量 该 类 项目 , 该 板块 是 公司 第 一大 主业 , 也是 公司 最 主要 的 利润 来源 。
+要点 十五 : 勘察 , 设计 及 服务 业务 公司 是 我 国 化工 , 石油 化工 , 煤 化工 , 多晶硅 工程 领域 主要 的 勘察 , 设计 及 服务 提供 商 , 是 将 国际 先进 技术 引入 国内 并 成功 实现 产业 化 的 先行 者 , 能够 为 客户 提供 一 整套 勘察 , 设计 与 服务 , 公司 在 石油 能源 的 替代 产品 如 甲醇 等 , 太阳 能 能源 涉及 的 硅 材料 等 能源 产品 工业 化 领域 拥有 一批 专有 技术 或 技术 专长 。
+要点 十六 : “ 一 站 式 ” 工程 服务 公司 拥有 多项 化工 , 石油 化工 行业 的 高 等级 资质 , 其中 拥有 工程 设计 综合 甲级 资质 4 项 , 工程 勘察 综合 甲级 资质 3 项 , 行业 工程 勘察 , 工程 设计 甲级 资质 17 项 , 一 级 施工 总 承包 资质 27 项 , 一 级 施工 专业 资质 32 项 , 是 国内 同类 企业 中 资质 最 为 全面 的 工业 工程 公司 , 同时 公司 下属 子 公司 天辰 公司 , 成达 公司 , 华陆 科技 , 五环 科技 拥有 工程 设计 最 高 级别 资质 — 综合 甲级 资质 , 可 开展 所有 行业 的 工程 设计 , 工程 总 承包 业务 。
+公司 能以 PMC , EPC , BLT , BOT 等 多种 承包 方式 为 各类 业主 提供 服务 。
+要点 十七 : 国际 化 经营 公司 从事 海外 工程 承包 业务 历史 悠久 , 通过 下属 7 家 设计 板块 全资 子 公司 承接 境内 外 工程 总 承包 业务 , 通过 施工 板块 8 家 境内 全资 子 公司 , 1 家 境外 全资 子 公司 开展 境内 外 施工 总 承包 , 施工 承包 业务 , 逐步 进入 10 余 个 国家 和 地区 的 建筑 市场 。
+2010 年 , 全年 新 签 合同 2454 份 , 合同 额 513 . 3 亿元 , 其中 境外 合同 额 96 . 2 亿元 , 占 新 签 合同 额 的 18 . 74 % , 完成 境外 收入 42 . 52 亿元 。
+要点 十八 : 研发 实力 2010 年 , 公司 共 投入 技术 创新 资金 6 . 06 亿元 , 同比 增长 19 . 8 % , 其中 技术 研发 人工 成本 支出 3 . 23 亿元 , 技术 研发 人员 达到 3798 人 。
+公司 组织 编制 《 “ 十二五 ” 技术 发展 规划 》 , 提出 124 个 重点 研发 项目 。
+全年 共 获得 国家 授权 专利 60 项 , 被 认定 专有 技术 9 项 , 主编 , 参编 国家 , 行业 标准 60 余 项 , 获得 工程 设计 , 咨询 , 施工 , 监理 , 科技 进步 等 方面 的 省 部级 以上 奖项 80 余 项 。
+要点 十九 : 自愿 锁定 股份 的 承诺 公司 控股 股东 和 实际 控制 人 中国 化学 工程 集团 承诺 : 自 公司 股票 上市 之 日 ( 2010 年 1 月 7 日 ) 起 三十六 个 月 内 , 不 转让 或 委托 他人 管理 其 已 直接 和 间接 持有 的 公司 股份 , 也 不 由 公司 收购 该 部分 股份 。
+其他 发起 人 神华 集团 , 中国 中化 集团 均 承诺 : 自 公司 工商 登记 日 ( 08 年 9 月 23 日 ) 起 三十六 个 月 内 , 不 转让 其 持有 的 公司 在 工商 登记 日向 其 发行 的 全部 普通 股 股份 , 根据 有 关 规定 , 公司 首次 公开 发行 股票 并 上市 后 , 由 公司 国有 股 股东 转 由 全国 社会 保障 基金 理事 会 持有 的 公司 国有 股 , 全国 社会 保障 基金 理事 会 将 承继 原 国有 股东 的 禁售 期 义务 。
+
+骨 劈开 术 区 牙 槽 嵴 宽度 不 足 , 骨质 密度 为 Ⅲ 级 - - - Ⅳ 级 , 在 除去 刃 状 骨 嵴 后 , 牙 槽 骨 高度 仍 足以 植入 一定 长度 种植 体 时 , 使用 专门 的 劈开 器械 将 牙 槽 嵴 从 中间 劈开 , 形成 完整 的 颊 、 舌 侧 皮质 骨 板 , 将 种植 体 植入 劈开 的 间隙 内 , 而 骨 和 种植 体 之间 的 间隙 则 移植 骨 材料 , 这 是 处理 较 严重 的 局部 骨 缺损 的 一 种 外科 手术 方法 - - - 骨 劈开 技术 , 可以 保证 种植 体 周围 有 一定 自体 牙 槽 骨 支持 和 相对 良好 地 初期 稳定 性 , 同时 不必 移植 大量 的 骨 材料 。
+优点 0 扩大 了 种植 手术 适应 症 的 范围 , 保留 牙 槽 嵴 高度 。
+避免 牙 槽 骨 大量 丧失 , 减少 术 中骨 代 用品 用量 , 可 降低 费用 。
+手术 过程 01 、 预备 骨 劈开 平台 , 使 牙 槽 嵴 颊 舌 向 宽度 能够 进行 骨 劈开 操作 。
+2 、 定位 , 用 球 钻 或 超声 骨 刀 预备 出 接近 骨 劈开 器械 宽度 的 凹槽 。
+3 、 将 骨 劈开 器械 刃 端 置于 凹槽 内 , 通过 敲击 劈开 牙 槽 骨 , 直到 设计 深度 后 , 更换 下 一 级 器械 。
+4 、 每 级 器械 都 要 逐渐 增加 深度 。
+劈开 间隙 的 颊 舌 向 宽度 一旦 可以 放置 一 级 骨 挤压 器械 , 则 换 用 骨 挤压 方法 继续 扩 孔 。
+5 、 植入 种植 体 6 、 GBR 技术 7 、 缝合 种植 术 前 牙 槽 嵴 宽度 不 足 , H 形 切口 , 小球 钻 预备 嵴 顶 沟槽 骨 劈开 器械 沿 理想 长 轴 预备 骨 劈开 器械 进入 设计 深度 劈开 后 形成 狭长 骨 洞 , 唇 侧 骨板 凸起 扁 锥形 器械 预备 , 方便 进入 狭长 的 骨 洞 圆锥 形 器械 预备 骨 挤压 器械 预备 骨 挤压 到 设计 深度 , 皮质 骨 有 断裂 种植 窝 唇 侧 骨 壁 完整 缓慢 植入 种植 体 缝合 注意 事项 1 、 尽量 植入 较 长 种植 体 , 保证 种植 体 的 初期 稳定 性 。
+2 、 预备 手术 平台 时 应 减少 磨 除 皮质 骨 。
+3 、 劈开 力度 适中 , 过大 会 造成 皮质 骨 断裂 , 影响 初期 稳定 性 。
+4 、 劈开 过程 中 极 有 可能 出现 皮质 骨 裂纹 , 应 盖可 吸收 性 屏障 膜 。
+5 、 注意 骨 劈开 方向 , 一 次 进入 的 深度 不 可 过 深 , 以免 取出 困难 和 局部 过热 。
+中文 名 骨 劈开 器械 扁形 双面 骨 凿 优点 扩大 了 种植 手术 适应 症 的 范围 技术 GBR 技术
+南京 中盟 知识 产权 事务 所 南京 中盟 知识 产权 事务 所 有限 公司 , 是 由 创立 于 1997 年 的 江苏 中盟 律师 事务 所 的 知识 产权 团队 组建 的 知识 产权 法律 服务 专业 机构 , 主要 从事 商标 、 版权 ( 著作 权 ) 、 专利 等 业务 的 代理 以及 相关 的 法律 诉讼 等 。
+0 我们 的 团队 是 由 资深 执业 律师 和 专业 代理 人 构建 , 专业 的 法律 背景 及 丰富 的 经验 积累 , 使 我们 能够 为 客户 提供 全 方位 的 知识 产权 法律 服务 。
+我们 不仅 在 商标 确权 和 争议 解决 等 领域 具有 较 强 优势 , 同时 还 擅长 提供 商标 检索 、 监测 的 法律 意见 、 不 正当 竞争 的 法律 意见 , 提供 品牌 保护 的 战略 方案 , 帮助 客户 挑选 最 具 市场 识别 力 的 品牌 , 从而 使 客户 的 品牌 及 相关 权利 得到 最 充分 的 保护 。
+我们 的 专业 团队 为 中国 大陆 的 几千 家 企业 办理 了 国内 和 国外 的 商标 、 版权 、 专利 代理 及 知识 产权 诉讼 事宜 , 也 为 众多 国外 企业 办理 了 国内 的 商标 、 版权 、 专利 代理 及 知识 产权 诉讼 等 事宜 , 拥有 非常 丰富 的 专业 代理 经验 。
+同时 , 我们 对于 驰名 商标 、 著名 商标 的 认定 , 也有 着 非常 丰富 的 经验 。
+我们 秉持 客户 至上 的 理念 , 致力 于 为 客户 提供 优质 法律 服务 。
+我们 崇尚 团队 合作 精神 , 致力 于 细致 的 专业 分工 。
+专注 于 提供 务实 的 解决 方案 , 使 我们 不 但 能 为 客户 提供 充分 的 法律 分析 , 而且 还能 根据 客户 的 具体 情况 建议 可行 的 解决 方案 。
+我们 追求 专业 、 专注 , 不 断 的 超越 。
+中文 名 南京 中盟 知识 产权 事务 所 实质 法律 机构 领域 知识 产权 所在 地 南京
+上海 三 商 食品 工业 有限 公司 上海 三 商 食品 工业 有限 公司 于 1994 年 05 月 19 日 在 上海 市 工商 局 登记 成立 。
+法定 代表 人 铃木 胜 也 , 公司 经营 范围 包括 生产 盐渍 蔬菜 、 冷冻 蔬菜 、 脱水 蔬菜 、 蔬菜 制品 ( 酱 腌菜 ) 等 。
+企业 信息 0 公司 类型 有限 责任 公司 ( 中外 合资 ) 登记 机关 上海 市 工商 局 成立 时间 1994 年 05 月 19 日 发照 时间 1994 年 05 月 19 日
+辽宁 汇 心 科技 有限 公司 辽宁 汇 心 科技 有限 公司 成立 于 2011 年 11 月 24 日 , 是 一 家 专业 从事 电子 商务 运营 的 公司 , 位于 辽宁 省 丹东 市 。
+公司 采用 先进 的 网络 技术 与 严谨 的 管理 制度 , 坚持 以 “ 让 客户 满意 , 为 客户 赢利 ” 为 服务 宗旨 , 把 最 好 的 产品 以 最 优 的 价格 提供 给 广大 消费 者 。
+汇 心 科技 致力 于 把 丹东 本 地 特色 产品 带给 全国 百姓 , 产品 主要 有 黄海 海鲜 、 东港 草莓 、 食用 菌 类 等 本 地 特色 食品 。
+
+津巴布韦 文化 津巴布韦 在 绍纳 语 中 是 “ 石头 城 ” 之 意 。
+这 类 石头 建筑 物 遗址 在 津巴布韦 共和 国 境内 已 发现 有 100 多 处 , 其中 座落 在 维多利亚 堡 附近 的 大津 巴布 韦 遗址 规模 最 大 , 也是 保存 最 完整 的 一 组 石头 建筑 群 。
+它 分 为 卫城 和 椭圆 形 石城 两 部分 。
+范围 0 以 大津 巴布 韦 遗址 为 代表 的 铁器 时代 文化 。
+位于 非洲 中 南部 今 津巴布韦 维多利亚 堡 东南 27 公里 处 。
+这种 文化 以 花岗石 垒砌 而成 的 建筑 物 和 发达 的 黄金 贸易 为 重要 特征 。
+1868 年 发现 。
+它 的 繁荣 时期 为 13 ~ 16 世纪 。
+津巴布韦 在 绍纳 语 中 是 石头 城 之 意 。
+这 类 石头 建筑 物 遗址 在 津巴布韦 共和 国 境内 已 发现 有 100 多 处 , 其中 座落 在 维多利亚 堡 附近 的 大津 巴布 韦 遗址 规模 最 大 , 也是 保存 最 完整 的 一 组 石头 建筑 群 。
+它 分 为 卫城 和 椭圆 形 石城 两 部分 。
+卫城 修建 在 70 米高 的 花岗岩 山丘 上 , 构筑 坚固 , 里面 设有 “ 圣坛 ” , 可能 用于 宗教 祭祀 。
+椭圆 形 石城 建造 在 山 下 谷地 。
+全部 用 凿 切 齐整 的 花岗 石块 精巧 地垒 砌 起来 , 没 有 用 灰浆 粘合 。
+城墙 上 装饰 有 用 皂石 雕刻 成 的 精美 石 鸟 和 附有 美丽 图案 的 独 石柱 。
+城内 有 大小 不 一 、 错落 分布 的 圆锥 形 石塔 , 有 台阶 的 壁龛 , 雕凿 而成 的 甬道 和 石门 。
+据 碳 - 14 测定 , 大津 巴布 韦 建筑 群 营造 的 最 早 年代 为 1100 年 左右 。
+13 世纪 时 曾 进行 过大 规模 的 扩建 。
+在 这 组 建筑 群 周围 , 有 古代 修建 的 梯田 、 水渠 和 水井 的 遗迹 。
+梯田 绵延 几千 公里 , 证明 当时 已有 比较 发达 的 农业 。
+在 津巴布韦 广大 地区 , 还 发现 古代 采矿 场 7000 多 处 。
+19 世纪 末 侵入 津巴布韦 的 欧洲 殖民 者 和 寻 金客 , 曾 在 这些 古代 采矿 场 挖掘 黄金 。
+在 遗址 中 , 还 发 掘出 锄头 、 箭头 和 矛 等 铁 制品 , 还有 冶炼 黄金 用 的 熔炉 , 古代 铸 钱 用 的 泥 模 , 以及 来自 古代 东方 的 各种 陶瓷 器具 和 中国 瓷器 的 碎片 。
+论证 0 考古 学家 认为 , 中世纪 时期 的 津巴布韦 经济 相当 繁荣 。
+14 世纪 以后 这里 逐渐 发展 成为 绍纳 人 的 政治 和 宗教 活动 中心 。
+15 世纪 初 强盛 起来 的 姆 韦 尼 · 马 塔 帕 王国 和 15 世纪 末 兴起 的 罗 兹 韦 王国 , 都 曾 一 度 以 大津 巴布 韦 为 京都 。
+在 姆 韦 尼 · 马 塔帕 王国 时期 , 大津 巴布 韦 是 一 个 贸易 中心 。
+索法拉 港 是 内陆 地区 与 外界 通商 的 重要 口岸 。
+内地 的 商人 用 黄金 或 象牙 在 这里 换取 印度 的 棉布 、 铜丝 、 串珠 , 波斯 的 彩色 陶器 , 以及 阿拉伯 的 玻璃 器皿 。
+一些 来自 东非 海岸 的 斯瓦希里 商人 , 亦 曾 深入 内地 , 与 姆 韦 尼 · 马 塔 帕 王国 及 罗 兹 韦 王国 直接 交易 。
+16 世纪 初 , 葡萄牙 殖民 者 的 舰队 摧毁 了 东非 沿海 的 许多 著名 商业 城邦 , 随后 又 占领 了 索法拉 港 。
+此后 100 多年 内 。
+他们 还 使用 武力 多次 侵犯 姆 韦 尼 · 马 塔 帕 王国 。
+再 加 上 王国 内部 的 分裂 和 长期 纷争 , 姆 韦 尼 · 马 塔帕 和 罗 兹 韦 两 个 王国 先后 由 盛 而 衰 。
+津巴布韦 文化 由此 也 逐渐 衰落 。
+中文 名 津巴布韦 文化 繁荣 时期 13 ~ 16 世纪 重要 特征 花岗 石垒 砌 成 的 建筑 物 重要 特征 发达 的 黄金 贸易 遗址 津巴布韦 维多利亚 堡 东南 27 公里 处 两 部分 卫城 和 椭圆 形 石城
+五峰 宜 红茶 都 有限 公司 五峰 宜 红茶 都 有限 公司 于 2013 年 01 月 24 日 在 宜昌 市 五峰 县 工商 局 登记 成立 。
+法定 代表 人 张 万鸿 , 公司 经营 范围 包括 茶叶 市场 开发 建设 与 管理 ; 茶叶 品牌 策划 ; 红 等 。
+企业 信息 0 公司 类型 有限 责任 公司 ( 国有 控股 ) 登记 机关 宜昌 市 五峰 县 工商 局 成立 时间 2013 年 01 月 24 日 发照 时间 2016 年 12 月 15 日
+北京 光明 大 饭店 酒店 介绍 0 北京 光明 大 饭店 是 一 家 按照 国际 标准 设计 的 酒店 , 由 知名 的 设计 公司 设计 , 将 时尚 与 中国 文化 精髓 巧妙 地 结合 起来 , 堪称 奇葩 , 座落 在 风景 秀丽 的 亮马 河 畔 , 环境 优美 , 交通 便利 , 毗邻 中央 商务 区 、 距 燕莎 商场 300 米 , 紧邻 第 三 使馆 区 , 是 一座 外观 美丽 、 造型 新颖 、 功能 齐全 的 建筑 群体 , 饭店 内 有 会议 厅 、 室外 游泳 池 、 网 球场 、 独立 写字 楼 等 设施 。
+饭店 设有 各式 客房 252 套 , 附设 中 餐厅 、 日韩 餐厅 、 咖啡 厅 、 商务 中心 、 美容 美发 、 足疗 中心 、 娱乐 健身 室 、 各 类型 会议 室 , 周边 具备 丰富 的 美食 娱乐 项目 , 是 国内 外 宾客 来京 旅游 观光 、 商务 外出 理想 的 下榻 之 地 。
+酒店 信息 0 酒店 星级 : 四 星级 商业 区域 : 燕莎 商务 圈 开业 时间 : 2008 - 07 - 01 装修 时间 : 2008 - 07 - 01 酒店 地址 : 北京 市 朝阳 区 亮 马桥 路 42 号
+greates say 是 一 家 由 欧美 资深 各 行业 编辑 为 中国 客户 提供 英文 翻译 润色 服务 、 在 留学 申请 和 学术 论文 润色 方面 有 突出 的 服务 质量 的 机构 , 其 发起 人 为 两位 美籍 华人 。
+机构 简介 0 机构 简介 创办 原由 1 Greates say 的 发起 人 看到 了 中国 客户 渴望 获得 欧美 教育 、 欧美 学术 刊物 认可 以及 获得 欧美 主流 消费 市场 认可 的 需求 , 大 部分 中国 人 在 求学 / 发表 论文 / 进入 欧美 市场 或者 和 欧美 上 下游 企业 进行 业务 往来 需要 欧美 籍 编辑 完善 相关 文档 时 , 往往 无法 方便 有效 的 找到 合适 的 实体 提供 优质 可靠 又 物美 价廉 的 服务 , 所以 创办 了 该 网站 , 通过 两位 ABC 在 美国 的 人脉 , 在 短 时间 内 组成 了 一支 具备 非凡 的 留学 文书 / 学术 论文 / 商业 文章 编辑 能力 的 团队 , 发布 了 该 网站 , 并 将 “ 非凡 文书 ” 申请 了 中国 商标 。
+机构 简介 非凡 文书 2 “ 非凡 文书 ” 是 上海 程 瑞 商务 咨询 有限 公司 的 注册 商标 , 英文 商标 为 “ greates say ” , 已经 于 2009 年 7 月份 向 中华 人民 共和 国 商标 局 提交 并 通过 了 商标 注册 流程 , ” 非凡 文书 网 ” 是 一 家 提供 ” 留学 文书 撰写 , 英文 论文 润色 和 高 端 商业 翻译 的 互联 网 企业 ” , 其 特色 是 ” 全部 编辑 全部 来自 美国 大学 , 100 % 为 欧 美籍 英语 母语 人士 , 其中 50 % 以上 是 博士 学位 获得 者 ” 。
+发展 历程 0 2007 年 4 月 , 两位 创始 人 Scott . Chen 和 Alexa . Chai ( 均 为 Yale 大学 毕业 生 , 均有 5 年 华尔 街 工作 经验 ) , 在 美国 纽约 成立 了 翻译 公司 SAC Consulting . ltd 。
+专门 帮助 美国 以外 的 非 英文 母语 客户 翻译 学术 论文 和 技术 / 市场 类 文章 , 主要 客户 来自 中国 / 韩国 / 日本 驻 美国 的 分子 机构 , 其中 包括 ( 海尔 电器 / 奇瑞 汽车 / 英利 太阳 能 ) 等 中国 知名 企业 。
+2009 年 5 月 , 两位 创始 人 找到 中国 籍 合伙 人 木易 , 在 中国 上海 成立 了 分 公司 , 办公 室 设置 在 上海 张江 高 科技 园 , 并 获得 了 上海 市 政府 的 创业 扶持 基金 。
+2009 年 6 月份 , 成立 并 发布 网站 , 并 尝试 使用 口 口 相传 的 方式 开始 获得 中国 客户 。
+2009 年 7 月份 , 发布 了 ticket 客户 服务 系统 , 摒弃 了 很多 网 上 小 公司 使用 的 email 沟通 方式 , 让 客户 的 资料 更加 安全 , 沟通 更加 高效 。
+2009 年 11 月 , 英国 金融 时报 在 对 来自 东亚 50 位 internet 未来 领 军 人物 相关 报导 中 , 评价 了 greates say 的 情况 , 并 给予 了 该 网站 为 中国 人 在 欧美 人 主宰 的 世界 学术 / 商业 领域 突破 语言 劣势 提供 一 种 简单 而 方便 的 工具 的 评论 。
+截止 2010 年 2 月份 , 已经 获得 了 523 位 个人 留学 客户 , 95 % 的 目标 申请 学校 为 全球 排名 50 内 的 知名 大学 , 申请 成功 率 达到 80 % 以上 。
+从 2010 年 3 月份 开始 , 推广 英文 学术 论文 和 英文 商业 刊物 翻译 服务 , 截止 2010 年 7 月 , 已经 获得 了 20 多家 院校 和 科研 / 专利 机构 以及 新闻 机构 的 多 份 订单 。
+2010 年 6 月份 , 日 / 韩语 子 网站 准备 上线 。
+主要 业务 01 . 留学 申请 相关 英文 文书 撰写 和 润色 , 如 personal statement , statement of purpose , cv , recommendation letter , scholarship letter 等 。
+2 . 英文 学术 论文 撰写 和 润色 。
+3 . 商业 级 新闻 报道 / 法律 合同 / 市场 营销 / 分析 报告 / 技术 文献 翻译 和 润色 。
+4 . 特殊 的 定制 英文 编辑 润色 ( 如 中文 学术 书籍 的 海外 版 的 翻译 / 润色 / 校对 ) 。
+5 . 特殊 学术 文章 服务 , 如 SCI 收录 加速 , 论文 防 抄袭 报告 等 。
+服务 特色 01 , 只用 国内 翻译 公司 的 价位 , 可以 获得 欧美 人 修改 的 , 符合 欧美 人 阅读 习惯 , 文藻 精确 而 不 失 优美 的 英文 稿件 质量 。
+2 , 100 多位 编辑 为 欧美 顶级 大学 毕业 或者 在 读 ( 世界 排名 前 100 , 美 排名 前 30 ) 的 欧美 籍 人士 。
+3 , 所有 编辑 均 为 学士 以上 学历 , 50 % 以上 具备 博士 以上 学历 。
+4 , 所有 编辑 均有 学士 以上 欧美 顶级 大学 的 申请 经验 , 90 % 以上 具备 200 份 申请 文档 撰写 经验 , 20 % 编辑 有 留学 文书 审批 经验 。
+5 , 70 % 的 编辑 有 超过 1 年 辅导 东亚 / 东南亚 / 南亚 学生 申请 欧美 顶级 大学 的 经验 。
+6 , 80 % 的 编辑 都 具备 国际 顶级 学术 期刊 论文 发表 经验 , 50 % 编辑 具备 超过 5 篇 以上 的 顶级 刊物 论文 发表 经验 。
+7 , 50 % 以上 的 编辑 有 3 年 以上 工作 经验 , 其中 30 % 的 编辑 具备 10 年 以上 欧美 顶级 科技 刊物 的 编辑 / 写作 经验 。
+服务 领域 0 覆盖 学术 技术 商业 方面 的 领域 包括 : 计算机 / 数学 / 物理 / 化学 / 电气 / 电机 / 自动 化 / 生化 / 医科 / 法律 / 商业 管理 / 英美 文学 / 人类 学 / 社会 学 / 公共 管理 / 汽车 制造 / 机器 人 等 28 个 领域 , 而且 各 领域 的 资深 编辑 正在 不 断 增加 中 。
+相关 说明 0 全部 付费 和 服务 均可 通过 网站 上 的 专用 网 上 服务 平台 完成 。
+公司 可以 开 具 正规 大陆 商业 发票 。
+如何 联系 0 您 可以 直接 访问 该 网站 , 获得 更加 详细 的 信息 和 资料 , 或者 发送 email 到 service # greates say , net ( 将 # 换 成 @ , ” , ” 号 换 成 ” . ” ) , 以 询问 您 感 兴趣 的 问题 。
+此外 您 可以 点 击 该 网站 顶端 的 在线 客服 聊天 在 周一 到 周五 和 客服 人员 取得 联系 , 进行 实时 沟通 。
+此外 , 该 网站 上 很快 会 公布 400 电话 。
+公司 地址 : 上海 市 浦东 区 张江 高 科技 园 春晓 堤 路 398 号 火炬 创新 园 1 楼 。
+外文 名 greates say 特点 提供 英文 翻译 润色 服务 服务 对象 中国 客户 性质 服务 机构
+任 小弟 任 小弟 是 任 家萱 ( Selina ) 在 节目 《 我 猜 我 猜 我 猜 猜 猜 》 的 “ 迷你 短剧 ” 单元 中 的 扮演 的 宅男 角色 , 突破 性 的 扮相 令人 捧腹 。
+简介 0 Selina 任 家萱 在 《 我 猜 我 猜 我 猜 猜 猜 》 里 的 “ 赛 琳娜 小 短剧 ” 里 的 宅男 扮相 ( 右 ) 酷似 好友 Ella , 左 为 陈 汉典 。
+Selina 的 扮相 被 称 为 任 小弟 。
+背景 信息 0 Selina 主持 中视 “ 我 猜 我 猜 我 猜 猜 猜 ” 时 , 曾 在 节目 单元 “ 迷你 短剧 ” 中 与 陈 汉典 一 起 扮 丑 搞笑 , 堪称 是 出道 以来 最 大 尺度 。
+她 戴 上 酷似 马桶 盖 的 假发 , 配 上 粗框 眼镜 化身 “ 宅男 ” , 造型 有 如 “ 猪 哥 亮 ” 上身 , 不过 , 她 自己 照 镜子 后 却 自嘲 : 怎 麼 有 点 像 Ella 的 感觉 。
+Selina 曾 在 “ 康熙 来 了 ” 代 班 主持 而 与 陈 汉典 熟识 , 两人 合作 默契 不 成 问题 , 私下 常 闲话 家常 。
+Selina 演 短剧 不 计 形象 , 笑场 时 还会 发出 猪 叫 的 笑声 , 与 偶像 形象 反差 极大 , 让 陈 汉典 不 敢 小看 她 的 搞笑 功力 。
+哈林 与 Selina 联手 主持 节目 收视 渐 稳 , 录制 “ 谎言 追 追 追 ” 单元 时 , 请到 苗 可丽 、 米可白 和 王 尹平 大 谈 惊险 经验 , 苗 可丽 曾 在 光 天化 日 之 下 被 醉汉 痛殴 成 伤 , 米 可 白 则 差点 发生 致命 车祸 而 丧命 , 王 尹平 经历 空中 飞来 横祸 , 差点 被 高空 掉落 的 物品 砸 中 头部 。
+血淋淋 的 亲身 经验 , 让 现场 来宾 无 不 露 出 惊吓 面容 , 就 连 主持 人 哈林 和 Selina 都 直呼 好 恐怖 。
+中文 名 任 小弟 外文 名 BoyRen 角色 性别 男 角色 出处 节目 《 我 猜 我 猜 我 猜 猜 猜 》 扮演 者 Selina ( 任 家萱 ) 性质 精灵 、 古怪 、 爆笑
+柯 九 思 本 定 武 兰亭 序 《 柯 九 思 本 定 武 兰亭 序 》 是 吉林 文史 出版 社 2009 年 出版 的 书籍 , 作者 是 [ 唐 ] 欧阳 询 。
+内容 简介 0 《 柯 九 思 本 定 武 兰亭 序 》 推出 的 这 件 宋 拓 定 武 兰亭 , 为 元代 柯 九 思旧 藏 。
+对于 这 一 拓本 , 柯 九 思 曾经 说过 , “ 予 家 所 藏 , 得 之 乔氏 仲 山 , 天 历 间 , 上 御 奎 章阁 , 命 取 观 之 , 识 以 天 历 之 宝 , 命 侍 书 学士 虞 公 识 其 左 , 还 以 赐 之 。
+” 有 方家 评介 , “ 此 兰亭 真 刻 , 用 薄纸 打 : 故 字 美 肥 , 然 意 度 具足 , 无 毫发 街 遗 , 鉴 者 当 自得 之 。
+” 此 拓本 是 所谓 的 “ 蝉翼 拓 ” , 全 卷 纵 二十七 点 四 厘米 , 连同 泉 多 题跋 , 横 六百二十六 厘米 , 现 藏 于 台北 故宫 博物院 。
+唐 太宗 得到 “ 兰亭 序 ” 真迹 后 , 曾 令 欧阳 询 等人 临 写 。
+又 下令 将 欧阳 询 临 写 的 “ 兰亭 序 ” 刻 于 学士 院 , 后 刻石 置于 定州 。
+故此 石 所 拓 被 称 为 定本 , 世 称 “ 定 武 兰亭 ” 。
+元代 著名 书家 赵 孟 俯 在 “ 兰亭 序 十三 跋 ” 中 写道 : “ 兰亭 帖 自定 武 石刻 既 亡 , 在 人间 者 有 数 , 有 日 减 , 无 日 增 , 故 博古 之 士 以为 至宝 。 ”
+可见 , 定 武 兰亭 弥 足 珍贵 。
+南宋 著名 词人 姜 夔 在 “ 续 书谱 ” 中 有 言 : 定 武 兰亭 有 数 样 , 其 位置 长短 大小 无 不 同 , 而 肥瘦 刚柔 工 拙 之 处 , 如 人 之 面 , 无 有 同 者 。
+这 件 名 冠 古今 的 墨 拓 神品 , 系 台北 故宫 博物院 秘籍 之 一 , 亦 是 公认 的 最 佳 宋 拓 定 武 兰亭 本 , 现 将 其 全 卷 出版 , 供 广大 读者 鉴赏 。
+内 页 插图 0 书名 柯 九 思 本 定 武 兰亭 序 作者 [ 唐 ] 欧阳 询 ISBN 9787807027843 页数 32 页 定价 28 . 00 出版 社 吉林 文史 出版 社 出版 时间 2009 - 01 - 01 装帧 平装 开本 16 尺寸 30 x 29 x 1 cm 重量 159 g
+尼玛 的 夏天 影片 《 尼玛 的 夏天 》 讲述 了 藏族 学生 尼玛 受 国家 援藏 政策 的 援助 , 得到 一 次 去 海滨 城市 接受 良好 教育 的 机会 , 他 带 着 家人 的 嘱托 和 自己 的 梦想 , 踏 上 了 学习 的 旅途 。
+在 与 新 同学 的 相处 过程 中 , 开始 矛盾 不 断 的 他们 , 渐渐 地 被 相互 感动 , 最后 成为 了 朝夕 相处 的 好 朋友 , 并 一 起 代表 学校 的 足球 队 出战 。
+据 介绍 , 影片 中 除了 高 亚 麟 和 秦 子樾 是 专业 演员 之外 , 其他 所有 的 小 演员 均 是 没 有 表演 经验 的 90后 普通 学生 。
+0 名称 : 尼玛 的 夏天 导演 : 张 津 演员 : 高 亚 麟 秦 子樾 久美 次仁 类型 : 励志 剧情 地区 : 内地 语言 : 国语 主人 公 : 尼玛 上映 时间 : 2010 - 06 中文 名 尼玛 的 夏天 制片 地区 内地 类型 励志 剧情 主演 高 亚 麟 , 秦 子樾 上映 时间 2010 - 06 对白 语言 普通 话 色彩 彩色
+ktv 裸 门 一波 未 平 一波 又 起 , 似乎 网络 就 是 这样 , 每天 总会 有 很多 意 想不到 的 事情 发生 。
+“ KTV 裸 门 ” 事件 又 引爆 了 网友 们 的 眼球 。
+简介 0 “ KTV 裸 门 ” 视频 是 在 KTV 拍摄 的 , 一 个 年纪 不 大 的 女生 光 着 身子 在 房间 里 乱跑 , 挑逗 旁边 的 男子 , 做出 不 堪 入目 的 动作 。
+旁边 坐 的 几 个 男 的 随之 应和 着 !
+视频 上 看 这个 女孩 子 年龄 好 小 , 应该 是 在 上学 的 年龄 段 , 不 知 是 不 是 学生 。
+周围 男生 很多 , 女 主角 就 一 个 女孩 子 , 有 网友 看到 此 段 视频 后 大呼 : “ 现在 的 年轻 人 , 真 的 是 越来 越 疯狂 了 ” 。
+点评 0 “ KTV 裸 门 ” 事件 已经 成为 近期 众多 “ X 门 ” 类 事件 里面 的 新 热点 , 网友 惊呼 当今 有 些 年轻 人 的 社会 道德 观念 沦丧 让 人 感到 非常 痛心 。
+有 专家 也 呼吁 应该 加强 对 娱乐 场所 行为 的 监管 , 杜绝 此类 事件 的 发生 。
+当然 从 根源 上 , 应该 对 这些 青少年 的 人生 观 、 价值 观 、 甚至 恋爱 观 都 需要 更多 的 人 关心 和 爱护 。
+最近 的 这些 “ 门 ” 事件 , 多 是 发生 在 青少年 身 上 , 作为 孩子 的 家长 、 老师 不 应该 只 做 孩子 学习 上 的 导师 , 应该 做 他们 生活 上 的 导师 , 引导 他们 怎么 面对 变化 多端 、 瞬息 万变 的 世界 。
+中文 名 ktv 裸 门 类别 相关 词汇 阐述 “ KTV 裸 门 ” 事件 相关 新 热点
+美女 找茬 mx 运行 环境 0 支持 Android 1 . 6 应用 类型 0 休闲 益智 类 游戏 应用 介绍 0 这 是 一款 免费 游戏 。
+这款 游戏 没 有 激烈 的 对抗 , 适合 打发 闲暇 的 时间 , 保持 愉悦 的 心情 。
+美女 找茬 , 经典 版 。
+应用 名称 美女 找茬 mx 应用 平台 mobile 应用 版本 3 . 0
+施工 图 结构 构造 识读 综合 训练 基本 简介 0 《 施工 图 结构 构造 识读 综合 训练 》 是 武汉 理工 大学 出版 社 出版 的 一 本 图书 。
+图书 简介 0 本 书 根据 建筑 行业 对 高职 高专 层次 建筑 技术 人才 的 要求 , 结合 建筑 实例 和 建筑 结构 训练 软件 , 反映 现代 建筑 结构 构造 的 规范 要求 和 施工 工艺 , 并 根据 我 国 建筑 业 的 最 新 标准 和 规范 , 结合 混凝土 结构 施工 图 平面 整体 表示 方法 制图 规则 和 构造 详图 的 要求 , 以 简练 的 文字 、 真实 的 工程 案例 、 立体 的 构造 节点 来 表达 混凝土 结构 的 构造 做法 , 注重 对 学生 基本 知识 的 传授 和 基本 技能 的 培养 。
+全书 内容 共 分 为 6 章 , 主要 内容 包括 结构 构造 基础 知识 、 框架 柱 平 法 识读 及其 结构 构造 、 剪力 墙 平 法 识读 及其 结构 构造 、 框架 梁 平 法 识读 及其 结构 构造 、 有 梁 楼板 平 法 识读 及其 结构 构造 、 板式 楼梯 平 法 识读 及其 结构 构造 。
+本 书 可 作为 高职 高专 技术 学校 、 高等 学校 专科 、 成人 教育 学院 等 的 建筑 工程 类 教材 和 教学 参考 书 , 也 可 供 从事 建筑 造价 和 施工 的 人员 参考 。
+
+大竹 县 金鸡 乡 中心 小学 大竹 县 金鸡 乡 中心 小学 坐落 于 风光 无限 的 铜锣 山 的 山麓 , 竹石 公路 20 公里 旁 , 是 一 所 农村 九 年 一贯 制 学校 。
+学校 创办 于 1972 年 , 占地 面积 17400 余 平方 米 。
+学校 设备 设施 较 齐全 , 建 有 多 媒体 室 、 远程 教育 网络 室 、 语音 室 、 图书 室 。
+校园 绿树 成荫 , 与 花台 、 凉亭 、 葡萄 架 、 艺术 墙 , 相映 成趣 , 格外 幽静 雅致 , 是 育人 的 好 地方 。
+0 学校 现有 教学 班 30 个 , 学生 近 1000 人 , 在职 教 职工 49 人 , 其中 中学 高级 教师 3 人 , 中学 一 级 教师 4 人 , 小学 高级 教师 17 人 , 市级 骨干 教师 3 人 , 县级 骨干 教师 3 人 , 省级 优秀 教师 1 人 , 市级 优秀 教师 1 人 , 县级 优秀 教师 7 人 , 县级 师德 标兵 1 人 。
+学校 坚持 以 “ 全面 贯彻 教育 方针 , 面向 全体 学生 , 全面 提高 学生 素质 ” 为 办学 方向 , 以 “ 突出 教学 、 强化 管理 、 服务 师生 、 争创 一 流 ” 为 办学 思路 , 以 “ 重 素质 、 促 发展 、 创 特色 、 全面 提高 教育 质量 ” 为 办学 目标 , 大力 营造 一 个 “ 政风 廉 、 校风 好 、 教 风 正 、 学风 浓 ” 的 和谐 校园 。
+承 各级 领导 悉心 关怀 , 蒙 几 位 校长 栉 风 沐雨 , 经 数 届 师生 发愤 图强 , 学校 先后 获得 了 “ 县级 文明 单位 ” 、 “ 实验 合格 学校 ” 、 “ 平安 学校 ” “ 艺术 教育 先进 集体 ” “ 职业 教育 先进 集体 ” “ 关 工委 先进 集体 ” 、 “ 先进 党 支部 ” 、 “ 模范 教工 之 家 ” 、 “ 达州 五四 红旗 团 支部 ” 等 光荣 称号 。
+近年 来 , 县委 政府 、 县 财政 局 、 县 教育 局 、 乡 党委 政府 领导 多次 莅临 我 校 , 给予 了 我们 更 大 的 关怀 , 为 我们 插 上 理想 的 翅膀 , 明天 , 我们 一定 会 腾飞 起来 , 飞 得 更 高 更 远 。
+中文 名 大竹 县 金鸡 乡 中心 小学 创办 时间 1972 年 类别 农村 九 年 一贯 制 学校 占地 面积 17400 余 平方 米
+Boutique Fifi 应用 介绍 菲菲 已经 开 了 一 家 精品 店 , 但是 需要 你 的 帮助 来 完成 她 的 服装 收藏 。
+过来 帮 菲菲 画 漂亮 的 衣服 送 给 她 的 朋友 们 吧 。
+软件 名称 Boutique Fifi 软件 平台 mobile 软件 版本 1 . 0 运行 环境 Android 2 . 2 应用 类型 卡片 棋牌 类
+四川 火锅 大侠 贸易 有限 公司 公司 简介 0 四川 火锅 大侠 贸易 有限 公司 成立 于 2017 年 , 公司 位于 成都 市 武侯 区 群众 路 69 号 1 栋 1 楼 53 号 。
+公司 业务 0 销售 : 厨房 用品 、 安防 设备 、 电力 设备 、 酒店 用品 、 节能 产品 、 农副 产品
+五 空间 传说 《 五 空间 传说 》 是 郭 紫昱 创作 的 网络 小说 , 发表 于 起点 网 。
+0 作品 简介 隆重 推出 新书 《 灵 诀 幻 界 》 书号 : 140326 。
+中文 名称 五 空间 传说 作者 郭 紫昱 类型 网络 小说 连载 平台 起点 网
+从 你 眼底 至 我 心尖 《 从 你 眼底 , 至 我 心尖 》 是 连载 于 百度 阅读 的 网络 小说 , 作者 是 凝 韵 。
+作者 简介 0 凝 韵 , 80后 , 双子 座 , 普通 白领 , 偶尔 精 分 。
+执着 于 文学 创作 , 喜欢 挖掘 残酷 现实 里 的 温情 与 浪漫 。
+希望 成为 一 个 自信 又 完美 的 人 , 凝结 希望 , 雅韵 平生 。
+主要 角色 0 夏 雉 ( 女 ) , 何 阗 ( 男 ) 作品 简介 0 小说 主要 讲述 了 跆拳 道 运动 员 夏 雉 退役 后 成为 一名 普通 职员 , 一直 未能 转正 的 同时 遭遇 小三 插足 、 男友 劈腿 。
+在 面临 爱情 与 事业 双重 打击 时 , 意外 和 公司 高层 何 阗 越 走 越 近 , 职场 得到 成长 的 同时 重新 收获 真 爱 的 故事 。
+一 个 是 理性 自省 的 精英 男 上司 , 一 个 是 沉默 努力 的 底层 女 职员 。
+何 阗 为了 家人 夺走 了 属于 夏 雉 的 外派 名额 , 也 切断 了 夏 雉 挽回 男友 的 最后 一 丝 希望 。
+本 以为 会 回到 各自 的 位置 相 敬 如 宾 , 但 夏 雉 在 工作 中 的 通透 坚韧 、 清醒 善良 一 点点 融化 着 何 阗 坚硬 如 铁 的 心 , 让 他 不 由 自主 地 靠近 。
+他 教会 她 职场 规则 , 帮助 她 迅速 成长 , 也 让 她 懂得 什么 是 好 的 爱情 。
+可是 , 当 谎言 被 揭穿 , 曾经 与 现在 的 伤害 让 夏 雉 不 堪 重负 。
+她 用 最 决绝 的 方式 离开 , 在 没 有 他 的 世界 里 , 她 努力 成为 独当 一 面 的 存在 。
+再次 相遇 , 她 要 撕开 那些 曾经 让 她 近乎 绝望 的 “ 职场 潜 规则 ” , 也想 让 所有 伤害 过 她 的 人 得到 报应 , 也 包括 他 … … 章节 目录 0 第 一 卷 红 玫瑰 与 白 玫瑰 第 一 章 第 二 个 男人 第 二 章 第 二 眼 美女 第 三章 美女 救 英雄 第 四 章 狼狈 的 “ 小三 儿 ” 第 五 章 踏破 铁 鞋 无 觅 处 第 六 章 爱情 死 了 第 七 章 渣 男 第 八 章 病 急 乱 投医 第 九章 一夜 情 第 十 章 妒忌 第 十一 章 木 已 成 炊 第 十二 章 虚伪 的 男人 第 十三 章 意外 发现 第 十四 章 试探 第 十五 章 善于 伪装 的 女人 第 十六 章 夏 雉 的 反常 第 十七 章 冤家 路窄 第 十八 章 你 还 知道 要脸 ?
+第 十九 章 闯祸 第 二十 章 仰视 一 个人 的 感觉 第 二十一 章 太 过 真实 的 梦 第 二十二 章 梦醒 时分 第 二十三 章 只要 你 能 陪 在 我 身边 第 二十四 章 女人 的 直觉 第 二十五 章 最 狗血 的 八卦 第 二十六 章 约会 第 二十七 章 被 酒精 侵蚀 的 理智 第 二十八 章 机会 第 二十九 章 好 上 了 第 三十 章 你 是 我 的 人 了 第 三十一 章 保护 欲 第 三十二 章 我 才 是 老大 第 三十三 章 夏 雉 的 不 确定 第 二 卷 再 靠近 一 点点 第 三十四 章 当 爱 再 靠近 第 三十四 章 不 确定 的 心 第 三十五 章 请 前妻 当 说客 第 三十六 章 一 个 男人 的 好心 第 三十七 章 何 阗 的 顾虑 第 三十八 章 前妻 的 婚礼 第 三十九 章 错 的 不 止 这 一 件 第 四十 章 例外 的 夏 雉 第 四十一 章 二十九 岁 的 人生 第 一 束 玫瑰 第 四十二 章 一 个 渣 男 就 够 了 第 四十三 章 变数 第 四十四 章 前 男友 的 疑惑 第 四十五 章 别 让 夏 雉 为难 第 四 十六 章 傻 丫头 第 四十七 章 何 阗 的 温柔 第 四十八 章 安全 感 第 四十九 章 因为 我 喜欢 你 呀 第 五十 章 英雄 救美 第 五十一 章 何 阗 的 家事 第 五十二 章 女人 的 诅咒 第 五十三 章 我 很好 , 因为 有 她 在 第 五十四 章 痛苦 中 的 温馨 第 五十五 章 自私 的 亲人 第 五十六 章 最 重要 的 人 第 五十七 章 谢谢 你 能 来 第 五十八 章 蜕变 第 五十九 章 从此 以后 我们 再 也 没 有 关系 第 六十 章 你好 , 何 阗 第 六十一 章 认可 第 六十二 章 保护 第 六十三 章 跟 我 走 吧 第 六十四 章 调戏 第 六十五 章 管家 婆 第 六十六 章 把 我 交给 你 第 三卷 比起 被 喜欢 的 人 说出 实情 , 还是 被骗 的 时候 更 幸福 第 六十七 章 我们 结婚 吧 第 六十八 章 你 才 是 要 陪 我 一生 的 人 第 六十九 章 你 还有 我 第 七十 章 我 是 不 一 样 的 男人 第 七十一 章 舒适 的 人生 第 七十二 章 狼狈 的 谈判 第 七十三 章 嫁给 我 吧 第 七十四 章 我 要 结婚 了 第 七十五 章 你 会 遇到 更 好 的 人 第 七十六 章 尤 瑞儿 的 麻烦 第 七十七 章 欠 你 的 第 七十八 章 隐忧 第 七十九 章 你 和 尤 瑞儿 熟 吗 ?
+第 八十 章 机会 第 八十一 章 我 不 贪心 , 有 你 就 够 了 。
+第 八十二 章 如果 时间 可以 重来 第 八十二 章 伤逝 第 八十三 章 祝 你 新婚 快乐 第 八十四 章 葬礼 和 婚礼 第 八十五 章 遗言 第 八十六 章 我 怀孕 了 第 八十六 章 适 可 而 止 吧 第 八十七 章 不 该 承受 的 第 八十八 章 一 家 三 口 第 八十九 章 七天 第 九十 章 关键 时刻 , 他 还是 放弃 了 你 第 九十一 章 太多 不 甘心 第 九十二 章 我 终于 还是 失去 了 你 第 九十三 章 重逢 第 九十四 章 她 果然 还是 那个 夏 雉 第 九十五 章 婚 内 性 骚扰 第 九十五 章 荒唐 的 婚姻 第 四 卷 没 有 时间 治愈 不 了 的 伤 第 九十六 章 梁 朵 的 心事 第 九十七 章 不 期 而 遇 第 九十八 章 纠结 第 九十九 章 出 糗 第 一百 章 何 阗 , 我 来 了 第 一百 零一 章 你 把 真真 藏 哪 了 ?
+第 一百 零二 章 复仇 的 开端 第 一百 零三 章 我 所 承受 的 痛苦 第 一百 零四 章 亲人 的 背叛 第 一百 零五 章 还有 什么 是 不 能 承受 的 第 一百 零六 章 人 都 是 自私 的 第 一百 零七 章 都 是 骗子 第 一百 零八 章 必须 要 做 一 个 决定 了 第 一百 零九 章 克星 第 一百 零九 章 道歉 的 方式 第 一百 一十 章 落水 狗 第 一百 一十一 章 尤 瑞尔 的 傲气 第 一百 一十二 章 不 是 何 阗干 的 , 是 我 。
+第 一百 一十三 章 一 个 不 再 为难 的 理由 第 一百 一十四 章 我们 怎么 会 变成 这样 ?
+第 一百 一十五 章 假如 这 就 是 永远 第 一百 一十六 章 夏 雉 的 丑闻 第 一百 一十七 章 这样 很好 第 一百 一十八 章 表面 的 妥协 第 一百 一十七 章 表面 上 的 美满 第 一百 一十八 章 陌生 人 第 一百 一十九 章 热闹 好看 吗 第 一百 二十 章 廖 总裁 的 红人 第 一百 二十一 章 让 你 惹 我 第 一百 二十二 章 我 以为 , 我 能 陪 你 一 辈子 第 一百 二十三 章 狠毒 的 女人 第 一百 二十四 章 最后 的 挣扎 第 一百 二十四 章 他 不 应该 有 这样 的 结局 第 一百 二十五 章 逃离 第 一百 二十六 章 你 还会 爱 我 吗 ?
+第 一百 二十七 章 大 结局 : 没 有 时间 治愈 不 了 的 伤 中文 名称 从 你 眼底 , 至 我 心尖 别名 另 一类 爱情 作者 凝 韵 类型 小说 连载 状态 已 完结 连载 平台 百度 阅读 创建 时间 2017
+太阳 真 火 之 焚天 《 太阳 真 火 之 焚天 》 是 连载 于 17k 小说 网 的 小说 , 作者 是 雨 叶 草 。
+小说 类型 0 东方 玄幻 内容 简介 0 一 个 小 门派 , 一 个 被 师父 从 草丛 中 拾取 的 少年 。
+因为 天才 遭 嫉 , 因此 被迫 浪迹 天涯 。
+机缘 巧合 , 获得 一 个 奇怪 的 小 钟 世人 有 言 : 东皇 钟 响 , 镇压 鸿蒙 。
+离奇 的 身世 , 一 步步 寻觅 为 何 脑子 里 总会 涌现 奇异 的 记忆 。
+我 , 到底 是 谁 ? 且看 我 太阳 真 火 , 焚天 煮海 … … 中文 名 太阳 真 火 之 焚天 作者 雨 叶 草 小说 进度 连载 连载 网站 17k 小说 网
+荆 轲 后人 《 荆 轲 后人 》 是 尊 冕 创作 的 网络 小说 , 发表 于 起点 网 。
+0 怎样 能够 无 惧 死亡 而死 得 其 所 , 那 就 是 追寻 祖先 的 足迹 、 保护 信仰 的 圣殿 。
+荆 轲 后人 穿越 尘世 , 将 如何 书写 刺客 的 传奇 ?
+毒杀 、 扼 喉 、 背 刺 — 主 杀 伐 的 目的 在于 止 干戈 。
+毕竟 人 都 有 安宁 祥和 的 一 面 … … 中文 名称 荆 轲 后人 作者 尊 冕 类型 网络 小说 连载 平台 起点 网
+大 悦读 · 语文 新 课标 必读 丛书 : 木偶 奇遇 记 《 大 悦读 · 语文 新 课标 必读 丛书 : 木偶 奇遇 记 ( 修订 版 ) 》 描述 了 木偶 皮诺 曹 从 一 个 任性 、 淘气 、 懒惰 、 爱 说谎 、 不 关心 他人 、 不 爱 学习 、 整天 只 想着 玩儿 的 木偶 , 变成 一 个 懂 礼貌 、 爱 学习 、 勤奋 工作 、 孝敬 长辈 、 关爱 他人 的 好 孩子 的 过程 , 以及 他 所 经历 的 一连串 的 奇遇 , 充满 了 童趣 与 想象 。
+发生 于 皮诺 曹 身 上 的 故事 告诉 我们 , 一 个 孩子 的 自然 天性 在 许多 方面 都 是 需要 修正 的 , 也 就 是 说 , 在 自然 天性 里 往往 会 有 不 少 不 够 尽善 尽美 的 表现 , 等待 着 我们 去 逐步 克服 。
+基本 介绍 0 基本 介绍 内容 简介 1 《 大 悦读 · 语文 新 课标 必读 丛书 : 木偶 奇遇 记 ( 修订 版 ) 》 是 教育 部 《 语文 新 课程 标准 》 推荐 篇目 , 特色 是 : 课标 篇目 全部 收录 , 专家 名师 全程 助 读 , 阅读 写作 全面 提升 , 真题 模拟 全能 演练 。
+基本 介绍 作者 简介 2 作者 : ( 意大利 ) 卡洛 · 科洛迪 ( Collodi C . ) 译者 : 随 郁 丛书 主编 : 黄 宝国 卡洛 · 科洛迪 ( 1826 - 1890 ) , 出生 在 意大利 托斯 坎纳 地区 一 个 叫 科洛迪 的 小镇 。
+他 的 笔名 便 是 由 这个 小镇 的 名称 而来 。
+科洛迪 一生 中 , 曾 写 过 许多 短篇 小说 、 随笔 、 评论 , 然而 最 著名 的 要 数 他 写给 孩子 们 看 的 童话 故事 。
+他 的 主要 作品 有 : 《 小 手杖 》 、 《 小 木片 》 、 《 小 手杖 漫游 意大利 》 、 《 小 手杖 地理 》 、 《 小 手杖 文法 》 、 《 木偶 奇遇 记 》 、 《 眼睛 和 鼻子 》 等 , 《 木偶 奇遇 记 》 是 他 的 代表 作 。
+专业 推荐 0 专业 推荐 媒体 推荐 1 我 费 了 几 点钟 功夫 把 《 木偶 奇遇 记 》 读完 之后 , 我 虽然 已经 不 是 一 个 小孩 了 , 然而 我 也 像 丰 子 恺 先生 家 里 的 孩子 们 那样 被 这 奇异 的 故事 迷住 了 。
+—— 巴 金 专业 推荐 名人 推荐 2 我 费 了 几 点钟 功夫 把 《 木偶 奇遇 记 》 读完 之后 , 我 虽然 已经 不 是 一 个 小孩 了 , 然而 我 也 像 丰 子 恺 先生 家 里 的 孩子 们 那样 被 这 奇异 的 故事 迷住 了 。
+- - 巴 金 图书 目录 0 第 一 章 第 二 章 第 三章 第 四 章 第 五 章 第 六 章 第 七 章 第 八 章 第 九章 第 十 章 第 十一 章 第 十二 章 第 十三 章 第 十四 章 第 十五 章 第 十六 章 第 十七 章 第 十八 章 第 十九 章 第 二十 章 第 二十一 章 第 二十二 章 第 二十三 章 第 二十四 章 第 二十五 章 第 二十六 章 第 二十七 章 第 二十八 章 第 二十九 章 第 三十 章 第 三十一 章 第 三十二 章 第 三十三 章 第 三十四 章 第 三十五 章 第 三十六 章 参考 答案 序言 0 科洛迪 ( 1826 - 1890 ) , 原名 卡洛 · 科洛迪 , 1826 年 11 月 24 日 出生 在 佛罗伦萨 乡下 的 一 个 厨师 家庭 里 。
+科洛迪 其实 是 他 的 母亲 出生 和 成长 的 一 个 小镇 的 名字 , 他 的 笔名 便 是 由 这个 小镇 的 名称 而来 。
+《 木偶 奇遇 记 》 发表 于 1880 年 。
+科洛迪 在 教会 学校 毕业 后 , 开始 给 地方 报纸 写稿 , 并 积极 参加 意大利 的 民族 解放 运动 。
+科洛迪 以 儿童 文学 作家 闻名 于 世 , 他 先后 写 过 《 小 手杖 游 意大利 》 《 小 手杖 地理 》 《 快乐 的 故事 》 等 童 书 , 当然 , 他 最 著名 的 作品 是 《 木偶 奇遇 记 》 。
+1880 年 的 一天 , 科洛迪 给 他 在 《 儿童 杂志 》 工作 的 朋友 寄 了 一些 稿子 , 并 附有 一 张 纸条 , 说 送 上 “ 这点 傻 玩意儿 ” , 请 朋友 随意 处理 。
+这些 稿子 就 是 《 木偶 奇遇 记 》 的 前身 。
+木偶 的 故事 一经 发表 , 即时 引起 了 轰动 , 杂志 社 不 断 催 着 作者 快 写 , 最后 就 形成 了 我们 今天 读 到 的 《 木偶 奇遇 记 》 。
+这部 童话 自 出版 以来 , 受到 了 各国 儿童 的 喜爱 , 并 多次 被 拍成 动画 片 和 故事 片 , 在 世界 各国 广 受 欢迎 。
+科洛迪 精通 法文 , 曾 翻译 过眼 罗 的 童话 , 为 广大 小 读者 所 喜爱 。
+科洛迪 一生 中 , 曾 写 过 许多 短篇 小说 、 随笔 、 评论 , 然而 最 著名 的 要 数 他 写给 孩子 们 看 的 童话 。
+这些 童话 想象 力 丰富 , 人物 形象 栩栩 如 生 , 情节 曲折 动人 , 为 他 赢得 了 巨大 的 声誉 。
+他 的 主要 作品 有 : 《 小 手杖 》 ( 1876 ) , 《 小 木片 》 ( 1878 ) , 《 小 手杖 漫游 意大利 》 、 《 小 手杖 地理 》 、 《 小 手杖 文法 》 、 《 木偶 奇遇 记 》 、 《 眼睛 和 鼻子 》 ( 1880 ) , 《 快乐 的 故事 》 ( 1887 ) 。
+最后 两部 作品 是 : 《 愉快 的 符号 》 、 《 讽刺 杂谈 》 ( 1892 ) 。
+《 木偶 奇遇 记 》 是 科洛迪 的 代表 作 , 发表 于 1880 年 。
+故事 讲 的 是 一 个 叫 杰佩托 的 老头 儿 没 有 孩子 , 他 把 一 块 能 哭 会 笑 的 木头 雕 成 了 木偶 , 并 把 取得 生命 的 小 木偶 当成 了 儿子 , 起名 为 皮诺 曹 。
+小 木偶 会 跳舞 , 又 会 耍 剑 , 还会 翻 跟头 。
+有 一 次 皮诺 曹 把 脚 搁 在 火盆 上 睡觉 , 醒来 后 两只 脚 都 被 烧掉 了 。
+为了 获得 一 双 新 脚 , 皮诺 曹 答应 爸爸 去 上学 , 于是 , 老人 卖掉 了 上衣 , 供 儿子 上学 。
+可是 小 木偶 一心 贪玩 , 为了 看戏 不 惜 卖掉 了 课本 。
+在 酒店 他 获得 了 好心 的 老板 五枚 金币 , 回家 的 路 上 又 受 狐狸 和 猫 的 欺骗 , 差点 儿 被 他们 吊死 , 幸亏 巧遇 仙女 而 得救 。
+后 又 被 愚人 国 法官 投 进 监狱 ; 出狱 后 , 又 被 捕兽 器 夹住 , 被迫 当 了 看家 狗 。
+他 后悔 极 了 , 心想 : “ 如果 我 像 其他 好 孩子 一 样 喜欢 读书 、 做工 , 现在 我 就会 和 爸爸 呆 在 一 起 过 着 幸福 的 生活 , 就 不 会 在 这里 给 人家 当 看门 狗 了 。
+” 夜里 , 他 因 帮助 主人 抓住 偷鸡 的 黄鼠狼 而 重获 自由 。
+他 一 心想 成为 一 个 用功 读书 的 好 孩子 , 可是 又 经 不 起 诱惑 , 在 坏 同学 的 怂恿 下 逃 学到 海边 去 看 鲨鱼 , 后 又 被 引诱 到 了 玩儿 国 , 在 疯狂 地 玩儿 了 几 天 之后 , 变成 了 一 头 蠢驴 。
+后来 还是 仙女 搭救 了 他 。
+最后 , 他 和 爸爸 在 鲨鱼 腹中 意外 重逢 , 并 设法 逃 了 出来 , 在 海边 住 下 。
+从此 , 小 木偶 每天 去 做工 , 有 空 还 编 篮子 , 晚上 读书 、 写字 。
+经过 这次 历险 , 皮诺 曹 终于 长大 了 , 他 变得 诚实 、 勤劳 、 善良 , 成为 了 一 个 能 帮助 父母 的 真正 的 男孩 儿 。
+本 书 描述 了 木偶 皮诺 曹 从 一 个 任性 、 淘气 、 懒惰 、 爱 说谎 、 不 关心 他人 、 不 爱 学习 、 整天 只 想着 玩儿 的 木偶 , 变成 一 个 懂 礼貌 、 爱 学习 、 勤奋 工作 、 孝敬 长辈 、 关爱 他人 的 好 孩子 的 过程 , 以及 他 所 经历 的 一连串 的 奇遇 , 充满 了 童趣 与 想象 。
+发生 于 皮诺 曹 身 上 的 故事 告诉 我们 , 一 个 孩子 的 自然 天性 在 许多 方面 都 是 需要 修正 的 , 也 就 是 说 , 在 自然 天性 里 往往 会 有 不 少 不 够 尽善 尽美 的 表现 , 等待 着 我们 去 逐步 克服 。
+文章 中 作者 为 我们 刻画 了 一 个 会 跳舞 , 会 耍 剑 , 还会 翻 跟头 玩儿 的 顽皮 形象 。
+他 天真 无 邪 、 纯洁 朴实 、 正直 勇敢 , 他 任性 、 淘气 、 捣乱 、 不 守 规矩 、 有 时候 还 喜欢 撒 点 谎 。
+他 既 没 坏 到 无 可 救药 , 也 没 好 到 无 可 挑剔 , 而是 和 现实 生活 中 的 许多 孩子 一 样 , 心地 善良 、 聪明 伶俐 , 但 又 缺点 多多 —— 这 就 是 大名 鼎鼎 的 调皮 木偶 皮诺 曹 。
+这个 故事 虽然 大多 只是 讲述 小 木偶 遇到 挫折 的 过程 , 但 我们 还是 可以 感受 到 故事 中 蕴含 着 的 浓厚 的 爱 。
+小 木偶 没 有 课本 , 老 木匠 杰佩托 便 把 自己 的 上衣 换 了 一 本 书 给 小 木偶 , 而 自己 忍受 着 寒冷 的 折磨 。
+他 宁肯 自己 受苦 也 要 尽 最 大 的 努力 带给 小 木偶 幸福 , 这 是 多么 伟大 的 父爱 啊 !
+其实 , 《 木偶 奇遇 记 》 意欲 揭示 的 道理 并 不 复杂 , 它 倚 借 一 个 木偶 的 形象 不过 是 为了 向 我们 演绎 一 个人 由 不 完美 走向 完美 、 由 不 幸福 走向 幸福 的 曲折 历程 。
+问题 是 , 这个 看似 简单 的 道理 并 不 是 一 个人 生来 就 可以 明白 的 。
+否则 , 皮诺 曹 也 就 不 会 历尽 周折 才 得以 完成 这 一 历程 了 。
+从 他 由 木偶 最后 变成 真人 的 奇特 经历 中 我们 可以 发现 一些 于 我们 的 人生 有 益 的 真理 。
+皮 潜 曹 的 经历 被 称 之 为 “ 奇遇 ” , 但 那 可 不 是 一般 意义 上 的 离奇 遭遇 , 其中 蕴藏 着 等待 我们 去 仔细 咀嚼 的 深长 意味 。
+. 这 篇 童话 , 通过 皮诺 曹 的 种种 曲折 、 离奇 的 经历 , 表现 出 他 热爱 正义 、 痛恨 邪恶 、 天真 纯洁 的 品质 , 教育 我们 要 抵御 种种 诱惑 , 做 一 个 诚实 、 听话 、 爱 学习 、 爱 劳动 , 并 能 帮助 父母 的 好 孩子 。
+作者 成功 地 塑造 了 小 木偶 的 形象 , 他 聪明 、 善良 、 顽皮 而又 任性 , 他 的 故事 给 我们 以 有 益 的 教诲 和 艺术 的 感染 。
+皮诺 曹 文章 中 皮诺 曹 是 个 调皮 、 任性 、 顽皮 的 孩子 , 他 集中 了 小 朋友 们 在 这个 时期 身 上 所有 的 大 多数 缺点 , 同时 他 身 上 又 具备 了 孩子 本性 中 的 天真 和 善良 。
+他 的 冒险 经历 同时 也是 他 的 成长 经历 , 在 经过 了 一 系列 的 冒险 后 , 皮诺 曹 终于 变成 了 一 个 诚实 、 懂事 并 能 帮助 父母 的 好 孩子 。
+书名 大 悦读 • 语文 新 课标 必读 丛书 : 木偶 奇遇 记 出版 社 吉林 大学 出版 社 页数 156 页 开本 16 品牌 吉林 大学 出版 社 作者 卡洛 • 科洛迪 ( CalloodiC . ) 出版 日期 2011 年 8 月 1 日 语种 简体 中文 IS BN 7560 174892 , 9787560174891
+夏 文坚 夏 文坚 , 男 , 汉族 , 1961 年 6 月 出生 , 江苏 灌云 县 人 。
+1983 年 7 月 参加 工作 , 1997 年 6 月 加入 中国 共产党 , 大学 学历 。
+0 < b > 江苏 省 淮安 市 交通 局 副 局长 、 党委 委员 < / b > < b > < / b > 1979 年 9 月 - - 1983 年 6 月 就读 于 镇江 船舶 学院 动力 装置 专业 学习 ; 1983 年 7 月 - - 1994 年 3 月 任 江苏 省 运河 船厂 技术 科 副 科长 、 生产 办 副 主任 , 省 运河 公司 计算机 室 主任 ( 其间 : 1985 年 7 月 至 1987 年 10 月 在 省 交通 厅 参加 交通 部 计算机 推广 应用 开发 组 课题 研究 ) ; 1994 年 3 月 - - 1998 年 1 月 任 江苏 省 淮阴 市 交通 规费 稽 征 处 科员 ; 1998 年 1 月 - - 1999 年 1 月 任 江苏 省 淮阴 市 交通 局 综合 计划 科 科员 ; 1999 年 1 月 - - 2002 年 3 月 任 江苏 省 淮阴 ( 淮安 ) 市 交通 局 综合 计划 科 副 科长 ; 2002 年 3 月 - - 2004 年 7 月 任 江苏 省 淮安 市 交通 局 交通 战备 处 处长 ; 2004 年 7 月 - - 2009 年 5 月 任 江苏 省 淮安 市 交通 局 综合 计划 处 处长 ; 2009 年 5 月 - - 至今 任 江苏 省 淮安 市 交通 局 副 局长 、 党委 委员 。
+分管 交通 工程 前期 工作 、 计划 、 目标 考核 ; 分管 全 2 市 航道 管理 处 ; 联系 涟水 县 交通 局 。
+中文 名 夏 文坚 民族 汉族 出生 地 江苏 < a href = " # " > 灌云 县 < / a > 出生 日期 1961 年 6 月
+丢失 在 初春 的 午后 《 丢失 在 初春 的 午后 》 是 在 17k 小说 网 连载 的 小说 , 作者 是 这 一生 。
+小说 类型 0 综合 其它 内容 简介 0 童年 的 不幸 给 我 投 下 一 块 阴影 。
+错误 的 爱情 又 划伤 了 青春 , 可是 那 是 我 逃脱 不 了 的 宿命 的 爱恋 。
+然而 生命 总 要 成熟 , 在 心灵 死 而 复生 的 那 一 瞬间 , 给 我 新生 。
+过去 的 一切 , 丢失 在 那个 冷 凉 的 初春 的 午后 … … 中文 名 丢失 在 初春 的 午后 作者 这 一生 小说 进度 已 完结 连载 网站 17k 小说 网
+双龙 板 双龙 板 是 活力 板 的 一 种 , 大 多数 活力 板 为 单体 板 , 而 双龙 板 而是 采用 分体 式 的 设计 , 分成 两 块 板 , 因 其 独特 的 设计 故 取 为 “ 双龙 板 ” ; 双龙 板 运动 时 两只 脚 或 前 或 后摇 摆 即 快速 前进 , 双龙 板 的 特点 是 : 滑行 速度 快 , 摇摆 更 轻松 , 掌握 方向 更 灵活 , 而且 完全 可以 360 度 旋转 。
+特点 01 、 有 冲浪 的 自由 快感 , 仿佛 驭风 而行 , 仿佛 乘龙 翱翔 , 是 轮滑 同 冰鞋 的 二合一 ; 2 、 靠 身体 扭动 而 前进 , 不 需要 脚 推 滑 , 可做 各种 花式 变化 动作 ; 3 、 配合 扭腰 运动 , 可 达到 显著 的 瘦身 效果 , 比 转 呼 拉 圈 效果 好 很多 ; 4 、 前进 方向 随心 所 欲 , 可 在 原地 随时 转弯 , 最 擅长 作出 原地 360 度 旋转 动作 ; 5 、 速度 快慢 轻松 掌握 , 要 快要 慢 任 你 驾驭 。
+飘 如风 , 移 如影 。
+益处 01 、 提升 儿童 自 信心 , 远离 网络 游戏 ; 2 、 有 冲浪 的 自由 快感 , 仿佛 驭风 而行 ; 3 、 提高 身体 的 灵活 性 , 加强 身体 抵抗 力 ; 玩法 简单 易学 , 安全 性能 高 ; 4 、 配合 扭腰 运动 , 可 达到 瘦身 的 效果 ; 5 、 场地 大小 不 受 限制 , 支持 360 都 旋转 ; 6 、 不 受 年龄 限制 , 8 — 40 岁 均可 适合 ; 区别 0 < b > 飘移 板 和 双龙 板 的 真正 区别 : < / b > < b > < / b > 一 、 双龙 板 与 漂移 板 是 绝对 有 区别 的 虽然 两 个 都 是 分体 式 滑板 , 但是 底部 是 完全 不 同 的 。
+漂移 板 下面 的 两 个 轮子 是 固定 的 , 轮 座 并 不 能 自由 活动 , 整个 板子 都 是 一体 的 。
+所以 飘移 板 要 动 的 话 , 需要 将 整个 板子 动 起来 。
+双龙 板 的 底 轮 的 两 个 轮子 是 独立 的 , 下面 的 两 轮子 可以 自由 活动 。
+所以 双龙 板 比 飘移 板 更 灵活 点 。
+二 、 材料 的 不 同 飘移 板 是 合金 的 , 双龙 板 是 塑料 的 。
+三 、 玩法 不 同 、 速度 、 花样 都 有 区别 飘移 板 玩 起来 稍 有 难度 一 点 , 活动 方式 是 内 八 和 外 内 的 运动 原理 。
+但是 飘移 板 的 的 速度 比 双龙 板 的 快 。
+双龙 板 学 起来 不 难 , 会 活力 板 的 话 , 稍加 训练 就 可以 轻松 掌握 。
+双龙 板 最 大 的 好处 就 是 玩 得 起 速度 , 花样 比 两轮 的 活力 板 多 , 也 更 自由 , 还能 360 度 旋转 。
+中文 名 双龙 板 外文 名 Ssang yong plate 设计 分体 式 特点 滑行 速度 快 , 摇摆 更 轻松 区别 材料 , 花式 , 技巧 等 优势 场地 大小 不 受 限制
+全球 论剑 全集 全球 论剑 全集 , 是 一款 其他 软件 类 软件 , 运行 环境 支持 Android 2 . 1 。
+应用 介绍 0 本 书 收录 了 全球 论剑 全集 ~ 重回 江湖 , 血雨 腥 风 !
+追逐 “ 一 帝 二 后 三王 七 侯 ” 三年 之 久 的 陈 凯 心 ; 《 全球 论剑 在 华山 》 大赛 前夕 , 不料 被 阴险 小人 灌醉 , 随后 被 轮回 到 新人 阶段 , 彻底 错失 跻身 江湖 顶尖 强者 行列 机会 !
+万念 俱 灰 之 下 , 凯 心 黯然 退出 《 江湖 》 。
+迷醉 伤神 的 雷雨 夜 , 陈 凯新 一 梦 三年 。
+重回 学生 时代 。
+软件 更新 01 支持 存储 书签 ; 2 支持 多 点 触控 , 可用 俩 手指 做 伸展 , 回收 动作 , 控制 字体 的 大小 ; 3 增加 了 进度 条 展示 等 功能 ; 4 增加 了 教程 ; 5 增加 了 设置 页面 。
+软件 名称 全球 论剑 全集 软件 平台 mobile 软件 版本 3 . 67
+转 铁 蛋白 转 铁 蛋白 又 名 运 铁 蛋白 trans ferrin , TRF , siderophilin ) 是 血浆 中 主要 的 含 铁 蛋白 质 , 负责 运载 由 消化 管 吸收 的 铁 和 由 红 细胞 降解 释放 的 铁 。
+以 TRF - Fe 3 + 的 复合 物 形式 进入 骨髓 中 , 供 成熟 红 细胞 的 生成 。
+简介 0TRF 分子 量 约 7 . 7 万 , 为 单链 糖 蛋白 , 含 糖量 约 6 % 。
+TRF 可逆 地 结合 多价 离子 , 包括 铁 、 铜 、 锌 、 钴 等 。
+每 一 分子 TRF 可 结合 两 个 三价 铁 原子 。
+TRF 主要 由 肝 细胞 合成 , 半衰期 为 7 天 。
+血浆 中 TRF 的 浓度 受 铁 供应 的 调节 , 在 缺铁 状态 时 , 血浆 TRF 浓度 上升 , 经 铁 有效 治疗 后 恢复 到 正常 水平 。
+血浆 中 TRF 水平 可 用于 贫血 的 诊断 和 对 治疗 的 监测 。
+在 缺铁 性 的 低 血色素 贫血 中 TRF 的 水平 增高 ( 由于 其 合成 增加 ) , 但 其 铁 的 饱和 度 很 低 ( 正常 值 在 30 % - 38 % ) 。
+相反 , 如果 贫血 是 由于 红 细胞 对 铁 的 利用 障碍 ( 如 再生 障碍 性 贫血 ) , 则 血浆 中 TRF 正常 或 低 下 , 但 铁 的 饱和 度 增高 。
+在 铁 负荷 过量 时 , TRF 水平 正常 , 但 饱和 度 可 超过 50 % , 甚至 达 90 % 。
+TRF 在 急性 时 相 反应 中 往往 降低 。
+因此 在 炎症 、 恶性 病变 时常 随着 白 蛋白 、 前 白 蛋白 同时 下降 。
+在 慢性 肝 疾病 及 营养 不 良 时 亦 下降 , 因此 可以 作为 营养 状态 的 一项 指标 。
+妊娠 及 口服 避孕 药 或 雌 激素 注射 可使 血浆 TRF 升高 。
+有 免疫 试剂 盒 供应 抗体 级 标准 品 。
+用 免疫 扩散 或 浊度 法 检测 。
+正常 成人 参考 值 为 2200 - 4000 mg / L 。
+新生儿 为 1300 - 2750 mg / L 。
+临床 评价 时常 同时 测定 血清 铁 含量 及 TRF 的 铁 结合 容量 ( TIBC ) , 并 可 计算 出 的 TRF 铁 饱和 度 ( % ) 。
+TRF 亦可 通过 测定 而 间接 计算 估 得 , 其 计算 方程式 如下 : TRF ( mg / L ) = TIBC ( μ g / L ) × 0 . 70 简介 别名 1 血 清运 铁 蛋白 , TF 简介 正常 值 2 ELISA 法 、 RIA 法 : 成人 : 2 . 20 ~ 4 . 0g / L ( 220 ~ 400 mg / dl ) 60 岁 : 1 . 80 ~ 3 . 8g / L ( 180 ~ 380 mg / dl ) 简介 化验 结果 意义 3 ( 1 ) 升高 : 缺铁 时 增高 ( 缺铁 性 贫血 ) 、 铁 蛋白 释放 增加 ( 急性 病毒 性 肝炎 、 肝 细胞 坏死 ) 。
+( 2 ) 降低 : 感染 性 疾病 、 风湿 性 关节 炎 、 原发 性 肝癌 、 肾病 、 尿毒 症 、 遗传 性 运 铁 蛋白 缺乏 症 、 流行 性 出血热 、 血色 病 、 再生 障碍 性 贫血 、 慢性 溶血 性 贫血 、 系统 性 红斑 狼疮 等 。
+简介 化验 取材 4 血液 简介 化验 方法 5 蛋白 质 测定 简介 化验 类别 6 血液 生化 检查 、 蛋白 质 测定 参考 资料 0 《 新编 临床 检验 与 检查 手册 》 、 《 新编 化验 员 工作 手册 》 相关 疾病 0 原发 性 肝癌 中文 学名 转 铁 蛋白 拉丁 学名 ns ferrin 别称 运 铁 蛋白 二 名 法 trans ferrin 界 动物 界 种 转 铁 蛋白 分布 区域 骨髓
+精英 消费 本 书 于 2013 年 6 月 在 企业 管理 出版 社 出版 。
+本 书 揭示 了 奢侈 品 的 本质 和 人类 占有 的 本能 欲望 , 教导 销售 人员 如何 将 目标 锁定 在 精英 分子 身 上 、 如何 将 潜在 客户 转化 为 真实 客户 和 忠实 客户 、 如何 让 忠实 客户 扩展 至 全球 , 以及 如何 挽回 不 忠实 客户 的 青睐 。
+0 内容 介绍 《 精英 消费 : 新 时代 的 精英 消费 者 研究 及 销售 秘诀 》 内容 简介 : 想要 赚 大钱 , 就 把 您 的 目标 客户 群 锁定 在 金字 塔 顶端 的 新 精英 分子 !
+这 群 人 想要 什么 ?
+购物 习惯 和 思维 如何 ?
+世界 领先 的 战略 研究 公司 哈里森 集团 以 自身 经验 和 研究 成果 为 您 一一 分析 , 让 您 攻 无 不 克 , 驰骋 商场 !
+《 精英 消费 : 新 时代 的 精英 消费 者 研究 及 销售 秘诀 》 为 您 剖析 新 精英 分子 的 生活 与 心理 : 什么 样 的 商品 会 让 他们 动心 ?
+他们 都 在 哪些 地方 购物 ?
+他们 如何 做出 购买 决定 ?
+消费 方式 如何 ?
+他们 想 从 顶级 品牌 中 获得 什么 ?
+他们 对 市场 有 什么 期许 ?
+基于 对 世界 高 端 品牌 —— 雷克萨斯 、 香奈儿 、 卡地亚 、 内曼 · 马库斯 百货 、 苹果 、 通用 汽车 、 美国 运通 等 案例 的 研究 , 销售 专家 和 营销 大师 的 指导 , 《 精英 消费 : 新 时代 的 精英 消费 者 研究 及 销售 秘诀 》 揭示 了 奢侈 品 的 本质 和 人类 占有 的 本能 欲望 , 教导 销售 人员 如何 将 目标 锁定 在 精英 分子 身 上 、 如何 将 潜在 客户 转化 为 真实 客户 和 忠实 客户 、 如何 让 忠实 客户 扩展 至 全球 , 以及 如何 挽回 不 忠实 客户 的 青睐 。
+透过 研究 成果 、 真实 案例 、 练习 , 《 精英 消费 : 新 时代 的 精英 消费 者 研究 及 销售 秘诀 》 将 理论 转化 为 销售 激情 的 13 种 表达 方式 、 理想 变现 7 法则 等 行动 策略 。
+同时 , 列举 了 数 个 成功 销售 人员 与 客户 之间 如何 实现 双赢 的 案例 。
+读 过 此书 , 您 同样 也 可以 赢得 新 精英 分子 的 青睐 , 并 跻身 为 他们 其中 的 一员 。
+IS BN 9787516402832 页数 143 定价 39 . 50 元 出版 社 企业 管理 出版 社 出版 时间 2013 - 6
+纷乱 , 这 世界 !
+《 纷乱 , 这 世界 !
+》 是 烟云 韶光 创作 的 网络 小说 , 发表 于 晋江 文学 网 。
+作者 0 烟云 韶光 作品 简介 0 伊尔 迷 在 二十 岁 时 , 一 时 不 慎 救 了 一只 像 狗 又 像 狮子 的 东西 ( 抗议 !
+那 是 麒麟 !
+) 。
+原本 他 还是 一 个 单纯 , 不 爱 说话 的 正常 杀手 少年 . . . 但 之后 他 就 悲 催 了 . . . 当 伊尔 迷 带 着 一 个 名义 上 的 家人 回归 后 , 世界 悲 催 了 . . . 第 二 卷 路西菲尔 从小 就 被 米迦勒 收拾 的 叫 天天 不 应 , 叫 地 地 不 灵 。
+被 逼 无 奈 躲 去 了 流星 街 , 结果 一 出 流星 街 , 他 又 悲 催 了 个 愤 得 … … 我 活 得 不 舒服 , 你们 也 别 想 享福 !
+!
+!
+新手 , 欢迎 意见 。
+拍砖 的 都 下手 稍微 轻 那么 一 点点 , 我 就 很 满足 了 。
+本 文 崩坏 , 剧情 党 勿入 . . . 因为 新 文 数据 实在 是 惨 不 忍 睹 , 打 个 广告 — 西 幻 文 , 大 架构 , 升级 流 , 双 男 主 , 无 西皮 , 剧情 流 。
+我 在 努力 构架 一 个 完整 的 合理 的 世界 , 应该 比 据说 “ 智 硬 ” 的 上将 高档 一些 。
+中文 名称 纷乱 , 这 世界 !
+作者 烟云 韶光 连载 平台 晋江 文学 网
+北京 一维 弦 科技 有限 责任 公司 北京 一维 弦 科技 有限 责任 公司 于 2015 年 07 月 30 日 在 海淀 分局 登记 成立 。
+法定 代表 人 唐 博维 , 公司 经营 范围 包括 技术 开发 、 技术 推广 、 技术 转让 、 技术 咨询 、 技术 服务 等 。
+企业 信息 0 公司 类型 有限 责任 公司 登记 机关 海淀 分局 成立 时间 2015 年 07 月 30 日发 照 时间 2017 年 04 月 21 日
+大通 丸 名称 : 大通 丸 别名 : 大通 丸 组成 : 川 乌头 2 两 ( 炮 裂 , 去皮 脐 ) , 砒 黄 1 分 ( 细 研 ) , 巴豆 1 两 ( 去皮 心 , 研 , 纸 裹 压 去油 ) , 芫花 1 两 ( 醋 拌 , 炒 令 黄 ) , 杏仁 1 两半 ( 汤 浸 , 去皮 尖 双 仁 , 麸 炒 微 黄 ) , 麝香 1 钱 ( 细 研 ) , 黄丹 1 分 ( 炒 令 紫色 ) , 猪牙 皂荚 1 两 ( 去 黑皮 , 涂 酥 , 炙 令 焦黄 , 去 子 ) , 自然 铜 1 两 ( 细 研 , 别 用 ) 。
+出处 : 《 圣 惠 》 卷四 十九 。
+主治 : 症瘕 。
+用法 用量 : 每 服 3 丸 , 空心 煎 生姜 、 橘皮 汤 送 下 。
+制备 方法 : 上 为 末 , 入 研 了 药 令 匀 , 用 黑豆 面 为 丸 , 如 绿豆 大 , 以 研 了 自然 铜 末 滚 过 。
+
+2013 入党 申请 书 教程 《 2013 入党 申请 书 教程 》 是 一款 Android 平台 的 应用 。
+应用 介绍 0 入党 申请 书 标志 着 申请 人 经过 了 郑重 思考 , 向 党 组织 表明 自己 有 入党 的 志愿 和 要求 , 使 党 组织 了解 申请 人 的 政治 信仰 和 追求 , 便于 党 组织 对 申请 人 有 针对 性 地 进行 培养 、 教育 、 考察 , 同时 也是 党 组织 确定 入党 积极 分子 和 发展 对象 的 重要 依据 。
+入党 申请 书 是 要求 入党 的 同志 对 党 的 认识 和 自我 认识 的 反映 。
+因此 , 每 一 位 要求 入党 的 同志 , 都 应该 认真 写好 入党 申请 书 。
+支持 版本 0 Android 2 . 0 及 以上 软件 名称 2013 入党 申请 书 教程 软件 平台 Android 软件 大小 1 . 82 MB
+天下 悲歌 《 天下 悲歌 》 是 一 部 小说 , 作者 是 掬 花 在 手 纳兰 于 袖 17195611 。
+小说 类型 0 异界 大陆 内容 简介 0 本 物 天下 霸 唱 是 什么 ? 是 一 把 刀 ? 是 一 柄 剑 ? 还是 一 本 修仙 练 圣 的 真 诀 . 一 个 名叫 楚 烟 寒 的 少年 尚未 出世 , 就 已 胎 死 腹中 , 一 股 怨气 不 灭 , 幸 被 灵鹫 峰 长老 说 不 得 大师 所 救 , 带回 灵鹫 峰 , 后 以 蚩尤 之 魂 , 刑天 之 魄 , 聚 练 重生 , 以 魔道 修习 佛法 , 一时间 惹起 修真 道 无 数 腥 风 血雨 , 异世 大陆 , 修真 界 诸多 高手 争霸 天下 , 树 下 野狐 , 唐 家 三 少 , 断 刃 天涯 , 黯然 销魂 , 梦 入 神机 , 静官 , 撒冷 , 血红 , 鬼雨 , 斩 空 , 赤 虎 , 傲 无 常 . . . 齐 出 江湖 , 最后 谁 能 问鼎 ? 请 看 拙作 天下 霸 唱 2008 最 值得 期待 的 玄幻 仙 侠 传奇 ! 中文 名 天下 悲歌 作者 掬 花 在 手 纳兰 于 袖 17195611 小说 进度 连载 连载 网站 起点 中文 网
+365 找 房 运行 环境 0 支持 Android 2 . 1 应用 类型 0 生活 实用 类 软件 应用 介绍 0 软件 仅 适用 2 . 1 及 以上 固件 版本 365 找 房 是 三六五 地产 家居 网 推出 的 一款 免费 的 专业 的 手机 找 房 应用 软件 。
+为 广大 有 买房 租房 需求 的 用户 提供 全面 专业 房屋 租售 , 小区 楼盘 和 方便 效率 的 找 房 服务 。
+应用 名称 365 找 房 应用 平台 mobile 应用 版本 1 . 2 . 4
+蝇虎 跳 蛛 蝇虎 跳 蛛 既 是 跳 蛛 , 在 中国 南方 广东 福建 沿海 地区 与 台湾 地区 称 跳 蛛 为 蝇虎 , 意 为 “ 吃 苍蝇 的 老虎 ” , 《 本草 纲目 拾遗 》 中 称 跳 蛛 为 蝇虎 , 皆以 其 捕食 蝇 类 而 得名 。
+跳 蛛 科 ( Salticidae ) 是 蜘蛛 目 ( Aran eae ) 中 种数 最 多 的 一 个 科 。
+据 NormanI . Plat nick 统计 , 截止 到 2005 年 12 月 27 日 , 该 科 蜘蛛 共计 有 553 属 5035 种 , 广泛 分布 于 世界 各地 。
+1 基本 信息 0 跳 蛛 体型 一般 较 粗壮 , 体色 鲜明 , 被 有 细毛 或 鳞状 毛 , 部分 种 被 有 金属 光泽 。
+身体 分 为 头 胸部 和 腹部 两 部分 , 其间 以 腹 柄 相连 , 腹部 不 分 节 。
+跳 蛛 是 一 种 重要 的 农业 益虫 , 可 捕食 大量 双翅 目 成虫 。
+蝇虎 跳 蛛 的 头部 长 着 大 眼睛 以及 微 青 绿色 的 牙齿 , 除了 它们 的 外表 具有 嫩 黄色 的 软毛 , 有 时 猛地 一看 会 误 将 它们 当成 野花 。
+这 一 可爱 造型 让 蝇虎 跳 蛛 荣获 了 “ 最 可爱 的 蜘蛛 ” 的 称号 。
+到 目前 为止 , 全 世界 已经 识别 出来 的 跳 蛛 种类 共 有 5000 多种 。
+人们 可以 很 容易 根据 它们 头部 和 面部 的 八 只眼 睛 来 识别 它们 。
+蝇虎 之所以 得到 “ 跳 蛛 ” 的 名字 , 就 在于 他们 的 特长 是 跳跃 , 它们 一 次 跳出 的 距离 甚至 比 它们 身长 的 50 倍 还要 长 。
+形态 特征 0 体 小 , 长 约 1 厘米 , 全身 被 绒毛 , 背面 有 黑色 或 灰黄色 的 斑纹 。
+头 端 有 细小 的 口 ; 单眼 4 对 , 位于 头 胸部 背 侧 的 前端 , 以 2 、 2 、 4 排 成 3 列 , 第 1 列 的 2 个 单眼 最 大 。
+头 胸部 呈 四 角 形 , 下 有 附 肢 6 对 , 第 1 对 呈 螯 状 , 内 有 毒 腺 ; 第 2 对 为 脚 须 , 如 触角 , 雄性 末节 膨大 成交 配器 。
+其他 4 对 为 步 足 , 粗 短 , 跗 节 末端 有 爪 2 枚 。
+腹部 较 狭 , 前 腹 有 生殖 孔 及 生殖 板 , 尾端 有 疣 状 突起 的 纺锤 突 , 内 通 纺 绩 腺 , 能 分泌 粘液 而 抽丝 。
+形态 特征 头部 : 1 头 胸部 的 外 骨骼 角质 化 、 坚硬 , 背面 称 为 背 甲 , 颈 沟 、 放射 沟 颜色 稍 深 。
+颈 沟 腹面 常 称 为 胸甲 或 胸板 , 胸板 前方 为 颚 叶 和 下唇 。
+背 甲 : 表面 多 被 有 毛 , 具有 散发 金属 光泽 的 鳞状 毛 , 黄褐色 或 黑 褐色 。
+胸甲 : 由 胚胎 时期 头 胸部 各 节 的 腹板 愈合 而成 , 被 有 细毛 。
+眼 : 跳 蛛 眼 同型 , 皆 为 昼 眼 , 黑色 , 呈 三 列 , 形式 排列 。
+前 眼 列 近乎 端 直 , 位于 头 胸部 前缘 的 垂直 面 ( 即 眼 朝 前 ) , 眼 较大 , 其中 前 中 眼 最 大 , 形 如 汽车 前灯 。
+后 眼 列强 后 曲 , 后 中 眼 较小 , 肉眼 难以 见 及 , 位于 前 眼 列 后方 , 成为 中 眼 列 , 后侧 眼 为 后 眼 列 。
+所有 眼 占据 的 头 区 部分 称 为 眼 域 , 眼 域 一般 占 头 胸部 的 , 呈 方形 。
+螯 肢 : 由 螯 基 和 螯 爪 组成 。
+螯 爪 细长 而 弯曲 , 腹面 具 细 齿 , 背面 近 端 部 有 一 毒腺 的 开 孔 。
+螯 基 内缘 有 牙 沟 , 沟 的 两岸 具 齿 , 称 为 齿 堤 。
+位于 前侧 的 称 前齿 堤 , 位于 后 侧面 的 称 为 后 齿 堤 。
+形态 特征 胸 腹部 : 2 跳 蛛 的 腹部 绝 大 多数 为 卵 圆形 或 长 卵形 , 少数 成 圆柱 状 。
+腹部 柔软 、 不 分 节 , 多 褐色 或 灰黄色 , 斑纹 多样 , 有 的 具有 金属 光泽 。
+多数 种 腹部 背面 前端 两侧 或 后面 常有 对 肌 痕 , 腹部 腹面 前端 两侧 具有 对 书 肺 , 成熟 雌 蛛 的 生殖 沟 正 中部 有 外 雌 器 , 其 内部 有 交媾 管 、 纳 精囊 等 , 这 是 雌 蛛 分类 重要 形态 特征 依据 。
+腹部 末端 具有 对 纺 器 , 即 前 纺 器 、 中纺 器 和 后 纺 器 , 前 纺 器 比 后 纺 器 短 且 粗 。
+捕食 行为 0 蝇虎 蜘蛛 眼睛 的 视觉 灵敏 度 大约 有 人类 的 六分 之 一 , 这 让 它们 拥有 了 “ 与 灵长 类 动物 一 样 的 视力 和 狮子 一 样 的 捕食 能力 ” 。
+蝇虎 蜘蛛 善于 蹦跳 。
+潜 近 猎物 时 , 先 逐步 靠近 , 再 从 较 近 的 位置 跳 到 猎物 身 上 , 用 前 足 、 螯 肢 共同 抓紧 猎物 , 毒牙 咬住 猎物 后 注入 毒液 , 麻痹 猎物 、 消化 猎物 组织 , 之后 吸取 消化 后 的 汁液 。
+在 天花板 上 倒 行时 亦可 跳跃 捕食 。
+因 从 树 上 等 高处 摔 下 危险 , 预备 跳 时 先 粘 上 一根 丝 , 就像 爬山 者 用 岩钉 及 绳子 防止 坠落 。
+蝇虎 蜘蛛 因 擅长 于 解决 一些 复杂 的 任务 而 著称 。
+自 1936 年 蝇虎 蜘蛛 在 印度 被 发现 以来 , 人类 还是 首次 发现 蜘蛛 的 这种 行为 。
+蝇虎 蜘蛛 首先 通常 潜伏 于 蚁巢 周围 10 厘米 处 。
+5 秒钟 后 , 它 会 毁坏 蚁巢 的 表层 封口 , 用 自己 的 尖牙 利爪 从 蚂蚁 的 口中 抢夺 食物 , 然后 再 返回 自己 的 家中 享用 。
+一 次 掠食 , 这 一 过程 大约 会 重复 多达 4 次 。
+这种 攫取 方式 比 正常 的 捕食 方式 节省 了 不 少 的 精力 , 因为 鲜活 的 目标 通常 会 极力 逃避 蝇虎 蜘蛛 的 攻击 。
+这种 偷食 也许 会 让 蝇虎 蜘蛛 的 工作 量 大增 , 因为 它们 也 在 尽量 挑选 那些 新鲜 的 、 营养 丰富 的 湖 蝇 。
+食物 0 蝇虎 一般 以 双翅 目 、 同 翅 目 ( 苍蝇 、 蚊子 、 蠓 、 叶蝉 等 ) 等 一类 中 小型 昆虫 为食 , 居家 环境 下 偶尔 捕食 蟑螂 若虫 。
+它们 白天 活动 , 喜欢 呆 在 有 阳光 的 地方 , 到 了 晚上 就 钻进 树皮 下 或 石头 下面 的 巢 里 。
+蝇虎 虽然 不 会 结网 , 但是 也 会 吐丝 , 尾部 有 一 条 拉丝 , 是 它们 的 生命 线 。
+碰到 危险 时 , 可以 利用 这 条 拉丝 , 从容 地 滑 入 草丛 中 , 避开 敌人 的 追击 。
+中文 学名 跳 蛛 界 动物 界 门 节肢 动物 门 亚门 螯 肢 亚门 纲 蛛 形 纲 亚纲 柄 腹 亚 纲目 蜘蛛 目 亚目 新 蛛 亚目 科 跳 蛛 科
+娄烦 镇 娄烦 镇 位于 周 洪山 南麓 , 涧 河 两岸 , 辖 属 山西 省 太原 市 娄烦 县 。
+东 濒 汾河 水库 , 南接 天池 店 乡 , 西邻 马 家庄 乡 和 盖 家庄 乡 , 北 与 静 游 镇 相连 。
+简介 0 是 县 政府 所在 地 , 是 全县 政治 、 经济 、 文化 中心 。
+总 面积 159 平方 公里 , 耕地 面积 18000 亩 , 共 有 4101 户 24059 人 。
+其中 农业 人口 16859 人 。
+共 有 基层 党 支部 36 个 , 其中 机关 支部 4 个 , 社区 支部 3 个 , 农村 支部 29 个 。
+共 有 党员 656 名 , 其中 女 党员 104 人 。
+境内 有 娄烦 著名 的 三教 寺 和 周 洪山 普净 寺 。
+辖 煤矿 2 座 ( 娄烦 镇 煤矿 、 黑山 岔 沟 煤矿 ) , 砖厂 4 座 ( 个体 ) 。
+共 有 中学 1 所 ( 娄烦 镇 中学 ) , 小学 18 所 , 教 职工 210 人 , 在校 学生 2750 名 。
+卫生 院 1 所 , 村级 卫生 所 27 个 。
+文化 站 1 个 , 村级 文化 室 27 个 。
+行政 区划 0 共 有 自然 村 40 个 , 其中 行政 村 26 个 , 社区 居委 会 6 个 ( 城东 、 城中 、 城西 、 娄烦 一 、 二 、 三 居委 会 ) 。
+库区 村 11 个 ( 娄烦 村 、 城北 、 西 果园 、 尹 家 窑 、 范 家 村 、 官庄 、 旧 娄烦 、 旧 城北 、 西街 、 蒲 峪 、 向阳 村 ) , 占 到 全县 库区 村 的 一半 。
+: 瓦 子 坪 村 、 凤凰 村 、 白道 坡 村 、 白道 村 、 大 圪 垛 村 、 崖 窑 足 村 、 第 二 足 村 、 四家 坪 村 、 杜 家岭 村 、 红 崖头 村 、 我 家 村 、 旧城 北村 、 大夫 庄 村 、 娄 家庄 村 、 童子 崖 村 、 姚 罗村 、 河 家庄 村 、 新良 庄 村 、 官庄 村 、 三 元 村 、 城北 村 、 西 果园 村 、 尹 家 窑 村 、 旧 娄烦 村 、 范 家 村 、 任 家 沟 村 、 蔡 阳庄 村 、 前 黑 山村 、 后 黑 山村 、 西街 村 、 蒲 峪村 、 向阳 村 、 席 岭村 、 塔 圪 垛 村 、 小泉 沟 村 、 寨 沟 村 、 峁 儿 上 村 等 。
+经济 建设 0 截止 2006 年底 , 完成 农村 经济 总 收入 6300 万元 , 比 2005 年 的 5700 万元 增长 了 10 . 5 % ; 农民 人均 纯 收入 达 1807 元 , 比 2005 年 的 1567 元 增长 了 15 . 3 % ; 粮食 播种 面积 20200 亩 , 粮食 总 产量 达 215 . 8 万 斤 。
+名人 : 李 述 孔 中文 名称 娄烦 镇 行政 区 类别 镇 电话 区号 0351 地理 位置 周 洪山 南麓 , 涧 河 两岸 面积 159 平方 公里 人口 24059 人 车牌 代码 晋 A
+粗 茎 卷柏 粗 茎 卷柏 ( 学名 : Selaginella frondosa ) 为 卷柏 科 卷柏 属下 的 一 个 种 。
+粗 茎 卷柏 , 卷柏 科 卷柏 属 植物 , 土生 , 直立 。
+分布 于 云南 南部 ( 河口 ) , 也 分布 到 越南 北部 。
+形态 特征 0 土生 , 直立 , 高 ( 20 - ) 50 - 70 厘米 , 具 一 横 走 的 地下 根 状 茎 和 游走 茎 。
+根托 生于 茎 的 基 部 或 匍匐 根 状 茎 处 , 长 0 . 5 - 1 . 8 厘米 , 直径 1 - 2 毫米 , 根 多 分叉 , 被 毛 。
+主茎 自 中 上部 开始 羽状 分枝 , 不 呈 “ 之 ” 字形 , 无 关节 , 禾 秆 色 , 不 分枝 的 主茎 高 20 - 30 厘米 , 主茎 下部 直径 3 - 5 毫米 , 茎 近 方形 , 具 沟槽 , 无 毛 , 维管束 1 条 , 主茎 顶端 不 呈 黑 褐色 ; 侧枝 5 - 7 对 , 2 回 羽状 分枝 , 小枝 排列 紧密 , 主茎 上 相邻 分枝 相距 2 . 5 - 8 厘米 , 分枝 无 毛 , 背 腹 压扁 , 主茎 在 分枝 部分 中部 连 叶 宽 10 - 13 毫米 , 末 回 分枝 连 叶 宽 5 - 7 ( - 8 ) 毫米 。
+叶 ( 除 不 分枝 主茎 上 的 外 ) 交互 排列 , 二 形 , 草纸 或 多少 厚 , 表面 光滑 , 不 具白 边 , 不 分枝 主茎 上 的 叶 排列 稀疏 , 不 大于 分枝 上 的 , 一 形 , 绿色 , 卵 状 三角 形 , 压扁 , 叶 背 呈 龙骨 状 , 基 部 边缘 有 睫毛 。
+分枝 上 的 腋叶 对称 , 卵 状 披针形 , 3 . 2 - 5 . 0 毫米 × 1 . 1 - 1 . 8 毫米 , 基 部 边缘 具 长 睫毛 , 其余 部分 具 短 睫毛 , 基 部 无 耳 藓 ( 心形 或 近 心形 ) 。
+中叶 不 对称 , 小枝 上 的 卵 状 椭圆 形 , 2 . 2 - 2 . 8 毫米 × 1 . 0 - 1 . 8 毫米 , 紧接 或 覆 瓦 状 排列 , 背部 呈 明显 的 龙骨 状 , 先端 具 芒 , 基 部 斜 心形 , 基 部 边缘 有 很少 的 长 睫毛 , 其余 部分 具 睫毛 。
+侧 叶 不 对称 , 分枝 上 的 长圆 状 镰 形 , 略 斜 升 , 4 . 7 - 6 . 0 毫米 × 1 . 7 - 2 . 2 毫米 , 先端 急 尖 。
+上侧 基 部 圆形 , 覆盖 茎 枝 , 上侧 基 部 边缘 具 长 睫毛 , 中部 具 端 睫毛 , 其余 部分 全 缘 , 睫毛 长 0 . 2 - 0 , 5 毫米 。
+生殖 特征 孢子 叶 穗 紧密 , 四 棱柱 形 , 单 生 于 小枝 末端 , 或 偶 呈 侧 生 , 或 成对 着生 , 10 - 30 毫米 × 2 . 0 - 3 . 0 毫米 ; 孢子 叶 一 形 , 不 具白 边 , 边缘 具 细 齿 , 锐龙 骨 状 , 先端 渐 尖 ; 只有 一 个 大 孢子 叶 位于 孢子 叶 穗基 部 的 下侧 , 或 大 、 小 孢子 叶 相间 排列 , 或 大 孢子 叶 分布 于 中部 下侧 。
+大 孢子 白色 或 灰色 ; 小 孢子 淡 黄色 。
+生长 环境 0 生长 于 海拔 100 - 150 米得 石灰 岩 山地 雨林 下 。
+分布 范围 0 分布 于 云南 南部 ( 河口 ) , 也 分布 到 越南 北部 。
+中文 学名 粗 茎 卷柏 拉丁 学名 Selaginella frondosa Warb . 界 植物 界 门 蕨类 植物 门 纲 石松 纲 科 卷柏 科 属 卷柏 属 种 粗 茎 卷柏 分布 区域 云南 南部 ( 河口 ) , 也 分布 到 越南 北部 。
+
+湖南 联 冠 电子 科技 有限 公司 湖南 联 冠 电子 科技 有限 公司 于 2009 年 07 月 08 日 在 长沙 市 工商 行政 管理 局 芙蓉 分局 登记 成立 。
+法定 代表 人 焦作 为 , 公司 经营 范围 包括 电子 产品 研发 ; 软件 开发 ; 软件 技术 服务 ; 软件 技术 转让 等 。
+企业 信息 0 公司 类型 有限 责任 公司 登记 机关 长沙 市 工商 行政 管理 局 芙蓉 分局 成立 时间 2009 年 07 月 08 日发 照 时间 2 016 年 05 月 31 日
+大庆 市 建筑 规划 设计 研究 院 大庆 市 规划 建筑 设计 研究 院 创建 于 1989 年 3 月 , 是 以 服务 于 城乡 建设 事业 为主 的 综合 型 规划 建筑 设计 科研 单位 , 具有 国家 批准 的 城市 规划 编制 、 建筑 工程 及 风景 园林 3 项 甲级 资质 , 市政 公用 行业 、 测绘 2 项 乙级 资质 , 公路 工程 资质 及 工程 咨询 资质 。
+并 通过 了 ISO 9001 质量 管理 体系 的 认证 。
+规模 0 现有 工作 人员 156 人 。
+其中 : 高级 工程 师 48 人 , 工程 师 51 人 , 助理 工程 师 41 人 , 工 勤 人员 16 人 。
+具有 研究 生 以上 学历 人员 11 人 , 现有 注册 人员 32 人 。
+其中 : 一 级 建筑 师 3 人 , 一 级 结构 师 5 人 , 二级 建筑 师 3 人 , 注册 规划 师 9 人 , 注册 设备 师 4 人 , 注册 电气 师 2 人 , 注册 咨询 师 3 人 , 注册 造价 师 3 人 。
+院内 现代 化 技术 设备 先进 , 彩色 激光 打印 机 、 彩色 扫描 仪 、 工程 扫描 仪 、 大幅 面 绘图 仪 、 全站 仪 、 数码 一体 机 等 设备 齐全 , 计算机 普及 率 及 成 图 率 达到 100 % 。
+建院 二十 年 来 , 业务 遍及 国内 4 个 省 、 自治 区 , 承担 完成 各 层次 和 类型 的 城市 规划 设计 、 市政 工程 设计 、 风景 园林 工程 设计 、 建筑 工程 设计 及 工程 测绘 任务 总计 2000 余 项 。
+其中 : 大庆 市 万宝 居住 区 规划 、 大庆 市 兰德 居住 区 规划 、 大庆 市 妇儿 活动 中心 工程 、 大庆 市 万宝 中学 工程 、 大庆 市 经 九 街 改造 设计 等 30 余 项 成果 分别 获 国家 及 省 部级 嘉奖 。
+承担 任务 0 有 : 城市 发展 研究 、 城市 总体 规划 、 详细 规划 、 城市 设计 、 环境 景观 设计 、 建筑 单体 设计 、 市政 工程 设计 、 公路 工程 设计 、 工程 测量 、 工程 概 预算 以及 工程 咨询 等 项目 , 对外 服务 项目 有 : 打字 、 复印 、 打 图 、 装订 、 晒图 。
+在 新 世纪 里 , 大庆 市 规划 建筑 设计 研究 院 以 “ 科学 管理 、 一 流 设计 、 优质 服务 、 持续 满足 顾客 要求 ” 为 质量 方针 , 以 “ 热诚 、 优质 、 高效 ” 为 服务 宗旨 , 以 创建 “ 精品 工程 ” 为 己任 。
+竭诚 为 全国 各地 城乡 建设 提供 更加 优质 的 服务 , 为 全国 城乡 建设 事业 的 繁荣 发展 做出 更 大 贡献 。
+公司 名称 大庆 市 建筑 规划 设计 研究 院 成立 时间 1989 年 3 月 公司 口号 服务 于 城乡 建设 事业 为主 员工 数 156 人
+荣 胜荣 胜 , 男 , 汉族 , 1968 年 9 月 出生 , 籍贯 安徽 枞阳 , 中央 党校 大学 学历 , 中共 党员 , 现任 铜陵 市 纪委 常委 。
+
+酸菜 饼 酸菜 饼 是 一款 由 面粉 、 酸菜 等 原料 制成 的 菜品 。
+材料 0 面粉 , 热 开水 70 - - 80 度 , 冷水 少许 , 食用 油 , 酸菜 , 猪肉 末 带 点 肥肉 , 蒜 , 葱 , 酱油 , 胡椒 粉 , 香油 做法 01 . 面粉 放 盆 里 , 边 加热 开水 边 用 筷子 搅匀 , 再 加 适量 凉水 , 加入 少许 油 揉 成 团 , 盖 保鲜 膜 醒 半 小时 。
+2 . 醒 好 的 面团 分成 小块 , 把 小块 面团 擀 开 包 入 馅料 , 收口 朝 下 压扁 , 粘 上 芝麻 。
+3 . 锅 里 倒 适量 油 , 中小 火把 饼 煎 至 两面 金黄 即可 。
+4 . 酸菜 馅 做法 : 酸菜 切末 , 挤 干 些 水份 。
+5 . 锅 里 放 油 , 放 蒜 , 葱 炒 香 , 放入 猪肉 末 炒 熟 , 加入 少许 酱油 、 胡椒 粉 、 调味 。
+6 . 再 加入 酸菜 稍 炒 , 加 适量 香油 出 锅 。
+中文 名 酸菜 饼 材料 < a href = " # " > 面粉 < / a > , 热 开水 70 - - 80 度 , 冷水 少许 做法 面粉 放 盆 里 , 边 加热 开 水边 . 酸菜 馅 做法 酸菜 切末 , 挤 干 些 水份
+江苏 德川 汇 汽车 科技 有限 公司 江苏 德川 汇 汽车 科技 有限 公司 于 2014 年 05 月 29 日 在 常熟 市 市场 监督 管理 局 登记 成立 。
+法定 代表 人 邱 孝川 , 公司 经营 范围 包括 汽车 零 部件 研发 ; 机械 装备 、 汽车 生产 工艺 装备 等 。
+企业 信息 0 公司 类型 有限 责任 公司 登记 机关 常熟 市 市场 监督 管理 局 成立 时间 2014 年 05 月 29 日发 照 时间 2016 年 07 月 12 日
+初一 下册 政治 复习 提纲 《 初一 下册 政治 复习 提纲 》 是 一款 Android 平台 的 应用 。
+应用 介绍 0 初一 下册 政治 复习 提纲 , 精心 整理 编辑 知识 95 条 。
+本 应用 提供 顺序 阅读 、 收藏 阅读 、 按 标题 浏览 、 分类 阅读 、 搜索 功能 , 方便 您 阅读 。
+分类 如下 : 第 一课 珍惜 无 价 的 自尊 12 条第 二 课 扬起 自信 的 风帆 16 条第 四 课 人生 当 自强 8 条第 五 课 让 挫折 丰富 我们 的 人生 10 条第 六 课 为 坚强 喝彩 5 条第 七 课 感受 法律 的 尊严 22 条第 八 课 法律 护 我 成长 22 条 特点 : . 无 需 联网 , 离线 查阅 . 内容 丰富 . 简单 方便 、 运行 速度 快 . 自动 保存 阅读 进度 . 点 击 图片 放大 显示 . 增加 收藏 功能 . 增加 明显 页 内 功能 按钮 . 优化 阅读 视觉 体验 支持 版本 0 Android 1 . 6 及 以上 软件 名称 初一 下册 政治 复习 提纲 软件 平台 Android 软件 大小 2 . 65 MB
+徐 玖平 徐 玖平 , 曾 任 成都 市 发展 和 改革 委员 会 巡视 员 。
+2017 年 7 月 , 免去 徐 玖平 的 成都 市 发展 和 改革 委员 会 巡视 员 职务 ;
+奥迪 A6L 2 . 0T 手动 基本 装备 车型 名称 : 奥迪 A6L 2 . 0T 手动 基本 装备 报价 ( 最 低价 ) : 34 . 68 万 ( 34 . 68 万 ) 品牌 : 一汽 奥迪 级别 : 中大 型 车 发动机 : 2 . 0T 170 马力 L4 变速 箱 : 6 挡 手动 长 × 宽 × 高 : 5012 × 1855 × 1485 车体 结构 : 4 门 5 座 三 厢 车 “ 奥迪 A6L 2 . 0T 手动 基本 装备 ” 的 详细 参数 车身 | 发动机 | 变速 箱 | 底盘 转向 | 车轮 制动 | 配置 基本 信息 0 车型 名称 奥迪 A6L 2 . 0T 手动 基本 装备 推出 年份 2005 最 高 车速 2190 - 100 加速 时间 10 . 7100 - 0 制动 距离 - 百 公里 经济 油耗 6 . 4 车身 长度 5012 宽度 1855 高度 1485 车重 1730 轴距 2945 前 轮距 - 后 轮距 - 最 小 离 地 间隙 142 车身 结构 三 厢 车 车门 数 4 座位 数 5 油箱 容积 80 行李 箱 容积 501 发动机 排量 ( cc ) 1984 排量 ( L ) 2 . 0 工作 方式 涡轮 增压 气缸 排列 型式 L 汽缸 数 4 每 缸 气门 数 4 压缩 比 10 . 5 气门 结构 DOHC 缸径 82 . 5 冲程 92 . 8 马力 170 最 大 功率 ( kW ) 125 最 大 功率 转速 4300 - 6000 最 大 扭矩 ( N · m ) 280 最 大 扭 据 转速 ( rpm ) 1800 - 4200 发动机 特有 技术 可变 相位 燃油 汽油 燃油 标号 97 号 供油 方式 直 喷 缸体 材料 铝 缸盖 材料 铝 环保 标准 欧 IV 机油 容量 - 变速 箱 名称 6 挡 手动 挡 位 个 数 6 变速 箱 类型 MT 底盘 转向 驱动 方式 前置 前驱 前 悬挂 类型 轻质 四 连杆 式 独立 悬 后 悬挂 类型 挂 梯形 连杆 式 独立 悬挂 助力 类型 机械 式 液压 动力 车轮 制动 0 前 制动 器 类型 碟 式 后 制动 器 类型 碟 式 前 轮胎 规格 225 / 55R 16 后 轮胎 规格 225 / 55R 16 前 轮辋 规格 7 . 5JX 16 后 轮辋 规格 7 . 5JX 16 备胎 全 尺寸 中文 名 奥迪 A6L 2 . 0T 手动 基本 装备 品牌 & nbsp ; 一汽 奥迪 级别 & nbsp ; & nbsp ; 中大 型 车 发动机 & nbsp ; & nbsp ; 2 . 0T 170 马力 L 4
+一 分钟 快速 致富 《 一 分钟 快速 致富 》 是 2010 年 中信 出版 社 出版 的 书籍 , 作者 是 马克 · 维克多 · 汉森 、 罗伯特 · G · 艾伦 。
+本 书 介绍 了 获得 致富 秘籍 的 三 个 关键 要素 。
+内容 简介 0 马克 和 罗伯特 曾 取得 了 《 一 分钟 百万 富翁 》 的 成功 。
+本 书 继续 延用 这 一 模式 , 在 第 一 部分 续写 了 《 一 分钟 百万 富翁 》 中 米歇尔 的 致富 故事 , 第 二 部分 提出 了 培养 百万 富翁 思维 和 积累 财富 的 对策 。
+米歇尔 的 故事 极具 启示 性 , 讲述 了 她 如何 将 自己 新 发现 的 理财 智慧 运用 于 她 组织 起来 的 一 群 女性 中间 。
+而 这些 女性 处在 不 同 的 生活 阶段 , 有 着 不 同 的 财务 需求 。
+凭借 十几 年 帮助 成千 上万 人 实现 理财 目标 的 成功 经验 , 在 本 书 中 , 汉森 和 艾伦 将 告诉 你 : 你 想 让 账户 余额 越来 越多 吗 ? 你 想 实现 财务 自由 吗 ? 你 想过 上 期待 已 久 的 理想 生活 吗 ? 在 本 书 中 , 两位 美国 最 成功 的 畅销 书 作家 将 与 你 分享 致富 秘籍 , 告诉 你 如何 马上 把 这 一切 变 为 现实 , 无 论 你 现在 的 财务 状况 如何 , 也 无 论 现在 的 经济 是否 景气 。
+你 将 学会 如何 利用 自己 已有 的 技能 、 资源 和 方法 , 迅速 打开 财富 之 门 。
+· 获得 致富 秘籍 的 三 个 关键 要素 —— 尤其 是 在 经济 不 景气 的 背景 下 · 如何 培养 实现 一切 日 标 不 可 或 缺 的 激情 · 如何 鼓舞 与 你 志同 道合 的 人 并 把 他们 组织 起来 , 建立 你 的 “ 梦 之 队 ” , 并 帮助 你 实现 目标 · 如何 利用 具体 的 成功 策略 , 在 最 短 的 时间 内 创造 永 不 枯竭 的 现金 流 正确 的 思维 、 得力 的 团队 和 完美 的 技术 , 将 引领 你 走 上 快速 致富 之 路 。
+作者 简介 0 马克 · 维克多 · 汉森 ( Mark Victor Hansen ) 历史 上 最 大 的 畅销 书 系列 之 一 《 心灵 鸡汤 》 的 作者 , “ 心灵 鸡汤 ” 现在 已 成为 一 种 出版 现象 。
+“ 心灵 鸡汤 ” 系列 已 被 泽 成 十几 种 语言 , 在 全球 销售 逾 1 . 12 亿 册 。
+汉森 拥有 25 年 的 公开 演讲 经验 , 受 众 遍布 全球 。
+图书 目录 0 前瞻 第 1 部分 米歇尔 的 故事 : 用 最 短 时间 致富 的 传奇 前言 1 财富 菜单 2 最后 一 分钟 成为 百万 富翁 3 心灵 的 协议 4 找到 平衡 5 全力 以 赴 6 开始 探险 7 相信 过程 8 巨大 的 心理 斗争 9 前方 的 路 10 开端 11 生活 中 小小 的 紧急 状况 12 谁 是 下 一 个 ?
+13 找到 终点 14 丰富 多彩 的 生活 第 2 部分 快速 致富 导言 1 过 上 富足 生活 的 烹饪 秘籍 : 最 快 的 赚钱 方式 2 三种 必备 材料 : “ 现在 就 欢呼 ” , “ 内在 成功 者 ” 、 梦 之 队 3 成功 的 公式 第 一 章 第 一 要素 : “ 现在 就 欢呼 ” 4 声音 、 愿景 和 感应 : 快速 赚钱 始于 内心 5 消极 声音 : 让 “ 内在 自责 者 ” 闭嘴 6 积极 声音 : 使 你 有 力量 的 问题 和 让 你 精神 振奋 的 话语 7 改变 过去 , 展望 未来 : 成功 的 基石 8KA - CHING 思维 : 让 你 的 账户 充实 起来 9 感应 : 如何 为 现在 喝彩 第 二 章 第 二 要素 : “ 内在 成功 者 ” 10 听从 你 的 心声 : 走近 ” 内在 成功 者 ” 11 创造 收入 的 快速 通道 : 从 你 喜欢 做 的 事情 上 赚钱 12 告诉 我 钱 在 哪里 : 财富 就 在 你 眼前 第 三章 第 三 要素 : 梦 之 队 13 你 的 梦 之 队 : 齐心 协力 , 创造 奇迹 第 四 章 快速 致富 14 个人 财富 之 轮 : 把 想法 变成 金钱 15 把 问题 变成 财富 16 财富 的 源泉 : 稳定 的 收入 流 17 百万 富翁 矩阵 : 赚取 百万 美元 的 7 种 方式 18 唯一 的 出路 是 推销 19 向 财富 兑现 致谢 书名 一 分钟 快速 致富 作者 马克 · 维克多 · 汉森 、 罗伯特 · G · 艾伦 译者 李 耀 IS BN 9787508618982 定价 38 . 00 元 出版 社 中信 出版 社 出版 时间 2010 - 3 - 1 开本 16 开
+强 殖 装甲 风暴 《 强 殖 装甲 风暴 》 是 连载 于 幻剑 书盟 的 一 部 异术 异能 类 小说 , 作者 是 赤眉 教主 。
+小说 类型 0 异术 异能 内容 简介 0 一 块 不 知 从 何 而来 的 神奇 的 石头 , 竟然 是 一 具 神秘 的 强 殖 装甲 。
+少年 孙 若丹 在 因缘 巧合 之 下 得到 了 强 殖 装甲 , 他 的 生活 从此 起 了 翻天 复地 的 变化 , 家仇 、 情感 、 黑暗 · · · · · … … 中文 名 强 殖 装甲 风暴 作者 赤眉 教主 小说 进度 暂停 连载 网站 幻剑 书盟
+漫说 梅 兰 芳 《 漫说 梅 兰 芳 》 是 2009 年 河南 教育 出版 社 出版 的 图书 , 作者 是 田 汉 。
+图书 信息 0 出版 社 : 河南 教育 出版 社 ; 第 1 版 ( 2009 年 10 月 1 日 ) 丛书 名 : 名家 文化 小 丛书 平装 : 79 页 正文 语种 : 简体 中文 开本 : 32 ISBN : 7534756960 , 9787534756962 条形码 : 9787534756962 尺寸 : 18 x 12 . 6x 0 . 6cm 重量 : 100 g 作者 简介 0 田 汉 ( 1898 — 1968 ) 。
+本名 田 寿 昌 , 湖南 长沙 人 。
+创造 社 成员 。
+曾 组织 和 主持 有 名 的 南国 社 、 南国 艺术 学校 等 戏剧 团体 和 戏剧 教育 组织 。
+他 的 《 获 虎 之 夜 》 、 《 名优 之 死 》 、 《 丽人 行 》 等 话剧 , 是 现代 文学 史 上 卓有 影响 的 优秀 之 作 。
+建国 后 曾 在 文化 部 、 中国 戏剧 家 协会 任职 , 在 指导 戏曲 改革 方面 获得 了 显著 的 成果 , 他 自己 曾 改编 了 《 西厢 记 》 、 《 白 娘子 》 等 传统 剧目 。
+《 关 汉 卿 》 和 《 谢 瑶 环 》 分别 代表 着 田 汉 话剧 创作 和 戏曲 创作 的 最 高 水平 。
+内容 简介 0 《 漫说 梅 兰 芳 》 内容 简介 : 田 汉 原名 寿 昌 , 曾 用 笔名 伯 鸿 、 陈 瑜 、 漱 人 、 汉 仙 等 。
+1898 年 3 月 12 日 生于 湖南 省 长沙 县 。
+话剧 作家 , 戏曲 作家 , 电影 剧本 作家 , 小说 家 , 诗人 , 歌词 作家 , 文艺 批评 家 , 社会 活动 家 , 文艺 工作 领导 者 。
+中国 现代 戏剧 的 奠基 人 。
+多才 多 艺 , 著作 等 身 。
+原籍 湖南 长沙 。
+早年 留学 日本 , 1920 年代 开始 戏剧 活动 , 写 过多 部 著名 话剧 , 成功 地 改编 过 一些 传统 戏曲 。
+他 还是 中华 人民 共和 国 国歌 《 义勇 军 进行 曲 》 词 作者 。
+文化 大 革命 中 , 被 左 的 社会 势力 迫害 死于 狱中 。
+《 漫说 梅 兰 芳 》 为其 撰写 的 记录 梅 兰 芳 生平 事迹 的 专著 。
+目录 0 中国 旧 戏 与 梅 兰 芳 的 再 批判 苏联 为什么 邀 梅 兰 芳 去 演戏 梅 兰 芳 、 周 信 芳 演出 剧本 选集 序 追悼 梅 兰 芳 同志 和 梅 兰 芳 同志 最后 几 次 见面 梅 兰 芳 同志 精神 不 死 纪念 梅 兰 芳 同志 逝世 一 周年 书名 漫说 梅 兰 芳 页数 79 页 出版 社 河南 教育 出版 社 装帧 平装
+15 只 小猪 搭 房子 《 15 只 小猪 搭 房子 》 , 作者 大胆 的 设计 , 让 颜色 相遇 、 交织 、 碰撞 出 令人 惊艳 的 视觉 效果 。
+书 中 并 没 有 出现 15 只 小猪 的 具体 形象 , 然而 通过 它们 搭建 的 各种 新奇 房子 , 读者 却 可以 真切 地 认识 、 了解 它们 。
+1 获奖 记录 02018 年 3 月 , 获 2018 博洛尼亚 国际 儿童 书展 最 佳 童 书 奖 艺术 、 建筑 与 设计 奖 ( Art - Architecture Design ) 。
+获奖 点评 0Aureli é n Debat 大胆 的 设计 , 让 颜色 相遇 、 交织 、 碰撞 出 令人 惊艳 的 视觉 效果 。
+书 中 并 没 有 出现 15 只 小猪 的 具体 形象 , 然而 通过 它们 搭建 的 各种 新奇 房子 , 我们 却 可以 真切 地 认识 、 了解 它们 。
+它们 搭建 的 房子 很 特别 , 有 些 可以 实现 , 有 些 无法 实现 。
+读者 们 也 可以 利用 生活 中 的 常见 物品 搭建 属于 自己 风格 的 建筑 , 创作 新 的 故事 。
+中文 名 15 只 小猪 搭 房子 外文 名 Cabanes 国家 法国 作者 Aureli é n Debat
+选股 心经 : 买 在 起 涨 点 的 选股 方法 《 选股 心经 : 买 在 起 涨 点 的 选股 方法 》 内容 简介 : 股市 盈利 的 关键 是 选择 一只 表现 优异 的 股票 , 在 合适 的 时间 以 合适 的 价格 买入 、 卖出 。
+《 选股 心经 : 买 在 起 涨 点 的 选股 方法 》 在 基本 分析 与 技术 分析 的 基础 上 , 根据 股市 操盘 技术 的 发展 以及 作者 自身 的 实战 经验 , 分别 从 选股 的 “ 风向标 ” 、 “ 晴雨表 ” 、 “ 透视 镜 ” 、 “ 路线 图 ” 、 “ 导航 仪 ” 、 “ 刻度 尺 ” 、 “ 瞄准 星 ” 和 “ 千里 眼 ” 八 个 方面 进行 了 详细 介绍 , 全面 阐释 了 诸如 基本 面 、 行情 、 K线 、 形态 、 趋势 线 和 均线 、 技术 指标 、 庄家 动向 、 寻找 黑马 等 股市 分析 工具 和 方法 , 针对 如何 选取 表现 优异 的 股票 以及 如何 确定 买卖 点 等 重点 、 难点 进行 了 详细 剖析 , 系统 性 地 为 读者 朋友 们 介绍 了 一 套 实用 性 极 强 的 选股 技巧 。
diff --git a/PaddleNLP/ELMO/data/vocabulary_min5k.txt b/PaddleNLP/ELMO/data/vocabulary_min5k.txt
new file mode 100755
index 0000000000000000000000000000000000000000..189606edb802d4cc536afc3032e9f58003fe6a91
--- /dev/null
+++ b/PaddleNLP/ELMO/data/vocabulary_min5k.txt
@@ -0,0 +1,52445 @@
+
+
+
+,
+的
+。
+、
+.
+>
+<
+:
+是
+一
+b
+和
+在
+第
+年
+/
+不
+0
+有
+了
+)
+-
+(
+《
+》
+,
+为
+“
+”
+人
+1
+与
+;
+2
+月
+个
+中
+等
+之
+公司
+章
+上
+3
+)
+(
+我
+于
+市
+管理
+中国
+日
+对
+本
+4
+会
+名
+5
+工作
+他
+:
+以
+最
+及
+·
+中文
+大
+你
+无
+所
+性
+二
+也
+被
+省
+到
+三
+后
+研究
+并
+下
+发展
+…
+时间
+出版
+技术
+企业
+而
+就
+时
+a
+或
+"
+地
+6
+服务
+有限
+将
+区
+可
+人民
+作者
+这
+主要
+10
+都
+小说
+信息
+由
+者
+可以
+7
+新
+类型
+条
+要
+高
+教育
+其
+大学
+;
+能
+元
+小
+国家
+多
+从
+8
+系统
+产品
+县
+着
+经济
+简介
+作品
+书
+部
+村
+进行
+四
+内容
+网
+用
+!
+她
+该
+社
+建设
+%
+家
+文化
+设计
+社会
+12
+化
+院
+没
+自己
+局
+工程
+来
+9
+=
+项目
+游戏
+又
+政府
+节
+组织
+?
+被告
+内
+——
+使用
+学
+但
+11
+张
+器
+王
+专业
+国
+次
+让
+提供
+原告
+世界
+长
+连载
+我们
+副
+执行
+同
+应用
+好
+总
+名称
+法
+种
+五
+学校
+生产
+更
+至
+市场
+规定
+路
+前
+行政
+单位
+里
+已
+#
+中心
+向
+网络
+基本
+通过
+责任
+生活
+科学
+得
+代表
+环境
+学生
+号
+如
+还
+点
+说
+支持
+国际
+李
+机
+把
+合同
+其他
+成立
+00
+科技
+其中
+分
+起
+米
+出
+活动
+经营
+问题
+href
+软件
+两
+生
+负责
+北京
+关
+20
+开发
+.
+部门
+介绍
+公里
+室
+面积
+安全
+以及
+合作
+爱
+功能
+学院
+各
+包括
+方法
+镇
+水
+力
+标准
+使
+员
+平台
+table
+地区
+委员
+资源
+成为
+投资
+给
+去
+线
+站
+全国
+教学
+建筑
+30
+机构
+人员
+基础
+它
+他们
+诉讼
+艺术
+类
+成
+历史
+处
+出生
+范围
+同时
+过
+应
+开始
+部分
+万
+经
+产业
+具有
+15
+当
+属
+方式
+上海
+相关
+六
+结构
+法律
+情况
+要求
+很
+再
+天
+因
+设备
+&
+式
+分析
+酒店
+学习
+收入
+法院
+职业
+个人
+美国
+日期
+能力
+重要
+申请
+纠纷
+曾
+自然
+参加
+16
+交通
+作为
+处理
+山
+亩
+做
+根据
+控制
+质量
+网站
+图书
+刘
+需要
+金
+【
+型
+保护
+陈
+】
+万元
+费
+文学
+约
+程序
+奖
+案
+音乐
+?
+以上
+却
+监督
+自
+按
+外
+规划
+电视
+机关
+关系
+获得
+01
+城市
+影响
+系
+登记
+]
+[
+中华
+这个
+任
+未
+材料
+那
+全
+品牌
+综合
+道
+级
+工业
+知识
+电子
+师
+即
+—
+发生
+理论
+子
+数据
+农业
+什么
+nbsp
+政策
+实施
+七
+所有
+作用
+心
+则
+选择
+称
+2010
+位
+拥有
+现代
+场
+行业
+提高
+系列
+期
+分钟
+过程
+或者
+操作
+计划
+车
+率
+位于
+人物
+每
+因为
+运动
+指导
+制作
+方面
+应当
+2016
+版
+原
+看
+平方
+民事
+女
+教师
+户
+作
+审理
+建立
+男
+设施
+13
+制度
+2011
+十
+民族
+故事
+八
+如果
+~
+创新
+业
+14
+间
+周
+银行
+一百
+提出
+集
+主任
+实现
+已经
+馆
+城
+2017
+各种
+几
+共
+片
+完成
+方
+卫生
+岁
+销售
+强
+2012
+物
+保险
+国内
+电影
+毕业
+2009
+目标
+数
+形成
+科
+创作
+乡
+语言
+一般
+业务
+100
+量
+领导
+发现
+想
+黄
++
+旅游
+非
+2008
+林
+!
+比
+集团
+18
+协会
+50
+为了
+台
+带
+关于
+只
+培训
+农村
+度
+篇
+条件
+断
+25
+日本
+人口
+优秀
+模式
+出现
+重点
+病
+您
+克
+词
+代理
+获
+共和
+文件
+采用
+届
+利用
+书记
+品
+实验
+行为
+精神
+如何
+由于
+正
+演员
+组
+权
+门
+2015
+们
+特色
+这些
+the
+特点
+’
+花
+2014
+杨
+可能
+办公
+开展
+房
+儿童
+地方
+发表
+之间
+位置
+-
+2013
+2007
+认为
+2006
+主
+头
+参数
+×
+会议
+论
+文
+客户
+美
+开
+21
+17
+指
+距离
+一款
+价值
+叶
+军
+劳动
+均
+目录
+卡
+商
+传统
+字
+表
+健康
+低
+吧
+也是
+白
+决定
+关键
+治疗
+承担
+结合
+受
+这样
+余
+判决
+起点
+一些
+卷
+有效
+从事
+近
+体系
+按照
+汽车
+24
+项
+先进
+成功
+中央
+教授
+资料
+孩子
+加
+目前
+样
+*
+首
+进度
+一定
+队
+借款
+浙江
+老
+2000
+图
+依法
+体
+较
+生物
+领域
+谁
+行
+代
+改革
+基金
+还是
+双
+自动
+思想
+政治
+事业
+界
+九
+得到
+期间
+石
+马
+of
+神
+才
+医疗
+产生
+支付
+所以
+相
+世纪
+因此
+19
+丰富
+2005
+外文
+商业
+此
+时候
+案件
+区域
+任务
+达
+人才
+真
+全面
+存在
+植物
+中学
+22
+必须
+山东
+实际
+用户
+进入
+园
+西
+时代
+医院
+但是
+培养
+少
+公开
+风
+口
+机械
+金融
+水平
+法定
+东
+加强
+何
+史
+入
+这种
+增加
+~
+以下
+见
+话
+适用
+解决
+工商
+龙
+之后
+计算
+组成
+明
+检查
+吴
+最后
+专家
+面
+及其
+段
+达到
+会员
+英语
+发行
+土地
+楼
+占
+走
+食品
+导演
+层
+接口
+参与
+裁定
+制
+IS
+23
+党
+03
+现
+商品
+审判
+形式
+请
+2004
+40
+需
+积极
+经验
+特别
+南
+对于
+动物
+课程
+事
+方向
+非常
+咨询
+价格
+运行
+国籍
+清
+比较
+喜欢
+化学
+2003
+红
+加入
+初
+资金
+证
+直接
+普通
+知道
+09
+90
+身
+光
+便
+能够
+赵
+微
+学术
+效果
+家庭
+进
+专辑
+商务
+著名
+04
+年代
+表现
+团
+体育
+BN
+28
+加工
+免费
+制造
+重
+还有
+书名
+08
+消费
+手
+形
+保障
+履行
+剧
+07
+现在
+空间
+种植
+规范
+05
+证明
+显示
+报告
+论文
+写
+分类
+打
+实践
+社区
+佳
+联合
+广东
+主题
+工艺
+古
+北
+A
+云
+保证
+交流
+州
+一直
+事故
+帮助
+成员
+发
+确定
+成果
+委托
+农民
+生态
+心理
+06
+核心
+主义
+促进
+科研
+时期
+计算机
+案例
+人们
+流
+生命
+参考
+资产
+证据
+一切
+理念
+变化
+知
+江苏
+战略
+宣传
+电话
+四川
+26
+朋友
+大小
+变
+版本
+分别
+性能
+任何
+设置
+办法
+半
+回
+二节
+页数
+尺寸
+角色
+方案
+油
+02
+先
+吃
+and
+高级
+共同
+为主
+60
+特征
+像
+梦
+另
+剂
+媒体
+厅
+厂
+店
+若
+担任
+安装
+属于
+越
+吗
+经过
+歌曲
+通
+而且
+宋
+用于
+交易
+原则
+只有
+调查
+医学
+深
+河
+河南
+管
+且
+考试
+阅读
+移动
+青年
+单
+美术
+唐
+班
+2002
+亦
+爱情
+地理
+股份
+具体
+小时
+孙
+公共
+这里
+英文
+博士
+技能
+胡
+结果
+含
+27
+小学
+安
+全村
+坚持
+深圳
+起来
+建
+评价
+注意
+海
+人类
+件
+法规
+足球
+超
+基地
+性质
+理事
+分布
+太
+发布
+取得
+状态
+手机
+全部
+学会
+财政
+道路
+球
+批准
+完善
+详情
+湖南
+自治
+重大
+类别
+特殊
+费用
+击
+制定
+广州
+地址
+情
+营销
+80
+田
+十二
+朱
+很多
+'
+易
+毛
+人生
+大会
+民
+徐
+香港
+许多
+协调
+罗
+C
+农
+规模
+教材
+热
+年度
+定价
+施工
+创造
+是否
+人均
+主管
+广告
+符合
+细胞
+说明
+需求
+照
+日起
+民间
+造成
+如下
+平
+厘米
+教
+黑
+努力
+力量
+公
+接受
+新闻
+座
+超过
+平均
+其它
+常
+传
+改
+先后
+约定
+の
+29
+左右
+诗
+继续
+只是
+今
+解释
+风格
+物业
+故
+意见
+经历
+运输
+诉
+赔偿
+调整
+文明
+开本
+全球
+无法
+主持
+义务
+干
+完全
+当时
+2001
+工具
+公路
+推荐
+训练
+统一
+℃
+请求
+稳定
+寺
+财产
+具
+评
+简单
+速度
+良好
+死
+事件
+先生
+画
+感
+希望
+独立
+地点
+党员
+比赛
+提升
+自由
+会计
+大型
+观
+未来
+多种
+杯
+统计
+措施
+农户
+优势
+表示
+革命
+江
+了解
+每个
+然后
+校
+财务
+团队
+语
+色
+街
+意义
+32
+钱
+英国
+200
+正式
+记
+放
+歌
+苏
+药
+大量
+推进
+各类
+既
+注册
+尽
+足
+贸易
+事实
+电
+库
+以来
+检测
+利息
+报
+客房
+别
+重庆
+份
+31
+定
+战争
+保持
+数量
+改变
+星
+背景
+十一
+题
+出来
+虽然
+值
+权责
+读者
+德
+学科
+页
+一体
+套
+编辑
+啊
+款
+多年
+台湾
+疾病
+干部
+呢
+办
+某
+留
+充分
+气
+Android
+赛
+快
+声
+特性
+状
+群
+晋江
+战
+建议
+地产
+试验
+天下
+快速
+机制
+受到
+宽
+目的
+汉族
+长期
+曲
+记录
+除
+予以
+现有
+」
+增长
+「
+湖北
+族
+to
+叫
+南京
+信
+原因
+早
+营养
+香
+天津
+大陆
+视频
+华
+进一步
+火
+及时
+院校
+_
+员工
+概念
+临床
+推广
+温度
+研发
+包装
+安徽
+B
+合理
+硕士
+容量
+不仅
+阶段
+数字
+中共
+一名
+盐
+高等
+律师
+成长
+连
+you
+竞争
+公园
+怎么
+い
+更多
+分子
+群众
+依照
+大家
+金属
+等级
+形象
+测试
+接
+反应
+难
+当事
+打造
+mm
+桥
+极
+风险
+树
+协议
+并且
+维护
+45
+物质
+德国
+读
+武
+岛
+儿
+攻击
+战斗
+师范
+鉴定
+学位
+500
+智能
+端
+供
+电源
+办理
+主席
+固定
+欧洲
+法国
+反
+方便
+绿色
+鱼
+耕地
+板
+十五
+那么
+校园
+房屋
+村民
+物理
+不过
+品种
+成就
+收
+广泛
+河北
+飞
+即可
+源
+转
+少年
+志
+汤
+70
+对象
+最终
+享受
+描述
+仅
+看到
+夏
+玩家
+毫米
+东西
+经典
+真实
+夜
+世
+言
+福建
+准备
+人数
+角
+取
+些
+电脑
+以后
+酒
+引起
+开放
+公安
+成本
+内部
+基
+质
+实行
+in
+一年
+详细
+1999
+小组
+The
+数学
+整个
+乐
+住房
+立
+令
+联系
+汉
+创建
+服
+受理
+许
+目
+生长
+原理
+职务
+持续
+梁
+中医
+云南
+35
+双方
+甚至
+高速
+合
+随着
+求
+木
+上述
+主演
+床
+边
+通知
+工
+土
+粉
+幸福
+十分
+官
+才能
+满足
+荣誉
+优质
+郑
+茶
+现任
+实用
+来自
+正常
+短
+十三
+情节
+职工
+严重
+派
+贷款
+事务
+广场
+铁路
+1998
+认识
+党委
+亿元
+传播
+亚
+休闲
+应该
+患者
+从而
+互联
+依据
+职责
+适合
+青
+因素
+信用
+广大
+女人
+事项
+穿越
+性别
+保存
+掌握
+武汉
+仙
+身体
+真正
+离
+损失
+‘
+多次
+现场
+评估
+全市
+宫
+至今
+据
+创业
+经理
+颜色
+素质
+明显
+意
+日发
+作家
+仍
+远
+玉
+所在
+更加
+湖
+满
+郭
+来源
+血
+输入
+娱乐
+装饰
+集体
+菜
+资格
+跟
+认定
+杂志
+纯
+久
+举办
+老师
+为什么
+症
+俱乐部
+概述
+品质
+程度
+居住
+贡献
+平装
+炒
+环保
+文献
+美丽
+300
+构成
+配套
+特
+后来
+成绩
+都市
+球队
+君
+讲
+效力
+许可
+1000
+权利
+书法
+规格
+本科
+块
+孔
+江西
+表面
+韩
+I
+常务
+幼儿
+抗
+利
+英雄
+资本
+节目
+编制
+步
+课
+策划
+示范
+运用
+包
+集中
+电池
+连续
+深入
+认证
+课题
+摄影
+交
+问
+杭州
+怎样
+设
+发挥
+太阳
+带来
+作出
+身份
+办学
+时尚
+文字
+就业
+通信
+装帧
+减少
+素
+广西
+高度
+污染
+动
+局长
+屏幕
+车辆
+装备
+学历
+居民
+杜
+选
+剑
+氏
+车程
+丝
+突出
+模型
+主体
+峰
+重量
+物流
+女性
+杀
+落实
+条例
+运营
+拍摄
+陕西
+相应
+哲学
+推动
+1997
+导致
+送
+成都
+听
+机场
+正在
+针对
+指南
+魔
+独特
+高中
+简称
+森林
+住
+三章
+合法
+占地
+良
+轴承
+适应
+连接
+蛋白
+恢复
+广
+山西
+古代
+感觉
+容易
+庄
+秘书
+组合
+相对
+整体
+原文
+采取
+那些
+意识
+拉
+称号
+根
+左
+姜
+做好
+严格
+它们
+审计
+压力
+行动
+困难
+维修
+须
+治
+神经
+气候
+营
+快乐
+解放
+于是
+术
+当地
+展示
+现实
+青春
+安排
+明确
+也有
+果
+视
+设立
+铁
+进步
+兼
+体验
+期限
+曾经
+练习
+队伍
+屏
+状况
+死亡
+秦
+思考
+父亲
+支
+配置
+丛书
+确认
+雷
+●
+一天
+随
+冠军
+每年
+1996
+传说
+计
+塔
+呈
+牌
+检验
+な
+重新
+键
+降低
+知名
+秒
+二十
+曹
+兵
+雪
+街道
+千
+输出
+关注
+斯
+收录
+如此
+错
+省级
+当代
+药物
+年底
+总部
+追求
+g
+高校
+灵
+堂
+原料
+技巧
+日常
+十四
+春
+各项
+东北
+仪
+能源
+盘
+十六
+色彩
+皆
+动画
+犯罪
+审查
+眼
+韩国
+波
+债务
+投入
+景观
+动力
+现象
+购买
+增强
+找
+英
+男人
+存储
+司法
+广播
+众
+打车
+中小
+季
+档案
+策略
+定位
+草
+止
+国务院
+优化
+航空
+倍
+形态
+工资
+纪念
+基于
+找到
+穿
+测量
+内存
+演唱
+编号
+→
+距
+锅
+皮肤
+丁
+出场
+其实
+十八
+给予
+理解
+唯一
+改造
+防治
+司
+雨
+录
+自身
+肉
+宝
+图片
+对方
+ー
+东方
+记忆
+m
+过去
+型号
+阿
+那个
+1995
+魔法
+另外
+民主
+监管
+更新
+荣获
+机会
+探索
+宗
+展
+港
+唱
+币
+驾驶
+装置
+曰
+S
+地位
+百
+常用
+轻
+吨
+地球
+直
+女孩
+再次
+芯片
+空
+电力
+M
+思
+自我
+配合
+博物
+鲁
+吉林
+朝
+通常
+改善
+常见
+放入
+て
+年龄
+游
+属性
+结束
+动态
+只要
+谢
+格式
+归
+末
+机动
+联赛
+养殖
+林业
+部队
+公交
+流行
+服装
+诊断
+签订
+食
+各地
+差
+齐
+宠物
+产权
+白色
+危险
+修改
+全县
+炎
+信号
+尚
+人大
+涉及
+天然
+风景
+指标
+举行
+欲
+审批
+比例
+贯彻
+年轻
+一生
+规则
+股
+除了
+T
+借贷
+收藏
+推出
+护理
+面对
+文艺
+文物
+36
+武器
+会长
+和谐
+强大
+莫
+演出
+模拟
+灯
+甲
+D
+离开
+单元
+空气
+小区
+た
+利益
+动作
+经常
+复杂
+宜
+池
+专利
+体现
+否
+智慧
+温
+校长
+著作
+落
+表演
+身高
+剧情
+海拔
+绝
+永远
+冷
+步行
+著
+担保
+皮
+定义
+sup
+挑战
+供应
+军事
+神秘
+三年
+联
+纸
+液
+充满
+编写
+歌手
+往
+西安
+言情
+相当
+团体
+完美
+鬼
+指挥
+来说
+博
+蓝
+餐馆
+文章
+案由
+宁
+修
+刀
+调
+效率
+意大利
+反映
+董事
+地下
+相互
+味
+迅速
+公主
+学者
+1994
+义
+牛
+西南
+复
+作物
+考核
+播放
+核
+适量
+主角
+在线
+父母
+每天
+阳
+藏
+海洋
+致力
+1993
+指定
+地质
+待
+变成
+人体
+陆
+元素
+敌人
+松
+尤其
+解
+联盟
+监测
+梅
+突破
+具备
+只能
+寻找
+编
+用品
+生涯
+漫画
+下来
+居
+亿
+母亲
+化工
+歌词
+独
+分配
+喜
+理由
+序
+正确
+保修
+种类
+出口
+红色
+村委
+上市
+用途
+利率
+装
+上映
+载
+33
+就会
+制品
+府
+成熟
+然而
+可是
+证书
+等等
+事情
+市委
+集成
+建成
+1992
+圣
+标志
+〇
+切
+川
+坐
+精
+委
+s
+审结
+手术
+理想
+结
+兄弟
+笔
+公元
+罪
+旧
+对外
+谷
+公益
+电路
+较大
+西北
+宗旨
+岩
+平衡
+限制
+大力
+辽宁
+做法
+逐渐
+众多
+调节
+药品
+欢迎
+秘密
+共产党
+承包
+册
+决策
+典型
+加床
+帝
+主编
+治理
+考虑
+社会主义
+扮演
+星级
+周边
+大赛
+n
+同意
+玻璃
+贵州
+压
+命运
+循环
+处罚
+软
+预防
+实力
+功率
+授权
+来到
+梦想
+强化
+V
+适宜
+列
+确保
+女子
+讲述
+命
+善
+静
+零
+共享
+周围
+商事
+营业
+扩大
+阳光
+写作
+保健
+生效
+主张
+黑色
+学名
+提交
+协助
+胜
+相同
+空调
+完整
+盛
+专门
+民法
+国有
+绿化
+伤害
+手段
+环
+箱
+流程
+体重
+缘
+收费
+各级
+沈
+防
+巨大
+L
+影视
+生育
+认真
+避免
+音
+覆盖
+而是
+今天
+房型
+尾
+加快
+花园
+体制
+38
+声音
+此外
+布
+に
+圈
+高效
+终于
+必要
+执法
+吸收
+糖
+告诉
+影片
+名字
+中级
+餐厅
+证券
+病毒
+人力
+封
+39
+笑
+150
+公斤
+接触
+魏
+破
+场所
+一七
+终结
+升级
+奖励
+违反
+道德
+机器
+骨
+水利
+永
+别人
+公寓
+附录
+苏州
+创办
+玩
+习惯
+望
+表达
+兰
+新型
+架
+秋
+旗
+汉语
+人文
+原来
+施
+48
+家族
+整合
+产
+审核
+注
+洞
+景
+右
+予
+乃
+乡村
+相信
+买卖
+诸
+青岛
+一项
+下列
+同志
+部长
+授予
+室内
+多数
+思维
+通讯
+简体
+附
+400
+|
+る
+始终
+负担
+办事
+°
+34
+辖
+违法
+角度
+获奖
+高新
+国土
+引导
+两种
+夫
+观察
+条款
+x1
+重视
+令人
+し
+1990
+准
+拼音
+细
+沟
+主动
+舞
+买
+住宅
+医师
+镜
+飞机
+倒
+纤维
+刑事
+以为
+自主
+E
+感染
+图像
+设定
+K
+党组
+烧
+一致
+若干
+宗教
+破坏
+虽
+启动
+传奇
+职
+三十
+换
+额
+USB
+功
+邓
+专项
+兴
+顾
+内置
+院长
+分局
+节能
+十七
+膜
+哪
+钢
+戴
+靠
+导师
+频率
+柳
+侧
+遇到
+首先
+活
+排
+沙
+园区
+损害
+火车
+论坛
+配
+兽
+秀
+灰
+ン
+多少
+120
+读书
+预算
+菌
+轻松
+X
+少女
+景点
+大师
+玉米
+税
+入住
+立即
+觉得
+背
+感受
+食物
+必
+效益
+基层
+溪
+创意
+线路
+理
+斗
+银
+愿
+厚
+散
+同样
+回到
+翻译
+飞行
+总结
+租赁
+照片
+排名
+负
+人士
+康
+葡萄
+同学
+书画
+生存
+钟
+盖
+频道
+地震
+记者
+收益
+佛
+画面
+37
+专用
+底
+姓名
+紫
+停车
+蔡
+冯
+记载
+鸡
+十九
+总数
+异
+饰
+展览
+似
+超级
+诗人
+准确
+将军
+人事
+定期
+起诉
+报道
+酸
+绿
+性格
+取消
+刚
+42
+料
+编剧
+妈妈
+鼠标
+城乡
+流动
+监察
+创
+范
+做出
+优
+粮食
+允许
+黑龙江
+混合
+1991
+洪
+井
+很大
+仪器
+境内
+吉
+召开
+收集
+1988
+链
+奇
+购物
+吕
+成分
+石油
+顾问
+撤诉
+伤
+凡
+拿
+程
+外国
+显卡
+而成
+例
+城镇
+几乎
+轮
+故障
+房间
+注重
+传媒
+职能
+毒
+一批
+危机
+升
+黄金
+首发
+防止
+英寸
+能量
+所谓
+55
+影
+例如
+CPU
+西方
+土壤
+医生
+血管
+检察
+75
+纲
+引进
+兴趣
+最近
+新疆
+国外
+★
+日内
+儿子
+―
+社团
+讨论
+刻
+险
+语种
+装修
+强制
+圆
+J
+包含
+沟通
+大道
+人间
+算
+一座
+精品
+都会
+采购
+应急
+总体
+作业
+代码
+皇帝
+帮
+全新
+隶属
+船
+乡镇
+实例
+湾
+依
+乱
+电压
+蔬菜
+辅助
+规律
+餐饮
+生于
+袁
+1989
+顶
+敢
+主人
+鼓励
+i
+一等奖
+800
+高人
+W
+融合
+保
+意思
+摄像
+补
+脚
+越来
+设有
+侯
+蒋
+公布
+然
+俄罗斯
+交换
+N
+吸引
+种子
+处于
+当然
+VIP
+下载
+逐步
+甘肃
+食用
+轴
+助
+中山
+冰
+诚信
+含量
+学报
+H
+1984
+趋势
+音频
+三大
+魂
+首次
+完
+布局
+感情
+烟
+印刷
+亚洲
+平面
+か
+旅行
+印度
+实
+1985
+症状
+顾客
+展开
+煮
+合格
+桌面
+食材
+造型
+这么
+同年
+硬盘
+般
+电动
+转移
+残疾
+权益
+层次
+专题
+天地
+鸟
+拉丁
+と
+厦门
+传输
+庙
+全省
+宝宝
+1987
+西部
+虎
+构建
+打开
+西班牙
+艾
+附近
+他人
+amp
+中间
+货物
+突然
+你们
+句
+税务
+羽
+送达
+情感
+致
+1986
+黄色
+物品
+』
+『
+粒
+身边
+孟
+眼睛
+扩展
+景区
+呼吸
+母
+美容
+个性
+民国
+41
+客
+t
+胜利
+携带
+老年
+县级
+做到
+国民
+在此
+谈
+复合
+概况
+合成
+二十一
+失去
+吹
+使得
+绝对
+以前
+面向
+my
+冲
+戏
+Windows
+期刊
+卫星
+驻
+一审
+董
+方针
+函数
+宇宙
+亮
+临
+障碍
+美元
+考
+地图
+园林
+降
+当年
+互动
+强度
+痛
+咖啡
+成人
+观众
+直到
+P
+卫
+女儿
+潘
+士
+东南
+帝国
+智
+64
+怕
+从此
+政协
+浅
+停
+答案
+终
+根本
+争议
+号码
+对手
+围绕
+现状
+耳
+评论
+肿瘤
+600
+查询
+思路
+妇女
+脑
+融资
+鲜
+提
+旅
+演
+分辨率
+京
+人工
+婴儿
+65
+基因
+美食
+五年
+遇
+医药
+现已
+野
+近年
+学家
+心中
+放弃
+少数
+美好
+耐
+添加
+配有
+意外
+命名
+赴
+得利
+郡
+外观
+务工
+早期
+商标
+格
+长沙
+转换
+竹
+分离
+卢
+车站
+邀请
+登
+46
+郑州
+青少年
+\
+栽培
+试
+均匀
+牺牲
+害
+寒
+沈阳
+想要
+原创
+闻
+两人
+亲
+在于
+渠道
+违约
+置
+分享
+较高
+预测
+经费
+发射
+43
+G
+顺
+kg
+城区
+大连
+收取
+年间
+可靠
+翻
+养
+cm
+往往
+拟
+态度
+天使
+全书
+精彩
+可爱
+显著
+羊
+即使
+塑料
+爱好
+访问
+满意
+热带
+手续
+51
+郎
+征
+贵
+纪
+R
+税收
+地面
+作曲
+F
+整理
+别名
+跨
+模块
+书籍
+之外
+便利
+制片
+岭
+弹
+95
+县委
+放在
+晚
+横
+─
+猪
+到庭
+印
+补充
+消防
+作战
+苦
+申报
+免疫
+海南
+巴
+长江
+同步
+内蒙古
+发动机
+二十四
+尔
+军队
+猫
+一只
+宁波
+44
+识别
+GB
+获取
+阴
+外科
+继承
+哪些
+明星
+尊
+两次
+群体
+56
+材质
+改进
+提示
+界面
+认可
+难以
+电气
+舞蹈
+走向
+打印
+繁殖
+庭
+产地
+异议
+3000
+到达
+和平
+出处
+年级
+皇
+舞台
+南北
+脸
+出具
+回忆
+监控
+抓
+二级
+彭
+发明
+途径
+本身
+x
+转让
+内外
+畜牧
+饮食
+战士
+岗位
+周期
+52
+家具
+data
+研制
+灵魂
+境
+萝卜
+到底
+引
+を
+生日
+后卫
+85
+衣
+错误
+仁
+心灵
+恋
+守
+虫
+补偿
+立案
+效应
+变得
+礼
+う
+小型
+尖
+无限
+学士
+控股
+停止
+进口
+赛季
+驱动
+此时
+男子
+适当
+趣
+益
+婚姻
+上诉
+圆形
+志愿
+里面
+正文
+福
+习题
+仲裁
+合金
+省委
+搜索
+晋
+硬
+功效
+哦
+株
+职员
+投
+冒险
+lemmaid
+医
+丹
+长度
+步骤
+纹
+含有
+掉
+聚
+泉
+通道
+键盘
+遗址
+主办
+产量
+清华
+承办
+助理
+丰
+如今
+老人
+一人
+刺激
+路线
+失
+1983
+整
+鸡蛋
+存
+三角
+万人
+贾
+逻辑
+很好
+幅
+亭
+高考
+养老
+始
+考察
+九章
+推
+陶
+比如
+98
+①
+名人
+1982
+团结
+硬件
+恨
+ら
+脉
+失败
+楚
+利于
+视觉
+密
+承诺
+辅导
+辐射
+氧化
+理工
+清单
+も
+法人
+加大
+变更
+像素
+友
+江湖
+伙伴
+熟
+47
+股票
+喜爱
+个体
+运
+货币
+is
+配件
+for
+忘
+荣
+当前
+辆
+暗
+っ
+一支
+化妆
+は
+支部
+进出
+象
+二十二
+给付
+侠
+で
+富
+绘画
+病人
+领先
+观念
+上级
+也许
+合并
+熊
+it
+肝
+本金
+擅长
+虚
+呈现
+消息
+饰演
+初中
+级别
+图形
+挂
+够
+教程
+专
+数码
+清晰
+严
+法学
+规章
+欧
+王子
+奥运
+伊
+发出
+类似
+法制
+密度
+人群
+诗词
+修复
+萧
+颜
+mobile
+出发
+气温
+粗
+佛教
+大战
+原始
+简
+黑暗
+百姓
+榜
+偿还
+纺织
+墨
+职称
+借
+容
+拓展
+日报
+炉
+轨道
+笔记
+针
+美女
+戏剧
+本人
+备案
+怀
+3D
+完结
+洋
+邮政
+逾期
+铜
+议
+感到
+枪
+就读
+外出
+环节
+健全
+进攻
+全体
+山区
+三国
+赢得
+维生素
+99
+魅力
+臣
+临时
+宇
+谓
+持
+异常
+灭
+市级
+②
+进程
+汁
+领
+编码
+淡
+崔
+可见
+哈尔滨
+O
+稍
+值得
+肾
+消除
+语文
+顺利
+流量
+权威
+净
+水泥
+立体
+发电
+壁纸
+展现
+塘
+两年
+强调
+验收
+仍然
+肺
+58
+卓
+传感
+气象
+零售
+く
+下降
+采
+玄幻
+5000
+禁止
+通用
+简易
+壁
+警察
+墙
+公民
+消失
+常委会
+れ
+现金
+差异
+é
+退
+电流
+免
+债权
+发育
+沿
+泪
+每次
+拍
+清楚
+碑
+碳
+49
+が
+舒适
+济南
+ま
+选举
+矛盾
+恐怖
+肖
+妖
+大众
+on
+上下
+部位
+开拓
+自行
+培育
+赤
+海军
+番
+部件
+怪
+伟大
+判断
+汪
+金额
+薄
+电器
+金钱
+狗
+玄
+略
+工人
+竞赛
+林地
+斗争
+转变
+复印
+财富
+符
+词汇
+公告
+二十三
+洗净
+保留
+后期
+强烈
+常委
+燕
+观点
+锁
+000
+遂
+殿
+她们
+三人
+享有
+预订
+54
+奈
+拒绝
+入选
+虚拟
+离子
+加热
+课堂
+250
+已有
+译
+历程
+支行
+姚
+180
+me
+用量
+防护
+等待
+家长
+公众
+抗日
+地铁
+边缘
+随时
+直径
+依靠
+り
+海外
+救
+带领
+千年
+地处
+二十五
+手册
+命令
+菜单
+某些
+於
+维
+新建
+效
+连锁
+祖
+罗马
+53
+男性
+副本
+总裁
+付
+垃圾
+乘
+艺
+尽管
+豆腐
+招商
+遗产
+调研
+精灵
+这次
+建造
+关节
+内心
+酶
+大队
+清洁
+流通
+出品
+柱
+褐色
+危害
+加拿大
+1980
+文书
+转化
+湿
+登场
+混凝土
+就像
+贺
+二年
+上面
+狼
+情绪
+各自
+动漫
+依然
+大大
+缺乏
+那里
+出租
+彩色
+显
+豪华
+场景
+重生
+还要
+小结
+县城
+画家
+配备
+区别
+似乎
+凤
+实训
+要点
+农田
+理财
+父
+诗歌
+积累
+苹果
+侵权
+山水
+浏览
+甘
+周年
+凤凰
+章程
+组建
+利润
+深度
+传承
+方位
+测定
+拟订
+解析
+抑制
+防御
+罚款
+健身
+工厂
+因而
+凭
+跳
+墓
+纵横
+甚
+时光
+犯
+17K
+脱
+激光
+普遍
+用来
+攻
+师生
+干燥
+结婚
+勇
+脂肪
+细节
+预
+接收
+一方面
+十大
+信仰
+蛋
+外形
+72
+散文
+屋
+大厦
+热情
+人为
+珠
+纽约
+回来
+日向
+三月
+ス
+近代
+逝世
+南方
+姓
+深刻
+词语
+网页
+指数
+秩序
+伟
+贴
+学员
+蓝色
+ISBN
+邪
+苏联
+寻
+神奇
+跑
+评审
+正当
+灵活
+江南
+至少
+用地
+探讨
+66
+雄
+资质
+妻子
+书面
+精心
+前锋
+枚
+乐队
+110
+古典
+前往
+等人
+奥
+灾害
+抵押
+工会
+释义
+1978
+DVD
+首届
+截止
+骨干
+温暖
+迁
+折
+面前
+时空
+生理
+养生
+急
+解除
+平安
+专科
+价
+纳
+争
+62
+想象
+阵
+假
+柄
+商贸
+活性
+吾
+总统
+57
+气体
+那样
+新区
+支撑
+啦
+铝
+而言
+仅仅
+住院
+生成
+ん
+度假
+优良
+血液
+陶瓷
+洗
+为人
+创立
+野生
+把握
+LED
+征收
+17k
+夹
+批发
+每日
+下去
+冠
+き
+要素
+政务
+日子
+熟悉
+乙
+北方
+〔
+相比
+开设
+招
+ル
+消化
+力度
+冬
+在校
+59
+泥
+优美
+内涵
+〕
+名单
+留学
+想到
+运作
+中场
+枝
+权力
+疯狂
+分管
+多项
+息
+开关
+彻底
+栏目
+用法
+控
+1981
+61
+二十六
+由此
+凌
+录音
+卓越
+骑士
+MB
+前后
+导弹
+齐全
+小姐
+刺
+损伤
+神话
+矿
+教练
+教研
+诞生
+泡
+68
+优先
+辖区
+发达
+温州
+抱
+敌
+蒙
+天空
+接近
+中学生
+管道
+澳大利亚
+增
+イ
+追
+东莞
+蒙古
+球员
+就能
+测
+巴黎
+巴西
+肯定
+超市
+生动
+容纳
+浪漫
+遵守
+男孩
+堡
+财经
+凭借
+依托
+水库
+竟
+阁
+服饰
+烤
+慢慢
+回家
+将会
+随后
+愿意
+こ
+誉
+南部
+朝鲜
+独资
+洲
+煎
+统治
+管辖
+茎
+显存
+劳务
+精选
+土木
+铺
+指出
+葱
+加速
+・
+市区
+表明
+网卡
+寨
+翼
+框架
+辛
+陪审
+看着
+水果
+硬化
+宣布
+乔
+支出
+贤
+恩
+拳
+苗
+合肥
+退休
+齿
+进球
+国防
+附件
+You
+官方
+构造
+心里
+1979
+中等
+政
+诚
+专著
+对其
+西藏
+四大
+凉
+压缩
+贝
+采集
+民政
+随机
+即将
+卖
+掌
+房产
+镜头
+96
+马克思
+h
+一带
+考生
+健
+瑞
+泰
+党校
+应对
+宏
+63
+希腊
+关联
+受伤
+惠
+房价
+别墅
+欣赏
+高温
+风云
+御
+益智
+以外
+战役
+路面
+坏
+厂商
+打击
+栋
+额外
+快捷
+二十八
+百年
+关心
+your
+88
+多家
+对话
+灌溉
+适
+么
+船舶
+教堂
+附加
+实业
+举
+面临
+少许
+PC
+统筹
+客观
+力学
+发动
+念
+懂
+血压
+时刻
+夫人
+泽
+旋转
+开启
+透明
+泵
+恶
+惊
+调解
+各国
+辑
+焦
+特定
+酱
+备
+势
+一句
+四十五
+浓度
+初步
+家人
+冲击
+优点
+却是
+高层
+家居
+接待
+章节
+北部
+划分
+模
+建材
+形状
+蒸
+很快
+辉煌
+对照
+教室
+超越
+承
+招生
+多样
+坐落
+丸
+中外
+还款
+售后
+资讯
+舍
+隆
+索
+温柔
+计量
+极大
+付款
+形势
+练
+宝贝
+杰
+发放
+颁发
+普
+220
+纪录
+91
+还在
+机电
+晓
+汇
+完毕
+这本
+全区
+Intel
+移
+研
+姑娘
+识
+下面
+MHz
+森
+夫妻
+域
+搅拌
+二十七
+67
+〉
+优惠
+水稻
+模具
+薛
+贫困
+历任
+大气
+长春
+桂
+偏
+○
+欠
+旁
+燃烧
+试题
+谭
+签约
+使命
+复议
+昌
+刊物
+射击
+哪里
+补助
+传递
+进展
+庆
+春秋
+坑
+还会
+l
+对比
+86
+队长
+人性
+绩效
+下午
+残
+批评
+〈
+76
+唱片
+with
+茶叶
+普及
+仪式
+造
+水电
+电信
+不管
+160
+英格兰
+富有
+选用
+可用
+异界
+幻想
+明白
+客运
+东京
+洛阳
+作文
+好评
+倒入
+天文
+产值
+照明
+平米
+阀
+炮
+1958
+琴
+舰
+这时
+得以
+红袖
+译者
+司令
+240
+慢性
+年后
+ISO
+犬
+1500
+共计
+票
+过滤
+查看
+78
+创始
+作词
+详
+荷兰
+黄河
+行星
+低于
+澳
+痛苦
+爱国
+一颗
+情形
+喝
+纳入
+恶魔
+守护
+藤
+福州
+准许
+纳税
+辣椒
+撰写
+密码
+切实
+篮球
+寿命
+事物
+大地
+蛇
+公务员
+爆炸
+扫描
+二十九
+卵
+洛
+厚度
+ラ
+变形
+心脏
+航天
+奖项
+妻
+71
+电机
+相似
+直属
+融
+市政
+专注
+反对
+草原
+遗传
+今年
+添香
+维持
+码
+清洗
+四十四
+胶
+生平
+雅
+á
+饭
+童话
+忠
+局部
+非法
+玫瑰
+季节
+远程
+疏
+撤销
+释放
+福利
+清代
+提取
+漂亮
+竞技
+81
+立法
+颗
+扬
+象征
+岁月
+难度
+92
+密切
+露
+视角
+丽
+哥哥
+恒
+130
+跟踪
+游泳
+杰出
+履历
+男女
+题材
+拒
+口味
+裂
+坦克
+五十
+资助
+激励
+初级
+籍
+宏观
+尊重
+童
+说话
+客人
+实体
+公顷
+上升
+觉
+继
+播出
+常规
+干净
+味道
+97
+查
+活力
+中药
+师资
+为此
+肥
+账户
+安置
+挖掘
+加盟
+语音
+股东
+隐
+代谢
+结算
+辉
+摩托
+爆发
+弱
+嘉
+壳
+互相
+背后
+从业
+岗
+呀
+固
+82
+两侧
+决赛
+彩
+d
+期待
+芳
+一事
+提名
+交付
+减
+妃
+特种
+清水
+钢琴
+试点
+四十
+麦
+注释
+太子
+降水
+爸爸
+住宿
+曲线
+大专
+营造
+朝阳
+赛事
+导航
+重复
+奋斗
+新加坡
+ト
+讲解
+月份
+际
+二月
+约翰
+外部
+带有
+处置
+渐
+礼仪
+国王
+隔
+韵
+1949
+精英
+触摸
+原子
+究竟
+麻
+光明
+瞬间
+中部
+拍卖
+瓶
+隐藏
+千米
+矣
+责令
+指示
+达成
+傅
+看见
+威
+提前
+较好
+③
+密封
+三级
+而来
+love
+发票
+外交
+操
+参
+绘制
+73
+饮用
+润
+一月
+一边
+醉
+就要
+修订
+实时
+昆明
+加以
+复习
+淀粉
+窗口
+深受
+竟然
+饮
+排水
+�
+冲突
+舒
+组长
+女生
+误
+场地
+实习
+研讨
+外语
+哭
+注射
+民营
+解读
+天气
+影像
+对应
+流域
+三种
+伯
+成年
+だ
+提起
+处分
+更名
+治安
+插槽
+缴纳
+主治
+文本
+消耗
+93
+太多
+◆
+公平
+淋浴
+招聘
+核算
+尹
+孝
+速
+附属
+心情
+未知
+国民党
+码头
+大约
+短信
+赋
+电磁
+标
+劳
+堆
+限
+认知
+债券
+液晶
+一篇
+泽东
+桥梁
+宾馆
+图案
+处长
+回归
+招标
+69
+树立
+收购
+斜
+业绩
+牛肉
+盒
+转型
+原名
+皇家
+存款
+出席
+厨房
+更换
+by
+初期
+游客
+延伸
+U
+耳机
+迷
+一旦
+敏
+网游
+高压
+慢
+77
+伦敦
+深化
+疑
+聪明
+十日
+尘
+不锈钢
+83
+开庭
+书院
+废
+恋爱
+名家
+冷却
+胸
+证实
+含义
+空中
+饲料
+无锡
+发病
+あ
+新鲜
+精度
+发起
+前言
+常常
+透
+首都
+搭配
+俊
+葛
+便捷
+保全
+并非
+弄
+溶液
+制成
+分工
+今后
+本质
+宋代
+护
+告
+讲师
+袋
+俱
+踏
+引发
+杆
+去世
+哈
+结论
+仲
+股权
+二人
+逆
+武装
+线性
+名词
+东部
+腹
+大奖
+观测
+平凡
+供电
+颗粒
+喜剧
+势力
+考古
+混
+84
+氛围
+涵盖
+实战
+坡
+丈夫
+长安
+情报
+扣
+总局
+未能
+符号
+重建
+业主
+叶片
+煤
+74
+忍
+好友
+And
+三十一
+便于
+前进
+手法
+编程
+秉承
+韦
+各界
+狂
+小小
+介质
+还能
+向上
+字数
+托
+网友
+祥
+主机
+用药
+列入
+物资
+牙
+积
+宣
+太平
+生死
+煤炭
+充电
+DDR
+结局
+会同
+黎
+感动
+古老
+修建
+瓦
+㎡
+爆
+楼盘
+岳
+87
+否则
+召唤
+晚上
+主导
+纵
+撤回
+201
+这部
+付费
+光学
+老板
+妹妹
+ng
+原本
+垂直
+容积
+尼
+六十
+元年
+禁
+阶
+萌
+邦
+材
+市民
+分解
+鼓
+质保
+竣工
+星座
+上网
+私
+中原
+奉
+彼此
+89
+缺
+肌
+佐
+终端
+插
+育
+协商
+威胁
+响应
+围
+救助
+看看
+液体
+延长
+皇后
+钙
+湿度
+终止
+编曲
+急性
+橡胶
+洗衣
+蒜
+甜
+锦
+启
+怪兽
+リ
+谜
+明代
+2018
+邮件
+全身
+新增
+唐代
+任职
+百科
+陪
+巨
+阶级
+非洲
+à
+倾
+大于
+审定
+热爱
+返还
+内设
+分泌
+滑
+书城
+端口
+经纪
+鼎
+凭证
+穗
+反复
+引擎
+算法
+动脉
+青海
+干扰
+涂
+疗法
+增值
+战场
+c
+切成
+累计
+1200
+す
+质证
+投影
+顺序
+这位
+限额
+盆
+128
+寂寞
+爱心
+祖国
+新华
+79
+视野
+鼻
+批
+弟子
+带动
+阐述
+今日
+豆
+三包
+递交
+部署
+走进
+交叉
+消
+忘记
+哥
+激情
+虾
+霍
+标题
+亲自
+屈
+记得
+–
+博览
+数控
+南昌
+返回
+兮
+印象
+报刊
+杂
+真相
+林果
+前提
+行驶
+冰箱
+卒
+}
+王朝
+轩
+积分
+94
+矿产
+杀手
+查明
+碗
+¥
+尝试
+进化
+诉状
+按摩
+迟延
+兼容
+衣服
+龙头
+肌肤
+技
+评选
+CD
+听到
+减半
+定制
+审议
+美人
+校区
+乎
+幽默
+科长
+that
+さ
+商场
+碎
+观赏
+施行
+立足
+刑
+监
+体内
+精美
+科普
+紧密
+此次
+温泉
+径
+执业
+体裁
+执
+康复
+坝
+人格
+兰州
+乾隆
+电台
+地板
+商城
+偿
+剩
+败
+自愿
+胃
+可达
+ア
+长大
+柯
+归还
+果实
+け
+蛋糕
+红军
+市长
+1956
+风机
+移民
+大批
+贵族
+司马
+见到
+槽
+出演
+奉献
+起义
+当中
+改编
+骑
+1963
+千万
+尤
+连带
+潮
+太原
+浩
+高清
+坊
+光驱
+年末
+焊接
+出自
+仪表
+走出
+指令
+よ
+椭圆
+阴谋
+承认
+输送
+愈
+宝石
+器材
+华东
+分支
+微笑
+王国
+振
+情人
+架空
+申
+times
+穆
+p
+新生
+逃
+始建
+±
+疼痛
+在职
+遵循
+半年
+尚未
+大桥
+道具
+查处
+斩
+常年
+足够
+鞋
+姐姐
+4000
+排放
+排列
+前期
+弃
+方言
+弹性
+ㄧ
+大街
+伏
+输
+引领
+从小
+选手
+穴
+液压
+队员
+be
+染
+电缆
+消灭
+述
+砂
+尽量
+演奏
+101
+布置
+系数
+核桃
+天才
+炸
+钢铁
+卿
+姬
+当选
+日记
+棒
+
+砖
+小心
+1976
+额定
+浴室
+相遇
+机身
+孤独
+任免
+政法
+1964
+持有
+粥
+摆
+唯
+三者
+开花
+净化
+后记
+渡
+再生
+授
+巧
+水晶
+鸿
+莲
+锂
+饭店
+决
+除外
+续
+云中
+导向
+一期
+军区
+中西
+草案
+两大
+卫视
+子女
+争取
+分会
+户外
+流传
+太平洋
+以其
+遭遇
+1970
+杀人
+三十六
+或许
+任教
+瑞士
+种种
+信任
+路口
+搞
+海上
+宁夏
+芯
+导
+通电
+奇幻
+复制
+诸多
+此后
+纪委
+紧张
+谱
+细菌
+家园
+放大
+商会
+节奏
+供水
+手中
+陵
+暖
+酒吧
+IT
+规
+桑
+主流
+天堂
+冬季
+注音
+查封
+备用
+坪
+公社
+村庄
+剂量
+美洲
+希
+人心
+出血
+参照
+咸
+锻炼
+体积
+起草
+板块
+牧
+所长
+风情
+苯
+待遇
+对策
+全程
+乌
+三十二
+佛山
+伴
+敏感
+赖
+饼
+鸭
+犹
+明天
+星期
+ク
+源于
+建国
+抓住
+演讲
+智力
+追究
+◎
+瑞典
+鑫
+以此
+quot
+肩
+版次
+预付
+方程
+感谢
+塑造
+优雅
+温馨
+地域
+监狱
+赏
+前身
+起源
+出色
+境界
+开业
+过来
+煤矿
+激发
+繁荣
+放射
+糖尿
+私人
+描写
+共产
+籍贯
+平等
+患
+妙
+敬
+陈述
+球场
+花生
+村务
+模范
+1962
+一世
+退房
+较多
+意味
+热水
+节约
+村内
+艺人
+平原
+为止
+化合
+同人
+加倍
+交办
+土豆
+渐渐
+严谨
+显得
+必然
+占有
+900
+清理
+滴
+很高
+高手
+鲜明
+津
+斑
+风雨
+无敌
+港口
+民俗
+清新
+汉字
+修炼
+霜
+面貌
+牛奶
+还原
+突发
+缺陷
+济
+水分
+用人
+Q
+700
+商家
+地带
+新兴
+考点
+其余
+关闭
+收获
+任命
+休息
+线索
+格局
+尺
+三十三
+越多
+发送
+全力
+柏
+几十
+权限
+信访
+兼任
+大多
+日语
+更有
+面粉
+习
+协作
+扬州
+小麦
+拜
+带头
+渔业
+倡导
+早餐
+寄
+附带
+桃花
+遇见
+癌
+友谊
+正义
+己
+忧
+前景
+优越
+互
+1965
+风光
+历代
+窗
+怪物
+地狱
+四月
+精华
+奇迹
+器件
+直线
+扶贫
+百度
+原有
+II
+胡椒
+开心
+早已
+栏
+10000
+期货
+扶持
+320
+实务
+旨在
+全镇
+稿
+寿
+静脉
+萨
+140
+酱油
+胆
+典
+列车
+融入
+邵
+出土
+后面
+日益
+肌肉
+补贴
+幸
+睡
+开创
+单独
+储备
+责
+家乡
+联邦
+内科
+昆虫
+门户
+表彰
+自来水
+历
+一类
+晶
+回收
+光盘
+到期
+美化
+首席
+1950
+ā
+手工
+决议
+泰国
+曼
+新人
+货
+打破
+1975
+婚礼
+兼职
+某种
+选育
+僧
+主板
+保卫
+阴阳
+通风
+嘴
+帅
+e
+华人
+合适
+绍兴
+休
+氧
+回答
+涛
+瘤
+单曲
+华丽
+释
+剩余
+生前
+纷纷
+报警
+来看
+迅
+全年
+水质
+纳米
+つ
+腐蚀
+1977
+缝
+宽度
+器械
+活跃
+澳门
+人身
+慈善
+乳
+个性化
+直至
+职权
+其间
+辣
+后方
+理性
+震
+采访
+え
+肠
+未经
+塞
+中毒
+毕
+勇敢
+污水
+面包
+交往
+中介
+保密
+单纯
+劫
+律
+探测
+五金
+架构
+机遇
+水中
+谈判
+遭
+三维
+冻结
+mg
+家用
+撒
+区委
+108
+三星
+丘
+ッ
+潜力
+另有
+穷
+演变
+载体
+行使
+引入
+春天
+360
+与其
+人家
+一半
+趣味
+铭
+条第
+海关
+紧急
+花卉
+德育
+救援
+廖
+溶
+立方米
+幻
+禅
+水面
+人次
+1960
+渗透
+能否
+突
+术语
+徐州
+新浪
+精密
+雾
+Ⅱ
+振动
+轮回
+样子
+勤
+聚集
+进取
+演绎
+条形码
+善良
+叫做
+内地
+五十四
+乃至
+石家庄
+Y
+武术
+协同
+自行车
+修正
+口腔
+上线
+说法
+备注
+赐
+气息
+夏季
+润滑
+永恒
+湿地
+乐趣
+电梯
+用料
+某某
+隧道
+发酵
+代替
+暖气
+减轻
+长篇
+两岸
+燃料
+兔
+人人
+注定
+高于
+颁布
+仓
+遍
+用水
+日军
+驳回
+No
+夏天
+潜
+下辖
+每月
+遭到
+大楼
+雷达
+康熙
+紧
+抽
+法则
+清除
+海岸
+主干
+流水
+大事
+y
+尿
+辞
+祭
+任意
+储存
+模特
+IP
+堪
+三十四
+示
+罪犯
+配音
+一大
+链接
+玩具
+空军
+从来
+三十五
+蓝牙
+乐园
+习性
+宠
+极限
+百万
+畅销
+准则
+复仇
+推行
+友好
+班子
+容器
+对此
+入学
+珍贵
+调味
+沉
+按钮
+数十
+切换
+1968
+激烈
+面板
+转向
+Z
+答
+伦理
+自信
+调料
+均有
+Unit
+卡片
+官员
+延
+宪法
+乘坐
+累
+迎
+情景
+形容
+士兵
+零件
+拆迁
+总线
+投标
+手动
+斋
+传唤
+1966
+邱
+书写
+诺
+好像
+入门
+百货
+新城
+很少
+评定
+冶金
+み
+执导
+烈
+那种
+报表
+1972
+饲养
+消毒
+医科
+多元
+本来
+聚合
+公式
+话题
+植株
+付出
+≤
+财
+公子
+足以
+长城
+多彩
+all
+编著
+岩石
+票据
+少量
+烈士
+三届
+确实
+审
+粘
+寸
+浮
+通俗
+后果
+内向
+制订
+调控
+参观
+告知
+仔
+就算
+监理
+苑
+总额
+约束
+公正
+世间
+悠久
+民生
+判令
+夺
+XP
+占用
+纪检
+ˋ
+房子
+潭
+体会
+货款
+出去
+介
+纪律
+后勤
+反馈
+善于
+筋
+腿
+主力
+美味
+石头
+华夏
+献
+得知
+巩固
+美学
+排除
+筒
+审美
+眼前
+感应
+而已
+划
+顶级
+至于
+桃
+f
+散热
+几何
+法师
+整治
+工委
+宿舍
+wifi
+饮水
+慧
+磨
+相连
+替
+物体
+社科
+常州
+语法
+金刚
+同类
+′
+对称
+资深
+翁
+论述
+测绘
+保养
+联想
+1600
+拔
+1957
+精确
+大门
+总量
+答辩
+炖
+南宁
+当今
+玩法
+古城
+1973
+科幻
+河流
+杀死
+高稳产
+试卷
+人气
+晚会
+二百五十三
+潮流
+过敏
+大纲
+防水
+鼠
+相机
+节日
+1974
+绘
+小儿
+í
+1969
+赢
+法庭
+删除
+进修
+170
+论证
+女神
+湘
+高职
+奏
+十字
+单一
+和一
+105
+漫
+物种
+巷
+雕塑
+谋
+交纳
+首批
+陌生
+依赖
+想法
+报名
+畜
+实物
+陆军
+深深
+预计
+因子
+对抗
+吸
+主料
+在内
+女王
+精细
+高原
+暨
+ì
+修理
+刷
+剧本
+某一
+可行
+油画
+熙
+着力
+农场
+信心
+职位
+别称
+燃气
+路径
+床铺
+伦
+坚
+浪
+技法
+传染
+转会
+事迹
+红外
+借鉴
+报纸
+翅
+拯救
+如同
+而又
+宅
+颖
+制备
+次数
+姐妹
+截至
+名师
+腰
+出入
+锋
+替代
+种族
+沿革
+加载
+次性
+损坏
+制冷
+城堡
+为其
+专职
+陷入
+1955
+1945
+会展
+元件
+印发
+刚刚
+平时
+1959
+缺点
+腔
+新颖
+激素
+总监
+实在
+款项
+特有
+地形
+鸣
+自觉
+援助
+凯
+标签
+拟定
+战国
+经贸
+毫无
+下设
+上帝
+幽
+确立
+拌
+力求
+後
+1800
+轻轻
+上午
+等于
+浓
+邹
+小于
+顿
+黑白
+愿望
+欢乐
+退出
+饮料
+勺
+雄厚
+反射
+热点
+前沿
+涂料
+分散
+悦
+前列
+脂
+电阻
+战术
+欧阳
+厂家
+短篇
+高档
+七年
+越南
+菩萨
+检疫
+汗
+多名
+粮
+孕
+深厚
+服用
+俞
+俗
+名胜
+推理
+紫色
+花序
+辅
+命题
+催
+版权
+莲花
+席
+鹿
+误工
+发掘
+向前
+著有
+射
+真空
+展会
+真题
+企事
+下属
+逼
+扩张
+运算
+核准
+聚会
+仿真
+固体
+经销
+配方
+惜
+毫克
+音效
+小平
+筑
+赠
+一处
+鹏
+相传
+六条
+奠定
+精致
+不幸
+标识
+崛起
+顶端
+迪
+传动
+扬声
+束
+捐赠
+雕刻
+o
+变革
+1961
+务
+讲座
+担心
+以便
+封闭
+清朝
+析
+验证
+档
+粉丝
+总理
+生殖
+受害
+弟弟
+照顾
+跨越
+绕
+热血
+生意
+太极
+宿
+圣诞
+变动
+童年
+宽带
+损
+为本
+搭建
+偶像
+125
+一一
+参赛
+数额
+延续
+惟
+投保
+选定
+信誉
+督促
+怒
+出身
+番茄
+剪
+演艺
+up
+植
+刑法
+辈子
+蝶
+金奖
+光绪
+繁华
+1952
+闻名
+徒刑
+尚书
+华中
+甲基
+国道
+终身
+亚科
+亡
+职场
+站点
+僵尸
+列表
+防范
+套房
+成型
+木材
+五月
+狮子
+802
+洁
+前面
+名义
+概论
+随意
+三十七
+☆
+泉州
+102
+天涯
+仿佛
+预警
+省市
+123
+填词
+味精
+全民
+手指
+擦
+树脂
+变换
+编委
+游击
+武侠
+过错
+育人
+出售
+均衡
+磷
+珠宝
+吻
+七日
+朗
+大脑
+开工
+抢
+晨
+沙漠
+风扇
+参阅
+辅料
+そ
+年份
+中期
+漆
+态
+旋
+协
+1971
+页面
+成语
+欣
+350
+民办
+播种
+螺旋
+蝴蝶
+we
+家中
+名誉
+马上
+药材
+十几
+上去
+太空
+欧美
+$
+院士
+清偿
+终点
+麦克风
+梨
+吉他
+防腐
+祝
+心态
+文名
+整形
+浦
+相继
+偶然
+娘
+实质
+合计
+投诉
+细腻
+昭
+壶
+短期
+入侵
+日前
+霸
+变量
+沿海
+预先
+6000
+承受
+弘扬
+低温
+训
+集合
+逸
+胎
+移植
+好好
+机体
+历经
+开通
+依旧
+每人
+悲
+伍
+估计
+氢
+孤
+三十八
+复兴
+蒲
+少儿
+气质
+联络
+闹
+一同
+剧场
+明朝
+凝聚
+ド
+扎实
+观光
+MP3
+灰色
+关卡
+必备
+购
+朴
+科目
+In
+天天
+接到
+联保
+垂
+观音
+迷你
+屯
+注入
+囊
+检索
+火箭
+尝
+永久
+礼物
+忙
+借助
+1953
+政权
+金牌
+锅炉
+取出
+县长
+大人
+点点
+深处
+登录
+数值
+约会
+电站
+滩
+暂时
+攻略
+特效
+居委
+④
+火焰
+无穷
+活泼
+达标
+通话
+料酒
+这款
+南海
+白云
+区划
+轿车
+晶体
+结实
+抓好
+率先
+统
+坚强
+香菇
+文档
+微波
+头发
+坚定
+台北
+合作社
+赋予
+病理
+シ
+乐器
+彦
+千克
+读音
+虞
+观看
+龄
+枫
+摇滚
+分区
+储
+省内
+全长
+居然
+看来
+对待
+浦东
+带给
+牡丹
+嘉宾
+自学
+头部
+另行
+提醒
+光电
+同行
+流畅
+波兰
+肯
+懂得
+保温
+花期
+串
+旋律
+表决
+离婚
+万物
+借条
+窑
+长达
+大臣
+QQ
+响
+磁
+斌
+到位
+辰
+名牌
+置业
+排行
+桂林
+领袖
+酷
+分行
+ㄨ
+鹰
+1954
+回报
+挡
+内在
+破碎
+碧
+渊
+样本
+妇
+抗战
+钻
+正面
+火山
+群岛
+殷
+真诚
+喷
+颈
+进士
+功夫
+四十二
+军人
+部落
+口号
+选拔
+故意
+华南
+奇怪
+la
+驻地
+埃及
+殇
+幕
+111
+字母
+器官
+到来
+烦恼
+1948
+机床
+总会
+录制
+随便
+伪
+事宜
+柴
+ち
+外汇
+侦探
+座椅
+祠
+醇
+决心
+录取
+极其
+广阔
+岸
+两位
+闭
+博士后
+滚
+层面
+应予
+大夫
+放心
+瓣
+104
+本土
+其次
+暴力
+改正
+一声
+描绘
+货运
+鉴别
+稳
+鹤
+模糊
+合伙
+朝廷
+充足
+作风
+徒
+机组
+弦
+当天
+风暴
+病变
+信念
+分裂
+戏曲
+稻
+高山
+社交
+风味
+口服
+工学
+部级
+五大
+八十
+冀
+μ
+出栏
+回顾
+眼泪
+再见
+结晶
+制动
+变迁
+km
+弘
+翔
+才会
+直流
+480
+星星
+应有
+出资
+山地
+产物
+雕
+叙述
+钠
+惯用
+一中
+帝王
+小孩
+封建
+众人
+颇
+报酬
+茂
+伤残
+闪光
+渴望
+毁灭
+火灾
+重组
+聘
+超声
+翠
+流转
+锦标
+擅自
+焉
+鉴赏
+麻烦
+催化
+奶
+矿山
+辩称
+动员
+柜
+缓存
+表情
+邪恶
+通报
+珍珠
+场合
+混乱
+遗
+均可
+月亮
+养成
+幅度
+阵容
+r
+参谋
+农药
+PCI
+享
+娜
+难题
+双手
+112
+腺
+恐惧
+霞
+成品
+三十九
+菊
+登陆
+情侣
+配送
+礼品
+占领
+那时
+机理
+业余
+种质
+know
+据说
+击败
+过渡
+龟
+本名
+疗效
+装扮
+弟
+妹
+从未
+欢
+曝光
+め
+故乡
+得名
+^
+促
+慎
+醋
+入口
+≥
+平行
+水源
+103
+简洁
+适配
+罐
+租金
+禁忌
+多元化
+美观
+可谓
+细致
+嫁
+like
+扇
+倾向
+外壳
+坐标
+供给
+菇
+务实
+忌
+对白
+图文
+978
+阻止
+见面
+睡眠
+枢纽
+怀疑
+金华
+撞
+步步
+建于
+逢
+按键
+慕
+江山
+成效
+启示
+不必
+{
+边界
+舌
+背光
+拓
+碱
+è
+锡
+愁
+想起
+缺席
+科室
+装甲
+减刑
+碰撞
+北大
+养护
+幼
+框
+挖
+旅客
+眼镜
+暂停
+签署
+害怕
+莉
+设在
+壮大
+先锋
+千里
+民众
+道理
+以往
+盾
+旺
+表格
+106
+走过
+意志
+IOS
+柔毛
+后者
+中型
+气管
+叔
+珍
+气氛
+四周
+挺
+3G
+车间
+溶解
+膨胀
+后人
+弯
+资
+全文
+亚纲
+流浪
+灯光
+赞
+忆
+厂房
+基准
+阿根廷
+前卫
+勇气
+醒
+标本
+钻石
+局面
+分级
+二期
+190
+扩建
+国立
+毒性
+紧紧
+水产
+衍生
+酯
+寻求
+タ
+ˊ
+用书
+拆除
+支队
+gt
+舟
+华北
+优异
+商店
+俄
+收支
+レ
+歌唱
+探
+十余
+球衣
+艳
+频
+白糖
+ml
+将来
+取代
+独自
+过多
+腹部
+陆续
+增产
+分开
+罢
+吐
+iPhone
+台州
+怡
+放置
+引资
+试图
+百合
+私营
+Chapter
+举报
+1967
+爬
+副总
+分数
+青山
+译文
+树木
+往事
+考入
+充
+猪肉
+埋
+检
+曲目
+赫
+中专
+索引
+失业
+远离
+叠
+羽毛
+1937
+传真
+115
+烹饪
+明清
+牵
+卧室
+烟台
+棵
+克服
+贯穿
+策
+蓄
+铸造
+初始
+改名
+晋升
+中南
+1946
+墨西哥
+凝
+方形
+解答
+户型
+蜀
+户数
+译名
+南通
+悲伤
+炼
+弓
+华侨
+相反
+导体
+丛
+铃声
+伴随
+撰
+双重
+暴
+梦幻
+力争
+归来
+同期
+靠近
+高尔夫
+防火
+高兴
+协定
+吸血
+四十三
+右脚
+烂
+渣
+挑
+返
+毛重
+艰难
+1938
+256
+1951
+播
+缓解
+柠檬
+联网
+仓库
+欲望
+My
+四十八
+呼
+四十一
+讲话
+芽
+思念
+每周
+剧院
+棉
+艰苦
+硅
+甫
+境外
+わ
+子宫
+战胜
+鲍
+十条
+庵
+珠海
+垫
+民用
+军团
+铁道
+加油
+袭击
+唤醒
+开采
+东风
+考验
+怨
+耶
+携手
+光滑
+毫
+只需
+裁判
+烤烟
+血型
+柏林
+一号
+简历
+读物
+各位
+半岛
+使人
+钥匙
+一夜
+以内
+107
+琳
+1947
+齿轮
+裕
+2500
+片长
+晴
+前来
+南阳
+盟
+嫩
+团委
+逍遥
+阮
+210
+并未
+透过
+主线
+火锅
+法治
+东路
+PS
+喻
+缔约
+龚
+旅馆
+悲剧
+噪声
+VS
+分校
+鄂
+人际
+樱
+摇
+世人
+特长
+万余
+暂行
+5mm
+仓储
+de
+变压
+引用
+丢
+畅
+哲
+减肥
+扎
+涉
+豪
+公示
+逐
+年月
+进而
+常识
+GPS
+钾
+天生
+交给
+签
+跟着
+邑
+契约
+大胆
+鉴
+贵阳
+两者
+署
+单词
+西路
+外资
+用心
+淋巴
+役
+东南亚
+To
+大概
+认
+途中
+爽
+戒
+庞大
+弯曲
+中考
+隔离
+柔
+小镇
+マ
+k
+词典
+航
+难道
+豫
+纸张
+前世
+行走
+误差
+from
+锌
+参演
+直辖
+臂
+兹
+开幕
+份额
+巅峰
+蒂
+媒介
+回合
+两国
+天线
+南路
+与此
+葫芦
+传送
+驾
+淘宝
+签字
+自杀
+警告
+白菜
+ī
+诠释
+河道
+瓜
+岁时
+吉祥
+终极
+调度
+负荷
+集贸
+花开
+制药
+仅有
+面团
+兴建
+给出
+至上
+各方
+光纤
+郝
+较小
+诊治
+切割
+回国
+身心
+数百
+揭示
+幸运
+物料
+葬
+合资
+粤
+山脉
+魔术
+君子
+光荣
+教会
+惊人
+覆
+星球
+合唱
+邮电
+鳞
+世家
+悟
+体质
+骨折
+探险
+即时
+大火
+组件
+土耳其
+外面
+遥控
+矿业
+蟹
+西湖
+触
+默默
+立项
+地名
+沉积
+调试
+进村
+化石
+彼
+国语
+再现
+餐
+乾
+口语
+筹
+HD
+威力
+壮族
+侵犯
+转速
+上部
+稀
+先端
+七十
+滨
+惹
+巧克力
+阿拉伯
+洋葱
+支援
+听说
+唇
+金山
+芝麻
+遭受
+疲劳
+南宋
+崖
+闲
+领取
+复旦
+函
+死后
+音响
+累积
+过度
+ジ
+保管
+租
+Win
+贼
+命中
+规程
+遗迹
+标的
+下游
+蒸汽
+被迫
+万年
+关怀
+迟
+速率
+香味
+抵抗
+率领
+尸
+文集
+NO
+光线
+单人
+裁决
+避
+条约
+多么
+赛车
+ロ
+大厅
+幢
+书桌
+概率
+愤怒
+莫斯科
+秘
+栖息
+大海
+素材
+甜蜜
+综上
+主讲
+外贸
+阳台
+马来西亚
+精准
+九十
+王爷
+斤
+间隔
+お
+友情
+六十四
+增大
+夷
+狐
+歌舞
+瀑布
+肉牛
+面试
+北宋
+工伤
+标记
+从中
+遥
+娃娃
+落后
+坚实
+机箱
+历年
+见证
+运转
+不懈
+痕
+孵化
+粒子
+喂
+湿润
+披针形
+1940
+交界
+黄山
+成一
+大卫
+外表
+后续
+上报
+回族
+苍
+109
+合一
+反而
+啤酒
+121
+谱曲
+异世
+开水
+帖
+字号
+武帝
+恐龙
+十佳
+南山
+过于
+跳跃
+公约
+定时
+聊天
+瓷
+电容
+1944
+户口
+馆藏
+摩擦
+缺少
+党政
+幕后
+留在
+蜜
+歌剧
+国产
+坛
+夕
+西洋
+邻
+黄瓜
+毗邻
+乙烯
+添
+疑难
+陷阱
+TV
+海滩
+China
+大使
+众生
+焦点
+挑选
+交汇
+先天
+由来
+婚
+隋
+老子
+刊
+路由
+比利时
+上方
+打扮
+崇
+虹
+本级
+难点
+银河
+8000
+道教
+棋
+于此
+海水
+浴缸
+113
+卵形
+提要
+好处
+红尘
+HDMI
+氮
+写字
+序列
+班级
+偷
+地说
+花椒
+征服
+派出所
+カ
+填写
+天赋
+摆脱
+车型
+450
+报送
+文体
+农历
+风水
+人称
+商人
+外商
+葡萄牙
+防空
+丙
+疾
+侍
+详解
+敬业
+插入
+1942
+海淀
+交互
+浆
+背叛
+造价
+开办
+VGA
+出国
+电网
+争夺
+玲
+承载
+◇
+占据
+收缩
+全乡
+杆菌
+ǎ
+发热
+罕见
+装有
+异能
+区分
+姆
+聘请
+v
+衔接
+菲
+刃
+人和
+空白
+俗称
+并发
+拾
+沉默
+绝缘
+匹配
+十月
+门口
+悬
+流体
+桌
+赶
+抵
+雌
+据此
+所致
+复活
+应力
+浴
+往来
+浙
+升高
+平板
+限期
+彩虹
+灾难
+直播
+差别
+敏捷
+彝族
+箭
+今生
+错过
+古今
+节省
+1920
+配料
+壮
+期望
+南瓜
+窝
+筹备
+横向
+ù
+忠诚
+自助
+勒
+分枝
+朵
+1941
+之所以
+废物
+证件
+既然
+东海
+砖木
+秋季
+超出
+季风
+忽
+米兰
+织
+签名
+认同
+淋
+波动
+荷
+批判
+强者
+潜在
+匀
+飘
+色素
+1939
+病虫
+和尚
+奥地利
+泰山
+早年
+圆柱
+拼
+辽
+孕妇
+牵头
+@
+平静
+业界
+脱离
+十多
+海峡
+水土
+整机
+眼中
+步伐
+女士
+做人
+118
+当日
+男生
+样式
+住在
+建模
+122
+信赖
+隐患
+农家
+序言
+汇总
+It
+闯
+店铺
+机型
+救灾
+伤心
+`
+救济
+可怕
+缓慢
+古人
+提到
+班主任
+困
+访
+巧妙
+亮度
+海底
+在家
+主场
+月底
+团长
+技艺
+史料
+钝
+失踪
+兴奋
+硫酸
+之二
+夜晚
+延安
+1936
+宾
+邀
+线条
+天上
+乐团
+丧失
+1930
+煲
+坚决
+温带
+四十六
+受益
+劲
+棉花
+讨厌
+病因
+尽快
+泛
+廉
+渠
+影人
+科员
+海域
+预期
+确
+套装
+偶
+遣
+显然
+同事
+防守
+ahref
+诱惑
+性状
+开辟
+地貌
+革
+阻
+大同
+多层
+诊疗
+书店
+盈利
+主频
+麻醉
+俩
+135
+展望
+独家
+230
+1900
+分明
+天子
+卧
+妊娠
+订单
+金色
+定律
+勤奋
+キ
+dited
+光泽
+阀门
+修真
+阅览
+日照
+遍布
+公用
+再度
+512
+家电
+北路
+津贴
+药用
+庭审
+磊
+辞典
+祸
+打算
+腾
+绳
+魔秀
+苗族
+区内
+暂
+咏
+浓郁
+﹐
+毒品
+■
+跟随
+宫廷
+操纵
+柴油
+1935
+娶
+清热
+uncre
+红旗
+亚门
+亩产
+伞
+相邻
+ó
+樊
+字符
+灿烂
+对面
+励志
+装配
+武林
+风流
+郁
+学说
+ど
+Web
+矿物
+仔细
+114
+翅膀
+下方
+一手
+2GB
+Love
+悬挂
+University
+乔治
+脉冲
+室外
+禾
+接入
+保安
+易懂
+简化
+检修
+听力
+通行
+口径
+四方
+旭
+蔷薇
+打败
+管委会
+太后
+面部
+巢
+以免
+声明
+构
+扁
+极具
+微型
+可惜
+倪
+肉猪
+窦
+如有
+衰
+棕
+DNA
+限度
+扣押
+四十七
+鱼类
+用以
+电工
+计生
+真人
+热量
+毅
+现行
+代号
+被子
+1934
+腐败
+裴
+廉政
+杂交
+同等
+模板
+双子
+林木
+one
+诚实
+相爱
+分割
+绣
+纲目
+峡谷
+去年
+概括
+春节
+岂
+吏
+祭祀
+管家
+祝福
+破产
+腾讯
+门诊
+叹
+冒
+梅花
+个别
+头痛
+怀孕
+∶
+剪辑
+两面
+举证
+勿
+其后
+SATA
+冬天
+节点
+代价
+察
+丹麦
+孢子
+一根
+宝贵
+临沂
+降雨
+原版
+116
+可选
+爱人
+全校
+フ
+唐朝
+条文
+α
+发光
+time
+罗伯特
+独具
+危
+只好
+抢救
+橄榄
+并处
+对接
+镁
+生化
+残酷
+出于
+品位
+小吃
+1943
+切除
+じ
+百分
+上传
+绿地
+噪音
+荒
+漏
+仕
+260
+客厅
+模仿
+颂
+共青团
+法官
+双向
+不安
+讲究
+乙方
+变速
+红楼
+土路
+口感
+夫妇
+剧中
+重重
+当初
+动人
+水域
+猜
+汇集
+氯
+签定
+封印
+嘉兴
+扩散
+嵌入
+环球
+六十一
+助手
+迈克尔
+拨
+锐
+明月
+密集
+定点
+新娘
+280
+汝
+直升
+何时
+样品
+细则
+Flash
+局限
+日文
+1933
+遥远
+姿态
+缓
+诏
+浪费
+选项
+回头
+唐山
+中断
+水量
+事先
+间接
+气势
+赏析
+爷爷
+扶
+瑶
+比分
+桩
+La
+老婆
+时装
+校友
+聚餐
+犹如
+168
+科级
+浓厚
+派出
+外界
+林场
+分成
+精力
+海口
+哺乳
+攻关
+总值
+倾斜
+can
+氟
+时效
+倾城
+各样
+培
+储蓄
+解剖
+神仙
+执着
+环绕
+活着
+巫
+亲情
+兵团
+威廉
+国学
+比喻
+变身
+Ⅰ
+音像
+唱歌
+SCI
+下部
+四十九
+背面
+庞
+灵感
+教委
+暴露
+落叶
+真理
+禹
+明亮
+笔画
+团员
+兴起
+刮
+蘑菇
+改建
+文人
+家常
+派遣
+远远
+关爱
+薄膜
+变种
+转动
+纠正
+易于
+兽医
+陈列
+沪
+发言
+GSM
+余年
+盈
+春季
+放松
+姓氏
+自有
+促使
+琪
+聂
+9001
+逃离
+铸
+评委
+激活
+动手
+钢筋
+风湿
+拌匀
+蜘蛛
+AC
+名录
+熟练
+诸葛
+五行
+安静
+状元
+清算
+同名
+炸弹
+女友
+前方
+转入
+客车
+外包
+存放
+小子
+呆
+糯米
+震撼
+修养
+牵引
+解毒
+醒来
+填
+黄油
+普查
+定向
+衡
+查找
+潍坊
+学期
+电商
+结核
+颁奖
+增多
+上游
+畏
+亚目
+影子
+妇科
+混沌
+异地
+音箱
+浸泡
+老鼠
+詹姆斯
+成交
+20000
+愉快
+提倡
+试行
+处处
+预报
+勾
+亚种
+硬度
+盗
+忍者
+石化
+成份
+体力
+议会
+信贷
+亚军
+兄
+目光
+筛选
+候选
+胶囊
+John
+吟
+青铜
+排骨
+暂无
+外来
+来电
+猎人
+出行
+物权
+曲折
+垄断
+钩
+致使
+彼得
+迄今
+筹资
+舰队
+中路
+依次
+爷
+奥林匹克
+绝望
+1931
+海鲜
+立刻
+月光
+随笔
+得分
+钢管
+月经
+反思
+滨海
+轮胎
+截
+甲方
+以太
+到处
+卜
+睿
+变频
+做事
+氨基酸
+吊
+分手
+起伏
+过关
+遗憾
+此书
+殖民
+New
+刷卡
+沉淀
+同比
+运河
+毁
+时机
+师傅
+缩短
+no
+九月
+迎接
+武功
+违规
+涉案
+名著
+调动
+跨国
+发扬
+进去
+看出
+茅
+相处
+长途
+病例
+正规
+乱世
+召
+汇编
+泡沫
+困境
+预备
+定量
+当作
+山庄
+核定
+堪称
+Mbps
+木质
+IEEE
+六月
+振兴
+里程
+术后
+北美
+执照
+纠错
+社长
+投票
+先行
+宴
+特产
+上古
+长寿
+精通
+阔
+九龙
+感悟
+桶
+启蒙
+伐
+名片
+判
+必需
+テ
+企
+干事
+默
+灌
+秀丽
+闪
+景色
+随之
+RJ
+灾
+假设
+特大
+重力
+珍惜
+格斗
+博客
+亲爱
+世博
+脾
+启发
+冻
+缸
+调制
+探究
+物价
+杀害
+包围
+习俗
+分期
+年限
+制剂
+花瓣
+真心
+废弃
+京剧
+制约
+石灰
+字体
+膏
+扣除
+小额
+j
+实证
+脆
+117
+度过
+热门
+三千
+淘汰
+大全
+伤亡
+差距
+羊肉
+业内
+总支
+公关
+蜂
+加密
+品味
+直立
+抵达
+侵害
+造就
+伸
+集群
+魅
+极端
+178
+死神
+街头
+大致
+寄生
+归纳
+题目
+草地
+专栏
+会所
+菜品
+参展
+改良
+春风
+迁移
+抹
+传票
+丑
+螺
+熊猫
+岚
+途
+骄傲
+五十二
+精装
+穿过
+轨迹
+貌
+市中心
+搞笑
+综述
+上门
+八月
+恋人
+潇湘
+底部
+庆祝
+棕色
+风采
+蜂蜜
+保罗
+徽
+缩
+色泽
+降临
+去除
+答应
+展出
+老虎
+浇
+未有
+丝绸
+点评
+坤
+吃饭
+1932
+卡通
+尸体
+琼
+侦查
+傲
+马克
+will
+促销
+精湛
+偶尔
+病情
+糊
+支配
+携
+侵略
+全集
+一笑
+基督教
+烟草
+极高
+β
+东汉
+搭
+矩阵
+狐狸
+下达
+销
+131
+传达
+苞
+销量
+理学
+瞄准
+透露
+荆
+贡
+公立
+演化
+史记
+学前
+什
+渔
+baby
+团购
+题型
+索尼
+典雅
+开封
+询问
+弥
+吸附
+出任
+冥
+瘦
+一辆
+游记
+被害
+揉
+楼层
+舱
+送给
+衡阳
+库存
+昆山
+馀
+w
+拖拉
+妈
+伙食
+地势
+被动
+教科
+清真
+烫
+□
+捕捉
+户籍
+119
+名列
+旁边
+可使
+感恩
+器具
+素养
+杭
+柔软
+鲜花
+停留
+4GB
+圆锥
+做成
+耀
+余万
+笑容
+声誉
+发售
+椒
+相思
+邮票
+各省
+窄
+打工
+全局
+现存
+兵器
+难得
+频繁
+帽
+勘察
+132
+勇士
+本息
+巨人
+恭
+高产
+←
+取决
+儒
+盛世
+邢
+渝
+哉
+选取
+明珠
+致命
+千金
+藏书
+严肃
+梦中
+木耳
+切片
+灌木
+加州
+127
+艘
+傻
+伊斯兰
+预案
+爪
+а
+淄博
+冷冻
+预交
+BOSS
+追踪
+置于
+周岁
+一时间
+行人
+司机
+汉书
+教导
+贞
+涉外
+排出
+奶油
+光辉
+拆
+替补
+阎
+1080
+xxx
+粘贴
+灵敏
+抽象
+三峡
+机车
+踩
+静电
+中旬
+求职
+路灯
+一贯
+上年
+致富
+范畴
+大豆
+五十三
+蓉
+摸
+为由
+悬疑
+极品
+回去
+回复
+专属
+尼斯
+this
+年均
+TOP
+仙人
+古籍
+拥抱
+せ
+捞出
+监视
+构筑
+上来
+影音
+授课
+痕迹
+烤箱
+姐
+支柱
+定理
+网点
+送到
+神圣
+承接
+罚
+靖
+夜间
+溃疡
+用作
+三层
+发音
+126
+海盗
+min
+亮点
+编排
+哈佛
+1300
+封面
+悔
+天主
+光芒
+↓
+而后
+上学
+魄
+圆满
+仇
+考证
+画展
+劝
+猴
+奔
+洪水
+外星
+触发
+玛
+鹃
+干旱
+о
+MV
+桐
+124
+咬
+崇拜
+维尔
+修行
+纱
+农机
+判处
+捷
+谢谢
+粉末
+债
+酮
+权证
+133
+限于
+盆地
+夺得
+变异
+王者
+处以
+暮
+死去
+HP
+哀
+沫
+International
+本草
+人选
+西瓜
+诸侯
+汕头
+白天
+刻录
+直观
+重型
+看似
+特约
+牙齿
+自称
+新西兰
+莫名
+Part
+水准
+五十一
+保定
+魔鬼
+绪论
+总则
+剖析
+证人
+捷克
+负债
+乾坤
+步兵
+温和
+酷睿
+刺史
+校舍
+身材
+洗浴
+录像
+Linux
+粤语
+写出
+芝
+民初
+青睐
+叙事
+学子
+ll
+强力
+客场
+﹑
+名校
+大王
+同盟
+狮
+勇于
+袭
+上演
+使者
+军官
+红颜
+乔木
+周刊
+u
+犹太
+菲律宾
+独有
+双核
+珊瑚
+局域
+招收
+备份
+田径
+MP
+说过
+衰老
+奖学
+疯
+料理
+步枪
+等离子
+远古
+亲人
+介入
+欠款
+决算
+高贵
+车身
+1927
+大方
+潜艇
+猛
+迷人
+just
+信托
+名叫
+以色列
+晚期
+饱和
+实地
+棋牌
+至尊
+搞好
+商铺
+§
+量子
+抛弃
+游览
+场面
+樱花
+荧光
+小康
+儿女
+华盛顿
+贴近
+高低
+比重
+We
+遗忘
+券
+沼气
+出道
+着重
+公证
+拿到
+雀
+首播
+正月
+塑
+蕾
+全日制
+找出
+奖金
+校训
+首家
+狄
+年产
+相见
+学堂
+指引
+双人
+have
+胺
+无机
+传记
+奇妙
+画院
+罩
+记住
+闽
+潜能
+持久
+争霸
+家属
+挪威
+考研
+管制
+微软
+特区
+主观
+コ
+天后
+相识
+发作
+轨
+揭开
+粉碎
+收音
+隆重
+浸
+向往
+一心
+墩
+175
+阵地
+枸杞
+监事
+165
+价款
+毕竟
+光谱
+那年
+搬迁
+紫外
+胃肠
+Ⅲ
+塑胶
+右手
+金币
+并入
+惊喜
+原型
+魔王
+鹅
+扑
+go
+明细
+南开
+上班
+告别
+风俗
+攻读
+吸烟
+狱
+溶剂
+首选
+邯郸
+宋朝
+地段
+萍
+行程
+惠州
+时钟
+没收
+离去
+拍照
+洗涤
+185
+近期
+示例
+执政
+后世
+整齐
+恶性
+抛
+黄帝
+湖泊
+750
+光源
+较少
+仆
+152
+好奇
+警方
+义乌
+嘛
+天真
+1928
+芙蓉
+月刊
+拖
+非凡
+俄国
+趁
+护肤
+对战
+之类
+筛
+炭
+侦察
+百家
+哥伦比亚
+头脑
+听取
+二手
+旅程
+突击
+酸性
+文联
+星光
+坂
+可怜
+诅咒
+奇特
+欧盟
+亲切
+五十五
+葵
+同级
+洛杉矶
+211
+自古
+庄园
+末日
+择
+堤
+脊
+异侠
+后代
+山顶
+仿
+仄
+畸形
+古迹
+农林
+狭
+自从
+而生
+砂糖
+冲洗
+行李
+图标
+闹钟
+202
+丘陵
+看法
+黎明
+大床
+公认
+师大
+漫长
+1929
+深层
+涉嫌
+や
+五十九
+完备
+沙滩
+磷酸
+枕
+反击
+1100
+一枚
+散发
+兆
+香油
+水文
+首映
+骑兵
+奴
+乐观
+下车
+整改
+批复
+客家
+植被
+方可
+同于
+射线
+五十七
+天河
+自幼
+心动
+低压
+琉璃
+自在
+自律
+141
+车牌
+蛮
+车位
+ú
+沙发
+鸟类
+忽视
+下旬
+区号
+NT
+通往
+短暂
+微机
+歌声
+凹
+流失
+次年
+增添
+血清
+草本
+亲属
+冷漠
+边境
+内径
+得出
+续航
+145
+药理
+摩
+绘图
+典礼
+汇报
+顶尖
+宣告
+见于
+亦可
+动机
+角落
+微量
+贴心
+眼神
+帆
+通路
+232
+断裂
+繁
+⑤
+互助
+导入
+届满
+直达
+新村
+癌症
+翡翠
+干预
+学派
+诗集
+井水
+星空
+面具
+禄
+此类
+三名
+勋章
+接壤
+应由
+芒
+等到
+生姜
+法语
+那天
+御史
+躲
+父子
+身世
+撤
+报复
+生抽
+绍
+营运
+兔子
+陪伴
+例子
+改制
+碟
+兑换
+前者
+原作
+十二五
+是非
+奴隶
+文库
+可口
+屡
+归类
+口头
+显微
+138
+闸
+监护
+提琴
+摄
+蒸发
+北海
+专卖
+轰炸
+オ
+乒乓
+儒家
+三项
+尧
+网球
+日月
+多年生
+酒精
+读本
+十足
+从前
+收回
+抗震
+臭
+共建
+严禁
+子弹
+肉羊
+沾
+铅
+富含
+高专
+亲子
+书目
+辨
+世代
+载荷
+ィ
+适时
+域名
+山坡
+火花
+觉醒
+花费
+碎片
+投产
+成像
+乳腺
+岛屿
+奇异
+全会
+192
+待机
+风貌
+稳步
+月票
+迹
+梳理
+外人
+顶部
+ˇ
+安定
+耳朵
+嫌疑
+壹
+七十六
+门票
+枯
+才华
+映
+舆论
+眉
+毫升
+镇江
+变焦
+爵士
+Hz
+功耗
+what
+饰品
+now
+蓝天
+慈
+裂片
+匈牙利
+百花
+梗
+出台
+开阔
+民航
+高尚
+售
+出让
+教训
+飞翔
+晒
+万历
+奶奶
+静态
+急救
+爵
+参编
+耕
+Vista
+油田
+杰克
+引言
+襄阳
+史学
+拥
+七十二
+透视
+棘
+176
+文史
+工序
+解题
+宴会
+武将
+碰
+首领
+语句
+寇
+卡尔
+残留
+引人
+Me
+化解
+编织
+动词
+大蒜
+五十六
+操控
+墓地
+开口
+芬
+激动
+贯
+动产
+外径
+八大
+冷水
+家庄
+贫血
+壁画
+睡觉
+硫
+zh
+应邀
+昆
+搜集
+定居
+君主
+轻易
+眼光
+核实
+函授
+棚
+年会
+共赢
+贷
+快递
+寺院
+亚太
+至此
+肝炎
+荆州
+慕容
+菊花
+锤
+枣
+子孙
+闫
+back
+乘客
+河口
+决战
+龙门
+九届
+水肿
+共用
+婚纱
+大兴
+人造
+━
+瑜伽
+挣扎
+减速
+142
+口岸
+战队
+愿景
+扫
+Server
+甲状腺
+讨
+正如
+后再
+凶
+石器
+失效
+苏格兰
+党建
+姿
+大庆
+逃脱
+拼图
+督查
+136
+问答
+少爷
+如意
+国画
+代言
+一举
+警
+爱尔兰
+去皮
+就让
+多多
+冈
+超人
+门将
+充实
+五十八
+忽然
+测评
+英超
+129
+左手
+啥
+广电
+孤儿
+访谈
+百余
+督导
+笼
+贮藏
+丢失
+初次
+杉
+雁
+台阶
+バ
+182
+大幅
+线程
+候
+惩罚
+炎症
+汉代
+讲授
+受损
+生机
+退货
+六十二
+洒
+真情
+花鸟
+省人民
+酥
+南非
+冷藏
+聚焦
+脱落
+泌尿
+八卦
+总之
+四种
+深远
+维吾尔
+昔
+购房
+151
+淮
+后悔
+任期
+鸡肉
+町
+赚钱
+账号
+简约
+寓意
+狭窄
+谦
+out
+疑问
+CAS
+插图
+低保
+说道
+绝世
+注销
+双语
+咳嗽
+匙
+巡视
+校级
+减排
+基数
+口碑
+集聚
+栽
+区位
+来临
+EI
+向东
+甲级
+情怀
+痰
+Journal
+赣
+商用
+down
+淮南
+城墙
+火力
+沐
+钉
+720
+电解
+粗糙
+罚金
+话语
+门窗
+腹黑
+大理
+敷
+卦
+西汉
+针灸
+纯粹
+侧面
+脚步
+特级
+缩小
+拼搏
+钢板
+NBA
+退役
+性感
+佑
+护士
+难忘
+纲要
+少林
+更改
+死者
+渊源
+公积金
+账
+九州
+征求
+股市
+庆典
+草莓
+独到
+处在
+大米
+Java
+赠送
+改装
+釉
+万字
+焖
+途经
+青龙
+脊椎
+满族
+混居
+渲染
+氨
+燃
+三天
+试用
+处级
+橙
+蛾
+脚下
+起步
+怀念
+祖先
+男友
+大都
+二维
+粉红
+帘
+附有
+肝脏
+归属
+淮安
+华语
+创伤
+李白
+转身
+过高
+詹
+一族
+do
+文摘
+270
+戈
+故宫
+薇
+ナ
+尾巴
+必将
+平民
+摘要
+召集
+明明
+崎
+矢
+001
+廊
+合约
+评比
+伤口
+助学
+见过
+1400
+排污
+黑板
+朔
+均价
+情歌
+安阳
+ミ
+很久
+プ
+保鲜
+千瓦
+见解
+便携
+中将
+储量
+抚
+172
+感冒
+一键
+幽灵
+峡
+元代
+田园
+界定
+真菌
+藏族
+追逐
+镇长
+胎儿
+测验
+六大
+山药
+反抗
+淡水
+主打
+汇聚
+富贵
+出手
+155
+红枣
+便宜
+增收
+大军
+割
+火星
+诗文
+用电
+艇
+历时
+源头
+厨
+总价
+勘探
+刑罚
+时报
+全能
+的确
+磁场
+外地
+阿尔
+服役
+京城
+彰显
+班车
+宜昌
+谈话
+东山
+이
+强势
+生气
+中英
+称作
+星际
+集数
+定额
+流派
+食堂
+成形
+从不
+火腿
+美男
+芬兰
+眠
+当下
+长征
+食谱
+草药
+采矿
+祖父
+一并
+采纳
+球形
+晚年
+微观
+违纪
+抗体
+栖
+过后
+嫁给
+余额
+畜禽
+瞳
+素描
+听证
+阴道
+闪电
+血糖
+载明
+藻
+造纸
+喊
+耿
+驰
+三亚
+答复
+篆刻
+好莱坞
+巡回
+学园
+伊朗
+一律
+烟花
+抒情
+簿
+中海
+而去
+甘蔗
+合议
+笔名
+衡量
+大唐
+按时
+滑雪
+托管
+上天
+评分
+即便
+205
+冷酷
+陌
+施肥
+产后
+周末
+诡异
+身影
+抚养
+园艺
+财物
+标注
+浓缩
+坦
+沿着
+玛丽
+美誉
+赶紧
+2020
+辟
+弥补
+抚慰
+多万
+罚息
+央视
+热力
+饶
+阶层
+热烈
+治愈
+181
+分化
+162
+企图
+交际
+迷宫
+王妃
+欺骗
+学业
+天王
+栗
+日后
+锅内
+远方
+言语
+美军
+跃
+144
+哈哈
+秋天
+声优
+膝
+完好
+奥斯卡
+杀菌
+滤
+敦煌
+地层
+锁定
+筹建
+中日
+裁
+演示
+芜湖
+树种
+数千
+澄
+万多
+耐磨
+z
+香蕉
+悄悄
+重修
+褐
+老公
+首创
+148
+踢
+摘
+轼
+供热
+图画
+兼顾
+淡淡
+包裹
+校内
+渤海
+随即
+挂牌
+自强
+构件
+老化
+能使
+油气
+乌克兰
+妆
+新编
+猎
+落地
+构思
+向下
+应付
+娇
+и
+拱
+供销
+简明
+选购
+提案
+三百
+课外
+化身
+介石
+弹簧
+婴幼
+九江
+缠
+东亚
+真爱
+胡同
+区长
+起重
+183
+02016
+捕
+话剧
+儿科
+长子
+就此
+握
+鸡精
+二世
+〖
+代数
+虚假
+交警
+村级
+〗
+衍
+克里斯
+助攻
+正好
+呕吐
+肥料
+阴影
+效能
+公害
+逮捕
+天道
+三中
+论著
+氨基
+人参
+SD
+典故
+燃油
+赞誉
+付清
+入手
+旅途
+摧毁
+层层
+诛
+550
+匹
+服从
+心肌
+称呼
+驱
+涵
+十万
+怎
+ム
+北纬
+佩
+疼
+帐
+固醇
+亨利
+茄子
+海尔
+前置
+废水
+唐诗
+掀起
+场馆
+产科
+笔录
+石材
+焊
+株洲
+畔
+四肢
+传授
+纠缠
+е
+好玩
+编者
+play
+数目
+妖怪
+守门员
+380
+发源
+庭院
+滋味
+原件
+药剂
+脆弱
+拓宽
+年华
+KB
+祁
+应试
+声道
+喜好
+新手
+升降
+1024
+月初
+搬
+当局
+三日
+湘潭
+胸部
+平稳
+デ
+处方
+接地
+二中
+录用
+披露
+三更
+威海
+限定
+整顿
+剧团
+查阅
+举例
+晏
+范例
+乐曲
+住所
+杏
+骗
+年鉴
+高频
+预言
+恐
+134
+说出
+星辰
+千古
+阻碍
+双眼
+高大
+翟
+机枪
+212
+堰
+掌控
+负载
+蓬勃
+六十五
+珠江
+メ
+诺贝尔
+缅甸
+编译
+骨骼
+担当
+专线
+课时
+镶
+民歌
+南朝
+眼里
+二百零六
+加水
+皇上
+澳洲
+7000
+村镇
+外事
+入党
+长久
+阳性
+丛林
+迎来
+大帝
+演练
+pc
+幼虫
+电极
+感知
+野外
+公会
+邂逅
+笋
+帕
+泄漏
+间隙
+片段
+泰安
+乌鲁木齐
+AMD
+面料
+故居
+换货
+多位
+156
+油菜
+丧
+太郎
+离心
+以南
+サ
+导论
+衣柜
+评议
+西侧
+グ
+用餐
+建构
+折磨
+孤单
+沥青
+二者
+日用
+弗
+四处
+东城
+风波
+台式
+配电
+民警
+县志
+诀
+精益
+白马
+涌
+抑郁
+1GB
+白酒
+汽油
+声乐
+克拉
+露出
+二战
+热闹
+构图
+右侧
+验
+常住
+魔力
+APP
+确有
+158
+宪
+立场
+尺度
+腹泻
+深情
+大国
+精髓
+凌晨
+交错
+丈
+屠
+退还
+转载
+四级
+籽
+137
+妾
+小白
+night
+霹雳
+206
+新版
+一看
+皮革
+吞
+武士
+家伙
+轻重
+苗木
+冢
+黄昏
+壮观
+not
+组装
+底蕴
+恶劣
+客服
+后裔
+建制
+庄子
+挤
+和解
+编纂
+投降
+ウ
+法术
+一千
+德州
+汇率
+辩护
+勘查
+泊
+烦
+铺设
+三世
+人寿
+放到
+图解
+ば
+初学
+档次
+安慰
+扩
+逾
+首位
+CV
+博弈
+政委
+痴
+将领
+中性
+亲密
+她说
+DirectX
+头上
+屋顶
+大理石
+金银
+村落
+也称
+手枪
+各式
+腐
+衰竭
+泣
+爱恋
+旦
+卸载
+追寻
+188
+帧
+绽放
+161
+赞助
+143
+DC
+寺庙
+厉
+夕阳
+缩写
+此处
+she
+省外
+新生儿
+破解
+图表
+大展
+快餐
+365
+柳州
+平常
+五百
+城关
+专任
+这份
+公文
+焦虑
+工农
+灶
+身后
+208
+未婚
+笔者
+宫殿
+东侧
+新年
+达人
+整洁
+英特尔
+彼岸
+国会
+麟
+楠
+手里
+体操
+始于
+老大
+联动
+鉴于
+莱
+特许
+昔日
+旱
+纯净
+受体
+石刻
+可视
+同济
+黑道
+多方
+综艺
+立志
+樱桃
+并购
+花朵
+潜水
+季度
+146
+多样化
+疫苗
+已知
+钢材
+山谷
+六十三
+139
+注明
+成名
+青梅
+昨天
+北极
+领土
+实事
+联通
+1926
+贴吧
+底座
+一见
+寻常
+订阅
+毒素
+战线
+药业
+跳舞
+追溯
+皖
+Is
+药学
+联手
+文革
+就医
+征地
+水流
+半径
+阅
+试剂
+失望
+8x
+纵向
+支架
+减灾
+短短
+香菜
+水位
+折射
+挥
+开机
+统战
+典范
+奎
+集装
+报考
+门外
+带宽
+宣言
+古镇
+回路
+情境
+意境
+天鹅
+声卡
+珍藏
+杂质
+足迹
+雍
+色谱
+此事
+湖州
+插件
+激
+荣耀
+育种
+汤姆
+ャ
+极致
+启用
+出卖
+铃
+山林
+遮
+总督
+可为
+倚
+诸如
+误会
+尽情
+加重
+膳食
+纬度
+保守
+阻力
+果树
+家系
+剧目
+肥胖
+源自
+3500
+一季
+调理
+再用
+豪门
+树林
+轮廓
+维也纳
+辛苦
+辩证
+中东
+循
+步入
+特聘
+抗菌
+计数
+以致
+姿势
+并茂
+喉
+紧凑
+圣母
+加固
+格尔
+妮
+思议
+失眠
+用具
+驱逐
+常绿
+静静
+昆仑
+变态
+岭南
+物语
+坠
+子弟
+地表
+洽谈
+盐城
+详尽
+欧元
+底层
+病原
+主权
+缴费
+186
+163
+亚历山大
+地毯
+171
+畅通
+廷
+各异
+甘草
+导游
+菜肴
+逃避
+霸道
+黑夜
+way
+忽略
+中和
+零七
+照相
+氯化
+马丁
+奠基
+叉
+济宁
+方块
+尽头
+取名
+初恋
+清末
+冷静
+奔腾
+进军
+大成
+投放
+条目
+三卷
+海滨
+八角
+全称
+瓷器
+※
+酵母
+首饰
+拉开
+票价
+突变
+迷失
+视图
+1111
+量化
+说服
+中枢
+采收
+彩票
+310
+左侧
+游乐
+刻画
+赚
+填充
+年年
+晚报
+惊悚
+躲避
+碰到
+203
+宏伟
+安庆
+中途
+淳
+谨慎
+650
+赌
+回首
+陷
+行车
+寒冷
+Wang
+四面
+网民
+否定
+血症
+模样
+油脂
+大片
+失落
+184
+耐心
+沉重
+海贼
+戚
+报价
+在乎
+聊
+增生
+梯
+崭新
+颇有
+咒
+年纪
+就好
+1925
+小憩
+对付
+来访
+同伴
+圣人
+航海
+ē
+护照
+海事
+四海
+肋
+学问
+莆田
+come
+ǐ
+附则
+联谊
+改称
+故名
+切断
+通车
+钛
+篇章
+回应
+允
+假如
+事由
+沧桑
+圣经
+垫付
+忧伤
+海量
+旗帜
+馨
+Pro
+六十七
+历险
+赛区
+矫正
+地块
+躺
+人情
+乏
+All
+逆天
+太守
+话说
+寓
+鳍
+祖籍
+冰雪
+225
+孙子
+撕
+私立
+唯有
+举措
+巴士
+师从
+四人
+着手
+科名
+瞩目
+陆地
+时常
+并用
+豹
+管线
+檐
+盐酸
+公务
+エ
+对着
+life
+默认
+exe
+开门
+建华
+一遍
+号召
+连城
+手下
+感人
+肿
+地主
+204
+增进
+后宫
+便秘
+款式
+漳州
+哭泣
+几百
+科教
+小品
+坚守
+再一
+薄弱
+基督
+贯通
+早上
+爆破
+FM
+蝙蝠
+麒麟
+演义
+舜
+组委会
+佩戴
+夺取
+热心
+肾脏
+配角
+体型
+京都
+一万
+拖欠
+意甲
+批量
+Super
+景象
+唯美
+ろ
+质地
+跋
+城南
+妥善
+考评
+端子
+口袋
+敢于
+霸王
+宽敞
+造林
+棱
+荒山
+嘉靖
+美妙
+温室
+汤匙
+水墨
+额度
+现今
+革新
+史诗
+果然
+邻居
+ず
+稽查
+同心
+新书
+薪酬
+高潮
+袍
+勋
+上层
+5cm
+砌
+罗马尼亚
+刊登
+解说
+埠
+骂
+贝尔
+修仙
+波斯
+延迟
+标配
+推销
+173
+抉择
+麦克
+兵力
+完工
+本体
+间谍
+渐进
+飞船
+气味
+Michael
+杀戮
+太宗
+一曲
+制图
+词条
+仰
+欠条
+特邀
+芝加哥
+村人
+芳香
+捉
+视力
+奖章
+收视
+披
+西甲
+座谈
+中年
+魔幻
+茫茫
+转发
+战神
+挫折
+质疑
+排序
+体检
+月球
+用房
+合著
+骏
+妨
+澜
+万千
+解散
+チ
+贪
+屑
+备受
+153
+雄蕊
+民居
+失误
+旧址
+want
+技师
+省道
+超强
+议员
+烧烤
+拿出
+西山
+早就
+淘
+ゃ
+ブ
+850
+芦
+前端
+两边
+理查德
+ü
+跑道
+此地
+以北
+道士
+剑桥
+CAD
+绚丽
+太太
+彻
+滥用
+简便
+罗斯
+门派
+铁矿
+帮忙
+巡
+解开
+调入
+恩怨
+桂花
+中队
+艰辛
+小人
+轩辕
+情趣
+CEO
+迈
+盗窃
+每位
+高分
+耻
+1280
+年少
+木马
+短片
+悠悠
+ュ
+美景
+指向
+湛江
+嗣
+食盐
+督
+威尼斯
+土家
+帅哥
+艾滋
+Microsoft
+雍正
+冲动
+魔女
+尊贵
+全景
+清明
+got
+会使
+购置
+н
+坟
+夏日
+中美
+襄
+荡
+彬
+ㄢ
+SQL
+乙醇
+清风
+漂流
+商圈
+154
+变色
+色调
+ニ
+逃出
+三期
+右边
+同日
+连云港
+CCTV
+人机
+264
+队友
+冰川
+熬
+一战
+献给
+公诉
+阙
+精盐
+笑话
+匡
+腌
+吸入
+题名
+死于
+滕
+武昌
+好看
+香气
+七十八
+超高
+中信
+分组
+书号
+洁净
+钓鱼
+食疗
+安娜
+双层
+呼叫
+签证
+殊
+粮油
+159
+东经
+灭火
+147
+申诉
+品德
+特工
+长生
+禽
+碱性
+国庆
+生猪
+素有
+站长
+硕
+沿线
+遥感
+六十六
+折叠
+形体
+镶嵌
+预约
+货车
+崩溃
+削
+缓冲
+下级
+say
+竖
+填补
+心得
+黄土
+胶版
+磨损
+滚动
+肉质
+小块
+界限
+牢固
+星系
+LCD
+莎
+婆
+微分
+出门
+50000
+悟空
+困惑
+试试
+砸
+特质
+遵义
+阿里
+且看
+30000
+判定
+活血
+狙击
+光照
+竞
+周转
+David
+楼梯
+增高
+印章
+看书
+并无
+215
+缴
+此前
+昊
+西城
+节水
+宝典
+222
+走廊
+厉害
+多大
+订立
+束缚
+客座
+重伤
+山村
+日历
+官网
+两端
+问世
+加盖
+心爱
+明日
+南极
+抢劫
+贮存
+PK
+q
+流星
+上课
+一新
+亦称
+追偿
+震惊
+丞相
+起飞
+维奇
+适度
+二十多
+订
+本村
+竹马
+Excel
+馅
+高雅
+中都
+02017
+耗
+小城
+银川
+文科
+除去
+侍郎
+恩来
+sh
+和弦
+富裕
+Of
+庚
+河东
+赣州
+辞职
+166
+潮州
+皇宫
+航线
+剥夺
+采信
+自制
+因果
+机能
+向南
+呼唤
+谎言
+东阳
+花纹
+白领
+电线
+溢
+态势
+恶心
+娃
+镜子
+德军
+太祖
+佳作
+上涨
+金字
+薯
+9000
+享用
+红十字
+路段
+时任
+狩猎
+石膏
+伸出
+丽江
+iPod
+Internet
+表述
+海报
+一侧
+包厢
+德语
+座位
+五一
+菩提
+广义
+照射
+单片机
+熔点
+佐证
+宝藏
+潇洒
+车祸
+区间
+174
+仙女
+↑
+抗生
+前线
+廉洁
+稀释
+服刑
+病害
+贫
+峪
+行列
+177
+片区
+长远
+建校
+大明
+加深
+一经
+后因
+调用
+相逢
+负有
+龙岩
+黔
+蛋黄
+震动
+芹菜
+意愿
+美发
+立功
+古装
+三相
+iPad
+印次
+岳阳
+想不到
+防盗
+每当
+气囊
+进来
+清澈
+make
+移交
+拘留
+心思
+赔
+碳酸
+颠覆
+骆
+两只
+207
+视听
+亮相
+泰州
+淋漓
+捧
+当成
+191
+时而
+独创
+world
+眼科
+近似
+自成
+thing
+破裂
+气血
+泡泡
+接头
+木兰
+随地
+白银
+性价
+同胞
+阐释
+背部
+近日
+陵园
+奈何
+七十九
+假日
+迫
+分院
+华硕
+亲王
+凡是
+多变
+排球
+马铃薯
+后台
+压迫
+河西
+征集
+MS
+330
+单身
+太湖
+锥
+矮
+叶子
+心底
+南站
+水系
+157
+银色
+永乐
+友人
+事后
+火炬
+烧热
+黄石
+计时
+情缘
+在校生
+供奉
+上限
+地道
+冰冷
+困扰
+咱
+裤
+万分
+花样
+悬崖
+次次
+病症
+安防
+虐
+医保
+后备
+02015
+尊严
+几率
+167
+末端
+绿豆
+末车
+车载
+无聊
+仇恨
+中止
+五四
+报仇
+官兵
+中一
+发给
+内衣
+琦
+融化
+863
+290
+巧合
+班长
+麻辣
+家教
+出院
+鸳鸯
+扔
+│
+战后
+支流
+中科院
+著称
+画法
+小鸟
+荷花
+崇尚
+免去
+糖果
+预定
+固化
+平坦
+法甲
+聘任
+虚构
+辩论
+华山
+学界
+姑
+ò
+师长
+佳人
+前行
+弹药
+好吃
+离奇
+院系
+1924
+西域
+创刊
+杏仁
+总站
+减小
+马德里
+灭亡
+episodes
+蛙
+繁多
+中兴
+国人
+气动
+聘用
+磅
+即位
+化肥
+221
+中午
+小米
+泪水
+句子
+霖
+墓志
+收款
+逃亡
+皮质
+愧
+恒星
+互通
+哮喘
+孕育
+体外
+江门
+216
+梓
+禁用
+护卫
+不便
+上司
+离别
+对决
+8GB
+成套
+制式
+随心
+冬瓜
+日光
+稠
+TFT
+银杏
+冰糖
+新星
+带走
+闪烁
+突起
+另类
+克隆
+染色体
+自此
+筹集
+泻
+景德镇
+PLC
+朋
+苍穹
+叙
+清晨
+195
+新乡
+燥
+区级
+宰相
+收养
+历届
+Home
+绵阳
+托马斯
+赶到
+青椒
+恶意
+互补
+包头
+拨款
+权属
+主宰
+磁盘
+强迫
+逐年
+179
+640
+自发
+尊敬
+赞美
+酌情
+苟
+当场
+追杀
+心跳
+点燃
+求实
+少将
+1923
+道光
+164
+社员
+天山
+梅州
+徘徊
+香水
+契机
+落户
+强劲
+熨
+宾客
+捐款
+其一
+崩
+MBA
+校风
+接种
+沃
+车道
+各县
+入围
+染色
+动感
+光伏
+名城
+Oh
+商丘
+玩笑
+蔓延
+帽子
+牧业
+虾仁
+堕落
+宿命
+八路
+预热
+台风
+好事
+quo
+钦
+开朗
+单体
+恰
+用力
+皇子
+登山
+420
+西门
+ゆ
+临界
+ダ
+透支
+骨髓
+尤为
+犹豫
+斑点
+马鞍
+片中
+航行
+早晨
+发型
+301
+守信
+转账
+滤波
+奢侈
+止痛
+通则
+厌
+列传
+雇佣
+大典
+吞噬
+山市
+全家
+电能
+疏松
+课文
+金陵
+百强
+同情
+流氓
+1912
+剧集
+废止
+回避
+封装
+饱
+绑架
+板材
+汉堡
+视察
+弧
+防洪
+西红柿
+结识
+She
+窟
+诱导
+光彩
+不朽
+北魏
+大洋
+雨水
+品尝
+提炼
+欢喜
+捏
+六十八
+财税
+良性
+细分
+装潢
+圣地
+相等
+文号
+手持
+坎坷
+che
+沙拉
+换装
+滇
+搜
+引种
+出境
+如实
+187
+荒地
+System
+340
+党风
+下落
+织物
+灾区
+顽强
+169
+租车
+6x
+ROM
+交织
+特异
+龙泉
+风尚
+腌制
+药师
+武警
+聊城
+游人
+西亚
+匈奴
+脾气
+米饭
+绘本
+面目
+敌方
+欺负
+从容
+元帅
+搜狐
+根源
+WiFi
+吸取
+末年
+永安
+外伤
+煞
+149
+配制
+day
+撑
+理化
+桃源
+兰花
+雪山
+相近
+终究
+瘦肉
+东坡
+灵异
+开盘
+参保
+教务
+圩
+卤
+监制
+城建
+骨头
+遍及
+提请
+水体
+摊
+摩尔
+隐私
+oh
+岑
+卡拉
+外侧
+Hotel
+梦境
+莹
+东区
+着眼
+诉请
+抬
+误区
+搭档
+厕所
+学风
+砚
+抗病
+芒果
+经络
+北平
+昂
+远处
+留给
+古诗
+蔓
+花冠
+修罗
+写成
+they
+位居
+誓言
+常有
+IBM
+ū
+实效
+奔跑
+泄
+书局
+葱花
+瑜
+担
+滑动
+钓
+宴请
+逼真
+性情
+堆积
+称赞
+会费
+胜地
+谨
+笔墨
+残忍
+红豆
+乳房
+送餐
+ね
+气压
+198
+开支
+斗士
+咽
+手套
+CN
+ひ
+印制
+开元
+创世
+随身
+苔
+android
+泉水
+行情
+投身
+喇叭
+戒指
+日志
+影院
+推崇
+陀
+秘诀
+多余
+趋
+213
+潜心
+七十五
+砍
+肥沃
+海湾
+享年
+龙山
+七星
+熔
+首页
+矩
+成才
+田间
+对立
+丫头
+膀胱
+菠菜
+福特
+六十九
+工件
+工地
+在位
+脊柱
+锯齿
+福田
+石英
+诈骗
+多发
+敦
+厅长
+就可
+英勇
+一两
+偷偷
+衣橱
+原著
+杖
+部份
+尘埃
+治病
+水壶
+穿梭
+八十一
+领悟
+2100
+供养
+if
+洞穴
+罪恶
+戴尔
+哟
+189
+前款
+颈部
+址
+单个
+丞
+合力
+七十一
+刀具
+病学
+暗示
+喜悦
+指纹
+心愿
+隶
+延期
+每亩
+理科
+鼓楼
+原谅
+防疫
+312
+Chinese
+打入
+裹
+少见
+潮湿
+山羊
+激战
+省长
+回事
+ō
+重心
+好人
+周到
+清醒
+02011
+大声
+仍是
+得意
+猴子
+车队
+蚂蚁
+冉
+过往
+应变
+双面
+排气
+居家
+极度
+易学
+地基
+妖精
+必杀
+炫
+开户
+红星
+分销
+制止
+一朵
+放手
+so
+肺炎
+蕨
+入驻
+浅出
+饮酒
+new
+联社
+活塞
+海绵
+校本
+粟
+1922
+卷入
+始祖
+翰林
+中篇
+2400
+肠道
+山峰
+奕
+妍
+象棋
+二氧化碳
+传世
+军民
+方正
+马力
+World
+医务
+更具
+扭曲
+1700
+坚固
+简直
+科协
+妇联
+韭菜
+稀有
+此刻
+前途
+选址
+食欲
+伴有
+螺纹
+单项
+牛仔
+农副
+视线
+玲珑
+涧
+绵
+朕
+七十三
+阶梯
+宰
+选中
+交友
+相望
+祀
+产出
+交替
+安宁
+饱满
+词曲
+河水
+长短
+懒
+ズ
+De
+科院
+张家界
+纪实
+帐户
+倾听
+板型
+国税
+奇遇
+世俗
+瑾
+下令
+沼泽
+大哥
+吓
+城中
+几千
+曙光
+多久
+通关
+顺德
+升学
+放电
+例外
+神州
+灯具
+摔
+暴风
+不料
+防爆
+稳健
+right
+血腥
+江北
+师德
+伪造
+肚子
+209
+大山
+论语
+Vol
+胖
+传导
+健脾
+解锁
+可可
+纯洁
+过分
+匪
+射手
+厂长
+胞
+诺基亚
+寓言
+弥漫
+悉
+中世纪
+准入
+航道
+菏泽
+稍微
+游玩
+望远
+RS
+例题
+一幕
+大有
+丰厚
+鲜艳
+毅然
+记述
+追随
+师父
+可乐
+庶
+王家
+全真
+wanna
+左边
+鳞片
+大观
+火炮
+深夜
+棍
+折扣
+香料
+波长
+末世
+裸
+部曲
+茶匙
+撞击
+交接
+银奖
+嘉庆
+篮板
+骨科
+前辈
+参见
+超大
+迷茫
+文案
+又要
+多达
+218
+票务
+争论
+可知
+墙体
+巴塞罗那
+桓
+辈
+校正
+画廊
+寅
+摆放
+Word
+夸张
+金牛
+票房
+北侧
+情仇
+沸
+当归
+北站
+芹
+督办
+族群
+面膜
+万能
+逃跑
+忠实
+脚本
+一说
+悠
+愉悦
+听见
+腹腔
+招募
+爹
+转世
+知情
+天体
+2200
+鼻子
+育苗
+波罗
+焰
+行长
+字形
+蜡
+山河
+准予
+深渊
+血脉
+牌子
+书盟
+道家
+东营
+经商
+凸
+旗舰
+大将
+山中
+隐居
+用语
+作画
+定性
+八十四
+间距
+every
+承租
+肢体
+辛亥
+传人
+总共
+高一
+重返
+dB
+304
+严密
+适中
+小叶
+法宝
+1919
+如来
+首尔
+离合
+刻苦
+后排
+煸
+而死
+智利
+敏锐
+求精
+你好
+还清
+尴尬
+心目
+1921
+射程
+薪
+浇水
+CT
+Lesson
+须知
+起始
+居士
+谋杀
+鲜美
+宽容
+列宁
+政党
+上机
+松江
+癫痫
+羹
+一员
+转折
+水温
+低调
+梢
+脊索
+古朴
+北面
+切削
+196
+大名
+包容
+并行
+252
+涩
+小球
+虑
+全军
+飞扬
+南宫
+微微
+闵
+可调
+除非
+军医
+as
+执教
+较快
+隐形
+幻剑
+伴奏
+4500
+气流
+前台
+开头
+青花
+宜人
+航班
+奶粉
+拘
+难过
+血浆
+结尾
+A级
+图纸
+求是
+诊
+郊区
+综
+喷射
+回味
+豌豆
+日程
+菠萝
+5400
+花草
+常德
+768
+走势
+主唱
+片名
+九十七
+花香
+mAh
+终生
+р
+斧
+奥特曼
+大棚
+堵
+内务
+邻近
+造船
+收据
+同仁
+质感
+朝代
+流感
+一如
+盛大
+七十四
+冠心病
+德甲
+平淡
+腻
+己任
+序号
+曲面
+方药
+名将
+相声
+外围
+水路
+水浒
+魔界
+一文
+大肠
+次日
+常熟
+干涉
+九十六
+璀璨
+点火
+泸州
+逝
+忧郁
+向北
+金龙
+235
+煤气
+三界
+奢华
+报批
+前夕
+80后
+沸腾
+流年
+搅
+会务
+企业集团
+惧
+屏蔽
+开来
+绪
+崇高
+轻微
+原为
+良种
+一字
+偏差
+一封
+萼
+助力
+272
+取向
+冷暖
+看守
+美德
+咽喉
+领略
+情深
+曦
+ビ
+辩
+pH
+Baby
+流入
+内侧
+平整
+供求
+长老
+千字
+栽植
+茹
+反诉
+娄
+劈
+敲
+底盘
+touch
+剥
+侵蚀
+磁性
+采样
+推介
+烯
+人权
+凝固
+会馆
+援
+增设
+草坪
+晋级
+丰收
+复发
+一格
+城东
+涂层
+航运
+装卸
+纯真
+维多利亚
+贩卖
+卵巢
+设想
+抄
+滞纳
+远销
+苦难
+上旬
+xx
+放开
+海边
+纬
+在场
+值班
+302
+能耗
+挥发
+诉求
+沿岸
+抽样
+主教
+科委
+何以
+研修
+彰
+屠杀
+能手
+盲
+盐业
+盛开
+俱全
+龙王
+搭载
+导读
+宏大
+上将
+鼓舞
+纽带
+履
+辜
+轻型
+分层
+214
+水田
+高空
+厨师
+硝酸
+紧邻
+并列
+阔叶
+侵入
+留言
+意乙
+明治
+殿堂
+234
+高超
+区码
+人道
+普陀
+意图
+肃
+1911
+坏死
+沁
+连连
+邮局
+划归
+婴
+靶
+奶酪
+甄
+砖混
+慎用
+著述
+坎
+OK
+斯基
+龙潭
+帅气
+南侧
+集镇
+高地
+快门
+02010
+遗失
+换乘
+侧重
+荀
+青蛙
+姜片
+造诣
+技工
+水生
+沐浴
+一览
+专区
+网格
+扁平
+石门
+距今
+向外
+祈祷
+涨
+显现
+016
+中式
+冶
+时时
+涌现
+驿
+饼干
+纵队
+军衔
+聆听
+缤纷
+搬运
+成因
+咸阳
+写真
+呵护
+逆转
+旨
+天龙
+族人
+脱水
+防火墙
+瓷砖
+午夜
+保税
+小龙
+妖孽
+惑
+15000
+乐山
+露天
+闷
+怜
+never
+耶稣
+课件
+WMA
+萎缩
+侵
+邢台
+实实
+GPRS
+两部
+鲜血
+浮雕
+初年
+间断
+期末
+盲目
+上百
+奥秘
+空格
+较差
+全额
+好多
+区县
+大体
+碘
+藕
+麻将
+海棠
+求学
+平平
+向西
+诱发
+面条
+启迪
+嫉妒
+全套
+过量
+卫浴
+偏偏
+调配
+牛津
+袖
+记事
+驰名
+庐
+孔雀
+教书
+通州
+琵琶
+侵占
+工贸
+干线
+more
+揭露
+缺失
+污
+起居
+224
+金曲
+解码
+佛像
+治国
+肽
+中书
+指针
+大局
+匆匆
+bit
+199
+除尘
+寂
+建军
+重叠
+为期
+卷二
+同治
+上官
+裙
+过大
+PLAY
+锰
+戎
+梁山
+笛
+花儿
+七十七
+死刑
+黄埔
+止血
+冲锋
+讽刺
+听众
+王府
+英冠
+ǔ
+细长
+信阳
+02012
+以求
+看待
+上岗
+宝安
+轰动
+缓缓
+预售
+三类
+呼和浩特
+知县
+see
+193
+礁
+英尺
+雪花
+画像
+江阴
+英才
+自如
+警务
+专长
+方剂
+产区
+heart
+深海
+流泪
+媳妇
+分量
+一会儿
+血脂
+布朗
+爱德华
+大殿
+诗意
+萧山
+总队
+242
+益气
+朴实
+蔚
+指控
+EP
+с
+生肖
+天水
+契
+展厅
+ガ
+座落
+惠普
+萃取
+特技
+农用
+东站
+辨证
+应诉
+东湖
+南下
+手写
+中秋
+2D
+周易
+氧气
+慕尼黑
+坚信
+首要
+哲理
+回转
+借口
+其人
+啦啦
+嘉定
+必读
+记账
+讯
+重逢
+南面
+向阳
+颇具
+军校
+誓
+留守
+始皇
+沧海
+阵营
+放映
+遗留
+光华
+违
+面的
+触控
+野心
+255
+五彩
+子房
+布尔
+鲜活
+实惠
+严厉
+标兵
+八一
+表皮
+嫌
+脏
+when
+AutoCAD
+竹林
+气球
+核发
+甩
+嵌
+印尼
+⑥
+相亲
+盛行
+比率
+For
+两家
+扭转
+iOS
+man
+尖端
+婷
+时限
+军委
+修剪
+娟
+鸽
+掉落
+245
+写生
+简要
+固件
+股价
+纯正
+天内
+倦
+出世
+拉萨
+日趋
+MPEG
+蜜蜂
+暧昧
+耐用
+坚硬
+安心
+想着
+整车
+参合
+替换
+骨质
+晖
+先驱
+老头
+阻抗
+翌年
+被捕
+皇室
+贴士
+水泵
+解脱
+晕
+只见
+火烧
+板栗
+络
+黄花
+感慨
+蕴含
+渗
+升华
+身旁
+恰当
+法兰
+围棋
+客栈
+乐坛
+″
+柯南
+附中
+屿
+糊涂
+衫
+孢
+惊心
+饺子
+老龄
+贱
+旺盛
+305
+比尔
+刺客
+墙壁
+相伴
+必定
+柔和
+净水
+党派
+直通
+行者
+木瓜
+罗汉
+钻研
+倒霉
+衔
+镇压
+漏洞
+苏维埃
+马拉
+聪
+选出
+水务
+产能
+三轮
+博学
+擅
+截面
+叔叔
+跌
+瑞安
+绒毛
+头顶
+厚重
+穴位
+装机
+爱过
+字典
+伪装
+反响
+辊
+期权
+人名
+酰胺
+丽水
+供货
+先秦
+期满
+同性
+眼球
+黑客
+乘车
+陕
+锦绣
+越好
+共鸣
+江河
+隋唐
+绒
+脾胃
+劳均
+随风
+甜美
+青云
+信道
+换届
+肚
+电厂
+英汉
+难免
+纸质
+巡逻
+晟
+单机
+凭着
+沧州
+学费
+汀
+螺丝
+适于
+平滑
+忍受
+组团
+将要
+悉尼
+筹划
+520
+商机
+酿酒
+招待
+国度
+证监
+时段
+瓦斯
+课标
+博物院
+南岸
+舒服
+细小
+进驻
+天台
+濑
+清香
+结印
+攻打
+庐山
+一向
+秦皇岛
+尽力
+深造
+车主
+京师
+黄冈
+编入
+宜宾
+モ
+亚麻
+修饰
+空前
+烘干
+下册
+这块
+河畔
+亲手
+隐蔽
+循序
+每晚
+伤寒
+巨星
+以东
+正直
+抗癌
+抗原
+惠民
+228
+民兵
+梵
+金沙
+低音
+手表
+本条
+号称
+强行
+西周
+生院
+安居
+楚国
+胶原
+征战
+次元
+退化
+精良
+里昂
+事变
+维权
+妨碍
+女装
+挤压
+摇篮
+红花
+性命
+223
+感光
+恐怕
+Music
+パ
+运气
+讲学
+新作
+插曲
+胚胎
+231
+水道
+舞曲
+果园
+变卖
+来讲
+越冬
+啸
+新秀
+1918
+相依
+糕
+was
+or
+book
+腹痛
+赶快
+财险
+大神
+其妙
+两委
+宝山
+Express
+秦汉
+急诊
+文笔
+况
+雅典
+婆婆
+全都
+食管
+开车
+分集
+违背
+去掉
+Office
+雄性
+温水
+颈椎
+末期
+跪
+滞
+救治
+花梗
+派人
+老少
+殊荣
+嗜
+胜负
+精油
+巅
+俄语
+液化
+狠
+暗杀
+捞
+滋
+缠绵
+伯爵
+实录
+想像
+雅思
+包子
+千秋
+公分
+高高
+light
+九十三
+九十九
+品格
+1366
+全场
+此种
+驴
+做大
+完小
+蒸气
+丝毫
+想念
+钟情
+长相
+可信
+缔造
+脑袋
+衢州
+4x
+黑人
+鞭
+红酒
+转弯
+漂
+刹车
+西站
+湖区
+损耗
+面面
+办案
+便民
+军用
+置换
+艾伦
+颜料
+承德
+村村
+2x
+知己
+门店
+原发
+种群
+越野
+掩
+而行
+能效
+知晓
+缠绕
+清凉
+故里
+花柱
+内阁
+国旗
+领事
+鲤鱼
+贵妃
+插画
+党务
+新民
+八十二
+裂缝
+陛下
+盛典
+售价
+球迷
+ji
+But
+亚运
+河谷
+病逝
+拖动
+读取
+219
+暗黑
+博大
+т
+藩
+押
+乡长
+大中
+日夜
+电荷
+下乡
+耦合
+市容
+负面
+长官
+垣
+光大
+2cm
+控件
+余家
+神庙
+出借
+NET
+高三
+√
+神魔
+立式
+教职
+点缀
+芝士
+插座
+纤
+脉络
+win
+动工
+失常
+二极
+五花肉
+肖像
+重现
+名声
+清初
+影业
+抖
+松下
+哪儿
+剪纸
+产卵
+街区
+飞天
+联军
+核查
+医用
+儒学
+女郎
+镍
+播音
+教员
+日出
+软化
+1914
+防汛
+荔枝
+课后
+汉中
+指点
+亲身
+炮兵
+北斗
+遵
+家装
+平定
+SKF
+果汁
+向后
+Ω
+肘
+二百零七
+外墙
+濒危
+周一
+抬头
+证言
+金石
+一击
+天国
+专员
+患有
+217
+接线
+功用
+华为
+集结
+兴隆
+盐水
+拉伸
+成虫
+格林
+内核
+舟山
+殿下
+学分
+蹲
+感性
+总计
+肾上腺
+提议
+中标
+树皮
+大树
+齐鲁
+煮熟
+平阳
+甸
+异性
+010
+宝鸡
+手臂
+减免
+两天
+首部
+dpi
+经办
+PVC
+墙面
+演技
+莽
+急需
+昏迷
+吉安
+ch
+富士
+司徒
+IN
+战友
+咸丰
+Visual
+丰田
+霉
+移送
+字幕
+染料
+90后
+金星
+OF
+馆长
+仙境
+协和
+钢丝
+估算
+万达
+灯泡
+冶炼
+州立
+热线
+学到
+中卫
+掩护
+城内
+轴线
+兜
+引力
+教徒
+茧
+多情
+気
+半球
+公办
+熏
+失调
+石榴
+宇航
+262
+凿
+at
+歧视
+天皇
+出击
+腊
+DJ
+哈利
+妇幼
+杂文
+树干
+大坝
+经度
+雷电
+国营
+about
+100000
+推向
+凶手
+雄伟
+冲刺
+驾车
+意料
+一端
+刷新
+榜样
+柔情
+1905
+压制
+西宁
+答题
+金黄
+噬
+蓬
+抱怨
+报社
+盼
+暴雨
+来历
+评级
+列出
+杜绝
+战火
+抗辩
+变性
+松弛
+八十五
+水性
+矿石
+滨江
+盛名
+说到
+微创
+禁烟
+血栓
+珍稀
+后任
+落成
+鹦鹉
+剧烈
+稀土
+云集
+繁体
+必修
+涅
+两地
+禅师
+未见
+白雪
+拇指
+屠宰
+就职
+思潮
+文稿
+乡土
+辨析
+大鼠
+学府
+要塞
+194
+古怪
+战车
+ERP
+八十三
+彪
+JPEG
+文武
+统称
+明末
+据悉
+储藏
+整套
+疗
+迪士尼
+逃生
+仙子
+室温
+史密斯
+卯
+扩充
+免除
+攀
+沸点
+连续剧
+全线
+惊讶
+建平
+较量
+唐宋
+古道
+中层
+数组
+豆浆
+余姚
+以西
+其父
+机房
+招牌
+麻油
+古文
+亏损
+车轮
+下子
+流出
+心血
+归口
+海景
+迁徙
+蛛
+出家
+条纹
+大阪
+轻度
+张家
+起动
+弱点
+王牌
+沥干
+交谈
+Power
+谥
+阻挡
+红薯
+马路
+共创
+塞尔维亚
+高宗
+怀抱
+编导
+抵御
+伴侣
+漫游
+大院
+墓葬
+西乙
+防控
+时年
+灭绝
+长发
+恶化
+缘分
+共识
+匠
+位移
+庄严
+兴国
+some
+央
+Photoshop
+永康
+沿用
+隔音
+迷离
+祈
+玄武
+意向
+在意
+昵称
+会社
+耍
+玉林
+守卫
+远东
+印第安
+维斯
+环形
+俺
+野蛮
+印度尼西亚
+顺应
+五言
+穿透
+滚子
+上万
+堵塞
+目睹
+蕊
+词目
+One
+反正
+裘
+露台
+2600
+社保
+奶牛
+网址
+THE
+太行
+笑着
+外币
+沙龙
+微小
+天神
+AT
+奔驰
+226
+蝉
+环氧
+维度
+花果
+排泄
+诗经
+薄荷
+享誉
+学部
+挪用
+02014
+尚有
+铬
+锦州
+兴办
+魏晋
+岐
+椅
+保洁
+霉素
+take
+否认
+卖出
+水深
+迫切
+八十七
+警示
+CM
+堆放
+电位
+牟
+ハ
+昼夜
+盒子
+亲近
+宜兴
+获胜
+Co
+质检
+어
+卸
+凝结
+屏障
+慈溪
+专人
+船长
+商户
+xi
+지
+采摘
+出众
+展品
+八十八
+入伍
+妥
+来信
+典籍
+洼
+基建
+沦
+丰台
+饿
+法医
+走私
+233
+得益
+自带
+智商
+手足
+黄豆
+8cm
+报关
+牧场
+样板
+风筝
+愈合
+아
+演进
+USA
+高祖
+地委
+悬浮
+昌平
+安妮
+伊拉克
+许昌
+12000
+吴江
+网王
+咖喱
+莺
+谏
+万家
+敬请
+原味
+房地
+钞
+大河
+本色
+1917
+氨酸
+James
+黄褐色
+楔形
+宿迁
+GDP
+玉石
+脊髓
+幸存
+借用
+共振
+拿起
+全心
+绫
+表扬
+刑期
+裔
+多姿
+事发
+敌军
+蜕变
+鸡翅
+切口
+救护
+MM
+繁衍
+分册
+石碑
+菱
+238
+暗中
+疮
+围剿
+文帝
+盏
+催眠
+1913
+文昌
+熹
+If
+赔付
+嗯
+故人
+vip
+林中
+隆起
+断面
+画册
+避开
+火爆
+大叔
+0mm
+通水
+幼年
+身亡
+粘膜
+灵芝
+入会
+棒球
+看作
+龙岗
+派对
+言论
+花絮
+中亚
+地税
+想过
+油漆
+调和
+盛宴
+烟雨
+寄宿
+CMOS
+玉器
+安康
+特制
+宽屏
+暗恋
+花岗岩
+淑
+奸
+战机
+大户
+连环
+坞
+上行
+入户
+桦
+在建
+律诗
+侨
+前瞻
+找回
+按期
+职守
+课本
+自豪
+歌迷
+向量
+领地
+估价
+下垂
+萌芽
+八十六
+海参
+悲哀
+com
+立交
+肉类
+雕像
+一品
+唤
+小岛
+嫁接
+语录
+潜伏
+大象
+曲艺
+紊乱
+校外
+血小板
+缺口
+动荡
+原地
+风力
+沉睡
+祐
+相通
+永生
+装载
+都督
+正宗
+切开
+风气
+蓬莱
+长治
+五星
+富豪
+矿井
+低价
+叛逆
+降解
+议案
+哎
+敖
+位宽
+喝酒
+波特
+游离
+全美
+道歉
+逊
+东门
+绿茶
+197
+ョ
+新婚
+番禺
+配偶
+楼房
+结石
+自家
+甲醛
+藏品
+橘
+硒
+功德
+小事
+F1
+降落
+公爵
+变幻
+小鼠
+水池
+小屋
+九十二
+girl
+给水
+贬
+飞跃
+假期
+平和
+海豚
+追击
+but
+263
+百里
+山川
+名医
+词人
+316
+涡轮
+乞
+古巴
+荟萃
+武当
+放出
+幻影
+修补
+以防
+收拾
+III
+虹桥
+学年
+党内
+鱿鱼
+教改
+XX
+恋情
+나
+滋润
+运送
+笨
+下行
+好学
+七大
+感叹
+角膜
+肆
+m2
+石窟
+截图
+腕
+闪耀
+肾病
+山楂
+哇
+天人
+罢免
+保湿
+细心
+苏醒
+党史
+留校
+耀眼
+矿区
+漫步
+祠堂
+翰
+反动
+捣
+部首
+本性
+廊坊
+写实
+开学
+二十余
+组分
+埃
+书信
+固有
+参军
+1910
+懿
+心力
+减压
+波澜
+选秀
+天门
+元朝
+文中
+Black
+花花
+髓
+应以
+020
+阜阳
+长效
+存活
+离散
+倩
+餐桌
+羡慕
+版块
+担负
+凡人
+捡
+评析
+贵宾
+六安
+241
+雌雄
+网易
+充沛
+募集
+此项
+抗性
+天成
+头像
+路边
+茜
+杀伤
+导电
+夜里
+腥
+邮寄
+有特色
+发车
+开除
+边疆
+置身
+楷书
+2800
+人居
+推力
+践行
+牢
+颐
+仍未
+旋风
+人像
+佛法
+School
+苏区
+参议
+诱
+自卫
+关税
+传入
+培根
+钧
+神殿
+隐瞒
+企鹅
+昨日
+440
+以至
+城北
+也好
+多得
+白衣
+继电器
+野兽
+邮箱
+购机
+墙上
+风靡
+275
+售出
+冰山
+喷雾
+源泉
+空心
+暑假
+前茅
+林区
+乌龙
+가
+茉莉
+沿途
+伸缩
+五千
+阿甲
+兴业
+发誓
+宠爱
+奉命
+试样
+早熟
+2300
+卡车
+来袭
+妇人
+神龙
+暗夜
+军阀
+微米
+333
+惊险
+学制
+前人
+口中
+定型
+开国
+洗澡
+原先
+北上
+魔神
+容颜
+二百五十七
+泰勒
+字样
+国库
+Science
+着色
+双赢
+结成
+糟
+国债
+拳击
+常数
+减退
+唐书
+未曾
+兄妹
+利物浦
+虚弱
+九十五
+部委
+规矩
+鸡汤
+容貌
+簇
+督察
+槐
+统领
+贫穷
+外貌
+265
+减弱
+图腾
+战乱
+生病
+防震
+虚空
+分布式
+一艘
+茨
+惯
+假面
+肇事
+扯
+彩绘
+疆
+в
+5g
+阻塞
+SP
+酚
+玩转
+馒头
+长于
+威尔
+擀
+顺治
+连同
+只得
+甲骨
+集资
+ID
+查理
+波浪
+在前
+国情
+杨柳
+查尔斯
+总论
+浪潮
+消极
+三家
+蚁
+软骨
+哩
+瓶颈
+豆瓣
+麼
+说起
+正统
+搓
+流经
+草木
+元璋
+篇幅
+一朝
+印花
+六合
+疫情
+ノ
+拼命
+城西
+嵩
+两句
+竭
+警报
+ISB
+汽
+三联
+直销
+打动
+宫颈
+战绩
+白金
+over
+备考
+植入
+尖锐
+Basic
+节度使
+莲子
+暇
+狠抓
+女娲
+朦胧
+何必
+初一
+故称
+再来
+节选
+坚韧
+满洲
+南湖
+999
+拥护
+转子
+串联
+棺
+楼宇
+九十四
+受贿
+宽广
+中方
+采暖
+峻
+酬
+徒弟
+决斗
+内脏
+覃
+官吏
+接管
+本领
+236
+珍品
+争斗
+县域
+亲戚
+2G
+中段
+东至
+酸奶
+三面
+抛光
+大堂
+萝莉
+颇为
+税法
+三生
+四分
+拓扑
+临近
+外用
+美白
+布线
+邮编
+午
+借据
+普法
+城门
+粉尘
+石桥
+化疗
+慰问
+历来
+GT
+长廊
+丹尼尔
+耐火
+归于
+背负
+远近
+自立
+KG
+辗转
+兼并
+打出
+帮扶
+脚踏
+并重
+存续
+扭矩
+四项
+网吧
+写意
+羟基
+芭蕾
+羁
+衬
+中关村
+摄氏
+美感
+荫
+先导
+冤家
+送往
+杠
+税款
+取材
+静止
+麻痹
+人防
+吻合
+へ
+小便
+粗壮
+愚
+厕
+合用
+菜谱
+南充
+成败
+解救
+助教
+nm
+褚
+方圆
+诉至
+007
+贪污
+特务
+建起
+分部
+540
+歇
+全资
+苦瓜
+尉
+AV
+App
+宝物
+天安
+251
+酒业
+隔热
+进阶
+笙
+天命
+官场
+于水
+茯苓
+rpm
+界线
+简史
+剧组
+蚌埠
+指甲
+一会
+峰会
+当庭
+公孙
+心律
+中人
+亿万
+大和
+热能
+洪荒
+九十一
+撇
+海岛
+违章
+大雄
+万户
+书生
+黄酒
+封锁
+那段
+248
+时节
+精深
+纹理
+求真
+神灵
+太白
+浑
+线形
+五百一十九
+拆卸
+二百零五
+池塘
+Life
+龙凤
+978750
+饮品
+雨林
+红烧
+对口
+焦作
+On
+衣物
+冲压
+展馆
+局势
+AAA
+本能
+炙
+燕子
+可读
+宝玉
+奚
+%
+鱼雷
+泥土
+奖惩
+买家
+肇庆
+南门
+既往
+3200
+作法
+推测
+焚
+坠落
+洛克
+清河
+光圈
+纠葛
+鞘
+船员
+SD卡
+去找
+酿造
+发芽
+媚
+喂养
+田野
+DVI
+布拉格
+黏膜
+石墨
+三农
+惊魂
+处世
+趾
+宽阔
+赞赏
+手绘
+魁
+227
+交融
+踪
+邮
+骆驼
+相接
+轰
+乐章
+富于
+血统
+拜访
+勤劳
+4cm
+酒庄
+勃勃
+量刑
+当做
+亡灵
+脖子
+再说
+自测
+反之
+2g
+细化
+金针
+购销
+八十九
+相差
+吉利
+揭
+奋进
+加利福尼亚
+盛产
+路过
+觅
+粪便
+动摇
+天马
+害虫
+三江
+胡子
+抗旱
+全天
+白玉
+视镜
+器物
+00g
+缪
+五笔
+get
+上手
+提问
+救出
+上山
+擒
+感激
+时速
+扦插
+棋子
+泄露
+胰岛素
+惊奇
+套餐
+⑵
+惨
+漩涡
+频段
+气泡
+新锐
+未必
+yeah
+船只
+厦
+酿
+复苏
+雷锋
+调任
+亨
+筷子
+肢
+⑴
+体温
+平顶
+骷髅
+铸铁
+财会
+绝句
+精子
+攻克
+病菌
+Ltd
+搏
+荒漠
+‧
+柔性
+Live
+皱纹
+起初
+出名
+What
+马蹄
+囧
+首府
+巴基斯坦
+湘西
+名气
+脑海
+乐于
+巨型
+骨架
+位数
+Paul
+浊
+离家
+冰淇淋
+丧葬
+边防
+排查
+小巧
+1916
+烹调
+繁育
+荐
+唑
+设法
+430
+奥斯
+换来
+肝病
+away
+东晋
+陷害
+南美
+鸾
+补给
+氏族
+心痛
+个案
+Video
+织品
+朴素
+自尊
+臧
+选修
+史书
+机缘
+Robert
+老家
+嗜血
+神农
+迫使
+摄入
+酒楼
+写道
+濮阳
+人马
+返利
+植树
+亏
+ATX
+集散
+儿歌
+张家口
+苏丹
+会话
+一二
+安葬
+片刻
+结案
+交响乐
+波士顿
+TO
+腾飞
+蚀
+乐清
+303
+偶遇
+4G
+调速
+神兽
+红椒
+映射
+High
+whe
+叛乱
+航母
+名山
+天长
+外传
+现身
+疗程
+轻工
+捕获
+变质
+新四军
+一十九
+征用
+刺杀
+下肢
+过人
+身处
+银幕
+辛勤
+香草
+个子
+导线
+文秘
+小山
+出厂
+家家
+表达式
+一应
+格外
+初审
+1915
+分歧
+离职
+370
+蚕
+亚马逊
+如其
+off
+少于
+229
+Mr
+何种
+水族
+口才
+惯性
+一波
+正中
+铲
+守法
+省份
+筱
+喷涂
+PP
+门类
+姊妹
+羌
+呵呵
+志强
+古墓
+迈进
+小花
+九十八
+路易斯
+议论
+罕
+写给
+Zhang
+罪名
+漫漫
+问卷
+锐意
+切入
+育儿
+迅猛
+绿树
+专访
+专心
+烽火
+汐
+处女
+翔实
+提及
+柏油
+漠
+娘子
+昏
+录入
+追加
+‰
+围墙
+税率
+网店
+日渐
+唯物
+出示
+村子
+序幕
+粘土
+5m
+呼吁
+矢量
+盘古
+私募
+次要
+桨
+Force
+终年
+兼备
+切碎
+收发
+浮动
+瑶族
+282
+舞弊
+斯洛伐克
+概要
+海带
+私家
+宝马
+结语
+大江
+线圈
+转业
+分流
+Your
+郴州
+头晕
+小鱼
+拿破仑
+焕
+娱
+GO
+将它
+文教
+与会
+文坛
+к
+高桥
+误解
+主轴
+往返
+5kg
+挽
+重症
+意中
+凸显
+叹息
+硅胶
+检定
+抑
+情愿
+敕
+世事
+优胜
+洞察
+市内
+sky
+贝壳
+滋养
+转运
+game
+血色
+徵
+政工
+邵阳
+群落
+届时
+健美
+纠结
+ㄣ
+二轮
+拖鞋
+泾
+求解
+球菌
+巡抚
+零八
+欺诈
+傀儡
+营救
+噩梦
+集约
+鞍山
+腰椎
+寿司
+巾
+识字
+征管
+西夏
+盯
+南洋
+为师
+电热
+罪行
+计价
+站台
+半数
+银耳
+边区
+002
+分辨
+胞胎
+潜入
+神通
+九五
+CP
+原产
+木偶
+运载
+初见
+承揽
+6cm
+л
+凌云
+酸碱
+疲惫
+果蔬
+MC
+243
+奉贤
+江东
+旷
+266
+总长
+名言
+解密
+全州
+会场
+刻意
+措
+关中
+爱护
+核对
+秒钟
+保加利亚
+风华
+千百
+315
+短路
+理智
+
+△
+密室
+为先
+媒
+心头
+水解
+Every
+安装工程
+丰满
+赌博
+耐寒
+FAG
+听觉
+厝
+方方
+工期
+秘鲁
+颠
+互利
+巨石
+寝
+佟
+昂贵
+瞻
+光亮
+功法
+尾部
+诵
+受过
+省会
+利尿
+德里
+▪
+签发
+肾炎
+斯坦
+就诊
+异步
+远征
+科举
+出路
+欺
+560
+抢险
+矿床
+男装
+电镀
+未果
+明快
+枝条
+水力
+永州
+心意
+靳
+交房
+欧式
+两性
+熟知
+黑帮
+株式
+晴天
+出走
+狂欢
+洞口
+猿
+窗户
+伊丽莎白
+鼎盛
+异术
+诗句
+这时候
+Peter
+AB
+对联
+推翻
+Ⅳ
+恒温
+人脉
+傍
+临海
+1908
+西游
+good
+萼片
+还给
+型式
+花药
+960
+艳丽
+易经
+汇款
+划拨
+高丽
+将士
+02009
+治学
+防灾
+针织
+压抑
+排斥
+琼斯
+239
+皓
+此生
+叠加
+景物
+二类
+Sun
+海宁
+功力
+咱们
+告白
+积淀
+居易
+累累
+外籍
+放过
+病历
+防卫
+带回
+244
+太和
+议事
+寡
+贿赂
+秀才
+节假
+机要
+生子
+财力
+庸
+充值
+警卫
+园内
+绮
+02013
+图层
+小狗
+RPG
+突围
+甲板
+Kg
+下雨
+古树
+大宗
+巡航
+内膜
+业态
+动车
+拽
+는
+订购
+亮丽
+琥珀
+悲惨
+迷雾
+十堰
+引证
+征文
+丁香
+囊括
+玄宗
+升起
+中立
+树冠
+253
+祖师
+Google
+芸
+460
+崇祯
+北伐
+全员
+笃
+花粉
+农学
+茶园
+OH
+母婴
+杠杆
+南安
+分红
+甜心
+汶川
+410
+扶养
+产学研
+沙河
+扮
+中锋
+滨州
+克罗地亚
+一国
+韧性
+将近
+江宁
+阴凉
+接力
+梨花
+千余
+忙碌
+青州
+点心
+养猪
+修缮
+English
+探寻
+322
+绢
+星云
+贮
+收敛
+小队
+安东尼
+惬意
+ょ
+流淌
+滋补
+优于
+Time
+西边
+白发
+路易
+手游
+重任
+确切
+意象
+退回
+When
+各族
+石板
+考取
+大便
+FC
+旧时
+避孕
+商界
+利害
+长河
+辱
+CDMA
+孰
+版画
+生生
+妖魔
+复核
+取胜
+审视
+瘦身
+路桥
+纷争
+青藏
+英俊
+继而
+焯
+左传
+乌鸦
+并存
+汲取
+呼应
+谱写
+AVI
+过硬
+至关
+英美
+深沉
+油墨
+注目
+feel
+白虎
+Be
+兵种
+ㄥ
+下层
+力图
+仙剑
+射门
+杀虫
+诉说
+任性
+真谛
+君王
+蛊
+拦截
+灵气
+恩施
+划定
+满分
+招式
+专有
+坯
+288
+亲友
+ME
+掌管
+树叶
+绅士
+牲畜
+逝去
+奇观
+悄然
+眩晕
+钱币
+上空
+山寨
+兰德
+造影
+轶事
+吸纳
+才子
+锻造
+长三角
+1906
+文选
+牌坊
+兵法
+清爽
+评测
+武力
+一剑
+痉挛
+就餐
+胤
+原生
+烧伤
+纳粹
+捐
+选题
+棠
+城际
+别致
+vs
+每隔
+列举
+下山
+吼
+衡水
+加成
+Liu
+雷霆
+大风
+246
+淮北
+文凭
+接轨
+新品
+依山
+延边
+3星级
+承保
+橙色
+跑到
+格调
+苦苦
+霸气
+研磨
+纪元
+总管
+茗
+约瑟夫
+奥特
+淹没
+好感
+总分
+视网膜
+倾心
+峨眉
+晃
+军政
+武夷
+真假
+杰伦
+编撰
+村建
+落入
+50Hz
+炮塔
+钵
+穿插
+That
+牧师
+整整
+备忘
+寄托
+天宝
+数理
+晾
+每秒
+佛罗伦萨
+大葱
+齐齐哈尔
+强壮
+构想
+SL
+脓
+ceae
+陪同
+分娩
+
+站立
+西区
+净土
+乐府
+二三
+兴旺
+城管
+问道
+阻燃
+迷信
+互换
+落花
+Technology
+魔族
+向着
+富民
+易燃
+立方
+苍白
+链条
+迁入
+圭
+现货
+质押
+瑟
+立于
+紫砂
+扭
+揭阳
+行书
+群山
+甲等
+顿时
+曜
+蒸馏
+一滴
+木雕
+速生
+挂职
+一版
+概
+明年
+音量
+锦江
+段落
+梭
+迁至
+军长
+撤退
+内战
+家政
+居于
+小林
+瀑
+鸡腿
+床位
+幼稚
+胸怀
+电场
+救人
+今夜
+疫病
+山口
+白沙
+齐国
+迷惑
+石雕
+特等
+砂浆
+联席
+疲
+勇者
+再造
+308
+制导
+装入
+五项
+集市
+白血
+阳明
+秋风
+杂草
+伤痛
+至高
+交战
+做起
+都城
+沸水
+幕墙
+官职
+红茶
+精制
+中风
+绯
+感官
+厢
+喔
+色拉
+整天
+面孔
+工笔
+精炼
+大业
+爆竹
+复查
+ェ
+美展
+火柴
+学社
+胜任
+糯
+饥饿
+晶莹
+阿拉
+灰尘
+用工
+亚当
+东面
+阜
+纹饰
+造像
+旧金山
+卫国
+追索
+292
+深山
+修身
+蕴藏
+山路
+通天
+减去
+功底
+说唱
+万象
+笼罩
+差点
+少有
+帐号
+行事
+详见
+清远
+254
+ㄐ
+夏威夷
+逗
+实景
+春夏
+237
+湖畔
+岱
+礼拜
+正气
+俏
+浏阳
+河边
+body
+赤峰
+矿泉
+俘虏
+而至
+书系
+载入
+突厥
+鞋子
+下水
+慷慨
+剪切
+总署
+肛门
+海里
+281
+画报
+京津
+灿
+璃
+谊
+振荡
+学籍
+流露
+要领
+end
+蜗牛
+透明度
+为名
+白话
+猪油
+配乐
+道人
+掷
+过失
+来往
+分店
+261
+赞扬
+丛生
+囊肿
+限量
+黑洞
+加勒比
+肠炎
+导管
+七彩
+零二
+兑
+绑定
+东盟
+辐
+Bridge
+射频
+走路
+大字
+恰恰
+透析
+炒锅
+戊
+草根
+Go
+加盐
+双人床
+此文
+考前
+可观
+画集
+眼部
+鳃
+入境
+称谓
+米粉
+薄片
+MP4
+免税
+兢
+陶器
+强国
+耗材
+高音
+假冒
+炮弹
+堕
+通达
+390
+男士
+击中
+刹那
+伤痕
+港区
+花色
+嘿
+可变
+越过
+马达
+书刊
+极少
+自营
+表白
+教养
+勐
+衰落
+这支
+梧桐
+撰稿
+015
+窗外
+325
+只因
+眼底
+PDF
+强悍
+施展
+形似
+幼苗
+ve
+洁白
+希特勒
+陇
+名贵
+店面
+巴马
+定名
+烛
+晚餐
+成龙
+点播
+鱼肉
+度量
+其二
+常人
+肉体
+上饶
+虚无
+戟
+周口
+冷凝
+川菜
+足额
+书香
+家禽
+耐热
+源流
+种名
+死人
+唱腔
+患病
+AA
+加减
+V1
+尿道
+捐资
+协力
+02008
+名利
+新安
+灵山
+污泥
+数次
+火热
+东边
+可贵
+贴身
+零三
+诊所
+爬行
+交货
+抓紧
+X线
+掺
+粪
+凑
+韩语
+东华
+延缓
+酷爱
+憧憬
+嗓音
+盛会
+粉色
+兼具
+韵味
+富翁
+鲤
+寄存
+经由
+四中
+专营
+1904
+王后
+仙界
+怀化
+促成
+游行
+月饼
+预留
+宛
+牵手
+私有
+制宜
+竿
+枣庄
+瓶装
+秋冬
+客商
+结业
+居室
+路基
+活化
+领衔
+上皮
+体例
+乖
+共生
+力作
+心电
+贪婪
+歌谣
+滴滴
+光子
+聚居
+灵动
+失信
+同居
+五人
+功勋
+英里
+递
+从头
+狂热
+制服
+楹联
+总和
+012
+茄
+庙宇
+流产
+如梦
+焚烧
+雌性
+淹
+防雷
+蚝油
+system
+管网
+悔改
+暴动
+上书
+废墟
+降温
+臻
+怪异
+剖
+换热
+WAP
+辨认
+租借
+文华
+肉丝
+阵列
+1907
+牛顿
+闵行
+作坊
+秦国
+切块
+塑性
+商务印书馆
+倡
+268
+网购
+玖
+萝
+竭诚
+抚顺
+API
+粘度
+张开
+蘸
+309
+引出
+娘娘
+臭氧
+贸
+语义
+比利
+睿智
+各市
+Play
+当家
+郊
+榆林
+保值
+庙会
+涂抹
+哼
+首相
+宛如
+285
+草书
+接纳
+警惕
+浑身
+only
+预告
+胀
+机智
+张家港
+广安
+滁州
+海燕
+穆斯林
+离休
+万岁
+画卷
+258
+老天
+荷载
+服药
+深邃
+军工
+元气
+谷歌
+溜
+企划
+用车
+烈火
+1902
+遁
+风范
+极富
+拂
+年表
+车库
+NPC
+人保
+太监
+谈到
+议题
+每逢
+气功
+选集
+•
+主客
+壤土
+毛巾
+晒干
+KTV
+发证
+舆
+鸭子
+知觉
+义勇
+塌
+檀
+滚滚
+找寻
+三省
+三环
+华文
+铅笔
+废除
+民谣
+阴性
+入库
+油锅
+缺血
+刚才
+复利
+彤
+增压
+巫师
+实为
+转到
+311
+锈
+连年
+战舰
+营利
+ほ
+走来
+建房
+孤立
+造福
+降压
+严峻
+细微
+通货
+休克
+罐头
+公署
+1903
+日元
+壤
+消肿
+乖乖
+奋
+征税
+鬼神
+这家
+赛场
+连线
+汤汁
+演说
+到会
+び
+览
+锯
+榆
+断层
+倾力
+芬芳
+为生
+磨难
+高僧
+247
+部下
+师兄
+出水
+挤出
+条码
+仗
+DIY
+凯文
+几经
+다
+底下
+建安
+背影
+耕耘
+CE
+栩栩
+旱地
+四库
+掠夺
+闪亮
+嚣张
+山门
+固态
+淮河
+透气
+封神
+刊号
+猜测
+云龙
+950
+华尔
+纤细
+俊杰
+五官
+根系
+肤
+汉口
+篮
+自传
+点数
+星火
+高明
+美文
+商量
+等候
+声望
+防伪
+花丝
+CO
+北洋
+围攻
+分队
+一碗
+尿素
+导出
+札
+259
+伤感
+老抽
+1909
+心事
+范本
+下颌
+信徒
+项链
+上篇
+边路
+温岭
+定植
+可控
+讨伐
+北岸
+美协
+相册
+门槛
+心智
+错落
+奇葩
+来一
+兴盛
+报销
+妥协
+acid
+/
+单据
+平湖
+简述
+533
+003
+石柱
+5V
+精巧
+广元
+元和
+光度
+去处
+核电
+了然
+掠
+龙族
+充当
+灼
+self
+编审
+321
+背包
+EUR
+驾驭
+展区
+喙
+阐明
+秘籍
+比特
+百人
+外在
+伟业
+进食
+极佳
+乞丐
+盗墓
+寻觅
+烘焙
+分包
+穿刺
+332
+沦陷
+平地
+云雾
+长乐
+咫尺
+遗存
+垒
+惟一
+直板
+窥
+Just
+天性
+紫金
+单价
+缝合
+AS
+化作
+莫测
+相位
+宫中
+超前
+谈论
+宗祠
+永嘉
+大雨
+中子
+发文
+毒理
+属实
+遇难
+修辞
+萱
+诙谐
+内含
+He
+牡丹江
+疑惑
+龙华
+与非
+群雄
+high
+补肾
+投机
+铃木
+请示
+牛排
+佛学
+加剧
+摄制
+曲阜
+拨号
+吸水
+桃园
+心境
+げ
+306
+多一
+支线
+爱丽丝
+鸦片
+疏散
+音符
+扒
+飞龙
+这儿
+茂名
+金城
+阻滞
+为数
+针叶
+HTML
+下岗
+趋向
+特权
+低头
+解调
+比起
+闯入
+元宝
+密闭
+Win9x
+我方
+打倒
+礼貌
+余杭
+明知
+阿富汗
+堡垒
+羽状
+352
+患儿
+烟雾
+land
+长长
+极力
+基石
+人行
+AAC
+百分点
+ABS
+男儿
+前去
+海运
+剁
+一十八
+防线
+修筑
+获利
+常态
+睁
+关机
+show
+唐人
+僧人
+诞辰
+婚庆
+稿件
+甲酸
+011
+院内
+礼服
+莱特
+翘
+某人
+浩然
+胰腺
+等长
+熏陶
+Richard
+肩膀
+飞速
+澄清
+水箱
+咪
+挂钩
+同位素
+纯度
+厥
+伸展
+重工
+椰
+现役
+得失
+韶关
+喷气
+仅次于
+307
+绝技
+掩盖
+掌门
+周五
+童子
+书房
+今晚
+敬老
+运筹
+极速
+说文
+甲醇
+蕴
+红河
+王室
+湖水
+甘露
+电量
+发货
+扑克
+真挚
+搜寻
+运城
+陡
+人世
+沉浮
+维克
+画风
+浩瀚
+打死
+花被
+吞吐
+left
+Passage
+勉强
+背板
+弱势
+保镖
+痴情
+抽查
+gonna
+昕
+钟楼
+版面
+洞庭
+认罪
+陈皮
+车体
+啼
+要件
+伺服
+附注
+求助
+换算
+赤壁
+潮汕
+醋酸
+印记
+螺栓
+阿里巴巴
+螃蟹
+世祖
+菜系
+刺绣
+稀疏
+セ
+立面
+王宫
+千兆
+手掌
+徙
+璐
+志明
+绑
+按月
+果断
+箱子
+和睦
+造化
+武王
+凹陷
+税费
+丶
+寂静
+花萼
+相间
+江汉
+been
+闪存
+缘故
+贰
+车床
+积水
+数以
+巩
+绰号
+沉降
+小熊
+荷叶
+上任
+红线
+两用
+救赎
+抗拒
+注塑
+上册
+与否
+环线
+语境
+温情
+施用
+陂
+喷泉
+致癌
+多伦多
+む
+外型
+溴
+烟火
+如画
+aceae
+高价
+复古
+笔试
+2700
+散射
+毛发
+厌恶
+一枝
+后退
+瞿
+国强
+幽雅
+次子
+PE
+此人
+凰
+图谱
+致病
+钻井
+裂纹
+酒类
+318
+得主
+种苗
+帜
+泳池
+附表
+画作
+不二
+花木
+想想
+973
+画画
+待人
+马里奥
+葡
+向日
+改版
+跆拳
+天府
+井冈
+熨斗
+天平
+周日
+库区
+7200
+晚清
+铠甲
+客体
+经意
+增益
+单列
+ふ
+各具
+荷甲
+知府
+乳头
+填空
+鲸
+ě
+一课
+雯
+移居
+闽南
+笑脸
+一室
+天桥
+奶茶
+养分
+预览
+一目
+补救
+考查
+语气
+氧基
+WAV
+喀
+基质
+分段
+声称
+攻坚
+久远
+始发
+吧主
+里尔
+煮沸
+尼泊尔
+圈子
+攻占
+红糖
+踏入
+主攻
+四代
+冰心
+分立
+肺癌
+麻木
+零一
+打架
+流向
+机密
+跨度
+LG
+浮生
+坚果
+英镑
+涂装
+宗师
+腹地
+合体
+常温
+313
+办好
+尽致
+谕
+田中
+五位
+蛋液
+跌宕
+酰
+舰艇
+热衷
+主峰
+双龙
+挂靠
+背鳍
+鲫鱼
+羡
+碳水
+傻瓜
+印证
+响起
+通鉴
+天数
+龙虾
+惯例
+宸
+生灵
+づ
+西邻
+帖子
+初探
+春晚
+剖面
+福音
+翻唱
+睹
+开裂
+土司
+麓
+内陆
+村中
+天界
+微妙
+高额
+稻田
+根部
+日式
+受托
+动向
+百态
+新药
+flash
+考题
+吊销
+嘎
+互生
+小猫
+8mm
+挽救
+陕北
+两旁
+管控
+象山
+垂钓
+代办
+273
+Girl
+湿热
+面子
+婉
+趋于
+文山
+眼看
+期盼
+客串
+拉动
+参战
+高楼
+把手
+继发
+than
+通向
+细细
+原油
+丹阳
+取水
+Film
+树枝
+一秒
+上缴
+会战
+跻身
+等同
+血流
+宝库
+音色
+纺
+一抹
+时事
+汞
+这点
+瞧
+英甲
+傣族
+规制
+苯基
+戳
+为害
+House
+腐烂
+猎物
+入味
+肩负
+刻有
+颌
+倔强
+申论
+⑶
+咀
+打发
+群星
+271
+举人
+救国
+牵挂
+公馆
+坐骑
+复位
+养颜
+北欧
+威尔士
+文理
+预后
+反向
+农行
+梗概
+母舰
+开出
+译本
+lt
+横行
+拓片
+湛
+六日
+产妇
+西岸
+在下
+秉
+泳
+法令
+埔
+年号
+秦岭
+女巫
+加班
+发包
+辨别
+书评
+两点
+出新
+育成
+增殖
+多糖
+日落
+005
+谋划
+字节
+中叶
+写入
+爆笑
+洋溢
+扰乱
+数万
+510
+342
+单向
+以至于
+兴化
+柱头
+文言
+捕鱼
+治校
+长兴
+324
+Mark
+心声
+渭
+流逝
+法兰克福
+一再
+泛指
+自私
+公厕
+辅以
+血战
+逆袭
+CS
+芙
+守候
+绝色
+表层
+那位
+重构
+科隆
+well
+诸神
+隙
+算计
+纯属
+DE
+永定
+相交
+临安
+樟
+全名
+涉足
+奉行
+弊
+主意
+佛寺
+据点
+mL
+专场
+法典
+受聘
+自序
+耐久
+手感
+一来
+插孔
+颠倒
+长宁
+祖母
+小猪
+盟军
+加装
+voice
+温控
+外加
+矩形
+哆
+勘测
+开端
+言行
+弗兰克
+老爸
+丧尸
+椎
+上人
+渴
+黄芪
+太仓
+鄂尔多斯
+烹
+指责
+以期
+未央
+质朴
+249
+利税
+承运
+耳目
+Micro
+契丹
+秀美
+萨斯
+3600
+遗物
+青青
+三明
+普洱
+妈咪
+蝇
+异国
+中天
+2mm
+极性
+杰克逊
+勾勒
+志刚
+出战
+名优
+西欧
+名作
+救世
+猛烈
+塔尔
+传闻
+凝血
+以身
+贵重
+小路
+好坏
+杰作
+佳能
+红叶
+拙
+洗礼
+大笑
+学时
+住户
+攻城
+篆
+兽人
+备课
+月下
+出庭
+瑛
+主播
+工矿
+市直
+仰望
+近乎
+障
+校花
+羊毛
+一幢
+按揭
+出没
+1901
+战时
+下篇
+小河
+ワ
+双双
+王氏
+筹措
+乳化
+水管
+畏惧
+半月
+方面军
+租房
+腰部
+龙珠
+到此
+五谷
+进制
+墨尔本
+加收
+中度
+美貌
+古风
+胶片
+屌
+燕山
+ツ
+性病
+为辅
+靴
+杨家
+皮带
+光环
+募
+生境
+农贸
+板桥
+皇冠
+说着
+桌子
+敛
+发烧
+红利
+凉拌
+点滴
+滞后
+Day
+好转
+相符
+高雄
+南边
+对峙
+323
+奥迪
+新奇
+考场
+午餐
+凝胶
+照亮
+狠狠
+品类
+灸
+夸
+热潮
+光束
+西面
+混音
+歪
+Best
+规避
+翩翩
+灭菌
+名额
+盖章
+淫
+捍卫
+二元
+全境
+躯
+六道
+正值
+特意
+肠胃
+气缸
+相约
+液态
+龙骨
+Face
+闯关
+铜奖
+显露
+遵照
+成群
+ATM
+巨龙
+进城
+瘫痪
+存栏
+收缴
+谋略
+College
+讲堂
+傍晚
+台词
+国华
+主委
+北边
+玺
+⑦
+游艇
+299
+冤
+省区
+家门
+售楼
+议定
+冠状
+飓风
+弹道
+霜期
+Che
+丸子
+赏识
+过境
+复原
+铁血
+蜿蜒
+乙酰
+蓄水
+心碎
+谜团
+绿洲
+佣兵
+德乙
+米歇尔
+民风
+基尔
+落到
+果品
+正版
+职数
+徇私
+繁忙
+教案
+大千
+串行
+炸药
+珊
+抽取
+are
+鳄鱼
+ボ
+阵阵
+羞
+奋力
+跑车
+史前
+可视化
+乌拉圭
+过年
+γ
+辽东
+引来
+思索
+全院
+路人
+商行
+偿付
+Cr
+前任
+范式
+巡演
+YY
+市面
+三改
+异形
+预见
+高教
+带入
+橡皮
+推断
+拉力
+箫
+脑筋
+惊叹
+南区
+直觉
+缀
+缺损
+刑警
+位列
+名流
+欧亚
+顷
+渭南
+剥离
+存货
+身手
+脑子
+笔法
+胥
+介于
+飞舞
+益阳
+南亚
+南平
+例句
+谷物
+良心
+箱包
+官僚
+河床
+低廉
+先河
+二环
+黏
+02007
+建新
+停靠
+程式
+日新
+力气
+拐
+为王
+选配
+雨花
+销往
+糕点
+梧州
+大清
+上场
+快感
+海面
+遗体
+内服
+宁愿
+攀登
+276
+开具
+嫡
+母子
+推拿
+刹
+西蒙
+季军
+大一
+布袋
+弹出
+脐
+横跨
+苍天
+萌发
+风趣
+嗨
+排毒
+路程
+典藏
+举动
+鲨鱼
+花都
+沿江
+应聘
+五万
+尼龙
+圃
+接连
+组队
+结节
+诈
+翻开
+陨石
+讳
+中高
+精简
+倒卵形
+º
+何谓
+跑步
+妄想
+远大
+缚
+养育
+ǒ
+顽固
+连长
+WCDMA
+标明
+房源
+眸
+供述
+485
+向导
+皂
+收复
+Dr
+Blue
+闲置
+卷四
+绛
+出差
+宜春
+交代
+酸甜
+留住
+发泡
+水窖
+填料
+千家
+教辅
+反弹
+玉皇
+举止
+献血
+Management
+竞选
+中空
+安石
+散步
+回乡
+反转
+小丑
+兵部
+头衔
+早教
+薰
+国军
+政变
+玮
+笠
+钳
+鲨
+开会
+加一
+何方
+元宵
+木工
+资委
+卑
+假释
+焦距
+短语
+Dream
+镀锌
+丫
+身子
+硅酸
+铜陵
+踏实
+辈出
+取景
+顺义
+580
+head
+40000
+失控
+Young
+IF
+希尔
+礼堂
+ASP
+托运
+Jean
+烤肉
+530
+各行
+402
+乙肝
+嘲笑
+清军
+侗族
+布鲁斯
+和声
+榕
+欢笑
+北区
+骚扰
+灯火
+这天
+ケ
+014
+香山
+工匠
+三八
+摇摆
+拨付
+顶峰
+行销
+解谜
+炜
+废气
+触角
+月息
+中铁
+直肠
+淮海
+防尘
+后置
+斯科特
+玉环
+鱼头
+早日
+古时
+格格
+蹄
+02006
+天际
+报废
+螨
+大田
+急剧
+仇人
+迈向
+略有
+平生
+型材
+念头
+丝瓜
+Design
+砂岩
+三菱
+稳固
+散装
+珍宝
+福德
+home
+郎中
+泥沙
+情结
+致死
+桔
+孕期
+呵
+徽州
+考勤
+贵人
+汶
+来宾
+八达
+TCP
+RAID
+停产
+肉馅
+巨头
+宣城
+势头
+软体
+过户
+河源
+汇入
+法力
+沟渠
+相聚
+能说
+波形
+速写
+Back
+退税
+大腿
+蜀山
+揭秘
+殡葬
+直隶
+制胜
+神学
+削弱
+员外
+痒
+大灯
+西医
+代偿
+镇区
+卖场
+ぎ
+协办
+630
+绝佳
+黄疸
+题词
+汗水
+ァ
+前缘
+257
+4星
+县令
+宅男
+弥勒
+璧
+伙
+抓获
+高汤
+地标
+多维
+选编
+浮云
+聚氨
+测光
+一统
+硫化
+上校
+ネ
+抗击
+偷袭
+文具
+混淆
+万科
+1ISBN
+盗贼
+四平
+分身
+祝贺
+伤情
+体贴
+走近
+裝
+骑马
+环卫
+理赔
+诱人
+滋阴
+拖延
+公章
+发觉
+护航
+偏离
+WTO
+格言
+捐助
+屈服
+撕裂
+油箱
+闭幕
+溪流
+催讨
+除草
+华润
+括
+国君
+打通
+吃掉
+为国
+泽民
+半夜
+酸菜
+煜
+甜甜
+live
+弦乐
+莲藕
+卫士
+680
+高架
+尾声
+光影
+兽药
+铵
+狗狗
+开场
+绝招
+涯
+同城
+⋯
+诗作
+嘴巴
+长足
+下次
+斗气
+描
+迁居
+浸润
+杨梅
+点头
+定语
+归档
+买入
+沧
+枯萎
+避暑
+there
+教区
+烧结
+智者
+歼灭
+统帅
+西兰
+诺斯
+各家
+院落
+少先
+除湿
+万事
+肝胆
+为食
+雨量
+霍尔
+Now
+厘
+恺
+碍
+炮台
+教士
+捕捞
+诡
+PA
+走访
+飞利浦
+玩耍
+环城
+破损
+相撞
+4S
+重来
+桥头
+衰退
+翎
+杂技
+钟表
+象牙
+挨
+名曰
+车厢
+续集
+疗养
+国标
+迷恋
+点亮
+单击
+计委
+机翼
+CCD
+挠
+绞
+零九
+承建
+紧扣
+油耗
+诸暨
+漂泊
+00PM
+以示
+帮派
+操场
+乡亲
+集会
+醛
+King
+多米
+760
+醚
+营口
+0000
+晓东
+究
+法西斯
+糟糕
+嘉奖
+长白山
+效用
+北门
+警戒
+在世
+迫害
+司长
+重度
+妹子
+自贡
+木板
+金庸
+菌丝
+油炸
+断路
+黄龙
+美式
+手柄
+主旨
+基底
+开店
+苏军
+时分
+自卑
+283
+could
+笺
+波段
+标杆
+世民
+祺
+娥
+猫咪
+铎
+碑文
+德阳
+五脏
+喵
+山体
+CA
+校址
+How
+极化
+倡议
+后部
+忠于
+假装
+心想
+体征
+盆景
+冰冻
+勃
+ㄠ
+衰弱
+仅存
+当晚
+房山
+八景
+胁
+国安
+差额
+王位
+经纬
+风速
+何况
+丹东
+表征
+教主
+梳
+枯燥
+怎能
+霸主
+电报
+Do
+删
+武则天
+霉菌
+礼部
+校准
+玛雅
+跨境
+青菜
+移栽
+拦
+单调
+虏
+校验
+定做
+禅宗
+虔诚
+入药
+VCD
+浅谈
+重回
+B1
+Master
+飘逸
+星等
+老区
+回购
+新月
+盘中
+声名
+妖娆
+攀枝花
+pp
+因缘
+雨中
+挽回
+皆宜
+纪事
+胚
+고
+排队
+金水
+顺畅
+窃
+乌龟
+测算
+开挖
+倾倒
+舞会
+286
+血红
+TM
+原汁
+侨务
+串口
+学刊
+行道
+审稿
+满怀
+噢
+双边
+官府
+爸
+报业
+国宝
+非公有制
+土匪
+示意
+加点
+rd
+稀少
+药店
+妄
+众神
+回眸
+弼
+ぐ
+鸠
+求婚
+塑像
+公有
+赛道
+毅力
+看好
+面糊
+补血
+撤离
+Core
+洋房
+云端
+淳朴
+对阵
+情商
+物产
+售票
+皱
+乏力
+保姆
+清泉
+龙江
+燕麦
+夯实
+先祖
+345
+净重
+套路
+南端
+1250
+棣
+清扫
+Out
+板式
+世凯
+担忧
+小匙
+椅子
+逃走
+彗星
+呐
+诺言
+Studio
+玉兰
+姻缘
+抒发
+喇嘛
+展位
+267
+桑拿
+long
+昼
+凛
+烟气
+分担
+婚后
+及第
+幻灯
+硬币
+超值
+小枝
+职教
+睾丸
+嘴角
+1333
+人流
+胆小
+漂浮
+螺杆
+胎教
+帆船
+朱德
+60Hz
+二百
+县市
+静安
+泗
+汉文
+滤镜
+好书
+大巴
+神界
+坡度
+敞开
+等效
+热电
+荆门
+隐身
+儿时
+五台
+功臣
+显赫
+博爱
+书本
+毕生
+隅
+失恋
+缝隙
+喷洒
+各业
+干警
+荣华
+两栖
+298
+ㄕ
+高管
+Up
+windows
+排版
+老爷
+早晚
+내
+陀螺
+肇
+豆芽
+神兵
+Chen
+卞
+海战
+两手
+李氏
+会见
+悲痛
+粉剂
+演习
+渡过
+行署
+特派
+钨
+改用
+画中
+电控
+乡人
+羁押
+股长
+汉朝
+珂
+外务
+Max
+高斯
+掩饰
+公报
+城池
+雇主
+278
+飘飘
+学识
+失忆
+处境
+地级
+溃
+顺着
+依约
+选材
+自贸区
+计入
+支教
+布满
+送货
+濒临
+闭合
+剪刀
+慈禧
+情调
+车窗
+江淮
+Jack
+还好
+乙酸
+凯特
+相恋
+新郎
+波音
+重镇
+孚
+磅礴
+莉莉
+episode
+排长
+称霸
+芦笋
+医治
+非营利性
+搬到
+付息
+禅寺
+察觉
+海天
+防风
+前排
+DOS
+借此
+危重
+整流
+边际
+醉酒
+临汾
+毛坯
+香肠
+指派
+Don
+泼
+华美
+异物
+松花
+335
+动听
+DL
+护法
+去向
+车子
+苦恼
+an
+刚好
+原状
+那边
+译著
+貌似
+에
+鸦
+耕作
+夜空
+水手
+近视
+天一
+校企
+虚幻
+安抚
+传热
+下跌
+得太
+韶
+May
+就地
+扰
+执著
+008
+铀
+躯体
+嘴里
+一念
+梯度
+由县
+呐喊
+调皮
+名门
+弓箭
+脸部
+港湾
+复方
+东芝
+彷徨
+牢牢
+德意志
+胆囊
+守备
+分卷
+托盘
+5A
+酸钠
+高粱
+黑龙
+化验
+礼记
+节肢
+创设
+力强
+卖给
+传到
+腰带
+兼有
+黑衣
+水陆
+订货
+Φ
+壮丽
+闪闪
+得罪
+그
+时至
+甘油
+小偷
+整数
+乍
+香烟
+亳州
+窜
+叫醒
+潇
+天井
+SK
+木业
+作协
+013
+年检
+真皮
+追忆
+试论
+一指
+ㄒ
+奢
+here
+玉龙
+莱芜
+乙级
+控告
+画师
+想说
+钰
+西伯利亚
+六级
+扛
+稻谷
+南沙
+飞来
+let
+冲浪
+西门子
+构架
+退耕
+尘世
+评标
+293
+丰硕
+警官
+改写
+年终
+修为
+矿工
+媲美
+孝感
+水溶
+危急
+残余
+做工
+埋葬
+咳
+锻
+止咳
+槃
+清净
+德华
+空虚
+药膳
+截然
+CSSCI
+小三
+油料
+翻身
+三军
+管路
+产销
+阳江
+用到
+走红
+晓明
+公克
+病死
+出纳
+宋元
+瘀
+雅致
+φ
+锚
+长虹
+谓语
+奋发
+伏羲
+追捧
+白族
+绵绵
+看清
+曼谷
+橘子
+三万
+临江
+武侯
+通航
+争创
+散落
+独生
+中大
+一连串
+抗争
+举世
+烨
+朝气
+侮辱
+AU
+钱塘
+内经
+牧民
+枪支
+方略
+蜂窝
+休假
+work
+迭
+刻骨
+岸边
+户户
+深水
+⊙
+寮
+独奏
+自救
+茵
+胭脂
+穷人
+小巷
+各部
+写在
+老实
+电波
+脚印
+ö
+游历
+痴呆
+泡菜
+电车
+巡洋舰
+爱因斯坦
+延年
+西点
+宝塔
+接见
+帷幕
+高强
+圣女
+夺冠
+东欧
+倍数
+高位
+where
+釜
+人质
+菲利普
+清淡
+工科
+家境
+做梦
+拳头
+雨季
+近义词
+弧形
+日经
+莎士比亚
+松鼠
+295
+角质
+省钱
+厩
+反恐
+财神
+凄凉
+IE
+杀毒
+印染
+人族
+北朝
+大白
+平底
+丰产
+忘却
+歌颂
+缘起
+超凡
+加压
+戦
+制程
+give
+果皮
+一口气
+宣称
+溶洞
+奇石
+Chris
+递增
+摘录
+甘心
+巾帼
+香火
+610
+俭
+妨害
+用材
+生粉
+气门
+金光
+铸就
+短小
+浦江
+酝酿
+透彻
+推选
+咲
+开胃
+民意
+埋藏
+八仙
+二批
+医嘱
+through
+漫天
+木头
+抗议
+苍生
+调换
+高处
+囚
+平日
+楚雄
+解体
+秸秆
+美院
+小溪
+逐一
+亲生
+寨村
+生根
+身穿
+大炮
+绕组
+委屈
+米高
+实木
+竹子
+刚性
+买到
+流连
+汉化
+毗
+紧跟
+箭头
+坟墓
+揽
+指尖
+打包
+溥
+南邻
+虚伪
+遗弃
+导热
+胶州
+打磨
+肿胀
+可疑
+鞋帽
+抵制
+手腕
+轮流
+洗手
+政局
+神舟
+宣判
+古香
+即墨
+沉浸
+综治
+销毁
+神韵
+往前
+镜像
+沫若
+乳酸
+取自
+捐献
+▲
+振华
+并举
+疤痕
+eyes
+卡特
+弥陀
+Inter
+各区
+古色
+把关
+ご
+284
+宜居
+优生
+农垦
+追捕
+财保
+说完
+004
+窗帘
+写法
+受命
+山洞
+爽口
+宽大
+栅
+说来
+灌丛
+地壳
+报请
+管材
+Multi
+剑客
+壁挂
+戴维斯
+本事
+麻雀
+底线
+先在
+牧羊
+频频
+安乐
+大败
+安然
+咯
+惊艳
+高程
+冷血
+晋城
+274
+蜡烛
+餐具
+格勒
+用笔
+观景
+夜色
+Free
+法门
+替身
+催要
+默契
+重塑
+避难
+刁
+滴水
+好想
+正本
+情操
+奋战
+马赛
+典当
+书屋
+管子
+范文
+相距
+装订
+账单
+徒步
+果肉
+华纳
+泛滥
+辽阔
+渡口
+502
+出兵
+病房
+成活
+荒野
+玉树
+肉片
+朱雀
+게
+题写
+叶面
+失真
+暑期
+WIFI
+269
+见识
+co
+风化
+64bit
+神明
+极好
+辞去
+Mac
+清纯
+煌
+天宫
+叛
+静音
+内胆
+6mm
+自负
+反面
+枝叶
+栗子
+冰岛
+变故
+大会堂
+摸索
+入编
+宝剑
+索赔
+绝代
+采伐
+375
+邻里
+信奉
+遴选
+猕猴桃
+歼
+山海
+ABC
+SS
+佳肴
+二郎
+交锋
+枪手
+悬念
+犀利
+温差
+大区
+羟
+上乘
+咸宁
+战败
+通畅
+归国
+三四
+하
+靓丽
+胖子
+曲调
+大头
+长年
+单行本
+连通
+曲靖
+铂
+软弱
+比拟
+白头
+教皇
+韧带
+叙利亚
+矛
+炳
+专制
+渔民
+巍
+搬家
+钢笔
+爱民
+出动
+秉持
+修道院
+Come
+凯尔特
+诗篇
+var
+富阳
+ㄌ
+远流
+music
+362
+蒿
+V2
+医术
+相克
+达州
+导引
+怀旧
+辽阳
+换取
+西式
+would
+打进
+弩
+祈求
+世锦赛
+逼近
+喧嚣
+饥
+陵墓
+年青
+ld
+媛
+兰溪
+联欢
+散布
+缩放
+喘
+询
+吐蕃
+how
+张某
+涂鸦
+疣
+阻断
+屋面
+汹涌
+3mm
+安顺
+372
+烧毁
+319
+Base
+拨打
+体形
+共性
+雇员
+马车
+脱贫
+328
+生源
+归宿
+巨额
+驰骋
+军营
+为官
+套书
+定金
+取证
+绥
+插头
+硬质
+汇票
+孩
+人本
+匠心
+骗取
+争执
+籽粒
+犁
+首任
+春日
+读懂
+沥
+炎热
+豆豉
+青色
+匣
+镇痛
+半部
+卷烟
+Service
+募捐
+七八
+电离
+送入
+根基
+价位
+当期
+291
+英军
+比萨
+斐
+骄
+甜品
+施加
+297
+月夜
+退伍
+捕食
+微克
+理发
+遍地
+淡化
+善恶
+多远
+慈悲
+相貌
+纽
+Standard
+野战军
+而立
+可得
+芋
+地平
+MTV
+计收
+般若
+临摹
+引水
+梯形
+驯
+水份
+羲之
+尼克
+链路
+来回
+盎然
+舒缓
+干脆
+光阴
+公墓
+生菜
+铭文
+建德
+Engineering
+核酸
+永兴
+天候
+省立
+璇
+兰亭
+万方
+上位
+幻觉
+3C
+小调
+着陆
+亚瑟
+屋子
+도
+先前
+法兰西
+共存
+配对
+任用
+逛
+占卜
+网状
+紧接
+税源
+脱硫
+楼阁
+蚕桑
+彝
+三子
+烘烤
+314
+AP
+偏好
+瘙痒
+广角
+舍利
+Reading
+游走
+草堂
+异味
+囚禁
+宿州
+端正
+412
+硼
+钦州
+供暖
+百变
+供需
+红斑
+上架
+托尼
+讲课
+往下
+Media
+排量
+心形
+William
+曼彻斯特
+淘气
+市县
+椰子
+粤菜
+阿兰
+窍
+相隔
+茶树
+同乡
+IC
+百计
+提拔
+红梅
+柑橘
+鸽子
+water
+看成
+专户
+峰值
+也应
+心脑
+佛陀
+3M
+安溪
+仍旧
+根茎
+台山
+靓
+高龄
+皮炎
+自选
+间歇
+淡定
+外联
+兑现
+投稿
+砂锅
+茂密
+琉
+滑稽
+纂
+重在
+并肩
+摧
+洒脱
+内江
+初三
+采光
+1898
+热忱
+栾
+华西
+选自
+辞书
+浴巾
+机甲
+层级
+620
+煤层
+翻转
+深沟
+看病
+主食
+卖家
+780
+黄海
+free
+身躯
+摩根
+女婿
+沉沦
+奇缘
+秦王
+激起
+观摩
+Bill
+梧
+开源
+荆棘
+仑
+Lv
+弊端
+亥
+虐待
+first
+法案
+石林
+隐秘
+一缕
+租用
+入狱
+关东
+评述
+金花
+大义
+2m
+问候
+杂粮
+大作
+冲破
+生性
+大枣
+取样
+万万
+零点
+大黄
+耳边
+棕榈
+一十四
+精粹
+条街
+暂住
+he
+山茶
+估值
+平仄
+雪地
+Tom
+TD
+相助
+天尊
+消炎
+历练
+穿孔
+紧迫
+许愿
+窒息
+箱体
+外层
+幕府
+镜片
+江水
+轧
+塞纳
+Game
+遮阳
+296
+三段
+气相
+﹝
+通告
+说爱
+铜牌
+剂型
+玩忽
+和县
+﹞
+胶带
+二千
+新东方
+侵袭
+图象
+方程式
+马来
+如花
+八字
+三七
+民初字
+高能
+hu
+政事
+超能
+台球
+501
+人海
+宣扬
+睁开
+初衷
+三宝
+路子
+马特
+后天
+house
+填报
+中用
+合集
+今世
+SBS
+抢断
+攻势
+领主
+乐业
+大石
+一动
+Business
+云山
+触及
+矫
+哈尼
+黄泉
+金鱼
+宾语
+牛羊
+宅基
+依依
+华沙
+小班
+行进
+竹笋
+化痰
+渚
+机票
+上路
+事例
+杉木
+006
+RAM
+学徒
+疯子
+3g
+交大
+赌场
+谐
+石狮
+只用
+千山
+觉悟
+银联
+爱慕
+车展
+连环画
+金鸡
+336
+彩画
+转出
+60000
+山头
+Yeah
+水运
+新春
+击退
+producer
+喀什
+那儿
+请假
+远去
+迟到
+货架
+安吉
+阴暗
+钻孔
+委内瑞拉
+权势
+慎重
+PHP
+争吵
+供职
+先知
+壁垒
+富家
+拥挤
+金矿
+幼时
+头孢
+门市
+救命
+人品
+again
+林肯
+车手
+跳动
+深切
+士气
+夺回
+美酒
+流血
+休眠
+镇静
+两三
+上等
+ベ
+精练
+PAL
+水煮
+病灶
+摩托罗拉
+处女作
+罗兰
+筋骨
+对流
+园地
+花钱
+面值
+肥皂
+桃李
+注视
+黄连
+布依
+Group
+远景
+深思
+抢夺
+玥
+人杰
+滚筒
+厅级
+调养
+满天
+吴中
+院子
+经受
+纲领
+抽出
+气化
+金像
+集邮
+出征
+急速
+含蓄
+国教
+女皇
+出产
+宦官
+五分
+大圣
+芦苇
+药典
+瑕
+引爆
+武进
+CBD
+半夏
+下班
+With
+vista
+字帖
+上前
+422
+原装
+膨大
+调到
+090
+384
+厂区
+恰好
+高邮
+以待
+湿性
+恪
+灯塔
+从严
+哈萨克斯坦
+荚
+时针
+放送
+钮
+资费
+名次
+吐槽
+屁股
+危及
+401
+GMP
+修道
+下限
+每期
+海龙
+配伍
+470
+狭义
+周知
+毋
+017
+祛风
+宇文
+屹立
+争端
+萍乡
+卫东
+277
+大乱
+面纱
+朗诵
+Art
+杂物
+港台
+淑女
+絮
+日日
+start
+02005
+军中
+健壮
+混战
+昭和
+을
+参政
+半身
+投篮
+洋洋
+诚意
+法式
+赞叹
+风行
+福尔摩斯
+sun
+小肠
+海角
+初具
+耗电
+千叶
+奖牌
+浓浓
+起舞
+种养
+捞起
+溯源
+MMC
+ぶ
+烷
+宁德
+身着
+知青
+导言
+一到
+摆动
+哨
+3GP
+悠然
+蒸熟
+025
+损毁
+AD
+犯人
+稻草
+一招
+肠癌
+Rock
+毛细
+暗影
+光缆
+暨南
+养血
+求得
+友爱
+立新
+可说
+一倍
+香甜
+深知
+BMP
+海伦
+春光
+银牌
+中转
+北齐
+牌号
+脏腑
+五色
+沭阳
+摩天
+拜师
+0m
+经省
+兴衰
+霄
+义工
+分隔
+滑坡
+通气
+能以
+宝殿
+半生
+净值
+烟叶
+挑衅
+犯规
+受精
+喜庆
+亲临
+大部
+本溪
+赢家
+保质
+输卵
+匾
+月中
+代代
+股指
+垢
+谋求
+书协
+阴极
+事事
+持股
+收受
+水瓶
+茶业
+梗阻
+279
+铣
+NVIDIA
+ㄤ
+渔船
+暴走
+抬起
+嗪
+肉食
+诞
+亚特兰大
+泓
+毁坏
+励
+波尔多
+桐城
+710
+居多
+带状
+曲风
+液相
+抚州
+盆栽
+安神
+并联
+测控
+风量
+墨水
+内镜
+化成
+黛
+才艺
+同义
+种业
+一档
+飞鸟
+恪守
+配比
+悠闲
+凯瑟琳
+海涛
+佚
+一气
+中线
+城郊
+寄主
+功绩
+颤抖
+心疼
+沃尔玛
+滚轮
+朵朵
+通缉
+求知
+蓝光
+更年
+鬼子
+坏人
+滥
+安县
+举起
+肝癌
+强强
+水秀
+算术
+披萨
+输电
+内饰
+风度
+Club
+主页
+葡超
+炒饭
+九华
+升格
+钼
+浑然
+加里
+滴定
+弑
+明目
+售卖
+防潮
+自考
+内幕
+日寇
+即兴
+诸子
+Like
+Two
+人头
+佳品
+劳工
+尼日利亚
+全息
+冷战
+游牧
+一式
+x768
+知音
+317
+助人
+巴蜀
+奔波
+痘
+Show
+墓碑
+BIOS
+冬笋
+免提
+短缺
+楷
+快要
+颊
+宿主
+曲子
+开播
+帮手
+Good
+青天
+麻疹
+连绵
+缺氧
+夜夜
+331
+一粒
+正德
+莱茵
+确诊
+司空
+故而
+收割
+陈氏
+ソ
+韵律
+密布
+双鱼
+猪肝
+壮烈
+乡下
+苍南
+戒毒
+户部
+汉阳
+红木
+梦见
+虚实
+set
+古董
+编成
+裳
+应运
+育才
+氽
+克制
+EX
+优酷
+划入
+农科院
+家务
+LC
+佛罗里达
+403
+流速
+爱美
+聊斋
+雏
+县境
+角逐
+罗湖
+功劳
+道场
+增减
+买方
+金子
+化纤
+苦涩
+得一
+卧龙
+住持
+声带
+土层
+布丁
+增至
+发声
+桔梗
+仁义
+金马
+狭长
+军舰
+凉山
+蛤
+卓著
+跨界
+曼联
+反光
+动用
+国藩
+毒药
+抚恤
+粘合
+芋头
+Adobe
+肤色
+大盘
+联结
+迎春
+灵性
+相公
+反驳
+引申
+利器
+低速
+县区
+030
+NVIDIAGe
+너
+耽美
+George
+肥大
+湘江
+闭塞
+人中
+盛夏
+迈出
+摔跤
+石块
+凉水
+独行
+雨后
+RM
+天边
+多于
+袜
+竞速
+ㄓ
+献身
+微风
+动静
+纶
+波纹
+432
+锣
+狂风
+执笔
+看完
+对生
+酋长
+吸毒
+徽章
+浮现
+公尺
+起床
+锣鼓
+女排
+343
+功名
+飘香
+可笑
+松林
+江户
+外套
+镀
+382
+浩劫
+灯饰
+地热
+289
+해
+Cause
+贞观
+触动
+省直
+信件
+架设
+北邻
+税额
+块状
+正道
+一言
+赤水
+精明
+迪拜
+南城
+常设
+随处
+009
+哥特
+药性
+奇才
+顶点
+元旦
+前部
+294
+大乘
+追赶
+朗读
+帐篷
+name
+电感
+亿吨
+蚕豆
+超薄
+舍弃
+怨恨
+CF
+推开
+place
+CC
+精诚
+戴维
+隶书
+摄取
+断电
+充斥
+迎宾
+谍
+改组
+伦比
+斯蒂芬
+斯巴达
+署名
+羁绊
+格子
+分发
+宦
+次序
+土建
+相知
+磁带
+球面
+中超
+地步
+交配
+懦弱
+蓝图
+申领
+香葱
+腊肉
+在后
+评判
+圣保罗
+安德鲁
+
+伊犁
+外婆
+终日
+其时
+魔方
+汉人
+米勒
+马店
+太古
+小波
+自已
+甲苯
+外向
+强盗
+索取
+脉搏
+预科
+候补
+精解
+引流
+批示
+奔走
+满满
+耐力
+手势
+上色
+意在
+研习
+彩信
+南国
+庄家
+协奏
+柬埔寨
+军方
+预选
+家畜
+多国
+史实
+蚌
+丽君
+々
+保利
+各路
+大意
+大好
+师徒
+议程
+癸
+长青
+国策
+鳖
+525
+小伙子
+工况
+恐慌
+棋盘
+推迟
+荡漾
+同安
+针刺
+武陵
+肿大
+夫子
+世外
+临终
+好运
+Li
+一作
+芭比
+倾国
+阵亡
+1066
+蓄积
+don
+升温
+东平
+往日
+笑声
+战功
+微粒
+凶猛
+蔬
+九三
+恶搞
+水火
+龙眼
+切尔西
+车速
+突袭
+吸气
+嘴唇
+麦芽
+水井
+营长
+总称
+残联
+空运
+韩式
+走入
+打字
+CG
+落魄
+版图
+蒴果
+秒杀
+金丝
+点儿
+血雨
+舞者
+葱段
+287
+句型
+381
+元月
+琢
+谒
+勘
+传出
+д
+宝箱
+境地
+速成
+神马
+形制
+求生
+歧
+逆向
+支票
+半天
+Inc
+从轻
+转学
+大块
+瓮
+西晋
+萨尔
+对症
+勉
+苍茫
+片状
+惨遭
+互惠
+小鸡
+等式
+瘟病
+RNA
+大漠
+亲亲
+铭心
+思绪
+砂石
+锋芒
+磺酸
+招呼
+淬火
+警备
+中庸
+战区
+瀚
+幽冥
+火龙
+鲍鱼
+申办
+下同
+纸业
+v1
+权重
+西风
+回旋
+再起
+White
+选型
+山石
+梗死
+冰封
+各色
+面容
+锦涛
+南斯拉夫
+枪管
+进退
+受让
+平庸
+而异
+叫声
+肝肾
+总能
+明初
+横生
+马尾松
+坳
+风口
+连任
+赦
+委派
+打的
+硝基
+滤芯
+沛
+桃子
+外置
+豆角
+天意
+秦淮
+严寒
+混交
+程控
+查验
+鸣人
+行军
+迎合
+属地
+By
+皮具
+经久
+乳汁
+就叫
+鞋业
+约翰逊
+固执
+BT
+而今
+班组
+338
+绵羊
+向来
+电算
+稳重
+刨
+贝拉
+如愿
+流亡
+MAC
+透镜
+起于
+过热
+耐受
+重阳
+努
+夺目
+征途
+Research
+遗嘱
+Hong
+趴
+远洋
+联队
+中州
+1050
+新旧
+存档
+继位
+阴茎
+源源
+白水
+守望
+维新
+平房
+颅内
+可逆
+醒目
+墟
+首长
+卑微
+造血
+真伪
+自理
+埋伏
+宽松
+驿站
+笔触
+厄
+改动
+或称
+云霄
+牧区
+乙基
+情书
+绵延
+芮
+别离
+纪要
+交响曲
+合称
+标示
+浪子
+This
+助推
+剌
+纷呈
+Sound
+ST
+音质
+静物
+夜宵
+挺拔
+文王
+民心
+布莱恩
+小鬼
+战死
+新罗
+浜
+高二
+后于
+西林
+竭力
+生在
+Conference
+肺部
+壬
+陨落
+征程
+into
+焕发
+ピ
+色情
+粘连
+库容
+善后
+彩妆
+巳
+冥王
+芦荟
+笑傲
+烩
+优抚
+见习
+5500
+two
+升任
+写照
+端午
+灯笼
+踪迹
+厚德
+岳麓
+高举
+应届
+利民
+Can
+讲义
+蚩尤
+麦当劳
+紧固
+藉
+An
+皋
+畅游
+乐天
+同源
+任一
+强奸
+辄
+睫毛
+326
+原籍
+玉溪
+魂魄
+鹌鹑
+1895
+折旧
+疏通
+ARM
+龙湾
+笔顺
+报国
+纷
+同在
+2048
+LA
+炎黄
+弱者
+题为
+1x
+PSP
+费城
+沼
+异域
+诵读
+位面
+兴安
+晋国
+罗伊
+庆阳
+将使
+零食
+胃炎
+通化
+倾角
+拿来
+kW
+倒塌
+万一
+榜单
+天池
+唉
+词性
+筠
+公演
+农夫
+体能
+烈焰
+碾
+山歌
+四等
+ゴ
+多孔
+柑桔
+MT
+郁郁
+废旧
+021
+SC
+人形
+调频
+青浦
+阳泉
+切身
+木制
+捆绑
+中游
+濡
+They
+小雨
+苷
+株型
+秆
+遏制
+二胡
+下放
+处事
+搜查
+本义
+调剂
+江口
+变法
+一汽
+中脉
+选派
+国企
+北海道
+拷贝
+弹奏
+节庆
+待定
+血量
+倒车
+白日
+落差
+谅解
+中区
+峭壁
+355
+收看
+才智
+延时
+族谱
+中韩
+到第
+反叛
+大路
+song
+限时
+长者
+油烟
+97870
+3MB
+价廉
+炼金
+海德
+群英
+尚无
+ㄙ
+贝克
+光临
+公差
+继任
+药房
+绞痛
+斯大林
+偏见
+履带
+崇明
+880
+中成药
+章鱼
+本田
+闭会
+恒定
+都江
+再版
+比甲
+粘结
+隔膜
+佳木斯
+譬如
+极地
+习得
+开山
+深感
+吊顶
+案情
+徐汇
+器皿
+费率
+纪年
+334
+392
+茅台
+稳压
+银监
+迷路
+涡
+大侠
+廉价
+海啸
+脓肿
+吕梁
+660
+部族
+电大
+洋芋
+辛辣
+不列颠
+相随
+文电
+声学
+瓦解
+200000
+选区
+美称
+传球
+辨识
+灌注
+龙舟
+盘龙
+兄长
+户晓
+м
+郁闷
+べ
+史蒂夫
+火线
+O2O
+地灵
+两头
+사
+一地
+脯
+胸前
+洞天
+腹膜
+反腐
+童装
+接送
+就近
+名士
+LV
+附后
+悲欢
+必经
+Light
+明智
+掌声
+相容
+掘
+桓公
+眶
+玉山
+二甲
+澈
+永丰
+寻宝
+教化
+头盔
+端庄
+平江
+伊藤
+膳
+五中
+嚼
+佼佼
+纳西
+官司
+Alex
+大麦
+高点
+席卷
+白术
+盲人
+好听
+伊人
+豪杰
+流过
+锅盖
+谥号
+何人
+生津
+教官
+心扉
+花心
+天机
+北山
+3100
+皮层
+密云
+等高
+巡查
+辙
+线缆
+厌倦
+蓝莓
+油茶
+祥和
+伊始
+仿古
+可喜
+钴
+写到
+沉稳
+重用
+护栏
+廓
+16GB
+心室
+旧城
+白龙
+⑷
+后遗
+忠义
+晋中
+蛋清
+迦
+眉山
+琼脂
+东升
+适龄
+风土
+正大
+横扫
+载人
+基站
+中俄
+弹头
+埃斯
+漯河
+耗费
+天师
+天窗
+特攻
+리
+法理
+松柏
+萃
+实干
+毕节
+衷心
+西江
+赶来
+酷似
+搞定
+东吴
+茸
+迥异
+母乳
+选民
+预应力
+讯息
+雅安
+越秀
+新式
+得力
+婷婷
+左翼
+契合
+溢出
+误入
+mol
+驭
+白痴
+科文
+异构
+宵
+非上市
+抵挡
+主创
+Control
+级数
+鱼片
+施教
+下手
+试探
+远期
+元首
+辉映
+667
+越剧
+铸件
+葬于
+绝伦
+东大
+碑林
+503
+复审
+比武
+弗吉尼亚
+忧虑
+瞬
+腐病
+CO2
+青霉素
+搅匀
+着装
+西餐
+B2C
+谦虚
+明媚
+春雨
+优待
+小生
+脑膜
+客机
+德川
+PART
+五千年
+瞎
+向右
+臀部
+评奖
+落叶松
+一旁
+爻
+庭长
+略带
+咆哮
+轿
+亲眼
+巢湖
+瞪
+谜题
+悬架
+风月
+行文
+离线
+GIF
+出海
+年糕
+Park
+基调
+内燃机
+VC
+ㄟ
+野猪
+当即
+八百
+隔壁
+酷炫
+自首
+凸起
+诡计
+连结
+痢疾
+翻天
+通史
+戌
+拭
+ä
+别样
+指肠
+新风
+赤道
+绿叶
+SEO
+龙马
+IV
+02004
+尖叫
+Night
+自知
+敌对
+注解
+app
+亚属
+止境
+them
+上交
+城外
+逆境
+成片
+南华
+家纺
+祖宗
+BL
+授信
+光合
+у
+苇
+池州
+害羞
+橱柜
+说好
+自小
+致辞
+天明
+超额
+正向
+库房
+谣
+致谢
+葬礼
+327
+器乐
+农耕
+群发
+灌区
+延庆
+硕果
+残缺
+战地
+孜孜
+鲜奶
+年幼
+日产
+神剑
+衰减
+探头
+祈福
+僳
+天蝎
+工夫
+零六
+353
+四世
+工行
+衬托
+劳伦斯
+哈特
+东江
+老挝
+考究
+Red
+阿森纳
+仁宗
+民盟
+海港
+庄重
+除名
+衣着
+调处
+车门
+碎石
+神情
+午后
+增效
+碧水
+派驻
+上火
+猪蹄
+1860
+禁毒
+颁
+陶艺
+MBC
+甜点
+成荫
+尽心
+江陵
+岩浆
+称帝
+图册
+转战
+报省
+肩部
+马列
+分水
+紫菜
+大洲
+ever
+劝说
+大鹏
+毛孔
+落日
+烙
+电化
+PCB
+作假
+持卡
+圣殿
+物件
+Chem
+山脚
+受灾
+黄酮
+一寸
+女仆
+衣食
+增补
+陶冶
+天帝
+教父
+ind
+Frank
+仓位
+意念
+培植
+复式
+专修
+繁琐
+Green
+总状
+胰
+先民
+存量
+现世
+莓
+奔放
+互感
+诗学
+热泵
+炽
+机架
+古都
+停放
+孽
+亲朋
+建工
+天元
+满载
+看点
+4800
+拜仁
+结肠
+瞳孔
+头皮
+总编
+豪情
+肉末
+天宇
+云峰
+微弱
+万古
+破灭
+工种
+拉拉
+秋水
+妩媚
+坠入
+相守
+蕃
+混血
+成千
+宝座
+报导
+心怀
+紧贴
+大赏
+悉心
+GeForce
+思明
+洞内
+放眼
+枕头
+领会
+电教
+开往
+溺
+村寨
+风雪
+TF
+此诗
+虽说
+壮阔
+王母
+国泰
+神功
+预示
+东明
+聚丙烯
+空空
+永昌
+秩
+050
+磨练
+中职
+胶粘
+铁板
+留恋
+狱中
+永和
+局级
+夫君
+噪
+新车
+安监
+甘愿
+欢快
+艺文
+晁
+推移
+SMS
+祛
+透光
+斑斓
+爱吃
+安德烈
+歌坛
+清廷
+柿子
+煨
+鲜嫩
+秤
+巨资
+可燃
+1896
+裤子
+ ̄
+北约
+路途
+镇海
+雉
+栏杆
+ㄩ
+回馈
+结交
+云海
+词组
+Hey
+襄樊
+试飞
+收容
+攻防
+鄂州
+老街
+舞动
+停业
+豪放
+同业
+嫦娥
+如皋
+乐乐
+卡洛斯
+舌头
+鞠
+363
+页岩
+大禹
+看望
+拔尖
+农资
+前传
+仙桃
+大调
+GIS
+吵
+黑岩
+令第
+dll
+湿疹
+衬衫
+翱翔
+致远
+集美
+特警
+痛风
+西村
+已然
+血族
+党群
+反攻
+领航
+被盗
+防锈
+拼接
+轻轨
+喜怒
+属下
+寡妇
+愚蠢
+兀
+KBS
+队列
+土著
+黑山
+委任
+购车
+营地
+进货
+用尽
+唤起
+血缘
+抱歉
+塞维利亚
+宁乡
+天籁
+多级
+虹口
+盆腔
+充血
+胁迫
+听从
+百分比
+漆黑
+美金
+鱼儿
+扎根
+DS
+分蘖
+三围
+东岸
+孤寂
+地黄
+假定
+环抱
+心房
+登高
+地坪
+错综
+京沪
+生息
+现时
+丕
+搏斗
+丢弃
+教义
+春华
+损益
+Card
+影评
+涂改
+龙湖
+过长
+钗
+陶醉
+六朝
+字型
+暑
+后门
+山间
+face
+冕
+快点
+490
+小弟
+内障
+美甲
+崂山
+空洞
+东周
+助剂
+捷径
+莫斯
+开平
+独孤
+资治
+京华
+粘性
+蝎
+托福
+说说
+1g
+浮游
+伸手
+成真
+秘方
+炼丹
+那不勒斯
+9mm
+航程
+隔绝
+侍卫
+山本
+侬
+神像
+XML
+永平
+焊机
+救生
+成对
+日韩
+矮人
+酒家
+通透
+怒火
+套管
+宋词
+强弱
+留存
+赐予
+武威
+桐乡
+二醇
+升值
+阿姆斯特丹
+墅
+军旅
+腹面
+小兵
+内蒙
+水滴
+首脑
+漂移
+尿液
+灾情
+可比
+368
+腺炎
+水口
+犀
+吉姆
+古玩
+大佛
+辅佐
+side
+粒度
+RMVB
+SPA
+对准
+诺亚
+辍
+菱形
+战事
+爱岗
+质子
+聪慧
+华阳
+股本
+还本
+新歌
+会谈
+咀嚼
+统统
+CAM
+牧草
+眼界
+8g
+10g
+骗子
+龙井
+书社
+鞍
+湖面
+箐
+追肥
+人教
+平息
+阳极
+Take
+友善
+上阵
+MPa
+压机
+忏悔
+负重
+比值
+叁
+3800
+1899
+挚爱
+运费
+Martin
+Star
+大姐
+孤岛
+九大
+满目
+二句
+4mm
+家喻
+裸露
+南大
+迈克
+展销
+接通
+融会
+老城
+A4
+危房
+球果
+声声
+抑或
+架子
+两倍
+谐波
+心弦
+丹尼
+疫
+肉桂
+务必
+专攻
+闪点
+分出
+按住
+明朗
+35mm
+夏令
+街机
+3cm
+佳绩
+德文
+心悸
+re
+撒旦
+心地
+hold
+近于
+曲江
+口水
+鱼纲
+充气
+膈
+Top
+瘟
+谈谈
+过剩
+隐性
+倾诉
+皇城
+开天
+舵
+联机
+信噪
+尽显
+疑似
+十种
+晨曦
+遂宁
+朝天
+安平
+凯旋
+示威
+俯
+跳出
+毛病
+降至
+认购
+齐名
+361
+爱到
+Radeon
+衙门
+余款
+水乡
+依附
+绝无
+max
+头目
+见长
+人鱼
+香格里拉
+争鸣
+425
+货源
+西双版纳
+音节
+差错
+山城
+香精
+巴尔
+师专
+戛纳
+ポ
+幂
+铁塔
+古筝
+B2B
+总务
+请来
+光年
+柒
+穿戴
+揭晓
+插拔
+片头
+哈萨克
+指明
+万圣
+手提
+写信
+百米
+劣
+分社
+嗅觉
+丝丝
+臀
+斩杀
+切忌
+斥
+嘉年华
+失衡
+沉思
+摩洛哥
+钊
+HM
+谐音
+忘掉
+隐隐
+皇甫
+缅
+茴香
+画派
+片剂
+带电
+小店
+安迪
+篆书
+轻盈
+高远
+漏斗
+怪谈
+杨浦
+苗圃
+尔雅
+押金
+舆情
+32GB
+官话
+POP
+网路
+馆员
+伽
+冥界
+齐心
+Let
+大汉
+幽静
+活佛
+NSK
+榜首
+刀片
+台中
+论丛
+管件
+松散
+烹制
+振奋
+红灯
+飞车
+扶手
+湖滨
+本源
+渡河
+漳
+风尘
+绚烂
+璞
+医护
+白俄罗斯
+雏形
+肿痛
+来得
+志向
+570
+甚多
+4A
+盘点
+迹象
+炎帝
+母校
+边框
+瘟疫
+四核
+馆内
+机油
+实则
+草图
+移位
+沉醉
+家谱
+夜景
+结膜
+茂盛
+毁于
+涪陵
+哒
+环状
+痴迷
+褶
+匾额
+苏北
+冲出
+松树
+罂粟
+毛笔
+展翅
+孔隙
+people
+娄底
+硝烟
+实名
+高耸
+南接
+炉灶
+帕特里克
+新高
+经脉
+隐匿
+四句
+写手
+018
+出炉
+JPG
+鼻炎
+淬
+财团
+酒泉
+举足
+温热
+6500
+赢利
+大势
+小桥
+锋利
+退庭
+一脚
+天启
+413
+情愫
+配色
+情意
+ㄉ
+大雪
+FX
+斗志
+善意
+演戏
+闭路
+3300
+斑纹
+藓
+反派
+02018
+晋书
+怀柔
+北辰
+ギ
+致敬
+护送
+低级
+倭
+落幕
+City
+赛尔
+方舟
+情敌
+本位
+汇合
+眷恋
+世贸
+气派
+佣金
+马修
+休养
+邬
+神探
+疱疹
+势必
+枢
+绳子
+海珠
+豪宅
+松山
+丑陋
+蜻蜓
+ppm
+七天
+鸟儿
+洞中
+自编
+810
+外号
+直角
+下沉
+丽华
+回流
+听得
+商议
+导轨
+FDD
+猜想
+年老
+擎
+混混
+贴纸
+商代
+已故
+擢
+ザ
+351
+取暖
+女声
+做饭
+亲笔
+白山
+凤山
+王道
+壮志
+二百四十二
+空降
+tome
+设防
+百色
+VI
+版式
+昭通
+转而
+代词
+7mm
+来人
+回放
+学艺
+这般
+关头
+痹
+送来
+相片
+ZL
+狗血
+使出
+高傲
+塔防
+七七
+镜面
+数列
+耶鲁
+馅料
+垂体
+↘
+前程
+433
+该死
+发财
+短文
+阳春
+美满
+上台
+雷诺
+经查
+Whe
+甲烷
+报到
+丹青
+倾销
+图鉴
+卖方
+人意
+懵懂
+万丈
+保监
+金发
+马戏
+二七
+轰轰
+洛夫
+随手
+咒语
+杏花
+変
+瑕疵
+圆筒
+豆类
+巴拿马
+File
+零星
+消逝
+汤圆
+焗
+小宝
+导购
+千方
+使馆
+脸色
+朱家
+半两
+赛马
+千粒重
+召回
+爱戴
+巴里
+团圆
+擂台
+五子
+检举
+觅食
+大公
+下场
+情谊
+立业
+榴弹
+单打
+私自
+温哥华
+天心
+颅
+历尽
+铭记
+殴打
+25000
+里脊
+明史
+呜
+国力
+前述
+翻新
+ㄏ
+岔
+1890
+丙烯酸
+半场
+荒诞
+足协
+穆罕默德
+画笔
+八戒
+江城
+乌拉
+337
+大开
+西雅图
+大德
+澎湃
+堂堂
+Fire
+甲子
+奇侠
+三五
+罢工
+Net
+酸辣
+起身
+沿河
+接替
+Man
+建站
+难关
+吃苦
+枫叶
+拆装
+341
+344
+汾
+长成
+走遍
+土质
+别说
+建中
+纷飞
+禾本
+相会
+变相
+期中
+窍门
+挟
+年金
+自建
+棺材
+单证
+万众
+刚果
+咖
+独占
+参会
+荣立
+傲娇
+增量
+399
+承兑
+青石
+古堡
+985
+白羊
+嘿嘿
+出事
+cd
+ii
+照耀
+月色
+起兵
+新意
+夜叉
+炒股
+顽
+18000
+直面
+名号
+丙酮
+窖
+案发
+酌
+坝区
+声响
+商住
+901
+⑧
+根治
+3m
+帛
+强健
+古琴
+为难
+长毛
+啃
+驻扎
+LOVE
+5mg
+骥
+复用
+铁岭
+乘法
+张贴
+捆
+书后
+驼
+荒唐
+私塾
+初等
+柔弱
+阻挠
+沉香
+盼望
+作弊
+疲倦
+西街
+斯里兰卡
+冲天
+八宝
+但愿
+独唱
+伯格
+Access
+太保
+新桥
+流落
+开篇
+三色
+总兵
+抗美
+Test
+全城
+线状
+流变
+艺名
+左脚
+逆变
+喜马拉雅
+扫荡
+386
+三郎
+六角
+伪劣
+倘若
+CRM
+布景
+860
+早春
+鳞翅
+火药
+富人
+淡雅
+女足
+黄陂
+康乐
+储运
+七次
+高炉
+趟
+红壤
+SO
+涵义
+瓶子
+大通
+烦躁
+329
+流光
+斯坦福
+闺蜜
+运通
+停机
+老者
+雄心
+作人
+金钟
+动地
+初二
+文博
+亲吻
+佛祖
+〜
+孙女
+黄岩
+刀剑
+齐备
+房管
+请勿
+019
+鄞州
+溪水
+手稿
+酌定
+雷神
+大别
+智库
+蛹
+独角
+apos
+一百多
+775
+侠客
+食人
+丹尼斯
+双脚
+二百一十三
+亮化
+布料
+涮
+省略
+腺体
+剑士
+诀窍
+折价
+Poly
+经委
+浚
+娴熟
+board
+单反
+皮疹
+传略
+多路
+福清
+靡
+龙口
+迪纳摩
+一叶
+老妈
+接班
+各科
+此法
+423
+HR
+顺便
+蠕动
+伍德
+转眼
+入股
+释迦牟尼
+引诱
+年画
+ㄚ
+投注
+会稽
+绩
+精于
+青涩
+哥伦布
+sp
+只身
+绿水
+烧制
+给定
+高热
+燕京
+进水
+日生
+往复
+禁区
+手心
+上虞
+剧毒
+大爷
+胆汁
+目击
+嫣
+摒弃
+过载
+连贯
+小车
+对内
+轮船
+逼迫
+没落
+383
+备有
+置信
+亢
+万亿
+瑞金
+茶馆
+Information
+拣
+两根
+勤工
+双流
+各处
+西华
+酒厂
+may
+谴责
+单车
+紧缺
+水彩
+色色
+刘家
+威武
+皮鞋
+飞越
+龙城
+狡猾
+海藻
+擦拭
+孟加拉
+水洗
+大怒
+双打
+AM
+节气
+淘金
+猩猩
+义军
+发祥
+web
+流离
+文风
+决胜
+鸥
+小明
+民工
+好心
+克拉克
+373
+980
+拉面
+订婚
+喷水
+文娱
+Auto
+三清
+柿
+星海
+口气
+脍炙
+画质
+ヒ
+356
+闯荡
+病史
+洪山
+来生
+卓有
+空隙
+辩解
+紧靠
+伪军
+笨蛋
+385
+连成
+打响
+雪峰
+荞麦
+晚间
+明晰
+0iOS
+灌装
+裁剪
+晶晶
+计较
+残存
+GDDR
+地瓜
+点拨
+自创
+建行
+谋取
+量表
+履约
+爱奇
+街舞
+罚球
+热源
+矗立
+登台
+Song
+马尾
+眼睑
+婚恋
+农庄
+吴越
+地雷
+Charles
+挣
+动乱
+炮制
+忍耐
+登基
+PM
+兖州
+雌花
+折断
+白白
+은
+设于
+影剧
+被俘
+答疑
+重奏
+一色
+交会
+狼狈
+1897
+村长
+照料
+头颅
+一鸣
+动能
+紫荆
+三一
+348
+重创
+攒
+贫苦
+门第
+星河
+鹭
+fire
+拾遗
+栓
+患处
+金桥
+欣然
+ヤ
+两千
+陈旧
+图集
+宣读
+龙腾
+周密
+捻
+白鹤
+连杆
+竺
+全权
+声波
+应答
+争相
+镇守
+溶血
+ways
+慰
+很远
+久久
+史籍
+小红
+生计
+羽翼
+黑马
+耳鸣
+萤火
+诗选
+钟声
+佰
+五指
+积聚
+叛徒
+IDE
+劣势
+绝地
+Mobile
+受尽
+同方
+358
+瓜果
+轻便
+横贯
+说谎
+腹胀
+汉子
+石壁
+山势
+巫山
+ɡ
+超群
+跌打
+405
+九九
+螺钉
+公职
+表单
+佛经
+孜
+手脚
+联袂
+联接
+搁
+tt
+八五
+339
+鹊
+关门
+面世
+better
+wood
+衣裳
+沿袭
+塔楼
+凉爽
+放纵
+昧
+节拍
+本刊
+急忙
+永宁
+君臣
+甲鱼
+虾米
+冥冥
+援朝
+青鸟
+玛瑙
+1mm
+中餐
+淇
+原址
+字段
+腿部
+惆怅
+在岗
+分设
+术士
+Section
+胶体
+词义
+读完
+Window
+美工
+揭牌
+饭后
+boss
+based
+地盘
+时人
+栽种
+凯旋门
+飞向
+参天
+频谱
+TXT
+飘渺
+约克
+Yang
+牢记
+天庭
+衡山
+SM
+汽轮
+彩电
+匕首
+首期
+卧底
+号令
+福克斯
+失意
+黄沙
+长圆
+伟人
+新会
+投掷
+缺水
+鬼怪
+谐振
+布鲁塞尔
+一年生
+结界
+挺进
+4GHz
+淮阴
+阿姨
+栈
+水仙
+水草
+税金
+ensis
+反潜
+埃尔
+鉴证
+菜鸟
+琳娜
+唯独
+尘封
+吐司
+出乎
+生出
+也想
+罗杰
+傲慢
+灵力
+单相
+可汗
+栓塞
+沉迷
+故城
+常青
+精锐
+大红
+大本营
+药水
+守则
+恒大
+式样
+上车
+终将
+欣喜
+SAS
+加纳
+神力
+肛肠
+X1
+墨子
+诺维奇
+神父
+残暴
+来客
+沉没
+特地
+奖杯
+前奏
+齐聚
+国荣
+舍得
+入锅
+跳水
+馄饨
+长辈
+琅
+蓝田
+趣事
+UL
+新都
+杀机
+睡着
+820
+驾校
+柴胡
+自认
+过早
+France
+1394
+交集
+吉尔
+山南
+标语
+真切
+梨园
+再遇
+固然
+盘子
+见闻
+安放
+百岁
+忒
+邸
+藻类
+三河
+批次
+国发
+痛快
+随州
+™
+门面
+后缘
+偏向
+普罗
+悔罪
+屏风
+┃
+名扬
+超长
+搭乘
+大刀
+薰衣草
+政教
+体格
+好似
+理疗
+纯爱
+商定
+面色
+科大
+040
+侵华
+母本
+阻击
+迷糊
+侄
+传言
+简报
+艰巨
+流放
+理会
+祭司
+田地
+探明
+首度
+主楼
+低频
+华裔
+胜过
+积蓄
+凡事
+作案
+德利
+常驻
+优劣
+襟
+UPS
+宗族
+嘉禾
+巴林
+保费
+立地
+拧
+浮出
+鼓手
+ㄅ
+到场
+预设
+姨
+讲坛
+渍
+速查
+来世
+起落
+摩崖
+校名
+022
+烃
+Fi
+行医
+纯水
+教法
+自控
+伊利
+通直
+带球
+鲜香
+就该
+著录
+似水
+口译
+下第
+柏拉图
+夔
+02003
+草场
+失利
+603
+短线
+夹具
+江夏
+勤学
+银子
+白杨
+大荒
+握手
+冬日
+自筹
+建明
+抗力
+祥云
+蔽
+三楼
+要好
+心机
+领队
+碧波
+陈某
+塔斯
+尖头
+偏远
+Big
+文成
+汉江
+笑笑
+尤文图斯
+山人
+红衣
+宣统
+Gold
+假说
+一十
+UV
+商旅
+投射
+镖
+概算
+洽
+书签
+14001
+爱意
+巧遇
+mind
+1850
+奉化
+得当
+蒙特利尔
+贝多芬
+牛头
+榨
+024
+留意
+取缔
+雷恩
+念念
+上次
+寺内
+栖霞
+榆树
+天亮
+昱
+探析
+震荡
+溅
+576
+稽
+柚
+50g
+胎盘
+InChI
+瑰宝
+枇杷
+万州
+内壁
+钱包
+稀缺
+游侠
+据传
+圣彼得堡
+凤翔
+going
+预料
+远眺
+长白
+little
+小腿
+葵花
+宁海
+中正
+好在
+米酒
+五世
+摇头
+肖恩
+防病
+赞成
+LD
+4200
+梅斯
+彩云
+铣床
+素食
+幼教
+炒作
+划出
+自拍
+春花
+变电
+苍蝇
+勾结
+渊博
+出访
+矶
+反省
+扼要
+呼伦贝尔
+下巴
+选矿
+理应
+驻华
+各村
+着想
+411
+乳液
+荒芜
+安德森
+基辅
+校草
+琉球
+当红
+CH
+制裁
+耕种
+幅面
+SR
+草甸
+苦心
+梦醒
+从军
+举重
+并进
+原点
+昌盛
+越狱
+滑板
+被毁
+芜
+河内
+省电
+英伦
+风寒
+周公
+畦
+判官
+442
+913
+张掖
+MIDI
+此举
+夺走
+主妇
+幡
+溯
+杂货
+劳资
+原属
+未尽
+stop
+安保
+海波
+河岸
+华严
+叶鞘
+MATLAB
+纽曼
+磺
+里斯本
+劲松
+肋骨
+威廉姆斯
+新郑
+增幅
+乌斯
+三门峡
+出游
+计费
+汇演
+915
+1894
+Steve
+告诫
+抓捕
+布什
+馈赠
+始创
+耐压
+富强
+自述
+好久
+甲午
+大病
+West
+起家
+外力
+1A
+温润
+名篇
+梦魇
+主刑
+凹凸
+炼成
+获悉
+冲泡
+哈密
+空战
+尽收
+着迷
+新诗
+й
+竞价
+黄芩
+整修
+房费
+平度
+梦回
+楷模
+乡里
+译成
+而战
+万国
+后继
+ô
+西装
+宋江
+602
+绿灯
+海峰
+日内瓦
+保山
+屡次
+零四
+香槟
+参股
+马赫
+芳草
+Open
+凯撒
+升入
+触犯
+骚
+看重
+阿胶
+晨光
+操盘
+琛
+母鸡
+车次
+辞赋
+叱咤
+通灵
+海城
+元老
+正比
+独秀
+菌种
+换代
+击落
+自拔
+甘蓝
+末班
+023
+舒展
+都灵
+亡国
+刻本
+渡江
+Festival
+阿尔及利亚
+着急
+30g
+迁往
+山田
+导数
+攻入
+抱负
+主业
+小马
+清汤
+罗宾
+性爱
+憔悴
+pos
+舍人
+畈
+药效
+科尔
+406
+平山
+555
+华商
+搏击
+夏普
+海神
+黄浦
+mann
+尼尔
+
+K线
+转生
+1880
+一十三
+沙场
+身临
+文庙
+50mm
+入睡
+租界
+鸟语
+腮
+红白
+排练
+卷轴
+桌椅
+清宫
+巴厘
+惩戒
+凉州
+枉
+山麓
+米尔
+简陋
+永续
+捉弄
+Oracle
+红光
+618
+ground
+小段
+肯尼迪
+虫草
+世宗
+四五
+六盘水
+钱财
+细雨
+保有
+亲历
+缅怀
+这边
+次级
+展演
+近战
+中新
+格列
+红莲
+大盗
+射箭
+防滑
+血染
+心病
+出错
+割据
+劫难
+圣灵
+ゲ
+养鸡
+肯德基
+浅水
+UK
+五更
+海浪
+海盐
+成全
+楚辞
+后顾
+物力
+音译
+特困
+布衣
+AF
+线型
+盘锦
+qi
+果子
+小虎
+园丁
+沙子
+停滞
+大修
+冷冷
+2019
+模组
+探秘
+蚊子
+TS
+凸轮
+万世
+强身
+转播
+熔化
+取舍
+就任
+停电
+疙瘩
+谣言
+胆碱
+女方
+善用
+鲁国
+圈套
+润肺
+石景
+椿
+纯化
+投向
+mA
+LOGO
+曾用名
+藏传
+柜台
+喷嘴
+推举
+嘻哈
+景致
+线虫
+武魂
+黛玉
+剑术
+褒
+能动
+神医
+于心
+尊称
+痊愈
+怪石
+仁和
+樵
+述评
+라
+既定
+蕉
+嘉陵
+国贸
+2MB
+减低
+简练
+党参
+母体
+电讯
+日久
+建树
+天柱
+着生
+公牛
+骨灰
+遥遥
+蛟
+精工
+亲和力
+杯子
+BBC
+外延
+事中
+Daniel
+零五
+入场
+扬子
+乘员
+融洽
+超标
+卧式
+外滩
+354
+四合院
+集训
+OS
+好汉
+又说
+每户
+玄机
+石斛
+惨案
+绝境
+套利
+秧歌
+United
+崇文
+山海经
+秋月
+全职
+忘返
+364
+老舍
+致密
+新昌
+千伏
+开垦
+打散
+炫耀
+下滑
+调戏
+346
+已婚
+内政
+中继
+连拍
+土工
+破案
+百官
+灌水
+妈祖
+恕
+可言
+逛街
+文公
+嫂
+呗
+开明
+如火
+忧愁
+胃癌
+山泉
+唐僧
+厌氧
+韩文
+重启
+换位
+爱玲
+狂野
+稳妥
+硝
+纸箱
+巷道
+NTSC
+Sub
+西施
+run
+圆弧
+find
+入园
+蠕虫
+Will
+蚊
+绝美
+条理
+宙斯
+肥牛
+商周
+Real
+高飞
+十三五
+枭雄
+海信
+菁
+伏击
+Smith
+交点
+黑河
+Systems
+土豪
+跟进
+稳态
+打断
+宣誓
+伤人
+만
+看透
+366
+芍药
+尚存
+借阅
+颚
+佚名
+GAME
+智斗
+就象
+苛刻
+桂圆
+黯然
+烘
+褪色
+筝
+7500
+小妹
+金莲
+ISSN
+校车
+预制
+起因
+娅
+激进
+纵深
+毒蛇
+大冶
+密林
+血肉
+의
+巴中
+未定
+around
+如期
+积雪
+神态
+S5
+美梦
+魅惑
+给排
+欢呼
+天敌
+家乐福
+Jones
+苏黎世
+盛唐
+剑法
+春色
+图说
+女真
+金银花
+桉
+nothing
+393
+盛誉
+通称
+白蚁
+听话
+益寿
+艾斯
+择优
+西昌
+玩弄
+八部
+魔芋
+志伟
+金门
+崭
+天麻
+六一
+style
+聚酯
+治水
+外挂
+嵩山
+禧
+杰森
+035
+海门
+狂人
+密钥
+中欧
+WMV
+胸口
+青城
+盐务
+狸
+头疼
+阜新
+轮机
+名菜
+em
+空地
+大权
+碑刻
+太上
+初生
+舰船
+攀升
+Station
+开张
+青楼
+敬畏
+寿光
+妞
+专款
+相映
+523
+肉眼
+尽早
+画室
+忻州
+多才
+天赐
+RB
+四百
+滨河
+reg
+译作
+噶尔
+闲暇
+风潮
+513
+经管
+击破
+接踵
+AI
+怀特
+遐迩
+缝纫
+克里斯托弗
+蒜头
+龙卷
+颇多
+大肆
+轻薄
+年展
+盟主
+联运
+保单
+452
+咋
+imdb
+ヴ
+仆人
+洁具
+战列
+瞒
+叩
+苏打
+应收
+388
+EDGE
+链球菌
+布里
+701
+转接
+玩乐
+内河
+霍华德
+415
+瓢
+So
+白人
+兵役
+语系
+清秀
+汉城
+收条
+道格拉斯
+601
+舰载
+十次
+娓娓
+混杂
+切勿
+缭绕
+忠心
+茫然
+한
+MAX
+蜥蜴
+Some
+海西
+爸妈
+闭锁
+剥削
+悦目
+Himself
+火种
+民进
+捺
+课余
+进京
+糙
+单层
+Brown
+楞
+嵊州
+太尉
+空闲
+Games
+新泰
+大关
+陕甘
+节制
+ㄑ
+往后
+古尔
+P2P
+俯瞰
+手艺
+难民
+广度
+安东
+动量
+顶层
+渐变
+VPN
+推算
+等价
+半衰期
+广德
+配饰
+保住
+存有
+冗余
+政绩
+就学
+摩纳哥
+可溶性
+believe
+列席
+普宁
+DIMM
+糖浆
+狼人
+纸盒
+治法
+实测
+从没
+雨天
+Institute
+then
+TX
+大号
+斯洛文尼亚
+赚取
+秀英
+RW
+眺望
+鳌
+Model
+胶东
+南麓
+萧条
+甲壳
+山腰
+谋生
+群臣
+哥本哈根
+居所
+ㄍ
+虫子
+丝状
+浴血
+地久
+跑酷
+识读
+蟒
+跌落
+年满
+脸庞
+门禁
+朱砂
+看中
+寒冰
+豆豆
+俊美
+文殊
+单株
+额头
+索要
+兴华
+诣
+白花
+复数
+即刻
+宪政
+孩童
+仕途
+浮华
+辛酸
+疼爱
+私房
+同上
+出面
+隋朝
+挫
+来华
+可做
+山丘
+异彩
+百川
+悠扬
+结盟
+迟迟
+荒凉
+油松
+net
+屁
+熄灭
+远在
+冷轧
+need
+光洁
+近海
+酱汁
+群组
+老君
+水城
+直言
+渗入
+母语
+隐喻
+施放
+庾
+2GHz
+饱受
+放飞
+石山
+公鸡
+di
+上进
+肥水
+私密
+万寿
+痴心
+雷斯
+恶人
+潮汐
+背诵
+磕
+立柱
+网关
+电平
+幅画
+mail
+叶脉
+同龄
+内力
+报经
+垩
+RH
+垛
+荷塘
+西米
+乌云
+骨肉
+旬
+青衣
+琢磨
+漕
+辍学
+猪脚
+植保
+炼油
+圣贤
+Heart
+770
+盖子
+病程
+旺火
+特委
+之举
+Pinus
+DV
+保暖
+华联
+海东
+白宫
+line
+Party
+场站
+痢
+保户
+软管
+进场
+体表
+上端
+欧姆
+圆润
+기
+实操
+四轮
+枭
+哑
+锡山
+卡利
+贝贝
+MR
+汉英
+11000
+牵连
+图尔
+房款
+斗牛
+圆圆
+滩涂
+boy
+尖椒
+双亲
+团聚
+演播
+OEM
+大鱼
+云涌
+一十二
+粗大
+回想
+自得
+做主
+建功
+玛格丽特
+五六
+幸好
+雅克
+上浮
+黄蜂
+引子
+many
+鳄
+良知
+特价
+Sky
+天佑
+First
+美人鱼
+youre
+羽片
+文法
+摇曳
+锥形
+耸立
+糖类
+喉科
+盅
+404
+大亨
+惊恐
+油腻
+神社
+酒水
+三山
+倍受
+369
+信条
+经学
+底板
+莱西
+百般
+刻度
+¥
+心率
+平凉
+夹杂
+旁观
+超导
+繁花
+幻化
+传神
+鼻孔
+畲族
+625
+1H
+金盾
+bug
+LAN
+仙游
+悼
+流明
+化瘀
+玩偶
+赞同
+催收
+劫持
+干道
+七夕
+快车
+偏低
+冰块
+拜托
+朝向
+光禄
+红包
+茶艺
+菲尔德
+天仙
+轴距
+易爆
+精要
+农技
+输液
+打折
+宝地
+篱
+移至
+受限
+字义
+云台
+桐庐
+主产
+电泳
+叶轮
+正殿
+哀伤
+郁金
+耙
+中行
+看图
+飘落
+乙酯
+Thomas
+讷
+σ
+黄牛
+常山
+匍匐
+仁爱
+北周
+武德
+玫
+B2
+萌萌
+铜仁
+眼角
+双倍
+冷淡
+风车
+蔗糖
+清江
+错位
+扫除
+398
+未尝
+法文
+高阳
+在编
+陈设
+尾鳍
+抢占
+MD
+前海
+基肥
+皇族
+佘
+灵宝
+高涨
+木刻
+TVB
+何在
+打法
+康德
+经文
+方差
+护甲
+啰
+酪
+传销
+练功
+483
+旅长
+喷发
+弱小
+繁星
+容忍
+奋勇
+梅尔
+看她
+油泵
+甜味
+沙门
+韬
+汉姆
+弹力
+盘旋
+旖旎
+存入
+金玉
+祝愿
+了此
+摇摇
+Text
+忠告
+径向
+入海
+浴场
+补丁
+二子
+皮埃尔
+禀
+®
+颤
+收纳
+奖品
+应从
+水发
+080
+357
+惨重
+釜山
+处死
+焕然
+盐湖
+童谣
+王公
+花蕾
+庇护
+柳树
+省属
+粗犷
+劳力
+睦
+仟
+极小
+中法
+多款
+体态
+芭蕉
+尼古拉
+From
+一语
+408
+浮躁
+单方
+发泄
+翊
+干掉
+行宫
+地籍
+情爱
+山高
+空港
+张飞
+疾患
+外阴
+迈阿密
+劳拉
+送别
+重温
+腐朽
+页码
+启东
+波涛
+笑颜
+露头
+刺猬
+BUG
+American
+季后赛
+14000
+玉泉
+勾芡
+上品
+定于
+汝州
+兰陵
+肉身
+443
+一十一
+魅影
+螺母
+黑豆
+打扫
+甙
+次韵
+杂症
+郢
+论点
+临川
+唔
+镌刻
+best
+02002
+453
+月季
+Ray
+原野
+倾注
+MacOS
+Audio
+昙
+field
+计分
+争先
+泥浆
+挡住
+参差
+趁机
+出嫁
+︰
+偶有
+野人
+清查
+人手
+殆尽
+勇猛
+枷锁
+厨具
+西行
+国共
+阎王
+人偶
+落水
+轻快
+起名
+终审
+睡前
+主修
+物化
+Jr
+士官
+广陵
+台南
+踢球
+倍增
+泰兴
+三十多
+膨
+云天
+凯尔
+竭尽
+打着
+身价
+抵触
+Ni
+轻柔
+三甲
+失明
+炼制
+皮影
+大城
+玛丽亚
+AND
+如云
+私下
+可否
+未予
+尼古拉斯
+郡王
+晾干
+磁力
+拱桥
+中指
+冷热
+马里
+霓虹
+刚度
+年头
+孔径
+阵法
+A1
+加法
+输血
+堀
+市属
+迷惘
+试炼
+牛角
+反义词
+白粉
+预知
+吃惊
+放疗
+王维
+打斗
+行间
+轮椅
+五层
+347
+462
+WinXp
+今朝
+申购
+技改
+毫安
+凋零
+谛
+农艺
+卖点
+才女
+应激
+怀中
+硅谷
+缓刑
+电弧
+清清
+采石
+楽
+资信
+渗出
+佬
+纱布
+甬
+指环
+berg
+难于
+重整
+MMS
+泰和
+偏僻
+准时
+叶酸
+轻巧
+滑冰
+融汇
+供气
+再审
+军师
+开战
+作证
+深藏
+村上
+奇峰
+门楼
+食指
+TC
+标牌
+多半
+阁楼
+龛
+存取
+吃醋
+滕州
+战记
+快快
+水景
+为有
+极乐
+攻破
+坨
+山里
+兵马
+四国
+滞留
+削减
+莱斯
+Digital
+瀛
+敞
+因公
+河段
+爱乐
+吸尘
+争锋
+学长
+安达
+五保
+联考
+陡峭
+白鹭
+试制
+夜行
+正交
+太公
+TCL
+SA
+藤原
+如山
+主帅
+肥厚
+缇
+骨伤
+裙子
+绅
+恩格斯
+塞尔
+670
+讲求
+办刊
+褶皱
+恢
+洲际
+IC卡
+斯文
+进深
+恃
+石料
+手记
+巴比伦
+松动
+交涉
+建业
+寝室
+Water
+环山
+骏马
+DM
+羲
+1
+为邻
+Winxp
+伞形
+阳城
+眷属
+地价
+郑重
+天大
+边坡
+under
+缘由
+创制
+物美
+歌星
+蓝本
+round
+三叉
+吵架
+单色
+胶质
+航速
+力矩
+清史
+双侧
+体液
+大牌
+俗话
+邮递
+York
+辣酱
+虹膜
+德比
+中产
+开枪
+标定
+Pop
+狂暴
+神族
+软驱
+两江
+弹匣
+德行
+ㄔ
+账簿
+冰冰
+数日
+欣慰
+缓和
+冲积
+li
+鲜味
+楚楚
+词句
+湖中
+唱法
+x2
+送礼
+艾米
+米格
+哀乐
+秧
+某天
+青瓷
+Little
+掌心
+拉美
+家世
+眉毛
+杀灭
+戍
+T恤
+写明
+处室
+13000
+讼
+天书
+学人
+GS
+奥委会
+376
+Athlon
+烙印
+宫女
+片片
+金球
+先人
+friend
+关押
+会合
+涌动
+熔融
+含水
+凤阳
+鲜卑
+赫赫
+脏器
+USD
+肆意
+斯图加特
+上衣
+正经
+政区
+右翼
+中环
+断绝
+共进
+π
+扣留
+过期
+化脓
+权贵
+江海
+紫薇
+UI
+信箱
+新任
+大发
+中科
+南唐
+难受
+通辽
+考官
+State
+只求
+WEB
+1m
+层高
+诗话
+父本
+学友
+剔除
+NGC
+砷
+乔丹
+鼎力
+准备金
+精妙
+逐级
+蝎子
+蔚蓝
+冲刷
+转角
+巨变
+遮光
+斗罗
+爵位
+小楼
+1150
+吡啶
+帕克
+Long
+猎杀
+瓦房
+二审
+纯情
+新房
+史蒂芬
+Data
+北接
+驻守
+社联
+惊世
+远端
+麝香
+高昂
+交道
+粳米
+水压
+西河
+盾牌
+463
+偏爱
+永新
+胸腔
+腕表
+开化
+童趣
+DSP
+天魔
+暴君
+草丛
+methyl
+著书
+蜥
+克里斯蒂安
+饲
+505
+乳胶
+小道
+口述
+沙特阿拉伯
+山野
+天祥
+执事
+月异
+产前
+铠
+基座
+联展
+应征
+释文
+书体
+嗜好
+至极
+任县
+压倒
+一十五
+三线
+麻黄
+欧文
+余地
+土方
+2900
+独裁
+细嫩
+加藤
+数位
+卉
+395
+玉女
+锉
+朝夕
+焱
+晗
+辜负
+♂
+RGB
+厮杀
+咎
+枪弹
+朝政
+立陶宛
+保佑
+一听
+师团
+麻省
+受热
+大喜
+煎熬
+哈姆
+双拥
+夜市
+古寺
+击球
+土星
+德尔
+建委
+税种
+声母
+点子
+负伤
+镰
+Al
+勤俭
+磨灭
+451
+雪莲
+大选
+为重
+小节
+028
+痛经
+童心
+天灾
+邕
+840
+千千
+虎门
+482
+长尾
+病斑
+赶走
+2560
+冷库
+芷
+驱使
+年长
+方丈
+脑力
+外景
+为政
+花城
+浪花
+网评
+上身
+和好
+挥洒
+破格
+名画
+国文
+屠龙
+再有
+奔赴
+五洲
+春江
+海马
+渔港
+显出
+弯弯
+同班
+大安
+岩体
+利兹
+学好
+OL
+大戏
+投奔
+边远
+干嘛
+扶助
+大半
+混蛋
+联营
+叟
+捡到
+鬼魂
+保育
+排卵
+伊斯坦布尔
+非同
+漾
+日中
+过低
+深浅
+故此
+Pentium
+简短
+查出
+473
+九宫
+山姆
+祜
+满面
+此案
+芡
+独尊
+公私
+416
+乙型
+德才
+新潮
+化名
+天秤
+篮子
+晔
+祭奠
+国药
+仿制
+小船
+封口
+女星
+痞
+打拼
+已拨
+空袭
+时序
+一跃
+樟树
+舅舅
+左岸
+南郊
+归去
+复垦
+洞房
+道口
+苏木
+愤
+天竺
+无影
+达尔文
+楼兰
+聋
+章丘
+颓废
+旭东
+青田
+重量级
+球状
+格兰
+禁锢
+骑行
+要道
+写好
+双腿
+猎犬
+永春
+史家
+滋生
+梅林
+名曲
+裂变
+盘式
+奉天
+修士
+茶具
+图示
+努尔
+风向
+同轴
+巨蟹
+川芎
+Three
+相成
+琅琊
+雕琢
+全运
+鹤壁
+哈尔
+桂枝
+屡屡
+敬仰
+漓江
+时髦
+悲壮
+开局
+隽
+通体
+Development
+出力
+472
+多边
+赡养
+正品
+竹叶
+母爱
+纵贯
+歆
+ED
+急于
+疾风
+я
+腔内
+征得
+附子
+老兵
+楔子
+桂冠
+名镇
+长条
+911
+大受
+外周
+望月
+志士
+转盘
+while
+隐约
+计谋
+CR
+嬉戏
+隼
+满地
+高深
+脱俗
+氮肥
+抹茶
+枢密
+盐碱
+371
+奔向
+里夫
+射出
+编组
+濮
+教工
+IUCN
+战力
+出使
+山岭
+522
+废柴
+风骚
+诗情
+贵港
+行距
+皇位
+东征
+夜半
+复出
+切丁
+防晒
+435
+肛
+厄瓜多尔
+挑剔
+US
+二线
+镰刀
+猥琐
+柱子
+开立
+茶道
+酒杯
+壳体
+东港
+缕
+1889
+1350
+救亡
+内功
+横山
+Fox
+大潮
+交割
+为证
+四射
+多想
+迂回
+VAC
+放牧
+包修
+Sandy
+管护
+砼
+UP
+族长
+理气
+ホ
+松开
+足部
+压铸
+信长
+施救
+鸟纲
+晓峰
+两半
+MA
+青光眼
+惠安
+威严
+湖里
+因数
+宝钢
+往昔
+水木
+云烟
+殷墟
+后半
+绰
+击毙
+ペ
+瘢痕
+妓女
+回民
+林海
+成矿
+牡蛎
+面皮
+多长
+请教
+钾肥
+Pina
+求索
+泰坦
+掏
+Wu
+雷同
+590
+列国
+兴宁
+头号
+接任
+杨氏
+好意思
+声势
+嗟
+浅显
+吨/年
+惩治
+FTP
+幻境
+丝网
+过河
+考级
+全彩
+伤者
+阻尼
+山色
+一例
+明帝
+西郊
+俯视
+一十七
+次生
+赠与
+东台
+寒风
+清白
+木屋
+屋檐
+富丽
+鹏飞
+蛤蟆
+伤员
+固始
+吃喝
+惨烈
+编修
+杰夫
+韦伯
+飞升
+博文
+瑰丽
+小数
+校尉
+VR
+巴赫
+航站
+交联
+诺尔
+蕴涵
+驳
+合影
+明信
+鸢
+球星
+风沙
+夭
+条状
+十卷
+俗名
+吐鲁番
+果酱
+岩层
+电击
+逆流
+时与
+匿名
+克罗
+插口
+历法
+制版
+026
+以利
+清溪
+高密
+野战
+白塔
+东兴
+灵兽
+建文
+烂漫
+摩登
+Socket
+返乡
+宣教
+Down
+领跑
+宗室
+士卒
+酒后
+收账
+猛虎
+郫县
+Key
+溟
+白河
+野菜
+大法
+眷
+374
+致残
+钻探
+危难
+创汇
+戮
+抓手
+人脸
+ㄊ
+1MB
+乘机
+烟囱
+菜名
+秋千
+殖
+367
+为题
+十七大
+荣膺
+因人
+婚前
+工事
+信函
+区块
+南岳
+商报
+安徒生
+垦
+裂隙
+532
+AAAA
+刘易斯
+探望
+体味
+柚子
+强盛
+出狱
+伺机
+表象
+公转
+挑起
+倒数
+岩溶
+⑸
+卢卡斯
+之後
+阑
+赎回
+碳钢
+信封
+章法
+飘零
+太师
+艰
+习作
+立意
+陇西
+翻滚
+痔疮
+三十余
+响亮
+partment
+探求
+478
+苔藓
+gu
+山清
+味美
+四子
+分项
+垠
+子曰
+SIM
+进气
+征信
+负压
+失传
+映像
+非典
+黄梅
+满月
+true
+提纲
+star
+豪迈
+疏导
+video
+军训
+坦白
+西德
+游荡
+银币
+州长
+3A
+伯明翰
+吡
+集锦
+悬赏
+全屏
+谚语
+黑猫
+圆盘
+再创
+护国
+膝盖
+平谷
+6g
+磐
+1888
+缔结
+油品
+非公
+个头
+国美
+沮丧
+730
+change
+多边形
+丽人
+鞘翅
+放学
+尽责
+除夕
+皇朝
+坦诚
+球体
+慌
+黄素
+称道
+凤鸣
+庄稼
+公道
+稍稍
+四家
+判别
+纹样
+深信
+坏事
+绯闻
+当月
+善待
+托叶
+依稀
+伝
+霆
+2A
+楚王
+场次
+埃里克
+热风
+放逐
+匮乏
+教诲
+秋雨
+单子
+高台
+小鸭
+交出
+毒气
+粽子
+通志
+桑田
+压实
+师承
+大碟
+嘲讽
+测距
+禽兽
+凄美
+疏水
+切磋
+五级
+深陷
+心胸
+赶赴
+遇险
+过客
+一匹
+钣金
+実
+恬
+高危
+两翼
+衣冠
+公款
+总苞
+怪人
+晚宴
+收听
+Williams
+狩
+余热
+贫民
+060
+乳油
+杜仲
+补益
+027
+路网
+马龙
+渗漏
+real
+纸币
+包袱
+获准
+天工
+罗盘
+侦破
+团伙
+跳过
+文火
+其三
+晦
+橄榄球
+583
+骗局
+妙趣
+泊位
+五号
+异变
+每每
+金州
+开动
+领头
+养心
+冬至
+犀牛
+上流
+博采
+大水
+彩图
+世子
+too
+宣讲
+刊载
+349
+健体
+493
+品行
+梗塞
+功课
+613
+抗炎
+乔纳森
+本意
+里斯
+猎头
+渡边
+白癜
+dream
+小草
+墙纸
+收服
+燕窝
+客流
+文工团
+总产
+挽留
+析出
+炀帝
+今人
+高官
+淡然
+普鲁士
+桧
+贴息
+文论
+风冷
+身长
+熔岩
+主事
+成王
+螳螂
+论治
+轮毂
+why
+淳安
+01S
+恼
+涉猎
+节操
+得道
+癖
+秃
+石城
+因材
+河山
+NACHI
+线长
+阅兵
+旅店
+02000
+江边
+马尔
+做客
+拉斯维加斯
+宫内
+党人
+栋梁
+挺身
+马克斯
+378
+玛利亚
+北仑
+珐琅
+丹霞
+白果
+突尼斯
+沉寂
+定格
+检讨
+宁远
+锄
+划过
+白兔
+蚜虫
+1cm
+下凡
+围观
+巴拉圭
+验方
+学区
+喝水
+退火
+木叶
+水月
+508
+抢走
+养鱼
+老鹰
+自重
+029
+BBQ
+莫扎特
+恢宏
+part
+口罩
+欠缺
+贷记
+1840
+元音
+采编
+德清
+神女
+衍射
+安逸
+夜雨
+皖南
+貂
+Mike
+首富
+断开
+中联
+饭菜
+space
+脱发
+power
+圣地亚哥
+CSS
+莘
+雄花
+社稷
+槟榔
+门锁
+当面
+2M
+越发
+马氏
+445
+实话
+墨迹
+414
+出苗
+联名
+台面
+发散
+观止
+定西
+瞬时
+GTX
+社群
+02001
+焙
+新余
+量程
+上肢
+陈毅
+511
+荼
+月末
+飞碟
+欣欣
+一十六
+战备
+0ISBN
+南水
+新语
+蚕丝
+朝日
+延误
+天花
+毛利
+提早
+正弦
+制衣
+险恶
+工场
+过海
+转录
+呼声
+骚动
+为限
+424
+Mary
+起码
+羽绒
+菜花
+巨著
+凡尘
+猫头
+璋
+充填
+South
+布莱克
+闽东
+头骨
+优选
+421
+追问
+大碗
+叉车
+拉丝
+party
+毓
+阿联酋
+提督
+戏台
+共处
+闸门
+学龄
+Project
+359
+习武
+防城
+紫云
+夏侯
+雷雨
+城主
+安岭
+垄
+BB
+眼眶
+承重
+江浙
+央行
+触碰
+博导
+醴陵
+浓烈
+word
+岙
+花瓶
+机柜
+功成
+浆果
+一卡通
+恩师
+世袭
+双工
+天猫
+诚恳
+凉血
+COM
+糖粉
+美美
+堤防
+Lee
+耸
+臀鳍
+民乐
+740
+薨
+厢房
+音韵
+免受
+认清
+母女
+生人
+大功
+福布斯
+HK
+反过来
+Network
+荡荡
+格力
+IEC
+1885
+Trans
+抗衡
+崎岖
+拢
+PowerPoint
+大话
+当当
+打扰
+云母
+大悲
+蝠
+妒
+ready
+河中
+马山
+睛
+北国
+意气
+邪教
+天目
+路旁
+售前
+嫔
+顽皮
+余力
+逮
+杨树
+德化
+1秒
+商厦
+032
+潺潺
+锦衣
+对空
+液位
+补水
+耳聋
+径流
+大英
+传教
+PET
+拉塔
+羽化
+南街
+杰瑞
+1893
+Entertainment
+比对
+卷起
+田家
+has
+左转
+科松
+金碧
+佳话
+荣辱
+肯尼亚
+冥想
+掩映
+偏心
+颍
+汴
+贺兰
+指着
+涟漪
+赤裸
+桂皮
+巍峨
+壮年
+郑和
+悖论
+燃煤
+六国
+婚约
+触手
+体弱
+念佛
+人武
+省时
+格式化
+加权
+炮火
+真好
+乳糖
+打仗
+标点
+call
+赛中
+造反
+HTC
+中建
+暖暖
+奇兵
+木门
+猪肚
+stay
+免征
+放养
+在天
+National
+随从
+永泰
+探案
+侍中
+实绩
+三好
+嘉善
+老一辈
+⒈
+绊
+傻傻
+名茶
+052
+侦测
+合营
+福鼎
+遵纪
+焊缝
+盖茨
+备战
+量产
+回声
+记叙
+税制
+轻狂
+食客
+汕
+厄运
+衙
+悬殊
+中宣部
+阅历
+阴险
+上瘾
+乌尔
+绝妙
+手握
+小管
+省力
+玻利维亚
+种田
+减震
+痤疮
+过时
+喷出
+Kong
+见人
+统考
+相干
+行至
+雷州
+检出
+光复
+黑水
+初夏
+抄袭
+真真
+狂想
+本职
+哄
+AKB
+疝
+粘稠
+加温
+遨游
+芾
+蹬
+御用
+玻
+代管
+大写
+小片
+剔透
+长势
+发帖
+利比亚
+北端
+异位
+胫
+粗细
+总社
+粘液
+影印
+水母
+字画
+东乡
+456
+大悟
+旭日
+皮特
+称重
+名古屋
+518
+西乡
+山阳
+柔韧
+德胜
+沙丘
+账务
+Center
+515
+Street
+冠名
+家子
+幅员
+歩
+Cross
+坍塌
+封号
+乡间
+纵然
+who
+自闭
+痣
+蛟龙
+滑行
+缩影
+大叶
+梯队
+原形
+吐血
+付诸
+建党
+梯田
+数十万
+人种
+青松
+髋
+砍伐
+机具
+ㄇ
+玄奘
+心经
+煤业
+力行
+两万
+句丽
+增资
+降级
+详实
+蝉联
+鲈鱼
+龙神
+仓颉
+海豹
+阻隔
+GE
+λ
+卷曲
+Under
+赖以
+所罗门
+夜幕
+淡泊
+神木
+熔炼
+深色
+At
+湖边
+邮轮
+错觉
+同桌
+Jason
+一千多
+公车
+润泽
+门头
+put
+片子
+声调
+矮小
+共济
+WinXP
+人面
+冒充
+名品
+打碎
+南无
+达拉斯
+母猪
+Happy
+break
+丽丽
+更衣
+果冻
+脸颊
+附赠
+礼包
+凯恩
+红牛
+完颜
+亲和
+690
+艾美
+淼
+成化
+狭小
+内敛
+巴塞尔
+加分
+专一
+原貌
+多头
+萨满
+龙胆
+赛扬
+毡
+挪
+卫队
+033
+求救
+民权
+解法
+Beijing
+543
+械
+空腹
+研读
+3400
+出头
+浑厚
+SUV
+江都
+按压
+汲
+雪白
+036
+Make
+佐助
+光头
+加长
+皮毛
+古刹
+小孔
+苓
+沃尔特
+1886
+乙醚
+黎族
+清脆
+松软
+上颌
+眼眸
+成佛
+付给
+浙商
+破除
+起火
+比丘
+可取
+恍惚
+石子
+骄人
+钟山
+du
+福山
+校对
+繁重
+侵染
+move
+头颈
+显像
+蔺
+初识
+偏高
+爱丁堡
+主战
+支书
+凡间
+552
+乌江
+活用
+新近
+Get
+玉华
+舞剧
+勤于
+补气
+字辈
+大步
+维特
+伸长
+v2
+清蒸
+分体
+喧
+书稿
+视界
+太史
+增速
+病床
+难看
+三水
+马其顿
+选读
+492
+省心
+Johnson
+鬓
+格雷
+气旋
+均属
+475
+Ⅴ
+ぼ
+劳累
+伶
+阿贾克斯
+白城
+宾夕法尼亚
+俨然
+入世
+哥们
+16000
+标称
+外教
+News
+自费
+鱼塘
+衷
+王座
+丝路
+俘获
+代收
+拥军
+青豆
+山峦
+追梦
+围城
+妓
+恭喜
+副词
+榭
+抄本
+总目
+僵
+火控
+庭园
+马家
+傲视
+何等
+香炉
+傻子
+陨
+樽
+视点
+尽职
+Henry
+教子
+闯进
+络绎
+搂
+金凤
+♀
+缴获
+查证
+管弦
+防癌
+守住
+920
+先由
+407
+私法
+驻军
+铜矿
+从句
+米拉
+甘甜
+波折
+牛郎
+功利
+古书
+肥力
+双峰
+聚落
+赫尔
+算子
+留情
+影楼
+身怀
+涵养
+小豆
+下线
+耗尽
+枪械
+泷
+话机
+两度
+034
+见效
+OA
+914
+住处
+几多
+水冷
+504
+海螺
+血泪
+钢化
+以对
+1650
+法系
+屎
+荣成
+惟有
+六名
+热播
+民委
+迎战
+冷落
+作诗
+挂车
+城邦
+感伤
+公理
+后端
+分值
+南风
+肉鸡
+DB
+网线
+延寿
+上口
+枋
+雪梨
+1001
+首演
+触点
+丢掉
+Japan
+编队
+清幽
+大华
+1892
+住进
+眼花
+End
+靖江
+西芹
+居留
+蓟
+ver
+截留
+日间
+静谧
+绝非
+覆灭
+在任
+放宽
+弘治
+斐然
+磨砺
+电筒
+号牌
+织造
+知州
+辽河
+冰河
+得住
+橙子
+交响
+民革
+纸牌
+圣洁
+031
+外皮
+莴笋
+均等
+604
+稚
+床垫
+秘笈
+一觉
+亦即
+PPT
+Over
+帷幄
+桥面
+停用
+考量
+实处
+一盏
+钟爱
+小手
+博得
+白垩
+合乎
+碧海
+异同
+轲
+干活
+⒉
+例会
+磷脂
+黄鹤
+海阳
+挣脱
+553
+起算
+门人
+莎拉
+设色
+边锋
+速冻
+令其
+亚斯
+两极
+躯干
+冬菇
+Boy
+贴现
+总成
+这事
+火神
+暗藏
+丛刊
+维系
+斯德哥尔摩
+拘役
+吸食
+而入
+醇厚
+罗布
+051
+丰县
+沙皇
+雁塔
+鲜红
+赶出
+五华
+浅浅
+强敌
+戈壁
+炼狱
+四维
+There
+髻
+挚友
+国事
+神童
+宁县
+体位
+一夫
+石鼓
+车用
+侣
+干粉
+跌倒
+und
+湖广
+牛皮
+糠
+莱阳
+叔父
+阳痿
+396
+910
+天骄
+贩
+烤鸭
+驹
+楔
+丰盛
+僧侣
+Class
+益处
+厚实
+070
+店里
+锭
+千姿
+减缓
+梅县
+热油
+后盾
+盔甲
+单于
+联姻
+繁茂
+安东尼奥
+倒闭
+足联
+采煤
+行风
+医案
+花季
+英杰
+EMBA
+句式
+要靠
+START
+亡命
+剧种
+飞过
+水疗
+举手
+沅
+流言
+走马
+418
+脱位
+节间
+男科
+柯克
+新政
+接手
+栈道
+自用
+郡主
+感言
+霁
+近来
+推导
+结集
+对数
+上访
+出奇
+NP
+Plus
+守护神
+感触
+纯朴
+兵工厂
+碧玉
+涝
+车马
+鄱阳湖
+二百六十二
+弗洛伊德
+God
+集权
+芭芭拉
+坏蛋
+胳膊
+两汉
+失职
+鸡尾
+飞瀑
+TP
+江岸
+糜烂
+东岳
+法老
+普尔
+下称
+退缩
+Fran
+阿尔卑斯
+华贵
+虚心
+佩服
+树人
+扁豆
+拉近
+贴图
+惊醒
+低端
+饵
+外衣
+同台
+两眼
+道道
+祇
+民商
+茬
+ON
+V3
+纵观
+Application
+栎
+Simon
+三教
+子龙
+娇妻
+时隔
+遗漏
+巴菲特
+绳索
+配额
+修习
+腋生
+481
+烈日
+复叶
+地藏
+支承
+红山
+修长
+贫寒
+中坚
+同化
+平缓
+偃
+俄亥俄
+袜子
+昆曲
+馍
+海安
+超限
+凯利
+斜坡
+昭君
+迟钝
+red
+杞
+砖瓦
+80000
+F2
+札记
+民情
+年报
+马歇尔
+钤
+拆分
+Phys
+平易
+暖通
+岸线
+家和
+球阀
+RFID
+孝子
+冼
+激荡
+冠以
+庆元
+溧阳
+敬意
+国别
+护身
+永福
+瑞丽
+菌株
+更正
+龙洞
+悲观
+854
+卡斯
+重病
+淀
+抽搐
+黄岛
+福安
+394
+窗体
+丽娜
+before
+差价
+腰痛
+胆管
+麦田
+041
+灌浆
+杀敌
+听课
+再三
+君临
+公称
+盖世
+迟缓
+派生
+KW
+七五
+碎屑
+皆知
+阴虚
+SDHC
+分摊
+Society
+理顺
+芥
+练就
+荨
+五花
+M2
+忘怀
+驱魔
+喇
+噻
+佳丽
+交口
+挖出
+东洋
+体色
+滔滔
+防毒
+停工
+风声
+中控
+pisodes
+自力
+直指
+长处
+BD
+sound
+0CM
+活检
+远见
+牌照
+TIMKEN
+关帝
+서
+顺势
+横空
+563
+B超
+胜出
+曼哈顿
+电价
+沱
+黑手
+差速器
+及早
+罗斯福
+闰
+悲情
+躬
+704
+Memory
+木星
+概览
+每篇
+612
+消亡
+発
+遵从
+胡安
+Huang
+颞
+506
+Enter
+听听
+右转
+簧
+党代会
+前胸
+先贤
+0K
+影城
+雷鸣
+两路
+List
+跋涉
+15g
+酵素
+土石方
+政发
+博洛尼亚
+丫鬟
+上尉
+吕氏
+献出
+煎饼
+萤
+两难
+新思路
+飞往
+尊崇
+客货
+记号
+昨夜
+启程
+热轧
+糖醋
+小野
+旧事
+小穗
+恩爱
+蜜月
+Jackson
+湖人
+绥化
+沙特
+汉王
+每股
+华安
+卡塔尔
+去过
+果业
+细密
+半边
+打捞
+孜然
+其诗
+委婉
+先于
+格律
+纠
+受阻
+磨削
+软硬
+MW
+滑翔
+存亡
+礼节
+核能
+峥嵘
+天时
+月子
+micro
+直前
+勺子
+长风
+荥阳
+自给
+精辟
+火电
+一帆
+三昧
+萨拉
+魔道
+胆道
+象形
+天牛
+文心
+超音速
+中生
+降噪
+这笔
+贤者
+小牛
+起眼
+Harry
+斯特
+ㄜ
+BE
+定远
+图样
+Jose
+盗版
+日立
+大蛇
+978730
+北非
+605
+远走
+亢进
+鼻祖
+郊外
+诚心
+误导
+休斯顿
+涉农
+觞
+财贸
+墨鱼
+可恶
+克星
+漏水
+磋商
+art
+一意
+高平
+四书
+Eric
+秀山
+缥缈
+로
+补正
+道义
+梦游
+喝彩
+千岛
+翼翼
+绝顶
+300000
+凝视
+俏皮
+鹿城
+连击
+安稳
+酉
+暴发
+AK
+污垢
+ㄛ
+830
+交流电
+Touch
+一串
+槐树
+法拉利
+毛囊
+威尔逊
+巴勒斯坦
+威风
+EN
+解难
+赵氏
+消散
+六人
+MSN
+却说
+等分
+1870
+生母
+掩埋
+高岭
+历次
+诗书
+气孔
+爱新觉罗
+前妻
+坐拥
+茶花
+NTN
+守恒
+修成
+底色
+攸
+百事
+DN
+猎手
+达摩
+丑闻
+合生
+包谷
+节俭
+赣南
+席位
+娘家
+话筒
+圆明园
+431
+坚毅
+嵇
+灰烬
+吉田
+特写
+北调
+哈里
+黄白色
+发情
+积液
+巽
+苍山
+合编
+套间
+075
+断肠
+谷子
+光速
+线材
+林家
+Link
+复生
+驸马
+Wi
+贝勒
+息肉
+ㄡ
+cause
+看得见
+炼钢
+增援
+科斯
+GPU
+δ
+对角
+偷走
+投靠
+缨
+国办
+格栅
+团团
+怒气
+Player
+寿山
+饮茶
+毛衣
+465
+咬伤
+高科
+苏珊
+报案
+皈依
+东林
+氟化
+吉米
+消瘦
+²
+A3
+045
+正厅
+景气
+黒
+Chang
+双全
+王城
+科罗拉多
+紫禁城
+车票
+妃子
+震颤
+食宿
+庐江
+追缴
+访客
+莞
+套件
+童星
+宝珠
+ThinkPad
+下单
+01982
+迭代
+贺州
+百老汇
+SNS
+干流
+淤
+宁都
+签到
+八旗
+权衡
+议长
+文登
+青联
+管状
+平直
+剑道
+屋里
+假名
+小雅
+谅
+prise
+hand
+大年
+GHz
+偏重
+村组
+拈
+三公
+玉祥
+海子
+芳心
+师表
+场外
+一连
+大丰
+杜比
+虚荣
+合署
+敷设
+结清
+声色
+DP
+消退
+条形
+校庆
+窃取
+用兵
+谜底
+ォ
+用处
+遮盖
+掐
+水泡
+晶石
+驶
+玉雕
+号角
+崇德
+虫害
+旌
+木香
+巴斯
+初代
+BC
+专委会
+德克萨斯
+红肿
+夏秋
+电费
+证照
+任天堂
+六岁
+之窗
+念珠
+筵
+膛
+即行
+unknowne
+多肽
+还乡
+险峻
+5200
+聚伞
+下端
+怀远
+波及
+粥样
+Ctrl
+水边
+Run
+一趟
+审讯
+慕名
+397
+中德
+直射
+造出
+霏
+烫伤
+三句
+387
+美育
+朝圣
+敲门
+好象
+横滨
+蒂姆
+拉格
+发球
+花鼓
+封存
+汉斯
+花灯
+加西亚
+GL
+分批
+痛哭
+Well
+叛军
+照样
+脲
+东安
+双目
+芥末
+到家
+解冻
+大举
+look
+浅析
+幽深
+小刚
+折腾
+洪湖
+写诗
+市局
+042
+冀州
+一票
+药方
+菲尔
+南明
+邪神
+堂课
+铜像
+大运
+加冕
+八分
+秀吉
+元神
+新知
+祗
+君士坦丁
+class
+搜救
+苦练
+雪儿
+入微
+重合
+年谱
+整编
+MY
+余生
+极短
+横穿
+测井
+南天
+能干
+三品
+涟水
+丹参
+容错
+熵
+涙
+Fe
+sub
+读出
+坦然
+哎呀
+圆圈
+州委
+侥幸
+足总
+支护
+买房
+触电
+高歌
+神威
+黄鱼
+柔美
+腐竹
+盈亏
+奋起
+紫光
+0MHz
+混入
+止渴
+教派
+1887
+缆
+得体
+剿
+铝业
+过节
+增城
+宁可
+control
+肿块
+饱含
+班次
+提速
+沢
+老爹
+调侃
+穹
+释手
+思科
+原位
+会面
+寿险
+0kg
+Say
+偕
+乘风
+周三
+工大
+办结
+龙女
+纸板
+后院
+十人
+乳业
+Apple
+蜈蚣
+催生
+失灵
+快活
+雨伞
+枪机
+古国
+鞭炮
+消融
+事前
+威望
+海风
+一齐
+573
+骇
+广域
+游子
+北流
+小强
+紫阳
+魔导
+肮脏
+农工
+ford
+柑
+依恋
+冲绳
+铜钱
+诺夫
+找茬
+花边
+带队
+心性
+As
+悠长
+鼻腔
+强硬
+文峰
+报答
+寻味
+水厂
+掺杂
+袭来
+半山
+抵扣
+羔羊
+老乡
+391
+国史
+分站
+宣武
+抽调
+妖兽
+037
+辞退
+心仪
+精雕
+锐利
+super
+高通
+inter
+网管
+听闻
+水星
+峙
+韧
+来去
+低空
+乐意
+二百四十四
+尘土
+鞠躬
+us
+踏板
+ユ
+南桥
+莫大
+长汀
+补习
+更生
+自大
+音讯
+爱华
+萧萧
+萦
+梨树
+脂质
+双星
+位在
+KOYO
+遮挡
+绿城
+破旧
+毛皮
+20g
+马头
+勒索
+迪斯尼
+长石
+426
+寥寥
+经期
+纵使
+대
+石棉
+西奥
+龙吟
+氢键
+物证
+出山
+吃水
+疆域
+红眼
+正门
+幽幽
+音阶
+奏鸣
+分钱
+汕尾
+珍妮
+颅脑
+Version
+г
+速记
+月明
+手笔
+在案
+购进
+恩仇
+撰文
+444
+铺垫
+触觉
+京广
+高州
+1884
+口令
+连片
+作用力
+一桩
+really
+小刀
+相生
+绝情
+一城
+怀有
+殆
+羊皮
+汇丰
+入室
+满腔
+点睛
+双臂
+杂乱
+省去
+队名
+623
+五经
+正定
+LRC
+赔款
+楚天
+心心
+安化
+without
+超载
+福地
+悲歌
+代行
+祚
+叫作
+迅雷
+资阳
+姐弟
+琳琅
+惊慌
+四强
+韵母
+两相
+取经
+涤
+雨果
+正视
+纷繁
+拮抗
+随军
+饵料
+首班
+烟尘
+平反
+落寞
+峦
+意蕴
+创投
+T1
+刚柔
+次第
+薄壁
+尿酸
+涅盘
+572
+事态
+梦寐
+1875
+牵制
+1G
+烘托
+味觉
+复读
+阿哥
+漂白
+病故
+心包
+繁杂
+每条
+永明
+商洛
+郦
+巴掌
+战警
+测度
+外圈
+锦囊
+容许
+高寒
+school
+持证
+瘘
+置疑
+县治
+延展
+矿务
+圣战
+介意
+击溃
+些许
+ㄈ
+停泊
+水暖
+糖水
+chapter
+为己
+猛进
+憎恨
+烧饼
+轮式
+琐事
+RPM
+蹈
+预审
+白骨
+从政
+莫里斯
+婉转
+die
+生辉
+面筋
+美体
+泥石流
+面食
+新股
+计付
+接应
+锅底
+
+脚架
+计息
+停办
+麾下
+喝茶
+金丹
+全开
+润发
+牛腩
+世系
+多项式
+如故
+市值
+河图
+苗期
+用时
+榕树
+格兰特
+讨好
+呼喊
+诚挚
+什锦
+放水
+白带
+中航
+爆裂
+叮当
+获批
+避雷
+赏赐
+承销
+仙居
+左旋
+删去
+微山
+镉
+青阳
+价钱
+巴洛克
+夹层
+ATP
+革委
+修整
+翠绿
+大虾
+艮
+仆射
+内皮
+疹
+栅栏
+忠臣
+455
+新化
+气度
+气概
+周长
+清丽
+双生
+掰
+汽配
+载波
+捶
+空灵
+安国
+史志
+竹木
+回肠
+两会
+新款
+雅典娜
+诗社
+1891
+荸荠
+节电
+伊尔
+包换
+隐士
+迷途
+定本
+头条
+基业
+担纲
+409
+白芍
+骁
+起降
+羌族
+龙宫
+专集
+铁西
+丙烯
+甘泉
+围巾
+瓢虫
+用好
+到手
+PCR
+强直
+至善
+PT
+胸中
+鹧鸪
+志勇
+筹办
+库尔
+食肉
+傍水
+其乐
+骸
+挡板
+苯胺
+征兵
+乏味
+绍酒
+歹徒
+告终
+她想
+重于
+风韵
+陌上
+053
+続
+候鸟
+581
+月圆
+克斯
+报恩
+义和
+文静
+送进
+西京
+云梦
+翰墨
+少校
+临清
+汉诺威
+济源
+703
+血肿
+莅临
+m3
+季刊
+蓓
+摄政
+格拉斯哥
+石路
+天荒
+705
+双线
+沽
+摇晃
+新港
+摧残
+实况
+红红
+FLAC
+慧眼
+母线
+峨
+也罢
+巫术
+箍
+朋克
+铜鼓
+汉奸
+卫军
+运维
+莱州
+筹码
+插花
+国界
+卢森堡
+Work
+心计
+娴
+跗
+魔剑
+百叶
+偏移
+美眉
+抽屉
+反舰
+睑
+少时
+皮蛋
+厚厚
+ny
+抽烟
+牌楼
+神山
+做官
+守军
+1750
+庆云
+好又快
+盛况
+光栅
+said
+绕城
+奖状
+ぬ
+水银
+莱比锡
+卑鄙
+织布
+高跟
+停顿
+毒物
+青草
+小号
+抛开
+起跑
+崩塌
+病态
+535
+心路
+散打
+米线
+713
+行省
+读数
+智勇
+达尔
+Pre
+践踏
+抚摸
+大足
+输尿
+后座
+上榜
+饕餮
+算命
+愈来
+己方
+∕
+裸体
+东街
+攻陷
+东村
+赛跑
+鳕鱼
+还算
+九曲
+HT
+眨眼
+忻
+一维
+津津
+游学
+keep
+执掌
+521
+风能
+纨绔
+轻视
+钻头
+狂妄
+耗时
+打猎
+灵石
+扰动
+现年
+身负
+类推
+抚育
+密歇根
+句法
+黑格尔
+切面
+茅山
+壮士
+回访
+腾冲
+防备
+蓝海
+劳作
+党纪
+hard
+杏鲍
+扇形
+焯水
+乐道
+轧制
+嗅
+兴义
+别克
+377
+RMB
+菲菲
+宜忌
+良师
+天花板
+名句
+2
+贴合
+雪域
+had
+M1
+金文
+台站
+落泪
+螯
+环岛
+寒冬
+弋
+四边
+商法
+阳气
+冷面
+CB
+1450
+801
+遥望
+中奖
+OVA
+钻进
+帮会
+嘘
+长空
+婺源
+孤身
+始末
+542
+百佳
+DRAM
+地府
+中非
+开凿
+异型
+店主
+039
+立国
+依存
+丛中
+石像
+菌素
+PNG
+融通
+小菜
+非人
+蕾丝
+包包
+志摩
+新开
+虔
+基点
+胜于
+节育
+从教
+653
+男声
+波尔图
+城阳
+假山
+JB
+市井
+害死
+DT
+网通
+男方
+海内
+动因
+town
+桢
+七种
+喷施
+死因
+新动态
+穿行
+038
+旅顺
+试管
+甜菜
+柔道
+本剧
+片断
+望江
+大川
+3ds
+广岛
+牙膏
+建康
+八千
+HTTP
+翻阅
+隐含
+小青
+对虾
+千分
+饮片
+菅
+气场
+夜店
+Brian
+回执
+神曲
+摘除
+厨艺
+诗云
+鼬
+刺痛
+惧怕
+杂记
+已死
+乌鸡
+大鼓
+072
+独白
+01999
+心酸
+417
+流利
+俘
+礼乐
+脱胎
+邝
+酒馆
+时政
+切合
+炼化
+Online
+WS
+探针
+和风
+霸权
+真经
+喽
+跨入
+薄层
+嘟嘟
+海纳
+钒
+筋膜
+张杰
+两百
+宽厚
+浙东
+紫苏
+佛光
+讲演
+要诀
+427
+沂蒙
+恙
+盖帽
+上街
+瓦尔
+六部
+暂定
+佛塔
+褪
+领馆
+墓室
+豁免
+热那亚
+月台
+霉病
+敲击
+弧焊
+499
+妙用
+腊月
+肆虐
+瑞恩
+酸痛
+股民
+远山
+天狼
+雄浑
+齐王
+南州
+624
+议政
+水工
+夙
+雷击
+内裤
+荟
+满脸
+通便
+水塘
+乞讨
+识破
+叛变
+红岩
+未了
+5L
+北街
+日化
+机芯
+龙海
+多特蒙德
+古田
+反差
+副刊
+孝顺
+瓜子
+商贾
+桑托斯
+428
+寰宇
+多端
+蛤蜊
+幽州
+留香
+ext
+印本
+Micha
+追回
+扉
+疗伤
+1881
+熟人
+哥斯达黎加
+蕙
+扬帆
+追查
+侧翼
+ь
+砌筑
+林子
+轻质
+书架
+QS
+山崖
+河池
+贝类
+刀法
+地缘
+黯
+白光
+精读
+清廉
+进门
+丹丹
+海鸥
+韶山
+全然
+安息
+印有
+永磁
+富饶
+好比
+文卫
+纺锤
+Edition
+南岗
+名册
+红霞
+舅
+建宁
+并口
+片面
+湘菜
+益友
+十天
+祯
+波状
+直系
+国门
+牙龈
+仙道
+清流
+嘻嘻
+汉墓
+比拼
+壑
+主城
+食性
+合浦
+明暗
+近些年
+Andrew
+船队
+登封
+20mm
+挂号
+593
+橱窗
+9g
+百战
+豁达
+闽西
+呼啸
+握住
+内层
+对冲
+纹路
+天香
+⑨
+SF
+腋
+海地
+528
+执委
+字迹
+房租
+耐药
+隔断
+接续
+四座
+其内
+一筹
+现职
+常量
+西关
+挥手
+黄州
+小二
+知性
+广博
+桥东
+殿内
+聚氯
+945
+绝壁
+水头
+朗朗
+起航
+水岸
+吉尼斯
+揍
+制取
+Girls
+奥妙
+弛
+389
+关照
+官庄
+落落
+军火
+孪生
+精进
+斯图尔特
+ML
+倒地
+明军
+约旦
+铲除
+西厢
+10mm
+怜悯
+Rose
+441
+来来
+与共
+VLAN
+沙地
+半路
+遍野
+ざ
+香粉
+专程
+可人
+漏电
+请问
+居间
+身分
+远望
+科尔沁
+激昂
+466
+621
+阿坝
+Mc
+精气
+荣登
+定下
+熄火
+在册
+为命
+反顾
+研究型
+鸡粉
+Drive
+伊甸
+文杰
+璟
+花坛
+药液
+祭坛
+坡地
+大正
+534
+赋诗
+054
+丽娟
+木乃伊
+徐徐
+瑞尔
+原判
+绣花
+激流
+冬青
+雾化
+竞相
+为力
+高下
+GA
+RAW
+硫磺
+比比
+豫剧
+南越
+自转
+口渴
+罔
+志远
+自诉
+行踪
+灯管
+增厚
+指正
+1882
+及格
+藏有
+候车
+近身
+秘境
+二为
+水轮
+铮
+正阳
+阿尔巴尼亚
+露地
+知心
+OP
+胸鳍
+剧社
+务农
+部属
+平遥
+弹射
+纽伦堡
+调匀
+浇注
+Education
+剑圣
+原木
+关公
+好戏
+塞浦路斯
+从无
+解构
+精液
+Company
+鼎立
+勤务
+水能
+寻访
+阿瑟
+船体
+功放
+难怪
+围栏
+遐想
+近郊
+杳
+泥鳅
+空旷
+宣战
+中古
+图例
+生均
+毛竹
+ا
+渝北
+否决
+NHK
+东流
+填埋
+古井
+富商
+巫女
+初创
+末路
+临淄
+收留
+吟唱
+掉进
+玉佩
+穿衣
+地市
+After
+痞子
+Si
+实属
+爆炒
+436
+迪克
+基生
+缜密
+挫伤
+新田
+装满
+供血
+行知
+唾液
+赫拉
+云浮
+一遇
+缙云
+ω
+客气
+如若
+斛
+疃
+围困
+奥林匹亚
+吊装
+八项
+献策
+千亩
+狗肉
+碾压
+藏经
+放假
+清源
+红人
+金宝
+花山
+佛门
+金鹰
+公公
+二百零八
+色差
+飞羽
+逃逸
+蒂娜
+许诺
+镀膜
+ville
+磁铁
+575
+超脱
+赖氨
+缩略
+约合
+怯
+调色
+着火
+浮肿
+灾民
+白莲
+脚手
+车前
+心神
+瞭
+导报
+娜娜
+聚众
+043
+错乱
+正太
+水师
+新兵
+采砂
+总算
+奥义
+重金
+150000
+气道
+鲍勃
+551
+奥斯曼
+彭城
+Book
+背靠
+跃进
+人声
+地利
+青莲
+打听
+碑帖
+巡检
+消食
+气虚
+凯恩斯
+受累
+认出
+衢
+使徒
+东郊
+玩玩
+奥赛
+探亲
+379
+椒江
+544
+林间
+罗德
+渎职
+마
+扑救
+杂剧
+圆球
+MK
+英豪
+箕
+懂事
+耶路撒冷
+子囊
+春笋
+快线
+心系
+归侨
+much
+048
+leave
+803
+都尉
+耻辱
+DR
+Tony
+를
+想见
+来者
+阿拉斯加
+送出
+Set
+西溪
+老夫
+撒娇
+幼小
+精米
+BCD
+作好
+八门
+庆幸
+远行
+到老
+闭环
+退学
+乐见
+Materials
+惰性
+纹身
+微星
+淡漠
+something
+黑子
+固原
+称得上
+黎巴嫩
+博山
+后进
+承袭
+解惑
+右脑
+712
+定罪
+鸟巢
+回火
+豆沙
+萘
+倍感
+正传
+如风
+腹中
+oF
+晃动
+矫治
+异乡
+宋人
+01998
+舞步
+塔罗
+沙盘
+瘾
+故曰
+业者
+至正
+伤悲
+悼念
+文森特
+镀金
+五角
+舞美
+布达佩斯
+同月
+SB
+1536
+454
+小雪
+质谱
+search
+矿藏
+茉
+702
+圣旨
+倭寇
+压电
+花语
+珞
+大寨
+埃塞俄比亚
+黄柏
+Small
+文采
+题解
+逵
+接过
+懒惰
+钞票
+艰险
+宙
+尼康
+南川
+635
+ы
+仿生
+清心
+tell
+织女
+St
+通商
+小葱
+Christopher
+开馆
+舌尖
+脑部
+祖传
+渺小
+成林
+Lady
+hey
+凶残
+様
+踝
+格鲁吉亚
+486
+利川
+铁甲
+蚯蚓
+三通
+正派
+情义
+推陈
+国都
+驾到
+略微
+实心
+AN
+喀斯特
+海州
+竹山
+速递
+野性
+每场
+毛豆
+河间
+序曲
+力士
+围着
+稻米
+楼台
+相国
+写景
+升天
+12V
+房东
+击倒
+闻风
+Proce
+外缘
+ㄝ
+归路
+载重
+手榴弹
+猫猫
+部将
+馗
+扇贝
+古塔
+抽水
+044
+支点
+出汗
+转机
+交手
+手镯
+니
+筛查
+休止
+只愿
+阿克苏
+their
+轮距
+hear
+铣削
+推倒
+鲸鱼
+如烟
+单行
+怒江
+栉
+蜃
+清雅
+点球
+514
+射速
+抵消
+侗
+首轮
+灵巧
+尔兰
+掌中
+轴心
+浩大
+2中
+递减
+拉链
+046
+582
+抨击
+花枝
+沉沉
+恤
+刀锋
+知足
+same
+俗语
+伎
+涉税
+胶水
+蓦然
+提纯
+达赖
+舒心
+沦落
+腹足
+IPS
+剔
+б
+出线
+流云
+向内
+登临
+自查
+婚事
+朔州
+ゅ
+小国
+补办
+雪梅
+马尔代夫
+况且
+勾起
+光波
+清静
+曲张
+肾虚
+起价
+铁匠
+好色
+870
+塘村
+劝阻
+空缺
+元帝
+happy
+肌瘤
+钳工
+巩义
+站前
+8G
+葬地
+四首
+湖光
+辛庄
+仗义
+1010
+影星
+鼓动
+殁
+书证
+游园
+参事
+因故
+样样
+宪章
+朝拜
+闭关
+影展
+帮主
+开拍
+Association
+股利
+村官
+芳华
+643
+434
+王冠
+事关
+沙石
+吹过
+吉凶
+粉煤
+新泽西
+金田
+佐料
+引脚
+保龄
+低估
+管内
+요
+想尽
+喀麦隆
+亟待
+音调
+临死
+想吃
+同属
+438
+财源
+山花
+濂
+缝制
+5GHz
+帝位
+大埔
+View
+字经
+Films
+旋钮
+歙县
+荣光
+明光
+传唱
+结缔
+老娘
+超乎
+524
+岐山
+儒雅
+无纺布
+车架
+机率
+夜视
+洞悉
+雄鹰
+633
+陌路
+水经
+得胜
+入院
+蠢
+自旋
+vision
+自禁
+甘孜
+448
+博雅
+授粉
+附设
+羊城
+调拨
+折合
+早早
+泡椒
+狂奔
+躁
+核销
+KM
+著者
+黑斑
+横溢
+熟地
+自产
+吉隆坡
+对账
+叮
+尼采
+斯拉夫
+明王
+按日
+恢弘
+基团
+来得及
+白斑
+永川
+589
+氢气
+金国
+鹿角
+浮沉
+积木
+仰慕
+464
+罗山
+亚东
+凌乱
+强人
+父女
+荔
+白蛇
+恭敬
+威震
+帝都
+皆以
+一成
+柠
+福冈
+056
+3episodes
+定州
+巡游
+五帝
+金秋
+八面
+通晓
+毁掉
+仅供
+孕前
+爱沙尼亚
+丙胺
+择业
+暴躁
+伊春
+爽快
+长女
+迎风
+东关
+419
+二百一十一
+神道
+迁都
+基体
+圆梦
+风浪
+涉水
+亚诺
+丹田
+516
+忍术
+Francis
+一脉
+步进
+玲玲
+长卷
+鹤山
+用完
+校务
+蝽
+新论
+蟾
+845
+机载
+条条
+旷世
+太傅
+小凤
+Smart
+랑
+玉帝
+探矿
+后生
+涂色
+blue
+PRO
+男爵
+1020
+排挤
+care
+天资
+Why
+逼人
+Space
+寄语
+预装
+铁皮
+coming
+雄风
+薄薄
+学法
+PG
+列强
+小黑
+小新
+宪宗
+油性
+近卫
+立马
+young
+应手
+蚜
+羞辱
+暗淡
+婧
+素问
+摆设
+用纸
+德宏
+鸡胸
+史册
+赈
+四环
+低谷
+三叶
+JavaScript
+知行
+自尽
+儋州
+615
+刑侦
+燎原
+飞飞
+小松
+画意
+瓣膜
+火候
+篡改
+Point
+form
+after
+夜深
+特快
+塬
+校歌
+选股
+济世
+腊肠
+倘
+加紧
+要害
+松原
+拉夫
+筐
+抢修
+顾虑
+而亡
+钡
+家传
+城隍庙
+吨位
+接合
+折算
+随行
+露营
+探访
+MicroSD
+想出
+校门
+漱
+仁慈
+胸膛
+脸谱
+烟云
+奥兰多
+A类
+中尉
+水印
+译丛
+丹江
+专案
+小炒
+场内
+频响
+密谋
+雨夜
+Kevin
+享乐
+金库
+п
+生地
+相变
+逃往
+燃放
+天行
+长夜
+屈辱
+南特
+珍爱
+双城
+562
+来年
+区段
+结缘
+1883
+495
+一点儿
+五味子
+七世
+匆忙
+羧酸
+林夕
+书案
+Yes
+近人
+方志
+开路
+名氏
+城楼
+超然
+活体
+尔后
+杵
+拜占庭
+645
+共舞
+色斑
+琰
+kiss
+腹水
+鼻窦
+paper
+草果
+猛兽
+渔村
+奂
+安危
+沓
+切末
+Mini
+蝗
+原意
+疾苦
+儒林
+赶往
+总行
+H2O
+阴雨
+飘荡
+伯乐
+口干
+挣钱
+游刃
+脑脊液
+特洛伊
+沿街
+派别
+李子
+日均
+薏米
+溶性
+夏尔
+二两
+Wall
+呃
+泰斗
+良方
+广发
+话费
+遗书
+North
+民建
+乌托邦
+性欲
+雷特
+连江
+湘南
+监禁
+药监
+乳酪
+资历
+提单
+发自
+当众
+BY
+大志
+相看
+白醋
+贴切
+中加
+紧缩
+2kg
+乳品
+练兵
+自恋
+类比
+粉条
+俑
+Story
+吉首
+滴血
+70000
+订制
+初版
+622
+内圈
+共青
+棚户
+旅费
+新军
+实有
+镂
+心存
+摔倒
+男篮
+2030
+敌机
+体魄
+探伤
+豆科
+沃尔
+UNIT
+RC
+潇潇
+炭疽
+完形
+贪官
+堂皇
+丰碑
+海防
+道合
+龙游
+中意
+政界
+国境
+大旗
+征兆
+国务
+热身
+骰子
+取食
+选聘
+电炉
+改观
+明基
+栀子
+揭发
+特设
+上当
+这话
+夺命
+拉克
+做完
+Air
+志军
+钢轨
+666
+VB
+4400
+炽热
+优等
+剑侠
+撤消
+剁椒
+太虚
+萌生
+friends
+铜器
+延吉
+牛油
+新街
+奥尔
+对岸
+艾克
+看护
+大口
+晴朗
+对讲
+搞怪
+丰胸
+反感
+国光
+下文
+方阵
+上月
+feeling
+500000
+格拉
+韦恩
+采风
+鸡西
+异化
+毛虫
+斗拱
+诃
+峰山
+侄子
+调教
+少华
+盈盈
+出牌
+豇豆
+二氧化硫
+055
+腺癌
+马云
+过头
+兵败
+476
+茁壮
+浓重
+淤泥
+富集
+特此
+盱眙
+Level
+过瘾
+电偶
+073
+丙基
+海潮
+斩首
+褐斑
+汉唐
+白芷
+第四纪
+丝带
+etal
+春水
+Hall
+匹马
+夏洛特
+路况
+动议
+悦耳
+永胜
+瓯海
+整容
+小碗
+枯竭
+病机
+赤子
+790
+DLP
+牵扯
+党籍
+昌吉
+乐视
+艾玛
+Hello
+秦朝
+锆
+喉咙
+诺丁汉
+霸业
+冰霜
+小妖
+杀出
+phone
+成瘾
+巍巍
+结点
+三藏
+斑驳
+自分
+石龙
+盛装
+CHAPTER
+亚美尼亚
+凤城
+熊熊
+节节
+塌陷
+065
+low
+胡桃
+会后
+斑马
+Δ
+鲷
+mental
+臣服
+WWW
+拍拍
+决断
+必胜
+倒影
+骤
+便以
+质询
+极点
+马术
+草鱼
+一闪
+1680
+备选
+Moon
+主队
+舰长
+拯
+行贿
+前门
+圣火
+水牛
+托付
+签收
+宥
+曳
+缴款
+分厂
+5600
+击穿
+碁
+长孙
+四溢
+洗刷
+憾
+花盆
+奠
+真气
+版主
+立宪
+普惠
+787
+1002
+邺
+打球
+鸭肉
+鬼王
+召见
+朝野
+Review
+魔咒
+狼疮
+凌空
+会刊
+父老
+强光
+新洲
+史话
+中下
+扑灭
+分外
+浸出
+车削
+飞燕
+板楼
+学成
+周报
+碧绿
+踊跃
+中水
+标价
+撤出
+大坪
+技校
+掌柜
+唱响
+骞
+游船
+维拉
+东段
+迎面
+摩羯
+磐石
+满城
+Wild
+道法
+Magic
+多普勒
+01984
+油酸
+龙川
+Language
+水饺
+灭世
+车长
+보
+608
+胃口
+短毛
+1876
+益州
+维德
+KV
+街巷
+青羊
+中校
+受压
+雪月
+074
+本山
+id
+mmHg
+塔式
+火红
+Angel
+亨特
+瓦伦西亚
+疏忽
+撼
+Patrick
+总科
+4300
+泵站
+狠心
+无证
+萦绕
+结余
+百病
+记入
+飞雪
+水底
+麻花
+战前
+归结
+千户
+万变
+内生
+阅卷
+海河
+二人民
+群团
+座舱
+蓼
+宴席
+很早
+海丰
+花茶
+红石
+月华
+南村
+必先
+连夜
+↙
+获益
+柳林
+康定
+晚安
+关切
+most
+赵家
+红油
+背离
+志在
+心腹
+458
+临港
+皇权
+展台
+家河
+骨盆
+同种
+偏执
+vol
+流沙
+拾取
+0g
+大梁
+시
+TN
+变通
+全貌
+草帽
+等值
+谈笑
+溶化
+行云
+潍
+格罗
+修葺
+二万
+特辑
+斑块
+聪颖
+心甘
+APE
+莘莘
+久后
+拉齐奥
+蔗
+507
+theatrical
+遮蔽
+鄙视
+放火
+食道
+which
+老牌
+耗资
+泠
+吞咽
+交感
+取款
+隔着
+天黑
+痛恨
+农商
+盟友
+怀古
+北风
+永年
+725
+ε
+仍可
+card
+囚犯
+稽核
+早产
+向荣
+星月
+籼
+巴克
+瘠薄
+MOS
+离乡
+真身
+播报
+油污
+回廊
+成亲
+Never
+谷地
+门徒
+6GHz
+免于
+镐
+531
+8MB
+费力
+山阴
+多时
+尊师
+贫富
+自体
+终有
+逐鹿
+草拟
+诽谤
+僵硬
+484
+规费
+白皙
+外销
+踪影
+陪护
+暴龙
+拉手
+邈
+大韩
+要紧
+翳
+大计
+牙科
+兰斯
+清道
+放荡
+仙山
+煽动
+超频
+长剑
+words
+国栋
+矢志
+抬高
+初开
+诏书
+推定
+箴言
+ㄞ
+浇灌
+治所
+给与
+志同
+老友
+XXX
+日剧
+宗派
+生来
+某日
+榨菜
+至宝
+昊天
+布艺
+漆器
+甲酯
+分获
+患难
+免收
+葱葱
+扈
+菜馆
+铜山
+便当
+干贝
+UFO
+047
+婕
+索道
+风花
+Stephen
+橡塑
+渴求
+阿曼
+诺曼
+核糖
+靠谱
+鼓吹
+抗拉
+电导
+谬
+632
+戒烟
+铁人
+增发
+攀岩
+学得
+潮阳
+居中
+政办
+乌克
+1861
+和美
+师院
+分叉
+恍然
+车船
+泰达
+椎间
+隽永
+last
+北川
+驮
+孺
+敬重
+闇
+牵动
+4g
+钙质
+影迷
+蟑螂
+crazy
+FLASH
+并网
+当兵
+甾
+本罪
+杯中
+烛光
+甘薯
+架起
+军士
+吃货
+侍从
+南康
+惠农
+祭品
+071
+袋鼠
+书包
+蕨类
+纽卡斯尔
+鳗鱼
+AR
+英德
+敌后
+焦化
+天坛
+丁基
+威信
+概说
+腾空
+站址
+梆子
+校徽
+店员
+伤病
+抑菌
+赋税
+拍案
+墨客
+马桶
+境遇
+西天
+贝利
+Professional
+冷风
+二百一十
+评优
+MRI
+蚧
+钜
+监听
+诸国
+出局
+SE
+彬彬
+飞刀
+西药
+明文
+大港
+胸膜
+绞线
+基线
+胶合
+出轨
+陵县
+安详
+喷头
+票证
+俱佳
+繁盛
+068
+Kim
+宛若
+804
+805
+可悲
+酷暑
+盈余
+兵家
+透出
+东渡
+1864
+招数
+CCC
+出货
+敲诈
+变型
+金平
+建伟
+惠阳
+01985
+喜事
+FLV
+台长
+四张
+嘌呤
+公德
+披着
+供用
+粉状
+核苷酸
+洗发
+昌黎
+GW
+启明
+店长
+忍心
+打伤
+暹罗
+西元
+戏院
+真言
+单单
+歌德
+胜者
+奥甲
+历城
+EMS
+万化
+Global
+甚少
+ship
+苍苍
+极光
+卖掉
+076
+可致
+三九
+东东
+三毛
+方格
+沟壑
+754
+忧患
+枝头
+碑记
+1865
+滚石
+双塔
+重装
+飞檐
+雪人
+北城
+10大
+萨姆
+继电
+066
+江州
+俾
+苏超
+甄选
+抽奖
+泮
+笛子
+岳父
+音源
+科威特
+西平
+绕过
+除掉
+整备
+至诚
+case
+死党
+马可
+烷基
+592
+PH值
+开火
+麻酱
+思源
+待查
+冬奥
+061
+dont
+包扎
+荣幸
+道观
+深奥
+蛋鸡
+踹
+基材
+过世
+夯
+掀
+分文
+县直
+共谋
+忽悠
+刚毛
+01997
+矿用
+秒内
+贝尔格莱德
+铁观音
+People
+原稿
+扁桃体
+时辰
+door
+rock
+偈
+复员
+要命
+波尔
+婚宴
+由衷
+原画
+三折
+一物
+陆丰
+岩画
+弟兄
+古村
+〓
+洗车
+0L
+雇用
+人妖
+斯科
+真如
+腰腿
+悲凉
+疟疾
+份子
+会师
+索尔
+挥舞
+轻纺
+打乱
+蹦
+蜀汉
+居高
+545
+067
+正负
+土鸡
+help
+Not
+摩诃
+腐熟
+⒊
+614
+Cat
+平价
+苦闷
+横梁
+结伴
+驻足
+抢先
+躲过
+祁连
+困苦
+血气
+焦炭
+通力
+金榜
+减产
+内需
+Cu
+联防
+御姐
+册封
+接听
+对齐
+各镇
+五常
+美艳
+岩洞
+离岸
+焊条
+招考
+久期
+隐忍
+Body
+890
+五岳
+有才
+蝶恋
+铝箔
+涟
+安南
+追封
+蒸锅
+琴弦
+崇敬
+铜川
+END
+徜徉
+062
+光谷
+gone
+突显
+5W
+神户
+千变
+澹
+太阴
+肌腱
+SH
+薪水
+潜质
+外露
+鸭蛋
+Introduction
+洪流
+8m
+丁丁
+闹剧
+于民
+月牙
+大龙
+分机
+中耳
+飘扬
+仁寿
+Road
+Ⅵ
+胜景
+北郊
+金杯
+西沙
+驻外
+公祠
+百倍
+南关
+痔
+绸
+打劫
+苏南
+情系
+新河
+Team
+西西里
+姥姥
+YOU
+闪避
+Phone
+出逃
+作序
+较慢
+咔
+德惠
+戊戌
+録
+talk
+潮水
+牌匾
+轻声
+异名
+八级
+064
+자
+九霄
+表里
+寰
+酒会
+弹琴
+TBS
+院线
+大餐
+和合
+世尊
+国歌
+里约
+挥霍
+报文
+出题
+苜蓿
+勾引
+注水
+百草
+饰物
+things
+三皇
+咨
+其意
+恳
+台地
+照度
+贡品
+摘自
+段子
+精卫
+轮换
+刮痧
+六世
+超细
+肤质
+世说
+现出
+字词
+凤仙
+巴巴
+烤漆
+淡黄
+振幅
+JR
+热销
+机头
+七一
+案子
+高家
+新平
+安家
+紧致
+砌块
+雪松
+树龄
+成文
+01996
+修女
+重申
+究极
+催促
+番薯
+金枪鱼
+488
+前翅
+6kg
+四门
+铁丝
+天女
+马场
+虎头
+红土
+基斯
+德明
+1M
+油温
+节约型
+和面
+Hill
+KHz
+排入
+蒜泥
+风洞
+蟋蟀
+光耀
+深蓝
+审慎
+特指
+节流
+陆路
+hot
+飞去
+包房
+环路
+737
+俯仰
+返程
+Date
+中公
+长方
+798
+still
+紧握
+止步
+耳熟
+健胃
+全胜
+药王
+济州
+排便
+言辞
+略显
+金堂
+walk
+椒盐
+Study
+汝南
+1820
+会考
+死角
+人命
+字面
+曲轴
+保送
+35000
+分步
+TA
+盔
+阑珊
+上元
+普林斯顿
+点钟
+秦腔
+乐平
+击剑
+步道
+战俘
+管教
+圆周
+分界
+官渡
+宽恕
+险些
+一瞬
+四册
+炬
+沃土
+外卖
+ps
+乌兹别克斯坦
+物性
+双击
+许久
+先期
+蝌蚪
+卡耐基
+威慑
+沏茶
+高调
+1003
+大吉
+葱白
+卧床
+华府
+罗尼
+可望
+朝中
+1866
+四驱
+神武
+守时
+祭祖
+士人
+711
+潘多拉
+未遂
+刻石
+野马
+越王
+创面
+两院
+心肠
+云阳
+袖珍
+专署
+Advanced
+蒸煮
+叶腋
+公所
+果敢
+965
+小春
+好客
+养性
+殿前
+606
+五峰
+痛心
+话中
+一斑
+送回
+723
+励精
+萨克斯
+吃亏
+Hot
+风铃
+429
+财宝
+海林
+内资
+X2
+565
+利差
+爱琴
+园长
+幻象
+测温
+西泠
+Season
+原型机
+旁人
+零陵
+8kg
+years
+孔庙
+邻家
+新店
+省城
+坠毁
+血海
+上卷
+自备
+发力
+宫崎
+青丝
+树形
+663
+ACT
+炫酷
+冲撞
+下述
+报应
+灼热
+3kg
+羊绒
+野草
+文盲
+视窗
+稼
+大增
+香叶
+桀
+成大
+1874
+豆子
+forever
+回望
+地膜
+中岛
+468
+洪洞
+指使
+真名
+华泰
+京九
+爪哇
+写完
+喧闹
+兔肉
+新沂
+三支
+勤勉
+外线
+整地
+炝
+秋日
+一石
+飞鸿
+type
+EF
+漪
+鳝
+菊科
+通络
+2022
+决意
+关口
+θ
+涡流
+Xiao
+伯克利
+家康
+耒阳
+同知
+078
+河海
+049
+禹州
+吉恩
+倍率
+ρ
+武义
+恬静
+馅饼
+两广
+上马
+22000
+雨露
+GM
+工时
+剧作
+北安
+用友
+充盈
+真宗
+环游
+经书
+全盘
+港元
+蔬果
+平乐
+Dont
+抖动
+原物
+燕王
+拔萃
+br
+氧化铝
+家国
+引文
+收起
+乘船
+火球
+益生
+步法
+亚当斯
+930
+借给
+捣蛋
+衣架
+082
+原定
+转矩
+天雷
+偿债
+made
+防渗
+缭乱
+冰激凌
+大衣
+顿悟
+晨报
+POS
+十数
+幸免
+玉带
+抱住
+石井
+九鼎
+背心
+大雅
+昏暗
+立德
+残渣
+重彩
+天华
+建置
+尔斯
+1868
+地矿
+同调
+劳模
+CX
+作价
+汉军
+437
+外省
+欺凌
+回响
+露西
+084
+冷眼
+ACID
+业已
+紧盯
+ATI
+随访
+伊斯
+愣
+kHz
+634
+西服
+露露
+ISTP
+转轮
+鹰潭
+杏林
+丹药
+Louis
+剪裁
+外设
+干渠
+BS
+临夏
+疏远
+招来
+5ml
+825
+供油
+灼伤
+Grand
+此情
+累及
+姑苏
+蒂亚
+往生
+01986
+健脑
+利弊
+法华
+突兀
+大额
+惩
+Dance
+925
+气魄
+格鲁
+446
+借钱
+借记
+鄱阳
+蒙山
+房门
+旗袍
+风神
+干湿
+耽误
+自娱
+SCDMA
+园中
+问鼎
+藏身
+保驾
+镂空
+郡县
+卤水
+菜园
+527
+库克
+脑病
+镜中
+思茅
+希尔顿
+青白
+成华
+key
+针状
+下唇
+063
+말
+鲶
+TL
+旋翼
+摇动
+失球
+严明
+内里
+郯城
+石质
+故土
+破晓
+春生
+力克
+思辨
+龙飞
+MG
+Post
+禺
+小樱
+PU
+砌体
+淄川
+谜语
+裁缝
+凋
+拼写
+甜瓜
+卖淫
+连忙
+病重
+亘古
+专柜
+皇妃
+正史
+当真
+英式
+别号
+马匹
+同声
+Kelly
+嫣然
+百日
+影坛
+494
+MX
+妙招
+魔戒
+侏罗纪
+FDA
+洵
+卡卡
+瓯
+Family
+船用
+骨子
+611
+奇人
+057
+履职
+同德
+古兰
+077
+秒表
+平滑肌
+衰败
+补中
+覆羽
+糜
+商检
+急流
+熄
+陇海
+白板
+黄页
+聚变
+三高
+信达
+五好
+育龄
+升空
+瑞德
+身受
+农大
+关山
+拿走
+未及
+上岸
+减法
+永世
+丑恶
+养阴
+合璧
+三钱
+压榨
+RF
+陇南
+良田
+BBS
+敲打
+布鲁日
+弈
+Current
+生鲜
+Boss
+自来
+信守
+排尿
+狂澜
+剿匪
+何物
+荒谬
+麦冬
+815
+抹去
+A2
+干果
+勤政
+七宝
+半点
+25mm
+鼎鼎
+普京
+德宗
+诬陷
+亚里士多德
+看台
+单面
+枪击
+研判
+边陲
+音标
+甲型
+兼营
+过桥
+从化
+圆环
+口诀
+墨盒
+675
+图版
+控温
+shine
+春播
+扇子
+濒
+心烦
+贝母
+唢呐
+震慑
+状如
+556
+问天
+罚没
+而致
+斜阳
+暗暗
+花痴
+宣泄
+阴霾
+接吻
+普济
+气源
+右派
+⑹
+米粒
+正午
+鸡鸣
+宁国
+远航
+琴声
+马齿
+氮气
+冬虫
+五香
+身陷
+海牙
+紫檀
+噶
+黏度
+人民武装
+婿
+474
+短发
+路政
+胸闷
+羞涩
+宏图
+钻机
+刀刃
+升至
+泄泻
+灯箱
+С
+征讨
+死地
+058
+货品
+寻根
+牙买加
+珏
+裨益
+显性
+Fred
+藏语
+海沧
+后劲
+子夜
+热气
+联产
+侃
+结扎
+荒原
+wards
+行空
+耗能
+酿成
+孝义
+甲虫
+乘以
+059
+意面
+新竹
+绝学
+万事达
+辽源
+冀鲁豫
+大喊
+凉亭
+gASIN
+梅子
+弗朗西斯
+岩性
+01995
+水沟
+秀水
+信义
+各人
+宠儿
+鲜果
+风霜
+画坛
+含笑
+秽
+恽
+奥克兰
+适口
+芥菜
+重担
+主语
+温江
+眉头
+中视
+扇门
+双桥
+温文
+单挑
+埃弗顿
+尾气
+宓
+武安
+破译
+非遗
+进击
+family
+编办
+音速
+相框
+年复
+灰度
+黄铜
+西线
+州府
+如初
+唯心
+1440
+黏液
+广袤
+残杀
+分属
+┼
+伤势
+贫瘠
+归入
+撒哈拉
+减慢
+WLAN
+察看
+悲愤
+来时
+澜沧
+拉脱维亚
+核素
+顺从
+缩减
+小智
+域外
+备件
+早逝
+调出
+家训
+683
+活人
+西园
+游动
+梅毒
+易行
+固守
+5800
+0A
+利湿
+富足
+峪村
+梅森
+锤炼
+扬名
+凝重
+覆地
+肚皮
+069
+此间
+旋涡
+打理
+three
+嘱
+关西
+01988
+动心
+8500
+环流
+闻讯
+471
+贬值
+圣杯
+Know
+情欲
+性交
+浔
+1862
+继光
+找准
+布政使
+地宫
+北麓
+天晴
+剪力
+辣味
+小兔
+朽
+654
+后场
+Pa
+索马里
+乌苏
+莫非
+逐个
+连队
+青木
+东川
+纯白
+拴
+昂首
+侍女
+东宫
+585
+兴亡
+藏文
+赫兹
+性腺
+痈
+595
+鲜肉
+再入
+借以
+竹园
+扳机
+果糖
+德怀
+和亲
+fall
+浴盆
+相加
+首尾
+簪
+达夫
+主簿
+代购
+684
+脚趾
+公羊
+金川
+血吸虫
+线粒
+款额
+崩坏
+常压
+再战
+公允
+亭台
+牙周
+JSP
+体委
+对等
+大叫
+天象
+下马
+between
+牵牛
+得奖
+底肥
+私分
+过夜
+南疆
+AH
+县属
+疑点
+老总
+算得
+1830
+记名
+文君
+霍乱
+老三
+勒沃库森
+站内
+安西
+罗密欧
+史论
+个展
+小兰
+缴存
+呦
+够用
+Summer
+奥利弗
+浅色
+days
+立信
+命案
+虱
+离任
+砂轮
+捂
+才干
+673
+制毒
+九级
+请到
+封顶
+081
+Na
+棘手
+凯里
+稚嫩
+失地
+所部
+纷乱
+积温
+畲
+卡萨
+闾
+万水
+皮卡
+领带
+羚羊
+守城
+EQ
+蹊径
+万计
+老庄
+氧化钠
+刘向
+起搏
+抒写
+诸家
+海龟
+742
+砖厂
+先发
+远赴
+端木
+内环
+平川
+清照
+Spring
+拖车
+贴片
+夹角
+分贝
+组网
+卡号
+克强
+火把
+二重
+晴空
+质问
+÷
+ㄗ
+讽
+yu
+平潭
+圈足
+EVA
+选送
+夺宝
+泥塑
+冬梅
+绝热
+魔教
+米奇
+验资
+福星
+厅堂
+一百万
+明尼苏达
+满山
+money
+滤液
+霓裳
+杀青
+open
+闭门
+圆桌
+法王
+俭学
+517
+埋头
+闹市
+通论
+柳叶
+dance
+小传
+银座
+派系
+癣
+确权
+1879
+彩印
+农具
+灵光
+转印
+异族
+布洛
+Christian
+馥
+医患
+八九
+天高
+美籍
+翠竹
+义气
+称王
+中频
+鸡丁
+诰
+冷月
+554
+纣王
+锑
+经年
+傩
+查办
+剩女
+晋阳
+4C
+迫于
+乱舞
+607
+两场
+柏树
+耿耿
+三进
+数独
+荷尔蒙
+下雪
+女工
+纱线
+帝君
+Wilson
+敌国
+绿草
+Special
+新农合
+北村
+王菲
+荻
+支离
+每份
+498
+讯号
+烯烃
+归根
+二村
+Port
+土楼
+磨铁
+079
+购入
+燃起
+forme
+关乎
+www
+振宇
+万安
+相救
+太康
+顺风
+永清
+Script
+多好
+脉动
+毒害
+中时
+建民
+绝版
+夸大
+457
+旅居
+天和
+喜鹊
+点距
+国平
+滑石
+白纸
+脱险
+深秋
+桐柏
+ONE
+熔断
+避风
+题记
+支路
+3700
+烟酒
+嫩枝
+释迦
+尼罗河
+呜呜
+颜面
+相待
+黑海
+宣化
+1101
+乔布斯
+枪口
+泗水
+RCA
+蝶阀
+599
+顾及
+忠孝
+图强
+法度
+461
+black
+剑阁
+广平
+托夫
+悔恨
+戒律
+干支
+479
+ATA
+垭
+醒悟
+长线
+掘进
+异兽
+采油
+饺
+满身
+5x
+创先
+决裂
+ч
+纤毛
+萧县
+书家
+彩霞
+714
+神人
+蓝调
+腹鳍
+埃因霍温
+Cor
+任人
+阳新
+选美
+奇门
+蒜苗
+01992
+阻拦
+ald
+特里
+Radio
+昨晚
+4600
+model
+潜山
+册子
+黄牌
+762
+ə
+红牌
+景泰
+凳
+亲水
+禁地
+view
+수
+同质
+颅骨
+嗜睡
+棋手
+明火
+弧线
+超速
+侍奉
+邪气
+医家
+马尼拉
+五环
+大营
+들
+见方
+医德
+外企
+秋色
+百千万
+叉烧
+商船
+花莲
+表妹
+质优
+铜版
+忙于
+佛手
+正要
+侨联
+从属
+戈登
+倒流
+过重
+眺
+对曰
+海选
+呕
+突飞
+墨家
+fun
+党部
+住址
+现名
+指名
+7cm
+商讨
+虎口
+少男
+杀神
+会诊
+臭味
+彪悍
+介词
+造访
+information
+王权
+棱角
+1878
+富兰克林
+1877
+枪声
+国良
+嫉
+佛家
+麒
+能见
+阿布
+化州
+解字
+疑云
+路德
+能人
+813
+格莱美
+钢制
+低洼
+小川
+癫
+HIV
+成册
+法器
+传道
+Manager
+吹奏
+金砖
+110000
+老二
+篝火
+Sony
+重围
+皇马
+凉菜
+状语
+二哥
+里河
+是故
+铁马
+膜拜
+小飞
+战局
+合欢
+践
+509
+临危
+鼓风
+油缸
+粗暴
+台商
+交加
+00AM
+山神
+春晓
+松子
+13xxx
+感化
+高薪
+男子汉
+型钢
+消渴
+初稿
+当好
+云彩
+派兵
+恶毒
+各州
+避险
+top
+壮阳
+忠贞
+视作
+推论
+神采
+放任
+建武
+当量
+昙花
+任重
+专刊
+敬佩
+善战
+送交
+715
+黔江
+北爱
+黍
+婉约
+月影
+枪战
+从今
+当头
+法家
+雄鸟
+包罗
+野花
+确信
+中石化
+诸城
+稍后
+牲
+会昌
+story
+汉学
+玄天
+图景
+3d
+预赛
+擦干
+齐放
+暗算
+า
+热恋
+祖辈
+选刊
+湍流
+头等
+526
+私奔
+徽商
+广汉
+584
+中流
+南诏
+靠山
+投递
+摊位
+入射
+从古
+花式
+尼克斯
+闽侯
+山乡
+version
+断桥
+屯田
+填满
+几近
+花溪
+Rober
+缠身
+启航
+新几内亚
+安泰
+海员
+温婉
+占星
+嚣
+能级
+扬起
+Town
+ASF
+长庆
+01987
+惨败
+渝中
+兵营
+收买
+太岁
+梦龙
+PH
+马达加斯加
+浩浩
+行家
+塔林
+仁者
+消遣
+744
+鳞茎
+先机
+CI
+唇形
+Bank
+军需
+宗亲
+调适
+519
+弱电
+闻到
+表哥
+粳
+昌邑
+宗谱
+克林顿
+装填
+hands
+纺纱
+露面
+น
+婚嫁
+农委
+磺胺
+映画
+罗刹
+古庙
+9600
+无良
+01983
+历朝
+波特兰
+测速
+黑发
+和成
+老式
+糊状
+赏金
+笑语
+平实
+佩带
+学军
+及至
+克尔
+447
+吞食
+羚
+毯
+二十几
+一斤
+439
+嘧啶
+引线
+巨匠
+沂水
+五龙
+板凳
+饿死
+转包
+喂食
+澄海
+怀揣
+康佳
+啄木
+爱神
+et
+选料
+云朵
+kPa
+poly
+波利
+压强
+鼓起
+万般
+德高
+变体
+01993
+沙土
+恩赐
+评书
+发亮
+对碰
+564
+藏宝
+良药
+政体
+小李
+划艇
+茭白
+网名
+长歌
+跨过
+直击
+调至
+尾翼
+祸害
+晚霞
+屋内
+下榻
+自言
+抗洪
+蓟县
+定论
+公学
+阿拉善
+傲世
+退款
+代销
+小村
+春暖
+淌
+云和
+乡试
+兴致
+清冷
+降生
+天理
+前轮
+长清
+机长
+正邪
+侧扁
+周游
+防弹
+孤傲
+标高
+085
+那双
+里希
+风头
+滔天
+千钧
+煤田
+波黑
+曲棍
+水汽
+镌
+笔划
+美名
+县乡
+帕金森
+致电
+而逃
+808
+1867
+毛毛
+苋
+藉由
+俸
+饷
+市郊
+濠
+全无
+炒菜
+腺毛
+阿尔法
+沙湾
+吝啬
+雍容
+747
+时不时
+Style
+槎
+LTD
+洞窟
+脱产
+滨湖
+吟诗
+拳法
+AGP
+733
+刁难
+米粥
+奥斯汀
+小羊
+激酶
+继母
+绿野
+感召
+赈灾
+公国
+雨雪
+抗寒
+4MB
+1872
+垮
+编选
+家事
+南溪
+Its
+悄
+国资
+托里
+呈报
+沙坪坝
+发呆
+早点
+每张
+沙尔克
+市立
+憨
+花科
+石室
+回荡
+病痛
+离世
+沪宁
+平壤
+论集
+DNS
+题诗
+蛮荒
+昨
+安邦
+橡树
+없
+田螺
+干枯
+1120
+真武
+∩
+四军
+ICP
+星宿
+德安
+璜
+糖苷
+幽谷
+恰似
+FS
+更替
+锈病
+桂阳
+Gary
+选录
+北碚
+熟透
+施以
+小将
+何家
+hould
+殡仪
+侠义
+特典
+中郎
+直管
+大雁
+邻国
+暗流
+1550
+七律
+流化
+水灾
+812
+巴顿
+死伤
+站牌
+会晤
+Earth
+Software
+桂平
+圈圈
+number
+滤过
+贪吃
+书友
+丹凤
+消声
+V5
+standing
+诛杀
+讹
+尼亚
+题字
+宝林
+阀体
+前夫
+怒吼
+重者
+前生
+内酯
+某部
+走开
+Dong
+仰天
+巢穴
+意指
+通判
+现成
+焦糖
+凌霄
+蚤
+武松
+从新
+加注
+荷包
+橙黄
+轮滑
+望族
+极管
+祸水
+几万
+南江
+哌
+甲乙
+喷墨
+浸入
+到站
+Effects
+沉静
+涌出
+胡须
+开光
+折纸
+人亡
+磬
+喘息
+Nor
+五湖
+佤族
+扫地
+拖着
+城址
+大妈
+普照
+EC
+风雅
+雄狮
+定海
+款物
+米质
+联播
+活字
+凯蒂
+果穗
+除皱
+Twitter
+FT
+畅快
+沃克
+推手
+地暖
+砖块
+代序
+飘摇
+热病
+确凿
+01990
+虎丘
+反常
+侦
+蛛丝
+茶楼
+坦言
+真知
+一箭
+市中
+刁蛮
+迷幻
+夏草
+大宝
+氰
+废品
+战法
+气愤
+Ti
+三鲜
+外公
+俟
+初遇
+霍山
+536
+两军
+从重
+含油
+创出
+后腰
+冲入
+无道
+平南
+心田
+载有
+卒中
+Dragon
+胜率
+布加勒斯特
+迪亚
+诸位
+进发
+初音
+0Android
+武清
+496
+丽萍
+燮
+大伟
+渎
+索菲亚
+ê
+金代
+SOHO
+903
+哪吒
+1848
+沂
+林口
+西征
+赎
+节律
+尼尔森
+明堂
+婢
+将至
+豫章
+小径
+林州
+钥
+商战
+嘉祥
+基丝
+嗔
+940
+地学
+诗刊
+拆解
+牦牛
+西大
+分清
+王府井
+淮阳
+图卢兹
+通量
+熠熠
+机务
+腰间
+兼优
+撤职
+新庄
+辟地
+华容
+热敏
+哈达
+恐吓
+苹
+匀称
+з
+维纳斯
+平移
+提成
+走动
+相城
+西段
+七绝
+1871
+西西
+会徽
+星团
+乳突
+周身
+own
+砾
+堇
+任课
+特雷
+抱有
+隆庆
+秘史
+鸿蒙
+弄清
+情理
+花丛
+沧浪
+同工
+焘
+寻址
+动情
+前路
+正则
+room
+姑姑
+断定
+吴氏
+083
+Jim
+手印
+pop
+续写
+金元
+粤剧
+惊厥
+久违
+LTE
+Security
+讲到
+碳素
+铃音
+华亭
+救美
+华佗
+贻
+金穗
+脱氧
+太一
+语调
+红旗手
+咕
+荧屏
+ray
+酿制
+串通
+DX
+涌泉
+trans
+遇害
+豆蔻
+想来
+聚糖
+Page
+俄文
+宰杀
+Beach
+载客
+挂机
+网商
+客站
+大秦
+墓穴
+文宗
+型态
+云岩
+侨眷
+女侠
+装运
+橡
+Cheng
+易名
+细丝
+武技
+成婚
+抒
+子午
+上唇
+选票
+东陵
+罗田
+战马
+惊鸿
+魔物
+茶壶
+喧哗
+副业
+钙化
+1004
+克里
+长焦
+乌兰
+制高点
+宝应
+全队
+转职
+茶水
+泛舟
+编印
+信令
+眷顾
+king
+捣烂
+晓燕
+多酚
+阿塞拜疆
+怠
+独山
+沔
+拉斐尔
+千军
+用意
+维克多
+观者
+1644
+磁化
+谌
+倚天
+云杉
+度日
+芸豆
+稠密
+Jona
+居正
+年薪
+颤动
+GP
+冰火
+舔
+shu
+袅袅
+苏菲
+574
+立身
+混浊
+3
+甘南
+日伪
+洗脸
+金三角
+975
+落款
+华宁
+鳝鱼
+点阵
+跳跳
+鸡丝
+荣昌
+克劳德
+卷首
+眨
+可塑
+挫败
+支农
+lands
+邮报
+公卿
+这年
+塔里木
+名臣
+百亿
+祖庙
+弹劾
+난
+征伐
+1873
+分给
+丰城
+PV
+濯
+↔
+烧香
+带鱼
+万有
+血尿
+天朝
+顺手
+死心
+30mm
+惩处
+快跑
+快意
+金寨
+01994
+三两
+拿手
+小伙
+忌食
+高尔基
+鱼丸
+赎罪
+中青
+夜郎
+鸢尾
+763
+单果
+海派
+快报
+灭门
+波导
+殷勤
+万福
+映衬
+括号
+老生
+临街
+场场
+科考
+肉瘤
+调取
+县立
+镇定
+梅山
+厚朴
+抓取
+鄢
+金源
+封地
+发卡
+Mpa
+木本
+685
+罡
+哈里斯
+鼓浪
+弧度
+科维奇
+5100
+劳保
+潜江
+书里
+阿特
+家产
+赌注
+汁液
+迪斯
+剑锋
+永华
+往南
+面瘫
+失散
+亚迪
+永强
+其心
+良渚
+放生
+万马
+床单
+零散
+讴歌
+Great
+船厂
+托尔
+晓红
+厚爱
+矜
+集贤
+赶回
+湘乡
+鲁能
+涤纶
+剥皮
+孝宗
+Press
+浦口
+自导
+光敏
+稀奇
+百世
+Johnny
+怖
+724
+罗列
+只怕
+丽莎
+Applied
+定子
+中常
+钱江
+拟人
+襄公
+黄泥
+刻划
+皮瓣
+金湖
+拉里
+偃师
+新法
+涂布
+火光
+辫子
+扩容
+里约热内卢
+阴天
+酷刑
+香椿
+风热
+信使
+753
+焰火
+长跑
+拍打
+814
+世杰
+酗酒
+堆叠
+喷火
+爱河
+学画
+文馆
+钺
+条腿
+换气
+糙米
+后背
+先辈
+粗放
+列支
+朱丽叶
+三阳
+拳王
+丧胆
+神往
+耐水
+饭煲
+捉摸
+驱车
+需用
+肥肉
+迷情
+养子
+瞄
+滑落
+已付
+沉船
+824
+升迁
+斗殴
+Module
+7M
+JAVA
+初吻
+Mn
+临朐
+乡情
+硬骨
+艳阳
+经史
+守门
+驯化
+864
+1620
+年审
+效仿
+啄
+傈
+互连
+增刊
+弹指
+探查
+喝醉
+结转
+喉部
+多本
+东线
+治平
+俩人
+葆
+丘脑
+老祖
+稷
+二百四十三
+金人
+飒
+车行
+火化
+桃仁
+地心
+铂金
+生锈
+肖邦
+曼城
+华里
+咦
+师弟
+孝敬
+字句
+电光
+福祉
+豪爽
+依偎
+花海
+群峰
+大兵
+NC
+淞
+541
+炸毁
+两字
+压扁
+其妻
+诲
+骶
+责备
+韩城
+735
+inside
+恩人
+开罗
+仰卧
+鸟兽
+暗地
+桡
+Four
+佛道
+mark
+渔夫
+随父
+两轮
+594
+针法
+HDCP
+瞬息
+学力
+降耗
+安哥拉
+垅
+Frame
+肥城
+讲台
+领兵
+八强
+低沉
+薪资
+伊恩
+CBA
+联华
+着实
+婺城
+昶
+受孕
+NEW
+二百五十四
+虾皮
+神农架
+Give
+있
+655
+우
+陷于
+急症
+相争
+菜心
+承恩
+林荫
+墓园
+末代
+雨刷
+精编
+流芳
+援军
+2025
+家庙
+iphone
+虚寒
+日游
+侨胞
+洪都拉斯
+092
+693
+珠子
+尚志
+潜藏
+Photo
+廿
+转交
+洼地
+丑女
+904
+清正
+赛程
+辞海
+阿伦
+甲亢
+HE
+对子
+除以
+校勘
+藜
+布鲁克
+品系
+险要
+政坛
+马甲
+616
+遗精
+平展
+归途
+浅海
+山脊
+红心
+榴莲
+込
+风物
+两期
+酥脆
+束手
+讲习
+锁住
+
+机件
+劲爆
+BRT
+膺
+1102
+Law
+咧
+杰西
+泥炭
+成份股
+三套
+Alexander
+Or
+为时
+1855
+让步
+流星雨
+棱镜
+碧云
+贞元
+废话
+644
+纷纭
+声部
+陡坡
+工控
+防务
+尚可
+匿
+火辣
+witha
+朵拉
+音律
+mistry
+躲藏
+斯托克
+黏土
+586
+‖
+雅虎
+进犯
+乐迷
+府第
+洗钱
+比试
+中北
+七色
+ms
+虢
+浸渍
+588
+画图
+捅
+下卷
+大堆
+激怒
+尽美
+cc
+辐照
+5GT
+凸现
+朱莉
+拉长
+说教
+带兵
+昭王
+卡门
+分时
+塞外
+close
+百位
+懒人
+开到
+见图
+潼关
+挪超
+t1
+重振
+绝唱
+微薄
+几十万
+长流
+六百
+毙
+富余
+富士康
+菖蒲
+1kg
+拍成
+军部
+杂种
+亚欧
+1023
+Episode
+正反
+DMI
+勾画
+诱饵
+心魔
+爹爹
+619
+吸盘
+北段
+之策
+助残
+缺铁
+Computer
+Sci
+大竹
+留念
+聚光
+匪浅
+维尼
+首支
+异端
+凯莉
+底特律
+磨床
+随同
+牛蛙
+875
+可依
+曹魏
+悍
+662
+方才
+胃痛
+fish
+天劫
+墉
+荔湾
+斩断
+围岩
+难逃
+木匠
+肃穆
+分道
+Only
+搁浅
+毕加索
+澡
+股骨
+油藏
+拱墅
+晚唐
+峰峦
+逗留
+路南
+失分
+曙
+志坚
+静海
+盗取
+透射
+中平
+佩里
+40mm
+磷肥
+Davis
+热火
+翼展
+台子
+前移
+Director
+加料
+雅各布
+妙手
+分得
+奴役
+劣质
+剑影
+导学
+写书
+吋
+蛛网
+敷衍
+狂潮
+嗓子
+亚于
+德兴
+合剂
+何如
+例证
+丧生
+big
+胃酸
+凤蝶
+穷尽
+Queen
+分居
+肉芽
+674
+耳环
+连天
+家书
+千佛
+自交
+薇薇
+990
+创维
+熟食
+西苑
+苍凉
+称职
+榛
+百灵
+粤东
+位点
+归集
+肺腑
+慰藉
+091
+淤积
+东阿
+珊珊
+绿荫
+撒谎
+悬空
+吴县
+定式
+雪茄
+石河子
+万民
+回龙
+军机
+双飞
+破烂
+压痛
+加气
+地老
+闸北
+水粉
+老旧
+齐天
+水色
+Miller
+红豆杉
+免检
+异质
+倒伏
+叶斑
+永城
+古剑
+想必
+棒棒
+前夜
+硕大
+大麻
+江油
+广饶
+吸湿
+01989
+石洞
+707
+印张
+More
+紫癜
+卷材
+玄学
+的士
+609
+方便面
+兽场
+恋恋
+观望
+和顺
+绮丽
+反目
+nsh
+548
+预感
+568
+斯通
+过山
+精怪
+鲁南
+莆
+奥德
+清迈
+离异
+开荒
+总院
+银屑
+荒废
+法会
+重达
+民安
+碎纸
+干式
+气韵
+鼻咽
+沙星
+Tatort
+酱香
+内切
+桑德
+零度
+称量
+0x
+1856
+淫秽
+1863
+未解
+Wolf
+金质
+国子监
+不莱梅
+飙
+连珠
+国名
+538
+泗阳
+低等
+Social
+蕃茄
+cant
+校方
+大胜
+大班
+云飞
+怪事
+绶
+自办
+身形
+仲夏
+夜袭
+弁
+布林
+白眼
+902
+后汉
+扶正
+惊吓
+0G
+内窥
+亚利桑那
+Type
+吴家
+绝恋
+柘
+厌食
+下位
+浓密
+华强
+独栋
+控诉
+Action
+减量
+励磁
+口吻
+AMR
+大案
+姐夫
+枚举
+附小
+豁然
+勇闯
+赞许
+神将
+诶
+高加索
+行当
+环环
+高安
+Adam
+石人
+642
+千层
+晚熟
+乌海
+合辑
+赢取
+右岸
+together
+乘务
+渔家
+房车
+沙质
+ぜ
+科恩
+茂林
+通宝
+扶风
+国粹
+游艺
+抛出
+对唱
+阈值
+低位
+FCC
+絮凝
+永红
+文雅
+结冰
+kV
+基坑
+沙田
+党代表
+Quick
+挑拨
+黄巾
+狂飙
+756
+里头
+模量
+回音
+755
+斟
+倒置
+阿斯
+ti
+仔猪
+梯子
+布拉德
+异样
+鹏程
+C3
+652
+黄斑
+情思
+前例
+汉魏
+榻
+渺
+查清
+小脑
+四千
+曾孙
+冰凉
+大眼
+尽善
+杜邦
+同音
+云游
+fps
+取回
+愈加
+一多
+宣威
+文汇
+天价
+先烈
+多来
+Dark
+村道
+后妃
+睡莲
+627
+霓
+电玩
+忌惮
+十国
+邹平
+麻城
+重载
+迄
+大行
+鸿雁
+嘱咐
+遐
+628
+阳谷
+简朴
+内乱
+昂扬
+598
+调集
+1368
+共赏
+稳居
+阆中
+SG
+乌兰察布
+扼
+惠东
+祛斑
+繁复
+大招
+皿
+精干
+苦战
+阳朔
+禁令
+一针
+467
+岂能
+皇陵
+迎新
+圣日耳曼
+等量
+人儿
+各派
+建厂
+警车
+操练
+溶胶
+评语
+入关
+险阻
+寡言
+细部
+周二
+摹
+粮仓
+热诚
+脚跟
+木器
+相称
+01991
+苍劲
+砧木
+归真
+军械
+冲向
+舷
+绵竹
+水槽
+购票
+风干
+拉达
+九洲
+088
+起病
+宝坻
+偏振
+英灵
+SAT
+估量
+枪托
+布施
+巴东
+线段
+星体
+预拌
+包公
+修护
+致公
+相宜
+高升
+漫谈
+造句
+万载
+师妹
+飨
+里边
+单亲
+南苑
+琐碎
+般地
+爹地
+三板
+知人
+野鸭
+临水
+窗前
+击打
+浓香
+AMDRadeon
+维京
+股级
+仁怀
+酉阳
+成家
+仁川
+病院
+888
+普贤
+777
+钟祥
+名酒
+沮
+764
+楹
+均质
+对头
+KFR
+基德
+三餐
+旅舍
+绣球
+沙市
+嫂子
+应考
+檀香
+到时
+萨尔瓦多
+小粒
+行径
+吸脂
+信服
+草酸
+金龟
+盐田
+休憩
+女鬼
+两河
+剑气
+改任
+喷漆
+渥
+难解
+共价
+伊凡
+柱塞
+散曲
+王村
+beat
+巡警
+477
+面罩
+炯
+叶状
+张口
+Ca
+变现
+而动
+景帝
+众议院
+日耳曼
+664
+手冢
+峤
+空城
+共度
+529
+外援
+安丘
+何事
+闲话
+颇丰
+爱立信
+后缀
+Le
+淮山
+OGG
+氧吧
+榄
+宫城
+里奇
+墨菲
+086
+自语
+Woman
+时令
+败坏
+情妇
+琨
+篇目
+皇太极
+浮选
+书库
+大湖
+张江
+即由
+邓州
+多事
+970
+塔顶
+罗氏
+英姿
+超常
+钓鱼岛
+诫
+如东
+锁骨
+家桥
+簇生
+Sam
+严酷
+朝晖
+二十万
+克里斯蒂
+冬暖
+遮住
+行草
+双模
+宗庙
+01980
+溪边
+晓得
+仕女
+貂蝉
+账目
+假象
+长阳
+면
+神魂
+壕
+疟
+平成
+Golden
+2021
+论者
+application
+献礼
+朵花
+译林
+雅俗
+踏着
+谈及
+入市
+亲信
+应酬
+灌输
+气力
+哀悼
+克拉科夫
+似曾
+废料
+拉塞尔
+前段
+长丰
+桃树
+松涛
+咬牙
+英明
+1125
+集韵
+妇产
+暂未
+谋反
+巻
+point
+陪葬
+西陵
+套筒
+袂
+浇筑
+云门
+一六
+爆料
+厨卫
+阿尔伯特
+546
+London
+扑鼻
+高差
+掠过
+反倒
+徒手
+service
+飞云
+均线
+边塞
+大新
+海山
+青葱
+暮色
+Five
+游说
+오
+掏出
+胸襟
+1005
+酣畅
+混元
+自白
+6m
+美林
+琼海
+上风
+1320
+化装
+每组
+险境
+漳浦
+西口
+恋曲
+放倒
+Maria
+深井
+养胃
+揭幕
+新派
+执勤
+box
+带子
+Practice
+疑虑
+喆
+多门
+沛县
+yous
+勾践
+满堂
+宣宗
+天开
+每家
+偏方
+志敏
+海曙
+1X
+天伦
+愚昧
+忘情
+拉米
+1210
+安仁
+语用
+专政
+铁线
+宝华
+罗拉
+同舟
+重播
+新居
+警探
+用油
+785
+汇通
+科特迪瓦
+布鲁诺
+查获
+汉川
+普洱市
+普吉
+风顺
+通通
+天日
+合奏
+幽魂
+pin
+波波
+莉亚
+窈窕
+祛除
+卡斯特
+中放
+823
+忘我
+名望
+一亿
+赛后
+戏水
+水层
+港务
+要死
+秀兰
+阵风
+分理处
+air
+玉田
+牵绊
+二小
+百分之百
+cn
+凤舞
+脱毛
+Jun
+him
+星罗
+名店
+打尽
+E1
+火海
+伺候
+晓华
+人烟
+涓
+DHA
+烁
+户名
+留置
+月报
+1234
+惠及
+拉出
+红帽
+迈入
+大马
+趣闻
+水清
+肌酐
+里拉
+时值
+落定
+乡级
+涿州
+银鱼
+风琴
+太乙
+锲
+寨子
+卢克
+放肆
+归隐
+立春
+受挫
+媛媛
+蟾蜍
+泌
+别扭
+王族
+灾祸
+中轴
+大金
+评介
+苍龙
+白求恩
+包干
+现正
+件件
+优树
+铁山
+Mrs
+山景
+凶杀
+热刺
+拉拢
+渭河
+区区
+核弹
+开间
+南口
+风骨
+异体
+升平
+后起
+聚苯
+形变
+魔尊
+呋喃
+种皮
+爱尔
+博罗
+下药
+东都
+爪子
+安检
+锁孔
+Services
+周四
+尾矿
+外感
+乡民
+自西
+眼见
+科夫
+负于
+纸浆
+743
+大浪
+串串
+连用
+揪
+晶粒
+联办
+2050
+对华
+侧枝
+Explorer
+水费
+黄壤
+九阳
+六七
+定国
+报检
+张店
+百部
+水波
+股息
+俯冲
+效忠
+试车
+Way
+冉冉
+室友
+微孔
+运往
+夭折
+劾
+County
+而外
+辅音
+相配
+449
+扶桑
+毁损
+处决
+解压
+福寿
+感想
+凤尾
+随想
+闽江
+白毛
+拼盘
+契税
+竖立
+指望
+舍身
+五步
+生还
+解热
+庆贺
+巴萨
+寐
+和气
+崮
+硬性
+盐政
+752
+waiting
+120000
+度数
+桌球
+干涸
+台前
+出师
+冲进
+胆红素
+芬奇
+父辈
+目视
+正一
+等额
+907
+博尔
+家坝
+情事
+风姿
+苍翠
+营建
+高亮
+美声
+走走
+夏至
+487
+习水
+文苑
+拼凑
+史无
+为凭
+新市
+五莲
+愧疚
+挂图
+精算
+汾阳
+空手道
+日臻
+泰晤士
+军务
+6600
+问问
+懋
+政和
+Bob
+局属
+核磁
+拜年
+黔南
+常理
+失事
+恶灵
+时日
+系谱
+626
+章回
+呜呼
+PPP
+卡诺
+株距
+狂犬
+昆士兰
+心惊
+level
+藏匿
+独步
+大度
+槛
+高射
+Guy
+moon
+算盘
+春晖
+攸县
+1GHz
+欧陆
+索爱
+抠
+右图
+排灌
+Name
+蜡笔
+枯死
+北湖
+old
+华生
+卓尔
+出库
+公明
+何地
+分寸
+体悟
+罪人
+静心
+泡妞
+奥体
+国师
+紫微
+FPS
+阿勒泰
+政纪
+例行
+富国
+П
+method
+多动
+回生
+家福
+任凭
+吃饱
+指教
+初露
+前臂
+Academy
+道门
+外甥
+周代
+V6
+嫌弃
+凤梨
+月度
+拂晓
+征缴
+节度
+重演
+夜光
+ios
+运价
+∞
+大汗
+南段
+街坊
+铉
+击沉
+筑巢
+僖
+台套
+智囊
+养牛
+泗洪
+独享
+战魂
+自爆
+套期
+花艺
+丝线
+IQ
+花芽
+694
+案卷
+撩
+光标
+万山
+邳州
+梳妆
+客队
+帮教
+巴山
+花盘
+腰膝
+君山
+茶山
+品学
+edings
+LP
+制种
+家兔
+Image
+596
+恶梦
+车费
+机壳
+恋歌
+房室
+4m
+审阅
+杰西卡
+官窑
+Miss
+765
+裂解
+放款
+髯
+补足
+鼓掌
+煮饭
+应纳
+膑
+诬
+长笛
+必考
+Ill
+德森
+逐出
+华兴
+枪杀
+湄
+妖姬
+尼玛
+矾
+九寨沟
+希伯来
+吞并
+公使
+利斯
+对敌
+横店
+埭
+明镜
+冰雹
+咕噜
+觉察
+低矮
+吩咐
+金边
+四明
+尼卡
+脱脂
+中点
+下海
+丙氨酸
+转债
+神王
+冰球
+096
+胶黏
+普天
+偷窥
+0001
+狡诈
+Guest
+史蒂文
+寒热
+伤疤
+塘沽
+672
+谷底
+永顺
+质点
+历元
+亚冠
+745
+云雀
+全党
+1000000
+南丰
+单叶
+马兰
+屈光
+墨色
+黄道
+移除
+春兰
+长崎
+夏夜
+kN
+贺卡
+相合
+MOV
+嬴
+欧联
+iO
+仓鼠
+王杰
+hellip
+弯路
+领巾
+稔
+晓光
+秋收
+大变
+这回
+走兽
+卢浮
+乖巧
+字眼
+苯酚
+鱼翅
+脸红
+风雷
+曲牌
+自检
+未获
+肃宗
+凄
+域内
+延平
+主将
+思量
+刻印
+畸变
+二王
+自责
+中远
+转转
+补天
+01978
+七位
+Where
+血迹
+奢望
+破浪
+Writing
+辽西
+尤里
+酷我
+眼圈
+均值
+鱼尾
+姊
+嶂
+纸条
+引论
+眼皮
+看做
+山梨
+猪头
+四季豆
+据称
+响声
+怪盗
+鱼种
+正法
+楼市
+王陵
+千斤
+865
+凶狠
+隔开
+裂痕
+咚
+耆
+蟠
+升职
+Step
+邀约
+舀
+对偶
+勘验
+赤色
+首场
+羊角
+戈尔
+筏
+隘
+单发
+帮帮
+孩儿
+前驱
+少阳
+树根
+粲
+牙刷
+10m
+上杭
+社工
+清人
+遏
+DF
+美利坚
+爰
+溪口
+489
+木薯
+当心
+机灵
+罗森
+震源
+犯错
+霍夫曼
+小院
+柯达
+九层
+铜镜
+咩
+806
+漓
+087
+拳皇
+椎体
+内线
+笨拙
+马林
+可采
+臼
+弹跳
+三原
+中音
+MCPU
+佳节
+理查
+胶南
+舒张
+public
+成渝
+湖湘
+沉溺
+恰如
+英烈
+大料
+online
+料到
+561
+祥瑞
+某甲
+557
+组别
+晚饭
+四十余
+方寸
+2K
+4kg
+致仕
+豪斯
+金坛
+科林
+国徽
+纽扣
+下关
+01981
+扇面
+Environ
+map
+偕老
+旷野
+旧居
+当阳
+驯养
+闽南语
+跳楼
+路军
+吃肉
+也门
+捕杀
+DD
+Death
+水渠
+茶香
+微光
+桐梓
+大显
+拘泥
+kw
+734
+镇东
+Bruce
+南汇
+剧照
+周遭
+459
+А
+破土
+阳山
+頫
+戏弄
+混迹
+韦斯特
+周朝
+莱恩
+侏儒
+中印
+出书
+烟酸
+估
+Germany
+荆楚
+销魂
+迁出
+095
+蒙面
+WINDOWS
+迥
+紫罗兰
+鄙
+BP
+亲本
+32nm
+真意
+谈心
+揭穿
+Jane
+搭桥
+本菲卡
+体校
+焦急
+对日
+天梯
+093
+乌贼
+阿尔泰
+袋装
+DO
+云贵
+棱形
+角化
+利安
+牙签
+安陆
+舞厅
+口蘑
+诱因
+氧化酶
+刚毅
+凝练
+恶霸
+818
+周全
+网罗
+眼帘
+胄
+凉粉
+工整
+西子
+碉堡
+盎司
+大宇
+建忠
+嬗变
+幺
+comes
+薄板
+志祥
+冒牌
+气瓶
+嘉峪
+吉普
+和约
+冬眠
+快艇
+豉
+山沟
+村口
+螭
+聆
+脑瘫
+火火
+6800
+滓
+行路
+马耳他
+圣堂
+同住
+855
+墙角
+揣
+闸阀
+盘活
+俐
+作废
+铜锣
+亦然
+赤卫队
+铁骑
+辣子
+Sciences
+专供
+峭
+由得
+重臣
+忍辱
+笔直
+ヨ
+孵
+农会
+PVP
+摔角
+氢化
+创一
+大哭
+苄
+날
+道远
+571
+入骨
+造物
+信宜
+师姐
+慢速
+释怀
+盘山
+广韵
+俊秀
+躺着
+表态
+706
+日晒
+万苦
+万向
+腌渍
+寂寥
+祭拜
+General
+往东
+背书
+DLL
+牛膝
+大厨
+市府
+圣光
+颏
+手袋
+仓促
+小五
+神偷
+Call
+红火
+469
+残破
+标号
+抽签
+惊动
+镭
+来龙
+灏
+红桥
+神工
+大智
+星驰
+悖
+Thisis
+老屋
+超微
+吉诃德
+砂质
+泰戈尔
+竖起
+将帅
+武官
+河滩
+买进
+Duo
+宫主
+汤普森
+一性
+当世
+冷艳
+阪
+Communication
+忌讳
+2B
+奇瑞
+痛楚
+刺刀
+登顶
+三关
+Anthony
+创优
+罗杰斯
+DIN
+浙大
+脑炎
+传遍
+僚
+Nick
+再会
+摄食
+Exercise
+孤僻
+粹
+567
+伯父
+Shanghai
+美玉
+伸直
+鲁山
+湮灭
+冰毒
+钢厂
+新密
+自燃
+舂
+刊发
+仙踪
+元始
+主笔
+元丰
+倦怠
+094
+推敲
+记起
+京山
+参选
+Hand
+费尽
+老汉
+际遇
+龙脉
+祛痘
+海阔
+活生生
+射杀
+浪迹
+即日
+History
+EA
+Production
+二龙
+奈良
+491
+芭
+执意
+Old
+滔
+里德
+校服
+永济
+杜塞尔多夫
+左眼
+瘀血
+凶险
+画眉
+刮刀
+三岛
+里外
+花明
+疏浚
+洪泽
+晕倒
+碧落
+故地
+沁园
+537
+茅草
+羊排
+贴膜
+绿林
+休学
+肌炎
+海市
+洪江
+拍档
+油管
+长岛
+洛神
+小华
+拉姆
+托尔斯泰
+四六
+小云
+污物
+红海
+下棋
+SN
+弱化
+洪亮
+陈年
+商朝
+传至
+稀薄
+害人
+卡丁
+病残
+层数
+保时捷
+赛特
+新馆
+WinVista
+F3
+招致
+our
+冰水
+花苑
+威士忌
+悟性
+冻土
+双头
+口角
+应税
+营盘
+加尔
+波罗蜜
+落脚点
+殿宇
+iTunes
+残骸
+妄图
+悠远
+ldquo
+妆容
+scape
+905
+翻动
+艾德
+猛然
+榨汁
+逃过
+馈
+调谐
+使节
+东进
+Have
+圣者
+新南
+拍戏
+露水
+job
+汉服
+少妇
+Ultra
+侯爵
+法务
+样机
+敌我
+腱
+包涵
+SATAII
+点通
+1802
+XYZ
+跳槽
+兴农
+鹿特丹
+北图
+莱斯特
+铺装
+塞内加尔
+贩毒
+足浴
+殷商
+高唐
+滑县
+置入
+呆呆
+似锦
+Jay
+元稹
+载货
+埋怨
+美色
+古德
+下半
+鸡爪
+阁下
+乡愁
+783
+话音
+赫尔辛基
+形神
+girls
+果果
+本杰明
+展期
+Alan
+到头来
+558
+木炭
+文房
+发炎
+水表
+建阳
+豁
+惭
+広
+初现
+官名
+序章
+莱卡
+群书
+CK
+呛
+北新
+安史
+阳虚
+华龙
+Sarah
+西汉姆联
+media
+杰米
+轧机
+야
+宏达
+前尘
+当属
+建交
+洗碗机
+688
+晋安
+女扮
+低落
+长枪
+轮渡
+荠菜
+滤膜
+微凉
+各向
+憎
+元史
+还珠
+大佬
+输水
+川崎
+待业
+Last
+抓起
+禅城
+极尽
+低潮
+1869
+猝死
+寻梦
+凛然
+电杆
+台账
+伶俐
+茨基
+当朝
+访华
+党团
+建言
+SONY
+书卷
+边城
+089
+大闸蟹
+抗大
+1859
+蒙城
+建都
+长山
+阕
+眯
+长海
+东家
+笔耕
+满贯
+割舍
+应负
+鳗
+川西
+灰暗
+晋商
+连胜
+遵化
+补缴
+明胶
+身患
+綦江
+连翘
+百盛
+脚气
+充裕
+尸骨
+唐纳德
+炮灰
+炕
+炫彩
+LS
+平铺
+万劫
+待到
+花梨
+说书
+黔西
+每间
+兼论
+回升
+荤
+PL
+日暮
+血水
+汛期
+匕
+抽检
+琮
+肩胛
+须根
+隧洞
+陷落
+中银
+蜚声
+外债
+MENU
+沙洲
+草菇
+屈曲
+放行
+豆干
+支座
+添置
+DNF
+千卡
+were
+西单
+大可
+排忧
+主次
+认输
+大钟
+幽暗
+评说
+良机
+育林
+养蚕
+圣域
+拉尔夫
+揣摩
+原体
+快慢
+和蔼
+惨死
+南道
+Based
+布鲁
+悟道
+辇
+白昼
+格伦
+接机
+Ivy
+油条
+飞花
+担子
+098
+巴哈马
+商号
+整日
+精兵
+荡气
+汽缸
+奥托
+灵儿
+天罡
+脾虚
+齐家
+四宝
+江华
+金戈
+榴
+RD
+784
+怒放
+真传
+遂以
+v6
+清剿
+券商
+运力
+C10
+钉子
+峻岭
+跳转
+Battle
+万宁
+平菇
+平分
+将向
+恒久
+轮子
+弯道
+胃病
+耐劳
+克勤
+复赛
+sweet
+海明威
+阿斯顿维拉
+汉密尔顿
+求和
+番号
+拜见
+雕花
+叠嶂
+出神
+气喘
+沈河
+惋惜
+屈指
+指法
+电竞
+景山
+莴苣
+大东
+历历
+惊变
+归一
+察哈尔
+车头
+贵溪
+1851
+爱车
+砖雕
+毫秒
+辛格
+佩斯
+绝密
+四至
+澧
+意欲
+497
+鬼魅
+椽
+不足道
+醉人
+巴勒莫
+猜疑
+成双
+碌碌
+了结
+棒状
+脊梁
+蹇
+岫
+攀爬
+吴川
+上体
+ゼ
+周村
+封山
+
+asa
+震天
+地牢
+退换
+琏
+险情
+本尊
+藏区
+Corei
+莱姆
+单薄
+名目
+克劳斯
+陆离
+细辛
+处所
+嫩叶
+改元
+诗坛
+造假
+迪奥
+长卿
+聚类
+机舱
+黑影
+画幅
+成帝
+插队
+魁北克
+微细
+焚香
+Food
+逞
+子集
+境域
+龙兴
+前额
+山大
+火器
+Zhao
+土体
+电瓶
+麻衣
+民房
+作图
+开怀
+呼气
+粘附
+红妆
+1007
+女权
+修造
+5M
+轶
+巴特
+小名
+任由
+古韵
+渔场
+迪恩
+Joe
+铃铛
+着地
+艳遇
+阐
+难为
+山前
+花岗石
+参赞
+奥斯特
+三无
+难分
+坚贞
+投胎
+心术
+涌入
+后轮
+招录
+敝
+鼠疫
+危地马拉
+本真
+雪莉
+辑录
+仝
+告示
+令牌
+老将
+家营
+四人帮
+金匮
+层叠
+硬座
+凸出
+紧随
+面朝
+7kg
+警长
+JS
+煤电
+鱼香
+经传
+穿甲
+气短
+中耕
+冷笑
+O2
+三十万
+月儿
+孝道
+対
+稽察
+如潮
+铜管
+八荒
+浑浊
+一炮
+情迷
+鲶鱼
+绾
+钓鱼台
+won
+Right
+坦率
+洁面
+学联
+要地
+昭示
+打人
+殊死
+澎湖
+锡纸
+宗主
+谯
+死灵
+庄河
+列子
+中院
+哈维
+Theory
+奖赏
+四色
+言谈
+井然
+字字
+774
+合办
+死死
+神雕
+北戴河
+国号
+留心
+分封
+身披
+触感
+五区
+运势
+逊色
+碳化
+恩德
+香浓
+快手
+凋亡
+伸向
+屡获
+首钢
+宣和
+率众
+金地
+划线
+铿锵
+辫
+盛事
+教具
+取土
+千禧
+阑尾
+沟谷
+新丰
+庚子
+血案
+入魔
+抗凝
+塞巴斯蒂安
+乌头
+东丽
+气节
+12010
+暗斗
+BG
+湘鄂
+海星
+切花
+吹风
+乐手
+女佣
+利剑
+蹭
+隔墙
+香樟
+脱去
+双翅
+四柱
+Canada
+扳手
+WIN
+微距
+陆家嘴
+闭经
+马拉加
+大校
+月月
+fight
+二百三十九
+镀层
+润肠
+见底
+创富
+665
+芸芸
+1015
+贝里
+俄军
+划转
+修水
+句号
+考据
+中庭
+帕尔马
+红枫
+四十多
+春雷
+伯恩
+国富
+638
+华莱士
+笔调
+东宝
+汤锅
+技嘉
+艺品
+731
+托斯
+居里
+足见
+酸度
+欲来
+Barry
+am
+天分
+融融
+white
+听者
+李宁
+6MB
+外史
+台网
+先例
+各校
+中巴
+谢绝
+罪过
+拱形
+曝
+韶华
+再婚
+刀子
+窘迫
+捣乱
+雁门
+懈
+梵文
+散见
+快速路
+转轨
+往西
+双耳
+建院
+酒醉
+冷峻
+作乱
+HS
+桑德兰
+谱系
+灵界
+修路
+皮损
+连心
+动脑
+Code
+合江
+坎贝尔
+聚散
+发黄
+718
+study
+高保真
+亚龙
+祷告
+颗星
+理清
+野鸡
+阵型
+石佛
+生发
+烤制
+Pictures
+梯级
+笼子
+进销
+翻页
+但丁
+情话
+1220
+谦逊
+卷心菜
+651
+金乡
+794
+词库
+山盟
+利落
+防冻
+残月
+瞩
+标尺
+中低
+金兵
+file
+本行
+潼南
+五品
+1008
+682
+药具
+河里
+遗落
+一机
+待客
+咪唑
+乳鸽
+被骗
+前庭
+五十万
+伫立
+台东
+要约
+ward
+胆识
+逾越
+721
+大勇
+宪兵
+五七
+圆寂
+临河
+槌
+巨兽
+瓒
+受训
+胫骨
+初八
+上林
+婆娑
+曲径
+广宁
+副将
+记帐
+成衣
+722
+结账
+骈
+C1
+Talk
+类群
+法网
+978780
+伽利略
+圆舞
+桜
+线网
+渔政
+润肤
+港币
+裏
+bad
+迷们
+อ
+早安
+24h
+发怒
+fast
+城隍
+亚特
+旺季
+情势
+硼酸
+端面
+陈云
+休斯
+喷淋
+定级
+咸菜
+米切尔
+赤字
+695
+会阴
+欲动
+java
+一己
+金瓶梅
+竹海
+霸占
+复置
+谟
+楼顶
+目眩
+文官
+擎天
+黄旗
+面相
+别处
+热工
+家业
+迅捷
+引路
+马关
+世道
+617
+石阶
+脆皮
+1mg
+099
+曼陀罗
+合葬
+肉汤
+都柏林
+倍速
+奉节
+活络
+肄业
+Case
+志华
+退位
+大略
+表率
+公映
+微晶
+636
+8800
+慈祥
+其言
+蒂诺
+汉水
+庠
+安民
+雄壮
+窑址
+黑手党
+丧命
+1821
+Km
+must
+三洋
+思乡
+自流
+热塑
+从医
+解围
+海德堡
+焦油
+xxxx
+LIVE
+学组
+庸俗
+design
+党章
+阳刚
+顷刻
+昭阳
+Spa
+Movie
+手语
+卢氏
+太学
+Christmas
+善款
+站房
+护林
+瓤
+B6
+logo
+品鉴
+健忘
+新教
+仞
+喷油
+规整
+世昌
+惊涛
+武平
+soul
+易地
+小鹿
+通江
+天宁
+紊
+口音
+温顺
+郑氏
+木子
+巴南
+八位
+热土
+脸蛋
+洗漱
+枫林
+布阵
+青玉
+FBI
+利亚
+拉杆
+憋
+1W
+Note
+鳅
+地下党
+汽修
+射流
+罗恩
+略论
+鱼鳞
+腥味
+毛细管
+受封
+末梢
+激化
+二百四十
+两方
+周至
+务求
+一干
+奉旨
+眩
+马赛克
+校史
+血瘀
+猴头
+补发
+铁军
+bring
+湿法
+畴
+乐谱
+乘胜
+神迹
+西海
+838
+砀山
+门业
+飞轮
+凝望
+穗状
+鸣叫
+连铸
+竜
+毒舌
+收场
+566
+勤恳
+脂鲤
+看破
+次方
+丰润
+临沧市
+1853
+厂矿
+位图
+示波
+⒋
+德林
+早起
+此病
+wish
+免职
+觊觎
+胚珠
+吸虫
+视域
+古往
+DI
+异类
+1858
+卤素
+曲率
+IM
+停战
+特纳
+上水
+义理
+835
+大器
+减振
+成风
+纹章
+甜椒
+WXGA
+池田
+神鬼
+屹
+看懂
+昇
+引渡
+咯血
+兴奋剂
+幻术
+魄力
+乔恩
+挚
+弘法
+另案
+行数
+weaver
+干酪
+弹丸
+盘内
+深谙
+长诗
+精矿
+盂
+下流
+斜视
+离心机
+帆布
+可乘
+检阅
+到访
+显影
+白皮
+MVP
+鲈
+蓝山
+玉玺
+笃行
+志文
+英华
+挽歌
+答谢
+骤然
+江干
+色带
+寄予
+陋
+浣溪
+砾石
+优育
+认错
+一瞥
+昀
+志平
+侵扰
+T9
+慵懒
+撮合
+М
+黑点
+纵隔
+成性
+填海
+6L
+组诗
+恩惠
+分权
+寒山
+饭团
+末尾
+呻吟
+912
+预习
+苏格拉底
+严惩
+句容
+红润
+排烟
+涕
+Walker
+三百多
+家贫
+楚汉
+夜曲
+Anderson
+九品
+两把
+基本法
+本报
+烟波
+存世
+吹来
+涎
+环评
+旻
+奥利
+餐厨
+嫁人
+汉宫
+航向
+歼击
+795
+多态
+呱呱
+会址
+票数
+缓释
+吊车
+双眸
+钦佩
+初试
+政要
+坐姿
+家寨
+蒙特
+拔罐
+愚人
+操守
+清平
+仙鹤
+姫
+河马
+肾盂
+脆性
+难产
+桥墩
+满清
+1046
+雇
+贲
+描摹
+看穿
+巴伐利亚
+亚非
+跨省
+韭
+截断
+情殇
+派员
+沃尔夫斯堡
+马萨诸塞
+内控
+ร
+寄居
+颐和园
+品名
+磨砂
+ngy
+00000
+壮举
+狄仁杰
+炉火
+大嘴
+照明灯
+METHYL
+羰基
+坦桑尼亚
+湘军
+黄骅
+联创
+州城
+脐橙
+大捷
+如斯
+三都
+处子
+下界
+华章
+续作
+虬
+请客
+Hi
+01979
+妖女
+暗器
+詹妮弗
+5300
+笑意
+Tele
+瘦弱
+音域
+愈发
+曲奇
+客源
+凝集
+野兔
+维基
+求新
+畅想
+cm3
+脱出
+琼瑶
+品茶
+结对
+混搭
+表露
+嫩江
+誓死
+罗伯
+姗姗
+DK
+油路
+指掌
+卵磷脂
+测出
+赛艇
+马里兰
+ç
+3x
+论战
+海商
+浮世
+肉松
+高声
+吴起
+时局
+刻刻
+切斯特
+清气
+海陵
+心结
+杀气
+苴
+起泡
+三和
+可治
+肉丸
+扶梯
+博格
+邨
+云霞
+卓绝
+导流
+Learning
+杂费
+重获
+拥堵
+白描
+黄鳝
+flow
+份量
+饶平
+埃菲尔
+郡望
+通宵
+聿
+多瑙
+温特
+喷砂
+executive
+具足
+伊利诺伊
+Easy
+弥散
+656
+苍梧
+责成
+nd
+Saint
+桔子
+溺水
+4100
+电抗
+比照
+试听
+衰变
+833
+上坡
+写于
+衮
+ø
+切记
+阿司匹林
+剥落
+俨
+双管
+银花
+恰逢
+血虚
+猪猪
+候机
+三讲
+樱井
+前腰
+沉入
+飞镖
+密苏里
+辛劳
+编造
+待发
+相仿
+赞歌
+中土
+可循
+沃伦
+Direct
+油门
+2H
+西川
+宣纸
+955
+任选
+福斯特
+后发
+茅庐
+官军
+834
+热管
+预置
+Gordon
+举家
+实数
+熔体
+图谋
+自足
+华表
+停运
+电灯
+横幅
+自修
+情色
+沉闷
+折服
+分装
+追认
+机顶
+更迭
+吃药
+父爱
+气滞
+冷气
+猕猴
+毗连
+小乔
+猛龙
+误诊
+未免
+记分
+变暖
+宝丰
+扑火
+门兴
+漳平
+迳
+泄密
+侧线
+包办
+枝干
+外源
+香酥
+生叶
+顿涅茨克
+梅西
+巨浪
+旧有
+牙痛
+山形
+舞姿
+越国
+大难
+浠水
+turn
+密宗
+展露
+出谋
+嬉
+751
+HZ
+feat
+一概
+黄梅戏
+递进
+绽
+解聘
+相惜
+随葬
+滚针
+桀骜
+耶斯
+氦
+枕边
+新奥尔良
+孛
+颂歌
+刷牙
+主公
+莱昂
+井口
+深长
+府城
+Andy
+祛痰
+心狠
+水军
+加厚
+咏叹
+大青
+舰炮
+追赠
+正果
+巴彦淖尔
+良缘
+八世
+老母
+儿媳
+联体
+假体
+production
+天正
+收付
+生辰
+六经
+重兵
+Shift
+还须
+软文
+降糖
+妒忌
+贫乏
+掀开
+К
+紧闭
+倒角
+护工
+常会
+清兵
+万缕
+分家
+蝗虫
+Island
+应得
+囚徒
+631
+情态
+苦味
+Tell
+泡软
+解雇
+地支
+党性
+瓦格纳
+转至
+IL
+哑巴
+可溶
+∈
+亲民
+巴州
+利华
+1801
+OO
+门生
+翻盖
+妙法
+微调
+登门
+维和
+773
+亮亮
+福晋
+交待
+光棍
+遮掩
+服输
+消解
+求爱
+Effect
+硅烷
+海誓
+else
+心眼
+wrong
+雌蕊
+神奈川
+仙台
+牢笼
+547
+Studies
+满园
+3V
+彩陶
+斗法
+珠峰
+邪魔
+归案
+咸蛋
+拘束
+穷困
+诺曼底
+图库
+True
+巴彦
+莒
+分形
+低档
+4K
+评传
+重写
+独处
+宣州
+扳
+霍普金斯
+5S
+言传
+茶饮
+591
+里路
+忠良
+新塘
+大礼
+当涂
+⑺
+认字
+隔阂
+1403
+菌落
+二合一
+此片
+神色
+Jerry
+联大
+里根
+清高
+value
+瑶池
+窥视
+戬
+迢迢
+玉门
+天南
+把酒
+盖板
+临潼
+强省
+打掉
+工频
+钩子
+097
+威尔斯
+铳
+Ben
+呕心
+爬山
+就使
+酒瓶
+wind
+干校
+英山
+逸夫
+西学
+高亢
+桥南
+章事
+袋子
+石首
+故国
+圆角
+阿玛
+衣衫
+漂洗
+糊糊
+Kbps
+笔谈
+弗雷德
+图拉
+杂病
+腐乳
+张张
+平坝
+AC220V
+诠
+牧野
+何其
+办税
+明道
+善本
+珠光
+缩水
+道夫
+洗去
+散结
+纣
+克莱尔
+绕道
+喋血
+九星
+大堤
+二院
+早泄
+玉成
+新课
+后辈
+建昌
+悲喜
+配线
+大抵
+土城
+648
+连发
+语料
+热度
+换挡
+耳光
+哭声
+狭隘
+CL
+city
+非农
+卡口
+扫雷
+耳塞
+GF
+IDC
+think
+Im
+沙溪
+楼板
+会心
+东立
+搅动
+勉励
+蹂躏
+三丰
+实情
+颍上
+逐浪
+幽兰
+气量
+垫片
+纵火
+莫愁
+彭州
+924
+耳麦
+让给
+文龙
+酱料
+明华
+会友
+ㄆ
+使劲
+沥血
+俪
+赦免
+周庄
+开合
+自习
+超重
+意趣
+赛前
+Wind
+Guo
+卷帘
+珀
+狐仙
+卫兵
+失血
+铁器
+水车
+935
+中苏
+Billy
+上京
+选好
+鸵鸟
+人脑
+白炽
+而论
+赫然
+告警
+title
+Hyper
+给以
+剃
+皓月
+小腹
+Are
+通胀
+仿效
+长岭
+铝塑
+党外
+分选
+二氯
+玉屏
+10ml
+铝板
+停歇
+止水
+基板
+811
+镳
+子胥
+足下
+艳芳
+南浔
+王孙
+油罐
+氐
+热泪
+追到
+跋扈
+895
+蟠龙
+圣王
+望京
+安福
+雒
+扑面
+天色
+至爱
+判例
+百丈
+峰顶
+沙俄
+训诂
+笔迹
+morning
+失色
+终於
+Jimmy
+德累斯顿
+津液
+开式
+地中
+沁阳
+亚伦
+寒食
+风帆
+坡区
+军统
+潜龙
+书展
+骗人
+武冈
+继往
+金兰
+戏班
+垂涎
+⑩
+辅酶
+胡乱
+color
+喜人
+氩
+女篮
+匈
+腐殖质
+El
+济阳
+1223
+ET
+汨罗
+颢
+以法
+草属
+弑神
+尉迟
+惊异
+济公
+稍加
+上下文
+八国
+身姿
+成行
+建档
+物象
+归功
+富力
+三防
+初春
+拨开
+续行
+九段
+平卡
+面宽
+用火
+磨刀
+动容
+温饱
+严防
+飞蛾
+ip
+彩椒
+DA
+要钱
+刊行
+圣兽
+沂南
+引得
+棉布
+洛桑
+丘疹
+烽
+844
+德勒
+大呼
+棹
+浓墨
+四两
+Series
+海苔
+came
+时差
+大笔
+名花
+抄送
+油纸
+黄雀
+普希金
+滇池
+敌意
+来玩
+716
+登峰
+冤枉
+祁阳
+原价
+1854
+MKV
+鲈形
+娘亲
+油水
+内训
+挂失
+锤子
+击鼓
+安分
+递归
+6400
+中师
+鲍尔
+纸巾
+家小
+比作
+皮脂
+816
+牒
+电业
+对於
+一星
+至强
+1230
+灵犀
+受苦
+1022
+NH
+758
+699
+环比
+量力
+phe
+六色
+静坐
+离骚
+相中
+观照
+en
+中军
+艾灸
+红四
+炼铁
+壅
+拟建
+同乐
+振作
+永祥
+家私
+熏蒸
+判刑
+青峰
+罗伯茨
+通透性
+Taylor
+命脉
+奇数
+奇案
+端倪
+李鹏
+瀬
+歌者
+形声
+乐舞
+标致
+扔掉
+4L
+track
+奸臣
+佛爷
+成趣
+洮
+家臣
+蠢蠢
+南河
+密植
+环礁
+阖
+玄武岩
+接缝
+麻风
+卢卡
+叶绿素
+鲜汤
+为公
+计提
+砂仁
+维罗纳
+炮击
+脱身
+怀宁
+商谈
+apk
+构型
+関
+金卡
+小七
+豚
+佳佳
+Jeff
+收盘
+56K
+犹他
+难舍
+贝特
+beautiful
+Speed
+蒙蒙
+小字
+迫击
+高院
+必争
+胀痛
+相融
+跑马
+脘
+文丛
+谙
+墓群
+依此
+水气
+参评
+选本
+美剧
+枞阳
+王蒙
+煅
+新新
+奈奈
+橙汁
+qq
+莫及
+双翼
+长袍
+黄杨
+ADSL
+姚明
+Wei
+豆花
+贝格
+见得
+毋庸
+child
+加到
+mode
+绝缘子
+红娘
+screen
+爱爱
+金蝶
+喷嚏
+846
+full
+539
+血性
+春运
+养父
+下联
+686
+三民
+往年
+885
+ES
+丧事
+549
+凶恶
+滤清
+举荐
+同门
+翔安
+女尊
+病区
+732
+凉风
+砺
+辞官
+SAP
+Phe
+怀仁
+半空
+魇
+多指
+ぞ
+热河
+Anna
+FSB
+必死
+氯仿
+乐安
+遗风
+水剂
+绝杀
+ㄎ
+Silver
+抢购
+廊桥
+蒙彼利埃
+渤
+二八
+过路
+Break
+铅锌
+587
+小泉
+全剧
+棋局
+侠盗
+此曲
+国风
+如海
+当道
+心安
+斧头
+Ф
+赋值
+劳教
+Ed
+斩获
+白手
+孽缘
+泡茶
+用功
+UC
+铸钢
+木桶
+六层
+韩剧
+飞禽
+水声
+菜刀
+淘沙
+劝告
+4D
+米业
+六五
+督学
+692
+引信
+艺苑
+泾县
+选人
+七成
+起用
+温病
+华林
+火炉
+忠厚
+泰迪
+低地
+羊群
+疤
+万顷
+春梅
+罗定
+紫藤
+皆非
+城子
+楠木
+Danny
+定作
+using
+姨妈
+撮
+映照
+疏离
+TT
+天保
+尖子
+若虫
+辖市
+烟熏
+牛蒡
+保外
+里克
+家主
+永利
+酸枣
+刷机
+木箱
+锺
+惠山
+狼烟
+百出
+学林
+TF卡
+横流
+RX
+曹县
+排位
+遽
+军国
+打分
+射阳
+生者
+工兵
+朝霞
+红日
+飞鱼
+珉
+寿星
+伯克
+PDA
+安利
+772
+逃婚
+春香
+Sweet
+泰顺
+语族
+峨嵋
+秋山
+瓶中
+光能
+开销
+沃德
+法郎
+Scott
+鱼虾
+失禁
+溢流
+1006
+贤良
+风向标
+真主
+鹿茸
+跳起
+机炮
+兴平
+过目
+CNC
+焚毁
+滤料
+打赌
+威远
+反帝
+上庄
+迷迭
+江中
+新科
+宜都
+空姐
+麦子
+农科
+并州
+锁链
+丹佛
+石峰
+申辩
+夹击
+原液
+天门冬
+淡出
+会试
+秦州
+寿宁
+干姜
+757
+火机
+兰山
+亚特兰蒂斯
+拼装
+炫目
+农事
+澍
+连体
+内嵌
+赏花
+1S
+塑形
+上边
+浓汤
+面片
+李敏
+地砖
+农地
+闹事
+合拍
+湮没
+916
+银两
+578
+0GHz
+凄惨
+律己
+散去
+本经
+眼病
+1857
+铨
+万通
+饥荒
+际会
+名词委
+793
+假扮
+点状
+658
+溢价
+锦屏
+巷子
+风云录
+扬尘
+708
+蔻
+飞奔
+裂开
+灰白
+嘟
+换向
+根特
+真迹
+回车
+北桥
+康体
+受控
+交费
+C6H
+洪涝
+暗香
+旅人
+公案
+范蠡
+Mar
+炎炎
+Program
+火鸡
+霞浦
+五加
+峥
+阎罗
+NY
+海蜇
+大脚
+谗
+头巾
+眸子
+克鲁斯
+自定
+一问
+生花
+弗兰
+订座
+玄参
+阴囊
+卫冕
+伙计
+Edward
+贯中
+涅夫
+义卖
+北条
+科比
+牵涉
+汽水
+助阵
+橡木
+勃起
+M3
+药酒
+兴城
+瘦果
+U17
+Die
+硫化物
+布展
+629
+城厢
+人影
+老树
+958
+中江
+姝
+Color
+胆怯
+锥体
+燕赵
+1842
+变数
+肥东
+幻灭
+wan
+攻心
+PCM
+希冀
+开卷
+ISP
+貌美
+体量
+春燕
+宅女
+溧水
+兽类
+拓本
+庆华
+Got
+剁碎
+水闸
+遗言
+隋代
+腭
+账面
+IMobility
+弹唱
+三产
+入主
+索拉
+寿县
+骑车
+吉士
+托起
+险种
+OUT
+扣缴
+3900
+附图
+穿越者
+古柏
+跳伞
+筛分
+山洪
+装作
+鸭绿
+上联
+桑普多利亚
+迷藏
+MPA
+干洗
+饰面
+耐酸
+暝
+慌乱
+宝岛
+祷
+披靡
+闺
+ham
+下坡
+上党
+复辟
+上调
+高炮
+腾腾
+卵石
+异想
+犊
+繇
+仪仗
+成祖
+亭子
+弗朗索瓦
+外壁
+年轮
+拳术
+降脂
+编目
+一吻
+毛绒
+种地
+敦化
+填制
+腧穴
+侄女
+声纳
+蜂王
+茶杯
+轻伤
+闪现
+纹枯病
+九子
+月神
+其主
+鱼龙
+村办
+妓院
+看不起
+肱
+吲哚
+晓岚
+凌驾
+景宁
+昌都
+轮到
+蛊惑
+竹溪
+三绝
+襄城
+1123
+today
+美中
+茸毛
+接驳
+rich
+外宾
+公然
+白面
+GN
+张元
+新词
+寒假
+EXE
+兵马俑
+诏令
+羿
+插管
+插值
+胆大
+方中
+相与
+迎娶
+两厢
+甜食
+顾忌
+Ann
+祸福
+永存
+施治
+Cl
+连山
+滑块
+铁质
+置地
+气肿
+金昌
+活捉
+睢阳
+鰕
+塔山
+水雷
+猪场
+矿化
+海生
+打孔
+调离
+粉刷
+出外
+山崎
+市价
+function
+精舍
+WWE
+川北
+采药
+蕲春
+寡人
+遍体
+泰式
+词牌
+爽朗
+1260
+different
+新野
+ั
+容重
+东端
+重檐
+油盐
+直入
+559
+law
+再者
+婺
+素菜
+桑树
+恺撒
+筹款
+阵线
+锻压
+赞颂
+狼群
+smile
+−
+肴
+皇姑
+调式
+大仙
+史观
+白粉病
+เ
+ENGINEER
+弗雷德里克
+共轭
+叼
+767
+874
+邵东
+述说
+炊
+幼树
+冷宫
+荆芥
+定陶
+讥
+圆通
+1844
+上届
+全旗
+Double
+member
+富平
+刘庄
+抗诉
+GC
+1823
+开福
+修业
+爱普生
+福岛
+下体
+解药
+右臂
+犹存
+妙语
+肺泡
+富士通
+目镜
+详述
+乌梅
+插秧
+磨料
+宽裕
+乳山
+磨合
+痘痘
+女警
+八中
+MF
+吝
+驶入
+乌迪内斯
+汉庭
+灵璧
+花旦
+萜
+缎
+政令
+塔纳
+扩增
+保理
+球赛
+招用
+拎
+恳请
+波光
+取保
+Metal
+审问
+园子
+开窍
+籽油
+蜗杆
+查杀
+手书
+醉心
+师门
+声光
+菌核
+577
+严守
+历久
+843
+远播
+旨意
+泪痕
+合瓣
+密西西比
+Head
+中道
+678
+特尔
+胯
+推诿
+大岛
+但求
+追悼
+惭愧
+唐初
+班底
+公费
+步履
+装箱
+经得起
+清运
+凹槽
+极快
+爱知
+998
+塑钢
+过身
+青蒜
+巨野
+VA
+神龟
+内应
+可亲
+飞快
+蹉跎
+年岁
+罗勒
+Line
+广谱
+Australia
+底片
+臣子
+944
+转租
+荃
+星矢
+老道
+万代
+出招
+大夏
+强县
+器重
+锈蚀
+模态
+噩耗
+讲经
+含糊
+驱散
+世故
+幻兽
+哈雷
+云云
+蘇
+敲响
+换骨
+溪谷
+计税
+雾灯
+小玉
+上药
+内装
+同理
+矿体
+隋书
+狰狞
+单字
+复国
+殃
+Mg
+常备
+6300
+潦倒
+蔑视
+品级
+大钱
+里奥
+琳达
+骨料
+素数
+954
+锗
+凫
+WE
+宅院
+17000
+安岳
+暗自
+美颜
+茱
+推车
+草草
+飞溅
+哥德堡
+竞标
+止痒
+阚
+输变电
+前厅
+See
+包揽
+前朝
+池水
+雷丁
+喷绘
+小狼
+量具
+应受
+青绿
+康唑
+干练
+仇家
+Land
+边长
+斥资
+还魂
+初五
+同道
+如诗
+靠拢
+铸成
+Start
+利刃
+MySQL
+三性
+xp
+路窄
+学海
+防具
+要义
+词话
+Jiang
+巨作
+谪
+LM
+绝口
+大本
+莫尔
+下腔
+倒退
+单数
+手帕
+胸痛
+酚醛
+倜傥
+strong
+除法
+美肤
+养活
+5ms
+供体
+直视
+蜜枣
+FB
+文辉
+谬误
+贮运
+图式
+启事
+戒备
+五国
+齐河
+661
+壮美
+驯服
+免交
+Marie
+珑
+三氯
+行善
+平顺
+论道
+中牟
+公房
+孤山
+首例
+灵武
+1815
+Jian
+婆罗
+刻板
+屋脊
+小树
+散失
+泰坦尼克
+曲阳
+填入
+英亩
+Fang
+引向
+修好
+泉城
+UNIX
+志书
+转制
+珍视
+圣徒
+小曲
+林芝
+姜堰
+1520
+辅食
+蠹
+蓝皮书
+草稿
+821
+ل
+斯巴达克
+令狐
+丙酸
+force
+root
+病案
+荚果
+夜话
+多米尼克
+远志
+Market
+魔君
+果真
+潮安
+傲骨
+内疚
+检校
+跃然
+大邑
+真话
+归家
+公历
+1322
+脉相
+樟子松
+Youre
+长存
+单产
+啪
+641
+中宗
+Honey
+粒状
+云散
+艾哈迈德
+复函
+邵武
+光顾
+歼敌
+Bell
+评剧
+补光
+四百八十七
+建瓯
+山镇
+Idon
+视讯
+Ⅹ
+天干
+南陵
+场中
+圣水
+防抱死
+全盛
+候审
+沃尔沃
+活现
+8GHz
+迁建
+戾
+雷公
+婀娜
+明示
+丙烷
+欺人
+赣江
+741
+邯
+侧向
+1822
+战将
+软膏
+大书
+商贩
+Original
+少吃
+哗
+RT
+普罗旺斯
+接二
+幽门
+1202
+秋实
+S1
+受难
+棋布
+鲁尔
+黑熊
+失守
+大伙
+銮
+阐发
+作成
+题跋
+面额
+一马
+猛将
+名厨
+了得
+堂号
+呎
+加氢
+咽炎
+50HZ
+嶙峋
+掉入
+WPA
+苦干
+体面
+标榜
+ching
+往常
+奥拉
+渔人
+呼呼
+cott
+神志
+压紧
+天福
+自乐
+晋代
+秘术
+剪贴
+进贡
+入耳
+大波
+缸体
+论据
+猪手
+使君
+面点
+颓
+约翰尼
+京畿
+马刺
+853
+雾气
+南岭
+PID
+银发
+奇花
+缺刻
+Roger
+宜城
+旁路
+类固醇
+堕入
+艾瑞克
+比邻
+两脚
+相忘
+1723
+景园
+称心
+心裁
+铺路
+副官
+两岁
+984
+建兴
+固相
+茶油
+四星
+728
+995
+2nd
+悉数
+形同
+横竖
+湮
+闲适
+昴
+平昌
+崽
+帝皇
+还价
+Database
+布莱克本
+唯识
+贾斯汀
+成武
+搁置
+天都
+尖山
+宝盒
+用字
+艾迪
+大寺
+作响
+易错
+台币
+呵成
+伴生
+底牌
+list
+榆次
+哺育
+腐女
+伯特
+晓玲
+1014
+沟槽
+Box
+划破
+石塔
+management
+温中
+瓶盖
+Ⅷ
+空想
+扶植
+Display
+卡伦
+拱门
+学苑
+Process
+生土
+西贡
+sare
+923
+避税
+卵子
+宝钗
+魔头
+死掉
+Paris
+能耐
+NASA
+萨拉戈萨
+台儿庄
+拉比
+珀斯
+三娘
+法政
+出关
+Holly
+菌类
+田纳西
+莎莎
+盖亚
+蛇口
+截取
+增建
+相向
+阿訇
+cold
+年费
+Mode
+校办
+适逢
+寻回
+看管
+崇山
+腾达
+Band
+私利
+修学
+化龙
+XGA
+Fate
+年久
+石竹
+止泻
+郎君
+雪菜
+欢聚
+NDS
+跳蚤
+6200
+三三
+贬义
+红一
+爱惜
+群力
+妙计
+Please
+灰狼
+梵语
+断奶
+雪球
+Here
+听懂
+前四
+快板
+红三
+盐类
+非但
+文斌
+赠品
+时态
+网箱
+茱萸
+its
+废渣
+遥测
+翼龙
+下人
+禀赋
+画一
+暗色
+重物
+热切
+会堂
+笔力
+BK
+HACCP
+美玲
+组曲
+打来
+嘈杂
+五虎
+乾元
+配用
+脱氢
+core
+临风
+花叶病
+天乐
+心肝
+视同
+贵在
+七夜
+千亿
+上汤
+周氏
+观澜
+告密
+游轮
+比价
+退场
+半斤
+旦夕
+某处
+回京
+shan
+雄踞
+director
+罗浮
+祝寿
+BA
+耐候
+粱
+小凡
+Hard
+一把手
+辱骂
+基隆
+淡薄
+睁眼
+邛崃
+斗篷
+GX
+Ican
+阴山
+源码
+皴
+政企
+亚铁
+668
+潞
+zhu
+暴行
+EMI
+诵经
+模数
+民团
+拉西
+满腹
+泼墨
+花火
+后羿
+湖山
+味甜
+款待
+美因茨
+旁白
+1025
+9002
+忐忑
+thats
+卧虎
+土石
+1016
+备胎
+平喘
+筑城
+神气
+友军
+魔域
+偷盗
+庙堂
+深重
+凭据
+胸径
+氮化
+两派
+败退
+探戈
+烬
+闪过
+柯蒂斯
+二女
+惶恐
+迟早
+Crazy
+泰德
+Full
+街口
+巫婆
+髂
+荒岛
+睢宁
+鞋底
+哀愁
+986
+新界
+中堂
+单科
+钏
+青藤
+节录
+变脸
+沙县
+白露
+冲程
+乳香
+development
+滚珠
+圳
+艾琳
+仙姑
+升压
+DHCP
+worth
+浮力
+招惹
+前肢
+屠夫
+良辰
+扬长
+pro
+二叉
+拔除
+CS3
+盗汗
+洋人
+全歼
+VISA
+如月
+大旱
+1103
+症候
+旧日
+锻件
+扼杀
+江畔
+10mg
+啼笑
+包皮
+忧心
+骇客
+箱内
+匪徒
+祸首
+白首
+偷懒
+高纯
+Letters
+群芳
+阳关
+凋谢
+笔架
+EPS
+蘑
+ittle
+査
+钟离
+感怀
+派克
+流苏
+D3
+莒南
+切切
+3H
+屏保
+匝
+逆行
+此剧
+二百二十七
+671
+乙炔
+败北
+597
+汉白玉
+兵变
+其词
+言说
+后腿
+秀发
+釉色
+卡利亚里
+next
+URL
+博鳌
+长门
+大厂
+凯斯
+种草
+风生水
+强求
+忠信
+吉特
+坐骨
+脑电
+现值
+摘取
+底物
+接点
+消磨
+天寒
+香客
+人学
+肌理
+迸发
+洙
+鼻部
+行会
+骚乱
+巨幅
+Moore
+币种
+假想
+Howard
+宿松
+Royal
+套用
+麋鹿
+各条
+陈明
+起死
+舒城
+蠡
+文莱
+克雷格
+河蟹
+万源
+恳求
+涉密
+遂昌
+婆罗门
+砖石
+直拨
+双剑
+圣路易斯
+右眼
+大把
+低迷
+星城
+装点
+辽代
+Award
+morrow
+求医
+票面
+翠屏
+1420
+煦
+U21
+说理
+908
+唱和
+肝素
+PS2
+拼死
+屏山
+科利
+悪
+晾晒
+工龄
+苦痛
+抱抱
+双刃
+车务段
+校际
+浸透
+884
+무
+全班
+作息
+1803
+卡罗来纳
+锡金
+破绽
+please
+茫
+怕死
+徳
+外戚
+景天
+扭伤
+捕猎
+退路
+济困
+驱赶
+LGA
+EXO
+蒙自
+做菜
+嗣后
+一诺
+Soul
+幼体
+要务
+鎏金
+伙同
+战果
+双拼
+编年史
+862
+一隅
+膏药
+Corporation
+包菜
+花蕊
+苗床
+尼姑
+家坪
+两艘
+64MB
+萌动
+found
+芦花
+登机
+亭亭
+代缴
+嘉园
+鸡毛
+贪心
+精微
+枣阳
+Number
+上清
+黄耆
+啮合
+飘雪
+耗油
+采掘
+仰韶
+db
+group
+9kg
+耽
+税前
+自保
+抗灾
+SCSI
+二百五十八
+桐子
+下拉
+绿园
+林氏
+统编
+琐
+收徒
+窑洞
+工藤
+Nature
+中弹
+岛国
+直率
+猪排
+银元
+纯美
+多米尼加
+顶替
+援引
+红安
+木鱼
+踌躇
+毛茛
+符咒
+粉刺
+赣榆
+JJ
+憨厚
+旁通
+懒得
+统率
+Voice
+塔拉
+纵容
+湖口
+罗宾逊
+单线
+cross
+乳剂
+都护
+Health
+모
+郓城
+坚忍
+爱迪生
+描红
+乘积
+天蓝
+武圣
+中暑
+白皮书
+拍手
+Blood
+娇艳
+唠叨
+掘金
+化生
+胱氨
+费时
+顺子
+在即
+W33
+舛
+1810
+布里斯班
+醇香
+萧何
+脚尖
+售货
+内衬
+CHINA
+雁荡
+疯癫
+HG
+SiO
+铰
+鼻涕
+北道
+宗仁
+明争
+西格玛
+跟前
+一院
+鲛
+冷饮
+病种
+归附
+威利
+小包
+耗用
+鹤岗
+成天
+红艳
+简写
+States
+劳苦
+塔城
+覆膜
+斜面
+刊名
+总装
+方山
+雪夜
+鬃
+获救
+红遍
+序数
+调经
+哈里森
+一扫
+奇闻
+眼力
+数不清
+柚木
+敢死队
+农奴
+博彩
+双河
+伊达
+泥泞
+1560
+葱头
+乙烷
+冷清
+用例
+皮尔斯
+空载
+934
+色散
+圣陶
+脸皮
+粤港
+朱迪
+鸿沟
+贤人
+姗
+rdquo
+西魏
+Converter
+印象派
+中计
+千岁
+规约
+Ø
+华清
+夏奥
+大内
+摩西
+秀峰
+费尔南多
+円
+平王
+初九
+富勒姆
+上床
+十全
+众说
+光山
+改型
+入梦
+公估
+窑炉
+原色
+翘楚
+星巴克
+课间
+云层
+镇南
+迩
+青稞
+借机
+天之骄
+开行
+马苏
+建州
+一江
+嘴边
+决然
+894
+三姐
+whole
+拌饭
+踏进
+保函
+金缕
+Modem
+bi
+时域
+0V
+干法
+新种
+块茎
+厮
+嫁衣
+剪除
+出彩
+灵秀
+教条
+诗曰
+ㄘ
+1180
+雨滴
+首推
+Premium
+wall
+年迈
+长洲
+非特
+Performance
+厮守
+方城
+窗台
+恰巧
+格瑞
+披肩
+铁锅
+Zhou
+近亲
+狼牙
+MPG
+射雕
+才情
+配种
+考纲
+世态
+石马
+教廷
+登天
+盖尔
+校报
+丰都
+Galaxy
+2023
+大娘
+声像
+星形
+罪孽
+方家
+耀华
+黑穗
+秋叶
+Chemical
+厂牌
+贵贱
+提神
+残阳
+沣
+道尔
+Anne
+尚武
+В
+歪曲
+变价
+王侯
+水杨酸
+悯
+暴怒
+尼克松
+加人
+罗城
+续编
+正名
+秋夜
+花旗
+诺瓦
+塑身
+垌
+了当
+下课
+Off
+1206
+震中
+茌平
+羧
+饭碗
+烟机
+仗剑
+alone
+Secret
+一五
+尿频
+储罐
+黯淡
+脱壳
+蜜汁
+嬛
+上文
+VOL
+九死
+鏖战
+569
+拟合
+传单
+班禅
+申花
+决绝
+石岩
+含泪
+铁心
+印在
+nov
+磨成
+老外
+Walter
+高发
+弓形
+护城河
+鲀
+珪
+粉彩
+点卡
+柳暗
+诗风
+Junior
+埋没
+彷佛
+口子
+立碑
+区属
+1203
+命理
+名教
+堤坝
+聊生
+子民
+拗
+光碟
+雪景
+恶鬼
+西距
+东直
+1040
+蓬松
+斡尔
+东胜
+招揽
+石笋
+名都
+南安普顿
+HB
+夏末
+夹心
+失所
+馆舍
+进位
+真龙
+嘉义
+HC
+守望者
+烈度
+叠翠
+阳子
+荞
+号子
+安眠
+好笑
+红霉素
+焊工
+尿毒
+国色
+海米
+包衣
+钝化
+逃命
+646
+蜱
+爆出
+虚报
+国忠
+殿中
+孤寡
+歌咏
+暂缓
+梁子
+文才
+匹敌
+小点
+1845
+两重
+案头
+Chan
+德拉
+白鹿
+吹牛
+新贵
+律动
+喵喵
+鲁西
+东非
+猎豹
+wave
+合川
+躲开
+桥西
+丘吉尔
+stars
+要略
+大泽
+CAE
+兰科
+西克
+可待
+离间
+发疯
+新德里
+公事
+赤脚
+假意
+厚薄
+娇小
+蛋壳
+永刚
+岔河
+百代
+温服
+棒棒糖
+合围
+蜜语
+吉水
+阡陌
+Central
+手链
+南渡
+摇匀
+fly
+失足
+三网
+仅剩
+自嘲
+天光
+保藏
+吟咏
+通经
+榨油
+砂土
+大件
+美英
+剪影
+凤英
+内热
+疏密
+为着
+泰隆
+景洪
+933
+代建
+Public
+磐安
+订做
+立方体
+灌顶
+朗斯
+明人
+录影带
+matter
+海归
+颠沛
+军粮
+组态
+熟料
+复线
+逆光
+资本论
+猴王
+txt
+石嘴山
+囊中
+脆嫩
+玩伴
+35W
+民宿
+忍冬
+三桥
+薯条
+影帝
+样貌
+医技
+거
+箔
+头昏
+黎平
+刻字
+礁石
+蜀国
+豆油
+活水
+烧鸡
+736
+宁化
+颛顼
+斯坦利
+惊雷
+蔽日
+45000
+TB
+木料
+QQ群
+仙草
+商海
+国和
+傣
+洒水
+发威
+左旗
+泸
+Kiss
+预谋
+活到
+经卷
+旗杆
+列岛
+883
+军警
+1380
+nVIDIA
+内丹
+谢尔盖
+海陆
+列日
+八成
+碟片
+革兰氏
+珍禽
+东河
+预兆
+读后
+随缘
+荷马
+锦旗
+溺爱
+徒劳
+转瞬
+断点
+other
+苍术
+博白
+运销
+三味
+浮山
+得逞
+普顿
+颌面
+毓秀
+后事
+黄绿
+染红
+lover
+邰
+汰
+札幌
+托特纳姆
+林林
+矫形
+唱词
+对局
+球类
+虽小
+summer
+新光
+AE
+新宇
+便血
+高工
+12009
+金鹏
+四会
+่
+RP
+정
+彩屏
+鼻塞
+军装
+安大略
+Dave
+1602
+残废
+年款
+芡实
+副职
+降价
+BASE
+特利
+上列
+String
+论题
+拐杖
+长颈鹿
+成金
+缢
+自居
+阿德莱德
+签章
+自荐
+线控
+评点
+积压
+见谅
+Jordan
+上楼
+土改
+寄给
+斯克
+膜片
+回鹘
+卫校
+自省
+偏转
+California
+妮可
+巴尔干
+大士
+秀珍
+性子
+比基尼
+卖身
+底栖
+养身
+红门
+评出
+凡尔赛
+输赢
+姥
+1852
+822
+款第
+鬼才
+打假
+灰黑色
+康桥
+夜来
+原种
+涵洞
+类目
+FPGA
+桁架
+春城
+求人
+香香
+蛔虫
+慈爱
+无盐
+海平
+鬼谷子
+苍松
+丹心
+蒸笼
+涨跌
+还击
+斟酌
+饭锅
+936
+答道
+瞻仰
+鼻梁
+1TB
+easy
+冰晶
+好歹
+南靖
+Place
+小梅
+电学
+1060
+乌拉尔
+脸型
+划船
+直奔
+BR
+일
+扎克
+厂址
+划算
+征召
+摆摊
+休整
+国辉
+改姓
+就义
+凯瑞
+大林
+腔镜
+螟
+回溯
+売
+铁拳
+笙歌
+Xi
+利津
+双雄
+大雾
+639
+七情
+咪咪
+OTC
+懒散
+漕运
+何日
+NA
+1310
+攀援
+升力
+大图
+726
+火场
+聋哑
+1780
+儒略
+房客
+안
+道术
+下台
+自生
+下图
+新起点
+mouse
+晶状体
+折弯
+Wood
+沾染
+踞
+Trojan
+作答
+审评
+领养
+睡衣
+就绪
+腰果
+难懂
+always
+灵长
+清点
+partments
+分度
+玉竹
+法兰克
+1722
+咸味
+洛克菲勒
+接龙
+带下
+仪征
+退职
+脱垂
+亳
+做题
+堵住
+963
+德福
+照着
+商都
+复试
+仓山
+常任
+华光
+眼袋
+北亚
+移入
+濛
+落得
+Human
+缺憾
+枣树
+水帘
+奇景
+852
+断头
+檄
+易受
+提款
+舫
+洛斯
+谢菲尔德
+威廉斯
+追风
+房中
+聊聊
+金朝
+IPO
+681
+盛衰
+南都
+管用
+响水
+孟买
+开府
+叱
+须臾
+叶落
+竖向
+宫门
+商鞅
+摩尔多瓦
+湛蓝
+机种
+力挽
+四化
+用光
+情场
+贝蒂斯
+水珠
+掣
+乌黑
+Maya
+新县
+海英
+黄门
+哲人
+蒜瓣
+哀怨
+船坞
+粽
+花菜
+Forever
+文正
+仇敌
+夏娃
+楼主
+躁动
+蜂鸟
+恵
+三思
+路北
+飞艇
+CIS
+阿尔弗雷德
+nb
+广厦
+悸动
+走投
+山巅
+屯溪
+偷窃
+告成
+细弱
+圆滑
+援建
+轰击
+传诵
+Du
+亚平
+Villa
+赔率
+前缀
+南粤
+两晋
+马迹
+消沉
+残片
+927
+作对
+豆皮
+桑园
+人缘
+软卧
+Joy
+百家姓
+马祖
+恶臭
+全本
+修文
+链式
+mini
+地大
+pre
+JCB
+在美
+山珍
+车友
+蜂巢
+流于
+果仁
+杯状
+抗敌
+盘踞
+名模
+富顺
+巴布亚
+人设
+二话
+滤光
+红砖
+搪瓷
+辟邪
+
+1580
+复建
+Collection
+退居
+盛放
+暴利
+cry
+慨
+V4
+法相
+万全
+邹城
+笑料
+灰分
+Check
+酒香
+掌权
+Ah
+梵天
+催款
+萧瑟
+光景
+好意
+单间
+外经
+藏獒
+身兼
+大荔
+原道
+晓辉
+镭射
+Sect
+飙升
+罗格
+来稿
+公法
+旗鼓
+安远
+女优
+印迹
+year
+Motion
+环湖
+下注
+阴森
+弗赖堡
+护套
+被控
+床头
+process
+宁阳
+酸洗
+Jos
+网纹
+今宵
+叫法
+贩子
+佛殿
+林学
+取值
+蛀
+1846
+良人
+松石
+北关
+要脸
+颠峰
+吓人
+芳芳
+靴子
+978710
+奔流
+通海
+黄体
+赤城
+plus
+凡响
+春意
+size
+asp
+中组
+Clark
+点菜
+白帝
+和善
+失火
+权谋
+works
+康王
+eae
+河湖
+戴安娜
+read
+胪
+颠簸
+mine
+依林
+情味
+伏魔
+字头
+五福
+刚强
+府邸
+异种
+本子
+even
+长恨
+军刀
+常宁
+晓军
+摸清
+反共
+灯会
+泊车
+赭
+沙棘
+技击
+赉
+过压
+小试
+头子
+满心
+投币
+C8
+菱角
+后援
+白羽
+四象
+工坊
+白釉
+覚
+吴语
+罪案
+辛普森
+should
+艳红
+心衰
+砧
+安德莱赫
+和硕
+小美
+装裱
+二三十
+妖狐
+两基
+九歌
+武打
+τ
+干劲
+彩蛋
+电焊
+刷子
+波束
+佐治亚
+河川
+雪橇
+博尔顿
+身故
+熬夜
+赛胜
+白兰
+曝气
+全传
+包间
+玄妙
+Share
+植根
+原平
+976
+3S
+桃红
+固结
+妖艳
+제
+心语
+欢欢
+顽童
+坦荡
+双肩
+堪萨斯
+冀东
+those
+共管
+出塞
+悬臂
+Harris
+安倍
+四壁
+挺直
+郜
+急躁
+飘散
+安于
+乐购
+低产
+臣民
+卡西欧
+计画
+斗破
+幕僚
+狼王
+创收
+隆隆
+埕
+堪比
+笈
+佣
+碎裂
+淘洗
+苋菜
+1035
+钦差
+南坪
+内源
+纳斯
+1837
+硒鼓
+零落
+莒县
+警觉
+击毁
+DTS
+响彻
+望见
+鹜
+62mm
+Yuan
+746
+ngsh
+加持
+九一八
+下嫁
+save
+包河
+锶
+散开
+威宁
+酣
+2mg
+夹缝
+在写
+驱除
+介休
+鲍威尔
+狠毒
+卷云
+歹
+孤城
+光机
+干戈
+蜡像
+彻夜
+梁平
+多尔
+etc
+矸石
+1710
+询价
+猷
+恶龙
+花筒
+梁祝
+dBm
+rain
+恩情
+盾构
+山岗
+黜
+高淳
+竹竿
+5K
+钟形
+庇
+Vincent
+健将
+天顺
+鸿钧
+送行
+727
+goes
+文英
+虎皮
+宿敌
+鱼米
+花店
+克拉玛依
+康庄
+兼修
+兰西
+佗
+率真
+邦德
+灯谜
+579
+篱笆
+宕
+腹股
+受骗
+24000
+奇谭
+披风
+粗略
+眩目
+罪魁
+制鞋
+紧身
+商道
+燧
+论断
+贵妇
+great
+夙愿
+挥毫
+鱼池
+네
+蓝星
+闲情
+景深
+名媛
+单兵
+转用
+硬脂
+民勤
+单片
+浣
+变造
+煤油
+唐卡
+制衡
+烽烟
+行行
+天虹
+文职
+泽尼
+杂音
+连衣裙
+划策
+松阳
+래
+燥湿
+胸腺
+老歌
+翻翻
+848
+676
+球拍
+五五
+梵高
+头领
+啮
+server
+桥涵
+案板
+80mm
+北冥
+平负
+撕开
+大历
+每套
+标线
+批注
+974
+海相
+花甲
+十首
+小圆
+832
+渔具
+三秦
+鞑靼
+香附
+户县
+宁安
+氯苯
+ni
+常平
+恒山
+竹荪
+泽泻
+尝尝
+圣心
+野史
+发家
+顶板
+沿路
+殉
+振东
+巡展
+4700
+京报
+新津
+审验
+Joseph
+蚝
+698
+暴乱
+1603
+717
+河洛
+ing
+驻防
+驴肉
+xu
+Object
+布鲁克林
+赃款
+甲申
+南齐
+绩溪
+品性
+武都
+雄奇
+华能
+公测
+铣刀
+三寸
+AL
+献计
+4episodes
+身教
+乙女
+礼盒
+Plant
+发抖
+12014
+蒴
+颉
+多花
+金相
+薯类
+贝司
+二百六十三
+1011
+瓜分
+红松
+挥动
+1021
+长片
+耐蚀
+여
+茶碗
+奇袭
+誓要
+英皇
+艺体
+南巡
+腺瘤
+云林
+费耶诺德
+铋
+轸
+836
+慢火
+神游
+官制
+gun
+1847
+色相
+炊烟
+硫代
+主子
+吸油
+外边
+莎草
+查实
+1835
+归顺
+剃须
+淞沪
+突触
+意即
+百怪
+文秀
+926
+瘫
+大振
+殒
+回回
+防霉
+梅园
+摇杆
+嵌套
+拆毁
+册数
+绿茵
+微酸
+煲汤
+山岳
+V8
+献帝
+咸水
+西端
+return
+含金
+943
+Lewis
+医大
+平仓
+964
+侠侣
+一三
+福气
+Larry
+飘过
+까
+凯歌
+翠微
+钟状
+蛋挞
+三井
+包养
+性教育
+舞狮
+病株
+摊派
+德斯
+Antonio
+面包车
+eye
+驾崩
+登载
+凌波
+错爱
+南雄
+提货
+共聚物
+把好
+being
+伊索
+忘川
+编发
+4
+三弦
+5km
+粮草
+消去
+后传
+MH
+农安
+1849
+结义
+台江
+镗
+哨兵
+灵台
+贡生
+Browser
+财主
+镖局
+堆砌
+沃特
+军功
+操刀
+侈
+牧歌
+弱视
+mission
+沾满
+辞世
+大手
+公债
+畅谈
+硅藻
+脉脉
+每块
+中粮
+绵长
+霜冻
+赘
+瑶海
+球门
+恒河
+职官
+旁听
+赞比亚
+摸摸
+降魔
+痴痴
+聚四氟
+何首乌
+正对
+文彬
+山场
+潮人
+America
+朱氏
+Agent
+崇左
+由纪
+总店
+夜宴
+枯枝
+766
+木棉
+猎鹰
+逃难
+一早
+937
+故道
+洋务
+知彼
+侨乡
+福禄
+懈怠
+双鸭
+1030
+闯过
+故作
+做错
+5D
+此行
+听完
+太谷
+杨庄
+越战
+南澳
+涂刷
+横纹
+救火
+黄浦江
+下边
+未几
+木贼
+低声
+盐度
+补钙
+女童
+x1080
+SWF
+1012
+房改
+寒暑
+首座
+失手
+상
+gotta
+河套
+勿忘
+代入
+脂类
+高质
+蟠桃
+生面
+手辣
+988
+面带
+凭空
+上善
+滑轮
+支原体
+幽香
+磨机
+磨坊
+Who
+正方
+病患
+仿照
+蔚然
+氰酸
+尾随
+急促
+众望
+广益
+826
+细作
+癫狂
+格兰仕
+1818
+收放
+水坝
+静默
+阿诺德
+远日点
+赛制
+豫州
+皮包
+抵债
+斐济
+ACE
+甄别
+曲名
+裱花
+箨
+水桶
+迭起
+盤
+值钱
+灌肠
+碛
+1825
+Pierre
+威名
+退让
+Then
+红杏
+魏王
+止损
+辅仁
+小看
+1702
+band
+老幼
+三聚
+龙套
+蒙蔽
+鼹鼠
+自演
+Hop
+层状
+四路
+食府
+溶质
+HKEY
+mp
+错失
+城垣
+屯兵
+芊
+文治
+听歌
+伞房
+上汽
+樟木
+60g
+龙魂
+金顶
+放牛
+聚宝
+茶陵
+共事
+遇事
+mother
+足坛
+MATX
+信物
+端坐
+茶树菇
+克林
+JP
+事理
+绸缪
+反演
+独居
+旧梦
+如上
+灭失
+德黑兰
+无铅
+快件
+两架
+原罪
+穷苦
+上装
+物镜
+用血
+二百二十五
+双喜
+寒气
+发红
+国政
+小田
+味儿
+瑙
+鲑鱼
+拥戴
+lot
+失修
+欲滴
+forget
+江川
+单边
+柬
+王志
+祝融
+转炉
+从师
+G2
+谚
+炸鸡
+断言
+Method
+base
+873
+咏春
+another
+인
+凡例
+晓霞
+谷氨酸
+根植
+关进
+李阳
+篷
+丨
+后翅
+森严
+原审
+细碎
+1110
+桑叶
+巴布
+岳西
+一二三
+买单
+隆回
+过气
+ม
+诘
+锷
+10cm
+海航
+锡伯
+1192
+人肉
+watch
+抓走
+弋阳
+排头
+二百二十六
+质权
+虎穴
+阀座
+抽风
+武藏
+永辉
+809
+甲状
+苦海
+铝材
+1812
+失重
+奇书
+曲式
+对讲机
+内网
+厌烦
+为学
+新片
+成材
+喃喃
+刊本
+主枝
+喀山
+鹫
+岩壁
+去病
+5
+thatI
+小贩
+紫红
+JC
+FOR
+颜氏
+酸味
+1303
+骂人
+暗处
+调酒
+打完
+朱利安
+排涝
+玛莉
+入球
+842
+选调
+方兴
+胸骨
+关掉
+同窗
+dark
+犹在
+茶汤
+15mm
+收货人
+Machine
+打压
+帷
+土墙
+勿扰
+急切
+浮桥
+空难
+愉
+金科
+夷陵
+闽台
+AST
+进站
+攻守
+PND
+豪气
+technology
+隧
+飘然
+南漳
+酵
+交情
+Han
+加泰罗尼亚
+硫化氢
+硬膜
+Nm
+火系
+本心
+松土
+胆子
+use
+一展
+乐活
+西非
+南庄
+彧
+晶格
+斯塔
+port
+枫树
+检视
+开奖
+盆底
+十载
+米德尔斯堡
+明令
+小月
+拉蒂
+迭戈
+二人转
+Paper
+T3
+收成
+ñ
+伏笔
+0cm
+1819
+泰克
+飞驰
+郴
+参量
+硫胺
+树苗
+川剧
+金枝
+水合物
+富硒
+1130
+2345
+徐家汇
+廪
+磨炼
+椎管
+顶棚
+水肥
+入网
+妥当
+杀入
+1610
+太清
+小王
+畑
+失措
+幔
+发火
+HA
+转头
+收尾
+肥肠
+香芋
+赚到
+铅酸
+讨价
+电镜
+照本
+外敷
+Allen
+称颂
+黑豹
+安身
+洛奇
+袭人
+学案
+渗流
+689
+高盛
+咸鱼
+善行
+盛京
+Bang
+落空
+落败
+山鸡
+创举
+针头
+下笔
+过道
+绝迹
+纯阳
+抹杀
+南向
+灵丹
+637
+名车
+底漆
+御剑
+百慕大
+茜茜
+拉德
+轮番
+郡守
+勰
+兼用
+宋庄
+翻云
+burn
+簇拥
+梅花鹿
+cast
+威逼
+合流
+井陉
+上扬
+649
+天衣
+愍
+fuck
+906
+强暴
+早报
+通幽
+忘忧
+伐木
+清贫
+受制
+面谈
+血洗
+踏踏
+毛刷
+小黄
+品评
+潢川
+条幅
+WHO
+拐卖
+显眼
+报效
+翦
+造景
+台基
+生铁
+加味
+亲缘
+孕产
+翻越
+奴婢
+Andre
+坐镇
+心慌
+智取
+郗
+ex
+多夫
+贪图
+暴涨
+G1
+轮作
+缆车
+657
+易碎
+广济
+补遗
+计策
+瑞特
+童真
+盘县
+明灯
+凤台
+美兰
+货船
+芽孢
+谋士
+五河
+余庆
+2W
+佳句
+算出
+Child
+摘得
+海拉尔
+家屯
+现址
+994
+崇阳
+7800
+频发
+世情
+免试
+卡昂
+凤爪
+色狼
+天剑
+大元
+暴虐
+一栏
+五粮液
+木槿
+酸软
+封禅
+处士
+莫高
+高呼
+巴伦西亚
+问号
+香醇
+军旗
+膝部
+卡罗
+CAN
+天罗
+名闻
+大跃进
+难堪
+Yong
+稳产
+亚克力
+泯
+摩尼
+1360
+老店
+发愁
+招魂
+铭刻
+爷们
+富春
+craft
+借力
+阡
+996
+牛尾
+一掷
+9cm
+西进
+非得
+以图
+Beyond
+学姐
+款款
+木属
+篁
+数显
+打中
+阿克
+小楷
+初战
+训导
+狭路
+886
+尘暴
+拉科鲁尼亚
+郊游
+史迹
+骇人
+星斗
+扶沟
+知悉
+Cole
+文汇报
+浓淡
+双溪
+弯腰
+车顶
+达官
+要挟
+枪炮
+悦读
+休宁
+投手
+新宿
+丘比特
+mmx
+雏鸟
+1720
+ledge
+猪笼草
+两党
+1760
+UA
+Yu
+嬉笑
+甾醇
+미
+脑残
+赤芍
+1841
+Douglas
+画布
+乐理
+好动
+와
+字根
+MIC
+擂
+惊现
+税后
+争辩
+外长
+962
+站稳
+PR
+单程
+六腑
+心皮
+和光
+藤蔓
+启功
+菜籽
+浮屠
+凤冈
+宝鉴
+礼宾
+李斯特
+肉酱
+准绳
+Xu
+家学
+1105
+呼风
+袈裟
+七百
+唐王
+豹子
+check
+裕华
+妲己
+碧血
+Willia
+迷踪
+轧辊
+映入
+弗拉门戈
+接球
+1522
+外道
+紫花
+槐花
+摆渡
+中越
+kins
+污秽
+泼辣
+智谋
+本分
+康宝
+苗种
+布雷西亚
+伸长率
+RO
+俯首
+海洛因
+轧钢
+殊途
+观海
+ku
+晓春
+足足
+Chine
+佳境
+箴
+百花奖
+三木
+677
+吁
+排演
+良久
+爱子
+709
+茜草
+战犯
+枳壳
+滚刀
+SHOW
+骠骑
+周折
+殉国
+具象
+唱功
+持平
+局外
+南浦
+基诺
+21000
+9500
+庆丰
+玉柱
+偏旁
+红粉
+Airport
+光州
+二氧化硅
+夜月
+1104
+秋葵
+Dead
+虎山
+女兵
+深耕
+东瀛
+wait
+挖坑
+门卫
+Mario
+野牛
+猪骨
+Cell
+两腿
+流线
+把持
+画龙
+异人
+明理
+讫
+降龙
+头饰
+金塔
+山道
+龙泉驿
+鼓声
+死活
+跳板
+颗颗
+残害
+欢庆
+量级
+大镇
+了却
+限速
+提法
+金线
+飞马
+紫竹
+雨淋
+体改
+月桂
+恣
+玩儿
+脚部
+偏瘫
+水磨
+乐东
+后唐
+缝线
+初赛
+发端
+日食
+彼时
+后边
+泄洪
+唇瓣
+斡
+苍老
+782
+佛堂
+劳损
+少尉
+寺塔
+SX
+鱼缸
+欠佳
+B12
+润湿
+前场
+过冬
+郑码
+贤妻
+干性
+手风琴
+钯
+忿
+义齿
+缘份
+桥段
+assistant
+进给
+苻
+火箭炮
+身段
+Listening
+功业
+宫室
+崆峒
+used
+麝
+玉珍
+LW
+塞尔塔
+破茧
+管状花
+白旗
+Words
+华新
+相认
+红卫
+冬春
+炭黑
+作恶
+然则
+鲉
+려
+ก
+法子
+触目
+开阳
+波恩
+jun
+回春
+浩荡
+沙参
+枝江
+761
+官邸
+干红
+沙锅
+Christ
+道子
+风儿
+韦尔
+传颂
+峒
+民防
+R2
+获知
+酋
+羽衣
+举国
+旋即
+捉拿
+永吉
+大龄
+磨擦
+诉权
+常乐
+内因
+她家
+夸克
+闲居
+无明
+酸化
+米特
+西峰
+平远
+翌
+shit
+Pb
+梵蒂冈
+看戏
+亚伯拉罕
+龙港
+启智
+789
+借调
+成事
+颂扬
+AG
+866
+牙医
+TG
+泛起
+外翻
+金狮
+APEC
+0100
+随和
+金身
+琳琳
+配餐
+自愈
+Mail
+长矛
+1140
+用事
+狐狸精
+鞋类
+龟山
+持枪
+深仇
+罐子
+地网
+1201
+G3
+脚底
+踊
+缔
+桩基
+油井
+马洛卡
+生词
+警力
+Xbox
+中生代
+ALT
+听雨
+装束
+街市
+对开
+led
+了事
+违禁
+ok
+成鱼
+谤
+请愿
+片场
+下调
+696
+陆战
+留出
+磷化
+管区
+分兵
+宝洁
+战船
+腻子
+到任
+日电
+Bernard
+二百五十
+Justin
+在身
+母系
+剖腹
+PD
+摩斯
+出钱
+埋入
+薪金
+重见
+771
+衬衣
+i7
+顶天
+短裤
+涸
+虎视
+并立
+撃
+深谷
+比肩
+类聚
+Steven
+9800
+政客
+霞光
+庙内
+中发
+红歌
+布吉
+人眼
+警员
+哈拉
+泳装
+萨特
+遗愿
+取法
+难言
+耒
+SECAM
+警用
+Am
+惨淡
+空头
+海瑞
+速效
+论争
+盘片
+麻子
+喹
+跟班
+text
+剑门
+灵岩
+大湾
+沃特福德
+猜忌
+水法
+1402
+晦涩
+校生
+开锁
+划伤
+管径
+Garden
+rough
+猖獗
+美华
+欢心
+EMC
+元曲
+东夷
+1314
+Future
+鞍钢
+表盘
+清亮
+张大
+2RS
+斯特拉斯堡
+融水
+香干
+12015
+文脉
+鸳
+三晋
+Fan
+桉树
+老爷子
+吐温
+F4
+3GHz
+赶超
+建好
+刺伤
+场长
+跃迁
+河湾
+五颜
+夜莺
+造作
+剂子
+Year
+三百六十
+贱人
+바
+小资
+松香
+名典
+氰化
+洪峰
+监利
+笔会
+special
+五十余
+葱茏
+正线
+南史
+chance
+苦于
+长眠
+饲喂
+蜂拥
+SUN
+深绿
+bye
+小光
+岷江
+牛皮癣
+亟
+卓别林
+风挡
+roidOS
+物探
+手性
+首战
+数名
+组编
+望海
+由於
+乐果
+浏
+000000000
+秦代
+乐亭
+浮萍
+Village
+雅居
+807
+招远
+东巴
+结拜
+磁体
+泯灭
+눈
+车场
+盛赞
+黄平
+938
+skin
+商榷
+裂伤
+同归
+华信
+金箔
+1115
+雪崩
+Middle
+应战
+斗门
+请进
+炒面
+丰泽
+书斋
+might
+hope
+相投
+隗
+647
+多道
+半壁
+滚水
+舞阳
+填土
+开锅
+巍然
+内伤
+葱油
+射精
+警花
+噜
+二百四十七
+二百一十六
+小丸子
+玉峰
+伯顿
+红玉
+驾照
+792
+塾
+文种
+天路
+goodbye
+布设
+作主
+SMT
+顺昌
+About
+特洛
+败给
+休斯敦
+鄂伦春
+临泉
+真命
+长平
+丰年
+都匀
+花桥
+恬淡
+导则
+两队
+扬言
+QC
+濉溪
+948
+标段
+征象
+受雇
+里加
+女贞
+祁门
+地界
+吵闹
+阿城
+properties
+乔尔
+诗派
+nana
+格陵兰
+连理
+中元
+组委
+1805
+详图
+滋事
+ESP
+限价
+杂谈
+cat
+県
+抚平
+Wave
+飚
+桂英
+珩
+点穴
+2S
+牛市
+索性
+利己
+0mouse
+抗虫
+挺立
+点水
+黄忠
+嗽
+人静
+信源
+波多黎各
+回程
+店家
+Nothing
+单县
+组胺
+擦洗
+和服
+公权
+出勤
+说课
+蜂胶
+岁岁
+围场
+穆勒
+汤米
+优品
+车工
+车骑
+孟津
+出票
+网校
+woman
+65536
+华阴
+降服
+传家
+龙州
+酥油
+路标
+Prince
+骁勇
+阿德里
+三棱
+拌和
+比亚
+韦德
+Active
+青翠
+新乐
+假借
+称雄
+汉寿
+佩恩
+裁员
+数论
+拉扯
+国电
+津市
+心急
+只顾
+Argentina
+底本
+过眼
+中泰
+廉江
+隔板
+691
+安特卫普
+莫属
+卡罗尔
+军情
+qu
+化物
+Change
+鸮
+位子
+四叶草
+下楼
+上高
+庄主
+老伴
+密尔
+蛋蛋
+扎西
+科班
+5MB
+ぁ
+正装
+福斯
+查尔顿
+门神
+Feng
+输油
+森森
+大官
+新剧
+人知
+893
+缓急
+蘖
+Champion
+1843
+茅屋
+细毛
+如影
+沃尔夫
+制售
+调心滚子
+汛
+书报
+辽沈
+东昌
+C12
+厅局
+指头
+原煤
+肯特
+澄江
+从艺
+东四
+欢欣
+亮光
+黑石
+成教
+洋行
+汇流
+稀罕
+捧腹
+游憩
+辛集
+燕尾
+跳远
+吉日
+深省
+亚泰
+4M
+圆顶
+地裂
+碱基
+珍重
+环宇
+流涕
+豪森
+点将
+二百多
+CRT
+12011
+729
+分拣
+措手
+走火
+领奖
+祭酒
+斥责
+狐妖
+⑻
+分馆
+灰黄色
+Р
+1085
+灵枢
+睢
+Album
+涨价
+怡然
+1640
+西青
+晚风
+斯普利特
+过氧化物
+CODE
+上上
+乌干达
+扫帚
+退却
+茶室
+脑后
+小虫
+六祖
+尼姆
+Resort
+IntelGMA
+任城
+揉搓
+铆钉
+重唱
+高时
+时宜
+5700
+画栋
+调回
+黑头
+提质
+食量
+倍频
+百户
+雷声
+看花
+收紧
+今古
+商科
+私语
+湖景
+密实
+test
+谷城
+初速
+加封
+短剧
+劲儿
+退行
+玩意
+一朵花
+菜场
+抹灰
+相府
+OLED
+踏遍
+娶妻
+拐弯
+茶厂
+利马
+淡忘
+上溯
+敌情
+工本
+右卫
+收编
+架桥
+San
+亓
+Bay
+花篮
+1503
+汽化
+少府
+萨格勒布
+da
+土族
+921
+建水
+纳兰
+EB
+石泉
+应龙
+忠烈
+偎
+Re
+志国
+膨化
+雷亚尔
+陆川
+西楼
+苏门答腊
+x800
+新街口
+献唱
+博兴
+南市
+修神
+数倍
+cos
+千手
+助长
+Wonder
+练级
+右旋
+棉纺
+水坑
+女中
+果胶
+1222
+获选
+骊
+竞买
+2250
+1621
+俗套
+亚美
+光武
+放声
+拜金
+圭亚那
+赶集
+马虎
+帙
+贫贱
+风道
+心绪
+哔
+平邑
+ns
+YouTube
+二百一十二
+到校
+小可
+惕
+Limited
+皮棉
+碑亭
+瞎子
+小斑
+僵局
+东魏
+尤溪
+外方
+上地
+ball
+道别
+揖
+饭馆
+自若
+牧童
+扁鹊
+慈母
+崇州
+纂修
+倒卖
+显贵
+盖地
+河豚
+dong
+正畸
+告状
+房舍
+牢狱
+串珠
+李娜
+瑰
+太炎
+成灾
+宜兰
+鹄
+大公报
+莲塘
+拿去
+北行
+农经
+Tim
+828
+Bad
+盏灯
+长裙
+克曼
+丰顺
+满眼
+德智体
+天冬
+左臂
+魔镜
+遒劲
+医改
+念书
+土特
+信手
+新发
+俊朗
+苍溪
+龙翔
+临沧
+河堤
+耐德
+通城
+黄叶
+大亚湾
+甑
+MUSIC
+deep
+安新
+以礼
+Michel
+开窗
+色度
+汉森
+悚然
+偏斜
+捣碎
+佩尔
+笨重
+清官
+华诞
+918
+有福
+扎扎
+噎
+撼动
+ヘ
+西直
+比方
+指路
+林带
+国兴
+护坡
+会宁
+于都
+涨幅
+病员
+回过头
+Gray
+苗条
+金辉
+熘
+清肺
+西布朗
+cell
+斯卡
+十亿
+1013
+小气
+押韵
+挂面
+拾起
+区外
+积攒
+Remix
+流形
+咳咳
+歙
+散户
+尧舜
+勇夺
+射电
+层析
+多晶硅
+能行
+地望
+祈愿
+一花
+8M
+汉娜
+㎜
+加价
+死别
+七品
+红云
+宜家
+气垫
+绚
+浴池
+淮扬
+1302
+壁炉
+重读
+石蜡
+清音
+阿米巴
+维诺
+多难
+佥
+ther
+频带
+奥兹
+주
+临行
+递质
+梨子
+才知
+柔顺
+0T
+八达岭
+明志
+武穴
+仙缘
+群居
+毛尖
+寻人
+南坡
+支取
+香醋
+闲事
+抢得
+Money
+羊水
+鸟瞰
+苁蓉
+business
+闽北
+1064
+迟疑
+搜罗
+礼器
+读史
+中电
+女鞋
+雀斑
+卡夫卡
+须弥
+968
+免遭
+芥子
+弊病
+地市级
+交游
+拉起
+镜湖
+肌注
+口舌
+新址
+前兆
+车模
+挟持
+助词
+小舟
+谢意
+戛
+俸禄
+扣分
+K歌
+1703
+强权
+存留
+尼诺
+道行
+默然
+杰尔
+深红
+焙烧
+透亮
+红发
+鱼汤
+文林
+合意
+芦笙
+组员
+竞拍
+士林
+二百零三
+Price
+琬
+惠灵顿
+大加
+淋病
+啮齿
+蚊帐
+国信
+可立
+夭夭
+黔东
+台灯
+swith
+蝼蚁
+戏楼
+小口
+点名
+岔路
+guy
+此卡
+圆心
+主茎
+奥格斯堡
+元婴
+育肥
+N2O
+聚首
+镜花
+入京
+风压
+雨过
+大溪
+麦克斯
+面饼
+脾脏
+锚杆
+计征
+Physics
+德江
+EM
+兰考
+移情
+布里斯托尔
+靖宇
+脐带
+青史
+紫气
+三百余
+使然
+红塔
+麦加
+气态
+莱德
+刍议
+锡林郭勒
+conomic
+投案
+下身
+鲁菜
+家岭
+XI
+起跳
+北里
+竹枝
+推入
+同株
+总评
+官位
+唾弃
+归因
+拉卡
+晚稻
+工位
+出息
+外派
+Herself
+23000
+CFL
+罗庄
+断线
+百亩
+辣妹
+鄞县
+2102
+文斯
+大热
+当铺
+采花
+楼内
+知事
+举杯
+壬辰
+萨瓦
+神化
+896
+保底
+Guang
+半圆
+破败
+鸟鸣
+村居
+1795
+兴县
+薛家
+蒲城
+得救
+MACHINE
+原由
+ㄖ
+西环
+黑土
+胆量
+益民
+州级
+句句
+挞
+战利品
+一九
+唐河
+逝者
+板机
+1521
+659
+Beta
+强生
+LOCAL
+疲乏
+要冲
+什邡
+关外
+清空
+路遇
+nt
+DAC
+抱紧
+寂灭
+雅士
+差值
+利达
+克莱斯勒
+浅笑
+蔡甸
+鲁道夫
+嗡
+俎
+偶数
+锏
+1623
+翠微居
+求援
+力大
+胶印
+政研
+鬼影
+库珀
+本原
+阿什
+多式
+礼遇
+缮
+还债
+夹紧
+youneed
+榛子
+Studios
+维达
+届中
+单模
+葱郁
+弗里德里希
+后侧
+莹莹
+古来
+见鬼
+Think
+Ch
+薏苡
+核果
+万盛
+劫匪
+划一
+专版
+office
+Lake
+射中
+恒生
+景仰
+蕨菜
+SJ
+千帆
+简略
+乳粉
+阴湿
+龟兹
+九种
+阴间
+陶片
+癞
+鸡冠
+fora
+短波
+药量
+share
+浆液
+近世
+缱绻
+天城
+银海
+责编
+Notes
+鹟
+无界
+香洲
+波鸿
+伏特
+粑
+威龙
+野猫
+至亲
+吉州
+值守
+派发
+偷看
+旺角
+做作
+И
+火源
+兴庆
+家道
+鱼苗
+nough
+Track
+图录
+武生
+钢带
+汾河
+布伦特
+台语
+订房
+作乐
+站位
+德昌
+1839
+738
+奇趣
+敌手
+AUC
+云顶
+沉痛
+小一
+卡内基
+川陕
+纵情
+Fried
+高居
+Ex
+小站
+普兰店
+进贤
+陕南
+冰片
+买车
+鲫
+义士
+表姐
+发丝
+领海
+枯叶
+1204
+chu
+分列
+7300
+网架
+长林
+蹴
+此言
+卡梅隆
+宝斋
+助产
+专权
+EL
+羞耻
+玉英
+骤雨
+786
+新声
+9787500
+树势
+疱
+锰酸
+泰宁
+好胜
+冬冬
+戸
+拜伦
+武城
+内销
+摸底
+招工
+泽州
+烤鸡
+覆没
+CJ
+hydroxy
+铌
+平阴
+涂膜
+优属
+Rap
+丐帮
+阻截
+定稿
+二百零四
+卫辉
+U20
+鳜鱼
+Beauty
+Terry
+若非
+枫香
+玛莎
+修武
+半坡
+扭力
+奥林
+情志
+盯人
+兴修
+退隐
+東
+Juan
+7600
+授牌
+修编
+掾
+文辞
+伽蓝
+奥尼尔
+达达
+私企
+ALL
+Dean
+776
+巴德
+月老
+再临
+按察使
+镕
+纠风
+正弦波
+moment
+军备
+皖北
+柱形
+洛城
+拧紧
+≈
+1009
+银灰色
+野营
+鬼屋
+二百一十四
+徐闻
+小军
+连生
+海王
+眼珠
+垂柳
+强项
+切沃
+如茵
+丽都
+高氏
+学理
+恣意
+杰拉德
+四联
+火箭弹
+槿
+顺延
+锦华
+织金
+渡船
+求求
+地力
+绷带
+译码
+祭典
+几内亚
+布拉加
+闽中
+天威
+云水
+网银
+小群
+捷运
+泡芙
+1323
+STAR
+静压
+长高
+肃清
+药厂
+内测
+阳澄湖
+武宁
+市集
+芊芊
+诋毁
+白兰地
+硐
+梦到
+黑莓
+遁甲
+皮孔
+兵权
+三院
+洗面
+病魔
+共聚
+当官
+未得
+contact
+水政
+剡
+标有
+图们
+未接
+痴狂
+外运
+例文
+汉室
+肉丁
+无字
+素以
+3W
+肉搏
+983
+女单
+隐逸
+禅定
+水旱
+和珅
+轻功
+啖
+陟
+各组
+木版
+婆媳
+灵珠
+针线
+黑米
+穿山甲
+米醋
+望城
+Crystal
+率队
+ɪ
+脚踢
+德性
+播映
+液力
+1205
+北环
+箱式
+山雨
+奥斯陆
+议和
+真一
+其四
+脑室
+搽
+巴格达
+闲聊
+圈养
+回信
+过问
+肩周炎
+补短
+强攻
+1622
+拨动
+难辨
+漏气
+求购
+卡洛
+天方
+田庄
+Records
+奶瓶
+小街
+1301
+经合
+大船
+比色
+进餐
+眼线
+赛亚
+str
+复眼
+SSD
+赃物
+Productions
+县府
+逆风
+master
+养蜂
+整块
+哺
+线装
+逃窜
+挤占
+XD
+叮嘱
+罕有
+禽蛋
+要人
+Analysis
+gold
+争做
+空域
+专车
+Hip
+miss
+凡夫
+街角
+京口
+猎户
+直白
+宝龙
+虐恋
+斯诺克
+幼枝
+两仪
+肥西
+巴哈
+水样
+竹简
+1127
+希拉里
+背投
+living
+球馆
+XL
+营巢
+跌入
+半截
+爱德
+安源
+涡阳
+疏肝
+加斯
+唏嘘
+本末
+建强
+Apartment
+Rain
+未闻
+瓦特
+回单
+肥瘦
+大野
+怪诞
+佩德罗
+雪佛兰
+1045
+美琪
+两节
+一千零一
+凌厉
+刻薄
+卵泡
+教唆
+橘皮
+吸声
+志宏
+敬爱
+定都
+施药
+纷至
+Ⅶ
+toget
+已逝
+保尔
+荷兰豆
+马库斯
+皮尔
+制表
+支那
+盟员
+胚芽
+两房
+memory
+紧要
+Museum
+蒜茸
+硬朗
+孟婆
+怨念
+王刚
+粗心
+MJ
+芳菲
+es
+志气
+义县
+TH
+全站
+形貌
+渔翁
+登科
+干群
+持仓
+腹内
+未然
+二百二十四
+短时
+吸管
+科拉
+征婚
+人迹
+三圣
+临场
+小写
+班机
+国奥
+挺好
+迁址
+DAT
+界首
+承压
+之状
+担子菌
+王八
+织机
+隆昌
+Living
+菩萨蛮
+中金
+歇尔
+858
+雌虫
+化学化
+文豪
+提防
+霭
+苦参
+圣德
+Sean
+DOTA
+二百五十五
+钱庄
+孝经
+1502
+1615
+言简
+Republic
+职高
+岛津
+得手
+笏
+高要
+左派
+若离
+除害
+Industry
+共通
+硝化
+航路
+问津
+增塑
+扁担
+斑斑
+中朝
+Boys
+惠帝
+蓝蓝
+种菜
+摆布
+丁酸
+田氏
+仪礼
+警衔
+幸亏
+х
+卸任
+痿
+兵临
+馔
+欤
+互信
+祭天
+拷问
+探源
+同僚
+0017
+全篇
+耘
+自新
+送走
+别怕
+轰鸣
+挨打
+Library
+秘法
+苛
+⒌
+色温
+深意
+敦厚
+鸿门
+搞错
+mith
+心切
+酒酿
+Alice
+马六甲
+竹筒
+飞鹰
+凡尔纳
+共工
+效法
+除臭
+糖衣
+右腿
+引燃
+观花
+地租
+坝顶
+俗世
+鳃盖
+顾家
+活活
+养肝
+夕照
+噬魂
+震旦
+顺水
+下沙
+748
+吹起
+广南
+Ross
+1370
+玉玲
+天狗
+iP
+述职
+阜宁
+津巴布韦
+uk
+强效
+欧锦
+奥数
+40g
+EE
+PS3
+盐分
+冷杉
+面馆
+1804
+二百六十四
+出马
+剑南
+乐土
+登月
+汪汪
+嘎查
+浦发
+傲然
+Winter
+油压
+水化
+食粮
+甜言
+ol
+礼教
+花烛
+豪尔
+久病
+二百二十二
+擀面杖
+才气
+标出
+Br
+cp
+尾端
+银泰
+1814
+细数
+946
+两校
+仙术
+899
+边学
+影后
+同生
+步态
+network
+物欲
+unit
+门牌
+慢热
+皮草
+斫
+Country
+断断
+油然
+淳化
+磊落
+鄂温克
+战歌
+墓道
+托德
+都昌
+红十
+零担
+撤并
+放权
+存单
+莉莎
+进纸
+封堵
+1828
+俊峰
+干将
+956
+山塘
+死战
+古史
+统合
+収
+桥式
+克里特
+匿迹
+龙灯
+粤西
+一千万
+照壁
+Mix
+Material
+ATIRadeon
+开课
+行尸
+甘于
+非议
+保本
+祢
+欺压
+秉公
+指间
+Tang
+来啦
+至多
+gang
+1836
+争得
+减值
+嘲
+牵线
+Dan
+腺苷
+福尔
+松滋
+心源
+Proto
+两臂
+谷类
+翌日
+玉兔
+梅江
+世忠
+格尔木
+采茶
+春梦
+七台河
+伟民
+传情
+政风
+水泉
+彼特
+扑倒
+深港
+iS
+中化
+一丈
+塞斯
+专精
+颦
+917
+行囊
+飙车
+NB
+白洋
+救起
+灶具
+NS
+遗民
+盲点
+ping
+北口
+知母
+到时候
+制做
+毒死
+玉洁
+分词
+糅合
+执手
+藿香
+单张
+补选
+归心
+救救
+茎秆
+二百余
+漱口
+称做
+县名
+青蓝
+953
+吹响
+编年
+继父
+巴拉
+687
+雄鹿
+奈特
+禽类
+OST
+阿诺
+甲组
+商河
+可查
+photo
+1215
+0I
+专论
+乔什
+中川
+油库
+昌乐
+围住
+两小
+电喷
+南锡
+握紧
+驱虫
+赠予
+1480
+必得
+state
+η
+铜板
+大庄
+宋诗
+小异
+浔阳
+搭讪
+怜惜
+章句
+阿难
+种花
+兴文
+旌旗
+梁家
+弗尔
+货主
+灰熊
+幼子
+Dolby
+负率
+破获
+建园
+定损
+航标
+Roy
+856
+班会
+Charlie
+经籍
+呆板
+跑出
+Jennifer
+Building
+学宫
+德庆
+NFC
+三焦
+亲率
+褒扬
+弹珠
+科尔多瓦
+雎
+豉油
+芳烃
+中举
+1070
+北坡
+子路
+原样
+仡佬
+嘀
+三才
+量度
+印务
+野村
+841
+谈情
+毛骨
+邾
+往届
+发迹
+哈斯
+斩棘
+边沿
+人材
+三科
+液面
+尖峰
+镀铬
+S7
+每吨
+摩卡
+磨盘
+备查
+男单
+累加
+hp
+三杯
+连州
+发狂
+重地
+朝华
+Franch
+造山
+1834
+离校
+ぱ
+同伙
+威德
+辽金
+生硬
+临洮
+先兆
+魔杖
+势能
+Fern
+Alt
+清甜
+薏仁
+嫁妆
+铛
+乐高
+迤
+姬姓
+供方
+大庙
+马鹿
+皂甙
+经略
+阳平
+产蛋
+旧式
+续约
+River
+施舍
+省局
+秭归
+转送
+任劳
+一袋
+欢歌
+肱骨
+铜梁
+大骂
+终归
+书馆
+简装
+常开
+境况
+辉县
+国权
+血魔
+宝莲
+博德
+中选
+塞缪尔
+护发
+曼妙
+正轨
+胶管
+桎梏
+蹶
+口臭
+射入
+嘉华
+兑付
+农工商
+小寨
+全椒
+杂食
+科贸
+打量
+撕碎
+819
+屈膝
+岩寺
+烧死
+飞信
+严苛
+氤氲
+锦城
+靠背
+御览
+再如
+逐行
+双江
+发汗
+正东
+诺顿
+浙南
+掉头
+余量
+洞府
+想爱
+烟霞
+夏雨
+浓情
+Ghost
+﹪
+口香
+玉梅
+高跷
+屏东
+南星
+1075
+死生
+震级
+抗衰
+畜生
+梅雨
+惺惺
+经纶
+路和
+C2
+雾霾
+羌活
+莲池
+离愁
+围护
+慈利
+镇国
+职中
+二百三十
+开成
+回天
+日方
+朽木
+狂魔
+仓皇
+人氏
+wont
+Synthes
+天火
+夜班
+柳絮
+北地
+津南
+德龙
+1460
+亦庄
+奄奄
+葶
+主见
+肺病
+酰基
+拍片
+水浴
+定制化
+扬扬
+Volume
+1086
+双凤
+庙号
+Communications
+余脉
+drive
+故园
+帕斯
+kill
+UDP
+遍历
+伏安
+菜叶
+club
+肯塔基
+作客
+县官
+大熊
+氡
+冻干
+斯人
+step
+二百三十四
+肉饼
+Wong
+易发
+兼程
+赡
+洛南
+int
+牛人
+猛攻
+紧实
+译注
+老翁
+安好
+二百五十一
+佐夫
+嵴
+旱灾
+建制镇
+屡遭
+霸州
+Value
+一厅
+漆膜
+远扬
+帕特
+考拉
+宁河
+湿气
+肤浅
+牟平
+溜冰
+平泉
+水龙
+垸
+硬卧
+蜜饯
+FF
+总汇
+Marshall
+零零
+忍痛
+棋类
+雄鱼
+痂
+void
+她来
+顿挫
+欲求
+调音
+温存
+金阳
+莲华
+曝晒
+成倍
+重负
+颍川
+州郡
+藉此
+康奈尔
+真凶
+天眼
+明成
+Medical
+草蜢
+systems
+美雪
+英属
+加利
+鳞伤
+白肉
+见状
+配菜
+高塔
+effects
+抟
+幻梦
+生父
+票选
+振国
+MicroATX
+FR
+冀南
+乡贤
+易水
+绎
+饧
+疏漏
+医方
+恒丰
+声息
+德隆
+缩合
+卡普
+散乱
+four
+진
+盐场
+1034
+满头
+萨雷
+2150
+窃听
+在目
+电表
+舞龙
+木桥
+扫射
+脱节
+测序
+信度
+开开
+1505
+注浆
+武备
+mmOD
+逐日
+庆功
+布莱德
+Field
+裱
+796
+建桥
+错案
+自爱
+拘禁
+甬道
+锚固
+小池
+维克托
+极值
+撸
+爬坡
+压轴
+868
+计财
+陈醋
+冷傲
+搜狗
+呢喃
+祖上
+Search
+熬煮
+义演
+停息
+刚体
+涂炭
+婚育
+入行
+心身
+神速
+H2
+云雨
+Days
+力帆
+697
+细度
+落难
+珠穆朗玛
+花束
+下地
+洪涛
+makes
+绝症
+缘于
+火箭筒
+香花
+嘹亮
+亚胺
+假性
+生态型
+全委
+哀求
+劲旅
+腾龙
+膦
+凤仪
+可也
+封盖
+工党
+押题
+诡谲
+制浆
+小受
+藏龙
+毛线
+放慢
+丝袜
+非礼
+小倩
+纷扰
+古建
+mi
+逃犯
+六千
+719
+璧山
+矫健
+输掉
+轻者
+简阳
+低俗
+Week
+1832
+阿吉
+巨鹿
+卡巴
+巧手
+稀世
+贰级
+湘阴
+肾经
+KN
+山涧
+running
+奶爸
+falling
+卷成
+浒
+定情
+悍将
+布达拉
+藏家
+伞兵
+时点
+相联
+龙首
+独门
+Jan
+五成
+富源
+似火
+五十多
+陵寝
+惇
+area
+俊逸
+莘县
+林峰
+存贮
+什麽
+包车
+大赦
+色变
+Amy
+剖开
+高墙
+Philip
+company
+伊川
+提笔
+青苔
+责怪
+品茗
+下陷
+整夜
+舶
+boys
+莫利
+物管
+向善
+鲲
+珠玑
+单晶
+立军
+需氧
+蛰
+腰背
+ง
+国医
+惨痛
+生离
+速滑
+스
+歌会
+网速
+定安
+砖头
+仪态
+分镜
+直译
+道外
+包覆
+讯问
+湿疣
+秘闻
+到头
+二尖瓣
+黑心
+初六
+拆开
+road
+圣物
+披荆
+垦区
+聚力
+青贮
+道真
+Soc
+议院
+烈酒
+仁德
+忠县
+太庙
+孤军
+毛驴
+骑射
+先决
+斯勒
+春联
+1816
+污渍
+志趣
+中站
+Hu
+暂扣
+BN7
+薪火
+夜蛾
+克鲁塞罗
+换回
+CS4
+队形
+结欠
+昂达
+奇想
+LE
+掌故
+戎马
+氧量
+天网
+油类
+集线
+教宗
+索罗
+汊
+纯种
+时区
+托比
+虎鱼
+大戟
+菲尼克斯
+冷情
+周天
+兜兜
+1826
+Ineed
+10M
+青釉
+消暑
+如数
+社会党
+Foundation
+817
+帕尔
+前因
+触媒
+909
+基岩
+劭
+1789
+道元
+难求
+煤粉
+赌徒
+席尔瓦
+耿直
+船闸
+油层
+鬲
+插手
+铮铮
+Shake
+龙胜
+利辛
+袖子
+巡礼
+堆肥
+Channel
+内史
+精血
+批文
+连冠
+匹夫
+幻世
+丈量
+柜子
+难事
+陈留
+试纸
+1122
+GD
+ν
+狂傲
+心伤
+博园
+恶少
+WASD
+小量
+秀色
+哈尔科夫
+中街
+战斧
+Oslash
+凯西
+1824
+退赔
+沙头
+wing
+刀枪
+空手
+回波
+冠脉
+申明
+软盘
+雄霸
+首开
+半日
+榫
+Т
+刘涛
+小书
+中农
+耐克
+加菲
+松紧
+庑
+霰弹
+Self
+柱式
+江面
+993
+抜
+01949
+Vision
+889
+港航
+胀满
+尘缘
+活灵
+mW
+凤姐
+1645
+白米
+毒液
+瑞昌
+多尔衮
+溢彩
+槭
+乐善
+罢休
+只此
+沪杭
+法身
+船山
+28000
+飘浮
+费内巴切
+琼崖
+糖分
+旺旺
+卸妆
+普渡
+html
+提拉
+金谷
+答卷
+国联
+压器
+塔内
+1423
+咿
+thus
+风发
+拜佛
+新曲
+利诱
+税负
+镇安
+华天
+地磁
+逶迤
+Beautiful
+250000
+热毒
+后肢
+898
+刺眼
+柳河
+科特
+升麻
+屋后
+黑幕
+4096
+香型
+渗水
+宗法
+车管
+1135
+后金
+老牛
+澄迈
+收储
+SNMP
+卡尔斯鲁厄
+合众国
+霍邱
+876
+女帝
+开票
+笑谈
+虫卵
+浇铸
+斯密
+爹娘
+气急
+龙南
+营房
+宛城
+布莱顿
+882
+连云
+加力
+打鼓
+万里
+福林
+green
+简捷
+散尽
+费县
+热流
+沉郁
+涨停
+洪波
+玉明
+玄关
+联储
+Bobby
+3u
+幽梦
+罐装
+T2
+办报
+beta
+二百五十六
+钦定
+8600
+菲斯
+Trade
+锦鸡
+诗稿
+通山
+驴友
+皎
+冠词
+Block
+金鑫
+杏子
+递给
+小苗
+半途
+艾略特
+快车道
+966
+克里米亚
+鸡排
+平添
+鸫
+庶子
+窥探
+迂
+元勋
+伊宁
+建辉
+平津
+调子
+水浸
+织锦
+Baker
+Ps
+龙文
+推土机
+多宝
+明皇
+闷骚
+圆融
+染整
+丝竹
+小海
+Asia
+史莱克
+每刻
+BNC
+because
+文军
+靖远
+市盈
+滑道
+黑化
+开县
+宝庆
+屠戮
+Suite
+补缺
+蓄势
+1018
+1155
+国清
+药科
+骨病
+2160
+即食
+达斯
+赓
+827
+柳江
+中日韩
+脱色
+牡
+﹕
+空穴
+CJK
+依兰
+肘子
+好手
+叫好
+跖
+码字
+举目
+1121
+惶惶
+冲孔
+普安
+一桥
+好酒
+两全
+思南
+三体
+麦弗逊
+攻取
+高利贷
+编绘
+圣子
+暖风
+国战
+绥德
+精悍
+加宽
+788
+词根
+里子
+90000
+东麓
+新装
+养女
+剑灵
+潜行
+复归
+奄
+名天
+1126
+立群
+踵
+雕饰
+奏响
+﹒
+shop
+1225
+分所
+高风
+彰化
+1054
+洗练
+眉睫
+白狐
+一臂
+神光
+批转
+嫩芽
+浸种
+1095
+九尾狐
+岗前
+瘿
+861
+Energy
+获颁
+景德
+遮荫
+GRE
+Arthur
+自封
+宜阳
+绝活
+开园
+鳌头
+Holiday
+搞活
+Fun
+衾
+跳进
+经穴
+家财
+Il
+外患
+高卢
+省际
+Mo
+致畸
+二百二十
+文忠
+外遇
+二百零九
+宦海
+风火
+仰视
+难缠
+knows
+收口
+花魁
+受用
+楚歌
+Tommy
+739
+枳
+如新
+猿人
+定案
+服食
+神宫
+亚历克斯
+中观
+马自达
+1523
+1275
+大利
+动弹
+骑乘
+恩泽
+刮板
+春林
+落雁
+布道
+blood
+较于
+渺茫
+国新
+坪山
+Morgan
+赫本
+Six
+暴徒
+西凉
+销路
+12016
+十景
+鞭子
+井底
+Times
+累犯
+空山
+达县
+红绿
+电声
+脱掉
+闺秀
+旧约
+Zn
+猫儿
+别来
+文士
+顺达
+1170
+补交
+论辩
+水塔
+用钱
+快照
+特拉维夫
+泊头
+巴人
+盘口
+精武
+鼠李
+ο
+泥巴
+轻量级
+加害
+同姓
+炙热
+does
+舞女
+忏
+戏谑
+纸杯
+丹徒
+眼内
+봐
+翠柏
+异军
+即把
+剥蚀
+TTL
+渣土
+蜜糖
+坡道
+夹板
+Б
+适航
+宾阳
+老山
+法罗
+附生
+爱妻
+Look
+安顿
+长垣
+雪糕
+卫平
+入账
+马步
+山民
+熔炉
+Inn
+野狼
+山岩
+雅加达
+砥柱
+隆德
+腈
+附送
+1124
+SLI
+控油
+然後
+近程
+首日
+色光
+括约肌
+v3
+锦瑟
+BABY
+轮盘
+银魂
+春明
+甜面酱
+广佛
+倒掉
+三百六十五
+P4S
+氙气
+引致
+八音
+续建
+武斗
+铺开
+打退
+鞭策
+海明
+渭水
+塔塔
+东园
+托普
+齿状
+隔世
+打牌
+厚道
+硬汉
+内化
+左腿
+婚俗
+低热
+公仆
+蝶舞
+野餐
+德鲁克
+粉笔
+驿道
+咸平
+苦笑
+坐车
+芭堤雅
+持节
+竹内
+钢构
+射洪
+外孙
+母牛
+镇平
+岁到
+绥靖
+酒窖
+衣袖
+桑梓
+先声
+多管
+虞城
+汇兑
+瑞星
+承继
+武隆
+献祭
+抛物线
+一盒
+完税
+烟煤
+全图
+篡位
+破门
+校核
+间期
+脱逃
+小步
+Topic
+噬菌
+劳改
+搬进
+农校
+theory
+穗轴
+电势
+圣天使
+民企
+Exchange
+踏步
+烟道
+砖墙
+腼腆
+暴晒
+唱段
+June
+女校
+学医
+自取
+遇刺
+0S
+涟源
+独木
+剥去
+肉厚
+川岛
+二百零二
+千代
+频域
+免得
+墨镜
+XT
+quality
+干菜
+外流
+禁食
+Н
+明码
+修练
+后周
+阿姆斯特朗
+弗拉基米尔
+睡梦
+社火
+湍急
+机敏
+南线
+1809
+上野
+正职
+二百零一
+换发
+至德
+中唐
+987
+水合
+朝臣
+大任
+任丘
+泰来
+哥斯
+0P
+二百二十九
+猛地
+西峡
+体谅
+破阵
+走秀
+石台
+玻纤
+庐阳
+入化
+征询
+假肢
+相持
+WPS
+忧国
+lose
+饭庄
+晕厥
+葬身
+发回
+新芽
+施密特
+16X
+爬升
+film
+捉住
+普利
+松木
+高昌
+达芬奇
+957
+兵士
+铰链
+首级
+伯尔尼
+Single
+新创
+其外
+苦读
+光鲜
+女妖
+午饭
+支局
+922
+欠据
+传质
+职校
+砂砾
+逼供
+点明
+AVX
+滘
+东源
+片式
+对位
+改道
+∥
+小英
+1790
+惮
+歌诀
+淄
+冲床
+歧途
+彩灯
+埂
+machine
+石牌
+内能
+掖
+芸香
+1601
+温县
+洛基
+岳池
+晋朝
+1065
+拖曳
+壁虎
+汤勺
+更趋
+蜜桃
+演算
+1501
+回落
+旗号
+跳高
+琥
+螺蛳
+突入
+effect
+团练
+受宠
+0C
+5G
+心魄
+师法
+瑞斯
+弯头
+商州
+治党
+拆借
+831
+威斯康星
+型男
+兆瓦
+俄勒冈
+长叹
+高悬
+大藏经
+此战
+dreams
+重施
+边关
+872
+丰乐
+Acer
+佳音
+小家
+青竹
+科创
+眈眈
+栖居
+叫绝
+火葬
+藁城
+卤汁
+湍
+包机
+弗朗西斯科
+龙子
+亡魂
+专指
+鬻
+奇效
+鱼皮
+黏性
+生财
+秘传
+正平
+口琴
+自持
+仙灵
+初四
+入座
+山猫
+星野
+虚妄
+报春
+吗啡
+26000
+三夜
+丐
+丰功
+拿铁
+金童
+南县
+1112
+养家
+战栗
+杂木
+僻
+行在
+甘井子
+贾汪
+隆安
+圆点
+三室
+退费
+滇南
+重击
+核子
+防寒
+紫草
+卢沟桥
+Op
+AVG
+后路
+铺满
+寿阳
+警匪
+猫眼
+重印
+愈烈
+669
+达利
+station
+普兰
+疫区
+二百三十二
+篡
+推演
+外耳
+细粒
+Guard
+喷码
+思远
+释疑
+副高
+1055
+铁芯
+关内
+酷酷
+市北区
+MOD
+EINECS
+呼出
+1775
+绿意
+Xeon
+th
+布洛克
+买断
+痈肿
+1410
+阔步
+饥渴
+冠县
+走好
+握把
+朱红
+建元
+造粒
+loved
+Common
+手头
+歪歪
+WP
+pain
+扇区
+蒙太奇
+post
+俏佳人
+甲类
+罗江
+万花
+Officer
+刑天
+分场
+改由
+丰登
+眉目
+野山
+增订
+Pier
+河镇
+振铎
+莎莉
+凝神
+玉立
+寄养
+哈罗德
+919
+妲
+只读
+SDXC
+沂源
+惨剧
+张敏
+凯伦
+亚细亚
+戏法
+并罚
+市南区
+瓦尔特
+八极
+骄子
+若想
+虫族
+会商
+金工
+软禁
+行经
+甘地
+道县
+双十
+普瑞
+民宅
+拉登
+NBC
+玛丽苏
+蟹肉
+顶礼
+孤星
+枪法
+济济
+种猪
+格拉斯
+弱酸
+房顶
+扶余
+魅族
+百鬼
+菟丝
+定员
+优厚
+家坡
+暄
+寄情
+夏朝
+若梦
+取悦
+青瓜
+移山
+su
+X3
+赏罚
+荧幕
+市外
+挑逗
+歌王
+超时
+1806
+别走
+幼女
+把戏
+桅杆
+小令
+断续
+1838
+Making
+国旅
+警句
+定速
+变奏
+悬案
+重归
+Adopted
+关区
+伽马
+埋设
+二百三十六
+段长
+导刊
+丁字
+局内
+骨膜
+伍迪
+接住
+1270
+埃米尔
+九峰
+迷津
+烧掉
+娣
+奉祀
+其责
+门厅
+劫后
+万荣
+飞弹
+黎川
+kind
+香艳
+茄汁
+春树
+豪强
+一木
+缠枝
+Room
+猖狂
+猜谜
+段位
+机台
+史官
+长啸
+飘忽
+note
+邗江
+王为
+修竹
+丝印
+巩膜
+笨笨
+闲人
+旅团
+存折
+Symbian
+新源
+胧
+愚公
+两世
+걸
+麦克唐纳
+林奇
+足矣
+洛河
+自卸
+making
+小胖
+同构
+承天
+产期
+星夜
+视唱
+姜子牙
+逻
+木纹
+初唐
+翻倍
+坝址
+烧肉
+尿布
+断然
+阿兹
+妻儿
+三连
+权杖
+赣县
+威廉姆
+蒙受
+炉子
+桓台
+配图
+金蝉
+IntelGMAHD
+mp3
+染病
+牛黄
+巨响
+救活
+白鸽
+曲霉
+熟睡
+梁王
+um
+旗人
+洛水
+全非
+操心
+去污
+辗
+胶卷
+分头
+宣汉
+谎
+大车
+岳母
+秦安
+倒立
+燎
+吟诵
+厢式
+二百三十五
+永德
+求教
+992
+起见
+惊风
+石羊
+散居
+2Kg
+奴仆
+破甲
+乌德
+信和
+万贯
+太仆
+反扑
+补体
+春游
+答曰
+滑移
+杨过
+善人
+南高
+一四
+阴历
+队内
+1094
+珲春
+童声
+花红
+罗源
+851
+惦记
+逼婚
+西康
+湖西
+一房
+田里
+礼制
+发条
+关东军
+凛凛
+成疾
+豆渣
+幼师
+花脸
+群集
+涂片
+南园
+五音
+唱戏
+้
+日升
+鱼露
+中资
+窥见
+冰点
+救药
+琼山
+魂灵
+沃森
+Short
+嚎
+纳什
+升旗
+source
+清秋
+咿呀
+海蓝
+腑
+DCS
+一骑
+莱因
+罗德里格斯
+1160
+内城
+争光
+攥
+粉皮
+旧房
+抽筋
+1799
+西德尼
+顺流
+制糖
+效度
+呈祥
+极目
+偶氮
+Parker
+诸君
+织成
+星君
+二百三十三
+批号
+拜尔
+黑皮
+转帐
+EDI
+孤苦
+迷魂
+三小
+海北
+管乐
+巧匠
+拓荒
+兵书
+永军
+SW
+SSE
+佛事
+二百二十一
+Pacific
+诺克
+尕
+12012
+try
+静力
+黎民
+混世
+七国
+督军
+1530
+两色
+Matthew
+巴比妥
+亚述
+身居
+嘉庚
+脱皮
+每笔
+预估
+灵公
+春阳
+UN
+肉用
+外稃
+扬威
+红唇
+伊尹
+听写
+詹姆士
+条石
+策马
+夹子
+679
+溴化
+Turbo
+对弈
+中置
+七步
+川味
+圣安东尼奥
+行礼
+≠
+胡杨
+疑案
+甩掉
+全数
+扬中
+连读
+拱手
+福泉
+落笔
+松脂
+沸石
+克伦
+看官
+别太
+命定
+醉翁
+大坑
+海味
+发霉
+筑起
+对半
+雪原
+吃力
+憎恶
+百济
+要职
+六天
+牢骚
+特钢
+捉鬼
+异象
+引自
+护军
+鄂豫皖
+tou
+直树
+吉尔吉斯斯坦
+建始
+独活
+隆平
+铁骨
+建邺
+违者
+Bull
+能者
+病危
+华彩
+四万
+SMTP
+小文
+欧拉
+诩
+史事
+来犯
+迥然
+淡季
+粿
+项城
+1P
+宫外
+冈山
+VT
+印加
+滴眼
+金茂
+侍者
+安培
+分节
+斗鸡
+赛会
+会演
+ARP
+驴子
+DG
+小样
+8L
+781
+呼号
+2456400
+书坛
+陆机
+晋州
+圣器
+药草
+剪彩
+北疆
+尼克酸
+怨气
+提任
+无爱
+哼哼
+沭
+茶碱
+扁舟
+房县
+若以
+赔礼
+邺城
+青少
+晞
+垂死
+诡秘
+3星
+失水
+庶人
+国术
+独坐
+伺
+932
+follow
+秋香
+中油
+气阀
+巨制
+1506
+会要
+引荐
+垫圈
+同堂
+临门
+体式
+Delphi
+财运
+金泰
+狄更斯
+犬只
+bird
+欢声
+库尔勒
+1510
+酸梅
+孝心
+佩斯卡拉
+麦迪逊
+Modern
+失学
+甘州
+下士
+打铁
+堵漏
+盟约
+近邻
+考卷
+生物圈
+拖拽
+驶向
+洗劫
+枝繁
+施政
+矛头
+灌篮
+怀素
+异色
+划痕
+0c
+寓居
+VOB
+选文
+奥丁
+门板
+华盖
+热键
+1670
+淅川
+Brand
+青鱼
+意赅
+Stay
+吊桥
+铁锤
+名剑
+搜捕
+并排
+胃液
+鱼人
+视场
+志成
+饭前
+奏效
+涿鹿
+天圣
+水淹
+辽国
+四块
+宜于
+巴登
+木棍
+二百七十二
+绡
+老王
+宝刀
+此方
+瑞克
+霍克
+调运
+Knight
+课税
+梦露
+常客
+跑动
+褪去
+H1N
+12017
+沪上
+沅陵
+厚望
+汪峰
+菜地
+CSI
+纸片
+伤及
+交所
+击碎
+海路
+蟒蛇
+玉凤
+10A
+联苯
+˙
+祛湿
+江安
+淹死
+x480
+加德
+寓所
+羞愧
+商誉
+渠县
+垫江
+Bean
+还贷
+现价
+col
+油灯
+sin
+formation
+论及
+ma
+走边
+实拍
+八尺
+火势
+泪眼
+据实
+创编
+把头
+上铺
+克明
+钠盐
+超模
+음
+磨具
+甘谷
+1074
+1405
+熠
+胸围
+活期
+小玲
+D2
+军令
+味鲜
+3L
+橹
+和和
+磁头
+耐烦
+宜章
+紫衣
+特例
+留用
+扫盲
+神机
+石屏
+足疗
+1106
+猝
+穆王
+Committee
+华尔兹
+嘱托
+展板
+小芳
+照常
+吸塑
+公仔
+胍
+醉卧
+要买
+一丁
+元阳
+C4
+柔嫩
+朔方
+抗逆性
+鹊起
+速龙
+恩平
+泉河
+百炼
+LH
+抽丝
+捡起
+鱼骨
+大项
+拳脚
+三友
+缸径
+头寸
+瑄
+圆白菜
+甲戌
+意乱
+IR
+英东
+举兵
+娇嫩
+排空
+溢洪道
+竞聘
+雏鹰
+迁安
+Broad
+极多
+嘉德
+稃
+严刑
+
+汴京
+偷渡
+万隆
+小南
+南柯
+直抵
+凌辱
+行列式
+先遣
+东方明珠
+汞灯
+比斯
+荏苒
+横线
+茶座
+流场
+祥子
+鲜鱼
+Applications
+华特
+交还
+访友
+合众
+1114
+乌镇
+主审
+镇远
+莱顿
+掌状
+Ryan
+瞬态
+陶然
+长明
+威猛
+猪腰
+伸入
+贝宁
+腓
+等闲
+得福
+击伤
+守旧
+pass
+Smile
+岷
+mmB
+长袖
+富庶
+image
+早衰
+杀猪
+虽无
+1240
+肌体
+香河
+妻妾
+Flow
+保荐
+Ma
+以资
+县衙
+壳牌
+nl
+贞德
+笛声
+隆兴
+父王
+青川
+铺天
+家风
+繁昌
+1195
+安卡拉
+村合并
+期许
+除险
+广汽
+丽景
+百尺
+PF
+勋爵
+移步
+1796
+East
+试一试
+畿
+既未
+克利
+意林
+手雷
+歉意
+铁门
+论衡
+来料
+茼蒿
+Natural
+晓风
+浅薄
+典章
+羣
+饶舌
+禅修
+思妙
+滴答
+单刀
+EV
+正觉
+景东
+H3
+lu
+禁书
+867
+鹊桥
+台帐
+长龙
+盖住
+玉米螟
+忍着
+五六十
+青皮
+淑敏
+三资
+靛
+1044
+环己
+增色
+亵渎
+御制
+六子
+会儿
+军马
+地龙
+三强
+阉
+江湾
+Cloud
+离生
+雄健
+400000
+1616
+赵薇
+群魔
+辛丑
+Front
+二百二十三
+面授
+善心
+后盖
+普九
+翻腾
+水云
+鼐
+柯城
+Compaq
+殿试
+二百二十八
+洋河
+得票
+剧变
+生怕
+国难
+源氏
+鬼谷
+tree
+Heaven
+罹患
+天海
+谦和
+龙蛇
+乐此
+戏珠
+20cm
+富城
+二年生
+糊口
+汗腺
+专享
+倾覆
+X4
+贸促
+山贼
+HBV
+理数
+三百四十七
+集萃
+跳入
+寻踪
+情丝
+晃晃
+配股
+驼峰
+1808
+簋
+二一
+Win9X
+double
+治沙
+军服
+779
+桂兰
+中经
+科达
+课例
+圆孔
+涣
+桥头堡
+腐化
+椰汁
+1736
+明经
+大卡
+长野
+切齿
+SNH
+送子
+谈吐
+表弟
+香坊
+1721
+压延
+云起
+OpenGL
+支系
+武钢
+山系
+箩
+Stone
+过少
+意译
+floor
+前秦
+复始
+慌张
+殷切
+PWM
+普雷斯顿
+吠
+举步
+气密
+Western
+开赛
+倒转
+属相
+精忠
+脑干
+tu
+湖底
+爬虫
+上段
+大宁
+采写
+Engine
+鼻尖
+互访
+Watch
+i5
+1807
+致胜
+荡然
+1770
+触头
+Picture
+2C
+讥讽
+12mm
+逸事
+NK
+北江
+Aleks
+茶几
+交变
+祎
+英年
+开标
+上侧
+Com
+生冷
+paration
+华电
+street
+Jere
+瓦当
+60mm
+RAP
+乌冬
+迅驰
+那曲
+铸剑
+创见
+法帖
+统制
+动辄
+怒骂
+抄写
+短裙
+遗尿
+政论
+勖
+嫣红
+译制
+丁亥
+双阳
+浙西
+又曰
+油机
+洗脑
+田鸡
+消灾
+巴斯蒂亚
+yuan
+体感
+木糖
+新海
+倒映
+尖刀
+阀杆
+盒装
+一九九八
+90mm
+language
+解渴
+一凡
+糖量
+塔形
+鲁莽
+卷发
+管委
+火速
+凉茶
+穰
+迸
+yang
+解困
+惊呼
+Final
+窠
+短促
+抗联
+后梁
+仅靠
+手捧
+多谢
+掳
+纯熟
+桥架
+对光
+孔孟
+Zheng
+打非
+二百一十五
+虚名
+己见
+鹗
+刺槐
+上田
+特勤
+兽王
+伯恩利
+圣教
+本家
+香榭丽舍
+浮点
+柳城
+爨
+必武
+锡耶纳
+芋螺
+改嫁
+高烧
+模压
+Doctor
+萤光
+管中
+Marketing
+事权
+prote
+照应
+二百六十六
+底稿
+肼
+杰弗里
+读法
+水兵
+拍子
+幼鸟
+恒泰
+伍佰
+华氏
+瑞华
+得志
+崇仁
+豺狼
+可辨
+剿灭
+邓迪
+志杰
+二姐
+丙方
+路西法
+腋下
+云锦
+师爷
+星尘
+水门
+link
+潜逃
+泰拳
+脩
+省事
+928
+1B
+红墙
+江滨
+配子
+博古
+特刊
+怂恿
+Grace
+艳艳
+挂件
+二百三十七
+Distribution
+漠视
+000000
+家驹
+北回
+入土
+压路
+鱼子
+惠来
+布莱尔
+刺身
+反问
+臣下
+扁桃
+和龙
+support
+水落
+七贤
+山亭
+1831
+分派
+题刻
+致以
+卫华
+鱼得水
+Carl
+972
+XM
+钽
+芝罘
+Andro
+原石
+雌鱼
+WAN
+鼓乐
+肌力
+受潮
+斩将
+参拜
+小戏
+胃经
+神鹰
+仙途
+诗画
+萨克森
+骄阳
+牢度
+线描
+配殿
+巴氏
+952
+棍棒
+ADC
+问及
+石缝
+生态村
+衢江
+洛尔
+小笼
+behind
+喜食
+大力士
+楫
+大族
+、
+Hunter
+wide
+1701
+螺层
+蹒跚
+撞见
+拉布
+集注
+残障
+车灯
+上佳
+出云
+集安
+赫伯特
+新龙
+2R
+禄丰
+D1
+抽动
+开刀
+多者
+夯土
+减负
+摊销
+苦寒
+重楼
+充饥
+3DS
+中档
+亘
+踏破
+斯宾塞
+何用
+Index
+던
+虎啸
+河桥
+京西
+花轿
+玩味
+6G
+弱水
+鹅肝
+岘
+shell
+手刹
+1214
+沙漏
+大敌
+过半
+雪铁龙
+血站
+1730
+沙井
+治本
+平头
+道学
+纳豆
+援手
+梅赛德斯
+扬琴
+天坑
+郅
+咏梅
+待考
+御风
+nf
+家村
+乘数
+雷蒙德
+磁悬浮
+建林
+砝码
+武山
+资兴
+就怕
+Feel
+子母
+直机关
+官能
+香薰
+别哭
+Surface
+鼎湖
+鲜亮
+颇高
+焊管
+搜神
+块根
+惠城
+泸县
+双百
+小霸王
+昭明
+忤
+星子
+女将
+渗碳
+废纸
+固安
+Industrial
+后学
+采种
+姻
+跟单
+灭口
+疡
+格拉纳达
+毕露
+蜀道
+秋田
+丫丫
+椰蓉
+997
+镰仓
+佞
+称臣
+墨脱
+八段
+音程
+拦住
+巨魔
+崖壁
+1827
+潼
+Brad
+棺木
+瓦窑
+即事
+免予
+农网
+岬
+炫丽
+纳斯达克
+色块
+舒曼
+昂热
+金台
+栖身
+后土
+Definition
+飞人
+金溪
+蛮横
+烧灼
+笔锋
+乐昌
+一九九九
+既无
+日俄
+Fr
+幸村
+放歌
+原田
+桨叶
+软水
+神性
+道岔
+膜翅目
+간
+SARS
+背弃
+奥尔良
+选任
+白起
+帚
+圪
+溃烂
+大嫂
+歌姬
+北岳
+阴魂
+谔
+么么
+酥皮
+滚烫
+东口
+松茸
+流民
+河滨
+鸣声
+间或
+海清
+Ready
+莲座
+瓦片
+八神
+上头
+石墙
+钢球
+病者
+湄公
+借书
+枯木
+矿冶
+鹤城
+文海
+户均
+特惠
+皂苷
+NJ
+为善
+Friend
+必知
+∠
+恩波利
+描画
+滴灌
+陆海
+邪灵
+只管
+城府
+奇功
+焓
+4X
+正言
+将就
+佝偻
+外海
+哭笑
+逢源
+果香
+预想
+格里菲斯
+会泽
+列夫
+何年
+1084
+盼盼
+高青
+原水
+把玩
+顺眼
+心志
+더
+岁前
+陵水
+别录
+阿尔克马尔
+种粮
+红学
+犀角
+dead
+老姜
+预研
+机井
+75mm
+才俊
+Culture
+种树
+矜持
+War
+1190
+隐情
+粉底
+赖特
+金叶
+大顺
+李安
+东一
+海鹰
+新约
+耦
+中压
+枉法
+古蔺
+林森
+疮疡
+明义
+陇东
+迹部
+僵化
+缉
+shmerge
+败于
+中气
+子长
+jarmerge
+避让
+神乐
+Robin
+〞
+缙
+溜溜
+高坪
+千张
+6100
+盲区
+卷筒
+黑茶
+截击
+灾变
+水天
+古雅
+赌局
+楼道
+连词
+吐露
+庆历
+Jet
+TI
+三杰
+纳尔
+染发
+抗清
+부
+兖
+绷
+獠牙
+px
+星宇
+酸类
+Virtual
+蛱蝶
+lit
+佛国
+合阳
+爱莲
+Pan
+押运
+月宫
+雏菊
+试问
+皆因
+759
+珺
+恼怒
+苗寨
+灭族
+残局
+夸奖
+水鸟
+起讫
+劳技
+肾气
+租期
+虎耳草
+Wii
+敬礼
+樾
+马背
+mean
+1404
+领回
+血球
+下腹
+鞭毛
+缓期
+威客
+匝道
+主音
+脖
+Unix
+韩元
+倾慕
+冰期
+滇西
+歇后语
+1207
+野望
+1677
+本本
+偷情
+会意
+战况
+辕
+口算
+多伦
+Spain
+铁轨
+天翻
+探长
+Carter
+集刊
+庄村
+米兰达
+子明
+转场
+E3
+洱海
+二百四十八
+戏称
+秘技
+CBS
+营区
+之迷
+皂角
+Works
+谶
+1305
+泪滴
+喊道
+条命
+岛内
+施特劳斯
+邓肯
+仪陇
+学文
+幼鱼
+苎麻
+偏北
+簧管
+漳河
+薮
+劈开
+Technical
+愤慨
+mand
+笔尖
+假发
+转投
+喷药
+南湾
+剑心
+颍州
+Д
+奥林巴斯
+怅
+光热
+马鞭
+扫黄
+白玉兰
+复学
+HI
+军乐
+肉块
+中旗
+考务
+蹙
+闭眼
+Gun
+天风
+家宴
+总参
+永盛
+彩钢
+Unicode
+两两
+放样
+可耻
+宽频
+1630
+擒拿
+Zhong
+行规
+French
+本钱
+剑舞
+1216
+顶住
+1776
+奥古斯特
+永葆
+浚县
+鲁斯
+她走
+闲散
+伯斯
+待命
+为伍
+夫差
+建波
+1208
+人伦
+慌忙
+亚比
+扣篮
+异度
+防撞
+断代
+倾泻
+粼粼
+前殿
+滴落
+城口
+nobody
+虎豹
+挡风
+陶罐
+靖西
+蚕茧
+金安
+恶战
+浓烟
+墨宝
+磨光
+煎煮
+kov
+两制
+志高
+厦大
+沙河口
+心窝
+白蜡
+二百六十
+巴塔
+周而
+细说
+台电
+垟
+778
+上苍
+癌变
+告急
+枯水
+腊梅
+LR
+转嫁
+le
+乙酉
+跳线
+MID
+Expre
+地方官
+时势
+灵验
+士族
+德生
+故知
+朝歌
+软软
+氯气
+lost
+1660
+练武
+褒奖
+福生
+艾米莉
+灌云
+70mm
+下颚
+SOS
+32000
+下风
+老套
+布宜诺斯艾利斯
+标枪
+主材
+892
+Ser
+预缴
+砍柴
+铜质
+认得
+绿光
+油价
+JPL
+广昌
+六家
+景明
+官宦
+呆子
+剑指
+Stage
+棉纱
+N2
+志丹
+正极
+坛子
+染成
+溜走
+等温
+祁县
+宿城
+鲁棒
+布鲁克斯
+李沧
+梳子
+卡卡西
+天中
+红果
+会盟
+断崖
+寻衅
+三档
+小妞
+穹顶
+牟利
+1829
+满语
+1st
+PLUS
+碧霞
+泪光
+礼俗
+公婆
+凭栏
+沙沙
+大成殿
+毛毯
+纳沃纳
+妙笔
+猛禽
+胖胖
+白象
+德鲁伊
+桑植
+考释
+胜诉
+三番
+百兆
+朝贡
+抓狂
+拆散
+鹰眼
+hydro
+伟强
+汤加
+始兴
+云溪
+941
+绸缎
+长庚
+槟城
+脉管
+新亚
+hate
+合于
+天幕
+地覆
+山丹
+支脉
+搏动
+Hell
+街边
+喷灌
+二百一十九
+爱伦
+阿蒙
+鸡皮
+佃
+奉承
+书业
+酸酯
+子嗣
+press
+而逝
+礼义
+謇
+叮咚
+约谈
+豕
+乙未
+胡说
+半知菌
+Children
+夹克
+勐海
+Dun
+Builder
+制革
+建荣
+公主岭
+小篆
+death
+礼赞
+笋干
+鹅卵石
+二百三十八
+维维
+武鸣
+水疱
+939
+Low
+行刑
+笔译
+并序
+麻美
+喜来登
+死时
+二百四十一
+森田
+建湖
+炉渣
+881
+下头
+yours
+争雄
+小杰
+漠北
+942
+醮
+资政
+球磨机
+巴特勒
+宜州
+1235
+毂
+下移
+水患
+酒廊
+湖岸
+才学
+手背
+黑桃
+来潮
+正书
+跷
+玩物
+雅鲁藏布
+头版
+获释
+黑鹰
+败露
+渺渺
+先师
+赴任
+会期
+铍
+Lawrence
+诸岛
+墨竹
+西塘
+桶装
+2592
+为父
+泪珠
+大屯
+news
+削皮
+货代
+纵身
+Exercises
+谨防
+虫虫
+善变
+乐陵
+新学
+逍
+UG
+滑膜
+狼队
+埃迪
+面颊
+吴郡
+Lets
+汜
+遂溪
+Mother
+购货
+要强
+中港
+Union
+圣马丁
+绿岛
+狮吼
+此景
+SSL
+相离
+1817
+旧情
+OCG
+月入
+冤案
+超滤
+菜油
+切线
+MMORPG
+兼管
+负极
+布隆
+宫闱
+0102
+SI
+辺
+碰见
+国务卿
+媒人
+山青
+林和
+嶷
+加仑
+佩鲁贾
+80g
+题量
+文思
+键值
+飞走
+两笔
+钢质
+绥中
+打脸
+Fu
+bang
+曾祖
+异动
+951
+扔进
+内乡
+克己
+looking
+Keith
+久美
+浆料
+Wanna
+伏尔加
+记过
+1284
+草食
+库塔
+女婴
+1540
+Les
+小指
+cond
+起事
+人时
+踏青
+贴面
+满口
+干妈
+华晨
+鬼道
+두
+祛病
+作祟
+附魔
+冗长
+1019
+Electronic
+Stop
+临城
+前旗
+门庭
+南蛮
+839
+绝经
+1605
+nj
+绥远
+造势
+栗色
+当权
+50ml
+春卷
+塔基
+代称
+九阴
+cool
+沪深
+内蕴
+小丽
+12013
+大包
+呼兰
+突进
+牢房
+龙船
+广水
+写歌
+濒死
+劫掠
+家宅
+加时
+信州
+木通
+订票
+捏造
+1154
+闹闹
+소
+克莱
+wonder
+疵
+赤红
+轻触
+几百万
+1185
+培土
+托架
+古稀
+学学
+磨牙
+早些
+大沽
+挂帅
+ZCars
+釉面
+模版
+ぇ
+税控
+泰语
+农工党
+取信
+托儿所
+情网
+基带
+剣
+人潮
+Police
+出价
+浪人
+Shell
+赠给
+泣血
+考文垂
+Next
+1063
+横琴
+崩漏
+上钩
+困顿
+AndI
+快进
+澧县
+海葵
+霍金
+伯纳德
+百胜
+家俱
+停刊
+穆宗
+Lost
+ment
+吴哥
+抢眼
+伤风
+体察
+秦川
+王晶
+明教
+三岔
+汉民
+普快
+泾阳
+石湾
+服侍
+黑体
+言志
+上城
+code
+二百一十七
+Instrumental
+灵隐
+Angiosperma
+Zero
+驯鹿
+腰酸
+LINK
+横县
+泡水
+毛里求斯
+鑫源
+手把
+肥羊
+宝成
+口红
+S2
+地摊
+沅江
+勾搭
+灭掉
+野火
+浮夸
+示弱
+正安
+峰林
+未明
+赫尔曼
+华星
+在朝
+正剧
+马村
+新传
+缪斯
+借方
+和谈
+南屏
+转院
+碰触
+运回
+于山
+鱼肚
+追诉
+孔洞
+竖琴
+独身
+1175
+紫霞
+30cm
+水球
+狸猫
+到案
+0KG
+儒教
+TR
+米克
+喷塑
+笔筒
+胸脯
+金口
+美凯龙
+wild
+健儿
+豚鼠
+Worm
+等次
+MBRAM
+金立
+直插
+山明
+施乐
+清酒
+COD
+里耶
+virus
+王墓
+千丈
+国脚
+多些
+Susan
+官署
+当先
+公敌
+安得
+曲谱
+打手
+武定
+自问
+ย
+小曼
+PI
+西望
+1224
+押送
+点整
+河津
+汶上
+Bot
+明达
+雷米
+reare
+刺青
+泵业
+847
+蜗轮
+1573
+马王堆
+回填
+甘菊
+总集
+龟甲
+迪尔
+戊辰
+解酒
+防身
+二则
+死守
+密友
+再世
+以次
+旧爱
+ˈ
+97875
+跟风
+代扣
+灵泉
+惊呆
+TNT
+哈密瓜
+痫
+ha
+月薪
+贝伦
+广丰
+首乌
+蓖麻
+潮红
+上船
+绦虫
+巨子
+太早
+东道主
+鸿运
+地精
+Fashion
+老马
+吊灯
+烧酒
+脚踝
+Seven
+瓠
+棍子
+明山
+米面
+1735
+结社
+硫磷
+游山
+药丸
+五毒
+天寿
+冈萨
+焊丝
+甲辰
+长腿
+Shen
+转正
+价差
+信教
+炼乳
+招人
+点灯
+动圈
+feels
+壬午
+消火栓
+挂念
+伊莎贝尔
+清波
+sure
+花炮
+恒通
+蒙台梭利
+981
+毒手
+骊山
+粘胶
+血溅
+筑路
+编舞
+除锈
+陛
+安第斯
+雄黄
+children
+奸诈
+胜迹
+布氏
+伟绩
+半程
+超车
+花青素
+石斑
+7m
+羯
+农科所
+恶寒
+彭水
+传习
+裂化
+961
+四环素
+Edge
+five
+古体
+天翔
+getting
+镇委
+乌有
+沙哑
+塞北
+点歌
+气死
+县里
+庶民
+Dual
+动迁
+落脚
+素雅
+BIG
+魁星
+庙里
+笔意
+相较
+堂主
+铜业
+七界
+装弹
+⑼
+元凶
+front
+谦卑
+PN
+好些
+党报
+转给
+Finland
+溶酶
+溅射
+褒贬
+盟国
+lov
+就打
+睥睨
+北航
+char
+帰
+热卖
+溃败
+老干
+conomics
+佥事
++
+pay
+十来
+波普
+北镇
+上证
+加高
+良多
+黑线
+福海
+运命
+东野
+Yan
+父皇
+莉娜
+胃部
+竟被
+荠
+跃居
+高邑
+CorelD
+broken
+二五
+贵池
+话本
+跪拜
+修通
+三王
+强酸
+唾
+SDK
+长滩
+4pin
+二百三十一
+迈尔斯
+P4
+滕王阁
+ANSI
+fine
+奶昔
+桥牌
+声速
+菲利普斯
+ART
+萨尔茨堡
+结队
+晓波
+骨性
+德山
+Cold
+海胆
+西斯
+皑皑
+下跪
+后街
+成色
+六分
+船头
+洒落
+晓月
+库伦
+整版
+骨化
+蓝旗
+1625
+夜宿
+凯奇
+内质
+立论
+困住
+诚邀
+白夜
+啶
+1213
+阿尔萨斯
+泛函
+常春藤
+蚬
+杂诗
+猪血
+科斯塔
+1107
+黑虎
+1811
+仁波切
+人山
+姒
+拌面
+心口
+戈麦斯
+零碎
+名园
+金条
+01976
+寒夜
+sorry
+丙醇
+迷彩
+Capital
+装车
+公堂
+晨星
+冷汗
+伊娃
+双极
+行营
+走失
+EDA
+馨香
+涪
+破冰
+璘
+杀生
+黄淮
+川沙
+nch
+父兄
+乘势
+浦城
+砦
+郊野
+口粮
+诉诸
+Guide
+bility
+Source
+海狮
+豪庭
+Email
+旧书
+灭活
+靖康
+哈维尔
+拔刀
+咬合
+dependent
+FAT
+头尾
+复明
+美孚
+叭
+三湘
+GUI
+1833
+1276
+昏睡
+榆中
+偏光
+恍若
+畏寒
+NE
+粗鲁
+精馏
+立秋
+乱石
+千里马
+畅达
+牙髓
+桂芳
+小灵通
+毛刺
+那头
+猪皮
+圣祖
+苦修
+公家
+村部
+河阳
+第戎
+咳喘
+街景
+玉莲
+秀全
+山月
+内审
+二百四十六
+加特
+诏安
+天穹
+国计
+7MB
+应时
+Dick
+衄
+投足
+799
+Carlos
+春分
+二百一十八
+未变
+秀场
+猗
+大冢
+未至
+妫
+矿长
+海难
+上臂
+products
+内人
+MFCD
+php
+代工
+巴尔扎克
+穷途
+房贷
+莲都
+亲家
+宋家
+耗子
+撬
+既而
+Russell
+署长
+泰伯
+闹鬼
+鲲鹏
+相送
+蜗
+马桥
+米汤
+continue
+adata
+洪区
+洱
+桶金
+反贪
+钹
+36000
+立命
+洗牌
+Financial
+薄饼
+怨言
+保国
+1194
+收人
+international
+4399
+企及
+1265
+982
+IgG
+表带
+早先
+玩意儿
+原阳
+渐趋
+坐果
+涂在
+刚劲
+海云
+退守
+大张
+fraid
+要员
+根雕
+夜游
+lock
+只须
+天兵
+团子
+接穗
+苏里南
+新明
+水神
+熟读
+回城
+古画
+小仙
+扶摇
+开赴
+艘船
+砰
+面砖
+FL
+花神
+猎奇
+印鉴
+877
+阴沉
+转数
+Driver
+快船
+Tree
+旱涝
+痈疽
+6
+轻浮
+啉
+洛夫斯基
+散漫
+pot
+升官
+Processing
+灰心
+蹦蹦
+头球
+史馆
+苏拉
+进境
+随其
+牧马
+居安
+769
+投行
+东宁
+加薪
+联锁
+花茎
+离队
+点画
+腓特烈
+22010
+铲子
+龙源
+郭氏
+特立尼达
+footage
+罗西
+WinNT
+艺委
+纵裂
+鱼跃
+易损
+武火
+饱经
+堆栈
+筵席
+坑道
+1401
+Graham
+初选
+力战
+5P
+险象
+演剧
+帝制
+蛇蝎
+0iO
+消音
+良乡
+进补
+养母
+可儿
+兰博基尼
+同形
+1792
+静水
+美语
+衰亡
+her
+杲
+迎击
+多党
+噩
+气液
+山墙
+卓玛
+横断
+浮空
+东皇
+眼影
+双人房
+白喉
+虾球
+雄才
+露珠
+艾泽拉斯
+太爷
+细语
+水花
+物色
+弗雷
+临别
+加农炮
+耐性
+残雪
+冻伤
+ZERO
+学车
+FA
+合谋
+保苗
+戊子
+经手
+师友
+竹炭
+草鞋
+湾仔
+秀秀
+THB
+皮球
+文姬
+尊老
+写就
+苻坚
+单恋
+肚脐
+大曲
+er
+梅里
+薄命
+想听
+小猴
+事成
+4H
+腰鼓
+瞌睡
+楞严
+别字
+硕博
+切细
+子网
+蒯
+内尔
+IU
+壳口
+罗姆
+根除
+江流
+起手
+索普
+大祸
+千面
+SU
+谢尔
+糸
+树德
+紧逼
+疳
+炉膛
+减税
+蔷
+浑圆
+内华达
+尻
+明辉
+褐土
+指称
+伤透
+small
+柳条
+气壮
+749
+大慈
+此心
+鸡公
+体坛
+1134
+松溪
+玻尿酸
+皱褶
+图卷
+如龙
+迎客
+League
+蚊虫
+芫荽
+yl
+卸料
+康平
+滚圆
+爱萍
+三观
+可好
+Fast
+1430
+盐池
+情郎
+丹妮
+Hold
+speed
+提花
+间作
+路易斯安那
+合纵
+援外
+筑成
+坏账
+倒海
+咚咚
+哗啦
+哈勃
+种马
+维管束
+Outlook
+翘首
+极差
+伯恩茅斯
+稀里
+Nether
+休想
+独霸
+前哨
+南航
+缕缕
+东丰
+改过
+哑铃
+东方神
+豺
+TOEFL
+隐者
+选段
+拔剑
+衬砌
+音波
+注疏
+合水
+每季
+人祸
+桁
+儒生
+饶有
+克/升
+Friends
+坩埚
+拳打
+遂将
+高帝
+双环
+大儒
+幼犬
+专管
+山后
+1092
+海王星
+姨娘
+喜气
+947
+松竹
+配戴
+圣彼得
+歌喉
+饮马
+黑羽
+1619
+装设
+931
+育婴
+戴高乐
+暂且
+谎称
+能带
+阿金
+多哈
+昆剧
+附院
+隔声
+翻过
+昌江
+乙丑
+北史
+赤松
+国网
+Jacques
+湿式
+孝廉
+掠影
+凭吊
+脊骨
+纤纤
+瓜菜
+佩剑
+松懈
+教团
+Kitty
+旧制
+文水
+속
+助听
+女尸
+心学
+安乡
+褒义
+外展
+蹊跷
+枫桥
+前嫌
+Cong
+1311
+Roberts
+悬吊
+港城
+退后
+苏共
+授业
+堵截
+白鸟
+格物
+地头
+RSS
+粤北
+斯皮尔伯格
+晚辈
+登楼
+os
+点石
+谷雨
+老酒
+Designer
+嫩肤
+思虑
+初任
+北漂
+乞求
+昌明
+尖顶
+恩宠
+水价
+新安江
+躲进
+1017
+适者
+维亚
+香芹
+Min
+环生
+孙吴
+煅烧
+手迹
+博世
+通篇
+来使
+竹节
+黑市
+花会
+散人
+后跟
+腊八
+尧臣
+1425
+压板
+受教
+土肥
+云间
+波希米亚
+明辨
+微尘
+式微
+鳞状
+克鲁
+长葛
+问路
+地温
+融券
+武大
+保教
+柏木
+learn
+追凶
+语态
+桓温
+外购
+国英
+儿孙
+竣
+门道
+佶
+Yellow
+授精
+造园
+BOX
+嶋
+礼金
+军地
+碳化物
+蹼
+华斯
+金钗
+瑟瑟
+兴仁
+宏远
+撒拉
+2L
+越界
+光启
+医书
+杀鸡
+工区
+墨香
+附庸
+弗里德
+渐次
+捆扎
+寻呼
+筹委
+酒器
+名星
+弗莱德
+搭救
+熊本
+羔
+蟹黄
+诗体
+当空
+侧身
+多汗
+庚戌
+莱昂纳多
+工务
+景阳
+wp
+黄陵
+大论
+租售
+弱智
+唯恐
+神位
+蠕变
+血亲
+概貌
+经心
+步走
+侠女
+Women
+小贝
+龟裂
+强袭
+小檗
+朗文
+鹑
+暖流
+双节
+抽穗
+捧场
+骋
+圣洛伦索
+越位
+清淤
+先念
+水杉
+昉
+神灯
+悲悯
+汤阴
+豫西
+写本
+贴子
+猎捕
+盈江
+华堂
+Soft
+1145
+想死
+亲征
+心儿
+严冬
+饱览
+天地人
+刽子手
+诧异
+Rick
+指战员
+氰胺
+1515
+变应
+格雷厄姆
+秀华
+郧阳
+恶习
+粘着
+门洞
+发奋
+尤金
+藩镇
+逋
+UTC
+何曾
+住地
+油酥
+埃德加
+子贡
+斗艳
+他汀
+名就
+升腾
+锈菌
+截获
+ж
+坝子
+短剑
+临走
+台大
+高材生
+西皮
+肟
+后晋
+初乳
+把守
+耻骨
+校刊
+华源
+Deep
+人丁
+花前
+两厅
+力主
+赤诚
+block
+酸汤
+狙
+栀
+al
+雀巢
+芷江
+嘻
+2103
+啪啪
+方将
+middle
+﹔
+1415
+829
+轻装
+独舞
+Lite
+粗粮
+敢当
+击发
+企管
+小声
+二百四十五
+Pr
+豫东
+惊梦
+小毛
+窘境
+DC12V
+哎哟
+01958
+简章
+石原
+誉满
+大腕
+德格
+河面
+Mind
+널
+对仗
+床边
+俾斯麦
+仁贵
+杀意
+蒸腾
+横亘
+归线
+甲寅
+世华
+雄虫
+12008
+爱玩
+黑带
+警钟
+曲柄
+ß
+贪欲
+资中
+KO
+AppStore
+景颇族
+ju
+phenyl
+啰嗦
+铁链
+s1
+1076
+创想
+揺
+红英
+轩然
+迷局
+混纺
+酱菜
+Lett
+糖化
+芈
+难忍
+独子
+小羽
+故址
+参知
+Pack
+附会
+条鱼
+钟乳石
+会审
+绝后
+轻点
+钻床
+通渭
+地线
+龙阳
+2202
+甘油三酯
+背井
+馋
+设宴
+顾全
+25g
+Math
+极简
+truth
+千斤顶
+断魂
+意拳
+寄送
+斜线
+身家
+﹡
+它那
+一百二十
+济贫
+future
+安特
+神风
+富山
+志雄
+洒满
+胃气
+白脸
+昏倒
+大帅
+云梯
+七千
+凉子
+金丝猴
+和林
+先验
+载运
+水榭
+1470
+头面
+猫王
+黑鱼
+芪
+Jame
+偶联
+数数
+石磨
+1330
+选课
+介子
+HSD
+牛马
+割裂
+通明
+猪心
+酷派
+显灵
+纽卡斯尔联
+横断面
+试种
+AIM
+▽
+母后
+子叶
+元人
+法海
+RA
+里氏
+胶结
+败家
+耶和华
+MINI
+箓
+红皮
+节油
+筮
+配体
+巨蛋
+呼救
+镖客
+五老
+道情
+争当
+menu
+折腰
+敦促
+虫洞
+年刊
+五角星
+挑出
+期第
+策动
+989
+私心
+8X
+实权
+理睬
+Elizabeth
+级联
+含羞
+莫桑比克
+职棒
+条狗
+狼藉
+Lin
+婉儿
+浪荡
+萤石
+无神论
+冲剂
+越级
+熙熙
+支委
+泉山
+广交
+1304
+定形
+艾特
+nh
+内讧
+贪玩
+捐躯
+anything
+干煸
+艺员
+铺陈
+阿曼达
+油坊
+针脚
+蛰伏
+国朝
+防暴
+平手
+鲜有
+严加
+安龙
+工薪
+外行
+大劫
+宝兴
+大沙
+重教
+扫墓
+心里话
+未艾
+情天
+▼
+黑金
+骈文
+32MB
+路德维希
+圆雕
+茶场
+浩渺
+追思
+克洛
+空位
+立夏
+阿雅克肖
+衡器
+大政
+金和
+泉眼
+穷追
+food
+A8
+码率
+胆石
+沙化
+mRNA
+总账
+圻
+晓龙
+绕行
+湘雅
+跳绳
+卫城
+后坐
+Province
+王姓
+法警
+ft
+亮剑
+受助
+良苦
+849
+左轮
+DH
+暗访
+中教
+云气
+Logo
+烧造
+大南
+朗姆酒
+生香
+莫如
+生蚝
+炸死
+误读
+阿宝
+汉武
+desk
+満
+Student
+欧莱雅
+晓阳
+敖包
+求情
+外宣
+咧咧
+新天
+志安
+斯洛
+大市
+发问
+甲天下
+愈多
+敬酒
+难耐
+招手
+汤水
+Phil
+阿Q
+三石
+同游
+利奥
+阿龙
+推辞
+血丝
+手球
+赤坎
+葛洲坝
+蹑
+吡咯
+梓潼
+中控锁
+洛川
+Father
+争权
+â
+如林
+论说
+难说
+请柬
+气焰
+通红
+did
+ah
+837
+971
+酸盐
+岌岌
+二百七十
+格斯
+珍奇
+武师
+好说
+忍住
+威仪
+碰壁
+香皂
+蓬江
+皮肉
+887
+ي
+支前
+1073
+na
+炁
+乳清
+阇
+必看
+宽泛
+1758
+交管
+圆头
+Barbara
+硅油
+回扣
+隔年
+美腿
+狮山
+区公所
+宝儿
+普利策
+鄄城
+page
+聚醚
+悱恻
+stand
+养路费
+榕江
+槐荫
+³
+现况
+冗
+嫁娶
+折返
+宗地
+写写
+菌盖
+苦荞
+狂放
+广新
+单性
+八股
+罗纳德
+质粒
+银质
+玉壶
+桥镇
+CNN
+量测
+早稻
+1118
+负心
+探花
+砜
+non
+华工
+和局
+Medicine
+Walk
+cut
+龙科
+荧
+舌苔
+甘棠
+通勤
+夺权
+量词
+野味
+驱邪
+内黄
+庚申
+ABB
+干草
+791
+仪容
+烟海
+大商
+鸭血
+Campbell
+current
+地衣
+沁水
+站住
+染指
+美方
+NU
+967
+腾出
+万金
+离析
+趋同
+相像
+A5
+狼族
+丹皮
+灭顶
+阿伯丁
+万宝
+FD
+试玩
+尤物
+口吃
+洞头
+兴山
+看板
+声闻
+泰然
+遗骸
+三言
+贤才
+服气
+造地
+S3
+凯勒
+晶片
+archive
+蒙阴
+提点
+百业
+催芽
+米罗
+生路
+拉曼
+中通
+1340
+大补
+淡红
+争艳
+口哨
+语汇
+文景
+Bird
+Fly
+率性
+东沟
+|
+string
+藕片
+多疑
+驼背
+ETF
+穆尔
+张嘴
+失语
+视距
+Julia
+歪斜
+潢
+龙须
+行凶
+Cooper
+牛乳
+师风
+辛卯
+谏议
+来由
+1165
+ว
+阿房
+弹体
+衣钵
+90W
+偷吃
+动笔
+露丝
+双轨
+闺女
+摇臂
+激越
+搭起
+总政
+雅集
+叶形
+9x
+Hans
+瀚海
+拐点
+鲁中
+提携
+镇雄
+苛求
+somebody
+小头
+尽忠
+眼压
+刺骨
+说明文
+足底
+慈云
+东正教
+1037
+QVGA
+可危
+匠人
+先试
+长泰
+学完
+贪财
+小项
+古斯塔夫
+小夜
+M4
+德鲁
+酒席
+م
+Rev
+武陵源
+喷枪
+百分数
+布雷
+如麻
+勿入
+失窃
+师古
+乐事
+轮转
+Lang
+铤
+三彩
+耗散
+座席
+1715
+游仙
+银饰
+老太
+IMD
+押解
+暖和
+凪
+卢塞恩
+收官
+来日
+蚕食
+加进
+邮品
+主犯
+3002
+快运
+南环
+简析
+警世
+萎
+小半
+前头
+调兵
+拔河
+毎
+体罚
+使团
+晰
+拄
+婚丧
+鹤峰
+贵子
+便溏
+豆荚
+刀口
+韭黄
+甘冈
+炔
+中招
+水雾
+泼水
+C2C
+栅格
+傲气
+Quality
+水貂
+Build
+淅
+转轴
+甥
+知网
+丙子
+Lord
+崇义
+园景
+叶氏
+1672
+水灵
+磁石
+01975
+1740
+沧县
+格里
+禹城
+折光
+搏杀
+课业
+日清
+残部
+对折
+正骨
+代尔
+史略
+灸法
+破军
+运量
+背带
+阴冷
+顶岗
+攀比
+书坊
+满汉
+漠然
+929
+投档
+小农
+阿道夫
+肺气
+海大
+AMEX
+薄冰
+毛纺
+克难
+闪动
+沙茶
+丙级
+Mont
+久别
+圣主
+Korea
+难测
+6700
+罗萨里奥
+凶器
+赵州
+松仁
+十进制
+弃置
+利浦
+唱到
+偏西
+五原
+刮起
+软座
+寻欢
+OC
+萨迦
+APS
+贝叶斯
+波谱
+新欢
+雄关
+娃哈哈
+徕
+禹王
+吉庆
+鸭梨
+尧山
+翻修
+早饭
+市电
+出气
+青衫
+简政
+莉娅
+近岸
+非晶
+杜诗
+Ocean
+派拉蒙
+诗赋
+莱万特
+泰康
+冲模
+大觉
+手持式
+广记
+拘捕
+文清
+9300
+牛筋
+终老
+疑心
+夜校
+铢
+试液
+Blues
+徽派
+辛酉
+庆林
+婶
+慨叹
+打消
+来凤
+品酒
+邋遢
+浩如
+罗汉果
+鸡鸭
+法名
+讲稿
+ER
+象限
+半世
+酷热
+鞅
+window
+卫矛
+蓑
+册页
+采莲
+二百六十九
+i3
+临邑
+6M
+吉尔伯特
+曹妃甸
+留名
+very
+Cisco
+烘箱
+菊酯
+寞
+蓬溪
+点化
+烟斗
+刚正
+ICU
+开运
+干货
+蚕种
+扣件
+静冈
+公积
+凳子
+8226
+落马
+Marc
+Ford
+剪接
+截流
+终南
+大渡口
+卡住
+柘城
+龙年
+花苞
+续篇
+生力军
+Da
+80GB
+Mass
+冲服
+艺校
+Count
+Stars
+麦当娜
+侑
+Matt
+鳕
+1688
+炭化
+铺平
+支招
+礼佛
+注资
+2702
+中班
+1566
+矿渣
+追星
+Other
+汤面
+三镇
+小蓝
+拉风
+met
+炎凉
+智障
+铁水
+蔑
+关於
+堕胎
+收效
+布告
+症结
+歇斯
+精典
+Wright
+胡斯
+尚且
+美联
+骡
+珈
+监审
+2802
+六甲
+蒙哥马利
+跃升
+綦
+试想
+壊
+自欺
+在行
+迈步
+牛刀
+斯旺西
+井中
+宝鼎
+1525
+buy
+柔滑
+樊城
+气运
+帕尔梅拉斯
+垂青
+20ml
+70kg
+诗史
+德政
+铁杆
+叫卖
+丈人
+耕田
+尼科
+驻京
+层序
+堪舆
+疯人
+桃金娘
+蹿
+같
+Bass
+幕布
+简况
+元康
+nw
+重油
+两分
+集子
+回纥
+套种
+Because
+1h
+桃色
+NAS
+新解
+奥兰
+荒淫
+左脑
+摊点
+免息
+喜获
+落选
+皎洁
+857
+神似
+维琴察
+justa
+剪掉
+架次
+阿鲁
+美龄
+1053
+琶洲
+啤
+冬夏
+ˉ
+黑云
+口镇
+ر
+商学
+冒用
+科尼
+文光
+水线
+盖有
+开基
+喃
+克庄
+氯酸
+yi
+裂谷
+格萨尔
+Andr
+利比里亚
+1026
+酒井
+邻村
+下侧
+瑞祥
+代课
+万石
+诸公
+净空
+工房
+L1
+大数
+淤血
+过火
+八万
+比翼
+22011
+开示
+苣
+1033
+鹿邑
+精当
+舱门
+二百四十九
+南派
+集英社
+怠速
+QT
+怨灵
+配器
+地物
+胫节
+多模
+北岛
+Python
+呤
+蓄意
+Maxim
+科林蒂安
+横渡
+旅社
+絵
+団
+钎焊
+海流
+连累
+投石
+管长
+生物质能
+脉象
+次贷
+庆安
+贝斯特
+分秒
+起亚
+灵柩
+0103
+惰
+广福
+岫岩
+子华
+语词
+苟且
+镛
+雅苑
+低毒
+沱江
+zu
+布斯
+眉间
+花中
+翩
+盗走
+佛子
+金鹰奖
+鼓膜
+祛瘀
+纳尔逊
+金氏
+浪尖
+瑞达
+戴夫
+库车
+壳斗
+小霞
+散播
+格雷米奥
+多美
+yo
+明丽
+1116
+丰沛
+古音
+耀邦
+晚秋
+涓涓
+秦皇
+会客
+妃嫔
+全片
+童工
+捐出
+撷取
+雨声
+舞钢
+1375
+震怒
+精采
+瞒报
+重游
+1212
+科莫
+关停
+加的夫
+字元
+⑾
+1705
+COS
+Kate
+治军
+德普
+镬
+高崎
+囤
+黟县
+U19
+啜
+大锅
+倒挂
+德雷克
+元君
+护花
+治乱
+NEC
+奇谈
+永别
+鄢陵
+造字
+CSA
+叉子
+1067
+己烷
+xxxxx
+质酸
+医政
+布兰
+小燕
+秉着
+伯伯
+洪灾
+还剩
+魍魉
+暮春
+颈肩
+富华
+箬
+开导
+吏治
+借来
+碰碰
+编集
+参看
+内忧
+砾岩
+晕眩
+轻舟
+姊姊
+8400
+重排
+包膜
+loving
+弹弓
+记功
+采血
+lonely
+擦伤
+桃江
+提亲
+圣像
+绘出
+碘盐
+压低
+鹿鸣
+政司
+二百五十二
+帮困
+骸骨
+东光
+合家
+住区
+錾
+British
+浣熊
+吞没
+晚于
+Structure
+墨玉
+刊出
+正茂
+常胜
+丹桂
+1221
+翘起
+激增
+木桩
+东道
+水蜜桃
+沙尘
+太医
+剧痛
+surface
+虚火
+邓家
+索隐
+魔灵
+或曰
+揆
+吸音
+1113
+澎
+1435
+獾
+锰矿
+万语
+斗智
+阴气
+å
+劲敌
+评注
+RoHS
+县民
+河底
+夹带
+古铜
+wanted
+纯棉
+花环
+WOW
+墓主
+device
+891
+带病
+伊通
+哈桑
+12cm
+埼玉
+亦云
+栉比
+哮
+Italy
+havior
+桦木
+doing
+踏雪
+乌布
+水原
+上载
+卢布
+靠边
+疽
+宏宇
+Private
+皖西
+赏月
+栀子花
+MicroUS
+厂内
+关岛
+Grant
+武乡
+Again
+苦头
+忱
+名企
+力道
+㎝
+丰盈
+朱门
+初涉
+目测
+SGS
+Letter
+要案
+信女
+Mhz
+三百一十二
+冲淡
+浅滩
+中肯
+假货
+Albert
+育英
+性灵
+祖孙
+斯诺
+函件
+大维
+红龙
+中分
+1813
+作死
+圣魔
+2040
+胡润
+逐项
+入夜
+一龙
+兴教
+lady
+gl
+浸染
+小食
+时或
+再论
+易怒
+称誉
+赛果
+刚直
+定单
+武胜
+诱骗
+争气
+竞合
+太婆
+马勒
+past
+平桥
+三川
+捎
+别后
+重聚
+门框
+秋菊
+wow
+9999
+大单
+小齿
+静宁
+华天宝
+螯合
+双休
+灼见
+Save
+SPSS
+过数
+小方
+被叫
+堆焊
+次女
+龙尾
+书页
+NCAA
+婢女
+改作
+linux
+bar
+疆土
+屯门
+泡桐
+1516
+龙桥
+菲尔特
+孤魂
+爪牙
+醴
+Pascal
+精囊
+做爱
+March
+礼法
+同族
+布偶
+迪纳莫
+新邵
+荣格
+科索沃
+畸
+普朗克
+归化
+冠毛
+高挑
+水垢
+andits
+关市
+兴安盟
+司仪
+玉良
+Function
+刈
+从戎
+通婚
+掩藏
+雾里
+闲谈
+竖直
+沟内
+伏虎
+疝气
+阴晴
+鞍马
+刍
+威克
+0y
+米内罗
+生吃
+罐体
+空心菜
+烙铁
+层峦
+2V
+嫔妃
+원
+铸币
+Michelle
+棉被
+辅政
+迪安
+引航
+变奏曲
+河沟
+原案
+弘一
+运球
+嫁祸
+时延
+蒙娜丽莎
+荣臻
+财商
+7V
+才人
+痨
+院中
+质变
+功过
+长命
+老宅
+某地
+和南
+苔草
+改行
+翻拍
+797
+崇安
+邮资
+小把
+防磁
+占道
+高梁
+871
+利多
+单眼
+自利
+杂居
+塑业
+牟取
+飞腾
+成田
+工友
+CMS
+要旨
+李冰
+878
+绀
+滑梯
+临高
+方达
+灵药
+药事
+南望
+入声
+银针
+MSM
+二百六十一
+银山
+弃婴
+生字
+狮王
+被窝
+平调
+西王
+绥芬河
+地源
+觐
+卡斯特罗
+lights
+玄冥
+含硫
+8CI
+曷
+逢春
+久经
+果林
+算账
+雨点
+苗疆
+容县
+约法
+week
+农办
+虞山
+地外
+瑞雪
+战纪
+诀别
+气馁
+台胞
+怪圈
+처
+枣强
+Г
+大圆
+977
+千多
+獭兔
+行道树
+多寡
+9MB
+亜
+奇点
+主谓
+发飙
+全才
+求证
+大谷
+SOD
+百大
+此番
+酸钾
+恶棍
+古方
+凯迪拉克
+NLP
+废液
+猛犸
+隔日
+岱山
+亲热
+合演
+罗利
+UGNX
+金典
+布兰卡
+入眠
+湄洲
+市里
+青苗
+全速
+迪斯科
+2101
+曲水
+♥
+锅子
+手札
+天语
+陋习
+返璞
+市管
+鬼胎
+祉
+姣
+嘅
+郡国
+钿
+4mg
+亮出
+知者
+节理
+刑场
+獒
+通河
+两湖
+自诩
+维冈
+格雷格
+送信
+analysis
+立异
+1575
+金河
+尽瘁
+胃镜
+追尾
+蹄筋
+汗马
+必胜客
+谎报
+2123
+1226
+强加
+晓丽
+sex
+国法
+备至
+JIS
+C14
+琪琪
+亚州
+澄明
+暗涌
+潜望
+世英
+working
+百越
+倩倩
+卢湾
+微缩
+载道
+如流
+瓦德
+vie
+曲曲
+近地
+星象
+胸甲
+各户
+胎记
+Ming
+受辱
+遗属
+球手
+二外
+测报
+疆场
+煎蛋
+笼统
+超炫
+康明斯
+莱城
+现形
+2ml
+后妈
+同创
+特使
+多选
+山崩
+同体
+丙午
+军品
+古语
+Files
+DC5V
+电白
+藏药
+Board
+蜕
+瞰
+沟口
+笑靥
+表壳
+柯林斯
+习练
+反比
+招架
+书艺
+善感
+贪恋
+侨办
+HSK
+曹氏
+斯蒂文
+甲氧
+修车
+Woo
+铭牌
+娜塔莎
+圆子
+绢本
+低聚
+错题
+晋绥
+斜纹
+府谷
+50mg
+招贴
+篙
+点钞
+鲑
+淋湿
+城头
+交趾
+平城
+恍
+各显
+重光
+1718
+怡情
+抽吸
+丛编
+山子
+因势
+巨款
+刺穿
+1618
+枷
+万两
+如神
+麻布
+未名
+东昌府
+考辨
+牛牛
+alpha
+1032
+肺经
+三百二十
+STEP
+脱衣
+布匹
+黄皮
+黄水
+德治
+乜
+牍
+堆场
+分式
+六角形
+课桌
+寿春
+变质岩
+亿多
+门楣
+西卡
+2303
+拉尔
+二百六十七
+池上
+乱想
+昏庸
+搔
+词源
+南翔
+書
+1315
+水东
+见诸
+959
+速算
+入山
+补品
+雷曼
+含情
+谍影
+Jump
+绯红
+大寿
+死罪
+您好
+Jazz
+虐杀
+吐谷浑
+录影
+拜拜
+远游
+三百零二
+草皮
+子午线
+志鹏
+1570
+1325
+沿边
+笃信
+Aaron
+甲醚
+短命
+沟边
+路经
+顶盖
+大干
+大错
+HL
+摩德纳
+世嘉
+滴虫
+撩人
+埃文斯
+破伤风
+斑鸠
+2450
+苏俄
+industry
+临阵
+铜币
+船级
+挂架
+党办
+长桥
+彩铃
+窗子
+直逼
+Eye
+西经
+闻喜
+圣乔治
+诬告
+独辟
+麸
+菌体
+硚口
+营山
+Sea
+二百六十八
+围垦
+薙
+木屑
+打打
+野趣
+粮库
+自焚
+树丛
+天职
+迷乱
+淡色
+1725
+阴部
+伊顿
+兴达
+啦啦队
+秋林
+旸
+一失
+石楠
+少陵
+瑜珈
+区直
+珠算
+Recovery
+human
+1245
+west
+肃然
+二连
+count
+铺子
+肌病
+铁锈
+梅岭
+国宾
+腓力
+细看
+从众
+正南
+貉
+瑠
+红桃
+1264
+镇北
+坂田
+LT
+劝导
+上岛
+重审
+八家
+1295
+宁武
+床头柜
+后头
+笄
+丹顶
+送检
+要不然
+仲恺
+稀饭
+沾化
+微粉
+通衢
+凤头
+派送
+近况
+求变
+优派
+炮手
+3022
+869
+大集
+车流
+冻害
+南德
+Ruby
+贝加尔
+1704
+通间
+南投
+揽胜
+葭
+田东
+宫保
+古意
+倍加
+米内
+长亭
+就问
+‚
+华大
+文宣
+1345
+HF
+hell
+拉普拉斯
+六度
+上图
+鸟人
+右江
+ن
+续修
+山风
+迪庆
+Camera
+分页
+叩头
+花蜜
+压敏
+接洽
+驾乘
+一败
+羊腿
+father
+拈花
+志红
+晨钟
+乔装
+1227
+碳化硅
+古街
+小伟
+Better
+Singapore
+一百余
+含意
+데
+偌大
+护手
+转导
+多亿
+奈何桥
+Mission
+殉职
+假使
+1798
+沃恩
+二百六十五
+开原
+多囊
+Shadow
+方士
+二百七十一
+1279
+问情
+August
+天源
+R1
+稻区
+脚掌
+沈丘
+致意
+古物
+马儿
+国伟
+country
+樟脑
+迦叶
+热压
+熟女
+会理
+环翠
+加布里埃尔
+防虫
+2602
+藏王
+豪侠
+相和
+牧人
+player
+文星
+KJ
+Wake
+闭式
+练字
+1036
+音高
+防制
+灵机
+子安
+吉布森
+神树
+Non
+照搬
+川大
+楝
+彭泽
+余音
+肠管
+捐建
+小安
+湘赣
+sea
+落雪
+留连
+market
+实乃
+EPA
+升限
+超神
+准星
+QA
+画谱
+清肝
+竹编
+主考
+孕酮
+BAS
+感恩节
+圣山
+中煤
+平卧
+门路
+甘苦
+SolidWorks
+FN
+虎子
+府办
+前倾
+科科
+Linda
+群像
+下毒
+v4
+福克
+这等
+天公
+陷阵
+可编
+古生代
+低效
+若此
+昭雪
+布莱克浦
+三百零三
+曲沃
+遂平
+客舱
+袁州
+iOS4
+垦丁
+庆生
+曼德拉
+辻
+死机
+矿场
+内收
+誓约
+善事
+雪豹
+山杏
+Realtek
+1636
+2520
+泥岩
+科克
+二桥
+苗家
+憩
+洛维奇
+yes
+诗序
+STM
+马尔默
+抛丸
+瘪
+QWERTY
+4p
+银龙
+断水
+枕木
+删减
+偷拍
+witch
+味蕾
+念奴娇
+翻脸
+龙威
+M4A
+头人
+2P
+跑跑
+宽窄
+殊胜
+但凡
+诒
+老成
+冤魂
+simple
+路演
+活宝
+自弃
+她笑
+昏昏
+塔吉克斯坦
+应选
+折衷
+神会
+LNG
+高碑店
+垂危
+主抓
+æ
+坡面
+观感
+胰岛
+歌女
+多赢
+pretty
+NF
+万县
+一醉
+杂色
+亚克
+拔掉
+信丰
+heaven
+按年
+灞桥
+抿
+索桥
+音协
+饥寒
+辔
+北市
+1519
+夏邑
+乐师
+里海
+克罗托内
+隔夜
+素面
+炫舞
+电通
+卡尼
+脱线
+台车
+陈式
+月湖
+吉斯
+焚天
+日下
+1765
+天喜
+巴音
+女双
+主根
+多愁
+泌乳
+金泉
+心醉
+南和
+藏学
+明心
+小脸
+尽兴
+琴谱
+Gang
+胚乳
+奇珍
+二百八十
+Flex
+诗论
+Roman
+SGH
+美亚
+SY
+吹拂
+李贽
+Karen
+都行
+耀辉
+生擒
+甘霖
+成山
+太短
+流泉
+纤体
+鹿岛
+香奈儿
+涂覆
+色球
+底子
+新世
+弥撒
+晋南
+皱缩
+中影
+红皮书
+巨蟒
+蒹葭
+烟灰
+燕郊
+号房
+愤恨
+富川
+回迁
+1038
+松井
+康涅狄格
+巴豆
+东篱
+af
+1108
+影碟
+司命
+歧义
+麦片
+Thats
+Dennis
+重做
+退热
+桑切斯
+骐
+Environment
+恩典
+盘根
+1043
+山陵
+OR
+进屋
+艾莎
+激扬
+磨粉
+此物
+用武
+运程
+奥克斯
+易读
+百花洲
+利是
+声援
+御赐
+去路
+Plaza
+汹汹
+丁未
+脑科
+盘面
+起先
+腹板
+鲢鱼
+头绪
+标新
+脱困
+Washington
+剪断
+行权
+平版
+挹
+翱
+罢官
+对视
+一九九五
+二百八十六
+江阳
+赫斯
+迷航
+Before
+衰微
+降妖
+怦然
+小乘
+859
+Idont
+nji
+転
+村屯
+采区
+Collins
+招摇
+burg
+霉变
+囊状
+SYSTEM
+寡头
+Want
+罗莎
+mV
+2h
+近水
+ROHS
+书摘
+MIT
+玉儿
+main
+捌
+上吊
+一拖
+泡面
+PB
+ш
+川渝
+临县
+搬出
+5TM
+四神
+刚玉
+救星
+震后
+1306
+Bio
+Whats
+汤料
+阻值
+汉藏
+律法
+宁死
+军职
+阿贝尔
+上浆
+2403
+饴
+脚踩
+痛失
+1090
+吆喝
+南楼
+1406
+宫人
+BM
+筷
+2423
+家乐
+铁桥
+ㄋ
+岳家
+CPI
+瓮安
+hide
+Tour
+偶极
+wo
+铺面
+驷
+平果
+STK
+拱北
+熟识
+清唱
+海产
+点校
+大梦
+安生
+多爱
+中隔
+可胜
+乗
+Al2O
+喜多
+温布尔
+总攻
+其罪
+枣园
+兴师
+酒神
+约见
+1465
+高小
+Normal
+高聚物
+组播
+L2
+ride
+stone
+上光
+睾酮
+0803
+HDR
+肝脾
+Bee
+文涛
+Bug
+病倒
+春申
+克什米尔
+孟州
+合租
+困局
+明净
+北望
+IntelGMAX
+蛟河
+翔宇
+Sunday
+摊开
+炒蛋
+滤纸
+brand
+年事
+手环
+互帮
+国梁
+単
+东兰
+宁晋
+1685
+善男
+⒍
+元嘉
+溶入
+SRS
+塞进
+杀父
+机库
+铺就
+溆浦
+让与
+特斯
+桑葚
+共荣
+抒怀
+2623
+跳台
+火枪
+鼻咽癌
+纸袋
+厉行
+隐退
+故我
+皮套
+肉皮
+洒洒
+中速
+活儿
+金秀
+欢愉
+枥
+车臣
+捐物
+檬
+tears
+慈济
+降雪
+聚能
+叙永
+PFC
+地幔
+摘心
+律令
+乙胺
+水势
+过招
+BLUE
+余香
+青冈
+0523
+多向
+封住
+尊号
+风衣
+朝时
+991
+alive
+Brazil
+油桐
+平声
+巨无霸
+思危
+易得
+金毛
+业户
+Biology
+主站
+库里
+一九九三
+椁
+香瓜
+丁酉
+三叔
+万华
+等位
+卡里
+四房
+果枝
+怯懦
+肾阴
+浙赣
+学佛
+shake
+曹州
+胎动
+庐州
+12g
+试镜
+丁酯
+带钢
+甚微
+肉碱
+喷管
+观山
+鼓点
+梶
+神华
+奈斯
+留成
+熟虑
+遣送
+挛缩
+基色
+寒湿
+王尔德
+实学
+卫民
+立时
+声腔
+撕破
+裆
+锹
+直冲
+唇齿
+Blu
+魏氏
+恍如
+垦利
+姑息
+PTC
+zip
+在野
+芦山
+三百零五
+僮
+dl
+锐减
+东映
+徐克
+中阳
+天照
+封底
+爱憎
+走时
+宾王
+三义
+丽霞
+安塞
+Taiwan
+志辉
+国运
+牛肚
+纵览
+Acid
+真性
+Lisa
+二百九十
+河姆
+1421
+煮酒
+VII
+Mor
+克拉斯诺达尔
+战例
+混用
+WindowsCE
+伸张
+前科
+解约
+埋名
+红票
+要闻
+法书
+盐铁
+离石
+高扬
+玄女
+普利茅斯
+Physical
+Perry
+侯马
+方框
+碾碎
+淑珍
+能工
+徒长
+娟娟
+刑律
+QB
+里卡多
+Warner
+闘
+司农
+Dog
+Josh
+access
+Goodbye
+乌石
+球鞋
+沃野
+取走
+CCFL
+洗脱
+二百七十九
+擒获
+抗风
+三多
+泰尔
+Foreign
+武工
+car
+信通
+援兵
+苗苗
+汇金
+麦迪
+1646
+0101
+嘶
+坐垫
+监考
+死灰
+步调
+吹散
+空谷
+樵夫
+列于
+水潭
+Victor
+雄姿
+GH
+阜南
+弘毅
+白热
+洛佩斯
+朝觐
+摊贩
+易购
+广明
+虚度
+鹳
+Julie
+FIFA
+浮士德
+大隐
+布克
+慑
+叛国
+熊掌
+Move
+谦让
+三四十
+Victoria
+鳞次
+定势
+缺省
+勾魂
+假币
+cor
+酸雨
+车险
+布兰登
+编钟
+哥白尼
+财大
+昭陵
+追兵
+OPTERA
+玩命
+兴邦
+和乐
+伎俩
+训斥
+谢幕
+外轮
+众志成
+早恋
+惜别
+伟明
+Report
+C5
+飞逝
+卡内
+安魂
+yen
+新市场
+2422
+维西
+北票
+ZZ
+牛栏
+代步
+己丑
+斑竹
+known
+连平
+eSATA
+杜撰
+步长
+难当
+曼努埃尔
+朦
+内行
+登州
+肘部
+U2
+Believe
+Laser
+腓骨
+分频
+可恨
+车桥
+姹
+丙辰
+新元
+K1
+短途
+法尔
+压住
+倩影
+扬程
+舍尔
+琴音
+Ball
+1785
+Fish
+臼齿
+鬼脸
+防患
+蚀刻
+Hook
+包退
+七杀
+兰特
+闺房
+城郭
+构象
+诱杀
+立卷
+田林
+秉性
+桑塔纳
+边线
+中田
+阿尔梅里亚
+选煤
+远道
+伍兹
+丁丑
+尼亚加拉
+三百零四
+1164
+ф
+倩女
+泛亚
+发髻
+自谋
+带土
+果酒
+双关
+一九九六
+台办
+退去
+活命
+宽城
+1369
+毒蛾
+dragon
+壬子
+pi
+冬宫
+薄雾
+单管
+托勒密
+血祭
+兼及
+莱尔
+盗用
+重水
+大瀑布
+血路
+楚州
+高照
+刀客
+1174
+颅底
+术数
+出镜
+包庇
+盾片
+亿千
+络合物
+VP
+结块
+赫鲁晓夫
+算作
+科罗
+哈巴
+桓仁
+俊俏
+空巢
+正明
+笃学
+僧尼
+Karl
+along
+秀梅
+盈科
+领到
+起爆
+闷热
+关紧
+图典
+天问
+刚健
+起立
+恒信
+1255
+nom
+白布
+圩镇
+POWER
+创口
+改错
+特命
+二百七十五
+RK
+石梁
+端点
+捷报
+楸
+传开
+阿妈
+2502
+镯
+毒杀
+糖业
+天德
+万恶
+建东
+芳村
+癸酉
+被擒
+摸鱼
+井田
+米尼
+凤岗
+畅饮
+担架
+小觑
+长街
+骇浪
+弹壳
+祸乱
+加蓬
+层流
+死难
+赶路
+nano
+造币
+BF
+NG
+砥砺
+塔克
+妙方
+О
+小童
+local
+哈德
+坑口
+拐角
+脑汁
+推杆
+二百七十四
+安拉
+职级
+奢求
+调停
+水洞
+其义
+岔口
+柴达木
+密教
+抄表
+婵
+鼓山
+沙司
+divide
+幽王
+酸酐
+frame
+VRay
+上蔡
+如约
+Gregory
+造极
+贴标
+Simple
+EDR
+协约
+1152
+立华
+超高层
+include
+1788
+光强
+全域
+仰光
+铁艺
+碳氢
+嗤
+条子
+单骑
+批改
+俏丽
+世茂
+飞侠
+外物
+四散
+1211
+HDTV
+俦
+洁癖
+佣人
+箸
+阴云
+货舱
+1628
+死尸
+扪
+催缴
+越岭
+微子
+二百八十八
+贯注
+癸巳
+诂
+固镇
+earth
+XXXX
+andI
+劳埃德
+杓
+好生
+汝阳
+井盖
+建峰
+英制
+1604
+彩蝶
+四份
+衡东
+神棍
+碰巧
+外敌
+糖酒
+structure
+犍为
+纳兰性德
+新衣
+双簧
+艾米丽
+听信
+洋子
+amino
+恶果
+放浪
+立誓
+明德
+烦闷
+看家
+帆板
+赐福
+妖媚
+尽数
+莱蒙
+邃
+板面
+西柏坡
+魂断
+南希
+散光
+合击
+梦乡
+硕导
+吸热
+0121
+青蛇
+1285
+再开
+易通
+古洞
+上冲
+30min
+构词
+水浇地
+三声
+争抢
+雨浩
+羸
+天音
+连日
+见地
+ι
+三百一十
+岑溪
+salt
+捣毁
+焦躁
+Premiere
+伊莎贝拉
+HRC
+教友
+试读
+青叶
+Mitchell
+星辉
+Appl
+瀛洲
+贬低
+炉料
+Second
+阴线
+弯管
+三百零六
+盐山
+节食
+霜雪
+养料
+矫情
+杨桃
+亚型
+神职
+海登
+负数
+睡床
+攘
+9787810
+便器
+rise
+雷蒙
+供氧
+歇业
+裙楼
+配售
+罚则
+旧货
+赛格
+配属
+炸酱
+獣
+夜探
+ODM
+贿
+摸着
+近处
+漏勺
+吉野
+宇通
+其成
+操劳
+好又多
+打坐
+同庆
+洛佩兹
+怀玉
+靓颖
+云岭
+Maker
+ASIN
+水獭
+蓄热
+艾利
+三百零七
+自刎
+自留
+MAN
+归乡
+杨花
+进香
+乔家
+锟
+腾云
+刀鱼
+重音
+社论
+乃以
+恒心
+未决
+球杆
+熨烫
+胃脘
+大凡
+枯草
+签注
+乌金
+郧西
+JK
+乐都
+铁面
+车夫
+只眼
+Mountain
+海默
+哈德斯菲尔德
+早前
+侧重点
+FW
+反悔
+二百七十六
+便用
+元元
+砥
+伯牙
+皖江
+玉海
+菜根
+July
+论剑
+猥亵
+饲草
+长假
+天羽
+爱犬
+望天
+Produced
+嫌犯
+避短
+站起
+桃林
+谢氏
+证道
+麦穗
+全厂
+magic
+瘰疬
+Stick
+老四
+奴才
+拉普拉塔
+4900
+1626
+워
+谋害
+SOA
+渑池
+帅府
+七里河
+雷厉
+府发
+吹塑
+Jin
+Laura
+开进
+暴政
+殉难
+明水
+新亮点
+till
+找死
+酞
+阴干
+経
+Snap
+1390
+games
+授奖
+卷宗
+赴宴
+桑德拉
+交底
+哈布斯堡
+苦乐
+二百七十七
+1143
+千牛
+Neil
+自汗
+1119
+Alpha
+华州
+history
+黄鸡
+高唱
+绥阳
+体性
+诱敌
+伪满
+Alber
+压式
+六郎
+临沭
+泥潭
+novel
+3dB
+华明
+离歌
+卸货
+教头
+舜帝
+氨苄
+陶土
+示众
+桌游
+1485
+箧
+快讯
+酒鬼
+进项
+彩球
+事端
+裁切
+宋氏
+塞拉
+1136
+民爆
+卡路里
+决口
+参将
+语意
+大红袍
+磁选
+康城
+悲怆
+钢架
+军管
+Pt
+木犀
+修德
+路西
+Fall
+余震
+刘翔
+冰原
+铝土
+半干
+政审
+永寿
+诗诗
+sd
+黄鹂
+烟幕
+撞倒
+凶悍
+电传
+法轮
+永忠
+火苗
+摄录
+戊寅
+世明
+Kids
+诚然
+着意
+NAT
+1728
+偷心
+Catherine
+轮生
+水禽
+耶夫斯基
+辛未
+玉生
+1321
+暗红
+捋
+新墨西哥
+吸干
+秋来
+四有
+Berry
+成方
+Mater
+闯祸
+肉品
+拨给
+羊羔
+粉蝶
+Mk
+力场
+Brothers
+侠骨
+双份
+报时
+Close
+明察
+learning
+降火
+喜讯
+阮籍
+葱蒜
+太岳
+回环
+机师
+尼尼
+榕城
+顾盼
+母羊
+吸式
+保额
+癸未
+1274
+鼻血
+多因
+静观
+辟谷
+推卸
+炼焦
+舸
+箩筐
+纠集
+20mg
+Beat
+让利
+隆中
+搭接
+勒令
+晶系
+上生
+肯干
+垫底
+坊间
+人字
+bot
+Ver
+散花
+展映
+M16
+阿信
+滑铁卢
+废黜
+秋后
+镁铝
+ICD
+麦哲伦
+城域
+房事
+皮衣
+Jing
+文生
+模切
+Speaking
+造册
+酸酸
+庞德
+Eyes
+机耕
+北卡
+VIA
+纳德
+噗
+附睾
+honey
+清茶
+函电
+Networks
+恂
+Zhu
+禽畜
+玉芳
+Valley
+园路
+花属
+围裙
+饬
+五律
+1ml
+Emily
+光远
+千吨
+胜境
+1316
+秀明
+Peng
+整装
+解暑
+16mm
+保康
+杰伊
+炮轰
+心悦
+韦斯
+重瓣
+尼日尔
+官家
+强吻
+higher
+声呐
+百利
+探视
+化蛹
+细沙
+Cortex
+郁结
+翡
+Qing
+会聚
+锅巴
+悔过
+球墨
+污浊
+全副
+维奥
+小体
+孵出
+直方图
+重耳
+卖艺
+刻制
+全岛
+圣泉
+块块
+7A
+走人
+凿岩
+永成
+诱拐
+Mat
+阔别
+利群
+校规
+Zone
+慧琳
+泡沫塑料
+佩雷斯
+抢掠
+1144
+1C
+华威
+用不着
+天罚
+俚语
+闺中
+艾伯特
+安吉拉
+ぽ
+闽粤
+性学
+短程
+沙洋
+长波
+烧水
+只限
+重压
+川军
+德立
+田阳
+针刀
+古松
+轻灵
+22013
+压裂
+银白
+Event
+矮星
+借读
+阿甘
+立言
+盖伊
+属国
+wang
+愤然
+IIS
+三百五十
+暗号
+官道
+招亲
+款识
+验光
+果农
+丙寅
+时务
+告发
+凯德
+前边
+紫菀
+苏铁
+獐
+牵强
+催乳
+合川区
+复地
+修斯
+戻
+避讳
+润色
+叮咛
+诱变
+面市
+幸而
+永元
+狗仔
+自卫军
+百问
+兵库
+狎
+调性
+稗
+洗脚
+入蜀
+软木
+琴瑟
+调演
+气田
+户主
+床头灯
+冰清
+Rachel
+育秧
+泰格
+龙卡
+Car
+Ever
+丝杠
+渔父
+乔迁
+道一
+珠山
+二铵
+青霞
+稼轩
+Reader
+后藤
+招展
+一草
+财阀
+统辖
+西岗
+食醋
+泰罗
+祁东
+主桥
+五尺
+]
+离人
+metal
+文部
+爱听
+双轴
+天云
+当夜
+Chordata
+索肖
+1072
+ngb
+单件
+杜伊斯堡
+渔阳
+音义
+[
+Mid
+嘲弄
+55mm
+费尔南德斯
+Member
+Computing
+古县
+신
+西娅
+对号
+伏兵
+浸膏
+动身
+麦积
+挪作
+合子
+耳闻
+借刀
+Something
+醍醐
+薹草
+andre
+1096
+拉什
+和易
+平田
+侧柏
+黄庄
+center
+NTD
+鹤鸣
+手杖
+圣塔
+迎亲
+google
+小庄
+二百五十九
+新宁
+件套
+信鸽
+高登
+DivX
+名子
+蕲
+犯法
+链子
+1786
+老林
+普罗米修斯
+胡麻
+纳米比亚
+好施
+叽
+货场
+Jesus
+外族
+1166
+印版
+助威
+彩瓷
+席勒
+肉嫩
+芭莎
+赈济
+PhotoshopCS
+訾
+sec
+白眉
+钢瓶
+1219
+果木
+灵通
+对撞
+JUMP
+将门
+紫玉
+作伴
+悲鸣
+卡塔尼亚
+今昔
+自通
+十色
+N1
+阿凡达
+秧龄
+瑞丰
+cv
+泵浦
+碎碎
+戏子
+嵬
+Disney
+婵娟
+工法
+红烛
+Mart
+二百九十三
+窝窝
+克利夫兰
+所困
+母犬
+弗里曼
+秋叶原
+焦炉
+虚词
+弘历
+造形
+古惑
+灰阶
+罗伯托
+尼西亚
+择校
+1606
+27000
+四清
+听命
+文胸
+罹
+荃湾
+丝巾
+霍斯
+nx
+鹞
+同辈
+萨罗
+ゾ
+菲亚特
+卡恩
+气雾
+瞧瞧
+举出
+记念
+黏合
+中稻
+清徐
+阀芯
+钻戒
+晓林
+哲思
+台资
+伊斯特
+胜似
+风生
+此说
+Sonic
+煽情
+同里
+蓝染
+背痛
+元氏
+少卿
+Studyon
+东三省
+索非亚
+Kw
+护肝
+密麻麻
+临空
+玉清
+闳
+凌河
+滩区
+失掉
+毙命
+二百八十五
+Wan
+过门
+康斯坦丁
+龙生
+ab
+营林
+苏苏
+针剂
+Centre
+嵘
+海顿
+音轨
+惩恶
+1308
+sing
+却步
+国林
+魂归
+煞星
+求全
+刑讯
+起效
+导语
+经方
+海原
+说到底
+HCl
+전
+丁醇
+鲍德温
+劣汰
+楮
+Hamilton
+释出
+光裕
+碉楼
+厚黑
+김
+印行
+元吉
+利沃诺
+竖井
+万夫
+料斗
+浅黄
+译介
+营收
+威利斯
+庾信
+22012
+小营
+巨擘
+9m
+克山
+发愤
+敞篷
+擅于
+飘动
+堪萨斯城
+3GB
+割断
+峡江
+花月
+文俊
+看门
+分则
+늘
+唇膏
+鹛
+汗出
+瘴
+狂龙
+红曲
+吉它
+爱女
+Warren
+禁闭
+ц
+参禅
+BI
+排布
+yy
+拿回
+退到
+威斯敏斯特
+木船
+选辑
+惊愕
+冒烟
+小哥
+珍珠岩
+管事
+19000
+铿
+ã
+轻钢
+喝汤
+普拉
+海蒂
+牲口
+癸丑
+病源
+benzene
+1236
+IO
+二百九十二
+十面
+2Z
+芒硝
+穿出
+靖国
+二百七十八
+汝城
+玷污
+租船
+sUnit
+22014
+琚
+补齐
+渌
+浓绿
+昭著
+凌源
+西和
+1662
+六边形
+vi
+gave
+雅各
+凭祥
+校董
+论处
+中岳
+金斯
+窃贼
+办厂
+津浦
+3650
+免票
+吽
+1504
+高和
+泡影
+广开
+Ping
+谍报
+麻姑
+睇
+一阵子
+大角
+二百八十一
+银光
+康师傅
+捐给
+口试
+1475
+齐头
+历山
+嫖
+胚层
+斯蒂夫
+莳
+ai
+欧共体
+桑椹
+多巴哥
+0203
+耀州
+改封
+孝庄
+机队
+清产
+北冰洋
+栾川
+短跑
+short
+组方
+偷听
+完胜
+students
+牌位
+整风
+诺诺
+十郎
+查克
+譬
+武宣
+干杯
+Theme
+合订
+分宜
+直人
+NR
+北汽
+补课
+崛
+法号
+黏稠
+中洲
+石花
+吹箫
+绮罗
+XeonE
+璨
+应允
+绥宁
+素来
+同好
+shot
+象鼻
+X5
+塞班
+冬枣
+被褥
+钟鼓
+乐乎
+布鞋
+坐像
+秦风
+IELTS
+庚午
+彼德
+时弊
+弃权
+福兴
+春草
+1608
+879
+射灯
+斜率
+莱切
+涂地
+单质
+夸父
+mall
+平叛
+ning
+氧化钛
+大板
+王龙
+sheng
+编配
+薄情
+格致
+三百零一
+放屁
+耳穴
+葩
+御医
+枯黄
+波兹南
+大牛
+浪涌
+云吞
+六七十
+双唇
+联发
+Penryn
+波洛
+哈克
+乌骨
+七十余
+菁华
+1797
+荣县
+Translation
+意谓
+留白
+省府
+龋齿
+中程
+兵工
+ACM
+进兵
+八子
+Properties
+逆反
+桡骨
+梦断
+1385
+百兽
+扭亏
+NEXT
+SEM
+乔治亚
+明伦
+头牛
+哭闹
+Manuel
+再想
+roll
+罗萨
+对劲
+欧塞尔
+特鲁瓦
+Travel
+新机
+诹
+阿狸
+啼哭
+八零
+打滚
+相图
+Bor
+阿梅
+1673
+create
+布鲁姆
+嘞
+益母草
+RV
+阿利
+数亿
+蛮夷
+坡头
+归并
+出仕
+泌阳
+框图
+兵燹
+恒湿
+1083
+游标
+二百八十二
+径庭
+东软
+炭火
+利国
+拉线
+石屋
+锄头
+肝气
+开普勒
+八公
+未可
+Jeffrey
+大阪府
+波塞
+Morning
+泥人
+CauseI
+健在
+Donald
+大力神
+丝织
+招办
+禅意
+小四
+四喜
+雅丽
+战鼓
+晋宁
+二百八十四
+蒙马特
+石出
+01977
+奇奇
+NGO
+囤积
+2350
+2120
+商埠
+良友
+面漆
+授意
+紧绷
+食者
+ress
+低语
+地气
+手卷
+海青
+援救
+far
+王储
+5episodes
+警醒
+靖边
+湘云
+赵丽
+系别
+Vocal
+连合
+1088
+三百零九
+缉毒
+怕冷
+吸力
+滁
+Using
+魁梧
+主上
+锦鲤
+棒子
+质素
+炷
+十三陵
+柯尔克孜
+生油
+外泄
+24V
+入画
+莱切斯特
+write
+生畏
+European
+烧焦
+欧罗巴
+玛利
+宁城
+游鱼
+殉情
+勃艮第
+腊味
+欠缴
+起头
+皮卡丘
+Van
+针尖
+厘清
+丹枫
+矽
+怀集
+动动
+封杀
+红兵
+泱泱
+昌宁
+作痛
+初冬
+简评
+改水
+模子
+徂
+长荣
+照看
+风动
+棋艺
+筛网
+卡带
+蒜薹
+两立
+PPS
+Lock
+饭局
+热敷
+受业
+金殿
+布加勒
+调治
+甜酒
+3264
+XY
+Root
+阗
+随队
+真丝
+ARPG
+酰氯
+外研社
+混养
+出诊
+2402
+串连
+糊精
+尖尖
+帐目
+铀矿
+万维
+白骨精
+黄化
+冒犯
+腹肌
+汇源
+으
+斯旺
+打虎
+22009
+核资
+华通
+胤禛
+1184
+宏斌
+radio
+危情
+睢县
+骑手
+钱粮
+汤包
+自命
+腰围
+成规
+Ⅸ
+盂县
+戏说
+897
+凛冽
+1794
+真价
+剑魂
+孝昌
+店镇
+萩
+迷醉
+阿德
+虎胆
+谗言
+辅机
+载流子
+失声
+1555
+委以
+反证
+Council
+Forest
+油轮
+noone
+鱼台
+颖达
+月城
+湘东
+凯丽
+邻接
+滹
+苄基
+台视
+幼嫩
+一阳
+顺心
+宗南
+969
+0l
+神谕
+卷进
+深南
+临湘
+充任
+哈迪斯
+超度
+西塔
+麦肯
+前桥
+及物
+简牍
+ISDN
+QoS
+位错
+8100
+分地
+1218
+空谈
+后患
+荔波
+1335
+谪仙
+格林豪泰
+扫清
+藐视
+撞车
+Pass
+2223
+凄凄
+米格尔
+生员
+孔道
+嗡嗡
+华艺
+祟
+呼噜
+颧骨
+空腔
+萎靡
+HQ
+规劝
+定标
+乡绅
+费解
+骨气
+深表
+眉眼
+piece
+抗磨
+附和
+新干
+输精管
+圣埃蒂安
+大姓
+核减
+甘氨酸
+桑坦德
+angel
+防污
+桦树
+恶行
+2302
+蓝藻
+志民
+尸鬼
+凋落
+乡野
+绿卡
+1422
+湄潭
+制冰
+当户
+卢龙
+反手
+HY
+江浦
+偏激
+浓雾
+卟啉
+灵灵
+金桔
+己亥
+丙申
+停住
+专机
+二百八十三
+兼得
+A6
+竹签
+罪状
+1508
+武馆
+因何
+红卫兵
+展柜
+多色
+2550
+回风
+工效
+衣领
+肛管
+蛋卷
+新集
+1755
+三百一十三
+反覆
+颧
+肯尼
+在握
+云开
+惹祸
+Niko
+地沟
+拼杀
+新地
+轩昂
+噱头
+半价
+吸合
+鄞
+菰
+克拉斯
+伫
+开掘
+扭动
+馥郁
+打桩
+平复
+包工
+1518
+好战
+青林
+小报
+1290
+缭
+报税
+未卜
+行乐
+1793
+琅琅
+水保
+配齐
+19003
+彩凤
+弯刀
+心门
+等译
+金泽
+2024
+小刺
+总校
+LX
+烯酸
+Away
+咖哩
+老窖
+老僧
+乾县
+rong
+Syste
+辖内
+麦尔
+博取
+一晃
+登堡
+戊午
+纱厂
+沙湖
+铆
+入校
+混凝
+差压
+评鉴
+韩江
+侄儿
+桫椤
+马尔科
+轿子
+SET
+春红
+抹布
+志诚
+指认
+KF
+暴戾
+云鹏
+including
+Screen
+肠系膜
+篓
+男团
+闭嘴
+和歌
+hood
+黑田
+城邑
+巴尔的摩
+袄
+豆苗
+近景
+可敬
+南皮
+多福
+花楸
+躲闪
+神格
+锅贴
+问罪
+康纳
+Title
+暗潮
+B5
+小锅
+改掉
+Sing
+屉
+分等
+1196
+雄辩
+花雕
+Yi
+压碎
+关隘
+寄寓
+解禁
+钧瓷
+谩骂
+一九九四
+限界
+脱钩
+嫡系
+六盘
+中集
+铺地
+巡按
+长丝
+齐鸣
+黑陶
+内海
+广阳
+EVD
+德成
+1661
+射影
+呈贡
+1028
+杉树
+声控
+皇天
+偶得
+鲟
+金乌
+穿心
+异己
+三百一十一
+米脂
+大统
+近利
+干爽
+集宁
+抵销
+碑石
+南塘
+王都
+野果
+工交
+浸会
+百日咳
+1408
+默克
+文保
+手拉
+비
+锁具
+屏南
+滢
+塔吉克
+真品
+芬妮
+船务
+维生
+台海
+宁陵
+专事
+靖州
+益都
+克扣
+换药
+海文
+七海
+1598
+信步
+起司
+单飞
+洗头
+Gene
+诺拉
+副手
+绞尽
+一九九七
+一八
+熔铸
+1228
+钚
+BTS
+张姓
+増
+铺张
+康拉德
+丢脸
+1424
+正题
+副食
+medical
+利尼
+自毁
+轻蔑
+采撷
+开去
+草业
+侠士
+HIGH
+小浪
+鸡肝
+尝尽
+16G
+基址
+斯大林格勒
+水柱
+如仇
+扑向
+显威
+作怪
+雏鸡
+呋
+大富
+报建
+않
+热带鱼
+淡紫
+伏特加
+选装
+冰镇
+嘎嘎
+图志
+红树
+01974
+user
+崇禧
+Os
+拮据
+乳白
+할
+3150
+三百二十一
+镇咳
+上界
+农闲
+二爷
+闻香
+转体
+东溪
+订餐
+桦南
+右军
+下欠
+沤
+裸子
+双排
+碑铭
+纺丝
+琅邪
+打探
+岩山
+广纳
+响尾
+运管
+叮咬
+费雷拉
+2323
+经线
+鼻翼
+铡
+微服
+开眼
+假单
+句柄
+革职
+使魔
+霍普
+桂树
+阿塔
+FET
+情窦
+坚挺
+露宿
+蓬安
+群侠
+假若
+氨水
+1648
+cover
+总览
+才叫
+位次
+劳尔
+秋末
+梦蝶
+工青妇
+咸安
+年富
+酯类
+双座
+石棺
+伯夷
+余味
+房前
+离谱
+袖口
+六六
+酒色
+project
+寒凉
+下等
+方解石
+纳新
+被围
+装好
+粉饰
+読
+冷门
+许氏
+战史
+米高梅
+Struts
+托人
+TVP
+初七
+瓦楞纸
+回采
+1246
+宫阙
+字音
+离场
+Bullet
+文通
+吉普赛
+树梢
+戒严
+TIME
+霾
+老练
+会否
+紫晶
+打井
+隠
+邛
+正巧
+学工
+掌机
+渡轮
+weight
+裂口
+牛舌
+Helen
+虚开
+楼面
+979
+诗性
+摆件
+岷县
+掩体
+用益
+六枝
+农机具
+战袍
+闽清
+1057
+实弹
+不来梅
+1565
+末节
+鹰嘴
+正压
+华英
+底商
+三百三十
+影集
+1719
+祖德
+人寰
+帝曰
+白地
+重出
+上周
+稜
+弹雨
+6V
+沥水
+绛县
+瑶山
+三百四十
+圣城
+⑽
+二二
+双曲
+归元
+二百九十一
+成仇
+郧县
+眦
+弱冠
+重铸
+1526
+眩光
+白灵
+庆春
+柳枝
+奉新
+鹿儿岛
+读图
+G4
+碰面
+帝陵
+强忍
+科林斯
+质控
+三百零八
+票号
+循着
+都察
+阿加莎
+看房
+bank
+菩
+堂弟
+瘦小
+气胸
+润州
+6S
+返航
+1mol
+细读
+粉嫩
+冠冕
+神坛
+逆鳞
+死敌
+深达
+探微
+针孔
+Today
+银丝
+巴库
+OTG
+卢斯
+皮子
+奉诏
+剽窃
+铁壁
+汀州
+拉斯帕尔马斯
+咄咄
+帕米尔
+元江
+属於
+贤明
+砍掉
+瓴
+热解
+笔端
+SOFTWARE
+妞妞
+漆面
+驳斥
+问曰
+怎奈
+读后感
+耀武
+海通
+敷料
+粉墨
+pk
+农化
+今夕
+横祸
+婴孩
+魑魅
+累月
+介壳
+兰草
+二附属
+碓
+身孕
+允诺
+雾中
+杂念
+hong
+Sweden
+儆
+卢旺达
+无妄
+san
+3mg
+1042
+道地
+齐飞
+同文
+迦太基
+长兄
+吓死
+Tou
+龙骑
+送电
+陪衬
+曲柳
+小江
+开沟
+画皮
+竹石
+布里斯托
+世青赛
+草率
+入伙
+花子
+因明
+莱斯利
+彩民
+军转
+S60
+鉄
+晓敏
+赛德
+神勇
+特拉
+苗子
+中排
+光面
+蓟州
+淘米
+1355
+Summary
+科龙
+heard
+洗印
+恩里克
+东郭
+寺观
+眼疾
+潮剧
+如飞
+卡蒂
+纳托
+Blake
+川藏
+Asian
+霊
+0200
+量贩
+团风
+翻牌
+十美
+内省
+andra
+样车
+金冠
+MHZ
+yout
+屐
+3R
+加精
+马岛
+羊头
+稿子
+妙算
+Angela
+小象
+WM
+土门
+和音
+1318
+1093
+Cool
+华蓥
+邻水
+2201
+Mars
+展室
+猩
+山芋
+mide
+髁
+横卧
+2650
+甲壳虫
+垂范
+三顾
+短道
+打点
+明霞
+Deluxe
+1585
+松针
+Century
+Weare
+Football
+广交会
+ี
+弥留
+赴台
+遗赠
+Keep
+南丹
+国投
+燥热
+涡旋
+活该
+二百七十三
+清歌
+威斯特
+善举
+1386
+1122445
+漫话
+商情
+磁通
+建仓
+二百八十九
+靖安
+政府奖
+Alfred
+横刀
+秧田
+移出
+仁心
+恶名
+茶色
+遂良
+圜
+宠幸
+跑路
+毛钱
+离宫
+酮类
+法音
+林德
+April
+犍
+1787
+六欲
+树荫
+线束
+名世
+Steel
+强碱
+交火
+Jessica
+菁英
+Construction
+西罗
+러
+0d
+7MHz
+加拉
+CS5
+水南
+铜线
+长工
+巴拉多利德
+手摇
+人兽
+断口
+兴元
+此道
+制成品
+叹气
+仙丹
+安世
+亚裔
+重拳
+醪
+口疮
+C11
+机关枪
+布谷
+5Kg
+新力
+一二九
+建勋
+呆滞
+水事
+格罗宁根
+崇宁
+林内
+鱼腥草
+承志
+UML
+瓦里
+明轩
+1724
+。
+多日
+映山
+托幼
+珣
+1117
+酒宴
+五马
+容留
+粪池
+堡子
+淳于
+托收
+斋堂
+蜕皮
+Proc
+瓜类
+拷打
+明斯克
+暮年
+妮娜
+小蜂
+仙洞
+index
+郯
+田七
+漏油
+
+博纳
+瓦西里
+oll
+发愿
+降伏
+监军
+1607
+fan
+怀恨
+桂东
+己卯
+簸箕
+情急
+玄门
+抓拍
+地空
+1296
+村头
+算数
+调情
+田鼠
+篠
+寒光
+时文
+鹿泉
+佐伊
+千百万
+停火
+星战
+小东
+瓯江
+铃兰
+洛宁
+3P
+娇娇
+嫩绿
+安胎
+夜战
+禁军
+消长
+冬雪
+尔时
+焚书
+chang
+阿基米德
+丽质
+四百多
+咽痛
+腰包
+套数
+当政
+七喜
+吕克
+保肝
+詹森
+遂川
+琼花
+猪肺
+1CD
+迪特
+Writer
+荼蘼
+四足
+Act
+冰凌
+渔猎
+白驹
+渐近
+咬人
+YL
+综观
+帛书
+转借
+开班
+录放
+嘉宝
+淑英
+掇
+露点
+黄麻
+挺胸
+后旗
+断流
+1567
+泛美
+师部
+襄垣
+1254
+北三
+非命
+弟妹
+文达
+within
+奶业
+初雪
+潴留
+热熔
+抢手
+俄克拉荷
+7
+绿肥
+慈恩
+整枝
+中城
+鱼油
+粮站
+船舱
+尖角
+掩蔽
+瞠目
+Murphy
+舅父
+行将
+2902
+寻乌
+拉奇
+冰壶
+收归
+x240
+拉特
+铁牛
+大空
+RE
+折枝
+大跌
+圣手
+糖液
+ordm
+洪堡
+圣保利
+面纹
+九万
+法库
+食补
+成百
+Features
+孤注
+FILE
+水户
+辎重
+元正
+开讲
+分治
+Leon
+2722
+永夜
+0202
+肉色
+朝南
+甘美
+内卷
+1365
+好伙伴
+阿卡
+清扬
+山湖
+华伦
+俱到
+因患
+光良
+米娜
+1635
+阴郁
+优柔
+兴学
+heat
+Protection
+对酒
+改选
+杏树
+Candy
+火眼
+厚街
+同村
+啧啧
+切去
+奔月
+Fair
+多亏
+荨麻
+胡琴
+7GHz
+Hay
+阿美
+忽而
+浯
+解封
+剧透
+阿飞
+歌仔戏
+克里姆林
+万灵
+密探
+迟滞
+喷涌
+脊背
+受戒
+三百二十四
+je
+飞剑
+濬
+月如
+未眠
+氪
+藏医
+颛
+inn
+纯金
+金珠
+魔爪
+情诗
+内奸
+命门
+屋外
+浅议
+强韧
+0603
+胶布
+削去
+精索
+0123
+une
+打狗
+血友
+Assistant
+右方
+被查
+晓云
+苏子
+哈里发
+国胜
+喹啉
+SUS
+国威
+西道
+育雏
+放羊
+定边
+Safety
+堂兄
+0303
+航模
+当机
+卡比
+归田
+维利
+小女
+1193
+三百二十五
+心火
+翻番
+皱眉
+快步
+二百九十五
+Bye
+裘皮
+英武
+soon
+衡南
+iso
+拿捏
+文不
+翻版
+束腰
+旋流
+技防
+柱石
+哌啶
+逸出
+PCMCIA
+贩运
+梳头
+P1
+霎时
+掸
+吐气
+襁褓
+批件
+申遗
+打闹
+1163
+怀胎
+VBA
+VHS
+学步
+部务
+审题
+主文
+擎天柱
+1V
+纳凉
+experience
+拉格朗日
+挂载
+岁入
+皮囊
+离岗
+父系
+香橙
+三百三十八
+tv
+Rhodo
+一桌
+磕头
+俚
+涉县
+擦亮
+妖气
+其先
+引黄
+大森林
+毒辣
+蕈
+纸面
+1109
+米诺
+Lu
+正房
+双子星
+氧化铁
+芽庄
+调焦
+衣带
+胎生
+hello
+written
+大肉
+怀来
+氛
+沉陷
+松果
+抚琴
+幼童
+├
+仮
+淑华
+扮成
+Without
+偷税
+男双
+独闯
+特摄
+念经
+俭朴
+卖座
+码流
+每瓶
+顺丰
+西塞
+猎枪
+洄游
+浑河
+三百二十七
+羧基
+私情
+纾
+多斯
+三百一十七
+酒石
+电锯
+宁津
+0902
+颁证
+铝制
+拔牙
+少得
+东德
+荔浦
+决出
+改换
+互认
+计件
+双月
+XlogP
+志良
+考克斯
+中投
+小娟
+八七
+压滤
+隐姓
+台和
+机警
+闼
+抄录
+1778
+数一数二
+释名
+迁延
+情定
+侮
+三百一十六
+迎泽
+胆结
+征订
+三百一十四
+华图
+双创
+易县
+言词
+雷山
+SUPER
+林产
+雄鸡
+1186
+胶粉
+1176
+1398
+独断
+Product
+老弱
+seems
+风疹
+金屋
+半条
+孟郊
+覆雨
+信诚
+纯平
+금
+悲苦
+集水
+过街
+污点
+满屋
+二流
+青蒿
+丢人
+墨粉
+主殿
+三百一十八
+0011
+科里
+于洪
+道经
+文安
+贤士
+复调
+虻
+INTEL
+教代
+十步
+衣襟
+1590
+住友
+鲇
+售房
+普外
+攀高
+马来亚
+临桂
+李铁
+7x
+馆陶
+蜇
+抑扬
+果岭
+彼女
+琴键
+忧民
+渥太华
+刚烈
+大生产
+泰利
+初秋
+扬眉
+官厅
+丽红
+唐贞观
+夏津
+熔胶
+变坏
+桠
+22016
+大比
+克朗
+缶
+木里
+衡州
+盐都
+grow
+压降
+明矾
+1256
+埃克
+暗战
+赟
+Hope
+兵王
+安信
+射向
+秘辛
+就坐
+学武
+千红
+1324
+金锁
+封爵
+突现
+轰然
+seen
+浸水
+收件人
+1217
+OFF
+手轮
+auto
+鲁西南
+蒲江
+矿种
+植发
+Form
+语感
+浮标
+五山
+猛鬼
+达也
+三百二十三
+ECC
+PMP
+糍粑
+耳部
+绝路
+合掌
+文一
+大果
+易传
+1674
+觚
+萧然
+一九八六
+铁厂
+月面
+焦黄
+牛魔王
+华苑
+洪都
+战旗
+湟
+高干
+奸商
+军法
+寒流
+满文
+age
+妄为
+华子
+ness
+埙
+丰宁
+拦路
+Care
+网盘
+临漳
+Luo
+撷
+昭平
+second
+玉质
+接拍
+家院
+二来
+摘星
+日香
+han
+新马
+墙头
+玄德
+他者
+石经
+东道国
+大马士革
+康巴
+腌菜
+长堤
+拉紧
+穴居
+索罗斯
+均分
+海船
+校订
+3001
+手相
+克洛斯
+三百二十六
+井筒
+ICH
+客船
+Methyl
+倒戈
+水蛭
+墙板
+七剑
+仅见
+张震
+三百二十八
+撒手
+劝学
+名苑
+自缢
+CIP
+提袋
+Going
+惹来
+万壑
+赃
+勒姆
+圣明
+米尔斯
+结汇
+金福
+阴平
+expression
+壮语
+敌百虫
+还钱
+名实
+中微子
+二百九十六
+哽咽
+锡兰
+B3
+续签
+下院
+1545
+嘉鱼
+Japanese
+揭东
+亟需
+清早
+希德
+天煞
+江永
+杂环
+拆掉
+打滑
+易事
+产率
+小记
+双顶
+TiO
+肇东
+Tech
+拜谒
+脱肛
+出巡
+1128
+遑
+镑
+伴唱
+Storm
+扮相
+糖厂
+脚垫
+0602
+二百八十七
+送修
+梵音
+阿特拉斯
+45mm
+空余
+再犯
+发运
+세
+禅林
+劳斯莱斯
+哈瓦那
+防己
+坐席
+丰茂
+电闪
+research
+梁氏
+1783
+应否
+吒
+拒付
+ŋ
+特拉帕尼
+此画
+烷烃
+鳉
+刀尖
+耳根
+司局
+二六
+cetes
+配发
+风信
+御寒
+GK
+鹤庆
+卑劣
+锂电
+光剑
+CIA
+徇
+Still
+新枝
+打鱼
+原委
+倾盆
+热菜
+三百三十二
+毁容
+柔柔
+心软
+俊男
+南郑
+垦荒
+月令
+late
+填色
+二百九十七
+自守
+淑芳
+凉面
+打样
+戴姆勒
+以权
+打结
+切莫
+水痘
+升龙
+949
+扉页
+玉簪
+竞走
+抹平
+塔高
+感念
+奏章
+玛特
+bis
+牛街
+血泊
+中值
+汉尼拔
+蓝颜
+栈桥
+发难
+翻山
+长角
+Wing
+UR
+麾
+小门
+方队
+Dell
+商汤
+永登
+宜良
+1068
+数词
+安图
+饶恕
+帐单
+受降
+1133
+好歌
+壬寅
+瑞奇
+造谣
+坯料
+私欲
+三百三十一
+多哥
+鹬
+Client
+20m
+三百四十八
+双歧
+斯威夫特
+普特
+T4
+米易
+黄毛
+双枪
+告捷
+黄粱
+咸肉
+框框
+热学
+火石
+升仙
+闪躲
+Neo
+二百九十八
+赫斯特
+长裤
+22015
+VAN
+仿若
+阱
+GSP
+纯一
+瓦砾
+吊坠
+讲道
+Dicotyled
+雀跃
+烫手
+seek
+BH
+1307
+金生
+above
+施塔特
+金柱
+虎林
+Extra
+说白
+泗县
+维纳
+灌南
+相控阵
+八哥
+误食
+福祥
+成圣
+6A
+遗事
+鬼马
+隆美尔
+0300
+歇息
+如许
+瑟斯
+炊具
+尖塔
+Sports
+周宁
+全血
+x3
+南召
+亨通
+2322
+低能
+Celeron
+糟粕
+VE
+定襄
+Snow
+14mm
+维斯特
+反刍
+先走
+BOT
+铅华
+Through
+太刀
+家法
+午睡
+大邱
+缘何
+武陟
+到货
+DSC
+康泰
+演替
+血红素
+performance
+咬住
+谦益
+弹球
+富勒
+翠鸟
+LL
+1588
+玉叶
+屏边
+水西
+三百三十五
+对月
+Ace
+腰花
+奥匈
+如幻
+1535
+国明
+硼砂
+毛毡
+竹笛
+盐渍
+春寒
+订正
+治世
+吊带
+勾当
+ASCII
+新蕾
+企盼
+雹
+until
+庭芳
+0622
+茶经
+德里克
+匣子
+减重
+角斗
+纯纯
+奏折
+广雅
+때
+大伯
+放走
+相扑
+Square
+讨债
+翻出
+攸关
+词类
+菜豆
+天灵
+终局
+僵持
+山北
+云县
+找找
+雨晴
+Ching
+勾兑
+BMW
+扼守
+7100
+冲冲
+神龛
+季子
+0502
+马卡比
+嬉皮
+三百三十四
+雨衣
+谢罪
+璎珞
+菀
+金盛
+霍恩
+截拳
+印数
+荔城
+释然
+岁寒
+味噌
+三百三十三
+弹子
+共商
+1707
+壮苗
+放缓
+惠能
+0702
+百多
+Focus
+声韵
+Pink
+神笔
+1146
+毛遂
+进气口
+养眼
+含山
+多形
+临武
+遒
+闻人
+多田
+低劣
+卑贱
+用场
+灬
+儒者
+thought
+慢跑
+平西
+恒达
+Peace
+乱真
+脸孔
+播撒
+鄂东
+伯德
+夜风
+决明子
+截肢
+海韵
+寒带
+烫金
+东庄
+讲理
+三百四十二
+新报
+世行
+潭水
+至圣
+三百三十七
+尖峰岭
+杜康
+脓疱
+立明
+缗
+摁
+傀
+角楼
+雷管
+土山
+遗恨
+1655
+村田
+single
+手软
+睿宗
+空置
+HJ
+陆域
+欠债
+火攻
+八珍
+稷山
+名表
+苒
+1344
+翻看
+仁杰
+波峰
+三百一十九
+FREE
+Servlet
+顶楼
+Murray
+应援
+AVC
+裁军
+carbon
+mid
+博通
+义子
+神谷
+界河
+叨
+日美
+千般
+须眉
+道奇
+长顺
+DDD
+枳实
+光效
+合拢
+潦
+神物
+得病
+田原
+谋面
+治保
+Harvey
+失魂
+进气道
+Pretty
+固氮
+狠辣
+1784
+石渠
+cam
+enter
+IDR
+璐璐
+六十余
+沉舟
+同根
+0122
+肩头
+二百九十四
+围歼
+康乃
+观鸟
+1376
+大鸟
+黄羊
+应验
+落榜
+廷尉
+MU
+逞强
+端州
+难料
+除恶
+Diamond
+水立方
+bright
+石岭
+Illustrator
+重放
+1313
+播客
+Magazine
+重蹈
+0703
+真金
+吴桥
+版别
+ssx
+Rob
+dendron
+粟米
+自锁
+寿辰
+氯霉素
+时评
+官人
+鼎新
+草叶
+出笼
+总责
+ECP
+出鞘
+尼古
+10W
+门客
+Would
+Tokyo
+倒装
+仙桥
+劳役
+违犯
+碣石
+贾岛
+儒道
+物候
+呷
+shMETA
+嘶哑
+Listen
+艺能
+weare
+彩排
+奎因
+有感于
+EBD
+多拉
+mmol
+三千里
+舒马赫
+章光
+异丙
+垒球
+缄
+社教
+Jamie
+2203
+中士
+内腔
+六万
+场子
+攫取
+供品
+阿方索
+荷香
+着落
+孟菲斯
+三百四十一
+和会
+菁菁
+闭口
+施奈德
+水网
+产假
+专干
+形像
+1791
+弗格森
+旗委
+唯爱
+图图
+捭阖
+五线谱
+羊肚
+卤肉
+灼灼
+闱
+康康
+嵯峨
+水杯
+重开
+处男
+调质
+包络
+蚓
+ments
+dog
+法拉第
+光晕
+谎话
+苏德
+频数
+Total
+封丘
+纳维亚
+图灵
+小星
+done
+火拼
+松潘
+钎
+米娅
+圣马可
+如春
+天湖
+岩礁
+god
+乔伊斯
+MFC
+PSA
+海霞
+分心
+西去
+Princess
+0402
+噻唑
+雷克斯
+钛白
+一块儿
+逐条
+五百万
+肥猪
+丑角
+迁就
+团组
+比容
+岂可
+伊本
+来京
+小盘
+石楼
+前文
+joy
+元谋
+暮雨
+欧尚
+流注
+溶栓
+将死
+三百二十二
+秘宝
+妇儿
+ICS
+小巴
+柞
+传书
+酒窝
+阿托品
+2X
+上香
+天王星
+特斯拉
+化蝶
+家养
+兰英
+kbps
+nz
+幽怨
+0Chapter
+键入
+港岛
+认命
+免冠
+理惠
+TJ
+蓝桥
+1627
+¢
+保甲
+活页
+艺考
+管口
+Scene
+攘攘
+ug
+逐月
+Liang
+来势
+垫层
+膻
+老死
+阿斯特拉
+梦华
+超量
+GBK
+1690
+库仑
+1082
+通病
+高喊
+车驾
+载歌
+助兴
+并驾
+长胜
+崧
+1777
+汉南
+喜迎
+孝文
+줘
+嗜酒
+杜预
+稿本
+国航
+疏影
+Methods
+8200
+É
+摩诘
+3B
+身外
+西吉
+耳畔
+生老
+三百三十九
+吉星
+管腔
+混双
+斯佩齐亚
+禅学
+兼施
+习习
+电文
+玉人
+春霞
+房主
+洛娃
+压条
+reason
+基面
+苌
+1455
+康健
+急功
+漫步者
+检票
+1027
+低脂
+三百四十五
+魔影
+热辣
+辛辛那提
+退兵
+评弹
+巍山
+念力
+景县
+大井
+异宝
+踏足
+核苷
+AMINO
+冬夜
+许家
+昔阳
+0k
+流道
+小景
+辛巳
+征稿
+稍作
+小件
+惨叫
+大鲵
+Arts
+看齐
+戍边
+排档
+名石
+Mill
+普教
+冽
+名堂
+Xiang
+侨民
+择偶
+0J
+活埋
+威尔森
+佩佩
+大佐
+引物
+痉
+来朝
+解囊
+牛肝菌
+被诉
+卓识
+沫沫
+粗俗
+太上皇
+致知
+裕禄
+眼福
+摊子
+1416
+知了
+BIM
+氧化氮
+院里
+3010
+PX
+满门
+名吃
+con
+一尘
+array
+1517
+残花
+Create
+缅因
+青烟
+此役
+了望
+茶农
+压气
+适任
+世恒
+宣示
+风纪
+今安
+工长
+留神
+填平
+眼目
+罹难
+涛声
+套曲
+Ren
+Cache
+板书
+宾州
+建生
+特教
+子平
+喝完
+奎文
+Classic
+双键
+御景
+龙袍
+暴食
+锦纶
+Lind
+麦肯锡
+乳母
+圆钢
+疖
+六十多
+skip
+名噪
+卡塔
+Int
+1445
+st
+Devil
+仿造
+堑
+夜谭
+罗陀
+窖藏
+二星
+阿片
+祛邪
+陵川
+果壳
+国父
+OSI
+1514
+枧
+利玛
+随口
+虹吸
+逗号
+已逾
+雅乐
+conomy
+塞浦
+白点
+甚或
+B股
+积存
+暧
+下坠
+落伍
+吓唬
+滚开
+贤惠
+接招
+学养
+神伤
+呼市
+0302
+蜡染
+小幅
+正华
+仙宫
+普定
+生民
+清零
+穂
+Wayne
+大坂
+一九八五
+天君
+剑峰
+略阳
+仁政
+吧台
+林分
+ding
+1148
+人份
+鸡场
+五光
+用情
+CBC
+包钢
+式神
+丝绒
+贝吉塔
+fans
+⑿
+轻信
+五羊
+瘠
+断臂
+三废
+军心
+士绅
+一九四九
+对价
+苏珊娜
+效价
+抗皱
+Stan
+滦县
+听诊
+暗室
+date
+子爵
+series
+四野
+刀背
+冷冰冰
+Robinson
+天壤
+総
+佛冈
+会通
+脱除
+1624
+180000
+弯矩
+避雨
+鞋材
+常言
+厚板
+链表
+先世
+春耕
+乡音
+SV
+声场
+压花
+开建
+宜黄
+策应
+二百九十九
+生火
+盐化
+外网
+布拉德福德
+许配
+宝峰
+features
+折翼
+疏于
+衬里
+阿维
+若即
+河坝
+硬皮
+伽玛
+下葬
+皙
+盆花
+胸椎
+浉河
+人人乐
+烟壶
+50kg
+奎恩
+浊度
+二四
+死海
+上翘
+贡山
+1642
+多面体
+殉葬
+前侧
+TK
+文质
+亚萍
+整型
+独当
+沈氏
+远安
+献县
+爱默
+急转
+苦衷
+蔚县
+躯壳
+雷暴
+浅草
+密山
+智多
+x4
+Univers
+陈酿
+就位
+顺河
+Cmax
+海法
+science
+佛龛
+结露
+瓦罐
+Sou
+铁索
+CCS
+三百一十五
+连篇
+1414
+大昌
+5KB
+短句
+TUV
+谴
+dale
+灭鼠
+1168
+下野
+Morris
+共担
+看盘
+services
+麦克斯韦
+朝见
+45nm
+扛起
+新宠
+弘景
+丫环
+从犯
+向明
+密语
+装船
+国服
+乙巳
+2F
+UniCode
+磨制
+贤能
+散客
+私生
+绿藻
+南涧
+旗手
+Chate
+SaaS
+文祥
+1077
+贾德
+嵩县
+瑚
+白下
+比干
+瓷业
+Thing
+干咳
+天运
+三二
+外溢
+交款
+劵
+瘦脸
+Par
+勐腊
+英名
+三纵
+阿婆
+红场
+今井
+尔德
+中成
+泾川
+新镇
+懵
+添翼
+打针
+英子
+巴渝
+载舞
+回向
+涨潮
+store
+琴师
+Mb
+渡假
+松饼
+便便
+缸盖
+吼叫
+双轮
+天险
+谷粒
+论纲
+梁州
+雅马哈
+验算
+峯
+que
+在旁
+林丹
+父业
+行时
+soft
+僧伽
+Sunny
+整场
+内翻
+雪恨
+忧思
+掳走
+遗孤
+LBS
+展播
+hearts
+泛酸
+子云
+因时
+眼罩
+Welcome
+豫北
+衣裤
+难觅
+网孔
+武人
+大余
+SZ
+Update
+再走
+図
+心虚
+笼络
+腩
+9787030
+球茎
+慨然
+基多
+Boost
+蒙特雷
+八爪
+云居
+Device
+国府
+猿猴
+顶芽
+纪行
+原虫
+breath
+迭出
+玛尔
+甜度
+内贸
+涵江
+白土
+_
+晦暗
+女团
+╯
+盛情
+磷矿
+陶喆
+森工
+球技
+发胖
+呸
+比热
+屠城
+媳
+飘带
+平罗
+佛珠
+安州
+安塔利亚
+蛆
+龙身
+醒脑
+时行
+匀速
+SKIP
+蓝兔
+凯拉
+透雕
+巴斯克
+米色
+55000
+耕读
+糍
+侧板
+棉铃
+上座
+chain
+1364
+辉石
+奥黛丽
+埃德
+IPTV
+玄孙
+总重
+砖塔
+放热
+说罢
+Su
+听来
+速8
+预购
+爱军
+冰帝
+卿卿
+探路
+深交
+后海
+大醉
+滩头
+麻袋
+先觉
+韦氏
+蜗居
+橘红
+义诊
+ITU
+平素
+肿物
+District
+志清
+安贞
+骨节
+旌德
+妙龄
+怅然
+试驾
+LB
+情归
+九流
+招兵
+2448
+决明
+沃顿
+1081
+香喷喷
+杞县
+妙药
+罗网
+巫溪
+Legend
+胡闹
+新空
+唠
+1147
+雷克萨斯
+薄利
+font
+艳玲
+火炎
+1244
+石臼
+ngch
+藐
+探知
+晨风
+自销
+夹江
+幅值
+泸沽
+德纳
+落网
+三百五十六
+交运
+也谈
+受惊
+珥
+COSPLAY
+CQ
+画传
+迈尔
+该隐
+改判
+善终
+慈航
+示踪
+二医
+土产
+catch
+福娃
+Un
+中沙
+西蜀
+妇孺
+花岗
+丹巴
+庆国
+可畏
+・
+조
+激波
+难寻
+导火索
+主从
+前年
+抚松
+蓓蕾
+水仙花
+主调
+两角
+Ice
+youll
+深挖
+刀柄
+长驱
+130000
+委曲
+BEC
+仇视
+英模
+것
+法纪
+振宁
+开悟
+Experimental
+婚外情
+Disc
+刺耳
+波尔多液
+亲昵
+打小
+希亚
+Meta
+听讲
+GAP
+息县
+BGM
+弾
+Sara
+0CAS
+南宗
+郤
+圆月
+恭维
+托托
+买回
+客票
+忘形
+清屏
+月亮湾
+空投
+食神
+繋
+越权
+点胶
+小众
+carry
+肢解
+五菱
+圮
+变局
+败走
+血钙
+KS
+地窖
+jack
+冻死
+pray
+宋体
+苏家屯
+高足
+金汇
+橐
+小六
+像样
+高招
+会前
+银狐
+石炭
+磁极
+怪病
+旭光
+雕版
+朱莉娅
+炎龙
+互变
+延绵
+抬升
+Main
+子建
+LIFE
+排解
+span
+折扇
+15cm
+伏龙
+乘凉
+回击
+沉甸
+小语
+宝生
+8A
+跨区
+神田
+洋县
+1706
+BTV
+十中
+三百五十四
+百里香
+熊市
+承托
+因循
+打翻
+撞死
+丽英
+布兰妮
+against
+丽珠
+追根
+浩天
+海伦芬
+滩地
+情劫
+地动
+盛传
+利明
+自满
+造口
+士达
+布莱
+达伦
+位於
+禽鸟
+滂
+吸氧
+张狂
+速溶
+隔天
+平野
+wei
+30ml
+莫邪
+网眼
+角钢
+海泉
+壁立
+声压
+道藏
+红柳
+听政
+黄檀
+太迟
+2750
+将乐
+自变量
+姿色
+巴戟
+入川
+四八
+成见
+行舟
+佐餐
+陇右
+Ajax
+敬献
+莫里
+Ag
+参半
+字部
+洗液
+蓝领
+癸亥
+金大
+迫降
+系辞
+相异
+肝火
+1716
+亚尔
+胸臆
+≦
+叫来
+清虚
+阈
+少昊
+Monster
+祥林
+垂帘
+得悉
+0223
+1767
+缺钙
+世传
+1048
+费尔
+初评
+맘
+御膳
+ɑ
+细叶
+船型
+雅阁
+诗仙
+Could
+1766
+红杉
+玉平
+割让
+睡醒
+牛车
+中基
+ed
+片材
+药片
+一百五十
+农畜
+Clarke
+AES
+干涩
+游弋
+1632
+借入
+佳期
+1745
+芮城
+马灯
+浙菜
+文波
+三哥
+马兜铃
+lie
+复姓
+调车
+阿妹
+懊悔
+档期
+艾尔莎
+两户
+年逾
+亲口
+取笑
+家丁
+德全
+宾利
+管束
+洛克希德
+桥接
+在逃
+强横
+新林
+Tex
+娑
+澄澈
+勒斯
+鸡胗
+18001
+大早
+命格
+歳
+试析
+五祖
+四爷
+预收
+映月
+9200
+德钦
+发笑
+鸣谢
+党项
+摞
+中途岛
+送葬
+路牌
+1586
+玩水
+翠湖
+活法
+花影
+IDD
+低处
+TIFF
+迷蒙
+秀洲
+强风
+玛德
+波茨坦
+缯
+分路
+两得
+木石
+1098
+杂烩
+ACD
+species
+莱山
+Find
+如常
+三百四十四
+Thank
+烧碱
+np
+peace
+船东
+安多
+1612
+bps
+重头
+借势
+杨森
+zone
+内宾
+1524
+╰
+FE
+hotel
+南岛
+№
+恁
+姑且
+宜山
+怪胎
+画板
+DMA
+上犹
+DVR
+老农
+富美
+三北
+扫把
+新站
+反哺
+http
+文强
+劣绅
+Justice
+Cant
+重头戏
+瓶口
+九二
+小卖
+op
+工质
+武举
+Claude
+E2
+全向
+沈约
+散记
+超酷
+骢
+岩棉
+begin
+三百三十六
+脸面
+Beck
+阳差
+塔里
+拉努斯
+真事
+情圣
+耶夫
+古旧
+V0
+桃山
+新和
+警队
+利好
+十余万
+噫
+Universal
+被抢
+구
+痧
+跨径
+选种
+恼火
+志斌
+暮暮
+Planning
+答问
+莫拉
+兴发
+世雄
+净增
+笛卡尔
+람
+固着
+粉磨
+安理会
+全团
+糖糖
+滑过
+香氛
+DAY
+青果
+il
+暮云
+祖坟
+砖机
+弗里德曼
+啡
+中略
+YOO
+略小
+丰饶
+揭西
+∑
+布点
+元甲
+开道
+百媚
+铆接
+瑶瑶
+坝体
+篾
+Rosen
+极板
+EBA
+威士
+ethyl
+稚气
+芽菜
+想家
+偏颇
+临池
+先手
+缓坡
+鲤城
+叹服
+清闲
+利特
+堪布
+广源
+燃尽
+尾灯
+天青
+伟岸
+消减
+甜橙
+log
+毒枭
+田块
+天虎
+远足
+衅
+艾森豪威尔
+佳酿
+PVE
+过激
+乍得
+圣衣
+2603
+掠食
+狗熊
+定购
+单抗
+七八十
+航拍
+18mm
+鸿鹄
+jiang
+INForgtmp
+uh
+花好
+石匠
+兵员
+御笔
+天前
+定一
+军费
+吊脚楼
+结舌
+Problems
+秀芳
+经编
+天域
+涣散
+大饼
+铁氧
+磁粉
+原配
+gota
+莅
+内阻
+茶人
+光宇
+绞车
+悍马
+袖手
+赞曰
+content
+ring
+婕妤
+内联
+演武
+GG
+艾达
+儒商
+泵阀
+ISOFIX
+焦耳
+问心
+莫泊桑
+庇佑
+好啦
+永光
+敬德
+贾府
+晓梅
+仙林
+拉深
+张爱
+1153
+CAST
+佘山
+罗宾汉
+长颈
+堂屋
+胜败
+怛
+圣训
+clear
+逗趣
+million
+紧追
+硬壳
+C18
+除虫
+桑蚕
+天顶
+康辉
+链轮
+羊肝
+铺镇
+通吃
+fang
+ゥ
+唐古拉
+脱机
+低血
+♪
+Flower
+ROCK
+长文
+正泰
+肾阳
+莽莽
+草虫
+杰里
+去留
+1617
+棒材
+转过
+斗神
+Use
+序文
+古乐
+拳手
+靛蓝
+劫数
+人同
+洺
+土人
+送客
+出尘
+尸检
+超期
+华德
+章草
+加2
+积德
+缉私
+羽林
+萨拉热窝
+伪善
+Jesse
+家口
+受罪
+情海
+十五大
+二虎
+天街
+光化
+接访
+白云石
+德平
+福华
+10TM
+三神
+电石
+旺苍
+﹙
+斡旋
+stra
+秀荣
+湿布
+Gabriel
+次生林
+德保
+鱼群
+可资
+走道
+祭文
+美日
+丽芳
+both
+便饭
+水光
+轻言
+混悬
+汉光
+密令
+套取
+Plan
+fu
+1764
+译员
+蛋类
+영
+底气
+裔孙
+黄果树
+志龙
+互译
+待续
+双林
+KISS
+楠竹
+雇工
+01970
+马太
+1595
+三百四十六
+收率
+代换
+problem
+bridge
+寒战
+梭罗
+kNCor
+梆
+西充
+仪轨
+炖鸡
+胜仗
+佢
+强占
+1495
+布尔什维克
+报章
+国花
+女色
+夏洛克
+素素
+Additional
+嘉应
+一平
+国势
+硅片
+光刻
+FZ
+1356
+冷若
+阆
+气爽
+瞅
+1695
+巡更
+奥德赛
+张角
+Lily
+东郡
+china
+微利
+篦
+STS
+合群
+1582
+Wireless
+准格尔
+斯特林
+油皮
+운
+砂纸
+瑗
+西楚
+谦恭
+feng
+搅乱
+listen
+河曲
+Interactive
+阳曲
+Path
+韵文
+CW
+读经
+问诊
+小屯
+欠费
+马科斯
+乐声
+霞山
+独揽
+别国
+伪娘
+wake
+六书
+撤军
+1756
+德克
+冇
+产褥
+Animalia
+舒伯特
+银矿
+男星
+韩庚
+路费
+GR
+径直
+1266
+临猗
+向海
+达蒙
+干粮
+厄尔
+友邦
+立柜
+Animal
+潋滟
+苦命
+爆满
+平陆
+繁峙
+金明
+海鹏
+Apache
+祭礼
+镍氢
+转行
+柴火
+链霉素
+衬垫
+家峪
+贬谪
+唐老鸭
+1336
+丹毒
+图斯
+2n
+眼红
+尊荣
+鞭打
+镞
+溢血
+奇偶
+利爪
+清炖
+湫
+定慧
+onlyone
+圆珠笔
+晚晴
+淦
+赫连
+奇事
+多闻
+沙画
+1162
+庞杂
+1069
+积劳
+Mad
+林园
+余数
+麻阳
+三孔
+妤
+饭桌
+﹚
+年节
+1426
+蛇形
+王充
+微火
+冰层
+直捣
+成仁
+永林
+κ
+金一
+ACS
+九千
+完满
+相好
+壬申
+孺子
+Detective
+俊华
+魏碑
+正副
+铄
+元洪
+Shan
+安能
+昌图
+俊彦
+清浅
+凑合
+STP
+慎行
+粕
+Area
+班师
+热浪
+阿明
+墨江
+志忠
+耐玩
+刊印
+委机关
+金领
+艺专
+江心
+hurt
+奥尔森
+bach
+翠华
+枪林
+花圃
+求偶
+writer
+对歌
+寒暄
+即期
+赛段
+轻舞
+雷诺兹
+天润
+单座
+峪口
+浸湿
+宅邸
+自暴
+国斌
+Side
+骡子
+贝尼
+航展
+批斗
+胞弟
+华发
+1666
+promise
+龙舞
+元朗
+清城
+重发
+救死
+第聂伯
+佩特
+EAP
+Policy
+人猿
+领班
+涨落
+未开
+压片
+橱
+对门
+殷实
+0623
+上星
+团扇
+ANSYS
+沈周
+类书
+阿凡提
+髓鞘
+内森
+1233
+贵客
+高陵
+遗骨
+习字
+省厅
+EFR
+先蒿
+床具
+公制
+吠陀
+朝云
+叛逃
+凤冠
+字库
+ς
+2222
+红磡
+Socke
+马尔科姆
+宝光
+朴茨茅斯
+胎体
+垦殖
+复检
+代用
+枸橼
+HDD
+麦卡锡
+五铢
+兰香
+道格
+提斯
+Fight
+feet
+格里芬
+林政
+新蔡
+Ian
+监外
+万佛
+差生
+五矿
+Array
+4P
+泗州
+利导
+上朝
+大经
+翩然
+杰拉尔
+吊挂
+上菜
+Inst
+蔡襄
+速报
+⒎
+三百七十
+虎威
+校纪
+闻见
+热狗
+1675
+≧
+跳马
+卵生
+魂兽
+HSPA
+折中
+神皇
+1031
+咒术
+达意
+邕宁
+清镇
+作罢
+娜塔莉
+平武
+不逊
+专横
+舍命
+妣
+石牛
+分野
+楼座
+昔年
+鸡肋
+鼠类
+KEY
+舌战
+print
+欹
+归正
+Brain
+一餐
+托斯卡纳
+劝谏
+阴错
+兰迪
+闽浙
+卸车
+蒙皮
+居处
+1527
+细目
+西拉
+爱欲
+太华
+超生
+Training
+兴中
+血汗
+嘴脸
+中阿
+立轴
+蛙泳
+过亿
+一百零八
+历数
+月英
+内廷
+宿豫
+晓平
+悠游
+美杜莎
+新园
+濑户
+卫戍
+1286
+赐死
+兰道
+帐簿
+蒂尔
+沙里
+薯片
+一九八四
+行旅
+土堆
+灯罩
+肉糜
+美肌
+噻吩
+kids
+义愤
+球台
+微带
+Jenny
+黄鼠狼
+三平
+3020
+文身
+洪门
+泰瑞
+可动
+政通
+绪言
+自民
+好时
+团部
+擦身
+纯碱
+守夜
+砭
+蒺藜
+制氧
+小榄
+打蜡
+1665
+贝儿
+萨克
+苏泊尔
+奥古斯丁
+记大过
+证词
+万无
+俯身
+诺兰
+乌木
+头角
+网恋
+聪敏
+引智
+棕熊
+三侠
+文娟
+南乐
+公物
+月山
+主厨
+氨氮
+库尔特
+请罪
+伊格
+下邳
+环江
+Discovery
+1708
+0MB
+息烽
+采果
+布尔萨
+nomore
+轻风
+youthink
+Inspiron
+门子
+1407
+并力
+阿布扎比
+寒潮
+卷板
+1436
+While
+夏初
+怜爱
+瑞秋
+土库曼斯坦
+慧敏
+用计
+fully
+福克兰
+客隆
+TextA
+普度
+剃刀
+勃兰登堡
+三百四十三
+镜检
+转位
+逝水
+九型
+1727
+social
+开江
+宏基
+增光
+方术
+祸患
+迪伦
+民丰
+强震
+6episodes
+飞库
+50cm
+泰华
+鞭挞
+原药
+lan
+正体
+中落
+凤林
+美达
+治教
+Benjamin
+大良
+戊申
+粑粑
+Acta
+雩
+被埋
+评卷
+装傻
+骨痛
+铅山
+旦角
+ZX
+chi
+老手
+尊卑
+斗笠
+严整
+木棒
+了断
+神火
+神鸟
+懊恼
+韩流
+his
+park
+¨
+争风
+装瓶
+忠勇
+建刚
+拒收
+布赖恩
+定兴
+锯片
+Finance
+LOL
+敌视
+BW
+忍让
+崇信
+查勘
+陈坤
+传扬
+流态
+stage
+MI
+存心
+货单
+雨润
+安可
+春雪
+原初
+海堤
+相州
+海道
+宿营
+5T
+阿木
+BEST
+刘欢
+狂乱
+남
+多吉
+三百二十九
+梅洛
+P2
+人犯
+盘江
+computer
+金蟾
+迷倒
+Mazz
+沙坪
+2922
+志超
+捷达
+东张
+善堂
+重卡
+吉大
+湿婆
+苦楚
+三百四十九
+推脱
+庖
+誓师
+张北
+村正
+扶伤
+Integrated
+赛斯
+HYDROXY
+鱼目
+俗人
+擦鞋
+x86
+昆阳
+百乐
+板岩
+白鹅
+圣典
+挑食
+乐评
+reading
+堆满
+00000000
+伉俪
+西岭
+1078
+败诉
+亲善
+茵陈
+一九八八
+三百五十八
+再发
+籁
+白帽
+1667
+孝通
+秋生
+书事
+直呼
+苦力
+中波
+阳信
+影射
+二厅
+唐县
+总工
+好斗
+布罗
+Perfect
+漕河
+∴
+宏志
+入木
+顺平
+坐卧
+IMAX
+许字
+拔节
+寥
+三百六十一
+西樵
+0010
+磁卡
+Tran
+凉皮
+012008
+烤瓷
+便知
+奥克
+四四
+诱使
+瓮城
+柩
+放贷
+氢氧
+转贷
+低平
+韬奋
+小方块
+西岛
+旃
+精粉
+花街
+拉玛
+骅
+崴
+铁力
+手扶
+丧礼
+微动
+1763
+蚴
+梅香
+按语
+旬阳
+嗓
+22008
+太君
+辰溪
+忆起
+喜洋洋
+可掬
+0mg
+물
+威县
+敲开
+迁西
+原生质
+神盾
+憩室
+学妹
+韬光
+跟从
+و
+temperature
+CVT
+鸱
+主理
+主词
+0403
+轮候
+金江
+贝利亚
+拱坝
+沙弥
+挂历
+是安
+order
+抢到
+战壕
+何足
+世荣
+浑噩
+带路
+歧路
+天合
+三百五十一
+谆谆
+密密
+代发
+合情
+1757
+带菌
+误伤
+拖累
+柔然
+深州
+发配
+2080
+沩
+花圈
+博望
+娜拉
+寒月
+九百
+峡口
+秣
+Stanley
+潮头
+秀娟
+开膛
+丽园
+单极
+脏话
+Joan
+隆盛
+文曲
+泛读
+岚山
+宅第
+蛇王
+通考
+睡去
+下派
+不伦瑞克
+Youth
+天极
+六四
+Commerce
+01956
+法哲
+印江
+贸然
+精到
+连横
+安盛
+显明
+传阅
+受困
+潮热
+团职
+︱
+深闺
+书柜
+抚宁
+运单
+3003
+冰释
+walking
+花椰菜
+悠哉
+月琴
+village
+ท
+故都
+强音
+双基
+刊刻
+STAFF
+恩斯特
+TU
+松鹤
+FlashAction
+忆莲
+Ted
+渔歌
+人妻
+摆好
+名龙
+三叠纪
+花布
+家冲
+苍耳
+私密性
+protection
+地接
+男排
+Kid
+燊
+李毅
+前日
+夜盲
+1507
+宝瓶
+郃
+Widget
+结荚
+增广
+新荣
diff --git a/PaddleNLP/ELMO/lm_model.py b/PaddleNLP/ELMO/lm_model.py
new file mode 100755
index 0000000000000000000000000000000000000000..0d72147bdc7624cbcacf21ed4dbddfb933f28401
--- /dev/null
+++ b/PaddleNLP/ELMO/lm_model.py
@@ -0,0 +1,266 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import paddle.fluid.layers as layers
+import paddle.fluid as fluid
+import numpy as np
+
+
+def dropout(input, test_mode, args):
+ if args.dropout and (not test_mode):
+ return layers.dropout(
+ input,
+ dropout_prob=args.dropout,
+ dropout_implementation="upscale_in_train",
+ seed=args.random_seed,
+ is_test=False)
+ else:
+ return input
+
+
+def lstmp_encoder(input_seq, gate_size, h_0, c_0, para_name, proj_size, test_mode, args):
+    # An LSTM encoder implementation with projection.
+    # The linear transformations for the input, output, and forget gates and
+    # for the cell activation vector must be computed outside dynamic_lstmp,
+    # so the fc output size below is 4 * gate_size.
+
+ input_seq = dropout(input_seq, test_mode, args)
+ input_proj = layers.fc(input=input_seq,
+ param_attr=fluid.ParamAttr(
+ name=para_name + '_gate_w', initializer=None),
+ size=gate_size * 4,
+ act=None,
+ bias_attr=False)
+ hidden, cell = layers.dynamic_lstmp(
+ input=input_proj,
+ size=gate_size * 4,
+ proj_size=proj_size,
+ h_0=h_0,
+ c_0=c_0,
+ use_peepholes=False,
+ proj_clip=args.proj_clip,
+ cell_clip=args.cell_clip,
+ proj_activation="identity",
+ param_attr=fluid.ParamAttr(initializer=None),
+ bias_attr=fluid.ParamAttr(initializer=None))
+
+ return hidden, cell, input_proj
+
+
+def encoder(x,
+ y,
+ vocab_size,
+ emb_size,
+ init_hidden=None,
+ init_cell=None,
+ para_name='',
+ custom_samples=None,
+ custom_probabilities=None,
+ test_mode=False,
+ args=None):
+ x_emb = layers.embedding(
+ input=x,
+ size=[vocab_size, emb_size],
+ dtype='float32',
+ is_sparse=False,
+ param_attr=fluid.ParamAttr(name='embedding_para'))
+ rnn_input = x_emb
+ rnn_outs = []
+ rnn_outs_ori = []
+ cells = []
+ projs = []
+ for i in range(args.num_layers):
+ rnn_input = dropout(rnn_input, test_mode, args)
+ if init_hidden and init_cell:
+ h0 = layers.squeeze(
+ layers.slice(
+ init_hidden, axes=[0], starts=[i], ends=[i + 1]),
+ axes=[0])
+ c0 = layers.squeeze(
+ layers.slice(
+ init_cell, axes=[0], starts=[i], ends=[i + 1]),
+ axes=[0])
+ else:
+ h0 = c0 = None
+ rnn_out, cell, input_proj = lstmp_encoder(
+ rnn_input, args.hidden_size, h0, c0,
+ para_name + 'layer{}'.format(i + 1), emb_size, test_mode, args)
+ rnn_out_ori = rnn_out
+ if i > 0:
+ rnn_out = rnn_out + rnn_input
+ rnn_out = dropout(rnn_out, test_mode, args)
+ cell = dropout(cell, test_mode, args)
+ rnn_outs.append(rnn_out)
+ rnn_outs_ori.append(rnn_out_ori)
+ rnn_input = rnn_out
+ cells.append(cell)
+ projs.append(input_proj)
+
+ softmax_weight = layers.create_parameter(
+ [vocab_size, emb_size], dtype="float32", name="softmax_weight")
+ softmax_bias = layers.create_parameter(
+ [vocab_size], dtype="float32", name='softmax_bias')
+ projection = layers.matmul(rnn_outs[-1], softmax_weight, transpose_y=True)
+ projection = layers.elementwise_add(projection, softmax_bias)
+
+ projection = layers.reshape(projection, shape=[-1, vocab_size])
+
+ if args.sample_softmax and (not test_mode):
+ loss = layers.sampled_softmax_with_cross_entropy(
+ logits=projection,
+ label=y,
+ num_samples=args.n_negative_samples_batch,
+ seed=args.random_seed)
+ else:
+ label = layers.one_hot(input=y, depth=vocab_size)
+ loss = layers.softmax_with_cross_entropy(
+ logits=projection, label=label, soft_label=True)
+ return [x_emb, projection, loss], rnn_outs, rnn_outs_ori, cells, projs
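+
+# Note: `encoder` builds one direction of the bidirectional language model;
+# LanguageModel.build below instantiates it twice (para_name 'fw_' and 'bw_')
+# over the forward and reversed token streams and averages the two losses.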
+
+
+class LanguageModel(object):
+ def __init__(self, args, vocab_size, test_mode):
+ self.args = args
+ self.vocab_size = vocab_size
+ self.test_mode = test_mode
+
+ def build(self):
+ args = self.args
+ emb_size = args.embed_size
+ proj_size = args.embed_size
+ hidden_size = args.hidden_size
+ batch_size = args.batch_size
+ num_layers = args.num_layers
+ num_steps = args.num_steps
+
+ lstm_outputs = []
+
+ x_f = layers.data(name="x", shape=[1], dtype='int64', lod_level=1)
+ y_f = layers.data(name="y", shape=[1], dtype='int64', lod_level=1)
+
+ x_b = layers.data(name="x_r", shape=[1], dtype='int64', lod_level=1)
+ y_b = layers.data(name="y_r", shape=[1], dtype='int64', lod_level=1)
+
+ init_hiddens_ = layers.data(
+ name="init_hiddens", shape=[1], dtype='float32')
+ init_cells_ = layers.data(
+ name="init_cells", shape=[1], dtype='float32')
+
+ init_hiddens = layers.reshape(
+ init_hiddens_, shape=[2 * num_layers, -1, proj_size])
+ init_cells = layers.reshape(
+ init_cells_, shape=[2 * num_layers, -1, hidden_size])
+
+ init_hidden = layers.slice(
+ init_hiddens, axes=[0], starts=[0], ends=[num_layers])
+ init_cell = layers.slice(
+ init_cells, axes=[0], starts=[0], ends=[num_layers])
+ init_hidden_r = layers.slice(
+ init_hiddens, axes=[0], starts=[num_layers],
+ ends=[2 * num_layers])
+ init_cell_r = layers.slice(
+ init_cells, axes=[0], starts=[num_layers], ends=[2 * num_layers])
+
+ if args.use_custom_samples:
+ custom_samples = layers.data(
+ name="custom_samples",
+ shape=[args.n_negative_samples_batch + 1],
+ dtype='int64',
+ lod_level=1)
+ custom_samples_r = layers.data(
+ name="custom_samples_r",
+ shape=[args.n_negative_samples_batch + 1],
+ dtype='int64',
+ lod_level=1)
+ custom_probabilities = layers.data(
+ name="custom_probabilities",
+ shape=[args.n_negative_samples_batch + 1],
+ dtype='float32',
+ lod_level=1)
+ else:
+ custom_samples = None
+ custom_samples_r = None
+ custom_probabilities = None
+
+ forward, fw_hiddens, fw_hiddens_ori, fw_cells, fw_projs = encoder(
+ x_f,
+ y_f,
+ self.vocab_size,
+ emb_size,
+ init_hidden,
+ init_cell,
+ para_name='fw_',
+ custom_samples=custom_samples,
+ custom_probabilities=custom_probabilities,
+ test_mode=self.test_mode,
+ args=args)
+ backward, bw_hiddens, bw_hiddens_ori, bw_cells, bw_projs = encoder(
+ x_b,
+ y_b,
+ self.vocab_size,
+ emb_size,
+ init_hidden_r,
+ init_cell_r,
+ para_name='bw_',
+ custom_samples=custom_samples_r,
+ custom_probabilities=custom_probabilities,
+ test_mode=self.test_mode,
+ args=args)
+
+ losses = layers.concat([forward[-1], backward[-1]])
+ self.loss = layers.reduce_mean(losses)
+ self.loss.persistable = True
+ self.grad_vars = [x_f, y_f, x_b, y_b, self.loss]
+ self.grad_vars_name = ['x', 'y', 'x_r', 'y_r', 'final_loss']
+ fw_vars_name = ['x_emb', 'proj', 'loss'] + [
+ 'init_hidden', 'init_cell'
+ ] + ['rnn_out', 'rnn_out2', 'cell', 'cell2', 'xproj', 'xproj2']
+ bw_vars_name = ['x_emb_r', 'proj_r', 'loss_r'] + [
+ 'init_hidden_r', 'init_cell_r'
+ ] + [
+ 'rnn_out_r', 'rnn_out2_r', 'cell_r', 'cell2_r', 'xproj_r',
+ 'xproj2_r'
+ ]
+ fw_vars = forward + [init_hidden, init_cell
+ ] + fw_hiddens + fw_cells + fw_projs
+ bw_vars = backward + [init_hidden_r, init_cell_r
+ ] + bw_hiddens + bw_cells + bw_projs
+ for i in range(len(fw_vars_name)):
+ self.grad_vars.append(fw_vars[i])
+ self.grad_vars.append(bw_vars[i])
+ self.grad_vars_name.append(fw_vars_name[i])
+ self.grad_vars_name.append(bw_vars_name[i])
+ if args.use_custom_samples:
+ self.feed_order = [
+ 'x', 'y', 'x_r', 'y_r', 'custom_samples', 'custom_samples_r',
+ 'custom_probabilities'
+ ]
+ else:
+ self.feed_order = ['x', 'y', 'x_r', 'y_r']
+ self.last_hidden = [
+ fluid.layers.sequence_last_step(input=x)
+ for x in fw_hiddens_ori + bw_hiddens_ori
+ ]
+ self.last_cell = [
+ fluid.layers.sequence_last_step(input=x)
+ for x in fw_cells + bw_cells
+ ]
+ self.last_hidden = layers.concat(self.last_hidden, axis=0)
+ self.last_hidden.persistable = True
+ self.last_cell = layers.concat(self.last_cell, axis=0)
+ self.last_cell.persistable = True
diff --git a/PaddleNLP/ELMO/reader.py b/PaddleNLP/ELMO/reader.py
new file mode 100755
index 0000000000000000000000000000000000000000..3645be8502a365122e5ba262786a70e20d9b67fe
--- /dev/null
+++ b/PaddleNLP/ELMO/reader.py
@@ -0,0 +1,130 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import collections
+import os
+import sys
+import numpy as np
+
+Py3 = sys.version_info[0] == 3
+
+
+def listDir(rootDir):
+    res = []
+    for filename in os.listdir(rootDir):
+        pathname = os.path.join(rootDir, filename)
+        if os.path.isfile(pathname):
+            res.append(pathname)
+    return res
+
+
+_unk = -1
+_bos = -1
+_eos = -1
+
+
+def _read_words(filename):
+    with open(filename, "r") as f:
+        data = f.read()
+        if not Py3:
+            data = data.decode("utf-8")
+        # Join lines with a space so tokens on adjacent lines are not merged.
+        return data.replace("\n", " ").split()
+
+
+def _build_vocab(filename):
+ data = _read_words(filename)
+
+ counter = collections.Counter(data)
+ count_pairs = sorted(counter.items(), key=lambda x: (-x[1], x[0]))
+
+ words, _ = list(zip(*count_pairs))
+
+ print("vocab word num", len(words))
+ word_to_id = dict(zip(words, range(len(words))))
+
+ return word_to_id
+
+
+def _load_vocab(filename):
+    global _unk, _bos, _eos
+    with open(filename, "r") as f:
+        words = f.read()
+        if not Py3:
+            words = words.decode("utf-8")
+        words = words.replace("\n", " ").split()
+        word_to_id = dict(zip(words, range(len(words))))
+        # Special tokens assumed to be present in the vocabulary file.
+        _unk = word_to_id['<UNK>']
+        _bos = word_to_id['<S>']
+        _eos = word_to_id['</S>']
+        return word_to_id
+
+
+def _file_to_word_ids(filenames, word_to_id):
+ for filename in filenames:
+ data = _read_words(filename)
+ for id in [word_to_id[word] for word in data if word in word_to_id]:
+ yield id
+
+
+def ptb_raw_data(data_path=None, vocab_path=None, args=None):
+ """Load PTB raw data from data directory "data_path".
+
+ Reads PTB text files, converts strings to integer ids,
+ and performs mini-batching of the inputs.
+
+ The PTB dataset comes from Tomas Mikolov's webpage:
+
+ http://www.fit.vutbr.cz/~imikolov/rnnlm/simple-examples.tgz
+
+ Args:
+ data_path: string path to the directory where simple-examples.tgz has
+ been extracted.
+
+ Returns:
+ tuple (train_data, valid_data, test_data, vocabulary)
+ where each of the data objects can be passed to PTBIterator.
+ """
+    if vocab_path:
+        word_to_id = _load_vocab(vocab_path)
+    else:
+        # Hypothetical guard: without a vocabulary file, word_to_id would be
+        # undefined below, so fail fast.
+        raise ValueError("vocab_path is required to load the vocabulary")
+
+ if not args.train_path:
+ train_path = os.path.join(data_path, "train")
+ train_data = _file_to_word_ids(listDir(train_path), word_to_id)
+ else:
+ train_path = args.train_path
+ train_data = _file_to_word_ids([train_path], word_to_id)
+ valid_path = os.path.join(data_path, "dev")
+ test_path = os.path.join(data_path, "dev")
+ valid_data = _file_to_word_ids(listDir(valid_path), word_to_id)
+ test_data = _file_to_word_ids(listDir(test_path), word_to_id)
+ vocabulary = len(word_to_id)
+ return train_data, valid_data, test_data, vocabulary
+
+
+def get_data_iter(raw_data, batch_size, num_steps):
+ def __impl__():
+ buf = []
+ while True:
+ if len(buf) >= num_steps * batch_size + 1:
+ x = np.asarray(
+ buf[:-1], dtype='int64').reshape((batch_size, num_steps))
+ y = np.asarray(
+ buf[1:], dtype='int64').reshape((batch_size, num_steps))
+ yield (x, y)
+ buf = [buf[-1]]
+            try:
+                # next() works on both Python 2 and Python 3 iterators.
+                buf.append(next(raw_data))
+            except StopIteration:
+ break
+
+ return __impl__
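+
+
+# A minimal usage sketch (for illustration only; the paths are hypothetical
+# and `args` is the parsed argument namespace used elsewhere in this example):
+#
+#   train_data, valid_data, test_data, vocab_size = ptb_raw_data(
+#       data_path="data", vocab_path="data/vocabulary_min5k.txt", args=args)
+#   for x, y in get_data_iter(train_data, batch_size=32, num_steps=20)():
+#       pass  # x, y: int64 arrays of shape [batch_size, num_steps]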
diff --git a/PaddleNLP/ELMO/run.sh b/PaddleNLP/ELMO/run.sh
new file mode 100755
index 0000000000000000000000000000000000000000..a20d00102da255e1799d965346f4fc9af8f2add9
--- /dev/null
+++ b/PaddleNLP/ELMO/run.sh
@@ -0,0 +1,8 @@
+export CUDA_VISIBLE_DEVICES=0
+python train.py \
+--train_path='data/train/sentence_file_*' \
+--test_path='data/dev/sentence_file_*' \
+--vocab_path data/vocabulary_min5k.txt \
+--learning_rate 0.2 \
+--use_gpu True \
+--local True $@
diff --git a/PaddleNLP/ELMO/train.py b/PaddleNLP/ELMO/train.py
new file mode 100755
index 0000000000000000000000000000000000000000..b3f53cdd869801c4fe067676aed395c2fba62e45
--- /dev/null
+++ b/PaddleNLP/ELMO/train.py
@@ -0,0 +1,598 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import six
+import numpy as np
+import time
+import os
+import math
+import random  # used when --enable_ce fixes the random seed
+import paddle
+import paddle.fluid as fluid
+import paddle.fluid.core as core
+import paddle.fluid.framework as framework
+from paddle.fluid.executor import Executor
+import data
+from args import *
+import lm_model
+import logging
+logging.basicConfig()
+import pickle
+
+def prepare_batch_input(batch, args):
+ x = batch['token_ids']
+ x_r = batch['token_ids_reverse']
+ y = batch['next_token_id']
+ y_r = batch['next_token_id_reverse']
+ inst = []
+ for i in range(len(x)):
+ if args.use_custom_samples:
+ custom_samples_array = np.zeros(
+ (args.num_steps, args.n_negative_samples_batch + 1),
+ dtype='int64')
+ custom_samples_array_r = np.zeros(
+ (args.num_steps, args.n_negative_samples_batch + 1),
+ dtype='int64')
+ custom_probabilities_array = np.zeros(
+ (args.num_steps, args.n_negative_samples_batch + 1),
+ dtype='float32')
+ for j in range(args.num_steps):
+ for k in range(args.n_negative_samples_batch + 1):
+ custom_samples_array[j][k] = k
+ custom_samples_array_r[j][k] = k
+ custom_probabilities_array[j][k] = 1.0
+ custom_samples_array[j][0] = y[i][j]
+ custom_samples_array_r[j][0] = y_r[i][j]
+ inst.append([
+ x[i], y[i], x_r[i], y_r[i], custom_samples_array,
+ custom_samples_array_r, custom_probabilities_array
+ ])
+ else:
+ inst.append([x[i], y[i], x_r[i], y_r[i]])
+ return inst
+
+
+def batch_reader(batch_list, args):
+ res = []
+ for batch in batch_list:
+ res.append(prepare_batch_input(batch, args))
+ return res
+
+
+def read_multiple(reader, batch_size, count, clip_last=True):
+ """
+ Stack data from reader for multi-devices.
+ """
+
+ def __impl__():
+ # one time read batch_size * count data for rnn
+ for data in reader():
+ inst_num_per_part = batch_size
+ split_data = {}
+ len_check = True
+ for k in data.keys():
+ if data[k] is not None:
+ if len(data[k]) != batch_size * count:
+ len_check = False
+ print("data check error!!, data=" + data[k] + ", k=" + k)
+ break
+ if len_check:
+ res = []
+ for i in range(count):
+ split_data = {}
+ for k in data.keys():
+ if data[k] is not None:
+ split_data[k] = data[k][inst_num_per_part * i:inst_num_per_part * (i + 1)]
+ res.append(split_data)
+ yield res
+
+ return __impl__
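+
+# Note: with batch_size B and count equal to the device count, each yielded
+# `res` is a list of `count` dicts (one per device); under each key, dict i
+# holds instances [B * i, B * (i + 1)) of the underlying reader's batch.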
+
+
+def LodTensor_Array(lod_tensor):
+ lod = lod_tensor.lod()
+ array = np.array(lod_tensor)
+ new_array = []
+ for i in range(len(lod[0]) - 1):
+ new_array.append(array[lod[0][i]:lod[0][i + 1]])
+ return new_array
+
+
+def get_current_model_para(train_prog, train_exe):
+ param_list = train_prog.block(0).all_parameters()
+ param_name_list = [p.name for p in param_list]
+
+ vals = {}
+ for p_name in param_name_list:
+ p_array = np.array(fluid.global_scope().find_var(p_name).get_tensor())
+ vals[p_name] = p_array
+
+ return vals
+
+
+def save_para_npz(train_prog, train_exe):
+    # Use the module-level "lm" logger; the one created in train() is local.
+    logger = logging.getLogger("lm")
+    logger.info("begin to save model to model_base")
+    param_list = train_prog.block(0).all_parameters()
+    param_name_list = [p.name for p in param_list]
+
+    vals = {}
+    for p_name in param_name_list:
+        p_array = np.array(fluid.global_scope().find_var(p_name).get_tensor())
+        vals[p_name] = p_array
+
+    np.savez("model_base", **vals)
+    logger.info("finished saving model to model_base")
+
+
+def prepare_input(batch, epoch_id=0, with_lr=True):
+ x, y = batch
+ inst = []
+ for i in range(len(x)):
+ inst.append([x[i], y[i]])
+ return inst
+
+
+def eval(vocab, infer_progs, dev_count, logger, args):
+ infer_prog, infer_startup_prog, infer_model = infer_progs
+ feed_order = infer_model.feed_order
+ loss = infer_model.loss
+
+ # prepare device
+ place = core.CUDAPlace(0) if args.use_gpu else core.CPUPlace()
+ exe = Executor(place)
+ if not args.use_gpu:
+ place = fluid.CPUPlace()
+ import multiprocessing
+ dev_count = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
+ else:
+ place = fluid.CUDAPlace(0)
+ dev_count = fluid.core.get_cuda_device_count()
+
+ total_loss = 0.0
+ total_cnt = 0
+ n_batch_cnt = 0
+ n_batch_loss = 0.0
+ val_feed_list = [
+ infer_prog.global_block().var(var_name) for var_name in feed_order
+ ]
+ val_feeder = fluid.DataFeeder(val_feed_list, place)
+ dev_data = data.BidirectionalLMDataset(
+ args.test_path, vocab, test=True, shuffle_on_load=False)
+ dev_data_iter = lambda: dev_data.iter_batches(args.batch_size * dev_count, args.num_steps)
+ dev_reader = read_multiple(dev_data_iter, args.batch_size, dev_count)
+
+ last_hidden_values = np.zeros(
+ (dev_count, args.num_layers * 2 * args.batch_size * args.embed_size),
+ dtype='float32')
+ last_cell_values = np.zeros(
+ (dev_count, args.num_layers * 2 * args.batch_size * args.hidden_size),
+ dtype='float32')
+ for batch_id, batch_list in enumerate(dev_reader(), 1):
+ feed_data = batch_reader(batch_list, args)
+ feed = list(val_feeder.feed_parallel(feed_data, dev_count))
+ for i in range(dev_count):
+ init_hidden_tensor = fluid.core.LoDTensor()
+ if args.use_gpu:
+ placex = fluid.CUDAPlace(i)
+ else:
+ placex = fluid.CPUPlace()
+ init_hidden_tensor.set(last_hidden_values[i], placex)
+ init_cell_tensor = fluid.core.LoDTensor()
+ init_cell_tensor.set(last_cell_values[i], placex)
+
+ feed[i]['init_hiddens'] = init_hidden_tensor
+ feed[i]['init_cells'] = init_cell_tensor
+ last_hidden_values = []
+ last_cell_values = []
+ for i in range(dev_count):
+ val_fetch_outs = exe.run(
+ program=infer_prog,
+ feed=feed[i],
+ fetch_list=[
+ infer_model.loss.name, infer_model.last_hidden.name,
+ infer_model.last_cell.name
+ ],
+ return_numpy=False)
+ last_hidden_values.append(np.array(val_fetch_outs[1]))
+ last_cell_values.append(np.array(val_fetch_outs[2]))
+ total_loss += np.array(val_fetch_outs[0]).sum()
+
+ n_batch_cnt += len(np.array(val_fetch_outs[0]))
+ total_cnt += len(np.array(val_fetch_outs[0]))
+ n_batch_loss += np.array(val_fetch_outs[0]).sum()
+
+ last_hidden_values = np.array(last_hidden_values).reshape((
+ dev_count, args.num_layers * 2 * args.batch_size * args.embed_size))
+ last_cell_values = np.array(last_cell_values).reshape(
+ (dev_count,
+ args.num_layers * 2 * args.batch_size * args.hidden_size))
+
+ log_every_n_batch = args.log_interval
+ if log_every_n_batch > 0 and batch_id % log_every_n_batch == 0:
+ logger.info('Average dev loss from batch {} to {} is {}'.format(
+ batch_id - log_every_n_batch + 1, batch_id, "%.10f" % (
+ n_batch_loss / n_batch_cnt)))
+ n_batch_loss = 0.0
+ n_batch_cnt = 0
+ batch_offset = 0
+
+ ppl = np.exp(total_loss / total_cnt)
+ return ppl
+
+
+def train():
+ args = parse_args()
+ if args.random_seed == 0:
+ args.random_seed = None
+ print("random seed is None")
+ if args.enable_ce:
+ random.seed(args.random_seed)
+ np.random.seed(args.random_seed)
+ logger = logging.getLogger("lm")
+ logger.setLevel(logging.INFO)
+ formatter = logging.Formatter(
+ '%(asctime)s - %(name)s - %(levelname)s - %(message)s')
+ console_handler = logging.StreamHandler()
+ console_handler.setLevel(logging.INFO)
+ console_handler.setFormatter(formatter)
+
+ logger.info('Running with args : {}'.format(args))
+ logger.info('Running paddle : {}'.format(paddle.version.commit))
+
+ hidden_size = args.hidden_size
+ batch_size = args.batch_size
+ data_path = args.data_path
+ logger.info("begin to load vocab")
+ vocab = data.Vocabulary(args.vocab_path, validate_file=True)
+ vocab_size = vocab.size
+ logger.info("finished load vocab")
+
+ logger.info('build the model...')
+ # build model
+ train_prog = fluid.Program()
+ train_startup_prog = fluid.Program()
+ if args.enable_ce:
+ train_prog.random_seed = args.random_seed
+ train_startup_prog.random_seed = args.random_seed
+
+ # build infer model
+ infer_prog = fluid.Program()
+ infer_startup_prog = fluid.Program()
+ with fluid.program_guard(infer_prog, infer_startup_prog):
+ with fluid.unique_name.guard():
+ # Infer process
+ infer_model = lm_model.LanguageModel(
+ args, vocab_size, test_mode=True)
+ infer_model.build()
+ infer_progs = infer_prog, infer_startup_prog, infer_model
+
+ with fluid.program_guard(train_prog, train_startup_prog):
+ with fluid.unique_name.guard():
+ # Training process
+ train_model = lm_model.LanguageModel(
+ args, vocab_size, test_mode=False)
+ train_model.build()
+ fluid.clip.set_gradient_clip(
+ clip=fluid.clip.GradientClipByGlobalNorm(
+ clip_norm=args.max_grad_norm))
+
+ # build optimizer
+ if args.optim == 'adagrad':
+ optimizer = fluid.optimizer.Adagrad(
+ learning_rate=args.learning_rate,
+ epsilon=0.0,
+ initial_accumulator_value=1.0)
+ elif args.optim == 'sgd':
+ optimizer = fluid.optimizer.SGD(
+ learning_rate=args.learning_rate)
+ elif args.optim == 'adam':
+ optimizer = fluid.optimizer.Adam(
+ learning_rate=args.learning_rate)
+ elif args.optim == 'rprop':
+ optimizer = fluid.optimizer.RMSPropOptimizer(
+ learning_rate=args.learning_rate)
+ else:
+ logger.error('Unsupported optimizer: {}'.format(args.optim))
+ exit(-1)
+ optimizer.minimize(train_model.loss * args.num_steps)
+
+ # initialize parameters
+ place = core.CUDAPlace(0) if args.use_gpu else core.CPUPlace()
+ exe = Executor(place)
+ train_progs = train_prog, train_startup_prog, train_model
+
+ if args.local:
+ logger.info("local start_up:")
+ train_loop(args, logger, vocab, train_progs, infer_progs, optimizer)
+ else:
+ if args.update_method == "nccl2":
+ trainer_id = int(os.getenv("PADDLE_TRAINER_ID", "0"))
+ if args.test_nccl:
+ worker_endpoints_env = os.getenv("PADDLE_WORK_ENDPOINTS")
+ worker_endpoints = worker_endpoints_env.split(',')
+ trainers_num = len(worker_endpoints)
+ current_endpoint = worker_endpoints[trainer_id]
+ else:
+ port = os.getenv("PADDLE_PORT")
+ worker_ips = os.getenv("PADDLE_TRAINERS")
+ worker_endpoints = []
+ for ip in worker_ips.split(","):
+ worker_endpoints.append(':'.join([ip, port]))
+ worker_endpoints_env = ','.join(worker_endpoints)
+ trainers_num = len(worker_endpoints)
+ current_endpoint = os.getenv("POD_IP") + ":" + port
+ if trainer_id == 0:
+ logger.info("train_id == 0, sleep 60s")
+ time.sleep(60)
+
+ logger.info("trainers_num:{}".format(trainers_num))
+ logger.info("worker_endpoints:{}".format(worker_endpoints))
+ logger.info("current_endpoint:{}".format(current_endpoint))
+ config = fluid.DistributeTranspilerConfig()
+ config.mode = "nccl2"
+ t = fluid.DistributeTranspiler(config=config)
+ t.transpile(
+ trainer_id,
+ trainers=worker_endpoints_env,
+ current_endpoint=current_endpoint,
+ program=train_prog,
+ startup_program=train_startup_prog)
+ train_progs = train_prog, train_startup_prog, train_model
+ train_loop(args, logger, vocab, train_progs, infer_progs, optimizer,
+ trainers_num, trainer_id, worker_endpoints)
+ else:
+ port = os.getenv("PADDLE_PORT", "6174")
+ pserver_ips = os.getenv("PADDLE_PSERVERS")
+ eplist = []
+ for ip in pserver_ips.split(","):
+ eplist.append(':'.join([ip, port]))
+ pserver_endpoints = ",".join(eplist)
+ trainers = int(os.getenv("PADDLE_TRAINERS_NUM", "0"))
+ current_endpoint = os.getenv("POD_IP") + ":" + port
+ trainer_id = int(os.getenv("PADDLE_TRAINER_ID"))
+
+ logger.info("pserver_endpoints:{}".format(pserver_endpoints))
+ logger.info("current_endpoint:{}".format(current_endpoint))
+ logger.info("trainer_id:{}".format(trainer_id))
+ logger.info("pserver_ips:{}".format(pserver_ips))
+ logger.info("port:{}".format(port))
+
+ t = fluid.DistributeTranspiler()
+ t.transpile(
+ trainer_id,
+ pservers=pserver_endpoints,
+ trainers=trainers,
+ program=train_prog,
+                startup_program=train_startup_prog)
+
+ if training_role == "PSERVER":
+ logger.info("distributed: pserver started")
+ current_endpoint = os.getenv("POD_IP") + ":" + os.getenv(
+ "PADDLE_PORT")
+ if not current_endpoint:
+ logger.critical("need env SERVER_ENDPOINT")
+ exit(1)
+ pserver_prog = t.get_pserver_program(current_endpoint)
+ pserver_startup = t.get_startup_program(current_endpoint,
+ pserver_prog)
+
+ exe.run(pserver_startup)
+ exe.run(pserver_prog)
+ elif training_role == "TRAINER":
+ logger.info("distributed: trainer started")
+ trainer_prog = t.get_trainer_program()
+ train_loop(args, logger, vocab, train_progs, infer_progs,
+ optimizer)
+ else:
+ logger.critical(
+ "environment var TRAINER_ROLE should be TRAINER os PSERVER")
+ exit(1)
+
+def init_pretraining_params(exe,
+ pretraining_params_path,
+ main_program):
+    assert os.path.exists(pretraining_params_path
+                          ), "[%s] cannot be found." % pretraining_params_path
+
+ def existed_params(var):
+ if not isinstance(var, fluid.framework.Parameter):
+ return False
+ return os.path.exists(os.path.join(pretraining_params_path, var.name))
+
+ fluid.io.load_vars(
+ exe,
+ pretraining_params_path,
+ main_program=main_program,
+ predicate=existed_params)
+ print("Load pretraining parameters from {}.".format(
+ pretraining_params_path))
+
+
+def train_loop(args,
+ logger,
+ vocab,
+ train_progs,
+ infer_progs,
+ optimizer,
+ nccl2_num_trainers=1,
+ nccl2_trainer_id=0,
+ worker_endpoints=None):
+ train_prog, train_startup_prog, train_model = train_progs
+ infer_prog, infer_startup_prog, infer_model = infer_progs
+
+ # prepare device
+ place = core.CUDAPlace(0) if args.use_gpu else core.CPUPlace()
+ exe = Executor(place)
+ if not args.use_gpu:
+ place = fluid.CPUPlace()
+ import multiprocessing
+ dev_count = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
+ else:
+ place = fluid.CUDAPlace(0)
+ dev_count = fluid.core.get_cuda_device_count()
+
+ if args.load_dir:
+ logger.info('load pretrained checkpoints from {}'.format(args.load_dir))
+ fluid.io.load_persistables(exe, args.load_dir, main_program=train_prog)
+ elif args.load_pretraining_params:
+ logger.info('load pretrained params from {}'.format(args.load_pretraining_params))
+ exe.run(train_startup_prog)
+ init_pretraining_params(exe, args.load_pretraining_params, main_program=train_prog)
+ else:
+ exe.run(train_startup_prog)
+
+ # prepare data
+ feed_list = [
+ train_prog.global_block().var(var_name)
+ for var_name in train_model.feed_order
+ ]
+ feeder = fluid.DataFeeder(feed_list, place)
+
+ logger.info('Training the model...')
+ exe_strategy = fluid.parallel_executor.ExecutionStrategy()
+ parallel_executor = fluid.ParallelExecutor(
+ loss_name=train_model.loss.name,
+ main_program=train_prog,
+ use_cuda=bool(args.use_gpu),
+ exec_strategy=exe_strategy,
+ num_trainers=nccl2_num_trainers,
+ trainer_id=nccl2_trainer_id)
+
+ logger.info("begin to load data")
+ train_data = data.BidirectionalLMDataset(
+ args.train_path,
+ vocab,
+ test=(not args.shuffle),
+ shuffle_on_load=args.shuffle)
+ logger.info("finished load vocab")
+
+ # get train epoch size
+ log_interval = args.log_interval
+ total_time = 0.0
+ batch_size = args.batch_size
+ hidden_size = args.hidden_size
+ custom_samples_array = np.zeros(
+ (batch_size, args.num_steps, args.n_negative_samples_batch + 1),
+ dtype='int64')
+ custom_probabilities_array = np.zeros(
+ (batch_size, args.num_steps, args.n_negative_samples_batch + 1),
+ dtype='float32')
+ for i in range(batch_size):
+ for j in range(0, args.num_steps):
+ for k in range(0, args.n_negative_samples_batch + 1):
+ custom_samples_array[i][j][k] = k
+ custom_probabilities_array[i][j][k] = 1.0
+
+ for epoch_id in range(args.max_epoch):
+ start_time = time.time()
+ logger.info("epoch id {}".format(epoch_id))
+ train_data_iter = lambda: train_data.iter_batches(batch_size * dev_count, args.num_steps)
+ train_reader = read_multiple(train_data_iter, batch_size, dev_count)
+
+ total_num = 0
+ n_batch_loss = 0.0
+ n_batch_cnt = 0
+ last_hidden_values = np.zeros(
+ (dev_count, args.num_layers * 2 * batch_size * args.embed_size),
+ dtype='float32')
+ last_cell_values = np.zeros(
+ (dev_count, args.num_layers * 2 * batch_size * hidden_size),
+ dtype='float32')
+
+ begin_time = time.time()
+ for batch_id, batch_list in enumerate(train_reader(), 1):
+ feed_data = batch_reader(batch_list, args)
+ feed = list(feeder.feed_parallel(feed_data, dev_count))
+ for i in range(dev_count):
+ init_hidden_tensor = fluid.core.LoDTensor()
+ if args.use_gpu:
+ placex = fluid.CUDAPlace(i)
+ else:
+ placex = fluid.CPUPlace()
+ init_hidden_tensor.set(last_hidden_values[i], placex)
+ init_cell_tensor = fluid.core.LoDTensor()
+ init_cell_tensor.set(last_cell_values[i], placex)
+
+ feed[i]['init_hiddens'] = init_hidden_tensor
+ feed[i]['init_cells'] = init_cell_tensor
+
+ fetch_outs = parallel_executor.run(
+ feed=feed,
+ fetch_list=[
+ train_model.loss.name, train_model.last_hidden.name,
+ train_model.last_cell.name
+ ],
+ return_numpy=False)
+ cost_train = np.array(fetch_outs[0]).mean()
+ last_hidden_values = np.array(fetch_outs[1])
+ last_hidden_values = last_hidden_values.reshape(
+ (dev_count, args.num_layers * 2 * batch_size * args.embed_size))
+ last_cell_values = np.array(fetch_outs[2])
+ last_cell_values = last_cell_values.reshape((
+ dev_count, args.num_layers * 2 * batch_size * args.hidden_size))
+
+ total_num += args.batch_size * dev_count
+ n_batch_loss += np.array(fetch_outs[0]).sum()
+
+ n_batch_cnt += len(np.array(fetch_outs[0]))
+
+ if batch_id > 0 and batch_id % log_interval == 0:
+ smoothed_ppl = np.exp(n_batch_loss / n_batch_cnt)
+ ppl = np.exp(
+ np.array(fetch_outs[0]).sum() /
+ len(np.array(fetch_outs[0])))
+ used_time = time.time() - begin_time
+ speed = log_interval / used_time
+ logger.info(
+ "[train] epoch:{}, step:{}, loss:{:.3f}, ppl:{:.3f}, smoothed_ppl:{:.3f}, speed:{:.3f}".
+ format(epoch_id, batch_id, n_batch_loss / n_batch_cnt, ppl,
+ smoothed_ppl, speed))
+ n_batch_loss = 0.0
+ n_batch_cnt = 0
+ begin_time = time.time()
+ if batch_id > 0 and batch_id % args.dev_interval == 0:
+ valid_ppl = eval(vocab, infer_progs, dev_count, logger, args)
+ logger.info("valid ppl {}".format(valid_ppl))
+ if batch_id > 0 and batch_id % args.save_interval == 0:
+ model_path = os.path.join(args.para_save_dir,
+ str(batch_id + epoch_id))
+ if not os.path.isdir(model_path):
+ os.makedirs(model_path)
+ fluid.io.save_persistables(
+ executor=exe, dirname=model_path, main_program=train_prog)
+
+ end_time = time.time()
+ total_time += end_time - start_time
+ if epoch_id == args.max_epoch - 1 and args.enable_ce:
+ logger.info("lstm_language_model_duration\t%s" %
+ (total_time / args.max_epoch))
+ logger.info("lstm_language_model_loss\t%s" % ppl[0])
+
+ model_path = os.path.join(args.para_save_dir, str(epoch_id))
+ if not os.path.isdir(model_path):
+ os.makedirs(model_path)
+ fluid.io.save_persistables(
+ executor=exe, dirname=model_path, main_program=train_prog)
+ valid_ppl = eval(vocab, infer_progs, dev_count, logger, args)
+ logger.info("valid ppl {}".format(valid_ppl))
+ test_ppl = eval(vocab, infer_progs, dev_count, logger, args)
+ logger.info("test ppl {}".format(test_ppl))
+
+
+if __name__ == '__main__':
+ train()
diff --git a/PaddleNLP/LAC b/PaddleNLP/LAC
new file mode 160000
index 0000000000000000000000000000000000000000..a4eb73b2fb64d8aab8499a1184edf4fc386f8268
--- /dev/null
+++ b/PaddleNLP/LAC
@@ -0,0 +1 @@
+Subproject commit a4eb73b2fb64d8aab8499a1184edf4fc386f8268
diff --git a/PaddleNLP/LARK b/PaddleNLP/LARK
new file mode 160000
index 0000000000000000000000000000000000000000..77ab80a7061024c4b28f0b41fdd6ba42d5e6d9e1
--- /dev/null
+++ b/PaddleNLP/LARK
@@ -0,0 +1 @@
+Subproject commit 77ab80a7061024c4b28f0b41fdd6ba42d5e6d9e1
diff --git a/PaddleNLP/README.md b/PaddleNLP/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..fa81f6a27df879e68e704491225c1fee10930c93
--- /dev/null
+++ b/PaddleNLP/README.md
@@ -0,0 +1,56 @@
+PaddleNLP
+=========
+
+Machine Translation
+-------------------
+
+Machine translation converts text in one natural language (the source language) into another natural language (the target language), and is a fundamental and important research direction in natural language processing. Amid the wave of globalization, the role machine translation plays in enabling communication across languages and cultures is self-evident. The field has progressed through statistical machine translation and then neural machine translation (NMT); only after NMT matured did machine translation see truly large-scale application. Early NMT was mainly based on recurrent neural networks (RNNs), in which each training time step depends on the previous one, making computation hard to parallelize across time steps and training slow to speed up. Non-RNN architectures therefore emerged, such as structures based on convolutional neural networks (CNNs) and on the self-attention mechanism.
+
+The Transformer implemented in this example is such a self-attention-based machine translation model: it contains no RNN or CNN structures and relies entirely on attention to learn contextual dependencies in language. Compared with RNNs/CNNs, this structure has lower per-layer computational complexity, is easier to parallelize, and models long-range dependencies more readily, and it has achieved the best translation quality across multiple language pairs.
+
+- [Transformer](https://github.com/PaddlePaddle/models/blob/develop/PaddleNLP/neural_machine_translation/transformer/README_cn.md)
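+
+For intuition only (this sketch is not the code of the Transformer example itself), the core of self-attention is the scaled dot-product attention below; `q`, `k`, and `v` are hypothetical query, key, and value matrices:
+
+```python
+import numpy as np
+
+def scaled_dot_product_attention(q, k, v):
+    """q, k, v: float arrays of shape [seq_len, d_model]."""
+    scores = q @ k.T / np.sqrt(q.shape[-1])         # pairwise token similarities
+    scores -= scores.max(axis=-1, keepdims=True)    # for numerical stability
+    weights = np.exp(scores)
+    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
+    return weights @ v                              # context-weighted values
+```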
+
+
+Chinese Lexical Analysis
+------------------------
+
+Chinese word segmentation splits continuous natural-language text into a sequence of words that are semantically reasonable and complete. Because the word is the basic unit of meaning in Chinese, segmentation underlies many natural language processing tasks such as text classification, sentiment analysis, and information retrieval. Part-of-speech tagging assigns a part of speech (noun, verb, adjective, adverb, and so on) to every word in a text. Named entity recognition (NER), also known as "proper name recognition", identifies entities with specific meanings in text, mainly person names, place names, organization names, and other proper nouns. We unify these three tasks into one joint task, called lexical analysis, and provide an end-to-end solution based on deep neural networks trained on a massive annotated corpus.
+
+We name this joint Chinese lexical analysis solution LAC, which can be read either as an acronym of Lexical Analysis of Chinese or as a recursive acronym of LAC Analyzes Chinese.
+
+- [LAC](https://github.com/baidu/lac/blob/master/README.md)
+
+Sentiment Analysis
+------------------
+
+Sentiment analysis automatically determines the sentiment polarity (positive, negative, or neutral) of Chinese text containing subjective statements and gives a corresponding confidence score. It helps businesses understand users' consumption habits, analyze trending topics, and monitor public-opinion crises, providing strong support for decision making. Here we release the [model](http://ai.baidu.com/tech/nlp/sentiment_classify) used for sentiment analysis on the Baidu AI Open Platform for public use.
+
+- [Senta](https://github.com/baidu/Senta/blob/master/README.md)
+
+Semantic Matching
+-----------------
+
+Many natural language processing scenarios require measuring the semantic similarity of two texts; such tasks are usually called semantic matching. Examples include ranking search results by the similarity between the query and candidate documents, computing text-to-text similarity for deduplication, and matching candidate answers to questions in automatic question answering.
+
+The DAM (Deep Attention Matching Network) released here is work by Baidu's NLP department published at ACL 2018, used for response selection in multi-turn dialogue of retrieval-based chatbots. Inspired by the Transformer, its network structure is built entirely on the attention mechanism: stacked self-attention learns semantic representations of the response and the context at multiple granularities, and cross-attention then captures the relevance between response and context. DAM outperforms other models on two large-scale multi-turn dialogue datasets.
+
+- [Deep Attention Matching Network](https://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/deep_attention_matching_net)
+
+AnyQ
+----
+
+The [AnyQ](https://github.com/baidu/AnyQ) (ANswer Your Questions) open-source project mainly comprises a question-answering framework for FAQ collections and the semantic matching tool SimNet. The QA framework uses a configurable, plugin-based design in which every capability is added as a plugin, and more than 20 plugins are currently available. Developers can use AnyQ to quickly build and customize FAQ question-answering systems for specific business scenarios and to accelerate iteration and upgrades.
+
+SimNet is a semantic matching framework developed in-house by Baidu's NLP department in 2013 and widely applied across Baidu products. It mainly includes core network structures such as BOW, CNN, RNN, and MM-DNN, and it also integrates mainstream academic semantic matching models such as MatchPyramid, MV-LSTM, and K-NRM. Models built with SimNet can be conveniently plugged into the AnyQ system to strengthen its semantic matching capability.
+
+- [SimNet in PaddlePaddle Fluid](https://github.com/baidu/AnyQ/blob/master/tools/simnet/train/paddle/README.md)
+
+Machine Reading Comprehension
+-----------------------------
+
+Machine reading comprehension (MRC) is one of the core tasks in natural language processing (NLP); the ultimate goal is for machines to read text the way humans do, distill its information, and answer related questions. The broad adoption of deep learning in NLP has greatly improved machine reading comprehension in recent years, but current research still relies on artificially constructed datasets and relatively simple questions, leaving a clear gap to the data humans handle, so large-scale real-world training data is urgently needed to drive MRC forward.
+
+The Baidu reading comprehension dataset is a real-world dataset open-sourced by Baidu's NLP department: all questions and passages come from real data (Baidu Search and the Baidu Zhidao QA community), and the answers are written by humans. Each question corresponds to multiple answers, and the dataset contains 200k questions, 1,000k passages, and 420k answers, making it the largest Chinese MRC dataset to date. Baidu has also open-sourced the corresponding reading comprehension model, DuReader, which adopts the now-common layered network structure: a bidirectional attention mechanism captures the interaction between the question and the passage to produce a query-aware passage representation, from which a pointer network finally predicts the answer span.
+
+- [DuReader in PaddlePaddle Fluid](https://github.com/PaddlePaddle/models/blob/develop/PaddleNLP/machine_reading_comprehension/README.md)
diff --git a/PaddleNLP/Senta b/PaddleNLP/Senta
new file mode 160000
index 0000000000000000000000000000000000000000..dc1af6a83dd1372055158ac6d17f6d14b3a0f0f8
--- /dev/null
+++ b/PaddleNLP/Senta
@@ -0,0 +1 @@
+Subproject commit dc1af6a83dd1372055158ac6d17f6d14b3a0f0f8
diff --git a/PaddleNLP/SimNet b/PaddleNLP/SimNet
new file mode 160000
index 0000000000000000000000000000000000000000..b3e096b92f26720f6e3b020b374e11aa0748c032
--- /dev/null
+++ b/PaddleNLP/SimNet
@@ -0,0 +1 @@
+Subproject commit b3e096b92f26720f6e3b020b374e11aa0748c032
diff --git a/PaddleNLP/chinese_ner/.run_ce.sh b/PaddleNLP/chinese_ner/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..79b2da72c165a9b60b14f2748c8d46eb5b061a7d
--- /dev/null
+++ b/PaddleNLP/chinese_ner/.run_ce.sh
@@ -0,0 +1,20 @@
+#!/bin/bash
+
+export MKL_NUM_THREADS=1
+export OMP_NUM_THREADS=1
+
+
+cudaid=${chinese_ner:=0} # use 0-th card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py --num_passes 300 --device GPU --enable_ce | python _ce.py
+
+cudaid=${chinese_ner_4:=0,1,2,3} # use cards 0,1,2,3 as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py --num_passes 300 --device GPU --parallel True --enable_ce | python _ce.py
+
+export CPU_NUM=1
+export NUM_THREADS=1
+
+FLAGS_benchmark=true python train.py --num_passes 300 --device CPU --enable_ce | python _ce.py
diff --git a/PaddleNLP/chinese_ner/README.md b/PaddleNLP/chinese_ner/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..a458c83b5f1ad9c007d35ddfb7a6578fb14bbf2a
--- /dev/null
+++ b/PaddleNLP/chinese_ner/README.md
@@ -0,0 +1,62 @@
+# Chinese Named Entity Recognition Example Using ParallelExecutor
+
+The brief directory structure of this example is as follows:
+
+```text
+.
+├── data      # data that this example depends on, obtained externally
+├── reader.py # data reading interface, obtained externally
+├── README.md # this document
+├── train.py  # training script
+├── infer.py  # inference script
+```
+
+## Data
+The data directory contains two folders: train_files holds the training data and test_files holds the test data. As an example, two files are placed in each directory; for real training, put your data in the corresponding directory as needed and adapt the data-reading functions in reader.py to your data format.
+
+## Training
+
+Run
+
+```
+python train.py --help
+```
+
+to get help on the command-line arguments. After setting the correct data paths and other parameters, run `train.py` to start training.
+
+The training log looks like this:
+```txt
+pass_id:0, time_cost:4.92960214615s
+[Train] precision:0.000862136531076, recall:0.0059880239521, f1:0.00150726226363
+[Test] precision:0.000796178343949, recall:0.00335758254057, f1:0.00128713933283
+pass_id:1, time_cost:0.715255975723s
+[Train] precision:0.00474094141551, recall:0.00762112139358, f1:0.00584551148225
+[Test] precision:0.0228873239437, recall:0.00727476217124, f1:0.0110403397028
+pass_id:2, time_cost:0.740842103958s
+[Train] precision:0.0120967741935, recall:0.00163309744148, f1:0.00287769784173
+[Test] precision:0, recall:0.0, f1:0
+```
+
+## Inference
+Similar to training, specify the path of the model to evaluate, the test data, and the path for the prediction label file, then run `infer.py` to start inference.
+
+The prediction output looks like this:
+```txt
+152804 O O
+130048 O O
+38862 10-B O
+784 O O
+1540 O O
+4145 O O
+2255 O O
+0 O O
+1279 O O
+7793 O O
+373 O O
+1621 O O
+815 O O
+2 O O
+247 24-B O
+401 24-I O
+```
+The output has three columns separated by "\t": the first is the index of the input word, the second is the gold label, and the third is the predicted label. Input sequences are separated by blank lines; a small parsing sketch follows.
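+
+A minimal sketch for reading this output back into (word_id, gold, pred) triples, assuming exactly the format described above (this helper is illustrative, not part of the example):
+
+```python
+def read_sequences(path):
+    # Yield one sequence at a time as a list of (word_id, gold, pred) triples.
+    seq = []
+    with open(path) as f:
+        for line in f:
+            line = line.rstrip("\n")
+            if not line:  # a blank line separates sequences
+                if seq:
+                    yield seq
+                    seq = []
+                continue
+            word_id, gold, pred = line.split("\t")
+            seq.append((int(word_id), gold, pred))
+    if seq:
+        yield seq
+```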
diff --git a/fluid/PaddleNLP/text_matching_on_quora/models/test.py b/PaddleNLP/chinese_ner/__init__.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/models/test.py
rename to PaddleNLP/chinese_ner/__init__.py
diff --git a/PaddleNLP/chinese_ner/_ce.py b/PaddleNLP/chinese_ner/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..7afa65d346459dd691d93765ffbd6e02406e2e93
--- /dev/null
+++ b/PaddleNLP/chinese_ner/_ce.py
@@ -0,0 +1,66 @@
+# this file is only used for continuous evaluation test!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi
+from kpi import DurationKpi
+from kpi import AccKpi
+
+
+each_pass_duration_cpu1_thread1_kpi = DurationKpi('each_pass_duration_cpu1_thread1', 0.08, 0, actived=True)
+train_recall_cpu1_thread1_kpi = AccKpi('train_recall_cpu1_thread1', 0.08, 0)
+each_pass_duration_gpu1_kpi = DurationKpi('each_pass_duration_gpu1', 0.08, 0, actived=True)
+train_recall_gpu1_kpi = AccKpi('train_recall_gpu1', 0.08, 0)
+each_pass_duration_gpu4_kpi = DurationKpi('each_pass_duration_gpu4', 0.08, 0, actived=True)
+train_recall_gpu4_kpi = AccKpi('train_recall_gpu4', 0.08, 0)
+
+tracking_kpis = [
+ each_pass_duration_cpu1_thread1_kpi,
+ train_recall_cpu1_thread1_kpi,
+ each_pass_duration_gpu1_kpi,
+ train_recall_gpu1_kpi,
+ each_pass_duration_gpu4_kpi,
+ train_recall_gpu4_kpi,
+ ]
+
+
+def parse_log(log):
+    '''
+    This method should be implemented by model developers.
+
+    Each KPI line in the log must contain three tab-separated fields:
+    the literal prefix "kpis", the KPI name, and its value, for example:
+
+    "
+    kpis\teach_pass_duration_gpu1\t1.0
+    kpis\ttrain_recall_gpu1\t0.9
+    "
+    '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ log_to_ce(log)
diff --git a/fluid/PaddleNLP/chinese_ner/data/label_dict b/PaddleNLP/chinese_ner/data/label_dict
similarity index 100%
rename from fluid/PaddleNLP/chinese_ner/data/label_dict
rename to PaddleNLP/chinese_ner/data/label_dict
diff --git a/fluid/PaddleNLP/chinese_ner/data/test_files/test_part_1 b/PaddleNLP/chinese_ner/data/test_files/test_part_1
similarity index 100%
rename from fluid/PaddleNLP/chinese_ner/data/test_files/test_part_1
rename to PaddleNLP/chinese_ner/data/test_files/test_part_1
diff --git a/fluid/PaddleNLP/chinese_ner/data/test_files/test_part_2 b/PaddleNLP/chinese_ner/data/test_files/test_part_2
similarity index 100%
rename from fluid/PaddleNLP/chinese_ner/data/test_files/test_part_2
rename to PaddleNLP/chinese_ner/data/test_files/test_part_2
diff --git a/fluid/PaddleNLP/chinese_ner/data/train_files/train_part_1 b/PaddleNLP/chinese_ner/data/train_files/train_part_1
similarity index 100%
rename from fluid/PaddleNLP/chinese_ner/data/train_files/train_part_1
rename to PaddleNLP/chinese_ner/data/train_files/train_part_1
diff --git a/fluid/PaddleNLP/chinese_ner/data/train_files/train_part_2 b/PaddleNLP/chinese_ner/data/train_files/train_part_2
similarity index 100%
rename from fluid/PaddleNLP/chinese_ner/data/train_files/train_part_2
rename to PaddleNLP/chinese_ner/data/train_files/train_part_2
diff --git a/PaddleNLP/chinese_ner/infer.py b/PaddleNLP/chinese_ner/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..dd0d156b015423cb6805b3eb946383905328c835
--- /dev/null
+++ b/PaddleNLP/chinese_ner/infer.py
@@ -0,0 +1,181 @@
+import numpy as np
+import argparse
+import time
+
+import paddle.fluid as fluid
+import paddle.fluid.profiler as profiler
+import paddle
+
+import reader
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("Run inference.")
+ parser.add_argument(
+ '--batch_size',
+ type=int,
+ default=6,
+ help='The size of a batch. (default: %(default)d)')
+ parser.add_argument(
+ '--device',
+ type=str,
+ default='GPU',
+ choices=['CPU', 'GPU'],
+ help='The device type. (default: %(default)s)')
+ parser.add_argument(
+ '--model_path',
+ type=str,
+ default='output/params_pass_0',
+ help='A path to the model. (default: %(default)s)')
+ parser.add_argument(
+ '--test_data_dir',
+ type=str,
+ default='data/test_files',
+ help='A directory with test data files. (default: %(default)s)')
+ parser.add_argument(
+ '--test_label_file',
+ type=str,
+ default='data/label_dict',
+ help='A file with test labels. (default: %(default)s)')
+ parser.add_argument(
+ '--num_passes', type=int, default=1, help='The number of passes.')
+ parser.add_argument(
+ '--skip_pass_num',
+ type=int,
+ default=0,
+        help='The number of initial passes to skip in statistics calculations.')
+ parser.add_argument(
+ '--profile', action='store_true', help='If set, do profiling.')
+ args = parser.parse_args()
+ return args
+
+
+def print_arguments(args):
+ print('----------- Configuration Arguments -----------')
+ for arg, value in sorted(vars(args).items()):
+ print('%s: %s' % (arg, value))
+ print('------------------------------------------------')
+
+
+def load_reverse_dict(dict_path):
+ return dict((idx, line.strip().split("\t")[0])
+ for idx, line in enumerate(open(dict_path, "r").readlines()))
+
+
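+# Build a LoDTensor from a batch of variable-length sequences: the sequences
+# are flattened into a single column and the level-of-detail (LoD) offsets
+# record where each sequence begins and ends.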
+def to_lodtensor(data, place):
+ seq_lens = [len(seq) for seq in data]
+ cur_len = 0
+ lod = [cur_len]
+ for l in seq_lens:
+ cur_len += l
+ lod.append(cur_len)
+ flattened_data = np.concatenate(data, axis=0).astype("int64")
+ flattened_data = flattened_data.reshape([len(flattened_data), 1])
+ res = fluid.LoDTensor()
+ res.set(flattened_data, place)
+ res.set_lod([lod])
+ return res
+
+
+def infer(args):
+ word = fluid.layers.data(name='word', shape=[1], dtype='int64', lod_level=1)
+ mention = fluid.layers.data(
+ name='mention', shape=[1], dtype='int64', lod_level=1)
+ target = fluid.layers.data(
+ name='target', shape=[1], dtype='int64', lod_level=1)
+
+ label_reverse_dict = load_reverse_dict(args.test_label_file)
+
+ test_data = paddle.batch(
+ reader.file_reader(args.test_data_dir), batch_size=args.batch_size)
+ place = fluid.CUDAPlace(0) if args.device == 'GPU' else fluid.CPUPlace()
+ feeder = fluid.DataFeeder(feed_list=[word, mention, target], place=place)
+ exe = fluid.Executor(place)
+
+ inference_scope = fluid.core.Scope()
+ with fluid.scope_guard(inference_scope):
+ [inference_program, feed_target_names,
+ fetch_targets] = fluid.io.load_inference_model(args.model_path, exe)
+ total_passes = args.num_passes + args.skip_pass_num
+ batch_times = [0] * total_passes
+ word_counts = [0] * total_passes
+ wpses = [0] * total_passes
+ all_iters = 0
+ for pass_id in range(total_passes):
+ if pass_id < args.skip_pass_num:
+ print("Warm-up pass")
+ if pass_id == args.skip_pass_num:
+ profiler.reset_profiler()
+ iters = 0
+ for data in test_data():
+ word = to_lodtensor(list(map(lambda x: x[0], data)), place)
+ mention = to_lodtensor(list(map(lambda x: x[1], data)), place)
+
+ start = time.time()
+ crf_decode = exe.run(inference_program,
+ feed={"word": word,
+ "mention": mention},
+ fetch_list=fetch_targets,
+ return_numpy=False)
+ batch_time = time.time() - start
+ lod_info = (crf_decode[0].lod())[0]
+ np_data = np.array(crf_decode[0])
+ word_count = 0
+ assert len(data) == len(lod_info) - 1
+ for sen_index in range(len(data)):
+ assert len(data[sen_index][0]) == lod_info[
+ sen_index + 1] - lod_info[sen_index]
+ word_index = 0
+ for tag_index in range(lod_info[sen_index],
+ lod_info[sen_index + 1]):
+ word = str(data[sen_index][0][word_index])
+ gold_tag = label_reverse_dict[data[sen_index][2][
+ word_index]]
+ tag = label_reverse_dict[np_data[tag_index][0]]
+ word_index += 1
+ word_count += word_index
+ batch_times[pass_id] += batch_time
+ word_counts[pass_id] += word_count
+ iters += 1
+ all_iters += 1
+ batch_times[pass_id] /= iters
+ word_counts[pass_id] /= iters
+ wps = word_counts[pass_id] / batch_times[pass_id]
+ wpses[pass_id] = wps
+
+ print(
+ "Pass: %d, iterations (total): %d (%d), latency: %.5f s, words: %d, wps: %f"
+ % (pass_id, iters, all_iters, batch_times[pass_id],
+ word_counts[pass_id], wps))
+
+ # Postprocess benchmark data
+ latencies = batch_times[args.skip_pass_num:]
+ latency_avg = np.average(latencies)
+ latency_std = np.std(latencies)
+ latency_pc99 = np.percentile(latencies, 99)
+ wps_avg = np.average(wpses)
+ wps_std = np.std(wpses)
+ wps_pc01 = np.percentile(wpses, 1)
+
+ # Benchmark output
+ print('\nTotal passes (incl. warm-up): %d' % (total_passes))
+ print('Total iterations (incl. warm-up): %d' % (all_iters))
+ print('Total examples (incl. warm-up): %d' % (all_iters * args.batch_size))
+ print('avg latency: %.5f, std latency: %.5f, 99pc latency: %.5f' %
+ (latency_avg, latency_std, latency_pc99))
+ print('avg wps: %.5f, std wps: %.5f, wps for 99pc latency: %.5f' %
+ (wps_avg, wps_std, wps_pc01))
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ print_arguments(args)
+ if args.profile:
+ if args.device == 'GPU':
+ with profiler.cuda_profiler("cuda_profiler.txt", 'csv') as nvprof:
+ infer(args)
+ else:
+ with profiler.profiler('CPU', sorted_key='total') as cpuprof:
+ infer(args)
+ else:
+ infer(args)
diff --git a/fluid/PaddleNLP/chinese_ner/reader.py b/PaddleNLP/chinese_ner/reader.py
similarity index 100%
rename from fluid/PaddleNLP/chinese_ner/reader.py
rename to PaddleNLP/chinese_ner/reader.py
diff --git a/fluid/PaddleNLP/chinese_ner/scripts/README.md b/PaddleNLP/chinese_ner/scripts/README.md
similarity index 100%
rename from fluid/PaddleNLP/chinese_ner/scripts/README.md
rename to PaddleNLP/chinese_ner/scripts/README.md
diff --git a/fluid/PaddleNLP/chinese_ner/scripts/infer.sh b/PaddleNLP/chinese_ner/scripts/infer.sh
similarity index 100%
rename from fluid/PaddleNLP/chinese_ner/scripts/infer.sh
rename to PaddleNLP/chinese_ner/scripts/infer.sh
diff --git a/fluid/PaddleNLP/chinese_ner/scripts/train.sh b/PaddleNLP/chinese_ner/scripts/train.sh
similarity index 100%
rename from fluid/PaddleNLP/chinese_ner/scripts/train.sh
rename to PaddleNLP/chinese_ner/scripts/train.sh
diff --git a/PaddleNLP/chinese_ner/train.py b/PaddleNLP/chinese_ner/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..4a6a61b34390b4c25c0f6042604b25878f0bc134
--- /dev/null
+++ b/PaddleNLP/chinese_ner/train.py
@@ -0,0 +1,403 @@
+import os
+import math
+import time
+import argparse
+
+import numpy as np
+import paddle
+import paddle.fluid as fluid
+from paddle.fluid.initializer import NormalInitializer
+
+import reader
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("Run training.")
+ parser.add_argument(
+ '--batch_size',
+ type=int,
+ default=256,
+ help='The size of a batch. (default: %(default)d)')
+ parser.add_argument(
+ '--word_dict_len',
+ type=int,
+ default=1942563,
+        help='The length of the word dictionary. (default: %(default)d)')
+ parser.add_argument(
+ '--label_dict_len',
+ type=int,
+ default=49,
+        help='The length of the label dictionary. (default: %(default)d)')
+ parser.add_argument(
+ '--device',
+ type=str,
+ default='GPU',
+ choices=['CPU', 'GPU'],
+ help='The device type. (default: %(default)s)')
+ parser.add_argument(
+ '--train_data_dir',
+ type=str,
+ default='data/train_files',
+ help='A directory with train data files. (default: %(default)s)')
+ parser.add_argument(
+ '--parallel',
+ type=bool,
+ default=False,
+ help="Whether to use parallel training. (default: %(default)s)")
+ parser.add_argument(
+ '--test_data_dir',
+ type=str,
+ default='data/test_files',
+ help='A directory with test data files. (default: %(default)s)')
+ parser.add_argument(
+ '--model_save_dir',
+ type=str,
+ default='./output',
+ help='A directory for saving models. (default: %(default)s)')
+ parser.add_argument(
+ '--num_passes',
+ type=int,
+ default=1000,
+ help='The number of epochs. (default: %(default)d)')
+ parser.add_argument(
+ '--enable_ce',
+ action='store_true',
+ help='If set, run the task with continuous evaluation logs.')
+ args = parser.parse_args()
+ return args
+
+
+def print_arguments(args):
+ print('----------- Configuration Arguments -----------')
+ for arg, value in sorted(vars(args).items()):
+ print('%s: %s' % (arg, value))
+ print('------------------------------------------------')
+
+
+def load_reverse_dict(dict_path):
+ return dict((idx, line.strip().split("\t")[0])
+ for idx, line in enumerate(open(dict_path, "r").readlines()))
+
+
+def to_lodtensor(data, place):
+ seq_lens = [len(seq) for seq in data]
+ cur_len = 0
+ lod = [cur_len]
+ for l in seq_lens:
+ cur_len += l
+ lod.append(cur_len)
+ flattened_data = np.concatenate(data, axis=0).astype("int64")
+ flattened_data = flattened_data.reshape([len(flattened_data), 1])
+ res = fluid.LoDTensor()
+ res.set(flattened_data, place)
+ res.set_lod([lod])
+ return res
+
+
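+# Bidirectional GRU-CRF network: word and mention embeddings (forward and
+# reverse copies) are concatenated, passed through forward and reverse dynamic
+# GRU layers, merged, and projected to emission scores for a linear-chain CRF.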
+def ner_net(word_dict_len, label_dict_len):
+ IS_SPARSE = False
+ word_dim = 32
+ mention_dict_len = 57
+ mention_dim = 20
+ grnn_hidden = 36
+ emb_lr = 5
+ init_bound = 0.1
+
+ def _net_conf(word, mark, target):
+ word_embedding = fluid.layers.embedding(
+ input=word,
+ size=[word_dict_len, word_dim],
+ dtype='float32',
+ is_sparse=IS_SPARSE,
+ param_attr=fluid.ParamAttr(
+ learning_rate=emb_lr,
+ name="word_emb",
+ initializer=fluid.initializer.Uniform(
+ low=-init_bound, high=init_bound)))
+
+ mention_embedding = fluid.layers.embedding(
+ input=mention,
+ size=[mention_dict_len, mention_dim],
+ dtype='float32',
+ is_sparse=IS_SPARSE,
+ param_attr=fluid.ParamAttr(
+ learning_rate=emb_lr,
+ name="mention_emb",
+ initializer=fluid.initializer.Uniform(
+ low=-init_bound, high=init_bound)))
+
+ word_embedding_r = fluid.layers.embedding(
+ input=word,
+ size=[word_dict_len, word_dim],
+ dtype='float32',
+ is_sparse=IS_SPARSE,
+ param_attr=fluid.ParamAttr(
+ learning_rate=emb_lr,
+ name="word_emb_r",
+ initializer=fluid.initializer.Uniform(
+ low=-init_bound, high=init_bound)))
+
+ mention_embedding_r = fluid.layers.embedding(
+ input=mention,
+ size=[mention_dict_len, mention_dim],
+ dtype='float32',
+ is_sparse=IS_SPARSE,
+ param_attr=fluid.ParamAttr(
+ learning_rate=emb_lr,
+ name="mention_emb_r",
+ initializer=fluid.initializer.Uniform(
+ low=-init_bound, high=init_bound)))
+
+ word_mention_vector = fluid.layers.concat(
+ input=[word_embedding, mention_embedding], axis=1)
+
+ word_mention_vector_r = fluid.layers.concat(
+ input=[word_embedding_r, mention_embedding_r], axis=1)
+
+ pre_gru = fluid.layers.fc(
+ input=word_mention_vector,
+ size=grnn_hidden * 3,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=-init_bound, high=init_bound),
+ regularizer=fluid.regularizer.L2DecayRegularizer(
+ regularization_coeff=1e-4)))
+ gru = fluid.layers.dynamic_gru(
+ input=pre_gru,
+ size=grnn_hidden,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=-init_bound, high=init_bound),
+ regularizer=fluid.regularizer.L2DecayRegularizer(
+ regularization_coeff=1e-4)))
+
+ pre_gru_r = fluid.layers.fc(
+ input=word_mention_vector_r,
+ size=grnn_hidden * 3,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=-init_bound, high=init_bound),
+ regularizer=fluid.regularizer.L2DecayRegularizer(
+ regularization_coeff=1e-4)))
+ gru_r = fluid.layers.dynamic_gru(
+ input=pre_gru_r,
+ size=grnn_hidden,
+ is_reverse=True,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=-init_bound, high=init_bound),
+ regularizer=fluid.regularizer.L2DecayRegularizer(
+ regularization_coeff=1e-4)))
+
+ gru_merged = fluid.layers.concat(input=[gru, gru_r], axis=1)
+
+ emission = fluid.layers.fc(
+ size=label_dict_len,
+ input=gru_merged,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=-init_bound, high=init_bound),
+ regularizer=fluid.regularizer.L2DecayRegularizer(
+ regularization_coeff=1e-4)))
+
+ crf_cost = fluid.layers.linear_chain_crf(
+ input=emission,
+ label=target,
+ param_attr=fluid.ParamAttr(
+ name='crfw',
+ learning_rate=0.2, ))
+ avg_cost = fluid.layers.mean(x=crf_cost)
+ return avg_cost, emission
+
+ word = fluid.layers.data(name='word', shape=[1], dtype='int64', lod_level=1)
+ mention = fluid.layers.data(
+ name='mention', shape=[1], dtype='int64', lod_level=1)
+ target = fluid.layers.data(
+ name="target", shape=[1], dtype='int64', lod_level=1)
+
+ avg_cost, emission = _net_conf(word, mention, target)
+
+ return avg_cost, emission, word, mention, target
+
+
+def test2(exe, chunk_evaluator, inference_program, test_data, place,
+ cur_fetch_list):
+ chunk_evaluator.reset()
+ for data in test_data():
+ word = to_lodtensor(list(map(lambda x: x[0], data)), place)
+ mention = to_lodtensor(list(map(lambda x: x[1], data)), place)
+ target = to_lodtensor(list(map(lambda x: x[2], data)), place)
+ result_list = exe.run(
+ inference_program,
+ feed={"word": word,
+ "mention": mention,
+ "target": target},
+ fetch_list=cur_fetch_list)
+ number_infer = np.array(result_list[0])
+ number_label = np.array(result_list[1])
+ number_correct = np.array(result_list[2])
+ chunk_evaluator.update(number_infer[0].astype('int64'),
+ number_label[0].astype('int64'),
+ number_correct[0].astype('int64'))
+ return chunk_evaluator.eval()
+
+
+def test(test_exe, chunk_evaluator, inference_program, test_data, place,
+ cur_fetch_list):
+ chunk_evaluator.reset()
+ for data in test_data():
+ word = to_lodtensor(list(map(lambda x: x[0], data)), place)
+ mention = to_lodtensor(list(map(lambda x: x[1], data)), place)
+ target = to_lodtensor(list(map(lambda x: x[2], data)), place)
+ result_list = test_exe.run(
+ fetch_list=cur_fetch_list,
+ feed={"word": word,
+ "mention": mention,
+ "target": target})
+ number_infer = np.array(result_list[0])
+ number_label = np.array(result_list[1])
+ number_correct = np.array(result_list[2])
+ chunk_evaluator.update(number_infer.sum().astype('int64'),
+ number_label.sum().astype('int64'),
+ number_correct.sum().astype('int64'))
+ return chunk_evaluator.eval()
+
+
+def main(args):
+ if not os.path.exists(args.model_save_dir):
+ os.makedirs(args.model_save_dir)
+
+ main = fluid.Program()
+ startup = fluid.Program()
+ if args.enable_ce:
+ SEED = 102
+ main.random_seed = SEED
+ startup.random_seed = SEED
+ with fluid.program_guard(main, startup):
+ avg_cost, feature_out, word, mention, target = ner_net(
+ args.word_dict_len, args.label_dict_len)
+
+ crf_decode = fluid.layers.crf_decoding(
+ input=feature_out, param_attr=fluid.ParamAttr(name='crfw'))
+
+ (precision, recall, f1_score, num_infer_chunks, num_label_chunks,
+ num_correct_chunks) = fluid.layers.chunk_eval(
+ input=crf_decode,
+ label=target,
+ chunk_scheme="IOB",
+ num_chunk_types=int(math.ceil((args.label_dict_len - 1) / 2.0)))
+
+ inference_program = fluid.default_main_program().clone(for_test=True)
+
+ sgd_optimizer = fluid.optimizer.SGD(learning_rate=1e-3)
+ sgd_optimizer.minimize(avg_cost)
+
+ chunk_evaluator = fluid.metrics.ChunkEvaluator()
+
+ train_reader = paddle.batch(
+ paddle.reader.shuffle(
+ reader.file_reader(args.train_data_dir), buf_size=2000000),
+ batch_size=args.batch_size)
+ test_reader = paddle.batch(
+ paddle.reader.shuffle(
+ reader.file_reader(args.test_data_dir), buf_size=2000000),
+ batch_size=args.batch_size)
+
+ place = fluid.CUDAPlace(0) if args.device == 'GPU' else fluid.CPUPlace()
+ feeder = fluid.DataFeeder(
+ feed_list=[word, mention, target], place=place)
+
+ exe = fluid.Executor(place)
+
+ exe.run(startup)
+ if args.parallel:
+ train_exe = fluid.ParallelExecutor(
+ loss_name=avg_cost.name, use_cuda=(args.device == 'GPU'))
+ test_exe = fluid.ParallelExecutor(
+ use_cuda=(args.device == 'GPU'),
+ main_program=inference_program,
+ share_vars_from=train_exe)
+ else:
+ train_exe = exe
+ test_exe = exe
+
+ total_time = 0
+ ce_info = []
+ batch_id = 0
+ for pass_id in range(args.num_passes):
+ chunk_evaluator.reset()
+ train_reader_iter = train_reader()
+ start_time = time.time()
+ while True:
+ try:
+ cur_batch = next(train_reader_iter)
+ cost, nums_infer, nums_label, nums_correct = train_exe.run(
+ fetch_list=[
+ avg_cost.name, num_infer_chunks.name,
+ num_label_chunks.name, num_correct_chunks.name
+ ],
+ feed=feeder.feed(cur_batch))
+ chunk_evaluator.update(
+ np.array(nums_infer).sum().astype("int64"),
+ np.array(nums_label).sum().astype("int64"),
+ np.array(nums_correct).sum().astype("int64"))
+ cost_list = np.array(cost)
+ batch_id += 1
+ except StopIteration:
+ break
+ end_time = time.time()
+ total_time += end_time - start_time
+ print("pass_id:" + str(pass_id) + ", time_cost:" + str(
+ end_time - start_time) + "s")
+ precision, recall, f1_score = chunk_evaluator.eval()
+ print("[Train] precision:" + str(precision) + ", recall:" + str(
+ recall) + ", f1:" + str(f1_score))
+ ce_info.append(recall)
+ p, r, f1 = test2(
+ exe, chunk_evaluator, inference_program, test_reader, place,
+ [num_infer_chunks, num_label_chunks, num_correct_chunks])
+ print("[Test] precision:" + str(p) + ", recall:" + str(r) + ", f1:"
+ + str(f1))
+ save_dirname = os.path.join(args.model_save_dir,
+ "params_pass_%d" % pass_id)
+ fluid.io.save_inference_model(save_dirname, ['word', 'mention'],
+ [crf_decode], exe)
+ # only for ce
+ if args.enable_ce:
+ ce_recall = 0
+ try:
+ ce_recall = ce_info[-2]
+        except IndexError:
+ print("ce info error")
+ epoch_idx = args.num_passes
+ device = get_device(args)
+ if args.device == "GPU":
+ gpu_num = device[1]
+ print("kpis\teach_pass_duration_gpu%s\t%s" %
+ (gpu_num, total_time / epoch_idx))
+ print("kpis\ttrain_recall_gpu%s\t%s" %
+ (gpu_num, ce_recall))
+ else:
+ cpu_num = device[1]
+ threads_num = device[2]
+ print("kpis\teach_pass_duration_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, total_time / epoch_idx))
+ print("kpis\ttrain_recall_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, ce_recall))
+
+
+def get_device(args):
+ if args.device == "GPU":
+ gpus = os.environ.get("CUDA_VISIBLE_DEVICES", "")
+ gpu_num = len(gpus.split(','))
+ return "gpu", gpu_num
+ else:
+ threads_num = os.environ.get('NUM_THREADS', 1)
+ cpu_num = os.environ.get('CPU_NUM', 1)
+ return "cpu", int(cpu_num), int(threads_num)
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ print_arguments(args)
+ main(args)
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/.run_ce.sh b/PaddleNLP/deep_attention_matching_net/.run_ce.sh
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/.run_ce.sh
rename to PaddleNLP/deep_attention_matching_net/.run_ce.sh
diff --git a/PaddleNLP/deep_attention_matching_net/README.md b/PaddleNLP/deep_attention_matching_net/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..37085fe46ee6774b3e553a35d840eb11395da8a0
--- /dev/null
+++ b/PaddleNLP/deep_attention_matching_net/README.md
@@ -0,0 +1,87 @@
+# __Deep Attention Matching Network__
+
+This is the source code of the Deep Attention Matching network (DAM), which is proposed for multi-turn response selection in retrieval-based chatbots.
+
+DAM is a neural matching network based entirely on the attention mechanism. The motivation of DAM is to capture semantic dependencies among dialogue elements at different levels of granularity in multi-turn conversation as matching evidence, in order to better match a response candidate with its multi-turn context. DAM appeared at ACL 2018; please find our paper at [http://aclweb.org/anthology/P18-1103](http://aclweb.org/anthology/P18-1103).
+
+
+## __Network__
+
+DAM is inspired by the Transformer in machine translation (Vaswani et al., 2017). We extend the key attention mechanism of the Transformer in two directions and combine both kinds of attention in one unified neural network.
+
+- **self-attention**: stacking attention over word-level embeddings gradually captures semantic representations at different granularities. These multi-grained semantic representations facilitate exploring segmental dependencies between context and response.
+
+- **cross-attention**: attention across context and response captures the relevance in dependency between segment pairs, which provides information complementary to textual relevance for matching a response with its multi-turn context. A minimal sketch of both operations follows below.
+
+
+
+Overview of Deep Attention Matching Network
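+
+As a rough illustration (not the DAM implementation itself), both operations can be viewed as scaled dot-product attention, differing only in where the queries, keys, and values come from. A minimal NumPy sketch with hypothetical toy shapes:
+
+```python
+import numpy as np
+
+def attention(Q, K, V):
+    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
+    d = Q.shape[-1]
+    scores = Q @ K.T / np.sqrt(d)
+    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
+    weights /= weights.sum(axis=-1, keepdims=True)
+    return weights @ V
+
+rng = np.random.default_rng(0)
+U = rng.normal(size=(10, 64))   # an utterance: 10 tokens, 64-dim embeddings
+R = rng.normal(size=(12, 64))   # a response candidate: 12 tokens
+
+self_rep = attention(U, U, U)   # self-attention: Q, K, V all come from U
+cross_rep = attention(U, R, R)  # cross-attention: U attends to the response R
+```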
+
+
+## __Results__
+
+We test DAM on two large-scale multi-turn response selection tasks, i.e., the Ubuntu Corpus v1 and the Douban Conversation Corpus. Experimental results are below:
+
+
+
+
+
+## __Usage__
+
+Take the experiment on the Ubuntu Corpus v1 as an example.
+
+1) Go to the `ubuntu` directory
+
+```
+cd ubuntu
+```
+2) Download the well-preprocessed data for training
+
+```
+sh download_data.sh
+```
+3) Execute the model training and evaluation by
+
+```
+sh train.sh
+```
+For a more detailed explanation of the arguments, run
+
+```
+python ../train_and_evaluate.py --help
+```
+
+By default, training is executed on a single GPU, which can easily be switched to multi-GPU mode by resetting the visible devices in `train.sh`, e.g.,
+
+```
+export CUDA_VISIBLE_DEVICES=0,1,2,3
+```
+
+4) Run test by
+
+```
+sh test.sh
+```
+and run the test for different saved models by passing a different `--model_path` argument.
+
+Similarly, one can carry out the experiment on the Douban Conversation Corpus by going to the `douban` directory and following the same procedure.
+
+## __Dependencies__
+
+- Python >= 2.7.3
+- PaddlePaddle latest develop branch
+
+## __Citation__
+
+The following article describes DAM in detail. We recommend citing it by default.
+
+```
+@inproceedings{ ,
+ title={Multi-Turn Response Selection for Chatbots with Deep Attention Matching Network},
+ author={Xiangyang Zhou, Lu Li, Daxiang Dong, Yi Liu, Ying Chen, Wayne Xin Zhao, Dianhai Yu and Hua Wu},
+ booktitle={Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
+ volume={1},
+ pages={ -- },
+ year={2018}
+}
+```
diff --git a/PaddleNLP/deep_attention_matching_net/_ce.py b/PaddleNLP/deep_attention_matching_net/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..7ad30288074da3124c33fad6c96fd369a812c77c
--- /dev/null
+++ b/PaddleNLP/deep_attention_matching_net/_ce.py
@@ -0,0 +1,46 @@
+#### This file is only used for continuous evaluation tests!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi, DurationKpi, AccKpi
+
+#### NOTE: kpi.py should be shared among models in some way!
+
+train_cost_kpi = CostKpi('train_cost', 0.02, 0, actived=True)
+train_duration_kpi = DurationKpi('train_duration', 0.05, 0, actived=True)
+
+tracking_kpis = [
+ train_cost_kpi,
+ train_duration_kpi,
+]
+
+
+def parse_log(log):
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ print("-----%s" % fs)
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ print("*****")
+ print(log)
+ print("****")
+ log_to_ce(log)
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/douban/download_data.sh b/PaddleNLP/deep_attention_matching_net/douban/download_data.sh
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/douban/download_data.sh
rename to PaddleNLP/deep_attention_matching_net/douban/download_data.sh
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/douban/test.sh b/PaddleNLP/deep_attention_matching_net/douban/test.sh
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/douban/test.sh
rename to PaddleNLP/deep_attention_matching_net/douban/test.sh
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/douban/train.sh b/PaddleNLP/deep_attention_matching_net/douban/train.sh
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/douban/train.sh
rename to PaddleNLP/deep_attention_matching_net/douban/train.sh
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/images/Figure1.png b/PaddleNLP/deep_attention_matching_net/images/Figure1.png
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/images/Figure1.png
rename to PaddleNLP/deep_attention_matching_net/images/Figure1.png
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/images/Figure2.png b/PaddleNLP/deep_attention_matching_net/images/Figure2.png
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/images/Figure2.png
rename to PaddleNLP/deep_attention_matching_net/images/Figure2.png
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/model.py b/PaddleNLP/deep_attention_matching_net/model.py
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/model.py
rename to PaddleNLP/deep_attention_matching_net/model.py
diff --git a/PaddleNLP/deep_attention_matching_net/test_and_evaluate.py b/PaddleNLP/deep_attention_matching_net/test_and_evaluate.py
new file mode 100644
index 0000000000000000000000000000000000000000..98239220054af17b136deb32da71b7ec81366d86
--- /dev/null
+++ b/PaddleNLP/deep_attention_matching_net/test_and_evaluate.py
@@ -0,0 +1,227 @@
+import os
+import six
+import numpy as np
+import time
+import argparse
+import multiprocessing
+import paddle
+import paddle.fluid as fluid
+import utils.reader as reader
+from utils.util import print_arguments, mkdir
+
+try:
+ import cPickle as pickle #python 2
+except ImportError as e:
+ import pickle #python 3
+
+from model import Net
+
+
+#yapf: disable
+def parse_args():
+ parser = argparse.ArgumentParser("Test for DAM.")
+ parser.add_argument(
+ '--batch_size',
+ type=int,
+ default=256,
+ help='Batch size for training. (default: %(default)d)')
+ parser.add_argument(
+ '--num_scan_data',
+ type=int,
+ default=2,
+ help='Number of pass for training. (default: %(default)d)')
+ parser.add_argument(
+ '--learning_rate',
+ type=float,
+ default=1e-3,
+ help='Learning rate used to train. (default: %(default)f)')
+ parser.add_argument(
+ '--data_path',
+ type=str,
+ default="data/ubuntu/data_small.pkl",
+ help='Path to training data. (default: %(default)s)')
+ parser.add_argument(
+ '--save_path',
+ type=str,
+ default="./",
+ help='Path to save score and result files. (default: %(default)s)')
+ parser.add_argument(
+ '--model_path',
+ type=str,
+ default="saved_models/step_1000",
+ help='Path to load well-trained models. (default: %(default)s)')
+ parser.add_argument(
+ '--use_cuda',
+ action='store_true',
+ help='If set, use cuda for training.')
+ parser.add_argument(
+ '--ext_eval',
+ action='store_true',
+        help='If set, use MAP, MRR, etc. for evaluation.')
+ parser.add_argument(
+ '--max_turn_num',
+ type=int,
+ default=9,
+ help='Maximum number of utterances in context.')
+ parser.add_argument(
+ '--max_turn_len',
+ type=int,
+ default=50,
+        help='Maximum length of sentences in turns.')
+ parser.add_argument(
+ '--word_emb_init',
+ type=str,
+ default=None,
+ help='Path to the initial word embedding.')
+ parser.add_argument(
+ '--vocab_size',
+ type=int,
+ default=434512,
+ help='The size of vocabulary.')
+ parser.add_argument(
+ '--emb_size',
+ type=int,
+ default=200,
+ help='The dimension of word embedding.')
+ parser.add_argument(
+ '--_EOS_',
+ type=int,
+ default=28270,
+ help='The id for end of sentence in vocabulary.')
+ parser.add_argument(
+ '--stack_num',
+ type=int,
+ default=5,
+ help='The number of stacked attentive modules in network.')
+ parser.add_argument(
+ '--channel1_num',
+ type=int,
+ default=32,
+ help="The channels' number of the 1st conv3d layer's output.")
+ parser.add_argument(
+ '--channel2_num',
+ type=int,
+ default=16,
+ help="The channels' number of the 2nd conv3d layer's output.")
+ args = parser.parse_args()
+ return args
+
+
+#yapf: enable
+
+
+def test(args):
+ if not os.path.exists(args.save_path):
+ mkdir(args.save_path)
+ if not os.path.exists(args.model_path):
+ raise ValueError("Invalid model init path %s" % args.model_path)
+    # data config
+ data_conf = {
+ "batch_size": args.batch_size,
+ "max_turn_num": args.max_turn_num,
+ "max_turn_len": args.max_turn_len,
+ "_EOS_": args._EOS_,
+ }
+
+ dam = Net(args.max_turn_num, args.max_turn_len, args.vocab_size,
+ args.emb_size, args.stack_num, args.channel1_num,
+ args.channel2_num)
+ dam.create_data_layers()
+ loss, logits = dam.create_network()
+
+ loss.persistable = True
+ logits.persistable = True
+
+ # gradient clipping
+ fluid.clip.set_gradient_clip(clip=fluid.clip.GradientClipByValue(
+ max=1.0, min=-1.0))
+
+ test_program = fluid.default_main_program().clone(for_test=True)
+ optimizer = fluid.optimizer.Adam(
+ learning_rate=fluid.layers.exponential_decay(
+ learning_rate=args.learning_rate,
+ decay_steps=400,
+ decay_rate=0.9,
+ staircase=True))
+ optimizer.minimize(loss)
+
+ fluid.memory_optimize(fluid.default_main_program())
+
+ if args.use_cuda:
+ place = fluid.CUDAPlace(0)
+ dev_count = fluid.core.get_cuda_device_count()
+ else:
+ place = fluid.CPUPlace()
+ dev_count = multiprocessing.cpu_count()
+
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ fluid.io.load_persistables(exe, args.model_path)
+
+ test_exe = fluid.ParallelExecutor(
+ use_cuda=args.use_cuda, main_program=test_program)
+
+ print("start loading data ...")
+ with open(args.data_path, 'rb') as f:
+ if six.PY2:
+ train_data, val_data, test_data = pickle.load(f)
+ else:
+ train_data, val_data, test_data = pickle.load(f, encoding="bytes")
+ print("finish loading data ...")
+
+ if args.ext_eval:
+ import utils.douban_evaluation as eva
+ eval_metrics = ["MAP", "MRR", "P@1", "R_{10}@1", "R_{10}@2", "R_{10}@5"]
+ else:
+ import utils.evaluation as eva
+ eval_metrics = ["R_2@1", "R_{10}@1", "R_{10}@2", "R_{10}@5"]
+
+ test_batches = reader.build_batches(test_data, data_conf)
+
+ test_batch_num = len(test_batches["response"])
+
+ print("test batch num: %d" % test_batch_num)
+
+ print("begin inference ...")
+ print(time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(time.time())))
+
+ score_path = os.path.join(args.save_path, 'score.txt')
+ score_file = open(score_path, 'w')
+
+ for it in six.moves.xrange(test_batch_num // dev_count):
+ feed_list = []
+ for dev in six.moves.xrange(dev_count):
+ index = it * dev_count + dev
+ batch_data = reader.make_one_batch_input(test_batches, index)
+ feed_dict = dict(zip(dam.get_feed_names(), batch_data))
+ feed_list.append(feed_dict)
+
+ predicts = test_exe.run(feed=feed_list, fetch_list=[logits.name])
+
+ scores = np.array(predicts[0])
+ print("step = %d" % it)
+
+ for dev in six.moves.xrange(dev_count):
+ index = it * dev_count + dev
+ for i in six.moves.xrange(args.batch_size):
+ score_file.write(
+ str(scores[args.batch_size * dev + i][0]) + '\t' + str(
+ test_batches["label"][index][i]) + '\n')
+
+ score_file.close()
+
+ #write evaluation result
+ result = eva.evaluate(score_path)
+ result_file_path = os.path.join(args.save_path, 'result.txt')
+ with open(result_file_path, 'w') as out_file:
+ for metric, p_at in zip(eval_metrics, result):
+ out_file.write(metric + ": " + str(p_at) + '\n')
+ print('finish test')
+ print(time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(time.time())))
+
+
+if __name__ == '__main__':
+ args = parse_args()
+ print_arguments(args)
+ test(args)
diff --git a/PaddleNLP/deep_attention_matching_net/train_and_evaluate.py b/PaddleNLP/deep_attention_matching_net/train_and_evaluate.py
new file mode 100644
index 0000000000000000000000000000000000000000..823d13de9facb2537fbb6358e16c339f2ea5354b
--- /dev/null
+++ b/PaddleNLP/deep_attention_matching_net/train_and_evaluate.py
@@ -0,0 +1,409 @@
+import os
+import six
+import numpy as np
+import time
+import argparse
+import multiprocessing
+import paddle
+import paddle.fluid as fluid
+import utils.reader as reader
+from utils.util import print_arguments, mkdir
+
+try:
+ import cPickle as pickle #python 2
+except ImportError as e:
+ import pickle #python 3
+
+from model import Net
+
+
+#yapf: disable
+def parse_args():
+ parser = argparse.ArgumentParser("Training DAM.")
+ parser.add_argument(
+ '--batch_size',
+ type=int,
+ default=256,
+ help='Batch size for training. (default: %(default)d)')
+ parser.add_argument(
+ '--num_scan_data',
+ type=int,
+ default=2,
+ help='Number of pass for training. (default: %(default)d)')
+ parser.add_argument(
+ '--learning_rate',
+ type=float,
+ default=1e-3,
+ help='Learning rate used to train. (default: %(default)f)')
+ parser.add_argument(
+ '--data_path',
+ type=str,
+ default="data/data_small.pkl",
+ help='Path to training data. (default: %(default)s)')
+ parser.add_argument(
+ '--save_path',
+ type=str,
+ default="saved_models",
+ help='Path to save trained models. (default: %(default)s)')
+ parser.add_argument(
+ '--use_cuda',
+ action='store_true',
+ help='If set, use cuda for training.')
+ parser.add_argument(
+ '--use_pyreader',
+ action='store_true',
+ help='If set, use pyreader for reading data.')
+ parser.add_argument(
+ '--ext_eval',
+ action='store_true',
+        help='If set, use MAP, MRR, etc. for evaluation.')
+ parser.add_argument(
+ '--max_turn_num',
+ type=int,
+ default=9,
+ help='Maximum number of utterances in context.')
+ parser.add_argument(
+ '--max_turn_len',
+ type=int,
+ default=50,
+        help='Maximum length of sentences in turns.')
+ parser.add_argument(
+ '--word_emb_init',
+ type=str,
+ default=None,
+ help='Path to the initial word embedding.')
+ parser.add_argument(
+ '--vocab_size',
+ type=int,
+ default=434512,
+ help='The size of vocabulary.')
+ parser.add_argument(
+ '--emb_size',
+ type=int,
+ default=200,
+ help='The dimension of word embedding.')
+ parser.add_argument(
+ '--_EOS_',
+ type=int,
+ default=28270,
+ help='The id for the end of sentence in vocabulary.')
+ parser.add_argument(
+ '--stack_num',
+ type=int,
+ default=5,
+ help='The number of stacked attentive modules in network.')
+ parser.add_argument(
+ '--channel1_num',
+ type=int,
+ default=32,
+ help="The channels' number of the 1st conv3d layer's output.")
+ parser.add_argument(
+ '--channel2_num',
+ type=int,
+ default=16,
+ help="The channels' number of the 2nd conv3d layer's output.")
+ args = parser.parse_args()
+ return args
+
+
+#yapf: enable
+
+
+def evaluate(score_path, result_file_path):
+ if args.ext_eval:
+ import utils.douban_evaluation as eva
+ else:
+ import utils.evaluation as eva
+ #write evaluation result
+ result = eva.evaluate(score_path)
+ with open(result_file_path, 'w') as out_file:
+ for p_at in result:
+ out_file.write(str(p_at) + '\n')
+ print('finish evaluation')
+ print(time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(time.time())))
+
+
+def test_with_feed(exe, program, feed_names, fetch_list, score_path, batches,
+ batch_num, dev_count):
+ score_file = open(score_path, 'w')
+ for it in six.moves.xrange(batch_num // dev_count):
+ feed_list = []
+ for dev in six.moves.xrange(dev_count):
+ val_index = it * dev_count + dev
+ batch_data = reader.make_one_batch_input(batches, val_index)
+ feed_dict = dict(zip(feed_names, batch_data))
+ feed_list.append(feed_dict)
+
+ predicts = exe.run(feed=feed_list, fetch_list=fetch_list)
+
+ scores = np.array(predicts[0])
+ for dev in six.moves.xrange(dev_count):
+ val_index = it * dev_count + dev
+ for i in six.moves.xrange(args.batch_size):
+ score_file.write(
+ str(scores[args.batch_size * dev + i][0]) + '\t' + str(
+ batches["label"][val_index][i]) + '\n')
+ score_file.close()
+
+
+def test_with_pyreader(exe, program, pyreader, fetch_list, score_path, batches,
+ batch_num, dev_count):
+ def data_provider():
+ for index in six.moves.xrange(batch_num):
+ yield reader.make_one_batch_input(batches, index)
+
+ score_file = open(score_path, 'w')
+ pyreader.decorate_tensor_provider(data_provider)
+ it = 0
+ pyreader.start()
+ while True:
+ try:
+ predicts = exe.run(fetch_list=fetch_list)
+
+ scores = np.array(predicts[0])
+ for dev in six.moves.xrange(dev_count):
+ val_index = it * dev_count + dev
+ for i in six.moves.xrange(args.batch_size):
+ score_file.write(
+ str(scores[args.batch_size * dev + i][0]) + '\t' + str(
+ batches["label"][val_index][i]) + '\n')
+ it += 1
+ except fluid.core.EOFException:
+ pyreader.reset()
+ break
+ score_file.close()
+
+
+def train(args):
+ if not os.path.exists(args.save_path):
+ os.makedirs(args.save_path)
+
+    # data config
+ data_conf = {
+ "batch_size": args.batch_size,
+ "max_turn_num": args.max_turn_num,
+ "max_turn_len": args.max_turn_len,
+ "_EOS_": args._EOS_,
+ }
+
+ dam = Net(args.max_turn_num, args.max_turn_len, args.vocab_size,
+ args.emb_size, args.stack_num, args.channel1_num,
+ args.channel2_num)
+
+ train_program = fluid.Program()
+ train_startup = fluid.Program()
+ if "CE_MODE_X" in os.environ:
+ train_program.random_seed = 110
+ train_startup.random_seed = 110
+ with fluid.program_guard(train_program, train_startup):
+ with fluid.unique_name.guard():
+ if args.use_pyreader:
+ train_pyreader = dam.create_py_reader(
+ capacity=10, name='train_reader')
+ else:
+ dam.create_data_layers()
+ loss, logits = dam.create_network()
+ loss.persistable = True
+ logits.persistable = True
+ # gradient clipping
+ fluid.clip.set_gradient_clip(clip=fluid.clip.GradientClipByValue(
+ max=1.0, min=-1.0))
+
+ optimizer = fluid.optimizer.Adam(
+ learning_rate=fluid.layers.exponential_decay(
+ learning_rate=args.learning_rate,
+ decay_steps=400,
+ decay_rate=0.9,
+ staircase=True))
+ optimizer.minimize(loss)
+ print("begin memory optimization ...")
+ fluid.memory_optimize(train_program)
+ print("end memory optimization ...")
+
+ test_program = fluid.Program()
+ test_startup = fluid.Program()
+ if "CE_MODE_X" in os.environ:
+ test_program.random_seed = 110
+ test_startup.random_seed = 110
+ with fluid.program_guard(test_program, test_startup):
+ with fluid.unique_name.guard():
+ if args.use_pyreader:
+ test_pyreader = dam.create_py_reader(
+ capacity=10, name='test_reader')
+ else:
+ dam.create_data_layers()
+
+ loss, logits = dam.create_network()
+ loss.persistable = True
+ logits.persistable = True
+
+ test_program = test_program.clone(for_test=True)
+
+ if args.use_cuda:
+ place = fluid.CUDAPlace(0)
+ dev_count = fluid.core.get_cuda_device_count()
+ else:
+ place = fluid.CPUPlace()
+ dev_count = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
+
+ print("device count %d" % dev_count)
+ print("theoretical memory usage: ")
+ print(
+ fluid.contrib.memory_usage(
+ program=train_program, batch_size=args.batch_size))
+
+ exe = fluid.Executor(place)
+ exe.run(train_startup)
+ exe.run(test_startup)
+
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=args.use_cuda, loss_name=loss.name, main_program=train_program)
+
+ test_exe = fluid.ParallelExecutor(
+ use_cuda=args.use_cuda,
+ main_program=test_program,
+ share_vars_from=train_exe)
+
+ if args.word_emb_init is not None:
+ print("start loading word embedding init ...")
+ if six.PY2:
+ word_emb = np.array(pickle.load(open(args.word_emb_init,
+ 'rb'))).astype('float32')
+ else:
+ word_emb = np.array(
+ pickle.load(
+ open(args.word_emb_init, 'rb'), encoding="bytes")).astype(
+ 'float32')
+ dam.set_word_embedding(word_emb, place)
+ print("finish init word embedding ...")
+
+ print("start loading data ...")
+ with open(args.data_path, 'rb') as f:
+ if six.PY2:
+ train_data, val_data, test_data = pickle.load(f)
+ else:
+ train_data, val_data, test_data = pickle.load(f, encoding="bytes")
+ print("finish loading data ...")
+
+ val_batches = reader.build_batches(val_data, data_conf)
+
+ batch_num = len(train_data[six.b('y')]) // args.batch_size
+ val_batch_num = len(val_batches["response"])
+
+ print_step = max(1, batch_num // (dev_count * 100))
+ save_step = max(1, batch_num // (dev_count * 10))
+
+ print("begin model training ...")
+ print(time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(time.time())))
+
+ # train on one epoch data by feeding
+ def train_with_feed(step):
+ ave_cost = 0.0
+ for it in six.moves.xrange(batch_num // dev_count):
+ feed_list = []
+ for dev in six.moves.xrange(dev_count):
+ index = it * dev_count + dev
+ batch_data = reader.make_one_batch_input(train_batches, index)
+ feed_dict = dict(zip(dam.get_feed_names(), batch_data))
+ feed_list.append(feed_dict)
+
+ cost = train_exe.run(feed=feed_list, fetch_list=[loss.name])
+
+ ave_cost += np.array(cost[0]).mean()
+ step = step + 1
+ if step % print_step == 0:
+ print("processed: [" + str(step * dev_count * 1.0 / batch_num) +
+ "] ave loss: [" + str(ave_cost / print_step) + "]")
+ ave_cost = 0.0
+
+ if (args.save_path is not None) and (step % save_step == 0):
+ save_path = os.path.join(args.save_path, "step_" + str(step))
+ print("Save model at step %d ... " % step)
+ print(
+ time.strftime('%Y-%m-%d %H:%M:%S',
+ time.localtime(time.time())))
+ fluid.io.save_persistables(exe, save_path, train_program)
+
+ score_path = os.path.join(args.save_path, 'score.' + str(step))
+ test_with_feed(test_exe, test_program,
+ dam.get_feed_names(), [logits.name], score_path,
+ val_batches, val_batch_num, dev_count)
+
+ result_file_path = os.path.join(args.save_path,
+ 'result.' + str(step))
+ evaluate(score_path, result_file_path)
+ return step, np.array(cost[0]).mean()
+
+ # train on one epoch with pyreader
+ def train_with_pyreader(step):
+ def data_provider():
+ for index in six.moves.xrange(batch_num):
+ yield reader.make_one_batch_input(train_batches, index)
+
+ train_pyreader.decorate_tensor_provider(data_provider)
+
+ ave_cost = 0.0
+ train_pyreader.start()
+ while True:
+ try:
+ cost = train_exe.run(fetch_list=[loss.name])
+
+ ave_cost += np.array(cost[0]).mean()
+ step = step + 1
+ if step % print_step == 0:
+ print("processed: [" + str(step * dev_count * 1.0 /
+ batch_num) + "] ave loss: [" +
+ str(ave_cost / print_step) + "]")
+ ave_cost = 0.0
+
+ if (args.save_path is not None) and (step % save_step == 0):
+ save_path = os.path.join(args.save_path,
+ "step_" + str(step))
+ print("Save model at step %d ... " % step)
+ print(
+ time.strftime('%Y-%m-%d %H:%M:%S',
+ time.localtime(time.time())))
+ fluid.io.save_persistables(exe, save_path, train_program)
+
+ score_path = os.path.join(args.save_path,
+ 'score.' + str(step))
+ test_with_pyreader(test_exe, test_program, test_pyreader,
+ [logits.name], score_path, val_batches,
+ val_batch_num, dev_count)
+
+ result_file_path = os.path.join(args.save_path,
+ 'result.' + str(step))
+ evaluate(score_path, result_file_path)
+
+ except fluid.core.EOFException:
+ train_pyreader.reset()
+ break
+ return step, np.array(cost[0]).mean()
+
+    # train over different epochs
+ global_step, train_time = 0, 0.0
+ for epoch in six.moves.xrange(args.num_scan_data):
+ shuffle_train = reader.unison_shuffle(
+ train_data, seed=110 if ("CE_MODE_X" in os.environ) else None)
+ train_batches = reader.build_batches(shuffle_train, data_conf)
+
+ begin_time = time.time()
+ if args.use_pyreader:
+ global_step, last_cost = train_with_pyreader(global_step)
+ else:
+ global_step, last_cost = train_with_feed(global_step)
+
+ pass_time_cost = time.time() - begin_time
+ train_time += pass_time_cost
+ print("Pass {0}, pass_time_cost {1}"
+ .format(epoch, "%2.2f sec" % pass_time_cost))
+ # For internal continuous evaluation
+ if "CE_MODE_X" in os.environ:
+ print("kpis train_cost %f" % last_cost)
+ print("kpis train_duration %f" % train_time)
+
+
+if __name__ == '__main__':
+ args = parse_args()
+ print_arguments(args)
+ train(args)
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/ubuntu/download_data.sh b/PaddleNLP/deep_attention_matching_net/ubuntu/download_data.sh
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/ubuntu/download_data.sh
rename to PaddleNLP/deep_attention_matching_net/ubuntu/download_data.sh
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/ubuntu/test.sh b/PaddleNLP/deep_attention_matching_net/ubuntu/test.sh
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/ubuntu/test.sh
rename to PaddleNLP/deep_attention_matching_net/ubuntu/test.sh
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/ubuntu/train.sh b/PaddleNLP/deep_attention_matching_net/ubuntu/train.sh
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/ubuntu/train.sh
rename to PaddleNLP/deep_attention_matching_net/ubuntu/train.sh
diff --git a/PaddleNLP/deep_attention_matching_net/utils/__init__.py b/PaddleNLP/deep_attention_matching_net/utils/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/utils/douban_evaluation.py b/PaddleNLP/deep_attention_matching_net/utils/douban_evaluation.py
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/utils/douban_evaluation.py
rename to PaddleNLP/deep_attention_matching_net/utils/douban_evaluation.py
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/utils/evaluation.py b/PaddleNLP/deep_attention_matching_net/utils/evaluation.py
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/utils/evaluation.py
rename to PaddleNLP/deep_attention_matching_net/utils/evaluation.py
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/utils/layers.py b/PaddleNLP/deep_attention_matching_net/utils/layers.py
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/utils/layers.py
rename to PaddleNLP/deep_attention_matching_net/utils/layers.py
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/utils/reader.py b/PaddleNLP/deep_attention_matching_net/utils/reader.py
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/utils/reader.py
rename to PaddleNLP/deep_attention_matching_net/utils/reader.py
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/utils/util.py b/PaddleNLP/deep_attention_matching_net/utils/util.py
similarity index 100%
rename from fluid/PaddleNLP/deep_attention_matching_net/utils/util.py
rename to PaddleNLP/deep_attention_matching_net/utils/util.py
diff --git a/PaddleNLP/knowledge-driven-dialogue b/PaddleNLP/knowledge-driven-dialogue
new file mode 160000
index 0000000000000000000000000000000000000000..199af0665f705aaf4ac7db3dd5cb9608201376e7
--- /dev/null
+++ b/PaddleNLP/knowledge-driven-dialogue
@@ -0,0 +1 @@
+Subproject commit 199af0665f705aaf4ac7db3dd5cb9608201376e7
diff --git a/fluid/PaddleNLP/language_model/gru/.run_ce.sh b/PaddleNLP/language_model/gru/.run_ce.sh
similarity index 100%
rename from fluid/PaddleNLP/language_model/gru/.run_ce.sh
rename to PaddleNLP/language_model/gru/.run_ce.sh
diff --git a/PaddleNLP/language_model/gru/README.md b/PaddleNLP/language_model/gru/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..91ce2d7f58085b56da2ac2dec03af2a05985ab8f
--- /dev/null
+++ b/PaddleNLP/language_model/gru/README.md
@@ -0,0 +1,148 @@
+# Language Model
+
+The following is a brief overview of this example's directory structure:
+
+```text
+.
+├── README.md # this document
+├── train.py # training script
+├── infer.py # inference script
+└── utils.py # common utility functions
+```
+
+
+## Introduction
+
+For an introduction to recurrent neural network language models, see the paper [Recurrent Neural Network Regularization](https://arxiv.org/abs/1409.2329). This example implements a GRU-RNN language model.
+
+## Training
+
+Run the following command to start training the model:
+```
+python train.py
+```
+
+The currently supported parameters can be found in the `train_net` function of [train.py](./train.py):
+```python
+vocab, train_reader, test_reader = utils.prepare_data(
+ batch_size=20, # batch size
+ buffer_size=1000, # buffer size, default value is OK
+ word_freq_threshold=0) # vocabulary related parameter, and words with frequency below this value will be filtered
+
+train(train_reader=train_reader,
+ vocab=vocab,
+ network=network,
+ hid_size=200, # embedding and hidden size
+ base_lr=1.0, # base learning rate
+ batch_size=20, # batch size, the same as that in prepare_data
+ pass_num=12, # the number of passes for training
+ use_cuda=True, # whether to use GPU card
+ parallel=False, # whether to be parallel
+ model_dir="model", # directory to save model
+ init_low_bound=-0.1, # uniform parameter initialization lower bound
+ init_high_bound=0.1) # uniform parameter initialization upper bound
+```
+
+## Custom Network Structure
+
+The network structure can be adjusted in the `network` function of [train.py](./train.py); the current structure is as follows:
+```python
+emb = fluid.layers.embedding(input=src, size=[vocab_size, hid_size],
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(low=init_low_bound, high=init_high_bound),
+ learning_rate=emb_lr_x),
+ is_sparse=True)
+
+fc0 = fluid.layers.fc(input=emb, size=hid_size * 3,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(low=init_low_bound, high=init_high_bound),
+ learning_rate=gru_lr_x))
+gru_h0 = fluid.layers.dynamic_gru(input=fc0, size=hid_size,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(low=init_low_bound, high=init_high_bound),
+ learning_rate=gru_lr_x))
+
+fc = fluid.layers.fc(input=gru_h0, size=vocab_size, act='softmax',
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(low=init_low_bound, high=init_high_bound),
+ learning_rate=fc_lr_x))
+
+cost = fluid.layers.cross_entropy(input=fc, label=dst)
+```
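+
+The `ppl` values in the training logs below are the exponential of the average per-token cross-entropy loss defined above. A minimal sketch of that relationship, using hypothetical token probabilities:
+
+```python
+import numpy as np
+
+# probabilities the model assigned to each gold next token (hypothetical values)
+token_probs = np.array([0.1, 0.05, 0.2, 0.01])
+
+cross_entropy = -np.log(token_probs).mean()  # average per-token cross-entropy
+perplexity = np.exp(cross_entropy)
+print(perplexity)  # ~17.8
+```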
+
+## Sample Training Results
+
+Training logs on a single Tesla K40m GPU are shown below:
+```text
+epoch_1 start
+step:100 ppl:771.053
+step:200 ppl:449.597
+step:300 ppl:642.654
+step:400 ppl:458.128
+step:500 ppl:510.912
+step:600 ppl:451.545
+step:700 ppl:364.404
+step:800 ppl:324.272
+step:900 ppl:360.797
+step:1000 ppl:275.761
+step:1100 ppl:294.599
+step:1200 ppl:335.877
+step:1300 ppl:185.262
+step:1400 ppl:241.744
+step:1500 ppl:211.507
+step:1600 ppl:233.431
+step:1700 ppl:298.767
+step:1800 ppl:203.403
+step:1900 ppl:158.828
+step:2000 ppl:171.148
+step:2100 ppl:280.884
+epoch:1 num_steps:2104 time_cost(s):47.478780
+model saved in model/epoch_1
+epoch_2 start
+step:100 ppl:238.099
+step:200 ppl:136.527
+step:300 ppl:204.184
+step:400 ppl:252.886
+step:500 ppl:177.377
+step:600 ppl:197.688
+step:700 ppl:131.650
+step:800 ppl:223.906
+step:900 ppl:144.785
+step:1000 ppl:176.286
+step:1100 ppl:148.158
+step:1200 ppl:203.581
+step:1300 ppl:168.208
+step:1400 ppl:159.412
+step:1500 ppl:114.032
+step:1600 ppl:157.985
+step:1700 ppl:147.743
+step:1800 ppl:88.676
+step:1900 ppl:141.962
+step:2000 ppl:106.087
+step:2100 ppl:122.709
+epoch:2 num_steps:2104 time_cost(s):47.583789
+model saved in model/epoch_2
+...
+```
+
+## Inference
+Run `python infer.py model_dir start_epoch last_epoch(inclusive)` to start inference, where start_epoch specifies the first epoch to evaluate and last_epoch the last (inclusive), e.g.
+```
+python infer.py model 1 12 # prediction from epoch 1 to epoch 12
+```
+
+## Sample Inference Results
+```text
+model:model/epoch_1 ppl:254.540 time_cost(s):3.29
+model:model/epoch_2 ppl:177.671 time_cost(s):3.27
+model:model/epoch_3 ppl:156.251 time_cost(s):3.27
+model:model/epoch_4 ppl:139.036 time_cost(s):3.27
+model:model/epoch_5 ppl:132.661 time_cost(s):3.27
+model:model/epoch_6 ppl:130.092 time_cost(s):3.28
+model:model/epoch_7 ppl:128.751 time_cost(s):3.27
+model:model/epoch_8 ppl:125.411 time_cost(s):3.27
+model:model/epoch_9 ppl:124.604 time_cost(s):3.28
+model:model/epoch_10 ppl:124.754 time_cost(s):3.29
+model:model/epoch_11 ppl:125.421 time_cost(s):3.27
+model:model/epoch_12 ppl:125.676 time_cost(s):3.27
+```
diff --git a/fluid/PaddleNLP/language_model/gru/_ce.py b/PaddleNLP/language_model/gru/_ce.py
similarity index 100%
rename from fluid/PaddleNLP/language_model/gru/_ce.py
rename to PaddleNLP/language_model/gru/_ce.py
diff --git a/fluid/PaddleNLP/language_model/gru/infer.py b/PaddleNLP/language_model/gru/infer.py
similarity index 100%
rename from fluid/PaddleNLP/language_model/gru/infer.py
rename to PaddleNLP/language_model/gru/infer.py
diff --git a/fluid/PaddleNLP/language_model/gru/train.py b/PaddleNLP/language_model/gru/train.py
similarity index 100%
rename from fluid/PaddleNLP/language_model/gru/train.py
rename to PaddleNLP/language_model/gru/train.py
diff --git a/fluid/PaddleNLP/language_model/gru/train_on_cloud.py b/PaddleNLP/language_model/gru/train_on_cloud.py
similarity index 100%
rename from fluid/PaddleNLP/language_model/gru/train_on_cloud.py
rename to PaddleNLP/language_model/gru/train_on_cloud.py
diff --git a/fluid/PaddleNLP/language_model/gru/utils.py b/PaddleNLP/language_model/gru/utils.py
similarity index 100%
rename from fluid/PaddleNLP/language_model/gru/utils.py
rename to PaddleNLP/language_model/gru/utils.py
diff --git a/PaddleNLP/language_model/lstm/.run_ce.sh b/PaddleNLP/language_model/lstm/.run_ce.sh
new file mode 100644
index 0000000000000000000000000000000000000000..a1870244c470b46d39508829016740dbdcdc77f9
--- /dev/null
+++ b/PaddleNLP/language_model/lstm/.run_ce.sh
@@ -0,0 +1,12 @@
+export CUDA_VISIBLE_DEVICES=0
+cd data
+sh download_data.sh
+cd ..
+
+python train.py \
+ --data_path data/simple-examples/data/ \
+ --model_type small \
+ --use_gpu True \
+ --rnn_model static \
+ --enable_ce | python _ce.py
+
diff --git a/PaddleNLP/language_model/lstm/README.md b/PaddleNLP/language_model/lstm/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..f6d1250ff66a066c8634eca9c3f74312f00a7749
--- /dev/null
+++ b/PaddleNLP/language_model/lstm/README.md
@@ -0,0 +1,76 @@
+# LSTM Language Model
+
+The brief directory structure of this example:
+
+```text
+.
+├── README.md # this document
+├── train.py # training script
+├── reader.py # data reader
+└── lm_model.py # model definition
+```
+
+
+## Introduction
+
+For an introduction to recurrent neural network language models, please refer to the paper [Recurrent Neural Network Regularization](https://arxiv.org/abs/1409.2329). This example implements an LSTM-based language model on the PTB dataset, which can be downloaded from
+http://www.fit.vutbr.cz/~imikolov/rnnlm/simple-examples.tgz
+
+## Data download
+You can download and extract the data yourself, or use the script in the data directory:
+
+```shell
+cd data; sh download_data.sh
+```
+
+## Training
+
+Run
+`CUDA_VISIBLE_DEVICES=0 python train.py --data_path data/simple-examples/data/ --model_type small --use_gpu True`
+to start training.
+
+model_type selects the size of the model configuration; small, medium and large are currently supported.
+
+The implementation uses a two-layer LSTM; see train.py and lm_model.py for the detailed hyper-parameters and network configuration. A minimal sketch of the per-step cell computation follows.
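+
+The NumPy sketch below is illustrative only; it mirrors the gate layout of the `static` and `padding` implementations in `lm_model.py`, and the names used here are assumptions rather than code from this repository:
+
+```python
+import numpy as np
+
+def sigmoid(x):
+    return 1.0 / (1.0 + np.exp(-x))
+
+def lstm_step(x, pre_h, pre_c, weight, bias):
+    # weight: [2 * hidden, 4 * hidden], bias: [4 * hidden], as in lm_model.py
+    gate_input = np.concatenate([x, pre_h], axis=1).dot(weight) + bias
+    # slices: input gate (i), cell candidate (j), forget gate (f), output gate (o)
+    i, j, f, o = np.split(gate_input, 4, axis=1)
+    c = pre_c * sigmoid(f) + sigmoid(i) * np.tanh(j)
+    h = np.tanh(c) * sigmoid(o)
+    return h, c
+
+# one step for a batch of 20 with the "small" hidden size of 200
+hidden, batch = 200, 20
+x = np.random.rand(batch, hidden).astype('float32')
+h = np.zeros((batch, hidden), dtype='float32')
+c = np.zeros((batch, hidden), dtype='float32')
+w = np.random.rand(2 * hidden, 4 * hidden).astype('float32')
+b = np.zeros(4 * hidden, dtype='float32')
+h, c = lstm_step(x, h, c, w, b)
+```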
+
+
+## Sample training results
+
+The training log on a P40 GPU is shown below (small config); the test set is only evaluated after the last epoch finishes:
+```text
+epoch id 0
+ppl 232 865.86505 1.0
+ppl 464 632.76526 1.0
+ppl 696 510.47153 1.0
+ppl 928 437.60617 1.0
+ppl 1160 393.38422 1.0
+ppl 1392 353.05365 1.0
+ppl 1624 325.73267 1.0
+ppl 1856 305.488 1.0
+ppl 2088 286.3128 1.0
+ppl 2320 270.91504 1.0
+train ppl 270.86246
+valid ppl 181.867964379
+...
+ppl 2320 40.975872 0.001953125
+train ppl 40.974102
+valid ppl 117.85741214
+test ppl 113.939103843
+```
+## Comparison with TensorFlow
+
+The TensorFlow version used for comparison is 1.6:
+```text
+small config
+ train valid test
+fluid 1.0 40.962 118.111 112.617
+tf 1.6 40.492 118.329 113.788
+
+medium config
+ train valid test
+fluid 1.0 45.620 87.398 83.682
+tf 1.6 45.594 87.363 84.015
+
+large config
+ train valid test
+fluid 1.0 37.221 82.358 78.137
+tf 1.6 38.342 82.311 78.121
+```
diff --git a/fluid/PaddleNLP/language_model/lstm/_ce.py b/PaddleNLP/language_model/lstm/_ce.py
similarity index 100%
rename from fluid/PaddleNLP/language_model/lstm/_ce.py
rename to PaddleNLP/language_model/lstm/_ce.py
diff --git a/PaddleNLP/language_model/lstm/args.py b/PaddleNLP/language_model/lstm/args.py
new file mode 100644
index 0000000000000000000000000000000000000000..1c6957b2bf90eb45bf009e022e7ee27cc4742f4a
--- /dev/null
+++ b/PaddleNLP/language_model/lstm/args.py
@@ -0,0 +1,45 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import argparse
+import distutils.util
+
+
+def parse_args():
+ parser = argparse.ArgumentParser(description=__doc__)
+ parser.add_argument(
+ "--model_type",
+ type=str,
+ default="small",
+ help="model_type [test|small|medium|large]")
+ parser.add_argument(
+ "--rnn_model",
+ type=str,
+ default="static",
+ help="model_type [static|padding|cudnn]")
+ parser.add_argument(
+ "--data_path", type=str, help="all the data for train,valid,test")
+ parser.add_argument('--para_init', action='store_true')
+ parser.add_argument(
+ '--use_gpu', type=distutils.util.strtobool, default=False, help='whether to use gpu')
+ parser.add_argument(
+ '--log_path',
+ help='path of the log file. If not set, logs are printed to console')
+ parser.add_argument('--enable_ce', action='store_true')
+ args = parser.parse_args()
+ return args
diff --git a/fluid/PaddleNLP/language_model/lstm/data/download_data.sh b/PaddleNLP/language_model/lstm/data/download_data.sh
similarity index 100%
rename from fluid/PaddleNLP/language_model/lstm/data/download_data.sh
rename to PaddleNLP/language_model/lstm/data/download_data.sh
diff --git a/PaddleNLP/language_model/lstm/lm_model.py b/PaddleNLP/language_model/lstm/lm_model.py
new file mode 100644
index 0000000000000000000000000000000000000000..21a5dc04994304663aaea02b658d5f5d1ed1ad02
--- /dev/null
+++ b/PaddleNLP/language_model/lstm/lm_model.py
@@ -0,0 +1,299 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import paddle.fluid.layers as layers
+import paddle.fluid as fluid
+from paddle.fluid.layers.control_flow import StaticRNN as PaddingRNN
+import numpy as np
+
+
+def lm_model(hidden_size,
+ vocab_size,
+ batch_size,
+ num_layers=2,
+ num_steps=20,
+ init_scale=0.1,
+ dropout=None,
+ rnn_model='static'):
+ def padding_rnn(input_embedding, seq_len=3, init_hidden=None, init_cell=None):
+ weight_1_arr = []
+ weight_2_arr = []
+ bias_arr = []
+ hidden_array = []
+ cell_array = []
+ mask_array = []
+ for i in range(num_layers):
+ weight_1 = layers.create_parameter([hidden_size * 2, hidden_size*4], dtype="float32", name="fc_weight1_"+str(i), \
+ default_initializer=fluid.initializer.UniformInitializer(low=-init_scale, high=init_scale))
+ weight_1_arr.append(weight_1)
+ bias_1 = layers.create_parameter(
+ [hidden_size * 4],
+ dtype="float32",
+ name="fc_bias1_" + str(i),
+ default_initializer=fluid.initializer.Constant(0.0))
+ bias_arr.append(bias_1)
+
+ pre_hidden = layers.slice(
+ init_hidden, axes=[0], starts=[i], ends=[i + 1])
+ pre_cell = layers.slice(
+ init_cell, axes=[0], starts=[i], ends=[i + 1])
+ pre_hidden = layers.reshape(pre_hidden, shape=[-1, hidden_size])
+ pre_cell = layers.reshape(pre_cell, shape=[-1, hidden_size])
+ hidden_array.append(pre_hidden)
+ cell_array.append(pre_cell)
+
+ input_embedding = layers.transpose(input_embedding, perm=[1, 0, 2])
+ rnn = PaddingRNN()
+
+ with rnn.step():
+ input = rnn.step_input(input_embedding)
+ for k in range(num_layers):
+ pre_hidden = rnn.memory(init=hidden_array[k])
+ pre_cell = rnn.memory(init=cell_array[k])
+ weight_1 = weight_1_arr[k]
+ bias = bias_arr[k]
+
+ nn = layers.concat([input, pre_hidden], 1)
+ gate_input = layers.matmul(x=nn, y=weight_1)
+
+ gate_input = layers.elementwise_add(gate_input, bias)
+ #i, j, f, o = layers.split(gate_input, num_or_sections=4, dim=-1)
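+ # the four hidden_size-wide slices below are the input gate (i), the cell
+ # candidate (j), the forget gate (f) and the output gate (o)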
+ i = layers.slice(
+ gate_input, axes=[1], starts=[0], ends=[hidden_size])
+ j = layers.slice(
+ gate_input,
+ axes=[1],
+ starts=[hidden_size],
+ ends=[hidden_size * 2])
+ f = layers.slice(
+ gate_input,
+ axes=[1],
+ starts=[hidden_size * 2],
+ ends=[hidden_size * 3])
+ o = layers.slice(
+ gate_input,
+ axes=[1],
+ starts=[hidden_size * 3],
+ ends=[hidden_size * 4])
+
+ c = pre_cell * layers.sigmoid(f) + layers.sigmoid(
+ i) * layers.tanh(j)
+ m = layers.tanh(c) * layers.sigmoid(o)
+
+ rnn.update_memory(pre_hidden, m)
+ rnn.update_memory(pre_cell, c)
+
+ rnn.step_output(m)
+ rnn.step_output(c)
+
+ input = m
+
+ if dropout is not None and dropout > 0.0:
+ input = layers.dropout(
+ input,
+ dropout_prob=dropout,
+ dropout_implementation='upscale_in_train')
+
+ rnn.step_output(input)
+ #real_res = layers.concat(res, 0)
+ rnnout = rnn()
+
+ last_hidden_array = []
+ last_cell_array = []
+ real_res = rnnout[-1]
+ for i in range(num_layers):
+ m = rnnout[i * 2]
+ c = rnnout[i * 2 + 1]
+ m.stop_gradient = True
+ c.stop_gradient = True
+ last_h = layers.slice(
+ m, axes=[0], starts=[num_steps - 1], ends=[num_steps])
+ last_hidden_array.append(last_h)
+ last_c = layers.slice(
+ c, axes=[0], starts=[num_steps - 1], ends=[num_steps])
+ last_cell_array.append(last_c)
+ '''
+ else:
+ real_res = rnnout[-1]
+ for i in range( num_layers ):
+
+ m1, c1, m2, c2 = rnnout
+ real_res = m2
+ m1.stop_gradient = True
+ c1.stop_gradient = True
+ c2.stop_gradient = True
+ '''
+
+ #layers.Print( first_hidden, message="22", summarize=10)
+ #layers.Print( rnnout[1], message="11", summarize=10)
+ #real_res = ( rnnout[1] + rnnout[2] + rnnout[3] + rnnout[4]) / 4.0
+ real_res = layers.transpose(x=real_res, perm=[1, 0, 2])
+ last_hidden = layers.concat(last_hidden_array, 0)
+ last_cell = layers.concat(last_cell_array, 0)
+ '''
+ last_hidden = layers.concat( hidden_array, 1 )
+ last_hidden = layers.reshape( last_hidden, shape=[-1, num_layers, hidden_size])
+ last_hidden = layers.transpose( x = last_hidden, perm = [1, 0, 2])
+ last_cell = layers.concat( cell_array, 1)
+ last_cell = layers.reshape( last_cell, shape=[ -1, num_layers, hidden_size])
+ last_cell = layers.transpose( x = last_cell, perm = [1, 0, 2])
+ '''
+
+ return real_res, last_hidden, last_cell
+
+ def encoder_static(input_embedding, seq_len=3, init_hidden=None,
+ init_cell=None):
+
+ weight_1_arr = []
+ weight_2_arr = []
+ bias_arr = []
+ hidden_array = []
+ cell_array = []
+ mask_array = []
+ for i in range(num_layers):
+ weight_1 = layers.create_parameter([hidden_size * 2, hidden_size*4], dtype="float32", name="fc_weight1_"+str(i), \
+ default_initializer=fluid.initializer.UniformInitializer(low=-init_scale, high=init_scale))
+ weight_1_arr.append(weight_1)
+ bias_1 = layers.create_parameter(
+ [hidden_size * 4],
+ dtype="float32",
+ name="fc_bias1_" + str(i),
+ default_initializer=fluid.initializer.Constant(0.0))
+ bias_arr.append(bias_1)
+
+ pre_hidden = layers.slice(
+ init_hidden, axes=[0], starts=[i], ends=[i + 1])
+ pre_cell = layers.slice(
+ init_cell, axes=[0], starts=[i], ends=[i + 1])
+ pre_hidden = layers.reshape(pre_hidden, shape=[-1, hidden_size])
+ pre_cell = layers.reshape(pre_cell, shape=[-1, hidden_size])
+ hidden_array.append(pre_hidden)
+ cell_array.append(pre_cell)
+
+ res = []
+ for index in range(seq_len):
+ input = layers.slice(
+ input_embedding, axes=[1], starts=[index], ends=[index + 1])
+ input = layers.reshape(input, shape=[-1, hidden_size])
+ for k in range(num_layers):
+ pre_hidden = hidden_array[k]
+ pre_cell = cell_array[k]
+ weight_1 = weight_1_arr[k]
+ bias = bias_arr[k]
+
+ nn = layers.concat([input, pre_hidden], 1)
+ gate_input = layers.matmul(x=nn, y=weight_1)
+
+ gate_input = layers.elementwise_add(gate_input, bias)
+ i, j, f, o = layers.split(gate_input, num_or_sections=4, dim=-1)
+
+ c = pre_cell * layers.sigmoid(f) + layers.sigmoid(
+ i) * layers.tanh(j)
+ m = layers.tanh(c) * layers.sigmoid(o)
+
+ hidden_array[k] = m
+ cell_array[k] = c
+ input = m
+
+ if dropout is not None and dropout > 0.0:
+ input = layers.dropout(
+ input,
+ dropout_prob=dropout,
+ dropout_implementation='upscale_in_train')
+
+ res.append(layers.reshape(input, shape=[1, -1, hidden_size]))
+ real_res = layers.concat(res, 0)
+ real_res = layers.transpose(x=real_res, perm=[1, 0, 2])
+ last_hidden = layers.concat(hidden_array, 1)
+ last_hidden = layers.reshape(
+ last_hidden, shape=[-1, num_layers, hidden_size])
+ last_hidden = layers.transpose(x=last_hidden, perm=[1, 0, 2])
+ last_cell = layers.concat(cell_array, 1)
+ last_cell = layers.reshape(
+ last_cell, shape=[-1, num_layers, hidden_size])
+ last_cell = layers.transpose(x=last_cell, perm=[1, 0, 2])
+
+ return real_res, last_hidden, last_cell
+
+ x = layers.data(name="x", shape=[-1, 1, 1], dtype='int64')
+ y = layers.data(name="y", shape=[-1, 1], dtype='float32')
+
+ init_hidden = layers.data(name="init_hidden", shape=[1], dtype='float32')
+ init_cell = layers.data(name="init_cell", shape=[1], dtype='float32')
+
+ init_hidden = layers.reshape(
+ init_hidden, shape=[num_layers, -1, hidden_size])
+ init_cell = layers.reshape(init_cell, shape=[num_layers, -1, hidden_size])
+
+ x_emb = layers.embedding(
+ input=x,
+ size=[vocab_size, hidden_size],
+ dtype='float32',
+ is_sparse=False,
+ param_attr=fluid.ParamAttr(
+ name='embedding_para',
+ initializer=fluid.initializer.UniformInitializer(
+ low=-init_scale, high=init_scale)))
+
+ x_emb = layers.reshape(x_emb, shape=[-1, num_steps, hidden_size])
+ if dropout is not None and dropout > 0.0:
+ x_emb = layers.dropout(
+ x_emb,
+ dropout_prob=dropout,
+ dropout_implementation='upscale_in_train')
+
+ if rnn_model == "padding":
+ rnn_out, last_hidden, last_cell = padding_rnn(
+ x_emb, seq_len=num_steps, init_hidden=init_hidden, init_cell=init_cell)
+ elif rnn_model == "static":
+ rnn_out, last_hidden, last_cell = encoder_static(
+ x_emb, seq_len=num_steps, init_hidden=init_hidden, init_cell=init_cell)
+ elif rnn_model == "cudnn":
+ x_emb = layers.transpose(x_emb, perm=[1, 0, 2])
+ rnn_out, last_hidden, last_cell = layers.lstm(x_emb, init_hidden, init_cell, num_steps, hidden_size, num_layers, \
+ is_bidirec=False, \
+ default_initializer=fluid.initializer.UniformInitializer(low=-init_scale, high=init_scale))
+ rnn_out = layers.transpose(rnn_out, perm=[1, 0, 2])
+ else:
+ print("unsupported rnn_model type: " + rnn_model)
+ return
+ rnn_out = layers.reshape(rnn_out, shape=[-1, num_steps, hidden_size])
+
+
+ softmax_weight = layers.create_parameter([hidden_size, vocab_size], dtype="float32", name="softmax_weight", \
+ default_initializer=fluid.initializer.UniformInitializer(low=-init_scale, high=init_scale))
+ softmax_bias = layers.create_parameter([vocab_size], dtype="float32", name='softmax_bias', \
+ default_initializer=fluid.initializer.UniformInitializer(low=-init_scale, high=init_scale))
+
+ projection = layers.matmul(rnn_out, softmax_weight)
+ projection = layers.elementwise_add(projection, softmax_bias)
+
+ projection = layers.reshape(projection, shape=[-1, vocab_size])
+ #y = layers.reshape( y, shape=[-1, vocab_size])
+
+ loss = layers.softmax_with_cross_entropy(
+ logits=projection, label=y, soft_label=False)
+
+ loss = layers.reshape(loss, shape=[-1, num_steps])
+ loss = layers.reduce_mean(loss, dim=[0])
+ loss = layers.reduce_sum(loss)
+
+ loss.persistable = True
+
+ feeding_list = ['x', 'y', 'init_hidden', 'init_cell']
+ return loss, last_hidden, last_cell, feeding_list
diff --git a/fluid/PaddleNLP/language_model/lstm/reader.py b/PaddleNLP/language_model/lstm/reader.py
similarity index 100%
rename from fluid/PaddleNLP/language_model/lstm/reader.py
rename to PaddleNLP/language_model/lstm/reader.py
diff --git a/PaddleNLP/language_model/lstm/train.py b/PaddleNLP/language_model/lstm/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..2599a92c7a49af28776bbd2e4886c2009e07fa9f
--- /dev/null
+++ b/PaddleNLP/language_model/lstm/train.py
@@ -0,0 +1,305 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import numpy as np
+import time
+import os
+import random
+
+import math
+
+import paddle
+import paddle.fluid as fluid
+import paddle.fluid.core as core
+import paddle.fluid.framework as framework
+from paddle.fluid.executor import Executor
+
+import reader
+
+import sys
+if sys.version[0] == '2':
+ reload(sys)
+ sys.setdefaultencoding("utf-8")
+sys.path.append('..')
+
+from args import *
+import lm_model
+import logging
+import pickle
+
+SEED = 123
+
+
+def get_current_model_para(train_prog, train_exe):
+ param_list = train_prog.block(0).all_parameters()
+ param_name_list = [p.name for p in param_list]
+
+ vals = {}
+ for p_name in param_name_list:
+ p_array = np.array(fluid.global_scope().find_var(p_name).get_tensor())
+ vals[p_name] = p_array
+
+ return vals
+
+
+def save_para_npz(train_prog, train_exe):
+ print("begin to save model to model_base")
+ param_list = train_prog.block(0).all_parameters()
+ param_name_list = [p.name for p in param_list]
+
+ vals = {}
+ for p_name in param_name_list:
+ p_array = np.array(fluid.global_scope().find_var(p_name).get_tensor())
+ vals[p_name] = p_array
+
+ np.savez("model_base", **vals)
+
+
+def train():
+ args = parse_args()
+ model_type = args.model_type
+ rnn_model = args.rnn_model
+ logger = logging.getLogger("lm")
+ logger.setLevel(logging.INFO)
+ formatter = logging.Formatter(
+ '%(asctime)s - %(name)s - %(levelname)s - %(message)s')
+ if args.enable_ce:
+ fluid.default_startup_program().random_seed = SEED
+ if args.log_path:
+ file_handler = logging.FileHandler(args.log_path)
+ file_handler.setLevel(logging.INFO)
+ file_handler.setFormatter(formatter)
+ logger.addHandler(file_handler)
+ else:
+ console_handler = logging.StreamHandler()
+ console_handler.setLevel(logging.INFO)
+ console_handler.setFormatter(formatter)
+ logger.addHandler(console_handler)
+
+ logger.info('Running with args : {}'.format(args))
+
+ vocab_size = 10000
+ if model_type == "test":
+ num_layers = 1
+ batch_size = 2
+ hidden_size = 10
+ num_steps = 3
+ init_scale = 0.1
+ max_grad_norm = 5.0
+ epoch_start_decay = 1
+ max_epoch = 1
+ dropout = 0.0
+ lr_decay = 0.5
+ base_learning_rate = 1.0
+ elif model_type == "small":
+ num_layers = 2
+ batch_size = 20
+ hidden_size = 200
+ num_steps = 20
+ init_scale = 0.1
+ max_grad_norm = 5.0
+ epoch_start_decay = 4
+ max_epoch = 13
+ dropout = 0.0
+ lr_decay = 0.5
+ base_learning_rate = 1.0
+ elif model_type == "medium":
+ num_layers = 2
+ batch_size = 20
+ hidden_size = 650
+ num_steps = 35
+ init_scale = 0.05
+ max_grad_norm = 5.0
+ epoch_start_decay = 6
+ max_epoch = 39
+ dropout = 0.5
+ lr_decay = 0.8
+ base_learning_rate = 1.0
+ elif model_type == "large":
+ num_layers = 2
+ batch_size = 20
+ hidden_size = 1500
+ num_steps = 35
+ init_scale = 0.04
+ max_grad_norm = 10.0
+ epoch_start_decay = 14
+ max_epoch = 55
+ dropout = 0.65
+ lr_decay = 1.0 / 1.15
+ base_learning_rate = 1.0
+ else:
+ print("model type not support")
+ return
+
+ # Training process
+ loss, last_hidden, last_cell, feed_order = lm_model.lm_model(
+ hidden_size,
+ vocab_size,
+ batch_size,
+ num_layers=num_layers,
+ num_steps=num_steps,
+ init_scale=init_scale,
+ dropout=dropout,
+ rnn_model=rnn_model)
+ # clone from default main program and use it as the validation program
+ main_program = fluid.default_main_program()
+ inference_program = fluid.default_main_program().clone(for_test=True)
+
+ fluid.clip.set_gradient_clip(clip=fluid.clip.GradientClipByGlobalNorm(
+ clip_norm=max_grad_norm))
+
+ learning_rate = fluid.layers.create_global_var(
+ name="learning_rate",
+ shape=[1],
+ value=1.0,
+ dtype='float32',
+ persistable=True)
+
+ optimizer = fluid.optimizer.SGD(learning_rate=learning_rate)
+
+ optimizer.minimize(loss)
+
+ place = core.CUDAPlace(0) if args.use_gpu else core.CPUPlace()
+ exe = Executor(place)
+ exe.run(framework.default_startup_program())
+
+ data_path = args.data_path
+ print("begin to load data")
+ raw_data = reader.ptb_raw_data(data_path)
+ print("finished load data")
+ train_data, valid_data, test_data, _ = raw_data
+
+ def prepare_input(batch, init_hidden, init_cell, epoch_id=0, with_lr=True):
+ x, y = batch
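+ # keep base_learning_rate for the first epoch_start_decay epochs, then
+ # shrink it by a factor of lr_decay for every additional epoch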
+ new_lr = base_learning_rate * (lr_decay**max(
+ epoch_id + 1 - epoch_start_decay, 0.0))
+ lr = np.ones((1), dtype='float32') * new_lr
+ res = {}
+ x = x.reshape((-1, num_steps, 1))
+ y = y.reshape((-1, 1))
+
+ res['x'] = x
+ res['y'] = y
+ res['init_hidden'] = init_hidden
+ res['init_cell'] = init_cell
+ if with_lr:
+ res['learning_rate'] = lr
+
+ return res
+
+ def eval(data):
+ # evaluation reuses the training batch_size; hidden/cell states carry over across batches
+ eval_data_iter = reader.get_data_iter(data, batch_size, num_steps)
+ total_loss = 0.0
+ iters = 0
+ init_hidden = np.zeros((num_layers, batch_size, hidden_size), dtype='float32')
+ init_cell = np.zeros((num_layers, batch_size, hidden_size), dtype='float32')
+ for batch_id, batch in enumerate(eval_data_iter):
+ input_data_feed = prepare_input(
+ batch, init_hidden, init_cell, epoch_id, with_lr=False)
+ fetch_outs = exe.run(
+ inference_program,
+ feed=input_data_feed,
+ fetch_list=[loss.name, last_hidden.name, last_cell.name],
+ use_program_cache=True)
+
+ cost_train = np.array(fetch_outs[0])
+ init_hidden = np.array(fetch_outs[1])
+ init_cell = np.array(fetch_outs[2])
+
+ total_loss += cost_train
+ iters += num_steps
+
+ ppl = np.exp(total_loss / iters)
+ return ppl
+
+ # get train epoch size
+ batch_len = len(train_data) // batch_size
+ epoch_size = (batch_len - 1) // num_steps
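+ # each step consumes num_steps tokens from every one of the batch_size
+ # parallel streams; the -1 leaves room for the one-token-shifted targets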
+ log_interval = epoch_size // 10
+ total_time = 0.0
+ for epoch_id in range(max_epoch):
+ start_time = time.time()
+ print("epoch id", epoch_id)
+ train_data_iter = reader.get_data_iter(train_data, batch_size,
+ num_steps)
+
+ total_loss = 0
+
+ init_hidden = None
+ init_cell = None
+ #debug_para(fluid.framework.default_main_program(), parallel_executor)
+ total_loss = 0
+ iters = 0
+ init_hidden = np.zeros(
+ (num_layers, batch_size, hidden_size), dtype='float32')
+ init_cell = np.zeros(
+ (num_layers, batch_size, hidden_size), dtype='float32')
+ for batch_id, batch in enumerate(train_data_iter):
+ input_data_feed = prepare_input(
+ batch, init_hidden, init_cell, epoch_id=epoch_id)
+ fetch_outs = exe.run(feed=input_data_feed,
+ fetch_list=[
+ loss.name, last_hidden.name,
+ last_cell.name, 'learning_rate'
+ ],
+ use_program_cache=True)
+
+ cost_train = np.array(fetch_outs[0])
+ init_hidden = np.array(fetch_outs[1])
+ init_cell = np.array(fetch_outs[2])
+
+ lr = np.array(fetch_outs[3])
+
+ total_loss += cost_train
+ iters += num_steps
+ if batch_id > 0 and batch_id % log_interval == 0:
+ ppl = np.exp(total_loss / iters)
+ print("ppl ", batch_id, ppl[0], lr[0])
+
+ ppl = np.exp(total_loss / iters)
+ if epoch_id == 0 and ppl[0] > 1000:
+ # with a bad initialization the ppl stays above 1000 after the
+ # first epoch, so there is no point in continuing
+ return
+ end_time = time.time()
+ total_time += end_time - start_time
+ print("train ppl", ppl[0])
+
+ if epoch_id == max_epoch - 1 and args.enable_ce:
+ print("ptblm\tlstm_language_model_duration\t%s" %
+ (total_time / max_epoch))
+ print("ptblm\tlstm_language_model_loss\t%s" % ppl[0])
+
+ model_path = os.path.join("model_new/", str(epoch_id))
+ if not os.path.isdir(model_path):
+ os.makedirs(model_path)
+ fluid.io.save_persistables(
+ executor=exe, dirname=model_path, main_program=main_program)
+ valid_ppl = eval(valid_data)
+ print("valid ppl", valid_ppl[0])
+ test_ppl = eval(test_data)
+ print("test ppl", test_ppl[0])
+
+
+if __name__ == '__main__':
+ train()
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/.run_ce.sh b/PaddleNLP/machine_reading_comprehension/.run_ce.sh
similarity index 100%
rename from fluid/PaddleNLP/machine_reading_comprehension/.run_ce.sh
rename to PaddleNLP/machine_reading_comprehension/.run_ce.sh
diff --git a/PaddleNLP/machine_reading_comprehension/README.md b/PaddleNLP/machine_reading_comprehension/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..228291d47c9a350c5f515f60e46c40ba4d1be507
--- /dev/null
+++ b/PaddleNLP/machine_reading_comprehension/README.md
@@ -0,0 +1,91 @@
+DuReader is an end-to-end neural network model for machine reading comprehension: given a question and a set of documents, it locates the answer to the question within the documents. We first use a bi-directional attention network to obtain representations of the documents and the question in a shared vector space, and then use a pointer network to locate the position of the answer in the documents. Experiments show that the model achieves state-of-the-art results on the DuReader dataset.
+
+# Algorithm
+The DuReader model mainly implements the model structures of the papers [BiDAF](https://arxiv.org/abs/1611.01603) and [Match-LSTM](https://arxiv.org/abs/1608.07905).
+
+The model can be divided into five layers:
+
+- **Word embedding layer** maps every word to a vector (which can be pre-trained).
+- **Encoding layer** uses a bidirectional LSTM to obtain contextual information for every word of the question and the documents.
+- **Attention layer** computes a question-aware representation of the documents via bidirectional attention (a sketch follows this list); see [BiDAF](https://arxiv.org/abs/1611.01603) for details.
+- **Fusion layer** uses a bidirectional LSTM to capture contextual dependencies within the document representation.
+- **Output layer** predicts the position of the answer in the documents with a pointer network; see [Match-LSTM](https://arxiv.org/abs/1608.07905) for details.
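+
+As a rough reference, context-to-query attention (one direction of the bidirectional attention) can be sketched in NumPy as below. This is an illustrative simplification (a plain dot-product similarity instead of BiDAF's trilinear function), not the exact computation in `rc_model.py`:
+
+```python
+import numpy as np
+
+def softmax(x, axis=-1):
+    e = np.exp(x - x.max(axis=axis, keepdims=True))
+    return e / e.sum(axis=axis, keepdims=True)
+
+T, J, d = 6, 4, 8                             # passage length, question length, hidden size
+H = np.random.rand(T, d).astype('float32')    # passage encodings
+U = np.random.rand(J, d).astype('float32')    # question encodings
+
+S = H.dot(U.T)                                # similarity matrix [T, J]
+a = softmax(S, axis=1)                        # each passage word attends over question words
+U_tilde = a.dot(U)                            # question-aware passage representation [T, d]
+G = np.concatenate([H, U_tilde, H * U_tilde], axis=1)  # fused features [T, 3d]
+```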
+
+## Data preparation
+### Download the dataset
+Download the dataset with the following script:
+```
+cd data && bash download.sh
+```
+The model uses the DuReader dataset by default, a real-world reading comprehension dataset open-sourced by Baidu; see the [DuReader Dataset Homepage](https://ai.baidu.com//broad/subordinate?dataset=dureader) for more details.
+
+### Download third-party dependencies
+We use Bleu and Rouge as evaluation metrics; their implementation comes from [coco-caption](https://github.com/tylin/coco-caption) and can be downloaded with:
+
+```
+cd utils && bash download_thirdparty.sh
+```
+### Environment requirements
+The model was tested with PaddlePaddle 1.2, so we recommend running it on version 1.2. See the [PaddlePaddle Homepage](http://paddlepaddle.org) for installation instructions.
+
+## Model training
+### Paragraph extraction
+The paragraph-extraction stage refines the document content based on document relevance scores; the extracted results are written to `data/extracted/`. You can skip this step when testing with the demo data. When using the DuReader data, specify the directories of the data to extract as follows:
+```
+bash run.sh --para_extraction --trainset data/preprocessed/trainset/zhidao.train.json data/preprocessed/trainset/search.train.json --devset data/preprocessed/devset/zhidao.dev.json data/preprocessed/devset/search.dev.json --testset data/preprocessed/testset/zhidao.test.json data/preprocessed/testset/search.test.json
+```
+The arguments `trainset`/`devset`/`testset` refer to the training, validation and test sets respectively (same below).
+### Vocabulary preparation
+Before training the model, make sure the data is ready. The preparation step builds a vocabulary from all data files; this vocabulary is used in subsequent training and prediction. Generate it with:
+```
+bash run.sh --prepare
+```
+The command above uses the demo data by default; to use the DuReader dataset, specify the files as follows:
+```
+bash run.sh --prepare --trainset data/extracted/trainset/zhidao.train.json data/extracted/trainset/search.train.json --devset data/extracted/devset/zhidao.dev.json data/extracted/devset/search.dev.json --testset data/extracted/testset/zhidao.test.json data/extracted/testset/search.test.json
+```
+The arguments `trainset`/`devset`/`testset` refer to the training, validation and test sets respectively.
+### Training
+Start training with:
+```
+bash run.sh --train
+```
+The command above uses the demo data by default; to use the DuReader dataset, specify the files as follows:
+```
+bash run.sh --train --trainset data/extracted/trainset/zhidao.train.json data/extracted/trainset/search.train.json --devset data/extracted/devset/zhidao.dev.json data/extracted/devset/search.dev.json --testset data/extracted/testset/zhidao.test.json data/extracted/testset/search.test.json
+```
+The training configuration can be changed through hyper-parameters, e.g. `--learning_rate NUM` sets the learning rate and `--pass_num NUM` sets the number of training epochs.
+During training, the metrics on the validation set are evaluated at a fixed iteration interval, which is set with `--dev_interval NUM`.
+
+### Evaluation
+After training finishes, evaluate a trained model and obtain the metrics with:
+```
+bash run.sh --evaluate --load_dir data/models/1
+```
+where `--load_dir data/models/1` is the checkpoint directory of the model.
+
+The command above uses the demo data by default; to use the DuReader dataset, specify the files as follows:
+```
+bash run.sh --evaluate --load_dir data/models/1 --devset data/extracted/devset/zhidao.dev.json data/extracted/devset/search.dev.json --testset data/extracted/testset/zhidao.test.json data/extracted/testset/search.test.json
+```
+
+### Prediction
+Use a trained model to directly predict answers for the question-document data with:
+```
+bash run.sh --predict --load_dir data/models/1
+```
+The command above uses the demo data by default; to use the DuReader dataset, specify the files as follows:
+```
+bash run.sh --predict --load_dir data/models/1 --testset data/extracted/testset/search.test.json data/extracted/testset/zhidao.test.json
+```
+`--testset` specifies the datasets to predict on. The generated answers are written to `data/results/` by default; change this with `--result_dir DIR_PATH`.
+
+### Results
+Validation set ROUGE-L: 47.65.
+
+This result was obtained on P40 GPUs with 4 cards and a batch size of 4*32, training for 5 epochs (about 30 hours). With a single card the metric may drop slightly, but the validation ROUGE-L remains no lower than 47.
+
+## References
+[Machine Comprehension Using Match-LSTM and Answer Pointer](https://arxiv.org/abs/1608.07905)
+
+[Bidirectional Attention Flow for Machine Comprehension](https://arxiv.org/abs/1611.01603)
diff --git a/PaddleNLP/machine_reading_comprehension/_ce.py b/PaddleNLP/machine_reading_comprehension/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..a425fe951fb587749f31b18959917cdeed76a41d
--- /dev/null
+++ b/PaddleNLP/machine_reading_comprehension/_ce.py
@@ -0,0 +1,69 @@
+#### This file is only used for the continuous evaluation (CE) test!
+
+import os
+import sys
+#sys.path.insert(0, os.environ['ceroot'])
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi, DurationKpi, AccKpi
+
+#### NOTE: kpi.py should be shared across models in some way.
+
+train_cost_card1_kpi = CostKpi('train_cost_card1', 0.02, 0, actived=True)
+test_cost_card1_kpi = CostKpi('test_cost_card1', 0.005, 0, actived=True)
+train_duration_card1_kpi = DurationKpi(
+ 'train_duration_card1', 0.06, 0, actived=True)
+train_cost_card4_kpi = CostKpi('train_cost_card4', 0.01, 0, actived=True)
+test_cost_card4_kpi = CostKpi('test_cost_card4', 0.005, 0, actived=True)
+train_duration_card4_kpi = DurationKpi(
+ 'train_duration_card4', 0.06, 0, actived=True)
+
+tracking_kpis = [
+ train_cost_card1_kpi,
+ test_cost_card1_kpi,
+ train_duration_card1_kpi,
+ train_cost_card4_kpi,
+ test_cost_card4_kpi,
+ train_duration_card4_kpi,
+]
+
+
+def parse_log(log):
+ '''
+ This method should be implemented by model developers.
+ The suggestion:
+ each KPI line in the log should carry three tab-separated fields,
+ the literal "kpis", the KPI name and its value, for example:
+ "
+ kpis\ttrain_cost\t1.0
+ kpis\ttest_cost\t1.0
+ kpis\ttrain_acc\t1.2
+ "
+ '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ print("-----%s" % fs)
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ print("*****")
+ print(log)
+ print("****")
+ log_to_ce(log)
diff --git a/PaddleNLP/machine_reading_comprehension/args.py b/PaddleNLP/machine_reading_comprehension/args.py
new file mode 100644
index 0000000000000000000000000000000000000000..96c4e35cf3fe822e2794b41a385c106882935906
--- /dev/null
+++ b/PaddleNLP/machine_reading_comprehension/args.py
@@ -0,0 +1,91 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import argparse
+import distutils.util
+
+
+def parse_args():
+ parser = argparse.ArgumentParser(description=__doc__)
+ parser.add_argument(
+ '--prepare',
+ action='store_true',
+ help='create the directories, prepare the vocabulary and embeddings')
+ parser.add_argument('--train', action='store_true', help='train the model')
+ parser.add_argument('--evaluate', action='store_true', help='evaluate the model on dev set')
+ parser.add_argument('--predict', action='store_true',
+ help='predict the answers for test set with trained model')
+
+ parser.add_argument("--embed_size", type=int, default=300,
+ help="The dimension of embedding table. (default: %(default)d)")
+ parser.add_argument("--hidden_size", type=int, default=150,
+ help="The size of rnn hidden unit. (default: %(default)d)")
+ parser.add_argument("--learning_rate", type=float, default=0.001,
+ help="Learning rate used to train the model. (default: %(default)f)")
+ parser.add_argument('--optim', default='adam', help='optimizer type')
+ parser.add_argument("--weight_decay", type=float, default=0.0001,
+ help="Weight decay. (default: %(default)f)")
+
+ parser.add_argument('--drop_rate', type=float, default=0.0, help="Dropout probability")
+ parser.add_argument('--random_seed', type=int, default=123)
+ parser.add_argument("--batch_size", type=int, default=32,
+ help="The sequence number of a mini-batch data. (default: %(default)d)")
+ parser.add_argument("--pass_num", type=int, default=5,
+ help="The number epochs to train. (default: %(default)d)")
+ parser.add_argument("--use_gpu", type=distutils.util.strtobool, default=True,
+ help="Whether to use gpu. (default: %(default)d)")
+ parser.add_argument("--log_interval", type=int, default=50,
+ help="log the train loss every n batches. (default: %(default)d)")
+
+ parser.add_argument('--max_p_num', type=int, default=5)
+ parser.add_argument('--max_a_len', type=int, default=200)
+ parser.add_argument('--max_p_len', type=int, default=500)
+ parser.add_argument('--max_q_len', type=int, default=60)
+ parser.add_argument('--doc_num', type=int, default=5)
+
+ parser.add_argument('--vocab_dir', default='data/vocab', help='vocabulary')
+ parser.add_argument("--save_dir", type=str, default="data/models",
+ help="Specify the path to save trained models.")
+ parser.add_argument("--save_interval", type=int, default=1,
+ help="Save the trained model every n passes. (default: %(default)d)")
+ parser.add_argument("--load_dir", type=str, default="",
+ help="Specify the path to load trained models.")
+ parser.add_argument('--log_path',
+ help='path of the log file. If not set, logs are printed to console')
+ parser.add_argument('--result_dir', default='data/results/',
+ help='the dir to output the results')
+ parser.add_argument('--result_name', default='test_result',
+ help='the file name of the predicted results')
+
+ parser.add_argument('--trainset', nargs='+',
+ default=['data/demo/trainset/search.train.json'],
+ help='train dataset')
+ parser.add_argument('--devset', nargs='+',
+ default=['data/demo/devset/search.dev.json'],
+ help='dev dataset')
+ parser.add_argument('--testset', nargs='+',
+ default=['data/demo/testset/search.test.json'],
+ help='test dataset')
+
+ parser.add_argument("--enable_ce", action='store_true',
+ help="If set, run the task with continuous evaluation logs.")
+ parser.add_argument('--para_print', action='store_true', help="Print debug info")
+ parser.add_argument("--dev_interval", type=int, default=-1,
+ help="evaluate on dev set loss every n batches. (default: %(default)d)")
+ args = parser.parse_args()
+ return args
diff --git a/PaddleNLP/machine_reading_comprehension/data/download.sh b/PaddleNLP/machine_reading_comprehension/data/download.sh
new file mode 100644
index 0000000000000000000000000000000000000000..5ad22e5c3de8c3e36e6105f00de2dfa7eeb46af3
--- /dev/null
+++ b/PaddleNLP/machine_reading_comprehension/data/download.sh
@@ -0,0 +1,35 @@
+#!/bin/bash
+# ==============================================================================
+# Copyright 2017 Baidu.com, Inc. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# ==============================================================================
+
+
+if [[ -d preprocessed ]] && [[ -d raw ]]; then
+ echo "data exist"
+ exit 0
+else
+ wget -c http://dureader.gz.bcebos.com/demo.zip
+ wget -c https://aipedataset.cdn.bcebos.com/dureader/dureader_raw.zip
+ wget -c https://aipedataset.cdn.bcebos.com/dureader/dureader_preprocessed.zip
+fi
+
+if md5sum --status -c md5sum.txt; then
+ unzip demo.zip
+ unzip dureader_raw.zip
+ unzip dureader_preprocessed.zip
+else
+ echo "download data error!" >> /dev/stderr
+ exit 1
+fi
diff --git a/PaddleNLP/machine_reading_comprehension/data/md5sum.txt b/PaddleNLP/machine_reading_comprehension/data/md5sum.txt
new file mode 100644
index 0000000000000000000000000000000000000000..118e34190c5823c0d096ff5c300bf877101679d4
--- /dev/null
+++ b/PaddleNLP/machine_reading_comprehension/data/md5sum.txt
@@ -0,0 +1,3 @@
+0ca0510fa625d35d902b73033c4ba9d8 demo.zip
+dc7658b8cdf4f94b8714d130b7d15196 dureader_raw.zip
+3db9a32e5a7c5375a604a70687b45479 dureader_preprocessed.zip
diff --git a/PaddleNLP/machine_reading_comprehension/dataset.py b/PaddleNLP/machine_reading_comprehension/dataset.py
new file mode 100644
index 0000000000000000000000000000000000000000..7c583e100997b41aa5d5229c4c3f8fee59d7539f
--- /dev/null
+++ b/PaddleNLP/machine_reading_comprehension/dataset.py
@@ -0,0 +1,244 @@
+# -*- coding:utf8 -*-
+# ==============================================================================
+# Copyright 2017 Baidu.com, Inc. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# ==============================================================================
+"""
+This module implements data process strategies.
+"""
+
+import os
+import json
+import logging
+import numpy as np
+from collections import Counter
+import io
+
+
+class BRCDataset(object):
+ """
+ This module implements the APIs for loading and using baidu reading comprehension dataset
+ """
+
+ def __init__(self,
+ max_p_num,
+ max_p_len,
+ max_q_len,
+ train_files=[],
+ dev_files=[],
+ test_files=[]):
+ self.logger = logging.getLogger("brc")
+ self.max_p_num = max_p_num
+ self.max_p_len = max_p_len
+ self.max_q_len = max_q_len
+
+ self.train_set, self.dev_set, self.test_set = [], [], []
+ if train_files:
+ for train_file in train_files:
+ self.train_set += self._load_dataset(train_file, train=True)
+ self.logger.info('Train set size: {} questions.'.format(
+ len(self.train_set)))
+
+ if dev_files:
+ for dev_file in dev_files:
+ self.dev_set += self._load_dataset(dev_file)
+ self.logger.info('Dev set size: {} questions.'.format(
+ len(self.dev_set)))
+
+ if test_files:
+ for test_file in test_files:
+ self.test_set += self._load_dataset(test_file)
+ self.logger.info('Test set size: {} questions.'.format(
+ len(self.test_set)))
+
+ def _load_dataset(self, data_path, train=False):
+ """
+ Loads the dataset
+ Args:
+ data_path: the data file to load
+ """
+ with io.open(data_path, 'r', encoding='utf-8') as fin:
+ data_set = []
+ for lidx, line in enumerate(fin):
+ sample = json.loads(line.strip())
+ if train:
+ if len(sample['answer_spans']) == 0:
+ continue
+ if sample['answer_spans'][0][1] >= self.max_p_len:
+ continue
+
+ if 'answer_docs' in sample:
+ sample['answer_passages'] = sample['answer_docs']
+
+ sample['question_tokens'] = sample['segmented_question']
+
+ sample['passages'] = []
+ for d_idx, doc in enumerate(sample['documents']):
+ if train:
+ most_related_para = doc['most_related_para']
+ sample['passages'].append({
+ 'passage_tokens':
+ doc['segmented_paragraphs'][most_related_para],
+ 'is_selected': doc['is_selected']
+ })
+ else:
+ para_infos = []
+ for para_tokens in doc['segmented_paragraphs']:
+ question_tokens = sample['segmented_question']
+ common_with_question = Counter(
+ para_tokens) & Counter(question_tokens)
+ correct_preds = sum(common_with_question.values())
+ if correct_preds == 0:
+ recall_wrt_question = 0
+ else:
+ recall_wrt_question = float(
+ correct_preds) / len(question_tokens)
+ para_infos.append((para_tokens, recall_wrt_question,
+ len(para_tokens)))
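+ # rank paragraphs by recall w.r.t. the question (descending), breaking
+ # ties in favor of shorter paragraphs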
+ para_infos.sort(key=lambda x: (-x[1], x[2]))
+ fake_passage_tokens = []
+ for para_info in para_infos[:1]:
+ fake_passage_tokens += para_info[0]
+ sample['passages'].append({
+ 'passage_tokens': fake_passage_tokens
+ })
+ data_set.append(sample)
+ return data_set
+
+ def _one_mini_batch(self, data, indices, pad_id):
+ """
+ Get one mini batch
+ Args:
+ data: all data
+ indices: the indices of the samples to be selected
+ pad_id:
+ Returns:
+ one batch of data
+ """
+ batch_data = {
+ 'raw_data': [data[i] for i in indices],
+ 'question_token_ids': [],
+ 'question_length': [],
+ 'passage_token_ids': [],
+ 'passage_length': [],
+ 'start_id': [],
+ 'end_id': [],
+ 'passage_num': []
+ }
+ max_passage_num = max(
+ [len(sample['passages']) for sample in batch_data['raw_data']])
+ max_passage_num = min(self.max_p_num, max_passage_num)
+ for sidx, sample in enumerate(batch_data['raw_data']):
+ count = 0
+ for pidx in range(max_passage_num):
+ if pidx < len(sample['passages']):
+ count += 1
+ batch_data['question_token_ids'].append(sample[
+ 'question_token_ids'][0:self.max_q_len])
+ batch_data['question_length'].append(
+ min(len(sample['question_token_ids']), self.max_q_len))
+ passage_token_ids = sample['passages'][pidx][
+ 'passage_token_ids'][0:self.max_p_len]
+ batch_data['passage_token_ids'].append(passage_token_ids)
+ batch_data['passage_length'].append(
+ min(len(passage_token_ids), self.max_p_len))
+ # record the start passage index of current sample
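+ # start_id/end_id label positions in the concatenation of this sample's
+ # passages, so the gold span must be shifted by the total length of the
+ # passages that precede the answer passage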
+ passage_idx_offset = sum(batch_data['passage_num'])
+ batch_data['passage_num'].append(count)
+ gold_passage_offset = 0
+ if 'answer_passages' in sample and len(sample['answer_passages']) and \
+ sample['answer_passages'][0] < len(sample['documents']):
+ for i in range(sample['answer_passages'][0]):
+ gold_passage_offset += len(batch_data['passage_token_ids'][
+ passage_idx_offset + i])
+ start_id = min(sample['answer_spans'][0][0], self.max_p_len)
+ end_id = min(sample['answer_spans'][0][1], self.max_p_len)
+ batch_data['start_id'].append(gold_passage_offset + start_id)
+ batch_data['end_id'].append(gold_passage_offset + end_id)
+ else:
+ # fake span for some samples, only valid for testing
+ batch_data['start_id'].append(0)
+ batch_data['end_id'].append(0)
+ return batch_data
+
+ def word_iter(self, set_name=None):
+ """
+ Iterates over all the words in the dataset
+ Args:
+ set_name: if it is set, then the specific set will be used
+ Returns:
+ a generator
+ """
+ if set_name is None:
+ data_set = self.train_set + self.dev_set + self.test_set
+ elif set_name == 'train':
+ data_set = self.train_set
+ elif set_name == 'dev':
+ data_set = self.dev_set
+ elif set_name == 'test':
+ data_set = self.test_set
+ else:
+ raise NotImplementedError('No data set named as {}'.format(
+ set_name))
+ if data_set is not None:
+ for sample in data_set:
+ for token in sample['question_tokens']:
+ yield token
+ for passage in sample['passages']:
+ for token in passage['passage_tokens']:
+ yield token
+
+ def convert_to_ids(self, vocab):
+ """
+ Convert the question and passage in the original dataset to ids
+ Args:
+ vocab: the vocabulary on this dataset
+ """
+ for data_set in [self.train_set, self.dev_set, self.test_set]:
+ if data_set is None:
+ continue
+ for sample in data_set:
+ sample['question_token_ids'] = vocab.convert_to_ids(sample[
+ 'question_tokens'])
+ for passage in sample['passages']:
+ passage['passage_token_ids'] = vocab.convert_to_ids(passage[
+ 'passage_tokens'])
+
+ def gen_mini_batches(self, set_name, batch_size, pad_id, shuffle=True):
+ """
+ Generate data batches for a specific dataset (train/dev/test)
+ Args:
+ set_name: train/dev/test to indicate the set
+ batch_size: number of samples in one batch
+ pad_id: pad id
+ shuffle: if set to be true, the data is shuffled.
+ Returns:
+ a generator for all batches
+ """
+ if set_name == 'train':
+ data = self.train_set
+ elif set_name == 'dev':
+ data = self.dev_set
+ elif set_name == 'test':
+ data = self.test_set
+ else:
+ raise NotImplementedError('No data set named as {}'.format(
+ set_name))
+ data_size = len(data)
+ indices = np.arange(data_size)
+ if shuffle:
+ np.random.shuffle(indices)
+ for batch_start in np.arange(0, data_size, batch_size):
+ batch_indices = indices[batch_start:batch_start + batch_size]
+ yield self._one_mini_batch(data, batch_indices, pad_id)
diff --git a/PaddleNLP/machine_reading_comprehension/paragraph_extraction.py b/PaddleNLP/machine_reading_comprehension/paragraph_extraction.py
new file mode 100644
index 0000000000000000000000000000000000000000..414d745d2deee5ece1df52b90d86b6e84c622c76
--- /dev/null
+++ b/PaddleNLP/machine_reading_comprehension/paragraph_extraction.py
@@ -0,0 +1,199 @@
+#!/usr/bin/python
+#-*- coding:utf-8 -*-
+
+from __future__ import print_function
+
+import sys
+if sys.version[0] == '2':
+ reload(sys)
+ sys.setdefaultencoding("utf-8")
+import json
+import copy
+from preprocess import metric_max_over_ground_truths, f1_score
+
+
+def compute_paragraph_score(sample):
+ """
+ For each paragraph, compute the f1 score compared with the question
+ Args:
+ sample: a sample in the dataset.
+ Returns:
+ None
+ Raises:
+ None
+ """
+ question = sample["segmented_question"]
+ for doc in sample['documents']:
+ doc['segmented_paragraphs_scores'] = []
+ for p_idx, para_tokens in enumerate(doc['segmented_paragraphs']):
+ if len(question) > 0:
+ related_score = metric_max_over_ground_truths(f1_score,
+ para_tokens,
+ question)
+ else:
+ related_score = 0.0
+ doc['segmented_paragraphs_scores'].append(related_score)
+
+
+def dup_remove(doc):
+ """
+ For each document, remove the duplicated paragraphs
+ Args:
+ doc: a doc in the sample
+ Returns:
+ bool
+ Raises:
+ None
+ """
+ paragraphs_his = {}
+ del_ids = []
+ para_id = None
+ if 'most_related_para' in doc:
+ para_id = doc['most_related_para']
+ doc['paragraphs_length'] = []
+ for p_idx, (segmented_paragraph, paragraph_score) in \
+ enumerate(zip(doc["segmented_paragraphs"], doc["segmented_paragraphs_scores"])):
+ doc['paragraphs_length'].append(len(segmented_paragraph))
+ paragraph = ''.join(segmented_paragraph)
+ if paragraph in paragraphs_his:
+ del_ids.append(p_idx)
+ if p_idx == para_id:
+ para_id = paragraphs_his[paragraph]
+ continue
+ paragraphs_his[paragraph] = p_idx
+ # delete
+ prev_del_num = 0
+ del_num = 0
+ for p_idx in del_ids:
+ if p_idx < para_id:
+ prev_del_num += 1
+ del doc["segmented_paragraphs"][p_idx - del_num]
+ del doc["segmented_paragraphs_scores"][p_idx - del_num]
+ del doc['paragraphs_length'][p_idx - del_num]
+ del_num += 1
+ if len(del_ids) != 0:
+ if 'most_related_para' in doc:
+ doc['most_related_para'] = para_id - prev_del_num
+ doc['paragraphs'] = []
+ for segmented_para in doc["segmented_paragraphs"]:
+ paragraph = ''.join(segmented_para)
+ doc['paragraphs'].append(paragraph)
+ return True
+ else:
+ return False
+
+
+def paragraph_selection(sample, mode):
+ """
+ For each document, select paragraphs that includes as much information as possible
+ Args:
+ sample: a sample in the dataset.
+ mode: string of ("train", "dev", "test"), indicate the type of dataset to process.
+ Returns:
+ None
+ Raises:
+ None
+ """
+ # predefined maximum length of paragraph
+ MAX_P_LEN = 500
+ # predefined splitter
+ splitter = u''
+ # topN of related paragraph to choose
+ topN = 3
+ doc_id = None
+ if 'answer_docs' in sample and len(sample['answer_docs']) > 0:
+ doc_id = sample['answer_docs'][0]
+ if doc_id >= len(sample['documents']):
+ # Data error, answer doc ID > number of documents, this sample
+ # will be filtered by dataset.py
+ return
+ for d_idx, doc in enumerate(sample['documents']):
+ if 'segmented_paragraphs_scores' not in doc:
+ continue
+ status = dup_remove(doc)
+ segmented_title = doc["segmented_title"]
+ title_len = len(segmented_title)
+ para_id = None
+ if doc_id is not None:
+ para_id = sample['documents'][doc_id]['most_related_para']
+ total_len = title_len + sum(doc['paragraphs_length'])
+ # add splitter
+ para_num = len(doc["segmented_paragraphs"])
+ total_len += para_num
+ if total_len <= MAX_P_LEN:
+ incre_len = title_len
+ total_segmented_content = copy.deepcopy(segmented_title)
+ for p_idx, segmented_para in enumerate(doc["segmented_paragraphs"]):
+ if doc_id == d_idx and para_id > p_idx:
+ incre_len += len([splitter] + segmented_para)
+ if doc_id == d_idx and para_id == p_idx:
+ incre_len += 1
+ total_segmented_content += [splitter] + segmented_para
+ if doc_id == d_idx:
+ answer_start = incre_len + sample['answer_spans'][0][0]
+ answer_end = incre_len + sample['answer_spans'][0][1]
+ sample['answer_spans'][0][0] = answer_start
+ sample['answer_spans'][0][1] = answer_end
+ doc["segmented_paragraphs"] = [total_segmented_content]
+ doc["segmented_paragraphs_scores"] = [1.0]
+ doc['paragraphs_length'] = [total_len]
+ doc['paragraphs'] = [''.join(total_segmented_content)]
+ doc['most_related_para'] = 0
+ continue
+ # find topN paragraph id
+ para_infos = []
+ for p_idx, (para_tokens, para_scores) in \
+ enumerate(zip(doc['segmented_paragraphs'], doc['segmented_paragraphs_scores'])):
+ para_infos.append((para_tokens, para_scores, len(para_tokens), p_idx))
+ para_infos.sort(key=lambda x: (-x[1], x[2]))
+ topN_idx = []
+ for para_info in para_infos[:topN]:
+ topN_idx.append(para_info[-1])
+ final_idx = []
+ total_len = title_len
+ if doc_id == d_idx:
+ if mode == "train":
+ final_idx.append(para_id)
+ total_len = title_len + 1 + doc['paragraphs_length'][para_id]
+ for id in topN_idx:
+ if total_len > MAX_P_LEN:
+ break
+ if doc_id == d_idx and id == para_id and mode == "train":
+ continue
+ total_len += 1 + doc['paragraphs_length'][id]
+ final_idx.append(id)
+ total_segmented_content = copy.deepcopy(segmented_title)
+ final_idx.sort()
+ incre_len = title_len
+ for id in final_idx:
+ if doc_id == d_idx and id < para_id:
+ incre_len += 1 + doc['paragraphs_length'][id]
+ if doc_id == d_idx and id == para_id:
+ incre_len += 1
+ total_segmented_content += [splitter] + doc['segmented_paragraphs'][id]
+ if doc_id == d_idx:
+ answer_start = incre_len + sample['answer_spans'][0][0]
+ answer_end = incre_len + sample['answer_spans'][0][1]
+ sample['answer_spans'][0][0] = answer_start
+ sample['answer_spans'][0][1] = answer_end
+ doc["segmented_paragraphs"] = [total_segmented_content]
+ doc["segmented_paragraphs_scores"] = [1.0]
+ doc['paragraphs_length'] = [total_len]
+ doc['paragraphs'] = [''.join(total_segmented_content)]
+ doc['most_related_para'] = 0
+
+
+if __name__ == "__main__":
+ # mode="train"/"dev"/"test"
+ mode = sys.argv[1]
+ for line in sys.stdin:
+ line = line.strip()
+ if line == "":
+ continue
+ try:
+ sample = json.loads(line)
+ except ValueError:
+ print("Invalid input json format - '{}' will be ignored".format(line), file=sys.stderr)
+ continue
+ compute_paragraph_score(sample)
+ paragraph_selection(sample, mode)
+ print(json.dumps(sample, ensure_ascii=False))
+
diff --git a/PaddleNLP/machine_reading_comprehension/preprocess.py b/PaddleNLP/machine_reading_comprehension/preprocess.py
new file mode 100644
index 0000000000000000000000000000000000000000..af76549f4427a341fed341a7d09fd5bb7230de7b
--- /dev/null
+++ b/PaddleNLP/machine_reading_comprehension/preprocess.py
@@ -0,0 +1,218 @@
+###############################################################################
+# ==============================================================================
+# Copyright 2017 Baidu.com, Inc. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# ==============================================================================
+"""
+This module finds the most related paragraph of each document according to recall.
+"""
+
+import sys
+if sys.version[0] == '2':
+ reload(sys)
+ sys.setdefaultencoding("utf-8")
+import json
+from collections import Counter
+
+
+def precision_recall_f1(prediction, ground_truth):
+ """
+ This function calculates and returns the precision, recall and f1-score
+ Args:
+ prediction: prediction string or list to be matched
+ ground_truth: golden string or list reference
+ Returns:
+ floats of (p, r, f1)
+ Raises:
+ None
+ """
+ if not isinstance(prediction, list):
+ prediction_tokens = prediction.split()
+ else:
+ prediction_tokens = prediction
+ if not isinstance(ground_truth, list):
+ ground_truth_tokens = ground_truth.split()
+ else:
+ ground_truth_tokens = ground_truth
+ common = Counter(prediction_tokens) & Counter(ground_truth_tokens)
+ num_same = sum(common.values())
+ if num_same == 0:
+ return 0, 0, 0
+ p = 1.0 * num_same / len(prediction_tokens)
+ r = 1.0 * num_same / len(ground_truth_tokens)
+ f1 = (2 * p * r) / (p + r)
+ return p, r, f1
+
+
+def recall(prediction, ground_truth):
+ """
+ This function calculates and returns the recall
+ Args:
+ prediction: prediction string or list to be matched
+ ground_truth: golden string or list reference
+ Returns:
+ floats of recall
+ Raises:
+ None
+ """
+ return precision_recall_f1(prediction, ground_truth)[1]
+
+
+def f1_score(prediction, ground_truth):
+ """
+ This function calculates and returns the f1-score
+ Args:
+ prediction: prediction string or list to be matched
+ ground_truth: golden string or list reference
+ Returns:
+ floats of f1
+ Raises:
+ None
+ """
+ return precision_recall_f1(prediction, ground_truth)[2]
+
+
+def metric_max_over_ground_truths(metric_fn, prediction, ground_truths):
+ """
+ This function returns the maximum value of metric_fn over all ground truths
+ Args:
+ metric_fn: metric function pointer which calculates scores according to corresponding logic.
+ prediction: prediction string or list to be matched
+ ground_truths: list of golden string or list references
+ Returns:
+ float of the maximum metric value
+ Raises:
+ None
+ """
+ scores_for_ground_truths = []
+ for ground_truth in ground_truths:
+ score = metric_fn(prediction, ground_truth)
+ scores_for_ground_truths.append(score)
+ return max(scores_for_ground_truths)
+
+
+def find_best_question_match(doc, question, with_score=False):
+ """
+ For each document, find the paragraph that matches best to the question.
+ Args:
+ doc: The document object.
+ question: The question tokens.
+ with_score: If True then the match score will be returned,
+ otherwise False.
+ Returns:
+ The index of the best match paragraph, if with_score=False,
+ otherwise returns a tuple of the index of the best match paragraph
+ and the match score of that paragraph.
+ """
+ most_related_para = -1
+ max_related_score = 0
+ most_related_para_len = 0
+ for p_idx, para_tokens in enumerate(doc['segmented_paragraphs']):
+ if len(question) > 0:
+ related_score = metric_max_over_ground_truths(recall,
+ para_tokens,
+ question)
+ else:
+ related_score = 0
+
+ if related_score > max_related_score \
+ or (related_score == max_related_score \
+ and len(para_tokens) < most_related_para_len):
+ most_related_para = p_idx
+ max_related_score = related_score
+ most_related_para_len = len(para_tokens)
+ if most_related_para == -1:
+ most_related_para = 0
+ if with_score:
+ return most_related_para, max_related_score
+ return most_related_para
+
+
+def find_fake_answer(sample):
+ """
+ For each document, finds the most related paragraph based on recall,
+ then finds a span that maximize the f1_score compared with the gold answers
+ and uses this span as a fake answer span
+ Args:
+ sample: a sample in the dataset
+ Returns:
+ None
+ Raises:
+ None
+ """
+ for doc in sample['documents']:
+ most_related_para = -1
+ most_related_para_len = 999999
+ max_related_score = 0
+ for p_idx, para_tokens in enumerate(doc['segmented_paragraphs']):
+ if len(sample['segmented_answers']) > 0:
+ related_score = metric_max_over_ground_truths(recall,
+ para_tokens,
+ sample['segmented_answers'])
+ else:
+ continue
+ if related_score > max_related_score \
+ or (related_score == max_related_score
+ and len(para_tokens) < most_related_para_len):
+ most_related_para = p_idx
+ most_related_para_len = len(para_tokens)
+ max_related_score = related_score
+ doc['most_related_para'] = most_related_para
+
+ sample['answer_docs'] = []
+ sample['answer_spans'] = []
+ sample['fake_answers'] = []
+ sample['match_scores'] = []
+
+ best_match_score = 0
+ best_match_d_idx, best_match_span = -1, [-1, -1]
+ best_fake_answer = None
+ answer_tokens = set()
+ for segmented_answer in sample['segmented_answers']:
+ answer_tokens = answer_tokens | set(segmented_answer)
+ for d_idx, doc in enumerate(sample['documents']):
+ if not doc['is_selected']:
+ continue
+ if doc['most_related_para'] == -1:
+ doc['most_related_para'] = 0
+ most_related_para_tokens = doc['segmented_paragraphs'][doc['most_related_para']][:1000]
+ for start_tidx in range(len(most_related_para_tokens)):
+ if most_related_para_tokens[start_tidx] not in answer_tokens:
+ continue
+ for end_tidx in range(len(most_related_para_tokens) - 1, start_tidx - 1, -1):
+ span_tokens = most_related_para_tokens[start_tidx: end_tidx + 1]
+ if len(sample['segmented_answers']) > 0:
+ match_score = metric_max_over_ground_truths(f1_score, span_tokens,
+ sample['segmented_answers'])
+ else:
+ match_score = 0
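+ # end_tidx shrinks the span from the right; the token overlap of a
+ # sub-span cannot exceed that of the span, so once F1 reaches 0 no
+ # shorter span starting at start_tidx can match and we can stop.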
+ if match_score == 0:
+ break
+ if match_score > best_match_score:
+ best_match_d_idx = d_idx
+ best_match_span = [start_tidx, end_tidx]
+ best_match_score = match_score
+ best_fake_answer = ''.join(span_tokens)
+ if best_match_score > 0:
+ sample['answer_docs'].append(best_match_d_idx)
+ sample['answer_spans'].append(best_match_span)
+ sample['fake_answers'].append(best_fake_answer)
+ sample['match_scores'].append(best_match_score)
+
+
+if __name__ == '__main__':
+ for line in sys.stdin:
+ sample = json.loads(line)
+ find_fake_answer(sample)
+ print(json.dumps(sample, ensure_ascii=False))
diff --git a/PaddleNLP/machine_reading_comprehension/rc_model.py b/PaddleNLP/machine_reading_comprehension/rc_model.py
new file mode 100644
index 0000000000000000000000000000000000000000..d96c2dbd78abcda8fce6c5011d8782b8413292de
--- /dev/null
+++ b/PaddleNLP/machine_reading_comprehension/rc_model.py
@@ -0,0 +1,330 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import paddle.fluid.layers as layers
+import paddle.fluid as fluid
+import numpy as np
+
+
+def dropout(input, args):
+ """Dropout function"""
+ if args.drop_rate:
+ return layers.dropout(
+ input,
+ dropout_prob=args.drop_rate,
+ seed=args.random_seed,
+ is_test=False)
+ else:
+ return input
+
+
+def bi_lstm_encoder(input_seq, gate_size, para_name, args):
+ """
+ A bi-directional lstm encoder implementation.
+ The linear transformation part for the input gate, output gate, forget gate
+ and cell activation vectors needs to be done outside of dynamic_lstm,
+ so the size of the projection fed to it is 4 times gate_size.
+ """
+
+ input_forward_proj = layers.fc(
+ input=input_seq,
+ param_attr=fluid.ParamAttr(name=para_name + '_fw_gate_w'),
+ size=gate_size * 4,
+ act=None,
+ bias_attr=False)
+ input_reversed_proj = layers.fc(
+ input=input_seq,
+ param_attr=fluid.ParamAttr(name=para_name + '_bw_gate_w'),
+ size=gate_size * 4,
+ act=None,
+ bias_attr=False)
+ forward, _ = layers.dynamic_lstm(
+ input=input_forward_proj,
+ size=gate_size * 4,
+ use_peepholes=False,
+ param_attr=fluid.ParamAttr(name=para_name + '_fw_lstm_w'),
+ bias_attr=fluid.ParamAttr(name=para_name + '_fw_lstm_b'))
+ backward, _ = layers.dynamic_lstm(
+ input=input_reversed_proj,
+ param_attr=fluid.ParamAttr(name=para_name + '_bw_lstm_w'),
+ bias_attr=fluid.ParamAttr(name=para_name + '_bw_lstm_b'),
+ size=gate_size * 4,
+ is_reverse=True,
+ use_peepholes=False)
+
+ encoder_out = layers.concat(input=[forward, backward], axis=1)
+ return encoder_out
+
+
+def get_data(input_name, lod_level, args):
+ input_ids = layers.data(
+ name=input_name, shape=[1], dtype='int64', lod_level=lod_level)
+ return input_ids
+
+
+def embedding(input_ids, shape, args):
+ """Embedding layer"""
+ input_embedding = layers.embedding(
+ input=input_ids,
+ size=shape,
+ dtype='float32',
+ is_sparse=True,
+ param_attr=fluid.ParamAttr(name='embedding_para'))
+ return input_embedding
+
+
+def encoder(input_embedding, para_name, hidden_size, args):
+ """Encoding layer"""
+ encoder_out = bi_lstm_encoder(
+ input_seq=input_embedding,
+ gate_size=hidden_size,
+ para_name=para_name,
+ args=args)
+ return dropout(encoder_out, args)
+
+
+def attn_flow(q_enc, p_enc, p_ids_name, args):
+ """Bidirectional Attention layer"""
+ tag = p_ids_name + "::"
+ drnn = layers.DynamicRNN()
+ with drnn.block():
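+ # Each RNN step takes one passage position h and attends over all
+ # question states u: s_t are softmax similarity weights, u_expr is the
+ # attended question vector and b_t the max similarity for this step.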
+ h_cur = drnn.step_input(p_enc)
+ u_all = drnn.static_input(q_enc)
+ h_expd = layers.sequence_expand(x=h_cur, y=u_all)
+ s_t_mul = layers.elementwise_mul(x=u_all, y=h_expd, axis=0)
+ s_t_sum = layers.reduce_sum(input=s_t_mul, dim=1, keep_dim=True)
+ s_t_re = layers.reshape(s_t_sum, shape=[-1, 0])
+ s_t = layers.sequence_softmax(input=s_t_re)
+ u_expr = layers.elementwise_mul(x=u_all, y=s_t, axis=0)
+ u_expr = layers.sequence_pool(input=u_expr, pool_type='sum')
+
+ b_t = layers.sequence_pool(input=s_t_sum, pool_type='max')
+ drnn.output(u_expr, b_t)
+ U_expr, b = drnn()
+ b_norm = layers.sequence_softmax(input=b)
+ h_expr = layers.elementwise_mul(x=p_enc, y=b_norm, axis=0)
+ h_expr = layers.sequence_pool(input=h_expr, pool_type='sum')
+
+ H_expr = layers.sequence_expand(x=h_expr, y=p_enc)
+ H_expr = layers.lod_reset(x=H_expr, y=p_enc)
+ h_u = layers.elementwise_mul(x=p_enc, y=U_expr, axis=0)
+ h_h = layers.elementwise_mul(x=p_enc, y=H_expr, axis=0)
+
+ g = layers.concat(input=[p_enc, U_expr, h_u, h_h], axis=1)
+ return dropout(g, args)
+
+
+def fusion(g, args):
+ """Fusion layer"""
+ m = bi_lstm_encoder(
+ input_seq=g, gate_size=args.hidden_size, para_name='fusion', args=args)
+ return dropout(m, args)
+
+
+def lstm_step(x_t, hidden_t_prev, cell_t_prev, size, para_name, args):
+ """Util function for pointer network"""
+ def linear(inputs, para_name, args):
+ return layers.fc(input=inputs,
+ size=size,
+ param_attr=fluid.ParamAttr(name=para_name + '_w'),
+ bias_attr=fluid.ParamAttr(name=para_name + '_b'))
+
+ input_cat = layers.concat([hidden_t_prev, x_t], axis=1)
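+ # Standard LSTM cell on [h_{t-1}, x_t]: sigmoid forget/input/output gates
+ # and a tanh candidate cell, combined below into cell_t and hidden_t.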
+ forget_gate = layers.sigmoid(x=linear(input_cat, para_name + '_lstm_f',
+ args))
+ input_gate = layers.sigmoid(x=linear(input_cat, para_name + '_lstm_i',
+ args))
+ output_gate = layers.sigmoid(x=linear(input_cat, para_name + '_lstm_o',
+ args))
+ cell_tilde = layers.tanh(x=linear(input_cat, para_name + '_lstm_c', args))
+
+ cell_t = layers.sums(input=[
+ layers.elementwise_mul(
+ x=forget_gate, y=cell_t_prev), layers.elementwise_mul(
+ x=input_gate, y=cell_tilde)
+ ])
+
+ hidden_t = layers.elementwise_mul(x=output_gate, y=layers.tanh(x=cell_t))
+
+ return hidden_t, cell_t
+
+
+def point_network_decoder(p_vec, q_vec, hidden_size, args):
+ """Output layer - pointer network"""
+ tag = 'pn_decoder:'
+ init_random = fluid.initializer.Normal(loc=0.0, scale=1.0)
+
+ random_attn = layers.create_parameter(
+ shape=[1, hidden_size],
+ dtype='float32',
+ default_initializer=init_random)
+ random_attn = layers.fc(
+ input=random_attn,
+ size=hidden_size,
+ act=None,
+ param_attr=fluid.ParamAttr(name=tag + 'random_attn_fc_w'),
+ bias_attr=fluid.ParamAttr(name=tag + 'random_attn_fc_b'))
+ random_attn = layers.reshape(random_attn, shape=[-1])
+ U = layers.fc(input=q_vec,
+ param_attr=fluid.ParamAttr(name=tag + 'q_vec_fc_w'),
+ bias_attr=False,
+ size=hidden_size,
+ act=None) + random_attn
+ U = layers.tanh(U)
+
+ logits = layers.fc(input=U,
+ param_attr=fluid.ParamAttr(name=tag + 'logits_fc_w'),
+ bias_attr=fluid.ParamAttr(name=tag + 'logits_fc_b'),
+ size=1,
+ act=None)
+ scores = layers.sequence_softmax(input=logits)
+ pooled_vec = layers.elementwise_mul(x=q_vec, y=scores, axis=0)
+ pooled_vec = layers.sequence_pool(input=pooled_vec, pool_type='sum')
+
+ init_state = layers.fc(
+ input=pooled_vec,
+ param_attr=fluid.ParamAttr(name=tag + 'init_state_fc_w'),
+ bias_attr=fluid.ParamAttr(name=tag + 'init_state_fc_b'),
+ size=hidden_size,
+ act=None)
+
+ def custom_dynamic_rnn(p_vec, init_state, hidden_size, para_name, args):
+ tag = para_name + "custom_dynamic_rnn:"
+
+ def static_rnn(step,
+ p_vec=p_vec,
+ init_state=None,
+ para_name='',
+ args=args):
+ tag = para_name + "static_rnn:"
+ ctx = layers.fc(
+ input=p_vec,
+ param_attr=fluid.ParamAttr(name=tag + 'context_fc_w'),
+ bias_attr=fluid.ParamAttr(name=tag + 'context_fc_b'),
+ size=hidden_size,
+ act=None)
+
+ beta = []
+ c_prev = init_state
+ m_prev = init_state
+ for i in range(step):
+ m_prev0 = layers.fc(
+ input=m_prev,
+ size=hidden_size,
+ act=None,
+ param_attr=fluid.ParamAttr(name=tag + 'm_prev0_fc_w'),
+ bias_attr=fluid.ParamAttr(name=tag + 'm_prev0_fc_b'))
+ m_prev1 = layers.sequence_expand(x=m_prev0, y=ctx)
+
+ Fk = ctx + m_prev1
+ Fk = layers.tanh(Fk)
+ logits = layers.fc(
+ input=Fk,
+ size=1,
+ act=None,
+ param_attr=fluid.ParamAttr(name=tag + 'logits_fc_w'),
+ bias_attr=fluid.ParamAttr(name=tag + 'logits_fc_b'))
+
+ scores = layers.sequence_softmax(input=logits)
+ attn_ctx = layers.elementwise_mul(x=p_vec, y=scores, axis=0)
+ attn_ctx = layers.sequence_pool(input=attn_ctx, pool_type='sum')
+
+ hidden_t, cell_t = lstm_step(
+ attn_ctx,
+ hidden_t_prev=m_prev,
+ cell_t_prev=c_prev,
+ size=hidden_size,
+ para_name=tag,
+ args=args)
+ m_prev = hidden_t
+ c_prev = cell_t
+ beta.append(scores)
+ return beta
+
+ return static_rnn(
+ 2, p_vec=p_vec, init_state=init_state, para_name=para_name)
+
+ fw_outputs = custom_dynamic_rnn(p_vec, init_state, hidden_size, tag + "fw:",
+ args)
+ bw_outputs = custom_dynamic_rnn(p_vec, init_state, hidden_size, tag + "bw:",
+ args)
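+ # The fw pass predicts (start, end) and the bw pass (end, start);
+ # averaging the aligned distributions gives the final boundaries.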
+
+ start_prob = layers.elementwise_add(
+ x=fw_outputs[0], y=bw_outputs[1], axis=0) / 2
+ end_prob = layers.elementwise_add(
+ x=fw_outputs[1], y=bw_outputs[0], axis=0) / 2
+
+ return start_prob, end_prob
+
+
+def rc_model(hidden_size, vocab, args):
+ """This function builds the whole BiDAF network"""
+ emb_shape = [vocab.size(), vocab.embed_dim]
+ start_labels = layers.data(
+ name="start_labels", shape=[1], dtype='float32', lod_level=1)
+ end_labels = layers.data(
+ name="end_labels", shape=[1], dtype='float32', lod_level=1)
+
+ # stage 1: set up input data, embedding table & encode
+ q_id0 = get_data('q_id0', 1, args)
+
+ q_ids = get_data('q_ids', 2, args)
+ p_ids_name = 'p_ids'
+
+ p_ids = get_data('p_ids', 2, args)
+ p_embs = embedding(p_ids, emb_shape, args)
+ q_embs = embedding(q_ids, emb_shape, args)
+ drnn = layers.DynamicRNN()
+ with drnn.block():
+ p_emb = drnn.step_input(p_embs)
+ q_emb = drnn.step_input(q_embs)
+
+ p_enc = encoder(p_emb, 'p_enc', hidden_size, args)
+ q_enc = encoder(q_emb, 'q_enc', hidden_size, args)
+
+ # stage 2: match
+ g_i = attn_flow(q_enc, p_enc, p_ids_name, args)
+ # stage 3: fusion
+ m_i = fusion(g_i, args)
+ drnn.output(m_i, q_enc)
+
+ ms, q_encs = drnn()
+ p_vec = layers.lod_reset(x=ms, y=start_labels)
+ q_vec = layers.lod_reset(x=q_encs, y=q_id0)
+
+ # stage 4: decode
+ start_probs, end_probs = point_network_decoder(
+ p_vec=p_vec, q_vec=q_vec, hidden_size=hidden_size, args=args)
+
+ # calculate model loss
+ cost0 = layers.sequence_pool(
+ layers.cross_entropy(
+ input=start_probs, label=start_labels, soft_label=True),
+ 'sum')
+ cost1 = layers.sequence_pool(
+ layers.cross_entropy(
+ input=end_probs, label=end_labels, soft_label=True),
+ 'sum')
+
+ cost0 = layers.mean(cost0)
+ cost1 = layers.mean(cost1)
+ cost = cost0 + cost1
+ cost.persistable = True
+
+ feeding_list = ["q_ids", "start_lables", "end_lables", "p_ids", "q_id0"]
+ return cost, start_probs, end_probs, ms, feeding_list
diff --git a/PaddleNLP/machine_reading_comprehension/run.py b/PaddleNLP/machine_reading_comprehension/run.py
new file mode 100644
index 0000000000000000000000000000000000000000..19b637cd41d228e3a3e4bcebd734dbf66043c7ab
--- /dev/null
+++ b/PaddleNLP/machine_reading_comprehension/run.py
@@ -0,0 +1,652 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import numpy as np
+import time
+import os
+import random
+import json
+import six
+import multiprocessing
+
+import paddle
+import paddle.fluid as fluid
+import paddle.fluid.core as core
+import paddle.fluid.framework as framework
+from paddle.fluid.executor import Executor
+
+import sys
+if six.PY2:
+ reload(sys)
+ sys.setdefaultencoding("utf-8")
+sys.path.append('..')
+
+from args import *
+import rc_model
+from dataset import BRCDataset
+import logging
+import pickle
+from utils import normalize
+from utils import compute_bleu_rouge
+from vocab import Vocab
+
+
+def prepare_batch_input(insts, args):
+ batch_size = len(insts['raw_data'])
+ inst_num = len(insts['passage_num'])
+ if batch_size != inst_num:
+ print("data error %d, %d" % (batch_size, inst_num))
+ return None
+ new_insts = []
+
+ passage_idx = 0
+ for i in range(batch_size):
+ p_len = 0
+ p_id = []
+ p_ids = []
+ q_ids = []
+ q_id = []
+
+ for j in range(insts['passage_num'][i]):
+ p_ids.append(insts['passage_token_ids'][passage_idx + j])
+ p_id = p_id + insts['passage_token_ids'][passage_idx + j]
+ q_ids.append(insts['question_token_ids'][passage_idx + j])
+ q_id = q_id + insts['question_token_ids'][passage_idx + j]
+
+ passage_idx += insts['passage_num'][i]
+ p_len = len(p_id)
+
+ def _get_label(idx, ref_len):
+ ret = [0.0] * ref_len
+ if idx >= 0 and idx < ref_len:
+ ret[idx] = 1.0
+ return [[x] for x in ret]
+
+ start_label = _get_label(insts['start_id'][i], p_len)
+ end_label = _get_label(insts['end_id'][i], p_len)
+ new_inst = [q_ids, start_label, end_label, p_ids, q_id]
+ new_insts.append(new_inst)
+ return new_insts
+
+
+def batch_reader(batch_list, args):
+ res = []
+ for batch in batch_list:
+ res.append(prepare_batch_input(batch, args))
+ return res
+
+
+def read_multiple(reader, count, clip_last=True):
+ """
+ Stack data from reader for multi-devices.
+ """
+
+ def __impl__():
+ res = []
+ for item in reader():
+ res.append(item)
+ if len(res) == count:
+ yield res
+ res = []
+ if len(res) == count:
+ yield res
+ elif not clip_last:
+ data = []
+ for item in res:
+ data += item
+ if len(data) > count:
+ inst_num_per_part = len(data) // count
+ yield [
+ data[inst_num_per_part * i:inst_num_per_part * (i + 1)]
+ for i in range(count)
+ ]
+
+ return __impl__
+
+
+def LodTensor_Array(lod_tensor):
+ lod = lod_tensor.lod()
+ array = np.array(lod_tensor)
+ new_array = []
+ for i in range(len(lod[0]) - 1):
+ new_array.append(array[lod[0][i]:lod[0][i + 1]])
+ return new_array
+
+
+def print_para(train_prog, train_exe, logger, args):
+ """Print para info for debug purpose"""
+ if args.para_print:
+ param_list = train_prog.block(0).all_parameters()
+ param_name_list = [p.name for p in param_list]
+ num_sum = 0
+ for p_name in param_name_list:
+ p_array = np.array(train_exe.scope.find_var(p_name).get_tensor())
+ param_num = np.prod(p_array.shape)
+ num_sum = num_sum + param_num
+ logger.info(
+ "param: {0}, mean={1} max={2} min={3} num={4} {5}".format(
+ p_name,
+ p_array.mean(),
+ p_array.max(), p_array.min(), p_array.shape, param_num))
+ logger.info("total param num: {0}".format(num_sum))
+
+
+def find_best_answer_for_passage(start_probs, end_probs, passage_len):
+ """
+ Finds the best answer with the maximum start_prob * end_prob from a single passage
+ """
+ if passage_len is None:
+ passage_len = len(start_probs)
+ else:
+ passage_len = min(len(start_probs), passage_len)
+ best_start, best_end, max_prob = -1, -1, 0
+ for start_idx in range(passage_len):
+ for ans_len in range(args.max_a_len):
+ end_idx = start_idx + ans_len
+ if end_idx >= passage_len:
+ continue
+ prob = start_probs[start_idx] * end_probs[end_idx]
+ if prob > max_prob:
+ best_start = start_idx
+ best_end = end_idx
+ max_prob = prob
+ return (best_start, best_end), max_prob
+
+
+def find_best_answer_for_inst(sample, start_prob, end_prob, inst_lod,
+ para_prior_scores=(0.44, 0.23, 0.15, 0.09, 0.07)):
+ """
+ Finds the best answer for a sample given start_prob and end_prob for each position.
+ This will call find_best_answer_for_passage because there are multiple passages in a sample
+ """
+ best_p_idx, best_span, best_score = None, None, 0
+ for p_idx, passage in enumerate(sample['passages']):
+ if p_idx >= args.max_p_num:
+ continue
+ if len(start_prob) != len(end_prob):
+ logger.info('error: {}'.format(sample['question']))
+ continue
+ passage_start = inst_lod[p_idx] - inst_lod[0]
+ passage_end = inst_lod[p_idx + 1] - inst_lod[0]
+ passage_len = passage_end - passage_start
+ passage_len = min(args.max_p_len, len(passage['passage_tokens']))
+ answer_span, score = find_best_answer_for_passage(
+ start_prob[passage_start:passage_end],
+ end_prob[passage_start:passage_end], passage_len)
+ if para_prior_scores is not None:
+ # the Nth prior score = the number of training samples whose gold answer
+ # comes from the Nth paragraph / the total number of training samples
+ score *= para_prior_scores[p_idx]
+ if score > best_score:
+ best_score = score
+ best_p_idx = p_idx
+ best_span = answer_span
+ if best_p_idx is None or best_span is None:
+ best_answer = ''
+ else:
+ best_answer = ''.join(sample['passages'][best_p_idx]['passage_tokens'][
+ best_span[0]:best_span[1] + 1])
+ return best_answer, best_span
+
+
+def validation(inference_program, avg_cost, s_probs, e_probs, match, feed_order,
+ place, dev_count, vocab, brc_data, logger, args):
+ """
+ Do inference with the given inference_program.
+ """
+ parallel_executor = fluid.ParallelExecutor(
+ main_program=inference_program,
+ use_cuda=bool(args.use_gpu),
+ loss_name=avg_cost.name)
+ print_para(inference_program, parallel_executor, logger, args)
+
+ # Use test set as validation each pass
+ total_loss = 0.0
+ count = 0
+ n_batch_cnt = 0
+ n_batch_loss = 0.0
+ pred_answers, ref_answers = [], []
+ val_feed_list = [
+ inference_program.global_block().var(var_name)
+ for var_name in feed_order
+ ]
+ val_feeder = fluid.DataFeeder(val_feed_list, place)
+ pad_id = vocab.get_id(vocab.pad_token)
+ dev_reader = lambda: brc_data.gen_mini_batches('dev', args.batch_size, pad_id, shuffle=False)
+ dev_reader = read_multiple(dev_reader, dev_count)
+
+ for batch_id, batch_list in enumerate(dev_reader(), 1):
+ feed_data = batch_reader(batch_list, args)
+ val_fetch_outs = parallel_executor.run(
+ feed=list(val_feeder.feed_parallel(feed_data, dev_count)),
+ fetch_list=[avg_cost.name, s_probs.name, e_probs.name, match.name],
+ return_numpy=False)
+ total_loss += np.array(val_fetch_outs[0]).sum()
+ start_probs_m = LodTensor_Array(val_fetch_outs[1])
+ end_probs_m = LodTensor_Array(val_fetch_outs[2])
+ match_lod = val_fetch_outs[3].lod()
+ count += len(np.array(val_fetch_outs[0]))
+
+ n_batch_cnt += len(np.array(val_fetch_outs[0]))
+ n_batch_loss += np.array(val_fetch_outs[0]).sum()
+ log_every_n_batch = args.log_interval
+ if log_every_n_batch > 0 and batch_id % log_every_n_batch == 0:
+ logger.info('Average dev loss from batch {} to {} is {}'.format(
+ batch_id - log_every_n_batch + 1, batch_id, "%.10f" % (
+ n_batch_loss / n_batch_cnt)))
+ n_batch_loss = 0.0
+ n_batch_cnt = 0
+ batch_offset = 0
+ for idx, batch in enumerate(batch_list):
+ #one batch
+ batch_size = len(batch['raw_data'])
+ batch_range = match_lod[0][batch_offset:batch_offset + batch_size +
+ 1]
+ batch_lod = [[batch_range[x], batch_range[x + 1]]
+ for x in range(len(batch_range[:-1]))]
+ start_prob_batch = start_probs_m[batch_offset:batch_offset +
+ batch_size + 1]
+ end_prob_batch = end_probs_m[batch_offset:batch_offset + batch_size
+ + 1]
+ for sample, start_prob_inst, end_prob_inst, inst_range in zip(
+ batch['raw_data'], start_prob_batch, end_prob_batch,
+ batch_lod):
+ #one instance
+ inst_lod = match_lod[1][inst_range[0]:inst_range[1] + 1]
+ best_answer, best_span = find_best_answer_for_inst(
+ sample, start_prob_inst, end_prob_inst, inst_lod)
+ pred = {
+ 'question_id': sample['question_id'],
+ 'question_type': sample['question_type'],
+ 'answers': [best_answer],
+ 'entity_answers': [[]],
+ 'yesno_answers': []
+ }
+ pred_answers.append(pred)
+ if 'answers' in sample:
+ ref = {
+ 'question_id': sample['question_id'],
+ 'question_type': sample['question_type'],
+ 'answers': sample['answers'],
+ 'entity_answers': [[]],
+ 'yesno_answers': []
+ }
+ ref_answers.append(ref)
+ batch_offset = batch_offset + batch_size
+
+ result_dir = args.result_dir
+ result_prefix = args.result_name
+ if result_dir is not None and result_prefix is not None:
+ if not os.path.exists(args.result_dir):
+ os.makedirs(args.result_dir)
+ result_file = os.path.join(result_dir, result_prefix + '.json')
+ with open(result_file, 'w') as fout:
+ for pred_answer in pred_answers:
+ fout.write(json.dumps(pred_answer, ensure_ascii=False) + '\n')
+ logger.info('Saving {} results to {}'.format(result_prefix,
+ result_file))
+
+ ave_loss = 1.0 * total_loss / count
+ # compute the bleu and rouge scores if reference answers are provided
+ if len(ref_answers) > 0:
+ pred_dict, ref_dict = {}, {}
+ for pred, ref in zip(pred_answers, ref_answers):
+ question_id = ref['question_id']
+ if len(ref['answers']) > 0:
+ pred_dict[question_id] = normalize(pred['answers'])
+ ref_dict[question_id] = normalize(ref['answers'])
+ bleu_rouge = compute_bleu_rouge(pred_dict, ref_dict)
+ else:
+ bleu_rouge = None
+ return ave_loss, bleu_rouge
+
+
+def l2_loss(train_prog):
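+ """Returns 0.5 * the sum of squares of all parameters (an L2 penalty)."""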
+ param_list = train_prog.block(0).all_parameters()
+ para_sum = []
+ for para in param_list:
+ para_mul = fluid.layers.elementwise_mul(x=para, y=para, axis=0)
+ para_sum.append(fluid.layers.reduce_sum(input=para_mul, dim=None))
+ return fluid.layers.sums(para_sum) * 0.5
+
+
+def train(logger, args):
+ """train a model"""
+ logger.info('Load data_set and vocab...')
+ with open(os.path.join(args.vocab_dir, 'vocab.data'), 'rb') as fin:
+ if six.PY2:
+ vocab = pickle.load(fin)
+ else:
+ vocab = pickle.load(fin, encoding='bytes')
+ logger.info('vocab size is {} and embed dim is {}'.format(vocab.size(
+ ), vocab.embed_dim))
+ brc_data = BRCDataset(args.max_p_num, args.max_p_len, args.max_q_len,
+ args.trainset, args.devset)
+ logger.info('Converting text into ids...')
+ brc_data.convert_to_ids(vocab)
+ logger.info('Initialize the model...')
+
+ if not args.use_gpu:
+ place = fluid.CPUPlace()
+ dev_count = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
+ else:
+ place = fluid.CUDAPlace(0)
+ dev_count = fluid.core.get_cuda_device_count()
+
+ # build model
+ main_program = fluid.Program()
+ startup_prog = fluid.Program()
+ if args.enable_ce:
+ main_program.random_seed = args.random_seed
+ startup_prog.random_seed = args.random_seed
+ with fluid.program_guard(main_program, startup_prog):
+ with fluid.unique_name.guard():
+ avg_cost, s_probs, e_probs, match, feed_order = rc_model.rc_model(
+ args.hidden_size, vocab, args)
+ # clone from default main program and use it as the validation program
+ inference_program = main_program.clone(for_test=True)
+
+ # build optimizer
+ if args.optim == 'sgd':
+ optimizer = fluid.optimizer.SGD(
+ learning_rate=args.learning_rate)
+ elif args.optim == 'adam':
+ optimizer = fluid.optimizer.Adam(
+ learning_rate=args.learning_rate)
+ elif args.optim == 'rprop':
+ optimizer = fluid.optimizer.RMSPropOptimizer(
+ learning_rate=args.learning_rate)
+ else:
+ logger.error('Unsupported optimizer: {}'.format(args.optim))
+ exit(-1)
+ if args.weight_decay > 0.0:
+ obj_func = avg_cost + args.weight_decay * l2_loss(main_program)
+ optimizer.minimize(obj_func)
+ else:
+ obj_func = avg_cost
+ optimizer.minimize(obj_func)
+
+ # initialize parameters
+ place = core.CUDAPlace(0) if args.use_gpu else core.CPUPlace()
+ exe = Executor(place)
+ if args.load_dir:
+ logger.info('load from {}'.format(args.load_dir))
+ fluid.io.load_persistables(
+ exe, args.load_dir, main_program=main_program)
+ else:
+ exe.run(startup_prog)
+ embedding_para = fluid.global_scope().find_var(
+ 'embedding_para').get_tensor()
+ embedding_para.set(vocab.embeddings.astype(np.float32), place)
+
+ # prepare data
+ feed_list = [
+ main_program.global_block().var(var_name)
+ for var_name in feed_order
+ ]
+ feeder = fluid.DataFeeder(feed_list, place)
+
+ logger.info('Training the model...')
+ parallel_executor = fluid.ParallelExecutor(
+ main_program=main_program,
+ use_cuda=bool(args.use_gpu),
+ loss_name=avg_cost.name)
+ print_para(main_program, parallel_executor, logger, args)
+
+ for pass_id in range(1, args.pass_num + 1):
+ pass_start_time = time.time()
+ pad_id = vocab.get_id(vocab.pad_token)
+ if args.enable_ce:
+ train_reader = lambda: brc_data.gen_mini_batches('train', args.batch_size, pad_id, shuffle=False)
+ else:
+ train_reader = lambda: brc_data.gen_mini_batches('train', args.batch_size, pad_id, shuffle=True)
+ train_reader = read_multiple(train_reader, dev_count)
+ log_every_n_batch, n_batch_loss = args.log_interval, 0
+ total_num, total_loss = 0, 0
+ for batch_id, batch_list in enumerate(train_reader(), 1):
+ feed_data = batch_reader(batch_list, args)
+ fetch_outs = parallel_executor.run(
+ feed=list(feeder.feed_parallel(feed_data, dev_count)),
+ fetch_list=[obj_func.name],
+ return_numpy=False)
+ cost_train = np.array(fetch_outs[0]).mean()
+ total_num += args.batch_size * dev_count
+ n_batch_loss += cost_train
+ total_loss += cost_train * args.batch_size * dev_count
+
+ if args.enable_ce and batch_id >= 100:
+ break
+ if log_every_n_batch > 0 and batch_id % log_every_n_batch == 0:
+ print_para(main_program, parallel_executor, logger,
+ args)
+ logger.info(
+ 'Average loss from batch {} to {} is {}'.format(
+ batch_id - log_every_n_batch + 1, batch_id,
+ "%.10f" % (n_batch_loss / log_every_n_batch)))
+ n_batch_loss = 0
+ if args.dev_interval > 0 and batch_id % args.dev_interval == 0:
+ if brc_data.dev_set is not None:
+ eval_loss, bleu_rouge = validation(
+ inference_program, avg_cost, s_probs, e_probs,
+ match, feed_order, place, dev_count, vocab,
+ brc_data, logger, args)
+ logger.info('Dev eval loss {}'.format(eval_loss))
+ logger.info('Dev eval result: {}'.format(
+ bleu_rouge))
+ pass_end_time = time.time()
+ time_consumed = pass_end_time - pass_start_time
+ logger.info('epoch: {0}, epoch_time_cost: {1:.2f}'.format(
+ pass_id, time_consumed))
+ logger.info('Evaluating the model after epoch {}'.format(
+ pass_id))
+ if brc_data.dev_set is not None:
+ eval_loss, bleu_rouge = validation(
+ inference_program, avg_cost, s_probs, e_probs, match,
+ feed_order, place, dev_count, vocab, brc_data, logger,
+ args)
+ logger.info('Dev eval loss {}'.format(eval_loss))
+ logger.info('Dev eval result: {}'.format(bleu_rouge))
+ else:
+ logger.warning(
+ 'No dev set is loaded for evaluation in the dataset!')
+
+ logger.info('Average train loss for epoch {} is {}'.format(
+ pass_id, "%.10f" % (1.0 * total_loss / total_num)))
+
+ if pass_id % args.save_interval == 0:
+ model_path = os.path.join(args.save_dir, str(pass_id))
+ if not os.path.isdir(model_path):
+ os.makedirs(model_path)
+
+ fluid.io.save_persistables(
+ executor=exe,
+ dirname=model_path,
+ main_program=main_program)
+ if args.enable_ce: # For CE
+ print("kpis\ttrain_cost_card%d\t%f" %
+ (dev_count, total_loss / total_num))
+ if brc_data.dev_set is not None:
+ print("kpis\ttest_cost_card%d\t%f" %
+ (dev_count, eval_loss))
+ print("kpis\ttrain_duration_card%d\t%f" %
+ (dev_count, time_consumed))
+
+
+def evaluate(logger, args):
+ """evaluate a specific model using devset"""
+ logger.info('Load data_set and vocab...')
+ with open(os.path.join(args.vocab_dir, 'vocab.data'), 'rb') as fin:
+ vocab = pickle.load(fin)
+ logger.info('vocab size is {} and embed dim is {}'.format(vocab.size(
+ ), vocab.embed_dim))
+ brc_data = BRCDataset(
+ args.max_p_num, args.max_p_len, args.max_q_len, dev_files=args.devset)
+ logger.info('Converting text into ids...')
+ brc_data.convert_to_ids(vocab)
+ logger.info('Initialize the model...')
+
+ # build model
+ main_program = fluid.Program()
+ startup_prog = fluid.Program()
+ with fluid.program_guard(main_program, startup_prog):
+ with fluid.unique_name.guard():
+ avg_cost, s_probs, e_probs, match, feed_order = rc_model.rc_model(
+ args.hidden_size, vocab, args)
+ # initialize parameters
+ if not args.use_gpu:
+ place = fluid.CPUPlace()
+ dev_count = int(
+ os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
+ else:
+ place = fluid.CUDAPlace(0)
+ dev_count = fluid.core.get_cuda_device_count()
+
+ exe = Executor(place)
+ if args.load_dir:
+ logger.info('load from {}'.format(args.load_dir))
+ fluid.io.load_persistables(
+ exe, args.load_dir, main_program=main_program)
+ else:
+ logger.error('No model file to load ...')
+ return
+
+ inference_program = main_program.clone(for_test=True)
+ eval_loss, bleu_rouge = validation(
+ inference_program, avg_cost, s_probs, e_probs, match, feed_order,
+ place, dev_count, vocab, brc_data, logger, args)
+ logger.info('Dev eval loss {}'.format(eval_loss))
+ logger.info('Dev eval result: {}'.format(bleu_rouge))
+ logger.info('Predicted answers are saved to {}'.format(
+ os.path.join(args.result_dir)))
+
+
+def predict(logger, args):
+ """do inference on the test dataset """
+ logger.info('Load data_set and vocab...')
+ with open(os.path.join(args.vocab_dir, 'vocab.data'), 'rb') as fin:
+ vocab = pickle.load(fin)
+ logger.info('vocab size is {} and embed dim is {}'.format(vocab.size(
+ ), vocab.embed_dim))
+ brc_data = BRCDataset(
+ args.max_p_num, args.max_p_len, args.max_q_len, dev_files=args.testset)
+ logger.info('Converting text into ids...')
+ brc_data.convert_to_ids(vocab)
+ logger.info('Initialize the model...')
+
+ # build model
+ main_program = fluid.Program()
+ startup_prog = fluid.Program()
+ with fluid.program_guard(main_program, startup_prog):
+ with fluid.unique_name.guard():
+ avg_cost, s_probs, e_probs, match, feed_order = rc_model.rc_model(
+ args.hidden_size, vocab, args)
+ # initialize parameters
+ if not args.use_gpu:
+ place = fluid.CPUPlace()
+ dev_count = int(
+ os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
+ else:
+ place = fluid.CUDAPlace(0)
+ dev_count = fluid.core.get_cuda_device_count()
+
+ exe = Executor(place)
+ if args.load_dir:
+ logger.info('load from {}'.format(args.load_dir))
+ fluid.io.load_persistables(
+ exe, args.load_dir, main_program=main_program)
+ else:
+ logger.error('No model file to load ...')
+ return
+
+ inference_program = main_program.clone(for_test=True)
+ eval_loss, bleu_rouge = validation(
+ inference_program, avg_cost, s_probs, e_probs, match,
+ feed_order, place, dev_count, vocab, brc_data, logger, args)
+
+
+def prepare(logger, args):
+ """
+ Checks the data, creates the directories, and prepares the vocabulary and embeddings.
+ """
+ logger.info('Checking the data files...')
+ for data_path in args.trainset + args.devset + args.testset:
+ assert os.path.exists(data_path), '{} file does not exist.'.format(
+ data_path)
+ logger.info('Preparing the directories...')
+ for dir_path in [args.vocab_dir, args.save_dir, args.result_dir]:
+ if not os.path.exists(dir_path):
+ os.makedirs(dir_path)
+
+ logger.info('Building vocabulary...')
+ brc_data = BRCDataset(args.max_p_num, args.max_p_len, args.max_q_len,
+ args.trainset, args.devset, args.testset)
+ vocab = Vocab(lower=True)
+ for word in brc_data.word_iter('train'):
+ vocab.add(word)
+
+ unfiltered_vocab_size = vocab.size()
+ vocab.filter_tokens_by_cnt(min_cnt=2)
+ filtered_num = unfiltered_vocab_size - vocab.size()
+ logger.info('After filtering {} tokens, the final vocab size is {}'.format(
+ filtered_num, vocab.size()))
+
+ logger.info('Assigning embeddings...')
+ vocab.randomly_init_embeddings(args.embed_size)
+
+ logger.info('Saving vocab...')
+ with open(os.path.join(args.vocab_dir, 'vocab.data'), 'wb') as fout:
+ pickle.dump(vocab, fout)
+
+ logger.info('Done with preparing!')
+
+
+if __name__ == '__main__':
+ args = parse_args()
+
+ if args.enable_ce:
+ random.seed(args.random_seed)
+ np.random.seed(args.random_seed)
+
+ logger = logging.getLogger("brc")
+ logger.setLevel(logging.INFO)
+ formatter = logging.Formatter(
+ '%(asctime)s - %(name)s - %(levelname)s - %(message)s')
+ if args.log_path:
+ file_handler = logging.FileHandler(args.log_path)
+ file_handler.setLevel(logging.INFO)
+ file_handler.setFormatter(formatter)
+ logger.addHandler(file_handler)
+ else:
+ console_handler = logging.StreamHandler()
+ console_handler.setLevel(logging.INFO)
+ console_handler.setFormatter(formatter)
+ logger.addHandler(console_handler)
+ logger.info('Running with args : {}'.format(args))
+ if args.prepare:
+ prepare(logger, args)
+ if args.train:
+ train(logger, args)
+ if args.evaluate:
+ evaluate(logger, args)
+ if args.predict:
+ predict(logger, args)
diff --git a/PaddleNLP/machine_reading_comprehension/run.sh b/PaddleNLP/machine_reading_comprehension/run.sh
new file mode 100644
index 0000000000000000000000000000000000000000..9f448c0611ae8236dcfd668a093cb3ae72d30603
--- /dev/null
+++ b/PaddleNLP/machine_reading_comprehension/run.sh
@@ -0,0 +1,52 @@
+#!/bin/bash
+export CUDA_VISIBLE_DEVICES=1
+
+paragraph_extraction ()
+{
+ SOURCE_DIR=$1
+ TARGET_DIR=$2
+ echo "Start paragraph extraction, this may take a few hours"
+ echo "Source dir: $SOURCE_DIR"
+ echo "Target dir: $TARGET_DIR"
+ mkdir -p $TARGET_DIR/trainset
+ mkdir -p $TARGET_DIR/devset
+ mkdir -p $TARGET_DIR/testset
+
+ echo "Processing trainset"
+ cat $SOURCE_DIR/trainset/search.train.json | python paragraph_extraction.py train \
+ > $TARGET_DIR/trainset/search.train.json
+ cat $SOURCE_DIR/trainset/zhidao.train.json | python paragraph_extraction.py train \
+ > $TARGET_DIR/trainset/zhidao.train.json
+
+ echo "Processing devset"
+ cat $SOURCE_DIR/devset/search.dev.json | python paragraph_extraction.py dev \
+ > $TARGET_DIR/devset/search.dev.json
+ cat $SOURCE_DIR/devset/zhidao.dev.json | python paragraph_extraction.py dev \
+ > $TARGET_DIR/devset/zhidao.dev.json
+
+ echo "Processing testset"
+ cat $SOURCE_DIR/testset/search.test.json | python paragraph_extraction.py test \
+ > $TARGET_DIR/testset/search.test.json
+ cat $SOURCE_DIR/testset/zhidao.test.json | python paragraph_extraction.py test \
+ > $TARGET_DIR/testset/zhidao.test.json
+ echo "Paragraph extraction done!"
+}
+
+
+PROCESS_NAME="$1"
+case $PROCESS_NAME in
+ --para_extraction)
+ # Start paragraph extraction
+ if [ ! -d data/preprocessed ]; then
+ echo "Please download the preprocessed data first (See README - Preprocess)"
+ exit 1
+ fi
+ paragraph_extraction data/preprocessed data/extracted
+ ;;
+ --prepare|--train|--evaluate|--predict)
+ # Start Paddle baseline
+ python run.py "$@"
+ ;;
+ *)
+ echo $"Usage: $0 {--para_extraction|--prepare|--train|--evaluate|--predict}"
+ exit 1
+ ;;
+esac
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/utils/__init__.py b/PaddleNLP/machine_reading_comprehension/utils/__init__.py
similarity index 100%
rename from fluid/PaddleNLP/machine_reading_comprehension/utils/__init__.py
rename to PaddleNLP/machine_reading_comprehension/utils/__init__.py
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/utils/download_thirdparty.sh b/PaddleNLP/machine_reading_comprehension/utils/download_thirdparty.sh
similarity index 100%
rename from fluid/PaddleNLP/machine_reading_comprehension/utils/download_thirdparty.sh
rename to PaddleNLP/machine_reading_comprehension/utils/download_thirdparty.sh
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/utils/dureader_eval.py b/PaddleNLP/machine_reading_comprehension/utils/dureader_eval.py
similarity index 100%
rename from fluid/PaddleNLP/machine_reading_comprehension/utils/dureader_eval.py
rename to PaddleNLP/machine_reading_comprehension/utils/dureader_eval.py
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/utils/get_vocab.py b/PaddleNLP/machine_reading_comprehension/utils/get_vocab.py
similarity index 100%
rename from fluid/PaddleNLP/machine_reading_comprehension/utils/get_vocab.py
rename to PaddleNLP/machine_reading_comprehension/utils/get_vocab.py
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/utils/marco_tokenize_data.py b/PaddleNLP/machine_reading_comprehension/utils/marco_tokenize_data.py
similarity index 100%
rename from fluid/PaddleNLP/machine_reading_comprehension/utils/marco_tokenize_data.py
rename to PaddleNLP/machine_reading_comprehension/utils/marco_tokenize_data.py
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/utils/marcov1_to_dureader.py b/PaddleNLP/machine_reading_comprehension/utils/marcov1_to_dureader.py
similarity index 100%
rename from fluid/PaddleNLP/machine_reading_comprehension/utils/marcov1_to_dureader.py
rename to PaddleNLP/machine_reading_comprehension/utils/marcov1_to_dureader.py
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/utils/marcov2_to_v1_tojsonl.py b/PaddleNLP/machine_reading_comprehension/utils/marcov2_to_v1_tojsonl.py
similarity index 100%
rename from fluid/PaddleNLP/machine_reading_comprehension/utils/marcov2_to_v1_tojsonl.py
rename to PaddleNLP/machine_reading_comprehension/utils/marcov2_to_v1_tojsonl.py
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/utils/preprocess.py b/PaddleNLP/machine_reading_comprehension/utils/preprocess.py
similarity index 100%
rename from fluid/PaddleNLP/machine_reading_comprehension/utils/preprocess.py
rename to PaddleNLP/machine_reading_comprehension/utils/preprocess.py
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/utils/run_marco2dureader_preprocess.sh b/PaddleNLP/machine_reading_comprehension/utils/run_marco2dureader_preprocess.sh
similarity index 100%
rename from fluid/PaddleNLP/machine_reading_comprehension/utils/run_marco2dureader_preprocess.sh
rename to PaddleNLP/machine_reading_comprehension/utils/run_marco2dureader_preprocess.sh
diff --git a/PaddleNLP/machine_reading_comprehension/vocab.py b/PaddleNLP/machine_reading_comprehension/vocab.py
new file mode 100644
index 0000000000000000000000000000000000000000..f1662cbcb27d9c012451c1b74cea6b4c60239710
--- /dev/null
+++ b/PaddleNLP/machine_reading_comprehension/vocab.py
@@ -0,0 +1,200 @@
+# -*- coding:utf8 -*-
+# ==============================================================================
+# Copyright 2017 Baidu.com, Inc. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# ==============================================================================
+"""
+This module implements the Vocab class for converting string to id and back
+"""
+
+import numpy as np
+
+
+class Vocab(object):
+ """
+ Implements a vocabulary to store the tokens in the data, with their corresponding embeddings.
+ """
+
+ def __init__(self, filename=None, initial_tokens=None, lower=False):
+ self.id2token = {}
+ self.token2id = {}
+ self.token_cnt = {}
+ self.lower = lower
+
+ self.embed_dim = None
+ self.embeddings = None
+
+ self.pad_token = '<blank>'
+ self.unk_token = '<unk>'
+ self.split_token = '<splitter>'
+
+ self.initial_tokens = initial_tokens if initial_tokens is not None else []
+ self.initial_tokens.extend([self.pad_token, self.unk_token, self.split_token])
+ for token in self.initial_tokens:
+ self.add(token)
+
+ if filename is not None:
+ self.load_from_file(filename)
+
+ def size(self):
+ """
+ get the size of vocabulary
+ Returns:
+ an integer indicating the size
+ """
+ return len(self.id2token)
+
+ def load_from_file(self, file_path):
+ """
+ loads the vocab from file_path
+ Args:
+ file_path: a file with a word in each line
+ """
+ with open(file_path, 'r') as fin:
+ for line in fin:
+ token = line.rstrip('\n')
+ self.add(token)
+
+ def get_id(self, token):
+ """
+ gets the id of a token, returns the id of unk token if token is not in vocab
+ Args:
+ key: a string indicating the word
+ Returns:
+ an integer
+ """
+ token = token.lower() if self.lower else token
+ try:
+ return self.token2id[token]
+ except KeyError:
+ return self.token2id[self.unk_token]
+
+ def get_token(self, idx):
+ """
+ gets the token corresponding to idx, returns unk token if idx is not in vocab
+ Args:
+ idx: an integer
+ returns:
+ a token string
+ """
+ try:
+ return self.id2token[idx]
+ except KeyError:
+ return self.unk_token
+
+ def add(self, token, cnt=1):
+ """
+ adds the token to vocab
+ Args:
+ token: a string
+ cnt: a num indicating the count of the token to add, default is 1
+ """
+ token = token.lower() if self.lower else token
+ if token in self.token2id:
+ idx = self.token2id[token]
+ else:
+ idx = len(self.id2token)
+ self.id2token[idx] = token
+ self.token2id[token] = idx
+ if cnt > 0:
+ if token in self.token_cnt:
+ self.token_cnt[token] += cnt
+ else:
+ self.token_cnt[token] = cnt
+ return idx
+
+ def filter_tokens_by_cnt(self, min_cnt):
+ """
+ filter the tokens in vocab by their count
+ Args:
+ min_cnt: tokens whose frequency is less than min_cnt are filtered out
+ """
+ filtered_tokens = [
+ token for token in self.token2id if self.token_cnt[token] >= min_cnt
+ ]
+ # rebuild the token x id map
+ self.token2id = {}
+ self.id2token = {}
+ for token in self.initial_tokens:
+ self.add(token, cnt=0)
+ for token in filtered_tokens:
+ self.add(token, cnt=0)
+
+ def randomly_init_embeddings(self, embed_dim):
+ """
+ randomly initializes the embeddings for each token
+ Args:
+ embed_dim: the size of the embedding for each token
+ """
+ self.embed_dim = embed_dim
+ self.embeddings = np.random.rand(self.size(), embed_dim)
+ for token in [self.pad_token, self.unk_token, self.split_token]:
+ self.embeddings[self.get_id(token)] = np.zeros([self.embed_dim])
+
+ def load_pretrained_embeddings(self, embedding_path):
+ """
+ loads the pretrained embeddings from embedding_path,
+ tokens not in pretrained embeddings will be filtered
+ Args:
+ embedding_path: the path of the pretrained embedding file
+ """
+ trained_embeddings = {}
+ with open(embedding_path, 'r') as fin:
+ for line in fin:
+ contents = line.strip().split()
+ token = contents[0]
+ if isinstance(token, bytes):
+ token = token.decode('utf8')
+ if token not in self.token2id:
+ continue
+ trained_embeddings[token] = list(map(float, contents[1:]))
+ if self.embed_dim is None:
+ self.embed_dim = len(contents) - 1
+ filtered_tokens = trained_embeddings.keys()
+ # rebuild the token x id map
+ self.token2id = {}
+ self.id2token = {}
+ for token in self.initial_tokens:
+ self.add(token, cnt=0)
+ for token in filtered_tokens:
+ self.add(token, cnt=0)
+ # load embeddings
+ self.embeddings = np.zeros([self.size(), self.embed_dim])
+ for token in self.token2id.keys():
+ if token in trained_embeddings:
+ self.embeddings[self.get_id(token)] = trained_embeddings[token]
+
+ def convert_to_ids(self, tokens):
+ """
+ Convert a list of tokens to ids, use unk_token if the token is not in vocab.
+ Args:
+ tokens: a list of token
+ Returns:
+ a list of ids
+ """
+ vec = [self.get_id(token) for token in tokens]
+ return vec
+
+ def recover_from_ids(self, ids, stop_id=None):
+ """
+ Convert a list of ids to tokens, stop converting if the stop_id is encountered
+ Args:
+ ids: a list of ids to convert
+ stop_id: the stop id, default is None
+ Returns:
+ a list of tokens
+ """
+ tokens = []
+ for i in ids:
+ tokens += [self.get_token(i)]
+ if stop_id is not None and i == stop_id:
+ break
+ return tokens
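+
+
+# Example usage (hypothetical tokens):
+# vocab = Vocab(lower=True)
+# for tok in ['hello', 'world', 'hello']:
+# vocab.add(tok)
+# vocab.filter_tokens_by_cnt(min_cnt=2) # keeps 'hello', drops 'world'
+# vocab.randomly_init_embeddings(8)
+# ids = vocab.convert_to_ids(['hello', 'unseen']) # 'unseen' maps to the unk id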
diff --git a/PaddleNLP/neural_machine_translation/README.md b/PaddleNLP/neural_machine_translation/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..a0271ad42e62490282ccc154f6a3c50029b6d13d
--- /dev/null
+++ b/PaddleNLP/neural_machine_translation/README.md
@@ -0,0 +1,9 @@
+The minimum PaddlePaddle version needed for the code sample in this directory is the latest develop branch. If you are on a version of PaddlePaddle earlier than this, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
+
+---
+
+This is a collection of example models for neural machine translation and neural sequence modeling.
+
+### TODO
+
+This project is still under active development.
diff --git a/fluid/PaddleNLP/neural_machine_translation/rnn_search/.run_ce.sh b/PaddleNLP/neural_machine_translation/rnn_search/.run_ce.sh
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/rnn_search/.run_ce.sh
rename to PaddleNLP/neural_machine_translation/rnn_search/.run_ce.sh
diff --git a/PaddleNLP/neural_machine_translation/rnn_search/README.md b/PaddleNLP/neural_machine_translation/rnn_search/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..86d4a021baf11e04a9fd07c05dbf50425451efab
--- /dev/null
+++ b/PaddleNLP/neural_machine_translation/rnn_search/README.md
@@ -0,0 +1,134 @@
+Running the example model in this directory requires PaddlePaddle Fluid v1.0. If your PaddlePaddle installation is older than this requirement, please update it following the instructions in the [installation guide](http://paddlepaddle.org/documentation/docs/zh/1.2/beginners_guide/install/index_cn.html).
+
+# Machine Translation: RNN Search
+
+Below is the directory structure of this example model and a brief description of each file:
+
+```text
+.
+├── README.md # This document
+├── args.py # Training, inference and model arguments
+├── train.py # Main training program
+├── infer.py # Main inference program
+├── attention_model.py # Translation model configuration with attention
+└── no_attention_model.py # Translation model configuration without attention
+```
+
+## Introduction
+Machine translation (MT) uses computers to translate between languages. The language being translated from is usually called the source language, and the language translated into is called the target language. Machine translation, the process of transforming the source language into the target language, is one of the important research areas of natural language processing.
+
+In recent years, deep learning has continuously brought new breakthroughs to machine translation. End-to-end neural machine translation (NMT) models, which map the source language to the target language directly with a neural network, have gradually become the mainstream and are commonly referred to simply as NMT models.
+
+This directory contains a Paddle Fluid implementation of the classic machine translation model [RNN Search](https://arxiv.org/pdf/1409.0473.pdf). RNN Search is a fairly traditional NMT model whose performance has by now been surpassed by many newer models (such as the [Transformer](https://arxiv.org/abs/1706.03762)). However, beyond machine translation, it is the foundation of many sequence-to-sequence (Seq2Seq) models for other NLP problems, so it remains significant in NLP and is widely used as a baseline.
+
+The implementation in this directory is intended to show how to use Paddle Fluid to build an RNN model with an attention mechanism for Seq2Seq problems, and how to use a decoder with beam search. If you only need a model with strong translation quality, we recommend the [Paddle Fluid implementation of the Transformer](https://github.com/PaddlePaddle/models/tree/develop/fluid/neural_machine_translation/transformer).
+
+## Model Overview
+The RNN Search model uses the classic encoder-decoder framework for Seq2Seq problems: an encoder first encodes the source sequence into a vector, and a decoder then decodes this vector into the target sequence. This mimics how humans translate: first parse the source sentence and understand its meaning, then write the target sentence according to that meaning. Both the encoder and the decoder are usually implemented with RNNs. For the underlying theory and equations, see [Deep Learning 101](http://paddlepaddle.org/documentation/docs/zh/1.2/beginners_guide/basics/machine_translation/index.html).
+
+In this model, the encoder is implemented with a bi-directional recurrent neural network; on the decoder side we use an RNN decoder with attention, and also provide a decoder without attention for comparison; for inference we use the beam search algorithm to generate the translated target sentence. Each of these techniques is introduced below.
+
+### Bi-directional Recurrent Neural Network
+Here we introduce the bi-directional recurrent network structure proposed by Bengio's team in \[[2](#references),[4](#references)\]. Given an input sequence, it produces a feature representation at each time step, i.e. each output step is a fixed-length vector representing the context information up to that step.
+Concretely, the bi-directional RNN processes the input sequence along the time dimension in order and in reverse, i.e. forward and backward, and concatenates the RNN outputs at each time step into the final output layer. The output node at each time step thus contains the complete past and future context information of the input sequence at the current step. The figure below shows a bi-directional RNN unrolled over time. The network contains one forward and one backward RNN with six weight matrices: from the input to the forward and backward hidden layers ($W_1, W_3$), from the hidden layers to themselves ($W_2, W_5$), and from the forward and backward hidden layers to the output layer ($W_4, W_6$). Note that there is no connection between the forward and backward hidden layers.
+
+
+
+Figure 1. A bi-directional recurrent neural network unrolled over time
+
+
+
+
+Figure 2. Encoder using a bi-directional LSTM
+
+
+### Attention Mechanism
+If the output of the encoding stage is a single fixed-size vector, two problems arise: 1) whether the source sequence is 5 words or 50 words long, encoding all of its semantic and syntactic information into a fixed-size vector places very high demands on the model, especially for long sequences; 2) intuitively, when humans translate a sentence, they pay more attention to the source fragments most relevant to the current output, and the focus shifts as the translation proceeds, whereas a fixed-size vector in effect gives equal attention to all of the source information at every moment, which is unreasonable. Bahdanau et al. \[[4](#references)\] therefore introduced the attention mechanism, which decodes against context fragments of the encoded input, thereby addressing feature learning for long sentences. The decoder structure with attention is described below.
+
+Unlike in the simple decoder, $z_i$ here is computed as (since GitHub does not render LaTeX natively, see [here](http://www.paddlepaddle.org/documentation/docs/zh/1.2/beginners_guide/basics/machine_translation/index.html) for a rendered version):
+
+$$z_{i+1}=\phi _{\theta '}\left ( c_i,u_i,z_i \right )$$
+
+As can be seen, the encoded representation of the source sentence is the context fragment $c_i$ for the $i$-th word: for each word $u_i$ in the target language there is a dedicated $c_i$ corresponding to it. It is computed as:
+
+$$c_i=\sum _{j=1}^{T}a_{ij}h_j, a_i=\left[ a_{i1},a_{i2},...,a_{iT}\right ]$$
+
+As the formula shows, the attention mechanism is realized as a weighted average of the encoder RNN states $h_j$ at each time step. The weight $a_{ij}$ denotes how much attention the $i$-th target word pays to the $j$-th source word, and is computed as:
+
+$$a_{ij} = {exp(e_{ij}) \over {\sum_{k=1}^T exp(e_{ik})}}$$
+$$e_{ij} = {align(z_i, h_j)}$$
+
+Here $align$ can be viewed as an alignment model that measures how well the $i$-th target word matches the $j$-th source word; concretely, this score is computed from the $i$-th decoder RNN hidden state $z_i$ and the $j$-th source context fragment $h_j$. In traditional alignment models, each target word corresponds explicitly to one or more source words (hard alignment); the attention model uses soft alignment instead: any pair of target and source words is related to some degree, and the strength of the relation is a real number computed by the model, so it can be integrated into the whole NMT framework and trained with back-propagation.
+
+
+
+Figure 3. Decoder with attention
+
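+As an illustration only, the attention weights $a_{ij}$ above can be sketched in NumPy with a dot-product $align$ function (the paper learns $align$ as a small network; the plain dot product here is a simplifying assumption):
+
+```python
+import numpy as np
+
+def attention_weights(z_i, h):
+ """z_i: decoder state (d,); h: encoder states (T, d). Returns (T,) weights."""
+ e = h.dot(z_i) # e_ij = align(z_i, h_j), here a plain dot product
+ e = e - e.max() # stabilize the softmax
+ a = np.exp(e) / np.exp(e).sum() # a_ij: softmax over source positions
+ return a
+
+h = np.random.rand(7, 16) # 7 source positions, hidden size 16
+z = np.random.rand(16)
+a = attention_weights(z, h)
+c = a.dot(h) # context vector c_i = sum_j a_ij * h_j
+```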
+
+### Beam Search
+
+[Beam search](http://en.wikipedia.org/wiki/Beam_search) is a heuristic graph search algorithm that expands only a limited set of the most promising nodes in a graph or tree. It is typically used in systems with very large solution spaces (such as machine translation and speech recognition), where memory cannot hold all expanded solutions. For example, when translating "`你好`" in a machine translation task, even if the target vocabulary contains only three words (`<s>`, `<e>`, `hello`), infinitely many sentences can be generated (the number of occurrences of `hello` is unbounded); beam search helps find the better translations among them.
+
+Beam search builds the search tree with a breadth-first strategy. At each level of the tree, nodes are sorted by a heuristic cost (in this tutorial, the sum of the log probabilities of the generated words) and only a predetermined number of nodes (commonly called the beam width, beam size, etc.) are kept. Only these nodes are expanded at the next level; all others are pruned away. In other words, the higher-quality nodes are kept and the lower-quality ones are pruned, which greatly reduces the time and space the search consumes, at the cost of no longer guaranteeing an optimal solution.
+
+In the decoding stage with beam search, the goal is to maximize the probability of the generated sequence. The idea is as follows (a minimal sketch in Python follows this list):
+
+1. At each step, compute the next hidden state $z_{i+1}$ from the encoded source information $c$, the $i$-th generated target word $u_i$, and the RNN hidden state $z_i$ at step $i$.
+2. Normalize $z_{i+1}$ with `softmax` to obtain the probability distribution $p_{i+1}$ of the $(i+1)$-th target word.
+3. Sample the word $u_{i+1}$ according to $p_{i+1}$.
+4. Repeat steps 1~3 until the end-of-sentence token `<e>` is generated or the maximum generation length is exceeded.
+
+Note: $z_{i+1}$ and $p_{i+1}$ are computed exactly as in the decoder, and since each generation step is greedy, the result is not guaranteed to be globally optimal.
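+
+A minimal beam search sketch over an abstract `step(prefix)` function (all names here are hypothetical, not part of this example's code):
+
+```python
+def beam_search(step, bos, eos, beam_width=4, max_len=20):
+ """step(prefix) -> list of (token_id, log_prob) for the next token."""
+ beams = [([bos], 0.0)] # (prefix, summed log probability)
+ finished = []
+ for _ in range(max_len):
+ candidates = []
+ for prefix, score in beams:
+ for tok, logp in step(prefix):
+ candidates.append((prefix + [tok], score + logp))
+ # keep only the beam_width best prefixes at this level
+ candidates.sort(key=lambda x: x[1], reverse=True)
+ beams = []
+ for prefix, score in candidates[:beam_width]:
+ (finished if prefix[-1] == eos else beams).append((prefix, score))
+ if not beams: # every kept prefix has ended with eos
+ break
+ return max(finished + beams, key=lambda x: x[1])
+```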
+
+## Data
+
+This tutorial uses the [bitexts (after selection)](http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/data/bitexts.tgz) portion of the [WMT-14](http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/) dataset as the training set, and the [dev+test data](http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/data/dev+test.tgz) as the test and generation sets.
+
+### Data Preprocessing
+
+Our preprocessing consists of two steps (a sketch of the merge step follows this list):
+- Merge each pair of parallel corpus files from the source to the target language into a single file:
+ - Merge each `XXX.src` and `XXX.trg` file into `XXX`.
+ - Line $i$ of `XXX` is the concatenation of line $i$ of `XXX.src` and line $i$ of `XXX.trg`, separated by '\t'.
+- Build the "source dictionary" and "target dictionary" of the training data. Each dictionary has **DICTSIZE** words: the (DICTSIZE - 3) most frequent words in the corpus plus the 3 special tokens `<s>` (start of sequence), `<e>` (end of sequence) and `<unk>` (out-of-vocabulary word).
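+
+A minimal sketch of the merge step described above (the file names are placeholders):
+
+```python
+# Join XXX.src and XXX.trg line by line, separated by a tab.
+with open('XXX.src') as fsrc, open('XXX.trg') as ftrg, open('XXX', 'w') as fout:
+ for src, trg in zip(fsrc, ftrg):
+ fout.write(src.rstrip('\n') + '\t' + trg.rstrip('\n') + '\n')
+```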
+
+### Sample Data
+
+Because the complete dataset is large, the PaddlePaddle interface paddle.dataset.wmt14 provides by default a preprocessed [smaller dataset](http://paddlepaddle.bj.bcebos.com/demo/wmt_shrinked_data/wmt14.tgz) for validating the training pipeline.
+
+This dataset contains 193319 training samples and 6003 test samples, with a dictionary size of 30000. Because of the limited data size, the quality of models trained on it is not guaranteed.
+
+## Training the Model
+
+`train.py` contains the main training function. To start training with default arguments, simply run:
+```sh
+python train.py
+```
+You can set the training arguments via the command line. To show all available command-line arguments, run:
+```sh
+python train.py -h
+```
+This prints a description and the default value of every command-line argument. The default model uses the attention mechanism. You can also try the model without attention:
+```sh
+python train.py --no_attention
+```
+Trained models are saved under `./models` by default. You can specify the save path with the command-line argument `--save_dir`. By default one model is saved at the end of each pass.
+
+## Generating Predictions
+
+After the model is trained, you can generate predictions with `infer.py`. Likewise, with default arguments, simply run:
+```sh
+python infer.py
+```
+You can also set the arguments on the command line. Note that the inference arguments must exactly match those used in training, otherwise loading the model will fail. Use `--pass_num` to choose which pass's saved model to read, and `--beam_width` to set the beam search width.
+
+## References
+
+1. Koehn P. [Statistical machine translation](https://books.google.com.hk/books?id=4v_Cx1wIMLkC&printsec=frontcover&hl=zh-CN&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false)[M]. Cambridge University Press, 2009.
+2. Cho K, Van Merriënboer B, Gulcehre C, et al. [Learning phrase representations using RNN encoder-decoder for statistical machine translation](http://www.aclweb.org/anthology/D/D14/D14-1179.pdf)[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2014: 1724-1734.
+3. Chung J, Gulcehre C, Cho K H, et al. [Empirical evaluation of gated recurrent neural networks on sequence modeling](https://arxiv.org/abs/1412.3555)[J]. arXiv preprint arXiv:1412.3555, 2014.
+4. Bahdanau D, Cho K, Bengio Y. [Neural machine translation by jointly learning to align and translate](https://arxiv.org/abs/1409.0473)[C]//Proceedings of ICLR 2015, 2015.
+5. Papineni K, Roukos S, Ward T, et al. [BLEU: a method for automatic evaluation of machine translation](http://dl.acm.org/citation.cfm?id=1073135)[C]//Proceedings of the 40th annual meeting on association for computational linguistics. Association for Computational Linguistics, 2002: 311-318.
+
+
+
+This tutorial was created by PaddlePaddle and is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License.
diff --git a/fluid/PaddleNLP/neural_machine_translation/rnn_search/_ce.py b/PaddleNLP/neural_machine_translation/rnn_search/_ce.py
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/rnn_search/_ce.py
rename to PaddleNLP/neural_machine_translation/rnn_search/_ce.py
diff --git a/fluid/PaddleNLP/neural_machine_translation/rnn_search/args.py b/PaddleNLP/neural_machine_translation/rnn_search/args.py
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/rnn_search/args.py
rename to PaddleNLP/neural_machine_translation/rnn_search/args.py
diff --git a/fluid/PaddleNLP/neural_machine_translation/rnn_search/attention_model.py b/PaddleNLP/neural_machine_translation/rnn_search/attention_model.py
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/rnn_search/attention_model.py
rename to PaddleNLP/neural_machine_translation/rnn_search/attention_model.py
diff --git a/fluid/PaddleNLP/neural_machine_translation/rnn_search/images/bi_rnn.png b/PaddleNLP/neural_machine_translation/rnn_search/images/bi_rnn.png
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/rnn_search/images/bi_rnn.png
rename to PaddleNLP/neural_machine_translation/rnn_search/images/bi_rnn.png
diff --git a/fluid/PaddleNLP/neural_machine_translation/rnn_search/images/decoder_attention.png b/PaddleNLP/neural_machine_translation/rnn_search/images/decoder_attention.png
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/rnn_search/images/decoder_attention.png
rename to PaddleNLP/neural_machine_translation/rnn_search/images/decoder_attention.png
diff --git a/fluid/PaddleNLP/neural_machine_translation/rnn_search/images/encoder_attention.png b/PaddleNLP/neural_machine_translation/rnn_search/images/encoder_attention.png
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/rnn_search/images/encoder_attention.png
rename to PaddleNLP/neural_machine_translation/rnn_search/images/encoder_attention.png
diff --git a/fluid/PaddleNLP/neural_machine_translation/rnn_search/infer.py b/PaddleNLP/neural_machine_translation/rnn_search/infer.py
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/rnn_search/infer.py
rename to PaddleNLP/neural_machine_translation/rnn_search/infer.py
diff --git a/fluid/PaddleNLP/neural_machine_translation/rnn_search/no_attention_model.py b/PaddleNLP/neural_machine_translation/rnn_search/no_attention_model.py
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/rnn_search/no_attention_model.py
rename to PaddleNLP/neural_machine_translation/rnn_search/no_attention_model.py
diff --git a/fluid/PaddleNLP/neural_machine_translation/rnn_search/train.py b/PaddleNLP/neural_machine_translation/rnn_search/train.py
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/rnn_search/train.py
rename to PaddleNLP/neural_machine_translation/rnn_search/train.py
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/.gitignore b/PaddleNLP/neural_machine_translation/transformer/.gitignore
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/transformer/.gitignore
rename to PaddleNLP/neural_machine_translation/transformer/.gitignore
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/.run_ce.sh b/PaddleNLP/neural_machine_translation/transformer/.run_ce.sh
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/transformer/.run_ce.sh
rename to PaddleNLP/neural_machine_translation/transformer/.run_ce.sh
diff --git a/PaddleNLP/neural_machine_translation/transformer/README.md b/PaddleNLP/neural_machine_translation/transformer/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..6fea167b5e7c3e9dd759ef30d9225b451350e889
--- /dev/null
+++ b/PaddleNLP/neural_machine_translation/transformer/README.md
@@ -0,0 +1,23 @@
+The minimum PaddlePaddle version needed for the code sample in this directory is the latest develop branch. If you are on a version of PaddlePaddle earlier than this, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
+
+---
+
+# Attention is All You Need: A Paddle Fluid implementation
+
+This is a Paddle Fluid implementation of the Transformer model in [Attention is All You Need](https://arxiv.org/abs/1706.03762) (Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin, arxiv, 2017).
+
+If you use the dataset/code in your research, please cite the paper:
+
+```text
+@inproceedings{vaswani2017attention,
+ title={Attention is all you need},
+ author={Vaswani, Ashish and Shazeer, Noam and Parmar, Niki and Uszkoreit, Jakob and Jones, Llion and Gomez, Aidan N and Kaiser, {\L}ukasz and Polosukhin, Illia},
+ booktitle={Advances in Neural Information Processing Systems},
+ pages={6000--6010},
+ year={2017}
+}
+```
+
+### TODO
+
+This project is still under active development.
diff --git a/PaddleNLP/neural_machine_translation/transformer/README_cn.md b/PaddleNLP/neural_machine_translation/transformer/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..bdac7cb0b7c4f9d51bbc281b351232c6edc75a36
--- /dev/null
+++ b/PaddleNLP/neural_machine_translation/transformer/README_cn.md
@@ -0,0 +1,272 @@
+The code in this directory requires the latest develop branch of PaddlePaddle. If your installed PaddlePaddle is older than this, please update it following the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html).
+
+---
+
+## Transformer
+
+The following is a brief overview of the directory structure and contents of this example:
+
+```text
+.
+├── images               # images used in this README
+├── config.py            # configuration of training, inference and model parameters
+├── infer.py             # inference script
+├── model.py             # model definition
+├── optim.py             # learning rate scheduling
+├── reader.py            # data reading interface
+├── README.md            # this document
+├── train.py             # training script
+└── gen_data.sh          # data generation script
+```
+
+### Introduction
+
+Transformer is the network architecture proposed in the paper [Attention Is All You Need](https://arxiv.org/abs/1706.03762) for sequence-to-sequence (Seq2Seq) learning tasks such as machine translation (MT); it models sequence-to-sequence transformations entirely with the attention mechanism [1].
+
+Compared with the recurrent neural networks (RNN) widely used in earlier Seq2Seq models, performing the input-to-output sequence transformation with (self-)attention has the following main advantages:
+
+- Lower computational complexity
+  - For a sequence of length n with feature dimension d, the complexity is `O(n * d * d)` in an RNN (n time steps, each computing a matrix-vector product over d dimensions) and `O(n * n * d)` in self-attention (d-dimensional dot products, or other similarity functions, computed between every pair of the n positions; see the sketch below); n is usually smaller than d.
+- Higher computational parallelism
+  - In an RNN the computation at the current time step depends on the result of the previous one, while in self-attention each position depends only on the inputs rather than on outputs of earlier time steps, so all positions can be computed fully in parallel.
+- Easier learning of long-range dependencies
+  - In an RNN, relating two positions that are n apart takes n steps, while in self-attention any two positions are directly connected; the shorter the path, the easier signals propagate.
+
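+The following minimal numpy sketch (ours, not part of this example; a single head with no linear projections or masking) illustrates the pairwise `O(n * n * d)` computation of scaled dot-product self-attention referenced above:
+
+```python
+import numpy as np
+
+def self_attention(x):  # x: [n, d]
+    d = x.shape[-1]
+    scores = np.dot(x, x.T) / np.sqrt(d)  # [n, n] pairwise similarity scores
+    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
+    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over positions
+    return np.dot(weights, x)  # [n, d] weighted combination of all positions
+
+out = self_attention(np.random.rand(5, 8))  # n = 5, d = 8
+assert out.shape == (5, 8)
+```
+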
+These advantages have also been confirmed in machine translation: Transformer achieved new state-of-the-art BLEU scores on the WMT'14 English-German translation task while greatly reducing training time. It also performs well when applied to constituency parsing, indicating high generality and easy transfer to other scenarios, all of which suggests broad prospects for Transformer.
+
+### Model overview
+
+Like typical Seq2Seq models, Transformer uses an encoder-decoder framework; the overall network structure is shown in Figure 1.
+
+
+<p align="center">
+<img src="images/transformer_network.png" /><br />
+Figure 1. Transformer network architecture
+</p>
+
+The encoder is a stack of identical layers, each of which mainly consists of two sub-layers: multi-head attention and a fully connected feed-forward network.
+- Multi-head attention is used here to implement self-attention. Compared with a simple attention mechanism, it applies multiple linear projections to the inputs, computes attention separately on each projection, concatenates all the results, and applies a final linear transformation to produce the output. See Figure 2; the attention used is dot-product attention, scaled after the dot product so that large values do not push the softmax into its saturated region.
+- The feed-forward network performs the same computation at every position (position-wise): two linear transformations with a ReLU activation in between.
+
+In addition, each sub-layer is followed by a residual connection [2] and layer normalization [3] to aid gradient propagation and model convergence.
+
+<p align="center">
+<img src="images/multi_head_attention.png" /><br />
+Figure 2. Multi-head attention
+</p>
+
+
+The decoder has a structure similar to the encoder's, except that each decoder layer contains one more multi-head attention sub-layer attending over the encoder output. This encoder-decoder attention also exists in other Seq2Seq models.
+
+
+### Data preparation
+
+The WMT datasets are the recognized mainstream datasets in machine translation. The [WMT'16 EN-DE dataset](http://www.statmt.org/wmt16/translation-task.html) is a medium-sized one among them and is used in the Transformer paper, so it serves as the example here; run the `gen_data.sh` script directly to download and preprocess it. Preprocessing mainly consists of tokenization and byte-pair encoding (BPE); BPE-encoded data alleviates the out-of-vocabulary (OOV) problem well [4] and is also used in the Transformer paper. After the script succeeds, a `gen_data` directory is generated with the following structure (configurable in `gen_data.sh`):
+
+```text
+.
+├── wmt16_ende_data      # WMT16 English-German translation data
+├── wmt16_ende_data_bpe  # BPE-encoded WMT16 English-German data
+├── mosesdecoder         # Moses MT toolkit, including tokenization and BLEU evaluation scripts
+└── subword-nmt          # BPE encoding code
+```
+
+`gen_data/wmt16_ende_data_bpe` contains the English-German data we finally use: `train.tok.clean.bpe.32000.en-de` is the training data, files such as `newstest2016.tok.bpe.32000.en-de` are the validation and test data, and `vocab_all.bpe.32000` is the corresponding vocabulary file (with the three special tokens `<s>`, `<e>` and `<unk>` added; the source and target languages share this vocabulary). We also provide a preprocessed copy of the WMT'16 EN-DE data for [download](https://transformer-res.bj.bcebos.com/wmt16_ende_data_bpe_clean.tar.gz) (containing the BPE data and vocabulary needed for training, plus the BPE and tokenized data needed for inference and evaluation).
+
+Other custom data only needs to be converted to the same format as `train.tok.clean.bpe.32000.en-de` (source and target sentences separated by `\t`, tokens within a sentence separated by spaces; a minimal parsing sketch is shown below). To use BPE encoding, you can also start from data in the WMT'16 EN-DE raw format and process it following `gen_data.sh`.
+
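+The sketch (ours, with a hypothetical input line; not part of this example) shows how one line in this format splits into source and target token lists:
+
+```python
+line = "ein Haus\ta house"  # hypothetical tab-separated training line
+src, trg = line.split("\t")  # source and target separated by \t
+src_tokens = src.split(" ")  # tokens separated by spaces
+trg_tokens = trg.split(" ")
+print(src_tokens, trg_tokens)  # ['ein', 'Haus'] ['a', 'house']
+```
+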
+### Model training
+
+`train.py` is the training script. Taking the English-German translation data as an example, the following command trains the model:
+```sh
+python -u train.py \
+ --src_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
+ --trg_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
+    --special_token '<s>' '<e>' '<unk>' \
+ --train_file_pattern gen_data/wmt16_ende_data_bpe/train.tok.clean.bpe.32000.en-de \
+ --token_delimiter ' ' \
+ --use_token_batch True \
+ --batch_size 4096 \
+ --sort_type pool \
+ --pool_size 200000
+```
+The command above sets data-related arguments such as the source vocabulary path (`src_vocab_fpath`), the target vocabulary path (`trg_vocab_fpath`) and the training data files (`train_file_pattern`, wildcards supported), as well as reader-related arguments such as the batching scheme (`use_token_batch` selects whether batches are formed by token count or by sequence count). More detailed information on these arguments can be viewed by running:
+```sh
+python train.py --help
+```
+
+More training-related parameters are defined in `ModelHyperParams` and `TrainTaskConfig` in `config.py`: `ModelHyperParams` defines model hyper-parameters such as the embedding dimension, and `TrainTaskConfig` defines training parameters such as the number of warmup steps. By default these follow the base-model configuration in the Transformer paper and can be modified in that script. They can also be set on the command line of the training script; the passed configuration is merged in and overrides the one in `config.py`. For example, the big model from the Transformer paper can be trained with the following command (if GPU memory is insufficient, reduce the batch size appropriately, set `max_length 200` to filter out overly long sentences, or adjust the memory-related environment variables):
+
+```sh
+# fraction of GPU memory to use; increase it when memory is insufficient (1.0 at most)
+export FLAGS_fraction_of_gpu_memory_to_use=1.0
+# threshold (in GB) for eagerly freeing tensors; decrease it when memory is insufficient (0 at least; a negative value disables it)
+export FLAGS_eager_delete_tensor_gb=0.8
+python -u train.py \
+ --src_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
+ --trg_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
+    --special_token '<s>' '<e>' '<unk>' \
+ --train_file_pattern gen_data/wmt16_ende_data_bpe/train.tok.clean.bpe.32000.en-de \
+ --token_delimiter ' ' \
+ --use_token_batch True \
+ --batch_size 3200 \
+ --sort_type pool \
+ --pool_size 200000 \
+ n_head 16 \
+ d_model 1024 \
+ d_inner_hid 4096 \
+ prepostprocess_dropout 0.3
+```
+For more detailed information on these parameters, see the comments in `config.py`.
+
+All available GPUs are used for training by default; the `CUDA_VISIBLE_DEVICES` environment variable controls which GPUs are used (see the example after the log excerpt below). Training on CPU only is also supported (set `--device CPU`), though it is comparatively slow. During training, the model is saved to the directory given by `model_dir` every `save_freq` iterations (default 10000), a checkpoint is saved to the directory given by `ckpt_dir` at the end of each epoch, and every `--fetch_steps` iterations (default 100) a log like the following is printed to standard output:
+```txt
+[2018-10-26 00:49:24,705 INFO train.py:536] step_idx: 0, epoch: 0, batch: 0, avg loss: 10.999878, normalized loss: 9.624138, ppl: 59866.832031
+[2018-10-26 00:50:08,717 INFO train.py:545] step_idx: 100, epoch: 0, batch: 100, avg loss: 9.454134, normalized loss: 8.078394, ppl: 12760.809570, speed: 2.27 step/s
+[2018-10-26 00:50:52,655 INFO train.py:545] step_idx: 200, epoch: 0, batch: 200, avg loss: 8.643907, normalized loss: 7.268166, ppl: 5675.458496, speed: 2.28 step/s
+[2018-10-26 00:51:36,529 INFO train.py:545] step_idx: 300, epoch: 0, batch: 300, avg loss: 7.916654, normalized loss: 6.540914, ppl: 2742.579346, speed: 2.28 step/s
+[2018-10-26 00:52:20,692 INFO train.py:545] step_idx: 400, epoch: 0, batch: 400, avg loss: 7.902879, normalized loss: 6.527138, ppl: 2705.058350, speed: 2.26 step/s
+[2018-10-26 00:53:04,537 INFO train.py:545] step_idx: 500, epoch: 0, batch: 500, avg loss: 7.818271, normalized loss: 6.442531, ppl: 2485.604492, speed: 2.28 step/s
+[2018-10-26 00:53:48,580 INFO train.py:545] step_idx: 600, epoch: 0, batch: 600, avg loss: 7.554341, normalized loss: 6.178601, ppl: 1909.012451, speed: 2.27 step/s
+[2018-10-26 00:54:32,878 INFO train.py:545] step_idx: 700, epoch: 0, batch: 700, avg loss: 7.177765, normalized loss: 5.802025, ppl: 1309.977661, speed: 2.26 step/s
+[2018-10-26 00:55:17,108 INFO train.py:545] step_idx: 800, epoch: 0, batch: 800, avg loss: 7.005494, normalized loss: 5.629754, ppl: 1102.674805, speed: 2.26 step/s
+```
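+
+For example, the following (illustrative) setting restricts training to the first two GPUs before running the `train.py` command above:
+```sh
+export CUDA_VISIBLE_DEVICES=0,1
+```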
+
+### Model inference
+
+`infer.py` is the inference script. Taking the English-German translation data as an example, after training completes the following command translates the text in the specified file:
+```sh
+python -u infer.py \
+ --src_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
+ --trg_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
+    --special_token '<s>' '<e>' '<unk>' \
+ --test_file_pattern gen_data/wmt16_ende_data_bpe/newstest2016.tok.bpe.32000.en-de \
+ --token_delimiter ' ' \
+ --batch_size 32 \
+ model_path trained_models/iter_100000.infer.model \
+ beam_size 5 \
+ max_out_len 255
+```
+Similar to training, inference also needs the data- and reader-related arguments, whose descriptions can be viewed with `python infer.py --help` (some arguments differ slightly in meaning from training). Model hyper-parameters can likewise be set on the inference command line, but they must match the settings used in training; if the big-model settings were used for training, the corresponding inference command is similar to:
+```sh
+python -u infer.py \
+ --src_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
+ --trg_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
+    --special_token '<s>' '<e>' '<unk>' \
+ --test_file_pattern gen_data/wmt16_ende_data_bpe/newstest2016.tok.bpe.32000.en-de \
+ --token_delimiter ' ' \
+ --batch_size 32 \
+ model_path trained_models/iter_100000.infer.model \
+ n_head 16 \
+ d_model 1024 \
+ d_inner_hid 4096 \
+ prepostprocess_dropout 0.3 \
+ beam_size 5 \
+ max_out_len 255
+```
+In addition, inference takes some extra parameters compared to training: `model_path` must be set to the directory holding the model, and `beam_size` and `max_out_len` can be set to specify the search width and maximum depth (translation length) of the beam search. These parameters can also be looked up and changed in `InferTaskConfig` in `config.py`, where they are documented in comments.
+
+The inference command prints the translations to standard output, one highest-scoring translation per input line. For the BPE-encoded English-German data, the predicted translations are also BPE-encoded and must be restored to the original (here, tokenized) form for correct evaluation. The following command restores the translations in `predict.txt` into `predict.tok.txt` (no further tokenization is needed):
+```sh
+sed -r 's/(@@ )|(@@ ?$)//g' predict.txt > predict.tok.txt
+```
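+
+As an illustration (with a hypothetical input), the command joins BPE subwords by deleting the `@@ ` separators:
+```sh
+echo "ein wirk@@ sames Mittel" | sed -r 's/(@@ )|(@@ ?$)//g'
+# prints: ein wirksames Mittel
+```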
+
+The translations can then be evaluated against the reference translations with BLEU. The evaluation uses scripts from mosesdecoder, which can be obtained with:
+```sh
+git clone https://github.com/moses-smt/mosesdecoder.git
+```
+Taking the English-German `newstest2014.tok.de` data as an example, after obtaining mosesdecoder run `multi-bleu.perl` as follows to evaluate the translations:
+```sh
+perl gen_data/mosesdecoder/scripts/generic/multi-bleu.perl gen_data/wmt16_ende_data/newstest2014.tok.de < predict.tok.txt
+```
+The output looks similar to:
+```
+BLEU = 26.35, 57.7/32.1/20.0/13.0 (BP=1.000, ratio=1.013, hyp_len=63903, ref_len=63078)
+```
+Currently, without model averaging, the test BLEU scores of the English-German base and big models trained for 100K iterations on 8 GPUs are as follows:
+
+| Test set | newstest2014 | newstest2015 | newstest2016 |
+|-|-|-|-|
+| Base | 26.35 | 29.07 | 33.30 |
+| Big | 27.07 | 30.09 | 34.38 |
+
+We also provide the above [base model](https://transformer-res.bj.bcebos.com/base_model.tar.gz) and [big model](https://transformer-res.bj.bcebos.com/big_model.tar.gz) for download.
+
+### Distributed training
+
+The Transformer model supports both synchronous and asynchronous distributed training. The distributed setup consists of two parts:
+
+1 Command-line configuration
+
+ - `--local`: takes two values; `True` means single-machine training and `False` means distributed training. Defaults to single-machine mode.
+
+ - `--sync`: takes two values, but only takes effect when `--local` is `False`; `True` selects synchronous and `False` asynchronous training. Defaults to synchronous mode.
+
+2 Environment-variable configuration
+
+ In distributed mode, the numbers of trainers and pservers are configured manually. In the network topology, every trainer connects to every pserver, with pservers acting as servers and trainers as clients. The settings for pservers and trainers are described below:
+
+1) pserver configuration
+
+- `PADDLE_IS_LOCAL=[0|1]` whether training is distributed: `0` means distributed, `1` means single-machine
+
+- `TRAINING_ROLE=PSERVER` marks the current node as a pserver
+
+- `POD_IP=ip` the address on which the current pserver serves
+
+- `PADDLE_PORT=port` the port on which the current pserver listens; together with `POD_IP` it forms the pserver's unique external identity
+
+- `PADDLE_TRAINERS_NUM=num` the number of trainers connecting to the pserver
+
+An example configuration with two pservers; on 192.168.2.2:
+```
+export PADDLE_PSERVERS=192.168.2.2,192.168.2.3
+export POD_IP=192.168.2.2
+export PADDLE_TRAINERS_NUM=2
+export TRAINING_ROLE=PSERVER
+export PADDLE_IS_LOCAL=0
+export PADDLE_PORT=6177
+```
+and on 192.168.2.3:
+```
+export PADDLE_PSERVERS=192.168.2.2,192.168.2.3
+export POD_IP=192.168.2.3
+export PADDLE_TRAINERS_NUM=2
+export TRAINING_ROLE=PSERVER
+export PADDLE_IS_LOCAL=0
+export PADDLE_PORT=6177
+```
+2) trainer configuration
+
+- `PADDLE_IS_LOCAL=[0|1]` whether training is distributed: `0` means distributed, `1` means single-machine
+
+- `TRAINING_ROLE=TRAINER` marks the current node as a trainer
+
+- `PADDLE_PSERVERS=[ip1,ip2,…]` the IP addresses of the pservers the trainers connect to, separated by `,`
+
+- `PADDLE_TRAINER_ID=num` the index of the current node, an integer in the range 0 to N-1
+
+- `PADDLE_PORT=port` the pserver port to send requests to
+
+An example configuration with two trainers; on trainer 1:
+```
+export TRAINING_ROLE=TRAINER
+export PADDLE_PSERVERS=192.168.2.2,192.168.2.3
+export PADDLE_TRAINERS_NUM=2
+export PADDLE_TRAINER_ID=0
+export PADDLE_IS_LOCAL=0
+export PADDLE_PORT=6177
+```
+and on trainer 2:
+```
+export TRAINING_ROLE=TRAINER
+export PADDLE_PSERVERS=192.168.2.2,192.168.2.3
+export PADDLE_TRAINERS_NUM=2
+export PADDLE_TRAINER_ID=1
+export PADDLE_IS_LOCAL=0
+export PADDLE_PORT=6177
+```
+
+### References
+1. Vaswani A, Shazeer N, Parmar N, et al. [Attention is all you need](http://papers.nips.cc/paper/7181-attention-is-all-you-need.pdf)[C]//Advances in Neural Information Processing Systems. 2017: 6000-6010.
+2. He K, Zhang X, Ren S, et al. [Deep residual learning for image recognition](http://openaccess.thecvf.com/content_cvpr_2016/papers/He_Deep_Residual_Learning_CVPR_2016_paper.pdf)[C]//Proceedings of the IEEE conference on computer vision and pattern recognition. 2016: 770-778.
+3. Ba J L, Kiros J R, Hinton G E. [Layer normalization](https://arxiv.org/pdf/1607.06450.pdf)[J]. arXiv preprint arXiv:1607.06450, 2016.
+4. Sennrich R, Haddow B, Birch A. [Neural machine translation of rare words with subword units](https://arxiv.org/pdf/1508.07909)[J]. arXiv preprint arXiv:1508.07909, 2015.
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/_ce.py b/PaddleNLP/neural_machine_translation/transformer/_ce.py
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/transformer/_ce.py
rename to PaddleNLP/neural_machine_translation/transformer/_ce.py
diff --git a/PaddleNLP/neural_machine_translation/transformer/config.py b/PaddleNLP/neural_machine_translation/transformer/config.py
new file mode 100644
index 0000000000000000000000000000000000000000..823341ed9084e80b5fe74655bf8db897d72175f0
--- /dev/null
+++ b/PaddleNLP/neural_machine_translation/transformer/config.py
@@ -0,0 +1,201 @@
+class TrainTaskConfig(object):
+ # support both CPU and GPU now.
+ use_gpu = True
+ # the epoch number to train.
+ pass_num = 30
+ # the number of sequences contained in a mini-batch.
+ # deprecated, set batch_size in args.
+ batch_size = 32
+ # the hyper parameters for Adam optimizer.
+    # This static learning_rate will be multiplied by the LearningRateScheduler
+    # derived learning rate to get the final learning rate.
+ learning_rate = 2.0
+ beta1 = 0.9
+ beta2 = 0.997
+ eps = 1e-9
+ # the parameters for learning rate scheduling.
+ warmup_steps = 8000
+ # the weight used to mix up the ground-truth distribution and the fixed
+ # uniform distribution in label smoothing when training.
+    # Set this to zero if label smoothing is not wanted.
+ label_smooth_eps = 0.1
+ # the directory for saving trained models.
+ model_dir = "trained_models"
+ # the directory for saving checkpoints.
+ ckpt_dir = "trained_ckpts"
+ # the directory for loading checkpoint.
+ # If provided, continue training from the checkpoint.
+ ckpt_path = None
+ # the parameter to initialize the learning rate scheduler.
+    # It should be provided when using checkpoints, since the checkpoint
+    # doesn't include the training step counter currently.
+ start_step = 0
+ # the frequency to save trained models.
+ save_freq = 10000
+
+
+class InferTaskConfig(object):
+ use_gpu = True
+ # the number of examples in one run for sequence generation.
+ batch_size = 10
+ # the parameters for beam search.
+ beam_size = 5
+ max_out_len = 256
+ # the number of decoded sentences to output.
+ n_best = 1
+ # the flags indicating whether to output the special tokens.
+ output_bos = False
+ output_eos = False
+ output_unk = True
+ # the directory for loading the trained model.
+ model_path = "trained_models/pass_1.infer.model"
+
+
+class ModelHyperParams(object):
+    # The following five vocabulary-related configurations will be set
+    # automatically according to the passed vocabulary path and special tokens.
+ # size of source word dictionary.
+ src_vocab_size = 10000
+    # size of target word dictionary.
+ trg_vocab_size = 10000
+    # index for <s> token
+    bos_idx = 0
+    # index for <e> token
+    eos_idx = 1
+    # index for <unk> token
+    unk_idx = 2
+ # max length of sequences deciding the size of position encoding table.
+ max_length = 256
+ # the dimension for word embeddings, which is also the last dimension of
+ # the input and output of multi-head attention, position-wise feed-forward
+ # networks, encoder and decoder.
+ d_model = 512
+ # size of the hidden layer in position-wise feed-forward networks.
+ d_inner_hid = 2048
+ # the dimension that keys are projected to for dot-product attention.
+ d_key = 64
+ # the dimension that values are projected to for dot-product attention.
+ d_value = 64
+    # number of heads used in multi-head attention.
+ n_head = 8
+ # number of sub-layers to be stacked in the encoder and decoder.
+ n_layer = 6
+ # dropout rates of different modules.
+ prepostprocess_dropout = 0.1
+ attention_dropout = 0.1
+ relu_dropout = 0.1
+ # to process before each sub-layer
+ preprocess_cmd = "n" # layer normalization
+ # to process after each sub-layer
+ postprocess_cmd = "da" # dropout + residual connection
+ # random seed used in dropout for CE.
+ dropout_seed = None
+ # the flag indicating whether to share embedding and softmax weights.
+    # vocabularies in source and target should be the same for weight sharing.
+ weight_sharing = True
+
+
+def merge_cfg_from_list(cfg_list, g_cfgs):
+ """
+ Set the above global configurations using the cfg_list.
+ """
+ assert len(cfg_list) % 2 == 0
+ for key, value in zip(cfg_list[0::2], cfg_list[1::2]):
+ for g_cfg in g_cfgs:
+ if hasattr(g_cfg, key):
+ try:
+ value = eval(value)
+ except Exception: # for file path
+ pass
+ setattr(g_cfg, key, value)
+ break
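+
+# Illustrative usage (comment added for exposition; not in the original file):
+#   merge_cfg_from_list(["n_head", "16", "batch_size", "3200"],
+#                       [TrainTaskConfig, ModelHyperParams])
+# sets ModelHyperParams.n_head = 16 and TrainTaskConfig.batch_size = 3200;
+# eval() turns numeric strings into Python numbers, while values that fail
+# eval() (e.g. file paths) are kept as strings.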
+
+
+# The placeholder for batch_size in compile time. Must be -1 currently to be
+# consistent with some ops' infer-shape output in compile time, such as the
+# sequence_expand op used in beamsearch decoder.
+batch_size = -1
+# The placeholder for sequence length in compile time.
+seq_len = ModelHyperParams.max_length
+# Here list the data shapes and data types of all inputs.
+# The shapes here act as placeholder and are set to pass the infer-shape in
+# compile time.
+input_descs = {
+ # The actual data shape of src_word is:
+ # [batch_size, max_src_len_in_batch, 1]
+ "src_word": [(batch_size, seq_len, 1), "int64", 2],
+ # The actual data shape of src_pos is:
+ # [batch_size, max_src_len_in_batch, 1]
+ "src_pos": [(batch_size, seq_len, 1), "int64"],
+ # This input is used to remove attention weights on paddings in the
+ # encoder.
+ # The actual data shape of src_slf_attn_bias is:
+ # [batch_size, n_head, max_src_len_in_batch, max_src_len_in_batch]
+ "src_slf_attn_bias": [(batch_size, ModelHyperParams.n_head, seq_len,
+ seq_len), "float32"],
+ # The actual data shape of trg_word is:
+ # [batch_size, max_trg_len_in_batch, 1]
+ "trg_word": [(batch_size, seq_len, 1), "int64",
+ 2], # lod_level is only used in fast decoder.
+ # The actual data shape of trg_pos is:
+ # [batch_size, max_trg_len_in_batch, 1]
+ "trg_pos": [(batch_size, seq_len, 1), "int64"],
+ # This input is used to remove attention weights on paddings and
+ # subsequent words in the decoder.
+ # The actual data shape of trg_slf_attn_bias is:
+ # [batch_size, n_head, max_trg_len_in_batch, max_trg_len_in_batch]
+ "trg_slf_attn_bias": [(batch_size, ModelHyperParams.n_head, seq_len,
+ seq_len), "float32"],
+ # This input is used to remove attention weights on paddings of the source
+ # input in the encoder-decoder attention.
+ # The actual data shape of trg_src_attn_bias is:
+ # [batch_size, n_head, max_trg_len_in_batch, max_src_len_in_batch]
+ "trg_src_attn_bias": [(batch_size, ModelHyperParams.n_head, seq_len,
+ seq_len), "float32"],
+ # This input is used in independent decoder program for inference.
+ # The actual data shape of enc_output is:
+ # [batch_size, max_src_len_in_batch, d_model]
+ "enc_output": [(batch_size, seq_len, ModelHyperParams.d_model), "float32"],
+ # The actual data shape of label_word is:
+ # [batch_size * max_trg_len_in_batch, 1]
+ "lbl_word": [(batch_size * seq_len, 1), "int64"],
+    # This input is used to mask out the loss of padding tokens.
+ # The actual data shape of label_weight is:
+ # [batch_size * max_trg_len_in_batch, 1]
+ "lbl_weight": [(batch_size * seq_len, 1), "float32"],
+ # This input is used in beam-search decoder.
+ "init_score": [(batch_size, 1), "float32", 2],
+ # This input is used in beam-search decoder for the first gather
+    # (updating cell states)
+ "init_idx": [(batch_size, ), "int32"],
+}
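+# Note: each input_descs entry maps an input name to [shape, dtype] plus an
+# optional lod_level; make_all_inputs/make_all_py_reader_inputs in model.py
+# read these descriptors to declare the corresponding data layers.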
+
+# Names of word embedding table which might be reused for weight sharing.
+word_emb_param_names = (
+ "src_word_emb_table",
+ "trg_word_emb_table", )
+# Names of position encoding table which will be initialized externally.
+pos_enc_param_names = (
+ "src_pos_enc_table",
+ "trg_pos_enc_table", )
+# separated inputs for different usages.
+encoder_data_input_fields = (
+ "src_word",
+ "src_pos",
+ "src_slf_attn_bias", )
+decoder_data_input_fields = (
+ "trg_word",
+ "trg_pos",
+ "trg_slf_attn_bias",
+ "trg_src_attn_bias",
+ "enc_output", )
+label_data_input_fields = (
+ "lbl_word",
+ "lbl_weight", )
+# In fast decoder, trg_pos (only containing the current time step) is generated
+# by ops and trg_slf_attn_bias is not needed.
+fast_decoder_data_input_fields = (
+ "trg_word",
+ "init_score",
+ "init_idx",
+ "trg_src_attn_bias", )
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/gen_data.sh b/PaddleNLP/neural_machine_translation/transformer/gen_data.sh
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/transformer/gen_data.sh
rename to PaddleNLP/neural_machine_translation/transformer/gen_data.sh
diff --git a/PaddleNLP/neural_machine_translation/transformer/images/attention_formula.png b/PaddleNLP/neural_machine_translation/transformer/images/attention_formula.png
new file mode 100644
index 0000000000000000000000000000000000000000..249857f524b4137bafc2d4d1b779ed62d1437b6d
Binary files /dev/null and b/PaddleNLP/neural_machine_translation/transformer/images/attention_formula.png differ
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/images/multi_head_attention.png b/PaddleNLP/neural_machine_translation/transformer/images/multi_head_attention.png
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/transformer/images/multi_head_attention.png
rename to PaddleNLP/neural_machine_translation/transformer/images/multi_head_attention.png
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/images/transformer_network.png b/PaddleNLP/neural_machine_translation/transformer/images/transformer_network.png
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/transformer/images/transformer_network.png
rename to PaddleNLP/neural_machine_translation/transformer/images/transformer_network.png
diff --git a/PaddleNLP/neural_machine_translation/transformer/infer.py b/PaddleNLP/neural_machine_translation/transformer/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..96b8e0a14fcc225e11e63a7604c8bf8e6db8e45e
--- /dev/null
+++ b/PaddleNLP/neural_machine_translation/transformer/infer.py
@@ -0,0 +1,324 @@
+import argparse
+import ast
+import multiprocessing
+import numpy as np
+import os
+from functools import partial
+
+import paddle
+import paddle.fluid as fluid
+
+import model
+import reader
+from config import *
+from model import wrap_encoder as encoder
+from model import wrap_decoder as decoder
+from model import fast_decode as fast_decoder
+from train import pad_batch_data, prepare_data_generator
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("Training for Transformer.")
+ parser.add_argument(
+ "--src_vocab_fpath",
+ type=str,
+ required=True,
+ help="The path of vocabulary file of source language.")
+ parser.add_argument(
+ "--trg_vocab_fpath",
+ type=str,
+ required=True,
+ help="The path of vocabulary file of target language.")
+ parser.add_argument(
+ "--test_file_pattern",
+ type=str,
+ required=True,
+ help="The pattern to match test data files.")
+ parser.add_argument(
+ "--batch_size",
+ type=int,
+ default=50,
+ help="The number of examples in one run for sequence generation.")
+ parser.add_argument(
+ "--pool_size",
+ type=int,
+ default=10000,
+ help="The buffer size to pool data.")
+ parser.add_argument(
+ "--special_token",
+ type=str,
+        default=["<s>", "<e>", "<unk>"],
+ nargs=3,
+ help="The , and tokens in the dictionary.")
+ parser.add_argument(
+ "--token_delimiter",
+ type=lambda x: str(x.encode().decode("unicode-escape")),
+ default=" ",
+ help="The delimiter used to split tokens in source or target sentences. "
+ "For EN-DE BPE data we provided, use spaces as token delimiter. ")
+ parser.add_argument(
+ "--use_mem_opt",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to use memory optimization.")
+ parser.add_argument(
+ "--use_py_reader",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to use py_reader.")
+ parser.add_argument(
+ "--use_parallel_exe",
+ type=ast.literal_eval,
+ default=False,
+ help="The flag indicating whether to use ParallelExecutor.")
+ parser.add_argument(
+ 'opts',
+ help='See config.py for all options',
+ default=None,
+ nargs=argparse.REMAINDER)
+ args = parser.parse_args()
+ # Append args related to dict
+ src_dict = reader.DataReader.load_dict(args.src_vocab_fpath)
+ trg_dict = reader.DataReader.load_dict(args.trg_vocab_fpath)
+ dict_args = [
+ "src_vocab_size", str(len(src_dict)), "trg_vocab_size",
+ str(len(trg_dict)), "bos_idx", str(src_dict[args.special_token[0]]),
+ "eos_idx", str(src_dict[args.special_token[1]]), "unk_idx",
+ str(src_dict[args.special_token[2]])
+ ]
+ merge_cfg_from_list(args.opts + dict_args,
+ [InferTaskConfig, ModelHyperParams])
+ return args
+
+
+def post_process_seq(seq,
+ bos_idx=ModelHyperParams.bos_idx,
+ eos_idx=ModelHyperParams.eos_idx,
+ output_bos=InferTaskConfig.output_bos,
+ output_eos=InferTaskConfig.output_eos):
+ """
+    Post-process the beam-search decoded sequence. Truncate from the first
+    <eos> and remove the <bos> and <eos> tokens currently.
+ """
+ eos_pos = len(seq) - 1
+ for i, idx in enumerate(seq):
+ if idx == eos_idx:
+ eos_pos = i
+ break
+ seq = [
+ idx for idx in seq[:eos_pos + 1]
+ if (output_bos or idx != bos_idx) and (output_eos or idx != eos_idx)
+ ]
+ return seq
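+
+
+# Illustrative example (not in the original file): with bos_idx=0, eos_idx=1
+# and the default output flags, post_process_seq([0, 5, 7, 1, 9]) truncates at
+# the first <eos> and strips the special tokens, returning [5, 7].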
+
+
+def prepare_batch_input(insts, data_input_names, src_pad_idx, bos_idx, n_head,
+ d_model, place):
+ """
+ Put all padded data needed by beam search decoder into a dict.
+ """
+ src_word, src_pos, src_slf_attn_bias, src_max_len = pad_batch_data(
+ [inst[0] for inst in insts], src_pad_idx, n_head, is_target=False)
+ # start tokens
+ trg_word = np.asarray([[bos_idx]] * len(insts), dtype="int64")
+ trg_src_attn_bias = np.tile(src_slf_attn_bias[:, :, ::src_max_len, :],
+ [1, 1, 1, 1]).astype("float32")
+ trg_word = trg_word.reshape(-1, 1, 1)
+ src_word = src_word.reshape(-1, src_max_len, 1)
+ src_pos = src_pos.reshape(-1, src_max_len, 1)
+
+ def to_lodtensor(data, place, lod=None):
+ data_tensor = fluid.LoDTensor()
+ data_tensor.set(data, place)
+ if lod is not None:
+ data_tensor.set_lod(lod)
+ return data_tensor
+
+ # beamsearch_op must use tensors with lod
+ init_score = to_lodtensor(
+ np.zeros_like(
+ trg_word, dtype="float32").reshape(-1, 1),
+ place, [range(trg_word.shape[0] + 1)] * 2)
+ trg_word = to_lodtensor(trg_word, place, [range(trg_word.shape[0] + 1)] * 2)
+ init_idx = np.asarray(range(len(insts)), dtype="int32")
+
+ data_input_dict = dict(
+ zip(data_input_names, [
+ src_word, src_pos, src_slf_attn_bias, trg_word, init_score,
+ init_idx, trg_src_attn_bias
+ ]))
+ return data_input_dict
+
+
+def prepare_feed_dict_list(data_generator, count, place):
+ """
+ Prepare the list of feed dict for multi-devices.
+ """
+ feed_dict_list = []
+ if data_generator is not None: # use_py_reader == False
+ data_input_names = encoder_data_input_fields + fast_decoder_data_input_fields
+ data = next(data_generator)
+ for idx, data_buffer in enumerate(data):
+ data_input_dict = prepare_batch_input(
+ data_buffer, data_input_names, ModelHyperParams.eos_idx,
+ ModelHyperParams.bos_idx, ModelHyperParams.n_head,
+ ModelHyperParams.d_model, place)
+ feed_dict_list.append(data_input_dict)
+ return feed_dict_list if len(feed_dict_list) == count else None
+
+
+def py_reader_provider_wrapper(data_reader, place):
+ """
+ Data provider needed by fluid.layers.py_reader.
+ """
+
+ def py_reader_provider():
+ data_input_names = encoder_data_input_fields + fast_decoder_data_input_fields
+ for batch_id, data in enumerate(data_reader()):
+ data_input_dict = prepare_batch_input(
+ data, data_input_names, ModelHyperParams.eos_idx,
+ ModelHyperParams.bos_idx, ModelHyperParams.n_head,
+ ModelHyperParams.d_model, place)
+ yield [data_input_dict[item] for item in data_input_names]
+
+ return py_reader_provider
+
+
+def fast_infer(args):
+ """
+ Inference by beam search decoder based solely on Fluid operators.
+ """
+ out_ids, out_scores, pyreader = fast_decoder(
+ ModelHyperParams.src_vocab_size,
+ ModelHyperParams.trg_vocab_size,
+ ModelHyperParams.max_length + 1,
+ ModelHyperParams.n_layer,
+ ModelHyperParams.n_head,
+ ModelHyperParams.d_key,
+ ModelHyperParams.d_value,
+ ModelHyperParams.d_model,
+ ModelHyperParams.d_inner_hid,
+ ModelHyperParams.prepostprocess_dropout,
+ ModelHyperParams.attention_dropout,
+ ModelHyperParams.relu_dropout,
+ ModelHyperParams.preprocess_cmd,
+ ModelHyperParams.postprocess_cmd,
+ ModelHyperParams.weight_sharing,
+ InferTaskConfig.beam_size,
+ InferTaskConfig.max_out_len,
+ ModelHyperParams.eos_idx,
+ use_py_reader=args.use_py_reader)
+
+ # This is used here to set dropout to the test mode.
+ infer_program = fluid.default_main_program().clone(for_test=True)
+
+ if args.use_mem_opt:
+ fluid.memory_optimize(infer_program)
+
+ if InferTaskConfig.use_gpu:
+ place = fluid.CUDAPlace(0)
+ dev_count = fluid.core.get_cuda_device_count()
+ else:
+ place = fluid.CPUPlace()
+ dev_count = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ fluid.io.load_vars(
+ exe,
+ InferTaskConfig.model_path,
+ vars=[
+ var for var in infer_program.list_vars()
+ if isinstance(var, fluid.framework.Parameter)
+ ])
+
+ exec_strategy = fluid.ExecutionStrategy()
+ # For faster executor
+ exec_strategy.use_experimental_executor = True
+ exec_strategy.num_threads = 1
+ build_strategy = fluid.BuildStrategy()
+ infer_exe = fluid.ParallelExecutor(
+ use_cuda=TrainTaskConfig.use_gpu,
+ main_program=infer_program,
+ build_strategy=build_strategy,
+ exec_strategy=exec_strategy)
+
+ # data reader settings for inference
+ args.train_file_pattern = args.test_file_pattern
+ args.use_token_batch = False
+ args.sort_type = reader.SortType.NONE
+ args.shuffle = False
+ args.shuffle_batch = False
+ test_data = prepare_data_generator(
+ args,
+ is_test=False,
+ count=dev_count,
+ pyreader=pyreader,
+ py_reader_provider_wrapper=py_reader_provider_wrapper,
+ place=place)
+ if args.use_py_reader:
+ pyreader.start()
+ data_generator = None
+ else:
+ data_generator = test_data()
+ trg_idx2word = reader.DataReader.load_dict(
+ dict_path=args.trg_vocab_fpath, reverse=True)
+
+ while True:
+ try:
+ feed_dict_list = prepare_feed_dict_list(data_generator, dev_count,
+ place)
+ if args.use_parallel_exe:
+ seq_ids, seq_scores = infer_exe.run(
+ fetch_list=[out_ids.name, out_scores.name],
+ feed=feed_dict_list,
+ return_numpy=False)
+ else:
+ seq_ids, seq_scores = exe.run(
+ program=infer_program,
+ fetch_list=[out_ids.name, out_scores.name],
+ feed=feed_dict_list[0]
+ if feed_dict_list is not None else None,
+ return_numpy=False,
+ use_program_cache=True)
+            # Wrap both outputs in lists when the executor returns single
+            # LoDTensors; parenthesize the pair so the conditional applies to
+            # both outputs, not just the scores.
+            seq_ids_list, seq_scores_list = (
+                [seq_ids], [seq_scores]) if isinstance(
+                    seq_ids, paddle.fluid.LoDTensor) else (seq_ids, seq_scores)
+ for seq_ids, seq_scores in zip(seq_ids_list, seq_scores_list):
+ # How to parse the results:
+ # Suppose the lod of seq_ids is:
+ # [[0, 3, 6], [0, 12, 24, 40, 54, 67, 82]]
+ # then from lod[0]:
+ # there are 2 source sentences, beam width is 3.
+ # from lod[1]:
+ # the first source sentence has 3 hyps; the lengths are 12, 12, 16
+ # the second source sentence has 3 hyps; the lengths are 14, 13, 15
+ hyps = [[] for i in range(len(seq_ids.lod()[0]) - 1)]
+ scores = [[] for i in range(len(seq_scores.lod()[0]) - 1)]
+ for i in range(len(seq_ids.lod()[0]) -
+ 1): # for each source sentence
+ start = seq_ids.lod()[0][i]
+ end = seq_ids.lod()[0][i + 1]
+ for j in range(end - start): # for each candidate
+ sub_start = seq_ids.lod()[1][start + j]
+ sub_end = seq_ids.lod()[1][start + j + 1]
+ hyps[i].append(" ".join([
+ trg_idx2word[idx]
+ for idx in post_process_seq(
+ np.array(seq_ids)[sub_start:sub_end])
+ ]))
+ scores[i].append(np.array(seq_scores)[sub_end - 1])
+ print(hyps[i][-1])
+ if len(hyps[i]) >= InferTaskConfig.n_best:
+ break
+ except (StopIteration, fluid.core.EOFException):
+ # The data pass is over.
+ if args.use_py_reader:
+ pyreader.reset()
+ break
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ fast_infer(args)
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/local_dist.sh b/PaddleNLP/neural_machine_translation/transformer/local_dist.sh
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/transformer/local_dist.sh
rename to PaddleNLP/neural_machine_translation/transformer/local_dist.sh
diff --git a/PaddleNLP/neural_machine_translation/transformer/model.py b/PaddleNLP/neural_machine_translation/transformer/model.py
new file mode 100644
index 0000000000000000000000000000000000000000..cfd85dc505216a0d226a8cb29012ae3e36cc26cc
--- /dev/null
+++ b/PaddleNLP/neural_machine_translation/transformer/model.py
@@ -0,0 +1,905 @@
+from functools import partial
+import numpy as np
+
+import paddle.fluid as fluid
+import paddle.fluid.layers as layers
+
+from config import *
+
+
+def wrap_layer_with_block(layer, block_idx):
+ """
+    Make a layer definition support specifying the block to add ops to, so
+    that we can add layers to blocks other than the current one. This makes
+    it easy to define caches shared across while-loop steps.
+ """
+
+ class BlockGuard(object):
+ """
+ BlockGuard class.
+
+ BlockGuard class is used to switch to the given block in a program by
+ using the Python `with` keyword.
+ """
+
+ def __init__(self, block_idx=None, main_program=None):
+ self.main_program = fluid.default_main_program(
+ ) if main_program is None else main_program
+ self.old_block_idx = self.main_program.current_block().idx
+ self.new_block_idx = block_idx
+
+ def __enter__(self):
+ self.main_program.current_block_idx = self.new_block_idx
+
+ def __exit__(self, exc_type, exc_val, exc_tb):
+ self.main_program.current_block_idx = self.old_block_idx
+ if exc_type is not None:
+ return False # re-raise exception
+ return True
+
+ def layer_wrapper(*args, **kwargs):
+ with BlockGuard(block_idx):
+ return layer(*args, **kwargs)
+
+ return layer_wrapper
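+
+
+# Illustrative usage (comment added; not in the original file): to make fc add
+# its ops/vars to the parent block of the current (e.g. while-loop) block:
+#   parent_fc = wrap_layer_with_block(
+#       layers.fc, fluid.default_main_program().current_block().parent_idx)
+#   out = parent_fc(input=x, size=8)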
+
+
+def position_encoding_init(n_position, d_pos_vec):
+ """
+ Generate the initial values for the sinusoid position encoding table.
+ """
+ channels = d_pos_vec
+ position = np.arange(n_position)
+ num_timescales = channels // 2
+ log_timescale_increment = (np.log(float(1e4) / float(1)) /
+ (num_timescales - 1))
+    # geometric sequence of inverse timescales; the exponent must include
+    # -log_timescale_increment, otherwise the table is not sinusoidal
+    inv_timescales = np.exp(np.arange(
+        num_timescales) * -log_timescale_increment)
+ scaled_time = np.expand_dims(position, 1) * np.expand_dims(inv_timescales,
+ 0)
+ signal = np.concatenate([np.sin(scaled_time), np.cos(scaled_time)], axis=1)
+ signal = np.pad(signal, [[0, 0], [0, np.mod(channels, 2)]], 'constant')
+ position_enc = signal
+ return position_enc.astype("float32")
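+
+
+# Note (added comment): position_encoding_init(8, 6) returns an 8x6 float32
+# array whose first 3 columns are sin terms and last 3 the matching cos terms,
+# following the sinusoid encoding of the Transformer paper.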
+
+
+def multi_head_attention(queries,
+ keys,
+ values,
+ attn_bias,
+ d_key,
+ d_value,
+ d_model,
+ n_head=1,
+ dropout_rate=0.,
+ cache=None,
+ gather_idx=None,
+ static_kv=False):
+ """
+    Multi-Head Attention. Note that attn_bias is added to the logits before
+    computing the softmax activation, to mask certain selected positions so
+    that they will not be considered in attention weights.
+ """
+ keys = queries if keys is None else keys
+ values = keys if values is None else values
+
+ if not (len(queries.shape) == len(keys.shape) == len(values.shape) == 3):
+ raise ValueError(
+ "Inputs: quries, keys and values should all be 3-D tensors.")
+
+ def __compute_qkv(queries, keys, values, n_head, d_key, d_value):
+ """
+ Add linear projection to queries, keys, and values.
+ """
+ q = layers.fc(input=queries,
+ size=d_key * n_head,
+ bias_attr=False,
+ num_flatten_dims=2)
+        # For encoder-decoder attention in inference, insert the ops and vars
+        # into the global block to use as caches across beam-search steps.
+ fc_layer = wrap_layer_with_block(
+ layers.fc, fluid.default_main_program().current_block()
+ .parent_idx) if cache is not None and static_kv else layers.fc
+ k = fc_layer(
+ input=keys,
+ size=d_key * n_head,
+ bias_attr=False,
+ num_flatten_dims=2)
+ v = fc_layer(
+ input=values,
+ size=d_value * n_head,
+ bias_attr=False,
+ num_flatten_dims=2)
+ return q, k, v
+
+ def __split_heads_qkv(queries, keys, values, n_head, d_key, d_value):
+ """
+ Reshape input tensors at the last dimension to split multi-heads
+ and then transpose. Specifically, transform the input tensor with shape
+ [bs, max_sequence_length, n_head * hidden_dim] to the output tensor
+ with shape [bs, n_head, max_sequence_length, hidden_dim].
+ """
+ # The value 0 in shape attr means copying the corresponding dimension
+ # size of the input as the output dimension size.
+ reshaped_q = layers.reshape(
+ x=queries, shape=[0, 0, n_head, d_key], inplace=True)
+        # permute the dimensions into:
+ # [batch_size, n_head, max_sequence_len, hidden_size_per_head]
+ q = layers.transpose(x=reshaped_q, perm=[0, 2, 1, 3])
+        # For encoder-decoder attention in inference, insert the ops and vars
+        # into the global block to use as caches across beam-search steps.
+ reshape_layer = wrap_layer_with_block(
+ layers.reshape,
+ fluid.default_main_program().current_block()
+ .parent_idx) if cache is not None and static_kv else layers.reshape
+ transpose_layer = wrap_layer_with_block(
+ layers.transpose,
+ fluid.default_main_program().current_block().
+ parent_idx) if cache is not None and static_kv else layers.transpose
+ reshaped_k = reshape_layer(
+ x=keys, shape=[0, 0, n_head, d_key], inplace=True)
+ k = transpose_layer(x=reshaped_k, perm=[0, 2, 1, 3])
+ reshaped_v = reshape_layer(
+ x=values, shape=[0, 0, n_head, d_value], inplace=True)
+ v = transpose_layer(x=reshaped_v, perm=[0, 2, 1, 3])
+
+ if cache is not None: # only for faster inference
+ if static_kv: # For encoder-decoder attention in inference
+ cache_k, cache_v = cache["static_k"], cache["static_v"]
+ # To init the static_k and static_v in cache.
+                # We could instead use condition_op (if_else) to do this at
+                # the first step of the while loop, but it might be less
+                # efficient.
+ static_cache_init = wrap_layer_with_block(
+ layers.assign,
+ fluid.default_main_program().current_block().parent_idx)
+ static_cache_init(k, cache_k)
+ static_cache_init(v, cache_v)
+ else: # For decoder self-attention in inference
+ cache_k, cache_v = cache["k"], cache["v"]
+ # gather cell states corresponding to selected parent
+ select_k = layers.gather(cache_k, index=gather_idx)
+ select_v = layers.gather(cache_v, index=gather_idx)
+ if not static_kv:
+ # For self attention in inference, use cache and concat time steps.
+ select_k = layers.concat([select_k, k], axis=2)
+ select_v = layers.concat([select_v, v], axis=2)
+ # update cell states(caches) cached in global block
+ layers.assign(select_k, cache_k)
+ layers.assign(select_v, cache_v)
+ return q, select_k, select_v
+ return q, k, v
+
+ def __combine_heads(x):
+ """
+        Transpose and then reshape the last two dimensions of input tensor x
+        so that they become one dimension, which is the reverse of
+        __split_heads_qkv.
+ """
+ if len(x.shape) != 4:
+ raise ValueError("Input(x) should be a 4-D Tensor.")
+
+ trans_x = layers.transpose(x, perm=[0, 2, 1, 3])
+ # The value 0 in shape attr means copying the corresponding dimension
+ # size of the input as the output dimension size.
+ return layers.reshape(
+ x=trans_x,
+ shape=[0, 0, trans_x.shape[2] * trans_x.shape[3]],
+ inplace=True)
+
+ def scaled_dot_product_attention(q, k, v, attn_bias, d_key, dropout_rate):
+ """
+ Scaled Dot-Product Attention
+ """
+ product = layers.matmul(x=q, y=k, transpose_y=True, alpha=d_key**-0.5)
+ if attn_bias:
+ product += attn_bias
+ weights = layers.softmax(product)
+ if dropout_rate:
+ weights = layers.dropout(
+ weights,
+ dropout_prob=dropout_rate,
+ seed=ModelHyperParams.dropout_seed,
+ is_test=False)
+ out = layers.matmul(weights, v)
+ return out
+
+ q, k, v = __compute_qkv(queries, keys, values, n_head, d_key, d_value)
+ q, k, v = __split_heads_qkv(q, k, v, n_head, d_key, d_value)
+
+ ctx_multiheads = scaled_dot_product_attention(q, k, v, attn_bias, d_model,
+ dropout_rate)
+
+ out = __combine_heads(ctx_multiheads)
+
+ # Project back to the model size.
+ proj_out = layers.fc(input=out,
+ size=d_model,
+ bias_attr=False,
+ num_flatten_dims=2)
+ return proj_out
+
+
+def positionwise_feed_forward(x, d_inner_hid, d_hid, dropout_rate):
+ """
+ Position-wise Feed-Forward Networks.
+ This module consists of two linear transformations with a ReLU activation
+ in between, which is applied to each position separately and identically.
+ """
+ hidden = layers.fc(input=x,
+ size=d_inner_hid,
+ num_flatten_dims=2,
+ act="relu")
+ if dropout_rate:
+ hidden = layers.dropout(
+ hidden,
+ dropout_prob=dropout_rate,
+ seed=ModelHyperParams.dropout_seed,
+ is_test=False)
+ out = layers.fc(input=hidden, size=d_hid, num_flatten_dims=2)
+ return out
+
+
+def pre_post_process_layer(prev_out, out, process_cmd, dropout_rate=0.):
+ """
+    Add residual connection, layer normalization and dropout to the out tensor
+ optionally according to the value of process_cmd.
+ This will be used before or after multi-head attention and position-wise
+ feed-forward networks.
+ """
+ for cmd in process_cmd:
+ if cmd == "a": # add residual connection
+ out = out + prev_out if prev_out else out
+ elif cmd == "n": # add layer normalization
+ out = layers.layer_norm(
+ out,
+ begin_norm_axis=len(out.shape) - 1,
+ param_attr=fluid.initializer.Constant(1.),
+ bias_attr=fluid.initializer.Constant(0.))
+ elif cmd == "d": # add dropout
+ if dropout_rate:
+ out = layers.dropout(
+ out,
+ dropout_prob=dropout_rate,
+ seed=ModelHyperParams.dropout_seed,
+ is_test=False)
+ return out
+
+
+pre_process_layer = partial(pre_post_process_layer, None)
+post_process_layer = pre_post_process_layer
+
+
+def prepare_encoder_decoder(src_word,
+ src_pos,
+ src_vocab_size,
+ src_emb_dim,
+ src_max_len,
+ dropout_rate=0.,
+ word_emb_param_name=None,
+ pos_enc_param_name=None):
+ """Add word embeddings and position encodings.
+ The output tensor has a shape of:
+ [batch_size, max_src_length_in_batch, d_model].
+ This module is used at the bottom of the encoder stacks.
+ """
+ src_word_emb = layers.embedding(
+ src_word,
+ size=[src_vocab_size, src_emb_dim],
+ padding_idx=ModelHyperParams.bos_idx, # set embedding of bos to 0
+ param_attr=fluid.ParamAttr(
+ name=word_emb_param_name,
+ initializer=fluid.initializer.Normal(0., src_emb_dim**-0.5)))
+
+ src_word_emb = layers.scale(x=src_word_emb, scale=src_emb_dim**0.5)
+ src_pos_enc = layers.embedding(
+ src_pos,
+ size=[src_max_len, src_emb_dim],
+ param_attr=fluid.ParamAttr(
+ name=pos_enc_param_name, trainable=False))
+ src_pos_enc.stop_gradient = True
+ enc_input = src_word_emb + src_pos_enc
+ return layers.dropout(
+ enc_input,
+ dropout_prob=dropout_rate,
+ seed=ModelHyperParams.dropout_seed,
+ is_test=False) if dropout_rate else enc_input
+
+
+prepare_encoder = partial(
+ prepare_encoder_decoder, pos_enc_param_name=pos_enc_param_names[0])
+prepare_decoder = partial(
+ prepare_encoder_decoder, pos_enc_param_name=pos_enc_param_names[1])
+
+
+def encoder_layer(enc_input,
+ attn_bias,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd="n",
+ postprocess_cmd="da"):
+ """The encoder layers that can be stacked to form a deep encoder.
+    This module consists of a multi-head (self-)attention sub-layer followed
+    by a position-wise feed-forward network, both accompanied by
+    post_process_layer to add residual connection, layer normalization and
+    dropout.
+ """
+ attn_output = multi_head_attention(
+ pre_process_layer(enc_input, preprocess_cmd,
+ prepostprocess_dropout), None, None, attn_bias, d_key,
+ d_value, d_model, n_head, attention_dropout)
+ attn_output = post_process_layer(enc_input, attn_output, postprocess_cmd,
+ prepostprocess_dropout)
+ ffd_output = positionwise_feed_forward(
+ pre_process_layer(attn_output, preprocess_cmd, prepostprocess_dropout),
+ d_inner_hid, d_model, relu_dropout)
+ return post_process_layer(attn_output, ffd_output, postprocess_cmd,
+ prepostprocess_dropout)
+
+
+def encoder(enc_input,
+ attn_bias,
+ n_layer,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd="n",
+ postprocess_cmd="da"):
+ """
+ The encoder is composed of a stack of identical layers returned by calling
+ encoder_layer.
+ """
+ for i in range(n_layer):
+ enc_output = encoder_layer(
+ enc_input,
+ attn_bias,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd, )
+ enc_input = enc_output
+ enc_output = pre_process_layer(enc_output, preprocess_cmd,
+ prepostprocess_dropout)
+ return enc_output
+
+
+def decoder_layer(dec_input,
+ enc_output,
+ slf_attn_bias,
+ dec_enc_attn_bias,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd,
+ cache=None,
+ gather_idx=None):
+ """ The layer to be stacked in decoder part.
+    The structure of this module is similar to that in the encoder part except
+    that an extra multi-head attention sub-layer is added to implement
+    encoder-decoder attention.
+ """
+ slf_attn_output = multi_head_attention(
+ pre_process_layer(dec_input, preprocess_cmd, prepostprocess_dropout),
+ None,
+ None,
+ slf_attn_bias,
+ d_key,
+ d_value,
+ d_model,
+ n_head,
+ attention_dropout,
+ cache=cache,
+ gather_idx=gather_idx)
+ slf_attn_output = post_process_layer(
+ dec_input,
+ slf_attn_output,
+ postprocess_cmd,
+ prepostprocess_dropout, )
+ enc_attn_output = multi_head_attention(
+ pre_process_layer(slf_attn_output, preprocess_cmd,
+ prepostprocess_dropout),
+ enc_output,
+ enc_output,
+ dec_enc_attn_bias,
+ d_key,
+ d_value,
+ d_model,
+ n_head,
+ attention_dropout,
+ cache=cache,
+ gather_idx=gather_idx,
+ static_kv=True)
+ enc_attn_output = post_process_layer(
+ slf_attn_output,
+ enc_attn_output,
+ postprocess_cmd,
+ prepostprocess_dropout, )
+ ffd_output = positionwise_feed_forward(
+ pre_process_layer(enc_attn_output, preprocess_cmd,
+ prepostprocess_dropout),
+ d_inner_hid,
+ d_model,
+ relu_dropout, )
+ dec_output = post_process_layer(
+ enc_attn_output,
+ ffd_output,
+ postprocess_cmd,
+ prepostprocess_dropout, )
+ return dec_output
+
+
+def decoder(dec_input,
+ enc_output,
+ dec_slf_attn_bias,
+ dec_enc_attn_bias,
+ n_layer,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd,
+ caches=None,
+ gather_idx=None):
+ """
+ The decoder is composed of a stack of identical decoder_layer layers.
+ """
+ for i in range(n_layer):
+ dec_output = decoder_layer(
+ dec_input,
+ enc_output,
+ dec_slf_attn_bias,
+ dec_enc_attn_bias,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd,
+ cache=None if caches is None else caches[i],
+ gather_idx=gather_idx)
+ dec_input = dec_output
+ dec_output = pre_process_layer(dec_output, preprocess_cmd,
+ prepostprocess_dropout)
+ return dec_output
+
+
+def make_all_inputs(input_fields):
+ """
+ Define the input data layers for the transformer model.
+ """
+ inputs = []
+ for input_field in input_fields:
+ input_var = layers.data(
+ name=input_field,
+ shape=input_descs[input_field][0],
+ dtype=input_descs[input_field][1],
+ lod_level=input_descs[input_field][2]
+ if len(input_descs[input_field]) == 3 else 0,
+ append_batch_size=False)
+ inputs.append(input_var)
+ return inputs
+
+
+def make_all_py_reader_inputs(input_fields, is_test=False):
+ reader = layers.py_reader(
+ capacity=20,
+ name="test_reader" if is_test else "train_reader",
+ shapes=[input_descs[input_field][0] for input_field in input_fields],
+ dtypes=[input_descs[input_field][1] for input_field in input_fields],
+ lod_levels=[
+ input_descs[input_field][2]
+ if len(input_descs[input_field]) == 3 else 0
+ for input_field in input_fields
+ ])
+ return layers.read_file(reader), reader
+
+
+def transformer(src_vocab_size,
+ trg_vocab_size,
+ max_length,
+ n_layer,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd,
+ weight_sharing,
+ label_smooth_eps,
+ use_py_reader=False,
+ is_test=False):
+ if weight_sharing:
+ assert src_vocab_size == trg_vocab_size, (
+ "Vocabularies in source and target should be same for weight sharing."
+ )
+
+ data_input_names = encoder_data_input_fields + \
+ decoder_data_input_fields[:-1] + label_data_input_fields
+
+ if use_py_reader:
+ all_inputs, reader = make_all_py_reader_inputs(data_input_names,
+ is_test)
+ else:
+ all_inputs = make_all_inputs(data_input_names)
+
+ enc_inputs_len = len(encoder_data_input_fields)
+ dec_inputs_len = len(decoder_data_input_fields[:-1])
+ enc_inputs = all_inputs[0:enc_inputs_len]
+ dec_inputs = all_inputs[enc_inputs_len:enc_inputs_len + dec_inputs_len]
+ label = all_inputs[-2]
+ weights = all_inputs[-1]
+
+ enc_output = wrap_encoder(
+ src_vocab_size,
+ max_length,
+ n_layer,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd,
+ weight_sharing,
+ enc_inputs, )
+
+ predict = wrap_decoder(
+ trg_vocab_size,
+ max_length,
+ n_layer,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd,
+ weight_sharing,
+ dec_inputs,
+ enc_output, )
+
+    # Padding indices do not contribute to the total loss. The weights are
+    # used to cancel the effect of padding indices in calculating the loss.
+ if label_smooth_eps:
+ label = layers.label_smooth(
+ label=layers.one_hot(
+ input=label, depth=trg_vocab_size),
+ epsilon=label_smooth_eps)
+
+ cost = layers.softmax_with_cross_entropy(
+ logits=predict,
+ label=label,
+ soft_label=True if label_smooth_eps else False)
+ weighted_cost = cost * weights
+ sum_cost = layers.reduce_sum(weighted_cost)
+ token_num = layers.reduce_sum(weights)
+ token_num.stop_gradient = True
+ avg_cost = sum_cost / token_num
+ return sum_cost, avg_cost, predict, token_num, reader if use_py_reader else None
+
+
+def wrap_encoder(src_vocab_size,
+ max_length,
+ n_layer,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd,
+ weight_sharing,
+ enc_inputs=None):
+ """
+ The wrapper assembles together all needed layers for the encoder.
+ """
+ if enc_inputs is None:
+ # This is used to implement independent encoder program in inference.
+ src_word, src_pos, src_slf_attn_bias = make_all_inputs(
+ encoder_data_input_fields)
+ else:
+ src_word, src_pos, src_slf_attn_bias = enc_inputs
+ enc_input = prepare_encoder(
+ src_word,
+ src_pos,
+ src_vocab_size,
+ d_model,
+ max_length,
+ prepostprocess_dropout,
+ word_emb_param_name=word_emb_param_names[0])
+ enc_output = encoder(
+ enc_input,
+ src_slf_attn_bias,
+ n_layer,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd, )
+ return enc_output
+
+
+def wrap_decoder(trg_vocab_size,
+ max_length,
+ n_layer,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd,
+ weight_sharing,
+ dec_inputs=None,
+ enc_output=None,
+ caches=None,
+ gather_idx=None):
+ """
+ The wrapper assembles together all needed layers for the decoder.
+ """
+ if dec_inputs is None:
+ # This is used to implement independent decoder program in inference.
+ trg_word, trg_pos, trg_slf_attn_bias, trg_src_attn_bias, enc_output = \
+ make_all_inputs(decoder_data_input_fields)
+ else:
+ trg_word, trg_pos, trg_slf_attn_bias, trg_src_attn_bias = dec_inputs
+
+ dec_input = prepare_decoder(
+ trg_word,
+ trg_pos,
+ trg_vocab_size,
+ d_model,
+ max_length,
+ prepostprocess_dropout,
+ word_emb_param_name=word_emb_param_names[0]
+ if weight_sharing else word_emb_param_names[1])
+ dec_output = decoder(
+ dec_input,
+ enc_output,
+ trg_slf_attn_bias,
+ trg_src_attn_bias,
+ n_layer,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd,
+ caches=caches,
+ gather_idx=gather_idx)
+ # Reshape to 2D tensor to use GEMM instead of BatchedGEMM
+ dec_output = layers.reshape(
+ dec_output, shape=[-1, dec_output.shape[-1]], inplace=True)
+ if weight_sharing:
+ predict = layers.matmul(
+ x=dec_output,
+ y=fluid.default_main_program().global_block().var(
+ word_emb_param_names[0]),
+ transpose_y=True)
+ else:
+ predict = layers.fc(input=dec_output,
+ size=trg_vocab_size,
+ bias_attr=False)
+ if dec_inputs is None:
+ # Return probs for independent decoder program.
+ predict = layers.softmax(predict)
+ return predict
+
+
+def fast_decode(src_vocab_size,
+ trg_vocab_size,
+ max_in_len,
+ n_layer,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd,
+ weight_sharing,
+ beam_size,
+ max_out_len,
+ eos_idx,
+ use_py_reader=False):
+ """
+ Use beam search to decode. Caches will be used to store states of history
+ steps which can make the decoding faster.
+ """
+ data_input_names = encoder_data_input_fields + fast_decoder_data_input_fields
+
+ if use_py_reader:
+ all_inputs, reader = make_all_py_reader_inputs(data_input_names)
+ else:
+ all_inputs = make_all_inputs(data_input_names)
+
+ enc_inputs_len = len(encoder_data_input_fields)
+ dec_inputs_len = len(fast_decoder_data_input_fields)
+ enc_inputs = all_inputs[0:enc_inputs_len]
+ dec_inputs = all_inputs[enc_inputs_len:enc_inputs_len + dec_inputs_len]
+
+ enc_output = wrap_encoder(
+ src_vocab_size,
+ max_in_len,
+ n_layer,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd,
+ weight_sharing,
+ enc_inputs, )
+ start_tokens, init_scores, parent_idx, trg_src_attn_bias = dec_inputs
+
+ def beam_search():
+ max_len = layers.fill_constant(
+ shape=[1],
+ dtype=start_tokens.dtype,
+ value=max_out_len,
+ force_cpu=True)
+ step_idx = layers.fill_constant(
+ shape=[1], dtype=start_tokens.dtype, value=0, force_cpu=True)
+ cond = layers.less_than(x=step_idx, y=max_len) # default force_cpu=True
+ while_op = layers.While(cond)
+ # array states will be stored for each step.
+ ids = layers.array_write(
+ layers.reshape(start_tokens, (-1, 1)), step_idx)
+ scores = layers.array_write(init_scores, step_idx)
+        # cell states will be overwritten at each step.
+        # caches contain states of history steps in decoder self-attention
+ # and static encoder output projections in encoder-decoder attention
+ # to reduce redundant computation.
+ caches = [
+ {
+ "k": # for self attention
+ layers.fill_constant_batch_size_like(
+ input=start_tokens,
+ shape=[-1, n_head, 0, d_key],
+ dtype=enc_output.dtype,
+ value=0),
+ "v": # for self attention
+ layers.fill_constant_batch_size_like(
+ input=start_tokens,
+ shape=[-1, n_head, 0, d_value],
+ dtype=enc_output.dtype,
+ value=0),
+ "static_k": # for encoder-decoder attention
+ layers.create_tensor(dtype=enc_output.dtype),
+ "static_v": # for encoder-decoder attention
+ layers.create_tensor(dtype=enc_output.dtype)
+ } for i in range(n_layer)
+ ]
+
+ with while_op.block():
+ pre_ids = layers.array_read(array=ids, i=step_idx)
+            # Since beam_search_op doesn't enforce pre_ids' shape, we can do
+            # an inplace reshape here, which actually changes the shape of
+            # pre_ids.
+ pre_ids = layers.reshape(pre_ids, (-1, 1, 1), inplace=True)
+ pre_scores = layers.array_read(array=scores, i=step_idx)
+ # gather cell states corresponding to selected parent
+ pre_src_attn_bias = layers.gather(
+ trg_src_attn_bias, index=parent_idx)
+ pre_pos = layers.elementwise_mul(
+ x=layers.fill_constant_batch_size_like(
+ input=pre_src_attn_bias,  # can't use a lod tensor here
+ value=1,
+ shape=[-1, 1, 1],
+ dtype=pre_ids.dtype),
+ y=step_idx,
+ axis=0)
+ logits = wrap_decoder(
+ trg_vocab_size,
+ max_in_len,
+ n_layer,
+ n_head,
+ d_key,
+ d_value,
+ d_model,
+ d_inner_hid,
+ prepostprocess_dropout,
+ attention_dropout,
+ relu_dropout,
+ preprocess_cmd,
+ postprocess_cmd,
+ weight_sharing,
+ dec_inputs=(pre_ids, pre_pos, None, pre_src_attn_bias),
+ enc_output=enc_output,
+ caches=caches,
+ gather_idx=parent_idx)
+ # intra-beam topK
+ topk_scores, topk_indices = layers.topk(
+ input=layers.softmax(logits), k=beam_size)
+ accu_scores = layers.elementwise_add(
+ x=layers.log(topk_scores), y=pre_scores, axis=0)
+ # beam_search op uses lod to differentiate branches.
+ topk_indices = layers.lod_reset(topk_indices, pre_ids)
+ # topK reduction across beams, which also contains special handling of
+ # finished beams and finished sentences (batch reduction)
+ selected_ids, selected_scores, gather_idx = layers.beam_search(
+ pre_ids=pre_ids,
+ pre_scores=pre_scores,
+ ids=topk_indices,
+ scores=accu_scores,
+ beam_size=beam_size,
+ end_id=eos_idx,
+ return_parent_idx=True)
+ layers.increment(x=step_idx, value=1.0, in_place=True)
+ # cell states (caches) have been updated in wrap_decoder; we
+ # only need to update the beam search states here.
+ layers.array_write(selected_ids, i=step_idx, array=ids)
+ layers.array_write(selected_scores, i=step_idx, array=scores)
+ layers.assign(gather_idx, parent_idx)
+ layers.assign(pre_src_attn_bias, trg_src_attn_bias)
+ length_cond = layers.less_than(x=step_idx, y=max_len)
+ finish_cond = layers.logical_not(layers.is_empty(x=selected_ids))
+ layers.logical_and(x=length_cond, y=finish_cond, out=cond)
+
+ finished_ids, finished_scores = layers.beam_search_decode(
+ ids, scores, beam_size=beam_size, end_id=eos_idx)
+ return finished_ids, finished_scores
+
+ finished_ids, finished_scores = beam_search()
+ return finished_ids, finished_scores, reader if use_py_reader else None
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/optim.py b/PaddleNLP/neural_machine_translation/transformer/optim.py
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/transformer/optim.py
rename to PaddleNLP/neural_machine_translation/transformer/optim.py
diff --git a/PaddleNLP/neural_machine_translation/transformer/profile.py b/PaddleNLP/neural_machine_translation/transformer/profile.py
new file mode 100644
index 0000000000000000000000000000000000000000..76711ece132113863f1e42d4ac1529f63ed90ff3
--- /dev/null
+++ b/PaddleNLP/neural_machine_translation/transformer/profile.py
@@ -0,0 +1,284 @@
+import argparse
+import ast
+import contextlib
+import multiprocessing
+import os
+import six
+import time
+
+import numpy as np
+import paddle.fluid as fluid
+import paddle.fluid.profiler as profiler
+
+import reader
+from config import *
+from train import pad_batch_data, prepare_data_generator, \
+ prepare_feed_dict_list, py_reader_provider_wrapper
+from model import transformer, position_encoding_init
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("Training for Transformer.")
+ parser.add_argument(
+ "--src_vocab_fpath",
+ type=str,
+ required=True,
+ help="The path of vocabulary file of source language.")
+ parser.add_argument(
+ "--trg_vocab_fpath",
+ type=str,
+ required=True,
+ help="The path of vocabulary file of target language.")
+ parser.add_argument(
+ "--train_file_pattern",
+ type=str,
+ required=True,
+ help="The pattern to match training data files.")
+ parser.add_argument(
+ "--use_token_batch",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to "
+ "produce batch data according to token number.")
+ parser.add_argument(
+ "--batch_size",
+ type=int,
+ default=4096,
+ help="The number of sequences contained in a mini-batch, or the maximum "
+ "number of tokens (include paddings) contained in a mini-batch. Note "
+ "that this represents the number on single device and the actual batch "
+ "size for multi-devices will multiply the device number.")
+ parser.add_argument(
+ "--pool_size",
+ type=int,
+ default=200000,
+ help="The buffer size to pool data.")
+ parser.add_argument(
+ "--sort_type",
+ default="pool",
+ choices=("global", "pool", "none"),
+ help="The grain to sort by length: global for all instances; pool for "
+ "instances in pool; none for no sort.")
+ parser.add_argument(
+ "--shuffle",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to shuffle instances in each pass.")
+ parser.add_argument(
+ "--shuffle_batch",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to shuffle the data batches.")
+ parser.add_argument(
+ "--special_token",
+ type=str,
+ default=["", "", ""],
+ nargs=3,
+ help="The , and tokens in the dictionary.")
+ parser.add_argument(
+ "--token_delimiter",
+ type=lambda x: str(x.encode().decode("unicode-escape")),
+ default=" ",
+ help="The delimiter used to split tokens in source or target sentences. "
+ "For EN-DE BPE data we provided, use spaces as token delimiter.")
+ parser.add_argument(
+ "--use_mem_opt",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to use memory optimization.")
+ parser.add_argument(
+ "--use_py_reader",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to use py_reader.")
+ parser.add_argument(
+ "--iter_num",
+ type=int,
+ default=20,
+ help="The iteration number to run in profiling.")
+ parser.add_argument(
+ "--use_parallel_exe",
+ type=ast.literal_eval,
+ default=False,
+ help="The flag indicating whether to use ParallelExecutor.")
+ parser.add_argument(
+ "--profile_ops",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to profile operators.")
+ parser.add_argument(
+ 'opts',
+ help='See config.py for all options',
+ default=None,
+ nargs=argparse.REMAINDER)
+
+ args = parser.parse_args()
+ # Append args related to dict
+ src_dict = reader.DataReader.load_dict(args.src_vocab_fpath)
+ trg_dict = reader.DataReader.load_dict(args.trg_vocab_fpath)
+ dict_args = [
+ "src_vocab_size", str(len(src_dict)), "trg_vocab_size",
+ str(len(trg_dict)), "bos_idx", str(src_dict[args.special_token[0]]),
+ "eos_idx", str(src_dict[args.special_token[1]]), "unk_idx",
+ str(src_dict[args.special_token[2]])
+ ]
+ merge_cfg_from_list(args.opts + dict_args,
+ [TrainTaskConfig, ModelHyperParams])
+ return args
+
+
+def main(args):
+ train_prog = fluid.Program()
+ startup_prog = fluid.Program()
+ train_prog.random_seed = 1000
+ startup_prog.random_seed = 1000
+ with fluid.program_guard(train_prog, startup_prog):
+ with fluid.unique_name.guard():
+ sum_cost, avg_cost, predict, token_num, pyreader = transformer(
+ ModelHyperParams.src_vocab_size,
+ ModelHyperParams.trg_vocab_size,
+ ModelHyperParams.max_length + 1,
+ ModelHyperParams.n_layer,
+ ModelHyperParams.n_head,
+ ModelHyperParams.d_key,
+ ModelHyperParams.d_value,
+ ModelHyperParams.d_model,
+ ModelHyperParams.d_inner_hid,
+ ModelHyperParams.prepostprocess_dropout,
+ ModelHyperParams.attention_dropout,
+ ModelHyperParams.relu_dropout,
+ ModelHyperParams.preprocess_cmd,
+ ModelHyperParams.postprocess_cmd,
+ ModelHyperParams.weight_sharing,
+ TrainTaskConfig.label_smooth_eps,
+ use_py_reader=args.use_py_reader,
+ is_test=False)
+ lr_decay = fluid.layers.learning_rate_scheduler.noam_decay(
+ ModelHyperParams.d_model, TrainTaskConfig.warmup_steps)
+ optimizer = fluid.optimizer.Adam(
+ learning_rate=lr_decay * TrainTaskConfig.learning_rate,
+ beta1=TrainTaskConfig.beta1,
+ beta2=TrainTaskConfig.beta2,
+ epsilon=TrainTaskConfig.eps)
+ optimizer.minimize(avg_cost)
+
+ if args.use_mem_opt:
+ fluid.memory_optimize(train_prog)
+
+ if TrainTaskConfig.use_gpu:
+ place = fluid.CUDAPlace(0)
+ dev_count = fluid.core.get_cuda_device_count()
+ else:
+ place = fluid.CPUPlace()
+ dev_count = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
+ exe = fluid.Executor(place)
+ # Initialize the parameters.
+ if TrainTaskConfig.ckpt_path:
+ fluid.io.load_persistables(exe, TrainTaskConfig.ckpt_path)
+ else:
+ exe.run(startup_prog)
+
+ exec_strategy = fluid.ExecutionStrategy()
+ # For faster executor
+ exec_strategy.use_experimental_executor = True
+ exec_strategy.num_iteration_per_drop_scope = 5
+ build_strategy = fluid.BuildStrategy()
+ # Since the token number differs among devices, customize the gradient scale
+ # to use the token-averaged cost across devices; the gradient scale is then
+ # `1 / token_number` for the average cost.
+ # build_strategy.gradient_scale_strategy = fluid.BuildStrategy.GradientScaleStrategy.Customized
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=TrainTaskConfig.use_gpu,
+ loss_name=avg_cost.name,
+ main_program=train_prog,
+ build_strategy=build_strategy,
+ exec_strategy=exec_strategy)
+
+ # the best cross-entropy value with label smoothing
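+ # (i.e. the entropy of the smoothed target distribution); subtracting it
+ # from the training loss gives the "normalized loss" reported below.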
+ loss_normalizer = -((1. - TrainTaskConfig.label_smooth_eps) * np.log(
+ (1. - TrainTaskConfig.label_smooth_eps
+ )) + TrainTaskConfig.label_smooth_eps *
+ np.log(TrainTaskConfig.label_smooth_eps / (
+ ModelHyperParams.trg_vocab_size - 1) + 1e-20))
+
+ train_data = prepare_data_generator(
+ args, is_test=False, count=dev_count, pyreader=pyreader,
+ py_reader_provider_wrapper=py_reader_provider_wrapper)
+ if args.use_py_reader:
+ pyreader.start()
+ data_generator = None
+ else:
+ data_generator = train_data()
+
+ def run(iter_num):
+ reader_time = []
+ run_time = []
+
+ for step_idx in six.moves.xrange(iter_num):
+ try:
+ start_time = time.time()
+ feed_dict_list = prepare_feed_dict_list(data_generator,
+ init_flag, dev_count)
+ end_time = time.time()
+ reader_time.append(end_time - start_time)
+
+ start_time = time.time()
+ if args.use_parallel_exe:
+ outs = train_exe.run(
+ fetch_list=[sum_cost.name, token_num.name],
+ feed=feed_dict_list)
+ else:
+ outs = exe.run(program=train_prog,
+ fetch_list=[sum_cost.name, token_num.name],
+ feed=feed_dict_list[0]
+ if feed_dict_list is not None else None)
+ end_time = time.time()
+ run_time.append(end_time - start_time)
+
+ sum_cost_val, token_num_val = np.array(outs[0]), np.array(outs[
+ 1])
+ # sum the cost from multi-devices
+ total_sum_cost = sum_cost_val.sum()
+ total_token_num = token_num_val.sum()
+ total_avg_cost = total_sum_cost / total_token_num
+ print("step_idx: %d, avg loss: %f, "
+ "normalized loss: %f, ppl: %f" %
+ (step_idx, total_avg_cost,
+ total_avg_cost - loss_normalizer,
+ np.exp([min(total_avg_cost, 100)])))
+ except (StopIteration, fluid.core.EOFException):
+ # The current pass is over.
+ if args.use_py_reader:
+ pyreader.reset()
+ pyreader.start()
+
+ return reader_time, run_time
+
+ @contextlib.contextmanager
+ def profile_context(profile=True):
+ if profile:
+ with profiler.profiler('All', 'total', '/tmp/profile_file'):
+ yield
+ else:
+ yield
+
+ # start-up
+ init_flag = True
+ run(5)
+ init_flag = False
+
+ # profiling
+ start = time.time()
+ # currently only supports profiling on one device
+ with profile_context(args.profile_ops):
+ reader_time, run_time = run(args.iter_num)
+ end = time.time()
+ total_time = end - start
+ print(
+ "Total time: {0}, reader time: {1} s, run time: {2} s, step number: {3}".
+ format(total_time, np.sum(reader_time), np.sum(run_time),
+ args.iter_num))
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ main(args)
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/reader.py b/PaddleNLP/neural_machine_translation/transformer/reader.py
similarity index 100%
rename from fluid/PaddleNLP/neural_machine_translation/transformer/reader.py
rename to PaddleNLP/neural_machine_translation/transformer/reader.py
diff --git a/PaddleNLP/neural_machine_translation/transformer/train.py b/PaddleNLP/neural_machine_translation/transformer/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..4313f8b441ee194935c7c47abc52271589c7765d
--- /dev/null
+++ b/PaddleNLP/neural_machine_translation/transformer/train.py
@@ -0,0 +1,771 @@
+import argparse
+import ast
+import copy
+import logging
+import multiprocessing
+import os
+import six
+import sys
+import time
+
+import numpy as np
+import paddle.fluid as fluid
+
+import reader
+from config import *
+from model import transformer, position_encoding_init
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("Training for Transformer.")
+ parser.add_argument(
+ "--src_vocab_fpath",
+ type=str,
+ required=True,
+ help="The path of vocabulary file of source language.")
+ parser.add_argument(
+ "--trg_vocab_fpath",
+ type=str,
+ required=True,
+ help="The path of vocabulary file of target language.")
+ parser.add_argument(
+ "--train_file_pattern",
+ type=str,
+ required=True,
+ help="The pattern to match training data files.")
+ parser.add_argument(
+ "--val_file_pattern",
+ type=str,
+ help="The pattern to match validation data files.")
+ parser.add_argument(
+ "--use_token_batch",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to "
+ "produce batch data according to token number.")
+ parser.add_argument(
+ "--batch_size",
+ type=int,
+ default=4096,
+ help="The number of sequences contained in a mini-batch, or the maximum "
+ "number of tokens (include paddings) contained in a mini-batch. Note "
+ "that this represents the number on single device and the actual batch "
+ "size for multi-devices will multiply the device number.")
+ parser.add_argument(
+ "--pool_size",
+ type=int,
+ default=200000,
+ help="The buffer size to pool data.")
+ parser.add_argument(
+ "--sort_type",
+ default="pool",
+ choices=("global", "pool", "none"),
+ help="The grain to sort by length: global for all instances; pool for "
+ "instances in pool; none for no sort.")
+ parser.add_argument(
+ "--shuffle",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to shuffle instances in each pass.")
+ parser.add_argument(
+ "--shuffle_batch",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to shuffle the data batches.")
+ parser.add_argument(
+ "--special_token",
+ type=str,
+ default=["", "", ""],
+ nargs=3,
+ help="The , and tokens in the dictionary.")
+ parser.add_argument(
+ "--token_delimiter",
+ type=lambda x: str(x.encode().decode("unicode-escape")),
+ default=" ",
+ help="The delimiter used to split tokens in source or target sentences. "
+ "For EN-DE BPE data we provided, use spaces as token delimiter. ")
+ parser.add_argument(
+ 'opts',
+ help='See config.py for all options',
+ default=None,
+ nargs=argparse.REMAINDER)
+ parser.add_argument(
+ '--local',
+ type=ast.literal_eval,
+ default=True,
+ help='Whether to run as local mode.')
+ parser.add_argument(
+ '--device',
+ type=str,
+ default='GPU',
+ choices=['CPU', 'GPU'],
+ help="The device type.")
+ parser.add_argument(
+ '--update_method',
+ choices=("pserver", "nccl2"),
+ default="pserver",
+ help='Update method.')
+ parser.add_argument(
+ '--sync', type=ast.literal_eval, default=True, help="sync mode.")
+ parser.add_argument(
+ "--enable_ce",
+ type=ast.literal_eval,
+ default=False,
+ help="The flag indicating whether to run the task "
+ "for continuous evaluation.")
+ parser.add_argument(
+ "--use_mem_opt",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to use memory optimization.")
+ parser.add_argument(
+ "--use_py_reader",
+ type=ast.literal_eval,
+ default=True,
+ help="The flag indicating whether to use py_reader.")
+ parser.add_argument(
+ "--fetch_steps",
+ type=int,
+ default=100,
+ help="The frequency to fetch and print output.")
+
+ args = parser.parse_args()
+ # Append args related to dict
+ src_dict = reader.DataReader.load_dict(args.src_vocab_fpath)
+ trg_dict = reader.DataReader.load_dict(args.trg_vocab_fpath)
+ dict_args = [
+ "src_vocab_size", str(len(src_dict)), "trg_vocab_size",
+ str(len(trg_dict)), "bos_idx", str(src_dict[args.special_token[0]]),
+ "eos_idx", str(src_dict[args.special_token[1]]), "unk_idx",
+ str(src_dict[args.special_token[2]])
+ ]
+ merge_cfg_from_list(args.opts + dict_args,
+ [TrainTaskConfig, ModelHyperParams])
+ return args
+
+
+def append_nccl2_prepare(startup_prog, trainer_id, worker_endpoints,
+ current_endpoint):
+ assert (trainer_id >= 0 and len(worker_endpoints) > 1 and
+ current_endpoint in worker_endpoints)
+ eps = copy.deepcopy(worker_endpoints)
+ eps.remove(current_endpoint)
+ nccl_id_var = startup_prog.global_block().create_var(
+ name="NCCLID", persistable=True, type=fluid.core.VarDesc.VarType.RAW)
+ startup_prog.global_block().append_op(
+ type="gen_nccl_id",
+ inputs={},
+ outputs={"NCCLID": nccl_id_var},
+ attrs={
+ "endpoint": current_endpoint,
+ "endpoint_list": eps,
+ "trainer_id": trainer_id
+ })
+ return nccl_id_var
+
+
+def pad_batch_data(insts,
+ pad_idx,
+ n_head,
+ is_target=False,
+ is_label=False,
+ return_attn_bias=True,
+ return_max_len=True,
+ return_num_token=False):
+ """
+ Pad the instances to the max sequence length in the batch, and generate the
+ corresponding position data and attention bias.
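+ The attention bias uses -1e9 at disallowed positions (paddings, and for
+ targets also the subsequent words) so that softmax assigns them ~zero weight.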
+ """
+ return_list = []
+ max_len = max(len(inst) for inst in insts)
+ # Any token included in the dict can be used for padding, since the paddings'
+ # loss will be masked out by weights and has no effect on parameter gradients.
+ inst_data = np.array(
+ [inst + [pad_idx] * (max_len - len(inst)) for inst in insts])
+ return_list += [inst_data.astype("int64").reshape([-1, 1])]
+ if is_label: # label weight
+ inst_weight = np.array(
+ [[1.] * len(inst) + [0.] * (max_len - len(inst)) for inst in insts])
+ return_list += [inst_weight.astype("float32").reshape([-1, 1])]
+ else: # position data
+ inst_pos = np.array([
+ list(range(0, len(inst))) + [0] * (max_len - len(inst))
+ for inst in insts
+ ])
+ return_list += [inst_pos.astype("int64").reshape([-1, 1])]
+ if return_attn_bias:
+ if is_target:
+ # This is used to avoid attention on paddings and subsequent
+ # words.
+ slf_attn_bias_data = np.ones((inst_data.shape[0], max_len, max_len))
+ slf_attn_bias_data = np.triu(slf_attn_bias_data,
+ 1).reshape([-1, 1, max_len, max_len])
+ slf_attn_bias_data = np.tile(slf_attn_bias_data,
+ [1, n_head, 1, 1]) * [-1e9]
+ else:
+ # This is used to avoid attention on paddings.
+ slf_attn_bias_data = np.array([[0] * len(inst) + [-1e9] *
+ (max_len - len(inst))
+ for inst in insts])
+ slf_attn_bias_data = np.tile(
+ slf_attn_bias_data.reshape([-1, 1, 1, max_len]),
+ [1, n_head, max_len, 1])
+ return_list += [slf_attn_bias_data.astype("float32")]
+ if return_max_len:
+ return_list += [max_len]
+ if return_num_token:
+ num_token = 0
+ for inst in insts:
+ num_token += len(inst)
+ return_list += [num_token]
+ return return_list if len(return_list) > 1 else return_list[0]
+
+
+def prepare_batch_input(insts, data_input_names, src_pad_idx, trg_pad_idx,
+ n_head, d_model):
+ """
+ Put all padded data needed by training into a dict.
+ """
+ src_word, src_pos, src_slf_attn_bias, src_max_len = pad_batch_data(
+ [inst[0] for inst in insts], src_pad_idx, n_head, is_target=False)
+ src_word = src_word.reshape(-1, src_max_len, 1)
+ src_pos = src_pos.reshape(-1, src_max_len, 1)
+ trg_word, trg_pos, trg_slf_attn_bias, trg_max_len = pad_batch_data(
+ [inst[1] for inst in insts], trg_pad_idx, n_head, is_target=True)
+ trg_word = trg_word.reshape(-1, trg_max_len, 1)
+ trg_pos = trg_pos.reshape(-1, trg_max_len, 1)
+
+ trg_src_attn_bias = np.tile(src_slf_attn_bias[:, :, ::src_max_len, :],
+ [1, 1, trg_max_len, 1]).astype("float32")
+
+ lbl_word, lbl_weight, num_token = pad_batch_data(
+ [inst[2] for inst in insts],
+ trg_pad_idx,
+ n_head,
+ is_target=False,
+ is_label=True,
+ return_attn_bias=False,
+ return_max_len=False,
+ return_num_token=True)
+
+ data_input_dict = dict(
+ zip(data_input_names, [
+ src_word, src_pos, src_slf_attn_bias, trg_word, trg_pos,
+ trg_slf_attn_bias, trg_src_attn_bias, lbl_word, lbl_weight
+ ]))
+
+ return data_input_dict, np.asarray([num_token], dtype="float32")
+
+
+def prepare_data_generator(args,
+ is_test,
+ count,
+ pyreader,
+ py_reader_provider_wrapper,
+ place=None):
+ """
+ Data generator wrapper for DataReader. If py_reader is used, set the
+ data provider for it.
+ """
+ data_reader = reader.DataReader(
+ fpattern=args.val_file_pattern if is_test else args.train_file_pattern,
+ src_vocab_fpath=args.src_vocab_fpath,
+ trg_vocab_fpath=args.trg_vocab_fpath,
+ token_delimiter=args.token_delimiter,
+ use_token_batch=args.use_token_batch,
+ batch_size=args.batch_size * (1 if args.use_token_batch else count),
+ pool_size=args.pool_size,
+ sort_type=args.sort_type,
+ shuffle=args.shuffle,
+ shuffle_batch=args.shuffle_batch,
+ start_mark=args.special_token[0],
+ end_mark=args.special_token[1],
+ unk_mark=args.special_token[2],
+ # exclude the start and end tokens from the length count
+ max_length=ModelHyperParams.max_length - 2,
+ clip_last_batch=False).batch_generator
+
+ def stack(data_reader, count, clip_last=True):
+ def __impl__():
+ res = []
+ for item in data_reader():
+ res.append(item)
+ if len(res) == count:
+ yield res
+ res = []
+ if len(res) == count:
+ yield res
+ elif not clip_last:
+ data = []
+ for item in res:
+ data += item
+ if len(data) > count:
+ inst_num_per_part = len(data) // count
+ yield [
+ data[inst_num_per_part * i:inst_num_per_part * (i + 1)]
+ for i in range(count)
+ ]
+
+ return __impl__
+
+ def split(data_reader, count):
+ def __impl__():
+ for item in data_reader():
+ inst_num_per_part = len(item) // count
+ for i in range(count):
+ yield item[inst_num_per_part * i:inst_num_per_part * (i + 1
+ )]
+
+ return __impl__
+
+ if not args.use_token_batch:
+ # to make the data on each device have a similar token number
+ data_reader = split(data_reader, count)
+ if args.use_py_reader:
+ pyreader.decorate_tensor_provider(
+ py_reader_provider_wrapper(data_reader, place))
+ data_reader = None
+ else:  # Data generator for multiple devices
+ data_reader = stack(data_reader, count)
+ return data_reader
+
+
+def prepare_feed_dict_list(data_generator, init_flag, count):
+ """
+ Prepare the list of feed dicts for multiple devices.
+ """
+ feed_dict_list = []
+ if data_generator is not None: # use_py_reader == False
+ data_input_names = encoder_data_input_fields + \
+ decoder_data_input_fields[:-1] + label_data_input_fields
+ data = next(data_generator)
+ for idx, data_buffer in enumerate(data):
+ data_input_dict, num_token = prepare_batch_input(
+ data_buffer, data_input_names, ModelHyperParams.eos_idx,
+ ModelHyperParams.eos_idx, ModelHyperParams.n_head,
+ ModelHyperParams.d_model)
+ feed_dict_list.append(data_input_dict)
+ if init_flag:
+ for idx in range(count):
+ pos_enc_tables = dict()
+ for pos_enc_param_name in pos_enc_param_names:
+ pos_enc_tables[pos_enc_param_name] = position_encoding_init(
+ ModelHyperParams.max_length + 1, ModelHyperParams.d_model)
+ if len(feed_dict_list) <= idx:
+ feed_dict_list.append(pos_enc_tables)
+ else:
+ feed_dict_list[idx] = dict(
+ list(pos_enc_tables.items()) + list(feed_dict_list[idx]
+ .items()))
+
+ return feed_dict_list if len(feed_dict_list) == count else None
+
+
+def py_reader_provider_wrapper(data_reader, place):
+ """
+ Data provider needed by fluid.layers.py_reader.
+ """
+
+ def py_reader_provider():
+ data_input_names = encoder_data_input_fields + \
+ decoder_data_input_fields[:-1] + label_data_input_fields
+ for batch_id, data in enumerate(data_reader()):
+ data_input_dict, num_token = prepare_batch_input(
+ data, data_input_names, ModelHyperParams.eos_idx,
+ ModelHyperParams.eos_idx, ModelHyperParams.n_head,
+ ModelHyperParams.d_model)
+ yield [data_input_dict[item] for item in data_input_names]
+
+ return py_reader_provider
+
+
+def test_context(exe, train_exe, dev_count):
+ # Context to do validation.
+ test_prog = fluid.Program()
+ startup_prog = fluid.Program()
+ if args.enable_ce:
+ test_prog.random_seed = 1000
+ startup_prog.random_seed = 1000
+ with fluid.program_guard(test_prog, startup_prog):
+ with fluid.unique_name.guard():
+ sum_cost, avg_cost, predict, token_num, pyreader = transformer(
+ ModelHyperParams.src_vocab_size,
+ ModelHyperParams.trg_vocab_size,
+ ModelHyperParams.max_length + 1,
+ ModelHyperParams.n_layer,
+ ModelHyperParams.n_head,
+ ModelHyperParams.d_key,
+ ModelHyperParams.d_value,
+ ModelHyperParams.d_model,
+ ModelHyperParams.d_inner_hid,
+ ModelHyperParams.prepostprocess_dropout,
+ ModelHyperParams.attention_dropout,
+ ModelHyperParams.relu_dropout,
+ ModelHyperParams.preprocess_cmd,
+ ModelHyperParams.postprocess_cmd,
+ ModelHyperParams.weight_sharing,
+ TrainTaskConfig.label_smooth_eps,
+ use_py_reader=args.use_py_reader,
+ is_test=True)
+ test_prog = test_prog.clone(for_test=True)
+ test_data = prepare_data_generator(
+ args,
+ is_test=True,
+ count=dev_count,
+ pyreader=pyreader,
+ py_reader_provider_wrapper=py_reader_provider_wrapper)
+
+ exe.run(startup_prog) # to init pyreader for testing
+ if TrainTaskConfig.ckpt_path:
+ fluid.io.load_persistables(
+ exe, TrainTaskConfig.ckpt_path, main_program=test_prog)
+
+ exec_strategy = fluid.ExecutionStrategy()
+ exec_strategy.use_experimental_executor = True
+ build_strategy = fluid.BuildStrategy()
+ test_exe = fluid.ParallelExecutor(
+ use_cuda=TrainTaskConfig.use_gpu,
+ main_program=test_prog,
+ build_strategy=build_strategy,
+ exec_strategy=exec_strategy,
+ share_vars_from=train_exe)
+
+ def test(exe=test_exe, pyreader=pyreader):
+ test_total_cost = 0
+ test_total_token = 0
+
+ if args.use_py_reader:
+ pyreader.start()
+ data_generator = None
+ else:
+ data_generator = test_data()
+ while True:
+ try:
+ feed_dict_list = prepare_feed_dict_list(data_generator, False,
+ dev_count)
+ outs = test_exe.run(fetch_list=[sum_cost.name, token_num.name],
+ feed=feed_dict_list)
+ except (StopIteration, fluid.core.EOFException):
+ # The current pass is over.
+ if args.use_py_reader:
+ pyreader.reset()
+ break
+ sum_cost_val, token_num_val = np.array(outs[0]), np.array(outs[1])
+ test_total_cost += sum_cost_val.sum()
+ test_total_token += token_num_val.sum()
+ test_avg_cost = test_total_cost / test_total_token
+ test_ppl = np.exp([min(test_avg_cost, 100)])
+ return test_avg_cost, test_ppl
+
+ return test
+
+
+def train_loop(exe,
+ train_prog,
+ startup_prog,
+ dev_count,
+ sum_cost,
+ avg_cost,
+ token_num,
+ predict,
+ pyreader,
+ nccl2_num_trainers=1,
+ nccl2_trainer_id=0):
+ # Initialize the parameters.
+ if TrainTaskConfig.ckpt_path:
+ exe.run(startup_prog) # to init pyreader for training
+ logging.info("load checkpoint from {}".format(
+ TrainTaskConfig.ckpt_path))
+ fluid.io.load_persistables(
+ exe, TrainTaskConfig.ckpt_path, main_program=train_prog)
+ else:
+ logging.info("init fluid.framework.default_startup_program")
+ exe.run(startup_prog)
+
+ logging.info("begin reader")
+ train_data = prepare_data_generator(
+ args,
+ is_test=False,
+ count=dev_count,
+ pyreader=pyreader,
+ py_reader_provider_wrapper=py_reader_provider_wrapper)
+
+ # For faster executor
+ exec_strategy = fluid.ExecutionStrategy()
+ exec_strategy.use_experimental_executor = True
+ exec_strategy.num_iteration_per_drop_scope = int(args.fetch_steps)
+ build_strategy = fluid.BuildStrategy()
+ # Since the token number differs among devices, customize the gradient scale
+ # to use the token-averaged cost across devices; the gradient scale is then
+ # `1 / token_number` for the average cost.
+ # build_strategy.gradient_scale_strategy = fluid.BuildStrategy.GradientScaleStrategy.Customized
+
+ logging.info("begin executor")
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=TrainTaskConfig.use_gpu,
+ loss_name=avg_cost.name,
+ main_program=train_prog,
+ build_strategy=build_strategy,
+ exec_strategy=exec_strategy,
+ num_trainers=nccl2_num_trainers,
+ trainer_id=nccl2_trainer_id)
+
+ if args.val_file_pattern is not None:
+ test = test_context(exe, train_exe, dev_count)
+
+ # the best cross-entropy value with label smoothing
+ loss_normalizer = -((1. - TrainTaskConfig.label_smooth_eps) * np.log(
+ (1. - TrainTaskConfig.label_smooth_eps
+ )) + TrainTaskConfig.label_smooth_eps *
+ np.log(TrainTaskConfig.label_smooth_eps / (
+ ModelHyperParams.trg_vocab_size - 1) + 1e-20))
+
+ step_idx = 0
+ init_flag = True
+
+ logging.info("begin train")
+ for pass_id in six.moves.xrange(TrainTaskConfig.pass_num):
+ pass_start_time = time.time()
+
+ if args.use_py_reader:
+ pyreader.start()
+ data_generator = None
+ else:
+ data_generator = train_data()
+
+ batch_id = 0
+ while True:
+ try:
+ feed_dict_list = prepare_feed_dict_list(data_generator,
+ init_flag, dev_count)
+ outs = train_exe.run(
+ fetch_list=[sum_cost.name, token_num.name]
+ if step_idx % args.fetch_steps == 0 else [],
+ feed=feed_dict_list)
+
+ if step_idx % args.fetch_steps == 0:
+ sum_cost_val, token_num_val = np.array(outs[0]), np.array(
+ outs[1])
+ # sum the cost from multi-devices
+ total_sum_cost = sum_cost_val.sum()
+ total_token_num = token_num_val.sum()
+ total_avg_cost = total_sum_cost / total_token_num
+
+ if step_idx == 0:
+ logging.info(
+ "step_idx: %d, epoch: %d, batch: %d, avg loss: %f, "
+ "normalized loss: %f, ppl: %f" %
+ (step_idx, pass_id, batch_id, total_avg_cost,
+ total_avg_cost - loss_normalizer,
+ np.exp([min(total_avg_cost, 100)])))
+ avg_batch_time = time.time()
+ else:
+ logging.info(
+ "step_idx: %d, epoch: %d, batch: %d, avg loss: %f, "
+ "normalized loss: %f, ppl: %f, speed: %.2f step/s" %
+ (step_idx, pass_id, batch_id, total_avg_cost,
+ total_avg_cost - loss_normalizer,
+ np.exp([min(total_avg_cost, 100)]),
+ args.fetch_steps / (time.time() - avg_batch_time)))
+ avg_batch_time = time.time()
+
+ if step_idx % TrainTaskConfig.save_freq == 0 and step_idx > 0:
+ fluid.io.save_persistables(
+ exe,
+ os.path.join(TrainTaskConfig.ckpt_dir,
+ "latest.checkpoint"), train_prog)
+ fluid.io.save_params(
+ exe,
+ os.path.join(TrainTaskConfig.model_dir,
+ "iter_" + str(step_idx) + ".infer.model"),
+ train_prog)
+
+ init_flag = False
+ batch_id += 1
+ step_idx += 1
+ except (StopIteration, fluid.core.EOFException):
+ # The current pass is over.
+ if args.use_py_reader:
+ pyreader.reset()
+ break
+
+ time_consumed = time.time() - pass_start_time
+ # Validate and save the persistable.
+ if args.val_file_pattern is not None:
+ val_avg_cost, val_ppl = test()
+ logging.info(
+ "epoch: %d, val avg loss: %f, val normalized loss: %f, val ppl: %f,"
+ " consumed %fs" % (pass_id, val_avg_cost,
+ val_avg_cost - loss_normalizer, val_ppl,
+ time_consumed))
+ else:
+ logging.info("epoch: %d, consumed %fs" % (pass_id, time_consumed))
+ if not args.enable_ce:
+ fluid.io.save_persistables(
+ exe,
+ os.path.join(TrainTaskConfig.ckpt_dir,
+ "pass_" + str(pass_id) + ".checkpoint"),
+ train_prog)
+
+ if args.enable_ce: # For CE
+ print("kpis\ttrain_cost_card%d\t%f" % (dev_count, total_avg_cost))
+ if args.val_file_pattern is not None:
+ print("kpis\ttest_cost_card%d\t%f" % (dev_count, val_avg_cost))
+ print("kpis\ttrain_duration_card%d\t%f" % (dev_count, time_consumed))
+
+
+def train(args):
+ # priority: ENV > args > config
+ is_local = os.getenv("PADDLE_IS_LOCAL", "1")
+ if is_local == '0':
+ args.local = False
+ logging.info(args)
+
+ if args.device == 'CPU':
+ TrainTaskConfig.use_gpu = False
+
+ training_role = os.getenv("TRAINING_ROLE", "TRAINER")
+
+ if training_role == "PSERVER" or (not TrainTaskConfig.use_gpu):
+ place = fluid.CPUPlace()
+ dev_count = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
+ else:
+ place = fluid.CUDAPlace(0)
+ dev_count = fluid.core.get_cuda_device_count()
+
+ exe = fluid.Executor(place)
+
+ train_prog = fluid.Program()
+ startup_prog = fluid.Program()
+
+ if args.enable_ce:
+ train_prog.random_seed = 1000
+ startup_prog.random_seed = 1000
+
+ with fluid.program_guard(train_prog, startup_prog):
+ with fluid.unique_name.guard():
+ sum_cost, avg_cost, predict, token_num, pyreader = transformer(
+ ModelHyperParams.src_vocab_size,
+ ModelHyperParams.trg_vocab_size,
+ ModelHyperParams.max_length + 1,
+ ModelHyperParams.n_layer,
+ ModelHyperParams.n_head,
+ ModelHyperParams.d_key,
+ ModelHyperParams.d_value,
+ ModelHyperParams.d_model,
+ ModelHyperParams.d_inner_hid,
+ ModelHyperParams.prepostprocess_dropout,
+ ModelHyperParams.attention_dropout,
+ ModelHyperParams.relu_dropout,
+ ModelHyperParams.preprocess_cmd,
+ ModelHyperParams.postprocess_cmd,
+ ModelHyperParams.weight_sharing,
+ TrainTaskConfig.label_smooth_eps,
+ use_py_reader=args.use_py_reader,
+ is_test=False)
+
+ optimizer = None
+ if args.sync:
+ lr_decay = fluid.layers.learning_rate_scheduler.noam_decay(
+ ModelHyperParams.d_model, TrainTaskConfig.warmup_steps)
+ logging.info("before adam")
+
+ with fluid.default_main_program()._lr_schedule_guard():
+ learning_rate = lr_decay * TrainTaskConfig.learning_rate
+
+ optimizer = fluid.optimizer.Adam(
+ learning_rate=learning_rate,
+ beta1=TrainTaskConfig.beta1,
+ beta2=TrainTaskConfig.beta2,
+ epsilon=TrainTaskConfig.eps)
+ else:
+ optimizer = fluid.optimizer.SGD(0.003)
+ optimizer.minimize(avg_cost)
+
+ if args.use_mem_opt:
+ fluid.memory_optimize(train_prog)
+
+ if args.local:
+ logging.info("local start_up:")
+ train_loop(exe, train_prog, startup_prog, dev_count, sum_cost, avg_cost,
+ token_num, predict, pyreader)
+ else:
+ if args.update_method == "nccl2":
+ trainer_id = int(os.getenv("PADDLE_TRAINER_ID", "0"))
+ port = os.getenv("PADDLE_PORT")
+ worker_ips = os.getenv("PADDLE_TRAINERS")
+ worker_endpoints = []
+ for ip in worker_ips.split(","):
+ worker_endpoints.append(':'.join([ip, port]))
+ trainers_num = len(worker_endpoints)
+ current_endpoint = os.getenv("POD_IP") + ":" + port
+ if trainer_id == 0:
+ logging.info("train_id == 0, sleep 60s")
+ time.sleep(60)
+ logging.info("trainers_num:{}".format(trainers_num))
+ logging.info("worker_endpoints:{}".format(worker_endpoints))
+ logging.info("current_endpoint:{}".format(current_endpoint))
+ append_nccl2_prepare(startup_prog, trainer_id, worker_endpoints,
+ current_endpoint)
+ train_loop(exe, train_prog, startup_prog, dev_count, sum_cost,
+ avg_cost, token_num, predict, pyreader, trainers_num,
+ trainer_id)
+ return
+
+ port = os.getenv("PADDLE_PORT", "6174")
+ pserver_ips = os.getenv("PADDLE_PSERVERS") # ip,ip...
+ eplist = []
+ for ip in pserver_ips.split(","):
+ eplist.append(':'.join([ip, port]))
+ pserver_endpoints = ",".join(eplist) # ip:port,ip:port...
+ trainers = int(os.getenv("PADDLE_TRAINERS_NUM", "0"))
+ current_endpoint = os.getenv("POD_IP") + ":" + port
+ trainer_id = int(os.getenv("PADDLE_TRAINER_ID"))
+
+ logging.info("pserver_endpoints:{}".format(pserver_endpoints))
+ logging.info("current_endpoint:{}".format(current_endpoint))
+ logging.info("trainer_id:{}".format(trainer_id))
+ logging.info("pserver_ips:{}".format(pserver_ips))
+ logging.info("port:{}".format(port))
+
+ t = fluid.DistributeTranspiler()
+ t.transpile(
+ trainer_id,
+ pservers=pserver_endpoints,
+ trainers=trainers,
+ program=train_prog,
+ startup_program=startup_prog)
+
+ if training_role == "PSERVER":
+ logging.info("distributed: pserver started")
+ current_endpoint = os.getenv("POD_IP") + ":" + os.getenv(
+ "PADDLE_PORT")
+ if not current_endpoint:
+ logging.critical("need env SERVER_ENDPOINT")
+ exit(1)
+ pserver_prog = t.get_pserver_program(current_endpoint)
+ pserver_startup = t.get_startup_program(current_endpoint,
+ pserver_prog)
+
+ exe.run(pserver_startup)
+ exe.run(pserver_prog)
+ elif training_role == "TRAINER":
+ logging.info("distributed: trainer started")
+ trainer_prog = t.get_trainer_program()
+
+ train_loop(exe, train_prog, startup_prog, dev_count, sum_cost,
+ avg_cost, token_num, predict, pyreader)
+ else:
+ logging.critical(
+ "environment var TRAINER_ROLE should be TRAINER os PSERVER")
+ exit(1)
+
+
+if __name__ == "__main__":
+ LOG_FORMAT = "[%(asctime)s %(levelname)s %(filename)s:%(lineno)d] %(message)s"
+ logging.basicConfig(
+ stream=sys.stdout, level=logging.DEBUG, format=LOG_FORMAT)
+ logging.getLogger().setLevel(logging.INFO)
+
+ args = parse_args()
+ train(args)
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/.run_ce.sh b/PaddleNLP/sequence_tagging_for_ner/.run_ce.sh
similarity index 100%
rename from fluid/PaddleNLP/sequence_tagging_for_ner/.run_ce.sh
rename to PaddleNLP/sequence_tagging_for_ner/.run_ce.sh
diff --git a/PaddleNLP/sequence_tagging_for_ner/README.md b/PaddleNLP/sequence_tagging_for_ner/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..6d4efa9eb19dd708a87d4883dccef5ecb5e11666
--- /dev/null
+++ b/PaddleNLP/sequence_tagging_for_ner/README.md
@@ -0,0 +1,116 @@
+# Named Entity Recognition
+
+The following is a brief overview of this example's directory structure:
+
+```text
+.
+├── data              # data this example depends on, fetched externally
+├── network_conf.py   # model definition
+├── reader.py         # data reading interface, fetched externally
+├── README.md         # this document
+├── train.py          # training script
+├── infer.py          # inference script
+├── utils.py          # common utility functions, fetched externally
+└── utils_extend.py   # extensions to utils.py
+```
+
+
+## Introduction and Model Details
+
+The PaddlePaddle v2 example [Named Entity Recognition](https://github.com/PaddlePaddle/models/blob/develop/legacy/sequence_tagging_for_ner/README.md) gives a detailed introduction to the NER task, which is not repeated here.
+For the model, we keep the structure of the v2 version; the only difference is that we use an LSTM in place of the original RNN.
+
+## Data Acquisition
+
+For the complete dataset, please follow the instructions in the PaddlePaddle v2 example [Named Entity Recognition](https://github.com/PaddlePaddle/models/blob/develop/legacy/sequence_tagging_for_ner/README.md). The sample data for this example can likewise be obtained by running data/download.sh.
+
+## Training
+
+1. Run `sh data/download.sh`.
+2. Modify the `main` function in `train.py` to specify the data paths:
+
+ ```python
+ main(
+ train_data_file="data/train",
+ test_data_file="data/test",
+ vocab_file="data/vocab.txt",
+ target_file="data/target.txt",
+ emb_file="data/wordVectors.txt",
+ model_save_dir="models",
+ num_passes=1000,
+ use_gpu=False,
+ parallel=False)
+ ```
+
+3. Run `python train.py`. **Note: running it directly uses the sample data; please replace it with real labeled data.**
+
+ ```text
+ Pass 127, Batch 9525, Cost 4.0867705, Precision 0.3954984, Recall 0.37846154, F1_score0.38679245
+ Pass 127, Batch 9530, Cost 3.137265, Precision 0.42971888, Recall 0.38351256, F1_score0.405303
+ Pass 127, Batch 9535, Cost 3.6240938, Precision 0.4272152, Recall 0.41795665, F1_score0.4225352
+ Pass 127, Batch 9540, Cost 3.5352352, Precision 0.48464164, Recall 0.4536741, F1_score0.46864685
+ Pass 127, Batch 9545, Cost 4.1130385, Precision 0.40131578, Recall 0.3836478, F1_score0.39228293
+ Pass 127, Batch 9550, Cost 3.6826708, Precision 0.43333334, Recall 0.43730888, F1_score0.43531203
+ Pass 127, Batch 9555, Cost 3.6363933, Precision 0.42424244, Recall 0.3962264, F1_score0.4097561
+ Pass 127, Batch 9560, Cost 3.6101768, Precision 0.51363635, Recall 0.353125, F1_score0.41851854
+ Pass 127, Batch 9565, Cost 3.5935276, Precision 0.5152439, Recall 0.5, F1_score0.5075075
+ Pass 127, Batch 9570, Cost 3.4987144, Precision 0.5, Recall 0.4330218, F1_score0.46410686
+ Pass 127, Batch 9575, Cost 3.4659843, Precision 0.39864865, Recall 0.38064516, F1_score0.38943896
+ Pass 127, Batch 9580, Cost 3.1702557, Precision 0.5, Recall 0.4490446, F1_score0.47315437
+ Pass 127, Batch 9585, Cost 3.1587276, Precision 0.49377593, Recall 0.4089347, F1_score0.4473684
+ Pass 127, Batch 9590, Cost 3.5043538, Precision 0.4556962, Recall 0.4600639, F1_score0.45786962
+ Pass 127, Batch 9595, Cost 2.981989, Precision 0.44981414, Recall 0.45149255, F1_score0.4506518
+ [TrainSet] pass_id:127 pass_precision:[0.46023396] pass_recall:[0.43197003] pass_f1_score:[0.44565433]
+ [TestSet] pass_id:127 pass_precision:[0.4708409] pass_recall:[0.47971722] pass_f1_score:[0.4752376]
+ ```
+## Inference
+1. Modify the `infer` function in [infer.py](./infer.py) to specify the path of the model to test, the test data, the vocabulary file, and the target label file. The default arguments are as follows:
+
+ ```python
+ infer(
+ model_path="models/params_pass_0",
+ batch_size=6,
+ test_data_file="data/test",
+ vocab_file="data/vocab.txt",
+ target_file="data/target.txt",
+ use_gpu=False
+ )
+ ```
+
+2. Run `python infer.py` in a terminal to start testing; you will see predictions like the following (partial results from a model trained for 70 passes):
+
+ ```text
+ leicestershire B-ORG B-LOC
+ extended O O
+ their O O
+ first O O
+ innings O O
+ by O O
+ DGDG O O
+ runs O O
+ before O O
+ being O O
+ bowled O O
+ out O O
+ for O O
+ 296 O O
+ with O O
+ england B-LOC B-LOC
+ discard O O
+ andy B-PER B-PER
+ caddick I-PER I-PER
+ taking O O
+ three O O
+ for O O
+ DGDG O O
+ . O O
+ ```
+
+ The output has three columns separated by "\t": the input word, the gold-standard tag, and the predicted tag. Input sequences are separated by blank lines; a small parsing sketch follows below.
+
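+A minimal sketch for parsing this output back into sentences (assuming the
+predictions were redirected to a file, hypothetically named `pred.txt`):
+
+```python
+def load_predictions(path):
+    """Parse word<TAB>gold<TAB>pred lines; blank lines separate sentences."""
+    sentences, current = [], []
+    with open(path) as f:
+        for line in f:
+            line = line.rstrip("\n")
+            if not line:  # a blank line ends the current sentence
+                if current:
+                    sentences.append(current)
+                    current = []
+                continue
+            word, gold, pred = line.split("\t")
+            current.append((word, gold, pred))
+    if current:
+        sentences.append(current)
+    return sentences
+
+sentences = load_predictions("pred.txt")
+print(len(sentences), sentences[0][:3])
+```
+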
+## Example Results
+
+<p align="center">
+<img src="imgs/convergence_curve.png" width="620"/>
+</p>
+
+Figure 1. Learning curve: the x-axis is the number of training passes; the y-axis is the F1 score.
+
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/_ce.py b/PaddleNLP/sequence_tagging_for_ner/_ce.py
similarity index 100%
rename from fluid/PaddleNLP/sequence_tagging_for_ner/_ce.py
rename to PaddleNLP/sequence_tagging_for_ner/_ce.py
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/data/download.sh b/PaddleNLP/sequence_tagging_for_ner/data/download.sh
similarity index 100%
rename from fluid/PaddleNLP/sequence_tagging_for_ner/data/download.sh
rename to PaddleNLP/sequence_tagging_for_ner/data/download.sh
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/data/target.txt b/PaddleNLP/sequence_tagging_for_ner/data/target.txt
similarity index 100%
rename from fluid/PaddleNLP/sequence_tagging_for_ner/data/target.txt
rename to PaddleNLP/sequence_tagging_for_ner/data/target.txt
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/data/test b/PaddleNLP/sequence_tagging_for_ner/data/test
similarity index 100%
rename from fluid/PaddleNLP/sequence_tagging_for_ner/data/test
rename to PaddleNLP/sequence_tagging_for_ner/data/test
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/data/train b/PaddleNLP/sequence_tagging_for_ner/data/train
similarity index 100%
rename from fluid/PaddleNLP/sequence_tagging_for_ner/data/train
rename to PaddleNLP/sequence_tagging_for_ner/data/train
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/imgs/convergence_curve.png b/PaddleNLP/sequence_tagging_for_ner/imgs/convergence_curve.png
similarity index 100%
rename from fluid/PaddleNLP/sequence_tagging_for_ner/imgs/convergence_curve.png
rename to PaddleNLP/sequence_tagging_for_ner/imgs/convergence_curve.png
diff --git a/PaddleNLP/sequence_tagging_for_ner/infer.py b/PaddleNLP/sequence_tagging_for_ner/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..20f00ccda5dcaafeccf5435ae32db0d5d2dcbd6a
--- /dev/null
+++ b/PaddleNLP/sequence_tagging_for_ner/infer.py
@@ -0,0 +1,72 @@
+from __future__ import print_function
+
+import numpy as np
+import six
+
+import paddle
+import paddle.fluid as fluid
+
+from network_conf import ner_net
+import reader
+from utils import load_dict, load_reverse_dict
+from utils_extend import to_lodtensor
+
+
+def infer(model_path, batch_size, test_data_file, vocab_file, target_file,
+ use_gpu):
+ """
+ Use the model under model_path to predict on the test data; the results
+ are printed to the screen.
+
+ Returns nothing.
+ """
+ word_dict = load_dict(vocab_file)
+ word_reverse_dict = load_reverse_dict(vocab_file)
+
+ label_dict = load_dict(target_file)
+ label_reverse_dict = load_reverse_dict(target_file)
+
+ test_data = paddle.batch(
+ reader.data_reader(test_data_file, word_dict, label_dict),
+ batch_size=batch_size)
+ place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+
+ inference_scope = fluid.core.Scope()
+ with fluid.scope_guard(inference_scope):
+ [inference_program, feed_target_names,
+ fetch_targets] = fluid.io.load_inference_model(model_path, exe)
+ for data in test_data():
+ word = to_lodtensor([x[0] for x in data], place)
+ mark = to_lodtensor([x[1] for x in data], place)
+ crf_decode = exe.run(
+ inference_program,
+ feed={"word": word,
+ "mark": mark},
+ fetch_list=fetch_targets,
+ return_numpy=False)
+ lod_info = (crf_decode[0].lod())[0]
+ np_data = np.array(crf_decode[0])
+ assert len(data) == len(lod_info) - 1
+ for sen_index in six.moves.xrange(len(data)):
+ assert len(data[sen_index][0]) == lod_info[
+ sen_index + 1] - lod_info[sen_index]
+ word_index = 0
+ for tag_index in six.moves.xrange(lod_info[sen_index],
+ lod_info[sen_index + 1]):
+ word = word_reverse_dict[data[sen_index][0][word_index]]
+ gold_tag = label_reverse_dict[data[sen_index][2][
+ word_index]]
+ tag = label_reverse_dict[np_data[tag_index][0]]
+ print(word + "\t" + gold_tag + "\t" + tag)
+ word_index += 1
+ print("")
+
+
+if __name__ == "__main__":
+ infer(
+ model_path="models/params_pass_0",
+ batch_size=6,
+ test_data_file="data/test",
+ vocab_file="data/vocab.txt",
+ target_file="data/target.txt",
+ use_gpu=False)
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/network_conf.py b/PaddleNLP/sequence_tagging_for_ner/network_conf.py
similarity index 100%
rename from fluid/PaddleNLP/sequence_tagging_for_ner/network_conf.py
rename to PaddleNLP/sequence_tagging_for_ner/network_conf.py
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/reader.py b/PaddleNLP/sequence_tagging_for_ner/reader.py
similarity index 100%
rename from fluid/PaddleNLP/sequence_tagging_for_ner/reader.py
rename to PaddleNLP/sequence_tagging_for_ner/reader.py
diff --git a/PaddleNLP/sequence_tagging_for_ner/train.py b/PaddleNLP/sequence_tagging_for_ner/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..68e621371e09b654007134c8ce449e3491b9516f
--- /dev/null
+++ b/PaddleNLP/sequence_tagging_for_ner/train.py
@@ -0,0 +1,159 @@
+from __future__ import print_function
+
+import os
+import math
+import time
+import numpy as np
+import six
+
+import paddle
+import paddle.fluid as fluid
+
+import reader
+from network_conf import ner_net
+from utils import logger, load_dict
+from utils_extend import to_lodtensor, get_embedding
+
+
+def test(exe, chunk_evaluator, inference_program, test_data, test_fetch_list,
+ place):
+ chunk_evaluator.reset()
+ for data in test_data():
+ word = to_lodtensor([x[0] for x in data], place)
+ mark = to_lodtensor([x[1] for x in data], place)
+ target = to_lodtensor([x[2] for x in data], place)
+ rets = exe.run(inference_program,
+ feed={"word": word,
+ "mark": mark,
+ "target": target},
+ fetch_list=test_fetch_list)
+ num_infer = np.array(rets[0])
+ num_label = np.array(rets[1])
+ num_correct = np.array(rets[2])
+ chunk_evaluator.update(num_infer[0].astype('int64'),
+ num_label[0].astype('int64'),
+ num_correct[0].astype('int64'))
+ return chunk_evaluator.eval()
+
+
+def main(train_data_file,
+ test_data_file,
+ vocab_file,
+ target_file,
+ emb_file,
+ model_save_dir,
+ num_passes,
+ use_gpu,
+ parallel,
+ batch_size=200):
+ if not os.path.exists(model_save_dir):
+ os.mkdir(model_save_dir)
+
+ word_dict = load_dict(vocab_file)
+ label_dict = load_dict(target_file)
+
+ word_vector_values = get_embedding(emb_file)
+
+ word_dict_len = len(word_dict)
+ label_dict_len = len(label_dict)
+
+ if "CE_MODE_X" in os.environ:
+ fluid.default_startup_program().random_seed = 110
+
+ avg_cost, feature_out, word, mark, target = ner_net(
+ word_dict_len, label_dict_len, parallel)
+
+ crf_decode = fluid.layers.crf_decoding(
+ input=feature_out, param_attr=fluid.ParamAttr(name='crfw'))
+
+ (precision, recall, f1_score, num_infer_chunks, num_label_chunks,
+ num_correct_chunks) = fluid.layers.chunk_eval(
+ input=crf_decode,
+ label=target,
+ chunk_scheme="IOB",
+ num_chunk_types=int(math.ceil((label_dict_len - 1) / 2.0)))
+ chunk_evaluator = fluid.metrics.ChunkEvaluator()
+
+ inference_program = fluid.default_main_program().clone(for_test=True)
+ test_fetch_list = [num_infer_chunks, num_label_chunks, num_correct_chunks]
+ sgd_optimizer = fluid.optimizer.SGD(learning_rate=1e-3)
+ sgd_optimizer.minimize(avg_cost)
+
+ if "CE_MODE_X" not in os.environ:
+ train_reader = paddle.batch(
+ paddle.reader.shuffle(
+ reader.data_reader(train_data_file, word_dict, label_dict),
+ buf_size=20000),
+ batch_size=batch_size)
+ test_reader = paddle.batch(
+ paddle.reader.shuffle(
+ reader.data_reader(test_data_file, word_dict, label_dict),
+ buf_size=20000),
+ batch_size=batch_size)
+ else:
+ train_reader = paddle.batch(
+ reader.data_reader(train_data_file, word_dict, label_dict),
+ batch_size=batch_size)
+ test_reader = paddle.batch(
+ reader.data_reader(test_data_file, word_dict, label_dict),
+ batch_size=batch_size)
+
+ place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
+ feeder = fluid.DataFeeder(feed_list=[word, mark, target], place=place)
+ exe = fluid.Executor(place)
+
+ exe.run(fluid.default_startup_program())
+
+ embedding_name = 'emb'
+ embedding_param = fluid.global_scope().find_var(embedding_name).get_tensor()
+ embedding_param.set(word_vector_values, place)
+
+ time_begin = time.time()
+ for pass_id in six.moves.xrange(num_passes):
+ chunk_evaluator.reset()
+ for batch_id, data in enumerate(train_reader()):
+ cost_var, nums_infer, nums_label, nums_correct = exe.run(
+ fluid.default_main_program(),
+ feed=feeder.feed(data),
+ fetch_list=[
+ avg_cost, num_infer_chunks, num_label_chunks,
+ num_correct_chunks
+ ])
+ if batch_id % 5 == 0:
+ print("Pass " + str(pass_id) + ", Batch " + str(batch_id) +
+ ", Cost " + str(cost_var[0]))
+ chunk_evaluator.update(nums_infer, nums_label, nums_correct)
+ pass_precision, pass_recall, pass_f1_score = chunk_evaluator.eval()
+ print("[TrainSet] pass_id:" + str(pass_id) + " pass_precision:" + str(
+ pass_precision) + " pass_recall:" + str(pass_recall) +
+ " pass_f1_score:" + str(pass_f1_score))
+
+ test_pass_precision, test_pass_recall, test_pass_f1_score = test(
+ exe, chunk_evaluator, inference_program, test_reader,
+ test_fetch_list, place)
+ print("[TestSet] pass_id:" + str(pass_id) + " pass_precision:" + str(
+ test_pass_precision) + " pass_recall:" + str(test_pass_recall) +
+ " pass_f1_score:" + str(test_pass_f1_score))
+
+ save_dirname = os.path.join(model_save_dir, "params_pass_%d" % pass_id)
+ if "CE_MODE_X" not in os.environ:
+ fluid.io.save_inference_model(save_dirname, ['word', 'mark'],
+ crf_decode, exe)
+
+ if "CE_MODE_X" in os.environ:
+ print("kpis train_precision %f" % pass_precision)
+ print("kpis test_precision %f" % test_pass_precision)
+ print("kpis train_duration %f" % (time.time() - time_begin))
+
+
+if __name__ == "__main__":
+ main(
+ train_data_file="data/train",
+ test_data_file="data/test",
+ vocab_file="data/vocab.txt",
+ target_file="data/target.txt",
+ emb_file="data/wordVectors.txt",
+ model_save_dir="models",
+ num_passes=2000,
+ use_gpu=False,
+ parallel=False)
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/utils.py b/PaddleNLP/sequence_tagging_for_ner/utils.py
similarity index 100%
rename from fluid/PaddleNLP/sequence_tagging_for_ner/utils.py
rename to PaddleNLP/sequence_tagging_for_ner/utils.py
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/utils_extend.py b/PaddleNLP/sequence_tagging_for_ner/utils_extend.py
similarity index 100%
rename from fluid/PaddleNLP/sequence_tagging_for_ner/utils_extend.py
rename to PaddleNLP/sequence_tagging_for_ner/utils_extend.py
diff --git a/fluid/PaddleNLP/text_classification/.run_ce.sh b/PaddleNLP/text_classification/.run_ce.sh
similarity index 100%
rename from fluid/PaddleNLP/text_classification/.run_ce.sh
rename to PaddleNLP/text_classification/.run_ce.sh
diff --git a/PaddleNLP/text_classification/README.md b/PaddleNLP/text_classification/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..669774bac04fe906cc5bffafa1f60de60323c806
--- /dev/null
+++ b/PaddleNLP/text_classification/README.md
@@ -0,0 +1,112 @@
+# Text Classification
+
+The following is a brief overview of this example's directory structure:
+
+```text
+.
+├── nets.py    # model definitions
+├── README.md  # this document
+├── train.py   # training script
+├── infer.py   # inference script
+└── utils.py   # common utility functions, fetched externally
+```
+
+
+## Introduction and Model Details
+
+The PaddlePaddle v2 example [Text Classification](https://github.com/PaddlePaddle/models/blob/develop/legacy/text_classification/README.md) gives a detailed introduction to the text classification task, which is not repeated here.
+For the models, we provide four common text classification networks: bow, cnn, lstm, and gru.
+
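+As a quick orientation, the following is a minimal sketch of what the bow model
+looks like in Fluid; the authoritative definition lives in [nets.py](./nets.py)
+and may differ in details such as layer sizes.
+
+```python
+import paddle.fluid as fluid
+
+def bow_net(data, label, dict_dim, emb_dim=128, hid_dim=128, class_dim=2):
+    # embedding -> sum-pool over the sequence -> two tanh FC layers -> softmax
+    emb = fluid.layers.embedding(input=data, size=[dict_dim, emb_dim])
+    bow = fluid.layers.sequence_pool(input=emb, pool_type='sum')
+    bow_tanh = fluid.layers.tanh(bow)
+    fc_1 = fluid.layers.fc(input=bow_tanh, size=hid_dim, act="tanh")
+    fc_2 = fluid.layers.fc(input=fc_1, size=hid_dim, act="tanh")
+    prediction = fluid.layers.fc(input=fc_2, size=class_dim, act="softmax")
+    cost = fluid.layers.cross_entropy(input=prediction, label=label)
+    avg_cost = fluid.layers.mean(x=cost)
+    acc = fluid.layers.accuracy(input=prediction, label=label)
+    return avg_cost, acc, prediction
+```
+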
+## Training
+
+1. Run `python train.py bow` to start training the model.
+    ```bash
+    python train.py bow    # bow selects the network; replace it with cnn, lstm, or gru as needed
+    ```
+
+2. (Optional) To customize the network structure, add it to [nets.py](./nets.py) and set the corresponding parameters in [train.py](./train.py):
+    ```python
+    def train(train_reader,     # training data
+              word_dict,        # word dictionary
+              network,          # network configuration
+              use_cuda,         # whether to use GPU
+              parallel,         # whether to train in parallel
+              save_dirname,     # directory for saving the model
+              lr=0.2,           # learning rate
+              batch_size=128,   # number of samples per batch
+              pass_num=30):     # number of training passes
+    ```
+
+## Example Training Results
+```text
+ pass_id: 0, avg_acc: 0.848040, avg_cost: 0.354073
+ pass_id: 1, avg_acc: 0.914200, avg_cost: 0.217945
+ pass_id: 2, avg_acc: 0.929800, avg_cost: 0.184302
+ pass_id: 3, avg_acc: 0.938680, avg_cost: 0.164240
+ pass_id: 4, avg_acc: 0.945120, avg_cost: 0.149150
+ pass_id: 5, avg_acc: 0.951280, avg_cost: 0.137117
+ pass_id: 6, avg_acc: 0.955360, avg_cost: 0.126434
+ pass_id: 7, avg_acc: 0.961400, avg_cost: 0.117405
+ pass_id: 8, avg_acc: 0.963560, avg_cost: 0.110070
+ pass_id: 9, avg_acc: 0.965840, avg_cost: 0.103273
+ pass_id: 10, avg_acc: 0.969800, avg_cost: 0.096314
+ pass_id: 11, avg_acc: 0.971720, avg_cost: 0.090206
+ pass_id: 12, avg_acc: 0.974800, avg_cost: 0.084970
+ pass_id: 13, avg_acc: 0.977400, avg_cost: 0.078981
+ pass_id: 14, avg_acc: 0.980000, avg_cost: 0.073685
+ pass_id: 15, avg_acc: 0.981080, avg_cost: 0.069898
+ pass_id: 16, avg_acc: 0.982080, avg_cost: 0.064923
+ pass_id: 17, avg_acc: 0.984680, avg_cost: 0.060861
+ pass_id: 18, avg_acc: 0.985840, avg_cost: 0.057095
+ pass_id: 19, avg_acc: 0.988080, avg_cost: 0.052424
+ pass_id: 20, avg_acc: 0.989160, avg_cost: 0.049059
+ pass_id: 21, avg_acc: 0.990120, avg_cost: 0.045882
+ pass_id: 22, avg_acc: 0.992080, avg_cost: 0.042140
+ pass_id: 23, avg_acc: 0.992280, avg_cost: 0.039722
+ pass_id: 24, avg_acc: 0.992840, avg_cost: 0.036607
+ pass_id: 25, avg_acc: 0.994440, avg_cost: 0.034040
+ pass_id: 26, avg_acc: 0.995000, avg_cost: 0.031501
+ pass_id: 27, avg_acc: 0.995440, avg_cost: 0.028988
+ pass_id: 28, avg_acc: 0.996240, avg_cost: 0.026639
+ pass_id: 29, avg_acc: 0.996960, avg_cost: 0.024186
+```
+
+## Inference
+1. Run `python infer.py bow_model` to start inference.
+    ```bash
+    python infer.py bow_model    # bow_model specifies the model to load
+    ```
+
+## Example Inference Results
+```text
+ model_path: bow_model/epoch0, avg_acc: 0.882800
+ model_path: bow_model/epoch1, avg_acc: 0.882360
+ model_path: bow_model/epoch2, avg_acc: 0.881400
+ model_path: bow_model/epoch3, avg_acc: 0.877800
+ model_path: bow_model/epoch4, avg_acc: 0.872920
+ model_path: bow_model/epoch5, avg_acc: 0.872640
+ model_path: bow_model/epoch6, avg_acc: 0.869960
+ model_path: bow_model/epoch7, avg_acc: 0.865160
+ model_path: bow_model/epoch8, avg_acc: 0.863680
+ model_path: bow_model/epoch9, avg_acc: 0.861200
+ model_path: bow_model/epoch10, avg_acc: 0.853520
+ model_path: bow_model/epoch11, avg_acc: 0.850400
+ model_path: bow_model/epoch12, avg_acc: 0.855960
+ model_path: bow_model/epoch13, avg_acc: 0.853480
+ model_path: bow_model/epoch14, avg_acc: 0.855960
+ model_path: bow_model/epoch15, avg_acc: 0.854120
+ model_path: bow_model/epoch16, avg_acc: 0.854160
+ model_path: bow_model/epoch17, avg_acc: 0.852240
+ model_path: bow_model/epoch18, avg_acc: 0.852320
+ model_path: bow_model/epoch19, avg_acc: 0.850280
+ model_path: bow_model/epoch20, avg_acc: 0.849760
+ model_path: bow_model/epoch21, avg_acc: 0.850160
+ model_path: bow_model/epoch22, avg_acc: 0.846800
+ model_path: bow_model/epoch23, avg_acc: 0.845440
+ model_path: bow_model/epoch24, avg_acc: 0.845640
+ model_path: bow_model/epoch25, avg_acc: 0.846200
+ model_path: bow_model/epoch26, avg_acc: 0.845880
+ model_path: bow_model/epoch27, avg_acc: 0.844880
+ model_path: bow_model/epoch28, avg_acc: 0.844680
+ model_path: bow_model/epoch29, avg_acc: 0.844960
+```
+Note: the steadily decreasing accuracy is caused by overfitting and can be ignored.
diff --git a/fluid/PaddleNLP/text_classification/_ce.py b/PaddleNLP/text_classification/_ce.py
similarity index 100%
rename from fluid/PaddleNLP/text_classification/_ce.py
rename to PaddleNLP/text_classification/_ce.py
diff --git a/PaddleNLP/text_classification/async_executor/README.md b/PaddleNLP/text_classification/async_executor/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..0e36a8be7653787852d4d04b7603cec1046f61be
--- /dev/null
+++ b/PaddleNLP/text_classification/async_executor/README.md
@@ -0,0 +1,130 @@
+# Text Classification
+
+The following is a brief overview of this example's directory structure:
+
+```text
+.
+|-- README.md          # this document
+|-- data_generator     # IMDB dataset generation tools
+|   |-- IMDB.py            # extends data_generator.py with IMDB-specific processing logic
+|   |-- build_raw_data.py  # IMDB preprocessing; its output is read by splitfile.py. Format: word word ... | label
+|   |-- data_generator.py  # data generation framework used with AsyncExecutor
+|   `-- splitfile.py       # splits the files produced by build_raw_data.py; its output is read by IMDB.py
+|-- data_generator.sh  # entry point for the IMDB dataset generation tools
+|-- data_reader.py     # data reading utility used by the inference script
+|-- infer.py           # inference script
+`-- train.py           # training script
+```
+
+## Introduction
+
+This directory contains scripts for training text classification models with fluid.AsyncExecutor. The network definitions are reused from nets.py in the parent directory.
+
+## Training
+
+1. Run `sh data_generator.sh` to download the IMDB dataset and convert it into training data that AsyncExecutor can read.
+2. Run `python train.py bow` to start training the model.
+    ```bash
+    python train.py bow    # bow selects the network; replace it with cnn, lstm, or gru as needed
+    ```
+
+3. (Optional) To customize the network structure, add it to [nets.py](../nets.py) and set the corresponding parameters in [train.py](./train.py):
+    ```python
+    def train(train_reader,     # training data
+              word_dict,        # word dictionary
+              network,          # network configuration
+              use_cuda,         # whether to use GPU
+              parallel,         # whether to train in parallel
+              save_dirname,     # directory for saving the model
+              lr=0.2,           # learning rate
+              batch_size=128,   # number of samples per batch
+              pass_num=30):     # number of training passes
+    ```
+
+## Example Training Results
+
+```text
+pass_id: 0 pass_time_cost 4.723438
+pass_id: 1 pass_time_cost 3.867186
+pass_id: 2 pass_time_cost 4.490111
+pass_id: 3 pass_time_cost 4.573296
+pass_id: 4 pass_time_cost 4.180547
+pass_id: 5 pass_time_cost 4.214476
+pass_id: 6 pass_time_cost 4.520387
+pass_id: 7 pass_time_cost 4.149485
+pass_id: 8 pass_time_cost 3.821354
+pass_id: 9 pass_time_cost 5.136178
+pass_id: 10 pass_time_cost 4.137318
+pass_id: 11 pass_time_cost 3.943429
+pass_id: 12 pass_time_cost 3.766478
+pass_id: 13 pass_time_cost 4.235983
+pass_id: 14 pass_time_cost 4.796462
+pass_id: 15 pass_time_cost 4.668116
+pass_id: 16 pass_time_cost 4.373798
+pass_id: 17 pass_time_cost 4.298131
+pass_id: 18 pass_time_cost 4.260021
+pass_id: 19 pass_time_cost 4.244411
+pass_id: 20 pass_time_cost 3.705138
+pass_id: 21 pass_time_cost 3.728070
+pass_id: 22 pass_time_cost 3.817919
+pass_id: 23 pass_time_cost 4.698598
+pass_id: 24 pass_time_cost 4.859262
+pass_id: 25 pass_time_cost 5.725732
+pass_id: 26 pass_time_cost 5.102599
+pass_id: 27 pass_time_cost 3.876582
+pass_id: 28 pass_time_cost 4.762538
+pass_id: 29 pass_time_cost 3.797759
+```
+Unlike fluid.Executor, AsyncExecutor does not print the accuracy at the end of each pass. To monitor training, set the `debug` argument of fluid.AsyncExecutor.run() to True, so that the fetch variables passed in are printed at the end of every pass:
+
+```python
+async_executor.run(
+ main_program,
+ dataset,
+ filelist,
+ thread_num,
+ [acc],
+ debug=True)
+```
+
+## Inference
+
+1. Run `python infer.py bow_model` to start inference.
+ ```python
+ python infer.py bow_model # bow_model specifies the model directory to load
+ ```
+
+## Sample inference results
+```text
+model_path: bow_model/epoch0.model, avg_acc: 0.882600
+model_path: bow_model/epoch1.model, avg_acc: 0.887920
+model_path: bow_model/epoch2.model, avg_acc: 0.886920
+model_path: bow_model/epoch3.model, avg_acc: 0.884720
+model_path: bow_model/epoch4.model, avg_acc: 0.879760
+model_path: bow_model/epoch5.model, avg_acc: 0.876920
+model_path: bow_model/epoch6.model, avg_acc: 0.874160
+model_path: bow_model/epoch7.model, avg_acc: 0.872000
+model_path: bow_model/epoch8.model, avg_acc: 0.870360
+model_path: bow_model/epoch9.model, avg_acc: 0.868480
+model_path: bow_model/epoch10.model, avg_acc: 0.867240
+model_path: bow_model/epoch11.model, avg_acc: 0.866200
+model_path: bow_model/epoch12.model, avg_acc: 0.865560
+model_path: bow_model/epoch13.model, avg_acc: 0.865160
+model_path: bow_model/epoch14.model, avg_acc: 0.864480
+model_path: bow_model/epoch15.model, avg_acc: 0.864240
+model_path: bow_model/epoch16.model, avg_acc: 0.863800
+model_path: bow_model/epoch17.model, avg_acc: 0.863520
+model_path: bow_model/epoch18.model, avg_acc: 0.862760
+model_path: bow_model/epoch19.model, avg_acc: 0.862680
+model_path: bow_model/epoch20.model, avg_acc: 0.862240
+model_path: bow_model/epoch21.model, avg_acc: 0.862280
+model_path: bow_model/epoch22.model, avg_acc: 0.862080
+model_path: bow_model/epoch23.model, avg_acc: 0.861560
+model_path: bow_model/epoch24.model, avg_acc: 0.861280
+model_path: bow_model/epoch25.model, avg_acc: 0.861160
+model_path: bow_model/epoch26.model, avg_acc: 0.861080
+model_path: bow_model/epoch27.model, avg_acc: 0.860920
+model_path: bow_model/epoch28.model, avg_acc: 0.860800
+model_path: bow_model/epoch29.model, avg_acc: 0.860760
+```
+Note: the accuracy keeps dropping in later epochs because of overfitting; this is expected and can be ignored.
diff --git a/PaddleNLP/text_classification/async_executor/data_generator.sh b/PaddleNLP/text_classification/async_executor/data_generator.sh
new file mode 100644
index 0000000000000000000000000000000000000000..7b5aad5f7609924b469d29cad8c95c7df8e75b6e
--- /dev/null
+++ b/PaddleNLP/text_classification/async_executor/data_generator.sh
@@ -0,0 +1,43 @@
+#!/usr/bin/env bash
+
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+pushd .
+cd ./data_generator
+
+if [ ! -f aclImdb_v1.tar.gz ]; then
+ wget "http://ai.stanford.edu/%7Eamaas/data/sentiment/aclImdb_v1.tar.gz"
+fi
+tar zxvf aclImdb_v1.tar.gz
+
+mkdir train_data
+python build_raw_data.py train | python splitfile.py 12 train_data
+
+mkdir test_data
+python build_raw_data.py test | python splitfile.py 12 test_data
+
+python IMDB.py train_data
+python IMDB.py test_data
+
+mv ./output_dataset/train_data ../
+mv ./output_dataset/test_data ../
+cp aclImdb/imdb.vocab ../
+
+rm -rf ./output_dataset
+rm -rf train_data
+rm -rf test_data
+rm -rf aclImdb
+popd
diff --git a/PaddleNLP/text_classification/async_executor/data_generator/IMDB.py b/PaddleNLP/text_classification/async_executor/data_generator/IMDB.py
new file mode 100644
index 0000000000000000000000000000000000000000..579df4e0e722d245cabc366ffaeeab71dbf2aa0a
--- /dev/null
+++ b/PaddleNLP/text_classification/async_executor/data_generator/IMDB.py
@@ -0,0 +1,60 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import re
+import os, sys
+sys.path.append(os.path.abspath(os.path.join('..')))
+from data_generator import MultiSlotDataGenerator
+
+
+class IMDbDataGenerator(MultiSlotDataGenerator):
+ def load_resource(self, dictfile):
+ self._vocab = {}
+ wid = 0
+ with open(dictfile) as f:
+ for line in f:
+ self._vocab[line.strip()] = wid
+ wid += 1
+ self._unk_id = len(self._vocab)
+ self._pattern = re.compile(r'(;|,|\.|\?|!|\s|\(|\))')
+
+ def process(self, line):
+ send = '|'.join(line.split('|')[:-1]).lower().replace("\n", " ").strip()
+ label = [int(line.split('|')[-1])]
+
+ words = [x for x in self._pattern.split(send) if x and x != " "]
+ feas = [
+ self._vocab[x] if x in self._vocab else self._unk_id for x in words
+ ]
+
+ return ("words", feas), ("label", label)
+
+
+imdb = IMDbDataGenerator()
+imdb.load_resource("aclImdb/imdb.vocab")
+
+# data from files
+file_names = os.listdir(sys.argv[1])
+filelist = []
+for i in range(0, len(file_names)):
+ filelist.append(os.path.join(sys.argv[1], file_names[i]))
+
+line_limit = 2500
+process_num = 24
+imdb.run_from_files(
+ filelist=filelist,
+ line_limit=line_limit,
+ process_num=process_num,
+ output_dir=('output_dataset/%s' % (sys.argv[1])))
diff --git a/PaddleNLP/text_classification/async_executor/data_generator/build_raw_data.py b/PaddleNLP/text_classification/async_executor/data_generator/build_raw_data.py
new file mode 100644
index 0000000000000000000000000000000000000000..2c0c0981c93b3b1e9231c7efe1f0b49e178c060f
--- /dev/null
+++ b/PaddleNLP/text_classification/async_executor/data_generator/build_raw_data.py
@@ -0,0 +1,62 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+"""
+Build raw data
+"""
+from __future__ import print_function
+import sys
+import os
+import random
+import re
+data_type = sys.argv[1]
+
+if not (data_type == "train" or data_type == "test"):
+ print("python %s [test/train]" % sys.argv[0], file=sys.stderr)
+ sys.exit(-1)
+
+pos_folder = "aclImdb/" + data_type + "/pos/"
+neg_folder = "aclImdb/" + data_type + "/neg/"
+
+pos_train_list = [(pos_folder + x, "1") for x in os.listdir(pos_folder)]
+neg_train_list = [(neg_folder + x, "0") for x in os.listdir(neg_folder)]
+
+all_train_list = pos_train_list + neg_train_list
+random.shuffle(all_train_list)
+
+
+def load_dict(dictfile):
+ """
+ Load word id dict
+ """
+ vocab = {}
+ wid = 0
+ with open(dictfile) as f:
+ for line in f:
+ vocab[line.strip()] = str(wid)
+ wid += 1
+ return vocab
+
+
+vocab = load_dict("aclImdb/imdb.vocab")
+unk_id = str(len(vocab))
+print("vocab size: ", len(vocab), file=sys.stderr)
+pattern = re.compile(r'(;|,|\.|\?|!|\s|\(|\))')
+
+for fitem in all_train_list:
+ label = str(fitem[1])
+ fname = fitem[0]
+ with open(fname) as f:
+ sent = f.readline().lower().replace("\n", " ").strip()
+ out_s = "%s | %s" % (sent, label)
+ print(out_s, file=sys.stdout)
diff --git a/PaddleNLP/text_classification/async_executor/data_generator/data_generator.py b/PaddleNLP/text_classification/async_executor/data_generator/data_generator.py
new file mode 100644
index 0000000000000000000000000000000000000000..70d1e1f9a020be13f43129cf26964c860ae2ce4f
--- /dev/null
+++ b/PaddleNLP/text_classification/async_executor/data_generator/data_generator.py
@@ -0,0 +1,508 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import os
+import sys
+import multiprocessing
+__all__ = ['MultiSlotDataGenerator']
+
+
+class DataGenerator(object):
+ def __init__(self):
+ self._proto_info = None
+
+ def _set_filelist(self, filelist):
+ if not isinstance(filelist, list) and not isinstance(filelist, tuple):
+ raise ValueError("filelist%s must be in list or tuple type" %
+ type(filelist))
+ if not filelist:
+ raise ValueError("filelist can not be empty")
+ self._filelist = filelist
+
+ def _set_process_num(self, process_num):
+ if not isinstance(process_num, int):
+ raise ValueError("process_num%s must be in int type" %
+ type(process_num))
+ if process_num < 1:
+ raise ValueError("process_num can not less than 1")
+ self._process_num = process_num
+
+ def _set_line_limit(self, line_limit):
+ if not isinstance(line_limit, int):
+ raise ValueError("line_limit%s must be in int type" %
+ type(line_limit))
+ if line_limit < 1:
+ raise ValueError("line_limit can not less than 1")
+ self._line_limit = line_limit
+
+ def _set_output_dir(self, output_dir):
+ if not isinstance(output_dir, str):
+ raise ValueError("output_dir%s must be in str type" %
+ type(output_dir))
+ if not output_dir:
+ raise ValueError("output_dir can not be empty")
+ self._output_dir = output_dir
+
+ def _set_output_prefix(self, output_prefix):
+ if not isinstance(output_prefix, str):
+ raise ValueError("output_prefix%s must be in str type" %
+ type(output_prefix))
+ self._output_prefix = output_prefix
+
+ def _set_output_fill_digit(self, output_fill_digit):
+ if not isinstance(output_fill_digit, int):
+ raise ValueError("output_fill_digit%s must be in int type" %
+ type(output_fill_digit))
+ if output_fill_digit < 1:
+ raise ValueError("output_fill_digit can not less than 1")
+ self._output_fill_digit = output_fill_digit
+
+ def _set_proto_filename(self, proto_filename):
+ if not isinstance(proto_filename, str):
+ raise ValueError("proto_filename%s must be in str type" %
+ type(proto_filename))
+ if not proto_filename:
+ raise ValueError("proto_filename can not be empty")
+ self._proto_filename = proto_filename
+
+ def _print_info(self):
+ '''
+ Print the configuration information
+ (called only by the run_from_files function).
+ '''
+ sys.stderr.write("=" * 16 + " config " + "=" * 16 + "\n")
+ sys.stderr.write(" filelist size: %d\n" % len(self._filelist))
+ sys.stderr.write(" process num: %d\n" % self._process_num)
+ sys.stderr.write(" line limit: %d\n" % self._line_limit)
+ sys.stderr.write(" output dir: %s\n" % self._output_dir)
+ sys.stderr.write(" output prefix: %s\n" % self._output_prefix)
+ sys.stderr.write(" output fill digit: %d\n" % self._output_fill_digit)
+ sys.stderr.write(" proto filename: %s\n" % self._proto_filename)
+ sys.stderr.write("==== This may take a few minutes... ====\n")
+
+ def _get_output_filename(self, output_index, lock=None):
+ '''
+ This function is used to get the name of the output file and
+ update output_index.
+ Args:
+ output_index(manager.Value(i)): the index of output file.
+ lock(manager.Lock): The lock for process safety.
+ Return:
+ Return the name(string) of output file.
+ '''
+ if lock is not None: lock.acquire()
+ file_index = output_index.value
+ output_index.value += 1
+ if lock is not None: lock.release()
+ filename = os.path.join(self._output_dir, self._output_prefix) \
+ + str(file_index).zfill(self._output_fill_digit)
+ sys.stderr.write("[%d] write data to file: %s\n" %
+ (os.getpid(), filename))
+ return filename
+
+ def run_from_stdin(self,
+ is_local=True,
+ hadoop_host=None,
+ hadoop_ugi=None,
+ proto_path=None,
+ proto_filename="data_feed.proto"):
+ '''
+ This function reads the data row from stdin, parses it with the
+ process function, and further parses the return value of the
+ process function with the _gen_str function. The parsed data will
+ be written to stdout and the corresponding protofile will be
+ generated. If is_local is set to False, the protofile will be
+ uploaded to hadoop.
+ Args:
+ is_local(bool): Whether to execute locally. If it is False, the
+ protofile will be uploaded to hadoop. The
+ default value is True.
+ hadoop_host(str): The host name of the hadoop. It should be
+ in this format: "hdfs://${HOST}:${PORT}".
+ hadoop_ugi(str): The ugi of the hadoop. It should be in this
+ format: "${USERNAME},${PASSWORD}".
+ proto_path(str): The hadoop path you want to upload the
+ protofile to.
+ proto_filename(str): The name of protofile. The default value
+ is "data_feed.proto". It is not
+ recommended to modify it.
+ '''
+ if is_local:
+ print \
+'''\033[1;34m=======================================================
+ Pay attention that the version of Python on Hadoop may
+ be inconsistent with the local version. Please check the
+ Python version of Hadoop to ensure that it is >= 2.7.
+=======================================================\033[0m'''
+ else:
+ if hadoop_ugi is None or \
+ hadoop_host is None or \
+ proto_path is None:
+ raise ValueError(
+ "pls set hadoop_ugi, hadoop_host, and proto_path")
+ self._set_proto_filename(proto_filename)
+ for line in sys.stdin:
+ user_parsed_line = self.process(line)
+ sys.stdout.write(self._gen_str(user_parsed_line))
+ if self._proto_info is not None:
+ # some tasks may not have processed any data
+ with open(self._proto_filename, "w") as f:
+ f.write(self._get_proto_desc(self._proto_info))
+ if is_local == False:
+ cmd = "$HADOOP_HOME/bin/hadoop fs" \
+ + " -Dhadoop.job.ugi=" + hadoop_ugi \
+ + " -Dfs.default.name=" + hadoop_host \
+ + " -put " + self._proto_filename + " " + proto_path
+ os.system(cmd)
+
+ def run_from_files(self,
+ filelist,
+ line_limit,
+ process_num=1,
+ output_dir="./output_dataset",
+ output_prefix="part-",
+ output_fill_digit=8,
+ proto_filename="data_feed.proto"):
+ '''
+ This function will run process_num processes to process the files
+ in the filelist. It will create the output data folder (output_dir)
+ in the current directory and write the processed data into it
+ (each file holds at most line_limit lines; filenames consist of
+ the output_prefix followed by an output_fill_digit-wide numeric
+ suffix). The proto_info is generated at the same time, and the
+ proto file will be named proto_filename.
+ Args:
+ filelist(list or tuple): Files that need to be processed.
+ line_limit(int): Maximum number of data stored per file.
+ process_num(int): Number of processes running simultaneously.
+ output_dir(str): The name of the folder where the output
+ data file is stored.
+ output_prefix(str): The prefix of output data file.
+ output_fill_digit(int): The number of suffix numbers of the
+ output data file.
+ proto_filename(str): The name of protofile.
+ '''
+ self._set_filelist(filelist)
+ self._set_line_limit(line_limit)
+ self._set_process_num(min(process_num, len(filelist)))
+ self._set_output_dir(output_dir)
+ self._set_output_prefix(output_prefix)
+ self._set_output_fill_digit(output_fill_digit)
+ self._set_proto_filename(proto_filename)
+ self._print_info()
+
+ if not os.path.exists(self._output_dir):
+ os.makedirs(self._output_dir)
+ elif not os.path.isdir(self._output_dir):
+ raise ValueError("%s is not a directory" % self._output_dir)
+
+ processes = multiprocessing.Pool()
+ manager = multiprocessing.Manager()
+ output_index = manager.Value('i', 0)
+ file_queue = manager.Queue()
+ lock = manager.Lock()
+ remaining_queue = manager.Queue()
+ for file in self._filelist:
+ file_queue.put(file)
+ info_result = []
+ for i in range(self._process_num):
+ info_result.append(processes.apply_async(subprocess_wrapper, \
+ (self, file_queue, remaining_queue, output_index, lock, )))
+ processes.close()
+ processes.join()
+
+ infos = [
+ result.get() for result in info_result if result.get() is not None
+ ]
+ proto_info = self._combine_infos(infos)
+ with open(os.path.join(self._output_dir, self._proto_filename),
+ "w") as f:
+ f.write(self._get_proto_desc(proto_info))
+
+ while not remaining_queue.empty():
+ with open(self._get_output_filename(output_index), "w") as f:
+ for i in range(min(self._line_limit, remaining_queue.qsize())):
+ f.write(remaining_queue.get(False))
+
+ def _subprocess(self, file_queue, remaining_queue, output_index, lock):
+ '''
+ This function will be called by multiple processes. It is used to
+ continuously fetch files from file_queue, using process() function
+ (defined by the user) and the _gen_str() function (defined by
+ concrete subclasses) to process data row by row, and writes the
+ processed data to output files (each holding self._line_limit
+ lines). If the files in the file_queue have all been consumed
+ but the current buffer is not full, the remaining data (fewer
+ than self._line_limit lines) is stored in the remaining_queue.
+ Args:
+ file_queue(manager.Queue): The queue contains all the file
+ names to be processed.
+ remaining_queue(manager.Queue): The queue contains the data that
+ is less than the self._line_limit
+ line.
+ output_index(manager.Value(i)): The index(suffix) of the
+ output file.
+ lock(manager.Lock): The lock for process safety.
+ Returns:
+ Return a proto_info which can be translated into a proto string.
+ '''
+ buffer = []
+ while not file_queue.empty():
+ try:
+ filename = file_queue.get(False)
+ except: # file_queue empty
+ break
+ with open(filename, 'r') as f:
+ for line in f:
+ buffer.append(self._gen_str(self.process(line)))
+ if len(buffer) == self._line_limit:
+ with open(
+ self._get_output_filename(output_index, lock),
+ "w") as wf:
+ for x in buffer:
+ wf.write(x)
+ buffer = []
+ if buffer:
+ for x in buffer:
+ remaining_queue.put(x)
+ return self._proto_info
+
+ def _gen_str(self, line):
+ '''
+ Further processes the output of the user-defined process()
+ function into data that can be read directly by the datafeed,
+ and updates the proto_info information.
+ Args:
+ line(str): the output of the process() function rewritten by user.
+ Returns:
+ Return a string data that can be read directly by the datafeed.
+ '''
+ raise NotImplementedError(
+ "pls use MultiSlotDataGenerator or PairWiseDataGenerator")
+
+ def _combine_infos(self, infos):
+ '''
+ This function is used to merge proto_info information from different
+ processes. In general, the proto_info of each process is consistent.
+ Args:
+ infos(list): the list of proto_infos from different processes.
+ Returns:
+ Return a unified proto_info.
+ '''
+ raise NotImplementedError(
+ "pls use MultiSlotDataGenerator or PairWiseDataGenerator")
+
+ def _get_proto_desc(self, proto_info):
+ '''
+ This function outputs the string of the proto file(can be directly
+ written to the file) according to the proto_info information.
+ Args:
+ proto_info: The proto information used to generate the proto
+ string. The type of the variable will be determined
+ by the subclass. In the MultiSlotDataGenerator,
+ proto_info variable is a list of tuple.
+ Returns:
+ Returns a string of the proto file.
+ '''
+ raise NotImplementedError(
+ "pls use MultiSlotDataGenerator or PairWiseDataGenerator")
+
+ def process(self, line):
+ '''
+ This function needs to be overridden by the user to process the
+ original data row into a list or tuple.
+ Args:
+ line(str): the original data row
+ Returns:
+ Returns the data processed by the user.
+ The data format is list or tuple:
+ [(name, [feasign, ...]), ...]
+ or ((name, [feasign, ...]), ...)
+
+ For example:
+ [("words", [1926, 08, 17]), ("label", [1])]
+ or (("words", [1926, 08, 17]), ("label", [1]))
+ Note:
+ The type of each feasign must be int or float. Once a float
+ element appears among the feasigns, the type of that slot will
+ be promoted to float.
+ '''
+ raise NotImplementedError(
+ "pls rewrite this function to return a list or tuple: " +
+ "[(name, [feasign, ...]), ...] or ((name, [feasign, ...]), ...)")
+
+
+def subprocess_wrapper(instance, file_queue, remaining_queue, output_index,
+ lock):
+ '''
+ Module-level wrapper that lets a bound method run in a worker process (bound methods cannot be pickled directly in Python 2).
+ '''
+ return instance._subprocess(file_queue, remaining_queue, output_index, lock)
+
+
+class MultiSlotDataGenerator(DataGenerator):
+ def _combine_infos(self, infos):
+ '''
+ This function is used to merge proto_info information from different
+ processes. In general, the proto_info of each process is consistent.
+ The type of input infos is list, and the type of element of infos is
+ tuple. The format of element of infos will be (name, type).
+ Args:
+ infos(list): the list of proto_infos from different processes.
+ Returns:
+ Return a unified proto_info.
+ Note:
+ This function is only called by the run_from_files function, so
+ when using the run_from_stdin function (usually used for hadoop),
+ the output of the user-defined process function must not mix
+ float and int values in the same field.
+ '''
+ proto_info = infos[0]
+ for info in infos:
+ for index, slot in enumerate(info):
+ name, type = slot
+ if name != proto_info[index][0]:
+ raise ValueError(
+ "combine infos error, pls contact the maintainer of this code~"
+ )
+ if type == "float" and proto_info[index][1] == "uint64":
+ proto_info[index] = (name, type)
+ return proto_info
+
+ def _get_proto_desc(self, proto_info):
+ '''
+ Generate a string of proto file based on the proto_info information.
+
+ The proto_info will be a list of tuples:
+ >>> [(Name, Type), ...]
+
+ The string of proto file will be in this format:
+ >>> name: "MultiSlotDataFeed"
+ >>> batch_size: 32
+ >>> multi_slot_desc {
+ >>> slots {
+ >>> name: Name
+ >>> type: Type
+ >>> is_dense: false
+ >>> is_used: false
+ >>> }
+ >>> }
+ Args:
+ proto_info(list): The proto information used to generate the
+ proto string.
+ Returns:
+ Returns a string of the proto file.
+ '''
+ proto_str = "name: \"MultiSlotDataFeed\"\n" \
+ + "batch_size: 32\nmulti_slot_desc {\n"
+ for elem in proto_info:
+ proto_str += " slots {\n" \
+ + " name: \"%s\"\n" % elem[0]\
+ + " type: \"%s\"\n" % elem[1]\
+ + " is_dense: false\n" \
+ + " is_used: false\n" \
+ + " }\n"
+ proto_str += "}"
+ return proto_str
+
+ def _gen_str(self, line):
+ '''
+ Further processes the output of the user-defined process()
+ function into data that can be read directly by the
+ MultiSlotDataFeed, and updates the proto_info information.
+ The input line will be in this format:
+ >>> [(name, [feasign, ...]), ...]
+ >>> or ((name, [feasign, ...]), ...)
+ The output will be in this format:
+ >>> [ids_num id1 id2 ...] ...
+ The proto_info will be in this format:
+ >>> [(name, type), ...]
+
+ For example, if the input is like this:
+ >>> [("words", [1926, 08, 17]), ("label", [1])]
+ >>> or (("words", [1926, 08, 17]), ("label", [1]))
+ the output will be:
+ >>> 3 1234 2345 3456 1 1
+ the proto_info will be:
+ >>> [("words", "uint64"), ("label", "uint64")]
+ Args:
+ line(str): the output of the process() function rewritten by user.
+ Returns:
+ Return a string data that can be read directly by the MultiSlotDataFeed.
+ '''
+ if not isinstance(line, list) and not isinstance(line, tuple):
+ raise ValueError(
+ "the output of process() must be in list or tuple type")
+ output = ""
+
+ if self._proto_info is None:
+ self._proto_info = []
+ for item in line:
+ name, elements = item
+ if not isinstance(name, str):
+ raise ValueError("name%s must be in str type" % type(name))
+ if not isinstance(elements, list):
+ raise ValueError("elements%s must be in list type" %
+ type(elements))
+ if not elements:
+ raise ValueError(
+ "the elements of each field can not be empty, you need padding it in process()."
+ )
+ self._proto_info.append((name, "uint64"))
+ if output:
+ output += " "
+ output += str(len(elements))
+ for elem in elements:
+ if isinstance(elem, float):
+ self._proto_info[-1] = (name, "float")
+ elif not isinstance(elem, int) and not isinstance(elem,
+ long):
+ raise ValueError(
+ "the type of element%s must be in int or float" %
+ type(elem))
+ output += " " + str(elem)
+ else:
+ if len(line) != len(self._proto_info):
+ raise ValueError(
+ "the complete field set of two given line are inconsistent.")
+ for index, item in enumerate(line):
+ name, elements = item
+ if not isinstance(name, str):
+ raise ValueError("name%s must be in str type" % type(name))
+ if not isinstance(elements, list):
+ raise ValueError("elements%s must be in list type" %
+ type(elements))
+ if not elements:
+ raise ValueError(
+ "the elements of each field can not be empty, you need padding it in process()."
+ )
+ if name != self._proto_info[index][0]:
+ raise ValueError(
+ "the field name of two given line are not match: require<%s>, get<%d>."
+ % (self._proto_info[index][0], name))
+ if output:
+ output += " "
+ output += str(len(elements))
+ for elem in elements:
+ if self._proto_info[index][1] != "float":
+ if isinstance(elem, float):
+ self._proto_info[index] = (name, "float")
+ elif not isinstance(elem, int) and not isinstance(elem,
+ long):
+ raise ValueError(
+ "the type of element%s must be in int or float"
+ % type(elem))
+ output += " " + str(elem)
+ return output + "\n"
diff --git a/PaddleNLP/text_classification/async_executor/data_generator/splitfile.py b/PaddleNLP/text_classification/async_executor/data_generator/splitfile.py
new file mode 100644
index 0000000000000000000000000000000000000000..414e097c5a6352673c00e487230e38bf64e6299a
--- /dev/null
+++ b/PaddleNLP/text_classification/async_executor/data_generator/splitfile.py
@@ -0,0 +1,29 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+"""
+Split file into parts
+"""
+import sys
+import os
+block = int(sys.argv[1])
+datadir = sys.argv[2]
+file_list = []
+for i in range(block):
+ file_list.append(open(datadir + "/part-" + str(i), "w"))
+id_ = 0
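+# distribute stdin lines round-robin across the part files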
+for line in sys.stdin:
+ file_list[id_ % block].write(line)
+ id_ += 1
+for f in file_list:
+ f.close()
diff --git a/PaddleNLP/text_classification/async_executor/data_reader.py b/PaddleNLP/text_classification/async_executor/data_reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..0cdad76295ea7d6d2eb7f2982aa5cb6b830d4976
--- /dev/null
+++ b/PaddleNLP/text_classification/async_executor/data_reader.py
@@ -0,0 +1,50 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import sys
+import os
+import paddle
+
+
+def parse_fields(fields):
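+    # Each line follows the layout produced by MultiSlotDataGenerator._gen_str:
+    #   <words_len> <word_id> ... <label_len> <label>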
+ words_width = int(fields[0])
+ words = fields[1:1 + words_width]
+ label = fields[-1]
+
+ return words, label
+
+
+def imdb_data_feed_reader(data_dir, batch_size, buf_size):
+ """
+ Data feed reader for IMDB dataset.
+ This data set has been converted from original format to a format suitable
+ for AsyncExecutor
+ See data.proto for data format
+ """
+
+ def reader():
+ for file in os.listdir(data_dir):
+ if file.endswith('.proto'):
+ continue
+
+ with open(os.path.join(data_dir, file), 'r') as f:
+ for line in f:
+ fields = line.split(' ')
+ words, label = parse_fields(fields)
+ yield words, label
+
+ test_reader = paddle.batch(
+ paddle.reader.shuffle(
+ reader, buf_size=buf_size), batch_size=batch_size)
+ return test_reader
diff --git a/PaddleNLP/text_classification/async_executor/infer.py b/PaddleNLP/text_classification/async_executor/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..5c9f53afbc992ab1804eb5995702fbcd14e7dcbf
--- /dev/null
+++ b/PaddleNLP/text_classification/async_executor/infer.py
@@ -0,0 +1,79 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import os
+import sys
+import time
+import unittest
+import contextlib
+import numpy as np
+
+import paddle
+import paddle.fluid as fluid
+
+import data_reader
+
+
+def infer(test_reader, use_cuda, model_path=None):
+ """
+ inference function
+ """
+ if model_path is None:
+ print(str(model_path) + " cannot be found")
+ return
+
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+
+ inference_scope = fluid.core.Scope()
+ with fluid.scope_guard(inference_scope):
+ [inference_program, feed_target_names,
+ fetch_targets] = fluid.io.load_inference_model(model_path, exe)
+
+ total_acc = 0.0
+ total_count = 0
+ for data in test_reader():
+ acc = exe.run(inference_program,
+ feed=utils.data2tensor(data, place),
+ fetch_list=fetch_targets,
+ return_numpy=True)
+ total_acc += acc[0] * len(data)
+ total_count += len(data)
+
+ avg_acc = total_acc / total_count
+ print("model_path: %s, avg_acc: %f" % (model_path, avg_acc))
+
+
+if __name__ == "__main__":
+ if __package__ is None:
+ from os import sys, path
+ sys.path.append(
+ os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
+ import utils
+
+ batch_size = 128
+ model_path = sys.argv[1]
+ test_data_dirname = 'test_data'
+
+ if len(sys.argv) == 3:
+ test_data_dirname = sys.argv[2]
+
+ test_reader = data_reader.imdb_data_feed_reader(
+ test_data_dirname, batch_size, buf_size=500000)
+
+ models = os.listdir(model_path)
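+    # train.py saves one checkpoint per pass: epoch0.model, epoch1.model, ...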
+ for i in range(0, len(models)):
+ epoch_path = "epoch" + str(i) + ".model"
+ epoch_path = os.path.join(model_path, epoch_path)
+ infer(test_reader, use_cuda=False, model_path=epoch_path)
diff --git a/PaddleNLP/text_classification/async_executor/train.py b/PaddleNLP/text_classification/async_executor/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..034d65dd5bf94a717f791e04b8648d9606528d6c
--- /dev/null
+++ b/PaddleNLP/text_classification/async_executor/train.py
@@ -0,0 +1,112 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import os
+import sys
+import time
+import multiprocessing
+
+import paddle
+import paddle.fluid as fluid
+
+
+def train(network, dict_dim, lr, save_dirname, training_data_dirname, pass_num,
+ thread_num, batch_size):
+ file_names = os.listdir(training_data_dirname)
+ filelist = []
+ for i in range(0, len(file_names)):
+ if file_names[i] == 'data_feed.proto':
+ continue
+ filelist.append(os.path.join(training_data_dirname, file_names[i]))
+
+ dataset = fluid.DataFeedDesc(
+ os.path.join(training_data_dirname, 'data_feed.proto'))
+ dataset.set_batch_size(
+ batch_size) # datafeed should be assigned a batch size
+ dataset.set_use_slots(['words', 'label'])
+
+ data = fluid.layers.data(
+ name="words", shape=[1], dtype="int64", lod_level=1)
+ label = fluid.layers.data(name="label", shape=[1], dtype="int64")
+
+ avg_cost, acc, prediction = network(data, label, dict_dim)
+ optimizer = fluid.optimizer.Adagrad(learning_rate=lr)
+ opt_ops, weight_and_grad = optimizer.minimize(avg_cost)
+
+ startup_program = fluid.default_startup_program()
+ main_program = fluid.default_main_program()
+
+ place = fluid.CPUPlace()
+ executor = fluid.Executor(place)
+ executor.run(startup_program)
+
+ async_executor = fluid.AsyncExecutor(place)
+ for i in range(pass_num):
+ pass_start = time.time()
+ async_executor.run(main_program,
+ dataset,
+ filelist,
+ thread_num, [acc],
+ debug=False)
+ print('pass_id: %u pass_time_cost %f' % (i, time.time() - pass_start))
+ fluid.io.save_inference_model('%s/epoch%d.model' % (save_dirname, i),
+ [data.name, label.name], [acc], executor)
+
+
+if __name__ == "__main__":
+ if __package__ is None:
+ from os import sys, path
+ sys.path.append(
+ os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
+
+ from nets import bow_net, cnn_net, lstm_net, gru_net
+ from utils import load_vocab
+
+ batch_size = 4
+ lr = 0.002
+ pass_num = 30
+ save_dirname = ""
+ thread_num = multiprocessing.cpu_count()
+
+ if sys.argv[1] == "bow":
+ network = bow_net
+ batch_size = 128
+ save_dirname = "bow_model"
+ elif sys.argv[1] == "cnn":
+ network = cnn_net
+ lr = 0.01
+ save_dirname = "cnn_model"
+ elif sys.argv[1] == "lstm":
+ network = lstm_net
+ lr = 0.05
+ save_dirname = "lstm_model"
+ elif sys.argv[1] == "gru":
+ network = gru_net
+ batch_size = 128
+ lr = 0.05
+ save_dirname = "gru_model"
+
+ training_data_dirname = 'train_data/'
+ if len(sys.argv) == 3:
+ training_data_dirname = sys.argv[2]
+
+ if len(sys.argv) == 4:
+ if thread_num >= int(sys.argv[3]):
+ thread_num = int(sys.argv[3])
+
+ vocab = load_vocab('imdb.vocab')
+ dict_dim = len(vocab)
+
+ train(network, dict_dim, lr, save_dirname, training_data_dirname, pass_num,
+ thread_num, batch_size)
diff --git a/fluid/PaddleNLP/text_classification/clouds/scdb_parallel_executor.py b/PaddleNLP/text_classification/clouds/scdb_parallel_executor.py
similarity index 100%
rename from fluid/PaddleNLP/text_classification/clouds/scdb_parallel_executor.py
rename to PaddleNLP/text_classification/clouds/scdb_parallel_executor.py
diff --git a/fluid/PaddleNLP/text_classification/clouds/scdb_single_card.py b/PaddleNLP/text_classification/clouds/scdb_single_card.py
similarity index 100%
rename from fluid/PaddleNLP/text_classification/clouds/scdb_single_card.py
rename to PaddleNLP/text_classification/clouds/scdb_single_card.py
diff --git a/fluid/PaddleNLP/text_classification/infer.py b/PaddleNLP/text_classification/infer.py
similarity index 100%
rename from fluid/PaddleNLP/text_classification/infer.py
rename to PaddleNLP/text_classification/infer.py
diff --git a/PaddleNLP/text_classification/nets.py b/PaddleNLP/text_classification/nets.py
new file mode 100644
index 0000000000000000000000000000000000000000..4a7caad99f89ae6db0a748634a7c9b0d6632f2ec
--- /dev/null
+++ b/PaddleNLP/text_classification/nets.py
@@ -0,0 +1,124 @@
+import sys
+import time
+import numpy as np
+
+import paddle
+import paddle.fluid as fluid
+
+
+def bow_net(data,
+ label,
+ dict_dim,
+ emb_dim=128,
+ hid_dim=128,
+ hid_dim2=96,
+ class_dim=2):
+ """
+ bow net
+ """
+ emb = fluid.layers.embedding(input=data, size=[dict_dim, emb_dim])
+ bow = fluid.layers.sequence_pool(input=emb, pool_type='sum')
+ bow_tanh = fluid.layers.tanh(bow)
+ fc_1 = fluid.layers.fc(input=bow_tanh, size=hid_dim, act="tanh")
+ fc_2 = fluid.layers.fc(input=fc_1, size=hid_dim2, act="tanh")
+ prediction = fluid.layers.fc(input=[fc_2], size=class_dim, act="softmax")
+ cost = fluid.layers.cross_entropy(input=prediction, label=label)
+ avg_cost = fluid.layers.mean(x=cost)
+ acc = fluid.layers.accuracy(input=prediction, label=label)
+
+ return avg_cost, acc, prediction
+
+
+def cnn_net(data,
+ label,
+ dict_dim,
+ emb_dim=128,
+ hid_dim=128,
+ hid_dim2=96,
+ class_dim=2,
+ win_size=3):
+ """
+ conv net
+ """
+ emb = fluid.layers.embedding(input=data, size=[dict_dim, emb_dim])
+
+ conv_3 = fluid.nets.sequence_conv_pool(
+ input=emb,
+ num_filters=hid_dim,
+ filter_size=win_size,
+ act="tanh",
+ pool_type="max")
+
+ fc_1 = fluid.layers.fc(input=[conv_3], size=hid_dim2)
+
+ prediction = fluid.layers.fc(input=[fc_1], size=class_dim, act="softmax")
+ cost = fluid.layers.cross_entropy(input=prediction, label=label)
+ avg_cost = fluid.layers.mean(x=cost)
+ acc = fluid.layers.accuracy(input=prediction, label=label)
+
+ return avg_cost, acc, prediction
+
+
+def lstm_net(data,
+ label,
+ dict_dim,
+ emb_dim=128,
+ hid_dim=128,
+ hid_dim2=96,
+ class_dim=2,
+ emb_lr=30.0):
+ """
+ lstm net
+ """
+ emb = fluid.layers.embedding(
+ input=data,
+ size=[dict_dim, emb_dim],
+ param_attr=fluid.ParamAttr(learning_rate=emb_lr))
+
+ fc0 = fluid.layers.fc(input=emb, size=hid_dim * 4)
+
+ lstm_h, c = fluid.layers.dynamic_lstm(
+ input=fc0, size=hid_dim * 4, is_reverse=False)
+
+ lstm_max = fluid.layers.sequence_pool(input=lstm_h, pool_type='max')
+ lstm_max_tanh = fluid.layers.tanh(lstm_max)
+
+ fc1 = fluid.layers.fc(input=lstm_max_tanh, size=hid_dim2, act='tanh')
+
+ prediction = fluid.layers.fc(input=fc1, size=class_dim, act='softmax')
+
+ cost = fluid.layers.cross_entropy(input=prediction, label=label)
+ avg_cost = fluid.layers.mean(x=cost)
+ acc = fluid.layers.accuracy(input=prediction, label=label)
+
+ return avg_cost, acc, prediction
+
+
+def gru_net(data,
+ label,
+ dict_dim,
+ emb_dim=128,
+ hid_dim=128,
+ hid_dim2=96,
+ class_dim=2,
+ emb_lr=30.0):
+ """
+ gru net
+ """
+ emb = fluid.layers.embedding(
+ input=data,
+ size=[dict_dim, emb_dim],
+ param_attr=fluid.ParamAttr(learning_rate=emb_lr))
+
+ fc0 = fluid.layers.fc(input=emb, size=hid_dim * 3)
+ gru_h = fluid.layers.dynamic_gru(input=fc0, size=hid_dim, is_reverse=False)
+ gru_max = fluid.layers.sequence_pool(input=gru_h, pool_type='max')
+ gru_max_tanh = fluid.layers.tanh(gru_max)
+ fc1 = fluid.layers.fc(input=gru_max_tanh, size=hid_dim2, act='tanh')
+ prediction = fluid.layers.fc(input=fc1, size=class_dim, act='softmax')
+
+ cost = fluid.layers.cross_entropy(input=prediction, label=label)
+ avg_cost = fluid.layers.mean(x=cost)
+ acc = fluid.layers.accuracy(input=prediction, label=label)
+
+ return avg_cost, acc, prediction
diff --git a/PaddleNLP/text_classification/train.py b/PaddleNLP/text_classification/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..a6978a15d2d58a91998b6941a438d804e3e0ee5e
--- /dev/null
+++ b/PaddleNLP/text_classification/train.py
@@ -0,0 +1,139 @@
+import os
+import six
+import sys
+import time
+import unittest
+import contextlib
+
+import paddle
+import paddle.fluid as fluid
+
+import utils
+from nets import bow_net
+from nets import cnn_net
+from nets import lstm_net
+from nets import gru_net
+
+
+def train(train_reader,
+ word_dict,
+ network,
+ use_cuda,
+ parallel,
+ save_dirname,
+ lr=0.2,
+ pass_num=30):
+ """
+ train network
+ """
+ data = fluid.layers.data(
+ name="words", shape=[1], dtype="int64", lod_level=1)
+
+ label = fluid.layers.data(name="label", shape=[1], dtype="int64")
+
+ if not parallel:
+ cost, acc, prediction = network(data, label, len(word_dict))
+ else:
+ places = fluid.layers.device.get_places(device_count=2)
+ pd = fluid.layers.ParallelDo(places)
+ with pd.do():
+ cost, acc, prediction = network(
+ pd.read_input(data), pd.read_input(label), len(word_dict))
+
+ pd.write_output(cost)
+ pd.write_output(acc)
+
+ cost, acc = pd()
+ cost = fluid.layers.mean(cost)
+ acc = fluid.layers.mean(acc)
+
+ sgd_optimizer = fluid.optimizer.Adagrad(learning_rate=lr)
+ sgd_optimizer.minimize(cost)
+
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ feeder = fluid.DataFeeder(feed_list=[data, label], place=place)
+
+ # For internal continuous evaluation
+ if "CE_MODE_X" in os.environ:
+ fluid.default_startup_program().random_seed = 110
+ exe.run(fluid.default_startup_program())
+ for pass_id in six.moves.xrange(pass_num):
+ pass_start = time.time()
+ data_size, data_count, total_acc, total_cost = 0, 0, 0.0, 0.0
+ for data in train_reader():
+ avg_cost_np, avg_acc_np = exe.run(fluid.default_main_program(),
+ feed=feeder.feed(data),
+ fetch_list=[cost, acc])
+ data_size = len(data)
+ total_acc += data_size * avg_acc_np
+ total_cost += data_size * avg_cost_np
+ data_count += data_size
+ avg_cost = total_cost / data_count
+
+ avg_acc = total_acc / data_count
+ print("pass_id: %d, avg_acc: %f, avg_cost: %f, pass_time_cost: %f" %
+ (pass_id, avg_acc, avg_cost, time.time() - pass_start))
+
+ epoch_model = save_dirname + "/" + "epoch" + str(pass_id)
+ fluid.io.save_inference_model(epoch_model, ["words", "label"], acc, exe)
+
+ pass_end = time.time()
+ # For internal continuous evaluation
+ if "CE_MODE_X" in os.environ:
+ print("kpis train_acc %f" % avg_acc)
+ print("kpis train_cost %f" % avg_cost)
+ print("kpis train_duration %f" % (pass_end - pass_start))
+
+
+def train_net():
+ word_dict, train_reader, test_reader = utils.prepare_data(
+ "imdb", self_dict=False, batch_size=128, buf_size=50000)
+
+ if sys.argv[1] == "bow":
+ train(
+ train_reader,
+ word_dict,
+ bow_net,
+ use_cuda=False,
+ parallel=False,
+ save_dirname="bow_model",
+ lr=0.002,
+ pass_num=30)
+ elif sys.argv[1] == "cnn":
+ train(
+ train_reader,
+ word_dict,
+ cnn_net,
+ use_cuda=True,
+ parallel=False,
+ save_dirname="cnn_model",
+ lr=0.01,
+ pass_num=30)
+ elif sys.argv[1] == "lstm":
+ train(
+ train_reader,
+ word_dict,
+ lstm_net,
+ use_cuda=True,
+ parallel=False,
+ save_dirname="lstm_model",
+ lr=0.05,
+ pass_num=30)
+ elif sys.argv[1] == "gru":
+ train(
+ train_reader,
+ word_dict,
+ gru_net,
+ use_cuda=True,
+ parallel=False,
+ save_dirname="gru_model",
+ lr=0.05,
+ pass_num=30)
+ else:
+ print("network name cannot be found!")
+ sys.exit(1)
+
+
+if __name__ == "__main__":
+ train_net()
diff --git a/fluid/PaddleNLP/text_classification/utils.py b/PaddleNLP/text_classification/utils.py
similarity index 100%
rename from fluid/PaddleNLP/text_classification/utils.py
rename to PaddleNLP/text_classification/utils.py
diff --git a/PaddleNLP/text_matching_on_quora/.run_ce.sh b/PaddleNLP/text_matching_on_quora/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..f1bb7febd3f2c572544612baf24be14c711108e3
--- /dev/null
+++ b/PaddleNLP/text_matching_on_quora/.run_ce.sh
@@ -0,0 +1,14 @@
+#!/bin/bash
+
+export MKL_NUM_THREADS=1
+export OMP_NUM_THREADS=1
+
+cudaid=${text_matching_on_quora:=0} # use 0-th card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train_and_evaluate.py --model_name=cdssmNet --config=cdssm_base --enable_ce --epoch_num=5 | python _ce.py
+
+cudaid=${text_matching_on_quora_m:=0,1,2,3} # use 0,1,2,3 card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train_and_evaluate.py --model_name=cdssmNet --config=cdssm_base --enable_ce --epoch_num=5 | python _ce.py
diff --git a/PaddleNLP/text_matching_on_quora/README.md b/PaddleNLP/text_matching_on_quora/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..77d93943ae7dcbe775e60307b74430f320dbaab1
--- /dev/null
+++ b/PaddleNLP/text_matching_on_quora/README.md
@@ -0,0 +1,177 @@
+# Text matching on the Quora question pair dataset
+
+## Contents
+
+* [Introduction](#introduction)
+ * [A brief review of the Quora Question Pair (QQP) task](#a-brief-review-of-the-quora-question-pair-qqp-task)
+ * [Our Work](#our-work)
+* [Environment Preparation](#environment-preparation)
+ * [Install Fluid release 1.0](#install-fluid-release-10)
+ * [Have I installed Fluid successfully?](#have-i-installed-fluid-successfully)
+* [Prepare Data](#prepare-data)
+* [Train and evaluate](#train-and-evaluate)
+* [Models](#models)
+* [Results](#results)
+
+
+## Introduction
+
+### A brief review of the Quora Question Pair (QQP) task
+
+The [Quora Question Pair](https://data.quora.com/First-Quora-Dataset-Release-Question-Pairs) dataset contains 400,000 question pairs from [Quora](https://www.quora.com/), where people ask and answer questions on specific topics. Each sample in the dataset consists of two English questions and a label indicating whether the two questions are duplicates. The dataset is well annotated by humans.
+
+Below are two samples from the dataset. The last column indicates whether the two questions are duplicate (1) or not (0).
+
+|id | qid1 | qid2| question1| question2| is_duplicate
+|:---:|:---:|:---:|:---:|:---:|:---:|
+|0 |1 |2 |What is the step by step guide to invest in share market in india? |What is the step by step guide to invest in share market? |0|
+|1 |3 |4 |What is the story of Kohinoor (Koh-i-Noor) Diamond? | What would happen if the Indian government stole the Kohinoor (Koh-i-Noor) diamond back? |0|
+
+A [Kaggle competition](https://www.kaggle.com/c/quora-question-pairs#description) based on this dataset was held in 2017. Participants were given a labeled training dataset and asked to make predictions on an unlabeled test dataset; the predictions were evaluated by the log-likelihood loss on the test data.
+
+The Kaggle competition inspired a lot of effective work. However, most of the winning models are rule-based and difficult to transfer to new tasks. Researchers are seeking more general models that work well on this task and on other natural language processing (NLP) tasks.
+
+[Wang _et al._](https://arxiv.org/abs/1702.03814) proposed a bilateral multi-perspective matching (BIMPM) model based on the Quora Question Pair dataset. They split the original dataset into [3 parts](https://drive.google.com/file/d/0B0PlTAo--BnaQWlsZl9FZ3l1c28/view?usp=sharing): _train.tsv_ (384,348 samples), _dev.tsv_ (10,000 samples) and _test.tsv_ (10,000 samples). The class distribution of _train.tsv_ is unbalanced (37% positive and 63% negative), while those of _dev.tsv_ and _test.tsv_ are balanced (50% positive and 50% negative). We used the same split in our experiments.
+
+### Our Work
+
+Based on the Quora Question Pair dataset, we implemented some classic models in the area of neural language understanding (NLU). Prediction accuracy is evaluated on the _test.tsv_ from [Wang _et al._](https://arxiv.org/abs/1702.03814).
+
+## Environment Preparation
+
+### Install Fluid release 1.0
+
+Please follow the [official document in English](http://www.paddlepaddle.org/documentation/docs/en/1.0/build_and_install/pip_install_en.html) or [official document in Chinese](http://www.paddlepaddle.org/documentation/docs/zh/1.0/beginners_guide/install/Start.html) to install the Fluid deep learning framework.
+
+#### Have I installed Fluid successfully?
+
+Run the following script from your command line:
+
+```shell
+python -c "import paddle"
+```
+
+If Fluid is installed successfully you should see no error message. Feel free to open issues under the [PaddlePaddle repository](https://github.com/PaddlePaddle/Paddle/issues) for support.
+
+## Prepare Data
+
+Please download the Quora dataset from [Google drive](https://drive.google.com/file/d/0B0PlTAo--BnaQWlsZl9FZ3l1c28/view?usp=sharing) and unzip it to $HOME/.cache/paddle/dataset.
+
+Then run _data/prepare_quora_data.sh_ to download the pre-trained _GloVe_ word embedding file -- _glove.840B.300d.zip_:
+
+```shell
+sh data/prepare_quora_data.sh
+```
+
+At this point the dataset directory ($HOME/.cache/paddle/dataset) structure should be:
+
+```shell
+
+$HOME/.cache/paddle/dataset
+ |- Quora_question_pair_partition
+ |- train.tsv
+ |- test.tsv
+ |- dev.tsv
+ |- readme.txt
+ |- wordvec.txt
+ |- glove.840B.300d.txt
+```
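+
+Optionally, you can sanity-check the layout before training. A short sketch (paths as listed above):
+
+```python
+import os
+
+base = os.path.expanduser("~/.cache/paddle/dataset")
+part = os.path.join(base, "Quora_question_pair_partition")
+for name in ["train.tsv", "test.tsv", "dev.tsv", "readme.txt", "wordvec.txt"]:
+    assert os.path.isfile(os.path.join(part, name)), name + " is missing"
+assert os.path.isfile(os.path.join(base, "glove.840B.300d.txt"))
+print("dataset layout looks good")
+```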
+
+## Train and evaluate
+
+We provide multiple models and configurations. Details are shown in `models` and `configs` directories. For a quick start, please run the _cdssmNet_ model with the corresponding configuration:
+
+```shell
+python train_and_evaluate.py \
+ --model_name=cdssmNet \
+ --config=cdssm_base
+```
+
+Logs will be output to the console. If everything works well, the logging information will have the same format as the content of _cdssm_base.log_.
+
+All configurations used in our experiments are as follows:
+
+|Model|Config|command
+|:----:|:----:|:----:|
+|cdssmNet|cdssm_base|python train_and_evaluate.py --model_name=cdssmNet --config=cdssm_base
+|DecAttNet|decatt_glove|python train_and_evaluate.py --model_name=DecAttNet --config=decatt_glove
+|InferSentNet|infer_sent_v1|python train_and_evaluate.py --model_name=InferSentNet --config=infer_sent_v1
+|InferSentNet|infer_sent_v2|python train_and_evaluate.py --model_name=InferSentNet --config=infer_sent_v2
+|SSENet|sse_base|python train_and_evaluate.py --model_name=SSENet --config=sse_base
+
+## Models
+
+We have implemented four models so far: the convolutional deep-structured semantic model (CDSSM, CNN-based), the InferSent model (RNN-based), the shortcut-stacked encoder (SSE, RNN-based), and the decomposed attention model (DecAtt, attention-based).
+
+|Model|features|Context Encoder|Match Layer|Classification Layer
+|:----:|:----:|:----:|:----:|:----:|
+|CDSSM|word|1 layer conv1d|concatenation|MLP
+|DecAtt|word|Attention|concatenation|MLP
+|InferSent|word|1 layer Bi-LSTM|concatenation/element-wise product/absolute element-wise difference|MLP
+|SSE|word|3 layer Bi-LSTM|concatenation/element-wise product/absolute element-wise difference|MLP
+
+### CDSSM
+
+```
+@inproceedings{shen2014learning,
+ title={Learning semantic representations using convolutional neural networks for web search},
+ author={Shen, Yelong and He, Xiaodong and Gao, Jianfeng and Deng, Li and Mesnil, Gr{\'e}goire},
+ booktitle={Proceedings of the 23rd International Conference on World Wide Web},
+ pages={373--374},
+ year={2014},
+ organization={ACM}
+}
+```
+
+### InferSent
+
+```
+@article{conneau2017supervised,
+ title={Supervised learning of universal sentence representations from natural language inference data},
+ author={Conneau, Alexis and Kiela, Douwe and Schwenk, Holger and Barrault, Loic and Bordes, Antoine},
+ journal={arXiv preprint arXiv:1705.02364},
+ year={2017}
+}
+```
+
+### SSE
+
+```
+@article{nie2017shortcut,
+ title={Shortcut-stacked sentence encoders for multi-domain inference},
+ author={Nie, Yixin and Bansal, Mohit},
+ journal={arXiv preprint arXiv:1708.02312},
+ year={2017}
+}
+```
+
+### DecAtt
+
+```
+@article{tomar2017neural,
+ title={Neural paraphrase identification of questions with noisy pretraining},
+ author={Tomar, Gaurav Singh and Duque, Thyago and T{\"a}ckstr{\"o}m, Oscar and Uszkoreit, Jakob and Das, Dipanjan},
+ journal={arXiv preprint arXiv:1704.04565},
+ year={2017}
+}
+```
+
+## Results
+
+|Model|Config|dev accuracy| test accuracy
+|:----:|:----:|:----:|:----:|
+|cdssmNet|cdssm_base|83.56%|82.83%|
+|DecAttNet|decatt_glove|86.31%|86.22%|
+|InferSentNet|infer_sent_v1|87.15%|86.62%|
+|InferSentNet|infer_sent_v2|88.55%|88.43%|
+|SSENet|sse_base|88.35%|88.25%|
+
+In our experiments, LSTM-based models outperformed convolution-based models. The DecAtt model has fewer parameters than the LSTM-based models but is sensitive to hyper-parameters.
+
+
diff --git a/PaddleNLP/text_matching_on_quora/__init__.py b/PaddleNLP/text_matching_on_quora/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/PaddleNLP/text_matching_on_quora/_ce.py b/PaddleNLP/text_matching_on_quora/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..eadeb821da6f7049d1916a65a1ae4eb995c5cb6d
--- /dev/null
+++ b/PaddleNLP/text_matching_on_quora/_ce.py
@@ -0,0 +1,65 @@
+# this file is only used for continuous evaluation test!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi
+from kpi import DurationKpi
+
+
+each_pass_duration_card1_kpi = DurationKpi('each_pass_duration_card1', 0.08, 0, actived=True)
+train_avg_cost_card1_kpi = CostKpi('train_avg_cost_card1', 0.08, 0)
+train_avg_acc_card1_kpi = CostKpi('train_avg_acc_card1', 0.02, 0)
+each_pass_duration_card4_kpi = DurationKpi('each_pass_duration_card4', 0.08, 0, actived=True)
+train_avg_cost_card4_kpi = CostKpi('train_avg_cost_card4', 0.08, 0)
+train_avg_acc_card4_kpi = CostKpi('train_avg_acc_card4', 0.02, 0)
+
+tracking_kpis = [
+ each_pass_duration_card1_kpi,
+ train_avg_cost_card1_kpi,
+ train_avg_acc_card1_kpi,
+ each_pass_duration_card4_kpi,
+ train_avg_cost_card4_kpi,
+ train_avg_acc_card4_kpi,
+ ]
+
+
+def parse_log(log):
+ '''
+ This method should be implemented by model developers.
+
+ The suggestion:
+
+ each line in the log should be key, value, for example:
+
+ "
+ train_cost\t1.0
+ test_cost\t1.0
+ train_cost\t1.0
+ train_cost\t1.0
+ train_acc\t1.2
+ "
+ '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ log_to_ce(log)
diff --git a/fluid/PaddleNLP/text_matching_on_quora/cdssm_base.log b/PaddleNLP/text_matching_on_quora/cdssm_base.log
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/cdssm_base.log
rename to PaddleNLP/text_matching_on_quora/cdssm_base.log
diff --git a/fluid/PaddleNLP/text_matching_on_quora/configs/__init__.py b/PaddleNLP/text_matching_on_quora/configs/__init__.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/configs/__init__.py
rename to PaddleNLP/text_matching_on_quora/configs/__init__.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/configs/basic_config.py b/PaddleNLP/text_matching_on_quora/configs/basic_config.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/configs/basic_config.py
rename to PaddleNLP/text_matching_on_quora/configs/basic_config.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/configs/cdssm.py b/PaddleNLP/text_matching_on_quora/configs/cdssm.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/configs/cdssm.py
rename to PaddleNLP/text_matching_on_quora/configs/cdssm.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/configs/dec_att.py b/PaddleNLP/text_matching_on_quora/configs/dec_att.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/configs/dec_att.py
rename to PaddleNLP/text_matching_on_quora/configs/dec_att.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/configs/infer_sent.py b/PaddleNLP/text_matching_on_quora/configs/infer_sent.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/configs/infer_sent.py
rename to PaddleNLP/text_matching_on_quora/configs/infer_sent.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/configs/sse.py b/PaddleNLP/text_matching_on_quora/configs/sse.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/configs/sse.py
rename to PaddleNLP/text_matching_on_quora/configs/sse.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/data/prepare_quora_data.sh b/PaddleNLP/text_matching_on_quora/data/prepare_quora_data.sh
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/data/prepare_quora_data.sh
rename to PaddleNLP/text_matching_on_quora/data/prepare_quora_data.sh
diff --git a/fluid/PaddleNLP/text_matching_on_quora/imgs/README.md b/PaddleNLP/text_matching_on_quora/imgs/README.md
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/imgs/README.md
rename to PaddleNLP/text_matching_on_quora/imgs/README.md
diff --git a/fluid/PaddleNLP/text_matching_on_quora/imgs/models_test_acc.png b/PaddleNLP/text_matching_on_quora/imgs/models_test_acc.png
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/imgs/models_test_acc.png
rename to PaddleNLP/text_matching_on_quora/imgs/models_test_acc.png
diff --git a/fluid/PaddleNLP/text_matching_on_quora/metric.py b/PaddleNLP/text_matching_on_quora/metric.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/metric.py
rename to PaddleNLP/text_matching_on_quora/metric.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/models/__init__.py b/PaddleNLP/text_matching_on_quora/models/__init__.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/models/__init__.py
rename to PaddleNLP/text_matching_on_quora/models/__init__.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/models/cdssm.py b/PaddleNLP/text_matching_on_quora/models/cdssm.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/models/cdssm.py
rename to PaddleNLP/text_matching_on_quora/models/cdssm.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/models/dec_att.py b/PaddleNLP/text_matching_on_quora/models/dec_att.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/models/dec_att.py
rename to PaddleNLP/text_matching_on_quora/models/dec_att.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/models/infer_sent.py b/PaddleNLP/text_matching_on_quora/models/infer_sent.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/models/infer_sent.py
rename to PaddleNLP/text_matching_on_quora/models/infer_sent.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/models/match_layers.py b/PaddleNLP/text_matching_on_quora/models/match_layers.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/models/match_layers.py
rename to PaddleNLP/text_matching_on_quora/models/match_layers.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/models/my_layers.py b/PaddleNLP/text_matching_on_quora/models/my_layers.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/models/my_layers.py
rename to PaddleNLP/text_matching_on_quora/models/my_layers.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/models/pwim.py b/PaddleNLP/text_matching_on_quora/models/pwim.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/models/pwim.py
rename to PaddleNLP/text_matching_on_quora/models/pwim.py
diff --git a/fluid/PaddleNLP/text_matching_on_quora/models/sse.py b/PaddleNLP/text_matching_on_quora/models/sse.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/models/sse.py
rename to PaddleNLP/text_matching_on_quora/models/sse.py
diff --git a/PaddleNLP/text_matching_on_quora/models/test.py b/PaddleNLP/text_matching_on_quora/models/test.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/PaddleNLP/text_matching_on_quora/pretrained_word2vec.py b/PaddleNLP/text_matching_on_quora/pretrained_word2vec.py
new file mode 100755
index 0000000000000000000000000000000000000000..cda934d3402d1fd05432e75e3b7bfd0a1bd4ad2c
--- /dev/null
+++ b/PaddleNLP/text_matching_on_quora/pretrained_word2vec.py
@@ -0,0 +1,61 @@
+# Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+This module provides pretrained word embeddings (GloVe 840B, 300d).
+"""
+
+from __future__ import print_function, unicode_literals
+import numpy as np
+import time, datetime
+import os, sys
+
+def maybe_open(filepath):
+ if sys.version_info <= (3, 0): # for python2
+ return open(filepath, 'r')
+ else:
+ return open(filepath, 'r', encoding="utf-8")
+
+def Glove840B_300D(filepath, keys=None):
+ """
+ input: the "glove.840B.300d.txt" file path
+ return: a dict, key: word (unicode), value: a numpy array with shape [300]
+ """
+ if keys is not None:
+ assert(isinstance(keys, set))
+ print("loading word2vec from ", filepath)
+ print("please wait for a minute.")
+ start = time.time()
+ word2vec = {}
+ with maybe_open(filepath) as f:
+ for line in f:
+ if sys.version_info <= (3, 0): # for python2
+ line = line.decode('utf-8')
+ info = line.strip("\n").split(" ")
+ word = info[0]
+ if (keys is not None) and (word not in keys):
+ continue
+ vector = info[1:]
+ assert(len(vector) == 300)
+ word2vec[word] = np.asarray(vector, dtype='float32')
+
+ end = time.time()
+ print("Spent ", str(datetime.timedelta(seconds=end-start)), " on loading word2vec.")
+ return word2vec
+
+if __name__ == '__main__':
+ from os.path import expanduser
+ home = expanduser("~")
+    embed_dict = Glove840B_300D(os.path.join(home, ".cache/paddle/dataset/glove.840B.300d.txt"))
+ exit(0)
diff --git a/PaddleNLP/text_matching_on_quora/quora_question_pairs.py b/PaddleNLP/text_matching_on_quora/quora_question_pairs.py
new file mode 100755
index 0000000000000000000000000000000000000000..4a1694929dc9a5a1d78bce2f99be04de0f1ba8e5
--- /dev/null
+++ b/PaddleNLP/text_matching_on_quora/quora_question_pairs.py
@@ -0,0 +1,196 @@
+# Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+"""
+
+import paddle.dataset.common
+import collections
+import tarfile
+import re
+import string
+import random
+import os, sys
+import nltk
+from os.path import expanduser
+
+
+__all__ = ['word_dict', 'train', 'dev', 'test']
+
+URL = "https://drive.google.com/file/d/0B0PlTAo--BnaQWlsZl9FZ3l1c28/view"
+
+DATA_HOME = os.path.expanduser('~/.cache/paddle/dataset')
+DATA_DIR = "Quora_question_pair_partition"
+
+QUORA_TRAIN_FILE_NAME = os.path.join(DATA_HOME, DATA_DIR, 'train.tsv')
+QUORA_DEV_FILE_NAME = os.path.join(DATA_HOME, DATA_DIR, 'dev.tsv')
+QUORA_TEST_FILE_NAME = os.path.join(DATA_HOME, DATA_DIR, 'test.tsv')
+
+# punctuation or nltk or space
+TOKENIZE_METHOD='space'
+
+COLUMN_COUNT = 4
+
+
+def tokenize(s):
+ if sys.version_info <= (3, 0): # for python2
+ s = s.decode('utf-8')
+ if TOKENIZE_METHOD == "nltk":
+ return nltk.tokenize.word_tokenize(s)
+ elif TOKENIZE_METHOD == "punctuation":
+ return s.translate({ord(char): None for char in string.punctuation}).lower().split()
+ elif TOKENIZE_METHOD == "space":
+ return s.split()
+ else:
+ raise RuntimeError("Invalid tokenize method")
+
+
+def maybe_open(file_name):
+ if not os.path.isfile(file_name):
+ msg = "file not exist: %s\nPlease download the dataset firstly from: %s\n\n" % (file_name, URL) + \
+ ("# The finally dataset dir should be like\n\n"
+ "$HOME/.cache/paddle/dataset\n"
+ " |- Quora_question_pair_partition\n"
+ " |- train.tsv\n"
+ " |- test.tsv\n"
+ " |- dev.tsv\n"
+ " |- readme.txt\n"
+ " |- wordvec.txt\n")
+ raise RuntimeError(msg)
+ if sys.version_info <= (3, 0): # for python2
+ return open(file_name, 'r')
+ else:
+ return open(file_name, 'r', encoding="utf-8")
+
+
+def tokenized_question_pairs(file_name):
+ """
+ """
+ with maybe_open(file_name) as f:
+ questions = {}
+ lines = f.readlines()
+ for line in lines:
+ info = line.strip().split('\t')
+ if len(info) != COLUMN_COUNT:
+ # formatting error
+ continue
+ (label, question1, question2, id) = info
+ question1 = tokenize(question1)
+ question2 = tokenize(question2)
+ yield question1, question2, int(label)
+
+
+def tokenized_questions(file_name):
+ """
+ """
+ with maybe_open(file_name) as f:
+ lines = f.readlines()
+ for line in lines:
+ info = line.strip().split('\t')
+ if len(info) != COLUMN_COUNT:
+ # formatting error
+ continue
+ (label, question1, question2, id) = info
+ yield tokenize(question1)
+ yield tokenize(question2)
+
+
+def build_dict(file_name, cutoff):
+ """
+ Build a word dictionary from the corpus. Keys of the dictionary are words,
+ and values are zero-based IDs of these words.
+ """
+ word_freq = collections.defaultdict(int)
+ for doc in tokenized_questions(file_name):
+ for word in doc:
+ word_freq[word] += 1
+
+ word_freq = filter(lambda x: x[1] > cutoff, word_freq.items())
+
+ dictionary = sorted(word_freq, key=lambda x: (-x[1], x[0]))
+ words, _ = list(zip(*dictionary))
+ word_idx = dict(zip(words, range(len(words))))
+    word_idx['<unk>'] = len(words)
+    word_idx['<pad>'] = len(words) + 1
+ return word_idx
+
+
+def reader_creator(file_name, word_idx):
+    UNK_ID = word_idx['<unk>']
+
+ def reader():
+ for (q1, q2, label) in tokenized_question_pairs(file_name):
+ q1_ids = [word_idx.get(w, UNK_ID) for w in q1]
+ q2_ids = [word_idx.get(w, UNK_ID) for w in q2]
+ if q1_ids != [] and q2_ids != []: # [] is not allowed in fluid
+ assert(label in [0, 1])
+ yield q1_ids, q2_ids, label
+
+ return reader
+
+
+def train(word_idx):
+ """
+ Quora training set creator.
+
+ It returns a reader creator, each sample in the reader is two zero-based ID
+ list and label in [0, 1].
+
+ :param word_idx: word dictionary
+ :type word_idx: dict
+ :return: Training reader creator
+ :rtype: callable
+ """
+ return reader_creator(QUORA_TRAIN_FILE_NAME, word_idx)
+
+
+def dev(word_idx):
+ """
+    Quora development set creator.
+
+ It returns a reader creator, each sample in the reader is two zero-based ID
+ list and label in [0, 1].
+
+ :param word_idx: word dictionary
+ :type word_idx: dict
+    :return: Development reader creator
+ :rtype: callable
+
+ """
+ return reader_creator(QUORA_DEV_FILE_NAME, word_idx)
+
+def test(word_idx):
+ """
+ Quora test set creator.
+
+ It returns a reader creator, each sample in the reader is two zero-based ID
+ list and label in [0, 1].
+
+ :param word_idx: word dictionary
+ :type word_idx: dict
+ :return: Test reader creator
+ :rtype: callable
+ """
+ return reader_creator(QUORA_TEST_FILE_NAME, word_idx)
+
+
+def word_dict():
+ """
+ Build a word dictionary from the corpus.
+
+ :return: Word dictionary
+ :rtype: dict
+ """
+ return build_dict(file_name=QUORA_TRAIN_FILE_NAME, cutoff=4)
+
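+# Example usage (an illustrative sketch; ``paddle.batch`` groups samples into mini-batches):
+#
+#   import paddle
+#   word_idx = word_dict()
+#   train_reader = paddle.batch(train(word_idx), batch_size=128)
+#   for batch in train_reader():
+#       pass  # each sample in ``batch`` is a (q1_ids, q2_ids, label) tuple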
diff --git a/PaddleNLP/text_matching_on_quora/train_and_evaluate.py b/PaddleNLP/text_matching_on_quora/train_and_evaluate.py
new file mode 100755
index 0000000000000000000000000000000000000000..0f88c6b6ef13aec25e08527b7efabe8638a3af25
--- /dev/null
+++ b/PaddleNLP/text_matching_on_quora/train_and_evaluate.py
@@ -0,0 +1,309 @@
+# Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import print_function
+
+import os
+import sys
+import time
+import argparse
+import unittest
+import contextlib
+import numpy as np
+
+import paddle.fluid as fluid
+
+import utils, metric, configs
+import models
+
+from pretrained_word2vec import Glove840B_300D
+
+parser = argparse.ArgumentParser(description=__doc__)
+
+parser.add_argument('--model_name', type=str, default='cdssmNet', help="Which model to train")
+parser.add_argument('--config', type=str, default='cdssm_base', help="The global config setting")
+parser.add_argument('--enable_ce', action='store_true', help='If set, run the task with continuous evaluation logs.')
+parser.add_argument('--epoch_num', type=int, help='Number of epochs')
+parser.add_argument('--num_devices', type=int, default=1, help='Number of devices used when --enable_ce is not set')
+
+DATA_DIR = os.path.join(os.path.expanduser('~'), '.cache/paddle/dataset')
+
+def evaluate(epoch_id, exe, inference_program, dev_reader, test_reader, fetch_list, feeder, metric_type):
+ """
+ evaluate on test/dev dataset
+ """
+ def infer(test_reader):
+ """
+ do inference function
+ """
+ total_cost = 0.0
+ total_count = 0
+ preds, labels = [], []
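+        # weight each batch's average cost by the batch size so the final mean is over samples, not batches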
+ for data in test_reader():
+ avg_cost, avg_acc, batch_prediction = exe.run(inference_program,
+ feed=feeder.feed(data),
+ fetch_list=fetch_list,
+ return_numpy=True)
+ total_cost += avg_cost * len(data)
+ total_count += len(data)
+ preds.append(batch_prediction)
+ labels.append(np.asarray([x[-1] for x in data], dtype=np.int64))
+ y_pred = np.concatenate(preds)
+ y_label = np.concatenate(labels)
+
+ metric_res = []
+ for metric_name in metric_type:
+ if metric_name == 'accuracy_with_threshold':
+ metric_res.append((metric_name, metric.accuracy_with_threshold(y_pred, y_label, threshold=0.3)))
+ elif metric_name == 'accuracy':
+ metric_res.append((metric_name, metric.accuracy(y_pred, y_label)))
+ else:
+ print("Unknown metric type: ", metric_name)
+ exit()
+ return total_cost / (total_count * 1.0), metric_res
+
+ dev_cost, dev_metric_res = infer(dev_reader)
+ print("[%s] epoch_id: %d, dev_cost: %f, " % (
+ time.asctime( time.localtime(time.time()) ),
+ epoch_id,
+ dev_cost)
+ + ', '.join([str(x[0]) + ": " + str(x[1]) for x in dev_metric_res]))
+
+ test_cost, test_metric_res = infer(test_reader)
+ print("[%s] epoch_id: %d, test_cost: %f, " % (
+ time.asctime( time.localtime(time.time()) ),
+ epoch_id,
+ test_cost)
+ + ', '.join([str(x[0]) + ": " + str(x[1]) for x in test_metric_res]))
+ print("")
+
+
+def train_and_evaluate(train_reader,
+ dev_reader,
+ test_reader,
+ network,
+ optimizer,
+ global_config,
+ pretrained_word_embedding,
+ use_cuda,
+ parallel):
+ """
+ train network
+ """
+
+ # define the net
+ if global_config.use_lod_tensor:
+ # automatic add batch dim
+ q1 = fluid.layers.data(
+ name="question1", shape=[1], dtype="int64", lod_level=1)
+ q2 = fluid.layers.data(
+ name="question2", shape=[1], dtype="int64", lod_level=1)
+ label = fluid.layers.data(name="label", shape=[1], dtype="int64")
+ cost, acc, prediction = network(q1, q2, label)
+ else:
+ # shape: [batch_size, max_seq_len_in_batch, 1]
+ q1 = fluid.layers.data(
+ name="question1", shape=[-1, -1, 1], dtype="int64")
+ q2 = fluid.layers.data(
+ name="question2", shape=[-1, -1, 1], dtype="int64")
+ # shape: [batch_size, max_seq_len_in_batch]
+ mask1 = fluid.layers.data(name="mask1", shape=[-1, -1], dtype="float32")
+ mask2 = fluid.layers.data(name="mask2", shape=[-1, -1], dtype="float32")
+ label = fluid.layers.data(name="label", shape=[1], dtype="int64")
+ cost, acc, prediction = network(q1, q2, mask1, mask2, label)
+
+ if parallel:
+        # TODO: Parallel Training
+ print("Parallel Training is not supported for now.")
+ sys.exit(1)
+
+    # NOTE: optimizer.minimize(cost) is deferred until after the inference program is cloned below
+ if use_cuda:
+ print("Using GPU")
+ place = fluid.CUDAPlace(0)
+ else:
+ print("Using CPU")
+ place = fluid.CPUPlace()
+ exe = fluid.Executor(place)
+
+ if global_config.use_lod_tensor:
+ feeder = fluid.DataFeeder(feed_list=[q1, q2, label], place=place)
+ else:
+ feeder = fluid.DataFeeder(feed_list=[q1, q2, mask1, mask2, label], place=place)
+
+ # only for ce
+ args = parser.parse_args()
+ if args.enable_ce:
+ SEED = 102
+ fluid.default_startup_program().random_seed = SEED
+ fluid.default_main_program().random_seed = SEED
+
+ # logging param info
+ for param in fluid.default_main_program().global_block().all_parameters():
+ print("param name: %s; param shape: %s" % (param.name, param.shape))
+
+ # define inference_program
+ inference_program = fluid.default_main_program().clone(for_test=True)
+
+ optimizer.minimize(cost)
+
+ exe.run(fluid.default_startup_program())
+
+    # load the embedding parameter from a numpy array
+ if pretrained_word_embedding is not None:
+ print("loading pretrained word embedding to param")
+ embedding_name = "emb.w"
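+        # NOTE: "emb.w" is assumed to match the embedding parameter name used by the model in models/*.py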
+ embedding_param = fluid.global_scope().find_var(embedding_name).get_tensor()
+ embedding_param.set(pretrained_word_embedding, place)
+
+ evaluate(-1,
+ exe,
+ inference_program,
+ dev_reader,
+ test_reader,
+ fetch_list=[cost, acc, prediction],
+ feeder=feeder,
+ metric_type=global_config.metric_type)
+
+ # start training
+ total_time = 0.0
+ print("[%s] Start Training" % time.asctime(time.localtime(time.time())))
+ for epoch_id in range(global_config.epoch_num):
+
+ data_size, data_count, total_acc, total_cost = 0, 0, 0.0, 0.0
+ batch_id = 0
+ epoch_begin_time = time.time()
+ for data in train_reader():
+ avg_cost_np, avg_acc_np = exe.run(fluid.default_main_program(),
+ feed=feeder.feed(data),
+ fetch_list=[cost, acc])
+ data_size = len(data)
+ total_acc += data_size * avg_acc_np[0]
+ total_cost += data_size * avg_cost_np[0]
+ data_count += data_size
+ if batch_id % 100 == 0:
+ print("[%s] epoch_id: %d, batch_id: %d, cost: %f, acc: %f" % (
+ time.asctime(time.localtime(time.time())),
+ epoch_id,
+ batch_id,
+ avg_cost_np,
+ avg_acc_np))
+ batch_id += 1
+ avg_cost = total_cost / data_count
+ avg_acc = total_acc / data_count
+ epoch_end_time = time.time()
+ total_time += epoch_end_time - epoch_begin_time
+
+ print("")
+ print("[%s] epoch_id: %d, train_avg_cost: %f, train_avg_acc: %f, epoch_time_cost: %f" % (
+ time.asctime( time.localtime(time.time())),
+ epoch_id, avg_cost, avg_acc,
+ time.time() - epoch_begin_time))
+
+ # only for ce
+ if epoch_id == global_config.epoch_num - 1 and args.enable_ce:
+ #Note: The following logs are special for CE monitoring.
+ #Other situations do not need to care about these logs.
+ gpu_num = get_cards(args)
+ print("kpis\teach_pass_duration_card%s\t%s" % \
+ (gpu_num, total_time / (global_config.epoch_num)))
+ print("kpis\ttrain_avg_cost_card%s\t%s" %
+ (gpu_num, avg_cost))
+ print("kpis\ttrain_avg_acc_card%s\t%s" %
+ (gpu_num, avg_acc))
+
+
+ epoch_model = global_config.save_dirname + "/" + "epoch" + str(epoch_id)
+ fluid.io.save_inference_model(epoch_model, ["question1", "question2", "label"], acc, exe)
+
+ evaluate(epoch_id,
+ exe,
+ inference_program,
+ dev_reader,
+ test_reader,
+ fetch_list=[cost, acc, prediction],
+ feeder=feeder,
+ metric_type=global_config.metric_type)
+
+def main():
+ """
+    This function parses arguments, prepares the data and loads the pretrained embedding
+ """
+ args = parser.parse_args()
+ global_config = configs.__dict__[args.config]()
+
+    if args.epoch_num is not None:
+ global_config.epoch_num = args.epoch_num
+
+ print("net_name: ", args.model_name)
+ net = models.__dict__[args.model_name](global_config)
+
+ # get word_dict
+ word_dict = utils.getDict(data_type="quora_question_pairs")
+
+ # get reader
+ train_reader, dev_reader, test_reader = utils.prepare_data(
+ "quora_question_pairs",
+ word_dict=word_dict,
+ batch_size = global_config.batch_size,
+ buf_size=800000,
+ duplicate_data=global_config.duplicate_data,
+ use_pad=(not global_config.use_lod_tensor))
+
+ # load pretrained_word_embedding
+ if global_config.use_pretrained_word_embedding:
+ word2vec = Glove840B_300D(filepath=os.path.join(DATA_DIR, "glove.840B.300d.txt"),
+ keys=set(word_dict.keys()))
+ pretrained_word_embedding = utils.get_pretrained_word_embedding(
+ word2vec=word2vec,
+ word2id=word_dict,
+ config=global_config)
+ print("pretrained_word_embedding to be load:", pretrained_word_embedding)
+ else:
+ pretrained_word_embedding = None
+
+ # define optimizer
+ optimizer = utils.getOptimizer(global_config)
+
+ # use cuda or not
+ if not global_config.has_member('use_cuda'):
+ if 'CUDA_VISIBLE_DEVICES' in os.environ and os.environ['CUDA_VISIBLE_DEVICES'] != '':
+ global_config.use_cuda = True
+ else:
+ global_config.use_cuda = False
+
+ global_config.list_config()
+
+ train_and_evaluate(
+ train_reader,
+ dev_reader,
+ test_reader,
+ net,
+ optimizer,
+ global_config,
+ pretrained_word_embedding,
+ use_cuda=global_config.use_cuda,
+ parallel=False)
+
+
+def get_cards(args):
+ if args.enable_ce:
+ cards = os.environ.get('CUDA_VISIBLE_DEVICES')
+ num = len(cards.split(","))
+ return num
+ else:
+ return args.num_devices
+
+
+if __name__ == "__main__":
+ main()
diff --git a/fluid/PaddleNLP/text_matching_on_quora/utils.py b/PaddleNLP/text_matching_on_quora/utils.py
similarity index 100%
rename from fluid/PaddleNLP/text_matching_on_quora/utils.py
rename to PaddleNLP/text_matching_on_quora/utils.py
diff --git a/PaddleRL/DeepQNetwork/DQN_agent.py b/PaddleRL/DeepQNetwork/DQN_agent.py
new file mode 100644
index 0000000000000000000000000000000000000000..5b474325f656533b91965fd59d70c2d421e16fc3
--- /dev/null
+++ b/PaddleRL/DeepQNetwork/DQN_agent.py
@@ -0,0 +1,187 @@
+#-*- coding: utf-8 -*-
+
+import math
+import numpy as np
+import paddle.fluid as fluid
+from paddle.fluid.param_attr import ParamAttr
+from tqdm import tqdm
+
+
+class DQNModel(object):
+ def __init__(self, state_dim, action_dim, gamma, hist_len, use_cuda=False):
+ self.img_height = state_dim[0]
+ self.img_width = state_dim[1]
+ self.action_dim = action_dim
+ self.gamma = gamma
+ self.exploration = 1.1
+ self.update_target_steps = 10000 // 4
+ self.hist_len = hist_len
+ self.use_cuda = use_cuda
+
+ self.global_step = 0
+ self._build_net()
+
+ def _get_inputs(self):
+ return fluid.layers.data(
+ name='state',
+ shape=[self.hist_len, self.img_height, self.img_width],
+ dtype='float32'), \
+ fluid.layers.data(
+ name='action', shape=[1], dtype='int32'), \
+ fluid.layers.data(
+ name='reward', shape=[], dtype='float32'), \
+ fluid.layers.data(
+ name='next_s',
+ shape=[self.hist_len, self.img_height, self.img_width],
+ dtype='float32'), \
+ fluid.layers.data(
+ name='isOver', shape=[], dtype='bool')
+
+ def _build_net(self):
+ self.predict_program = fluid.Program()
+ self.train_program = fluid.Program()
+ self._sync_program = fluid.Program()
+
+ with fluid.program_guard(self.predict_program):
+ state, action, reward, next_s, isOver = self._get_inputs()
+ self.pred_value = self.get_DQN_prediction(state)
+
+ with fluid.program_guard(self.train_program):
+ state, action, reward, next_s, isOver = self._get_inputs()
+ pred_value = self.get_DQN_prediction(state)
+
+ reward = fluid.layers.clip(reward, min=-1.0, max=1.0)
+
+ action_onehot = fluid.layers.one_hot(action, self.action_dim)
+ action_onehot = fluid.layers.cast(action_onehot, dtype='float32')
+
+ pred_action_value = fluid.layers.reduce_sum(
+ fluid.layers.elementwise_mul(action_onehot, pred_value), dim=1)
+
+ targetQ_predict_value = self.get_DQN_prediction(next_s, target=True)
+ best_v = fluid.layers.reduce_max(targetQ_predict_value, dim=1)
+ best_v.stop_gradient = True
+
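+            # Bellman target: r + gamma * max_a' Q_target(s', a'); terminal transitions keep only the reward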
+ target = reward + (1.0 - fluid.layers.cast(
+ isOver, dtype='float32')) * self.gamma * best_v
+ cost = fluid.layers.square_error_cost(pred_action_value, target)
+ cost = fluid.layers.reduce_mean(cost)
+
+ optimizer = fluid.optimizer.Adam(1e-3 * 0.5, epsilon=1e-3)
+ optimizer.minimize(cost)
+
+ vars = list(self.train_program.list_vars())
+ policy_vars = list(filter(
+ lambda x: 'GRAD' not in x.name and 'policy' in x.name, vars))
+ target_vars = list(filter(
+ lambda x: 'GRAD' not in x.name and 'target' in x.name, vars))
+ policy_vars.sort(key=lambda x: x.name)
+ target_vars.sort(key=lambda x: x.name)
+
+ with fluid.program_guard(self._sync_program):
+ sync_ops = []
+ for i, var in enumerate(policy_vars):
+ sync_op = fluid.layers.assign(policy_vars[i], target_vars[i])
+ sync_ops.append(sync_op)
+
+ # fluid exe
+ place = fluid.CUDAPlace(0) if self.use_cuda else fluid.CPUPlace()
+ self.exe = fluid.Executor(place)
+ self.exe.run(fluid.default_startup_program())
+
+ def get_DQN_prediction(self, image, target=False):
+ image = image / 255.0
+
+ variable_field = 'target' if target else 'policy'
+
+ conv1 = fluid.layers.conv2d(
+ input=image,
+ num_filters=32,
+ filter_size=5,
+ stride=1,
+ padding=2,
+ act='relu',
+ param_attr=ParamAttr(name='{}_conv1'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_conv1_b'.format(variable_field)))
+ max_pool1 = fluid.layers.pool2d(
+ input=conv1, pool_size=2, pool_stride=2, pool_type='max')
+
+ conv2 = fluid.layers.conv2d(
+ input=max_pool1,
+ num_filters=32,
+ filter_size=5,
+ stride=1,
+ padding=2,
+ act='relu',
+ param_attr=ParamAttr(name='{}_conv2'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_conv2_b'.format(variable_field)))
+ max_pool2 = fluid.layers.pool2d(
+ input=conv2, pool_size=2, pool_stride=2, pool_type='max')
+
+ conv3 = fluid.layers.conv2d(
+ input=max_pool2,
+ num_filters=64,
+ filter_size=4,
+ stride=1,
+ padding=1,
+ act='relu',
+ param_attr=ParamAttr(name='{}_conv3'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_conv3_b'.format(variable_field)))
+ max_pool3 = fluid.layers.pool2d(
+ input=conv3, pool_size=2, pool_stride=2, pool_type='max')
+
+ conv4 = fluid.layers.conv2d(
+ input=max_pool3,
+ num_filters=64,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ act='relu',
+ param_attr=ParamAttr(name='{}_conv4'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_conv4_b'.format(variable_field)))
+
+ flatten = fluid.layers.flatten(conv4, axis=1)
+
+ out = fluid.layers.fc(
+ input=flatten,
+ size=self.action_dim,
+ param_attr=ParamAttr(name='{}_fc1'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_fc1_b'.format(variable_field)))
+ return out
+
+
+ def act(self, state, train_or_test):
+ sample = np.random.random()
+ if train_or_test == 'train' and sample < self.exploration:
+ act = np.random.randint(self.action_dim)
+ else:
+ if np.random.random() < 0.01:
+ act = np.random.randint(self.action_dim)
+ else:
+ state = np.expand_dims(state, axis=0)
+ pred_Q = self.exe.run(self.predict_program,
+ feed={'state': state.astype('float32')},
+ fetch_list=[self.pred_value])[0]
+ pred_Q = np.squeeze(pred_Q, axis=0)
+ act = np.argmax(pred_Q)
+ if train_or_test == 'train':
+ self.exploration = max(0.1, self.exploration - 1e-6)
+ return act
+
+ def train(self, state, action, reward, next_state, isOver):
+ if self.global_step % self.update_target_steps == 0:
+ self.sync_target_network()
+ self.global_step += 1
+
+ action = np.expand_dims(action, -1)
+ self.exe.run(self.train_program,
+ feed={
+ 'state': state.astype('float32'),
+ 'action': action.astype('int32'),
+ 'reward': reward,
+ 'next_s': next_state.astype('float32'),
+ 'isOver': isOver
+ })
+
+ def sync_target_network(self):
+ self.exe.run(self._sync_program)
diff --git a/PaddleRL/DeepQNetwork/DoubleDQN_agent.py b/PaddleRL/DeepQNetwork/DoubleDQN_agent.py
new file mode 100644
index 0000000000000000000000000000000000000000..c95ae5632fd2e904a625f680f4a9147d5615b765
--- /dev/null
+++ b/PaddleRL/DeepQNetwork/DoubleDQN_agent.py
@@ -0,0 +1,195 @@
+#-*- coding: utf-8 -*-
+
+import math
+import numpy as np
+import paddle.fluid as fluid
+from paddle.fluid.param_attr import ParamAttr
+from tqdm import tqdm
+
+
+class DoubleDQNModel(object):
+ def __init__(self, state_dim, action_dim, gamma, hist_len, use_cuda=False):
+ self.img_height = state_dim[0]
+ self.img_width = state_dim[1]
+ self.action_dim = action_dim
+ self.gamma = gamma
+ self.exploration = 1.1
+ self.update_target_steps = 10000 // 4
+ self.hist_len = hist_len
+ self.use_cuda = use_cuda
+
+ self.global_step = 0
+ self._build_net()
+
+ def _get_inputs(self):
+ return fluid.layers.data(
+ name='state',
+ shape=[self.hist_len, self.img_height, self.img_width],
+ dtype='float32'), \
+ fluid.layers.data(
+ name='action', shape=[1], dtype='int32'), \
+ fluid.layers.data(
+ name='reward', shape=[], dtype='float32'), \
+ fluid.layers.data(
+ name='next_s',
+ shape=[self.hist_len, self.img_height, self.img_width],
+ dtype='float32'), \
+ fluid.layers.data(
+ name='isOver', shape=[], dtype='bool')
+
+ def _build_net(self):
+ self.predict_program = fluid.Program()
+ self.train_program = fluid.Program()
+ self._sync_program = fluid.Program()
+
+ with fluid.program_guard(self.predict_program):
+ state, action, reward, next_s, isOver = self._get_inputs()
+ self.pred_value = self.get_DQN_prediction(state)
+
+ with fluid.program_guard(self.train_program):
+ state, action, reward, next_s, isOver = self._get_inputs()
+ pred_value = self.get_DQN_prediction(state)
+
+ reward = fluid.layers.clip(reward, min=-1.0, max=1.0)
+
+ action_onehot = fluid.layers.one_hot(action, self.action_dim)
+ action_onehot = fluid.layers.cast(action_onehot, dtype='float32')
+
+ pred_action_value = fluid.layers.reduce_sum(
+ fluid.layers.elementwise_mul(action_onehot, pred_value), dim=1)
+
+ targetQ_predict_value = self.get_DQN_prediction(next_s, target=True)
+
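+            # Double DQN: the online (policy) network selects the greedy action,
+            # and the target network evaluates its value.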
+            next_s_predict_value = self.get_DQN_prediction(next_s)
+            greedy_action = fluid.layers.argmax(next_s_predict_value, axis=1)
+ greedy_action = fluid.layers.unsqueeze(greedy_action, axes=[1])
+
+ predict_onehot = fluid.layers.one_hot(greedy_action, self.action_dim)
+ best_v = fluid.layers.reduce_sum(
+ fluid.layers.elementwise_mul(predict_onehot, targetQ_predict_value),
+ dim=1)
+ best_v.stop_gradient = True
+
+ target = reward + (1.0 - fluid.layers.cast(
+ isOver, dtype='float32')) * self.gamma * best_v
+ cost = fluid.layers.square_error_cost(pred_action_value, target)
+ cost = fluid.layers.reduce_mean(cost)
+
+ optimizer = fluid.optimizer.Adam(1e-3 * 0.5, epsilon=1e-3)
+ optimizer.minimize(cost)
+
+ vars = list(self.train_program.list_vars())
+ policy_vars = list(filter(
+ lambda x: 'GRAD' not in x.name and 'policy' in x.name, vars))
+ target_vars = list(filter(
+ lambda x: 'GRAD' not in x.name and 'target' in x.name, vars))
+ policy_vars.sort(key=lambda x: x.name)
+ target_vars.sort(key=lambda x: x.name)
+
+ with fluid.program_guard(self._sync_program):
+ sync_ops = []
+ for i, var in enumerate(policy_vars):
+ sync_op = fluid.layers.assign(policy_vars[i], target_vars[i])
+ sync_ops.append(sync_op)
+
+ # fluid exe
+ place = fluid.CUDAPlace(0) if self.use_cuda else fluid.CPUPlace()
+ self.exe = fluid.Executor(place)
+ self.exe.run(fluid.default_startup_program())
+
+ def get_DQN_prediction(self, image, target=False):
+ image = image / 255.0
+
+ variable_field = 'target' if target else 'policy'
+
+ conv1 = fluid.layers.conv2d(
+ input=image,
+ num_filters=32,
+ filter_size=5,
+ stride=1,
+ padding=2,
+ act='relu',
+ param_attr=ParamAttr(name='{}_conv1'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_conv1_b'.format(variable_field)))
+ max_pool1 = fluid.layers.pool2d(
+ input=conv1, pool_size=2, pool_stride=2, pool_type='max')
+
+ conv2 = fluid.layers.conv2d(
+ input=max_pool1,
+ num_filters=32,
+ filter_size=5,
+ stride=1,
+ padding=2,
+ act='relu',
+ param_attr=ParamAttr(name='{}_conv2'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_conv2_b'.format(variable_field)))
+ max_pool2 = fluid.layers.pool2d(
+ input=conv2, pool_size=2, pool_stride=2, pool_type='max')
+
+ conv3 = fluid.layers.conv2d(
+ input=max_pool2,
+ num_filters=64,
+ filter_size=4,
+ stride=1,
+ padding=1,
+ act='relu',
+ param_attr=ParamAttr(name='{}_conv3'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_conv3_b'.format(variable_field)))
+ max_pool3 = fluid.layers.pool2d(
+ input=conv3, pool_size=2, pool_stride=2, pool_type='max')
+
+ conv4 = fluid.layers.conv2d(
+ input=max_pool3,
+ num_filters=64,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ act='relu',
+ param_attr=ParamAttr(name='{}_conv4'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_conv4_b'.format(variable_field)))
+
+ flatten = fluid.layers.flatten(conv4, axis=1)
+
+ out = fluid.layers.fc(
+ input=flatten,
+ size=self.action_dim,
+ param_attr=ParamAttr(name='{}_fc1'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_fc1_b'.format(variable_field)))
+ return out
+
+
+ def act(self, state, train_or_test):
+ sample = np.random.random()
+ if train_or_test == 'train' and sample < self.exploration:
+ act = np.random.randint(self.action_dim)
+ else:
+ if np.random.random() < 0.01:
+ act = np.random.randint(self.action_dim)
+ else:
+ state = np.expand_dims(state, axis=0)
+ pred_Q = self.exe.run(self.predict_program,
+ feed={'state': state.astype('float32')},
+ fetch_list=[self.pred_value])[0]
+ pred_Q = np.squeeze(pred_Q, axis=0)
+ act = np.argmax(pred_Q)
+ if train_or_test == 'train':
+ self.exploration = max(0.1, self.exploration - 1e-6)
+ return act
+
+ def train(self, state, action, reward, next_state, isOver):
+ if self.global_step % self.update_target_steps == 0:
+ self.sync_target_network()
+ self.global_step += 1
+
+ action = np.expand_dims(action, -1)
+ self.exe.run(self.train_program,
+ feed={
+ 'state': state.astype('float32'),
+ 'action': action.astype('int32'),
+ 'reward': reward,
+ 'next_s': next_state.astype('float32'),
+ 'isOver': isOver
+ })
+
+ def sync_target_network(self):
+ self.exe.run(self._sync_program)
diff --git a/PaddleRL/DeepQNetwork/DuelingDQN_agent.py b/PaddleRL/DeepQNetwork/DuelingDQN_agent.py
new file mode 100644
index 0000000000000000000000000000000000000000..cf2ff71bb811e5dce62be78beab1f0afb05d31f9
--- /dev/null
+++ b/PaddleRL/DeepQNetwork/DuelingDQN_agent.py
@@ -0,0 +1,197 @@
+#-*- coding: utf-8 -*-
+
+import math
+import numpy as np
+import paddle.fluid as fluid
+from paddle.fluid.param_attr import ParamAttr
+from tqdm import tqdm
+
+
+class DuelingDQNModel(object):
+ def __init__(self, state_dim, action_dim, gamma, hist_len, use_cuda=False):
+ self.img_height = state_dim[0]
+ self.img_width = state_dim[1]
+ self.action_dim = action_dim
+ self.gamma = gamma
+ self.exploration = 1.1
+ self.update_target_steps = 10000 // 4
+ self.hist_len = hist_len
+ self.use_cuda = use_cuda
+
+ self.global_step = 0
+ self._build_net()
+
+ def _get_inputs(self):
+ return fluid.layers.data(
+ name='state',
+ shape=[self.hist_len, self.img_height, self.img_width],
+ dtype='float32'), \
+ fluid.layers.data(
+ name='action', shape=[1], dtype='int32'), \
+ fluid.layers.data(
+ name='reward', shape=[], dtype='float32'), \
+ fluid.layers.data(
+ name='next_s',
+ shape=[self.hist_len, self.img_height, self.img_width],
+ dtype='float32'), \
+ fluid.layers.data(
+ name='isOver', shape=[], dtype='bool')
+
+ def _build_net(self):
+ self.predict_program = fluid.Program()
+ self.train_program = fluid.Program()
+ self._sync_program = fluid.Program()
+
+ with fluid.program_guard(self.predict_program):
+ state, action, reward, next_s, isOver = self._get_inputs()
+ self.pred_value = self.get_DQN_prediction(state)
+
+ with fluid.program_guard(self.train_program):
+ state, action, reward, next_s, isOver = self._get_inputs()
+ pred_value = self.get_DQN_prediction(state)
+
+ reward = fluid.layers.clip(reward, min=-1.0, max=1.0)
+
+ action_onehot = fluid.layers.one_hot(action, self.action_dim)
+ action_onehot = fluid.layers.cast(action_onehot, dtype='float32')
+
+ pred_action_value = fluid.layers.reduce_sum(
+ fluid.layers.elementwise_mul(action_onehot, pred_value), dim=1)
+
+ targetQ_predict_value = self.get_DQN_prediction(next_s, target=True)
+ best_v = fluid.layers.reduce_max(targetQ_predict_value, dim=1)
+ best_v.stop_gradient = True
+
+ target = reward + (1.0 - fluid.layers.cast(
+ isOver, dtype='float32')) * self.gamma * best_v
+ cost = fluid.layers.square_error_cost(pred_action_value, target)
+ cost = fluid.layers.reduce_mean(cost)
+
+ optimizer = fluid.optimizer.Adam(1e-3 * 0.5, epsilon=1e-3)
+ optimizer.minimize(cost)
+
+ vars = list(self.train_program.list_vars())
+ policy_vars = list(filter(
+ lambda x: 'GRAD' not in x.name and 'policy' in x.name, vars))
+ target_vars = list(filter(
+ lambda x: 'GRAD' not in x.name and 'target' in x.name, vars))
+ policy_vars.sort(key=lambda x: x.name)
+ target_vars.sort(key=lambda x: x.name)
+
+ with fluid.program_guard(self._sync_program):
+ sync_ops = []
+ for i, var in enumerate(policy_vars):
+ sync_op = fluid.layers.assign(policy_vars[i], target_vars[i])
+ sync_ops.append(sync_op)
+
+ # fluid exe
+ place = fluid.CUDAPlace(0) if self.use_cuda else fluid.CPUPlace()
+ self.exe = fluid.Executor(place)
+ self.exe.run(fluid.default_startup_program())
+
+ def get_DQN_prediction(self, image, target=False):
+ image = image / 255.0
+
+ variable_field = 'target' if target else 'policy'
+
+ conv1 = fluid.layers.conv2d(
+ input=image,
+ num_filters=32,
+ filter_size=5,
+ stride=1,
+ padding=2,
+ act='relu',
+ param_attr=ParamAttr(name='{}_conv1'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_conv1_b'.format(variable_field)))
+ max_pool1 = fluid.layers.pool2d(
+ input=conv1, pool_size=2, pool_stride=2, pool_type='max')
+
+ conv2 = fluid.layers.conv2d(
+ input=max_pool1,
+ num_filters=32,
+ filter_size=5,
+ stride=1,
+ padding=2,
+ act='relu',
+ param_attr=ParamAttr(name='{}_conv2'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_conv2_b'.format(variable_field)))
+ max_pool2 = fluid.layers.pool2d(
+ input=conv2, pool_size=2, pool_stride=2, pool_type='max')
+
+ conv3 = fluid.layers.conv2d(
+ input=max_pool2,
+ num_filters=64,
+ filter_size=4,
+ stride=1,
+ padding=1,
+ act='relu',
+ param_attr=ParamAttr(name='{}_conv3'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_conv3_b'.format(variable_field)))
+ max_pool3 = fluid.layers.pool2d(
+ input=conv3, pool_size=2, pool_stride=2, pool_type='max')
+
+ conv4 = fluid.layers.conv2d(
+ input=max_pool3,
+ num_filters=64,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ act='relu',
+ param_attr=ParamAttr(name='{}_conv4'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_conv4_b'.format(variable_field)))
+
+ flatten = fluid.layers.flatten(conv4, axis=1)
+
+ value = fluid.layers.fc(
+ input=flatten,
+ size=1,
+ param_attr=ParamAttr(name='{}_value_fc'.format(variable_field)),
+ bias_attr=ParamAttr(name='{}_value_fc_b'.format(variable_field)))
+
+ advantage = fluid.layers.fc(
+ input=flatten,
+ size=self.action_dim,
+ param_attr=ParamAttr(name='{}_advantage_fc'.format(variable_field)),
+ bias_attr=ParamAttr(
+ name='{}_advantage_fc_b'.format(variable_field)))
+
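+        # dueling aggregation: Q(s, a) = V(s) + (A(s, a) - mean_a A(s, a))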
+ Q = advantage + (value - fluid.layers.reduce_mean(
+ advantage, dim=1, keep_dim=True))
+ return Q
+
+
+ def act(self, state, train_or_test):
+ sample = np.random.random()
+ if train_or_test == 'train' and sample < self.exploration:
+ act = np.random.randint(self.action_dim)
+ else:
+ if np.random.random() < 0.01:
+ act = np.random.randint(self.action_dim)
+ else:
+ state = np.expand_dims(state, axis=0)
+ pred_Q = self.exe.run(self.predict_program,
+ feed={'state': state.astype('float32')},
+ fetch_list=[self.pred_value])[0]
+ pred_Q = np.squeeze(pred_Q, axis=0)
+ act = np.argmax(pred_Q)
+ if train_or_test == 'train':
+ self.exploration = max(0.1, self.exploration - 1e-6)
+ return act
+
+ def train(self, state, action, reward, next_state, isOver):
+ if self.global_step % self.update_target_steps == 0:
+ self.sync_target_network()
+ self.global_step += 1
+
+ action = np.expand_dims(action, -1)
+ self.exe.run(self.train_program,
+ feed={
+ 'state': state.astype('float32'),
+ 'action': action.astype('int32'),
+ 'reward': reward,
+ 'next_s': next_state.astype('float32'),
+ 'isOver': isOver
+ })
+
+ def sync_target_network(self):
+ self.exe.run(self._sync_program)
diff --git a/PaddleRL/DeepQNetwork/README.md b/PaddleRL/DeepQNetwork/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..1edeaaa884318ec3a530ec4fdb7d031d07411b56
--- /dev/null
+++ b/PaddleRL/DeepQNetwork/README.md
@@ -0,0 +1,67 @@
+[中文版](README_cn.md)
+
+## Reproduce DQN, DoubleDQN and DuelingDQN models with the Fluid version of PaddlePaddle
+Based on PaddlePaddle's next-generation API Fluid, this directory reproduces the DQN family of deep reinforcement learning models and achieves the same level of metrics as reported in the papers on classic Atari games. Each model takes the game image as input and predicts the next action end-to-end. The repository contains the following three models:
++ DQN in:
+[Human-level Control Through Deep Reinforcement Learning](http://www.nature.com/nature/journal/v518/n7540/full/nature14236.html)
++ DoubleDQN in:
+[Deep Reinforcement Learning with Double Q-Learning](https://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/viewPaper/12389)
++ DuelingDQN in:
+[Dueling Network Architectures for Deep Reinforcement Learning](http://proceedings.mlr.press/v48/wangf16.html)
+
+## Atari benchmark & performance
+
+### Atari games introduction
+
+Please see [here](https://gym.openai.com/envs/#atari) to learn more about Atari games.
+
+### Pong game result
+
+The average game reward obtained by each of the three models as training progresses is shown below (roughly 3 hours per 1 million steps):
+
+![Average reward on Pong](assets/dqn.png)
+
+## How to use
+### Dependencies:
++ python2.7
++ gym
++ tqdm
++ opencv-python
++ paddlepaddle-gpu>=1.0.0
++ ale_python_interface
+
+### Install Dependencies:
++ Install PaddlePaddle:
+  It is recommended to compile and install PaddlePaddle from source.
++ Install other dependencies:
+ ```
+ pip install -r requirement.txt
+ pip install gym[atari]
+ ```
+  To install ale_python_interface, please see [here](https://github.com/mgbellemare/Arcade-Learning-Environment).
+
+### Start Training:
+```
+# To train a model for Pong game with gpu (use DQN model as default)
+python train.py --rom ./rom_files/pong.bin --use_cuda
+
+# To train a model for Pong with DoubleDQN
+python train.py --rom ./rom_files/pong.bin --use_cuda --alg DoubleDQN
+
+# To train a model for Pong with DuelingDQN
+python train.py --rom ./rom_files/pong.bin --use_cuda --alg DuelingDQN
+```
+
+To train more games, you can install more rom files from [here](https://github.com/openai/atari-py/tree/master/atari_py/atari_roms).
+
+### Start Testing:
+```
+# Play the game with saved best model and calculate the average rewards
+python play.py --rom ./rom_files/pong.bin --use_cuda --model_path ./saved_model/DQN-pong
+
+# Play the game with visualization
+python play.py --rom ./rom_files/pong.bin --use_cuda --model_path ./saved_model/DQN-pong --viz 0.01
+```
+[Here](https://pan.baidu.com/s/1gIsbNw5V7tMeb74ojx-TMA) are trained models for the Pong and Breakout games. You can use them to play the games directly.
diff --git a/PaddleRL/DeepQNetwork/README_cn.md b/PaddleRL/DeepQNetwork/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..640d775ad8fed2be360d308b6c5df41c86d77c04
--- /dev/null
+++ b/PaddleRL/DeepQNetwork/README_cn.md
@@ -0,0 +1,71 @@
+## Reproducing DQN, DoubleDQN and DuelingDQN with the Fluid version of PaddlePaddle
+
+Based on PaddlePaddle's next-generation API Fluid, this directory reproduces the DQN models from deep reinforcement learning and matches the metrics reported in the papers on classic Atari games. The model takes the game image as input and predicts the control signal for the next step end-to-end. The repository contains the following three models:
++ DQN:
+[Human-level Control Through Deep Reinforcement Learning](http://www.nature.com/nature/journal/v518/n7540/full/nature14236.html)
++ DoubleDQN:
+[Deep Reinforcement Learning with Double Q-Learning](https://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/viewPaper/12389)
++ DuelingDQN:
+[Dueling Network Architectures for Deep Reinforcement Learning](http://proceedings.mlr.press/v48/wangf16.html)
+
+## Results: performance on Atari games
+
+### About the Atari games
+
+Please see [here](https://gym.openai.com/envs/#atari) to learn about the Atari games.
+
+### Training results on Pong
+The average game reward obtained by each of the three models as training progresses is shown below (roughly 3 hours per 1 million steps):
+
+![Average reward on Pong](assets/dqn.png)
+
+## How to use
+
+### Dependencies:
++ python2.7
++ gym
++ tqdm
++ opencv-python
++ paddlepaddle-gpu>=1.0.0
++ ale_python_interface
+
+### Install dependencies:
+
++ Install PaddlePaddle:
+  It is recommended to compile and install PaddlePaddle from source.
++ Install the other dependencies:
+  ```
+  pip install -r requirement.txt
+  pip install gym[atari]
+  ```
+  To install ale_python_interface, please see [here](https://github.com/mgbellemare/Arcade-Learning-Environment)
+
+### Train a model:
+
+```
+# Train on the Pong game with GPU (DQN model by default)
+python train.py --rom ./rom_files/pong.bin --use_cuda
+
+# Train the DoubleDQN model
+python train.py --rom ./rom_files/pong.bin --use_cuda --alg DoubleDQN
+
+# Train the DuelingDQN model
+python train.py --rom ./rom_files/pong.bin --use_cuda --alg DuelingDQN
+```
+
+To train on more games, you can download more game rom files from [here](https://github.com/openai/atari-py/tree/master/atari_py/atari_roms).
+
+### Test a model:
+
+```
+# Play the game with the best model saved during training and compute the average reward
+python play.py --rom ./rom_files/pong.bin --use_cuda --model_path ./saved_model/DQN-pong
+
+# Play the game with visualization
+python play.py --rom ./rom_files/pong.bin --use_cuda --model_path ./saved_model/DQN-pong --viz 0.01
+```
+
+[Here](https://pan.baidu.com/s/1gIsbNw5V7tMeb74ojx-TMA) are trained models for the Pong and Breakout games, which can be used for testing directly.
diff --git a/fluid/DeepQNetwork/assets/dqn.png b/PaddleRL/DeepQNetwork/assets/dqn.png
similarity index 100%
rename from fluid/DeepQNetwork/assets/dqn.png
rename to PaddleRL/DeepQNetwork/assets/dqn.png
diff --git a/fluid/DeepQNetwork/atari.py b/PaddleRL/DeepQNetwork/atari.py
similarity index 100%
rename from fluid/DeepQNetwork/atari.py
rename to PaddleRL/DeepQNetwork/atari.py
diff --git a/fluid/DeepQNetwork/atari_wrapper.py b/PaddleRL/DeepQNetwork/atari_wrapper.py
similarity index 100%
rename from fluid/DeepQNetwork/atari_wrapper.py
rename to PaddleRL/DeepQNetwork/atari_wrapper.py
diff --git a/fluid/DeepQNetwork/expreplay.py b/PaddleRL/DeepQNetwork/expreplay.py
similarity index 100%
rename from fluid/DeepQNetwork/expreplay.py
rename to PaddleRL/DeepQNetwork/expreplay.py
diff --git a/fluid/DeepQNetwork/play.py b/PaddleRL/DeepQNetwork/play.py
similarity index 100%
rename from fluid/DeepQNetwork/play.py
rename to PaddleRL/DeepQNetwork/play.py
diff --git a/fluid/DeepQNetwork/requirement.txt b/PaddleRL/DeepQNetwork/requirement.txt
similarity index 100%
rename from fluid/DeepQNetwork/requirement.txt
rename to PaddleRL/DeepQNetwork/requirement.txt
diff --git a/fluid/DeepQNetwork/rom_files/breakout.bin b/PaddleRL/DeepQNetwork/rom_files/breakout.bin
similarity index 100%
rename from fluid/DeepQNetwork/rom_files/breakout.bin
rename to PaddleRL/DeepQNetwork/rom_files/breakout.bin
diff --git a/fluid/DeepQNetwork/rom_files/pong.bin b/PaddleRL/DeepQNetwork/rom_files/pong.bin
similarity index 100%
rename from fluid/DeepQNetwork/rom_files/pong.bin
rename to PaddleRL/DeepQNetwork/rom_files/pong.bin
diff --git a/fluid/DeepQNetwork/train.py b/PaddleRL/DeepQNetwork/train.py
similarity index 100%
rename from fluid/DeepQNetwork/train.py
rename to PaddleRL/DeepQNetwork/train.py
diff --git a/PaddleRL/README.md b/PaddleRL/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..5b8d2caf78d426a14b96f7d842eb88ed37bab233
--- /dev/null
+++ b/PaddleRL/README.md
@@ -0,0 +1,11 @@
+PaddleRL
+============
+
+Reinforcement Learning
+--------
+
+Reinforcement learning has become an increasingly important branch of machine learning in recent years. In particular, deep reinforcement learning (DRL), which combines it with deep learning, has achieved many striking results. AlphaGo, well known for defeating top professional Go players, is a typical application of DRL; beyond games, applications include robotics, natural language processing, and more.
+
+The seminal work in deep reinforcement learning is its successful application to Atari video games: the model takes raw video frames as high-dimensional input and predicts the next action end-to-end from the image content. The model used is called the Deep Q-Network (DQN). This example uses the flexible PaddlePaddle Fluid framework to implement DQN and its variants and evaluates their performance on Atari games.
+
+- [DeepQNetwork](https://github.com/PaddlePaddle/models/blob/develop/PaddleRL/DeepQNetwork/README_cn.md)
diff --git a/PaddleRL/policy_gradient/README.md b/PaddleRL/policy_gradient/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..b813aa124466597adfb80261bee7c2de22b95e67
--- /dev/null
+++ b/PaddleRL/policy_gradient/README.md
@@ -0,0 +1,171 @@
+Running the example programs in this directory requires the latest develop branch of PaddlePaddle. If your installed version of PaddlePaddle is older than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update it.
+
+---
+
+# Policy Gradient RL by PaddlePaddle
+This article shows how to use PaddlePaddle to train a player (actor model) with a policy-based reinforcement learning method. We want this player to learn a simple ladder-climbing task.
+
+ Contents:
+
+ - Task description
+ - Model
+ - Policy (objective function)
+ - Algorithm (gradient ascent)
+ - Implementation in PaddlePaddle
+
+
+## 1. Task description
+Suppose there is a ladder connecting points A and B. The player starts from A and at each step can only move one step forward or one step backward; reaching B completes the task. We want to train a smart player that knows the fastest way from A to B.
+We simulate the task on the command line as follows:
+```
+A - O - - - - - B
+```
+Each '-' is one rung of the ladder; A is at the head of the line, B at the end, and O marks the player's current position.
+
+## 2. Policy Gradient
+### 2.1 Model
+#### input layer
+The input to the model is the current ladder state $S$ observed by the player; it must encode the length of the ladder and the player's current position.
+In the command-line simulation, two variables, the player's position and the ladder length, would be enough to describe the state, but to make this demo easy to extend to more complex tasks we represent the game state $S$ as a vector.
+The length of $S$ equals the length of the ladder, and each dimension corresponds to one rung: the player's position is 1 and all other positions are 0.
+Here is an example:
+```
+S = [0, 1, 0, 0] // a ladder of length 4; the player is on the second rung.
+```
+#### hidden layer
+The hidden part uses two fully connected layers, `FC_1` and `FC_2`, where `FC_1` has size 10 and `FC_2` has size 2.
+
+#### output layer
+We use softmax to map the output of `FC_2` to the probability distribution over the possible actions (forward or backward), a two-dimensional vector `act_probs`, where `act_probs[0]` is the probability of moving backward and `act_probs[1]` the probability of moving forward.
+
+#### Model representation
+We formalize our player model (actor) as:
+$$a = \pi_\theta(s)$$
+where $\theta$ denotes the model parameters and $s$ is the input state. A minimal code sketch of this network follows.
+
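+Under the assumptions above, the actor can be sketched in a few lines of Fluid. This is an illustrative sketch, not code from this example: the input name `state`, the ladder length 4 and the `tanh` activation are assumed choices.
+
+```python
+import paddle.fluid as fluid
+
+# state: one-hot position on a ladder of length 4 (the example above)
+state = fluid.layers.data(name='state', shape=[4], dtype='float32')
+fc_1 = fluid.layers.fc(input=state, size=10, act='tanh')  # FC_1, size 10
+fc_2 = fluid.layers.fc(input=fc_1, size=2)                # FC_2, size 2
+act_probs = fluid.layers.softmax(fc_2)  # [P(backward), P(forward)]
+```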
+
+### 2.2 Policy (objective function)
+How do we judge whether a player (model) is good? First, some terminology:
+Let $\pi_\theta(s)$ play one episode of the game; $s_t$ is the state at time $t$, $a_t$ is the action taken in state $s_t$, and $r_t$ is the reward received after taking action $a_t$.
+One episode can then be written as:
+$$\tau = [s_1, a_1, r_1, s_2, a_2, r_2 ... s_T, a_T, r_T] \tag{1}$$
+
+The total reward of one episode is:
+$$R(\tau) = \sum_{t=1}^Tr_t$$
+
+When the player plays an episode, many different trajectories $\tau$ can occur. The probability of a particular $\tau$ depends on the player model's $\theta$ and is written as:
+$$P(\tau | \theta)$$
+Then, given a $\theta$ (player model), the expected reward of playing one episode is:
+$$\overline {R}_\theta = \sum_\tau R(\tau) P(\tau|\theta)$$
+In most cases we cannot enumerate all $\tau$, so we sample N trajectories to approximate the expectation:
+$$\overline {R}_\theta = \sum_\tau R(\tau) P(\tau|\theta) \approx \frac{1}{N} \sum_{n=1}^N R(\tau^n)$$
+
+$\overline {R}_\theta$ is the objective function we need: it is the expected score of a player with parameters $\theta$ over one episode, and the larger it is, the stronger the player.
+### 2.3 Algorithm (gradient ascent)
+Our objective function is $\overline {R}_\theta$, and the training task is:
+$$\theta^* = \arg\max_\theta \overline {R}_\theta$$
+
+To find the ideal $\theta$, we use gradient ascent to repeatedly update $\theta$ along the gradient direction of $\overline {R}_\theta$:
+$$\theta' = \theta + \eta * \bigtriangledown \overline {R}_\theta$$
+
+$$ \bigtriangledown \overline {R}_\theta = \sum_\tau R(\tau) \bigtriangledown P(\tau|\theta)\\
+= \sum_\tau R(\tau) P(\tau|\theta) \frac{\bigtriangledown P(\tau|\theta)}{P(\tau|\theta)} \\
+=\sum_\tau R(\tau) P(\tau|\theta) {\bigtriangledown \log P(\tau|\theta)} $$
+
+
+$$P(\tau|\theta) = P(s_1)P(a_1|s_1,\theta)P(s_2, r_1|s_1,a_1)P(a_2|s_2,\theta)P(s_3,r_2|s_2,a_2)...P(a_t|s_t,\theta)P(s_{t+1}, r_t|s_t,a_t)\\
+=P(s_1) \prod_{t=1}^T P(a_t|s_t,\theta)P(s_{t+1}, r_t|s_t,a_t)$$
+
+$$\log P(\tau|\theta) = \log P(s_1) + \sum_{t=1}^T [\log P(a_t|s_t,\theta) + \log P(s_{t+1}, r_t|s_t,a_t)]$$
+
+$$ \bigtriangledown \log P(\tau|\theta) = \sum_{t=1}^T \bigtriangledown \log P(a_t|s_t,\theta)$$
+
+$$ \bigtriangledown \overline {R}_\theta = \sum_\tau R(\tau) P(\tau|\theta) {\bigtriangledown \log P(\tau|\theta)} \\
+\approx \frac{1}{N} \sum_{n=1}^N R(\tau^n) {\bigtriangledown \log P(\tau|\theta)} \\
+= \frac{1}{N} \sum_{n=1}^N R(\tau^n) {\sum_{t=1}^T \bigtriangledown \log P(a_t|s_t,\theta)} \\
+= \frac{1}{N} \sum_{n=1}^N \sum_{t=1}^T R(\tau^n) { \bigtriangledown \log P(a_t|s_t,\theta)} \tag{11}$$
+
+#### 2.3.2 Interpreting the derivative
+
+When training with a deep learning framework we usually solve by gradient descent, so we convert the gradient ascent above into gradient descent and rewrite equations (5) and (6) as:
+
+$$\theta^* = \arg\min_\theta (-\overline {R}_\theta) \tag{13}$$
+$$\theta' = \theta - \eta * \bigtriangledown (-\overline {R}_\theta) \tag{14}$$
+
+Following the derivation of the previous section, $ -\bigtriangledown \overline {R}_\theta $ works out to:
+
+$$ -\bigtriangledown \overline {R}_\theta
+= \frac{1}{N} \sum_{n=1}^N \sum_{t=1}^T R(\tau^n) { \bigtriangledown -\log P(a_t|s_t,\theta)} \tag{15}$$
+
+According to equation (14), our player model can be designed as:
+
+![Figure 1](images/PG_1.svg)
+
+Figure 1
+
+A single move in one episode can be written as the tuple $(s_t, a_t)$: in state $s_t$ the player took action $a_t$. The forward network in Figure 1 computes the cross-entropy cost $-\log P(a_t|s_t,\theta)$ for it, which is exactly the term to be differentiated in equation (15).
+Figure 1 gives the player model we need: its forward pass predicts which action to take in any state. But how do we train this network? Equation (15) still contains the factor $R(\tau^n)$, which has to be multiplied in during back-propagation, so we add $R(\tau^n)$ on top of Figure 1, as shown in Figure 2:
+
+![Figure 2](images/PG_2.svg)
+
+Figure 2
+
+Figure 2 is our final network structure.
+
+#### 2.3.3 Intuition
+Consider a single move in equation (15), i.e. the term $R(\tau^n) { \bigtriangledown -\log P(a_t|s_t,\theta)}$. We can loosely say that training aims to make $R(\tau^n) {[ -\log P(a_t|s_t,\theta)]}$ as small as possible, which is the same as making $R(\tau^n) \log P(a_t|s_t,\theta)$ as large as possible.
+
+- If the reward $R(\tau^n)$ of the current episode is positive, we want the probability $P(a_t|s_t,\theta)$ of the current action to be as large as possible.
+- If the reward $R(\tau^n)$ of the current episode is negative, we want the probability $P(a_t|s_t,\theta)$ of the current action to be as small as possible.
+
+#### 2.3.4 One problem
+
+When one person errs, the whole clan is punished; when one attains enlightenment, even their chickens and dogs ascend to heaven. In other words, if an episode earns a reward, we want every action that contributed to it to be reinforced; otherwise, every action that led to the penalty is discouraged.
+Sounds reasonable, doesn't it? But what if some game settings only hand out rewards and never penalties, so that every $R(\tau^n)$ is positive?
+Different game settings call for different remedies:
+
+1. Episodes score differently: subtract a bias from each episode's score, so the results become both positive and negative.
+2. Episodes score identically: use the time taken to finish an episode as a scoring factor, and subtract a bias.
+
+The game described in Chapter 1 needs the second remedy: the player receives a reward of 1 every time it reaches the goal, so we can define the reward R in terms of the number of steps used to finish the task.
+Going a step further, we believe the actions within one episode contribute unequally to the outcome: there are smart moves and foolish ones. Intuitively, earlier actions tend to be foolish and later actions smart. With this value judgment, the reward of 1 should not be split evenly across all actions.
+As shown in Figure 3, we line up all actions in order, assign each action a reward that decays from back to front, and then subtract the mean of all action rewards from each one (a short code sketch follows Figure 3):
+
+![Figure 3](images/PG_3.svg)
+
+Figure 3
+
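+One way to implement this decaying, mean-centered credit assignment is sketched below. It is an illustrative sketch: the decay factor `gamma` and the terminal reward of 1 are assumptions, not values fixed by this example.
+
+```python
+import numpy as np
+
+def shape_rewards(num_steps, gamma=0.99):
+    """Decay a terminal reward of 1 backwards over the episode, then subtract the mean."""
+    rewards = np.array([gamma ** (num_steps - 1 - t) for t in range(num_steps)])
+    return rewards - rewards.mean()
+```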
+
+## 3. Training results
+
+The demo trains as shown below; after about 1000 rounds of trials, our player has learned how to finish the task efficiently:
+
+```
+---------O epoch: 0; steps: 42
+---------O epoch: 1; steps: 77
+---------O epoch: 2; steps: 82
+---------O epoch: 3; steps: 64
+---------O epoch: 4; steps: 79
+---------O epoch: 501; steps: 19
+---------O epoch: 1001; steps: 9
+---------O epoch: 1501; steps: 9
+---------O epoch: 2001; steps: 11
+---------O epoch: 2501; steps: 9
+---------O epoch: 3001; steps: 9
+---------O epoch: 3002; steps: 9
+---------O epoch: 3003; steps: 9
+---------O epoch: 3004; steps: 9
+---------O epoch: 3005; steps: 9
+---------O epoch: 3006; steps: 9
+---------O epoch: 3007; steps: 9
+---------O epoch: 3008; steps: 9
+---------O epoch: 3009; steps: 9
+---------O epoch: 3010; steps: 11
+---------O epoch: 3011; steps: 9
+---------O epoch: 3012; steps: 9
+---------O epoch: 3013; steps: 9
+---------O epoch: 3014; steps: 9
+```
diff --git a/fluid/policy_gradient/brain.py b/PaddleRL/policy_gradient/brain.py
similarity index 100%
rename from fluid/policy_gradient/brain.py
rename to PaddleRL/policy_gradient/brain.py
diff --git a/fluid/policy_gradient/env.py b/PaddleRL/policy_gradient/env.py
similarity index 100%
rename from fluid/policy_gradient/env.py
rename to PaddleRL/policy_gradient/env.py
diff --git a/fluid/policy_gradient/images/PG_1.svg b/PaddleRL/policy_gradient/images/PG_1.svg
similarity index 100%
rename from fluid/policy_gradient/images/PG_1.svg
rename to PaddleRL/policy_gradient/images/PG_1.svg
diff --git a/fluid/policy_gradient/images/PG_2.svg b/PaddleRL/policy_gradient/images/PG_2.svg
similarity index 100%
rename from fluid/policy_gradient/images/PG_2.svg
rename to PaddleRL/policy_gradient/images/PG_2.svg
diff --git a/fluid/policy_gradient/images/PG_3.svg b/PaddleRL/policy_gradient/images/PG_3.svg
similarity index 100%
rename from fluid/policy_gradient/images/PG_3.svg
rename to PaddleRL/policy_gradient/images/PG_3.svg
diff --git a/fluid/policy_gradient/run.py b/PaddleRL/policy_gradient/run.py
similarity index 100%
rename from fluid/policy_gradient/run.py
rename to PaddleRL/policy_gradient/run.py
diff --git a/PaddleRec/README.md b/PaddleRec/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..45b3e87ee05cf553a5683b5f4f27089ba083fb5a
--- /dev/null
+++ b/PaddleRec/README.md
@@ -0,0 +1,18 @@
+PaddleRec
+=========
+
+Personalized Recommendation
+-------
+
+Recommender systems play an ever larger role in today's internet services. Most e-commerce systems, social networks, ad recommendation and search engines use some form of personalized recommendation technology to help users quickly find the information they want.
+
+In industrial-grade recommender systems, the recommendation strategy is usually split into several modules executed in series. Taking a news recommender system as an example, there are several stages where deep learning can be applied, such as automatic news tagging, personalized news recall, and personalized matching and ranking. PaddlePaddle provides full support for training recommendation algorithms and offers a variety of model configurations to choose from.
+
+- [TagSpace](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec/tagspace)
+- [GRU4Rec](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec/gru4rec)
+- [SequenceSemanticRetrieval](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec/ssr)
+- [DeepCTR](https://github.com/PaddlePaddle/models/blob/develop/PaddleRec/ctr/README.cn.md)
+- [Multiview-Simnet](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec/multiview_simnet)
+- [Word2Vec](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec/word2vec)
+- [GraphNeuralNetwork](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec/gnn)
+- [DeepInterestNetwork](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec/din)
diff --git a/PaddleRec/__init__.py b/PaddleRec/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/PaddleRec/ctr/.run_ce.sh b/PaddleRec/ctr/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..fc1e1303a06f96086eea679142770e3aefac3ff3
--- /dev/null
+++ b/PaddleRec/ctr/.run_ce.sh
@@ -0,0 +1,22 @@
+#!/bin/bash
+
+export MKL_NUM_THREADS=1
+export OMP_NUM_THREADS=1
+
+
+#cudaid=${face_detection:=0} # use 0-th card as default
+#export CUDA_VISIBLE_DEVICES=$cudaid
+export CPU_NUM=1
+export NUM_THREADS=1
+
+FLAGS_benchmark=true python train.py --is_local 1 --cloud_train 0 --train_data_path data/raw/train.txt --enable_ce | python _ce.py
+
+export CPU_NUM=1
+export NUM_THREADS=8
+
+FLAGS_benchmark=true python train.py --is_local 1 --cloud_train 0 --train_data_path data/raw/train.txt --enable_ce | python _ce.py
+
+export CPU_NUM=8
+export NUM_THREADS=8
+
+FLAGS_benchmark=true python train.py --is_local 1 --cloud_train 0 --train_data_path data/raw/train.txt --enable_ce | python _ce.py
diff --git a/PaddleRec/ctr/README.cn.md b/PaddleRec/ctr/README.cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..05d1653e52c1db36e9690c64166283afc26df429
--- /dev/null
+++ b/PaddleRec/ctr/README.cn.md
@@ -0,0 +1,79 @@
+
+# 基于DNN模型的点击率预估模型
+
+## 介绍
+本模型实现了下述论文中提出的DNN模型:
+
+```text
+@inproceedings{guo2017deepfm,
+ title={DeepFM: A Factorization-Machine based Neural Network for CTR Prediction},
+ author={Huifeng Guo, Ruiming Tang, Yunming Ye, Zhenguo Li and Xiuqiang He},
+ booktitle={the Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI)},
+ pages={1725--1731},
+ year={2017}
+}
+```
+
+## 运行环境
+需要先安装PaddlePaddle Fluid,然后运行:
+
+```shell
+pip install -r requirements.txt
+```
+
+## 数据集
+本文使用的是Kaggle公司举办的[展示广告竞赛](https://www.kaggle.com/c/criteo-display-ad-challenge/)中所使用的Criteo数据集。
+
+每一行是一次广告展示的特征,第一列是一个标签,表示这次广告展示是否被点击。总共有39个特征,其中13个特征采用整型值,另外26个特征是类别型特征。测试集中是没有标签的。
+
+下载数据集:
+```bash
+cd data && ./download.sh && cd ..
+```
+
+## 模型
+本例子只实现了DeepFM论文中介绍的模型的DNN部分,DeepFM会在其他例子中给出。
+
+
+## 数据准备
+处理原始数据集,整型特征使用min-max归一化方法规范到[0, 1],类别型特征使用了one-hot编码。原始数据集分割成两部分:90%用于训练,其余10%用于训练过程中的验证。
+
+## 训练
+训练的命令行选项可以通过`python train.py -h`列出。
+
+### 单机训练:
+```bash
+python train.py \
+ --train_data_path data/raw/train.txt \
+ 2>&1 | tee train.log
+```
+
+训练到第1轮的第40000个batch后,测试的AUC为0.801178,误差(cost)为0.445196。
+
+### 分布式训练
+
+本地启动一个2 trainer 2 pserver的分布式训练任务。分布式场景下训练数据会按照trainer的id进行切分,保证trainer之间的训练数据不会重叠,提高训练效率。
+
+```bash
+sh cluster_train.sh
+```
+
+## 预测
+预测的命令行选项可以通过`python infer.py -h`列出。
+
+对测试集进行预测:
+```bash
+python infer.py \
+ --model_path models/pass-0/ \
+ --data_path data/raw/valid.txt
+```
+注意:infer.py跑完最后输出的AUC才是整个预测文件的整体AUC。
+
+## 在百度云上运行集群训练
+1. 参考文档 [在百度云上启动Fluid分布式训练](https://github.com/PaddlePaddle/FluidDoc/blob/develop/doc/fluid/user_guides/howto/training/train_on_baidu_cloud_cn.rst) 在百度云上部署一个CPU集群。
+1. 用preprocess.py处理训练数据生成train.txt。
+1. 将train.txt按集群机器数切分,放到每台机器上。
+1. 用上面的 `分布式训练` 中的命令行启动分布式训练任务。
+
+## 在PaddleCloud上运行集群训练
+如果你正在使用PaddleCloud做集群训练,你可以使用```cloud.py```这个文件来帮助你提交任务,```train.py```中所需要的参数可以通过PaddleCloud的环境变量来提交。
\ No newline at end of file
diff --git a/PaddleRec/ctr/README.md b/PaddleRec/ctr/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..e29e2e1eb5493fc52b8bb38a6bf49bd397bb8455
--- /dev/null
+++ b/PaddleRec/ctr/README.md
@@ -0,0 +1,96 @@
+
+# DNN for Click-Through Rate prediction
+
+## Introduction
+This model implements the DNN part proposed in the following paper:
+
+```text
+@inproceedings{guo2017deepfm,
+ title={DeepFM: A Factorization-Machine based Neural Network for CTR Prediction},
+ author={Huifeng Guo, Ruiming Tang, Yunming Ye, Zhenguo Li and Xiuqiang He},
+ booktitle={the Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI)},
+ pages={1725--1731},
+ year={2017}
+}
+```
+
+DeepFM combines factorization machines and deep neural networks to model
+both low-order and high-order feature interactions. For details of
+factorization machines, please refer to the paper [factorization
+machines](https://www.csie.ntu.edu.tw/~b97053/paper/Rendle2010FM.pdf).
+
+## Environment
+You should install PaddlePaddle Fluid first, and run:
+
+```shell
+pip install -r requirements.txt
+```
+
+## Dataset
+This example uses the Criteo dataset, which was used for the [Display Advertising
+Challenge](https://www.kaggle.com/c/criteo-display-ad-challenge/)
+hosted by Kaggle.
+
+Each row holds the features of one ad impression, and the first column is a label
+indicating whether the ad was clicked. There are 39 features in
+total: 13 features take integer values and the other 26 features are
+categorical. For the test dataset, the labels are omitted.
+
+Download dataset:
+```bash
+cd data && ./download.sh && cd ..
+```
+
+## Model
+This demo implements only the DNN part of the model described in the DeepFM
+paper; the full DeepFM model will be provided in another example.
+
+
+## Data Preprocessing method
+To preprocess the raw dataset, the integer features are clipped and then min-max
+normalized to [0, 1], and the categorical features are one-hot encoded. The raw
+training dataset is split so that 90% is used for training and the other
+10% for validation during training. In reader.py, the training data is the first
+90% of the lines in train.txt, and the validation data is the rest.
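+
+In code, the clip + min-max step amounts to the following sketch (the names here are illustrative; the actual logic lives in `preprocess.py`):
+
+```python
+def min_max_normalize(val, lo, hi, clip):
+    # Clip an integer feature at its cap (derived from the 95% quantile),
+    # then scale it into [0, 1].
+    val = min(val, clip)
+    return (val - lo) / float(hi - lo) if hi > lo else 0.0
+```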
+
+## Train
+The command line options for training can be listed by `python train.py -h`.
+
+### Local Train:
+```bash
+python train.py \
+ --train_data_path data/raw/train.txt \
+ 2>&1 | tee train.log
+```
+
+After training reaches batch 40000 of pass 1, the test AUC is `0.801178` and the test
+cost is `0.445196`.
+
+### Distributed Train
+Run a distributed training job with 2 pservers and 2 trainers on a single machine.
+In the distributed setting, training data is split by trainer_id, so that the
+training data of different trainers does not overlap.
+
+```bash
+sh cluster_train.sh
+```
+
+## Infer
+The command line options for inference can be listed by `python infer.py -h`.
+
+To make inference for the test dataset:
+```bash
+python infer.py \
+ --model_path models/ \
+ --data_path data/raw/train.txt
+```
+Note: the AUC value in the last log line is the overall AUC for the whole test dataset. Here, train.txt is split inside reader.py so that the validation data does not overlap with the training data.
+
+## Train on Baidu Cloud
+1. Please prepare some CPU machines on Baidu Cloud following the steps in [train_on_baidu_cloud](https://github.com/PaddlePaddle/FluidDoc/blob/develop/doc/fluid/user_guides/howto/training/train_on_baidu_cloud_cn.rst)
+1. Prepare dataset using preprocess.py.
+1. Split the train.txt to trainer_num parts and put them on the machines.
+1. Start the distributed training job using the command from the `Distributed Train` section above.
+
+## Train on Paddle Cloud
+If you want to run this training job on PaddleCloud, you can use the script ```cloud.py```; the arguments needed by ```train.py``` can be set through PaddleCloud environment variables.
\ No newline at end of file
diff --git a/PaddleRec/ctr/__init__.py b/PaddleRec/ctr/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/PaddleRec/ctr/_ce.py b/PaddleRec/ctr/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..91867d036a050f03b7c685ba73c3051eca97c9aa
--- /dev/null
+++ b/PaddleRec/ctr/_ce.py
@@ -0,0 +1,78 @@
+# this file is only used for continuous evaluation test!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi
+from kpi import DurationKpi
+from kpi import AccKpi
+
+
+each_pass_duration_cpu1_thread1_kpi = DurationKpi('each_pass_duration_cpu1_thread1', 0.08, 0, actived=True)
+train_loss_cpu1_thread1_kpi = CostKpi('train_loss_cpu1_thread1', 0.08, 0)
+train_auc_val_cpu1_thread1_kpi = AccKpi('train_auc_val_cpu1_thread1', 0.08, 0)
+train_batch_auc_val_cpu1_thread1_kpi = AccKpi('train_batch_auc_val_cpu1_thread1', 0.08, 0)
+each_pass_duration_cpu1_thread8_kpi = DurationKpi('each_pass_duration_cpu1_thread8', 0.08, 0, actived=True)
+train_loss_cpu1_thread8_kpi = CostKpi('train_loss_cpu1_thread8', 0.08, 0)
+train_auc_val_cpu1_thread8_kpi = AccKpi('train_auc_val_cpu1_thread8', 0.08, 0)
+train_batch_auc_val_cpu1_thread8_kpi = AccKpi('train_batch_auc_val_cpu1_thread8', 0.08, 0)
+each_pass_duration_cpu8_thread8_kpi = DurationKpi('each_pass_duration_cpu8_thread8', 0.08, 0, actived=True)
+train_loss_cpu8_thread8_kpi = CostKpi('train_loss_cpu8_thread8', 0.08, 0)
+train_auc_val_cpu8_thread8_kpi = AccKpi('train_auc_val_cpu8_thread8', 0.08, 0)
+train_batch_auc_val_cpu8_thread8_kpi = AccKpi('train_batch_auc_val_cpu8_thread8', 0.08, 0)
+
+tracking_kpis = [
+ each_pass_duration_cpu1_thread1_kpi,
+ train_loss_cpu1_thread1_kpi,
+ train_auc_val_cpu1_thread1_kpi,
+ train_batch_auc_val_cpu1_thread1_kpi,
+ each_pass_duration_cpu1_thread8_kpi,
+ train_loss_cpu1_thread8_kpi,
+ train_auc_val_cpu1_thread8_kpi,
+ train_batch_auc_val_cpu1_thread8_kpi,
+ each_pass_duration_cpu8_thread8_kpi,
+ train_loss_cpu8_thread8_kpi,
+ train_auc_val_cpu8_thread8_kpi,
+ train_batch_auc_val_cpu8_thread8_kpi,
+ ]
+
+
+def parse_log(log):
+ '''
+ This method should be implemented by model developers.
+
+ The suggestion:
+
+ each line in the log should be key, value, for example:
+
+ "
+ train_cost\t1.0
+ test_cost\t1.0
+ train_cost\t1.0
+ train_cost\t1.0
+ train_acc\t1.2
+ "
+ '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ log_to_ce(log)
diff --git a/PaddleRec/ctr/cloud.py b/PaddleRec/ctr/cloud.py
new file mode 100644
index 0000000000000000000000000000000000000000..c5388c6a9ca77380bea1073267233b16eb202513
--- /dev/null
+++ b/PaddleRec/ctr/cloud.py
@@ -0,0 +1,143 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+# ======================================================================
+#
+# Copyright (c) 2017 Baidu.com, Inc. All Rights Reserved
+#
+# ======================================================================
+"""this file is only for PaddleCloud"""
+
+import os
+
+import logging
+
+import paddle.fluid.contrib.utils.hdfs_utils as hdfs_utils
+
+logging.basicConfig(
+ format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger("cloud")
+logger.setLevel(logging.INFO)
+
+
+def run():
+ cmd = "python -u train.py "
+
+ cmd += " --train_data_path %s " % "data/train.txt"
+
+ cmd += " --test_data_path %s " % "data/test.txt"
+
+ if os.getenv("BATCH_SIZE", ""):
+ cmd += " --batch_size %s " % os.getenv("BATCH_SIZE")
+
+ if os.getenv("EMBEDDING_SIZE", ""):
+ cmd += " --embedding_size %s " % os.getenv("EMBEDDING_SIZE")
+
+ if os.getenv("NUM_PASSES", ""):
+ cmd += " --num_passes %s " % os.getenv("NUM_PASSES")
+
+ if os.getenv("MODEL_OUTPUT_DIR", ""):
+ cmd += " --model_output_dir %s " % os.getenv("MODEL_OUTPUT_DIR")
+
+ if os.getenv("SPARSE_FEATURE_DIM", ""):
+ cmd += " --sparse_feature_dim %s " % os.getenv("SPARSE_FEATURE_DIM")
+
+ if os.getenv("ASYNC_MODE", ""):
+ cmd += " --async_mode "
+
+ if os.getenv("NO_SPLIT_VAR", ""):
+ cmd += " --no_split_var "
+
+ is_local = int(os.getenv("PADDLE_IS_LOCAL", "1"))
+
+ if is_local:
+ cmd += " --is_local 1 "
+ cmd += " --cloud_train 0 "
+ else:
+ cmd += " --is_local 0 "
+ cmd += " --cloud_train 1 "
+
+ trainer_id = int(os.environ["PADDLE_TRAINER_ID"])
+ trainers = int(os.environ["PADDLE_TRAINERS"])
+ training_role = os.environ["PADDLE_TRAINING_ROLE"]
+
+ port = os.getenv("PADDLE_PSERVER_PORT", "6174")
+ pserver_ips = os.getenv("PADDLE_PSERVER_IPS", "")
+ eplist = []
+ for ip in pserver_ips.split(","):
+ eplist.append(':'.join([ip, port]))
+ pserver_endpoints = ",".join(eplist)
+ current_endpoint = os.getenv("PADDLE_CURRENT_IP", "") + ":" + port
+
+ if training_role == "PSERVER":
+ cmd += " --role pserver "
+ else:
+ cmd += " --role trainer "
+ cmd += " --endpoints %s " % pserver_endpoints
+ cmd += " --current_endpoint %s " % current_endpoint
+ cmd += " --trainer_id %s " % trainer_id
+ cmd += " --trainers %s " % trainers
+
+ logging.info("run cluster commands: {}".format(cmd))
+
+ exit(os.system(cmd))
+
+
+def download():
+ hadoop_home = os.getenv("HADOOP_HOME")
+
+ configs = {}
+ configs["fs.default.name"] = os.getenv("DATA_FS_NAME")
+ configs["hadoop.job.ugi"] = os.getenv("DATA_FS_UGI")
+ client = hdfs_utils.HDFSClient(hadoop_home, configs)
+
+ local_train_data_dir = os.getenv("TRAIN_DATA_LOCAL", "data")
+ hdfs_train_data_dir = os.getenv("TRAIN_DATA_HDFS", "")
+
+ downloads = hdfs_utils.multi_download(client, hdfs_train_data_dir, local_train_data_dir, 0, 1, multi_processes=1)
+
+    for d in downloads:
+        base_dir = os.path.dirname(d)
+        tar_cmd = "tar -zxvf {} -C {}".format(d, base_dir)
+        logging.info("DOWNLOAD DATA: {}, AND TAR IT: {}".format(d, tar_cmd))
+        os.system(tar_cmd)
+
+
+def env_declar():
+ logging.info("******** Rename Cluster Env to PaddleFluid Env ********")
+
+ if os.environ["TRAINING_ROLE"] == "PSERVER" or os.environ["PADDLE_IS_LOCAL"] == "0":
+ os.environ["PADDLE_TRAINING_ROLE"] = os.environ["TRAINING_ROLE"]
+ os.environ["PADDLE_PSERVER_PORT"] = os.environ["PADDLE_PORT"]
+ os.environ["PADDLE_PSERVER_IPS"] = os.environ["PADDLE_PSERVERS"]
+ os.environ["PADDLE_TRAINERS"] = os.environ["PADDLE_TRAINERS_NUM"]
+ os.environ["PADDLE_CURRENT_IP"] = os.environ["POD_IP"]
+ os.environ["PADDLE_TRAINER_ID"] = os.environ["PADDLE_TRAINER_ID"]
+
+ os.environ["CPU_NUM"] = os.getenv("CPU_NUM", "12")
+ os.environ["NUM_THREADS"] = os.getenv("NUM_THREADS", "12")
+
+ logging.info("Content-Type: text/plain\n\n")
+ for key in os.environ.keys():
+ logging.info("%30s %s \n" % (key, os.environ[key]))
+
+ logging.info("****** Rename Cluster Env to PaddleFluid Env END ******")
+
+
+if __name__ == '__main__':
+ env_declar()
+
+ if os.getenv("NEED_CUSTOM_DOWNLOAD", ""):
+
+ if os.environ["PADDLE_TRAINING_ROLE"] == "PSERVER":
+ logging.info("PSERVER do not need to download datas")
+ else:
+ logging.info("NEED_CUSTOM_DOWNLOAD is True, will download train data with hdfs_utils")
+ download()
+
+ run()
diff --git a/fluid/PaddleRec/ctr/cluster_train.sh b/PaddleRec/ctr/cluster_train.sh
similarity index 100%
rename from fluid/PaddleRec/ctr/cluster_train.sh
rename to PaddleRec/ctr/cluster_train.sh
diff --git a/PaddleRec/ctr/data/download.sh b/PaddleRec/ctr/data/download.sh
new file mode 100755
index 0000000000000000000000000000000000000000..f5c301ee48f6ef10bf80079a6820056571c4dbc7
--- /dev/null
+++ b/PaddleRec/ctr/data/download.sh
@@ -0,0 +1,8 @@
+#!/bin/bash
+
+wget --no-check-certificate https://s3-eu-west-1.amazonaws.com/kaggle-display-advertising-challenge-dataset/dac.tar.gz
+tar zxf dac.tar.gz
+rm -f dac.tar.gz
+
+mkdir raw
+mv ./*.txt raw/
diff --git a/PaddleRec/ctr/infer.py b/PaddleRec/ctr/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..19f7013b305e3900fe05b7575cfc20c331ca5daf
--- /dev/null
+++ b/PaddleRec/ctr/infer.py
@@ -0,0 +1,91 @@
+import argparse
+import logging
+
+import numpy as np
+# disable gpu training for this example
+import os
+os.environ["CUDA_VISIBLE_DEVICES"] = ""
+import paddle
+import paddle.fluid as fluid
+
+import reader
+from network_conf import ctr_dnn_model
+
+
+logging.basicConfig(
+ format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser(description="PaddlePaddle DeepFM example")
+ parser.add_argument(
+ '--model_path',
+ type=str,
+ required=True,
+ help="The path of model parameters gz file")
+ parser.add_argument(
+ '--data_path',
+ type=str,
+ required=True,
+ help="The path of the dataset to infer")
+ parser.add_argument(
+ '--embedding_size',
+ type=int,
+ default=10,
+ help="The size for embedding layer (default:10)")
+ parser.add_argument(
+ '--sparse_feature_dim',
+ type=int,
+ default=1000001,
+ help="The size for embedding layer (default:1000001)")
+ parser.add_argument(
+ '--batch_size',
+ type=int,
+ default=1000,
+ help="The size of mini-batch (default:1000)")
+
+ return parser.parse_args()
+
+
+def infer():
+ args = parse_args()
+
+ place = fluid.CPUPlace()
+ inference_scope = fluid.core.Scope()
+
+ dataset = reader.CriteoDataset(args.sparse_feature_dim)
+ test_reader = paddle.batch(dataset.test([args.data_path]), batch_size=args.batch_size)
+
+ startup_program = fluid.framework.Program()
+ test_program = fluid.framework.Program()
+ with fluid.framework.program_guard(test_program, startup_program):
+ loss, auc_var, batch_auc_var, _, data_list = ctr_dnn_model(args.embedding_size, args.sparse_feature_dim, False)
+
+ exe = fluid.Executor(place)
+
+ feeder = fluid.DataFeeder(feed_list=data_list, place=place)
+
+ fluid.io.load_persistables(executor=exe, dirname=args.model_path,
+ main_program=fluid.default_main_program())
+
+ def set_zero(var_name):
+ param = inference_scope.var(var_name).get_tensor()
+ param_array = np.zeros(param._get_dims()).astype("int64")
+ param.set(param_array, place)
+
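+    # zero out the auc op's (auto-generated) accumulator state variables so
+    # the test AUC is computed from scratch rather than from stale state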
+ auc_states_names = ['_generated_var_2', '_generated_var_3']
+ for name in auc_states_names:
+ set_zero(name)
+
+ for batch_id, data in enumerate(test_reader()):
+ loss_val, auc_val = exe.run(test_program,
+ feed=feeder.feed(data),
+ fetch_list=[loss, auc_var])
+ if batch_id % 100 == 0:
+ logger.info("TEST --> batch: {} loss: {} auc: {}".format(batch_id, loss_val/args.batch_size, auc_val))
+
+
+if __name__ == '__main__':
+ infer()
diff --git a/PaddleRec/ctr/network_conf.py b/PaddleRec/ctr/network_conf.py
new file mode 100644
index 0000000000000000000000000000000000000000..569e84f46b327e84f70c622f06e99ddba031b4f6
--- /dev/null
+++ b/PaddleRec/ctr/network_conf.py
@@ -0,0 +1,161 @@
+import paddle.fluid as fluid
+import math
+
+dense_feature_dim = 13
+
+
+def ctr_deepfm_model(factor_size, sparse_feature_dim, dense_feature_dim, sparse_input):
+ def dense_fm_layer(input, emb_dict_size, factor_size, fm_param_attr):
+ """
+ dense_fm_layer
+ """
+ first_order = fluid.layers.fc(input=input, size=1)
+ emb_table = fluid.layers.create_parameter(shape=[emb_dict_size, factor_size],
+ dtype='float32', attr=fm_param_attr)
+
+ input_mul_factor = fluid.layers.matmul(input, emb_table)
+ input_mul_factor_square = fluid.layers.square(input_mul_factor)
+ input_square = fluid.layers.square(input)
+ factor_square = fluid.layers.square(emb_table)
+ input_square_mul_factor_square = fluid.layers.matmul(input_square, factor_square)
+
+ second_order = 0.5 * (input_mul_factor_square - input_square_mul_factor_square)
+ return first_order, second_order
+
+ def sparse_fm_layer(input, emb_dict_size, factor_size, fm_param_attr):
+ """
+ sparse_fm_layer
+ """
+ first_embeddings = fluid.layers.embedding(
+ input=input, dtype='float32', size=[emb_dict_size, 1], is_sparse=True)
+ first_order = fluid.layers.sequence_pool(input=first_embeddings, pool_type='sum')
+
+ nonzero_embeddings = fluid.layers.embedding(
+ input=input, dtype='float32', size=[emb_dict_size, factor_size],
+ param_attr=fm_param_attr, is_sparse=True)
+ summed_features_emb = fluid.layers.sequence_pool(input=nonzero_embeddings, pool_type='sum')
+ summed_features_emb_square = fluid.layers.square(summed_features_emb)
+
+ squared_features_emb = fluid.layers.square(nonzero_embeddings)
+ squared_sum_features_emb = fluid.layers.sequence_pool(
+ input=squared_features_emb, pool_type='sum')
+
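+        # FM second-order term: 0.5 * ((sum of embeddings)^2 - sum of squared embeddings)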
+ second_order = 0.5 * (summed_features_emb_square - squared_sum_features_emb)
+ return first_order, second_order
+
+ dense_input = fluid.layers.data(name="dense_input", shape=[dense_feature_dim], dtype='float32')
+
+ sparse_input_ids = [
+ fluid.layers.data(name="C" + str(i), shape=[1], lod_level=1, dtype='int64')
+ for i in range(1, 27)]
+
+ label = fluid.layers.data(name='label', shape=[1], dtype='int64')
+
+ datas = [dense_input] + sparse_input_ids + [label]
+
+ py_reader = fluid.layers.create_py_reader_by_data(capacity=64,
+ feed_list=datas,
+ name='py_reader',
+ use_double_buffer=True)
+ words = fluid.layers.read_file(py_reader)
+
+ sparse_fm_param_attr = fluid.param_attr.ParamAttr(name="SparseFeatFactors",
+ initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(sparse_feature_dim)))
+ dense_fm_param_attr = fluid.param_attr.ParamAttr(name="DenseFeatFactors",
+ initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(dense_feature_dim)))
+
+ sparse_fm_first, sparse_fm_second = sparse_fm_layer(
+ sparse_input, sparse_feature_dim, factor_size, sparse_fm_param_attr)
+ dense_fm_first, dense_fm_second = dense_fm_layer(
+ dense_input, dense_feature_dim, factor_size, dense_fm_param_attr)
+
+ def embedding_layer(input):
+ """embedding_layer"""
+ emb = fluid.layers.embedding(
+ input=input, dtype='float32', size=[sparse_feature_dim, factor_size],
+ param_attr=sparse_fm_param_attr, is_sparse=True)
+ return fluid.layers.sequence_pool(input=emb, pool_type='average')
+
+ sparse_embed_seq = list(map(embedding_layer, sparse_input_ids))
+ concated = fluid.layers.concat(sparse_embed_seq + [dense_input], axis=1)
+ fc1 = fluid.layers.fc(input=concated, size=400, act='relu',
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(concated.shape[1]))))
+ fc2 = fluid.layers.fc(input=fc1, size=400, act='relu',
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(fc1.shape[1]))))
+ fc3 = fluid.layers.fc(input=fc2, size=400, act='relu',
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(fc2.shape[1]))))
+ predict = fluid.layers.fc(
+ input=[sparse_fm_first, sparse_fm_second, dense_fm_first, dense_fm_second, fc3],
+ size=2,
+ act="softmax",
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(scale=1 / math.sqrt(fc3.shape[1]))))
+
+ cost = fluid.layers.cross_entropy(input=predict, label=words[-1])
+ avg_cost = fluid.layers.reduce_sum(cost)
+ accuracy = fluid.layers.accuracy(input=predict, label=words[-1])
+ auc_var, batch_auc_var, auc_states = \
+ fluid.layers.auc(input=predict, label=words[-1], num_thresholds=2 ** 12, slide_steps=20)
+
+ return avg_cost, auc_var, batch_auc_var, py_reader
+
+
+def ctr_dnn_model(embedding_size, sparse_feature_dim, use_py_reader=True):
+
+ def embedding_layer(input):
+ return fluid.layers.embedding(
+ input=input,
+ is_sparse=True,
+ # you need to patch https://github.com/PaddlePaddle/Paddle/pull/14190
+ # if you want to set is_distributed to True
+ is_distributed=False,
+ size=[sparse_feature_dim, embedding_size],
+ param_attr=fluid.ParamAttr(name="SparseFeatFactors",
+ initializer=fluid.initializer.Uniform()))
+
+ dense_input = fluid.layers.data(
+ name="dense_input", shape=[dense_feature_dim], dtype='float32')
+
+ sparse_input_ids = [
+ fluid.layers.data(name="C" + str(i), shape=[1], lod_level=1, dtype='int64')
+ for i in range(1, 27)]
+
+ label = fluid.layers.data(name='label', shape=[1], dtype='int64')
+
+ words = [dense_input] + sparse_input_ids + [label]
+
+ py_reader = None
+ if use_py_reader:
+ py_reader = fluid.layers.create_py_reader_by_data(capacity=64,
+ feed_list=words,
+ name='py_reader',
+ use_double_buffer=True)
+ words = fluid.layers.read_file(py_reader)
+
+ sparse_embed_seq = list(map(embedding_layer, words[1:-1]))
+ concated = fluid.layers.concat(sparse_embed_seq + words[0:1], axis=1)
+
+ fc1 = fluid.layers.fc(input=concated, size=400, act='relu',
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(concated.shape[1]))))
+ fc2 = fluid.layers.fc(input=fc1, size=400, act='relu',
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(fc1.shape[1]))))
+ fc3 = fluid.layers.fc(input=fc2, size=400, act='relu',
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(fc2.shape[1]))))
+ predict = fluid.layers.fc(input=fc3, size=2, act='softmax',
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(fc3.shape[1]))))
+
+ cost = fluid.layers.cross_entropy(input=predict, label=words[-1])
+ avg_cost = fluid.layers.reduce_sum(cost)
+ accuracy = fluid.layers.accuracy(input=predict, label=words[-1])
+ auc_var, batch_auc_var, auc_states = \
+ fluid.layers.auc(input=predict, label=words[-1], num_thresholds=2 ** 12, slide_steps=20)
+
+ return avg_cost, auc_var, batch_auc_var, py_reader, words
diff --git a/PaddleRec/ctr/preprocess.py b/PaddleRec/ctr/preprocess.py
new file mode 100755
index 0000000000000000000000000000000000000000..e6fc456c3547947ae425ff42161ff075dbfae65f
--- /dev/null
+++ b/PaddleRec/ctr/preprocess.py
@@ -0,0 +1,164 @@
+"""
+Preprocess Criteo dataset. This dataset was used for the Display Advertising
+Challenge (https://www.kaggle.com/c/criteo-display-ad-challenge).
+"""
+import os
+import sys
+import click
+import random
+import collections
+
+# There are 13 integer features and 26 categorical features
+continous_features = range(1, 14)
+categorial_features = range(14, 40)
+
+# Clip integer features. The clip point for each integer feature
+# is derived from the 95% quantile of the total values in each feature
+continous_clip = [20, 600, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
+
+
+class CategoryDictGenerator:
+ """
+ Generate dictionary for each of the categorical features
+ """
+
+ def __init__(self, num_feature):
+ self.dicts = []
+ self.num_feature = num_feature
+ for i in range(0, num_feature):
+ self.dicts.append(collections.defaultdict(int))
+
+ def build(self, datafile, categorial_features, cutoff=0):
+ with open(datafile, 'r') as f:
+ for line in f:
+ features = line.rstrip('\n').split('\t')
+ for i in range(0, self.num_feature):
+ if features[categorial_features[i]] != '':
+ self.dicts[i][features[categorial_features[i]]] += 1
+ for i in range(0, self.num_feature):
+ self.dicts[i] = filter(lambda x: x[1] >= cutoff,
+ self.dicts[i].items())
+ self.dicts[i] = sorted(self.dicts[i], key=lambda x: (-x[1], x[0]))
+ vocabs, _ = list(zip(*self.dicts[i]))
+ self.dicts[i] = dict(zip(vocabs, range(1, len(vocabs) + 1)))
+ self.dicts[i][''] = 0
+
+ def gen(self, idx, key):
+ if key not in self.dicts[idx]:
+ res = self.dicts[idx]['']
+ else:
+ res = self.dicts[idx][key]
+ return res
+
+ def dicts_sizes(self):
+ return list(map(len, self.dicts))
+
+
+class ContinuousFeatureGenerator:
+ """
+ Normalize the integer features to [0, 1] by min-max normalization
+ """
+
+ def __init__(self, num_feature):
+ self.num_feature = num_feature
+ self.min = [sys.maxsize] * num_feature
+ self.max = [-sys.maxsize] * num_feature
+
+ def build(self, datafile, continous_features):
+ with open(datafile, 'r') as f:
+ for line in f:
+ features = line.rstrip('\n').split('\t')
+ for i in range(0, self.num_feature):
+ val = features[continous_features[i]]
+ if val != '':
+ val = int(val)
+ if val > continous_clip[i]:
+ val = continous_clip[i]
+ self.min[i] = min(self.min[i], val)
+ self.max[i] = max(self.max[i], val)
+
+ def gen(self, idx, val):
+ if val == '':
+ return 0.0
+ val = float(val)
+ return (val - self.min[idx]) / (self.max[idx] - self.min[idx])
+
+
+@click.command("preprocess")
+@click.option("--datadir", type=str, help="Path to raw criteo dataset")
+@click.option("--outdir", type=str, help="Path to save the processed data")
+def preprocess(datadir, outdir):
+ """
+ All 13 integer features are normalized to continuous values and these continuous
+ features are combined into one vector with dimension of 13.
+
+ Each of the 26 categorical features are one-hot encoded and all the one-hot
+ vectors are combined into one sparse binary vector.
+ """
+ dists = ContinuousFeatureGenerator(len(continous_features))
+ dists.build(os.path.join(datadir, 'train.txt'), continous_features)
+
+ dicts = CategoryDictGenerator(len(categorial_features))
+ dicts.build(
+ os.path.join(datadir, 'train.txt'), categorial_features, cutoff=200)
+
+ dict_sizes = dicts.dicts_sizes()
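+    # accumulate dict sizes so each categorical slot owns a disjoint
+    # id range in the global one-hot index space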
+ categorial_feature_offset = [0]
+ for i in range(1, len(categorial_features)):
+ offset = categorial_feature_offset[i - 1] + dict_sizes[i - 1]
+ categorial_feature_offset.append(offset)
+
+ random.seed(0)
+
+ # 90% of the data are used for training, and 10% of the data are used
+ # for validation.
+ with open(os.path.join(outdir, 'train.txt'), 'w') as out_train:
+ with open(os.path.join(outdir, 'valid.txt'), 'w') as out_valid:
+ with open(os.path.join(datadir, 'train.txt'), 'r') as f:
+ for line in f:
+ features = line.rstrip('\n').split('\t')
+
+ continous_vals = []
+ for i in range(0, len(continous_features)):
+ val = dists.gen(i, features[continous_features[i]])
+ continous_vals.append("{0:.6f}".format(val).rstrip('0')
+ .rstrip('.'))
+ categorial_vals = []
+ for i in range(0, len(categorial_features)):
+ val = dicts.gen(i, features[categorial_features[
+ i]]) + categorial_feature_offset[i]
+ categorial_vals.append(str(val))
+
+ continous_vals = ','.join(continous_vals)
+ categorial_vals = ','.join(categorial_vals)
+ label = features[0]
+ if random.randint(0, 9999) % 10 != 0:
+ out_train.write('\t'.join(
+ [continous_vals, categorial_vals, label]) + '\n')
+ else:
+ out_valid.write('\t'.join(
+ [continous_vals, categorial_vals, label]) + '\n')
+
+ with open(os.path.join(outdir, 'test.txt'), 'w') as out:
+ with open(os.path.join(datadir, 'test.txt'), 'r') as f:
+ for line in f:
+ features = line.rstrip('\n').split('\t')
+
+ continous_vals = []
+ for i in range(0, len(continous_features)):
+ val = dists.gen(i, features[continous_features[i] - 1])
+ continous_vals.append("{0:.6f}".format(val).rstrip('0')
+ .rstrip('.'))
+ categorial_vals = []
+ for i in range(0, len(categorial_features)):
+ val = dicts.gen(i, features[categorial_features[
+ i] - 1]) + categorial_feature_offset[i]
+ categorial_vals.append(str(val))
+
+ continous_vals = ','.join(continous_vals)
+ categorial_vals = ','.join(categorial_vals)
+ out.write('\t'.join([continous_vals, categorial_vals]) + '\n')
+
+
+if __name__ == "__main__":
+ preprocess()
diff --git a/PaddleRec/ctr/reader.py b/PaddleRec/ctr/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..e6bcc11d4f465d48f99ef2aa8059a952a3f8dfd4
--- /dev/null
+++ b/PaddleRec/ctr/reader.py
@@ -0,0 +1,52 @@
+class Dataset:
+ def __init__(self):
+ pass
+
+class CriteoDataset(Dataset):
+ def __init__(self, sparse_feature_dim):
+ self.cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
+ self.cont_max_ = [20, 600, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
+ self.cont_diff_ = [20, 603, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
+ self.hash_dim_ = sparse_feature_dim
+ # here, training data are lines with line_index < train_idx_
+ self.train_idx_ = 41256555
+ self.continuous_range_ = range(1, 14)
+ self.categorical_range_ = range(14, 40)
+
+ def _reader_creator(self, file_list, is_train, trainer_num, trainer_id):
+ def reader():
+ for file in file_list:
+ with open(file, 'r') as f:
+ line_idx = 0
+ for line in f:
+ line_idx += 1
+ if is_train and line_idx > self.train_idx_:
+ break
+ elif not is_train and line_idx <= self.train_idx_:
+ continue
+ if line_idx % trainer_num != trainer_id:
+ continue
+ features = line.rstrip('\n').split('\t')
+ dense_feature = []
+ sparse_feature = []
+ for idx in self.continuous_range_:
+ if features[idx] == '':
+ dense_feature.append(0.0)
+ else:
+ dense_feature.append((float(features[idx]) - self.cont_min_[idx - 1]) / self.cont_diff_[idx - 1])
+ for idx in self.categorical_range_:
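+                        # hashing trick: map "<slot id><raw value>" into one of hash_dim_ buckets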
+ sparse_feature.append([hash(str(idx) + features[idx]) % self.hash_dim_])
+
+ label = [int(features[0])]
+ yield [dense_feature] + sparse_feature + [label]
+
+ return reader
+
+ def train(self, file_list, trainer_num, trainer_id):
+ return self._reader_creator(file_list, True, trainer_num, trainer_id)
+
+ def test(self, file_list):
+ return self._reader_creator(file_list, False, 1, 0)
+
+ def infer(self, file_list):
+ return self._reader_creator(file_list, False, 1, 0)
diff --git a/fluid/PaddleRec/ctr/requirements.txt b/PaddleRec/ctr/requirements.txt
similarity index 100%
rename from fluid/PaddleRec/ctr/requirements.txt
rename to PaddleRec/ctr/requirements.txt
diff --git a/PaddleRec/ctr/train.py b/PaddleRec/ctr/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..93ce357b45e9bab4a0d46d43cfaf7573ddb31adc
--- /dev/null
+++ b/PaddleRec/ctr/train.py
@@ -0,0 +1,262 @@
+from __future__ import print_function
+
+import argparse
+import logging
+import os
+import time
+
+import numpy as np
+
+import paddle
+import paddle.fluid as fluid
+
+import reader
+from network_conf import ctr_dnn_model
+from multiprocessing import cpu_count
+
+
+# disable gpu training for this example
+os.environ["CUDA_VISIBLE_DEVICES"] = ""
+
+logging.basicConfig(
+ format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser(description="PaddlePaddle CTR example")
+ parser.add_argument(
+ '--train_data_path',
+ type=str,
+ default='./data/raw/train.txt',
+ help="The path of training dataset")
+ parser.add_argument(
+ '--test_data_path',
+ type=str,
+ default='./data/raw/valid.txt',
+ help="The path of testing dataset")
+ parser.add_argument(
+ '--batch_size',
+ type=int,
+ default=1000,
+ help="The size of mini-batch (default:1000)")
+ parser.add_argument(
+ '--embedding_size',
+ type=int,
+ default=10,
+ help="The size for embedding layer (default:10)")
+ parser.add_argument(
+ '--num_passes',
+ type=int,
+ default=10,
+ help="The number of passes to train (default: 10)")
+ parser.add_argument(
+ '--model_output_dir',
+ type=str,
+ default='models',
+ help='The path for model to store (default: models)')
+ parser.add_argument(
+ '--sparse_feature_dim',
+ type=int,
+ default=1000001,
+ help='sparse feature hashing space for index processing')
+ parser.add_argument(
+ '--is_local',
+ type=int,
+ default=1,
+ help='Local train or distributed train (default: 1)')
+ parser.add_argument(
+ '--cloud_train',
+ type=int,
+ default=0,
+ help='Local train or distributed train on paddlecloud (default: 0)')
+ parser.add_argument(
+ '--async_mode',
+ action='store_true',
+ default=False,
+ help='Whether start pserver in async mode to support ASGD')
+ parser.add_argument(
+ '--no_split_var',
+ action='store_true',
+ default=False,
+ help='Whether split variables into blocks when update_method is pserver')
+    # the following arguments are used for distributed training;
+    # if is_local == 0, you should set them
+    parser.add_argument(
+        '--role',
+        type=str,
+        default='pserver',  # trainer or pserver
+        help='The role of the current node: trainer or pserver (default: pserver)')
+    parser.add_argument(
+        '--endpoints',
+        type=str,
+        default='127.0.0.1:6000',
+        help='The pserver endpoints, like: 127.0.0.1:6000,127.0.0.1:6001')
+    parser.add_argument(
+        '--current_endpoint',
+        type=str,
+        default='127.0.0.1:6000',
+        help='The endpoint of the current pserver (default: 127.0.0.1:6000)')
+    parser.add_argument(
+        '--trainer_id',
+        type=int,
+        default=0,
+        help='The id of the current trainer (default: 0)')
+    parser.add_argument(
+        '--trainers',
+        type=int,
+        default=1,
+        help='The number of trainers (default: 1)')
+ parser.add_argument(
+ '--enable_ce',
+ action='store_true',
+ help='If set, run the task with continuous evaluation logs.')
+
+ return parser.parse_args()
+
+
+def train_loop(args, train_program, py_reader, loss, auc_var, batch_auc_var,
+ trainer_num, trainer_id):
+
+ if args.enable_ce:
+ SEED = 102
+ train_program.random_seed = SEED
+ fluid.default_startup_program().random_seed = SEED
+
+ dataset = reader.CriteoDataset(args.sparse_feature_dim)
+ train_reader = paddle.batch(
+ paddle.reader.shuffle(
+ dataset.train([args.train_data_path], trainer_num, trainer_id),
+ buf_size=args.batch_size * 100),
+ batch_size=args.batch_size)
+
+ py_reader.decorate_paddle_reader(train_reader)
+ data_name_list = []
+
+ place = fluid.CPUPlace()
+ exe = fluid.Executor(place)
+
+ exec_strategy = fluid.ExecutionStrategy()
+ build_strategy = fluid.BuildStrategy()
+
+ if os.getenv("NUM_THREADS", ""):
+ exec_strategy.num_threads = int(os.getenv("NUM_THREADS"))
+
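+    # with several CPU places, Reduce shards the parameter updates across
+    # places; with a single place, AllReduce is the cheaper default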
+ cpu_num = int(os.environ.get('CPU_NUM', cpu_count()))
+ build_strategy.reduce_strategy = \
+ fluid.BuildStrategy.ReduceStrategy.Reduce if cpu_num > 1 \
+ else fluid.BuildStrategy.ReduceStrategy.AllReduce
+
+ pe = fluid.ParallelExecutor(
+ use_cuda=False,
+ loss_name=loss.name,
+ main_program=train_program,
+ build_strategy=build_strategy,
+ exec_strategy=exec_strategy)
+
+ exe.run(fluid.default_startup_program())
+
+ total_time = 0
+ for pass_id in range(args.num_passes):
+ pass_start = time.time()
+ batch_id = 0
+ py_reader.start()
+
+ try:
+ while True:
+ loss_val, auc_val, batch_auc_val = pe.run(fetch_list=[loss.name, auc_var.name, batch_auc_var.name])
+ loss_val = np.mean(loss_val)
+ auc_val = np.mean(auc_val)
+ batch_auc_val = np.mean(batch_auc_val)
+
+ logger.info("TRAIN --> pass: {} batch: {} loss: {} auc: {}, batch_auc: {}"
+ .format(pass_id, batch_id, loss_val/args.batch_size, auc_val, batch_auc_val))
+ if batch_id % 1000 == 0 and batch_id != 0:
+ model_dir = args.model_output_dir + '/batch-' + str(batch_id)
+ if args.trainer_id == 0:
+ fluid.io.save_persistables(executor=exe, dirname=model_dir,
+ main_program=fluid.default_main_program())
+ batch_id += 1
+ except fluid.core.EOFException:
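+            # one pass of data is exhausted; reset the reader before the next pass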
+ py_reader.reset()
+ print("pass_id: %d, pass_time_cost: %f" % (pass_id, time.time() - pass_start))
+
+ total_time += time.time() - pass_start
+
+ model_dir = args.model_output_dir + '/pass-' + str(pass_id)
+ if args.trainer_id == 0:
+ fluid.io.save_persistables(executor=exe, dirname=model_dir,
+ main_program=fluid.default_main_program())
+
+ # only for ce
+ if args.enable_ce:
+ threads_num, cpu_num = get_cards(args)
+ epoch_idx = args.num_passes
+ print("kpis\teach_pass_duration_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, total_time / epoch_idx))
+ print("kpis\ttrain_loss_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, loss_val/args.batch_size))
+ print("kpis\ttrain_auc_val_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, auc_val))
+ print("kpis\ttrain_batch_auc_val_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, batch_auc_val))
+
+
+def train():
+ args = parse_args()
+
+ if not os.path.isdir(args.model_output_dir):
+ os.mkdir(args.model_output_dir)
+
+ loss, auc_var, batch_auc_var, py_reader, _ = ctr_dnn_model(args.embedding_size, args.sparse_feature_dim)
+ optimizer = fluid.optimizer.Adam(learning_rate=1e-4)
+ optimizer.minimize(loss)
+ if args.cloud_train:
+ # the port of all pservers, needed by both trainer and pserver
+ port = os.getenv("PADDLE_PORT", "6174")
+ # comma separated ips of all pservers, needed by trainer and
+ pserver_ips = os.getenv("PADDLE_PSERVERS", "")
+ eplist = []
+ for ip in pserver_ips.split(","):
+ eplist.append(':'.join([ip, port]))
+ args.endpoints = ",".join(eplist)
+ args.trainers = int(os.getenv("PADDLE_TRAINERS_NUM", "1"))
+ args.current_endpoint = os.getenv("POD_IP", "localhost") + ":" + port
+ args.role = os.getenv("TRAINING_ROLE", "TRAINER")
+ args.trainer_id = int(os.getenv("PADDLE_TRAINER_ID", "0"))
+ args.is_local = bool(int(os.getenv("PADDLE_IS_LOCAL", 0)))
+
+ if args.is_local:
+ logger.info("run local training")
+ main_program = fluid.default_main_program()
+ train_loop(args, main_program, py_reader, loss, auc_var, batch_auc_var, 1, 0)
+ else:
+ logger.info("run dist training")
+ t = fluid.DistributeTranspiler()
+ t.transpile(args.trainer_id, pservers=args.endpoints, trainers=args.trainers)
+ if args.role == "pserver" or args.role == "PSERVER":
+ logger.info("run pserver")
+ prog = t.get_pserver_program(args.current_endpoint)
+ startup = t.get_startup_program(args.current_endpoint, pserver_program=prog)
+ exe = fluid.Executor(fluid.CPUPlace())
+ exe.run(startup)
+ exe.run(prog)
+ elif args.role == "trainer" or args.role == "TRAINER":
+ logger.info("run trainer")
+ train_prog = t.get_trainer_program()
+ train_loop(args, train_prog, py_reader, loss, auc_var, batch_auc_var,
+ args.trainers, args.trainer_id)
+ else:
+ raise ValueError(
+ 'PADDLE_TRAINING_ROLE environment variable must be either TRAINER or PSERVER'
+ )
+
+
+def get_cards(args):
+ threads_num = os.environ.get('NUM_THREADS', 1)
+ cpu_num = os.environ.get('CPU_NUM', 1)
+ return int(threads_num), int(cpu_num)
+
+
+if __name__ == '__main__':
+ train()
diff --git a/PaddleRec/din/.run_ce.sh b/PaddleRec/din/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..c6dadb237ba11708849bf8414d626483b5ec4945
--- /dev/null
+++ b/PaddleRec/din/.run_ce.sh
@@ -0,0 +1,18 @@
+#!/bin/bash
+
+export MKL_NUM_THREADS=1
+export OMP_NUM_THREADS=1
+
+
+cudaid=${face_detection:=0} # use 0-th card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python -u train.py --config_path 'data/config.txt' --train_dir 'data/paddle_train.txt' --batch_size 32 --epoch_num 1 --use_cuda 1 --enable_ce --batch_num 10000 | python _ce.py
+
+
+cudaid=${face_detection_4:=0,1,2,3} # use 0,1,2,3 card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python -u train.py --config_path 'data/config.txt' --train_dir 'data/paddle_train.txt' --batch_size 32 --epoch_num 1 --use_cuda 1 --enable_ce --batch_num 10000 | python _ce.py
+
+
diff --git a/PaddleRec/din/README.md b/PaddleRec/din/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..3538ba760ff9b80807a6a56aed4b75400c97ae03
--- /dev/null
+++ b/PaddleRec/din/README.md
@@ -0,0 +1,137 @@
+# DIN
+
+The following is a brief directory layout and description of this example:
+
+```text
+.
+├── README.md            # documentation
+├── train.py             # training script
+├── infer.py             # inference script
+├── network.py           # network definition
+├── cluster_train.py     # distributed training
+├── cluster_train.sh     # distributed training launcher
+├── reader.py            # data-reading helpers
+├── data/
+    ├── build_dataset.py # converts text data into paddle format
+    ├── convert_pd.py    # converts the raw data into pandas DataFrames
+    ├── data_process.sh  # data preprocessing script
+    ├── remap_id.py      # remaps category ids
+
+```
+
+## Introduction
+
+The DIN model is described in the paper [Deep Interest Network for Click-Through Rate Prediction](https://arxiv.org/abs/1706.06978).
+
+DIN uses an interest activation module (Activation Unit) that activates the user's historically clicked items with information about the candidate ads being scored, thereby extracting the part of the user's interest that is relevant to the current prediction target.
+
+Highly weighted historical behaviors indicate interest related to the current ad, while lowly weighted ones are "interest noise" unrelated to it. We multiply each activated item by its activation weight and sum the results to obtain the interest-state representation relevant to the target ads.
+
+Finally, we concatenate this relevant user-interest representation with the static user features, the context features, and the ad features, feed them into a multi-layer DNN, and predict the probability that the user clicks the target ads.
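+
+A toy NumPy sketch of this weighted sum pooling follows; it is a deliberate simplification (DIN's actual activation unit is a small MLP over the behavior and candidate embeddings, not the plain dot product used here), and all names and shapes are illustrative:
+
+```python
+import numpy as np
+
+def activated_interest(hist, target):
+    # hist:   (T, D) embeddings of the user's clicked items
+    # target: (D,)   embedding of the candidate ad
+    weights = hist @ target                       # per-behavior activation weight
+    return (weights[:, None] * hist).sum(axis=0)  # weighted sum pooling (no softmax)
+
+hist = np.random.rand(5, 8).astype("float32")
+target = np.random.rand(8).astype("float32")
+print(activated_interest(hist, target))
+```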
+
+
+## Downloading and preprocessing the data
+
+* Step 1: Run the following command to download the [Amazon Product dataset](http://jmcauley.ucsd.edu/data/amazon/) and preprocess it.
+```
+cd data && sh data_process.sh && cd ..
+```
+If a package (for example pandas) is reported missing during this step, install it with the corresponding pip command:
+```
+pip install pandas
+```
+
+* Step 2: Generate the training set, the test set, and the config file
+```
+python build_dataset.py
+```
+After it finishes, three files are produced under the data folder: config.txt, paddle_test.txt, and paddle_train.txt.
+
+An example of the data format:
+```
+3737 19450;288 196;18486;674;1
+3647 4342 6855 3805;281 463 558 674;4206;463;1
+1805 4309;87 87;21354;556;1
+18209 20753;649 241;51924;610;0
+13150;351;41455;792;1
+35120 40418;157 714;52035;724;0
+```
+
+Each line is one sample consisting of five fields separated by semicolons. The first two fields are the sequence of historically interacted items and their corresponding categories; the third and fourth fields are the item to be predicted and its category; the last field is the label, indicating whether the item was clicked.
+
+
+## Training
+
+The detailed parameter options can be listed by running
+```
+python train.py -h
+```
+
+Single-machine, single-GPU training
+``` bash
+CUDA_VISIBLE_DEVICES=1 python -u train.py --config_path 'data/config.txt' --train_dir 'data/paddle_train.txt' --batch_size 32 --epoch_num 100 --use_cuda 1 > log.txt 2>&1 &
+```
+
+Single-machine CPU training
+``` bash
+python -u train.py --config_path 'data/config.txt' --train_dir 'data/paddle_train.txt' --batch_size 32 --epoch_num 100 --use_cuda 0 > log.txt 2>&1 &
+```
+
+Note that the single-card training above can be accelerated with the Parallel Executor by adding the `--parallel 1` flag.
+
+Single-machine, multi-GPU training
+``` bash
+CUDA_VISIBLE_DEVICES=0,1 python -u train.py --config_path 'data/config.txt' --train_dir 'data/paddle_train.txt' --batch_size 32 --epoch_num 100 --use_cuda 1 --parallel 1 --num_devices 2 > log.txt 2>&1 &
+```
+
+Single-machine, multi-core CPU training
+``` bash
+CPU_NUM=10 python -u train.py --config_path 'data/config.txt' --train_dir 'data/paddle_train.txt' --batch_size 32 --epoch_num 100 --use_cuda 0 --parallel 1 --num_devices 10 > log.txt 2>&1 &
+```
+
+
+## Sample training results
+
+A training log from a single Tesla K40m GPU looks like the following (actual output may vary):
+```text
+2019-02-22 09:31:51,578 - INFO - reading data begins
+2019-02-22 09:32:22,407 - INFO - reading data completes
+W0222 09:32:24.151955 7221 device_context.cc:263] Please NOTE: device: 0, CUDA Capability: 35, Driver API Version: 9.0, Runtime API Version: 8.0
+W0222 09:32:24.152046 7221 device_context.cc:271] device: 0, cuDNN Version: 7.0.
+2019-02-22 09:32:27,797 - INFO - train begins
+epoch: 1 global_step: 1000 train_loss: 0.6950 time: 14.64
+epoch: 1 global_step: 2000 train_loss: 0.6854 time: 15.41
+epoch: 1 global_step: 3000 train_loss: 0.6799 time: 14.84
+...
+model saved in din_amazon/global_step_50000
+...
+```
+
+Tips:
+
+* On a single machine, with the default hyperparameters in the code, the global step that yields the best AUC is roughly between 440000 and 500000.
+
+* Training beyond a certain number of epochs leads to slight overfitting.
+
+## Inference
+Start inference with a command like the one below,
+
+where model_path is the path of the saved model and test_path is the path of the test data.
+
+```
+CUDA_VISIBLE_DEVICES=3 python infer.py --model_path 'din_amazon/global_step_400000' --test_path 'data/paddle_test.txt' --use_cuda 1
+```
+
+## Sample inference results
+```text
+2019-02-22 11:22:58,804 - INFO - TEST --> loss: [0.47005194] auc:0.863794952818
+```
+
+
+## Distributed training
+See cluster_train.py for configuring a multi-machine environment.
+
+Run the following command to simulate the multi-machine setup locally:
+```
+sh cluster_train.sh
+```
diff --git a/PaddleRec/din/__init__.py b/PaddleRec/din/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/PaddleRec/din/_ce.py b/PaddleRec/din/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..9d5850fd22c3d023eb866fa474b6f6f586ca326e
--- /dev/null
+++ b/PaddleRec/din/_ce.py
@@ -0,0 +1,61 @@
+# this file is only used for continuous evaluation test!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi
+from kpi import DurationKpi
+
+
+each_pass_duration_card1_kpi = DurationKpi('each_pass_duration_card1', 0.08, 0, actived=True)
+train_loss_card1_kpi = CostKpi('train_loss_card1', 0.08, 0)
+each_pass_duration_card4_kpi = DurationKpi('each_pass_duration_card4', 0.08, 0, actived=True)
+train_loss_card4_kpi = CostKpi('train_loss_card4', 0.08, 0)
+
+tracking_kpis = [
+ each_pass_duration_card1_kpi,
+ train_loss_card1_kpi,
+ each_pass_duration_card4_kpi,
+ train_loss_card4_kpi,
+ ]
+
+
+def parse_log(log):
+ '''
+ This method should be implemented by model developers.
+
+ The suggestion:
+
+ each line in the log should be key, value, for example:
+
+ "
+ train_cost\t1.0
+ test_cost\t1.0
+ train_cost\t1.0
+ train_cost\t1.0
+ train_acc\t1.2
+ "
+ '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ log_to_ce(log)
diff --git a/PaddleRec/din/cluster_train.py b/PaddleRec/din/cluster_train.py
new file mode 100644
index 0000000000000000000000000000000000000000..6b3272366fa674c2bfaa6454beb2c93de1545a4f
--- /dev/null
+++ b/PaddleRec/din/cluster_train.py
@@ -0,0 +1,172 @@
+import sys
+import logging
+import time
+import numpy as np
+import argparse
+import paddle.fluid as fluid
+import paddle
+import time
+import network
+import reader
+import random
+
+logging.basicConfig(format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("din")
+ parser.add_argument(
+ '--config_path',
+ type=str,
+ default='data/config.txt',
+ help='dir of config')
+ parser.add_argument(
+ '--train_dir',
+ type=str,
+ default='data/paddle_train.txt',
+ help='dir of train file')
+ parser.add_argument(
+ '--model_dir',
+ type=str,
+ default='din_amazon/',
+ help='dir of saved model')
+ parser.add_argument(
+ '--batch_size', type=int, default=16, help='number of batch size')
+ parser.add_argument(
+ '--epoch_num', type=int, default=200, help='number of epoch')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether to use gpu')
+ parser.add_argument(
+ '--parallel',
+ type=int,
+ default=0,
+ help='whether to use parallel executor')
+ parser.add_argument(
+ '--base_lr', type=float, default=0.85, help='based learning rate')
+ parser.add_argument(
+ '--role', type=str, default='pserver', help='trainer or pserver')
+ parser.add_argument(
+ '--endpoints',
+ type=str,
+ default='127.0.0.1:6000',
+ help='The pserver endpoints, like: 127.0.0.1:6000, 127.0.0.1:6001')
+ parser.add_argument(
+ '--current_endpoint',
+ type=str,
+ default='127.0.0.1:6000',
+ help='The current_endpoint')
+ parser.add_argument(
+ '--trainer_id',
+ type=int,
+ default=0,
+        help='trainer id; only trainer_id=0 saves the model')
+ parser.add_argument(
+ '--trainers',
+ type=int,
+ default=1,
+        help='The number of trainers (default: 1)')
+ args = parser.parse_args()
+ return args
+
+
+def train():
+ args = parse_args()
+
+ config_path = args.config_path
+ train_path = args.train_dir
+ epoch_num = args.epoch_num
+ use_cuda = True if args.use_cuda else False
+ use_parallel = True if args.parallel else False
+
+ logger.info("reading data begins")
+ user_count, item_count, cat_count = reader.config_read(config_path)
+ #data_reader, max_len = reader.prepare_reader(train_path, args.batch_size)
+ logger.info("reading data completes")
+
+ avg_cost, pred = network.network(item_count, cat_count, 433)
+ #fluid.clip.set_gradient_clip(clip=fluid.clip.GradientClipByGlobalNorm(clip_norm=5.0))
+ base_lr = args.base_lr
+ boundaries = [410000]
+ values = [base_lr, 0.2]
+ sgd_optimizer = fluid.optimizer.SGD(
+ learning_rate=fluid.layers.piecewise_decay(
+ boundaries=boundaries, values=values))
+ sgd_optimizer.minimize(avg_cost)
+
+ def train_loop(main_program):
+ data_reader, max_len = reader.prepare_reader(train_path,
+ args.batch_size)
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+ feeder = fluid.DataFeeder(
+ feed_list=[
+ "hist_item_seq", "hist_cat_seq", "target_item", "target_cat",
+ "label", "mask", "target_item_seq", "target_cat_seq"
+ ],
+ place=place)
+ if use_parallel:
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=use_cuda,
+ loss_name=avg_cost.name,
+ main_program=main_program)
+ else:
+ train_exe = exe
+ logger.info("train begins")
+ global_step = 0
+ PRINT_STEP = 1000
+
+ start_time = time.time()
+ loss_sum = 0.0
+ for id in range(epoch_num):
+ epoch = id + 1
+ for data in data_reader():
+ global_step += 1
+ results = train_exe.run(main_program,
+ feed=feeder.feed(data),
+ fetch_list=[avg_cost.name, pred.name],
+ return_numpy=True)
+ loss_sum += results[0].mean()
+
+ if global_step % PRINT_STEP == 0:
+ logger.info(
+ "epoch: %d\tglobal_step: %d\ttrain_loss: %.4f\t\ttime: %.2f"
+ % (epoch, global_step, loss_sum / PRINT_STEP,
+ time.time() - start_time))
+ start_time = time.time()
+ loss_sum = 0.0
+
+ if (global_step > 400000 and
+ global_step % PRINT_STEP == 0) or (
+ global_step < 400000 and
+ global_step % 50000 == 0):
+ save_dir = args.model_dir + "/global_step_" + str(
+ global_step)
+ feed_var_name = [
+ "hist_item_seq", "hist_cat_seq", "target_item",
+ "target_cat", "label", "mask", "target_item_seq",
+ "target_cat_seq"
+ ]
+ fetch_vars = [avg_cost, pred]
+ fluid.io.save_inference_model(save_dir, feed_var_name,
+ fetch_vars, exe)
+ train_exe.close()
+
+ t = fluid.DistributeTranspiler()
+ t.transpile(
+ args.trainer_id, pservers=args.endpoints, trainers=args.trainers)
+ if args.role == "pserver":
+ logger.info("run psever")
+ prog, startup = t.get_pserver_programs(args.current_endpoint)
+ exe = fluid.Executor(fluid.CPUPlace())
+ exe.run(startup)
+ exe.run(prog)
+ elif args.role == "trainer":
+ logger.info("run trainer")
+ train_loop(t.get_trainer_program())
+
+
+if __name__ == "__main__":
+ train()
diff --git a/PaddleRec/din/cluster_train.sh b/PaddleRec/din/cluster_train.sh
new file mode 100644
index 0000000000000000000000000000000000000000..76115c825423f5de4a1114be863cc7ec40bad0b4
--- /dev/null
+++ b/PaddleRec/din/cluster_train.sh
@@ -0,0 +1,56 @@
+#!/bin/bash
+
+#export GLOG_v=30
+#export GLOG_logtostderr=1
+
+python -u cluster_train.py \
+--config_path 'data/config.txt' \
+--train_dir 'data/paddle_train.txt' \
+--batch_size 32 \
+--epoch_num 100 \
+--use_cuda 0 \
+--parallel 0 \
+--role pserver \
+--endpoints 127.0.0.1:6000,127.0.0.1:6001 \
+--current_endpoint 127.0.0.1:6000 \
+--trainers 2 \
+> pserver0.log 2>&1 &
+
+python -u cluster_train.py \
+--config_path 'data/config.txt' \
+--train_dir 'data/paddle_train.txt' \
+--batch_size 32 \
+--epoch_num 100 \
+--use_cuda 0 \
+--parallel 0 \
+--role pserver \
+--endpoints 127.0.0.1:6000,127.0.0.1:6001 \
+--current_endpoint 127.0.0.1:6001 \
+--trainers 2 \
+> pserver1.log 2>&1 &
+
+python -u cluster_train.py \
+--config_path 'data/config.txt' \
+--train_dir 'data/paddle_train.txt' \
+--batch_size 32 \
+--epoch_num 100 \
+--use_cuda 0 \
+--parallel 0 \
+--role trainer \
+--endpoints 127.0.0.1:6000,127.0.0.1:6001 \
+--trainers 2 \
+--trainer_id 0 \
+> trainer0.log 2>&1 &
+
+python -u cluster_train.py \
+--config_path 'data/config.txt' \
+--train_dir 'data/paddle_train.txt' \
+--batch_size 32 \
+--epoch_num 100 \
+--use_cuda 0 \
+--parallel 0 \
+--role trainer \
+--endpoints 127.0.0.1:6000,127.0.0.1:6001 \
+--trainers 2 \
+--trainer_id 1 \
+> trainer1.log 2>&1 &
diff --git a/PaddleRec/din/data/build_dataset.py b/PaddleRec/din/data/build_dataset.py
new file mode 100644
index 0000000000000000000000000000000000000000..34c053ccdb2686c10875740f72f1e0abf3cb4f10
--- /dev/null
+++ b/PaddleRec/din/data/build_dataset.py
@@ -0,0 +1,87 @@
+from __future__ import print_function
+import random
+import pickle
+
+random.seed(1234)
+
+print("read and process data")
+
+with open('./raw_data/remap.pkl', 'rb') as f:
+ reviews_df = pickle.load(f)
+ cate_list = pickle.load(f)
+ user_count, item_count, cate_count, example_count = pickle.load(f)
+
+train_set = []
+test_set = []
+for reviewerID, hist in reviews_df.groupby('reviewerID'):
+ pos_list = hist['asin'].tolist()
+
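+    # negative sampling: redraw until the sampled item is not one the user clicked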
+ def gen_neg():
+ neg = pos_list[0]
+ while neg in pos_list:
+ neg = random.randint(0, item_count - 1)
+ return neg
+
+ neg_list = [gen_neg() for i in range(len(pos_list))]
+
+ for i in range(1, len(pos_list)):
+ hist = pos_list[:i]
+ if i != len(pos_list) - 1:
+ train_set.append((reviewerID, hist, pos_list[i], 1))
+ train_set.append((reviewerID, hist, neg_list[i], 0))
+ else:
+ label = (pos_list[i], neg_list[i])
+ test_set.append((reviewerID, hist, label))
+
+random.shuffle(train_set)
+random.shuffle(test_set)
+
+assert len(test_set) == user_count
+
+
+def print_to_file(data, fout):
+ for i in range(len(data)):
+ fout.write(str(data[i]))
+ if i != len(data) - 1:
+ fout.write(' ')
+ else:
+ fout.write(';')
+
+
+print("make train data")
+with open("paddle_train.txt", "w") as fout:
+ for line in train_set:
+ history = line[1]
+ target = line[2]
+ label = line[3]
+ cate = [cate_list[x] for x in history]
+ print_to_file(history, fout)
+ print_to_file(cate, fout)
+ fout.write(str(target) + ";")
+ fout.write(str(cate_list[target]) + ";")
+ fout.write(str(label) + "\n")
+
+print("make test data")
+with open("paddle_test.txt", "w") as fout:
+ for line in test_set:
+ history = line[1]
+ target = line[2]
+ cate = [cate_list[x] for x in history]
+
+ print_to_file(history, fout)
+ print_to_file(cate, fout)
+ fout.write(str(target[0]) + ";")
+ fout.write(str(cate_list[target[0]]) + ";")
+ fout.write("1\n")
+
+ print_to_file(history, fout)
+ print_to_file(cate, fout)
+ fout.write(str(target[1]) + ";")
+ fout.write(str(cate_list[target[1]]) + ";")
+ fout.write("0\n")
+
+print("make config data")
+with open('config.txt', 'w') as f:
+ f.write(str(user_count) + "\n")
+ f.write(str(item_count) + "\n")
+ f.write(str(cate_count) + "\n")
diff --git a/PaddleRec/din/data/convert_pd.py b/PaddleRec/din/data/convert_pd.py
new file mode 100644
index 0000000000000000000000000000000000000000..d7927c7ef1a9da28732cad9c44be24e72095983a
--- /dev/null
+++ b/PaddleRec/din/data/convert_pd.py
@@ -0,0 +1,27 @@
+from __future__ import print_function
+import pickle
+import pandas as pd
+
+
+def to_df(file_path):
+ with open(file_path, 'r') as fin:
+ df = {}
+ i = 0
+ for line in fin:
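+            # each raw line is a Python dict literal (the Amazon metadata is
+            # not strict JSON), hence eval() instead of json.loads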
+ df[i] = eval(line)
+ i += 1
+ df = pd.DataFrame.from_dict(df, orient='index')
+ return df
+
+
+print("start to analyse reviews_Electronics_5.json")
+reviews_df = to_df('./raw_data/reviews_Electronics_5.json')
+with open('./raw_data/reviews.pkl', 'wb') as f:
+ pickle.dump(reviews_df, f, pickle.HIGHEST_PROTOCOL)
+
+print("start to analyse meta_Electronics.json")
+meta_df = to_df('./raw_data/meta_Electronics.json')
+meta_df = meta_df[meta_df['asin'].isin(reviews_df['asin'].unique())]
+meta_df = meta_df.reset_index(drop=True)
+with open('./raw_data/meta.pkl', 'wb') as f:
+ pickle.dump(meta_df, f, pickle.HIGHEST_PROTOCOL)
diff --git a/PaddleRec/din/data/data_process.sh b/PaddleRec/din/data/data_process.sh
new file mode 100644
index 0000000000000000000000000000000000000000..93b1c32d68ec40a9eb7b85ffaa896ea4a5e8052f
--- /dev/null
+++ b/PaddleRec/din/data/data_process.sh
@@ -0,0 +1,13 @@
+#!/bin/bash
+
+echo "begin download data"
+mkdir raw_data && cd raw_data
+wget -c http://snap.stanford.edu/data/amazon/productGraph/categoryFiles/reviews_Electronics_5.json.gz
+gzip -d reviews_Electronics_5.json.gz
+wget -c http://snap.stanford.edu/data/amazon/productGraph/categoryFiles/meta_Electronics.json.gz
+gzip -d meta_Electronics.json.gz
+echo "download data successful"
+
+cd ..
+python convert_pd.py
+python remap_id.py
diff --git a/PaddleRec/din/data/remap_id.py b/PaddleRec/din/data/remap_id.py
new file mode 100644
index 0000000000000000000000000000000000000000..b110dac54de8f8d201ede7248d6a2844ac350c90
--- /dev/null
+++ b/PaddleRec/din/data/remap_id.py
@@ -0,0 +1,48 @@
+from __future__ import print_function
+import random
+import pickle
+import numpy as np
+
+random.seed(1234)
+
+with open('./raw_data/reviews.pkl', 'rb') as f:
+ reviews_df = pickle.load(f)
+ reviews_df = reviews_df[['reviewerID', 'asin', 'unixReviewTime']]
+with open('./raw_data/meta.pkl', 'rb') as f:
+ meta_df = pickle.load(f)
+ meta_df = meta_df[['asin', 'categories']]
+ meta_df['categories'] = meta_df['categories'].map(lambda x: x[-1][-1])
+
+
+def build_map(df, col_name):
+ key = sorted(df[col_name].unique().tolist())
+ m = dict(zip(key, range(len(key))))
+ df[col_name] = df[col_name].map(lambda x: m[x])
+ return m, key
+
+
+asin_map, asin_key = build_map(meta_df, 'asin')
+cate_map, cate_key = build_map(meta_df, 'categories')
+revi_map, revi_key = build_map(reviews_df, 'reviewerID')
+
+user_count, item_count, cate_count, example_count =\
+ len(revi_map), len(asin_map), len(cate_map), reviews_df.shape[0]
+print('user_count: %d\titem_count: %d\tcate_count: %d\texample_count: %d' %
+ (user_count, item_count, cate_count, example_count))
+
+meta_df = meta_df.sort_values('asin')
+meta_df = meta_df.reset_index(drop=True)
+reviews_df['asin'] = reviews_df['asin'].map(lambda x: asin_map[x])
+reviews_df = reviews_df.sort_values(['reviewerID', 'unixReviewTime'])
+reviews_df = reviews_df.reset_index(drop=True)
+reviews_df = reviews_df[['reviewerID', 'asin', 'unixReviewTime']]
+
+cate_list = [meta_df['categories'][i] for i in range(len(asin_map))]
+cate_list = np.array(cate_list, dtype=np.int32)
+
+with open('./raw_data/remap.pkl', 'wb') as f:
+ pickle.dump(reviews_df, f, pickle.HIGHEST_PROTOCOL) # uid, iid
+ pickle.dump(cate_list, f, pickle.HIGHEST_PROTOCOL) # cid of iid line
+ pickle.dump((user_count, item_count, cate_count, example_count), f,
+ pickle.HIGHEST_PROTOCOL)
+ pickle.dump((asin_key, cate_key, revi_key), f, pickle.HIGHEST_PROTOCOL)
diff --git a/PaddleRec/din/infer.py b/PaddleRec/din/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..1ca763a484676bbbc680b38a2ade5c79b161b853
--- /dev/null
+++ b/PaddleRec/din/infer.py
@@ -0,0 +1,98 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import argparse
+import logging
+import numpy as np
+import os
+import paddle
+import paddle.fluid as fluid
+import reader
+
+logging.basicConfig(format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser(description="PaddlePaddle DIN example")
+ parser.add_argument(
+ '--model_path', type=str, required=True, help="path of model parameters")
+ parser.add_argument(
+ '--test_path', type=str, default='data/paddle_test.txt.bak', help='dir of test file')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether to use gpu')
+
+ return parser.parse_args()
+
+
+def calc_auc(raw_arr):
+ # sort by pred value, from small to big
+ arr = sorted(raw_arr, key=lambda d: d[2])
+ auc = 0.0
+ fp1, tp1, fp2, tp2 = 0.0, 0.0, 0.0, 0.0
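+    # trapezoid-rule accumulation over the ranked list; fp2/tp2 are the
+    # running totals of negatives (no-click) and positives (click) seen so far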
+ for record in arr:
+ fp2 += record[0] # noclick
+ tp2 += record[1] # click
+ auc += (fp2 - fp1) * (tp2 + tp1)
+ fp1, tp1 = fp2, tp2
+    # if all records are non-clicks or all are clicks, AUC is undefined; discard
+ threshold = len(arr) - 1e-3
+ if tp2 > threshold or fp2 > threshold:
+ return -0.5
+ if tp2 * fp2 > 0.0: # normal auc
+ return (1.0 - auc / (2.0 * tp2 * fp2))
+ else:
+ return None
+
+
+def infer():
+ args = parse_args()
+ model_path = args.model_path
+ use_cuda = True if args.use_cuda else False
+ data_reader, _ = reader.prepare_reader(args.test_path, 32 * 16)
+
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+
+ exe = fluid.Executor(place)
+
+ [inference_program, feed_target_names,
+ fetch_targets] = fluid.io.load_inference_model(model_path, exe)
+
+ feeder = fluid.DataFeeder(
+ feed_list=feed_target_names, place=place, program=inference_program)
+
+ loss_sum = 0.0
+ score = []
+ count = 0
+ for data in data_reader():
+ res = exe.run(inference_program,
+ feed=feeder.feed(data),
+ fetch_list=fetch_targets)
+ loss_sum += res[0]
+
+ for i in range(len(data)):
+ if data[i][4] > 0.5:
+ score.append([0, 1, res[1][i]])
+ else:
+ score.append([1, 0, res[1][i]])
+ count += 1
+ auc = calc_auc(score)
+ logger.info("TEST --> loss: {}, auc: {}".format(loss_sum / count, auc))
+
+
+if __name__ == '__main__':
+ infer()
diff --git a/PaddleRec/din/network.py b/PaddleRec/din/network.py
new file mode 100644
index 0000000000000000000000000000000000000000..a65e155f22d52680380a46d715ad09f48bf995d6
--- /dev/null
+++ b/PaddleRec/din/network.py
@@ -0,0 +1,140 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle.fluid as fluid
+
+def din_attention(hist, target_expand, max_len, mask):
+ """activation weight"""
+
+ hidden_size = hist.shape[-1]
+
+ concat = fluid.layers.concat(
+ [hist, target_expand, hist - target_expand, hist * target_expand],
+ axis=2)
+ atten_fc1 = fluid.layers.fc(name="atten_fc1",
+ input=concat,
+ size=80,
+ act="sigmoid",
+ num_flatten_dims=2)
+ atten_fc2 = fluid.layers.fc(name="atten_fc2",
+ input=atten_fc1,
+ size=40,
+ act="sigmoid",
+ num_flatten_dims=2)
+ atten_fc3 = fluid.layers.fc(name="atten_fc3",
+ input=atten_fc2,
+ size=1,
+ num_flatten_dims=2)
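+    # mask holds -1e9 at padded positions (see reader.make_data), so those
+    # positions receive ~zero weight after the softmax below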
+ atten_fc3 += mask
+ atten_fc3 = fluid.layers.transpose(x=atten_fc3, perm=[0, 2, 1])
+ atten_fc3 = fluid.layers.scale(x=atten_fc3, scale=hidden_size**-0.5)
+ weight = fluid.layers.softmax(atten_fc3)
+ out = fluid.layers.matmul(weight, hist)
+ out = fluid.layers.reshape(x=out, shape=[0, hidden_size])
+ return out
+
+
+def network(item_count, cat_count, max_len):
+ """network definition"""
+
+ item_emb_size = 64
+ cat_emb_size = 64
+    is_sparse = False
+    # set is_sparse=True to enable sparse embedding updates, which can
+    # significantly speed up training
+
+ item_emb_attr = fluid.ParamAttr(name="item_emb")
+ cat_emb_attr = fluid.ParamAttr(name="cat_emb")
+
+ hist_item_seq = fluid.layers.data(
+ name="hist_item_seq", shape=[max_len, 1], dtype="int64")
+ hist_cat_seq = fluid.layers.data(
+ name="hist_cat_seq", shape=[max_len, 1], dtype="int64")
+ target_item = fluid.layers.data(
+ name="target_item", shape=[1], dtype="int64")
+ target_cat = fluid.layers.data(
+ name="target_cat", shape=[1], dtype="int64")
+ label = fluid.layers.data(
+ name="label", shape=[1], dtype="float32")
+ mask = fluid.layers.data(
+ name="mask", shape=[max_len, 1], dtype="float32")
+ target_item_seq = fluid.layers.data(
+ name="target_item_seq", shape=[max_len, 1], dtype="int64")
+ target_cat_seq = fluid.layers.data(
+ name="target_cat_seq", shape=[max_len, 1], dtype="int64", lod_level=0)
+
+ hist_item_emb = fluid.layers.embedding(
+ input=hist_item_seq,
+ size=[item_count, item_emb_size],
+ param_attr=item_emb_attr,
+ is_sparse=is_sparse)
+
+ hist_cat_emb = fluid.layers.embedding(
+ input=hist_cat_seq,
+ size=[cat_count, cat_emb_size],
+ param_attr=cat_emb_attr,
+ is_sparse=is_sparse)
+
+ target_item_emb = fluid.layers.embedding(
+ input=target_item,
+ size=[item_count, item_emb_size],
+ param_attr=item_emb_attr,
+ is_sparse=is_sparse)
+
+ target_cat_emb = fluid.layers.embedding(
+ input=target_cat,
+ size=[cat_count, cat_emb_size],
+ param_attr=cat_emb_attr,
+ is_sparse=is_sparse)
+
+ target_item_seq_emb = fluid.layers.embedding(
+ input=target_item_seq,
+ size=[item_count, item_emb_size],
+ param_attr=item_emb_attr,
+ is_sparse=is_sparse)
+
+ target_cat_seq_emb = fluid.layers.embedding(
+ input=target_cat_seq,
+ size=[cat_count, cat_emb_size],
+ param_attr=cat_emb_attr,
+ is_sparse=is_sparse)
+
+ item_b = fluid.layers.embedding(
+ input=target_item,
+ size=[item_count, 1],
+ param_attr=fluid.initializer.Constant(value=0.0))
+
+ hist_seq_concat = fluid.layers.concat([hist_item_emb, hist_cat_emb], axis=2)
+ target_seq_concat = fluid.layers.concat(
+ [target_item_seq_emb, target_cat_seq_emb], axis=2)
+ target_concat = fluid.layers.concat(
+ [target_item_emb, target_cat_emb], axis=1)
+
+ out = din_attention(hist_seq_concat, target_seq_concat, max_len, mask)
+ out_fc = fluid.layers.fc(name="out_fc",
+ input=out,
+ size=item_emb_size + cat_emb_size,
+ num_flatten_dims=1)
+ embedding_concat = fluid.layers.concat([out_fc, target_concat], axis=1)
+
+ fc1 = fluid.layers.fc(name="fc1",
+ input=embedding_concat,
+ size=80,
+ act="sigmoid")
+ fc2 = fluid.layers.fc(name="fc2", input=fc1, size=40, act="sigmoid")
+ fc3 = fluid.layers.fc(name="fc3", input=fc2, size=1)
+ logit = fc3 + item_b
+
+ loss = fluid.layers.sigmoid_cross_entropy_with_logits(x=logit, label=label)
+ avg_loss = fluid.layers.mean(loss)
+ return avg_loss, fluid.layers.sigmoid(logit)
diff --git a/PaddleRec/din/reader.py b/PaddleRec/din/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..a50a90894b3619f8f752c1d18663f1677c0f6154
--- /dev/null
+++ b/PaddleRec/din/reader.py
@@ -0,0 +1,97 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import random
+import numpy as np
+import paddle
+import pickle
+
+def pad_batch_data(data, max_len):
+    res = np.array([x + [0] * (max_len - len(x)) for x in data])
+    res = res.astype("int64").reshape([-1, max_len, 1])
+    return res
+
+
+def make_data(b):
+ max_len = max(len(x[0]) for x in b)
+ item = pad_batch_data([x[0] for x in b], max_len)
+ cat = pad_batch_data([x[1] for x in b], max_len)
+ len_array = [len(x[0]) for x in b]
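+    # additive attention mask: 0 at real positions, -1e9 at padding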
+ mask = np.array(
+ [[0] * x + [-1e9] * (max_len - x) for x in len_array]).reshape(
+ [-1, max_len, 1])
+ target_item_seq = np.array(
+ [[x[2]] * max_len for x in b]).astype("int64").reshape(
+ [-1, max_len, 1])
+ target_cat_seq = np.array(
+ [[x[3]] * max_len for x in b]).astype("int64").reshape(
+ [-1, max_len, 1])
+ res = []
+ for i in range(len(b)):
+ res.append([
+ item[i], cat[i], b[i][2], b[i][3], b[i][4], mask[i],
+ target_item_seq[i], target_cat_seq[i]
+ ])
+ return res
+
+
+def batch_reader(reader, batch_size, group_size):
+    def _batch_reader():
+ bg = []
+ for line in reader:
+ bg.append(line)
+ if len(bg) == group_size:
+ sortb = sorted(bg, key=lambda x: len(x[0]), reverse=False)
+ bg = []
+ for i in range(0, group_size, batch_size):
+ b = sortb[i:i + batch_size]
+ yield make_data(b)
+ len_bg = len(bg)
+ if len_bg != 0:
+ sortb = sorted(bg, key=lambda x: len(x[0]), reverse=False)
+ bg = []
+ remain = len_bg % batch_size
+ for i in range(0, len_bg - remain, batch_size):
+ b = sortb[i:i + batch_size]
+ yield make_data(b)
+
+    return _batch_reader
+
+
+def base_read(file_dir):
+ res = []
+ max_len = 0
+ with open(file_dir, "r") as fin:
+ for line in fin:
+ line = line.strip().split(';')
+            hist = [int(x) for x in line[0].split()]
+            cate = [int(x) for x in line[1].split()]
+            max_len = max(max_len, len(hist))
+            res.append([hist, cate, int(line[2]), int(line[3]), float(line[4])])
+ return res, max_len
+
+
+def prepare_reader(data_path, bs):
+ data_set, max_len = base_read(data_path)
+ random.shuffle(data_set)
+ return batch_reader(data_set, bs, bs * 20), max_len
+
+
+def config_read(config_path):
+ with open(config_path, "r") as fin:
+ user_count = int(fin.readline().strip())
+ item_count = int(fin.readline().strip())
+ cat_count = int(fin.readline().strip())
+ return user_count, item_count, cat_count
diff --git a/PaddleRec/din/train.py b/PaddleRec/din/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..9697961510c786ddd7f9f4fd4560e3f8d62fe4b1
--- /dev/null
+++ b/PaddleRec/din/train.py
@@ -0,0 +1,178 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import os
+import sys
+import logging
+import time
+import numpy as np
+import argparse
+import paddle.fluid as fluid
+import paddle
+import network
+import reader
+import random
+
+logging.basicConfig(format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("din")
+ parser.add_argument(
+ '--config_path', type=str, default='data/config.txt', help='dir of config')
+ parser.add_argument(
+ '--train_dir', type=str, default='data/paddle_train.txt', help='dir of train file')
+ parser.add_argument(
+ '--model_dir', type=str, default='din_amazon', help='dir of saved model')
+ parser.add_argument(
+ '--batch_size', type=int, default=16, help='number of batch size')
+ parser.add_argument(
+ '--epoch_num', type=int, default=200, help='number of epoch')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether to use gpu')
+ parser.add_argument(
+ '--parallel', type=int, default=0, help='whether to use parallel executor')
+ parser.add_argument(
+        '--base_lr', type=float, default=0.85, help='base learning rate')
+ parser.add_argument(
+ '--num_devices', type=int, default=1, help='Number of GPU devices')
+ parser.add_argument(
+ '--enable_ce', action='store_true', help='If set, run the task with continuous evaluation logs.')
+ parser.add_argument(
+ '--batch_num', type=int, help="batch num for ce")
+ args = parser.parse_args()
+ return args
+
+
+def train():
+ args = parse_args()
+
+ if args.enable_ce:
+ SEED = 102
+ fluid.default_main_program().random_seed = SEED
+ fluid.default_startup_program().random_seed = SEED
+
+ config_path = args.config_path
+ train_path = args.train_dir
+ epoch_num = args.epoch_num
+ use_cuda = True if args.use_cuda else False
+ use_parallel = True if args.parallel else False
+
+ logger.info("reading data begins")
+ user_count, item_count, cat_count = reader.config_read(config_path)
+ data_reader, max_len = reader.prepare_reader(train_path, args.batch_size *
+ args.num_devices)
+ logger.info("reading data completes")
+
+ avg_cost, pred = network.network(item_count, cat_count, max_len)
+ fluid.clip.set_gradient_clip(clip=fluid.clip.GradientClipByGlobalNorm(
+ clip_norm=5.0))
+ base_lr = args.base_lr
+ boundaries = [410000]
+ values = [base_lr, 0.2]
+ sgd_optimizer = fluid.optimizer.SGD(
+ learning_rate=fluid.layers.piecewise_decay(
+ boundaries=boundaries, values=values))
+ sgd_optimizer.minimize(avg_cost)
+
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ feeder = fluid.DataFeeder(
+ feed_list=[
+ "hist_item_seq", "hist_cat_seq", "target_item", "target_cat",
+ "label", "mask", "target_item_seq", "target_cat_seq"
+ ],
+ place=place)
+ if use_parallel:
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=use_cuda, loss_name=avg_cost.name)
+ else:
+ train_exe = exe
+
+ logger.info("train begins")
+
+ global_step = 0
+ PRINT_STEP = 1000
+
+ total_time = []
+ ce_info = []
+ start_time = time.time()
+ loss_sum = 0.0
+    for epoch_id in range(epoch_num):
+        epoch = epoch_id + 1
+ for data in data_reader():
+ global_step += 1
+ results = train_exe.run(feed=feeder.feed(data),
+ fetch_list=[avg_cost.name, pred.name],
+ return_numpy=True)
+ loss_sum += results[0].mean()
+
+ if global_step % PRINT_STEP == 0:
+ ce_info.append(loss_sum / PRINT_STEP)
+ total_time.append(time.time() - start_time)
+ logger.info(
+ "epoch: %d\tglobal_step: %d\ttrain_loss: %.4f\t\ttime: %.2f"
+ % (epoch, global_step, loss_sum / PRINT_STEP,
+ time.time() - start_time))
+ start_time = time.time()
+ loss_sum = 0.0
+
+ if (global_step > 400000 and global_step % PRINT_STEP == 0) or (
+ global_step <= 400000 and global_step % 50000 == 0):
+ save_dir = args.model_dir + "/global_step_" + str(
+ global_step)
+ feed_var_name = [
+ "hist_item_seq", "hist_cat_seq", "target_item",
+ "target_cat", "label", "mask", "target_item_seq",
+ "target_cat_seq"
+ ]
+ fetch_vars = [avg_cost, pred]
+ fluid.io.save_inference_model(save_dir, feed_var_name,
+ fetch_vars, exe)
+ logger.info("model saved in " + save_dir)
+ if args.enable_ce and global_step >= args.batch_num:
+ break
+ # only for ce
+ if args.enable_ce:
+ gpu_num = get_cards(args)
+ ce_loss = 0
+ ce_time = 0
+ try:
+ ce_loss = ce_info[-1]
+ ce_time = total_time[-1]
+        except IndexError:
+ print("ce info error")
+ print("kpis\teach_pass_duration_card%s\t%s" %
+ (gpu_num, ce_time))
+ print("kpis\ttrain_loss_card%s\t%s" %
+ (gpu_num, ce_loss))
+
+
+def get_cards(args):
+ if args.enable_ce:
+ cards = os.environ.get('CUDA_VISIBLE_DEVICES')
+ num = len(cards.split(","))
+ return num
+ else:
+ return args.num_devices
+
+
+if __name__ == "__main__":
+ train()
diff --git a/PaddleRec/gnn/.run_ce.sh b/PaddleRec/gnn/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..7b56efb87cec85bf9346a251dc864456e1c1f54a
--- /dev/null
+++ b/PaddleRec/gnn/.run_ce.sh
@@ -0,0 +1,13 @@
+#!/bin/bash
+
+export MKL_NUM_THREADS=1
+export OMP_NUM_THREADS=1
+
+
+cudaid=${gnn:=0} # use 0-th card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python -u train.py --use_cuda 1 --epoch_num 5 --enable_ce | python _ce.py
+
+
+
diff --git a/PaddleRec/gnn/README.md b/PaddleRec/gnn/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..7261d8e9b9caf3ddf88358ecedf94df2de4b3acd
--- /dev/null
+++ b/PaddleRec/gnn/README.md
@@ -0,0 +1,118 @@
+# SR-GNN
+
+The following is a brief directory structure and description of this example:
+
+```text
+.
+├── README.md          # documentation
+├── train.py           # training script
+├── infer.py           # inference script
+├── network.py         # network structure
+├── reader.py          # data-reading utilities
+└── data/
+    ├── download.sh    # data download script
+    └── preprocess.py  # data preprocessing
+
+```
+
+## Introduction
+
+For an introduction to the SR-GNN model, see the paper [Session-based Recommendation with Graph Neural Networks](https://arxiv.org/abs/1811.00855).
+
+The model tackles session-based recommendation in roughly four steps:
+
+First, all session sequences are modeled as directed graphs.
+
+Then a GNN learns a latent vector representation for each node (item).
+
+Next, an attention architecture produces an embedding for each session.
+
+Finally, a softmax layer predicts over the full item vocabulary.
+
+We reproduced the results of the paper: P@20 reaches 50.7 on the DIGINETICA dataset.
+
+
+## Data download and preprocessing
+
+This example uses the [DIGINETICA](http://cikm2016.cs.iupui.edu/cikm-cup) dataset. Follow the steps below to obtain it and perform simple preprocessing.
+
+* Step 1: Run the following command to download the DIGINETICA dataset and preprocess it:
+```
+cd data && sh download.sh
+```
+
+* Step 2: Generate the training set, test set, and config file:
+```
+python preprocess.py --dataset diginetica
+cd ..
+```
+After it finishes, a diginetica folder is created under the data folder, containing three files: config.txt, test.txt, and train.txt.
+
+The generated data format is (session_list, label_list).
+
+session_list is a list of sessions, each of which is itself a list representing one session. label_list is a list whose element at each position is the label of the corresponding session in session_list.
+
+Example: session_list=[[1,2,3], [4], [7,9]] contains three sessions: the first is the item sequence 1,2,3; the second holds the single item 4; the third is the item sequence 7,9.
+
+label_list = [6, 9, 1] means the predicted label for session [1,2,3] should be 6, and likewise for the other two sessions.
+
+Tips:
+
+* To use data from your own scenario, simply make it satisfy the format above (a loading sketch follows these tips).
+* Both train.txt and test.txt in this example are binary (pickled) files.
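+
+As a minimal loading sketch (assuming the pickled layout described above, which matches how reader.py reads these files):
+
+```python
+import pickle
+
+# train.txt stores the tuple (session_list, label_list)
+with open('data/diginetica/train.txt', 'rb') as f:
+    session_list, label_list = pickle.load(f)
+
+print(session_list[:2])  # e.g. [[1, 2, 3], [4]]
+print(label_list[:2])    # e.g. [6, 9]
+```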
+
+
+## Training
+
+Refer to the commands below for training in different settings. Parameters such as batch_size and lr (learning rate) can also be specified; run the following to see the full configuration options:
+```
+python train.py -h
+```
+
+Single-node, single-GPU training:
+``` bash
+CUDA_VISIBLE_DEVICES=1 python -u train.py --use_cuda 1 > log.txt 2>&1 &
+```
+
+Single-node CPU training:
+``` bash
+CPU_NUM=1 python -u train.py --use_cuda 0 > log.txt 2>&1 &
+```
+
+Note that the single-card training above can be accelerated with the Parallel Executor by adding the --use_parallel 1 flag.
+
+
+## Sample training results
+
+The training log on a single Tesla K40m GPU looks like the following (your actual output may differ):
+```text
+W0308 16:08:24.249840 1785 device_context.cc:263] Please NOTE: device: 0, CUDA Capability: 35, Driver API Version: 9.0, Runtime API Version: 8.0
+W0308 16:08:24.249974 1785 device_context.cc:271] device: 0, cuDNN Version: 7.0.
+2019-03-08 16:08:38,079 - INFO - load data complete
+2019-03-08 16:08:38,080 - INFO - begin train
+2019-03-08 16:09:07,605 - INFO - step: 500, loss: 10.2052, train_acc: 0.0088
+2019-03-08 16:09:36,940 - INFO - step: 1000, loss: 9.7192, train_acc: 0.0320
+2019-03-08 16:10:08,617 - INFO - step: 1500, loss: 8.9290, train_acc: 0.1350
+...
+2019-03-08 16:16:01,151 - INFO - model saved in ./saved_model/epoch_0
+...
+```
+
+## Inference
+Run the following command to start inference. The start and end epochs can be specified via command-line arguments.
+
+```
+CUDA_VISIBLE_DEVICES=3 python infer.py
+```
+
+## Sample inference results
+```text
+W0308 16:41:56.847339 31709 device_context.cc:263] Please NOTE: device: 0, CUDA Capability: 35, Driver API Version: 9.0, Runtime API Version: 8.0
+W0308 16:41:56.847705 31709 device_context.cc:271] device: 0, cuDNN Version: 7.0.
+2019-03-08 16:42:20,420 - INFO - TEST --> loss: 5.8865, Recall@20: 0.4525
+2019-03-08 16:42:45,153 - INFO - TEST --> loss: 5.5314, Recall@20: 0.5010
+2019-03-08 16:43:10,233 - INFO - TEST --> loss: 5.5128, Recall@20: 0.5047
+...
+```
diff --git a/PaddleRec/gnn/__init__.py b/PaddleRec/gnn/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/PaddleRec/gnn/_ce.py b/PaddleRec/gnn/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..eea7c9aa6f91650a595f361ec8e5fea6aebafa69
--- /dev/null
+++ b/PaddleRec/gnn/_ce.py
@@ -0,0 +1,60 @@
+# this file is only used for continuous evaluation test!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi
+from kpi import DurationKpi
+from kpi import AccKpi
+
+
+each_pass_duration_card1_kpi = DurationKpi('each_pass_duration_card1', 0.08, 0, actived=True)
+train_loss_card1_kpi = CostKpi('train_loss_card1', 0.08, 0)
+train_acc_card1_kpi = AccKpi('train_acc_card1', 0.08, 0)
+
+tracking_kpis = [
+ each_pass_duration_card1_kpi,
+ train_loss_card1_kpi,
+ train_acc_card1_kpi,
+ ]
+
+
+def parse_log(log):
+ '''
+ This method should be implemented by model developers.
+
+ The suggestion:
+
+ each line in the log should be key, value, for example:
+
+ "
+ train_cost\t1.0
+ test_cost\t1.0
+ train_cost\t1.0
+ train_cost\t1.0
+ train_acc\t1.2
+ "
+ '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ log_to_ce(log)
diff --git a/PaddleRec/gnn/data/download.sh b/PaddleRec/gnn/data/download.sh
new file mode 100644
index 0000000000000000000000000000000000000000..23b75be34d81fff96c8e03ed527f307c8f3e0f3c
--- /dev/null
+++ b/PaddleRec/gnn/data/download.sh
@@ -0,0 +1,8 @@
+#!/bin/bash
+
+#The gdown.pl script comes from: https://github.com/circulosmeos/gdown.pl
+./gdown.pl https://drive.google.com/open?id=0B7XZSACQf0KdenRmMk8yVUU5LWc dataset-train-diginetica.zip
+unzip dataset-train-diginetica.zip "train-item-views.csv"
+sed -i '1d' train-item-views.csv
+sed -i '1i session_id;user_id;item_id;timeframe;eventdate' train-item-views.csv
+mkdir diginetica
diff --git a/PaddleRec/gnn/data/gdown.pl b/PaddleRec/gnn/data/gdown.pl
new file mode 100755
index 0000000000000000000000000000000000000000..05e3367ae50b6cd75e4c21704808589426d5dca1
--- /dev/null
+++ b/PaddleRec/gnn/data/gdown.pl
@@ -0,0 +1,75 @@
+#!/usr/bin/env perl
+#
+# Google Drive direct download of big files
+# ./gdown.pl 'gdrive file url' ['desired file name']
+#
+# v1.0 by circulosmeos 04-2014.
+# v1.1 by circulosmeos 01-2017.
+# v1.2, v1.3, v1.4 by circulosmeos 01-2019, 02-2019.
+# //circulosmeos.wordpress.com/2014/04/12/google-drive-direct-download-of-big-files
+# Distributed under GPL 3 (//www.gnu.org/licenses/gpl-3.0.html)
+#
+use strict;
+use POSIX;
+
+my $TEMP='gdown.cookie.temp';
+my $COMMAND;
+my $confirm;
+my $check;
+sub execute_command();
+
+my $URL=shift;
+die "\n./gdown.pl 'gdrive file url' [desired file name]\n\n" if $URL eq '';
+
+my $FILENAME=shift;
+$FILENAME='gdown.'.strftime("%Y%m%d%H%M%S", localtime).'.'.substr(rand,2) if $FILENAME eq '';
+
+if ($URL=~m#^https?://drive.google.com/file/d/([^/]+)#) {
+ $URL="https://docs.google.com/uc?id=$1&export=download";
+}
+elsif ($URL=~m#^https?://drive.google.com/open\?id=([^/]+)#) {
+ $URL="https://docs.google.com/uc?id=$1&export=download";
+}
+
+execute_command();
+
+while (-s $FILENAME < 100000) { # only if the file isn't the download yet
+ open fFILENAME, '<', $FILENAME;
+ $check=0;
+    foreach (<fFILENAME>) {
+ if (/href="(\/uc\?export=download[^"]+)/) {
+ $URL='https://docs.google.com'.$1;
+        $URL=~s/&amp;/&/g;
+ $confirm='';
+ $check=1;
+ last;
+ }
+ if (/confirm=([^;&]+)/) {
+ $confirm=$1;
+ $check=1;
+ last;
+ }
+ if (/"downloadUrl":"([^"]+)/) {
+ $URL=$1;
+ $URL=~s/\\u003d/=/g;
+ $URL=~s/\\u0026/&/g;
+ $confirm='';
+ $check=1;
+ last;
+ }
+ }
+ close fFILENAME;
+ die "Couldn't download the file :-(\n" if ($check==0);
+ $URL=~s/confirm=([^;&]+)/confirm=$confirm/ if $confirm ne '';
+
+ execute_command();
+}
+
+unlink $TEMP;
+
+sub execute_command() {
+ $COMMAND="wget --progress=dot:giga --no-check-certificate --load-cookie $TEMP --save-cookie $TEMP \"$URL\"";
+ $COMMAND.=" -O \"$FILENAME\"" if $FILENAME ne '';
+ system ( $COMMAND );
+ return 1;
+}
diff --git a/PaddleRec/gnn/data/preprocess.py b/PaddleRec/gnn/data/preprocess.py
new file mode 100755
index 0000000000000000000000000000000000000000..e35e1b4988ca45a818f2bddeb17539cc7e182e86
--- /dev/null
+++ b/PaddleRec/gnn/data/preprocess.py
@@ -0,0 +1,256 @@
+#!/usr/bin/env python36
+# -*- coding: utf-8 -*-
+"""
+Created on July, 2018
+
+@author: Tangrizzly
+"""
+
+import argparse
+import time
+import csv
+import pickle
+import operator
+import datetime
+import os
+
+parser = argparse.ArgumentParser()
+parser.add_argument(
+ '--dataset',
+ default='sample',
+ help='dataset name: diginetica/yoochoose/sample')
+opt = parser.parse_args()
+print(opt)
+
+dataset = 'sample_train-item-views.csv'
+if opt.dataset == 'diginetica':
+ dataset = 'train-item-views.csv'
+elif opt.dataset == 'yoochoose':
+ dataset = 'yoochoose-clicks.dat'
+
+print("-- Starting @ %ss" % datetime.datetime.now())
+with open(dataset, "r") as f:
+ if opt.dataset == 'yoochoose':
+ reader = csv.DictReader(f, delimiter=',')
+ else:
+ reader = csv.DictReader(f, delimiter=';')
+ sess_clicks = {}
+ sess_date = {}
+ ctr = 0
+ curid = -1
+ curdate = None
+ for data in reader:
+ sessid = data['session_id']
+ if curdate and not curid == sessid:
+ date = ''
+ if opt.dataset == 'yoochoose':
+ date = time.mktime(
+ time.strptime(curdate[:19], '%Y-%m-%dT%H:%M:%S'))
+ else:
+ date = time.mktime(time.strptime(curdate, '%Y-%m-%d'))
+ sess_date[curid] = date
+ curid = sessid
+ if opt.dataset == 'yoochoose':
+ item = data['item_id']
+ else:
+ item = data['item_id'], int(data['timeframe'])
+ curdate = ''
+ if opt.dataset == 'yoochoose':
+ curdate = data['timestamp']
+ else:
+ curdate = data['eventdate']
+
+ if sessid in sess_clicks:
+ sess_clicks[sessid] += [item]
+ else:
+ sess_clicks[sessid] = [item]
+ ctr += 1
+ date = ''
+ if opt.dataset == 'yoochoose':
+ date = time.mktime(time.strptime(curdate[:19], '%Y-%m-%dT%H:%M:%S'))
+ else:
+ date = time.mktime(time.strptime(curdate, '%Y-%m-%d'))
+ for i in list(sess_clicks):
+ sorted_clicks = sorted(sess_clicks[i], key=operator.itemgetter(1))
+ sess_clicks[i] = [c[0] for c in sorted_clicks]
+ sess_date[curid] = date
+print("-- Reading data @ %ss" % datetime.datetime.now())
+
+# Filter out length 1 sessions
+for s in list(sess_clicks):
+ if len(sess_clicks[s]) == 1:
+ del sess_clicks[s]
+ del sess_date[s]
+
+# Count number of times each item appears
+iid_counts = {}
+for s in sess_clicks:
+ seq = sess_clicks[s]
+ for iid in seq:
+ if iid in iid_counts:
+ iid_counts[iid] += 1
+ else:
+ iid_counts[iid] = 1
+
+sorted_counts = sorted(iid_counts.items(), key=operator.itemgetter(1))
+
+length = len(sess_clicks)
+for s in list(sess_clicks):
+ curseq = sess_clicks[s]
+ filseq = list(filter(lambda i: iid_counts[i] >= 5, curseq))
+ if len(filseq) < 2:
+ del sess_clicks[s]
+ del sess_date[s]
+ else:
+ sess_clicks[s] = filseq
+
+# Split out test set based on dates
+dates = list(sess_date.items())
+maxdate = dates[0][1]
+
+for _, date in dates:
+ if maxdate < date:
+ maxdate = date
+
+# 7 days for test
+splitdate = 0
+if opt.dataset == 'yoochoose':
+ splitdate = maxdate - 86400 * 1 # the number of seconds for a day:86400
+else:
+ splitdate = maxdate - 86400 * 7
+
+print('Splitting date', splitdate) # Yoochoose: ('Split date', 1411930799.0)
+tra_sess = filter(lambda x: x[1] < splitdate, dates)
+tes_sess = filter(lambda x: x[1] > splitdate, dates)
+
+# Sort sessions by date
+tra_sess = sorted(
+ tra_sess, key=operator.itemgetter(1)) # [(session_id, timestamp), (), ]
+tes_sess = sorted(
+ tes_sess, key=operator.itemgetter(1)) # [(session_id, timestamp), (), ]
+print(len(tra_sess)) # 186670 # 7966257
+print(len(tes_sess)) # 15979 # 15324
+print(tra_sess[:3])
+print(tes_sess[:3])
+print("-- Splitting train set and test set @ %ss" % datetime.datetime.now())
+
+# Choosing item count >=5 gives approximately the same number of items as reported in paper
+item_dict = {}
+
+
+# Convert training sessions to sequences and renumber items to start from 1
+def obtain_tra():
+ train_ids = []
+ train_seqs = []
+ train_dates = []
+ item_ctr = 1
+ for s, date in tra_sess:
+ seq = sess_clicks[s]
+ outseq = []
+ for i in seq:
+ if i in item_dict:
+ outseq += [item_dict[i]]
+ else:
+ outseq += [item_ctr]
+ item_dict[i] = item_ctr
+ item_ctr += 1
+ if len(outseq) < 2: # Doesn't occur
+ continue
+ train_ids += [s]
+ train_dates += [date]
+ train_seqs += [outseq]
+ print(item_ctr) # 43098, 37484
+ with open("./diginetica/config.txt", "w") as fout:
+ fout.write(str(item_ctr) + "\n")
+ return train_ids, train_dates, train_seqs
+
+
+# Convert test sessions to sequences, ignoring items that do not appear in training set
+def obtain_tes():
+ test_ids = []
+ test_seqs = []
+ test_dates = []
+ for s, date in tes_sess:
+ seq = sess_clicks[s]
+ outseq = []
+ for i in seq:
+ if i in item_dict:
+ outseq += [item_dict[i]]
+ if len(outseq) < 2:
+ continue
+ test_ids += [s]
+ test_dates += [date]
+ test_seqs += [outseq]
+ return test_ids, test_dates, test_seqs
+
+
+tra_ids, tra_dates, tra_seqs = obtain_tra()
+tes_ids, tes_dates, tes_seqs = obtain_tes()
+
+
+def process_seqs(iseqs, idates):
+ out_seqs = []
+ out_dates = []
+ labs = []
+ ids = []
+ for id, seq, date in zip(range(len(iseqs)), iseqs, idates):
+ for i in range(1, len(seq)):
+ tar = seq[-i]
+ labs += [tar]
+ out_seqs += [seq[:-i]]
+ out_dates += [date]
+ ids += [id]
+ return out_seqs, out_dates, labs, ids
+
+
+tr_seqs, tr_dates, tr_labs, tr_ids = process_seqs(tra_seqs, tra_dates)
+te_seqs, te_dates, te_labs, te_ids = process_seqs(tes_seqs, tes_dates)
+tra = (tr_seqs, tr_labs)
+tes = (te_seqs, te_labs)
+print(len(tr_seqs))
+print(len(te_seqs))
+print(tr_seqs[:3], tr_dates[:3], tr_labs[:3])
+print(te_seqs[:3], te_dates[:3], te_labs[:3])
+total = 0
+
+for seq in tra_seqs:
+    total += len(seq)
+for seq in tes_seqs:
+    total += len(seq)
+print('avg length: ', total / (len(tra_seqs) + len(tes_seqs) * 1.0))
+if opt.dataset == 'diginetica':
+ if not os.path.exists('diginetica'):
+ os.makedirs('diginetica')
+ pickle.dump(tra, open('diginetica/train.txt', 'wb'))
+ pickle.dump(tes, open('diginetica/test.txt', 'wb'))
+ pickle.dump(tra_seqs, open('diginetica/all_train_seq.txt', 'wb'))
+elif opt.dataset == 'yoochoose':
+ if not os.path.exists('yoochoose1_4'):
+ os.makedirs('yoochoose1_4')
+ if not os.path.exists('yoochoose1_64'):
+ os.makedirs('yoochoose1_64')
+ pickle.dump(tes, open('yoochoose1_4/test.txt', 'wb'))
+ pickle.dump(tes, open('yoochoose1_64/test.txt', 'wb'))
+
+ split4, split64 = int(len(tr_seqs) / 4), int(len(tr_seqs) / 64)
+ print(len(tr_seqs[-split4:]))
+ print(len(tr_seqs[-split64:]))
+
+ tra4, tra64 = (tr_seqs[-split4:], tr_labs[-split4:]), (tr_seqs[-split64:],
+ tr_labs[-split64:])
+ seq4, seq64 = tra_seqs[tr_ids[-split4]:], tra_seqs[tr_ids[-split64]:]
+
+ pickle.dump(tra4, open('yoochoose1_4/train.txt', 'wb'))
+ pickle.dump(seq4, open('yoochoose1_4/all_train_seq.txt', 'wb'))
+
+ pickle.dump(tra64, open('yoochoose1_64/train.txt', 'wb'))
+ pickle.dump(seq64, open('yoochoose1_64/all_train_seq.txt', 'wb'))
+
+else:
+ if not os.path.exists('sample'):
+ os.makedirs('sample')
+ pickle.dump(tra, open('sample/train.txt', 'wb'))
+ pickle.dump(tes, open('sample/test.txt', 'wb'))
+ pickle.dump(tra_seqs, open('sample/all_train_seq.txt', 'wb'))
+
+print('Done.')
diff --git a/PaddleRec/gnn/infer.py b/PaddleRec/gnn/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..ef007712dcd1b100138b5d887d3c6126d6761ead
--- /dev/null
+++ b/PaddleRec/gnn/infer.py
@@ -0,0 +1,78 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import argparse
+import logging
+import numpy as np
+import os
+import paddle
+import paddle.fluid as fluid
+import reader
+
+logging.basicConfig(format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser(description="PaddlePaddle DIN example")
+ parser.add_argument(
+ '--model_path', type=str, default='./saved_model/', help="path of model parameters")
+ parser.add_argument(
+ '--test_path', type=str, default='./data/diginetica/test.txt', help='dir of test file')
+ parser.add_argument(
+ '--use_cuda', type=int, default=1, help='whether to use gpu')
+ parser.add_argument(
+ '--batch_size', type=int, default=100, help='input batch size')
+    parser.add_argument(
+        '--start_index', type=int, default=0, help='start index')
+    parser.add_argument(
+        '--last_index', type=int, default=10, help='end index')
+ return parser.parse_args()
+
+
+def infer(epoch_num):
+ args = parse_args()
+ batch_size = args.batch_size
+ test_data = reader.Data(args.test_path, False)
+ place = fluid.CUDAPlace(0) if args.use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+
+ model_path = args.model_path + "epoch_" + str(epoch_num)
+ try:
+ [infer_program, feed_names, fetch_targets] = fluid.io.load_inference_model(
+ model_path, exe)
+ feeder = fluid.DataFeeder(
+ feed_list=feed_names, place=place, program=infer_program)
+
+ loss_sum = 0.0
+ acc_sum = 0.0
+ count = 0
+ for data in test_data.reader(batch_size, batch_size, False):
+ res = exe.run(infer_program,
+ feed=feeder.feed(data),
+ fetch_list=fetch_targets)
+ loss_sum += res[0]
+ acc_sum += res[1]
+ count += 1
+ logger.info("TEST --> loss: %.4lf, Recall@20: %.4lf" %
+ (loss_sum / count, acc_sum / count))
+    except ValueError:
+ logger.info("TEST --> error: there is no model in " + model_path)
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ for index in range(args.start_index, args.last_index + 1):
+ infer(index)
diff --git a/PaddleRec/gnn/network.py b/PaddleRec/gnn/network.py
new file mode 100644
index 0000000000000000000000000000000000000000..1cd1af9243603a9fff0554e9f2c22734e1bee217
--- /dev/null
+++ b/PaddleRec/gnn/network.py
@@ -0,0 +1,203 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import paddle
+import math
+import numpy as np
+import paddle.fluid as fluid
+import paddle.fluid.layers as layers
+
+
+def network(batch_size, items_num, hidden_size, step):
+ stdv = 1.0 / math.sqrt(hidden_size)
+
+ items = layers.data(
+ name="items",
+ shape=[batch_size, items_num, 1],
+ dtype="int64",
+ append_batch_size=False) #[bs, uniq_max, 1]
+ seq_index = layers.data(
+ name="seq_index",
+ shape=[batch_size, items_num],
+ dtype="int32",
+ append_batch_size=False) #[-1(seq_max)*batch_size, 1]
+ last_index = layers.data(
+ name="last_index",
+ shape=[batch_size],
+ dtype="int32",
+ append_batch_size=False) #[batch_size, 1]
+ adj_in = layers.data(
+ name="adj_in",
+ shape=[batch_size, items_num, items_num],
+ dtype="float32",
+ append_batch_size=False)
+ adj_out = layers.data(
+ name="adj_out",
+ shape=[batch_size, items_num, items_num],
+ dtype="float32",
+ append_batch_size=False)
+ mask = layers.data(
+ name="mask",
+ shape=[batch_size, -1, 1],
+ dtype="float32",
+ append_batch_size=False)
+ label = layers.data(
+ name="label",
+ shape=[batch_size, 1],
+ dtype="int64",
+ append_batch_size=False)
+
+ items_emb = layers.embedding(
+ input=items,
+ param_attr=fluid.ParamAttr(
+ name="emb",
+ initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv)),
+ size=[items_num, hidden_size]) #[batch_size, uniq_max, h]
+
+ pre_state = items_emb
+ for i in range(step):
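+        # one gated graph-conv step: aggregate neighbor states along incoming
+        # and outgoing edges, then update node states with a GRU cell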
+ pre_state = layers.reshape(
+ x=pre_state, shape=[batch_size, -1, hidden_size])
+ state_in = layers.fc(
+ input=pre_state,
+ name="state_in",
+ size=hidden_size,
+ act=None,
+ num_flatten_dims=2,
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv)),
+ bias_attr=fluid.ParamAttr(initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv))) #[batch_size, uniq_max, h]
+ state_out = layers.fc(
+ input=pre_state,
+ name="state_out",
+ size=hidden_size,
+ act=None,
+ num_flatten_dims=2,
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv)),
+ bias_attr=fluid.ParamAttr(initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv))) #[batch_size, uniq_max, h]
+
+ state_adj_in = layers.matmul(adj_in, state_in) #[batch_size, uniq_max, h]
+ state_adj_out = layers.matmul(adj_out, state_out) #[batch_size, uniq_max, h]
+
+ gru_input = layers.concat([state_adj_in, state_adj_out], axis=2)
+
+ gru_input = layers.reshape(x=gru_input, shape=[-1, hidden_size * 2])
+ gru_fc = layers.fc(
+ input=gru_input,
+ name="gru_fc",
+ size=3 * hidden_size,
+ bias_attr=False)
+ pre_state, _, _ = fluid.layers.gru_unit(
+ input=gru_fc,
+ hidden=layers.reshape(
+ x=pre_state, shape=[-1, hidden_size]),
+ size=3 * hidden_size)
+
+ final_state = pre_state
+ seq_index = layers.reshape(seq_index, shape=[-1])
+ seq = layers.gather(final_state, seq_index) #[batch_size*-1(seq_max), h]
+ last = layers.gather(final_state, last_index) #[batch_size, h]
+
+ seq = layers.reshape(
+ seq, shape=[batch_size, -1, hidden_size]) #[batch_size, -1(seq_max), h]
+ last = layers.reshape(
+ last, shape=[batch_size, hidden_size]) #[batch_size, h]
+
+ seq_fc = layers.fc(
+ input=seq,
+ name="seq_fc",
+ size=hidden_size,
+ bias_attr=False,
+ act=None,
+ num_flatten_dims=2,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv))) #[batch_size, -1(seq_max), h]
+ last_fc = layers.fc(
+ input=last,
+ name="last_fc",
+ size=hidden_size,
+ bias_attr=False,
+ act=None,
+ num_flatten_dims=1,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+                low=-stdv, high=stdv))) #[batch_size, h]
+
+ seq_fc_t = layers.transpose(
+ seq_fc, perm=[1, 0, 2]) #[-1(seq_max), batch_size, h]
+ add = layers.elementwise_add(
+ seq_fc_t, last_fc) #[-1(seq_max), batch_size, h]
+ b = layers.create_parameter(
+ shape=[hidden_size],
+ dtype='float32',
+ default_initializer=fluid.initializer.Constant(value=0.0)) #[h]
+ add = layers.elementwise_add(add, b) #[-1(seq_max), batch_size, h]
+
+ add_sigmoid = layers.sigmoid(add) #[-1(seq_max), batch_size, h]
+ add_sigmoid = layers.transpose(
+ add_sigmoid, perm=[1, 0, 2]) #[batch_size, -1(seq_max), h]
+
+ weight = layers.fc(
+ input=add_sigmoid,
+ name="weight_fc",
+ size=1,
+ act=None,
+ num_flatten_dims=2,
+ bias_attr=False,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv))) #[batch_size, -1, 1]
+ weight *= mask
+ weight_mask = layers.elementwise_mul(seq, weight, axis=0)
+ global_attention = layers.reduce_sum(weight_mask, dim=1)
+
+ final_attention = layers.concat(
+ [global_attention, last], axis=1) #[batch_size, 2*h]
+ final_attention_fc = layers.fc(
+ input=final_attention,
+ name="fina_attention_fc",
+ size=hidden_size,
+ bias_attr=False,
+ act=None,
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv))) #[batch_size, h]
+
+ all_vocab = layers.create_global_var(
+ shape=[items_num - 1, 1],
+ value=0,
+ dtype="int64",
+ persistable=True,
+ name="all_vocab")
+
+ all_emb = layers.embedding(
+ input=all_vocab,
+ param_attr=fluid.ParamAttr(
+ name="emb",
+ initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv)),
+ size=[items_num, hidden_size]) #[all_vocab, h]
+
+ logits = layers.matmul(
+ x=final_attention_fc, y=all_emb,
+ transpose_y=True) #[batch_size, all_vocab]
+ softmax = layers.softmax_with_cross_entropy(
+ logits=logits, label=label) #[batch_size, 1]
+ loss = layers.reduce_mean(softmax) # [1]
+ acc = layers.accuracy(input=logits, label=label, k=20)
+ return loss, acc
diff --git a/PaddleRec/gnn/reader.py b/PaddleRec/gnn/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..9b5793bc9b26ec6587b2999c98f72d5ca3bdc98e
--- /dev/null
+++ b/PaddleRec/gnn/reader.py
@@ -0,0 +1,116 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import numpy as np
+import copy
+import random
+import pickle
+
+
+class Data():
+ def __init__(self, path, shuffle=False):
+ data = pickle.load(open(path, 'rb'))
+ self.shuffle = shuffle
+ self.length = len(data[0])
+ self.input = list(zip(data[0], data[1]))
+
+ def make_data(self, cur_batch, batch_size):
+ cur_batch = [list(e) for e in cur_batch]
+ max_seq_len = 0
+ for e in cur_batch:
+ max_seq_len = max(max_seq_len, len(e[0]))
+ last_id = []
+ for e in cur_batch:
+ last_id.append(len(e[0]) - 1)
+ e[0] += [0] * (max_seq_len - len(e[0]))
+
+ max_uniq_len = 0
+ for e in cur_batch:
+ max_uniq_len = max(max_uniq_len, len(np.unique(e[0])))
+
+ items, adj_in, adj_out, seq_index, last_index = [], [], [], [], []
+ mask, label = [], []
+
+        idx = 0
+ for e in cur_batch:
+ node = np.unique(e[0])
+ items.append(node.tolist() + (max_uniq_len - len(node)) * [0])
+ adj = np.zeros((max_uniq_len, max_uniq_len))
+
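+            # build the directed session graph: edge u -> v for each pair of
+            # consecutive clicks, later normalized by in/out degree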
+ for i in np.arange(len(e[0]) - 1):
+ if e[0][i + 1] == 0:
+ break
+ u = np.where(node == e[0][i])[0][0]
+ v = np.where(node == e[0][i + 1])[0][0]
+ adj[u][v] = 1
+
+ u_deg_in = np.sum(adj, 0)
+ u_deg_in[np.where(u_deg_in == 0)] = 1
+ adj_in.append(np.divide(adj, u_deg_in).transpose())
+
+ u_deg_out = np.sum(adj, 1)
+ u_deg_out[np.where(u_deg_out == 0)] = 1
+ adj_out.append(np.divide(adj.transpose(), u_deg_out).transpose())
+
+            seq_index.append(
+                [np.where(node == i)[0][0] + idx * max_uniq_len for i in e[0]])
+            last_index.append(
+                np.where(node == e[0][last_id[idx]])[0][0] + idx * max_uniq_len)
+            label.append(e[1] - 1)
+            mask.append([[1] * (last_id[idx] + 1) + [0] *
+                         (max_seq_len - last_id[idx] - 1)])
+            idx += 1
+
+ items = np.array(items).astype("int64").reshape((batch_size, -1, 1))
+ seq_index = np.array(seq_index).astype("int32").reshape(
+ (batch_size, -1))
+ last_index = np.array(last_index).astype("int32").reshape(
+ (batch_size, 1))
+ adj_in = np.array(adj_in).astype("float32").reshape(
+ (batch_size, max_uniq_len, max_uniq_len))
+ adj_out = np.array(adj_out).astype("float32").reshape(
+ (batch_size, max_uniq_len, max_uniq_len))
+ mask = np.array(mask).astype("float32").reshape((batch_size, -1, 1))
+ label = np.array(label).astype("int64").reshape((batch_size, 1))
+ return zip(items, seq_index, last_index, adj_in, adj_out, mask, label)
+
+ def reader(self, batch_size, batch_group_size, train=True):
+ if self.shuffle:
+ random.shuffle(self.input)
+ group_remain = self.length % batch_group_size
+ for bg_id in range(0, self.length - group_remain, batch_group_size):
+ cur_bg = copy.deepcopy(self.input[bg_id:bg_id + batch_group_size])
+ if train:
+ cur_bg = sorted(cur_bg, key=lambda x: len(x[0]), reverse=True)
+ for i in range(0, batch_group_size, batch_size):
+ cur_batch = cur_bg[i:i + batch_size]
+ yield self.make_data(cur_batch, batch_size)
+
+ #deal with the remaining, discard at most batch_size data
+ if group_remain < batch_size:
+ return
+ remain_data = copy.deepcopy(self.input[-group_remain:])
+ if train:
+ remain_data = sorted(
+ remain_data, key=lambda x: len(x[0]), reverse=True)
+ for i in range(0, batch_group_size, batch_size):
+ if i + batch_size <= len(remain_data):
+ cur_batch = remain_data[i:i + batch_size]
+ yield self.make_data(cur_batch, batch_size)
+
+
+def read_config(path):
+ with open(path, "r") as fin:
+ item_num = int(fin.readline())
+ return item_num
diff --git a/PaddleRec/gnn/train.py b/PaddleRec/gnn/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..c244f84a586f91a231f5edf507ad0ca8439a7647
--- /dev/null
+++ b/PaddleRec/gnn/train.py
@@ -0,0 +1,172 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+import numpy as np
+import os
+from functools import partial
+import logging
+import time
+import paddle
+import paddle.fluid as fluid
+import argparse
+import network
+import reader
+
+logging.basicConfig(format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("gnn")
+ parser.add_argument(
+ '--train_path', type=str, default='./data/diginetica/train.txt', help='dir of training data')
+ parser.add_argument(
+ '--config_path', type=str, default='./data/diginetica/config.txt', help='dir of config')
+ parser.add_argument(
+ '--model_path', type=str, default='./saved_model', help="path of model parameters")
+ parser.add_argument(
+ '--epoch_num', type=int, default=30, help='number of epochs to train for')
+ parser.add_argument(
+ '--batch_size', type=int, default=100, help='input batch size')
+ parser.add_argument(
+ '--hidden_size', type=int, default=100, help='hidden state size')
+ parser.add_argument(
+ '--l2', type=float, default=1e-5, help='l2 penalty')
+ parser.add_argument(
+ '--lr', type=float, default=0.001, help='learning rate')
+ parser.add_argument(
+        '--step', type=int, default=1, help='gnn propagation steps')
+ parser.add_argument(
+ '--lr_dc', type=float, default=0.1, help='learning rate decay rate')
+ parser.add_argument(
+        '--lr_dc_step', type=int, default=3, help='number of epochs after which the learning rate decays')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether to use gpu')
+ parser.add_argument(
+ '--use_parallel', type=int, default=1, help='whether to use parallel executor')
+ parser.add_argument(
+ '--enable_ce', action='store_true', help='If set, run the task with continuous evaluation logs.')
+ return parser.parse_args()
+
+
+def train():
+ args = parse_args()
+
+ if args.enable_ce:
+ SEED = 102
+ fluid.default_main_program().random_seed = SEED
+ fluid.default_startup_program().random_seed = SEED
+
+ batch_size = args.batch_size
+ items_num = reader.read_config(args.config_path)
+ loss, acc = network.network(batch_size, items_num, args.hidden_size,
+ args.step)
+
+ data_reader = reader.Data(args.train_path, True)
+ logger.info("load data complete")
+
+ use_cuda = True if args.use_cuda else False
+ use_parallel = True if args.use_parallel else False
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+
+ exe = fluid.Executor(place)
+ step_per_epoch = data_reader.length // batch_size
+ optimizer = fluid.optimizer.Adam(
+ learning_rate=fluid.layers.exponential_decay(
+ learning_rate=args.lr,
+ decay_steps=step_per_epoch * args.lr_dc_step,
+ decay_rate=args.lr_dc),
+ regularization=fluid.regularizer.L2DecayRegularizer(
+ regularization_coeff=args.l2))
+ optimizer.minimize(loss)
+
+ exe.run(fluid.default_startup_program())
+
+ all_vocab = fluid.global_scope().var("all_vocab").get_tensor()
+ all_vocab.set(
+ np.arange(1, items_num).astype("int64").reshape((-1, 1)), place)
+
+ feed_list = [
+ "items", "seq_index", "last_index", "adj_in", "adj_out", "mask", "label"
+ ]
+ feeder = fluid.DataFeeder(feed_list=feed_list, place=place)
+
+ if use_parallel:
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=use_cuda, loss_name=loss.name)
+ else:
+ train_exe = exe
+
+ logger.info("begin train")
+
+ total_time = []
+ ce_info = []
+ start_time = time.time()
+ loss_sum = 0.0
+ acc_sum = 0.0
+ global_step = 0
+ PRINT_STEP = 500
+ for i in range(args.epoch_num):
+ epoch_sum = []
+ for data in data_reader.reader(batch_size, batch_size * 20, True):
+ res = train_exe.run(feed=feeder.feed(data),
+ fetch_list=[loss.name, acc.name])
+ loss_sum += res[0]
+ acc_sum += res[1]
+ epoch_sum.append(res[0])
+ global_step += 1
+ if global_step % PRINT_STEP == 0:
+ ce_info.append([loss_sum / PRINT_STEP, acc_sum / PRINT_STEP])
+ total_time.append(time.time() - start_time)
+ logger.info("global_step: %d, loss: %.4lf, train_acc: %.4lf" % (
+ global_step, loss_sum / PRINT_STEP, acc_sum / PRINT_STEP))
+ loss_sum = 0.0
+ acc_sum = 0.0
+ start_time = time.time()
+ logger.info("epoch loss: %.4lf" % (np.mean(epoch_sum)))
+ save_dir = args.model_path + "/epoch_" + str(i)
+ fetch_vars = [loss, acc]
+ fluid.io.save_inference_model(save_dir, feed_list, fetch_vars, exe)
+ logger.info("model saved in " + save_dir)
+
+ # only for ce
+ if args.enable_ce:
+ gpu_num = get_cards(args)
+ ce_loss = 0
+ ce_acc = 0
+ ce_time = 0
+ try:
+ ce_loss = ce_info[-1][0]
+ ce_acc = ce_info[-1][1]
+ ce_time = total_time[-1]
+        except IndexError:
+ print("ce info error")
+ print("kpis\teach_pass_duration_card%s\t%s" %
+ (gpu_num, ce_time))
+ print("kpis\ttrain_loss_card%s\t%f" %
+ (gpu_num, ce_loss))
+ print("kpis\ttrain_acc_card%s\t%f" %
+ (gpu_num, ce_acc))
+
+
+def get_cards(args):
+    cards = os.environ.get('CUDA_VISIBLE_DEVICES')
+    return len(cards.split(","))
+
+
+if __name__ == "__main__":
+ train()
diff --git a/PaddleRec/gru4rec/.run_ce.sh b/PaddleRec/gru4rec/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..902b49f1335ea0f527dab4c4826d36ead04d7680
--- /dev/null
+++ b/PaddleRec/gru4rec/.run_ce.sh
@@ -0,0 +1,22 @@
+#!/bin/bash
+
+export MKL_NUM_THREADS=1
+export OMP_NUM_THREADS=1
+
+
+export CPU_NUM=1
+export NUM_THREADS=1
+
+FLAGS_benchmark=true python train.py --train_dir train_big_data --vocab_path vocab_big.txt --use_cuda 0 --batch_size 500 --model_dir model_output --pass_num 2 --enable_ce --step_num 10 | python _ce.py
+
+
+cudaid=${gru4rec:=0} # use 0-th card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py --train_dir train_big_data --vocab_path vocab_big.txt --use_cuda 1 --batch_size 500 --model_dir model_output --pass_num 2 --enable_ce --step_num 1000 | python _ce.py
+
+
+cudaid=${gru4rec_4:=0,1,2,3} # use cards 0-3 by default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py --train_dir train_big_data --vocab_path vocab_big.txt --use_cuda 1 --parallel 1 --num_devices 2 --batch_size 500 --model_dir model_output --pass_num 2 --enable_ce --step_num 1000 | python _ce.py
diff --git a/PaddleRec/gru4rec/README.md b/PaddleRec/gru4rec/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..353781567f7012996199e51169233b306cd18722
--- /dev/null
+++ b/PaddleRec/gru4rec/README.md
@@ -0,0 +1,283 @@
+# GRU4REC
+
+The following is a brief directory structure and description of this example:
+
+```text
+.
+├── README.md           # documentation
+├── train.py            # training script: full vocabulary, cross-entropy
+├── train_sample_neg.py # training script: negative sampling, bpr loss or cross-entropy
+├── infer.py            # inference script: full vocabulary
+├── infer_sample_neg.py # inference script: negative sampling
+├── net.py              # network structure
+├── text2paddle.py      # converts text data to paddle input format
+├── cluster_train.py    # distributed training
+├── cluster_train.sh    # distributed training launcher
+├── utils               # common utilities
+├── convert_format.py   # data format conversion
+├── vocab.txt           # small-sample vocabulary
+├── train_data          # small-sample training directory
+└── test_data           # small-sample test directory
+
+```
+
+
+## Introduction
+
+For an introduction to the GRU4REC model, see the paper [Session-based Recommendations with Recurrent Neural Networks](https://arxiv.org/abs/1511.06939).
+
+The paper's contribution is the first application of RNNs (GRU) to session-based recommendation; compared with traditional KNN and matrix-factorization methods, it brings a clear improvement in accuracy.
+
+The core idea is to treat the items a user clicks within one session as a sequence and use such sequences to train an RNN. At prediction time, given a known click sequence as input, the model predicts the item most likely to be clicked next.
+
+Session-based recommendation applies to a wide range of scenarios, such as product browsing, news clicks, and location check-in sequences.
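+
+During training, each session sequence is turned into an (input, label) pair by shifting it one step: the reader in utils.py yields `(seq[:-1], seq[1:])`. A minimal illustration:
+
+```python
+session = [214536502, 214536500, 214536506, 214577561]
+src = session[:-1]  # inputs: every click except the last
+dst = session[1:]   # labels: the next click at each step
+```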
+
+Three loss functions are supported: full-vocabulary cross-entropy, Bayesian Personalized Ranking (bpr) with sampled negatives, and cross-entropy with sampled negatives.
+
+We broadly reproduce the results of the paper; the recall@20 scores are:
+
+Full vocabulary, cross-entropy: 0.67
+
+Sampled negatives, bpr: 0.606
+
+Sampled negatives, cross-entropy: 0.605
+
+
+If you only want to run the bundled sample data, you can skip the 'RSC15 data download and preprocessing' section.
+## RSC15 data download and preprocessing
+
+Run the following commands to download the RSC15 dataset from the official site:
+```
+curl -Lo yoochoose-data.7z https://s3-eu-west-1.amazonaws.com/yc-rdata/yoochoose-data.7z
+7z x yoochoose-data.7z
+```
+
+For GRU4REC's data filtering, download the preprocessing script from [https://github.com/hidasib/GRU4Rec/blob/master/examples/rsc15/preprocess.py](https://github.com/hidasib/GRU4Rec/blob/master/examples/rsc15/preprocess.py).
+
+Note: adjust the data paths at the top of the script:
+
+line 12: PATH_TO_ORIGINAL_DATA = './'
+
+line 13: PATH_TO_PROCESSED_DATA = './'
+
+Note: run the script with Python 3:
+```
+python preprocess.py
+```
+The generated data looks like this:
+
+```
+SessionId ItemId Time
+1 214536502 1396839069.277
+1 214536500 1396839249.868
+1 214536506 1396839286.998
+1 214577561 1396839420.306
+2 214662742 1396850197.614
+2 214662742 1396850239.373
+2 214825110 1396850317.446
+2 214757390 1396850390.71
+2 214757407 1396850438.247
+```
+
+The data format then needs to be converted; run the following script:
+```
+python convert_format.py
+```
+
+The training and test data for the model look as follows; each line is one user's item sequence in chronological order:
+
+```
+214536502 214536500 214536506 214577561
+214662742 214662742 214825110 214757390 214757407 214551617
+214716935 214774687 214832672
+214836765 214706482
+214701242 214826623
+214826835 214826715
+214838855 214838855
+214576500 214576500 214576500
+214821275 214821275 214821371 214821371 214821371 214717089 214563337 214706462 214717436 214743335 214826837 214819762
+214717867 214717867
+```
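+
+For orientation, here is a rough sketch of what this conversion does, i.e. grouping item ids by session in time order (the actual convert_format.py may differ in its details):
+
+```python
+def convert(lines):
+    """lines: 'SessionId ItemId Time' rows, header already skipped."""
+    sessions = {}
+    order = []
+    for line in lines:
+        session_id, item_id, _time = line.split()
+        if session_id not in sessions:
+            sessions[session_id] = []
+            order.append(session_id)
+        sessions[session_id].append(item_id)
+    return [" ".join(sessions[sid]) for sid in order]
+```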
+
+Next, generate the vocabulary and the corresponding Paddle input files from the training and test files.
+
+Put the training files under raw_train_data/ and the test files under raw_test_data/, then generate the corresponding train_data, test_data and vocab.txt:
+```
+python text2paddle.py raw_train_data/ raw_test_data/ train_data test_data vocab.txt
+```
+
+The converted output looks as follows; see train_data/small_train.txt for reference:
+```
+197 196 198 236
+93 93 384 362 363 43
+336 364 407
+421 322
+314 388
+128 58
+138 138
+46 46 46
+34 34 57 57 57 342 228 321 346 357 59 376
+110 110
+```
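+
+text2paddle.py essentially builds vocab.txt and rewrites each raw item id as its vocabulary index. Schematically (the helper names here are illustrative, not the script's actual API, and the real index assignment may differ):
+
+```python
+def build_vocab(sequences):
+    vocab = {}
+    for seq in sequences:
+        for item in seq.split():
+            vocab.setdefault(item, len(vocab) + 1)
+    return vocab
+
+def remap(seq, vocab):
+    return " ".join(str(vocab[item]) for item in seq.split())
+```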
+
+## Training
+
+For the full list of configurable parameters, run
+```
+python train.py -h
+```
+Training with full-vocabulary cross-entropy:
+
+GPU, single machine, single card:
+``` bash
+CUDA_VISIBLE_DEVICES=0 python train.py --train_dir train_data --use_cuda 1 --batch_size 50 --model_dir model_output
+```
+
+CPU, single machine:
+``` bash
+python train.py --train_dir train_data --use_cuda 0 --batch_size 50 --model_dir model_output
+```
+
+GPU, single machine, multiple cards:
+``` bash
+CUDA_VISIBLE_DEVICES=0,1 python train.py --train_dir train_data --use_cuda 1 --parallel 1 --batch_size 50 --model_dir model_output --num_devices 2
+```
+
+CPU, single machine, multiple cores:
+``` bash
+CPU_NUM=10 python train.py --train_dir train_data --use_cuda 0 --parallel 1 --batch_size 50 --model_dir model_output --num_devices 10
+```
+
+Training with sampled negatives and the Bayesian Personalized Ranking loss (bpr loss):
+```
+CUDA_VISIBLE_DEVICES=0 python train_sample_neg.py --loss bpr --use_cuda 1
+```
+
+Training with sampled negatives and cross-entropy loss:
+```
+CUDA_VISIBLE_DEVICES=0 python train_sample_neg.py --loss ce --use_cuda 1
+```
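+
+Conceptually, the bpr loss pushes the positive item's score above the score of each sampled negative; a schematic NumPy version of the objective (the model itself uses the fluid.layers.bpr_loss op):
+
+```python
+import numpy as np
+
+def bpr_loss(pos_score, neg_scores):
+    # -mean(log(sigmoid(pos - neg))) over the sampled negatives
+    diff = pos_score - np.asarray(neg_scores)
+    return float(-np.mean(np.log(1.0 / (1.0 + np.exp(-diff)) + 1e-8)))
+```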
+
+## Customizing the network
+
+You can adjust the network structure in [net.py](./net.py) (e.g. the `all_vocab_network` function); the current structure is:
+```python
+emb = fluid.layers.embedding(
+ input=src,
+ size=[vocab_size, hid_size],
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=init_low_bound, high=init_high_bound),
+ learning_rate=emb_lr_x),
+ is_sparse=True)
+
+fc0 = fluid.layers.fc(input=emb,
+ size=hid_size * 3,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=init_low_bound, high=init_high_bound),
+ learning_rate=gru_lr_x))
+gru_h0 = fluid.layers.dynamic_gru(
+ input=fc0,
+ size=hid_size,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=init_low_bound, high=init_high_bound),
+ learning_rate=gru_lr_x))
+
+fc = fluid.layers.fc(input=gru_h0,
+ size=vocab_size,
+ act='softmax',
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=init_low_bound, high=init_high_bound),
+ learning_rate=fc_lr_x))
+
+cost = fluid.layers.cross_entropy(input=fc, label=dst)
+acc = fluid.layers.accuracy(input=fc, label=dst, k=20)
+```
+
+## Sample training output
+
+The training log on a single Tesla K40m GPU looks like this:
+```text
+epoch_1 start
+step:100 ppl:441.468
+step:200 ppl:311.043
+step:300 ppl:218.952
+step:400 ppl:186.172
+step:500 ppl:188.600
+step:600 ppl:131.213
+step:700 ppl:165.770
+step:800 ppl:164.414
+step:900 ppl:156.470
+step:1000 ppl:174.201
+step:1100 ppl:118.619
+step:1200 ppl:122.635
+step:1300 ppl:118.220
+step:1400 ppl:90.372
+step:1500 ppl:135.018
+step:1600 ppl:114.327
+step:1700 ppl:141.806
+step:1800 ppl:93.416
+step:1900 ppl:92.897
+step:2000 ppl:121.703
+step:2100 ppl:96.288
+step:2200 ppl:88.355
+step:2300 ppl:101.737
+step:2400 ppl:95.934
+step:2500 ppl:86.158
+step:2600 ppl:80.925
+step:2700 ppl:202.219
+step:2800 ppl:106.828
+step:2900 ppl:91.458
+step:3000 ppl:105.988
+step:3100 ppl:87.067
+step:3200 ppl:92.651
+step:3300 ppl:101.145
+step:3400 ppl:91.247
+step:3500 ppl:107.656
+step:3600 ppl:89.410
+...
+...
+step:15700 ppl:76.819
+step:15800 ppl:62.257
+step:15900 ppl:81.735
+epoch:1 num_steps:15907 time_cost(s):4154.096032
+model saved in model_recall20/epoch_1
+...
+```
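+
+The `ppl` column is perplexity; train.py derives it from the fetched cross-entropy loss:
+
+```python
+avg_ppl = np.exp(ret_avg_cost[0])  # perplexity = exp(cross-entropy)
+newest_ppl = np.mean(avg_ppl)
+```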
+
+## Inference
+Run infer.py for the full-vocabulary model, or infer_sample_neg.py for the sampled-negative models:
+
+```
+CUDA_VISIBLE_DEVICES=0 python infer.py --test_dir test_data/ --model_dir model_output/ --start_index 1 --last_index 10 --use_cuda 1
+```
+
+## Sample inference output
+```text
+model:model_r@20/epoch_1 recall@20:0.613 time_cost(s):12.23
+model:model_r@20/epoch_2 recall@20:0.647 time_cost(s):12.33
+model:model_r@20/epoch_3 recall@20:0.662 time_cost(s):12.38
+model:model_r@20/epoch_4 recall@20:0.669 time_cost(s):12.21
+model:model_r@20/epoch_5 recall@20:0.673 time_cost(s):12.17
+model:model_r@20/epoch_6 recall@20:0.675 time_cost(s):12.26
+model:model_r@20/epoch_7 recall@20:0.677 time_cost(s):12.25
+model:model_r@20/epoch_8 recall@20:0.679 time_cost(s):12.37
+model:model_r@20/epoch_9 recall@20:0.680 time_cost(s):12.22
+model:model_r@20/epoch_10 recall@20:0.681 time_cost(s):12.2
+```
+
+
+## Distributed training
+Baidu-internal users can refer to the [wiki](http://wiki.baidu.com/pages/viewpage.action?pageId=628300529) to set up a multi-machine environment with paddlecloud.
+
+For other environments, cluster_train.py can serve as a reference.
+
+To simulate the multi-machine setup locally, run
+```
+sh cluster_train.sh
+```
+
+Note: disable any network proxy when running the local simulation.
diff --git a/PaddleRec/gru4rec/__init__.py b/PaddleRec/gru4rec/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/PaddleRec/gru4rec/_ce.py b/PaddleRec/gru4rec/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..90ddbf787a32e819c76394d44c5f9a41e3b08685
--- /dev/null
+++ b/PaddleRec/gru4rec/_ce.py
@@ -0,0 +1,66 @@
+# this file is only used for continuous evaluation test!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi
+from kpi import DurationKpi
+from kpi import AccKpi
+
+
+each_pass_duration_cpu1_thread1_kpi = DurationKpi('each_pass_duration_cpu1_thread1', 0.08, 0, actived=True)
+train_ppl_cpu1_thread1_kpi = CostKpi('train_ppl_cpu1_thread1', 0.08, 0)
+each_pass_duration_gpu1_kpi = DurationKpi('each_pass_duration_gpu1', 0.08, 0, actived=True)
+train_ppl_gpu1_kpi = CostKpi('train_ppl_gpu1', 0.08, 0)
+each_pass_duration_gpu4_kpi = DurationKpi('each_pass_duration_gpu4', 0.08, 0, actived=True)
+train_ppl_gpu4_kpi = CostKpi('train_ppl_gpu4', 0.08, 0)
+
+tracking_kpis = [
+ each_pass_duration_cpu1_thread1_kpi,
+ train_ppl_cpu1_thread1_kpi,
+ each_pass_duration_gpu1_kpi,
+ train_ppl_gpu1_kpi,
+ each_pass_duration_gpu4_kpi,
+ train_ppl_gpu4_kpi,
+ ]
+
+
+def parse_log(log):
+ '''
+ Parse KPI records from the training log.
+
+ train.py prints one tab-separated record per KPI; only lines of the
+ following form are picked up:
+
+ "
+ kpis\teach_pass_duration_gpu1\t1.0
+ kpis\ttrain_ppl_gpu1\t50.0
+ "
+ '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ log_to_ce(log)
diff --git a/PaddleRec/gru4rec/cluster_train.py b/PaddleRec/gru4rec/cluster_train.py
new file mode 100644
index 0000000000000000000000000000000000000000..f50542bf011d0caacddb3831368493df106463f5
--- /dev/null
+++ b/PaddleRec/gru4rec/cluster_train.py
@@ -0,0 +1,164 @@
+import os
+import sys
+import time
+import six
+import numpy as np
+import math
+import argparse
+import paddle.fluid as fluid
+import paddle
+import utils
+import net
+
+SEED = 102
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("gru4rec benchmark.")
+ parser.add_argument(
+ '--train_dir',
+ type=str,
+ default='train_data',
+ help='train file address')
+ parser.add_argument(
+ '--vocab_path',
+ type=str,
+ default='vocab.txt',
+ help='vocab file address')
+ parser.add_argument('--is_local', type=int, default=1, help='whether local')
+ parser.add_argument('--hid_size', type=int, default=100, help='hid size')
+ parser.add_argument(
+ '--model_dir', type=str, default='model_recall20', help='model dir')
+ parser.add_argument(
+ '--batch_size', type=int, default=5, help='num of batch size')
+ parser.add_argument('--pass_num', type=int, default=10, help='num of epoch')
+ parser.add_argument(
+ '--print_batch', type=int, default=10, help='num of print batch')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether use gpu')
+ parser.add_argument(
+ '--base_lr', type=float, default=0.01, help='learning rate')
+ parser.add_argument(
+ '--num_devices', type=int, default=1, help='Number of GPU devices')
+ parser.add_argument(
+ '--role', type=str, default='pserver', help='trainer or pserver')
+ parser.add_argument(
+ '--endpoints',
+ type=str,
+ default='127.0.0.1:6000',
+ help='The pserver endpoints, like: 127.0.0.1:6000, 127.0.0.1:6001')
+ parser.add_argument(
+ '--current_endpoint',
+ type=str,
+ default='127.0.0.1:6000',
+ help='The current_endpoint')
+ parser.add_argument(
+ '--trainer_id',
+ type=int,
+ default=0,
+ help='trainer id; only trainer_id=0 saves the model')
+ parser.add_argument(
+ '--trainers',
+ type=int,
+ default=1,
+ help='The number of trainers (default: 1)')
+ args = parser.parse_args()
+ return args
+
+
+def get_cards(args):
+ return args.num_devices
+
+
+def train():
+ """ do training """
+ args = parse_args()
+ hid_size = args.hid_size
+ train_dir = args.train_dir
+ vocab_path = args.vocab_path
+ use_cuda = True if args.use_cuda else False
+ print("use_cuda:", use_cuda)
+ batch_size = args.batch_size
+ vocab_size, train_reader = utils.prepare_data(
+ train_dir, vocab_path, batch_size=batch_size * get_cards(args),\
+ buffer_size=1000, word_freq_threshold=0, is_train=True)
+
+ # Train program
+ src_wordseq, dst_wordseq, avg_cost, acc = net.all_vocab_network(
+ vocab_size=vocab_size, hid_size=hid_size)
+
+ # Optimization to minimize the loss
+ sgd_optimizer = fluid.optimizer.SGD(learning_rate=args.base_lr)
+ sgd_optimizer.minimize(avg_cost)
+
+ def train_loop(main_program):
+ """ train network """
+ pass_num = args.pass_num
+ model_dir = args.model_dir
+ fetch_list = [avg_cost.name]
+
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+ total_time = 0.0
+ for pass_idx in six.moves.xrange(pass_num):
+ epoch_idx = pass_idx + 1
+ print("epoch_%d start" % epoch_idx)
+
+ t0 = time.time()
+ i = 0
+ newest_ppl = 0
+ for data in train_reader():
+ i += 1
+ lod_src_wordseq = utils.to_lodtensor([dat[0] for dat in data],
+ place)
+ lod_dst_wordseq = utils.to_lodtensor([dat[1] for dat in data],
+ place)
+ ret_avg_cost = exe.run(main_program,
+ feed={
+ "src_wordseq": lod_src_wordseq,
+ "dst_wordseq": lod_dst_wordseq
+ },
+ fetch_list=fetch_list)
+ avg_ppl = np.exp(ret_avg_cost[0])
+ newest_ppl = np.mean(avg_ppl)
+ if i % args.print_batch == 0:
+ print("step:%d ppl:%.3f" % (i, newest_ppl))
+
+ t1 = time.time()
+ total_time += t1 - t0
+ print("epoch:%d num_steps:%d time_cost(s):%f" %
+ (epoch_idx, i, total_time / epoch_idx))
+ save_dir = "%s/epoch_%d" % (model_dir, epoch_idx)
+ feed_var_names = ["src_wordseq", "dst_wordseq"]
+ fetch_vars = [avg_cost, acc]
+ if args.trainer_id == 0:
+ fluid.io.save_inference_model(save_dir, feed_var_names,
+ fetch_vars, exe)
+ print("model saved in %s" % save_dir)
+ print("finish training")
+
+ if args.is_local:
+ print("run local training")
+ train_loop(fluid.default_main_program())
+ else:
+ print("run distribute training")
+ t = fluid.DistributeTranspiler()
+ t.transpile(
+ args.trainer_id, pservers=args.endpoints, trainers=args.trainers)
+ if args.role == "pserver":
+ print("run psever")
+ pserver_prog = t.get_pserver_program(args.current_endpoint)
+ pserver_startup = t.get_startup_program(args.current_endpoint,
+ pserver_prog)
+ exe = fluid.Executor(fluid.CPUPlace())
+ exe.run(pserver_startup)
+ exe.run(pserver_prog)
+ elif args.role == "trainer":
+ print("run trainer")
+ train_loop(t.get_trainer_program())
+
+
+if __name__ == "__main__":
+ train()
diff --git a/fluid/PaddleRec/gru4rec/cluster_train.sh b/PaddleRec/gru4rec/cluster_train.sh
similarity index 100%
rename from fluid/PaddleRec/gru4rec/cluster_train.sh
rename to PaddleRec/gru4rec/cluster_train.sh
diff --git a/fluid/PaddleRec/gru4rec/convert_format.py b/PaddleRec/gru4rec/convert_format.py
similarity index 100%
rename from fluid/PaddleRec/gru4rec/convert_format.py
rename to PaddleRec/gru4rec/convert_format.py
diff --git a/PaddleRec/gru4rec/infer.py b/PaddleRec/gru4rec/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..ec113260bb4391cb970c515a2c958ae71b6176b7
--- /dev/null
+++ b/PaddleRec/gru4rec/infer.py
@@ -0,0 +1,93 @@
+import argparse
+import sys
+import time
+import numpy as np
+import paddle.fluid as fluid
+import paddle
+
+import utils
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("gru4rec benchmark.")
+ parser.add_argument(
+ '--test_dir', type=str, default='test_data', help='test file address')
+ parser.add_argument(
+ '--start_index', type=int, default=1, help='start index')
+ parser.add_argument(
+ '--last_index', type=int, default=10, help='end index')
+ parser.add_argument(
+ '--model_dir', type=str, default='model_recall20', help='model dir')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether use cuda')
+ parser.add_argument(
+ '--batch_size', type=int, default=5, help='batch size')
+ parser.add_argument(
+ '--vocab_path', type=str, default='vocab.txt', help='vocab file')
+ args = parser.parse_args()
+ return args
+
+
+def infer(test_reader, use_cuda, model_path):
+ """ inference function """
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+
+ with fluid.scope_guard(fluid.core.Scope()):
+ infer_program, feed_target_names, fetch_vars = fluid.io.load_inference_model(
+ model_path, exe)
+ accum_num_recall = 0.0
+ accum_num_sum = 0.0
+ t0 = time.time()
+ step_id = 0
+ for data in test_reader():
+ step_id += 1
+ src_wordseq = utils.to_lodtensor([dat[0] for dat in data], place)
+ label_data = [dat[1] for dat in data]
+ dst_wordseq = utils.to_lodtensor(label_data, place)
+ para = exe.run(
+ infer_program,
+ feed={"src_wordseq": src_wordseq,
+ "dst_wordseq": dst_wordseq},
+ fetch_list=fetch_vars,
+ return_numpy=False)
+
+ acc_ = para[1]._get_float_element(0)
+ data_length = len(
+ np.concatenate(
+ label_data, axis=0).astype("int64"))
+ accum_num_sum += (data_length)
+ accum_num_recall += (data_length * acc_)
+ if step_id % 1 == 0:
+ print("step:%d recall@20:%.4f" %
+ (step_id, accum_num_recall / accum_num_sum))
+ t1 = time.time()
+ print("model:%s recall@20:%.3f time_cost(s):%.2f" %
+ (model_path, accum_num_recall / accum_num_sum, t1 - t0))
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ start_index = args.start_index
+ last_index = args.last_index
+ test_dir = args.test_dir
+ model_dir = args.model_dir
+ batch_size = args.batch_size
+ vocab_path = args.vocab_path
+ use_cuda = True if args.use_cuda else False
+ print("start index: ", start_index, " last_index:", last_index)
+ vocab_size, test_reader = utils.prepare_data(
+ test_dir,
+ vocab_path,
+ batch_size=batch_size,
+ buffer_size=1000,
+ word_freq_threshold=0,
+ is_train=False)
+
+ for epoch in range(start_index, last_index + 1):
+ epoch_path = model_dir + "/epoch_" + str(epoch)
+ infer(test_reader=test_reader, use_cuda=use_cuda, model_path=epoch_path)
diff --git a/PaddleRec/gru4rec/infer_sample_neg.py b/PaddleRec/gru4rec/infer_sample_neg.py
new file mode 100644
index 0000000000000000000000000000000000000000..8839d6ceee2959b81edc2e50ef3b7ae9049c1a1f
--- /dev/null
+++ b/PaddleRec/gru4rec/infer_sample_neg.py
@@ -0,0 +1,104 @@
+import argparse
+import sys
+import time
+import numpy as np
+import paddle.fluid as fluid
+import paddle
+import net
+import utils
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("gru4rec benchmark.")
+ parser.add_argument(
+ '--test_dir', type=str, default='test_data', help='test file address')
+ parser.add_argument(
+ '--start_index', type=int, default=1, help='start index')
+ parser.add_argument(
+ '--last_index', type=int, default=3, help='last index')
+ parser.add_argument(
+ '--model_dir', type=str, default='model_neg_recall20', help='model dir')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether use cuda')
+ parser.add_argument(
+ '--batch_size', type=int, default=5, help='batch size')
+ parser.add_argument(
+ '--hid_size', type=int, default=100, help='hidden size')
+ parser.add_argument(
+ '--vocab_path', type=str, default='vocab.txt', help='vocab file')
+ args = parser.parse_args()
+ return args
+
+
+def infer(args, vocab_size, test_reader, use_cuda):
+ """ inference function """
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ hid_size = args.hid_size
+ batch_size = args.batch_size
+ with fluid.scope_guard(fluid.core.Scope()):
+ main_program = fluid.Program()
+ with fluid.program_guard(main_program):
+ acc = net.infer_network(vocab_size, batch_size, hid_size)
+ for epoch in range(start_index, last_index + 1):
+ copy_program = main_program.clone()
+ model_path = model_dir + "/epoch_" + str(epoch)
+ fluid.io.load_params(
+ executor=exe, dirname=model_path, main_program=copy_program)
+ accum_num_recall = 0.0
+ accum_num_sum = 0.0
+ t0 = time.time()
+ step_id = 0
+ for data in test_reader():
+ step_id += 1
+ label_data = [dat[1] for dat in data]
+ ls, lp = utils.to_lodtensor_bpr_test(data, vocab_size,
+ place)
+ para = exe.run(
+ copy_program,
+ feed={
+ "src": ls,
+ "all_label":
+ np.arange(vocab_size).reshape(vocab_size, 1),
+ "pos_label": lp
+ },
+ fetch_list=[acc.name],
+ return_numpy=False)
+
+ acc_ = np.array(para[0])[0]
+ data_length = len(
+ np.concatenate(
+ label_data, axis=0).astype("int64"))
+ accum_num_sum += (data_length)
+ accum_num_recall += (data_length * acc_)
+ if step_id % 1 == 0:
+ print("step:%d recall@20:%.4f" %
+ (step_id, accum_num_recall / accum_num_sum))
+ t1 = time.time()
+ print("model:%s recall@20:%.4f time_cost(s):%.2f" %
+ (model_path, accum_num_recall / accum_num_sum, t1 - t0))
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ start_index = args.start_index
+ last_index = args.last_index
+ test_dir = args.test_dir
+ model_dir = args.model_dir
+ batch_size = args.batch_size
+ vocab_path = args.vocab_path
+ use_cuda = True if args.use_cuda else False
+ print("start index: ", start_index, " last_index:", last_index)
+ vocab_size, test_reader = utils.prepare_data(
+ test_dir,
+ vocab_path,
+ batch_size=batch_size,
+ buffer_size=1000,
+ word_freq_threshold=0,
+ is_train=False)
+
+ infer(args, vocab_size, test_reader=test_reader, use_cuda=use_cuda)
diff --git a/PaddleRec/gru4rec/net.py b/PaddleRec/gru4rec/net.py
new file mode 100644
index 0000000000000000000000000000000000000000..6a715443ff1e72ae77aba51d5eaffe4eefee9687
--- /dev/null
+++ b/PaddleRec/gru4rec/net.py
@@ -0,0 +1,217 @@
+import paddle.fluid as fluid
+
+
+def all_vocab_network(vocab_size,
+ hid_size=100,
+ init_low_bound=-0.04,
+ init_high_bound=0.04):
+ """ network definition """
+ emb_lr_x = 10.0
+ gru_lr_x = 1.0
+ fc_lr_x = 1.0
+ # Input data
+ src_wordseq = fluid.layers.data(
+ name="src_wordseq", shape=[1], dtype="int64", lod_level=1)
+ dst_wordseq = fluid.layers.data(
+ name="dst_wordseq", shape=[1], dtype="int64", lod_level=1)
+
+ emb = fluid.layers.embedding(
+ input=src_wordseq,
+ size=[vocab_size, hid_size],
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=init_low_bound, high=init_high_bound),
+ learning_rate=emb_lr_x),
+ is_sparse=True)
+ fc0 = fluid.layers.fc(input=emb,
+ size=hid_size * 3,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=init_low_bound, high=init_high_bound),
+ learning_rate=gru_lr_x))
+ gru_h0 = fluid.layers.dynamic_gru(
+ input=fc0,
+ size=hid_size,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=init_low_bound, high=init_high_bound),
+ learning_rate=gru_lr_x))
+
+ fc = fluid.layers.fc(input=gru_h0,
+ size=vocab_size,
+ act='softmax',
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=init_low_bound, high=init_high_bound),
+ learning_rate=fc_lr_x))
+ cost = fluid.layers.cross_entropy(input=fc, label=dst_wordseq)
+ acc = fluid.layers.accuracy(input=fc, label=dst_wordseq, k=20)
+ avg_cost = fluid.layers.mean(x=cost)
+ return src_wordseq, dst_wordseq, avg_cost, acc
+
+
+def train_bpr_network(vocab_size, neg_size, hid_size, drop_out=0.2):
+ """ network definition """
+ emb_lr_x = 1.0
+ gru_lr_x = 1.0
+ fc_lr_x = 1.0
+ # Input data
+ src = fluid.layers.data(name="src", shape=[1], dtype="int64", lod_level=1)
+ pos_label = fluid.layers.data(
+ name="pos_label", shape=[1], dtype="int64", lod_level=1)
+ label = fluid.layers.data(
+ name="label", shape=[neg_size + 1], dtype="int64", lod_level=1)
+
+ emb_src = fluid.layers.embedding(
+ input=src,
+ size=[vocab_size, hid_size],
+ param_attr=fluid.ParamAttr(
+ name="emb",
+ initializer=fluid.initializer.XavierInitializer(),
+ learning_rate=emb_lr_x))
+
+ emb_src_drop = fluid.layers.dropout(emb_src, dropout_prob=drop_out)
+
+ fc0 = fluid.layers.fc(input=emb_src_drop,
+ size=hid_size * 3,
+ param_attr=fluid.ParamAttr(
+ name="gru_fc",
+ initializer=fluid.initializer.XavierInitializer(),
+ learning_rate=gru_lr_x),
+ bias_attr=False)
+ gru_h0 = fluid.layers.dynamic_gru(
+ input=fc0,
+ size=hid_size,
+ param_attr=fluid.ParamAttr(
+ name="dy_gru.param",
+ initializer=fluid.initializer.XavierInitializer(),
+ learning_rate=gru_lr_x),
+ bias_attr="dy_gru.bias")
+ gru_h0_drop = fluid.layers.dropout(gru_h0, dropout_prob=drop_out)
+
+ label_re = fluid.layers.sequence_reshape(input=label, new_dim=1)
+ emb_label = fluid.layers.embedding(
+ input=label_re,
+ size=[vocab_size, hid_size],
+ param_attr=fluid.ParamAttr(
+ name="emb",
+ initializer=fluid.initializer.XavierInitializer(),
+ learning_rate=emb_lr_x))
+
+ emb_label_drop = fluid.layers.dropout(emb_label, dropout_prob=drop_out)
+
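+ # tile the GRU state once per candidate (1 positive + neg_size negatives),
+ # then reshape so a row-wise dot product yields one score per candidate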
+ gru_exp = fluid.layers.expand(
+ x=gru_h0_drop, expand_times=[1, (neg_size + 1)])
+ gru = fluid.layers.sequence_reshape(input=gru_exp, new_dim=hid_size)
+
+ ele_mul = fluid.layers.elementwise_mul(emb_label_drop, gru)
+ red_sum = fluid.layers.reduce_sum(input=ele_mul, dim=1, keep_dim=True)
+
+ pre = fluid.layers.sequence_reshape(input=red_sum, new_dim=(neg_size + 1))
+
+ cost = fluid.layers.bpr_loss(input=pre, label=pos_label)
+ cost_sum = fluid.layers.reduce_sum(input=cost)
+ return src, pos_label, label, cost_sum
+
+
+def train_cross_entropy_network(vocab_size, neg_size, hid_size, drop_out=0.2):
+ """ network definition """
+ emb_lr_x = 1.0
+ gru_lr_x = 1.0
+ fc_lr_x = 1.0
+ # Input data
+ src = fluid.layers.data(name="src", shape=[1], dtype="int64", lod_level=1)
+ pos_label = fluid.layers.data(
+ name="pos_label", shape=[1], dtype="int64", lod_level=1)
+ label = fluid.layers.data(
+ name="label", shape=[neg_size + 1], dtype="int64", lod_level=1)
+
+ emb_src = fluid.layers.embedding(
+ input=src,
+ size=[vocab_size, hid_size],
+ param_attr=fluid.ParamAttr(
+ name="emb",
+ initializer=fluid.initializer.XavierInitializer(),
+ learning_rate=emb_lr_x))
+
+ emb_src_drop = fluid.layers.dropout(emb_src, dropout_prob=drop_out)
+
+ fc0 = fluid.layers.fc(input=emb_src_drop,
+ size=hid_size * 3,
+ param_attr=fluid.ParamAttr(
+ name="gru_fc",
+ initializer=fluid.initializer.XavierInitializer(),
+ learning_rate=gru_lr_x),
+ bias_attr=False)
+ gru_h0 = fluid.layers.dynamic_gru(
+ input=fc0,
+ size=hid_size,
+ param_attr=fluid.ParamAttr(
+ name="dy_gru.param",
+ initializer=fluid.initializer.XavierInitializer(),
+ learning_rate=gru_lr_x),
+ bias_attr="dy_gru.bias")
+ gru_h0_drop = fluid.layers.dropout(gru_h0, dropout_prob=drop_out)
+
+ label_re = fluid.layers.sequence_reshape(input=label, new_dim=1)
+ emb_label = fluid.layers.embedding(
+ input=label_re,
+ size=[vocab_size, hid_size],
+ param_attr=fluid.ParamAttr(
+ name="emb",
+ initializer=fluid.initializer.XavierInitializer(),
+ learning_rate=emb_lr_x))
+
+ emb_label_drop = fluid.layers.dropout(emb_label, dropout_prob=drop_out)
+
+ gru_exp = fluid.layers.expand(
+ x=gru_h0_drop, expand_times=[1, (neg_size + 1)])
+ gru = fluid.layers.sequence_reshape(input=gru_exp, new_dim=hid_size)
+
+ ele_mul = fluid.layers.elementwise_mul(emb_label_drop, gru)
+ red_sum = fluid.layers.reduce_sum(input=ele_mul, dim=1, keep_dim=True)
+
+ pre_ = fluid.layers.sequence_reshape(input=red_sum, new_dim=(neg_size + 1))
+ pre = fluid.layers.softmax(input=pre_)
+
+ cost = fluid.layers.cross_entropy(input=pre, label=pos_label)
+ cost_sum = fluid.layers.reduce_sum(input=cost)
+ return src, pos_label, label, cost_sum
+
+
+def infer_network(vocab_size, batch_size, hid_size, dropout=0.2):
+ src = fluid.layers.data(name="src", shape=[1], dtype="int64", lod_level=1)
+ emb_src = fluid.layers.embedding(
+ input=src, size=[vocab_size, hid_size], param_attr="emb")
+ emb_src_drop = fluid.layers.dropout(
+ emb_src, dropout_prob=dropout, is_test=True)
+
+ fc0 = fluid.layers.fc(input=emb_src_drop,
+ size=hid_size * 3,
+ param_attr="gru_fc",
+ bias_attr=False)
+ gru_h0 = fluid.layers.dynamic_gru(
+ input=fc0,
+ size=hid_size,
+ param_attr="dy_gru.param",
+ bias_attr="dy_gru.bias")
+ gru_h0_drop = fluid.layers.dropout(
+ gru_h0, dropout_prob=dropout, is_test=True)
+
+ all_label = fluid.layers.data(
+ name="all_label",
+ shape=[vocab_size, 1],
+ dtype="int64",
+ append_batch_size=False)
+ emb_all_label = fluid.layers.embedding(
+ input=all_label, size=[vocab_size, hid_size], param_attr="emb")
+ emb_all_label_drop = fluid.layers.dropout(
+ emb_all_label, dropout_prob=dropout, is_test=True)
+
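+ # score every vocabulary item at once: [batch, hid] x [vocab, hid]^T
+ # -> [batch, vocab]; the accuracy with k=20 below is exactly recall@20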
+ all_pre = fluid.layers.matmul(
+ gru_h0_drop, emb_all_label_drop, transpose_y=True)
+
+ pos_label = fluid.layers.data(
+ name="pos_label", shape=[1], dtype="int64", lod_level=1)
+ acc = fluid.layers.accuracy(input=all_pre, label=pos_label, k=20)
+ return acc
diff --git a/fluid/PaddleRec/gru4rec/test_data/small_test.txt b/PaddleRec/gru4rec/test_data/small_test.txt
similarity index 100%
rename from fluid/PaddleRec/gru4rec/test_data/small_test.txt
rename to PaddleRec/gru4rec/test_data/small_test.txt
diff --git a/fluid/PaddleRec/gru4rec/text2paddle.py b/PaddleRec/gru4rec/text2paddle.py
similarity index 100%
rename from fluid/PaddleRec/gru4rec/text2paddle.py
rename to PaddleRec/gru4rec/text2paddle.py
diff --git a/PaddleRec/gru4rec/train.py b/PaddleRec/gru4rec/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..b43926b69eaf002380a261a0689be91ec3f6ff90
--- /dev/null
+++ b/PaddleRec/gru4rec/train.py
@@ -0,0 +1,172 @@
+import os
+import sys
+import time
+import six
+import numpy as np
+import math
+import argparse
+import paddle.fluid as fluid
+import paddle
+import utils
+import net
+
+SEED = 102
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("gru4rec benchmark.")
+ parser.add_argument(
+ '--train_dir', type=str, default='train_data', help='train file')
+ parser.add_argument(
+ '--vocab_path', type=str, default='vocab.txt', help='vocab file')
+ parser.add_argument(
+ '--is_local', type=int, default=1, help='whether is local')
+ parser.add_argument(
+ '--hid_size', type=int, default=100, help='hidden-dim size')
+ parser.add_argument(
+ '--model_dir', type=str, default='model_recall20', help='model dir')
+ parser.add_argument(
+ '--batch_size', type=int, default=5, help='num of batch size')
+ parser.add_argument(
+ '--print_batch', type=int, default=10, help='num of print batch')
+ parser.add_argument(
+ '--pass_num', type=int, default=10, help='number of epoch')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether use gpu')
+ parser.add_argument(
+ '--parallel', type=int, default=0, help='whether parallel')
+ parser.add_argument(
+ '--base_lr', type=float, default=0.01, help='learning rate')
+ parser.add_argument(
+ '--num_devices', type=int, default=1, help='Number of GPU devices')
+ parser.add_argument(
+ '--step_num', type=int, default=1000, help='Number of steps')
+ parser.add_argument(
+ '--enable_ce',
+ action='store_true',
+ help='If set, run the task with continuous evaluation logs.')
+ args = parser.parse_args()
+ return args
+
+
+def get_cards(args):
+ return args.num_devices
+
+
+def train():
+ """ do training """
+ args = parse_args()
+ if args.enable_ce:
+ fluid.default_startup_program().random_seed = SEED
+ fluid.default_main_program().random_seed = SEED
+ hid_size = args.hid_size
+ train_dir = args.train_dir
+ vocab_path = args.vocab_path
+ use_cuda = True if args.use_cuda else False
+ parallel = True if args.parallel else False
+ print("use_cuda:", use_cuda, "parallel:", parallel)
+ batch_size = args.batch_size
+ vocab_size, train_reader = utils.prepare_data(
+ train_dir, vocab_path, batch_size=batch_size * get_cards(args),\
+ buffer_size=1000, word_freq_threshold=0, is_train=True)
+
+ # Train program
+ src_wordseq, dst_wordseq, avg_cost, acc = net.all_vocab_network(
+ vocab_size=vocab_size, hid_size=hid_size)
+
+ # Optimization to minimize the loss
+ optimizer = fluid.optimizer.Adagrad(learning_rate=args.base_lr)
+ optimizer.minimize(avg_cost)
+
+ # Initialize executor
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+ if parallel:
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=use_cuda, loss_name=avg_cost.name)
+ else:
+ train_exe = exe
+
+ pass_num = args.pass_num
+ model_dir = args.model_dir
+ fetch_list = [avg_cost.name]
+
+ ce_info = []
+ total_time = 0.0
+ for pass_idx in six.moves.xrange(pass_num):
+ epoch_idx = pass_idx + 1
+ print("epoch_%d start" % epoch_idx)
+
+ t0 = time.time()
+ i = 0
+ newest_ppl = 0
+ for data in train_reader():
+ i += 1
+ lod_src_wordseq = utils.to_lodtensor([dat[0] for dat in data],
+ place)
+ lod_dst_wordseq = utils.to_lodtensor([dat[1] for dat in data],
+ place)
+ ret_avg_cost = train_exe.run(feed={
+ "src_wordseq": lod_src_wordseq,
+ "dst_wordseq": lod_dst_wordseq
+ },
+ fetch_list=fetch_list)
+ avg_ppl = np.exp(ret_avg_cost[0])
+ newest_ppl = np.mean(avg_ppl)
+ ce_info.append(newest_ppl)
+ if i % args.print_batch == 0:
+ print("step:%d ppl:%.3f" % (i, newest_ppl))
+ if args.enable_ce and i > args.step_num:
+ break
+
+ t1 = time.time()
+ total_time += t1 - t0
+ print("epoch:%d num_steps:%d time_cost(s):%f" %
+ (epoch_idx, i, total_time / epoch_idx))
+ save_dir = "%s/epoch_%d" % (model_dir, epoch_idx)
+ feed_var_names = ["src_wordseq", "dst_wordseq"]
+ fetch_vars = [avg_cost, acc]
+ fluid.io.save_inference_model(save_dir, feed_var_names, fetch_vars, exe)
+ print("model saved in %s" % save_dir)
+
+ # only for ce
+ if args.enable_ce:
+ ce_ppl = 0
+ try:
+ ce_ppl = ce_info[-2]
+ except IndexError:
+ print("ce info error")
+ epoch_idx = args.pass_num
+ device = get_device(args)
+ if args.use_cuda:
+ gpu_num = device[1]
+ print("kpis\teach_pass_duration_gpu%s\t%s" %
+ (gpu_num, total_time / epoch_idx))
+ print("kpis\ttrain_ppl_gpu%s\t%s" %
+ (gpu_num, ce_ppl))
+ else:
+ cpu_num = device[1]
+ threads_num = device[2]
+ print("kpis\teach_pass_duration_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, total_time / epoch_idx))
+ print("kpis\ttrain_ppl_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, ce_ppl))
+
+ print("finish training")
+
+
+def get_device(args):
+ if args.use_cuda:
+ gpus = os.environ.get("CUDA_VISIBLE_DEVICES", 1)
+ gpu_num = len(gpus.split(','))
+ return "gpu", gpu_num
+ else:
+ threads_num = os.environ.get('NUM_THREADS', 1)
+ cpu_num = os.environ.get('CPU_NUM', 1)
+ return "cpu", int(cpu_num), int(threads_num)
+
+
+if __name__ == "__main__":
+ train()
diff --git a/fluid/PaddleRec/gru4rec/train_data/small_train.txt b/PaddleRec/gru4rec/train_data/small_train.txt
similarity index 100%
rename from fluid/PaddleRec/gru4rec/train_data/small_train.txt
rename to PaddleRec/gru4rec/train_data/small_train.txt
diff --git a/PaddleRec/gru4rec/train_sample_neg.py b/PaddleRec/gru4rec/train_sample_neg.py
new file mode 100644
index 0000000000000000000000000000000000000000..2642452024810fe16cfa1154e273febdb1d63254
--- /dev/null
+++ b/PaddleRec/gru4rec/train_sample_neg.py
@@ -0,0 +1,131 @@
+import os
+import sys
+import time
+import six
+import numpy as np
+import math
+import argparse
+import paddle.fluid as fluid
+import paddle
+import utils
+import net
+
+SEED = 102
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("gru4rec benchmark.")
+ parser.add_argument(
+ '--train_dir', type=str, default='train_data', help='train file')
+ parser.add_argument(
+ '--vocab_path', type=str, default='vocab.txt', help='vocab file')
+ parser.add_argument(
+ '--is_local', type=int, default=1, help='whether is local')
+ parser.add_argument(
+ '--hid_size', type=int, default=100, help='hidden-dim size')
+ parser.add_argument(
+ '--neg_size', type=int, default=10, help='neg item size')
+ parser.add_argument(
+ '--loss', type=str, default="bpr", help='loss: bpr/cross_entropy')
+ parser.add_argument(
+ '--model_dir', type=str, default='model_neg_recall20', help='model dir')
+ parser.add_argument(
+ '--batch_size', type=int, default=5, help='num of batch size')
+ parser.add_argument(
+ '--print_batch', type=int, default=10, help='num of print batch')
+ parser.add_argument(
+ '--pass_num', type=int, default=10, help='number of epoch')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether use gpu')
+ parser.add_argument(
+ '--parallel', type=int, default=0, help='whether parallel')
+ parser.add_argument(
+ '--base_lr', type=float, default=0.01, help='learning rate')
+ parser.add_argument(
+ '--num_devices', type=int, default=1, help='Number of GPU devices')
+ args = parser.parse_args()
+ return args
+
+
+def get_cards(args):
+ return args.num_devices
+
+
+def train():
+ """ do training """
+ args = parse_args()
+ hid_size = args.hid_size
+ train_dir = args.train_dir
+ vocab_path = args.vocab_path
+ use_cuda = True if args.use_cuda else False
+ parallel = True if args.parallel else False
+ print("use_cuda:", use_cuda, "parallel:", parallel)
+ batch_size = args.batch_size
+ vocab_size, train_reader = utils.prepare_data(
+ train_dir, vocab_path, batch_size=batch_size * get_cards(args),\
+ buffer_size=1000, word_freq_threshold=0, is_train=True)
+
+ # Train program
+ if args.loss == 'bpr':
+ print('bpr loss')
+ src, pos_label, label, avg_cost = net.train_bpr_network(
+ neg_size=args.neg_size, vocab_size=vocab_size, hid_size=hid_size)
+ else:
+ print('cross-entropy loss')
+ src, pos_label, label, avg_cost = net.train_cross_entropy_network(
+ neg_size=args.neg_size, vocab_size=vocab_size, hid_size=hid_size)
+
+ # Optimization to minimize the loss
+ optimizer = fluid.optimizer.Adagrad(learning_rate=args.base_lr)
+ optimizer.minimize(avg_cost)
+
+ # Initialize executor
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+ if parallel:
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=use_cuda, loss_name=avg_cost.name)
+ else:
+ train_exe = exe
+
+ pass_num = args.pass_num
+ model_dir = args.model_dir
+ fetch_list = [avg_cost.name]
+
+ total_time = 0.0
+ for pass_idx in six.moves.xrange(pass_num):
+ epoch_idx = pass_idx + 1
+ print("epoch_%d start" % epoch_idx)
+
+ t0 = time.time()
+ i = 0
+ newest_ppl = 0
+ for data in train_reader():
+ i += 1
+ ls, lp, ll = utils.to_lodtensor_bpr(data, args.neg_size, vocab_size,
+ place)
+ ret_avg_cost = train_exe.run(
+ feed={"src": ls,
+ "label": ll,
+ "pos_label": lp},
+ fetch_list=fetch_list)
+ avg_ppl = np.exp(ret_avg_cost[0])
+ newest_ppl = np.mean(avg_ppl)
+ if i % args.print_batch == 0:
+ print("step:%d ppl:%.3f" % (i, newest_ppl))
+
+ t1 = time.time()
+ total_time += t1 - t0
+ print("epoch:%d num_steps:%d time_cost(s):%f" %
+ (epoch_idx, i, total_time / epoch_idx))
+ save_dir = "%s/epoch_%d" % (model_dir, epoch_idx)
+ fluid.io.save_params(executor=exe, dirname=save_dir)
+ print("model saved in %s" % save_dir)
+
+ print("finish training")
+
+
+if __name__ == "__main__":
+ train()
diff --git a/PaddleRec/gru4rec/utils.py b/PaddleRec/gru4rec/utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..1cd6a313b2a5097b16c473722737e0e6936f4e31
--- /dev/null
+++ b/PaddleRec/gru4rec/utils.py
@@ -0,0 +1,193 @@
+import sys
+import collections
+import six
+import time
+import numpy as np
+import paddle.fluid as fluid
+import paddle
+import os
+
+
+def to_lodtensor(data, place):
+ """ convert to LODtensor """
+ seq_lens = [len(seq) for seq in data]
+ cur_len = 0
+ lod = [cur_len]
+ for l in seq_lens:
+ cur_len += l
+ lod.append(cur_len)
+ flattened_data = np.concatenate(data, axis=0).astype("int64")
+ flattened_data = flattened_data.reshape([len(flattened_data), 1])
+ res = fluid.LoDTensor()
+ res.set(flattened_data, place)
+ res.set_lod([lod])
+ return res
+
+
+def to_lodtensor_bpr(raw_data, neg_size, vocab_size, place):
+ """ convert to LODtensor """
+ data = [dat[0] for dat in raw_data]
+ seq_lens = [len(seq) for seq in data]
+ cur_len = 0
+ lod = [cur_len]
+ for l in seq_lens:
+ cur_len += l
+ lod.append(cur_len)
+ flattened_data = np.concatenate(data, axis=0).astype("int64")
+ flattened_data = flattened_data.reshape([len(flattened_data), 1])
+ res = fluid.LoDTensor()
+ res.set(flattened_data, place)
+ res.set_lod([lod])
+
+ data = [dat[1] for dat in raw_data]
+ pos_data = np.concatenate(data, axis=0).astype("int64")
+ length = np.size(pos_data)
+ neg_data = np.tile(pos_data, neg_size)
+ np.random.shuffle(neg_data)
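+ # negatives are tiled, shuffled copies of the positives; if a sampled
+ # negative collides with its own positive item, swap in a different id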
+ for ii in range(length * neg_size):
+ if neg_data[ii] == pos_data[ii // neg_size]:
+ neg_data[ii] = pos_data[length - 1 - ii // neg_size]
+
+ label_data = np.column_stack(
+ (pos_data.reshape(length, 1), neg_data.reshape(length, neg_size)))
+ res_label = fluid.LoDTensor()
+ res_label.set(label_data, place)
+ res_label.set_lod([lod])
+
+ res_pos = fluid.LoDTensor()
+ res_pos.set(np.zeros([len(flattened_data), 1]).astype("int64"), place)
+ res_pos.set_lod([lod])
+
+ return res, res_pos, res_label
+
+
+def to_lodtensor_bpr_test(raw_data, vocab_size, place):
+ """ convert to LODtensor """
+ data = [dat[0] for dat in raw_data]
+ seq_lens = [len(seq) for seq in data]
+ cur_len = 0
+ lod = [cur_len]
+ for l in seq_lens:
+ cur_len += l
+ lod.append(cur_len)
+ flattened_data = np.concatenate(data, axis=0).astype("int64")
+ flattened_data = flattened_data.reshape([len(flattened_data), 1])
+ res = fluid.LoDTensor()
+ res.set(flattened_data, place)
+ res.set_lod([lod])
+
+ data = [dat[1] for dat in raw_data]
+ flattened_data = np.concatenate(data, axis=0).astype("int64")
+ flattened_data = flattened_data.reshape([len(flattened_data), 1])
+ res_pos = fluid.LoDTensor()
+ res_pos.set(flattened_data, place)
+ res_pos.set_lod([lod])
+ return res, res_pos
+
+
+def get_vocab_size(vocab_path):
+ with open(vocab_path, "r") as rf:
+ line = rf.readline()
+ return int(line.strip())
+
+
+def prepare_data(file_dir,
+ vocab_path,
+ batch_size,
+ buffer_size=1000,
+ word_freq_threshold=0,
+ is_train=True):
+ """ prepare the English Pann Treebank (PTB) data """
+ print("start constuct word dict")
+ vocab_size = get_vocab_size(vocab_path)
+ if is_train:
+ reader = sort_batch(
+ paddle.reader.shuffle(
+ train(
+ file_dir, buffer_size, data_type=DataType.SEQ),
+ buf_size=buffer_size),
+ batch_size,
+ batch_size * 20)
+ else:
+ reader = paddle.batch(
+ test(
+ file_dir, buffer_size, data_type=DataType.SEQ), batch_size)
+ return vocab_size, reader
+
+
+def sort_batch(reader, batch_size, sort_group_size, drop_last=False):
+ """
+ Create a batched reader.
+ :param reader: the data reader to read from.
+ :type reader: callable
+ :param batch_size: size of each mini-batch
+ :type batch_size: int
+ :param sort_group_size: size of partial sorted batch
+ :type sort_group_size: int
+ :param drop_last: drop the last batch, if the size of last batch is not equal to batch_size.
+ :type drop_last: bool
+ :return: the batched reader.
+ :rtype: callable
+ """
+
+ def batch_reader():
+ r = reader()
+ b = []
+ for instance in r:
+ b.append(instance)
+ if len(b) == sort_group_size:
+ sortl = sorted(b, key=lambda x: len(x[0]), reverse=True)
+ b = []
+ c = []
+ for sort_i in sortl:
+ c.append(sort_i)
+ if (len(c) == batch_size):
+ yield c
+ c = []
+ if not drop_last and len(b) != 0:
+ sortl = sorted(b, key=lambda x: len(x[0]), reverse=True)
+ c = []
+ for sort_i in sortl:
+ c.append(sort_i)
+ if (len(c) == batch_size):
+ yield c
+ c = []
+ if len(c) != 0:
+ yield c
+
+ # Batch size check
+ batch_size = int(batch_size)
+ if batch_size <= 0:
+ raise ValueError("batch_size should be a positive integeral value, "
+ "but got batch_size={}".format(batch_size))
+ return batch_reader
+
+
+class DataType(object):
+ SEQ = 2
+
+
+def reader_creator(file_dir, n, data_type):
+ def reader():
+ files = os.listdir(file_dir)
+ for fi in files:
+ with open(file_dir + '/' + fi, "r") as f:
+ for l in f:
+ if DataType.SEQ == data_type:
+ l = l.strip().split()
+ src_seq = l[:-1]
+ trg_seq = l[1:]
+ if n > 0 and len(src_seq) > n: continue
+ yield src_seq, trg_seq
+ else:
+ assert False, 'error data type'
+
+ return reader
+
+
+def train(train_dir, n, data_type=DataType.SEQ):
+ return reader_creator(train_dir, n, data_type)
+
+
+def test(test_dir, n, data_type=DataType.SEQ):
+ return reader_creator(test_dir, n, data_type)
diff --git a/fluid/PaddleRec/gru4rec/vocab.txt b/PaddleRec/gru4rec/vocab.txt
similarity index 100%
rename from fluid/PaddleRec/gru4rec/vocab.txt
rename to PaddleRec/gru4rec/vocab.txt
diff --git a/fluid/PaddleRec/multiview_simnet/.pre-commit-config.yaml b/PaddleRec/multiview_simnet/.pre-commit-config.yaml
similarity index 100%
rename from fluid/PaddleRec/multiview_simnet/.pre-commit-config.yaml
rename to PaddleRec/multiview_simnet/.pre-commit-config.yaml
diff --git a/PaddleRec/multiview_simnet/.run_ce.sh b/PaddleRec/multiview_simnet/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..4aaed27e9eab72c7043925cd051a7e528e2205bc
--- /dev/null
+++ b/PaddleRec/multiview_simnet/.run_ce.sh
@@ -0,0 +1,11 @@
+#!/bin/bash
+
+export MKL_NUM_THREADS=1
+export OMP_NUM_THREADS=1
+
+
+export CPU_NUM=1
+export NUM_THREADS=1
+
+FLAGS_benchmark=true python train.py --enable_ce | python _ce.py
+
diff --git a/PaddleRec/multiview_simnet/README.cn.md b/PaddleRec/multiview_simnet/README.cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..06df3c32c7996f5003bd7b9c1eb749f32c28b752
--- /dev/null
+++ b/PaddleRec/multiview_simnet/README.cn.md
@@ -0,0 +1,27 @@
+# Multi-view Simnet for Personalized Recommendation
+
+## Introduction
+In a personalized recommendation scenario, the list of items shown to a user is usually computed by a personalized matching model. In the real world, a user may have features from many views, such as user id, age, and item click history. An item, for example a news article, likewise has features from multiple views such as its title and category. The multi-view Simnet model fuses the multiple feature views of both the user and the recommended items and learns personalized matching in a single unified model. Models of this kind are used in many industrial settings, for example in Baidu's feed products.
+
+## Dataset
+For now, this project uses a machine-generated dataset to illustrate the multi-view Simnet model; we will gradually add real-world datasets to validate its effectiveness.
+
+## Model
+The goal of this project is to provide a personalized matching model built with Paddle. The multi-view Simnet model consists of several encoder modules, one per feature view. Currently the project provides a Bag-of-Embedding encoder, a Temporal-Convolutional encoder, and a Gated-Recurrent-Unit encoder; encoders that work well for sparse features will be added over time. Training currently uses pairwise ranking: for each related user-item pair, a random item is used as a negative example for ranking.
+
+## Training
+The command line options of the training tool can be listed with `python train.py -h`:
+```bash
+python train.py
+```
+## Inference
+The command line options of the inference tool can be listed with `python infer.py -h`:
+```bash
+python infer.py
+```
+## Future work
+- More pairwise loss functions will be added. For features from different views, the user-item matching relation can be jointly optimized with different loss functions. The whole model will be verified on real-world data.
+- The Parallel Executor option will be added
+- Distributed training support will be added
diff --git a/PaddleRec/multiview_simnet/README.md b/PaddleRec/multiview_simnet/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..525946e612592b97e10707cadf35e5252230c2bd
--- /dev/null
+++ b/PaddleRec/multiview_simnet/README.md
@@ -0,0 +1,27 @@
+# Multi-view Simnet for Personalized Recommendation
+
+## Introduction
+In a personalized recommendation scenario, a user is typically shown items selected by a personalized interest-matching model. In real-world applications, a user may have multiple views of features, such as user id, age, click history of items, and search queries. An item, e.g. a news article, may also have multiple views of features, such as its title, category, and images. Multi-view Simnet is a matching model that combines users' and items' multiple views of features in one unified model, and it can be used in industrial products such as Baidu's news feed. The model is adapted from the paper A Multi-View Deep Learning (MV-DNN) Approach for Cross Domain User Modeling in Recommendation Systems, WWW 2015; the difference between our model and MV-DNN is that we also consider multiple feature views of users.
+
+## Dataset
+Currently, a synthetic dataset is provided as a proof of concept; we aim to add real-world datasets to this project in the future.
+
+## Model
+This project aims to demonstrate practical usage of Paddle in a personalized matching scenario. The model provides several encoder modules for different views of features; currently, a Bag-of-Embedding encoder, a Temporal-Convolutional encoder, and a Gated-Recurrent-Unit encoder are provided, and more practical encoders for the sparse features common in recommender systems will be added. The training algorithm is pairwise ranking: given a positive user-item pair, a negative item with multiple views is sampled, and the model learns to score the positive pair higher than the negative one.
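+
+The pairwise hinge loss built in nets.py corresponds to the following schematic NumPy computation (margin 0.1 as in the model):
+
+```python
+import numpy as np
+
+def pairwise_hinge(cos_pos, cos_neg, margin=0.1):
+    # the positive pair must beat the negative pair by at least the margin
+    return float(np.mean(np.maximum(0.0, margin - cos_pos + cos_neg)))
+```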
+
+## Train
+The command line options for training can be listed by `python train.py -h`
+```bash
+python train.py
+```
+
+## Infer
+The command line options for inference can be listed by `python infer.py -h`
+```bash
+python infer.py
+```
+
+## Future work
+- Multiple types of pairwise loss will be added in this project. For different views of features between a user and an item, multiple losses will be supported. The model will be verified in real world dataset.
+- Parallel Executor will be added in this project
+- Distributed Training will be added
diff --git a/PaddleRec/multiview_simnet/__init__.py b/PaddleRec/multiview_simnet/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/PaddleRec/multiview_simnet/_ce.py b/PaddleRec/multiview_simnet/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..ce6cf7cfa41748ab69ca9aa0955461371c42fe36
--- /dev/null
+++ b/PaddleRec/multiview_simnet/_ce.py
@@ -0,0 +1,58 @@
+# this file is only used for continuous evaluation test!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi
+from kpi import DurationKpi
+from kpi import AccKpi
+
+
+each_pass_duration_cpu1_thread1_kpi = DurationKpi('each_pass_duration_cpu1_thread1', 0.08, 0, actived=True)
+train_loss_cpu1_thread1_kpi = CostKpi('train_loss_cpu1_thread1', 0.08, 0)
+
+tracking_kpis = [
+ each_pass_duration_cpu1_thread1_kpi,
+ train_loss_cpu1_thread1_kpi,
+ ]
+
+
+def parse_log(log):
+ '''
+ Parse KPI records from the training log.
+
+ train.py prints one tab-separated record per KPI; only lines of the
+ following form are picked up:
+
+ "
+ kpis\teach_pass_duration_cpu1_thread1\t1.0
+ kpis\ttrain_loss_cpu1_thread1\t0.5
+ "
+ '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ log_to_ce(log)
diff --git a/fluid/PaddleRec/multiview_simnet/infer.py b/PaddleRec/multiview_simnet/infer.py
similarity index 100%
rename from fluid/PaddleRec/multiview_simnet/infer.py
rename to PaddleRec/multiview_simnet/infer.py
diff --git a/PaddleRec/multiview_simnet/nets.py b/PaddleRec/multiview_simnet/nets.py
new file mode 100644
index 0000000000000000000000000000000000000000..fed177844bdd247d163aee9e8625cd0ec74378b3
--- /dev/null
+++ b/PaddleRec/multiview_simnet/nets.py
@@ -0,0 +1,243 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import paddle.fluid as fluid
+import paddle.fluid.layers.nn as nn
+import paddle.fluid.layers.tensor as tensor
+import paddle.fluid.layers.control_flow as cf
+import paddle.fluid.layers.io as io
+
+
+class BowEncoder(object):
+ """ bow-encoder """
+
+ def __init__(self):
+ self.param_name = ""
+
+ def forward(self, emb):
+ return nn.sequence_pool(input=emb, pool_type='sum')
+
+
+class CNNEncoder(object):
+ """ cnn-encoder"""
+
+ def __init__(self,
+ param_name="cnn",
+ win_size=3,
+ ksize=128,
+ act='tanh',
+ pool_type='max'):
+ self.param_name = param_name
+ self.win_size = win_size
+ self.ksize = ksize
+ self.act = act
+ self.pool_type = pool_type
+
+ def forward(self, emb):
+ return fluid.nets.sequence_conv_pool(
+ input=emb,
+ num_filters=self.ksize,
+ filter_size=self.win_size,
+ act=self.act,
+ pool_type=self.pool_type,
+ param_attr=self.param_name + ".param",
+ bias_attr=self.param_name + ".bias")
+
+
+class GrnnEncoder(object):
+ """ grnn-encoder """
+
+ def __init__(self, param_name="grnn", hidden_size=128):
+ self.param_name = param_name
+ self.hidden_size = hidden_size
+
+ def forward(self, emb):
+ fc0 = nn.fc(
+ input=emb,
+ size=self.hidden_size * 3,
+ param_attr=self.param_name + "_fc.w",
+ bias_attr=False)
+
+ gru_h = nn.dynamic_gru(
+ input=fc0,
+ size=self.hidden_size,
+ is_reverse=False,
+ param_attr=self.param_name + ".param",
+ bias_attr=self.param_name + ".bias")
+ return nn.sequence_pool(input=gru_h, pool_type='max')
+
+
+class SimpleEncoderFactory(object):
+ """A very simple encoder factory; most default argument values are used.
+
+ Encoders are created through the create() method.
+ """
+
+ def __init__(self):
+ pass
+
+ def create(self, enc_type, enc_hid_size):
+ if enc_type == "bow":
+ bow_encode = BowEncoder()
+ return bow_encode
+ elif enc_type == "cnn":
+ cnn_encode = CNNEncoder(ksize=enc_hid_size)
+ return cnn_encode
+ elif enc_type == "gru":
+ rnn_encode = GrnnEncoder(hidden_size=enc_hid_size)
+ return rnn_encode
+
+
+class MultiviewSimnet(object):
+ """ multi-view simnet """
+
+ def __init__(self, embedding_size, embedding_dim, hidden_size):
+ self.embedding_size = embedding_size
+ self.embedding_dim = embedding_dim
+ self.emb_shape = [self.embedding_size, self.embedding_dim]
+ self.hidden_size = hidden_size
+ self.margin = 0.1
+
+ def set_query_encoder(self, encoders):
+ self.query_encoders = encoders
+
+ def set_title_encoder(self, encoders):
+ self.title_encoders = encoders
+
+ def get_correct(self, x, y):
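+ # count how often the negative score x stays below the positive score y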
+ less = tensor.cast(cf.less_than(x, y), dtype='float32')
+ correct = nn.reduce_sum(less)
+ return correct
+
+ def train_net(self):
+ # input fields for query, pos_title, neg_title
+ q_slots = [
+ io.data(
+ name="q%d" % i, shape=[1], lod_level=1, dtype='int64')
+ for i in range(len(self.query_encoders))
+ ]
+ pt_slots = [
+ io.data(
+ name="pt%d" % i, shape=[1], lod_level=1, dtype='int64')
+ for i in range(len(self.title_encoders))
+ ]
+ nt_slots = [
+ io.data(
+ name="nt%d" % i, shape=[1], lod_level=1, dtype='int64')
+ for i in range(len(self.title_encoders))
+ ]
+
+ # lookup embedding for each slot
+ q_embs = [
+ nn.embedding(
+ input=query, size=self.emb_shape, param_attr="emb")
+ for query in q_slots
+ ]
+ pt_embs = [
+ nn.embedding(
+ input=title, size=self.emb_shape, param_attr="emb")
+ for title in pt_slots
+ ]
+ nt_embs = [
+ nn.embedding(
+ input=title, size=self.emb_shape, param_attr="emb")
+ for title in nt_slots
+ ]
+
+ # encode each embedding field with encoder
+ q_encodes = [
+ self.query_encoders[i].forward(emb) for i, emb in enumerate(q_embs)
+ ]
+ pt_encodes = [
+ self.title_encoders[i].forward(emb) for i, emb in enumerate(pt_embs)
+ ]
+ nt_encodes = [
+ self.title_encoders[i].forward(emb) for i, emb in enumerate(nt_embs)
+ ]
+
+ # concat multi view for query, pos_title, neg_title
+ q_concat = nn.concat(q_encodes)
+ pt_concat = nn.concat(pt_encodes)
+ nt_concat = nn.concat(nt_encodes)
+
+ # projection of hidden layer
+ q_hid = nn.fc(q_concat, size=self.hidden_size, param_attr='q_fc.w', bias_attr='q_fc.b')
+ pt_hid = nn.fc(pt_concat, size=self.hidden_size, param_attr='t_fc.w', bias_attr='t_fc.b')
+ nt_hid = nn.fc(nt_concat, size=self.hidden_size, param_attr='t_fc.w', bias_attr='t_fc.b')
+
+ # cosine of hidden layers
+ cos_pos = nn.cos_sim(q_hid, pt_hid)
+ cos_neg = nn.cos_sim(q_hid, nt_hid)
+
+ # pairwise hinge_loss
+ loss_part1 = nn.elementwise_sub(
+ tensor.fill_constant_batch_size_like(
+ input=cos_pos,
+ shape=[-1, 1],
+ value=self.margin,
+ dtype='float32'),
+ cos_pos)
+
+ loss_part2 = nn.elementwise_add(loss_part1, cos_neg)
+
+ loss_part3 = nn.elementwise_max(
+ tensor.fill_constant_batch_size_like(
+ input=loss_part2, shape=[-1, 1], value=0.0, dtype='float32'),
+ loss_part2)
+
+ avg_cost = nn.mean(loss_part3)
+ correct = self.get_correct(cos_neg, cos_pos)
+
+ return q_slots + pt_slots + nt_slots, avg_cost, correct
+
+ def pred_net(self, query_fields, pos_title_fields, neg_title_fields):
+ q_slots = [
+ io.data(
+ name="q%d" % i, shape=[1], lod_level=1, dtype='int64')
+ for i in range(len(self.query_encoders))
+ ]
+ pt_slots = [
+ io.data(
+ name="pt%d" % i, shape=[1], lod_level=1, dtype='int64')
+ for i in range(len(self.title_encoders))
+ ]
+ # lookup embedding for each slot
+ q_embs = [
+ nn.embedding(
+ input=query, size=self.emb_shape, param_attr="emb")
+ for query in q_slots
+ ]
+ pt_embs = [
+ nn.embedding(
+ input=title, size=self.emb_shape, param_attr="emb")
+ for title in pt_slots
+ ]
+ # encode each embedding field with encoder
+ q_encodes = [
+ self.query_encoders[i].forward(emb) for i, emb in enumerate(q_embs)
+ ]
+ pt_encodes = [
+ self.title_encoders[i].forward(emb) for i, emb in enumerate(pt_embs)
+ ]
+ # concat multi view for query, pos_title, neg_title
+ q_concat = nn.concat(q_encodes)
+ pt_concat = nn.concat(pt_encodes)
+ # projection of hidden layer
+ q_hid = nn.fc(q_concat, size=self.hidden_size, param_attr='q_fc.w', bias_attr='q_fc.b')
+ pt_hid = nn.fc(pt_concat, size=self.hidden_size, param_attr='t_fc.w', bias_attr='t_fc.b')
+ # cosine of hidden layers
+ cos = nn.cos_sim(q_hid, pt_hid)
+ return cos
diff --git a/fluid/PaddleRec/multiview_simnet/reader.py b/PaddleRec/multiview_simnet/reader.py
similarity index 100%
rename from fluid/PaddleRec/multiview_simnet/reader.py
rename to PaddleRec/multiview_simnet/reader.py
diff --git a/PaddleRec/multiview_simnet/train.py b/PaddleRec/multiview_simnet/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..f098fd109e8813ffbfb40753122acbef3cd896a6
--- /dev/null
+++ b/PaddleRec/multiview_simnet/train.py
@@ -0,0 +1,173 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import os
+import sys
+import time
+import six
+import numpy as np
+import math
+import argparse
+import logging
+import paddle.fluid as fluid
+import paddle
+import reader
+from nets import MultiviewSimnet, SimpleEncoderFactory
+
+logging.basicConfig(format="%(asctime)s - %(levelname)s - %(message)s")
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("multi-view simnet")
+ parser.add_argument("--train_file", type=str, help="Training file")
+ parser.add_argument("--valid_file", type=str, help="Validation file")
+ parser.add_argument(
+ "--epochs", type=int, default=10, help="Number of epochs for training")
+ parser.add_argument(
+ "--model_output_dir",
+ type=str,
+ default='model_output',
+ help="Model output folder")
+ parser.add_argument(
+ "--query_slots", type=int, default=1, help="Number of query slots")
+ parser.add_argument(
+ "--title_slots", type=int, default=1, help="Number of title slots")
+ parser.add_argument(
+ "--query_encoder",
+ type=str,
+ default="bow",
+ help="Encoder module for slot encoding")
+ parser.add_argument(
+ "--title_encoder",
+ type=str,
+ default="bow",
+ help="Encoder module for slot encoding")
+ parser.add_argument(
+ "--query_encode_dim",
+ type=int,
+ default=128,
+ help="Dimension of query encoder output")
+ parser.add_argument(
+ "--title_encode_dim",
+ type=int,
+ default=128,
+ help="Dimension of title encoder output")
+ parser.add_argument(
+ "--batch_size", type=int, default=128, help="Batch size for training")
+ parser.add_argument(
+ "--embedding_dim",
+ type=int,
+ default=128,
+ help="Default Dimension of Embedding")
+ parser.add_argument(
+ "--sparse_feature_dim",
+ type=int,
+ default=1000001,
+ help="Sparse feature hashing space"
+ "for index processing")
+ parser.add_argument(
+ "--hidden_size", type=int, default=128, help="Hidden dim")
+ parser.add_argument(
+ '--enable_ce',
+ action='store_true',
+ help='If set, run the task with continuous evaluation logs.')
+ return parser.parse_args()
+
+
+def start_train(args):
+ if args.enable_ce:
+ SEED = 102
+ fluid.default_startup_program().random_seed = SEED
+ fluid.default_main_program().random_seed = SEED
+
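+ # SyntheticDataset generates random slot data for demonstration, so the
+ # --train_file/--valid_file arguments are not read in this entry point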
+ dataset = reader.SyntheticDataset(args.sparse_feature_dim, args.query_slots,
+ args.title_slots)
+ train_reader = paddle.batch(
+ paddle.reader.shuffle(
+ dataset.train(), buf_size=args.batch_size * 100),
+ batch_size=args.batch_size)
+ place = fluid.CPUPlace()
+ factory = SimpleEncoderFactory()
+ query_encoders = [
+ factory.create(args.query_encoder, args.query_encode_dim)
+ for i in range(args.query_slots)
+ ]
+ title_encoders = [
+ factory.create(args.title_encoder, args.title_encode_dim)
+ for i in range(args.title_slots)
+ ]
+ m_simnet = MultiviewSimnet(args.sparse_feature_dim, args.embedding_dim,
+ args.hidden_size)
+ m_simnet.set_query_encoder(query_encoders)
+ m_simnet.set_title_encoder(title_encoders)
+ all_slots, avg_cost, correct = m_simnet.train_net()
+ optimizer = fluid.optimizer.Adam(learning_rate=1e-4)
+ optimizer.minimize(avg_cost)
+ startup_program = fluid.default_startup_program()
+ loop_program = fluid.default_main_program()
+
+ feeder = fluid.DataFeeder(feed_list=all_slots, place=place)
+ exe = fluid.Executor(place)
+ exe.run(startup_program)
+
+ total_time = 0
+ ce_info = []
+ for pass_id in range(args.epochs):
+ start_time = time.time()
+ for batch_id, data in enumerate(train_reader()):
+ loss_val, correct_val = exe.run(loop_program,
+ feed=feeder.feed(data),
+ fetch_list=[avg_cost, correct])
+ logger.info("TRAIN --> pass: {} batch_id: {} avg_cost: {}, acc: {}"
+ .format(pass_id, batch_id, loss_val,
+ float(correct_val) / args.batch_size))
+ ce_info.append(loss_val[0])
+ end_time = time.time()
+ total_time += end_time - start_time
+ fluid.io.save_inference_model(args.model_output_dir,
+ [val.name for val in all_slots],
+ [avg_cost, correct], exe)
+
+ # only for ce
+ if args.enable_ce:
+ threads_num, cpu_num = get_cards(args)
+ epoch_idx = args.epochs
+ ce_loss = 0
+ try:
+ ce_loss = ce_info[-2]
+ except:
+ logger.error("ce info error")
+
+ print("kpis\teach_pass_duration_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, total_time / epoch_idx))
+ print("kpis\ttrain_loss_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, ce_loss))
+
+
+def get_cards(args):
+ threads_num = os.environ.get('NUM_THREADS', 1)
+ cpu_num = os.environ.get('CPU_NUM', 1)
+ return int(threads_num), int(cpu_num)
+
+
+def main():
+ args = parse_args()
+ start_train(args)
+
+
+if __name__ == "__main__":
+ main()
diff --git a/PaddleRec/ssr/.run_ce.sh b/PaddleRec/ssr/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..ffcc0f6ac0a400641fe42cd2f6d62d8488115432
--- /dev/null
+++ b/PaddleRec/ssr/.run_ce.sh
@@ -0,0 +1,22 @@
+#!/bin/bash
+
+export MKL_NUM_THREADS=1
+export OMP_NUM_THREADS=1
+
+
+export CPU_NUM=1
+export NUM_THREADS=1
+
+FLAGS_benchmark=true python train.py --train_dir train_big_data --vocab_path vocab_big.txt --use_cuda 0 --batch_size 500 --model_dir model_output --epochs 2 --enable_ce --step_num 500 | python _ce.py
+
+
+cudaid=${ssr:=0} # use 0-th card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py --train_dir train_big_data --vocab_path vocab_big.txt --use_cuda 1 --batch_size 500 --model_dir model_output --epochs 2 --enable_ce --step_num 1000 | python _ce.py
+
+
+cudaid=${ssr_4:=0,1,2,3} # use cards 0-3 as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py --train_dir train_big_data --vocab_path vocab_big.txt --use_cuda 1 --parallel 1 --num_devices 2 --batch_size 500 --model_dir model_output --epochs 2 --enable_ce --step_num 1000 | python _ce.py
diff --git a/PaddleRec/ssr/README.md b/PaddleRec/ssr/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..d0b4dfb41b4cea19efa42c4a233c9544349d1770
--- /dev/null
+++ b/PaddleRec/ssr/README.md
@@ -0,0 +1,52 @@
+# Sequence Semantic Retrieval Model
+
+## Introduction
+In news recommendation scenarios, unlike traditional systems that recommend entertainment items such as movies or music, there are several new problems to solve.
+- User profile features are very sparse: a user may log in to a news recommendation app anonymously, and users tend to read freshly published news items.
+- News items appear and expire much faster than movies or music; a news recommendation app typically generates thousands of news items, and consumption is just as fast because users care about what has just happened.
+- User interests may change frequently in the news recommendation setting. The content of a news item strongly affects users' reading behavior even when its category does not match their long-term interests. In news recommendation, reading behavior is driven by both the short-term and the long-term interests of users.
+
+[GRU4Rec](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleRec/gru4rec) models a user's short-term and long-term interests by applying a gated recurrent unit to the user's reading history. The generalization ability of the recurrent network captures the similarity between users' reading sequences, which alleviates the user profile sparsity problem. However, GRU4Rec operates on a closed domain of items: it predicts which item a user will be interested in through classification. Because news items change over time, GRU4Rec cannot predict items that are absent from the training dataset.
+
+The Sequence Semantic Retrieval (SSR) model shares a similar idea with Multi-Rate Deep Learning for Temporal Recommendation, SIGIR 2016. It has two components: a matching part and a retrieval part.
+- The idea of SSR is to model a user's personalized interest in an item through a matching model, so that the representation of a news item can be computed online even if the item does not appear in the training dataset; see the loss sketch after this list.
+- With the representations of news items, we can build a vector indexing service online for news prediction; this is the retrieval part of SSR.
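+
+The matching part is trained with a pairwise hinge loss over cosine similarities, as implemented in `nets.py`. The following self-contained sketch (NumPy only; the toy similarity values are illustrative, while `margin=0.8` matches the default in `nets.py`) shows the quantity being minimized:
+
+```python
+import numpy as np
+
+def pairwise_hinge_loss(cos_pos, cos_neg, margin=0.8):
+    """Mean of max(0, margin - cos(user, pos_item) + cos(user, neg_item)) over a batch."""
+    return np.maximum(0.0, margin - cos_pos + cos_neg).mean()
+
+# toy cosine similarities for (user, positive item) and (user, negative item)
+cos_pos = np.array([0.9, 0.4, 0.7])
+cos_neg = np.array([0.1, 0.5, 0.6])
+print(pairwise_hinge_loss(cos_pos, cos_neg))  # ~0.533 for this toy batch
+```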
+
+## Dataset
+Dataset preprocessing follows the method of the [GRU4Rec Project](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleRec/gru4rec). Note that you should reuse the scripts from the GRU4Rec project for data preprocessing.
+
+## Training
+
+The command line options for training can be listed by `python train.py -h`
+
+Train with a single GPU on a single machine:
+``` bash
+CUDA_VISIBLE_DEVICES=0 python train.py --train_dir train_data --use_cuda 1 --batch_size 50 --model_dir model_output
+```
+
+Train with CPU on a single machine:
+``` bash
+python train.py --train_dir train_data --use_cuda 0 --batch_size 50 --model_dir model_output
+```
+
+Train with multiple GPUs on a single machine:
+``` bash
+CUDA_VISIBLE_DEVICES=0,1 python train.py --train_dir train_data --use_cuda 1 --parallel 1 --batch_size 50 --model_dir model_output --num_devices 2
+```
+
+Train with multiple CPU cores on a single machine:
+``` bash
+CPU_NUM=10 python train.py --train_dir train_data --use_cuda 0 --parallel 1 --batch_size 50 --model_dir model_output --num_devices 10
+```
+
+Simulate multi-machine training locally:
+``` bash
+sh cluster_train.sh
+```
+
+## Inference
+
+Inference with GPU:
+``` bash
+CUDA_VISIBLE_DEVICES=0 python infer.py --test_dir test_data --use_cuda 1 --batch_size 50 --model_dir model_output
+```
diff --git a/PaddleRec/ssr/__init__.py b/PaddleRec/ssr/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/PaddleRec/ssr/_ce.py b/PaddleRec/ssr/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..e5a3d3b19339b39e638d4e708878a68778fb3fd9
--- /dev/null
+++ b/PaddleRec/ssr/_ce.py
@@ -0,0 +1,66 @@
+# this file is only used for continuous evaluation test!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi
+from kpi import DurationKpi
+from kpi import AccKpi
+
+
+each_pass_duration_cpu1_thread1_kpi = DurationKpi('each_pass_duration_cpu1_thread1', 0.08, 0, actived=True)
+train_acc_cpu1_thread1_kpi = AccKpi('train_acc_cpu1_thread1', 0.08, 0)
+each_pass_duration_gpu1_kpi = DurationKpi('each_pass_duration_gpu1', 0.08, 0, actived=True)
+train_acc_gpu1_kpi = AccKpi('train_acc_gpu1', 0.08, 0)
+each_pass_duration_gpu4_kpi = DurationKpi('each_pass_duration_gpu4', 0.08, 0, actived=True)
+train_acc_gpu4_kpi = AccKpi('train_acc_gpu4', 0.08, 0)
+
+tracking_kpis = [
+ each_pass_duration_cpu1_thread1_kpi,
+ train_acc_cpu1_thread1_kpi,
+ each_pass_duration_gpu1_kpi,
+ train_acc_gpu1_kpi,
+ each_pass_duration_gpu4_kpi,
+ train_acc_gpu4_kpi,
+ ]
+
+
+def parse_log(log):
+ '''
+ This method should be implemented by model developers.
+
+ The suggestion:
+
+ each line in the log should be key, value, for example:
+
+ "
+ train_cost\t1.0
+ test_cost\t1.0
+ train_cost\t1.0
+ train_cost\t1.0
+ train_acc\t1.2
+ "
+ '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ log_to_ce(log)
diff --git a/PaddleRec/ssr/cluster_train.py b/PaddleRec/ssr/cluster_train.py
new file mode 100644
index 0000000000000000000000000000000000000000..bcbdde7e6672cff86374d7836fc3df5b69a531da
--- /dev/null
+++ b/PaddleRec/ssr/cluster_train.py
@@ -0,0 +1,205 @@
+#Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+import os
+import sys
+import time
+import argparse
+import logging
+import paddle.fluid as fluid
+import paddle
+import utils
+import numpy as np
+from nets import SequenceSemanticRetrieval
+
+logging.basicConfig(format="%(asctime)s - %(levelname)s - %(message)s")
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("sequence semantic retrieval")
+ parser.add_argument(
+ "--train_dir", type=str, default='train_data', help="Training file")
+ parser.add_argument(
+ "--base_lr", type=float, default=0.01, help="learning rate")
+ parser.add_argument(
+ '--vocab_path', type=str, default='vocab.txt', help='vocab file')
+ parser.add_argument(
+ "--epochs", type=int, default=10, help="Number of epochs")
+ parser.add_argument(
+ '--parallel', type=int, default=0, help='whether parallel')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether use gpu')
+ parser.add_argument(
+ '--print_batch', type=int, default=10, help='num of print batch')
+ parser.add_argument(
+ '--model_dir', type=str, default='model_output', help='model dir')
+ parser.add_argument(
+ "--hidden_size", type=int, default=128, help="hidden size")
+ parser.add_argument(
+ "--batch_size", type=int, default=50, help="number of batch")
+ parser.add_argument(
+ "--embedding_dim", type=int, default=128, help="embedding dim")
+ parser.add_argument(
+ '--num_devices', type=int, default=1, help='Number of GPU devices')
+ parser.add_argument(
+ '--step_num', type=int, default=1000, help='Number of steps')
+ parser.add_argument(
+ '--enable_ce',
+ action='store_true',
+ help='If set, run the task with continuous evaluation logs.')
+ parser.add_argument(
+ '--role', type=str, default='pserver', help='trainer or pserver')
+ parser.add_argument(
+ '--endpoints',
+ type=str,
+ default='127.0.0.1:6000',
+ help='The pserver endpoints, like: 127.0.0.1:6000, 127.0.0.1:6001')
+ parser.add_argument(
+ '--current_endpoint',
+ type=str,
+ default='127.0.0.1:6000',
+ help='The current_endpoint')
+ parser.add_argument(
+ '--trainer_id',
+ type=int,
+ default=0,
+ help='trainer id; only trainer_id=0 saves the model')
+ parser.add_argument(
+ '--trainers',
+ type=int,
+ default=1,
+ help='The number of trainers (default: 1)')
+ return parser.parse_args()
+
+
+def get_cards(args):
+ return args.num_devices
+
+
+def train_loop(main_program, avg_cost, acc, train_input_data, place, args,
+ train_reader):
+ data_list = [var.name for var in train_input_data]
+ feeder = fluid.DataFeeder(feed_list=data_list, place=place)
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+ train_exe = exe
+
+ total_time = 0.0
+ ce_info = []
+ for pass_id in range(args.epochs):
+ epoch_idx = pass_id + 1
+ print("epoch_%d start" % epoch_idx)
+ t0 = time.time()
+ i = 0
+ for batch_id, data in enumerate(train_reader()):
+ i += 1
+ loss_val, correct_val = train_exe.run(
+ main_program, feed=feeder.feed(data), fetch_list=[avg_cost.name, acc.name])
+ ce_info.append(float(np.mean(correct_val)) / args.batch_size)
+ if i % args.print_batch == 0:
+ logger.info(
+ "Train --> pass: {} batch_id: {} avg_cost: {}, acc: {}".
+ format(pass_id, batch_id,
+ np.mean(loss_val),
+ float(np.mean(correct_val)) / args.batch_size))
+ if args.enable_ce and i > args.step_num:
+ break
+ t1 = time.time()
+ total_time += t1 - t0
+ print("epoch:%d num_steps:%d time_cost(s):%f" %
+ (epoch_idx, i, total_time / epoch_idx))
+ save_dir = "%s/epoch_%d" % (args.model_dir, epoch_idx)
+ fluid.io.save_params(executor=exe, dirname=save_dir)
+ print("model saved in %s" % save_dir)
+
+ # only for ce
+ if args.enable_ce:
+ ce_acc = 0
+ try:
+ ce_acc = ce_info[-2]
+ except:
+ print("ce info error")
+ epoch_idx = args.epochs
+ device = get_device(args)
+ if args.use_cuda:
+ gpu_num = device[1]
+ print("kpis\teach_pass_duration_gpu%s\t%s" %
+ (gpu_num, total_time / epoch_idx))
+ print("kpis\ttrain_acc_gpu%s\t%s" % (gpu_num, ce_acc))
+ else:
+ cpu_num = device[1]
+ threads_num = device[2]
+ print("kpis\teach_pass_duration_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, total_time / epoch_idx))
+ print("kpis\ttrain_acc_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, ce_acc))
+
+
+def train(args):
+ if args.enable_ce:
+ SEED = 102
+ fluid.default_startup_program().random_seed = SEED
+ fluid.default_main_program().random_seed = SEED
+ use_cuda = True if args.use_cuda else False
+ parallel = True if args.parallel else False
+ print("use_cuda:", use_cuda, "parallel:", parallel)
+ train_reader, vocab_size = utils.construct_train_data(
+ args.train_dir, args.vocab_path, args.batch_size * get_cards(args))
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ ssr = SequenceSemanticRetrieval(vocab_size, args.embedding_dim,
+ args.hidden_size)
+ # Train program
+ train_input_data, cos_pos, avg_cost, acc = ssr.train()
+
+ # Optimization to minimize loss
+ optimizer = fluid.optimizer.Adagrad(learning_rate=args.base_lr)
+ optimizer.minimize(avg_cost)
+
+ print("run distribute training")
+ t = fluid.DistributeTranspiler()
+ t.transpile(
+ args.trainer_id, pservers=args.endpoints, trainers=args.trainers)
+ if args.role == "pserver":
+ print("run psever")
+ pserver_prog = t.get_pserver_program(args.current_endpoint)
+ pserver_startup = t.get_startup_program(args.current_endpoint,
+ pserver_prog)
+ exe = fluid.Executor(fluid.CPUPlace())
+ exe.run(pserver_startup)
+ exe.run(pserver_prog)
+ elif args.role == "trainer":
+ print("run trainer")
+ train_loop(t.get_trainer_program(), avg_cost, acc, train_input_data,
+ place, args, train_reader)
+
+
+def get_device(args):
+ if args.use_cuda:
+ gpus = os.environ.get("CUDA_VISIBLE_DEVICES", "0")
+ gpu_num = len(gpus.split(','))
+ return "gpu", gpu_num
+ else:
+ threads_num = os.environ.get('NUM_THREADS', 1)
+ cpu_num = os.environ.get('CPU_NUM', 1)
+ return "cpu", int(cpu_num), int(threads_num)
+
+
+def main():
+ args = parse_args()
+ train(args)
+
+
+if __name__ == "__main__":
+ main()
diff --git a/PaddleRec/ssr/cluster_train.sh b/PaddleRec/ssr/cluster_train.sh
new file mode 100644
index 0000000000000000000000000000000000000000..aeb1d9c5cb102a511b0bc3485e6906f9d7985628
--- /dev/null
+++ b/PaddleRec/ssr/cluster_train.sh
@@ -0,0 +1,58 @@
+#!/bin/bash
+
+#export GLOG_v=30
+#export GLOG_logtostderr=1
+
+# start pserver0
+python cluster_train.py \
+ --train_dir train_data \
+ --model_dir cluster_model \
+ --vocab_path vocab.txt \
+ --batch_size 5 \
+ --role pserver \
+ --endpoints 127.0.0.1:6000,127.0.0.1:6001 \
+ --current_endpoint 127.0.0.1:6000 \
+ --trainers 2 \
+ > pserver0.log 2>&1 &
+
+# start pserver1
+python cluster_train.py \
+ --train_dir train_data \
+ --model_dir cluster_model \
+ --vocab_path vocab.txt \
+ --batch_size 5 \
+ --role pserver \
+ --endpoints 127.0.0.1:6000,127.0.0.1:6001 \
+ --current_endpoint 127.0.0.1:6001 \
+ --trainers 2 \
+ > pserver1.log 2>&1 &
+
+# start trainer0
+#CUDA_VISIBLE_DEVICES=1 python cluster_train.py \
+python cluster_train.py \
+ --train_dir train_data \
+ --model_dir cluster_model \
+ --vocab_path vocab.txt \
+ --batch_size 5 \
+ --print_batch 10 \
+ --use_cuda 0 \
+ --role trainer \
+ --endpoints 127.0.0.1:6000,127.0.0.1:6001 \
+ --trainers 2 \
+ --trainer_id 0 \
+ > trainer0.log 2>&1 &
+
+# start trainer1
+#CUDA_VISIBLE_DEVICES=2 python cluster_train.py \
+python cluster_train.py \
+ --train_dir train_data \
+ --model_dir cluster_model \
+ --vocab_path vocab.txt \
+ --batch_size 5 \
+ --print_batch 10 \
+ --use_cuda 0 \
+ --role trainer \
+ --endpoints 127.0.0.1:6000,127.0.0.1:6001 \
+ --trainers 2 \
+ --trainer_id 1 \
+ > trainer1.log 2>&1 &
diff --git a/PaddleRec/ssr/infer.py b/PaddleRec/ssr/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..38fb5cd762e117409b12ce8bd202f110a1cdfcb4
--- /dev/null
+++ b/PaddleRec/ssr/infer.py
@@ -0,0 +1,134 @@
+import sys
+import argparse
+import time
+import math
+import unittest
+import contextlib
+import numpy as np
+import six
+import paddle.fluid as fluid
+import paddle
+import utils
+import nets as net
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("ssr benchmark.")
+ parser.add_argument(
+ '--test_dir', type=str, default='test_data', help='test file address')
+ parser.add_argument(
+ '--vocab_path', type=str, default='vocab.txt', help='vocab path')
+ parser.add_argument(
+ '--start_index', type=int, default=1, help='start index')
+ parser.add_argument(
+ '--last_index', type=int, default=10, help='end index')
+ parser.add_argument(
+ '--model_dir', type=str, default='model_output', help='model dir')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether use cuda')
+ parser.add_argument(
+ '--batch_size', type=int, default=50, help='batch_size')
+ parser.add_argument(
+ '--hid_size', type=int, default=128, help='hidden size')
+ parser.add_argument(
+ '--emb_size', type=int, default=128, help='embedding size')
+ args = parser.parse_args()
+ return args
+
+
+def model(vocab_size, emb_size, hidden_size):
+ user_data = fluid.layers.data(
+ name="user", shape=[1], dtype="int64", lod_level=1)
+ all_item_data = fluid.layers.data(
+ name="all_item", shape=[vocab_size, 1], dtype="int64")
+
+ user_emb = fluid.layers.embedding(
+ input=user_data, size=[vocab_size, emb_size], param_attr="emb.item")
+ all_item_emb = fluid.layers.embedding(
+ input=all_item_data, size=[vocab_size, emb_size], param_attr="emb.item")
+ all_item_emb_re = fluid.layers.reshape(x=all_item_emb, shape=[-1, emb_size])
+
+ user_encoder = net.GrnnEncoder(hidden_size=hidden_size)
+ user_enc = user_encoder.forward(user_emb)
+ user_hid = fluid.layers.fc(input=user_enc,
+ size=hidden_size,
+ param_attr='user.w',
+ bias_attr="user.b")
+ user_exp = fluid.layers.expand(x=user_hid, expand_times=[1, vocab_size])
+ user_re = fluid.layers.reshape(x=user_exp, shape=[-1, hidden_size])
+
+ all_item_hid = fluid.layers.fc(input=all_item_emb_re,
+ size=hidden_size,
+ param_attr='item.w',
+ bias_attr="item.b")
+ cos_item = fluid.layers.cos_sim(X=all_item_hid, Y=user_re)
+ all_pre_ = fluid.layers.reshape(x=cos_item, shape=[-1, vocab_size])
+ pos_label = fluid.layers.data(name="pos_label", shape=[1], dtype="int64")
+ acc = fluid.layers.accuracy(input=all_pre_, label=pos_label, k=20)
+ return acc
+
+
+def infer(args, vocab_size, test_reader):
+ """ inference function """
+ place = fluid.CUDAPlace(0) if args.use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ emb_size = args.emb_size
+ hid_size = args.hid_size
+ batch_size = args.batch_size
+ with fluid.scope_guard(fluid.core.Scope()):
+ main_program = fluid.Program()
+ start_up_program = fluid.Program()
+ with fluid.program_guard(main_program, start_up_program):
+ acc = model(vocab_size, emb_size, hid_size)
+ for epoch in range(args.start_index, args.last_index + 1):
+ copy_program = main_program.clone()
+ model_path = args.model_dir + "/epoch_" + str(epoch)
+ fluid.io.load_params(
+ executor=exe, dirname=model_path, main_program=copy_program)
+ accum_num_recall = 0.0
+ accum_num_sum = 0.0
+ t0 = time.time()
+ step_id = 0
+ for data in test_reader():
+ step_id += 1
+ user_data, pos_label = utils.infer_data(data, place)
+ all_item_numpy = np.tile(
+ np.arange(vocab_size), len(pos_label)).reshape(
+ len(pos_label), vocab_size, 1)
+ para = exe.run(copy_program,
+ feed={
+ "user": user_data,
+ "all_item": all_item_numpy,
+ "pos_label": pos_label
+ },
+ fetch_list=[acc.name],
+ return_numpy=False)
+
+ acc_ = para[0]._get_float_element(0)
+ data_length = len(
+ np.concatenate(
+ pos_label, axis=0).astype("int64"))
+ accum_num_sum += (data_length)
+ accum_num_recall += (data_length * acc_)
+ if step_id % 1 == 0:
+ print("step:%d " % (step_id),
+ accum_num_recall / accum_num_sum)
+ t1 = time.time()
+ print("model:%s recall@20:%.3f time_cost(s):%.2f" %
+ (model_path, accum_num_recall / accum_num_sum, t1 - t0))
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ start_index = args.start_index
+ last_index = args.last_index
+ test_dir = args.test_dir
+ model_dir = args.model_dir
+ batch_size = args.batch_size
+ vocab_path = args.vocab_path
+ use_cuda = True if args.use_cuda else False
+ print("start index: ", start_index, " last_index:", last_index)
+ test_reader, vocab_size = utils.construct_test_data(
+ test_dir, vocab_path, batch_size=args.batch_size)
+ infer(args, vocab_size, test_reader=test_reader)
diff --git a/PaddleRec/ssr/nets.py b/PaddleRec/ssr/nets.py
new file mode 100644
index 0000000000000000000000000000000000000000..4df23573c91fcf16a4ef95d1bab1ac01e437d148
--- /dev/null
+++ b/PaddleRec/ssr/nets.py
@@ -0,0 +1,122 @@
+#Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import paddle.fluid as fluid
+import paddle.fluid.layers.nn as nn
+import paddle.fluid.layers.tensor as tensor
+import paddle.fluid.layers.control_flow as cf
+import paddle.fluid.layers.io as io
+
+
+class BowEncoder(object):
+ """ bow-encoder """
+
+ def __init__(self):
+ self.param_name = ""
+
+ def forward(self, emb):
+ return nn.sequence_pool(input=emb, pool_type='sum')
+
+
+class GrnnEncoder(object):
+ """ grnn-encoder """
+
+ def __init__(self, param_name="grnn", hidden_size=128):
+ self.param_name = param_name
+ self.hidden_size = hidden_size
+
+ def forward(self, emb):
+ fc0 = nn.fc(input=emb,
+ size=self.hidden_size * 3,
+ param_attr=self.param_name + "_fc.w",
+ bias_attr=False)
+
+ gru_h = nn.dynamic_gru(
+ input=fc0,
+ size=self.hidden_size,
+ is_reverse=False,
+ param_attr=self.param_name + ".param",
+ bias_attr=self.param_name + ".bias")
+ return nn.sequence_pool(input=gru_h, pool_type='max')
+
+
+class PairwiseHingeLoss(object):
+ def __init__(self, margin=0.8):
+ self.margin = margin
+
+ def forward(self, pos, neg):
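+ # hinge loss per sample: max(0, margin - pos + neg), i.e. the positive pair
+ # must score at least `margin` above the sampled negative pair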
+ loss_part1 = nn.elementwise_sub(
+ tensor.fill_constant_batch_size_like(
+ input=pos, shape=[-1, 1], value=self.margin, dtype='float32'),
+ pos)
+ loss_part2 = nn.elementwise_add(loss_part1, neg)
+ loss_part3 = nn.elementwise_max(
+ tensor.fill_constant_batch_size_like(
+ input=loss_part2, shape=[-1, 1], value=0.0, dtype='float32'),
+ loss_part2)
+ return loss_part3
+
+
+class SequenceSemanticRetrieval(object):
+ """ sequence semantic retrieval model """
+
+ def __init__(self, embedding_size, embedding_dim, hidden_size):
+ self.embedding_size = embedding_size
+ self.embedding_dim = embedding_dim
+ self.emb_shape = [self.embedding_size, self.embedding_dim]
+ self.hidden_size = hidden_size
+ self.user_encoder = GrnnEncoder(hidden_size=hidden_size)
+ self.item_encoder = BowEncoder()
+ self.pairwise_hinge_loss = PairwiseHingeLoss()
+
+ def get_correct(self, x, y):
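+ # counts samples where x < y; called as get_correct(cos_neg, cos_pos), so it
+ # counts correctly ranked pairs in the batch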
+ less = tensor.cast(cf.less_than(x, y), dtype='float32')
+ correct = nn.reduce_sum(less)
+ return correct
+
+ def train(self):
+ user_data = io.data(name="user", shape=[1], dtype="int64", lod_level=1)
+ pos_item_data = io.data(
+ name="p_item", shape=[1], dtype="int64", lod_level=1)
+ neg_item_data = io.data(
+ name="n_item", shape=[1], dtype="int64", lod_level=1)
+ user_emb = nn.embedding(
+ input=user_data, size=self.emb_shape, param_attr="emb.item")
+ pos_item_emb = nn.embedding(
+ input=pos_item_data, size=self.emb_shape, param_attr="emb.item")
+ neg_item_emb = nn.embedding(
+ input=neg_item_data, size=self.emb_shape, param_attr="emb.item")
+ user_enc = self.user_encoder.forward(user_emb)
+ pos_item_enc = self.item_encoder.forward(pos_item_emb)
+ neg_item_enc = self.item_encoder.forward(neg_item_emb)
+ user_hid = nn.fc(input=user_enc,
+ size=self.hidden_size,
+ param_attr='user.w',
+ bias_attr="user.b")
+ pos_item_hid = nn.fc(input=pos_item_enc,
+ size=self.hidden_size,
+ param_attr='item.w',
+ bias_attr="item.b")
+ neg_item_hid = nn.fc(input=neg_item_enc,
+ size=self.hidden_size,
+ param_attr='item.w',
+ bias_attr="item.b")
+ cos_pos = nn.cos_sim(user_hid, pos_item_hid)
+ cos_neg = nn.cos_sim(user_hid, neg_item_hid)
+ hinge_loss = self.pairwise_hinge_loss.forward(cos_pos, cos_neg)
+ avg_cost = nn.mean(hinge_loss)
+ correct = self.get_correct(cos_neg, cos_pos)
+
+ return [user_data, pos_item_data,
+ neg_item_data], cos_pos, avg_cost, correct
diff --git a/PaddleRec/ssr/reader.py b/PaddleRec/ssr/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..15989fd8cec366b2c3b71672f134035c42bf79da
--- /dev/null
+++ b/PaddleRec/ssr/reader.py
@@ -0,0 +1,89 @@
+#Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import random
+
+
+class Dataset:
+ def __init__(self):
+ pass
+
+
+class Vocab:
+ def __init__(self):
+ pass
+
+
+class YoochooseVocab(Vocab):
+ def __init__(self):
+ self.vocab = {}
+ self.word_array = []
+
+ def load(self, filelist):
+ idx = 0
+ for f in filelist:
+ with open(f, "r") as fin:
+ for line in fin:
+ group = line.strip().split()
+ for item in group:
+ if item not in self.vocab:
+ self.vocab[item] = idx
+ self.word_array.append(idx)
+ idx += 1
+ else:
+ self.word_array.append(self.vocab[item])
+
+ def get_vocab(self):
+ return self.vocab
+
+ def _get_word_array(self):
+ return self.word_array
+
+
+class YoochooseDataset(Dataset):
+ def __init__(self, vocab_size):
+ self.vocab_size = vocab_size
+
+ def sample_neg(self):
+ return random.randint(0, self.vocab_size - 1)
+
+ def sample_neg_from_seq(self, seq):
+ return seq[random.randint(0, len(seq) - 1)]
+
+ def _reader_creator(self, filelist, is_train):
+ def reader():
+ for f in filelist:
+ with open(f, 'r') as fin:
+ for line in fin:
+ ids = line.strip().split()
+ if len(ids) <= 1:
+ continue
+ # convert token strings to integer ids for the int64 data slots
+ conv_ids = [int(i) for i in ids]
+ boundary = len(ids) - 1
+ src = conv_ids[:boundary]
+ pos_tgt = [conv_ids[boundary]]
+ if is_train:
+ neg_tgt = [self.sample_neg()]
+ yield [src, pos_tgt, neg_tgt]
+ else:
+ yield [src, pos_tgt]
+
+ return reader
+
+ def train(self, file_list):
+ return self._reader_creator(file_list, True)
+
+ def test(self, file_list):
+ return self._reader_creator(file_list, False)
diff --git a/PaddleRec/ssr/test_data/small_test.txt b/PaddleRec/ssr/test_data/small_test.txt
new file mode 100644
index 0000000000000000000000000000000000000000..b4bf7189643a041b769fc88c56f6b1ec5b5229db
--- /dev/null
+++ b/PaddleRec/ssr/test_data/small_test.txt
@@ -0,0 +1,100 @@
+0 16
+475 473 155
+491 21
+96 185 96
+29 14 13
+5 481 11 21 470
+70 5 70 11
+167 42 167 217
+72 15 73 161 172
+82 82
+97 297 97
+193 182 186 183 184 177 214
+152 152
+163 298 7
+39 73 71
+490 23 23 496 488 74 23 74 486 23 23 74
+17 17
+170 170 483 444 443 234
+25 472
+5 5 11 70 69
+149 149 455
+356 68 477 468 17 479 66
+159 172 6 71 6 6 158 13 494 169
+155 44 438 144 500
+156 9 9
+146 146
+173 10 10 461
+7 6 6
+269 48 268
+50 100
+323 174 18
+69 69 22 98
+38 171
+22 29 489 10
+0 0
+11 5
+29 13 14 232 231 451 289 452 229
+260 11 156
+166 160 166 39
+223 134 134 420
+66 401 68 132 17 84 287 5
+39 304
+65 84 132
+400 211
+145 144
+16 28 254 48 50 100 42 154 262 133 17
+0 0
+28 28
+11 476 464
+61 61 86 86
+38 38
+463 478
+437 265
+22 39 485 171 98
+434 51 344
+16 16
+67 67 67 448
+22 12 161
+15 377 147 147 374
+119 317 0
+38 484
+403 499
+432 442
+28 0 16 50 465 42
+163 487 7 162
+99 99 325 423 83 83
+154 133
+5 37 492 235 160 279
+10 10 457 493 10 460
+441 4 4 4 4 4 4 4
+153 153
+159 164 164
+328 37
+65 65 404 347 431 459
+80 80 44 44
+61 446
+162 495 7 453
+157 21 204 68 37 66 469 145
+37 151 230 206 240 205 264 87 409 87 288 270 280 329 157 296 454 474
+430 445 433
+449 14
+9 9 9 9
+440 238 226
+148 148
+266 267 181
+48 498
+263 255 256
+458 158 7
+72 168 12 165 71 73 173 49
+0 0
+7 7 6
+14 29 13 6 15 14 15 13
+480 439 21
+450 21 151
+12 12 49 14 13 165 12 169 72 15 15
+91 91
+22 12 49 168
+497 101 30 411 30 482 30 53 30 101 176 415 53 447
+462 150 150
+471 456 131 435 131 467 436 412 227 218 190 466 429 213 326
diff --git a/PaddleRec/ssr/train.py b/PaddleRec/ssr/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..1c0c9f8cc3ed6750d21ba43985fb142dc527cf00
--- /dev/null
+++ b/PaddleRec/ssr/train.py
@@ -0,0 +1,168 @@
+#Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+import os
+import sys
+import time
+import argparse
+import logging
+import paddle.fluid as fluid
+import paddle
+import utils
+import numpy as np
+from nets import SequenceSemanticRetrieval
+
+logging.basicConfig(format="%(asctime)s - %(levelname)s - %(message)s")
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("sequence semantic retrieval")
+ parser.add_argument(
+ "--train_dir", type=str, default='train_data', help="Training file")
+ parser.add_argument(
+ "--base_lr", type=float, default=0.01, help="learning rate")
+ parser.add_argument(
+ '--vocab_path', type=str, default='vocab.txt', help='vocab file')
+ parser.add_argument(
+ "--epochs", type=int, default=10, help="Number of epochs")
+ parser.add_argument(
+ '--parallel', type=int, default=0, help='whether parallel')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether use gpu')
+ parser.add_argument(
+ '--print_batch', type=int, default=10, help='num of print batch')
+ parser.add_argument(
+ '--model_dir', type=str, default='model_output', help='model dir')
+ parser.add_argument(
+ "--hidden_size", type=int, default=128, help="hidden size")
+ parser.add_argument(
+ "--batch_size", type=int, default=50, help="number of batch")
+ parser.add_argument(
+ "--embedding_dim", type=int, default=128, help="embedding dim")
+ parser.add_argument(
+ '--num_devices', type=int, default=1, help='Number of GPU devices')
+ parser.add_argument(
+ '--step_num', type=int, default=1000, help='Number of steps')
+ parser.add_argument(
+ '--enable_ce',
+ action='store_true',
+ help='If set, run the task with continuous evaluation logs.')
+ return parser.parse_args()
+
+
+def get_cards(args):
+ return args.num_devices
+
+
+def train(args):
+ if args.enable_ce:
+ SEED = 102
+ fluid.default_startup_program().random_seed = SEED
+ fluid.default_main_program().random_seed = SEED
+ use_cuda = True if args.use_cuda else False
+ parallel = True if args.parallel else False
+ print("use_cuda:", use_cuda, "parallel:", parallel)
+ train_reader, vocab_size = utils.construct_train_data(
+ args.train_dir, args.vocab_path, args.batch_size * get_cards(args))
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ ssr = SequenceSemanticRetrieval(vocab_size, args.embedding_dim,
+ args.hidden_size)
+ # Train program
+ train_input_data, cos_pos, avg_cost, acc = ssr.train()
+
+ # Optimization to minimize loss
+ optimizer = fluid.optimizer.Adagrad(learning_rate=args.base_lr)
+ optimizer.minimize(avg_cost)
+
+ data_list = [var.name for var in train_input_data]
+ feeder = fluid.DataFeeder(feed_list=data_list, place=place)
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+ if parallel:
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=use_cuda, loss_name=avg_cost.name)
+ else:
+ train_exe = exe
+
+ total_time = 0.0
+ ce_info = []
+ for pass_id in range(args.epochs):
+ epoch_idx = pass_id + 1
+ print("epoch_%d start" % epoch_idx)
+ t0 = time.time()
+ i = 0
+ for batch_id, data in enumerate(train_reader()):
+ i += 1
+ loss_val, correct_val = train_exe.run(
+ feed=feeder.feed(data), fetch_list=[avg_cost.name, acc.name])
+ ce_info.append(float(np.mean(correct_val)) / args.batch_size)
+ if i % args.print_batch == 0:
+ logger.info(
+ "Train --> pass: {} batch_id: {} avg_cost: {}, acc: {}".
+ format(pass_id, batch_id,
+ np.mean(loss_val),
+ float(np.mean(correct_val)) / args.batch_size))
+ if args.enable_ce and i > args.step_num:
+ break
+ t1 = time.time()
+ total_time += t1 - t0
+ print("epoch:%d num_steps:%d time_cost(s):%f" %
+ (epoch_idx, i, total_time / epoch_idx))
+ save_dir = "%s/epoch_%d" % (args.model_dir, epoch_idx)
+ fluid.io.save_params(executor=exe, dirname=save_dir)
+ print("model saved in %s" % save_dir)
+
+ # only for ce
+ if args.enable_ce:
+ ce_acc = 0
+ try:
+ ce_acc = ce_info[-2]
+ except:
+ print("ce info error")
+ epoch_idx = args.epochs
+ device = get_device(args)
+ if args.use_cuda:
+ gpu_num = device[1]
+ print("kpis\teach_pass_duration_gpu%s\t%s" %
+ (gpu_num, total_time / epoch_idx))
+ print("kpis\ttrain_acc_gpu%s\t%s" %
+ (gpu_num, ce_acc))
+ else:
+ cpu_num = device[1]
+ threads_num = device[2]
+ print("kpis\teach_pass_duration_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, total_time / epoch_idx))
+ print("kpis\ttrain_acc_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, ce_acc))
+
+
+def get_device(args):
+ if args.use_cuda:
+ gpus = os.environ.get("CUDA_VISIBLE_DEVICES", "0")
+ gpu_num = len(gpus.split(','))
+ return "gpu", gpu_num
+ else:
+ threads_num = os.environ.get('NUM_THREADS', 1)
+ cpu_num = os.environ.get('CPU_NUM', 1)
+ return "cpu", int(cpu_num), int(threads_num)
+
+
+def main():
+ args = parse_args()
+ train(args)
+
+
+if __name__ == "__main__":
+ main()
diff --git a/PaddleRec/ssr/train_data/small_train.txt b/PaddleRec/ssr/train_data/small_train.txt
new file mode 100644
index 0000000000000000000000000000000000000000..6252a52c5ce3fe5bcc4f28c274e67461f47e1586
--- /dev/null
+++ b/PaddleRec/ssr/train_data/small_train.txt
@@ -0,0 +1,100 @@
+197 196 198 236
+93 93 384 362 363 43
+336 364 407
+421 322
+314 388
+128 58
+138 138
+46 46 46
+34 34 57 57 57 342 228 321 346 357 59 376
+110 110
+135 94 135
+27 250 27
+129 118
+18 18 18
+81 81 89 89
+27 27
+20 20 20 20 20 212
+33 33 33 33
+62 62 62 63 63 55 248 124 381 428 383 382 43 43 261 63
+90 90 78 78
+399 397 202 141 104 104 245 192 191 271
+239 332 283 88
+187 313
+136 136 324
+41 41
+352 128
+413 414
+410 45 45 45 1 1 1 1 1 1 1 1 31 31 31 31
+92 334 92
+95 285
+215 249
+390 41
+116 116
+300 252
+2 2 2 2 2
+8 8 8 8 8 8
+53 241 259
+118 129 126 94 137 208 216 299
+209 368 139 418 419
+311 180
+303 302 203 284
+369 32 32 32 32 337
+207 47 47 47
+106 107
+143 143
+179 178
+109 109
+405 79 79 371 246
+251 417 427
+333 88 387 358 123 348 394 360 36 365
+3 3 3 3 3
+189 188
+398 425
+107 406
+281 201 141
+2 2 2
+359 54
+395 385 293
+60 60 60 121 121 233 58 58
+24 199 175 24 24 24 351 386 106
+115 294
+122 122 127 127
+35 35
+282 393
+277 140 140 343 225 123 36 36 36 221 114 114 59 59 117 117 247 367 219 258 222 301 375 350 353 111 111
+275 272 273 274 331 330 305 108 76 76 108
+26 26 26 408 26
+290 18 210 291
+372 139 424 113
+341 340 335
+120 370
+224 200
+426 416
+137 319
+402 55
+54 54
+327 119
+125 125
+391 396 354 355 389
+142 142
+295 320
+113 366
+253 85 85
+56 56 310 309 308 307 278 25 25 19 19 3 312 19 19 19 3 25
+220 338
+34 130
+130 120 380 315
+339 422
+379 378
+95 56 392 115
+55 124
+126 34
+349 373 361
+195 194
+75 75
+64 64 64
+35 35
+40 40 40 242 77 244 77 243
+257 316
+103 306 102 51 52 103 105 52 52 292 318 112 286 345 237 276 112 51 102 105
diff --git a/PaddleRec/ssr/utils.py b/PaddleRec/ssr/utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..4fe9ef470ed0a2a5da7bef6a975f45e5a04ab18e
--- /dev/null
+++ b/PaddleRec/ssr/utils.py
@@ -0,0 +1,49 @@
+import numpy as np
+import reader as reader
+import os
+import logging
+import paddle.fluid as fluid
+import paddle
+
+
+def get_vocab_size(vocab_path):
+ with open(vocab_path, "r") as rf:
+ line = rf.readline()
+ return int(line.strip())
+
+
+def construct_train_data(file_dir, vocab_path, batch_size):
+ vocab_size = get_vocab_size(vocab_path)
+ files = [file_dir + '/' + f for f in os.listdir(file_dir)]
+ y_data = reader.YoochooseDataset(vocab_size)
+ train_reader = paddle.batch(
+ paddle.reader.shuffle(
+ y_data.train(files), buf_size=batch_size * 100),
+ batch_size=batch_size)
+ return train_reader, vocab_size
+
+
+def construct_test_data(file_dir, vocab_path, batch_size):
+ vocab_size = get_vocab_size(vocab_path)
+ files = [file_dir + '/' + f for f in os.listdir(file_dir)]
+ y_data = reader.YoochooseDataset(vocab_size)
+ test_reader = paddle.batch(y_data.test(files), batch_size=batch_size)
+ return test_reader, vocab_size
+
+
+def infer_data(raw_data, place):
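+ # flatten the batch of variable-length sequences into one LoDTensor; `lod`
+ # stores cumulative sequence boundaries, e.g. lengths [2, 3] -> lod [0, 2, 5]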
+ data = [dat[0] for dat in raw_data]
+ seq_lens = [len(seq) for seq in data]
+ cur_len = 0
+ lod = [cur_len]
+ for l in seq_lens:
+ cur_len += l
+ lod.append(cur_len)
+ flattened_data = np.concatenate(data, axis=0).astype("int64")
+ flattened_data = flattened_data.reshape([len(flattened_data), 1])
+ res = fluid.LoDTensor()
+ res.set(flattened_data, place)
+ res.set_lod([lod])
+ p_label = [dat[1] for dat in raw_data]
+ pos_label = np.array(p_label).astype("int64").reshape(len(p_label), 1)
+ return res, pos_label
diff --git a/PaddleRec/ssr/vocab.txt b/PaddleRec/ssr/vocab.txt
new file mode 100644
index 0000000000000000000000000000000000000000..c15fb720f8f8a9163cfec319b226864a3246a7e7
--- /dev/null
+++ b/PaddleRec/ssr/vocab.txt
@@ -0,0 +1 @@
+501
diff --git a/PaddleRec/tagspace/.run_ce.sh b/PaddleRec/tagspace/.run_ce.sh
new file mode 100755
index 0000000000000000000000000000000000000000..74a0413a846f21cc7eaacd444f527103778be923
--- /dev/null
+++ b/PaddleRec/tagspace/.run_ce.sh
@@ -0,0 +1,20 @@
+#!/bin/bash
+
+export MKL_NUM_THREADS=1
+export OMP_NUM_THREADS=1
+
+
+export CPU_NUM=1
+export NUM_THREADS=1
+
+FLAGS_benchmark=true python train.py --enable_ce --train_dir train_big_data/ --vocab_text_path big_vocab_text.txt --vocab_tag_path big_vocab_tag.txt --model_dir big_model --batch_size 500 | python _ce.py
+
+cudaid=${tagspace:=0} # use 0-th card as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py --enable_ce --use_cuda 1 --train_dir train_big_data/ --vocab_text_path big_vocab_text.txt --vocab_tag_path big_vocab_tag.txt --model_dir big_model --batch_size 500 --parallel 1 | python _ce.py
+
+cudaid=${tagspace_4:=0,1,2,3} # use cards 0-3 as default
+export CUDA_VISIBLE_DEVICES=$cudaid
+
+FLAGS_benchmark=true python train.py --enable_ce --use_cuda 1 --train_dir train_big_data/ --vocab_text_path big_vocab_text.txt --vocab_tag_path big_vocab_tag.txt --model_dir big_model --batch_size 500 --parallel 1 | python _ce.py
diff --git a/PaddleRec/tagspace/README.md b/PaddleRec/tagspace/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..4263065bee2c5492684147f532e92c7c8083e16f
--- /dev/null
+++ b/PaddleRec/tagspace/README.md
@@ -0,0 +1,92 @@
+# TagSpace
+
+The following is a brief directory structure and description for this example:
+
+```text
+.
+├── README.md # documentation
+├── train.py # training script
+├── infer.py # inference script
+├── net.py # network structure
+├── text2paddle.py # converts raw text data into Paddle-format data
+├── cluster_train.py # multi-machine training
+├── cluster_train.sh # multi-machine training script
+├── utils # common utilities
+├── vocab_text.txt # text vocabulary for the small sample
+├── vocab_tag.txt # tag vocabulary for the small sample
+├── train_data # small-sample training directory
+└── test_data # small-sample test directory
+
+```
+
+
+## Introduction
+
+An introduction to the TagSpace model can be found in the paper [#TagSpace: Semantic Embeddings from Hashtags](https://research.fb.com/publications/tagspace-semantic-embeddings-from-hashtags/).
+
+TagSpace learns embedding representations of text and tags for industrial-scale tag recommendation; a typical application is tag recommendation for feed news. A minimal sketch of its ranking objective is given below.
+
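+As a minimal sketch (NumPy only; the margin value and random embeddings below are illustrative assumptions, not the configuration used in `net.py`), TagSpace scores a text against a tag by the similarity of their embeddings and trains with a margin ranking loss that pushes positive tags above sampled negative ones:
+
+```python
+import numpy as np
+
+def tag_rank_loss(text_vec, pos_tag_vec, neg_tag_vec, margin=0.1):
+    """Hinge ranking loss: want score(text, pos_tag) > score(text, neg_tag) + margin."""
+    cos = lambda a, b: a.dot(b) / (np.linalg.norm(a) * np.linalg.norm(b))
+    return max(0.0, margin - cos(text_vec, pos_tag_vec) + cos(text_vec, neg_tag_vec))
+
+rng = np.random.default_rng(0)
+text, pos, neg = rng.normal(size=(3, 8))  # toy 8-dimensional embeddings
+print(tag_rank_loss(text, pos, neg))
+```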
+
+## Data download and preprocessing
+
+Dataset: [ag news dataset](https://github.com/mhjabreel/CharCNN/tree/master/data/)
+
+Mirror: [ag news dataset](https://paddle-tagspace.bj.bcebos.com/data.tar)
+
+The data format is as follows:
+
+```
+"3","Wall St. Bears Claw Back Into the Black (Reuters)","Reuters - Short-sellers, Wall Street's dwindling\band of ultra-cynics, are seeing green again."
+```
+
+After extracting the mirrored archive, convert the text data to Paddle format. First move the files into the raw training and test directories:
+```
+mv train.csv raw_big_train_data
+mv test.csv raw_big_test_data
+```
+
+Run text2paddle.py to generate the Paddle input format:
+```
+python text2paddle.py raw_big_train_data/ raw_big_test_data/ train_big_data test_big_data big_vocab_text.txt big_vocab_tag.txt
+```
+
+## Single-machine training
+'--use_cuda 1' means training on GPU, 0 means CPU; '--parallel 1' enables multi-card training.
+
+Training on the small sample dataset (the sample data is already prepared, so you can skip the data preparation in the previous section and run the commands directly):
+
+GPU environment
+```
+CUDA_VISIBLE_DEVICES=0 python train.py --use_cuda 1
+```
+CPU environment
+```
+python train.py
+```
+
+Full-data training on a single machine with one GPU:
+```
+CUDA_VISIBLE_DEVICES=0 python train.py --use_cuda 1 --train_dir train_big_data/ --vocab_text_path big_vocab_text.txt --vocab_tag_path big_vocab_tag.txt --model_dir big_model --batch_size 500
+```
+Full-data training on a single machine with multiple cards:
+
+```
+python train.py --train_dir train_big_data/ --vocab_text_path big_vocab_text.txt --vocab_tag_path big_vocab_tag.txt --model_dir big_model --batch_size 500 --parallel 1
+```
+
+## Inference
+Inference on the small sample dataset:
+```
+python infer.py
+```
+
+Full-data inference:
+```
+python infer.py --model_dir big_model --vocab_tag_path big_vocab_tag.txt --test_dir test_big_data/
+```
+
+## Simulating multiple machines locally
+Run the command:
+```
+sh cluster_train.sh
+```
diff --git a/PaddleRec/tagspace/__init.py__ b/PaddleRec/tagspace/__init.py__
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/PaddleRec/tagspace/_ce.py b/PaddleRec/tagspace/_ce.py
new file mode 100644
index 0000000000000000000000000000000000000000..b75fa39a2114c9388ffad77be74f52733010aeba
--- /dev/null
+++ b/PaddleRec/tagspace/_ce.py
@@ -0,0 +1,66 @@
+# this file is only used for continuous evaluation test!
+
+import os
+import sys
+sys.path.append(os.environ['ceroot'])
+from kpi import CostKpi
+from kpi import DurationKpi
+from kpi import AccKpi
+
+
+each_pass_duration_cpu1_thread1_kpi = DurationKpi('each_pass_duration_cpu1_thread1', 0.08, 0, actived=True)
+train_acc_cpu1_thread1_kpi = AccKpi('train_acc_cpu1_thread1', 0.08, 0)
+each_pass_duration_gpu1_kpi = DurationKpi('each_pass_duration_gpu1', 0.08, 0, actived=True)
+train_acc_gpu1_kpi = AccKpi('train_acc_gpu1', 0.08, 0)
+each_pass_duration_gpu4_kpi = DurationKpi('each_pass_duration_gpu4', 0.08, 0, actived=True)
+train_acc_gpu4_kpi = AccKpi('train_acc_gpu4', 0.08, 0)
+
+tracking_kpis = [
+ each_pass_duration_cpu1_thread1_kpi,
+ train_acc_cpu1_thread1_kpi,
+ each_pass_duration_gpu1_kpi,
+ train_acc_gpu1_kpi,
+ each_pass_duration_gpu4_kpi,
+ train_acc_gpu4_kpi,
+ ]
+
+
+def parse_log(log):
+ '''
+ This method should be implemented by model developers.
+
+ The suggestion:
+
+ each line in the log should be key, value, for example:
+
+ "
+ train_cost\t1.0
+ test_cost\t1.0
+ train_cost\t1.0
+ train_cost\t1.0
+ train_acc\t1.2
+ "
+ '''
+ for line in log.split('\n'):
+ fs = line.strip().split('\t')
+ print(fs)
+ if len(fs) == 3 and fs[0] == 'kpis':
+ kpi_name = fs[1]
+ kpi_value = float(fs[2])
+ yield kpi_name, kpi_value
+
+
+def log_to_ce(log):
+ kpi_tracker = {}
+ for kpi in tracking_kpis:
+ kpi_tracker[kpi.name] = kpi
+
+ for (kpi_name, kpi_value) in parse_log(log):
+ print(kpi_name, kpi_value)
+ kpi_tracker[kpi_name].add_record(kpi_value)
+ kpi_tracker[kpi_name].persist()
+
+
+if __name__ == '__main__':
+ log = sys.stdin.read()
+ log_to_ce(log)
diff --git a/fluid/PaddleRec/tagspace/cluster_train.py b/PaddleRec/tagspace/cluster_train.py
similarity index 100%
rename from fluid/PaddleRec/tagspace/cluster_train.py
rename to PaddleRec/tagspace/cluster_train.py
diff --git a/fluid/PaddleRec/tagspace/cluster_train.sh b/PaddleRec/tagspace/cluster_train.sh
similarity index 100%
rename from fluid/PaddleRec/tagspace/cluster_train.sh
rename to PaddleRec/tagspace/cluster_train.sh
diff --git a/PaddleRec/tagspace/infer.py b/PaddleRec/tagspace/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..430aa769dab50f8432fe847de01904f545faf1e5
--- /dev/null
+++ b/PaddleRec/tagspace/infer.py
@@ -0,0 +1,99 @@
+import sys
+import argparse
+import time
+import math
+import unittest
+import contextlib
+import numpy as np
+import six
+import paddle.fluid as fluid
+import paddle
+import utils
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("tagspace benchmark.")
+ parser.add_argument(
+ '--test_dir', type=str, default='test_data', help='test file address')
+ parser.add_argument(
+ '--vocab_tag_path',
+ type=str,
+ default='vocab_tag.txt',
+ help='vocab path')
+ parser.add_argument(
+ '--start_index', type=int, default=1, help='start index')
+ parser.add_argument(
+ '--last_index', type=int, default=10, help='end index')
+ parser.add_argument(
+ '--model_dir', type=str, default='model_', help='model dir')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether use cuda')
+ parser.add_argument(
+ '--batch_size', type=int, default=5, help='batch_size')
+ args = parser.parse_args()
+ return args
+
+
+def infer(test_reader, vocab_tag, use_cuda, model_path, epoch):
+ """ inference function """
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+
+ with fluid.scope_guard(fluid.core.Scope()):
+ infer_program, feed_target_names, fetch_vars = fluid.io.load_inference_model(
+ model_path, exe)
+ t0 = time.time()
+ step_id = 0
+ true_num = 0
+ all_num = 0
+ size = vocab_tag
+ value = []
+ print("epoch " + str(epoch) + " start")
+ for data in test_reader():
+ step_id += 1
+ lod_text_seq = utils.to_lodtensor([dat[0] for dat in data], place)
+ lod_tag = utils.to_lodtensor([dat[1] for dat in data], place)
+ lod_pos_tag = utils.to_lodtensor([dat[2] for dat in data], place)
+ para = exe.run(infer_program,
+ feed={"text": lod_text_seq,
+ "pos_tag": lod_tag},
+ fetch_list=fetch_vars,
+ return_numpy=False)
+ value.append(para[0]._get_float_element(0))
+ if step_id % size == 0 and step_id > 1:
+ all_num += 1
+ true_pos = [dat[2] for dat in data][0][0]
+ if value.index(max(value)) == int(true_pos):
+ true_num += 1
+ value = []
+ print("epoch:" + str(epoch) + "\tacc:" + str(1.0 * true_num / all_num))
+ t1 = time.time()
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ start_index = args.start_index
+ last_index = args.last_index
+ test_dir = args.test_dir
+ model_dir = args.model_dir
+ batch_size = args.batch_size
+ vocab_tag_path = args.vocab_tag_path
+ use_cuda = True if args.use_cuda else False
+ print("start index: ", start_index, " last_index:", last_index)
+ vocab_text, vocab_tag, test_reader = utils.prepare_data(
+ test_dir,
+ "",
+ vocab_tag_path,
+ batch_size=1,
+ neg_size=0,
+ buffer_size=1000,
+ is_train=False)
+
+ for epoch in range(start_index, last_index + 1):
+ epoch_path = model_dir + "/epoch_" + str(epoch)
+ infer(
+ test_reader=test_reader,
+ vocab_tag=vocab_tag,
+ use_cuda=use_cuda,
+ model_path=epoch_path,
+ epoch=epoch)
diff --git a/fluid/PaddleRec/tagspace/net.py b/PaddleRec/tagspace/net.py
similarity index 100%
rename from fluid/PaddleRec/tagspace/net.py
rename to PaddleRec/tagspace/net.py
diff --git a/fluid/PaddleRec/tagspace/test_data/small_test.csv b/PaddleRec/tagspace/test_data/small_test.csv
similarity index 100%
rename from fluid/PaddleRec/tagspace/test_data/small_test.csv
rename to PaddleRec/tagspace/test_data/small_test.csv
diff --git a/fluid/PaddleRec/tagspace/text2paddle.py b/PaddleRec/tagspace/text2paddle.py
similarity index 100%
rename from fluid/PaddleRec/tagspace/text2paddle.py
rename to PaddleRec/tagspace/text2paddle.py
diff --git a/PaddleRec/tagspace/train.py b/PaddleRec/tagspace/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..419bb1c4b156c148f8bc4bc3a48385b6722f5c68
--- /dev/null
+++ b/PaddleRec/tagspace/train.py
@@ -0,0 +1,171 @@
+import os
+import sys
+import time
+import six
+import numpy as np
+import math
+import argparse
+import paddle
+import paddle.fluid as fluid
+import utils
+import net
+
+SEED = 102
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("TagSpace benchmark.")
+ parser.add_argument(
+ '--neg_size', type=int, default=3, help='number of neg item')
+ parser.add_argument(
+ '--train_dir', type=str, default='train_data', help='train file')
+ parser.add_argument(
+ '--vocab_text_path', type=str, default='vocab_text.txt', help='text')
+ parser.add_argument(
+ '--vocab_tag_path', type=str, default='vocab_tag.txt', help='tag')
+ parser.add_argument(
+ '--model_dir', type=str, default='model_', help='model dir')
+ parser.add_argument(
+ '--batch_size', type=int, default=5, help='num of batch size')
+ parser.add_argument(
+ '--print_batch', type=int, default=10, help='num of print batch')
+ parser.add_argument(
+ '--pass_num', type=int, default=10, help='number of epoch')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether use gpu')
+ parser.add_argument(
+ '--parallel', type=int, default=0, help='whether parallel')
+ parser.add_argument(
+ '--base_lr', type=float, default=0.01, help='learning rate')
+ parser.add_argument(
+ '--num_devices', type=int, default=1, help='Number of GPU devices')
+ parser.add_argument(
+ '--enable_ce',
+ action='store_true',
+ help='If set, run the task with continuous evaluation logs.')
+ args = parser.parse_args()
+ return args
+
+
+def get_cards(args):
+ return args.num_devices
+
+
+def train():
+ """ do training """
+ args = parse_args()
+ if args.enable_ce:
+ fluid.default_startup_program().random_seed = SEED
+ fluid.default_main_program().random_seed = SEED
+ train_dir = args.train_dir
+ vocab_text_path = args.vocab_text_path
+ vocab_tag_path = args.vocab_tag_path
+ use_cuda = True if args.use_cuda else False
+ parallel = True if args.parallel else False
+ batch_size = args.batch_size
+ neg_size = args.neg_size
+ print("use_cuda: {}, parallel: {}, batch_size: {}, neg_size: {} "
+ .format(use_cuda, parallel, batch_size, neg_size))
+ vocab_text_size, vocab_tag_size, train_reader = utils.prepare_data(
+ file_dir=train_dir,
+ vocab_text_path=vocab_text_path,
+ vocab_tag_path=vocab_tag_path,
+ neg_size=neg_size,
+ batch_size=batch_size * get_cards(args),
+ buffer_size=batch_size * 100,
+ is_train=True)
+ """ train network """
+ # Train program
+ avg_cost, correct, cos_pos = net.network(
+ vocab_text_size, vocab_tag_size, neg_size=neg_size)
+
+ # Optimization to minimize loss
+ sgd_optimizer = fluid.optimizer.Adagrad(learning_rate=args.base_lr)
+ sgd_optimizer.minimize(avg_cost)
+
+ # Initialize executor
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+ if parallel:
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=use_cuda, loss_name=avg_cost.name)
+ else:
+ train_exe = exe
+
+ pass_num = args.pass_num
+ model_dir = args.model_dir
+ fetch_list = [avg_cost.name]
+ total_time = 0.0
+ ce_info = []
+ for pass_idx in range(pass_num):
+ epoch_idx = pass_idx + 1
+ print("epoch_%d start" % epoch_idx)
+ t0 = time.time()
+ for batch_id, data in enumerate(train_reader()):
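+ # each sample is (text token ids, positive tag, sampled negative tags),
+ # fed to the network as LoDTensors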
+ lod_text_seq = utils.to_lodtensor([dat[0] for dat in data], place)
+ lod_pos_tag = utils.to_lodtensor([dat[1] for dat in data], place)
+ lod_neg_tag = utils.to_lodtensor([dat[2] for dat in data], place)
+ loss_val, correct_val = train_exe.run(
+ feed={
+ "text": lod_text_seq,
+ "pos_tag": lod_pos_tag,
+ "neg_tag": lod_neg_tag
+ },
+ fetch_list=[avg_cost.name, correct.name])
+ ce_info.append(float(np.sum(correct_val)) / (args.num_devices * batch_size))
+ if batch_id % args.print_batch == 0:
+ print("TRAIN --> pass: {} batch_num: {} avg_cost: {}, acc: {}"
+ .format(pass_idx, (batch_id + 10) * batch_size,
+ np.mean(loss_val),
+ float(np.sum(correct_val)) / (args.num_devices *
+ batch_size)))
+ t1 = time.time()
+ total_time += t1 - t0
+ print("epoch:%d num_steps:%d time_cost(s):%f" %
+ (epoch_idx, batch_id, total_time / epoch_idx))
+ save_dir = "%s/epoch_%d" % (model_dir, epoch_idx)
+ feed_var_names = ["text", "pos_tag"]
+ fetch_vars = [cos_pos]
+ fluid.io.save_inference_model(save_dir, feed_var_names, fetch_vars,
+ exe)
+ # only for ce
+ if args.enable_ce:
+ ce_acc = 0
+ try:
+ ce_acc = ce_info[-2]
+ except IndexError:
+ logger.error("ce info error")
+ epoch_idx = args.pass_num
+ device = get_device(args)
+ if args.use_cuda:
+ gpu_num = device[1]
+ print("kpis\teach_pass_duration_gpu%s\t%s" %
+ (gpu_num, total_time / epoch_idx))
+ print("kpis\ttrain_acc_gpu%s\t%s" %
+ (gpu_num, ce_acc))
+ else:
+ cpu_num = device[1]
+ threads_num = device[2]
+ print("kpis\teach_pass_duration_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, total_time / epoch_idx))
+ print("kpis\ttrain_acc_cpu%s_thread%s\t%s" %
+ (cpu_num, threads_num, ce_acc))
+
+ print("finish training")
+
+
+def get_device(args):
+ if args.use_cuda:
+ gpus = os.environ.get("CUDA_VISIBLE_DEVICES", "0")
+ gpu_num = len(gpus.split(','))
+ return "gpu", gpu_num
+ else:
+ threads_num = os.environ.get('NUM_THREADS', 1)
+ cpu_num = os.environ.get('CPU_NUM', 1)
+ return "cpu", int(cpu_num), int(threads_num)
+
+
+if __name__ == "__main__":
+ train()
diff --git a/fluid/PaddleRec/tagspace/train_data/small_train.csv b/PaddleRec/tagspace/train_data/small_train.csv
similarity index 100%
rename from fluid/PaddleRec/tagspace/train_data/small_train.csv
rename to PaddleRec/tagspace/train_data/small_train.csv
diff --git a/fluid/PaddleRec/tagspace/utils.py b/PaddleRec/tagspace/utils.py
similarity index 100%
rename from fluid/PaddleRec/tagspace/utils.py
rename to PaddleRec/tagspace/utils.py
diff --git a/fluid/PaddleRec/tagspace/vocab_tag.txt b/PaddleRec/tagspace/vocab_tag.txt
similarity index 100%
rename from fluid/PaddleRec/tagspace/vocab_tag.txt
rename to PaddleRec/tagspace/vocab_tag.txt
diff --git a/fluid/PaddleRec/tagspace/vocab_text.txt b/PaddleRec/tagspace/vocab_text.txt
similarity index 100%
rename from fluid/PaddleRec/tagspace/vocab_text.txt
rename to PaddleRec/tagspace/vocab_text.txt
diff --git a/PaddleRec/word2vec/README.md b/PaddleRec/word2vec/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..936d9fac5860f7adf9fcc587334ecb2aebce1991
--- /dev/null
+++ b/PaddleRec/word2vec/README.md
@@ -0,0 +1,113 @@
+# Skip-gram word2vec model
+
+Brief directory structure and description of this example:
+
+```text
+.
+├── cluster_train.py # distributed training entry
+├── cluster_train.sh # script to simulate multi-node training locally
+├── train.py # single-node training entry
+├── infer.py # inference script
+├── net.py # network definition
+├── preprocess.py # preprocessing script: builds the vocabulary and preprocesses the corpus
+├── reader.py # corpus reader used during training
+├── README.md # usage notes
+└── utils.py # common helper functions
+
+```
+
+## Introduction
+This example implements the skip-gram variant of the word2vec model.
+
+
+## Data download
+The full dataset comes from the [1 Billion Word Language Model Benchmark](http://www.statmt.org/lm-benchmark).
+
+```bash
+wget http://www.statmt.org/lm-benchmark/1-billion-word-language-modeling-benchmark-r13output.tar.gz
+tar xzvf 1-billion-word-language-modeling-benchmark-r13output.tar.gz
+mv 1-billion-word-language-modeling-benchmark-r13output/training-monolingual.tokenized.shuffled/ data/
+```
+
+Commands for the backup download location:
+
+```bash
+wget https://paddlerec.bj.bcebos.com/word2vec/1-billion-word-language-modeling-benchmark-r13output.tar
+tar xvf 1-billion-word-language-modeling-benchmark-r13output.tar
+mv 1-billion-word-language-modeling-benchmark-r13output/training-monolingual.tokenized.shuffled/ data/
+```
+
+For quick verification we also provide the classic text8 sample dataset, containing roughly 17 million words. Download commands:
+
+```bash
+wget https://paddlerec.bj.bcebos.com/word2vec/text.tar
+tar xvf text.tar
+mv text data/
+```
+
+
+## Data preprocessing
+The steps below preprocess the sample dataset. For the full dataset, note that after extraction the preprocessing directory is training-monolingual.tokenized.shuffled, which sits alongside the sample dataset's text directory.
+
+Dictionary format: word<space>frequency. Note that low-frequency words are represented by '<UNK>'.
+
+You may build your own dictionary in this format; if you do, skip step one.
+```
+the 1061396
+of 593677
+and 416629
+one 411764
+in 372201
+a 325873
+<UNK> 324608
+to 316376
+zero 264975
+nine 250430
+```
+
+Step one: build the dictionary from the English corpus. For a Chinese corpus, you can customize the processing by modifying the text_strip method.
+
+```bash
+python preprocess.py --build_dict --build_dict_corpus_dir data/text/ --dict_path data/test_build_dict
+```
+
+Step two: use the dictionary to convert the text to ids and, at the same time, downsample it, probabilistically filtering out frequent words (see the sketch after the command below).
+
+```bash
+python preprocess.py --filter_corpus --dict_path data/test_build_dict --input_corpus_dir data/text/ --output_corpus_dir data/convert_text8 --min_count 5 --downsample 0.001
+```
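+
+The keep probability used for downsampling follows the standard word2vec subsampling rule; a minimal sketch mirroring `filter_corpus` in `preprocess.py` (the function name is illustrative):
+
+```python
+import math
+import random
+
+def keep_word(count_w, corpus_size, downsample=0.001):
+    """Decide whether to keep one occurrence of a word seen count_w times."""
+    t = downsample * corpus_size
+    keep_prob = (math.sqrt(count_w / t) + 1.0) * t / count_w
+    return random.random() <= keep_prob
+```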
+
+## Training
+To see all configurable parameters, run
+
+
+```bash
+python train.py -h
+```
+
+Single-node multi-threaded training:
+```bash
+OPENBLAS_NUM_THREADS=1 CPU_NUM=5 python train.py --train_data_dir data/convert_text8 --dict_path data/test_build_dict --num_passes 10 --batch_size 100 --model_output_dir v1_cpu5_b100_lr1dir --base_lr 1.0 --print_batch 1000 --with_speed --is_sparse
+```
+
+Simulating multi-node training locally on a single machine:
+
+```bash
+sh cluster_train.sh
+```
+
+## Inference
+Download commands for the test sets:
+
+```bash
+# test set for the full dataset
+wget https://paddlerec.bj.bcebos.com/word2vec/test_dir.tar
+# test set for the sample dataset
+wget https://paddlerec.bj.bcebos.com/word2vec/test_mid_dir.tar
+```
+
+Inference command. Note that the dictionary name needs the suffix "_word_to_id_"; this file is generated during training. The evaluation scores word analogies, as sketched after the command.
+```bash
+python infer.py --infer_epoch --test_dir data/test_mid_dir/ --dict_path data/test_build_dict_word_to_id_ --batch_size 20000 --model_dir v1_cpu5_b100_lr1dir/ --start_index 0
+```
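+
+The evaluation asks word-analogy questions (a is to b as c is to ?). `net.infer_network` ranks all words by the similarity between emb(b) - emb(a) + emb(c) and the L2-normalized embedding table, keeping the top 4. A numpy sketch of the same scoring rule (`emb` is the trained embedding table; names are illustrative):
+
+```python
+import numpy as np
+
+def analogy_topk(emb, a, b, c, k=4):
+    """Ids of the k words whose embeddings best match emb[b] - emb[a] + emb[c]."""
+    target = emb[b] - emb[a] + emb[c]
+    normed = emb / np.linalg.norm(emb, axis=1, keepdims=True)
+    scores = normed.dot(target)
+    return np.argsort(-scores)[:k]
+```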
diff --git a/PaddleRec/word2vec/cluster_train.py b/PaddleRec/word2vec/cluster_train.py
new file mode 100644
index 0000000000000000000000000000000000000000..4ea3476491c89cb412d69cd8a6bc550ce810877a
--- /dev/null
+++ b/PaddleRec/word2vec/cluster_train.py
@@ -0,0 +1,250 @@
+from __future__ import print_function
+import argparse
+import logging
+import os
+import time
+import math
+import random
+import numpy as np
+import paddle
+import paddle.fluid as fluid
+import six
+import reader
+from net import skip_gram_word2vec
+
+logging.basicConfig(format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser(
+ description="PaddlePaddle Word2vec example")
+ parser.add_argument(
+ '--train_data_dir',
+ type=str,
+ default='./data/text',
+ help="The path of taining dataset")
+ parser.add_argument(
+ '--base_lr',
+ type=float,
+ default=0.01,
+ help="The number of learing rate (default: 0.01)")
+ parser.add_argument(
+ '--save_step',
+ type=int,
+ default=500000,
+ help="The number of step to save (default: 500000)")
+ parser.add_argument(
+ '--print_batch',
+ type=int,
+ default=100,
+ help="The number of print_batch (default: 10)")
+ parser.add_argument(
+ '--dict_path',
+ type=str,
+ default='./data/1-billion_dict',
+ help="The path of data dict")
+ parser.add_argument(
+ '--batch_size',
+ type=int,
+ default=500,
+ help="The size of mini-batch (default:500)")
+ parser.add_argument(
+ '--num_passes',
+ type=int,
+ default=10,
+ help="The number of passes to train (default: 10)")
+ parser.add_argument(
+ '--model_output_dir',
+ type=str,
+ default='models',
+ help='The path for model to store (default: models)')
+ parser.add_argument('--nce_num', type=int, default=5, help='number of negative samples for NCE')
+ parser.add_argument(
+ '--embedding_size',
+ type=int,
+ default=64,
+ help='The size of the word embedding (default: 64)')
+ parser.add_argument(
+ '--is_sparse',
+ action='store_true',
+ required=False,
+ default=False,
+ help='whether the embedding and NCE layers use sparse updates (default: False)')
+ parser.add_argument(
+ '--with_speed',
+ action='store_true',
+ required=False,
+ default=False,
+ help='whether to print training speed (default: False)')
+ parser.add_argument(
+ '--role', type=str, default='pserver', help='trainer or pserver')
+ parser.add_argument(
+ '--endpoints',
+ type=str,
+ default='127.0.0.1:6000',
+ help='The pserver endpoints, like: 127.0.0.1:6000, 127.0.0.1:6001')
+ parser.add_argument(
+ '--current_endpoint',
+ type=str,
+ default='127.0.0.1:6000',
+ help='The current_endpoint')
+ parser.add_argument(
+ '--trainer_id',
+ type=int,
+ default=0,
+ help='trainer id; only trainer_id=0 saves the model')
+ parser.add_argument(
+ '--trainers',
+ type=int,
+ default=1,
+ help='The number of trainers (default: 1)')
+ return parser.parse_args()
+
+
+def convert_python_to_tensor(weight, batch_size, sample_reader):
+ def __reader__():
+ cs = np.array(weight).cumsum()
+ result = [[], []]
+ for sample in sample_reader():
+ for i, fea in enumerate(sample):
+ result[i].append(fea)
+ if len(result[0]) == batch_size:
+ tensor_result = []
+ for tensor in result:
+ t = fluid.Tensor()
+ dat = np.array(tensor, dtype='int64')
+ if len(dat.shape) > 2:
+ dat = dat.reshape((dat.shape[0], dat.shape[2]))
+ elif len(dat.shape) == 1:
+ dat = dat.reshape((-1, 1))
+ t.set(dat, fluid.CPUPlace())
+ tensor_result.append(t)
+ tt = fluid.Tensor()
+ neg_array = cs.searchsorted(np.random.sample(args.nce_num))
+ neg_array = np.tile(neg_array, batch_size)
+ tt.set(
+ neg_array.reshape((batch_size, args.nce_num)),
+ fluid.CPUPlace())
+ tensor_result.append(tt)
+ yield tensor_result
+ result = [[], []]
+
+ return __reader__
+
+
+def train_loop(args, train_program, reader, py_reader, loss, trainer_id,
+ weight):
+
+ py_reader.decorate_tensor_provider(
+ convert_python_to_tensor(weight, args.batch_size, reader.train()))
+
+ place = fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ print("CPU_NUM:" + str(os.getenv("CPU_NUM")))
+
+ train_exe = exe
+
+ for pass_id in range(args.num_passes):
+ py_reader.start()
+ # give the async reader time to fill its queue
+ time.sleep(10)
+ epoch_start = time.time()
+ batch_id = 0
+ start = time.time()
+ try:
+ while True:
+
+ loss_val = train_exe.run(fetch_list=[loss.name])
+ loss_val = np.mean(loss_val)
+
+ if batch_id % args.print_batch == 0:
+ logger.info(
+ "TRAIN --> pass: {} batch: {} loss: {} reader queue:{}".
+ format(pass_id, batch_id,
+ loss_val.mean(), py_reader.queue.size()))
+ if args.with_speed:
+ if batch_id % 500 == 0 and batch_id != 0:
+ elapsed = (time.time() - start)
+ start = time.time()
+ # 500 iterations since the last log, one batch per CPU thread each
+ samples = 500 * args.batch_size * int(
+ os.getenv("CPU_NUM"))
+ logger.info("Time used: {}, Samples/Sec: {}".format(
+ elapsed, samples / elapsed))
+
+ if batch_id % args.save_step == 0 and batch_id != 0:
+ model_dir = args.model_output_dir + '/pass-' + str(
+ pass_id) + ('/batch-' + str(batch_id))
+ if trainer_id == 0:
+ fluid.io.save_params(executor=exe, dirname=model_dir)
+ print("model saved in %s" % model_dir)
+ batch_id += 1
+
+ except fluid.core.EOFException:
+ py_reader.reset()
+ epoch_end = time.time()
+ logger.info("Epoch: {0}, Train total expend: {1} ".format(
+ pass_id, epoch_end - epoch_start))
+ model_dir = args.model_output_dir + '/pass-' + str(pass_id)
+ if trainer_id == 0:
+ fluid.io.save_params(executor=exe, dirname=model_dir)
+ print("model saved in %s" % model_dir)
+
+
+def GetFileList(data_path):
+ return os.listdir(data_path)
+
+
+def train(args):
+
+ if not os.path.isdir(args.model_output_dir) and args.trainer_id == 0:
+ os.mkdir(args.model_output_dir)
+
+ filelist = GetFileList(args.train_data_dir)
+ word2vec_reader = reader.Word2VecReader(args.dict_path, args.train_data_dir,
+ filelist, 0, 1)
+
+ logger.info("dict_size: {}".format(word2vec_reader.dict_size))
+ np_power = np.power(np.array(word2vec_reader.id_frequencys), 0.75)
+ id_frequencys_pow = np_power / np_power.sum()
+
+ loss, py_reader = skip_gram_word2vec(
+ word2vec_reader.dict_size,
+ args.embedding_size,
+ is_sparse=args.is_sparse,
+ neg_num=args.nce_num)
+
+ optimizer = fluid.optimizer.SGD(
+ learning_rate=fluid.layers.exponential_decay(
+ learning_rate=args.base_lr,
+ decay_steps=100000,
+ decay_rate=0.999,
+ staircase=True))
+
+ optimizer.minimize(loss)
+
+ logger.info("run dist training")
+
+ t = fluid.DistributeTranspiler()
+ t.transpile(
+ args.trainer_id, pservers=args.endpoints, trainers=args.trainers)
+ if args.role == "pserver":
+ print("run psever")
+ pserver_prog = t.get_pserver_program(args.current_endpoint)
+ pserver_startup = t.get_startup_program(args.current_endpoint,
+ pserver_prog)
+ exe = fluid.Executor(fluid.CPUPlace())
+ exe.run(pserver_startup)
+ exe.run(pserver_prog)
+ elif args.role == "trainer":
+ print("run trainer")
+ train_loop(args,
+ t.get_trainer_program(), word2vec_reader, py_reader, loss,
+ args.trainer_id, id_frequencys_pow)
+
+
+if __name__ == '__main__':
+ args = parse_args()
+ train(args)
diff --git a/PaddleRec/word2vec/cluster_train.sh b/PaddleRec/word2vec/cluster_train.sh
new file mode 100644
index 0000000000000000000000000000000000000000..756196fd41eeb52d5f43553664c824748ac83e4e
--- /dev/null
+++ b/PaddleRec/word2vec/cluster_train.sh
@@ -0,0 +1,68 @@
+#!/bin/bash
+
+#export GLOG_v=30
+#export GLOG_logtostderr=1
+
+# start pserver0
+export CPU_NUM=5
+export FLAGS_rpc_deadline=3000000
+python cluster_train.py \
+ --train_data_dir data/convert_text8 \
+ --dict_path data/test_build_dict \
+ --batch_size 100 \
+ --model_output_dir dis_model \
+ --base_lr 1.0 \
+ --print_batch 1 \
+ --is_sparse \
+ --with_speed \
+ --role pserver \
+ --endpoints 127.0.0.1:6000,127.0.0.1:6001 \
+ --current_endpoint 127.0.0.1:6000 \
+ --trainers 2 \
+ > pserver0.log 2>&1 &
+
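+# start pserver1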
+python cluster_train.py \
+ --train_data_dir data/convert_text8 \
+ --dict_path data/test_build_dict \
+ --batch_size 100 \
+ --model_output_dir dis_model \
+ --base_lr 1.0 \
+ --print_batch 1 \
+ --is_sparse \
+ --with_speed \
+ --role pserver \
+ --endpoints 127.0.0.1:6000,127.0.0.1:6001 \
+ --current_endpoint 127.0.0.1:6001 \
+ --trainers 2 \
+ > pserver1.log 2>&1 &
+
+# start trainer0
+python cluster_train.py \
+ --train_data_dir data/convert_text8 \
+ --dict_path data/test_build_dict \
+ --batch_size 100 \
+ --model_output_dir dis_model \
+ --base_lr 1.0 \
+ --print_batch 1000 \
+ --is_sparse \
+ --with_speed \
+ --role trainer \
+ --endpoints 127.0.0.1:6000,127.0.0.1:6001 \
+ --trainers 2 \
+ --trainer_id 0 \
+ > trainer0.log 2>&1 &
+# start trainer1
+python cluster_train.py \
+ --train_data_dir data/convert_text8 \
+ --dict_path data/test_build_dict \
+ --batch_size 100 \
+ --model_output_dir dis_model \
+ --base_lr 1.0 \
+ --print_batch 1000 \
+ --is_sparse \
+ --with_speed \
+ --role trainer \
+ --endpoints 127.0.0.1:6000,127.0.0.1:6001 \
+ --trainers 2 \
+ --trainer_id 1 \
+ > trainer1.log 2>&1 &
diff --git a/PaddleRec/word2vec/infer.py b/PaddleRec/word2vec/infer.py
new file mode 100644
index 0000000000000000000000000000000000000000..9a3649506c546b6ab91cf6f96ff063700679c43d
--- /dev/null
+++ b/PaddleRec/word2vec/infer.py
@@ -0,0 +1,213 @@
+import argparse
+import time
+import numpy as np
+import paddle.fluid as fluid
+import paddle
+import net
+import utils
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("PaddlePaddle Word2vec infer example")
+ parser.add_argument(
+ '--dict_path',
+ type=str,
+ default='./data/data_c/1-billion_dict_word_to_id_',
+ help="The path of dic")
+ parser.add_argument(
+ '--infer_epoch',
+ action='store_true',
+ required=False,
+ default=False,
+ help='infer by epoch')
+ parser.add_argument(
+ '--infer_step',
+ action='store_true',
+ required=False,
+ default=False,
+ help='infer by step')
+ parser.add_argument(
+ '--test_dir', type=str, default='test_data', help='test data directory')
+ parser.add_argument(
+ '--print_step', type=int, default=500000, help='print step')
+ parser.add_argument(
+ '--start_index', type=int, default=0, help='start index')
+ parser.add_argument(
+ '--start_batch', type=int, default=1, help='start batch')
+ parser.add_argument(
+ '--end_batch', type=int, default=13, help='end batch')
+ parser.add_argument(
+ '--last_index', type=int, default=100, help='last index')
+ parser.add_argument(
+ '--model_dir', type=str, default='model', help='model dir')
+ parser.add_argument(
+ '--use_cuda', type=int, default=0, help='whether to use cuda')
+ parser.add_argument(
+ '--batch_size', type=int, default=5, help='batch size')
+ parser.add_argument('--emb_size', type=int, default=64, help='embedding size')
+ args = parser.parse_args()
+ return args
+
+
+def infer_epoch(args, vocab_size, test_reader, use_cuda, i2w):
+ """ inference function """
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ emb_size = args.emb_size
+ batch_size = args.batch_size
+ with fluid.scope_guard(fluid.core.Scope()):
+ main_program = fluid.Program()
+ with fluid.program_guard(main_program):
+ values, pred = net.infer_network(vocab_size, emb_size)
+ for epoch in range(start_index, last_index + 1):
+ copy_program = main_program.clone()
+ model_path = model_dir + "/pass-" + str(epoch)
+ fluid.io.load_params(
+ executor=exe, dirname=model_path, main_program=copy_program)
+ accum_num = 0
+ accum_num_sum = 0.0
+ t0 = time.time()
+ step_id = 0
+ for data in test_reader():
+ step_id += 1
+ b_size = len([dat[0] for dat in data])
+ wa = np.array(
+ [dat[0] for dat in data]).astype("int64").reshape(
+ b_size, 1)
+ wb = np.array(
+ [dat[1] for dat in data]).astype("int64").reshape(
+ b_size, 1)
+ wc = np.array(
+ [dat[2] for dat in data]).astype("int64").reshape(
+ b_size, 1)
+
+ label = [dat[3] for dat in data]
+ input_word = [dat[4] for dat in data]
+ para = exe.run(
+ copy_program,
+ feed={
+ "analogy_a": wa,
+ "analogy_b": wb,
+ "analogy_c": wc,
+ "all_label":
+ np.arange(vocab_size).reshape(vocab_size, 1),
+ },
+ fetch_list=[pred.name, values],
+ return_numpy=False)
+ pre = np.array(para[0])
+ val = np.array(para[1])
+ for ii in range(len(label)):
+ top4 = pre[ii]
+ accum_num_sum += 1
+ for idx in top4:
+ if int(idx) in input_word[ii]:
+ continue
+ if int(idx) == int(label[ii][0]):
+ accum_num += 1
+ break
+ if step_id % 1 == 0:
+ print("step:%d %d " % (step_id, accum_num))
+
+ print("epoch:%d \t acc:%.3f " %
+ (epoch, 1.0 * accum_num / accum_num_sum))
+
+
+def infer_step(args, vocab_size, test_reader, use_cuda, i2w):
+ """ inference function """
+ place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ emb_size = args.emb_size
+ batch_size = args.batch_size
+ with fluid.scope_guard(fluid.core.Scope()):
+ main_program = fluid.Program()
+ with fluid.program_guard(main_program):
+ values, pred = net.infer_network(vocab_size, emb_size)
+ for epoch in range(start_index, last_index + 1):
+ for batchid in range(args.start_batch, args.end_batch):
+ copy_program = main_program.clone()
+ model_path = model_dir + "/pass-" + str(epoch) + (
+ '/batch-' + str(batchid * args.print_step))
+ fluid.io.load_params(
+ executor=exe,
+ dirname=model_path,
+ main_program=copy_program)
+ accum_num = 0
+ accum_num_sum = 0.0
+ t0 = time.time()
+ step_id = 0
+ for data in test_reader():
+ step_id += 1
+ b_size = len([dat[0] for dat in data])
+ wa = np.array(
+ [dat[0] for dat in data]).astype("int64").reshape(
+ b_size, 1)
+ wb = np.array(
+ [dat[1] for dat in data]).astype("int64").reshape(
+ b_size, 1)
+ wc = np.array(
+ [dat[2] for dat in data]).astype("int64").reshape(
+ b_size, 1)
+
+ label = [dat[3] for dat in data]
+ input_word = [dat[4] for dat in data]
+ para = exe.run(
+ copy_program,
+ feed={
+ "analogy_a": wa,
+ "analogy_b": wb,
+ "analogy_c": wc,
+ "all_label":
+ np.arange(vocab_size).reshape(vocab_size, 1),
+ },
+ fetch_list=[pred.name, values],
+ return_numpy=False)
+ pre = np.array(para[0])
+ val = np.array(para[1])
+ for ii in range(len(label)):
+ top4 = pre[ii]
+ accum_num_sum += 1
+ for idx in top4:
+ if int(idx) in input_word[ii]:
+ continue
+ if int(idx) == int(label[ii][0]):
+ accum_num += 1
+ break
+ if step_id % 1 == 0:
+ print("step:%d %d " % (step_id, accum_num))
+ print("epoch:%d \t acc:%.3f " %
+ (epoch, 1.0 * accum_num / accum_num_sum))
+ t1 = time.time()
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ start_index = args.start_index
+ last_index = args.last_index
+ test_dir = args.test_dir
+ model_dir = args.model_dir
+ batch_size = args.batch_size
+ dict_path = args.dict_path
+ use_cuda = True if args.use_cuda else False
+ print("start index: ", start_index, " last_index:", last_index)
+ vocab_size, test_reader, id2word = utils.prepare_data(
+ test_dir, dict_path, batch_size=batch_size)
+ print("vocab_size:", vocab_size)
+ if args.infer_step:
+ infer_step(
+ args,
+ vocab_size,
+ test_reader=test_reader,
+ use_cuda=use_cuda,
+ i2w=id2word)
+ else:
+ infer_epoch(
+ args,
+ vocab_size,
+ test_reader=test_reader,
+ use_cuda=use_cuda,
+ i2w=id2word)
diff --git a/PaddleRec/word2vec/net.py b/PaddleRec/word2vec/net.py
new file mode 100644
index 0000000000000000000000000000000000000000..ab2abbc76bde8e03c9a6e1e0abb062aa467d2c91
--- /dev/null
+++ b/PaddleRec/word2vec/net.py
@@ -0,0 +1,136 @@
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+"""
+neural network for word2vec
+"""
+from __future__ import print_function
+import math
+import numpy as np
+import paddle.fluid as fluid
+
+
+def skip_gram_word2vec(dict_size, embedding_size, is_sparse=False, neg_num=5):
+
+ datas = []
+ input_word = fluid.layers.data(name="input_word", shape=[1], dtype='int64')
+ true_word = fluid.layers.data(name='true_label', shape=[1], dtype='int64')
+ neg_word = fluid.layers.data(
+ name="neg_label", shape=[neg_num], dtype='int64')
+
+ datas.append(input_word)
+ datas.append(true_word)
+ datas.append(neg_word)
+
+ py_reader = fluid.layers.create_py_reader_by_data(
+ capacity=64, feed_list=datas, name='py_reader', use_double_buffer=True)
+
+ words = fluid.layers.read_file(py_reader)
+ init_width = 0.5 / embedding_size
+ input_emb = fluid.layers.embedding(
+ input=words[0],
+ is_sparse=is_sparse,
+ size=[dict_size, embedding_size],
+ param_attr=fluid.ParamAttr(
+ name='emb',
+ initializer=fluid.initializer.Uniform(-init_width, init_width)))
+
+ true_emb_w = fluid.layers.embedding(
+ input=words[1],
+ is_sparse=is_sparse,
+ size=[dict_size, embedding_size],
+ param_attr=fluid.ParamAttr(
+ name='emb_w', initializer=fluid.initializer.Constant(value=0.0)))
+
+ true_emb_b = fluid.layers.embedding(
+ input=words[1],
+ is_sparse=is_sparse,
+ size=[dict_size, 1],
+ param_attr=fluid.ParamAttr(
+ name='emb_b', initializer=fluid.initializer.Constant(value=0.0)))
+ neg_word_reshape = fluid.layers.reshape(words[2], shape=[-1, 1])
+ neg_word_reshape.stop_gradient = True
+
+ neg_emb_w = fluid.layers.embedding(
+ input=neg_word_reshape,
+ is_sparse=is_sparse,
+ size=[dict_size, embedding_size],
+ param_attr=fluid.ParamAttr(
+ name='emb_w', learning_rate=1.0))
+
+ neg_emb_w_re = fluid.layers.reshape(
+ neg_emb_w, shape=[-1, neg_num, embedding_size])
+ neg_emb_b = fluid.layers.embedding(
+ input=neg_word_reshape,
+ is_sparse=is_sparse,
+ size=[dict_size, 1],
+ param_attr=fluid.ParamAttr(
+ name='emb_b', learning_rate=1.0))
+
+ neg_emb_b_vec = fluid.layers.reshape(neg_emb_b, shape=[-1, neg_num])
+ true_logits = fluid.layers.elementwise_add(
+ fluid.layers.reduce_sum(
+ fluid.layers.elementwise_mul(input_emb, true_emb_w),
+ dim=1,
+ keep_dim=True),
+ true_emb_b)
+ input_emb_re = fluid.layers.reshape(
+ input_emb, shape=[-1, 1, embedding_size])
+ neg_matmul = fluid.layers.matmul(
+ input_emb_re, neg_emb_w_re, transpose_y=True)
+ neg_matmul_re = fluid.layers.reshape(neg_matmul, shape=[-1, neg_num])
+ neg_logits = fluid.layers.elementwise_add(neg_matmul_re, neg_emb_b_vec)
+ #nce loss
+
+ label_ones = fluid.layers.fill_constant_batch_size_like(
+ true_logits, shape=[-1, 1], value=1.0, dtype='float32')
+ label_zeros = fluid.layers.fill_constant_batch_size_like(
+ true_logits, shape=[-1, neg_num], value=0.0, dtype='float32')
+
+ true_xent = fluid.layers.sigmoid_cross_entropy_with_logits(true_logits,
+ label_ones)
+ neg_xent = fluid.layers.sigmoid_cross_entropy_with_logits(neg_logits,
+ label_zeros)
+ cost = fluid.layers.elementwise_add(
+ fluid.layers.reduce_sum(
+ true_xent, dim=1),
+ fluid.layers.reduce_sum(
+ neg_xent, dim=1))
+ avg_cost = fluid.layers.reduce_mean(cost)
+ return avg_cost, py_reader
+
+
+def infer_network(vocab_size, emb_size):
+ analogy_a = fluid.layers.data(name="analogy_a", shape=[1], dtype='int64')
+ analogy_b = fluid.layers.data(name="analogy_b", shape=[1], dtype='int64')
+ analogy_c = fluid.layers.data(name="analogy_c", shape=[1], dtype='int64')
+ all_label = fluid.layers.data(
+ name="all_label",
+ shape=[vocab_size, 1],
+ dtype='int64',
+ append_batch_size=False)
+ emb_all_label = fluid.layers.embedding(
+ input=all_label, size=[vocab_size, emb_size], param_attr="emb")
+
+ emb_a = fluid.layers.embedding(
+ input=analogy_a, size=[vocab_size, emb_size], param_attr="emb")
+ emb_b = fluid.layers.embedding(
+ input=analogy_b, size=[vocab_size, emb_size], param_attr="emb")
+ emb_c = fluid.layers.embedding(
+ input=analogy_c, size=[vocab_size, emb_size], param_attr="emb")
+ target = fluid.layers.elementwise_add(
+ fluid.layers.elementwise_sub(emb_b, emb_a), emb_c)
+ emb_all_label_l2 = fluid.layers.l2_normalize(x=emb_all_label, axis=1)
+ dist = fluid.layers.matmul(x=target, y=emb_all_label_l2, transpose_y=True)
+ values, pred_idx = fluid.layers.topk(input=dist, k=4)
+ return values, pred_idx
diff --git a/PaddleRec/word2vec/preprocess.py b/PaddleRec/word2vec/preprocess.py
new file mode 100644
index 0000000000000000000000000000000000000000..5af174e7f99b0496f9d24c4e83eede9bd6b59d24
--- /dev/null
+++ b/PaddleRec/word2vec/preprocess.py
@@ -0,0 +1,187 @@
+# -*- coding: utf-8 -*
+import os
+import random
+import re
+import six
+import argparse
+import io
+import math
+prog = re.compile("[^a-z ]", flags=0)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser(
+ description="Paddle Fluid word2 vector preprocess")
+ parser.add_argument(
+ '--build_dict_corpus_dir', type=str, help="The dir of corpus")
+ parser.add_argument(
+ '--input_corpus_dir', type=str, help="The dir of input corpus")
+ parser.add_argument(
+ '--output_corpus_dir', type=str, help="The dir of output corpus")
+ parser.add_argument(
+ '--dict_path',
+ type=str,
+ default='./dict',
+ help="The path of dictionary ")
+ parser.add_argument(
+ '--min_count',
+ type=int,
+ default=5,
+ help="If the word count is less then min_count, it will be removed from dict"
+ )
+ parser.add_argument(
+ '--downsample',
+ type=float,
+ default=0.001,
+ help="filter word by downsample")
+ parser.add_argument(
+ '--filter_corpus',
+ action='store_true',
+ default=False,
+ help='Filter corpus')
+ parser.add_argument(
+ '--build_dict',
+ action='store_true',
+ default=False,
+ help='Build dict from corpus')
+ return parser.parse_args()
+
+
+def text_strip(text):
+ # English preprocessing rule: keep only lowercase letters and spaces
+ return prog.sub("", text.lower())
+
+
+# Shameless copy from Tensorflow https://github.com/tensorflow/tensor2tensor/blob/master/tensor2tensor/data_generators/text_encoder.py
+# Unicode utility functions that work with Python 2 and 3
+def native_to_unicode(s):
+ if _is_unicode(s):
+ return s
+ try:
+ return _to_unicode(s)
+ except UnicodeDecodeError:
+ res = _to_unicode(s, ignore_errors=True)
+ return res
+
+
+def _is_unicode(s):
+ if six.PY2:
+ if isinstance(s, unicode):
+ return True
+ else:
+ if isinstance(s, str):
+ return True
+ return False
+
+
+def _to_unicode(s, ignore_errors=False):
+ if _is_unicode(s):
+ return s
+ error_mode = "ignore" if ignore_errors else "strict"
+ return s.decode("utf-8", errors=error_mode)
+
+
+def filter_corpus(args):
+ """
+ filter corpus and convert id.
+ """
+ word_count = dict()
+ word_to_id_ = dict()
+ word_all_count = 0
+ id_counts = []
+ word_id = 0
+ #read dict
+ with io.open(args.dict_path, 'r', encoding='utf-8') as f:
+ for line in f:
+ word, count = line.split()[0], int(line.split()[1])
+ word_count[word] = count
+ word_to_id_[word] = word_id
+ word_id += 1
+ id_counts.append(count)
+ word_all_count += count
+
+ #filter corpus and convert id
+ if not os.path.exists(args.output_corpus_dir):
+ os.makedirs(args.output_corpus_dir)
+ for file in os.listdir(args.input_corpus_dir):
+ with io.open(args.output_corpus_dir + '/convert_' + file, "w") as wf:
+ with io.open(
+ args.input_corpus_dir + '/' + file, encoding='utf-8') as rf:
+ print(args.input_corpus_dir + '/' + file)
+ for line in rf:
+ signal = False
+ line = text_strip(line)
+ words = line.split()
+ for item in words:
+ if item in word_count:
+ idx = word_to_id_[item]
+ else:
+ idx = word_to_id_[native_to_unicode('<UNK>')]
+ count_w = id_counts[idx]
+ corpus_size = word_all_count
+ # word2vec subsampling: keep a word with probability
+ # (sqrt(f / t) + 1) * t / f, where f is the word count
+ # and t = downsample * corpus_size
+ keep_prob = (
+ math.sqrt(count_w /
+ (args.downsample * corpus_size)) + 1
+ ) * (args.downsample * corpus_size) / count_w
+ r_value = random.random()
+ if r_value > keep_prob:
+ continue
+ wf.write(_to_unicode(str(idx) + " "))
+ signal = True
+ if signal:
+ wf.write(_to_unicode("\n"))
+
+
+def build_dict(args):
+ """
+ preprocess the data, generate the dictionary and save it into dict_path.
+ :param corpus_dir: the input data dir.
+ :param dict_path: the generated dict path. the data in dict is "word count"
+ :param min_count:
+ :return:
+ """
+ # word to count
+
+ word_count = dict()
+
+ for file in os.listdir(args.build_dict_corpus_dir):
+ with io.open(
+ args.build_dict_corpus_dir + "/" + file, encoding='utf-8') as f:
+ print("build dict : ", args.build_dict_corpus_dir + "/" + file)
+ for line in f:
+ line = text_strip(line)
+ words = line.split()
+ for item in words:
+ if item in word_count:
+ word_count[item] = word_count[item] + 1
+ else:
+ word_count[item] = 1
+
+ item_to_remove = []
+ for item in word_count:
+ if word_count[item] <= args.min_count:
+ item_to_remove.append(item)
+
+ unk_sum = 0
+ for item in item_to_remove:
+ unk_sum += word_count[item]
+ del word_count[item]
+ #sort by count
+ word_count[native_to_unicode('<UNK>')] = unk_sum
+ word_count = sorted(
+ word_count.items(), key=lambda word_count: -word_count[1])
+
+ with io.open(args.dict_path, 'w+', encoding='utf-8') as f:
+ for k, v in word_count:
+ f.write(k + " " + str(v) + '\n')
+
+
+if __name__ == "__main__":
+ args = parse_args()
+ if args.build_dict:
+ build_dict(args)
+ elif args.filter_corpus:
+ filter_corpus(args)
+ else:
+ print(
+ "error command line, please choose --build_dict or --filter_corpus")
diff --git a/PaddleRec/word2vec/reader.py b/PaddleRec/word2vec/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..ea3352ac28dc9685c6b0043170063003f95fc1d2
--- /dev/null
+++ b/PaddleRec/word2vec/reader.py
@@ -0,0 +1,117 @@
+# -*- coding: utf-8 -*
+
+import numpy as np
+import preprocess
+import logging
+import math
+import random
+import io
+
+logging.basicConfig(format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+class NumpyRandomInt(object):
+ def __init__(self, a, b, buf_size=1000):
+ self.idx = 0
+ # np.random.randint excludes the high bound, hence b + 1
+ self.buffer = np.random.randint(a, b + 1, buf_size)
+ self.a = a
+ self.b = b
+
+ def __call__(self):
+ if self.idx == len(self.buffer):
+ self.buffer = np.random.randint(self.a, self.b + 1,
+ len(self.buffer))
+ self.idx = 0
+
+ result = self.buffer[self.idx]
+ self.idx += 1
+ return result
+
+
+class Word2VecReader(object):
+ def __init__(self,
+ dict_path,
+ data_path,
+ filelist,
+ trainer_id,
+ trainer_num,
+ window_size=5):
+ self.window_size_ = window_size
+ self.data_path_ = data_path
+ self.filelist = filelist
+ self.word_to_id_ = dict()
+ self.id_to_word = dict()
+ self.word_count = dict()
+ self.trainer_id = trainer_id
+ self.trainer_num = trainer_num
+
+ word_all_count = 0
+ id_counts = []
+ word_id = 0
+
+ with io.open(dict_path, 'r', encoding='utf-8') as f:
+ for line in f:
+ word, count = line.split()[0], int(line.split()[1])
+ self.word_count[word] = count
+ self.word_to_id_[word] = word_id
+ self.id_to_word[word_id] = word #build id to word dict
+ word_id += 1
+ id_counts.append(count)
+ word_all_count += count
+
+ self.word_all_count = word_all_count
+ self.corpus_size_ = word_all_count
+ self.dict_size = len(self.word_to_id_)
+ self.id_counts_ = id_counts
+ #write word2id file
+ print("write word2id file to : " + dict_path + "_word_to_id_")
+ with io.open(dict_path + "_word_to_id_", 'w+', encoding='utf-8') as f6:
+ for k, v in self.word_to_id_.items():
+ f6.write(k + " " + str(v) + '\n')
+
+ print("corpus_size:", self.corpus_size_)
+ self.id_frequencys = [
+ float(count) / word_all_count for count in self.id_counts_
+ ]
+ print("dict_size = " + str(self.dict_size) + " word_all_count = " + str(
+ word_all_count))
+
+ self.random_generator = NumpyRandomInt(1, self.window_size_ + 1)
+
+ def get_context_words(self, words, idx):
+ """
+ Get the context word list of target word.
+ words: the words of the current line
+ idx: input word index
+ window_size: window size
+ """
+ target_window = self.random_generator()
+ start_point = idx - target_window # if (idx - target_window) > 0 else 0
+ if start_point < 0:
+ start_point = 0
+ end_point = idx + target_window
+ targets = words[start_point:idx] + words[idx + 1:end_point + 1]
+ return targets
+
+ def train(self):
+ def nce_reader():
+ for file in self.filelist:
+ with io.open(
+ self.data_path_ + "/" + file, 'r',
+ encoding='utf-8') as f:
+ logger.info("running data in {}".format(self.data_path_ +
+ "/" + file))
+ count = 1
+ for line in f:
+ if self.trainer_id == count % self.trainer_num:
+ word_ids = [int(w) for w in line.split()]
+ for idx, target_id in enumerate(word_ids):
+ context_word_ids = self.get_context_words(
+ word_ids, idx)
+ for context_id in context_word_ids:
+ yield [target_id], [context_id]
+ count += 1
+
+ return nce_reader
diff --git a/PaddleRec/word2vec/train.py b/PaddleRec/word2vec/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..430ec132d2f810eed0025f16e9b87a8f742c455c
--- /dev/null
+++ b/PaddleRec/word2vec/train.py
@@ -0,0 +1,228 @@
+from __future__ import print_function
+import argparse
+import logging
+import os
+import time
+import math
+import random
+import numpy as np
+import paddle
+import paddle.fluid as fluid
+import six
+import reader
+from net import skip_gram_word2vec
+
+logging.basicConfig(format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger("fluid")
+logger.setLevel(logging.INFO)
+
+
+def parse_args():
+ parser = argparse.ArgumentParser(
+ description="PaddlePaddle Word2vec example")
+ parser.add_argument(
+ '--train_data_dir',
+ type=str,
+ default='./data/text',
+ help="The path of taining dataset")
+ parser.add_argument(
+ '--base_lr',
+ type=float,
+ default=0.01,
+ help="The number of learing rate (default: 0.01)")
+ parser.add_argument(
+ '--save_step',
+ type=int,
+ default=500000,
+ help="The number of step to save (default: 500000)")
+ parser.add_argument(
+ '--print_batch',
+ type=int,
+ default=10,
+ help="The number of print_batch (default: 10)")
+ parser.add_argument(
+ '--dict_path',
+ type=str,
+ default='./data/1-billion_dict',
+ help="The path of data dict")
+ parser.add_argument(
+ '--batch_size',
+ type=int,
+ default=500,
+ help="The size of mini-batch (default:500)")
+ parser.add_argument(
+ '--num_passes',
+ type=int,
+ default=10,
+ help="The number of passes to train (default: 10)")
+ parser.add_argument(
+ '--model_output_dir',
+ type=str,
+ default='models',
+ help='The path for model to store (default: models)')
+ parser.add_argument('--nce_num', type=int, default=5, help='number of negative samples for NCE')
+ parser.add_argument(
+ '--embedding_size',
+ type=int,
+ default=64,
+ help='The size of the word embedding (default: 64)')
+ parser.add_argument(
+ '--is_sparse',
+ action='store_true',
+ required=False,
+ default=False,
+ help='whether the embedding and NCE layers use sparse updates (default: False)')
+ parser.add_argument(
+ '--with_speed',
+ action='store_true',
+ required=False,
+ default=False,
+ help='whether to print training speed (default: False)')
+ return parser.parse_args()
+
+
+def convert_python_to_tensor(weight, batch_size, sample_reader):
+ def __reader__():
+ cs = np.array(weight).cumsum()
+ result = [[], []]
+ for sample in sample_reader():
+ for i, fea in enumerate(sample):
+ result[i].append(fea)
+ if len(result[0]) == batch_size:
+ tensor_result = []
+ for tensor in result:
+ t = fluid.Tensor()
+ dat = np.array(tensor, dtype='int64')
+ if len(dat.shape) > 2:
+ dat = dat.reshape((dat.shape[0], dat.shape[2]))
+ elif len(dat.shape) == 1:
+ dat = dat.reshape((-1, 1))
+ t.set(dat, fluid.CPUPlace())
+ tensor_result.append(t)
+ tt = fluid.Tensor()
+ neg_array = cs.searchsorted(np.random.sample(args.nce_num))
+ neg_array = np.tile(neg_array, batch_size)
+ tt.set(
+ neg_array.reshape((batch_size, args.nce_num)),
+ fluid.CPUPlace())
+ tensor_result.append(tt)
+ yield tensor_result
+ result = [[], []]
+
+ return __reader__
+
+
+def train_loop(args, train_program, reader, py_reader, loss, trainer_id,
+ weight):
+
+ py_reader.decorate_tensor_provider(
+ convert_python_to_tensor(weight, args.batch_size, reader.train()))
+
+ place = fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ exec_strategy = fluid.ExecutionStrategy()
+ exec_strategy.use_experimental_executor = True
+
+ print("CPU_NUM:" + str(os.getenv("CPU_NUM")))
+ exec_strategy.num_threads = int(os.getenv("CPU_NUM"))
+
+ build_strategy = fluid.BuildStrategy()
+ if int(os.getenv("CPU_NUM")) > 1:
+ build_strategy.reduce_strategy = fluid.BuildStrategy.ReduceStrategy.Reduce
+
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=False,
+ loss_name=loss.name,
+ main_program=train_program,
+ build_strategy=build_strategy,
+ exec_strategy=exec_strategy)
+
+ for pass_id in range(args.num_passes):
+ py_reader.start()
+ # give the async reader time to fill its queue
+ time.sleep(10)
+ epoch_start = time.time()
+ batch_id = 0
+ start = time.time()
+ try:
+ while True:
+
+ loss_val = train_exe.run(fetch_list=[loss.name])
+ loss_val = np.mean(loss_val)
+
+ if batch_id % args.print_batch == 0:
+ logger.info(
+ "TRAIN --> pass: {} batch: {} loss: {} reader queue:{}".
+ format(pass_id, batch_id,
+ loss_val.mean(), py_reader.queue.size()))
+ if args.with_speed:
+ if batch_id % 500 == 0 and batch_id != 0:
+ elapsed = (time.time() - start)
+ start = time.time()
+ # 500 iterations since the last log, one batch per CPU thread each
+ samples = 500 * args.batch_size * int(
+ os.getenv("CPU_NUM"))
+ logger.info("Time used: {}, Samples/Sec: {}".format(
+ elapsed, samples / elapsed))
+
+ if batch_id % args.save_step == 0 and batch_id != 0:
+ model_dir = args.model_output_dir + '/pass-' + str(
+ pass_id) + ('/batch-' + str(batch_id))
+ if trainer_id == 0:
+ fluid.io.save_params(executor=exe, dirname=model_dir)
+ print("model saved in %s" % model_dir)
+ batch_id += 1
+
+ except fluid.core.EOFException:
+ py_reader.reset()
+ epoch_end = time.time()
+ logger.info("Epoch: {0}, Train total expend: {1} ".format(
+ pass_id, epoch_end - epoch_start))
+ model_dir = args.model_output_dir + '/pass-' + str(pass_id)
+ if trainer_id == 0:
+ fluid.io.save_params(executor=exe, dirname=model_dir)
+ print("model saved in %s" % model_dir)
+
+
+def GetFileList(data_path):
+ return os.listdir(data_path)
+
+
+def train(args):
+
+ if not os.path.isdir(args.model_output_dir):
+ os.mkdir(args.model_output_dir)
+
+ filelist = GetFileList(args.train_data_dir)
+ word2vec_reader = reader.Word2VecReader(args.dict_path, args.train_data_dir,
+ filelist, 0, 1)
+
+ logger.info("dict_size: {}".format(word2vec_reader.dict_size))
+ np_power = np.power(np.array(word2vec_reader.id_frequencys), 0.75)
+ id_frequencys_pow = np_power / np_power.sum()
+
+ loss, py_reader = skip_gram_word2vec(
+ word2vec_reader.dict_size,
+ args.embedding_size,
+ is_sparse=args.is_sparse,
+ neg_num=args.nce_num)
+
+ optimizer = fluid.optimizer.SGD(
+ learning_rate=fluid.layers.exponential_decay(
+ learning_rate=args.base_lr,
+ decay_steps=100000,
+ decay_rate=0.999,
+ staircase=True))
+
+ optimizer.minimize(loss)
+
+ # do local training
+ logger.info("run local training")
+ main_program = fluid.default_main_program()
+ train_loop(args, main_program, word2vec_reader, py_reader, loss, 0,
+ id_frequencys_pow)
+
+
+if __name__ == '__main__':
+ args = parse_args()
+ train(args)
diff --git a/PaddleRec/word2vec/utils.py b/PaddleRec/word2vec/utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..01cd04e493b09e880303d7b0c87f5ed71cf86357
--- /dev/null
+++ b/PaddleRec/word2vec/utils.py
@@ -0,0 +1,96 @@
+import sys
+import collections
+import six
+import time
+import numpy as np
+import paddle.fluid as fluid
+import paddle
+import os
+import preprocess
+
+
+def BuildWord_IdMap(dict_path):
+ word_to_id = dict()
+ id_to_word = dict()
+ with open(dict_path, 'r') as f:
+ for line in f:
+ word_to_id[line.split(' ')[0]] = int(line.split(' ')[1])
+ id_to_word[int(line.split(' ')[1])] = line.split(' ')[0]
+ return word_to_id, id_to_word
+
+
+def prepare_data(file_dir, dict_path, batch_size):
+ w2i, i2w = BuildWord_IdMap(dict_path)
+ vocab_size = len(i2w)
+ reader = paddle.batch(test(file_dir, w2i), batch_size)
+ return vocab_size, reader, i2w
+
+
+def native_to_unicode(s):
+ if _is_unicode(s):
+ return s
+ try:
+ return _to_unicode(s)
+ except UnicodeDecodeError:
+ res = _to_unicode(s, ignore_errors=True)
+ return res
+
+
+def _is_unicode(s):
+ if six.PY2:
+ if isinstance(s, unicode):
+ return True
+ else:
+ if isinstance(s, str):
+ return True
+ return False
+
+
+def _to_unicode(s, ignore_errors=False):
+ if _is_unicode(s):
+ return s
+ error_mode = "ignore" if ignore_errors else "strict"
+ return s.decode("utf-8", errors=error_mode)
+
+
+def strip_lines(line, vocab):
+ return _replace_oov(vocab, native_to_unicode(line))
+
+
+def _replace_oov(original_vocab, line):
+ """Replace out-of-vocab words with "".
+ This maintains compatibility with published results.
+ Args:
+ original_vocab: a set of strings (The standard vocabulary for the dataset)
+ line: a unicode string - a space-delimited sequence of words.
+ Returns:
+ a unicode string - a space-delimited sequence of words.
+ """
+ return u" ".join([
+ word if word in original_vocab else u"<UNK>" for word in line.split()
+ ])
+
+
+def reader_creator(file_dir, word_to_id):
+ def reader():
+ files = os.listdir(file_dir)
+ for fi in files:
+ with open(file_dir + '/' + fi, "r") as f:
+ for line in f:
+ if ':' in line:
+ # skip header lines of the analogy test file
+ continue
+ line = strip_lines(line.lower(), word_to_id)
+ line = line.split()
+ yield [word_to_id[line[0]]], [word_to_id[line[1]]], [
+ word_to_id[line[2]]
+ ], [word_to_id[line[3]]], [
+ word_to_id[line[0]], word_to_id[line[1]],
+ word_to_id[line[2]]
+ ]
+
+ return reader
+
+
+def test(test_dir, w2i):
+ return reader_creator(test_dir, w2i)
diff --git a/PaddleSlim/README.md b/PaddleSlim/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..342cc006edc04efcadbf43f944058c35261fa9f0
--- /dev/null
+++ b/PaddleSlim/README.md
@@ -0,0 +1,177 @@
+---
+# PaddleSlim model compression toolkit
+
+PaddleSlim is a submodule of the PaddlePaddle framework, first released with PaddlePaddle 1.4. It implements the three mainstream compression strategies (network pruning, quantization, and distillation) and is mainly used to compress models in the image domain. Later versions will add more compression strategies and improve support for NLP models.
+
+## Table of contents
+- [Features](#features)
+- [Architecture](#architecture)
+- [Feature list](#feature-list)
+- [Experimental results and ModelZoo](#experimental-results)
+- [Model export formats](#model-export-formats)
+
+## Features
+
+The Paddle-Slim toolkit has the following features:
+
+### Simple interface
+
+- Configurable parameters are managed centrally in configuration files, which keeps experiments manageable
+- Adding very little code to an ordinary training script is enough to compress a model
+
+See: [usage demos](docs/demo.md)
+
+### Strong results
+
+- Even for MobileNetV1, a model with little redundancy, the filter pruning strategy still shrinks the model while keeping the accuracy loss as small as possible.
+- The distillation strategy can noticeably improve the accuracy of the original model.
+- Combining quantization-aware training with distillation shrinks the model and improves its accuracy at the same time.
+
+See: [results and ModelZoo](docs/model_zoo.md)
+
+### More capable and flexible
+
+- The pruning process is automated
+- Pruning supports more network architectures
+- Distillation supports several modes, and users can combine custom losses
+- Multiple compression strategies can be combined through quick configuration
+
+See: [usage guide](docs/usage.md)
+
+## Architecture
+
+This section briefly explains how the compression toolkit is implemented, which helps in understanding the usage workflow.
+**Figure 1** shows the architecture of the toolkit; from top to bottom the modules form an API dependency chain. The distillation, quantization, and pruning modules all depend indirectly on the underlying Paddle framework. The toolkit currently ships as part of the PaddlePaddle framework, so users who installed a regular Paddle build need to reinstall a build with compression support before using these features.
+
+*Figure 1: architecture of the compression toolkit (image not shown here)*
+
+As shown in **Figure 1**, the purple modules at the top are the user interface. To run model compression from a Python script, you only need to construct a Compressor object, as sketched below; the [usage guide](docs/usage.md) explains this in detail.
+
+We call each compression algorithm a compression strategy. While the model is trained iteratively, the toolkit invokes the user-registered strategies to compress it, as shown in **Figure 2**. The toolkit encapsulates the training logic, so users only need to provide the network definition, the data, and the optimizer; the [usage guide](docs/usage.md) covers this in detail.
+
+*Figure 2: compression strategies invoked during iterative training (image not shown here)*
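+
+A minimal sketch of driving a compression task, modeled on `compress.py` in this repository (the readers, feed/fetch lists, optimizer, and config path are placeholders you would build beforehand):
+
+```python
+import paddle.fluid as fluid
+from paddle.fluid.contrib.slim import Compressor
+
+# train/eval programs, readers and the optimizer are assumed to be
+# built beforehand, exactly as in compress.py
+compressor = Compressor(
+    fluid.CUDAPlace(0),                    # or fluid.CPUPlace()
+    fluid.global_scope(),
+    fluid.default_main_program(),
+    train_reader=train_reader,
+    train_feed_list=[('image', image.name), ('label', label.name)],
+    train_fetch_list=[('loss', avg_cost.name)],
+    eval_program=val_program,
+    eval_reader=val_reader,
+    eval_feed_list=[('image', image.name), ('label', label.name)],
+    eval_fetch_list=[('acc_top1', acc_top1.name)],
+    teacher_programs=[],                   # filled for distillation tasks
+    train_optimizer=opt,
+    distiller_optimizer=None)
+compressor.config('configs/quantization.yaml')  # pick a strategy config
+compressor.run()
+```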
+
+## Feature list
+
+### Pruning
+
+- Supports both sensitivity-based and uniform pruning (see the L1-norm ranking sketch after this list)
+- Supports many kinds of networks, such as VGG, ResNet, and MobileNet
+- Supports user-defined pruning ranges
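+
+The `l1_norm` criterion used by the pruning configs in `configs/` ranks convolution filters by the L1 norm of their weights, and the lowest-norm filters are pruned first. A minimal numpy sketch of that ranking (illustrative, not the toolkit's internal code):
+
+```python
+import numpy as np
+
+def rank_filters_l1(conv_weight):
+    """Rank filters of a conv weight of shape (out_c, in_c, kh, kw)."""
+    scores = np.abs(conv_weight).reshape(conv_weight.shape[0], -1).sum(axis=1)
+    return np.argsort(scores)  # lowest-scoring filters are pruned first
+```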
+
+### Quantization-aware training
+
+- Supports dynamic and static quantization training (see the abs_max sketch after this list)
+  - Dynamic strategy: quantization parameters for activations are computed on the fly at inference time.
+  - Static strategy: at inference time, the same quantization parameters, estimated from the training data, are used for every input.
+- Supports global and channel-wise quantization of weights
+- Supports saving models in a Paddle Mobile compatible format
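+
+The abs_max quantizer named in the configs below maps float tensors to int8 using the maximum absolute value as the scale. A minimal numpy sketch of the idea (illustrative, not the toolkit's internal implementation):
+
+```python
+import numpy as np
+
+def quantize_abs_max(x, num_bits=8):
+    """Symmetric abs_max quantization: float tensor -> (int8 tensor, scale)."""
+    scale = np.abs(x).max()
+    qmax = 2 ** (num_bits - 1) - 1         # 127 for int8
+    q = np.round(x / scale * qmax).astype(np.int8)
+    return q, scale
+
+def dequantize(q, scale, num_bits=8):
+    qmax = 2 ** (num_bits - 1) - 1
+    return q.astype(np.float32) * scale / qmax
+```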
+
+### Distillation
+
+- Supports attaching combined losses to arbitrary layers of the teacher and student networks (see the FSP sketch after this list)
+  - FSP loss
+  - L2 loss
+  - softmax with cross-entropy loss
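+
+The FSP loss used by the `FSPDistiller` in the configs below is the mean squared distance between "flow" (Gram-style) matrices computed from pairs of feature maps in the teacher and the student. A minimal numpy sketch under the usual FSP definition, assuming the paired maps share spatial size (illustrative, not the toolkit's internal code):
+
+```python
+import numpy as np
+
+def fsp_matrix(fa, fb):
+    """FSP matrix of two feature maps of shapes (C1, H, W) and (C2, H, W)."""
+    c1, h, w = fa.shape
+    c2 = fb.shape[0]
+    return fa.reshape(c1, h * w).dot(fb.reshape(c2, h * w).T) / (h * w)
+
+def fsp_loss(teacher_pair, student_pair):
+    """Mean squared distance between teacher and student FSP matrices."""
+    mt = fsp_matrix(*teacher_pair)
+    ms = fsp_matrix(*student_pair)
+    return ((mt - ms) ** 2).mean()
+```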
+
+### Other features
+
+- Hyperparameters of a compression task are managed through configuration files
+- Multiple compression strategies can be combined
+- Distillation and pruning support training checkpoints
+
+## Experimental results
+
+This section lists brief experimental results for the PaddleSlim toolkit. For more data and pretrained-model downloads, see [detailed results and ModelZoo](docs/model_zoo.md).
+
+### Quantization-aware training
+
+Evaluation uses the ImageNet 1000-class dataset; each cell reports top-5/top-1 accuracy:
+
+| Model | FP32 | int8 (X:abs_max, W:abs_max) | int8 (X:moving_average_abs_max, W:abs_max) | int8 (X:abs_max, W:channel_wise_abs_max) |
+|:---|:---:|:---:|:---:|:---:|
+|MobileNetV1|89.54%/70.91%|89.64%/71.01%|89.58%/70.86%|89.75%/71.13%|
+|ResNet50|92.80%/76.35%|93.12%/76.77%|93.07%/76.65%|93.15%/76.80%|
+
+### Filter pruning
+
+Data: ImageNet 1000 classes
+Model: MobileNetV1
+Original model size: 17M
+Original accuracy (top5/top1): 89.54% / 70.91%
+
+#### Uniform pruning
+
+| FLOPS | Model size | Accuracy loss (top5/top1) | Accuracy (top5/top1) |
+|---|---|---|---|
+| -50%|-47.0% (9.0M)|-0.41% / -1.08%|89.13% / 69.83%|
+| -60%|-55.9% (7.5M)|-1.34% / -2.67%|88.22% / 68.24%|
+| -70%|-65.3% (5.9M)|-2.55% / -4.34%|86.99% / 66.57%|
+
+#### Sensitivity-based iterative pruning
+
+| FLOPS | Accuracy (top5/top1) |
+|---|---|
+| -0% |89.54% / 70.91% |
+| -20% |90.08% / 71.48% |
+| -36% |89.62% / 70.83%|
+| -50% |88.77% / 69.31%|
+
+### Distillation
+
+Data: ImageNet 1000 classes
+Model: MobileNetV1
+
+| - | Accuracy (top5/top1) | Gain (top5/top1) |
+|---|---|---|
+| Trained alone | 89.54% / 70.91% | - |
+| Distilled by ResNet50 | 90.92% / 71.97% | +1.28% / +1.06% |
+
+### Combined experiments
+
+Data: ImageNet 1000 classes
+Model: MobileNetV1
+
+| Compression strategy | Accuracy (top5/top1) | Model size |
+|---|---|---|
+| Baseline | 89.54% / 70.91% | 17.0M |
+| ResNet50 distillation | 90.92% / 71.97% | 17.0M |
+| ResNet50 distillation + quantization | 90.94% / 72.08% | 4.2M |
+| -50% FLOPS pruning | 89.13% / 69.83% | 9.0M |
+| -50% FLOPS pruning + quantization | 89.11% / 69.70% | 2.3M |
+
+## Model export formats
+
+The compression framework can export models in the following formats:
+
+- **Paddle Fluid model format:** the standard Paddle Fluid format, loadable directly by the Paddle framework.
+- **Paddle Mobile model format:** only produced by the quantization strategy; compatible with the [Paddle Mobile](https://github.com/PaddlePaddle/paddle-mobile) model format.
diff --git a/PaddleSlim/compress.py b/PaddleSlim/compress.py
new file mode 100644
index 0000000000000000000000000000000000000000..e983974826d1f75958ea43a868db1be546a39e2a
--- /dev/null
+++ b/PaddleSlim/compress.py
@@ -0,0 +1,152 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import time
+import sys
+import logging
+import paddle
+import models
+import argparse
+import functools
+import paddle.fluid as fluid
+import reader
+from utility import add_arguments, print_arguments
+
+from paddle.fluid.contrib.slim import Compressor
+
+logging.basicConfig(format='%(asctime)s-%(levelname)s: %(message)s')
+_logger = logging.getLogger(__name__)
+_logger.setLevel(logging.INFO)
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('batch_size', int, 64*4, "Minibatch size.")
+add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+add_arg('class_dim', int, 1000, "Class number.")
+add_arg('image_shape', str, "3,224,224", "Input image size")
+add_arg('model', str, "MobileNet", "Set the network to use.")
+add_arg('pretrained_model', str, None, "The path of the pretrained model.")
+add_arg('teacher_model', str, None, "Set the teacher network to use.")
+add_arg('teacher_pretrained_model', str, None, "The path of the teacher's pretrained model.")
+add_arg('compress_config', str, None, "The config file for compression with yaml format.")
+# yapf: enable
+
+model_list = [m for m in dir(models) if "__" not in m]
+
+
+def compress(args):
+ image_shape = [int(m) for m in args.image_shape.split(",")]
+
+ assert args.model in model_list, "{} is not in lists: {}".format(args.model,
+ model_list)
+ image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
+ label = fluid.layers.data(name='label', shape=[1], dtype='int64')
+ # model definition
+ model = models.__dict__[args.model]()
+
+ if args.model == "GoogleNet":
+ out0, out1, out2 = model.net(input=image, class_dim=args.class_dim)
+ cost0 = fluid.layers.cross_entropy(input=out0, label=label)
+ cost1 = fluid.layers.cross_entropy(input=out1, label=label)
+ cost2 = fluid.layers.cross_entropy(input=out2, label=label)
+ avg_cost0 = fluid.layers.mean(x=cost0)
+ avg_cost1 = fluid.layers.mean(x=cost1)
+ avg_cost2 = fluid.layers.mean(x=cost2)
+ avg_cost = avg_cost0 + 0.3 * avg_cost1 + 0.3 * avg_cost2
+ acc_top1 = fluid.layers.accuracy(input=out0, label=label, k=1)
+ acc_top5 = fluid.layers.accuracy(input=out0, label=label, k=5)
+ else:
+ out = model.net(input=image, class_dim=args.class_dim)
+ cost = fluid.layers.cross_entropy(input=out, label=label)
+ avg_cost = fluid.layers.mean(x=cost)
+ acc_top1 = fluid.layers.accuracy(input=out, label=label, k=1)
+ acc_top5 = fluid.layers.accuracy(input=out, label=label, k=5)
+ val_program = fluid.default_main_program().clone()
+
+ opt = fluid.optimizer.Momentum(
+ momentum=0.9,
+ learning_rate=fluid.layers.piecewise_decay(
+ boundaries=[5000 * 30, 5000 * 60, 5000 * 90],
+ values=[0.1, 0.01, 0.001, 0.0001]),
+ regularization=fluid.regularizer.L2Decay(4e-5))
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(fluid.default_startup_program())
+
+ if args.pretrained_model:
+
+ def if_exist(var):
+ return os.path.exists(os.path.join(args.pretrained_model, var.name))
+
+ fluid.io.load_vars(exe, args.pretrained_model, predicate=if_exist)
+
+ val_reader = paddle.batch(reader.val(), batch_size=args.batch_size)
+ val_feed_list = [('image', image.name), ('label', label.name)]
+ val_fetch_list = [('acc_top1', acc_top1.name), ('acc_top5', acc_top5.name)]
+
+ train_reader = paddle.batch(
+ reader.train(), batch_size=args.batch_size, drop_last=True)
+ train_feed_list = [('image', image.name), ('label', label.name)]
+ train_fetch_list = [('loss', avg_cost.name)]
+
+ teacher_programs = []
+ distiller_optimizer = None
+ if args.teacher_model:
+ teacher_model = models.__dict__[args.teacher_model]()
+ # define teacher program
+ teacher_program = fluid.Program()
+ startup_program = fluid.Program()
+ with fluid.program_guard(teacher_program, startup_program):
+ img = teacher_program.global_block()._clone_variable(
+ image, force_persistable=False)
+ predict = teacher_model.net(img,
+ class_dim=args.class_dim,
+ conv1_name='res_conv1',
+ fc_name='res_fc')
+ exe.run(startup_program)
+ assert args.teacher_pretrained_model and os.path.exists(
+ args.teacher_pretrained_model
+ ), "teacher_pretrained_model should be set when teacher_model is not None."
+
+ def if_exist(var):
+ return os.path.exists(
+ os.path.join(args.teacher_pretrained_model, var.name))
+
+ fluid.io.load_vars(
+ exe,
+ args.teacher_pretrained_model,
+ main_program=teacher_program,
+ predicate=if_exist)
+
+ distiller_optimizer = opt
+ teacher_programs.append(teacher_program.clone(for_test=True))
+
+ com_pass = Compressor(
+ place,
+ fluid.global_scope(),
+ fluid.default_main_program(),
+ train_reader=train_reader,
+ train_feed_list=train_feed_list,
+ train_fetch_list=train_fetch_list,
+ eval_program=val_program,
+ eval_reader=val_reader,
+ eval_feed_list=val_feed_list,
+ eval_fetch_list=val_fetch_list,
+ teacher_programs=teacher_programs,
+ train_optimizer=opt,
+ distiller_optimizer=distiller_optimizer)
+ com_pass.config(args.compress_config)
+ com_pass.run()
+
+
+def main():
+ args = parser.parse_args()
+ print_arguments(args)
+ compress(args)
+
+
+if __name__ == '__main__':
+ main()
diff --git a/PaddleSlim/configs/filter_pruning_sen.yaml b/PaddleSlim/configs/filter_pruning_sen.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..4537405fe200fad631095d0bd81d3de2fcb267bc
--- /dev/null
+++ b/PaddleSlim/configs/filter_pruning_sen.yaml
@@ -0,0 +1,25 @@
+version: 1.0
+pruners:
+ pruner_1:
+ class: 'StructurePruner'
+ pruning_axis:
+ '*': 0
+ criterions:
+ '*': 'l1_norm'
+strategies:
+ sensitive_pruning_strategy:
+ class: 'SensitivePruneStrategy'
+ pruner: 'pruner_1'
+ start_epoch: 0
+ delta_rate: 0.1
+ target_ratio: 0.5
+ num_steps: 1
+# eval_rate: 0.2
+ pruned_params: '.*_sep_weights'
+ sensitivities_file: 'mobilenet_acc_top1_sensitive.data'
+ metric_name: 'acc_top1'
+compressor:
+ epoch: 200
+ checkpoint_path: './checkpoints/'
+ strategies:
+ - sensitive_pruning_strategy
diff --git a/PaddleSlim/configs/filter_pruning_uniform.yaml b/PaddleSlim/configs/filter_pruning_uniform.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..1dea1070f4456e7150ed1a2706cde072180bf9ff
--- /dev/null
+++ b/PaddleSlim/configs/filter_pruning_uniform.yaml
@@ -0,0 +1,21 @@
+version: 1.0
+pruners:
+ pruner_1:
+ class: 'StructurePruner'
+ pruning_axis:
+ '*': 0
+ criterions:
+ '*': 'l1_norm'
+strategies:
+ uniform_pruning_strategy:
+ class: 'UniformPruneStrategy'
+ pruner: 'pruner_1'
+ start_epoch: 0
+ target_ratio: 0.5
+ pruned_params: '.*_sep_weights'
+ metric_name: 'acc_top1'
+compressor:
+ epoch: 200
+ checkpoint_path: './checkpoints/'
+ strategies:
+ - uniform_pruning_strategy
diff --git a/PaddleSlim/configs/mobilenetv1_resnet50_distillation.yaml b/PaddleSlim/configs/mobilenetv1_resnet50_distillation.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..c825edf595e2e1070315be5f73bc094db5579b30
--- /dev/null
+++ b/PaddleSlim/configs/mobilenetv1_resnet50_distillation.yaml
@@ -0,0 +1,23 @@
+version: 1.0
+distillers:
+ fsp_distiller:
+ class: 'FSPDistiller'
+ teacher_pairs: [['res2a_branch2a.conv2d.output.1.tmp_0', 'res3a_branch2a.conv2d.output.1.tmp_0']]
+ student_pairs: [['depthwise_conv2d_1.tmp_0', 'conv2d_3.tmp_0']]
+ distillation_loss_weight: 1
+ l2_distiller:
+ class: 'L2Distiller'
+ teacher_feature_map: 'fc_1.tmp_0'
+ student_feature_map: 'fc_0.tmp_0'
+ distillation_loss_weight: 1
+strategies:
+ distillation_strategy:
+ class: 'DistillationStrategy'
+ distillers: ['fsp_distiller', 'l2_distiller']
+ start_epoch: 0
+ end_epoch: 130
+compressor:
+ epoch: 130
+ checkpoint_path: './checkpoints/'
+ strategies:
+ - distillation_strategy
diff --git a/PaddleSlim/configs/quantization.yaml b/PaddleSlim/configs/quantization.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..4c8ab29f97485fe09c5a49a96946ae0c77edddee
--- /dev/null
+++ b/PaddleSlim/configs/quantization.yaml
@@ -0,0 +1,20 @@
+version: 1.0
+strategies:
+ quantization_strategy:
+ class: 'QuantizationStrategy'
+ start_epoch: 0
+ end_epoch: 20
+ float_model_save_path: './output/float'
+ mobile_model_save_path: './output/mobile'
+ int8_model_save_path: './output/int8'
+ weight_bits: 8
+ activation_bits: 8
+ weight_quantize_type: 'abs_max'
+ activation_quantize_type: 'abs_max'
+ save_in_nodes: ['image']
+ save_out_nodes: ['fc_0.tmp_2']
+compressor:
+ epoch: 21
+ checkpoint_path: './checkpoints_quan/'
+ strategies:
+ - quantization_strategy
diff --git a/PaddleSlim/configs/quantization_dist.yaml b/PaddleSlim/configs/quantization_dist.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..c178bb26d196b077a2289ab5319d05db7f411090
--- /dev/null
+++ b/PaddleSlim/configs/quantization_dist.yaml
@@ -0,0 +1,38 @@
+# Step1: distillation training from epoch-0 to epoch-120
+# Step2: quantization training from epoch-121 to epoch-141
+version: 1.0
+distillers:
+ fsp_distiller:
+ class: 'FSPDistiller'
+ teacher_pairs: [['res2a_branch2a.conv2d.output.1.tmp_0', 'res3a_branch2a.conv2d.output.1.tmp_0']]
+ student_pairs: [['depthwise_conv2d_1.tmp_0', 'conv2d_3.tmp_0']]
+ distillation_loss_weight: 1
+ l2_distiller:
+ class: 'L2Distiller'
+ teacher_feature_map: 'fc_1.tmp_0'
+ student_feature_map: 'fc_0.tmp_0'
+ distillation_loss_weight: 1
+strategies:
+ distillation_strategy:
+ class: 'DistillationStrategy'
+ distillers: ['fsp_distiller', 'l2_distiller']
+ start_epoch: 0
+ end_epoch: 120
+ quantization_strategy:
+ class: 'QuantizationStrategy'
+ start_epoch: 121
+ end_epoch: 141
+ float_model_save_path: './output/float'
+ mobile_model_save_path: './output/mobile'
+ int8_model_save_path: './output/int8'
+ weight_bits: 8
+ activation_bits: 8
+ weight_quantize_type: 'abs_max'
+ activation_quantize_type: 'abs_max'
+
+compressor:
+ epoch: 142
+ checkpoint_path: './checkpoints/'
+ strategies:
+ - distillation_strategy
+ - quantization_strategy
diff --git a/PaddleSlim/configs/quantization_pruning.yaml b/PaddleSlim/configs/quantization_pruning.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..b4dcd4794dc244c00c144d2fd02bbd7c859c461e
--- /dev/null
+++ b/PaddleSlim/configs/quantization_pruning.yaml
@@ -0,0 +1,36 @@
+# step1: Pruning at epoch-0
+# step2: Fine-tune from epoch-0 to epoch-120
+# step3: Quantization training from epoch-121 to epoch-141
+version: 1.0
+pruners:
+ pruner_1:
+ class: 'StructurePruner'
+ pruning_axis:
+ '*': 0
+ criterions:
+ '*': 'l1_norm'
+strategies:
+ uniform_pruning_strategy:
+ class: 'UniformPruneStrategy'
+ pruner: 'pruner_1'
+ start_epoch: 0
+ target_ratio: 0.5
+ pruned_params: '.*_sep_weights'
+ metric_name: 'acc_top1'
+ quantization_strategy:
+ class: 'QuantizationStrategy'
+ start_epoch: 121
+ end_epoch: 141
+ float_model_save_path: './output/float'
+ mobile_model_save_path: './output/mobile'
+ int8_model_save_path: './output/int8'
+ weight_bits: 8
+ activation_bits: 8
+ weight_quantize_type: 'abs_max'
+ activation_quantize_type: 'abs_max'
+compressor:
+ epoch: 142
+ checkpoint_path: './checkpoints/'
+ strategies:
+ - uniform_pruning_strategy
+ - quantization_strategy
diff --git a/PaddleSlim/docs/demo.md b/PaddleSlim/docs/demo.md
new file mode 100644
index 0000000000000000000000000000000000000000..c3529d9fad649f27df84cee25fa8aca9ad482a56
--- /dev/null
+++ b/PaddleSlim/docs/demo.md
@@ -0,0 +1,278 @@
+
+
+
+---
+# Usage Examples for the Paddle Model Compression Toolkit
+
+## Table of Contents
+
+- [Overview](#0-overview)
+- [Data preparation](#1-data-preparation)
+- [Compression script](#2-compression-script-overview)
+- [Distillation example](#31-distillation)
+- [Pruning example](#32-uniform-pruning)
+- [Quantization example](#34-int8-quantization-aware-training)
+- [Quantization-after-distillation example](#35-int8-quantization-after-distillation)
+- [Quantization-after-pruning example](#36-int8-quantization-after-pruning)
+
+## 0. Overview
+This example is based on the code under [PaddlePaddle/models/fluid/PaddleCV/image_classification](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification) and implements the following strategies:
+
+1. Distillation: distill MobileNetV1 from ResNet50 on the ImageNet-1000 dataset.
+2. Pruning: prune a pretrained MobileNetV1.
+3. Quantization: int8 quantization-aware training of a pretrained MobileNetV1.
+4. Distillation + quantization: first distill MobileNetV1 from ResNet50, then apply int8 quantization-aware training to the distilled model.
+5. Pruning + quantization: first prune MobileNetV1 with the uniform pruning strategy, then apply int8 quantization-aware training to the pruned model.
+
+Full source code for this example: https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleSlim
+
+Usage:
+Clone [PaddlePaddle/models](https://github.com/PaddlePaddle/models) and change into the models/fluid/PaddleSlim directory.
+
+**Directory layout**
+
+```
+/.
+ |-configs # Configuration files for the compression tasks: distillation, int8 quantization, filter pruning, and combined strategies.
+ |-data # Training data and pretrained models
+ |-models # Definitions of the MobileNetV1 and ResNet50 network structures
+ |-quant_low_level_api # Low-level quantization-aware training APIs for special cases; most users can ignore this directory
+ |-compress.py # Main script for the compression tasks; all strategies in this example share it. It defines the model-side information a compression task needs.
+ |-reader.py # Data preprocessing logic
+ |-run.sh # Launch script for the compression tasks
+ |-utility.py # Common utility functions
+```
+
+The five compression strategies in this example share the same training data and the same compression script `compress.py`; each strategy has its own configuration file.
+
+Section 1 covers data preparation, Section 2 walks through the key steps in `compress.py`, and Section 3 shows how to run each compression strategy.
+
+
+
+
+## 1. Data preparation
+
+### 1.1 Training data
+Prepare the training data by following the [data preparation guide](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification#data-preparation) under models/fluid/PaddleCV/image_classification, and place it under the PaddleSlim/data directory.
+
+### 1.2 Pretrained models
+
+The script run.sh automatically downloads the pretrained ResNet50 and MobileNetV1 models from [models/fluid/PaddleCV/image_classification](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification#supported-models-and-performances) and places them under the PaddleSlim/pretrain directory.
+
+
+## 2. Compression script overview
+All the model-side information a compression task needs is defined in `compress.py`. The key steps are briefly introduced below:
+
+### 2.1 Defining the target network
+
+The following snippet from compress.py defines the train program; note that this program contains only forward operators.
+```
+out = model.net(input=image, class_dim=args.class_dim)
+cost = fluid.layers.cross_entropy(input=out, label=label)
+avg_cost = fluid.layers.mean(x=cost)
+acc_top1 = fluid.layers.accuracy(input=out, label=label, k=1)
+acc_top5 = fluid.layers.accuracy(input=out, label=label, k=5)
+```
+
+The evaluation program is then obtained with the clone method and is used to evaluate model accuracy during compression:
+
+```
+val_program = fluid.default_main_program().clone()
+```
+
+After defining the target network, initialize it and, if needed, load a pretrained model; a minimal sketch of this step follows.
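+
+The sketch below mirrors the predicate-based loading used elsewhere in this repository; the executor name `exe` and the pretrain path are assumptions for illustration.
+
+```python
+import os
+import paddle.fluid as fluid
+
+place = fluid.CUDAPlace(0)
+exe = fluid.Executor(place)
+# Initialize all parameters of the target network first.
+exe.run(fluid.default_startup_program())
+
+pretrain_dir = './data/pretrain/MobileNetV1_pretrained'
+
+def if_exist(var):
+    # Load only the variables that have a matching file in the pretrain directory.
+    return os.path.exists(os.path.join(pretrain_dir, var.name))
+
+fluid.io.load_vars(exe, pretrain_dir,
+                   main_program=fluid.default_main_program(),
+                   predicate=if_exist)
+```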
+
+### 2.2 Defining feed_list and fetch_list
+For the train program, train_feed_list specifies which variables the data taken from the train data reader is fed to, and train_fetch_list specifies which results are shown in the log during training. For example, to print accuracy information in the training log, add ('acc_top1', acc_top1.name) to train_fetch_list.
+```
+train_feed_list = [('image', image.name), ('label', label.name)]
+train_fetch_list = [('loss', avg_cost.name)]
+```
+
+>Note: train_fetch_list must contain a loss entry.
+
+For the eval program, define val_feed_list and val_fetch_list in the same way:
+
+```
+val_feed_list = [('image', image.name), ('label', label.name)]
+val_fetch_list = [('acc_top1', acc_top1.name), ('acc_top5', acc_top5.name)]
+```
+
+### 2.3 Defining the teacher network
+
+The following snippet defines the teacher network and initializes it.
+```
+teacher_program = fluid.Program()
+startup_program = fluid.Program()
+with fluid.program_guard(teacher_program, startup_program):
+ img = teacher_program.global_block()._clone_variable(image, force_persistable=False)
+ predict = teacher_model.net(img, class_dim=args.class_dim)
+exe.run(startup_program)
+```
+Note that:
+
+- The teacher network has a single input; simply clone the image variable defined in the train program (fluid.default_main_program).
+- The teacher network only needs to run up to predict; there is no need to add loss or accuracy operators.
+- The teacher network must be initialized and loaded with a pretrained model.
+
+>Note: the fc-layer weight parameter of both ResNet50 and MobileNetV1 is named 'fc_1.weight', so you need to rename the ResNet fc layer in PaddleSlim/models/resnet.py and also rename the corresponding weight file of the ResNet50 pretrained model so that it matches the name used in resnet.py.
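+
+For example, a hedged sketch of that rename; the new name 'res_fc.weight' is purely illustrative (not a name this repository prescribes) and must match whatever fc layer name you set in resnet.py:
+
+```python
+import os
+
+# Hypothetical rename: give the ResNet50 fc weight a file name that no longer
+# clashes with MobileNetV1's 'fc_1.weight'. 'res_fc.weight' is an arbitrary
+# illustrative choice.
+pretrain_dir = './data/pretrain/ResNet50_pretrained'
+os.rename(os.path.join(pretrain_dir, 'fc_1.weight'),
+          os.path.join(pretrain_dir, 'res_fc.weight'))
+```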
+
+
+## 3. Running the compression strategies
+The commands for all examples live in `run.sh`; edit run.sh to run the example for a different compression strategy.
+
+### 3.1 Distillation
+
+In this example, a pretrained ResNet50 model supervises the training of a MobileNetV1 model.
+Edit run.sh and run the following command to launch the distillation example:
+```
+# for distillation
+#--------------------
+export CUDA_VISIBLE_DEVICES=0
+python compress.py \
+--model "MobileNet" \
+--teacher_model "ResNet50" \
+--teacher_pretrained_model ./data/pretrain/ResNet50_pretrained \
+--compress_config ./configs/mobilenetv1_resnet50_distillation.yaml
+```
+Accuracy of this example on the evaluation dataset:
+
+|- |Accuracy (top5/top1) |
+|---|---|
+| Distilled by ResNet50 | 90.92% / 71.97%|
+
+
+
+Figure 1
+
+
+
+### 3.2 Uniform pruning
+
+This example prunes 50% of MobileNetV1's FLOPS.
+Edit run.sh and run the following command to launch the uniform filter-pruning example:
+
+```
+# for uniform filter pruning
+#---------------------------
+export CUDA_VISIBLE_DEVICES=0
+python compress.py \
+--model "MobileNet" \
+--pretrained_model ./data/pretrain/MobileNetV1_pretrained \
+--compress_config ./configs/filter_pruning_uniform.yaml
+```
+Accuracy of this example on the evaluation dataset:
+
+| FLOPS |Model size|Accuracy (top5/top1) |
+|---|---|---|
+| -50%|-47.0%(9.0M) |89.13% / 69.83%|
+
+
+
+Figure 2
+
+
+
+### 3.3 Sensitivity-based pruning
+
+This example prunes 50% of MobileNetV1's FLOPS.
+Edit run.sh and run the following command to launch the sensitivity-based filter-pruning example:
+
+```
+# for sensitivity filter pruning
+#---------------------------
+export CUDA_VISIBLE_DEVICES=0
+python compress.py \
+--model "MobileNet" \
+--pretrained_model ./data/pretrain/MobileNetV1_pretrained \
+--compress_config ./configs/filter_pruning_sen.yaml
+```
+Accuracy of this example on the evaluation dataset:
+
+| FLOPS |Model size| Accuracy (top5/top1) |
+|---|---|---|
+| -50%|-61.2%(6.6M) |88.47% / 68.68%|
+
+
+
+Figure 3
+
+
+### 3.4 int8 quantization-aware training
+
+Edit run.sh and run the following command to launch the int8 quantization-aware training example:
+
+```
+# for quantization
+#---------------------------
+export CUDA_VISIBLE_DEVICES=0
+python compress.py \
+--batch_size 64 \
+--model "MobileNet" \
+--pretrained_model ./pretrain/MobileNetV1_pretrained \
+--compress_config ./configs/quantization.yaml
+```
+
+Results of this example:
+
+| Model | int8 quantization (top1_acc)|
+|---|---|
+|MobileNetV1|71.00%|
+
+
+### 3.5 int8 quantization after distillation
+
+This example first distills MobileNetV1 from ResNet50 for 120 epochs, then applies dynamic int8 quantization-aware training to the MobileNetV1 model.
+Edit run.sh and run the following command to launch the combined distillation and int8 quantization example:
+
+```
+# for distillation with quantization
+#-----------------------------------
+export CUDA_VISIBLE_DEVICES=0
+python compress.py \
+--model "MobileNet" \
+--teacher_model "ResNet50" \
+--teacher_pretrained_model ./data/pretrain/ResNet50_pretrained \
+--compress_config ./configs/quantization_dist.yaml
+```
+
+Results of this example:
+
+|- |Accuracy (top5/top1) |
+|---|---|
+| Distilled by ResNet50 + quantization|90.94% / 72.08%|
+
+### 3.6 int8 quantization after pruning
+
+This example first prunes 50% of the FLOPS from a pretrained MobileNetV1, then applies dynamic int8 quantization-aware training to the pruned model.
+Edit run.sh and run the following command to launch the combined pruning and int8 quantization example:
+
+```
+# for uniform filter pruning with quantization
+#---------------------------------------------
+export CUDA_VISIBLE_DEVICES=0
+python compress.py \
+--model "MobileNet" \
+--pretrained_model ./data/pretrain/MobileNetV1_pretrained \
+--compress_config ./configs/quantization_pruning.yaml
+```
+
+Results of this example:
+
+| Pruned FLOPS |Pruning + quantization (dynamic)|
+|---|---|
+| -50%|89.11% / 69.70%|
diff --git a/PaddleSlim/docs/images/demo/demo.zip b/PaddleSlim/docs/images/demo/demo.zip
new file mode 100644
index 0000000000000000000000000000000000000000..428bdc4a6b59caf845ac00bbc1ecd71003683575
Binary files /dev/null and b/PaddleSlim/docs/images/demo/demo.zip differ
diff --git a/PaddleSlim/docs/images/demo/distillation_result.png b/PaddleSlim/docs/images/demo/distillation_result.png
new file mode 100644
index 0000000000000000000000000000000000000000..49e954feb44e162853fe33ba346a27b5f4221858
Binary files /dev/null and b/PaddleSlim/docs/images/demo/distillation_result.png differ
diff --git a/PaddleSlim/docs/images/demo/pruning_sen_result.png b/PaddleSlim/docs/images/demo/pruning_sen_result.png
new file mode 100644
index 0000000000000000000000000000000000000000..e725682fa307f5666d542a258f278d71f5f6db5a
Binary files /dev/null and b/PaddleSlim/docs/images/demo/pruning_sen_result.png differ
diff --git a/PaddleSlim/docs/images/demo/pruning_uni_result.png b/PaddleSlim/docs/images/demo/pruning_uni_result.png
new file mode 100644
index 0000000000000000000000000000000000000000..a3acdd9fd824da76d105cb8e20457eeb1984659e
Binary files /dev/null and b/PaddleSlim/docs/images/demo/pruning_uni_result.png differ
diff --git a/PaddleSlim/docs/images/framework_0.png b/PaddleSlim/docs/images/framework_0.png
new file mode 100644
index 0000000000000000000000000000000000000000..82d6f8bf1d0439e29d12d7a2bf81f109dbe95b71
Binary files /dev/null and b/PaddleSlim/docs/images/framework_0.png differ
diff --git a/PaddleSlim/docs/images/framework_1.png b/PaddleSlim/docs/images/framework_1.png
new file mode 100644
index 0000000000000000000000000000000000000000..642bc13f8e12eace8fe8e70d8f3ec04a39e4275a
Binary files /dev/null and b/PaddleSlim/docs/images/framework_1.png differ
diff --git a/PaddleSlim/docs/images/tutorial/distillation_0.png b/PaddleSlim/docs/images/tutorial/distillation_0.png
new file mode 100644
index 0000000000000000000000000000000000000000..0946b90914339d2b66af7b9c86b5f77b4ec57ec5
Binary files /dev/null and b/PaddleSlim/docs/images/tutorial/distillation_0.png differ
diff --git a/PaddleSlim/docs/images/tutorial/pruning_0.png b/PaddleSlim/docs/images/tutorial/pruning_0.png
new file mode 100644
index 0000000000000000000000000000000000000000..55cab1d5658d040c256ffe5ddd6a440b82a96b35
Binary files /dev/null and b/PaddleSlim/docs/images/tutorial/pruning_0.png differ
diff --git a/PaddleSlim/docs/images/tutorial/pruning_1.png b/PaddleSlim/docs/images/tutorial/pruning_1.png
new file mode 100644
index 0000000000000000000000000000000000000000..99dc55e83e6c745fc55e6e0cc67b55ef38754080
Binary files /dev/null and b/PaddleSlim/docs/images/tutorial/pruning_1.png differ
diff --git a/PaddleSlim/docs/images/tutorial/pruning_2.png b/PaddleSlim/docs/images/tutorial/pruning_2.png
new file mode 100644
index 0000000000000000000000000000000000000000..8c413672ccc827537e10a54b749dd4b4bf7b0122
Binary files /dev/null and b/PaddleSlim/docs/images/tutorial/pruning_2.png differ
diff --git a/PaddleSlim/docs/images/tutorial/pruning_3.png b/PaddleSlim/docs/images/tutorial/pruning_3.png
new file mode 100644
index 0000000000000000000000000000000000000000..764b9c8f2eaabbc14dd593cd138c04fe9d3ef202
Binary files /dev/null and b/PaddleSlim/docs/images/tutorial/pruning_3.png differ
diff --git a/PaddleSlim/docs/images/tutorial/pruning_4.png b/PaddleSlim/docs/images/tutorial/pruning_4.png
new file mode 100644
index 0000000000000000000000000000000000000000..c99e89261bccea4cc82de20288a59addfdb153ab
Binary files /dev/null and b/PaddleSlim/docs/images/tutorial/pruning_4.png differ
diff --git a/PaddleSlim/docs/images/tutorial/quan_bwd.png b/PaddleSlim/docs/images/tutorial/quan_bwd.png
new file mode 100644
index 0000000000000000000000000000000000000000..9fe571f30bbc133f0a8c5da1639876a3ca39001b
Binary files /dev/null and b/PaddleSlim/docs/images/tutorial/quan_bwd.png differ
diff --git a/PaddleSlim/docs/images/tutorial/quan_forward.png b/PaddleSlim/docs/images/tutorial/quan_forward.png
new file mode 100644
index 0000000000000000000000000000000000000000..56c52bb140cb3e0590f3cafa9043205779116533
Binary files /dev/null and b/PaddleSlim/docs/images/tutorial/quan_forward.png differ
diff --git a/PaddleSlim/docs/images/tutorial/quan_fwd_1.png b/PaddleSlim/docs/images/tutorial/quan_fwd_1.png
new file mode 100644
index 0000000000000000000000000000000000000000..8224a560d525b47e5d8395064bae018d4d9e67c4
Binary files /dev/null and b/PaddleSlim/docs/images/tutorial/quan_fwd_1.png differ
diff --git a/PaddleSlim/docs/images/tutorial/quan_table_0.png b/PaddleSlim/docs/images/tutorial/quan_table_0.png
new file mode 100644
index 0000000000000000000000000000000000000000..ea6571509e4b8f1fa00ee8f9ffb0a6870b740d0f
Binary files /dev/null and b/PaddleSlim/docs/images/tutorial/quan_table_0.png differ
diff --git a/PaddleSlim/docs/images/tutorial/quan_table_1.png b/PaddleSlim/docs/images/tutorial/quan_table_1.png
new file mode 100644
index 0000000000000000000000000000000000000000..53bc672246c341dd46dbb7a269ff2b3d1c35a05d
Binary files /dev/null and b/PaddleSlim/docs/images/tutorial/quan_table_1.png differ
diff --git a/PaddleSlim/docs/model_zoo.md b/PaddleSlim/docs/model_zoo.md
new file mode 100644
index 0000000000000000000000000000000000000000..eb35b4b101ca044e0ae1b86a68f5df0c1154a8b6
--- /dev/null
+++ b/PaddleSlim/docs/model_zoo.md
@@ -0,0 +1,215 @@
+
+
+
+---
+# Paddle Model Compression Toolkit: Experiments and Model Zoo
+
+## Table of Contents
+
+- [Quantization experiments](#1-int8-quantization-aware-training)
+- [Pruning experiments](#2-pruning-experiments)
+- [Distillation experiments](#3-distillation)
+- [Combined experiments](#4-combined-experiments)
+
+
+## 1. int8 quantization-aware training
+
+The experiments are evaluated on the ImageNet-1000 dataset with top-1 accuracy as the metric:
+
+| Model | FP32| int8(A:abs_max, W:abs_max) | int8, (A:moving_average_abs_max, W:abs_max) |int8, (A:abs_max, W:channel_wise_abs_max) |
+|:---|:---:|:---:|:---:|:---:|
+|MobileNetV1|[70.916%]()|[71.008%]()|[70.84%]()|[71.00%]()|
+|ResNet50|[76.352%]()|[76.612%]()|[76.456%]()|[76.73%]()|
+
+Click the hyperlinks in the tables to download the pretrained models.
+
+Model size changes:
+
+| Model | FP32| int8(A:abs_max, W:abs_max) | int8, (A:moving_average_abs_max, W:abs_max) |int8, (A:abs_max, W:channel_wise_abs_max) |
+|:---|:---:|:---:|:---:|:---:|
+|MobileNetV1|17M|4.8M(1/3.54)|5.1M(1/3.33)|4.9M(1/3.47)|
+|ResNet50|99M|26M(1/3.81)|27M(1/3.67)|27M(1/3.67)|
+
+
+## 2. Pruning experiments
+
+Dataset: ImageNet-1000
+Model: MobileNetV1
+Original model size: 17M
+Original accuracy (top5/top1): 89.54% / 70.91%
+
+### 2.1 Iterative sensitivity-based pruning
+
+#### Setup
+
+Prune in steps, removing 7% of the model's FLOPS at each step.
+The optimizer is configured as follows:
+
+```
+epoch_size=5000
+boundaries = [30, 60, 90, 120] * epoch_size # for -50% FLOPS
+#boundaries = [35, 65, 95, 125] * epoch_size # for -60% FLOPS
+#boundaries = [50, 80, 110, 140] * epoch_size # for -70% FLOPS
+values = [0.01, 0.1, 0.01, 0.001, 0.0001]
+optimizer = fluid.optimizer.Momentum(
+ momentum=0.9,
+ learning_rate=fluid.layers.piecewise_decay(boundaries=boundaries, values=values),
+ regularization=fluid.regularizer.L2Decay(1e-4))
+```
+
+#### Results
+
+
+| FLOPS |Model size| Accuracy (top5/top1) |Download|
+|---|---|---|---|
+| -50%|-59.4%(6.9M) |88.22% / 68.41% |[Download](https://paddle-slim-models.bj.bcebos.com/sensitive_filter_pruning_0.5_model.tar.gz)|
+| -60%|-70.6%(5.0M)|87.01% / 66.31% |[Download](https://paddle-slim-models.bj.bcebos.com/sensitive_filter_pruning_0.6_model.tar.gz)|
+| -70%|-78.8%(3.6M)|85.30% / 63.41% |[Download](https://paddle-slim-models.bj.bcebos.com/sensitive_filter_pruning_0.7_model.tar.gz)|
+
+### 2.2 One-shot sensitivity-based pruning
+
+#### Setup
+
+Prune 50% of the FLOPS in a single step, then fine-tune for 120 epochs.
+
+The optimizer is configured as follows:
+
+```
+epoch_size=5000
+boundaries = [30, 60, 90] * epoch_size
+values = [0.1, 0.01, 0.001, 0.0001]
+optimizer = fluid.optimizer.Momentum(
+ momentum=0.9,
+ learning_rate=fluid.layers.piecewise_decay(boundaries=boundaries, values=values),
+ regularization=fluid.regularizer.L2Decay(1e-4))
+```
+
+#### Results
+
+| FLOPS |Model size|Accuracy (top5/top1) |Download|
+|---|---|---|---|
+| -50%|-61.2%(6.6M)| 88.47% / 68.68% |[Download](https://paddle-slim-models.bj.bcebos.com/sensitive_filter_pruning_0.5-1step.tar.gz)|
+
+### 2.3 Stepwise sensitivity-based pruning
+
+#### Setup
+
+1. Prune 20% of the FLOPS, then fine-tune for 120 epochs.
+2. On top of the previous step, prune another 20% of the FLOPS and fine-tune for 120 epochs.
+3. On top of the previous step, prune another 20% of the FLOPS and fine-tune for 120 epochs.
+
+The optimizer is configured as follows:
+
+```
+epoch_size=5000
+boundaries = [30, 60, 90] * epoch_size
+values = [0.1, 0.01, 0.001, 0.0001]
+optimizer = fluid.optimizer.Momentum(
+ momentum=0.9,
+ learning_rate=fluid.layers.piecewise_decay(boundaries=boundaries, values=values),
+ regularization=fluid.regularizer.L2Decay(1e-4))
+```
+
+#### Results
+
+| FLOPS |Accuracy (top5/top1)|Download |
+|---|---|---|
+| -20%|90.08% / 71.48% |[Download](https://paddle-slim-models.bj.bcebos.com/sensitive_filter_pruning_3step_0.2_model.tar.gz)|
+| -36%|89.62% / 70.83%|[Download](https://paddle-slim-models.bj.bcebos.com/sensitive_filter_pruning_3step_0.36_model.tar.gz)|
+| -50%|88.77% / 69.31%|[Download](https://paddle-slim-models.bj.bcebos.com/sensitive_filter_pruning_3step_0.5_model.tar.gz)|
+
+
+### 2.4 Uniform pruning
+
+#### Setup
+
+Prune the specified fraction of FLOPS in one shot, then fine-tune for 120 epochs.
+
+The optimizer is configured as follows:
+
+```
+epoch_size=5000
+boundaries = [30, 60, 90] * epoch_size
+values = [0.1, 0.01, 0.001, 0.0001]
+optimizer = fluid.optimizer.Momentum(
+ momentum=0.9,
+ learning_rate=fluid.layers.piecewise_decay(boundaries=boundaries, values=values),
+ regularization=fluid.regularizer.L2Decay(1e-4))
+```
+
+#### Results
+
+| FLOPS |Model size|Accuracy (top5/top1) |Download |
+|---|---|---|---|
+| -50%|-47.0%(9.0M) | 89.13% / 69.83%|[Download](https://paddle-slim-models.bj.bcebos.com/uniform_filter_pruning_0.5_model.tar.gz)|
+| -60%|-55.9%(7.5M)|88.22% / 68.24%| [Download](https://paddle-slim-models.bj.bcebos.com/uniform_filter_pruning_0.6_model.tar.gz)|
+| -70%|-65.3%(5.9M)|86.99% / 66.57%| [Download](https://paddle-slim-models.bj.bcebos.com/uniform_filter_pruning_0.7_model.tar.gz)|
+
+
+## 3. Distillation
+
+Dataset: ImageNet-1000
+Model: MobileNetV1
+Original model size: 17M
+Original accuracy (top5/top1): 89.54% / 70.91%
+
+#### Setup
+
+Distill MobileNetV1 from a pretrained ResNet50 for 120 epochs, with an FSP loss on the first block and an L2 loss on the input of the softmax layer.
+
+The optimizer is configured as follows:
+
+```
+epoch_size=5000
+boundaries = [30, 60, 90] * epoch_size
+values = [0.1, 0.01, 0.001, 0.0001]
+optimizer = fluid.optimizer.Momentum(
+ momentum=0.9,
+ learning_rate=fluid.layers.piecewise_decay(boundaries=boundaries, values=values),
+ regularization=fluid.regularizer.L2Decay(1e-4))
+```
+
+#### Results
+
+|- |Accuracy (top5/top1) |Gain (top5/top1)|Download |
+|---|---|---|---|
+| Distilled by ResNet50| 90.92% / 71.97%| +1.28% / +1.06%| [Download](https://paddle-slim-models.bj.bcebos.com/mobilenetv1_resnet50_distillation_model.tar.gz)|
+
+
+## 4. Combined experiments
+
+### 4.1 Quantization after distillation
+
+#### Setup
+
+#### Results
+
+|- |Accuracy (top5/top1) |Download |
+|---|---|---|
+| Distilled by ResNet50 + quantization|90.94% / 72.08%| [Download]()|
+
+
+### 4.2 Quantization after pruning
+
+
+#### Setup
+
+#### Results
+
+| Pruned FLOPS |Pruning + quantization (dynamic)|Download |
+|---|---|---|
+| -50%|89.11% / 69.70%| [Download]()|
diff --git a/PaddleSlim/docs/tutorial.md b/PaddleSlim/docs/tutorial.md
new file mode 100644
index 0000000000000000000000000000000000000000..e4dd13d29d16916dcbf6d637edf510d30693e68d
--- /dev/null
+++ b/PaddleSlim/docs/tutorial.md
@@ -0,0 +1,237 @@
+
+
+---
+# Paddle Model Compression Toolkit: Algorithm Overview
+
+## Table of Contents
+
+- [Quantization](#1-quantization-aware-training)
+- [Filter pruning](#2-filter-pruning)
+- [Distillation](#3-distillation)
+
+## 1. Quantization-Aware Training
+
+### 1.1 Background
+
+In recent years, representing the weights and activations of neural networks with fewer bits (such as 8-bit, 3-bit, or 2-bit) has been shown to be effective. The advantages of fixed-point quantization include lower memory bandwidth, lower power consumption, lower compute cost, and smaller model storage.
+
+
+
+Table 1: Cost comparison of different operation types
+
+
+As Table 1 shows, low-precision fixed-point operations need orders of magnitude less silicon area and energy than high-precision floating-point operations. Fixed-point quantization brings roughly 4x model compression and 4x memory-bandwidth savings, along with better cache utilization (on many hardware devices, memory access dominates energy consumption). Computation also becomes faster, typically with a 2x-3x speedup. As Table 2 shows, in many scenarios quantization causes no loss of accuracy. Furthermore, fixed-point quantization is essential for running neural-network inference on embedded devices.
+
+
+
+Table 2: Accuracy before and after model quantization
+
+
+Academia currently divides quantization into two broad categories: `Post Training Quantization` and `Quantization Aware Training`. `Post Training Quantization` determines the quantization parameters with methods such as KL divergence or moving averages and requires no retraining. `Quantization Aware Training` models quantization during training to determine the quantization parameters, and usually delivers higher prediction accuracy than `Post Training Quantization`. This document focuses on `Quantization Aware Training`.
+
+### 1.2 How quantization works
+
+#### 1.2.1 Quantization schemes
+Many schemes exist for quantizing floating-point numbers into fixed-point. For example:
+$$ r = min(max(x, a), b)$$ $$ s = \frac{b - a}{n - 1} $$ $$ q = \left \lfloor \frac{r - a}{s} \right \rceil $$
+where $x$ is the floating-point value to quantize, $[a, b]$ is the quantization range, $a$ is the minimum of the values to quantize and $b$ their maximum. $\left \lfloor \right \rceil$ denotes rounding to the nearest integer. With $k$ quantization levels, $n = 2^k$; for example, $k = 8$ gives $n = 256$. $q$ is the resulting integer.
+PaddleSlim uses max-abs quantization (`max-abs`), defined as:
+$$ M = max(abs(x)) $$ $$ q = \left \lfloor \frac{x}{M} * (n - 1) \right \rceil $$
+where $x$ is the floating-point value to quantize and $M$ is the maximum absolute value among the values to quantize. $\left \lfloor \right \rceil$ rounds to the nearest integer. For 8-bit quantization, PaddleSlim uses `int8_t`, i.e. $n = 2^7 = 128$. $q$ is the resulting integer.
+Both `min-max` quantization and `max-abs` quantization can be written in the form
+$q = scale * r + b$
+where the `min-max` or `max-abs` statistics are referred to as quantization parameters, quantization scales, or quantization ranges.
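+
+As a minimal numpy sketch of the `max-abs` formulas above (illustrative only, not the toolkit's API):
+
+```python
+import numpy as np
+
+def quantize_max_abs(x, num_bits=8):
+    """Quantize a float array with the max-abs scheme described above."""
+    n = 2 ** (num_bits - 1)               # n = 128 for 8-bit
+    M = np.max(np.abs(x))                 # quantization scale
+    q = np.round(x / M * (n - 1)).astype(np.int8)
+    return q, M
+
+def dequantize_max_abs(q, M, num_bits=8):
+    n = 2 ** (num_bits - 1)
+    return q.astype(np.float32) * M / (n - 1)
+
+x = np.random.uniform(-1, 1, size=(4,)).astype(np.float32)
+q, M = quantize_max_abs(x)
+x_hat = dequantize_max_abs(q, M)          # recovers x up to rounding error
+```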
+
+#### 1.2.2 Quantization-aware training framework
+##### 1.2.2.1 Forward pass
+The forward pass uses simulated quantization, described below:
+
+
+
+Figure 1: Forward pass of training with simulated quantization
+
+
+As Figure 1 shows, the forward pass of training with simulated quantization consists of four parts:
+1) Inputs and weights are quantized into 8-bit integers.
+2) Matrix multiplication or convolution is executed on the 8-bit integers.
+3) The output of the matrix multiplication or convolution is dequantized back to 32-bit floating point.
+4) Bias addition is executed in 32-bit floating point; the bias is not quantized.
+For general matrix multiplication (`GEMM`), quantizing the input $X$ and the weight $W$ can be written as:
+$$ X_q = \left \lfloor \frac{X}{X_m} * (n - 1) \right \rceil $$ $$ W_q = \left \lfloor \frac{W}{W_m} * (n - 1) \right \rceil $$
+Run the GEMM:
+$$ Y = X_q * W_q $$
+Dequantize $Y$:
+$$
+\begin{align}
+Y_{dq} &= \frac{Y}{(n - 1) * (n - 1)} * X_m * W_m \\
+&= \frac{X_q * W_q}{(n - 1) * (n - 1)} * X_m * W_m \\
+&= (\frac{X_q}{n - 1} * X_m) * (\frac{W_q}{n - 1} * W_m)
+\end{align}
+$$
+The equations above show that the dequantization can be moved in front of the `GEMM`: first dequantize $X_q$ and $W_q$, then run the `GEMM`. The forward pass can therefore equivalently be expressed as follows:
+
+
+
+Figure 2: Equivalent workflow of the forward pass with simulated quantization
+
+
+During training, PaddleSlim uses the equivalent workflow shown in Figure 2. By design, the quantization pass inserts quantize ops and dequantize ops into the IrGraph; because the data after a consecutive quantize/dequantize pair is still 32-bit floating point, the scheme used by PaddleSlim's quantization-aware training framework is called simulated quantization.
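+
+A small numpy check of the equivalence derived above (a sketch, not PaddleSlim code): dequantizing before the GEMM gives the same result as dequantizing its output.
+
+```python
+import numpy as np
+
+n = 128  # 8-bit, int8_t
+X = np.random.uniform(-1, 1, (2, 3))
+W = np.random.uniform(-1, 1, (3, 4))
+X_m, W_m = np.max(np.abs(X)), np.max(np.abs(W))
+X_q = np.round(X / X_m * (n - 1))
+W_q = np.round(W / W_m * (n - 1))
+
+# Dequantize after the GEMM ...
+Y_dq = (X_q @ W_q) / ((n - 1) * (n - 1)) * X_m * W_m
+# ... or dequantize first and then run the GEMM.
+Y_dq2 = (X_q / (n - 1) * X_m) @ (W_q / (n - 1) * W_m)
+
+assert np.allclose(Y_dq, Y_dq2)
+```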
+
+##### 1.2.2.2 Backward pass
+As Figure 3 shows, the gradients used for the weight update are computed from the quantized weights and the quantized activations. All inputs and outputs of the backward pass are 32-bit floating point. Note that the gradient update must be applied to the original weights: the computed gradients are added to the original weights, not to the quantized or dequantized ones.
+
+
+
+Figure 3: Backward pass and weight update of training with simulated quantization
+
+
+The quantization pass therefore also rewrites certain inputs of the corresponding backward operators.
+
+#### 1.2.3 Determining the quantization parameters
+There are two strategies for computing the quantization scale: dynamic and static. The dynamic strategy recomputes the scale at every iteration, while the static strategy uses the same scale for all inputs.
+For weights, the dynamic strategy is used during training; in other words, the scale is recomputed at every iteration until training ends.
+For activations, either strategy may be chosen. With the static strategy, the scale is estimated during training and then used unchanged at inference time (it stays the same for every input). The static scale can be estimated during training in one of three ways:
+
+1. The mean of the activations' maximum absolute values over a window.
+
+2. The maximum of the activations' maximum absolute values over a window.
+
+3. A moving average of the activations' maximum absolute values over a window, computed as:
+
+$$ V_t = (1 - k) * V + k * V_{t-1} $$
+
+where $V$ is the maximum absolute value of the current batch, $V_t$ is the moving average, and $k$ is a factor, for example 0.9.
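+
+A sketch of that moving-average recurrence with $k = 0.9$ (illustrative only, not the toolkit's implementation):
+
+```python
+import numpy as np
+
+def update_moving_average_scale(v_prev, batch, k=0.9):
+    V = np.max(np.abs(batch))        # max absolute value of the current batch
+    return (1 - k) * V + k * v_prev  # V_t = (1 - k) * V + k * V_{t-1}
+
+scale = 0.0
+for _ in range(100):
+    batch = np.random.randn(32, 64).astype(np.float32)
+    scale = update_moving_average_scale(scale, batch)
+```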
+
+
+
+
+## 2. Filter pruning
+
+This strategy follows the paper [Pruning Filters for Efficient ConvNets](https://arxiv.org/pdf/1608.08710.pdf).
+
+It reduces model size and computational complexity by removing filters from convolutional layers.
+
+### 2.1 Pruning filters
+
+**Pruning note 1**
+Pruning a filter from one conv layer requires modifying the filters of the following conv layer. As shown in **Figure 4**, removing one filter from $X_i$ leaves $X_{i+1}$ one channel short, so the corresponding filters of $X_{i+1}$ must also shrink by one along the input_channel dimension.
+
+
+
+
+Figure 4
+
+
+
+**Pruning note 2**
+
+As shown in **Figure 5**, after pruning $X_i$, note 1 requires deleting one row (the blue row in the figure) from the filters of $X_{i+1}$. When computing the l1_norm of the filters of $X_{i+1}$ (the green column in the figure), there are two options:
+include the deleted row: independent pruning
+exclude the deleted row: greedy pruning
+
+
+
+Figure 5
+
+
+**Pruning note 3**
+When pruning complex networks such as ResNet, the effect that modifying the current conv layer has on the adjacent conv layers must also be considered.
+As shown in **Figure 6**, when pruning a residual block, how the $X_{i+1}$ layer is pruned depends on the pruning result of the projection shortcut, because the output of the projection shortcut and the output of $X_{i+1}$ must still be combined correctly.
+
+
+
+
+Figure 6
+
+
+### 2.2 Uniform pruning
+
+Prune the same fraction of filters from every layer.
+Before pruning, sort the layer's filters by l1_norm from high to low; the further down the ranking a filter sits, the less important it is, so the trailing filters are pruned first, as sketched below.
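+
+A minimal numpy sketch of that ranking (illustrative only): the filters of a conv weight of shape [num_filters, channels, k, k] are scored by l1_norm and the lowest-scoring ones are selected for pruning.
+
+```python
+import numpy as np
+
+def filters_to_prune(conv_weight, ratio):
+    """Return the indices of the filters to prune, lowest l1_norm first."""
+    num_filters = conv_weight.shape[0]
+    # l1_norm of each filter (axis 0 indexes the filters).
+    scores = np.abs(conv_weight).reshape(num_filters, -1).sum(axis=1)
+    order = np.argsort(scores)           # ascending: least important first
+    num_pruned = int(num_filters * ratio)
+    return order[:num_pruned]
+
+w = np.random.randn(32, 16, 3, 3)
+print(filters_to_prune(w, ratio=0.5))    # 16 filter indices to remove
+```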
+
+
+### 2.3 Sensitivity-based pruning
+
+Prune a different fraction of filters from each conv layer according to the layer's sensitivity.
+
+#### Two assumptions
+
+- Within one conv layer's parameter, sorting the filters by l1_norm from high to low, the further down the ranking a filter sits, the less important it is.
+- If pruning the same fraction of filters from two layers hurts model accuracy more for one of them, that layer is said to be more sensitive.
+
+#### Guidelines for pruning filters
+
+- A layer's pruning ratio is inversely related to its sensitivity.
+- Within a layer, prune the filters with relatively low l1_norm first.
+
+#### Understanding sensitivity
+
+
+
+Figure 7
+
+
+In **Figure 7**, the x-axis is the fraction of filters pruned away and the y-axis is the accuracy loss; each colored dashed line corresponds to one conv layer of the network.
+Each dashed line in **Figure 7** is drawn by pruning a **single** conv layer at various ratios and observing the accuracy loss on the validation set. Lines that rise slowly belong to relatively insensitive conv layers, whose filters we prune first.
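+
+A sketch of how one such sensitivity curve could be collected; `prune_layer` and `evaluate` are hypothetical helpers (the toolkit automates this inside its sensitivity-based strategy):
+
+```python
+import numpy as np
+
+def layer_sensitivity(prune_layer, evaluate, baseline_acc,
+                      ratios=(0.1, 0.2, 0.3, 0.4, 0.5)):
+    """Prune one layer alone at several ratios and record the accuracy loss.
+
+    prune_layer(ratio) is assumed to be a context manager that temporarily
+    prunes only this layer; evaluate() returns validation accuracy.
+    """
+    losses = []
+    for r in ratios:
+        with prune_layer(r):
+            losses.append(baseline_acc - evaluate())
+    return np.array(losses)              # one dashed line in Figure 7
+```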
+
+#### Choosing the optimal combination of pruning ratios
+
+The polylines in **Figure 7** are fitted into the curves of **Figure 8**. Every accuracy-loss value picked on the y-axis corresponds to one set of pruning ratios on the x-axis, as indicated by the solid black line in **Figure 8**.
+Given a target overall pruning ratio for the model, we slide the solid black line in **Figure 8** to find a valid set of ratios that satisfies the target.
+
+
+
+Figure 8
+
+
+#### Iterative pruning
+Because the conv layers are correlated, modifying one layer may change the sensitivity of the others. We therefore prune in multiple rounds, as follows:
+
+- step1: collect sensitivity statistics for every conv layer
+- step2: based on the current sensitivity statistics, prune a small number of filters from each conv layer and measure the FLOPS; if the FLOPS target is met, go to step4, otherwise go to step3
+- step3: briefly fine-tune the network, then return to step1
+- step4: fine-tune the network until convergence
+
+## 3. Distillation
+
+In general, a model with more parameters and a more complex structure performs better, but its parameters are also more redundant and it consumes more computation and resources. Distillation extracts the useful information from a complex network and transfers it into a smaller one. The toolkit supports two distillation methods.
+The first is classic distillation (see [Distilling the Knowledge in a Neural Network](https://arxiv.org/pdf/1503.02531.pdf)).
+A complex network serves as the teacher to supervise the training of a student model with fewer parameters and less computation. The teacher can be one or more pretrained high-performance models. The student is trained against two targets: the original objective, i.e. the cross-entropy between the student's class probabilities and the labels, called the hard target; and the cross-entropy between the student's class probabilities and the teacher's class probabilities, called the soft target. The two losses are weighted and summed into the final training loss, which jointly supervises the student's training.
+The second is FSP-based distillation (see [A Gift from Knowledge Distillation:
+Fast Optimization, Network Minimization and Transfer Learning](http://openaccess.thecvf.com/content_cvpr_2017/papers/Yim_A_Gift_From_CVPR_2017_paper.pdf)).
+Instead of directly fitting the teacher's outputs as classic distillation does, this method fits the transformations between the teacher's features at different layers. The relation between two layers' features is represented by an FSP matrix (an inner product of the features). Several FSP matrices are computed from corresponding layer pairs of the teacher and the student, and an L2 loss pushes each student FSP matrix toward the matching teacher FSP matrix, as illustrated below. Intuitively, if distillation is a teacher (the big model) coaching a student (the small model) through a problem, classic distillation simply tells the student the answer, while learning FSP matrices teaches the student the intermediate process and method of solving the problem, so the student learns more.
+
+
+
+Figure 9
+
+
+Because the student and the teacher are supervised through an L2 loss, the two FSP matrices must have the same dimensions. An FSP matrix has shape M*N, where M and N are the channel counts of the input and output features, so the teacher's and the student's FSP matrices must be paired one-to-one; a sketch follows.
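+
+A numpy sketch of the FSP matrix from the paper (not toolkit code): for two feature maps with M and N channels and the same spatial size, the matrix is the channel-wise inner product averaged over spatial positions.
+
+```python
+import numpy as np
+
+def fsp_matrix(feat_a, feat_b):
+    """FSP matrix of two feature maps of shape [M, H, W] and [N, H, W]."""
+    m, h, w = feat_a.shape
+    n = feat_b.shape[0]
+    a = feat_a.reshape(m, h * w)
+    b = feat_b.reshape(n, h * w)
+    return a @ b.T / (h * w)             # shape [M, N]
+
+teacher_fsp = fsp_matrix(np.random.randn(64, 28, 28), np.random.randn(128, 28, 28))
+student_fsp = fsp_matrix(np.random.randn(64, 28, 28), np.random.randn(128, 28, 28))
+l2_loss = np.mean((teacher_fsp - student_fsp) ** 2)   # matching shapes required
+```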
+
+
+
+## 4. References
+
+1. [High-Performance Hardware for Machine Learning](https://media.nips.cc/Conferences/2015/tutorialslides/Dally-NIPS-Tutorial-2015.pdf)
+
+2. [Quantizing deep convolutional networks for efficient inference: A whitepaper](https://arxiv.org/pdf/1806.08342.pdf)
+
+3. [Pruning Filters for Efficient ConvNets](https://arxiv.org/pdf/1608.08710.pdf)
+
+4. [Distilling the Knowledge in a Neural Network](https://arxiv.org/pdf/1503.02531.pdf)
+
+5. [A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning](http://openaccess.thecvf.com/content_cvpr_2017/papers/Yim_A_Gift_From_CVPR_2017_paper.pdf)
diff --git a/PaddleSlim/docs/usage.md b/PaddleSlim/docs/usage.md
new file mode 100644
index 0000000000000000000000000000000000000000..5b584021517f8628a549cb685c14c26d1c42f2cb
--- /dev/null
+++ b/PaddleSlim/docs/usage.md
@@ -0,0 +1,445 @@
+
+
+
+---
+# Paddle Model Compression Toolkit: User Guide
+
+Chapter 1 of this guide describes the common functionality of the PaddleSlim module, without going into the details of specific strategies. The later sections describe how to use the quantization-aware training, pruning, and distillation strategies respectively.
+Before reading how to use a specific strategy, we recommend skimming the corresponding [algorithm overview](tutorial.md).
+
+>This document does not distinguish between the concepts of operator and layer, nor between loss and cost.
+
+## Table of Contents
+
+- [Common functionality](#1-common-paddleslim-functionality)
+- [Quantization](#21-quantization-aware-training)
+- [Pruning](#22-filter-pruning)
+- [Distillation](#23-distillation)
+
+
+## 1. Common PaddleSlim functionality
+
+## 1.1 Prerequisites
+
+### 1.1.1 Install Paddle
+
+**Version:** PaddlePaddle >= 1.4
+**Installation guide:** [installation instructions](http://paddlepaddle.org/documentation/docs/zh/1.3/beginners_guide/install/index_cn.html)
+
+
+### 1.1.2 Build the network
+
+You need a forward network that is already built and runs correctly.
+A runnable network generally requires the following pieces:
+
+- the network structure definition
+- data_reader
+- optimizer
+- initialization and loading of a pretrained model
+- feed list and fetch list
+
+#### 1.1.2.1 Defining the network structure
+First configure the network by following this guide:
+[Paddle User Guide: Configure a Simple Network](http://paddlepaddle.org/documentation/docs/zh/1.3/user_guides/howto/configure_simple_model/index.html)
+
+This step should produce two [Program](http://paddlepaddle.org/documentation/docs/zh/1.3/api_cn/fluid_cn.html#program) instances:
+
+- **train_program:** used to iteratively train the model during compression; it must contain a loss. In general this program should contain no backward ops or weight-update ops, otherwise the distillation strategy cannot be used.
+
+- **eval_program:** used to evaluate model accuracy during compression; it usually contains layers that compute metrics such as accuracy or IoU.
+
+>The quantization-aware training strategy prunes the network structure based on eval_program and saves a quantized model for inference. This requires the inference network to be a sub-network of eval_program.
+
+#### 1.1.2.2. data_reader
+
+Prepare the data by following:
+[Paddle User Guide: Prepare Data](http://paddlepaddle.org/documentation/docs/zh/1.3/user_guides/howto/prepare_data/index.html)
+
+This step should produce two DataReaders:
+
+**train_reader:** provides data for running train_program
+**eval_reader:** provides data for running eval_program
+
+#### 1.1.2.3. optimizer
+[fluid.optimizer API](http://www.paddlepaddle.org/documentation/docs/zh/1.3/api_cn/optimizer_cn.html)
+
+Depending on the scenario, you provide zero, one, or two optimizers:
+
+- **no optimizer:** if the train_program built at the model-definition stage already contains backward ops and weight-update ops, no optimizer is needed
+- **one optimizer:** if train_program contains only forward ops, provide one optimizer to train it
+- **two optimizers:** when using the distillation strategy with different optimization settings for the distillation phase and the standalone fine-tuning phase. One optimizer trains the combined network of teacher net and student net; the other optimizes the student net alone. More details are given in the distillation usage section.
+
+#### 1.1.2.4. load pretrain model
+
+- Pruning: a pretrained model must be loaded
+- Distillation: loading a pretrained model is optional
+- Quantization-aware training: a pretrained model must be loaded
+
+#### 1.1.2.5. feed list and fetch list
+The feed list and fetch list are two ordered dicts, for example:
+```
+feed_list = [('image', image.name), ('label', label.name)]
+fetch_list = [('loss', avg_cost.name)]
+```
+A key in feed_list is a user-defined descriptive string, and the value is the name of a [Variable](http://paddlepaddle.org/documentation/docs/zh/1.3/api_guides/low_level/program.html#variable); the order of feed_list must match the order of the data produced by the DataReader.
+
+Both train_program and eval_program need a feed_list and a fetch_list of their own.
+
+>Note: in the fetch_list of train_program, the key of the loss variable (the output of the loss layer) must be 'loss'.
+
+
+## 1.2 Using the compression toolkit
+
+With the preparation of Section 1.1 complete, all the model-side information the toolkit needs is ready. Configure and launch a compression task with the following steps:
+
+- modify the model training script to add the compression logic
+- write a configuration file
+- run the training script to compress the model
+
+### 1.2.1 Modifying an ordinary training script
+
+Make the following changes to the script obtained in Section 1.1:
+
+Step 1: construct a `paddle.fluid.contrib.slim.Compressor` object. The parameters of the Compressor constructor are:
+
+```
+Compressor(place,
+ scope,
+ train_program,
+ train_reader=None,
+ train_feed_list=None,
+ train_fetch_list=None,
+ eval_program=None,
+ eval_reader=None,
+ eval_feed_list=None,
+ eval_fetch_list=None,
+ teacher_programs=[],
+ checkpoint_path='./checkpoints',
+ train_optimizer=None,
+ distiller_optimizer=None)
+```
+- **place:** the device used by the compression task. For GPU, use [paddle.fluid.CUDAPlace(0)](http://paddlepaddle.org/documentation/docs/zh/1.3/api_cn/fluid_cn.html#paddle.fluid.CUDAPlace)
+- **scope:** if you did not construct a scope while configuring the network, the [global scope](http://paddlepaddle.org/documentation/docs/zh/1.3/api_cn/executor_cn.html#paddle.fluid.global_scope) is used; set this parameter to `paddle.fluid.global_scope()`. If you constructed your own scope, pass it here.
+- **train_program:** a program whose network contains only forward operators and which must include a loss. For the concept of a program, see the [Program API](http://paddlepaddle.org/documentation/docs/zh/1.3/api_cn/fluid_cn.html#program)
+- **train_reader:** the [data reader](http://paddlepaddle.org/documentation/docs/zh/1.3/user_guides/howto/prepare_data/reader_cn.html) providing training data
+- **train_feed_list:** the input nodes of the train program; see Section 1.1.2.5.
+- **train_fetch_list:** the output nodes of the train program; see Section 1.1.2.5.
+- **eval_program:** the program used to evaluate model accuracy
+- **eval_reader:** the [data reader](http://paddlepaddle.org/documentation/docs/zh/1.3/user_guides/howto/prepare_data/reader_cn.html) providing evaluation data
+- **eval_feed_list:** the input nodes of the eval program; see Section 1.1.2.5.
+- **eval_fetch_list:** the output nodes of the eval program, same format as train_fetch_list; see Section 1.1.2.5.
+- **teacher_programs:** the programs used for distillation; they must share a scope with the train program.
+- **train_optimizer:** the optimizer used to train the train program
+- **distiller_optimizer:** the optimizer used for distillation training
+
+
+Step 2: load the configuration file and call the run method, for example:
+```python
+compressor.config('./compress.yaml')
+compressor.run()
+```
+Here compress.yaml is the compression-strategy configuration file. It gathers all tunable strategy parameters; Section 1.2.2 describes its format and contents in detail.
+
+A complete example after the changes in this section: [compress.py]()
+
+### 1.2.2 Using the configuration file
+
+The compression module manages the tunable strategy parameters centrally in a [yaml](https://zh.wikipedia.org/wiki/YAML) file. We use filter pruning as an example of how to write one.
+
+Step 1: register pruners. As shown below, specify the pruner class and some attributes; the available classes and attributes are detailed later in this document.
+```python
+pruners:
+ pruner_1:
+ class: 'StructurePruner'
+ pruning_axis:
+ '*': 0
+ criterions:
+ '*': 'l1_norm'
+```
+
+Step 2: register pruning strategies.
+As shown below, we register two uniform pruning strategies that prune 10% of the model's FLOPS at epoch 0 and at epoch 10 respectively.
+```python
+strategies:
+ pruning_strategy_0:
+ class: 'UniformPruneStrategy'
+ pruner: 'pruner_1'
+ start_epoch: 0
+ target_ratio: 0.10
+ pruned_params: '.*_sep_weights'
+ metric_name: 'acc_top1'
+ pruning_strategy_1:
+ class: 'UniformPruneStrategy'
+ pruner: 'pruner_1'
+ start_epoch: 10
+ target_ratio: 0.10
+ pruned_params: '.*_sep_weights'
+ metric_name: 'acc_top1'
+```
+
+Step 3: configure the common parameters.
+
+The parameters of the whole compression task are configured under compress_pass. As shown below, the task runs for 120 epochs and checkpoints are saved under ./checkpoints during compression. The strategies listed under compress_pass.strategies are the active ones; if several active strategies share the same start_epoch, they are invoked in the order they are listed under compress_pass.strategies.
+
+```python
+compress_pass:
+ epoch: 120
+ checkpoint_path: './checkpoints/'
+ strategies:
+ - pruning_strategy_0
+ - pruning_strategy_1
+```
+
+
+## 2. Using the compression strategies
+
+This chapter describes how to use the quantization-aware training, filter pruning, and distillation strategies. Before that, we recommend reading the corresponding algorithm overview:
+
+- [Quantization-aware training](tutorial.md#1-quantization-aware-training)
+- [Filter pruning](tutorial.md#2-filter-pruning)
+- [Distillation](tutorial.md#3-distillation)
+
+### 2.1 Quantization-aware training
+
+>Note: when combining multiple compression strategies, the quantization-aware training strategy must come last.
+
+```
+class Compressor(object):
+ def __init__(self,
+ place,
+ scope,
+ train_program,
+ train_reader=None,
+ train_feed_list=None,
+ train_fetch_list=None,
+ eval_program=None,
+ eval_reader=None,
+ eval_feed_list=None,
+ eval_fetch_list=None,
+ teacher_programs=[],
+ checkpoint_path='./checkpoints',
+ train_optimizer=None,
+ distiller_optimizer=None):
+```
+When constructing the Compressor object, note the following:
+
+- If the train program contains backward operators and optimization/update operators, train_optimizer must be set to None.
+- Parameter names in eval_program must exactly match the parameter names in train_program.
+- The final quantized int8 model is saved by pruning the eval_program network, so if the saved model is meant to be used for inference, the eval program must contain every operator inference needs.
+- Checkpoints store the model in float data type.
+
+Configure the quantization-aware training strategy in the configuration file as follows:
+```
+strategies:
+ quantization_strategy:
+ class: 'QuantizationStrategy'
+ start_epoch: 0
+ end_epoch: 10
+ float_model_save_path: './output/float'
+ mobile_model_save_path: './output/mobile'
+ int8_model_save_path: './output/int8'
+ weight_bits: 8
+ activation_bits: 8
+ weight_quantize_type: 'abs_max'
+ activation_quantize_type: 'abs_max'
+ save_in_nodes: ['image']
+ save_out_nodes: ['quan.tmp_2']
+ compressor:
+ epoch: 20
+ checkpoint_path: './checkpoints_quan/'
+ strategies:
+ - quantization_strategy
+```
+The configurable parameters are:
+
+- **class:** the name of the quantization strategy class; currently only `QuantizationStrategy` is supported
+- **start_epoch:** before start_epoch begins, the strategy inserts quantize and dequantize operators into train_program and eval_program. Quantization-aware training starts at start_epoch.
+- **end_epoch:** after end_epoch finishes, models are saved in the formats you specified. Note: quantization training does not stop after end_epoch; it continues until compressor.epoch.
+- **float_model_save_path:** the path for saving the model in float format. The weights' actual values fit in the int8 range but are stored as floats. If set to None, no float model is saved. Defaults to None.
+- **int8_model_save_path:** the path for saving the model in int8 format. If set to None, no int8 model is saved. Defaults to None.
+- **mobile_model_save_path:** the path for saving a model compatible with the paddle-mobile framework. If set to None, no mobile model is saved. Defaults to None.
+- **weight_bits:** the number of bits for quantizing weights; biases are not quantized.
+- **activation_bits:** the number of bits for quantizing activations.
+- **weight_quantize_type:** the weight quantization scheme; currently 'abs_max' and 'channel_wise_abs_max' are supported.
+- **activation_quantize_type:** the activation quantization scheme; currently `abs_max` or `range_abs_max`. `abs_max` computes the quantization range dynamically at every training step and during inference. `range_abs_max` computes a static range during training and uses it during inference.
+- **save_in_nodes:** a list of variable names. When saving the quantized model, the eval program network is pruned by a forward traversal from save_in_nodes. Defaults to the variable names given in eval_feed_list.
+- **save_out_nodes:** a list of variable names. When saving the quantized model, the eval program network is pruned by backtracking from save_out_nodes. Defaults to the variable names given in eval_fetch_list.
+
+### 2.2 Filter pruning
+This strategy reduces model size and computational complexity by removing filters from the specified conv layers. Depending on how the pruning ratios are chosen, there are two variants:
+
+- uniform pruning: prune the same fraction of filters from every layer.
+- sensitive pruning: prune a different fraction of filters from each layer according to its sensitivity.
+
+Both variants require loading a pretrained model.
+Filter pruning is built on structure pruning, so a `StructurePruner` must be registered in the configuration file, as shown below:
+
+```
+pruners:
+ pruner_1:
+ class: 'StructurePruner'
+ pruning_axis:
+ '*': 0
+ criterions:
+ '*': 'l1_norm'
+```
+
+A configuration file may register multiple pruners; all pruners go under the `pruners` key. A `pruner`'s configurable parameters are:
+
+- **class:** the pruner type; currently only `StructurePruner` is supported
+- **pruning_axis:** the dimension to prune along; `'conv*': 0` prunes the 0th dimension of every conv layer's filter weights, i.e. the number of filters.
+- **criterions**: the ranking criterion used when pruning the parameters matched by a wildcard. Currently only `l1_norm` is supported.
+
+
+#### 2.2.1 uniform pruning
+
+The uniform pruning strategy requires registering a `UniformPruneStrategy` instance under the `strategies` key of the configuration file and adding it to the compressor's strategies list,
+as shown below:
+```
+strategies:
+ uniform_pruning_strategy:
+ class: 'UniformPruneStrategy'
+ pruner: 'pruner_1'
+ start_epoch: 0
+ target_ratio: 0.5
+ pruned_params: '.*_sep_weights'
+compressor:
+ epoch: 100
+ strategies:
+ - uniform_pruning_strategy
+```
+The configurable parameters of UniformPruneStrategy are:
+
+- **class:** set to `UniformPruneStrategy` to use the uniform pruning strategy
+- **pruner:** the name of a StructurePruner instance, which must be registered in the configuration file. The pruner specifies how a single parameter is pruned.
+- **start_epoch:** the epoch at which the strategy starts. Before start_epoch, the strategy prunes the network's filters; from start_epoch on, the pruned network is fine-tuned until the whole compression task ends.
+- **target_ratio:** the fraction of the target network's FLOPS to prune away.
+- **pruned_params:** the names of the parameters to prune; wildcards are supported. For example, '*' prunes all parameters, and 'conv*' prunes every parameter whose name starts with 'conv'.
+
+
+
+#### 2.2.2 sensitive pruning
+
+The sensitive pruning strategy requires registering a `SensitivePruneStrategy` instance under the `strategies` key of the configuration file and adding it to the compressor's strategies list,
+as shown below:
+```
+strategies:
+ sensitive_pruning_strategy:
+ class: 'SensitivePruneStrategy'
+ pruner: 'pruner_1'
+ start_epoch: 0
+ delta_rate: 0.1
+ target_ratio: 0.5
+ num_steps: 1
+ eval_rate: 0.2
+ pruned_params: '.*_sep_weights'
+ sensitivities_file: 'mobilenet_acc_top1_sensitive.data'
+ metric_name: 'acc_top1'
+compressor:
+ epoch: 200
+ strategies:
+ - sensitive_pruning_strategy
+```
+The configurable parameters of SensitivePruneStrategy are:
+
+- **class:** set to `SensitivePruneStrategy` to use the sensitivity-based pruning strategy
+- **pruner:** the name of a StructurePruner instance, which must be registered in the configuration file. The pruner specifies how a single parameter is pruned.
+- **start_epoch:** the epoch at which the strategy starts. The first pruning step happens before start_epoch begins.
+- **delta_rate:** when collecting sensitivity statistics, the pruning ratio sweeps from 0 to 1 in increments of delta_rate. See the [algorithm overview]() for details.
+- **target_ratio:** the fraction of the target network's FLOPS to prune away.
+- **num_steps:** the number of steps in the whole pruning process. The fraction pruned at each step is $step = 1 - (1-target\_ratio)^{\frac{1}{num\_steps}}$ (see the sketch after this list).
+- **eval_rate:** the fraction of the validation data randomly sampled when computing sensitivities. During iterative pruning, sampling part of the validation data makes it fast to recompute each parameter's sensitivity at every step. When `num_steps` is 1, using the full validation set is recommended.
+
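+A quick check of the per-step formula above (plain Python arithmetic, not toolkit code): with target_ratio=0.5 and num_steps=3, each step prunes about 20.6% of the remaining FLOPS, and the three steps compound to exactly 50%.
+
+```python
+target_ratio, num_steps = 0.5, 3
+step = 1 - (1 - target_ratio) ** (1.0 / num_steps)  # ~0.2063 pruned per step
+remaining = (1 - step) ** num_steps                 # ~0.5 of the FLOPS left
+print(step, 1 - remaining)                          # 0.2063... 0.5
+```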
+
+### 2.3 Distillation
+
+PaddleSlim supports `FSP_loss`, `L2_loss`, and `softmax_with_cross_entropy_loss`; in the configuration file, these losses can connect any layer of the teacher net with any layer of the student net.
+
+Unlike the other strategies, the distillation strategy requires passing a teacher program and a distiller optimizer when constructing the Compressor object in the script.
+The teacher program must satisfy the following requirements:
+
+- The teacher program must load a pretrained model.
+- Variable names in the teacher program must not clash with variable names in the student program.
+- The teacher program must contain only forward operators, no backward operators.
+- You do not need to set the teacher program's stop_gradient attribute manually (to skip gradient computation and weight updates); PaddleSlim sets it to True automatically.
+
+The distiller optimizer adds backward operators and optimization operators to the network combining the student net and the teacher net; it is used only during the distillation training phase.
+
+Configure the distillation strategy in the configuration file as follows:
+```
+strategies:
+ distillation_strategy:
+ class: 'DistillationStrategy'
+ distillers: ['fsp_distiller', 'l2_distiller']
+ start_epoch: 0
+ end_epoch: 130
+```
+Strategy instances are registered under the `strategies` key. The configurable parameters are:
+
+- **class:** the strategy class name; for distillation, set it to DistillationStrategy.
+- **distillers:** a list of distillers; each distiller in the list defines one combined loss between the student net and the teacher net. The strategy sums the losses defined in this list into the objective optimized during the distillation phase. Distillers must be registered in the current configuration file beforehand; their registration is described below.
+- **start_epoch:** before start_epoch begins, the strategy merges the teacher net into the student net according to the user-defined losses, and adds backward computation and optimization-update operators for the combined loss.
+- **end_epoch:** after end_epoch finishes, the strategy removes the teacher net from the student net and restores the student net's own loss. After that, the phase of fine-tuning the student net alone begins.
+
+Distillers are configured as follows:
+
+**FSPDistiller**
+```
+distillers:
+ fsp_distiller:
+ class: 'FSPDistiller'
+ teacher_pairs: [['res2a_branch2a.conv2d.output.1.tmp_0', 'res3a_branch2a.conv2d.output.1.tmp_0']]
+ student_pairs: [['depthwise_conv2d_1.tmp_0', 'conv2d_3.tmp_0']]
+ distillation_loss_weight: 1
+```
+- **class:** the distiller class name; one of `FSPDistiller`, `L2Distiller`, `SoftLabelDistiller`
+- **teacher_pairs:** the teacher network's sections. Each section in the list is given by two variable names denoting two feature maps in the network. The two feature maps may have different channel counts but must have the same height and width.
+- **student_pairs:** the student network's sections. student_pairs[i] and teacher_pairs[i] together produce one fsp loss.
+- **distillation_loss_weight:** the weight of this fsp loss. Defaults to 1.0.
+
+**L2-loss**
+
+```
+distillers:
+ l2_distiller:
+ class: 'L2Distiller'
+ teacher_feature_map: 'fc_1.tmp_0'
+ student_feature_map: 'fc_0.tmp_0'
+ distillation_loss_weight: 1
+```
+
+- **teacher_feature_map:** the teacher-network feature map used to compute the l2 loss
+- **student_feature_map:** the student-network feature map used to compute the l2 loss; its shape must exactly match `teacher_feature_map`.
+
+**SoftLabelDistiller**
+
+```
+distillers:
+ soft_label_distiller:
+ class: 'SoftLabelDistiller'
+ student_temperature: 1.0
+ teacher_temperature: 1.0
+ teacher_feature_map: 'teacher.tmp_1'
+ student_feature_map: 'student.tmp_1'
+ distillation_loss_weight: 0.001
+```
+
+- **teacher_feature_map:** the teacher-network feature map used to compute softmax_with_cross_entropy.
+- **student_feature_map:** the student-network feature map used to compute softmax_with_cross_entropy; its shape must exactly match `teacher_feature_map`.
+- **student_temperature:** student_feature_map is divided by this factor before computing softmax_with_cross_entropy.
+- **teacher_temperature:** teacher_feature_map is divided by this factor before computing softmax_with_cross_entropy.
+- **distillation_loss_weight:** the weight of this loss, defaulting to 1.0; a sketch of the computed loss follows.
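+
+As an illustration of the loss SoftLabelDistiller describes, a minimal fluid sketch; `student_logits` and `teacher_logits` stand for the two feature maps named above, and this is a sketch of the loss only, not the distiller's source:
+
+```python
+import paddle.fluid as fluid
+
+def soft_label_loss(student_logits, teacher_logits,
+                    student_temperature=1.0, teacher_temperature=1.0,
+                    weight=0.001):
+    # Divide each feature map by its temperature before the softmax.
+    teacher_prob = fluid.layers.softmax(teacher_logits / teacher_temperature)
+    cost = fluid.layers.softmax_with_cross_entropy(
+        logits=student_logits / student_temperature,
+        label=teacher_prob,
+        soft_label=True)
+    return weight * fluid.layers.reduce_mean(cost)
+```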
diff --git a/PaddleSlim/models/__init__.py b/PaddleSlim/models/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..458020712dfedac220586a8a31852f5163ad407f
--- /dev/null
+++ b/PaddleSlim/models/__init__.py
@@ -0,0 +1,2 @@
+from .mobilenet import MobileNet
+from .resnet import ResNet50, ResNet101, ResNet152
diff --git a/PaddleSlim/models/mobilenet.py b/PaddleSlim/models/mobilenet.py
new file mode 100644
index 0000000000000000000000000000000000000000..5c4b16a5ed44741054ab7f5f167f696960c4059b
--- /dev/null
+++ b/PaddleSlim/models/mobilenet.py
@@ -0,0 +1,197 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle.fluid as fluid
+from paddle.fluid.initializer import MSRA
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = ['MobileNet']
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class MobileNet():
+ def __init__(self):
+ self.params = train_parameters
+
+ def net(self, input, class_dim=1000, scale=1.0):
+ # conv1: 112x112
+ input = self.conv_bn_layer(
+ input,
+ filter_size=3,
+ channels=3,
+ num_filters=int(32 * scale),
+ stride=2,
+ padding=1,
+ name="conv1")
+
+ # 56x56
+ input = self.depthwise_separable(
+ input,
+ num_filters1=32,
+ num_filters2=64,
+ num_groups=32,
+ stride=1,
+ scale=scale,
+ name="conv2_1")
+
+ input = self.depthwise_separable(
+ input,
+ num_filters1=64,
+ num_filters2=128,
+ num_groups=64,
+ stride=2,
+ scale=scale,
+ name="conv2_2")
+
+ # 28x28
+ input = self.depthwise_separable(
+ input,
+ num_filters1=128,
+ num_filters2=128,
+ num_groups=128,
+ stride=1,
+ scale=scale,
+ name="conv3_1")
+
+ input = self.depthwise_separable(
+ input,
+ num_filters1=128,
+ num_filters2=256,
+ num_groups=128,
+ stride=2,
+ scale=scale,
+ name="conv3_2")
+
+ # 14x14
+ input = self.depthwise_separable(
+ input,
+ num_filters1=256,
+ num_filters2=256,
+ num_groups=256,
+ stride=1,
+ scale=scale,
+ name="conv4_1")
+
+ input = self.depthwise_separable(
+ input,
+ num_filters1=256,
+ num_filters2=512,
+ num_groups=256,
+ stride=2,
+ scale=scale,
+ name="conv4_2")
+
+ # 14x14
+ for i in range(5):
+ input = self.depthwise_separable(
+ input,
+ num_filters1=512,
+ num_filters2=512,
+ num_groups=512,
+ stride=1,
+ scale=scale,
+ name="conv5" + "_" + str(i + 1))
+ # 7x7
+ input = self.depthwise_separable(
+ input,
+ num_filters1=512,
+ num_filters2=1024,
+ num_groups=512,
+ stride=2,
+ scale=scale,
+ name="conv5_6")
+
+ input = self.depthwise_separable(
+ input,
+ num_filters1=1024,
+ num_filters2=1024,
+ num_groups=1024,
+ stride=1,
+ scale=scale,
+ name="conv6")
+
+ input = fluid.layers.pool2d(
+ input=input,
+ pool_size=0,
+ pool_stride=1,
+ pool_type='avg',
+ global_pooling=True)
+
+ output = fluid.layers.fc(input=input,
+ size=class_dim,
+ act='softmax',
+ param_attr=ParamAttr(
+ initializer=MSRA(), name="fc7_weights"),
+ bias_attr=ParamAttr(name="fc7_offset"))
+
+ return output
+
+ def conv_bn_layer(self,
+ input,
+ filter_size,
+ num_filters,
+ stride,
+ padding,
+ channels=None,
+ num_groups=1,
+ act='relu',
+ use_cudnn=True,
+ name=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=padding,
+ groups=num_groups,
+ act=None,
+ use_cudnn=use_cudnn,
+ param_attr=ParamAttr(
+ initializer=MSRA(), name=name + "_weights"),
+ bias_attr=False)
+ bn_name = name + "_bn"
+ return fluid.layers.batch_norm(
+ input=conv,
+ act=act,
+ param_attr=ParamAttr(name=bn_name + "_scale"),
+ bias_attr=ParamAttr(name=bn_name + "_offset"),
+ moving_mean_name=bn_name + '_mean',
+ moving_variance_name=bn_name + '_variance')
+
+ def depthwise_separable(self,
+ input,
+ num_filters1,
+ num_filters2,
+ num_groups,
+ stride,
+ scale,
+ name=None):
+ depthwise_conv = self.conv_bn_layer(
+ input=input,
+ filter_size=3,
+ num_filters=int(num_filters1 * scale),
+ stride=stride,
+ padding=1,
+ num_groups=int(num_groups * scale),
+ use_cudnn=False,
+ name=name + "_dw")
+
+ pointwise_conv = self.conv_bn_layer(
+ input=depthwise_conv,
+ filter_size=1,
+ num_filters=int(num_filters2 * scale),
+ stride=1,
+ padding=0,
+ name=name + "_sep")
+ return pointwise_conv
diff --git a/PaddleSlim/models/resnet.py b/PaddleSlim/models/resnet.py
new file mode 100644
index 0000000000000000000000000000000000000000..5ee8ff886d79074ec5483177e13280b405bca94e
--- /dev/null
+++ b/PaddleSlim/models/resnet.py
@@ -0,0 +1,165 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import paddle
+import paddle.fluid as fluid
+import math
+from paddle.fluid.param_attr import ParamAttr
+
+__all__ = ["ResNet", "ResNet50", "ResNet101", "ResNet152"]
+
+train_parameters = {
+ "input_size": [3, 224, 224],
+ "input_mean": [0.485, 0.456, 0.406],
+ "input_std": [0.229, 0.224, 0.225],
+ "learning_strategy": {
+ "name": "piecewise_decay",
+ "batch_size": 256,
+ "epochs": [30, 60, 90],
+ "steps": [0.1, 0.01, 0.001, 0.0001]
+ }
+}
+
+
+class ResNet():
+ def __init__(self, layers=50):
+ self.params = train_parameters
+ self.layers = layers
+
+ def net(self, input, class_dim=1000, conv1_name='conv1', fc_name=None):
+ layers = self.layers
+ supported_layers = [50, 101, 152]
+ assert layers in supported_layers, \
+ "supported layers are {} but input layer is {}".format(supported_layers, layers)
+
+ if layers == 50:
+ depth = [3, 4, 6, 3]
+ elif layers == 101:
+ depth = [3, 4, 23, 3]
+ elif layers == 152:
+ depth = [3, 8, 36, 3]
+ num_filters = [64, 128, 256, 512]
+
+ # TODO(wanghaoshuang@baidu.com):
+ # fix name("conv1") conflict between student and teacher in distillation.
+ conv = self.conv_bn_layer(
+ input=input,
+ num_filters=64,
+ filter_size=7,
+ stride=2,
+ act='relu',
+ name=conv1_name)
+ conv = fluid.layers.pool2d(
+ input=conv,
+ pool_size=3,
+ pool_stride=2,
+ pool_padding=1,
+ pool_type='max')
+
+ for block in range(len(depth)):
+ for i in range(depth[block]):
+ if layers in [101, 152] and block == 2:
+ if i == 0:
+ conv_name = "res" + str(block + 2) + "a"
+ else:
+ conv_name = "res" + str(block + 2) + "b" + str(i)
+ else:
+ conv_name = "res" + str(block + 2) + chr(97 + i)
+ conv = self.bottleneck_block(
+ input=conv,
+ num_filters=num_filters[block],
+ stride=2 if i == 0 and block != 0 else 1,
+ name=conv_name)
+
+ pool = fluid.layers.pool2d(
+ input=conv, pool_size=7, pool_type='avg', global_pooling=True)
+ stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
+ out = fluid.layers.fc(input=pool,
+ size=class_dim,
+ act='softmax',
+ name=fc_name,
+ param_attr=fluid.param_attr.ParamAttr(
+ initializer=fluid.initializer.Uniform(-stdv,
+ stdv)))
+ return out
+
+ def conv_bn_layer(self,
+ input,
+ num_filters,
+ filter_size,
+ stride=1,
+ groups=1,
+ act=None,
+ name=None):
+ conv = fluid.layers.conv2d(
+ input=input,
+ num_filters=num_filters,
+ filter_size=filter_size,
+ stride=stride,
+ padding=(filter_size - 1) // 2,
+ groups=groups,
+ act=None,
+ param_attr=ParamAttr(name=name + "_weights"),
+ bias_attr=False,
+ name=name + '.conv2d.output.1')
+ if name == "conv1":
+ bn_name = "bn_" + name
+ else:
+ bn_name = "bn" + name[3:]
+ return fluid.layers.batch_norm(
+ input=conv,
+ act=act,
+ name=bn_name + '.output.1',
+ param_attr=ParamAttr(name=bn_name + '_scale'),
+ bias_attr=ParamAttr(bn_name + '_offset'),
+ moving_mean_name=bn_name + '_mean',
+ moving_variance_name=bn_name + '_variance', )
+
+ def shortcut(self, input, ch_out, stride, name):
+ ch_in = input.shape[1]
+ if ch_in != ch_out or stride != 1:
+ return self.conv_bn_layer(input, ch_out, 1, stride, name=name)
+ else:
+ return input
+
+ def bottleneck_block(self, input, num_filters, stride, name):
+ conv0 = self.conv_bn_layer(
+ input=input,
+ num_filters=num_filters,
+ filter_size=1,
+ act='relu',
+ name=name + "_branch2a")
+ conv1 = self.conv_bn_layer(
+ input=conv0,
+ num_filters=num_filters,
+ filter_size=3,
+ stride=stride,
+ act='relu',
+ name=name + "_branch2b")
+ conv2 = self.conv_bn_layer(
+ input=conv1,
+ num_filters=num_filters * 4,
+ filter_size=1,
+ act=None,
+ name=name + "_branch2c")
+
+ short = self.shortcut(
+ input, num_filters * 4, stride, name=name + "_branch1")
+
+ return fluid.layers.elementwise_add(
+ x=short, y=conv2, act='relu', name=name + ".add.output.5")
+
+
+def ResNet50():
+ model = ResNet(layers=50)
+ return model
+
+
+def ResNet101():
+ model = ResNet(layers=101)
+ return model
+
+
+def ResNet152():
+ model = ResNet(layers=152)
+ return model
diff --git a/PaddleSlim/quant_low_level_api/quant.py b/PaddleSlim/quant_low_level_api/quant.py
new file mode 100644
index 0000000000000000000000000000000000000000..76e96915eded9b36776a3c7dcb5997c70d810287
--- /dev/null
+++ b/PaddleSlim/quant_low_level_api/quant.py
@@ -0,0 +1,391 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import os
+import numpy as np
+import time
+import functools
+import paddle
+import paddle.fluid as fluid
+from paddle.fluid.framework import IrGraph
+from paddle.fluid.contrib.slim.quantization import QuantizationTransformPass
+from paddle.fluid.contrib.slim.quantization import QuantizationFreezePass
+from paddle.fluid.contrib.slim.quantization import ConvertToInt8Pass
+from paddle.fluid.contrib.slim.quantization import TransformForMobilePass
+from paddle.fluid import core
+import argparse
+import subprocess
+import sys
+sys.path.append('..')
+import reader
+import models
+from utility import add_arguments, print_arguments
+from utility import save_persistable_nodes, load_persistable_nodes
+
+parser = argparse.ArgumentParser(description=__doc__)
+add_arg = functools.partial(add_arguments, argparser=parser)
+# yapf: disable
+add_arg('batch_size', int, 256, "Minibatch size.")
+add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
+add_arg('total_images', int, 1281167, "Training image number.")
+add_arg('num_epochs', int, 120, "number of epochs.")
+add_arg('class_dim', int, 1000, "Class number.")
+add_arg('image_shape', str, "3,224,224", "input image size")
+add_arg('model_save_dir', str, "output", "model save directory")
+add_arg('pretrained_fp32_model', str, None, "Whether to use the pretrained float32 model to initialize the weights.")
+add_arg('checkpoint', str, None, "Whether to resume the training process from the checkpoint.")
+add_arg('lr', float, 0.1, "set learning rate.")
+add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
+add_arg('model', str, "SE_ResNeXt50_32x4d", "Set the network to use.")
+add_arg('data_dir', str, "./data/ILSVRC2012", "The ImageNet dataset root dir.")
+add_arg('act_quant_type', str, "abs_max", "quantization type for activation, valid value:'abs_max','range_abs_max', 'moving_average_abs_max'" )
+add_arg('wt_quant_type', str, "abs_max", "quantization type for weight, valid value:'abs_max','channel_wise_abs_max'" )
+# yapf: enable
+
+def optimizer_setting(params):
+ ls = params["learning_strategy"]
+ if ls["name"] == "piecewise_decay":
+ if "total_images" not in params:
+ total_images = 1281167
+ else:
+ total_images = params["total_images"]
+ batch_size = ls["batch_size"]
+ step = int(total_images / batch_size + 1)
+
+ bd = [step * e for e in ls["epochs"]]
+ print("decay list:{}".format(bd))
+ base_lr = params["lr"]
+ lr = []
+ lr = [base_lr * (0.1**i) for i in range(len(bd) + 1)]
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=fluid.layers.piecewise_decay(
+ boundaries=bd, values=lr),
+ momentum=0.9,
+ regularization=fluid.regularizer.L2Decay(1e-4))
+
+ elif ls["name"] == "cosine_decay":
+ if "total_images" not in params:
+ total_images = 1281167
+ else:
+ total_images = params["total_images"]
+
+ batch_size = ls["batch_size"]
+ step = int(total_images / batch_size + 1)
+
+ lr = params["lr"]
+ num_epochs = params["num_epochs"]
+
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=fluid.layers.cosine_decay(
+ learning_rate=lr, step_each_epoch=step, epochs=num_epochs),
+ momentum=0.9,
+ regularization=fluid.regularizer.L2Decay(4e-5))
+ elif ls["name"] == "exponential_decay":
+ if "total_images" not in params:
+ total_images = 1281167
+ else:
+ total_images = params["total_images"]
+ batch_size = ls["batch_size"]
+ step = int(total_images / batch_size +1)
+ lr = params["lr"]
+ num_epochs = params["num_epochs"]
+ learning_decay_rate_factor=ls["learning_decay_rate_factor"]
+ num_epochs_per_decay = ls["num_epochs_per_decay"]
+ NUM_GPUS = 1
+
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=fluid.layers.exponential_decay(
+ learning_rate = lr * NUM_GPUS,
+ decay_steps = step * num_epochs_per_decay / NUM_GPUS,
+ decay_rate = learning_decay_rate_factor),
+ momentum=0.9,
+
+ regularization = fluid.regularizer.L2Decay(4e-5))
+
+ else:
+ lr = params["lr"]
+ optimizer = fluid.optimizer.Momentum(
+ learning_rate=lr,
+ momentum=0.9,
+ regularization=fluid.regularizer.L2Decay(1e-4))
+
+ return optimizer
+
+def net_config(image, label, model, args):
+ model_list = [m for m in dir(models) if "__" not in m]
+ assert args.model in model_list, "{} is not in lists: {}".format(
+ args.model, model_list)
+
+ class_dim = args.class_dim
+ model_name = args.model
+
+ if model_name == "GoogleNet":
+ out0, out1, out2 = model.net(input=image, class_dim=class_dim)
+ cost0 = fluid.layers.cross_entropy(input=out0, label=label)
+ cost1 = fluid.layers.cross_entropy(input=out1, label=label)
+ cost2 = fluid.layers.cross_entropy(input=out2, label=label)
+ avg_cost0 = fluid.layers.mean(x=cost0)
+ avg_cost1 = fluid.layers.mean(x=cost1)
+ avg_cost2 = fluid.layers.mean(x=cost2)
+
+ avg_cost = avg_cost0 + 0.3 * avg_cost1 + 0.3 * avg_cost2
+ acc_top1 = fluid.layers.accuracy(input=out0, label=label, k=1)
+ acc_top5 = fluid.layers.accuracy(input=out0, label=label, k=5)
+ out = out2
+ else:
+ out = model.net(input=image, class_dim=class_dim)
+ cost = fluid.layers.cross_entropy(input=out, label=label)
+
+ avg_cost = fluid.layers.mean(x=cost)
+ acc_top1 = fluid.layers.accuracy(input=out, label=label, k=1)
+ acc_top5 = fluid.layers.accuracy(input=out, label=label, k=5)
+
+ return out, avg_cost, acc_top1, acc_top5
+
+
+def build_program(is_train, main_prog, startup_prog, args):
+ image_shape = [int(m) for m in args.image_shape.split(",")]
+ model_name = args.model
+ model_list = [m for m in dir(models) if "__" not in m]
+ assert model_name in model_list, "{} is not in lists: {}".format(args.model,
+ model_list)
+ model = models.__dict__[model_name]()
+ with fluid.program_guard(main_prog, startup_prog):
+ py_reader = fluid.layers.py_reader(
+ capacity=16,
+ shapes=[[-1] + image_shape, [-1, 1]],
+ lod_levels=[0, 0],
+ dtypes=["float32", "int64"],
+ use_double_buffer=True)
+ with fluid.unique_name.guard():
+ image, label = fluid.layers.read_file(py_reader)
+ out, avg_cost, acc_top1, acc_top5 = net_config(image, label, model, args)
+ avg_cost.persistable = True
+ acc_top1.persistable = True
+ acc_top5.persistable = True
+ if is_train:
+ params = model.params
+ params["total_images"] = args.total_images
+ params["lr"] = args.lr
+ params["num_epochs"] = args.num_epochs
+ params["learning_strategy"]["batch_size"] = args.batch_size
+ params["learning_strategy"]["name"] = args.lr_strategy
+
+ optimizer = optimizer_setting(params)
+ optimizer.minimize(avg_cost)
+ global_lr = optimizer._global_learning_rate()
+ if is_train:
+ return image, out, py_reader, avg_cost, acc_top1, acc_top5, global_lr
+ else:
+ return image, out, py_reader, avg_cost, acc_top1, acc_top5
+
+def train(args):
+ # parameters from arguments
+ model_name = args.model
+ pretrained_fp32_model = args.pretrained_fp32_model
+ checkpoint = args.checkpoint
+ model_save_dir = args.model_save_dir
+ data_dir = args.data_dir
+ activation_quant_type = args.act_quant_type
+ weight_quant_type = args.wt_quant_type
+ print("Using %s as the actiavtion quantize type." % activation_quant_type)
+ print("Using %s as the weight quantize type." % weight_quant_type)
+
+ startup_prog = fluid.Program()
+ train_prog = fluid.Program()
+ test_prog = fluid.Program()
+
+ _, _, train_py_reader, train_cost, train_acc1, train_acc5, global_lr = build_program(
+ is_train=True,
+ main_prog=train_prog,
+ startup_prog=startup_prog,
+ args=args)
+ image, out, test_py_reader, test_cost, test_acc1, test_acc5 = build_program(
+ is_train=False,
+ main_prog=test_prog,
+ startup_prog=startup_prog,
+ args=args)
+ test_prog = test_prog.clone(for_test=True)
+
+ place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
+ exe = fluid.Executor(place)
+ exe.run(startup_prog)
+ main_graph = IrGraph(core.Graph(train_prog.desc), for_test=False)
+ test_graph = IrGraph(core.Graph(test_prog.desc), for_test=True)
+
+ if pretrained_fp32_model:
+ def if_exist(var):
+ return os.path.exists(os.path.join(pretrained_fp32_model, var.name))
+ fluid.io.load_vars(
+ exe, pretrained_fp32_model, main_program=train_prog, predicate=if_exist)
+
+ if args.use_gpu:
+ visible_device = os.getenv('CUDA_VISIBLE_DEVICES')
+ if visible_device:
+ device_num = len(visible_device.split(','))
+ else:
+ device_num = subprocess.check_output(
+ ['nvidia-smi', '-L']).decode().count('\n')
+ else:
+ device_num = 1
+
+ train_batch_size = args.batch_size // device_num  # integer batch size per device
+ test_batch_size = 1 if activation_quant_type == 'abs_max' else 8
+ train_reader = paddle.batch(
+ reader.train(data_dir=data_dir), batch_size=train_batch_size, drop_last=True)
+ test_reader = paddle.batch(reader.val(data_dir=data_dir), batch_size=test_batch_size)
+
+ train_py_reader.decorate_paddle_reader(train_reader)
+ test_py_reader.decorate_paddle_reader(test_reader)
+
+ train_fetch_list = [train_cost.name, train_acc1.name, train_acc5.name, global_lr.name]
+ test_fetch_list = [test_cost.name, test_acc1.name, test_acc5.name]
+
+ # 1. Make some quantization transforms in the graph before training and testing.
+ # According to the weight and activation quantization type, the graph will be added
+ # some fake quantize operators and fake dequantize operators.
+ transform_pass = QuantizationTransformPass(
+ scope=fluid.global_scope(), place=place,
+ activation_quantize_type=activation_quant_type,
+ weight_quantize_type=weight_quant_type)
+ transform_pass.apply(main_graph)
+ transform_pass.apply(test_graph)
+
+ if checkpoint:
+ load_persistable_nodes(exe, checkpoint, main_graph)
+
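+ # Keep memory reuse off: the transform pass above inserts fake quantize/
+ # dequantize variables whose values these optimizations could otherwise
+ # reuse or overwrite.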
+ build_strategy = fluid.BuildStrategy()
+ build_strategy.memory_optimize = False
+ build_strategy.enable_inplace = False
+ binary = fluid.CompiledProgram(main_graph.graph).with_data_parallel(
+ loss_name=train_cost.name, build_strategy=build_strategy)
+ test_prog = test_graph.to_program()
+ params = models.__dict__[args.model]().params
+ for pass_id in range(params["num_epochs"]):
+
+ train_py_reader.start()
+
+ train_info = [[], [], []]
+ test_info = [[], [], []]
+ train_time = []
+ batch_id = 0
+ try:
+ while True:
+ t1 = time.time()
+ loss, acc1, acc5, lr = exe.run(binary, fetch_list=train_fetch_list)
+ t2 = time.time()
+ period = t2 - t1
+ loss = np.mean(np.array(loss))
+ acc1 = np.mean(np.array(acc1))
+ acc5 = np.mean(np.array(acc5))
+ train_info[0].append(loss)
+ train_info[1].append(acc1)
+ train_info[2].append(acc5)
+ lr = np.mean(np.array(lr))
+ train_time.append(period)
+ if batch_id % 10 == 0:
+ print("Pass {0}, trainbatch {1}, loss {2}, \
+ acc1 {3}, acc5 {4}, lr {5}, time {6}"
+ .format(pass_id, batch_id, loss, acc1, acc5, "%.6f" %
+ lr, "%2.2f sec" % period))
+ sys.stdout.flush()
+ batch_id += 1
+ except fluid.core.EOFException:
+ train_py_reader.reset()
+
+ train_loss = np.array(train_info[0]).mean()
+ train_acc1 = np.array(train_info[1]).mean()
+ train_acc5 = np.array(train_info[2]).mean()
+
+ test_py_reader.start()
+
+ test_batch_id = 0
+ try:
+ while True:
+ t1 = time.time()
+ loss, acc1, acc5 = exe.run(program=test_prog,
+ fetch_list=test_fetch_list)
+ t2 = time.time()
+ period = t2 - t1
+ loss = np.mean(loss)
+ acc1 = np.mean(acc1)
+ acc5 = np.mean(acc5)
+ test_info[0].append(loss)
+ test_info[1].append(acc1)
+ test_info[2].append(acc5)
+ if test_batch_id % 10 == 0:
+ print("Pass {0},testbatch {1},loss {2}, \
+ acc1 {3},acc5 {4},time {5}"
+ .format(pass_id, test_batch_id, loss, acc1, acc5,
+ "%2.2f sec" % period))
+ sys.stdout.flush()
+ test_batch_id += 1
+ except fluid.core.EOFException:
+ test_py_reader.reset()
+
+ test_loss = np.array(test_info[0]).mean()
+ test_acc1 = np.array(test_info[1]).mean()
+ test_acc5 = np.array(test_info[2]).mean()
+
+ print("End pass {0}, train_loss {1}, train_acc1 {2}, train_acc5 {3}, "
+ "test_loss {4}, test_acc1 {5}, test_acc5 {6}".format(
+ pass_id, train_loss, train_acc1, train_acc5, test_loss,
+ test_acc1, test_acc5))
+ sys.stdout.flush()
+
+ save_checkpoint_path = os.path.join(model_save_dir, model_name, str(pass_id))
+ if not os.path.isdir(save_checkpoint_path):
+ os.makedirs(save_checkpoint_path)
+ save_persistable_nodes(exe, save_checkpoint_path, main_graph)
+
+ model_path = os.path.join(model_save_dir, model_name, args.act_quant_type)
+ float_path = os.path.join(model_path, 'float')
+ int8_path = os.path.join(model_path, 'int8')
+ mobile_path = os.path.join(model_path, 'mobile')
+ if not os.path.isdir(model_path):
+ os.makedirs(model_path)
+
+ # 2. Freeze the graph after training by adjusting the quantize
+ # operators' order for the inference.
+ freeze_pass = QuantizationFreezePass(
+ scope=fluid.global_scope(),
+ place=place,
+ weight_quantize_type=weight_quant_type)
+ freeze_pass.apply(test_graph)
+ server_program = test_graph.to_program()
+ fluid.io.save_inference_model(
+ dirname=float_path,
+ feeded_var_names=[image.name],
+ target_vars=[out], executor=exe,
+ main_program=server_program)
+
+ # 3. Convert the weights into int8_t type.
+ # (This step is optional.)
+ convert_int8_pass = ConvertToInt8Pass(scope=fluid.global_scope(), place=place)
+ convert_int8_pass.apply(test_graph)
+ server_int8_program = test_graph.to_program()
+ fluid.io.save_inference_model(
+ dirname=int8_path,
+ feeded_var_names=[image.name],
+ target_vars=[out], executor=exe,
+ main_program=server_int8_program)
+
+ # 4. Convert the freezed graph for paddle-mobile execution.
+ # (This step is optional.)
+ mobile_pass = TransformForMobilePass()
+ mobile_pass.apply(test_graph)
+ mobile_program = test_graph.to_program()
+ fluid.io.save_inference_model(
+ dirname=mobile_path,
+ feeded_var_names=[image.name],
+ target_vars=[out], executor=exe,
+ main_program=mobile_program)
+
+def main():
+ args = parser.parse_args()
+ print_arguments(args)
+ train(args)
+
+
+if __name__ == '__main__':
+ main()
diff --git a/PaddleSlim/quant_low_level_api/run_quant.sh b/PaddleSlim/quant_low_level_api/run_quant.sh
new file mode 100644
index 0000000000000000000000000000000000000000..556b14db168114cb756ac6592d1cabe77c684d0d
--- /dev/null
+++ b/PaddleSlim/quant_low_level_api/run_quant.sh
@@ -0,0 +1,64 @@
+#!/usr/bin/env bash
+
+# download pretrain model
+root_url="http://paddle-imagenet-models-name.bj.bcebos.com"
+MobileNetV1="MobileNetV1_pretrained.zip"
+ResNet50="ResNet50_pretrained.zip"
+pretrain_dir='../pretrain'
+
+if [ ! -d ${pretrain_dir} ]; then
+ mkdir ${pretrain_dir}
+fi
+
+cd ${pretrain_dir}
+
+if [ ! -f ${MobileNetV1} ]; then
+ wget ${root_url}/${MobileNetV1}
+ unzip ${MobileNetV1}
+fi
+
+if [ ! -f ${ResNet50} ]; then
+ wget ${root_url}/${ResNet50}
+ unzip ${ResNet50}
+fi
+
+cd -
+
+
+export CUDA_VISIBLE_DEVICES=0,1,2,3
+
+#MobileNet v1:
+python quant.py \
+ --model=MobileNet \
+ --pretrained_fp32_model=${pretrain_dir}/MobileNetV1_pretrained \
+ --use_gpu=True \
+ --data_dir=../data/ILSVRC2012 \
+ --batch_size=256 \
+ --total_images=1281167 \
+ --class_dim=1000 \
+ --image_shape=3,224,224 \
+ --model_save_dir=output/ \
+ --lr_strategy=piecewise_decay \
+ --num_epochs=20 \
+ --lr=0.0001 \
+ --act_quant_type=abs_max \
+ --wt_quant_type=abs_max
+
+
+#ResNet50:
+#python quant.py \
+# --model=ResNet50 \
+# --pretrained_fp32_model=${pretrain_dir}/ResNet50_pretrained \
+# --use_gpu=True \
+# --data_dir=../data/ILSVRC2012 \
+# --batch_size=128 \
+# --total_images=1281167 \
+# --class_dim=1000 \
+# --image_shape=3,224,224 \
+# --model_save_dir=output/ \
+# --lr_strategy=piecewise_decay \
+# --num_epochs=20 \
+# --lr=0.0001 \
+# --act_quant_type=abs_max \
+# --wt_quant_type=abs_max
+
diff --git a/PaddleSlim/reader.py b/PaddleSlim/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..f4a9da1e03102fceedb2bd295ebb399e19e93361
--- /dev/null
+++ b/PaddleSlim/reader.py
@@ -0,0 +1,188 @@
+import os
+import math
+import random
+import functools
+import numpy as np
+import paddle
+from PIL import Image, ImageEnhance
+
+random.seed(0)
+np.random.seed(0)
+
+DATA_DIM = 224
+
+THREAD = 16
+BUF_SIZE = 10240
+
+DATA_DIR = 'data/ILSVRC2012'
+
+img_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))
+img_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))
+
+
+def resize_short(img, target_size):
+ percent = float(target_size) / min(img.size[0], img.size[1])
+ resized_width = int(round(img.size[0] * percent))
+ resized_height = int(round(img.size[1] * percent))
+ img = img.resize((resized_width, resized_height), Image.LANCZOS)
+ return img
+
+
+def crop_image(img, target_size, center):
+ width, height = img.size
+ size = target_size
+ if center:
+ w_start = (width - size) // 2
+ h_start = (height - size) // 2
+ else:
+ w_start = np.random.randint(0, width - size + 1)
+ h_start = np.random.randint(0, height - size + 1)
+ w_end = w_start + size
+ h_end = h_start + size
+ img = img.crop((w_start, h_start, w_end, h_end))
+ return img
+
+
+def random_crop(img, size, scale=[0.08, 1.0], ratio=[3. / 4., 4. / 3.]):
+ aspect_ratio = math.sqrt(np.random.uniform(*ratio))
+ w = 1. * aspect_ratio
+ h = 1. / aspect_ratio
+
+ bound = min((float(img.size[0]) / img.size[1]) / (w**2),
+ (float(img.size[1]) / img.size[0]) / (h**2))
+ scale_max = min(scale[1], bound)
+ scale_min = min(scale[0], bound)
+
+ target_area = img.size[0] * img.size[1] * np.random.uniform(scale_min,
+ scale_max)
+ target_size = math.sqrt(target_area)
+ w = int(target_size * w)
+ h = int(target_size * h)
+
+ i = np.random.randint(0, img.size[0] - w + 1)
+ j = np.random.randint(0, img.size[1] - h + 1)
+
+ img = img.crop((i, j, i + w, j + h))
+ img = img.resize((size, size), Image.LANCZOS)
+ return img
+
+
+def rotate_image(img):
+ angle = np.random.randint(-10, 11)
+ img = img.rotate(angle)
+ return img
+
+
+def distort_color(img):
+ def random_brightness(img, lower=0.5, upper=1.5):
+ e = np.random.uniform(lower, upper)
+ return ImageEnhance.Brightness(img).enhance(e)
+
+ def random_contrast(img, lower=0.5, upper=1.5):
+ e = np.random.uniform(lower, upper)
+ return ImageEnhance.Contrast(img).enhance(e)
+
+ def random_color(img, lower=0.5, upper=1.5):
+ e = np.random.uniform(lower, upper)
+ return ImageEnhance.Color(img).enhance(e)
+
+ ops = [random_brightness, random_contrast, random_color]
+ np.random.shuffle(ops)
+
+ img = ops[0](img)
+ img = ops[1](img)
+ img = ops[2](img)
+
+ return img
+
+
+def process_image(sample, mode, color_jitter, rotate):
+ img_path = sample[0]
+
+ img = Image.open(img_path)
+ if mode == 'train':
+ if rotate: img = rotate_image(img)
+ img = random_crop(img, DATA_DIM)
+ else:
+ img = resize_short(img, target_size=256)
+ img = crop_image(img, target_size=DATA_DIM, center=True)
+ if mode == 'train':
+ if color_jitter:
+ img = distort_color(img)
+ if np.random.randint(0, 2) == 1:
+ img = img.transpose(Image.FLIP_LEFT_RIGHT)
+
+ if img.mode != 'RGB':
+ img = img.convert('RGB')
+
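+ # HWC uint8 -> CHW float32 in [0, 1], then channel-wise normalization with
+ # the ImageNet mean and std defined above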
+ img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255
+ img -= img_mean
+ img /= img_std
+
+ if mode == 'train' or mode == 'val':
+ return img, sample[1]
+ elif mode == 'test':
+ return [img]
+
+
+def _reader_creator(file_list,
+ mode,
+ shuffle=False,
+ color_jitter=False,
+ rotate=False,
+ data_dir=DATA_DIR,
+ batch_size=1):
+ def reader():
+ with open(file_list) as flist:
+ full_lines = [line.strip() for line in flist]
+ if shuffle:
+ np.random.shuffle(full_lines)
+ if mode == 'train' and os.getenv('PADDLE_TRAINING_ROLE'):
+ # distributed mode if the env var `PADDLE_TRAINING_ROLE` exists
+ trainer_id = int(os.getenv("PADDLE_TRAINER_ID", "0"))
+ trainer_count = int(os.getenv("PADDLE_TRAINERS", "1"))
+ per_node_lines = len(full_lines) // trainer_count
+ lines = full_lines[trainer_id * per_node_lines:(trainer_id + 1)
+ * per_node_lines]
+ print(
+ "read images from %d, length: %d, lines length: %d, total: %d"
+ % (trainer_id * per_node_lines, per_node_lines, len(lines),
+ len(full_lines)))
+ else:
+ lines = full_lines
+
+ for line in lines:
+ if mode == 'train' or mode == 'val':
+ img_path, label = line.split()
+ # img_path = img_path.replace("JPEG", "jpeg")
+ img_path = os.path.join(data_dir, img_path)
+ yield img_path, int(label)
+ elif mode == 'test':
+ img_path = os.path.join(data_dir, line)
+ yield [img_path]
+
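+ # xmap_readers decodes and augments samples in THREAD worker threads,
+ # buffering up to BUF_SIZE processed samples ahead of the consumer.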
+ mapper = functools.partial(
+ process_image, mode=mode, color_jitter=color_jitter, rotate=rotate)
+
+ return paddle.reader.xmap_readers(mapper, reader, THREAD, BUF_SIZE)
+
+
+def train(data_dir=DATA_DIR):
+ file_list = os.path.join(data_dir, 'train_list.txt')
+ return _reader_creator(
+ file_list,
+ 'train',
+ shuffle=True,
+ color_jitter=False,
+ rotate=False,
+ data_dir=data_dir)
+
+
+def val(data_dir=DATA_DIR):
+ file_list = os.path.join(data_dir, 'val_list.txt')
+ return _reader_creator(file_list, 'val', shuffle=False, data_dir=data_dir)
+
+
+def test(data_dir=DATA_DIR):
+ file_list = os.path.join(data_dir, 'val_list.txt')
+ return _reader_creator(file_list, 'test', shuffle=False, data_dir=data_dir)
diff --git a/PaddleSlim/run.sh b/PaddleSlim/run.sh
new file mode 100644
index 0000000000000000000000000000000000000000..eaed30b84cc70805db1738f949bae674f207f17d
--- /dev/null
+++ b/PaddleSlim/run.sh
@@ -0,0 +1,97 @@
+#!/usr/bin/env bash
+
+# download pretrain model
+root_url="http://paddle-imagenet-models-name.bj.bcebos.com"
+MobileNetV1="MobileNetV1_pretrained.zip"
+ResNet50="ResNet50_pretrained.zip"
+pretrain_dir='./pretrain'
+
+if [ ! -d ${pretrain_dir} ]; then
+ mkdir ${pretrain_dir}
+fi
+
+cd ${pretrain_dir}
+
+if [ ! -f ${MobileNetV1} ]; then
+ wget ${root_url}/${MobileNetV1}
+ unzip ${MobileNetV1}
+fi
+
+if [ ! -f ${ResNet50} ]; then
+ wget ${root_url}/${ResNet50}
+ unzip ${ResNet50}
+fi
+
+cd -
+
+# for distillation
+#-----------------
+export CUDA_VISIBLE_DEVICES=0,1,2,3
+
+
+# Fixing name conflicts in distillation
+mv ResNet50_pretrained/conv1_weights ResNet50_pretrained/res_conv1_weights
+mv ResNet50_pretrained/fc_0.w_0 ResNet50_pretrained/res_fc.w_0
+mv ResNet50_pretrained/fc_0.b_0 ResNet50_pretrained/res_fc.b_0
+python compress.py \
+--model "MobileNet" \
+--teacher_model "ResNet50" \
+--teacher_pretrained_model ./pretrain/ResNet50_pretrained \
+--compress_config ./configs/mobilenetv1_resnet50_distillation.yaml
+
+mv ResNet50_pretrained/res_conv1_weights ResNet50_pretrained/conv1_weights
+mv ResNet50_pretrained/res_fc.w_0 ResNet50_pretrained/fc_0.w_0
+mv ResNet50_pretrained/res_fc.b_0 ResNet50_pretrained/fc_0.b_0
+
+# for sensitivity filter pruning
+#-------------------------------
+#export CUDA_VISIBLE_DEVICES=0
+#python compress.py \
+#--model "MobileNet" \
+#--pretrained_model ./pretrain/MobileNetV1_pretrained \
+#--compress_config ./configs/filter_pruning_sen.yaml
+
+# for uniform filter pruning
+#---------------------------
+#export CUDA_VISIBLE_DEVICES=0
+#python compress.py \
+#--model "MobileNet" \
+#--pretrained_model ./pretrain/MobileNetV1_pretrained \
+#--compress_config ./configs/filter_pruning_uniform.yaml
+
+# for quantization
+#-----------------
+#export CUDA_VISIBLE_DEVICES=0
+#python compress.py \
+#--batch_size 64 \
+#--model "MobileNet" \
+#--pretrained_model ./pretrain/MobileNetV1_pretrained \
+#--compress_config ./configs/quantization.yaml
+
+# for distillation with quantization
+#-----------------------------------
+#export CUDA_VISIBLE_DEVICES=0
+#
+## Fixing name conflicts in distillation
+#mv ResNet50_pretrained/conv1_weights ResNet50_pretrained/res_conv1_weights
+#mv ResNet50_pretrained/fc_0.w_0 ResNet50_pretrained/res_fc.w_0
+#mv ResNet50_pretrained/fc_0.b_0 ResNet50_pretrained/res_fc.b_0
+#
+#python compress.py \
+#--model "MobileNet" \
+#--teacher_model "ResNet50" \
+#--teacher_pretrained_model ./data/pretrain/ResNet50_pretrained \
+#--compress_config ./configs/quantization_dist.yaml
+#
+#mv ResNet50_pretrained/res_conv1_weights ResNet50_pretrained/conv1_weights
+#mv ResNet50_pretrained/res_fc.w_0 ResNet50_pretrained/fc_0.w_0
+#mv ResNet50_pretrained/res_fc.b_0 ResNet50_pretrained/fc_0.b_0
+
+# for uniform filter pruning with quantization
+#---------------------------------------------
+#export CUDA_VISIBLE_DEVICES=0
+#python compress.py \
+#--model "MobileNet" \
+#--pretrained_model ./data/pretrain/MobileNetV1_pretrained \
+#--compress_config ./configs/quantization_pruning.yaml
+
diff --git a/PaddleSlim/utility.py b/PaddleSlim/utility.py
new file mode 100644
index 0000000000000000000000000000000000000000..b084813590c45030ea5a0acfa6512fa0d7c4bb70
--- /dev/null
+++ b/PaddleSlim/utility.py
@@ -0,0 +1,142 @@
+"""Contains common utility functions."""
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+#Licensed under the Apache License, Version 2.0 (the "License");
+#you may not use this file except in compliance with the License.
+#You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+#Unless required by applicable law or agreed to in writing, software
+#distributed under the License is distributed on an "AS IS" BASIS,
+#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#See the License for the specific language governing permissions and
+#limitations under the License.
+
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import distutils.util
+import os
+import numpy as np
+import six
+import paddle.fluid as fluid
+import paddle.compat as cpt
+from paddle.fluid import core
+from paddle.fluid.framework import Program
+
+
+def print_arguments(args):
+ """Print argparse's arguments.
+
+ Usage:
+
+ .. code-block:: python
+
+ parser = argparse.ArgumentParser()
+ parser.add_argument("name", default="Jonh", type=str, help="User name.")
+ args = parser.parse_args()
+ print_arguments(args)
+
+ :param args: Input argparse.Namespace for printing.
+ :type args: argparse.Namespace
+ """
+ print("----------- Configuration Arguments -----------")
+ for arg, value in sorted(six.iteritems(vars(args))):
+ print("%s: %s" % (arg, value))
+ print("------------------------------------------------")
+
+
+def add_arguments(argname, type, default, help, argparser, **kwargs):
+ """Add argparse's argument.
+
+ Usage:
+
+ .. code-block:: python
+
+ parser = argparse.ArgumentParser()
+ add_argument("name", str, "Jonh", "User name.", parser)
+ args = parser.parse_args()
+ """
+ type = distutils.util.strtobool if type == bool else type
+ argparser.add_argument(
+ "--" + argname,
+ default=default,
+ type=type,
+ help=help + ' Default: %(default)s.',
+ **kwargs)
+
+
+def save_persistable_nodes(executor, dirname, graph):
+ """
+ Save persistable nodes to the given directory by the executor.
+
+ Args:
+ executor(Executor): The executor to run for saving node values.
+ dirname(str): The directory path.
+ graph(IrGraph): All the required persistable nodes in the graph will be saved.
+ """
+ persistable_node_names = set()
+ persistable_nodes = []
+ all_persistable_nodes = graph.all_persistable_nodes()
+ for node in all_persistable_nodes:
+ name = cpt.to_text(node.name())
+ if name not in persistable_node_names:
+ persistable_node_names.add(name)
+ persistable_nodes.append(node)
+ program = Program()
+ var_list = []
+ for node in persistable_nodes:
+ var_desc = node.var()
+ if var_desc.type() == core.VarDesc.VarType.RAW or \
+ var_desc.type() == core.VarDesc.VarType.READER:
+ continue
+ var = program.global_block().create_var(
+ name=var_desc.name(),
+ shape=var_desc.shape(),
+ dtype=var_desc.dtype(),
+ type=var_desc.type(),
+ lod_level=var_desc.lod_level(),
+ persistable=var_desc.persistable())
+ var_list.append(var)
+ fluid.io.save_vars(executor=executor, dirname=dirname, vars=var_list)
+
+
+def load_persistable_nodes(executor, dirname, graph):
+ """
+ Load persistable node values from the given directory by the executor.
+
+ Args:
+ executor(Executor): The executor to run for loading node values.
+ dirname(str): The directory path.
+ graph(IrGraph): All the required persistable nodes in the graph will be loaded.
+ """
+ persistable_node_names = set()
+ persistable_nodes = []
+ all_persistable_nodes = graph.all_persistable_nodes()
+ for node in all_persistable_nodes:
+ name = cpt.to_text(node.name())
+ if name not in persistable_node_names:
+ persistable_node_names.add(name)
+ persistable_nodes.append(node)
+ program = Program()
+ var_list = []
+
+ def _exist(var):
+ return os.path.exists(os.path.join(dirname, var.name))
+
+ for node in persistable_nodes:
+ var_desc = node.var()
+ if var_desc.type() == core.VarDesc.VarType.RAW or \
+ var_desc.type() == core.VarDesc.VarType.READER:
+ continue
+ var = program.global_block().create_var(
+ name=var_desc.name(),
+ shape=var_desc.shape(),
+ dtype=var_desc.dtype(),
+ type=var_desc.type(),
+ lod_level=var_desc.lod_level(),
+ persistable=var_desc.persistable())
+ if _exist(var):
+ var_list.append(var)
+ fluid.io.load_vars(executor=executor, dirname=dirname, vars=var_list)
diff --git a/PaddleSpeech/DeepASR/.gitignore b/PaddleSpeech/DeepASR/.gitignore
new file mode 100644
index 0000000000000000000000000000000000000000..485dee64bcfb48793379b200a1afd14e85a8aaf4
--- /dev/null
+++ b/PaddleSpeech/DeepASR/.gitignore
@@ -0,0 +1 @@
+.idea
diff --git a/PaddleSpeech/DeepASR/README.md b/PaddleSpeech/DeepASR/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..6b9913fd30a56ef2328bc62e9b36e496f6763430
--- /dev/null
+++ b/PaddleSpeech/DeepASR/README.md
@@ -0,0 +1,36 @@
+The code samples in this directory require the latest develop branch of PaddlePaddle. If you are on an earlier version of PaddlePaddle, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
+
+## Deep Automatic Speech Recognition
+
+### Introduction
+TBD
+
+### Installation
+
+#### Kaldi
+The decoder depends on [kaldi](https://github.com/kaldi-asr/kaldi). Install it by following its instructions, then set the environment variable:
+
+```shell
+export KALDI_ROOT=
+```
+
+#### Decoder
+
+```shell
+git clone https://github.com/PaddlePaddle/models.git
+cd models/fluid/DeepASR/decoder
+sh setup.sh
+```
+
+### Data preprocessing
+TBD
+
+### Training
+TBD
+
+
+### Inference & Decoding
+TBD
+
+### Questions and Contributions
+TBD
diff --git a/PaddleSpeech/DeepASR/README_cn.md b/PaddleSpeech/DeepASR/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..be78a048701a621bd90942bdfe30ef4d7c7f082f
--- /dev/null
+++ b/PaddleSpeech/DeepASR/README_cn.md
@@ -0,0 +1,186 @@
+Running the code samples in this directory requires PaddlePaddle v0.14 or higher. If your installed version of PaddlePaddle is lower than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update your installation.
+
+---
+
+DeepASR (Deep Automatic Speech Recognition) is a speech recognition system built on PaddlePaddle Fluid and [Kaldi](http://www.kaldi-asr.org). It uses the Fluid framework to configure and train the acoustic model of a speech recognition pipeline and integrates Kaldi's decoder, aiming to let users who are already familiar with Kaldi train acoustic models quickly and at scale, while relying on Kaldi for the complex preprocessing of speech data and the final decoding.
+
+### Contents
+- [Model overview](#model-overview)
+- [Installation](#installation)
+- [Data preprocessing](#data-reprocessing)
+- [Training](#training)
+- [Profiling the training](#perf-profiling)
+- [Inference and decoding](#infer-decoding)
+- [Scoring the error rate](#scoring-error-rate)
+- [Aishell example](#aishell-example)
+- [Contributing more examples](#how-to-contrib)
+
+### Model overview
+
+The acoustic model of DeepASR is a single convolution layer followed by a stack of LSTMP layers: the convolution performs preliminary feature extraction, the stacked LSTMP layers model the temporal dependencies, and cross entropy is used as the loss function. [LSTMP](https://arxiv.org/abs/1402.1128) (LSTM with recurrent projection layer) extends the traditional LSTM by adding a projection layer that maps the hidden state to a lower dimension before feeding it to the next time step, which greatly reduces the parameter count and computational complexity of the LSTM while also improving its performance.
+
+
+
+Figure 1. Topology of LSTMP
+
+
+### Installation
+
+
+#### Installing and setting up Kaldi
+
+
+The decoder used by DeepASR depends on [an installation of Kaldi](https://github.com/kaldi-asr/kaldi). If Kaldi is not available in your environment, please `git clone` its source code, install it following the given instructions, and finally set the environment variable `KALDI_ROOT`:
+
+```shell
+export KALDI_ROOT=
+
+```
+#### Installing the decoder
+Enter the directory containing the decoder's source code
+
+```shell
+cd models/fluid/DeepASR/decoder
+```
+and run the setup script
+
+```shell
+sh setup.sh
+```
+Once compilation completes, the decoder is installed.
+
+### Data preprocessing
+
+Follow [Kaldi's data preparation process](http://kaldi-asr.org/doc/data_prep.html) to extract features from the audio data and align the labels.
+
+### Training the acoustic model
+
+The acoustic model can be trained in either CPU or GPU mode; for example, to train in GPU mode
+
+```shell
+CUDA_VISIBLE_DEVICES=0,1,2,3 python -u train.py \
+ --train_feature_lst train_feature.lst \
+ --train_label_lst train_label.lst \
+ --val_feature_lst val_feature.lst \
+ --val_label_lst val_label.lst \
+ --mean_var global_mean_var \
+ --parallel
+```
+where `train_feature.lst` and `train_label.lst` are the feature list file and the label list file of the training set; likewise, `val_feature.lst` and `val_label.lst` are the list files of the validation set. In actual training, important parameters such as the number of modeling units and the learning rate must be set correctly. For an explanation of these parameters, run
+
+```shell
+python train.py --help
+```
+to get more information.
+
+### Profiling the training
+
+With profiler, the performance analysis tool provided by Fluid, the training process can be profiled to obtain the operator-level execution times in the network
+
+```shell
+CUDA_VISIBLE_DEVICES=0 python -u tools/profile.py \
+ --train_feature_lst train_feature.lst \
+ --train_label_lst train_label.lst \
+ --val_feature_lst val_feature.lst \
+ --val_label_lst val_label.lst \
+ --mean_var global_mean_var
+```
+
+
+### Inference and decoding
+
+Once the acoustic model is sufficiently trained, the checkpoints saved during training can be used to decode input audio and produce speech-to-text recognition results
+
+```
+CUDA_VISIBLE_DEVICES=0,1,2,3 python -u infer_by_ckpt.py \
+ --batch_size 96 \
+ --checkpoint deep_asr.pass_1.checkpoint \
+ --infer_feature_lst test_feature.lst \
+ --infer_label_lst test_label.lst \
+ --mean_var global_mean_var \
+ --parallel
+```
+
+### Scoring the error rate
+
+The Word Error Rate (WER) and the Character Error Rate (CER) are the metrics most commonly used to evaluate a speech recognition system. DeepASR also implements the corresponding scoring tool, which runs as
+
+```
+python score_error_rate.py --error_rate_type cer --ref ref.txt --hyp decoding.txt
+```
+The argument `error_rate_type` selects the type of error rate to measure, i.e. WER or CER; `ref.txt` and `decoding.txt` hold the reference text and the actually decoded text respectively, and share the same format:
+
+```
+key1 text1
+key2 text2
+key3 text3
+...
+
+```
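+
+Conceptually, the CER printed by `score_error_rate.py` reduces to an edit (Levenshtein) distance between each decoded text and its reference, summed over all keys and divided by the total reference length. The sketch below is a minimal illustrative reimplementation of that idea, not the tool itself; the helper names `edit_distance` and `cer` are hypothetical:
+
+```python
+def edit_distance(ref, hyp):
+    # classic Levenshtein DP: O(len(ref) * len(hyp)) time, O(len(hyp)) space
+    dp = list(range(len(hyp) + 1))
+    for i, r in enumerate(ref, 1):
+        prev, dp[0] = dp[0], i
+        for j, h in enumerate(hyp, 1):
+            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
+                                     dp[j - 1] + 1,    # insertion
+                                     prev + (r != h))  # substitution
+    return dp[-1]
+
+
+def cer(ref_lines, hyp_lines):
+    # each line is "<key> <text>"; for CER, texts are compared as character
+    # sequences with the word separators removed
+    refs = dict(line.strip().split(None, 1) for line in ref_lines)
+    errors, total = 0, 0
+    for line in hyp_lines:
+        key, text = line.strip().split(None, 1)
+        ref = refs[key].replace(" ", "")
+        errors += edit_distance(ref, text.replace(" ", ""))
+        total += len(ref)
+    return float(errors) / total
+```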
+
+
+### Aishell example
+
+This section uses the [Aishell dataset](http://www.aishelltech.com/kysjcp) as an example to walk through the whole pipeline, from data preprocessing to decoding output. Aishell is an open Mandarin Chinese speech dataset released by Beijing Shell Shell Technology, containing 178 hours of speech recorded by 400 speakers from different accent regions; the raw data is available from [openslr](http://www.openslr.org/33). To simplify the process, a preprocessed version of the dataset is provided for download:
+
+```
+cd examples/aishell
+sh prepare_data.sh
+```
+
+The download includes the training data for the acoustic model as well as the auxiliary files used during decoding. Once the data has been downloaded, the training process can be profiled before training begins
+
+```
+sh profile.sh
+```
+
+Start training
+
+```
+sh train.sh
+```
+By default, training runs on 4 GPUs. In practice, `batch_size`, the learning rate, and other parameters can be adjusted dynamically according to the number of available GPUs and their memory size. A typical evolution of the loss function and the accuracy during training is shown in Figure 2
+
+
+
+Figure 2. Learning curves of the acoustic model trained on the Aishell dataset
+
+
+After the model is trained, run inference to recognize the text in the test-set audio:
+
+```
+sh infer_by_ckpt.sh
+```
+
+This involves two key stages: inference with the acoustic model and decoding by the decoder. A sample of the decoding output follows:
+
+```
+...
+BAC009S0764W0239 十一 五 期间 我 国 累计 境外 投资 七千亿 美元
+BAC009S0765W0140 在 了解 送 方 的 资产 情况 与 需求 之后
+BAC009S0915W0291 这 对 苹果 来说 不 是 件 容易 的 事 儿
+BAC009S0769W0159 今年 土地 收入 预计 近 四万亿 元
+BAC009S0907W0451 由 浦东 商店 作为 掩护
+BAC009S0768W0128 土地 交易 可能 随着 供应 淡季 的 到来 而 降温
+...
+```
+
+Each line corresponds to one output, beginning with the key of the audio sample and followed by the decoded Chinese text, separated by words. Once decoding is done, run the scoring script to evaluate the character error rate (CER)
+
+```
+sh score_cer.sh
+```
+
+Its output looks like the following
+
+```
+Error rate[cer] = 0.101971 (10683/104765),
+total 7176 sentences in hyp, 0 not presented in ref.
+```
+
+With an acoustic model trained for roughly 20 passes, a recognition result with a CER of about 10% can be obtained on the Aishell test set.
+
+
+### Contributing more examples
+
+DeepASR currently only ships the Aishell example. We welcome users to test the full training pipeline on more datasets and contribute them to this project.
diff --git a/PaddleSpeech/DeepASR/data_utils/__init__.py b/PaddleSpeech/DeepASR/data_utils/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/fluid/DeepASR/data_utils/async_data_reader.py b/PaddleSpeech/DeepASR/data_utils/async_data_reader.py
similarity index 100%
rename from fluid/DeepASR/data_utils/async_data_reader.py
rename to PaddleSpeech/DeepASR/data_utils/async_data_reader.py
diff --git a/PaddleSpeech/DeepASR/data_utils/augmentor/__init__.py b/PaddleSpeech/DeepASR/data_utils/augmentor/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/fluid/DeepASR/data_utils/augmentor/tests/__init__.py b/PaddleSpeech/DeepASR/data_utils/augmentor/tests/__init__.py
similarity index 100%
rename from fluid/DeepASR/data_utils/augmentor/tests/__init__.py
rename to PaddleSpeech/DeepASR/data_utils/augmentor/tests/__init__.py
diff --git a/fluid/DeepASR/data_utils/augmentor/tests/data/global_mean_var_search26kHr b/PaddleSpeech/DeepASR/data_utils/augmentor/tests/data/global_mean_var_search26kHr
similarity index 100%
rename from fluid/DeepASR/data_utils/augmentor/tests/data/global_mean_var_search26kHr
rename to PaddleSpeech/DeepASR/data_utils/augmentor/tests/data/global_mean_var_search26kHr
diff --git a/fluid/DeepASR/data_utils/augmentor/tests/test_data_trans.py b/PaddleSpeech/DeepASR/data_utils/augmentor/tests/test_data_trans.py
similarity index 100%
rename from fluid/DeepASR/data_utils/augmentor/tests/test_data_trans.py
rename to PaddleSpeech/DeepASR/data_utils/augmentor/tests/test_data_trans.py
diff --git a/fluid/DeepASR/data_utils/augmentor/trans_add_delta.py b/PaddleSpeech/DeepASR/data_utils/augmentor/trans_add_delta.py
similarity index 100%
rename from fluid/DeepASR/data_utils/augmentor/trans_add_delta.py
rename to PaddleSpeech/DeepASR/data_utils/augmentor/trans_add_delta.py
diff --git a/fluid/DeepASR/data_utils/augmentor/trans_delay.py b/PaddleSpeech/DeepASR/data_utils/augmentor/trans_delay.py
similarity index 100%
rename from fluid/DeepASR/data_utils/augmentor/trans_delay.py
rename to PaddleSpeech/DeepASR/data_utils/augmentor/trans_delay.py
diff --git a/fluid/DeepASR/data_utils/augmentor/trans_mean_variance_norm.py b/PaddleSpeech/DeepASR/data_utils/augmentor/trans_mean_variance_norm.py
similarity index 100%
rename from fluid/DeepASR/data_utils/augmentor/trans_mean_variance_norm.py
rename to PaddleSpeech/DeepASR/data_utils/augmentor/trans_mean_variance_norm.py
diff --git a/fluid/DeepASR/data_utils/augmentor/trans_splice.py b/PaddleSpeech/DeepASR/data_utils/augmentor/trans_splice.py
similarity index 100%
rename from fluid/DeepASR/data_utils/augmentor/trans_splice.py
rename to PaddleSpeech/DeepASR/data_utils/augmentor/trans_splice.py
diff --git a/PaddleSpeech/DeepASR/data_utils/util.py b/PaddleSpeech/DeepASR/data_utils/util.py
new file mode 100644
index 0000000000000000000000000000000000000000..4a5a8a3f1dad1c46ed773fd48d713e276717d5e5
--- /dev/null
+++ b/PaddleSpeech/DeepASR/data_utils/util.py
@@ -0,0 +1,71 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+import sys
+from six import reraise
+from tblib import Traceback
+
+import numpy as np
+import paddle.fluid as fluid
+
+
+def to_lodtensor(data, place):
+ """convert tensor to lodtensor
+ """
+ seq_lens = [len(seq) for seq in data]
+ cur_len = 0
+ lod = [cur_len]
+ for l in seq_lens:
+ cur_len += l
+ lod.append(cur_len)
+ flattened_data = np.concatenate(data, axis=0).astype("int64")
+ flattened_data = flattened_data.reshape([len(flattened_data), 1])
+ res = fluid.LoDTensor()
+ res.set(flattened_data, place)
+ res.set_lod([lod])
+ return res
+
+
+def split_infer_result(infer_seq, lod):
+ infer_batch = []
+ for i in range(0, len(lod[0]) - 1):
+ infer_batch.append(infer_seq[lod[0][i]:lod[0][i + 1]])
+ return infer_batch
+
+
+class CriticalException(Exception):
+ pass
+
+
+def suppress_signal(signo, stack_frame):
+ pass
+
+
+def suppress_complaints(verbose, notify=None):
+ def decorator_maker(func):
+ def suppress_wrapper(*args, **kwargs):
+ try:
+ func(*args, **kwargs)
+ except Exception:
+ et, ev, tb = sys.exc_info()
+
+ if notify is not None:
+ notify(except_type=et, except_value=ev, traceback=tb)
+
+ if verbose == 1 or isinstance(ev, CriticalException):
+ reraise(et, ev, Traceback(tb).as_traceback())
+
+ return suppress_wrapper
+
+ return decorator_maker
+
+
+class ForceExitWrapper(object):
+ def __init__(self, exit_flag):
+ self._exit_flag = exit_flag
+
+ @suppress_complaints(verbose=0)
+ def __call__(self, *args, **kwargs):
+ self._exit_flag.value = True
+
+ def __eq__(self, flag):
+ return self._exit_flag.value == flag
diff --git a/PaddleSpeech/DeepASR/decoder/.gitignore b/PaddleSpeech/DeepASR/decoder/.gitignore
new file mode 100644
index 0000000000000000000000000000000000000000..ef5c97cfb5c06f3308980ca65c87e9c4b9440171
--- /dev/null
+++ b/PaddleSpeech/DeepASR/decoder/.gitignore
@@ -0,0 +1,4 @@
+ThreadPool
+build
+post_latgen_faster_mapped.so
+pybind11
diff --git a/fluid/DeepASR/decoder/post_latgen_faster_mapped.cc b/PaddleSpeech/DeepASR/decoder/post_latgen_faster_mapped.cc
similarity index 100%
rename from fluid/DeepASR/decoder/post_latgen_faster_mapped.cc
rename to PaddleSpeech/DeepASR/decoder/post_latgen_faster_mapped.cc
diff --git a/fluid/DeepASR/decoder/post_latgen_faster_mapped.h b/PaddleSpeech/DeepASR/decoder/post_latgen_faster_mapped.h
similarity index 100%
rename from fluid/DeepASR/decoder/post_latgen_faster_mapped.h
rename to PaddleSpeech/DeepASR/decoder/post_latgen_faster_mapped.h
diff --git a/fluid/DeepASR/decoder/pybind.cc b/PaddleSpeech/DeepASR/decoder/pybind.cc
similarity index 100%
rename from fluid/DeepASR/decoder/pybind.cc
rename to PaddleSpeech/DeepASR/decoder/pybind.cc
diff --git a/fluid/DeepASR/decoder/setup.py b/PaddleSpeech/DeepASR/decoder/setup.py
similarity index 100%
rename from fluid/DeepASR/decoder/setup.py
rename to PaddleSpeech/DeepASR/decoder/setup.py
diff --git a/fluid/DeepASR/decoder/setup.sh b/PaddleSpeech/DeepASR/decoder/setup.sh
similarity index 100%
rename from fluid/DeepASR/decoder/setup.sh
rename to PaddleSpeech/DeepASR/decoder/setup.sh
diff --git a/PaddleSpeech/DeepASR/examples/aishell/.gitignore b/PaddleSpeech/DeepASR/examples/aishell/.gitignore
new file mode 100644
index 0000000000000000000000000000000000000000..c173dd880ae9e06c16989800e06d4d3d7a1a7d5f
--- /dev/null
+++ b/PaddleSpeech/DeepASR/examples/aishell/.gitignore
@@ -0,0 +1,4 @@
+aux.tar.gz
+aux
+data
+checkpoints
diff --git a/fluid/DeepASR/examples/aishell/download_pretrained_model.sh b/PaddleSpeech/DeepASR/examples/aishell/download_pretrained_model.sh
similarity index 100%
rename from fluid/DeepASR/examples/aishell/download_pretrained_model.sh
rename to PaddleSpeech/DeepASR/examples/aishell/download_pretrained_model.sh
diff --git a/fluid/DeepASR/examples/aishell/infer_by_ckpt.sh b/PaddleSpeech/DeepASR/examples/aishell/infer_by_ckpt.sh
similarity index 100%
rename from fluid/DeepASR/examples/aishell/infer_by_ckpt.sh
rename to PaddleSpeech/DeepASR/examples/aishell/infer_by_ckpt.sh
diff --git a/fluid/DeepASR/examples/aishell/prepare_data.sh b/PaddleSpeech/DeepASR/examples/aishell/prepare_data.sh
similarity index 100%
rename from fluid/DeepASR/examples/aishell/prepare_data.sh
rename to PaddleSpeech/DeepASR/examples/aishell/prepare_data.sh
diff --git a/fluid/DeepASR/examples/aishell/profile.sh b/PaddleSpeech/DeepASR/examples/aishell/profile.sh
similarity index 100%
rename from fluid/DeepASR/examples/aishell/profile.sh
rename to PaddleSpeech/DeepASR/examples/aishell/profile.sh
diff --git a/fluid/DeepASR/examples/aishell/score_cer.sh b/PaddleSpeech/DeepASR/examples/aishell/score_cer.sh
similarity index 100%
rename from fluid/DeepASR/examples/aishell/score_cer.sh
rename to PaddleSpeech/DeepASR/examples/aishell/score_cer.sh
diff --git a/PaddleSpeech/DeepASR/examples/aishell/train.sh b/PaddleSpeech/DeepASR/examples/aishell/train.sh
new file mode 100644
index 0000000000000000000000000000000000000000..168581c0ee579ef62f138bb0d8f5bb8886beb90b
--- /dev/null
+++ b/PaddleSpeech/DeepASR/examples/aishell/train.sh
@@ -0,0 +1,14 @@
+export CUDA_VISIBLE_DEVICES=4,5,6,7
+python -u ../../train.py --train_feature_lst data/train_feature.lst \
+ --train_label_lst data/train_label.lst \
+ --val_feature_lst data/val_feature.lst \
+ --val_label_lst data/val_label.lst \
+ --mean_var data/global_mean_var \
+ --checkpoints checkpoints \
+ --frame_dim 80 \
+ --class_num 3040 \
+ --print_per_batches 100 \
+ --infer_models '' \
+ --batch_size 16 \
+ --learning_rate 6.4e-5 \
+ --parallel
diff --git a/fluid/DeepASR/images/learning_curve.png b/PaddleSpeech/DeepASR/images/learning_curve.png
similarity index 100%
rename from fluid/DeepASR/images/learning_curve.png
rename to PaddleSpeech/DeepASR/images/learning_curve.png
diff --git a/fluid/DeepASR/images/lstmp.png b/PaddleSpeech/DeepASR/images/lstmp.png
similarity index 100%
rename from fluid/DeepASR/images/lstmp.png
rename to PaddleSpeech/DeepASR/images/lstmp.png
diff --git a/fluid/DeepASR/infer.py b/PaddleSpeech/DeepASR/infer.py
similarity index 100%
rename from fluid/DeepASR/infer.py
rename to PaddleSpeech/DeepASR/infer.py
diff --git a/fluid/DeepASR/infer_by_ckpt.py b/PaddleSpeech/DeepASR/infer_by_ckpt.py
similarity index 100%
rename from fluid/DeepASR/infer_by_ckpt.py
rename to PaddleSpeech/DeepASR/infer_by_ckpt.py
diff --git a/PaddleSpeech/DeepASR/model_utils/__init__.py b/PaddleSpeech/DeepASR/model_utils/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/PaddleSpeech/DeepASR/model_utils/model.py b/PaddleSpeech/DeepASR/model_utils/model.py
new file mode 100644
index 0000000000000000000000000000000000000000..0b086b55a898a0a29f57132b438684a655e30caf
--- /dev/null
+++ b/PaddleSpeech/DeepASR/model_utils/model.py
@@ -0,0 +1,74 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import paddle.fluid as fluid
+
+
+def stacked_lstmp_model(feature,
+ label,
+ hidden_dim,
+ proj_dim,
+ stacked_num,
+ class_num,
+ parallel=False,
+ is_train=True):
+ """
+ The model for DeepASR. The main structure is composed of stacked
+ identical LSTMP (LSTM with recurrent projection) layers.
+
+ When running in the training and validation phases, the feeding dictionary
+ is {'feature', 'label'}, fed by LoDTensors for the feature data and the
+ label data respectively. In inference, only `feature` is needed.
+
+ Args:
+ feature(Variable): The input feature data.
+ label(Variable): The ground-truth label data.
+ hidden_dim(int): The hidden state's dimension of the LSTMP layer.
+ proj_dim(int): The projection size of the LSTMP layer.
+ stacked_num(int): The number of stacked LSTMP layers.
+ class_num(int): The number of output classes.
+ parallel(bool): Run in parallel or not, default `False`.
+ is_train(bool): Run in training phase or not, default `True`.
+ """
+ conv1 = fluid.layers.conv2d(
+ input=feature,
+ num_filters=32,
+ filter_size=3,
+ stride=1,
+ padding=1,
+ bias_attr=True,
+ act="relu")
+
+ pool1 = fluid.layers.pool2d(
+ conv1, pool_size=3, pool_type="max", pool_stride=2, pool_padding=0)
+
+ stack_input = pool1
+ for i in range(stacked_num):
+ fc = fluid.layers.fc(input=stack_input,
+ size=hidden_dim * 4,
+ bias_attr=None)
+ proj, cell = fluid.layers.dynamic_lstmp(
+ input=fc,
+ size=hidden_dim * 4,
+ proj_size=proj_dim,
+ bias_attr=True,
+ use_peepholes=True,
+ is_reverse=False,
+ cell_activation="tanh",
+ proj_activation="tanh")
+ bn = fluid.layers.batch_norm(
+ input=proj,
+ is_test=not is_train,
+ momentum=0.9,
+ epsilon=1e-05,
+ data_layout='NCHW')
+ stack_input = bn
+
+ prediction = fluid.layers.fc(input=stack_input,
+ size=class_num,
+ act='softmax')
+
+ cost = fluid.layers.cross_entropy(input=prediction, label=label)
+ avg_cost = fluid.layers.mean(x=cost)
+ acc = fluid.layers.accuracy(input=prediction, label=label)
+ return prediction, avg_cost, acc
diff --git a/fluid/DeepASR/score_error_rate.py b/PaddleSpeech/DeepASR/score_error_rate.py
similarity index 100%
rename from fluid/DeepASR/score_error_rate.py
rename to PaddleSpeech/DeepASR/score_error_rate.py
diff --git a/fluid/DeepASR/tools/_init_paths.py b/PaddleSpeech/DeepASR/tools/_init_paths.py
similarity index 100%
rename from fluid/DeepASR/tools/_init_paths.py
rename to PaddleSpeech/DeepASR/tools/_init_paths.py
diff --git a/fluid/DeepASR/tools/error_rate.py b/PaddleSpeech/DeepASR/tools/error_rate.py
similarity index 100%
rename from fluid/DeepASR/tools/error_rate.py
rename to PaddleSpeech/DeepASR/tools/error_rate.py
diff --git a/fluid/DeepASR/tools/profile.py b/PaddleSpeech/DeepASR/tools/profile.py
similarity index 100%
rename from fluid/DeepASR/tools/profile.py
rename to PaddleSpeech/DeepASR/tools/profile.py
diff --git a/PaddleSpeech/DeepASR/train.py b/PaddleSpeech/DeepASR/train.py
new file mode 100644
index 0000000000000000000000000000000000000000..1a1dd6cf9ea33bb546cc3bdf65c36be0441832cb
--- /dev/null
+++ b/PaddleSpeech/DeepASR/train.py
@@ -0,0 +1,372 @@
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import sys
+import os
+import numpy as np
+import argparse
+import time
+
+import paddle.fluid as fluid
+import data_utils.augmentor.trans_mean_variance_norm as trans_mean_variance_norm
+import data_utils.augmentor.trans_add_delta as trans_add_delta
+import data_utils.augmentor.trans_splice as trans_splice
+import data_utils.augmentor.trans_delay as trans_delay
+import data_utils.async_data_reader as reader
+from model_utils.model import stacked_lstmp_model
+
+
+def parse_args():
+ parser = argparse.ArgumentParser("Training for stacked LSTMP model.")
+ parser.add_argument(
+ '--batch_size',
+ type=int,
+ default=32,
+ help='The sequence number of a batch data. Batch size per GPU. (default: %(default)d)'
+ )
+ parser.add_argument(
+ '--minimum_batch_size',
+ type=int,
+ default=1,
+ help='The minimum sequence number of a batch data. '
+ '(default: %(default)d)')
+ parser.add_argument(
+ '--frame_dim',
+ type=int,
+ default=80,
+ help='Frame dimension of feature data. (default: %(default)d)')
+ parser.add_argument(
+ '--stacked_num',
+ type=int,
+ default=5,
+ help='Number of lstmp layers to stack. (default: %(default)d)')
+ parser.add_argument(
+ '--proj_dim',
+ type=int,
+ default=512,
+ help='Project size of lstmp unit. (default: %(default)d)')
+ parser.add_argument(
+ '--hidden_dim',
+ type=int,
+ default=1024,
+ help='Hidden size of lstmp unit. (default: %(default)d)')
+ parser.add_argument(
+ '--class_num',
+ type=int,
+ default=3040,
+ help='Number of classes in label. (default: %(default)d)')
+ parser.add_argument(
+ '--pass_num',
+ type=int,
+ default=100,
+ help='Epoch number to train. (default: %(default)d)')
+ parser.add_argument(
+ '--print_per_batches',
+ type=int,
+ default=100,
+ help='Interval to print training accuracy. (default: %(default)d)')
+ parser.add_argument(
+ '--learning_rate',
+ type=float,
+ default=0.00016,
+ help='Learning rate used to train. (default: %(default)f)')
+ parser.add_argument(
+ '--device',
+ type=str,
+ default='GPU',
+ choices=['CPU', 'GPU'],
+ help='The device type. (default: %(default)s)')
+ parser.add_argument(
+ '--parallel', action='store_true', help='If set, run in parallel.')
+ parser.add_argument(
+ '--mean_var',
+ type=str,
+ default='data/global_mean_var_search26kHr',
+ help="The path for feature's global mean and variance. "
+ "(default: %(default)s)")
+ parser.add_argument(
+ '--train_feature_lst',
+ type=str,
+ default='data/feature.lst',
+ help='The feature list path for training. (default: %(default)s)')
+ parser.add_argument(
+ '--train_label_lst',
+ type=str,
+ default='data/label.lst',
+ help='The label list path for training. (default: %(default)s)')
+ parser.add_argument(
+ '--val_feature_lst',
+ type=str,
+ default='data/val_feature.lst',
+ help='The feature list path for validation. (default: %(default)s)')
+ parser.add_argument(
+ '--val_label_lst',
+ type=str,
+ default='data/val_label.lst',
+ help='The label list path for validation. (default: %(default)s)')
+ parser.add_argument(
+ '--init_model_path',
+ type=str,
+ default=None,
+ help="The model (checkpoint) path which the training resumes from. "
+ "If None, train the model from scratch. (default: %(default)s)")
+ parser.add_argument(
+ '--checkpoints',
+ type=str,
+ default='./checkpoints',
+ help="The directory for saving checkpoints. Do not save checkpoints "
+ "if set to ''. (default: %(default)s)")
+ parser.add_argument(
+ '--infer_models',
+ type=str,
+ default='./infer_models',
+ help="The directory for saving inference models. Do not save inference "
+ "models if set to ''. (default: %(default)s)")
+ args = parser.parse_args()
+ return args
+
+
+def print_arguments(args):
+ print('----------- Configuration Arguments -----------')
+ for arg, value in sorted(vars(args).items()):
+ print('%s: %s' % (arg, value))
+ print('------------------------------------------------')
+
+
+def train(args):
+ """train in loop.
+ """
+
+ # paths check
+ if args.init_model_path is not None and \
+ not os.path.exists(args.init_model_path):
+ raise IOError("Invalid initial model path!")
+ if args.checkpoints != '' and not os.path.exists(args.checkpoints):
+ os.mkdir(args.checkpoints)
+ if args.infer_models != '' and not os.path.exists(args.infer_models):
+ os.mkdir(args.infer_models)
+
+ train_program = fluid.Program()
+ train_startup = fluid.Program()
+
+ with fluid.program_guard(train_program, train_startup):
+ with fluid.unique_name.guard():
+ py_train_reader = fluid.layers.py_reader(
+ capacity=10,
+ shapes=([-1, 3, 11, args.frame_dim], [-1, 1]),
+ dtypes=['float32', 'int64'],
+ lod_levels=[1, 1],
+ name='train_reader')
+ feature, label = fluid.layers.read_file(py_train_reader)
+ prediction, avg_cost, accuracy = stacked_lstmp_model(
+ feature=feature,
+ label=label,
+ hidden_dim=args.hidden_dim,
+ proj_dim=args.proj_dim,
+ stacked_num=args.stacked_num,
+ class_num=args.class_num)
+ # optimizer = fluid.optimizer.Momentum(learning_rate=args.learning_rate, momentum=0.9)
+ optimizer = fluid.optimizer.Adam(
+ learning_rate=fluid.layers.exponential_decay(
+ learning_rate=args.learning_rate,
+ decay_steps=1879,
+ decay_rate=1 / 1.2,
+ staircase=True))
+ optimizer.minimize(avg_cost)
+ fluid.memory_optimize(train_program)
+
+ test_program = fluid.Program()
+ test_startup = fluid.Program()
+ with fluid.program_guard(test_program, test_startup):
+ with fluid.unique_name.guard():
+ py_test_reader = fluid.layers.py_reader(
+ capacity=10,
+ shapes=([-1, 3, 11, args.frame_dim], [-1, 1]),
+ dtypes=['float32', 'int64'],
+ lod_levels=[1, 1],
+ name='test_reader')
+ feature, label = fluid.layers.read_file(py_test_reader)
+ prediction, avg_cost, accuracy = stacked_lstmp_model(
+ feature=feature,
+ label=label,
+ hidden_dim=args.hidden_dim,
+ proj_dim=args.proj_dim,
+ stacked_num=args.stacked_num,
+ class_num=args.class_num)
+ test_program = test_program.clone(for_test=True)
+ place = fluid.CPUPlace() if args.device == 'CPU' else fluid.CUDAPlace(0)
+ exe = fluid.Executor(place)
+ exe.run(train_startup)
+ exe.run(test_startup)
+
+ if args.parallel:
+ exec_strategy = fluid.ExecutionStrategy()
+ exec_strategy.num_iteration_per_drop_scope = 10
+ train_exe = fluid.ParallelExecutor(
+ use_cuda=(args.device == 'GPU'),
+ loss_name=avg_cost.name,
+ exec_strategy=exec_strategy,
+ main_program=train_program)
+ test_exe = fluid.ParallelExecutor(
+ use_cuda=(args.device == 'GPU'),
+ main_program=test_program,
+ exec_strategy=exec_strategy,
+ share_vars_from=train_exe)
+
+ # resume training if initial model provided.
+ if args.init_model_path is not None:
+ fluid.io.load_persistables(exe, args.init_model_path)
+
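+ # Per-utterance feature pipeline: append delta features, normalize with the
+ # global mean/variance, splice +-5 context frames, then apply a 5-frame delay.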
+ ltrans = [
+ trans_add_delta.TransAddDelta(2, 2),
+ trans_mean_variance_norm.TransMeanVarianceNorm(args.mean_var),
+ trans_splice.TransSplice(5, 5), trans_delay.TransDelay(5)
+ ]
+
+ # bind train_reader
+ train_data_reader = reader.AsyncDataReader(
+ args.train_feature_lst,
+ args.train_label_lst,
+ -1,
+ split_sentence_threshold=1024)
+
+ train_data_reader.set_transformers(ltrans)
+
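+ # Feed the py_reader with (feature, label) LoDTensor pairs built on the CPU
+ # from the async reader's batches.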
+ def train_data_provider():
+ for data in train_data_reader.batch_iterator(args.batch_size,
+ args.minimum_batch_size):
+ yield batch_data_to_lod_tensors(args, data, fluid.CPUPlace())
+
+ py_train_reader.decorate_tensor_provider(train_data_provider)
+
+ if (os.path.exists(args.val_feature_lst) and
+ os.path.exists(args.val_label_lst)):
+ # test data reader
+ test_data_reader = reader.AsyncDataReader(
+ args.val_feature_lst,
+ args.val_label_lst,
+ -1,
+ split_sentence_threshold=1024)
+ test_data_reader.set_transformers(ltrans)
+
+ def test_data_provider():
+ for data in test_data_reader.batch_iterator(
+ args.batch_size, args.minimum_batch_size):
+ yield batch_data_to_lod_tensors(args, data, fluid.CPUPlace())
+
+ py_test_reader.decorate_tensor_provider(test_data_provider)
+
+ # validation
+ def test(exe):
+ # If test data not found, return invalid cost and accuracy
+ if not (os.path.exists(args.val_feature_lst) and
+ os.path.exists(args.val_label_lst)):
+ return -1.0, -1.0
+ batch_id = 0
+ test_costs = []
+ test_accs = []
+ while True:
+ if batch_id == 0:
+ py_test_reader.start()
+ try:
+ if args.parallel:
+ cost, acc = exe.run(
+ fetch_list=[avg_cost.name, accuracy.name],
+ return_numpy=False)
+ else:
+ cost, acc = exe.run(program=test_program,
+ fetch_list=[avg_cost, accuracy],
+ return_numpy=False)
+ sys.stdout.write('.')
+ sys.stdout.flush()
+ test_costs.append(np.array(cost)[0])
+ test_accs.append(np.array(acc)[0])
+ batch_id += 1
+ except fluid.core.EOFException:
+ py_test_reader.reset()
+ break
+ return np.mean(test_costs), np.mean(test_accs)
+
+ # train
+ for pass_id in range(args.pass_num):
+ pass_start_time = time.time()
+ batch_id = 0
+ while True:
+ if batch_id == 0:
+ py_train_reader.start()
+ to_print = batch_id > 0 and (batch_id % args.print_per_batches == 0)
+ try:
+ if args.parallel:
+ outs = train_exe.run(
+ fetch_list=[avg_cost.name, accuracy.name]
+ if to_print else [],
+ return_numpy=False)
+ else:
+ outs = exe.run(program=train_program,
+ fetch_list=[avg_cost, accuracy]
+ if to_print else [],
+ return_numpy=False)
+ except fluid.core.EOFException:
+ py_train_reader.reset()
+ break
+
+ if to_print:
+ if args.parallel:
+ print("\nBatch %d, train cost: %f, train acc: %f" %
+ (batch_id, np.mean(outs[0]), np.mean(outs[1])))
+ else:
+ print("\nBatch %d, train cost: %f, train acc: %f" % (
+ batch_id, np.array(outs[0])[0], np.array(outs[1])[0]))
+ # save the latest checkpoint
+ if args.checkpoints != '':
+ model_path = os.path.join(args.checkpoints,
+ "deep_asr.latest.checkpoint")
+ fluid.io.save_persistables(exe, model_path, train_program)
+ else:
+ sys.stdout.write('.')
+ sys.stdout.flush()
+
+ batch_id += 1
+ # run test
+ val_cost, val_acc = test(test_exe if args.parallel else exe)
+
+ # save checkpoint per pass
+ if args.checkpoints != '':
+ model_path = os.path.join(
+ args.checkpoints,
+ "deep_asr.pass_" + str(pass_id) + ".checkpoint")
+ fluid.io.save_persistables(exe, model_path, train_program)
+ # save inference model
+ if args.infer_models != '':
+ model_path = os.path.join(
+ args.infer_models,
+ "deep_asr.pass_" + str(pass_id) + ".infer.model")
+ fluid.io.save_inference_model(model_path, ["feature"],
+ [prediction], exe, train_program)
+ # cal pass time
+ pass_end_time = time.time()
+ time_consumed = pass_end_time - pass_start_time
+ # print info at pass end
+ print("\nPass %d, time consumed: %f s, val cost: %f, val acc: %f\n" %
+ (pass_id, time_consumed, val_cost, val_acc))
+
+
+def batch_data_to_lod_tensors(args, batch_data, place):
+ features, labels, lod, name_lst = batch_data
+ features = np.reshape(features, (-1, 11, 3, args.frame_dim))
+ features = np.transpose(features, (0, 2, 1, 3))
+ feature_t = fluid.LoDTensor()
+ label_t = fluid.LoDTensor()
+ feature_t.set(features, place)
+ feature_t.set_lod([lod])
+ label_t.set(labels, place)
+ label_t.set_lod([lod])
+ return feature_t, label_t
+
+
+if __name__ == '__main__':
+ args = parse_args()
+ print_arguments(args)
+
+ train(args)
diff --git a/PaddleSpeech/README.md b/PaddleSpeech/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..39f91c26bd90fdd0e8fa81a395d14c2d3826f7cd
--- /dev/null
+++ b/PaddleSpeech/README.md
@@ -0,0 +1,12 @@
+Fluid Model Library
+============
+
+Speech Recognition
+--------
+
+Automatic Speech Recognition (ASR) is the technology that transcribes the vocabulary content of human speech into text that a computer can process. Research on speech recognition went through a long period of exploration; progress was slow after the HMM/GMM era, until the rise of deep learning revitalized the field. On many speech recognition tasks, using deep neural networks (DNNs) as acoustic models achieved better performance than GMMs, making ASR one of the most successful application areas of deep learning. With recognition accuracy improving continuously, more and more speech-based products have landed, such as voice input methods and the smart home devices represented by smart speakers — speech-based interaction is profoundly changing human life.
+
+Unlike [DeepSpeech](https://github.com/PaddlePaddle/DeepSpeech), in which a deep learning model directly predicts the distribution over words end to end, this example is closer to the traditional speech recognition pipeline: it models at the phoneme level and focuses on training the acoustic model, using [kaldi](http://www.kaldi-asr.org) for audio feature extraction and label alignment, and integrating kaldi's decoder to complete the decoding.
+
+- [DeepASR](https://github.com/PaddlePaddle/models/blob/develop/PaddleSpeech/DeepASR/README_cn.md)
+
diff --git a/README.md b/README.md
index f08da24e2f6f0e6d2c7e3632bf27da3e0c20565e..6ce65ed3c1d8ecd730234cd396256d287d42ae37 100644
--- a/README.md
+++ b/README.md
@@ -8,8 +8,72 @@ PaddlePaddle provides a rich set of computational units to enable users to adopt
- [fluid models](fluid): use PaddlePaddle's Fluid APIs. We especially recommend users to use Fluid models.
-- [legacy models](legacy): use PaddlePaddle's v2 APIs.
+PaddlePaddle provides a rich set of computational units that let users solve a variety of machine learning problems in a modular way. This repo demonstrates how to use PaddlePaddle to solve common machine learning tasks, offering a number of easy-to-learn, easy-to-use neural network models.
+
+- [fluid models](fluid): models using PaddlePaddle's Fluid APIs. We especially recommend using the Fluid models.
+
+## PaddleCV
+Model|Description|Strengths|Reference
+--|:--:|:--:|:--:
+[AlexNet](./PaddleCV/image_classification/models)|Classic image classification model|First successful application of ReLU, Dropout and LRN in a CNN, with GPU-accelerated training|[ImageNet Classification with Deep Convolutional Neural Networks](https://www.researchgate.net/publication/267960550_ImageNet_Classification_with_Deep_Convolutional_Neural_Networks)
+[VGG](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification/models)|Classic image classification model|Builds on AlexNet with small 3*3 convolution kernels and a deeper network, giving strong generalization|[Very Deep ConvNets for Large-Scale Image Recognition](https://arxiv.org/pdf/1409.1556.pdf)
+[GoogleNet](./PaddleCV/image_classification/models)|Classic image classification model|Increases the depth and width of the network without increasing its computational load, for superior performance|[Going deeper with convolutions](https://ieeexplore.ieee.org/document/7298594)
+[ResNet](./PaddleCV/image_classification/models)|Residual network|Introduces a new residual structure that solves the accuracy degradation of ever-deeper networks|[Deep Residual Learning for Image Recognition](https://arxiv.org/abs/1512.03385)
+[Inception-v4](./PaddleCV/image_classification/models)|Classic image classification model|A deeper and wider Inception structure|[Inception-ResNet and the Impact of Residual Connections on Learning](http://arxiv.org/abs/1602.07261)
+[MobileNet](./PaddleCV/image_classification/models)|Lightweight network model|An efficient model designed for mobile and embedded devices|[MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications](https://arxiv.org/abs/1704.04861)
+[DPN](./PaddleCV/image_classification/models)|Image classification model|Combines the network structures of DenseNet and ResNeXt, improving image classification results|[Dual Path Networks](https://arxiv.org/abs/1707.01629)
+[SE-ResNeXt](./PaddleCV/image_classification/models)|Image classification model|Adds SE blocks to ResNeXt, raising model accuracy|[Squeeze-and-excitation networks](https://arxiv.org/abs/1709.01507)
+[SSD](./PaddleCV/object_detection/README_cn.md)|Single-stage object detector|Detects objects of the corresponding scale on feature maps of different scales, and plugs easily into any standard convolutional network|[SSD: Single Shot MultiBox Detector](https://arxiv.org/abs/1512.02325)
+[YOLOv3](./PaddleCV/yolov3/README_cn.md)|Single-stage object detector|End-to-end real-time object detection on multi-scale feature maps over a darknet53 backbone, with fast detection speed|[YOLOv3: An Incremental Improvement](https://arxiv.org/abs/1804.02767)
+[Face Detector: PyramidBox](./PaddleCV/face_detection/README_cn.md)|Single-stage face detector based on SSD|Exploits contextual information to detect hard faces, with high representational power and strong robustness|[PyramidBox: A Context-assisted Single Shot Face Detector](https://arxiv.org/pdf/1803.07737.pdf)
+[Faster RCNN](./PaddleCV/rcnn/README_cn.md)|Typical two-stage object detector|Creatively generates region proposals with a convolutional network that is shared with the detection network, reducing the number of proposals while raising their quality|[Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks](https://arxiv.org/abs/1506.01497)
+[Mask RCNN](./PaddleCV/rcnn/README_cn.md)|Classic instance segmentation model based on Faster RCNN|Adds a segmentation branch to the original Faster RCNN model to produce mask results, decoupling mask and class prediction|[Mask R-CNN](https://arxiv.org/abs/1703.06870)
+[ICNet](./PaddleCV/icnet)|Real-time semantic segmentation model|Considers both speed and accuracy, balancing accuracy on high-resolution images against the efficiency of low-complexity networks|[ICNet for Real-Time Semantic Segmentation on High-Resolution Images](https://arxiv.org/abs/1704.08545)
+[DCGAN](./PaddleCV/gan/c_gan)|Image generation model|Deep convolutional generative adversarial network, combining GANs with convolutional networks to stabilize GAN training|[Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks](https://arxiv.org/pdf/1511.06434.pdf)
+[ConditionalGAN](./PaddleCV/gan/c_gan)|Image generation model|Conditional generative adversarial network: a GAN with conditional constraints that uses extra information to guide the data generation process|[Conditional Generative Adversarial Nets](https://arxiv.org/abs/1411.1784)
+[CycleGAN](./PaddleCV/gan/cycle_gan)|Image translation model|Automatically converts images of one class into another class; usable for style transfer|[Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks](https://arxiv.org/abs/1703.10593)
+[CRNN-CTC model](./PaddleCV/ocr_recognition)|Scene text recognition model|Uses a CTC model to recognize single-line English characters in images|[Connectionist Temporal Classification: Labelling Unsegmented Sequence Data with Recurrent Neural Networks](https://www.researchgate.net/publication/221346365_Connectionist_temporal_classification_Labelling_unsegmented_sequence_data_with_recurrent_neural_networks)
+[Attention model](./PaddleCV/ocr_recognition)|Scene text recognition model|Uses attention to recognize single-line English characters in images|[Recurrent Models of Visual Attention](https://arxiv.org/abs/1406.6247)
+[Metric Learning](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/metric_learning)|Metric learning model|Analyzes the correlation and comparison relations between objects; can assist classification and clustering and is widely used in image retrieval, face recognition and other fields|-
+[TSN](./PaddleCV/video_classification)|Video classification model|Models long-range temporal structure, combining a sparse temporal sampling strategy with video-level supervision to make learning over whole videos effective and efficient|[Temporal Segment Networks: Towards Good Practices for Deep Action Recognition](https://arxiv.org/abs/1608.00859)
+[Video model library](./PaddleCV/video)|Video model library|Gives developers convenient, efficient PaddlePaddle-based deep learning solutions for video understanding, video editing, video generation and other tasks|-
+[caffe2fluid](./PaddleCV/caffe2fluid)|Tool for converting Caffe models into Paddle Fluid configuration and model files|-|-
+
+## PaddleNLP
+Model|Description|Strengths|Reference
+--|:--:|:--:|:--:
+[Transformer](./PaddleNLP/neural_machine_translation/transformer/README_cn.md)|Machine translation model|Based on self-attention: low computational complexity, high parallelism, easier learning of long-range dependencies, and better translation quality|[Attention Is All You Need](https://arxiv.org/abs/1706.03762)
+[BERT](https://github.com/PaddlePaddle/LARK/tree/develop/BERT)|Semantic representation model|Achieves SOTA results on multiple NLP tasks; supports multi-GPU and multi-node training as well as mixed-precision training|[BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805)
+[LAC](https://github.com/baidu/lac/blob/master/README.md)|Joint lexical analysis model|Performs Chinese word segmentation, part-of-speech tagging and proper-name recognition as a single task|[Chinese Lexical Analysis with Deep Bi-GRU-CRF Network](https://arxiv.org/abs/1807.01882)
+[Senta](https://github.com/baidu/Senta/blob/master/README.md)|Set of sentiment analysis models|The sentiment analysis models of the Baidu AI open platform|-
+[DAM](./PaddleNLP/deep_attention_matching_net)|Semantic matching model|Work by Baidu NLP published at ACL 2018, for response selection in multi-turn conversations of retrieval-based chatbots|[Multi-Turn Response Selection for Chatbots with Deep Attention Matching Network](http://aclweb.org/anthology/P18-1103)
+[SimNet](https://github.com/baidu/AnyQ/blob/master/tools/simnet/train/paddle/README.md)|Semantic matching framework|Models built with SimNet can be conveniently added to the AnyQ system to strengthen its semantic matching ability|-
+[DuReader](./PaddleNLP/machine_reading_comprehension/README.md)|Reading comprehension model|Machine reading comprehension model for the Baidu MRC dataset|-
+[Bi-GRU-CRF](./PaddleNLP/sequence_tagging_for_ner/README.md)|Named entity recognition|Named entity recognition model combining a CRF with a bidirectional GRU|-
+[dialogue model](https://github.com/baidu/knowledge-driven-dialogue/tree/master/generative_paddle/README.md)|Knowledge-driven dialogue model|Generative dialogue system based on a bidirectional RNN with attention|-
+
+
+## PaddleRec
+Model|Introduction|Strengths|Reference
+--|:--:|:--:|:--:
+[TagSpace](./PaddleRec/tagspace)|Embedding model for text and tags|Applied to industrial-scale tag recommendation, e.g. recommending tags for feed news|[#TagSpace: Semantic embeddings from hashtags](https://www.bibsonomy.org/bibtex/0ed4314916f8e7c90d066db45c293462)
+[GRU4Rec](./PaddleRec/gru4rec)|Personalized recommendation model|The first to apply RNNs (GRU) to session-based recommendation, with clear gains over traditional KNN and matrix factorization|[Session-based Recommendations with Recurrent Neural Networks](https://arxiv.org/abs/1511.06939)
+[SSR](./PaddleRec/ssr)|Sequential semantic retrieval recommendation model|Follows the idea of the referenced paper and predicts user behavior at multiple temporal granularities|[Multi-Rate Deep Learning for Temporal Recommendation](https://dl.acm.org/citation.cfm?id=2914726)
+[DeepCTR](./PaddleRec/ctr/README.cn.md)|Click-through rate estimation model|Implements only the DNN part of the model described in the DeepFM paper; the full DeepFM will be provided in another example|[DeepFM: A Factorization-Machine based Neural Network for CTR Prediction](https://arxiv.org/abs/1703.04247)
+[Multiview-Simnet](./PaddleRec/multiview_simnet)|Personalized recommendation model|Multi-view model that fuses multiple feature views of users and items into a unified model|[A Multi-View Deep Learning Approach for Cross Domain User Modeling in Recommendation Systems](http://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/frp1159-songA.pdf)
+
+## Other Models
+Model|Introduction|Strengths|Reference
+--|:--:|:--:|:--:
+[DeepASR](./PaddleSpeech/DeepASR/README_cn.md)|Speech recognition system|Uses the Fluid framework to configure and train the acoustic model of a speech recognition system and integrates the Kaldi decoder|-
+[DQN](./PaddleRL/DeepQNetwork/README_cn.md)|Deep Q-Network|Value-based reinforcement learning algorithm; the first model to successfully combine deep learning with reinforcement learning|[Human-level control through deep reinforcement learning](https://www.nature.com/articles/nature14236)
+[DoubleDQN](./PaddleRL/DeepQNetwork/README_cn.md)|DQN variant|Applies the Double Q-learning idea to DQN to mitigate the overestimation problem|[Deep Reinforcement Learning with Double Q-Learning](https://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/viewPaper/12389)
+[DuelingDQN](./PaddleRL/DeepQNetwork/README_cn.md)|DQN variant|Improves on the DQN model architecture and raises its performance|[Dueling Network Architectures for Deep Reinforcement Learning](http://proceedings.mlr.press/v48/wangf16.html)
## License
This tutorial is contributed by [PaddlePaddle](https://github.com/PaddlePaddle/Paddle) and licensed under the [Apache-2.0 license](LICENSE).
diff --git a/contrib/README.md b/contrib/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/fluid/AutoDL/LRC/README.md b/fluid/AutoDL/LRC/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..546cb19169b965af5a3d0d41c903e318d4dfc64a
--- /dev/null
+++ b/fluid/AutoDL/LRC/README.md
@@ -0,0 +1,6 @@
+
+Hi!
+
+This directory has been deprecated.
+
+Please visit the project at [AutoDL/LRC](../../../AutoDL/LRC).
diff --git a/fluid/AutoDL/LRC/README_cn.md b/fluid/AutoDL/LRC/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..6c87fd2d1cb5f6f4d187d665548ed7c74746bf10
--- /dev/null
+++ b/fluid/AutoDL/LRC/README_cn.md
@@ -0,0 +1,2 @@
+
+Hi! This project has been moved. Please visit it at [AutoDL/LRC](../../../AutoDL/LRC).
diff --git a/fluid/DeepASR/README.md b/fluid/DeepASR/README.md
index 6b9913fd30a56ef2328bc62e9b36e496f6763430..b7d916c58649790055b2ddbdd32e914d02f14ebf 100644
--- a/fluid/DeepASR/README.md
+++ b/fluid/DeepASR/README.md
@@ -1,36 +1,6 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is the latest develop branch. If you are on an earlier version of PaddlePaddle, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
-## Deep Automatic Speech Recognition
+Hi!
-### Introduction
-TBD
+This directory has been deprecated.
-### Installation
-
-#### Kaldi
-The decoder depends on [kaldi](https://github.com/kaldi-asr/kaldi); install it by following its instructions. Then
-
-```shell
-export KALDI_ROOT=
-```
-
-#### Decoder
-
-```shell
-git clone https://github.com/PaddlePaddle/models.git
-cd models/fluid/DeepASR/decoder
-sh setup.sh
-```
-
-### Data preprocessing
-TBD
-
-### Training
-TBD
-
-
-### Inference & Decoding
-TBD
-
-### Question and Contribution
-TBD
+Please visit the project at [PaddleSpeech/DeepASR](../../PaddleSpeech/DeepASR).
diff --git a/fluid/DeepASR/README_cn.md b/fluid/DeepASR/README_cn.md
index be78a048701a621bd90942bdfe30ef4d7c7f082f..51b0e724c810165810154915f41159d478398234 100644
--- a/fluid/DeepASR/README_cn.md
+++ b/fluid/DeepASR/README_cn.md
@@ -1,186 +1,2 @@
-The code examples in this directory require PaddlePaddle v0.14 or higher. If your PaddlePaddle installation is older than this, please update it following the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html).
----
-
-DeepASR (Deep Automatic Speech Recognition) is a speech recognition system built on PaddlePaddle Fluid and [Kaldi](http://www.kaldi-asr.org). It uses the Fluid framework to configure and train the acoustic model of a speech recognition system and integrates Kaldi's decoder. It is designed so that users already familiar with Kaldi can train acoustic models quickly and at large scale, while relying on Kaldi for complex speech data preprocessing and the final decoding.
-
-### Contents
-- [Model overview](#model-overview)
-- [Installation](#installation)
-- [Data preprocessing](#data-reprocessing)
-- [Training](#training)
-- [Profiling the training](#perf-profiling)
-- [Inference and decoding](#infer-decoding)
-- [Scoring the error rate](#scoring-error-rate)
-- [Aishell example](#aishell-example)
-- [Contributing more examples](#how-to-contrib)
-
-### Model overview
-
-DeepASR's acoustic model is a single convolutional layer followed by a stack of LSTMP layers: the convolution performs preliminary feature extraction, the stacked LSTMP layers model the temporal dependencies, and the loss function is cross entropy. [LSTMP](https://arxiv.org/abs/1402.1128) (LSTM with recurrent projection layer) extends the conventional LSTM by adding a projection layer that maps the hidden state to a lower dimension before feeding it to the next time step, which greatly reduces the parameter count and computational complexity of the LSTM while also improving its performance.
-
-Figure 1. Topology of the LSTMP layer
-
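-For reference, the per-step LSTMP recurrence from the cited paper can be sketched as follows (a simplified form: the peephole terms, which the implementation enables via `use_peepholes`, are omitted; $\odot$ denotes element-wise multiplication):
-
-$$
-\begin{aligned}
-i_t &= \sigma(W_{ix} x_t + W_{ir} r_{t-1} + b_i) \\
-f_t &= \sigma(W_{fx} x_t + W_{fr} r_{t-1} + b_f) \\
-o_t &= \sigma(W_{ox} x_t + W_{or} r_{t-1} + b_o) \\
-c_t &= f_t \odot c_{t-1} + i_t \odot \tanh(W_{cx} x_t + W_{cr} r_{t-1} + b_c) \\
-m_t &= o_t \odot \tanh(c_t) \\
-r_t &= W_{rm} m_t
-\end{aligned}
-$$
-
-Here $r_t$ is the projected recurrent state that is fed both to the next time step and to the next layer, which is what shrinks the parameter count relative to a plain LSTM.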
-
-### Installation
-
-
-#### Installing and configuring Kaldi
-
-
-The decoder used by DeepASR during decoding depends on [an installation of Kaldi](https://github.com/kaldi-asr/kaldi). If Kaldi is not available in your environment, please `git clone` its source code, install it with the given commands, and finally set the environment variable `KALDI_ROOT`:
-
-```shell
-export KALDI_ROOT=
-
-```
-#### Installing the decoder
-Enter the directory containing the decoder source code:
-
-```shell
-cd models/fluid/DeepASR/decoder
-```
-Run the setup script:
-
-```shell
-sh setup.sh
-```
- Once compilation completes, the decoder has been installed successfully.
-
-### Data preprocessing
-
-Follow [Kaldi's data preparation process](http://kaldi-asr.org/doc/data_prep.html) to extract features from the audio data and align the labels.
-
-### Training the acoustic model
-
-The acoustic model can be trained in either CPU or GPU mode. For example, to train in GPU mode:
-
-```shell
-CUDA_VISIBLE_DEVICES=0,1,2,3 python -u train.py \
- --train_feature_lst train_feature.lst \
- --train_label_lst train_label.lst \
- --val_feature_lst val_feature.lst \
- --val_label_lst val_label.lst \
- --mean_var global_mean_var \
- --parallel
-```
-Here `train_feature.lst` and `train_label.lst` are the feature list file and the label list file of the training set; likewise, `val_feature.lst` and `val_label.lst` are the corresponding list files of the validation set. In actual training, important parameters such as the size of the modeling units and the learning rate must be set correctly. For a description of these parameters, run
-
-```shell
-python train.py --help
-```
-to get more information.
-
-### Profiling the training
-
-Using profiler, the performance analysis tool provided by Fluid, the training process can be profiled to obtain operator-level execution times of the network:
-
-```shell
-CUDA_VISIBLE_DEVICES=0 python -u tools/profile.py \
- --train_feature_lst train_feature.lst \
- --train_label_lst train_label.lst \
- --val_feature_lst val_feature.lst \
- --val_label_lst val_label.lst \
- --mean_var global_mean_var
-```
-
-
-### Inference and decoding
-
-After the acoustic model has been sufficiently trained, the checkpoints saved during training can be used to decode input audio and produce speech-to-text recognition results:
-
-```
-CUDA_VISIBLE_DEVICES=0,1,2,3 python -u infer_by_ckpt.py \
- --batch_size 96 \
- --checkpoint deep_asr.pass_1.checkpoint \
- --infer_feature_lst test_feature.lst \
- --infer_label_lst test_label.lst \
- --mean_var global_mean_var \
- --parallel
-```
-
-### Scoring the error rate
-
-Word error rate (WER) and character error rate (CER) are commonly used metrics for evaluating speech recognition systems. DeepASR also implements the corresponding measurement tools, which are run as follows:
-
-```
-python score_error_rate.py --error_rate_type cer --ref ref.txt --hyp decoding.txt
-```
-The parameter `error_rate_type` specifies which error rate to measure, i.e. WER or CER; `ref.txt` and `decoding.txt` are the reference text and the actually decoded text respectively, and they share the same format:
-
-```
-key1 text1
-key2 text2
-key3 text3
-...
-
-```
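-
-For intuition, the character error rate is the character-level edit distance between the decoded text and the reference, divided by the reference length. A minimal single-file sketch of such a scorer (an illustration only, not the actual `score_error_rate.py` implementation; it assumes a non-empty reference) could look like:
-
-```python
-def edit_distance(ref, hyp):
-    """Levenshtein distance between two sequences via single-row DP."""
-    dp = list(range(len(hyp) + 1))
-    for i, r in enumerate(ref, 1):
-        prev, dp[0] = dp[0], i
-        for j, h in enumerate(hyp, 1):
-            # dp[j] still holds the "up" cell, dp[j - 1] the new "left"
-            # cell, and prev the old diagonal cell.
-            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1,
-                                     prev + (r != h))
-    return dp[-1]
-
-
-def cer(ref_text, hyp_text):
-    """Character error rate: edit distance over reference length."""
-    ref = list(ref_text.replace(" ", ""))  # drop the word separators
-    hyp = list(hyp_text.replace(" ", ""))
-    return float(edit_distance(ref, hyp)) / len(ref)
-```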
-
-
-### Aishell example
-
-This section uses the [Aishell dataset](http://www.aishelltech.com/kysjcp) as an example to show the complete process from data preprocessing to decoding output. Aishell is an open Mandarin Chinese speech dataset released by Beijing Shell Shell Technology; it contains 178 hours of speech recorded by 400 speakers from different accent regions. The raw data can be obtained from [openslr](http://www.openslr.org/33). To simplify the process, a preprocessed version of the dataset is provided for download:
-
-```
-cd examples/aishell
-sh prepare_data.sh
-```
-
-The download includes the training data for the acoustic model and the auxiliary files used during decoding. After the download completes, the training process can be profiled before training starts:
-
-```
-sh profile.sh
-```
-
-Run the training:
-
-```
-sh train.sh
-```
-By default, training runs on 4 GPUs. In practice, `batch_size`, the learning rate and other parameters can be adjusted dynamically according to the number of available GPUs and their memory size. Typical trends of the loss function and accuracy during training are shown in Figure 2.
-
-Figure 2. Learning curves of the acoustic model on the Aishell dataset
-
-
-After the model is trained, inference can be run to recognize the text in the test-set audio:
-
-```
-sh infer_by_ckpt.sh
-```
-
-This involves two key stages: prediction by the acoustic model and decoding by the decoder. A sample of the decoding output:
-
-```
-...
-BAC009S0764W0239 十一 五 期间 我 国 累计 境外 投资 七千亿 美元
-BAC009S0765W0140 在 了解 送 方 的 资产 情况 与 需求 之后
-BAC009S0915W0291 这 对 苹果 来说 不 是 件 容易 的 事 儿
-BAC009S0769W0159 今年 土地 收入 预计 近 四万亿 元
-BAC009S0907W0451 由 浦东 商店 作为 掩护
-BAC009S0768W0128 土地 交易 可能 随着 供应 淡季 的 到来 而 降温
-...
-```
-
-Each line corresponds to one output; it starts with the key of the audio sample, followed by the decoded Chinese text separated into words. After decoding completes, run the script to evaluate the character error rate (CER):
-
-```
-sh score_cer.sh
-```
-
-The output looks like:
-
-```
-Error rate[cer] = 0.101971 (10683/104765),
-total 7176 sentences in hyp, 0 not presented in ref.
-```
-
-With an acoustic model trained for about 20 passes, a CER of roughly 10% can be achieved on the Aishell test set.
-
-
-### Contributing more examples
-
-DeepASR currently only provides the Aishell example. We welcome users to test the complete training pipeline on more datasets and contribute them to this project.
+Hi! This project has been moved. Please visit it at [PaddleSpeech/DeepASR](../../PaddleSpeech/DeepASR).
diff --git a/fluid/DeepASR/data_utils/util.py b/fluid/DeepASR/data_utils/util.py
deleted file mode 100644
index 27f5ba8305c0e72852230658858d77fed9d233a4..0000000000000000000000000000000000000000
--- a/fluid/DeepASR/data_utils/util.py
+++ /dev/null
@@ -1,81 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import sys
-from six import reraise
-from tblib import Traceback
-
-import numpy as np
-import paddle.fluid as fluid
-
-
-def to_lodtensor(data, place):
- """convert tensor to lodtensor
- """
- seq_lens = [len(seq) for seq in data]
- cur_len = 0
- lod = [cur_len]
- for l in seq_lens:
- cur_len += l
- lod.append(cur_len)
- flattened_data = np.concatenate(data, axis=0).astype("int64")
- flattened_data = flattened_data.reshape([len(flattened_data), 1])
- res = fluid.LoDTensor()
- res.set(flattened_data, place)
- res.set_lod([lod])
- return res
-
-
-def lodtensor_to_ndarray(lod_tensor):
- """conver lodtensor to ndarray
- """
- dims = lod_tensor._get_dims()
- ret = np.zeros(shape=dims).astype('float32')
- for i in range(np.product(dims)):
- ret.ravel()[i] = lod_tensor.get_float_element(i)
- return ret, lod_tensor.lod()
-
-
-def split_infer_result(infer_seq, lod):
- infer_batch = []
- for i in range(0, len(lod[0]) - 1):
- infer_batch.append(infer_seq[lod[0][i]:lod[0][i + 1]])
- return infer_batch
-
-
-class CriticalException(Exception):
- pass
-
-
-def suppress_signal(signo, stack_frame):
- pass
-
-
-def suppress_complaints(verbose, notify=None):
- def decorator_maker(func):
- def suppress_wrapper(*args, **kwargs):
- try:
- func(*args, **kwargs)
- except:
- et, ev, tb = sys.exc_info()
-
- if notify is not None:
- notify(except_type=et, except_value=ev, traceback=tb)
-
- if verbose == 1 or isinstance(ev, CriticalException):
- reraise(et, ev, Traceback(tb).as_traceback())
-
- return suppress_wrapper
-
- return decorator_maker
-
-
-class ForceExitWrapper(object):
- def __init__(self, exit_flag):
- self._exit_flag = exit_flag
-
- @suppress_complaints(verbose=0)
- def __call__(self, *args, **kwargs):
- self._exit_flag.value = True
-
- def __eq__(self, flag):
- return self._exit_flag.value == flag
diff --git a/fluid/DeepASR/examples/aishell/train.sh b/fluid/DeepASR/examples/aishell/train.sh
deleted file mode 100644
index 06fe488d4572782d946e8daa7c22ded8ef0212c6..0000000000000000000000000000000000000000
--- a/fluid/DeepASR/examples/aishell/train.sh
+++ /dev/null
@@ -1,13 +0,0 @@
-export CUDA_VISIBLE_DEVICES=4,5,6,7
-python -u ../../train.py --train_feature_lst data/train_feature.lst \
- --train_label_lst data/train_label.lst \
- --val_feature_lst data/val_feature.lst \
- --val_label_lst data/val_label.lst \
- --mean_var data/global_mean_var \
- --checkpoints checkpoints \
- --frame_dim 80 \
- --class_num 3040 \
- --infer_models '' \
- --batch_size 64 \
- --learning_rate 6.4e-5 \
- --parallel
diff --git a/fluid/DeepASR/model_utils/model.py b/fluid/DeepASR/model_utils/model.py
deleted file mode 100644
index 31892a7ad93eed0ba6ad6e7b53377e897f6df29d..0000000000000000000000000000000000000000
--- a/fluid/DeepASR/model_utils/model.py
+++ /dev/null
@@ -1,107 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import paddle.fluid as fluid
-
-
-def stacked_lstmp_model(frame_dim,
- hidden_dim,
- proj_dim,
- stacked_num,
- class_num,
- parallel=False,
- is_train=True):
- """ The model for DeepASR. The main structure is composed of stacked
- identical LSTMP (LSTM with recurrent projection) layers.
-
- When running in training and validation phase, the feeding dictionary
- is {'feature', 'label'}, fed by the LodTensor for feature data and
- label data respectively. And in inference, only `feature` is needed.
-
- Args:
- frame_dim(int): The frame dimension of feature data.
- hidden_dim(int): The hidden state's dimension of the LSTMP layer.
- proj_dim(int): The projection size of the LSTMP layer.
- stacked_num(int): The number of stacked LSTMP layers.
- parallel(bool): Run in parallel or not, default `False`.
- is_train(bool): Run in training phase or not, default `True`.
- class_num(int): The number of output classes.
- """
-
- # network configuration
- def _net_conf(feature, label):
- conv1 = fluid.layers.conv2d(
- input=feature,
- num_filters=32,
- filter_size=3,
- stride=1,
- padding=1,
- bias_attr=True,
- act="relu")
-
- pool1 = fluid.layers.pool2d(
- conv1, pool_size=3, pool_type="max", pool_stride=2, pool_padding=0)
-
- stack_input = pool1
- for i in range(stacked_num):
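- # Each block: an FC layer produces the 4 * hidden_dim gate pre-activations
- # consumed by dynamic_lstmp; the LSTMP output is projected to proj_dim and
- # batch-normalized before feeding the next block.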
- fc = fluid.layers.fc(input=stack_input,
- size=hidden_dim * 4,
- bias_attr=None)
- proj, cell = fluid.layers.dynamic_lstmp(
- input=fc,
- size=hidden_dim * 4,
- proj_size=proj_dim,
- bias_attr=True,
- use_peepholes=True,
- is_reverse=False,
- cell_activation="tanh",
- proj_activation="tanh")
- bn = fluid.layers.batch_norm(
- input=proj,
- is_test=not is_train,
- momentum=0.9,
- epsilon=1e-05,
- data_layout='NCHW')
- stack_input = bn
-
- prediction = fluid.layers.fc(input=stack_input,
- size=class_num,
- act='softmax')
-
- cost = fluid.layers.cross_entropy(input=prediction, label=label)
- avg_cost = fluid.layers.mean(x=cost)
- acc = fluid.layers.accuracy(input=prediction, label=label)
- return prediction, avg_cost, acc
-
- # data feeder
- feature = fluid.layers.data(
- name="feature",
- shape=[-1, 3, 11, frame_dim],
- dtype="float32",
- lod_level=1)
- label = fluid.layers.data(
- name="label", shape=[-1, 1], dtype="int64", lod_level=1)
-
- if parallel:
- # When the execution place is specified to CUDAPlace, the program will
- # run on all $CUDA_VISIBLE_DEVICES GPUs. Otherwise the program will
- # run on all CPU devices.
- places = fluid.layers.device.get_places()
- pd = fluid.layers.ParallelDo(places)
- with pd.do():
- feat_ = pd.read_input(feature)
- label_ = pd.read_input(label)
- prediction, avg_cost, acc = _net_conf(feat_, label_)
- for out in [prediction, avg_cost, acc]:
- pd.write_output(out)
-
- # get mean loss and acc through every devices.
- prediction, avg_cost, acc = pd()
- prediction.stop_gradient = True
- avg_cost = fluid.layers.mean(x=avg_cost)
- acc = fluid.layers.mean(x=acc)
- else:
- prediction, avg_cost, acc = _net_conf(feature, label)
-
- return prediction, avg_cost, acc
diff --git a/fluid/DeepASR/train.py b/fluid/DeepASR/train.py
deleted file mode 100644
index 1c35a6637f534abf4a37763fe1915c35e18e1f94..0000000000000000000000000000000000000000
--- a/fluid/DeepASR/train.py
+++ /dev/null
@@ -1,292 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import sys
-import os
-import numpy as np
-import argparse
-import time
-
-import paddle.fluid as fluid
-import data_utils.augmentor.trans_mean_variance_norm as trans_mean_variance_norm
-import data_utils.augmentor.trans_add_delta as trans_add_delta
-import data_utils.augmentor.trans_splice as trans_splice
-import data_utils.augmentor.trans_delay as trans_delay
-import data_utils.async_data_reader as reader
-from data_utils.util import lodtensor_to_ndarray
-from model_utils.model import stacked_lstmp_model
-
-
-def parse_args():
- parser = argparse.ArgumentParser("Training for stacked LSTMP model.")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=32,
- help='The sequence number of a batch data. (default: %(default)d)')
- parser.add_argument(
- '--minimum_batch_size',
- type=int,
- default=1,
- help='The minimum sequence number of a batch data. '
- '(default: %(default)d)')
- parser.add_argument(
- '--frame_dim',
- type=int,
- default=80,
- help='Frame dimension of feature data. (default: %(default)d)')
- parser.add_argument(
- '--stacked_num',
- type=int,
- default=5,
- help='Number of lstmp layers to stack. (default: %(default)d)')
- parser.add_argument(
- '--proj_dim',
- type=int,
- default=512,
- help='Project size of lstmp unit. (default: %(default)d)')
- parser.add_argument(
- '--hidden_dim',
- type=int,
- default=1024,
- help='Hidden size of lstmp unit. (default: %(default)d)')
- parser.add_argument(
- '--class_num',
- type=int,
- default=3040,
- help='Number of classes in label. (default: %(default)d)')
- parser.add_argument(
- '--pass_num',
- type=int,
- default=100,
- help='Epoch number to train. (default: %(default)d)')
- parser.add_argument(
- '--print_per_batches',
- type=int,
- default=100,
- help='Interval to print training accuracy. (default: %(default)d)')
- parser.add_argument(
- '--learning_rate',
- type=float,
- default=0.00016,
- help='Learning rate used to train. (default: %(default)f)')
- parser.add_argument(
- '--device',
- type=str,
- default='GPU',
- choices=['CPU', 'GPU'],
- help='The device type. (default: %(default)s)')
- parser.add_argument(
- '--parallel', action='store_true', help='If set, run in parallel.')
- parser.add_argument(
- '--mean_var',
- type=str,
- default='data/global_mean_var_search26kHr',
- help="The path for feature's global mean and variance. "
- "(default: %(default)s)")
- parser.add_argument(
- '--train_feature_lst',
- type=str,
- default='data/feature.lst',
- help='The feature list path for training. (default: %(default)s)')
- parser.add_argument(
- '--train_label_lst',
- type=str,
- default='data/label.lst',
- help='The label list path for training. (default: %(default)s)')
- parser.add_argument(
- '--val_feature_lst',
- type=str,
- default='data/val_feature.lst',
- help='The feature list path for validation. (default: %(default)s)')
- parser.add_argument(
- '--val_label_lst',
- type=str,
- default='data/val_label.lst',
- help='The label list path for validation. (default: %(default)s)')
- parser.add_argument(
- '--init_model_path',
- type=str,
- default=None,
- help="The model (checkpoint) path which the training resumes from. "
- "If None, train the model from scratch. (default: %(default)s)")
- parser.add_argument(
- '--checkpoints',
- type=str,
- default='./checkpoints',
- help="The directory for saving checkpoints. Do not save checkpoints "
- "if set to ''. (default: %(default)s)")
- parser.add_argument(
- '--infer_models',
- type=str,
- default='./infer_models',
- help="The directory for saving inference models. Do not save inference "
- "models if set to ''. (default: %(default)s)")
- args = parser.parse_args()
- return args
-
-
-def print_arguments(args):
- print('----------- Configuration Arguments -----------')
- for arg, value in sorted(vars(args).items()):
- print('%s: %s' % (arg, value))
- print('------------------------------------------------')
-
-
-def train(args):
- """train in loop.
- """
-
- # paths check
- if args.init_model_path is not None and \
- not os.path.exists(args.init_model_path):
- raise IOError("Invalid initial model path!")
- if args.checkpoints != '' and not os.path.exists(args.checkpoints):
- os.mkdir(args.checkpoints)
- if args.infer_models != '' and not os.path.exists(args.infer_models):
- os.mkdir(args.infer_models)
-
- prediction, avg_cost, accuracy = stacked_lstmp_model(
- frame_dim=args.frame_dim,
- hidden_dim=args.hidden_dim,
- proj_dim=args.proj_dim,
- stacked_num=args.stacked_num,
- class_num=args.class_num,
- parallel=args.parallel)
-
- # program for test
- test_program = fluid.default_main_program().clone()
-
- #optimizer = fluid.optimizer.Momentum(learning_rate=args.learning_rate, momentum=0.9)
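- # Adam with a staircase exponential decay: the learning rate is
- # multiplied by 1/1.2 after every 1879 steps.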
- optimizer = fluid.optimizer.Adam(
- learning_rate=fluid.layers.exponential_decay(
- learning_rate=args.learning_rate,
- decay_steps=1879,
- decay_rate=1 / 1.2,
- staircase=True))
- optimizer.minimize(avg_cost)
-
- place = fluid.CPUPlace() if args.device == 'CPU' else fluid.CUDAPlace(0)
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
- # resume training if initial model provided.
- if args.init_model_path is not None:
- fluid.io.load_persistables(exe, args.init_model_path)
-
- ltrans = [
- trans_add_delta.TransAddDelta(2, 2),
- trans_mean_variance_norm.TransMeanVarianceNorm(args.mean_var),
- trans_splice.TransSplice(5, 5), trans_delay.TransDelay(5)
- ]
-
- feature_t = fluid.LoDTensor()
- label_t = fluid.LoDTensor()
-
- # validation
- def test(exe):
- # If test data not found, return invalid cost and accuracy
- if not (os.path.exists(args.val_feature_lst) and
- os.path.exists(args.val_label_lst)):
- return -1.0, -1.0
- # test data reader
- test_data_reader = reader.AsyncDataReader(
- args.val_feature_lst,
- args.val_label_lst,
- -1,
- split_sentence_threshold=1024)
- test_data_reader.set_transformers(ltrans)
- test_costs, test_accs = [], []
- for batch_id, batch_data in enumerate(
- test_data_reader.batch_iterator(args.batch_size,
- args.minimum_batch_size)):
- # load_data
- (features, labels, lod, _) = batch_data
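- # Rearrange the spliced frames to match the model's 'feature' input
- # shape [-1, 3, 11, frame_dim].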
- features = np.reshape(features, (-1, 11, 3, args.frame_dim))
- features = np.transpose(features, (0, 2, 1, 3))
- feature_t.set(features, place)
- feature_t.set_lod([lod])
- label_t.set(labels, place)
- label_t.set_lod([lod])
-
- cost, acc = exe.run(test_program,
- feed={"feature": feature_t,
- "label": label_t},
- fetch_list=[avg_cost, accuracy],
- return_numpy=False)
- test_costs.append(lodtensor_to_ndarray(cost)[0])
- test_accs.append(lodtensor_to_ndarray(acc)[0])
- return np.mean(test_costs), np.mean(test_accs)
-
- # train data reader
- train_data_reader = reader.AsyncDataReader(
- args.train_feature_lst,
- args.train_label_lst,
- -1,
- split_sentence_threshold=1024)
-
- train_data_reader.set_transformers(ltrans)
- # train
- for pass_id in range(args.pass_num):
- pass_start_time = time.time()
- for batch_id, batch_data in enumerate(
- train_data_reader.batch_iterator(args.batch_size,
- args.minimum_batch_size)):
- # load_data
- (features, labels, lod, name_lst) = batch_data
- features = np.reshape(features, (-1, 11, 3, args.frame_dim))
- features = np.transpose(features, (0, 2, 1, 3))
- feature_t.set(features, place)
- feature_t.set_lod([lod])
- label_t.set(labels, place)
- label_t.set_lod([lod])
-
- to_print = batch_id > 0 and (batch_id % args.print_per_batches == 0)
- outs = exe.run(fluid.default_main_program(),
- feed={"feature": feature_t,
- "label": label_t},
- fetch_list=[avg_cost, accuracy] if to_print else [],
- return_numpy=False)
-
- if to_print:
- print("\nBatch %d, train cost: %f, train acc: %f" %
- (batch_id, lodtensor_to_ndarray(outs[0])[0],
- lodtensor_to_ndarray(outs[1])[0]))
- # save the latest checkpoint
- if args.checkpoints != '':
- model_path = os.path.join(args.checkpoints,
- "deep_asr.latest.checkpoint")
- fluid.io.save_persistables(exe, model_path)
- else:
- sys.stdout.write('.')
- sys.stdout.flush()
- # run test
- val_cost, val_acc = test(exe)
-
- # save checkpoint per pass
- if args.checkpoints != '':
- model_path = os.path.join(
- args.checkpoints,
- "deep_asr.pass_" + str(pass_id) + ".checkpoint")
- fluid.io.save_persistables(exe, model_path)
- # save inference model
- if args.infer_models != '':
- model_path = os.path.join(
- args.infer_models,
- "deep_asr.pass_" + str(pass_id) + ".infer.model")
- fluid.io.save_inference_model(model_path, ["feature"],
- [prediction], exe)
- # cal pass time
- pass_end_time = time.time()
- time_consumed = pass_end_time - pass_start_time
- # print info at pass end
- print("\nPass %d, time consumed: %f s, val cost: %f, val acc: %f\n" %
- (pass_id, time_consumed, val_cost, val_acc))
-
-
-if __name__ == '__main__':
- args = parse_args()
- print_arguments(args)
-
- train(args)
diff --git a/fluid/DeepQNetwork/DQN_agent.py b/fluid/DeepQNetwork/DQN_agent.py
deleted file mode 100644
index 67eb3ce6a29bb723b481d6b1c2f517f037d52942..0000000000000000000000000000000000000000
--- a/fluid/DeepQNetwork/DQN_agent.py
+++ /dev/null
@@ -1,188 +0,0 @@
-#-*- coding: utf-8 -*-
-
-import paddle.fluid as fluid
-from paddle.fluid.param_attr import ParamAttr
-import numpy as np
-import math
-from tqdm import tqdm
-from utils import fluid_flatten
-
-
-class DQNModel(object):
- def __init__(self, state_dim, action_dim, gamma, hist_len, use_cuda=False):
- self.img_height = state_dim[0]
- self.img_width = state_dim[1]
- self.action_dim = action_dim
- self.gamma = gamma
- self.exploration = 1.1
- self.update_target_steps = 10000 // 4
- self.hist_len = hist_len
- self.use_cuda = use_cuda
-
- self.global_step = 0
- self._build_net()
-
- def _get_inputs(self):
- return fluid.layers.data(
- name='state',
- shape=[self.hist_len, self.img_height, self.img_width],
- dtype='float32'), \
- fluid.layers.data(
- name='action', shape=[1], dtype='int32'), \
- fluid.layers.data(
- name='reward', shape=[], dtype='float32'), \
- fluid.layers.data(
- name='next_s',
- shape=[self.hist_len, self.img_height, self.img_width],
- dtype='float32'), \
- fluid.layers.data(
- name='isOver', shape=[], dtype='bool')
-
- def _build_net(self):
- state, action, reward, next_s, isOver = self._get_inputs()
- self.pred_value = self.get_DQN_prediction(state)
- self.predict_program = fluid.default_main_program().clone()
-
- reward = fluid.layers.clip(reward, min=-1.0, max=1.0)
-
- action_onehot = fluid.layers.one_hot(action, self.action_dim)
- action_onehot = fluid.layers.cast(action_onehot, dtype='float32')
-
- pred_action_value = fluid.layers.reduce_sum(
- fluid.layers.elementwise_mul(action_onehot, self.pred_value), dim=1)
-
- targetQ_predict_value = self.get_DQN_prediction(next_s, target=True)
- best_v = fluid.layers.reduce_max(targetQ_predict_value, dim=1)
- best_v.stop_gradient = True
-
- target = reward + (1.0 - fluid.layers.cast(
- isOver, dtype='float32')) * self.gamma * best_v
- cost = fluid.layers.square_error_cost(pred_action_value, target)
- cost = fluid.layers.reduce_mean(cost)
-
- self._sync_program = self._build_sync_target_network()
-
- optimizer = fluid.optimizer.Adam(1e-3 * 0.5, epsilon=1e-3)
- optimizer.minimize(cost)
-
- # define program
- self.train_program = fluid.default_main_program()
-
- # fluid exe
- place = fluid.CUDAPlace(0) if self.use_cuda else fluid.CPUPlace()
- self.exe = fluid.Executor(place)
- self.exe.run(fluid.default_startup_program())
-
- def get_DQN_prediction(self, image, target=False):
- image = image / 255.0
-
- variable_field = 'target' if target else 'policy'
-
- conv1 = fluid.layers.conv2d(
- input=image,
- num_filters=32,
- filter_size=[5, 5],
- stride=[1, 1],
- padding=[2, 2],
- act='relu',
- param_attr=ParamAttr(name='{}_conv1'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv1_b'.format(variable_field)))
- max_pool1 = fluid.layers.pool2d(
- input=conv1, pool_size=[2, 2], pool_stride=[2, 2], pool_type='max')
-
- conv2 = fluid.layers.conv2d(
- input=max_pool1,
- num_filters=32,
- filter_size=[5, 5],
- stride=[1, 1],
- padding=[2, 2],
- act='relu',
- param_attr=ParamAttr(name='{}_conv2'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv2_b'.format(variable_field)))
- max_pool2 = fluid.layers.pool2d(
- input=conv2, pool_size=[2, 2], pool_stride=[2, 2], pool_type='max')
-
- conv3 = fluid.layers.conv2d(
- input=max_pool2,
- num_filters=64,
- filter_size=[4, 4],
- stride=[1, 1],
- padding=[1, 1],
- act='relu',
- param_attr=ParamAttr(name='{}_conv3'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv3_b'.format(variable_field)))
- max_pool3 = fluid.layers.pool2d(
- input=conv3, pool_size=[2, 2], pool_stride=[2, 2], pool_type='max')
-
- conv4 = fluid.layers.conv2d(
- input=max_pool3,
- num_filters=64,
- filter_size=[3, 3],
- stride=[1, 1],
- padding=[1, 1],
- act='relu',
- param_attr=ParamAttr(name='{}_conv4'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv4_b'.format(variable_field)))
-
- flatten = fluid_flatten(conv4)
-
- out = fluid.layers.fc(
- input=flatten,
- size=self.action_dim,
- param_attr=ParamAttr(name='{}_fc1'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_fc1_b'.format(variable_field)))
- return out
-
- def _build_sync_target_network(self):
- vars = list(fluid.default_main_program().list_vars())
- policy_vars = list(filter(
- lambda x: 'GRAD' not in x.name and 'policy' in x.name, vars))
- target_vars = list(filter(
- lambda x: 'GRAD' not in x.name and 'target' in x.name, vars))
- policy_vars.sort(key=lambda x: x.name)
- target_vars.sort(key=lambda x: x.name)
-
- sync_program = fluid.default_main_program().clone()
- with fluid.program_guard(sync_program):
- sync_ops = []
- for i, var in enumerate(policy_vars):
- sync_op = fluid.layers.assign(policy_vars[i], target_vars[i])
- sync_ops.append(sync_op)
- sync_program = sync_program.prune(sync_ops)
- return sync_program
-
- def act(self, state, train_or_test):
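- # Epsilon-greedy action selection: during training, explore with
- # probability self.exploration (annealed from 1.1 towards 0.1);
- # otherwise act greedily on the predicted Q-values, with a residual
- # 1% random action.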
- sample = np.random.random()
- if train_or_test == 'train' and sample < self.exploration:
- act = np.random.randint(self.action_dim)
- else:
- if np.random.random() < 0.01:
- act = np.random.randint(self.action_dim)
- else:
- state = np.expand_dims(state, axis=0)
- pred_Q = self.exe.run(self.predict_program,
- feed={'state': state.astype('float32')},
- fetch_list=[self.pred_value])[0]
- pred_Q = np.squeeze(pred_Q, axis=0)
- act = np.argmax(pred_Q)
- if train_or_test == 'train':
- self.exploration = max(0.1, self.exploration - 1e-6)
- return act
-
- def train(self, state, action, reward, next_state, isOver):
- if self.global_step % self.update_target_steps == 0:
- self.sync_target_network()
- self.global_step += 1
-
- action = np.expand_dims(action, -1)
- self.exe.run(self.train_program,
- feed={
- 'state': state.astype('float32'),
- 'action': action.astype('int32'),
- 'reward': reward,
- 'next_s': next_state.astype('float32'),
- 'isOver': isOver
- })
-
- def sync_target_network(self):
- self.exe.run(self._sync_program)
diff --git a/fluid/DeepQNetwork/DoubleDQN_agent.py b/fluid/DeepQNetwork/DoubleDQN_agent.py
deleted file mode 100644
index 09b4b2119bab3fbdfa9bb9cfb8fae40fa34f87e1..0000000000000000000000000000000000000000
--- a/fluid/DeepQNetwork/DoubleDQN_agent.py
+++ /dev/null
@@ -1,195 +0,0 @@
-#-*- coding: utf-8 -*-
-
-import paddle.fluid as fluid
-from paddle.fluid.param_attr import ParamAttr
-import numpy as np
-from tqdm import tqdm
-import math
-from utils import fluid_argmax, fluid_flatten
-
-
-class DoubleDQNModel(object):
- def __init__(self, state_dim, action_dim, gamma, hist_len, use_cuda=False):
- self.img_height = state_dim[0]
- self.img_width = state_dim[1]
- self.action_dim = action_dim
- self.gamma = gamma
- self.exploration = 1.1
- self.update_target_steps = 10000 // 4
- self.hist_len = hist_len
- self.use_cuda = use_cuda
-
- self.global_step = 0
- self._build_net()
-
- def _get_inputs(self):
- return fluid.layers.data(
- name='state',
- shape=[self.hist_len, self.img_height, self.img_width],
- dtype='float32'), \
- fluid.layers.data(
- name='action', shape=[1], dtype='int32'), \
- fluid.layers.data(
- name='reward', shape=[], dtype='float32'), \
- fluid.layers.data(
- name='next_s',
- shape=[self.hist_len, self.img_height, self.img_width],
- dtype='float32'), \
- fluid.layers.data(
- name='isOver', shape=[], dtype='bool')
-
- def _build_net(self):
- state, action, reward, next_s, isOver = self._get_inputs()
- self.pred_value = self.get_DQN_prediction(state)
- self.predict_program = fluid.default_main_program().clone()
-
- reward = fluid.layers.clip(reward, min=-1.0, max=1.0)
-
- action_onehot = fluid.layers.one_hot(action, self.action_dim)
- action_onehot = fluid.layers.cast(action_onehot, dtype='float32')
-
- pred_action_value = fluid.layers.reduce_sum(
- fluid.layers.elementwise_mul(action_onehot, self.pred_value), dim=1)
-
- targetQ_predict_value = self.get_DQN_prediction(next_s, target=True)
-
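- # Double DQN: the online ('policy') network selects the greedy action
- # for s', while the target network evaluates that action's value,
- # reducing the overestimation bias of vanilla DQN.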
- next_s_predict_value = self.get_DQN_prediction(next_s)
- greedy_action = fluid_argmax(next_s_predict_value)
-
- predict_onehot = fluid.layers.one_hot(greedy_action, self.action_dim)
- best_v = fluid.layers.reduce_sum(
- fluid.layers.elementwise_mul(predict_onehot, targetQ_predict_value),
- dim=1)
- best_v.stop_gradient = True
-
- target = reward + (1.0 - fluid.layers.cast(
- isOver, dtype='float32')) * self.gamma * best_v
- cost = fluid.layers.square_error_cost(pred_action_value, target)
- cost = fluid.layers.reduce_mean(cost)
-
- self._sync_program = self._build_sync_target_network()
-
- optimizer = fluid.optimizer.Adam(1e-3 * 0.5, epsilon=1e-3)
- optimizer.minimize(cost)
-
- # define program
- self.train_program = fluid.default_main_program()
-
- # fluid exe
- place = fluid.CUDAPlace(0) if self.use_cuda else fluid.CPUPlace()
- self.exe = fluid.Executor(place)
- self.exe.run(fluid.default_startup_program())
-
- def get_DQN_prediction(self, image, target=False):
- image = image / 255.0
-
- variable_field = 'target' if target else 'policy'
-
- conv1 = fluid.layers.conv2d(
- input=image,
- num_filters=32,
- filter_size=[5, 5],
- stride=[1, 1],
- padding=[2, 2],
- act='relu',
- param_attr=ParamAttr(name='{}_conv1'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv1_b'.format(variable_field)))
- max_pool1 = fluid.layers.pool2d(
- input=conv1, pool_size=[2, 2], pool_stride=[2, 2], pool_type='max')
-
- conv2 = fluid.layers.conv2d(
- input=max_pool1,
- num_filters=32,
- filter_size=[5, 5],
- stride=[1, 1],
- padding=[2, 2],
- act='relu',
- param_attr=ParamAttr(name='{}_conv2'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv2_b'.format(variable_field)))
- max_pool2 = fluid.layers.pool2d(
- input=conv2, pool_size=[2, 2], pool_stride=[2, 2], pool_type='max')
-
- conv3 = fluid.layers.conv2d(
- input=max_pool2,
- num_filters=64,
- filter_size=[4, 4],
- stride=[1, 1],
- padding=[1, 1],
- act='relu',
- param_attr=ParamAttr(name='{}_conv3'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv3_b'.format(variable_field)))
- max_pool3 = fluid.layers.pool2d(
- input=conv3, pool_size=[2, 2], pool_stride=[2, 2], pool_type='max')
-
- conv4 = fluid.layers.conv2d(
- input=max_pool3,
- num_filters=64,
- filter_size=[3, 3],
- stride=[1, 1],
- padding=[1, 1],
- act='relu',
- param_attr=ParamAttr(name='{}_conv4'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv4_b'.format(variable_field)))
-
- flatten = fluid_flatten(conv4)
-
- out = fluid.layers.fc(
- input=flatten,
- size=self.action_dim,
- param_attr=ParamAttr(name='{}_fc1'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_fc1_b'.format(variable_field)))
- return out
-
- def _build_sync_target_network(self):
- vars = list(fluid.default_main_program().list_vars())
- policy_vars = list(filter(
- lambda x: 'GRAD' not in x.name and 'policy' in x.name, vars))
- target_vars = list(filter(
- lambda x: 'GRAD' not in x.name and 'target' in x.name, vars))
- policy_vars.sort(key=lambda x: x.name)
- target_vars.sort(key=lambda x: x.name)
-
- sync_program = fluid.default_main_program().clone()
- with fluid.program_guard(sync_program):
- sync_ops = []
- for i, var in enumerate(policy_vars):
- sync_op = fluid.layers.assign(policy_vars[i], target_vars[i])
- sync_ops.append(sync_op)
- sync_program = sync_program.prune(sync_ops)
- return sync_program
-
- def act(self, state, train_or_test):
- sample = np.random.random()
- if train_or_test == 'train' and sample < self.exploration:
- act = np.random.randint(self.action_dim)
- else:
- if np.random.random() < 0.01:
- act = np.random.randint(self.action_dim)
- else:
- state = np.expand_dims(state, axis=0)
- pred_Q = self.exe.run(self.predict_program,
- feed={'state': state.astype('float32')},
- fetch_list=[self.pred_value])[0]
- pred_Q = np.squeeze(pred_Q, axis=0)
- act = np.argmax(pred_Q)
- if train_or_test == 'train':
- self.exploration = max(0.1, self.exploration - 1e-6)
- return act
-
- def train(self, state, action, reward, next_state, isOver):
- if self.global_step % self.update_target_steps == 0:
- self.sync_target_network()
- self.global_step += 1
-
- action = np.expand_dims(action, -1)
- self.exe.run(self.train_program,
- feed={
- 'state': state.astype('float32'),
- 'action': action.astype('int32'),
- 'reward': reward,
- 'next_s': next_state.astype('float32'),
- 'isOver': isOver
- })
-
- def sync_target_network(self):
- self.exe.run(self._sync_program)
diff --git a/fluid/DeepQNetwork/DuelingDQN_agent.py b/fluid/DeepQNetwork/DuelingDQN_agent.py
deleted file mode 100644
index 271a767b7b5841cf1abe213fc477859e3cf5dd05..0000000000000000000000000000000000000000
--- a/fluid/DeepQNetwork/DuelingDQN_agent.py
+++ /dev/null
@@ -1,197 +0,0 @@
-#-*- coding: utf-8 -*-
-
-import paddle.fluid as fluid
-from paddle.fluid.param_attr import ParamAttr
-import numpy as np
-from tqdm import tqdm
-import math
-from utils import fluid_flatten
-
-
-class DuelingDQNModel(object):
- def __init__(self, state_dim, action_dim, gamma, hist_len, use_cuda=False):
- self.img_height = state_dim[0]
- self.img_width = state_dim[1]
- self.action_dim = action_dim
- self.gamma = gamma
- self.exploration = 1.1
- self.update_target_steps = 10000 // 4
- self.hist_len = hist_len
- self.use_cuda = use_cuda
-
- self.global_step = 0
- self._build_net()
-
- def _get_inputs(self):
- return fluid.layers.data(
- name='state',
- shape=[self.hist_len, self.img_height, self.img_width],
- dtype='float32'), \
- fluid.layers.data(
- name='action', shape=[1], dtype='int32'), \
- fluid.layers.data(
- name='reward', shape=[], dtype='float32'), \
- fluid.layers.data(
- name='next_s',
- shape=[self.hist_len, self.img_height, self.img_width],
- dtype='float32'), \
- fluid.layers.data(
- name='isOver', shape=[], dtype='bool')
-
- def _build_net(self):
- state, action, reward, next_s, isOver = self._get_inputs()
- self.pred_value = self.get_DQN_prediction(state)
- self.predict_program = fluid.default_main_program().clone()
-
- reward = fluid.layers.clip(reward, min=-1.0, max=1.0)
-
- action_onehot = fluid.layers.one_hot(action, self.action_dim)
- action_onehot = fluid.layers.cast(action_onehot, dtype='float32')
-
- pred_action_value = fluid.layers.reduce_sum(
- fluid.layers.elementwise_mul(action_onehot, self.pred_value), dim=1)
-
- targetQ_predict_value = self.get_DQN_prediction(next_s, target=True)
- best_v = fluid.layers.reduce_max(targetQ_predict_value, dim=1)
- best_v.stop_gradient = True
-
- target = reward + (1.0 - fluid.layers.cast(
- isOver, dtype='float32')) * self.gamma * best_v
- cost = fluid.layers.square_error_cost(pred_action_value, target)
- cost = fluid.layers.reduce_mean(cost)
-
- self._sync_program = self._build_sync_target_network()
-
- optimizer = fluid.optimizer.Adam(1e-3 * 0.5, epsilon=1e-3)
- optimizer.minimize(cost)
-
- # define program
- self.train_program = fluid.default_main_program()
-
- # fluid exe
- place = fluid.CUDAPlace(0) if self.use_cuda else fluid.CPUPlace()
- self.exe = fluid.Executor(place)
- self.exe.run(fluid.default_startup_program())
-
- def get_DQN_prediction(self, image, target=False):
- image = image / 255.0
-
- variable_field = 'target' if target else 'policy'
-
- conv1 = fluid.layers.conv2d(
- input=image,
- num_filters=32,
- filter_size=[5, 5],
- stride=[1, 1],
- padding=[2, 2],
- act='relu',
- param_attr=ParamAttr(name='{}_conv1'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv1_b'.format(variable_field)))
- max_pool1 = fluid.layers.pool2d(
- input=conv1, pool_size=[2, 2], pool_stride=[2, 2], pool_type='max')
-
- conv2 = fluid.layers.conv2d(
- input=max_pool1,
- num_filters=32,
- filter_size=[5, 5],
- stride=[1, 1],
- padding=[2, 2],
- act='relu',
- param_attr=ParamAttr(name='{}_conv2'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv2_b'.format(variable_field)))
- max_pool2 = fluid.layers.pool2d(
- input=conv2, pool_size=[2, 2], pool_stride=[2, 2], pool_type='max')
-
- conv3 = fluid.layers.conv2d(
- input=max_pool2,
- num_filters=64,
- filter_size=[4, 4],
- stride=[1, 1],
- padding=[1, 1],
- act='relu',
- param_attr=ParamAttr(name='{}_conv3'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv3_b'.format(variable_field)))
- max_pool3 = fluid.layers.pool2d(
- input=conv3, pool_size=[2, 2], pool_stride=[2, 2], pool_type='max')
-
- conv4 = fluid.layers.conv2d(
- input=max_pool3,
- num_filters=64,
- filter_size=[3, 3],
- stride=[1, 1],
- padding=[1, 1],
- act='relu',
- param_attr=ParamAttr(name='{}_conv4'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv4_b'.format(variable_field)))
-
- flatten = fluid_flatten(conv4)
-
- value = fluid.layers.fc(
- input=flatten,
- size=1,
- param_attr=ParamAttr(name='{}_value_fc'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_value_fc_b'.format(variable_field)))
-
- advantage = fluid.layers.fc(
- input=flatten,
- size=self.action_dim,
- param_attr=ParamAttr(name='{}_advantage_fc'.format(variable_field)),
- bias_attr=ParamAttr(
- name='{}_advantage_fc_b'.format(variable_field)))
-
- Q = advantage + (value - fluid.layers.reduce_mean(
- advantage, dim=1, keep_dim=True))
- return Q
-
- def _build_sync_target_network(self):
- vars = list(fluid.default_main_program().list_vars())
- policy_vars = list(filter(
- lambda x: 'GRAD' not in x.name and 'policy' in x.name, vars))
- target_vars = list(filter(
- lambda x: 'GRAD' not in x.name and 'target' in x.name, vars))
- policy_vars.sort(key=lambda x: x.name)
- target_vars.sort(key=lambda x: x.name)
-
- sync_program = fluid.default_main_program().clone()
- with fluid.program_guard(sync_program):
- sync_ops = []
- for i, var in enumerate(policy_vars):
- sync_op = fluid.layers.assign(policy_vars[i], target_vars[i])
- sync_ops.append(sync_op)
- # The prune API is deprecated, please don't use it any more.
- sync_program = sync_program._prune(sync_ops)
- return sync_program
-
- def act(self, state, train_or_test):
- sample = np.random.random()
- if train_or_test == 'train' and sample < self.exploration:
- act = np.random.randint(self.action_dim)
- else:
- if np.random.random() < 0.01:
- act = np.random.randint(self.action_dim)
- else:
- state = np.expand_dims(state, axis=0)
- pred_Q = self.exe.run(self.predict_program,
- feed={'state': state.astype('float32')},
- fetch_list=[self.pred_value])[0]
- pred_Q = np.squeeze(pred_Q, axis=0)
- act = np.argmax(pred_Q)
- if train_or_test == 'train':
- self.exploration = max(0.1, self.exploration - 1e-6)
- return act
-
- def train(self, state, action, reward, next_state, isOver):
- if self.global_step % self.update_target_steps == 0:
- self.sync_target_network()
- self.global_step += 1
-
- action = np.expand_dims(action, -1)
- self.exe.run(self.train_program, \
- feed={'state': state.astype('float32'), \
- 'action': action.astype('int32'), \
- 'reward': reward, \
- 'next_s': next_state.astype('float32'), \
- 'isOver': isOver})
-
- def sync_target_network(self):
- self.exe.run(self._sync_program)
diff --git a/fluid/DeepQNetwork/README.md b/fluid/DeepQNetwork/README.md
index e72920bcad29ce7ffd78bfb90a1406654298248d..f82d57f12cc4e97dae99d5a711ee495a9895aa91 100644
--- a/fluid/DeepQNetwork/README.md
+++ b/fluid/DeepQNetwork/README.md
@@ -1,67 +1,6 @@
-[中文版](README_cn.md)
-## Reproduce DQN, DoubleDQN, DuelingDQN model with Fluid version of PaddlePaddle
-Based on PaddlePaddle's next-generation API Fluid, this directory reproduces the DQN family of deep reinforcement learning models and matches the metrics reported in the papers on classic Atari games. The model receives the game image as input and predicts the next action end to end. The repository contains the following three models:
-+ DQN in
-[Human-level Control Through Deep Reinforcement Learning](http://www.nature.com/nature/journal/v518/n7540/full/nature14236.html)
-+ DoubleDQN in:
-[Deep Reinforcement Learning with Double Q-Learning](https://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/viewPaper/12389)
-+ DuelingDQN in:
-[Dueling Network Architectures for Deep Reinforcement Learning](http://proceedings.mlr.press/v48/wangf16.html)
+Hi!
-## Atari benchmark & performance
+This directory has been deprecated.
-### Atari games introduction
-
-Please see [here](https://gym.openai.com/envs/#atari) to learn more about Atari games.
-
-### Pong game result
-
-The average game rewards that the three models obtain as training progresses are shown below (about 3 hours per 1 million steps):
-
-
-## How to use
-### Dependencies:
-+ python2.7
-+ gym
-+ tqdm
-+ opencv-python
-+ paddlepaddle-gpu>=0.12.0
-+ ale_python_interface
-
-### Install Dependencies:
-+ Install PaddlePaddle:
- It is recommended to compile and install PaddlePaddle from source.
-+ Install other dependencies:
- ```
- pip install -r requirement.txt
- pip install gym[atari]
- ```
- Install ale_python_interface, please see [here](https://github.com/mgbellemare/Arcade-Learning-Environment).
-
-### Start Training:
-```
-# To train a model for Pong game with gpu (use DQN model as default)
-python train.py --rom ./rom_files/pong.bin --use_cuda
-
-# To train a model for Pong with DoubleDQN
-python train.py --rom ./rom_files/pong.bin --use_cuda --alg DoubleDQN
-
-# To train a model for Pong with DuelingDQN
-python train.py --rom ./rom_files/pong.bin --use_cuda --alg DuelingDQN
-```
-
-To train on more games, you can download more ROM files from [here](https://github.com/openai/atari-py/tree/master/atari_py/atari_roms).
-
-### Start Testing:
-```
-# Play the game with saved best model and calculate the average rewards
-python play.py --rom ./rom_files/pong.bin --use_cuda --model_path ./saved_model/DQN-pong
-
-# Play the game with visualization
-python play.py --rom ./rom_files/pong.bin --use_cuda --model_path ./saved_model/DQN-pong --viz 0.01
-```
-[Here](https://pan.baidu.com/s/1gIsbNw5V7tMeb74ojx-TMA) are saved models for the Pong and Breakout games; you can use them to play directly.
+Please visit the project at [PaddleRL/DeepQNetwork](../../PaddleRL/DeepQNetwork).
diff --git a/fluid/DeepQNetwork/README_cn.md b/fluid/DeepQNetwork/README_cn.md
index 68a65bffe8fab79ce563fefc894dd035c1572065..b90f215b2d8e0734db5a41b00ab02260021c8cf6 100644
--- a/fluid/DeepQNetwork/README_cn.md
+++ b/fluid/DeepQNetwork/README_cn.md
@@ -1,71 +1,2 @@
-## Reproducing DQN, DoubleDQN and DuelingDQN with the Fluid version of PaddlePaddle
-Based on PaddlePaddle's next-generation API Fluid, this directory reproduces the DQN model family from deep reinforcement learning and matches the metrics reported in the papers on classic Atari games. The model receives the game image as input and directly predicts the next control signal end to end. The repository contains the following three models:
-+ DQN:
-[Human-level Control Through Deep Reinforcement Learning](http://www.nature.com/nature/journal/v518/n7540/full/nature14236.html)
-+ DoubleDQN:
-[Deep Reinforcement Learning with Double Q-Learning](https://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/viewPaper/12389)
-+ DuelingDQN:
-[Dueling Network Architectures for Deep Reinforcement Learning](http://proceedings.mlr.press/v48/wangf16.html)
-
-## Results: performance on Atari games
-
-### About Atari games
-
-See [here](https://gym.openai.com/envs/#atari) to learn more about Atari games.
-
-### Training results on Pong
-The average game rewards that the three models obtain as training progresses are shown below (about 3 hours per 1 million steps):
-
-
-## Usage
-
-### Dependencies:
-+ python2.7
-+ gym
-+ tqdm
-+ opencv-python
-+ paddlepaddle-gpu>=0.12.0
-+ ale_python_interface
-
-### Installing dependencies:
-
-+ Install PaddlePaddle:
- It is recommended to compile and install PaddlePaddle from source.
-+ Install the other dependencies:
- ```
- pip install -r requirement.txt
- pip install gym[atari]
- ```
- See [here](https://github.com/mgbellemare/Arcade-Learning-Environment) for installing ale_python_interface.
-
-### Training:
-
-```
-# Train on the Pong game with GPU (the DQN model is used by default)
-python train.py --rom ./rom_files/pong.bin --use_cuda
-
-# Train the DoubleDQN model
-python train.py --rom ./rom_files/pong.bin --use_cuda --alg DoubleDQN
-
-# Train the DuelingDQN model
-python train.py --rom ./rom_files/pong.bin --use_cuda --alg DuelingDQN
-```
-
-To train on more games, download game ROMs from [here](https://github.com/openai/atari-py/tree/master/atari_py/atari_roms).
-
-### Testing:
-
-```
-# Play the game with the best model saved during training and compute the average rewards
-python play.py --rom ./rom_files/pong.bin --use_cuda --model_path ./saved_model/DQN-pong
-
-# Play the game with visualization
-python play.py --rom ./rom_files/pong.bin --use_cuda --model_path ./saved_model/DQN-pong --viz 0.01
-```
-
-Trained models for the Pong and Breakout games are available [here](https://pan.baidu.com/s/1gIsbNw5V7tMeb74ojx-TMA) and can be used directly for testing.
+Hi! This project has been moved. Please visit it at [PaddleRL/DeepQNetwork](../../PaddleRL/DeepQNetwork).
diff --git a/fluid/DeepQNetwork/utils.py b/fluid/DeepQNetwork/utils.py
deleted file mode 100644
index 26ed7fbdb54494c3cf9a983f8ecafdfbcd4d2719..0000000000000000000000000000000000000000
--- a/fluid/DeepQNetwork/utils.py
+++ /dev/null
@@ -1,20 +0,0 @@
-#-*- coding: utf-8 -*-
-#File: utils.py
-
-import paddle.fluid as fluid
-import numpy as np
-
-
-def fluid_argmax(x):
- """
- Get index of max value for the last dimension
- """
- _, max_index = fluid.layers.topk(x, k=1)
- return max_index
-
-
-def fluid_flatten(x):
- """
- Flatten a fluid variable while keeping the first (batch) dimension
- """
- return fluid.layers.reshape(x, shape=[-1, np.prod(x.shape[1:])])
diff --git a/fluid/PaddleCV/HiNAS_models/README.md b/fluid/PaddleCV/HiNAS_models/README.md
old mode 100755
new mode 100644
index 9c67736aa30643baf72ce42ed2ca3321d4e22165..1e33fea89e2d4e3a9b9ef2cad81012d082ccc504
--- a/fluid/PaddleCV/HiNAS_models/README.md
+++ b/fluid/PaddleCV/HiNAS_models/README.md
@@ -1,76 +1,6 @@
-# Image Classification Models
-This directory contains six image classification models, which are models automatically discovered by Baidu Big Data Lab (BDL) Hierarchical Neural Architecture Search project (HiNAS), achieving 96.1% accuracy on CIFAR-10 dataset. These models are divided into two categories. The first three have no skip link, named HiNAS 0-2, and the last three networks contain skip links, which are similar to the shortcut connections in Resnet, named HiNAS 3-5.
----
-## Table of Contents
-- [Installation](#installation)
-- [Data preparation](#data-preparation)
-- [Training a model](#training-a-model)
-- [Model performances](#model-performances)
+Hi!
-## Installation
-Running the trainer in current directory requires:
+This directory has been deprecated.
-- PaddlePaddle Fluid >= v0.15.0
-- CuDNN >=6.0
-
-If PaddlePaddle and CuDNN in your runtime environment do not meet the requirements, please follow the instructions in [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) and make an update.
-
-## Data preparation
-
-When you run the sample code for the first time, the trainer will automatically download the cifar-10 dataset. Please make sure your environment has an internet connection.
-
-The dataset will be downloaded to `dataset/cifar/cifar-10-python.tar.gz` in the same directory as the Trainer. If automatic download fails, you can go to https://www.cs.toronto.edu/~kriz/cifar.html and download cifar-10-python.tar.gz to the location mentioned above.
-
-## Training a model
-
-After the environment is ready, you can train the model. There are two entrances: `train_hinas.py` and `train_hinas_res.py`. The former is used to train Model 0-2 (without skip link), and the latter is used to train Model 3-5 (contains skip link).
-
-Train Model 0~2 (without skip link):
-```
-python train_hinas.py --model=m_id # m_id can be 0, 1 or 2.
-```
-Train Model 3~5 (with skip link):
-```
-python train_hinas_res.py --model=m_id # m_id can be 0, 1 or 2.
-```
-
-In addition, both `train_hinas.py` and `train_hinas_res.py` support the following parameters:
-
-- **random_flip_left_right**: Randomly flip the image horizontally. (Default: True)
-- **random_flip_up_down**: Randomly flip the image vertically. (Default: False)
-- **cutout**: Apply cutout to the image. (Default: True)
-- **standardize_image**: Standardize the image. (Default: True)
-- **pad_and_cut_image**: Randomly pad the image and then crop it back to the original size. (Default: True)
-- **shuffle_image**: Shuffle the order of the input images during training. (Default: True)
-- **lr_max**: Learning rate at the beginning of training. (Default: 0.1)
-- **lr_min**: Learning rate at the end of training. (Default: 0.0001)
-- **batch_size**: Training batch size. (Default: 128)
-- **num_epochs**: Total number of training epochs. (Default: 200)
-- **weight_decay**: L2 regularization value. (Default: 0.0004)
-- **momentum**: The momentum parameter of the momentum optimizer. (Default: 0.9)
-- **dropout_rate**: Dropout rate of the dropout layer. (Default: 0.5)
-- **bn_decay**: The decay/momentum parameter (also called moving average decay) of the batch norm layers. (Default: 0.9)
-
-
-## Model performances
-
-All six models are trained with the same hyperparameters:
-
-- learning rate: 0.1 -> 0.0001 with cosine annealing
-- total epochs: 200
-- batch size: 128
-- L2 decay: 0.000400
-- optimizer: momentum optimizer with m=0.9 and Nesterov momentum
-- preprocess: random horizontal flip + image standardization + cutout
-
-And below is the accuracy on the CIFAR-10 dataset:
-
-| model | round 1 | round 2 | round 3 | max | avg |
-|----------|---------|---------|---------|--------|--------|
-| HiNAS-0 | 0.9548 | 0.9520 | 0.9513 | 0.9548 | 0.9527 |
-| HiNAS-1 | 0.9452 | 0.9462 | 0.9420 | 0.9462 | 0.9445 |
-| HiNAS-2 | 0.9508 | 0.9506 | 0.9483 | 0.9508 | 0.9499 |
-| HiNAS-3 | 0.9607 | 0.9623 | 0.9601 | 0.9623 | 0.9611 |
-| HiNAS-4 | 0.9611 | 0.9584 | 0.9586 | 0.9611 | 0.9594 |
-| HiNAS-5 | 0.9578 | 0.9588 | 0.9594 | 0.9594 | 0.9586 |
+Please visit the project at [AutoDL/HiNAS_models](../../../AutoDL/HiNAS_models).
diff --git a/fluid/PaddleCV/HiNAS_models/README_cn.md b/fluid/PaddleCV/HiNAS_models/README_cn.md
old mode 100755
new mode 100644
index 8ca3bcbfb8d1ea1a15f969c1a1db22ff2ec854f1..8ab7149b0aaef04c226aff0302e4282b0172c113
--- a/fluid/PaddleCV/HiNAS_models/README_cn.md
+++ b/fluid/PaddleCV/HiNAS_models/README_cn.md
@@ -1,78 +1,2 @@
-# Image Classification Models
-This directory contains six image classification models, all discovered automatically by the Baidu Big Data Lab Hierarchical Neural Architecture Search (HiNAS) project; they reach 96.1% accuracy on the CIFAR-10 dataset. The six models fall into two groups: the first three have no skip links and are named HiNAS 0-2, while the last three contain skip links, similar to the shortcut connections in ResNet, and are named HiNAS 3-5.
----
-## Table of Contents
-- [Installation](#installation)
-- [Data preparation](#data-preparation)
-- [Training a model](#training-a-model)
-- [Model performances](#model-performances)
-
-## Installation
-Minimum environment requirements:
-
-- PaddlePaddle Fluid >= v0.15.0
-- CuDNN >= 6.0
-
-If your runtime environment does not meet these requirements, you can upgrade PaddlePaddle by following the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html).
-
-## Data preparation
-
-When you train a model for the first time, the Trainer automatically downloads the CIFAR-10 dataset; please make sure your environment has an internet connection.
-
-The dataset is downloaded to `dataset/cifar/cifar-10-python.tar.gz` in the same directory as the Trainer. If the automatic download fails, you can download cifar-10-python.tar.gz yourself from https://www.cs.toronto.edu/~kriz/cifar.html and place it at the location above.
-
-
-## Training a model
-Once the environment is ready, you can train a model. There are two entry points: `train_hinas.py` and `train_hinas_res.py`. The former trains models 0-2 (without skip links), and the latter trains models 3-5 (with skip links).
-
-Train models 0~2 (without skip links):
-```
-python train_hinas.py --model=m_id # m_id can be 0, 1 or 2.
-```
-Train models 3~5 (with skip links):
-```
-python train_hinas_res.py --model=m_id # m_id can be 0, 1 or 2.
-```
-
-In addition, both `train_hinas.py` and `train_hinas_res.py` support the following parameters:
-
-Initialization:
-
-- random_flip_left_right: randomly flip the image horizontally (Default: True)
-- random_flip_up_down: randomly flip the image vertically (Default: False)
-- cutout: randomly occlude part of the image (cutout) (Default: True)
-- standardize_image: standardize each pixel of the image (Default: True)
-- pad_and_cut_image: randomly pad the image and crop it back to the original size (Default: True)
-- shuffle_image: shuffle the order of input images during training (Default: True)
-- lr_max: learning rate at the beginning of training (Default: 0.1)
-- lr_min: learning rate at the end of training (Default: 0.0001)
-- batch_size: training batch size (Default: 128)
-- num_epochs: total number of training epochs (Default: 200)
-- weight_decay: L2 regularization strength during training (Default: 0.0004)
-- momentum: momentum coefficient of the momentum optimizer (Default: 0.9)
-- dropout_rate: dropout rate of the dropout layer (Default: 0.5)
-- bn_decay: decay/momentum coefficient (i.e. moving average decay) of the batch norm layers (Default: 0.9)
-
-
-
-## Model performances
-All six models are trained with the same hyperparameters:
-
-- learning rate: 0.1 -> 0.0001 with cosine annealing
-- total epochs: 200
-- batch size: 128
-- L2 decay: 0.000400
-- optimizer: momentum optimizer with m=0.9 and Nesterov momentum
-- preprocess: random horizontal flip + image standardization + cutout
-
-The accuracy of the six models on the CIFAR-10 dataset is as follows:
-
-| model | round 1 | round 2 | round 3 | max | avg |
-|----------|---------|---------|---------|--------|--------|
-| HiNAS-0 | 0.9548 | 0.9520 | 0.9513 | 0.9548 | 0.9527 |
-| HiNAS-1 | 0.9452 | 0.9462 | 0.9420 | 0.9462 | 0.9445 |
-| HiNAS-2 | 0.9508 | 0.9506 | 0.9483 | 0.9508 | 0.9499 |
-| HiNAS-3 | 0.9607 | 0.9623 | 0.9601 | 0.9623 | 0.9611 |
-| HiNAS-4 | 0.9611 | 0.9584 | 0.9586 | 0.9611 | 0.9594 |
-| HiNAS-5 | 0.9578 | 0.9588 | 0.9594 | 0.9594 | 0.9586 |
+Hi! This project has been migrated. Please visit it at [AutoDL/HiNAS_models](../../../AutoDL/HiNAS_models).
diff --git a/fluid/PaddleCV/HiNAS_models/nn_paddle.py b/fluid/PaddleCV/HiNAS_models/nn_paddle.py
deleted file mode 100755
index d56bca5f156f47dccad07d32e7ad9d383d3dd459..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/HiNAS_models/nn_paddle.py
+++ /dev/null
@@ -1,138 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import math
-
-import numpy as np
-import paddle
-import paddle.fluid as fluid
-from paddle.fluid.layers.learning_rate_scheduler import _decay_step_counter
-import reader
-
-from absl import flags
-
-# import preprocess
-
-FLAGS = flags.FLAGS
-
-flags.DEFINE_float("lr_max", 0.1, "initial learning rate")
-flags.DEFINE_float("lr_min", 0.0001, "limiting learning rate")
-
-flags.DEFINE_integer("batch_size", 128, "batch size")
-flags.DEFINE_integer("num_epochs", 200, "total epochs to train")
-flags.DEFINE_float("weight_decay", 0.0004, "weight decay")
-
-flags.DEFINE_float("momentum", 0.9, "momentum")
-
-flags.DEFINE_boolean("shuffle_image", True, "shuffle input images on training")
-
-dataset_train_size = 50000
-
-
-class Model(object):
- def __init__(self, build_fn, tokens):
- print("learning rate: %f -> %f, cosine annealing" %
- (FLAGS.lr_max, FLAGS.lr_min))
- print("epoch: %d" % FLAGS.num_epochs)
- print("batch size: %d" % FLAGS.batch_size)
- print("L2 decay: %f" % FLAGS.weight_decay)
-
- self.max_step = dataset_train_size * FLAGS.num_epochs // FLAGS.batch_size
-
- self.build_fn = build_fn
- self.tokens = tokens
- print("Token is %s" % ",".join(map(str, tokens)))
-
- def cosine_annealing(self):
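-        # cosine annealing from lr_max down to lr_min over max_step steps:
-        # lr(step) = lr_min + (lr_max - lr_min) / 2 * (1 + cos(pi * step / max_step))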
- step = _decay_step_counter()
- lr = FLAGS.lr_min + (FLAGS.lr_max - FLAGS.lr_min) / 2 \
- * (1.0 + fluid.layers.ops.cos(step / self.max_step * math.pi))
- return lr
-
- def optimizer_program(self):
- return fluid.optimizer.Momentum(
- learning_rate=self.cosine_annealing(),
- momentum=FLAGS.momentum,
- use_nesterov=True,
- regularization=fluid.regularizer.L2DecayRegularizer(
- FLAGS.weight_decay))
-
- def inference_network(self):
- images = fluid.layers.data(
- name='pixel', shape=[3, 32, 32], dtype='float32')
- return self.build_fn(images, self.tokens)
-
- def train_network(self):
- predict = self.inference_network()
- label = fluid.layers.data(name='label', shape=[1], dtype='int64')
- cost = fluid.layers.cross_entropy(input=predict, label=label)
- avg_cost = fluid.layers.mean(cost)
- accuracy = fluid.layers.accuracy(input=predict, label=label)
- # self.parameters = fluid.parameters.create(avg_cost)
- return [avg_cost, accuracy]
-
- def run(self):
- train_files = reader.train10()
- test_files = reader.test10()
-
- if FLAGS.shuffle_image:
- train_reader = paddle.batch(
- paddle.reader.shuffle(train_files, dataset_train_size),
- batch_size=FLAGS.batch_size)
- else:
- train_reader = paddle.batch(
- train_files, batch_size=FLAGS.batch_size)
-
- test_reader = paddle.batch(test_files, batch_size=FLAGS.batch_size)
-
- costs = []
- accs = []
-
- def event_handler(event):
- if isinstance(event, fluid.EndStepEvent):
- costs.append(event.metrics[0])
- accs.append(event.metrics[1])
- if event.step % 20 == 0:
- print("Epoch %d, Step %d, Loss %f, Acc %f" % (
- event.epoch, event.step, np.mean(costs), np.mean(accs)))
- del costs[:]
- del accs[:]
-
- if isinstance(event, fluid.EndEpochEvent):
- if event.epoch % 3 == 0 or event.epoch == FLAGS.num_epochs - 1:
- avg_cost, accuracy = trainer.test(
- reader=test_reader, feed_order=['pixel', 'label'])
-
- event_handler.best_acc = max(event_handler.best_acc,
- accuracy)
- print("Test with epoch %d, Loss %f, Acc %f" %
- (event.epoch, avg_cost, accuracy))
- print("Best acc %f" % event_handler.best_acc)
-
- event_handler.best_acc = 0.0
- place = fluid.CUDAPlace(0)
- trainer = fluid.Trainer(
- train_func=self.train_network,
- optimizer_func=self.optimizer_program,
- place=place)
-
- trainer.train(
- reader=train_reader,
- num_epochs=FLAGS.num_epochs,
- event_handler=event_handler,
- feed_order=['pixel', 'label'])
diff --git a/fluid/PaddleCV/caffe2fluid/README.md b/fluid/PaddleCV/caffe2fluid/README.md
index 8520342325a1ef4e08d8f9669969acd5b6b57851..8241e980ec9c05cacb0121387780371029625537 100644
--- a/fluid/PaddleCV/caffe2fluid/README.md
+++ b/fluid/PaddleCV/caffe2fluid/README.md
@@ -1,87 +1,6 @@
-### Caffe2Fluid
-This tool converts a Caffe model into a Fluid model.
-### Key Features
-1. Convert a Caffe model to a Fluid model, together with the code that defines the network (useful for re-training)
+Hi!
-2. Pycaffe is not required if you only want to convert the model without running Caffe inference
+This directory has been deprecated.
-3. Conversion of Caffe's custom layers is also supported by extending this tool
-
-4. A set of tools in `examples/imagenet/tools` is provided to compare the differences
-
-### HowTo
-1. Prepare `caffepb.py` in `./proto` if your Python has no `pycaffe` module; two options are provided:
- - Generate pycaffe from caffe.proto
- ```
- bash ./proto/compile.sh
- ```
-
- - Download one from github directly
- ```
- cd proto/ && wget https://raw.githubusercontent.com/ethereon/caffe-tensorflow/master/kaffe/caffe/caffepb.py
- ```
-
-2. Convert the Caffe model to Fluid model
- - Generate fluid code and weight file
- ```
- python convert.py alexnet.prototxt \
- --caffemodel alexnet.caffemodel \
- --data-output-path alexnet.npy \
- --code-output-path alexnet.py
- ```
-
- - Save weights as fluid model file
- ```
- # only infer the last layer's result
- python alexnet.py alexnet.npy ./fluid
-    # infer the results of these 2 layers
- python alexnet.py alexnet.npy ./fluid fc8,prob
- ```
-
-3. Use the converted model to infer
- - See more details in `examples/imagenet/tools/run.sh`
-
-4. Compare the inference results with caffe
- - See more details in `examples/imagenet/tools/diff.sh`
-
-### How to convert a custom layer
-1. Implement your custom layer in a file under `kaffe/custom_layers`, e.g. `mylayer.py` (a minimal sketch follows this list)
- - Implement ```shape_func(input_shape, [other_caffe_params])``` to calculate the output shape
- - Implement ```layer_func(inputs, name, [other_caffe_params])``` to construct a fluid layer
- - Register these two functions ```register(kind='MyType', shape=shape_func, layer=layer_func)```
- - Notes: more examples can be found in `kaffe/custom_layers`
-
-2. Add ```import mylayer``` to `kaffe/custom_layers/__init__.py`
-
-3. Prepare your pycaffe as your customized version (same as the environment preparation above)
- - (option1) replace `proto/caffe.proto` with your own caffe.proto and compile it
- - (option2) change your `pycaffe` to the customized version
-
-4. Convert the Caffe model to Fluid model
-
-5. Set the environment variable $CAFFE2FLUID_CUSTOM_LAYERS to the parent directory of 'custom_layers'
- ```
- export CAFFE2FLUID_CUSTOM_LAYERS=/path/to/caffe2fluid/kaffe
- ```
-
-6. Use the converted model by loading `xxxnet.py` and `xxxnet.npy` (not needed if the model is already in `fluid/model` and `fluid/params`)
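-
-As a concrete illustration, here is a minimal sketch of such a `mylayer.py` (the layer kind `MyType` and its behavior are hypothetical; the sketch only follows the `register` contract used by the bundled layers such as `priorbox.py`):
-
-```
-# mylayer.py -- sketch of a caffe2fluid custom layer (hypothetical kind 'MyType')
-from .register import register
-
-
-def mylayer_shape(input_shape):
-    """ compute the output shape; this toy layer keeps the input shape """
-    return input_shape
-
-
-def mylayer_layer(input, name):
-    """ build the fluid op(s) implementing the layer (single input assumed) """
-    import paddle.fluid as fluid
-    return fluid.layers.relu(input)
-
-
-register(kind='MyType', shape=mylayer_shape, layer=mylayer_layer)
-```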
-
-### Tested models
-- LeNet:
-[model addr](https://github.com/ethereon/caffe-tensorflow/blob/master/examples/mnist)
-
-- ResNets:(ResNet-50, ResNet-101, ResNet-152)
-[model addr](https://onedrive.live.com/?authkey=%21AAFW2-FVoxeVRck&id=4006CBB8476FF777%2117887&cid=4006CBB8476FF777)
-
-- GoogLeNet:
-[model addr](https://gist.github.com/jimmie33/7ea9f8ac0da259866b854460f4526034)
-
-- VGG:
-[model addr](https://gist.github.com/ksimonyan/211839e770f7b538e2d8)
-
-- AlexNet:
-[model addr](https://github.com/BVLC/caffe/tree/master/models/bvlc_alexnet)
-
-### Notes
-Some of this code comes from [caffe-tensorflow](https://github.com/ethereon/caffe-tensorflow)
+Please visit the project at [PaddleCV/caffe2fluid](../../../PaddleCV/caffe2fluid).
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/__init__.py b/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/__init__.py
deleted file mode 100644
index 4f7d3dbcc1120cac6731c722c0b7412120e61bd5..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/__init__.py
+++ /dev/null
@@ -1,113 +0,0 @@
-"""
-"""
-
-from .register import get_registered_layers
-#custom layer import begins
-
-import axpy
-import flatten
-import argmax
-import reshape
-import roipooling
-import priorbox
-import permute
-import detection_out
-import normalize
-import select
-import crop
-import reduction
-
-#custom layer import ends
-
-custom_layers = get_registered_layers()
-
-
-def set_args(f, params, node=None):
- """ set args for function 'f' using the parameters in node.layer.parameters
-
- Args:
- f (function): a python function object
-        params (object): an object containing the attributes needed by f's arguments
-
- Returns:
- arg_names (list): a list of argument names
- kwargs (dict): a dict contains needed arguments
- """
- from ..protobuf_to_dict import protobuf_to_dict
-
- argc = f.__code__.co_argcount
- arg_list = f.__code__.co_varnames[0:argc]
-
- kwargs = {}
- for arg_name in arg_list:
- if arg_name in params:
- kwargs[arg_name] = params[arg_name]
-
- if node is not None and len(node.metadata):
- kwargs.update(node.metadata)
-
- return arg_list, kwargs
-
-
-def has_layer(kind):
- """ test whether this layer exists in custom layer
- """
- return kind in custom_layers
-
-
-def compute_output_shape(kind, node):
- assert kind in custom_layers, "layer[%s] not exist in custom layers" % (
- kind)
- shape_func = custom_layers[kind]['shape']
-
- parents = node.parents
- inputs = [list(p.output_shape) for p in parents]
- arg_names, kwargs = set_args(shape_func, node.params)
-
- if len(inputs) == 1:
- inputs = inputs[0]
-
- return shape_func(inputs, **kwargs)
-
-
-def make_node(template, kind, node):
- """ make a PaddleNode for custom layer which means construct
- a piece of code to define a layer implemented in 'custom_layers'
-
- Args:
-        @template (PaddleNode): a factory used to create new PaddleNode instances
- @kind (str): type of custom layer
- @node (graph.Node): a layer in the net
-
- Returns:
- instance of PaddleNode
- """
- assert kind in custom_layers, "layer[%s] not exist in custom layers" % (
- kind)
-
- layer_func = custom_layers[kind]['layer']
-
- #construct arguments needed by custom layer function from node's parameters
- arg_names, kwargs = set_args(layer_func, node.params, node)
-
- return template('custom_layer', kind, **kwargs)
-
-
-def make_custom_layer(kind, inputs, name, *args, **kwargs):
- """ execute a custom layer which is implemented by users
-
- Args:
- @kind (str): type name of this layer
- @inputs (vars): variable list created by fluid
-        @name (str): name for this layer
- @args (tuple): other positional arguments
- @kwargs (dict): other kv arguments
-
- Returns:
- output (var): output variable for this layer
- """
- assert kind in custom_layers, "layer[%s] not exist in custom layers" % (
- kind)
-
- layer_func = custom_layers[kind]['layer']
- return layer_func(inputs, name, *args, **kwargs)
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/priorbox.py b/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/priorbox.py
deleted file mode 100644
index c3c23fbdb17a4992f41946a9889790f0782bd7e7..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/caffe2fluid/kaffe/custom_layers/priorbox.py
+++ /dev/null
@@ -1,100 +0,0 @@
-""" A custom layer for 'priorbox' which is used in ssd to generate prior box info
- Since the order of prior box is different between caffe and paddle,
- we use 'slice' and 'concate' ops to align them.
-"""
-
-from .register import register
-
-
-def priorbox_shape(input_shapes, min_size, max_size=None, aspect_ratio=None):
- """ calculate the output shape of this layer using input shapes
-
- Args:
- @input_shapes (list of tuples): a list of input shapes
-
- Returns:
- @output_shape (list of num): a list of numbers represent the output shape
- """
-    assert len(input_shapes) == 2, "invalid inputs for priorbox_shape"
- fc_shape = input_shapes[0]
- N = 1
-    if max_size is not None:
-        N += 1
-    if aspect_ratio is not None:
- N += 2 * len(aspect_ratio)
-
- N_bbx = fc_shape[2] * fc_shape[3] * N
- output_shape = [1, 2, 4 * N_bbx]
- return output_shape
-
-
-def priorbox_layer(inputs,
- name,
- min_size,
- step,
- max_size=None,
- aspect_ratio=None,
- flip=True,
- clip=False,
- variance=[],
- offset=0.5):
- """ build a layer of type 'Priorbox' using fluid
-
- Args:
- @inputs (list of variables): input fluid variables for this layer
- @name (str): name for this layer
-
- Returns:
- output (variable): output variable for this layer
- """
- import paddle.fluid as fluid
-
- assert len(inputs) == 2, "invalid inputs for Priorbox[%s]" % (name)
- input = inputs[0]
- image = inputs[1]
- box, variance_ = fluid.layers.prior_box(
- input,
- image,
- min_size,
- max_size,
- aspect_ratio,
- variance,
- flip,
- clip, (step, step),
- offset,
- min_max_aspect_ratios_order=True)
- """
- #adjust layout when the output is not consistent with caffe's
-
- feat_shape = list(input.shape)
- H = feat_shape[2]
- W = feat_shape[3]
- box_tmp = fluid.layers.reshape(box, [H, W, -1, 4])
- nb_prior_bbx = int(box_tmp.shape[2])
- tensor_list = fluid.layers.split(box_tmp, nb_prior_bbx, 2)
-
- #TODO:
- # current implementation for this layer is not efficient
- # and we should fix this bug in future when Paddle support the same prior-box layout with Caffe
- index_list = [0]
- index_list = index_list * nb_prior_bbx
- index_offset = 0
- if max_size is not None:
- index_list[1] = -1
- index_offset = 1
- for ii in xrange(2 * len(aspect_ratio)):
- index_list[ii + 1 + index_offset] = ii + 1
-
- tensor_list_gathered = [tensor_list[ii] for ii in index_list]
- caffe_prior_bbx = fluid.layers.concat(tensor_list_gathered, axis=2)
- box = fluid.layers.reshape(caffe_prior_bbx, [1, 1, -1])
- """
-
- box = fluid.layers.reshape(box, [1, 1, -1])
- variance_ = fluid.layers.reshape(variance_, [1, 1, -1])
- output = fluid.layers.concat([box, variance_], axis=1)
-
- return output
-
-
-register(kind='PriorBox', shape=priorbox_shape, layer=priorbox_layer)
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/layers.py b/fluid/PaddleCV/caffe2fluid/kaffe/layers.py
deleted file mode 100644
index ecf87f70d5c4d3ea3b13fbd355dbf90ea168f3e1..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/caffe2fluid/kaffe/layers.py
+++ /dev/null
@@ -1,250 +0,0 @@
-import re
-import numbers
-from collections import namedtuple
-
-import custom_layers
-from .shapes import *
-
-LAYER_DESCRIPTORS = {
-
- # Caffe Types
- 'AbsVal': shape_identity,
- 'Accuracy': shape_scalar,
- 'ArgMax': shape_not_implemented,
- 'BatchNorm': shape_identity,
- 'BNLL': shape_not_implemented,
- 'Concat': shape_concat,
- 'ContrastiveLoss': shape_scalar,
- 'Convolution': shape_convolution,
- 'Deconvolution': shape_deconvolution,
- 'Data': shape_data,
- 'Dropout': shape_identity,
- 'DummyData': shape_data,
- 'Crop': shape_crop,
- 'EuclideanLoss': shape_scalar,
- 'Eltwise': shape_identity,
- 'Exp': shape_identity,
- 'Flatten': shape_not_implemented,
- 'HDF5Data': shape_data,
- 'HDF5Output': shape_identity,
- 'HingeLoss': shape_scalar,
- 'Im2col': shape_not_implemented,
- 'ImageData': shape_data,
- 'InfogainLoss': shape_scalar,
- 'InnerProduct': shape_inner_product,
- 'Input': shape_data,
- 'LRN': shape_identity,
- 'MemoryData': shape_mem_data,
- 'MultinomialLogisticLoss': shape_scalar,
- 'MVN': shape_not_implemented,
- 'Pooling': shape_pool,
- 'Power': shape_identity,
- 'ReLU': shape_identity,
- 'PReLU': shape_identity,
- 'Scale': shape_identity,
- 'Sigmoid': shape_identity,
- 'SigmoidCrossEntropyLoss': shape_scalar,
- 'Silence': shape_not_implemented,
- 'Softmax': shape_identity,
- 'SoftmaxWithLoss': shape_scalar,
- 'Split': shape_not_implemented,
- 'Slice': shape_not_implemented,
- 'TanH': shape_identity,
- 'WindowData': shape_not_implemented,
- 'Threshold': shape_identity,
-}
-
-# layer types in 'V1LayerParameter'
-# (v1layertype name, enum value, mapped to layer type)
-v1_layertypes = [
- ('ABSVAL', 35),
- ('ACCURACY', 1),
- ('ARGMAX', 30),
- ('BNLL', 2),
- ('CONCAT', 3),
- ('CONVOLUTION', 4),
- ('DATA', 5),
- ('DECONVOLUTION', 39),
- ('DROPOUT', 6),
- ('ELTWISE', 25),
- ('EXP', 38),
- ('FLATTEN', 8),
- ('IM2COL', 11),
- ('INNERPRODUCT', 14),
- ('LRN', 15),
- ('MEMORYDATA', 29),
- ('MULTINOMIALLOGISTICLOSS', 16),
- ('MVN', 34),
- ('POOLING', 17),
- ('POWER', 26),
- ('RELU', 18),
- ('SIGMOID', 19),
- ('SIGMOIDCROSSENTROPYLOSS', 27),
- ('SILENCE', 36),
- ('SOFTMAX', 20),
- ('SPLIT', 22),
- ('SLICE', 33),
- ('TANH', 23),
- ('WINDOWDATA', 24),
- ('THRESHOLD', 31),
-]
-
-LAYER_TYPES = LAYER_DESCRIPTORS.keys()
-LayerType = type('LayerType', (), {t: t for t in LAYER_TYPES})
-
-#map the layer name in V1 to standard name
-V1_LAYER_MAP = {'_not_init_': True}
-
-
-def get_v1_layer_map():
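-    # lazily build (only once) the map from V1 layer enum values to standard layer-type names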
- global V1_LAYER_MAP
- if '_not_init_' not in V1_LAYER_MAP:
- return V1_LAYER_MAP
- else:
- del V1_LAYER_MAP['_not_init_']
-
- name2layer = {}
- for n in LAYER_TYPES:
- name2layer[n.upper()] = n
-
- for l in v1_layertypes:
- n, v = l
- if n in name2layer and v not in V1_LAYER_MAP:
- V1_LAYER_MAP[v] = name2layer[n]
- else:
- raise KaffeError('not found v1 layer type %s' % n)
- return V1_LAYER_MAP
-
-
-class NodeKind(LayerType):
- @staticmethod
- def map_raw_kind(kind):
- if custom_layers.has_layer(kind):
- return kind
-
- if kind in LAYER_TYPES:
- return kind
-
- v1_layers = get_v1_layer_map()
- if kind in v1_layers:
- return v1_layers[kind]
- else:
- return None
-
- @staticmethod
- def compute_output_shape(node):
- if custom_layers.has_layer(node.kind):
- return custom_layers.compute_output_shape(node.kind, node)
-
- try:
- val = LAYER_DESCRIPTORS[node.kind](node)
- return val
- except NotImplementedError:
- raise KaffeError(
- 'Output shape computation not implemented for type: %s' %
- node.kind)
-
-
-class NodeDispatchError(KaffeError):
- pass
-
-
-class NodeDispatch(object):
- @staticmethod
- def get_handler_name(node_kind):
- if len(node_kind) <= 6:
- # A catch-all for things like ReLU and tanh
- return node_kind.lower()
- # Convert from CamelCase to under_scored
- name = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', node_kind)
- return re.sub('([a-z0-9])([A-Z])', r'\1_\2', name).lower()
-
- def get_handler(self, node_kind, prefix):
- if custom_layers.has_layer(node_kind):
- return getattr(self, 'map_custom')
-
- name = self.get_handler_name(node_kind)
- name = '_'.join((prefix, name))
- try:
- return getattr(self, name)
- except AttributeError:
- raise NodeDispatchError(
- 'No handler found for node kind: %s (expected: %s)' %
- (node_kind, name))
-
-
-class LayerAdapter(object):
- def __init__(self, layer, kind):
- self.layer = layer
- self.kind = kind
-
- @property
- def parameters(self):
- name = NodeDispatch.get_handler_name(self.kind)
- if self.kind.lower() == "normalize":
- name = "norm"
- elif self.kind.lower() == "deconvolution":
- name = "convolution"
-
- name = '_'.join((name, 'param'))
- try:
- return getattr(self.layer, name)
- except AttributeError:
- print(dir(self.layer))
- raise NodeDispatchError(
- 'Caffe parameters not found attr[%s] for layer kind[%s]' %
- (name, self.kind))
-
- @staticmethod
- def get_kernel_value(scalar, repeated, idx, default=None):
- if scalar:
- return scalar
- if repeated:
- if isinstance(repeated, numbers.Number):
- return repeated
- if len(repeated) == 1:
- # Same value applies to all spatial dimensions
- return int(repeated[0])
- assert idx < len(repeated)
- # Extract the value for the given spatial dimension
- return repeated[idx]
- if default is None:
- raise ValueError('Unable to determine kernel parameter!')
- return default
-
- @property
- def kernel_parameters(self):
- assert self.kind in (NodeKind.Convolution, NodeKind.Pooling,\
- NodeKind.Deconvolution)
-
- params = self.parameters
- k_h = self.get_kernel_value(params.kernel_h, params.kernel_size, 0)
- k_w = self.get_kernel_value(params.kernel_w, params.kernel_size, 1)
- s_h = self.get_kernel_value(
- params.stride_h, params.stride, 0, default=1)
- s_w = self.get_kernel_value(
- params.stride_w, params.stride, 1, default=1)
- p_h = self.get_kernel_value(params.pad_h, params.pad, 0, default=0)
- p_w = self.get_kernel_value(params.pad_w, params.pad, 1, default=0)
-
- dila_h = dila_w = 1
- if self.kind in (NodeKind.Convolution, NodeKind.Deconvolution):
- dila_len = len(params.dilation)
- if dila_len == 2:
- dila_h = params.dilation[0]
- dila_w = params.dilation[1]
- elif dila_len == 1:
- dila_h = dila_w = params.dilation[0]
- else:
- assert dila_len == 0, "invalid length[%s] of dilation in convolution" % (
- dila_len)
-
- return KernelParameters(k_h, k_w, s_h, s_w, p_h, p_w, dila_h, dila_w)
-
-
-KernelParameters = namedtuple(
- 'KernelParameters',
- [
- 'kernel_h', 'kernel_w', 'stride_h', 'stride_w', 'pad_h', 'pad_w',
- 'dila_h', 'dila_w'
- ], )
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/paddle/network.py b/fluid/PaddleCV/caffe2fluid/kaffe/paddle/network.py
deleted file mode 100644
index 26aaaca68d6ecd976f8bdc2430b0a180838a062c..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/caffe2fluid/kaffe/paddle/network.py
+++ /dev/null
@@ -1,555 +0,0 @@
-import sys
-import os
-import math
-import numpy as np
-
-
-def import_fluid():
- import paddle.fluid as fluid
- return fluid
-
-
-def layer(op):
- '''Decorator for composable network layers.'''
-
- def layer_decorated(self, *args, **kwargs):
- # Automatically set a name if not provided.
- name = kwargs.setdefault('name', self.get_unique_name(op.__name__))
- # Figure out the layer inputs.
- if len(self.terminals) == 0:
- raise RuntimeError('No input variables found for layer %s.' % name)
- elif len(self.terminals) == 1:
- layer_input = self.terminals[0]
- else:
- layer_input = list(self.terminals)
-
- self.layer_reverse_trace[name] = layer_input
- # Perform the operation and get the output.
- layer_output = op(self, layer_input, *args, **kwargs)
- # Add to layer LUT.
- self.layers[name] = layer_output
- self.var2name[layer_output.name] = (name, layer_output)
-
- # This output is now the input for the next layer.
- self.feed(layer_output)
- # Return self for chained calls.
- return self
-
- return layer_decorated
-
-
-class Network(object):
- def __init__(self, inputs, trainable=True):
- # The input nodes for this network
- self.inputs = inputs
- # The current list of terminal nodes
- self.terminals = []
- # Mapping from layer names to layers
- self.layers = dict(inputs)
- # If true, the resulting variables are set as trainable
- self.trainable = trainable
- # Switch variable for dropout
- self.paddle_env = None
- self.output_names = []
- self.name_trace = None
-
- self.layer_reverse_trace = {}
- self.var2name = {}
- self.setup()
-
- def setup(self):
- '''Construct the network. '''
- raise NotImplementedError('Must be implemented by the subclass.')
-
- def locate_ancestor(self, v, which=[0], ancestor_level=1):
- """ find a ancestor for a node 'v' which is a fluid variable
- """
- ancestor = None
- which = which * ancestor_level
- name = self.var2name[v.name][0]
-
- for i in range(ancestor_level):
- v = self.layer_reverse_trace[name]
- if type(v) is list:
- ancestor = self.var2name[v[which[i]].name]
- else:
- ancestor = self.var2name[v.name]
- name = ancestor[0]
- return ancestor
-
- def load(self, data_path, exe=None, place=None, ignore_missing=False):
- '''Load network weights.
- data_path: The path to the numpy-serialized network weights
- ignore_missing: If true, serialized weights for missing layers are ignored.
- '''
- fluid = import_fluid()
- #load fluid mode directly
- if os.path.isdir(data_path):
- assert (exe is not None), \
-                'must provide an executor to load a fluid model'
- fluid.io.load_persistables(executor=exe, dirname=data_path)
- return True
-
- #load model from a npy file
- if exe is None or place is None:
- if self.paddle_env is None:
- place = fluid.CPUPlace()
- exe = fluid.Executor(place)
- self.paddle_env = {'place': place, 'exe': exe}
-                exe.run(fluid.default_startup_program())
- else:
- place = self.paddle_env['place']
- exe = self.paddle_env['exe']
-
- data_dict = np.load(data_path).item()
- for op_name in data_dict:
- if op_name == 'caffe2fluid_name_trace':
- self.name_trace = data_dict[op_name]
- continue
-
- layer = self.layers[op_name]
- for param_name, data in data_dict[op_name].iteritems():
- try:
- name = '%s_%s' % (op_name, param_name)
- v = fluid.global_scope().find_var(name)
- w = v.get_tensor()
- w.set(data.reshape(w.shape()), place)
- except ValueError:
- if not ignore_missing:
- raise
- return True
-
- def feed(self, *args):
- '''Set the input(s) for the next operation by replacing the terminal nodes.
- The arguments can be either layer names or the actual layers.
- '''
- assert len(args) != 0
- self.terminals = []
- for fed_layer in args:
- if isinstance(fed_layer, basestring):
- try:
- fed_layer = self.layers[fed_layer]
- except KeyError:
- raise KeyError('Unknown layer name fed: %s' % fed_layer)
- self.terminals.append(fed_layer)
- return self
-
- def get_output(self):
- '''Returns the current network output.'''
- return self.terminals[-1]
-
- def get_unique_name(self, prefix):
- '''Returns an index-suffixed unique name for the given prefix.
- This is used for auto-generating layer names based on the type-prefix.
- '''
- ident = sum(t.startswith(prefix) for t, _ in self.layers.items()) + 1
- return '%s_%d' % (prefix, ident)
-
- def get_unique_output_name(self, prefix, layertype):
- '''Returns an index-suffixed unique name for the given prefix.
- This is used for auto-generating layer names based on the type-prefix.
- '''
- ident = sum(t.startswith(prefix) for t in self.output_names) + 1
- unique_name = '%s.%s.output.%d' % (prefix, layertype, ident)
- self.output_names.append(unique_name)
- return unique_name
-
- @layer
- def conv(self,
- input,
- k_h,
- k_w,
- c_o,
- s_h,
- s_w,
- name,
- relu=True,
- relu_negative_slope=0.0,
- padding=None,
- dilation=1,
- group=1,
- biased=True):
- if padding is None:
- padding = [0, 0]
-
- # Get the number of channels in the input
- c_i, h_i, w_i = input.shape[1:]
-
- # Verify that the grouping parameter is valid
- assert c_i % group == 0
- assert c_o % group == 0
-
- fluid = import_fluid()
- prefix = name + '_'
- leaky_relu = False
- act = 'relu'
- if relu is False:
- act = None
- elif relu_negative_slope != 0.0:
- leaky_relu = True
- act = None
-
- output = fluid.layers.conv2d(
- name=self.get_unique_output_name(name, 'conv2d'),
- input=input,
- filter_size=[k_h, k_w],
- num_filters=c_o,
- stride=[s_h, s_w],
- padding=padding,
- dilation=dilation,
- groups=group,
- param_attr=fluid.ParamAttr(name=prefix + "weights"),
- bias_attr=fluid.ParamAttr(name=prefix + "biases"),
- act=act)
-
- if leaky_relu:
- output = fluid.layers.leaky_relu(output, alpha=relu_negative_slope)
-
- return output
-
- @layer
- def deconv(self,
- input,
- k_h,
- k_w,
- c_o,
- s_h,
- s_w,
- name,
- relu=True,
- relu_negative_slope=0.0,
- padding=None,
- dilation=1,
- biased=True):
- if padding is None:
- padding = [0, 0]
-
- # Get the number of channels in the input
- c_i, h_i, w_i = input.shape[1:]
-
- fluid = import_fluid()
- prefix = name + '_'
- leaky_relu = False
- act = 'relu'
- if relu is False:
- act = None
- elif relu_negative_slope != 0.0:
- leaky_relu = True
- act = None
-
- p_h = padding[0]
- p_w = padding[1]
- h_o = (h_i - 1) * s_h - 2 * p_h + dilation * (k_h - 1) + 1
- w_o = (w_i - 1) * s_w - 2 * p_w + dilation * (k_w - 1) + 1
- output = fluid.layers.conv2d_transpose(
- name=self.get_unique_output_name(name, 'conv2d_transpose'),
- input=input,
- num_filters=c_o,
- output_size=[h_o, w_o],
- filter_size=[k_h, k_w],
- padding=padding,
- stride=[s_h, s_w],
- dilation=dilation,
- param_attr=fluid.ParamAttr(name=prefix + "weights"),
- bias_attr=fluid.ParamAttr(name=prefix + "biases"),
- act=act)
-
- if leaky_relu:
- output = fluid.layers.leaky_relu(output, alpha=relu_negative_slope)
-
- return output
-
- @layer
- def relu(self, input, name):
- fluid = import_fluid()
- output = fluid.layers.relu(input)
- return output
-
- @layer
- def prelu(self, input, channel_shared, name):
- fluid = import_fluid()
- if channel_shared:
- mode = 'all'
- else:
- mode = 'channel'
-
- prefix = name + '_'
- output = fluid.layers.prelu(
- input,
- mode=mode,
- param_attr=fluid.ParamAttr(name=prefix + 'negslope'))
- return output
-
- def pool(self, pool_type, input, k_h, k_w, s_h, s_w, ceil_mode, padding,
- name):
- # Get the number of channels in the input
- in_hw = input.shape[2:]
- k_hw = [k_h, k_w]
- s_hw = [s_h, s_w]
-
- fluid = import_fluid()
- output = fluid.layers.pool2d(
- name=name,
- input=input,
- pool_size=k_hw,
- pool_stride=s_hw,
- pool_padding=padding,
- ceil_mode=ceil_mode,
- pool_type=pool_type)
- return output
-
- @layer
- def max_pool(self,
- input,
- k_h,
- k_w,
- s_h,
- s_w,
- ceil_mode,
- padding=[0, 0],
- name=None):
- return self.pool(
- 'max',
- input,
- k_h,
- k_w,
- s_h,
- s_w,
- ceil_mode,
- padding,
- name=self.get_unique_output_name(name, 'max_pool'))
-
- @layer
- def avg_pool(self,
- input,
- k_h,
- k_w,
- s_h,
- s_w,
- ceil_mode,
- padding=[0, 0],
- name=None):
- return self.pool(
- 'avg',
- input,
- k_h,
- k_w,
- s_h,
- s_w,
- ceil_mode,
- padding,
- name=self.get_unique_output_name(name, 'avg_pool'),
- exclusive=False)
-
- @layer
- def sigmoid(self, input, name):
- fluid = import_fluid()
- return fluid.layers.sigmoid(
- input, name=self.get_unique_output_name(name, 'sigmoid'))
-
- @layer
- def tanh(self, input, name):
- fluid = import_fluid()
- return fluid.layers.tanh(
- input, name=self.get_unique_output_name(name, 'tanh'))
-
- @layer
- def lrn(self, input, radius, alpha, beta, name, bias=1.0):
- fluid = import_fluid()
- output = fluid.layers.lrn(input=input,
- n=radius,
- k=bias,
- alpha=alpha,
- beta=beta,
- name=self.get_unique_output_name(name, 'lrn'))
- return output
-
- @layer
- def concat(self, inputs, axis, name):
- fluid = import_fluid()
- output = fluid.layers.concat(
- input=inputs,
- axis=axis,
- name=self.get_unique_output_name(name, 'concat'))
- return output
-
- @layer
- def add(self, inputs, name):
- fluid = import_fluid()
- output = inputs[0]
- for i in inputs[1:]:
- output = fluid.layers.elementwise_add(
- x=output, y=i, name=self.get_unique_output_name(name, 'add'))
- return output
-
- @layer
- def max(self, inputs, name):
- fluid = import_fluid()
- output = inputs[0]
- for i in inputs[1:]:
- output = fluid.layers.elementwise_max(
- x=output, y=i, name=self.get_unique_output_name(name, 'max'))
- return output
-
- @layer
- def multiply(self, inputs, name):
- fluid = import_fluid()
- output = inputs[0]
- for i in inputs[1:]:
- output = fluid.layers.elementwise_mul(
- x=output, y=i, name=self.get_unique_output_name(name, 'mul'))
- return output
-
- @layer
- def fc(self, input, num_out, name, relu=True, act=None):
- fluid = import_fluid()
-
- if act is None:
- act = 'relu' if relu is True else None
-
- prefix = name + '_'
- output = fluid.layers.fc(
- name=self.get_unique_output_name(name, 'fc'),
- input=input,
- size=num_out,
- act=act,
- param_attr=fluid.ParamAttr(name=prefix + 'weights'),
- bias_attr=fluid.ParamAttr(name=prefix + 'biases'))
- return output
-
- @layer
- def softmax(self, input, axis=2, name=None):
- fluid = import_fluid()
- shape = input.shape
- dims = len(shape)
- axis = axis + dims if axis < 0 else axis
-
- need_transpose = False
- if axis + 1 != dims:
- need_transpose = True
-
- if need_transpose:
-            order = range(dims)
-            order.remove(axis)
-            order.append(axis)
- input = fluid.layers.transpose(
- input,
- perm=order,
- name=self.get_unique_output_name(name, 'transpose'))
-
- output = fluid.layers.softmax(
- input, name=self.get_unique_output_name(name, 'softmax'))
-
- if need_transpose:
- order = range(len(shape))
- order[axis] = dims - 1
- order[-1] = axis
- output = fluid.layers.transpose(
- output,
- perm=order,
- name=self.get_unique_output_name(name, 'transpose'))
- return output
-
- @layer
- def batch_normalization(self,
- input,
- name,
- scale_offset=True,
- eps=1e-5,
- relu=False,
- relu_negative_slope=0.0):
- # NOTE: Currently, only inference is supported
- fluid = import_fluid()
- prefix = name + '_'
- param_attr = None if scale_offset is False else fluid.ParamAttr(
- name=prefix + 'scale')
- bias_attr = None if scale_offset is False else fluid.ParamAttr(
- name=prefix + 'offset')
- mean_name = prefix + 'mean'
- variance_name = prefix + 'variance'
-
- leaky_relu = False
- act = 'relu'
- if relu is False:
- act = None
- elif relu_negative_slope != 0.0:
- leaky_relu = True
- act = None
-
- output = fluid.layers.batch_norm(
- name=self.get_unique_output_name(name, 'batch_norm'),
- input=input,
- is_test=True,
- param_attr=param_attr,
- bias_attr=bias_attr,
- moving_mean_name=mean_name,
- moving_variance_name=variance_name,
- epsilon=eps,
- act=act)
-
- if leaky_relu:
- output = fluid.layers.leaky_relu(output, alpha=relu_negative_slope)
-
- return output
-
- @layer
- def dropout(self, input, drop_prob, name, is_test=True):
- fluid = import_fluid()
- if is_test:
- output = input
- else:
- output = fluid.layers.dropout(
- input,
- dropout_prob=drop_prob,
- is_test=is_test,
- name=self.get_unique_output_name(name, 'dropout'))
- return output
-
- @layer
- def scale(self, input, axis=1, num_axes=1, name=None):
- fluid = import_fluid()
-
- assert num_axes == 1, "layer scale not support this num_axes[%d] now" % (
- num_axes)
-
- prefix = name + '_'
- scale_shape = input.shape[axis:axis + num_axes]
- param_attr = fluid.ParamAttr(name=prefix + 'scale')
- scale_param = fluid.layers.create_parameter(
- shape=scale_shape, dtype=input.dtype, name=name, attr=param_attr)
-
- offset_attr = fluid.ParamAttr(name=prefix + 'offset')
- offset_param = fluid.layers.create_parameter(
- shape=scale_shape, dtype=input.dtype, name=name, attr=offset_attr)
-
- output = fluid.layers.elementwise_mul(
- input,
- scale_param,
- axis=axis,
- name=self.get_unique_output_name(name, 'scale_mul'))
- output = fluid.layers.elementwise_add(
- output,
- offset_param,
- axis=axis,
- name=self.get_unique_output_name(name, 'scale_add'))
- return output
-
- def custom_layer_factory(self):
- """ get a custom layer maker provided by subclass
- """
- raise NotImplementedError(
- '[custom_layer_factory] must be implemented by the subclass.')
-
- @layer
- def custom_layer(self, inputs, kind, name, *args, **kwargs):
- """ make custom layer
- """
- #FIX ME:
- # there is a trick for different API between caffe and paddle
- if kind == "DetectionOutput":
- conf_var = inputs[1]
- real_conf_var = self.locate_ancestor(conf_var, ancestor_level=2)
- inputs[1] = real_conf_var[1]
-
- name = self.get_unique_output_name(name, kind)
- layer_factory = self.custom_layer_factory()
- return layer_factory(kind, inputs, name, *args, **kwargs)
diff --git a/fluid/PaddleCV/caffe2fluid/kaffe/shapes.py b/fluid/PaddleCV/caffe2fluid/kaffe/shapes.py
deleted file mode 100644
index b41c853d5af691c3d525d45106619e11b65a65cb..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/caffe2fluid/kaffe/shapes.py
+++ /dev/null
@@ -1,156 +0,0 @@
-import math
-from collections import namedtuple
-
-from .errors import KaffeError
-
-Tensor4DShape = namedtuple('Tensor4DShape',
- ['batch_size', 'channels', 'height', 'width'])
-
-Tensor3DShape = namedtuple('Tensor3DShape', ['batch_size', 'data1', 'data2'])
-
-Tensor2DShape = namedtuple('Tensor2DShape', ['batch_size', 'data'])
-
-ScalarShape = namedtuple('ScalarShape', ['batch_size'])
-
-
-def make_tensor(batch_size, d1=None, d2=None, d3=None):
- if d3 is not None:
- return Tensor4DShape(batch_size, d1, d2, d3)
- elif d1 is not None and d2 is not None:
- return Tensor3DShape(batch_size, d1, d2)
- elif d1 is not None and d2 is None:
- return Tensor2DShape(batch_size, d1)
- elif d1 is None and d2 is None and d3 is None:
- return ScalarShape(batch_size)
- else:
- raise NotImplementedError('invalid params for make_tensor %s' \
- % (str((batch_size, d1, d2, d3))))
-
-
-def get_filter_output_shape(i_h, i_w, params, round_func):
- dila_h = getattr(params, 'dila_h', 1)
- dila_w = getattr(params, 'dila_w', 1)
-
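-    # standard convolution output-size arithmetic (rounded by round_func):
-    #   o = (i + 2 * pad - (dilation * (k - 1) + 1)) / stride + 1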
- o_h = (i_h + 2 * params.pad_h -
- (dila_h * (params.kernel_h - 1) + 1)) / float(params.stride_h) + 1
- o_w = (i_w + 2 * params.pad_w -
- (dila_w * (params.kernel_w - 1) + 1)) / float(params.stride_w) + 1
-
- return (int(round_func(o_h)), int(round_func(o_w)))
-
-
-def get_strided_kernel_output_shape(node, round_func):
- assert node.layer is not None
- input_shape = node.get_only_parent().output_shape
- o_h, o_w = get_filter_output_shape(input_shape.height, input_shape.width,
- node.layer.kernel_parameters, round_func)
- params = node.layer.parameters
- has_c_o = hasattr(params, 'num_output')
- c = params.num_output if has_c_o else input_shape.channels
- return make_tensor(input_shape.batch_size, c, o_h, o_w)
-
-
-def shape_not_implemented(node):
- raise NotImplementedError
-
-
-def shape_identity(node):
- assert len(node.parents) > 0
- return node.parents[0].output_shape
-
-
-def shape_scalar(node):
- return make_tensor(1, 1, 1, 1)
-
-
-def shape_crop(node):
-    raise KaffeError('the crop function is defined in custom_layers')
-
-
-def shape_data(node):
- if node.output_shape:
- # Old-style input specification
- shape = node.output_shape
- else:
- try:
- # New-style input specification
- shape = map(int, node.parameters.shape[0].dim)
- except:
- # We most likely have a data layer on our hands. The problem is,
- # Caffe infers the dimensions of the data from the source (eg: LMDB).
- # We want to avoid reading datasets here. Fail for now.
- # This can be temporarily fixed by transforming the data layer to
- # Caffe's "input" layer (as is usually used in the "deploy" version).
- # TODO: Find a better solution for this.
- raise KaffeError(
- 'Cannot determine dimensions of data layer.\n'
- 'See comments in function shape_data for more info.')
- return shape
-
-
-def shape_mem_data(node):
- params = node.parameters
- return make_tensor(params.batch_size, params.channels, params.height,
- params.width)
-
-
-def shape_concat(node):
- axis = node.layer.parameters.axis
- output_shape = None
- for parent in node.parents:
- if output_shape is None:
- output_shape = list(parent.output_shape)
- else:
- output_shape[axis] += parent.output_shape[axis]
- return tuple(output_shape)
-
-
-def shape_convolution(node):
- return get_strided_kernel_output_shape(node, math.floor)
-
-
-def shape_deconvolution(node):
- assert node.layer is not None
- input_shape = node.get_only_parent().output_shape
- h_i = input_shape.height
- w_i = input_shape.width
-
- params = node.layer.kernel_parameters
- p_h = params.pad_h
- p_w = params.pad_w
-
- dila_h = params.dila_h
- dila_w = params.dila_w
-
- k_h = params.kernel_h
- k_w = params.kernel_w
-
- s_h = params.stride_h
- s_w = params.stride_w
-
- h_o = (h_i - 1) * s_h - 2 * p_h + dila_h * (k_h - 1) + 1
- w_o = (w_i - 1) * s_w - 2 * p_w + dila_w * (k_w - 1) + 1
-
- params = node.layer.parameters
- has_c_o = hasattr(params, 'num_output')
- c = params.num_output if has_c_o else input_shape.channels
- return make_tensor(input_shape.batch_size, c, h_o, w_o)
-
-
-def shape_pool(node):
- global_pool = getattr(node.layer.parameters, 'global_pooling', False)
- if global_pool:
- input_shape = node.get_only_parent().output_shape
- return make_tensor(input_shape.batch_size, input_shape.channels, 1, 1)
-
- ceil_mode = getattr(node.layer.parameters, 'ceil_mode', True)
- if ceil_mode is True:
- method = math.ceil
- else:
- method = math.floor
- return get_strided_kernel_output_shape(node, method)
-
-
-def shape_inner_product(node):
- input_shape = node.get_only_parent().output_shape
- return make_tensor(input_shape.batch_size, node.layer.parameters.num_output)
diff --git a/fluid/PaddleCV/deeplabv3+/.gitignore b/fluid/PaddleCV/deeplabv3+/.gitignore
deleted file mode 100644
index d086de2dafc52aa312b186bd593211be6f4ee60c..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/deeplabv3+/.gitignore
+++ /dev/null
@@ -1,3 +0,0 @@
-deeplabv3plus_xception65_initialize.params
-deeplabv3plus.params
-deeplabv3plus.tar.gz
diff --git a/fluid/PaddleCV/deeplabv3+/README.md b/fluid/PaddleCV/deeplabv3+/README.md
index 9ff68ab8c1ded0eb41078886aac7a1ec49f02355..94f81a780a21bda7e230bf513be427b08a6eaca2 100644
--- a/fluid/PaddleCV/deeplabv3+/README.md
+++ b/fluid/PaddleCV/deeplabv3+/README.md
@@ -1,108 +1,2 @@
-DeepLab: running the sample programs in this directory requires PaddlePaddle Fluid v1.0.0 or above. If your installed PaddlePaddle is older than this, please update it following the instructions in the installation document. When running on GPU, this program also requires cuDNN v7.
-
-## Code structure
-```
-├── models.py # network definition script
-├── train.py # training script
-├── eval.py # evaluation script
-└── reader.py # common utilities and data preprocessing script
-```
-
-## Introduction
-
-DeepLabv3+ is the latest in the DeepLab family of semantic segmentation networks, following DeepLabv1, DeepLabv2, and DeepLabv3.
-In this latest work, the DeepLab authors fuse multi-scale information through an encoder-decoder structure while retaining the original atrous convolutions and the ASPP layer;
-the backbone network uses the Xception model, which improves the robustness and speed of semantic segmentation, achieving a new state-of-the-art performance of 89.0 mIoU on the PASCAL VOC 2012 dataset.
-
-
-
-
-## Data preparation
-
-
-
-This project uses the Cityscapes dataset; please register at the [Cityscapes website](https://www.cityscapes-dataset.com) and download it.
-After downloading, the data directory structure is as follows:
-```
-data/cityscape/
-|-- gtFine
-| |-- test
-| |-- train
-| `-- val
-|-- leftImg8bit
- |-- test
- |-- train
- `-- val
-```
-
-## Pretrained model preparation
-
-To train the model from scratch, you need to download our initialization model:
-```
-wget http://paddlemodels.cdn.bcebos.com/deeplab/deeplabv3plus_xception65_initialize.tar.gz
-tar -xf deeplabv3plus_xception65_initialize.tar.gz && rm deeplabv3plus_xception65_initialize.tar.gz
-```
-To fine-tune from the final trained model, or to use it directly for prediction, please download our final model:
-```
-wget http://paddlemodels.cdn.bcebos.com/deeplab/deeplabv3plus.tar.gz
-tar -xf deeplabv3plus.tar.gz && rm deeplabv3plus.tar.gz
-```
-
-
-## Model training and prediction
-
-### Training
-Run the following command to train, specifying the weight save path, the initialization path, and the dataset location:
-```
-python ./train.py \
- --batch_size=1 \
- --train_crop_size=769 \
- --total_step=50 \
- --init_weights_path=$INIT_WEIGHTS_PATH \
- --save_weights_path=$SAVE_WEIGHTS_PATH \
- --dataset_path=$DATASET_PATH
-```
-Use the following command to get more usage information:
-```
-python train.py --help
-```
-The command above only verifies that the training process runs normally: it iterates for just 50 steps with a batch size of 1. To reproduce
-the experiments in the original paper, use the following settings:
-```
-python ./train.py \
- --batch_size=8 \
- --parallel=true \
- --train_crop_size=769 \
- --total_step=90000 \
- --init_weights_path=deeplabv3plus_xception65_initialize.params \
- --save_weights_path=output \
- --dataset_path=$DATASET_PATH
-```
-
-### Testing
-Run the following command to evaluate on the `Cityscape` test dataset:
-```
-python ./eval.py \
-    --init_weights_path=deeplabv3plus.params \
- --dataset_path=$DATASET_PATH
-```
-The model file is specified through the `--init_weights_path` option (defined in `eval.py`). The evaluation metric reported by the test script is mean IoU.
-
-
-## Experimental results
-After training completes, run `eval.py` on the validation set; it produces results like the following:
-```
-load from: ../models/deeplabv3p
-total number 500
-step: 500, mIoU: 0.7873
-```
-
-## Other information
-|Dataset | pretrained model | trained model | mean IoU |
-|---|---|---|---|
-|CityScape | [deeplabv3plus_xception65_initialize.tar.gz](http://paddlemodels.cdn.bcebos.com/deeplab/deeplabv3plus_xception65_initialize.tar.gz) | [deeplabv3plus.tar.gz](http://paddlemodels.cdn.bcebos.com/deeplab/deeplabv3plus.tar.gz) | 0.7873 |
-
-## References
-
-- [Encoder-Decoder with Atrous Separable Convolution for Semantic Image Segmentation](https://arxiv.org/abs/1802.02611)
+Hi! This project has been migrated. Please visit it at [PaddleCV/deeplabv3+](../../../PaddleCV/deeplabv3+).
diff --git a/fluid/PaddleCV/deeplabv3+/eval.py b/fluid/PaddleCV/deeplabv3+/eval.py
deleted file mode 100644
index 624159a54d3ff55e29d9f5ac71c673e5e396d9e7..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/deeplabv3+/eval.py
+++ /dev/null
@@ -1,127 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import os
-os.environ['FLAGS_fraction_of_gpu_memory_to_use'] = '0.98'
-
-import paddle
-import paddle.fluid as fluid
-import numpy as np
-import argparse
-from reader import CityscapeDataset
-import reader
-import models
-import sys
-
-
-def add_argument(name, type, default, help):
- parser.add_argument('--' + name, default=default, type=type, help=help)
-
-
-def add_arguments():
- add_argument('total_step', int, -1,
- "Number of the step to be evaluated, -1 for full evaluation.")
- add_argument('init_weights_path', str, None,
- "Path of the weights to evaluate.")
- add_argument('dataset_path', str, None, "Cityscape dataset path.")
- add_argument('verbose', bool, False, "Print mIoU for each step if verbose.")
- add_argument('use_gpu', bool, True, "Whether use GPU or CPU.")
-
-
-def mean_iou(pred, label):
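-    # pixels labeled `num_classes` form the ignore class: predictions there are
-    # forced to match the label, so they fall into an extra (num_classes + 1)-th
-    # bin that the caller later drops when accumulating wrong/correct counts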
- label = fluid.layers.elementwise_min(
- label, fluid.layers.assign(np.array(
- [num_classes], dtype=np.int32)))
- label_ignore = (label == num_classes).astype('int32')
- label_nignore = (label != num_classes).astype('int32')
-
- pred = pred * label_nignore + label_ignore * num_classes
-
- miou, wrong, correct = fluid.layers.mean_iou(pred, label, num_classes + 1)
- return miou, wrong, correct
-
-
-def load_model():
- if args.init_weights_path.endswith('/'):
- fluid.io.load_params(
- exe, dirname=args.init_weights_path, main_program=tp)
- else:
- fluid.io.load_params(
- exe, dirname="", filename=args.init_weights_path, main_program=tp)
-
-
-CityscapeDataset = reader.CityscapeDataset
-
-parser = argparse.ArgumentParser()
-add_arguments()
-
-args = parser.parse_args()
-
-models.clean()
-models.is_train = False
-deeplabv3p = models.deeplabv3p
-
-image_shape = [1025, 2049]
-eval_shape = [1024, 2048]
-
-sp = fluid.Program()
-tp = fluid.Program()
-batch_size = 1
-reader.default_config['crop_size'] = -1
-reader.default_config['shuffle'] = False
-num_classes = 19
-
-with fluid.program_guard(tp, sp):
- img = fluid.layers.data(name='img', shape=[3, 0, 0], dtype='float32')
- label = fluid.layers.data(name='label', shape=eval_shape, dtype='int32')
- img = fluid.layers.resize_bilinear(img, image_shape)
- logit = deeplabv3p(img)
- logit = fluid.layers.resize_bilinear(logit, eval_shape)
- pred = fluid.layers.argmax(logit, axis=1).astype('int32')
- miou, out_wrong, out_correct = mean_iou(pred, label)
-
-tp = tp.clone(True)
-fluid.memory_optimize(
- tp,
- print_log=False,
- skip_opt_set=[pred.name, miou, out_wrong, out_correct],
- level=1)
-
-place = fluid.CPUPlace()
-if args.use_gpu:
- place = fluid.CUDAPlace(0)
-exe = fluid.Executor(place)
-exe.run(sp)
-
-if args.init_weights_path:
- print("load from:", args.init_weights_path)
- load_model()
-
-dataset = CityscapeDataset(args.dataset_path, 'val')
-if args.total_step == -1:
- total_step = len(dataset.label_files)
-else:
- total_step = args.total_step
-
-batches = dataset.get_batch_generator(batch_size, total_step)
-
-sum_iou = 0
-all_correct = np.array([0], dtype=np.int64)
-all_wrong = np.array([0], dtype=np.int64)
-
-for i, imgs, labels, names in batches:
- result = exe.run(tp,
- feed={'img': imgs,
- 'label': labels},
- fetch_list=[pred, miou, out_wrong, out_correct])
- wrong = result[2][:-1] + all_wrong
- right = result[3][:-1] + all_correct
- all_wrong = wrong.copy()
- all_correct = right.copy()
- mp = (wrong + right) != 0
- miou2 = np.mean((right[mp] * 1.0 / (right[mp] + wrong[mp])))
- if args.verbose:
- print('step: %s, mIoU: %s' % (i + 1, miou2))
- else:
- print('\rstep: %s, mIoU: %s' % (i + 1, miou2))
- sys.stdout.flush()
diff --git a/fluid/PaddleCV/deeplabv3+/models.py b/fluid/PaddleCV/deeplabv3+/models.py
deleted file mode 100644
index feca2142293ee2169fbe0d2bdc82f1d950af00de..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/deeplabv3+/models.py
+++ /dev/null
@@ -1,313 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle
-import paddle.fluid as fluid
-
-import contextlib
-name_scope = ""
-
-decode_channel = 48
-encode_channel = 256
-label_number = 19
-
-bn_momentum = 0.99
-dropout_keep_prop = 0.9
-is_train = True
-
-op_results = {}
-
-default_epsilon = 1e-3
-default_norm_type = 'bn'
-default_group_number = 32
-
-
-@contextlib.contextmanager
-def scope(name):
- global name_scope
- bk = name_scope
- name_scope = name_scope + name + '/'
- yield
- name_scope = bk
-
-
-def check(data, number):
- if type(data) == int:
- return [data] * number
- assert len(data) == number
- return data
-
-
-def clean():
- global op_results
- op_results = {}
-
-
-def append_op_result(result, name):
- global op_results
- op_index = len(op_results)
- name = name_scope + name + str(op_index)
- op_results[name] = result
- return result
-
-
-def conv(*args, **kargs):
- kargs['param_attr'] = name_scope + 'weights'
- if 'bias_attr' in kargs and kargs['bias_attr']:
- kargs['bias_attr'] = name_scope + 'biases'
- else:
- kargs['bias_attr'] = False
- return append_op_result(fluid.layers.conv2d(*args, **kargs), 'conv')
-
-
-def group_norm(input, G, eps=1e-5, param_attr=None, bias_attr=None):
- helper = fluid.layer_helper.LayerHelper('group_norm', **locals())
-
- N, C, H, W = input.shape
- if C % G != 0:
- print("group can not divide channle:", C, G)
- for d in range(10):
- for t in [d, -d]:
- if G + t <= 0: continue
- if C % (G + t) == 0:
- G = G + t
- break
- if C % G == 0:
- print("use group size:", G)
- break
- assert C % G == 0
- param_shape = (G, )
- x = input
- x = fluid.layers.reshape(x, [N, G, C // G * H * W])
- mean = fluid.layers.reduce_mean(x, dim=2, keep_dim=True)
- x = x - mean
- var = fluid.layers.reduce_mean(fluid.layers.square(x), dim=2, keep_dim=True)
- x = x / fluid.layers.sqrt(var + eps)
-
- scale = helper.create_parameter(
- attr=helper.param_attr,
- shape=param_shape,
- dtype='float32',
- default_initializer=fluid.initializer.Constant(1.0))
-
- bias = helper.create_parameter(
- attr=helper.bias_attr, shape=param_shape, dtype='float32', is_bias=True)
- x = fluid.layers.elementwise_add(
- fluid.layers.elementwise_mul(
- x, scale, axis=1), bias, axis=1)
- return fluid.layers.reshape(x, input.shape)
-
-
-def bn(*args, **kargs):
- if default_norm_type == 'bn':
- with scope('BatchNorm'):
- return append_op_result(
- fluid.layers.batch_norm(
- *args,
- epsilon=default_epsilon,
- momentum=bn_momentum,
- param_attr=name_scope + 'gamma',
- bias_attr=name_scope + 'beta',
- moving_mean_name=name_scope + 'moving_mean',
- moving_variance_name=name_scope + 'moving_variance',
- **kargs),
- 'bn')
- elif default_norm_type == 'gn':
- with scope('GroupNorm'):
- return append_op_result(
- group_norm(
- args[0],
- default_group_number,
- eps=default_epsilon,
- param_attr=name_scope + 'gamma',
- bias_attr=name_scope + 'beta'),
- 'gn')
- else:
- raise "Unsupport norm type:" + default_norm_type
-
-
-def bn_relu(data):
- return append_op_result(fluid.layers.relu(bn(data)), 'relu')
-
-
-def relu(data):
- return append_op_result(fluid.layers.relu(data), 'relu')
-
-
-def seq_conv(input, channel, stride, filter, dilation=1, act=None):
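-    # depthwise-separable convolution: a depthwise conv (groups == in-channels)
-    # followed by a 1x1 pointwise conv, each with batch norm and an optional activation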
- with scope('depthwise'):
- input = conv(
- input,
- input.shape[1],
- filter,
- stride,
- groups=input.shape[1],
- padding=(filter // 2) * dilation,
- dilation=dilation)
- input = bn(input)
- if act: input = act(input)
- with scope('pointwise'):
- input = conv(input, channel, 1, 1, groups=1, padding=0)
- input = bn(input)
- if act: input = act(input)
- return input
-
-
-def xception_block(input,
- channels,
- strides=1,
- filters=3,
- dilation=1,
- skip_conv=True,
- has_skip=True,
- activation_fn_in_separable_conv=False):
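-    # an Xception block: `repeat_number` separable convs plus an optional
-    # residual skip (identity, or a 1x1-conv shortcut when skip_conv is set)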
- repeat_number = 3
- channels = check(channels, repeat_number)
- filters = check(filters, repeat_number)
- strides = check(strides, repeat_number)
- data = input
- results = []
- for i in range(repeat_number):
- with scope('separable_conv' + str(i + 1)):
- if not activation_fn_in_separable_conv:
- data = relu(data)
- data = seq_conv(
- data,
- channels[i],
- strides[i],
- filters[i],
- dilation=dilation)
- else:
- data = seq_conv(
- data,
- channels[i],
- strides[i],
- filters[i],
- dilation=dilation,
- act=relu)
- results.append(data)
- if not has_skip:
- return append_op_result(data, 'xception_block'), results
- if skip_conv:
- with scope('shortcut'):
- skip = bn(
- conv(
- input, channels[-1], 1, strides[-1], groups=1, padding=0))
- else:
- skip = input
- return append_op_result(data + skip, 'xception_block'), results
-
-
-def entry_flow(data):
- with scope("entry_flow"):
- with scope("conv1"):
- data = conv(data, 32, 3, stride=2, padding=1)
- data = bn_relu(data)
- with scope("conv2"):
- data = conv(data, 64, 3, stride=1, padding=1)
- data = bn_relu(data)
- with scope("block1"):
- data, _ = xception_block(data, 128, [1, 1, 2])
- with scope("block2"):
- data, results = xception_block(data, 256, [1, 1, 2])
- with scope("block3"):
- data, _ = xception_block(data, 728, [1, 1, 2])
- return data, results[1]
-
-
-def middle_flow(data):
- with scope("middle_flow"):
- for i in range(16):
- with scope("block" + str(i + 1)):
- data, _ = xception_block(data, 728, [1, 1, 1], skip_conv=False)
- return data
-
-
-def exit_flow(data):
- with scope("exit_flow"):
- with scope('block1'):
- data, _ = xception_block(data, [728, 1024, 1024], [1, 1, 1])
- with scope('block2'):
- data, _ = xception_block(
- data, [1536, 1536, 2048], [1, 1, 1],
- dilation=2,
- has_skip=False,
- activation_fn_in_separable_conv=True)
- return data
-
-
-def dropout(x, keep_rate):
- if is_train:
- return fluid.layers.dropout(x, 1 - keep_rate) / keep_rate
- else:
- return x
-
-
-def encoder(input):
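-    # ASPP encoder head: an image-level pooling branch, a 1x1-conv branch, and
-    # three dilated separable-conv branches (rates 6/12/18), concatenated and fused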
- with scope('encoder'):
- channel = 256
- with scope("image_pool"):
- image_avg = fluid.layers.reduce_mean(input, [2, 3], keep_dim=True)
- append_op_result(image_avg, 'reduce_mean')
- image_avg = bn_relu(
- conv(
- image_avg, channel, 1, 1, groups=1, padding=0))
- image_avg = fluid.layers.resize_bilinear(image_avg, input.shape[2:])
-
- with scope("aspp0"):
- aspp0 = bn_relu(conv(input, channel, 1, 1, groups=1, padding=0))
- with scope("aspp1"):
- aspp1 = seq_conv(input, channel, 1, 3, dilation=6, act=relu)
- with scope("aspp2"):
- aspp2 = seq_conv(input, channel, 1, 3, dilation=12, act=relu)
- with scope("aspp3"):
- aspp3 = seq_conv(input, channel, 1, 3, dilation=18, act=relu)
- with scope("concat"):
- data = append_op_result(
- fluid.layers.concat(
- [image_avg, aspp0, aspp1, aspp2, aspp3], axis=1),
- 'concat')
- data = bn_relu(conv(data, channel, 1, 1, groups=1, padding=0))
- data = dropout(data, dropout_keep_prop)
- return data
-
-
-def decoder(encode_data, decode_shortcut):
- with scope('decoder'):
- with scope('concat'):
- decode_shortcut = bn_relu(
- conv(
- decode_shortcut, decode_channel, 1, 1, groups=1, padding=0))
- encode_data = fluid.layers.resize_bilinear(
- encode_data, decode_shortcut.shape[2:])
- encode_data = fluid.layers.concat(
- [encode_data, decode_shortcut], axis=1)
- append_op_result(encode_data, 'concat')
- with scope("separable_conv1"):
- encode_data = seq_conv(
- encode_data, encode_channel, 1, 3, dilation=1, act=relu)
- with scope("separable_conv2"):
- encode_data = seq_conv(
- encode_data, encode_channel, 1, 3, dilation=1, act=relu)
- return encode_data
-
-
-def deeplabv3p(img):
- global default_epsilon
- append_op_result(img, 'img')
- with scope('xception_65'):
- default_epsilon = 1e-3
- # Entry flow
- data, decode_shortcut = entry_flow(img)
- # Middle flow
- data = middle_flow(data)
- # Exit flow
- data = exit_flow(data)
- default_epsilon = 1e-5
- encode_data = encoder(data)
- encode_data = decoder(encode_data, decode_shortcut)
- with scope('logit'):
- logit = conv(
- encode_data, label_number, 1, stride=1, padding=0, bias_attr=True)
- logit = fluid.layers.resize_bilinear(logit, img.shape[2:])
- return logit
diff --git a/fluid/PaddleCV/deeplabv3+/reader.py b/fluid/PaddleCV/deeplabv3+/reader.py
deleted file mode 100644
index d420f0a264ba26be00cbe0d1d36130d565d7030d..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/deeplabv3+/reader.py
+++ /dev/null
@@ -1,146 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import cv2
-import numpy as np
-import os
-import six
-
-default_config = {
- "shuffle": True,
- "min_resize": 0.5,
- "max_resize": 2,
- "crop_size": 769,
-}
-
-
-def slice_with_pad(a, s, value=0):
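-    # slice array `a` by the ranges in `s`; where a range extends past the
-    # array bounds, pad the result with `value` instead of failing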
- pads = []
- slices = []
- for i in range(len(a.shape)):
- if i >= len(s):
- pads.append([0, 0])
- slices.append([0, a.shape[i]])
- else:
- l, r = s[i]
- if l < 0:
- pl = -l
- l = 0
- else:
- pl = 0
- if r > a.shape[i]:
- pr = r - a.shape[i]
- r = a.shape[i]
- else:
- pr = 0
- pads.append([pl, pr])
- slices.append([l, r])
- # numpy requires a tuple (not a list) of slice objects for multi-axis indexing
- slices = tuple(map(lambda x: slice(x[0], x[1], 1), slices))
- a = a[slices]
- a = np.pad(a, pad_width=pads, mode='constant', constant_values=value)
- return a
-
-
-class CityscapeDataset:
- def __init__(self, dataset_dir, subset='train', config=default_config):
- label_dirname = os.path.join(dataset_dir, 'gtFine/' + subset)
- if six.PY2:
- import commands
- label_files = commands.getoutput(
- "find %s -type f | grep labelTrainIds | sort" %
- label_dirname).splitlines()
- else:
- import subprocess
- label_files = subprocess.getstatusoutput(
- "find %s -type f | grep labelTrainIds | sort" %
- label_dirname)[-1].splitlines()
- self.label_files = label_files
- self.label_dirname = label_dirname
- self.index = 0
- self.subset = subset
- self.dataset_dir = dataset_dir
- self.config = config
- self.reset()
- print("total number", len(label_files))
-
- def reset(self, shuffle=False):
- self.index = 0
- if self.config["shuffle"]:
- np.random.shuffle(self.label_files)
-
- def next_img(self):
- self.index += 1
- if self.index >= len(self.label_files):
- self.reset()
-
- def get_img(self):
- shape = self.config["crop_size"]
- while True:
- ln = self.label_files[self.index]
- img_name = os.path.join(
- self.dataset_dir,
- 'leftImg8bit/' + self.subset + ln[len(self.label_dirname):])
- img_name = img_name.replace('gtFine_labelTrainIds', 'leftImg8bit')
- label = cv2.imread(ln)
- img = cv2.imread(img_name)
- if img is None:
- print("load img failed:", img_name)
- self.next_img()
- else:
- break
- if shape == -1:
- return img, label, ln
- random_scale = np.random.rand(1) * (self.config['max_resize'] -
- self.config['min_resize']
- ) + self.config['min_resize']
- crop_size = int(shape / random_scale)
- bb = crop_size // 2
-
- def _randint(low, high):
- return int(np.random.rand(1) * (high - low) + low)
-
- offset_x = np.random.randint(bb, max(bb + 1, img.shape[0] -
- bb)) - crop_size // 2
- offset_y = np.random.randint(bb, max(bb + 1, img.shape[1] -
- bb)) - crop_size // 2
- img_crop = slice_with_pad(img, [[offset_x, offset_x + crop_size],
- [offset_y, offset_y + crop_size]], 128)
- img = cv2.resize(img_crop, (shape, shape))
- label_crop = slice_with_pad(label, [[offset_x, offset_x + crop_size],
- [offset_y, offset_y + crop_size]],
- 255)
- label = cv2.resize(
- label_crop, (shape, shape), interpolation=cv2.INTER_NEAREST)
- return img, label, ln + str(
- (offset_x, offset_y, crop_size, random_scale))
-
- def get_batch(self, batch_size=1):
- imgs = []
- labels = []
- names = []
- while len(imgs) < batch_size:
- img, label, ln = self.get_img()
- imgs.append(img)
- labels.append(label)
- names.append(ln)
- self.next_img()
- return np.array(imgs), np.array(labels), names
-
- def get_batch_generator(self, batch_size, total_step):
- def do_get_batch():
- for i in range(total_step):
- imgs, labels, names = self.get_batch(batch_size)
- labels = labels.astype(np.int32)[:, :, :, 0]
- imgs = imgs[:, :, :, ::-1].transpose(
- 0, 3, 1, 2).astype(np.float32) / (255.0 / 2) - 1
- yield i, imgs, labels, names
-
- batches = do_get_batch()
- try:
- from prefetch_generator import BackgroundGenerator
- batches = BackgroundGenerator(batches, 100)
- except ImportError:
- print(
- "You can install 'prefetch_generator' to accelerate data reading."
- )
- return batches
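
For reference, a minimal sketch of driving the reader above; the dataset root here is a hypothetical placeholder:

```python
import reader
from reader import CityscapeDataset

reader.default_config['crop_size'] = 769
dataset = CityscapeDataset('/path/to/cityscape', 'train')  # hypothetical path
batches = dataset.get_batch_generator(batch_size=2, total_step=10)
for step, imgs, labels, names in batches:
    # imgs: NCHW float32 scaled to [-1, 1]; labels: NHW int32 trainIds
    print(step, imgs.shape, labels.shape)
```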
diff --git a/fluid/PaddleCV/deeplabv3+/train.py b/fluid/PaddleCV/deeplabv3+/train.py
deleted file mode 100644
index fcc038b137349877e06c98a6d533669353bb4b34..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/deeplabv3+/train.py
+++ /dev/null
@@ -1,162 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import os
-os.environ['FLAGS_fraction_of_gpu_memory_to_use'] = '0.98'
-
-import paddle
-import paddle.fluid as fluid
-import numpy as np
-import argparse
-from reader import CityscapeDataset
-import reader
-import models
-import time
-
-def add_argument(name, type, default, help):
- parser.add_argument('--' + name, default=default, type=type, help=help)
-
-
-def add_arguments():
- add_argument('batch_size', int, 2,
- "The number of images in each batch during training.")
- add_argument('train_crop_size', int, 769,
- "Image crop size during training.")
- add_argument('base_lr', float, 0.0001,
- "The base learning rate for model training.")
- add_argument('total_step', int, 90000, "Total number of training steps.")
- add_argument('init_weights_path', str, None,
- "Path of the initial weights in paddlepaddle format.")
- add_argument('save_weights_path', str, None,
- "Path of the saved weights during training.")
- add_argument('dataset_path', str, None, "Cityscape dataset path.")
- add_argument('parallel', bool, False, "Whether to use ParallelExecutor.")
- add_argument('use_gpu', bool, True, "Whether to use GPU or CPU.")
-
-
-def load_model():
- if args.init_weights_path.endswith('/'):
- fluid.io.load_params(
- exe, dirname=args.init_weights_path, main_program=tp)
- else:
- fluid.io.load_params(
- exe, dirname="", filename=args.init_weights_path, main_program=tp)
-
-
-def save_model():
- if args.save_weights_path.endswith('/'):
- fluid.io.save_params(
- exe, dirname=args.save_weights_path, main_program=tp)
- else:
- fluid.io.save_params(
- exe, dirname="", filename=args.save_weights_path, main_program=tp)
-
-
-def loss(logit, label):
- label_nignore = (label < num_classes).astype('float32')
- label = fluid.layers.elementwise_min(
- label,
- fluid.layers.assign(np.array(
- [num_classes - 1], dtype=np.int32)))
- logit = fluid.layers.transpose(logit, [0, 2, 3, 1])
- logit = fluid.layers.reshape(logit, [-1, num_classes])
- label = fluid.layers.reshape(label, [-1, 1])
- label = fluid.layers.cast(label, 'int64')
- label_nignore = fluid.layers.reshape(label_nignore, [-1, 1])
- loss = fluid.layers.softmax_with_cross_entropy(logit, label)
- loss = loss * label_nignore
- no_grad_set.add(label_nignore.name)
- no_grad_set.add(label.name)
- return loss, label_nignore
-
-
-CityscapeDataset = reader.CityscapeDataset
-parser = argparse.ArgumentParser()
-
-add_arguments()
-
-args = parser.parse_args()
-
-models.clean()
-models.bn_momentum = 0.9997
-models.dropout_keep_prop = 0.9
-deeplabv3p = models.deeplabv3p
-
-sp = fluid.Program()
-tp = fluid.Program()
-crop_size = args.train_crop_size
-batch_size = args.batch_size
-image_shape = [crop_size, crop_size]
-reader.default_config['crop_size'] = crop_size
-reader.default_config['shuffle'] = True
-num_classes = 19
-weight_decay = 0.00004
-
-base_lr = args.base_lr
-total_step = args.total_step
-
-no_grad_set = set()
-
-with fluid.program_guard(tp, sp):
- img = fluid.layers.data(
- name='img', shape=[3] + image_shape, dtype='float32')
- label = fluid.layers.data(name='label', shape=image_shape, dtype='int32')
- logit = deeplabv3p(img)
- pred = fluid.layers.argmax(logit, axis=1).astype('int32')
- loss, mask = loss(logit, label)
- lr = fluid.layers.polynomial_decay(
- base_lr, total_step, end_learning_rate=0, power=0.9)
- area = fluid.layers.elementwise_max(
- fluid.layers.reduce_mean(mask),
- fluid.layers.assign(np.array(
- [0.1], dtype=np.float32)))
- loss_mean = fluid.layers.reduce_mean(loss) / area
-
- opt = fluid.optimizer.Momentum(
- lr,
- momentum=0.9,
- regularization=fluid.regularizer.L2DecayRegularizer(
- regularization_coeff=weight_decay), )
- retv = opt.minimize(loss_mean, startup_program=sp, no_grad_set=no_grad_set)
-
-fluid.memory_optimize(
- tp, print_log=False, skip_opt_set=[pred.name, loss_mean.name], level=1)
-
-place = fluid.CPUPlace()
-if args.use_gpu:
- place = fluid.CUDAPlace(0)
-exe = fluid.Executor(place)
-exe.run(sp)
-
-if args.init_weights_path:
- print("load from:", args.init_weights_path)
- load_model()
-
-dataset = CityscapeDataset(args.dataset_path, 'train')
-
-if args.parallel:
- exe_p = fluid.ParallelExecutor(
- use_cuda=True, loss_name=loss_mean.name, main_program=tp)
-
-batches = dataset.get_batch_generator(batch_size, total_step)
-
-for i, imgs, labels, names in batches:
- prev_start_time = time.time()
- if args.parallel:
- retv = exe_p.run(fetch_list=[pred.name, loss_mean.name],
- feed={'img': imgs,
- 'label': labels})
- else:
- retv = exe.run(tp,
- feed={'img': imgs,
- 'label': labels},
- fetch_list=[pred, loss_mean])
- end_time = time.time()
- if i % 100 == 0:
- print("Model is saved to", args.save_weights_path)
- save_model()
- print("step {:d}, loss: {:.6f}, step_time_cost: {:.3f}" .format(i,
- np.mean(retv[1]), end_time - prev_start_time))
-
-print("Training done. Model is saved to", args.save_weights_path)
-save_model()
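
The `loss`/`area` logic above masks out ignore labels and renormalizes the mean loss by the valid-pixel fraction, clipped at 0.1. A small numpy check of that arithmetic, assuming 255 as the ignore label:

```python
import numpy as np

num_classes = 19
labels = np.array([3, 7, 255, 18, 255])               # 255 = ignore
per_pixel_loss = np.array([1.2, 0.8, 5.0, 0.4, 6.0])  # hypothetical losses

mask = (labels < num_classes).astype(np.float32)
area = max(mask.mean(), 0.1)   # clip so nearly-empty crops don't explode
loss_mean = (per_pixel_loss * mask).mean() / area

print(loss_mean)  # ≈ 0.8, i.e. the mean over the 3 valid pixels only
```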
diff --git a/fluid/PaddleCV/face_detection/README.md b/fluid/PaddleCV/face_detection/README.md
deleted file mode 120000
index 4015683cfa5969297febc12e7ca1264afabbc0b5..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/face_detection/README.md
+++ /dev/null
@@ -1 +0,0 @@
-README_cn.md
\ No newline at end of file
diff --git a/fluid/PaddleCV/face_detection/README.md b/fluid/PaddleCV/face_detection/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..e9319716f4f660ff75b571337575d8cd53c03a13
--- /dev/null
+++ b/fluid/PaddleCV/face_detection/README.md
@@ -0,0 +1,2 @@
+
+Hello, this project has been migrated. Please go to the [PaddleCV/face_detection](../../../PaddleCV/face_detection) directory to browse it.
diff --git a/fluid/PaddleCV/face_detection/README_cn.md b/fluid/PaddleCV/face_detection/README_cn.md
index 80485009d24e278a00b3d21001602fbe6ef9eef6..e9319716f4f660ff75b571337575d8cd53c03a13 100644
--- a/fluid/PaddleCV/face_detection/README_cn.md
+++ b/fluid/PaddleCV/face_detection/README_cn.md
@@ -1,185 +1,2 @@
-## PyramidBox Face Detection
-## Table of Contents
-- [Introduction](#introduction)
-- [Data preparation](#data-preparation)
-- [Model training](#model-training)
-- [Model evaluation](#model-evaluation)
-- [Model release](#model-release)
-
-### Introduction
-
-Face detection is a classic computer vision task, and detecting small, blurred and occluded faces in unconstrained scenes is the most challenging problem in this field. [PyramidBox](https://arxiv.org/pdf/1803.07737.pdf) is a single-stage, SSD-based face detector that exploits contextual information to detect hard faces. As shown in the figure below, PyramidBox makes predictions at different levels on feature maps of six scales. The method consists of the following modules: LFPN, Pyramid Anchors, CPM and Data-anchor-sampling; see the paper at https://arxiv.org/pdf/1803.07737.pdf for details, and a brief introduction follows.
-
-
-
-PyramidBox face detection model
-
-
-**LFPN**: LFPN stands for Low-level Feature Pyramid Networks. For detection, LFPN fully combines high-level features, which carry more context, with low-level features, which carry more texture: high-level features are used to detect large faces, while low-level features are used to detect small ones. To integrate the high-level features into the high-resolution low-level features, the fusion proceeds top-down starting from an intermediate layer, building the low-level FPN.
-
-**Pyramid Anchors**: the algorithm uses a semi-supervised scheme to generate approximate semantic labels related to face detection, and proposes an anchor-based context-assisted method that introduces supervision for learning the contextual features of small, blurred and partially occluded faces. From the annotated face labels, boxes can be expanded by fixed ratios to obtain head labels (expanded by 1/2 on each side) and body labels (with a user-defined expansion ratio).
-
-**CPM**: CPM stands for Context-sensitive Predict Module. A context-sensitive structure (CPM) is designed to improve the expressiveness of the prediction network.
-
-**Data-anchor-sampling**: a new sampling method, Data-anchor-sampling, is designed to increase the diversity of training samples across scales. It changes the distribution of the training samples and puts more emphasis on smaller faces.
-
-On the example image below, which contains one thousand faces, the PyramidBox model demonstrates robust performance by detecting 880 of them.
-
-
-PyramidBox face detection performance demo
-
-
-
-
-### Data preparation
-
-This tutorial trains and tests the model on the [WIDER FACE dataset](http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace/); a detailed description of the data is available on the official site.
-
-The WIDER FACE dataset contains 32,203 images with 393,703 faces that vary widely in scale, pose and occlusion. The dataset is organized into 61 scenes; within each scene, 40% of the images are randomly selected for training, 10% for validation and 50% for testing.
-
-First, download the training and validation sets from the official site (both Google Drive and Baidu Yun links are provided; choose whichever suits you) and put them in the `data` directory. Then download the annotations of the training and validation sets:
-
-```bash
-./data/download.sh
-```
-
-After the data is ready, the `data` directory looks as follows:
-
-```
-data
-|-- download.sh
-|-- wider_face_split
-| |-- readme.txt
-| |-- wider_face_train_bbx_gt.txt
-| |-- wider_face_val_bbx_gt.txt
-| `-- ...
-|-- WIDER_train
-| `-- images
-| |-- 0--Parade
-| ...
-| `-- 9--Press_Conference
-`-- WIDER_val
- `-- images
- |-- 0--Parade
- ...
- `-- 9--Press_Conference
-```
-
-
-### Model training
-
-#### Download the pre-trained model
-
-We provide a pre-trained model with a VGGNet backbone; download it with the following commands:
-
-
-```bash
-wget http://paddlemodels.bj.bcebos.com/vgg_ilsvrc_16_fc_reduced.tar.gz
-tar -xf vgg_ilsvrc_16_fc_reduced.tar.gz && rm -f vgg_ilsvrc_16_fc_reduced.tar.gz
-```
-
-Note: this pre-trained model was converted from [Caffe](http://cs.unc.edu/~wliu/projects/ParseNet/VGG_ILSVRC_16_layers_fc_reduced.caffemodel). We will release our own pre-trained model soon.
-
-
-#### Start training
-
-
-`train.py` is the main entry point of the training module; a sample invocation:
-
-```bash
-python -u train.py --batch_size=16 --pretrained_model=vgg_ilsvrc_16_fc_reduced
-```
- - Set `export CUDA_VISIBLE_DEVICES=0,1,2,3` to choose the GPUs to train on; `batch_size` defaults to 12 or 16.
- - For more options, see:
- ```bash
- python train.py --help
- ```
- - The model converges after more than 150 training epochs. With 4 Nvidia Tesla P40 GPUs in parallel and `batch_size=16`, one epoch takes about 40 minutes, so the total training time is roughly 100 hours.
-
-Data augmentation used during training:
-
-**Data augmentation**: data loading is defined in `reader.py`, and all images are resized to 640x640. During training the images are further augmented with random distortion, flipping, cropping and so on, similar to the augmentation in the [SSD object detection](https://github.com/PaddlePaddle/models/blob/develop/fluid/object_detection/README_cn.md#%E8%AE%AD%E7%BB%83-pascal-voc-%E6%95%B0%E6%8D%AE%E9%9B%86) sample; in addition, the Data-anchor-sampling mentioned above is applied:
-
- **Scale transform (Data-anchor-sampling)**: randomly rescale the image within a certain range of scales, which greatly increases the scale variation of faces. Concretely, for a randomly selected face with height and width, compute $v=\\sqrt{width * height}$ and determine which bin of the resize ladder $[16,32,64,128,256,512]$ the value $v$ falls into. Suppose $v=45$: then $32$ is selected
-
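A short numpy sketch of this bin-selection step (illustrative only; the rescaling that follows is described in the PyramidBox paper):

```python
import numpy as np

# v = sqrt(width * height) of a sampled face is matched against the
# anchor ladder; v = 45 falls into the [32, 64) bin, so 32 is selected.
scales = np.array([16, 32, 64, 128, 256, 512])
width, height = 45.0, 45.0
v = np.sqrt(width * height)                     # 45.0
bin_idx = np.searchsorted(scales, v, side='right') - 1
print(scales[bin_idx])                          # 32
```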
-
-
-
-PyramidBox prediction visualization
-
-
-
-### Model release
-
-
-
-| Model | Pre-trained model | Training data | Test data | mAP |
-|:------------------------:|:------------------:|:----------------:|:------------:|:----:|
-|[Pyramidbox-v1-SSD 640x640](http://paddlemodels.bj.bcebos.com/PyramidBox_WiderFace.tar.gz) | [VGGNet](http://paddlemodels.bj.bcebos.com/vgg_ilsvrc_16_fc_reduced.tar.gz) | WIDER FACE train | WIDER FACE Val | 96.0%/ 94.8%/ 88.8% |
-
-#### Performance curves
-
-
-
-
-WIDER FACE Easy/Medium/Hard set
-
+Hello, this project has been migrated. Please go to the [PaddleCV/face_detection](../../../PaddleCV/face_detection) directory to browse it.
diff --git a/fluid/PaddleCV/face_detection/data_util.py b/fluid/PaddleCV/face_detection/data_util.py
deleted file mode 100644
index a8f6aac6ba8a418f5d4645d167122a3bc4cb125b..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/face_detection/data_util.py
+++ /dev/null
@@ -1,157 +0,0 @@
-"""
-This code is based on https://github.com/fchollet/keras/blob/master/keras/utils/data_utils.py
-"""
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import time
-import numpy as np
-import threading
-import multiprocessing
-import traceback
-try:
- import queue
-except ImportError:
- import Queue as queue
-
-
-class GeneratorEnqueuer(object):
- """
- Builds a queue out of a data generator.
-
- Args:
- generator: a generator function which endlessly yields data
- use_multiprocessing (bool): use multiprocessing if True,
- otherwise use threading.
- wait_time (float): time to sleep in-between calls to `put()`.
- random_seed (int): Initial seed for workers,
- will be incremented by one for each worker.
- """
-
- def __init__(self,
- generator,
- use_multiprocessing=False,
- wait_time=0.05,
- random_seed=None):
- self.wait_time = wait_time
- self._generator = generator
- self._use_multiprocessing = use_multiprocessing
- self._threads = []
- self._stop_event = None
- self.queue = None
- self._manager = None
- self.seed = random_seed
-
- def start(self, workers=1, max_queue_size=10):
- """
- Start worker threads which add data from the generator into the queue.
-
- Args:
- workers (int): number of worker threads
- max_queue_size (int): queue size
- (when full, threads could block on `put()`)
- """
-
- def data_generator_task():
- """
- Data generator task.
- """
-
- def task():
- if (self.queue is not None and
- self.queue.qsize() < max_queue_size):
- generator_output = next(self._generator)
- self.queue.put((generator_output))
- else:
- time.sleep(self.wait_time)
-
- if not self._use_multiprocessing:
- while not self._stop_event.is_set():
- with self.genlock:
- try:
- task()
- except Exception:
- traceback.print_exc()
- self._stop_event.set()
- break
- else:
- while not self._stop_event.is_set():
- try:
- task()
- except Exception:
- traceback.print_exc()
- self._stop_event.set()
- break
-
- try:
- if self._use_multiprocessing:
- self._manager = multiprocessing.Manager()
- self.queue = self._manager.Queue(maxsize=max_queue_size)
- self._stop_event = multiprocessing.Event()
- else:
- self.genlock = threading.Lock()
- self.queue = queue.Queue()
- self._stop_event = threading.Event()
- for _ in range(workers):
- if self._use_multiprocessing:
- # Reset random seed else all children processes
- # share the same seed
- np.random.seed(self.seed)
- thread = multiprocessing.Process(target=data_generator_task)
- thread.daemon = True
- if self.seed is not None:
- self.seed += 1
- else:
- thread = threading.Thread(target=data_generator_task)
- self._threads.append(thread)
- thread.start()
- except:
- self.stop()
- raise
-
- def is_running(self):
- """
- Returns:
- bool: Whether the worker threads are running.
- """
- return self._stop_event is not None and not self._stop_event.is_set()
-
- def stop(self, timeout=None):
- """
- Stops running threads and wait for them to exit, if necessary.
- Should be called by the same thread which called `start()`.
-
- Args:
- timeout(int|None): maximum time to wait on `thread.join()`.
- """
- if self.is_running():
- self._stop_event.set()
- for thread in self._threads:
- if self._use_multiprocessing:
- if thread.is_alive():
- thread.terminate()
- else:
- thread.join(timeout)
- if self._manager:
- self._manager.shutdown()
-
- self._threads = []
- self._stop_event = None
- self.queue = None
-
- def get(self):
- """
- Creates a generator to extract data from the queue.
- Skip the data if it is `None`.
-
- # Yields
- tuple of data in the queue.
- """
- while self.is_running():
- if not self.queue.empty():
- inputs = self.queue.get()
- if inputs is not None:
- yield inputs
- else:
- time.sleep(self.wait_time)
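
A minimal usage sketch for `GeneratorEnqueuer` above; a stand-in counter replaces an endless data generator such as `train_generator` in reader.py:

```python
import itertools
from data_util import GeneratorEnqueuer

def counter():
    for i in itertools.count():
        yield i

enqueuer = GeneratorEnqueuer(counter(), use_multiprocessing=False)
enqueuer.start(workers=1, max_queue_size=10)
for item in itertools.islice(enqueuer.get(), 5):
    print(item)   # 0, 1, 2, 3, 4
enqueuer.stop()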
diff --git a/fluid/PaddleCV/face_detection/reader.py b/fluid/PaddleCV/face_detection/reader.py
deleted file mode 100644
index 2b38952d2d419ec5b658c762d2668f724dc92a09..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/face_detection/reader.py
+++ /dev/null
@@ -1,342 +0,0 @@
-# Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import image_util
-from paddle.utils.image_util import *
-from PIL import Image
-from PIL import ImageDraw
-import numpy as np
-import xml.etree.ElementTree
-import os
-import time
-import copy
-import random
-import cv2
-import six
-from data_util import GeneratorEnqueuer
-
-
-class Settings(object):
- def __init__(self,
- dataset=None,
- data_dir=None,
- label_file=None,
- resize_h=None,
- resize_w=None,
- mean_value=[104., 117., 123.],
- apply_distort=True,
- apply_expand=True,
- ap_version='11point',
- toy=0):
- self.dataset = dataset
- self.ap_version = ap_version
- self.toy = toy
- self.data_dir = data_dir
- self.apply_distort = apply_distort
- self.apply_expand = apply_expand
- self.resize_height = resize_h
- self.resize_width = resize_w
- self.img_mean = np.array(mean_value)[:, np.newaxis, np.newaxis].astype(
- 'float32')
- self.expand_prob = 0.5
- self.expand_max_ratio = 4
- self.hue_prob = 0.5
- self.hue_delta = 18
- self.contrast_prob = 0.5
- self.contrast_delta = 0.5
- self.saturation_prob = 0.5
- self.saturation_delta = 0.5
- self.brightness_prob = 0.5
- # _brightness_delta is the normalized value by 256
- self.brightness_delta = 0.125
- self.scale = 0.007843 # 1 / 127.5
- self.data_anchor_sampling_prob = 0.5
- self.min_face_size = 8.0
-
-
-def to_chw_bgr(image):
- """
- Transpose image from HWC to CHW and from RGB to BGR.
- Args:
- image (np.array): an image with HWC and RGB layout.
- """
- # HWC to CHW
- if len(image.shape) == 3:
- image = np.swapaxes(image, 1, 2)
- image = np.swapaxes(image, 1, 0)
- # RGB to BGR
- image = image[[2, 1, 0], :, :]
- return image
-
-
-def preprocess(img, bbox_labels, mode, settings, image_path):
- img_width, img_height = img.size
- sampled_labels = bbox_labels
- if mode == 'train':
- if settings.apply_distort:
- img = image_util.distort_image(img, settings)
- if settings.apply_expand:
- img, bbox_labels, img_width, img_height = image_util.expand_image(
- img, bbox_labels, img_width, img_height, settings)
-
- # sampling
- batch_sampler = []
-
- prob = np.random.uniform(0., 1.)
- if prob > settings.data_anchor_sampling_prob:
- scale_array = np.array([16, 32, 64, 128, 256, 512])
- batch_sampler.append(
- image_util.sampler(1, 10, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.2,
- 0.0, True))
- sampled_bbox = image_util.generate_batch_random_samples(
- batch_sampler, bbox_labels, img_width, img_height, scale_array,
- settings.resize_width, settings.resize_height)
- img = np.array(img)
- if len(sampled_bbox) > 0:
- idx = int(np.random.uniform(0, len(sampled_bbox)))
- img, sampled_labels = image_util.crop_image_sampling(
- img, bbox_labels, sampled_bbox[idx], img_width, img_height,
- settings.resize_width, settings.resize_height,
- settings.min_face_size)
-
- img = img.astype('uint8')
- img = Image.fromarray(img)
-
- else:
- # hard-code here
- batch_sampler.append(
- image_util.sampler(1, 50, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0,
- 0.0, True))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0,
- 0.0, True))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0,
- 0.0, True))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0,
- 0.0, True))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0,
- 0.0, True))
- sampled_bbox = image_util.generate_batch_samples(
- batch_sampler, bbox_labels, img_width, img_height)
-
- img = np.array(img)
- if len(sampled_bbox) > 0:
- idx = int(np.random.uniform(0, len(sampled_bbox)))
- img, sampled_labels = image_util.crop_image(
- img, bbox_labels, sampled_bbox[idx], img_width, img_height,
- settings.resize_width, settings.resize_height,
- settings.min_face_size)
-
- img = Image.fromarray(img)
- interp_mode = [
- Image.BILINEAR, Image.HAMMING, Image.NEAREST, Image.BICUBIC,
- Image.LANCZOS
- ]
- interp_indx = np.random.randint(0, 5)
-
- img = img.resize(
- (settings.resize_width, settings.resize_height),
- resample=interp_mode[interp_indx])
- img = np.array(img)
-
- if mode == 'train':
- mirror = int(np.random.uniform(0, 2))
- if mirror == 1:
- img = img[:, ::-1, :]
- for i in six.moves.xrange(len(sampled_labels)):
- tmp = sampled_labels[i][1]
- sampled_labels[i][1] = 1 - sampled_labels[i][3]
- sampled_labels[i][3] = 1 - tmp
-
- img = to_chw_bgr(img)
- img = img.astype('float32')
- img -= settings.img_mean
- img = img * settings.scale
- return img, sampled_labels
-
-
-def load_file_list(input_txt):
- with open(input_txt, 'r') as f_dir:
- lines_input_txt = f_dir.readlines()
-
- file_dict = {}
- num_class = 0
- for i in range(len(lines_input_txt)):
- line_txt = lines_input_txt[i].strip('\n\t\r')
- if '--' in line_txt:
- if i != 0:
- num_class += 1
- file_dict[num_class] = []
- file_dict[num_class].append(line_txt)
- if '--' not in line_txt:
- if len(line_txt) > 6:
- split_str = line_txt.split(' ')
- x1_min = float(split_str[0])
- y1_min = float(split_str[1])
- x2_max = float(split_str[2])
- y2_max = float(split_str[3])
- line_txt = str(x1_min) + ' ' + str(y1_min) + ' ' + str(
- x2_max) + ' ' + str(y2_max)
- file_dict[num_class].append(line_txt)
- else:
- file_dict[num_class].append(line_txt)
-
- return file_dict
-
-
-def expand_bboxes(bboxes,
- expand_left=2.,
- expand_up=2.,
- expand_right=2.,
- expand_down=2.):
- """
- Expand bboxes; with the default factor of 2, each side grows by half the box size.
- """
- expand_boxes = []
- for bbox in bboxes:
- xmin = bbox[0]
- ymin = bbox[1]
- xmax = bbox[2]
- ymax = bbox[3]
- w = xmax - xmin
- h = ymax - ymin
- ex_xmin = max(xmin - w / expand_left, 0.)
- ex_ymin = max(ymin - h / expand_up, 0.)
- ex_xmax = min(xmax + w / expand_right, 1.)
- ex_ymax = min(ymax + h / expand_down, 1.)
- expand_boxes.append([ex_xmin, ex_ymin, ex_xmax, ex_ymax])
- return expand_boxes
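
With the default factors, `expand_bboxes` grows each normalized box by half of its width/height on every side, which is how the head boxes for the Pyramid Anchors supervision are derived from face boxes:

```python
from reader import expand_bboxes  # the helper defined above

face = [[0.40, 0.40, 0.60, 0.60]]   # normalized xmin, ymin, xmax, ymax
print(expand_bboxes(face))          # ≈ [[0.3, 0.3, 0.7, 0.7]]
```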
-
-
-def train_generator(settings, file_list, batch_size, shuffle=True):
- file_dict = load_file_list(file_list)
- while True:
- image_indices = list(file_dict.keys())
- if shuffle:
- # np.random.shuffle cannot shuffle a dict; shuffle the key list instead
- np.random.shuffle(image_indices)
- batch_out = []
- for index_image in image_indices:
- image_name = file_dict[index_image][0]
- image_path = os.path.join(settings.data_dir, image_name)
- im = Image.open(image_path)
- if im.mode == 'L':
- im = im.convert('RGB')
- im_width, im_height = im.size
-
- # layout: label | xmin | ymin | xmax | ymax
- bbox_labels = []
- for index_box in range(len(file_dict[index_image])):
- if index_box >= 2:
- bbox_sample = []
- temp_info_box = file_dict[index_image][index_box].split(' ')
- xmin = float(temp_info_box[0])
- ymin = float(temp_info_box[1])
- w = float(temp_info_box[2])
- h = float(temp_info_box[3])
-
- # Filter out wrong labels
- if w < 0 or h < 0:
- continue
- xmax = xmin + w
- ymax = ymin + h
-
- bbox_sample.append(1)
- bbox_sample.append(float(xmin) / im_width)
- bbox_sample.append(float(ymin) / im_height)
- bbox_sample.append(float(xmax) / im_width)
- bbox_sample.append(float(ymax) / im_height)
- bbox_labels.append(bbox_sample)
- im, sample_labels = preprocess(im, bbox_labels, "train", settings,
- image_path)
- sample_labels = np.array(sample_labels)
- if len(sample_labels) == 0: continue
-
- im = im.astype('float32')
- face_box = sample_labels[:, 1:5]
- head_box = expand_bboxes(face_box)
- label = [1] * len(face_box)
- batch_out.append((im, face_box, head_box, label))
- if len(batch_out) == batch_size:
- yield batch_out
- batch_out = []
-
-
-def train(settings,
- file_list,
- batch_size,
- shuffle=True,
- use_multiprocessing=True,
- num_workers=8,
- max_queue=24):
- def reader():
- try:
- enqueuer = GeneratorEnqueuer(
- train_generator(settings, file_list, batch_size, shuffle),
- use_multiprocessing=use_multiprocessing)
- enqueuer.start(max_queue_size=max_queue, workers=num_workers)
- generator_output = None
- while True:
- while enqueuer.is_running():
- if not enqueuer.queue.empty():
- generator_output = enqueuer.queue.get()
- break
- else:
- time.sleep(0.01)
- yield generator_output
- generator_output = None
- finally:
- if enqueuer is not None:
- enqueuer.stop()
-
- return reader
-
-
-def test(settings, file_list):
- file_dict = load_file_list(file_list)
-
- def reader():
- for index_image in file_dict.keys():
- image_name = file_dict[index_image][0]
- image_path = os.path.join(settings.data_dir, image_name)
- im = Image.open(image_path)
- if im.mode == 'L':
- im = im.convert('RGB')
- yield im, image_path
-
- return reader
-
-
-def infer(settings, image_path):
- def batch_reader():
- img = Image.open(image_path)
- if img.mode == 'L':
- img = img.convert('RGB')
- im_width, im_height = img.size
- if settings.resize_width and settings.resize_height:
- img = img.resize((settings.resize_width, settings.resize_height),
- Image.ANTIALIAS)
- img = np.array(img)
- img = to_chw_bgr(img)
- img = img.astype('float32')
- img -= settings.img_mean
- img = img * settings.scale
- return np.array([img])
-
- return batch_reader
diff --git a/fluid/PaddleCV/face_detection/train.py b/fluid/PaddleCV/face_detection/train.py
deleted file mode 100644
index 67cec03b95ba5ffe1a5230c287bd12a49b90bb34..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/face_detection/train.py
+++ /dev/null
@@ -1,233 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os
-import shutil
-import numpy as np
-import time
-import argparse
-import functools
-
-import paddle
-import paddle.fluid as fluid
-from pyramidbox import PyramidBox
-import reader
-from utility import add_arguments, print_arguments
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-
-# yapf: disable
-add_arg('parallel', bool, True, "Whether use multi-GPU/threads or not.")
-add_arg('learning_rate', float, 0.001, "The start learning rate.")
-add_arg('batch_size', int, 16, "Minibatch size.")
-add_arg('epoc_num', int, 160, "Epoch number.")
-add_arg('use_gpu', bool, True, "Whether use GPU.")
-add_arg('use_pyramidbox', bool, True, "Whether use PyramidBox model.")
-add_arg('model_save_dir', str, 'output', "The path to save model.")
-add_arg('resize_h', int, 640, "The resized image height.")
-add_arg('resize_w', int, 640, "The resized image width.")
-add_arg('mean_BGR', str, '104., 117., 123.', "Mean value for B,G,R channel which will be subtracted.")
-add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
-add_arg('pretrained_model', str, './vgg_ilsvrc_16_fc_reduced/', "The init model path.")
-add_arg('data_dir', str, 'data', "The base dir of dataset")
-#yapf: enable
-
-train_parameters = {
- "train_images": 12880,
- "image_shape": [3, 640, 640],
- "class_num": 2,
- "batch_size": 16,
- "lr": 0.001,
- "lr_epochs": [99, 124, 149],
- "lr_decay": [1, 0.1, 0.01, 0.001],
- "epoc_num": 160,
- "optimizer_method": "momentum",
- "use_pyramidbox": True
-}
-
-def optimizer_setting(train_params):
- batch_size = train_params["batch_size"]
- iters = train_params["train_images"] // batch_size
- lr = train_params["lr"]
- optimizer_method = train_params["optimizer_method"]
- boundaries = [i * iters for i in train_params["lr_epochs"]]
- values = [i * lr for i in train_params["lr_decay"]]
-
- if optimizer_method == "momentum":
- optimizer = fluid.optimizer.Momentum(
- learning_rate=fluid.layers.piecewise_decay(boundaries, values),
- momentum=0.9,
- regularization=fluid.regularizer.L2Decay(0.0005),
- )
- else:
- optimizer = fluid.optimizer.RMSProp(
- learning_rate=fluid.layers.piecewise_decay(boundaries, values),
- regularization=fluid.regularizer.L2Decay(0.0005),
- )
- return optimizer
-
-
-def build_program(train_params, main_prog, startup_prog, args):
- use_pyramidbox = train_params["use_pyramidbox"]
- image_shape = train_params["image_shape"]
- class_num = train_params["class_num"]
- with fluid.program_guard(main_prog, startup_prog):
- py_reader = fluid.layers.py_reader(
- capacity=8,
- shapes=[[-1] + image_shape, [-1, 4], [-1, 4], [-1, 1]],
- lod_levels=[0, 1, 1, 1],
- dtypes=["float32", "float32", "float32", "int32"],
- use_double_buffer=True)
- with fluid.unique_name.guard():
- image, face_box, head_box, gt_label = fluid.layers.read_file(py_reader)
- fetches = []
- network = PyramidBox(image=image,
- face_box=face_box,
- head_box=head_box,
- gt_label=gt_label,
- sub_network=use_pyramidbox)
- if use_pyramidbox:
- face_loss, head_loss, loss = network.train()
- fetches = [face_loss, head_loss]
- else:
- loss = network.vgg_ssd_loss()
- fetches = [loss]
- optimizer = optimizer_setting(train_params)
- optimizer.minimize(loss)
- return py_reader, fetches, loss
-
-def train(args, config, train_params, train_file_list):
- batch_size = train_params["batch_size"]
- epoc_num = train_params["epoc_num"]
- optimizer_method = train_params["optimizer_method"]
- use_pyramidbox = train_params["use_pyramidbox"]
-
- use_gpu = args.use_gpu
- model_save_dir = args.model_save_dir
- pretrained_model = args.pretrained_model
- with_memory_optimization = args.with_mem_opt
-
- devices = os.getenv("CUDA_VISIBLE_DEVICES") or ""
- devices_num = len(devices.split(","))
- batch_size_per_device = batch_size // devices_num
- iters_per_epoc = train_params["train_images"] // batch_size
- num_workers = 8
- is_shuffle = True
-
- startup_prog = fluid.Program()
- train_prog = fluid.Program()
-
- train_py_reader, fetches, loss = build_program(
- train_params = train_params,
- main_prog = train_prog,
- startup_prog = startup_prog,
- args=args)
-
- if with_memory_optimization:
- fluid.memory_optimize(train_prog)
-
- place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(startup_prog)
-
- start_epoc = 0
- if pretrained_model:
- if pretrained_model.isdigit():
- start_epoc = int(pretrained_model) + 1
- pretrained_model = os.path.join(model_save_dir, pretrained_model)
- print("Resume from %s " %(pretrained_model))
-
- if not os.path.exists(pretrained_model):
- raise ValueError("The pre-trained model path [%s] does not exist." %
- (pretrained_model))
- def if_exist(var):
- return os.path.exists(os.path.join(pretrained_model, var.name))
- fluid.io.load_vars(
- exe, pretrained_model, main_program=train_prog, predicate=if_exist)
- train_reader = reader.train(config,
- train_file_list,
- batch_size_per_device,
- shuffle = is_shuffle,
- use_multiprocessing=True,
- num_workers = num_workers,
- max_queue=24)
- train_py_reader.decorate_paddle_reader(train_reader)
-
- if args.parallel:
- train_exe = fluid.ParallelExecutor(
- main_program = train_prog,
- use_cuda=use_gpu,
- loss_name=loss.name)
-
- def save_model(postfix, program):
- model_path = os.path.join(model_save_dir, postfix)
- if os.path.isdir(model_path):
- shutil.rmtree(model_path)
-
- print('save models to %s' % (model_path))
- fluid.io.save_persistables(exe, model_path, main_program=program)
-
- train_py_reader.start()
- try:
- for pass_id in range(start_epoc, epoc_num):
- start_time = time.time()
- prev_start_time = start_time
- end_time = 0
- batch_id = 0
- for batch_id in range(iters_per_epoc):
- prev_start_time = start_time
- start_time = time.time()
- if args.parallel:
- fetch_vars = train_exe.run(fetch_list=
- [v.name for v in fetches])
- else:
- fetch_vars = exe.run(train_prog,
- fetch_list=fetches)
- end_time = time.time()
- fetch_vars = [np.mean(np.array(v)) for v in fetch_vars]
- if batch_id % 10 == 0:
- if not args.use_pyramidbox:
- print("Pass {:d}, batch {:d}, loss {:.6f}, time {:.5f}".format(
- pass_id, batch_id, fetch_vars[0],
- start_time - prev_start_time))
- else:
- print("Pass {:d}, batch {:d}, face loss {:.6f}, " \
- "head loss {:.6f}, " \
- "time {:.5f}".format(pass_id,
- batch_id, fetch_vars[0], fetch_vars[1],
- start_time - prev_start_time))
- if pass_id % 1 == 0 or pass_id == epoc_num - 1:
- save_model(str(pass_id), train_prog)
- except fluid.core.EOFException:
- train_py_reader.reset()
- except StopIteration:
- train_py_reader.reset()
- train_py_reader.reset()
-
-if __name__ == '__main__':
- args = parser.parse_args()
- print_arguments(args)
-
- data_dir = os.path.join(args.data_dir, 'WIDER_train/images/')
- train_file_list = os.path.join(args.data_dir,
- 'wider_face_split/wider_face_train_bbx_gt.txt')
- mean_BGR = [float(m) for m in args.mean_BGR.split(",")]
- image_shape = [3, int(args.resize_h), int(args.resize_w)]
- train_parameters["image_shape"] = image_shape
- train_parameters["use_pyramidbox"] = args.use_pyramidbox
- train_parameters["batch_size"] = args.batch_size
- train_parameters["lr"] = args.learning_rate
- train_parameters["epoc_num"] = args.epoc_num
-
-
- config = reader.Settings(
- data_dir=data_dir,
- resize_h=image_shape[1],
- resize_w=image_shape[2],
- apply_distort=True,
- apply_expand=False,
- mean_value=mean_BGR,
- ap_version='11point')
- train(args, config, train_parameters, train_file_list)
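
For reference, the step boundaries and learning-rate values that `optimizer_setting` above derives from the defaults (12880 training images, batch size 16, `lr_epochs` [99, 124, 149]) work out as follows:

```python
iters = 12880 // 16                              # 805 iterations per epoch
boundaries = [e * iters for e in [99, 124, 149]]
values = [m * 0.001 for m in [1, 0.1, 0.01, 0.001]]
print(boundaries)                                # [79695, 99820, 119945]
print(values)                                    # ~[0.001, 0.0001, 1e-05, 1e-06]
```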
diff --git a/fluid/PaddleCV/face_detection/widerface_eval.py b/fluid/PaddleCV/face_detection/widerface_eval.py
deleted file mode 100644
index 1544442c78c38bcbcb537cd81374f5c72c7bfc5a..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/face_detection/widerface_eval.py
+++ /dev/null
@@ -1,316 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os
-import time
-import numpy as np
-import argparse
-import functools
-from PIL import Image
-
-import paddle.fluid as fluid
-import reader
-from pyramidbox import PyramidBox
-from visualize import draw_bboxes
-from utility import add_arguments, print_arguments
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-
-# yapf: disable
-add_arg('use_gpu', bool, True, "Whether use GPU or not.")
-add_arg('use_pyramidbox', bool, True, "Whether use PyramidBox model.")
-add_arg('data_dir', str, 'data/WIDER_val/images/', "The validation dataset path.")
-add_arg('model_dir', str, '', "The model path.")
-add_arg('pred_dir', str, 'pred', "The path to save the evaluation results.")
-add_arg('file_list', str, 'data/wider_face_split/wider_face_val_bbx_gt.txt', "The validation dataset path.")
-add_arg('infer', bool, False, "Whether do infer or eval.")
-add_arg('confs_threshold', float, 0.15, "Confidence threshold to draw bbox.")
-add_arg('image_path', str, '', "The image used to inference and visualize.")
-# yapf: enable
-
-
-def infer(args, config):
- model_dir = args.model_dir
- pred_dir = args.pred_dir
- if not os.path.exists(model_dir):
- raise ValueError("The model path [%s] does not exist." % (model_dir))
-
- if args.infer:
- image_path = args.image_path
- image = Image.open(image_path)
- if image.mode == 'L':
- image = image.convert('RGB')
- shrink, max_shrink = get_shrink(image.size[1], image.size[0])
-
- det0 = detect_face(image, shrink)
- det1 = flip_test(image, shrink)
- [det2, det3] = multi_scale_test(image, max_shrink)
- det4 = multi_scale_test_pyramid(image, max_shrink)
- det = np.row_stack((det0, det1, det2, det3, det4))
- dets = bbox_vote(det)
-
- keep_index = np.where(dets[:, 4] >= args.confs_threshold)[0]
- dets = dets[keep_index, :]
- draw_bboxes(image_path, dets[:, 0:4])
- else:
- test_reader = reader.test(config, args.file_list)
- for image, image_path in test_reader():
- shrink, max_shrink = get_shrink(image.size[1], image.size[0])
-
- det0 = detect_face(image, shrink)
- det1 = flip_test(image, shrink)
- [det2, det3] = multi_scale_test(image, max_shrink)
- det4 = multi_scale_test_pyramid(image, max_shrink)
- det = np.row_stack((det0, det1, det2, det3, det4))
- dets = bbox_vote(det)
-
- save_widerface_bboxes(image_path, dets, pred_dir)
-
- print("Finish evaluation.")
-
-
-def save_widerface_bboxes(image_path, bboxes_scores, output_dir):
- """
- Save predicted results, including bbox and score into text file.
- Args:
- image_path (string): file name.
- bboxes_scores (np.array|list): the predicted bboxed and scores, layout
- is (xmin, ymin, xmax, ymax, score)
- output_dir (string): output directory.
- """
- image_name = image_path.split('/')[-1]
- image_class = image_path.split('/')[-2]
-
- odir = os.path.join(output_dir, image_class)
- if not os.path.exists(odir):
- os.makedirs(odir)
-
- ofname = os.path.join(odir, '%s.txt' % (image_name[:-4]))
- f = open(ofname, 'w')
- f.write('{:s}\n'.format(image_class + '/' + image_name))
- f.write('{:d}\n'.format(bboxes_scores.shape[0]))
- for box_score in bboxes_scores:
- xmin, ymin, xmax, ymax, score = box_score
- f.write('{:.1f} {:.1f} {:.1f} {:.1f} {:.3f}\n'.format(xmin, ymin, (
- xmax - xmin + 1), (ymax - ymin + 1), score))
- f.close()
- print("The predicted result is saved as {}".format(ofname))
-
-
-def detect_face(image, shrink):
- image_shape = [3, image.size[1], image.size[0]]
- if shrink != 1:
- h, w = int(image_shape[1] * shrink), int(image_shape[2] * shrink)
- image = image.resize((w, h), Image.ANTIALIAS)
- image_shape = [3, h, w]
-
- img = np.array(image)
- img = reader.to_chw_bgr(img)
- mean = [104., 117., 123.]
- scale = 0.007843
- img = img.astype('float32')
- img -= np.array(mean)[:, np.newaxis, np.newaxis].astype('float32')
- img = img * scale
- img = [img]
- img = np.array(img)
-
- detection, = exe.run(infer_program,
- feed={'image': img},
- fetch_list=fetches,
- return_numpy=False)
- detection = np.array(detection)
- # layout: xmin, ymin, xmax, ymax, score
- if detection.shape == (1, ):
- print("No face detected")
- return np.array([[0, 0, 0, 0, 0]])
- det_conf = detection[:, 1]
- det_xmin = image_shape[2] * detection[:, 2] / shrink
- det_ymin = image_shape[1] * detection[:, 3] / shrink
- det_xmax = image_shape[2] * detection[:, 4] / shrink
- det_ymax = image_shape[1] * detection[:, 5] / shrink
-
- det = np.column_stack((det_xmin, det_ymin, det_xmax, det_ymax, det_conf))
- return det
-
-
-def bbox_vote(det):
- order = det[:, 4].ravel().argsort()[::-1]
- det = det[order, :]
- if det.shape[0] == 0:
- dets = np.array([[10, 10, 20, 20, 0.002]])
- det = np.empty(shape=[0, 5])
- while det.shape[0] > 0:
- # IOU
- area = (det[:, 2] - det[:, 0] + 1) * (det[:, 3] - det[:, 1] + 1)
- xx1 = np.maximum(det[0, 0], det[:, 0])
- yy1 = np.maximum(det[0, 1], det[:, 1])
- xx2 = np.minimum(det[0, 2], det[:, 2])
- yy2 = np.minimum(det[0, 3], det[:, 3])
- w = np.maximum(0.0, xx2 - xx1 + 1)
- h = np.maximum(0.0, yy2 - yy1 + 1)
- inter = w * h
- o = inter / (area[0] + area[:] - inter)
-
- # nms
- merge_index = np.where(o >= 0.3)[0]
- det_accu = det[merge_index, :]
- det = np.delete(det, merge_index, 0)
- if merge_index.shape[0] <= 1:
- if det.shape[0] == 0:
- try:
- dets = np.row_stack((dets, det_accu))
- except:
- dets = det_accu
- continue
- det_accu[:, 0:4] = det_accu[:, 0:4] * np.tile(det_accu[:, -1:], (1, 4))
- max_score = np.max(det_accu[:, 4])
- det_accu_sum = np.zeros((1, 5))
- det_accu_sum[:, 0:4] = np.sum(det_accu[:, 0:4],
- axis=0) / np.sum(det_accu[:, -1:])
- det_accu_sum[:, 4] = max_score
- try:
- dets = np.row_stack((dets, det_accu_sum))
- except:
- dets = det_accu_sum
- dets = dets[0:750, :]
- return dets
-
-
-def flip_test(image, shrink):
- img = image.transpose(Image.FLIP_LEFT_RIGHT)
- det_f = detect_face(img, shrink)
- det_t = np.zeros(det_f.shape)
- # image.size: [width, height]
- det_t[:, 0] = image.size[0] - det_f[:, 2]
- det_t[:, 1] = det_f[:, 1]
- det_t[:, 2] = image.size[0] - det_f[:, 0]
- det_t[:, 3] = det_f[:, 3]
- det_t[:, 4] = det_f[:, 4]
- return det_t
-
-
-def multi_scale_test(image, max_shrink):
- # Detection on a shrunken image is only used to find big faces
- st = 0.5 if max_shrink >= 0.75 else 0.5 * max_shrink
- det_s = detect_face(image, st)
- index = np.where(
- np.maximum(det_s[:, 2] - det_s[:, 0] + 1, det_s[:, 3] - det_s[:, 1] + 1)
- > 30)[0]
- det_s = det_s[index, :]
- # Enlarge the image once (up to 2x)
- bt = min(2, max_shrink) if max_shrink > 1 else (st + max_shrink) / 2
- det_b = detect_face(image, bt)
-
- # Keep doubling the image up to max_shrink to find small faces
- if max_shrink > 2:
- bt *= 2
- while bt < max_shrink:
- det_b = np.row_stack((det_b, detect_face(image, bt)))
- bt *= 2
- det_b = np.row_stack((det_b, detect_face(image, max_shrink)))
-
- # Enlarged images are only used to detect small faces.
- if bt > 1:
- index = np.where(
- np.minimum(det_b[:, 2] - det_b[:, 0] + 1,
- det_b[:, 3] - det_b[:, 1] + 1) < 100)[0]
- det_b = det_b[index, :]
- # Shrunken images are only used to detect big faces.
- else:
- index = np.where(
- np.maximum(det_b[:, 2] - det_b[:, 0] + 1,
- det_b[:, 3] - det_b[:, 1] + 1) > 30)[0]
- det_b = det_b[index, :]
- return det_s, det_b
-
-
-def multi_scale_test_pyramid(image, max_shrink):
- # Use image pyramids to detect faces
- det_b = detect_face(image, 0.25)
- index = np.where(
- np.maximum(det_b[:, 2] - det_b[:, 0] + 1, det_b[:, 3] - det_b[:, 1] + 1)
- > 30)[0]
- det_b = det_b[index, :]
-
- st = [0.75, 1.25, 1.5, 1.75]
- for i in range(len(st)):
- if (st[i] <= max_shrink):
- det_temp = detect_face(image, st[i])
- # Enlarged images are only used to detect small faces.
- if st[i] > 1:
- index = np.where(
- np.minimum(det_temp[:, 2] - det_temp[:, 0] + 1,
- det_temp[:, 3] - det_temp[:, 1] + 1) < 100)[0]
- det_temp = det_temp[index, :]
- # Shrunken images are only used to detect big faces.
- else:
- index = np.where(
- np.maximum(det_temp[:, 2] - det_temp[:, 0] + 1,
- det_temp[:, 3] - det_temp[:, 1] + 1) > 30)[0]
- det_temp = det_temp[index, :]
- det_b = np.row_stack((det_b, det_temp))
- return det_b
-
-
-def get_shrink(height, width):
- """
- Args:
- height (int): image height.
- width (int): image width.
- """
- # avoid out of memory
- max_shrink_v1 = (0x7fffffff / 577.0 / (height * width))**0.5
- max_shrink_v2 = ((678 * 1024 * 2.0 * 2.0) / (height * width))**0.5
-
- def get_round(x, loc):
- str_x = str(x)
- if '.' in str_x:
- str_before, str_after = str_x.split('.')
- len_after = len(str_after)
- if len_after >= 3:
- str_final = str_before + '.' + str_after[0:loc]
- return float(str_final)
- else:
- return x
- return x # no decimal point: nothing to truncate
-
- max_shrink = get_round(min(max_shrink_v1, max_shrink_v2), 2) - 0.3
- if max_shrink >= 1.5 and max_shrink < 2:
- max_shrink = max_shrink - 0.1
- elif max_shrink >= 2 and max_shrink < 3:
- max_shrink = max_shrink - 0.2
- elif max_shrink >= 3 and max_shrink < 4:
- max_shrink = max_shrink - 0.3
- elif max_shrink >= 4 and max_shrink < 5:
- max_shrink = max_shrink - 0.4
- elif max_shrink >= 5:
- max_shrink = max_shrink - 0.5
-
- shrink = max_shrink if max_shrink < 1 else 1
- return shrink, max_shrink
-
-
-if __name__ == '__main__':
- args = parser.parse_args()
- print_arguments(args)
- config = reader.Settings(data_dir=args.data_dir)
-
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- main_program = fluid.Program()
- startup_program = fluid.Program()
- image_shape = [3, 1024, 1024]
- with fluid.program_guard(main_program, startup_program):
- network = PyramidBox(
- data_shape=image_shape,
- sub_network=args.use_pyramidbox,
- is_infer=True)
- infer_program, nmsed_out = network.infer(main_program)
- fetches = [nmsed_out]
- fluid.io.load_persistables(
- exe, args.model_dir, main_program=infer_program)
- # save model and program
- #fluid.io.save_inference_model('pyramidbox_model',
- # ['image'], [nmsed_out], exe, main_program=infer_program,
- # model_filename='model', params_filename='params')
- infer(args, config)
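
A toy check of the corner-inclusive IoU used in `bbox_vote` above (hence the `+ 1` terms):

```python
a = [0., 0., 9., 9.]     # a 10x10 box (corner-inclusive coordinates)
b = [5., 5., 14., 14.]   # a 10x10 box overlapping a 5x5 corner

inter = (min(a[2], b[2]) - max(a[0], b[0]) + 1) * \
        (min(a[3], b[3]) - max(a[1], b[1]) + 1)
union = 100 + 100 - inter
print(inter / union)     # 25 / 175 ≈ 0.143
```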
diff --git a/fluid/PaddleCV/faster_rcnn/README.md b/fluid/PaddleCV/faster_rcnn/README.md
deleted file mode 100644
index b15ad3b51917dac6b4b4ed35b017fa3ed2a44bc5..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/README.md
+++ /dev/null
@@ -1,142 +0,0 @@
-# Faster RCNN Object Detection
-
----
-## Table of Contents
-
-- [Installation](#installation)
-- [Introduction](#introduction)
-- [Data preparation](#data-preparation)
-- [Training](#training)
-- [Evaluation](#evaluation)
-- [Inference and Visualization](#inference-and-visualization)
-- [Appendix](#appendix)
-
-## Installation
-
-Running the sample code in this directory requires PaddlePaddle Fluid v1.0.0 or later. If the PaddlePaddle on your device is lower than this version, please follow the instructions in the [installation document](http://www.paddlepaddle.org/documentation/docs/zh/0.15.0/beginners_guide/install/install_doc.html#paddlepaddle) and make an update.
-
-## Introduction
-
-[Faster Rcnn](https://arxiv.org/abs/1506.01497) is a typical two-stage detector. The overall network can be divided into four parts, as shown below:
-
-
-Faster RCNN model
-
-
-1. Base conv layer. As a CNN-based object detector, Faster RCNN first extracts feature maps with a base convolutional network; these feature maps are then shared by the RPN and the fc layers. This sample uses [ResNet-50](https://arxiv.org/abs/1512.03385) as the base conv layer.
-2. Region Proposal Network (RPN). The RPN generates proposals for detection. It generates anchors from a set of sizes and ratios, classifies each anchor as foreground or background with softmax, and then refines the anchors with box regression to obtain more precise proposals.
-3. RoI Align. This layer takes the feature maps and proposals as input, maps the proposals onto the feature maps and pools them to the same size; the output is sent to fc layers for classification and regression. Either RoIPool or RoIAlign can be used for this layer, selected via roi\_func in config.py.
-4. Detection layer. The output of RoI pooling is used to compute the class and location of each proposal with two fc layers.
-
-## Data preparation
-
-Train the model on [MS-COCO dataset](http://cocodataset.org/#download), download dataset as below:
-
- cd dataset/coco
- ./download.sh
-
-
-## Training
-
-After data preparation, one can start the training step by:
-
- python train.py \
- --model_save_dir=output/ \
- --pretrained_model=${path_to_pretrain_model} \
- --data_dir=${path_to_data}
-
-- Set ```export CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7``` to specify 8 GPUs for training.
-- For more help on arguments:
-
- python train.py --help
-
-**Download the pre-trained model:** This sample provides a ResNet-50 pre-trained model converted from Caffe, with the parameters of the batch normalization layers fused. Download the pre-trained model with:
-
- sh ./pretrained/download.sh
-
-Set `pretrained_model` to load the pre-trained model; the same parameter is also used to load a trained model when fine-tuning.
-Please make sure the pre-trained model is downloaded and loaded correctly, otherwise the loss may become NaN during training.
-
-**Install the [cocoapi](https://github.com/cocodataset/cocoapi):**
-
-To train the model, [cocoapi](https://github.com/cocodataset/cocoapi) is needed. Install the cocoapi:
-
- # COCOAPI=/path/to/clone/cocoapi
- git clone https://github.com/cocodataset/cocoapi.git $COCOAPI
- cd $COCOAPI/PythonAPI
- # if cython is not installed
- pip install Cython
- # Install into global site-packages
- make install
- # Alternatively, if you do not have permissions or prefer
- # not to install the COCO API into global site-packages
- python2 setup.py install --user
-
-**data reader introduction:**
-
-* Data reader is defined in `reader.py`.
-* The short side of each image is scaled to `scales`; if the long side then exceeds `max_size`, the long side is scaled down to `max_size`.
-* In the training stage, images are horizontally flipped.
-* Images in the same batch can be padded to the same size.
-
-**model configuration:**
-
-* Either RoIAlign or RoIPool can be used.
-* NMS threshold=0.7. During training, pre\_nms=12000, post\_nms=2000; during test, pre\_nms=6000, post\_nms=1000.
-* In generating proposal labels, fg\_fraction=0.25, fg\_thresh=0.5, bg\_thresh_hi=0.5, bg\_thresh\_lo=0.0.
-* In rpn target assignment, rpn\_fg\_fraction=0.5, rpn\_positive\_overlap=0.7, rpn\_negative\_overlap=0.3.
-
-**training strategy:**
-
-* Use momentum optimizer with momentum=0.9.
-* Weight decay is 0.0001.
-* In the first 500 iterations, the learning rate increases linearly from 0.00333 to 0.01; it is then decayed at iterations 120000 and 160000 with multipliers 0.1 and 0.01, and the maximum iteration is 180000. We also released a 2x model trained for 360000 iterations, with the learning rate decayed at 240000 and 320000. These settings can be changed via max_iter and lr_steps in config.py (see the schedule sketch after this list).
-* In non-base convolutional layers, the learning rate of the bias is set to twice the global learning rate.
-* In the base convolutional layers, the parameters of the affine layers and the res body are not updated.
-* With 8 Nvidia Tesla V100 GPUs, the total training time is about 40 hours.
-
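A sketch (not from this repo) of the schedule just described: linear warm-up over the first 500 iterations, then piecewise decay at 120000 and 160000:

```python
def lr_at(it, base=0.01, warmup_start=0.00333, warmup_iters=500,
          steps=(120000, 160000), gamma=0.1):
    if it < warmup_iters:
        return warmup_start + (base - warmup_start) * it / warmup_iters
    return base * gamma ** sum(it >= s for s in steps)

for it in (0, 250, 499, 120000, 160000):
    print(it, lr_at(it))   # ~0.00333, ~0.00667, ~0.01, 0.001, 0.0001
```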
-## Evaluation
-
-Evaluation measures the performance of a trained model. This sample provides `eval_coco_map.py`, which uses the COCO-specific mAP metric defined by the [COCO committee](http://cocodataset.org/#detections-eval).
-
-`eval_coco_map.py` is the main executor for evaluation; start evaluation with:
-
- python eval_coco_map.py \
- --dataset=coco2017 \
- --pretrained_model=${path_to_pretrain_model}
-
-- Set ```export CUDA_VISIBLE_DEVICES=0``` to specify one GPU for evaluation.
-
-Evaluation results are shown below:
-
-| Model | RoI function | Batch size | Max iteration | mAP |
-| :--------------- | :--------: | :------------: | :------------------: |------: |
-| [Fluid RoIPool minibatch padding](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_pool_minibatch_padding.tar.gz) | RoIPool | 8 | 180000 | 0.314 |
-| [Fluid RoIPool no padding](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_pool_no_padding.tar.gz) | RoIPool | 8 | 180000 | 0.316 |
-| [Fluid RoIAlign no padding](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_align_no_padding.tar.gz) | RoIAlign | 8 | 180000 | 0.345 |
-| [Fluid RoIAlign no padding 2x](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_align_no_padding_2x.tar.gz) | RoIAlign | 8 | 360000 | 0.364 |
-
-* Fluid RoIPool minibatch padding: Use RoIPool. Images in one batch are padded to the same size. This method is the same as Detectron's.
-* Fluid RoIPool no padding: Images without padding.
-* Fluid RoIAlign no padding: Images without padding.
-* Fluid RoIAlign no padding 2x: Images without padding, train for 360000 iterations, learning rate is decayed at 240000, 320000.
-
-## Inference and Visualization
-
-Inference is used to get prediction scores or image features from trained models. `infer.py` is the main executor for inference; start inference with:
-
- python infer.py \
- --dataset=coco2017 \
- --pretrained_model=${path_to_pretrain_model} \
- --image_path=data/COCO17/val2017/ \
- --image_name=000000000139.jpg \
- --draw_threshold=0.6
-
-Visualization of infer result is shown as below:
-
-
-
-
-
-Faster RCNN Visualization Examples
-
diff --git a/fluid/PaddleCV/faster_rcnn/README_cn.md b/fluid/PaddleCV/faster_rcnn/README_cn.md
deleted file mode 100644
index 75265fe91491aec90b47b85a3c478dbcdee1683d..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/README_cn.md
+++ /dev/null
@@ -1,138 +0,0 @@
-# Faster RCNN Object Detection
-
----
-## Table of Contents
-
-- [Installation](#installation)
-- [Introduction](#introduction)
-- [Data preparation](#data-preparation)
-- [Training](#training)
-- [Evaluation](#evaluation)
-- [Inference and Visualization](#inference-and-visualization)
-- [Appendix](#appendix)
-
-## Installation
-
-Running the sample code in this directory requires PaddlePaddle Fluid v1.0.0 or later. If the PaddlePaddle on your device is lower than this version, please follow the instructions in the [installation document](http://www.paddlepaddle.org/documentation/docs/zh/0.15.0/beginners_guide/install/install_doc.html#paddlepaddle) to update it.
-
-## Introduction
-
-[Faster Rcnn](https://arxiv.org/abs/1506.01497) is a typical two-stage object detector. As shown in the figure below, the overall network consists of four main parts:
-
-
-Faster RCNN object detection model
-
-
-1. Base conv layer. As a CNN-based object detector, Faster RCNN first extracts feature maps of the image with a base convolutional network; the feature maps are shared by the subsequent RPN and fully connected layers. This sample uses [ResNet-50](https://arxiv.org/abs/1512.03385) as the base conv layer.
-2. Region Proposal Network (RPN). The RPN generates candidate regions (proposals). It derives a set of anchors from fixed sizes and ratios, classifies each anchor as foreground or background with softmax, and refines the anchors with box regression to obtain precise proposals.
-3. RoI Align. This layer takes the feature maps and proposals as input, maps the proposals onto the feature maps and pools them into region features of a uniform size, which are sent to fully connected layers to classify the object category. Either RoIPool or RoIAlign can be used here, configured via roi\_func in config.py.
-4. Detection layer. The region features are used to compute the category of each proposal, and another round of box regression yields the final precise position of the detection box.
-
-## Data preparation
-
-Train on the [MS-COCO dataset](http://cocodataset.org/#download); download the dataset as follows:
-
- cd dataset/coco
- ./download.sh
-
-## Training
-
-Once the data is ready, start training with:
-
- python train.py \
- --model_save_dir=output/ \
- --pretrained_model=${path_to_pretrain_model} \
- --data_dir=${path_to_data}
-
-- Set export CUDA\_VISIBLE\_DEVICES=0,1,2,3,4,5,6,7 to train on 8 GPUs.
-- For the available options, see:
-
- python train.py --help
-
-**Download the pre-trained model:** this sample provides a ResNet-50 pre-trained model converted from Caffe, with the parameters of the batch normalization layers fused. Download it with:
-
- sh ./pretrained/download.sh
-
-Load the pre-trained model by setting `pretrained_model`; the same setting is also used to load an already-trained model when fine-tuning.
-Please make sure the pre-trained model is downloaded and loaded correctly before training, otherwise the loss may become NaN.
-
-**Install the [cocoapi](https://github.com/cocodataset/cocoapi):**
-
-The [cocoapi](https://github.com/cocodataset/cocoapi) must be installed before training:
-
- # COCOAPI=/path/to/clone/cocoapi
- git clone https://github.com/cocodataset/cocoapi.git $COCOAPI
- cd $COCOAPI/PythonAPI
- # if cython is not installed
- pip install Cython
- # Install into global site-packages
- make install
- # Alternatively, if you do not have permissions or prefer
- # not to install the COCO API into global site-packages
- python2 setup.py install --user
-
-**Data reader:** the data reader is defined in reader.py. The short side of each image is scaled to `scales`; if the long side then exceeds `max_size`, the long side is scaled down to `max_size`. In the training stage, images are horizontally flipped. Images in the same batch can be padded to the same size.
-
-**Model configuration:**
-
-* Either RoIAlign or RoIPool can be used.
-* During training, pre\_nms=12000 and post\_nms=2000; during test, pre\_nms=6000 and post\_nms=1000. The NMS threshold is 0.7.
-* When the RPN generates proposal labels, fg\_fraction=0.25, fg\_thresh=0.5, bg\_thresh_hi=0.5, bg\_thresh\_lo=0.0
-* When the RPN selects anchors, rpn\_fg\_fraction=0.5, rpn\_positive\_overlap=0.7, rpn\_negative\_overlap=0.3
-
-
-**Training strategy:**
-
-* Faster RCNN is trained with the momentum optimizer, momentum=0.9.
-* The weight decay is 0.0001. In the first 500 iterations the learning rate increases linearly from 0.00333 to 0.01, then decays by factors 0.1 and 0.01 at iterations 120000 and 160000; training runs for at most 180000 iterations. We also provide a 2x model trained for 360000 iterations with the learning rate decayed at 240000 and 320000, other settings unchanged. The maximum iteration count and learning-rate schedule can be set via max_iter and lr_steps in config.py.
-* In non-base convolutional layers, the learning rate of the bias is twice the global learning rate.
-* In the base convolutional layers, the parameters of the affine layers and of the res2 stage are not updated.
-* Training on 8 Nvidia Tesla V100 GPUs in parallel takes about 40 hours in total.
-
-## Evaluation
-
-Evaluation measures the performance of a trained model. This sample uses the [official COCO evaluation](http://cocodataset.org/#detections-eval).
-
-`eval_coco_map.py` is the main entry point for evaluation; a sample invocation:
-
- python eval_coco_map.py \
- --dataset=coco2017 \
- --pretrained_model=${path_to_pretrain_model}
-
-- Set export CUDA\_VISIBLE\_DEVICES=0 to evaluate on a single GPU.
-
-The table below shows the evaluation results:
-
-| Model | RoI function | Batch size | Max iteration | mAP |
-| :--------------- | :--------: | :------------: | :------------------: |------: |
-| [Fluid RoIPool minibatch padding](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_pool_minibatch_padding.tar.gz) | RoIPool | 8 | 180000 | 0.314 |
-| [Fluid RoIPool no padding](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_pool_no_padding.tar.gz) | RoIPool | 8 | 180000 | 0.316 |
-| [Fluid RoIAlign no padding](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_align_no_padding.tar.gz) | RoIAlign | 8 | 180000 | 0.345 |
-| [Fluid RoIAlign no padding 2x](http://paddlemodels.bj.bcebos.com/faster_rcnn/model_align_no_padding_2x.tar.gz) | RoIAlign | 8 | 360000 | 0.364 |
-
-
-
-* Fluid RoIPool minibatch padding: uses RoIPool; images in the same batch are padded to the same size. This matches Detectron's behavior.
-* Fluid RoIPool no padding: uses RoIPool without padding the images.
-* Fluid RoIAlign no padding: uses RoIAlign without padding the images.
-* Fluid RoIAlign no padding 2x: uses RoIAlign without padding; trained for 360000 iterations with the learning rate decayed at 240000 and 320000.
-
-## Inference and visualization
-
-Inference returns the objects in an image together with their classes. `infer.py` is the main entry point and is invoked as follows:
-
- python infer.py \
- --dataset=coco2017 \
- --pretrained_model=${path_to_pretrain_model} \
- --image_path=data/COCO17/val2017/ \
- --image_name=000000000139.jpg \
- --draw_threshold=0.6
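-
-For every detection whose score exceeds `draw_threshold`, the box and class label are drawn with PIL; the sketch below is condensed from draw_bounding_box_on_image in eval_helper.py (the function name here is illustrative):
-
-    from PIL import Image, ImageDraw
-
-    def draw_boxes(image_path, detections, threshold, label_list):
-        # each detection is (xmin, ymin, xmax, ymax, score, category_id)
-        image = Image.open(image_path)
-        draw = ImageDraw.Draw(image)
-        for xmin, ymin, xmax, ymax, score, category_id in detections:
-            if score < threshold:
-                continue
-            draw.line([(xmin, ymin), (xmin, ymax), (xmax, ymax),
-                       (xmax, ymin), (xmin, ymin)], width=4, fill='red')
-            draw.text((xmin, ymin), label_list[int(category_id)], (255, 255, 0))
-        return image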
-
-The image below shows a visualized prediction:
-
-
-
-
-
-Faster RCNN visualized prediction
-
diff --git a/fluid/PaddleCV/faster_rcnn/box_utils.py b/fluid/PaddleCV/faster_rcnn/box_utils.py
deleted file mode 100644
index 64d7d96948b856f4ae5c28594e9fb19a3a18480e..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/box_utils.py
+++ /dev/null
@@ -1,125 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Based on:
-# --------------------------------------------------------
-# Detectron
-# Copyright (c) 2017-present, Facebook, Inc.
-# Licensed under the Apache License, Version 2.0;
-# Written by Ross Girshick
-# --------------------------------------------------------
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-from __future__ import unicode_literals
-
-import numpy as np
-
-
-def xywh_to_xyxy(xywh):
- """Convert [x1 y1 w h] box format to [x1 y1 x2 y2] format."""
- if isinstance(xywh, (list, tuple)):
- # Single box given as a list of coordinates
- assert len(xywh) == 4
- x1, y1 = xywh[0], xywh[1]
- x2 = x1 + np.maximum(0., xywh[2] - 1.)
- y2 = y1 + np.maximum(0., xywh[3] - 1.)
- return (x1, y1, x2, y2)
- elif isinstance(xywh, np.ndarray):
- # Multiple boxes given as a 2D ndarray
- return np.hstack(
- (xywh[:, 0:2], xywh[:, 0:2] + np.maximum(0, xywh[:, 2:4] - 1)))
- else:
- raise TypeError('Argument xywh must be a list, tuple, or numpy array.')
-
-
-def xyxy_to_xywh(xyxy):
- """Convert [x1 y1 x2 y2] box format to [x1 y1 w h] format."""
- if isinstance(xyxy, (list, tuple)):
- # Single box given as a list of coordinates
- assert len(xyxy) == 4
- x1, y1 = xyxy[0], xyxy[1]
- w = xyxy[2] - x1 + 1
- h = xyxy[3] - y1 + 1
- return (x1, y1, w, h)
- elif isinstance(xyxy, np.ndarray):
- # Multiple boxes given as a 2D ndarray
- return np.hstack((xyxy[:, 0:2], xyxy[:, 2:4] - xyxy[:, 0:2] + 1))
- else:
- raise TypeError('Argument xyxy must be a list, tuple, or numpy array.')
-
-
-def clip_xyxy_to_image(x1, y1, x2, y2, height, width):
- """Clip coordinates to an image with the given height and width."""
- x1 = np.minimum(width - 1., np.maximum(0., x1))
- y1 = np.minimum(height - 1., np.maximum(0., y1))
- x2 = np.minimum(width - 1., np.maximum(0., x2))
- y2 = np.minimum(height - 1., np.maximum(0., y2))
- return x1, y1, x2, y2
-
-def nms(dets, thresh):
- """Apply classic DPM-style greedy NMS."""
- if dets.shape[0] == 0:
- return []
- x1 = dets[:, 0]
- y1 = dets[:, 1]
- x2 = dets[:, 2]
- y2 = dets[:, 3]
- scores = dets[:, 4]
-
- areas = (x2 - x1 + 1) * (y2 - y1 + 1)
- order = scores.argsort()[::-1]
-
- ndets = dets.shape[0]
-    suppressed = np.zeros((ndets), dtype=int)
-
- # nominal indices
- # _i, _j
- # sorted indices
- # i, j
- # temp variables for box i's (the box currently under consideration)
- # ix1, iy1, ix2, iy2, iarea
-
- # variables for computing overlap with box j (lower scoring box)
- # xx1, yy1, xx2, yy2
- # w, h
- # inter, ovr
-
- for _i in range(ndets):
- i = order[_i]
- if suppressed[i] == 1:
- continue
- ix1 = x1[i]
- iy1 = y1[i]
- ix2 = x2[i]
- iy2 = y2[i]
- iarea = areas[i]
- for _j in range(_i + 1, ndets):
- j = order[_j]
- if suppressed[j] == 1:
- continue
- xx1 = max(ix1, x1[j])
- yy1 = max(iy1, y1[j])
- xx2 = min(ix2, x2[j])
- yy2 = min(iy2, y2[j])
- w = max(0.0, xx2 - xx1 + 1)
- h = max(0.0, yy2 - yy1 + 1)
- inter = w * h
- ovr = inter / (iarea + areas[j] - inter)
- if ovr >= thresh:
- suppressed[j] = 1
-
- return np.where(suppressed == 0)[0]
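-
-# Example (illustrative): of two heavily overlapping boxes, the lower-scoring
-# one is suppressed at IoU threshold 0.5:
-#   dets = np.array([[0, 0, 10, 10, 0.9], [1, 1, 9, 9, 0.8]], dtype=np.float32)
-#   nms(dets, 0.5)  # -> array([0])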
-
diff --git a/fluid/PaddleCV/faster_rcnn/config.py b/fluid/PaddleCV/faster_rcnn/config.py
deleted file mode 100644
index 44b35f7509eeb1adf316e3e725aef8a729bf6499..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/config.py
+++ /dev/null
@@ -1,229 +0,0 @@
-# Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved.
-#Licensed under the Apache License, Version 2.0 (the "License");
-#you may not use this file except in compliance with the License.
-#You may obtain a copy of the License at
-# http://www.apache.org/licenses/LICENSE-2.0
-#Unless required by applicable law or agreed to in writing, software
-#distributed under the License is distributed on an "AS IS" BASIS,
-#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-#See the License for the specific language governing permissions and
-#limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-from __future__ import unicode_literals
-from edict import AttrDict
-import six
-import numpy as np
-
-_C = AttrDict()
-cfg = _C
-
-#
-# Training options
-#
-_C.TRAIN = AttrDict()
-
-# scales an image's shortest side
-_C.TRAIN.scales = [800]
-
-# max size of longest side
-_C.TRAIN.max_size = 1333
-
-# images per GPU in minibatch
-_C.TRAIN.im_per_batch = 1
-
-# roi minibatch size per image
-_C.TRAIN.batch_size_per_im = 512
-
-# target fraction of foreground roi minibatch
-_C.TRAIN.fg_fraction = 0.25
-
-# overlap threshold for a foreground roi
-_C.TRAIN.fg_thresh = 0.5
-
-# overlap threshold for a background roi
-_C.TRAIN.bg_thresh_hi = 0.5
-_C.TRAIN.bg_thresh_lo = 0.0
-
-# If False, images are only resized (not padded), so image shapes differ
-# across GPUs within one mini-batch. If True, all images in a mini-batch
-# are padded to the same shape.
-_C.TRAIN.padding_minibatch = False
-
-# Snapshot period
-_C.TRAIN.snapshot_iter = 10000
-
-# number of RPN proposals to keep before NMS
-_C.TRAIN.rpn_pre_nms_top_n = 12000
-
-# number of RPN proposals to keep after NMS
-_C.TRAIN.rpn_post_nms_top_n = 2000
-
-# NMS threshold used on RPN proposals
-_C.TRAIN.rpn_nms_thresh = 0.7
-
-# min size in RPN proposals
-_C.TRAIN.rpn_min_size = 0.0
-
-# eta for adaptive NMS in RPN
-_C.TRAIN.rpn_eta = 1.0
-
-# number of RPN examples per image
-_C.TRAIN.rpn_batch_size_per_im = 256
-
-# remove anchors out of the image
-_C.TRAIN.rpn_straddle_thresh = 0.
-
-# target fraction of foreground examples per RPN minibatch
-_C.TRAIN.rpn_fg_fraction = 0.5
-
-# min overlap between an anchor and a gt box for a positive example
-_C.TRAIN.rpn_positive_overlap = 0.7
-
-# max overlap between an anchor and a gt box for a negative example
-_C.TRAIN.rpn_negative_overlap = 0.3
-
-# stopgrad at a specified stage
-_C.TRAIN.freeze_at = 2
-
-# min area of ground truth box
-_C.TRAIN.gt_min_area = -1
-
-#
-# Inference options
-#
-_C.TEST = AttrDict()
-
-# scales an image's shortest side
-_C.TEST.scales = [800]
-
-# max size of longest side
-_C.TEST.max_size = 1333
-
-# eta for adaptive NMS in RPN
-_C.TEST.rpn_eta = 1.0
-
-# min score threshold to infer
-_C.TEST.score_thresh = 0.05
-
-# overlap threshold used for NMS
-_C.TEST.nms_thresh = 0.5
-
-# number of RPN proposals to keep before NMS
-_C.TEST.rpn_pre_nms_top_n = 6000
-
-# number of RPN proposals to keep after NMS
-_C.TEST.rpn_post_nms_top_n = 1000
-
-# min size in RPN proposals
-_C.TEST.rpn_min_size = 0.0
-
-# max number of detections
-_C.TEST.detections_per_im = 100
-
-# NMS threshold used on RPN proposals
-_C.TEST.rpn_nms_thresh = 0.7
-
-#
-# Model options
-#
-
-# weight for bbox regression targets
-_C.bbox_reg_weights = [0.1, 0.1, 0.2, 0.2]
-
-# RPN anchor sizes
-_C.anchor_sizes = [32, 64, 128, 256, 512]
-
-# RPN anchor ratio
-_C.aspect_ratio = [0.5, 1, 2]
-
-# variance of anchors
-_C.variances = [1., 1., 1., 1.]
-
-# stride of feature map
-_C.rpn_stride = [16.0, 16.0]
-
-# Use roi pool or roi align, 'RoIPool' or 'RoIAlign'
-_C.roi_func = 'RoIAlign'
-
-# sampling ratio for roi align
-_C.sampling_ratio = 0
-
-# pooled width and pooled height
-_C.roi_resolution = 14
-
-# spatial scale
-_C.spatial_scale = 1. / 16.
-
-#
-# SOLVER options
-#
-
-# base learning rate; the final learning rate is derived from it
-_C.learning_rate = 0.01
-
-# maximum number of iterations, 1x: 180000, 2x:360000
-_C.max_iter = 180000
-#_C.max_iter = 360000
-
-# warm up to learning rate
-_C.warm_up_iter = 500
-_C.warm_up_factor = 1. / 3.
-
-# lr steps_with_decay, 1x: [120000, 160000], 2x: [240000, 320000]
-_C.lr_steps = [120000, 160000]
-#_C.lr_steps = [240000, 320000]
-_C.lr_gamma = 0.1
-
-# L2 regularization hyperparameter
-_C.weight_decay = 0.0001
-
-# momentum with SGD
-_C.momentum = 0.9
-
-#
-# ENV options
-#
-
-# support both CPU and GPU
-_C.use_gpu = True
-
-# Whether use parallel
-_C.parallel = True
-
-# Class number
-_C.class_num = 81
-
-# support pyreader
-_C.use_pyreader = True
-
-# pixel mean values
-_C.pixel_means = [102.9801, 115.9465, 122.7717]
-
-# clip box to prevent overflowing
-_C.bbox_clip = np.log(1000. / 16.)
-
-# dataset path
-_C.train_file_list = 'annotations/instances_train2017.json'
-_C.train_data_dir = 'train2017'
-_C.val_file_list = 'annotations/instances_val2017.json'
-_C.val_data_dir = 'val2017'
-
-
-def merge_cfg_from_args(args, mode):
- """Merge config keys, values in args into the global config."""
- if mode == 'train':
- sub_d = _C.TRAIN
- else:
- sub_d = _C.TEST
- for k, v in sorted(six.iteritems(vars(args))):
- d = _C
- try:
- value = eval(v)
- except:
- value = v
- if k in sub_d:
- sub_d[k] = value
- else:
- d[k] = value
diff --git a/fluid/PaddleCV/faster_rcnn/data_utils.py b/fluid/PaddleCV/faster_rcnn/data_utils.py
deleted file mode 100644
index 12858f1b1037ed0f8a0e47d7f1b0c2490c767623..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/data_utils.py
+++ /dev/null
@@ -1,83 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Based on:
-# --------------------------------------------------------
-# Detectron
-# Copyright (c) 2017-present, Facebook, Inc.
-# Licensed under the Apache License, Version 2.0;
-# Written by Ross Girshick
-# --------------------------------------------------------
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-from __future__ import unicode_literals
-
-import cv2
-import numpy as np
-from config import cfg
-
-
-def get_image_blob(roidb, mode):
- """Builds an input blob from the images in the roidb at the specified
- scales.
- """
- if mode == 'train':
- scales = cfg.TRAIN.scales
- scale_ind = np.random.randint(0, high=len(scales))
- target_size = scales[scale_ind]
- max_size = cfg.TRAIN.max_size
- else:
- target_size = cfg.TEST.scales[0]
- max_size = cfg.TEST.max_size
- im = cv2.imread(roidb['image'])
- assert im is not None, \
- 'Failed to read image \'{}\''.format(roidb['image'])
- if roidb['flipped']:
- im = im[:, ::-1, :]
- im, im_scale = prep_im_for_blob(im, cfg.pixel_means, target_size, max_size)
-
- return im, im_scale
-
-
-def prep_im_for_blob(im, pixel_means, target_size, max_size):
-    """Prepare an image for use as a network input blob. Specifically:
-      - Subtract per-channel pixel mean
-      - Convert to float32
-      - Rescale to the specified target size (capped at max_size)
-    Returns the transformed image together with the scale factor that was
-    used to resize it.
-    """
- im = im.astype(np.float32, copy=False)
- im -= pixel_means
-
- im_shape = im.shape
- im_size_min = np.min(im_shape[0:2])
- im_size_max = np.max(im_shape[0:2])
- im_scale = float(target_size) / float(im_size_min)
- # Prevent the biggest axis from being more than max_size
- if np.round(im_scale * im_size_max) > max_size:
- im_scale = float(max_size) / float(im_size_max)
- im = cv2.resize(
- im,
- None,
- None,
- fx=im_scale,
- fy=im_scale,
- interpolation=cv2.INTER_LINEAR)
- im_height, im_width, channel = im.shape
- channel_swap = (2, 0, 1) #(batch, channel, height, width)
- im = im.transpose(channel_swap)
- return im, im_scale
diff --git a/fluid/PaddleCV/faster_rcnn/eval_coco_map.py b/fluid/PaddleCV/faster_rcnn/eval_coco_map.py
deleted file mode 100644
index f8c755a3d0f880a47791f1c43aa161cfa0e5ff98..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/eval_coco_map.py
+++ /dev/null
@@ -1,104 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
-#
-#Licensed under the Apache License, Version 2.0 (the "License");
-#you may not use this file except in compliance with the License.
-#You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-#Unless required by applicable law or agreed to in writing, software
-#distributed under the License is distributed on an "AS IS" BASIS,
-#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-#See the License for the specific language governing permissions and
-#limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import os
-import time
-import numpy as np
-from eval_helper import get_nmsed_box
-from eval_helper import get_dt_res
-import paddle
-import paddle.fluid as fluid
-import reader
-from utility import print_arguments, parse_args
-import models.model_builder as model_builder
-import models.resnet as resnet
-import json
-from pycocotools.coco import COCO
-from pycocotools.cocoeval import COCOeval, Params
-from config import cfg
-
-
-def eval():
- if '2014' in cfg.dataset:
- test_list = 'annotations/instances_val2014.json'
- elif '2017' in cfg.dataset:
- test_list = 'annotations/instances_val2017.json'
-
- image_shape = [3, cfg.TEST.max_size, cfg.TEST.max_size]
- class_nums = cfg.class_num
- devices = os.getenv("CUDA_VISIBLE_DEVICES") or ""
- devices_num = len(devices.split(","))
- total_batch_size = devices_num * cfg.TRAIN.im_per_batch
- cocoGt = COCO(os.path.join(cfg.data_dir, test_list))
- numId_to_catId_map = {i + 1: v for i, v in enumerate(cocoGt.getCatIds())}
- category_ids = cocoGt.getCatIds()
- label_list = {
- item['id']: item['name']
- for item in cocoGt.loadCats(category_ids)
- }
-    label_list[0] = 'background'
-
- model = model_builder.FasterRCNN(
- add_conv_body_func=resnet.add_ResNet50_conv4_body,
- add_roi_box_head_func=resnet.add_ResNet_roi_conv5_head,
- use_pyreader=False,
- is_train=False)
- model.build_model(image_shape)
- rpn_rois, confs, locs = model.eval_out()
- place = fluid.CUDAPlace(0) if cfg.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- # yapf: disable
- if cfg.pretrained_model:
- def if_exist(var):
- return os.path.exists(os.path.join(cfg.pretrained_model, var.name))
- fluid.io.load_vars(exe, cfg.pretrained_model, predicate=if_exist)
- # yapf: enable
- test_reader = reader.test(total_batch_size)
- feeder = fluid.DataFeeder(place=place, feed_list=model.feeds())
-
- dts_res = []
- fetch_list = [rpn_rois, confs, locs]
- for batch_id, batch_data in enumerate(test_reader()):
- start = time.time()
- im_info = []
- for data in batch_data:
- im_info.append(data[1])
- rpn_rois_v, confs_v, locs_v = exe.run(
- fetch_list=[v.name for v in fetch_list],
- feed=feeder.feed(batch_data),
- return_numpy=False)
- new_lod, nmsed_out = get_nmsed_box(rpn_rois_v, confs_v, locs_v,
- class_nums, im_info,
- numId_to_catId_map)
-
- dts_res += get_dt_res(total_batch_size, new_lod, nmsed_out, batch_data)
- end = time.time()
- print('batch id: {}, time: {}'.format(batch_id, end - start))
- with open("detection_result.json", 'w') as outfile:
- json.dump(dts_res, outfile)
- print("start evaluate using coco api")
- cocoDt = cocoGt.loadRes("detection_result.json")
- cocoEval = COCOeval(cocoGt, cocoDt, 'bbox')
- cocoEval.evaluate()
- cocoEval.accumulate()
- cocoEval.summarize()
-
-
-if __name__ == '__main__':
- args = parse_args()
- print_arguments(args)
- eval()
diff --git a/fluid/PaddleCV/faster_rcnn/eval_helper.py b/fluid/PaddleCV/faster_rcnn/eval_helper.py
deleted file mode 100644
index ec8449b791f792a230a8b81dac5dc183a47f0dd4..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/eval_helper.py
+++ /dev/null
@@ -1,172 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
-#
-#Licensed under the Apache License, Version 2.0 (the "License");
-#you may not use this file except in compliance with the License.
-#You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-#Unless required by applicable law or agreed to in writing, software
-#distributed under the License is distributed on an "AS IS" BASIS,
-#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-#See the License for the specific language governing permissions and
-#limitations under the License.
-
-import os
-import numpy as np
-import paddle.fluid as fluid
-import math
-import box_utils
-from PIL import Image
-from PIL import ImageDraw
-from PIL import ImageFont
-from config import cfg
-
-
-def box_decoder(target_box, prior_box, prior_box_var):
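-    # Decode variance-weighted regression deltas (dx, dy, dw, dh) against the
-    # RoI priors: centers are shifted by dx * w and dy * h, sizes are scaled
-    # by exp(dw) and exp(dh); dw and dh are clipped to cfg.bbox_clip so that
-    # exp() cannot overflow.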
- proposals = np.zeros_like(target_box, dtype=np.float32)
- prior_box_loc = np.zeros_like(prior_box, dtype=np.float32)
- prior_box_loc[:, 0] = prior_box[:, 2] - prior_box[:, 0] + 1.
- prior_box_loc[:, 1] = prior_box[:, 3] - prior_box[:, 1] + 1.
- prior_box_loc[:, 2] = (prior_box[:, 2] + prior_box[:, 0]) / 2
- prior_box_loc[:, 3] = (prior_box[:, 3] + prior_box[:, 1]) / 2
- pred_bbox = np.zeros_like(target_box, dtype=np.float32)
- for i in range(prior_box.shape[0]):
- dw = np.minimum(prior_box_var[2] * target_box[i, 2::4], cfg.bbox_clip)
- dh = np.minimum(prior_box_var[3] * target_box[i, 3::4], cfg.bbox_clip)
- pred_bbox[i, 0::4] = prior_box_var[0] * target_box[
- i, 0::4] * prior_box_loc[i, 0] + prior_box_loc[i, 2]
- pred_bbox[i, 1::4] = prior_box_var[1] * target_box[
- i, 1::4] * prior_box_loc[i, 1] + prior_box_loc[i, 3]
- pred_bbox[i, 2::4] = np.exp(dw) * prior_box_loc[i, 0]
- pred_bbox[i, 3::4] = np.exp(dh) * prior_box_loc[i, 1]
- proposals[:, 0::4] = pred_bbox[:, 0::4] - pred_bbox[:, 2::4] / 2
- proposals[:, 1::4] = pred_bbox[:, 1::4] - pred_bbox[:, 3::4] / 2
- proposals[:, 2::4] = pred_bbox[:, 0::4] + pred_bbox[:, 2::4] / 2 - 1
- proposals[:, 3::4] = pred_bbox[:, 1::4] + pred_bbox[:, 3::4] / 2 - 1
-
- return proposals
-
-
-def clip_tiled_boxes(boxes, im_shape):
- """Clip boxes to image boundaries. im_shape is [height, width] and boxes
- has shape (N, 4 * num_tiled_boxes)."""
- assert boxes.shape[1] % 4 == 0, \
- 'boxes.shape[1] is {:d}, but must be divisible by 4.'.format(
- boxes.shape[1]
- )
- # x1 >= 0
- boxes[:, 0::4] = np.maximum(np.minimum(boxes[:, 0::4], im_shape[1] - 1), 0)
- # y1 >= 0
- boxes[:, 1::4] = np.maximum(np.minimum(boxes[:, 1::4], im_shape[0] - 1), 0)
- # x2 < im_shape[1]
- boxes[:, 2::4] = np.maximum(np.minimum(boxes[:, 2::4], im_shape[1] - 1), 0)
- # y2 < im_shape[0]
- boxes[:, 3::4] = np.maximum(np.minimum(boxes[:, 3::4], im_shape[0] - 1), 0)
- return boxes
-
-
-def get_nmsed_box(rpn_rois, confs, locs, class_nums, im_info,
- numId_to_catId_map):
- lod = rpn_rois.lod()[0]
- rpn_rois_v = np.array(rpn_rois)
- variance_v = np.array(cfg.bbox_reg_weights)
- confs_v = np.array(confs)
- locs_v = np.array(locs)
- rois = box_decoder(locs_v, rpn_rois_v, variance_v)
- im_results = [[] for _ in range(len(lod) - 1)]
- new_lod = [0]
- for i in range(len(lod) - 1):
- start = lod[i]
- end = lod[i + 1]
- if start == end:
- continue
- rois_n = rois[start:end, :]
- rois_n = rois_n / im_info[i][2]
- rois_n = clip_tiled_boxes(rois_n, im_info[i][:2])
-
- cls_boxes = [[] for _ in range(class_nums)]
- scores_n = confs_v[start:end, :]
- for j in range(1, class_nums):
- inds = np.where(scores_n[:, j] > cfg.TEST.score_thresh)[0]
- scores_j = scores_n[inds, j]
- rois_j = rois_n[inds, j * 4:(j + 1) * 4]
- dets_j = np.hstack((rois_j, scores_j[:, np.newaxis])).astype(
- np.float32, copy=False)
- keep = box_utils.nms(dets_j, cfg.TEST.nms_thresh)
- nms_dets = dets_j[keep, :]
- #add labels
- cat_id = numId_to_catId_map[j]
- label = np.array([cat_id for _ in range(len(keep))])
- nms_dets = np.hstack((nms_dets, label[:, np.newaxis])).astype(
- np.float32, copy=False)
- cls_boxes[j] = nms_dets
- # Limit to max_per_image detections **over all classes**
- image_scores = np.hstack(
- [cls_boxes[j][:, -2] for j in range(1, class_nums)])
-        if len(image_scores) > cfg.TEST.detections_per_im:
-            image_thresh = np.sort(image_scores)[-cfg.TEST.detections_per_im]
- for j in range(1, class_nums):
- keep = np.where(cls_boxes[j][:, -2] >= image_thresh)[0]
- cls_boxes[j] = cls_boxes[j][keep, :]
-
- im_results_n = np.vstack([cls_boxes[j] for j in range(1, class_nums)])
- im_results[i] = im_results_n
- new_lod.append(len(im_results_n) + new_lod[-1])
- boxes = im_results_n[:, :-2]
- scores = im_results_n[:, -2]
- labels = im_results_n[:, -1]
- im_results = np.vstack([im_results[k] for k in range(len(lod) - 1)])
- return new_lod, im_results
-
-
-def get_dt_res(batch_size, lod, nmsed_out, data):
- dts_res = []
- nmsed_out_v = np.array(nmsed_out)
- assert (len(lod) == batch_size + 1), \
- "Error Lod Tensor offset dimension. Lod({}) vs. batch_size({})"\
- .format(len(lod), batch_size)
- k = 0
- for i in range(batch_size):
- dt_num_this_img = lod[i + 1] - lod[i]
- image_id = int(data[i][-1])
- image_width = int(data[i][1][1])
-        image_height = int(data[i][1][0])
- for j in range(dt_num_this_img):
- dt = nmsed_out_v[k]
- k = k + 1
- xmin, ymin, xmax, ymax, score, category_id = dt.tolist()
- w = xmax - xmin + 1
- h = ymax - ymin + 1
- bbox = [xmin, ymin, w, h]
- dt_res = {
- 'image_id': image_id,
- 'category_id': category_id,
- 'bbox': bbox,
- 'score': score
- }
- dts_res.append(dt_res)
- return dts_res
-
-
-def draw_bounding_box_on_image(image_path, nms_out, draw_threshold, label_list):
- image = Image.open(image_path)
- draw = ImageDraw.Draw(image)
- im_width, im_height = image.size
-
- for dt in nms_out:
- xmin, ymin, xmax, ymax, score, category_id = dt.tolist()
- if score < draw_threshold:
- continue
- bbox = dt[:4]
- xmin, ymin, xmax, ymax = bbox
- draw.line(
- [(xmin, ymin), (xmin, ymax), (xmax, ymax), (xmax, ymin),
- (xmin, ymin)],
- width=4,
- fill='red')
- if image.mode == 'RGB':
- draw.text((xmin, ymin), label_list[int(category_id)], (255, 255, 0))
- image_name = image_path.split('/')[-1]
-    print("image with bbox drawn saved as {}".format(image_name))
- image.save(image_name)
diff --git a/fluid/PaddleCV/faster_rcnn/image/Faster_RCNN.jpg b/fluid/PaddleCV/faster_rcnn/image/Faster_RCNN.jpg
deleted file mode 100644
index c2ab8085c914979eb23a59734d54797b6580e956..0000000000000000000000000000000000000000
Binary files a/fluid/PaddleCV/faster_rcnn/image/Faster_RCNN.jpg and /dev/null differ
diff --git a/fluid/PaddleCV/faster_rcnn/infer.py b/fluid/PaddleCV/faster_rcnn/infer.py
deleted file mode 100644
index 3c7200f9de57bbd8d42df9dcb7d72c8fdca7e253..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/infer.py
+++ /dev/null
@@ -1,72 +0,0 @@
-import os
-import time
-import numpy as np
-from eval_helper import get_nmsed_box
-from eval_helper import get_dt_res
-from eval_helper import draw_bounding_box_on_image
-import paddle
-import paddle.fluid as fluid
-import reader
-from utility import print_arguments, parse_args
-import models.model_builder as model_builder
-import models.resnet as resnet
-import json
-from pycocotools.coco import COCO
-from pycocotools.cocoeval import COCOeval, Params
-from config import cfg
-
-
-def infer():
-
- if '2014' in cfg.dataset:
- test_list = 'annotations/instances_val2014.json'
- elif '2017' in cfg.dataset:
- test_list = 'annotations/instances_val2017.json'
-
- cocoGt = COCO(os.path.join(cfg.data_dir, test_list))
- numId_to_catId_map = {i + 1: v for i, v in enumerate(cocoGt.getCatIds())}
- category_ids = cocoGt.getCatIds()
- label_list = {
- item['id']: item['name']
- for item in cocoGt.loadCats(category_ids)
- }
-    label_list[0] = 'background'
- image_shape = [3, cfg.TEST.max_size, cfg.TEST.max_size]
- class_nums = cfg.class_num
-
- model = model_builder.FasterRCNN(
- add_conv_body_func=resnet.add_ResNet50_conv4_body,
- add_roi_box_head_func=resnet.add_ResNet_roi_conv5_head,
- use_pyreader=False,
- is_train=False)
- model.build_model(image_shape)
- rpn_rois, confs, locs = model.eval_out()
- place = fluid.CUDAPlace(0) if cfg.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- # yapf: disable
- if cfg.pretrained_model:
- def if_exist(var):
- return os.path.exists(os.path.join(cfg.pretrained_model, var.name))
- fluid.io.load_vars(exe, cfg.pretrained_model, predicate=if_exist)
- # yapf: enable
- infer_reader = reader.infer()
- feeder = fluid.DataFeeder(place=place, feed_list=model.feeds())
-
- dts_res = []
- fetch_list = [rpn_rois, confs, locs]
- data = next(infer_reader())
- im_info = [data[0][1]]
- rpn_rois_v, confs_v, locs_v = exe.run(
- fetch_list=[v.name for v in fetch_list],
- feed=feeder.feed(data),
- return_numpy=False)
- new_lod, nmsed_out = get_nmsed_box(rpn_rois_v, confs_v, locs_v, class_nums,
- im_info, numId_to_catId_map)
- path = os.path.join(cfg.image_path, cfg.image_name)
- draw_bounding_box_on_image(path, nmsed_out, cfg.draw_threshold, label_list)
-
-
-if __name__ == '__main__':
- args = parse_args()
- print_arguments(args)
- infer()
diff --git a/fluid/PaddleCV/faster_rcnn/models/model_builder.py b/fluid/PaddleCV/faster_rcnn/models/model_builder.py
deleted file mode 100644
index 9be2f330a62081107d57566962aadc32e1ac687a..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/models/model_builder.py
+++ /dev/null
@@ -1,307 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
-#
-#Licensed under the Apache License, Version 2.0 (the "License");
-#you may not use this file except in compliance with the License.
-#You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-#Unless required by applicable law or agreed to in writing, software
-#distributed under the License is distributed on an "AS IS" BASIS,
-#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-#See the License for the specific language governing permissions and
-#limitations under the License.
-
-import paddle.fluid as fluid
-from paddle.fluid.param_attr import ParamAttr
-from paddle.fluid.initializer import Constant
-from paddle.fluid.initializer import Normal
-from paddle.fluid.regularizer import L2Decay
-from config import cfg
-
-
-class FasterRCNN(object):
- def __init__(self,
- add_conv_body_func=None,
- add_roi_box_head_func=None,
- is_train=True,
- use_pyreader=True,
- use_random=True):
- self.add_conv_body_func = add_conv_body_func
- self.add_roi_box_head_func = add_roi_box_head_func
- self.is_train = is_train
- self.use_pyreader = use_pyreader
- self.use_random = use_random
- #self.py_reader = None
-
- def build_model(self, image_shape):
- self.build_input(image_shape)
- body_conv = self.add_conv_body_func(self.image)
- # RPN
- self.rpn_heads(body_conv)
- # Fast RCNN
- self.fast_rcnn_heads(body_conv)
-
- def loss(self):
- # Fast RCNN loss
- loss_cls, loss_bbox = self.fast_rcnn_loss()
- # RPN loss
- rpn_cls_loss, rpn_reg_loss = self.rpn_loss()
-        return loss_cls, loss_bbox, rpn_cls_loss, rpn_reg_loss
-
- def eval_out(self):
- cls_prob = fluid.layers.softmax(self.cls_score, use_cudnn=False)
- return [self.rpn_rois, cls_prob, self.bbox_pred]
-
- def build_input(self, image_shape):
- if self.use_pyreader:
- self.py_reader = fluid.layers.py_reader(
- capacity=64,
- shapes=[[-1] + image_shape, [-1, 4], [-1, 1], [-1, 1], [-1, 3],
- [-1, 1]],
- lod_levels=[0, 1, 1, 1, 0, 0],
- dtypes=[
- "float32", "float32", "int32", "int32", "float32", "int32"
- ],
- use_double_buffer=True)
- self.image, self.gt_box, self.gt_label, self.is_crowd, \
- self.im_info, self.im_id = fluid.layers.read_file(self.py_reader)
- else:
- self.image = fluid.layers.data(
- name='image', shape=image_shape, dtype='float32')
- self.gt_box = fluid.layers.data(
- name='gt_box', shape=[4], dtype='float32', lod_level=1)
- self.gt_label = fluid.layers.data(
- name='gt_label', shape=[1], dtype='int32', lod_level=1)
- self.is_crowd = fluid.layers.data(
- name='is_crowd',
- shape=[-1],
- dtype='int32',
- lod_level=1,
- append_batch_size=False)
- self.im_info = fluid.layers.data(
- name='im_info', shape=[3], dtype='float32')
- self.im_id = fluid.layers.data(
- name='im_id', shape=[1], dtype='int32')
-
- def feeds(self):
- if not self.is_train:
- return [self.image, self.im_info, self.im_id]
- return [
- self.image, self.gt_box, self.gt_label, self.is_crowd, self.im_info,
- self.im_id
- ]
-
- def rpn_heads(self, rpn_input):
- # RPN hidden representation
- dim_out = rpn_input.shape[1]
- rpn_conv = fluid.layers.conv2d(
- input=rpn_input,
- num_filters=dim_out,
- filter_size=3,
- stride=1,
- padding=1,
- act='relu',
- name='conv_rpn',
- param_attr=ParamAttr(
- name="conv_rpn_w", initializer=Normal(
- loc=0., scale=0.01)),
- bias_attr=ParamAttr(
- name="conv_rpn_b", learning_rate=2., regularizer=L2Decay(0.)))
- self.anchor, self.var = fluid.layers.anchor_generator(
- input=rpn_conv,
- anchor_sizes=cfg.anchor_sizes,
- aspect_ratios=cfg.aspect_ratio,
- variance=cfg.variances,
- stride=cfg.rpn_stride)
- num_anchor = self.anchor.shape[2]
- # Proposal classification scores
- self.rpn_cls_score = fluid.layers.conv2d(
- rpn_conv,
- num_filters=num_anchor,
- filter_size=1,
- stride=1,
- padding=0,
- act=None,
- name='rpn_cls_score',
- param_attr=ParamAttr(
- name="rpn_cls_logits_w", initializer=Normal(
- loc=0., scale=0.01)),
- bias_attr=ParamAttr(
- name="rpn_cls_logits_b",
- learning_rate=2.,
- regularizer=L2Decay(0.)))
- # Proposal bbox regression deltas
- self.rpn_bbox_pred = fluid.layers.conv2d(
- rpn_conv,
- num_filters=4 * num_anchor,
- filter_size=1,
- stride=1,
- padding=0,
- act=None,
- name='rpn_bbox_pred',
- param_attr=ParamAttr(
- name="rpn_bbox_pred_w", initializer=Normal(
- loc=0., scale=0.01)),
- bias_attr=ParamAttr(
- name="rpn_bbox_pred_b",
- learning_rate=2.,
- regularizer=L2Decay(0.)))
-
- rpn_cls_score_prob = fluid.layers.sigmoid(
- self.rpn_cls_score, name='rpn_cls_score_prob')
-
- param_obj = cfg.TRAIN if self.is_train else cfg.TEST
- pre_nms_top_n = param_obj.rpn_pre_nms_top_n
- post_nms_top_n = param_obj.rpn_post_nms_top_n
- nms_thresh = param_obj.rpn_nms_thresh
- min_size = param_obj.rpn_min_size
- eta = param_obj.rpn_eta
- rpn_rois, rpn_roi_probs = fluid.layers.generate_proposals(
- scores=rpn_cls_score_prob,
- bbox_deltas=self.rpn_bbox_pred,
- im_info=self.im_info,
- anchors=self.anchor,
- variances=self.var,
- pre_nms_top_n=pre_nms_top_n,
- post_nms_top_n=post_nms_top_n,
- nms_thresh=nms_thresh,
- min_size=min_size,
- eta=eta)
- self.rpn_rois = rpn_rois
- if self.is_train:
- outs = fluid.layers.generate_proposal_labels(
- rpn_rois=rpn_rois,
- gt_classes=self.gt_label,
- is_crowd=self.is_crowd,
- gt_boxes=self.gt_box,
- im_info=self.im_info,
- batch_size_per_im=cfg.TRAIN.batch_size_per_im,
-                fg_fraction=cfg.TRAIN.fg_fraction,
- fg_thresh=cfg.TRAIN.fg_thresh,
- bg_thresh_hi=cfg.TRAIN.bg_thresh_hi,
- bg_thresh_lo=cfg.TRAIN.bg_thresh_lo,
- bbox_reg_weights=cfg.bbox_reg_weights,
- class_nums=cfg.class_num,
- use_random=self.use_random)
-
- self.rois = outs[0]
- self.labels_int32 = outs[1]
- self.bbox_targets = outs[2]
- self.bbox_inside_weights = outs[3]
- self.bbox_outside_weights = outs[4]
-
- def fast_rcnn_heads(self, roi_input):
- if self.is_train:
- pool_rois = self.rois
- else:
- pool_rois = self.rpn_rois
- if cfg.roi_func == 'RoIPool':
- pool = fluid.layers.roi_pool(
- input=roi_input,
- rois=pool_rois,
- pooled_height=cfg.roi_resolution,
- pooled_width=cfg.roi_resolution,
- spatial_scale=cfg.spatial_scale)
- elif cfg.roi_func == 'RoIAlign':
- pool = fluid.layers.roi_align(
- input=roi_input,
- rois=pool_rois,
- pooled_height=cfg.roi_resolution,
- pooled_width=cfg.roi_resolution,
- spatial_scale=cfg.spatial_scale,
- sampling_ratio=cfg.sampling_ratio)
- rcnn_out = self.add_roi_box_head_func(pool)
- self.cls_score = fluid.layers.fc(input=rcnn_out,
- size=cfg.class_num,
- act=None,
- name='cls_score',
- param_attr=ParamAttr(
- name='cls_score_w',
- initializer=Normal(
- loc=0.0, scale=0.001)),
- bias_attr=ParamAttr(
- name='cls_score_b',
- learning_rate=2.,
- regularizer=L2Decay(0.)))
- self.bbox_pred = fluid.layers.fc(input=rcnn_out,
- size=4 * cfg.class_num,
- act=None,
- name='bbox_pred',
- param_attr=ParamAttr(
- name='bbox_pred_w',
- initializer=Normal(
- loc=0.0, scale=0.01)),
- bias_attr=ParamAttr(
- name='bbox_pred_b',
- learning_rate=2.,
- regularizer=L2Decay(0.)))
-
- def fast_rcnn_loss(self):
- labels_int64 = fluid.layers.cast(x=self.labels_int32, dtype='int64')
- labels_int64.stop_gradient = True
- #loss_cls = fluid.layers.softmax_with_cross_entropy(
- # logits=cls_score,
- # label=labels_int64
- # )
- cls_prob = fluid.layers.softmax(self.cls_score, use_cudnn=False)
- loss_cls = fluid.layers.cross_entropy(cls_prob, labels_int64)
- loss_cls = fluid.layers.reduce_mean(loss_cls)
- loss_bbox = fluid.layers.smooth_l1(
- x=self.bbox_pred,
- y=self.bbox_targets,
- inside_weight=self.bbox_inside_weights,
- outside_weight=self.bbox_outside_weights,
- sigma=1.0)
- loss_bbox = fluid.layers.reduce_mean(loss_bbox)
- return loss_cls, loss_bbox
-
- def rpn_loss(self):
- rpn_cls_score_reshape = fluid.layers.transpose(
- self.rpn_cls_score, perm=[0, 2, 3, 1])
- rpn_bbox_pred_reshape = fluid.layers.transpose(
- self.rpn_bbox_pred, perm=[0, 2, 3, 1])
-
- anchor_reshape = fluid.layers.reshape(self.anchor, shape=(-1, 4))
- var_reshape = fluid.layers.reshape(self.var, shape=(-1, 4))
-
- rpn_cls_score_reshape = fluid.layers.reshape(
- x=rpn_cls_score_reshape, shape=(0, -1, 1))
- rpn_bbox_pred_reshape = fluid.layers.reshape(
- x=rpn_bbox_pred_reshape, shape=(0, -1, 4))
- score_pred, loc_pred, score_tgt, loc_tgt, bbox_weight = \
- fluid.layers.rpn_target_assign(
- bbox_pred=rpn_bbox_pred_reshape,
- cls_logits=rpn_cls_score_reshape,
- anchor_box=anchor_reshape,
- anchor_var=var_reshape,
- gt_boxes=self.gt_box,
- is_crowd=self.is_crowd,
- im_info=self.im_info,
- rpn_batch_size_per_im=cfg.TRAIN.rpn_batch_size_per_im,
- rpn_straddle_thresh=cfg.TRAIN.rpn_straddle_thresh,
- rpn_fg_fraction=cfg.TRAIN.rpn_fg_fraction,
- rpn_positive_overlap=cfg.TRAIN.rpn_positive_overlap,
- rpn_negative_overlap=cfg.TRAIN.rpn_negative_overlap,
- use_random=self.use_random)
- score_tgt = fluid.layers.cast(x=score_tgt, dtype='float32')
- rpn_cls_loss = fluid.layers.sigmoid_cross_entropy_with_logits(
- x=score_pred, label=score_tgt)
- rpn_cls_loss = fluid.layers.reduce_mean(
- rpn_cls_loss, name='loss_rpn_cls')
-
- rpn_reg_loss = fluid.layers.smooth_l1(
- x=loc_pred,
- y=loc_tgt,
- sigma=3.0,
- inside_weight=bbox_weight,
- outside_weight=bbox_weight)
- rpn_reg_loss = fluid.layers.reduce_sum(
- rpn_reg_loss, name='loss_rpn_bbox')
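-        # Normalize the summed regression loss by the number of sampled
-        # score targets (foreground + background anchors).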
- score_shape = fluid.layers.shape(score_tgt)
- score_shape = fluid.layers.cast(x=score_shape, dtype='float32')
- norm = fluid.layers.reduce_prod(score_shape)
- norm.stop_gradient = True
- rpn_reg_loss = rpn_reg_loss / norm
-
- return rpn_cls_loss, rpn_reg_loss
diff --git a/fluid/PaddleCV/faster_rcnn/models/resnet.py b/fluid/PaddleCV/faster_rcnn/models/resnet.py
deleted file mode 100644
index e868a1506afe4124036d2ecef4acf83676ba02f9..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/models/resnet.py
+++ /dev/null
@@ -1,167 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
-#
-#Licensed under the Apache License, Version 2.0 (the "License");
-#you may not use this file except in compliance with the License.
-#You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-#Unless required by applicable law or agreed to in writing, software
-#distributed under the License is distributed on an "AS IS" BASIS,
-#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-#See the License for the specific language governing permissions and
-#limitations under the License.
-
-import paddle.fluid as fluid
-from paddle.fluid.param_attr import ParamAttr
-from paddle.fluid.initializer import Constant
-from paddle.fluid.regularizer import L2Decay
-from config import cfg
-
-
-def conv_bn_layer(input,
- ch_out,
- filter_size,
- stride,
- padding,
- act='relu',
- name=None):
- conv1 = fluid.layers.conv2d(
- input=input,
- num_filters=ch_out,
- filter_size=filter_size,
- stride=stride,
- padding=padding,
- act=None,
- param_attr=ParamAttr(name=name + "_weights"),
- bias_attr=ParamAttr(name=name + "_biases"),
- name=name + '.conv2d.output.1')
- if name == "conv1":
- bn_name = "bn_" + name
- else:
- bn_name = "bn" + name[3:]
-
- return fluid.layers.batch_norm(
- input=conv1,
- act=act,
- name=bn_name + '.output.1',
- param_attr=ParamAttr(name=bn_name + '_scale'),
- bias_attr=ParamAttr(bn_name + '_offset'),
- moving_mean_name=bn_name + '_mean',
- moving_variance_name=bn_name + '_variance',
- is_test=True)
-
-
-def conv_affine_layer(input,
- ch_out,
- filter_size,
- stride,
- padding,
- act='relu',
- name=None):
- conv = fluid.layers.conv2d(
- input=input,
- num_filters=ch_out,
- filter_size=filter_size,
- stride=stride,
- padding=padding,
- act=None,
- param_attr=ParamAttr(name=name + "_weights"),
- bias_attr=False,
- name=name + '.conv2d.output.1')
- if name == "conv1":
- bn_name = "bn_" + name
- else:
- bn_name = "bn" + name[3:]
-
- scale = fluid.layers.create_parameter(
- shape=[conv.shape[1]],
- dtype=conv.dtype,
- attr=ParamAttr(
- name=bn_name + '_scale', learning_rate=0.),
- default_initializer=Constant(1.))
- scale.stop_gradient = True
- bias = fluid.layers.create_parameter(
- shape=[conv.shape[1]],
- dtype=conv.dtype,
- attr=ParamAttr(
- bn_name + '_offset', learning_rate=0.),
- default_initializer=Constant(0.))
- bias.stop_gradient = True
-
- out = fluid.layers.affine_channel(x=conv, scale=scale, bias=bias)
- if act == 'relu':
- out = fluid.layers.relu(x=out)
- return out
-
-
-def shortcut(input, ch_out, stride, name):
- ch_in = input.shape[1] # if args.data_format == 'NCHW' else input.shape[-1]
- if ch_in != ch_out:
- return conv_affine_layer(input, ch_out, 1, stride, 0, None, name=name)
- else:
- return input
-
-
-def basicblock(input, ch_out, stride, name):
- short = shortcut(input, ch_out, stride, name=name)
- conv1 = conv_affine_layer(input, ch_out, 3, stride, 1, name=name)
- conv2 = conv_affine_layer(conv1, ch_out, 3, 1, 1, act=None, name=name)
- return fluid.layers.elementwise_add(x=short, y=conv2, act='relu', name=name)
-
-
-def bottleneck(input, ch_out, stride, name):
- short = shortcut(input, ch_out * 4, stride, name=name + "_branch1")
- conv1 = conv_affine_layer(
- input, ch_out, 1, stride, 0, name=name + "_branch2a")
- conv2 = conv_affine_layer(conv1, ch_out, 3, 1, 1, name=name + "_branch2b")
- conv3 = conv_affine_layer(
- conv2, ch_out * 4, 1, 1, 0, act=None, name=name + "_branch2c")
- return fluid.layers.elementwise_add(
- x=short, y=conv3, act='relu', name=name + ".add.output.5")
-
-
-def layer_warp(block_func, input, ch_out, count, stride, name):
- res_out = block_func(input, ch_out, stride, name=name + "a")
- for i in range(1, count):
- res_out = block_func(res_out, ch_out, 1, name=name + chr(ord("a") + i))
- return res_out
-
-
-ResNet_cfg = {
-    18: ([2, 2, 2, 2], basicblock),
- 34: ([3, 4, 6, 3], basicblock),
- 50: ([3, 4, 6, 3], bottleneck),
- 101: ([3, 4, 23, 3], bottleneck),
- 152: ([3, 8, 36, 3], bottleneck)
-}
-
-
-def add_ResNet50_conv4_body(body_input):
- stages, block_func = ResNet_cfg[50]
- stages = stages[0:3]
- conv1 = conv_affine_layer(
- body_input, ch_out=64, filter_size=7, stride=2, padding=3, name="conv1")
- pool1 = fluid.layers.pool2d(
- input=conv1,
- pool_type='max',
- pool_size=3,
- pool_stride=2,
- pool_padding=1)
- res2 = layer_warp(block_func, pool1, 64, stages[0], 1, name="res2")
- if cfg.TRAIN.freeze_at == 2:
- res2.stop_gradient = True
- res3 = layer_warp(block_func, res2, 128, stages[1], 2, name="res3")
- if cfg.TRAIN.freeze_at == 3:
- res3.stop_gradient = True
- res4 = layer_warp(block_func, res3, 256, stages[2], 2, name="res4")
- if cfg.TRAIN.freeze_at == 4:
- res4.stop_gradient = True
- return res4
-
-
-def add_ResNet_roi_conv5_head(head_input):
- res5 = layer_warp(bottleneck, head_input, 512, 3, 2, name="res5")
- res5_pool = fluid.layers.pool2d(
- res5, pool_type='avg', pool_size=7, name='res5_pool')
- return res5_pool
diff --git a/fluid/PaddleCV/faster_rcnn/profile.py b/fluid/PaddleCV/faster_rcnn/profile.py
deleted file mode 100644
index 73634bd6773ecb1606a43b297f0966e2d55506b3..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/profile.py
+++ /dev/null
@@ -1,178 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
-#
-#Licensed under the Apache License, Version 2.0 (the "License");
-#you may not use this file except in compliance with the License.
-#You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-#Unless required by applicable law or agreed to in writing, software
-#distributed under the License is distributed on an "AS IS" BASIS,
-#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-#See the License for the specific language governing permissions and
-#limitations under the License.
-
-import os
-import time
-import numpy as np
-import argparse
-from utility import parse_args, add_arguments, print_arguments
-
-import paddle
-import paddle.fluid as fluid
-import reader
-import paddle.fluid.profiler as profiler
-
-import models.model_builder as model_builder
-import models.resnet as resnet
-from learning_rate import exponential_with_warmup_decay
-from config import cfg
-
-
-def train():
- learning_rate = cfg.learning_rate
- image_shape = [3, cfg.TRAIN.max_size, cfg.TRAIN.max_size]
- num_iterations = cfg.max_iter
-
- devices = os.getenv("CUDA_VISIBLE_DEVICES") or ""
- devices_num = len(devices.split(","))
- total_batch_size = devices_num * cfg.TRAIN.im_per_batch
- model = model_builder.FasterRCNN(
- add_conv_body_func=resnet.add_ResNet50_conv4_body,
- add_roi_box_head_func=resnet.add_ResNet_roi_conv5_head,
- use_pyreader=cfg.use_pyreader,
- use_random=False)
- model.build_model(image_shape)
- loss_cls, loss_bbox, rpn_cls_loss, rpn_reg_loss = model.loss()
- loss_cls.persistable = True
- loss_bbox.persistable = True
- rpn_cls_loss.persistable = True
- rpn_reg_loss.persistable = True
- loss = loss_cls + loss_bbox + rpn_cls_loss + rpn_reg_loss
-
- boundaries = cfg.lr_steps
- gamma = cfg.lr_gamma
- step_num = len(cfg.lr_steps)
- values = [learning_rate * (gamma**i) for i in range(step_num + 1)]
-
- optimizer = fluid.optimizer.Momentum(
- learning_rate=exponential_with_warmup_decay(
- learning_rate=learning_rate,
- boundaries=boundaries,
- values=values,
- warmup_iter=500,
- warmup_factor=1.0 / 3.0),
- regularization=fluid.regularizer.L2Decay(0.0001),
- momentum=0.9)
- optimizer.minimize(loss)
-
- fluid.memory_optimize(fluid.default_main_program())
-
- place = fluid.CUDAPlace(0) if cfg.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
- if cfg.pretrained_model:
-
- def if_exist(var):
- return os.path.exists(os.path.join(cfg.pretrained_model, var.name))
-
- fluid.io.load_vars(exe, cfg.pretrained_model, predicate=if_exist)
-
- if cfg.parallel:
- train_exe = fluid.ParallelExecutor(
- use_cuda=bool(cfg.use_gpu), loss_name=loss.name)
-
- if cfg.use_pyreader:
- train_reader = reader.train(
- batch_size=cfg.TRAIN.im_per_batch,
- total_batch_size=total_batch_size,
- padding_total=cfg.TRAIN.padding_minibatch,
- shuffle=False)
- py_reader = model.py_reader
- py_reader.decorate_paddle_reader(train_reader)
- else:
- train_reader = reader.train(batch_size=total_batch_size, shuffle=False)
- feeder = fluid.DataFeeder(place=place, feed_list=model.feeds())
-
- fetch_list = [loss, loss_cls, loss_bbox, rpn_cls_loss, rpn_reg_loss]
-
- def run(iterations):
- reader_time = []
- run_time = []
- total_images = 0
-
- for batch_id in range(iterations):
- start_time = time.time()
- data = next(train_reader())
- end_time = time.time()
- reader_time.append(end_time - start_time)
- start_time = time.time()
- if cfg.parallel:
- losses = train_exe.run(fetch_list=[v.name for v in fetch_list],
- feed=feeder.feed(data))
- else:
- losses = exe.run(fluid.default_main_program(),
- fetch_list=[v.name for v in fetch_list],
- feed=feeder.feed(data))
- end_time = time.time()
- run_time.append(end_time - start_time)
- total_images += len(data)
-
- lr = np.array(fluid.global_scope().find_var('learning_rate')
- .get_tensor())
- print("Batch {:d}, lr {:.6f}, loss {:.6f} ".format(batch_id, lr[0],
- losses[0][0]))
- return reader_time, run_time, total_images
-
- def run_pyreader(iterations):
- reader_time = [0]
- run_time = []
- total_images = 0
-
- py_reader.start()
- try:
- for batch_id in range(iterations):
- start_time = time.time()
- if cfg.parallel:
- losses = train_exe.run(
- fetch_list=[v.name for v in fetch_list])
- else:
- losses = exe.run(fluid.default_main_program(),
- fetch_list=[v.name for v in fetch_list])
- end_time = time.time()
- run_time.append(end_time - start_time)
- total_images += devices_num
- lr = np.array(fluid.global_scope().find_var('learning_rate')
- .get_tensor())
- print("Batch {:d}, lr {:.6f}, loss {:.6f} ".format(batch_id, lr[
- 0], losses[0][0]))
- except fluid.core.EOFException:
- py_reader.reset()
-
- return reader_time, run_time, total_images
-
- run_func = run if not cfg.use_pyreader else run_pyreader
-
- # warm-up
- run_func(2)
- # profiling
- start = time.time()
- if cfg.use_profile:
- with profiler.profiler('GPU', 'total', '/tmp/profile_file'):
- reader_time, run_time, total_images = run_func(num_iterations)
- else:
- reader_time, run_time, total_images = run_func(num_iterations)
-
- end = time.time()
- total_time = end - start
- print("Total time: {0}, reader time: {1} s, run time: {2} s, images/s: {3}".
- format(total_time,
- np.sum(reader_time),
- np.sum(run_time), total_images / total_time))
-
-
-if __name__ == '__main__':
- args = parse_args()
- print_arguments(args)
- train()
diff --git a/fluid/PaddleCV/faster_rcnn/reader.py b/fluid/PaddleCV/faster_rcnn/reader.py
deleted file mode 100644
index 50b3d88b3995442c49833e6f69c7d6f04ea84064..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/reader.py
+++ /dev/null
@@ -1,160 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from paddle.utils.image_util import *
-import random
-from PIL import Image
-from PIL import ImageDraw
-import numpy as np
-import xml.etree.ElementTree
-import os
-import time
-import copy
-import six
-from collections import deque
-
-from roidbs import JsonDataset
-import data_utils
-from config import cfg
-
-
-def coco(mode,
- batch_size=None,
- total_batch_size=None,
- padding_total=False,
- shuffle=False):
- if 'coco2014' in cfg.dataset:
- cfg.train_file_list = 'annotations/instances_train2014.json'
- cfg.train_data_dir = 'train2014'
- cfg.val_file_list = 'annotations/instances_val2014.json'
- cfg.val_data_dir = 'val2014'
- elif 'coco2017' in cfg.dataset:
- cfg.train_file_list = 'annotations/instances_train2017.json'
- cfg.train_data_dir = 'train2017'
- cfg.val_file_list = 'annotations/instances_val2017.json'
- cfg.val_data_dir = 'val2017'
- else:
- raise NotImplementedError('Dataset {} not supported'.format(
- cfg.dataset))
- cfg.mean_value = np.array(cfg.pixel_means)[np.newaxis,
- np.newaxis, :].astype('float32')
- total_batch_size = total_batch_size if total_batch_size else batch_size
- if mode != 'infer':
- assert total_batch_size % batch_size == 0
- if mode == 'train':
- cfg.train_file_list = os.path.join(cfg.data_dir, cfg.train_file_list)
- cfg.train_data_dir = os.path.join(cfg.data_dir, cfg.train_data_dir)
- elif mode == 'test' or mode == 'infer':
- cfg.val_file_list = os.path.join(cfg.data_dir, cfg.val_file_list)
- cfg.val_data_dir = os.path.join(cfg.data_dir, cfg.val_data_dir)
- json_dataset = JsonDataset(train=(mode == 'train'))
- roidbs = json_dataset.get_roidb()
-
- print("{} on {} with {} roidbs".format(mode, cfg.dataset, len(roidbs)))
-
- def roidb_reader(roidb, mode):
- im, im_scales = data_utils.get_image_blob(roidb, mode)
- im_id = roidb['id']
- im_height = np.round(roidb['height'] * im_scales)
- im_width = np.round(roidb['width'] * im_scales)
- im_info = np.array([im_height, im_width, im_scales], dtype=np.float32)
- if mode == 'test' or mode == 'infer':
- return im, im_info, im_id
- gt_boxes = roidb['gt_boxes'].astype('float32')
- gt_classes = roidb['gt_classes'].astype('int32')
- is_crowd = roidb['is_crowd'].astype('int32')
- return im, gt_boxes, gt_classes, is_crowd, im_info, im_id
-
- def padding_minibatch(batch_data):
- if len(batch_data) == 1:
- return batch_data
-
- max_shape = np.array([data[0].shape for data in batch_data]).max(axis=0)
-
- padding_batch = []
- for data in batch_data:
- im_c, im_h, im_w = data[0].shape[:]
- padding_im = np.zeros(
- (im_c, max_shape[1], max_shape[2]), dtype=np.float32)
- padding_im[:, :im_h, :im_w] = data[0]
- padding_batch.append((padding_im, ) + data[1:])
- return padding_batch
-
- def reader():
- if mode == "train":
- roidb_perm = deque(np.random.permutation(roidbs))
- roidb_cur = 0
- batch_out = []
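-            # Walk a shuffled deque of roidbs, rotating one entry per step and
-            # re-shuffling after a full pass; entries without gt boxes are skipped.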
- while True:
- roidb = roidb_perm[0]
- roidb_cur += 1
- roidb_perm.rotate(-1)
- if roidb_cur >= len(roidbs):
- roidb_perm = deque(np.random.permutation(roidbs))
- roidb_cur = 0
- im, gt_boxes, gt_classes, is_crowd, im_info, im_id = roidb_reader(
- roidb, mode)
- if gt_boxes.shape[0] == 0:
- continue
- batch_out.append(
- (im, gt_boxes, gt_classes, is_crowd, im_info, im_id))
- if not padding_total:
- if len(batch_out) == batch_size:
- yield padding_minibatch(batch_out)
- batch_out = []
- else:
- if len(batch_out) == total_batch_size:
- batch_out = padding_minibatch(batch_out)
-                        for i in range(total_batch_size // batch_size):
- sub_batch_out = []
- for j in range(batch_size):
- sub_batch_out.append(batch_out[i * batch_size +
- j])
- yield sub_batch_out
- sub_batch_out = []
- batch_out = []
-
- elif mode == "test":
- batch_out = []
- for roidb in roidbs:
- im, im_info, im_id = roidb_reader(roidb, mode)
- batch_out.append((im, im_info, im_id))
- if len(batch_out) == batch_size:
- yield batch_out
- batch_out = []
- if len(batch_out) != 0:
- yield batch_out
-
- else:
- for roidb in roidbs:
- if cfg.image_name not in roidb['image']:
- continue
- im, im_info, im_id = roidb_reader(roidb, mode)
- batch_out = [(im, im_info, im_id)]
- yield batch_out
-
- return reader
-
-
-def train(batch_size, total_batch_size=None, padding_total=False, shuffle=True):
- return coco(
- 'train', batch_size, total_batch_size, padding_total, shuffle=shuffle)
-
-
-def test(batch_size, total_batch_size=None, padding_total=False):
- return coco('test', batch_size, total_batch_size, shuffle=False)
-
-
-def infer():
- return coco('infer')
diff --git a/fluid/PaddleCV/faster_rcnn/roidbs.py b/fluid/PaddleCV/faster_rcnn/roidbs.py
deleted file mode 100644
index b21dc9ed1fb01275aa57b158b0151a56ae297dc7..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/roidbs.py
+++ /dev/null
@@ -1,204 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Based on:
-# --------------------------------------------------------
-# Detectron
-# Copyright (c) 2017-present, Facebook, Inc.
-# Licensed under the Apache License, Version 2.0;
-# Written by Ross Girshick
-# --------------------------------------------------------
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-from __future__ import unicode_literals
-
-import copy
-import logging
-import numpy as np
-import os
-import scipy.sparse
-import random
-import time
-import matplotlib
-matplotlib.use('Agg')
-from pycocotools.coco import COCO
-import box_utils
-from config import cfg
-
-logger = logging.getLogger(__name__)
-
-
-class JsonDataset(object):
- """A class representing a COCO json dataset."""
-
- def __init__(self, train=False):
- print('Creating: {}'.format(cfg.dataset))
- self.name = cfg.dataset
- self.is_train = train
- if self.is_train:
- data_dir = cfg.train_data_dir
- file_list = cfg.train_file_list
- else:
- data_dir = cfg.val_data_dir
- file_list = cfg.val_file_list
- self.image_directory = data_dir
- self.COCO = COCO(file_list)
- # Set up dataset classes
- category_ids = self.COCO.getCatIds()
- categories = [c['name'] for c in self.COCO.loadCats(category_ids)]
- self.category_to_id_map = dict(zip(categories, category_ids))
- self.classes = ['__background__'] + categories
- self.num_classes = len(self.classes)
- self.json_category_id_to_contiguous_id = {
- v: i + 1
- for i, v in enumerate(self.COCO.getCatIds())
- }
- self.contiguous_category_id_to_json_id = {
- v: k
- for k, v in self.json_category_id_to_contiguous_id.items()
- }
-
- def get_roidb(self):
- """Return an roidb corresponding to the json dataset. Optionally:
- - include ground truth boxes in the roidb
- - add proposals specified in a proposals file
- - filter proposals based on a minimum side length
- - filter proposals that intersect with crowd regions
- """
- image_ids = self.COCO.getImgIds()
- image_ids.sort()
- roidb = copy.deepcopy(self.COCO.loadImgs(image_ids))
- for entry in roidb:
- self._prep_roidb_entry(entry)
- if self.is_train:
- # Include ground-truth object annotations
- start_time = time.time()
- for entry in roidb:
- self._add_gt_annotations(entry)
- end_time = time.time()
- print('_add_gt_annotations took {:.3f}s'.format(end_time -
- start_time))
- print('Appending horizontally-flipped training examples...')
- self._extend_with_flipped_entries(roidb)
- print('Loaded dataset: {:s}'.format(self.name))
- print('{:d} roidb entries'.format(len(roidb)))
- if self.is_train:
- self._filter_for_training(roidb)
- return roidb
-
- def _prep_roidb_entry(self, entry):
- """Adds empty metadata fields to an roidb entry."""
- # Make file_name an abs path
- im_path = os.path.join(self.image_directory, entry['file_name'])
- #assert os.path.exists(im_path), 'Image \'{}\' not found'.format(im_path)
- entry['image'] = im_path
- entry['flipped'] = False
- # Empty placeholders
- entry['gt_boxes'] = np.empty((0, 4), dtype=np.float32)
- entry['gt_classes'] = np.empty((0), dtype=np.int32)
- entry['gt_id'] = np.empty((0), dtype=np.int32)
-        entry['is_crowd'] = np.empty((0), dtype=np.bool_)
- # Remove unwanted fields that come from the json file (if they exist)
- for k in ['date_captured', 'url', 'license', 'file_name']:
- if k in entry:
- del entry[k]
-
- def _add_gt_annotations(self, entry):
- """Add ground truth annotation metadata to an roidb entry."""
- ann_ids = self.COCO.getAnnIds(imgIds=entry['id'], iscrowd=None)
- objs = self.COCO.loadAnns(ann_ids)
- # Sanitize bboxes -- some are invalid
- valid_objs = []
- width = entry['width']
- height = entry['height']
- for obj in objs:
- if obj['area'] < cfg.TRAIN.gt_min_area:
- continue
- if 'ignore' in obj and obj['ignore'] == 1:
- continue
-            # Convert from (x1, y1, w, h) to (x1, y1, x2, y2)
- x1, y1, x2, y2 = box_utils.xywh_to_xyxy(obj['bbox'])
- x1, y1, x2, y2 = box_utils.clip_xyxy_to_image(x1, y1, x2, y2,
- height, width)
- # Require non-zero seg area and more than 1x1 box size
- if obj['area'] > 0 and x2 > x1 and y2 > y1:
- obj['clean_bbox'] = [x1, y1, x2, y2]
- valid_objs.append(obj)
- num_valid_objs = len(valid_objs)
-
- gt_boxes = np.zeros((num_valid_objs, 4), dtype=entry['gt_boxes'].dtype)
- gt_id = np.zeros((num_valid_objs), dtype=np.int32)
- gt_classes = np.zeros((num_valid_objs), dtype=entry['gt_classes'].dtype)
- is_crowd = np.zeros((num_valid_objs), dtype=entry['is_crowd'].dtype)
- for ix, obj in enumerate(valid_objs):
- cls = self.json_category_id_to_contiguous_id[obj['category_id']]
- gt_boxes[ix, :] = obj['clean_bbox']
- gt_classes[ix] = cls
- gt_id[ix] = np.int32(obj['id'])
- is_crowd[ix] = obj['iscrowd']
-
- entry['gt_boxes'] = np.append(entry['gt_boxes'], gt_boxes, axis=0)
- entry['gt_classes'] = np.append(entry['gt_classes'], gt_classes)
- entry['gt_id'] = np.append(entry['gt_id'], gt_id)
- entry['is_crowd'] = np.append(entry['is_crowd'], is_crowd)
-
- def _extend_with_flipped_entries(self, roidb):
- """Flip each entry in the given roidb and return a new roidb that is the
- concatenation of the original roidb and the flipped entries.
-        "Flipping" an entry means that the image and associated metadata (e.g.,
- ground truth boxes and object proposals) are horizontally flipped.
- """
- flipped_roidb = []
- for entry in roidb:
- width = entry['width']
- gt_boxes = entry['gt_boxes'].copy()
- oldx1 = gt_boxes[:, 0].copy()
- oldx2 = gt_boxes[:, 2].copy()
- gt_boxes[:, 0] = width - oldx2 - 1
- gt_boxes[:, 2] = width - oldx1 - 1
- assert (gt_boxes[:, 2] >= gt_boxes[:, 0]).all()
- flipped_entry = {}
- dont_copy = ('gt_boxes', 'flipped')
- for k, v in entry.items():
- if k not in dont_copy:
- flipped_entry[k] = v
- flipped_entry['gt_boxes'] = gt_boxes
- flipped_entry['flipped'] = True
- flipped_roidb.append(flipped_entry)
- roidb.extend(flipped_roidb)
-
- def _filter_for_training(self, roidb):
- """Remove roidb entries that have no usable RoIs based on config settings.
- """
-
- def is_valid(entry):
-            # A valid image has at least one ground-truth box.
-            gt_boxes = entry['gt_boxes']
-            valid = len(gt_boxes) > 0
- return valid
-
-        num = len(roidb)
-        filtered_roidb = [entry for entry in roidb if is_valid(entry)]
-        num_after = len(filtered_roidb)
-        print('Filtered {} roidb entries: {} -> {}'.format(num - num_after, num,
-                                                           num_after))
-        return filtered_roidb
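
The flip arithmetic in `_extend_with_flipped_entries` is easy to sanity-check outside the class. A minimal numpy sketch (not part of the original file): for an image of width `W`, a box `[x1, y1, x2, y2]` maps to `[W - x2 - 1, y1, W - x1 - 1, y2]`, and the `- 1` keeps the mirrored coordinates on the 0-indexed pixel grid.

```python
import numpy as np

# Mirror ground-truth boxes horizontally, as _extend_with_flipped_entries does.
width = 640
gt_boxes = np.array([[10., 20., 110., 220.]], dtype=np.float32)

flipped = gt_boxes.copy()
oldx1 = gt_boxes[:, 0].copy()
oldx2 = gt_boxes[:, 2].copy()
flipped[:, 0] = width - oldx2 - 1   # new x1 comes from the old x2
flipped[:, 2] = width - oldx1 - 1   # new x2 comes from the old x1

print(flipped)                      # [[529.  20. 629. 220.]]
assert (flipped[:, 2] >= flipped[:, 0]).all()
```
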
diff --git a/fluid/PaddleCV/faster_rcnn/train.py b/fluid/PaddleCV/faster_rcnn/train.py
deleted file mode 100644
index 1b18f85d8b18c74809bac5a8c7c2b4b0d5e0e232..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/train.py
+++ /dev/null
@@ -1,177 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import os
-import sys
-import numpy as np
-import time
-import shutil
-from utility import parse_args, print_arguments, SmoothedValue
-
-import paddle
-import paddle.fluid as fluid
-import reader
-import models.model_builder as model_builder
-import models.resnet as resnet
-from learning_rate import exponential_with_warmup_decay
-from config import cfg
-
-
-def train():
- learning_rate = cfg.learning_rate
- image_shape = [3, cfg.TRAIN.max_size, cfg.TRAIN.max_size]
-
- if cfg.debug:
- fluid.default_startup_program().random_seed = 1000
- fluid.default_main_program().random_seed = 1000
- import random
- random.seed(0)
- np.random.seed(0)
-
- devices = os.getenv("CUDA_VISIBLE_DEVICES") or ""
- devices_num = len(devices.split(","))
- total_batch_size = devices_num * cfg.TRAIN.im_per_batch
-
- model = model_builder.FasterRCNN(
- add_conv_body_func=resnet.add_ResNet50_conv4_body,
- add_roi_box_head_func=resnet.add_ResNet_roi_conv5_head,
- use_pyreader=cfg.use_pyreader,
- use_random=True)
- model.build_model(image_shape)
- loss_cls, loss_bbox, rpn_cls_loss, rpn_reg_loss = model.loss()
- loss_cls.persistable = True
- loss_bbox.persistable = True
- rpn_cls_loss.persistable = True
- rpn_reg_loss.persistable = True
- loss = loss_cls + loss_bbox + rpn_cls_loss + rpn_reg_loss
-
- boundaries = cfg.lr_steps
- gamma = cfg.lr_gamma
- step_num = len(cfg.lr_steps)
- values = [learning_rate * (gamma**i) for i in range(step_num + 1)]
-
- optimizer = fluid.optimizer.Momentum(
- learning_rate=exponential_with_warmup_decay(
- learning_rate=learning_rate,
- boundaries=boundaries,
- values=values,
- warmup_iter=cfg.warm_up_iter,
- warmup_factor=cfg.warm_up_factor),
- regularization=fluid.regularizer.L2Decay(cfg.weight_decay),
- momentum=cfg.momentum)
- optimizer.minimize(loss)
-
- fluid.memory_optimize(fluid.default_main_program())
-
- place = fluid.CUDAPlace(0) if cfg.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
- if cfg.pretrained_model:
-
- def if_exist(var):
- return os.path.exists(os.path.join(cfg.pretrained_model, var.name))
-
- fluid.io.load_vars(exe, cfg.pretrained_model, predicate=if_exist)
-
- if cfg.parallel:
- train_exe = fluid.ParallelExecutor(
- use_cuda=bool(cfg.use_gpu), loss_name=loss.name)
-
- if cfg.use_pyreader:
- train_reader = reader.train(
- batch_size=cfg.TRAIN.im_per_batch,
- total_batch_size=total_batch_size,
- padding_total=cfg.TRAIN.padding_minibatch,
- shuffle=True)
- py_reader = model.py_reader
- py_reader.decorate_paddle_reader(train_reader)
- else:
- train_reader = reader.train(batch_size=total_batch_size, shuffle=True)
- feeder = fluid.DataFeeder(place=place, feed_list=model.feeds())
-
- def save_model(postfix):
- model_path = os.path.join(cfg.model_save_dir, postfix)
- if os.path.isdir(model_path):
- shutil.rmtree(model_path)
- fluid.io.save_persistables(exe, model_path)
-
- fetch_list = [loss, rpn_cls_loss, rpn_reg_loss, loss_cls, loss_bbox]
-
- def train_loop_pyreader():
- py_reader.start()
- smoothed_loss = SmoothedValue(cfg.log_window)
- try:
- start_time = time.time()
- prev_start_time = start_time
- every_pass_loss = []
- for iter_id in range(cfg.max_iter):
- prev_start_time = start_time
- start_time = time.time()
- losses = train_exe.run(fetch_list=[v.name for v in fetch_list])
- every_pass_loss.append(np.mean(np.array(losses[0])))
- smoothed_loss.add_value(np.mean(np.array(losses[0])))
- lr = np.array(fluid.global_scope().find_var('learning_rate')
- .get_tensor())
- print("Iter {:d}, lr {:.6f}, loss {:.6f}, time {:.5f}".format(
- iter_id, lr[0],
- smoothed_loss.get_median_value(
- ), start_time - prev_start_time))
- sys.stdout.flush()
- if (iter_id + 1) % cfg.TRAIN.snapshot_iter == 0:
- save_model("model_iter{}".format(iter_id))
- except fluid.core.EOFException:
- py_reader.reset()
- return np.mean(every_pass_loss)
-
- def train_loop():
- start_time = time.time()
- prev_start_time = start_time
- start = start_time
- every_pass_loss = []
- smoothed_loss = SmoothedValue(cfg.log_window)
- for iter_id, data in enumerate(train_reader()):
- prev_start_time = start_time
- start_time = time.time()
- losses = train_exe.run(fetch_list=[v.name for v in fetch_list],
- feed=feeder.feed(data))
- loss_v = np.mean(np.array(losses[0]))
- every_pass_loss.append(loss_v)
- smoothed_loss.add_value(loss_v)
- lr = np.array(fluid.global_scope().find_var('learning_rate')
- .get_tensor())
- print("Iter {:d}, lr {:.6f}, loss {:.6f}, time {:.5f}".format(
- iter_id, lr[0],
- smoothed_loss.get_median_value(), start_time - prev_start_time))
- sys.stdout.flush()
- if (iter_id + 1) % cfg.TRAIN.snapshot_iter == 0:
- save_model("model_iter{}".format(iter_id))
- if (iter_id + 1) == cfg.max_iter:
- break
- return np.mean(every_pass_loss)
-
- if cfg.use_pyreader:
- train_loop_pyreader()
- else:
- train_loop()
- save_model('model_final')
-
-
-if __name__ == '__main__':
- args = parse_args()
- print_arguments(args)
- train()
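
The learning-rate schedule built in `train()` is a piecewise-constant decay over `cfg.lr_steps` preceded by a warmup phase. `exponential_with_warmup_decay` itself lives in `learning_rate.py` and is not part of this diff, so the warmup shape below, a linear ramp from `warmup_factor * lr` up to `lr`, is an assumption based on the Detectron convention this code follows; the `values` list matches the one computed above.

```python
# A plain-Python sketch of the schedule; the linear warmup shape is assumed.
def lr_at(it, base_lr=0.01, boundaries=(120000, 160000), gamma=0.1,
          warmup_iter=500, warmup_factor=1.0 / 3.0):
    values = [base_lr * gamma**i for i in range(len(boundaries) + 1)]
    if it < warmup_iter:                       # assumed linear warmup
        alpha = it / float(warmup_iter)
        return base_lr * (warmup_factor * (1 - alpha) + alpha)
    for boundary, value in zip(boundaries, values):
        if it < boundary:
            return value
    return values[-1]

print(lr_at(0), lr_at(500), lr_at(130000), lr_at(170000))
# 0.00333..., 0.01, 0.001, 0.0001
```
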
diff --git a/fluid/PaddleCV/faster_rcnn/utility.py b/fluid/PaddleCV/faster_rcnn/utility.py
deleted file mode 100644
index 12a208823482a6904e4f0ee0dcae84fa38f7cf37..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/faster_rcnn/utility.py
+++ /dev/null
@@ -1,139 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-"""
-Contains common utility functions.
-"""
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import sys
-import distutils.util
-import numpy as np
-import six
-from collections import deque
-from paddle.fluid import core
-import argparse
-import functools
-from config import *
-
-
-def print_arguments(args):
- """Print argparse's arguments.
-
- Usage:
-
- .. code-block:: python
-
- parser = argparse.ArgumentParser()
-        parser.add_argument("name", default="John", type=str, help="User name.")
- args = parser.parse_args()
- print_arguments(args)
-
- :param args: Input argparse.Namespace for printing.
- :type args: argparse.Namespace
- """
- print("----------- Configuration Arguments -----------")
- for arg, value in sorted(six.iteritems(vars(args))):
- print("%s: %s" % (arg, value))
- print("------------------------------------------------")
-
-
-def add_arguments(argname, type, default, help, argparser, **kwargs):
- """Add argparse's argument.
-
- Usage:
-
- .. code-block:: python
-
- parser = argparse.ArgumentParser()
-        add_arguments("name", str, "John", "User name.", parser)
- args = parser.parse_args()
- """
- type = distutils.util.strtobool if type == bool else type
- argparser.add_argument(
- "--" + argname,
- default=default,
- type=type,
- help=help + ' Default: %(default)s.',
- **kwargs)
-
-
-class SmoothedValue(object):
- """Track a series of values and provide access to smoothed values over a
- window or the global series average.
- """
-
- def __init__(self, window_size):
- self.deque = deque(maxlen=window_size)
-
- def add_value(self, value):
- self.deque.append(value)
-
- def get_median_value(self):
- return np.median(self.deque)
-
-
-def parse_args():
-    """Return all parsed arguments."""
- parser = argparse.ArgumentParser(description=__doc__)
- add_arg = functools.partial(add_arguments, argparser=parser)
- # yapf: disable
- # ENV
- add_arg('parallel', bool, True, "Whether use parallel.")
- add_arg('use_gpu', bool, True, "Whether use GPU.")
- add_arg('model_save_dir', str, 'output', "The path to save model.")
- add_arg('pretrained_model', str, 'imagenet_resnet50_fusebn', "The init model path.")
- add_arg('dataset', str, 'coco2017', "coco2014, coco2017.")
- add_arg('class_num', int, 81, "Class number.")
- add_arg('data_dir', str, 'data/COCO17', "The data root path.")
- add_arg('use_pyreader', bool, True, "Use pyreader.")
- add_arg('use_profile', bool, False, "Whether use profiler.")
-    add_arg('padding_minibatch',bool,   False,
-            "If False, images are only resized and not padded, so image shapes differ "
-            "across GPUs within a mini-batch. If True, all images in a mini-batch share the same shape.")
- #SOLVER
- add_arg('learning_rate', float, 0.01, "Learning rate.")
- add_arg('max_iter', int, 180000, "Iter number.")
- add_arg('log_window', int, 20, "Log smooth window, set 1 for debug, set 20 for train.")
- # FAST RCNN
- # RPN
- add_arg('anchor_sizes', int, [32,64,128,256,512], "The size of anchors.")
- add_arg('aspect_ratios', float, [0.5,1.0,2.0], "The ratio of anchors.")
- add_arg('variance', float, [1.,1.,1.,1.], "The variance of anchors.")
-    add_arg('rpn_stride',       float,  [16.,16.],  "Stride of the feature map the RPN is attached to.")
-    add_arg('rpn_nms_thresh',   float,  0.7,        "NMS threshold used on RPN proposals.")
- # TRAIN TEST INFER
- add_arg('im_per_batch', int, 1, "Minibatch size.")
-    add_arg('max_size',         int,    1333,    "The max side length of the resized image.")
-    add_arg('scales',           int,    [800],   "The target length of the resized image's shorter side.")
-    add_arg('batch_size_per_im',int,    512,     "Fast R-CNN head batch size.")
-    add_arg('pixel_means',      float,  [102.9801, 115.9465, 122.7717], "Pixel mean values.")
-    add_arg('nms_thresh',       float,  0.5,     "NMS threshold.")
-    add_arg('score_thresh',     float,  0.05,    "Score threshold for NMS.")
-    add_arg('snapshot_stride',  int,    10000,   "Save the model every snapshot_stride iterations.")
- add_arg('debug', bool, False, "Debug mode")
- # SINGLE EVAL AND DRAW
- add_arg('draw_threshold', float, 0.8, "Confidence threshold to draw bbox.")
- add_arg('image_path', str, 'data/COCO17/val2017', "The image path used to inference and visualize.")
- add_arg('image_name', str, '', "The single image used to inference and visualize.")
- # yapf: enable
- args = parser.parse_args()
- file_name = sys.argv[0]
- if 'train' in file_name or 'profile' in file_name:
- merge_cfg_from_args(args, 'train')
- else:
- merge_cfg_from_args(args, 'test')
- return args
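
A quick usage sketch (not in the original file) of the `SmoothedValue` helper above: it keeps the last `window_size` values and reports their median, which is what both training loops in `train.py` log as the smoothed loss. The median means a single noisy iteration barely moves the reported number.

```python
from collections import deque
import numpy as np

class SmoothedValue(object):
    """Median over a sliding window, as defined in utility.py above."""
    def __init__(self, window_size):
        self.deque = deque(maxlen=window_size)

    def add_value(self, value):
        self.deque.append(value)

    def get_median_value(self):
        return np.median(self.deque)

tracker = SmoothedValue(window_size=3)
for loss in [2.0, 100.0, 1.5, 1.2]:    # 100.0 is an outlier spike
    tracker.add_value(loss)
print(tracker.get_median_value())       # 1.5 -- median of the last 3 values
```
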
diff --git a/fluid/PaddleCV/gan/c_gan/.run_ce.sh b/fluid/PaddleCV/gan/c_gan/.run_ce.sh
deleted file mode 100755
index 7dee419d90a9719f6c9790f0ffc0b50c69870815..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/gan/c_gan/.run_ce.sh
+++ /dev/null
@@ -1,9 +0,0 @@
-#!/bin/bash
-
-# This file is only used for continuous evaluation.
-export FLAGS_cudnn_deterministic=True
-export ce_mode=1
-(CUDA_VISIBLE_DEVICES=6 python c_gan.py --batch_size=121 --epoch=1 --run_ce=True --use_gpu=True & \
-CUDA_VISIBLE_DEVICES=7 python dc_gan.py --batch_size=121 --epoch=1 --run_ce=True --use_gpu=True) | python _ce.py
-
-
diff --git a/fluid/PaddleCV/gan/c_gan/README.md b/fluid/PaddleCV/gan/c_gan/README.md
index 9f3c18fd0fb9a943f728548f655d3dd3cef73288..b36f7084c0a67ce35cc7e7a73333443919a98775 100644
--- a/fluid/PaddleCV/gan/c_gan/README.md
+++ b/fluid/PaddleCV/gan/c_gan/README.md
@@ -1,76 +1,2 @@
-
-Running the examples in this directory requires the latest PaddlePaddle develop version. If your installed PaddlePaddle is older than that, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update it.
-
-## Code structure
-```
-├── network.py # Defines the basic generator and discriminator networks.
-├── utility.py # Defines common utility functions.
-├── dc_gan.py # DCGAN training script.
-└── c_gan.py # conditionalGAN training script.
-```
-
-## Introduction
-TODO
-
-## Data preparation
-
-This tutorial trains and tests the models on the mnist dataset, which is downloaded automatically through the `paddle.dataset` module.
-
-## Training and testing conditionalGAN
-
-To train conditionalGAN on a single GPU:
-
-```
-env CUDA_VISIBLE_DEVICES=0 python c_gan.py --output="./result"
-```
-
-During training, a batch of data is evaluated every fixed number of epochs, and the results are saved as images to the path given by the `--output` option.
-
-Run `python c_gan.py --help` for more usage information and detailed parameter descriptions.
-
-Figure 1 plots the conditionalGAN training losses, with the x-axis the number of training epochs and the y-axis the loss on the training set; 'G_loss' and 'D_loss' are the training losses of the generator and the discriminator, respectively. Figure 2 shows predictions from a conditionalGAN model trained for 19 epochs.
-
-[Figure 1: conditionalGAN training losses] [Figure 2: conditionalGAN predictions after 19 epochs]
-
-## Training and testing DCGAN
-
-To train DCGAN on a single GPU:
-
-```
-env CUDA_VISIBLE_DEVICES=0 python dc_gan.py --output="./result"
-```
-
-During training, a batch of data is evaluated every fixed number of epochs, and the results are saved as images to the path given by the `--output` option.
-
-Run `python dc_gan.py --help` for more usage information and detailed parameter descriptions.
-
-Figure 3 shows predictions from a DCGAN model trained for 10 epochs:
-
-[Figure 3: DCGAN predictions after 10 epochs]
-
+Hi, this project has been migrated. Please go to the [PaddleCV/gan/c_gan](../../../../PaddleCV/gan/c_gan) directory to browse it.
diff --git a/fluid/PaddleCV/gan/c_gan/c_gan.py b/fluid/PaddleCV/gan/c_gan/c_gan.py
deleted file mode 100644
index 18c6e5df232d5077126001b0fe17ca098c8e6c4b..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/gan/c_gan/c_gan.py
+++ /dev/null
@@ -1,202 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import sys
-import os
-import six
-import argparse
-import functools
-import matplotlib
-import numpy as np
-import paddle
-import time
-import paddle.fluid as fluid
-from utility import get_parent_function_name, plot, check, add_arguments, print_arguments
-from network import G_cond, D_cond
-matplotlib.use('agg')
-import matplotlib.pyplot as plt
-import matplotlib.gridspec as gridspec
-
-
-NOISE_SIZE = 100
-LEARNING_RATE = 2e-4
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('batch_size', int, 121, "Minibatch size.")
-add_arg('epoch',             int,   20,          "The number of epochs to train.")
-add_arg('output',            str,   "./output",  "The directory to save the model and test results to.")
-add_arg('use_gpu', bool, True, "Whether to use GPU to train.")
-add_arg('run_ce', bool, False, "Whether to run for model ce.")
-# yapf: enable
-
-
-def loss(x, label):
- return fluid.layers.mean(
- fluid.layers.sigmoid_cross_entropy_with_logits(
- x=x, label=label))
-
-
-def train(args):
-
- if args.run_ce:
- np.random.seed(10)
- fluid.default_startup_program().random_seed = 90
-
- d_program = fluid.Program()
- dg_program = fluid.Program()
-
- with fluid.program_guard(d_program):
- conditions = fluid.layers.data(
- name='conditions', shape=[1], dtype='float32')
- img = fluid.layers.data(name='img', shape=[784], dtype='float32')
- label = fluid.layers.data(name='label', shape=[1], dtype='float32')
- d_logit = D_cond(img, conditions)
- d_loss = loss(d_logit, label)
-
- with fluid.program_guard(dg_program):
- conditions = fluid.layers.data(
- name='conditions', shape=[1], dtype='float32')
- noise = fluid.layers.data(
- name='noise', shape=[NOISE_SIZE], dtype='float32')
- g_img = G_cond(z=noise, y=conditions)
-
- g_program = dg_program.clone()
- g_program_test = dg_program.clone(for_test=True)
-
- dg_logit = D_cond(g_img, conditions)
- dg_loss = loss(
- dg_logit,
- fluid.layers.fill_constant_batch_size_like(
- input=noise, dtype='float32', shape=[-1, 1], value=1.0))
-
- opt = fluid.optimizer.Adam(learning_rate=LEARNING_RATE)
-
- opt.minimize(loss=d_loss)
- parameters = [p.name for p in g_program.global_block().all_parameters()]
-
- opt.minimize(loss=dg_loss, parameter_list=parameters)
-
- exe = fluid.Executor(fluid.CPUPlace())
- if args.use_gpu:
- exe = fluid.Executor(fluid.CUDAPlace(0))
- exe.run(fluid.default_startup_program())
- if args.run_ce:
- train_reader = paddle.batch(
- paddle.dataset.mnist.train(),
- batch_size=args.batch_size)
- else:
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- paddle.dataset.mnist.train(), buf_size=60000),
- batch_size=args.batch_size)
-
- NUM_TRAIN_TIMES_OF_DG = 2
- const_n = np.random.uniform(
- low=-1.0, high=1.0,
- size=[args.batch_size, NOISE_SIZE]).astype('float32')
- t_time = 0
- losses = [[],[]]
- for pass_id in range(args.epoch):
- for batch_id, data in enumerate(train_reader()):
- if len(data) != args.batch_size:
- continue
- noise_data = np.random.uniform(
- low=-1.0, high=1.0,
- size=[args.batch_size, NOISE_SIZE]).astype('float32')
- real_image = np.array(list(map(lambda x: x[0], data))).reshape(
- -1, 784).astype('float32')
- conditions_data = np.array([x[1] for x in data]).reshape(
- [-1, 1]).astype("float32")
- real_labels = np.ones(
- shape=[real_image.shape[0], 1], dtype='float32')
- fake_labels = np.zeros(
- shape=[real_image.shape[0], 1], dtype='float32')
- s_time = time.time()
- generated_image = exe.run(
- g_program,
- feed={'noise': noise_data,
- 'conditions': conditions_data},
- fetch_list={g_img})[0]
-
- d_loss_1 = exe.run(d_program,
- feed={
- 'img': generated_image,
- 'label': fake_labels,
- 'conditions': conditions_data
- },
- fetch_list={d_loss})[0][0]
-
- d_loss_2 = exe.run(d_program,
- feed={
- 'img': real_image,
- 'label': real_labels,
- 'conditions': conditions_data
- },
- fetch_list={d_loss})[0][0]
-
- d_loss_n = d_loss_1 + d_loss_2
- losses[0].append(d_loss_n)
- for _ in six.moves.xrange(NUM_TRAIN_TIMES_OF_DG):
- noise_data = np.random.uniform(
- low=-1.0, high=1.0,
- size=[args.batch_size, NOISE_SIZE]).astype('float32')
- dg_loss_n = exe.run(
- dg_program,
- feed={'noise': noise_data,
- 'conditions': conditions_data},
- fetch_list={dg_loss})[0][0]
- losses[1].append(dg_loss_n)
- t_time += (time.time() - s_time)
-
- if batch_id % 10 == 0 and not args.run_ce:
- if not os.path.exists(args.output):
- os.makedirs(args.output)
- # generate image each batch
- generated_images = exe.run(
- g_program_test,
- feed={'noise': const_n,
- 'conditions': conditions_data},
- fetch_list={g_img})[0]
- total_images = np.concatenate([real_image, generated_images])
- fig = plot(total_images)
- msg = "Epoch ID={0}\n Batch ID={1}\n D-Loss={2}\n DG-Loss={3}\n gen={4}".format(
- pass_id, batch_id, d_loss_n, dg_loss_n, check(generated_images))
- print(msg)
- plt.title(msg)
- plt.savefig(
- '{}/{:04d}_{:04d}.png'.format(args.output, pass_id,
- batch_id),
- bbox_inches='tight')
- plt.close(fig)
-
- if args.run_ce:
- print("kpis,cgan_d_train_cost,{}".format(np.mean(losses[0])))
- print("kpis,cgan_g_train_cost,{}".format(np.mean(losses[1])))
- print("kpis,cgan_duration,{}".format(t_time / args.epoch))
-
-
-if __name__ == "__main__":
- args = parser.parse_args()
- print_arguments(args)
- train(args)
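
The `loss()` helper above is sigmoid cross-entropy with logits, averaged over the batch: the discriminator is trained with label 1 on real images and label 0 on generated ones, while the generator program uses `fill_constant_batch_size_like(..., value=1.0)` so its outputs are scored against label 1. A numpy sketch (not part of the original file) of the same quantity, in its numerically stable form:

```python
import numpy as np

def sigmoid_ce_with_logits(x, label):
    # stable BCE-with-logits: max(x, 0) - x*z + log(1 + exp(-|x|))
    return np.maximum(x, 0) - x * label + np.log1p(np.exp(-np.abs(x)))

logits = np.array([2.0, -1.0])   # two discriminator outputs
real_loss = sigmoid_ce_with_logits(logits, np.ones_like(logits)).mean()
fake_loss = sigmoid_ce_with_logits(logits, np.zeros_like(logits)).mean()
print(real_loss, fake_loss)      # loss is low only when logits match the labels
```
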
diff --git a/fluid/PaddleCV/gan/cycle_gan/README.md b/fluid/PaddleCV/gan/cycle_gan/README.md
index 6520c123f80f423366287ea53a36a3969d1a73c9..5db6d49b2cbdaa6af4224bc0707593908a05352d 100644
--- a/fluid/PaddleCV/gan/cycle_gan/README.md
+++ b/fluid/PaddleCV/gan/cycle_gan/README.md
@@ -1,91 +1,2 @@
-
-Running the examples in this directory requires the latest PaddlePaddle develop version. If your installed PaddlePaddle is older than that, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update it.
-
-## Code structure
-```
-├── data_reader.py # Reads and preprocesses the data.
-├── layers.py # Wraps and defines the basic layers.
-├── model.py # Defines the basic generator and discriminator networks.
-├── trainer.py # Builds the losses and the training networks.
-├── train.py # Training script.
-└── infer.py # Inference script.
-```
-
-## Introduction
-TODO
-
-## Data preparation
-
-This tutorial trains and tests the model on the horse2zebra dataset, which was built by filtering the [ImageNet](http://www.image-net.org/) dataset with the keywords 'wild horse' and 'zebra' and downloading the matching images.
-
-The horse2zebra training set contains 1069 wild-horse images and 1336 zebra images; the test set contains 121 wild-horse images and 141 zebra images.
-
-After the data has been downloaded and processed, organize it in the following directory structure:
-
-```
-data
-|-- horse2zebra
-|   |-- testA
-|   |-- testA.txt
-|   |-- testB
-|   |-- testB.txt
-|   |-- trainA
-|   |-- trainA.txt
-|   |-- trainB
-|   `-- trainB.txt
-
-```
-
-The `data` folder must be placed in the same directory as the training script `train.py`. `testA` holds the wild-horse test images, `testB` holds the zebra test images, and `testA.txt` and `testB.txt` are the corresponding image path list files, formatted as follows:
-
-```
-testA/n02381460_9243.jpg
-testA/n02381460_9244.jpg
-testA/n02381460_9245.jpg
-```
-
-The training data is organized in the same way as the test data.
-
-
-## Model training and inference
-
-### Training
-
-To train on a single GPU:
-
-```
-env CUDA_VISIBLE_DEVICES=0 python train.py
-```
-
-Run `python train.py --help` for more usage information and detailed parameter descriptions.
-
-Figure 1 plots the training losses over 152 epochs, with the x-axis the number of training epochs and the y-axis the loss on the training set; 'g_A_loss', 'g_B_loss', 'd_A_loss' and 'd_B_loss' are the training losses of generator A, generator B, discriminator A and discriminator B, respectively.
-
-[Figure 1: CycleGAN training losses]
-
-### Inference
-
-Run the following command to read multiple images and perform inference:
-
-```
-env CUDA_VISIBLE_DEVICES=0 python infer.py \
-    --init_model="models/1" --input="./data/inputA/*" \
-    --output="./output"
-```
-
-Figures 2 and 3 show predictions from a model trained for 150 epochs:
-
-[Figure 2: inference results] [Figure 3: inference results]
+Hi, this project has been migrated. Please go to the [PaddleCV/gan/cycle_gan](../../../../PaddleCV/gan/cycle_gan) directory to browse it.
diff --git a/fluid/PaddleCV/gan/cycle_gan/infer.py b/fluid/PaddleCV/gan/cycle_gan/infer.py
deleted file mode 100644
index ffc4d1e6193707b6afafcc6eaee8c01d4b0129ce..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/gan/cycle_gan/infer.py
+++ /dev/null
@@ -1,67 +0,0 @@
-import argparse
-import functools
-import os
-from PIL import Image
-from paddle.fluid import core
-import paddle.fluid as fluid
-import paddle
-import numpy as np
-from scipy.misc import imsave
-from model import *
-import glob
-from utility import add_arguments, print_arguments
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('input',             str,   None,             "The images to be inferred.")
-add_arg('output',            str,   "./infer_result", "The directory to save the inference results to.")
-add_arg('init_model',        str,   None,             "The init model file or directory.")
-add_arg('input_style', str, "A", "The style of the input, A or B")
-add_arg('use_gpu', bool, True, "Whether to use GPU to train.")
-# yapf: enable
-
-
-def infer(args):
- data_shape = [-1, 3, 256, 256]
- input = fluid.layers.data(name='input', shape=data_shape, dtype='float32')
- if args.input_style == "A":
- fake = build_generator_resnet_9blocks(input, name="g_A")
- elif args.input_style == "B":
- fake = build_generator_resnet_9blocks(input, name="g_B")
- else:
-        raise ValueError("Input with style [%s] is not supported." % args.input_style)
- # prepare environment
- place = fluid.CPUPlace()
- if args.use_gpu:
- place = fluid.CUDAPlace(0)
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
- fluid.io.load_persistables(exe, args.init_model)
-
- if not os.path.exists(args.output):
- os.makedirs(args.output)
- for file in glob.glob(args.input):
-        print("read %s" % file)
- image_name = os.path.basename(file)
- image = Image.open(file)
- image = image.resize((256, 256))
- image = np.array(image) / 127.5 - 1
- if len(image.shape) != 3:
- continue
- data = image.transpose([2, 0, 1])[np.newaxis, :].astype("float32")
- tensor = core.LoDTensor()
- tensor.set(data, place)
-
- fake_temp = exe.run(fetch_list=[fake.name], feed={"input": tensor})
- fake_temp = np.squeeze(fake_temp[0]).transpose([1, 2, 0])
- input_temp = np.squeeze(data).transpose([1, 2, 0])
-
- imsave(args.output + "/fake_" + image_name, (
- (fake_temp + 1) * 127.5).astype(np.uint8))
-
-
-if __name__ == "__main__":
- args = parser.parse_args()
- print_arguments(args)
- infer(args)
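
infer.py maps pixels from `[0, 255]` to `[-1, 1]` with `x / 127.5 - 1` before feeding the generator, and maps outputs back with `(y + 1) * 127.5` before saving. A minimal numpy sketch (not in the original file) of the round trip:

```python
import numpy as np

pixels = np.array([0, 127.5, 255], dtype=np.float32)
normalized = pixels / 127.5 - 1           # -> [-1., 0., 1.]
restored = (normalized + 1) * 127.5      # -> [0., 127.5, 255.]
assert np.allclose(restored, pixels)
print(normalized, restored.astype(np.uint8))
```
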
diff --git a/fluid/PaddleCV/gan/cycle_gan/layers.py b/fluid/PaddleCV/gan/cycle_gan/layers.py
deleted file mode 100644
index 0cbd5af6ebccc140104b81a75717fa3555c12d5a..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/gan/cycle_gan/layers.py
+++ /dev/null
@@ -1,157 +0,0 @@
-from __future__ import division
-import paddle.fluid as fluid
-import numpy as np
-import os
-
-use_cudnn = True
-if 'ce_mode' in os.environ:
- use_cudnn = False
-
-def cal_padding(img_size, stride, filter_size, dilation=1):
- """Calculate padding size."""
- valid_filter_size = dilation * (filter_size - 1) + 1
- if img_size % stride == 0:
- out_size = max(filter_size - stride, 0)
- else:
- out_size = max(filter_size - (img_size % stride), 0)
- return out_size // 2, out_size - out_size // 2
-
-
-def instance_norm(input, name=None):
- helper = fluid.layer_helper.LayerHelper("instance_norm", **locals())
- dtype = helper.input_dtype()
- epsilon = 1e-5
- mean = fluid.layers.reduce_mean(input, dim=[2, 3], keep_dim=True)
- var = fluid.layers.reduce_mean(
- fluid.layers.square(input - mean), dim=[2, 3], keep_dim=True)
- if name is not None:
- scale_name = name + "_scale"
- offset_name = name + "_offset"
- scale_param = fluid.ParamAttr(
- name=scale_name,
- initializer=fluid.initializer.TruncatedNormal(1.0, 0.02),
- trainable=True)
- offset_param = fluid.ParamAttr(
- name=offset_name,
- initializer=fluid.initializer.Constant(0.0),
- trainable=True)
- scale = helper.create_parameter(
- attr=scale_param, shape=input.shape[1:2], dtype=dtype)
- offset = helper.create_parameter(
- attr=offset_param, shape=input.shape[1:2], dtype=dtype)
-
- tmp = fluid.layers.elementwise_mul(x=(input - mean), y=scale, axis=1)
- tmp = tmp / fluid.layers.sqrt(var + epsilon)
- tmp = fluid.layers.elementwise_add(tmp, offset, axis=1)
- return tmp
-
-
-def conv2d(input,
- num_filters=64,
- filter_size=7,
- stride=1,
- stddev=0.02,
- padding="VALID",
- name="conv2d",
- norm=True,
- relu=True,
- relufactor=0.0):
- """Wrapper for conv2d op to support VALID and SAME padding mode."""
- need_crop = False
- if padding == "SAME":
- top_padding, bottom_padding = cal_padding(input.shape[2], stride,
- filter_size)
- left_padding, right_padding = cal_padding(input.shape[2], stride,
- filter_size)
- height_padding = bottom_padding
- width_padding = right_padding
- if top_padding != bottom_padding or left_padding != right_padding:
- height_padding = top_padding + stride
- width_padding = left_padding + stride
- need_crop = True
- else:
- height_padding = 0
- width_padding = 0
-
- padding = [height_padding, width_padding]
- param_attr = fluid.ParamAttr(
- name=name + "_w",
- initializer=fluid.initializer.TruncatedNormal(scale=stddev))
- bias_attr = fluid.ParamAttr(
- name=name + "_b", initializer=fluid.initializer.Constant(0.0))
- conv = fluid.layers.conv2d(
- input,
- num_filters,
- filter_size,
- name=name,
- stride=stride,
- padding=padding,
- use_cudnn=use_cudnn,
- param_attr=param_attr,
- bias_attr=bias_attr)
- if need_crop:
- conv = fluid.layers.crop(
- conv,
- shape=(-1, conv.shape[1], conv.shape[2] - 1, conv.shape[3] - 1),
- offsets=(0, 0, 1, 1))
- if norm:
- conv = instance_norm(input=conv, name=name + "_norm")
- if relu:
- conv = fluid.layers.leaky_relu(conv, alpha=relufactor)
- return conv
-
-
-def deconv2d(input,
- out_shape,
- num_filters=64,
- filter_size=7,
- stride=1,
- stddev=0.02,
- padding="VALID",
- name="conv2d",
- norm=True,
- relu=True,
- relufactor=0.0):
- """Wrapper for deconv2d op to support VALID and SAME padding mode."""
- need_crop = False
- if padding == "SAME":
- top_padding, bottom_padding = cal_padding(out_shape[0], stride,
- filter_size)
- left_padding, right_padding = cal_padding(out_shape[1], stride,
- filter_size)
- height_padding = top_padding
- width_padding = left_padding
- if top_padding != bottom_padding or left_padding != right_padding:
- need_crop = True
- else:
- height_padding = 0
- width_padding = 0
-
- padding = [height_padding, width_padding]
-
- param_attr = fluid.ParamAttr(
- name=name + "_w",
- initializer=fluid.initializer.TruncatedNormal(scale=stddev))
- bias_attr = fluid.ParamAttr(
- name=name + "_b", initializer=fluid.initializer.Constant(0.0))
- conv = fluid.layers.conv2d_transpose(
- input,
- num_filters,
- name=name,
- filter_size=filter_size,
- stride=stride,
- padding=padding,
- use_cudnn=use_cudnn,
- param_attr=param_attr,
- bias_attr=bias_attr)
-
- if need_crop:
- conv = fluid.layers.crop(
- conv,
- shape=(-1, conv.shape[1], conv.shape[2] - 1, conv.shape[3] - 1),
- offsets=(0, 0, 0, 0))
- if norm:
- conv = instance_norm(input=conv, name=name + "_norm")
- if relu:
- conv = fluid.layers.leaky_relu(conv, alpha=relufactor)
- return conv
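
The hand-rolled `instance_norm()` above normalizes each (sample, channel) slice over its spatial dimensions and then applies a learned scale and offset. A numpy sketch (not part of the original file) of the same computation, with scale and offset taken as 1 and 0 to match the means of the initializers used above:

```python
import numpy as np

def instance_norm_np(x, eps=1e-5):
    """x has shape (N, C, H, W); normalize over H and W per sample, per channel."""
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = ((x - mean) ** 2).mean(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

x = np.random.rand(2, 3, 8, 8).astype(np.float32)
y = instance_norm_np(x)
# every (sample, channel) slice is now ~zero-mean, ~unit-variance
print(y.mean(axis=(2, 3)).round(6))
print(y.var(axis=(2, 3)).round(3))
```
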
diff --git a/fluid/PaddleCV/gan/cycle_gan/train.py b/fluid/PaddleCV/gan/cycle_gan/train.py
deleted file mode 100644
index 1cc2fa090b3c35d61071f7ce1b7caedbd18226f9..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/gan/cycle_gan/train.py
+++ /dev/null
@@ -1,220 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import os
-import random
-import sys
-import paddle
-import argparse
-import functools
-import time
-import numpy as np
-from scipy.misc import imsave
-import paddle.fluid as fluid
-import paddle.fluid.profiler as profiler
-from paddle.fluid import core
-import data_reader
-from utility import add_arguments, print_arguments, ImagePool
-from trainer import *
-
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('batch_size', int, 1, "Minibatch size.")
-add_arg('epoch',             int,   2,            "The number of epochs to train.")
-add_arg('output',            str,   "./output_0", "The directory to save the model and test results to.")
-add_arg('init_model',        str,   None,         "The init model file or directory.")
-add_arg('save_checkpoints', bool, True, "Whether to save checkpoints.")
-add_arg('run_test', bool, True, "Whether to run test.")
-add_arg('use_gpu', bool, True, "Whether to use GPU to train.")
-add_arg('profile', bool, False, "Whether to profile.")
-add_arg('run_ce', bool, False, "Whether to run for model ce.")
-# yapf: enable
-
-
-def train(args):
-
- max_images_num = data_reader.max_images_num()
-    shuffle = True
- if args.run_ce:
- np.random.seed(10)
- fluid.default_startup_program().random_seed = 90
- max_images_num = 1
- shuffle = False
- data_shape = [-1] + data_reader.image_shape()
-
- input_A = fluid.layers.data(
- name='input_A', shape=data_shape, dtype='float32')
- input_B = fluid.layers.data(
- name='input_B', shape=data_shape, dtype='float32')
- fake_pool_A = fluid.layers.data(
- name='fake_pool_A', shape=data_shape, dtype='float32')
- fake_pool_B = fluid.layers.data(
- name='fake_pool_B', shape=data_shape, dtype='float32')
-
- g_A_trainer = GATrainer(input_A, input_B)
- g_B_trainer = GBTrainer(input_A, input_B)
- d_A_trainer = DATrainer(input_A, fake_pool_A)
- d_B_trainer = DBTrainer(input_B, fake_pool_B)
-
- # prepare environment
- place = fluid.CPUPlace()
- if args.use_gpu:
- place = fluid.CUDAPlace(0)
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
- A_pool = ImagePool()
- B_pool = ImagePool()
-
- A_reader = paddle.batch(data_reader.a_reader(shuffle=shuffle), args.batch_size)()
- B_reader = paddle.batch(data_reader.b_reader(shuffle=shuffle), args.batch_size)()
- if not args.run_ce:
- A_test_reader = data_reader.a_test_reader()
- B_test_reader = data_reader.b_test_reader()
-
- def test(epoch):
- out_path = args.output + "/test"
- if not os.path.exists(out_path):
- os.makedirs(out_path)
- i = 0
- for data_A, data_B in zip(A_test_reader(), B_test_reader()):
- A_name = data_A[1]
- B_name = data_B[1]
- tensor_A = core.LoDTensor()
- tensor_B = core.LoDTensor()
- tensor_A.set(data_A[0], place)
- tensor_B.set(data_B[0], place)
- fake_A_temp, fake_B_temp, cyc_A_temp, cyc_B_temp = exe.run(
- g_A_trainer.infer_program,
- fetch_list=[
- g_A_trainer.fake_A, g_A_trainer.fake_B, g_A_trainer.cyc_A,
- g_A_trainer.cyc_B
- ],
- feed={"input_A": tensor_A,
- "input_B": tensor_B})
- fake_A_temp = np.squeeze(fake_A_temp[0]).transpose([1, 2, 0])
- fake_B_temp = np.squeeze(fake_B_temp[0]).transpose([1, 2, 0])
- cyc_A_temp = np.squeeze(cyc_A_temp[0]).transpose([1, 2, 0])
- cyc_B_temp = np.squeeze(cyc_B_temp[0]).transpose([1, 2, 0])
- input_A_temp = np.squeeze(data_A[0]).transpose([1, 2, 0])
- input_B_temp = np.squeeze(data_B[0]).transpose([1, 2, 0])
-
- imsave(out_path + "/fakeB_" + str(epoch) + "_" + A_name, (
- (fake_B_temp + 1) * 127.5).astype(np.uint8))
- imsave(out_path + "/fakeA_" + str(epoch) + "_" + B_name, (
- (fake_A_temp + 1) * 127.5).astype(np.uint8))
- imsave(out_path + "/cycA_" + str(epoch) + "_" + A_name, (
- (cyc_A_temp + 1) * 127.5).astype(np.uint8))
- imsave(out_path + "/cycB_" + str(epoch) + "_" + B_name, (
- (cyc_B_temp + 1) * 127.5).astype(np.uint8))
- imsave(out_path + "/inputA_" + str(epoch) + "_" + A_name, (
- (input_A_temp + 1) * 127.5).astype(np.uint8))
- imsave(out_path + "/inputB_" + str(epoch) + "_" + B_name, (
- (input_B_temp + 1) * 127.5).astype(np.uint8))
- i += 1
-
- def checkpoints(epoch):
- out_path = args.output + "/checkpoints/" + str(epoch)
- if not os.path.exists(out_path):
- os.makedirs(out_path)
- fluid.io.save_persistables(
- exe, out_path + "/g_a", main_program=g_A_trainer.program, filename="params")
- fluid.io.save_persistables(
- exe, out_path + "/g_b", main_program=g_B_trainer.program, filename="params")
- fluid.io.save_persistables(
- exe, out_path + "/d_a", main_program=d_A_trainer.program, filename="params")
- fluid.io.save_persistables(
- exe, out_path + "/d_b", main_program=d_B_trainer.program, filename="params")
- print("saved checkpoint to {}".format(out_path))
- sys.stdout.flush()
-
- def init_model():
- assert os.path.exists(
-            args.init_model), "[%s] can't be found." % args.init_model
- fluid.io.load_persistables(
- exe, args.init_model + "/g_a", main_program=g_A_trainer.program)
- fluid.io.load_persistables(
- exe, args.init_model + "/g_b", main_program=g_B_trainer.program)
- fluid.io.load_persistables(
- exe, args.init_model + "/d_a", main_program=d_A_trainer.program)
- fluid.io.load_persistables(
- exe, args.init_model + "/d_b", main_program=d_B_trainer.program)
- print("Load model from {}".format(args.init_model))
-
- if args.init_model:
- init_model()
-    losses = [[], []]
- t_time = 0
- for epoch in range(args.epoch):
- batch_id = 0
- for i in range(max_images_num):
- data_A = next(A_reader)
- data_B = next(B_reader)
- tensor_A = core.LoDTensor()
- tensor_B = core.LoDTensor()
- tensor_A.set(data_A, place)
- tensor_B.set(data_B, place)
- s_time = time.time()
- # optimize the g_A network
- g_A_loss, fake_B_tmp = exe.run(
- g_A_trainer.program,
- fetch_list=[g_A_trainer.g_loss_A, g_A_trainer.fake_B],
- feed={"input_A": tensor_A,
- "input_B": tensor_B})
-
- fake_pool_B = B_pool.pool_image(fake_B_tmp)
-
- # optimize the d_B network
- d_B_loss = exe.run(
- d_B_trainer.program,
- fetch_list=[d_B_trainer.d_loss_B],
- feed={"input_B": tensor_B,
- "fake_pool_B": fake_pool_B})[0]
-
- # optimize the g_B network
- g_B_loss, fake_A_tmp = exe.run(
- g_B_trainer.program,
- fetch_list=[g_B_trainer.g_loss_B, g_B_trainer.fake_A],
- feed={"input_A": tensor_A,
- "input_B": tensor_B})
-
- fake_pool_A = A_pool.pool_image(fake_A_tmp)
-
- # optimize the d_A network
- d_A_loss = exe.run(
- d_A_trainer.program,
- fetch_list=[d_A_trainer.d_loss_A],
- feed={"input_A": tensor_A,
- "fake_pool_A": fake_pool_A})[0]
- t_time += (time.time() - s_time)
- print("epoch{}; batch{}; g_A_loss: {}; d_B_loss: {}; g_B_loss: {}; d_A_loss: {};".format(
- epoch, batch_id, g_A_loss[0], d_B_loss[0], g_B_loss[0],
- d_A_loss[0]))
- losses[0].append(g_A_loss[0])
- losses[1].append(d_A_loss[0])
- sys.stdout.flush()
- batch_id += 1
-
- if args.run_test and not args.run_ce:
- test(epoch)
- if args.save_checkpoints and not args.run_ce:
- checkpoints(epoch)
- if args.run_ce:
- print("kpis,g_train_cost,{}".format(np.mean(losses[0])))
- print("kpis,d_train_cost,{}".format(np.mean(losses[1])))
- print("kpis,duration,{}".format(t_time / args.epoch))
-
-
-if __name__ == "__main__":
- args = parser.parse_args()
- print_arguments(args)
- if args.profile:
- if args.use_gpu:
- with profiler.cuda_profiler("cuda_profiler.txt", 'csv') as nvprof:
- train(args)
- else:
- with profiler.profiler("CPU", sorted_key='total') as cpuprof:
- train(args)
- else:
- train(args)
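
`ImagePool` is imported from `utility` above, and its body is not part of this diff. A minimal sketch, assuming it implements the standard CycleGAN history buffer (Zhu et al., 2017): keep up to `pool_size` past generator outputs and, once the buffer is full, return either the new image or a randomly stored one with 50% probability each, which stabilizes the discriminators by also showing them older fakes.

```python
import random

class ImagePool(object):
    """Assumed implementation of the CycleGAN image buffer."""
    def __init__(self, pool_size=50):
        self.pool_size = pool_size
        self.images = []

    def pool_image(self, image):
        if len(self.images) < self.pool_size:
            self.images.append(image)      # still filling the buffer
            return image
        if random.random() > 0.5:          # swap with a stored fake
            idx = random.randrange(self.pool_size)
            stored, self.images[idx] = self.images[idx], image
            return stored
        return image                       # or pass the new fake through
```
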
diff --git a/fluid/PaddleCV/human_pose_estimation/README.md b/fluid/PaddleCV/human_pose_estimation/README.md
index 25dc66500c2bfa6ef10c9dea6b150908d1f1418b..6ced2b3b2cd19d413f2c8f2b139725c2e5ea14fc 100644
--- a/fluid/PaddleCV/human_pose_estimation/README.md
+++ b/fluid/PaddleCV/human_pose_estimation/README.md
@@ -1,110 +1,6 @@
-# Simple Baselines for Human Pose Estimation in Fluid
-## Introduction
-This is a simple demonstration of a re-implementation in [PaddlePaddle.Fluid](http://www.paddlepaddle.org/en) of the paper [Simple Baselines for Human Pose Estimation and Tracking](https://arxiv.org/abs/1804.06208) (ECCV'18) from MSRA.
+Hi!
-
+This directory has been deprecated.
-> **Video in Demo**: *Bruno Mars - That’s What I Like [Official Video]*.
-
-## Requirements
-
- - Python == 2.7
- - PaddlePaddle >= 1.0
- - opencv-python >= 3.3
- - tqdm >= 4.25
-
-## Environment
-
-The code was developed and tested on CentOS with 4 Tesla K40 GPU cards, CUDA 9.2/8.0, and cuDNN 7.1.
-
-## Known Issues
-
- - The model does not converge with a large batch\_size (e.g. 32) on Tesla P40 / V100 / P100 GPU cards, because PaddlePaddle uses the cuDNN batch normalization function. Reducing batch\_size to 1 image per card during training eases the problem, but the resulting performance has not been verified. The issue can be tracked [here](https://github.com/PaddlePaddle/Paddle/issues/14580).
-
-## Results on MPII Val
-| Arch | Head | Shoulder | Elbow | Wrist | Hip | Knee | Ankle | Mean | Mean@0.1| Models |
-| ---- |:----:|:--------:|:-----:|:-----:|:---:|:----:|:-----:|:----:|:-------:|:------:|
-| 384x384\_pose\_resnet\_50 in PyTorch | 96.658 | 95.754 | 89.790 | 84.614 | 88.523 | 84.666 | 79.287 | 89.066 | 38.046 | - |
-| 384x384\_pose\_resnet\_50 in Fluid | 96.248 | 95.346 | 89.807 | 84.873 | 88.298 | 83.679 | 78.649 | 88.767 | 37.374 | [`link`](http://paddlemodels.bj.bcebos.com/pose/pose-resnet-50-384x384-mpii.tar.gz) |
-
-### Notes:
-
- - Flip test is used.
- - We did not search hard for the best model; validation simply uses the last saved model.
-
-## Getting Started
-
-### Prepare Datasets and Pretrained Models
-
- - Follow the [instructions](https://github.com/Microsoft/human-pose-estimation.pytorch#data-preparation) to prepare the datasets.
- - Download the pretrained ResNet-50 model in PaddlePaddle.Fluid on ImageNet from [Model Zoo](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification#supported-models-and-performances).
-
-```bash
-wget http://paddle-imagenet-models.bj.bcebos.com/resnet_50_model.tar
-```
-
-Then, put them in the folder `pretrained` under the root directory of this repo, so that the layout looks like this:
-
-```
-${THIS REPO ROOT}
- `-- pretrained
- `-- resnet_50
- |-- 115
- `-- data
- `-- coco
- |-- annotations
- |-- images
- `-- mpii
- |-- annot
- |-- images
-```
-
-### Install [COCOAPI](https://github.com/cocodataset/cocoapi)
-
-```bash
-# COCOAPI=/path/to/clone/cocoapi
-git clone https://github.com/cocodataset/cocoapi.git $COCOAPI
-cd $COCOAPI/PythonAPI
-# if cython is not installed
-pip install Cython
-# Install into global site-packages
-make install
-# Alternatively, if you do not have permissions or prefer
-# not to install the COCO API into global site-packages
-python2 setup.py install --user
-```
-
-### Perform Validating
-
-Download the checkpoint of Pose-ResNet-50 trained on the MPII dataset from [here](http://paddlemodels.bj.bcebos.com/pose/pose-resnet-50-384x384-mpii.tar.gz). Extract it into the folder `checkpoints` under the root directory of this repo. Then run
-
-```bash
-python2 val.py --dataset 'mpii' --checkpoint 'checkpoints/pose-resnet-50-384x384-mpii'
-```
-
-### Perform Training
-
-```bash
-python2 train.py --dataset 'mpii' # or coco
-```
-
-**Note**: Configurations for training are aggregated in `lib/mpii_reader.py` and `lib/coco_reader.py`.
-
-### Perform Test on Images
-
-Put the images into the folder `test` under the directory root of this repo. Then run
-
-```bash
-python2 test.py --checkpoint 'checkpoints/pose-resnet-50-384x384-mpii'
-```
-
-If there are multiple persons in an image, detectors such as [Faster R-CNN](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/faster_rcnn), [SSD](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/object_detection) or others should be used first to crop them out, because this simple baseline for human pose estimation is a top-down method.
-
-## Reference
-
- - Simple Baselines for Human Pose Estimation and Tracking in PyTorch [`code`](https://github.com/Microsoft/human-pose-estimation.pytorch#data-preparation)
-
-## License
-
-This code is released under the Apache License 2.0.
+Please visit the project at [PaddleCV/human_pose_estimation](../../../PaddleCV/human_pose_estimation).
diff --git a/fluid/PaddleCV/human_pose_estimation/README_cn.md b/fluid/PaddleCV/human_pose_estimation/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..84120d0c568b13bfbccead92cd7f9211193f7669
--- /dev/null
+++ b/fluid/PaddleCV/human_pose_estimation/README_cn.md
@@ -0,0 +1,2 @@
+
+Hi, this project has been migrated. Please go to the [PaddleCV/human_pose_estimation](../../../PaddleCV/human_pose_estimation) directory to browse it.
diff --git a/fluid/PaddleCV/human_pose_estimation/lib/coco_reader.py b/fluid/PaddleCV/human_pose_estimation/lib/coco_reader.py
deleted file mode 100644
index e955bee6f3efa1c507bec7d2dc83a93d741159b4..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/human_pose_estimation/lib/coco_reader.py
+++ /dev/null
@@ -1,323 +0,0 @@
-# Copyright (c) 2018-present, Baidu, Inc.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-##############################################################################
-
-"""Data reader for COCO dataset."""
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os
-import functools
-import numpy as np
-import cv2
-import random
-
-from utils.transforms import fliplr_joints
-from utils.transforms import get_affine_transform
-from utils.transforms import affine_transform
-from lib.base_reader import visualize, generate_target
-from pycocotools.coco import COCO
-
-# NOTE
-# -- COCO Datatset --
-# "keypoints":
-# {
-# 0: "nose",
-# 1: "left_eye",
-# 2: "right_eye",
-# 3: "left_ear",
-# 4: "right_ear",
-# 5: "left_shoulder",
-# 6: "right_shoulder",
-# 7: "left_elbow",
-# 8: "right_elbow",
-# 9: "left_wrist",
-# 10: "right_wrist",
-# 11: "left_hip",
-# 12: "right_hip",
-# 13: "left_knee",
-# 14: "right_knee",
-# 15: "left_ankle",
-# 16: "right_ankle"
-# },
-#
-# "skeleton":
-# [
-# [16,14],[14,12],[17,15],[15,13],[12,13],[6,12],[7,13], [6,7],[6,8],
-# [7,9],[8,10],[9,11],[2,3],[1,2],[1,3],[2,4],[3,5],[4,6],[5,7]
-# ]
-
-class Config:
- """Configurations for COCO dataset.
- """
- DEBUG = False
- TMPDIR = 'tmp_fold_for_debug'
-
- # For reader
- BUF_SIZE = 102400
-    THREAD = 1 if DEBUG else 8  # must be larger than 0
-
- # Fixed infos of dataset
- DATAROOT = 'data/coco'
- IMAGEDIR = 'images'
- NUM_JOINTS = 17
- FLIP_PAIRS = [[1, 2], [3, 4], [5, 6], [7, 8], [9, 10], [11, 12], [13, 14], [15, 16]]
- PARENT_IDS = None
-
- # CFGS
- SCALE_FACTOR = 0.3
- ROT_FACTOR = 40
- FLIP = True
- TARGET_TYPE = 'gaussian'
- SIGMA = 3
- IMAGE_SIZE = [288, 384]
- HEATMAP_SIZE = [72, 96]
- ASPECT_RATIO = IMAGE_SIZE[0] * 1.0 / IMAGE_SIZE[1]
- MEAN = [0.485, 0.456, 0.406]
- STD = [0.229, 0.224, 0.225]
- PIXEL_STD = 200
-
-cfg = Config()
-
-def _box2cs(box):
- x, y, w, h = box[:4]
- return _xywh2cs(x, y, w, h)
-
-def _xywh2cs(x, y, w, h):
- center = np.zeros((2), dtype=np.float32)
- center[0] = x + w * 0.5
- center[1] = y + h * 0.5
-
- if w > cfg.ASPECT_RATIO * h:
- h = w * 1.0 / cfg.ASPECT_RATIO
- elif w < cfg.ASPECT_RATIO * h:
- w = h * cfg.ASPECT_RATIO
- scale = np.array(
- [w * 1.0 / cfg.PIXEL_STD, h * 1.0 / cfg.PIXEL_STD],
- dtype=np.float32)
- if center[0] != -1:
- scale = scale * 1.25
-
- return center, scale
-
-def _select_data(db):
- db_selected = []
- for rec in db:
- num_vis = 0
- joints_x = 0.0
- joints_y = 0.0
- for joint, joint_vis in zip(
- rec['joints_3d'], rec['joints_3d_vis']):
- if joint_vis[0] <= 0:
- continue
- num_vis += 1
-
- joints_x += joint[0]
- joints_y += joint[1]
- if num_vis == 0:
- continue
-
- joints_x, joints_y = joints_x / num_vis, joints_y / num_vis
-
- area = rec['scale'][0] * rec['scale'][1] * (cfg.PIXEL_STD**2)
- joints_center = np.array([joints_x, joints_y])
- bbox_center = np.array(rec['center'])
- diff_norm2 = np.linalg.norm((joints_center-bbox_center), 2)
- ks = np.exp(-1.0*(diff_norm2**2) / ((0.2)**2*2.0*area))
-
- metric = (0.2 / 16) * num_vis + 0.45 - 0.2 / 16
- if ks > metric:
- db_selected.append(rec)
-
- print('=> num db: {}'.format(len(db)))
- print('=> num selected db: {}'.format(len(db_selected)))
- return db_selected
-
-def _load_coco_keypoint_annotation(image_set_index, coco, _coco_ind_to_class_ind, image_set):
- """Ground truth bbox and keypoints.
- """
- print('generating coco gt_db...')
- gt_db = []
- for index in image_set_index:
- im_ann = coco.loadImgs(index)[0]
- width = im_ann['width']
- height = im_ann['height']
-
- annIds = coco.getAnnIds(imgIds=index, iscrowd=False)
- objs = coco.loadAnns(annIds)
-
- # Sanitize bboxes
- valid_objs = []
- for obj in objs:
- x, y, w, h = obj['bbox']
- x1 = np.max((0, x))
- y1 = np.max((0, y))
- x2 = np.min((width - 1, x1 + np.max((0, w - 1))))
- y2 = np.min((height - 1, y1 + np.max((0, h - 1))))
- if obj['area'] > 0 and x2 >= x1 and y2 >= y1:
- obj['clean_bbox'] = [x1, y1, x2-x1, y2-y1]
- valid_objs.append(obj)
- objs = valid_objs
-
- rec = []
- for obj in objs:
- cls = _coco_ind_to_class_ind[obj['category_id']]
- if cls != 1:
- continue
-
- # Ignore objs without keypoints annotation
- if max(obj['keypoints']) == 0:
- continue
-
- joints_3d = np.zeros((cfg.NUM_JOINTS, 3), dtype=np.float)
- joints_3d_vis = np.zeros((cfg.NUM_JOINTS, 3), dtype=np.float)
- for ipt in range(cfg.NUM_JOINTS):
- joints_3d[ipt, 0] = obj['keypoints'][ipt * 3 + 0]
- joints_3d[ipt, 1] = obj['keypoints'][ipt * 3 + 1]
- joints_3d[ipt, 2] = 0
- t_vis = obj['keypoints'][ipt * 3 + 2]
- if t_vis > 1:
- t_vis = 1
- joints_3d_vis[ipt, 0] = t_vis
- joints_3d_vis[ipt, 1] = t_vis
- joints_3d_vis[ipt, 2] = 0
-
- center, scale = _box2cs(obj['clean_bbox'][:4])
- rec.append({
- 'image': os.path.join(cfg.DATAROOT, cfg.IMAGEDIR, image_set+'2017', '%012d.jpg' % index),
- 'center': center,
- 'scale': scale,
- 'joints_3d': joints_3d,
- 'joints_3d_vis': joints_3d_vis,
- 'filename': '%012d.jpg' % index,
- 'imgnum': 0,
- })
-
- gt_db.extend(rec)
- return gt_db
-
-def data_augmentation(sample, is_train):
- image_file = sample['image']
- filename = sample['filename'] if 'filename' in sample else ''
- joints = sample['joints_3d']
- joints_vis = sample['joints_3d_vis']
- c = sample['center']
- s = sample['scale']
- # score = sample['score'] if 'score' in sample else 1
- # imgnum = sample['imgnum'] if 'imgnum' in sample else ''
- r = 0
-
- data_numpy = cv2.imread(
- image_file, cv2.IMREAD_COLOR | cv2.IMREAD_IGNORE_ORIENTATION)
-
- if is_train:
- sf = cfg.SCALE_FACTOR
- rf = cfg.ROT_FACTOR
- s = s * np.clip(np.random.randn()*sf + 1, 1 - sf, 1 + sf)
- r = np.clip(np.random.randn()*rf, -rf*2, rf*2) \
- if random.random() <= 0.6 else 0
-
- if cfg.FLIP and random.random() <= 0.5:
- data_numpy = data_numpy[:, ::-1, :]
- joints, joints_vis = fliplr_joints(
- joints, joints_vis, data_numpy.shape[1], cfg.FLIP_PAIRS)
- c[0] = data_numpy.shape[1] - c[0] - 1
-
- trans = get_affine_transform(c, s, r, cfg.IMAGE_SIZE)
- input = cv2.warpAffine(
- data_numpy,
- trans,
- (int(cfg.IMAGE_SIZE[0]), int(cfg.IMAGE_SIZE[1])),
- flags=cv2.INTER_LINEAR)
-
- for i in range(cfg.NUM_JOINTS):
- if joints_vis[i, 0] > 0.0:
- joints[i, 0:2] = affine_transform(joints[i, 0:2], trans)
-
- # Numpy target
- target, target_weight = generate_target(cfg, joints, joints_vis)
-
- if cfg.DEBUG:
- visualize(cfg, filename, data_numpy, input.copy(), joints, target)
-
- # Normalization
- input = input.astype('float32').transpose((2, 0, 1)) / 255
- input -= np.array(cfg.MEAN).reshape((3, 1, 1))
- input /= np.array(cfg.STD).reshape((3, 1, 1))
-
- if is_train:
- return input, target, target_weight
- else:
- return input, target, target_weight, c, s
-
-# Create a reader
-def _reader_creator(root, image_set, shuffle=False, is_train=False, use_gt_bbox=False):
-
- def reader():
- if image_set in ['train', 'val']:
- file_name = os.path.join(root, 'annotations', 'person_keypoints_'+image_set+'2017.json')
- elif image_set in ['test', 'test-dev']:
- file_name = os.path.join(root, 'annotations', 'image_info_'+image_set+'2017.json')
- else:
- raise ValueError("The dataset '{}' is not supported".format(image_set))
-
- # Load annotations
- coco = COCO(file_name)
-
- # Deal with class names
- cats = [cat['name']
- for cat in coco.loadCats(coco.getCatIds())]
- classes = ['__background__'] + cats
- print('=> classes: {}'.format(classes))
- num_classes = len(classes)
- _class_to_ind = dict(zip(classes, range(num_classes)))
- _class_to_coco_ind = dict(zip(cats, coco.getCatIds()))
- _coco_ind_to_class_ind = dict([(_class_to_coco_ind[cls],
- _class_to_ind[cls])
- for cls in classes[1:]])
-
- # Load image file names
- image_set_index = coco.getImgIds()
- num_images = len(image_set_index)
- print('=> num_images: {}'.format(num_images))
-
- if is_train or use_gt_bbox:
- gt_db = _load_coco_keypoint_annotation(
- image_set_index, coco, _coco_ind_to_class_ind, image_set)
- gt_db = _select_data(gt_db)
-
- if shuffle:
- random.shuffle(gt_db)
-
- for db in gt_db:
- yield db
-
- mapper = functools.partial(data_augmentation, is_train=is_train)
- return reader, mapper
-
-def train():
- reader, mapper = _reader_creator(cfg.DATAROOT, 'train', shuffle=True, is_train=True)
- def pop():
- for i, x in enumerate(reader()):
- yield mapper(x)
- return pop
-
-def valid():
- reader, mapper = _reader_creator(cfg.DATAROOT, 'val', shuffle=False, is_train=False)
- def pop():
- for i, x in enumerate(reader()):
- yield mapper(x)
- return pop
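
The `_box2cs`/`_xywh2cs` pair above converts a COCO `(x, y, w, h)` box into the center/scale representation the pose model consumes: the box is padded to the fixed input aspect ratio (288/384 = 0.75), expressed in units of `PIXEL_STD` (200 px), and enlarged by 1.25x. A short worked sketch (not part of the original file):

```python
import numpy as np

ASPECT_RATIO, PIXEL_STD = 288.0 / 384.0, 200

def xywh2cs(x, y, w, h):
    center = np.array([x + w * 0.5, y + h * 0.5], dtype=np.float32)
    if w > ASPECT_RATIO * h:            # too wide: grow the height
        h = w / ASPECT_RATIO
    elif w < ASPECT_RATIO * h:          # too tall: grow the width
        w = h * ASPECT_RATIO
    scale = np.array([w, h], dtype=np.float32) / PIXEL_STD
    if center[0] != -1:
        scale *= 1.25                   # leave margin around the person
    return center, scale

print(xywh2cs(100, 50, 60, 120))
# center [130. 110.], scale [0.5625 0.75]
```
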
diff --git a/fluid/PaddleCV/human_pose_estimation/lib/mpii_reader.py b/fluid/PaddleCV/human_pose_estimation/lib/mpii_reader.py
deleted file mode 100644
index f1f6fb38b9dcbfc6402a29fc26e44188c3cfe746..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/human_pose_estimation/lib/mpii_reader.py
+++ /dev/null
@@ -1,216 +0,0 @@
-# Copyright (c) 2018-present, Baidu, Inc.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-##############################################################################
-
-"""Data reader for MPII."""
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os
-import random
-import functools
-import json
-import numpy as np
-import cv2
-
-from utils.transforms import fliplr_joints
-from utils.transforms import get_affine_transform
-from utils.transforms import affine_transform
-from lib.base_reader import visualize, generate_target
-
-class Config:
- """Configurations for MPII dataset.
- """
- DEBUG = False
- TMPDIR = 'tmp_fold_for_debug'
-
- # For reader
- BUF_SIZE = 102400
-    THREAD = 1 if DEBUG else 8  # must be larger than 0
-
- # Fixed infos of dataset
- DATAROOT = 'data/mpii'
- IMAGEDIR = 'images'
- NUM_JOINTS = 16
- FLIP_PAIRS = [[0, 5], [1, 4], [2, 3], [10, 15], [11, 14], [12, 13]]
- PARENT_IDS = [1, 2, 6, 6, 3, 4, 6, 6, 7, 8, 11, 12, 7, 7, 13, 14]
-
- # CFGS
- SCALE_FACTOR = 0.3
- ROT_FACTOR = 40
- FLIP = True
- TARGET_TYPE = 'gaussian'
- SIGMA = 3
- IMAGE_SIZE = [384, 384]
- HEATMAP_SIZE = [96, 96]
- MEAN = [0.485, 0.456, 0.406]
- STD = [0.229, 0.224, 0.225]
-
-cfg = Config()
-
-def data_augmentation(sample, is_train):
- image_file = sample['image']
- filename = sample['filename'] if 'filename' in sample else ''
- joints = sample['joints_3d']
- joints_vis = sample['joints_3d_vis']
- c = sample['center']
- s = sample['scale']
- score = sample['score'] if 'score' in sample else 1
- # imgnum = sample['imgnum'] if 'imgnum' in sample else ''
- r = 0
-
- data_numpy = cv2.imread(
- image_file, cv2.IMREAD_COLOR | cv2.IMREAD_IGNORE_ORIENTATION)
-
- if is_train:
- sf = cfg.SCALE_FACTOR
- rf = cfg.ROT_FACTOR
- s = s * np.clip(np.random.randn()*sf + 1, 1 - sf, 1 + sf)
- r = np.clip(np.random.randn()*rf, -rf*2, rf*2) \
- if random.random() <= 0.6 else 0
-
- if cfg.FLIP and random.random() <= 0.5:
- data_numpy = data_numpy[:, ::-1, :]
- joints, joints_vis = fliplr_joints(
- joints, joints_vis, data_numpy.shape[1], cfg.FLIP_PAIRS)
- c[0] = data_numpy.shape[1] - c[0] - 1
-
- trans = get_affine_transform(c, s, r, cfg.IMAGE_SIZE)
- input = cv2.warpAffine(
- data_numpy,
- trans,
- (int(cfg.IMAGE_SIZE[0]), int(cfg.IMAGE_SIZE[1])),
- flags=cv2.INTER_LINEAR)
-
- for i in range(cfg.NUM_JOINTS):
- if joints_vis[i, 0] > 0.0:
- joints[i, 0:2] = affine_transform(joints[i, 0:2], trans)
-
- # Numpy target
- target, target_weight = generate_target(cfg, joints, joints_vis)
-
- if cfg.DEBUG:
- visualize(cfg, filename, data_numpy, input.copy(), joints, target)
-
- # Normalization
- input = input.astype('float32').transpose((2, 0, 1)) / 255
- input -= np.array(cfg.MEAN).reshape((3, 1, 1))
- input /= np.array(cfg.STD).reshape((3, 1, 1))
-
- if is_train:
- return input, target, target_weight
- else:
- return input, target, target_weight, c, s, score
-
-def test_data_augmentation(sample):
- image_file = sample['image']
- filename = sample['filename'] if 'filename' in sample else ''
-
- file_id = int(filename.split('.')[0].split('_')[1])
-
- input = cv2.imread(
- image_file, cv2.IMREAD_COLOR | cv2.IMREAD_IGNORE_ORIENTATION)
-
- input = cv2.resize(input, (int(cfg.IMAGE_SIZE[0]), int(cfg.IMAGE_SIZE[1])))
-
- # Normalization
- input = input.astype('float32').transpose((2, 0, 1)) / 255
- input -= np.array(cfg.MEAN).reshape((3, 1, 1))
- input /= np.array(cfg.STD).reshape((3, 1, 1))
-
- return input, file_id
-
-# Create a reader
-def _reader_creator(root, image_set, shuffle=False, is_train=False):
- def reader():
- if image_set != 'test':
- file_name = os.path.join(root, 'annot', image_set+'.json')
- with open(file_name) as anno_file:
- anno = json.load(anno_file)
- print('=> load {} samples of {} dataset'.format(len(anno), image_set))
-
- if shuffle:
- random.shuffle(anno)
-
- for a in anno:
- image_name = a['image']
-
-                c = np.array(a['center'], dtype=np.float64)
-                s = np.array([a['scale'], a['scale']], dtype=np.float64)
-
- # Adjust center/scale slightly to avoid cropping limbs
- if c[0] != -1:
- c[1] = c[1] + 15 * s[1]
- s = s * 1.25
-
-                # MPII annotations use the MATLAB convention (1-based indices),
-                # so convert to 0-based indices first
-                c = c - 1
-
-                joints_3d = np.zeros((cfg.NUM_JOINTS, 3), dtype=np.float64)
-                joints_3d_vis = np.zeros((cfg.NUM_JOINTS, 3), dtype=np.float64)
-
- joints = np.array(a['joints'])
- joints[:, 0:2] = joints[:, 0:2] - 1
- joints_vis = np.array(a['joints_vis'])
- assert len(joints) == cfg.NUM_JOINTS, \
- 'joint num diff: {} vs {}'.format(len(joints), cfg.NUM_JOINTS)
-
- joints_3d[:, 0:2] = joints[:, 0:2]
- joints_3d_vis[:, 0] = joints_vis[:]
- joints_3d_vis[:, 1] = joints_vis[:]
-
- yield dict(
- image = os.path.join(cfg.DATAROOT, cfg.IMAGEDIR, image_name),
- center = c,
- scale = s,
- joints_3d = joints_3d,
- joints_3d_vis = joints_3d_vis,
- filename = image_name,
- test_mode = False,
- imagenum = 0)
- else:
- fold = 'test'
- for img_name in os.listdir(fold):
- yield dict(image = os.path.join(fold, img_name),
- filename = img_name)
-
-    if image_set != 'test':
-        mapper = functools.partial(data_augmentation, is_train=is_train)
-    else:
-        mapper = test_data_augmentation
- return reader, mapper
-
-def train():
- reader, mapper = _reader_creator(cfg.DATAROOT, 'train', shuffle=True, is_train=True)
- def pop():
- for i, x in enumerate(reader()):
- yield mapper(x)
- return pop
-
-def valid():
- reader, mapper = _reader_creator(cfg.DATAROOT, 'valid', shuffle=False, is_train=False)
- def pop():
- for i, x in enumerate(reader()):
- yield mapper(x)
- return pop
-
-def test():
- reader, mapper = _reader_creator(cfg.DATAROOT, 'test')
- def pop():
- for i, x in enumerate(reader()):
- yield mapper(x)
- return pop
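`generate_target` is imported from `lib.base_reader` and is not part of this diff. For context, a hypothetical NumPy sketch of what a Gaussian-heatmap target generator of this kind typically computes (shapes follow `Config` above: SIGMA=3, 384x384 inputs, 96x96 heatmaps; the real implementation may window the Gaussian and clip at borders):

```python
import numpy as np

def gaussian_heatmap_sketch(joints, joints_vis, heatmap_size=(96, 96),
                            image_size=(384, 384), sigma=3):
    """One Gaussian heatmap per joint (illustrative only)."""
    num_joints = joints.shape[0]
    target = np.zeros((num_joints, heatmap_size[1], heatmap_size[0]),
                      dtype=np.float32)
    target_weight = joints_vis[:, 0:1].astype(np.float32)
    stride = image_size[0] / heatmap_size[0]  # 4 for 384 -> 96
    xs = np.arange(heatmap_size[0])
    ys = np.arange(heatmap_size[1])[:, None]
    for i in range(num_joints):
        if target_weight[i] == 0:
            continue  # invisible joints contribute no loss
        mu_x, mu_y = joints[i, 0] / stride, joints[i, 1] / stride
        target[i] = np.exp(-((xs - mu_x) ** 2 + (ys - mu_y) ** 2)
                           / (2.0 * sigma ** 2))
    return target, target_weight
```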
diff --git a/fluid/PaddleCV/human_pose_estimation/lib/pose_resnet.py b/fluid/PaddleCV/human_pose_estimation/lib/pose_resnet.py
deleted file mode 100644
index aab9eb551705a40e148e0af591fbbecd15ebcbed..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/human_pose_estimation/lib/pose_resnet.py
+++ /dev/null
@@ -1,192 +0,0 @@
-# Copyright (c) 2018-present, Baidu, Inc.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-##############################################################################
-
-"""Functions for building network."""
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle.fluid as fluid
-
-__all__ = ["ResNet", "ResNet50", "ResNet101", "ResNet152"]
-
-# Global parameters
-BN_MOMENTUM = 0.1
-
-class ResNet():
- def __init__(self, layers=50, kps_num=16, test_mode=False):
- """
- :param layers: int, the layers number which is used here
- :param kps_num: int, the number of keypoints in accord with the dataset
- :param test_mode: bool, if True, only return output heatmaps, no loss
-
- :return: loss, output heatmaps
- """
- self.k = kps_num
- self.layers = layers
- self.test_mode = test_mode
-
- def net(self, input, target=None, target_weight=None):
- layers = self.layers
- supported_layers = [50, 101, 152]
- assert layers in supported_layers, \
- "supported layers are {} but input layer is {}".format(supported_layers, layers)
-
- if layers == 50:
- depth = [3, 4, 6, 3]
- elif layers == 101:
- depth = [3, 4, 23, 3]
- elif layers == 152:
- depth = [3, 8, 36, 3]
- num_filters = [64, 128, 256, 512]
-
- conv = self.conv_bn_layer(
- input=input, num_filters=64, filter_size=7, stride=2, act='relu')
- conv = fluid.layers.pool2d(
- input=conv,
- pool_size=3,
- pool_stride=2,
- pool_padding=1,
- pool_type='max')
-
- for block in range(len(depth)):
- for i in range(depth[block]):
- conv = self.bottleneck_block(
- input=conv,
- num_filters=num_filters[block],
- stride=2 if i == 0 and block != 0 else 1)
-
-        # Three identical 4x4/stride-2 deconv + BN/ReLU stages upsample 8x
-        for _ in range(3):
-            conv = fluid.layers.conv2d_transpose(
-                input=conv, num_filters=256,
-                filter_size=4,
-                padding=1,
-                stride=2,
-                param_attr=fluid.param_attr.ParamAttr(
-                    initializer=fluid.initializer.Normal(0., 0.001)),
-                act=None,
-                bias_attr=False)
-            conv = fluid.layers.batch_norm(
-                input=conv, act='relu', momentum=BN_MOMENTUM)
-
- out = fluid.layers.conv2d(
- input=conv,
- num_filters=self.k,
- filter_size=1,
- stride=1,
- padding=0,
- act=None,
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Normal(0., 0.001)))
-
- if self.test_mode:
- return out
- else:
- loss = self.calc_loss(out, target, target_weight)
- return loss, out
-
- def conv_bn_layer(self,
- input,
- num_filters,
- filter_size,
- stride=1,
- groups=1,
- act=None):
- conv = fluid.layers.conv2d(
- input=input,
- num_filters=num_filters,
- filter_size=filter_size,
- stride=stride,
- padding=(filter_size - 1) // 2,
- groups=groups,
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Normal(0., 0.001)),
- act=None,
- bias_attr=False)
- return fluid.layers.batch_norm(input=conv, act=act, momentum=BN_MOMENTUM)
-
- def shortcut(self, input, ch_out, stride):
- ch_in = input.shape[1]
- if ch_in != ch_out or stride != 1:
- return self.conv_bn_layer(input, ch_out, 1, stride)
- else:
- return input
-
- def calc_loss(self, heatmap, target, target_weight):
-        _, _, h, w = heatmap.shape
-        x = fluid.layers.reshape(heatmap, (-1, self.k, h*w))
-        y = fluid.layers.reshape(target, (-1, self.k, h*w))
-        tw = fluid.layers.reshape(target_weight, (-1, self.k))
-
-        x = fluid.layers.split(x, num_or_sections=self.k, dim=1)
-        y = fluid.layers.split(y, num_or_sections=self.k, dim=1)
-        tw = fluid.layers.split(tw, num_or_sections=self.k, dim=1)
-
-        _list = []
-        for idx in range(self.k):
-            _tmp = fluid.layers.scale(x=x[idx] - y[idx], scale=1.)
-            _tmp = _tmp * _tmp
-            _tmp = fluid.layers.reduce_mean(_tmp, dim=2)
-            _list.append(_tmp * tw[idx])
-
- _loss = fluid.layers.concat(_list, axis=0)
- _loss = fluid.layers.reduce_mean(_loss)
- return 0.5 * _loss
-
- def bottleneck_block(self, input, num_filters, stride):
- conv0 = self.conv_bn_layer(
- input=input, num_filters=num_filters, filter_size=1, act='relu')
- conv1 = self.conv_bn_layer(
- input=conv0,
- num_filters=num_filters,
- filter_size=3,
- stride=stride,
- act='relu')
- conv2 = self.conv_bn_layer(
- input=conv1, num_filters=num_filters * 4, filter_size=1, act=None)
-
- short = self.shortcut(input, num_filters * 4, stride)
-
- return fluid.layers.elementwise_add(x=short, y=conv2, act='relu')
-
-def ResNet50():
- model = ResNet(layers=50)
- return model
-
-def ResNet101():
- model = ResNet(layers=101)
- return model
-
-def ResNet152():
- model = ResNet(layers=152)
- return model
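For intuition, `calc_loss` above is a joint-weighted mean-squared error between predicted and target heatmaps. A NumPy analogue of the same computation (a sketch, not part of the original code):

```python
import numpy as np

def weighted_mse_sketch(heatmap, target, target_weight):
    """NumPy analogue of ResNet.calc_loss for a batch of heatmaps."""
    b, k, h, w = heatmap.shape
    x = heatmap.reshape(b, k, h * w)
    y = target.reshape(b, k, h * w)
    tw = target_weight.reshape(b, k)         # 1 = visible joint, 0 = ignore
    per_joint = ((x - y) ** 2).mean(axis=2)  # (b, k): mean over pixels
    return 0.5 * (per_joint * tw).mean()     # mean over batch and joints
```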
diff --git a/fluid/PaddleCV/human_pose_estimation/test.py b/fluid/PaddleCV/human_pose_estimation/test.py
deleted file mode 100644
index 4c56f29b1d126b4c5141bab24ca4ccb26aacb264..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/human_pose_estimation/test.py
+++ /dev/null
@@ -1,159 +0,0 @@
-# Copyright (c) 2018-present, Baidu, Inc.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-##############################################################################
-
-"""Functions for inference."""
-
-import os
-import argparse
-import functools
-import paddle
-import paddle.fluid as fluid
-import paddle.fluid.layers as layers
-
-from tqdm import tqdm
-from lib import pose_resnet
-from utils.transforms import flip_back
-from utils.utility import *
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-
-# yapf: disable
-add_arg('batch_size', int, 32, "Minibatch size.")
-add_arg('dataset', str, 'mpii', "Dataset")
-add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
-add_arg('num_epochs', int, 140, "Number of epochs.")
-add_arg('total_images', int, 144406, "Training image number.")
-add_arg('kp_dim', int, 16, "Class number.")
-add_arg('model_save_dir', str, "output", "Model save directory")
-add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
-add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
-add_arg('checkpoint', str, None, "Whether to resume checkpoint.")
-add_arg('lr', float, 0.001, "Set learning rate.")
-add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
-add_arg('flip_test', bool, True, "Flip test")
-add_arg('shift_heatmap', bool, True, "Shift heatmap")
-add_arg('post_process', bool, False, "post process")
-# yapf: enable
-
-FLIP_PAIRS = [[0, 5], [1, 4], [2, 3], [10, 15], [11, 14], [12, 13]]
-
-def test(args):
- if args.dataset == 'coco':
- import lib.coco_reader as reader
- IMAGE_SIZE = [288, 384]
- # HEATMAP_SIZE = [72, 96]
- FLIP_PAIRS = [[1, 2], [3, 4], [5, 6], [7, 8], [9, 10], [11, 12], [13, 14], [15, 16]]
- args.kp_dim = 17
- args.total_images = 144406 # 149813
- elif args.dataset == 'mpii':
- import lib.mpii_reader as reader
- IMAGE_SIZE = [384, 384]
- # HEATMAP_SIZE = [96, 96]
- FLIP_PAIRS = [[0, 5], [1, 4], [2, 3], [10, 15], [11, 14], [12, 13]]
- args.kp_dim = 16
- args.total_images = 2958 # validation
- else:
- raise ValueError('The dataset {} is not supported yet.'.format(args.dataset))
-
- print_arguments(args)
-
- # Image and target
- image = layers.data(name='image', shape=[3, IMAGE_SIZE[1], IMAGE_SIZE[0]], dtype='float32')
- file_id = layers.data(name='file_id', shape=[1,], dtype='int')
-
- # Build model
- model = pose_resnet.ResNet(layers=50, kps_num=args.kp_dim, test_mode=True)
-
- # Output
- output = model.net(input=image, target=None, target_weight=None)
-
- # Parameters from model and arguments
- params = {}
- params["total_images"] = args.total_images
- params["lr"] = args.lr
- params["num_epochs"] = args.num_epochs
- params["learning_strategy"] = {}
- params["learning_strategy"]["batch_size"] = args.batch_size
- params["learning_strategy"]["name"] = args.lr_strategy
-
- if args.with_mem_opt:
- fluid.memory_optimize(fluid.default_main_program(),
- skip_opt_set=[output.name])
-
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
-    # Default to the bundled pretrained weights when no path is given
-    if args.pretrained_model is None:
-        args.pretrained_model = './pretrained/resnet_50/115'
-    if args.pretrained_model:
- def if_exist(var):
- exist_flag = os.path.exists(os.path.join(args.pretrained_model, var.name))
- return exist_flag
- fluid.io.load_vars(exe, args.pretrained_model, predicate=if_exist)
-
- if args.checkpoint is not None:
- fluid.io.load_persistables(exe, args.checkpoint)
-
- # Dataloader
- test_reader = paddle.batch(reader.test(), batch_size=args.batch_size)
- feeder = fluid.DataFeeder(place=place, feed_list=[image, file_id])
-
- test_exe = fluid.ParallelExecutor(
- use_cuda=True if args.use_gpu else False,
- main_program=fluid.default_main_program().clone(for_test=False),
- loss_name=None)
-
- fetch_list = [image.name, output.name]
-
- for batch_id, data in tqdm(enumerate(test_reader())):
- num_images = len(data)
-
- file_ids = []
- for i in range(num_images):
- file_ids.append(data[i][1])
-
- input_image, out_heatmaps = test_exe.run(
- fetch_list=fetch_list,
- feed=feeder.feed(data))
-
- if args.flip_test:
-            # Flip all the images in the same batch
-            data_flipped = []
-            for i in range(num_images):
-                data_flipped.append((
-                    data[i][0][:, :, ::-1],
-                    data[i][1]))
-
- # Inference again
- _, output_flipped = test_exe.run(
- fetch_list=fetch_list,
-                feed=feeder.feed(data_flipped))
-
- # Flip back
- output_flipped = flip_back(output_flipped, FLIP_PAIRS)
-
-            # Features are not aligned; shift the flipped heatmaps for higher accuracy
- if args.shift_heatmap:
- output_flipped[:, :, :, 1:] = \
- output_flipped.copy()[:, :, :, 0:-1]
-
- # Aggregate
- out_heatmaps = (out_heatmaps + output_flipped) * 0.5
- save_predict_results(input_image, out_heatmaps, file_ids, fold_name='results')
-
-if __name__ == '__main__':
- args = parser.parse_args()
- test(args)
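`flip_back` lives in `utils.transforms` and is not shown in this diff. A hedged NumPy sketch of the usual flip-test logic it implements (mirror the heatmaps horizontally, then swap the left/right channels listed in `FLIP_PAIRS`), together with the aggregation performed in the loop above:

```python
import numpy as np

def flip_back_sketch(heatmaps, flip_pairs):
    """heatmaps: (N, K, H, W). Mirror width axis, swap paired joints."""
    flipped = heatmaps[:, :, :, ::-1].copy()
    for a, b in flip_pairs:
        flipped[:, [a, b]] = flipped[:, [b, a]]
    return flipped

# flipped = flip_back_sketch(output_flipped, FLIP_PAIRS)
# flipped[:, :, :, 1:] = flipped[:, :, :, :-1]   # one-pixel shift
# heatmaps = 0.5 * (heatmaps + flipped)          # average both passes
```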
diff --git a/fluid/PaddleCV/human_pose_estimation/train.py b/fluid/PaddleCV/human_pose_estimation/train.py
deleted file mode 100644
index 20926eb3078f1eb21fe48785d0edb12a5d35e595..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/human_pose_estimation/train.py
+++ /dev/null
@@ -1,172 +0,0 @@
-# Copyright (c) 2018-present, Baidu, Inc.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-##############################################################################
-
-"""Functions for training."""
-
-import os
-import numpy as np
-import cv2
-import paddle
-import paddle.fluid as fluid
-import paddle.fluid.layers as layers
-import argparse
-import functools
-
-from lib import pose_resnet
-from utils.utility import *
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('batch_size', int, 32, "Minibatch size.")
-add_arg('dataset', str, 'mpii', "Dataset")
-add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
-add_arg('num_epochs', int, 140, "Number of epochs.")
-add_arg('total_images', int, 144406, "Training image number.")
-add_arg('kp_dim', int, 16, "Class number.")
-add_arg('model_save_dir', str, "output", "Model save directory")
-add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
-add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
-add_arg('checkpoint', str, None, "Whether to resume checkpoint.")
-add_arg('lr', float, 0.001, "Set learning rate.")
-add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
-# yapf: enable
-
-def optimizer_setting(args, params):
- lr_drop_ratio = 0.1
-
- ls = params["learning_strategy"]
-
- if ls["name"] == "piecewise_decay":
- total_images = params["total_images"]
- batch_size = ls["batch_size"]
- step = int(total_images / batch_size + 1)
-
- ls['epochs'] = [91, 121]
- print('=> LR will be dropped at the epoch of {}'.format(ls['epochs']))
-
- bd = [step * e for e in ls["epochs"]]
- base_lr = params["lr"]
-        lr = [base_lr * (lr_drop_ratio**i) for i in range(len(bd) + 1)]
-
- # AdamOptimizer
- optimizer = paddle.fluid.optimizer.AdamOptimizer(
- learning_rate=fluid.layers.piecewise_decay(
- boundaries=bd, values=lr))
- else:
- lr = params["lr"]
- optimizer = fluid.optimizer.Momentum(
- learning_rate=lr,
- momentum=0.9,
- regularization=fluid.regularizer.L2Decay(0.0005))
-
- return optimizer
-
-def train(args):
- if args.dataset == 'coco':
- import lib.coco_reader as reader
- IMAGE_SIZE = [288, 384]
- HEATMAP_SIZE = [72, 96]
- args.kp_dim = 17
- args.total_images = 144406 # 149813
- elif args.dataset == 'mpii':
- import lib.mpii_reader as reader
- IMAGE_SIZE = [384, 384]
- HEATMAP_SIZE = [96, 96]
- args.kp_dim = 16
- args.total_images = 22246
- else:
- raise ValueError('The dataset {} is not supported yet.'.format(args.dataset))
-
- print_arguments(args)
-
- # Image and target
- image = layers.data(name='image', shape=[3, IMAGE_SIZE[1], IMAGE_SIZE[0]], dtype='float32')
- target = layers.data(name='target', shape=[args.kp_dim, HEATMAP_SIZE[1], HEATMAP_SIZE[0]], dtype='float32')
- target_weight = layers.data(name='target_weight', shape=[args.kp_dim, 1], dtype='float32')
-
- # Build model
- model = pose_resnet.ResNet(layers=50, kps_num=args.kp_dim)
-
- # Output
- loss, output = model.net(input=image, target=target, target_weight=target_weight)
-
- # Parameters from model and arguments
- params = {}
- params["total_images"] = args.total_images
- params["lr"] = args.lr
- params["num_epochs"] = args.num_epochs
- params["learning_strategy"] = {}
- params["learning_strategy"]["batch_size"] = args.batch_size
- params["learning_strategy"]["name"] = args.lr_strategy
-
- # Initialize optimizer
- optimizer = optimizer_setting(args, params)
- optimizer.minimize(loss)
-
- if args.with_mem_opt:
- fluid.memory_optimize(fluid.default_main_program(),
- skip_opt_set=[loss.name, output.name, target.name])
-
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
-    # Default to the bundled pretrained weights when no path is given
-    if args.pretrained_model is None:
-        args.pretrained_model = './pretrained/resnet_50/115'
-    if args.pretrained_model:
- def if_exist(var):
- exist_flag = os.path.exists(os.path.join(args.pretrained_model, var.name))
- return exist_flag
- fluid.io.load_vars(exe, args.pretrained_model, predicate=if_exist)
-
- if args.checkpoint is not None:
- fluid.io.load_persistables(exe, args.checkpoint)
-
- # Dataloader
- train_reader = paddle.batch(reader.train(), batch_size=args.batch_size)
- feeder = fluid.DataFeeder(place=place, feed_list=[image, target, target_weight])
-
- train_exe = fluid.ParallelExecutor(
- use_cuda=True if args.use_gpu else False, loss_name=loss.name)
- fetch_list = [image.name, loss.name, output.name]
-
- for pass_id in range(params["num_epochs"]):
- for batch_id, data in enumerate(train_reader()):
- current_lr = np.array(paddle.fluid.global_scope().find_var('learning_rate').get_tensor())
-
-            input_image, loss_v, out_heatmaps = train_exe.run(
-                fetch_list, feed=feeder.feed(data))
-
-            loss_v = np.mean(np.array(loss_v))
-
-            print('Epoch [{:4d}] Batch [{:4d}] LR: {:.10f} '
-                  'Loss = {:.5f}'.format(
-                      pass_id, batch_id, current_lr[0], loss_v))
-
- if batch_id % 10 == 0:
- save_batch_heatmaps(input_image, out_heatmaps, file_name='visualization@train.jpg', normalize=True)
-
-        model_path = os.path.join(args.model_save_dir,
-                                  'simplebase-{}'.format(args.dataset),
-                                  str(pass_id))
- if not os.path.isdir(model_path):
- os.makedirs(model_path)
- fluid.io.save_persistables(exe, model_path)
-
-
-if __name__ == '__main__':
- args = parser.parse_args()
- train(args)
-
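As a worked example of `optimizer_setting` above with the MPII defaults (batch_size=32, total_images=22246, lr=0.001, drop ratio 0.1, drop epochs [91, 121]), the piecewise schedule evaluates to:

```python
step = int(22246 / 32 + 1)                 # 696 iterations per epoch
bd = [step * 91, step * 121]               # [63336, 84216] boundary iterations
lr = [0.001 * 0.1 ** i for i in range(3)]  # [0.001, 0.0001, 1e-05]
```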
diff --git a/fluid/PaddleCV/human_pose_estimation/val.py b/fluid/PaddleCV/human_pose_estimation/val.py
deleted file mode 100644
index d73113e5c2b2cfb017b98c856b5f455708a69dc6..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/human_pose_estimation/val.py
+++ /dev/null
@@ -1,316 +0,0 @@
-# Copyright (c) 2018-present, Baidu, Inc.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-##############################################################################
-
-"""Functions for validation."""
-
-import os
-import argparse
-import functools
-import numpy as np
-import paddle
-import paddle.fluid as fluid
-import paddle.fluid.layers as layers
-
-from collections import OrderedDict
-from scipy.io import loadmat, savemat
-from lib import pose_resnet
-from utils.transforms import flip_back
-from utils.utility import *
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-
-# yapf: disable
-add_arg('batch_size', int, 32, "Minibatch size.")
-add_arg('dataset', str, 'mpii', "Dataset")
-add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
-add_arg('num_epochs', int, 140, "Number of epochs.")
-add_arg('total_images', int, 144406, "Training image number.")
-add_arg('kp_dim', int, 16, "Class number.")
-add_arg('model_save_dir', str, "output", "Model save directory")
-add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
-add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
-add_arg('checkpoint', str, None, "Whether to resume checkpoint.")
-add_arg('lr', float, 0.001, "Set learning rate.")
-add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
-add_arg('flip_test', bool, True, "Flip test")
-add_arg('shift_heatmap', bool, True, "Shift heatmap")
-add_arg('post_process', bool, True, "Post process")
-# yapf: enable
-
-def valid(args):
- if args.dataset == 'coco':
- import lib.coco_reader as reader
- IMAGE_SIZE = [288, 384]
- HEATMAP_SIZE = [72, 96]
- FLIP_PAIRS = [[1, 2], [3, 4], [5, 6], [7, 8], [9, 10], [11, 12], [13, 14], [15, 16]]
- args.kp_dim = 17
- args.total_images = 144406 # 149813
- elif args.dataset == 'mpii':
- import lib.mpii_reader as reader
- IMAGE_SIZE = [384, 384]
- HEATMAP_SIZE = [96, 96]
- FLIP_PAIRS = [[0, 5], [1, 4], [2, 3], [10, 15], [11, 14], [12, 13]]
- args.kp_dim = 16
- args.total_images = 2958 # validation
- else:
- raise ValueError('The dataset {} is not supported yet.'.format(args.dataset))
-
- print_arguments(args)
-
- # Image and target
- image = layers.data(name='image', shape=[3, IMAGE_SIZE[1], IMAGE_SIZE[0]], dtype='float32')
- target = layers.data(name='target', shape=[args.kp_dim, HEATMAP_SIZE[1], HEATMAP_SIZE[0]], dtype='float32')
- target_weight = layers.data(name='target_weight', shape=[args.kp_dim, 1], dtype='float32')
- center = layers.data(name='center', shape=[2,], dtype='float32')
- scale = layers.data(name='scale', shape=[2,], dtype='float32')
- score = layers.data(name='score', shape=[1,], dtype='float32')
-
- # Build model
- model = pose_resnet.ResNet(layers=50, kps_num=args.kp_dim)
-
- # Output
- loss, output = model.net(input=image, target=target, target_weight=target_weight)
-
- # Parameters from model and arguments
- params = {}
- params["total_images"] = args.total_images
- params["lr"] = args.lr
- params["num_epochs"] = args.num_epochs
- params["learning_strategy"] = {}
- params["learning_strategy"]["batch_size"] = args.batch_size
- params["learning_strategy"]["name"] = args.lr_strategy
-
- if args.with_mem_opt:
- fluid.memory_optimize(fluid.default_main_program(),
- skip_opt_set=[loss.name, output.name, target.name])
-
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
-    # Default to the bundled pretrained weights when no path is given
-    if args.pretrained_model is None:
-        args.pretrained_model = './pretrained/resnet_50/115'
-    if args.pretrained_model:
- def if_exist(var):
- exist_flag = os.path.exists(os.path.join(args.pretrained_model, var.name))
- return exist_flag
- fluid.io.load_vars(exe, args.pretrained_model, predicate=if_exist)
-
- if args.checkpoint is not None:
- fluid.io.load_persistables(exe, args.checkpoint)
-
- # Dataloader
- valid_reader = paddle.batch(reader.valid(), batch_size=args.batch_size)
- feeder = fluid.DataFeeder(place=place, feed_list=[image, target, target_weight, center, scale, score])
-
- valid_exe = fluid.ParallelExecutor(
- use_cuda=True if args.use_gpu else False,
- main_program=fluid.default_main_program().clone(for_test=False),
- loss_name=loss.name)
-
- fetch_list = [image.name, loss.name, output.name, target.name]
-
- # For validation
- acc = AverageMeter()
- idx = 0
-
- num_samples = args.total_images
- all_preds = np.zeros((num_samples, args.kp_dim, 3),
- dtype=np.float32)
- all_boxes = np.zeros((num_samples, 6))
-
- for batch_id, data in enumerate(valid_reader()):
- num_images = len(data)
-
- centers = []
- scales = []
- scores = []
- for i in range(num_images):
- centers.append(data[i][3])
- scales.append(data[i][4])
- scores.append(data[i][5])
-
- input_image, loss, out_heatmaps, target_heatmaps = valid_exe.run(
- fetch_list=fetch_list,
- feed=feeder.feed(data))
-
- if args.flip_test:
-            # Flip all the images in the same batch
-            data_flipped = []
-            for i in range(num_images):
-                # Input, target, target_weight, c, s, score
-                data_flipped.append((
- # np.flip(input_image, 3)[i],
- data[i][0][:, :, ::-1],
- data[i][1],
- data[i][2],
- data[i][3],
- data[i][4],
- data[i][5]))
-
- # Inference again
- _, _, output_flipped, _ = valid_exe.run(
- fetch_list=fetch_list,
-                feed=feeder.feed(data_flipped))
-
- # Flip back
- output_flipped = flip_back(output_flipped, FLIP_PAIRS)
-
-            # Features are not aligned; shift the flipped heatmaps for higher accuracy
- if args.shift_heatmap:
- output_flipped[:, :, :, 1:] = \
- output_flipped.copy()[:, :, :, 0:-1]
-
- # Aggregate
- # out_heatmaps.shape: size[b, args.kp_dim, 96, 96]
- out_heatmaps = (out_heatmaps + output_flipped) * 0.5
-
- loss = np.mean(np.array(loss))
-
- # Accuracy
- _, avg_acc, cnt, pred = accuracy(out_heatmaps, target_heatmaps)
- acc.update(avg_acc, cnt)
-
- # Current center, scale, score
- centers = np.array(centers)
- scales = np.array(scales)
- scores = np.array(scores)
-
- preds, maxvals = get_final_preds(
- args, out_heatmaps, centers, scales)
-
- all_preds[idx:idx + num_images, :, 0:2] = preds[:, :, 0:2]
- all_preds[idx:idx + num_images, :, 2:3] = maxvals
-        # Double-check the all_boxes layout: [center, scale, area, score]
- all_boxes[idx:idx + num_images, 0:2] = centers[:, 0:2]
- all_boxes[idx:idx + num_images, 2:4] = scales[:, 0:2]
- all_boxes[idx:idx + num_images, 4] = np.prod(scales*200, 1)
- all_boxes[idx:idx + num_images, 5] = scores
- # image_path.extend(meta['image'])
-
- idx += num_images
-
-        print('Batch [{:4d}] '
- 'Loss = {:.5f} '
- 'Acc = {:.5f}'.format(batch_id, loss, acc.avg))
-
- if batch_id % 10 == 0:
- save_batch_heatmaps(input_image, out_heatmaps, file_name='visualization@val.jpg', normalize=True)
-
- # Evaluate
- args.DATAROOT = 'data/mpii'
- args.TEST_SET = 'valid'
- output_dir = ''
- filenames = []
- imgnums = []
- image_path = []
- name_values, perf_indicator = mpii_evaluate(
- args, all_preds, output_dir, all_boxes, image_path,
- filenames, imgnums)
-
- print_name_value(name_values, perf_indicator)
-
-def mpii_evaluate(cfg, preds, output_dir, *args, **kwargs):
- # Convert 0-based index to 1-based index
- preds = preds[:, :, 0:2] + 1.0
-
- if output_dir:
- pred_file = os.path.join(output_dir, 'pred.mat')
- savemat(pred_file, mdict={'preds': preds})
-
- if 'test' in cfg.TEST_SET:
- return {'Null': 0.0}, 0.0
-
- SC_BIAS = 0.6
- threshold = 0.5
-
- gt_file = os.path.join(cfg.DATAROOT,
- 'annot',
- 'gt_{}.mat'.format(cfg.TEST_SET))
- gt_dict = loadmat(gt_file)
- dataset_joints = gt_dict['dataset_joints']
- jnt_missing = gt_dict['jnt_missing']
- pos_gt_src = gt_dict['pos_gt_src']
- headboxes_src = gt_dict['headboxes_src']
-
- pos_pred_src = np.transpose(preds, [1, 2, 0])
-
- head = np.where(dataset_joints == 'head')[1][0]
- lsho = np.where(dataset_joints == 'lsho')[1][0]
- lelb = np.where(dataset_joints == 'lelb')[1][0]
- lwri = np.where(dataset_joints == 'lwri')[1][0]
- lhip = np.where(dataset_joints == 'lhip')[1][0]
- lkne = np.where(dataset_joints == 'lkne')[1][0]
- lank = np.where(dataset_joints == 'lank')[1][0]
-
- rsho = np.where(dataset_joints == 'rsho')[1][0]
- relb = np.where(dataset_joints == 'relb')[1][0]
- rwri = np.where(dataset_joints == 'rwri')[1][0]
- rkne = np.where(dataset_joints == 'rkne')[1][0]
- rank = np.where(dataset_joints == 'rank')[1][0]
- rhip = np.where(dataset_joints == 'rhip')[1][0]
-
- jnt_visible = 1 - jnt_missing
- uv_error = pos_pred_src - pos_gt_src
- uv_err = np.linalg.norm(uv_error, axis=1)
- headsizes = headboxes_src[1, :, :] - headboxes_src[0, :, :]
- headsizes = np.linalg.norm(headsizes, axis=0)
- headsizes *= SC_BIAS
- scale = np.multiply(headsizes, np.ones((len(uv_err), 1)))
- scaled_uv_err = np.divide(uv_err, scale)
- scaled_uv_err = np.multiply(scaled_uv_err, jnt_visible)
- jnt_count = np.sum(jnt_visible, axis=1)
- less_than_threshold = np.multiply((scaled_uv_err <= threshold),
- jnt_visible)
- PCKh = np.divide(100.*np.sum(less_than_threshold, axis=1), jnt_count)
-
- # Save
- rng = np.arange(0, 0.5+0.01, 0.01)
- pckAll = np.zeros((len(rng), cfg.kp_dim))
-
- for r in range(len(rng)):
- threshold = rng[r]
- less_than_threshold = np.multiply(scaled_uv_err <= threshold,
- jnt_visible)
- pckAll[r, :] = np.divide(100.*np.sum(less_than_threshold, axis=1),
- jnt_count)
-
- PCKh = np.ma.array(PCKh, mask=False)
- PCKh.mask[6:8] = True
-
- jnt_count = np.ma.array(jnt_count, mask=False)
- jnt_count.mask[6:8] = True
- jnt_ratio = jnt_count / np.sum(jnt_count).astype(np.float64)
-
- name_value = [
- ('Head', PCKh[head]),
- ('Shoulder', 0.5 * (PCKh[lsho] + PCKh[rsho])),
- ('Elbow', 0.5 * (PCKh[lelb] + PCKh[relb])),
- ('Wrist', 0.5 * (PCKh[lwri] + PCKh[rwri])),
- ('Hip', 0.5 * (PCKh[lhip] + PCKh[rhip])),
- ('Knee', 0.5 * (PCKh[lkne] + PCKh[rkne])),
- ('Ankle', 0.5 * (PCKh[lank] + PCKh[rank])),
- ('Mean', np.sum(PCKh * jnt_ratio)),
- ('Mean@0.1', np.sum(pckAll[11, :] * jnt_ratio))
- ]
- name_value = OrderedDict(name_value)
-
- return name_value, name_value['Mean']
-
-# TODO: coco_evaluate()
-
-if __name__ == '__main__':
- args = parser.parse_args()
- valid(args)
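The PCKh computation in `mpii_evaluate` condenses to: normalize each joint's pixel error by 0.6x the head-box diagonal, then count visible joints whose normalized error falls under the threshold. A NumPy sketch with the same array shapes as the code above (an illustration, not the original implementation):

```python
import numpy as np

def pckh_sketch(pred, gt, headboxes, jnt_visible, threshold=0.5, sc_bias=0.6):
    """pred/gt: (joints, 2, images); headboxes: (2, 2, images);
    jnt_visible: (joints, images) of 0/1. Returns per-joint PCKh in %."""
    err = np.linalg.norm(pred - gt, axis=1)                       # (joints, images)
    head = np.linalg.norm(headboxes[1] - headboxes[0], axis=0) * sc_bias
    scaled = err / head                                           # broadcast per image
    hit = (scaled <= threshold) * jnt_visible
    return 100.0 * hit.sum(axis=1) / jnt_visible.sum(axis=1)
```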
diff --git a/fluid/PaddleCV/icnet/README.md b/fluid/PaddleCV/icnet/README.md
index dc350ff5e66993b33b976018df36369b773a90c3..72a3a91b0ae52894c641e61b489ff7a04c6f8106 100644
--- a/fluid/PaddleCV/icnet/README.md
+++ b/fluid/PaddleCV/icnet/README.md
@@ -1,110 +1,2 @@
-Running the program samples in this directory requires the latest PaddlePaddle develop version. If your installed PaddlePaddle is older than this requirement, please update it following the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html).
-
-## Code structure
-```
-├── network.py # network architecture definition
-├── train.py # training script
-├── eval.py # evaluation script
-├── infer.py # inference script
-├── cityscape.py # data preprocessing script
-└── utils.py # common utility functions
-```
-
-## Introduction
-
-Image Cascade Network (ICNet) is designed for real-time semantic segmentation of images. Compared with other approaches that compress computation, ICNet takes both speed and accuracy into account.
-The main idea of ICNet is to transform the input into several resolutions, process each resolution with a sub-network of matching computational complexity, and then fuse the results. ICNet consists of three sub-networks: a computationally heavy network handles the low-resolution input while lightweight networks handle the high-resolution inputs, striking a balance between accuracy on high-resolution images and the efficiency of low-complexity networks.
-
-The overall network structure is as follows:
-
-
-
-Figure 1
-
-
-
-## Data preparation
-
-
-
-This project uses the Cityscape dataset. Please register at the [Cityscape website](https://www.cityscapes-dataset.com) to download it. After downloading, process the data following the instructions and tools described [here](https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/preparation/createTrainIdLabelImgs.py#L3).
-The processed data is organized as follows:
-```
-data/cityscape/
-|-- gtFine
-| |-- test
-| |-- train
-| `-- val
-|-- leftImg8bit
-| |-- test
-| |-- train
-| `-- val
-|-- train.list
-`-- val.list
-```
-Here, train.list and val.list are the list files used for training and testing respectively. The first column is the input image and the second column is the annotation, separated by a space. For example:
-```
-leftImg8bit/train/stuttgart/stuttgart_000021_000019_leftImg8bit.png gtFine/train/stuttgart/stuttgart_000021_000019_gtFine_labelTrainIds.png
-leftImg8bit/train/stuttgart/stuttgart_000072_000019_leftImg8bit.png gtFine/train/stuttgart/stuttgart_000072_000019_gtFine_labelTrainIds.png
-```
-After downloading and preparing the data, update the corresponding data paths in the `cityscape.py` script.
-
-## Model training and inference
-
-### Training
-Run the following command to start training, specifying the checkpoint save path at the same time:
-```
-python train.py --batch_size=16 --use_gpu=True --checkpoint_path="./chkpnt/"
-```
-Use the following command for more usage instructions:
-```
-python train.py --help
-```
-During training, the `loss` of each network branch on the training set is printed according to the user's settings, for example:
-```
-Iter[0]; train loss: 2.338; sub4_loss: 3.367; sub24_loss: 4.120; sub124_loss: 0.151
-```
-### Testing
-Run the following command to evaluate on the `Cityscape` test dataset:
-```
-python eval.py --model_path="./model/" --use_gpu=True
-```
-The model file must be specified with the `--model_path` option.
-The evaluation metric reported by the test script is [mean IoU]().
-
-### Inference
-Run the following command to run inference on the specified data:
-```
-python infer.py \
---model_path="./model" \
---images_path="./data/cityscape/" \
---images_list="./data/cityscape/infer.list"
-```
-The `--images_list` option specifies a list file in which each line is the path of an image to predict.
-By default, prediction results are saved to the `output` folder under the current path.
-
-## Experimental results
-Figure 2 shows the training loss curve on the `CityScape` training set:
-
-
-
-Figure 2
-
-
-Training on the training set and evaluating on the validation set yields mean_IoU=67.0% (67.7% in the paper).
-
-Figure 3 shows sample results predicted with the `infer.py` script: the first row is the original input image, the second row is the manual annotation, and the third row is the output of our model.
-
-
-Figure 3
-
-
-## Other information
-|Dataset | Pretrained model |
-|---|---|
-|CityScape | [Model]()[md: ] |
-
-## References
-
-- [ICNet for Real-Time Semantic Segmentation on High-Resolution Images](https://arxiv.org/abs/1704.08545)
+Hello, this project has been migrated. Please go to the [PaddleCV/icnet](../../../PaddleCV/icnet) directory to browse it.
diff --git a/fluid/PaddleCV/icnet/cityscape.py b/fluid/PaddleCV/icnet/cityscape.py
deleted file mode 100644
index c5c08afcf3a3c85b9f43c9110e8a8dedc5900d5b..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/icnet/cityscape.py
+++ /dev/null
@@ -1,239 +0,0 @@
-"""Reader for Cityscape dataset.
-"""
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import os
-import cv2
-import numpy as np
-import paddle.dataset as dataset
-
-DATA_PATH = "./data/cityscape"
-TRAIN_LIST = DATA_PATH + "/train.list"
-TEST_LIST = DATA_PATH + "/val.list"
-IGNORE_LABEL = 255
-NUM_CLASSES = 19
-TRAIN_DATA_SHAPE = (3, 720, 720)
-TEST_DATA_SHAPE = (3, 1024, 2048)
-IMG_MEAN = np.array((103.939, 116.779, 123.68), dtype=np.float32)
-
-
-def train_data_shape():
- return TRAIN_DATA_SHAPE
-
-
-def test_data_shape():
- return TEST_DATA_SHAPE
-
-
-def num_classes():
- return NUM_CLASSES
-
-
-class DataGenerater:
- def __init__(self, data_list, mode="train", flip=True, scaling=True):
- self.flip = flip
- self.scaling = scaling
- self.image_label = []
- with open(data_list, 'r') as f:
- for line in f:
- image_file, label_file = line.strip().split(' ')
- self.image_label.append((image_file, label_file))
-
- def create_train_reader(self, batch_size):
- """
- Create a reader for train dataset.
- """
-
- def reader():
- np.random.shuffle(self.image_label)
- images = []
- labels_sub1 = []
- labels_sub2 = []
- labels_sub4 = []
- count = 0
- for image, label in self.image_label:
- image, label_sub1, label_sub2, label_sub4 = self.process_train_data(
- image, label)
- count += 1
- images.append(image)
- labels_sub1.append(label_sub1)
- labels_sub2.append(label_sub2)
- labels_sub4.append(label_sub4)
- if count == batch_size:
- yield self.mask(
- np.array(images),
- np.array(labels_sub1),
- np.array(labels_sub2), np.array(labels_sub4))
- images = []
- labels_sub1 = []
- labels_sub2 = []
- labels_sub4 = []
- count = 0
- if images:
- yield self.mask(
- np.array(images),
- np.array(labels_sub1),
- np.array(labels_sub2), np.array(labels_sub4))
-
- return reader
-
- def create_test_reader(self):
- """
- Create a reader for test dataset.
- """
-
- def reader():
- for image, label in self.image_label:
- image, label = self.load(image, label)
- image = dataset.image.to_chw(image)[np.newaxis, :]
- label = label[np.newaxis, :, :, np.newaxis].astype("float32")
- label_mask = np.where((label != IGNORE_LABEL).flatten())[
- 0].astype("int32")
- yield image, label, label_mask
-
- return reader
-
- def process_train_data(self, image, label):
- """
- Process training data.
- """
- image, label = self.load(image, label)
- if self.flip:
- image, label = self.random_flip(image, label)
- if self.scaling:
- image, label = self.random_scaling(image, label)
- image, label = self.resize(image, label, out_size=TRAIN_DATA_SHAPE[1:])
- label = label.astype("float32")
- label_sub1 = dataset.image.to_chw(self.scale_label(label, factor=4))
- label_sub2 = dataset.image.to_chw(self.scale_label(label, factor=8))
- label_sub4 = dataset.image.to_chw(self.scale_label(label, factor=16))
- image = dataset.image.to_chw(image)
- return image, label_sub1, label_sub2, label_sub4
-
- def load(self, image, label):
- """
- Load image from file.
- """
- image = dataset.image.load_image(
- DATA_PATH + "/" + image, is_color=True).astype("float32")
- image -= IMG_MEAN
- label = dataset.image.load_image(
- DATA_PATH + "/" + label, is_color=False).astype("float32")
- return image, label
-
- def random_flip(self, image, label):
- """
- Flip image and label randomly.
- """
- r = np.random.rand(1)
- if r > 0.5:
- image = dataset.image.left_right_flip(image, is_color=True)
- label = dataset.image.left_right_flip(label, is_color=False)
- return image, label
-
- def random_scaling(self, image, label):
- """
- Scale image and label randomly.
- """
- scale = np.random.uniform(0.5, 2.0, 1)[0]
- h_new = int(image.shape[0] * scale)
- w_new = int(image.shape[1] * scale)
- image = cv2.resize(image, (w_new, h_new))
- label = cv2.resize(
- label, (w_new, h_new), interpolation=cv2.INTER_NEAREST)
- return image, label
-
- def padding_as(self, image, h, w, is_color):
- """
- Padding image.
- """
- pad_h = max(image.shape[0], h) - image.shape[0]
- pad_w = max(image.shape[1], w) - image.shape[1]
- if is_color:
- return np.pad(image, ((0, pad_h), (0, pad_w), (0, 0)), 'constant')
- else:
- return np.pad(image, ((0, pad_h), (0, pad_w)), 'constant')
-
- def resize(self, image, label, out_size):
- """
- Resize image and label by padding or cropping.
- """
- ignore_label = IGNORE_LABEL
- label = label - ignore_label
- if len(label.shape) == 2:
- label = label[:, :, np.newaxis]
- combined = np.concatenate((image, label), axis=2)
- combined = self.padding_as(
- combined, out_size[0], out_size[1], is_color=True)
- combined = dataset.image.random_crop(
- combined, out_size[0], is_color=True)
- image = combined[:, :, 0:3]
- label = combined[:, :, 3:4] + ignore_label
- return image, label
-
-    def scale_label(self, label, factor):
-        """
-        Scale label according to factor.
-        """
-        h = label.shape[0] // factor
-        w = label.shape[1] // factor
-        # cv2.resize expects (width, height)
-        return cv2.resize(
-            label, (w, h), interpolation=cv2.INTER_NEAREST)[:, :, np.newaxis]
-
- def mask(self, image, label0, label1, label2):
- """
- Get mask for valid pixels.
- """
- mask_sub1 = np.where(((label0 < (NUM_CLASSES + 1)) & (
- label0 != IGNORE_LABEL)).flatten())[0].astype("int32")
- mask_sub2 = np.where(((label1 < (NUM_CLASSES + 1)) & (
- label1 != IGNORE_LABEL)).flatten())[0].astype("int32")
- mask_sub4 = np.where(((label2 < (NUM_CLASSES + 1)) & (
- label2 != IGNORE_LABEL)).flatten())[0].astype("int32")
- return image.astype(
- "float32"), label0, mask_sub1, label1, mask_sub2, label2, mask_sub4
-
-
-def train(batch_size=32, flip=True, scaling=True):
- """
- Cityscape training set reader.
- It returns a reader, in which each result is a batch with batch_size samples.
-
-    :param batch_size: The batch size of each result returned by the reader.
-    :type batch_size: int
-    :param flip: Whether to flip images randomly.
-    :type flip: bool
-    :param scaling: Whether to scale images randomly.
-    :type scaling: bool
- :return: Training reader.
- :rtype: callable
- """
- reader = DataGenerater(
- TRAIN_LIST, flip=flip, scaling=scaling).create_train_reader(batch_size)
- return reader
-
-
-def test():
- """
- Cityscape validation set reader.
- It returns a reader, in which each result is a sample.
-
-    :return: Test reader.
- :rtype: callable
- """
- reader = DataGenerater(TEST_LIST).create_test_reader()
- return reader
-
-
-def infer(image_list=TEST_LIST):
- """
- Infer set reader.
- It returns a reader, in which each result is a sample.
-
-    :param image_list: The image list file in which each line is a path of an image to be inferred.
-    :type image_list: str
- :return: Infer reader.
- :rtype: callable
- """
-    reader = DataGenerater(image_list).create_test_reader()
-    return reader
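The `*_mask` arrays produced by `mask()` above hold the flattened indices of non-ignored pixels; the (deleted) ICNet `train.py` further below gathers predictions and labels with them before `softmax_with_cross_entropy`. A small NumPy illustration with made-up values (simplified to the `!= IGNORE_LABEL` check):

```python
import numpy as np

IGNORE_LABEL = 255
label = np.array([[3, 255], [18, 7]], dtype=np.int64)
mask = np.where((label != IGNORE_LABEL).flatten())[0]  # -> [0, 2, 3]
valid_labels = label.flatten()[mask]                   # -> [3, 18, 7]
```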
diff --git a/fluid/PaddleCV/icnet/icnet.py b/fluid/PaddleCV/icnet/icnet.py
deleted file mode 100644
index d640621eb9def4bfb1411667ea68f5384fbd5489..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/icnet/icnet.py
+++ /dev/null
@@ -1,304 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle.fluid as fluid
-import numpy as np
-import sys
-
-
-def conv(input,
- k_h,
- k_w,
- c_o,
- s_h,
- s_w,
- relu=False,
- padding="VALID",
- biased=False,
- name=None):
- act = None
- tmp = input
- if relu:
- act = "relu"
- if padding == "SAME":
- padding_h = max(k_h - s_h, 0)
- padding_w = max(k_w - s_w, 0)
- padding_top = padding_h // 2
- padding_left = padding_w // 2
- padding_bottom = padding_h - padding_top
- padding_right = padding_w - padding_left
- padding = [
- 0, 0, 0, 0, padding_top, padding_bottom, padding_left, padding_right
- ]
- tmp = fluid.layers.pad(tmp, padding)
- tmp = fluid.layers.conv2d(
- tmp,
- num_filters=c_o,
- filter_size=[k_h, k_w],
- stride=[s_h, s_w],
- groups=1,
- act=act,
- bias_attr=biased,
- use_cudnn=False,
- name=name)
- return tmp
-
-
-def atrous_conv(input,
- k_h,
- k_w,
- c_o,
- dilation,
- relu=False,
- padding="VALID",
- biased=False,
- name=None):
- act = None
- if relu:
- act = "relu"
- tmp = input
-    if padding == "SAME":
-        # A dilated conv here always has stride 1; its effective kernel size
-        # is k + (k - 1) * (dilation - 1)
-        k_h_eff = k_h + (k_h - 1) * (dilation - 1)
-        k_w_eff = k_w + (k_w - 1) * (dilation - 1)
-        padding_h = max(k_h_eff - 1, 0)
-        padding_w = max(k_w_eff - 1, 0)
-        padding_top = padding_h // 2
-        padding_left = padding_w // 2
-        padding_bottom = padding_h - padding_top
-        padding_right = padding_w - padding_left
-        padding = [
-            0, 0, 0, 0, padding_top, padding_bottom, padding_left, padding_right
-        ]
-        tmp = fluid.layers.pad(tmp, padding)
-
-    tmp = fluid.layers.conv2d(
-        tmp,
- num_filters=c_o,
- filter_size=[k_h, k_w],
- dilation=dilation,
- groups=1,
- act=act,
- bias_attr=biased,
- use_cudnn=False,
- name=name)
- return tmp
-
-
-def zero_padding(input, padding):
- return fluid.layers.pad(input,
- [0, 0, 0, 0, padding, padding, padding, padding])
-
-
-def bn(input, relu=False, name=None, is_test=False):
-    act = None
-    if relu:
-        act = 'relu'
-    if name is None:
-        name = input.name.split(".")[0] + "_bn"
-    tmp = fluid.layers.batch_norm(
-        input, act=act, momentum=0.95, epsilon=1e-5, name=name, is_test=is_test)
-    return tmp
-
-
-def avg_pool(input, k_h, k_w, s_h, s_w, name=None, padding=0):
- temp = fluid.layers.pool2d(
- input,
- pool_size=[k_h, k_w],
- pool_type="avg",
- pool_stride=[s_h, s_w],
- pool_padding=padding,
- name=name)
- return temp
-
-
-def max_pool(input, k_h, k_w, s_h, s_w, name=None, padding=0):
- temp = fluid.layers.pool2d(
- input,
- pool_size=[k_h, k_w],
- pool_type="max",
- pool_stride=[s_h, s_w],
- pool_padding=padding,
- name=name)
- return temp
-
-
-def interp(input, out_shape):
- out_shape = list(out_shape.astype("int32"))
- return fluid.layers.resize_bilinear(input, out_shape=out_shape)
-
-
-def dilation_convs(input):
- tmp = res_block(input, filter_num=256, padding=1, name="conv3_2")
- tmp = res_block(tmp, filter_num=256, padding=1, name="conv3_3")
- tmp = res_block(tmp, filter_num=256, padding=1, name="conv3_4")
-
- tmp = proj_block(tmp, filter_num=512, padding=2, dilation=2, name="conv4_1")
- tmp = res_block(tmp, filter_num=512, padding=2, dilation=2, name="conv4_2")
- tmp = res_block(tmp, filter_num=512, padding=2, dilation=2, name="conv4_3")
- tmp = res_block(tmp, filter_num=512, padding=2, dilation=2, name="conv4_4")
- tmp = res_block(tmp, filter_num=512, padding=2, dilation=2, name="conv4_5")
- tmp = res_block(tmp, filter_num=512, padding=2, dilation=2, name="conv4_6")
-
- tmp = proj_block(
- tmp, filter_num=1024, padding=4, dilation=4, name="conv5_1")
- tmp = res_block(tmp, filter_num=1024, padding=4, dilation=4, name="conv5_2")
- tmp = res_block(tmp, filter_num=1024, padding=4, dilation=4, name="conv5_3")
- return tmp
-
-
-def pyramid_pooling(input, input_shape):
- shape = np.ceil(input_shape // 32).astype("int32")
- h, w = shape
- pool1 = avg_pool(input, h, w, h, w)
- pool1_interp = interp(pool1, shape)
- pool2 = avg_pool(input, h // 2, w // 2, h // 2, w // 2)
- pool2_interp = interp(pool2, shape)
- pool3 = avg_pool(input, h // 3, w // 3, h // 3, w // 3)
- pool3_interp = interp(pool3, shape)
- pool4 = avg_pool(input, h // 4, w // 4, h // 4, w // 4)
- pool4_interp = interp(pool4, shape)
- conv5_3_sum = input + pool4_interp + pool3_interp + pool2_interp + pool1_interp
- return conv5_3_sum
-
-
-def shared_convs(image):
- tmp = conv(image, 3, 3, 32, 2, 2, padding='SAME', name="conv1_1_3_3_s2")
- tmp = bn(tmp, relu=True)
- tmp = conv(tmp, 3, 3, 32, 1, 1, padding='SAME', name="conv1_2_3_3")
- tmp = bn(tmp, relu=True)
- tmp = conv(tmp, 3, 3, 64, 1, 1, padding='SAME', name="conv1_3_3_3")
- tmp = bn(tmp, relu=True)
- tmp = max_pool(tmp, 3, 3, 2, 2, padding=[1, 1])
-
- tmp = proj_block(tmp, filter_num=128, padding=0, name="conv2_1")
- tmp = res_block(tmp, filter_num=128, padding=1, name="conv2_2")
- tmp = res_block(tmp, filter_num=128, padding=1, name="conv2_3")
- tmp = proj_block(tmp, filter_num=256, padding=1, stride=2, name="conv3_1")
- return tmp
-
-
-def res_block(input, filter_num, padding=0, dilation=None, name=None):
- tmp = conv(input, 1, 1, filter_num // 4, 1, 1, name=name + "_1_1_reduce")
- tmp = bn(tmp, relu=True)
- tmp = zero_padding(tmp, padding=padding)
- if dilation is None:
- tmp = conv(tmp, 3, 3, filter_num // 4, 1, 1, name=name + "_3_3")
- else:
- tmp = atrous_conv(
- tmp, 3, 3, filter_num // 4, dilation, name=name + "_3_3")
- tmp = bn(tmp, relu=True)
- tmp = conv(tmp, 1, 1, filter_num, 1, 1, name=name + "_1_1_increase")
- tmp = bn(tmp, relu=False)
- tmp = input + tmp
- tmp = fluid.layers.relu(tmp)
- return tmp
-
-
-def proj_block(input, filter_num, padding=0, dilation=None, stride=1,
- name=None):
- proj = conv(
- input, 1, 1, filter_num, stride, stride, name=name + "_1_1_proj")
- proj_bn = bn(proj, relu=False)
-
- tmp = conv(
- input, 1, 1, filter_num // 4, stride, stride, name=name + "_1_1_reduce")
- tmp = bn(tmp, relu=True)
-
- tmp = zero_padding(tmp, padding=padding)
- if padding == 0:
- padding = 'SAME'
- else:
- padding = 'VALID'
- if dilation is None:
- tmp = conv(
- tmp,
- 3,
- 3,
- filter_num // 4,
- 1,
- 1,
- padding=padding,
- name=name + "_3_3")
- else:
- tmp = atrous_conv(
- tmp,
- 3,
- 3,
- filter_num // 4,
- dilation,
- padding=padding,
- name=name + "_3_3")
-
- tmp = bn(tmp, relu=True)
- tmp = conv(tmp, 1, 1, filter_num, 1, 1, name=name + "_1_1_increase")
- tmp = bn(tmp, relu=False)
- tmp = proj_bn + tmp
- tmp = fluid.layers.relu(tmp)
- return tmp
-
-
-def sub_net_4(input, input_shape):
- tmp = interp(input, out_shape=np.ceil(input_shape // 32))
- tmp = dilation_convs(tmp)
-    tmp = pyramid_pooling(tmp, input_shape)
- tmp = conv(tmp, 1, 1, 256, 1, 1, name="conv5_4_k1")
- tmp = bn(tmp, relu=True)
- tmp = interp(tmp, input_shape // 16)
- return tmp
-
-
-def sub_net_2(input):
- tmp = conv(input, 1, 1, 128, 1, 1, name="conv3_1_sub2_proj")
- tmp = bn(tmp, relu=False)
- return tmp
-
-
-def sub_net_1(input):
- tmp = conv(input, 3, 3, 32, 2, 2, padding='SAME', name="conv1_sub1")
- tmp = bn(tmp, relu=True)
- tmp = conv(tmp, 3, 3, 32, 2, 2, padding='SAME', name="conv2_sub1")
- tmp = bn(tmp, relu=True)
- tmp = conv(tmp, 3, 3, 64, 2, 2, padding='SAME', name="conv3_sub1")
- tmp = bn(tmp, relu=True)
- tmp = conv(tmp, 1, 1, 128, 1, 1, name="conv3_sub1_proj")
- tmp = bn(tmp, relu=False)
- return tmp
-
-
-def CCF24(sub2_out, sub4_out, input_shape):
- tmp = zero_padding(sub4_out, padding=2)
- tmp = atrous_conv(tmp, 3, 3, 128, 2, name="conv_sub4")
- tmp = bn(tmp, relu=False)
- tmp = tmp + sub2_out
- tmp = fluid.layers.relu(tmp)
- tmp = interp(tmp, input_shape // 8)
- return tmp
-
-
-def CCF124(sub1_out, sub24_out, input_shape):
- tmp = zero_padding(sub24_out, padding=2)
- tmp = atrous_conv(tmp, 3, 3, 128, 2, name="conv_sub2")
- tmp = bn(tmp, relu=False)
- tmp = tmp + sub1_out
- tmp = fluid.layers.relu(tmp)
- tmp = interp(tmp, input_shape // 4)
- return tmp
-
-
-def icnet(data, num_classes, input_shape):
- image_sub1 = data
- image_sub2 = interp(data, out_shape=input_shape * 0.5)
-
- s_convs = shared_convs(image_sub2)
- sub4_out = sub_net_4(s_convs, input_shape)
- sub2_out = sub_net_2(s_convs)
- sub1_out = sub_net_1(image_sub1)
-
- sub24_out = CCF24(sub2_out, sub4_out, input_shape)
- sub124_out = CCF124(sub1_out, sub24_out, input_shape)
-
- conv6_cls = conv(
- sub124_out, 1, 1, num_classes, 1, 1, biased=True, name="conv6_cls")
- sub4_out = conv(
- sub4_out, 1, 1, num_classes, 1, 1, biased=True, name="sub4_out")
- sub24_out = conv(
- sub24_out, 1, 1, num_classes, 1, 1, biased=True, name="sub24_out")
-
- return sub4_out, sub24_out, conv6_cls
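To make the "SAME" padding emulation in `conv()` concrete, here is the arithmetic for the 3x3/stride-2 `conv1_1_3_3_s2` layer above (a worked example; note this fixed-pad scheme only approximates TensorFlow's input-size-dependent SAME padding):

```python
k, s = 3, 2
pad = max(k - s, 0)                              # 1
pad_top, pad_bottom = pad // 2, pad - pad // 2   # 0, 1
# one row/column of zeros is added at the bottom/right before conv2d
```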
diff --git a/fluid/PaddleCV/icnet/infer.py b/fluid/PaddleCV/icnet/infer.py
deleted file mode 100644
index 9f556fc00d5fe0346cb063d90a770ee7c9ab32b5..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/icnet/infer.py
+++ /dev/null
@@ -1,134 +0,0 @@
-"""Infer for ICNet model."""
-from __future__ import print_function
-import cityscape
-import argparse
-import functools
-import sys
-import os
-import cv2
-
-import paddle.fluid as fluid
-import paddle
-from icnet import icnet
-from utils import add_arguments, print_arguments, get_feeder_data
-from paddle.fluid.layers.learning_rate_scheduler import _decay_step_counter
-from paddle.fluid.initializer import init_on_cpu
-import numpy as np
-
-IMG_MEAN = np.array((103.939, 116.779, 123.68), dtype=np.float32)
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('model_path', str, None, "Model path.")
-add_arg('images_list',       str,   None,         "List file with images to be inferred.")
-add_arg('images_path', str, None, "The images path.")
-add_arg('out_path', str, "./output", "Output path.")
-add_arg('use_gpu', bool, True, "Whether use GPU to test.")
-# yapf: enable
-
-data_shape = [3, 1024, 2048]
-num_classes = 19
-
-label_colours = [
-    [128, 64, 128],   # 0 = road
-    [244, 35, 231],   # 1 = sidewalk
-    [69, 69, 69],     # 2 = building
-    [102, 102, 156],  # 3 = wall
-    [190, 153, 153],  # 4 = fence
-    [153, 153, 153],  # 5 = pole
-    [250, 170, 29],   # 6 = traffic light
-    [219, 219, 0],    # 7 = traffic sign
-    [106, 142, 35],   # 8 = vegetation
-    [152, 250, 152],  # 9 = terrain
-    [69, 129, 180],   # 10 = sky
-    [219, 19, 60],    # 11 = person
-    [255, 0, 0],      # 12 = rider
-    [0, 0, 142],      # 13 = car
-    [0, 0, 69],       # 14 = truck
-    [0, 60, 100],     # 15 = bus
-    [0, 79, 100],     # 16 = train
-    [0, 0, 230],      # 17 = motorcycle
-    [119, 10, 32],    # 18 = bicycle
-]
-
-
-def color(input):
- """
- Convert infered result to color image.
- """
- result = []
- for i in input.flatten():
- result.append(
- [label_colours[i][2], label_colours[i][1], label_colours[i][0]])
- result = np.array(result).reshape([input.shape[0], input.shape[1], 3])
- return result
-
-
-def infer(args):
- data_shape = cityscape.test_data_shape()
- num_classes = cityscape.num_classes()
- # define network
- images = fluid.layers.data(name='image', shape=data_shape, dtype='float32')
- _, _, sub124_out = icnet(images, num_classes,
- np.array(data_shape[1:]).astype("float32"))
- predict = fluid.layers.resize_bilinear(
- sub124_out, out_shape=data_shape[1:3])
- predict = fluid.layers.transpose(predict, perm=[0, 2, 3, 1])
- predict = fluid.layers.reshape(predict, shape=[-1, num_classes])
- _, predict = fluid.layers.topk(predict, k=1)
- predict = fluid.layers.reshape(
- predict,
- shape=[data_shape[1], data_shape[2], -1]) # batch_size should be 1
- inference_program = fluid.default_main_program().clone(for_test=True)
- # prepare environment
- place = fluid.CPUPlace()
- if args.use_gpu:
- place = fluid.CUDAPlace(0)
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
- assert os.path.exists(args.model_path)
- fluid.io.load_params(exe, args.model_path)
- print("loaded model from: %s" % args.model_path)
- sys.stdout.flush()
-
- if not os.path.isdir(args.out_path):
- os.makedirs(args.out_path)
-
- for line in open(args.images_list):
- image_file = args.images_path + "/" + line.strip()
- filename = os.path.basename(image_file)
- image = paddle.dataset.image.load_image(
- image_file, is_color=True).astype("float32")
- image -= IMG_MEAN
- img = paddle.dataset.image.to_chw(image)[np.newaxis, :]
- image_t = fluid.core.LoDTensor()
- image_t.set(img, place)
- result = exe.run(inference_program,
- feed={"image": image_t},
- fetch_list=[predict])
- cv2.imwrite(args.out_path + "/" + filename + "_result.png",
- color(result[0]))
-
-
-def main():
- args = parser.parse_args()
- print_arguments(args)
- infer(args)
-
-
-if __name__ == "__main__":
- main()
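`color()` above maps class ids to colors one pixel at a time in Python. An equivalent vectorized lookup is sketched below (assuming `label_colours` as defined above and the `(H, W, 1)` class-id map produced by the `topk`/`reshape` in `infer`):

```python
import numpy as np

palette = np.array(label_colours, dtype=np.uint8)[:, ::-1]  # RGB -> BGR for cv2

def color_fast(pred):
    # pred: (H, W, 1) integer class ids
    return palette[pred[:, :, 0]]
```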
diff --git a/fluid/PaddleCV/icnet/train.py b/fluid/PaddleCV/icnet/train.py
deleted file mode 100644
index c497266af89b2c04db7f0364aa02d0aab5c1e4aa..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/icnet/train.py
+++ /dev/null
@@ -1,152 +0,0 @@
-"""Trainer for ICNet model."""
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-from icnet import icnet
-import cityscape
-import argparse
-import functools
-import sys
-import os
-import time
-import paddle.fluid as fluid
-import numpy as np
-from utils import add_arguments, print_arguments, get_feeder_data
-from paddle.fluid.layers.learning_rate_scheduler import _decay_step_counter
-from paddle.fluid.initializer import init_on_cpu
-
-if 'ce_mode' in os.environ:
- np.random.seed(10)
- fluid.default_startup_program().random_seed = 90
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('batch_size', int, 16, "Minibatch size.")
-add_arg('checkpoint_path',   str,   None,       "Checkpoint save path.")
-add_arg('init_model', str, None, "Pretrain model path.")
-add_arg('use_gpu', bool, True, "Whether use GPU to train.")
-add_arg('random_mirror',     bool,  True,       "Whether to augment data with random mirroring.")
-add_arg('random_scaling',    bool,  True,       "Whether to augment data with random scaling.")
-# yapf: enable
-
-LAMBDA1 = 0.16
-LAMBDA2 = 0.4
-LAMBDA3 = 1.0
-LEARNING_RATE = 0.003
-POWER = 0.9
-LOG_PERIOD = 100
-CHECKPOINT_PERIOD = 100
-TOTAL_STEP = 100
-
-no_grad_set = []
-
-
-def create_loss(predict, label, mask, num_classes):
- predict = fluid.layers.transpose(predict, perm=[0, 2, 3, 1])
- predict = fluid.layers.reshape(predict, shape=[-1, num_classes])
- label = fluid.layers.reshape(label, shape=[-1, 1])
- predict = fluid.layers.gather(predict, mask)
- label = fluid.layers.gather(label, mask)
- label = fluid.layers.cast(label, dtype="int64")
- loss = fluid.layers.softmax_with_cross_entropy(predict, label)
- no_grad_set.append(label.name)
- return fluid.layers.reduce_mean(loss)
-
-
-def poly_decay():
- global_step = _decay_step_counter()
- with init_on_cpu():
- decayed_lr = LEARNING_RATE * (fluid.layers.pow(
- (1 - global_step / TOTAL_STEP), POWER))
- return decayed_lr
-
-
-def train(args):
- data_shape = cityscape.train_data_shape()
- num_classes = cityscape.num_classes()
- # define network
- images = fluid.layers.data(name='image', shape=data_shape, dtype='float32')
- label_sub1 = fluid.layers.data(name='label_sub1', shape=[1], dtype='int32')
- label_sub2 = fluid.layers.data(name='label_sub2', shape=[1], dtype='int32')
- label_sub4 = fluid.layers.data(name='label_sub4', shape=[1], dtype='int32')
- mask_sub1 = fluid.layers.data(name='mask_sub1', shape=[-1], dtype='int32')
- mask_sub2 = fluid.layers.data(name='mask_sub2', shape=[-1], dtype='int32')
- mask_sub4 = fluid.layers.data(name='mask_sub4', shape=[-1], dtype='int32')
-
- sub4_out, sub24_out, sub124_out = icnet(
- images, num_classes, np.array(data_shape[1:]).astype("float32"))
- loss_sub4 = create_loss(sub4_out, label_sub4, mask_sub4, num_classes)
- loss_sub24 = create_loss(sub24_out, label_sub2, mask_sub2, num_classes)
- loss_sub124 = create_loss(sub124_out, label_sub1, mask_sub1, num_classes)
- reduced_loss = LAMBDA1 * loss_sub4 + LAMBDA2 * loss_sub24 + LAMBDA3 * loss_sub124
-
- regularizer = fluid.regularizer.L2Decay(0.0001)
- optimizer = fluid.optimizer.Momentum(
- learning_rate=poly_decay(), momentum=0.9, regularization=regularizer)
- _, params_grads = optimizer.minimize(reduced_loss, no_grad_set=no_grad_set)
-
- # prepare environment
- place = fluid.CPUPlace()
- if args.use_gpu:
- place = fluid.CUDAPlace(0)
- exe = fluid.Executor(place)
-
- exe.run(fluid.default_startup_program())
-
- if args.init_model is not None:
- print("load model from: %s" % args.init_model)
- sys.stdout.flush()
- fluid.io.load_params(exe, args.init_model)
-
- iter_id = 0
- t_loss = 0.
- sub4_loss = 0.
- sub24_loss = 0.
- sub124_loss = 0.
- train_reader = cityscape.train(
- args.batch_size, flip=args.random_mirror, scaling=args.random_scaling)
- start_time = time.time()
- while True:
- # train a pass
- for data in train_reader():
- if iter_id > TOTAL_STEP:
- end_time = time.time()
- print("kpis train_duration %f" % (end_time - start_time))
- return
- iter_id += 1
- results = exe.run(
- feed=get_feeder_data(data, place),
- fetch_list=[reduced_loss, loss_sub4, loss_sub24, loss_sub124])
- t_loss += results[0]
- sub4_loss += results[1]
- sub24_loss += results[2]
- sub124_loss += results[3]
- # training log
- if iter_id % LOG_PERIOD == 0:
- print(
- "Iter[%d]; train loss: %.3f; sub4_loss: %.3f; sub24_loss: %.3f; sub124_loss: %.3f"
- % (iter_id, t_loss / LOG_PERIOD, sub4_loss / LOG_PERIOD,
- sub24_loss / LOG_PERIOD, sub124_loss / LOG_PERIOD))
- print("kpis train_cost %f" % (t_loss / LOG_PERIOD))
-
- t_loss = 0.
- sub4_loss = 0.
- sub24_loss = 0.
- sub124_loss = 0.
- sys.stdout.flush()
-
- if iter_id % CHECKPOINT_PERIOD == 0 and args.checkpoint_path is not None:
- dir_name = args.checkpoint_path + "/" + str(iter_id)
- fluid.io.save_persistables(exe, dirname=dir_name)
- print("Saved checkpoint: %s" % (dir_name))
-
-
-def main():
- args = parser.parse_args()
- print_arguments(args)
- train(args)
-
-
-if __name__ == "__main__":
- main()
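
The removed trainer's `poly_decay` implements the standard polynomial schedule `lr = LEARNING_RATE * (1 - step / TOTAL_STEP) ** POWER`. A minimal standalone sketch using the constants defined above:

```python
LEARNING_RATE = 0.003
POWER = 0.9
TOTAL_STEP = 100

def poly_decay_value(step):
    """Learning rate at a given global step under the poly schedule."""
    return LEARNING_RATE * (1.0 - float(step) / TOTAL_STEP) ** POWER

for step in (0, 50, 99):
    print(step, poly_decay_value(step))
```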
diff --git a/fluid/PaddleCV/image_classification/.run_ce.sh b/fluid/PaddleCV/image_classification/.run_ce.sh
deleted file mode 100755
index 9ba9a4c2c6779694f0e87e12ca85b59afa33f1c0..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/.run_ce.sh
+++ /dev/null
@@ -1,12 +0,0 @@
-#!/bin/bash
-
-# This file is only used for continuous evaluation.
-export FLAGS_cudnn_deterministic=True
-BATCH_SIZE=56
-cudaid=${object_detection_cudaid:=0}
-export CUDA_VISIBLE_DEVICES=$cudaid
-python train.py --batch_size=${BATCH_SIZE} --num_epochs=5 --enable_ce=True --lr_strategy=cosine_decay | python _ce.py
-
-cudaid=${object_detection_cudaid_m:=0,1,2,3}
-export CUDA_VISIBLE_DEVICES=$cudaid
-python train.py --batch_size=${BATCH_SIZE} --num_epochs=5 --enable_ce=True --lr_strategy=cosine_decay | python _ce.py
diff --git a/fluid/PaddleCV/image_classification/README.md b/fluid/PaddleCV/image_classification/README.md
index b8cd82a68acf4abfd868f70042c7a66facfd5030..55392b8ac91e4a8c24d2f2d6ac63d695cb58e146 100644
--- a/fluid/PaddleCV/image_classification/README.md
+++ b/fluid/PaddleCV/image_classification/README.md
@@ -1,208 +1,6 @@
-# Image Classification and Model Zoo
-Image classification, an important field of computer vision, assigns an image to one of a set of pre-defined labels. Recently, researchers have developed many kinds of neural networks and greatly improved classification performance. This page introduces how to do image classification with PaddlePaddle Fluid, including [data preparation](#data-preparation), [training](#training-a-model), [finetuning](#finetuning), [evaluation](#evaluation) and [inference](#inference).
----
-## Table of Contents
-- [Installation](#installation)
-- [Data preparation](#data-preparation)
-- [Training a model with flexible parameters](#training-a-model)
-- [Finetuning](#finetuning)
-- [Evaluation](#evaluation)
-- [Inference](#inference)
-- [Supported models and performances](#supported-models)
+Hi!
-## Installation
+This directory has been deprecated.
-Running the sample code in this directory requires PaddlePaddle Fluid v0.13.0 or later. If the PaddlePaddle version on your device is lower, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) and make an update.
-
-## Data preparation
-
-An example for ImageNet classification is as follows. First, prepare the ImageNet data as:
-```
-cd data/ILSVRC2012/
-sh download_imagenet2012.sh
-```
-
-In the shell script ```download_imagenet2012.sh```, there are three steps to prepare data:
-
-**step-1:** Register at ```image-net.org``` first in order to get a pair of ```Username``` and ```AccessKey```, which are used to download ImageNet data.
-
-**step-2:** Download the ImageNet-2012 dataset from the website. The training and validation data will be downloaded into the "train" and "val" folders respectively. Please note that the dataset is more than 40 GB, so downloading takes a long time. Users who have already downloaded the ImageNet data can organize it under ```data/ILSVRC2012``` directly.
-
-**step-3:** Download training and validation label files. There are two label files which contain train and validation image labels respectively:
-
-* *train_list.txt*: label file of the imagenet-2012 training set, with each line separated by ```SPACE```, like:
-```
-train/n02483708/n02483708_2436.jpeg 369
-train/n03998194/n03998194_7015.jpeg 741
-train/n04523525/n04523525_38118.jpeg 884
-train/n04596742/n04596742_3032.jpeg 909
-train/n03208938/n03208938_7065.jpeg 535
-...
-```
-* *val_list.txt*: label file of the imagenet-2012 validation set, with each line separated by ```SPACE```, like:
-```
-val/ILSVRC2012_val_00000001.jpeg 65
-val/ILSVRC2012_val_00000002.jpeg 970
-val/ILSVRC2012_val_00000003.jpeg 230
-val/ILSVRC2012_val_00000004.jpeg 809
-val/ILSVRC2012_val_00000005.jpeg 516
-...
-```
-
-## Training a model with flexible parameters
-
-After data preparation, one can start training as follows:
-
-```
-python train.py \
- --model=SE_ResNeXt50_32x4d \
- --batch_size=32 \
- --total_images=1281167 \
- --class_dim=1000 \
- --image_shape=3,224,224 \
- --model_save_dir=output/ \
- --with_mem_opt=False \
- --lr_strategy=piecewise_decay \
- --lr=0.1
-```
-**parameter introduction:**
-* **model**: name of the model to use. Default: "SE_ResNeXt50_32x4d".
-* **num_epochs**: the number of epochs. Default: 120.
-* **batch_size**: the size of each mini-batch. Default: 256.
-* **use_gpu**: whether to use GPU or not. Default: True.
-* **total_images**: total number of images in the training set. Default: 1281167.
-* **class_dim**: the class number of the classification task. Default: 1000.
-* **image_shape**: input size of the network. Default: "3,224,224".
-* **model_save_dir**: the directory to save trained model. Default: "output".
-* **with_mem_opt**: whether to use memory optimization or not. Default: False.
-* **lr_strategy**: learning rate changing strategy. Default: "piecewise_decay".
-* **lr**: initialized learning rate. Default: 0.1.
-* **pretrained_model**: model path for pretraining. Default: None.
-* **checkpoint**: the checkpoint path to resume. Default: None.
-
-**data reader introduction:** The data reader is defined in ```reader.py```. In the [training stage](#training-a-model), random crop and flipping are used, while center crop is used in the [evaluation](#evaluation) and [inference](#inference) stages. Supported data augmentation includes (a minimal sketch follows the list):
-* rotation
-* color jitter
-* random crop
-* center crop
-* resize
-* flipping
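-
-A minimal sketch of the train-time order (random crop, then random flip), assuming NumPy HWC images; this illustrates the idea and is not the actual ```reader.py``` code:
-```
-import numpy as np
-
-def train_augment(img, size=224):
-    h, w, _ = img.shape
-    top = np.random.randint(0, h - size + 1)
-    left = np.random.randint(0, w - size + 1)
-    crop = img[top:top + size, left:left + size]
-    if np.random.rand() < 0.5:
-        crop = crop[:, ::-1]  # random horizontal flip
-    return crop
-```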
-
-**training curve:** The training curve can be drawn based on training log. For example, the log from training AlexNet is like:
-```
-End pass 1, train_loss 6.23153877258, train_acc1 0.0150696625933, train_acc5 0.0552518665791, test_loss 5.41981744766, test_acc1 0.0519132651389, test_acc5 0.156150355935
-End pass 2, train_loss 5.15442800522, train_acc1 0.0784279331565, train_acc5 0.211050540209, test_loss 4.45795249939, test_acc1 0.140469551086, test_acc5 0.333163291216
-End pass 3, train_loss 4.51505613327, train_acc1 0.145300447941, train_acc5 0.331567406654, test_loss 3.86548018456, test_acc1 0.219443559647, test_acc5 0.446448504925
-End pass 4, train_loss 4.12735557556, train_acc1 0.19437250495, train_acc5 0.405713528395, test_loss 3.56990146637, test_acc1 0.264536827803, test_acc5 0.507190704346
-End pass 5, train_loss 3.87505435944, train_acc1 0.229518383741, train_acc5 0.453582793474, test_loss 3.35345435143, test_acc1 0.297349333763, test_acc5 0.54753267765
-End pass 6, train_loss 3.6929500103, train_acc1 0.255628824234, train_acc5 0.487188398838, test_loss 3.17112898827, test_acc1 0.326953113079, test_acc5 0.581780135632
-End pass 7, train_loss 3.55882954597, train_acc1 0.275381118059, train_acc5 0.511990904808, test_loss 3.03736782074, test_acc1 0.349035382271, test_acc5 0.606293857098
-End pass 8, train_loss 3.45595097542, train_acc1 0.291462600231, train_acc5 0.530815005302, test_loss 2.96034455299, test_acc1 0.362228929996, test_acc5 0.617390751839
-End pass 9, train_loss 3.3745200634, train_acc1 0.303871691227, train_acc5 0.545210540295, test_loss 2.93932366371, test_acc1 0.37129303813, test_acc5 0.623573005199
-...
-```
-
-The error rate curves of AlexNet, ResNet50 and SE-ResNeXt-50 are shown in the figure below.
-
-
-Training and validation curves
-
-
-## Finetuning
-
-Finetuning loads pretrained weights and continues training them on a specific task. After setting ```path_to_pretrain_model```, one can finetune a model as:
-```
-python train.py \
- --model=SE_ResNeXt50_32x4d \
- --pretrained_model=${path_to_pretrain_model} \
- --batch_size=32 \
- --total_images=1281167 \
- --class_dim=1000 \
- --image_shape=3,224,224 \
- --model_save_dir=output/ \
- --with_mem_opt=True \
- --lr_strategy=piecewise_decay \
- --lr=0.1
-```
-
-## Evaluation
-Evaluation measures the performance of a trained model. One can download [pretrained models](#supported-models) and set the path in ```path_to_pretrain_model```. Then top-1/top-5 accuracy can be obtained by running the following command:
-```
-python eval.py \
- --model=SE_ResNeXt50_32x4d \
- --batch_size=32 \
- --class_dim=1000 \
- --image_shape=3,224,224 \
- --with_mem_opt=True \
- --pretrained_model=${path_to_pretrain_model}
-```
-
-With this evaluation configuration, the output log looks like:
-```
-Testbatch 0,loss 2.1786134243, acc1 0.625,acc5 0.8125,time 0.48 sec
-Testbatch 10,loss 0.898496925831, acc1 0.75,acc5 0.9375,time 0.51 sec
-Testbatch 20,loss 1.32524681091, acc1 0.6875,acc5 0.9375,time 0.37 sec
-Testbatch 30,loss 1.46830511093, acc1 0.5,acc5 0.9375,time 0.51 sec
-Testbatch 40,loss 1.12802267075, acc1 0.625,acc5 0.9375,time 0.35 sec
-Testbatch 50,loss 0.881597697735, acc1 0.8125,acc5 1.0,time 0.32 sec
-Testbatch 60,loss 0.300163716078, acc1 0.875,acc5 1.0,time 0.48 sec
-Testbatch 70,loss 0.692037761211, acc1 0.875,acc5 1.0,time 0.35 sec
-Testbatch 80,loss 0.0969972759485, acc1 1.0,acc5 1.0,time 0.41 sec
-...
-```
-
-## Inference
-Inference produces prediction scores or image features from a trained model.
-```
-python infer.py \
- --model=SE_ResNeXt50_32x4d \
- --batch_size=32 \
- --class_dim=1000 \
- --image_shape=3,224,224 \
- --with_mem_opt=True \
- --pretrained_model=${path_to_pretrain_model}
-```
-The output contains prediction results, including the maximum score (before softmax) and the corresponding predicted label.
-```
-Test-0-score: [13.168352], class [491]
-Test-1-score: [7.913302], class [975]
-Test-2-score: [16.959702], class [21]
-Test-3-score: [14.197695], class [383]
-Test-4-score: [12.607652], class [878]
-Test-5-score: [17.725458], class [15]
-Test-6-score: [12.678599], class [118]
-Test-7-score: [12.353498], class [505]
-Test-8-score: [20.828007], class [747]
-Test-9-score: [15.135801], class [315]
-Test-10-score: [14.585114], class [920]
-Test-11-score: [13.739927], class [679]
-Test-12-score: [15.040644], class [386]
-...
-```
-
-## Supported models and performances
-
-Unless otherwise noted, models are trained with an initial learning rate of ```0.1```, decayed by a factor of ```0.1``` at each pre-defined epoch boundary. Available top-1/top-5 validation accuracies on ImageNet 2012 are listed in the table below. Pretrained models can be downloaded by clicking the corresponding model names.
-
-|model | top-1/top-5 accuracy
-|- | -:
-|[AlexNet](http://paddle-imagenet-models.bj.bcebos.com/alexnet_model.tar) | 57.21%/79.72%
-|VGG11 | -
-|VGG13 | -
-|VGG16 | -
-|VGG19 | -
-|GoogleNet | -
-|InceptionV4 | -
-|MobileNet | -
-|[ResNet50](http://paddle-imagenet-models.bj.bcebos.com/resnet_50_model.tar) | 76.63%/93.10%
-|ResNet101 | -
-|ResNet152 | -
-|[SE_ResNeXt50_32x4d](http://paddle-imagenet-models.bj.bcebos.com/se_resnext_50_model.tar) | 78.33%/93.96%
-|SE_ResNeXt101_32x4d | -
-|SE_ResNeXt152_32x4d | -
-|DPN68 | -
-|DPN92 | -
-|DPN98 | -
-|DPN107 | -
-|DPN131 | -
+Please visit the project at [PaddleCV/image_classification](../../../PaddleCV/image_classification).
diff --git a/fluid/PaddleCV/image_classification/README_cn.md b/fluid/PaddleCV/image_classification/README_cn.md
index 12b4d3ef3d22aa4b78abc6138adac1445ae18e42..bb8850cff5fbd658addaba488301783d0e510a6c 100644
--- a/fluid/PaddleCV/image_classification/README_cn.md
+++ b/fluid/PaddleCV/image_classification/README_cn.md
@@ -1,209 +1,2 @@
-# Image Classification and Model Zoo
-Image classification is an important field of computer vision; its goal is to classify images into pre-defined labels. Recently, researchers have proposed many kinds of neural networks that greatly improve classification performance. This page introduces how to do image classification with PaddlePaddle, including [data preparation](#data-preparation), [training](#training-a-model), [finetuning](#finetuning), [evaluation](#evaluation) and [inference](#inference).
-
----
-## Table of Contents
-- [Installation](#installation)
-- [Data preparation](#data-preparation)
-- [Training a model](#training-a-model)
-- [Finetuning](#finetuning)
-- [Evaluation](#evaluation)
-- [Inference](#inference)
-- [Supported models and performances](#supported-models)
-
-## Installation
-
-Running the sample code in this directory requires PaddlePaddle Fluid v0.13.0 or later. If the PaddlePaddle version in your environment is lower, please follow the instructions in the installation document and update PaddlePaddle.
-
-## Data preparation
-
-An example for the ImageNet classification task is given below. First, prepare the data as:
-```
-cd data/ILSVRC2012/
-sh download_imagenet2012.sh
-```
-The shell script ```download_imagenet2012.sh``` prepares the data in three steps:
-
-**Step 1:** Register at ```image-net.org``` to obtain a pair of ```Username``` and ```AccessKey```, which are used to download the ImageNet data.
-
-**Step 2:** Download the ImageNet-2012 image data from the official site. The training and validation sets will be downloaded into the "train" and "val" directories respectively. Note that the ImageNet data is more than 40 GB and downloading is very time-consuming; users who have already downloaded it can place it directly under ```data/ILSVRC2012```.
-
-**Step 3:** Download the label files for the training and validation sets. The following two files contain the image labels of the training and validation sets respectively:
-
-* *train_list.txt*: label file of the ImageNet-2012 training set, with each line separating the image path and label by a ```SPACE```, for example:
-```
-train/n02483708/n02483708_2436.jpeg 369
-train/n03998194/n03998194_7015.jpeg 741
-train/n04523525/n04523525_38118.jpeg 884
-train/n04596742/n04596742_3032.jpeg 909
-train/n03208938/n03208938_7065.jpeg 535
-...
-```
-* *val_list.txt*: label file of the ImageNet-2012 validation set, with each line separating the image path and label by a ```SPACE```, for example:
-```
-val/ILSVRC2012_val_00000001.jpeg 65
-val/ILSVRC2012_val_00000002.jpeg 970
-val/ILSVRC2012_val_00000003.jpeg 230
-val/ILSVRC2012_val_00000004.jpeg 809
-val/ILSVRC2012_val_00000005.jpeg 516
-...
-```
-
-## Training a model
-
-After data preparation, training can be started as follows:
-```
-python train.py \
- --model=SE_ResNeXt50_32x4d \
- --batch_size=32 \
- --total_images=1281167 \
- --class_dim=1000 \
- --image_shape=3,224,224 \
- --model_save_dir=output/ \
- --with_mem_opt=False \
- --lr_strategy=piecewise_decay \
- --lr=0.1
-```
-**Parameter description:**
-* **model**: name model to use. Default: "SE_ResNeXt50_32x4d".
-* **num_epochs**: the number of epochs. Default: 120.
-* **batch_size**: the size of each mini-batch. Default: 256.
-* **use_gpu**: whether to use GPU or not. Default: True.
-* **total_images**: total number of images in the training set. Default: 1281167.
-* **class_dim**: the class number of the classification task. Default: 1000.
-* **image_shape**: input size of the network. Default: "3,224,224".
-* **model_save_dir**: the directory to save trained model. Default: "output".
-* **with_mem_opt**: whether to use memory optimization or not. Default: False.
-* **lr_strategy**: learning rate changing strategy. Default: "piecewise_decay".
-* **lr**: initialized learning rate. Default: 0.1.
-* **pretrained_model**: model path for pretraining. Default: None.
-* **checkpoint**: the checkpoint path to resume. Default: None.
-
-**Data reader:** The data reader is defined in ```reader.py```. In the [training stage](#training-a-model), the default augmentations are random crop and horizontal flip, while the default in the [evaluation](#evaluation) and [inference](#inference) stages is center crop. Currently supported augmentations are:
-* rotation
-* color jitter
-* random crop
-* center crop
-* resize
-* horizontal flip
-
-**Training curve:** The training curve can be drawn from the training log. For example, the log from training AlexNet looks like:
-```
-End pass 1, train_loss 6.23153877258, train_acc1 0.0150696625933, train_acc5 0.0552518665791, test_loss 5.41981744766, test_acc1 0.0519132651389, test_acc5 0.156150355935
-End pass 2, train_loss 5.15442800522, train_acc1 0.0784279331565, train_acc5 0.211050540209, test_loss 4.45795249939, test_acc1 0.140469551086, test_acc5 0.333163291216
-End pass 3, train_loss 4.51505613327, train_acc1 0.145300447941, train_acc5 0.331567406654, test_loss 3.86548018456, test_acc1 0.219443559647, test_acc5 0.446448504925
-End pass 4, train_loss 4.12735557556, train_acc1 0.19437250495, train_acc5 0.405713528395, test_loss 3.56990146637, test_acc1 0.264536827803, test_acc5 0.507190704346
-End pass 5, train_loss 3.87505435944, train_acc1 0.229518383741, train_acc5 0.453582793474, test_loss 3.35345435143, test_acc1 0.297349333763, test_acc5 0.54753267765
-End pass 6, train_loss 3.6929500103, train_acc1 0.255628824234, train_acc5 0.487188398838, test_loss 3.17112898827, test_acc1 0.326953113079, test_acc5 0.581780135632
-End pass 7, train_loss 3.55882954597, train_acc1 0.275381118059, train_acc5 0.511990904808, test_loss 3.03736782074, test_acc1 0.349035382271, test_acc5 0.606293857098
-End pass 8, train_loss 3.45595097542, train_acc1 0.291462600231, train_acc5 0.530815005302, test_loss 2.96034455299, test_acc1 0.362228929996, test_acc5 0.617390751839
-End pass 9, train_loss 3.3745200634, train_acc1 0.303871691227, train_acc5 0.545210540295, test_loss 2.93932366371, test_acc1 0.37129303813, test_acc5 0.623573005199
-...
-```
-
-The figure below shows the error-rate curves of AlexNet, ResNet50 and SE-ResNeXt-50:
-
-
-Error-rate curves on the training and validation sets
-
-
-
-## Finetuning
-
-Finetuning adapts a trained model's weights to a specific task. After setting ```path_to_pretrain_model```, a model can be finetuned as:
-```
-python train.py \
- --model=SE_ResNeXt50_32x4d \
- --pretrained_model=${path_to_pretrain_model} \
- --batch_size=32 \
- --total_images=1281167 \
- --class_dim=1000 \
- --image_shape=3,224,224 \
- --model_save_dir=output/ \
- --with_mem_opt=True \
- --lr_strategy=piecewise_decay \
- --lr=0.1
-```
-
-## Evaluation
-Evaluation measures the performance of a trained model. Users can download a [pretrained model](#supported-models) and set ```path_to_pretrain_model``` to its path. Running the following command yields the model's top-1/top-5 accuracy:
-```
-python eval.py \
- --model=SE_ResNeXt50_32x4d \
- --batch_size=32 \
- --class_dim=1000 \
- --image_shape=3,224,224 \
- --with_mem_opt=True \
- --pretrained_model=${path_to_pretrain_model}
-```
-
-With this evaluation configuration, the output log looks like:
-```
-Testbatch 0,loss 2.1786134243, acc1 0.625,acc5 0.8125,time 0.48 sec
-Testbatch 10,loss 0.898496925831, acc1 0.75,acc5 0.9375,time 0.51 sec
-Testbatch 20,loss 1.32524681091, acc1 0.6875,acc5 0.9375,time 0.37 sec
-Testbatch 30,loss 1.46830511093, acc1 0.5,acc5 0.9375,time 0.51 sec
-Testbatch 40,loss 1.12802267075, acc1 0.625,acc5 0.9375,time 0.35 sec
-Testbatch 50,loss 0.881597697735, acc1 0.8125,acc5 1.0,time 0.32 sec
-Testbatch 60,loss 0.300163716078, acc1 0.875,acc5 1.0,time 0.48 sec
-Testbatch 70,loss 0.692037761211, acc1 0.875,acc5 1.0,time 0.35 sec
-Testbatch 80,loss 0.0969972759485, acc1 1.0,acc5 1.0,time 0.41 sec
-...
-```
-
-
-## Inference
-Inference produces a model's prediction scores or image features:
-```
-python infer.py \
- --model=SE_ResNeXt50_32x4d \
- --batch_size=32 \
- --class_dim=1000 \
- --image_shape=3,224,224 \
- --with_mem_opt=True \
- --pretrained_model=${path_to_pretrain_model}
-```
-The output includes the highest score (before softmax) and the corresponding predicted label.
-```
-Test-0-score: [13.168352], class [491]
-Test-1-score: [7.913302], class [975]
-Test-2-score: [16.959702], class [21]
-Test-3-score: [14.197695], class [383]
-Test-4-score: [12.607652], class [878]
-Test-5-score: [17.725458], class [15]
-Test-6-score: [12.678599], class [118]
-Test-7-score: [12.353498], class [505]
-Test-8-score: [20.828007], class [747]
-Test-9-score: [15.135801], class [315]
-Test-10-score: [14.585114], class [920]
-Test-11-score: [13.739927], class [679]
-Test-12-score: [15.040644], class [386]
-...
-```
-
-## Supported models and performances
-
-The table lists the networks supported under the "models" directory, with the top-1/top-5 accuracy of the trained models on the ImageNet-2012 validation set. Unless otherwise noted, models are trained with an initial learning rate of ```0.1```, decayed by ```0.1``` after each pre-defined number of epochs. Pretrained models can be downloaded by clicking the corresponding model names.
-
-|model | top-1/top-5 accuracy
-|- | -:
-|[AlexNet](http://paddle-imagenet-models.bj.bcebos.com/alexnet_model.tar) | 57.21%/79.72%
-|VGG11 | -
-|VGG13 | -
-|VGG16 | -
-|VGG19 | -
-|GoogleNet | -
-|InceptionV4 | -
-|MobileNet | -
-|[ResNet50](http://paddle-imagenet-models.bj.bcebos.com/resnet_50_model.tar) | 76.63%/93.10%
-|ResNet101 | -
-|ResNet152 | -
-|[SE_ResNeXt50_32x4d](http://paddle-imagenet-models.bj.bcebos.com/se_resnext_50_model.tar) | 78.33%/93.96%
-|SE_ResNeXt101_32x4d | -
-|SE_ResNeXt152_32x4d | -
-|DPN68 | -
-|DPN92 | -
-|DPN98 | -
-|DPN107 | -
-|DPN131 | -
+Hi! This project has been migrated; please visit it at [PaddleCV/image_classification](../../../PaddleCV/image_classification).
diff --git a/fluid/PaddleCV/image_classification/README_ngraph.md b/fluid/PaddleCV/image_classification/README_ngraph.md
new file mode 100644
index 0000000000000000000000000000000000000000..55392b8ac91e4a8c24d2f2d6ac63d695cb58e146
--- /dev/null
+++ b/fluid/PaddleCV/image_classification/README_ngraph.md
@@ -0,0 +1,6 @@
+
+Hi!
+
+This directory has been deprecated.
+
+Please visit the project at [PaddleCV/image_classification](../../../PaddleCV/image_classification).
diff --git a/fluid/PaddleCV/image_classification/dist_train/README.md b/fluid/PaddleCV/image_classification/dist_train/README.md
deleted file mode 100644
index a595a540adfa770253909e432e99a27228d5f062..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/dist_train/README.md
+++ /dev/null
@@ -1,136 +0,0 @@
-# Distributed Image Classification Models Training
-
-This folder contains implementations of **Image Classification Models** designed to support
-large-scale distributed training in two modes: parameter server mode and NCCL2 (NVIDIA NCCL2 communication library) collective mode.
-
-## Getting Started
-
-Before getting started, please make sure you have gone through the ImageNet [Data Preparation](../README.md#data-preparation).
-
-1. The entry point is `dist_train.py`; some important flags are as follows:
-
- - `model`, the model to run; the default is `DistResNet`, the ResNet variant tuned for distributed training.
- - `batch_size`, the batch size per device.
- - `update_method`, the parameter update method; choose from local, pserver or nccl2.
- - `device`, use CPU or GPU device.
- - `gpus`, the number of GPU devices the process uses.
-
- You can check out more details of the flags by `python dist_train.py --help`.
-
-1. Runtime configurations
-
- We use environment variables to distinguish the roles in a distributed training job; a small sketch of how a trainer parses them follows the list.
-
- - `PADDLE_TRAINING_ROLE`, the current training role, one of [PSERVER, TRAINER].
- - `PADDLE_TRAINERS`, the trainer count of a job.
- - `PADDLE_CURRENT_IP`, the current instance IP.
- - `PADDLE_PSERVER_IPS`, the parameter server IP list, separated by ","; only used when update_method is pserver.
- - `PADDLE_TRAINER_ID`, the unique trainer ID of a job, ranging over [0, PADDLE_TRAINERS).
- - `PADDLE_PSERVER_PORT`, the port the parameter server listens on.
- - `PADDLE_TRAINER_IPS`, the trainer IP list, separated by ","; only used when update_method is nccl2.
-
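- A minimal sketch (illustrative, not the project's actual code) of how a trainer assembles pserver endpoints from these variables:
-
- ``` python
- import os
-
- role = os.getenv("PADDLE_TRAINING_ROLE", "TRAINER")    # PSERVER or TRAINER
- trainers = int(os.getenv("PADDLE_TRAINERS", "1"))
- trainer_id = int(os.getenv("PADDLE_TRAINER_ID", "0"))  # in [0, trainers)
- port = os.getenv("PADDLE_PSERVER_PORT", "6174")
- pserver_ips = os.getenv("PADDLE_PSERVER_IPS", "")      # "ip1,ip2,..."
-
- endpoints = ",".join("%s:%s" % (ip, port) for ip in pserver_ips.split(",") if ip)
- print(role, trainer_id, "of", trainers, "->", endpoints)
- ```
-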
-### Parameter Server Mode
-
-In this example, we launched 4 parameter server instances and 4 trainer instances in the cluster:
-
-1. launch parameter server process
-
- ``` bash
- PADDLE_TRAINING_ROLE=PSERVER \
- PADDLE_TRAINERS=4 \
- PADDLE_PSERVER_IPS=192.168.0.100,192.168.0.101,192.168.0.102,192.168.0.103 \
- PADDLE_CURRENT_IP=192.168.0.100 \
- PADDLE_PSERVER_PORT=7164 \
- python dist_train.py \
- --model=DistResNet \
- --batch_size=32 \
- --update_method=pserver \
- --device=CPU \
- --data_dir=../data/ILSVRC2012
- ```
-
-1. launch trainer process
-
- ``` bash
- PADDLE_TRAINING_ROLE=TRAINER \
- PADDLE_TRAINERS=4 \
- PADDLE_PSERVER_IPS=192.168.0.100,192.168.0.101,192.168.0.102,192.168.0.103 \
- PADDLE_TRAINER_ID=0 \
- PADDLE_PSERVER_PORT=7164 \
- python dist_train.py \
- --model=DistResNet \
- --batch_size=32 \
- --update_method=pserver \
- --device=GPU \
- --data_dir=../data/ILSVRC2012
- ```
-
-### NCCL2 Collective Mode
-
-1. launch trainer process
-
- ``` bash
- PADDLE_TRAINING_ROLE=TRAINER \
- PADDLE_TRAINERS=4 \
- PADDLE_TRAINER_IPS=192.168.0.100,192.168.0.101,192.168.0.102,192.168.0.103 \
- PADDLE_TRAINER_ID=0 \
- python dist_train.py \
- --model=DistResNet \
- --batch_size=32 \
- --update_method=nccl2 \
- --device=GPU \
- --data_dir=../data/ILSVRC2012
- ```
-
-### Visualize the Training Process
-
-It's easy to draw the learning curve from the training logs; for example,
-the logs of ResNet50 look like this:
-
-``` text
-Pass 0, batch 0, loss 7.0336914, accuracies: [0.0, 0.00390625]
-Pass 0, batch 1, loss 7.094781, accuracies: [0.0, 0.0]
-Pass 0, batch 2, loss 7.007068, accuracies: [0.0, 0.0078125]
-Pass 0, batch 3, loss 7.1056547, accuracies: [0.00390625, 0.00390625]
-Pass 0, batch 4, loss 7.133543, accuracies: [0.0, 0.0078125]
-Pass 0, batch 5, loss 7.3055463, accuracies: [0.0078125, 0.01171875]
-Pass 0, batch 6, loss 7.341838, accuracies: [0.0078125, 0.01171875]
-Pass 0, batch 7, loss 7.290557, accuracies: [0.0, 0.0]
-Pass 0, batch 8, loss 7.264951, accuracies: [0.0, 0.00390625]
-Pass 0, batch 9, loss 7.43522, accuracies: [0.00390625, 0.00390625]
-```
-
-The figure below shows top-1 training accuracy for local training with 8 GPUs, distributed training
-with 32 GPUs, and distributed training with the batch-merge feature turned on. Note that the
-red curve is trained with the original model configuration, which lacks the warmup and some other
-refinements.
-
-For distributed training with 32 GPUs using `--model DistResNet`, we achieve 75.5% test accuracy after
-90 passes of training (the test accuracy is not shown in the figure below). The same result can be achieved
-with the "batch merge" feature (`--multi_batch_repeat 4`) at higher throughput.
-
-
-
-Training top-1 accuracy curves
-
-
-### Finetuning for Distributed Training
-
-The default resnet50 distributed training config is based on this paper: https://arxiv.org/pdf/1706.02677.pdf
-
-- use `--model DistResNet`
-- we use 32 P40 GPUs on 4 nodes, each with 8 GPUs
-- we set `batch_size=32` per GPU; in the `batch_merge=on` case, we repeat 4 micro-batches before communicating with the pserver
-- the learning rate starts from 0.1 and warms up to 0.4 over 5 passes using 4 nodes (since gradients are already
- merged, the rate only needs to scale linearly with the trainer count); a sketch of this schedule follows the list
-- batch merge (`--multi_batch_repeat 4`) makes better use of GPU computing power and increases the total
- training throughput: the fine-tune configuration fixes `batch_size=32` per GPU, and recent GPUs are fast
- enough that inter-node communication can bottleneck the overall speed. In batch-merge mode we run several
- batches of forward and backward computation, then merge the gradients and send them to the pserver for
- optimization; each repeat uses its own batch-norm mean and variance variables, so adding repeats
- behaves the same as adding more GPUs
-
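-A sketch of the warmup schedule described above (the per-pass step count is illustrative):
-
-``` python
-start_lr, trainer_count, warmup_passes = 0.1, 4, 5
-end_lr = start_lr * trainer_count                      # 0.1 -> 0.4 with 4 nodes
-
-steps_per_pass = 1281167 // (32 * 8 * trainer_count)   # images / global batch
-warmup_steps = steps_per_pass * warmup_passes
-
-def lr_at(step):
-    """Linear ramp from start_lr to end_lr, then hold (piecewise decay omitted)."""
-    if step >= warmup_steps:
-        return end_lr
-    return start_lr + (end_lr - start_lr) * float(step) / warmup_steps
-```
-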
-
-### Performance
-
-TBD
diff --git a/fluid/PaddleCV/image_classification/dist_train/dist_train.py b/fluid/PaddleCV/image_classification/dist_train/dist_train.py
deleted file mode 100644
index 05c0c23212cfe49f6ef7332143f833d7d7fa7486..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/dist_train/dist_train.py
+++ /dev/null
@@ -1,441 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import argparse
-import time
-import os
-import traceback
-
-import numpy as np
-
-import paddle
-import paddle.fluid as fluid
-import paddle.fluid.core as core
-import six
-import sys
-sys.path.append("..")
-import models
-from reader import train, val
-
-def parse_args():
- parser = argparse.ArgumentParser('Distributed Image Classification Training.')
- parser.add_argument(
- '--model',
- type=str,
- default='DistResNet',
- help='The model to run.')
- parser.add_argument(
- '--batch_size', type=int, default=32, help='The minibatch size per device.')
- parser.add_argument(
- '--multi_batch_repeat', type=int, default=1, help='Batch merge repeats.')
- parser.add_argument(
- '--learning_rate', type=float, default=0.1, help='The learning rate.')
- parser.add_argument(
- '--pass_num', type=int, default=90, help='The number of passes.')
- parser.add_argument(
- '--data_format',
- type=str,
- default='NCHW',
- choices=['NCHW', 'NHWC'],
- help='The data format; currently only NCHW is supported.')
- parser.add_argument(
- '--device',
- type=str,
- default='GPU',
- choices=['CPU', 'GPU'],
- help='The device type.')
- parser.add_argument(
- '--gpus',
- type=int,
- default=1,
- help='If gpus > 1, will use ParallelExecutor to run, else use Executor.')
- parser.add_argument(
- '--cpus',
- type=int,
- default=1,
- help='If cpus > 1, will set ParallelExecutor to use multiple threads.')
- parser.add_argument(
- '--no_test',
- action='store_true',
- help='If set, do not test the testset during training.')
- parser.add_argument(
- '--memory_optimize',
- action='store_true',
- help='If set, optimize runtime memory before start.')
- parser.add_argument(
- '--update_method',
- type=str,
- default='local',
- choices=['local', 'pserver', 'nccl2'],
- help='Choose parameter update method, can be local, pserver, nccl2.')
- parser.add_argument(
- '--no_split_var',
- action='store_true',
- default=False,
- help='Whether to split variables into blocks when update_method is pserver')
- parser.add_argument(
- '--async_mode',
- action='store_true',
- default=False,
- help='Whether to start the pserver in async mode to support ASGD')
- parser.add_argument(
- '--reduce_strategy',
- type=str,
- choices=['reduce', 'all_reduce'],
- default='all_reduce',
- help='Specify the reduce strategy, can be reduce, all_reduce')
- parser.add_argument(
- '--data_dir',
- type=str,
- default="../data/ILSVRC2012",
- help="The ImageNet dataset root dir."
- )
- args = parser.parse_args()
- return args
-
-def get_model(args, is_train, main_prog, startup_prog):
- pyreader = None
- class_dim = 1000
- if args.data_format == 'NCHW':
- dshape = [3, 224, 224]
- else:
- dshape = [224, 224, 3]
- if is_train:
- reader = train(data_dir=args.data_dir)
- else:
- reader = val(data_dir=args.data_dir)
-
- trainer_count = int(os.getenv("PADDLE_TRAINERS", "1"))
- with fluid.program_guard(main_prog, startup_prog):
- with fluid.unique_name.guard():
- pyreader = fluid.layers.py_reader(
- capacity=args.batch_size * args.gpus,
- shapes=([-1] + dshape, (-1, 1)),
- dtypes=('float32', 'int64'),
- name="train_reader" if is_train else "test_reader",
- use_double_buffer=True)
- input, label = fluid.layers.read_file(pyreader)
- model_def = models.__dict__[args.model](layers=50, is_train=is_train)
- predict = model_def.net(input, class_dim=class_dim)
-
- cost = fluid.layers.cross_entropy(input=predict, label=label)
- avg_cost = fluid.layers.mean(x=cost)
-
- batch_acc1 = fluid.layers.accuracy(input=predict, label=label, k=1)
- batch_acc5 = fluid.layers.accuracy(input=predict, label=label, k=5)
-
- optimizer = None
- if is_train:
- start_lr = args.learning_rate
- # LR after warmup: base LR scaled by trainer count and batch-merge repeats
- end_lr = args.learning_rate * trainer_count * args.multi_batch_repeat
- total_images = 1281167 / trainer_count
- step = int(total_images / (args.batch_size * args.gpus * args.multi_batch_repeat) + 1)
- warmup_steps = step * 5 # warmup 5 passes
- epochs = [30, 60, 80]
- bd = [step * e for e in epochs]
- base_lr = end_lr
- lr = [base_lr * (0.1**i) for i in range(len(bd) + 1)]
-
- optimizer = fluid.optimizer.Momentum(
- learning_rate=models.learning_rate.lr_warmup(
- fluid.layers.piecewise_decay(
- boundaries=bd, values=lr),
- warmup_steps, start_lr, end_lr),
- momentum=0.9,
- regularization=fluid.regularizer.L2Decay(1e-4))
- optimizer.minimize(avg_cost)
-
- batched_reader = None
- pyreader.decorate_paddle_reader(
- paddle.batch(
- reader,
- batch_size=args.batch_size))
-
- return avg_cost, optimizer, [batch_acc1,
- batch_acc5], batched_reader, pyreader
-
-def append_nccl2_prepare(trainer_id, startup_prog):
- trainer_id = int(os.getenv("PADDLE_TRAINER_ID"))
- port = os.getenv("PADDLE_PSERVER_PORT")
- worker_ips = os.getenv("PADDLE_TRAINER_IPS")
- worker_endpoints = []
- for ip in worker_ips.split(","):
- worker_endpoints.append(':'.join([ip, port]))
- current_endpoint = os.getenv("PADDLE_CURRENT_IP") + ":" + port
-
- config = fluid.DistributeTranspilerConfig()
- config.mode = "nccl2"
- t = fluid.DistributeTranspiler(config=config)
- t.transpile(trainer_id, trainers=','.join(worker_endpoints),
- current_endpoint=current_endpoint,
- startup_program=startup_prog)
- # The caller unpacks three values; the nccl2 transpiler creates no NCCL ID
- # variable here, so return a placeholder plus the worker count and id.
- return None, len(worker_endpoints), trainer_id
-
-
-def dist_transpile(trainer_id, args, train_prog, startup_prog):
- port = os.getenv("PADDLE_PSERVER_PORT", "6174")
- pserver_ips = os.getenv("PADDLE_PSERVER_IPS", "")
- eplist = []
- for ip in pserver_ips.split(","):
- eplist.append(':'.join([ip, port]))
- pserver_endpoints = ",".join(eplist)
- trainers = int(os.getenv("PADDLE_TRAINERS"))
- current_endpoint = os.getenv("PADDLE_CURRENT_IP", "") + ":" + port
- training_role = os.getenv("PADDLE_TRAINING_ROLE")
-
- config = fluid.DistributeTranspilerConfig()
- config.slice_var_up = not args.no_split_var
- t = fluid.DistributeTranspiler(config=config)
- t.transpile(
- trainer_id,
- program=train_prog,
- pservers=pserver_endpoints,
- trainers=trainers,
- sync_mode=not args.async_mode,
- startup_program=startup_prog)
- if training_role == "PSERVER":
- pserver_program = t.get_pserver_program(current_endpoint)
- pserver_startup_program = t.get_startup_program(
- current_endpoint, pserver_program, startup_program=startup_prog)
- return pserver_program, pserver_startup_program
- elif training_role == "TRAINER":
- train_program = t.get_trainer_program()
- return train_program, startup_prog
- else:
- raise ValueError(
- 'PADDLE_TRAINING_ROLE environment variable must be either TRAINER or PSERVER'
- )
-
-def append_bn_repeat_init_op(main_prog, startup_prog, num_repeats):
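- # Batch merge keeps a private copy of each BN mean/variance per repeat
- # ("<name>.repeat.<i>"), initialized like the original variable, so that
- # N merged micro-batches behave statistically like N devices.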
- repeat_vars = set()
- for op in main_prog.global_block().ops:
- if op.type == "batch_norm":
- repeat_vars.add(op.input("Mean")[0])
- repeat_vars.add(op.input("Variance")[0])
-
- for i in range(num_repeats):
- for op in startup_prog.global_block().ops:
- if op.type == "fill_constant":
- for oname in op.output_arg_names:
- if oname in repeat_vars:
- var = startup_prog.global_block().var(oname)
- repeat_var_name = "%s.repeat.%d" % (oname, i)
- repeat_var = startup_prog.global_block().create_var(
- name=repeat_var_name,
- type=var.type,
- dtype=var.dtype,
- shape=var.shape,
- persistable=var.persistable
- )
- main_prog.global_block()._clone_variable(repeat_var)
- startup_prog.global_block().append_op(
- type="fill_constant",
- inputs={},
- outputs={"Out": repeat_var},
- attrs=op.all_attrs()
- )
-
-
-def copyback_repeat_bn_params(main_prog):
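- # Before testing, copy the repeat-0 BN statistics back into the original
- # variables, since evaluation runs without the batch-merge pass.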
- repeat_vars = set()
- for op in main_prog.global_block().ops:
- if op.type == "batch_norm":
- repeat_vars.add(op.input("Mean")[0])
- repeat_vars.add(op.input("Variance")[0])
- for vname in repeat_vars:
- real_var = fluid.global_scope().find_var("%s.repeat.0" % vname).get_tensor()
- orig_var = fluid.global_scope().find_var(vname).get_tensor()
- orig_var.set(np.array(real_var), fluid.CUDAPlace(0)) # test on GPU0
-
-
-def test_single(exe, test_args, args, test_prog):
- acc_evaluators = []
- for i in range(len(test_args[2])):
- acc_evaluators.append(fluid.metrics.Accuracy())
-
- to_fetch = [v.name for v in test_args[2]]
- test_args[4].start()
- while True:
- try:
- acc_rets = exe.run(program=test_prog, fetch_list=to_fetch)
- for i, e in enumerate(acc_evaluators):
- e.update(
- value=np.array(acc_rets[i]), weight=args.batch_size)
- except fluid.core.EOFException as eof:
- test_args[4].reset()
- break
-
- return [e.eval() for e in acc_evaluators]
-
-
-def train_parallel(train_args, test_args, args, train_prog, test_prog,
- startup_prog, nccl_id_var, num_trainers, trainer_id):
- over_all_start = time.time()
- place = core.CPUPlace() if args.device == 'CPU' else core.CUDAPlace(0)
-
- if nccl_id_var and trainer_id == 0:
- #FIXME(wuyi): wait other trainer to start listening
- time.sleep(30)
-
- startup_exe = fluid.Executor(place)
- if args.multi_batch_repeat > 1:
- append_bn_repeat_init_op(train_prog, startup_prog, args.multi_batch_repeat)
- startup_exe.run(startup_prog)
- strategy = fluid.ExecutionStrategy()
- strategy.num_threads = args.cpus
- strategy.allow_op_delay = False
- build_strategy = fluid.BuildStrategy()
- if args.multi_batch_repeat > 1:
- pass_builder = build_strategy._create_passes_from_strategy()
- mypass = pass_builder.insert_pass(
- len(pass_builder.all_passes()) - 2, "multi_batch_merge_pass")
- mypass.set_int("num_repeats", args.multi_batch_repeat)
- if args.reduce_strategy == "reduce":
- build_strategy.reduce_strategy = fluid.BuildStrategy(
- ).ReduceStrategy.Reduce
- else:
- build_strategy.reduce_strategy = fluid.BuildStrategy(
- ).ReduceStrategy.AllReduce
-
- avg_loss = train_args[0]
-
- if args.update_method == "pserver":
- # parameter server mode distributed training, merge
- # gradients on local server, do not initialize
- # ParallelExecutor with multi server all-reduce mode.
- num_trainers = 1
- trainer_id = 0
-
- exe = fluid.ParallelExecutor(
- True,
- avg_loss.name,
- main_program=train_prog,
- exec_strategy=strategy,
- build_strategy=build_strategy,
- num_trainers=num_trainers,
- trainer_id=trainer_id)
-
- pyreader = train_args[4]
- for pass_id in range(args.pass_num):
- num_samples = 0
- start_time = time.time()
- batch_id = 0
- pyreader.start()
- while True:
- fetch_list = [avg_loss.name]
- acc_name_list = [v.name for v in train_args[2]]
- fetch_list.extend(acc_name_list)
- try:
- if batch_id % 30 == 0:
- fetch_ret = exe.run(fetch_list)
- else:
- fetch_ret = exe.run([])
- except fluid.core.EOFException as eof:
- break
- except fluid.core.EnforceNotMet as ex:
- traceback.print_exc()
- break
- num_samples += args.batch_size * args.gpus
-
- if batch_id % 30 == 0:
- fetched_data = [np.mean(np.array(d)) for d in fetch_ret]
- print("Pass %d, batch %d, loss %s, accucacys: %s" %
- (pass_id, batch_id, fetched_data[0], fetched_data[1:]))
- batch_id += 1
-
- print_train_time(start_time, time.time(), num_samples)
- pyreader.reset()
-
- if not args.no_test and test_args[2]:
- if args.multi_batch_repeat > 1:
- copyback_repeat_bn_params(train_prog)
- test_ret = test_single(startup_exe, test_args, args, test_prog)
- print("Pass: %d, Test Accuracy: %s\n" %
- (pass_id, [np.mean(np.array(v)) for v in test_ret]))
-
- startup_exe.close()
- print("total train time: ", time.time() - over_all_start)
-
-
-def print_arguments(args):
- print('----------- Configuration Arguments -----------')
- for arg, value in sorted(six.iteritems(vars(args))):
- print('%s: %s' % (arg, value))
- print('------------------------------------------------')
-
-
-def print_train_time(start_time, end_time, num_samples):
- train_elapsed = end_time - start_time
- examples_per_sec = num_samples / train_elapsed
- print('\nTotal examples: %d, total time: %.5f, %.5f examples/sec\n' %
- (num_samples, train_elapsed, examples_per_sec))
-
-
-def print_paddle_envs():
- print('----------- Configuration envs -----------')
- for k in os.environ:
- if "PADDLE_" in k:
- print("ENV %s:%s" % (k, os.environ[k]))
- print('------------------------------------------------')
-
-
-def main():
- args = parse_args()
- print_arguments(args)
- print_paddle_envs()
-
- # the unique trainer id, starting from 0, needed by trainer
- # only
- nccl_id_var, num_trainers, trainer_id = (
- None, 1, int(os.getenv("PADDLE_TRAINER_ID", "0")))
-
- train_prog = fluid.Program()
- test_prog = fluid.Program()
- startup_prog = fluid.Program()
-
- train_args = list(get_model(args, True, train_prog, startup_prog))
- test_args = list(get_model(args, False, test_prog, startup_prog))
-
- all_args = [train_args, test_args, args]
-
- if args.update_method == "pserver":
- train_prog, startup_prog = dist_transpile(trainer_id, args, train_prog,
- startup_prog)
- if not train_prog:
- raise Exception(
- "Must configure correct environments to run dist train.")
- all_args.extend([train_prog, test_prog, startup_prog])
- if os.getenv("PADDLE_TRAINING_ROLE") == "TRAINER":
- all_args.extend([nccl_id_var, num_trainers, trainer_id])
- train_parallel(*all_args)
- elif os.getenv("PADDLE_TRAINING_ROLE") == "PSERVER":
- # start pserver with Executor
- server_exe = fluid.Executor(fluid.CPUPlace())
- server_exe.run(startup_prog)
- server_exe.run(train_prog)
- exit(0)
-
- # for other update methods, use default programs
- all_args.extend([train_prog, test_prog, startup_prog])
-
- if args.update_method == "nccl2":
- nccl_id_var, num_trainers, trainer_id = append_nccl2_prepare(
- trainer_id, startup_prog)
-
- all_args.extend([nccl_id_var, num_trainers, trainer_id])
- train_parallel(*all_args)
-
-if __name__ == "__main__":
- main()
diff --git a/fluid/PaddleCV/image_classification/eval.py b/fluid/PaddleCV/image_classification/eval.py
deleted file mode 100644
index 7d265e525e063488fd758c77b1d90550e6afdf9f..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/eval.py
+++ /dev/null
@@ -1,130 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import os
-import numpy as np
-import time
-import sys
-import paddle
-import paddle.fluid as fluid
-import models
-import reader
-import argparse
-import functools
-from models.learning_rate import cosine_decay
-from utility import add_arguments, print_arguments
-import math
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('batch_size', int, 256, "Minibatch size.")
-add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
-add_arg('class_dim', int, 1000, "Class number.")
-add_arg('image_shape', str, "3,224,224", "Input image size")
-add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
-add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
-add_arg('model', str, "SE_ResNeXt50_32x4d", "Set the network to use.")
-# yapf: enable
-
-model_list = [m for m in dir(models) if "__" not in m]
-
-
-def eval(args):
- # parameters from arguments
- class_dim = args.class_dim
- model_name = args.model
- pretrained_model = args.pretrained_model
- with_memory_optimization = args.with_mem_opt
- image_shape = [int(m) for m in args.image_shape.split(",")]
-
- assert model_name in model_list, "{} is not in lists: {}".format(args.model,
- model_list)
-
- image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
- label = fluid.layers.data(name='label', shape=[1], dtype='int64')
-
- # model definition
- model = models.__dict__[model_name]()
-
- if model_name is "GoogleNet":
- out0, out1, out2 = model.net(input=image, class_dim=class_dim)
- cost0 = fluid.layers.cross_entropy(input=out0, label=label)
- cost1 = fluid.layers.cross_entropy(input=out1, label=label)
- cost2 = fluid.layers.cross_entropy(input=out2, label=label)
- avg_cost0 = fluid.layers.mean(x=cost0)
- avg_cost1 = fluid.layers.mean(x=cost1)
- avg_cost2 = fluid.layers.mean(x=cost2)
-
- avg_cost = avg_cost0 + 0.3 * avg_cost1 + 0.3 * avg_cost2
- acc_top1 = fluid.layers.accuracy(input=out0, label=label, k=1)
- acc_top5 = fluid.layers.accuracy(input=out0, label=label, k=5)
- else:
- out = model.net(input=image, class_dim=class_dim)
- cost = fluid.layers.cross_entropy(input=out, label=label)
-
- avg_cost = fluid.layers.mean(x=cost)
- acc_top1 = fluid.layers.accuracy(input=out, label=label, k=1)
- acc_top5 = fluid.layers.accuracy(input=out, label=label, k=5)
-
- test_program = fluid.default_main_program().clone(for_test=True)
-
- if with_memory_optimization:
- fluid.memory_optimize(fluid.default_main_program())
-
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
- if pretrained_model:
-
- def if_exist(var):
- return os.path.exists(os.path.join(pretrained_model, var.name))
-
- fluid.io.load_vars(exe, pretrained_model, predicate=if_exist)
-
- val_reader = paddle.batch(reader.val(), batch_size=args.batch_size)
- feeder = fluid.DataFeeder(place=place, feed_list=[image, label])
-
- fetch_list = [avg_cost.name, acc_top1.name, acc_top5.name]
-
- test_info = [[], [], []]
- cnt = 0
- for batch_id, data in enumerate(val_reader()):
- t1 = time.time()
- loss, acc1, acc5 = exe.run(test_program,
- fetch_list=fetch_list,
- feed=feeder.feed(data))
- t2 = time.time()
- period = t2 - t1
- loss = np.mean(loss)
- acc1 = np.mean(acc1)
- acc5 = np.mean(acc5)
- test_info[0].append(loss * len(data))
- test_info[1].append(acc1 * len(data))
- test_info[2].append(acc5 * len(data))
- cnt += len(data)
- if batch_id % 10 == 0:
- print("Testbatch {0},loss {1}, "
- "acc1 {2},acc5 {3},time {4}".format(batch_id, \
- loss, acc1, acc5, \
- "%2.2f sec" % period))
- sys.stdout.flush()
-
- test_loss = np.sum(test_info[0]) / cnt
- test_acc1 = np.sum(test_info[1]) / cnt
- test_acc5 = np.sum(test_info[2]) / cnt
-
- print("Test_loss {0}, test_acc1 {1}, test_acc5 {2}".format(
- test_loss, test_acc1, test_acc5))
- sys.stdout.flush()
-
-
-def main():
- args = parser.parse_args()
- print_arguments(args)
- eval(args)
-
-
-if __name__ == '__main__':
- main()
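
The removed `eval.py` aggregates metrics by weighting each batch with its length, so a short final batch does not skew the averages. A standalone sketch with hypothetical numbers:

```python
# Per-batch (loss, top-1 accuracy, batch length) -- hypothetical values.
batches = [(2.18, 0.6250, 16), (0.90, 0.7500, 16), (1.33, 0.6875, 8)]

cnt = sum(n for _, _, n in batches)
test_loss = sum(loss * n for loss, _, n in batches) / cnt
test_acc1 = sum(acc * n for _, acc, n in batches) / cnt
print("Test_loss %.4f, test_acc1 %.4f" % (test_loss, test_acc1))
```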
diff --git a/fluid/PaddleCV/image_classification/infer.py b/fluid/PaddleCV/image_classification/infer.py
deleted file mode 100644
index 19d204a1a21fde57f4c9b28e0c61cb9fd02edc3c..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/infer.py
+++ /dev/null
@@ -1,94 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import os
-import numpy as np
-import time
-import sys
-import paddle
-import paddle.fluid as fluid
-import models
-import reader
-import argparse
-import functools
-from models.learning_rate import cosine_decay
-from utility import add_arguments, print_arguments
-import math
-
-parser = argparse.ArgumentParser(description=__doc__)
-# yapf: disable
-add_arg = functools.partial(add_arguments, argparser=parser)
-add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
-add_arg('class_dim', int, 1000, "Class number.")
-add_arg('image_shape', str, "3,224,224", "Input image size")
-add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
-add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
-add_arg('model', str, "SE_ResNeXt50_32x4d", "Set the network to use.")
-# yapf: enable
-
-model_list = [m for m in dir(models) if "__" not in m]
-
-
-def infer(args):
- # parameters from arguments
- class_dim = args.class_dim
- model_name = args.model
- pretrained_model = args.pretrained_model
- with_memory_optimization = args.with_mem_opt
- image_shape = [int(m) for m in args.image_shape.split(",")]
-
- assert model_name in model_list, "{} is not in lists: {}".format(args.model,
- model_list)
-
- image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
-
- # model definition
- model = models.__dict__[model_name]()
-
- if model_name is "GoogleNet":
- out, _, _ = model.net(input=image, class_dim=class_dim)
- else:
- out = model.net(input=image, class_dim=class_dim)
-
- test_program = fluid.default_main_program().clone(for_test=True)
-
- if with_memory_optimization:
- fluid.memory_optimize(fluid.default_main_program())
-
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
- if pretrained_model:
-
- def if_exist(var):
- return os.path.exists(os.path.join(pretrained_model, var.name))
-
- fluid.io.load_vars(exe, pretrained_model, predicate=if_exist)
-
- test_batch_size = 1
- test_reader = paddle.batch(reader.test(), batch_size=test_batch_size)
- feeder = fluid.DataFeeder(place=place, feed_list=[image])
-
- fetch_list = [out.name]
-
- TOPK = 1
- for batch_id, data in enumerate(test_reader()):
- result = exe.run(test_program,
- fetch_list=fetch_list,
- feed=feeder.feed(data))
- result = result[0][0]
- pred_label = np.argsort(result)[::-1][:TOPK]
- print("Test-{0}-score: {1}, class {2}"
- .format(batch_id, result[pred_label], pred_label))
- sys.stdout.flush()
-
-
-def main():
- args = parser.parse_args()
- print_arguments(args)
- infer(args)
-
-
-if __name__ == '__main__':
- main()
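
The removed `infer.py` selects the top-K classes by argsorting the score vector in descending order. A minimal NumPy sketch with hypothetical scores:

```python
import numpy as np

scores = np.array([0.1, 2.5, 0.7, 1.9])       # hypothetical per-class scores
TOPK = 1
pred_label = np.argsort(scores)[::-1][:TOPK]  # indices of the TOPK largest
print(pred_label, scores[pred_label])         # -> [1] [2.5]
```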
diff --git a/fluid/PaddleCV/image_classification/models/__init__.py b/fluid/PaddleCV/image_classification/models/__init__.py
deleted file mode 100644
index f43275b6c674e4d9772e480bd9ec480c75c447d1..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/models/__init__.py
+++ /dev/null
@@ -1,10 +0,0 @@
-from .alexnet import AlexNet
-from .mobilenet import MobileNet
-from .googlenet import GoogleNet
-from .vgg import VGG11, VGG13, VGG16, VGG19
-from .resnet import ResNet50, ResNet101, ResNet152
-from .resnet_dist import DistResNet
-from .inception_v4 import InceptionV4
-from .se_resnext import SE_ResNeXt50_32x4d, SE_ResNeXt101_32x4d, SE_ResNeXt152_32x4d
-from .dpn import DPN68, DPN92, DPN98, DPN107, DPN131
-from . import learning_rate
diff --git a/fluid/PaddleCV/image_classification/models/alexnet.py b/fluid/PaddleCV/image_classification/models/alexnet.py
deleted file mode 100644
index 3e0eab2dee1d2f2e8d3cb2e8c12a3504a1e7c0e5..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/models/alexnet.py
+++ /dev/null
@@ -1,150 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle
-import paddle.fluid as fluid
-import math
-
-__all__ = ['AlexNet']
-
-train_parameters = {
- "input_size": [3, 224, 224],
- "input_mean": [0.485, 0.456, 0.406],
- "input_std": [0.229, 0.224, 0.225],
- "learning_strategy": {
- "name": "piecewise_decay",
- "batch_size": 256,
- "epochs": [40, 70, 100],
- "steps": [0.01, 0.001, 0.0001, 0.00001]
- }
-}
-
-
-class AlexNet():
- def __init__(self):
- self.params = train_parameters
-
- def net(self, input, class_dim=1000):
- stdv = 1.0 / math.sqrt(input.shape[1] * 11 * 11)
- conv1 = fluid.layers.conv2d(
- input=input,
- num_filters=64,
- filter_size=11,
- stride=4,
- padding=2,
- groups=1,
- act='relu',
- bias_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)),
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)))
- pool1 = fluid.layers.pool2d(
- input=conv1,
- pool_size=3,
- pool_stride=2,
- pool_padding=0,
- pool_type='max')
-
- stdv = 1.0 / math.sqrt(pool1.shape[1] * 5 * 5)
- conv2 = fluid.layers.conv2d(
- input=pool1,
- num_filters=192,
- filter_size=5,
- stride=1,
- padding=2,
- groups=1,
- act='relu',
- bias_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)),
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)))
- pool2 = fluid.layers.pool2d(
- input=conv2,
- pool_size=3,
- pool_stride=2,
- pool_padding=0,
- pool_type='max')
-
- stdv = 1.0 / math.sqrt(pool2.shape[1] * 3 * 3)
- conv3 = fluid.layers.conv2d(
- input=pool2,
- num_filters=384,
- filter_size=3,
- stride=1,
- padding=1,
- groups=1,
- act='relu',
- bias_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)),
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)))
-
- stdv = 1.0 / math.sqrt(conv3.shape[1] * 3 * 3)
- conv4 = fluid.layers.conv2d(
- input=conv3,
- num_filters=256,
- filter_size=3,
- stride=1,
- padding=1,
- groups=1,
- act='relu',
- bias_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)),
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)))
-
- stdv = 1.0 / math.sqrt(conv4.shape[1] * 3 * 3)
- conv5 = fluid.layers.conv2d(
- input=conv4,
- num_filters=256,
- filter_size=3,
- stride=1,
- padding=1,
- groups=1,
- act='relu',
- bias_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)),
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)))
- pool5 = fluid.layers.pool2d(
- input=conv5,
- pool_size=3,
- pool_stride=2,
- pool_padding=0,
- pool_type='max')
-
- drop6 = fluid.layers.dropout(x=pool5, dropout_prob=0.5)
-
- stdv = 1.0 / math.sqrt(drop6.shape[1] * drop6.shape[2] *
- drop6.shape[3] * 1.0)
- fc6 = fluid.layers.fc(
- input=drop6,
- size=4096,
- act='relu',
- bias_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)),
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)))
-
- drop7 = fluid.layers.dropout(x=fc6, dropout_prob=0.5)
-
- stdv = 1.0 / math.sqrt(drop7.shape[1] * 1.0)
- fc7 = fluid.layers.fc(
- input=drop7,
- size=4096,
- act='relu',
- bias_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)),
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)))
-
- stdv = 1.0 / math.sqrt(fc7.shape[1] * 1.0)
- out = fluid.layers.fc(
- input=fc7,
- size=class_dim,
- act='softmax',
- bias_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)),
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)))
- return out
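
Every layer in the removed `alexnet.py` draws its weights and biases from a uniform distribution bounded by `1/sqrt(fan_in)`, where fan-in is the input channel count times the kernel area. A standalone sketch of the bound for the first conv layer:

```python
import math
import numpy as np

c_in, k = 3, 11                       # first conv: 3 input channels, 11x11 kernel
stdv = 1.0 / math.sqrt(c_in * k * k)  # uniform bound, about 0.0525
weights = np.random.uniform(-stdv, stdv, size=(64, c_in, k, k))
print(round(stdv, 4), weights.shape)
```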
diff --git a/fluid/PaddleCV/image_classification/models/dpn.py b/fluid/PaddleCV/image_classification/models/dpn.py
deleted file mode 100644
index d9144eeb6e7dc781e33aad9b4f54ce0f3b9e903d..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/models/dpn.py
+++ /dev/null
@@ -1,280 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import os
-import numpy as np
-import time
-import sys
-import math
-import paddle.fluid as fluid
-
-__all__ = ["DPN", "DPN68", "DPN92", "DPN98", "DPN107", "DPN131"]
-
-train_parameters = {
- "input_size": [3, 224, 224],
- "input_mean": [0.485, 0.456, 0.406],
- "input_std": [0.229, 0.224, 0.225],
- "learning_strategy": {
- "name": "piecewise_decay",
- "batch_size": 256,
- "epochs": [30, 60, 90],
- "steps": [0.1, 0.01, 0.001, 0.0001]
- }
-}
-
-
-class DPN(object):
- def __init__(self, layers=68):
- self.params = train_parameters
- self.layers = layers
-
- def net(self, input, class_dim=1000):
- # get network args
- args = self.get_net_args(self.layers)
- bws = args['bw']
- inc_sec = args['inc_sec']
- rs = args['r']  # per-stage divisor used to compute R below
- k_r = args['k_r']
- k_sec = args['k_sec']
- G = args['G']
- init_num_filter = args['init_num_filter']
- init_filter_size = args['init_filter_size']
- init_padding = args['init_padding']
-
- ## define Dual Path Network
-
- # conv1
- conv1_x_1 = fluid.layers.conv2d(
- input=input,
- num_filters=init_num_filter,
- filter_size=init_filter_size,
- stride=2,
- padding=init_padding,
- groups=1,
- act=None,
- bias_attr=False)
- conv1_x_1 = fluid.layers.batch_norm(
- input=conv1_x_1, act='relu', is_test=False)
- convX_x_x = fluid.layers.pool2d(
- input=conv1_x_1,
- pool_size=3,
- pool_stride=2,
- pool_padding=1,
- pool_type='max')
-
- #conv2 - conv5
- for gc in range(4):
- bw = bws[gc]
- inc = inc_sec[gc]
- R = (k_r * bw) // rs[gc]
- if gc == 0:
- _type1 = 'proj'
- _type2 = 'normal'
- else:
- _type1 = 'down'
- _type2 = 'normal'
- convX_x_x = self.dual_path_factory(convX_x_x, R, R, bw, inc, G,
- _type1)
- for i_ly in range(2, k_sec[gc] + 1):
- convX_x_x = self.dual_path_factory(convX_x_x, R, R, bw, inc, G,
- _type2)
-
- conv5_x_x = fluid.layers.concat(convX_x_x, axis=1)
- conv5_x_x = fluid.layers.batch_norm(
- input=conv5_x_x, act='relu', is_test=False)
- pool5 = fluid.layers.pool2d(
- input=conv5_x_x,
- pool_size=7,
- pool_stride=1,
- pool_padding=0,
- pool_type='avg')
-
- #stdv = 1.0 / math.sqrt(pool5.shape[1] * 1.0)
- stdv = 0.01
- param_attr = fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv))
- fc6 = fluid.layers.fc(input=pool5,
- size=class_dim,
- act='softmax',
- param_attr=param_attr)
-
- return fc6
-
- def get_net_args(self, layers):
- if layers == 68:
- k_r = 128
- G = 32
- k_sec = [3, 4, 12, 3]
- inc_sec = [16, 32, 32, 64]
- bw = [64, 128, 256, 512]
- r = [64, 64, 64, 64]
- init_num_filter = 10
- init_filter_size = 3
- init_padding = 1
- elif layers == 92:
- k_r = 96
- G = 32
- k_sec = [3, 4, 20, 3]
- inc_sec = [16, 32, 24, 128]
- bw = [256, 512, 1024, 2048]
- r = [256, 256, 256, 256]
- init_num_filter = 64
- init_filter_size = 7
- init_padding = 3
- elif layers == 98:
- k_r = 160
- G = 40
- k_sec = [3, 6, 20, 3]
- inc_sec = [16, 32, 32, 128]
- bw = [256, 512, 1024, 2048]
- r = [256, 256, 256, 256]
- init_num_filter = 96
- init_filter_size = 7
- init_padding = 3
- elif layers == 107:
- k_r = 200
- G = 50
- k_sec = [4, 8, 20, 3]
- inc_sec = [20, 64, 64, 128]
- bw = [256, 512, 1024, 2048]
- r = [256, 256, 256, 256]
- init_num_filter = 128
- init_filter_size = 7
- init_padding = 3
- elif layers == 131:
- k_r = 160
- G = 40
- k_sec = [4, 8, 28, 3]
- inc_sec = [16, 32, 32, 128]
- bw = [256, 512, 1024, 2048]
- r = [256, 256, 256, 256]
- init_num_filter = 128
- init_filter_size = 7
- init_padding = 3
- else:
- raise NotImplementedError
- net_arg = {
- 'k_r': k_r,
- 'G': G,
- 'k_sec': k_sec,
- 'inc_sec': inc_sec,
- 'bw': bw,
- 'r': r
- }
- net_arg['init_num_filter'] = init_num_filter
- net_arg['init_filter_size'] = init_filter_size
- net_arg['init_padding'] = init_padding
-
- return net_arg
-
- def dual_path_factory(self,
- data,
- num_1x1_a,
- num_3x3_b,
- num_1x1_c,
- inc,
- G,
- _type='normal'):
- kw = 3
- kh = 3
- pw = (kw - 1) // 2
- ph = (kh - 1) // 2
-
- # type
- if _type == 'proj':
- key_stride = 1
- has_proj = True
- if _type == 'down':
- key_stride = 2
- has_proj = True
- if _type == 'normal':
- key_stride = 1
- has_proj = False
-
- # PROJ
- if isinstance(data, list):
- data_in = fluid.layers.concat([data[0], data[1]], axis=1)
- else:
- data_in = data
-
- if has_proj:
- c1x1_w = self.bn_ac_conv(
- data=data_in,
- num_filter=(num_1x1_c + 2 * inc),
- kernel=(1, 1),
- pad=(0, 0),
- stride=(key_stride, key_stride))
- data_o1, data_o2 = fluid.layers.split(
- c1x1_w, num_or_sections=[num_1x1_c, 2 * inc], dim=1)
- else:
- data_o1 = data[0]
- data_o2 = data[1]
-
- # MAIN
- c1x1_a = self.bn_ac_conv(
- data=data_in, num_filter=num_1x1_a, kernel=(1, 1), pad=(0, 0))
- c3x3_b = self.bn_ac_conv(
- data=c1x1_a,
- num_filter=num_3x3_b,
- kernel=(kw, kh),
- pad=(pw, ph),
- stride=(key_stride, key_stride),
- num_group=G)
- c1x1_c = self.bn_ac_conv(
- data=c3x3_b,
- num_filter=(num_1x1_c + inc),
- kernel=(1, 1),
- pad=(0, 0))
-
- c1x1_c1, c1x1_c2 = fluid.layers.split(
- c1x1_c, num_or_sections=[num_1x1_c, inc], dim=1)
-
- # OUTPUTS
- summ = fluid.layers.elementwise_add(x=data_o1, y=c1x1_c1)
- dense = fluid.layers.concat([data_o2, c1x1_c2], axis=1)
-
- return [summ, dense]
-
- def bn_ac_conv(self,
- data,
- num_filter,
- kernel,
- pad,
- stride=(1, 1),
- num_group=1):
- bn_ac = fluid.layers.batch_norm(input=data, act='relu', is_test=False)
- bn_ac_conv = fluid.layers.conv2d(
- input=bn_ac,
- num_filters=num_filter,
- filter_size=kernel,
- stride=stride,
- padding=pad,
- groups=num_group,
- act=None,
- bias_attr=False)
- return bn_ac_conv
-
-
-def DPN68():
- model = DPN(layers=68)
- return model
-
-
-def DPN92():
- model = DPN(layers=92)
- return model
-
-
-def DPN98():
- model = DPN(layers=98)
- return model
-
-
-def DPN107():
- model = DPN(layers=107)
- return model
-
-
-def DPN131():
- model = DPN(layers=131)
- return model
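As a usage sketch, the factory functions above plug into a Fluid program the same way the training script later in this diff does; the input shape follows `train_parameters`:

```
import paddle.fluid as fluid

image = fluid.layers.data(name='image', shape=[3, 224, 224], dtype='float32')
model = DPN92()                          # DPN with 92 layers
out = model.net(image, class_dim=1000)   # softmax probabilities, shape [-1, 1000]
```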
diff --git a/fluid/PaddleCV/image_classification/models/googlenet.py b/fluid/PaddleCV/image_classification/models/googlenet.py
deleted file mode 100644
index ebc8566e129296a453bd59109ffaf37f0760660a..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/models/googlenet.py
+++ /dev/null
@@ -1,167 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle
-import paddle.fluid as fluid
-
-__all__ = ['GoogleNet']
-
-train_parameters = {
- "input_size": [3, 224, 224],
- "input_mean": [0.485, 0.456, 0.406],
- "input_std": [0.229, 0.224, 0.225],
- "learning_strategy": {
- "name": "piecewise_decay",
- "batch_size": 256,
- "epochs": [30, 60, 90],
- "steps": [0.1, 0.01, 0.001, 0.0001]
- }
-}
-
-
-class GoogleNet():
- def __init__(self):
- self.params = train_parameters
-
- def conv_layer(self,
- input,
- num_filters,
- filter_size,
- stride=1,
- groups=1,
- act=None):
- channels = input.shape[1]
- stdv = (3.0 / (filter_size**2 * channels))**0.5
- param_attr = fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv))
- conv = fluid.layers.conv2d(
- input=input,
- num_filters=num_filters,
- filter_size=filter_size,
- stride=stride,
- padding=(filter_size - 1) // 2,
- groups=groups,
- act=act,
- param_attr=param_attr,
- bias_attr=False)
- return conv
-
- def xavier(self, channels, filter_size):
- stdv = (3.0 / (filter_size**2 * channels))**0.5
- param_attr = fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv))
- return param_attr
-
- def inception(self, name, input, channels, filter1, filter3R, filter3,
- filter5R, filter5, proj):
- conv1 = self.conv_layer(
- input=input, num_filters=filter1, filter_size=1, stride=1, act=None)
- conv3r = self.conv_layer(
- input=input,
- num_filters=filter3R,
- filter_size=1,
- stride=1,
- act=None)
- conv3 = self.conv_layer(
- input=conv3r,
- num_filters=filter3,
- filter_size=3,
- stride=1,
- act=None)
- conv5r = self.conv_layer(
- input=input,
- num_filters=filter5R,
- filter_size=1,
- stride=1,
- act=None)
- conv5 = self.conv_layer(
- input=conv5r,
- num_filters=filter5,
- filter_size=5,
- stride=1,
- act=None)
- pool = fluid.layers.pool2d(
- input=input,
- pool_size=3,
- pool_stride=1,
- pool_padding=1,
- pool_type='max')
- convprj = fluid.layers.conv2d(
- input=pool, filter_size=1, num_filters=proj, stride=1, padding=0)
- cat = fluid.layers.concat(input=[conv1, conv3, conv5, convprj], axis=1)
- cat = fluid.layers.relu(cat)
- return cat
-
- def net(self, input, class_dim=1000):
- conv = self.conv_layer(
- input=input, num_filters=64, filter_size=7, stride=2, act=None)
- pool = fluid.layers.pool2d(
- input=conv, pool_size=3, pool_type='max', pool_stride=2)
-
- conv = self.conv_layer(
- input=pool, num_filters=64, filter_size=1, stride=1, act=None)
- conv = self.conv_layer(
- input=conv, num_filters=192, filter_size=3, stride=1, act=None)
- pool = fluid.layers.pool2d(
- input=conv, pool_size=3, pool_type='max', pool_stride=2)
-
- ince3a = self.inception("ince3a", pool, 192, 64, 96, 128, 16, 32, 32)
- ince3b = self.inception("ince3b", ince3a, 256, 128, 128, 192, 32, 96,
- 64)
- pool3 = fluid.layers.pool2d(
- input=ince3b, pool_size=3, pool_type='max', pool_stride=2)
-
- ince4a = self.inception("ince4a", pool3, 480, 192, 96, 208, 16, 48, 64)
- ince4b = self.inception("ince4b", ince4a, 512, 160, 112, 224, 24, 64,
- 64)
- ince4c = self.inception("ince4c", ince4b, 512, 128, 128, 256, 24, 64,
- 64)
- ince4d = self.inception("ince4d", ince4c, 512, 112, 144, 288, 32, 64,
- 64)
- ince4e = self.inception("ince4e", ince4d, 528, 256, 160, 320, 32, 128,
- 128)
- pool4 = fluid.layers.pool2d(
- input=ince4e, pool_size=3, pool_type='max', pool_stride=2)
-
- ince5a = self.inception("ince5a", pool4, 832, 256, 160, 320, 32, 128,
- 128)
- ince5b = self.inception("ince5b", ince5a, 832, 384, 192, 384, 48, 128,
- 128)
- pool5 = fluid.layers.pool2d(
- input=ince5b, pool_size=7, pool_type='avg', pool_stride=7)
- dropout = fluid.layers.dropout(x=pool5, dropout_prob=0.4)
- out = fluid.layers.fc(input=dropout,
- size=class_dim,
- act='softmax',
- param_attr=self.xavier(1024, 1))
-
- pool_o1 = fluid.layers.pool2d(
- input=ince4a, pool_size=5, pool_type='avg', pool_stride=3)
- conv_o1 = self.conv_layer(
- input=pool_o1, num_filters=128, filter_size=1, stride=1, act=None)
- fc_o1 = fluid.layers.fc(input=conv_o1,
- size=1024,
- act='relu',
- param_attr=self.xavier(2048, 1))
- dropout_o1 = fluid.layers.dropout(x=fc_o1, dropout_prob=0.7)
- out1 = fluid.layers.fc(input=dropout_o1,
- size=class_dim,
- act='softmax',
- param_attr=self.xavier(1024, 1))
-
- pool_o2 = fluid.layers.pool2d(
- input=ince4d, pool_size=5, pool_type='avg', pool_stride=3)
- conv_o2 = self.conv_layer(
- input=pool_o2, num_filters=128, filter_size=1, stride=1, act=None)
- fc_o2 = fluid.layers.fc(input=conv_o2,
- size=1024,
- act='relu',
- param_attr=self.xavier(2048, 1))
- dropout_o2 = fluid.layers.dropout(x=fc_o2, dropout_prob=0.7)
- out2 = fluid.layers.fc(input=dropout_o2,
- size=class_dim,
- act='softmax',
- param_attr=self.xavier(1024, 1))
-
- # last fc layer is "out"
- return out, out1, out2
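Note that `GoogleNet.net` returns three softmax outputs: the main classifier plus two auxiliary heads attached after `ince4a` and `ince4d`. The training script later in this diff weights the auxiliary losses by 0.3, roughly:

```
cost0 = fluid.layers.cross_entropy(input=out, label=label)
cost1 = fluid.layers.cross_entropy(input=out1, label=label)
cost2 = fluid.layers.cross_entropy(input=out2, label=label)
avg_cost = fluid.layers.mean(x=cost0) \
    + 0.3 * fluid.layers.mean(x=cost1) \
    + 0.3 * fluid.layers.mean(x=cost2)
```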
diff --git a/fluid/PaddleCV/image_classification/models/inception_v4.py b/fluid/PaddleCV/image_classification/models/inception_v4.py
deleted file mode 100644
index d3a80a20500f365166d50a0cf222613d0427354f..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/models/inception_v4.py
+++ /dev/null
@@ -1,207 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle
-import paddle.fluid as fluid
-import math
-
-__all__ = ['InceptionV4']
-
-train_parameters = {
- "input_size": [3, 224, 224],
- "input_mean": [0.485, 0.456, 0.406],
- "input_std": [0.229, 0.224, 0.225],
- "learning_strategy": {
- "name": "piecewise_decay",
- "batch_size": 256,
- "epochs": [30, 60, 90],
- "steps": [0.1, 0.01, 0.001, 0.0001]
- }
-}
-
-
-class InceptionV4():
- def __init__(self):
- self.params = train_parameters
-
- def net(self, input, class_dim=1000):
- x = self.inception_stem(input)
-
- for i in range(4):
- x = self.inceptionA(x)
- x = self.reductionA(x)
-
- for i in range(7):
- x = self.inceptionB(x)
- x = self.reductionB(x)
-
- for i in range(3):
- x = self.inceptionC(x)
-
- pool = fluid.layers.pool2d(
- input=x, pool_size=8, pool_type='avg', global_pooling=True)
-
- drop = fluid.layers.dropout(x=pool, dropout_prob=0.2)
-
- stdv = 1.0 / math.sqrt(drop.shape[1] * 1.0)
- out = fluid.layers.fc(
- input=drop,
- size=class_dim,
- act='softmax',
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv, stdv)))
- return out
-
- def conv_bn_layer(self,
- data,
- num_filters,
- filter_size,
- stride=1,
- padding=0,
- groups=1,
- act='relu'):
- conv = fluid.layers.conv2d(
- input=data,
- num_filters=num_filters,
- filter_size=filter_size,
- stride=stride,
- padding=padding,
- groups=groups,
- act=None,
- bias_attr=False)
- return fluid.layers.batch_norm(input=conv, act=act)
-
- def inception_stem(self, data):
- conv = self.conv_bn_layer(data, 32, 3, stride=2, act='relu')
- conv = self.conv_bn_layer(conv, 32, 3, act='relu')
- conv = self.conv_bn_layer(conv, 64, 3, padding=1, act='relu')
-
- pool1 = fluid.layers.pool2d(
- input=conv, pool_size=3, pool_stride=2, pool_type='max')
- conv2 = self.conv_bn_layer(conv, 96, 3, stride=2, act='relu')
- concat = fluid.layers.concat([pool1, conv2], axis=1)
-
- conv1 = self.conv_bn_layer(concat, 64, 1, act='relu')
- conv1 = self.conv_bn_layer(conv1, 96, 3, act='relu')
-
- conv2 = self.conv_bn_layer(concat, 64, 1, act='relu')
- conv2 = self.conv_bn_layer(
- conv2, 64, (7, 1), padding=(3, 0), act='relu')
- conv2 = self.conv_bn_layer(
- conv2, 64, (1, 7), padding=(0, 3), act='relu')
- conv2 = self.conv_bn_layer(conv2, 96, 3, act='relu')
-
- concat = fluid.layers.concat([conv1, conv2], axis=1)
-
- conv1 = self.conv_bn_layer(concat, 192, 3, stride=2, act='relu')
- pool1 = fluid.layers.pool2d(
- input=concat, pool_size=3, pool_stride=2, pool_type='max')
-
- concat = fluid.layers.concat([conv1, pool1], axis=1)
-
- return concat
-
- def inceptionA(self, data):
- pool1 = fluid.layers.pool2d(
- input=data, pool_size=3, pool_padding=1, pool_type='avg')
- conv1 = self.conv_bn_layer(pool1, 96, 1, act='relu')
-
- conv2 = self.conv_bn_layer(data, 96, 1, act='relu')
-
- conv3 = self.conv_bn_layer(data, 64, 1, act='relu')
- conv3 = self.conv_bn_layer(conv3, 96, 3, padding=1, act='relu')
-
- conv4 = self.conv_bn_layer(data, 64, 1, act='relu')
- conv4 = self.conv_bn_layer(conv4, 96, 3, padding=1, act='relu')
- conv4 = self.conv_bn_layer(conv4, 96, 3, padding=1, act='relu')
-
- concat = fluid.layers.concat([conv1, conv2, conv3, conv4], axis=1)
-
- return concat
-
- def reductionA(self, data):
- pool1 = fluid.layers.pool2d(
- input=data, pool_size=3, pool_stride=2, pool_type='max')
-
- conv2 = self.conv_bn_layer(data, 384, 3, stride=2, act='relu')
-
- conv3 = self.conv_bn_layer(data, 192, 1, act='relu')
- conv3 = self.conv_bn_layer(conv3, 224, 3, padding=1, act='relu')
- conv3 = self.conv_bn_layer(conv3, 256, 3, stride=2, act='relu')
-
- concat = fluid.layers.concat([pool1, conv2, conv3], axis=1)
-
- return concat
-
- def inceptionB(self, data):
- pool1 = fluid.layers.pool2d(
- input=data, pool_size=3, pool_padding=1, pool_type='avg')
- conv1 = self.conv_bn_layer(pool1, 128, 1, act='relu')
-
- conv2 = self.conv_bn_layer(data, 384, 1, act='relu')
-
- conv3 = self.conv_bn_layer(data, 192, 1, act='relu')
- conv3 = self.conv_bn_layer(
- conv3, 224, (1, 7), padding=(0, 3), act='relu')
- conv3 = self.conv_bn_layer(
- conv3, 256, (7, 1), padding=(3, 0), act='relu')
-
- conv4 = self.conv_bn_layer(data, 192, 1, act='relu')
- conv4 = self.conv_bn_layer(
- conv4, 192, (1, 7), padding=(0, 3), act='relu')
- conv4 = self.conv_bn_layer(
- conv4, 224, (7, 1), padding=(3, 0), act='relu')
- conv4 = self.conv_bn_layer(
- conv4, 224, (1, 7), padding=(0, 3), act='relu')
- conv4 = self.conv_bn_layer(
- conv4, 256, (7, 1), padding=(3, 0), act='relu')
-
- concat = fluid.layers.concat([conv1, conv2, conv3, conv4], axis=1)
-
- return concat
-
- def reductionB(self, data):
- pool1 = fluid.layers.pool2d(
- input=data, pool_size=3, pool_stride=2, pool_type='max')
-
- conv2 = self.conv_bn_layer(data, 192, 1, act='relu')
- conv2 = self.conv_bn_layer(conv2, 192, 3, stride=2, act='relu')
-
- conv3 = self.conv_bn_layer(data, 256, 1, act='relu')
- conv3 = self.conv_bn_layer(
- conv3, 256, (1, 7), padding=(0, 3), act='relu')
- conv3 = self.conv_bn_layer(
- conv3, 320, (7, 1), padding=(3, 0), act='relu')
- conv3 = self.conv_bn_layer(conv3, 320, 3, stride=2, act='relu')
-
- concat = fluid.layers.concat([pool1, conv2, conv3], axis=1)
-
- return concat
-
- def inceptionC(self, data):
- pool1 = fluid.layers.pool2d(
- input=data, pool_size=3, pool_padding=1, pool_type='avg')
- conv1 = self.conv_bn_layer(pool1, 256, 1, act='relu')
-
- conv2 = self.conv_bn_layer(data, 256, 1, act='relu')
-
- conv3 = self.conv_bn_layer(data, 384, 1, act='relu')
- conv3_1 = self.conv_bn_layer(
- conv3, 256, (1, 3), padding=(0, 1), act='relu')
- conv3_2 = self.conv_bn_layer(
- conv3, 256, (3, 1), padding=(1, 0), act='relu')
-
- conv4 = self.conv_bn_layer(data, 384, 1, act='relu')
- conv4 = self.conv_bn_layer(
- conv4, 448, (1, 3), padding=(0, 1), act='relu')
- conv4 = self.conv_bn_layer(
- conv4, 512, (3, 1), padding=(1, 0), act='relu')
- conv4_1 = self.conv_bn_layer(
- conv4, 256, (1, 3), padding=(0, 1), act='relu')
- conv4_2 = self.conv_bn_layer(
- conv4, 256, (3, 1), padding=(1, 0), act='relu')
-
- concat = fluid.layers.concat(
- [conv1, conv2, conv3_1, conv3_2, conv4_1, conv4_2], axis=1)
-
- return concat
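The (1, 7)/(7, 1) and (1, 3)/(3, 1) pairs above are Inception-style asymmetric factorizations: a 1xk conv stacked on a kx1 conv covers the same kxk receptive field with far fewer weights. A back-of-the-envelope comparison for the 7x7 case (illustrative channel counts):

```
cin = cout = 224
full_7x7 = 7 * 7 * cin * cout              # 2,458,624 weights
factorized = (1 * 7 + 7 * 1) * cin * cout  # 702,464 weights, same receptive field
```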
diff --git a/fluid/PaddleCV/image_classification/models/learning_rate.py b/fluid/PaddleCV/image_classification/models/learning_rate.py
deleted file mode 100644
index 01922eb3a490320c58a6a1a68da7c1479882379b..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/models/learning_rate.py
+++ /dev/null
@@ -1,50 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle
-import paddle.fluid as fluid
-import paddle.fluid.layers.ops as ops
-from paddle.fluid.initializer import init_on_cpu
-from paddle.fluid.layers.learning_rate_scheduler import _decay_step_counter
-import math
-
-
-def cosine_decay(learning_rate, step_each_epoch, epochs=120):
- """Applies cosine decay to the learning rate.
- lr = 0.05 * (math.cos(epoch * (math.pi / 120)) + 1)
- """
- global_step = _decay_step_counter()
-
- with init_on_cpu():
- epoch = ops.floor(global_step / step_each_epoch)
- decayed_lr = learning_rate * \
- (ops.cos(epoch * (math.pi / epochs)) + 1)/2
- return decayed_lr
-
-
-def lr_warmup(learning_rate, warmup_steps, start_lr, end_lr):
- """ Applies linear learning rate warmup for distributed training
- Argument learning_rate can be float or a Variable
- lr = lr + (warmup_rate * step / warmup_steps)
- """
- assert(isinstance(end_lr, float))
- assert(isinstance(start_lr, float))
- linear_step = end_lr - start_lr
- with fluid.default_main_program()._lr_schedule_guard():
- lr = fluid.layers.tensor.create_global_var(
- shape=[1],
- value=0.0,
- dtype='float32',
- persistable=True,
- name="learning_rate_warmup")
-
- global_step = fluid.layers.learning_rate_scheduler._decay_step_counter()
-
- with fluid.layers.control_flow.Switch() as switch:
- with switch.case(global_step < warmup_steps):
- decayed_lr = start_lr + linear_step * (global_step / warmup_steps)
- fluid.layers.tensor.assign(decayed_lr, lr)
- with switch.default():
- fluid.layers.tensor.assign(learning_rate, lr)
-
- return lr
\ No newline at end of file
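The cosine schedule above is easy to sanity-check in plain Python: it starts at the base rate, reaches half of it mid-run, and decays to zero at the final epoch:

```
import math

def cosine_lr(base_lr, epoch, epochs=120):
    return base_lr * (math.cos(epoch * math.pi / epochs) + 1) / 2

cosine_lr(0.1, 0)    # 0.1
cosine_lr(0.1, 60)   # 0.05
cosine_lr(0.1, 120)  # 0.0
```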
diff --git a/fluid/PaddleCV/image_classification/models/mobilenet.py b/fluid/PaddleCV/image_classification/models/mobilenet.py
deleted file mode 100644
index f3554734768d5bbec96dac2443b48389d235da91..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/models/mobilenet.py
+++ /dev/null
@@ -1,167 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle.fluid as fluid
-from paddle.fluid.initializer import MSRA
-from paddle.fluid.param_attr import ParamAttr
-
-__all__ = ['MobileNet']
-
-train_parameters = {
- "input_size": [3, 224, 224],
- "input_mean": [0.485, 0.456, 0.406],
- "input_std": [0.229, 0.224, 0.225],
- "learning_strategy": {
- "name": "piecewise_decay",
- "batch_size": 256,
- "epochs": [30, 60, 90],
- "steps": [0.1, 0.01, 0.001, 0.0001]
- }
-}
-
-
-class MobileNet():
- def __init__(self):
- self.params = train_parameters
-
- def net(self, input, class_dim=1000, scale=1.0):
- # conv1: 112x112
- input = self.conv_bn_layer(
- input,
- filter_size=3,
- channels=3,
- num_filters=int(32 * scale),
- stride=2,
- padding=1)
-
- # 56x56
- input = self.depthwise_separable(
- input,
- num_filters1=32,
- num_filters2=64,
- num_groups=32,
- stride=1,
- scale=scale)
-
- input = self.depthwise_separable(
- input,
- num_filters1=64,
- num_filters2=128,
- num_groups=64,
- stride=2,
- scale=scale)
-
- # 28x28
- input = self.depthwise_separable(
- input,
- num_filters1=128,
- num_filters2=128,
- num_groups=128,
- stride=1,
- scale=scale)
-
- input = self.depthwise_separable(
- input,
- num_filters1=128,
- num_filters2=256,
- num_groups=128,
- stride=2,
- scale=scale)
-
- # 14x14
- input = self.depthwise_separable(
- input,
- num_filters1=256,
- num_filters2=256,
- num_groups=256,
- stride=1,
- scale=scale)
-
- input = self.depthwise_separable(
- input,
- num_filters1=256,
- num_filters2=512,
- num_groups=256,
- stride=2,
- scale=scale)
-
- # 14x14
- for i in range(5):
- input = self.depthwise_separable(
- input,
- num_filters1=512,
- num_filters2=512,
- num_groups=512,
- stride=1,
- scale=scale)
- # 7x7
- input = self.depthwise_separable(
- input,
- num_filters1=512,
- num_filters2=1024,
- num_groups=512,
- stride=2,
- scale=scale)
-
- input = self.depthwise_separable(
- input,
- num_filters1=1024,
- num_filters2=1024,
- num_groups=1024,
- stride=1,
- scale=scale)
-
- input = fluid.layers.pool2d(
- input=input,
- pool_size=0,
- pool_stride=1,
- pool_type='avg',
- global_pooling=True)
-
- output = fluid.layers.fc(input=input,
- size=class_dim,
- act='softmax',
- param_attr=ParamAttr(initializer=MSRA()))
- return output
-
- def conv_bn_layer(self,
- input,
- filter_size,
- num_filters,
- stride,
- padding,
- channels=None,
- num_groups=1,
- act='relu',
- use_cudnn=True):
- conv = fluid.layers.conv2d(
- input=input,
- num_filters=num_filters,
- filter_size=filter_size,
- stride=stride,
- padding=padding,
- groups=num_groups,
- act=None,
- use_cudnn=use_cudnn,
- param_attr=ParamAttr(initializer=MSRA()),
- bias_attr=False)
- return fluid.layers.batch_norm(input=conv, act=act)
-
- def depthwise_separable(self, input, num_filters1, num_filters2, num_groups,
- stride, scale):
- depthwise_conv = self.conv_bn_layer(
- input=input,
- filter_size=3,
- num_filters=int(num_filters1 * scale),
- stride=stride,
- padding=1,
- num_groups=int(num_groups * scale),
- use_cudnn=False)
-
- pointwise_conv = self.conv_bn_layer(
- input=depthwise_conv,
- filter_size=1,
- num_filters=int(num_filters2 * scale),
- stride=1,
- padding=0)
- return pointwise_conv
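`depthwise_separable` above is the standard MobileNet factorization: a 3x3 conv with `groups` equal to the input channel count, followed by a 1x1 pointwise conv. The parameter saving is easy to see with concrete numbers (illustrative only):

```
cin, cout, k = 256, 256, 3
standard = k * k * cin * cout                 # 589,824 weights
separable = k * k * cin + 1 * 1 * cin * cout  # 2,304 + 65,536 = 67,840 weights
```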
diff --git a/fluid/PaddleCV/image_classification/models/resnet.py b/fluid/PaddleCV/image_classification/models/resnet.py
deleted file mode 100644
index 75c7b750541c60821a624f95a2ab56d2890fcb25..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/models/resnet.py
+++ /dev/null
@@ -1,123 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle
-import paddle.fluid as fluid
-import math
-
-__all__ = ["ResNet", "ResNet50", "ResNet101", "ResNet152"]
-
-train_parameters = {
- "input_size": [3, 224, 224],
- "input_mean": [0.485, 0.456, 0.406],
- "input_std": [0.229, 0.224, 0.225],
- "learning_strategy": {
- "name": "piecewise_decay",
- "batch_size": 256,
- "epochs": [30, 60, 90],
- "steps": [0.1, 0.01, 0.001, 0.0001]
- }
-}
-
-
-class ResNet():
- def __init__(self, layers=50):
- self.params = train_parameters
- self.layers = layers
-
- def net(self, input, class_dim=1000):
- layers = self.layers
- supported_layers = [50, 101, 152]
- assert layers in supported_layers, \
- "supported layers are {} but input layer is {}".format(supported_layers, layers)
-
- if layers == 50:
- depth = [3, 4, 6, 3]
- elif layers == 101:
- depth = [3, 4, 23, 3]
- elif layers == 152:
- depth = [3, 8, 36, 3]
- num_filters = [64, 128, 256, 512]
-
- conv = self.conv_bn_layer(
- input=input, num_filters=64, filter_size=7, stride=2, act='relu')
- conv = fluid.layers.pool2d(
- input=conv,
- pool_size=3,
- pool_stride=2,
- pool_padding=1,
- pool_type='max')
-
- for block in range(len(depth)):
- for i in range(depth[block]):
- conv = self.bottleneck_block(
- input=conv,
- num_filters=num_filters[block],
- stride=2 if i == 0 and block != 0 else 1)
-
- pool = fluid.layers.pool2d(
- input=conv, pool_size=7, pool_type='avg', global_pooling=True)
- stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
- out = fluid.layers.fc(input=pool,
- size=class_dim,
- act='softmax',
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv,
- stdv)))
- return out
-
- def conv_bn_layer(self,
- input,
- num_filters,
- filter_size,
- stride=1,
- groups=1,
- act=None):
- conv = fluid.layers.conv2d(
- input=input,
- num_filters=num_filters,
- filter_size=filter_size,
- stride=stride,
- padding=(filter_size - 1) // 2,
- groups=groups,
- act=None,
- bias_attr=False)
- return fluid.layers.batch_norm(input=conv, act=act)
-
- def shortcut(self, input, ch_out, stride):
- ch_in = input.shape[1]
- if ch_in != ch_out or stride != 1:
- return self.conv_bn_layer(input, ch_out, 1, stride)
- else:
- return input
-
- def bottleneck_block(self, input, num_filters, stride):
- conv0 = self.conv_bn_layer(
- input=input, num_filters=num_filters, filter_size=1, act='relu')
- conv1 = self.conv_bn_layer(
- input=conv0,
- num_filters=num_filters,
- filter_size=3,
- stride=stride,
- act='relu')
- conv2 = self.conv_bn_layer(
- input=conv1, num_filters=num_filters * 4, filter_size=1, act=None)
-
- short = self.shortcut(input, num_filters * 4, stride)
-
- return fluid.layers.elementwise_add(x=short, y=conv2, act='relu')
-
-
-def ResNet50():
- model = ResNet(layers=50)
- return model
-
-
-def ResNet101():
- model = ResNet(layers=101)
- return model
-
-
-def ResNet152():
- model = ResNet(layers=152)
- return model
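The `stride=2 if i == 0 and block != 0 else 1` rule yields the usual ResNet stage plan: the first stage keeps resolution, and each later stage downsamples in its first bottleneck. A quick enumeration for ResNet50:

```
depth = [3, 4, 6, 3]                  # bottlenecks per stage (ResNet50)
num_filters = [64, 128, 256, 512]
for block, (d, nf) in enumerate(zip(depth, num_filters)):
    for i in range(d):
        stride = 2 if i == 0 and block != 0 else 1
        print(block, i, nf * 4, stride)  # stage, index, output channels, stride
```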
diff --git a/fluid/PaddleCV/image_classification/models/resnet_dist.py b/fluid/PaddleCV/image_classification/models/resnet_dist.py
deleted file mode 100644
index 2dab3e6111d8d02df44233c0440a8ea9ce74faa5..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/models/resnet_dist.py
+++ /dev/null
@@ -1,121 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle
-import paddle.fluid as fluid
-import math
-
-__all__ = ["DistResNet"]
-
-train_parameters = {
- "input_size": [3, 224, 224],
- "input_mean": [0.485, 0.456, 0.406],
- "input_std": [0.229, 0.224, 0.225],
- "learning_strategy": {
- "name": "piecewise_decay",
- "batch_size": 256,
- "epochs": [30, 60, 90],
- "steps": [0.1, 0.01, 0.001, 0.0001]
- }
-}
-
-
-class DistResNet():
- def __init__(self, layers=50, is_train=True):
- self.params = train_parameters
- self.layers = layers
- self.is_train = is_train
- self.weight_decay = 1e-4
-
- def net(self, input, class_dim=1000):
- layers = self.layers
- supported_layers = [50, 101, 152]
- assert layers in supported_layers, \
- "supported layers are {} but input layer is {}".format(supported_layers, layers)
-
- if layers == 50:
- depth = [3, 4, 6, 3]
- elif layers == 101:
- depth = [3, 4, 23, 3]
- elif layers == 152:
- depth = [3, 8, 36, 3]
- num_filters = [64, 128, 256, 512]
-
- conv = self.conv_bn_layer(
- input=input, num_filters=64, filter_size=7, stride=2, act='relu')
- conv = fluid.layers.pool2d(
- input=conv,
- pool_size=3,
- pool_stride=2,
- pool_padding=1,
- pool_type='max')
-
- for block in range(len(depth)):
- for i in range(depth[block]):
- conv = self.bottleneck_block(
- input=conv,
- num_filters=num_filters[block],
- stride=2 if i == 0 and block != 0 else 1)
-
- pool = fluid.layers.pool2d(
- input=conv, pool_size=7, pool_type='avg', global_pooling=True)
- stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
- out = fluid.layers.fc(input=pool,
- size=class_dim,
- act='softmax',
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv,
- stdv),
- regularizer=fluid.regularizer.L2Decay(self.weight_decay)),
- bias_attr=fluid.ParamAttr(
- regularizer=fluid.regularizer.L2Decay(self.weight_decay))
- )
- return out
-
- def conv_bn_layer(self,
- input,
- num_filters,
- filter_size,
- stride=1,
- groups=1,
- act=None,
- bn_init_value=1.0):
- conv = fluid.layers.conv2d(
- input=input,
- num_filters=num_filters,
- filter_size=filter_size,
- stride=stride,
- padding=(filter_size - 1) // 2,
- groups=groups,
- act=None,
- bias_attr=False,
- param_attr=fluid.ParamAttr(regularizer=fluid.regularizer.L2Decay(self.weight_decay)))
- return fluid.layers.batch_norm(
- input=conv, act=act, is_test=not self.is_train,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Constant(bn_init_value),
- regularizer=None))
-
- def shortcut(self, input, ch_out, stride):
- ch_in = input.shape[1]
- if ch_in != ch_out or stride != 1:
- return self.conv_bn_layer(input, ch_out, 1, stride)
- else:
- return input
-
- def bottleneck_block(self, input, num_filters, stride):
- conv0 = self.conv_bn_layer(
- input=input, num_filters=num_filters, filter_size=1, act='relu')
- conv1 = self.conv_bn_layer(
- input=conv0,
- num_filters=num_filters,
- filter_size=3,
- stride=stride,
- act='relu')
- # NOTE: default bias is 0.0 already
- conv2 = self.conv_bn_layer(
- input=conv1, num_filters=num_filters * 4, filter_size=1, act=None, bn_init_value=0.0)
-
- short = self.shortcut(input, num_filters * 4, stride)
-
- return fluid.layers.elementwise_add(x=short, y=conv2, act='relu')
diff --git a/fluid/PaddleCV/image_classification/models/se_resnext.py b/fluid/PaddleCV/image_classification/models/se_resnext.py
deleted file mode 100644
index 023046a5db506d0111e246b03810b02c70f3e1b8..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/models/se_resnext.py
+++ /dev/null
@@ -1,200 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle
-import paddle.fluid as fluid
-import math
-
-__all__ = [
- "SE_ResNeXt", "SE_ResNeXt50_32x4d", "SE_ResNeXt101_32x4d",
- "SE_ResNeXt152_32x4d"
-]
-
-train_parameters = {
- "input_size": [3, 224, 224],
- "input_mean": [0.485, 0.456, 0.406],
- "input_std": [0.229, 0.224, 0.225],
- "dropout_seed": None,
- "learning_strategy": {
- "name": "piecewise_decay",
- "batch_size": 256,
- "epochs": [30, 60, 90],
- "steps": [0.1, 0.01, 0.001, 0.0001]
- }
-}
-
-
-class SE_ResNeXt():
- def __init__(self, layers=50):
- self.params = train_parameters
- self.layers = layers
-
- def net(self, input, class_dim=1000):
- layers = self.layers
- supported_layers = [50, 101, 152]
- assert layers in supported_layers, \
- "supported layers are {} but input layer is {}".format(supported_layers, layers)
- if layers == 50:
- cardinality = 32
- reduction_ratio = 16
- depth = [3, 4, 6, 3]
- num_filters = [128, 256, 512, 1024]
-
- conv = self.conv_bn_layer(
- input=input,
- num_filters=64,
- filter_size=7,
- stride=2,
- act='relu')
- conv = fluid.layers.pool2d(
- input=conv,
- pool_size=3,
- pool_stride=2,
- pool_padding=1,
- pool_type='max')
- elif layers == 101:
- cardinality = 32
- reduction_ratio = 16
- depth = [3, 4, 23, 3]
- num_filters = [128, 256, 512, 1024]
-
- conv = self.conv_bn_layer(
- input=input,
- num_filters=64,
- filter_size=7,
- stride=2,
- act='relu')
- conv = fluid.layers.pool2d(
- input=conv,
- pool_size=3,
- pool_stride=2,
- pool_padding=1,
- pool_type='max')
- elif layers == 152:
- cardinality = 64
- reduction_ratio = 16
- depth = [3, 8, 36, 3]
- num_filters = [128, 256, 512, 1024]
-
- conv = self.conv_bn_layer(
- input=input,
- num_filters=64,
- filter_size=3,
- stride=2,
- act='relu')
- conv = self.conv_bn_layer(
- input=conv, num_filters=64, filter_size=3, stride=1, act='relu')
- conv = self.conv_bn_layer(
- input=conv,
- num_filters=128,
- filter_size=3,
- stride=1,
- act='relu')
- conv = fluid.layers.pool2d(
- input=conv, pool_size=3, pool_stride=2, pool_padding=1, \
- pool_type='max')
-
- for block in range(len(depth)):
- for i in range(depth[block]):
- conv = self.bottleneck_block(
- input=conv,
- num_filters=num_filters[block],
- stride=2 if i == 0 and block != 0 else 1,
- cardinality=cardinality,
- reduction_ratio=reduction_ratio)
-
- pool = fluid.layers.pool2d(
- input=conv, pool_size=7, pool_type='avg', global_pooling=True)
- drop = fluid.layers.dropout(
- x=pool, dropout_prob=0.5, seed=self.params['dropout_seed'])
- stdv = 1.0 / math.sqrt(drop.shape[1] * 1.0)
- out = fluid.layers.fc(input=drop,
- size=class_dim,
- act='softmax',
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(-stdv,
- stdv)))
- return out
-
- def shortcut(self, input, ch_out, stride):
- ch_in = input.shape[1]
- if ch_in != ch_out or stride != 1:
- filter_size = 1
- return self.conv_bn_layer(input, ch_out, filter_size, stride)
- else:
- return input
-
- def bottleneck_block(self, input, num_filters, stride, cardinality,
- reduction_ratio):
- conv0 = self.conv_bn_layer(
- input=input, num_filters=num_filters, filter_size=1, act='relu')
- conv1 = self.conv_bn_layer(
- input=conv0,
- num_filters=num_filters,
- filter_size=3,
- stride=stride,
- groups=cardinality,
- act='relu')
- conv2 = self.conv_bn_layer(
- input=conv1, num_filters=num_filters * 2, filter_size=1, act=None)
- scale = self.squeeze_excitation(
- input=conv2,
- num_channels=num_filters * 2,
- reduction_ratio=reduction_ratio)
-
- short = self.shortcut(input, num_filters * 2, stride)
-
- return fluid.layers.elementwise_add(x=short, y=scale, act='relu')
-
- def conv_bn_layer(self,
- input,
- num_filters,
- filter_size,
- stride=1,
- groups=1,
- act=None):
- conv = fluid.layers.conv2d(
- input=input,
- num_filters=num_filters,
- filter_size=filter_size,
- stride=stride,
- padding=(filter_size - 1) // 2,
- groups=groups,
- act=None,
- bias_attr=False)
- return fluid.layers.batch_norm(input=conv, act=act)
-
- def squeeze_excitation(self, input, num_channels, reduction_ratio):
- pool = fluid.layers.pool2d(
- input=input, pool_size=0, pool_type='avg', global_pooling=True)
- stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)
- squeeze = fluid.layers.fc(input=pool,
- size=num_channels // reduction_ratio,
- act='relu',
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(
- -stdv, stdv)))
- stdv = 1.0 / math.sqrt(squeeze.shape[1] * 1.0)
- excitation = fluid.layers.fc(input=squeeze,
- size=num_channels,
- act='sigmoid',
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Uniform(
- -stdv, stdv)))
- scale = fluid.layers.elementwise_mul(x=input, y=excitation, axis=0)
- return scale
-
-
-def SE_ResNeXt50_32x4d():
- model = SE_ResNeXt(layers=50)
- return model
-
-
-def SE_ResNeXt101_32x4d():
- model = SE_ResNeXt(layers=101)
- return model
-
-
-def SE_ResNeXt152_32x4d():
- model = SE_ResNeXt(layers=152)
- return model
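`squeeze_excitation` implements the SE gate: global-average-pool each channel, bottleneck by `reduction_ratio` with a relu fc, re-expand with a sigmoid fc, then rescale the input channel-wise; `elementwise_mul(..., axis=0)` broadcasts the [N, C] gate over H and W. A NumPy stand-in for that final broadcast (hypothetical shapes):

```
import numpy as np

x = np.random.rand(2, 8, 4, 4)                    # [N, C, H, W] feature map
gate = 1 / (1 + np.exp(-np.random.randn(2, 8)))   # stand-in for the two fc layers
scaled = x * gate[:, :, None, None]               # per-channel rescaling
```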
diff --git a/fluid/PaddleCV/image_classification/models/vgg.py b/fluid/PaddleCV/image_classification/models/vgg.py
deleted file mode 100644
index 1af664fdb554f05ba8a556abbba36cd1c3141a40..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/models/vgg.py
+++ /dev/null
@@ -1,110 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle
-import paddle.fluid as fluid
-
-__all__ = ["VGGNet", "VGG11", "VGG13", "VGG16", "VGG19"]
-
-train_parameters = {
- "input_size": [3, 224, 224],
- "input_mean": [0.485, 0.456, 0.406],
- "input_std": [0.229, 0.224, 0.225],
- "learning_strategy": {
- "name": "piecewise_decay",
- "batch_size": 256,
- "epochs": [30, 60, 90],
- "steps": [0.1, 0.01, 0.001, 0.0001]
- }
-}
-
-
-class VGGNet():
- def __init__(self, layers=16):
- self.params = train_parameters
- self.layers = layers
-
- def net(self, input, class_dim=1000):
- layers = self.layers
- vgg_spec = {
- 11: ([1, 1, 2, 2, 2]),
- 13: ([2, 2, 2, 2, 2]),
- 16: ([2, 2, 3, 3, 3]),
- 19: ([2, 2, 4, 4, 4])
- }
- assert layers in vgg_spec.keys(), \
- "supported layers are {} but input layer is {}".format(vgg_spec.keys(), layers)
-
- nums = vgg_spec[layers]
- conv1 = self.conv_block(input, 64, nums[0])
- conv2 = self.conv_block(conv1, 128, nums[1])
- conv3 = self.conv_block(conv2, 256, nums[2])
- conv4 = self.conv_block(conv3, 512, nums[3])
- conv5 = self.conv_block(conv4, 512, nums[4])
-
- fc_dim = 4096
- fc1 = fluid.layers.fc(
- input=conv5,
- size=fc_dim,
- act='relu',
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Normal(scale=0.005)),
- bias_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Constant(value=0.1)))
- fc1 = fluid.layers.dropout(x=fc1, dropout_prob=0.5)
- fc2 = fluid.layers.fc(
- input=fc1,
- size=fc_dim,
- act='relu',
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Normal(scale=0.005)),
- bias_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Constant(value=0.1)))
- fc2 = fluid.layers.dropout(x=fc2, dropout_prob=0.5)
- out = fluid.layers.fc(
- input=fc2,
- size=class_dim,
- act='softmax',
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Normal(scale=0.005)),
- bias_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Constant(value=0.1)))
-
- return out
-
- def conv_block(self, input, num_filter, groups):
- conv = input
- for i in range(groups):
- conv = fluid.layers.conv2d(
- input=conv,
- num_filters=num_filter,
- filter_size=3,
- stride=1,
- padding=1,
- act='relu',
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Normal(scale=0.01)),
- bias_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.Constant(value=0.0)))
- return fluid.layers.pool2d(
- input=conv, pool_size=2, pool_type='max', pool_stride=2)
-
-
-def VGG11():
- model = VGGNet(layers=11)
- return model
-
-
-def VGG13():
- model = VGGNet(layers=13)
- return model
-
-
-def VGG16():
- model = VGGNet(layers=16)
- return model
-
-
-def VGG19():
- model = VGGNet(layers=19)
- return model
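Because each `conv_block` preserves spatial size (3x3 convs with padding 1) and ends with a stride-2 max pool, a 224x224 input reaches the classifier at 512x7x7, so the first 4096-wide fc layer sees 25,088 features:

```
size = 224
for _ in range(5):         # five conv blocks, each ending in a 2x2 / stride-2 pool
    size //= 2             # 224 -> 112 -> 56 -> 28 -> 14 -> 7
fc_in = 512 * size * size  # 25088
```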
diff --git a/fluid/PaddleCV/image_classification/reader.py b/fluid/PaddleCV/image_classification/reader.py
deleted file mode 100644
index 639b0b01200e3d81c57d75e560d6911f3e74b710..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/reader.py
+++ /dev/null
@@ -1,182 +0,0 @@
-import os
-import math
-import random
-import functools
-import numpy as np
-import paddle
-from PIL import Image, ImageEnhance
-
-random.seed(0)
-np.random.seed(0)
-
-DATA_DIM = 224
-
-THREAD = 8
-BUF_SIZE = 102400
-
-DATA_DIR = 'data/ILSVRC2012'
-
-img_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))
-img_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))
-
-
-def resize_short(img, target_size):
- percent = float(target_size) / min(img.size[0], img.size[1])
- resized_width = int(round(img.size[0] * percent))
- resized_height = int(round(img.size[1] * percent))
- img = img.resize((resized_width, resized_height), Image.LANCZOS)
- return img
-
-
-def crop_image(img, target_size, center):
- width, height = img.size
- size = target_size
- if center:
- w_start = (width - size) // 2
- h_start = (height - size) // 2
- else:
- w_start = np.random.randint(0, width - size + 1)
- h_start = np.random.randint(0, height - size + 1)
- w_end = w_start + size
- h_end = h_start + size
- img = img.crop((w_start, h_start, w_end, h_end))
- return img
-
-
-def random_crop(img, size, scale=[0.08, 1.0], ratio=[3. / 4., 4. / 3.]):
- aspect_ratio = math.sqrt(np.random.uniform(*ratio))
- w = 1. * aspect_ratio
- h = 1. / aspect_ratio
-
- bound = min((float(img.size[0]) / img.size[1]) / (w**2),
- (float(img.size[1]) / img.size[0]) / (h**2))
- scale_max = min(scale[1], bound)
- scale_min = min(scale[0], bound)
-
- target_area = img.size[0] * img.size[1] * np.random.uniform(scale_min,
- scale_max)
- target_size = math.sqrt(target_area)
- w = int(target_size * w)
- h = int(target_size * h)
-
- i = np.random.randint(0, img.size[0] - w + 1)
- j = np.random.randint(0, img.size[1] - h + 1)
-
- img = img.crop((i, j, i + w, j + h))
- img = img.resize((size, size), Image.LANCZOS)
- return img
-
-
-def rotate_image(img):
- angle = np.random.randint(-10, 11)
- img = img.rotate(angle)
- return img
-
-
-def distort_color(img):
- def random_brightness(img, lower=0.5, upper=1.5):
- e = np.random.uniform(lower, upper)
- return ImageEnhance.Brightness(img).enhance(e)
-
- def random_contrast(img, lower=0.5, upper=1.5):
- e = np.random.uniform(lower, upper)
- return ImageEnhance.Contrast(img).enhance(e)
-
- def random_color(img, lower=0.5, upper=1.5):
- e = np.random.uniform(lower, upper)
- return ImageEnhance.Color(img).enhance(e)
-
- ops = [random_brightness, random_contrast, random_color]
- np.random.shuffle(ops)
-
- img = ops[0](img)
- img = ops[1](img)
- img = ops[2](img)
-
- return img
-
-
-def process_image(sample, mode, color_jitter, rotate):
- img_path = sample[0]
-
- img = Image.open(img_path)
- if mode == 'train':
- if rotate: img = rotate_image(img)
- img = random_crop(img, DATA_DIM)
- else:
- img = resize_short(img, target_size=256)
- img = crop_image(img, target_size=DATA_DIM, center=True)
- if mode == 'train':
- if color_jitter:
- img = distort_color(img)
- if np.random.randint(0, 2) == 1:
- img = img.transpose(Image.FLIP_LEFT_RIGHT)
-
- if img.mode != 'RGB':
- img = img.convert('RGB')
-
- img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255
- img -= img_mean
- img /= img_std
-
- if mode == 'train' or mode == 'val':
- return img, sample[1]
- elif mode == 'test':
- return [img]
-
-
-def _reader_creator(file_list,
- mode,
- shuffle=False,
- color_jitter=False,
- rotate=False,
- data_dir=DATA_DIR):
- def reader():
- with open(file_list) as flist:
- full_lines = [line.strip() for line in flist]
- if shuffle:
- np.random.shuffle(full_lines)
- if mode == 'train' and os.getenv('PADDLE_TRAINING_ROLE'):
- # distributed mode if the env var `PADDLE_TRAINING_ROLE` exists
- trainer_id = int(os.getenv("PADDLE_TRAINER_ID", "0"))
- trainer_count = int(os.getenv("PADDLE_TRAINERS", "1"))
- per_node_lines = len(full_lines) // trainer_count
- lines = full_lines[trainer_id * per_node_lines:(trainer_id + 1)
- * per_node_lines]
- print(
- "read images from %d, length: %d, lines length: %d, total: %d"
- % (trainer_id * per_node_lines, per_node_lines, len(lines),
- len(full_lines)))
- else:
- lines = full_lines
-
- for line in lines:
- if mode == 'train' or mode == 'val':
- img_path, label = line.split()
- img_path = img_path.replace("JPEG", "jpeg")
- img_path = os.path.join(data_dir, img_path)
- yield img_path, int(label)
- elif mode == 'test':
- img_path = os.path.join(data_dir, line)
- yield [img_path]
-
- mapper = functools.partial(
- process_image, mode=mode, color_jitter=color_jitter, rotate=rotate)
-
- return paddle.reader.xmap_readers(mapper, reader, THREAD, BUF_SIZE)
-
-
-def train(data_dir=DATA_DIR):
- file_list = os.path.join(data_dir, 'train_list.txt')
- return _reader_creator(
- file_list, 'train', shuffle=True, color_jitter=False, rotate=False, data_dir=data_dir)
-
-
-def val(data_dir=DATA_DIR):
- file_list = os.path.join(data_dir, 'val_list.txt')
- return _reader_creator(file_list, 'val', shuffle=False, data_dir=data_dir)
-
-
-def test(data_dir=DATA_DIR):
- file_list = os.path.join(data_dir, 'val_list.txt')
- return _reader_creator(file_list, 'test', shuffle=False, data_dir=data_dir)
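The distributed branch in `_reader_creator` shards the (already shuffled) file list evenly across trainers; any remainder lines are dropped. A minimal check of the slicing arithmetic (hypothetical values):

```
full_lines = ['img_%d' % i for i in range(10)]
trainer_count, trainer_id = 3, 1
per_node_lines = len(full_lines) // trainer_count      # 3
lines = full_lines[trainer_id * per_node_lines:
                   (trainer_id + 1) * per_node_lines]  # ['img_3', 'img_4', 'img_5']
```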
diff --git a/fluid/PaddleCV/image_classification/train.py b/fluid/PaddleCV/image_classification/train.py
deleted file mode 100644
index 238c322bea9abf1ce086a0228b491e82cb69ae45..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/train.py
+++ /dev/null
@@ -1,290 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import os
-import numpy as np
-import time
-import sys
-import functools
-import math
-import paddle
-import paddle.fluid as fluid
-import paddle.dataset.flowers as flowers
-import models
-import reader
-import argparse
-from models.learning_rate import cosine_decay
-from utility import add_arguments, print_arguments
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('batch_size', int, 256, "Minibatch size.")
-add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
-add_arg('total_images', int, 1281167, "Training image number.")
-add_arg('num_epochs', int, 120, "Number of epochs.")
-add_arg('class_dim', int, 1000, "Class number.")
-add_arg('image_shape', str, "3,224,224", "Input image size.")
-add_arg('model_save_dir', str, "output", "Model save directory.")
-add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
-add_arg('pretrained_model', str, None, "Path to the pretrained model, if any.")
-add_arg('checkpoint', str, None, "Path to a checkpoint to resume from, if any.")
-add_arg('lr', float, 0.1, "Initial learning rate.")
-add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
-add_arg('model', str, "SE_ResNeXt50_32x4d", "Set the network to use.")
-add_arg('enable_ce', bool, False, "If set True, enable continuous evaluation job.")
-add_arg('data_dir', str, "./data/ILSVRC2012", "The ImageNet dataset root dir.")
-# yapf: enable
-
-model_list = [m for m in dir(models) if "__" not in m]
-
-
-def optimizer_setting(params):
- ls = params["learning_strategy"]
-
- if ls["name"] == "piecewise_decay":
- if "total_images" not in params:
- total_images = 1281167
- else:
- total_images = params["total_images"]
-
- batch_size = ls["batch_size"]
- step = int(total_images / batch_size + 1)
-
- bd = [step * e for e in ls["epochs"]]
- base_lr = params["lr"]
- lr = [base_lr * (0.1**i) for i in range(len(bd) + 1)]
- optimizer = fluid.optimizer.Momentum(
- learning_rate=fluid.layers.piecewise_decay(
- boundaries=bd, values=lr),
- momentum=0.9,
- regularization=fluid.regularizer.L2Decay(1e-4))
- elif ls["name"] == "cosine_decay":
- if "total_images" not in params:
- total_images = 1281167
- else:
- total_images = params["total_images"]
-
- batch_size = ls["batch_size"]
- step = int(total_images / batch_size + 1)
-
- lr = params["lr"]
- num_epochs = params["num_epochs"]
-
- optimizer = fluid.optimizer.Momentum(
- learning_rate=cosine_decay(
- learning_rate=lr, step_each_epoch=step, epochs=num_epochs),
- momentum=0.9,
- regularization=fluid.regularizer.L2Decay(1e-4))
- else:
- lr = params["lr"]
- optimizer = fluid.optimizer.Momentum(
- learning_rate=lr,
- momentum=0.9,
- regularization=fluid.regularizer.L2Decay(1e-4))
-
- return optimizer
-
-
-def train(args):
- # parameters from arguments
- class_dim = args.class_dim
- model_name = args.model
- checkpoint = args.checkpoint
- pretrained_model = args.pretrained_model
- with_memory_optimization = args.with_mem_opt
- model_save_dir = args.model_save_dir
- image_shape = [int(m) for m in args.image_shape.split(",")]
-
- assert model_name in model_list, "{} is not in lists: {}".format(args.model,
- model_list)
-
- image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
- label = fluid.layers.data(name='label', shape=[1], dtype='int64')
-
- # model definition
- model = models.__dict__[model_name]()
-
- if args.enable_ce:
- assert model_name == "SE_ResNeXt50_32x4d"
- fluid.default_startup_program().random_seed = 1000
- model.params["dropout_seed"] = 100
- class_dim = 102
-
- if model_name == "GoogleNet":
- out0, out1, out2 = model.net(input=image, class_dim=class_dim)
- cost0 = fluid.layers.cross_entropy(input=out0, label=label)
- cost1 = fluid.layers.cross_entropy(input=out1, label=label)
- cost2 = fluid.layers.cross_entropy(input=out2, label=label)
- avg_cost0 = fluid.layers.mean(x=cost0)
- avg_cost1 = fluid.layers.mean(x=cost1)
- avg_cost2 = fluid.layers.mean(x=cost2)
-
- avg_cost = avg_cost0 + 0.3 * avg_cost1 + 0.3 * avg_cost2
- acc_top1 = fluid.layers.accuracy(input=out0, label=label, k=1)
- acc_top5 = fluid.layers.accuracy(input=out0, label=label, k=5)
- else:
- out = model.net(input=image, class_dim=class_dim)
- cost = fluid.layers.cross_entropy(input=out, label=label)
-
- avg_cost = fluid.layers.mean(x=cost)
- acc_top1 = fluid.layers.accuracy(input=out, label=label, k=1)
- acc_top5 = fluid.layers.accuracy(input=out, label=label, k=5)
-
- test_program = fluid.default_main_program().clone(for_test=True)
-
- # parameters from model and arguments
- params = model.params
- params["total_images"] = args.total_images
- params["lr"] = args.lr
- params["num_epochs"] = args.num_epochs
- params["learning_strategy"]["batch_size"] = args.batch_size
- params["learning_strategy"]["name"] = args.lr_strategy
-
- # initialize optimizer
- optimizer = optimizer_setting(params)
- opts = optimizer.minimize(avg_cost)
-
- if with_memory_optimization:
- fluid.memory_optimize(fluid.default_main_program())
-
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
- if checkpoint is not None:
- fluid.io.load_persistables(exe, checkpoint)
-
- if pretrained_model:
-
- def if_exist(var):
- return os.path.exists(os.path.join(pretrained_model, var.name))
-
- fluid.io.load_vars(exe, pretrained_model, predicate=if_exist)
-
- train_batch_size = args.batch_size
- test_batch_size = 16
-
- if not args.enable_ce:
- train_reader = paddle.batch(
- reader.train(data_dir=args.data_dir), batch_size=train_batch_size)
- test_reader = paddle.batch(
- reader.val(data_dir=args.data_dir), batch_size=test_batch_size)
- else:
- # Use the flowers dataset for continuous evaluation, with use_xmap=False
- # to keep the data order deterministic. This is slow; a faster dataset
- # would be needed for better speed.
- import random
- random.seed(0)
- np.random.seed(0)
- train_reader = paddle.batch(
- flowers.train(use_xmap=False), batch_size=train_batch_size)
- test_reader = paddle.batch(
- flowers.test(use_xmap=False), batch_size=test_batch_size)
-
- feeder = fluid.DataFeeder(place=place, feed_list=[image, label])
-
- train_exe = fluid.ParallelExecutor(
- use_cuda=args.use_gpu, loss_name=avg_cost.name)
-
- fetch_list = [avg_cost.name, acc_top1.name, acc_top5.name]
-
- gpu = os.getenv("CUDA_VISIBLE_DEVICES") or ""
- gpu_nums = len(gpu.split(","))
- for pass_id in range(params["num_epochs"]):
- train_info = [[], [], []]
- test_info = [[], [], []]
- train_time = []
- for batch_id, data in enumerate(train_reader()):
- t1 = time.time()
- loss, acc1, acc5 = train_exe.run(fetch_list, feed=feeder.feed(data))
- t2 = time.time()
- period = t2 - t1
- loss = np.mean(np.array(loss))
- acc1 = np.mean(np.array(acc1))
- acc5 = np.mean(np.array(acc5))
- train_info[0].append(loss)
- train_info[1].append(acc1)
- train_info[2].append(acc5)
- train_time.append(period)
- if batch_id % 10 == 0:
- print("Pass {0}, trainbatch {1}, loss {2}, \
- acc1 {3}, acc5 {4} time {5}"
- .format(pass_id, \
- batch_id, loss, acc1, acc5, \
- "%2.2f sec" % period))
- sys.stdout.flush()
-
- train_loss = np.array(train_info[0]).mean()
- train_acc1 = np.array(train_info[1]).mean()
- train_acc5 = np.array(train_info[2]).mean()
- train_speed = np.array(train_time).mean() / train_batch_size
- cnt = 0
- for test_batch_id, data in enumerate(test_reader()):
- t1 = time.time()
- loss, acc1, acc5 = exe.run(test_program,
- fetch_list=fetch_list,
- feed=feeder.feed(data))
- t2 = time.time()
- period = t2 - t1
- loss = np.mean(loss)
- acc1 = np.mean(acc1)
- acc5 = np.mean(acc5)
- test_info[0].append(loss * len(data))
- test_info[1].append(acc1 * len(data))
- test_info[2].append(acc5 * len(data))
- cnt += len(data)
- if test_batch_id % 10 == 0:
- print("Pass {0},testbatch {1},loss {2}, \
- acc1 {3},acc5 {4},time {5}"
- .format(pass_id, \
- test_batch_id, loss, acc1, acc5, \
- "%2.2f sec" % period))
- sys.stdout.flush()
-
- test_loss = np.sum(test_info[0]) / cnt
- test_acc1 = np.sum(test_info[1]) / cnt
- test_acc5 = np.sum(test_info[2]) / cnt
-
- print("End pass {0}, train_loss {1}, train_acc1 {2}, train_acc5 {3}, "
- "test_loss {4}, test_acc1 {5}, test_acc5 {6}".format(pass_id, \
- train_loss, train_acc1, train_acc5, test_loss, test_acc1, \
- test_acc5))
- sys.stdout.flush()
-
- model_path = os.path.join(model_save_dir, model_name,
- str(pass_id))
- if not os.path.isdir(model_path):
- os.makedirs(model_path)
- fluid.io.save_persistables(exe, model_path)
-
- # This is for continuous evaluation only
- if args.enable_ce and pass_id == args.num_epochs - 1:
- if gpu_nums == 1:
- # Use the mean cost/acc for training
- print("kpis train_cost %s" % train_loss)
- print("kpis train_acc_top1 %s" % train_acc1)
- print("kpis train_acc_top5 %s" % train_acc5)
- # Use the mean cost/acc for testing
- print("kpis test_cost %s" % test_loss)
- print("kpis test_acc_top1 %s" % test_acc1)
- print("kpis test_acc_top5 %s" % test_acc5)
- print("kpis train_speed %s" % train_speed)
- else:
- # Use the mean cost/acc for training
- print("kpis train_cost_card%s %s" % (gpu_nums, train_loss))
- print("kpis train_acc_top1_card%s %s" % (gpu_nums, train_acc1))
- print("kpis train_acc_top5_card%s %s" % (gpu_nums, train_acc5))
- # Use the mean cost/acc for testing
- print("kpis test_cost_card%s %s" % (gpu_nums, test_loss))
- print("kpis test_acc_top1_card%s %s" % (gpu_nums, test_acc1))
- print("kpis test_acc_top5_card%s %s" % (gpu_nums, test_acc5))
- print("kpis train_speed_card%s %s" % (gpu_nums, train_speed))
-
-
-def main():
- args = parser.parse_args()
- print_arguments(args)
- train(args)
-
-
-if __name__ == '__main__':
- main()
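For the default piecewise strategy, `optimizer_setting` converts epoch milestones into global-step boundaries. With the defaults above (1,281,167 images, batch size 256, base lr 0.1), the arithmetic works out to:

```
total_images, batch_size, base_lr = 1281167, 256, 0.1
step = int(total_images / batch_size + 1)      # 5005 steps per epoch
bd = [step * e for e in [30, 60, 90]]          # [150150, 300300, 450450]
lr = [base_lr * (0.1 ** i) for i in range(4)]  # [0.1, 0.01, 0.001, 0.0001]
```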
diff --git a/fluid/PaddleCV/metric_learning/README.md b/fluid/PaddleCV/metric_learning/README.md
index c961bf4842727dab8e808ef016d10efa219bf2f6..6afd28a457c639af25337cc02a6b5b64658845ff 100644
--- a/fluid/PaddleCV/metric_learning/README.md
+++ b/fluid/PaddleCV/metric_learning/README.md
@@ -1,105 +1,6 @@
-# Deep Metric Learning
-Metric learning is a family of methods that learn discriminative features for each sample, so that intra-class samples have smaller distances while inter-class samples have larger distances in the learned space. With the development of deep learning, metric learning methods have been combined with deep neural networks to boost the performance of traditional tasks such as face recognition/verification, person re-identification and image retrieval. In this page, we introduce the way to implement deep metric learning using PaddlePaddle Fluid, including [data preparation](#data-preparation), [training](#training-metric-learning-models), [finetuning](#finetuning), [evaluation](#evaluation) and [inference](#inference).
----
-## Table of Contents
-- [Installation](#installation)
-- [Data preparation](#data-preparation)
-- [Training metric learning models](#training-metric-learning-models)
-- [Finetuning](#finetuning)
-- [Evaluation](#evaluation)
-- [Inference](#inference)
-- [Performances](#performances)
+Hi!
-## Installation
+This directory has been deprecated.
-Running the sample code in this directory requires PaddlePaddle Fluid v0.14.0 or later. If the PaddlePaddle on your device is lower than this version, please follow the instructions in [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) and make an update.
-
-## Data preparation
-
-The Stanford Online Products (SOP) dataset contains 120,053 images of 22,634 products downloaded from eBay.com. We use it to conduct the metric learning experiments. For training, 59,551 images of 11,318 classes are used, and 60,502 images of the remaining 11,316 classes are held out for testing. First of all, the SOP data can be prepared as follows:
-```
-cd data/
-sh download_sop.sh
-```
-
-## Training metric learning models
-
-To train a metric learning model, one needs to choose a neural network as the backbone and a metric loss function to optimize. We first train the model with softmax or [arcmargin](https://arxiv.org/abs/1801.07698) loss, and then finetune it with another metric learning loss, such as triplet, [quadruplet](https://arxiv.org/abs/1710.00478) or [eml](https://arxiv.org/abs/1212.6094) loss. One example of training with arcmargin loss is shown below:
-
-
-```
-python train_elem.py \
- --model=ResNet50 \
- --train_batch_size=256 \
- --test_batch_size=50 \
- --lr=0.01 \
- --total_iter_num=30000 \
- --use_gpu=True \
- --pretrained_model=${path_to_pretrain_imagenet_model} \
- --model_save_dir=${output_model_path} \
- --loss_name=arcmargin \
- --arc_scale=80.0 \
- --arc_margin=0.15 \
- --arc_easy_margin=False
-```
-**parameter introduction:**
-* **model**: name of the model to use. Default: "ResNet50".
-* **train_batch_size**: the size of each training mini-batch. Default: 256.
-* **test_batch_size**: the size of each testing mini-batch. Default: 50.
-* **lr**: initial learning rate. Default: 0.01.
-* **total_iter_num**: total number of training iterations. Default: 30000.
-* **use_gpu**: whether to use GPU or not. Default: True.
-* **pretrained_model**: model path for pretraining. Default: None.
-* **model_save_dir**: the directory to save trained model. Default: "output".
-* **loss_name**: loss for training the model. Default: "softmax".
-* **arc_scale**: parameter of arcmargin loss. Default: 80.0.
-* **arc_margin**: parameter of arcmargin loss. Default: 0.15.
-* **arc_easy_margin**: parameter of arcmargin loss. Default: False.
-
-## Finetuning
-
-Finetuning loads pretrained weights and adapts them to a specific task. After training the model using softmax or arcmargin loss, one can fine-tune it using triplet, quadruplet or eml loss. One example of fine-tuning using eml loss is shown below:
-
-```
-python train_pair.py \
- --model=ResNet50 \
- --train_batch_size=160 \
- --test_batch_size=50 \
- --lr=0.0001 \
- --total_iter_num=100000 \
- --use_gpu=True \
- --pretrained_model=${path_to_pretrain_arcmargin_model} \
- --model_save_dir=${output_model_path} \
- --loss_name=eml \
- --samples_each_class=2
-```
-
-## Evaluation
-Evaluation measures the performance of a trained model. One can download the [pretrained models](#performances) and set their path to ```path_to_pretrain_model```. Then Recall@Rank-1 can be obtained by running the following command:
-```
-python eval.py \
- --model=ResNet50 \
- --batch_size=50 \
-    --pretrained_model=${path_to_pretrain_model}
-```
-
-## Inference
-Inference is used to get prediction scores or image features from a trained model.
-```
-python infer.py \
- --model=ResNet50 \
- --batch_size=1 \
- --pretrained_model=${path_to_pretrain_model}
-```
-
-## Performances
-
-For comparison, several metric learning models with different backbone networks and loss functions are trained using the corresponding empirical parameters. Recall@Rank-1 is used as the evaluation metric and the performance is listed in the table; a reference sketch of the metric follows the table. Pretrained models can be downloaded by clicking the related model names.
-
-|pretrained model | softmax | arcmargin
-|:- | -: | -:
-|without fine-tuning | 77.42% | 78.11%
-|fine-tuned with triplet | 78.37% | 79.21%
-|fine-tuned with quadruplet | 78.10% | 79.59%
-|fine-tuned with eml | 79.32% | 80.11%
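-
-Recall@Rank-1 counts a query as correct when its nearest neighbor in the feature space shares the query's label. Below is a minimal illustrative sketch of the computation (not the repository's `recall_topk` utility; the function name is ours):
-
-```
-import numpy as np
-
-def recall_at_1(feas, labels):
-    # L2-normalize features so dot products become cosine similarities
-    feas = feas / np.linalg.norm(feas, axis=1, keepdims=True)
-    sims = feas.dot(feas.T)
-    np.fill_diagonal(sims, -np.inf)   # exclude self-matches
-    nearest = sims.argmax(axis=1)     # each query's nearest neighbor
-    return float(np.mean(labels[nearest] == labels))
-```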
+Please visit the project at [PaddleCV/metric_learning](../../../PaddleCV/metric_learning).
diff --git a/fluid/PaddleCV/metric_learning/README_cn.md b/fluid/PaddleCV/metric_learning/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..72417ed9badfc4858f314f143dd069d4ff6a0e6a
--- /dev/null
+++ b/fluid/PaddleCV/metric_learning/README_cn.md
@@ -0,0 +1,2 @@
+
+Hi, this project has been moved. Please visit it at [PaddleCV/metric_learning](../../../PaddleCV/metric_learning).
diff --git a/fluid/PaddleCV/metric_learning/infer.py b/fluid/PaddleCV/metric_learning/infer.py
deleted file mode 100644
index a189ccc3d0be540f3adf79280d1bc9ed7af0b5e9..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/metric_learning/infer.py
+++ /dev/null
@@ -1,84 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os
-import sys
-import math
-import time
-import argparse
-import functools
-import numpy as np
-import paddle
-import paddle.fluid as fluid
-import models
-import reader
-from utility import add_arguments, print_arguments
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('model', str, "ResNet50", "Set the network to use.")
-add_arg('embedding_size', int, 0, "Embedding size.")
-add_arg('batch_size', int, 1, "Minibatch size.")
-add_arg('image_shape', str, "3,224,224", "Input image size.")
-add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
-add_arg('with_mem_opt', bool, False, "Whether to use memory optimization or not.")
-add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
-# yapf: enable
-
-model_list = [m for m in dir(models) if "__" not in m]
-
-
-def infer(args):
- # parameters from arguments
- model_name = args.model
- pretrained_model = args.pretrained_model
- with_memory_optimization = args.with_mem_opt
- image_shape = [int(m) for m in args.image_shape.split(",")]
-
- assert model_name in model_list, "{} is not in lists: {}".format(args.model,
- model_list)
-
- image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
-
- # model definition
- model = models.__dict__[model_name]()
- out = model.net(input=image, embedding_size=args.embedding_size)
-
- test_program = fluid.default_main_program().clone(for_test=True)
-
- if with_memory_optimization:
- fluid.memory_optimize(fluid.default_main_program())
-
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
- if pretrained_model:
-
- def if_exist(var):
- return os.path.exists(os.path.join(pretrained_model, var.name))
-
- fluid.io.load_vars(exe, pretrained_model, predicate=if_exist)
-
- infer_reader = paddle.batch(reader.infer(args), batch_size=args.batch_size, drop_last=False)
- feeder = fluid.DataFeeder(place=place, feed_list=[image])
-
- fetch_list = [out.name]
-
- for batch_id, data in enumerate(infer_reader()):
- result = exe.run(test_program, fetch_list=fetch_list, feed=feeder.feed(data))
- result = result[0][0].reshape(-1)
- print("Test-{0}-feature: {1}".format(batch_id, result))
- sys.stdout.flush()
-
-
-def main():
- args = parser.parse_args()
- print_arguments(args)
- infer(args)
-
-
-if __name__ == '__main__':
- main()
diff --git a/fluid/PaddleCV/metric_learning/losses/__init__.py b/fluid/PaddleCV/metric_learning/losses/__init__.py
deleted file mode 100644
index 5cc822141e7b16ca548b0cd466d44616a6990e1b..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/metric_learning/losses/__init__.py
+++ /dev/null
@@ -1,8 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-from .softmaxloss import SoftmaxLoss
-from .arcmarginloss import ArcMarginLoss
-from .tripletloss import TripletLoss
-from .quadrupletloss import QuadrupletLoss
-from .emlloss import EmlLoss
diff --git a/fluid/PaddleCV/metric_learning/losses/emlloss.py b/fluid/PaddleCV/metric_learning/losses/emlloss.py
deleted file mode 100644
index 459ca4498166f983dd0f8a605cc67003fd24cffb..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/metric_learning/losses/emlloss.py
+++ /dev/null
@@ -1,66 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import math
-import paddle.fluid as fluid
-from utility import get_gpu_num
-from .commonfunc import calculate_order_dist_matrix
-
-class EmlLoss():
- def __init__(self, train_batch_size = 40, samples_each_class=2):
- self.samples_each_class = samples_each_class
- self.train_batch_size = train_batch_size
- num_gpus = get_gpu_num()
- assert(train_batch_size % num_gpus == 0)
- self.cal_loss_batch_size = train_batch_size // num_gpus
- assert(self.cal_loss_batch_size % samples_each_class == 0)
-
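-    # A hedged summary of the math below: the surrogate is
-    #   f(bias) = log(1 + beta * theta * exp(bias)) / log(1 + beta),
-    # which saturates large positive-negative distance gaps. For bias above
-    # `thresh` the exp() would overflow, so the stable variant switches to the
-    # asymptotic form (log(theta) + bias + log(beta)) / log(1 + beta).
-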
- def surrogate_function(self, beta, theta, bias):
- x = theta * fluid.layers.exp(bias)
- output = fluid.layers.log(1+beta*x)/math.log(1+beta)
- return output
-
- def surrogate_function_approximate(self, beta, theta, bias):
- output = (fluid.layers.log(theta) + bias + math.log(beta))/math.log(1+beta)
- return output
-
- def surrogate_function_stable(self, beta, theta, target, thresh):
- max_gap = fluid.layers.fill_constant([1], dtype='float32', value=thresh)
- max_gap.stop_gradient = True
-
- target_max = fluid.layers.elementwise_max(target, max_gap)
- target_min = fluid.layers.elementwise_min(target, max_gap)
-
- loss1 = self.surrogate_function(beta, theta, target_min)
- loss2 = self.surrogate_function_approximate(beta, theta, target_max)
- bias = self.surrogate_function(beta, theta, max_gap)
- loss = loss1 + loss2 - bias
- return loss
-
- def loss(self, input):
- samples_each_class = self.samples_each_class
- batch_size = self.cal_loss_batch_size
- #input = fluid.layers.l2_normalize(input, axis=1)
- #input_norm = fluid.layers.sqrt(fluid.layers.reduce_sum(fluid.layers.square(input), dim=1))
- #input = fluid.layers.elementwise_div(input, input_norm, axis=0)
- d = calculate_order_dist_matrix(input, self.cal_loss_batch_size, self.samples_each_class)
- ignore, pos, neg = fluid.layers.split(d, num_or_sections= [1,
- samples_each_class-1, batch_size-samples_each_class], dim=1)
- ignore.stop_gradient = True
-
- pos_max = fluid.layers.reduce_max(pos, dim=1)
- pos_max = fluid.layers.reshape(pos_max, shape=[-1, 1])
- pos = fluid.layers.exp(pos - pos_max)
- pos_mean = fluid.layers.reduce_mean(pos, dim=1)
-
- neg_min = fluid.layers.reduce_min(neg, dim=1)
- neg_min = fluid.layers.reshape(neg_min, shape=[-1, 1])
- neg = fluid.layers.exp(-1*(neg-neg_min))
- neg_mean = fluid.layers.reduce_mean(neg, dim=1)
- bias = pos_max - neg_min
- theta = fluid.layers.reshape(neg_mean * pos_mean, shape=[-1,1])
- thresh = 20.0
- beta = 100000
- loss = self.surrogate_function_stable(beta, theta, bias, thresh)
- return loss
diff --git a/fluid/PaddleCV/metric_learning/losses/quadrupletloss.py b/fluid/PaddleCV/metric_learning/losses/quadrupletloss.py
deleted file mode 100644
index a14fffcd7eebd90966abe9adeffbfaa79c14c684..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/metric_learning/losses/quadrupletloss.py
+++ /dev/null
@@ -1,40 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import paddle.fluid as fluid
-from utility import get_gpu_num
-from .commonfunc import calculate_order_dist_matrix
-
-class QuadrupletLoss():
- def __init__(self,
- train_batch_size = 80,
- samples_each_class = 2,
- margin = 0.1):
- self.margin = margin
- self.samples_each_class = samples_each_class
- self.train_batch_size = train_batch_size
- num_gpus = get_gpu_num()
- assert(train_batch_size % num_gpus == 0)
- self.cal_loss_batch_size = train_batch_size // num_gpus
- assert(self.cal_loss_batch_size % samples_each_class == 0)
-
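-    # Hard-mining over the batch: take the largest intra-class distance (pos_max)
-    # and the smallest inter-class distance (neg_min), then apply a single hinge
-    # max(0, pos_max - neg_min + margin).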
- def loss(self, input):
- #input = fluid.layers.l2_normalize(input, axis=1)
- input_norm = fluid.layers.sqrt(fluid.layers.reduce_sum(fluid.layers.square(input), dim=1))
- input = fluid.layers.elementwise_div(input, input_norm, axis=0)
-
- samples_each_class = self.samples_each_class
- batch_size = self.cal_loss_batch_size
- margin = self.margin
- d = calculate_order_dist_matrix(input, self.cal_loss_batch_size, self.samples_each_class)
- ignore, pos, neg = fluid.layers.split(d, num_or_sections= [1,
- samples_each_class-1, batch_size-samples_each_class], dim=1)
- ignore.stop_gradient = True
- pos_max = fluid.layers.reduce_max(pos)
- neg_min = fluid.layers.reduce_min(neg)
- #pos_max = fluid.layers.sqrt(pos_max + 1e-6)
- #neg_min = fluid.layers.sqrt(neg_min + 1e-6)
- loss = fluid.layers.relu(pos_max - neg_min + margin)
- return loss
-
diff --git a/fluid/PaddleCV/metric_learning/losses/tripletloss.py b/fluid/PaddleCV/metric_learning/losses/tripletloss.py
deleted file mode 100644
index 7ef3bdb4dc9346373c66b964e54c5b0a33394901..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/metric_learning/losses/tripletloss.py
+++ /dev/null
@@ -1,32 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import paddle.fluid as fluid
-
-class TripletLoss():
- def __init__(self, margin=0.1):
- self.margin = margin
-
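-    # Standard triplet hinge on L2-normalized features: consecutive groups of
-    # three rows of `input` are (anchor, positive, negative); the loss is
-    # max(0, ||a - p||^2 + margin - ||a - n||^2).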
- def loss(self, input):
- margin = self.margin
- fea_dim = input.shape[1] # number of channels
- #input = fluid.layers.l2_normalize(input, axis=1)
- input_norm = fluid.layers.sqrt(fluid.layers.reduce_sum(fluid.layers.square(input), dim=1))
- input = fluid.layers.elementwise_div(input, input_norm, axis=0)
- output = fluid.layers.reshape(input, shape = [-1, 3, fea_dim])
-
- anchor, positive, negative = fluid.layers.split(output, num_or_sections = 3, dim = 1)
-
- anchor = fluid.layers.reshape(anchor, shape = [-1, fea_dim])
- positive = fluid.layers.reshape(positive, shape = [-1, fea_dim])
- negative = fluid.layers.reshape(negative, shape = [-1, fea_dim])
-
- a_p = fluid.layers.square(anchor - positive)
- a_n = fluid.layers.square(anchor - negative)
- a_p = fluid.layers.reduce_sum(a_p, dim = 1)
- a_n = fluid.layers.reduce_sum(a_n, dim = 1)
- #a_p = fluid.layers.sqrt(a_p + 1e-6)
- #a_n = fluid.layers.sqrt(a_n + 1e-6)
- loss = fluid.layers.relu(a_p + margin - a_n)
- return loss
diff --git a/fluid/PaddleCV/metric_learning/reader.py b/fluid/PaddleCV/metric_learning/reader.py
deleted file mode 100644
index 9c5aaf396d7b535bc4729a31834e2ef0f8151a28..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/metric_learning/reader.py
+++ /dev/null
@@ -1,175 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os
-import math
-import random
-import functools
-import numpy as np
-import paddle
-from imgtool import process_image
-
-random.seed(0)
-
-DATA_DIR = "./data/Stanford_Online_Products/"
-TRAIN_LIST = './data/Stanford_Online_Products/Ebay_train.txt'
-VAL_LIST = './data/Stanford_Online_Products/Ebay_test.txt'
-
-
-def init_sop(mode):
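-    # Each line of Ebay_{train,test}.txt is: image_id class_id super_class_id path
-    # (the header row is skipped below; training labels are shifted to be 0-based).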
- if mode == 'train':
- train_data = {}
- train_image_list = []
- train_list = open(TRAIN_LIST, "r").readlines()
- for i, item in enumerate(train_list):
- items = item.strip().split()
- if items[0] == 'image_id':
- continue
- path = items[3]
- label = int(items[1]) - 1
- train_image_list.append((path, label))
- if label not in train_data:
- train_data[label] = []
- train_data[label].append(path)
- random.shuffle(train_image_list)
- print("{} dataset size: {}".format(mode, len(train_data)))
- return train_data, train_image_list
- else:
- val_data = {}
- val_image_list = []
- test_image_list = []
- val_list = open(VAL_LIST, "r").readlines()
- for i, item in enumerate(val_list):
- items = item.strip().split()
- if items[0] == 'image_id':
- continue
- path = items[3]
- label = int(items[1])
- val_image_list.append((path, label))
- test_image_list.append(path)
- if label not in val_data:
- val_data[label] = []
- val_data[label].append(path)
- print("{} dataset size: {}".format(mode, len(val_data)))
- if mode == 'val':
- return val_data, val_image_list
- else:
- return test_image_list
-
-def common_iterator(data, settings):
- batch_size = settings.train_batch_size
- samples_each_class = settings.samples_each_class
- assert (batch_size % samples_each_class == 0)
- class_num = batch_size // samples_each_class
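-    # Each cycle samples `class_num` random classes and yields `samples_each_class`
-    # images per class, producing the grouped batches expected by eml/quadruplet loss.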
- def train_iterator():
- labs = list(data.keys())
- lab_num = len(labs)
- ind = list(range(0, lab_num))
- while True:
- random.shuffle(ind)
- ind_sample = ind[:class_num]
- for ind_i in ind_sample:
- lab = labs[ind_i]
- data_list = data[lab]
- data_ind = list(range(0, len(data_list)))
- random.shuffle(data_ind)
- anchor_ind = data_ind[:samples_each_class]
-
- for anchor_ind_i in anchor_ind:
- anchor_path = DATA_DIR + data_list[anchor_ind_i]
- yield anchor_path, lab
-
- return train_iterator
-
-def triplet_iterator(data, settings):
- batch_size = settings.train_batch_size
- assert (batch_size % 3 == 0)
- def train_iterator():
- labs = list(data.keys())
- lab_num = len(labs)
- ind = list(range(0, lab_num))
- while True:
- random.shuffle(ind)
- ind_pos, ind_neg = ind[:2]
- lab_pos = labs[ind_pos]
- pos_data_list = data[lab_pos]
- data_ind = list(range(0, len(pos_data_list)))
- random.shuffle(data_ind)
- anchor_ind, pos_ind = data_ind[:2]
-
- lab_neg = labs[ind_neg]
- neg_data_list = data[lab_neg]
- neg_ind = random.randint(0, len(neg_data_list) - 1)
-
- anchor_path = DATA_DIR + pos_data_list[anchor_ind]
- yield anchor_path, lab_pos
- pos_path = DATA_DIR + pos_data_list[pos_ind]
- yield pos_path, lab_pos
- neg_path = DATA_DIR + neg_data_list[neg_ind]
- yield neg_path, lab_neg
-
- return train_iterator
-
-def arcmargin_iterator(data, settings):
- def train_iterator():
- while True:
- for items in data:
- path, label = items
- path = DATA_DIR + path
- yield path, label
- return train_iterator
-
-def image_iterator(data, mode):
- def val_iterator():
- for items in data:
- path, label = items
- path = DATA_DIR + path
- yield path, label
- def test_iterator():
- for item in data:
- path = item
- path = DATA_DIR + path
- yield [path]
- if mode == 'val':
- return val_iterator
- else:
- return test_iterator
-
-def createreader(settings, mode):
- def metric_reader():
- if mode == 'train':
- train_data, train_image_list = init_sop('train')
- loss_name = settings.loss_name
- if loss_name in ["softmax", "arcmargin"]:
- return arcmargin_iterator(train_image_list, settings)()
- elif loss_name == 'triplet':
- return triplet_iterator(train_data, settings)()
- else:
- return common_iterator(train_data, settings)()
- elif mode == 'val':
- val_data, val_image_list = init_sop('val')
- return image_iterator(val_image_list, 'val')()
- else:
- test_image_list = init_sop('test')
- return image_iterator(test_image_list, 'test')()
-
- image_shape = settings.image_shape.split(',')
- assert(image_shape[1] == image_shape[2])
- image_size = int(image_shape[2])
-    keep_order = mode == 'train' and settings.loss_name not in ['softmax', 'arcmargin']
- image_mapper = functools.partial(process_image,
- mode=mode, color_jitter=False, rotate=False, crop_size=image_size)
- reader = paddle.reader.xmap_readers(
- image_mapper, metric_reader, 8, 1000, order=keep_order)
- return reader
-
-
-def train(settings):
- return createreader(settings, "train")
-
-def test(settings):
- return createreader(settings, "val")
-
-def infer(settings):
- return createreader(settings, "test")
diff --git a/fluid/PaddleCV/metric_learning/train_elem.py b/fluid/PaddleCV/metric_learning/train_elem.py
deleted file mode 100644
index 126a1ff1201301fa7cd3f8be924554c5ded22bdc..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/metric_learning/train_elem.py
+++ /dev/null
@@ -1,290 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os
-import sys
-import math
-import time
-import logging
-import argparse
-import functools
-import threading
-import subprocess
-import numpy as np
-import paddle
-import paddle.fluid as fluid
-import models
-import reader
-from losses import SoftmaxLoss
-from losses import ArcMarginLoss
-from utility import add_arguments, print_arguments
-from utility import fmt_time, recall_topk, get_gpu_num
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('model', str, "ResNet50", "Set the network to use.")
-add_arg('embedding_size', int, 0, "Embedding size.")
-add_arg('train_batch_size', int, 256, "Minibatch size.")
-add_arg('test_batch_size', int, 50, "Minibatch size.")
-add_arg('image_shape', str, "3,224,224", "input image size")
-add_arg('class_dim', int, 11318 , "Class number.")
-add_arg('lr', float, 0.01, "set learning rate.")
-add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
-add_arg('lr_steps', str, "30000", "step of lr")
-add_arg('total_iter_num', int, 30000, "total_iter_num")
-add_arg('display_iter_step', int, 10, "display_iter_step.")
-add_arg('test_iter_step', int, 1000, "test_iter_step.")
-add_arg('save_iter_step', int, 1000, "save_iter_step.")
-add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
-add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
-add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
-add_arg('checkpoint', str, None, "Whether to resume checkpoint.")
-add_arg('model_save_dir', str, "output", "model save directory")
-add_arg('loss_name', str, "softmax", "Set the loss type to use.")
-add_arg('arc_scale', float, 80.0, "arc scale.")
-add_arg('arc_margin', float, 0.15, "arc margin.")
-add_arg('arc_easy_margin', bool, False, "arc easy margin.")
-add_arg('enable_ce', bool, False, "If set True, enable continuous evaluation job.")
-# yapf: enable
-
-model_list = [m for m in dir(models) if "__" not in m]
-
-def optimizer_setting(params):
- ls = params["learning_strategy"]
- assert ls["name"] == "piecewise_decay", \
- "learning rate strategy must be {}, \
-        but got {}".format("piecewise_decay", ls["name"])
-
- bd = [int(e) for e in ls["lr_steps"].split(',')]
- base_lr = params["lr"]
- lr = [base_lr * (0.1 ** i) for i in range(len(bd) + 1)]
- optimizer = fluid.optimizer.Momentum(
- learning_rate=fluid.layers.piecewise_decay(
- boundaries=bd, values=lr),
- momentum=0.9,
- regularization=fluid.regularizer.L2Decay(1e-4))
- return optimizer
-
-
-def net_config(image, label, model, args, is_train):
- assert args.model in model_list, "{} is not in lists: {}".format(
- args.model, model_list)
-
- out = model.net(input=image, embedding_size=args.embedding_size)
- if not is_train:
- return None, None, None, out
-
- if args.loss_name == "softmax":
- metricloss = SoftmaxLoss(
- class_dim=args.class_dim,
- )
- elif args.loss_name == "arcmargin":
- metricloss = ArcMarginLoss(
- class_dim = args.class_dim,
- margin = args.arc_margin,
- scale = args.arc_scale,
- easy_margin = args.arc_easy_margin,
- )
- cost, logit = metricloss.loss(out, label)
- avg_cost = fluid.layers.mean(x=cost)
- acc_top1 = fluid.layers.accuracy(input=logit, label=label, k=1)
- acc_top5 = fluid.layers.accuracy(input=logit, label=label, k=5)
- return avg_cost, acc_top1, acc_top5, out
-
-def build_program(is_train, main_prog, startup_prog, args):
- image_shape = [int(m) for m in args.image_shape.split(",")]
- model = models.__dict__[args.model]()
- with fluid.program_guard(main_prog, startup_prog):
- if is_train:
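-            # Training feeds data asynchronously through a py_reader queue;
-            # the eval branch below uses plain feed variables driven by DataFeeder.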
- queue_capacity = 64
- py_reader = fluid.layers.py_reader(
- capacity=queue_capacity,
- shapes=[[-1] + image_shape, [-1, 1]],
- lod_levels=[0, 0],
- dtypes=["float32", "int64"],
- use_double_buffer=True)
- image, label = fluid.layers.read_file(py_reader)
- else:
- image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
- label = fluid.layers.data(name='label', shape=[1], dtype='int64')
-
- with fluid.unique_name.guard():
- avg_cost, acc_top1, acc_top5, out = net_config(image, label, model, args, is_train)
- if is_train:
- params = model.params
- params["lr"] = args.lr
- params["learning_strategy"]["lr_steps"] = args.lr_steps
- params["learning_strategy"]["name"] = args.lr_strategy
- optimizer = optimizer_setting(params)
- optimizer.minimize(avg_cost)
- global_lr = optimizer._global_learning_rate()
- """
- if not is_train:
- main_prog = main_prog.clone(for_test=True)
- """
- if is_train:
- return py_reader, avg_cost, acc_top1, acc_top5, global_lr
- else:
- return out, image, label
-
-
-def train_async(args):
- # parameters from arguments
-
- logging.debug('enter train')
- model_name = args.model
- checkpoint = args.checkpoint
- pretrained_model = args.pretrained_model
- model_save_dir = args.model_save_dir
-
- startup_prog = fluid.Program()
- train_prog = fluid.Program()
- tmp_prog = fluid.Program()
-
- if args.enable_ce:
- assert args.model == "ResNet50"
- assert args.loss_name == "arcmargin"
- np.random.seed(0)
- startup_prog.random_seed = 1000
- train_prog.random_seed = 1000
- tmp_prog.random_seed = 1000
-
- train_py_reader, train_cost, train_acc1, train_acc5, global_lr = build_program(
- is_train=True,
- main_prog=train_prog,
- startup_prog=startup_prog,
- args=args)
- test_feas, image, label = build_program(
- is_train=False,
- main_prog=tmp_prog,
- startup_prog=startup_prog,
- args=args)
- test_prog = tmp_prog.clone(for_test=True)
-
- train_fetch_list = [global_lr.name, train_cost.name, train_acc1.name, train_acc5.name]
- test_fetch_list = [test_feas.name]
-
- if args.with_mem_opt:
- fluid.memory_optimize(train_prog, skip_opt_set=set(train_fetch_list))
-
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
-
- exe.run(startup_prog)
-
- logging.debug('after run startup program')
-
- if checkpoint is not None:
- fluid.io.load_persistables(exe, checkpoint, main_program=train_prog)
-
- if pretrained_model:
-
- def if_exist(var):
- return os.path.exists(os.path.join(pretrained_model, var.name))
-
- fluid.io.load_vars(
- exe, pretrained_model, main_program=train_prog, predicate=if_exist)
-
- devicenum = get_gpu_num()
- assert (args.train_batch_size % devicenum) == 0
- train_batch_size = args.train_batch_size // devicenum
- test_batch_size = args.test_batch_size
-
- train_reader = paddle.batch(reader.train(args), batch_size=train_batch_size, drop_last=True)
- test_reader = paddle.batch(reader.test(args), batch_size=test_batch_size, drop_last=False)
- test_feeder = fluid.DataFeeder(place=place, feed_list=[image, label])
- train_py_reader.decorate_paddle_reader(train_reader)
-
- train_exe = fluid.ParallelExecutor(
- main_program=train_prog,
- use_cuda=args.use_gpu,
- loss_name=train_cost.name)
-
- totalruntime = 0
- train_py_reader.start()
- iter_no = 0
- train_info = [0, 0, 0, 0]
- while iter_no <= args.total_iter_num:
- t1 = time.time()
- lr, loss, acc1, acc5 = train_exe.run(fetch_list=train_fetch_list)
- t2 = time.time()
- period = t2 - t1
- lr = np.mean(np.array(lr))
- train_info[0] += np.mean(np.array(loss))
- train_info[1] += np.mean(np.array(acc1))
- train_info[2] += np.mean(np.array(acc5))
- train_info[3] += 1
- if iter_no % args.display_iter_step == 0:
- avgruntime = totalruntime / args.display_iter_step
- avg_loss = train_info[0] / train_info[3]
- avg_acc1 = train_info[1] / train_info[3]
- avg_acc5 = train_info[2] / train_info[3]
- print("[%s] trainbatch %d, lr %.6f, loss %.6f, "\
- "acc1 %.4f, acc5 %.4f, time %2.2f sec" % \
- (fmt_time(), iter_no, lr, avg_loss, avg_acc1, avg_acc5, avgruntime))
- sys.stdout.flush()
- totalruntime = 0
- if iter_no % 1000 == 0:
- train_info = [0, 0, 0, 0]
-
- totalruntime += period
-
- if iter_no % args.test_iter_step == 0 and iter_no != 0:
- f, l = [], []
- for batch_id, data in enumerate(test_reader()):
- t1 = time.time()
- [feas] = exe.run(test_prog, fetch_list = test_fetch_list, feed=test_feeder.feed(data))
- label = np.asarray([x[1] for x in data])
- f.append(feas)
- l.append(label)
-
- t2 = time.time()
- period = t2 - t1
- if batch_id % 20 == 0:
- print("[%s] testbatch %d, time %2.2f sec" % \
- (fmt_time(), batch_id, period))
-
- f = np.vstack(f)
- l = np.hstack(l)
- recall = recall_topk(f, l, k=1)
- print("[%s] test_img_num %d, trainbatch %d, test_recall %.5f" % \
- (fmt_time(), len(f), iter_no, recall))
- sys.stdout.flush()
-
- if iter_no % args.save_iter_step == 0 and iter_no != 0:
- model_path = os.path.join(model_save_dir + '/' + model_name,
- str(iter_no))
- if not os.path.isdir(model_path):
- os.makedirs(model_path)
- fluid.io.save_persistables(exe, model_path, main_program=train_prog)
-
- iter_no += 1
-
- # This is for continuous evaluation only
- if args.enable_ce:
- # Use the mean cost/acc for training
- print("kpis train_cost %s" % (avg_loss))
- print("kpis test_recall %s" % (recall))
-
-
-def initlogging():
- for handler in logging.root.handlers[:]:
- logging.root.removeHandler(handler)
- loglevel = logging.DEBUG
- logging.basicConfig(
- level=loglevel,
- # logger.BASIC_FORMAT,
- format=
- "%(levelname)s:%(filename)s[%(lineno)s] %(name)s:%(funcName)s->%(message)s",
- datefmt='%a, %d %b %Y %H:%M:%S')
-
-def main():
- args = parser.parse_args()
- print_arguments(args)
- train_async(args)
-
-
-if __name__ == '__main__':
- main()
diff --git a/fluid/PaddleCV/metric_learning/train_pair.py b/fluid/PaddleCV/metric_learning/train_pair.py
deleted file mode 100644
index da94ec5ce65a285f1b59388587120a60ea13ed19..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/metric_learning/train_pair.py
+++ /dev/null
@@ -1,274 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os
-import sys
-import math
-import time
-import logging
-import argparse
-import functools
-import threading
-import subprocess
-import numpy as np
-import paddle
-import paddle.fluid as fluid
-import models
-import reader
-from losses import TripletLoss
-from losses import QuadrupletLoss
-from losses import EmlLoss
-from utility import add_arguments, print_arguments
-from utility import fmt_time, recall_topk, get_gpu_num
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('model', str, "ResNet50", "Set the network to use.")
-add_arg('embedding_size', int, 0, "Embedding size.")
-add_arg('train_batch_size', int, 120, "Minibatch size.")
-add_arg('test_batch_size', int, 50, "Minibatch size.")
-add_arg('image_shape', str, "3,224,224", "input image size")
-add_arg('class_dim', int, 11318, "Class number.")
-add_arg('lr', float, 0.0001, "set learning rate.")
-add_arg('lr_strategy', str, "piecewise_decay", "Set the learning rate decay strategy.")
-add_arg('lr_steps', str, "100000", "step of lr")
-add_arg('total_iter_num', int, 100000, "total_iter_num")
-add_arg('display_iter_step', int, 10, "display_iter_step.")
-add_arg('test_iter_step', int, 5000, "test_iter_step.")
-add_arg('save_iter_step', int, 5000, "save_iter_step.")
-add_arg('use_gpu', bool, True, "Whether to use GPU or not.")
-add_arg('with_mem_opt', bool, True, "Whether to use memory optimization or not.")
-add_arg('pretrained_model', str, None, "Whether to use pretrained model.")
-add_arg('checkpoint', str, None, "Whether to resume checkpoint.")
-add_arg('model_save_dir', str, "output", "model save directory")
-add_arg('loss_name', str, "triplet", "Set the loss type to use.")
-add_arg('samples_each_class', int, 2, "samples_each_class.")
-add_arg('margin', float, 0.1, "margin.")
-# yapf: enable
-
-model_list = [m for m in dir(models) if "__" not in m]
-
-def optimizer_setting(params):
- ls = params["learning_strategy"]
- assert ls["name"] == "piecewise_decay", \
- "learning rate strategy must be {}, \
-        but got {}".format("piecewise_decay", ls["name"])
-
- bd = [int(e) for e in ls["lr_steps"].split(',')]
- base_lr = params["lr"]
- lr = [base_lr * (0.1 ** i) for i in range(len(bd) + 1)]
- optimizer = fluid.optimizer.Momentum(
- learning_rate=fluid.layers.piecewise_decay(
- boundaries=bd, values=lr),
- momentum=0.9,
- regularization=fluid.regularizer.L2Decay(1e-4))
- return optimizer
-
-
-def net_config(image, label, model, args, is_train):
- assert args.model in model_list, "{} is not in lists: {}".format(
- args.model, model_list)
-
- out = model.net(input=image, embedding_size=args.embedding_size)
- if not is_train:
- return None, out
-
- if args.loss_name == "triplet":
- metricloss = TripletLoss(
- margin=args.margin,
- )
- elif args.loss_name == "quadruplet":
- metricloss = QuadrupletLoss(
- train_batch_size = args.train_batch_size,
- samples_each_class = args.samples_each_class,
- margin=args.margin,
- )
- elif args.loss_name == "eml":
- metricloss = EmlLoss(
- train_batch_size = args.train_batch_size,
- samples_each_class = args.samples_each_class,
- )
- cost = metricloss.loss(out)
- avg_cost = fluid.layers.mean(x=cost)
- return avg_cost, out
-
-def build_program(is_train, main_prog, startup_prog, args):
- image_shape = [int(m) for m in args.image_shape.split(",")]
- model = models.__dict__[args.model]()
- with fluid.program_guard(main_prog, startup_prog):
- if is_train:
- queue_capacity = 64
- py_reader = fluid.layers.py_reader(
- capacity=queue_capacity,
- shapes=[[-1] + image_shape, [-1, 1]],
- lod_levels=[0, 0],
- dtypes=["float32", "int64"],
- use_double_buffer=True)
- image, label = fluid.layers.read_file(py_reader)
- else:
- image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
- label = fluid.layers.data(name='label', shape=[1], dtype='int64')
-
- with fluid.unique_name.guard():
- avg_cost, out = net_config(image, label, model, args, is_train)
- if is_train:
- params = model.params
- params["lr"] = args.lr
- params["learning_strategy"]["lr_steps"] = args.lr_steps
- params["learning_strategy"]["name"] = args.lr_strategy
- optimizer = optimizer_setting(params)
- optimizer.minimize(avg_cost)
- global_lr = optimizer._global_learning_rate()
- """
- if not is_train:
- main_prog = main_prog.clone(for_test=True)
- """
- if is_train:
- return py_reader, avg_cost, global_lr, out, label
- else:
- return out, image, label
-
-
-def train_async(args):
- # parameters from arguments
-
- logging.debug('enter train')
- model_name = args.model
- checkpoint = args.checkpoint
- pretrained_model = args.pretrained_model
- model_save_dir = args.model_save_dir
-
- startup_prog = fluid.Program()
- train_prog = fluid.Program()
- tmp_prog = fluid.Program()
-
- train_py_reader, train_cost, global_lr, train_feas, train_label = build_program(
- is_train=True,
- main_prog=train_prog,
- startup_prog=startup_prog,
- args=args)
- test_feas, image, label = build_program(
- is_train=False,
- main_prog=tmp_prog,
- startup_prog=startup_prog,
- args=args)
- test_prog = tmp_prog.clone(for_test=True)
-
- train_fetch_list = [global_lr.name, train_cost.name, train_feas.name, train_label.name]
- test_fetch_list = [test_feas.name]
-
- if args.with_mem_opt:
- fluid.memory_optimize(train_prog, skip_opt_set=set(train_fetch_list))
-
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
-
- exe.run(startup_prog)
-
- logging.debug('after run startup program')
-
- if checkpoint is not None:
- fluid.io.load_persistables(exe, checkpoint, main_program=train_prog)
-
- if pretrained_model:
-
- def if_exist(var):
- return os.path.exists(os.path.join(pretrained_model, var.name))
-
- fluid.io.load_vars(
- exe, pretrained_model, main_program=train_prog, predicate=if_exist)
-
- devicenum = get_gpu_num()
- assert (args.train_batch_size % devicenum) == 0
-    train_batch_size = args.train_batch_size // devicenum
- test_batch_size = args.test_batch_size
-
- train_reader = paddle.batch(reader.train(args), batch_size=train_batch_size, drop_last=True)
- test_reader = paddle.batch(reader.test(args), batch_size=test_batch_size, drop_last=False)
- test_feeder = fluid.DataFeeder(place=place, feed_list=[image, label])
- train_py_reader.decorate_paddle_reader(train_reader)
-
- train_exe = fluid.ParallelExecutor(
- main_program=train_prog,
- use_cuda=args.use_gpu,
- loss_name=train_cost.name)
-
- totalruntime = 0
- train_py_reader.start()
- iter_no = 0
- train_info = [0, 0, 0]
- while iter_no <= args.total_iter_num:
- t1 = time.time()
- lr, loss, feas, label = train_exe.run(fetch_list=train_fetch_list)
- t2 = time.time()
- period = t2 - t1
- lr = np.mean(np.array(lr))
- train_info[0] += np.mean(np.array(loss))
- train_info[1] += recall_topk(feas, label, k=1)
- train_info[2] += 1
- if iter_no % args.display_iter_step == 0:
- avgruntime = totalruntime / args.display_iter_step
- avg_loss = train_info[0] / train_info[2]
- avg_recall = train_info[1] / train_info[2]
- print("[%s] trainbatch %d, lr %.6f, loss %.6f, "\
- "recall %.4f, time %2.2f sec" % \
- (fmt_time(), iter_no, lr, avg_loss, avg_recall, avgruntime))
- sys.stdout.flush()
- totalruntime = 0
- if iter_no % 1000 == 0:
- train_info = [0, 0, 0]
-
- totalruntime += period
-
- if iter_no % args.test_iter_step == 0 and iter_no != 0:
- f, l = [], []
- for batch_id, data in enumerate(test_reader()):
- t1 = time.time()
- [feas] = exe.run(test_prog, fetch_list = test_fetch_list, feed=test_feeder.feed(data))
- label = np.asarray([x[1] for x in data])
- f.append(feas)
- l.append(label)
-
- t2 = time.time()
- period = t2 - t1
- if batch_id % 20 == 0:
- print("[%s] testbatch %d, time %2.2f sec" % \
- (fmt_time(), batch_id, period))
-
- f = np.vstack(f)
- l = np.hstack(l)
- recall = recall_topk(f, l, k=1)
- print("[%s] test_img_num %d, trainbatch %d, test_recall %.5f" % \
- (fmt_time(), len(f), iter_no, recall))
- sys.stdout.flush()
-
- if iter_no % args.save_iter_step == 0 and iter_no != 0:
- model_path = os.path.join(model_save_dir + '/' + model_name,
- str(iter_no))
- if not os.path.isdir(model_path):
- os.makedirs(model_path)
- fluid.io.save_persistables(exe, model_path, main_program=train_prog)
-
- iter_no += 1
-
-def initlogging():
- for handler in logging.root.handlers[:]:
- logging.root.removeHandler(handler)
- loglevel = logging.DEBUG
- logging.basicConfig(
- level=loglevel,
- # logger.BASIC_FORMAT,
- format=
- "%(levelname)s:%(filename)s[%(lineno)s] %(name)s:%(funcName)s->%(message)s",
- datefmt='%a, %d %b %Y %H:%M:%S')
-
-def main():
- args = parser.parse_args()
- print_arguments(args)
- train_async(args)
-
-
-if __name__ == '__main__':
- main()
diff --git a/fluid/PaddleCV/object_detection/README.md b/fluid/PaddleCV/object_detection/README.md
index 651016cdffa7fe6c4fa1dc5e886b9b18e8e40b04..99b0f8db58cc8e2ef130c0054b40bf746b5ac2c8 100644
--- a/fluid/PaddleCV/object_detection/README.md
+++ b/fluid/PaddleCV/object_detection/README.md
@@ -1,124 +1,6 @@
-## SSD Object Detection
-## Table of Contents
-- [Introduction](#introduction)
-- [Data Preparation](#data-preparation)
-- [Train](#train)
-- [Evaluate](#evaluate)
-- [Infer and Visualize](#infer-and-visualize)
-- [Released Model](#released-model)
+Hi!
-### Introduction
+This directory has been deprecated.
-The [Single Shot MultiBox Detector (SSD)](https://arxiv.org/abs/1512.02325) framework for object detection can be categorized as a single-stage detector. A single-stage detector simplifies object detection as a regression problem, directly predicting the bounding boxes and class probabilities without region proposal. SSD further improves on this by producing predictions at different scales from different layers, as shown below. Predictions are made at six levels, on feature maps of six different scales. Each feature map has two 3x3 convolutional layers, which predict the category and the shape offset relative to the prior box (also called anchor), respectively. Thus, we get 38x38x4 + 19x19x6 + 10x10x6 + 5x5x6 + 3x3x4 + 1x1x4 = 8732 detections per class.
-
-
-The Single Shot MultiBox Detector (SSD)
-
-
-SSD is readily pluggable into a wide variety of standard convolutional networks, such as VGG, ResNet, or MobileNet, which are also called the base network or backbone. In this tutorial we use [MobileNet](https://arxiv.org/abs/1704.04861).
-
-
-### Data Preparation
-
-You can use [PASCAL VOC dataset](http://host.robots.ox.ac.uk/pascal/VOC/) or [MS-COCO dataset](http://cocodataset.org/#download).
-
-If you want to train a model on PASCAL VOC dataset, please download dataset at first, skip this step if you already have one.
-
-```bash
-cd data/pascalvoc
-./download.sh
-```
-
-The `download.sh` script will also create the training and testing file lists.
-
-If you want to train a model on MS-COCO dataset, please download dataset at first, skip this step if you already have one.
-
-```
-cd data/coco
-./download.sh
-```
-
-### Train
-
-#### Download the Pre-trained Model.
-
-We provide two pre-trained models. The first is a MobileNet-v1 SSD trained on the COCO dataset, with the COCO-specific convolutional predictors removed. This model can be used to initialize models when training on other datasets, like PASCAL VOC. The other pre-trained model is a MobileNet-v1 trained on the ImageNet 2012 dataset, with the weights and bias of the last fully-connected layer removed.
-
-Declaration: the MobileNet-v1 SSD model is converted by [TensorFlow model](https://github.com/tensorflow/models/blob/f87a58cd96d45de73c9a8330a06b2ab56749a7fa/research/object_detection/g3doc/detection_model_zoo.md). The MobileNet-v1 model is converted from [Caffe](https://github.com/shicai/MobileNet-Caffe).
-We will release our own pre-trained models soon.
-
- - Download MobileNet-v1 SSD:
- ```bash
- ./pretrained/download_coco.sh
- ```
- - Download MobileNet-v1:
- ```bash
- ./pretrained/download_imagenet.sh
- ```
-
-#### Train on PASCAL VOC
-
-`train.py` is the main caller of the training module. Examples of usage are shown below.
- ```bash
- python -u train.py --batch_size=64 --dataset='pascalvoc' --pretrained_model='pretrained/ssd_mobilenet_v1_coco/'
- ```
- - Set ```export CUDA_VISIBLE_DEVICES=0,1``` to specify which GPUs to use.
- - Set ```--dataset='coco2014'``` or ```--dataset='coco2017'``` to train model on MS COCO dataset.
- - For more help on arguments:
-
- ```bash
- python train.py --help
- ```
-
-The data reader is defined in `reader.py`. All images are resized to 300x300. In the training stage, images are randomly distorted, expanded, cropped and flipped:
- - distort: distort brightness, contrast, saturation, and hue.
- - expand: put the original image into a larger expanded image which is initialized using image mean.
- - crop: crop image with respect to different scale, aspect ratio, and overlap.
- - flip: flip horizontally.
-
-We use the RMSProp optimizer with mini-batch size 64 to train MobileNet-SSD. The initial learning rate is 0.001 and is decayed at epochs 40, 60, 80 and 100 with multipliers 0.5, 0.25, 0.1 and 0.01, respectively. Weight decay is 0.00005. After 120 epochs we achieve 73.32% mAP under the 11-point metric. A reference sketch of this schedule is shown below.
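-
-The following is a minimal sketch of how such a piecewise schedule can be expressed in Fluid; the steps-per-epoch value is illustrative only (the real boundaries depend on dataset size and batch size, see train.py):
-
-```python
-import paddle.fluid as fluid
-
-steps_per_epoch = 1000  # hypothetical; depends on dataset and batch size
-boundaries = [e * steps_per_epoch for e in (40, 60, 80, 100)]
-values = [0.001 * m for m in (1.0, 0.5, 0.25, 0.1, 0.01)]
-optimizer = fluid.optimizer.RMSProp(
-    learning_rate=fluid.layers.piecewise_decay(boundaries, values),
-    regularization=fluid.regularizer.L2Decay(0.00005))
-```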
-
-### Evaluate
-
-You can evaluate your trained model with different metrics, like 11point and integral, on both the PASCAL VOC and COCO datasets. Note that the default test list is the dataset's test/val list; you can use your own test list by setting the ```--test_list``` argument.
-
-`eval.py` is the main caller of the evaluating module. Examples of usage are shown below.
-```bash
-python eval.py --dataset='pascalvoc' --model_dir='train_pascal_model/best_model' --data_dir='data/pascalvoc' --test_list='test.txt' --ap_version='11point' --nms_threshold=0.45
-```
-
-You can set ```--dataset``` to ```coco2014``` or ```coco2017``` to evaluate COCO dataset. Moreover, we provide `eval_coco_map.py` which uses a COCO-specific mAP metric defined by [COCO committee](http://cocodataset.org/#detections-eval). To use this eval_coco_map.py, [cocoapi](https://github.com/cocodataset/cocoapi) is needed.
-Install the cocoapi:
-```
-# COCOAPI=/path/to/clone/cocoapi
-git clone https://github.com/cocodataset/cocoapi.git $COCOAPI
-cd $COCOAPI/PythonAPI
-# Install into global site-packages
-make install
-# Alternatively, if you do not have permissions or prefer
-# not to install the COCO API into global site-packages
-python2 setup.py install --user
-```
-
-### Infer and Visualize
-`infer.py` is the main caller of the inferring module. Examples of usage are shown below.
-```bash
-python infer.py --dataset='pascalvoc' --nms_threshold=0.45 --model_dir='train_pascal_model/best_model' --image_path='./data/pascalvoc/VOCdevkit/VOC2007/JPEGImages/009963.jpg'
-```
-Below are examples of running inference and visualizing the model results.
-
-
-
-
-
-MobileNet-v1-SSD 300x300 Visualization Examples
-
-
-
-### Released Model
-
-
-| Model | Pre-trained Model | Training data | Test data | mAP |
-|:------------------------:|:------------------:|:----------------:|:------------:|:----:|
-|[MobileNet-v1-SSD 300x300](http://paddlemodels.bj.bcebos.com/ssd_mobilenet_v1_pascalvoc.tar.gz) | COCO MobileNet SSD | VOC07+12 trainval| VOC07 test | 73.32% |
+Please visit the project at [PaddleCV/object_detection](../../../PaddleCV/object_detection).
diff --git a/fluid/PaddleCV/object_detection/README_cn.md b/fluid/PaddleCV/object_detection/README_cn.md
index 99603953a9dad956bcd13e7af68c59a9ae45c9cd..d3af497b9aecf23db4976970fbe16bc6c99bf6ff 100644
--- a/fluid/PaddleCV/object_detection/README_cn.md
+++ b/fluid/PaddleCV/object_detection/README_cn.md
@@ -1,123 +1,2 @@
-## SSD Object Detection
-## Table of Contents
-- [Introduction](#introduction)
-- [Data Preparation](#data-preparation)
-- [Model Training](#model-training)
-- [Model Evaluation](#model-evaluation)
-- [Inference and Visualization](#inference-and-visualization)
-- [Released Model](#released-model)
-
-### Introduction
-
-[Single Shot MultiBox Detector (SSD)](https://arxiv.org/abs/1512.02325) is a single-stage object detector. Unlike two-stage methods, single-stage detection performs no region proposal and directly regresses the objects' bounding boxes and class probabilities from feature maps. SSD adopts this single-stage idea and improves on it by detecting objects of corresponding scales on feature maps of different scales. As shown below, SSD makes predictions at six levels, on feature maps of six scales. At each level, two 3x3 convolutions regress the object category and the bounding-box offsets, respectively. Therefore, for each class, the six levels of SSD produce 38x38x4 + 19x19x6 + 10x10x6 + 5x5x6 + 3x3x4 + 1x1x4 = 8732 detections in total.
-
-
-The SSD Object Detection Model
-
-
-SSD can be conveniently plugged into any standard convolutional network, such as VGG, ResNet, or MobileNet; these networks are called the detector's base network. In this example we use [MobileNet](https://arxiv.org/abs/1704.04861).
-
-
-### Data Preparation
-
-You can use the [PASCAL VOC dataset](http://host.robots.ox.ac.uk/pascal/VOC/) or the [MS-COCO dataset](http://cocodataset.org/#download).
-
-If you want to train a model on the PASCAL VOC dataset, please download it first with the following command.
-
-```bash
-cd data/pascalvoc
-./download.sh
-```
-
-The `download.sh` script automatically creates the training and testing list files.
-
-If you want to train a model on the MS-COCO dataset, please download it first with the following command.
-
-```
-cd data/coco
-./download.sh
-```
-
-### Model Training
-
-#### Download the Pre-trained Model
-
-We provide two pre-trained models. The first is a MobileNet-v1 SSD pre-trained on the COCO dataset, with its prediction heads removed so that it can be trained on datasets other than COCO. The second is a MobileNet-v1 pre-trained on the ImageNet 2012 dataset, with the final fully-connected layer removed for object detection training.
-
-Declaration: the MobileNet-v1 SSD model is converted from the [TensorFlow model](https://github.com/tensorflow/models/blob/f87a58cd96d45de73c9a8330a06b2ab56749a7fa/research/object_detection/g3doc/detection_model_zoo.md). The MobileNet-v1 model is converted from [Caffe](https://github.com/shicai/MobileNet-Caffe). We will also release our own pre-trained models soon.
-
- - Download MobileNet-v1 SSD:
- ```bash
- ./pretrained/download_coco.sh
- ```
- - Download MobileNet-v1:
- ```bash
- ./pretrained/download_imagenet.sh
- ```
-
-#### Train
-
-`train.py` is the main entry point of the training module; example usage:
- ```bash
- python -u train.py --batch_size=64 --dataset='pascalvoc' --pretrained_model='pretrained/ssd_mobilenet_v1_coco/'
- ```
- - Set ```export CUDA_VISIBLE_DEVICES=0,1``` to specify which GPUs to use.
- - Set ```--dataset='coco2014'``` or ```--dataset='coco2017'``` to train on the MS-COCO dataset.
- - For more options, run:
-
- ```bash
- python train.py --help
- ```
-
-Data reading is defined in `reader.py`; all images are resized to 300x300. During training, images are further augmented with random distortion, expansion, flipping, and cropping:
- - distort: perturb the image's brightness, contrast, saturation, and hue.
- - expand: place the original image into a larger canvas filled with the pixel mean (which is subtracted later in the mean-removal step), then crop, resize, and flip.
- - flip: flip horizontally.
- - crop: generate candidate boxes from scale and aspect-ratio parameters, then select crops by the IoU between the candidates and the annotated boxes.
-
-We train MobileNet-SSD with the RMSProp optimizer, a batch size of 64, and a weight decay of 0.00005. The initial learning rate is 0.001, decayed at epochs 40, 60, 80 and 100 with multipliers 0.5, 0.25, 0.1 and 0.01. After 120 epochs of training, the mAP under the 11-point metric is 73.32%.
-
-### Model Evaluation
-
-You can evaluate the trained model on the PASCAL VOC and COCO datasets with metrics such as 11point and integral. Without loss of generality, the sample code defaults to the dataset's own test/val list; you can specify your own with ```--test_list```.
-
-`eval.py` is the main entry point of the evaluation module; example usage:
-```bash
-python eval.py --dataset='pascalvoc' --model_dir='train_pascal_model/best_model' --data_dir='data/pascalvoc' --test_list='test.txt' --ap_version='11point' --nms_threshold=0.45
-```
-
-Set ```--dataset``` to ```coco2014``` or ```coco2017``` to evaluate on the COCO dataset. We also provide `eval_coco_map.py` for the [official COCO evaluation](http://cocodataset.org/#detections-eval). To use eval_coco_map.py, first download [cocoapi](https://github.com/cocodataset/cocoapi):
-```
-# COCOAPI=/path/to/clone/cocoapi
-git clone https://github.com/cocodataset/cocoapi.git $COCOAPI
-cd $COCOAPI/PythonAPI
-# Install into global site-packages
-make install
-# Alternatively, if you do not have permissions or prefer
-# not to install the COCO API into global site-packages
-python2 setup.py install --user
-```
-
-### Inference and Visualization
-
-`infer.py` is the main entry point of the inference and visualization module; example usage:
-```bash
-python infer.py --dataset='pascalvoc' --nms_threshold=0.45 --model_dir='train_pascal_model/best_model' --image_path='./data/pascalvoc/VOCdevkit/VOC2007/JPEGImages/009963.jpg'
-```
-The model's predictions are visualized below:
-
-
-
-
-
-MobileNet-v1-SSD 300x300 Inference Visualization
-
-
-
-### Released Model
-
-
-| Model | Pre-trained Model | Training data | Test data | mAP |
-|:------------------------:|:------------------:|:----------------:|:------------:|:----:|
-|[MobileNet-v1-SSD 300x300](http://paddlemodels.bj.bcebos.com/ssd_mobilenet_v1_pascalvoc.tar.gz) | COCO MobileNet SSD | VOC07+12 trainval| VOC07 test | 73.32% |
+Hi, this project has been moved. Please visit it at [PaddleCV/object_detection](../../../PaddleCV/object_detection).
diff --git a/fluid/PaddleCV/object_detection/README_quant.md b/fluid/PaddleCV/object_detection/README_quant.md
index 6723a48832d1b5210436eb2001234c6fe9149736..99b0f8db58cc8e2ef130c0054b40bf746b5ac2c8 100644
--- a/fluid/PaddleCV/object_detection/README_quant.md
+++ b/fluid/PaddleCV/object_detection/README_quant.md
@@ -1,143 +1,6 @@
-## Quantization-aware training for SSD
-### Introduction
+Hi!
-The quantization-aware training used in these experiments is introduced in the [fixed-point quantization design](https://github.com/PaddlePaddle/FluidDoc/blob/develop/doc/fluid/design/quantization/fixed_point_quantization.md). Since quantization-aware training is still an active area of research and experimentation,
-here we just give a simple quantization-training example in Fluid based on the MobileNet-SSD model; further experiments are still needed, such as quantization training with fused batch normalization and convolution/fully-connected layers, channel-wise quantization of weights, and so on.
+This directory has been deprecated.
-
-A Python transpiler is used to rewrite the Fluid training or evaluation program for quantization-aware training:
-
-```python
-
- #startup_prog = fluid.Program()
- #train_prog = fluid.Program()
- #loss = build_program(
- # main_prog=train_prog,
- # startup_prog=startup_prog,
- # is_train=True)
- #build_program(
- # main_prog=test_prog,
- # startup_prog=startup_prog,
- # is_train=False)
- #test_prog = test_prog.clone(for_test=True)
- # the above is pseudo code
-
- transpiler = fluid.contrib.QuantizeTranspiler(
- weight_bits=8,
- activation_bits=8,
- activation_quantize_type='abs_max', # or 'range_abs_max'
- weight_quantize_type='abs_max')
- # note, transpiler.training_transpile will rewrite train_prog
- # startup_prog is needed since it needs to insert and initialize
- # some state variable
- transpiler.training_transpile(train_prog, startup_prog)
- transpiler.training_transpile(test_prog, startup_prog)
-```
-
- According to the above design, this transpiler inserts fake quantization and de-quantization operations for each convolution operation (including depthwise convolution) and each fully-connected operation. These quantizations take effect on the weights and activations.
-
- In the design, we introduce dynamic and static quantization strategies for activations. In the experiments, setting `activation_quantize_type` to `abs_max` gives dynamic quantization: the quantization scale (the maximum absolute value) of each activation is computed for every mini-batch during inference. Setting `activation_quantize_type` to `range_abs_max` gives static quantization: a quantization scale for the inference period is computed during training. The following part introduces how to train.
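-
- As an illustration, here is a simplified sketch of 8-bit `abs_max` fake quantization (the function is hypothetical, not the transpiler's actual kernel):
-
-```python
-import numpy as np
-
-def fake_quant_abs_max(x, bits=8):
-    scale = max(np.abs(x).max(), 1e-8)  # dynamic per-tensor scale
-    qmax = (1 << (bits - 1)) - 1        # 127 for 8 bits
-    q = np.round(x / scale * qmax)      # quantize to integers in [-qmax, qmax]
-    return q * scale / qmax             # de-quantize back to float
-```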
-
-### Quantization-aware training
-
- The training is fine-tuned on the well-trained MobileNet-SSD model. So download model at first:
-
- ```
- wget http://paddlemodels.bj.bcebos.com/ssd_mobilenet_v1_pascalvoc.tar.gz
- ```
-
-- dynamic quantization:
-
- ```python
- python main_quant.py \
- --data_dir=$PascalVOC_DIR$ \
- --mode='train' \
- --init_model=ssd_mobilenet_v1_pascalvoc \
- --act_quant_type='abs_max' \
- --epoc_num=20 \
- --learning_rate=0.0001 \
- --batch_size=64 \
- --model_save_dir=$OUTPUT_DIR$
- ```
- Since we fine-tune from a well-trained model, we use a small initial learning rate of 0.0001 and train for 20 epochs.
-
-- static quantization:
- ```python
- python main_quant.py \
- --data_dir=$PascalVOC_DIR$ \
- --mode='train' \
- --init_model=ssd_mobilenet_v1_pascalvoc \
- --act_quant_type='range_abs_max' \
- --epoc_num=80 \
- --learning_rate=0.001 \
- --lr_epochs=30,60 \
- --lr_decay_rates=1,0.1,0.01 \
- --batch_size=64 \
- --model_save_dir=$OUTPUT_DIR$
- ```
- Here we train for 80 epochs; the learning rate decays by a factor of 0.1 at epochs 30 and 60. Users can adjust these hyper-parameters.
-
-### Convert to inference model
-
- As described in the design documentation, the inference graph differs slightly from the training graph: the difference is whether the de-quantization operation is placed before or after conv/fc. The two forms are equivalent in training because conv/fc and de-quantization are linear operations and therefore commute, but for inference the graph needs to be converted; `fluid.contrib.QuantizeTranspiler.freeze_program` is used to do this:
-
- ```python
- #startup_prog = fluid.Program()
- #test_prog = fluid.Program()
- #test_py_reader, map_eval, nmsed_out, image = build_program(
- # main_prog=test_prog,
- # startup_prog=startup_prog,
- # train_params=configs,
- # is_train=False)
- #test_prog = test_prog.clone(for_test=True)
- #transpiler = fluid.contrib.QuantizeTranspiler(weight_bits=8,
- # activation_bits=8,
- # activation_quantize_type=act_quant_type,
- # weight_quantize_type='abs_max')
- #transpiler.training_transpile(test_prog, startup_prog)
- #place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
- #exe = fluid.Executor(place)
- #exe.run(startup_prog)
-
- def if_exist(var):
- return os.path.exists(os.path.join(init_model, var.name))
- fluid.io.load_vars(exe, init_model, main_program=test_prog,
- predicate=if_exist)
- # freeze the rewritten training program
- # freeze after loading parameters; it will quantize the weights
- transpiler.freeze_program(test_prog, place)
- ```
-
- Users can evaluate the converted model by:
-
- ```
- python main_quant.py \
- --data_dir=$PascalVOC_DIR$ \
- --mode='test' \
- --init_model=$MODLE_DIR$ \
- --model_save_dir=$MobileNet_SSD_8BIT_MODEL$
- ```
-
- You can also check the 8-bit model with the inference script:
-
- ```
- python main_quant.py \
- --mode='infer' \
- --init_model=$MobileNet_SSD_8BIT_MODEL$ \
- --confs_threshold=0.5 \
- --image_path='/data/PascalVOC/VOCdevkit/VOC2007/JPEGImages/002271.jpg'
- ```
- See 002271.jpg for the visualized image with bounding boxes.
-
-### Results
-
-Results of MobileNet-v1-SSD 300x300 model on PascalVOC dataset.
-
-| Model | mAP |
-|:---------------------------------------:|:------------------:|
-|Floating point: 32bit | 73.32% |
-|Fixed point: 8bit, dynamic quantization | 72.77% |
-|Fixed point: 8bit, static quantization | 72.45% |
-
- As mentioned above, other experiments are still needed, such as quantization training with fused batch normalization and convolution/fully-connected layers, channel-wise quantization of weights, and storing quantized weights as uint8 instead of int8.
+Please visit the project at [PaddleCV/object_detection](../../../PaddleCV/object_detection).
diff --git a/fluid/PaddleCV/object_detection/_ce.py b/fluid/PaddleCV/object_detection/_ce.py
deleted file mode 100644
index 5f5d3e013a1bfca1ca0a0d1b6fb93a76a242496e..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/object_detection/_ce.py
+++ /dev/null
@@ -1,72 +0,0 @@
-####this file is only used for continuous evaluation test!
-
-import os
-import sys
-sys.path.append(os.environ['ceroot'])
-from kpi import CostKpi, DurationKpi, AccKpi
-
-#### NOTE kpi.py should shared in models in some way!!!!
-
-train_cost_kpi = CostKpi('train_cost', 0.02, 0, actived=True)
-test_acc_kpi = AccKpi('test_acc', 0.01, 0, actived=False)
-train_speed_kpi = AccKpi('train_speed', 0.1, 0, actived=True)
-train_cost_card4_kpi = CostKpi('train_cost_card4', 0.02, 0, actived=True)
-test_acc_card4_kpi = AccKpi('test_acc_card4', 0.01, 0, actived=False)
-train_speed_card4_kpi = AccKpi('train_speed_card4', 0.1, 0, actived=True)
-
-tracking_kpis = [
- train_cost_kpi,
- test_acc_kpi,
- train_speed_kpi,
- train_cost_card4_kpi,
- test_acc_card4_kpi,
- train_speed_card4_kpi,
-]
-
-
-def parse_log(log):
- '''
- This method should be implemented by model developers.
-
- The suggestion:
-
- each line in the log should be key, value, for example:
-
- "
- train_cost\t1.0
- test_cost\t1.0
- train_cost\t1.0
- train_cost\t1.0
- train_acc\t1.2
- "
- '''
- #kpi_map = {}
- for line in log.split('\n'):
- fs = line.strip().split('\t')
- print(fs)
- if len(fs) == 3 and fs[0] == 'kpis':
- print("-----%s" % fs)
- kpi_name = fs[1]
- kpi_value = float(fs[2])
- #kpi_map[kpi_name] = kpi_value
- yield kpi_name, kpi_value
- #return kpi_map
-
-
-def log_to_ce(log):
- kpi_tracker = {}
- for kpi in tracking_kpis:
- kpi_tracker[kpi.name] = kpi
-
- for (kpi_name, kpi_value) in parse_log(log):
- print(kpi_name, kpi_value)
- kpi_tracker[kpi_name].add_record(kpi_value)
- kpi_tracker[kpi_name].persist()
-
-
-if __name__ == '__main__':
- log = sys.stdin.read()
- print("*****")
- print(log)
- print("****")
- log_to_ce(log)
diff --git a/fluid/PaddleCV/object_detection/data_util.py b/fluid/PaddleCV/object_detection/data_util.py
deleted file mode 100644
index ac022593119e0008c3f7f3858303cbf5bc717650..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/object_detection/data_util.py
+++ /dev/null
@@ -1,151 +0,0 @@
-"""
-This code is based on https://github.com/fchollet/keras/blob/master/keras/utils/data_utils.py
-"""
-
-import time
-import numpy as np
-import threading
-import multiprocessing
-try:
- import queue
-except ImportError:
- import Queue as queue
-
-
-class GeneratorEnqueuer(object):
- """
- Builds a queue out of a data generator.
-
- Args:
- generator: a generator function which endlessly yields data
- use_multiprocessing (bool): use multiprocessing if True,
- otherwise use threading.
- wait_time (float): time to sleep in-between calls to `put()`.
- random_seed (int): Initial seed for workers,
- will be incremented by one for each workers.
- """
-
- def __init__(self,
- generator,
- use_multiprocessing=False,
- wait_time=0.05,
- random_seed=None):
- self.wait_time = wait_time
- self._generator = generator
- self._use_multiprocessing = use_multiprocessing
- self._threads = []
- self._stop_event = None
- self.queue = None
- self._manager = None
- self.seed = random_seed
-
- def start(self, workers=1, max_queue_size=10):
- """
- Start worker threads which add data from the generator into the queue.
-
- Args:
- workers (int): number of worker threads
- max_queue_size (int): queue size
- (when full, threads could block on `put()`)
- """
-
- def data_generator_task():
- """
- Data generator task.
- """
-
- def task():
- if (self.queue is not None and
- self.queue.qsize() < max_queue_size):
- generator_output = next(self._generator)
- self.queue.put((generator_output))
- else:
- time.sleep(self.wait_time)
-
- if not self._use_multiprocessing:
- while not self._stop_event.is_set():
- with self.genlock:
- try:
- task()
- except Exception:
- self._stop_event.set()
- break
- else:
- while not self._stop_event.is_set():
- try:
- task()
- except Exception:
- self._stop_event.set()
- break
-
- try:
- if self._use_multiprocessing:
- self._manager = multiprocessing.Manager()
- self.queue = self._manager.Queue(maxsize=max_queue_size)
- self._stop_event = multiprocessing.Event()
- else:
- self.genlock = threading.Lock()
- self.queue = queue.Queue()
- self._stop_event = threading.Event()
- for _ in range(workers):
- if self._use_multiprocessing:
- # Reset random seed else all children processes
- # share the same seed
- np.random.seed(self.seed)
- thread = multiprocessing.Process(target=data_generator_task)
- thread.daemon = True
- if self.seed is not None:
- self.seed += 1
- else:
- thread = threading.Thread(target=data_generator_task)
- self._threads.append(thread)
- thread.start()
- except:
- self.stop()
- raise
-
- def is_running(self):
- """
- Returns:
-            bool: Whether the worker threads are running.
- """
- return self._stop_event is not None and not self._stop_event.is_set()
-
- def stop(self, timeout=None):
- """
-        Stops running threads and waits for them to exit, if necessary.
- Should be called by the same thread which called `start()`.
-
- Args:
- timeout(int|None): maximum time to wait on `thread.join()`.
- """
- if self.is_running():
- self._stop_event.set()
- for thread in self._threads:
- if self._use_multiprocessing:
- if thread.is_alive():
- thread.terminate()
- else:
- thread.join(timeout)
- if self._manager:
- self._manager.shutdown()
-
- self._threads = []
- self._stop_event = None
- self.queue = None
-
- def get(self):
- """
- Creates a generator to extract data from the queue.
- Skip the data if it is `None`.
-
- # Yields
- tuple of data in the queue.
- """
- while self.is_running():
- if not self.queue.empty():
- inputs = self.queue.get()
- if inputs is not None:
- yield inputs
- else:
- time.sleep(self.wait_time)
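A small usage sketch for GeneratorEnqueuer with a toy generator; this mirrors how reader.py further down in this diff drives the class, and the batch shape here is just an assumption for illustration:

```python
import numpy as np

def toy_batches():
    while True:                             # the wrapped generator must be endless
        yield np.random.rand(4, 3, 300, 300).astype('float32')

enqueuer = GeneratorEnqueuer(toy_batches(), use_multiprocessing=False)
enqueuer.start(workers=2, max_queue_size=8)
for i, batch in enumerate(enqueuer.get()):
    print(i, batch.shape)
    if i == 4:                              # consume a few batches, then shut down
        break
enqueuer.stop()
```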
diff --git a/fluid/PaddleCV/object_detection/eval.py b/fluid/PaddleCV/object_detection/eval.py
deleted file mode 100644
index 106fb67e073648f94934e7b17f02b964d276e5ec..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/object_detection/eval.py
+++ /dev/null
@@ -1,138 +0,0 @@
-import os
-import time
-import numpy as np
-import argparse
-import functools
-import math
-
-import paddle
-import paddle.fluid as fluid
-import reader
-from mobilenet_ssd import mobile_net
-from utility import add_arguments, print_arguments
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('dataset', str, 'pascalvoc', "coco2014, coco2017, and pascalvoc.")
-add_arg('batch_size', int, 32, "Minibatch size.")
-add_arg('use_gpu', bool, True, "Whether use GPU.")
-add_arg('data_dir', str, '', "The data root path.")
-add_arg('test_list', str, '', "The testing data lists.")
-add_arg('model_dir', str, '', "The model path.")
-add_arg('nms_threshold', float, 0.45, "NMS threshold.")
-add_arg('ap_version', str, '11point', "mAP version, can be integral or 11point.")
-add_arg('resize_h', int, 300, "The resized image height.")
-add_arg('resize_w', int, 300, "The resized image width.")
-add_arg('mean_value_B', float, 127.5, "Mean value for B channel which will be subtracted.") #123.68
-add_arg('mean_value_G', float, 127.5, "Mean value for G channel which will be subtracted.") #116.78
-add_arg('mean_value_R', float, 127.5, "Mean value for R channel which will be subtracted.") #103.94
-# yapf: enable
-
-
-def build_program(main_prog, startup_prog, args, data_args):
- image_shape = [3, data_args.resize_h, data_args.resize_w]
- if 'coco' in data_args.dataset:
- num_classes = 91
- elif 'pascalvoc' in data_args.dataset:
- num_classes = 21
-
- with fluid.program_guard(main_prog, startup_prog):
- py_reader = fluid.layers.py_reader(
- capacity=64,
- shapes=[[-1] + image_shape, [-1, 4], [-1, 1], [-1, 1]],
- lod_levels=[0, 1, 1, 1],
- dtypes=["float32", "float32", "int32", "int32"],
- use_double_buffer=True)
- with fluid.unique_name.guard():
- image, gt_box, gt_label, difficult = fluid.layers.read_file(
- py_reader)
- locs, confs, box, box_var = mobile_net(num_classes, image,
- image_shape)
- nmsed_out = fluid.layers.detection_output(
- locs, confs, box, box_var, nms_threshold=args.nms_threshold)
- with fluid.program_guard(main_prog):
- map = fluid.evaluator.DetectionMAP(
- nmsed_out,
- gt_label,
- gt_box,
- difficult,
- num_classes,
- overlap_threshold=0.5,
- evaluate_difficult=False,
- ap_version=args.ap_version)
- return py_reader, map
-
-
-def eval(args, data_args, test_list, batch_size, model_dir=None):
- startup_prog = fluid.Program()
- test_prog = fluid.Program()
-
- test_py_reader, map_eval = build_program(
- main_prog=test_prog,
- startup_prog=startup_prog,
- args=args,
- data_args=data_args)
- test_prog = test_prog.clone(for_test=True)
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(startup_prog)
-
- def if_exist(var):
- return os.path.exists(os.path.join(model_dir, var.name))
-
- fluid.io.load_vars(
- exe, model_dir, main_program=test_prog, predicate=if_exist)
-
- test_reader = reader.test(data_args, test_list, batch_size=batch_size)
- test_py_reader.decorate_paddle_reader(test_reader)
-
- _, accum_map = map_eval.get_map_var()
- map_eval.reset(exe)
- test_py_reader.start()
- try:
- batch_id = 0
- while True:
- test_map, = exe.run(test_prog, fetch_list=[accum_map])
- if batch_id % 10 == 0:
- print("Batch {0}, map {1}".format(batch_id, test_map))
- batch_id += 1
- except (fluid.core.EOFException, StopIteration):
- test_py_reader.reset()
- print("Test model {0}, map {1}".format(model_dir, test_map))
-
-
-if __name__ == '__main__':
- args = parser.parse_args()
- print_arguments(args)
-
- data_dir = 'data/pascalvoc'
- test_list = 'test.txt'
- label_file = 'label_list'
-
- if not os.path.exists(args.model_dir):
- raise ValueError("The model path [%s] does not exist." %
- (args.model_dir))
- if 'coco' in args.dataset:
- data_dir = 'data/coco'
- if '2014' in args.dataset:
- test_list = 'annotations/instances_val2014.json'
- elif '2017' in args.dataset:
- test_list = 'annotations/instances_val2017.json'
-
- data_args = reader.Settings(
- dataset=args.dataset,
- data_dir=args.data_dir if len(args.data_dir) > 0 else data_dir,
- label_file=label_file,
- resize_h=args.resize_h,
- resize_w=args.resize_w,
- mean_value=[args.mean_value_B, args.mean_value_G, args.mean_value_R],
- apply_distort=False,
- apply_expand=False,
- ap_version=args.ap_version)
- eval(
- args,
- data_args=data_args,
- test_list=args.test_list if len(args.test_list) > 0 else test_list,
- batch_size=args.batch_size,
- model_dir=args.model_dir)
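The ap_version flag above selects between the 11-point and integral mAP definitions. As a reference for what '11point' computes, a standalone sketch (our own illustration, not the code behind fluid.evaluator.DetectionMAP):

```python
import numpy as np

def eleven_point_ap(recall, precision):
    # Average the best precision achievable at recall >= t, for t = 0.0, 0.1, ..., 1.0.
    ap = 0.0
    for t in np.linspace(0.0, 1.0, 11):
        candidates = precision[recall >= t]
        ap += (candidates.max() if candidates.size else 0.0) / 11.0
    return ap

recall = np.array([0.1, 0.4, 0.7, 0.9])
precision = np.array([1.0, 0.8, 0.6, 0.5])
print(eleven_point_ap(recall, precision))   # ~0.65 for this toy curve
```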
diff --git a/fluid/PaddleCV/object_detection/eval_coco_map.py b/fluid/PaddleCV/object_detection/eval_coco_map.py
deleted file mode 100644
index 0837f42ad89cda1e6a81825bc0545a11b48c4b3c..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/object_detection/eval_coco_map.py
+++ /dev/null
@@ -1,156 +0,0 @@
-import os
-import time
-import numpy as np
-import argparse
-import functools
-
-import paddle
-import paddle.fluid as fluid
-import reader
-from mobilenet_ssd import mobile_net
-from utility import add_arguments, print_arguments
-
-# A special mAP metric for the COCO dataset, which averages AP over different IoU thresholds.
-# To use this eval_cocoMAP.py, [cocoapi](https://github.com/cocodataset/cocoapi) is needed.
-import json
-from pycocotools.coco import COCO
-from pycocotools.cocoeval import COCOeval
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('dataset', str, 'coco2014', "coco2014, coco2017.")
-add_arg('batch_size', int, 32, "Minibatch size.")
-add_arg('use_gpu', bool, True, "Whether use GPU.")
-add_arg('data_dir', str, '', "The data root path.")
-add_arg('test_list', str, '', "The testing data lists.")
-add_arg('model_dir', str, '', "The model path.")
-add_arg('nms_threshold', float, 0.5, "NMS threshold.")
-add_arg('ap_version', str, 'cocoMAP', "cocoMAP.")
-add_arg('resize_h', int, 300, "The resized image height.")
-add_arg('resize_w', int, 300, "The resized image width.")
-add_arg('mean_value_B', float, 127.5, "Mean value for B channel which will be subtracted.") #123.68
-add_arg('mean_value_G', float, 127.5, "Mean value for G channel which will be subtracted.") #116.78
-add_arg('mean_value_R', float, 127.5, "Mean value for R channel which will be subtracted.") #103.94
-# yapf: enable
-
-
-def eval(args, data_args, test_list, batch_size, model_dir=None):
- image_shape = [3, data_args.resize_h, data_args.resize_w]
- num_classes = 91
-
- image = fluid.layers.data(name='image', shape=image_shape, dtype='float32')
- gt_box = fluid.layers.data(
- name='gt_box', shape=[4], dtype='float32', lod_level=1)
- gt_label = fluid.layers.data(
- name='gt_label', shape=[1], dtype='int32', lod_level=1)
- gt_iscrowd = fluid.layers.data(
- name='gt_iscrowd', shape=[1], dtype='int32', lod_level=1)
- gt_image_info = fluid.layers.data(
- name='gt_image_id', shape=[3], dtype='int32', lod_level=1)
-
- locs, confs, box, box_var = mobile_net(num_classes, image, image_shape)
- nmsed_out = fluid.layers.detection_output(
- locs, confs, box, box_var, nms_threshold=args.nms_threshold)
- loss = fluid.layers.ssd_loss(locs, confs, gt_box, gt_label, box, box_var)
- loss = fluid.layers.reduce_sum(loss)
-
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- # yapf: disable
- if model_dir:
- def if_exist(var):
- return os.path.exists(os.path.join(model_dir, var.name))
- fluid.io.load_vars(exe, model_dir, predicate=if_exist)
- # yapf: enable
- test_reader = paddle.batch(
- reader.test(data_args, test_list), batch_size=batch_size)
- feeder = fluid.DataFeeder(
- place=place,
- feed_list=[image, gt_box, gt_label, gt_iscrowd, gt_image_info])
-
- def get_dt_res(nmsed_out_v, data):
- dts_res = []
- lod = nmsed_out_v[0].lod()[0]
- nmsed_out_v = np.array(nmsed_out_v[0])
- real_batch_size = min(batch_size, len(data))
- assert (len(lod) == real_batch_size + 1), \
- "Error Lod Tensor offset dimension. Lod({}) vs. batch_size({})".format(len(lod), batch_size)
- k = 0
- for i in range(real_batch_size):
- dt_num_this_img = lod[i + 1] - lod[i]
- image_id = int(data[i][4][0])
- image_width = int(data[i][4][1])
- image_height = int(data[i][4][2])
- for j in range(dt_num_this_img):
- dt = nmsed_out_v[k]
- k = k + 1
- category_id, score, xmin, ymin, xmax, ymax = dt.tolist()
- xmin = max(min(xmin, 1.0), 0.0) * image_width
- ymin = max(min(ymin, 1.0), 0.0) * image_height
- xmax = max(min(xmax, 1.0), 0.0) * image_width
- ymax = max(min(ymax, 1.0), 0.0) * image_height
- w = xmax - xmin
- h = ymax - ymin
- bbox = [xmin, ymin, w, h]
- dt_res = {
- 'image_id': image_id,
- 'category_id': category_id,
- 'bbox': bbox,
- 'score': score
- }
- dts_res.append(dt_res)
- return dts_res
-
- def test():
- dts_res = []
-
- for batch_id, data in enumerate(test_reader()):
- nmsed_out_v = exe.run(fluid.default_main_program(),
- feed=feeder.feed(data),
- fetch_list=[nmsed_out],
- return_numpy=False)
- if batch_id % 20 == 0:
- print("Batch {0}".format(batch_id))
- dts_res += get_dt_res(nmsed_out_v, data)
-
- with open("detection_result.json", 'w') as outfile:
- json.dump(dts_res, outfile)
-        print("start evaluating using the COCO API")
- cocoGt = COCO(os.path.join(data_args.data_dir, test_list))
- cocoDt = cocoGt.loadRes("detection_result.json")
- cocoEval = COCOeval(cocoGt, cocoDt, "bbox")
- cocoEval.evaluate()
- cocoEval.accumulate()
- cocoEval.summarize()
-
- test()
-
-
-if __name__ == '__main__':
- args = parser.parse_args()
- print_arguments(args)
-
- data_dir = './data/coco'
- if '2014' in args.dataset:
- test_list = 'annotations/instances_val2014.json'
- elif '2017' in args.dataset:
- test_list = 'annotations/instances_val2017.json'
-
- data_args = reader.Settings(
- dataset=args.dataset,
- data_dir=args.data_dir if len(args.data_dir) > 0 else data_dir,
- label_file='',
- resize_h=args.resize_h,
- resize_w=args.resize_w,
- mean_value=[args.mean_value_B, args.mean_value_G, args.mean_value_R],
- apply_distort=False,
- apply_expand=False,
- ap_version=args.ap_version,
- toy=0)
- eval(
- args,
- data_args=data_args,
- test_list=args.test_list if len(args.test_list) > 0 else test_list,
- batch_size=args.batch_size,
- model_dir=args.model_dir)
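get_dt_res above converts the detector's normalized corner boxes into COCO's pixel-space [x, y, width, height] format; the per-box arithmetic, isolated as a sketch:

```python
def to_coco_bbox(norm_box, im_width, im_height):
    # Corners come in normalized [0, 1] coordinates; COCO wants the top-left
    # corner plus the box size, in pixels.
    clip = lambda v: max(min(v, 1.0), 0.0)
    xmin, ymin, xmax, ymax = norm_box
    x, y = clip(xmin) * im_width, clip(ymin) * im_height
    w = clip(xmax) * im_width - x
    h = clip(ymax) * im_height - y
    return [x, y, w, h]

print(to_coco_bbox((0.1, 0.2, 0.5, 0.8), 640, 480))   # [64.0, 96.0, 256.0, 288.0]
```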
diff --git a/fluid/PaddleCV/object_detection/main_quant.py b/fluid/PaddleCV/object_detection/main_quant.py
deleted file mode 100644
index bd7d377e69e95dcaf066a40941cf48091583e7ab..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/object_detection/main_quant.py
+++ /dev/null
@@ -1,282 +0,0 @@
-import os
-import time
-import numpy as np
-import argparse
-import functools
-import shutil
-import math
-import multiprocessing
-
-import paddle
-import paddle.fluid as fluid
-import reader
-from mobilenet_ssd import mobile_net
-from utility import add_arguments, print_arguments
-from train import build_program
-from train import train_parameters
-from infer import draw_bounding_box_on_image
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('learning_rate', float, 0.0001, "Learning rate.")
-add_arg('batch_size', int, 64, "Minibatch size.")
-add_arg('epoc_num', int, 20, "Epoch number.")
-add_arg('use_gpu', bool, True, "Whether use GPU.")
-add_arg('parallel', bool, True, "Whether train in parallel on multi-devices.")
-add_arg('model_save_dir', str, 'quant_model', "The path to save model.")
-add_arg('init_model', str, 'ssd_mobilenet_v1_pascalvoc', "The init model path.")
-add_arg('ap_version', str, '11point', "mAP version can be integral or 11point.")
-add_arg('image_shape', str, '3,300,300', "Input image shape.")
-add_arg('mean_BGR', str, '127.5,127.5,127.5', "Mean value for B,G,R channel which will be subtracted.")
-add_arg('lr_epochs', str, '30,60', "The learning decay steps.")
-add_arg('lr_decay_rates', str, '1,0.1,0.01', "The learning decay rates for each step.")
-add_arg('data_dir', str, 'data/pascalvoc', "Data directory")
-add_arg('act_quant_type', str, 'abs_max', "Quantization type of activations, which can be abs_max or range_abs_max.")
-add_arg('image_path', str, '', "The image used to inference and visualize.")
-add_arg('confs_threshold', float, 0.5, "Confidence threshold to draw bbox.")
-add_arg('mode', str, 'train', "Job mode can be one of ['train', 'test', 'infer'].")
-#yapf: enable
-
-def test(exe, test_prog, map_eval, test_py_reader):
- _, accum_map = map_eval.get_map_var()
- map_eval.reset(exe)
- test_py_reader.start()
- try:
- batch = 0
- while True:
- test_map, = exe.run(test_prog, fetch_list=[accum_map])
- if batch % 10 == 0:
- print("Batch {0}, map {1}".format(batch, test_map))
- batch += 1
- except fluid.core.EOFException:
- test_py_reader.reset()
- finally:
- test_py_reader.reset()
- print("Test map {0}".format(test_map))
- return test_map
-
-
-def save_model(exe, main_prog, model_save_dir, postfix):
- model_path = os.path.join(model_save_dir, postfix)
- if os.path.isdir(model_path):
- shutil.rmtree(model_path)
- fluid.io.save_persistables(exe, model_path, main_program=main_prog)
-
-
-def train(args,
- data_args,
- train_params,
- train_file_list,
- val_file_list):
-
- model_save_dir = args.model_save_dir
- init_model = args.init_model
- epoc_num = args.epoc_num
- use_gpu = args.use_gpu
- parallel = args.parallel
- is_shuffle = True
- act_quant_type = args.act_quant_type
-
- if use_gpu:
- devices_num = fluid.core.get_cuda_device_count()
- else:
- devices_num = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
-
- batch_size = train_params['batch_size']
- batch_size_per_device = batch_size // devices_num
- iters_per_epoc = train_params["train_images"] // batch_size
- num_workers = 4
-
- startup_prog = fluid.Program()
- train_prog = fluid.Program()
- test_prog = fluid.Program()
-
- train_py_reader, loss = build_program(
- main_prog=train_prog,
- startup_prog=startup_prog,
- train_params=train_params,
- is_train=True)
- test_py_reader, map_eval, _, _ = build_program(
- main_prog=test_prog,
- startup_prog=startup_prog,
- train_params=train_params,
- is_train=False)
-
- test_prog = test_prog.clone(for_test=True)
-
- transpiler = fluid.contrib.QuantizeTranspiler(weight_bits=8,
- activation_bits=8,
- activation_quantize_type=act_quant_type,
- weight_quantize_type='abs_max')
-
- transpiler.training_transpile(train_prog, startup_prog)
- transpiler.training_transpile(test_prog, startup_prog)
-
- place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(startup_prog)
-
- if init_model:
- print('Load init model %s.' % init_model)
- def if_exist(var):
- return os.path.exists(os.path.join(init_model, var.name))
- fluid.io.load_vars(exe, init_model, main_program=train_prog,
- predicate=if_exist)
- else:
- print('There is no init model.')
-
- if parallel:
- train_exe = fluid.ParallelExecutor(main_program=train_prog,
- use_cuda=True if use_gpu else False, loss_name=loss.name)
-
- train_reader = reader.train(data_args,
- train_file_list,
- batch_size_per_device,
- shuffle=is_shuffle,
- use_multiprocessing=True,
- num_workers=num_workers,
- max_queue=24)
- test_reader = reader.test(data_args, val_file_list, batch_size)
- train_py_reader.decorate_paddle_reader(train_reader)
- test_py_reader.decorate_paddle_reader(test_reader)
-
- train_py_reader.start()
- best_map = 0.
- try:
- for epoc in range(epoc_num):
- if epoc == 0:
- # test quantized model without quantization-aware training.
- test_map = test(exe, test_prog, map_eval, test_py_reader)
- # train
- for batch in range(iters_per_epoc):
- start_time = time.time()
- if parallel:
- outs = train_exe.run(fetch_list=[loss.name])
- else:
- outs = exe.run(train_prog, fetch_list=[loss])
- end_time = time.time()
- avg_loss = np.mean(np.array(outs[0]))
- if batch % 20 == 0:
- print("Epoc {:d}, batch {:d}, loss {:.6f}, time {:.5f}".format(
- epoc , batch, avg_loss, end_time - start_time))
- end_time = time.time()
- test_map = test(exe, test_prog, map_eval, test_py_reader)
- save_model(exe, train_prog, model_save_dir, str(epoc))
- if test_map > best_map:
- best_map = test_map
- save_model(exe, train_prog, model_save_dir, 'best_map')
- print("Best test map {0}".format(best_map))
- except (fluid.core.EOFException, StopIteration):
- train_py_reader.reset()
-
-
-def eval(args, data_args, configs, val_file_list):
- init_model = args.init_model
- use_gpu = args.use_gpu
- act_quant_type = args.act_quant_type
- model_save_dir = args.model_save_dir
-
- batch_size = configs['batch_size']
- batch_size_per_device = batch_size
-
- startup_prog = fluid.Program()
- test_prog = fluid.Program()
- test_py_reader, map_eval, nmsed_out, image = build_program(
- main_prog=test_prog,
- startup_prog=startup_prog,
- train_params=configs,
- is_train=False)
- test_prog = test_prog.clone(for_test=True)
-
- transpiler = fluid.contrib.QuantizeTranspiler(weight_bits=8,
- activation_bits=8,
- activation_quantize_type=act_quant_type,
- weight_quantize_type='abs_max')
- transpiler.training_transpile(test_prog, startup_prog)
-
- place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(startup_prog)
-
- def if_exist(var):
- return os.path.exists(os.path.join(init_model, var.name))
- fluid.io.load_vars(exe, init_model, main_program=test_prog,
- predicate=if_exist)
-
-    # freeze after loading the parameters
- transpiler.freeze_program(test_prog, place)
-
- test_reader = reader.test(data_args, val_file_list, batch_size)
- test_py_reader.decorate_paddle_reader(test_reader)
-
- test_map = test(exe, test_prog, map_eval, test_py_reader)
- print("Test model {0}, map {1}".format(init_model, test_map))
- fluid.io.save_inference_model(model_save_dir, [image.name],
- [nmsed_out], exe, test_prog)
-
-
-def infer(args, data_args):
- model_dir = args.init_model
- image_path = args.image_path
- confs_threshold = args.confs_threshold
-
- place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- [inference_program, feed , fetch] = fluid.io.load_inference_model(
- dirname=model_dir,
- executor=exe,
- model_filename='__model__')
-
- #print(np.array(fluid.global_scope().find_var('conv2d_20.w_0').get_tensor()))
- #print(np.max(np.array(fluid.global_scope().find_var('conv2d_20.w_0').get_tensor())))
- infer_reader = reader.infer(data_args, image_path)
- data = infer_reader()
- data = data.reshape((1,) + data.shape)
- outs = exe.run(inference_program,
- feed={feed[0]: data},
- fetch_list=fetch,
- return_numpy=False)
- out = np.array(outs[0])
- draw_bounding_box_on_image(image_path, out, confs_threshold,
- data_args.label_list)
-
-
-if __name__ == '__main__':
- args = parser.parse_args()
- print_arguments(args)
-
- # for pascalvoc
- label_file = 'label_list'
- train_list = 'trainval.txt'
- val_list = 'test.txt'
- dataset = 'pascalvoc'
-
- mean_BGR = [float(m) for m in args.mean_BGR.split(",")]
- image_shape = [int(m) for m in args.image_shape.split(",")]
- lr_epochs = [int(m) for m in args.lr_epochs.split(",")]
- lr_rates = [float(m) for m in args.lr_decay_rates.split(",")]
- train_parameters[dataset]['image_shape'] = image_shape
- train_parameters[dataset]['batch_size'] = args.batch_size
- train_parameters[dataset]['lr'] = args.learning_rate
- train_parameters[dataset]['epoc_num'] = args.epoc_num
- train_parameters[dataset]['ap_version'] = args.ap_version
- train_parameters[dataset]['lr_epochs'] = lr_epochs
- train_parameters[dataset]['lr_decay'] = lr_rates
-
- data_args = reader.Settings(
- dataset=dataset,
- data_dir=args.data_dir,
- label_file=label_file,
- resize_h=image_shape[1],
- resize_w=image_shape[2],
- mean_value=mean_BGR,
- apply_distort=True,
- apply_expand=True,
- ap_version = args.ap_version)
- if args.mode == 'train':
- train(args, data_args, train_parameters[dataset], train_list, val_list)
- elif args.mode == 'test':
- eval(args, data_args, train_parameters[dataset], val_list)
- else:
- infer(args, data_args)
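The act_quant_type flag in main_quant.py distinguishes how activation scales are obtained: abs_max derives the scale from each batch at runtime (dynamic), while range_abs_max records a range during training and freezes it for inference (static). A conceptual numpy sketch of the contrast; the moving-average update is our assumption, not the transpiler's exact rule:

```python
import numpy as np

def quant_abs_max(x):
    # Dynamic: the scale is recomputed from the current batch.
    scale = max(np.abs(x).max(), 1e-8) / 127.0
    return np.clip(np.round(x / scale), -127, 127).astype(np.int8), scale

class RangeAbsMax(object):
    # Static: a range is accumulated during training and reused at inference.
    def __init__(self, momentum=0.9):
        self.range, self.momentum = 1e-8, momentum

    def update(self, x):        # called on each training batch
        self.range = self.momentum * self.range + \
            (1.0 - self.momentum) * np.abs(x).max()

    def quantize(self, x):      # called at inference with the frozen range
        scale = self.range / 127.0
        return np.clip(np.round(x / scale), -127, 127).astype(np.int8), scale
```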
diff --git a/fluid/PaddleCV/object_detection/reader.py b/fluid/PaddleCV/object_detection/reader.py
deleted file mode 100644
index 59da1b38fb2e9cce8bb99a2773e7fc222ee33bd8..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/object_detection/reader.py
+++ /dev/null
@@ -1,362 +0,0 @@
-# Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import image_util
-from paddle.utils.image_util import *
-from PIL import Image
-from PIL import ImageDraw
-import numpy as np
-import xml.etree.ElementTree
-import os
-import time
-import copy
-import six
-from data_util import GeneratorEnqueuer
-
-
-class Settings(object):
- def __init__(self,
- dataset=None,
- data_dir=None,
- label_file=None,
- resize_h=300,
- resize_w=300,
- mean_value=[127.5, 127.5, 127.5],
- apply_distort=True,
- apply_expand=True,
- ap_version='11point'):
- self._dataset = dataset
- self._ap_version = ap_version
- self._data_dir = data_dir
- if 'pascalvoc' in dataset:
- self._label_list = []
- label_fpath = os.path.join(data_dir, label_file)
- for line in open(label_fpath):
- self._label_list.append(line.strip())
-
- self._apply_distort = apply_distort
- self._apply_expand = apply_expand
- self._resize_height = resize_h
- self._resize_width = resize_w
- self._img_mean = np.array(mean_value)[:, np.newaxis, np.newaxis].astype(
- 'float32')
- self._expand_prob = 0.5
- self._expand_max_ratio = 4
- self._hue_prob = 0.5
- self._hue_delta = 18
- self._contrast_prob = 0.5
- self._contrast_delta = 0.5
- self._saturation_prob = 0.5
- self._saturation_delta = 0.5
- self._brightness_prob = 0.5
- self._brightness_delta = 0.125
-
- @property
- def dataset(self):
- return self._dataset
-
- @property
- def ap_version(self):
- return self._ap_version
-
- @property
-    def apply_expand(self):
- return self._apply_expand
-
- @property
- def apply_distort(self):
- return self._apply_distort
-
- @property
- def data_dir(self):
- return self._data_dir
-
- @data_dir.setter
- def data_dir(self, data_dir):
- self._data_dir = data_dir
-
- @property
- def label_list(self):
- return self._label_list
-
- @property
- def resize_h(self):
- return self._resize_height
-
- @property
- def resize_w(self):
- return self._resize_width
-
- @property
- def img_mean(self):
- return self._img_mean
-
-
-def preprocess(img, bbox_labels, mode, settings):
- img_width, img_height = img.size
- sampled_labels = bbox_labels
- if mode == 'train':
- if settings._apply_distort:
- img = image_util.distort_image(img, settings)
- if settings._apply_expand:
- img, bbox_labels, img_width, img_height = image_util.expand_image(
- img, bbox_labels, img_width, img_height, settings)
- # sampling
- batch_sampler = []
- # hard-code here
- batch_sampler.append(
- image_util.sampler(1, 1, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.1, 0.0))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.3, 0.0))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.5, 0.0))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.7, 0.0))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.9, 0.0))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.0, 1.0))
- sampled_bbox = image_util.generate_batch_samples(batch_sampler,
- bbox_labels)
-
- img = np.array(img)
- if len(sampled_bbox) > 0:
- idx = int(np.random.uniform(0, len(sampled_bbox)))
- img, sampled_labels = image_util.crop_image(
- img, bbox_labels, sampled_bbox[idx], img_width, img_height)
-
- img = Image.fromarray(img)
- img = img.resize((settings.resize_w, settings.resize_h), Image.ANTIALIAS)
- img = np.array(img)
-
- if mode == 'train':
- mirror = int(np.random.uniform(0, 2))
- if mirror == 1:
- img = img[:, ::-1, :]
- for i in six.moves.xrange(len(sampled_labels)):
- tmp = sampled_labels[i][1]
- sampled_labels[i][1] = 1 - sampled_labels[i][3]
- sampled_labels[i][3] = 1 - tmp
- # HWC to CHW
- if len(img.shape) == 3:
- img = np.swapaxes(img, 1, 2)
- img = np.swapaxes(img, 1, 0)
-    # RGB to BGR
- img = img[[2, 1, 0], :, :]
- img = img.astype('float32')
- img -= settings.img_mean
- img = img * 0.007843
- return img, sampled_labels
-
-
-def coco(settings, file_list, mode, batch_size, shuffle):
- # cocoapi
- from pycocotools.coco import COCO
- from pycocotools.cocoeval import COCOeval
-
- coco = COCO(file_list)
- image_ids = coco.getImgIds()
- images = coco.loadImgs(image_ids)
- print("{} on {} with {} images".format(mode, settings.dataset, len(images)))
-
- def reader():
- if mode == 'train' and shuffle:
- np.random.shuffle(images)
- batch_out = []
- for image in images:
- image_name = image['file_name']
- image_path = os.path.join(settings.data_dir, image_name)
-
- im = Image.open(image_path)
- if im.mode == 'L':
- im = im.convert('RGB')
- im_width, im_height = im.size
- im_id = image['id']
-
- # layout: category_id | xmin | ymin | xmax | ymax | iscrowd
- bbox_labels = []
- annIds = coco.getAnnIds(imgIds=image['id'])
- anns = coco.loadAnns(annIds)
- for ann in anns:
- bbox_sample = []
-                # category_id starts from 1; 0 is reserved for background
- bbox_sample.append(float(ann['category_id']))
- bbox = ann['bbox']
- xmin, ymin, w, h = bbox
- xmax = xmin + w
- ymax = ymin + h
- bbox_sample.append(float(xmin) / im_width)
- bbox_sample.append(float(ymin) / im_height)
- bbox_sample.append(float(xmax) / im_width)
- bbox_sample.append(float(ymax) / im_height)
- bbox_sample.append(float(ann['iscrowd']))
- bbox_labels.append(bbox_sample)
- im, sample_labels = preprocess(im, bbox_labels, mode, settings)
- sample_labels = np.array(sample_labels)
- if len(sample_labels) == 0: continue
- im = im.astype('float32')
- boxes = sample_labels[:, 1:5]
- lbls = sample_labels[:, 0].astype('int32')
- iscrowd = sample_labels[:, -1].astype('int32')
- if 'cocoMAP' in settings.ap_version:
- batch_out.append((im, boxes, lbls, iscrowd,
- [im_id, im_width, im_height]))
- else:
- batch_out.append((im, boxes, lbls, iscrowd))
-
- if len(batch_out) == batch_size:
- yield batch_out
- batch_out = []
-
- if mode == 'test' and len(batch_out) > 1:
- yield batch_out
- batch_out = []
-
- return reader
-
-
-def pascalvoc(settings, file_list, mode, batch_size, shuffle):
- flist = open(file_list)
- images = [line.strip() for line in flist]
- print("{} on {} with {} images".format(mode, settings.dataset, len(images)))
-
- def reader():
- if mode == 'train' and shuffle:
- np.random.shuffle(images)
- batch_out = []
- cnt = 0
- for image in images:
- image_path, label_path = image.split()
- image_path = os.path.join(settings.data_dir, image_path)
- label_path = os.path.join(settings.data_dir, label_path)
-
- im = Image.open(image_path)
- if im.mode == 'L':
- im = im.convert('RGB')
- im_width, im_height = im.size
-
- # layout: label | xmin | ymin | xmax | ymax | difficult
- bbox_labels = []
- root = xml.etree.ElementTree.parse(label_path).getroot()
- for object in root.findall('object'):
- bbox_sample = []
- # start from 1
- bbox_sample.append(
- float(settings.label_list.index(object.find('name').text)))
- bbox = object.find('bndbox')
- difficult = float(object.find('difficult').text)
- bbox_sample.append(float(bbox.find('xmin').text) / im_width)
- bbox_sample.append(float(bbox.find('ymin').text) / im_height)
- bbox_sample.append(float(bbox.find('xmax').text) / im_width)
- bbox_sample.append(float(bbox.find('ymax').text) / im_height)
- bbox_sample.append(difficult)
- bbox_labels.append(bbox_sample)
- im, sample_labels = preprocess(im, bbox_labels, mode, settings)
- sample_labels = np.array(sample_labels)
- if len(sample_labels) == 0: continue
- im = im.astype('float32')
- boxes = sample_labels[:, 1:5]
- lbls = sample_labels[:, 0].astype('int32')
- difficults = sample_labels[:, -1].astype('int32')
-
- batch_out.append((im, boxes, lbls, difficults))
- if len(batch_out) == batch_size:
- yield batch_out
- cnt += len(batch_out)
- batch_out = []
-
- if mode == 'test' and len(batch_out) > 1:
- yield batch_out
- cnt += len(batch_out)
- batch_out = []
-
- return reader
-
-
-def train(settings,
- file_list,
- batch_size,
- shuffle=True,
- use_multiprocessing=True,
- num_workers=8,
- max_queue=24,
- enable_ce=False):
- file_list = os.path.join(settings.data_dir, file_list)
-
- if 'coco' in settings.dataset:
- generator = coco(settings, file_list, "train", batch_size, shuffle)
- else:
- generator = pascalvoc(settings, file_list, "train", batch_size, shuffle)
-
- def infinite_reader():
- while True:
- for data in generator():
- yield data
-
- def reader():
- try:
- enqueuer = GeneratorEnqueuer(
- infinite_reader(), use_multiprocessing=use_multiprocessing)
- enqueuer.start(max_queue_size=max_queue, workers=num_workers)
- generator_output = None
- while True:
- while enqueuer.is_running():
- if not enqueuer.queue.empty():
- generator_output = enqueuer.queue.get()
- break
- else:
- time.sleep(0.02)
- yield generator_output
- generator_output = None
- finally:
- if enqueuer is not None:
- enqueuer.stop()
-
- if enable_ce:
- return infinite_reader
- else:
- return reader
-
-
-def test(settings, file_list, batch_size):
- file_list = os.path.join(settings.data_dir, file_list)
- if 'coco' in settings.dataset:
- return coco(settings, file_list, 'test', batch_size, False)
- else:
- return pascalvoc(settings, file_list, 'test', batch_size, False)
-
-
-def infer(settings, image_path):
- def reader():
- img = Image.open(image_path)
- if img.mode == 'L':
-            img = img.convert('RGB')
- im_width, im_height = img.size
- img = img.resize((settings.resize_w, settings.resize_h),
- Image.ANTIALIAS)
- img = np.array(img)
- # HWC to CHW
- if len(img.shape) == 3:
- img = np.swapaxes(img, 1, 2)
- img = np.swapaxes(img, 1, 0)
-        # RGB to BGR
- img = img[[2, 1, 0], :, :]
- img = img.astype('float32')
- img -= settings.img_mean
- img = img * 0.007843
- return img
-
- return reader
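The tail of preprocess() and the infer() reader above apply the same normalization. Pulled out as a standalone sketch, it makes the magic constant visible: 0.007843 is approximately 1/127.5, so pixel values land roughly in [-1, 1]:

```python
import numpy as np

def normalize(img_hwc_rgb, mean_bgr=(127.5, 127.5, 127.5)):
    img = np.swapaxes(img_hwc_rgb, 1, 2)
    img = np.swapaxes(img, 1, 0)                         # HWC -> CHW
    img = img[[2, 1, 0], :, :].astype('float32')         # RGB -> BGR
    img -= np.array(mean_bgr)[:, np.newaxis, np.newaxis]
    return img * 0.007843                                # ~1/127.5

fake = np.random.randint(0, 256, (300, 300, 3)).astype('uint8')
out = normalize(fake)
print(out.shape, out.min(), out.max())                   # (3, 300, 300), ~-1, ~1
```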
diff --git a/fluid/PaddleCV/object_detection/train.py b/fluid/PaddleCV/object_detection/train.py
deleted file mode 100644
index 2d830bcdf1d7900ca2f27055a9ec7568f75b6211..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/object_detection/train.py
+++ /dev/null
@@ -1,317 +0,0 @@
-import os
-import time
-import numpy as np
-import argparse
-import functools
-import shutil
-import math
-import multiprocessing
-
-import paddle
-import paddle.fluid as fluid
-import reader
-from mobilenet_ssd import mobile_net
-from utility import add_arguments, print_arguments
-
-parser = argparse.ArgumentParser(description=__doc__)
-add_arg = functools.partial(add_arguments, argparser=parser)
-# yapf: disable
-add_arg('learning_rate', float, 0.001, "Learning rate.")
-add_arg('batch_size', int, 64, "Minibatch size of all devices.")
-add_arg('epoc_num', int, 120, "Epoch number.")
-add_arg('use_gpu', bool, True, "Whether use GPU.")
-add_arg('parallel', bool, True, "Whether train in parallel on multi-devices.")
-add_arg('dataset', str, 'pascalvoc', "dataset can be coco2014, coco2017, and pascalvoc.")
-add_arg('model_save_dir', str, 'model', "The path to save model.")
-add_arg('pretrained_model', str, 'pretrained/ssd_mobilenet_v1_coco/', "The init model path.")
-add_arg('ap_version', str, '11point', "mAP version can be integral or 11point.")
-add_arg('image_shape', str, '3,300,300', "Input image shape.")
-add_arg('mean_BGR', str, '127.5,127.5,127.5', "Mean value for B,G,R channel which will be subtracted.")
-add_arg('data_dir', str, 'data/pascalvoc', "Data directory.")
-add_arg('enable_ce', bool, False, "Whether use CE to evaluate the model.")
-#yapf: enable
-
-train_parameters = {
- "pascalvoc": {
- "train_images": 16551,
- "image_shape": [3, 300, 300],
- "class_num": 21,
- "batch_size": 64,
- "lr": 0.001,
- "lr_epochs": [40, 60, 80, 100],
- "lr_decay": [1, 0.5, 0.25, 0.1, 0.01],
- "ap_version": '11point',
- },
- "coco2014": {
- "train_images": 82783,
- "image_shape": [3, 300, 300],
- "class_num": 91,
- "batch_size": 64,
- "lr": 0.001,
- "lr_epochs": [12, 19],
- "lr_decay": [1, 0.5, 0.25],
- "ap_version": 'integral', # should use eval_coco_map.py to test model
- },
- "coco2017": {
- "train_images": 118287,
- "image_shape": [3, 300, 300],
- "class_num": 91,
- "batch_size": 64,
- "lr": 0.001,
- "lr_epochs": [12, 19],
- "lr_decay": [1, 0.5, 0.25],
- "ap_version": 'integral', # should use eval_coco_map.py to test model
- }
-}
-
-def optimizer_setting(train_params):
- batch_size = train_params["batch_size"]
- iters = train_params["train_images"] // batch_size
- lr = train_params["lr"]
- boundaries = [i * iters for i in train_params["lr_epochs"]]
- values = [ i * lr for i in train_params["lr_decay"]]
-
- optimizer = fluid.optimizer.RMSProp(
- learning_rate=fluid.layers.piecewise_decay(boundaries, values),
- regularization=fluid.regularizer.L2Decay(0.00005), )
-
- return optimizer
-
-
-def build_program(main_prog, startup_prog, train_params, is_train):
- image_shape = train_params['image_shape']
- class_num = train_params['class_num']
- ap_version = train_params['ap_version']
- outs = []
- with fluid.program_guard(main_prog, startup_prog):
- py_reader = fluid.layers.py_reader(
- capacity=64,
- shapes=[[-1] + image_shape, [-1, 4], [-1, 1], [-1, 1]],
- lod_levels=[0, 1, 1, 1],
- dtypes=["float32", "float32", "int32", "int32"],
- use_double_buffer=True)
- with fluid.unique_name.guard():
- image, gt_box, gt_label, difficult = fluid.layers.read_file(py_reader)
- locs, confs, box, box_var = mobile_net(class_num, image, image_shape)
- if is_train:
- with fluid.unique_name.guard("train"):
- loss = fluid.layers.ssd_loss(locs, confs, gt_box, gt_label, box,
- box_var)
- loss = fluid.layers.reduce_sum(loss)
- optimizer = optimizer_setting(train_params)
- optimizer.minimize(loss)
- outs = [py_reader, loss]
- else:
- with fluid.unique_name.guard("inference"):
- nmsed_out = fluid.layers.detection_output(
- locs, confs, box, box_var, nms_threshold=0.45)
- map_eval = fluid.evaluator.DetectionMAP(
- nmsed_out,
- gt_label,
- gt_box,
- difficult,
- class_num,
- overlap_threshold=0.5,
- evaluate_difficult=False,
- ap_version=ap_version)
-            # nmsed_out and image are used when saving the model for inference
- outs = [py_reader, map_eval, nmsed_out, image]
- return outs
-
-
-def train(args,
- data_args,
- train_params,
- train_file_list,
- val_file_list):
-
- model_save_dir = args.model_save_dir
- pretrained_model = args.pretrained_model
- use_gpu = args.use_gpu
- parallel = args.parallel
- enable_ce = args.enable_ce
- is_shuffle = True
-
- if not use_gpu:
- devices_num = int(os.environ.get('CPU_NUM',
- multiprocessing.cpu_count()))
- else:
- devices_num = fluid.core.get_cuda_device_count()
-
- batch_size = train_params['batch_size']
- epoc_num = train_params['epoc_num']
- batch_size_per_device = batch_size // devices_num
- iters_per_epoc = train_params["train_images"] // batch_size
- num_workers = 8
-
- startup_prog = fluid.Program()
- train_prog = fluid.Program()
- test_prog = fluid.Program()
-
- if enable_ce:
- import random
- random.seed(0)
- np.random.seed(0)
- is_shuffle = False
- startup_prog.random_seed = 111
- train_prog.random_seed = 111
- test_prog.random_seed = 111
-
- train_py_reader, loss = build_program(
- main_prog=train_prog,
- startup_prog=startup_prog,
- train_params=train_params,
- is_train=True)
- test_py_reader, map_eval, _, _ = build_program(
- main_prog=test_prog,
- startup_prog=startup_prog,
- train_params=train_params,
- is_train=False)
-
- test_prog = test_prog.clone(for_test=True)
- place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(startup_prog)
-
- if pretrained_model:
- def if_exist(var):
- return os.path.exists(os.path.join(pretrained_model, var.name))
- fluid.io.load_vars(exe, pretrained_model, main_program=train_prog,
- predicate=if_exist)
-
- if parallel:
- train_exe = fluid.ParallelExecutor(main_program=train_prog,
- use_cuda=use_gpu, loss_name=loss.name)
- train_reader = reader.train(data_args,
- train_file_list,
- batch_size_per_device,
- shuffle=is_shuffle,
- use_multiprocessing=True,
- num_workers=num_workers,
- max_queue=24,
- enable_ce=enable_ce)
- test_reader = reader.test(data_args, val_file_list, batch_size)
- train_py_reader.decorate_paddle_reader(train_reader)
- test_py_reader.decorate_paddle_reader(test_reader)
-
- def save_model(postfix, main_prog):
- model_path = os.path.join(model_save_dir, postfix)
- if os.path.isdir(model_path):
- shutil.rmtree(model_path)
- print('save models to %s' % (model_path))
- fluid.io.save_persistables(exe, model_path, main_program=main_prog)
-
- best_map = 0.
- def test(epoc_id, best_map):
- _, accum_map = map_eval.get_map_var()
- map_eval.reset(exe)
- every_epoc_map=[]
- test_py_reader.start()
- try:
- batch_id = 0
- while True:
- test_map, = exe.run(test_prog, fetch_list=[accum_map])
- if batch_id % 10 == 0:
- every_epoc_map.append(test_map)
- print("Batch {0}, map {1}".format(batch_id, test_map))
- batch_id += 1
- except fluid.core.EOFException:
- test_py_reader.reset()
- mean_map = np.mean(every_epoc_map)
- print("Epoc {0}, test map {1}".format(epoc_id, test_map))
- if test_map[0] > best_map:
- best_map = test_map[0]
- save_model('best_model', test_prog)
- return best_map, mean_map
-
-
- train_py_reader.start()
- total_time = 0.0
- try:
- for epoc_id in range(epoc_num):
- epoch_idx = epoc_id + 1
- start_time = time.time()
- prev_start_time = start_time
- every_epoc_loss = []
- for batch_id in range(iters_per_epoc):
- prev_start_time = start_time
- start_time = time.time()
- if parallel:
- loss_v, = train_exe.run(fetch_list=[loss.name])
- else:
- loss_v, = exe.run(train_prog, fetch_list=[loss])
- loss_v = np.mean(np.array(loss_v))
- every_epoc_loss.append(loss_v)
- if batch_id % 20 == 0:
- print("Epoc {:d}, batch {:d}, loss {:.6f}, time {:.5f}".format(
- epoc_id, batch_id, loss_v, start_time - prev_start_time))
- end_time = time.time()
- total_time += end_time - start_time
-
- best_map, mean_map = test(epoc_id, best_map)
- print("Best test map {0}".format(best_map))
- if epoc_id % 10 == 0 or epoc_id == epoc_num - 1:
- save_model(str(epoc_id), train_prog)
-
- if enable_ce and epoc_id == epoc_num - 1:
- train_avg_loss = np.mean(every_epoc_loss)
- if devices_num == 1:
- print("kpis train_cost %s" % train_avg_loss)
- print("kpis test_acc %s" % mean_map)
- print("kpis train_speed %s" % (total_time / epoch_idx))
- else:
- print("kpis train_cost_card%s %s" %
- (devices_num, train_avg_loss))
- print("kpis test_acc_card%s %s" %
- (devices_num, mean_map))
- print("kpis train_speed_card%s %f" %
- (devices_num, total_time / epoch_idx))
-
- except (fluid.core.EOFException, StopIteration):
- train_reader().close()
- train_py_reader.reset()
-
-
-if __name__ == '__main__':
- args = parser.parse_args()
- print_arguments(args)
-
- data_dir = args.data_dir
- dataset = args.dataset
- assert dataset in ['pascalvoc', 'coco2014', 'coco2017']
-
- # for pascalvoc
- label_file = 'label_list'
- train_file_list = 'trainval.txt'
- val_file_list = 'test.txt'
-
- if dataset == 'coco2014':
- train_file_list = 'annotations/instances_train2014.json'
- val_file_list = 'annotations/instances_val2014.json'
- elif dataset == 'coco2017':
- train_file_list = 'annotations/instances_train2017.json'
- val_file_list = 'annotations/instances_val2017.json'
-
- mean_BGR = [float(m) for m in args.mean_BGR.split(",")]
- image_shape = [int(m) for m in args.image_shape.split(",")]
- train_parameters[dataset]['image_shape'] = image_shape
- train_parameters[dataset]['batch_size'] = args.batch_size
- train_parameters[dataset]['lr'] = args.learning_rate
- train_parameters[dataset]['epoc_num'] = args.epoc_num
- train_parameters[dataset]['ap_version'] = args.ap_version
-
- data_args = reader.Settings(
- dataset=args.dataset,
- data_dir=data_dir,
- label_file=label_file,
- resize_h=image_shape[1],
- resize_w=image_shape[2],
- mean_value=mean_BGR,
- apply_distort=True,
- apply_expand=True,
- ap_version = args.ap_version)
- train(args,
- data_args,
- train_parameters[dataset],
- train_file_list=train_file_list,
- val_file_list=val_file_list)
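For concreteness, plugging the pascalvoc defaults from train_parameters into optimizer_setting above yields the piecewise_decay schedule below (a worked example, not additional configuration):

```python
train_images, batch_size, lr = 16551, 64, 0.001
iters = train_images // batch_size                   # 258 iterations per epoch
boundaries = [e * iters for e in [40, 60, 80, 100]]  # [10320, 15480, 20640, 25800]
values = [d * lr for d in [1, 0.5, 0.25, 0.1, 0.01]] # lr after each boundary
print(boundaries, values)                            # final lr is 1e-05
```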
diff --git a/fluid/PaddleCV/ocr_recognition/README.md b/fluid/PaddleCV/ocr_recognition/README.md
index 653e2a91f163b6b6d9c006684670393a53f991e2..aa675d6048ecdb025ef2273ee755354152adc32e 100644
--- a/fluid/PaddleCV/ocr_recognition/README.md
+++ b/fluid/PaddleCV/ocr_recognition/README.md
@@ -1,199 +1,2 @@
-
-Running the example programs in this directory requires the latest develop version of PaddlePaddle. If your installed PaddlePaddle is older than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update it.
-
-## Code structure
-```
-├── data_reader.py # Downloads, reads, and preprocesses data.
-├── crnn_ctc_model.py # Defines the network structure of the OCR CTC model.
-├── attention_model.py # Defines the network structure of the OCR attention model.
-├── train.py # Trains a model.
-├── infer.py # Loads a trained model and predicts on new data.
-├── eval.py # Evaluates a model on a given dataset.
-└── utils.py # Common utility functions.
-```
-
-
-## Introduction
-
-The task in this chapter is to recognize a single line of English characters in an image; we use two different models, a CTC model and an attention model, to accomplish it.
-
-The two models share the same encoder: convolutions first turn the image into a feature map, the `im2sequence op` then converts the feature map into a sequence, and a `bidirectional GRU` learns features over the sequence.
-
-The decoders and loss functions of the two models differ as follows:
-
-- CTC model: trained with the CTC (Connectionist Temporal Classification) loss; at prediction time it uses a greedy strategy with CTC decoding.
-- Attention model: trained with an attention-based decoding strategy and the cross-entropy loss; at prediction time it uses beam search.
-
-Both models are evaluated by the sample-level error rate.
-
-## Data
-
-Data downloading and simple preprocessing are implemented in `data_reader.py`.
-
-### Data example
-
-The training and test data we use are shown in `Figure 1`: each image contains a single line of English characters of variable length, pre-cropped by a detection algorithm.
-
-
-
-Figure 1
-
-
-In the training set, the label of each image is the sequence of indices of its characters in the dictionary. The label corresponding to `Figure 1` is:
-```
-80,84,68,82,83,72,78,77,68,67
-```
-In the label above, `80` is the index of the character `Q`, and `67` is the index of the English character `D`.
-
-
-### Data preparation
-
-**A. Training set**
-
-Put all training images into a single folder, referred to here as `train_images`. Then use a list file to record the information of each image, including its size, file name, and label; we refer to this list file as `train_list`. Its format is as follows:
-
-```
-185 48 00508_0215.jpg 7740,5332,2369,3201,4162
-48 48 00197_1893.jpg 6569
-338 48 00007_0219.jpg 4590,4788,3015,1994,3402,999,4553
-150 48 00107_4517.jpg 5936,3382,1437,3382
-...
-157 48 00387_0622.jpg 2397,1707,5919,1278
-```
-
-The file train_list
-
-Each line of the file above describes one image and is split by spaces into four columns: the first two are the image's width and height, the third is the image file name, and the fourth is the image's sequence label.
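A quick sketch of how one such line could be parsed in Python (the variable names are ours, not the repo reader's):

```python
line = "185 48 00508_0215.jpg 7740,5332,2369,3201,4162"
width, height, name, label = line.split(" ", 3)
width, height = int(width), int(height)
indices = [int(idx) for idx in label.split(",")]
print(width, height, name, indices)   # 185 48 00508_0215.jpg [7740, 5332, ...]
```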
-In the end you should have a file structure like the following:
-
-```
-|-train_data
-    |- train_list
-    |- train_images
-        |- 00508_0215.jpg
-        |- 00197_1893.jpg
-        |- 00007_0219.jpg
-        | ...
-```
-
-At training time, pass the prepared `train_images` and `train_list` via the `--train_images` and `--train_list` options, respectively.
-
-
->**Note:** If neither `--train_images` nor `--train_list` is set, or they are set to None, ctc_reader.py automatically downloads the [sample data](http://paddle-ocr-data.bj.bcebos.com/data.tar.gz) and caches it under `$HOME/.cache/paddle/dataset/ctc_data/data/`.
-
-
-**B. Test set and evaluation set**
-
-The test set and evaluation set are prepared in the same way as the training set.
-During training, the test set path is set via the train.py options `--test_images` and `--test_list`.
-During evaluation, the evaluation set path is set via the eval.py options `--input_images_dir` and `--input_images_list`.
-
-**C. Data to predict on**
-
-Inference supports three forms of input:
-
-First: set `--input_images_dir` and `--input_images_list`, just like for the training set, except that the last column of the list file may hold an arbitrary placeholder character or string, as below:
-
-```
-185 48 00508_0215.jpg s
-48 48 00197_1893.jpg s
-338 48 00007_0219.jpg s
-...
-```
-
-Second: set only `--input_images_list`, where the list file simply contains the full path of each image, as below:
-
-```
-data/test_images/00000.jpg
-data/test_images/00001.jpg
-data/test_images/00003.jpg
-```
-
-Third: read the path of one image from stdin and run a single inference on it.
-
-## Model training and inference
-
-### Training
-
-Train on a single GPU with the default data:
-
-```
-env CUDA_VISIBLE_DEVICES=0 python ctc_train.py
-```
-Train on CPU with the default data:
-```
-env OMP_NUM_THREADS= python ctc_train.py --use_gpu False --parallel=False
-```
-
-Train on multiple GPUs with the default data:
-
-```
-env CUDA_VISIBLE_DEVICES=0,1,2,3 python ctc_train.py --parallel=True
-```
-
-The `CTC model` is used by default; it can be switched to the `attention model` via the option `--model="attention"`.
-
-Run `python train.py --help` for more usage information and detailed parameter descriptions.
-
-Figure 2 shows the convergence curve of the `CTC model` trained on the default dataset with default parameters: the horizontal axis is the number of training iterations and the vertical axis is the sample-level error rate, with the blue line for the training set and the red line for the test set. The lowest error rate on the test set is 22.0%.
-
-
-
-Figure 2
-
-
-Figure 3 shows the convergence curve of the `attention model` trained on the default dataset with default parameters: the horizontal axis is the number of training iterations and the vertical axis is the sample-level error rate, with the blue line for the training set and the red line for the test set. The lowest error rate on the test set is 16.25%.
-
-
-
-Figure 3
-
-
-
-## Testing
-
-Evaluate a model on a given dataset by invoking the evaluation script as follows:
-
-```
-env CUDA_VISIBLE_DEVICES=0 python eval.py \
-    --model_path="./models/model_0" \
-    --input_images_dir="./eval_data/images/" \
-    --input_images_list="./eval_data/eval_list"
-```
-
-Run `python eval.py --help` for detailed parameter descriptions.
-
-
-### Inference
-
-Read the path of an image from standard input and predict on it:
-
-```
-env CUDA_VISIBLE_DEVICES=0 python infer.py \
-    --model_path="models/model_00044_15000"
-```
-
-Running the command above for prediction produces output like the following:
-
-```
------------ Configuration Arguments -----------
-use_gpu: True
-input_images_dir: None
-input_images_list: None
-model_path: /home/work/models/fluid/ocr_recognition/models/model_00052_15000
-------------------------------------------------
-Init model from: ./models/model_00052_15000.
-Please input the path of image: ./test_images/00001_0060.jpg
-result: [3298 2371 4233 6514 2378 3298 2363]
-Please input the path of image: ./test_images/00001_0429.jpg
-result: [2067 2067 8187 8477 5027 7191 2431 1462]
-```
-
-Read image paths in batch from a file and predict on them:
-
-```
-env CUDA_VISIBLE_DEVICES=0 python infer.py \
-    --model_path="models/model_00044_15000" \
-    --input_images_list="data/test.list"
-```
+Hi, this project has been migrated. Please visit [PaddleCV/ocr_recognition](../../../PaddleCV/ocr_recognition) to browse it.
diff --git a/fluid/PaddleCV/rcnn/README.md b/fluid/PaddleCV/rcnn/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..1e96b373a0ad13424691921dd17e8f251b9cdfc7
--- /dev/null
+++ b/fluid/PaddleCV/rcnn/README.md
@@ -0,0 +1,6 @@
+
+Hi!
+
+This directory has been deprecated.
+
+Please visit the project at [PaddleCV/rcnn](../../../PaddleCV/rcnn).
diff --git a/fluid/PaddleCV/rcnn/README_cn.md b/fluid/PaddleCV/rcnn/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..83d5e0fc06448086e8807587798e804e3c634f97
--- /dev/null
+++ b/fluid/PaddleCV/rcnn/README_cn.md
@@ -0,0 +1,2 @@
+
+Hi, this project has been migrated. Please visit [PaddleCV/rcnn](../../../PaddleCV/rcnn) to browse it.
diff --git a/fluid/PaddleCV/video/README.md b/fluid/PaddleCV/video/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..bbef3af1c6f6715e4415041939e046d66f02f58d
--- /dev/null
+++ b/fluid/PaddleCV/video/README.md
@@ -0,0 +1,2 @@
+
+Hi, this project has been migrated. Please visit [PaddleCV/video](../../../PaddleCV/video) to browse it.
diff --git a/fluid/PaddleCV/video/models/attention_cluster/README.md b/fluid/PaddleCV/video/models/attention_cluster/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..95056a71cb34304788168e15479a4aa1e2ecf3af
--- /dev/null
+++ b/fluid/PaddleCV/video/models/attention_cluster/README.md
@@ -0,0 +1,2 @@
+
+Hi, this project has been migrated. Please visit [PaddleCV/video/models/attention_cluster](../../../../../PaddleCV/video/models/attention_cluster/) to browse it.
diff --git a/fluid/PaddleCV/video/models/attention_lstm/README.md b/fluid/PaddleCV/video/models/attention_lstm/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..044c88cbecafdc880ae0cd213f6df77a8ce1715f
--- /dev/null
+++ b/fluid/PaddleCV/video/models/attention_lstm/README.md
@@ -0,0 +1,2 @@
+
+Hi, this project has been migrated. Please visit [PaddleCV/video/models/attention_lstm](../../../../../PaddleCV/video/models/attention_lstm/) to browse it.
diff --git a/fluid/PaddleCV/video/models/nextvlad/README.md b/fluid/PaddleCV/video/models/nextvlad/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..ad3a926dd83c8d8825224c404dda76fff5238cbe
--- /dev/null
+++ b/fluid/PaddleCV/video/models/nextvlad/README.md
@@ -0,0 +1,2 @@
+
+Hi, this project has been migrated. Please visit [PaddleCV/video/models/nextvlad](../../../../../PaddleCV/video/models/nextvlad/) to browse it.
diff --git a/fluid/PaddleCV/video/models/nonlocal_model/README.md b/fluid/PaddleCV/video/models/nonlocal_model/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..4f72316b5e761c7e2e421f76fc3f743ab4ac12fb
--- /dev/null
+++ b/fluid/PaddleCV/video/models/nonlocal_model/README.md
@@ -0,0 +1,2 @@
+
+Hi, this project has been migrated. Please visit [PaddleCV/video/models/nonlocal_model](../../../../../PaddleCV/video/models/nonlocal_model/) to browse it.
diff --git a/fluid/PaddleCV/video/models/stnet/README.md b/fluid/PaddleCV/video/models/stnet/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..15cff5af0909a93c8cf244629878582aa6c2d12f
--- /dev/null
+++ b/fluid/PaddleCV/video/models/stnet/README.md
@@ -0,0 +1,2 @@
+
+Hi, this project has been migrated. Please visit [PaddleCV/video/models/stnet](../../../../../PaddleCV/video/models/stnet/) to browse it.
diff --git a/fluid/PaddleCV/video/models/tsm/README.md b/fluid/PaddleCV/video/models/tsm/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..c93c56618aff1cfd331b2c1bd9fccfbb8a4c7a08
--- /dev/null
+++ b/fluid/PaddleCV/video/models/tsm/README.md
@@ -0,0 +1,2 @@
+
+Hi, this project has been migrated. Please visit [PaddleCV/video/models/tsm](../../../../../PaddleCV/video/models/tsm/) to browse it.
diff --git a/fluid/PaddleCV/video/models/tsn/README.md b/fluid/PaddleCV/video/models/tsn/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..8b4a986a63ea7746a3c7a648cd9d535803784ca3
--- /dev/null
+++ b/fluid/PaddleCV/video/models/tsn/README.md
@@ -0,0 +1,2 @@
+
+Hi, this project has been migrated. Please visit [PaddleCV/video/models/tsn](../../../../../PaddleCV/video/models/tsn/) to browse it.
diff --git a/fluid/PaddleCV/video_classification/README.md b/fluid/PaddleCV/video_classification/README.md
index 822c3ccf64cb1c5567e574425229974524a34471..bb145d1e7d4538f8b1a6df5cf547d9c5ef5ae8c5 100644
--- a/fluid/PaddleCV/video_classification/README.md
+++ b/fluid/PaddleCV/video_classification/README.md
@@ -1,140 +1,6 @@
-# Video Classification Based on Temporal Segment Network
-Video classification has drawn a significant amount of attention in the past few years. This page introduces how to perform video classification with PaddlePaddle Fluid on the public UCF-101 dataset, based on the state-of-the-art Temporal Segment Network (TSN) method.
+Hi!
-______________________________________________________________________________
+This directory has been deprecated.
-## Table of Contents
-- Installation
-- Data preparation
-- Training
-- Evaluation
-- Inference
-- Performance
-
-### Installation
-Running the sample code in this directory requires PaddlePaddle Fluid v0.13.0 or later. If the PaddlePaddle on your device is older than this version, please follow the instructions in the installation document and update it.
-
-### Data preparation
-
-#### download UCF-101 dataset
-Users can download the UCF-101 dataset with the provided script data/download.sh.
-
-#### decode video into frame
-To avoid decoding videos during network training, we decode them into frames offline and save them in pickle format, which is easily readable in Python.
-
-Users can refer to the script data/video_decode.py for video decoding.
-
-#### split data into train and test
-We follow split 1 of the UCF-101 dataset. After splitting, there are 9537 videos for training and 3783 videos for validation. The reference script is data/split_data.py.
-
-#### save pickle for training
-As stated above, we save all data in pickle format for training. All information of each video is saved into one pickle file, including the video id, frame binaries, and label. Please refer to the script data/generate_train_data.py.
-After this step, one gets two directories containing training and testing data in pickle format, and two files, train.list and test.list, with the columns of each line separated by spaces.
-
-### Training
-After data preparation, users can start the PaddlePaddle Fluid training by:
-```
-python train.py \
- --batch_size=128 \
- --total_videos=9537 \
- --class_dim=101 \
- --num_epochs=60 \
- --image_shape=3,224,224 \
- --model_save_dir=output/ \
- --with_mem_opt=True \
- --lr_init=0.01 \
- --num_layers=50 \
- --seg_num=7 \
- --pretrained_model={path_to_pretrained_model}
-```
-
-Parameter introduction:
-* batch_size: the size of each mini-batch.
-* total_videos: the total number of videos in the training set.
-* class_dim: the number of classes in the classification task.
-* num_epochs: the number of epochs.
-* image_shape: the input size of the network.
-* model_save_dir: the directory to save trained models.
-* with_mem_opt: whether to use memory optimization.
-* lr_init: the initial learning rate.
-* num_layers: the number of layers of the ResNet backbone.
-* seg_num: the number of segments in TSN.
-* pretrained_model: the path of the model used for pretraining.
-
-
-Data reader introduction:
-The data reader is defined in reader.py. Note that all frames of one video are grouped and read together as one sample.
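Since seg_num drives the whole TSN pipeline, an illustrative sketch of TSN-style segment sampling follows (our simplification of the idea, assuming the video has at least seg_num frames):

```python
import numpy as np

def sample_segments(num_frames, seg_num=7):
    # Split the video into seg_num equal segments and draw one frame from each.
    seg_len = num_frames // seg_num
    return [i * seg_len + np.random.randint(seg_len) for i in range(seg_num)]

print(sample_segments(120))   # e.g. [5, 28, 40, 63, 75, 86, 113]
```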
-
-
-training:
-The training log is like:
-```
-[TRAIN] Pass: 0 trainbatch: 0 loss: 4.630959 acc1: 0.0 acc5: 0.0390625 time: 3.09 sec
-[TRAIN] Pass: 0 trainbatch: 10 loss: 4.559069 acc1: 0.0546875 acc5: 0.1171875 time: 3.91 sec
-[TRAIN] Pass: 0 trainbatch: 20 loss: 4.040092 acc1: 0.09375 acc5: 0.3515625 time: 3.88 sec
-[TRAIN] Pass: 0 trainbatch: 30 loss: 3.478214 acc1: 0.3203125 acc5: 0.5546875 time: 3.32 sec
-[TRAIN] Pass: 0 trainbatch: 40 loss: 3.005404 acc1: 0.3515625 acc5: 0.6796875 time: 3.33 sec
-[TRAIN] Pass: 0 trainbatch: 50 loss: 2.585245 acc1: 0.4609375 acc5: 0.7265625 time: 3.13 sec
-[TRAIN] Pass: 0 trainbatch: 60 loss: 2.151489 acc1: 0.4921875 acc5: 0.8203125 time: 3.35 sec
-[TRAIN] Pass: 0 trainbatch: 70 loss: 1.981680 acc1: 0.578125 acc5: 0.8359375 time: 3.30 sec
-```
-
-### Evaluation
-Evaluation measures the performance of a trained model. One can download a pretrained model and set its path as path_to_pretrained_model. Top-1/top-5 accuracy can then be obtained by running the following command:
-```
-python eval.py \
- --batch_size=128 \
- --class_dim=101 \
- --image_shape=3,224,224 \
- --with_mem_opt=True \
- --num_layers=50 \
- --seg_num=7 \
- --test_model={path_to_pretrained_model}
-```
-
-With the evaluation configuration above, the output log looks like:
-```
-[TEST] Pass: 0 testbatch: 0 loss: 0.011551 acc1: 1.0 acc5: 1.0 time: 0.48 sec
-[TEST] Pass: 0 testbatch: 10 loss: 0.710330 acc1: 0.75 acc5: 1.0 time: 0.49 sec
-[TEST] Pass: 0 testbatch: 20 loss: 0.000547 acc1: 1.0 acc5: 1.0 time: 0.48 sec
-[TEST] Pass: 0 testbatch: 30 loss: 0.036623 acc1: 1.0 acc5: 1.0 time: 0.48 sec
-[TEST] Pass: 0 testbatch: 40 loss: 0.138705 acc1: 1.0 acc5: 1.0 time: 0.48 sec
-[TEST] Pass: 0 testbatch: 50 loss: 0.056909 acc1: 1.0 acc5: 1.0 time: 0.49 sec
-[TEST] Pass: 0 testbatch: 60 loss: 0.742937 acc1: 0.75 acc5: 1.0 time: 0.49 sec
-[TEST] Pass: 0 testbatch: 70 loss: 1.720186 acc1: 0.5 acc5: 0.875 time: 0.48 sec
-[TEST] Pass: 0 testbatch: 80 loss: 0.199669 acc1: 0.875 acc5: 1.0 time: 0.48 sec
-[TEST] Pass: 0 testbatch: 90 loss: 0.195510 acc1: 1.0 acc5: 1.0 time: 0.48 sec
-```
-
-### Inference
-Inference is used to obtain prediction scores or video features from a trained model.
-```
-python infer.py \
- --class_dim=101 \
- --image_shape=3,224,224 \
- --with_mem_opt=True \
- --num_layers=50 \
- --seg_num=7 \
- --test_model={path_to_pretrained_model}
-```
-
-The output contains prediction results, including the maximum score (before softmax) and the corresponding predicted label.
-```
-Test sample: PlayingGuitar_g01_c03, score: [21.418629], class [62]
-Test sample: SalsaSpin_g05_c06, score: [13.238657], class [76]
-Test sample: TrampolineJumping_g04_c01, score: [21.722862], class [93]
-Test sample: JavelinThrow_g01_c04, score: [16.27892], class [44]
-Test sample: PlayingTabla_g01_c01, score: [15.366951], class [65]
-Test sample: ParallelBars_g04_c07, score: [18.42596], class [56]
-Test sample: PlayingCello_g05_c05, score: [18.795723], class [58]
-Test sample: LongJump_g03_c04, score: [7.100088], class [50]
-Test sample: SkyDiving_g06_c03, score: [15.144707], class [82]
-Test sample: UnevenBars_g07_c04, score: [22.114838], class [95]
-```
-
-### Performance
-Configuration | Top-1 acc
-------------- | ---------------:
-seg=7, size=224 | 0.859
-seg=10, size=224 | 0.863
+Please visit the project at [PaddleCV/video_classification](../../../PaddleCV/video_classification).
diff --git a/fluid/PaddleCV/yolov3/README.md b/fluid/PaddleCV/yolov3/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..d05d89ce182a23b2f74e2633f7ada32fc6390477
--- /dev/null
+++ b/fluid/PaddleCV/yolov3/README.md
@@ -0,0 +1,6 @@
+
+Hi!
+
+This directory has been deprecated.
+
+Please visit the project at [PaddleCV/yolov3](../../../PaddleCV/yolov3).
diff --git a/fluid/PaddleCV/yolov3/README_cn.md b/fluid/PaddleCV/yolov3/README_cn.md
new file mode 100644
index 0000000000000000000000000000000000000000..89080d674df265d37a3601b579622adf1829c747
--- /dev/null
+++ b/fluid/PaddleCV/yolov3/README_cn.md
@@ -0,0 +1,2 @@
+
+Hi! This project has been migrated. Please visit it at [PaddleCV/yolov3](../../../PaddleCV/yolov3).
diff --git a/fluid/PaddleNLP/LAC b/fluid/PaddleNLP/LAC
deleted file mode 160000
index d2fc9e0b45b4e6cfc93e73054026fc5a8abfbfb9..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/LAC
+++ /dev/null
@@ -1 +0,0 @@
-Subproject commit d2fc9e0b45b4e6cfc93e73054026fc5a8abfbfb9
diff --git a/fluid/PaddleNLP/Senta b/fluid/PaddleNLP/Senta
deleted file mode 160000
index 733c1d02085a3092dd262c4f396563962a514c3e..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/Senta
+++ /dev/null
@@ -1 +0,0 @@
-Subproject commit 733c1d02085a3092dd262c4f396563962a514c3e
diff --git a/fluid/PaddleNLP/SimNet b/fluid/PaddleNLP/SimNet
deleted file mode 160000
index 60b698a294c34420a7f0aab3112f27649aed1445..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/SimNet
+++ /dev/null
@@ -1 +0,0 @@
-Subproject commit 60b698a294c34420a7f0aab3112f27649aed1445
diff --git a/fluid/PaddleNLP/chinese_ner/README.md b/fluid/PaddleNLP/chinese_ner/README.md
index c5155b181b45eddcab3cd8d02d2036bd8c1e93ad..d4497b248da56eb8936147c2d1d7c8444f98cb5c 100644
--- a/fluid/PaddleNLP/chinese_ner/README.md
+++ b/fluid/PaddleNLP/chinese_ner/README.md
@@ -1,55 +1,3 @@
-# Chinese Named Entity Recognition Example Using ParallelExecutor
-Below is a brief directory structure and description for this example:
-```text
-.
-├── data                 # data this example depends on, obtained externally
-├── reader.py            # data reading interface, obtained externally
-├── README.md            # documentation
-├── train.py             # training script
-├── infer.py             # inference script
-```
-
-## Data
-Under the data directory there are two folders: train_files holds the training data and test_files holds the test data. As an example, we placed two files in each folder. For actual training, place your data in the corresponding directories as needed and modify the data reading function in reader.py according to your data format.
-
-## Training
-Modify the `main` function of [train.py](./train.py) to specify the data path, then run `python train.py` to start training.
-
-The training log looks like:
-```txt
-pass_id:0, time_cost:4.92960214615s
-[Train] precision:0.000862136531076, recall:0.0059880239521, f1:0.00150726226363
-[Test] precision:0.000796178343949, recall:0.00335758254057, f1:0.00128713933283
-pass_id:1, time_cost:0.715255975723s
-[Train] precision:0.00474094141551, recall:0.00762112139358, f1:0.00584551148225
-[Test] precision:0.0228873239437, recall:0.00727476217124, f1:0.0110403397028
-pass_id:2, time_cost:0.740842103958s
-[Train] precision:0.0120967741935, recall:0.00163309744148, f1:0.00287769784173
-[Test] precision:0, recall:0.0, f1:0
-```
-
-## Inference
-Modify the `infer` function of [infer.py](./infer.py) to specify the path of the model to test, the test data, and the path of the label file, then run `python infer.py` to start inference.
-
-The inference results look like:
-```txt
-152804 O O
-130048 O O
-38862 10-B O
-784 O O
-1540 O O
-4145 O O
-2255 O O
-0 O O
-1279 O O
-7793 O O
-373 O O
-1621 O O
-815 O O
-2 O O
-247 24-B O
-401 24-I O
-```
-The output has three columns separated by "\t": the first column is the index of the input token, the second column is the gold-standard label, and the third column is the predicted label. Different input sequences are separated by blank lines.
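-
-A small sketch of how such output could be parsed back into sequences (ours, for illustration only):
-
-```python
-def parse_output(path):
-    sequences, current = [], []
-    for line in open(path):
-        line = line.rstrip('\n')
-        if not line:  # a blank line separates input sequences
-            if current:
-                sequences.append(current)
-                current = []
-            continue
-        token_id, gold, pred = line.split('\t')
-        current.append((token_id, gold, pred))
-    if current:
-        sequences.append(current)
-    return sequences
-```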
+Hi! This project has been migrated. Please visit it at [PaddleNLP/chinese_ner](../../../PaddleNLP/chinese_ner/).
diff --git a/fluid/PaddleNLP/chinese_ner/infer.py b/fluid/PaddleNLP/chinese_ner/infer.py
deleted file mode 100644
index e22832d38bc5308444201bd302798cf18cae7d99..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/chinese_ner/infer.py
+++ /dev/null
@@ -1,181 +0,0 @@
-import numpy as np
-import argparse
-import time
-
-import paddle.fluid as fluid
-import paddle.fluid.profiler as profiler
-import paddle
-
-import reader
-
-
-def parse_args():
- parser = argparse.ArgumentParser("Run inference.")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=6,
- help='The size of a batch. (default: %(default)d)')
- parser.add_argument(
- '--device',
- type=str,
- default='GPU',
- choices=['CPU', 'GPU'],
- help='The device type. (default: %(default)s)')
- parser.add_argument(
- '--model_path',
- type=str,
- default='model/params_pass_0',
- help='A path to the model. (default: %(default)s)')
- parser.add_argument(
- '--test_data_dir',
- type=str,
- default='data/test_files',
- help='A directory with test data files. (default: %(default)s)')
- parser.add_argument(
- '--test_label_file',
- type=str,
- default='data/label_dict',
- help='A file with test labels. (default: %(default)s)')
- parser.add_argument(
- '--num_passes', type=int, default=1, help='The number of passes.')
- parser.add_argument(
- '--skip_pass_num',
- type=int,
- default=0,
-        help='The number of initial passes to skip when computing statistics.')
- parser.add_argument(
- '--profile', action='store_true', help='If set, do profiling.')
- args = parser.parse_args()
- return args
-
-
-def print_arguments(args):
- print('----------- Configuration Arguments -----------')
- for arg, value in sorted(vars(args).iteritems()):
- print('%s: %s' % (arg, value))
- print('------------------------------------------------')
-
-
-def load_reverse_dict(dict_path):
- return dict((idx, line.strip().split("\t")[0])
- for idx, line in enumerate(open(dict_path, "r").readlines()))
-
-def to_lodtensor(data, place):
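-    # Flatten a batch of variable-length sequences into one LoDTensor; the
-    # LoD index stores cumulative lengths, e.g. [3, 2, 4] -> [0, 3, 5, 9].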
- seq_lens = [len(seq) for seq in data]
- cur_len = 0
- lod = [cur_len]
- for l in seq_lens:
- cur_len += l
- lod.append(cur_len)
- flattened_data = np.concatenate(data, axis=0).astype("int64")
- flattened_data = flattened_data.reshape([len(flattened_data), 1])
- res = fluid.LoDTensor()
- res.set(flattened_data, place)
- res.set_lod([lod])
- return res
-
-
-
-def infer(args):
- word = fluid.layers.data(name='word', shape=[1], dtype='int64', lod_level=1)
- mention = fluid.layers.data(
- name='mention', shape=[1], dtype='int64', lod_level=1)
- target = fluid.layers.data(
- name='target', shape=[1], dtype='int64', lod_level=1)
-
- label_reverse_dict = load_reverse_dict(args.test_label_file)
-
- test_data = paddle.batch(
- reader.file_reader(args.test_data_dir), batch_size=args.batch_size)
- place = fluid.CUDAPlace(0) if args.device == 'GPU' else fluid.CPUPlace()
- feeder = fluid.DataFeeder(feed_list=[word, mention, target], place=place)
- exe = fluid.Executor(place)
-
- inference_scope = fluid.core.Scope()
- with fluid.scope_guard(inference_scope):
- [inference_program, feed_target_names,
- fetch_targets] = fluid.io.load_inference_model(args.model_path, exe)
- total_passes = args.num_passes + args.skip_pass_num
- batch_times = [0] * total_passes
- word_counts = [0] * total_passes
- wpses = [0] * total_passes
- all_iters = 0
- for pass_id in range(total_passes):
- if pass_id < args.skip_pass_num:
- print("Warm-up pass")
- if pass_id == args.skip_pass_num:
- profiler.reset_profiler()
- iters = 0
- for data in test_data():
- word = to_lodtensor(map(lambda x: x[0], data), place)
- mention = to_lodtensor(map(lambda x: x[1], data), place)
-
- start = time.time()
- crf_decode = exe.run(inference_program,
- feed={"word": word,
- "mention": mention},
- fetch_list=fetch_targets,
- return_numpy=False)
- batch_time = time.time() - start
- lod_info = (crf_decode[0].lod())[0]
- np_data = np.array(crf_decode[0])
- word_count = 0
- assert len(data) == len(lod_info) - 1
- for sen_index in xrange(len(data)):
- assert len(data[sen_index][0]) == lod_info[
- sen_index + 1] - lod_info[sen_index]
- word_index = 0
- for tag_index in xrange(lod_info[sen_index],
- lod_info[sen_index + 1]):
- word = str(data[sen_index][0][word_index])
- gold_tag = label_reverse_dict[data[sen_index][2][
- word_index]]
- tag = label_reverse_dict[np_data[tag_index][0]]
- word_index += 1
- word_count += word_index
- batch_times[pass_id] += batch_time
- word_counts[pass_id] += word_count
- iters += 1
- all_iters += 1
- batch_times[pass_id] /= iters
- word_counts[pass_id] /= iters
- wps = word_counts[pass_id] / batch_times[pass_id]
- wpses[pass_id] = wps
-
- print(
- "Pass: %d, iterations (total): %d (%d), latency: %.5f s, words: %d, wps: %f"
- % (pass_id, iters, all_iters, batch_times[pass_id],
- word_counts[pass_id], wps))
-
- # Postprocess benchmark data
- latencies = batch_times[args.skip_pass_num:]
- latency_avg = np.average(latencies)
- latency_std = np.std(latencies)
- latency_pc99 = np.percentile(latencies, 99)
- wps_avg = np.average(wpses)
- wps_std = np.std(wpses)
- wps_pc01 = np.percentile(wpses, 1)
-
- # Benchmark output
- print('\nTotal passes (incl. warm-up): %d' % (total_passes))
- print('Total iterations (incl. warm-up): %d' % (all_iters))
- print('Total examples (incl. warm-up): %d' % (all_iters * args.batch_size))
- print('avg latency: %.5f, std latency: %.5f, 99pc latency: %.5f' %
- (latency_avg, latency_std, latency_pc99))
- print('avg wps: %.5f, std wps: %.5f, wps for 99pc latency: %.5f' %
- (wps_avg, wps_std, wps_pc01))
-
-
-if __name__ == "__main__":
- args = parse_args()
- print_arguments(args)
- if args.profile:
- if args.device == 'GPU':
- with profiler.cuda_profiler("cuda_profiler.txt", 'csv') as nvprof:
- infer(args)
- else:
- with profiler.profiler('CPU', sorted_key='total') as cpuprof:
- infer(args)
- else:
- infer(args)
diff --git a/fluid/PaddleNLP/chinese_ner/train.py b/fluid/PaddleNLP/chinese_ner/train.py
deleted file mode 100644
index 7e59d2ed0793ae9499fc2a6618e762a9ac426800..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/chinese_ner/train.py
+++ /dev/null
@@ -1,356 +0,0 @@
-import os
-import math
-import time
-import argparse
-
-import numpy as np
-import paddle
-import paddle.fluid as fluid
-from paddle.fluid.initializer import NormalInitializer
-
-import reader
-
-
-def parse_args():
-    parser = argparse.ArgumentParser("Run training.")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=256,
- help='The size of a batch. (default: %(default)d)')
- parser.add_argument(
- '--word_dict_len',
- type=int,
- default=1942563,
-        help='The length of the word dictionary. (default: %(default)d)')
- parser.add_argument(
- '--label_dict_len',
- type=int,
- default=49,
-        help='The length of the label dictionary. (default: %(default)d)')
- parser.add_argument(
- '--device',
- type=str,
- default='GPU',
- choices=['CPU', 'GPU'],
- help='The device type. (default: %(default)s)')
- parser.add_argument(
- '--train_data_dir',
- type=str,
- default='data/train_files',
- help='A directory with train data files. (default: %(default)s)')
- parser.add_argument(
- '--parallel',
- type=bool,
- default=False,
- help="Whether to use parallel training. (default: %(default)s)")
- parser.add_argument(
- '--test_data_dir',
- type=str,
- default='data/test_files',
- help='A directory with test data files. (default: %(default)s)')
- parser.add_argument(
- '--model_save_dir',
- type=str,
- default='./output',
- help='A directory for saving models. (default: %(default)s)')
- parser.add_argument(
- '--num_passes',
- type=int,
- default=1000,
- help='The number of epochs. (default: %(default)d)')
- args = parser.parse_args()
- return args
-
-
-def print_arguments(args):
- print('----------- Configuration Arguments -----------')
- for arg, value in sorted(vars(args).iteritems()):
- print('%s: %s' % (arg, value))
- print('------------------------------------------------')
-
-
-def load_reverse_dict(dict_path):
- return dict((idx, line.strip().split("\t")[0])
- for idx, line in enumerate(open(dict_path, "r").readlines()))
-
-
-def to_lodtensor(data, place):
- seq_lens = [len(seq) for seq in data]
- cur_len = 0
- lod = [cur_len]
- for l in seq_lens:
- cur_len += l
- lod.append(cur_len)
- flattened_data = np.concatenate(data, axis=0).astype("int64")
- flattened_data = flattened_data.reshape([len(flattened_data), 1])
- res = fluid.LoDTensor()
- res.set(flattened_data, place)
- res.set_lod([lod])
- return res
-
-
-def ner_net(word_dict_len, label_dict_len):
- IS_SPARSE = False
- word_dim = 32
- mention_dict_len = 57
- mention_dim = 20
- grnn_hidden = 36
- emb_lr = 5
- init_bound = 0.1
-
-    def _net_conf(word, mention, target):
- word_embedding = fluid.layers.embedding(
- input=word,
- size=[word_dict_len, word_dim],
- dtype='float32',
- is_sparse=IS_SPARSE,
- param_attr=fluid.ParamAttr(
- learning_rate=emb_lr,
- name="word_emb",
- initializer=fluid.initializer.Uniform(
- low=-init_bound, high=init_bound)))
-
- mention_embedding = fluid.layers.embedding(
- input=mention,
- size=[mention_dict_len, mention_dim],
- dtype='float32',
- is_sparse=IS_SPARSE,
- param_attr=fluid.ParamAttr(
- learning_rate=emb_lr,
- name="mention_emb",
- initializer=fluid.initializer.Uniform(
- low=-init_bound, high=init_bound)))
-
- word_embedding_r = fluid.layers.embedding(
- input=word,
- size=[word_dict_len, word_dim],
- dtype='float32',
- is_sparse=IS_SPARSE,
- param_attr=fluid.ParamAttr(
- learning_rate=emb_lr,
- name="word_emb_r",
- initializer=fluid.initializer.Uniform(
- low=-init_bound, high=init_bound)))
-
- mention_embedding_r = fluid.layers.embedding(
- input=mention,
- size=[mention_dict_len, mention_dim],
- dtype='float32',
- is_sparse=IS_SPARSE,
- param_attr=fluid.ParamAttr(
- learning_rate=emb_lr,
- name="mention_emb_r",
- initializer=fluid.initializer.Uniform(
- low=-init_bound, high=init_bound)))
-
- word_mention_vector = fluid.layers.concat(
- input=[word_embedding, mention_embedding], axis=1)
-
- word_mention_vector_r = fluid.layers.concat(
- input=[word_embedding_r, mention_embedding_r], axis=1)
-
- pre_gru = fluid.layers.fc(
- input=word_mention_vector,
- size=grnn_hidden * 3,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=-init_bound, high=init_bound),
- regularizer=fluid.regularizer.L2DecayRegularizer(
- regularization_coeff=1e-4)))
- gru = fluid.layers.dynamic_gru(
- input=pre_gru,
- size=grnn_hidden,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=-init_bound, high=init_bound),
- regularizer=fluid.regularizer.L2DecayRegularizer(
- regularization_coeff=1e-4)))
-
- pre_gru_r = fluid.layers.fc(
- input=word_mention_vector_r,
- size=grnn_hidden * 3,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=-init_bound, high=init_bound),
- regularizer=fluid.regularizer.L2DecayRegularizer(
- regularization_coeff=1e-4)))
- gru_r = fluid.layers.dynamic_gru(
- input=pre_gru_r,
- size=grnn_hidden,
- is_reverse=True,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=-init_bound, high=init_bound),
- regularizer=fluid.regularizer.L2DecayRegularizer(
- regularization_coeff=1e-4)))
-
- gru_merged = fluid.layers.concat(input=[gru, gru_r], axis=1)
-
- emission = fluid.layers.fc(
- size=label_dict_len,
- input=gru_merged,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=-init_bound, high=init_bound),
- regularizer=fluid.regularizer.L2DecayRegularizer(
- regularization_coeff=1e-4)))
-
- crf_cost = fluid.layers.linear_chain_crf(
- input=emission,
- label=target,
- param_attr=fluid.ParamAttr(
- name='crfw',
- learning_rate=0.2, ))
- avg_cost = fluid.layers.mean(x=crf_cost)
- return avg_cost, emission
-
- word = fluid.layers.data(name='word', shape=[1], dtype='int64', lod_level=1)
- mention = fluid.layers.data(
- name='mention', shape=[1], dtype='int64', lod_level=1)
- target = fluid.layers.data(
- name="target", shape=[1], dtype='int64', lod_level=1)
-
- avg_cost, emission = _net_conf(word, mention, target)
-
- return avg_cost, emission, word, mention, target
-
-
-def test2(exe, chunk_evaluator, inference_program, test_data, place,
- cur_fetch_list):
- chunk_evaluator.reset()
- for data in test_data():
- word = to_lodtensor(map(lambda x: x[0], data), place)
- mention = to_lodtensor(map(lambda x: x[1], data), place)
- target = to_lodtensor(map(lambda x: x[2], data), place)
- result_list = exe.run(
- inference_program,
- feed={"word": word,
- "mention": mention,
- "target": target},
- fetch_list=cur_fetch_list)
- number_infer = np.array(result_list[0])
- number_label = np.array(result_list[1])
- number_correct = np.array(result_list[2])
- chunk_evaluator.update(number_infer[0], number_label[0],
- number_correct[0])
- return chunk_evaluator.eval()
-
-
-def test(test_exe, chunk_evaluator, inference_program, test_data, place,
- cur_fetch_list):
- chunk_evaluator.reset()
- for data in test_data():
- word = to_lodtensor(map(lambda x: x[0], data), place)
- mention = to_lodtensor(map(lambda x: x[1], data), place)
- target = to_lodtensor(map(lambda x: x[2], data), place)
- result_list = test_exe.run(
- fetch_list=cur_fetch_list,
- feed={"word": word,
- "mention": mention,
- "target": target})
- number_infer = np.array(result_list[0])
- number_label = np.array(result_list[1])
- number_correct = np.array(result_list[2])
- chunk_evaluator.update(number_infer.sum(),
- number_label.sum(), number_correct.sum())
- return chunk_evaluator.eval()
-
-
-def main(args):
- if not os.path.exists(args.model_save_dir):
- os.makedirs(args.model_save_dir)
-
- main = fluid.Program()
- startup = fluid.Program()
- with fluid.program_guard(main, startup):
- avg_cost, feature_out, word, mention, target = ner_net(
- args.word_dict_len, args.label_dict_len)
-
- crf_decode = fluid.layers.crf_decoding(
- input=feature_out, param_attr=fluid.ParamAttr(name='crfw'))
-
- inference_program = fluid.default_main_program().clone(for_test=True)
-
- sgd_optimizer = fluid.optimizer.SGD(learning_rate=1e-3)
- sgd_optimizer.minimize(avg_cost)
-
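-        # under the IOB scheme each chunk type contributes a B- and an I- tag,
-        # so excluding the single O tag there are (label_dict_len - 1) / 2 types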
- (precision, recall, f1_score, num_infer_chunks, num_label_chunks,
- num_correct_chunks) = fluid.layers.chunk_eval(
- input=crf_decode,
- label=target,
- chunk_scheme="IOB",
- num_chunk_types=int(math.ceil((args.label_dict_len - 1) / 2.0)))
-
- chunk_evaluator = fluid.metrics.ChunkEvaluator()
-
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.file_reader(args.train_data_dir), buf_size=2000000),
- batch_size=args.batch_size)
- test_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.file_reader(args.test_data_dir), buf_size=2000000),
- batch_size=args.batch_size)
-
- place = fluid.CUDAPlace(0) if args.device == 'GPU' else fluid.CPUPlace()
- feeder = fluid.DataFeeder(
- feed_list=[word, mention, target], place=place)
-
- exe = fluid.Executor(place)
-
- exe.run(startup)
- if args.parallel:
- train_exe = fluid.ParallelExecutor(
- loss_name=avg_cost.name, use_cuda=(args.device == 'GPU'))
- test_exe = fluid.ParallelExecutor(
- use_cuda=(args.device == 'GPU'),
- main_program=inference_program,
- share_vars_from=train_exe)
- else:
- train_exe = exe
- test_exe = exe
-
- batch_id = 0
- for pass_id in xrange(args.num_passes):
- chunk_evaluator.reset()
- train_reader_iter = train_reader()
- start_time = time.time()
- while True:
- try:
- cur_batch = next(train_reader_iter)
- cost, nums_infer, nums_label, nums_correct = train_exe.run(
- fetch_list=[
- avg_cost.name, num_infer_chunks.name,
- num_label_chunks.name, num_correct_chunks.name
- ],
- feed=feeder.feed(cur_batch))
- chunk_evaluator.update(
- np.array(nums_infer).sum(),
- np.array(nums_label).sum(),
- np.array(nums_correct).sum())
- cost_list = np.array(cost)
- batch_id += 1
- except StopIteration:
- break
- end_time = time.time()
- print("pass_id:" + str(pass_id) + ", time_cost:" + str(
- end_time - start_time) + "s")
- precision, recall, f1_score = chunk_evaluator.eval()
- print("[Train] precision:" + str(precision) + ", recall:" + str(
- recall) + ", f1:" + str(f1_score))
- p, r, f1 = test2(
- exe, chunk_evaluator, inference_program, test_reader, place,
- [num_infer_chunks, num_label_chunks, num_correct_chunks])
- print("[Test] precision:" + str(p) + ", recall:" + str(r) + ", f1:"
- + str(f1))
- save_dirname = os.path.join(args.model_save_dir,
- "params_pass_%d" % pass_id)
- fluid.io.save_inference_model(save_dirname, ['word', 'mention'],
- [crf_decode], exe)
-
-
-if __name__ == "__main__":
- args = parse_args()
- print_arguments(args)
- main(args)
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/README.md b/fluid/PaddleNLP/deep_attention_matching_net/README.md
index 55383fa378463f1bb5aa8bdb5c844bfc6c5b1b54..5208812ce239a901d6acef714cfaed56ccfff628 100644
--- a/fluid/PaddleNLP/deep_attention_matching_net/README.md
+++ b/fluid/PaddleNLP/deep_attention_matching_net/README.md
@@ -1,91 +1,6 @@
-# __Deep Attention Matching Network__
-This is the source code of the Deep Attention Matching network (DAM), proposed for multi-turn response selection in retrieval-based chatbots.
+Hi!
-DAM is a neural matching network based entirely on the attention mechanism. The motivation of DAM is to capture semantic dependencies among dialogue elements at different granularities in multi-turn conversation as matching evidence, in order to better match a response candidate with its multi-turn context. DAM appeared at ACL 2018; please find our paper at [http://aclweb.org/anthology/P18-1103](http://aclweb.org/anthology/P18-1103).
+This directory has been deprecated.
-## __TensorFlow Version__
-
-DAM was originally implemented in TensorFlow, which can be found at: [https://github.com/baidu/Dialogue/DAM](https://github.com/baidu/Dialogue/DAM) (in progress). We highly recommend using the PaddlePaddle Fluid version here, as it supports parallel training with very large corpora.
-
-
-## __Network__
-
-DAM is inspired by the Transformer in machine translation (Vaswani et al., 2017); we extend the key attention mechanism of the Transformer in two directions and introduce both kinds of attention in one uniform neural network.
-
-- **self-attention**: gradually captures semantic representations at different granularities by stacking attention over word-level embeddings. These multi-grained semantic representations facilitate exploring segment-level dependencies between context and response.
-
-- **cross-attention**: attention across context and response generally captures the relevance of dependencies between segment pairs, providing information complementary to textual relevance for matching a response with its multi-turn context.
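-
-Both kinds of attention share the same scaled dot-product core; only the sources of the queries, keys, and values differ. A rough NumPy sketch of ours (not the repo's model.py):
-
-```python
-import numpy as np
-
-def attention(Q, K, V):
-    # softmax(Q K^T / sqrt(d)) V -- the attentive module stacked in DAM
-    d = Q.shape[-1]
-    scores = Q @ K.T / np.sqrt(d)
-    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
-    w /= w.sum(axis=-1, keepdims=True)
-    return w @ V
-
-# self-attention: Q, K, V all come from the same utterance representation
-# cross-attention: Q comes from the response, K and V from a context turn
-```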
-
-
-
-Overview of Deep Attention Matching Network
-
-
-## __Results__
-
-We test DAM on two large-scale multi-turn response selection tasks, i.e., the Ubuntu Corpus v1 and the Douban Conversation Corpus; experimental results are below:
-
-
-
-
-
-## __Usage__
-
-Take the experiment on the Ubuntu Corpus v1 as an example.
-
-1) Go to the `ubuntu` directory
-
-```
-cd ubuntu
-```
-2) Download the well-preprocessed data for training
-
-```
-sh download_data.sh
-```
-3) Execute the model training and evaluation by
-
-```
-sh train.sh
-```
-For a more detailed explanation of the arguments, please run
-
-```
-python ../train_and_evaluate.py --help
-```
-
-By default, training runs on a single GPU, which can easily be switched to multi-GPU mode by resetting the visible devices in `train.sh`, e.g.,
-
-```
-export CUDA_VISIBLE_DEVICES=0,1,2,3
-```
-
-4) Run test by
-
-```
-sh test.sh
-```
-and run the test for different saved models by using different argument `--model_path`.
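-
-For example (`step_3000` is a hypothetical checkpoint name; `--model_path` and `--use_cuda` are existing flags of test_and_evaluate.py):
-
-```
-python ../test_and_evaluate.py --model_path saved_models/step_3000 --use_cuda
-```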
-
-Similarly, one can carry out the experiment on the Douban Conversation Corpus by going to the directory `douban` and following the same procedure.
-
-## __Dependencies__
-
-- Python >= 2.7.3
-- PaddlePaddle latest develop branch
-
-## __Citation__
-
-The following article describes DAM in detail. We recommend citing it by default.
-
-```
-@inproceedings{ ,
- title={Multi-Turn Response Selection for Chatbots with Deep Attention Matching Network},
- author={Xiangyang Zhou, Lu Li, Daxiang Dong, Yi Liu, Ying Chen, Wayne Xin Zhao, Dianhai Yu and Hua Wu},
- booktitle={Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
- volume={1},
- pages={ -- },
- year={2018}
-}
-```
+Please visit the project at [PaddleNLP/deep_attention_matching_net](../../../PaddleNLP/deep_attention_matching_net).
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/_ce.py b/fluid/PaddleNLP/deep_attention_matching_net/_ce.py
deleted file mode 100644
index 0c38c0a3d1b0fc0a240a7bae928d9c07f8b95886..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/deep_attention_matching_net/_ce.py
+++ /dev/null
@@ -1,46 +0,0 @@
-#### This file is only used for the continuous evaluation test!
-
-import os
-import sys
-sys.path.append(os.environ['ceroot'])
-from kpi import CostKpi, DurationKpi, AccKpi
-
-#### NOTE: kpi.py should be shared across models in some way!
-
-train_cost_kpi = CostKpi('train_cost', 0.02, actived=True)
-train_duration_kpi = DurationKpi('train_duration', 0.05, actived=True)
-
-tracking_kpis = [
- train_cost_kpi,
- train_duration_kpi,
-]
-
-
-def parse_log(log):
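-    # KPI lines are tab-separated triples of the form "kpis\t<name>\t<value>"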
- for line in log.split('\n'):
- fs = line.strip().split('\t')
- print(fs)
- if len(fs) == 3 and fs[0] == 'kpis':
- print("-----%s" % fs)
- kpi_name = fs[1]
- kpi_value = float(fs[2])
- yield kpi_name, kpi_value
-
-
-def log_to_ce(log):
- kpi_tracker = {}
- for kpi in tracking_kpis:
- kpi_tracker[kpi.name] = kpi
-
- for (kpi_name, kpi_value) in parse_log(log):
- print(kpi_name, kpi_value)
- kpi_tracker[kpi_name].add_record(kpi_value)
- kpi_tracker[kpi_name].persist()
-
-
-if __name__ == '__main__':
- log = sys.stdin.read()
- print("*****")
- print(log)
- print("****")
- log_to_ce(log)
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/test_and_evaluate.py b/fluid/PaddleNLP/deep_attention_matching_net/test_and_evaluate.py
deleted file mode 100644
index 998914d1fdcdb4bf3c442fb5276274ebf0aae038..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/deep_attention_matching_net/test_and_evaluate.py
+++ /dev/null
@@ -1,226 +0,0 @@
-import os
-import six
-import numpy as np
-import time
-import argparse
-import multiprocessing
-import paddle
-import paddle.fluid as fluid
-import utils.reader as reader
-from utils.util import print_arguments, mkdir
-
-try:
- import cPickle as pickle #python 2
-except ImportError as e:
- import pickle #python 3
-
-from model import Net
-
-
-#yapf: disable
-def parse_args():
- parser = argparse.ArgumentParser("Test for DAM.")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=256,
- help='Batch size for training. (default: %(default)d)')
- parser.add_argument(
- '--num_scan_data',
- type=int,
- default=2,
-        help='Number of passes for training. (default: %(default)d)')
- parser.add_argument(
- '--learning_rate',
- type=float,
- default=1e-3,
- help='Learning rate used to train. (default: %(default)f)')
- parser.add_argument(
- '--data_path',
- type=str,
- default="data/ubuntu/data_small.pkl",
- help='Path to training data. (default: %(default)s)')
- parser.add_argument(
- '--save_path',
- type=str,
- default="./",
- help='Path to save score and result files. (default: %(default)s)')
- parser.add_argument(
- '--model_path',
- type=str,
- default="saved_models/step_1000",
- help='Path to load well-trained models. (default: %(default)s)')
- parser.add_argument(
- '--use_cuda',
- action='store_true',
- help='If set, use cuda for training.')
- parser.add_argument(
- '--ext_eval',
- action='store_true',
-        help='If set, use MAP, MRR, etc. for evaluation.')
- parser.add_argument(
- '--max_turn_num',
- type=int,
- default=9,
- help='Maximum number of utterances in context.')
- parser.add_argument(
- '--max_turn_len',
- type=int,
- default=50,
-        help='Maximum length of sentences in turns.')
- parser.add_argument(
- '--word_emb_init',
- type=str,
- default=None,
- help='Path to the initial word embedding.')
- parser.add_argument(
- '--vocab_size',
- type=int,
- default=434512,
- help='The size of vocabulary.')
- parser.add_argument(
- '--emb_size',
- type=int,
- default=200,
- help='The dimension of word embedding.')
- parser.add_argument(
- '--_EOS_',
- type=int,
- default=28270,
- help='The id for end of sentence in vocabulary.')
- parser.add_argument(
- '--stack_num',
- type=int,
- default=5,
- help='The number of stacked attentive modules in network.')
- parser.add_argument(
- '--channel1_num',
- type=int,
- default=32,
- help="The channels' number of the 1st conv3d layer's output.")
- parser.add_argument(
- '--channel2_num',
- type=int,
- default=16,
- help="The channels' number of the 2nd conv3d layer's output.")
- args = parser.parse_args()
- return args
-
-
-#yapf: enable
-
-
-def test(args):
- if not os.path.exists(args.save_path):
- mkdir(args.save_path)
- if not os.path.exists(args.model_path):
- raise ValueError("Invalid model init path %s" % args.model_path)
-    # data config
- data_conf = {
- "batch_size": args.batch_size,
- "max_turn_num": args.max_turn_num,
- "max_turn_len": args.max_turn_len,
- "_EOS_": args._EOS_,
- }
-
- dam = Net(args.max_turn_num, args.max_turn_len, args.vocab_size,
- args.emb_size, args.stack_num, args.channel1_num,
- args.channel2_num)
- dam.create_data_layers()
- loss, logits = dam.create_network()
-
- loss.persistable = True
-
- # gradient clipping
- fluid.clip.set_gradient_clip(clip=fluid.clip.GradientClipByValue(
- max=1.0, min=-1.0))
-
- test_program = fluid.default_main_program().clone(for_test=True)
-
- optimizer = fluid.optimizer.Adam(
- learning_rate=fluid.layers.exponential_decay(
- learning_rate=args.learning_rate,
- decay_steps=400,
- decay_rate=0.9,
- staircase=True))
- optimizer.minimize(loss)
-
-    # The fetched loss is wrong when memory optimization is enabled
- fluid.memory_optimize(fluid.default_main_program())
-
- if args.use_cuda:
- place = fluid.CUDAPlace(0)
- dev_count = fluid.core.get_cuda_device_count()
- else:
- place = fluid.CPUPlace()
- dev_count = multiprocessing.cpu_count()
-
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
- fluid.io.load_persistables(exe, args.model_path)
-
- test_exe = fluid.ParallelExecutor(
- use_cuda=args.use_cuda, main_program=test_program)
-
- print("start loading data ...")
- with open(args.data_path, 'rb') as f:
- if six.PY2:
- train_data, val_data, test_data = pickle.load(f)
- else:
- train_data, val_data, test_data = pickle.load(f, encoding="bytes")
- print("finish loading data ...")
-
- if args.ext_eval:
- import utils.douban_evaluation as eva
- else:
- import utils.evaluation as eva
-
- test_batches = reader.build_batches(test_data, data_conf)
-
- test_batch_num = len(test_batches["response"])
-
- print("test batch num: %d" % test_batch_num)
-
- print("begin inference ...")
- print(time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(time.time())))
-
- score_path = os.path.join(args.save_path, 'score.txt')
- score_file = open(score_path, 'w')
-
- for it in six.moves.xrange(test_batch_num // dev_count):
- feed_list = []
- for dev in six.moves.xrange(dev_count):
- index = it * dev_count + dev
- batch_data = reader.make_one_batch_input(test_batches, index)
- feed_dict = dict(zip(dam.get_feed_names(), batch_data))
- feed_list.append(feed_dict)
-
- predicts = test_exe.run(feed=feed_list, fetch_list=[logits.name])
-
- scores = np.array(predicts[0])
- print("step = %d" % it)
-
- for dev in six.moves.xrange(dev_count):
- index = it * dev_count + dev
- for i in six.moves.xrange(args.batch_size):
- score_file.write(
- str(scores[args.batch_size * dev + i][0]) + '\t' + str(
- test_batches["label"][index][i]) + '\n')
-
- score_file.close()
-
- #write evaluation result
- result = eva.evaluate(score_path)
- result_file_path = os.path.join(args.save_path, 'result.txt')
- with open(result_file_path, 'w') as out_file:
- for p_at in result:
- out_file.write(str(p_at) + '\n')
- print('finish test')
- print(time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(time.time())))
-
-
-if __name__ == '__main__':
- args = parse_args()
- print_arguments(args)
- test(args)
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/train_and_evaluate.py b/fluid/PaddleNLP/deep_attention_matching_net/train_and_evaluate.py
deleted file mode 100644
index f240615b59376e8d86ce2ebaddd8eae8ee15fe30..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/deep_attention_matching_net/train_and_evaluate.py
+++ /dev/null
@@ -1,402 +0,0 @@
-import os
-import six
-import numpy as np
-import time
-import argparse
-import multiprocessing
-import paddle
-import paddle.fluid as fluid
-import utils.reader as reader
-from utils.util import print_arguments, mkdir
-
-try:
- import cPickle as pickle #python 2
-except ImportError as e:
- import pickle #python 3
-
-from model import Net
-
-
-#yapf: disable
-def parse_args():
- parser = argparse.ArgumentParser("Training DAM.")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=256,
- help='Batch size for training. (default: %(default)d)')
- parser.add_argument(
- '--num_scan_data',
- type=int,
- default=2,
-        help='Number of passes for training. (default: %(default)d)')
- parser.add_argument(
- '--learning_rate',
- type=float,
- default=1e-3,
- help='Learning rate used to train. (default: %(default)f)')
- parser.add_argument(
- '--data_path',
- type=str,
- default="data/data_small.pkl",
- help='Path to training data. (default: %(default)s)')
- parser.add_argument(
- '--save_path',
- type=str,
- default="saved_models",
- help='Path to save trained models. (default: %(default)s)')
- parser.add_argument(
- '--use_cuda',
- action='store_true',
- help='If set, use cuda for training.')
- parser.add_argument(
- '--use_pyreader',
- action='store_true',
- help='If set, use pyreader for reading data.')
- parser.add_argument(
- '--ext_eval',
- action='store_true',
-        help='If set, use MAP, MRR, etc. for evaluation.')
- parser.add_argument(
- '--max_turn_num',
- type=int,
- default=9,
- help='Maximum number of utterances in context.')
- parser.add_argument(
- '--max_turn_len',
- type=int,
- default=50,
-        help='Maximum length of sentences in turns.')
- parser.add_argument(
- '--word_emb_init',
- type=str,
- default=None,
- help='Path to the initial word embedding.')
- parser.add_argument(
- '--vocab_size',
- type=int,
- default=434512,
- help='The size of vocabulary.')
- parser.add_argument(
- '--emb_size',
- type=int,
- default=200,
- help='The dimension of word embedding.')
- parser.add_argument(
- '--_EOS_',
- type=int,
- default=28270,
- help='The id for the end of sentence in vocabulary.')
- parser.add_argument(
- '--stack_num',
- type=int,
- default=5,
- help='The number of stacked attentive modules in network.')
- parser.add_argument(
- '--channel1_num',
- type=int,
- default=32,
- help="The channels' number of the 1st conv3d layer's output.")
- parser.add_argument(
- '--channel2_num',
- type=int,
- default=16,
- help="The channels' number of the 2nd conv3d layer's output.")
- args = parser.parse_args()
- return args
-
-
-#yapf: enable
-
-
-def evaluate(score_path, result_file_path):
- if args.ext_eval:
- import utils.douban_evaluation as eva
- else:
- import utils.evaluation as eva
- #write evaluation result
- result = eva.evaluate(score_path)
- with open(result_file_path, 'w') as out_file:
- for p_at in result:
- out_file.write(str(p_at) + '\n')
- print('finish evaluation')
- print(time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(time.time())))
-
-
-def test_with_feed(exe, program, feed_names, fetch_list, score_path, batches,
- batch_num, dev_count):
- score_file = open(score_path, 'w')
- for it in six.moves.xrange(batch_num // dev_count):
- feed_list = []
- for dev in six.moves.xrange(dev_count):
- val_index = it * dev_count + dev
- batch_data = reader.make_one_batch_input(batches, val_index)
- feed_dict = dict(zip(feed_names, batch_data))
- feed_list.append(feed_dict)
-
- predicts = exe.run(feed=feed_list, fetch_list=fetch_list)
-
- scores = np.array(predicts[0])
- for dev in six.moves.xrange(dev_count):
- val_index = it * dev_count + dev
- for i in six.moves.xrange(args.batch_size):
- score_file.write(
- str(scores[args.batch_size * dev + i][0]) + '\t' + str(
- batches["label"][val_index][i]) + '\n')
- score_file.close()
-
-
-def test_with_pyreader(exe, program, pyreader, fetch_list, score_path, batches,
- batch_num, dev_count):
- def data_provider():
- for index in six.moves.xrange(batch_num):
- yield reader.make_one_batch_input(batches, index)
-
- score_file = open(score_path, 'w')
- pyreader.decorate_tensor_provider(data_provider)
- it = 0
- pyreader.start()
- while True:
- try:
- predicts = exe.run(fetch_list=fetch_list)
-
- scores = np.array(predicts[0])
- for dev in six.moves.xrange(dev_count):
- val_index = it * dev_count + dev
- for i in six.moves.xrange(args.batch_size):
- score_file.write(
- str(scores[args.batch_size * dev + i][0]) + '\t' + str(
- batches["label"][val_index][i]) + '\n')
- it += 1
- except fluid.core.EOFException:
- pyreader.reset()
- break
- score_file.close()
-
-
-def train(args):
- if not os.path.exists(args.save_path):
- os.makedirs(args.save_path)
-
-    # data config
- data_conf = {
- "batch_size": args.batch_size,
- "max_turn_num": args.max_turn_num,
- "max_turn_len": args.max_turn_len,
- "_EOS_": args._EOS_,
- }
-
- dam = Net(args.max_turn_num, args.max_turn_len, args.vocab_size,
- args.emb_size, args.stack_num, args.channel1_num,
- args.channel2_num)
-
- train_program = fluid.Program()
- train_startup = fluid.Program()
- if "CE_MODE_X" in os.environ:
- train_program.random_seed = 110
- train_startup.random_seed = 110
- with fluid.program_guard(train_program, train_startup):
- with fluid.unique_name.guard():
- if args.use_pyreader:
- train_pyreader = dam.create_py_reader(
- capacity=10, name='train_reader')
- else:
- dam.create_data_layers()
- loss, logits = dam.create_network()
- loss.persistable = True
- logits.persistable = True
- # gradient clipping
- fluid.clip.set_gradient_clip(clip=fluid.clip.GradientClipByValue(
- max=1.0, min=-1.0))
-
- optimizer = fluid.optimizer.Adam(
- learning_rate=fluid.layers.exponential_decay(
- learning_rate=args.learning_rate,
- decay_steps=400,
- decay_rate=0.9,
- staircase=True))
- optimizer.minimize(loss)
- print("begin memory optimization ...")
- fluid.memory_optimize(train_program)
- print("end memory optimization ...")
-
- test_program = fluid.Program()
- test_startup = fluid.Program()
- if "CE_MODE_X" in os.environ:
- test_program.random_seed = 110
- test_startup.random_seed = 110
- with fluid.program_guard(test_program, test_startup):
- with fluid.unique_name.guard():
- if args.use_pyreader:
- test_pyreader = dam.create_py_reader(
- capacity=10, name='test_reader')
- else:
- dam.create_data_layers()
-
- loss, logits = dam.create_network()
- loss.persistable = True
- logits.persistable = True
-
- test_program = test_program.clone(for_test=True)
-
- if args.use_cuda:
- place = fluid.CUDAPlace(0)
- dev_count = fluid.core.get_cuda_device_count()
- else:
- place = fluid.CPUPlace()
- dev_count = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
-
- print("device count %d" % dev_count)
- print("theoretical memory usage: ")
- print(fluid.contrib.memory_usage(
- program=train_program, batch_size=args.batch_size))
-
- exe = fluid.Executor(place)
- exe.run(train_startup)
- exe.run(test_startup)
-
- train_exe = fluid.ParallelExecutor(
- use_cuda=args.use_cuda, loss_name=loss.name, main_program=train_program)
-
- test_exe = fluid.ParallelExecutor(
- use_cuda=args.use_cuda,
- main_program=test_program,
- share_vars_from=train_exe)
-
- if args.word_emb_init is not None:
- print("start loading word embedding init ...")
- if six.PY2:
- word_emb = np.array(pickle.load(open(args.word_emb_init,
- 'rb'))).astype('float32')
- else:
- word_emb = np.array(
- pickle.load(
- open(args.word_emb_init, 'rb'), encoding="bytes")).astype(
- 'float32')
- dam.set_word_embedding(word_emb, place)
- print("finish init word embedding ...")
-
- print("start loading data ...")
- with open(args.data_path, 'rb') as f:
- if six.PY2:
- train_data, val_data, test_data = pickle.load(f)
- else:
- train_data, val_data, test_data = pickle.load(f, encoding="bytes")
- print("finish loading data ...")
-
- val_batches = reader.build_batches(val_data, data_conf)
-
- batch_num = len(train_data[six.b('y')]) // args.batch_size
- val_batch_num = len(val_batches["response"])
-
- print_step = max(1, batch_num // (dev_count * 100))
- save_step = max(1, batch_num // (dev_count * 10))
-
- print("begin model training ...")
- print(time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(time.time())))
-
- # train on one epoch data by feeding
- def train_with_feed(step):
- ave_cost = 0.0
- for it in six.moves.xrange(batch_num // dev_count):
- feed_list = []
- for dev in six.moves.xrange(dev_count):
- index = it * dev_count + dev
- batch_data = reader.make_one_batch_input(train_batches, index)
- feed_dict = dict(zip(dam.get_feed_names(), batch_data))
- feed_list.append(feed_dict)
-
- cost = train_exe.run(feed=feed_list, fetch_list=[loss.name])
-
- ave_cost += np.array(cost[0]).mean()
- step = step + 1
- if step % print_step == 0:
- print("processed: [" + str(step * dev_count * 1.0 / batch_num) +
- "] ave loss: [" + str(ave_cost / print_step) + "]")
- ave_cost = 0.0
-
- if (args.save_path is not None) and (step % save_step == 0):
- save_path = os.path.join(args.save_path, "step_" + str(step))
- print("Save model at step %d ... " % step)
- print(time.strftime('%Y-%m-%d %H:%M:%S',
- time.localtime(time.time())))
- fluid.io.save_persistables(exe, save_path, train_program)
-
- score_path = os.path.join(args.save_path, 'score.' + str(step))
- test_with_feed(test_exe, test_program,
- dam.get_feed_names(), [logits.name], score_path,
- val_batches, val_batch_num, dev_count)
-
- result_file_path = os.path.join(args.save_path,
- 'result.' + str(step))
- evaluate(score_path, result_file_path)
- return step, np.array(cost[0]).mean()
-
- # train on one epoch with pyreader
- def train_with_pyreader(step):
- def data_provider():
- for index in six.moves.xrange(batch_num):
- yield reader.make_one_batch_input(train_batches, index)
-
- train_pyreader.decorate_tensor_provider(data_provider)
-
- ave_cost = 0.0
- train_pyreader.start()
- while True:
- try:
- cost = train_exe.run(fetch_list=[loss.name])
-
- ave_cost += np.array(cost[0]).mean()
- step = step + 1
- if step % print_step == 0:
- print("processed: [" + str(step * dev_count * 1.0 /
- batch_num) + "] ave loss: [" +
- str(ave_cost / print_step) + "]")
- ave_cost = 0.0
-
- if (args.save_path is not None) and (step % save_step == 0):
- save_path = os.path.join(args.save_path,
- "step_" + str(step))
- print("Save model at step %d ... " % step)
- print(time.strftime('%Y-%m-%d %H:%M:%S',
- time.localtime(time.time())))
- fluid.io.save_persistables(exe, save_path, train_program)
-
- score_path = os.path.join(args.save_path,
- 'score.' + str(step))
- test_with_pyreader(test_exe, test_program, test_pyreader,
- [logits.name], score_path, val_batches,
- val_batch_num, dev_count)
-
- result_file_path = os.path.join(args.save_path,
- 'result.' + str(step))
- evaluate(score_path, result_file_path)
-
- except fluid.core.EOFException:
- train_pyreader.reset()
- break
- return step, np.array(cost[0]).mean()
-
- # train over different epoches
- global_step, train_time = 0, 0.0
- for epoch in six.moves.xrange(args.num_scan_data):
- shuffle_train = reader.unison_shuffle(
- train_data, seed=110 if ("CE_MODE_X" in os.environ) else None)
- train_batches = reader.build_batches(shuffle_train, data_conf)
-
- begin_time = time.time()
- if args.use_pyreader:
- global_step, last_cost = train_with_pyreader(global_step)
- else:
- global_step, last_cost = train_with_feed(global_step)
- train_time += time.time() - begin_time
- # For internal continuous evaluation
- if "CE_MODE_X" in os.environ:
- print("kpis train_cost %f" % last_cost)
- print("kpis train_duration %f" % train_time)
-
-
-if __name__ == '__main__':
- args = parse_args()
- print_arguments(args)
- train(args)
diff --git a/fluid/PaddleNLP/language_model/gru/README.md b/fluid/PaddleNLP/language_model/gru/README.md
index 91ce2d7f58085b56da2ac2dec03af2a05985ab8f..f508e7061c12d6a7f053fa7c533f8350144d67cc 100644
--- a/fluid/PaddleNLP/language_model/gru/README.md
+++ b/fluid/PaddleNLP/language_model/gru/README.md
@@ -1,148 +1,2 @@
-# Language Model
-Below is a brief directory structure and description for this example:
-
-```text
-.
-├── README.md            # documentation
-├── train.py             # training script
-├── infer.py             # inference script
-└── utils.py             # common utility functions
-```
-
-
-## Introduction
-
-For an introduction to recurrent neural network language models, please refer to the paper [Recurrent Neural Network Regularization](https://arxiv.org/abs/1409.2329). In this example, we implement a GRU-RNN language model.
-
-## Training
-
-Run the command `python train.py` to start training the model.
-```python
-python train.py
-```
-
-The currently supported parameters can be found in the `train_net` function of [train.py](./train.py):
-```python
-vocab, train_reader, test_reader = utils.prepare_data(
- batch_size=20, # batch size
- buffer_size=1000, # buffer size, default value is OK
- word_freq_threshold=0) # vocabulary related parameter, and words with frequency below this value will be filtered
-
-train(train_reader=train_reader,
- vocab=vocab,
- network=network,
- hid_size=200, # embedding and hidden size
- base_lr=1.0, # base learning rate
- batch_size=20, # batch size, the same as that in prepare_data
- pass_num=12, # the number of passes for training
- use_cuda=True, # whether to use GPU card
- parallel=False, # whether to be parallel
- model_dir="model", # directory to save model
- init_low_bound=-0.1, # uniform parameter initialization lower bound
- init_high_bound=0.1) # uniform parameter initialization upper bound
-```
-
-## Customizing the Network Structure
-
-The network structure can be adjusted in the `network` function of [train.py](./train.py); the current structure is as follows:
-```python
-emb = fluid.layers.embedding(input=src, size=[vocab_size, hid_size],
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(low=init_low_bound, high=init_high_bound),
- learning_rate=emb_lr_x),
- is_sparse=True)
-
-fc0 = fluid.layers.fc(input=emb, size=hid_size * 3,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(low=init_low_bound, high=init_high_bound),
- learning_rate=gru_lr_x))
-gru_h0 = fluid.layers.dynamic_gru(input=fc0, size=hid_size,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(low=init_low_bound, high=init_high_bound),
- learning_rate=gru_lr_x))
-
-fc = fluid.layers.fc(input=gru_h0, size=vocab_size, act='softmax',
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(low=init_low_bound, high=init_high_bound),
- learning_rate=fc_lr_x))
-
-cost = fluid.layers.cross_entropy(input=fc, label=dst)
-```
-
-## Sample Training Results
-
-The training log on a single Tesla K40m GPU card is shown below:
-```text
-epoch_1 start
-step:100 ppl:771.053
-step:200 ppl:449.597
-step:300 ppl:642.654
-step:400 ppl:458.128
-step:500 ppl:510.912
-step:600 ppl:451.545
-step:700 ppl:364.404
-step:800 ppl:324.272
-step:900 ppl:360.797
-step:1000 ppl:275.761
-step:1100 ppl:294.599
-step:1200 ppl:335.877
-step:1300 ppl:185.262
-step:1400 ppl:241.744
-step:1500 ppl:211.507
-step:1600 ppl:233.431
-step:1700 ppl:298.767
-step:1800 ppl:203.403
-step:1900 ppl:158.828
-step:2000 ppl:171.148
-step:2100 ppl:280.884
-epoch:1 num_steps:2104 time_cost(s):47.478780
-model saved in model/epoch_1
-epoch_2 start
-step:100 ppl:238.099
-step:200 ppl:136.527
-step:300 ppl:204.184
-step:400 ppl:252.886
-step:500 ppl:177.377
-step:600 ppl:197.688
-step:700 ppl:131.650
-step:800 ppl:223.906
-step:900 ppl:144.785
-step:1000 ppl:176.286
-step:1100 ppl:148.158
-step:1200 ppl:203.581
-step:1300 ppl:168.208
-step:1400 ppl:159.412
-step:1500 ppl:114.032
-step:1600 ppl:157.985
-step:1700 ppl:147.743
-step:1800 ppl:88.676
-step:1900 ppl:141.962
-step:2000 ppl:106.087
-step:2100 ppl:122.709
-epoch:2 num_steps:2104 time_cost(s):47.583789
-model saved in model/epoch_2
-...
-```
-
-## Inference
-Run the command `python infer.py model_dir start_epoch last_epoch(inclusive)` to start inference, where start_epoch specifies the first epoch to evaluate and last_epoch the last one (inclusive), e.g.,
-```python
-python infer.py model 1 12 # prediction from epoch 1 to epoch 12
-```
-
-## Sample Inference Results
-```text
-model:model/epoch_1 ppl:254.540 time_cost(s):3.29
-model:model/epoch_2 ppl:177.671 time_cost(s):3.27
-model:model/epoch_3 ppl:156.251 time_cost(s):3.27
-model:model/epoch_4 ppl:139.036 time_cost(s):3.27
-model:model/epoch_5 ppl:132.661 time_cost(s):3.27
-model:model/epoch_6 ppl:130.092 time_cost(s):3.28
-model:model/epoch_7 ppl:128.751 time_cost(s):3.27
-model:model/epoch_8 ppl:125.411 time_cost(s):3.27
-model:model/epoch_9 ppl:124.604 time_cost(s):3.28
-model:model/epoch_10 ppl:124.754 time_cost(s):3.29
-model:model/epoch_11 ppl:125.421 time_cost(s):3.27
-model:model/epoch_12 ppl:125.676 time_cost(s):3.27
-```
+Hi! This project has been migrated. Please visit it at [PaddleNLP/language_model/gru](../../../../PaddleNLP/language_model/gru).
diff --git a/fluid/PaddleNLP/language_model/lstm/.run_ce.sh b/fluid/PaddleNLP/language_model/lstm/.run_ce.sh
deleted file mode 100644
index 8c192ad62e5b66bc4c7f3150d2e24507662491d8..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/language_model/lstm/.run_ce.sh
+++ /dev/null
@@ -1,11 +0,0 @@
-export CUDA_VISIBLE_DEVICES=0
-cd data
-sh download_data.sh
-cd ..
-
-python train.py \
- --data_path data/simple-examples/data/ \
- --model_type small \
- --use_gpu True \
- --enable_ce | python _ce.py
-
diff --git a/fluid/PaddleNLP/language_model/lstm/README.md b/fluid/PaddleNLP/language_model/lstm/README.md
index f6d1250ff66a066c8634eca9c3f74312f00a7749..9aa66dd7eb87b16ab503342014c4dabdfbe4906d 100644
--- a/fluid/PaddleNLP/language_model/lstm/README.md
+++ b/fluid/PaddleNLP/language_model/lstm/README.md
@@ -1,76 +1,2 @@
-# LSTM Language Model
-Below is a brief directory structure and description for this example:
-
-```text
-.
-├── README.md            # documentation
-├── train.py             # training script
-├── reader.py            # data reading
-└── lm_model.py          # model definition file
-```
-
-
-## Introduction
-
-For an introduction to recurrent neural network language models, please refer to the paper [Recurrent Neural Network Regularization](https://arxiv.org/abs/1409.2329). This example describes an implementation of an LSTM-based language model. The data is the PTB dataset, which can be downloaded from
-http://www.fit.vutbr.cz/~imikolov/rnnlm/simple-examples.tgz
-
-## Data Download
-Users can download and extract the data themselves, or use the script in the data directory:
-
-cd data; sh download_data.sh
-
-## Training
-
-Run the command
-`CUDA_VISIBLE_DEVICES=0 python train.py --data_path data/simple-examples/data/ --model_type small --use_gpu True`
-to start training the model.
-
-model_type specifies the size of the model configuration; small, medium, and large are currently supported.
-
-The implementation uses a two-layer LSTM; for the specific parameters and network configuration, refer to the settings in train.py and lm_model.py.
-
-
-## Sample Training Results
-
-The training log on a P40 (small config) is shown below; the test set is evaluated only after the last epoch finishes:
-```text
-epoch id 0
-ppl 232 865.86505 1.0
-ppl 464 632.76526 1.0
-ppl 696 510.47153 1.0
-ppl 928 437.60617 1.0
-ppl 1160 393.38422 1.0
-ppl 1392 353.05365 1.0
-ppl 1624 325.73267 1.0
-ppl 1856 305.488 1.0
-ppl 2088 286.3128 1.0
-ppl 2320 270.91504 1.0
-train ppl 270.86246
-valid ppl 181.867964379
-...
-ppl 2320 40.975872 0.001953125
-train ppl 40.974102
-valid ppl 117.85741214
-test ppl 113.939103843
-```
-## Comparison with TensorFlow Results
-
-The TensorFlow version used is 1.6:
-```text
-small config
- train valid test
-fluid 1.0 40.962 118.111 112.617
-tf 1.6 40.492 118.329 113.788
-
-medium config
- train valid test
-fluid 1.0 45.620 87.398 83.682
-tf 1.6 45.594 87.363 84.015
-
-large config
- train valid test
-fluid 1.0 37.221 82.358 78.137
-tf 1.6 38.342 82.311 78.121
-```
+Hi! This project has been migrated. Please visit it at [PaddleNLP/language_model/lstm](../../../../PaddleNLP/language_model/lstm).
diff --git a/fluid/PaddleNLP/language_model/lstm/args.py b/fluid/PaddleNLP/language_model/lstm/args.py
deleted file mode 100644
index 498fd9437885238c09e721ee6b182c6d6764398b..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/language_model/lstm/args.py
+++ /dev/null
@@ -1,40 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import argparse
-import distutils.util
-
-
-def parse_args():
- parser = argparse.ArgumentParser(description=__doc__)
- parser.add_argument(
- "--model_type",
- type=str,
- default="small",
- help="model_type [test|small|med|big]")
- parser.add_argument(
- "--data_path", type=str, help="all the data for train,valid,test")
- parser.add_argument('--para_init', action='store_true')
- parser.add_argument(
- '--use_gpu', type=bool, default=False, help='whether using gpu')
- parser.add_argument(
- '--log_path',
- help='path of the log file. If not set, logs are printed to console')
- parser.add_argument('--enable_ce', action='store_true')
- args = parser.parse_args()
- return args
diff --git a/fluid/PaddleNLP/language_model/lstm/lm_model.py b/fluid/PaddleNLP/language_model/lstm/lm_model.py
deleted file mode 100644
index b52b18f9b95ea4654ca35419fb8b4b577e586577..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/language_model/lstm/lm_model.py
+++ /dev/null
@@ -1,285 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import paddle.fluid.layers as layers
-import paddle.fluid as fluid
-from paddle.fluid.layers.control_flow import StaticRNN as PaddingRNN
-import numpy as np
-
-
-def lm_model(hidden_size,
- vocab_size,
- batch_size,
- num_layers=2,
- num_steps=20,
- init_scale=0.1,
- dropout=None):
- def padding_rnn(input_embedding, len=3, init_hidden=None, init_cell=None):
- weight_1_arr = []
- weight_2_arr = []
- bias_arr = []
- hidden_array = []
- cell_array = []
- mask_array = []
- for i in range(num_layers):
- weight_1 = layers.create_parameter([hidden_size * 2, hidden_size*4], dtype="float32", name="fc_weight1_"+str(i), \
- default_initializer=fluid.initializer.UniformInitializer(low=-init_scale, high=init_scale))
- weight_1_arr.append(weight_1)
- bias_1 = layers.create_parameter(
- [hidden_size * 4],
- dtype="float32",
- name="fc_bias1_" + str(i),
- default_initializer=fluid.initializer.Constant(0.0))
- bias_arr.append(bias_1)
-
- pre_hidden = layers.slice(
- init_hidden, axes=[0], starts=[i], ends=[i + 1])
- pre_cell = layers.slice(
- init_cell, axes=[0], starts=[i], ends=[i + 1])
- pre_hidden = layers.reshape(pre_hidden, shape=[-1, hidden_size])
- pre_cell = layers.reshape(pre_cell, shape=[-1, hidden_size])
- hidden_array.append(pre_hidden)
- cell_array.append(pre_cell)
-
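-        # put the time dimension first ([num_steps, batch, emb]); StaticRNN
-        # steps over dimension 0 of its step_input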
- input_embedding = layers.transpose(input_embedding, perm=[1, 0, 2])
- rnn = PaddingRNN()
-
- with rnn.step():
- input = rnn.step_input(input_embedding)
- for k in range(num_layers):
- pre_hidden = rnn.memory(init=hidden_array[k])
- pre_cell = rnn.memory(init=cell_array[k])
- weight_1 = weight_1_arr[k]
- bias = bias_arr[k]
-
- nn = layers.concat([input, pre_hidden], 1)
- gate_input = layers.matmul(x=nn, y=weight_1)
-
- gate_input = layers.elementwise_add(gate_input, bias)
- #i, j, f, o = layers.split(gate_input, num_or_sections=4, dim=-1)
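- # A clarifying note on the slices below: gate_input packs the four LSTM
- # pre-activations [i, j, f, o] along the last axis, and the cell update is
- #   c_t = sigmoid(f) * c_{t-1} + sigmoid(i) * tanh(j)
- #   h_t = sigmoid(o) * tanh(c_t)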
- i = layers.slice(
- gate_input, axes=[1], starts=[0], ends=[hidden_size])
- j = layers.slice(
- gate_input,
- axes=[1],
- starts=[hidden_size],
- ends=[hidden_size * 2])
- f = layers.slice(
- gate_input,
- axes=[1],
- starts=[hidden_size * 2],
- ends=[hidden_size * 3])
- o = layers.slice(
- gate_input,
- axes=[1],
- starts=[hidden_size * 3],
- ends=[hidden_size * 4])
-
- c = pre_cell * layers.sigmoid(f) + layers.sigmoid(
- i) * layers.tanh(j)
- m = layers.tanh(c) * layers.sigmoid(o)
-
- rnn.update_memory(pre_hidden, m)
- rnn.update_memory(pre_cell, c)
-
- rnn.step_output(m)
- rnn.step_output(c)
-
- input = m
-
- if dropout is not None and dropout > 0.0:
- input = layers.dropout(
- input,
- dropout_prob=dropout,
- dropout_implementation='upscale_in_train')
-
- rnn.step_output(input)
- #real_res = layers.concat(res, 0)
- rnnout = rnn()
-
- last_hidden_array = []
- last_cell_array = []
- real_res = rnnout[-1]
- for i in range(num_layers):
- m = rnnout[i * 2]
- c = rnnout[i * 2 + 1]
- m.stop_gradient = True
- c.stop_gradient = True
- last_h = layers.slice(
- m, axes=[0], starts=[num_steps - 1], ends=[num_steps])
- last_hidden_array.append(last_h)
- last_c = layers.slice(
- c, axes=[0], starts=[num_steps - 1], ends=[num_steps])
- last_cell_array.append(last_c)
- '''
- else:
- real_res = rnnout[-1]
- for i in range( num_layers ):
-
- m1, c1, m2, c2 = rnnout
- real_res = m2
- m1.stop_gradient = True
- c1.stop_gradient = True
- c2.stop_gradient = True
- '''
-
- #layers.Print( first_hidden, message="22", summarize=10)
- #layers.Print( rnnout[1], message="11", summarize=10)
- #real_res = ( rnnout[1] + rnnout[2] + rnnout[3] + rnnout[4]) / 4.0
- real_res = layers.transpose(x=real_res, perm=[1, 0, 2])
- last_hidden = layers.concat(last_hidden_array, 0)
- last_cell = layers.concat(last_cell_array, 0)
- '''
- last_hidden = layers.concat( hidden_array, 1 )
- last_hidden = layers.reshape( last_hidden, shape=[-1, num_layers, hidden_size])
- last_hidden = layers.transpose( x = last_hidden, perm = [1, 0, 2])
- last_cell = layers.concat( cell_array, 1)
- last_cell = layers.reshape( last_cell, shape=[ -1, num_layers, hidden_size])
- last_cell = layers.transpose( x = last_cell, perm = [1, 0, 2])
- '''
-
- return real_res, last_hidden, last_cell
-
- def encoder_static(input_embedding, seq_len=3, init_hidden=None,
- init_cell=None):
-
- weight_1_arr = []
- weight_2_arr = []
- bias_arr = []
- hidden_array = []
- cell_array = []
- mask_array = []
- for i in range(num_layers):
- weight_1 = layers.create_parameter([hidden_size * 2, hidden_size*4], dtype="float32", name="fc_weight1_"+str(i), \
- default_initializer=fluid.initializer.UniformInitializer(low=-init_scale, high=init_scale))
- weight_1_arr.append(weight_1)
- bias_1 = layers.create_parameter(
- [hidden_size * 4],
- dtype="float32",
- name="fc_bias1_" + str(i),
- default_initializer=fluid.initializer.Constant(0.0))
- bias_arr.append(bias_1)
-
- pre_hidden = layers.slice(
- init_hidden, axes=[0], starts=[i], ends=[i + 1])
- pre_cell = layers.slice(
- init_cell, axes=[0], starts=[i], ends=[i + 1])
- pre_hidden = layers.reshape(pre_hidden, shape=[-1, hidden_size])
- pre_cell = layers.reshape(pre_cell, shape=[-1, hidden_size])
- hidden_array.append(pre_hidden)
- cell_array.append(pre_cell)
-
- res = []
- for index in range(seq_len):
- input = layers.slice(
- input_embedding, axes=[1], starts=[index], ends=[index + 1])
- input = layers.reshape(input, shape=[-1, hidden_size])
- for k in range(num_layers):
- pre_hidden = hidden_array[k]
- pre_cell = cell_array[k]
- weight_1 = weight_1_arr[k]
- bias = bias_arr[k]
-
- nn = layers.concat([input, pre_hidden], 1)
- gate_input = layers.matmul(x=nn, y=weight_1)
-
- gate_input = layers.elementwise_add(gate_input, bias)
- i, j, f, o = layers.split(gate_input, num_or_sections=4, dim=-1)
-
- c = pre_cell * layers.sigmoid(f) + layers.sigmoid(
- i) * layers.tanh(j)
- m = layers.tanh(c) * layers.sigmoid(o)
-
- hidden_array[k] = m
- cell_array[k] = c
- input = m
-
- if dropout is not None and dropout > 0.0:
- input = layers.dropout(
- input,
- dropout_prob=dropout,
- dropout_implementation='upscale_in_train')
-
- res.append(layers.reshape(input, shape=[1, -1, hidden_size]))
- real_res = layers.concat(res, 0)
- real_res = layers.transpose(x=real_res, perm=[1, 0, 2])
- last_hidden = layers.concat(hidden_array, 1)
- last_hidden = layers.reshape(
- last_hidden, shape=[-1, num_layers, hidden_size])
- last_hidden = layers.transpose(x=last_hidden, perm=[1, 0, 2])
- last_cell = layers.concat(cell_array, 1)
- last_cell = layers.reshape(
- last_cell, shape=[-1, num_layers, hidden_size])
- last_cell = layers.transpose(x=last_cell, perm=[1, 0, 2])
-
- return real_res, last_hidden, last_cell
-
- x = layers.data(name="x", shape=[-1, 1, 1], dtype='int64')
- y = layers.data(name="y", shape=[-1, 1], dtype='float32')
-
- init_hidden = layers.data(name="init_hidden", shape=[1], dtype='float32')
- init_cell = layers.data(name="init_cell", shape=[1], dtype='float32')
-
- init_hidden = layers.reshape(
- init_hidden, shape=[num_layers, -1, hidden_size])
- init_cell = layers.reshape(init_cell, shape=[num_layers, -1, hidden_size])
-
- x_emb = layers.embedding(
- input=x,
- size=[vocab_size, hidden_size],
- dtype='float32',
- is_sparse=True,
- param_attr=fluid.ParamAttr(
- name='embedding_para',
- initializer=fluid.initializer.UniformInitializer(
- low=-init_scale, high=init_scale)))
-
- x_emb = layers.reshape(x_emb, shape=[-1, num_steps, hidden_size])
- if dropout is not None and dropout > 0.0:
- x_emb = layers.dropout(
- x_emb,
- dropout_prob=dropout,
- dropout_implementation='upscale_in_train')
-
- rnn_out, last_hidden, last_cell = padding_rnn(
- x_emb, seq_len=num_steps, init_hidden=init_hidden, init_cell=init_cell)
- rnn_out = layers.reshape(rnn_out, shape=[-1, num_steps, hidden_size])
-
-
- softmax_weight = layers.create_parameter([hidden_size, vocab_size], dtype="float32", name="softmax_weight", \
- default_initializer=fluid.initializer.UniformInitializer(low=-init_scale, high=init_scale))
- softmax_bias = layers.create_parameter([vocab_size], dtype="float32", name='softmax_bias', \
- default_initializer=fluid.initializer.UniformInitializer(low=-init_scale, high=init_scale))
-
- projection = layers.matmul(rnn_out, softmax_weight)
- projection = layers.elementwise_add(projection, softmax_bias)
-
- projection = layers.reshape(projection, shape=[-1, vocab_size])
- #y = layers.reshape( y, shape=[-1, vocab_size])
-
- loss = layers.softmax_with_cross_entropy(
- logits=projection, label=y, soft_label=False)
-
- loss = layers.reshape(loss, shape=[-1, num_steps])
- loss = layers.reduce_mean(loss, dim=[0])
- loss = layers.reduce_sum(loss)
-
- loss.persistable = True
-
- feeding_list = ['x', 'y', 'init_hidden', 'init_cell']
- return loss, last_hidden, last_cell, feeding_list
diff --git a/fluid/PaddleNLP/language_model/lstm/train.py b/fluid/PaddleNLP/language_model/lstm/train.py
deleted file mode 100644
index 42bab12bcbdafdabc5ab14370860b4b0ae269dda..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/language_model/lstm/train.py
+++ /dev/null
@@ -1,302 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import numpy as np
-import time
-import os
-import random
-
-import math
-
-import paddle
-import paddle.fluid as fluid
-import paddle.fluid.core as core
-import paddle.fluid.framework as framework
-from paddle.fluid.executor import Executor
-
-import reader
-
-import sys
-if sys.version[0] == '2':
- reload(sys)
- sys.setdefaultencoding("utf-8")
-sys.path.append('..')
-os.environ["TF_CPP_MIN_LOG_LEVEL"] = "3"
-
-from args import *
-import lm_model
-import logging
-import pickle
-
-SEED = 123
-
-
-def get_current_model_para(train_prog, train_exe):
- param_list = train_prog.block(0).all_parameters()
- param_name_list = [p.name for p in param_list]
-
- vals = {}
- for p_name in param_name_list:
- p_array = np.array(fluid.global_scope().find_var(p_name).get_tensor())
- vals[p_name] = p_array
-
- return vals
-
-
-def save_para_npz(train_prog, train_exe):
- print("begin to save model to model_base")
- param_list = train_prog.block(0).all_parameters()
- param_name_list = [p.name for p in param_list]
-
- vals = {}
- for p_name in param_name_list:
- p_array = np.array(fluid.global_scope().find_var(p_name).get_tensor())
- vals[p_name] = p_array
-
- emb = vals["embedding_para"]
- print("begin to save model to model_base")
- np.savez("mode_base", **vals)
-
-
-def train():
- args = parse_args()
- model_type = args.model_type
- logger = logging.getLogger("lm")
- logger.setLevel(logging.INFO)
- formatter = logging.Formatter(
- '%(asctime)s - %(name)s - %(levelname)s - %(message)s')
- if args.enable_ce:
- fluid.default_startup_program().random_seed = SEED
- if args.log_path:
- file_handler = logging.FileHandler(args.log_path)
- file_handler.setLevel(logging.INFO)
- file_handler.setFormatter(formatter)
- logger.addHandler(file_handler)
- else:
- console_handler = logging.StreamHandler()
- console_handler.setLevel(logging.INFO)
- console_handler.setFormatter(formatter)
- logger.addHandler(console_handler)
-
- logger.info('Running with args : {}'.format(args))
-
- vocab_size = 10000
- if model_type == "test":
- num_layers = 1
- batch_size = 2
- hidden_size = 10
- num_steps = 3
- init_scale = 0.1
- max_grad_norm = 5.0
- epoch_start_decay = 1
- max_epoch = 1
- dropout = 0.0
- lr_decay = 0.5
- base_learning_rate = 1.0
- elif model_type == "small":
- num_layers = 2
- batch_size = 20
- hidden_size = 200
- num_steps = 20
- init_scale = 0.1
- max_grad_norm = 5.0
- epoch_start_decay = 4
- max_epoch = 13
- dropout = 0.0
- lr_decay = 0.5
- base_learning_rate = 1.0
- elif model_type == "medium":
- num_layers = 2
- batch_size = 20
- hidden_size = 650
- num_steps = 35
- init_scale = 0.05
- max_grad_norm = 5.0
- epoch_start_decay = 6
- max_epoch = 39
- dropout = 0.5
- lr_decay = 0.8
- base_learning_rate = 1.0
- elif model_type == "large":
- num_layers = 2
- batch_size = 20
- hidden_size = 1500
- num_steps = 35
- init_scale = 0.04
- max_grad_norm = 10.0
- epoch_start_decay = 14
- max_epoch = 55
- dropout = 0.65
- lr_decay = 1.0 / 1.15
- base_learning_rate = 1.0
- else:
- print("model type not support")
- return
-
- # Training process
- loss, last_hidden, last_cell, feed_order = lm_model.lm_model(
- hidden_size,
- vocab_size,
- batch_size,
- num_layers=num_layers,
- num_steps=num_steps,
- init_scale=init_scale,
- dropout=dropout)
- # clone from default main program and use it as the validation program
- main_program = fluid.default_main_program()
- inference_program = fluid.default_main_program().clone(for_test=True)
-
- fluid.clip.set_gradient_clip(clip=fluid.clip.GradientClipByGlobalNorm(
- clip_norm=max_grad_norm))
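- # Global-norm clipping rescales every gradient g_i by
- # clip_norm / max(global_norm, clip_norm), where global_norm is
- # sqrt(sum_i ||g_i||^2), so gradient directions are preserved.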
-
- learning_rate = fluid.layers.create_global_var(
- name="learning_rate",
- shape=[1],
- value=1.0,
- dtype='float32',
- persistable=True)
-
- optimizer = fluid.optimizer.SGD(learning_rate=learning_rate)
-
- optimizer.minimize(loss)
-
- place = core.CUDAPlace(0) if args.use_gpu else core.CPUPlace()
- exe = Executor(place)
- exe.run(framework.default_startup_program())
-
- data_path = args.data_path
- print("begin to load data")
- raw_data = reader.ptb_raw_data(data_path)
- print("finished load data")
- train_data, valid_data, test_data, _ = raw_data
-
- def prepare_input(batch, init_hidden, init_cell, epoch_id=0, with_lr=True):
- x, y = batch
- new_lr = base_learning_rate * (lr_decay**max(
- epoch_id + 1 - epoch_start_decay, 0.0))
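- # e.g. with the "small" config (base_lr=1.0, lr_decay=0.5,
- # epoch_start_decay=4): epochs 0-3 run at lr=1.0, epoch 4 at 0.5,
- # epoch 5 at 0.25, and so on.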
- lr = np.ones((1), dtype='float32') * new_lr
- res = {}
- x = x.reshape((-1, num_steps, 1))
- y = y.reshape((-1, 1))
-
- res['x'] = x
- res['y'] = y
- res['init_hidden'] = init_hidden
- res['init_cell'] = init_cell
- if with_lr:
- res['learning_rate'] = lr
-
- return res
-
- def eval(data):
- # during evaluation the batch size is set to 1
- eval_data_iter = reader.get_data_iter(data, 1, num_steps)
- total_loss = 0.0
- iters = 0
- init_hidden = np.zeros((num_layers, 1, hidden_size), dtype='float32')
- init_cell = np.zeros((num_layers, 1, hidden_size), dtype='float32')
- for batch_id, batch in enumerate(eval_data_iter):
- input_data_feed = prepare_input(
- batch, init_hidden, init_cell, epoch_id, with_lr=False)
- fetch_outs = exe.run(
- inference_program,
- feed=input_data_feed,
- fetch_list=[loss.name, last_hidden.name, last_cell.name])
-
- cost_train = np.array(fetch_outs[0])
- init_hidden = np.array(fetch_outs[1])
- init_cell = np.array(fetch_outs[2])
-
- total_loss += cost_train
- iters += num_steps
-
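- # perplexity is the exponential of the average per-token cross-entropy:
- # ppl = exp(total_loss / iters); lower is better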
- ppl = np.exp(total_loss / iters)
- return ppl
-
- # get train epoch size
- batch_len = len(train_data) // batch_size
- epoch_size = (batch_len - 1) // num_steps
- log_interval = epoch_size // 10
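- # rough numbers: PTB train has ~929k tokens, so with the "small" config
- # (batch_size=20, num_steps=20) batch_len ~= 46479, epoch_size ~= 2323,
- # and the loss is logged about every 232 batches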
- total_time = 0.0
- for epoch_id in range(max_epoch):
- start_time = time.time()
- print("epoch id", epoch_id)
- train_data_iter = reader.get_data_iter(train_data, batch_size,
- num_steps)
-
- total_loss = 0
-
- init_hidden = None
- init_cell = None
- #debug_para(fluid.framework.default_main_program(), parallel_executor)
- total_loss = 0
- iters = 0
- init_hidden = np.zeros(
- (num_layers, batch_size, hidden_size), dtype='float32')
- init_cell = np.zeros(
- (num_layers, batch_size, hidden_size), dtype='float32')
- for batch_id, batch in enumerate(train_data_iter):
- input_data_feed = prepare_input(
- batch, init_hidden, init_cell, epoch_id=epoch_id)
- fetch_outs = exe.run(feed=input_data_feed,
- fetch_list=[
- loss.name, last_hidden.name,
- last_cell.name, 'learning_rate'
- ],
- use_program_cache=True)
-
- cost_train = np.array(fetch_outs[0])
- init_hidden = np.array(fetch_outs[1])
- init_cell = np.array(fetch_outs[2])
-
- lr = np.array(fetch_outs[3])
-
- total_loss += cost_train
- iters += num_steps
- if batch_id > 0 and batch_id % log_interval == 0:
- ppl = np.exp(total_loss / iters)
- print("ppl ", batch_id, ppl[0], lr[0])
-
- ppl = np.exp(total_loss / iters)
- if epoch_id == 0 and ppl[0] > 1000:
- # with a bad initialization, the ppl after the first epoch exceeds 1000;
- # there is no need to continue training in that case
- return
- end_time = time.time()
- total_time += end_time - start_time
- print("train ppl", ppl[0])
-
- if epoch_id == max_epoch - 1 and args.enable_ce:
- print("ptblm\tlstm_language_model_duration\t%s" %
- (total_time / max_epoch))
- print("ptblm\tlstm_language_model_loss\t%s" % ppl[0])
-
- model_path = os.path.join("model_new/", str(epoch_id))
- if not os.path.isdir(model_path):
- os.makedirs(model_path)
- fluid.io.save_persistables(
- executor=exe, dirname=model_path, main_program=main_program)
- valid_ppl = eval(valid_data)
- print("valid ppl", valid_ppl[0])
- test_ppl = eval(test_data)
- print("test ppl", test_ppl[0])
-
-
-if __name__ == '__main__':
- train()
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/README.md b/fluid/PaddleNLP/machine_reading_comprehension/README.md
index 884c15058e9b5601c7754e27d1b106fd41e2ac27..eb527349298e0b804ccc11d179cc9b78f1715b49 100644
--- a/fluid/PaddleNLP/machine_reading_comprehension/README.md
+++ b/fluid/PaddleNLP/machine_reading_comprehension/README.md
@@ -1,69 +1,6 @@
-# Abstract
-DuReader is an end-to-end neural network model for machine reading comprehension style question answering, which aims to answer questions from given passages. We first match the question and passages with a bidirectional attention flow network to obtain the question-aware passage representation. Then we employ a pointer network to locate the positions of answers in the passages. Our experimental evaluations show that the DuReader model achieves state-of-the-art results on the DuReader dataset.
-# Dataset
-The DuReader dataset is a new large-scale, real-world, human-sourced Chinese MRC dataset. DuReader focuses on real-world open-domain question answering. Its advantages over existing datasets are summarized as follows:
- - Real questions
- - Real articles
- - Real answers
- - Real application scenarios
- - Rich annotations
-# Network
-The DuReader model is inspired by three classic reading comprehension models ([BiDAF](https://arxiv.org/abs/1611.01603), [Match-LSTM](https://arxiv.org/abs/1608.07905), [R-NET](https://www.microsoft.com/en-us/research/wp-content/uploads/2017/05/r-net.pdf)).
+Hi!
-The DuReader model is a hierarchical, multi-stage network consisting of five layers (a brief sketch follows the list below):
+This directory has been deprecated.
-- **Word Embedding Layer** maps each word to a vector using a pre-trained word embedding model.
-- **Encoding Layer** extracts context information for each position in the question and passages with a bi-directional LSTM network.
-- **Attention Flow Layer** couples the query and context vectors and produces a set of query-aware feature vectors for each word in the context. Please refer to [BiDAF](https://arxiv.org/abs/1611.01603) for more details.
-- **Fusion Layer** employs a layer of bi-directional LSTM to capture the interaction among context words independent of the query.
-- **Decode Layer** employs an answer pointer network with attention pooling over the question to locate the positions of answers in the passages. Please refer to [Match-LSTM](https://arxiv.org/abs/1608.07905) and [R-NET](https://www.microsoft.com/en-us/research/wp-content/uploads/2017/05/r-net.pdf) for more details.
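-
-A minimal sketch of how these five layers compose is shown below (illustrative pseudo-code only; the function names are placeholders rather than this repo's actual APIs):
-
-```
-def dureader_forward(question_ids, passage_ids):
-    # 1. Word Embedding Layer: token ids -> pre-trained word vectors
-    q_emb, p_emb = embed(question_ids), embed(passage_ids)
-    # 2. Encoding Layer: contextualize each position with a bi-directional LSTM
-    q_enc, p_enc = bi_lstm(q_emb), bi_lstm(p_emb)
-    # 3. Attention Flow Layer: couple query and context (BiDAF-style)
-    g = attention_flow(q_enc, p_enc)
-    # 4. Fusion Layer: a bi-directional LSTM over the query-aware representation
-    m = bi_lstm(g)
-    # 5. Decode Layer: a pointer network predicts answer start/end positions
-    start_probs, end_probs = pointer_network(m, q_enc)
-    return start_probs, end_probs
-```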
-
-## How to Run
-### Download the Dataset
-To download the DuReader dataset:
-```
-cd data && bash download.sh
-```
-For more details about the DuReader dataset, please refer to the [DuReader Dataset Homepage](https://ai.baidu.com//broad/subordinate?dataset=dureader).
-
-### Download Thirdparty Dependencies
-We use Bleu and Rouge as evaluation metrics. The calculation of these metrics relies on the scoring scripts under [coco-caption](https://github.com/tylin/coco-caption); to download them, run:
-
-```
-cd utils && bash download_thirdparty.sh
-```
-### Environment Requirements
-For now we've only tested on PaddlePaddle v1.0. To install PaddlePaddle and for more details about it, see the [PaddlePaddle Homepage](http://paddlepaddle.org).
-
-### Preparation
-Before training the model, we have to make sure that the data is ready. For preparation, we will check the data files, make directories and extract a vocabulary for later use. You can run the following command to do this with a specified task name:
-
-```
-sh run.sh --prepare
-```
-You can specify the files for train/dev/test by setting the `trainset`/`devset`/`testset` options.
-### Training
-To train the model, run the command below. You can also set hyper-parameters such as the learning rate with `--learning_rate NUM`. For example, to train the model for 10 passes:
-
-```
-sh run.sh --train --pass_num 10
-```
-
-The training process includes an evaluation on the dev set after each training epoch. By default, the model with the best Bleu-4 score on the dev set will be saved.
-
-### Evaluation
-To conduct a single evaluation on the dev set with an already trained model, you can run the following command:
-
-```
-sh run.sh --evaluate --load_dir models/1
-```
-
-### Prediction
-You can also predict answers for the samples in specified test files using the following command:
-
-```
-sh run.sh --predict --load_dir models/1 --testset ../data/preprocessed/testset/search.dev.json
-```
-
-By default, the results are saved in the `../data/results/` folder. You can change this by specifying `--result_dir DIR_PATH`.
+Please visit the project at [PaddleNLP/machine_reading_comprehension](../../../PaddleNLP/machine_reading_comprehension).
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/_ce.py b/fluid/PaddleNLP/machine_reading_comprehension/_ce.py
deleted file mode 100644
index cff13c8722007987a3cd82f1298206248963e45a..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/machine_reading_comprehension/_ce.py
+++ /dev/null
@@ -1,68 +0,0 @@
-#### This file is only used for continuous evaluation (CE) tests!
-
-import os
-import sys
-#sys.path.insert(0, os.environ['ceroot'])
-from kpi import CostKpi, DurationKpi, AccKpi
-
-#### NOTE: kpi.py should be shared across models in some way!
-
-train_cost_card1_kpi = CostKpi('train_cost_card1', 0.02, 0, actived=True)
-test_cost_card1_kpi = CostKpi('test_cost_card1', 0.005, 0, actived=True)
-train_duration_card1_kpi = DurationKpi(
- 'train_duration_card1', 0.06, 0, actived=True)
-train_cost_card4_kpi = CostKpi('train_cost_card4', 0.01, 0, actived=True)
-test_cost_card4_kpi = CostKpi('test_cost_card4', 0.005, 0, actived=True)
-train_duration_card4_kpi = DurationKpi(
- 'train_duration_card4', 0.06, 0, actived=True)
-
-tracking_kpis = [
- train_cost_card1_kpi,
- test_cost_card1_kpi,
- train_duration_card1_kpi,
- train_cost_card4_kpi,
- test_cost_card4_kpi,
- train_duration_card4_kpi,
-]
-
-
-def parse_log(log):
- '''
- This method should be implemented by model developers.
- The suggestion:
- each KPI line in the log should be "kpis\t<name>\t<value>", for example:
- "
- kpis\ttrain_cost_card1\t1.0
- kpis\ttest_cost_card1\t1.0
- kpis\ttrain_duration_card1\t0.06
- "
- '''
- for line in log.split('\n'):
- fs = line.strip().split('\t')
- print(fs)
- if len(fs) == 3 and fs[0] == 'kpis':
- print("-----%s" % fs)
- kpi_name = fs[1]
- kpi_value = float(fs[2])
- yield kpi_name, kpi_value
-
-
-def log_to_ce(log):
- kpi_tracker = {}
- for kpi in tracking_kpis:
- kpi_tracker[kpi.name] = kpi
-
- for (kpi_name, kpi_value) in parse_log(log):
- print(kpi_name, kpi_value)
- kpi_tracker[kpi_name].add_record(kpi_value)
- kpi_tracker[kpi_name].persist()
-
-
-if __name__ == '__main__':
- log = sys.stdin.read()
- print("*****")
- print(log)
- print("****")
- log_to_ce(log)
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/args.py b/fluid/PaddleNLP/machine_reading_comprehension/args.py
deleted file mode 100644
index 53812252dea9b7855cd09142de89b6f0a1e2ac5c..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/machine_reading_comprehension/args.py
+++ /dev/null
@@ -1,128 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import argparse
-import distutils.util
-
-
-def parse_args():
- parser = argparse.ArgumentParser(description=__doc__)
- parser.add_argument(
- '--prepare',
- action='store_true',
- help='create the directories, prepare the vocabulary and embeddings')
- parser.add_argument('--train', action='store_true', help='train the model')
- parser.add_argument(
- '--evaluate', action='store_true', help='evaluate the model on dev set')
- parser.add_argument(
- '--predict',
- action='store_true',
- help='predict the answers for test set with trained model')
- parser.add_argument(
- "--embed_size",
- type=int,
- default=300,
- help="The dimension of embedding table. (default: %(default)d)")
- parser.add_argument(
- "--hidden_size",
- type=int,
- default=300,
- help="The size of rnn hidden unit. (default: %(default)d)")
- parser.add_argument(
- "--batch_size",
- type=int,
- default=32,
- help="The sequence number of a mini-batch data. (default: %(default)d)")
- parser.add_argument(
- "--pass_num",
- type=int,
- default=5,
- help="The pass number to train. (default: %(default)d)")
- parser.add_argument(
- "--learning_rate",
- type=float,
- default=0.001,
- help="Learning rate used to train the model. (default: %(default)f)")
- parser.add_argument(
- "--weight_decay",
- type=float,
- default=0.0001,
- help="Weight decay. (default: %(default)f)")
- parser.add_argument(
- "--use_gpu",
- type=distutils.util.strtobool,
- default=True,
- help="Whether to use gpu. (default: %(default)d)")
- parser.add_argument(
- "--save_dir",
- type=str,
- default="model",
- help="Specify the path to save trained models.")
- parser.add_argument(
- "--load_dir",
- type=str,
- default="",
- help="Specify the path to load trained models.")
- parser.add_argument(
- "--save_interval",
- type=int,
- default=1,
- help="Save the trained model every n passes."
- "(default: %(default)d)")
- parser.add_argument(
- "--log_interval",
- type=int,
- default=50,
- help="log the train loss every n batches."
- "(default: %(default)d)")
- parser.add_argument(
- "--dev_interval",
- type=int,
- default=1000,
- help="cal dev loss every n batches."
- "(default: %(default)d)")
- parser.add_argument('--optim', default='adam', help='optimizer type')
- parser.add_argument('--trainset', nargs='+', help='train dataset')
- parser.add_argument('--devset', nargs='+', help='dev dataset')
- parser.add_argument('--testset', nargs='+', help='test dataset')
- parser.add_argument('--vocab_dir', help='dict')
- parser.add_argument('--max_p_num', type=int, default=5)
- parser.add_argument('--max_a_len', type=int, default=200)
- parser.add_argument('--max_p_len', type=int, default=500)
- parser.add_argument('--max_q_len', type=int, default=9)
- parser.add_argument('--doc_num', type=int, default=5)
- parser.add_argument('--para_print', action='store_true')
- parser.add_argument('--drop_rate', type=float, default=0.0)
- parser.add_argument('--random_seed', type=int, default=123)
- parser.add_argument(
- '--log_path',
- help='path of the log file. If not set, logs are printed to console')
- parser.add_argument(
- '--result_dir',
- default='../data/results/',
- help='the dir to output the results')
- parser.add_argument(
- '--result_name',
- default='test_result',
- help='the file name of the results')
- parser.add_argument(
- "--enable_ce",
- action='store_true',
- help="If set, run the task with continuous evaluation logs.")
- args = parser.parse_args()
- return args
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/data/download.sh b/fluid/PaddleNLP/machine_reading_comprehension/data/download.sh
deleted file mode 100644
index bcba3c7ed8957ee987aa13f6c99fa60f8c8494b6..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/machine_reading_comprehension/data/download.sh
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/bin/bash
-# ==============================================================================
-# Copyright 2017 Baidu.com, Inc. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-# ==============================================================================
-
-
-if [[ -d preprocessed ]] && [[ -d raw ]]; then
- echo "data exist"
- exit 0
-else
- wget -c --no-check-certificate http://dureader.gz.bcebos.com/dureader_preprocessed.zip
- wget -c --no-check-certificate http://dureader.gz.bcebos.com/demo.tgz
-fi
-
-if md5sum --status -c md5sum.txt; then
- unzip dureader_preprocessed.zip
-else
- echo "download data error!" >> /dev/stderr
- exit 1
-fi
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/data/md5sum.txt b/fluid/PaddleNLP/machine_reading_comprehension/data/md5sum.txt
deleted file mode 100644
index d6bce75a937995de3c29d2b7e029a13e82731e04..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/machine_reading_comprehension/data/md5sum.txt
+++ /dev/null
@@ -1 +0,0 @@
-7a4c28026f7dc94e8135d17203c63664 dureader_preprocessed.zip
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/dataset.py b/fluid/PaddleNLP/machine_reading_comprehension/dataset.py
deleted file mode 100644
index 3aaf87be9a7b0659fa9e79eb8329911cbea73c55..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/machine_reading_comprehension/dataset.py
+++ /dev/null
@@ -1,242 +0,0 @@
-# -*- coding:utf8 -*-
-# ==============================================================================
-# Copyright 2017 Baidu.com, Inc. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-# ==============================================================================
-"""
-This module implements data processing strategies.
-"""
-
-import os
-import json
-import logging
-import numpy as np
-from collections import Counter
-
-
-class BRCDataset(object):
- """
- This class implements the APIs for loading and using the Baidu reading comprehension dataset.
- """
-
- def __init__(self,
- max_p_num,
- max_p_len,
- max_q_len,
- train_files=[],
- dev_files=[],
- test_files=[]):
- self.logger = logging.getLogger("brc")
- self.max_p_num = max_p_num
- self.max_p_len = max_p_len
- self.max_q_len = max_q_len
-
- self.train_set, self.dev_set, self.test_set = [], [], []
- if train_files:
- for train_file in train_files:
- self.train_set += self._load_dataset(train_file, train=True)
- self.logger.info('Train set size: {} questions.'.format(
- len(self.train_set)))
-
- if dev_files:
- for dev_file in dev_files:
- self.dev_set += self._load_dataset(dev_file)
- self.logger.info('Dev set size: {} questions.'.format(
- len(self.dev_set)))
-
- if test_files:
- for test_file in test_files:
- self.test_set += self._load_dataset(test_file)
- self.logger.info('Test set size: {} questions.'.format(
- len(self.test_set)))
-
- def _load_dataset(self, data_path, train=False):
- """
- Loads the dataset
- Args:
- data_path: the data file to load
- """
- with open(data_path) as fin:
- data_set = []
- for lidx, line in enumerate(fin):
- sample = json.loads(line.strip())
- if train:
- if len(sample['answer_spans']) == 0:
- continue
- if sample['answer_spans'][0][1] >= self.max_p_len:
- continue
-
- if 'answer_docs' in sample:
- sample['answer_passages'] = sample['answer_docs']
-
- sample['question_tokens'] = sample['segmented_question']
-
- sample['passages'] = []
- for d_idx, doc in enumerate(sample['documents']):
- if train:
- most_related_para = doc['most_related_para']
- sample['passages'].append({
- 'passage_tokens':
- doc['segmented_paragraphs'][most_related_para],
- 'is_selected': doc['is_selected']
- })
- else:
- para_infos = []
- for para_tokens in doc['segmented_paragraphs']:
- question_tokens = sample['segmented_question']
- common_with_question = Counter(
- para_tokens) & Counter(question_tokens)
- correct_preds = sum(common_with_question.values())
- if correct_preds == 0:
- recall_wrt_question = 0
- else:
- recall_wrt_question = float(
- correct_preds) / len(question_tokens)
- para_infos.append((para_tokens, recall_wrt_question,
- len(para_tokens)))
- para_infos.sort(key=lambda x: (-x[1], x[2]))
- fake_passage_tokens = []
- for para_info in para_infos[:1]:
- fake_passage_tokens += para_info[0]
- sample['passages'].append({
- 'passage_tokens': fake_passage_tokens
- })
- data_set.append(sample)
- return data_set
-
- def _one_mini_batch(self, data, indices, pad_id):
- """
- Get one mini batch
- Args:
- data: all data
- indices: the indices of the samples to be selected
- pad_id: the token id used for padding
- Returns:
- one batch of data
- """
- batch_data = {
- 'raw_data': [data[i] for i in indices],
- 'question_token_ids': [],
- 'question_length': [],
- 'passage_token_ids': [],
- 'passage_length': [],
- 'start_id': [],
- 'end_id': [],
- 'passage_num': []
- }
- max_passage_num = max(
- [len(sample['passages']) for sample in batch_data['raw_data']])
- max_passage_num = min(self.max_p_num, max_passage_num)
- for sidx, sample in enumerate(batch_data['raw_data']):
- count = 0
- for pidx in range(max_passage_num):
- if pidx < len(sample['passages']):
- count += 1
- batch_data['question_token_ids'].append(sample[
- 'question_token_ids'][0:self.max_q_len])
- batch_data['question_length'].append(
- min(len(sample['question_token_ids']), self.max_q_len))
- passage_token_ids = sample['passages'][pidx][
- 'passage_token_ids'][0:self.max_p_len]
- batch_data['passage_token_ids'].append(passage_token_ids)
- batch_data['passage_length'].append(
- min(len(passage_token_ids), self.max_p_len))
- # record the start passage index of the current sample
- passage_idx_offset = sum(batch_data['passage_num'])
- batch_data['passage_num'].append(count)
- gold_passage_offset = 0
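- # answer spans are annotated inside the gold passage, but passages of a
- # sample are concatenated downstream, so shift the span by the total
- # length of the passages that precede the gold one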
- if 'answer_passages' in sample and len(sample['answer_passages']):
- for i in range(sample['answer_passages'][0]):
- gold_passage_offset += len(batch_data['passage_token_ids'][
- passage_idx_offset + i])
- start_id = min(sample['answer_spans'][0][0], self.max_p_len)
- end_id = min(sample['answer_spans'][0][1], self.max_p_len)
- batch_data['start_id'].append(gold_passage_offset + start_id)
- batch_data['end_id'].append(gold_passage_offset + end_id)
- else:
- # fake span for some samples, only valid for testing
- batch_data['start_id'].append(0)
- batch_data['end_id'].append(0)
- return batch_data
-
- def word_iter(self, set_name=None):
- """
- Iterates over all the words in the dataset
- Args:
- set_name: if it is set, then the specific set will be used
- Returns:
- a generator
- """
- if set_name is None:
- data_set = self.train_set + self.dev_set + self.test_set
- elif set_name == 'train':
- data_set = self.train_set
- elif set_name == 'dev':
- data_set = self.dev_set
- elif set_name == 'test':
- data_set = self.test_set
- else:
- raise NotImplementedError('No data set named as {}'.format(
- set_name))
- if data_set is not None:
- for sample in data_set:
- for token in sample['question_tokens']:
- yield token
- for passage in sample['passages']:
- for token in passage['passage_tokens']:
- yield token
-
- def convert_to_ids(self, vocab):
- """
- Convert the question and passage in the original dataset to ids
- Args:
- vocab: the vocabulary on this dataset
- """
- for data_set in [self.train_set, self.dev_set, self.test_set]:
- if data_set is None:
- continue
- for sample in data_set:
- sample['question_token_ids'] = vocab.convert_to_ids(sample[
- 'question_tokens'])
- for passage in sample['passages']:
- passage['passage_token_ids'] = vocab.convert_to_ids(passage[
- 'passage_tokens'])
-
- def gen_mini_batches(self, set_name, batch_size, pad_id, shuffle=True):
- """
- Generate data batches for a specific dataset (train/dev/test)
- Args:
- set_name: train/dev/test to indicate the set
- batch_size: number of samples in one batch
- pad_id: pad id
- shuffle: if set to be true, the data is shuffled.
- Returns:
- a generator for all batches
- """
- if set_name == 'train':
- data = self.train_set
- elif set_name == 'dev':
- data = self.dev_set
- elif set_name == 'test':
- data = self.test_set
- else:
- raise NotImplementedError('No data set named as {}'.format(
- set_name))
- data_size = len(data)
- indices = np.arange(data_size)
- if shuffle:
- np.random.shuffle(indices)
- for batch_start in np.arange(0, data_size, batch_size):
- batch_indices = indices[batch_start:batch_start + batch_size]
- yield self._one_mini_batch(data, batch_indices, pad_id)
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/rc_model.py b/fluid/PaddleNLP/machine_reading_comprehension/rc_model.py
deleted file mode 100644
index 932ccd9cafd1772a2b7f4e20867aa51eee74c9c5..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/machine_reading_comprehension/rc_model.py
+++ /dev/null
@@ -1,320 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import paddle.fluid.layers as layers
-import paddle.fluid as fluid
-import numpy as np
-
-
-def dropout(input, args):
- if args.drop_rate:
- return layers.dropout(
- input,
- dropout_prob=args.drop_rate,
- seed=args.random_seed,
- is_test=False)
- else:
- return input
-
-
-def bi_lstm_encoder(input_seq, gate_size, para_name, args):
- # A bi-directional lstm encoder implementation.
- # The linear transformations for the input, output and forget gates
- # and the cell activation vector need to be done outside dynamic_lstm,
- # so the projection output size is 4 times gate_size.
-
- input_forward_proj = layers.fc(
- input=input_seq,
- param_attr=fluid.ParamAttr(name=para_name + '_fw_gate_w'),
- size=gate_size * 4,
- act=None,
- bias_attr=False)
- input_reversed_proj = layers.fc(
- input=input_seq,
- param_attr=fluid.ParamAttr(name=para_name + '_bw_gate_w'),
- size=gate_size * 4,
- act=None,
- bias_attr=False)
- forward, _ = layers.dynamic_lstm(
- input=input_forward_proj,
- size=gate_size * 4,
- use_peepholes=False,
- param_attr=fluid.ParamAttr(name=para_name + '_fw_lstm_w'),
- bias_attr=fluid.ParamAttr(name=para_name + '_fw_lstm_b'))
- reversed, _ = layers.dynamic_lstm(
- input=input_reversed_proj,
- param_attr=fluid.ParamAttr(name=para_name + '_bw_lstm_w'),
- bias_attr=fluid.ParamAttr(name=para_name + '_bw_lstm_b'),
- size=gate_size * 4,
- is_reverse=True,
- use_peepholes=False)
-
- encoder_out = layers.concat(input=[forward, reversed], axis=1)
- return encoder_out
-
-
-def get_data(input_name, lod_level, args):
- input_ids = layers.data(
- name=input_name, shape=[1], dtype='int64', lod_level=lod_level)
- return input_ids
-
-
-def embedding(input_ids, shape, args):
- input_embedding = layers.embedding(
- input=input_ids,
- size=shape,
- dtype='float32',
- is_sparse=True,
- param_attr=fluid.ParamAttr(name='embedding_para'))
- return input_embedding
-
-
-def encoder(input_embedding, para_name, hidden_size, args):
- encoder_out = bi_lstm_encoder(
- input_seq=input_embedding,
- gate_size=hidden_size,
- para_name=para_name,
- args=args)
- return dropout(encoder_out, args)
-
-
-def attn_flow(q_enc, p_enc, p_ids_name, args):
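- # BiDAF-style attention flow (see the BiDAF paper): for each passage
- # position, s_t scores all question positions by dot product;
- # context-to-query attention U~ is the softmax-weighted sum of question
- # vectors, and query-to-context attention H~ pools passage positions by
- # their max similarity. The output concatenates [H; U~; H*U~; H*H~].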
- tag = p_ids_name + "::"
- drnn = layers.DynamicRNN()
- with drnn.block():
- h_cur = drnn.step_input(p_enc)
- u_all = drnn.static_input(q_enc)
- h_expd = layers.sequence_expand(x=h_cur, y=u_all)
- s_t_mul = layers.elementwise_mul(x=u_all, y=h_expd, axis=0)
- s_t_sum = layers.reduce_sum(input=s_t_mul, dim=1, keep_dim=True)
- s_t_re = layers.reshape(s_t_sum, shape=[-1, 0])
- s_t = layers.sequence_softmax(input=s_t_re)
- u_expr = layers.elementwise_mul(x=u_all, y=s_t, axis=0)
- u_expr = layers.sequence_pool(input=u_expr, pool_type='sum')
-
- b_t = layers.sequence_pool(input=s_t_sum, pool_type='max')
- drnn.output(u_expr, b_t)
- U_expr, b = drnn()
- b_norm = layers.sequence_softmax(input=b)
- h_expr = layers.elementwise_mul(x=p_enc, y=b_norm, axis=0)
- h_expr = layers.sequence_pool(input=h_expr, pool_type='sum')
-
- H_expr = layers.sequence_expand(x=h_expr, y=p_enc)
- H_expr = layers.lod_reset(x=H_expr, y=p_enc)
- h_u = layers.elementwise_mul(x=p_enc, y=U_expr, axis=0)
- h_h = layers.elementwise_mul(x=p_enc, y=H_expr, axis=0)
-
- g = layers.concat(input=[p_enc, U_expr, h_u, h_h], axis=1)
- return dropout(g, args)
-
-
-def lstm_step(x_t, hidden_t_prev, cell_t_prev, size, para_name, args):
- def linear(inputs, para_name, args):
- return layers.fc(input=inputs,
- size=size,
- param_attr=fluid.ParamAttr(name=para_name + '_w'),
- bias_attr=fluid.ParamAttr(name=para_name + '_b'))
-
- input_cat = layers.concat([hidden_t_prev, x_t], axis=1)
- forget_gate = layers.sigmoid(x=linear(input_cat, para_name + '_lstm_f',
- args))
- input_gate = layers.sigmoid(x=linear(input_cat, para_name + '_lstm_i',
- args))
- output_gate = layers.sigmoid(x=linear(input_cat, para_name + '_lstm_o',
- args))
- cell_tilde = layers.tanh(x=linear(input_cat, para_name + '_lstm_c', args))
-
- cell_t = layers.sums(input=[
- layers.elementwise_mul(
- x=forget_gate, y=cell_t_prev), layers.elementwise_mul(
- x=input_gate, y=cell_tilde)
- ])
-
- hidden_t = layers.elementwise_mul(x=output_gate, y=layers.tanh(x=cell_t))
-
- return hidden_t, cell_t
-
-
-# Pointer-network decoder: predicts the start/end positions of the answer span.
-def point_network_decoder(p_vec, q_vec, hidden_size, args):
- tag = 'pn_decoder:'
- init_random = fluid.initializer.Normal(loc=0.0, scale=1.0)
-
- random_attn = layers.create_parameter(
- shape=[1, hidden_size],
- dtype='float32',
- default_initializer=init_random)
- random_attn = layers.fc(
- input=random_attn,
- size=hidden_size,
- act=None,
- param_attr=fluid.ParamAttr(name=tag + 'random_attn_fc_w'),
- bias_attr=fluid.ParamAttr(name=tag + 'random_attn_fc_b'))
- random_attn = layers.reshape(random_attn, shape=[-1])
- U = layers.fc(input=q_vec,
- param_attr=fluid.ParamAttr(name=tag + 'q_vec_fc_w'),
- bias_attr=False,
- size=hidden_size,
- act=None) + random_attn
- U = layers.tanh(U)
-
- logits = layers.fc(input=U,
- param_attr=fluid.ParamAttr(name=tag + 'logits_fc_w'),
- bias_attr=fluid.ParamAttr(name=tag + 'logits_fc_b'),
- size=1,
- act=None)
- scores = layers.sequence_softmax(input=logits)
- pooled_vec = layers.elementwise_mul(x=q_vec, y=scores, axis=0)
- pooled_vec = layers.sequence_pool(input=pooled_vec, pool_type='sum')
-
- init_state = layers.fc(
- input=pooled_vec,
- param_attr=fluid.ParamAttr(name=tag + 'init_state_fc_w'),
- bias_attr=fluid.ParamAttr(name=tag + 'init_state_fc_b'),
- size=hidden_size,
- act=None)
-
- def custom_dynamic_rnn(p_vec, init_state, hidden_size, para_name, args):
- tag = para_name + "custom_dynamic_rnn:"
-
- def static_rnn(step,
- p_vec=p_vec,
- init_state=None,
- para_name='',
- args=args):
- tag = para_name + "static_rnn:"
- ctx = layers.fc(
- input=p_vec,
- param_attr=fluid.ParamAttr(name=tag + 'context_fc_w'),
- bias_attr=fluid.ParamAttr(name=tag + 'context_fc_b'),
- size=hidden_size,
- act=None)
-
- beta = []
- c_prev = init_state
- m_prev = init_state
- for i in range(step):
- m_prev0 = layers.fc(
- input=m_prev,
- size=hidden_size,
- act=None,
- param_attr=fluid.ParamAttr(name=tag + 'm_prev0_fc_w'),
- bias_attr=fluid.ParamAttr(name=tag + 'm_prev0_fc_b'))
- m_prev1 = layers.sequence_expand(x=m_prev0, y=ctx)
-
- Fk = ctx + m_prev1
- Fk = layers.tanh(Fk)
- logits = layers.fc(
- input=Fk,
- size=1,
- act=None,
- param_attr=fluid.ParamAttr(name=tag + 'logits_fc_w'),
- bias_attr=fluid.ParamAttr(name=tag + 'logits_fc_b'))
-
- scores = layers.sequence_softmax(input=logits)
- attn_ctx = layers.elementwise_mul(x=p_vec, y=scores, axis=0)
- attn_ctx = layers.sequence_pool(input=attn_ctx, pool_type='sum')
-
- hidden_t, cell_t = lstm_step(
- attn_ctx,
- hidden_t_prev=m_prev,
- cell_t_prev=c_prev,
- size=hidden_size,
- para_name=tag,
- args=args)
- m_prev = hidden_t
- c_prev = cell_t
- beta.append(scores)
- return beta
-
- return static_rnn(
- 2, p_vec=p_vec, init_state=init_state, para_name=para_name)
-
- fw_outputs = custom_dynamic_rnn(p_vec, init_state, hidden_size, tag + "fw:",
- args)
- bw_outputs = custom_dynamic_rnn(p_vec, init_state, hidden_size, tag + "bw:",
- args)
-
- start_prob = layers.elementwise_add(
- x=fw_outputs[0], y=bw_outputs[1], axis=0) / 2
- end_prob = layers.elementwise_add(
- x=fw_outputs[1], y=bw_outputs[0], axis=0) / 2
-
- return start_prob, end_prob
-
-
-def fusion(g, args):
- m = bi_lstm_encoder(
- input_seq=g, gate_size=args.hidden_size, para_name='fusion', args=args)
- return dropout(m, args)
-
-
-def rc_model(hidden_size, vocab, args):
- emb_shape = [vocab.size(), vocab.embed_dim]
- start_labels = layers.data(
- name="start_lables", shape=[1], dtype='float32', lod_level=1)
- end_labels = layers.data(
- name="end_lables", shape=[1], dtype='float32', lod_level=1)
-
- # stage 1:encode
- q_id0 = get_data('q_id0', 1, args)
-
- q_ids = get_data('q_ids', 2, args)
- p_ids_name = 'p_ids'
-
- p_ids = get_data('p_ids', 2, args)
- p_embs = embedding(p_ids, emb_shape, args)
- q_embs = embedding(q_ids, emb_shape, args)
- drnn = layers.DynamicRNN()
- with drnn.block():
- p_emb = drnn.step_input(p_embs)
- q_emb = drnn.step_input(q_embs)
-
- p_enc = encoder(p_emb, 'p_enc', hidden_size, args)
- q_enc = encoder(q_emb, 'q_enc', hidden_size, args)
-
- # stage 2:match
- g_i = attn_flow(q_enc, p_enc, p_ids_name, args)
- # stage 3:fusion
- m_i = fusion(g_i, args)
- drnn.output(m_i, q_enc)
-
- ms, q_encs = drnn()
- p_vec = layers.lod_reset(x=ms, y=start_labels)
- q_vec = layers.lod_reset(x=q_encs, y=q_id0)
-
- # stage 4:decode
- start_probs, end_probs = point_network_decoder(
- p_vec=p_vec, q_vec=q_vec, hidden_size=hidden_size, args=args)
-
- cost0 = layers.sequence_pool(
- layers.cross_entropy(
- input=start_probs, label=start_labels, soft_label=True),
- 'sum')
- cost1 = layers.sequence_pool(
- layers.cross_entropy(
- input=end_probs, label=end_labels, soft_label=True),
- 'sum')
-
- cost0 = layers.mean(cost0)
- cost1 = layers.mean(cost1)
- cost = cost0 + cost1
- cost.persistable = True
-
- feeding_list = ["q_ids", "start_lables", "end_lables", "p_ids", "q_id0"]
- return cost, start_probs, end_probs, ms, feeding_list
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/run.py b/fluid/PaddleNLP/machine_reading_comprehension/run.py
deleted file mode 100644
index dbe3a4b9a296fdaf089d55be3f0c9845422f0ce5..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/machine_reading_comprehension/run.py
+++ /dev/null
@@ -1,640 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import numpy as np
-import time
-import os
-import random
-import json
-import six
-import multiprocessing  # needed for the CPU_NUM fallback in train()
-
-import paddle
-import paddle.fluid as fluid
-import paddle.fluid.core as core
-import paddle.fluid.framework as framework
-from paddle.fluid.executor import Executor
-
-import sys
-if sys.version[0] == '2':
- reload(sys)
- sys.setdefaultencoding("utf-8")
-sys.path.append('..')
-
-from args import *
-import rc_model
-from dataset import BRCDataset
-import logging
-import pickle
-from utils import normalize
-from utils import compute_bleu_rouge
-from vocab import Vocab
-
-
-def prepare_batch_input(insts, args):
- batch_size = len(insts['raw_data'])
- inst_num = len(insts['passage_num'])
- if batch_size != inst_num:
- print("data error %d, %d" % (batch_size, inst_num))
- return None
- new_insts = []
-
- passage_idx = 0
- for i in range(batch_size):
- p_len = 0
- p_id = []
- p_ids = []
- q_ids = []
- q_id = []
-
- for j in range(insts['passage_num'][i]):
- p_ids.append(insts['passage_token_ids'][passage_idx + j])
- p_id = p_id + insts['passage_token_ids'][passage_idx + j]
- q_ids.append(insts['question_token_ids'][passage_idx + j])
- q_id = q_id + insts['question_token_ids'][passage_idx + j]
-
- passage_idx += insts['passage_num'][i]
- p_len = len(p_id)
-
- def _get_label(idx, ref_len):
- ret = [0.0] * ref_len
- if idx >= 0 and idx < ref_len:
- ret[idx] = 1.0
- return [[x] for x in ret]
-
- start_label = _get_label(insts['start_id'][i], p_len)
- end_label = _get_label(insts['end_id'][i], p_len)
- new_inst = [q_ids, start_label, end_label, p_ids, q_id]
- new_insts.append(new_inst)
- return new_insts
-
-
-def batch_reader(batch_list, args):
- res = []
- for batch in batch_list:
- res.append(prepare_batch_input(batch, args))
- return res
-
-
-def read_multiple(reader, count, clip_last=True):
- """
- Stack data from reader for multi-devices.
- """
-
- def __impl__():
- res = []
- for item in reader():
- res.append(item)
- if len(res) == count:
- yield res
- res = []
- if len(res) == count:
- yield res
- elif not clip_last:
- data = []
- for item in res:
- data += item
- if len(data) > count:
- inst_num_per_part = len(data) // count
- yield [
- data[inst_num_per_part * i:inst_num_per_part * (i + 1)]
- for i in range(count)
- ]
-
- return __impl__
-
-
-def LodTensor_Array(lod_tensor):
- lod = lod_tensor.lod()
- array = np.array(lod_tensor)
- new_array = []
- for i in range(len(lod[0]) - 1):
- new_array.append(array[lod[0][i]:lod[0][i + 1]])
- return new_array
-
-
-def print_para(train_prog, train_exe, logger, args):
- if args.para_print:
- param_list = train_prog.block(0).all_parameters()
- param_name_list = [p.name for p in param_list]
- num_sum = 0
- for p_name in param_name_list:
- p_array = np.array(train_exe.scope.find_var(p_name).get_tensor())
- param_num = np.prod(p_array.shape)
- num_sum = num_sum + param_num
- logger.info(
- "param: {0}, mean={1} max={2} min={3} num={4} {5}".format(
- p_name,
- p_array.mean(),
- p_array.max(), p_array.min(), p_array.shape, param_num))
- logger.info("total param num: {0}".format(num_sum))
-
-
-def find_best_answer_for_passage(start_probs, end_probs, passage_len):
- """
- Finds the best answer with the maximum start_prob * end_prob from a single passage
- """
- if passage_len is None:
- passage_len = len(start_probs)
- else:
- passage_len = min(len(start_probs), passage_len)
- best_start, best_end, max_prob = -1, -1, 0
- for start_idx in range(passage_len):
- for ans_len in range(args.max_a_len):
- end_idx = start_idx + ans_len
- if end_idx >= passage_len:
- continue
- prob = start_probs[start_idx] * end_probs[end_idx]
- if prob > max_prob:
- best_start = start_idx
- best_end = end_idx
- max_prob = prob
- return (best_start, best_end), max_prob
-
-
-def find_best_answer_for_inst(sample, start_prob, end_prob, inst_lod):
- """
- Finds the best answer for a sample given start_prob and end_prob for each position.
- This will call find_best_answer_for_passage because there are multiple passages in a sample
- """
- best_p_idx, best_span, best_score = None, None, 0
- for p_idx, passage in enumerate(sample['passages']):
- if p_idx >= args.max_p_num:
- continue
- if len(start_prob) != len(end_prob):
- logger.info('error: {}'.format(sample['question']))
- continue
- passage_start = inst_lod[p_idx] - inst_lod[0]
- passage_end = inst_lod[p_idx + 1] - inst_lod[0]
- passage_len = passage_end - passage_start
- passage_len = min(args.max_p_len, len(passage['passage_tokens']))
- answer_span, score = find_best_answer_for_passage(
- start_prob[passage_start:passage_end],
- end_prob[passage_start:passage_end], passage_len)
- if score > best_score:
- best_score = score
- best_p_idx = p_idx
- best_span = answer_span
- if best_p_idx is None or best_span is None:
- best_answer = ''
- else:
- best_answer = ''.join(sample['passages'][best_p_idx]['passage_tokens'][
- best_span[0]:best_span[1] + 1])
- return best_answer, best_span
-
-
-def validation(inference_program, avg_cost, s_probs, e_probs, match, feed_order,
- place, dev_count, vocab, brc_data, logger, args):
- """
-
- """
- parallel_executor = fluid.ParallelExecutor(
- main_program=inference_program,
- use_cuda=bool(args.use_gpu),
- loss_name=avg_cost.name)
- print_para(inference_program, parallel_executor, logger, args)
-
- # Use test set as validation each pass
- total_loss = 0.0
- count = 0
- n_batch_cnt = 0
- n_batch_loss = 0.0
- pred_answers, ref_answers = [], []
- val_feed_list = [
- inference_program.global_block().var(var_name)
- for var_name in feed_order
- ]
- val_feeder = fluid.DataFeeder(val_feed_list, place)
- pad_id = vocab.get_id(vocab.pad_token)
- dev_reader = lambda:brc_data.gen_mini_batches('dev', args.batch_size, pad_id, shuffle=False)
- dev_reader = read_multiple(dev_reader, dev_count)
-
- for batch_id, batch_list in enumerate(dev_reader(), 1):
- feed_data = batch_reader(batch_list, args)
- val_fetch_outs = parallel_executor.run(
- feed=list(val_feeder.feed_parallel(feed_data, dev_count)),
- fetch_list=[avg_cost.name, s_probs.name, e_probs.name, match.name],
- return_numpy=False)
- total_loss += np.array(val_fetch_outs[0]).sum()
- start_probs_m = LodTensor_Array(val_fetch_outs[1])
- end_probs_m = LodTensor_Array(val_fetch_outs[2])
- match_lod = val_fetch_outs[3].lod()
- count += len(np.array(val_fetch_outs[0]))
-
- n_batch_cnt += len(np.array(val_fetch_outs[0]))
- n_batch_loss += np.array(val_fetch_outs[0]).sum()
- log_every_n_batch = args.log_interval
- if log_every_n_batch > 0 and batch_id % log_every_n_batch == 0:
- logger.info('Average dev loss from batch {} to {} is {}'.format(
- batch_id - log_every_n_batch + 1, batch_id, "%.10f" % (
- n_batch_loss / n_batch_cnt)))
- n_batch_loss = 0.0
- n_batch_cnt = 0
- batch_offset = 0
- for idx, batch in enumerate(batch_list):
- #one batch
- batch_size = len(batch['raw_data'])
- batch_range = match_lod[0][batch_offset:batch_offset + batch_size +
- 1]
- batch_lod = [[batch_range[x], batch_range[x + 1]]
- for x in range(len(batch_range[:-1]))]
- start_prob_batch = start_probs_m[batch_offset:batch_offset +
- batch_size + 1]
- end_prob_batch = end_probs_m[batch_offset:batch_offset + batch_size
- + 1]
- for sample, start_prob_inst, end_prob_inst, inst_range in zip(
- batch['raw_data'], start_prob_batch, end_prob_batch,
- batch_lod):
- #one instance
- inst_lod = match_lod[1][inst_range[0]:inst_range[1] + 1]
- best_answer, best_span = find_best_answer_for_inst(
- sample, start_prob_inst, end_prob_inst, inst_lod)
- pred = {
- 'question_id': sample['question_id'],
- 'question_type': sample['question_type'],
- 'answers': [best_answer],
- 'entity_answers': [[]],
- 'yesno_answers': [best_span]
- }
- pred_answers.append(pred)
- if 'answers' in sample:
- ref = {
- 'question_id': sample['question_id'],
- 'question_type': sample['question_type'],
- 'answers': sample['answers'],
- 'entity_answers': [[]],
- 'yesno_answers': []
- }
- ref_answers.append(ref)
- batch_offset = batch_offset + batch_size
-
- result_dir = args.result_dir
- result_prefix = args.result_name
- if result_dir is not None and result_prefix is not None:
- if not os.path.exists(args.result_dir):
- os.makedirs(args.result_dir)
- result_file = os.path.join(result_dir, result_prefix + '.json')
- with open(result_file, 'w') as fout:
- for pred_answer in pred_answers:
- fout.write(json.dumps(pred_answer, ensure_ascii=False) + '\n')
- logger.info('Saving {} results to {}'.format(result_prefix,
- result_file))
-
- ave_loss = 1.0 * total_loss / count
- # compute the Bleu and Rouge scores if reference answers are provided
- if len(ref_answers) > 0:
- pred_dict, ref_dict = {}, {}
- for pred, ref in zip(pred_answers, ref_answers):
- question_id = ref['question_id']
- if len(ref['answers']) > 0:
- pred_dict[question_id] = normalize(pred['answers'])
- ref_dict[question_id] = normalize(ref['answers'])
- bleu_rouge = compute_bleu_rouge(pred_dict, ref_dict)
- else:
- bleu_rouge = None
- return ave_loss, bleu_rouge
-
-
-def l2_loss(train_prog):
- param_list = train_prog.block(0).all_parameters()
- para_sum = []
- for para in param_list:
- para_mul = fluid.layers.elementwise_mul(x=para, y=para, axis=0)
- para_sum.append(fluid.layers.reduce_sum(input=para_mul, dim=None))
- return fluid.layers.sums(para_sum) * 0.5
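-
-# Note: l2_loss above computes 0.5 * the sum of ||w||^2 over all trainable
-# parameters; train() below adds weight_decay * l2_loss(main_program) to the
-# cost, which is standard L2 regularization.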
-
-
-def train(logger, args):
- logger.info('Load data_set and vocab...')
- with open(os.path.join(args.vocab_dir, 'vocab.data'), 'rb') as fin:
- if six.PY2:
- vocab = pickle.load(fin)
- else:
- vocab = pickle.load(fin, encoding='bytes')
-    logger.info('vocab size is {} and embed dim is {}'.format(
-        vocab.size(), vocab.embed_dim))
- brc_data = BRCDataset(args.max_p_num, args.max_p_len, args.max_q_len,
- args.trainset, args.devset)
- logger.info('Converting text into ids...')
- brc_data.convert_to_ids(vocab)
- logger.info('Initialize the model...')
-
- if not args.use_gpu:
- place = fluid.CPUPlace()
- dev_count = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
- else:
- place = fluid.CUDAPlace(0)
- dev_count = fluid.core.get_cuda_device_count()
-
- # build model
- main_program = fluid.Program()
- startup_prog = fluid.Program()
- if args.enable_ce:
- main_program.random_seed = args.random_seed
- startup_prog.random_seed = args.random_seed
- with fluid.program_guard(main_program, startup_prog):
- with fluid.unique_name.guard():
- avg_cost, s_probs, e_probs, match, feed_order = rc_model.rc_model(
- args.hidden_size, vocab, args)
- # clone from default main program and use it as the validation program
- inference_program = main_program.clone(for_test=True)
-
- # build optimizer
- if args.optim == 'sgd':
- optimizer = fluid.optimizer.SGD(
- learning_rate=args.learning_rate)
- elif args.optim == 'adam':
- optimizer = fluid.optimizer.Adam(
- learning_rate=args.learning_rate)
- elif args.optim == 'rprop':
- optimizer = fluid.optimizer.RMSPropOptimizer(
- learning_rate=args.learning_rate)
- else:
- logger.error('Unsupported optimizer: {}'.format(args.optim))
- exit(-1)
- if args.weight_decay > 0.0:
- obj_func = avg_cost + args.weight_decay * l2_loss(main_program)
- optimizer.minimize(obj_func)
- else:
- obj_func = avg_cost
- optimizer.minimize(obj_func)
-
-    # initialize parameters (the execution place was already selected above)
- exe = Executor(place)
- if args.load_dir:
- logger.info('load from {}'.format(args.load_dir))
- fluid.io.load_persistables(
- exe, args.load_dir, main_program=main_program)
- else:
- exe.run(startup_prog)
- embedding_para = fluid.global_scope().find_var(
- 'embedding_para').get_tensor()
- embedding_para.set(vocab.embeddings.astype(np.float32), place)
-
- # prepare data
- feed_list = [
- main_program.global_block().var(var_name)
- for var_name in feed_order
- ]
- feeder = fluid.DataFeeder(feed_list, place)
-
- logger.info('Training the model...')
- parallel_executor = fluid.ParallelExecutor(
- main_program=main_program,
- use_cuda=bool(args.use_gpu),
- loss_name=avg_cost.name)
- print_para(main_program, parallel_executor, logger, args)
-
- for pass_id in range(1, args.pass_num + 1):
- pass_start_time = time.time()
- pad_id = vocab.get_id(vocab.pad_token)
- if args.enable_ce:
-            train_reader = lambda: brc_data.gen_mini_batches('train', args.batch_size, pad_id, shuffle=False)
-        else:
-            train_reader = lambda: brc_data.gen_mini_batches('train', args.batch_size, pad_id, shuffle=True)
- train_reader = read_multiple(train_reader, dev_count)
- log_every_n_batch, n_batch_loss = args.log_interval, 0
- total_num, total_loss = 0, 0
- for batch_id, batch_list in enumerate(train_reader(), 1):
- feed_data = batch_reader(batch_list, args)
- fetch_outs = parallel_executor.run(
- feed=list(feeder.feed_parallel(feed_data, dev_count)),
- fetch_list=[obj_func.name],
- return_numpy=False)
- cost_train = np.array(fetch_outs[0]).mean()
- total_num += args.batch_size * dev_count
- n_batch_loss += cost_train
- total_loss += cost_train * args.batch_size * dev_count
-
- if args.enable_ce and batch_id >= 100:
- break
- if log_every_n_batch > 0 and batch_id % log_every_n_batch == 0:
- print_para(main_program, parallel_executor, logger,
- args)
- logger.info(
- 'Average loss from batch {} to {} is {}'.format(
- batch_id - log_every_n_batch + 1, batch_id,
- "%.10f" % (n_batch_loss / log_every_n_batch)))
- n_batch_loss = 0
- if args.dev_interval > 0 and batch_id % args.dev_interval == 0:
- if brc_data.dev_set is not None:
- eval_loss, bleu_rouge = validation(
- inference_program, avg_cost, s_probs, e_probs,
- match, feed_order, place, dev_count, vocab,
- brc_data, logger, args)
- logger.info('Dev eval loss {}'.format(eval_loss))
- logger.info('Dev eval result: {}'.format(
- bleu_rouge))
- pass_end_time = time.time()
-
- logger.info('Evaluating the model after epoch {}'.format(
- pass_id))
- if brc_data.dev_set is not None:
- eval_loss, bleu_rouge = validation(
- inference_program, avg_cost, s_probs, e_probs, match,
- feed_order, place, dev_count, vocab, brc_data, logger,
- args)
- logger.info('Dev eval loss {}'.format(eval_loss))
- logger.info('Dev eval result: {}'.format(bleu_rouge))
- else:
- logger.warning(
- 'No dev set is loaded for evaluation in the dataset!')
- time_consumed = pass_end_time - pass_start_time
- logger.info('Average train loss for epoch {} is {}'.format(
- pass_id, "%.10f" % (1.0 * total_loss / total_num)))
-
- if pass_id % args.save_interval == 0:
- model_path = os.path.join(args.save_dir, str(pass_id))
- if not os.path.isdir(model_path):
- os.makedirs(model_path)
-
- fluid.io.save_persistables(
- executor=exe,
- dirname=model_path,
- main_program=main_program)
- if args.enable_ce: # For CE
- print("kpis\ttrain_cost_card%d\t%f" %
- (dev_count, total_loss / total_num))
- if brc_data.dev_set is not None:
- print("kpis\ttest_cost_card%d\t%f" %
- (dev_count, eval_loss))
- print("kpis\ttrain_duration_card%d\t%f" %
- (dev_count, time_consumed))
-
-
-def evaluate(logger, args):
- logger.info('Load data_set and vocab...')
- with open(os.path.join(args.vocab_dir, 'vocab.data'), 'rb') as fin:
-        if six.PY2:
-            vocab = pickle.load(fin)
-        else:
-            vocab = pickle.load(fin, encoding='bytes')
-    logger.info('vocab size is {} and embed dim is {}'.format(
-        vocab.size(), vocab.embed_dim))
- brc_data = BRCDataset(
- args.max_p_num, args.max_p_len, args.max_q_len, dev_files=args.devset)
- logger.info('Converting text into ids...')
- brc_data.convert_to_ids(vocab)
- logger.info('Initialize the model...')
-
- # build model
- main_program = fluid.Program()
- startup_prog = fluid.Program()
- with fluid.program_guard(main_program, startup_prog):
- with fluid.unique_name.guard():
- avg_cost, s_probs, e_probs, match, feed_order = rc_model.rc_model(
- args.hidden_size, vocab, args)
- # initialize parameters
- if not args.use_gpu:
- place = fluid.CPUPlace()
- dev_count = int(
- os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
- else:
- place = fluid.CUDAPlace(0)
- dev_count = fluid.core.get_cuda_device_count()
-
- exe = Executor(place)
- if args.load_dir:
- logger.info('load from {}'.format(args.load_dir))
- fluid.io.load_persistables(
- exe, args.load_dir, main_program=main_program)
- else:
- logger.error('No model file to load ...')
- return
-
- inference_program = main_program.clone(for_test=True)
- eval_loss, bleu_rouge = validation(
-        inference_program, avg_cost, s_probs, e_probs, match, feed_order,
-        place, dev_count, vocab, brc_data, logger, args)
- logger.info('Dev eval loss {}'.format(eval_loss))
- logger.info('Dev eval result: {}'.format(bleu_rouge))
- logger.info('Predicted answers are saved to {}'.format(
- os.path.join(args.result_dir)))
-
-
-def predict(logger, args):
- logger.info('Load data_set and vocab...')
- with open(os.path.join(args.vocab_dir, 'vocab.data'), 'rb') as fin:
-        if six.PY2:
-            vocab = pickle.load(fin)
-        else:
-            vocab = pickle.load(fin, encoding='bytes')
-    logger.info('vocab size is {} and embed dim is {}'.format(
-        vocab.size(), vocab.embed_dim))
- brc_data = BRCDataset(
- args.max_p_num, args.max_p_len, args.max_q_len, dev_files=args.testset)
- logger.info('Converting text into ids...')
- brc_data.convert_to_ids(vocab)
- logger.info('Initialize the model...')
-
- # build model
- main_program = fluid.Program()
- startup_prog = fluid.Program()
- with fluid.program_guard(main_program, startup_prog):
- with fluid.unique_name.guard():
- avg_cost, s_probs, e_probs, match, feed_order = rc_model.rc_model(
- args.hidden_size, vocab, args)
- # initialize parameters
- if not args.use_gpu:
- place = fluid.CPUPlace()
- dev_count = int(
- os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
- else:
- place = fluid.CUDAPlace(0)
- dev_count = fluid.core.get_cuda_device_count()
-
- exe = Executor(place)
- if args.load_dir:
- logger.info('load from {}'.format(args.load_dir))
- fluid.io.load_persistables(
- exe, args.load_dir, main_program=main_program)
- else:
- logger.error('No model file to load ...')
- return
-
- inference_program = main_program.clone(for_test=True)
- eval_loss, bleu_rouge = validation(
- inference_program, avg_cost, s_probs, e_probs, match,
- feed_order, place, dev_count, vocab, brc_data, logger, args)
-
-
-def prepare(logger, args):
- """
-    Checks the data files, creates the directories, and prepares the
-    vocabulary and embeddings.
- """
- logger.info('Checking the data files...')
- for data_path in args.trainset + args.devset + args.testset:
- assert os.path.exists(data_path), '{} file does not exist.'.format(
- data_path)
- logger.info('Preparing the directories...')
- for dir_path in [args.vocab_dir, args.save_dir, args.result_dir]:
- if not os.path.exists(dir_path):
- os.makedirs(dir_path)
-
- logger.info('Building vocabulary...')
- brc_data = BRCDataset(args.max_p_num, args.max_p_len, args.max_q_len,
- args.trainset, args.devset, args.testset)
- vocab = Vocab(lower=True)
- for word in brc_data.word_iter('train'):
- vocab.add(word)
-
- unfiltered_vocab_size = vocab.size()
- vocab.filter_tokens_by_cnt(min_cnt=2)
- filtered_num = unfiltered_vocab_size - vocab.size()
-    logger.info('After filtering {} tokens, the final vocab size is {}'.format(
- filtered_num, vocab.size()))
-
- logger.info('Assigning embeddings...')
- vocab.randomly_init_embeddings(args.embed_size)
-
- logger.info('Saving vocab...')
- with open(os.path.join(args.vocab_dir, 'vocab.data'), 'wb') as fout:
- pickle.dump(vocab, fout)
-
- logger.info('Done with preparing!')
-
-
-if __name__ == '__main__':
- args = parse_args()
-
- if args.enable_ce:
- random.seed(args.random_seed)
- np.random.seed(args.random_seed)
-
- logger = logging.getLogger("brc")
- logger.setLevel(logging.INFO)
- formatter = logging.Formatter(
- '%(asctime)s - %(name)s - %(levelname)s - %(message)s')
- if args.log_path:
- file_handler = logging.FileHandler(args.log_path)
- file_handler.setLevel(logging.INFO)
- file_handler.setFormatter(formatter)
- logger.addHandler(file_handler)
- else:
- console_handler = logging.StreamHandler()
- console_handler.setLevel(logging.INFO)
- console_handler.setFormatter(formatter)
- logger.addHandler(console_handler)
- logger.info('Running with args : {}'.format(args))
- if args.prepare:
- prepare(logger, args)
- if args.train:
- train(logger, args)
- if args.evaluate:
- evaluate(logger, args)
- if args.predict:
- predict(logger, args)
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/run.sh b/fluid/PaddleNLP/machine_reading_comprehension/run.sh
deleted file mode 100644
index cc381c9ecdcfe1b547e0c11fe8d7fe6149248b21..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/machine_reading_comprehension/run.sh
+++ /dev/null
@@ -1,22 +0,0 @@
-export CUDA_VISIBLE_DEVICES=0
-python run.py \
---trainset 'data/preprocessed/trainset/search.train.json' \
- 'data/preprocessed/trainset/zhidao.train.json' \
---devset 'data/preprocessed/devset/search.dev.json' \
- 'data/preprocessed/devset/zhidao.dev.json' \
---testset 'data/preprocessed/testset/search.test.json' \
- 'data/preprocessed/testset/zhidao.test.json' \
---vocab_dir 'data/vocab' \
---use_gpu true \
---save_dir ./models \
---pass_num 10 \
---learning_rate 0.001 \
---batch_size 32 \
---embed_size 300 \
---hidden_size 150 \
---max_p_num 5 \
---max_p_len 500 \
---max_q_len 60 \
---max_a_len 200 \
---weight_decay 0.0001 \
---drop_rate 0.2 "$@"
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/vocab.py b/fluid/PaddleNLP/machine_reading_comprehension/vocab.py
deleted file mode 100644
index 14b608052132cc5c6f46810778511bc9a6a6915b..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/machine_reading_comprehension/vocab.py
+++ /dev/null
@@ -1,199 +0,0 @@
-# -*- coding:utf8 -*-
-# ==============================================================================
-# Copyright 2017 Baidu.com, Inc. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-# ==============================================================================
-"""
-This module implements the Vocab class for converting string to id and back
-"""
-
-import numpy as np
-
-
-class Vocab(object):
- """
- Implements a vocabulary to store the tokens in the data, with their corresponding embeddings.
- """
-
- def __init__(self, filename=None, initial_tokens=None, lower=False):
- self.id2token = {}
- self.token2id = {}
- self.token_cnt = {}
- self.lower = lower
-
- self.embed_dim = None
- self.embeddings = None
-
-        self.pad_token = '<blank>'
-        self.unk_token = '<unk>'
-
- self.initial_tokens = initial_tokens if initial_tokens is not None else []
- self.initial_tokens.extend([self.pad_token, self.unk_token])
- for token in self.initial_tokens:
- self.add(token)
-
- if filename is not None:
- self.load_from_file(filename)
-
- def size(self):
- """
- get the size of vocabulary
- Returns:
- an integer indicating the size
- """
- return len(self.id2token)
-
- def load_from_file(self, file_path):
- """
- loads the vocab from file_path
- Args:
- file_path: a file with a word in each line
- """
-        with open(file_path, 'r') as fin:
-            for line in fin:
-                token = line.rstrip('\n')
-                self.add(token)
-
- def get_id(self, token):
- """
- gets the id of a token, returns the id of unk token if token is not in vocab
- Args:
- key: a string indicating the word
- Returns:
- an integer
- """
- token = token.lower() if self.lower else token
- try:
- return self.token2id[token]
- except KeyError:
- return self.token2id[self.unk_token]
-
- def get_token(self, idx):
- """
- gets the token corresponding to idx, returns unk token if idx is not in vocab
- Args:
- idx: an integer
- returns:
- a token string
- """
- try:
- return self.id2token[idx]
- except KeyError:
- return self.unk_token
-
- def add(self, token, cnt=1):
- """
- adds the token to vocab
- Args:
- token: a string
- cnt: a num indicating the count of the token to add, default is 1
- """
- token = token.lower() if self.lower else token
- if token in self.token2id:
- idx = self.token2id[token]
- else:
- idx = len(self.id2token)
- self.id2token[idx] = token
- self.token2id[token] = idx
- if cnt > 0:
- if token in self.token_cnt:
- self.token_cnt[token] += cnt
- else:
- self.token_cnt[token] = cnt
- return idx
-
- def filter_tokens_by_cnt(self, min_cnt):
- """
- filter the tokens in vocab by their count
- Args:
-            min_cnt: tokens with frequency less than min_cnt are filtered
- """
- filtered_tokens = [
- token for token in self.token2id if self.token_cnt[token] >= min_cnt
- ]
- # rebuild the token x id map
- self.token2id = {}
- self.id2token = {}
- for token in self.initial_tokens:
- self.add(token, cnt=0)
- for token in filtered_tokens:
- self.add(token, cnt=0)
-
- def randomly_init_embeddings(self, embed_dim):
- """
- randomly initializes the embeddings for each token
- Args:
- embed_dim: the size of the embedding for each token
- """
- self.embed_dim = embed_dim
- self.embeddings = np.random.rand(self.size(), embed_dim)
- for token in [self.pad_token, self.unk_token]:
- self.embeddings[self.get_id(token)] = np.zeros([self.embed_dim])
-
- def load_pretrained_embeddings(self, embedding_path):
- """
- loads the pretrained embeddings from embedding_path,
- tokens not in pretrained embeddings will be filtered
- Args:
- embedding_path: the path of the pretrained embedding file
- """
- trained_embeddings = {}
- with open(embedding_path, 'r') as fin:
- for line in fin:
- contents = line.strip().split()
-                token = contents[0].decode('utf8') if isinstance(
-                    contents[0], bytes) else contents[0]
- if token not in self.token2id:
- continue
- trained_embeddings[token] = list(map(float, contents[1:]))
- if self.embed_dim is None:
- self.embed_dim = len(contents) - 1
- filtered_tokens = trained_embeddings.keys()
- # rebuild the token x id map
- self.token2id = {}
- self.id2token = {}
- for token in self.initial_tokens:
- self.add(token, cnt=0)
- for token in filtered_tokens:
- self.add(token, cnt=0)
- # load embeddings
- self.embeddings = np.zeros([self.size(), self.embed_dim])
- for token in self.token2id.keys():
- if token in trained_embeddings:
- self.embeddings[self.get_id(token)] = trained_embeddings[token]
-
- def convert_to_ids(self, tokens):
- """
- Convert a list of tokens to ids, use unk_token if the token is not in vocab.
- Args:
- tokens: a list of token
- Returns:
- a list of ids
- """
- vec = [self.get_id(label) for label in tokens]
- return vec
-
- def recover_from_ids(self, ids, stop_id=None):
- """
- Convert a list of ids to tokens, stop converting if the stop_id is encountered
- Args:
- ids: a list of ids to convert
- stop_id: the stop id, default is None
- Returns:
- a list of tokens
- """
- tokens = []
- for i in ids:
- tokens += [self.get_token(i)]
- if stop_id is not None and i == stop_id:
- break
- return tokens
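-
-
-# Minimal usage sketch (hypothetical tokens):
-#   vocab = Vocab(lower=True)
-#   vocab.add('hello')
-#   ids = vocab.convert_to_ids(['hello', 'world'])  # 'world' -> the unk id
-#   vocab.recover_from_ids(ids[:1])                 # -> ['hello']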
diff --git a/fluid/PaddleNLP/neural_machine_translation/README.md b/fluid/PaddleNLP/neural_machine_translation/README.md
index a0271ad42e62490282ccc154f6a3c50029b6d13d..08446669df7617a2364c455d5093d086f1fd1a6e 100644
--- a/fluid/PaddleNLP/neural_machine_translation/README.md
+++ b/fluid/PaddleNLP/neural_machine_translation/README.md
@@ -1,9 +1,6 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is the latest develop branch. If you are on a version of PaddlePaddle earlier than this, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
----
+Hi!
-This is a collection of example models for neural machine translation and neural sequence modeling.
+This directory has been deprecated.
-### TODO
-
-This project is still under active development.
+Please visit the project at [PaddleNLP/neural_machine_translation](../../../PaddleNLP/neural_machine_translation).
diff --git a/fluid/PaddleNLP/neural_machine_translation/rnn_search/README.md b/fluid/PaddleNLP/neural_machine_translation/rnn_search/README.md
index 59860114a101b54d8c5f148bd8d725d9bfe778bc..005fb7e2e56c19583bfbeb7997c25fbef5f77578 100644
--- a/fluid/PaddleNLP/neural_machine_translation/rnn_search/README.md
+++ b/fluid/PaddleNLP/neural_machine_translation/rnn_search/README.md
@@ -1,134 +1,2 @@
-Running the example model in this directory requires PaddlePaddle Fluid v1.0. If your installed version of PaddlePaddle is lower than this requirement, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update your PaddlePaddle installation.
-# Machine Translation: RNN Search
-
-The following is a brief directory structure and description of this example model:
-
-```text
-.
-├── README.md             # Documentation, this file
-├── args.py               # Training, inference and model parameters
-├── train.py              # Main training program
-├── infer.py              # Main inference program
-├── attention_model.py    # Configuration of the translation model with attention
-└── no_attention_model.py # Configuration of the translation model without attention
-```
-
-## Introduction
-Machine translation (MT) is the technology of translating between different languages with computers. The language to be translated is usually called the source language, and the language translated into is called the target language. Machine translation, i.e. the process of converting from the source language to the target language, is one of the important research areas of natural language processing.
-
-In recent years, advances in deep learning have kept bringing new breakthroughs to machine translation. Models that map the source language directly to the target language with a neural network, i.e. end-to-end neural machine translation (End-to-End NMT) models, have gradually become mainstream; such models are generally referred to as NMT models.
-
-This directory contains a Paddle Fluid implementation of the classic machine translation model [RNN Search](https://arxiv.org/pdf/1409.0473.pdf). RNN Search is in fact a fairly traditional NMT model whose performance has by now been surpassed by many newer models (such as the [Transformer](https://arxiv.org/abs/1706.03762)). However, beyond machine translation, this model is the foundation of many sequence-to-sequence (Seq2Seq) models, and many models for other NLP problems build on it; it is therefore of great significance in NLP and is widely used as a baseline.
-
-The implementation in this directory aims to show how to use Paddle Fluid to implement an RNN model with an attention mechanism for Seq2Seq problems, and how to use a decoder with beam search. If you simply need a model with strong translation quality, we recommend the [Paddle Fluid implementation of the Transformer](https://github.com/PaddlePaddle/models/tree/develop/fluid/neural_machine_translation/transformer).
-
-## Model overview
-The RNN Search model uses the classic encoder-decoder framework to solve Seq2Seq problems: an encoder first encodes the source sequence into a vector, and a decoder then decodes that vector into the target sequence. This in fact mimics how humans perform translation: first parse the source sentence and understand its meaning, then write the target-language sentence according to that meaning. Both the encoder and the decoder are usually implemented with RNNs. For the underlying principles and mathematical formulation of this approach, see [Deep Learning 101](http://www.paddlepaddle.org/documentation/docs/zh/0.15.0/beginners_guide/basics/machine_translation/index.html).
-
-In this model, the encoder is implemented with a bi-directional recurrent neural network; the decoder is an RNN decoder with an attention mechanism, and a decoder without attention is also provided for comparison; for inference we use the beam search algorithm to generate the translated target sentence. Each of these methods is introduced below.
-
-### Bi-directional recurrent neural network
-Here we introduce the bi-directional recurrent network structure proposed by Bengio's team in \[[2](#references),[4](#references)\]. Given an input sequence, its goal is to obtain a feature representation at every time step, i.e. each output encodes the contextual semantic information up to that moment as a fixed-length vector.
-Specifically, this bi-directional RNN processes the input sequence in temporal order and in reverse order, i.e. forward and backward, and concatenates the RNN outputs at each time step into the final output layer, so that the output node at each time step contains the complete past and future context of the input sequence at that moment. The figure below shows a bi-directional RNN unrolled over time steps. It contains one forward and one backward RNN with six weight matrices: from the input to the forward and backward hidden layers ($W_1, W_3$), from the hidden layers to themselves ($W_2,W_5$), and from the forward and backward hidden layers to the output layer ($W_4, W_6$). Note that there are no connections between the forward and backward hidden layers.
-
-
-
-Figure 1. A bi-directional recurrent neural network unrolled over time steps
-
-
-
-
-Figure 2. Encoder using a bi-directional LSTM
-
-
-### Attention mechanism
-If the output of the encoding phase is a single fixed-dimension vector, two problems arise: 1) whether the source sequence is 5 words or 50 words long, encoding its semantic and syntactic information into a fixed-dimension vector is a very demanding requirement for the model, especially for long sentences; 2) intuitively, when a human translates a sentence, more attention is paid to the source fragments most relevant to the current output, and the focus shifts as the translation proceeds, whereas a fixed-dimension vector amounts to giving equal attention to all of the source information at all times, which is unreasonable. Bahdanau et al. \[[4](#references)\] therefore introduced the attention mechanism, which decodes against encoded context fragments, to address feature learning for long sentences. The decoder structure under attention is described below (a NumPy sketch of the computation follows Figure 3).
-
-Unlike in the simple decoder, $z_i$ is computed here as:
-
-$$z_{i+1}=\phi _{\theta '}\left ( c_i,u_i,z_i \right )$$
-
-As can be seen, the encoded representation of the source sentence is a per-word context fragment $c_i$, i.e. every word $u_i$ in the target language has its own corresponding $c_i$. $c_i$ is computed as:
-
-$$c_i=\sum _{j=1}^{T}a_{ij}h_j, a_i=\left[ a_{i1},a_{i2},...,a_{iT}\right ]$$
-
-As the formula shows, the attention mechanism is realized by a weighted average over the encoder RNN states $h_j$ at each time step. The weight $a_{ij}$ expresses how much attention the $i$-th target word pays to the $j$-th source word and is computed as:
-
-$$a_{ij} = {exp(e_{ij}) \over {\sum_{k=1}^T exp(e_{ik})}}$$
-$$e_{ij} = {align(z_i, h_j)}$$
-
-Here $align$ can be seen as an alignment model that measures how well the $i$-th target word matches the $j$-th source word; concretely, this score is computed from the $i$-th hidden state $z_i$ of the decoder RNN and the $j$-th context fragment $h_j$ of the source sentence. In traditional alignment models, each target word corresponds explicitly to one or more source words (hard alignment); the attention model instead uses soft alignment, where any target word and any source word are related with a strength that is a real number computed by the model, so it can be incorporated into the whole NMT framework and trained with back-propagation.
-
-
-
-Figure 3. Decoder based on the attention mechanism
-
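-To make the formulas above concrete, the following is a minimal NumPy sketch of computing one context vector $c_i$; the toy shapes and the plain dot product used as the $align$ function are illustrative assumptions, not this model's exact scoring function.
-
-```python
-import numpy as np
-
-def softmax(x):
-    e = np.exp(x - x.max())
-    return e / e.sum()
-
-T, d = 5, 4               # source length and hidden size (toy values)
-h = np.random.rand(T, d)  # encoder states h_1 .. h_T
-z_i = np.random.rand(d)   # current decoder hidden state z_i
-
-e_i = h.dot(z_i)          # e_ij = align(z_i, h_j); dot product assumed here
-a_i = softmax(e_i)        # attention weights a_i1 .. a_iT
-c_i = a_i.dot(h)          # context vector c_i: weighted sum of encoder states
-```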
-
-### Beam search
-
-[Beam search](http://en.wikipedia.org/wiki/Beam_search) is a heuristic graph-search algorithm that searches a graph or tree for the optimal nodes to expand within a limited set. It is typically used in systems with a very large solution space (such as machine translation and speech recognition), because memory cannot hold all expanded solutions of the graph or tree. To translate "`你好`" in a machine translation task, for example, even if the target dictionary contains only 3 words (`<s>`, `<e>`, `hello`), an unbounded number of sentences can still be generated (the number of times `hello` repeats is unbounded); beam search helps find the better translations among them.
-
-Beam search builds its search tree with a breadth-first strategy. At each level of the tree, the nodes are sorted by heuristic cost (in this tutorial, the sum of the log probabilities of the generated words), and only a predetermined number of nodes (usually called the beam width or beam size in the literature) is kept. Only these nodes are expanded at the next level; all the others are pruned away. In other words, the higher-quality nodes are kept and the lower-quality ones are pruned, which greatly reduces the space and time needed for the search, at the cost of losing any guarantee of finding the optimal solution.
-
-In the decoding phase with beam search, the goal is to maximize the probability of the generated sequence. The idea is (a minimal sketch of the pruning loop follows this list):
-
-1. At each time step, compute the next hidden state $z_{i+1}$ from the encoding $c$ of the source sentence, the $i$-th generated target word $u_i$, and the hidden state $z_i$ of the RNN at step $i$.
-2. Normalize $z_{i+1}$ through `softmax` to obtain the probability distribution $p_{i+1}$ of the $(i+1)$-th target word.
-3. Sample the word $u_{i+1}$ according to $p_{i+1}$.
-4. Repeat steps 1-3 until the end-of-sentence token `<e>` is generated or the maximum sentence generation length is exceeded.
-
-Note: $z_{i+1}$ and $p_{i+1}$ are computed with the same formulas as in the decoder, and since every generation step is greedy, the global optimum is not guaranteed.
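-
-The following framework-free Python sketch illustrates the expand-and-prune loop described above; `next_word_log_probs` is a hypothetical callback returning (word, log probability) pairs for the next step, standing in for whatever model supplies the distribution.
-
-```python
-import heapq
-
-def beam_search(next_word_log_probs, bos, eos, beam_size=4, max_len=20):
-    beam = [(0.0, [bos])]  # each hypothesis: (cumulative log prob, tokens)
-    for _ in range(max_len):
-        candidates = []
-        for score, seq in beam:
-            if seq[-1] == eos:  # finished hypotheses pass through unchanged
-                candidates.append((score, seq))
-                continue
-            for word, logp in next_word_log_probs(seq):
-                candidates.append((score + logp, seq + [word]))
-        # pruning: keep only the beam_size best-scoring hypotheses
-        beam = heapq.nlargest(beam_size, candidates, key=lambda c: c[0])
-        if all(seq[-1] == eos for _, seq in beam):
-            break
-    return beam
-```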
-
-## Data
-
-This tutorial uses [bitexts (after selection)](http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/data/bitexts.tgz) from the [WMT-14](http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/) dataset as the training set, and [dev+test data](http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/data/dev+test.tgz) as the test and generation sets.
-
-### Data preprocessing
-
-Our preprocessing pipeline consists of two steps (a merging sketch follows this list):
-- Merge every pair of parallel source-to-target corpus files into a single file:
-  - Merge each `XXX.src` and `XXX.trg` file into `XXX`.
-  - The $i$-th line of `XXX` is the $i$-th line of `XXX.src` joined with the $i$-th line of `XXX.trg`, separated by '\t'.
-- Build the "source dictionary" and "target dictionary" of the training data. Each dictionary has **DICTSIZE** words, consisting of the (DICTSIZE - 3) most frequent words in the corpus plus the 3 special tokens `<s>` (start of sequence), `<e>` (end of sequence) and `<unk>` (out-of-vocabulary word).
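-
-As a concrete illustration of the first step, here is a small Python sketch that joins a hypothetical `XXX.src`/`XXX.trg` file pair line by line with a tab; the file names are placeholders.
-
-```python
-def merge_parallel(src_path, trg_path, out_path):
-    # line i of the output is "<src line i>\t<trg line i>"
-    with open(src_path) as fsrc, open(trg_path) as ftrg, \
-            open(out_path, 'w') as fout:
-        for src_line, trg_line in zip(fsrc, ftrg):
-            fout.write(src_line.rstrip('\n') + '\t' +
-                       trg_line.rstrip('\n') + '\n')
-
-# merge_parallel('XXX.src', 'XXX.trg', 'XXX')  # hypothetical file names
-```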
-
-### Example data
-
-Because the complete dataset is quite large, to let you verify the training pipeline, the PaddlePaddle interface paddle.dataset.wmt14 provides a preprocessed [smaller dataset](http://paddlepaddle.bj.bcebos.com/demo/wmt_shrinked_data/wmt14.tgz) by default.
-
-The dataset contains 193319 training sentence pairs and 6003 test sentence pairs, with a dictionary size of 30000. Because of the limited data size, the quality of a model trained on it is not guaranteed.
-
-## Training the model
-
-`train.py` contains the main training function. To start training with the default parameters, simply run:
-```sh
-python train.py
-```
-You can set the model's training parameters with command-line arguments. To show all available command-line arguments, run:
-```sh
-python train.py -h
-```
-This prints a description of every command-line argument together with its default value. The default model uses the attention mechanism. You can also try the model without attention with the following command:
-```sh
-python train.py --no_attention
-```
-Trained models are saved under `./models` by default. You can specify the save path with the command-line argument `--save_dir`. By default, one model is saved at the end of each pass.
-
-## Generating predictions
-
-After the model is trained, `infer.py` can be used to generate predictions. Again, with the default parameters, just run:
-```sh
-python infer.py
-```
-You can likewise specify the parameters on the command line. Note that the parameters used for inference must be exactly the same as those used in training, otherwise loading the model will fail. You can choose which pass's saved model to load with the `--pass_num` argument and set the beam search width with the `--beam_width` argument.
-
-## References
-
-1. Koehn P. [Statistical machine translation](https://books.google.com.hk/books?id=4v_Cx1wIMLkC&printsec=frontcover&hl=zh-CN&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false)[M]. Cambridge University Press, 2009.
-2. Cho K, Van Merriënboer B, Gulcehre C, et al. [Learning phrase representations using RNN encoder-decoder for statistical machine translation](http://www.aclweb.org/anthology/D/D14/D14-1179.pdf)[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2014: 1724-1734.
-3. Chung J, Gulcehre C, Cho K H, et al. [Empirical evaluation of gated recurrent neural networks on sequence modeling](https://arxiv.org/abs/1412.3555)[J]. arXiv preprint arXiv:1412.3555, 2014.
-4. Bahdanau D, Cho K, Bengio Y. [Neural machine translation by jointly learning to align and translate](https://arxiv.org/abs/1409.0473)[C]//Proceedings of ICLR 2015, 2015.
-5. Papineni K, Roukos S, Ward T, et al. [BLEU: a method for automatic evaluation of machine translation](http://dl.acm.org/citation.cfm?id=1073135)[C]//Proceedings of the 40th annual meeting on association for computational linguistics. Association for Computational Linguistics, 2002: 311-318.
-
-
-
-This tutorial was created by PaddlePaddle and is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License.
\ No newline at end of file
+Hello, this project has been migrated. Please go to the [PaddleNLP/neural_machine_translation/rnn_search](../../../../PaddleNLP/neural_machine_translation/rnn_search) directory to browse this project.
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/README.md b/fluid/PaddleNLP/neural_machine_translation/transformer/README.md
index 6fea167b5e7c3e9dd759ef30d9225b451350e889..47a4f78bbb1e18e55442807b0701aef08f370fc0 100644
--- a/fluid/PaddleNLP/neural_machine_translation/transformer/README.md
+++ b/fluid/PaddleNLP/neural_machine_translation/transformer/README.md
@@ -1,23 +1,2 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is the latest develop branch. If you are on a version of PaddlePaddle earlier than this, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
----
-
-# Attention is All You Need: A Paddle Fluid implementation
-
-This is a Paddle Fluid implementation of the Transformer model in [Attention is All You Need](https://arxiv.org/abs/1706.03762) (Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin, arxiv, 2017).
-
-If you use the dataset/code in your research, please cite the paper:
-
-```text
-@inproceedings{vaswani2017attention,
- title={Attention is all you need},
- author={Vaswani, Ashish and Shazeer, Noam and Parmar, Niki and Uszkoreit, Jakob and Jones, Llion and Gomez, Aidan N and Kaiser, {\L}ukasz and Polosukhin, Illia},
- booktitle={Advances in Neural Information Processing Systems},
- pages={6000--6010},
- year={2017}
-}
-```
-
-### TODO
-
-This project is still under active development.
+Hello, this project has been migrated. Please go to the [PaddleNLP/neural_machine_translation/transformer](../../../../PaddleNLP/neural_machine_translation/transformer) directory to browse this project.
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/README_cn.md b/fluid/PaddleNLP/neural_machine_translation/transformer/README_cn.md
deleted file mode 100644
index 7e7a09e7a1e4e8dcfddc5dbbc27c94a757d80d9e..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/neural_machine_translation/transformer/README_cn.md
+++ /dev/null
@@ -1,251 +0,0 @@
-Running the program examples in this directory requires the latest develop branch of PaddlePaddle. If your installed version of PaddlePaddle is lower than this requirement, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update your PaddlePaddle installation.
-
----
-
-## Transformer
-
-The following is a brief directory structure and description of this example:
-
-```text
-.
-├── images               # Images used in the README
-├── config.py            # Configuration of training, inference and model parameters
-├── infer.py             # Inference script
-├── model.py             # Model definition
-├── optim.py             # Learning rate scheduling program
-├── reader.py            # Data reading interface
-├── README.md            # Documentation
-├── train.py             # Training script
-└── gen_data.sh          # Data generation script
-```
-
-### Introduction
-
-The Transformer is a novel network architecture proposed in the paper [Attention Is All You Need](https://arxiv.org/abs/1706.03762) for sequence-to-sequence (Seq2Seq) learning tasks such as machine translation (MT); it performs sequence-to-sequence modeling entirely with attention mechanisms [1].
-
-Compared with the recurrent neural networks (RNN) widely used in previous Seq2Seq models, transforming the input sequence into the output sequence with (self-)attention has the following main advantages:
-
-- Lower computational complexity
-  - For a sequence of feature dimension d and length n, the complexity in an RNN is `O(n * d * d)` (n time steps, each computing a d-dimensional matrix-vector product), while in self-attention it is `O(n * n * d)` (d-dimensional dot products or other similarity functions computed between all pairs of the n time steps); n is usually smaller than d.
-- Higher parallelism
-  - In an RNN, the computation at the current time step depends on the result of the previous time step; in self-attention, the computation at each time step depends only on the input and not on earlier outputs, so all time steps can run fully in parallel.
-- Easier learning of long-range dependencies
-  - In an RNN, building a relation between two positions n apart takes n steps; in self-attention any two positions are directly connected, and the shorter the path, the easier the signal propagates.
-
-This has also been confirmed on machine translation tasks: the Transformer set a new high in BLEU on the WMT'14 English-German translation task while greatly reducing training time. The Transformer also performs well when applied to constituency parsing, showing high generality and easy transfer to other application scenarios. All of this indicates a broad future for the Transformer.
-
-### Model overview
-
-The Transformer also uses the encoder-decoder framework typical of Seq2Seq models; the overall network structure is shown in Figure 1.
-
-
-
-Figure 1. Transformer network structure
-
-
-The Encoder is a stack of several identical layers, each mainly consisting of two sub-layers: multi-head attention (Multi-Head Attention) and a fully connected feed-forward (Feed-Forward) network.
-- Multi-head attention is used here to implement self-attention. Compared with a simple attention mechanism, it applies several linear projections to the input, computes attention separately for each projection, concatenates all the results, and applies one more linear projection to produce the output (see Figure 2; a sketch of the head split/combine reshaping follows below). The attention used is dot-product (Dot-Product) attention, scaled after the dot product to avoid pushing overly large values into the saturated region of softmax.
-- The feed-forward network performs the same computation at every position of the sequence (position-wise), using two linear transformations with a ReLU activation in between.
-
-In addition, each sub-layer is followed by a residual connection [2] and layer normalization [3] to aid gradient propagation and model convergence.
-
-
-
-Figure 2. Multi-Head Attention
-
-
-The Decoder has a structure similar to the Encoder, except that each decoder layer contains one extra multi-head attention sub-layer attending over the Encoder output; this encoder-decoder attention also exists in other Seq2Seq models.
-
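-As an aside, the head split/combine reshaping mentioned above can be sketched in a few lines of NumPy; the shapes are toy values chosen purely for illustration.
-
-```python
-import numpy as np
-
-bs, seq_len, n_head, d_head = 2, 6, 8, 64
-x = np.random.rand(bs, seq_len, n_head * d_head)
-
-# split heads: [bs, seq_len, n_head * d_head] -> [bs, n_head, seq_len, d_head]
-split = x.reshape(bs, seq_len, n_head, d_head).transpose(0, 2, 1, 3)
-
-# combining heads is the inverse transform, back to the original layout
-combined = split.transpose(0, 2, 1, 3).reshape(bs, seq_len, n_head * d_head)
-assert np.array_equal(x, combined)
-```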
-
-### Data preparation
-
-The WMT datasets are the recognized mainstream datasets in machine translation. The [WMT'16 EN-DE dataset](http://www.statmt.org/wmt16/translation-task.html) is a medium-sized one among them and one of the datasets used in the Transformer paper; we take it as the example here. You can run the `gen_data.sh` script directly to download and preprocess the WMT'16 EN-DE dataset. The processing mainly includes tokenization and byte-pair encoding (BPE); BPE-encoded data alleviates the out-of-vocabulary (OOV) problem well [4] and was also used in the Transformer paper. On success, a folder `gen_data` is created with the following structure (modifiable in `gen_data.sh`):
-
-```text
-.
-├── wmt16_ende_data      # WMT16 English-German translation data
-├── wmt16_ende_data_bpe  # BPE-encoded WMT16 English-German translation data
-├── mosesdecoder         # Moses machine translation toolkit, including tokenization, BLEU evaluation and other scripts
-└── subword-nmt          # Code for BPE encoding
-```
-
-`gen_data/wmt16_ende_data_bpe` holds the English-German translation data we finally use: `train.tok.clean.bpe.32000.en-de` is the training data, `newstest2016.tok.bpe.32000.en-de` etc. are validation and test data, and `vocab_all.bpe.32000` is the corresponding dictionary file (the three special tokens `<s>`, `<e>` and `<unk>` have been added; the source and target languages share this dictionary file).
-
-For other custom data, convert it into the same format as `train.tok.clean.bpe.32000.en-de` (source and target sentence pairs separated by `\t`, with tokens within a sentence separated by spaces); if BPE encoding is needed, you can refer to the `subword-nmt` code above, or process the data with `gen_data.sh` as for WMT.
-
-### Model training
-
-`train.py` is the model training script. Taking the English-German translation data as an example, you can run the following command to train the model:
-```sh
-python -u train.py \
- --src_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
- --trg_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
- --special_token '<s>' '<e>' '<unk>' \
- --train_file_pattern gen_data/wmt16_ende_data_bpe/train.tok.clean.bpe.32000.en-de \
- --token_delimiter ' ' \
- --use_token_batch True \
- --batch_size 4096 \
- --sort_type pool \
- --pool_size 200000
-```
-The command above sets data-related parameters such as the source dictionary file path (`src_vocab_fpath`), the target dictionary file path (`trg_vocab_fpath`) and the training data files (`train_file_pattern`, wildcards supported), as well as reader-related parameters such as how batches are composed (`use_token_batch` specifies whether batches are formed by token count or by sequence count). More detailed information about these parameters can be shown by running:
-```sh
-python train.py --help
-```
-
-More training-related parameters are defined in `ModelHyperParams` and `TrainTaskConfig` in `config.py`: `ModelHyperParams` defines model hyperparameters such as the embedding size, and `TrainTaskConfig` defines training parameters such as the number of warmup steps. By default these use the base model configuration from the Transformer paper and can be modified in that script. They can also be set on the command line of the training script; the passed-in settings are merged with and override those in `config.py`. For example, the big model from the Transformer paper can be trained with the following command (if GPU memory is insufficient, reduce the batch size appropriately, set `max_length 200` to filter out overly long sentences, or adjust the values of some memory-related environment variables):
-
-```sh
-# fraction of GPU memory to use; increase appropriately if memory is insufficient, 1 at most
-export FLAGS_fraction_of_gpu_memory_to_use=1.0
-# threshold (GB) for eagerly freeing GPU memory; decrease appropriately if memory is insufficient, 0 at least, disabled when negative
-export FLAGS_eager_delete_tensor_gb=0.8
-python -u train.py \
- --src_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
- --trg_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
- --special_token '<s>' '<e>' '<unk>' \
- --train_file_pattern gen_data/wmt16_ende_data_bpe/train.tok.clean.bpe.32000.en-de \
- --token_delimiter ' ' \
- --use_token_batch True \
- --batch_size 3200 \
- --sort_type pool \
- --pool_size 200000 \
- n_layer 6 \
- n_head 16 \
- d_model 1024 \
- d_inner_hid 4096 \
- prepostprocess_dropout 0.3
-```
-For more detailed information about these parameters, please refer to the comments in `config.py`.
-
-All GPUs are used for training by default; the number of GPUs used can be set with the `CUDA_VISIBLE_DEVICES` environment variable. Training can also run on CPU only (set with the `--device CPU` argument), which is relatively slow. During training, the model is saved to the directory given by the `model_dir` parameter every certain number of iterations (set with the `save_freq` parameter, default 10000); a checkpoint is also saved to the directory given by `ckpt_dir` at the end of each epoch; and every certain number of iterations (set with the `--fetch_steps` parameter, default 100) a log like the following is printed to standard output:
-```txt
-[2018-10-26 00:49:24,705 INFO train.py:536] step_idx: 0, epoch: 0, batch: 0, avg loss: 10.999878, normalized loss: 9.624138, ppl: 59866.832031
-[2018-10-26 00:50:08,717 INFO train.py:545] step_idx: 100, epoch: 0, batch: 100, avg loss: 9.454134, normalized loss: 8.078394, ppl: 12760.809570, speed: 2.27 step/s
-[2018-10-26 00:50:52,655 INFO train.py:545] step_idx: 200, epoch: 0, batch: 200, avg loss: 8.643907, normalized loss: 7.268166, ppl: 5675.458496, speed: 2.28 step/s
-[2018-10-26 00:51:36,529 INFO train.py:545] step_idx: 300, epoch: 0, batch: 300, avg loss: 7.916654, normalized loss: 6.540914, ppl: 2742.579346, speed: 2.28 step/s
-[2018-10-26 00:52:20,692 INFO train.py:545] step_idx: 400, epoch: 0, batch: 400, avg loss: 7.902879, normalized loss: 6.527138, ppl: 2705.058350, speed: 2.26 step/s
-[2018-10-26 00:53:04,537 INFO train.py:545] step_idx: 500, epoch: 0, batch: 500, avg loss: 7.818271, normalized loss: 6.442531, ppl: 2485.604492, speed: 2.28 step/s
-[2018-10-26 00:53:48,580 INFO train.py:545] step_idx: 600, epoch: 0, batch: 600, avg loss: 7.554341, normalized loss: 6.178601, ppl: 1909.012451, speed: 2.27 step/s
-[2018-10-26 00:54:32,878 INFO train.py:545] step_idx: 700, epoch: 0, batch: 700, avg loss: 7.177765, normalized loss: 5.802025, ppl: 1309.977661, speed: 2.26 step/s
-[2018-10-26 00:55:17,108 INFO train.py:545] step_idx: 800, epoch: 0, batch: 800, avg loss: 7.005494, normalized loss: 5.629754, ppl: 1102.674805, speed: 2.26 step/s
-```
-
-### Model inference
-
-`infer.py` is the model inference script. Taking the English-German translation data as an example, after training is complete you can run the following command to translate the text in the specified files:
-```sh
-python -u infer.py \
- --src_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
- --trg_vocab_fpath gen_data/wmt16_ende_data_bpe/vocab_all.bpe.32000 \
- --special_token '<s>' '<e>' '<unk>' \
- --test_file_pattern gen_data/wmt16_ende_data_bpe/newstest2016.tok.bpe.32000.en-de \
- --token_delimiter ' ' \
- --batch_size 32 \
- model_path trained_models/iter_100000.infer.model \
- beam_size 4 \
- max_out_len 255
-```
-Similar to training, inference also requires setting the data- and reader-related parameters, and you can run `python infer.py --help` to see descriptions of these parameters (some differ slightly in meaning from training). Model hyperparameters can likewise be set on the inference command line, but they must be consistent with the settings used for training. In addition, inference has a few extra parameters compared to training: `model_path` must be set to the directory containing the model, and `beam_size` and `max_out_len` can be set to specify the beam width and the maximum search depth (translation length) of beam search; these parameters can also be inspected and changed in `InferTaskConfig` in `config.py`.
-
-The inference command above prints the translation results to standard output, one highest-scoring translation per input line. For the BPE-encoded English-German data, the predicted translations are also BPE-encoded and must be restored to the original (here, tokenized) form before correct evaluation. The following command restores the translations in `predict.txt` into `predict.tok.txt` (no further tokenization is needed):
-```sh
-sed -r 's/(@@ )|(@@ ?$)//g' predict.txt > predict.tok.txt
-```
-
-You can then evaluate the BLEU score of the translations against the reference translations. Taking the English-German `newstest2016.tok.de` data as an example, run the following command:
-```sh
-perl gen_data/mosesdecoder/scripts/generic/multi-bleu.perl gen_data/wmt16_ende_data/newstest2016.tok.de < predict.tok.txt
-```
-You should see a result similar to the following (the prediction of a model trained for 200K iterations on a single machine with two GPUs).
-```
-BLEU = 33.08, 64.2/39.2/26.4/18.5 (BP=0.994, ratio=0.994, hyp_len=61971, ref_len=62362)
-```
-Currently, without model averaging, the test BLEU of the English-German base model trained for 100K iterations on eight GPUs is:
-
-| Test set | newstest2014 | newstest2015 | newstest2016 |
-|-|-|-|-|
-| BLEU | 26.25 | 29.15 | 33.64 |
-
-
-### Distributed training
-
-The Transformer model supports synchronous and asynchronous distributed training. The distributed configuration has two main aspects:
-
-1 Command-line configuration
-
-  - `--local`: takes two values; `True` means single-machine training and `False` means distributed training. Defaults to single-machine mode.
-
-  - `--sync`: takes two values, but only has an effect when `--local` is False; `True` means synchronous training and `False` means asynchronous training. Defaults to synchronous mode.
-
-2 Environment variable configuration
-
-  In distributed training mode, the numbers of trainers and pservers are configured manually. In the network topology, every trainer connects to every pserver; pservers act as the server side and trainers as clients. The specific parameter configurations for pservers and trainers are described below:
-
-1) pserver configuration
-
-- `PADDLE_IS_LOCAL=[0|1]` whether training is distributed: `0` means distributed, `1` means single-machine
-
-- `TRAINING_ROLE=PSERVER` marks the current node as a pserver
-
-- `POD_IP=ip` sets the address on which the current pserver serves externally
-
-- `PADDLE_PORT=port` sets the port on which the current pserver listens; together with `POD_IP` it forms the unique external identifier
-
-- `PADDLE_TRAINERS_NUM=num` sets the number of trainers connecting to the pserver
-
-Below is an example configuration using two pservers. The configuration on 192.168.2.2 is:
-```
-export PADDLE_PSERVERS=192.168.2.2,192.168.2.3
-export POD_IP=192.168.2.2
-export PADDLE_TRAINERS_NUM=2
-export TRAINING_ROLE=PSERVER
-export PADDLE_IS_LOCAL=0
-export PADDLE_PORT=6177
-```
-The configuration on 192.168.2.3 is:
-```
-export PADDLE_PSERVERS=192.168.2.2,192.168.2.3
-export POD_IP=192.168.2.3
-export PADDLE_TRAINERS_NUM=2
-export TRAINING_ROLE=PSERVER
-export PADDLE_IS_LOCAL=0
-export PADDLE_PORT=6177
-```
-2) trainer configuration
-
-- `PADDLE_IS_LOCAL=[0|1]` whether training is distributed: `0` means distributed, `1` means single-machine
-
-- `TRAINING_ROLE=TRAINER` marks the current node as a trainer
-
-- `PADDLE_PSERVERS=[ip1,ip2,...]` sets the pserver IP addresses, telling the trainer which pservers to connect to, separated by `,`
-
-- `PADDLE_TRAINER_ID=num` sets the index of the current node, an integer in the range 0 to N-1
-
-- `PADDLE_PORT=port` sets the pserver service port to request
-
-Below is an example configuration using two trainers. The configuration on trainer 1 is:
-```
-export TRAINING_ROLE=TRAINER
-export PADDLE_PSERVERS=192.168.2.2,192.168.2.3
-export PADDLE_TRAINERS_NUM=2
-export PADDLE_TRAINER_ID=0
-export PADDLE_IS_LOCAL=0
-export PADDLE_PORT=6177
-```
-The configuration on trainer 2 is:
-```
-export TRAINING_ROLE=TRAINER
-export PADDLE_PSERVERS=192.168.2.2,192.168.2.3
-export PADDLE_TRAINERS_NUM=2
-export PADDLE_TRAINER_ID=1
-export PADDLE_IS_LOCAL=0
-export PADDLE_PORT=6177
-```
-
-### References
-1. Vaswani A, Shazeer N, Parmar N, et al. [Attention is all you need](http://papers.nips.cc/paper/7181-attention-is-all-you-need.pdf)[C]//Advances in Neural Information Processing Systems. 2017: 6000-6010.
-2. He K, Zhang X, Ren S, et al. [Deep residual learning for image recognition](http://openaccess.thecvf.com/content_cvpr_2016/papers/He_Deep_Residual_Learning_CVPR_2016_paper.pdf)[C]//Proceedings of the IEEE conference on computer vision and pattern recognition. 2016: 770-778.
-3. Ba J L, Kiros J R, Hinton G E. [Layer normalization](https://arxiv.org/pdf/1607.06450.pdf)[J]. arXiv preprint arXiv:1607.06450, 2016.
-4. Sennrich R, Haddow B, Birch A. [Neural machine translation of rare words with subword units](https://arxiv.org/pdf/1508.07909)[J]. arXiv preprint arXiv:1508.07909, 2015.
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/config.py b/fluid/PaddleNLP/neural_machine_translation/transformer/config.py
deleted file mode 100644
index ca119aa6fd0878b1e2cea5c0eaba050b54348f79..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/neural_machine_translation/transformer/config.py
+++ /dev/null
@@ -1,197 +0,0 @@
-class TrainTaskConfig(object):
- # support both CPU and GPU now.
- use_gpu = True
- # the epoch number to train.
- pass_num = 30
- # the number of sequences contained in a mini-batch.
- # deprecated, set batch_size in args.
- batch_size = 32
- # the hyper parameters for Adam optimizer.
-    # This static learning_rate will be multiplied by the learning rate
-    # derived from the LearningRateScheduler to get the final learning rate.
- learning_rate = 2.0
- beta1 = 0.9
- beta2 = 0.997
- eps = 1e-9
- # the parameters for learning rate scheduling.
- warmup_steps = 8000
- # the weight used to mix up the ground-truth distribution and the fixed
- # uniform distribution in label smoothing when training.
- # Set this as zero if label smoothing is not wanted.
- label_smooth_eps = 0.1
- # the directory for saving trained models.
- model_dir = "trained_models"
- # the directory for saving checkpoints.
- ckpt_dir = "trained_ckpts"
- # the directory for loading checkpoint.
- # If provided, continue training from the checkpoint.
- ckpt_path = None
- # the parameter to initialize the learning rate scheduler.
- # It should be provided if use checkpoints, since the checkpoint doesn't
- # include the training step counter currently.
- start_step = 0
- # the frequency to save trained models.
- save_freq = 10000
-
-
-class InferTaskConfig(object):
- use_gpu = True
- # the number of examples in one run for sequence generation.
- batch_size = 10
- # the parameters for beam search.
- beam_size = 5
- max_out_len = 256
- # the number of decoded sentences to output.
- n_best = 1
- # the flags indicating whether to output the special tokens.
- output_bos = False
- output_eos = False
- output_unk = True
- # the directory for loading the trained model.
- model_path = "trained_models/pass_1.infer.model"
-
-
-class ModelHyperParams(object):
- # These following five vocabularies related configurations will be set
- # automatically according to the passed vocabulary path and special tokens.
- # size of source word dictionary.
- src_vocab_size = 10000
-    # size of target word dictionary
- trg_vocab_size = 10000
-    # index for <bos> token
-    bos_idx = 0
-    # index for <eos> token
-    eos_idx = 1
-    # index for <unk> token
-    unk_idx = 2
- # max length of sequences deciding the size of position encoding table.
- max_length = 256
- # the dimension for word embeddings, which is also the last dimension of
- # the input and output of multi-head attention, position-wise feed-forward
- # networks, encoder and decoder.
- d_model = 512
- # size of the hidden layer in position-wise feed-forward networks.
- d_inner_hid = 2048
- # the dimension that keys are projected to for dot-product attention.
- d_key = 64
- # the dimension that values are projected to for dot-product attention.
- d_value = 64
- # number of head used in multi-head attention.
- n_head = 8
- # number of sub-layers to be stacked in the encoder and decoder.
- n_layer = 6
- # dropout rates of different modules.
- prepostprocess_dropout = 0.1
- attention_dropout = 0.1
- relu_dropout = 0.1
- # to process before each sub-layer
- preprocess_cmd = "n" # layer normalization
- # to process after each sub-layer
- postprocess_cmd = "da" # dropout + residual connection
- # random seed used in dropout for CE.
- dropout_seed = None
- # the flag indicating whether to share embedding and softmax weights.
- # vocabularies in source and target should be same for weight sharing.
- weight_sharing = True
-
-
-def merge_cfg_from_list(cfg_list, g_cfgs):
- """
- Set the above global configurations using the cfg_list.
- """
- assert len(cfg_list) % 2 == 0
- for key, value in zip(cfg_list[0::2], cfg_list[1::2]):
- for g_cfg in g_cfgs:
- if hasattr(g_cfg, key):
- try:
- value = eval(value)
- except Exception: # for file path
- pass
- setattr(g_cfg, key, value)
- break
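-
-# Illustrative (hypothetical) usage of the helper above:
-#   merge_cfg_from_list(["batch_size", "64", "model_dir", "my_models"],
-#                       [TrainTaskConfig])
-# sets TrainTaskConfig.batch_size to the int 64 (via eval) and keeps
-# TrainTaskConfig.model_dir as the raw string "my_models" (eval raises, so
-# the string is used as-is).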
-
-
-# The placeholder for batch_size in compile time. Must be -1 currently to be
-# consistent with some ops' infer-shape output in compile time, such as the
-# sequence_expand op used in beamsearch decoder.
-batch_size = -1
-# The placeholder for squence length in compile time.
-seq_len = ModelHyperParams.max_length
-# Here list the data shapes and data types of all inputs.
-# The shapes here act as placeholder and are set to pass the infer-shape in
-# compile time.
-input_descs = {
- # The actual data shape of src_word is:
- # [batch_size, max_src_len_in_batch, 1]
- "src_word": [(batch_size, seq_len, 1), "int64", 2],
- # The actual data shape of src_pos is:
- # [batch_size, max_src_len_in_batch, 1]
- "src_pos": [(batch_size, seq_len, 1), "int64"],
- # This input is used to remove attention weights on paddings in the
- # encoder.
- # The actual data shape of src_slf_attn_bias is:
- # [batch_size, n_head, max_src_len_in_batch, max_src_len_in_batch]
- "src_slf_attn_bias": [(batch_size, ModelHyperParams.n_head, seq_len,
- seq_len), "float32"],
- # The actual data shape of trg_word is:
- # [batch_size, max_trg_len_in_batch, 1]
- "trg_word": [(batch_size, seq_len, 1), "int64",
- 2], # lod_level is only used in fast decoder.
- # The actual data shape of trg_pos is:
- # [batch_size, max_trg_len_in_batch, 1]
- "trg_pos": [(batch_size, seq_len, 1), "int64"],
- # This input is used to remove attention weights on paddings and
- # subsequent words in the decoder.
- # The actual data shape of trg_slf_attn_bias is:
- # [batch_size, n_head, max_trg_len_in_batch, max_trg_len_in_batch]
- "trg_slf_attn_bias": [(batch_size, ModelHyperParams.n_head, seq_len,
- seq_len), "float32"],
- # This input is used to remove attention weights on paddings of the source
- # input in the encoder-decoder attention.
- # The actual data shape of trg_src_attn_bias is:
- # [batch_size, n_head, max_trg_len_in_batch, max_src_len_in_batch]
- "trg_src_attn_bias": [(batch_size, ModelHyperParams.n_head, seq_len,
- seq_len), "float32"],
- # This input is used in independent decoder program for inference.
- # The actual data shape of enc_output is:
- # [batch_size, max_src_len_in_batch, d_model]
- "enc_output": [(batch_size, seq_len, ModelHyperParams.d_model), "float32"],
- # The actual data shape of label_word is:
- # [batch_size * max_trg_len_in_batch, 1]
- "lbl_word": [(batch_size * seq_len, 1), "int64"],
- # This input is used to mask out the loss of paddding tokens.
- # The actual data shape of label_weight is:
- # [batch_size * max_trg_len_in_batch, 1]
- "lbl_weight": [(batch_size * seq_len, 1), "float32"],
- # This input is used in beam-search decoder.
- "init_score": [(batch_size, 1), "float32"],
-}
-
-# Names of word embedding table which might be reused for weight sharing.
-word_emb_param_names = (
- "src_word_emb_table",
- "trg_word_emb_table", )
-# Names of position encoding table which will be initialized externally.
-pos_enc_param_names = (
- "src_pos_enc_table",
- "trg_pos_enc_table", )
-# separated inputs for different usages.
-encoder_data_input_fields = (
- "src_word",
- "src_pos",
- "src_slf_attn_bias", )
-decoder_data_input_fields = (
- "trg_word",
- "trg_pos",
- "trg_slf_attn_bias",
- "trg_src_attn_bias",
- "enc_output", )
-label_data_input_fields = (
- "lbl_word",
- "lbl_weight", )
-# In fast decoder, trg_pos (only containing the current time step) is generated
-# by ops and trg_slf_attn_bias is not needed.
-fast_decoder_data_input_fields = (
- "trg_word",
- "init_score",
- "trg_src_attn_bias", )
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/infer.py b/fluid/PaddleNLP/neural_machine_translation/transformer/infer.py
deleted file mode 100644
index 6fc04a9422c136d941559d1b45af8bd88c2d2460..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/neural_machine_translation/transformer/infer.py
+++ /dev/null
@@ -1,228 +0,0 @@
-import argparse
-import ast
-import numpy as np
-from functools import partial
-
-import paddle
-import paddle.fluid as fluid
-
-import model
-from model import wrap_encoder as encoder
-from model import wrap_decoder as decoder
-from model import fast_decode as fast_decoder
-from config import *
-from train import pad_batch_data
-import reader
-
-
-def parse_args():
-    parser = argparse.ArgumentParser("Inference for Transformer.")
- parser.add_argument(
- "--src_vocab_fpath",
- type=str,
- required=True,
- help="The path of vocabulary file of source language.")
- parser.add_argument(
- "--trg_vocab_fpath",
- type=str,
- required=True,
- help="The path of vocabulary file of target language.")
- parser.add_argument(
- "--test_file_pattern",
- type=str,
- required=True,
- help="The pattern to match test data files.")
- parser.add_argument(
- "--batch_size",
- type=int,
- default=50,
- help="The number of examples in one run for sequence generation.")
- parser.add_argument(
- "--pool_size",
- type=int,
- default=10000,
- help="The buffer size to pool data.")
- parser.add_argument(
- "--special_token",
- type=str,
-        default=["<s>", "<e>", "<unk>"],
-        nargs=3,
-        help="The <s>, <e> and <unk> tokens in the dictionary.")
- parser.add_argument(
- "--token_delimiter",
- type=lambda x: str(x.encode().decode("unicode-escape")),
- default=" ",
- help="The delimiter used to split tokens in source or target sentences. "
- "For EN-DE BPE data we provided, use spaces as token delimiter. ")
- parser.add_argument(
- 'opts',
- help='See config.py for all options',
- default=None,
- nargs=argparse.REMAINDER)
- args = parser.parse_args()
- # Append args related to dict
- src_dict = reader.DataReader.load_dict(args.src_vocab_fpath)
- trg_dict = reader.DataReader.load_dict(args.trg_vocab_fpath)
- dict_args = [
- "src_vocab_size", str(len(src_dict)), "trg_vocab_size",
- str(len(trg_dict)), "bos_idx", str(src_dict[args.special_token[0]]),
- "eos_idx", str(src_dict[args.special_token[1]]), "unk_idx",
- str(src_dict[args.special_token[2]])
- ]
- merge_cfg_from_list(args.opts + dict_args,
- [InferTaskConfig, ModelHyperParams])
- return args
-
-
-def post_process_seq(seq,
- bos_idx=ModelHyperParams.bos_idx,
- eos_idx=ModelHyperParams.eos_idx,
- output_bos=InferTaskConfig.output_bos,
- output_eos=InferTaskConfig.output_eos):
- """
-    Post-process the beam-search decoded sequence. Truncate from the first
-    <eos> and remove the <bos> and <eos> tokens currently.
- """
- eos_pos = len(seq) - 1
- for i, idx in enumerate(seq):
- if idx == eos_idx:
- eos_pos = i
- break
- seq = [
- idx for idx in seq[:eos_pos + 1]
- if (output_bos or idx != bos_idx) and (output_eos or idx != eos_idx)
- ]
- return seq
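-
-# Illustrative example (hypothetical values): with bos_idx=0, eos_idx=1 and
-# the default output flags, post_process_seq([0, 5, 7, 1, 3]) truncates at
-# the first eos and strips the special tokens, returning [5, 7].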
-
-
-def prepare_batch_input(insts, data_input_names, src_pad_idx, bos_idx, n_head,
- d_model, place):
- """
- Put all padded data needed by beam search decoder into a dict.
- """
- src_word, src_pos, src_slf_attn_bias, src_max_len = pad_batch_data(
- [inst[0] for inst in insts], src_pad_idx, n_head, is_target=False)
- # start tokens
- trg_word = np.asarray([[bos_idx]] * len(insts), dtype="int64")
- trg_src_attn_bias = np.tile(src_slf_attn_bias[:, :, ::src_max_len, :],
- [1, 1, 1, 1]).astype("float32")
- trg_word = trg_word.reshape(-1, 1, 1)
- src_word = src_word.reshape(-1, src_max_len, 1)
- src_pos = src_pos.reshape(-1, src_max_len, 1)
-
- def to_lodtensor(data, place, lod=None):
- data_tensor = fluid.LoDTensor()
- data_tensor.set(data, place)
- if lod is not None:
- data_tensor.set_lod(lod)
- return data_tensor
-
- # beamsearch_op must use tensors with lod
- init_score = to_lodtensor(
- np.zeros_like(
- trg_word, dtype="float32").reshape(-1, 1),
- place, [range(trg_word.shape[0] + 1)] * 2)
- trg_word = to_lodtensor(trg_word, place, [range(trg_word.shape[0] + 1)] * 2)
-
- data_input_dict = dict(
- zip(data_input_names, [
- src_word, src_pos, src_slf_attn_bias, trg_word, init_score,
- trg_src_attn_bias
- ]))
-
- input_dict = dict(data_input_dict.items())
- return input_dict
-
-
-def fast_infer(test_data, trg_idx2word):
- """
- Inference by beam search decoder based solely on Fluid operators.
- """
- place = fluid.CUDAPlace(0) if InferTaskConfig.use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
-
- out_ids, out_scores = fast_decoder(
- ModelHyperParams.src_vocab_size, ModelHyperParams.trg_vocab_size,
- ModelHyperParams.max_length + 1, ModelHyperParams.n_layer,
- ModelHyperParams.n_head, ModelHyperParams.d_key,
- ModelHyperParams.d_value, ModelHyperParams.d_model,
- ModelHyperParams.d_inner_hid, ModelHyperParams.prepostprocess_dropout,
- ModelHyperParams.attention_dropout, ModelHyperParams.relu_dropout,
- ModelHyperParams.preprocess_cmd, ModelHyperParams.postprocess_cmd,
- ModelHyperParams.weight_sharing, InferTaskConfig.beam_size,
- InferTaskConfig.max_out_len, ModelHyperParams.eos_idx)
-
- fluid.io.load_vars(
- exe,
- InferTaskConfig.model_path,
- vars=[
- var for var in fluid.default_main_program().list_vars()
- if isinstance(var, fluid.framework.Parameter)
- ])
-
- # This is used here to set dropout to the test mode.
- infer_program = fluid.default_main_program().clone(for_test=True)
-
- for batch_id, data in enumerate(test_data.batch_generator()):
- data_input = prepare_batch_input(
- data, encoder_data_input_fields + fast_decoder_data_input_fields,
- ModelHyperParams.eos_idx, ModelHyperParams.bos_idx,
- ModelHyperParams.n_head, ModelHyperParams.d_model, place)
- seq_ids, seq_scores = exe.run(infer_program,
- feed=data_input,
- fetch_list=[out_ids, out_scores],
- return_numpy=False)
- # How to parse the results:
- # Suppose the lod of seq_ids is:
- # [[0, 3, 6], [0, 12, 24, 40, 54, 67, 82]]
- # then from lod[0]:
- # there are 2 source sentences, beam width is 3.
- # from lod[1]:
- # the first source sentence has 3 hyps; the lengths are 12, 12, 16
- # the second source sentence has 3 hyps; the lengths are 14, 13, 15
- hyps = [[] for i in range(len(data))]
- scores = [[] for i in range(len(data))]
- for i in range(len(seq_ids.lod()[0]) - 1): # for each source sentence
- start = seq_ids.lod()[0][i]
- end = seq_ids.lod()[0][i + 1]
- for j in range(end - start): # for each candidate
- sub_start = seq_ids.lod()[1][start + j]
- sub_end = seq_ids.lod()[1][start + j + 1]
- hyps[i].append(" ".join([
- trg_idx2word[idx]
- for idx in post_process_seq(
- np.array(seq_ids)[sub_start:sub_end])
- ]))
- scores[i].append(np.array(seq_scores)[sub_end - 1])
- print(hyps[i][-1])
- if len(hyps[i]) >= InferTaskConfig.n_best:
- break
-
-
-def infer(args, inferencer=fast_infer):
- place = fluid.CUDAPlace(0) if InferTaskConfig.use_gpu else fluid.CPUPlace()
- test_data = reader.DataReader(
- src_vocab_fpath=args.src_vocab_fpath,
- trg_vocab_fpath=args.trg_vocab_fpath,
- fpattern=args.test_file_pattern,
- token_delimiter=args.token_delimiter,
- use_token_batch=False,
- batch_size=args.batch_size,
- pool_size=args.pool_size,
- sort_type=reader.SortType.NONE,
- shuffle=False,
- shuffle_batch=False,
- start_mark=args.special_token[0],
- end_mark=args.special_token[1],
- unk_mark=args.special_token[2],
- # count start and end tokens out
- max_length=ModelHyperParams.max_length - 2,
- clip_last_batch=False)
- trg_idx2word = test_data.load_dict(
- dict_path=args.trg_vocab_fpath, reverse=True)
- inferencer(test_data, trg_idx2word)
-
-
-if __name__ == "__main__":
- args = parse_args()
- infer(args)
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/model.py b/fluid/PaddleNLP/neural_machine_translation/transformer/model.py
deleted file mode 100644
index 1e510bc620dc56f82e8e7303a56ca44a44b74650..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/neural_machine_translation/transformer/model.py
+++ /dev/null
@@ -1,802 +0,0 @@
-from functools import partial
-import numpy as np
-
-import paddle.fluid as fluid
-import paddle.fluid.layers as layers
-
-from config import *
-
-
-def position_encoding_init(n_position, d_pos_vec):
- """
- Generate the initial values for the sinusoid position encoding table.
- """
- channels = d_pos_vec
- position = np.arange(n_position)
- num_timescales = channels // 2
- log_timescale_increment = (np.log(float(1e4) / float(1)) /
- (num_timescales - 1))
- inv_timescales = np.exp(np.arange(
- num_timescales)) * -log_timescale_increment
- scaled_time = np.expand_dims(position, 1) * np.expand_dims(inv_timescales,
- 0)
- signal = np.concatenate([np.sin(scaled_time), np.cos(scaled_time)], axis=1)
- signal = np.pad(signal, [[0, 0], [0, np.mod(channels, 2)]], 'constant')
- position_enc = signal
- return position_enc.astype("float32")
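-
-# Note: with this construction the first channels // 2 columns hold the sin
-# terms and the remaining columns the cos terms (concatenated, not
-# interleaved); an odd d_pos_vec is handled by zero-padding the last column.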
-
-
-def multi_head_attention(queries,
- keys,
- values,
- attn_bias,
- d_key,
- d_value,
- d_model,
- n_head=1,
- dropout_rate=0.,
- cache=None):
- """
- Multi-Head Attention. Note that attn_bias is added to the logit before
-    computing the softmax activation to mask certain selected positions so
-    that they will not be considered in attention weights.
- """
- keys = queries if keys is None else keys
- values = keys if values is None else values
-
- if not (len(queries.shape) == len(keys.shape) == len(values.shape) == 3):
- raise ValueError(
- "Inputs: quries, keys and values should all be 3-D tensors.")
-
- def __compute_qkv(queries, keys, values, n_head, d_key, d_value):
- """
- Add linear projection to queries, keys, and values.
- """
- q = layers.fc(input=queries,
- size=d_key * n_head,
- bias_attr=False,
- num_flatten_dims=2)
- k = layers.fc(input=keys,
- size=d_key * n_head,
- bias_attr=False,
- num_flatten_dims=2)
- v = layers.fc(input=values,
- size=d_value * n_head,
- bias_attr=False,
- num_flatten_dims=2)
- return q, k, v
-
- def __split_heads(x, n_head):
- """
- Reshape the last dimension of input tensor x so that it becomes two
- dimensions and then transpose. Specifically, transform a tensor with
- shape [bs, max_sequence_length, n_head * hidden_dim] into a tensor
- with shape [bs, n_head, max_sequence_length, hidden_dim].
- """
- if n_head == 1:
- return x
-
- hidden_size = x.shape[-1]
- # The value 0 in shape attr means copying the corresponding dimension
- # size of the input as the output dimension size.
- reshaped = layers.reshape(
- x=x, shape=[0, 0, n_head, hidden_size // n_head], inplace=True)
-
- # permute the dimensions into:
- # [batch_size, n_head, max_sequence_len, hidden_size_per_head]
- return layers.transpose(x=reshaped, perm=[0, 2, 1, 3])
-
- def __combine_heads(x):
- """
- Transpose and then reshape the last two dimensions of input tensor x
- so that it becomes one dimension, which is the reverse of __split_heads.
- """
- if len(x.shape) == 3: return x
- if len(x.shape) != 4:
- raise ValueError("Input(x) should be a 4-D Tensor.")
-
- trans_x = layers.transpose(x, perm=[0, 2, 1, 3])
- # The value 0 in shape attr means copying the corresponding dimension
- # size of the input as the output dimension size.
- return layers.reshape(
- x=trans_x,
- shape=[0, 0, trans_x.shape[2] * trans_x.shape[3]],
- inplace=True)
-
- def scaled_dot_product_attention(q, k, v, attn_bias, d_key, dropout_rate):
- """
- Scaled Dot-Product Attention
- """
- scaled_q = layers.scale(x=q, scale=d_key**-0.5)
- product = layers.matmul(x=scaled_q, y=k, transpose_y=True)
- if attn_bias:
- product += attn_bias
- weights = layers.softmax(product)
- if dropout_rate:
- weights = layers.dropout(
- weights,
- dropout_prob=dropout_rate,
- seed=ModelHyperParams.dropout_seed,
- is_test=False)
- out = layers.matmul(weights, v)
- return out
-
- q, k, v = __compute_qkv(queries, keys, values, n_head, d_key, d_value)
-
- if cache is not None: # use cache and concat time steps
- # Since the inplace reshape in __split_heads changes the shape of k and
- # v, which is the cache input for next time step, reshape the cache
- # input from the previous time step first.
- k = cache["k"] = layers.concat(
- [layers.reshape(
- cache["k"], shape=[0, 0, d_key * n_head]), k],
- axis=1)
- v = cache["v"] = layers.concat(
- [layers.reshape(
- cache["v"], shape=[0, 0, d_value * n_head]), v],
- axis=1)
-
- q = __split_heads(q, n_head)
- k = __split_heads(k, n_head)
- v = __split_heads(v, n_head)
-
- ctx_multiheads = scaled_dot_product_attention(q, k, v, attn_bias, d_model,
- dropout_rate)
-
- out = __combine_heads(ctx_multiheads)
-
- # Project back to the model size.
- proj_out = layers.fc(input=out,
- size=d_model,
- bias_attr=False,
- num_flatten_dims=2)
- return proj_out
-
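A small NumPy sketch of the __split_heads / __combine_heads round trip above, with toy sizes chosen for illustration; it shows that splitting only regroups the hidden dimension per head and that the round trip is lossless.

```python
import numpy as np

bs, seq_len, n_head, d_head = 2, 5, 4, 8
x = np.random.rand(bs, seq_len, n_head * d_head)

# __split_heads: [bs, len, n_head * d] -> [bs, n_head, len, d]
split = x.reshape(bs, seq_len, n_head, d_head).transpose(0, 2, 1, 3)
print(split.shape)  # (2, 4, 5, 8): one attention problem per head

# __combine_heads: the inverse transpose + reshape restores x exactly
combined = split.transpose(0, 2, 1, 3).reshape(bs, seq_len, n_head * d_head)
assert np.array_equal(combined, x)
```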
-
-def positionwise_feed_forward(x, d_inner_hid, d_hid, dropout_rate):
- """
- Position-wise Feed-Forward Networks.
- This module consists of two linear transformations with a ReLU activation
- in between, which is applied to each position separately and identically.
- """
- hidden = layers.fc(input=x,
- size=d_inner_hid,
- num_flatten_dims=2,
- act="relu")
- if dropout_rate:
- hidden = layers.dropout(
- hidden,
- dropout_prob=dropout_rate,
- seed=ModelHyperParams.dropout_seed,
- is_test=False)
- out = layers.fc(input=hidden, size=d_hid, num_flatten_dims=2)
- return out
-
-
-def pre_post_process_layer(prev_out, out, process_cmd, dropout_rate=0.):
- """
- Optionally add residual connection, layer normalization and dropout to
- the out tensor according to the value of process_cmd.
- This will be used before or after multi-head attention and position-wise
- feed-forward networks.
- """
- for cmd in process_cmd:
- if cmd == "a": # add residual connection
- out = out + prev_out if prev_out else out
- elif cmd == "n": # add layer normalization
- out = layers.layer_norm(
- out,
- begin_norm_axis=len(out.shape) - 1,
- param_attr=fluid.initializer.Constant(1.),
- bias_attr=fluid.initializer.Constant(0.))
- elif cmd == "d": # add dropout
- if dropout_rate:
- out = layers.dropout(
- out,
- dropout_prob=dropout_rate,
- seed=ModelHyperParams.dropout_seed,
- is_test=False)
- return out
-
-
-pre_process_layer = partial(pre_post_process_layer, None)
-post_process_layer = pre_post_process_layer
-
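A plain-NumPy sketch of the command-string dispatch above: "a" adds the residual, "n" layer-normalizes, "d" applies dropout (elided here, since it is an identity at inference time). The values are toy numbers, not model activations.

```python
import numpy as np

def process(prev_out, out, cmds):
    for cmd in cmds:
        if cmd == "a" and prev_out is not None:  # residual connection
            out = out + prev_out
        elif cmd == "n":                         # layer normalization
            mean = out.mean(-1, keepdims=True)
            out = (out - mean) / out.std(-1, keepdims=True)
        # "d" (dropout) skipped: identity at inference time
    return out

x = np.array([[1.0, 2.0, 3.0]])
normed = process(None, x, "n")   # pre_process_layer with cmd "n"
sub_out = normed * 2.0           # stand-in for an attention / FFN sublayer
y = process(x, sub_out, "da")    # post_process_layer with cmd "da"
print(y)                         # dropout (no-op), then residual add
```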
-
-def prepare_encoder_decoder(src_word,
- src_pos,
- src_vocab_size,
- src_emb_dim,
- src_max_len,
- dropout_rate=0.,
- word_emb_param_name=None,
- pos_enc_param_name=None):
- """Add word embeddings and position encodings.
- The output tensor has a shape of:
- [batch_size, max_src_length_in_batch, d_model].
- This module is used at the bottom of the encoder stacks.
- """
- src_word_emb = layers.embedding(
- src_word,
- size=[src_vocab_size, src_emb_dim],
- padding_idx=ModelHyperParams.bos_idx, # set embedding of bos to 0
- param_attr=fluid.ParamAttr(
- name=word_emb_param_name,
- initializer=fluid.initializer.Normal(0., src_emb_dim**-0.5)))
-
- src_word_emb = layers.scale(x=src_word_emb, scale=src_emb_dim**0.5)
- src_pos_enc = layers.embedding(
- src_pos,
- size=[src_max_len, src_emb_dim],
- param_attr=fluid.ParamAttr(
- name=pos_enc_param_name, trainable=False))
- src_pos_enc.stop_gradient = True
- enc_input = src_word_emb + src_pos_enc
- return layers.dropout(
- enc_input,
- dropout_prob=dropout_rate,
- seed=ModelHyperParams.dropout_seed,
- is_test=False) if dropout_rate else enc_input
-
-
-prepare_encoder = partial(
- prepare_encoder_decoder, pos_enc_param_name=pos_enc_param_names[0])
-prepare_decoder = partial(
- prepare_encoder_decoder, pos_enc_param_name=pos_enc_param_names[1])
-
-
-def encoder_layer(enc_input,
- attn_bias,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd="n",
- postprocess_cmd="da"):
- """The encoder layers that can be stacked to form a deep encoder.
- This module consits of a multi-head (self) attention followed by
- position-wise feed-forward networks and both the two components companied
- with the post_process_layer to add residual connection, layer normalization
- and droput.
- """
- attn_output = multi_head_attention(
- pre_process_layer(enc_input, preprocess_cmd,
- prepostprocess_dropout), None, None, attn_bias, d_key,
- d_value, d_model, n_head, attention_dropout)
- attn_output = post_process_layer(enc_input, attn_output, postprocess_cmd,
- prepostprocess_dropout)
- ffd_output = positionwise_feed_forward(
- pre_process_layer(attn_output, preprocess_cmd, prepostprocess_dropout),
- d_inner_hid, d_model, relu_dropout)
- return post_process_layer(attn_output, ffd_output, postprocess_cmd,
- prepostprocess_dropout)
-
-
-def encoder(enc_input,
- attn_bias,
- n_layer,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd="n",
- postprocess_cmd="da"):
- """
- The encoder is composed of a stack of identical layers returned by calling
- encoder_layer.
- """
- for i in range(n_layer):
- enc_output = encoder_layer(
- enc_input,
- attn_bias,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd, )
- enc_input = enc_output
- enc_output = pre_process_layer(enc_output, preprocess_cmd,
- prepostprocess_dropout)
- return enc_output
-
-
-def decoder_layer(dec_input,
- enc_output,
- slf_attn_bias,
- dec_enc_attn_bias,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd,
- cache=None):
- """ The layer to be stacked in decoder part.
- The structure of this module is similar to that in the encoder part except
- a multi-head attention is added to implement encoder-decoder attention.
- """
- slf_attn_output = multi_head_attention(
- pre_process_layer(dec_input, preprocess_cmd, prepostprocess_dropout),
- None,
- None,
- slf_attn_bias,
- d_key,
- d_value,
- d_model,
- n_head,
- attention_dropout,
- cache, )
- slf_attn_output = post_process_layer(
- dec_input,
- slf_attn_output,
- postprocess_cmd,
- prepostprocess_dropout, )
- enc_attn_output = multi_head_attention(
- pre_process_layer(slf_attn_output, preprocess_cmd,
- prepostprocess_dropout),
- enc_output,
- enc_output,
- dec_enc_attn_bias,
- d_key,
- d_value,
- d_model,
- n_head,
- attention_dropout, )
- enc_attn_output = post_process_layer(
- slf_attn_output,
- enc_attn_output,
- postprocess_cmd,
- prepostprocess_dropout, )
- ffd_output = positionwise_feed_forward(
- pre_process_layer(enc_attn_output, preprocess_cmd,
- prepostprocess_dropout),
- d_inner_hid,
- d_model,
- relu_dropout, )
- dec_output = post_process_layer(
- enc_attn_output,
- ffd_output,
- postprocess_cmd,
- prepostprocess_dropout, )
- return dec_output
-
-
-def decoder(dec_input,
- enc_output,
- dec_slf_attn_bias,
- dec_enc_attn_bias,
- n_layer,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd,
- caches=None):
- """
- The decoder is composed of a stack of identical decoder_layer layers.
- """
- for i in range(n_layer):
- dec_output = decoder_layer(
- dec_input,
- enc_output,
- dec_slf_attn_bias,
- dec_enc_attn_bias,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd,
- cache=None if caches is None else caches[i])
- dec_input = dec_output
- dec_output = pre_process_layer(dec_output, preprocess_cmd,
- prepostprocess_dropout)
- return dec_output
-
-
-def make_all_inputs(input_fields):
- """
- Define the input data layers for the transformer model.
- """
- inputs = []
- for input_field in input_fields:
- input_var = layers.data(
- name=input_field,
- shape=input_descs[input_field][0],
- dtype=input_descs[input_field][1],
- lod_level=input_descs[input_field][2]
- if len(input_descs[input_field]) == 3 else 0,
- append_batch_size=False)
- inputs.append(input_var)
- return inputs
-
-
-def make_all_py_reader_inputs(input_fields, is_test=False):
- reader = layers.py_reader(
- capacity=20,
- name="test_reader" if is_test else "train_reader",
- shapes=[input_descs[input_field][0] for input_field in input_fields],
- dtypes=[input_descs[input_field][1] for input_field in input_fields],
- lod_levels=[
- input_descs[input_field][2]
- if len(input_descs[input_field]) == 3 else 0
- for input_field in input_fields
- ])
- return layers.read_file(reader), reader
-
-
-def transformer(src_vocab_size,
- trg_vocab_size,
- max_length,
- n_layer,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd,
- weight_sharing,
- label_smooth_eps,
- use_py_reader=False,
- is_test=False):
- if weight_sharing:
- assert src_vocab_size == trg_vocab_size, (
- "Vocabularies in source and target should be same for weight sharing."
- )
-
- data_input_names = encoder_data_input_fields + \
- decoder_data_input_fields[:-1] + label_data_input_fields
-
- if use_py_reader:
- all_inputs, reader = make_all_py_reader_inputs(data_input_names,
- is_test)
- else:
- all_inputs = make_all_inputs(data_input_names)
-
- enc_inputs_len = len(encoder_data_input_fields)
- dec_inputs_len = len(decoder_data_input_fields[:-1])
- enc_inputs = all_inputs[0:enc_inputs_len]
- dec_inputs = all_inputs[enc_inputs_len:enc_inputs_len + dec_inputs_len]
- label = all_inputs[-2]
- weights = all_inputs[-1]
-
- enc_output = wrap_encoder(
- src_vocab_size,
- max_length,
- n_layer,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd,
- weight_sharing,
- enc_inputs, )
-
- predict = wrap_decoder(
- trg_vocab_size,
- max_length,
- n_layer,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd,
- weight_sharing,
- dec_inputs,
- enc_output, )
-
- # Padding indices do not contribute to the total loss. The weights are
- # used to cancel padding positions when calculating the loss.
- if label_smooth_eps:
- label = layers.label_smooth(
- label=layers.one_hot(
- input=label, depth=trg_vocab_size),
- epsilon=label_smooth_eps)
-
- cost = layers.softmax_with_cross_entropy(
- logits=predict,
- label=label,
- soft_label=True if label_smooth_eps else False)
- weighted_cost = cost * weights
- sum_cost = layers.reduce_sum(weighted_cost)
- token_num = layers.reduce_sum(weights)
- token_num.stop_gradient = True
- avg_cost = sum_cost / token_num
- return sum_cost, avg_cost, predict, token_num, reader if use_py_reader else None
-
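To spell out the loss computed at the end of transformer above, here is a NumPy sketch with toy logits: one-hot labels are smoothed by epsilon (a uniform prior over the vocabulary, following the default behavior), a soft cross-entropy is taken per token, and the weights zero out the padding position before the token average is formed.

```python
import numpy as np

eps, vocab = 0.1, 4
logits = np.array([[2.0, 0.5, 0.1, 0.1],   # real token
                   [0.2, 1.5, 0.3, 0.0],   # real token
                   [0.0, 0.0, 0.0, 0.0]])  # padding position
labels = np.array([0, 1, 0])
weights = np.array([1.0, 1.0, 0.0])        # weight 0 cancels the padding

one_hot = np.eye(vocab)[labels]
smoothed = one_hot * (1.0 - eps) + eps / vocab           # label smoothing
log_probs = logits - np.log(np.exp(logits).sum(-1, keepdims=True))
cost = -(smoothed * log_probs).sum(-1)                   # soft cross-entropy
avg_cost = (cost * weights).sum() / weights.sum()        # per-token average
print(avg_cost)
```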
-
-def wrap_encoder(src_vocab_size,
- max_length,
- n_layer,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd,
- weight_sharing,
- enc_inputs=None):
- """
- The wrapper assembles together all needed layers for the encoder.
- """
- if enc_inputs is None:
- # This is used to implement independent encoder program in inference.
- src_word, src_pos, src_slf_attn_bias = make_all_inputs(
- encoder_data_input_fields)
- else:
- src_word, src_pos, src_slf_attn_bias = enc_inputs
- enc_input = prepare_encoder(
- src_word,
- src_pos,
- src_vocab_size,
- d_model,
- max_length,
- prepostprocess_dropout,
- word_emb_param_name=word_emb_param_names[0])
- enc_output = encoder(
- enc_input,
- src_slf_attn_bias,
- n_layer,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd, )
- return enc_output
-
-
-def wrap_decoder(trg_vocab_size,
- max_length,
- n_layer,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd,
- weight_sharing,
- dec_inputs=None,
- enc_output=None,
- caches=None):
- """
- The wrapper assembles together all needed layers for the decoder.
- """
- if dec_inputs is None:
- # This is used to implement independent decoder program in inference.
- trg_word, trg_pos, trg_slf_attn_bias, trg_src_attn_bias, enc_output = \
- make_all_inputs(decoder_data_input_fields)
- else:
- trg_word, trg_pos, trg_slf_attn_bias, trg_src_attn_bias = dec_inputs
-
- dec_input = prepare_decoder(
- trg_word,
- trg_pos,
- trg_vocab_size,
- d_model,
- max_length,
- prepostprocess_dropout,
- word_emb_param_name=word_emb_param_names[0]
- if weight_sharing else word_emb_param_names[1])
- dec_output = decoder(
- dec_input,
- enc_output,
- trg_slf_attn_bias,
- trg_src_attn_bias,
- n_layer,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd,
- caches=caches)
- # Reshape to 2D tensor to use GEMM instead of BatchedGEMM
- dec_output = layers.reshape(
- dec_output, shape=[-1, dec_output.shape[-1]], inplace=True)
- if weight_sharing:
- predict = layers.matmul(
- x=dec_output,
- y=fluid.default_main_program().global_block().var(
- word_emb_param_names[0]),
- transpose_y=True)
- else:
- predict = layers.fc(input=dec_output,
- size=trg_vocab_size,
- bias_attr=False)
- if dec_inputs is None:
- # Return probs for independent decoder program.
- predict = layers.softmax(predict)
- return predict
-
-
-def fast_decode(
- src_vocab_size,
- trg_vocab_size,
- max_in_len,
- n_layer,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd,
- weight_sharing,
- beam_size,
- max_out_len,
- eos_idx, ):
- """
- Use beam search to decode. Caches are used to store the states of
- previous steps, which makes the decoding faster.
- """
- enc_output = wrap_encoder(
- src_vocab_size, max_in_len, n_layer, n_head, d_key, d_value, d_model,
- d_inner_hid, prepostprocess_dropout, attention_dropout, relu_dropout,
- preprocess_cmd, postprocess_cmd, weight_sharing)
- start_tokens, init_scores, trg_src_attn_bias = make_all_inputs(
- fast_decoder_data_input_fields)
-
- def beam_search():
- max_len = layers.fill_constant(
- shape=[1], dtype=start_tokens.dtype, value=max_out_len)
- step_idx = layers.fill_constant(
- shape=[1], dtype=start_tokens.dtype, value=0)
- cond = layers.less_than(x=step_idx, y=max_len)
- while_op = layers.While(cond)
- # array states will be stored for each step.
- ids = layers.array_write(
- layers.reshape(start_tokens, (-1, 1)), step_idx)
- scores = layers.array_write(init_scores, step_idx)
- # cell states will be overwritten at each step.
- # caches contain the states of history steps to reduce redundant
- # computation in the decoder.
- caches = [{
- "k": layers.fill_constant_batch_size_like(
- input=start_tokens,
- shape=[-1, 0, d_model],
- dtype=enc_output.dtype,
- value=0),
- "v": layers.fill_constant_batch_size_like(
- input=start_tokens,
- shape=[-1, 0, d_model],
- dtype=enc_output.dtype,
- value=0)
- } for i in range(n_layer)]
- with while_op.block():
- pre_ids = layers.array_read(array=ids, i=step_idx)
- pre_ids = layers.reshape(pre_ids, (-1, 1, 1))
- pre_scores = layers.array_read(array=scores, i=step_idx)
- # sequence_expand can gather sequences according to lod, and thus can
- # be used in beam search to gather the states of the selected ids.
- pre_src_attn_bias = layers.sequence_expand(
- x=trg_src_attn_bias, y=pre_scores)
- pre_enc_output = layers.sequence_expand(x=enc_output, y=pre_scores)
- pre_caches = [{
- "k": layers.sequence_expand(
- x=cache["k"], y=pre_scores),
- "v": layers.sequence_expand(
- x=cache["v"], y=pre_scores),
- } for cache in caches]
- pre_pos = layers.elementwise_mul(
- x=layers.fill_constant_batch_size_like(
- input=pre_enc_output, # can't use pre_ids here since it has lod
- value=1,
- shape=[-1, 1, 1],
- dtype=pre_ids.dtype),
- y=step_idx,
- axis=0)
- logits = wrap_decoder(
- trg_vocab_size,
- max_in_len,
- n_layer,
- n_head,
- d_key,
- d_value,
- d_model,
- d_inner_hid,
- prepostprocess_dropout,
- attention_dropout,
- relu_dropout,
- preprocess_cmd,
- postprocess_cmd,
- weight_sharing,
- dec_inputs=(pre_ids, pre_pos, None, pre_src_attn_bias),
- enc_output=pre_enc_output,
- caches=pre_caches)
-
- topk_scores, topk_indices = layers.topk(
- input=layers.softmax(logits), k=beam_size)
- accu_scores = layers.elementwise_add(
- x=layers.log(topk_scores),
- y=layers.reshape(
- pre_scores, shape=[-1]),
- axis=0)
- # beam_search op uses lod to distinguish branches.
- topk_indices = layers.lod_reset(topk_indices, pre_ids)
- selected_ids, selected_scores = layers.beam_search(
- pre_ids=pre_ids,
- pre_scores=pre_scores,
- ids=topk_indices,
- scores=accu_scores,
- beam_size=beam_size,
- end_id=eos_idx)
-
- layers.increment(x=step_idx, value=1.0, in_place=True)
- # update states
- layers.array_write(selected_ids, i=step_idx, array=ids)
- layers.array_write(selected_scores, i=step_idx, array=scores)
- layers.assign(pre_src_attn_bias, trg_src_attn_bias)
- layers.assign(pre_enc_output, enc_output)
- for i in range(n_layer):
- layers.assign(pre_caches[i]["k"], caches[i]["k"])
- layers.assign(pre_caches[i]["v"], caches[i]["v"])
- length_cond = layers.less_than(x=step_idx, y=max_len)
- finish_cond = layers.logical_not(layers.is_empty(x=selected_ids))
- layers.logical_and(x=length_cond, y=finish_cond, out=cond)
-
- finished_ids, finished_scores = layers.beam_search_decode(
- ids, scores, beam_size=beam_size, end_id=eos_idx)
- return finished_ids, finished_scores
-
- finished_ids, finished_scores = beam_search()
- return finished_ids, finished_scores
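For orientation, a plain-Python sketch of one expand-and-prune step of the beam search that the While block above runs on LoD tensors; the transition probabilities below are invented purely for illustration.

```python
import math

beam_size, eos = 2, 0
beams = [(-0.1, [5]), (-0.7, [9])]  # (accumulated log prob, token ids)
# hypothetical next-token distributions, keyed by the last token
next_probs = {5: {7: 0.6, 8: 0.3, eos: 0.1},
              9: {7: 0.2, 8: 0.7, eos: 0.1}}

candidates = []
for score, ids in beams:                       # expand every live beam
    for tok, p in next_probs[ids[-1]].items():
        candidates.append((score + math.log(p), ids + [tok]))
candidates.sort(reverse=True)                  # prune to the best beams
beams = [c for c in candidates if c[1][-1] != eos][:beam_size]
print(beams)  # [(-0.61..., [5, 7]), (-1.05..., [9, 8])]
```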
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/profile.py b/fluid/PaddleNLP/neural_machine_translation/transformer/profile.py
deleted file mode 100644
index 9a437725cb27c29b0233d6297e84781f5343aff1..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/neural_machine_translation/transformer/profile.py
+++ /dev/null
@@ -1,284 +0,0 @@
-import argparse
-import ast
-import contextlib
-import multiprocessing
-import os
-import six
-import time
-
-import numpy as np
-import paddle.fluid as fluid
-import paddle.fluid.profiler as profiler
-
-import reader
-from config import *
-from train import pad_batch_data, prepare_data_generator, \
- prepare_feed_dict_list, py_reader_provider_wrapper
-from model import transformer, position_encoding_init
-
-
-def parse_args():
- parser = argparse.ArgumentParser("Training for Transformer.")
- parser.add_argument(
- "--src_vocab_fpath",
- type=str,
- required=True,
- help="The path of vocabulary file of source language.")
- parser.add_argument(
- "--trg_vocab_fpath",
- type=str,
- required=True,
- help="The path of vocabulary file of target language.")
- parser.add_argument(
- "--train_file_pattern",
- type=str,
- required=True,
- help="The pattern to match training data files.")
- parser.add_argument(
- "--use_token_batch",
- type=ast.literal_eval,
- default=True,
- help="The flag indicating whether to "
- "produce batch data according to token number.")
- parser.add_argument(
- "--batch_size",
- type=int,
- default=4096,
- help="The number of sequences contained in a mini-batch, or the maximum "
- "number of tokens (include paddings) contained in a mini-batch. Note "
- "that this represents the number on single device and the actual batch "
- "size for multi-devices will multiply the device number.")
- parser.add_argument(
- "--pool_size",
- type=int,
- default=200000,
- help="The buffer size to pool data.")
- parser.add_argument(
- "--sort_type",
- default="pool",
- choices=("global", "pool", "none"),
- help="The grain to sort by length: global for all instances; pool for "
- "instances in pool; none for no sort.")
- parser.add_argument(
- "--shuffle",
- type=ast.literal_eval,
- default=True,
- help="The flag indicating whether to shuffle instances in each pass.")
- parser.add_argument(
- "--shuffle_batch",
- type=ast.literal_eval,
- default=True,
- help="The flag indicating whether to shuffle the data batches.")
- parser.add_argument(
- "--special_token",
- type=str,
- default=["", "", ""],
- nargs=3,
- help="The , and tokens in the dictionary.")
- parser.add_argument(
- "--token_delimiter",
- type=lambda x: str(x.encode().decode("unicode-escape")),
- default=" ",
- help="The delimiter used to split tokens in source or target sentences. "
- "For EN-DE BPE data we provided, use spaces as token delimiter.")
- parser.add_argument(
- "--use_mem_opt",
- type=ast.literal_eval,
- default=True,
- help="The flag indicating whether to use memory optimization.")
- parser.add_argument(
- "--use_py_reader",
- type=ast.literal_eval,
- default=True,
- help="The flag indicating whether to use py_reader.")
- parser.add_argument(
- "--iter_num",
- type=int,
- default=20,
- help="The iteration number to run in profiling.")
- parser.add_argument(
- "--use_parallel_exe",
- type=ast.literal_eval,
- default=False,
- help="The flag indicating whether to use ParallelExecutor.")
- parser.add_argument(
- "--profile_ops",
- type=ast.literal_eval,
- default=True,
- help="The flag indicating whether to profile operators.")
- parser.add_argument(
- 'opts',
- help='See config.py for all options',
- default=None,
- nargs=argparse.REMAINDER)
-
- args = parser.parse_args()
- # Append args related to dict
- src_dict = reader.DataReader.load_dict(args.src_vocab_fpath)
- trg_dict = reader.DataReader.load_dict(args.trg_vocab_fpath)
- dict_args = [
- "src_vocab_size", str(len(src_dict)), "trg_vocab_size",
- str(len(trg_dict)), "bos_idx", str(src_dict[args.special_token[0]]),
- "eos_idx", str(src_dict[args.special_token[1]]), "unk_idx",
- str(src_dict[args.special_token[2]])
- ]
- merge_cfg_from_list(args.opts + dict_args,
- [TrainTaskConfig, ModelHyperParams])
- return args
-
-
-def main(args):
- train_prog = fluid.Program()
- startup_prog = fluid.Program()
- train_prog.random_seed = 1000
- startup_prog.random_seed = 1000
- with fluid.program_guard(train_prog, startup_prog):
- with fluid.unique_name.guard():
- sum_cost, avg_cost, predict, token_num, pyreader = transformer(
- ModelHyperParams.src_vocab_size,
- ModelHyperParams.trg_vocab_size,
- ModelHyperParams.max_length + 1,
- ModelHyperParams.n_layer,
- ModelHyperParams.n_head,
- ModelHyperParams.d_key,
- ModelHyperParams.d_value,
- ModelHyperParams.d_model,
- ModelHyperParams.d_inner_hid,
- ModelHyperParams.prepostprocess_dropout,
- ModelHyperParams.attention_dropout,
- ModelHyperParams.relu_dropout,
- ModelHyperParams.preprocess_cmd,
- ModelHyperParams.postprocess_cmd,
- ModelHyperParams.weight_sharing,
- TrainTaskConfig.label_smooth_eps,
- use_py_reader=args.use_py_reader,
- is_test=False)
- lr_decay = fluid.layers.learning_rate_scheduler.noam_decay(
- ModelHyperParams.d_model, TrainTaskConfig.warmup_steps)
- optimizer = fluid.optimizer.Adam(
- learning_rate=lr_decay * TrainTaskConfig.learning_rate,
- beta1=TrainTaskConfig.beta1,
- beta2=TrainTaskConfig.beta2,
- epsilon=TrainTaskConfig.eps)
- optimizer.minimize(avg_cost)
-
- if args.use_mem_opt:
- fluid.memory_optimize(train_prog)
-
- if TrainTaskConfig.use_gpu:
- place = fluid.CUDAPlace(0)
- dev_count = fluid.core.get_cuda_device_count()
- else:
- place = fluid.CPUPlace()
- dev_count = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
- exe = fluid.Executor(place)
- # Initialize the parameters.
- if TrainTaskConfig.ckpt_path:
- fluid.io.load_persistables(exe, TrainTaskConfig.ckpt_path)
- else:
- exe.run(startup_prog)
-
- exec_strategy = fluid.ExecutionStrategy()
- # For faster executor
- exec_strategy.use_experimental_executor = True
- exec_strategy.num_iteration_per_drop_scope = 5
- build_strategy = fluid.BuildStrategy()
- # Since the token number differs among devices, customize the gradient
- # scale to use the token-averaged cost across devices; the gradient
- # scale is `1 / token_number` for the average cost.
- build_strategy.gradient_scale_strategy = fluid.BuildStrategy.GradientScaleStrategy.Customized
- train_exe = fluid.ParallelExecutor(
- use_cuda=TrainTaskConfig.use_gpu,
- loss_name=avg_cost.name,
- main_program=train_prog,
- build_strategy=build_strategy,
- exec_strategy=exec_strategy)
-
- # the best cross-entropy value with label smoothing
- loss_normalizer = -((1. - TrainTaskConfig.label_smooth_eps) * np.log(
- (1. - TrainTaskConfig.label_smooth_eps
- )) + TrainTaskConfig.label_smooth_eps *
- np.log(TrainTaskConfig.label_smooth_eps / (
- ModelHyperParams.trg_vocab_size - 1) + 1e-20))
-
- train_data = prepare_data_generator(
- args, is_test=False, count=dev_count, pyreader=pyreader)
- if args.use_py_reader:
- pyreader.start()
- data_generator = None
- else:
- data_generator = train_data()
-
- def run(iter_num):
- reader_time = []
- run_time = []
-
- for step_idx in six.moves.xrange(iter_num):
- try:
- start_time = time.time()
- feed_dict_list = prepare_feed_dict_list(data_generator,
- init_flag, dev_count)
- end_time = time.time()
- reader_time.append(end_time - start_time)
-
- start_time = time.time()
- if args.use_parallel_exe:
- outs = train_exe.run(
- fetch_list=[sum_cost.name, token_num.name],
- feed=feed_dict_list)
- else:
- outs = exe.run(program=train_prog,
- fetch_list=[sum_cost.name, token_num.name],
- feed=feed_dict_list[0]
- if feed_dict_list is not None else None)
- end_time = time.time()
- run_time.append(end_time - start_time)
-
- sum_cost_val, token_num_val = np.array(outs[0]), np.array(outs[
- 1])
- # sum the cost from multi-devices
- total_sum_cost = sum_cost_val.sum()
- total_token_num = token_num_val.sum()
- total_avg_cost = total_sum_cost / total_token_num
- print("step_idx: %d, avg loss: %f, "
- "normalized loss: %f, ppl: %f" %
- (step_idx, total_avg_cost,
- total_avg_cost - loss_normalizer,
- np.exp([min(total_avg_cost, 100)])))
- except (StopIteration, fluid.core.EOFException):
- # The current pass is over.
- if args.use_py_reader:
- pyreader.reset()
- pyreader.start()
-
- return reader_time, run_time
-
- @contextlib.contextmanager
- def profile_context(profile=True):
- if profile:
- with profiler.profiler('All', 'total', '/tmp/profile_file'):
- yield
- else:
- yield
-
- # start-up
- init_flag = True
- run(5)
- init_flag = False
-
- # profiling
- start = time.time()
- # currently profiling is only supported on a single device
- with profile_context(args.profile_ops):
- reader_time, run_time = run(args.iter_num)
- end = time.time()
- total_time = end - start
- print(
- "Total time: {0}, reader time: {1} s, run time: {2} s, step number: {3}".
- format(total_time, np.sum(reader_time), np.sum(run_time),
- args.iter_num))
-
-
-if __name__ == "__main__":
- args = parse_args()
- main(args)
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/train.py b/fluid/PaddleNLP/neural_machine_translation/transformer/train.py
deleted file mode 100644
index 0e9c18416f62c85e76dd060f1fad44073e5841fc..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/neural_machine_translation/transformer/train.py
+++ /dev/null
@@ -1,746 +0,0 @@
-import argparse
-import ast
-import copy
-import logging
-import multiprocessing
-import os
-import six
-import sys
-import time
-
-import numpy as np
-import paddle.fluid as fluid
-from paddle.fluid.transpiler.details import program_to_code
-
-import reader
-from config import *
-from model import transformer, position_encoding_init
-
-
-def parse_args():
- parser = argparse.ArgumentParser("Training for Transformer.")
- parser.add_argument(
- "--src_vocab_fpath",
- type=str,
- required=True,
- help="The path of vocabulary file of source language.")
- parser.add_argument(
- "--trg_vocab_fpath",
- type=str,
- required=True,
- help="The path of vocabulary file of target language.")
- parser.add_argument(
- "--train_file_pattern",
- type=str,
- required=True,
- help="The pattern to match training data files.")
- parser.add_argument(
- "--val_file_pattern",
- type=str,
- help="The pattern to match validation data files.")
- parser.add_argument(
- "--use_token_batch",
- type=ast.literal_eval,
- default=True,
- help="The flag indicating whether to "
- "produce batch data according to token number.")
- parser.add_argument(
- "--batch_size",
- type=int,
- default=4096,
- help="The number of sequences contained in a mini-batch, or the maximum "
- "number of tokens (include paddings) contained in a mini-batch. Note "
- "that this represents the number on single device and the actual batch "
- "size for multi-devices will multiply the device number.")
- parser.add_argument(
- "--pool_size",
- type=int,
- default=200000,
- help="The buffer size to pool data.")
- parser.add_argument(
- "--sort_type",
- default="pool",
- choices=("global", "pool", "none"),
- help="The grain to sort by length: global for all instances; pool for "
- "instances in pool; none for no sort.")
- parser.add_argument(
- "--shuffle",
- type=ast.literal_eval,
- default=True,
- help="The flag indicating whether to shuffle instances in each pass.")
- parser.add_argument(
- "--shuffle_batch",
- type=ast.literal_eval,
- default=True,
- help="The flag indicating whether to shuffle the data batches.")
- parser.add_argument(
- "--special_token",
- type=str,
- default=["", "", ""],
- nargs=3,
- help="The , and tokens in the dictionary.")
- parser.add_argument(
- "--token_delimiter",
- type=lambda x: str(x.encode().decode("unicode-escape")),
- default=" ",
- help="The delimiter used to split tokens in source or target sentences. "
- "For EN-DE BPE data we provided, use spaces as token delimiter. ")
- parser.add_argument(
- 'opts',
- help='See config.py for all options',
- default=None,
- nargs=argparse.REMAINDER)
- parser.add_argument(
- '--local',
- type=ast.literal_eval,
- default=True,
- help='Whether to run as local mode.')
- parser.add_argument(
- '--device',
- type=str,
- default='GPU',
- choices=['CPU', 'GPU'],
- help="The device type.")
- parser.add_argument(
- '--update_method',
- choices=("pserver", "nccl2"),
- default="pserver",
- help='Update method.')
- parser.add_argument(
- '--sync', type=ast.literal_eval, default=True, help="sync mode.")
- parser.add_argument(
- "--enable_ce",
- type=ast.literal_eval,
- default=False,
- help="The flag indicating whether to run the task "
- "for continuous evaluation.")
- parser.add_argument(
- "--use_mem_opt",
- type=ast.literal_eval,
- default=True,
- help="The flag indicating whether to use memory optimization.")
- parser.add_argument(
- "--use_py_reader",
- type=ast.literal_eval,
- default=True,
- help="The flag indicating whether to use py_reader.")
- parser.add_argument(
- "--fetch_steps",
- type=int,
- default=100,
- help="The frequency to fetch and print output.")
-
- args = parser.parse_args()
- # Append args related to dict
- src_dict = reader.DataReader.load_dict(args.src_vocab_fpath)
- trg_dict = reader.DataReader.load_dict(args.trg_vocab_fpath)
- dict_args = [
- "src_vocab_size", str(len(src_dict)), "trg_vocab_size",
- str(len(trg_dict)), "bos_idx", str(src_dict[args.special_token[0]]),
- "eos_idx", str(src_dict[args.special_token[1]]), "unk_idx",
- str(src_dict[args.special_token[2]])
- ]
- merge_cfg_from_list(args.opts + dict_args,
- [TrainTaskConfig, ModelHyperParams])
- return args
-
-
-def append_nccl2_prepare(startup_prog, trainer_id, worker_endpoints,
- current_endpoint):
- assert (trainer_id >= 0 and len(worker_endpoints) > 1 and
- current_endpoint in worker_endpoints)
- eps = copy.deepcopy(worker_endpoints)
- eps.remove(current_endpoint)
- nccl_id_var = startup_prog.global_block().create_var(
- name="NCCLID", persistable=True, type=fluid.core.VarDesc.VarType.RAW)
- startup_prog.global_block().append_op(
- type="gen_nccl_id",
- inputs={},
- outputs={"NCCLID": nccl_id_var},
- attrs={
- "endpoint": current_endpoint,
- "endpoint_list": eps,
- "trainer_id": trainer_id
- })
- return nccl_id_var
-
-
-def pad_batch_data(insts,
- pad_idx,
- n_head,
- is_target=False,
- is_label=False,
- return_attn_bias=True,
- return_max_len=True,
- return_num_token=False):
- """
- Pad the instances to the max sequence length in batch, and generate the
- corresponding position data and attention bias.
- """
- return_list = []
- max_len = max(len(inst) for inst in insts)
- # Any token included in the dict can be used for padding, since the
- # paddings' loss is masked out by the weights and has no effect on
- # parameter gradients.
- inst_data = np.array(
- [inst + [pad_idx] * (max_len - len(inst)) for inst in insts])
- return_list += [inst_data.astype("int64").reshape([-1, 1])]
- if is_label: # label weight
- inst_weight = np.array(
- [[1.] * len(inst) + [0.] * (max_len - len(inst)) for inst in insts])
- return_list += [inst_weight.astype("float32").reshape([-1, 1])]
- else: # position data
- inst_pos = np.array([
- list(range(0, len(inst))) + [0] * (max_len - len(inst))
- for inst in insts
- ])
- return_list += [inst_pos.astype("int64").reshape([-1, 1])]
- if return_attn_bias:
- if is_target:
- # This is used to avoid attention on paddings and subsequent
- # words.
- slf_attn_bias_data = np.ones((inst_data.shape[0], max_len, max_len))
- slf_attn_bias_data = np.triu(slf_attn_bias_data,
- 1).reshape([-1, 1, max_len, max_len])
- slf_attn_bias_data = np.tile(slf_attn_bias_data,
- [1, n_head, 1, 1]) * [-1e9]
- else:
- # This is used to avoid attention on paddings.
- slf_attn_bias_data = np.array([[0] * len(inst) + [-1e9] *
- (max_len - len(inst))
- for inst in insts])
- slf_attn_bias_data = np.tile(
- slf_attn_bias_data.reshape([-1, 1, 1, max_len]),
- [1, n_head, max_len, 1])
- return_list += [slf_attn_bias_data.astype("float32")]
- if return_max_len:
- return_list += [max_len]
- if return_num_token:
- num_token = 0
- for inst in insts:
- num_token += len(inst)
- return_list += [num_token]
- return return_list if len(return_list) > 1 else return_list[0]
-
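A NumPy sketch of the two bias tensors built above, with a toy batch: the target-side bias is an upper-triangular matrix blocking attention to subsequent positions, while the source-side bias blocks only padding columns and is broadcast over all query positions.

```python
import numpy as np

max_len, neg_inf = 4, -1e9

# target side: mask subsequent positions (causal mask)
causal = np.triu(np.ones((max_len, max_len)), k=1) * neg_inf
print(causal[1])  # position 1 may attend to positions 0 and 1 only

# source side: mask padding columns (here one sentence of length 3)
lengths = [3]
pad = np.array([[0.0] * n + [neg_inf] * (max_len - n) for n in lengths])
pad = np.tile(pad[:, None, :], [1, max_len, 1])  # same mask for every query
print(pad.shape)  # (1, 4, 4)
```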
-
-def prepare_batch_input(insts, data_input_names, src_pad_idx, trg_pad_idx,
- n_head, d_model):
- """
- Put all padded data needed by training into a dict.
- """
- src_word, src_pos, src_slf_attn_bias, src_max_len = pad_batch_data(
- [inst[0] for inst in insts], src_pad_idx, n_head, is_target=False)
- src_word = src_word.reshape(-1, src_max_len, 1)
- src_pos = src_pos.reshape(-1, src_max_len, 1)
- trg_word, trg_pos, trg_slf_attn_bias, trg_max_len = pad_batch_data(
- [inst[1] for inst in insts], trg_pad_idx, n_head, is_target=True)
- trg_word = trg_word.reshape(-1, trg_max_len, 1)
- trg_pos = trg_pos.reshape(-1, trg_max_len, 1)
-
- trg_src_attn_bias = np.tile(src_slf_attn_bias[:, :, ::src_max_len, :],
- [1, 1, trg_max_len, 1]).astype("float32")
-
- lbl_word, lbl_weight, num_token = pad_batch_data(
- [inst[2] for inst in insts],
- trg_pad_idx,
- n_head,
- is_target=False,
- is_label=True,
- return_attn_bias=False,
- return_max_len=False,
- return_num_token=True)
-
- data_input_dict = dict(
- zip(data_input_names, [
- src_word, src_pos, src_slf_attn_bias, trg_word, trg_pos,
- trg_slf_attn_bias, trg_src_attn_bias, lbl_word, lbl_weight
- ]))
-
- return data_input_dict, np.asarray([num_token], dtype="float32")
-
-
-def prepare_data_generator(args, is_test, count, pyreader):
- """
- Data generator wrapper for DataReader. If py_reader is used, set the
- data provider for py_reader.
- """
- data_reader = reader.DataReader(
- fpattern=args.val_file_pattern if is_test else args.train_file_pattern,
- src_vocab_fpath=args.src_vocab_fpath,
- trg_vocab_fpath=args.trg_vocab_fpath,
- token_delimiter=args.token_delimiter,
- use_token_batch=args.use_token_batch,
- batch_size=args.batch_size * (1 if args.use_token_batch else count),
- pool_size=args.pool_size,
- sort_type=args.sort_type,
- shuffle=args.shuffle,
- shuffle_batch=args.shuffle_batch,
- start_mark=args.special_token[0],
- end_mark=args.special_token[1],
- unk_mark=args.special_token[2],
- # count start and end tokens out
- max_length=ModelHyperParams.max_length - 2,
- clip_last_batch=False).batch_generator
-
- def stack(data_reader, count, clip_last=True):
- def __impl__():
- res = []
- for item in data_reader():
- res.append(item)
- if len(res) == count:
- yield res
- res = []
- if len(res) == count:
- yield res
- elif not clip_last:
- data = []
- for item in res:
- data += item
- if len(data) > count:
- inst_num_per_part = len(data) // count
- yield [
- data[inst_num_per_part * i:inst_num_per_part * (i + 1)]
- for i in range(count)
- ]
-
- return __impl__
-
- def split(data_reader, count):
- def __impl__():
- for item in data_reader():
- inst_num_per_part = len(item) // count
- for i in range(count):
- yield item[inst_num_per_part * i:inst_num_per_part * (i + 1
- )]
-
- return __impl__
-
- if not args.use_token_batch:
- # to make the data on each device have a similar token number
- data_reader = split(data_reader, count)
- if args.use_py_reader:
- pyreader.decorate_tensor_provider(
- py_reader_provider_wrapper(data_reader))
- data_reader = None
- else: # Data generator for multi-devices
- data_reader = stack(data_reader, count)
- return data_reader
-
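A sketch of what split (above) does when batching by instance count: each reader batch is cut into count equal slices, one per device. The data is a toy list; the real code wraps this logic in a generator factory.

```python
def split(batches, count):
    for item in batches:
        per = len(item) // count
        for i in range(count):
            yield item[per * i:per * (i + 1)]

batch = list(range(8))            # one batch from the reader
print(list(split([batch], 2)))    # [[0, 1, 2, 3], [4, 5, 6, 7]]
```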
-
-def prepare_feed_dict_list(data_generator, init_flag, count):
- """
- Prepare the list of feed dict for multi-devices.
- """
- feed_dict_list = []
- if data_generator is not None: # use_py_reader == False
- data_input_names = encoder_data_input_fields + \
- decoder_data_input_fields[:-1] + label_data_input_fields
- data = next(data_generator)
- for idx, data_buffer in enumerate(data):
- data_input_dict, num_token = prepare_batch_input(
- data_buffer, data_input_names, ModelHyperParams.eos_idx,
- ModelHyperParams.eos_idx, ModelHyperParams.n_head,
- ModelHyperParams.d_model)
- feed_dict_list.append(data_input_dict)
- if init_flag:
- for idx in range(count):
- pos_enc_tables = dict()
- for pos_enc_param_name in pos_enc_param_names:
- pos_enc_tables[pos_enc_param_name] = position_encoding_init(
- ModelHyperParams.max_length + 1, ModelHyperParams.d_model)
- if len(feed_dict_list) <= idx:
- feed_dict_list.append(pos_enc_tables)
- else:
- feed_dict_list[idx] = dict(
- list(pos_enc_tables.items()) + list(feed_dict_list[idx]
- .items()))
-
- return feed_dict_list if len(feed_dict_list) == count else None
-
-
-def py_reader_provider_wrapper(data_reader):
- """
- Data provider needed by fluid.layers.py_reader.
- """
-
- def py_reader_provider():
- data_input_names = encoder_data_input_fields + \
- decoder_data_input_fields[:-1] + label_data_input_fields
- for batch_id, data in enumerate(data_reader()):
- data_input_dict, num_token = prepare_batch_input(
- data, data_input_names, ModelHyperParams.eos_idx,
- ModelHyperParams.eos_idx, ModelHyperParams.n_head,
- ModelHyperParams.d_model)
- total_dict = dict(data_input_dict.items())
- yield [total_dict[item] for item in data_input_names]
-
- return py_reader_provider
-
-
-def test_context(exe, train_exe, dev_count):
- # Context to do validation.
- test_prog = fluid.Program()
- startup_prog = fluid.Program()
- if args.enable_ce:
- test_prog.random_seed = 1000
- startup_prog.random_seed = 1000
- with fluid.program_guard(test_prog, startup_prog):
- with fluid.unique_name.guard():
- sum_cost, avg_cost, predict, token_num, pyreader = transformer(
- ModelHyperParams.src_vocab_size,
- ModelHyperParams.trg_vocab_size,
- ModelHyperParams.max_length + 1,
- ModelHyperParams.n_layer,
- ModelHyperParams.n_head,
- ModelHyperParams.d_key,
- ModelHyperParams.d_value,
- ModelHyperParams.d_model,
- ModelHyperParams.d_inner_hid,
- ModelHyperParams.prepostprocess_dropout,
- ModelHyperParams.attention_dropout,
- ModelHyperParams.relu_dropout,
- ModelHyperParams.preprocess_cmd,
- ModelHyperParams.postprocess_cmd,
- ModelHyperParams.weight_sharing,
- TrainTaskConfig.label_smooth_eps,
- use_py_reader=args.use_py_reader,
- is_test=True)
- test_prog = test_prog.clone(for_test=True)
- test_data = prepare_data_generator(
- args, is_test=True, count=dev_count, pyreader=pyreader)
-
- exe.run(startup_prog)
- test_exe = fluid.ParallelExecutor(
- use_cuda=TrainTaskConfig.use_gpu,
- main_program=test_prog,
- share_vars_from=train_exe)
-
- def test(exe=test_exe, pyreader=pyreader):
- test_total_cost = 0
- test_total_token = 0
-
- if args.use_py_reader:
- pyreader.start()
- data_generator = None
- else:
- data_generator = test_data()
- while True:
- try:
- feed_dict_list = prepare_feed_dict_list(data_generator, False,
- dev_count)
- outs = test_exe.run(fetch_list=[sum_cost.name, token_num.name],
- feed=feed_dict_list)
- except (StopIteration, fluid.core.EOFException):
- # The current pass is over.
- if args.use_py_reader:
- pyreader.reset()
- break
- sum_cost_val, token_num_val = np.array(outs[0]), np.array(outs[1])
- test_total_cost += sum_cost_val.sum()
- test_total_token += token_num_val.sum()
- test_avg_cost = test_total_cost / test_total_token
- test_ppl = np.exp([min(test_avg_cost, 100)])
- return test_avg_cost, test_ppl
-
- return test
-
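train_loop (below) reports a "normalized" loss: the raw cross-entropy minus the best value achievable under label smoothing, i.e. the entropy of the smoothed target distribution. A sketch mirroring the loss_normalizer expression, with illustrative epsilon and vocabulary size:

```python
import numpy as np

eps, V = 0.1, 10000
# entropy of the smoothed target: the training loss cannot go below it
best = -((1.0 - eps) * np.log(1.0 - eps) +
         eps * np.log(eps / (V - 1) + 1e-20))
print(best)  # subtracted from the loss to report the "normalized" value
```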
-
-def train_loop(exe,
- train_prog,
- startup_prog,
- dev_count,
- sum_cost,
- avg_cost,
- token_num,
- predict,
- pyreader,
- nccl2_num_trainers=1,
- nccl2_trainer_id=0):
- # Initialize the parameters.
- if TrainTaskConfig.ckpt_path:
- fluid.io.load_persistables(exe, TrainTaskConfig.ckpt_path)
- else:
- logging.info("init fluid.framework.default_startup_program")
- exe.run(startup_prog)
-
- logging.info("begin reader")
- train_data = prepare_data_generator(
- args, is_test=False, count=dev_count, pyreader=pyreader)
-
- # For faster executor
- exec_strategy = fluid.ExecutionStrategy()
- exec_strategy.use_experimental_executor = True
- # exec_strategy.num_iteration_per_drop_scope = 5
- build_strategy = fluid.BuildStrategy()
- # Since the token number differs among devices, customize the gradient
- # scale to use the token-averaged cost across devices; the gradient
- # scale is `1 / token_number` for the average cost.
- # build_strategy.gradient_scale_strategy = fluid.BuildStrategy.GradientScaleStrategy.Customized
-
- logging.info("begin executor")
- train_exe = fluid.ParallelExecutor(
- use_cuda=TrainTaskConfig.use_gpu,
- loss_name=avg_cost.name,
- main_program=train_prog,
- build_strategy=build_strategy,
- exec_strategy=exec_strategy,
- num_trainers=nccl2_num_trainers,
- trainer_id=nccl2_trainer_id)
-
- if args.val_file_pattern is not None:
- test = test_context(exe, train_exe, dev_count)
-
- # the best cross-entropy value with label smoothing
- loss_normalizer = -((1. - TrainTaskConfig.label_smooth_eps) * np.log(
- (1. - TrainTaskConfig.label_smooth_eps
- )) + TrainTaskConfig.label_smooth_eps *
- np.log(TrainTaskConfig.label_smooth_eps / (
- ModelHyperParams.trg_vocab_size - 1) + 1e-20))
-
- step_idx = 0
- init_flag = True
-
- logging.info("begin train")
- for pass_id in six.moves.xrange(TrainTaskConfig.pass_num):
- pass_start_time = time.time()
-
- if args.use_py_reader:
- pyreader.start()
- data_generator = None
- else:
- data_generator = train_data()
-
- batch_id = 0
- while True:
- try:
- feed_dict_list = prepare_feed_dict_list(data_generator,
- init_flag, dev_count)
- outs = train_exe.run(
- fetch_list=[sum_cost.name, token_num.name]
- if step_idx % args.fetch_steps == 0 else [],
- feed=feed_dict_list)
-
- if step_idx % args.fetch_steps == 0:
- sum_cost_val, token_num_val = np.array(outs[0]), np.array(
- outs[1])
- # sum the cost from multi-devices
- total_sum_cost = sum_cost_val.sum()
- total_token_num = token_num_val.sum()
- total_avg_cost = total_sum_cost / total_token_num
-
- if step_idx == 0:
- logging.info(
- "step_idx: %d, epoch: %d, batch: %d, avg loss: %f, "
- "normalized loss: %f, ppl: %f" %
- (step_idx, pass_id, batch_id, total_avg_cost,
- total_avg_cost - loss_normalizer,
- np.exp([min(total_avg_cost, 100)])))
- avg_batch_time = time.time()
- else:
- logging.info(
- "step_idx: %d, epoch: %d, batch: %d, avg loss: %f, "
- "normalized loss: %f, ppl: %f, speed: %.2f step/s" %
- (step_idx, pass_id, batch_id, total_avg_cost,
- total_avg_cost - loss_normalizer,
- np.exp([min(total_avg_cost, 100)]),
- args.fetch_steps / (time.time() - avg_batch_time)))
- avg_batch_time = time.time()
-
- if step_idx % TrainTaskConfig.save_freq == 0 and step_idx > 0:
- fluid.io.save_persistables(
- exe,
- os.path.join(TrainTaskConfig.ckpt_dir,
- "latest.checkpoint"), train_prog)
- fluid.io.save_params(
- exe,
- os.path.join(TrainTaskConfig.model_dir,
- "iter_" + str(step_idx) + ".infer.model"),
- train_prog)
-
- init_flag = False
- batch_id += 1
- step_idx += 1
- except (StopIteration, fluid.core.EOFException):
- # The current pass is over.
- if args.use_py_reader:
- pyreader.reset()
- break
-
- time_consumed = time.time() - pass_start_time
- # Validate and save the persistable.
- if args.val_file_pattern is not None:
- val_avg_cost, val_ppl = test()
- logging.info(
- "epoch: %d, val avg loss: %f, val normalized loss: %f, val ppl: %f,"
- " consumed %fs" % (pass_id, val_avg_cost,
- val_avg_cost - loss_normalizer, val_ppl,
- time_consumed))
- else:
- logging.info("epoch: %d, consumed %fs" % (pass_id, time_consumed))
- if not args.enable_ce:
- fluid.io.save_persistables(
- exe,
- os.path.join(TrainTaskConfig.ckpt_dir,
- "pass_" + str(pass_id) + ".checkpoint"),
- train_prog)
-
- if args.enable_ce: # For CE
- print("kpis\ttrain_cost_card%d\t%f" % (dev_count, total_avg_cost))
- if args.val_file_pattern is not None:
- print("kpis\ttest_cost_card%d\t%f" % (dev_count, val_avg_cost))
- print("kpis\ttrain_duration_card%d\t%f" % (dev_count, time_consumed))
-
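The train function below scales Adam's learning rate by noam_decay: linear warmup for warmup_steps, then decay with the inverse square root of the step, all scaled by 1/sqrt(d_model). A sketch with illustrative hyperparameters (the real values come from ModelHyperParams and TrainTaskConfig, and the code further multiplies the schedule by TrainTaskConfig.learning_rate):

```python
import math

def noam_lr(step, d_model=512, warmup_steps=4000):
    # linear warmup, then 1/sqrt(step) decay, scaled by 1/sqrt(d_model)
    return d_model ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)

for step in (1, 4000, 16000):
    print(step, noam_lr(step))  # peaks at warmup_steps, then decays
```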
-
-def train(args):
- # priority: ENV > args > config
- is_local = os.getenv("PADDLE_IS_LOCAL", "1")
- if is_local == '0':
- args.local = False
- logging.info(args)
-
- if args.device == 'CPU':
- TrainTaskConfig.use_gpu = False
-
- training_role = os.getenv("TRAINING_ROLE", "TRAINER")
-
- if training_role == "PSERVER" or (not TrainTaskConfig.use_gpu):
- place = fluid.CPUPlace()
- dev_count = int(os.environ.get('CPU_NUM', multiprocessing.cpu_count()))
- else:
- place = fluid.CUDAPlace(0)
- dev_count = fluid.core.get_cuda_device_count()
-
- exe = fluid.Executor(place)
-
- train_prog = fluid.Program()
- startup_prog = fluid.Program()
-
- if args.enable_ce:
- train_prog.random_seed = 1000
- startup_prog.random_seed = 1000
-
- with fluid.program_guard(train_prog, startup_prog):
- with fluid.unique_name.guard():
- sum_cost, avg_cost, predict, token_num, pyreader = transformer(
- ModelHyperParams.src_vocab_size,
- ModelHyperParams.trg_vocab_size,
- ModelHyperParams.max_length + 1,
- ModelHyperParams.n_layer,
- ModelHyperParams.n_head,
- ModelHyperParams.d_key,
- ModelHyperParams.d_value,
- ModelHyperParams.d_model,
- ModelHyperParams.d_inner_hid,
- ModelHyperParams.prepostprocess_dropout,
- ModelHyperParams.attention_dropout,
- ModelHyperParams.relu_dropout,
- ModelHyperParams.preprocess_cmd,
- ModelHyperParams.postprocess_cmd,
- ModelHyperParams.weight_sharing,
- TrainTaskConfig.label_smooth_eps,
- use_py_reader=args.use_py_reader,
- is_test=False)
-
- optimizer = None
- if args.sync:
- lr_decay = fluid.layers.learning_rate_scheduler.noam_decay(
- ModelHyperParams.d_model, TrainTaskConfig.warmup_steps)
- logging.info("before adam")
-
- with fluid.default_main_program()._lr_schedule_guard():
- learning_rate = lr_decay * TrainTaskConfig.learning_rate
-
- optimizer = fluid.optimizer.Adam(
- learning_rate=learning_rate,
- beta1=TrainTaskConfig.beta1,
- beta2=TrainTaskConfig.beta2,
- epsilon=TrainTaskConfig.eps)
- else:
- optimizer = fluid.optimizer.SGD(0.003)
- optimizer.minimize(avg_cost)
-
- if args.use_mem_opt:
- fluid.memory_optimize(train_prog)
-
- if args.local:
- logging.info("local start_up:")
- train_loop(exe, train_prog, startup_prog, dev_count, sum_cost, avg_cost,
- token_num, predict, pyreader)
- else:
- if args.update_method == "nccl2":
- trainer_id = int(os.getenv("PADDLE_TRAINER_ID", "0"))
- port = os.getenv("PADDLE_PORT")
- worker_ips = os.getenv("PADDLE_TRAINERS")
- worker_endpoints = []
- for ip in worker_ips.split(","):
- worker_endpoints.append(':'.join([ip, port]))
- trainers_num = len(worker_endpoints)
- current_endpoint = os.getenv("POD_IP") + ":" + port
- if trainer_id == 0:
- logging.info("train_id == 0, sleep 60s")
- time.sleep(60)
- logging.info("trainers_num:{}".format(trainers_num))
- logging.info("worker_endpoints:{}".format(worker_endpoints))
- logging.info("current_endpoint:{}".format(current_endpoint))
- append_nccl2_prepare(startup_prog, trainer_id, worker_endpoints,
- current_endpoint)
- train_loop(exe, train_prog, startup_prog, dev_count, sum_cost,
- avg_cost, token_num, predict, pyreader, trainers_num,
- trainer_id)
- return
-
- port = os.getenv("PADDLE_PORT", "6174")
- pserver_ips = os.getenv("PADDLE_PSERVERS") # ip,ip...
- eplist = []
- for ip in pserver_ips.split(","):
- eplist.append(':'.join([ip, port]))
- pserver_endpoints = ",".join(eplist) # ip:port,ip:port...
- trainers = int(os.getenv("PADDLE_TRAINERS_NUM", "0"))
- current_endpoint = os.getenv("POD_IP") + ":" + port
- trainer_id = int(os.getenv("PADDLE_TRAINER_ID"))
-
- logging.info("pserver_endpoints:{}".format(pserver_endpoints))
- logging.info("current_endpoint:{}".format(current_endpoint))
- logging.info("trainer_id:{}".format(trainer_id))
- logging.info("pserver_ips:{}".format(pserver_ips))
- logging.info("port:{}".format(port))
-
- t = fluid.DistributeTranspiler()
- t.transpile(
- trainer_id,
- pservers=pserver_endpoints,
- trainers=trainers,
- program=train_prog,
- startup_program=startup_prog)
-
- if training_role == "PSERVER":
- logging.info("distributed: pserver started")
- current_endpoint = os.getenv("POD_IP") + ":" + os.getenv(
- "PADDLE_PORT")
- if not current_endpoint:
- logging.critical("need env SERVER_ENDPOINT")
- exit(1)
- pserver_prog = t.get_pserver_program(current_endpoint)
- pserver_startup = t.get_startup_program(current_endpoint,
- pserver_prog)
-
- exe.run(pserver_startup)
- exe.run(pserver_prog)
- elif training_role == "TRAINER":
- logging.info("distributed: trainer started")
- trainer_prog = t.get_trainer_program()
-
- train_loop(exe, train_prog, startup_prog, dev_count, sum_cost,
- avg_cost, token_num, predict, pyreader)
- else:
- logging.critical(
- "environment var TRAINER_ROLE should be TRAINER os PSERVER")
- exit(1)
-
-
-if __name__ == "__main__":
- LOG_FORMAT = "[%(asctime)s %(levelname)s %(filename)s:%(lineno)d] %(message)s"
- logging.basicConfig(
- stream=sys.stdout, level=logging.DEBUG, format=LOG_FORMAT)
-
- args = parse_args()
- train(args)
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/README.md b/fluid/PaddleNLP/sequence_tagging_for_ner/README.md
index 6d4efa9eb19dd708a87d4883dccef5ecb5e11666..d9650941eb9469a557dac879cd5cc52a3b0a03d3 100644
--- a/fluid/PaddleNLP/sequence_tagging_for_ner/README.md
+++ b/fluid/PaddleNLP/sequence_tagging_for_ner/README.md
@@ -1,116 +1,2 @@
-# Named Entity Recognition
-Below is a brief directory layout of this example, with notes:
-
-```text
-.
-├── data # data this example depends on, fetched externally
-├── network_conf.py # model definition
-├── reader.py # data reading interface, fetched externally
-├── README.md # this document
-├── train.py # training script
-├── infer.py # inference script
-├── utils.py # common utility functions, fetched externally
-└── utils_extend.py # extensions to utils.py
-```
-
-
-## Introduction and Model Details
-
-The PaddlePaddle v2 version of [Named Entity Recognition](https://github.com/PaddlePaddle/models/blob/develop/legacy/sequence_tagging_for_ner/README.md) introduces the NER task in detail, so this example does not repeat that introduction.
-For the model, we keep the v2 structure; the only difference is that an LSTM replaces the original RNN.
-
-## Data Preparation
-
-For the full dataset, please follow the instructions in the PaddlePaddle v2 version of [Named Entity Recognition](https://github.com/PaddlePaddle/models/blob/develop/legacy/sequence_tagging_for_ner/README.md). The sample data for this example can likewise be obtained by running data/download.sh.
-
-## Training
-
-1. Run `sh data/download.sh`.
-2. Modify the `main` function of `train.py` to point at your data:
-
- ```python
- main(
- train_data_file="data/train",
- test_data_file="data/test",
- vocab_file="data/vocab.txt",
- target_file="data/target.txt",
- emb_file="data/wordVectors.txt",
- model_save_dir="models",
- num_passes=1000,
- use_gpu=False,
- parallel=False)
- ```
-
-3. Run `python train.py`. **Note: running it as-is uses the sample data; please replace it with real labeled data.**
-
- ```text
- Pass 127, Batch 9525, Cost 4.0867705, Precision 0.3954984, Recall 0.37846154, F1_score0.38679245
- Pass 127, Batch 9530, Cost 3.137265, Precision 0.42971888, Recall 0.38351256, F1_score0.405303
- Pass 127, Batch 9535, Cost 3.6240938, Precision 0.4272152, Recall 0.41795665, F1_score0.4225352
- Pass 127, Batch 9540, Cost 3.5352352, Precision 0.48464164, Recall 0.4536741, F1_score0.46864685
- Pass 127, Batch 9545, Cost 4.1130385, Precision 0.40131578, Recall 0.3836478, F1_score0.39228293
- Pass 127, Batch 9550, Cost 3.6826708, Precision 0.43333334, Recall 0.43730888, F1_score0.43531203
- Pass 127, Batch 9555, Cost 3.6363933, Precision 0.42424244, Recall 0.3962264, F1_score0.4097561
- Pass 127, Batch 9560, Cost 3.6101768, Precision 0.51363635, Recall 0.353125, F1_score0.41851854
- Pass 127, Batch 9565, Cost 3.5935276, Precision 0.5152439, Recall 0.5, F1_score0.5075075
- Pass 127, Batch 9570, Cost 3.4987144, Precision 0.5, Recall 0.4330218, F1_score0.46410686
- Pass 127, Batch 9575, Cost 3.4659843, Precision 0.39864865, Recall 0.38064516, F1_score0.38943896
- Pass 127, Batch 9580, Cost 3.1702557, Precision 0.5, Recall 0.4490446, F1_score0.47315437
- Pass 127, Batch 9585, Cost 3.1587276, Precision 0.49377593, Recall 0.4089347, F1_score0.4473684
- Pass 127, Batch 9590, Cost 3.5043538, Precision 0.4556962, Recall 0.4600639, F1_score0.45786962
- Pass 127, Batch 9595, Cost 2.981989, Precision 0.44981414, Recall 0.45149255, F1_score0.4506518
- [TrainSet] pass_id:127 pass_precision:[0.46023396] pass_recall:[0.43197003] pass_f1_score:[0.44565433]
- [TestSet] pass_id:127 pass_precision:[0.4708409] pass_recall:[0.47971722] pass_f1_score:[0.4752376]
- ```
-## Inference
-1. Modify the `infer` function of [infer.py](./infer.py) to specify the path of the model to test, the test data, the vocabulary file and the target label file. The defaults are:
-
- ```python
- infer(
- model_path="models/params_pass_0",
- batch_size=6,
- test_data_file="data/test",
- vocab_file="data/vocab.txt",
- target_file="data/target.txt",
- use_gpu=False
- )
- ```
-
-2. Run `python infer.py` in a terminal to start testing; you will see predictions like the following (partial output of a model trained for 70 passes):
-
- ```text
- leicestershire B-ORG B-LOC
- extended O O
- their O O
- first O O
- innings O O
- by O O
- DGDG O O
- runs O O
- before O O
- being O O
- bowled O O
- out O O
- for O O
- 296 O O
- with O O
- england B-LOC B-LOC
- discard O O
- andy B-PER B-PER
- caddick I-PER I-PER
- taking O O
- three O O
- for O O
- DGDG O O
- . O O
- ```
-
- The output has three columns separated by "\t": the first is the input word, the second the gold label, and the third the predicted label. Input sequences are separated by blank lines.
-
-## Sample Results
-
-Figure 1. Learning curve: the horizontal axis is the number of training passes, the vertical axis the F1 score.
-
+Hello, this project has been migrated. Please go to the [PaddleNLP/sequence_tagging_for_ner](../../../PaddleNLP/sequence_tagging_for_ner) directory to browse it.
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/infer.py b/fluid/PaddleNLP/sequence_tagging_for_ner/infer.py
deleted file mode 100644
index acf98d0f15f7f493654822751fb2619de20e5505..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/sequence_tagging_for_ner/infer.py
+++ /dev/null
@@ -1,74 +0,0 @@
-from __future__ import print_function
-
-import numpy as np
-import six
-
-import paddle
-import paddle.fluid as fluid
-
-from network_conf import ner_net
-import reader
-from utils import load_dict, load_reverse_dict
-from utils_extend import to_lodtensor
-
-
-def infer(model_path, batch_size, test_data_file, vocab_file, target_file,
- use_gpu):
- """
-    Use the model under model_path to predict labels for the test data;
-    the results are printed to the screen. Returns nothing.
- """
- word_dict = load_dict(vocab_file)
- word_reverse_dict = load_reverse_dict(vocab_file)
-
- label_dict = load_dict(target_file)
- label_reverse_dict = load_reverse_dict(target_file)
-
- test_data = paddle.batch(
- reader.data_reader(test_data_file, word_dict, label_dict),
- batch_size=batch_size)
- place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
- exe = fluid.Executor(place)
-
- inference_scope = fluid.core.Scope()
- with fluid.scope_guard(inference_scope):
- [inference_program, feed_target_names,
- fetch_targets] = fluid.io.load_inference_model(model_path, exe)
- for data in test_data():
- word = to_lodtensor([x[0] for x in data], place)
- mark = to_lodtensor([x[1] for x in data], place)
- target = to_lodtensor([x[2] for x in data], place)
- crf_decode = exe.run(
- inference_program,
- feed={"word": word,
- "mark": mark,
- "target": target},
- fetch_list=fetch_targets,
- return_numpy=False)
- lod_info = (crf_decode[0].lod())[0]
- np_data = np.array(crf_decode[0])
- assert len(data) == len(lod_info) - 1
- for sen_index in six.moves.xrange(len(data)):
- assert len(data[sen_index][0]) == lod_info[
- sen_index + 1] - lod_info[sen_index]
- word_index = 0
- for tag_index in six.moves.xrange(lod_info[sen_index],
- lod_info[sen_index + 1]):
- word = word_reverse_dict[data[sen_index][0][word_index]]
- gold_tag = label_reverse_dict[data[sen_index][2][
- word_index]]
- tag = label_reverse_dict[np_data[tag_index][0]]
- print(word + "\t" + gold_tag + "\t" + tag)
- word_index += 1
- print("")
-
-
-if __name__ == "__main__":
- infer(
- model_path="models/params_pass_0",
- batch_size=6,
- test_data_file="data/test",
- vocab_file="data/vocab.txt",
- target_file="data/target.txt",
- use_gpu=False)
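
`to_lodtensor` is imported from `utils_extend`, which is not part of this diff; a minimal sketch of what such a helper conventionally does (flattening a batch of variable-length sequences into one LoDTensor with cumulative offsets) is, under that assumption:

```python
import numpy as np
import paddle.fluid as fluid

def to_lodtensor(data, place):
    # data: a list of index sequences, e.g. [[3, 7, 2], [5, 1]]
    seq_lens = [len(seq) for seq in data]
    # The LoD is the list of cumulative offsets: [0, 3, 5] for the example above.
    lod = [0]
    for l in seq_lens:
        lod.append(lod[-1] + l)
    flat = np.concatenate(data, axis=0).astype("int64").reshape([-1, 1])
    res = fluid.LoDTensor()
    res.set(flat, place)
    res.set_lod([lod])
    return res
```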
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/train.py b/fluid/PaddleNLP/sequence_tagging_for_ner/train.py
deleted file mode 100644
index f18300e1f11d3021a24bc38767238bc2b86b7c98..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/sequence_tagging_for_ner/train.py
+++ /dev/null
@@ -1,157 +0,0 @@
-from __future__ import print_function
-
-import os
-import math
-import time
-import numpy as np
-import six
-
-import paddle
-import paddle.fluid as fluid
-
-import reader
-from network_conf import ner_net
-from utils import logger, load_dict
-from utils_extend import to_lodtensor, get_embedding
-
-
-def test(exe, chunk_evaluator, inference_program, test_data, test_fetch_list,
- place):
- chunk_evaluator.reset()
- for data in test_data():
- word = to_lodtensor([x[0] for x in data], place)
- mark = to_lodtensor([x[1] for x in data], place)
- target = to_lodtensor([x[2] for x in data], place)
- rets = exe.run(inference_program,
- feed={"word": word,
- "mark": mark,
- "target": target},
- fetch_list=test_fetch_list)
- num_infer = np.array(rets[0])
- num_label = np.array(rets[1])
- num_correct = np.array(rets[2])
- chunk_evaluator.update(num_infer[0], num_label[0], num_correct[0])
- return chunk_evaluator.eval()
-
-
-def main(train_data_file,
- test_data_file,
- vocab_file,
- target_file,
- emb_file,
- model_save_dir,
- num_passes,
- use_gpu,
- parallel,
- batch_size=200):
- if not os.path.exists(model_save_dir):
- os.mkdir(model_save_dir)
-
- word_dict = load_dict(vocab_file)
- label_dict = load_dict(target_file)
-
- word_vector_values = get_embedding(emb_file)
-
- word_dict_len = len(word_dict)
- label_dict_len = len(label_dict)
-
- if "CE_MODE_X" in os.environ:
- fluid.default_startup_program().random_seed = 110
-
- avg_cost, feature_out, word, mark, target = ner_net(
- word_dict_len, label_dict_len, parallel)
-
- sgd_optimizer = fluid.optimizer.SGD(learning_rate=1e-3)
- sgd_optimizer.minimize(avg_cost)
-
- crf_decode = fluid.layers.crf_decoding(
- input=feature_out, param_attr=fluid.ParamAttr(name='crfw'))
-
- (precision, recall, f1_score, num_infer_chunks, num_label_chunks,
- num_correct_chunks) = fluid.layers.chunk_eval(
- input=crf_decode,
- label=target,
- chunk_scheme="IOB",
- num_chunk_types=int(math.ceil((label_dict_len - 1) / 2.0)))
- chunk_evaluator = fluid.metrics.ChunkEvaluator()
-
- inference_program = fluid.default_main_program().clone(for_test=True)
- test_fetch_list = [num_infer_chunks, num_label_chunks, num_correct_chunks]
-
- if "CE_MODE_X" not in os.environ:
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.data_reader(train_data_file, word_dict, label_dict),
- buf_size=20000),
- batch_size=batch_size)
- test_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.data_reader(test_data_file, word_dict, label_dict),
- buf_size=20000),
- batch_size=batch_size)
- else:
- train_reader = paddle.batch(
- reader.data_reader(train_data_file, word_dict, label_dict),
- batch_size=batch_size)
- test_reader = paddle.batch(
- reader.data_reader(test_data_file, word_dict, label_dict),
- batch_size=batch_size)
-
- place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()
- feeder = fluid.DataFeeder(feed_list=[word, mark, target], place=place)
- exe = fluid.Executor(place)
-
- exe.run(fluid.default_startup_program())
-
- embedding_name = 'emb'
- embedding_param = fluid.global_scope().find_var(embedding_name).get_tensor()
- embedding_param.set(word_vector_values, place)
-
- time_begin = time.time()
- for pass_id in six.moves.xrange(num_passes):
- chunk_evaluator.reset()
- for batch_id, data in enumerate(train_reader()):
- cost_var, nums_infer, nums_label, nums_correct = exe.run(
- fluid.default_main_program(),
- feed=feeder.feed(data),
- fetch_list=[
- avg_cost, num_infer_chunks, num_label_chunks,
- num_correct_chunks
- ])
- if batch_id % 5 == 0:
- print("Pass " + str(pass_id) + ", Batch " + str(batch_id) +
- ", Cost " + str(cost_var[0]))
- chunk_evaluator.update(nums_infer, nums_label, nums_correct)
- pass_precision, pass_recall, pass_f1_score = chunk_evaluator.eval()
- print("[TrainSet] pass_id:" + str(pass_id) + " pass_precision:" + str(
- pass_precision) + " pass_recall:" + str(pass_recall) +
- " pass_f1_score:" + str(pass_f1_score))
-
- test_pass_precision, test_pass_recall, test_pass_f1_score = test(
- exe, chunk_evaluator, inference_program, test_reader,
- test_fetch_list, place)
- print("[TestSet] pass_id:" + str(pass_id) + " pass_precision:" + str(
- test_pass_precision) + " pass_recall:" + str(test_pass_recall) +
- " pass_f1_score:" + str(test_pass_f1_score))
-
- save_dirname = os.path.join(model_save_dir, "params_pass_%d" % pass_id)
- fluid.io.save_inference_model(save_dirname, ['word', 'mark', 'target'],
- crf_decode, exe)
-
- if "CE_MODE_X" in os.environ:
- print("kpis train_precision %f" % pass_precision)
- print("kpis test_precision %f" % test_pass_precision)
- print("kpis train_duration %f" % (time.time() - time_begin))
-
-
-if __name__ == "__main__":
- main(
- train_data_file="data/train",
- test_data_file="data/test",
- vocab_file="data/vocab.txt",
- target_file="data/target.txt",
- emb_file="data/wordVectors.txt",
- model_save_dir="models",
- num_passes=2000,
- use_gpu=False,
- parallel=False)
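
One line of the deleted train.py worth unpacking is `num_chunk_types=int(math.ceil((label_dict_len - 1) / 2.0))`: under the IOB scheme every chunk type owns a B- tag and an I- tag, and the one remaining label is O. A quick check of that arithmetic with a hypothetical three-type label set:

```python
import math

# Hypothetical IOB label set with three chunk types (PER, ORG, LOC) plus O.
labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]
label_dict_len = len(labels)  # 7

# Each chunk type owns two tags (B-, I-), so strip O and halve.
num_chunk_types = int(math.ceil((label_dict_len - 1) / 2.0))
assert num_chunk_types == 3
print(num_chunk_types)
```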
diff --git a/fluid/PaddleNLP/text_classification/README.md b/fluid/PaddleNLP/text_classification/README.md
index 43c15934fa62af3db2261be37803ce21ba6bf946..28351d6a035babea1ee1ff6a8e4b0c69780657de 100644
--- a/fluid/PaddleNLP/text_classification/README.md
+++ b/fluid/PaddleNLP/text_classification/README.md
@@ -1,112 +1,2 @@
-# Text Classification
-Below is a brief overview of this example's directory structure:
-
-```text
-.
-├── nets.py              # model definitions
-├── README.md            # this documentation
-├── train.py             # training script
-├── infer.py             # inference script
-└── utils.py             # shared utility functions
-```
-
-
-## Introduction and models
-
-The PaddlePaddle v2 version of [Text Classification](https://github.com/PaddlePaddle/models/blob/develop/text/README.md) introduces the text classification task in detail, so the introduction is not repeated here.
-For the models, we adopt four common text classification architectures: BOW, CNN, LSTM and GRU.
-
-## Training
-
-1. Run `python train.py bow` to start training.
- ```python
-    python train.py bow  # "bow" selects the network structure; replace with cnn, lstm or gru
- ```
-
-2. (Optional) To use a custom network, add it to [nets.py](./nets.py) and set the corresponding parameters in [train.py](./train.py).
- ```python
-    def train(train_reader,     # training data reader
-              word_dict,        # word dictionary
-              network,          # network configuration
-              use_cuda,         # whether to use GPU
-              parallel,         # whether to train in parallel
-              save_dirname,     # directory for saving models
-              lr=0.2,           # learning rate
-              batch_size=128,   # samples per mini-batch
-              pass_num=30):     # number of training passes
- ```
-
-## Sample training output
-```text
- pass_id: 0, avg_acc: 0.848040, avg_cost: 0.354073
- pass_id: 1, avg_acc: 0.914200, avg_cost: 0.217945
- pass_id: 2, avg_acc: 0.929800, avg_cost: 0.184302
- pass_id: 3, avg_acc: 0.938680, avg_cost: 0.164240
- pass_id: 4, avg_acc: 0.945120, avg_cost: 0.149150
- pass_id: 5, avg_acc: 0.951280, avg_cost: 0.137117
- pass_id: 6, avg_acc: 0.955360, avg_cost: 0.126434
- pass_id: 7, avg_acc: 0.961400, avg_cost: 0.117405
- pass_id: 8, avg_acc: 0.963560, avg_cost: 0.110070
- pass_id: 9, avg_acc: 0.965840, avg_cost: 0.103273
- pass_id: 10, avg_acc: 0.969800, avg_cost: 0.096314
- pass_id: 11, avg_acc: 0.971720, avg_cost: 0.090206
- pass_id: 12, avg_acc: 0.974800, avg_cost: 0.084970
- pass_id: 13, avg_acc: 0.977400, avg_cost: 0.078981
- pass_id: 14, avg_acc: 0.980000, avg_cost: 0.073685
- pass_id: 15, avg_acc: 0.981080, avg_cost: 0.069898
- pass_id: 16, avg_acc: 0.982080, avg_cost: 0.064923
- pass_id: 17, avg_acc: 0.984680, avg_cost: 0.060861
- pass_id: 18, avg_acc: 0.985840, avg_cost: 0.057095
- pass_id: 19, avg_acc: 0.988080, avg_cost: 0.052424
- pass_id: 20, avg_acc: 0.989160, avg_cost: 0.049059
- pass_id: 21, avg_acc: 0.990120, avg_cost: 0.045882
- pass_id: 22, avg_acc: 0.992080, avg_cost: 0.042140
- pass_id: 23, avg_acc: 0.992280, avg_cost: 0.039722
- pass_id: 24, avg_acc: 0.992840, avg_cost: 0.036607
- pass_id: 25, avg_acc: 0.994440, avg_cost: 0.034040
- pass_id: 26, avg_acc: 0.995000, avg_cost: 0.031501
- pass_id: 27, avg_acc: 0.995440, avg_cost: 0.028988
- pass_id: 28, avg_acc: 0.996240, avg_cost: 0.026639
- pass_id: 29, avg_acc: 0.996960, avg_cost: 0.024186
-```
-
-## Inference
-1. Run `python infer.py bow_model` to start inference.
-    ```python
-    python infer.py bow_model  # bow_model specifies the model directory to load
-    ```
-
-## Sample inference output
-```text
- model_path: bow_model/epoch0, avg_acc: 0.882800
- model_path: bow_model/epoch1, avg_acc: 0.882360
- model_path: bow_model/epoch2, avg_acc: 0.881400
- model_path: bow_model/epoch3, avg_acc: 0.877800
- model_path: bow_model/epoch4, avg_acc: 0.872920
- model_path: bow_model/epoch5, avg_acc: 0.872640
- model_path: bow_model/epoch6, avg_acc: 0.869960
- model_path: bow_model/epoch7, avg_acc: 0.865160
- model_path: bow_model/epoch8, avg_acc: 0.863680
- model_path: bow_model/epoch9, avg_acc: 0.861200
- model_path: bow_model/epoch10, avg_acc: 0.853520
- model_path: bow_model/epoch11, avg_acc: 0.850400
- model_path: bow_model/epoch12, avg_acc: 0.855960
- model_path: bow_model/epoch13, avg_acc: 0.853480
- model_path: bow_model/epoch14, avg_acc: 0.855960
- model_path: bow_model/epoch15, avg_acc: 0.854120
- model_path: bow_model/epoch16, avg_acc: 0.854160
- model_path: bow_model/epoch17, avg_acc: 0.852240
- model_path: bow_model/epoch18, avg_acc: 0.852320
- model_path: bow_model/epoch19, avg_acc: 0.850280
- model_path: bow_model/epoch20, avg_acc: 0.849760
- model_path: bow_model/epoch21, avg_acc: 0.850160
- model_path: bow_model/epoch22, avg_acc: 0.846800
- model_path: bow_model/epoch23, avg_acc: 0.845440
- model_path: bow_model/epoch24, avg_acc: 0.845640
- model_path: bow_model/epoch25, avg_acc: 0.846200
- model_path: bow_model/epoch26, avg_acc: 0.845880
- model_path: bow_model/epoch27, avg_acc: 0.844880
- model_path: bow_model/epoch28, avg_acc: 0.844680
- model_path: bow_model/epoch29, avg_acc: 0.844960
-```
-Note: the steady decline in accuracy is caused by overfitting; please ignore it.
+Hi! This project has been migrated; please visit [PaddleNLP/text_classification](../../../PaddleNLP/text_classification) to browse it.
diff --git a/fluid/PaddleNLP/text_classification/nets.py b/fluid/PaddleNLP/text_classification/nets.py
deleted file mode 100644
index 6ba637dd087afd8e45ad8d0752ac9850ec49e627..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/text_classification/nets.py
+++ /dev/null
@@ -1,124 +0,0 @@
-import sys
-import time
-import numpy as np
-
-import paddle
-import paddle.fluid as fluid
-
-
-def bow_net(data,
- label,
- dict_dim,
- emb_dim=128,
- hid_dim=128,
- hid_dim2=96,
- class_dim=2):
- """
- bow net
- """
- emb = fluid.layers.embedding(input=data, size=[dict_dim, emb_dim])
- bow = fluid.layers.sequence_pool(input=emb, pool_type='sum')
- bow_tanh = fluid.layers.tanh(bow)
- fc_1 = fluid.layers.fc(input=bow_tanh, size=hid_dim, act="tanh")
- fc_2 = fluid.layers.fc(input=fc_1, size=hid_dim2, act="tanh")
- prediction = fluid.layers.fc(input=[fc_2], size=class_dim, act="softmax")
- cost = fluid.layers.cross_entropy(input=prediction, label=label)
- avg_cost = fluid.layers.mean(x=cost)
- acc = fluid.layers.accuracy(input=prediction, label=label)
-
- return avg_cost, acc, prediction
-
-
-def cnn_net(data,
- label,
- dict_dim,
- emb_dim=128,
- hid_dim=128,
- hid_dim2=96,
- class_dim=2,
- win_size=3):
- """
- conv net
- """
- emb = fluid.layers.embedding(input=data, size=[dict_dim, emb_dim])
-
- conv_3 = fluid.nets.sequence_conv_pool(
- input=emb,
- num_filters=hid_dim,
- filter_size=win_size,
- act="tanh",
- pool_type="max")
-
- fc_1 = fluid.layers.fc(input=[conv_3], size=hid_dim2)
-
- prediction = fluid.layers.fc(input=[fc_1], size=class_dim, act="softmax")
- cost = fluid.layers.cross_entropy(input=prediction, label=label)
- avg_cost = fluid.layers.mean(x=cost)
- acc = fluid.layers.accuracy(input=prediction, label=label)
-
- return avg_cost, acc, prediction
-
-
-def lstm_net(data,
- label,
- dict_dim,
- emb_dim=128,
- hid_dim=128,
- hid_dim2=96,
- class_dim=2,
- emb_lr=30.0):
- """
- lstm net
- """
- emb = fluid.layers.embedding(
- input=data,
- size=[dict_dim, emb_dim],
- param_attr=fluid.ParamAttr(learning_rate=emb_lr))
-
- fc0 = fluid.layers.fc(input=emb, size=hid_dim * 4)
-
- lstm_h, c = fluid.layers.dynamic_lstm(
- input=fc0, size=hid_dim * 4, is_reverse=False)
-
- lstm_max = fluid.layers.sequence_pool(input=lstm_h, pool_type='max')
- lstm_max_tanh = fluid.layers.tanh(lstm_max)
-
- fc1 = fluid.layers.fc(input=lstm_max_tanh, size=hid_dim2, act='tanh')
-
- prediction = fluid.layers.fc(input=fc1, size=class_dim, act='softmax')
-
- cost = fluid.layers.cross_entropy(input=prediction, label=label)
- avg_cost = fluid.layers.mean(x=cost)
- acc = fluid.layers.accuracy(input=prediction, label=label)
-
- return avg_cost, acc, prediction
-
-
-def gru_net(data,
- label,
- dict_dim,
- emb_dim=128,
- hid_dim=128,
- hid_dim2=96,
- class_dim=2,
- emb_lr=400.0):
- """
- gru net
- """
- emb = fluid.layers.embedding(
- input=data,
- size=[dict_dim, emb_dim],
- param_attr=fluid.ParamAttr(learning_rate=emb_lr))
-
- fc0 = fluid.layers.fc(input=emb, size=hid_dim * 3)
- gru_h = fluid.layers.dynamic_gru(input=fc0, size=hid_dim, is_reverse=False)
- gru_max = fluid.layers.sequence_pool(input=gru_h, pool_type='max')
- gru_max_tanh = fluid.layers.tanh(gru_max)
- fc1 = fluid.layers.fc(input=gru_max_tanh, size=hid_dim2, act='tanh')
- prediction = fluid.layers.fc(input=fc1, size=class_dim, act='softmax')
-
- cost = fluid.layers.cross_entropy(input=prediction, label=label)
- avg_cost = fluid.layers.mean(x=cost)
- acc = fluid.layers.accuracy(input=prediction, label=label)
-
- return avg_cost, acc, prediction
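
To make the pooling in `bow_net` concrete: `sequence_pool(pool_type='sum')` adds up the embedding vectors of all tokens in a sequence before the tanh and fc layers. A toy numpy sketch of that step (sizes and values are made up):

```python
import numpy as np

np.random.seed(0)
dict_dim, emb_dim = 10, 4
emb_table = np.random.randn(dict_dim, emb_dim).astype("float32")

# Two variable-length sequences of token ids.
batch = [[1, 3, 5], [2, 2]]

# sequence_pool(pool_type='sum'): one vector per sequence.
pooled = np.stack([emb_table[ids].sum(axis=0) for ids in batch])
bow_tanh = np.tanh(pooled)  # the tanh applied before the fc layers
print(bow_tanh.shape)  # (2, 4)
```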
diff --git a/fluid/PaddleNLP/text_classification/train.py b/fluid/PaddleNLP/text_classification/train.py
deleted file mode 100644
index 159266f3956b950afa200e9f53c9fdc6c36309aa..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/text_classification/train.py
+++ /dev/null
@@ -1,144 +0,0 @@
-import os
-import six
-import sys
-import time
-import unittest
-import contextlib
-
-import paddle
-import paddle.fluid as fluid
-
-import utils
-from nets import bow_net
-from nets import cnn_net
-from nets import lstm_net
-from nets import gru_net
-
-
-def train(train_reader,
- word_dict,
- network,
- use_cuda,
- parallel,
- save_dirname,
- lr=0.2,
- batch_size=128,
- pass_num=30):
- """
- train network
- """
- data = fluid.layers.data(
- name="words", shape=[1], dtype="int64", lod_level=1)
-
- label = fluid.layers.data(name="label", shape=[1], dtype="int64")
-
- if not parallel:
- cost, acc, prediction = network(data, label, len(word_dict))
- else:
- places = fluid.layers.device.get_places(device_count=2)
- pd = fluid.layers.ParallelDo(places)
- with pd.do():
- cost, acc, prediction = network(
- pd.read_input(data), pd.read_input(label), len(word_dict))
-
- pd.write_output(cost)
- pd.write_output(acc)
-
- cost, acc = pd()
- cost = fluid.layers.mean(cost)
- acc = fluid.layers.mean(acc)
-
- sgd_optimizer = fluid.optimizer.Adagrad(learning_rate=lr)
- sgd_optimizer.minimize(cost)
-
- place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
- exe = fluid.Executor(place)
- feeder = fluid.DataFeeder(feed_list=[data, label], place=place)
-
- # For internal continuous evaluation
- if "CE_MODE_X" in os.environ:
- fluid.default_startup_program().random_seed = 110
- exe.run(fluid.default_startup_program())
- for pass_id in six.moves.xrange(pass_num):
- pass_start = time.time()
- data_size, data_count, total_acc, total_cost = 0, 0, 0.0, 0.0
- for data in train_reader():
- avg_cost_np, avg_acc_np = exe.run(fluid.default_main_program(),
- feed=feeder.feed(data),
- fetch_list=[cost, acc])
- data_size = len(data)
- total_acc += data_size * avg_acc_np
- total_cost += data_size * avg_cost_np
- data_count += data_size
- avg_cost = total_cost / data_count
-
- avg_acc = total_acc / data_count
- print("pass_id: %d, avg_acc: %f, avg_cost: %f, pass_time_cost: %f" %
- (pass_id, avg_acc, avg_cost, time.time() - pass_start))
-
- epoch_model = save_dirname + "/" + "epoch" + str(pass_id)
- fluid.io.save_inference_model(epoch_model, ["words", "label"], acc, exe)
-
- pass_end = time.time()
- # For internal continuous evaluation
- if "CE_MODE_X" in os.environ:
- print("kpis train_acc %f" % avg_acc)
- print("kpis train_cost %f" % avg_cost)
- print("kpis train_duration %f" % (pass_end - pass_start))
-
-
-def train_net():
- word_dict, train_reader, test_reader = utils.prepare_data(
- "imdb", self_dict=False, batch_size=4, buf_size=50000)
-
- if sys.argv[1] == "bow":
- train(
- train_reader,
- word_dict,
- bow_net,
- use_cuda=False,
- parallel=False,
- save_dirname="bow_model",
- lr=0.002,
- pass_num=30,
- batch_size=4)
- elif sys.argv[1] == "cnn":
- train(
- train_reader,
- word_dict,
- cnn_net,
- use_cuda=True,
- parallel=False,
- save_dirname="cnn_model",
- lr=0.01,
- pass_num=30,
- batch_size=4)
- elif sys.argv[1] == "lstm":
- train(
- train_reader,
- word_dict,
- lstm_net,
- use_cuda=True,
- parallel=False,
- save_dirname="lstm_model",
- lr=0.05,
- pass_num=30,
- batch_size=4)
- elif sys.argv[1] == "gru":
- train(
- train_reader,
- word_dict,
-            gru_net,
- use_cuda=True,
- parallel=False,
- save_dirname="gru_model",
- lr=0.05,
- pass_num=30,
- batch_size=4)
- else:
- print("network name cannot be found!")
- sys.exit(1)
-
-
-if __name__ == "__main__":
- train_net()
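
The `if`/`elif` chain in `train_net` makes it easy to pass the wrong constructor (note that the gru branch must use `gru_net`, not `lstm_net`). One alternative, sketched here under the assumption that the `train` function and the `bow_net`/`cnn_net`/`lstm_net`/`gru_net` constructors from the files above are in scope, is a name-to-constructor table:

```python
# Sketch: dispatch network constructors by name instead of an if/elif chain.
NETWORKS = {
    "bow": (bow_net, dict(use_cuda=False, lr=0.002)),
    "cnn": (cnn_net, dict(use_cuda=True, lr=0.01)),
    "lstm": (lstm_net, dict(use_cuda=True, lr=0.05)),
    "gru": (gru_net, dict(use_cuda=True, lr=0.05)),
}

def train_net_by_name(name, train_reader, word_dict):
    if name not in NETWORKS:
        raise SystemExit("network name cannot be found: %s" % name)
    net, opts = NETWORKS[name]
    train(train_reader, word_dict, net,
          parallel=False, save_dirname=name + "_model",
          pass_num=30, batch_size=4, **opts)
```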
diff --git a/fluid/PaddleNLP/text_matching_on_quora/README.md b/fluid/PaddleNLP/text_matching_on_quora/README.md
index 77d93943ae7dcbe775e60307b74430f320dbaab1..2b268a809b14a9d88a13a12ec4b73d02b7bccf78 100644
--- a/fluid/PaddleNLP/text_matching_on_quora/README.md
+++ b/fluid/PaddleNLP/text_matching_on_quora/README.md
@@ -1,177 +1,6 @@
-# Text matching on the Quora question-answer pair dataset
-## Contents
+Hi!
-* [Introduction](#introduction)
- * [a brief review of the Quora Question Pair (QQP) Task](#a-brief-review-of-the-quora-question-pair-qqp-task)
- * [Our Work](#our-work)
-* [Environment Preparation](#environment-preparation)
- * [Install Fluid release 1.0](#install-fluid-release-10)
- * [cpu version](#cpu-version)
- * [gpu version](#gpu-version)
- * [Have I installed Fluid successfully?](#have-i-installed-fluid-successfully)
-* [Prepare Data](#prepare-data)
-* [Train and evaluate](#train-and-evaluate)
-* [Models](#models)
-* [Results](#results)
+This directory has been deprecated.
-
-## Introduction
-
-### a brief review of the Quora Question Pair (QQP) Task
-
-The [Quora Question Pair](https://data.quora.com/First-Quora-Dataset-Release-Question-Pairs) dataset contains 400,000 question pairs from [Quora](https://www.quora.com/), where people ask and answer questions on specific topics. Each sample in the dataset consists of two English questions and a label indicating whether the two questions are duplicates. The dataset is well annotated by humans.
-
-Below are two samples from the dataset. The last column indicates whether the two questions are duplicate (1) or not (0).
-
-|id | qid1 | qid2| question1| question2| is_duplicate
-|:---:|:---:|:---:|:---:|:---:|:---:|
-|0 |1 |2 |What is the step by step guide to invest in share market in india? |What is the step by step guide to invest in share market? |0|
-|1 |3 |4 |What is the story of Kohinoor (Koh-i-Noor) Diamond? | What would happen if the Indian government stole the Kohinoor (Koh-i-Noor) diamond back? |0|
-
- A [Kaggle competition](https://www.kaggle.com/c/quora-question-pairs#description) was held on this dataset in 2017. Participants were given a labeled training set and asked to make predictions on an unlabeled test set; submissions were scored by the log-likelihood loss on the test data.
-
-The competition inspired much effective work. However, most of the resulting models are rule-based and hard to transfer to new tasks. Researchers are therefore seeking more general models that work well on this task and on other natural language processing (NLP) tasks.
-
-[Wang _et al._](https://arxiv.org/abs/1702.03814) proposed a bilateral multi-perspective matching (BIMPM) model based on the Quora Question Pair dataset. They split the original dataset into [3 parts](https://drive.google.com/file/d/0B0PlTAo--BnaQWlsZl9FZ3l1c28/view?usp=sharing): _train.tsv_ (384,348 samples), _dev.tsv_ (10,000 samples) and _test.tsv_ (10,000 samples). The class distribution of _train.tsv_ is unbalanced (37% positive, 63% negative), while _dev.tsv_ and _test.tsv_ are balanced (50% positive, 50% negative). We used the same split in our experiments.
-
-### Our Work
-
-Based on the Quora Question Pair dataset, we implemented several classic models from the area of neural language understanding (NLU). Prediction accuracy is evaluated on the _test.tsv_ split from [Wang _et al._](https://arxiv.org/abs/1702.03814).
-
-## Environment Preparation
-
-### Install Fluid release 1.0
-
-Please follow the [official document in English](http://www.paddlepaddle.org/documentation/docs/en/1.0/build_and_install/pip_install_en.html) or [official document in Chinese](http://www.paddlepaddle.org/documentation/docs/zh/1.0/beginners_guide/install/Start.html) to install the Fluid deep learning framework.
-
-#### Have I installed Fluid successfully?
-
-Run the following script from your command line:
-
-```shell
-python -c "import paddle"
-```
-
-If Fluid is installed successfully you should see no error message. Feel free to open issues under the [PaddlePaddle repository](https://github.com/PaddlePaddle/Paddle/issues) for support.
-
-## Prepare Data
-
-Please download the Quora dataset from [Google drive](https://drive.google.com/file/d/0B0PlTAo--BnaQWlsZl9FZ3l1c28/view?usp=sharing) and unzip it to $HOME/.cache/paddle/dataset.
-
-Then run _data/prepare_quora_data.sh_ to download the pre-trained _word2vec_ embedding file -- _glove.840B.300d.zip_:
-
-```shell
-sh data/prepare_quora_data.sh
-```
-
-At this point the dataset directory ($HOME/.cache/paddle/dataset) structure should be:
-
-```shell
-
-$HOME/.cache/paddle/dataset
- |- Quora_question_pair_partition
- |- train.tsv
- |- test.tsv
- |- dev.tsv
- |- readme.txt
- |- wordvec.txt
- |- glove.840B.300d.txt
-```
-
-## Train and evaluate
-
-We provide multiple models and configurations. Details are shown in `models` and `configs` directories. For a quick start, please run the _cdssmNet_ model with the corresponding configuration:
-
-```shell
-python train_and_evaluate.py \
- --model_name=cdssmNet \
- --config=cdssm_base
-```
-
-Logs will be printed to the console. If everything works well, the logging information will have the same format as the content in _cdssm_base.log_.
-
-All configurations used in our experiments are as follows:
-
-|Model|Config|command
-|:----:|:----:|:----:|
-|cdssmNet|cdssm_base|python train_and_evaluate.py --model_name=cdssmNet --config=cdssm_base
-|DecAttNet|decatt_glove|python train_and_evaluate.py --model_name=DecAttNet --config=decatt_glove
-|InferSentNet|infer_sent_v1|python train_and_evaluate.py --model_name=InferSentNet --config=infer_sent_v1
-|InferSentNet|infer_sent_v2|python train_and_evaluate.py --model_name=InferSentNet --config=infer_sent_v2
-|SSENet|sse_base|python train_and_evaluate.py --model_name=SSENet --config=sse_base
-
-## Models
-
-We have implemented four models so far: the convolutional deep-structured semantic model (CDSSM, CNN-based), the InferSent model (RNN-based), the shortcut-stacked encoder (SSE, RNN-based), and the decomposed attention model (DecAtt, attention-based).
-
-|Model|features|Context Encoder|Match Layer|Classification Layer
-|:----:|:----:|:----:|:----:|:----:|
-|CDSSM|word|1 layer conv1d|concatenation|MLP
-|DecAtt|word|Attention|concatenation|MLP
-|InferSent|word|1 layer Bi-LSTM|concatenation / element-wise product / absolute element-wise difference|MLP
-|SSE|word|3 layer Bi-LSTM|concatenation / element-wise product / absolute element-wise difference|MLP
-
-### CDSSM
-
-```
-@inproceedings{shen2014learning,
- title={Learning semantic representations using convolutional neural networks for web search},
- author={Shen, Yelong and He, Xiaodong and Gao, Jianfeng and Deng, Li and Mesnil, Gr{\'e}goire},
- booktitle={Proceedings of the 23rd International Conference on World Wide Web},
- pages={373--374},
- year={2014},
- organization={ACM}
-}
-```
-
-### InferSent
-
-```
-@article{conneau2017supervised,
- title={Supervised learning of universal sentence representations from natural language inference data},
- author={Conneau, Alexis and Kiela, Douwe and Schwenk, Holger and Barrault, Loic and Bordes, Antoine},
- journal={arXiv preprint arXiv:1705.02364},
- year={2017}
-}
-```
-
-### SSE
-
-```
-@article{nie2017shortcut,
- title={Shortcut-stacked sentence encoders for multi-domain inference},
- author={Nie, Yixin and Bansal, Mohit},
- journal={arXiv preprint arXiv:1708.02312},
- year={2017}
-}
-```
-
-### DecAtt
-
-```
-@article{tomar2017neural,
- title={Neural paraphrase identification of questions with noisy pretraining},
- author={Tomar, Gaurav Singh and Duque, Thyago and T{\"a}ckstr{\"o}m, Oscar and Uszkoreit, Jakob and Das, Dipanjan},
- journal={arXiv preprint arXiv:1704.04565},
- year={2017}
-}
-```
-
-## Results
-
-|Model|Config|dev accuracy| test accuracy
-|:----:|:----:|:----:|:----:|
-|cdssmNet|cdssm_base|83.56%|82.83%|
-|DecAttNet|decatt_glove|86.31%|86.22%|
-|InferSentNet|infer_sent_v1|87.15%|86.62%|
-|InferSentNet|infer_sent_v2|88.55%|88.43%|
-|SSENet|sse_base|88.35%|88.25%|
-
-In our experiments, LSTM-based models outperformed convolution-based models. The DecAtt model has fewer parameters than the LSTM-based models, but is sensitive to hyper-parameters.
-
-
-
-
-
-
+Please visit the project at [PaddleNLP/text_matching_on_quora](../../../PaddleNLP/text_matching_on_quora).
diff --git a/fluid/PaddleNLP/text_matching_on_quora/pretrained_word2vec.py b/fluid/PaddleNLP/text_matching_on_quora/pretrained_word2vec.py
deleted file mode 100755
index a3f3422e8bb2d4065978d43378e6c607b00141a4..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/text_matching_on_quora/pretrained_word2vec.py
+++ /dev/null
@@ -1,56 +0,0 @@
-# Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-"""
-This module provides pretrained word embeddings.
-"""
-
-from __future__ import print_function, unicode_literals
-import numpy as np
-import time, datetime
-import os, sys
-
-
-def Glove840B_300D(filepath, keys=None):
- """
- input: the "glove.840B.300d.txt" file path
- return: a dict, key: word (unicode), value: a numpy array with shape [300]
- """
- if keys is not None:
- assert(isinstance(keys, set))
- print("loading word2vec from ", filepath)
-    print("this may take a minute.")
- start = time.time()
- word2vec = {}
- with open(filepath, "r") as f:
- for line in f:
- if sys.version_info <= (3, 0): # for python2
- line = line.decode('utf-8')
- info = line.strip("\n").split(" ")
- word = info[0]
- if (keys is not None) and (word not in keys):
- continue
- vector = info[1:]
- assert(len(vector) == 300)
- word2vec[word] = np.asarray(vector, dtype='float32')
-
- end = time.time()
- print("Spent ", str(datetime.timedelta(seconds=end-start)), " on loading word2vec.")
- return word2vec
-
-if __name__ == '__main__':
- from os.path import expanduser
- home = expanduser("~")
- embed_dict = Glove840B_300D(os.path.join(home, "./.cache/paddle/dataset/glove.840B.300d.txt"))
- exit(0)
diff --git a/fluid/PaddleNLP/text_matching_on_quora/quora_question_pairs.py b/fluid/PaddleNLP/text_matching_on_quora/quora_question_pairs.py
deleted file mode 100755
index d27fa4fdeb598b84fa2069df01951158f78b1834..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/text_matching_on_quora/quora_question_pairs.py
+++ /dev/null
@@ -1,194 +0,0 @@
-# Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-"""
-"""
-
-import paddle.dataset.common
-import collections
-import tarfile
-import re
-import string
-import random
-import os, sys
-import nltk
-from os.path import expanduser
-
-
-__all__ = ['word_dict', 'train', 'dev', 'test']
-
-URL = "https://drive.google.com/file/d/0B0PlTAo--BnaQWlsZl9FZ3l1c28/view"
-
-DATA_HOME = os.path.expanduser('~/.cache/paddle/dataset')
-DATA_DIR = "Quora_question_pair_partition"
-
-QUORA_TRAIN_FILE_NAME = os.path.join(DATA_HOME, DATA_DIR, 'train.tsv')
-QUORA_DEV_FILE_NAME = os.path.join(DATA_HOME, DATA_DIR, 'dev.tsv')
-QUORA_TEST_FILE_NAME = os.path.join(DATA_HOME, DATA_DIR, 'test.tsv')
-
-# punctuation or nltk or space
-TOKENIZE_METHOD='space'
-
-COLUMN_COUNT = 4
-
-
-def tokenize(s):
- if sys.version_info <= (3, 0): # for python2
- s = s.decode('utf-8')
- if TOKENIZE_METHOD == "nltk":
- return nltk.tokenize.word_tokenize(s)
- elif TOKENIZE_METHOD == "punctuation":
- return s.translate({ord(char): None for char in string.punctuation}).lower().split()
- elif TOKENIZE_METHOD == "space":
- return s.split()
- else:
- raise RuntimeError("Invalid tokenize method")
-
-
-def maybe_open(file_name):
- if not os.path.isfile(file_name):
-        msg = "file does not exist: %s\nPlease download the dataset first from: %s\n\n" % (file_name, URL) + \
-              ("# The final dataset dir should look like\n\n"
- "$HOME/.cache/paddle/dataset\n"
- " |- Quora_question_pair_partition\n"
- " |- train.tsv\n"
- " |- test.tsv\n"
- " |- dev.tsv\n"
- " |- readme.txt\n"
- " |- wordvec.txt\n")
- raise RuntimeError(msg)
-
- return open(file_name, 'r')
-
-
-def tokenized_question_pairs(file_name):
- """
- """
- with maybe_open(file_name) as f:
- questions = {}
- lines = f.readlines()
- for line in lines:
- info = line.strip().split('\t')
- if len(info) != COLUMN_COUNT:
- # formatting error
- continue
- (label, question1, question2, id) = info
- question1 = tokenize(question1)
- question2 = tokenize(question2)
- yield question1, question2, int(label)
-
-
-def tokenized_questions(file_name):
- """
- """
- with maybe_open(file_name) as f:
- lines = f.readlines()
- for line in lines:
- info = line.strip().split('\t')
- if len(info) != COLUMN_COUNT:
- # formatting error
- continue
- (label, question1, question2, id) = info
- yield tokenize(question1)
- yield tokenize(question2)
-
-
-def build_dict(file_name, cutoff):
- """
- Build a word dictionary from the corpus. Keys of the dictionary are words,
- and values are zero-based IDs of these words.
- """
- word_freq = collections.defaultdict(int)
- for doc in tokenized_questions(file_name):
- for word in doc:
- word_freq[word] += 1
-
- word_freq = filter(lambda x: x[1] > cutoff, word_freq.items())
-
- dictionary = sorted(word_freq, key=lambda x: (-x[1], x[0]))
- words, _ = list(zip(*dictionary))
- word_idx = dict(zip(words, range(len(words))))
-    word_idx['<unk>'] = len(words)
-    word_idx['<pad>'] = len(words) + 1
- return word_idx
-
-
-def reader_creator(file_name, word_idx):
-    UNK_ID = word_idx['<unk>']
-
- def reader():
- for (q1, q2, label) in tokenized_question_pairs(file_name):
- q1_ids = [word_idx.get(w, UNK_ID) for w in q1]
- q2_ids = [word_idx.get(w, UNK_ID) for w in q2]
- if q1_ids != [] and q2_ids != []: # [] is not allowed in fluid
- assert(label in [0, 1])
- yield q1_ids, q2_ids, label
-
- return reader
-
-
-def train(word_idx):
- """
- Quora training set creator.
-
- It returns a reader creator, each sample in the reader is two zero-based ID
- list and label in [0, 1].
-
- :param word_idx: word dictionary
- :type word_idx: dict
- :return: Training reader creator
- :rtype: callable
- """
- return reader_creator(QUORA_TRAIN_FILE_NAME, word_idx)
-
-
-def dev(word_idx):
- """
-    Quora dev set creator.
-
- It returns a reader creator, each sample in the reader is two zero-based ID
- list and label in [0, 1].
-
- :param word_idx: word dictionary
- :type word_idx: dict
-    :return: dev set reader creator
- :rtype: callable
-
- """
- return reader_creator(QUORA_DEV_FILE_NAME, word_idx)
-
-def test(word_idx):
- """
- Quora test set creator.
-
- It returns a reader creator, each sample in the reader is two zero-based ID
- list and label in [0, 1].
-
- :param word_idx: word dictionary
- :type word_idx: dict
- :return: Test reader creator
- :rtype: callable
- """
- return reader_creator(QUORA_TEST_FILE_NAME, word_idx)
-
-
-def word_dict():
- """
- Build a word dictionary from the corpus.
-
- :return: Word dictionary
- :rtype: dict
- """
- return build_dict(file_name=QUORA_TRAIN_FILE_NAME, cutoff=4)
-
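
Putting the module above together, its readers are consumed roughly as follows (a sketch; it assumes the deleted module is importable as `quora_question_pairs` and uses the old `paddle.batch` API that train_and_evaluate.py also relies on):

```python
# Sketch: how the quora_question_pairs readers fit together.
import paddle
import quora_question_pairs as qqp

word_idx = qqp.word_dict()  # build the vocabulary from train.tsv
train_batches = paddle.batch(qqp.train(word_idx), batch_size=128)

for batch in train_batches():
    # Each sample is (question1_ids, question2_ids, label).
    q1_ids, q2_ids, label = batch[0]
    break
```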
diff --git a/fluid/PaddleNLP/text_matching_on_quora/train_and_evaluate.py b/fluid/PaddleNLP/text_matching_on_quora/train_and_evaluate.py
deleted file mode 100755
index 0cca171933fac9dfc47baaf45d551b65d69c2f7a..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/text_matching_on_quora/train_and_evaluate.py
+++ /dev/null
@@ -1,271 +0,0 @@
-#Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import print_function
-
-import os
-import sys
-import time
-import argparse
-import unittest
-import contextlib
-import numpy as np
-
-import paddle.fluid as fluid
-
-import utils, metric, configs
-import models
-
-from pretrained_word2vec import Glove840B_300D
-
-parser = argparse.ArgumentParser(description=__doc__)
-
-parser.add_argument('--model_name', type=str, default='cdssmNet', help="Which model to train")
-parser.add_argument('--config', type=str, default='cdssm_base', help="The global config setting")
-
-DATA_DIR = os.path.join(os.path.expanduser('~'), '.cache/paddle/dataset')
-
-def evaluate(epoch_id, exe, inference_program, dev_reader, test_reader, fetch_list, feeder, metric_type):
- """
-    Evaluate the model on the dev and test datasets.
- """
- def infer(test_reader):
- """
-        Run inference over one reader and compute the configured metrics.
- """
- total_cost = 0.0
- total_count = 0
- preds, labels = [], []
- for data in test_reader():
- avg_cost, avg_acc, batch_prediction = exe.run(inference_program,
- feed=feeder.feed(data),
- fetch_list=fetch_list,
- return_numpy=True)
- total_cost += avg_cost * len(data)
- total_count += len(data)
- preds.append(batch_prediction)
- labels.append(np.asarray([x[-1] for x in data], dtype=np.int64))
- y_pred = np.concatenate(preds)
- y_label = np.concatenate(labels)
-
- metric_res = []
- for metric_name in metric_type:
- if metric_name == 'accuracy_with_threshold':
- metric_res.append((metric_name, metric.accuracy_with_threshold(y_pred, y_label, threshold=0.3)))
- elif metric_name == 'accuracy':
- metric_res.append((metric_name, metric.accuracy(y_pred, y_label)))
- else:
- print("Unknown metric type: ", metric_name)
- exit()
- return total_cost / (total_count * 1.0), metric_res
-
- dev_cost, dev_metric_res = infer(dev_reader)
- print("[%s] epoch_id: %d, dev_cost: %f, " % (
- time.asctime( time.localtime(time.time()) ),
- epoch_id,
- dev_cost)
- + ', '.join([str(x[0]) + ": " + str(x[1]) for x in dev_metric_res]))
-
- test_cost, test_metric_res = infer(test_reader)
- print("[%s] epoch_id: %d, test_cost: %f, " % (
- time.asctime( time.localtime(time.time()) ),
- epoch_id,
- test_cost)
- + ', '.join([str(x[0]) + ": " + str(x[1]) for x in test_metric_res]))
- print("")
-
-
-def train_and_evaluate(train_reader,
- dev_reader,
- test_reader,
- network,
- optimizer,
- global_config,
- pretrained_word_embedding,
- use_cuda,
- parallel):
- """
- train network
- """
-
- # define the net
- if global_config.use_lod_tensor:
- # automatic add batch dim
- q1 = fluid.layers.data(
- name="question1", shape=[1], dtype="int64", lod_level=1)
- q2 = fluid.layers.data(
- name="question2", shape=[1], dtype="int64", lod_level=1)
- label = fluid.layers.data(name="label", shape=[1], dtype="int64")
- cost, acc, prediction = network(q1, q2, label)
- else:
- # shape: [batch_size, max_seq_len_in_batch, 1]
- q1 = fluid.layers.data(
- name="question1", shape=[-1, -1, 1], dtype="int64")
- q2 = fluid.layers.data(
- name="question2", shape=[-1, -1, 1], dtype="int64")
- # shape: [batch_size, max_seq_len_in_batch]
- mask1 = fluid.layers.data(name="mask1", shape=[-1, -1], dtype="float32")
- mask2 = fluid.layers.data(name="mask2", shape=[-1, -1], dtype="float32")
- label = fluid.layers.data(name="label", shape=[1], dtype="int64")
- cost, acc, prediction = network(q1, q2, mask1, mask2, label)
-
- if parallel:
-        # TODO: parallel training
- print("Parallel Training is not supported for now.")
- sys.exit(1)
-
- #optimizer.minimize(cost)
- if use_cuda:
- print("Using GPU")
- place = fluid.CUDAPlace(0)
- else:
- print("Using CPU")
- place = fluid.CPUPlace()
- exe = fluid.Executor(place)
-
- if global_config.use_lod_tensor:
- feeder = fluid.DataFeeder(feed_list=[q1, q2, label], place=place)
- else:
- feeder = fluid.DataFeeder(feed_list=[q1, q2, mask1, mask2, label], place=place)
-
- # logging param info
- for param in fluid.default_main_program().global_block().all_parameters():
- print("param name: %s; param shape: %s" % (param.name, param.shape))
-
- # define inference_program
- inference_program = fluid.default_main_program().clone(for_test=True)
-
- optimizer.minimize(cost)
-
- exe.run(fluid.default_startup_program())
-
-    # load embeddings from a numpy array
- if pretrained_word_embedding is not None:
- print("loading pretrained word embedding to param")
- embedding_name = "emb.w"
- embedding_param = fluid.global_scope().find_var(embedding_name).get_tensor()
- embedding_param.set(pretrained_word_embedding, place)
-
- evaluate(-1,
- exe,
- inference_program,
- dev_reader,
- test_reader,
- fetch_list=[cost, acc, prediction],
- feeder=feeder,
- metric_type=global_config.metric_type)
-
- # start training
- print("[%s] Start Training" % time.asctime(time.localtime(time.time())))
- for epoch_id in range(global_config.epoch_num):
- data_size, data_count, total_acc, total_cost = 0, 0, 0.0, 0.0
- batch_id = 0
- epoch_begin_time = time.time()
- for data in train_reader():
- avg_cost_np, avg_acc_np = exe.run(fluid.default_main_program(),
- feed=feeder.feed(data),
- fetch_list=[cost, acc])
- data_size = len(data)
- total_acc += data_size * avg_acc_np
- total_cost += data_size * avg_cost_np
- data_count += data_size
- if batch_id % 100 == 0:
- print("[%s] epoch_id: %d, batch_id: %d, cost: %f, acc: %f" % (
- time.asctime(time.localtime(time.time())),
- epoch_id,
- batch_id,
- avg_cost_np,
- avg_acc_np))
- batch_id += 1
-
- avg_cost = total_cost / data_count
- avg_acc = total_acc / data_count
-
- print("")
- print("[%s] epoch_id: %d, train_avg_cost: %f, train_avg_acc: %f, epoch_time_cost: %f" % (
- time.asctime( time.localtime(time.time())),
- epoch_id, avg_cost, avg_acc,
- time.time() - epoch_begin_time))
-
- epoch_model = global_config.save_dirname + "/" + "epoch" + str(epoch_id)
- fluid.io.save_inference_model(epoch_model, ["question1", "question2", "label"], acc, exe)
-
- evaluate(epoch_id,
- exe,
- inference_program,
- dev_reader,
- test_reader,
- fetch_list=[cost, acc, prediction],
- feeder=feeder,
- metric_type=global_config.metric_type)
-
-def main():
- """
- This function will parse argments, prepare data and prepare pretrained embedding
- """
- args = parser.parse_args()
- global_config = configs.__dict__[args.config]()
-
- print("net_name: ", args.model_name)
- net = models.__dict__[args.model_name](global_config)
-
- # get word_dict
- word_dict = utils.getDict(data_type="quora_question_pairs")
-
- # get reader
- train_reader, dev_reader, test_reader = utils.prepare_data(
- "quora_question_pairs",
- word_dict=word_dict,
- batch_size = global_config.batch_size,
- buf_size=800000,
- duplicate_data=global_config.duplicate_data,
- use_pad=(not global_config.use_lod_tensor))
-
- # load pretrained_word_embedding
- if global_config.use_pretrained_word_embedding:
- word2vec = Glove840B_300D(filepath=os.path.join(DATA_DIR, "glove.840B.300d.txt"),
- keys=set(word_dict.keys()))
- pretrained_word_embedding = utils.get_pretrained_word_embedding(
- word2vec=word2vec,
- word2id=word_dict,
- config=global_config)
-        print("pretrained_word_embedding to be loaded:", pretrained_word_embedding)
- else:
- pretrained_word_embedding = None
-
- # define optimizer
- optimizer = utils.getOptimizer(global_config)
-
- # use cuda or not
- if not global_config.has_member('use_cuda'):
- if 'CUDA_VISIBLE_DEVICES' in os.environ and os.environ['CUDA_VISIBLE_DEVICES'] != '':
- global_config.use_cuda = True
- else:
- global_config.use_cuda = False
-
- global_config.list_config()
-
- train_and_evaluate(
- train_reader,
- dev_reader,
- test_reader,
- net,
- optimizer,
- global_config,
- pretrained_word_embedding,
- use_cuda=global_config.use_cuda,
- parallel=False)
-
-if __name__ == "__main__":
- main()
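
When `use_lod_tensor` is false, the script above feeds padded id tensors plus float masks (`mask1`, `mask2`); the actual padding is done in `utils.prepare_data`, which is not shown. A minimal sketch of building such a padded batch with masks (the helper name `pad_batch` is hypothetical):

```python
import numpy as np

def pad_batch(seqs, pad_id=0):
    # seqs: list of variable-length id lists for one batch.
    max_len = max(len(s) for s in seqs)
    ids = np.full((len(seqs), max_len, 1), pad_id, dtype="int64")
    mask = np.zeros((len(seqs), max_len), dtype="float32")
    for i, s in enumerate(seqs):
        ids[i, :len(s), 0] = s
        mask[i, :len(s)] = 1.0  # 1 for real tokens, 0 for padding
    return ids, mask

ids, mask = pad_batch([[4, 8, 15], [16, 23]])
print(ids.shape, mask.sum())  # (2, 3, 1) 5.0
```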
diff --git a/fluid/PaddleRec/ctr/README.cn.md b/fluid/PaddleRec/ctr/README.cn.md
index 04c5eb9c0c2bc1d7a02b296cfeb601e132864b6c..81cd20625701c13fce3a3f8ad119663a6e5c162c 100644
--- a/fluid/PaddleRec/ctr/README.cn.md
+++ b/fluid/PaddleRec/ctr/README.cn.md
@@ -1,76 +1,2 @@
-# DNN-based Click-Through Rate Prediction
-
-## Introduction
-This model implements the DNN model proposed in the following paper:
-
-```text
-@inproceedings{guo2017deepfm,
- title={DeepFM: A Factorization-Machine based Neural Network for CTR Prediction},
- author={Huifeng Guo, Ruiming Tang, Yunming Ye, Zhenguo Li and Xiuqiang He},
- booktitle={the Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI)},
- pages={1725--1731},
- year={2017}
-}
-```
-
-## Environment
-Install PaddlePaddle Fluid first, then run:
-
-```shell
-pip install -r requirements.txt
-```
-
-## Dataset
-This example uses the Criteo dataset from the [Display Advertising Challenge](https://www.kaggle.com/c/criteo-display-ad-challenge/) hosted by Kaggle.
-
-Each row holds the features of one ad impression, and the first column is a label indicating whether the ad was clicked. There are 39 features in total: 13 integer features and 26 categorical features. The test set has no labels.
-
-Download the dataset:
-```bash
-cd data && ./download.sh && cd ..
-```
-
-## Model
-This example implements only the DNN part of the model described in the DeepFM paper; the full DeepFM model will be provided in another example.
-
-
-## Data preparation
-To preprocess the raw dataset, the integer features are min-max normalized to [0, 1] and the categorical features are one-hot encoded. The raw dataset is split into two parts: 90% for training and the remaining 10% for validation during training.
-
-## Training
-Command-line options for training can be listed with `python train.py -h`.
-
-### Local training:
-```bash
-python train.py \
- --train_data_path data/raw/train.txt \
- 2>&1 | tee train.log
-```
-
-After batch 40000 of pass 1, the test AUC is 0.801178 and the cost is 0.445196.
-
-### Distributed training
-
-Launch a local distributed training job with 2 trainers and 2 parameter servers. In the distributed setting the training data is sharded by trainer id, so the trainers' data does not overlap and training is more efficient.
-
-```bash
-sh cluster_train.sh
-```
-
-## Inference
-Command-line options for inference can be listed with `python infer.py -h`.
-
-Run inference on the test set:
-```bash
-python infer.py \
- --model_path models/pass-0/ \
- --data_path data/raw/valid.txt
-```
-Note: the AUC printed at the very end of infer.py is the overall AUC for the whole prediction file.
-
-## Cluster training on Baidu Cloud
-1. Follow [train_on_baidu_cloud](https://github.com/PaddlePaddle/FluidDoc/blob/develop/doc/fluid/user_guides/howto/training/train_on_baidu_cloud_cn.rst) to deploy a CPU cluster on Baidu Cloud.
-1. Use preprocess.py to process the training data and generate train.txt.
-1. Split train.txt into as many parts as there are cluster machines and put one part on each machine.
-1. Start the distributed training job with the command from the `Distributed training` section above.
+Hi! This project has been migrated; please visit [PaddleRec/ctr](../../../PaddleRec/ctr) to browse it.
diff --git a/fluid/PaddleRec/ctr/README.md b/fluid/PaddleRec/ctr/README.md
index 13e71190de3952c1d37b31eff141e2250f2d4c8b..1aceff1350c2c28b13ec92ccf82e321bb3ddda04 100644
--- a/fluid/PaddleRec/ctr/README.md
+++ b/fluid/PaddleRec/ctr/README.md
@@ -1,93 +1,6 @@
-# DNN for Click-Through Rate prediction
+Hi!
-## Introduction
-This model implements the DNN part proposed in the following paper:
+This directory has been deprecated.
-```text
-@inproceedings{guo2017deepfm,
- title={DeepFM: A Factorization-Machine based Neural Network for CTR Prediction},
- author={Huifeng Guo, Ruiming Tang, Yunming Ye, Zhenguo Li and Xiuqiang He},
- booktitle={the Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI)},
- pages={1725--1731},
- year={2017}
-}
-```
-
-DeepFM combines factorization machines and deep neural networks to model
-both low-order and high-order feature interactions. For details on
-factorization machines, please refer to the paper [factorization
-machines](https://www.csie.ntu.edu.tw/~b97053/paper/Rendle2010FM.pdf).
-
-## Environment
-You should install PaddlePaddle Fluid first, and run:
-
-```shell
-pip install -r requirements.txt
-```
-
-## Dataset
-This example uses Criteo dataset which was used for the [Display Advertising
-Challenge](https://www.kaggle.com/c/criteo-display-ad-challenge/)
-hosted by Kaggle.
-
-Each row is the features for an ad display and the first column is a label
-indicating whether this ad has been clicked or not. There are 39 features in
-total. 13 features take integer values and the other 26 features are
-categorical features. For the test dataset, the labels are omitted.
-
-Download dataset:
-```bash
-cd data && ./download.sh && cd ..
-```
-
-## Model
-This demo implements only the DNN part of the model described in the DeepFM
-paper. The full DeepFM model will be provided in another example.
-
-
-## Data Preprocessing method
-To preprocess the raw dataset, the integer features are clipped and then
-min-max normalized to [0, 1], and the categorical features are one-hot
-encoded. The raw training dataset is split so that 90% is used for training
-and the other 10% for validation during training. In reader.py, the training
-data is the first 90% of train.txt and the validation data is the rest.
-
-## Train
-The command line options for training can be listed by `python train.py -h`.
-
-### Local Train:
-```bash
-python train.py \
- --train_data_path data/raw/train.txt \
- 2>&1 | tee train.log
-```
-
-After batch 40000 of pass 1, the test AUC is `0.801178` and the test
-cost is `0.445196`.
-
-### Distributed Train
-Run distributed training with 2 parameter servers and 2 trainers on a single machine.
-In the distributed setting, training data is sharded by trainer_id so that the
-trainers' training data does not overlap.
-
-```bash
-sh cluster_train.sh
-```
-
-## Infer
-The command-line options for inference can be listed with `python infer.py -h`.
-
-To run inference on the test dataset:
-```bash
-python infer.py \
- --model_path models/ \
- --data_path data/raw/train.txt
-```
-Note: the AUC value in the last log line is the overall AUC for the whole test dataset. Here, train.txt is split inside reader.py so that the validation data does not overlap with the training data.
-
-## Train on Baidu Cloud
-1. Prepare some CPU machines on Baidu Cloud following the steps in [train_on_baidu_cloud](https://github.com/PaddlePaddle/FluidDoc/blob/develop/doc/fluid/user_guides/howto/training/train_on_baidu_cloud_cn.rst).
-1. Prepare the dataset using preprocess.py.
-1. Split train.txt into trainer_num parts and put them on the machines.
-1. Run training on the cluster using the command in `Distributed Train` above.
\ No newline at end of file
+Please visit the project at [PaddleRec/ctr](../../../PaddleRec/ctr).
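
Both READMEs stress that only the AUC accumulated over the whole prediction file is meaningful, not the per-batch values. For reference, a self-contained rank-based AUC sketch (independent of the fluid `auc` op; ties are ignored for brevity):

```python
import numpy as np

def auc(scores, labels):
    # Rank-based AUC: probability that a random positive outranks a random negative.
    scores = np.asarray(scores, dtype="float64")
    labels = np.asarray(labels)
    order = scores.argsort()
    ranks = np.empty_like(order, dtype="float64")
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    # Sum of positive ranks, corrected by the minimum possible sum.
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2.0) / (n_pos * n_neg)

print(auc([0.9, 0.2, 0.6, 0.3], [1, 0, 1, 0]))  # 1.0: perfect ranking
```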
diff --git a/fluid/PaddleRec/ctr/data/download.sh b/fluid/PaddleRec/ctr/data/download.sh
deleted file mode 100755
index 466a22f2c6cc885cea0a1468f3043cb59c611b59..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/ctr/data/download.sh
+++ /dev/null
@@ -1,8 +0,0 @@
-#!/bin/bash
-
-wget --no-check-certificate https://s3-eu-west-1.amazonaws.com/criteo-labs/dac.tar.gz
-tar zxf dac.tar.gz
-rm -f dac.tar.gz
-
-mkdir raw
-mv ./*.txt raw/
diff --git a/fluid/PaddleRec/ctr/infer.py b/fluid/PaddleRec/ctr/infer.py
deleted file mode 100644
index 24f4f7bf9be694e7cb3442632c4945dd63a17c4e..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/ctr/infer.py
+++ /dev/null
@@ -1,91 +0,0 @@
-import argparse
-import logging
-
-import numpy as np
-# disable gpu training for this example
-import os
-os.environ["CUDA_VISIBLE_DEVICES"] = ""
-import paddle
-import paddle.fluid as fluid
-
-import reader
-from network_conf import ctr_dnn_model
-
-
-logging.basicConfig(
- format='%(asctime)s - %(levelname)s - %(message)s')
-logger = logging.getLogger("fluid")
-logger.setLevel(logging.INFO)
-
-
-def parse_args():
- parser = argparse.ArgumentParser(description="PaddlePaddle DeepFM example")
- parser.add_argument(
- '--model_path',
- type=str,
- required=True,
- help="The path of model parameters gz file")
- parser.add_argument(
- '--data_path',
- type=str,
- required=True,
- help="The path of the dataset to infer")
- parser.add_argument(
- '--embedding_size',
- type=int,
- default=10,
- help="The size for embedding layer (default:10)")
- parser.add_argument(
- '--sparse_feature_dim',
- type=int,
- default=1000001,
- help="The size for embedding layer (default:1000001)")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=1000,
- help="The size of mini-batch (default:1000)")
-
- return parser.parse_args()
-
-
-def infer():
- args = parse_args()
-
- place = fluid.CPUPlace()
- inference_scope = fluid.core.Scope()
-
- dataset = reader.CriteoDataset(args.sparse_feature_dim)
- test_reader = paddle.batch(dataset.test([args.data_path]), batch_size=args.batch_size)
-
- startup_program = fluid.framework.Program()
- test_program = fluid.framework.Program()
- with fluid.framework.program_guard(test_program, startup_program):
- loss, data_list, auc_var, batch_auc_var = ctr_dnn_model(args.embedding_size, args.sparse_feature_dim)
-
- exe = fluid.Executor(place)
-
- feeder = fluid.DataFeeder(feed_list=data_list, place=place)
-
- with fluid.scope_guard(inference_scope):
- [inference_program, _, fetch_targets] = fluid.io.load_inference_model(args.model_path, exe)
-
- def set_zero(var_name):
- param = inference_scope.var(var_name).get_tensor()
- param_array = np.zeros(param._get_dims()).astype("int64")
- param.set(param_array, place)
-
- auc_states_names = ['_generated_var_2', '_generated_var_3']
- for name in auc_states_names:
- set_zero(name)
-
- for batch_id, data in enumerate(test_reader()):
- loss_val, auc_val = exe.run(inference_program,
- feed=feeder.feed(data),
- fetch_list=fetch_targets)
- if batch_id % 100 == 0:
- logger.info("TEST --> batch: {} loss: {} auc: {}".format(batch_id, loss_val/args.batch_size, auc_val))
-
-
-if __name__ == '__main__':
- infer()
diff --git a/fluid/PaddleRec/ctr/network_conf.py b/fluid/PaddleRec/ctr/network_conf.py
deleted file mode 100644
index 4593c16eb2e732096aaf3aa076d3366347a35a16..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/ctr/network_conf.py
+++ /dev/null
@@ -1,46 +0,0 @@
-import paddle.fluid as fluid
-import math
-
-dense_feature_dim = 13
-
-def ctr_dnn_model(embedding_size, sparse_feature_dim):
- dense_input = fluid.layers.data(
- name="dense_input", shape=[dense_feature_dim], dtype='float32')
- sparse_input_ids = [
- fluid.layers.data(
- name="C" + str(i), shape=[1], lod_level=1, dtype='int64')
- for i in range(1, 27)
- ]
-
- def embedding_layer(input):
- return fluid.layers.embedding(
- input=input,
- is_sparse=True,
- # you need to patch https://github.com/PaddlePaddle/Paddle/pull/14190
- # if you want to set is_distributed to True
- is_distributed=False,
- size=[sparse_feature_dim, embedding_size],
- param_attr=fluid.ParamAttr(name="SparseFeatFactors", initializer=fluid.initializer.Uniform()))
-
-    sparse_embed_seq = list(map(embedding_layer, sparse_input_ids))
- concated = fluid.layers.concat(sparse_embed_seq + [dense_input], axis=1)
-
- fc1 = fluid.layers.fc(input=concated, size=400, act='relu',
- param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(scale=1/math.sqrt(concated.shape[1]))))
- fc2 = fluid.layers.fc(input=fc1, size=400, act='relu',
- param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(scale=1/math.sqrt(fc1.shape[1]))))
- fc3 = fluid.layers.fc(input=fc2, size=400, act='relu',
- param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(scale=1/math.sqrt(fc2.shape[1]))))
- predict = fluid.layers.fc(input=fc3, size=2, act='softmax',
- param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(scale=1/math.sqrt(fc3.shape[1]))))
-
- label = fluid.layers.data(name='label', shape=[1], dtype='int64')
-
- data_list = [dense_input] + sparse_input_ids + [label]
-
- cost = fluid.layers.cross_entropy(input=predict, label=label)
- avg_cost = fluid.layers.reduce_sum(cost)
- accuracy = fluid.layers.accuracy(input=predict, label=label)
- auc_var, batch_auc_var, auc_states = fluid.layers.auc(input=predict, label=label, num_thresholds=2**12, slide_steps=20)
-
- return avg_cost, data_list, auc_var, batch_auc_var
diff --git a/fluid/PaddleRec/ctr/preprocess.py b/fluid/PaddleRec/ctr/preprocess.py
deleted file mode 100755
index bf5673ba9f2d441c6f0e09196282a444147c682d..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/ctr/preprocess.py
+++ /dev/null
@@ -1,164 +0,0 @@
-"""
-Preprocess Criteo dataset. This dataset was used for the Display Advertising
-Challenge (https://www.kaggle.com/c/criteo-display-ad-challenge).
-"""
-import os
-import sys
-import click
-import random
-import collections
-
-# There are 13 integer features and 26 categorical features
-continous_features = range(1, 14)
-categorial_features = range(14, 40)
-
-# Clip integer features. The clip point for each integer feature
-# is derived from the 95% quantile of the total values in each feature
-continous_clip = [20, 600, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
-
-
-class CategoryDictGenerator:
- """
- Generate dictionary for each of the categorical features
- """
-
- def __init__(self, num_feature):
- self.dicts = []
- self.num_feature = num_feature
- for i in range(0, num_feature):
- self.dicts.append(collections.defaultdict(int))
-
- def build(self, datafile, categorial_features, cutoff=0):
- with open(datafile, 'r') as f:
- for line in f:
- features = line.rstrip('\n').split('\t')
- for i in range(0, self.num_feature):
- if features[categorial_features[i]] != '':
- self.dicts[i][features[categorial_features[i]]] += 1
- for i in range(0, self.num_feature):
- self.dicts[i] = filter(lambda x: x[1] >= cutoff,
- self.dicts[i].items())
- self.dicts[i] = sorted(self.dicts[i], key=lambda x: (-x[1], x[0]))
- vocabs, _ = list(zip(*self.dicts[i]))
- self.dicts[i] = dict(zip(vocabs, range(1, len(vocabs) + 1)))
-            self.dicts[i]['<unk>'] = 0
-
- def gen(self, idx, key):
- if key not in self.dicts[idx]:
- res = self.dicts[idx]['']
- else:
- res = self.dicts[idx][key]
- return res
-
-    def dicts_sizes(self):
-        return list(map(len, self.dicts))  # list() so the result can be indexed under Python 3
-
-
-class ContinuousFeatureGenerator:
- """
- Normalize the integer features to [0, 1] by min-max normalization
- """
-
- def __init__(self, num_feature):
- self.num_feature = num_feature
-        self.min = [sys.maxsize] * num_feature  # sys.maxsize works on both Python 2 and 3
-        self.max = [-sys.maxsize] * num_feature
-
- def build(self, datafile, continous_features):
- with open(datafile, 'r') as f:
- for line in f:
- features = line.rstrip('\n').split('\t')
- for i in range(0, self.num_feature):
- val = features[continous_features[i]]
- if val != '':
- val = int(val)
- if val > continous_clip[i]:
- val = continous_clip[i]
- self.min[i] = min(self.min[i], val)
- self.max[i] = max(self.max[i], val)
-
- def gen(self, idx, val):
- if val == '':
- return 0.0
- val = float(val)
- return (val - self.min[idx]) / (self.max[idx] - self.min[idx])
-
-
-@click.command("preprocess")
-@click.option("--datadir", type=str, help="Path to raw criteo dataset")
-@click.option("--outdir", type=str, help="Path to save the processed data")
-def preprocess(datadir, outdir):
- """
- All 13 integer features are normalized to continuous values and these continuous
- features are combined into one vector with dimension of 13.
-
- Each of the 26 categorical features are one-hot encoded and all the one-hot
- vectors are combined into one sparse binary vector.
- """
- dists = ContinuousFeatureGenerator(len(continous_features))
- dists.build(os.path.join(datadir, 'train.txt'), continous_features)
-
- dicts = CategoryDictGenerator(len(categorial_features))
- dicts.build(
- os.path.join(datadir, 'train.txt'), categorial_features, cutoff=200)
-
- dict_sizes = dicts.dicts_sizes()
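-    # cumulative offsets make each categorical slot's ids unique in the
-    # concatenated one-hot feature space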
- categorial_feature_offset = [0]
- for i in range(1, len(categorial_features)):
- offset = categorial_feature_offset[i - 1] + dict_sizes[i - 1]
- categorial_feature_offset.append(offset)
-
- random.seed(0)
-
- # 90% of the data are used for training, and 10% of the data are used
- # for validation.
- with open(os.path.join(outdir, 'train.txt'), 'w') as out_train:
- with open(os.path.join(outdir, 'valid.txt'), 'w') as out_valid:
- with open(os.path.join(datadir, 'train.txt'), 'r') as f:
- for line in f:
- features = line.rstrip('\n').split('\t')
-
- continous_vals = []
- for i in range(0, len(continous_features)):
- val = dists.gen(i, features[continous_features[i]])
- continous_vals.append("{0:.6f}".format(val).rstrip('0')
- .rstrip('.'))
- categorial_vals = []
- for i in range(0, len(categorial_features)):
- val = dicts.gen(i, features[categorial_features[
- i]]) + categorial_feature_offset[i]
- categorial_vals.append(str(val))
-
- continous_vals = ','.join(continous_vals)
- categorial_vals = ','.join(categorial_vals)
- label = features[0]
- if random.randint(0, 9999) % 10 != 0:
- out_train.write('\t'.join(
- [continous_vals, categorial_vals, label]) + '\n')
- else:
- out_valid.write('\t'.join(
- [continous_vals, categorial_vals, label]) + '\n')
-
- with open(os.path.join(outdir, 'test.txt'), 'w') as out:
- with open(os.path.join(datadir, 'test.txt'), 'r') as f:
- for line in f:
- features = line.rstrip('\n').split('\t')
-
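-                # the Criteo test file has no label column, so every feature
-                # index shifts left by one compared with train.txt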
- continous_vals = []
- for i in range(0, len(continous_features)):
- val = dists.gen(i, features[continous_features[i] - 1])
- continous_vals.append("{0:.6f}".format(val).rstrip('0')
- .rstrip('.'))
- categorial_vals = []
- for i in range(0, len(categorial_features)):
- val = dicts.gen(i, features[categorial_features[
- i] - 1]) + categorial_feature_offset[i]
- categorial_vals.append(str(val))
-
- continous_vals = ','.join(continous_vals)
- categorial_vals = ','.join(categorial_vals)
- out.write('\t'.join([continous_vals, categorial_vals]) + '\n')
-
-
-if __name__ == "__main__":
- preprocess()
diff --git a/fluid/PaddleRec/ctr/reader.py b/fluid/PaddleRec/ctr/reader.py
deleted file mode 100644
index 851839c35b79dea87f51d5aeb5eb1491b7670377..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/ctr/reader.py
+++ /dev/null
@@ -1,52 +0,0 @@
-class Dataset:
- def __init__(self):
- pass
-
-class CriteoDataset(Dataset):
- def __init__(self, sparse_feature_dim):
- self.cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
- self.cont_max_ = [20, 600, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
- self.cont_diff_ = [20, 603, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
- self.hash_dim_ = sparse_feature_dim
- # here, training data are lines with line_index < train_idx_
- self.train_idx_ = 41256555
- self.continuous_range_ = range(1, 14)
- self.categorical_range_ = range(14, 40)
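-        # sparse ids are produced below by the hashing trick,
-        # hash("<slot>_<raw value>") % hash_dim_, so no vocabulary file is needed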
-
- def _reader_creator(self, file_list, is_train, trainer_num, trainer_id):
- def reader():
- for file in file_list:
- with open(file, 'r') as f:
- line_idx = 0
- for line in f:
- line_idx += 1
- if is_train and line_idx > self.train_idx_:
- break
- elif not is_train and line_idx <= self.train_idx_:
- continue
- if line_idx % trainer_num != trainer_id:
- continue
- features = line.rstrip('\n').split('\t')
- dense_feature = []
- sparse_feature = []
- for idx in self.continuous_range_:
- if features[idx] == '':
- dense_feature.append(0.0)
- else:
- dense_feature.append((float(features[idx]) - self.cont_min_[idx - 1]) / self.cont_diff_[idx - 1])
- for idx in self.categorical_range_:
- sparse_feature.append([hash("%d_%s" % (idx, features[idx])) % self.hash_dim_])
-
- label = [int(features[0])]
- yield [dense_feature] + sparse_feature + [label]
-
- return reader
-
- def train(self, file_list, trainer_num, trainer_id):
- return self._reader_creator(file_list, True, trainer_num, trainer_id)
-
- def test(self, file_list):
-        return self._reader_creator(file_list, False, 1, 0)
-
-    def infer(self, file_list):
-        return self._reader_creator(file_list, False, 1, 0)
diff --git a/fluid/PaddleRec/ctr/train.py b/fluid/PaddleRec/ctr/train.py
deleted file mode 100644
index 0b4e1762c4a0f43698ce4f0a34cf11a6fc9e3269..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/ctr/train.py
+++ /dev/null
@@ -1,197 +0,0 @@
-from __future__ import print_function
-
-import argparse
-import logging
-import os
-import time
-
-# disable gpu training for this example
-os.environ["CUDA_VISIBLE_DEVICES"] = ""
-
-import paddle
-import paddle.fluid as fluid
-
-import reader
-from network_conf import ctr_dnn_model
-
-logging.basicConfig(
- format='%(asctime)s - %(levelname)s - %(message)s')
-logger = logging.getLogger("fluid")
-logger.setLevel(logging.INFO)
-
-
-def parse_args():
- parser = argparse.ArgumentParser(description="PaddlePaddle CTR example")
- parser.add_argument(
- '--train_data_path',
- type=str,
- default='./data/raw/train.txt',
- help="The path of training dataset")
- parser.add_argument(
- '--test_data_path',
- type=str,
- default='./data/raw/valid.txt',
- help="The path of testing dataset")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=1000,
- help="The size of mini-batch (default:1000)")
- parser.add_argument(
- '--embedding_size',
- type=int,
- default=10,
- help="The size for embedding layer (default:10)")
- parser.add_argument(
- '--num_passes',
- type=int,
- default=10,
- help="The number of passes to train (default: 10)")
- parser.add_argument(
- '--model_output_dir',
- type=str,
- default='models',
- help='The path for model to store (default: models)')
- parser.add_argument(
- '--sparse_feature_dim',
- type=int,
- default=1000001,
- help='sparse feature hashing space for index processing')
- parser.add_argument(
- '--is_local',
- type=int,
- default=1,
- help='Local train or distributed train (default: 1)')
- parser.add_argument(
- '--cloud_train',
- type=int,
- default=0,
- help='Local train or distributed train on paddlecloud (default: 0)')
- parser.add_argument(
- '--async_mode',
- action='store_true',
- default=False,
- help='Whether start pserver in async mode to support ASGD')
- parser.add_argument(
- '--no_split_var',
- action='store_true',
- default=False,
- help='Whether split variables into blocks when update_method is pserver')
-    # the following arguments are used for distributed training; if is_local == false, you should set them
-    parser.add_argument(
-        '--role',
-        type=str,
-        default='pserver',  # trainer or pserver
-        help='The role of this node: trainer or pserver (default: pserver)')
-    parser.add_argument(
-        '--endpoints',
-        type=str,
-        default='127.0.0.1:6000',
-        help='The pserver endpoints, like: 127.0.0.1:6000,127.0.0.1:6001')
-    parser.add_argument(
-        '--current_endpoint',
-        type=str,
-        default='127.0.0.1:6000',
-        help='The endpoint of the current pserver (default: 127.0.0.1:6000)')
-    parser.add_argument(
-        '--trainer_id',
-        type=int,
-        default=0,
-        help='The id of the current trainer (default: 0)')
-    parser.add_argument(
-        '--trainers',
-        type=int,
-        default=1,
-        help='The number of trainers (default: 1)')
-
- return parser.parse_args()
-
-
-def train_loop(args, train_program, data_list, loss, auc_var, batch_auc_var,
- trainer_num, trainer_id):
- dataset = reader.CriteoDataset(args.sparse_feature_dim)
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- dataset.train([args.train_data_path], trainer_num, trainer_id),
- buf_size=args.batch_size * 100),
- batch_size=args.batch_size)
- place = fluid.CPUPlace()
-
- feeder = fluid.DataFeeder(feed_list=data_list, place=place)
- data_name_list = [var.name for var in data_list]
-
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
- for pass_id in range(args.num_passes):
- pass_start = time.time()
- for batch_id, data in enumerate(train_reader()):
- loss_val, auc_val, batch_auc_val = exe.run(
- train_program,
- feed=feeder.feed(data),
- fetch_list=[loss, auc_var, batch_auc_var]
- )
- logger.info("TRAIN --> pass: {} batch: {} loss: {} auc: {}, batch_auc: {}"
- .format(pass_id, batch_id, loss_val/args.batch_size, auc_val, batch_auc_val))
- if batch_id % 1000 == 0 and batch_id != 0:
- model_dir = args.model_output_dir + '/batch-' + str(batch_id)
- if args.trainer_id == 0:
- fluid.io.save_inference_model(model_dir, data_name_list, [loss, auc_var], exe)
- print("pass_id: %d, pass_time_cost: %f" % (pass_id, time.time() - pass_start))
- model_dir = args.model_output_dir + '/pass-' + str(pass_id)
- if args.trainer_id == 0:
- fluid.io.save_inference_model(model_dir, data_name_list, [loss, auc_var], exe)
-
-
-def train():
- args = parse_args()
-
- if not os.path.isdir(args.model_output_dir):
- os.mkdir(args.model_output_dir)
-
- loss, data_list, auc_var, batch_auc_var = ctr_dnn_model(args.embedding_size, args.sparse_feature_dim)
- optimizer = fluid.optimizer.Adam(learning_rate=1e-4)
- optimizer.minimize(loss)
- if args.cloud_train:
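-        # on paddlecloud, the cluster topology is taken from environment
-        # variables rather than from command-line flags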
- # the port of all pservers, needed by both trainer and pserver
- port = os.getenv("PADDLE_PORT", "6174")
-        # comma separated ips of all pservers, needed by both trainer and pserver
-        pserver_ips = os.getenv("PADDLE_PSERVERS", "")
- eplist = []
- for ip in pserver_ips.split(","):
- eplist.append(':'.join([ip, port]))
- args.endpoints = ",".join(eplist)
- args.trainers = int(os.getenv("PADDLE_TRAINERS_NUM", "1"))
- args.current_endpoint = os.getenv("POD_IP", "localhost") + ":" + port
- args.role = os.getenv("TRAINING_ROLE", "TRAINER")
- args.trainer_id = int(os.getenv("PADDLE_TRAINER_ID", "0"))
- args.is_local = bool(int(os.getenv("PADDLE_IS_LOCAL", 0)))
-
-
- if args.is_local:
- logger.info("run local training")
- main_program = fluid.default_main_program()
- train_loop(args, main_program, data_list, loss, auc_var, batch_auc_var, 1, 0)
- else:
- logger.info("run dist training")
- t = fluid.DistributeTranspiler()
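-        # the transpiler splits the single-node program into separate
-        # pserver and trainer programs according to this node's role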
- t.transpile(args.trainer_id, pservers=args.endpoints, trainers=args.trainers)
- if args.role == "pserver" or args.role == "PSERVER":
- logger.info("run pserver")
- prog = t.get_pserver_program(args.current_endpoint)
- startup = t.get_startup_program(args.current_endpoint, pserver_program=prog)
- exe = fluid.Executor(fluid.CPUPlace())
- exe.run(startup)
- exe.run(prog)
- elif args.role == "trainer" or args.role == "TRAINER":
- logger.info("run trainer")
- train_prog = t.get_trainer_program()
- train_loop(args, train_prog, data_list, loss, auc_var, batch_auc_var,
- args.trainers, args.trainer_id)
- else:
-            raise ValueError(
-                'role must be either TRAINER or PSERVER (set --role or the TRAINING_ROLE environment variable)'
-            )
-
-
-if __name__ == '__main__':
- train()
diff --git a/fluid/PaddleRec/din/README.md b/fluid/PaddleRec/din/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..6e2df0301cf20434dc3479da8c93644f764c5c42
--- /dev/null
+++ b/fluid/PaddleRec/din/README.md
@@ -0,0 +1,2 @@
+
+Hi! This project has been migrated. Please visit [PaddleRec/din](../../../PaddleRec/din) to browse it.
diff --git a/fluid/PaddleRec/gnn/README.md b/fluid/PaddleRec/gnn/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..1ac21f3ee4712ead33f44322447d30fe5aa45918
--- /dev/null
+++ b/fluid/PaddleRec/gnn/README.md
@@ -0,0 +1,2 @@
+
+Hi! This project has been migrated. Please visit [PaddleRec/gnn](../../../PaddleRec/gnn) to browse it.
diff --git a/fluid/PaddleRec/gru4rec/README.md b/fluid/PaddleRec/gru4rec/README.md
index e88e6172a9df26d629bcaa3199770be0100a1101..9fe28eba00760b67c532e4624a5722cfd62feb57 100644
--- a/fluid/PaddleRec/gru4rec/README.md
+++ b/fluid/PaddleRec/gru4rec/README.md
@@ -1,251 +1,2 @@
-# GRU4REC
-The following is a brief directory structure and description for this example:
-
-```text
-.
-├── README.md            # documentation
-├── train.py             # training script
-├── infer.py             # inference script
-├── net.py               # network definition
-├── text2paddle.py       # converts text data to paddle input format
-├── cluster_train.py     # distributed training
-├── cluster_train.sh     # distributed training launcher script
-├── utils                # common utilities
-├── convert_format.py    # data format conversion
-├── vocab.txt            # small-sample vocabulary
-├── train_data           # small-sample training directory
-└── test_data            # small-sample test directory
-
-```
-
-
-## Introduction
-
-The GRU4REC model is described in the paper [Session-based Recommendations with Recurrent Neural Networks](https://arxiv.org/abs/1511.06939).
-
-The paper's contribution is the first application of an RNN (GRU) to session-based recommendation, with clear gains over traditional KNN and matrix-factorization approaches.
-
-The core idea is to treat the items a user clicks within one session as a sequence and to train the RNN on such sequences. At inference time, given a known click sequence as input, the model predicts the item most likely to be clicked next.
-
-Session-based recommendation applies to a wide range of sequential data, such as product browsing, news clicks, and location check-ins.
-
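-As a small (hypothetical) illustration matching the reader in utils.py, each session click sequence yields next-item training pairs:
-```python
-session = [214536502, 214536500, 214536506, 214577561]
-src_seq, trg_seq = session[:-1], session[1:]  # input sequence and next-item targets
-```
-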
-To run the bundled sample program, you can skip the 'RSC15 data download and preprocessing' section.
-## RSC15 data download and preprocessing
-
-Run the following commands to download the official RSC15 dataset:
-```
-curl -Lo yoochoose-data.7z https://s3-eu-west-1.amazonaws.com/yc-rdata/yoochoose-data.7z
-7z x yoochoose-data.7z
-```
-
-For GRU4REC data filtering, download the script [https://github.com/hidasib/GRU4Rec/blob/master/examples/rsc15/preprocess.py](https://github.com/hidasib/GRU4Rec/blob/master/examples/rsc15/preprocess.py)
-
-and remember to adjust the file paths in it:
-
-line 12: PATH_TO_ORIGINAL_DATA = './'
-
-line 13: PATH_TO_PROCESSED_DATA = './'
-
-Note that the script must be run with python3:
-```
-python preprocess.py
-```
-The generated data format looks like this:
-
-```
-SessionId ItemId Time
-1 214536502 1396839069.277
-1 214536500 1396839249.868
-1 214536506 1396839286.998
-1 214577561 1396839420.306
-2 214662742 1396850197.614
-2 214662742 1396850239.373
-2 214825110 1396850317.446
-2 214757390 1396850390.71
-2 214757407 1396850438.247
-```
-
-The data format then needs to be converted; run:
-```
-python convert_format.py
-```
-
-The training and test data look like the following; each line is one user's time-ordered click sequence:
-
-```
-214536502 214536500 214536506 214577561
-214662742 214662742 214825110 214757390 214757407 214551617
-214716935 214774687 214832672
-214836765 214706482
-214701242 214826623
-214826835 214826715
-214838855 214838855
-214576500 214576500 214576500
-214821275 214821275 214821371 214821371 214821371 214717089 214563337 214706462 214717436 214743335 214826837 214819762
-214717867 214717867
-```
-
-Generate the vocabulary and the corresponding paddle input files from the training and test files.
-
-Note that the training files must be placed in one directory and the test files in another; multiple training files are supported.
-```
-python text2paddle.py raw_train_data/ raw_test_data/ train_data test_data vocab.txt
-```
-
-The converted format looks as follows; see train_data/small_train.txt for reference:
-```
-197 196 198 236
-93 93 384 362 363 43
-336 364 407
-421 322
-314 388
-128 58
-138 138
-46 46 46
-34 34 57 57 57 342 228 321 346 357 59 376
-110 110
-```
-
-## Training
-'--use_cuda 1' enables the GPU (default: CPU); '--parallel 1' enables multiple cards (default: a single card).
-
-The full set of options can be listed with
-```
-python train.py -h
-```
-
-GPU environment:
-run the following command to start training.
-```
-CUDA_VISIBLE_DEVICES=0 python train.py --train_dir train_data/ --use_cuda 1
-```
-CPU environment:
-run the following command to start training.
-```
-python train.py --train_dir train_data/
-```
-
-## Customizing the network structure
-
-The network structure can be adjusted in the `network` function of [net.py](./net.py); the current structure is shown below:
-```python
-emb = fluid.layers.embedding(
- input=src,
- size=[vocab_size, hid_size],
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=init_low_bound, high=init_high_bound),
- learning_rate=emb_lr_x),
- is_sparse=True)
-
-fc0 = fluid.layers.fc(input=emb,
- size=hid_size * 3,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=init_low_bound, high=init_high_bound),
- learning_rate=gru_lr_x))
-gru_h0 = fluid.layers.dynamic_gru(
- input=fc0,
- size=hid_size,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=init_low_bound, high=init_high_bound),
- learning_rate=gru_lr_x))
-
-fc = fluid.layers.fc(input=gru_h0,
- size=vocab_size,
- act='softmax',
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=init_low_bound, high=init_high_bound),
- learning_rate=fc_lr_x))
-
-cost = fluid.layers.cross_entropy(input=fc, label=dst)
-acc = fluid.layers.accuracy(input=fc, label=dst, k=20)
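-# note: accuracy with k=20 is the per-batch recall@20 that infer.py reports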
-```
-
-## Sample training results
-
-Training logs on a single Tesla K40m GPU look like the following:
-```text
-epoch_1 start
-step:100 ppl:441.468
-step:200 ppl:311.043
-step:300 ppl:218.952
-step:400 ppl:186.172
-step:500 ppl:188.600
-step:600 ppl:131.213
-step:700 ppl:165.770
-step:800 ppl:164.414
-step:900 ppl:156.470
-step:1000 ppl:174.201
-step:1100 ppl:118.619
-step:1200 ppl:122.635
-step:1300 ppl:118.220
-step:1400 ppl:90.372
-step:1500 ppl:135.018
-step:1600 ppl:114.327
-step:1700 ppl:141.806
-step:1800 ppl:93.416
-step:1900 ppl:92.897
-step:2000 ppl:121.703
-step:2100 ppl:96.288
-step:2200 ppl:88.355
-step:2300 ppl:101.737
-step:2400 ppl:95.934
-step:2500 ppl:86.158
-step:2600 ppl:80.925
-step:2700 ppl:202.219
-step:2800 ppl:106.828
-step:2900 ppl:91.458
-step:3000 ppl:105.988
-step:3100 ppl:87.067
-step:3200 ppl:92.651
-step:3300 ppl:101.145
-step:3400 ppl:91.247
-step:3500 ppl:107.656
-step:3600 ppl:89.410
-...
-...
-step:15700 ppl:76.819
-step:15800 ppl:62.257
-step:15900 ppl:81.735
-epoch:1 num_steps:15907 time_cost(s):4154.096032
-model saved in model_recall20/epoch_1
-...
-```
-
-## Inference
-Run the following command to start inference.
-
-```
-CUDA_VISIBLE_DEVICES=0 python infer.py --test_dir test_data/ --model_dir model_recall20/ --start_index 1 --last_index 10 --use_cuda 1
-```
-
-## Sample inference results
-```text
-model:model_r@20/epoch_1 recall@20:0.613 time_cost(s):12.23
-model:model_r@20/epoch_2 recall@20:0.647 time_cost(s):12.33
-model:model_r@20/epoch_3 recall@20:0.662 time_cost(s):12.38
-model:model_r@20/epoch_4 recall@20:0.669 time_cost(s):12.21
-model:model_r@20/epoch_5 recall@20:0.673 time_cost(s):12.17
-model:model_r@20/epoch_6 recall@20:0.675 time_cost(s):12.26
-model:model_r@20/epoch_7 recall@20:0.677 time_cost(s):12.25
-model:model_r@20/epoch_8 recall@20:0.679 time_cost(s):12.37
-model:model_r@20/epoch_9 recall@20:0.680 time_cost(s):12.22
-model:model_r@20/epoch_10 recall@20:0.681 time_cost(s):12.2
-```
-
-
-## Distributed training
-Internal users can refer to the [wiki](http://wiki.baidu.com/pages/viewpage.action?pageId=628300529) to set up a distributed environment with paddlecloud.
-
-See cluster_train.py for configuring other distributed environments.
-
-Run the following command to simulate the distributed setup locally:
-```
-sh cluster_train.sh
-```
-
-Note that any local proxy must be disabled for this simulation.
+Hi! This project has been migrated. Please visit [PaddleRec/gru4rec](../../../PaddleRec/gru4rec) to browse it.
diff --git a/fluid/PaddleRec/gru4rec/cluster_train.py b/fluid/PaddleRec/gru4rec/cluster_train.py
deleted file mode 100644
index b9b0820d0293700f379abf135e8d149311a7e1a1..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/gru4rec/cluster_train.py
+++ /dev/null
@@ -1,140 +0,0 @@
-import os
-import sys
-import time
-import six
-import numpy as np
-import math
-import argparse
-import paddle.fluid as fluid
-import paddle
-import utils
-import net
-
-SEED = 102
-
-def parse_args():
- parser = argparse.ArgumentParser("gru4rec benchmark.")
- parser.add_argument(
- '--train_dir', type=str, default='train_data', help='train file address')
- parser.add_argument(
- '--vocab_path', type=str, default='vocab.txt', help='vocab file address')
- parser.add_argument(
- '--is_local', type=int, default=1, help='whether local')
- parser.add_argument(
- '--hid_size', type=int, default=100, help='hid size')
- parser.add_argument(
- '--model_dir', type=str, default='model_recall20', help='model dir')
- parser.add_argument(
- '--batch_size', type=int, default=5, help='num of batch size')
- parser.add_argument(
- '--pass_num', type=int, default=10, help='num of epoch')
- parser.add_argument(
- '--print_batch', type=int, default=10, help='num of print batch')
- parser.add_argument(
- '--use_cuda', type=int, default=0, help='whether use gpu')
- parser.add_argument(
- '--base_lr', type=float, default=0.01, help='learning rate')
- parser.add_argument(
- '--num_devices', type=int, default=1, help='Number of GPU devices')
- parser.add_argument(
- '--role', type=str, default='pserver', help='trainer or pserver')
- parser.add_argument(
- '--endpoints', type=str, default='127.0.0.1:6000', help='The pserver endpoints, like: 127.0.0.1:6000, 127.0.0.1:6001')
- parser.add_argument(
- '--current_endpoint', type=str, default='127.0.0.1:6000', help='The current_endpoint')
- parser.add_argument(
-        '--trainer_id', type=int, default=0, help='trainer id; only trainer_id=0 saves the model')
- parser.add_argument(
-        '--trainers', type=int, default=1, help='The number of trainers (default: 1)')
- args = parser.parse_args()
- return args
-
-def get_cards(args):
- return args.num_devices
-
-def train():
- """ do training """
- args = parse_args()
- hid_size = args.hid_size
- train_dir = args.train_dir
- vocab_path = args.vocab_path
- use_cuda = True if args.use_cuda else False
- print("use_cuda:", use_cuda)
- batch_size = args.batch_size
- vocab_size, train_reader = utils.prepare_data(
- train_dir, vocab_path, batch_size=batch_size * get_cards(args),\
- buffer_size=1000, word_freq_threshold=0, is_train=True)
-
- # Train program
- src_wordseq, dst_wordseq, avg_cost, acc = net.network(vocab_size=vocab_size, hid_size=hid_size)
-
-    # Optimization to minimize the loss
- sgd_optimizer = fluid.optimizer.SGD(learning_rate=args.base_lr)
- sgd_optimizer.minimize(avg_cost)
-
- def train_loop(main_program):
- """ train network """
- pass_num = args.pass_num
- model_dir = args.model_dir
- fetch_list = [avg_cost.name]
-
- place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
- total_time = 0.0
- for pass_idx in six.moves.xrange(pass_num):
- epoch_idx = pass_idx + 1
- print("epoch_%d start" % epoch_idx)
-
- t0 = time.time()
- i = 0
- newest_ppl = 0
- for data in train_reader():
- i += 1
- lod_src_wordseq = utils.to_lodtensor([dat[0] for dat in data],
- place)
- lod_dst_wordseq = utils.to_lodtensor([dat[1] for dat in data],
- place)
- ret_avg_cost = exe.run(main_program,
- feed={ "src_wordseq": lod_src_wordseq,
- "dst_wordseq": lod_dst_wordseq},
- fetch_list=fetch_list)
- avg_ppl = np.exp(ret_avg_cost[0])
- newest_ppl = np.mean(avg_ppl)
- if i % args.print_batch == 0:
- print("step:%d ppl:%.3f" % (i, newest_ppl))
-
- t1 = time.time()
- total_time += t1 - t0
- print("epoch:%d num_steps:%d time_cost(s):%f" %
- (epoch_idx, i, total_time / epoch_idx))
- save_dir = "%s/epoch_%d" % (model_dir, epoch_idx)
- feed_var_names = ["src_wordseq", "dst_wordseq"]
- fetch_vars = [avg_cost, acc]
- if args.trainer_id == 0:
- fluid.io.save_inference_model(save_dir, feed_var_names, fetch_vars, exe)
- print("model saved in %s" % save_dir)
- print("finish training")
-
- if args.is_local:
- print("run local training")
- train_loop(fluid.default_main_program())
- else:
- print("run distribute training")
- t = fluid.DistributeTranspiler()
- t.transpile(args.trainer_id, pservers=args.endpoints, trainers=args.trainers)
- if args.role == "pserver":
- print("run psever")
- pserver_prog = t.get_pserver_program(args.current_endpoint)
- pserver_startup = t.get_startup_program(args.current_endpoint,
- pserver_prog)
- exe = fluid.Executor(fluid.CPUPlace())
- exe.run(pserver_startup)
- exe.run(pserver_prog)
- elif args.role == "trainer":
- print("run trainer")
- train_loop(t.get_trainer_program())
-
-if __name__ == "__main__":
- train()
diff --git a/fluid/PaddleRec/gru4rec/infer.py b/fluid/PaddleRec/gru4rec/infer.py
deleted file mode 100644
index ccb2ce097f10798b9e68c623a921b3e5bebf9af3..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/gru4rec/infer.py
+++ /dev/null
@@ -1,83 +0,0 @@
-import argparse
-import sys
-import time
-import math
-import unittest
-import contextlib
-import numpy as np
-import six
-import paddle.fluid as fluid
-import paddle
-
-import utils
-
-def parse_args():
- parser = argparse.ArgumentParser("gru4rec benchmark.")
- parser.add_argument(
- '--test_dir', type=str, default='test_data', help='test file address')
-    # defaults must be real ints: argparse applies `type` only to command-line strings
-    parser.add_argument(
-        '--start_index', type=int, default=1, help='start index')
-    parser.add_argument(
-        '--last_index', type=int, default=10, help='end index')
-    parser.add_argument(
-        '--model_dir', type=str, default='model_recall20', help='model dir')
-    parser.add_argument(
-        '--use_cuda', type=int, default=1, help='whether use cuda')
-    parser.add_argument(
-        '--batch_size', type=int, default=5, help='batch size')
- args = parser.parse_args()
- return args
-
-def infer(test_reader, use_cuda, model_path):
- """ inference function """
- place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
- exe = fluid.Executor(place)
-
- with fluid.scope_guard(fluid.core.Scope()):
- infer_program, feed_target_names, fetch_vars = fluid.io.load_inference_model(
- model_path, exe)
- accum_num_recall = 0.0
- accum_num_sum = 0.0
- t0 = time.time()
- step_id = 0
- for data in test_reader():
- step_id += 1
- src_wordseq = utils.to_lodtensor([dat[0] for dat in data], place)
- label_data = [dat[1] for dat in data]
- dst_wordseq = utils.to_lodtensor(label_data, place)
- para = exe.run(
- infer_program,
- feed={"src_wordseq": src_wordseq,
- "dst_wordseq": dst_wordseq},
- fetch_list=fetch_vars,
- return_numpy=False)
-
- acc_ = para[1]._get_float_element(0)
- data_length = len(
- np.concatenate(
- label_data, axis=0).astype("int64"))
- accum_num_sum += (data_length)
- accum_num_recall += (data_length * acc_)
- if step_id % 1 == 0:
- print("step:%d " % (step_id), accum_num_recall / accum_num_sum)
- t1 = time.time()
- print("model:%s recall@20:%.3f time_cost(s):%.2f" %
- (model_path, accum_num_recall / accum_num_sum, t1 - t0))
-
-
-if __name__ == "__main__":
- args = parse_args()
- start_index = args.start_index
- last_index = args.last_index
- test_dir = args.test_dir
- model_dir = args.model_dir
- batch_size = args.batch_size
- use_cuda = True if args.use_cuda else False
- print("start index: ", start_index, " last_index:" ,last_index)
- vocab_size, test_reader = utils.prepare_data(
- test_dir, "", batch_size=batch_size,
- buffer_size=1000, word_freq_threshold=0, is_train=False)
-
- for epoch in xrange(start_index, last_index + 1):
- epoch_path = model_dir + "/epoch_" + str(epoch)
- infer(test_reader=test_reader, use_cuda=use_cuda, model_path=epoch_path)
diff --git a/fluid/PaddleRec/gru4rec/net.py b/fluid/PaddleRec/gru4rec/net.py
deleted file mode 100644
index fea2b3e980f2b83d62a05b8ec3267a7e01066ecb..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/gru4rec/net.py
+++ /dev/null
@@ -1,50 +0,0 @@
-import paddle.fluid as fluid
-
-def network(vocab_size,
- hid_size=100,
- init_low_bound=-0.04,
- init_high_bound=0.04):
- """ network definition """
- emb_lr_x = 10.0
- gru_lr_x = 1.0
- fc_lr_x = 1.0
- # Input data
- src_wordseq = fluid.layers.data(
- name="src_wordseq", shape=[1], dtype="int64", lod_level=1)
- dst_wordseq = fluid.layers.data(
- name="dst_wordseq", shape=[1], dtype="int64", lod_level=1)
-
- emb = fluid.layers.embedding(
- input=src_wordseq,
- size=[vocab_size, hid_size],
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=init_low_bound, high=init_high_bound),
- learning_rate=emb_lr_x),
- is_sparse=True)
- fc0 = fluid.layers.fc(input=emb,
- size=hid_size * 3,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=init_low_bound, high=init_high_bound),
- learning_rate=gru_lr_x))
- gru_h0 = fluid.layers.dynamic_gru(
- input=fc0,
- size=hid_size,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=init_low_bound, high=init_high_bound),
- learning_rate=gru_lr_x))
-
- fc = fluid.layers.fc(input=gru_h0,
- size=vocab_size,
- act='softmax',
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=init_low_bound, high=init_high_bound),
- learning_rate=fc_lr_x))
-
- cost = fluid.layers.cross_entropy(input=fc, label=dst_wordseq)
- acc = fluid.layers.accuracy(input=fc, label=dst_wordseq, k=20)
- avg_cost = fluid.layers.mean(x=cost)
- return src_wordseq, dst_wordseq, avg_cost, acc
diff --git a/fluid/PaddleRec/gru4rec/train.py b/fluid/PaddleRec/gru4rec/train.py
deleted file mode 100644
index 2b889a44ae676b7cf334ad0901436c8694d1e237..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/gru4rec/train.py
+++ /dev/null
@@ -1,122 +0,0 @@
-import os
-import sys
-import time
-import six
-import numpy as np
-import math
-import argparse
-import paddle.fluid as fluid
-import paddle
-import utils
-import net
-
-SEED = 102
-
-
-def parse_args():
- parser = argparse.ArgumentParser("gru4rec benchmark.")
- parser.add_argument(
- '--train_dir', type=str, default='train_data', help='train file address')
- parser.add_argument(
- '--vocab_path', type=str, default='vocab.txt', help='vocab file address')
- parser.add_argument(
- '--is_local', type=int, default=1, help='whether local')
- parser.add_argument(
- '--hid_size', type=int, default=100, help='hid size')
- parser.add_argument(
- '--model_dir', type=str, default='model_recall20', help='model dir')
- parser.add_argument(
- '--batch_size', type=int, default=5, help='num of batch size')
- parser.add_argument(
- '--print_batch', type=int, default=10, help='num of print batch')
- parser.add_argument(
- '--pass_num', type=int, default=10, help='num of epoch')
- parser.add_argument(
- '--use_cuda', type=int, default=0, help='whether use gpu')
- parser.add_argument(
- '--parallel', type=int, default=0, help='whether parallel')
- parser.add_argument(
- '--base_lr', type=float, default=0.01, help='learning rate')
- parser.add_argument(
- '--num_devices', type=int, default=1, help='Number of GPU devices')
- args = parser.parse_args()
- return args
-
-def get_cards(args):
- return args.num_devices
-
-def train():
- """ do training """
- args = parse_args()
- hid_size = args.hid_size
- train_dir = args.train_dir
- vocab_path = args.vocab_path
- use_cuda = True if args.use_cuda else False
- parallel = True if args.parallel else False
- print("use_cuda:", use_cuda, "parallel:", parallel)
- batch_size = args.batch_size
- vocab_size, train_reader = utils.prepare_data(
- train_dir, vocab_path, batch_size=batch_size * get_cards(args),\
- buffer_size=1000, word_freq_threshold=0, is_train=True)
-
- # Train program
- src_wordseq, dst_wordseq, avg_cost, acc = net.network(vocab_size=vocab_size, hid_size=hid_size)
-
-    # Optimization to minimize the loss
-    optimizer = fluid.optimizer.Adagrad(learning_rate=args.base_lr)
-    optimizer.minimize(avg_cost)
-
- # Initialize executor
- place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
- if parallel:
- train_exe = fluid.ParallelExecutor(
- use_cuda=use_cuda,
- loss_name=avg_cost.name)
- else:
- train_exe = exe
-
- pass_num = args.pass_num
- model_dir = args.model_dir
- fetch_list = [avg_cost.name]
-
- total_time = 0.0
- for pass_idx in six.moves.xrange(pass_num):
- epoch_idx = pass_idx + 1
- print("epoch_%d start" % epoch_idx)
-
- t0 = time.time()
- i = 0
- newest_ppl = 0
- for data in train_reader():
- i += 1
- lod_src_wordseq = utils.to_lodtensor([dat[0] for dat in data],
- place)
- lod_dst_wordseq = utils.to_lodtensor([dat[1] for dat in data],
- place)
- ret_avg_cost = train_exe.run(feed={
- "src_wordseq": lod_src_wordseq,
- "dst_wordseq": lod_dst_wordseq},
- fetch_list=fetch_list)
- avg_ppl = np.exp(ret_avg_cost[0])
- newest_ppl = np.mean(avg_ppl)
- if i % args.print_batch == 0:
- print("step:%d ppl:%.3f" % (i, newest_ppl))
-
- t1 = time.time()
- total_time += t1 - t0
- print("epoch:%d num_steps:%d time_cost(s):%f" %
- (epoch_idx, i, total_time / epoch_idx))
- save_dir = "%s/epoch_%d" % (model_dir, epoch_idx)
- feed_var_names = ["src_wordseq", "dst_wordseq"]
- fetch_vars = [avg_cost, acc]
- fluid.io.save_inference_model(save_dir, feed_var_names, fetch_vars, exe)
- print("model saved in %s" % save_dir)
- #exe.close()
- print("finish training")
-
-
-if __name__ == "__main__":
- train()
diff --git a/fluid/PaddleRec/gru4rec/utils.py b/fluid/PaddleRec/gru4rec/utils.py
deleted file mode 100644
index 5dec9b750eccfb014ad0e35f38990930c6d3d824..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/gru4rec/utils.py
+++ /dev/null
@@ -1,127 +0,0 @@
-import sys
-import collections
-import six
-import time
-import numpy as np
-import paddle.fluid as fluid
-import paddle
-import os
-
-def to_lodtensor(data, place):
- """ convert to LODtensor """
- seq_lens = [len(seq) for seq in data]
- cur_len = 0
- lod = [cur_len]
- for l in seq_lens:
- cur_len += l
- lod.append(cur_len)
- flattened_data = np.concatenate(data, axis=0).astype("int64")
- flattened_data = flattened_data.reshape([len(flattened_data), 1])
- res = fluid.LoDTensor()
- res.set(flattened_data, place)
- res.set_lod([lod])
- return res
-
-def get_vocab_size(vocab_path):
- with open(vocab_path, "r") as rf:
- line = rf.readline()
- return int(line.strip())
-
-def prepare_data(file_dir,
- vocab_path,
- batch_size,
- buffer_size=1000,
- word_freq_threshold=0,
- is_train=True):
- """ prepare the English Pann Treebank (PTB) data """
- print("start constuct word dict")
- if is_train:
- vocab_size = get_vocab_size(vocab_path)
- reader = sort_batch(
- paddle.reader.shuffle(
- train(
- file_dir, buffer_size, data_type=DataType.SEQ),
- buf_size=buffer_size),
- batch_size,
- batch_size * 20)
- else:
- reader = sort_batch(
- test(
- file_dir, buffer_size, data_type=DataType.SEQ),
- batch_size,
- batch_size * 20)
- vocab_size = 0
- return vocab_size, reader
-
-
-def sort_batch(reader, batch_size, sort_group_size, drop_last=False):
- """
- Create a batched reader.
- :param reader: the data reader to read from.
- :type reader: callable
- :param batch_size: size of each mini-batch
- :type batch_size: int
- :param sort_group_size: size of partial sorted batch
- :type sort_group_size: int
- :param drop_last: drop the last batch, if the size of last batch is not equal to batch_size.
- :type drop_last: bool
- :return: the batched reader.
- :rtype: callable
- """
-
- def batch_reader():
- r = reader()
- b = []
- for instance in r:
- b.append(instance)
- if len(b) == sort_group_size:
- sortl = sorted(b, key=lambda x: len(x[0]), reverse=True)
- b = []
- c = []
- for sort_i in sortl:
- c.append(sort_i)
- if (len(c) == batch_size):
- yield c
- c = []
-        if not drop_last and len(b) != 0:
-            sortl = sorted(b, key=lambda x: len(x[0]), reverse=True)
-            c = []
-            for sort_i in sortl:
-                c.append(sort_i)
-                if len(c) == batch_size:
-                    yield c
-                    c = []
-            # yield the final partial batch instead of silently dropping it
-            if len(c) > 0:
-                yield c
-
- # Batch size check
- batch_size = int(batch_size)
- if batch_size <= 0:
-        raise ValueError("batch_size should be a positive integer value, "
-                         "but got batch_size={}".format(batch_size))
- return batch_reader
-
-
-class DataType(object):
- SEQ = 2
-
-def reader_creator(file_dir, n, data_type):
- def reader():
- files = os.listdir(file_dir)
- for fi in files:
- with open(file_dir + '/' + fi, "r") as f:
- for l in f:
- if DataType.SEQ == data_type:
- l = l.strip().split()
- l = [w for w in l]
- src_seq = l[:len(l) - 1]
- trg_seq = l[1:]
- if n > 0 and len(src_seq) > n: continue
- yield src_seq, trg_seq
- else:
- assert False, 'error data type'
- return reader
-
-def train(train_dir, n, data_type=DataType.SEQ):
- return reader_creator(train_dir, n, data_type)
-
-def test(test_dir, n, data_type=DataType.SEQ):
- return reader_creator(test_dir, n, data_type)
diff --git a/fluid/PaddleRec/multiview_simnet/README.cn.md b/fluid/PaddleRec/multiview_simnet/README.cn.md
index 06df3c32c7996f5003bd7b9c1eb749f32c28b752..9cf8e27bba4775800498c25b550f7bb19479f074 100644
--- a/fluid/PaddleRec/multiview_simnet/README.cn.md
+++ b/fluid/PaddleRec/multiview_simnet/README.cn.md
@@ -1,27 +1,2 @@
-# Multi-view Simnet for Personalized Recommendation
-## Introduction
-In personalized recommendation scenarios, the item list presented to a user is usually produced by a personalized matching model. In the real world, a user has features from many views, such as the user id, age, and item click history. An item, for example a news article, likewise has multi-view features such as its title and category. The multi-view Simnet model fuses the multi-view features of users and items into one unified model for personalized matching. Models of this kind are widely used in industrial settings, for example in Baidu's Feed products.
-
-## Dataset
-Currently, this project uses a machine-generated dataset to illustrate the multi-view Simnet model; real-world datasets will be added over time to validate its effectiveness.
-
-## Model
-This project provides a Paddle-based model for the personalized matching scenario. The multi-view Simnet model consists of several encoder modules, one for each feature view. Bag-of-Embedding, Temporal-Convolutional, and Gated-Recurrent-Unit encoders are currently provided, and more encoders that are practical for sparse features will be added gradually. Training currently uses pairwise ranking: for each related user-item pair, a random item is used as the negative example for ranking.
-
-## Training
-The command line options for training can be listed with `python train.py -h`:
-```bash
-python train.py
-```
-## Inference
-The command line options for inference can be listed with `python infer.py -h`:
-```bash
-python infer.py
-```
-## Future work
-- Several pairwise loss functions will be added. The user-item matching relations for different feature views can then be jointly optimized with different loss functions, and the full model will be validated on real-world data.
-- A Parallel Executor option will be added
-- Distributed training support will be added
+Hi! This project has been migrated. Please visit [PaddleRec/multiview_simnet](../../../PaddleRec/multiview_simnet) to browse it.
diff --git a/fluid/PaddleRec/multiview_simnet/README.md b/fluid/PaddleRec/multiview_simnet/README.md
index 525946e612592b97e10707cadf35e5252230c2bd..8fba8e606256ad7ad65ec429b68e967809bc6a51 100644
--- a/fluid/PaddleRec/multiview_simnet/README.md
+++ b/fluid/PaddleRec/multiview_simnet/README.md
@@ -1,27 +1,6 @@
-# Multi-view Simnet for Personalized recommendation
-## Introduction
-In personalized recommendation scenarios, a user is often provided with several items by a personalized interest matching model. In real world applications, a user may have multiple views of features, say user-id, age, click history of items, and search queries. An item, e.g. a news article, may also have multiple views of features like its title, category, images and so on. Multi-view Simnet is a matching model that combines users' and items' multiple views of features into one unified model. The model can be used in many industrial products like Baidu's feed news. The model is adapted from the paper A Multi-View Deep Learning (MV-DNN) Approach for Cross Domain User Modeling in Recommendation Systems, WWW 2015. The difference between our model and MV-DNN is that we also consider multiple feature views of users.
+Hi!
-## Dataset
-Currently, a synthetic dataset is provided for proof of concept, and we aim to add more real-world datasets to this project in the future.
+This directory has been deprecated.
-## Model
-This project aims to provide practical usage of Paddle in a personalized matching scenario. The model provides several encoder modules for different views of features. Currently, Bag-of-Embedding, Temporal-Convolutional, and Gated-Recurrent-Unit encoders are provided. We will add more practical encoders for the sparse features commonly used in recommender systems. The training algorithm used in this model is pairwise ranking: given a positive user-item pair, a negative item with multiple views is sampled for ranking.
-
-## Train
-The command line options for training can be listed by `python train.py -h`
-```bash
-python train.py
-```
-
-## Infer
-The command line options for inference can be listed by `python infer.py -h`
-```bash
-python infer.py
-```
-
-## Future work
-- Multiple types of pairwise loss will be added to this project. For different views of features between a user and an item, multiple losses will be supported. The model will be verified on real-world datasets.
-- Parallel Executor support will be added to this project
-- Distributed training will be added
+Please visit the project at [PaddleRec/multiview_simnet](../../../PaddleRec/multiview_simnet).
diff --git a/fluid/PaddleRec/multiview_simnet/nets.py b/fluid/PaddleRec/multiview_simnet/nets.py
deleted file mode 100644
index 41e366f55c80c5151102ed5e81a2746774fb3b4b..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/multiview_simnet/nets.py
+++ /dev/null
@@ -1,239 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import paddle.fluid as fluid
-import paddle.fluid.layers.nn as nn
-import paddle.fluid.layers.tensor as tensor
-import paddle.fluid.layers.control_flow as cf
-import paddle.fluid.layers.io as io
-
-
-class BowEncoder(object):
- """ bow-encoder """
-
- def __init__(self):
- self.param_name = ""
-
- def forward(self, emb):
- return nn.sequence_pool(input=emb, pool_type='sum')
-
-
-class CNNEncoder(object):
- """ cnn-encoder"""
-
- def __init__(self,
- param_name="cnn.w",
- win_size=3,
- ksize=128,
- act='tanh',
- pool_type='max'):
- self.param_name = param_name
- self.win_size = win_size
- self.ksize = ksize
- self.act = act
- self.pool_type = pool_type
-
- def forward(self, emb):
- return fluid.nets.sequence_conv_pool(
- input=emb,
- num_filters=self.ksize,
- filter_size=self.win_size,
- act=self.act,
- pool_type=self.pool_type,
- param_attr=str(self.param_name))
-
-
-class GrnnEncoder(object):
- """ grnn-encoder """
-
- def __init__(self, param_name="grnn.w", hidden_size=128):
- self.param_name = param_name
- self.hidden_size = hidden_size
-
- def forward(self, emb):
-        fc0 = nn.fc(
-            input=emb,
-            size=self.hidden_size * 3,
-            param_attr=str(self.param_name) + "_fc")
- gru_h = nn.dynamic_gru(
- input=fc0,
- size=self.hidden_size,
- is_reverse=False,
- param_attr=str(self.param_name))
- return nn.sequence_pool(input=gru_h, pool_type='max')
-
-
-# this is a very simple encoder factory;
-# most default argument values are used
-
-
-class SimpleEncoderFactory(object):
- def __init__(self):
- pass
-
-    def create(self, enc_type, enc_hid_size):
-        """ create an encoder of the requested type """
- if enc_type == "bow":
- bow_encode = BowEncoder()
- return bow_encode
- elif enc_type == "cnn":
- cnn_encode = CNNEncoder(ksize=enc_hid_size)
- return cnn_encode
- elif enc_type == "gru":
- rnn_encode = GrnnEncoder(hidden_size=enc_hid_size)
- return rnn_encode
-
-
-class MultiviewSimnet(object):
- """ multi-view simnet """
-
- def __init__(self, embedding_size, embedding_dim, hidden_size):
- self.embedding_size = embedding_size
- self.embedding_dim = embedding_dim
- self.emb_shape = [self.embedding_size, self.embedding_dim]
- self.hidden_size = hidden_size
- self.margin = 0.1
-
- def set_query_encoder(self, encoders):
- self.query_encoders = encoders
-
- def set_title_encoder(self, encoders):
- self.title_encoders = encoders
-
- def get_correct(self, x, y):
- less = tensor.cast(cf.less_than(x, y), dtype='float32')
- correct = nn.reduce_sum(less)
- return correct
-
- def train_net(self):
- # input fields for query, pos_title, neg_title
- q_slots = [
- io.data(
- name="q%d" % i, shape=[1], lod_level=1, dtype='int64')
- for i in range(len(self.query_encoders))
- ]
- pt_slots = [
- io.data(
- name="pt%d" % i, shape=[1], lod_level=1, dtype='int64')
- for i in range(len(self.title_encoders))
- ]
- nt_slots = [
- io.data(
- name="nt%d" % i, shape=[1], lod_level=1, dtype='int64')
- for i in range(len(self.title_encoders))
- ]
-
- # lookup embedding for each slot
- q_embs = [
- nn.embedding(
- input=query, size=self.emb_shape, param_attr="emb.w")
- for query in q_slots
- ]
- pt_embs = [
- nn.embedding(
- input=title, size=self.emb_shape, param_attr="emb.w")
- for title in pt_slots
- ]
- nt_embs = [
- nn.embedding(
- input=title, size=self.emb_shape, param_attr="emb.w")
- for title in nt_slots
- ]
-
- # encode each embedding field with encoder
- q_encodes = [
- self.query_encoders[i].forward(emb) for i, emb in enumerate(q_embs)
- ]
- pt_encodes = [
- self.title_encoders[i].forward(emb) for i, emb in enumerate(pt_embs)
- ]
- nt_encodes = [
- self.title_encoders[i].forward(emb) for i, emb in enumerate(nt_embs)
- ]
-
- # concat multi view for query, pos_title, neg_title
- q_concat = nn.concat(q_encodes)
- pt_concat = nn.concat(pt_encodes)
- nt_concat = nn.concat(nt_encodes)
-
- # projection of hidden layer
- q_hid = nn.fc(q_concat, size=self.hidden_size, param_attr='q_fc.w')
- pt_hid = nn.fc(pt_concat, size=self.hidden_size, param_attr='t_fc.w')
- nt_hid = nn.fc(nt_concat, size=self.hidden_size, param_attr='t_fc.w')
-
- # cosine of hidden layers
- cos_pos = nn.cos_sim(q_hid, pt_hid)
- cos_neg = nn.cos_sim(q_hid, nt_hid)
-
- # pairwise hinge_loss
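-        # i.e. loss = max(0, margin - cos_pos + cos_neg), assembled from the elementwise ops below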
- loss_part1 = nn.elementwise_sub(
- tensor.fill_constant_batch_size_like(
- input=cos_pos,
- shape=[-1, 1],
- value=self.margin,
- dtype='float32'),
- cos_pos)
-
- loss_part2 = nn.elementwise_add(loss_part1, cos_neg)
-
- loss_part3 = nn.elementwise_max(
- tensor.fill_constant_batch_size_like(
- input=loss_part2, shape=[-1, 1], value=0.0, dtype='float32'),
- loss_part2)
-
- avg_cost = nn.mean(loss_part3)
- correct = self.get_correct(cos_neg, cos_pos)
-
- return q_slots + pt_slots + nt_slots, avg_cost, correct
-
- def pred_net(self, query_fields, pos_title_fields, neg_title_fields):
- q_slots = [
- io.data(
- name="q%d" % i, shape=[1], lod_level=1, dtype='int64')
- for i in range(len(self.query_encoders))
- ]
- pt_slots = [
- io.data(
- name="pt%d" % i, shape=[1], lod_level=1, dtype='int64')
- for i in range(len(self.title_encoders))
- ]
- # lookup embedding for each slot
- q_embs = [
- nn.embedding(
- input=query, size=self.emb_shape, param_attr="emb.w")
- for query in q_slots
- ]
- pt_embs = [
- nn.embedding(
- input=title, size=self.emb_shape, param_attr="emb.w")
- for title in pt_slots
- ]
- # encode each embedding field with encoder
-        q_encodes = [
-            self.query_encoders[i].forward(emb) for i, emb in enumerate(q_embs)
-        ]
- pt_encodes = [
- self.title_encoders[i].forward(emb) for i, emb in enumerate(pt_embs)
- ]
- # concat multi view for query, pos_title, neg_title
- q_concat = nn.concat(q_encodes)
- pt_concat = nn.concat(pt_encodes)
- # projection of hidden layer
- q_hid = nn.fc(q_concat, size=self.hidden_size, param_attr='q_fc.w')
- pt_hid = nn.fc(pt_concat, size=self.hidden_size, param_attr='t_fc.w')
- # cosine of hidden layers
- cos = nn.cos_sim(q_hid, pt_hid)
- return cos
diff --git a/fluid/PaddleRec/multiview_simnet/train.py b/fluid/PaddleRec/multiview_simnet/train.py
deleted file mode 100644
index b4a566d39333d871d30e5996c45d7ea9ef7b1531..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/multiview_simnet/train.py
+++ /dev/null
@@ -1,137 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import os
-import sys
-import time
-import six
-import numpy as np
-import math
-import argparse
-import logging
-import paddle.fluid as fluid
-import paddle
-import time
-import reader as reader
-from nets import MultiviewSimnet, SimpleEncoderFactory
-
-logging.basicConfig(format="%(asctime)s - %(levelname)s - %(message)s")
-logger = logging.getLogger("fluid")
-logger.setLevel(logging.INFO)
-
-
-def parse_args():
- parser = argparse.ArgumentParser("multi-view simnet")
- parser.add_argument("--train_file", type=str, help="Training file")
- parser.add_argument("--valid_file", type=str, help="Validation file")
- parser.add_argument(
- "--epochs", type=int, default=10, help="Number of epochs for training")
- parser.add_argument(
- "--model_output_dir",
- type=str,
- default='model_output',
- help="Model output folder")
- parser.add_argument(
- "--query_slots", type=int, default=1, help="Number of query slots")
- parser.add_argument(
- "--title_slots", type=int, default=1, help="Number of title slots")
- parser.add_argument(
- "--query_encoder",
- type=str,
- default="bow",
- help="Encoder module for slot encoding")
- parser.add_argument(
- "--title_encoder",
- type=str,
- default="bow",
- help="Encoder module for slot encoding")
- parser.add_argument(
- "--query_encode_dim",
- type=int,
- default=128,
- help="Dimension of query encoder output")
- parser.add_argument(
- "--title_encode_dim",
- type=int,
- default=128,
- help="Dimension of title encoder output")
- parser.add_argument(
- "--batch_size", type=int, default=128, help="Batch size for training")
- parser.add_argument(
- "--embedding_dim",
- type=int,
- default=128,
- help="Default Dimension of Embedding")
- parser.add_argument(
- "--sparse_feature_dim",
- type=int,
-        default=1000001,
-        help="Sparse feature hashing space "
-        "for index processing")
- parser.add_argument(
- "--hidden_size", type=int, default=128, help="Hidden dim")
- return parser.parse_args()
-
-
-def start_train(args):
- dataset = reader.SyntheticDataset(args.sparse_feature_dim, args.query_slots,
- args.title_slots)
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- dataset.train(), buf_size=args.batch_size * 100),
- batch_size=args.batch_size)
- place = fluid.CPUPlace()
- factory = SimpleEncoderFactory()
- query_encoders = [
- factory.create(args.query_encoder, args.query_encode_dim)
- for i in range(args.query_slots)
- ]
- title_encoders = [
- factory.create(args.title_encoder, args.title_encode_dim)
- for i in range(args.title_slots)
- ]
- m_simnet = MultiviewSimnet(args.sparse_feature_dim, args.embedding_dim,
- args.hidden_size)
- m_simnet.set_query_encoder(query_encoders)
- m_simnet.set_title_encoder(title_encoders)
- all_slots, avg_cost, correct = m_simnet.train_net()
- optimizer = fluid.optimizer.Adam(learning_rate=1e-4)
- optimizer.minimize(avg_cost)
- startup_program = fluid.default_startup_program()
- loop_program = fluid.default_main_program()
-
- feeder = fluid.DataFeeder(feed_list=all_slots, place=place)
- exe = fluid.Executor(place)
- exe.run(startup_program)
-
- for pass_id in range(args.epochs):
- for batch_id, data in enumerate(train_reader()):
- loss_val, correct_val = exe.run(loop_program,
- feed=feeder.feed(data),
- fetch_list=[avg_cost, correct])
- logger.info("TRAIN --> pass: {} batch_id: {} avg_cost: {}, acc: {}"
- .format(pass_id, batch_id, loss_val,
- float(correct_val) / args.batch_size))
- fluid.io.save_inference_model(args.model_output_dir,
- [val.name for val in all_slots],
- [avg_cost, correct], exe)
-
-
-def main():
- args = parse_args()
- start_train(args)
-
-
-if __name__ == "__main__":
- main()
diff --git a/fluid/PaddleRec/ssr/README.md b/fluid/PaddleRec/ssr/README.md
index 034be994d9000591c59ca08feda54d4a39d147af..15111907ccc21942c134a2a614ad341c37710272 100644
--- a/fluid/PaddleRec/ssr/README.md
+++ b/fluid/PaddleRec/ssr/README.md
@@ -1,33 +1,2 @@
-# Sequence Semantic Retrieval Model
-## Introduction
-In news recommendation scenarios, unlike traditional systems that recommend entertainment items such as movies or music, several new problems arise.
-- User profile features are very sparse: a user may log in to a news recommendation app anonymously, and is likely to read freshly published items.
-- News items appear and disappear much faster than movies or music. A news recommendation app typically generates thousands of items, and consumption is equally fast because users care about what has just happened.
-- User interests change frequently in the news recommendation setting. The content of an article strongly affects reading behavior even when its category lies outside the user's long-term interests. In news recommendation, reading behavior is determined by both the short-term and the long-term interests of users.
-
-[GRU4Rec](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleRec/gru4rec) models a user's short-term and long-term interest by applying a gated recurrent unit to the user's reading history. The generalization ability of the recurrent network captures similarity across reading sequences, which alleviates the profile-sparsity problem. However, GRU4Rec operates on a closed domain of items: it predicts the item of interest by classification, so it cannot score items that are absent from the training dataset, even though news items are highly dynamic over time.
-
-The Sequence Semantic Retrieval (SSR) model shares a similar idea with Multi-Rate Deep Learning for Temporal Recommendation, SIGIR 2016. It has two components: a matching model and a retrieval part.
-- SSR models a user's personalized interest in an item through a matching-model structure, and the representation of a news item can be computed online even if the item never appeared in the training dataset.
-- Given the item representations, a vector indexing service can be built online for news prediction; this is the retrieval part of SSR.
-
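-As a hypothetical sketch of the retrieval part (names and shapes are assumptions, not code from this repo), online item vectors can be ranked by cosine similarity against the user vector:
-```python
-import numpy as np
-
-def retrieve_top_k(user_vec, item_vecs, k=10):
-    """Rank all item vectors by cosine similarity to one user vector."""
-    sims = item_vecs.dot(user_vec) / (
-        np.linalg.norm(item_vecs, axis=1) * np.linalg.norm(user_vec) + 1e-8)
-    return np.argsort(-sims)[:k]
-```
-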
-## Dataset
-Dataset preprocessing follows the method of the [GRU4Rec project](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleRec/gru4rec). Note that you should reuse the scripts from the GRU4Rec project for data preprocessing.
-
-## Training
-Before training, set the PYTHONPATH environment variable:
-```
-export PYTHONPATH=./models/fluid:$PYTHONPATH
-```
-
-The command line options for training can be listed by `python train.py -h`
-``` bash
-python train.py --train_file rsc15_train_tr_paddle.txt
-```
-
-## Build Index
-TBA
-
-## Retrieval
-TBA
+Hi! This project has been migrated. Please visit [PaddleRec/ssr](../../../PaddleRec/ssr) to browse it.
diff --git a/fluid/PaddleRec/ssr/nets.py b/fluid/PaddleRec/ssr/nets.py
deleted file mode 100644
index 278cb8fdde2d63e1e5675c1dbdcfb11152116e73..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/ssr/nets.py
+++ /dev/null
@@ -1,101 +0,0 @@
-#Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import paddle.fluid as fluid
-import paddle.fluid.layers.nn as nn
-import paddle.fluid.layers.tensor as tensor
-import paddle.fluid.layers.control_flow as cf
-import paddle.fluid.layers.io as io
-from PaddleRec.multiview_simnet.nets import BowEncoder
-from PaddleRec.multiview_simnet.nets import GrnnEncoder
-
-
-class PairwiseHingeLoss(object):
- def __init__(self, margin=0.8):
-        self.margin = margin
-
-    def forward(self, pos, neg):
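-        # hinge loss: max(0, margin - pos + neg)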
- loss_part1 = nn.elementwise_sub(
- tensor.fill_constant_batch_size_like(
- input=pos,
- shape=[-1, 1],
- value=self.margin,
- dtype='float32'),
- pos)
- loss_part2 = nn.elementwise_add(loss_part1, neg)
- loss_part3 = nn.elementwise_max(
- tensor.fill_constant_batch_size_like(
- input=loss_part2,
- shape=[-1, 1],
- value=0.0,
- dtype='float32'),
- loss_part2)
- return loss_part3
-
-
-class SequenceSemanticRetrieval(object):
- """ sequence semantic retrieval model """
-
- def __init__(self, embedding_size, embedding_dim, hidden_size):
- self.embedding_size = embedding_size
- self.embedding_dim = embedding_dim
- self.emb_shape = [self.embedding_size, self.embedding_dim]
- self.hidden_size = hidden_size
- self.user_encoder = GrnnEncoder(hidden_size=hidden_size)
- self.item_encoder = BowEncoder()
- self.pairwise_hinge_loss = PairwiseHingeLoss()
-
- def get_correct(self, x, y):
- less = tensor.cast(cf.less_than(x, y), dtype='float32')
- correct = nn.reduce_sum(less)
- return correct
-
- def train(self):
- user_data = io.data(
- name="user", shape=[1], dtype="int64", lod_level=1
- )
- pos_item_data = io.data(
- name="p_item", shape=[1], dtype="int64", lod_level=1
- )
- neg_item_data = io.data(
- name="n_item", shape=[1], dtype="int64", lod_level=1
- )
- user_emb = nn.embedding(
- input=user_data, size=self.emb_shape, param_attr="emb.item"
- )
- pos_item_emb = nn.embedding(
- input=pos_item_data, size=self.emb_shape, param_attr="emb.item"
- )
- neg_item_emb = nn.embedding(
- input=neg_item_data, size=self.emb_shape, param_attr="emb.item"
- )
- user_enc = self.user_encoder.forward(user_emb)
- pos_item_enc = self.item_encoder.forward(pos_item_emb)
- neg_item_enc = self.item_encoder.forward(neg_item_emb)
- user_hid = nn.fc(
- input=user_enc, size=self.hidden_size, param_attr='user.w', bias_attr="user.b"
- )
- pos_item_hid = nn.fc(
- input=pos_item_enc, size=self.hidden_size, param_attr='item.w', bias_attr="item.b"
- )
- neg_item_hid = nn.fc(
- input=neg_item_enc, size=self.hidden_size, param_attr='item.w', bias_attr="item.b"
- )
- cos_pos = nn.cos_sim(user_hid, pos_item_hid)
- cos_neg = nn.cos_sim(user_hid, neg_item_hid)
- hinge_loss = self.pairwise_hinge_loss.forward(cos_pos, cos_neg)
- avg_cost = nn.mean(hinge_loss)
- correct = self.get_correct(cos_neg, cos_pos)
-
- return [user_data, pos_item_data, neg_item_data], \
- pos_item_hid, neg_item_hid, avg_cost, correct
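-
-
-if __name__ == "__main__":
-    # Illustrative self-check (a sketch; not part of the original file): the
-    # PairwiseHingeLoss above computes max(0, margin - pos + neg) elementwise.
-    import numpy as np
-    pos = np.array([[0.9], [0.3]], dtype="float32")  # cos(user, positive item)
-    neg = np.array([[0.1], [0.2]], dtype="float32")  # cos(user, negative item)
-    print(np.maximum(0.0, 0.8 - pos + neg))  # -> [[0.0], [0.7]]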
diff --git a/fluid/PaddleRec/ssr/reader.py b/fluid/PaddleRec/ssr/reader.py
deleted file mode 100644
index 97e0ae8ec1cd4089b5b291ac7a4552b73ab231ee..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/ssr/reader.py
+++ /dev/null
@@ -1,94 +0,0 @@
-#Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import random
-
-class Dataset:
- def __init__(self):
- pass
-
-class Vocab:
- def __init__(self):
- pass
-
-class YoochooseVocab(Vocab):
- def __init__(self):
- self.vocab = {}
- self.word_array = []
-
- def load(self, filelist):
- idx = 0
- for f in filelist:
- with open(f, "r") as fin:
- for line in fin:
- group = line.strip().split()
- for item in group:
- if item not in self.vocab:
- self.vocab[item] = idx
- self.word_array.append(idx)
- idx += 1
- else:
- self.word_array.append(self.vocab[item])
-
- def get_vocab(self):
- return self.vocab
-
- def _get_word_array(self):
- return self.word_array
-
-class YoochooseDataset(Dataset):
- def __init__(self, y_vocab):
- self.vocab_size = len(y_vocab.get_vocab())
- self.word_array = y_vocab._get_word_array()
- self.vocab = y_vocab.get_vocab()
-
- def sample_neg(self):
- return random.randint(0, self.vocab_size - 1)
-
- def sample_neg_from_seq(self, seq):
- return seq[random.randint(0, len(seq) - 1)]
-
- # TODO(guru4elephant): wait memory, should be improved
- def sample_from_word_freq(self):
- return self.word_array[random.randint(0, len(self.word_array) - 1)]
-
- def _reader_creator(self, filelist, is_train):
- def reader():
- for f in filelist:
- with open(f, 'r') as fin:
- line_idx = 0
- for line in fin:
- ids = line.strip().split()
- if len(ids) <= 1:
- continue
- conv_ids = [self.vocab[i] if i in self.vocab else 0 for i in ids]
-                    # randomly select an index as the boundary:
-                    # ids before the boundary form the source sequence,
-                    # the id right after the boundary is the target
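-                    # e.g. a session [i1, i2, i3, i4] with boundary 2 yields
-                    # src = [i1, i2] and pos_tgt = [i3]  (illustrative)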
- boundary = random.randint(1, len(ids) - 1)
- src = conv_ids[:boundary]
- pos_tgt = [conv_ids[boundary]]
- if is_train:
- neg_tgt = [self.sample_from_word_freq()]
- yield [src, pos_tgt, neg_tgt]
- else:
- yield [src, pos_tgt]
- return reader
-
- def train(self, file_list):
- return self._reader_creator(file_list, True)
-
- def test(self, file_list):
- return self._reader_creator(file_list, False)
-
diff --git a/fluid/PaddleRec/ssr/train.py b/fluid/PaddleRec/ssr/train.py
deleted file mode 100644
index 33fe23e55795e47dea3e7f767016a8be4492a4d0..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/ssr/train.py
+++ /dev/null
@@ -1,99 +0,0 @@
-#Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-import os
-import sys
-import argparse
-import logging
-import paddle.fluid as fluid
-import paddle
-import reader as reader
-from nets import SequenceSemanticRetrieval
-
-logging.basicConfig(format="%(asctime)s - %(levelname)s - %(message)s")
-logger = logging.getLogger("fluid")
-logger.setLevel(logging.INFO)
-
-def parse_args():
- parser = argparse.ArgumentParser("sequence semantic retrieval")
- parser.add_argument("--train_file", type=str, help="Training file")
- parser.add_argument("--valid_file", type=str, help="Validation file")
- parser.add_argument(
- "--epochs", type=int, default=10, help="Number of epochs for training")
- parser.add_argument(
- "--model_output_dir",
- type=str,
- default='model_output',
- help="Model output folder")
- parser.add_argument(
- "--sequence_encode_dim",
- type=int,
- default=128,
- help="Dimension of sequence encoder output")
- parser.add_argument(
- "--matching_dim",
- type=int,
- default=128,
- help="Dimension of hidden layer")
- parser.add_argument(
- "--batch_size", type=int, default=128, help="Batch size for training")
- parser.add_argument(
- "--embedding_dim",
- type=int,
- default=128,
- help="Default Dimension of Embedding")
- return parser.parse_args()
-
-def start_train(args):
- y_vocab = reader.YoochooseVocab()
- y_vocab.load([args.train_file])
-
- logger.info("Load yoochoose vocabulary size: {}".format(len(y_vocab.get_vocab())))
- y_data = reader.YoochooseDataset(y_vocab)
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- y_data.train([args.train_file]), buf_size=args.batch_size * 100),
- batch_size=args.batch_size)
- place = fluid.CPUPlace()
- ssr = SequenceSemanticRetrieval(
- len(y_vocab.get_vocab()), args.embedding_dim, args.matching_dim
- )
- input_data, user_rep, item_rep, avg_cost, acc = ssr.train()
- optimizer = fluid.optimizer.Adam(learning_rate=1e-4)
- optimizer.minimize(avg_cost)
- startup_program = fluid.default_startup_program()
- loop_program = fluid.default_main_program()
- data_list = [var.name for var in input_data]
- feeder = fluid.DataFeeder(feed_list=data_list, place=place)
- exe = fluid.Executor(place)
- exe.run(startup_program)
-
- for pass_id in range(args.epochs):
- for batch_id, data in enumerate(train_reader()):
- loss_val, correct_val = exe.run(loop_program,
- feed=feeder.feed(data),
- fetch_list=[avg_cost, acc])
- logger.info("Train --> pass: {} batch_id: {} avg_cost: {}, acc: {}".
- format(pass_id, batch_id, loss_val,
- float(correct_val) / args.batch_size))
- fluid.io.save_inference_model(args.model_output_dir,
-                                      [var.name for var in input_data],
- [user_rep, item_rep, avg_cost, acc], exe)
-
-def main():
- args = parse_args()
- start_train(args)
-
-if __name__ == "__main__":
- main()
-
diff --git a/fluid/PaddleRec/tagspace/README.md b/fluid/PaddleRec/tagspace/README.md
index 055099dbd29be6655472b8832ea08b608a810f1c..67e3f88f7a2245829d0efbfe23a6566a0745fe41 100644
--- a/fluid/PaddleRec/tagspace/README.md
+++ b/fluid/PaddleRec/tagspace/README.md
@@ -1,90 +1,2 @@
-# TagSpace
-A brief overview of this example's directory layout:
-
-```text
-.
-├── README.md            # this document
-├── train.py             # training script
-├── infer.py             # inference script
-├── net.py               # network definition
-├── text2paddle.py       # converts raw text into Paddle's input format
-├── cluster_train.py     # multi-machine training
-├── cluster_train.sh     # multi-machine training launcher
-├── utils                # shared utilities
-├── vocab_text.txt       # text vocabulary for the small sample
-├── vocab_tag.txt        # tag vocabulary for the small sample
-├── train_data           # training directory for the small sample
-└── test_data            # test directory for the small sample
-
-```
-
-
-## Introduction
-
-For an introduction to the TagSpace model, see the paper [#TagSpace: Semantic Embeddings from Hashtags](https://research.fb.com/publications/tagspace-semantic-embeddings-from-hashtags/).
-
-TagSpace learns embedding representations of texts and tags for industrial-scale tag recommendation; a typical application scenario is recommending tags for feed news.
-
-
-## Data download and preprocessing
-
-[ag news dataset](https://github.com/mhjabreel/CharCNN/tree/master/data/ag_news_csv)
-
-The data format is as follows:
-
-```
-"3","Wall St. Bears Claw Back Into the Black (Reuters)","Reuters - Short-sellers, Wall Street's dwindling\band of ultra-cynics, are seeing green again."
-```
-
-To convert the text data into Paddle's format, first move it into the raw training and test directories:
-```
-mv train.csv raw_big_train_data
-mv test.csv raw_big_test_data
-```
-
-Then run text2paddle.py to generate the Paddle input format:
-```
-python text2paddle.py raw_big_train_data/ raw_big_test_data/ train_big_data test_big_data big_vocab_text.txt big_vocab_tag.txt
-```
-
-## Single-machine training
-`--use_cuda 1` selects the GPU (0 selects the CPU); `--parallel 1` enables multi-card training.
-
-Training on the small sample (the sample data is already prepared, so you can skip the previous section and run the command directly):
-
-GPU environment
-```
-CUDA_VISIBLE_DEVICES=0 python train.py --use_cuda 1
-```
-CPU environment
-```
-python train.py
-```
-
-Full-data training on a single machine with a single card:
-```
-CUDA_VISIBLE_DEVICES=0 python train.py --use_cuda 1 --train_dir train_big_data/ --vocab_text_path big_vocab_text.txt --vocab_tag_path big_vocab_tag.txt --model_dir big_model --batch_size 500
-```
-Full-data training on a single machine with multiple cards:
-
-```
-python train.py --train_dir train_big_data/ --vocab_text_path big_vocab_text.txt --vocab_tag_path big_vocab_tag.txt --model_dir big_model --batch_size 500 --parallel 1
-```
-
-## Inference
-Inference on the small sample:
-```
-python infer.py
-```
-
-Full-data inference:
-```
-python infer.py --model_dir big_model --vocab_tag_path big_vocab_tag.txt --test_dir test_big_data/
-```
-
-## Simulating multi-machine training locally
-Run:
-```
-sh cluster_train.sh
-```
+Hi! This project has been migrated. Please visit it at [PaddleRec/tagspace](../../../PaddleRec/tagspace).
diff --git a/fluid/PaddleRec/tagspace/infer.py b/fluid/PaddleRec/tagspace/infer.py
deleted file mode 100644
index 2ecdaa5a792da24495ee4b2c5ce1ab605e187ba9..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/tagspace/infer.py
+++ /dev/null
@@ -1,85 +0,0 @@
-import sys
-import argparse
-import time
-import math
-import unittest
-import contextlib
-import numpy as np
-import six
-import paddle.fluid as fluid
-import paddle
-import utils
-
-def parse_args():
-    parser = argparse.ArgumentParser("tagspace benchmark.")
- parser.add_argument(
- '--test_dir', type=str, default='test_data', help='test file address')
- parser.add_argument(
- '--vocab_tag_path', type=str, default='vocab_tag.txt', help='vocab path')
-    parser.add_argument(
-        '--start_index', type=int, default=1, help='start index')
-    parser.add_argument(
-        '--last_index', type=int, default=10, help='end index')
-    parser.add_argument(
-        '--model_dir', type=str, default='model_', help='model dir')
-    parser.add_argument(
-        '--use_cuda', type=int, default=0, help='whether use cuda')
-    parser.add_argument(
-        '--batch_size', type=int, default=5, help='batch size')
- args = parser.parse_args()
- return args
-
-def infer(test_reader, vocab_tag, use_cuda, model_path):
- """ inference function """
- place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
- exe = fluid.Executor(place)
-
- with fluid.scope_guard(fluid.core.Scope()):
- infer_program, feed_target_names, fetch_vars = fluid.io.load_inference_model(
- model_path, exe)
- t0 = time.time()
- step_id = 0
- true_num = 0
- all_num = 0
- size = vocab_tag
- value = []
- for data in test_reader():
- step_id += 1
- lod_text_seq = utils.to_lodtensor([dat[0] for dat in data], place)
- lod_tag = utils.to_lodtensor([dat[1] for dat in data], place)
- lod_pos_tag = utils.to_lodtensor([dat[2] for dat in data], place)
- para = exe.run(
- infer_program,
- feed={
- "text": lod_text_seq,
- "pos_tag": lod_tag},
- fetch_list=fetch_vars,
- return_numpy=False)
- value.append(para[0]._get_float_element(0))
- if step_id % size == 0 and step_id > 1:
- all_num += 1
- true_pos = [dat[2] for dat in data][0][0]
- if value.index(max(value)) == int(true_pos):
- true_num += 1
- value = []
- if step_id % 1000 == 0:
- print(step_id, 1.0 * true_num / all_num)
- t1 = time.time()
-
-if __name__ == "__main__":
- args = parse_args()
- start_index = args.start_index
- last_index = args.last_index
- test_dir = args.test_dir
- model_dir = args.model_dir
- batch_size = args.batch_size
- vocab_tag_path = args.vocab_tag_path
- use_cuda = True if args.use_cuda else False
-    print("start index: ", start_index, " last_index: ", last_index)
- vocab_text, vocab_tag, test_reader = utils.prepare_data(
- test_dir, "", vocab_tag_path, batch_size=1,
- neg_size=0, buffer_size=1000, is_train=False)
-
-    for epoch in six.moves.range(start_index, last_index + 1):
-        epoch_path = model_dir + "/epoch_" + str(epoch)
-        infer(test_reader=test_reader, vocab_tag=vocab_tag, use_cuda=use_cuda,
-              model_path=epoch_path)
diff --git a/fluid/PaddleRec/tagspace/train.py b/fluid/PaddleRec/tagspace/train.py
deleted file mode 100644
index becb2a0379672a47f574d1e942751582bd89de47..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/tagspace/train.py
+++ /dev/null
@@ -1,117 +0,0 @@
-import os
-import sys
-import time
-import six
-import numpy as np
-import math
-import argparse
-import paddle
-import paddle.fluid as fluid
-import time
-import utils
-import net
-
-SEED = 102
-
-def parse_args():
- parser = argparse.ArgumentParser("TagSpace benchmark.")
- parser.add_argument(
- '--neg_size', type=int, default=3, help='neg/pos ratio')
- parser.add_argument(
- '--train_dir', type=str, default='train_data', help='train file address')
- parser.add_argument(
- '--vocab_text_path', type=str, default='vocab_text.txt', help='vocab_text file address')
-    parser.add_argument(
-        '--vocab_tag_path', type=str, default='vocab_tag.txt', help='vocab_tag file address')
- parser.add_argument(
- '--model_dir', type=str, default='model_', help='model dir')
- parser.add_argument(
- '--batch_size', type=int, default=5, help='num of batch size')
- parser.add_argument(
- '--print_batch', type=int, default=10, help='num of print batch')
- parser.add_argument(
- '--pass_num', type=int, default=10, help='num of epoch')
- parser.add_argument(
- '--use_cuda', type=int, default=0, help='whether use gpu')
- parser.add_argument(
- '--parallel', type=int, default=0, help='whether parallel')
- parser.add_argument(
- '--base_lr', type=float, default=0.01, help='learning rate')
- parser.add_argument(
- '--num_devices', type=int, default=1, help='Number of GPU devices')
- args = parser.parse_args()
- return args
-
-def get_cards(args):
- return args.num_devices
-
-def train():
- """ do training """
- args = parse_args()
- train_dir = args.train_dir
- vocab_text_path = args.vocab_text_path
- vocab_tag_path = args.vocab_tag_path
- use_cuda = True if args.use_cuda else False
- parallel = True if args.parallel else False
- batch_size = args.batch_size
- neg_size = args.neg_size
- print("use_cuda: {}, parallel: {}, batch_size: {}, neg_size: {} "
- .format(use_cuda, parallel, batch_size, neg_size))
- vocab_text_size, vocab_tag_size, train_reader = utils.prepare_data(
- file_dir=train_dir, vocab_text_path=vocab_text_path,
- vocab_tag_path=vocab_tag_path, neg_size=neg_size,
- batch_size=batch_size * get_cards(args),
- buffer_size=batch_size*100, is_train=True)
- """ train network """
- # Train program
- avg_cost, correct, cos_pos = net.network(vocab_text_size, vocab_tag_size, neg_size=neg_size)
-
-    # Optimization to minimize loss
-    optimizer = fluid.optimizer.Adagrad(learning_rate=args.base_lr)
-    optimizer.minimize(avg_cost)
-
- # Initialize executor
- place = fluid.CUDAPlace(0) if use_cuda else fluid.CPUPlace()
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
- if parallel:
- train_exe = fluid.ParallelExecutor(
- use_cuda=use_cuda,
- loss_name=avg_cost.name)
- else:
- train_exe = exe
-
- pass_num = args.pass_num
- model_dir = args.model_dir
- fetch_list = [avg_cost.name]
- total_time = 0.0
- for pass_idx in range(pass_num):
- epoch_idx = pass_idx + 1
- print("epoch_%d start" % epoch_idx)
- t0 = time.time()
- for batch_id, data in enumerate(train_reader()):
- lod_text_seq = utils.to_lodtensor([dat[0] for dat in data], place)
- lod_pos_tag = utils.to_lodtensor([dat[1] for dat in data], place)
- lod_neg_tag = utils.to_lodtensor([dat[2] for dat in data], place)
- loss_val, correct_val = train_exe.run(
- feed={
- "text": lod_text_seq,
- "pos_tag": lod_pos_tag,
- "neg_tag": lod_neg_tag},
- fetch_list=[avg_cost.name, correct.name])
- if batch_id % args.print_batch == 0:
- print("TRAIN --> pass: {} batch_num: {} avg_cost: {}, acc: {}"
- .format(pass_idx, (batch_id+10) * batch_size, np.mean(loss_val),
- float(np.sum(correct_val)) / (args.num_devices*batch_size)))
- t1 = time.time()
- total_time += t1 - t0
- print("epoch:%d num_steps:%d time_cost(s):%f" %
- (epoch_idx, batch_id, total_time / epoch_idx))
- save_dir = "%s/epoch_%d" % (model_dir, epoch_idx)
- feed_var_names = ["text", "pos_tag"]
- fetch_vars = [cos_pos]
- fluid.io.save_inference_model(save_dir, feed_var_names, fetch_vars, exe)
- print("finish training")
-
-if __name__ == "__main__":
- train()
diff --git a/fluid/PaddleRec/word2vec/README.md b/fluid/PaddleRec/word2vec/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..7504ff9c332bf86f606d6d8770cefb325fc29ce0
--- /dev/null
+++ b/fluid/PaddleRec/word2vec/README.md
@@ -0,0 +1,2 @@
+
+Hi! This project has been migrated. Please visit it at [PaddleRec/word2vec](../../../PaddleRec/word2vec).
diff --git a/fluid/README.cn.rst b/fluid/README.cn.rst
deleted file mode 100644
index 811038c8aaedcce2b55fca54e647dcda98924db9..0000000000000000000000000000000000000000
--- a/fluid/README.cn.rst
+++ /dev/null
@@ -1,201 +0,0 @@
-Fluid Model Library
-===================
-
-Image Classification
---------------------
-
-Image classification distinguishes between categories of images based on their semantic content. It is a fundamental problem in computer vision and underlies higher-level vision tasks such as object detection, image segmentation, object tracking, behavior analysis and face recognition. It is widely applied in many fields, such as face recognition and intelligent video analysis in security, traffic scene recognition in transportation, content-based image retrieval and automatic album categorization on the internet, and image recognition in medicine.
-
-In the deep learning era, the accuracy of image classification has risen dramatically. For this task we show how to train commonly used models on the classic ImageNet dataset, including AlexNet, VGG, GoogLeNet, ResNet, Inception-v4, MobileNet, DPN (Dual Path Network) and SE-ResNeXt. We have also open-sourced the `trained models <https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleCV/image_classification/README_cn.md#已有模型及其性能>`__ for users to download and use, together with a tool that converts Caffe models into PaddlePaddle Fluid model configurations and parameter files.
-
-- `AlexNet <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models>`__
-- `VGG <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models>`__
-- `GoogleNet <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models>`__
-- `Residual Network <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models>`__
-- `Inception-v4 <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models>`__
-- `MobileNet <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models>`__
-- `Dual Path Network <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models>`__
-- `SE-ResNeXt <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models>`__
-- `Tool for converting Caffe models into Paddle Fluid configuration and model files <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/caffe2fluid>`__
-
-Object Detection
-----------------
-
-The goal of object detection is, given an image or a video frame, to have the computer find the positions of all targets in it and report the specific class of each one. For humans, object detection is a very simple task. A computer, however, "sees" only the numbers the image has been encoded into, so it is hard for it to extract high-level semantic concepts such as a person or an object appearing in an image or video frame, and even harder to localize where in the image a target appears. At the same time, targets may appear anywhere in the image or video frame, their shapes vary enormously, and the backgrounds differ widely, all of which makes object detection a challenging problem for computers.
-
-For this task we show how to train general-purpose object detection models on `PASCAL VOC <http://host.robots.ox.ac.uk/pascal/VOC/>`__ and `MS COCO <http://cocodataset.org/#home>`__ data. We currently cover the SSD algorithm (Single Shot MultiBox Detector), one of the newer and better-performing detection algorithms, characterized by both fast detection and high accuracy.
-
-Detecting faces in open environments, especially small, blurry and partially occluded faces, is also a challenging task. We also show how to train PyramidBox, Baidu's in-house face detection model, on `WIDER FACE <http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace>`__ data; in March 2018 this algorithm won `first place <http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace/WiderFace_Results.html>`__ in several WIDER FACE evaluations.
-
-- `Single Shot MultiBox Detector <https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleCV/object_detection/README_cn.md>`__
-- `Face Detector: PyramidBox <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/face_detection/README_cn.md>`__
-
-Image Semantic Segmentation
----------------------------
-
-Image semantic segmentation, as the name suggests, groups or segments image pixels according to their semantic meaning. Image semantics refers to understanding the image content, e.g. being able to describe what object is doing what where; segmentation means labeling every pixel in the image with the class it belongs to. In recent years it has been used in autonomous driving to segment street scenes to avoid pedestrians and vehicles, and in medical image analysis to assist diagnosis.
-
-For this task we show how to perform semantic segmentation with the Image Cascade Network (ICNet); compared with other segmentation algorithms, ICNet balances accuracy and speed.
-
-- `ICNet <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/icnet>`__
-
-Image Generation
-----------------
-
-Image generation produces a target image from an input vector, which can be random noise or a user-specified conditional vector. Concrete application scenarios include handwriting generation, face synthesis, style transfer and image inpainting. Image generation is currently mostly done with generative adversarial networks (GANs).
-A GAN consists of two sub-networks: a generator and a discriminator. The generator's input is random noise or a conditional vector and its output is the target image. The discriminator is a classifier whose input is an image and whose output is whether that image is real. During training, the generator and the discriminator continually improve by playing against each other.
-
-For this task we show how to generate handwritten digits with DCGAN and ConditionalGAN, and we also introduce CycleGAN for style transfer.
-
-- `DCGAN & ConditionalGAN <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/gan/c_gan>`__
-- `CycleGAN <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/gan/cycle_gan>`__
-
-Scene Text Recognition
-----------------------
-
-Many scene images contain rich text information that plays an important role in understanding the image and can greatly help people perceive and understand scene content. Scene text recognition converts image information into text sequences under complex backgrounds, low resolution, diverse fonts, arbitrary layouts and similar conditions; it can be viewed as a special kind of translation: from image input to natural-language output. The development of scene text recognition has also spawned new applications, such as automatically recognizing the text on road signs to help street-view applications obtain more accurate addresses.
-
-For this task we show how to combine CNN-based image feature extraction with RNN-based sequence translation, removing the need for hand-defined features, avoiding character segmentation, and using automatically learned image features to recognize characters. We currently cover the CRNN-CTC model and an attention-based sequence-to-sequence model.
-
-- `CRNN-CTC model <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/ocr_recognition>`__
-- `Attention model <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/ocr_recognition>`__
-
-
-Metric Learning
----------------
-
-
-Metric learning, also known as distance metric learning or similarity learning, learns distances between objects, which can be used to analyze the relations and comparability between objects. It is widely applied to real problems, assisting classification and clustering, and is also widely used in image retrieval, face recognition and related fields. Traditionally, for each different task one had to select suitable features and construct a distance function by hand, whereas metric learning can autonomously learn a task-specific metric. Combined with deep learning, metric learning has achieved good performance in face recognition/verification, human re-identification (Re-ID), image retrieval and other areas; for this task we cover Fluid-based deep metric learning models with loss functions including triplet and quadruplet losses.
-
-- `Metric Learning <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/metric_learning>`__
-
-
-Video Classification
---------------------
-
-Video classification is the foundation of video understanding tasks. Unlike image classification, the object being classified is no longer a still image but a video object composed of multiple frames and containing audio and motion information, so understanding video requires more context: not only what each frame is and what it contains, but also how the frames relate to one another. Video classification methods are mainly based on convolutional neural networks, on recurrent neural networks, or on combinations of the two. For this task we cover Fluid-based video classification models, currently the Temporal Segment Network (TSN) model, with more models to come.
-
-
-- `TSN <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/video_classification>`__
-
-
-
-Speech Recognition
-------------------
-
-Automatic Speech Recognition (ASR) transcribes the lexical content of human speech into text that a computer can take as input. Research on speech recognition went through a long period of exploration and developed slowly after the HMM/GMM era; with the rise of deep learning it has flourished again. Across many speech recognition tasks, using deep neural networks (DNNs) as acoustic models has achieved better performance than GMMs, making ASR one of the most successful applications of deep learning. Thanks to steadily improving recognition accuracy, more and more speech products have shipped, such as voice input methods and smart home devices like smart speakers; speech-based interaction is profoundly changing our lives.
-
-Unlike `DeepSpeech <https://github.com/PaddlePaddle/DeepSpeech>`__, where a deep learning model directly predicts word distributions end to end, this example stays closer to the traditional speech recognition pipeline: it uses phonemes as the modeling unit and focuses on training the acoustic model, uses `kaldi <http://www.kaldi-asr.org>`__ for audio feature extraction and label alignment, and integrates kaldi's decoder for decoding.
-
-- `DeepASR <https://github.com/PaddlePaddle/models/blob/develop/fluid/DeepASR/README_cn.md>`__
-
-Machine Translation
--------------------
-
-Machine translation transforms a natural language (the source language) into another natural language (the target language) and is a fundamental and important research direction in natural language processing. In a globalized world, the important role machine translation plays in facilitating communication across languages and cultures is self-evident. Its development went through statistical machine translation and then neural machine translation (NMT); only once NMT matured did machine translation become truly applicable at scale. Early NMT was mainly based on recurrent neural networks (RNNs), whose training depends at each time step on the computation of the previous step, making it hard to parallelize across time steps for faster training. Therefore non-RNN NMT architectures emerged, such as structures based on convolutional neural networks (CNNs) and on self-attention.
-
-The Transformer implemented in this example is a machine translation model based on self-attention: it contains no RNN or CNN structure and learns contextual dependencies entirely through attention. Compared with RNNs/CNNs, this structure has lower computational complexity within a single layer, is easier to parallelize and models long-range dependencies more easily, and it ultimately achieved the best translation quality across several languages.
-
-- `Transformer <https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleNLP/neural_machine_translation/transformer/README_cn.md>`__
-
-Reinforcement Learning
-----------------------
-
-Reinforcement learning has become an increasingly important direction of machine learning in recent years; in particular, its combination with deep learning, deep reinforcement learning (DRL), has achieved many astonishing results. AlphaGo, well known for defeating top professional Go players, is a typical application of DRL; beyond games, other applications include robotics and natural language processing.
-
-The seminal work of DRL was its successful application to Atari video games: the model takes high-dimensional video frames as direct input and predicts the next action end to end from the image content; it is known as the Deep Q-Network (DQN). This example uses the flexible PaddlePaddle Fluid framework to implement DQN and its variants and tests them on Atari games.
-
-- `DeepQNetwork <https://github.com/PaddlePaddle/models/blob/develop/fluid/DeepQNetwork/README_cn.md>`__
-
-Chinese Lexical Analysis
-------------------------
-
-Chinese word segmentation splits continuous natural-language text into word sequences that are semantically reasonable and complete. In Chinese the word is the most basic unit that carries meaning, so segmentation is the foundation of many NLP tasks such as text classification, sentiment analysis and information retrieval. Part-of-speech tagging assigns a part of speech to every word in a text, including nouns, verbs, adjectives, adverbs and so on. Named entity recognition (NER), also called "proper-name recognition", identifies entities with specific meaning in text, mainly person names, place names, organization names and other proper nouns. We unify these three tasks into one joint task, called lexical analysis, and provide an end-to-end solution based on deep neural networks trained on a massive annotated corpus.
-
-We name this joint Chinese lexical analysis solution LAC. LAC can be read as an acronym of Lexical Analysis of Chinese, or as a recursive acronym of LAC Analyzes Chinese.
-
-- `LAC <https://github.com/baidu/lac/blob/master/README.md>`__
-
-Sentiment Analysis
-------------------
-
-Sentiment analysis automatically determines the sentiment polarity of subjective Chinese text and gives a corresponding confidence. The sentiment types are positive, negative and neutral. Sentiment analysis helps companies understand users' consumption habits, analyze hot topics and monitor public-opinion crises, providing strong decision support. Here we release the `model <http://ai.baidu.com/tech/nlp/sentiment_classify>`__ used for sentiment analysis on Baidu's AI open platform for users to use.
-
-- `Senta <https://github.com/baidu/Senta/blob/master/README.md>`__
-
-Semantic Matching
------------------
-
-Many NLP scenarios need to measure how similar two texts are semantically; such tasks are usually called semantic matching. Examples include ranking search results by the similarity between the query and candidate documents, computing text-to-text similarity for deduplication, and matching candidate answers to questions in automatic question answering.
-
-The DAM (Deep Attention Matching Network) released here is work by Baidu's NLP department published at ACL 2018, used for response selection in the multi-turn dialogue of retrieval-based chatbots. Inspired by the Transformer, DAM's network structure is based entirely on attention: it uses stacked self-attention to learn semantic representations of responses and contexts at multiple granularities, then uses cross-attention to capture the relevance between response and context. It outperformed other models on two large-scale multi-turn dialogue datasets.
-
-- `Deep Attention Matching Network <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleNLP/deep_attention_matching_net>`__
-
-AnyQ
-----
-
-The `AnyQ <https://github.com/baidu/AnyQ>`__ (ANswer Your Questions)
-open-source project mainly contains a question-answering framework for FAQ collections and the text semantic matching tool SimNet.
-The QA framework uses a configurable, plugin-based design in which every function is added as a plugin; more than 20 plugins are currently available. Developers can use AnyQ to quickly build and customize FAQ QA systems for specific business scenarios and to speed up iteration and upgrades.
-
-SimNet is a semantic matching framework developed in-house by Baidu's NLP department in 2013 and widely used across Baidu products. It mainly includes core network structures such as BOW, CNN, RNN and MM-DNN, and on top of the framework it also integrates mainstream academic semantic matching models such as MatchPyramid, MV-LSTM and K-NRM. Models built with SimNet can easily be added to AnyQ to strengthen its semantic matching capability.
-
-- `SimNet in PaddlePaddle Fluid <https://github.com/baidu/AnyQ/blob/master/tools/simnet/train/paddle/README.md>`__
-
-Machine Reading Comprehension
------------------------------
-
-Machine reading comprehension (MRC) is one of the core tasks in natural language processing (NLP): its ultimate goal is to have machines read text like humans, extract the information in it and answer related questions. Deep learning has been widely adopted in NLP in recent years and has greatly improved machine reading comprehension, but current MRC research still uses artificially constructed datasets and answers relatively simple questions, leaving a clear gap to the data humans handle, so large-scale real-world training data is urgently needed to push MRC forward.
-
-The Baidu reading comprehension dataset, open-sourced by Baidu's NLP department, is a real-world dataset: all questions and passages come from actual data (Baidu search and the Baidu Zhidao QA community), and the answers were written by humans. Every question corresponds to multiple answers; the dataset contains 200k questions, 1000k passages and 420k answers, making it the largest Chinese MRC dataset to date. Baidu has also open-sourced the corresponding reading comprehension model, DuReader, which uses a common layered network structure, captures the interaction between question and passage with a bidirectional attention mechanism to produce query-aware passage representations, and finally predicts the answer span from those representations with a pointer network.
-
-- `DuReader in PaddlePaddle Fluid <https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleNLP/machine_reading_comprehension/README.md>`__
-
-
-Personalized Recommendation
----------------------------
-
-Recommender systems play an ever-larger role in today's internet services: most e-commerce systems, social networks, ad recommendation and search engines use some form of personalized recommendation to help users quickly find what they want.
-
-In industrial recommender systems, the recommendation strategy is usually divided into multiple modules executed in series. Taking a news recommender as an example, there are several stages where deep learning can be applied, such as automatic news tagging, personalized news recall, and personalized matching and ranking. PaddlePaddle provides complete support for training recommendation algorithms and offers a variety of model configurations to choose from.
-
-- `TagSpace <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleRec/tagspace>`__
-- `GRU4Rec <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleRec/gru4rec>`__
-- `SequenceSemanticRetrieval <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleRec/ssr>`__
-- `DeepCTR <https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleRec/ctr/README.cn.md>`__
-- `Multiview-Simnet <https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleRec/multiview_simnet>`__
diff --git a/fluid/README.md b/fluid/README.md
deleted file mode 100644
index 9bbcb9623695319d34d0b986ad85d9029ef0b0a5..0000000000000000000000000000000000000000
--- a/fluid/README.md
+++ /dev/null
@@ -1,169 +0,0 @@
-Fluid Model Library
-============
-
-Image Classification
---------
-
-Image classification distinguishes between categories of images based on their semantic content. It is a fundamental problem in computer vision and underlies higher-level vision tasks such as object detection, image segmentation, object tracking, behavior analysis and face recognition. It is widely applied in many fields, such as face recognition and intelligent video analysis in security, traffic scene recognition in transportation, content-based image retrieval and automatic album categorization on the internet, and image recognition in medicine.
-
-In the deep learning era, the accuracy of image classification has risen dramatically. For this task we show how to train commonly used models on the classic ImageNet dataset, including AlexNet, VGG, GoogLeNet, ResNet, Inception-v4, MobileNet, DPN (Dual Path Network) and SE-ResNeXt. We have also open-sourced the [trained models](https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleCV/image_classification/README_cn.md#已有模型及其性能) for users to download and use, together with a tool that converts Caffe models into PaddlePaddle Fluid model configurations and parameter files.
-
-- [AlexNet](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models)
-- [VGG](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models)
-- [GoogleNet](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models)
-- [Residual Network](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models)
-- [Inception-v4](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models)
-- [MobileNet](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models)
-- [Dual Path Network](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models)
-- [SE-ResNeXt](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/image_classification/models)
-- [Tool for converting Caffe models into Paddle Fluid configuration and model files](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/caffe2fluid)
-
-Object Detection
---------
-
-The goal of object detection is, given an image or a video frame, to have the computer find the positions of all targets in it and report the specific class of each one. For humans, object detection is a very simple task. A computer, however, "sees" only the numbers the image has been encoded into, so it is hard for it to extract high-level semantic concepts such as a person or an object appearing in an image or video frame, and even harder to localize where in the image a target appears. At the same time, targets may appear anywhere in the image or video frame, their shapes vary enormously, and the backgrounds differ widely, all of which makes object detection a challenging problem for computers.
-
-For this task we show how to train general-purpose object detection models on [PASCAL VOC](http://host.robots.ox.ac.uk/pascal/VOC/) and [MS COCO](http://cocodataset.org/#home) data. We currently cover the SSD algorithm (Single Shot MultiBox Detector), one of the newer and better-performing detection algorithms, characterized by both fast detection and high accuracy.
-
-Detecting faces in open environments, especially small, blurry and partially occluded faces, is also a challenging task. We also show how to train PyramidBox, Baidu's in-house face detection model, on [WIDER FACE](http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace) data; in March 2018 this algorithm won [first place](http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace/WiderFace_Results.html) in several WIDER FACE evaluations.
-
-Faster RCNN is a classic two-stage object detector. Compared with traditional region-extraction methods, the RPN network in Faster RCNN greatly improves the efficiency of region extraction by sharing convolutional-layer parameters, and it proposes high-quality candidate regions.
-
-- [Single Shot MultiBox Detector](https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleCV/object_detection/README_cn.md)
-- [Face Detector: PyramidBox](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/face_detection/README_cn.md)
-- [Faster RCNN](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/faster_rcnn/README_cn.md)
-
-Image Semantic Segmentation
-------------
-
-Image semantic segmentation, as the name suggests, groups or segments image pixels according to their semantic meaning. Image semantics refers to understanding the image content, e.g. being able to describe what object is doing what where; segmentation means labeling every pixel in the image with the class it belongs to. In recent years it has been used in autonomous driving to segment street scenes to avoid pedestrians and vehicles, and in medical image analysis to assist diagnosis.
-
-For this task we show how to perform semantic segmentation with the Image Cascade Network (ICNet); compared with other segmentation algorithms, ICNet balances accuracy and speed.
-
-- [ICNet](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/icnet)
-
-Image Generation
------------
-
-Image generation produces a target image from an input vector, which can be random noise or a user-specified conditional vector. Concrete application scenarios include handwriting generation, face synthesis, style transfer and image inpainting. Image generation is currently mostly done with generative adversarial networks (GANs).
-A GAN consists of two sub-networks: a generator and a discriminator. The generator's input is random noise or a conditional vector and its output is the target image. The discriminator is a classifier whose input is an image and whose output is whether that image is real. During training, the generator and the discriminator continually improve by playing against each other.
-
-For this task we show how to generate handwritten digits with DCGAN and ConditionalGAN, and we also introduce CycleGAN for style transfer.
-
-- [DCGAN & ConditionalGAN](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/gan/c_gan)
-- [CycleGAN](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/gan/cycle_gan)
-
-Scene Text Recognition
-------------
-
-Many scene images contain rich text information that plays an important role in understanding the image and can greatly help people perceive and understand scene content. Scene text recognition converts image information into text sequences under complex backgrounds, low resolution, diverse fonts, arbitrary layouts and similar conditions; it can be viewed as a special kind of translation: from image input to natural-language output. The development of scene text recognition has also spawned new applications, such as automatically recognizing the text on road signs to help street-view applications obtain more accurate addresses.
-
-For this task we show how to combine CNN-based image feature extraction with RNN-based sequence translation, removing the need for hand-defined features, avoiding character segmentation, and using automatically learned image features to recognize characters. We currently cover the CRNN-CTC model and an attention-based sequence-to-sequence model.
-
-- [CRNN-CTC model](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/ocr_recognition)
-- [Attention model](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/ocr_recognition)
-
-
-Metric Learning
--------
-
-
-Metric learning, also known as distance metric learning or similarity learning, learns distances between objects, which can be used to analyze the relations and comparability between objects. It is widely applied to real problems, assisting classification and clustering, and is also widely used in image retrieval, face recognition and related fields. Traditionally, for each different task one had to select suitable features and construct a distance function by hand, whereas metric learning can autonomously learn a task-specific metric. Combined with deep learning, metric learning has achieved good performance in face recognition/verification, human re-identification (Re-ID), image retrieval and other areas; for this task we cover Fluid-based deep metric learning models with loss functions including triplet and quadruplet losses.
-
-- [Metric Learning](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/metric_learning)
-
-
-Video Classification
--------
-
-Video classification is the foundation of video understanding tasks. Unlike image classification, the object being classified is no longer a still image but a video object composed of multiple frames and containing audio and motion information, so understanding video requires more context: not only what each frame is and what it contains, but also how the frames relate to one another. Video classification methods are mainly based on convolutional neural networks, on recurrent neural networks, or on combinations of the two. For this task we cover Fluid-based video classification models, currently the Temporal Segment Network (TSN) model, with more models to come.
-
-
-- [TSN](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleCV/video_classification)
-
-
-Speech Recognition
---------
-
-Automatic Speech Recognition (ASR) transcribes the lexical content of human speech into text that a computer can take as input. Research on speech recognition went through a long period of exploration and developed slowly after the HMM/GMM era; with the rise of deep learning it has flourished again. Across many speech recognition tasks, using deep neural networks (DNNs) as acoustic models has achieved better performance than GMMs, making ASR one of the most successful applications of deep learning. Thanks to steadily improving recognition accuracy, more and more speech products have shipped, such as voice input methods and smart home devices like smart speakers; speech-based interaction is profoundly changing our lives.
-
-Unlike [DeepSpeech](https://github.com/PaddlePaddle/DeepSpeech), where a deep learning model directly predicts word distributions end to end, this example stays closer to the traditional speech recognition pipeline: it uses phonemes as the modeling unit and focuses on training the acoustic model, uses [kaldi](http://www.kaldi-asr.org) for audio feature extraction and label alignment, and integrates kaldi's decoder for decoding.
-
-- [DeepASR](https://github.com/PaddlePaddle/models/blob/develop/fluid/DeepASR/README_cn.md)
-
-Machine Translation
---------
-
-Machine translation transforms a natural language (the source language) into another natural language (the target language) and is a fundamental and important research direction in natural language processing. In a globalized world, the important role machine translation plays in facilitating communication across languages and cultures is self-evident. Its development went through statistical machine translation and then neural machine translation (NMT); only once NMT matured did machine translation become truly applicable at scale. Early NMT was mainly based on recurrent neural networks (RNNs), whose training depends at each time step on the computation of the previous step, making it hard to parallelize across time steps for faster training. Therefore non-RNN NMT architectures emerged, such as structures based on convolutional neural networks (CNNs) and on self-attention.
-
-The Transformer implemented in this example is a machine translation model based on self-attention: it contains no RNN or CNN structure and learns contextual dependencies entirely through attention. Compared with RNNs/CNNs, this structure has lower computational complexity within a single layer, is easier to parallelize and models long-range dependencies more easily, and it ultimately achieved the best translation quality across several languages.
-
-- [Transformer](https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleNLP/neural_machine_translation/transformer/README_cn.md)
-
-Reinforcement Learning
---------
-
-Reinforcement learning has become an increasingly important direction of machine learning in recent years; in particular, its combination with deep learning, deep reinforcement learning (DRL), has achieved many astonishing results. AlphaGo, well known for defeating top professional Go players, is a typical application of DRL; beyond games, other applications include robotics and natural language processing.
-
-The seminal work of DRL was its successful application to Atari video games: the model takes high-dimensional video frames as direct input and predicts the next action end to end from the image content; it is known as the Deep Q-Network (DQN). This example uses the flexible PaddlePaddle Fluid framework to implement DQN and its variants and tests them on Atari games.
-
-- [DeepQNetwork](https://github.com/PaddlePaddle/models/blob/develop/fluid/DeepQNetwork/README_cn.md)
-
-Chinese Lexical Analysis
-------------
-
-Chinese word segmentation splits continuous natural-language text into word sequences that are semantically reasonable and complete. In Chinese the word is the most basic unit that carries meaning, so segmentation is the foundation of many NLP tasks such as text classification, sentiment analysis and information retrieval. Part-of-speech tagging assigns a part of speech to every word in a text, including nouns, verbs, adjectives, adverbs and so on. Named entity recognition (NER), also called "proper-name recognition", identifies entities with specific meaning in text, mainly person names, place names, organization names and other proper nouns. We unify these three tasks into one joint task, called lexical analysis, and provide an end-to-end solution based on deep neural networks trained on a massive annotated corpus.
-
-We name this joint Chinese lexical analysis solution LAC. LAC can be read as an acronym of Lexical Analysis of Chinese, or as a recursive acronym of LAC Analyzes Chinese.
-
-- [LAC](https://github.com/baidu/lac/blob/master/README.md)
-
-Sentiment Analysis
-------------
-
-Sentiment analysis automatically determines the sentiment polarity of subjective Chinese text and gives a corresponding confidence. The sentiment types are positive, negative and neutral. Sentiment analysis helps companies understand users' consumption habits, analyze hot topics and monitor public-opinion crises, providing strong decision support. Here we release the [model](http://ai.baidu.com/tech/nlp/sentiment_classify) used for sentiment analysis on Baidu's AI open platform for users to use.
-
-- [Senta](https://github.com/baidu/Senta/blob/master/README.md)
-
-Semantic Matching
---------
-
-Many NLP scenarios need to measure how similar two texts are semantically; such tasks are usually called semantic matching. Examples include ranking search results by the similarity between the query and candidate documents, computing text-to-text similarity for deduplication, and matching candidate answers to questions in automatic question answering.
-
-The DAM (Deep Attention Matching Network) released here is work by Baidu's NLP department published at ACL 2018, used for response selection in the multi-turn dialogue of retrieval-based chatbots. Inspired by the Transformer, DAM's network structure is based entirely on attention: it uses stacked self-attention to learn semantic representations of responses and contexts at multiple granularities, then uses cross-attention to capture the relevance between response and context. It outperformed other models on two large-scale multi-turn dialogue datasets.
-
-- [Deep Attention Matching Network](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleNLP/deep_attention_matching_net)
-
-AnyQ
-----
-
-The [AnyQ](https://github.com/baidu/AnyQ) (ANswer Your Questions) open-source project mainly contains a question-answering framework for FAQ collections and the text semantic matching tool SimNet. The QA framework uses a configurable, plugin-based design in which every function is added as a plugin; more than 20 plugins are currently available. Developers can use AnyQ to quickly build and customize FAQ QA systems for specific business scenarios and to speed up iteration and upgrades.
-
-SimNet is a semantic matching framework developed in-house by Baidu's NLP department in 2013 and widely used across Baidu products. It mainly includes core network structures such as BOW, CNN, RNN and MM-DNN, and on top of the framework it also integrates mainstream academic semantic matching models such as MatchPyramid, MV-LSTM and K-NRM. Models built with SimNet can easily be added to AnyQ to strengthen its semantic matching capability.
-
-- [SimNet in PaddlePaddle Fluid](https://github.com/baidu/AnyQ/blob/master/tools/simnet/train/paddle/README.md)
-
-Machine Reading Comprehension
-----
-
-Machine reading comprehension (MRC) is one of the core tasks in natural language processing (NLP): its ultimate goal is to have machines read text like humans, extract the information in it and answer related questions. Deep learning has been widely adopted in NLP in recent years and has greatly improved machine reading comprehension, but current MRC research still uses artificially constructed datasets and answers relatively simple questions, leaving a clear gap to the data humans handle, so large-scale real-world training data is urgently needed to push MRC forward.
-
-The Baidu reading comprehension dataset, open-sourced by Baidu's NLP department, is a real-world dataset: all questions and passages come from actual data (Baidu search and the Baidu Zhidao QA community), and the answers were written by humans. Every question corresponds to multiple answers; the dataset contains 200k questions, 1000k passages and 420k answers, making it the largest Chinese MRC dataset to date. Baidu has also open-sourced the corresponding reading comprehension model, DuReader, which uses a common layered network structure, captures the interaction between question and passage with a bidirectional attention mechanism to produce query-aware passage representations, and finally predicts the answer span from those representations with a pointer network.
-
-- [DuReader in PaddlePaddle Fluid](https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleNLP/machine_reading_comprehension/README.md)
-
-Personalized Recommendation
--------
-
-Recommender systems play an ever-larger role in today's internet services: most e-commerce systems, social networks, ad recommendation and search engines use some form of personalized recommendation to help users quickly find what they want.
-
-In industrial recommender systems, the recommendation strategy is usually divided into multiple modules executed in series. Taking a news recommender as an example, there are several stages where deep learning can be applied, such as automatic news tagging, personalized news recall, and personalized matching and ranking. PaddlePaddle provides complete support for training recommendation algorithms and offers a variety of model configurations to choose from.
-
-- [TagSpace](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleRec/tagspace)
-- [GRU4Rec](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleRec/gru4rec)
-- [SequenceSemanticRetrieval](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleRec/ssr)
-- [DeepCTR](https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleRec/ctr/README.cn.md)
-- [Multiview-Simnet](https://github.com/PaddlePaddle/models/tree/develop/fluid/PaddleRec/multiview_simnet)
diff --git a/fluid/adversarial/README.md b/fluid/adversarial/README.md
index 91661f7e1675d59c7d38c4c09bc67d5b9339573d..b43046d174c6fa7cc9517c043601d5a86e53604a 100644
--- a/fluid/adversarial/README.md
+++ b/fluid/adversarial/README.md
@@ -1,112 +1,6 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is the latest develop branch. If you are on a version of PaddlePaddle earlier than this, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
----
+Hi!
-# Advbox
+This directory has been deprecated.
-Advbox is a toolbox for generating adversarial examples that fool neural networks, and it can also benchmark the robustness of machine learning models.
-
-Advbox is based on [PaddlePaddle](https://github.com/PaddlePaddle/Paddle) Fluid and is under continual development, always welcoming contributions of the latest methods of adversarial attack and defense.
-
-
-## Overview
-[Szegedy et al.](https://arxiv.org/abs/1312.6199) first discovered an intriguing property of deep neural networks in the context of image classification: despite their state-of-the-art performance, deep networks are surprisingly susceptible to adversarial attacks in the form of small perturbations of an image that remain (almost) imperceptible to the human visual system. Such perturbations are found by optimizing the input to maximize the prediction error, and the resulting modified images are called `adversarial examples`. The profound implications of these results triggered wide interest among researchers in adversarial attacks and defenses for deep learning in general.
-
-Advbox is similar to [Foolbox](https://github.com/bethgelab/foolbox) and [CleverHans](https://github.com/tensorflow/cleverhans). CleverHans only supports the TensorFlow framework, while Foolbox interfaces with many popular machine learning frameworks such as PyTorch, Keras, TensorFlow, Theano, Lasagne and MXNet. However, neither of these two great libraries supports PaddlePaddle, an easy-to-use, efficient, flexible and scalable deep learning platform originally developed by Baidu scientists and engineers for applying deep learning to many products at Baidu.
-
-## Usage
-Advbox provides many stable reference implementations of modern methods for generating adversarial examples, such as FGSM, DeepFool and JSMA. When you want to benchmark the robustness of your neural networks, you can use Advbox to generate adversarial examples and evaluate the networks against them. Some tips for using Advbox:
-
-1. Train a model and save its parameters.
-2. Load the trained parameters, then reconstruct the model.
-3. Use Advbox to generate adversarial samples.
-
-
-#### Dependencies
-* PaddlePaddle: [the latest develop branch](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html)
-* Python 2.x
-
-#### Structure
-
-Network models, implementations of attack methods, and the criterion that defines an adversarial example are the three essential elements for generating adversarial examples. For brevity, Advbox adopts misclassification as the adversarial criterion.
-
-The structure of the Advbox module is as follows:
-
- .
- ├── advbox
- | ├── __init__.py
- | ├── attack
- | ├── __init__.py
- | ├── base.py
- | ├── deepfool.py
- | ├── gradient_method.py
- | ├── lbfgs.py
- | └── saliency.py
- | ├── models
- | ├── __init__.py
- | ├── base.py
- | └── paddle.py
- | └── adversary.py
- ├── tutorials
- | ├── __init__.py
- | ├── mnist_model.py
- | ├── mnist_tutorial_lbfgs.py
- | ├── mnist_tutorial_fgsm.py
- | ├── mnist_tutorial_bim.py
- | ├── mnist_tutorial_ilcm.py
- | ├── mnist_tutorial_mifgsm.py
- | ├── mnist_tutorial_jsma.py
- | └── mnist_tutorial_deepfool.py
- └── README.md
-
-**advbox.attack**
-
-Advbox implements several popular adversarial attacks that search for adversarial examples. Each attack method uses a distance measure (L1, L2, etc.) to quantify the size of the adversarial perturbation. Advbox makes it easy to craft adversarial examples, as some attack methods perform internal hyperparameter tuning to find the minimum perturbation.
-
-**advbox.model**
-
-Advbox implements interfaces to PaddlePaddle. Additionally, interfaces to other deep learning frameworks such as TensorFlow can also be defined and employed. This module is used to compute predictions and gradients for given inputs in a specific framework.
-
-**advbox.adversary**
-
-Adversary contains the original object, the target, and the adversarial examples. It provides misclassification as the criterion for accepting an adversarial example.
-
-## Tutorials
-The `./tutorials/` folder provides tutorials for generating adversarial examples on the MNIST dataset. You can slightly modify the code to apply it to other datasets. These attack methods are supported in Advbox (a sketch of the FGSM update follows the list):
-
-* [L-BFGS](https://arxiv.org/abs/1312.6199)
-* [FGSM](https://arxiv.org/abs/1412.6572)
-* [BIM](https://arxiv.org/abs/1607.02533)
-* [ILCM](https://arxiv.org/abs/1607.02533)
-* [MI-FGSM](https://arxiv.org/pdf/1710.06081.pdf)
-* [JSMA](https://arxiv.org/pdf/1511.07528)
-* [DeepFool](https://arxiv.org/abs/1511.04599)
-
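-The FGSM update is representative of these gradient-based attacks (a minimal NumPy sketch of the method from Goodfellow et al., not of Advbox's API; `grad` is assumed to be the gradient of the loss w.r.t. the input):
-
-```python
-import numpy as np
-
-def fgsm(x, grad, epsilon=0.3):
-    # x_adv = x + epsilon * sign(dL/dx), clipped to the valid input range
-    return np.clip(x + epsilon * np.sign(grad), 0.0, 1.0)
-```
-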
-## Testing
-Benchmarks on a vanilla CNN model.
-
-> MNIST
-
-| adversarial attacks | fooling rate (non-targeted) | fooling rate (targeted) | max_epsilon | iterations | Strength |
-|:-----:| :----: | :---: | :----: | :----: | :----: |
-|L-BFGS| --- | 89.2% | --- | One shot | *** |
-|FGSM| 57.8% | 26.55% | 0.3 | One shot| *** |
-|BIM| 97.4% | --- | 0.1 | 100 | **** |
-|ILCM| --- | 100.0% | 0.1 | 100 | **** |
-|MI-FGSM| 94.4% | 100.0% | 0.1 | 100 | **** |
-|JSMA| 96.8% | 90.4%| 0.1 | 2000 | *** |
-|DeepFool| 97.7% | 51.3% | --- | 100 | **** |
-
-* The strength rating (more asterisks means a stronger attack) is based on our impression of the reviewed literature.
-
----
-## References
-* [Intriguing properties of neural networks](https://arxiv.org/abs/1312.6199), C. Szegedy et al., arxiv 2014
-* [Explaining and Harnessing Adversarial Examples](https://arxiv.org/abs/1412.6572), I. Goodfellow et al., ICLR 2015
-* [Adversarial Examples In The Physical World](https://arxiv.org/pdf/1607.02533v3.pdf), A. Kurakin et al., ICLR workshop 2017
-* [Boosting Adversarial Attacks with Momentum](https://arxiv.org/abs/1710.06081), Yinpeng Dong et al., arxiv 2018
-* [The Limitations of Deep Learning in Adversarial Settings](https://arxiv.org/abs/1511.07528), N. Papernot et al., ESSP 2016
-* [DeepFool: a simple and accurate method to fool deep neural networks](https://arxiv.org/abs/1511.04599), S. Moosavi-Dezfooli et al., CVPR 2016
-* [Foolbox: A Python toolbox to benchmark the robustness of machine learning models](https://arxiv.org/abs/1707.04131), Jonas Rauber et al., arxiv 2018
-* [CleverHans: An adversarial example library for constructing attacks, building defenses, and benchmarking both](https://github.com/tensorflow/cleverhans#setting-up-cleverhans)
-* [Threat of Adversarial Attacks on Deep Learning in Computer Vision: A Survey](https://arxiv.org/abs/1801.00553), Naveed Akhtar, Ajmal Mian, arxiv 2018
+Please visit the project at [PaddleCV/adversarial](../../PaddleCV/adversarial).
diff --git a/fluid/mnist/.run_ce.sh b/fluid/mnist/.run_ce.sh
deleted file mode 100755
index d6ccf429b52da1ff26ac02df5af287461a823a98..0000000000000000000000000000000000000000
--- a/fluid/mnist/.run_ce.sh
+++ /dev/null
@@ -1,7 +0,0 @@
-#!/bin/bash
-
-# This file is only used for continuous evaluation.
-
-rm -rf *_factor.txt
-model_file='model.py'
-python $model_file --batch_size 128 --pass_num 5 --device CPU | python _ce.py
diff --git a/fluid/mnist/_ce.py b/fluid/mnist/_ce.py
deleted file mode 100644
index 9c2dba53526d2e976252fce05c7ff7f0f44b39b2..0000000000000000000000000000000000000000
--- a/fluid/mnist/_ce.py
+++ /dev/null
@@ -1,61 +0,0 @@
-# this file is only used for continuous evaluation test!
-
-import os
-import sys
-sys.path.append(os.environ['ceroot'])
-from kpi import CostKpi, DurationKpi, AccKpi
-
-# NOTE: kpi.py should be shared across models in some way!!!!
-
-train_cost_kpi = CostKpi('train_cost', 0.02, actived=True)
-test_acc_kpi = AccKpi('test_acc', 0.005, actived=True)
-train_duration_kpi = DurationKpi('train_duration', 0.06, actived=True)
-train_acc_kpi = AccKpi('train_acc', 0.005, actived=True)
-
-tracking_kpis = [
- train_acc_kpi,
- train_cost_kpi,
- test_acc_kpi,
- train_duration_kpi,
-]
-
-
-def parse_log(log):
- '''
- This method should be implemented by model developers.
-
- The suggestion:
-
-    each KPI line in the log should be "kpis", key, value, separated by tabs,
-    for example:
-
-    "
-    kpis\ttrain_cost\t1.0
-    kpis\ttest_acc\t0.99
-    kpis\ttrain_duration\t1.2
-    "
- '''
- for line in log.split('\n'):
- fs = line.strip().split('\t')
- print(fs)
- if len(fs) == 3 and fs[0] == 'kpis':
- kpi_name = fs[1]
- kpi_value = float(fs[2])
- yield kpi_name, kpi_value
-
-
-def log_to_ce(log):
- kpi_tracker = {}
- for kpi in tracking_kpis:
- kpi_tracker[kpi.name] = kpi
-
- for (kpi_name, kpi_value) in parse_log(log):
- print(kpi_name, kpi_value)
- kpi_tracker[kpi_name].add_record(kpi_value)
- kpi_tracker[kpi_name].persist()
-
-
-if __name__ == '__main__':
- log = sys.stdin.read()
- log_to_ce(log)
diff --git a/fluid/mnist/model.py b/fluid/mnist/model.py
deleted file mode 100644
index a66353c2239fd78eb1fdf9f08690994a9a7d1c08..0000000000000000000000000000000000000000
--- a/fluid/mnist/model.py
+++ /dev/null
@@ -1,198 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import numpy as np
-import argparse
-import cProfile
-import time
-
-import paddle
-import paddle.fluid as fluid
-import paddle.fluid.profiler as profiler
-import six
-
-SEED = 90
-DTYPE = "float32"
-
-# The random seed must be set before configuring the network.
-fluid.default_startup_program().random_seed = SEED
-
-
-def parse_args():
- parser = argparse.ArgumentParser("mnist model benchmark.")
- parser.add_argument(
- '--batch_size', type=int, default=128, help='The minibatch size.')
- parser.add_argument(
- '--iterations', type=int, default=35, help='The number of minibatches.')
- parser.add_argument(
- '--pass_num', type=int, default=5, help='The number of passes.')
- parser.add_argument(
- '--device',
- type=str,
- default='GPU',
- choices=['CPU', 'GPU'],
- help='The device type.')
- parser.add_argument(
- '--infer_only', action='store_true', help='If set, run forward only.')
- parser.add_argument(
- '--use_cprof', action='store_true', help='If set, use cProfile.')
- parser.add_argument(
- '--use_nvprof',
- action='store_true',
- help='If set, use nvprof for CUDA.')
- args = parser.parse_args()
- return args
-
-
-def print_arguments(args):
- vars(args)['use_nvprof'] = (vars(args)['use_nvprof'] and
- vars(args)['device'] == 'GPU')
- print('----------- Configuration Arguments -----------')
- for arg, value in sorted(six.iteritems(vars(args))):
- print('%s: %s' % (arg, value))
- print('------------------------------------------------')
-
-
-def cnn_model(data):
- conv_pool_1 = fluid.nets.simple_img_conv_pool(
- input=data,
- filter_size=5,
- num_filters=20,
- pool_size=2,
- pool_stride=2,
- act="relu")
- conv_pool_2 = fluid.nets.simple_img_conv_pool(
- input=conv_pool_1,
- filter_size=5,
- num_filters=50,
- pool_size=2,
- pool_stride=2,
- act="relu")
-
-    # TODO(dzhwinter): refine the initializer and random seed setting
- SIZE = 10
- input_shape = conv_pool_2.shape
- param_shape = [six.moves.reduce(lambda a, b: a * b, input_shape[1:], 1)
- ] + [SIZE]
- scale = (2.0 / (param_shape[0]**2 * SIZE))**0.5
-
- predict = fluid.layers.fc(
- input=conv_pool_2,
- size=SIZE,
- act="softmax",
- param_attr=fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.NormalInitializer(
- loc=0.0, scale=scale)))
- return predict
-
-
-def eval_test(exe, batch_acc, batch_size_tensor, inference_program):
- test_reader = paddle.batch(
- paddle.dataset.mnist.test(), batch_size=args.batch_size)
- test_pass_acc = fluid.average.WeightedAverage()
- for batch_id, data in enumerate(test_reader()):
- img_data = np.array(
- [x[0].reshape([1, 28, 28]) for x in data]).astype(DTYPE)
- y_data = np.array([x[1] for x in data]).astype("int64")
- y_data = y_data.reshape([len(y_data), 1])
-
- acc, weight = exe.run(inference_program,
- feed={"pixel": img_data,
- "label": y_data},
- fetch_list=[batch_acc, batch_size_tensor])
- test_pass_acc.add(value=acc, weight=weight)
- pass_acc = test_pass_acc.eval()
- return pass_acc
-
-
-def run_benchmark(model, args):
- if args.use_cprof:
- pr = cProfile.Profile()
- pr.enable()
- start_time = time.time()
- # Input data
- images = fluid.layers.data(name='pixel', shape=[1, 28, 28], dtype=DTYPE)
- label = fluid.layers.data(name='label', shape=[1], dtype='int64')
-
- # Train program
- predict = model(images)
- cost = fluid.layers.cross_entropy(input=predict, label=label)
- avg_cost = fluid.layers.mean(x=cost)
-
- # Evaluator
- batch_size_tensor = fluid.layers.create_tensor(dtype='int64')
- batch_acc = fluid.layers.accuracy(
- input=predict, label=label, total=batch_size_tensor)
-
- # inference program
- inference_program = fluid.default_main_program().clone(for_test=True)
-
- # Optimization
- opt = fluid.optimizer.AdamOptimizer(
- learning_rate=0.001, beta1=0.9, beta2=0.999)
- opt.minimize(avg_cost)
-
- fluid.memory_optimize(fluid.default_main_program())
-
- # Initialize executor
- place = fluid.CPUPlace() if args.device == 'CPU' else fluid.CUDAPlace(0)
- exe = fluid.Executor(place)
-
- # Parameter initialization
- exe.run(fluid.default_startup_program())
-
- # Reader
- train_reader = paddle.batch(
- paddle.dataset.mnist.train(), batch_size=args.batch_size)
-
- accuracy = fluid.average.WeightedAverage()
- for pass_id in range(args.pass_num):
- accuracy.reset()
- pass_start = time.time()
- every_pass_loss = []
- for batch_id, data in enumerate(train_reader()):
- img_data = np.array(
- [x[0].reshape([1, 28, 28]) for x in data]).astype(DTYPE)
- y_data = np.array([x[1] for x in data]).astype("int64")
- y_data = y_data.reshape([len(y_data), 1])
-
- start = time.time()
- loss, acc, weight = exe.run(
- fluid.default_main_program(),
- feed={"pixel": img_data,
- "label": y_data},
- fetch_list=[avg_cost, batch_acc, batch_size_tensor]
- ) # The accuracy is the accumulation of batches, but not the current batch.
- end = time.time()
- accuracy.add(value=acc, weight=weight)
- every_pass_loss.append(loss)
- print("Pass = %d, Iter = %d, Loss = %f, Accuracy = %f" %
- (pass_id, batch_id, loss, acc))
-
- pass_end = time.time()
-
- train_avg_acc = accuracy.eval()
- train_avg_loss = np.mean(every_pass_loss)
- test_avg_acc = eval_test(exe, batch_acc, batch_size_tensor,
- inference_program)
-
- print(
- "pass=%d, train_avg_acc=%f,train_avg_loss=%f, test_avg_acc=%f, elapse=%f"
- % (pass_id, train_avg_acc, train_avg_loss, test_avg_acc,
- (pass_end - pass_start)))
-        # Note: the following logs are used for CE monitoring only; they are
-        # tab-separated and prefixed with "kpis" so that _ce.py's parse_log
-        # can pick them up. Other situations do not need to care about them.
-        print("kpis\ttrain_acc\t%f" % train_avg_acc)
-        print("kpis\ttrain_cost\t%f" % train_avg_loss)
-        print("kpis\ttest_acc\t%f" % test_avg_acc)
-        print("kpis\ttrain_duration\t%f" % (pass_end - pass_start))
-
-
-if __name__ == '__main__':
- args = parse_args()
- print_arguments(args)
- if args.use_nvprof and args.device == 'GPU':
- with profiler.cuda_profiler("cuda_profiler.txt", 'csv') as nvprof:
- run_benchmark(cnn_model, args)
- else:
- run_benchmark(cnn_model, args)
diff --git a/fluid/policy_gradient/README.md b/fluid/policy_gradient/README.md
index b813aa124466597adfb80261bee7c2de22b95e67..b6ac95d0fba6bbb7552671fbc6e80d052a648045 100644
--- a/fluid/policy_gradient/README.md
+++ b/fluid/policy_gradient/README.md
@@ -1,171 +1,2 @@
-Running the program sample in this directory requires the latest develop branch of PaddlePaddle. If your installed version of PaddlePaddle is older than this requirement, please update it following the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html).
----
-
-# Policy Gradient RL by PaddlePaddle
-This article shows how to use PaddlePaddle to train a player (an actor model) with a policy-based reinforcement learning method; we want this player to learn to complete a simple stair-climbing task.
-
- The content covers:
-
- - Task description
- - Model
- - Policy (objective function)
- - Algorithm (gradient ascent)
- - PaddlePaddle implementation
-
-
-## 1. Task description
-Suppose a staircase connects point A and point B. The player starts at A and at each step may move only one stair forward or one stair backward; reaching B completes the task. We want to train a smart player that knows how to get from A to B as fast as possible.
-We simulate the task on the command line in the following form:
-```
-A - O - - - - - B
-```
-Each '-' is a stair, A is at the head of the line, B at the end, and O marks the player's current position.
-
-## 2. Policy Gradient
-### 2.1 Model
-#### input layer
-The input to the model is the state $S$ of the staircase as observed by the player; it must encode the length of the staircase and the player's current position.
-In the command-line simulation, two variables (the player's position and the staircase length) would suffice to describe the current state, but to make this demo easier to generalize to more complex task settings we represent the game state $S$ as a vector.
-The length of the vector $S$ equals the length of the staircase; each dimension corresponds to one stair, with a 1 at the player's position and 0 everywhere else.
-An example:
-```
-S = [0, 1, 0, 0] // the staircase has length 4 and the player is on the second stair
-```
-#### hidden layer
-The hidden part uses two fully connected layers, `FC_1` and `FC_2`, where `FC_1` has size 10 and `FC_2` has size 2.
-
-#### output layer
-We apply softmax to the output of `FC_2` to obtain a probability distribution over the possible actions (backward or forward), i.e. a two-dimensional vector `act_probs`, where `act_probs[0]` is the probability of moving backward and `act_probs[1]` the probability of moving forward.
-
-#### Model formalization
-We formalize our player model (the actor) as
-$$a = \pi_\theta(s)$$
-where $\theta$ denotes the model parameters and $s$ is the input state.
-
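-A minimal Fluid sketch of this network (illustrative only; the text does not specify an activation for `FC_1`, so the `tanh` here is an assumption):
-
-```python
-import paddle.fluid as fluid
-
-# state vector S, one slot per stair (length 4 as in the example above)
-state = fluid.layers.data(name="state", shape=[4], dtype="float32")
-fc_1 = fluid.layers.fc(input=state, size=10, act="tanh")  # FC_1, size 10
-act_probs = fluid.layers.fc(input=fc_1, size=2, act="softmax")  # FC_2 + softmax
-```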
-
-### 2.2 Policy (objective function)
-How do we evaluate whether a player (model) is good or bad? First, let us define some terms:
-we let $\pi_\theta(s)$ play one episode of the game, where $s_t$ is the state at step $t$, $a_t$ is the action taken in state $s_t$, and $r_t$ is the reward obtained after taking action $a_t$.
-The course of one episode can be written as
-$$\tau = [s_1, a_1, r_1, s_2, a_2, r_2 ... s_T, a_T, r_T] \tag{1}$$
-
-and the reward of one episode as
-$$R(\tau) = \sum_{t=1}^T r_t$$
-
-When the player plays an episode, many different action sequences $\tau$ can occur, and the probability of a given $\tau$ depends on the player model's $\theta$; we write it as
-$$P(\tau | \theta)$$
-Then, given a $\theta$ (a player model), the expected reward of playing one episode is
-$$\overline{R}_\theta = \sum_\tau R(\tau) P(\tau|\theta)$$
-In most cases we cannot enumerate every $\tau$, so we sample N of them to approximate the expectation:
-$$\overline{R}_\theta = \sum_\tau R(\tau) P(\tau|\theta) \approx \frac{1}{N} \sum_{n=1}^N R(\tau^n)$$
-
-$\overline{R}_\theta$ is the objective function we need: it is the expected score of one episode played by a player with parameters $\theta$, and the larger this expectation, the stronger the player.
-### 2.3 Algorithm (gradient ascent)
-Our objective function is $\overline{R}_\theta$, so the training task is:
-$$\theta^* = \arg\max_\theta \overline{R}_\theta$$
-
-To find the ideal $\theta$, we use gradient ascent and keep updating $\theta$ along the gradient of $\overline{R}_\theta$:
-$$\theta' = \theta + \eta * \bigtriangledown \overline{R}_\theta$$
-
-$$ \bigtriangledown \overline{R}_\theta = \sum_\tau R(\tau) \bigtriangledown P(\tau|\theta)\\
-= \sum_\tau R(\tau) P(\tau|\theta) \frac{\bigtriangledown P(\tau|\theta)}{P(\tau|\theta)} \\
-=\sum_\tau R(\tau) P(\tau|\theta) {\bigtriangledown \log P(\tau|\theta)} $$
-
-
-$$P(\tau|\theta) = P(s_1)P(a_1|s_1,\theta)P(s_2, r_1|s_1,a_1)P(a_2|s_2,\theta)P(s_3,r_2|s_2,a_2)...P(a_T|s_T,\theta)P(s_{T+1}, r_T|s_T,a_T)\\
-=P(s_1) \prod_{t=1}^T P(a_t|s_t,\theta)P(s_{t+1}, r_t|s_t,a_t)$$
-
-$$\log P(\tau|\theta) = \log P(s_1) + \sum_{t=1}^T [\log P(a_t|s_t,\theta) + \log P(s_{t+1}, r_t|s_t,a_t)]$$
-
-$$ \bigtriangledown \log P(\tau|\theta) = \sum_{t=1}^T \bigtriangledown \log P(a_t|s_t,\theta)$$
-
-$$ \bigtriangledown \overline{R}_\theta = \sum_\tau R(\tau) P(\tau|\theta) {\bigtriangledown \log P(\tau|\theta)} \\
-\approx \frac{1}{N} \sum_{n=1}^N R(\tau^n) {\bigtriangledown \log P(\tau^n|\theta)} \\
-= \frac{1}{N} \sum_{n=1}^N R(\tau^n) {\sum_{t=1}^T \bigtriangledown \log P(a_t|s_t,\theta)} \\
-= \frac{1}{N} \sum_{n=1}^N \sum_{t=1}^T R(\tau^n) { \bigtriangledown \log P(a_t|s_t,\theta)} \tag{11}$$
-
-#### 2.3.2 Interpreting the derivative
-
-Deep learning frameworks usually train by gradient descent, so we convert gradient ascent into gradient descent and rewrite equations $(5)(6)$ as:
-
-$$\theta^* = \arg\min_\theta (-\overline{R}_\theta) \tag{13}$$
-$$\theta' = \theta - \eta * \bigtriangledown (-\overline{R}_\theta) \tag{14}$$
-
-Following the derivation of the previous section, $-\bigtriangledown \overline{R}_\theta$ works out to:
-
-$$ -\bigtriangledown \overline{R}_\theta
-= \frac{1}{N} \sum_{n=1}^N \sum_{t=1}^T R(\tau^n) { \bigtriangledown [-\log P(a_t|s_t,\theta)]} \tag{15}$$
-
-Based on equation (14), our player model can be designed as:
-
-*Figure 1*
-
-A single move by the player within one episode can be written as the tuple $(s_t, a_t)$: in state $s_t$ the player took action $a_t$. The forward pass of the network in Figure 1 computes the cross-entropy cost $-\log P(a_t|s_t,\theta)$, which is exactly the term we need to differentiate in equation (15).
-Figure 1 gives the player model we need: its forward computation predicts what action to take in any state. But how do we train this network? Equation (15) contains one more factor, $R(\tau^n)$, which must be included when backpropagating gradients, so we add $R(\tau^n)$ on top of Figure 1, as shown in Figure 2:
-
-*Figure 2*
-
-Figure 2 is our final network structure.
-
-#### 2.3.3 An intuitive view
-Consider just one move of the game in equation (15), i.e. the term $R(\tau^n) { \bigtriangledown [-\log P(a_t|s_t,\theta)]}$. We can loosely think of training as making $R(\tau^n) {[ -\log P(a_t|s_t,\theta)]}$ as small as possible, that is, making $R(\tau^n) \log P(a_t|s_t,\theta)$ as large as possible. Therefore:
-
-- If the reward $R(\tau^n)$ of the current episode is positive, we want the probability $P(a_t|s_t,\theta)$ of the actions taken to be as large as possible.
-- If the reward $R(\tau^n)$ of the current episode is negative, we want the probability $P(a_t|s_t,\theta)$ of the actions taken to be as small as possible.
-
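-In code, this per-episode objective is just a reward-weighted sum of cross-entropy terms (a minimal NumPy sketch of equation (15) for a single episode, not the PaddlePaddle implementation; `act_probs[t]` is assumed to be the probability the model assigned to the action actually taken at step t):
-
-```python
-import numpy as np
-
-def episode_loss(act_probs, reward):
-    # R(tau) * sum_t -log P(a_t | s_t, theta), cf. equation (15)
-    return reward * np.sum(-np.log(act_probs))
-```
-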
-#### 2.3.4 A problem
-
-When one person commits a crime, the whole clan is punished; when one attains the Way, even the chickens and dogs ascend to heaven. If an episode earns a reward, we want every action that contributed to it to be valued; otherwise, every action that led to the penalty is discouraged.
-Sounds plausible, doesn't it? But what about game scenarios that only hand out rewards and never penalties, so that every $R(\tau^n)$ is positive?
-Different scenarios call for different remedies:
-
-1. Episodes score differently: subtract a bias from each episode's score, so the results become both positive and negative.
-2. Every episode scores the same: make the completion time part of the score, and subtract a bias.
-
-The game scenario described in chapter 1 needs the second remedy: the player receives a reward of 1 every time it reaches the goal, so we can define the reward R in terms of the number of steps needed to finish the task.
-Going further, we believe the moves within one episode do not contribute equally to the outcome: there are smart moves and foolish ones. Intuitively, earlier moves tend to be foolish while later moves tend to be smart. Given this view, the reward of 1 should not be split evenly among all moves.
-As shown in Figure 3, we line up all actions in order, give each action a reward that decays from the last action towards the first, and then subtract from each action's reward the mean reward of all actions (a sketch follows the figure):
-
-
-*Figure 3*
-
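-A minimal sketch of this reward shaping (illustrative only; the decay factor `gamma` is an assumption, since the text does not fix one):
-
-```python
-import numpy as np
-
-def shaped_rewards(num_steps, gamma=0.99):
-    # decay the final reward of 1 from the last action back to the first,
-    # then subtract the mean over all actions as a baseline
-    r = np.array([gamma ** (num_steps - 1 - t) for t in range(num_steps)])
-    return r - r.mean()
-```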
-
-## 3. Training results
-
-Running the demo trains as follows; after about 1000 episodes of trial and error, our player has learned how to complete the task efficiently:
-
-```
----------O epoch: 0; steps: 42
----------O epoch: 1; steps: 77
----------O epoch: 2; steps: 82
----------O epoch: 3; steps: 64
----------O epoch: 4; steps: 79
----------O epoch: 501; steps: 19
----------O epoch: 1001; steps: 9
----------O epoch: 1501; steps: 9
----------O epoch: 2001; steps: 11
----------O epoch: 2501; steps: 9
----------O epoch: 3001; steps: 9
----------O epoch: 3002; steps: 9
----------O epoch: 3003; steps: 9
----------O epoch: 3004; steps: 9
----------O epoch: 3005; steps: 9
----------O epoch: 3006; steps: 9
----------O epoch: 3007; steps: 9
----------O epoch: 3008; steps: 9
----------O epoch: 3009; steps: 9
----------O epoch: 3010; steps: 11
----------O epoch: 3011; steps: 9
----------O epoch: 3012; steps: 9
----------O epoch: 3013; steps: 9
----------O epoch: 3014; steps: 9
-```
+Hi! This project has been migrated. Please visit it at [PaddleRL/policy_gradient](../../PaddleRL/policy_gradient).
diff --git a/legacy/README.md b/legacy/README.md
index f7741c9b7b1c39e569e74606d054847b27a206d8..f0719c1a26c04341e8de327143dc826248bb3607 100644
--- a/legacy/README.md
+++ b/legacy/README.md
@@ -1,3 +1,6 @@
+
+# The models in this directory are no longer maintained and are not recommended for use. Please use the models under the fluid directory instead.
+
# Introduction to models
[](https://github.com/PaddlePaddle/models)