diff --git a/AutoDL/HiNAS_models/README.md b/AutoDL/HiNAS_models/README.md
deleted file mode 100755
index 9c67736aa30643baf72ce42ed2ca3321d4e22165..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/README.md
+++ /dev/null
@@ -1,76 +0,0 @@
-# Image Classification Models
-This directory contains six image classification models discovered automatically by the Baidu Big Data Lab (BDL) Hierarchical Neural Architecture Search (HiNAS) project, achieving 96.1% accuracy on the CIFAR-10 dataset. The models fall into two categories: the first three have no skip links and are named HiNAS 0-2; the last three contain skip links, similar to the shortcut connections in ResNet, and are named HiNAS 3-5.
-
----
-## Table of Contents
-- [Installation](#installation)
-- [Data preparation](#data-preparation)
-- [Training a model](#training-a-model)
-- [Model performances](#model-performances)
-
-## Installation
-Running the trainer in this directory requires:
-
-- PaddlePaddle Fluid >= v0.15.0
-- CuDNN >= 6.0
-
-If the PaddlePaddle and CuDNN versions in your runtime environment do not meet these requirements, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) and update them.
-
-## Data preparation
-
-When you run the sample code for the first time, the trainer will automatically download the CIFAR-10 dataset. Please make sure your environment has an internet connection.
-
-The dataset will be downloaded to `dataset/cifar/cifar-10-python.tar.gz` in the same directory as the trainer. If the automatic download fails, you can download cifar-10-python.tar.gz from https://www.cs.toronto.edu/~kriz/cifar.html and place it at the location mentioned above.
-
-## Training a model
-
-Once the environment is ready, you can train a model. There are two entry points: `train_hinas.py` and `train_hinas_res.py`. The former trains Models 0-2 (without skip links); the latter trains Models 3-5 (with skip links).
-
-Train Models 0-2 (without skip links):
-```
-python train_hinas.py --model=m_id # m_id can be 0, 1 or 2.
-```
-Train Models 3-5 (with skip links):
-```
-python train_hinas_res.py --model=m_id # m_id can be 0, 1 or 2.
-```
-
-In addition, both `train_hinas.py` and `train_hinas_res.py` support the following flags (see the example run below):
-
-- **random_flip_left_right**: Randomly flip images horizontally. (Default: True)
-- **random_flip_up_down**: Randomly flip images vertically. (Default: False)
-- **cutout**: Apply cutout to images. (Default: True)
-- **standardize_image**: Standardize each image. (Default: True)
-- **pad_and_cut_image**: Randomly pad images, then crop them back to the original size. (Default: True)
-- **shuffle_image**: Shuffle the order of input images during training. (Default: True)
-- **lr_max**: Learning rate at the beginning of training. (Default: 0.1)
-- **lr_min**: Learning rate at the end of training. (Default: 0.0001)
-- **batch_size**: Training batch size. (Default: 128)
-- **num_epochs**: Total number of training epochs. (Default: 200)
-- **weight_decay**: L2 regularization strength. (Default: 0.0004)
-- **momentum**: Momentum coefficient of the momentum optimizer. (Default: 0.9)
-- **dropout_rate**: Dropout rate of the dropout layer. (Default: 0.5)
-- **bn_decay**: Decay/momentum coefficient (i.e. moving average decay) of the batch norm layers. (Default: 0.9)
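-
-For example, an illustrative run that trains Model 1 with a smaller batch size, a lower peak learning rate, and cutout disabled could look like this (the flags are standard `absl` flags, so booleans also accept the `--noflag` form; the values here are examples, not recommendations):
-
-```
-python train_hinas.py --model=1 --batch_size=64 --lr_max=0.05 --nocutout
-```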
-
-
-## Model performances
-
-All six models are trained with the same hyperparameters:
-
-- learning rate: 0.1 -> 0.0001 with cosine annealing (see the sketch below)
-- total epochs: 200
-- batch size: 128
-- L2 decay: 0.0004
-- optimizer: momentum optimizer with m=0.9 and Nesterov momentum
-- preprocessing: random horizontal flip + image standardization + cutout
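-
-As a reference, here is a minimal pure-Python sketch of this schedule; it mirrors the `cosine_annealing` formula in `nn_paddle.py`, but the standalone function and the step arithmetic below are for illustration only:
-
-```
-import math
-
-def cosine_annealing(step, max_step, lr_max=0.1, lr_min=0.0001):
-    # anneal from lr_max at step 0 down to lr_min at max_step
-    return lr_min + (lr_max - lr_min) * 0.5 \
-        * (1.0 + math.cos(math.pi * float(step) / max_step))
-
-# e.g. 200 epochs on the 50000-image CIFAR-10 train set with batch size 128:
-max_step = 50000 * 200 // 128
-```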
-
-Below is the accuracy of the six models on the CIFAR-10 dataset:
-
-| model | round 1 | round 2 | round 3 | max | avg |
-|----------|---------|---------|---------|--------|--------|
-| HiNAS-0 | 0.9548 | 0.9520 | 0.9513 | 0.9548 | 0.9527 |
-| HiNAS-1 | 0.9452 | 0.9462 | 0.9420 | 0.9462 | 0.9445 |
-| HiNAS-2 | 0.9508 | 0.9506 | 0.9483 | 0.9508 | 0.9499 |
-| HiNAS-3 | 0.9607 | 0.9623 | 0.9601 | 0.9623 | 0.9611 |
-| HiNAS-4 | 0.9611 | 0.9584 | 0.9586 | 0.9611 | 0.9594 |
-| HiNAS-5 | 0.9578 | 0.9588 | 0.9594 | 0.9594 | 0.9586 |
diff --git a/AutoDL/HiNAS_models/README_cn.md b/AutoDL/HiNAS_models/README_cn.md
deleted file mode 100755
index 8ca3bcbfb8d1ea1a15f969c1a1db22ff2ec854f1..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/README_cn.md
+++ /dev/null
@@ -1,78 +0,0 @@
-# Image Classification Models
-This directory contains six image classification models, all discovered automatically by the Baidu Big Data Lab Hierarchical Neural Architecture Search (HiNAS) project, reaching 96.1% accuracy on the CIFAR-10 dataset. The models fall into two categories: the first three have no skip links and are named HiNAS 0-2; the last three contain skip links, similar to the shortcut connections in ResNet, and are named HiNAS 3-5.
-
----
-## Table of Contents
-- [Installation](#installation)
-- [Data preparation](#data-preparation)
-- [Training a model](#training-a-model)
-- [Model performances](#model-performances)
-
-## Installation
-Minimum requirements:
-
-- PaddlePaddle Fluid >= v0.15.0
-- CuDNN >= 6.0
-
-If your runtime environment does not meet these requirements, please upgrade PaddlePaddle by following the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html).
-
-## Data preparation
-
-The first time you train a model, the trainer automatically downloads the CIFAR-10 dataset. Please make sure your environment has an internet connection.
-
-The dataset is downloaded to `dataset/cifar/cifar-10-python.tar.gz` in the same directory as the trainer. If the automatic download fails, you can download cifar-10-python.tar.gz from https://www.cs.toronto.edu/~kriz/cifar.html yourself and place it at the location above.
-
-
-## Training a model
-Once the environment is ready, you can train a model. There are two entry points: `train_hinas.py` and `train_hinas_res.py`. The former trains Models 0-2 (without skip links); the latter trains Models 3-5 (with skip links).
-
-Train Models 0-2 (without skip links):
-```
-python train_hinas.py --model=m_id # m_id can be 0, 1 or 2.
-```
-Train Models 3-5 (with skip links):
-```
-python train_hinas_res.py --model=m_id # m_id can be 0, 1 or 2.
-```
-
-In addition, both `train_hinas.py` and `train_hinas_res.py` support the following flags (see the example run below):
-
-- random_flip_left_right: randomly flip images horizontally (Default: True)
-- random_flip_up_down: randomly flip images vertically (Default: False)
-- cutout: randomly mask out part of each image (Default: True)
-- standardize_image: standardize every pixel of each image (Default: True)
-- pad_and_cut_image: randomly pad images, then crop them back to the original size (Default: True)
-- shuffle_image: shuffle the order of input images during training (Default: True)
-- lr_max: learning rate at the beginning of training (Default: 0.1)
-- lr_min: learning rate at the end of training (Default: 0.0001)
-- batch_size: training batch size (Default: 128)
-- num_epochs: total number of training epochs (Default: 200)
-- weight_decay: L2 regularization strength during training (Default: 0.0004)
-- momentum: momentum coefficient of the momentum optimizer (Default: 0.9)
-- dropout_rate: dropout rate of the dropout layer (Default: 0.5)
-- bn_decay: decay/momentum coefficient (i.e. moving average decay) of the batch norm layers (Default: 0.9)
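-
-For example, an illustrative run that trains Model 1 with a smaller batch size, a lower peak learning rate, and cutout disabled could look like this (the flags are standard `absl` flags, so booleans also accept the `--noflag` form; the values here are examples, not recommendations):
-
-```
-python train_hinas.py --model=1 --batch_size=64 --lr_max=0.05 --nocutout
-```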
-
-
-
-## Model performances
-All six models are trained with the same hyperparameters:
-
-- learning rate: 0.1 -> 0.0001 with cosine annealing (see the sketch below)
-- total epochs: 200
-- batch size: 128
-- L2 decay: 0.0004
-- optimizer: momentum optimizer with m=0.9 and Nesterov momentum
-- preprocessing: random horizontal flip + image standardization + cutout
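-
-As a reference, here is a minimal pure-Python sketch of this schedule; it mirrors the `cosine_annealing` formula in `nn_paddle.py`, but the standalone function and the step arithmetic below are for illustration only:
-
-```
-import math
-
-def cosine_annealing(step, max_step, lr_max=0.1, lr_min=0.0001):
-    # anneal from lr_max at step 0 down to lr_min at max_step
-    return lr_min + (lr_max - lr_min) * 0.5 \
-        * (1.0 + math.cos(math.pi * float(step) / max_step))
-
-# e.g. 200 epochs on the 50000-image CIFAR-10 train set with batch size 128:
-max_step = 50000 * 200 // 128
-```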
-
-Below is the accuracy of the six models on the CIFAR-10 dataset:
-
-| model | round 1 | round 2 | round 3 | max | avg |
-|----------|---------|---------|---------|--------|--------|
-| HiNAS-0 | 0.9548 | 0.9520 | 0.9513 | 0.9548 | 0.9527 |
-| HiNAS-1 | 0.9452 | 0.9462 | 0.9420 | 0.9462 | 0.9445 |
-| HiNAS-2 | 0.9508 | 0.9506 | 0.9483 | 0.9508 | 0.9499 |
-| HiNAS-3 | 0.9607 | 0.9623 | 0.9601 | 0.9623 | 0.9611 |
-| HiNAS-4 | 0.9611 | 0.9584 | 0.9586 | 0.9611 | 0.9594 |
-| HiNAS-5 | 0.9578 | 0.9588 | 0.9594 | 0.9594 | 0.9586 |
diff --git a/AutoDL/HiNAS_models/build/__init__.py b/AutoDL/HiNAS_models/build/__init__.py
deleted file mode 100755
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/AutoDL/HiNAS_models/build/layers.py b/AutoDL/HiNAS_models/build/layers.py
deleted file mode 100755
index 5bd67fb837bb21434f8628e339c3ef541b8c5a90..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/build/layers.py
+++ /dev/null
@@ -1,214 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import operator
-
-import numpy as np
-import paddle.fluid as fluid
-from absl import flags
-
-FLAGS = flags.FLAGS
-
-flags.DEFINE_float("bn_decay", 0.9, "batch norm decay")
-flags.DEFINE_float("dropout_rate", 0.5, "dropout rate")
-
-
-def calc_padding(img_width, stride, dilation, filter_width):
- """ calculate pixels to padding in order to keep input/output size same. """
-
- filter_width = dilation * (filter_width - 1) + 1
- if img_width % stride == 0:
- pad_along_width = max(filter_width - stride, 0)
- else:
- pad_along_width = max(filter_width - (img_width % stride), 0)
- return pad_along_width // 2, pad_along_width - pad_along_width // 2
-
-
-def conv(inputs,
- filters,
- kernel,
- strides=(1, 1),
- dilation=(1, 1),
- num_groups=1,
- conv_param=None):
- """ normal conv layer """
-
- if isinstance(kernel, (tuple, list)):
- n = operator.mul(*kernel) * inputs.shape[1]
- else:
- n = kernel * kernel * inputs.shape[1]
-
- # pad input
- padding = (0, 0, 0, 0) \
- + calc_padding(inputs.shape[2], strides[0], dilation[0], kernel[0]) \
- + calc_padding(inputs.shape[3], strides[1], dilation[1], kernel[1])
- if sum(padding) > 0:
- inputs = fluid.layers.pad(inputs, padding, 0)
-
- param_attr = fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.NormalInitializer(
- 0.0, scale=np.sqrt(2.0 / n)),
- regularizer=fluid.regularizer.L2Decay(FLAGS.weight_decay))
-
- bias_attr = fluid.param_attr.ParamAttr(
- regularizer=fluid.regularizer.L2Decay(0.))
-
- return fluid.layers.conv2d(
- inputs,
- filters,
- kernel,
- stride=strides,
- padding=0,
- dilation=dilation,
- groups=num_groups,
- param_attr=param_attr if conv_param is None else conv_param,
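-        # turn off cuDNN for the depthwise case (num_groups == in_channels == out_channels)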
- use_cudnn=False if num_groups == inputs.shape[1] == filters else True,
- bias_attr=bias_attr,
- act=None)
-
-
-def sep(inputs, filters, kernel, strides=(1, 1), dilation=(1, 1)):
- """ Separable convolution layer """
-
- if isinstance(kernel, (tuple, list)):
- n_depth = operator.mul(*kernel)
- else:
- n_depth = kernel * kernel
- n_point = inputs.shape[1]
-
- if isinstance(strides, (tuple, list)):
- multiplier = strides[0]
- else:
- multiplier = strides
-
- depthwise_param = fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.NormalInitializer(
- 0.0, scale=np.sqrt(2.0 / n_depth)),
- regularizer=fluid.regularizer.L2Decay(FLAGS.weight_decay))
-
- pointwise_param = fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.NormalInitializer(
- 0.0, scale=np.sqrt(2.0 / n_point)),
- regularizer=fluid.regularizer.L2Decay(FLAGS.weight_decay))
-
- depthwise_conv = conv(
- inputs=inputs,
- kernel=kernel,
- filters=int(filters * multiplier),
- strides=strides,
- dilation=dilation,
- num_groups=int(filters * multiplier),
- conv_param=depthwise_param)
-
- return conv(
- inputs=depthwise_conv,
- kernel=(1, 1),
- filters=int(filters * multiplier),
- strides=(1, 1),
- dilation=dilation,
- conv_param=pointwise_param)
-
-
-def maxpool(inputs, kernel, strides=(1, 1)):
- padding = (0, 0, 0, 0) \
- + calc_padding(inputs.shape[2], strides[0], 1, kernel[0]) \
- + calc_padding(inputs.shape[3], strides[1], 1, kernel[1])
- if sum(padding) > 0:
- inputs = fluid.layers.pad(inputs, padding, 0)
-
- return fluid.layers.pool2d(
- inputs, kernel, 'max', strides, pool_padding=0, ceil_mode=False)
-
-
-def avgpool(inputs, kernel, strides=(1, 1)):
- padding_pixel = (0, 0, 0, 0)
- padding_pixel += calc_padding(inputs.shape[2], strides[0], 1, kernel[0])
- padding_pixel += calc_padding(inputs.shape[3], strides[1], 1, kernel[1])
-
- if padding_pixel[4] == padding_pixel[5] and padding_pixel[
- 6] == padding_pixel[7]:
- # same padding pixel num on all sides.
- return fluid.layers.pool2d(
- inputs,
- kernel,
- 'avg',
- strides,
- pool_padding=(padding_pixel[4], padding_pixel[6]),
- ceil_mode=False)
- elif padding_pixel[4] + 1 == padding_pixel[5] and padding_pixel[6] + 1 == padding_pixel[7] \
- and strides == (1, 1):
- # different padding size: first pad then crop.
- x = fluid.layers.pool2d(
- inputs,
- kernel,
- 'avg',
- strides,
- pool_padding=(padding_pixel[5], padding_pixel[7]),
- ceil_mode=False)
- x_shape = x.shape
- return fluid.layers.crop(
- x,
- shape=(-1, x_shape[1], x_shape[2] - 1, x_shape[3] - 1),
- offsets=(0, 0, 1, 1))
- else:
- # not support. use padding-zero and pool2d.
- print("Warning: use zero-padding in avgpool")
- outputs = fluid.layers.pad(inputs, padding_pixel, 0)
- return fluid.layers.pool2d(
- outputs, kernel, 'avg', strides, pool_padding=0, ceil_mode=False)
-
-
-def global_avgpool(inputs):
- return fluid.layers.pool2d(
- inputs,
- 1,
- 'avg',
- 1,
- pool_padding=0,
- global_pooling=True,
- ceil_mode=True)
-
-
-def fully_connected(inputs, units):
- n = inputs.shape[1]
- param_attr = fluid.param_attr.ParamAttr(
- initializer=fluid.initializer.NormalInitializer(
- 0.0, scale=np.sqrt(2.0 / n)),
- regularizer=fluid.regularizer.L2Decay(FLAGS.weight_decay))
-
- bias_attr = fluid.param_attr.ParamAttr(
- regularizer=fluid.regularizer.L2Decay(0.))
-
- return fluid.layers.fc(inputs,
- units,
- param_attr=param_attr,
- bias_attr=bias_attr)
-
-
-def bn_relu(inputs):
- """ batch norm + rely layer """
-
- output = fluid.layers.batch_norm(
- inputs, momentum=FLAGS.bn_decay, epsilon=0.001, data_layout="NCHW")
- return fluid.layers.relu(output)
-
-
-def dropout(inputs):
- """ dropout layer """
-
- return fluid.layers.dropout(inputs, dropout_prob=FLAGS.dropout_rate)
diff --git a/AutoDL/HiNAS_models/build/ops.py b/AutoDL/HiNAS_models/build/ops.py
deleted file mode 100755
index 359f62852fe193cabaad73d7361ed6db57cf6d8c..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/build/ops.py
+++ /dev/null
@@ -1,117 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import build.layers as layers
-
-
-def conv_1x1(inputs, downsample=False):
- return conv_base(inputs, (1, 1), downsample=downsample)
-
-
-def conv_2x2(inputs, downsample=False):
- return conv_base(inputs, (2, 2), downsample=downsample)
-
-
-def conv_3x3(inputs, downsample=False):
- return conv_base(inputs, (3, 3), downsample=downsample)
-
-
-def dilated_2x2(inputs, downsample=False):
- return conv_base(inputs, (2, 2), (2, 2), downsample)
-
-
-def conv_1x2_2x1(inputs, downsample=False):
- return pair_base(inputs, 2, downsample)
-
-
-def conv_1x3_3x1(inputs, downsample=False):
- return pair_base(inputs, 3, downsample)
-
-
-def sep_2x2(inputs, downsample=False):
- return sep_base(inputs, (2, 2), downsample=downsample)
-
-
-def sep_3x3(inputs, downsample=False):
- return sep_base(inputs, (3, 3), downsample=downsample)
-
-
-def maxpool_2x2(inputs, downsample=False):
- return maxpool_base(inputs, (2, 2), downsample)
-
-
-def maxpool_3x3(inputs, downsample=False):
- return maxpool_base(inputs, (3, 3), downsample)
-
-
-def avgpool_2x2(inputs, downsample=False):
- return avgpool_base(inputs, (2, 2), downsample)
-
-
-def avgpool_3x3(inputs, downsample=False):
- return avgpool_base(inputs, (3, 3), downsample)
-
-
-def conv_base(inputs, kernel, dilation=(1, 1), downsample=False):
- filters = inputs.shape[1]
- if downsample:
- output = layers.conv(inputs, filters * 2, kernel, (2, 2))
- else:
- output = layers.conv(inputs, filters, kernel, dilation=dilation)
- return output
-
-
-def pair_base(inputs, kernel, downsample=False):
- filters = inputs.shape[1]
- if downsample:
- output = layers.conv(inputs, filters, (1, kernel), (1, 2))
- output = layers.conv(output, filters, (kernel, 1), (2, 1))
- output = layers.conv(output, filters * 2, (1, 1))
- else:
- output = layers.conv(inputs, filters, (1, kernel))
- output = layers.conv(output, filters, (kernel, 1))
- return output
-
-
-def sep_base(inputs, kernel, dilation=(1, 1), downsample=False):
- filters = inputs.shape[1]
- if downsample:
- output = layers.sep(inputs, filters * 2, kernel, (2, 2))
- else:
- output = layers.sep(inputs, filters, kernel, dilation=dilation)
- return output
-
-
-def maxpool_base(inputs, kernel, downsample=False):
- if downsample:
- filters = inputs.shape[1]
- output = layers.maxpool(inputs, kernel, (2, 2))
- output = layers.conv(output, filters * 2, (1, 1))
- else:
- output = layers.maxpool(inputs, kernel)
- return output
-
-
-def avgpool_base(inputs, kernel, downsample=False):
- if downsample:
- filters = inputs.shape[1]
- output = layers.avgpool(inputs, kernel, (2, 2))
- output = layers.conv(output, filters * 2, (1, 1))
- else:
- output = layers.avgpool(inputs, kernel)
- return output
diff --git a/AutoDL/HiNAS_models/build/resnet_base.py b/AutoDL/HiNAS_models/build/resnet_base.py
deleted file mode 100755
index 76c870de3bed9641622ed5722dff9e58b76fddff..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/build/resnet_base.py
+++ /dev/null
@@ -1,109 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import paddle.fluid as fluid
-from absl import flags
-
-import build.layers as layers
-import build.ops as _ops
-
-FLAGS = flags.FLAGS
-
-flags.DEFINE_integer("num_stages", 3, "number of stages")
-flags.DEFINE_integer("num_blocks", 5, "number of blocks per stage")
-flags.DEFINE_integer("num_ops", 2, "number of operations per block")
-flags.DEFINE_integer("width", 64, "network width")
-flags.DEFINE_string("downsample", "pool", "conv or pool")
-
-num_classes = 10
-
-ops = [
- _ops.conv_1x1,
- _ops.conv_2x2,
- _ops.conv_3x3,
- _ops.dilated_2x2,
- _ops.conv_1x2_2x1,
- _ops.conv_1x3_3x1,
- _ops.sep_2x2,
- _ops.sep_3x3,
- _ops.maxpool_2x2,
- _ops.maxpool_3x3,
- _ops.avgpool_2x2,
- _ops.avgpool_3x3,
-]
-
-
-def net(inputs, tokens):
- """ build network with skip links """
-
- x = layers.conv(inputs, FLAGS.width, (3, 3))
-
- num_ops = FLAGS.num_blocks * FLAGS.num_ops
- x = stage(x, tokens[:num_ops], pre_activation=True)
- for i in range(1, FLAGS.num_stages):
- x = stage(x, tokens[i * num_ops:(i + 1) * num_ops], downsample=True)
-
- x = layers.bn_relu(x)
- x = layers.global_avgpool(x)
- x = layers.dropout(x)
- logits = layers.fully_connected(x, num_classes)
-
- return fluid.layers.softmax(logits)
-
-
-def stage(x, tokens, pre_activation=False, downsample=False):
- """ build network's stage. Stage consists of blocks """
-
- x = block(x, tokens[:FLAGS.num_ops], pre_activation, downsample)
- for i in range(1, FLAGS.num_blocks):
- print("-" * 12)
- x = block(x, tokens[i * FLAGS.num_ops:(i + 1) * FLAGS.num_ops])
- print("=" * 12)
-
- return x
-
-
-def block(x, tokens, pre_activation=False, downsample=False):
- """ build block. """
-
- if pre_activation:
- x = layers.bn_relu(x)
- res = x
- else:
- res = x
- x = layers.bn_relu(x)
-
- x = ops[tokens[0]](x, downsample)
- print("%s \t-> shape %s" % (ops[0].__name__, x.shape))
- for token in tokens[1:]:
- x = layers.bn_relu(x)
- x = ops[token](x)
- print("%s \t-> shape %s" % (ops[token].__name__, x.shape))
-
- if downsample:
- filters = res.shape[1]
- if FLAGS.downsample == "conv":
- res = layers.conv(res, filters * 2, (1, 1), (2, 2))
- elif FLAGS.downsample == "pool":
- res = layers.avgpool(res, (2, 2), (2, 2))
- res = fluid.layers.pad(res, (0, 0, filters // 2, filters // 2, 0, 0,
- 0, 0))
- else:
- raise NotImplementedError
-
- return x + res
diff --git a/AutoDL/HiNAS_models/build/vgg_base.py b/AutoDL/HiNAS_models/build/vgg_base.py
deleted file mode 100755
index d7506a7ec4617a4c1017911a763084f754c6b1f0..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/build/vgg_base.py
+++ /dev/null
@@ -1,70 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import paddle.fluid as fluid
-from absl import flags
-
-import build.layers as layers
-import build.ops as _ops
-
-FLAGS = flags.FLAGS
-flags.DEFINE_integer("num_stages", 5, "number of stages")
-flags.DEFINE_integer("width", 64, "network width")
-
-num_classes = 10
-
-ops = [
- _ops.conv_1x1, #0
- _ops.conv_2x2, #1
- _ops.conv_3x3, #2
- _ops.dilated_2x2, #3
- _ops.conv_1x2_2x1, #4
- _ops.conv_1x3_3x1, #5
- _ops.sep_2x2, #6
- _ops.sep_3x3, #7
- _ops.maxpool_2x2, #8
- _ops.maxpool_3x3,
- _ops.avgpool_2x2, #10
- _ops.avgpool_3x3,
-]
-
-
-def net(inputs, tokens):
- depth = len(tokens)
- q, r = divmod(depth + 1, FLAGS.num_stages)
- downsample_steps = [
- i * q + max(0, i + r - FLAGS.num_stages + 1) - 2
- for i in range(1, FLAGS.num_stages)
- ]
-
- x = layers.conv(inputs, FLAGS.width, (3, 3))
- x = layers.bn_relu(x)
-
- for i, token in enumerate(tokens):
- downsample = i in downsample_steps
- x = ops[token](x, downsample)
- print("%s \t-> shape %s" % (ops[token].__name__, x.shape))
- if downsample:
- print("=" * 12)
- x = layers.bn_relu(x)
-
- x = layers.global_avgpool(x)
- x = layers.dropout(x)
- logits = layers.fully_connected(x, num_classes)
-
- return fluid.layers.softmax(logits)
diff --git a/AutoDL/HiNAS_models/nn_paddle.py b/AutoDL/HiNAS_models/nn_paddle.py
deleted file mode 100755
index d3a3ddd60cf3e5e114de322f3eea763e5a2e6018..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/nn_paddle.py
+++ /dev/null
@@ -1,139 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import math
-
-import numpy as np
-import paddle
-import paddle.fluid as fluid
-from paddle.fluid.contrib.trainer import *
-from paddle.fluid.layers.learning_rate_scheduler import _decay_step_counter
-import reader
-
-from absl import flags
-
-# import preprocess
-
-FLAGS = flags.FLAGS
-
-flags.DEFINE_float("lr_max", 0.1, "initial learning rate")
-flags.DEFINE_float("lr_min", 0.0001, "limiting learning rate")
-
-flags.DEFINE_integer("batch_size", 128, "batch size")
-flags.DEFINE_integer("num_epochs", 200, "total epochs to train")
-flags.DEFINE_float("weight_decay", 0.0004, "weight decay")
-
-flags.DEFINE_float("momentum", 0.9, "momentum")
-
-flags.DEFINE_boolean("shuffle_image", True, "shuffle input images on training")
-
-dataset_train_size = 50000
-
-
-class Model(object):
- def __init__(self, build_fn, tokens):
- print("learning rate: %f -> %f, cosine annealing" %
- (FLAGS.lr_max, FLAGS.lr_min))
- print("epoch: %d" % FLAGS.num_epochs)
- print("batch size: %d" % FLAGS.batch_size)
- print("L2 decay: %f" % FLAGS.weight_decay)
-
- self.max_step = dataset_train_size * FLAGS.num_epochs // FLAGS.batch_size
-
- self.build_fn = build_fn
- self.tokens = tokens
- print("Token is %s" % ",".join(map(str, tokens)))
-
- def cosine_annealing(self):
- step = _decay_step_counter()
- lr = FLAGS.lr_min + (FLAGS.lr_max - FLAGS.lr_min) / 2 \
- * (1.0 + fluid.layers.ops.cos(step / self.max_step * math.pi))
- return lr
-
- def optimizer_program(self):
- return fluid.optimizer.Momentum(
- learning_rate=self.cosine_annealing(),
- momentum=FLAGS.momentum,
- use_nesterov=True,
- regularization=fluid.regularizer.L2DecayRegularizer(
- FLAGS.weight_decay))
-
- def inference_network(self):
- images = fluid.layers.data(
- name='pixel', shape=[3, 32, 32], dtype='float32')
- return self.build_fn(images, self.tokens)
-
- def train_network(self):
- predict = self.inference_network()
- label = fluid.layers.data(name='label', shape=[1], dtype='int64')
- cost = fluid.layers.cross_entropy(input=predict, label=label)
- avg_cost = fluid.layers.mean(cost)
- accuracy = fluid.layers.accuracy(input=predict, label=label)
- # self.parameters = fluid.parameters.create(avg_cost)
- return [avg_cost, accuracy]
-
- def run(self):
- train_files = reader.train10()
- test_files = reader.test10()
-
- if FLAGS.shuffle_image:
- train_reader = paddle.batch(
- paddle.reader.shuffle(train_files, dataset_train_size),
- batch_size=FLAGS.batch_size)
- else:
- train_reader = paddle.batch(
- train_files, batch_size=FLAGS.batch_size)
-
- test_reader = paddle.batch(test_files, batch_size=FLAGS.batch_size)
-
- costs = []
- accs = []
-
- def event_handler(event):
- if isinstance(event, EndStepEvent):
- costs.append(event.metrics[0])
- accs.append(event.metrics[1])
- if event.step % 20 == 0:
- print("Epoch %d, Step %d, Loss %f, Acc %f" % (
- event.epoch, event.step, np.mean(costs), np.mean(accs)))
- del costs[:]
- del accs[:]
-
- if isinstance(event, EndEpochEvent):
- if event.epoch % 3 == 0 or event.epoch == FLAGS.num_epochs - 1:
- avg_cost, accuracy = trainer.test(
- reader=test_reader, feed_order=['pixel', 'label'])
-
- event_handler.best_acc = max(event_handler.best_acc,
- accuracy)
- print("Test with epoch %d, Loss %f, Acc %f" %
- (event.epoch, avg_cost, accuracy))
- print("Best acc %f" % event_handler.best_acc)
-
- event_handler.best_acc = 0.0
- place = fluid.CUDAPlace(0)
- trainer = Trainer(
- train_func=self.train_network,
- optimizer_func=self.optimizer_program,
- place=place)
-
- trainer.train(
- reader=train_reader,
- num_epochs=FLAGS.num_epochs,
- event_handler=event_handler,
- feed_order=['pixel', 'label'])
diff --git a/AutoDL/HiNAS_models/reader.py b/AutoDL/HiNAS_models/reader.py
deleted file mode 100755
index e30725b0c171376029d8c51dc38ac01350740c4a..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/reader.py
+++ /dev/null
@@ -1,157 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-"""
-CIFAR-10 dataset.
-This module will download dataset from
-https://www.cs.toronto.edu/~kriz/cifar.html and parse train/test set into
-paddle reader creators.
-The CIFAR-10 dataset consists of 60000 32x32 colour images in 10 classes,
-with 6000 images per class. There are 50000 training images and 10000 test images.
-"""
-
-from PIL import Image
-from PIL import ImageOps
-import numpy as np
-
-import cPickle
-import itertools
-import paddle.dataset.common
-import tarfile
-from absl import flags
-
-FLAGS = flags.FLAGS
-
-flags.DEFINE_boolean("random_flip_left_right", True,
- "random flip left and right")
-flags.DEFINE_boolean("random_flip_up_down", False, "random flip up and down")
-flags.DEFINE_boolean("cutout", True, "cutout")
-flags.DEFINE_boolean("standardize_image", True, "standardize input images")
-flags.DEFINE_boolean("pad_and_cut_image", True, "pad and cut input images")
-
-__all__ = ['train10', 'test10', 'convert']
-
-URL_PREFIX = 'https://www.cs.toronto.edu/~kriz/'
-CIFAR10_URL = URL_PREFIX + 'cifar-10-python.tar.gz'
-CIFAR10_MD5 = 'c58f30108f718f92721af3b95e74349a'
-
-paddle.dataset.common.DATA_HOME = "dataset/"
-
-image_size = 32
-image_depth = 3
-half_length = 8
-
-
-def preprocess(sample, is_training):
- image_array = sample.reshape(3, image_size, image_size)
- rgb_array = np.transpose(image_array, (1, 2, 0))
- img = Image.fromarray(rgb_array, 'RGB')
-
- if is_training:
- if FLAGS.pad_and_cut_image:
-            # pad and random crop
- img = ImageOps.expand(
- img, (2, 2, 2, 2), fill=0) # pad to 36 * 36 * 3
- left_top = np.random.randint(5, size=2) # rand 0 - 4
- img = img.crop((left_top[0], left_top[1], left_top[0] + image_size,
- left_top[1] + image_size))
-
- if FLAGS.random_flip_left_right and np.random.randint(2):
- img = img.transpose(Image.FLIP_LEFT_RIGHT)
- if FLAGS.random_flip_up_down and np.random.randint(2):
- img = img.transpose(Image.FLIP_TOP_BOTTOM)
-
- img = np.array(img).astype(np.float32)
-
- if FLAGS.standardize_image:
- # per_image_standardization
- img_float = img / 255.0
- mean = np.mean(img_float)
- std = max(np.std(img_float), 1.0 / np.sqrt(3 * image_size * image_size))
- img = (img_float - mean) / std
-
- if is_training and FLAGS.cutout:
- center = np.random.randint(image_size, size=2)
- offset_width = max(0, center[0] - half_length)
- offset_height = max(0, center[1] - half_length)
- target_width = min(center[0] + half_length, image_size)
- target_height = min(center[1] + half_length, image_size)
-
- for i in range(offset_height, target_height):
- for j in range(offset_width, target_width):
- img[i][j][:] = 0.0
-
- img = np.transpose(img, (2, 0, 1))
- return img.reshape(3 * image_size * image_size)
-
-
-def reader_creator(filename, sub_name, is_training):
- def read_batch(batch):
- data = batch['data']
- labels = batch.get('labels', batch.get('fine_labels', None))
- assert labels is not None
- for sample, label in itertools.izip(data, labels):
- yield preprocess(sample, is_training), int(label)
-
- def reader():
- with tarfile.open(filename, mode='r') as f:
- names = [
- each_item.name for each_item in f if sub_name in each_item.name
- ]
- names.sort()
-
- for name in names:
- print("Reading file " + name)
- batch = cPickle.load(f.extractfile(name))
- for item in read_batch(batch):
- yield item
-
- return reader
-
-
-def train10():
- """
- CIFAR-10 training set creator.
- It returns a reader creator, each sample in the reader is image pixels in
- [0, 1] and label in [0, 9].
- :return: Training reader creator
- :rtype: callable
- """
- return reader_creator(
- paddle.dataset.common.download(CIFAR10_URL, 'cifar', CIFAR10_MD5),
- 'data_batch', True)
-
-
-def test10():
- """
- CIFAR-10 test set creator.
- It returns a reader creator, each sample in the reader is image pixels in
- [0, 1] and label in [0, 9].
- :return: Test reader creator.
- :rtype: callable
- """
- return reader_creator(
- paddle.dataset.common.download(CIFAR10_URL, 'cifar', CIFAR10_MD5),
- 'test_batch', False)
-
-
-def fetch():
- paddle.dataset.common.download(CIFAR10_URL, 'cifar', CIFAR10_MD5)
-
-
-def convert(path):
- """
- Converts dataset to recordio format
- """
- paddle.dataset.common.convert(path, train10(), 1000, "cifar_train10")
- paddle.dataset.common.convert(path, test10(), 1000, "cifar_test10")
diff --git a/AutoDL/HiNAS_models/tokens/15113.pkl b/AutoDL/HiNAS_models/tokens/15113.pkl
deleted file mode 100755
index a36c7d322311ccceff93b13ddb5bc73058bb4bb7..0000000000000000000000000000000000000000
Binary files a/AutoDL/HiNAS_models/tokens/15113.pkl and /dev/null differ
diff --git a/AutoDL/HiNAS_models/tokens/15383.pkl b/AutoDL/HiNAS_models/tokens/15383.pkl
deleted file mode 100755
index 9f05c39bb408af893d7e19c0349a279b27ac4bc6..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/tokens/15383.pkl
+++ /dev/null
@@ -1,36 +0,0 @@
-cnumpy.core.multiarray
-_reconstruct
-p0
-(cnumpy
-ndarray
-p1
-(I0
-tp2
-S'b'
-p3
-tp4
-Rp5
-(I1
-(I21
-tp6
-cnumpy
-dtype
-p7
-(S'i4'
-p8
-I0
-I1
-tp9
-Rp10
-(I3
-S'<'
-p11
-NNNI-1
-I-1
-I0
-tp12
-bI00
-S'\x05\x00\x00\x00\x07\x00\x00\x00\x02\x00\x00\x00\x05\x00\x00\x00\x05\x00\x00\x00\x02\x00\x00\x00\x08\x00\x00\x00\x02\x00\x00\x00\x03\x00\x00\x00\x01\x00\x00\x00\n\x00\x00\x00\t\x00\x00\x00\x03\x00\x00\x00\x08\x00\x00\x00\x0b\x00\x00\x00\x03\x00\x00\x00\t\x00\x00\x00\x02\x00\x00\x00\x06\x00\x00\x00\x01\x00\x00\x00\x06\x00\x00\x00'
-p13
-tp14
-b.
\ No newline at end of file
diff --git a/AutoDL/HiNAS_models/tokens/15613.pkl b/AutoDL/HiNAS_models/tokens/15613.pkl
deleted file mode 100755
index 332564be14020af6118ad578092d7e68f1447596..0000000000000000000000000000000000000000
Binary files a/AutoDL/HiNAS_models/tokens/15613.pkl and /dev/null differ
diff --git a/AutoDL/HiNAS_models/tokens/17754.pkl b/AutoDL/HiNAS_models/tokens/17754.pkl
deleted file mode 100755
index 4844119fdbee64f86e457d70ce9e7259ced7b15f..0000000000000000000000000000000000000000
Binary files a/AutoDL/HiNAS_models/tokens/17754.pkl and /dev/null differ
diff --git a/AutoDL/HiNAS_models/tokens/17925.pkl b/AutoDL/HiNAS_models/tokens/17925.pkl
deleted file mode 100755
index 841412252339dfa63d44430eef3a95eed255379b..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/tokens/17925.pkl
+++ /dev/null
@@ -1,36 +0,0 @@
-cnumpy.core.multiarray
-_reconstruct
-p0
-(cnumpy
-ndarray
-p1
-(I0
-tp2
-S'b'
-p3
-tp4
-Rp5
-(I1
-(I21
-tp6
-cnumpy
-dtype
-p7
-(S'i4'
-p8
-I0
-I1
-tp9
-Rp10
-(I3
-S'<'
-p11
-NNNI-1
-I-1
-I0
-tp12
-bI00
-S'\x07\x00\x00\x00\x07\x00\x00\x00\x02\x00\x00\x00\x05\x00\x00\x00\x02\x00\x00\x00\x02\x00\x00\x00\x08\x00\x00\x00\x08\x00\x00\x00\x02\x00\x00\x00\x03\x00\x00\x00\x02\x00\x00\x00\n\x00\x00\x00\x08\x00\x00\x00\x02\x00\x00\x00\t\x00\x00\x00\x0b\x00\x00\x00\t\x00\x00\x00\x06\x00\x00\x00\x04\x00\x00\x00\x04\x00\x00\x00\n\x00\x00\x00'
-p13
-tp14
-b.
\ No newline at end of file
diff --git a/AutoDL/HiNAS_models/tokens/18089.pkl b/AutoDL/HiNAS_models/tokens/18089.pkl
deleted file mode 100755
index a466a6c91d7f664ca85351c8e6eed1046f4a2152..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/tokens/18089.pkl
+++ /dev/null
@@ -1,36 +0,0 @@
-cnumpy.core.multiarray
-_reconstruct
-p0
-(cnumpy
-ndarray
-p1
-(I0
-tp2
-S'b'
-p3
-tp4
-Rp5
-(I1
-(I21
-tp6
-cnumpy
-dtype
-p7
-(S'i4'
-p8
-I0
-I1
-tp9
-Rp10
-(I3
-S'<'
-p11
-NNNI-1
-I-1
-I0
-tp12
-bI00
-S'\x07\x00\x00\x00\x05\x00\x00\x00\x08\x00\x00\x00\x01\x00\x00\x00\x02\x00\x00\x00\n\x00\x00\x00\t\x00\x00\x00\x02\x00\x00\x00\x02\x00\x00\x00\x02\x00\x00\x00\x08\x00\x00\x00\x08\x00\x00\x00\x08\x00\x00\x00\x02\x00\x00\x00\t\x00\x00\x00\x04\x00\x00\x00\t\x00\x00\x00\x0b\x00\x00\x00\x07\x00\x00\x00\x04\x00\x00\x00\x03\x00\x00\x00'
-p13
-tp14
-b.
\ No newline at end of file
diff --git a/AutoDL/HiNAS_models/train_hinas.py b/AutoDL/HiNAS_models/train_hinas.py
deleted file mode 100755
index 8e4a0f855a15545b71b98802f74e467cb22e06b7..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/train_hinas.py
+++ /dev/null
@@ -1,44 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os
-import pickle
-
-from absl import app
-from absl import flags
-
-import nn_paddle as nn
-from build import vgg_base
-
-FLAGS = flags.FLAGS
-flags.DEFINE_string("tokdir", "tokens/", "token directory")
-flags.DEFINE_integer("model", 0, "model")
-
-mid = [17925, 18089, 15383]
-
-
-def main(_):
- f = os.path.join(FLAGS.tokdir, str(mid[FLAGS.model]) + ".pkl")
- tokens = pickle.load(open(f, "rb"))
-
- model = nn.Model(vgg_base.net, tokens)
- model.run()
-
-
-if __name__ == "__main__":
- app.run(main)
diff --git a/AutoDL/HiNAS_models/train_hinas_res.py b/AutoDL/HiNAS_models/train_hinas_res.py
deleted file mode 100755
index 4809042274d5a9b3660a66153c95e25e67ad988f..0000000000000000000000000000000000000000
--- a/AutoDL/HiNAS_models/train_hinas_res.py
+++ /dev/null
@@ -1,44 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os
-import pickle
-
-from absl import app
-from absl import flags
-
-import nn_paddle as nn
-from build import resnet_base
-
-FLAGS = flags.FLAGS
-flags.DEFINE_string("tokdir", "tokens/", "token directory")
-flags.DEFINE_integer("model", 0, "model")
-
-mid = [17754, 15113, 15613]
-
-
-def main(_):
- f = os.path.join(FLAGS.tokdir, str(mid[FLAGS.model]) + ".pkl")
- tokens = pickle.load(open(f, "rb"))
-
- model = nn.Model(resnet_base.net, tokens)
- model.run()
-
-
-if __name__ == "__main__":
- app.run(main)
diff --git a/AutoDL/LRC/README.md b/AutoDL/LRC/README.md
deleted file mode 100644
index df9af47d4a3876371673cbbfef0ad2553768b9a5..0000000000000000000000000000000000000000
--- a/AutoDL/LRC/README.md
+++ /dev/null
@@ -1,74 +0,0 @@
-# LRC: Local Rademacher Complexity Regularization
-Regularizing Deep Neural Networks (DNNs) to improve their generalization capability is important and challenging. This directory contains an image classification model based on a novel regularizer rooted in Local Rademacher Complexity (LRC). We appreciate the contribution of [DARTS](https://arxiv.org/abs/1806.09055) to our research. This model combines LRC regularization with DARTS on the CIFAR-10 dataset. Code accompanying the paper
-> [An Empirical Study on Regularization of Deep Neural Networks by Local Rademacher Complexity](https://arxiv.org/abs/1902.00873)\
-> Yingzhen Yang, Xingjian Li, Jun Huan.\
-> _arXiv:1902.00873_.
-
----
-# Table of Contents
-
-- [Installation](#installation)
-- [Data preparation](#data-preparation)
-- [Training](#training)
-
-## Installation
-
-Running the sample code in this directory requires PaddlePaddle Fluid v1.2.0 or later. If the PaddlePaddle on your device is older than this version, please follow the instructions in the [installation document](http://www.paddlepaddle.org/documentation/docs/zh/1.2/beginners_guide/install/index_cn.html#paddlepaddle) and update it.
-
-## Data preparation
-
-Before using the CIFAR-10 dataset for the first time, download it with:
-
- sh ./dataset/download.sh
-
-Please make sure your environment has an internet connection.
-
-The dataset will be downloaded to `dataset/cifar/cifar-10-batches-py` in the same directory as `train.py`. If the automatic download fails, you can download cifar-10-python.tar.gz from https://www.cs.toronto.edu/~kriz/cifar.html and decompress it to the location mentioned above.
-
-
-## Training
-
-After data preparation, start training with:
-
- python -u train_mixup.py \
- --batch_size=80 \
- --auxiliary \
- --weight_decay=0.0003 \
- --learning_rate=0.025 \
- --lrc_loss_lambda=0.7 \
- --cutout
-- Set ```export CUDA_VISIBLE_DEVICES=0``` to specify one GPU for training.
-- For more help on arguments:
-
- python train_mixup.py --help
-
-**data reader introduction:**
-
-* The data reader is defined in `reader.py`.
-* Images are reshaped to 32 * 32.
-* In the training stage, images are padded to 40 * 40 and randomly cropped back to the original size (see the sketch below).
-* In the training stage, images are randomly flipped horizontally.
-* Images are normalized to (0, 1).
-* In the training stage, cutout is applied to images at random.
-* The order of the input images is shuffled during training.
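-
-A minimal sketch of the pad-and-random-crop step described above (PIL/NumPy, for illustration only; `reader.py` in this directory is the authoritative implementation, and the function name here is hypothetical):
-
-    import numpy as np
-    from PIL import Image
-
-    def pad_and_random_crop(img, pad=4, size=32):
-        # pad a 32 x 32 image to 40 x 40, then crop a random 32 x 32 window
-        arr = np.pad(np.asarray(img), ((pad, pad), (pad, pad), (0, 0)), 'constant')
-        top, left = np.random.randint(2 * pad + 1, size=2)
-        return Image.fromarray(arr[top:top + size, left:left + size])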
-
-**model configuration:**
-
-* Use auxiliary loss and auxiliary\_weight=0.4.
-* Use dropout and drop\_path\_prob=0.2.
-* Set lrc\_loss\_lambda=0.7.
-
-**training strategy:**
-
-* Use the momentum optimizer with momentum=0.9.
-* Weight decay is 0.0003.
-* Use cosine decay with init\_lr=0.025 (see the sketch after this list).
-* The total number of epochs is 600.
-* Use the Xavier initializer for conv2d weights, the Constant initializer for batch norm weights, and the Normal initializer for fc weights.
-* Initialize batch norm and fc biases to constant zero, and add no bias to conv2d.
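-
-Putting the strategy together, a hedged sketch of the optimizer setup (it reuses `cosine_decay` from `learning_rate.py`; the step count assumes CIFAR-10's 50000 training images and the batch size of 80 used above, and `train_mixup.py` may wire this up differently):
-
-    import paddle.fluid as fluid
-    from learning_rate import cosine_decay
-
-    steps_one_epoch = 50000 // 80  # CIFAR-10 train size / batch size
-    optimizer = fluid.optimizer.Momentum(
-        learning_rate=cosine_decay(0.025, 600, steps_one_epoch),
-        momentum=0.9,
-        regularization=fluid.regularizer.L2DecayRegularizer(0.0003))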
-
-
-## Reference
-
- - DARTS: Differentiable Architecture Search [`paper`](https://arxiv.org/abs/1806.09055)
- - Differentiable architecture search in PyTorch [`code`](https://github.com/quark0/darts)
diff --git a/AutoDL/LRC/README_cn.md b/AutoDL/LRC/README_cn.md
deleted file mode 100644
index 06dc937074de199af31db97ee200e7690443b1b0..0000000000000000000000000000000000000000
--- a/AutoDL/LRC/README_cn.md
+++ /dev/null
@@ -1,71 +0,0 @@
-# LRC: Local Rademacher Complexity Regularization
-Choosing a regularizer to improve generalization in deep neural networks is important and challenging. This directory contains an image classification model with a novel regularizer based on Local Rademacher Complexity (LRC). We are grateful for the help that the [DARTS](https://arxiv.org/abs/1806.09055) model provided to this research. The model combines LRC regularization with the DARTS network and achieves excellent results on the CIFAR-10 dataset. The code is released together with the paper
-> [An Empirical Study on Regularization of Deep Neural Networks by Local Rademacher Complexity](https://arxiv.org/abs/1902.00873)\
-> Yingzhen Yang, Xingjian Li, Jun Huan.\
-> _arXiv:1902.00873_.
-
----
-# Table of Contents
-
-- [Installation](#installation)
-- [Data preparation](#data-preparation)
-- [Training](#training)
-
-## Installation
-
-Running the sample code in this directory requires PaddlePaddle Fluid v1.2.0 or above. If the PaddlePaddle in your runtime environment is older than this version, please update it following the instructions in the [installation document](http://www.paddlepaddle.org/documentation/docs/zh/1.2/beginners_guide/install/index_cn.html#paddlepaddle).
-
-## Data preparation
-
-The first time you use the CIFAR-10 dataset, you can download it with:
-
-    sh ./dataset/download.sh
-
-Please make sure your environment has an internet connection. The data is downloaded to `dataset/cifar/cifar-10-batches-py` in the same directory as `train.py`. If the download fails, you can download cifar-10-python.tar.gz from https://www.cs.toronto.edu/~kriz/cifar.html yourself and extract it to the location above.
-
-## Training
-
-After the data is ready, start training with:
-
- python -u train_mixup.py \
- --batch_size=80 \
- --auxiliary \
- --weight_decay=0.0003 \
- --learning_rate=0.025 \
- --lrc_loss_lambda=0.7 \
- --cutout
-- Set ```export CUDA_VISIBLE_DEVICES=0``` to train on a single GPU.
-- For the full list of arguments:
-
-    python train_mixup.py --help
-
-**data reader introduction:**
-
-* The data reader is defined in `reader.py`.
-* Input images are resized to 32 * 32.
-* In the training stage, images are padded to 40 * 40 and then randomly cropped back to the original input size.
-* In the training stage, images are randomly flipped horizontally.
-* Every image pixel is normalized.
-* In the training stage, cutout is applied to images at random.
-* The order of the input images is shuffled during training.
-
-**model configuration:**
-
-* Use auxiliary loss with auxiliary\_weight=0.4.
-* Use dropout with drop\_path\_prob=0.2.
-* Set lrc\_loss\_lambda to 0.7.
-
-**training strategy:**
-
-* Train with the momentum optimizer, momentum=0.9 (see the sketch below).
-* Weight decay is 0.0003.
-* Use cosine learning rate decay with an initial learning rate of 0.025.
-* Train for 600 epochs in total.
-* Use the Xavier initializer for conv weights, a constant initializer for batch norm weights, and a Gaussian initializer for fully connected weights.
-* Initialize batch norm and fully connected biases to a fixed constant, and set no bias on conv layers.
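-
-A hedged sketch of the optimizer setup (it reuses `cosine_decay` from `learning_rate.py`; the step count assumes CIFAR-10's 50000 training images and the batch size of 80 above, and `train_mixup.py` may wire this up differently):
-
-    import paddle.fluid as fluid
-    from learning_rate import cosine_decay
-
-    steps_one_epoch = 50000 // 80  # CIFAR-10 train size / batch size
-    optimizer = fluid.optimizer.Momentum(
-        learning_rate=cosine_decay(0.025, 600, steps_one_epoch),
-        momentum=0.9,
-        regularization=fluid.regularizer.L2DecayRegularizer(0.0003))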
-
-
-## Reference
-
- - DARTS: Differentiable Architecture Search [`paper`](https://arxiv.org/abs/1806.09055)
- - Differentiable Architecture Search in PyTorch [`code`](https://github.com/quark0/darts)
diff --git a/AutoDL/LRC/dataset/download.sh b/AutoDL/LRC/dataset/download.sh
deleted file mode 100644
index 0981c3b6878421f80d392f314fd0ae836644a63c..0000000000000000000000000000000000000000
--- a/AutoDL/LRC/dataset/download.sh
+++ /dev/null
@@ -1,10 +0,0 @@
-DIR="$( cd "$(dirname "$0")" ; pwd -P )"
-cd "$DIR"
-mkdir -p cifar
-cd cifar
-# Download the data.
-echo "Downloading..."
-wget https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz
-# Extract the data.
-echo "Extracting..."
-tar zvxf cifar-10-python.tar.gz
diff --git a/AutoDL/LRC/genotypes.py b/AutoDL/LRC/genotypes.py
deleted file mode 100644
index 349fbd2478a7c2d1bb4cc3dd901b470de3c8b906..0000000000000000000000000000000000000000
--- a/AutoDL/LRC/genotypes.py
+++ /dev/null
@@ -1,116 +0,0 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Based on:
-# --------------------------------------------------------
-# DARTS
-# Copyright (c) 2018, Hanxiao Liu.
-# Licensed under the Apache License, Version 2.0;
-# --------------------------------------------------------
-
-from collections import namedtuple
-
-Genotype = namedtuple('Genotype', 'normal normal_concat reduce reduce_concat')
-
-PRIMITIVES = [
- 'none', 'max_pool_3x3', 'avg_pool_3x3', 'skip_connect', 'sep_conv_3x3',
- 'sep_conv_5x5', 'dil_conv_3x3', 'dil_conv_5x5'
-]
-
-NASNet = Genotype(
- normal=[
- ('sep_conv_5x5', 1),
- ('sep_conv_3x3', 0),
- ('sep_conv_5x5', 0),
- ('sep_conv_3x3', 0),
- ('avg_pool_3x3', 1),
- ('skip_connect', 0),
- ('avg_pool_3x3', 0),
- ('avg_pool_3x3', 0),
- ('sep_conv_3x3', 1),
- ('skip_connect', 1),
- ],
- normal_concat=[2, 3, 4, 5, 6],
- reduce=[
- ('sep_conv_5x5', 1),
- ('sep_conv_7x7', 0),
- ('max_pool_3x3', 1),
- ('sep_conv_7x7', 0),
- ('avg_pool_3x3', 1),
- ('sep_conv_5x5', 0),
- ('skip_connect', 3),
- ('avg_pool_3x3', 2),
- ('sep_conv_3x3', 2),
- ('max_pool_3x3', 1),
- ],
- reduce_concat=[4, 5, 6], )
-
-AmoebaNet = Genotype(
- normal=[
- ('avg_pool_3x3', 0),
- ('max_pool_3x3', 1),
- ('sep_conv_3x3', 0),
- ('sep_conv_5x5', 2),
- ('sep_conv_3x3', 0),
- ('avg_pool_3x3', 3),
- ('sep_conv_3x3', 1),
- ('skip_connect', 1),
- ('skip_connect', 0),
- ('avg_pool_3x3', 1),
- ],
- normal_concat=[4, 5, 6],
- reduce=[
- ('avg_pool_3x3', 0),
- ('sep_conv_3x3', 1),
- ('max_pool_3x3', 0),
- ('sep_conv_7x7', 2),
- ('sep_conv_7x7', 0),
- ('avg_pool_3x3', 1),
- ('max_pool_3x3', 0),
- ('max_pool_3x3', 1),
- ('conv_7x1_1x7', 0),
- ('sep_conv_3x3', 5),
- ],
- reduce_concat=[3, 4, 6])
-
-DARTS_V1 = Genotype(
- normal=[('sep_conv_3x3', 1), ('sep_conv_3x3', 0), ('skip_connect', 0),
- ('sep_conv_3x3', 1), ('skip_connect', 0), ('sep_conv_3x3', 1),
- ('sep_conv_3x3', 0), ('skip_connect', 2)],
- normal_concat=[2, 3, 4, 5],
- reduce=[('max_pool_3x3', 0), ('max_pool_3x3', 1), ('skip_connect', 2),
- ('max_pool_3x3', 0), ('max_pool_3x3', 0), ('skip_connect', 2),
- ('skip_connect', 2), ('avg_pool_3x3', 0)],
- reduce_concat=[2, 3, 4, 5])
-DARTS_V2 = Genotype(
- normal=[('sep_conv_3x3', 0), ('sep_conv_3x3', 1), ('sep_conv_3x3', 0),
- ('sep_conv_3x3', 1), ('sep_conv_3x3', 1), ('skip_connect', 0),
- ('skip_connect', 0), ('dil_conv_3x3', 2)],
- normal_concat=[2, 3, 4, 5],
- reduce=[('max_pool_3x3', 0), ('max_pool_3x3', 1), ('skip_connect', 2),
- ('max_pool_3x3', 1), ('max_pool_3x3', 0), ('skip_connect', 2),
- ('skip_connect', 2), ('max_pool_3x3', 1)],
- reduce_concat=[2, 3, 4, 5])
-
-MY_DARTS = Genotype(
- normal=[('sep_conv_3x3', 0), ('skip_connect', 1), ('skip_connect', 0),
- ('dil_conv_5x5', 1), ('skip_connect', 0), ('sep_conv_3x3', 1),
- ('skip_connect', 0), ('sep_conv_3x3', 1)],
- normal_concat=range(2, 6),
- reduce=[('max_pool_3x3', 0), ('max_pool_3x3', 1), ('max_pool_3x3', 0),
- ('skip_connect', 2), ('max_pool_3x3', 0), ('skip_connect', 2),
- ('skip_connect', 2), ('skip_connect', 3)],
- reduce_concat=range(2, 6))
-
-DARTS = MY_DARTS
diff --git a/AutoDL/LRC/learning_rate.py b/AutoDL/LRC/learning_rate.py
deleted file mode 100644
index 3965171b487884d36e4a7447f10f312204803bf8..0000000000000000000000000000000000000000
--- a/AutoDL/LRC/learning_rate.py
+++ /dev/null
@@ -1,43 +0,0 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Based on:
-# --------------------------------------------------------
-# DARTS
-# Copyright (c) 2018, Hanxiao Liu.
-# Licensed under the Apache License, Version 2.0;
-# --------------------------------------------------------
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import paddle
-import paddle.fluid as fluid
-import paddle.fluid.layers.ops as ops
-from paddle.fluid.layers.learning_rate_scheduler import _decay_step_counter
-import math
-from paddle.fluid.initializer import init_on_cpu
-
-
-def cosine_decay(learning_rate, num_epoch, steps_one_epoch):
- """Applies cosine decay to the learning rate.
-    lr = learning_rate * 0.5 * (math.cos(epoch * math.pi / num_epoch) + 1), where epoch = global_step / steps_one_epoch
- """
- global_step = _decay_step_counter()
-
- with init_on_cpu():
- decayed_lr = learning_rate * \
- (ops.cos((global_step / steps_one_epoch) \
- * math.pi / num_epoch) + 1)/2
- return decayed_lr
diff --git a/AutoDL/LRC/model.py b/AutoDL/LRC/model.py
deleted file mode 100644
index 45a403495ecc0b7cc0ac3b541d75702adbef31b2..0000000000000000000000000000000000000000
--- a/AutoDL/LRC/model.py
+++ /dev/null
@@ -1,313 +0,0 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
-#
-#Licensed under the Apache License, Version 2.0 (the "License");
-#you may not use this file except in compliance with the License.
-#You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-#Unless required by applicable law or agreed to in writing, software
-#distributed under the License is distributed on an "AS IS" BASIS,
-#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-#See the License for the specific language governing permissions and
-#limitations under the License.
-#
-# Based on:
-# --------------------------------------------------------
-# DARTS
-# Copyright (c) 2018, Hanxiao Liu.
-# Licensed under the Apache License, Version 2.0;
-# --------------------------------------------------------
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import os
-import sys
-import numpy as np
-import time
-import functools
-import paddle
-import paddle.fluid as fluid
-from operations import *
-
-
-class Cell(object):
- def __init__(self, genotype, C_prev_prev, C_prev, C, reduction,
- reduction_prev):
- print(C_prev_prev, C_prev, C)
-
- if reduction_prev:
- self.preprocess0 = functools.partial(FactorizedReduce, C_out=C)
- else:
- self.preprocess0 = functools.partial(
- ReLUConvBN, C_out=C, kernel_size=1, stride=1, padding=0)
- self.preprocess1 = functools.partial(
- ReLUConvBN, C_out=C, kernel_size=1, stride=1, padding=0)
- if reduction:
- op_names, indices = zip(*genotype.reduce)
- concat = genotype.reduce_concat
- else:
- op_names, indices = zip(*genotype.normal)
- concat = genotype.normal_concat
- print(op_names, indices, concat, reduction)
- self._compile(C, op_names, indices, concat, reduction)
-
- def _compile(self, C, op_names, indices, concat, reduction):
- assert len(op_names) == len(indices)
- self._steps = len(op_names) // 2
- self._concat = concat
- self.multiplier = len(concat)
-
- self._ops = []
- for name, index in zip(op_names, indices):
- stride = 2 if reduction and index < 2 else 1
- op = functools.partial(OPS[name], C=C, stride=stride, affine=True)
- self._ops += [op]
- self._indices = indices
-
- def forward(self, s0, s1, drop_prob, is_train, name):
- self.training = is_train
- preprocess0_name = name + 'preprocess0.'
- preprocess1_name = name + 'preprocess1.'
- s0 = self.preprocess0(s0, name=preprocess0_name)
- s1 = self.preprocess1(s1, name=preprocess1_name)
- out = [s0, s1]
- for i in range(self._steps):
- h1 = out[self._indices[2 * i]]
- h2 = out[self._indices[2 * i + 1]]
- op1 = self._ops[2 * i]
- op2 = self._ops[2 * i + 1]
- h3 = op1(h1, name=name + '_ops.' + str(2 * i) + '.')
- h4 = op2(h2, name=name + '_ops.' + str(2 * i + 1) + '.')
- if self.training and drop_prob > 0.:
- if h3 != h1:
- h3 = fluid.layers.dropout(
- h3,
- drop_prob,
- dropout_implementation='upscale_in_train')
- if h4 != h2:
- h4 = fluid.layers.dropout(
- h4,
- drop_prob,
- dropout_implementation='upscale_in_train')
- s = h3 + h4
- out += [s]
- return fluid.layers.concat([out[i] for i in self._concat], axis=1)
-
-
-def AuxiliaryHeadCIFAR(input, num_classes, aux_name='auxiliary_head'):
- relu_a = fluid.layers.relu(input)
- pool_a = fluid.layers.pool2d(relu_a, 5, 'avg', 3)
- conv2d_a = fluid.layers.conv2d(
- pool_a,
- 128,
- 1,
- name=aux_name + '.features.2',
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=aux_name + '.features.2.weight'),
- bias_attr=False)
- bn_a_name = aux_name + '.features.3'
- bn_a = fluid.layers.batch_norm(
- conv2d_a,
- act='relu',
- name=bn_a_name,
- param_attr=ParamAttr(
- initializer=Constant(1.), name=bn_a_name + '.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.), name=bn_a_name + '.bias'),
- moving_mean_name=bn_a_name + '.running_mean',
- moving_variance_name=bn_a_name + '.running_var')
- conv2d_b = fluid.layers.conv2d(
- bn_a,
- 768,
- 2,
- name=aux_name + '.features.5',
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=aux_name + '.features.5.weight'),
- bias_attr=False)
- bn_b_name = aux_name + '.features.6'
- bn_b = fluid.layers.batch_norm(
- conv2d_b,
- act='relu',
- name=bn_b_name,
- param_attr=ParamAttr(
- initializer=Constant(1.), name=bn_b_name + '.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.), name=bn_b_name + '.bias'),
- moving_mean_name=bn_b_name + '.running_mean',
- moving_variance_name=bn_b_name + '.running_var')
- fc_name = aux_name + '.classifier'
- fc = fluid.layers.fc(bn_b,
- num_classes,
- name=fc_name,
- param_attr=ParamAttr(
- initializer=Normal(scale=1e-3),
- name=fc_name + '.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.), name=fc_name + '.bias'))
- return fc
-
-
-def StemConv(input, C_out, kernel_size, padding):
- conv_a = fluid.layers.conv2d(
- input,
- C_out,
- kernel_size,
- padding=padding,
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0), name='stem.0.weight'),
- bias_attr=False)
- bn_a = fluid.layers.batch_norm(
- conv_a,
- param_attr=ParamAttr(
- initializer=Constant(1.), name='stem.1.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.), name='stem.1.bias'),
- moving_mean_name='stem.1.running_mean',
- moving_variance_name='stem.1.running_var')
- return bn_a
-
-
-class NetworkCIFAR(object):
- def __init__(self, C, class_num, layers, auxiliary, genotype):
- self.class_num = class_num
- self._layers = layers
- self._auxiliary = auxiliary
-
- stem_multiplier = 3
- self.drop_path_prob = 0
- C_curr = stem_multiplier * C
-
- C_prev_prev, C_prev, C_curr = C_curr, C_curr, C
- self.cells = []
- reduction_prev = False
- for i in range(layers):
- if i in [layers // 3, 2 * layers // 3]:
- C_curr *= 2
- reduction = True
- else:
- reduction = False
- cell = Cell(genotype, C_prev_prev, C_prev, C_curr, reduction,
- reduction_prev)
- reduction_prev = reduction
- self.cells += [cell]
- C_prev_prev, C_prev = C_prev, cell.multiplier * C_curr
- if i == 2 * layers // 3:
- C_to_auxiliary = C_prev
-
- def forward(self, init_channel, is_train):
- self.training = is_train
- self.logits_aux = None
- num_channel = init_channel * 3
- s0 = StemConv(self.image, num_channel, kernel_size=3, padding=1)
- s1 = s0
- for i, cell in enumerate(self.cells):
- name = 'cells.' + str(i) + '.'
- s0, s1 = s1, cell.forward(s0, s1, self.drop_path_prob, is_train,
- name)
- if i == int(2 * self._layers // 3):
- if self._auxiliary and self.training:
- self.logits_aux = AuxiliaryHeadCIFAR(s1, self.class_num)
- out = fluid.layers.adaptive_pool2d(s1, (1, 1), "avg")
- self.logits = fluid.layers.fc(out,
- size=self.class_num,
- param_attr=ParamAttr(
- initializer=Normal(scale=1e-3),
- name='classifier.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.),
- name='classifier.bias'))
- return self.logits, self.logits_aux
-
- def build_input(self, image_shape, batch_size, is_train):
- if is_train:
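-            # seven slots: image, y_a, y_b, mixup lambda, flattened label indices,
-            # flattened non-label indices, and the Rademacher variables for the LRC loss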
- py_reader = fluid.layers.py_reader(
- capacity=64,
- shapes=[[-1] + image_shape, [-1, 1], [-1, 1], [-1, 1], [-1, 1],
- [-1, 1], [-1, batch_size, self.class_num - 1]],
- lod_levels=[0, 0, 0, 0, 0, 0, 0],
- dtypes=[
- "float32", "int64", "int64", "float32", "int32", "int32",
- "float32"
- ],
- use_double_buffer=True,
- name='train_reader')
- else:
- py_reader = fluid.layers.py_reader(
- capacity=64,
- shapes=[[-1] + image_shape, [-1, 1]],
- lod_levels=[0, 0],
- dtypes=["float32", "int64"],
- use_double_buffer=True,
- name='test_reader')
- return py_reader
-
- def train_model(self, py_reader, init_channels, aux, aux_w, batch_size,
- loss_lambda):
- self.image, self.ya, self.yb, self.lam, self.label_reshape,\
- self.non_label_reshape, self.rad_var = fluid.layers.read_file(py_reader)
- self.logits, self.logits_aux = self.forward(init_channels, True)
-        # bind to local names so the mixup_loss/lrc_loss methods are not shadowed
-        mixup_loss = self.mixup_loss(aux, aux_w)
-        lrc_loss = self.lrc_loss(batch_size)
-        return mixup_loss + loss_lambda * lrc_loss
-
- def test_model(self, py_reader, init_channels):
- self.image, self.ya = fluid.layers.read_file(py_reader)
- self.logits, _ = self.forward(init_channels, False)
- prob = fluid.layers.softmax(self.logits, use_cudnn=False)
- loss = fluid.layers.cross_entropy(prob, self.ya)
- acc_1 = fluid.layers.accuracy(self.logits, self.ya, k=1)
- acc_5 = fluid.layers.accuracy(self.logits, self.ya, k=5)
- return loss, acc_1, acc_5
-
- def mixup_loss(self, auxiliary, auxiliary_weight):
- prob = fluid.layers.softmax(self.logits, use_cudnn=False)
- loss_a = fluid.layers.cross_entropy(prob, self.ya)
- loss_b = fluid.layers.cross_entropy(prob, self.yb)
- loss_a_mean = fluid.layers.reduce_mean(loss_a)
- loss_b_mean = fluid.layers.reduce_mean(loss_b)
- loss = self.lam * loss_a_mean + (1 - self.lam) * loss_b_mean
- if auxiliary:
- prob_aux = fluid.layers.softmax(self.logits_aux, use_cudnn=False)
- loss_a_aux = fluid.layers.cross_entropy(prob_aux, self.ya)
- loss_b_aux = fluid.layers.cross_entropy(prob_aux, self.yb)
- loss_a_aux_mean = fluid.layers.reduce_mean(loss_a_aux)
- loss_b_aux_mean = fluid.layers.reduce_mean(loss_b_aux)
-            loss_aux = self.lam * loss_a_aux_mean + (
-                1 - self.lam) * loss_b_aux_mean
-            loss = loss + auxiliary_weight * loss_aux
-        return loss
-
- def lrc_loss(self, batch_size):
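-        # LRC regularization: Rademacher (+/-1) variables weight the margins
-        # (non-label logit - label logit); the normalized absolute sum is the penalty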
- y_diff_reshape = fluid.layers.reshape(self.logits, shape=(-1, 1))
- label_reshape = fluid.layers.squeeze(self.label_reshape, axes=[1])
- non_label_reshape = fluid.layers.squeeze(
- self.non_label_reshape, axes=[1])
- label_reshape.stop_gradient = True
-        non_label_reshape.stop_gradient = True
-
- y_diff_label_reshape = fluid.layers.gather(y_diff_reshape,
- label_reshape)
- y_diff_non_label_reshape = fluid.layers.gather(y_diff_reshape,
- non_label_reshape)
- y_diff_label = fluid.layers.reshape(
- y_diff_label_reshape, shape=(-1, batch_size, 1))
- y_diff_non_label = fluid.layers.reshape(
- y_diff_non_label_reshape,
- shape=(-1, batch_size, self.class_num - 1))
- y_diff_ = y_diff_non_label - y_diff_label
-
- y_diff_ = fluid.layers.transpose(y_diff_, perm=[1, 2, 0])
- rad_var_trans = fluid.layers.transpose(self.rad_var, perm=[1, 2, 0])
- rad_y_diff_trans = rad_var_trans * y_diff_
- lrc_loss_sum = fluid.layers.reduce_sum(rad_y_diff_trans, dim=[0, 1])
- lrc_loss_ = fluid.layers.abs(lrc_loss_sum) / (batch_size *
- (self.class_num - 1))
- lrc_loss_mean = fluid.layers.reduce_mean(lrc_loss_)
-
- return lrc_loss_mean
diff --git a/AutoDL/LRC/operations.py b/AutoDL/LRC/operations.py
deleted file mode 100644
index b015722a1bc5dbf682c90812a971f3dbb2cd8c9a..0000000000000000000000000000000000000000
--- a/AutoDL/LRC/operations.py
+++ /dev/null
@@ -1,349 +0,0 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Based on:
-# --------------------------------------------------------
-# DARTS
-# Copyright (c) 2018, Hanxiao Liu.
-# Licensed under the Apache License, Version 2.0;
-# --------------------------------------------------------
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import os
-import sys
-import numpy as np
-import time
-import paddle
-import paddle.fluid as fluid
-from paddle.fluid.param_attr import ParamAttr
-from paddle.fluid.initializer import Xavier
-from paddle.fluid.initializer import Normal
-from paddle.fluid.initializer import Constant
-
-OPS = {
-    'none': lambda input, C, stride, name, affine: Zero(input, stride, name),
-    'avg_pool_3x3': lambda input, C, stride, name, affine: fluid.layers.pool2d(input, 3, 'avg', pool_stride=stride, pool_padding=1, name=name),
-    'max_pool_3x3': lambda input, C, stride, name, affine: fluid.layers.pool2d(input, 3, 'max', pool_stride=stride, pool_padding=1, name=name),
-    'skip_connect': lambda input, C, stride, name, affine: Identity(input, name) if stride == 1 else FactorizedReduce(input, C, name=name, affine=affine),
-    'sep_conv_3x3': lambda input, C, stride, name, affine: SepConv(input, C, C, 3, stride, 1, name=name, affine=affine),
-    'sep_conv_5x5': lambda input, C, stride, name, affine: SepConv(input, C, C, 5, stride, 2, name=name, affine=affine),
-    'sep_conv_7x7': lambda input, C, stride, name, affine: SepConv(input, C, C, 7, stride, 3, name=name, affine=affine),
-    'dil_conv_3x3': lambda input, C, stride, name, affine: DilConv(input, C, C, 3, stride, 2, 2, name=name, affine=affine),
-    'dil_conv_5x5': lambda input, C, stride, name, affine: DilConv(input, C, C, 5, stride, 4, 2, name=name, affine=affine),
-    # SevenConv requires the stride argument, so pass it through like the other ops
-    'conv_7x1_1x7': lambda input, C, stride, name, affine: SevenConv(input, C, stride, name=name, affine=affine)
-}
-
-
-def ReLUConvBN(input, C_out, kernel_size, stride, padding, name='',
- affine=True):
- relu_a = fluid.layers.relu(input)
- conv2d_a = fluid.layers.conv2d(
- relu_a,
- C_out,
- kernel_size,
- stride,
- padding,
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=name + 'op.1.weight'),
- bias_attr=False)
- if affine:
- reluconvbn_out = fluid.layers.batch_norm(
- conv2d_a,
- param_attr=ParamAttr(
- initializer=Constant(1.), name=name + 'op.2.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.), name=name + 'op.2.bias'),
- moving_mean_name=name + 'op.2.running_mean',
- moving_variance_name=name + 'op.2.running_var')
- else:
- reluconvbn_out = fluid.layers.batch_norm(
- conv2d_a,
- param_attr=ParamAttr(
- initializer=Constant(1.),
- learning_rate=0.,
- name=name + 'op.2.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.),
- learning_rate=0.,
- name=name + 'op.2.bias'),
- moving_mean_name=name + 'op.2.running_mean',
- moving_variance_name=name + 'op.2.running_var')
- return reluconvbn_out
-
-
-def DilConv(input,
- C_in,
- C_out,
- kernel_size,
- stride,
- padding,
- dilation,
- name='',
- affine=True):
- relu_a = fluid.layers.relu(input)
- conv2d_a = fluid.layers.conv2d(
- relu_a,
- C_in,
- kernel_size,
- stride,
- padding,
- dilation,
- groups=C_in,
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=name + 'op.1.weight'),
- bias_attr=False,
- use_cudnn=False)
- conv2d_b = fluid.layers.conv2d(
- conv2d_a,
- C_out,
- 1,
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=name + 'op.2.weight'),
- bias_attr=False)
- if affine:
- dilconv_out = fluid.layers.batch_norm(
- conv2d_b,
- param_attr=ParamAttr(
- initializer=Constant(1.), name=name + 'op.3.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.), name=name + 'op.3.bias'),
- moving_mean_name=name + 'op.3.running_mean',
- moving_variance_name=name + 'op.3.running_var')
- else:
- dilconv_out = fluid.layers.batch_norm(
- conv2d_b,
- param_attr=ParamAttr(
- initializer=Constant(1.),
- learning_rate=0.,
- name=name + 'op.3.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.),
- learning_rate=0.,
- name=name + 'op.3.bias'),
- moving_mean_name=name + 'op.3.running_mean',
- moving_variance_name=name + 'op.3.running_var')
- return dilconv_out
-
-
-def SepConv(input,
- C_in,
- C_out,
- kernel_size,
- stride,
- padding,
- name='',
- affine=True):
- relu_a = fluid.layers.relu(input)
- conv2d_a = fluid.layers.conv2d(
- relu_a,
- C_in,
- kernel_size,
- stride,
- padding,
- groups=C_in,
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=name + 'op.1.weight'),
- bias_attr=False,
- use_cudnn=False)
- conv2d_b = fluid.layers.conv2d(
- conv2d_a,
- C_in,
- 1,
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=name + 'op.2.weight'),
- bias_attr=False)
- if affine:
- bn_a = fluid.layers.batch_norm(
- conv2d_b,
- param_attr=ParamAttr(
- initializer=Constant(1.), name=name + 'op.3.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.), name=name + 'op.3.bias'),
- moving_mean_name=name + 'op.3.running_mean',
- moving_variance_name=name + 'op.3.running_var')
- else:
- bn_a = fluid.layers.batch_norm(
- conv2d_b,
- param_attr=ParamAttr(
- initializer=Constant(1.),
- learning_rate=0.,
- name=name + 'op.3.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.),
- learning_rate=0.,
- name=name + 'op.3.bias'),
- moving_mean_name=name + 'op.3.running_mean',
- moving_variance_name=name + 'op.3.running_var')
-
- relu_b = fluid.layers.relu(bn_a)
- conv2d_d = fluid.layers.conv2d(
- relu_b,
- C_in,
- kernel_size,
- 1,
- padding,
- groups=C_in,
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=name + 'op.5.weight'),
- bias_attr=False,
- use_cudnn=False)
- conv2d_e = fluid.layers.conv2d(
- conv2d_d,
- C_out,
- 1,
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=name + 'op.6.weight'),
- bias_attr=False)
- if affine:
- sepconv_out = fluid.layers.batch_norm(
- conv2d_e,
- param_attr=ParamAttr(
- initializer=Constant(1.), name=name + 'op.7.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.), name=name + 'op.7.bias'),
- moving_mean_name=name + 'op.7.running_mean',
- moving_variance_name=name + 'op.7.running_var')
- else:
- sepconv_out = fluid.layers.batch_norm(
- conv2d_e,
- param_attr=ParamAttr(
- initializer=Constant(1.),
- learning_rate=0.,
- name=name + 'op.7.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.),
- learning_rate=0.,
- name=name + 'op.7.bias'),
- moving_mean_name=name + 'op.7.running_mean',
- moving_variance_name=name + 'op.7.running_var')
- return sepconv_out
-
-
-def SevenConv(input, C_out, stride, name='', affine=True):
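-    # factorized 7x7 convolution: a 1x7 conv followed by a 7x1 conv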
- relu_a = fluid.layers.relu(input)
- conv2d_a = fluid.layers.conv2d(
- relu_a,
- C_out, (1, 7), (1, stride), (0, 3),
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=name + 'op.1.weight'),
- bias_attr=False)
- conv2d_b = fluid.layers.conv2d(
- conv2d_a,
- C_out, (7, 1), (stride, 1), (3, 0),
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=name + 'op.2.weight'),
- bias_attr=False)
- if affine:
- out = fluid.layers.batch_norm(
- conv2d_b,
- param_attr=ParamAttr(
- initializer=Constant(1.), name=name + 'op.3.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.), name=name + 'op.3.bias'),
- moving_mean_name=name + 'op.3.running_mean',
- moving_variance_name=name + 'op.3.running_var')
- else:
- out = fluid.layers.batch_norm(
- conv2d_b,
- param_attr=ParamAttr(
- initializer=Constant(1.),
- learning_rate=0.,
- name=name + 'op.3.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.),
- learning_rate=0.,
- name=name + 'op.3.bias'),
- moving_mean_name=name + 'op.3.running_mean',
-            moving_variance_name=name + 'op.3.running_var')
-    return out
-
-
-def Identity(input, name=''):
- return input
-
-
-def Zero(input, stride, name=''):
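-    # the 'none' op: multiply by a mask that zeroes every stride-th position
-    # (with stride 1 the mask is all zeros, i.e. the output is zero everywhere)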
-    ones = np.ones(input.shape[-2:], dtype='float32')
-    ones[::stride, ::stride] = 0
-    ones = fluid.layers.assign(ones)
- return input * ones
-
-
-def FactorizedReduce(input, C_out, name='', affine=True):
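-    # halve the spatial resolution with two stride-2 1x1 convs, the second on a
-    # one-pixel-shifted view, then concatenate along the channel axis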
- relu_a = fluid.layers.relu(input)
- conv2d_a = fluid.layers.conv2d(
- relu_a,
- C_out // 2,
- 1,
- 2,
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=name + 'conv_1.weight'),
- bias_attr=False)
- h_end = relu_a.shape[2]
- w_end = relu_a.shape[3]
- slice_a = fluid.layers.slice(relu_a, [2, 3], [1, 1], [h_end, w_end])
- conv2d_b = fluid.layers.conv2d(
- slice_a,
- C_out // 2,
- 1,
- 2,
- param_attr=ParamAttr(
- initializer=Xavier(
- uniform=False, fan_in=0),
- name=name + 'conv_2.weight'),
- bias_attr=False)
- out = fluid.layers.concat([conv2d_a, conv2d_b], axis=1)
- if affine:
- out = fluid.layers.batch_norm(
- out,
- param_attr=ParamAttr(
- initializer=Constant(1.), name=name + 'bn.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.), name=name + 'bn.bias'),
- moving_mean_name=name + 'bn.running_mean',
- moving_variance_name=name + 'bn.running_var')
- else:
- out = fluid.layers.batch_norm(
- out,
- param_attr=ParamAttr(
- initializer=Constant(1.),
- learning_rate=0.,
- name=name + 'bn.weight'),
- bias_attr=ParamAttr(
- initializer=Constant(0.),
- learning_rate=0.,
- name=name + 'bn.bias'),
- moving_mean_name=name + 'bn.running_mean',
- moving_variance_name=name + 'bn.running_var')
- return out
diff --git a/AutoDL/LRC/reader.py b/AutoDL/LRC/reader.py
deleted file mode 100644
index 20b32b504e9245c4ff3892f08736d800080daab4..0000000000000000000000000000000000000000
--- a/AutoDL/LRC/reader.py
+++ /dev/null
@@ -1,187 +0,0 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Based on:
-# --------------------------------------------------------
-# DARTS
-# Copyright (c) 2018, Hanxiao Liu.
-# Licensed under the Apache License, Version 2.0;
-# --------------------------------------------------------
-"""
-CIFAR-10 dataset.
-This module will download dataset from
-https://www.cs.toronto.edu/~kriz/cifar.html and parse train/test set into
-paddle reader creators.
-The CIFAR-10 dataset consists of 60000 32x32 colour images in 10 classes,
-with 6000 images per class. There are 50000 training images and 10000 test images.
-"""
-
-from PIL import Image
-from PIL import ImageOps
-import numpy as np
-
-import cPickle
-import random
-import utils
-import paddle.fluid as fluid
-import time
-import os
-import functools
-import paddle.reader
-
-__all__ = ['train10', 'test10']
-
-image_size = 32
-image_depth = 3
-half_length = 8
-
-CIFAR_MEAN = [0.4914, 0.4822, 0.4465]
-CIFAR_STD = [0.24703233, 0.24348505, 0.26158768]
-
-
-def generate_reshape_label(label, batch_size, CIFAR_CLASSES=10):
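-    # build flattened indices into the (batch * num_classes) logit vector for the
-    # true label of each sample and for every non-label class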
- reshape_label = np.zeros((batch_size, 1), dtype='int32')
- reshape_non_label = np.zeros(
- (batch_size * (CIFAR_CLASSES - 1), 1), dtype='int32')
- num = 0
- for i in range(batch_size):
- label_i = label[i]
- reshape_label[i] = label_i + i * CIFAR_CLASSES
- for j in range(CIFAR_CLASSES):
- if label_i != j:
- reshape_non_label[num] = \
- j + i * CIFAR_CLASSES
- num += 1
- return reshape_label, reshape_non_label
-
-
-def generate_bernoulli_number(batch_size, CIFAR_CLASSES=10):
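-    # sample Rademacher (+/-1) variables for the LRC loss: binomial {0, 1} draws
-    # are mapped to {-1, +1} via 2 * x - 1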
- rcc_iters = 50
- rad_var = np.zeros((rcc_iters, batch_size, CIFAR_CLASSES - 1))
- for i in range(rcc_iters):
- bernoulli_num = np.random.binomial(size=batch_size, n=1, p=0.5)
- bernoulli_map = np.array([])
- ones = np.ones((CIFAR_CLASSES - 1, 1))
- for batch_id in range(batch_size):
- num = bernoulli_num[batch_id]
- var_id = 2 * ones * num - 1
- bernoulli_map = np.append(bernoulli_map, var_id)
- rad_var[i] = bernoulli_map.reshape((batch_size, CIFAR_CLASSES - 1))
- return rad_var.astype('float32')
-
-
-def preprocess(sample, is_training, args):
- image_array = sample.reshape(3, image_size, image_size)
- rgb_array = np.transpose(image_array, (1, 2, 0))
- img = Image.fromarray(rgb_array, 'RGB')
-
- if is_training:
-        # pad and random crop
- img = ImageOps.expand(img, (4, 4, 4, 4), fill=0) # pad to 40 * 40 * 3
- left_top = np.random.randint(9, size=2) # rand 0 - 8
- img = img.crop((left_top[0], left_top[1], left_top[0] + image_size,
- left_top[1] + image_size))
- if np.random.randint(2):
- img = img.transpose(Image.FLIP_LEFT_RIGHT)
-
- img = np.array(img).astype(np.float32)
-
- # per_image_standardization
- img_float = img / 255.0
- img = (img_float - CIFAR_MEAN) / CIFAR_STD
-
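-    # cutout: zero a square of side up to 2 * half_length around a random center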
- if is_training and args.cutout:
- center = np.random.randint(image_size, size=2)
- offset_width = max(0, center[0] - half_length)
- offset_height = max(0, center[1] - half_length)
- target_width = min(center[0] + half_length, image_size)
- target_height = min(center[1] + half_length, image_size)
-
- for i in range(offset_height, target_height):
- for j in range(offset_width, target_width):
- img[i][j][:] = 0.0
-
- img = np.transpose(img, (2, 0, 1))
- return img
-
-
-def reader_creator_filepath(filename, sub_name, is_training, args):
- files = os.listdir(filename)
- names = [each_item for each_item in files if sub_name in each_item]
- names.sort()
- datasets = []
- for name in names:
- print("Reading file " + name)
-        batch = cPickle.load(open(os.path.join(filename, name), 'rb'))
- data = batch['data']
- labels = batch.get('labels', batch.get('fine_labels', None))
- assert labels is not None
- dataset = zip(data, labels)
- datasets.extend(dataset)
- random.shuffle(datasets)
-
- def read_batch(datasets, args):
- for sample, label in datasets:
- im = preprocess(sample, is_training, args)
- yield im, [int(label)]
-
- def reader():
- batch_data = []
- batch_label = []
- for data, label in read_batch(datasets, args):
- batch_data.append(data)
- batch_label.append(label)
- if len(batch_data) == args.batch_size:
- batch_data = np.array(batch_data, dtype='float32')
- batch_label = np.array(batch_label, dtype='int64')
- if is_training:
- flatten_label, flatten_non_label = \
- generate_reshape_label(batch_label, args.batch_size)
- rad_var = generate_bernoulli_number(args.batch_size)
- mixed_x, y_a, y_b, lam = utils.mixup_data(
- batch_data, batch_label, args.batch_size,
- args.mix_alpha)
- batch_out = [[mixed_x, y_a, y_b, lam, flatten_label, \
- flatten_non_label, rad_var]]
- yield batch_out
- else:
- batch_out = [[batch_data, batch_label]]
- yield batch_out
- batch_data = []
- batch_label = []
-
- return reader
-
-
-def train10(args):
- """
- CIFAR-10 training set creator.
- It returns a reader creator, each sample in the reader is image pixels in
- [0, 1] and label in [0, 9].
- :return: Training reader creator
- :rtype: callable
- """
-
- return reader_creator_filepath(args.data, 'data_batch', True, args)
-
-
-def test10(args):
- """
- CIFAR-10 test set creator.
- It returns a reader creator, each sample in the reader is image pixels in
- [0, 1] and label in [0, 9].
- :return: Test reader creator.
- :rtype: callable
- """
- return reader_creator_filepath(args.data, 'test_batch', False, args)
diff --git a/AutoDL/LRC/run.sh b/AutoDL/LRC/run.sh
deleted file mode 100644
index 9f1a045d49789c3e9aebbc2a73b84b11da471b5a..0000000000000000000000000000000000000000
--- a/AutoDL/LRC/run.sh
+++ /dev/null
@@ -1,8 +0,0 @@
-CUDA_VISIBLE_DEVICES=0 python -u train_mixup.py \
---batch_size=80 \
---auxiliary \
---weight_decay=0.0003 \
---learning_rate=0.025 \
---lrc_loss_lambda=0.7 \
---cutout
-
diff --git a/AutoDL/LRC/train_mixup.py b/AutoDL/LRC/train_mixup.py
deleted file mode 100644
index de752c84bcf9276aa83540d60370517e66c0704f..0000000000000000000000000000000000000000
--- a/AutoDL/LRC/train_mixup.py
+++ /dev/null
@@ -1,247 +0,0 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Based on:
-# --------------------------------------------------------
-# DARTS
-# Copyright (c) 2018, Hanxiao Liu.
-# Licensed under the Apache License, Version 2.0;
-# --------------------------------------------------------
-
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-from learning_rate import cosine_decay
-import numpy as np
-import argparse
-from model import NetworkCIFAR as Network
-import reader
-import sys
-import os
-import time
-import logging
-import genotypes
-import paddle.fluid as fluid
-import shutil
-import utils
-import cPickle as cp
-
-parser = argparse.ArgumentParser("cifar")
-parser.add_argument(
- '--data',
- type=str,
- default='./dataset/cifar/cifar-10-batches-py/',
- help='location of the data corpus')
-parser.add_argument('--batch_size', type=int, default=96, help='batch size')
-parser.add_argument(
- '--learning_rate', type=float, default=0.025, help='init learning rate')
-parser.add_argument('--momentum', type=float, default=0.9, help='momentum')
-parser.add_argument(
- '--weight_decay', type=float, default=3e-4, help='weight decay')
-parser.add_argument(
- '--report_freq', type=float, default=50, help='report frequency')
-parser.add_argument(
- '--epochs', type=int, default=600, help='num of training epochs')
-parser.add_argument(
- '--init_channels', type=int, default=36, help='num of init channels')
-parser.add_argument(
- '--layers', type=int, default=20, help='total number of layers')
-parser.add_argument(
- '--model_path',
- type=str,
- default='saved_models',
- help='path to save the model')
-parser.add_argument(
- '--auxiliary',
- action='store_true',
- default=False,
- help='use auxiliary tower')
-parser.add_argument(
- '--auxiliary_weight',
- type=float,
- default=0.4,
- help='weight for auxiliary loss')
-parser.add_argument(
- '--cutout', action='store_true', default=False, help='use cutout')
-parser.add_argument(
- '--cutout_length', type=int, default=16, help='cutout length')
-parser.add_argument(
- '--drop_path_prob', type=float, default=0.2, help='drop path probability')
-parser.add_argument('--save', type=str, default='EXP', help='experiment name')
-parser.add_argument(
- '--arch', type=str, default='DARTS', help='which architecture to use')
-parser.add_argument(
- '--grad_clip', type=float, default=5, help='gradient clipping')
-parser.add_argument(
- '--lr_exp_decay',
- action='store_true',
- default=False,
- help='use exponential_decay learning_rate')
-parser.add_argument('--mix_alpha', type=float, default=0.5, help='mixup alpha')
-parser.add_argument(
- '--lrc_loss_lambda', default=0, type=float, help='lrc_loss_lambda')
-parser.add_argument(
- '--loss_type',
- default=1,
- type=float,
- help='loss_type 0: cross entropy 1: multi margin loss 2: max margin loss')
-
-args = parser.parse_args()
-
-CIFAR_CLASSES = 10
-dataset_train_size = 50000
-image_size = 32
-
-
-def main():
- image_shape = [3, image_size, image_size]
- devices = os.getenv("CUDA_VISIBLE_DEVICES") or ""
- devices_num = len(devices.split(","))
- logging.info("args = %s", args)
- genotype = eval("genotypes.%s" % args.arch)
- model = Network(args.init_channels, CIFAR_CLASSES, args.layers,
- args.auxiliary, genotype)
-    # integer division: "from __future__ import division" makes "/" return a float
-    steps_one_epoch = dataset_train_size // (devices_num * args.batch_size)
- train(model, args, image_shape, steps_one_epoch)
-
-
-def build_program(main_prog, startup_prog, args, is_train, model, im_shape,
- steps_one_epoch):
- out = []
- with fluid.program_guard(main_prog, startup_prog):
- py_reader = model.build_input(im_shape, args.batch_size, is_train)
- if is_train:
- with fluid.unique_name.guard():
- loss = model.train_model(py_reader, args.init_channels,
- args.auxiliary, args.auxiliary_weight,
- args.batch_size, args.lrc_loss_lambda)
- optimizer = fluid.optimizer.Momentum(
- learning_rate=cosine_decay(args.learning_rate, \
- args.epochs, steps_one_epoch),
- regularization=fluid.regularizer.L2Decay(\
- args.weight_decay),
- momentum=args.momentum)
- optimizer.minimize(loss)
- out = [py_reader, loss]
- else:
- with fluid.unique_name.guard():
- loss, acc_1, acc_5 = model.test_model(py_reader,
- args.init_channels)
- out = [py_reader, loss, acc_1, acc_5]
- return out
-
-
-def train(model, args, im_shape, steps_one_epoch):
- train_startup_prog = fluid.Program()
- test_startup_prog = fluid.Program()
- train_prog = fluid.Program()
- test_prog = fluid.Program()
-
- train_py_reader, loss_train = build_program(train_prog, train_startup_prog,
- args, True, model, im_shape,
- steps_one_epoch)
-
- test_py_reader, loss_test, acc_1, acc_5 = build_program(
- test_prog, test_startup_prog, args, False, model, im_shape,
- steps_one_epoch)
-
- test_prog = test_prog.clone(for_test=True)
-
- place = fluid.CUDAPlace(0)
- exe = fluid.Executor(place)
- exe.run(train_startup_prog)
- exe.run(test_startup_prog)
-
- exec_strategy = fluid.ExecutionStrategy()
- exec_strategy.num_threads = 1
- train_exe = fluid.ParallelExecutor(
- main_program=train_prog,
- use_cuda=True,
- loss_name=loss_train.name,
- exec_strategy=exec_strategy)
- train_reader = reader.train10(args)
- test_reader = reader.test10(args)
- train_py_reader.decorate_paddle_reader(train_reader)
- test_py_reader.decorate_paddle_reader(test_reader)
-
- fluid.clip.set_gradient_clip(fluid.clip.GradientClipByNorm(args.grad_clip))
- fluid.memory_optimize(fluid.default_main_program())
-
- def save_model(postfix, main_prog):
- model_path = os.path.join(args.model_path, postfix)
- if os.path.isdir(model_path):
- shutil.rmtree(model_path)
- fluid.io.save_persistables(exe, model_path, main_program=main_prog)
-
- def test(epoch_id):
- test_fetch_list = [loss_test, acc_1, acc_5]
- objs = utils.AvgrageMeter()
- top1 = utils.AvgrageMeter()
- top5 = utils.AvgrageMeter()
- test_py_reader.start()
- test_start_time = time.time()
- step_id = 0
- try:
- while True:
- prev_test_start_time = test_start_time
- test_start_time = time.time()
- loss_test_v, acc_1_v, acc_5_v = exe.run(
- test_prog, fetch_list=test_fetch_list)
- objs.update(np.array(loss_test_v), args.batch_size)
- top1.update(np.array(acc_1_v), args.batch_size)
- top5.update(np.array(acc_5_v), args.batch_size)
- if step_id % args.report_freq == 0:
- print("Epoch {}, Step {}, acc_1 {}, acc_5 {}, time {}".
- format(epoch_id, step_id,
- np.array(acc_1_v),
- np.array(acc_5_v), test_start_time -
- prev_test_start_time))
- step_id += 1
- except fluid.core.EOFException:
- test_py_reader.reset()
- print("Epoch {0}, top1 {1}, top5 {2}".format(epoch_id, top1.avg,
- top5.avg))
-
- train_fetch_list = [loss_train]
- epoch_start_time = time.time()
- for epoch_id in range(args.epochs):
- model.drop_path_prob = args.drop_path_prob * epoch_id / args.epochs
- train_py_reader.start()
- epoch_end_time = time.time()
- if epoch_id > 0:
- print("Epoch {}, total time {}".format(epoch_id - 1, epoch_end_time
- - epoch_start_time))
- epoch_start_time = epoch_end_time
- start_time = time.time()
- step_id = 0
- try:
- while True:
- prev_start_time = start_time
- start_time = time.time()
- loss_v, = train_exe.run(
- fetch_list=[v.name for v in train_fetch_list])
- print("Epoch {}, Step {}, loss {}, time {}".format(epoch_id, step_id, \
- np.array(loss_v).mean(), start_time-prev_start_time))
- step_id += 1
- sys.stdout.flush()
- except fluid.core.EOFException:
- train_py_reader.reset()
- if epoch_id % 50 == 0 or epoch_id == args.epochs - 1:
- save_model(str(epoch_id), train_prog)
- test(epoch_id)
-
-
-if __name__ == '__main__':
- main()
diff --git a/AutoDL/LRC/utils.py b/AutoDL/LRC/utils.py
deleted file mode 100644
index 4002b57c6e91f9a4f7992156c4fa07f9e55d628c..0000000000000000000000000000000000000000
--- a/AutoDL/LRC/utils.py
+++ /dev/null
@@ -1,55 +0,0 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Based on:
-# --------------------------------------------------------
-# DARTS
-# Copyright (c) 2018, Hanxiao Liu.
-# Licensed under the Apache License, Version 2.0;
-# --------------------------------------------------------
-
-import os
-import sys
-import time
-import math
-import numpy as np
-
-
-def mixup_data(x, y, batch_size, alpha=1.0):
- '''Compute the mixup data. Return mixed inputs, pairs of targets, and lambda'''
- if alpha > 0.:
- lam = np.random.beta(alpha, alpha)
- else:
- lam = 1.
- index = np.random.permutation(batch_size)
-
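-    # mixup: convex combination of each sample with a randomly permuted partner,
-    # mixed_x = lam * x_i + (1 - lam) * x_j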
- mixed_x = lam * x + (1 - lam) * x[index, :]
- y_a, y_b = y, y[index]
- return mixed_x.astype('float32'), y_a.astype('int64'),\
- y_b.astype('int64'), np.array(lam, dtype='float32')
-
-
-class AvgrageMeter(object):
- def __init__(self):
- self.reset()
-
- def reset(self):
- self.avg = 0
- self.sum = 0
- self.cnt = 0
-
- def update(self, val, n=1):
- self.sum += val * n
- self.cnt += n
- self.avg = self.sum / self.cnt
diff --git a/PaddleRL/DeepQNetwork/DQN_agent.py b/PaddleRL/DeepQNetwork/DQN_agent.py
deleted file mode 100644
index 1b27051a1a4793ee99fd6d735eb876d483eece34..0000000000000000000000000000000000000000
--- a/PaddleRL/DeepQNetwork/DQN_agent.py
+++ /dev/null
@@ -1,191 +0,0 @@
-#-*- coding: utf-8 -*-
-
-import math
-import numpy as np
-import paddle.fluid as fluid
-from paddle.fluid.param_attr import ParamAttr
-from tqdm import tqdm
-
-
-class DQNModel(object):
- def __init__(self, state_dim, action_dim, gamma, hist_len, use_cuda=False):
- self.img_height = state_dim[0]
- self.img_width = state_dim[1]
- self.action_dim = action_dim
- self.gamma = gamma
- self.exploration = 1.1
- self.update_target_steps = 10000 // 4
- self.hist_len = hist_len
- self.use_cuda = use_cuda
-
- self.global_step = 0
- self._build_net()
-
- def _get_inputs(self):
- return fluid.layers.data(
- name='state',
- shape=[self.hist_len, self.img_height, self.img_width],
- dtype='float32'), \
- fluid.layers.data(
- name='action', shape=[1], dtype='int32'), \
- fluid.layers.data(
- name='reward', shape=[], dtype='float32'), \
- fluid.layers.data(
- name='next_s',
- shape=[self.hist_len, self.img_height, self.img_width],
- dtype='float32'), \
- fluid.layers.data(
- name='isOver', shape=[], dtype='bool')
-
- def _build_net(self):
- self.predict_program = fluid.Program()
- self.train_program = fluid.Program()
- self._sync_program = fluid.Program()
-
- with fluid.program_guard(self.predict_program):
- state, action, reward, next_s, isOver = self._get_inputs()
- self.pred_value = self.get_DQN_prediction(state)
-
- with fluid.program_guard(self.train_program):
- state, action, reward, next_s, isOver = self._get_inputs()
- pred_value = self.get_DQN_prediction(state)
-
- reward = fluid.layers.clip(reward, min=-1.0, max=1.0)
-
- action_onehot = fluid.layers.one_hot(action, self.action_dim)
- action_onehot = fluid.layers.cast(action_onehot, dtype='float32')
-
- pred_action_value = fluid.layers.reduce_sum(
- fluid.layers.elementwise_mul(action_onehot, pred_value), dim=1)
-
- targetQ_predict_value = self.get_DQN_prediction(next_s, target=True)
- best_v = fluid.layers.reduce_max(targetQ_predict_value, dim=1)
- best_v.stop_gradient = True
-
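-            # Bellman target: r + (1 - done) * gamma * max_a' Q_target(s', a')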
- target = reward + (1.0 - fluid.layers.cast(
- isOver, dtype='float32')) * self.gamma * best_v
- cost = fluid.layers.square_error_cost(pred_action_value, target)
- cost = fluid.layers.reduce_mean(cost)
-
- optimizer = fluid.optimizer.Adam(1e-3 * 0.5, epsilon=1e-3)
- optimizer.minimize(cost)
-
- vars = list(self.train_program.list_vars())
- target_vars = list(filter(
- lambda x: 'GRAD' not in x.name and 'target' in x.name, vars))
-
- policy_vars_name = [
- x.name.replace('target', 'policy') for x in target_vars]
- policy_vars = list(filter(
- lambda x: x.name in policy_vars_name, vars))
-
- policy_vars.sort(key=lambda x: x.name)
- target_vars.sort(key=lambda x: x.name)
-
- with fluid.program_guard(self._sync_program):
- sync_ops = []
- for i, var in enumerate(policy_vars):
- sync_op = fluid.layers.assign(policy_vars[i], target_vars[i])
- sync_ops.append(sync_op)
-
- # fluid exe
- place = fluid.CUDAPlace(0) if self.use_cuda else fluid.CPUPlace()
- self.exe = fluid.Executor(place)
- self.exe.run(fluid.default_startup_program())
-
- def get_DQN_prediction(self, image, target=False):
- image = image / 255.0
-
- variable_field = 'target' if target else 'policy'
-
- conv1 = fluid.layers.conv2d(
- input=image,
- num_filters=32,
- filter_size=5,
- stride=1,
- padding=2,
- act='relu',
- param_attr=ParamAttr(name='{}_conv1'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv1_b'.format(variable_field)))
- max_pool1 = fluid.layers.pool2d(
- input=conv1, pool_size=2, pool_stride=2, pool_type='max')
-
- conv2 = fluid.layers.conv2d(
- input=max_pool1,
- num_filters=32,
- filter_size=5,
- stride=1,
- padding=2,
- act='relu',
- param_attr=ParamAttr(name='{}_conv2'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv2_b'.format(variable_field)))
- max_pool2 = fluid.layers.pool2d(
- input=conv2, pool_size=2, pool_stride=2, pool_type='max')
-
- conv3 = fluid.layers.conv2d(
- input=max_pool2,
- num_filters=64,
- filter_size=4,
- stride=1,
- padding=1,
- act='relu',
- param_attr=ParamAttr(name='{}_conv3'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv3_b'.format(variable_field)))
- max_pool3 = fluid.layers.pool2d(
- input=conv3, pool_size=2, pool_stride=2, pool_type='max')
-
- conv4 = fluid.layers.conv2d(
- input=max_pool3,
- num_filters=64,
- filter_size=3,
- stride=1,
- padding=1,
- act='relu',
- param_attr=ParamAttr(name='{}_conv4'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv4_b'.format(variable_field)))
-
- flatten = fluid.layers.flatten(conv4, axis=1)
-
- out = fluid.layers.fc(
- input=flatten,
- size=self.action_dim,
- param_attr=ParamAttr(name='{}_fc1'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_fc1_b'.format(variable_field)))
- return out
-
-
- def act(self, state, train_or_test):
- sample = np.random.random()
- if train_or_test == 'train' and sample < self.exploration:
- act = np.random.randint(self.action_dim)
- else:
- if np.random.random() < 0.01:
- act = np.random.randint(self.action_dim)
- else:
- state = np.expand_dims(state, axis=0)
- pred_Q = self.exe.run(self.predict_program,
- feed={'state': state.astype('float32')},
- fetch_list=[self.pred_value])[0]
- pred_Q = np.squeeze(pred_Q, axis=0)
- act = np.argmax(pred_Q)
- if train_or_test == 'train':
- self.exploration = max(0.1, self.exploration - 1e-6)
- return act
-
- def train(self, state, action, reward, next_state, isOver):
- if self.global_step % self.update_target_steps == 0:
- self.sync_target_network()
- self.global_step += 1
-
- action = np.expand_dims(action, -1)
- self.exe.run(self.train_program,
- feed={
- 'state': state.astype('float32'),
- 'action': action.astype('int32'),
- 'reward': reward,
- 'next_s': next_state.astype('float32'),
- 'isOver': isOver
- })
-
- def sync_target_network(self):
- self.exe.run(self._sync_program)
diff --git a/PaddleRL/DeepQNetwork/DoubleDQN_agent.py b/PaddleRL/DeepQNetwork/DoubleDQN_agent.py
deleted file mode 100644
index ecd94abd459e728ac7c845ebee09adcbd6bbdd22..0000000000000000000000000000000000000000
--- a/PaddleRL/DeepQNetwork/DoubleDQN_agent.py
+++ /dev/null
@@ -1,199 +0,0 @@
-#-*- coding: utf-8 -*-
-
-import math
-import numpy as np
-import paddle.fluid as fluid
-from paddle.fluid.param_attr import ParamAttr
-from tqdm import tqdm
-
-
-class DoubleDQNModel(object):
- def __init__(self, state_dim, action_dim, gamma, hist_len, use_cuda=False):
- self.img_height = state_dim[0]
- self.img_width = state_dim[1]
- self.action_dim = action_dim
- self.gamma = gamma
- self.exploration = 1.1
- self.update_target_steps = 10000 // 4
- self.hist_len = hist_len
- self.use_cuda = use_cuda
-
- self.global_step = 0
- self._build_net()
-
- def _get_inputs(self):
- return fluid.layers.data(
- name='state',
- shape=[self.hist_len, self.img_height, self.img_width],
- dtype='float32'), \
- fluid.layers.data(
- name='action', shape=[1], dtype='int32'), \
- fluid.layers.data(
- name='reward', shape=[], dtype='float32'), \
- fluid.layers.data(
- name='next_s',
- shape=[self.hist_len, self.img_height, self.img_width],
- dtype='float32'), \
- fluid.layers.data(
- name='isOver', shape=[], dtype='bool')
-
- def _build_net(self):
- self.predict_program = fluid.Program()
- self.train_program = fluid.Program()
- self._sync_program = fluid.Program()
-
- with fluid.program_guard(self.predict_program):
- state, action, reward, next_s, isOver = self._get_inputs()
- self.pred_value = self.get_DQN_prediction(state)
-
- with fluid.program_guard(self.train_program):
- state, action, reward, next_s, isOver = self._get_inputs()
- pred_value = self.get_DQN_prediction(state)
-
- reward = fluid.layers.clip(reward, min=-1.0, max=1.0)
-
- action_onehot = fluid.layers.one_hot(action, self.action_dim)
- action_onehot = fluid.layers.cast(action_onehot, dtype='float32')
-
- pred_action_value = fluid.layers.reduce_sum(
- fluid.layers.elementwise_mul(action_onehot, pred_value), dim=1)
-
- targetQ_predict_value = self.get_DQN_prediction(next_s, target=True)
-
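-            # Double DQN: the policy network selects the greedy action and the
-            # target network evaluates it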
-            next_s_predict_value = self.get_DQN_prediction(next_s)
-            greedy_action = fluid.layers.argmax(next_s_predict_value, axis=1)
- greedy_action = fluid.layers.unsqueeze(greedy_action, axes=[1])
-
- predict_onehot = fluid.layers.one_hot(greedy_action, self.action_dim)
- best_v = fluid.layers.reduce_sum(
- fluid.layers.elementwise_mul(predict_onehot, targetQ_predict_value),
- dim=1)
- best_v.stop_gradient = True
-
- target = reward + (1.0 - fluid.layers.cast(
- isOver, dtype='float32')) * self.gamma * best_v
- cost = fluid.layers.square_error_cost(pred_action_value, target)
- cost = fluid.layers.reduce_mean(cost)
-
- optimizer = fluid.optimizer.Adam(1e-3 * 0.5, epsilon=1e-3)
- optimizer.minimize(cost)
-
- vars = list(self.train_program.list_vars())
- target_vars = list(filter(
- lambda x: 'GRAD' not in x.name and 'target' in x.name, vars))
-
- policy_vars_name = [
- x.name.replace('target', 'policy') for x in target_vars]
- policy_vars = list(filter(
- lambda x: x.name in policy_vars_name, vars))
-
- policy_vars.sort(key=lambda x: x.name)
- target_vars.sort(key=lambda x: x.name)
-
- with fluid.program_guard(self._sync_program):
- sync_ops = []
- for i, var in enumerate(policy_vars):
- sync_op = fluid.layers.assign(policy_vars[i], target_vars[i])
- sync_ops.append(sync_op)
-
- # fluid exe
- place = fluid.CUDAPlace(0) if self.use_cuda else fluid.CPUPlace()
- self.exe = fluid.Executor(place)
- self.exe.run(fluid.default_startup_program())
-
- def get_DQN_prediction(self, image, target=False):
- image = image / 255.0
-
- variable_field = 'target' if target else 'policy'
-
- conv1 = fluid.layers.conv2d(
- input=image,
- num_filters=32,
- filter_size=5,
- stride=1,
- padding=2,
- act='relu',
- param_attr=ParamAttr(name='{}_conv1'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv1_b'.format(variable_field)))
- max_pool1 = fluid.layers.pool2d(
- input=conv1, pool_size=2, pool_stride=2, pool_type='max')
-
- conv2 = fluid.layers.conv2d(
- input=max_pool1,
- num_filters=32,
- filter_size=5,
- stride=1,
- padding=2,
- act='relu',
- param_attr=ParamAttr(name='{}_conv2'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv2_b'.format(variable_field)))
- max_pool2 = fluid.layers.pool2d(
- input=conv2, pool_size=2, pool_stride=2, pool_type='max')
-
- conv3 = fluid.layers.conv2d(
- input=max_pool2,
- num_filters=64,
- filter_size=4,
- stride=1,
- padding=1,
- act='relu',
- param_attr=ParamAttr(name='{}_conv3'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv3_b'.format(variable_field)))
- max_pool3 = fluid.layers.pool2d(
- input=conv3, pool_size=2, pool_stride=2, pool_type='max')
-
- conv4 = fluid.layers.conv2d(
- input=max_pool3,
- num_filters=64,
- filter_size=3,
- stride=1,
- padding=1,
- act='relu',
- param_attr=ParamAttr(name='{}_conv4'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv4_b'.format(variable_field)))
-
- flatten = fluid.layers.flatten(conv4, axis=1)
-
- out = fluid.layers.fc(
- input=flatten,
- size=self.action_dim,
- param_attr=ParamAttr(name='{}_fc1'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_fc1_b'.format(variable_field)))
- return out
-
-
- def act(self, state, train_or_test):
- sample = np.random.random()
- if train_or_test == 'train' and sample < self.exploration:
- act = np.random.randint(self.action_dim)
- else:
- if np.random.random() < 0.01:
- act = np.random.randint(self.action_dim)
- else:
- state = np.expand_dims(state, axis=0)
- pred_Q = self.exe.run(self.predict_program,
- feed={'state': state.astype('float32')},
- fetch_list=[self.pred_value])[0]
- pred_Q = np.squeeze(pred_Q, axis=0)
- act = np.argmax(pred_Q)
- if train_or_test == 'train':
- self.exploration = max(0.1, self.exploration - 1e-6)
- return act
-
- def train(self, state, action, reward, next_state, isOver):
- if self.global_step % self.update_target_steps == 0:
- self.sync_target_network()
- self.global_step += 1
-
- action = np.expand_dims(action, -1)
- self.exe.run(self.train_program,
- feed={
- 'state': state.astype('float32'),
- 'action': action.astype('int32'),
- 'reward': reward,
- 'next_s': next_state.astype('float32'),
- 'isOver': isOver
- })
-
- def sync_target_network(self):
- self.exe.run(self._sync_program)
diff --git a/PaddleRL/DeepQNetwork/DuelingDQN_agent.py b/PaddleRL/DeepQNetwork/DuelingDQN_agent.py
deleted file mode 100644
index 4c6dbbfb79b4e4a069b32297c9a48b737ec7145f..0000000000000000000000000000000000000000
--- a/PaddleRL/DeepQNetwork/DuelingDQN_agent.py
+++ /dev/null
@@ -1,201 +0,0 @@
-#-*- coding: utf-8 -*-
-
-import math
-import numpy as np
-import paddle.fluid as fluid
-from paddle.fluid.param_attr import ParamAttr
-from tqdm import tqdm
-
-
-class DuelingDQNModel(object):
- def __init__(self, state_dim, action_dim, gamma, hist_len, use_cuda=False):
- self.img_height = state_dim[0]
- self.img_width = state_dim[1]
- self.action_dim = action_dim
- self.gamma = gamma
- self.exploration = 1.1
- self.update_target_steps = 10000 // 4
- self.hist_len = hist_len
- self.use_cuda = use_cuda
-
- self.global_step = 0
- self._build_net()
-
- def _get_inputs(self):
- return fluid.layers.data(
- name='state',
- shape=[self.hist_len, self.img_height, self.img_width],
- dtype='float32'), \
- fluid.layers.data(
- name='action', shape=[1], dtype='int32'), \
- fluid.layers.data(
- name='reward', shape=[], dtype='float32'), \
- fluid.layers.data(
- name='next_s',
- shape=[self.hist_len, self.img_height, self.img_width],
- dtype='float32'), \
- fluid.layers.data(
- name='isOver', shape=[], dtype='bool')
-
- def _build_net(self):
- self.predict_program = fluid.Program()
- self.train_program = fluid.Program()
- self._sync_program = fluid.Program()
-
- with fluid.program_guard(self.predict_program):
- state, action, reward, next_s, isOver = self._get_inputs()
- self.pred_value = self.get_DQN_prediction(state)
-
- with fluid.program_guard(self.train_program):
- state, action, reward, next_s, isOver = self._get_inputs()
- pred_value = self.get_DQN_prediction(state)
-
- reward = fluid.layers.clip(reward, min=-1.0, max=1.0)
-
- action_onehot = fluid.layers.one_hot(action, self.action_dim)
- action_onehot = fluid.layers.cast(action_onehot, dtype='float32')
-
- pred_action_value = fluid.layers.reduce_sum(
- fluid.layers.elementwise_mul(action_onehot, pred_value), dim=1)
-
- targetQ_predict_value = self.get_DQN_prediction(next_s, target=True)
- best_v = fluid.layers.reduce_max(targetQ_predict_value, dim=1)
- best_v.stop_gradient = True
-
- target = reward + (1.0 - fluid.layers.cast(
- isOver, dtype='float32')) * self.gamma * best_v
- cost = fluid.layers.square_error_cost(pred_action_value, target)
- cost = fluid.layers.reduce_mean(cost)
-
- optimizer = fluid.optimizer.Adam(1e-3 * 0.5, epsilon=1e-3)
- optimizer.minimize(cost)
-
- vars = list(self.train_program.list_vars())
- target_vars = list(filter(
- lambda x: 'GRAD' not in x.name and 'target' in x.name, vars))
-
- policy_vars_name = [
- x.name.replace('target', 'policy') for x in target_vars]
- policy_vars = list(filter(
- lambda x: x.name in policy_vars_name, vars))
-
- policy_vars.sort(key=lambda x: x.name)
- target_vars.sort(key=lambda x: x.name)
-
- with fluid.program_guard(self._sync_program):
- sync_ops = []
- for i, var in enumerate(policy_vars):
- sync_op = fluid.layers.assign(policy_vars[i], target_vars[i])
- sync_ops.append(sync_op)
-
- # fluid exe
- place = fluid.CUDAPlace(0) if self.use_cuda else fluid.CPUPlace()
- self.exe = fluid.Executor(place)
- self.exe.run(fluid.default_startup_program())
-
- def get_DQN_prediction(self, image, target=False):
- image = image / 255.0
-
- variable_field = 'target' if target else 'policy'
-
- conv1 = fluid.layers.conv2d(
- input=image,
- num_filters=32,
- filter_size=5,
- stride=1,
- padding=2,
- act='relu',
- param_attr=ParamAttr(name='{}_conv1'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv1_b'.format(variable_field)))
- max_pool1 = fluid.layers.pool2d(
- input=conv1, pool_size=2, pool_stride=2, pool_type='max')
-
- conv2 = fluid.layers.conv2d(
- input=max_pool1,
- num_filters=32,
- filter_size=5,
- stride=1,
- padding=2,
- act='relu',
- param_attr=ParamAttr(name='{}_conv2'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv2_b'.format(variable_field)))
- max_pool2 = fluid.layers.pool2d(
- input=conv2, pool_size=2, pool_stride=2, pool_type='max')
-
- conv3 = fluid.layers.conv2d(
- input=max_pool2,
- num_filters=64,
- filter_size=4,
- stride=1,
- padding=1,
- act='relu',
- param_attr=ParamAttr(name='{}_conv3'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv3_b'.format(variable_field)))
- max_pool3 = fluid.layers.pool2d(
- input=conv3, pool_size=2, pool_stride=2, pool_type='max')
-
- conv4 = fluid.layers.conv2d(
- input=max_pool3,
- num_filters=64,
- filter_size=3,
- stride=1,
- padding=1,
- act='relu',
- param_attr=ParamAttr(name='{}_conv4'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_conv4_b'.format(variable_field)))
-
- flatten = fluid.layers.flatten(conv4, axis=1)
-
- value = fluid.layers.fc(
- input=flatten,
- size=1,
- param_attr=ParamAttr(name='{}_value_fc'.format(variable_field)),
- bias_attr=ParamAttr(name='{}_value_fc_b'.format(variable_field)))
-
- advantage = fluid.layers.fc(
- input=flatten,
- size=self.action_dim,
- param_attr=ParamAttr(name='{}_advantage_fc'.format(variable_field)),
- bias_attr=ParamAttr(
- name='{}_advantage_fc_b'.format(variable_field)))
-
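-        # dueling aggregation: Q(s, a) = V(s) + A(s, a) - mean_a A(s, a)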
- Q = advantage + (value - fluid.layers.reduce_mean(
- advantage, dim=1, keep_dim=True))
- return Q
-
-
- def act(self, state, train_or_test):
- sample = np.random.random()
- if train_or_test == 'train' and sample < self.exploration:
- act = np.random.randint(self.action_dim)
- else:
- if np.random.random() < 0.01:
- act = np.random.randint(self.action_dim)
- else:
- state = np.expand_dims(state, axis=0)
- pred_Q = self.exe.run(self.predict_program,
- feed={'state': state.astype('float32')},
- fetch_list=[self.pred_value])[0]
- pred_Q = np.squeeze(pred_Q, axis=0)
- act = np.argmax(pred_Q)
- if train_or_test == 'train':
- self.exploration = max(0.1, self.exploration - 1e-6)
- return act
-
- def train(self, state, action, reward, next_state, isOver):
- if self.global_step % self.update_target_steps == 0:
- self.sync_target_network()
- self.global_step += 1
-
- action = np.expand_dims(action, -1)
- self.exe.run(self.train_program,
- feed={
- 'state': state.astype('float32'),
- 'action': action.astype('int32'),
- 'reward': reward,
- 'next_s': next_state.astype('float32'),
- 'isOver': isOver
- })
-
- def sync_target_network(self):
- self.exe.run(self._sync_program)
diff --git a/PaddleRL/DeepQNetwork/README.md b/PaddleRL/DeepQNetwork/README.md
deleted file mode 100644
index 1edeaaa884318ec3a530ec4fdb7d031d07411b56..0000000000000000000000000000000000000000
--- a/PaddleRL/DeepQNetwork/README.md
+++ /dev/null
@@ -1,67 +0,0 @@
-[中文版](README_cn.md)
-
-## Reproduce DQN, DoubleDQN, DuelingDQN model with Fluid version of PaddlePaddle
-Based on PaddlePaddle's next-generation API, Fluid, this repository reproduces the DQN family of deep reinforcement learning models and matches the metrics reported in the original papers on classic Atari games. Each model takes raw game frames as input and predicts the next control action end-to-end. The repository contains the following three models:
-+ DQN in:
-[Human-level Control Through Deep Reinforcement Learning](http://www.nature.com/nature/journal/v518/n7540/full/nature14236.html)
-+ DoubleDQN in:
-[Deep Reinforcement Learning with Double Q-Learning](https://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/viewPaper/12389)
-+ DuelingDQN in:
-[Dueling Network Architectures for Deep Reinforcement Learning](http://proceedings.mlr.press/v48/wangf16.html)
-
-## Atari benchmark & performance
-
-### Atari games introduction
-
-Please see [here](https://gym.openai.com/envs/#atari) to know more about Atari game.
-
-### Pong game result
-
-The average game reward obtained by each of the three models as training progresses is shown below (about 3 hours per 1 million steps):
-
-
-
-
-
-## How to use
-### Dependencies:
-+ python2.7
-+ gym
-+ tqdm
-+ opencv-python
-+ paddlepaddle-gpu>=1.0.0
-+ ale_python_interface
-
-### Install Dependencies:
-+ Install PaddlePaddle:
-  It is recommended to compile and install PaddlePaddle from source.
-+ Install other dependencies:
- ```
- pip install -r requirement.txt
- pip install gym[atari]
- ```
-  To install ale_python_interface, please see [here](https://github.com/mgbellemare/Arcade-Learning-Environment).
-
-### Start Training:
-```
-# To train a model for the Pong game with GPU (uses the DQN model by default)
-python train.py --rom ./rom_files/pong.bin --use_cuda
-
-# To train a model for Pong with DoubleDQN
-python train.py --rom ./rom_files/pong.bin --use_cuda --alg DoubleDQN
-
-# To train a model for Pong with DuelingDQN
-python train.py --rom ./rom_files/pong.bin --use_cuda --alg DuelingDQN
-```
-
-To train on more games, you can download more ROM files from [here](https://github.com/openai/atari-py/tree/master/atari_py/atari_roms).
-
-### Start Testing:
-```
-# Play the game with the saved best model and calculate the average rewards
-python play.py --rom ./rom_files/pong.bin --use_cuda --model_path ./saved_model/DQN-pong
-
-# Play the game with visualization
-python play.py --rom ./rom_files/pong.bin --use_cuda --model_path ./saved_model/DQN-pong --viz 0.01
-```
-Saved models for the Pong and Breakout games are available [here](https://pan.baidu.com/s/1gIsbNw5V7tMeb74ojx-TMA). You can use them to play the games directly.
diff --git a/PaddleRL/DeepQNetwork/README_cn.md b/PaddleRL/DeepQNetwork/README_cn.md
deleted file mode 100644
index 640d775ad8fed2be360d308b6c5df41c86d77c04..0000000000000000000000000000000000000000
--- a/PaddleRL/DeepQNetwork/README_cn.md
+++ /dev/null
@@ -1,71 +0,0 @@
-## Reproducing the DQN, DoubleDQN, and DuelingDQN models with the Fluid version of PaddlePaddle
-
-Based on PaddlePaddle's next-generation API, Fluid, this repository reproduces the DQN family of deep reinforcement learning models and matches the metrics reported in the original papers on classic Atari games. Each model takes raw game frames as input and predicts the next control action end-to-end. The repository contains the following three models:
-+ DQN:
-[Human-level Control Through Deep Reinforcement Learning](http://www.nature.com/nature/journal/v518/n7540/full/nature14236.html)
-+ DoubleDQN:
-[Deep Reinforcement Learning with Double Q-Learning](https://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/viewPaper/12389)
-+ DuelingDQN:
-[Dueling Network Architectures for Deep Reinforcement Learning](http://proceedings.mlr.press/v48/wangf16.html)
-
-## Results: performance on Atari games
-
-### Atari games introduction
-
-Please see [here](https://gym.openai.com/envs/#atari) to learn more about Atari games.
-
-### Pong game result
-The average game reward obtained by each of the three models as training progresses is shown below (about 3 hours per 1 million steps):
-
-
-
-
-
-## How to use
-
-### Dependencies:
-+ python2.7
-+ gym
-+ tqdm
-+ opencv-python
-+ paddlepaddle-gpu>=1.0.0
-+ ale_python_interface
-
-### Install dependencies:
-
-+ Install PaddlePaddle:
-  It is recommended to compile and install PaddlePaddle from source.
-+ Install other dependencies:
-  ```
-  pip install -r requirement.txt
-  pip install gym[atari]
-  ```
-  To install ale_python_interface, please see [here](https://github.com/mgbellemare/Arcade-Learning-Environment).
-
-### Start training:
-
-```
-# Train a model for the Pong game with GPU (uses the DQN model by default)
-python train.py --rom ./rom_files/pong.bin --use_cuda
-
-# Train with the DoubleDQN model
-python train.py --rom ./rom_files/pong.bin --use_cuda --alg DoubleDQN
-
-# Train with the DuelingDQN model
-python train.py --rom ./rom_files/pong.bin --use_cuda --alg DuelingDQN
-```
-
-To train on more games, you can download more ROM files from [here](https://github.com/openai/atari-py/tree/master/atari_py/atari_roms).
-
-### Start testing:
-
-```
-# Play the game with the saved best model and calculate the average rewards
-python play.py --rom ./rom_files/pong.bin --use_cuda --model_path ./saved_model/DQN-pong
-
-# Play the game with visualization
-python play.py --rom ./rom_files/pong.bin --use_cuda --model_path ./saved_model/DQN-pong --viz 0.01
-```
-
-Saved models for the Pong and Breakout games are available [here](https://pan.baidu.com/s/1gIsbNw5V7tMeb74ojx-TMA). You can use them to test directly.
diff --git a/PaddleRL/DeepQNetwork/assets/dqn.png b/PaddleRL/DeepQNetwork/assets/dqn.png
deleted file mode 100644
index f8f8d12f9887cdab62f09b52597ec187a4c8107c..0000000000000000000000000000000000000000
Binary files a/PaddleRL/DeepQNetwork/assets/dqn.png and /dev/null differ
diff --git a/PaddleRL/DeepQNetwork/atari.py b/PaddleRL/DeepQNetwork/atari.py
deleted file mode 100644
index ec793cba15ddc1c42986689eaad5773875a4ffde..0000000000000000000000000000000000000000
--- a/PaddleRL/DeepQNetwork/atari.py
+++ /dev/null
@@ -1,160 +0,0 @@
-# -*- coding: utf-8 -*-
-
-import numpy as np
-import os
-import cv2
-import threading
-
-import gym
-from gym import spaces
-from gym.envs.atari.atari_env import ACTION_MEANING
-
-from atari_py import ALEInterface
-
-__all__ = ['AtariPlayer']
-
-ROM_URL = "https://github.com/openai/atari-py/tree/master/atari_py/atari_roms"
-_ALE_LOCK = threading.Lock()
-"""
-The following AtariPlayer are copied or modified from tensorpack/tensorpack:
- https://github.com/tensorpack/tensorpack/blob/master/examples/DeepQNetwork/atari.py
-"""
-
-
-class AtariPlayer(gym.Env):
- """
- A wrapper for ALE emulator, with configurations to mimic DeepMind DQN settings.
- Info:
- score: the accumulated reward in the current game
- gameOver: True when the current game is Over
- """
-
- def __init__(self,
- rom_file,
- viz=0,
- frame_skip=4,
- nullop_start=30,
- live_lost_as_eoe=True,
- max_num_frames=0):
- """
- Args:
- rom_file: path to the rom
- frame_skip: skip every k frames and repeat the action
- viz: visualization to be done.
- Set to 0 to disable.
- Set to a positive number to be the delay between frames to show.
- Set to a string to be a directory to store frames.
- nullop_start: start with random number of null ops.
-            live_lost_as_eoe: treat the loss of a life as end of episode. Useful for training.
- max_num_frames: maximum number of frames per episode.
- """
- super(AtariPlayer, self).__init__()
- assert os.path.isfile(rom_file), \
- "rom {} not found. Please download at {}".format(rom_file, ROM_URL)
-
- try:
- ALEInterface.setLoggerMode(ALEInterface.Logger.Error)
- except AttributeError:
- print("You're not using latest ALE")
-
- # avoid simulator bugs: https://github.com/mgbellemare/Arcade-Learning-Environment/issues/86
- with _ALE_LOCK:
- self.ale = ALEInterface()
- self.ale.setInt(b"random_seed", np.random.randint(0, 30000))
- self.ale.setInt(b"max_num_frames_per_episode", max_num_frames)
- self.ale.setBool(b"showinfo", False)
-
- self.ale.setInt(b"frame_skip", 1)
- self.ale.setBool(b'color_averaging', False)
- # manual.pdf suggests otherwise.
- self.ale.setFloat(b'repeat_action_probability', 0.0)
-
- # viz setup
- if isinstance(viz, str):
- assert os.path.isdir(viz), viz
- self.ale.setString(b'record_screen_dir', viz)
- viz = 0
- if isinstance(viz, int):
- viz = float(viz)
- self.viz = viz
- if self.viz and isinstance(self.viz, float):
- self.windowname = os.path.basename(rom_file)
- cv2.startWindowThread()
- cv2.namedWindow(self.windowname)
-
- self.ale.loadROM(rom_file.encode('utf-8'))
- self.width, self.height = self.ale.getScreenDims()
- self.actions = self.ale.getMinimalActionSet()
-
- self.live_lost_as_eoe = live_lost_as_eoe
- self.frame_skip = frame_skip
- self.nullop_start = nullop_start
-
- self.action_space = spaces.Discrete(len(self.actions))
- self.observation_space = spaces.Box(low=0,
- high=255,
- shape=(self.height, self.width),
- dtype=np.uint8)
- self._restart_episode()
-
- def get_action_meanings(self):
- return [ACTION_MEANING[i] for i in self.actions]
-
- def _grab_raw_image(self):
- """
- :returns: the current 3-channel image
- """
- m = self.ale.getScreenRGB()
- return m.reshape((self.height, self.width, 3))
-
- def _current_state(self):
- """
- returns: a gray-scale (h, w) uint8 image
- """
- ret = self._grab_raw_image()
- # avoid missing frame issue: max-pooled over the last screen
- ret = np.maximum(ret, self.last_raw_screen)
- if self.viz:
- if isinstance(self.viz, float):
- cv2.imshow(self.windowname, ret)
- cv2.waitKey(int(self.viz * 1000))
- ret = ret.astype('float32')
-        # weights 0.299, 0.587, 0.114, same as rgb2y in torch/image
- ret = cv2.cvtColor(ret, cv2.COLOR_RGB2GRAY)
- return ret.astype('uint8') # to save some memory
-
- def _restart_episode(self):
- with _ALE_LOCK:
- self.ale.reset_game()
-
- # random null-ops start
- n = np.random.randint(self.nullop_start)
- self.last_raw_screen = self._grab_raw_image()
- for k in range(n):
- if k == n - 1:
- self.last_raw_screen = self._grab_raw_image()
- self.ale.act(0)
-
- def reset(self):
- if self.ale.game_over():
- self._restart_episode()
- return self._current_state()
-
- def step(self, act):
- oldlives = self.ale.lives()
- r = 0
- for k in range(self.frame_skip):
- if k == self.frame_skip - 1:
- self.last_raw_screen = self._grab_raw_image()
- r += self.ale.act(self.actions[act])
- newlives = self.ale.lives()
- if self.ale.game_over() or \
- (self.live_lost_as_eoe and newlives < oldlives):
- break
-
- isOver = self.ale.game_over()
- if self.live_lost_as_eoe:
- isOver = isOver or newlives < oldlives
-
- info = {'ale.lives': newlives}
- return self._current_state(), r, isOver, info
diff --git a/PaddleRL/DeepQNetwork/atari_wrapper.py b/PaddleRL/DeepQNetwork/atari_wrapper.py
deleted file mode 100644
index 81ec7e0ba0ee191f70591c16bfff560a62d3d395..0000000000000000000000000000000000000000
--- a/PaddleRL/DeepQNetwork/atari_wrapper.py
+++ /dev/null
@@ -1,106 +0,0 @@
-# -*- coding: utf-8 -*-
-
-import numpy as np
-from collections import deque
-
-import gym
-from gym import spaces
-
-_v0, _v1 = gym.__version__.split('.')[:2]
-assert int(_v0) > 0 or int(_v1) >= 10, gym.__version__
-"""
-The following wrappers are copied or modified from openai/baselines:
-https://github.com/openai/baselines/blob/master/baselines/common/atari_wrappers.py
-"""
-
-
-class MapState(gym.ObservationWrapper):
- def __init__(self, env, map_func):
- gym.ObservationWrapper.__init__(self, env)
- self._func = map_func
-
- def observation(self, obs):
- return self._func(obs)
-
-
-class FrameStack(gym.Wrapper):
- def __init__(self, env, k):
- """Buffer observations and stack across channels (last axis)."""
- gym.Wrapper.__init__(self, env)
- self.k = k
- self.frames = deque([], maxlen=k)
- shp = env.observation_space.shape
- chan = 1 if len(shp) == 2 else shp[2]
-        # observation() stacks the k frames along a new leading axis
-        self.observation_space = spaces.Box(low=0,
-                                            high=255,
-                                            shape=(chan * k, shp[0], shp[1]),
-                                            dtype=np.uint8)
-
- def reset(self):
- """Clear buffer and re-fill by duplicating the first observation."""
- ob = self.env.reset()
- for _ in range(self.k - 1):
- self.frames.append(np.zeros_like(ob))
- self.frames.append(ob)
- return self.observation()
-
- def step(self, action):
- ob, reward, done, info = self.env.step(action)
- self.frames.append(ob)
- return self.observation(), reward, done, info
-
- def observation(self):
- assert len(self.frames) == self.k
- return np.stack(self.frames, axis=0)
-
-
-class _FireResetEnv(gym.Wrapper):
- def __init__(self, env):
- """Take action on reset for environments that are fixed until firing."""
- gym.Wrapper.__init__(self, env)
- assert env.unwrapped.get_action_meanings()[1] == 'FIRE'
- assert len(env.unwrapped.get_action_meanings()) >= 3
-
- def reset(self):
- self.env.reset()
- obs, _, done, _ = self.env.step(1)
- if done:
- self.env.reset()
- obs, _, done, _ = self.env.step(2)
- if done:
- self.env.reset()
- return obs
-
- def step(self, action):
- return self.env.step(action)
-
-
-def FireResetEnv(env):
- if isinstance(env, gym.Wrapper):
- baseenv = env.unwrapped
- else:
- baseenv = env
- if 'FIRE' in baseenv.get_action_meanings():
- return _FireResetEnv(env)
- return env
-
-
-class LimitLength(gym.Wrapper):
- def __init__(self, env, k):
- gym.Wrapper.__init__(self, env)
- self.k = k
-
- def reset(self):
- # This assumes that reset() will really reset the env.
- # If the underlying env tries to be smart about reset
- # (e.g. end-of-life), the assumption doesn't hold.
- ob = self.env.reset()
- self.cnt = 0
- return ob
-
- def step(self, action):
- ob, r, done, info = self.env.step(action)
- self.cnt += 1
- if self.cnt == self.k:
- done = True
- return ob, r, done, info
diff --git a/PaddleRL/DeepQNetwork/expreplay.py b/PaddleRL/DeepQNetwork/expreplay.py
deleted file mode 100644
index 5f27ca7286b5db7ac963bc25236be416fad50eb0..0000000000000000000000000000000000000000
--- a/PaddleRL/DeepQNetwork/expreplay.py
+++ /dev/null
@@ -1,98 +0,0 @@
-# -*- coding: utf-8 -*-
-
-import numpy as np
-import copy
-from collections import deque, namedtuple
-
-Experience = namedtuple('Experience', ['state', 'action', 'reward', 'isOver'])
-
-
-class ReplayMemory(object):
- def __init__(self, max_size, state_shape, context_len):
- self.max_size = int(max_size)
- self.state_shape = state_shape
- self.context_len = int(context_len)
-
- self.state = np.zeros((self.max_size, ) + state_shape, dtype='uint8')
- self.action = np.zeros((self.max_size, ), dtype='int32')
- self.reward = np.zeros((self.max_size, ), dtype='float32')
- self.isOver = np.zeros((self.max_size, ), dtype='bool')
-
- self._curr_size = 0
- self._curr_pos = 0
- self._context = deque(maxlen=context_len - 1)
-
- def append(self, exp):
- """append a new experience into replay memory
- """
- if self._curr_size < self.max_size:
- self._assign(self._curr_pos, exp)
- self._curr_size += 1
- else:
- self._assign(self._curr_pos, exp)
- self._curr_pos = (self._curr_pos + 1) % self.max_size
- if exp.isOver:
- self._context.clear()
- else:
- self._context.append(exp)
-
- def recent_state(self):
- """ maintain recent state for training"""
- lst = list(self._context)
- states = [np.zeros(self.state_shape, dtype='uint8')] * \
- (self._context.maxlen - len(lst))
- states.extend([k.state for k in lst])
- return states
-
- def sample(self, idx):
- """ return state, action, reward, isOver,
- note that some frames in state may be generated from last episode,
- they should be removed from state
- """
- state = np.zeros(
- (self.context_len + 1, ) + self.state_shape, dtype=np.uint8)
- state_idx = np.arange(idx, idx + self.context_len + 1) % self._curr_size
-
-        # check whether any frame in the window comes from the previous episode
- has_last_episode = False
- for k in range(self.context_len - 2, -1, -1):
- to_check_idx = state_idx[k]
- if self.isOver[to_check_idx]:
- has_last_episode = True
- state_idx = state_idx[k + 1:]
- state[k + 1:] = self.state[state_idx]
- break
-
- if not has_last_episode:
- state = self.state[state_idx]
-
- real_idx = (idx + self.context_len - 1) % self._curr_size
- action = self.action[real_idx]
- reward = self.reward[real_idx]
- isOver = self.isOver[real_idx]
- return state, reward, action, isOver
-
- def __len__(self):
- return self._curr_size
-
- def _assign(self, pos, exp):
- self.state[pos] = exp.state
- self.reward[pos] = exp.reward
- self.action[pos] = exp.action
- self.isOver[pos] = exp.isOver
-
- def sample_batch(self, batch_size):
- """sample a batch from replay memory for training
- """
- batch_idx = np.random.randint(
- self._curr_size - self.context_len - 1, size=batch_size)
- batch_idx = (self._curr_pos + batch_idx) % self._curr_size
- batch_exp = [self.sample(i) for i in batch_idx]
- return self._process_batch(batch_exp)
-
- def _process_batch(self, batch_exp):
- state = np.asarray([e[0] for e in batch_exp], dtype='uint8')
- reward = np.asarray([e[1] for e in batch_exp], dtype='float32')
- action = np.asarray([e[2] for e in batch_exp], dtype='int8')
- isOver = np.asarray([e[3] for e in batch_exp], dtype='bool')
- return [state, action, reward, isOver]
diff --git a/PaddleRL/DeepQNetwork/play.py b/PaddleRL/DeepQNetwork/play.py
deleted file mode 100644
index 2c93da509d7cccb81d713c7aefd45a11ee28e8fb..0000000000000000000000000000000000000000
--- a/PaddleRL/DeepQNetwork/play.py
+++ /dev/null
@@ -1,65 +0,0 @@
-#-*- coding: utf-8 -*-
-
-import argparse
-import os
-import numpy as np
-import paddle.fluid as fluid
-
-from train import get_player
-from tqdm import tqdm
-
-
-def predict_action(exe, state, predict_program, feed_names, fetch_targets,
- action_dim):
- if np.random.random() < 0.01:
- act = np.random.randint(action_dim)
- else:
- state = np.expand_dims(state, axis=0)
- pred_Q = exe.run(predict_program,
- feed={feed_names[0]: state.astype('float32')},
- fetch_list=fetch_targets)[0]
- pred_Q = np.squeeze(pred_Q, axis=0)
- act = np.argmax(pred_Q)
- return act
-
-
-if __name__ == '__main__':
- parser = argparse.ArgumentParser()
- parser.add_argument(
- '--use_cuda', action='store_true', help='if set, use cuda')
- parser.add_argument('--rom', type=str, required=True, help='atari rom')
- parser.add_argument(
- '--model_path', type=str, required=True, help='dirname to load model')
- parser.add_argument(
- '--viz',
- type=float,
- default=0,
- help='''viz: visualization setting:
- Set to 0 to disable;
- Set to a positive number to be the delay between frames to show.
- ''')
- args = parser.parse_args()
-
- env = get_player(args.rom, viz=args.viz)
-
- place = fluid.CUDAPlace(0) if args.use_cuda else fluid.CPUPlace()
- exe = fluid.Executor(place)
- inference_scope = fluid.Scope()
- with fluid.scope_guard(inference_scope):
- [predict_program, feed_names,
- fetch_targets] = fluid.io.load_inference_model(args.model_path, exe)
-
- episode_reward = []
- for _ in tqdm(xrange(30), desc='eval agent'):
- state = env.reset()
- total_reward = 0
- while True:
- action = predict_action(exe, state, predict_program, feed_names,
- fetch_targets, env.action_space.n)
- state, reward, isOver, info = env.step(action)
- total_reward += reward
- if isOver:
- break
- episode_reward.append(total_reward)
- eval_reward = np.mean(episode_reward)
-    print('Average reward over 30 episodes: {}'.format(eval_reward))
diff --git a/PaddleRL/DeepQNetwork/requirement.txt b/PaddleRL/DeepQNetwork/requirement.txt
deleted file mode 100644
index 689eb324e6bd65aabbe44ca041ff7b3ddacb1943..0000000000000000000000000000000000000000
--- a/PaddleRL/DeepQNetwork/requirement.txt
+++ /dev/null
@@ -1,5 +0,0 @@
-numpy
-gym
-tqdm
-opencv-python
-paddlepaddle-gpu>=1.0.0
diff --git a/PaddleRL/DeepQNetwork/rom_files/breakout.bin b/PaddleRL/DeepQNetwork/rom_files/breakout.bin
deleted file mode 100644
index abab5a8c0a1890461a11b78d4265f1b794327793..0000000000000000000000000000000000000000
Binary files a/PaddleRL/DeepQNetwork/rom_files/breakout.bin and /dev/null differ
diff --git a/PaddleRL/DeepQNetwork/rom_files/pong.bin b/PaddleRL/DeepQNetwork/rom_files/pong.bin
deleted file mode 100644
index 14a5bdfc72548613c059938bdf712efdbb5d3806..0000000000000000000000000000000000000000
Binary files a/PaddleRL/DeepQNetwork/rom_files/pong.bin and /dev/null differ
diff --git a/PaddleRL/DeepQNetwork/train.py b/PaddleRL/DeepQNetwork/train.py
deleted file mode 100644
index dd7986d704aec0c0948f81ca7ddd69bbbd3ea239..0000000000000000000000000000000000000000
--- a/PaddleRL/DeepQNetwork/train.py
+++ /dev/null
@@ -1,181 +0,0 @@
-#-*- coding: utf-8 -*-
-
-from DQN_agent import DQNModel
-from DoubleDQN_agent import DoubleDQNModel
-from DuelingDQN_agent import DuelingDQNModel
-from atari import AtariPlayer
-import paddle.fluid as fluid
-import gym
-import argparse
-import cv2
-from tqdm import tqdm
-from expreplay import ReplayMemory, Experience
-import numpy as np
-import os
-
-from datetime import datetime
-from atari_wrapper import FrameStack, MapState, FireResetEnv, LimitLength
-from collections import deque
-
-MEMORY_SIZE = 1e6
-MEMORY_WARMUP_SIZE = MEMORY_SIZE // 20
-IMAGE_SIZE = (84, 84)
-CONTEXT_LEN = 4
-ACTION_REPEAT = 4  # aka FRAME_SKIP
-UPDATE_FREQ = 4
-
-
-def run_train_episode(agent, env, exp):
- total_reward = 0
- state = env.reset()
- step = 0
- while True:
- step += 1
- context = exp.recent_state()
- context.append(state)
- context = np.stack(context, axis=0)
- action = agent.act(context, train_or_test='train')
- next_state, reward, isOver, _ = env.step(action)
- exp.append(Experience(state, action, reward, isOver))
-        # start training once the replay memory has warmed up
- if len(exp) > MEMORY_WARMUP_SIZE:
- if step % UPDATE_FREQ == 0:
- batch_all_state, batch_action, batch_reward, batch_isOver = exp.sample_batch(
- args.batch_size)
- batch_state = batch_all_state[:, :CONTEXT_LEN, :, :]
- batch_next_state = batch_all_state[:, 1:, :, :]
- agent.train(batch_state, batch_action, batch_reward,
- batch_next_state, batch_isOver)
- total_reward += reward
- state = next_state
- if isOver:
- break
- return total_reward, step
-
-
-def get_player(rom, viz=False, train=False):
- env = AtariPlayer(
- rom,
- frame_skip=ACTION_REPEAT,
- viz=viz,
- live_lost_as_eoe=train,
- max_num_frames=60000)
- env = FireResetEnv(env)
- env = MapState(env, lambda im: cv2.resize(im, IMAGE_SIZE))
- if not train:
- # in training, context is taken care of in expreplay buffer
- env = FrameStack(env, CONTEXT_LEN)
- return env
-
-
-def eval_agent(agent, env):
- episode_reward = []
- for _ in tqdm(range(30), desc='eval agent'):
- state = env.reset()
- total_reward = 0
- step = 0
- while True:
- step += 1
- action = agent.act(state, train_or_test='test')
- state, reward, isOver, info = env.step(action)
- total_reward += reward
- if isOver:
- break
- episode_reward.append(total_reward)
- eval_reward = np.mean(episode_reward)
- return eval_reward
-
-
-def train_agent():
- env = get_player(args.rom, train=True)
- test_env = get_player(args.rom)
- exp = ReplayMemory(args.mem_size, IMAGE_SIZE, CONTEXT_LEN)
- action_dim = env.action_space.n
-
- if args.alg == 'DQN':
- agent = DQNModel(IMAGE_SIZE, action_dim, args.gamma, CONTEXT_LEN,
- args.use_cuda)
- elif args.alg == 'DoubleDQN':
- agent = DoubleDQNModel(IMAGE_SIZE, action_dim, args.gamma, CONTEXT_LEN,
- args.use_cuda)
- elif args.alg == 'DuelingDQN':
- agent = DuelingDQNModel(IMAGE_SIZE, action_dim, args.gamma, CONTEXT_LEN,
- args.use_cuda)
- else:
-        print('Unknown algorithm: {}'.format(args.alg))
- return
-
- with tqdm(total=MEMORY_WARMUP_SIZE, desc='Memory warmup') as pbar:
- while len(exp) < MEMORY_WARMUP_SIZE:
- total_reward, step = run_train_episode(agent, env, exp)
- pbar.update(step)
-
- # train
- test_flag = 0
- save_flag = 0
- pbar = tqdm(total=1e8)
- recent_100_reward = []
- total_step = 0
- max_reward = None
- save_path = os.path.join(args.model_dirname, '{}-{}'.format(
- args.alg, os.path.basename(args.rom).split('.')[0]))
- while True:
- # start epoch
- total_reward, step = run_train_episode(agent, env, exp)
- total_step += step
- pbar.set_description('[train]exploration:{}'.format(agent.exploration))
- pbar.update(step)
-
- if total_step // args.test_every_steps == test_flag:
- pbar.write("testing")
- eval_reward = eval_agent(agent, test_env)
- test_flag += 1
- print("eval_agent done, (steps, eval_reward): ({}, {})".format(
- total_step, eval_reward))
-
- if max_reward is None or eval_reward > max_reward:
- max_reward = eval_reward
- fluid.io.save_inference_model(save_path, ['state'],
- agent.pred_value, agent.exe,
- agent.predict_program)
- pbar.close()
-
-
-if __name__ == '__main__':
- parser = argparse.ArgumentParser()
- parser.add_argument(
- '--alg',
- type=str,
- default='DQN',
-        help='Reinforcement learning algorithm; supported: DQN, DoubleDQN, DuelingDQN'
- )
- parser.add_argument(
- '--use_cuda', action='store_true', help='if set, use cuda')
- parser.add_argument(
- '--gamma',
- type=float,
- default=0.99,
- help='discount factor for accumulated reward computation')
- parser.add_argument(
- '--mem_size',
- type=int,
- default=1000000,
- help='memory size for experience replay')
- parser.add_argument(
- '--batch_size', type=int, default=64, help='batch size for training')
- parser.add_argument('--rom', help='atari rom', required=True)
- parser.add_argument(
- '--model_dirname',
- type=str,
- default='saved_model',
- help='dirname to save model')
- parser.add_argument(
- '--test_every_steps',
- type=int,
- default=100000,
- help='every steps number to run test')
- args = parser.parse_args()
- train_agent()
diff --git a/PaddleRL/README.md b/PaddleRL/README.md
deleted file mode 100644
index 5b8d2caf78d426a14b96f7d842eb88ed37bab233..0000000000000000000000000000000000000000
--- a/PaddleRL/README.md
+++ /dev/null
@@ -1,11 +0,0 @@
-PaddleRL
-============
-
-Reinforcement Learning
---------
-
-Reinforcement learning has become an increasingly important direction in machine learning in recent years, and its combination with deep learning, known as Deep Reinforcement Learning (DRL), has achieved many astonishing results. AlphaGo, famous for defeating top professional Go players, is a typical application of DRL; beyond games, it is also applied to robotics, natural language processing, and more.
-
-The seminal work of deep reinforcement learning was its successful application to Atari video games: a model that directly takes video frames as high-dimensional input and predicts the next action end to end from the image content. The model used is known as the Deep Q-Network (DQN). This example implements DQN and its variants with the flexible PaddlePaddle Fluid framework and evaluates their performance on Atari games.
-
-- [DeepQNetwork](https://github.com/PaddlePaddle/models/blob/develop/PaddleRL/DeepQNetwork/README_cn.md)
diff --git a/PaddleRL/policy_gradient/README.md b/PaddleRL/policy_gradient/README.md
deleted file mode 100644
index b813aa124466597adfb80261bee7c2de22b95e67..0000000000000000000000000000000000000000
--- a/PaddleRL/policy_gradient/README.md
+++ /dev/null
@@ -1,171 +0,0 @@
-Running the examples in this directory requires the latest develop branch of PaddlePaddle. If your installed version of PaddlePaddle is older than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update your installation.
-
----
-
-# Policy Gradient RL by PaddlePaddle
-This article describes how to use PaddlePaddle to train a player (actor model) with a policy-based reinforcement learning method; we want this player to complete a simple stair-walking task.
-
- Contents:
-
- - Task description
- - Model
- - Policy (objective function)
- - Algorithm (gradient ascent)
- - PaddlePaddle implementation
-
-
-## 1. Task Description
-Suppose there is a staircase connecting points A and B. The player starts from A and at each step can only move one stair forward or one stair backward; reaching B completes the task. We want to train a smart player that knows how to get from A to B the fastest.
-We simulate the task on the command line as follows:
-```
-A - O - - - - - B
-```
-Each '-' represents one stair; A is at the head of the line, B at the end, and O marks the player's current position.
-
-## 2. Policy Gradient
-### 2.1 Model
-#### input layer
-The model's input is the current staircase state $S$ observed by the player, which needs to encode the length of the staircase and the player's current position.
-In the command-line simulation, two variables - the player's position and the staircase length - are enough to represent the state, but to make this demo easier to extend to more complex task scenarios we represent the game state $S$ as a vector.
-The vector $S$ has one dimension per stair: the player's position holds a 1 and every other position holds a 0.
-An example (a helper sketch follows the code block):
-```
-S = [0, 1, 0, 0] // The staircase has length 4 and the player stands on the second stair.
-```
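-
-A minimal helper that builds this one-hot state vector (a sketch mirroring `status()` in `env.py` later in this document):
-
-```
-import numpy as np
-
-def encode_state(stage_len, position):
-    # one-hot vector: 1 at the player's position, 0 elsewhere
-    s = np.zeros([stage_len], dtype="float32")
-    s[position] = 1.0
-    return s
-
-print(encode_state(4, 1))  # -> [0. 1. 0. 0.]
-```
-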
-#### hidden layer
-The hidden part uses two fully connected layers, `FC_1` and `FC_2`, where `FC_1` has size 10 and `FC_2` has size 2.
-
-#### output layer
-We use softmax to map the output of `FC_2` to the probability distribution over all possible actions (forward or backward), a two-dimensional vector `act_probs`, where `act_probs[0]` is the probability of stepping backward and `act_probs[1]` the probability of stepping forward.
-
-#### Model representation
-We formalize our player model (actor) as:
-$$a = \pi_\theta(s)$$
-where $\theta$ denotes the model parameters and $s$ is the input state. A plain-numpy sketch of this actor follows.
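-
-The sketch below is illustrative only (the Fluid version lives in `brain.py` later in this document; the weight shapes are assumptions):
-
-```
-import numpy as np
-
-def policy_forward(s, W1, b1, W2, b2):
-    h = np.tanh(s.dot(W1) + b1)        # FC_1, size 10, tanh activation
-    logits = h.dot(W2) + b2            # FC_2, size 2
-    e = np.exp(logits - logits.max())  # numerically stable softmax
-    return e / e.sum()                 # act_probs
-```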
-
-
-### 2.2 Policy (objective function)
-How do we evaluate whether a player (model) is good or bad? First, some terminology:
-we let $\pi_\theta(s)$ play one game; $s_t$ denotes the state at time $t$, $a_t$ the action taken in state $s_t$, and $r_t$ the reward obtained after performing action $a_t$.
-The course of one game can then be written as:
-$$\tau = [s_1, a_1, r_1, s_2, a_2, r_2 ... s_T, a_T, r_T] \tag{1}$$
-
-The total reward of one game is:
-$$R(\tau) = \sum_{t=1}^Tr_t \tag{2}$$
-
-When the player plays a game, many different action sequences $\tau$ may occur, and the probability of a particular $\tau$ depends on the player model's $\theta$; we write it as:
-$$P(\tau | \theta) \tag{3}$$
-Then, given a $\theta$ (player model), the expected reward of playing one game is:
-$$\overline {R}_\theta = \sum_\tau R(\tau) P(\tau|\theta)$$
-In most cases we cannot enumerate all $\tau$, so we sample N trajectories $\tau$ to approximate the expectation:
-$$\overline {R}_\theta = \sum_\tau R(\tau) P(\tau|\theta) \approx \frac{1}{N} \sum_{n=1}^N R(\tau^n) \tag{4}$$
-
-$\overline {R}_\theta$ is exactly the objective function we need: it is the expected score of one game played by a player with parameters $\theta$, and the larger this expectation, the stronger the player.
-### 2.3 Algorithm (gradient ascent)
-Our objective function is $\overline {R}_\theta$, and the training task is:
-$$\theta^* = \arg\max_\theta \overline {R}_\theta \tag{5}$$
-
-To find the ideal $\theta$, we use gradient ascent, repeatedly updating $\theta$ along the gradient direction of $\overline {R}_\theta$:
-$$\theta' = \theta + \eta * \bigtriangledown \overline {R}_\theta \tag{6}$$
-
-$$ \bigtriangledown \overline {R}_\theta = \sum_\tau R(\tau) \bigtriangledown P(\tau|\theta)\\
-= \sum_\tau R(\tau) P(\tau|\theta) \frac{\bigtriangledown P(\tau|\theta)}{P(\tau|\theta)} \\
-=\sum_\tau R(\tau) P(\tau|\theta) {\bigtriangledown \log P(\tau|\theta)} $$
-
-
-$$P(\tau|\theta) = P(s_1)P(a_1|s_1,\theta)P(s_2, r_1|s_1,a_1)P(a_2|s_2,\theta)P(s_3,r_2|s_2,a_2)...P(a_t|s_t,\theta)P(s_{t+1}, r_t|s_t,a_t)\\
-=P(s_1) \prod_{t=1}^T P(a_t|s_t,\theta)P(s_{t+1}, r_t|s_t,a_t)$$
-
-$$\log P(\tau|\theta) = \log P(s_1) + \sum_{t=1}^T [\log P(a_t|s_t,\theta) + \log P(s_{t+1}, r_t|s_t,a_t)]$$
-
-$$ \bigtriangledown \log P(\tau|\theta) = \sum_{t=1}^T \bigtriangledown \log P(a_t|s_t,\theta)$$
-
-$$ \bigtriangledown \overline {R}_\theta = \sum_\tau R(\tau) P(\tau|\theta) {\bigtriangledown \log P(\tau|\theta)} \\
-\approx \frac{1}{N} \sum_{n=1}^N R(\tau^n) {\bigtriangledown \log P(\tau^n|\theta)} \\
-= \frac{1}{N} \sum_{n=1}^N R(\tau^n) {\sum_{t=1}^T \bigtriangledown \log P(a_t|s_t,\theta)} \\
-= \frac{1}{N} \sum_{n=1}^N \sum_{t=1}^T R(\tau^n) { \bigtriangledown \log P(a_t|s_t,\theta)} \tag{11}$$
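-
-Equation (11) can be estimated by straightforward Monte-Carlo sampling; the sketch below assumes a helper `grad_log_p(s, a)` standing in for the per-step score function that a framework would compute via backpropagation:
-
-```
-import numpy as np
-
-def reinforce_gradient(episodes, grad_log_p):
-    # episodes: list of (R, [(s_1, a_1), ..., (s_T, a_T)]) pairs
-    grads = [R * sum(grad_log_p(s, a) for s, a in steps)
-             for R, steps in episodes]
-    return np.mean(grads, axis=0)  # average over the N sampled episodes
-```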
-
-#### 2.3.2 Interpreting the derivative
-
-When solving with a deep learning framework we generally train by gradient descent, so we convert gradient ascent into gradient descent and rewrite equations $(5)(6)$ as:
-
-$$\theta^* = \arg\min_\theta (-\overline {R}_\theta) \tag{13}$$
-$$\theta' = \theta - \eta * \bigtriangledown (-\overline {R}_\theta) \tag{14}$$
-
-Following the derivation of the previous section, $-\bigtriangledown \overline {R}_\theta$ works out to:
-
-$$ -\bigtriangledown \overline {R}_\theta
-= \frac{1}{N} \sum_{n=1}^N \sum_{t=1}^T R(\tau^n) { \bigtriangledown [-\log P(a_t|s_t,\theta)]} \tag{15}$$
-
-Based on equation (14), our player model can be designed as:
-
-
-
-Figure 1 (see images/PG_1.svg)
-
-
-A single move in one game can be represented by the tuple $(s_t, a_t)$: in state $s_t$ the player performed action $a_t$. The forward network in Figure 1 computes the cross-entropy cost $-\log P(a_t|s_t,\theta)$, which is exactly the term we need to differentiate in equation (15).
-Figure 1 is the player model we need: its forward pass predicts which action to take in any state. But how do we train this network? Equation (15) still contains the factor $R(\tau^n)$, which has to be included in the backward gradient propagation, so we add $R(\tau^n)$ on top of Figure 1, as shown in Figure 2:
-
-
-
-Figure 2 (see images/PG_2.svg)
-
-
-Figure 2 is our final network structure.
-
-#### 2.3.3 Intuition
-Consider a single step of the game in equation (15), i.e. the term $R(\tau^n) { \bigtriangledown [-\log P(a_t|s_t,\theta)]}$. We can simply view the goal of training as making $R(\tau^n) {[ -\log P(a_t|s_t,\theta)]}$ as small as possible, i.e. making $R(\tau^n) \log P(a_t|s_t,\theta)$ as large as possible.
-
-- If the reward $R(\tau^n)$ of the current game is positive, we want the probability $P(a_t|s_t,\theta)$ of the current action to be as large as possible.
-- If the reward $R(\tau^n)$ of the current game is negative, we want the probability $P(a_t|s_t,\theta)$ of the current action to be as small as possible.
-
-#### 2.3.4 A problem
-
-"When one person errs, the whole clan is punished; when one attains the Dao, even the chickens and dogs ascend." If a game ends with a reward, we want every action that helped earn it to be reinforced; otherwise, every action that led to the penalty gets discouraged.
-Sounds reasonable, doesn't it? But what if some game scenarios only ever give rewards and never punishments, i.e. all $R(\tau^n)$ are positive?
-We have different solutions for different game scenarios:
-
-1. Scores differ from game to game: subtract a bias from each game's score, so the results become both positive and negative.
-2. Scores are the same for every game: take the time needed to finish a game into the score and subtract a bias.
-
-The game scenario described in Chapter 1 requires the second approach: the player receives a reward of 1 every time it reaches the end, so we can define the reward R in terms of the number of steps taken to finish the task.
-Going a step further, we believe that within one game different actions contribute differently to the outcome: there are smart moves and foolish ones. Intuitively, the earlier moves tend to be foolish and the later ones smart. Given this view, the reward of 1 should not be split evenly among all actions.
-As shown in Figure 3, we line up all actions in order, give each action a reward that decays from the last action backwards, and then subtract the mean of all action rewards from each action's reward (a sketch of this follows Figure 3):
-
-
-
-Figure 3 (see images/PG_3.svg)
-
-
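-A sketch of this decaying credit assignment (it mirrors `_discount_and_norm_rewards` in `brain.py` later in this document, which uses a decay factor of 0.95 rather than the 0.9 in the figure):
-
-```
-import numpy as np
-
-def discount_and_norm_rewards(ep_rs, gamma=0.9):
-    discounted = np.zeros(len(ep_rs), dtype="float32")
-    running_add = 0.0
-    for t in reversed(range(len(ep_rs))):
-        # rewards decay going backwards from the last action
-        running_add = running_add * gamma + ep_rs[t]
-        discounted[t] = running_add
-    # subtract the mean (and normalize) so returns are both positive and negative
-    discounted -= discounted.mean()
-    discounted /= discounted.std()
-    return discounted
-```
-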
-## 3. Training results
-
-The demo trains as shown below; after about 1000 rounds of attempts, our player has learned how to complete the task efficiently:
-
-```
----------O epoch: 0; steps: 42
----------O epoch: 1; steps: 77
----------O epoch: 2; steps: 82
----------O epoch: 3; steps: 64
----------O epoch: 4; steps: 79
----------O epoch: 501; steps: 19
----------O epoch: 1001; steps: 9
----------O epoch: 1501; steps: 9
----------O epoch: 2001; steps: 11
----------O epoch: 2501; steps: 9
----------O epoch: 3001; steps: 9
----------O epoch: 3002; steps: 9
----------O epoch: 3003; steps: 9
----------O epoch: 3004; steps: 9
----------O epoch: 3005; steps: 9
----------O epoch: 3006; steps: 9
----------O epoch: 3007; steps: 9
----------O epoch: 3008; steps: 9
----------O epoch: 3009; steps: 9
----------O epoch: 3010; steps: 11
----------O epoch: 3011; steps: 9
----------O epoch: 3012; steps: 9
----------O epoch: 3013; steps: 9
----------O epoch: 3014; steps: 9
-```
diff --git a/PaddleRL/policy_gradient/brain.py b/PaddleRL/policy_gradient/brain.py
deleted file mode 100644
index 27a2da28563e5063213100d34c1b88d5fe2f91b0..0000000000000000000000000000000000000000
--- a/PaddleRL/policy_gradient/brain.py
+++ /dev/null
@@ -1,94 +0,0 @@
-import numpy as np
-import paddle.fluid as fluid
-# reproducible
-np.random.seed(1)
-
-
-class PolicyGradient:
- def __init__(
- self,
- n_actions,
- n_features,
- learning_rate=0.01,
- reward_decay=0.95,
- output_graph=False, ):
- self.n_actions = n_actions
- self.n_features = n_features
- self.lr = learning_rate
- self.gamma = reward_decay
-
- self.ep_obs, self.ep_as, self.ep_rs = [], [], []
-
- self.place = fluid.CPUPlace()
- self.exe = fluid.Executor(self.place)
-
- def build_net(self):
-
- obs = fluid.layers.data(
- name='obs', shape=[self.n_features], dtype='float32')
- acts = fluid.layers.data(name='acts', shape=[1], dtype='int64')
- vt = fluid.layers.data(name='vt', shape=[1], dtype='float32')
- # fc1
- fc1 = fluid.layers.fc(input=obs, size=10, act="tanh") # tanh activation
- # fc2
-        self.all_act_prob = fluid.layers.fc(input=fc1,
-                                            size=self.n_actions,
-                                            act="softmax")
-        self.inference_program = fluid.default_main_program().clone()
- # to maximize total reward (log_p * R) is to minimize -(log_p * R)
- neg_log_prob = fluid.layers.cross_entropy(
- input=self.all_act_prob,
- label=acts) # this is negative log of chosen action
- neg_log_prob_weight = fluid.layers.elementwise_mul(x=neg_log_prob, y=vt)
- loss = fluid.layers.reduce_mean(
- neg_log_prob_weight) # reward guided loss
-
- sgd_optimizer = fluid.optimizer.SGD(self.lr)
- sgd_optimizer.minimize(loss)
- self.exe.run(fluid.default_startup_program())
-
- def choose_action(self, observation):
-        prob_weights = self.exe.run(self.inference_program,
- feed={"obs": observation[np.newaxis, :]},
- fetch_list=[self.all_act_prob])
- prob_weights = np.array(prob_weights[0])
- # select action w.r.t the actions prob
- action = np.random.choice(
- range(prob_weights.shape[1]), p=prob_weights.ravel())
- return action
-
- def store_transition(self, s, a, r):
- self.ep_obs.append(s)
- self.ep_as.append(a)
- self.ep_rs.append(r)
-
- def learn(self):
- # discount and normalize episode reward
- discounted_ep_rs_norm = self._discount_and_norm_rewards()
- tensor_obs = np.vstack(self.ep_obs).astype("float32")
- tensor_as = np.array(self.ep_as).astype("int64")
- tensor_as = tensor_as.reshape([tensor_as.shape[0], 1])
- tensor_vt = discounted_ep_rs_norm.astype("float32")[:, np.newaxis]
- # train on episode
- self.exe.run(
- fluid.default_main_program(),
- feed={
- "obs": tensor_obs, # shape=[None, n_obs]
- "acts": tensor_as, # shape=[None, ]
- "vt": tensor_vt # shape=[None, ]
- })
- self.ep_obs, self.ep_as, self.ep_rs = [], [], [] # empty episode data
- return discounted_ep_rs_norm
-
- def _discount_and_norm_rewards(self):
- # discount episode rewards
- discounted_ep_rs = np.zeros_like(self.ep_rs)
- running_add = 0
- for t in reversed(range(0, len(self.ep_rs))):
- running_add = running_add * self.gamma + self.ep_rs[t]
- discounted_ep_rs[t] = running_add
-
- # normalize episode rewards
- discounted_ep_rs -= np.mean(discounted_ep_rs)
- discounted_ep_rs /= np.std(discounted_ep_rs)
- return discounted_ep_rs
diff --git a/PaddleRL/policy_gradient/env.py b/PaddleRL/policy_gradient/env.py
deleted file mode 100644
index e2cd972dbc9a3943aceb9763b9dabcd50a1e6df1..0000000000000000000000000000000000000000
--- a/PaddleRL/policy_gradient/env.py
+++ /dev/null
@@ -1,56 +0,0 @@
-import time
-import sys
-import numpy as np
-
-
-class Env():
- def __init__(self, stage_len, interval):
- self.stage_len = stage_len
- self.end = self.stage_len - 1
- self.position = 0
- self.interval = interval
- self.step = 0
- self.epoch = -1
- self.render = False
-
- def reset(self):
- self.end = self.stage_len - 1
- self.position = 0
- self.epoch += 1
- self.step = 0
- if self.render:
- self.draw(True)
-
- def status(self):
- s = np.zeros([self.stage_len]).astype("float32")
- s[self.position] = 1
- return s
-
- def move(self, action):
- self.step += 1
- reward = 0.0
- done = False
- if action == 0:
- self.position = max(0, self.position - 1)
- else:
- self.position = min(self.end, self.position + 1)
- if self.render:
- self.draw()
- if self.position == self.end:
- reward = 1.0
- done = True
- return reward, done, self.status()
-
- def draw(self, new_line=False):
- if new_line:
- print ""
- else:
- print "\r",
- for i in range(self.stage_len):
- if i == self.position:
- sys.stdout.write("O")
- else:
- sys.stdout.write("-")
- sys.stdout.write(" epoch: %d; steps: %d" % (self.epoch, self.step))
- sys.stdout.flush()
- time.sleep(self.interval)
diff --git a/PaddleRL/policy_gradient/images/PG_1.svg b/PaddleRL/policy_gradient/images/PG_1.svg
deleted file mode 100644
index e2352ff57ceb70bdba013c55c35eb1dc1cabe275..0000000000000000000000000000000000000000
--- a/PaddleRL/policy_gradient/images/PG_1.svg
+++ /dev/null
@@ -1,3 +0,0 @@
-(SVG text content omitted. Figure 1: a feed-forward network - input s_t, two FC layers, softmax output y_t = P(a_t|s_t, θ) - whose cross-entropy cost is -log P(a_t|s_t, θ).)
diff --git a/PaddleRL/policy_gradient/images/PG_2.svg b/PaddleRL/policy_gradient/images/PG_2.svg
deleted file mode 100644
index 3697bf9feca0861c9c0b2da29980ba4c86a3f4d7..0000000000000000000000000000000000000000
--- a/PaddleRL/policy_gradient/images/PG_2.svg
+++ /dev/null
@@ -1,3 +0,0 @@
-(SVG text content omitted. Figure 2: the network of Figure 1 extended with R(τ^n), multiplied into the cross entropy to form -R(τ^n) log P(a_t|s_t, θ).)
diff --git a/PaddleRL/policy_gradient/images/PG_3.svg b/PaddleRL/policy_gradient/images/PG_3.svg
deleted file mode 100644
index 97b56c3fe1188e603a3bf5f6eabf7ea0ea3072c7..0000000000000000000000000000000000000000
--- a/PaddleRL/policy_gradient/images/PG_3.svg
+++ /dev/null
@@ -1,3 +0,0 @@
-(SVG text content omitted. Figure 3: per-action rewards decaying backwards from the last action (factors 0.9, 0.9^2, ..., 0.9^t), with the mean of all action rewards subtracted.)
diff --git a/PaddleRL/policy_gradient/run.py b/PaddleRL/policy_gradient/run.py
deleted file mode 100644
index 6f2f8c381a9d6452c5d7dfefb41f05eb4551d73a..0000000000000000000000000000000000000000
--- a/PaddleRL/policy_gradient/run.py
+++ /dev/null
@@ -1,29 +0,0 @@
-from brain import PolicyGradient
-from env import Env
-import numpy as np
-
-n_actions = 2
-interval = 0.01
-stage_len = 10
-epochs = 10000
-
-if __name__ == "__main__":
-
- brain = PolicyGradient(n_actions, stage_len)
- e = Env(stage_len, interval)
- brain.build_net()
- done = False
-
-    for epoch in range(epochs):
- if (epoch % 500 == 1) or epoch < 5 or epoch > 3000:
- e.render = True
- else:
- e.render = False
- e.reset()
- while not done:
- s = e.status()
- action = brain.choose_action(s)
- r, done, _ = e.move(action)
- brain.store_transition(s, action, r)
- done = False
- brain.learn()
diff --git a/PaddleSpeech/DeepASR/.gitignore b/PaddleSpeech/DeepASR/.gitignore
deleted file mode 100644
index 485dee64bcfb48793379b200a1afd14e85a8aaf4..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-.idea
diff --git a/PaddleSpeech/DeepASR/README.md b/PaddleSpeech/DeepASR/README.md
deleted file mode 100644
index 6b9913fd30a56ef2328bc62e9b36e496f6763430..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/README.md
+++ /dev/null
@@ -1,36 +0,0 @@
-The code samples in this directory require the latest develop branch of PaddlePaddle. If you are on an earlier version of PaddlePaddle, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
-
-## Deep Automatic Speech Recognition
-
-### Introduction
-TBD
-
-### Installation
-
-#### Kaldi
-The decoder depends on [kaldi](https://github.com/kaldi-asr/kaldi); install it by following its instructions. Then
-
-```shell
-export KALDI_ROOT=
-```
-
-#### Decoder
-
-```shell
-git clone https://github.com/PaddlePaddle/models.git
-cd models/fluid/DeepASR/decoder
-sh setup.sh
-```
-
-### Data preprocessing
-TBD
-
-### Training
-TBD
-
-
-### Inference & Decoding
-TBD
-
-### Question and Contribution
-TBD
diff --git a/PaddleSpeech/DeepASR/README_cn.md b/PaddleSpeech/DeepASR/README_cn.md
deleted file mode 100644
index be78a048701a621bd90942bdfe30ef4d7c7f082f..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/README_cn.md
+++ /dev/null
@@ -1,186 +0,0 @@
-Running the examples in this directory requires PaddlePaddle v0.14 or above. If your installed version of PaddlePaddle is below this requirement, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update it.
-
----
-
-DeepASR (Deep Automatic Speech Recognition) is a speech recognition system based on PaddlePaddle Fluid and [Kaldi](http://www.kaldi-asr.org). It uses the Fluid framework to configure and train the acoustic model of the recognizer and integrates Kaldi's decoder. The aim is to let users who are already familiar with Kaldi train acoustic models quickly and at large scale, while leaving the complex preprocessing of speech data and the final decoding to Kaldi.
-
-### Contents
-- [Model overview](#model-overview)
-- [Installation](#installation)
-- [Data preprocessing](#data-reprocessing)
-- [Training](#training)
-- [Profiling the training](#perf-profiling)
-- [Inference and decoding](#infer-decoding)
-- [Scoring the error rate](#scoring-error-rate)
-- [The Aishell example](#aishell-example)
-- [Contributing more examples](#how-to-contrib)
-
-### Model overview
-
-DeepASR's acoustic model is a single convolutional layer followed by a stack of LSTMP layers: the convolution performs preliminary feature extraction, the stacked LSTMP layers model the temporal dependencies, and the loss function is cross entropy. [LSTMP](https://arxiv.org/abs/1402.1128) (LSTM with recurrent projection layer) is an extension of the traditional LSTM that adds a projection layer mapping the hidden state to a lower dimension, which is then fed into the next time step. This greatly reduces the parameter count and computational complexity of the LSTM while also improving its performance. A single-step sketch follows Figure 1.
-
-
-
-Figure 1. Topology of LSTMP
-
-
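-One LSTMP time step in plain numpy (a simplified sketch with a single fused gate matrix and no peephole connections; the parameter layout here is an assumption, not Fluid's):
-
-```
-import numpy as np
-
-def sigmoid(v):
-    return 1.0 / (1.0 + np.exp(-v))
-
-def lstmp_step(x, r_prev, c_prev, W, W_p):
-    z = np.dot(np.concatenate([x, r_prev]), W)  # fused pre-activations of the 4 gates
-    i, f, o, g = np.split(z, 4)
-    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)  # new cell state
-    h = sigmoid(o) * np.tanh(c)                        # hidden state
-    r = np.dot(h, W_p)  # project h to a lower dimension; r feeds the next step
-    return r, c
-```
-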
-### Installation
-
-
-#### Installing and configuring Kaldi
-
-
-The decoder used by DeepASR depends on an [installation of Kaldi](https://github.com/kaldi-asr/kaldi). If Kaldi is not available in your environment, please `git clone` its source code, install it with the given commands, and finally set the environment variable `KALDI_ROOT`:
-
-```shell
-export KALDI_ROOT=
-
-```
-#### Installing the decoder
-Enter the directory containing the decoder source code:
-
-```shell
-cd models/fluid/DeepASR/decoder
-```
-Run the setup script:
-
-```shell
-sh setup.sh
-```
- Once compilation finishes, the decoder is installed successfully.
-
-### Data preprocessing
-
-Follow [Kaldi's data preparation process](http://kaldi-asr.org/doc/data_prep.html) to perform feature extraction and label alignment on the audio data.
-
-### Training the acoustic model
-
-The acoustic model can be trained in CPU or GPU mode; for example, to train on GPUs:
-
-```shell
-CUDA_VISIBLE_DEVICES=0,1,2,3 python -u train.py \
- --train_feature_lst train_feature.lst \
- --train_label_lst train_label.lst \
- --val_feature_lst val_feature.lst \
- --val_label_lst val_label.lst \
- --mean_var global_mean_var \
- --parallel
-```
-Here `train_feature.lst` and `train_label.lst` are the feature-list file and the label-list file of the training set; likewise, `val_feature.lst` and `val_label.lst` are the list files of the validation set (an illustrative list-file layout is sketched below). In actual training, important parameters such as the size of the modeling units and the learning rate must be specified correctly. For an explanation of these parameters, run
-
-```shell
-python train.py --help
-```
-to get more information.
-
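-Based on how `data_utils/async_data_reader.py` consumes these lists (lines are read in pairs: a binary data file followed by its description file), a `train_feature.lst` might look like the following; the paths here are hypothetical:
-
-```
-data/feature/train_block0.bin
-data/feature/train_block0.desc
-data/feature/train_block1.bin
-data/feature/train_block1.desc
-```
-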
-### Profiling the training
-
-Using profiler, the performance analysis tool provided by Fluid, you can profile the training process and obtain operator-level execution times for the network:
-
-```shell
-CUDA_VISIBLE_DEVICES=0 python -u tools/profile.py \
- --train_feature_lst train_feature.lst \
- --train_label_lst train_label.lst \
- --val_feature_lst val_feature.lst \
- --val_label_lst val_label.lst \
- --mean_var global_mean_var
-```
-
-
-### Inference and decoding
-
-Once the acoustic model has been sufficiently trained, the checkpoints saved during training can be used to decode input audio and produce the speech-to-text recognition result:
-
-```
-CUDA_VISIBLE_DEVICES=0,1,2,3 python -u infer_by_ckpt.py \
- --batch_size 96 \
- --checkpoint deep_asr.pass_1.checkpoint \
- --infer_feature_lst test_feature.lst \
- --infer_label_lst test_label.lst \
- --mean_var global_mean_var \
- --parallel
-```
-
-### Scoring the error rate
-
-Word Error Rate (WER) and Character Error Rate (CER) are commonly used metrics for evaluating speech recognition systems. DeepASR also implements the corresponding measurement tools, run as follows:
-
-```
-python score_error_rate.py --error_rate_type cer --ref ref.txt --hyp decoding.txt
-```
-The parameter `error_rate_type` indicates which error rate to measure, i.e. WER or CER; `ref.txt` and `decoding.txt` are the reference text and the actually decoded text respectively, and they share the same format (a toy CER computation is sketched after the example):
-
-```
-key1 text1
-key2 text2
-key3 text3
-...
-
-```
-
-
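-For intuition, CER is the character-level edit distance between the reference and the hypothesis divided by the reference length; a minimal sketch (not the implementation used by `score_error_rate.py`):
-
-```
-def cer(ref, hyp):
-    # dynamic-programming Levenshtein distance, two rows at a time
-    prev = list(range(len(hyp) + 1))
-    for i, r in enumerate(ref, 1):
-        cur = [i] + [0] * len(hyp)
-        for j, h in enumerate(hyp, 1):
-            cur[j] = min(prev[j] + 1,             # deletion
-                         cur[j - 1] + 1,          # insertion
-                         prev[j - 1] + (r != h))  # substitution
-        prev = cur
-    return float(prev[-1]) / len(ref)
-```
-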
-### The Aishell example
-
-This section takes the [Aishell dataset](http://www.aishelltech.com/kysjcp) as an example to show the full pipeline from data preprocessing to decoding output. Aishell is an open Mandarin speech dataset released by Beijing Shell Shell Technology: 178 hours of speech recorded by 400 speakers from different accent regions. The raw data can be obtained from [openslr](http://www.openslr.org/33). To simplify the procedure, a preprocessed version of the dataset is provided for download:
-
-```
-cd examples/aishell
-sh prepare_data.sh
-```
-
-The download includes the training data for the acoustic model and the auxiliary files used during decoding. Once the download completes, you may profile the training process before starting to train:
-
-```
-sh profile.sh
-```
-
-Run training:
-
-```
-sh train.sh
-```
-By default training runs on 4 GPUs. In practice, parameters such as `batch_size` and the learning rate can be adjusted dynamically according to the number of available GPUs and their memory size. Typical curves of the loss function and accuracy during training are shown in Figure 2:
-
-
-
-Figure 2. Learning curves of the acoustic model on the Aishell dataset
-
-
-After the model is trained, run inference to recognize the text in the test-set speech:
-
-```
-sh infer_by_ckpt.sh
-```
-
-This involves two important stages: prediction by the acoustic model and decoding output by the decoder. Below is a sample of the decoding output:
-
-```
-...
-BAC009S0764W0239 十一 五 期间 我 国 累计 境外 投资 七千亿 美元
-BAC009S0765W0140 在 了解 送 方 的 资产 情况 与 需求 之后
-BAC009S0915W0291 这 对 苹果 来说 不 是 件 容易 的 事 儿
-BAC009S0769W0159 今年 土地 收入 预计 近 四万亿 元
-BAC009S0907W0451 由 浦东 商店 作为 掩护
-BAC009S0768W0128 土地 交易 可能 随着 供应 淡季 的 到来 而 降温
-...
-```
-
-Each line is one output: it starts with the key of the audio sample, followed by the decoded Chinese text separated by words. After decoding completes, run the scoring script to evaluate the character error rate (CER):
-
-```
-sh score_cer.sh
-```
-
-Its output looks like the following:
-
-```
-Error rate[cer] = 0.101971 (10683/104765),
-total 7176 sentences in hyp, 0 not presented in ref.
-```
-
-With an acoustic model trained for about 20 passes, a recognition result with a CER of roughly 10% can be obtained on the Aishell test set.
-
-
-### Contributing more examples
-
-DeepASR currently only ships the Aishell example. We welcome users to test the complete training pipeline on more datasets and contribute them to this project.
diff --git a/PaddleSpeech/DeepASR/data_utils/__init__.py b/PaddleSpeech/DeepASR/data_utils/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/PaddleSpeech/DeepASR/data_utils/async_data_reader.py b/PaddleSpeech/DeepASR/data_utils/async_data_reader.py
deleted file mode 100644
index edface051129b248bad85978118daec6f8660adc..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/data_utils/async_data_reader.py
+++ /dev/null
@@ -1,465 +0,0 @@
-"""This module contains data processing related logic.
-"""
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import random
-import struct
-import Queue
-import time
-import numpy as np
-from threading import Thread
-import signal
-from multiprocessing import Manager, Process
-import data_utils.augmentor.trans_mean_variance_norm as trans_mean_variance_norm
-import data_utils.augmentor.trans_add_delta as trans_add_delta
-from data_utils.util import suppress_complaints, suppress_signal
-from data_utils.util import CriticalException, ForceExitWrapper
-
-
-class SampleInfo(object):
- """SampleInfo holds the necessary information to load a sample from disk.
-
- Args:
- feature_bin_path (str): File containing the feature data.
- feature_start (int): Start position of the sample's feature data.
- feature_size (int): Byte count of the sample's feature data.
- feature_frame_num (int): Time length of the sample.
- feature_dim (int): Feature dimension of one frame.
- label_bin_path (str): File containing the label data.
- label_size (int): Byte count of the sample's label data.
- label_frame_num (int): Label number of the sample.
- sample_name (str): Key of the sample
- """
-
- def __init__(self, feature_bin_path, feature_start, feature_size,
- feature_frame_num, feature_dim, label_bin_path, label_start,
- label_size, label_frame_num, sample_name):
- self.feature_bin_path = feature_bin_path
- self.feature_start = feature_start
- self.feature_size = feature_size
- self.feature_frame_num = feature_frame_num
- self.feature_dim = feature_dim
-
- self.label_bin_path = label_bin_path
- self.label_start = label_start
- self.label_size = label_size
- self.label_frame_num = label_frame_num
- self.sample_name = sample_name
-
-
-class SampleInfoBucket(object):
- """SampleInfoBucket contains paths of several description files. Feature
- description file contains necessary information (including path of binary
- data, sample start position, sample byte number etc.) to access samples'
- feature data and the same with the label description file. SampleInfoBucket
- is the minimum unit to do shuffle.
-
- Args:
- feature_bin_paths (list|tuple): Files containing the binary feature
- data.
- feature_desc_paths (list|tuple): Files containing the description of
- samples' feature data.
- label_bin_paths (list|tuple): Files containing the binary label data.
- label_desc_paths (list|tuple): Files containing the description of
- samples' label data.
- split_perturb(int): Maximum perturbation value for length of
- sub-sentence when splitting long sentence.
- split_sentence_threshold(int): Sentence whose length larger than
- the value will trigger split operation.
- split_sub_sentence_len(int): sub-sentence length is equal to
- (split_sub_sentence_len
- + rand() % split_perturb).
- """
-
- def __init__(self,
- feature_bin_paths,
- feature_desc_paths,
- label_bin_paths,
- label_desc_paths,
- split_perturb=50,
- split_sentence_threshold=512,
- split_sub_sentence_len=256):
- block_num = len(label_bin_paths)
- assert len(label_desc_paths) == block_num
- assert len(feature_bin_paths) == block_num
- assert len(feature_desc_paths) == block_num
- self._block_num = block_num
-
- self._feature_bin_paths = feature_bin_paths
- self._feature_desc_paths = feature_desc_paths
- self._label_bin_paths = label_bin_paths
- self._label_desc_paths = label_desc_paths
- self._split_perturb = split_perturb
- self._split_sentence_threshold = split_sentence_threshold
- self._split_sub_sentence_len = split_sub_sentence_len
- self._rng = random.Random(0)
-
- def generate_sample_info_list(self):
- sample_info_list = []
- for block_idx in xrange(self._block_num):
- label_bin_path = self._label_bin_paths[block_idx]
- label_desc_path = self._label_desc_paths[block_idx]
- feature_bin_path = self._feature_bin_paths[block_idx]
- feature_desc_path = self._feature_desc_paths[block_idx]
-
- feature_desc_lines = open(feature_desc_path).readlines()
-
- label_desc_lines = []
- if label_desc_path != "":
- label_desc_lines = open(label_desc_path).readlines()
- sample_num = int(feature_desc_lines[0].split()[1])
-
- if label_desc_path != "":
- assert sample_num == int(label_desc_lines[0].split()[1])
-
- for i in xrange(sample_num):
- feature_desc_split = feature_desc_lines[i + 1].split()
- sample_name = feature_desc_split[0]
- feature_start = int(feature_desc_split[2])
- feature_size = int(feature_desc_split[3])
- feature_frame_num = int(feature_desc_split[4])
- feature_dim = int(feature_desc_split[5])
-
- label_start = -1
- label_size = -1
- label_frame_num = feature_frame_num
- if label_desc_path != "":
- label_desc_split = label_desc_lines[i + 1].split()
- label_start = int(label_desc_split[2])
- label_size = int(label_desc_split[3])
- label_frame_num = int(label_desc_split[4])
- assert feature_frame_num == label_frame_num
-
- if self._split_sentence_threshold == -1 or \
- self._split_perturb == -1 or \
- self._split_sub_sentence_len == -1 \
- or self._split_sentence_threshold >= feature_frame_num:
- sample_info_list.append(
- SampleInfo(feature_bin_path, feature_start,
- feature_size, feature_frame_num, feature_dim,
- label_bin_path, label_start, label_size,
- label_frame_num, sample_name))
- #split sentence
- else:
- cur_frame_pos = 0
- cur_frame_len = 0
- remain_frame_num = feature_frame_num
- while True:
- if remain_frame_num > self._split_sentence_threshold:
- cur_frame_len = self._split_sub_sentence_len + \
- self._rng.randint(0, self._split_perturb)
- if cur_frame_len > remain_frame_num:
- cur_frame_len = remain_frame_num
- else:
- cur_frame_len = remain_frame_num
-
- sample_info_list.append(
- SampleInfo(
- feature_bin_path, feature_start + cur_frame_pos
- * feature_dim * 4, cur_frame_len * feature_dim *
- 4, cur_frame_len, feature_dim, label_bin_path,
- label_start + cur_frame_pos * 4, cur_frame_len *
- 4, cur_frame_len, sample_name))
-
- remain_frame_num -= cur_frame_len
- cur_frame_pos += cur_frame_len
- if remain_frame_num <= 0:
- break
- return sample_info_list
-
-
-class EpochEndSignal():
- pass
-
-
-class AsyncDataReader(object):
- """DataReader provides basic audio sample preprocessing pipeline including
- data loading and data augmentation.
-
- Args:
- feature_file_list (str): File containing paths of feature data file and
- corresponding description file.
- label_file_list (str): File containing paths of label data file and
- corresponding description file.
-        drop_frame_len (int): Samples whose label length is above this value
-                              will be dropped. (Set to '-1' to disable this policy.)
- split_sentence_threshold(int): Sentence whose length larger than
- the value will trigger split operation.
- (Assign -1 to disable split)
- proc_num (int): Number of processes for processing data.
- sample_buffer_size (int): Buffer size to indicate the maximum samples
- cached.
- sample_info_buffer_size (int): Buffer size to indicate the maximum
- sample information cached.
- batch_buffer_size (int): Buffer size to indicate the maximum batch
- cached.
- shuffle_block_num (int): Block number indicating the minimum unit to do
- shuffle.
- random_seed (int): Random seed.
- verbose (int): If set to 0, complaints including exceptions and signal
- traceback from sub-process will be suppressed. If set
- to 1, all complaints will be printed.
- """
-
- def __init__(self,
- feature_file_list,
- label_file_list="",
- drop_frame_len=512,
- split_sentence_threshold=1024,
- proc_num=10,
- sample_buffer_size=1024,
- sample_info_buffer_size=1024,
- batch_buffer_size=10,
- shuffle_block_num=10,
- random_seed=0,
- verbose=0):
- self._feature_file_list = feature_file_list
- self._label_file_list = label_file_list
- self._drop_frame_len = drop_frame_len
- self._split_sentence_threshold = split_sentence_threshold
- self._shuffle_block_num = shuffle_block_num
- self._block_info_list = None
- self._rng = random.Random(random_seed)
- self._bucket_list = None
- self.generate_bucket_list(True)
- self._order_id = 0
- self._manager = Manager()
- self._sample_buffer_size = sample_buffer_size
- self._sample_info_buffer_size = sample_info_buffer_size
- self._batch_buffer_size = batch_buffer_size
- self._proc_num = proc_num
- self._verbose = verbose
- self._force_exit = ForceExitWrapper(self._manager.Value('b', False))
-
- def generate_bucket_list(self, is_shuffle):
- if self._block_info_list is None:
- block_feature_info_lines = open(self._feature_file_list).readlines()
- self._block_info_list = []
- if self._label_file_list != "":
- block_label_info_lines = open(self._label_file_list).readlines()
- assert len(block_feature_info_lines) == len(
- block_label_info_lines)
- for i in xrange(0, len(block_feature_info_lines), 2):
- block_info = (block_feature_info_lines[i],
- block_feature_info_lines[i + 1],
- block_label_info_lines[i],
- block_label_info_lines[i + 1])
- self._block_info_list.append(
- map(lambda line: line.strip(), block_info))
- else:
- for i in xrange(0, len(block_feature_info_lines), 2):
- block_info = (block_feature_info_lines[i],
- block_feature_info_lines[i + 1], "", "")
- self._block_info_list.append(
- map(lambda line: line.strip(), block_info))
-
- if is_shuffle:
- self._rng.shuffle(self._block_info_list)
-
- self._bucket_list = []
- for i in xrange(0, len(self._block_info_list), self._shuffle_block_num):
- bucket_block_info = self._block_info_list[i:i +
- self._shuffle_block_num]
- self._bucket_list.append(
- SampleInfoBucket(
- map(lambda info: info[0], bucket_block_info),
- map(lambda info: info[1], bucket_block_info),
- map(lambda info: info[2], bucket_block_info),
- map(lambda info: info[3], bucket_block_info),
- split_sentence_threshold=self._split_sentence_threshold))
-
- # @TODO make this configurable
- def set_transformers(self, transformers):
- self._transformers = transformers
-
- def _sample_generator(self):
- sample_info_queue = self._manager.Queue(self._sample_info_buffer_size)
- sample_queue = self._manager.Queue(self._sample_buffer_size)
- self._order_id = 0
-
- @suppress_complaints(verbose=self._verbose, notify=self._force_exit)
- def ordered_feeding_task(sample_info_queue):
- for sample_info_bucket in self._bucket_list:
- try:
- sample_info_list = \
- sample_info_bucket.generate_sample_info_list()
- except Exception as e:
- raise CriticalException(e)
- else:
- self._rng.shuffle(sample_info_list) # do shuffle here
- for sample_info in sample_info_list:
- sample_info_queue.put((sample_info, self._order_id))
- self._order_id += 1
-
- for i in xrange(self._proc_num):
- sample_info_queue.put(EpochEndSignal())
-
- feeding_thread = Thread(
- target=ordered_feeding_task, args=(sample_info_queue, ))
- feeding_thread.daemon = True
- feeding_thread.start()
-
- @suppress_complaints(verbose=self._verbose, notify=self._force_exit)
- def ordered_processing_task(sample_info_queue, sample_queue, out_order):
- if self._verbose == 0:
- signal.signal(signal.SIGTERM, suppress_signal)
- signal.signal(signal.SIGINT, suppress_signal)
-
- def read_bytes(fpath, start, size):
- try:
- f = open(fpath, 'r')
- f.seek(start, 0)
- binary_bytes = f.read(size)
- f.close()
- return binary_bytes
- except Exception as e:
- raise CriticalException(e)
-
- ins = sample_info_queue.get()
-
- while not isinstance(ins, EpochEndSignal):
- sample_info, order_id = ins
-
- feature_bytes = read_bytes(sample_info.feature_bin_path,
- sample_info.feature_start,
- sample_info.feature_size)
-
- assert sample_info.feature_frame_num \
- * sample_info.feature_dim * 4 \
- == len(feature_bytes), \
- (sample_info.feature_bin_path,
- sample_info.feature_frame_num,
- sample_info.feature_dim,
- len(feature_bytes))
-
- label_data = None
- if sample_info.label_bin_path != "":
- label_bytes = read_bytes(sample_info.label_bin_path,
- sample_info.label_start,
- sample_info.label_size)
-
- assert sample_info.label_frame_num * 4 == len(
- label_bytes), (sample_info.label_bin_path,
-                                       sample_info.label_frame_num,
- len(label_bytes))
-
- label_array = struct.unpack(
- 'I' * sample_info.label_frame_num, label_bytes)
- label_data = np.array(
- label_array, dtype='int64').reshape(
- (sample_info.label_frame_num, 1))
- else:
- label_data = np.zeros(
- (sample_info.label_frame_num, 1), dtype='int64')
-
- feature_frame_num = sample_info.feature_frame_num
- feature_dim = sample_info.feature_dim
- assert feature_frame_num * feature_dim * 4 == len(feature_bytes)
- feature_array = struct.unpack('f' * feature_frame_num *
- feature_dim, feature_bytes)
- feature_data = np.array(
- feature_array, dtype='float32').reshape((
- sample_info.feature_frame_num, sample_info.feature_dim))
- sample_data = (feature_data, label_data,
- sample_info.sample_name)
- for transformer in self._transformers:
-                    # @TODO(pkuyym) make the transformer accept only feature_data
- sample_data = transformer.perform_trans(sample_data)
- while order_id != out_order[0]:
- time.sleep(0.001)
-
- # drop long sentence
- if self._drop_frame_len == -1 or \
- self._drop_frame_len >= sample_data[0].shape[0]:
- sample_queue.put(sample_data)
-
- out_order[0] += 1
- ins = sample_info_queue.get()
-
- sample_queue.put(EpochEndSignal())
-
- out_order = self._manager.list([0])
- args = (sample_info_queue, sample_queue, out_order)
- workers = [
- Process(
- target=ordered_processing_task, args=args)
- for _ in xrange(self._proc_num)
- ]
-
- for w in workers:
- w.daemon = True
- w.start()
-
- finished_proc_num = 0
-
- while self._force_exit == False:
- try:
- sample = sample_queue.get_nowait()
- except Queue.Empty:
- time.sleep(0.001)
- else:
- if isinstance(sample, EpochEndSignal):
- finished_proc_num += 1
- if finished_proc_num >= self._proc_num:
- break
- else:
- continue
-
- yield sample
-
- def batch_iterator(self, batch_size, minimum_batch_size):
- def batch_to_ndarray(batch_samples, lod):
- assert len(batch_samples)
- frame_dim = batch_samples[0][0].shape[1]
- batch_feature = np.zeros((lod[-1], frame_dim), dtype="float32")
- batch_label = np.zeros((lod[-1], 1), dtype="int64")
- start = 0
- name_lst = []
- for sample in batch_samples:
- frame_num = sample[0].shape[0]
- batch_feature[start:start + frame_num, :] = sample[0]
- batch_label[start:start + frame_num, :] = sample[1]
- start += frame_num
- name_lst.append(sample[2])
- return (batch_feature, batch_label, name_lst)
-
- @suppress_complaints(verbose=self._verbose, notify=self._force_exit)
- def batch_assembling_task(sample_generator, batch_queue):
- batch_samples = []
- lod = [0]
- for sample in sample_generator():
- batch_samples.append(sample)
- lod.append(lod[-1] + sample[0].shape[0])
- if len(batch_samples) == batch_size:
- (batch_feature, batch_label, name_lst) = batch_to_ndarray(
- batch_samples, lod)
- batch_queue.put((batch_feature, batch_label, lod, name_lst))
- batch_samples = []
- lod = [0]
-
- if len(batch_samples) >= minimum_batch_size:
- (batch_feature, batch_label, name_lst) = batch_to_ndarray(
- batch_samples, lod)
- batch_queue.put((batch_feature, batch_label, lod, name_lst))
-
- batch_queue.put(EpochEndSignal())
-
- batch_queue = Queue.Queue(self._batch_buffer_size)
-
- assembling_thread = Thread(
- target=batch_assembling_task,
- args=(self._sample_generator, batch_queue))
- assembling_thread.daemon = True
- assembling_thread.start()
-
- while self._force_exit == False:
- try:
- batch_data = batch_queue.get_nowait()
- except Queue.Empty:
- time.sleep(0.001)
- else:
- if isinstance(batch_data, EpochEndSignal):
- break
- yield batch_data
diff --git a/PaddleSpeech/DeepASR/data_utils/augmentor/__init__.py b/PaddleSpeech/DeepASR/data_utils/augmentor/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/PaddleSpeech/DeepASR/data_utils/augmentor/tests/__init__.py b/PaddleSpeech/DeepASR/data_utils/augmentor/tests/__init__.py
deleted file mode 100644
index 90856dc44374211453f7de128c08c8004ffda912..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/data_utils/augmentor/tests/__init__.py
+++ /dev/null
@@ -1,7 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import data_utils.augmentor.trans_mean_variance_norm as trans_mean_variance_norm
-import data_utils.augmentor.trans_add_delta as trans_add_delta
-import data_utils.augmentor.trans_splice as trans_splice
diff --git a/PaddleSpeech/DeepASR/data_utils/augmentor/tests/data/global_mean_var_search26kHr b/PaddleSpeech/DeepASR/data_utils/augmentor/tests/data/global_mean_var_search26kHr
deleted file mode 100644
index 7fabadc789bbd7aaad4e9ac59aba95b080c68b22..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/data_utils/augmentor/tests/data/global_mean_var_search26kHr
+++ /dev/null
@@ -1,120 +0,0 @@
-16.2845556399 11.6891798673
-17.21509949 12.3788567902
-18.1143704548 14.9912618017
-19.2335963752 18.5419556172
-19.9266772451 21.2768220522
-19.8245737202 21.2347210705
-19.5432940972 20.2784036567
-19.4631271754 20.2934452329
-19.3929919324 20.457971868
-19.2924788362 20.3626439234
-18.9207244502 19.9196569759
-18.7202605641 19.5920276899
-18.4844279398 19.2068349019
-18.2670948624 18.8716893824
-18.0929628855 18.5439666541
-17.8428896026 18.0255891747
-17.6646850635 17.473764296
-17.4955705896 16.8966859471
-17.3706720293 16.4294027467
-17.2530867792 16.0514717623
-17.1304341172 15.7234699057
-17.0038353287 15.4344471514
-16.902550309 15.1603287337
-16.8375590047 14.9304337826
-16.816287853 14.9119310513
-16.828838265 15.0930023024
-16.8602209498 15.3771992423
-16.9101763812 15.6897991789
-16.9466065143 15.9364556489
-16.9486061956 16.0699417826
-16.9041374104 16.0796970272
-16.8410093699 16.0111444599
-16.7045718836 15.7991985601
-16.51128489 15.5208920129
-16.3253910608 15.2603181921
-16.1297317333 14.9499965958
-15.903428372 14.5958280409
-15.6131718105 14.2709618
-15.1395035533 13.9993939893
-14.4298229999 13.3841189151
-0.0034970565424 0.246184766149
-0.00501284154705 0.238484972472
-0.00605942680019 0.269064381708
-0.00687266156243 0.319479238011
-0.00734065019253 0.371947383205
-0.00718807218417 0.384426479694
-0.00652195540212 0.384676838281
-0.00660416525951 0.395543910317
-0.00680202057642 0.400803979681
-0.00659144183007 0.393228973031
-0.00605294530423 0.385021118038
-0.00590452969394 0.361763039625
-0.00612315374687 0.346777773373
-0.00582354093973 0.335802403976
-0.00574556002554 0.320733728218
-0.00612254485891 0.310153103033
-0.00626733043219 0.299854747445
-0.00567398408041 0.293353685493
-0.00519236700706 0.287668810947
-0.00529581474367 0.281479660772
-0.00479019484082 0.27451415777
-0.00486381039428 0.266294391154
-0.00491126372868 0.258105116126
-0.00452105305011 0.252926328298
-0.00531483334271 0.250910887373
-0.00546572110469 0.253302256977
-0.00479544857908 0.258484183394
-0.00422106426297 0.264582900173
-0.00401824135188 0.268467945623
-0.0041705465252 0.269699480291
-0.00405239564143 0.270406162975
-0.0040059737566 0.270407601782
-0.00406426729317 0.267951582656
-0.00416613791013 0.264543833042
-0.00427847607653 0.26247798891
-0.00428050903034 0.259635263243
-0.00454842971786 0.255829377617
-0.00393747552387 0.253802307025
-0.00374143688909 0.251011478787
-0.00335475310258 0.236543650856
-0.000373194755312 0.0419494800709
-0.000230909648678 0.0394102370205
-0.000150840015851 0.0414956922398
-8.44401840771e-05 0.0460502231327
--6.24759314572e-06 0.0528049937739
--8.82957758148e-05 0.055711244886
-1.16795791952e-05 0.0563188428833
--1.68716267856e-05 0.0575232763711
--0.000112625308645 0.057979929947
--0.000122619090002 0.0564126233493
-1.73569637319e-05 0.05522573909
-6.49872782342e-05 0.0507353361334
-4.17746389178e-05 0.0479568131253
-5.13884475653e-05 0.0461253238047
-1.8860115143e-05 0.0436860476919
--5.64317701105e-05 0.042516381059
--0.000136859948115 0.0413574820205
--7.00847019726e-05 0.0409516370727
--5.39392223336e-05 0.040441504085
--9.24897162815e-05 0.0397800398173
-4.7104970622e-05 0.039046286243
-6.24805896165e-06 0.0380185986602
--2.35272813418e-05 0.036851063786
-5.88344154127e-05 0.0361640489242
--8.39162076993e-05 0.0357639427311
--0.000108702805776 0.0358774639538
-3.22013961834e-06 0.0363644530435
-9.43501518394e-05 0.0370309934774
-0.000134406229423 0.0374972993343
-3.84007008533e-05 0.037676222515
-3.05989328157e-05 0.0379111939182
-9.52201629091e-05 0.0380927209106
-0.000102126083729 0.0379925358499
-6.98628072264e-05 0.0377276252241
-4.55782256339e-05 0.0375165468654
-4.76370987786e-05 0.0371482526345
--2.24128832709e-05 0.0366810742947
-0.000125621306953 0.036628355271
-0.000134568666093 0.0364860461759
-0.000159858844464 0.0345583593149
diff --git a/PaddleSpeech/DeepASR/data_utils/augmentor/tests/test_data_trans.py b/PaddleSpeech/DeepASR/data_utils/augmentor/tests/test_data_trans.py
deleted file mode 100644
index 6b18f3fa5958a9e44899b39b1f583311f186f72e..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/data_utils/augmentor/tests/test_data_trans.py
+++ /dev/null
@@ -1,136 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import sys
-import unittest
-import numpy as np
-import data_utils.augmentor.trans_mean_variance_norm as trans_mean_variance_norm
-import data_utils.augmentor.trans_add_delta as trans_add_delta
-import data_utils.augmentor.trans_splice as trans_splice
-import data_utils.augmentor.trans_delay as trans_delay
-
-
-class TestTransMeanVarianceNorm(unittest.TestCase):
- """unit test for TransMeanVarianceNorm
- """
-
- def setUp(self):
- self._file_path = "./data_utils/augmentor/tests/data/" \
- "global_mean_var_search26kHr"
-
- def test(self):
- feature = np.zeros((2, 120), dtype="float32")
- feature.fill(1)
- trans = trans_mean_variance_norm.TransMeanVarianceNorm(self._file_path)
- (feature1, label1, name) = trans.perform_trans((feature, None, None))
- (mean, var) = trans.get_mean_var()
- feature_flat1 = feature1.flatten()
- feature_flat = feature.flatten()
- one = np.ones((1), dtype="float32")
- for idx, val in enumerate(feature_flat1):
- cur_idx = idx % 120
- self.assertAlmostEqual(val, (one[0] - mean[cur_idx]) * var[cur_idx])
-
-
-class TestTransAddDelta(unittest.TestCase):
- """unit test TestTransAddDelta
- """
-
- def test_regress(self):
- """test regress
- """
- feature = np.zeros((14, 120), dtype="float32")
- feature[0:5, 0:40].fill(1)
- feature[0 + 5, 0:40].fill(1)
- feature[1 + 5, 0:40].fill(2)
- feature[2 + 5, 0:40].fill(3)
- feature[3 + 5, 0:40].fill(4)
- feature[8:14, 0:40].fill(4)
- trans = trans_add_delta.TransAddDelta()
- feature = feature.reshape((14 * 120))
- trans._regress(feature, 5 * 120, feature, 5 * 120 + 40, 40, 4, 120)
- trans._regress(feature, 5 * 120 + 40, feature, 5 * 120 + 80, 40, 4, 120)
- feature = feature.reshape((14, 120))
- tmp_feature = feature[5:5 + 4, :]
- self.assertAlmostEqual(1.0, tmp_feature[0][0])
- self.assertAlmostEqual(0.24, tmp_feature[0][119])
- self.assertAlmostEqual(2.0, tmp_feature[1][0])
- self.assertAlmostEqual(0.13, tmp_feature[1][119])
- self.assertAlmostEqual(3.0, tmp_feature[2][0])
- self.assertAlmostEqual(-0.13, tmp_feature[2][119])
- self.assertAlmostEqual(4.0, tmp_feature[3][0])
- self.assertAlmostEqual(-0.24, tmp_feature[3][119])
-
- def test_perform(self):
- """test perform
- """
- feature = np.zeros((4, 40), dtype="float32")
- feature[0, 0:40].fill(1)
- feature[1, 0:40].fill(2)
- feature[2, 0:40].fill(3)
- feature[3, 0:40].fill(4)
- trans = trans_add_delta.TransAddDelta()
- (feature, label, name) = trans.perform_trans((feature, None, None))
- self.assertAlmostEqual(feature.shape[0], 4)
- self.assertAlmostEqual(feature.shape[1], 120)
- self.assertAlmostEqual(1.0, feature[0][0])
- self.assertAlmostEqual(0.24, feature[0][119])
- self.assertAlmostEqual(2.0, feature[1][0])
- self.assertAlmostEqual(0.13, feature[1][119])
- self.assertAlmostEqual(3.0, feature[2][0])
- self.assertAlmostEqual(-0.13, feature[2][119])
- self.assertAlmostEqual(4.0, feature[3][0])
- self.assertAlmostEqual(-0.24, feature[3][119])
-
-
-class TestTransSplice(unittest.TestCase):
- """unit test for TransSplice
- """
-
- def test_perform(self):
- feature = np.zeros((8, 10), dtype="float32")
- for i in xrange(feature.shape[0]):
- feature[i, :].fill(i)
-
- trans = trans_splice.TransSplice()
- (feature, label, name) = trans.perform_trans((feature, None, None))
- self.assertEqual(feature.shape[1], 110)
-
- for i in xrange(8):
- nzero_num = 5 - i
- cur_val = 0.0
- if nzero_num < 0:
- cur_val = i - 5 - 1
- for j in xrange(11):
- if j <= nzero_num:
- for k in xrange(10):
- self.assertAlmostEqual(feature[i][j * 10 + k], cur_val)
- else:
- if cur_val < 7:
- cur_val += 1.0
- for k in xrange(10):
- self.assertAlmostEqual(feature[i][j * 10 + k], cur_val)
-
-
-class TestTransDelay(unittest.TestCase):
- """unittest TransDelay
- """
-
- def test_perform(self):
- label = np.zeros((10, 1), dtype="int64")
- for i in xrange(10):
- label[i][0] = i
-
- trans = trans_delay.TransDelay(5)
- (_, label, _) = trans.perform_trans((None, label, None))
-
- for i in xrange(5):
- self.assertAlmostEqual(label[i + 5][0], i)
-
- for i in xrange(5):
- self.assertAlmostEqual(label[i][0], 0)
-
-
-if __name__ == '__main__':
- unittest.main()
diff --git a/PaddleSpeech/DeepASR/data_utils/augmentor/trans_add_delta.py b/PaddleSpeech/DeepASR/data_utils/augmentor/trans_add_delta.py
deleted file mode 100644
index aa8062f87c932b76dd8a79db825d07e8be273857..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/data_utils/augmentor/trans_add_delta.py
+++ /dev/null
@@ -1,104 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import numpy as np
-import math
-import copy
-
-
-class TransAddDelta(object):
- """ add delta of feature data
- trans feature for shape(a, b) to shape(a, b * 3)
-
- Attributes:
- _norder(int):
- _window(int):
- """
-
- def __init__(self, norder=2, nwindow=2):
- """ init construction
- Args:
- norder: default 2
- nwindow: default 2
- """
- self._norder = norder
- self._nwindow = nwindow
-
- def perform_trans(self, sample):
- """ add delta for feature
- trans feature shape from (a,b) to (a, b * 3)
-
- Args:
- sample(object,tuple): contain feature numpy and label numpy
- Returns:
- (feature, label, name)
- """
- (feature, label, name) = sample
- frame_dim = feature.shape[1]
- d_frame_dim = frame_dim * 3
- head_filled = 5
- tail_filled = 5
- mat = np.zeros(
- (feature.shape[0] + head_filled + tail_filled, d_frame_dim),
- dtype="float32")
- #copy first frame
- for i in xrange(head_filled):
- np.copyto(mat[i, 0:frame_dim], feature[0, :])
-
- np.copyto(mat[head_filled:head_filled + feature.shape[0], 0:frame_dim],
- feature[:, :])
-
- # copy last frame
- for i in xrange(head_filled + feature.shape[0], mat.shape[0], 1):
- np.copyto(mat[i, 0:frame_dim], feature[feature.shape[0] - 1, :])
-
- nframe = feature.shape[0]
- start = head_filled
- tmp_shape = mat.shape
- mat = mat.reshape((tmp_shape[0] * tmp_shape[1]))
- self._regress(mat, start * d_frame_dim, mat,
- start * d_frame_dim + frame_dim, frame_dim, nframe,
- d_frame_dim)
- self._regress(mat, start * d_frame_dim + frame_dim, mat,
- start * d_frame_dim + 2 * frame_dim, frame_dim, nframe,
- d_frame_dim)
- mat.shape = tmp_shape
- return (mat[head_filled:mat.shape[0] - tail_filled, :], label, name)
-
- def _regress(self, data_in, start_in, data_out, start_out, size, n, step):
- """ regress
- Args:
- data_in: in data
- start_in: start index of data_in
- data_out: out data
- start_out: start index of data_out
- size: the frame dimension
- n: the number of frames
- step: the stride between frames, i.e. 3 * (frame dimension)
- Returns:
- None
- """
- sigma_t2 = 0.0
- delta_window = self._nwindow
- for t in xrange(1, delta_window + 1):
- sigma_t2 += t * t
-
- sigma_t2 *= 2.0
- for i in xrange(n):
- fp1 = start_in
- fp2 = start_out
- for j in xrange(size):
- back = fp1
- forw = fp1
- sum = 0.0
- for t in xrange(1, delta_window + 1):
- back -= step
- forw += step
- sum += t * (data_in[forw] - data_in[back])
-
- data_out[fp2] = sum / sigma_t2
- fp1 += 1
- fp2 += 1
- start_in += step
- start_out += step
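For reference, `_regress` implements the standard regression-based delta formula d_t = sum_{k=1..K} k * (x_{t+k} - x_{t-k}) / (2 * sum_{k=1..K} k^2) over flattened buffers. A vectorized NumPy sketch of the same computation (the helper name `compute_delta` is illustrative, not part of the module):

```python
# Vectorized sketch of the delta computation performed by _regress above,
# assuming HTK-style deltas with window K (the class default is K = 2).
import numpy as np


def compute_delta(feat, K=2):
    """feat: (num_frames, dim) array; returns the delta features."""
    num_frames = feat.shape[0]
    # pad by repeating the first/last frame, as perform_trans does
    padded = np.concatenate(
        [np.repeat(feat[:1], K, axis=0), feat,
         np.repeat(feat[-1:], K, axis=0)], axis=0)
    denom = 2.0 * sum(k * k for k in range(1, K + 1))
    delta = np.zeros_like(feat)
    for k in range(1, K + 1):
        delta += k * (padded[K + k:K + k + num_frames] -
                      padded[K - k:K - k + num_frames])
    return delta / denom


feat = np.random.rand(14, 40).astype("float32")
delta = compute_delta(feat)                          # first-order deltas
accel = compute_delta(delta)                         # second-order deltas
full = np.concatenate([feat, delta, accel], axis=1)  # shape (14, 120)
```

Applying the helper twice yields the delta-deltas, which is what `perform_trans` does by chaining two `_regress` passes over adjacent thirds of the widened matrix.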
diff --git a/PaddleSpeech/DeepASR/data_utils/augmentor/trans_delay.py b/PaddleSpeech/DeepASR/data_utils/augmentor/trans_delay.py
deleted file mode 100644
index b782498edfd5443806a6c80e3b4fe91b8e2b1cc9..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/data_utils/augmentor/trans_delay.py
+++ /dev/null
@@ -1,37 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import numpy as np
-import math
-
-
-class TransDelay(object):
- """ Delay label, and copy first label value in the front.
- Attributes:
- _delay_time : the delay frame num of label
- """
-
- def __init__(self, delay_time):
- """init construction
- Args:
- delay_time : the delay frame num of label
- """
- self._delay_time = delay_time
-
- def perform_trans(self, sample):
- """
- Args:
- sample(tuple): input sample, containing feature ndarray, label ndarray and name list
- Returns:
- (feature, label, name)
- """
- (feature, label, name) = sample
-
- shape = label.shape
- assert len(shape) == 2
- label[self._delay_time:shape[0]] = label[0:shape[0] - self._delay_time]
- for i in xrange(self._delay_time):
- label[i][0] = label[self._delay_time][0]
-
- return (feature, label, name)
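A tiny demonstration of the in-place shift performed above, mirroring the unit test in tests/test_data_trans.py:

```python
# TransDelay's effect: labels are shifted right by `delay` frames and the
# leading frames repeat the first label.
import numpy as np

label = np.arange(10, dtype="int64").reshape((10, 1))
delay = 5
label[delay:] = label[:10 - delay]
label[:delay] = label[delay]
print(label.ravel())   # -> [0 0 0 0 0 0 1 2 3 4]
```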
diff --git a/PaddleSpeech/DeepASR/data_utils/augmentor/trans_mean_variance_norm.py b/PaddleSpeech/DeepASR/data_utils/augmentor/trans_mean_variance_norm.py
deleted file mode 100644
index 9f91b726ea2bcd432340cd06a3cb9006cd5f83f4..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/data_utils/augmentor/trans_mean_variance_norm.py
+++ /dev/null
@@ -1,71 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import numpy as np
-import math
-
-
-class TransMeanVarianceNorm(object):
- """ normalization of mean variance for feature data
- Attributes:
- _mean(numpy.array): the feature mean vector
- _var(numpy.array): the feature variance
- """
-
- def __init__(self, snorm_path):
- """init construction
- Args:
- snorm_path: the path of mean and variance
- """
- self._mean = None
- self._var = None
- self._load_norm(snorm_path)
-
- def _load_norm(self, snorm_path):
- """ load mean var file
- Args:
- snorm_path(str):the file path
- """
- lLines = open(snorm_path).readlines()
- nLen = len(lLines)
- self._mean = np.zeros((nLen), dtype="float32")
- self._var = np.zeros((nLen), dtype="float32")
- self._nLen = nLen
- for nidx, l in enumerate(lLines):
- s = l.split()
- assert len(s) == 2
- self._mean[nidx] = float(s[0])
- self._var[nidx] = 1.0 / math.sqrt(float(s[1]))
- if self._var[nidx] > 100000.0:
- self._var[nidx] = 100000.0
-
- def get_mean_var(self):
- """ get mean and var
- Args:
- Returns:
- (mean, var)
- """
- return (self._mean, self._var)
-
- def perform_trans(self, sample):
- """ feature = (feature - mean) * var
- Args:
- sample(tuple): input sample, containing feature ndarray and label ndarray
- Returns:
- (feature, label, name)
- """
- (feature, label, name) = sample
- shape = feature.shape
- assert len(shape) == 2
- nfeature_len = shape[0] * shape[1]
- assert nfeature_len % self._nLen == 0
- ncur_idx = 0
- feature = feature.reshape((nfeature_len))
- while ncur_idx < nfeature_len:
- block = feature[ncur_idx:ncur_idx + self._nLen]
- block = (block - self._mean) * self._var
- feature[ncur_idx:ncur_idx + self._nLen] = block
- ncur_idx += self._nLen
- feature = feature.reshape(shape)
- return (feature, label, name)
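Because `_load_norm` stores the inverse standard deviation rather than the raw variance, normalization reduces to a subtract and a multiply; the block-wise loop above just applies the 120-dimensional statistics across the flattened matrix when the feature width is a multiple of the statistics length. For a plain (frames, dim) matrix the same transform is a one-line broadcast (a sketch, with stand-in statistics):

```python
# Equivalent vectorized form of perform_trans above; `mean` and `inv_std`
# stand in for the vectors loaded by _load_norm.
import numpy as np


def normalize(feature, mean, inv_std):
    # broadcasting applies the per-dimension statistics to every frame
    return (feature - mean) * inv_std


feature = np.random.rand(8, 120).astype("float32")
mean = np.zeros(120, dtype="float32")      # stand-ins for the loaded stats
inv_std = np.ones(120, dtype="float32")
normalized = normalize(feature, mean, inv_std)
```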
diff --git a/PaddleSpeech/DeepASR/data_utils/augmentor/trans_splice.py b/PaddleSpeech/DeepASR/data_utils/augmentor/trans_splice.py
deleted file mode 100644
index 1fab3d6b442c1613f18d16fd0b0ee89464dbeb2c..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/data_utils/augmentor/trans_splice.py
+++ /dev/null
@@ -1,64 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import numpy as np
-import math
-
-
-class TransSplice(object):
- """ copy feature context to construct new feature
- expand feature data from shape (frame_num, frame_dim)
- to shape (frame_num, frame_dim * 11)
-
- Attributes:
- _nleft_context(int): the number of left-context frames to copy
- _nright_context(int): the number of right-context frames to copy
- """
-
- def __init__(self, nleft_context=5, nright_context=5):
- """ init construction
- Args:
- nleft_context(int):
- nright_context(int):
- """
- self._nleft_context = nleft_context
- self._nright_context = nright_context
-
- def perform_trans(self, sample):
- """ copy feature context
- Args:
- sample(tuple): input sample, containing (feature, label, name)
- Returns:
- (feature, label, name)
- """
- (feature, label, name) = sample
- nframe_num = feature.shape[0]
- nframe_dim = feature.shape[1]
- nnew_frame_dim = nframe_dim * (
- self._nleft_context + self._nright_context + 1)
- mat = np.zeros(
- (nframe_num + self._nleft_context + self._nright_context,
- nframe_dim),
- dtype="float32")
- ret = np.zeros((nframe_num, nnew_frame_dim), dtype="float32")
-
- #copy left
- for i in xrange(self._nleft_context):
- mat[i, :] = feature[0, :]
-
- #copy middle
- mat[self._nleft_context:self._nleft_context +
- nframe_num, :] = feature[:, :]
-
- #copy right
- for i in xrange(self._nright_context):
- mat[i + self._nleft_context + nframe_num, :] = feature[-1, :]
-
- mat = mat.reshape(mat.shape[0] * mat.shape[1])
- ret = ret.reshape(ret.shape[0] * ret.shape[1])
- for i in xrange(nframe_num):
- np.copyto(ret[i * nnew_frame_dim:(i + 1) * nnew_frame_dim],
- mat[i * nframe_dim:i * nframe_dim + nnew_frame_dim])
- ret = ret.reshape((nframe_num, nnew_frame_dim))
- return (ret, label, name)
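The splice concatenates each frame with its left and right context, repeating the edge frames as padding. An equivalent vectorized sketch (the helper `splice` is illustrative, not the original implementation):

```python
# Vectorized frame splicing: each output frame is the concatenation of
# `left` context frames, itself, and `right` context frames.
import numpy as np


def splice(feature, left=5, right=5):
    num_frames, dim = feature.shape
    padded = np.concatenate(
        [np.repeat(feature[:1], left, axis=0), feature,
         np.repeat(feature[-1:], right, axis=0)], axis=0)
    windows = [padded[i:i + num_frames] for i in range(left + right + 1)]
    return np.concatenate(windows, axis=1)  # (num_frames, dim * (l + r + 1))


feature = np.arange(80, dtype="float32").reshape(8, 10)
spliced = splice(feature)
assert spliced.shape == (8, 110)
```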
diff --git a/PaddleSpeech/DeepASR/data_utils/util.py b/PaddleSpeech/DeepASR/data_utils/util.py
deleted file mode 100644
index 4a5a8a3f1dad1c46ed773fd48d713e276717d5e5..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/data_utils/util.py
+++ /dev/null
@@ -1,71 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-import sys
-from six import reraise
-from tblib import Traceback
-
-import numpy as np
-import paddle.fluid as fluid
-
-
-def to_lodtensor(data, place):
- """convert tensor to lodtensor
- """
- seq_lens = [len(seq) for seq in data]
- cur_len = 0
- lod = [cur_len]
- for l in seq_lens:
- cur_len += l
- lod.append(cur_len)
- flattened_data = np.concatenate(data, axis=0).astype("int64")
- flattened_data = flattened_data.reshape([len(flattened_data), 1])
- res = fluid.LoDTensor()
- res.set(flattened_data, place)
- res.set_lod([lod])
- return res
-
-
-def split_infer_result(infer_seq, lod):
- infer_batch = []
- for i in xrange(0, len(lod[0]) - 1):
- infer_batch.append(infer_seq[lod[0][i]:lod[0][i + 1]])
- return infer_batch
-
-
-class CriticalException(Exception):
- pass
-
-
-def suppress_signal(signo, stack_frame):
- pass
-
-
-def suppress_complaints(verbose, notify=None):
- def decorator_maker(func):
- def suppress_wrapper(*args, **kwargs):
- try:
- func(*args, **kwargs)
- except:
- et, ev, tb = sys.exc_info()
-
- if notify is not None:
- notify(except_type=et, except_value=ev, traceback=tb)
-
- if verbose == 1 or isinstance(ev, CriticalException):
- reraise(et, ev, Traceback(tb).as_traceback())
-
- return suppress_wrapper
-
- return decorator_maker
-
-
-class ForceExitWrapper(object):
- def __init__(self, exit_flag):
- self._exit_flag = exit_flag
-
- @suppress_complaints(verbose=0)
- def __call__(self, *args, **kwargs):
- self._exit_flag.value = True
-
- def __eq__(self, flag):
- return self._exit_flag.value == flag
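A small usage sketch of the `suppress_complaints` decorator defined above: exceptions in the wrapped function are swallowed (optionally reported through `notify`), except that `verbose == 1` or a `CriticalException` re-raises with the original traceback preserved through tblib. The worker names here are hypothetical:

```python
# Hypothetical workers illustrating suppress_complaints; not from the repo.
@suppress_complaints(verbose=0)
def flaky_worker():
    raise ValueError("transient failure")   # swallowed silently


@suppress_complaints(verbose=1)
def strict_worker():
    raise ValueError("boom")                # re-raised with its traceback


flaky_worker()        # returns normally despite the exception
# strict_worker()     # would propagate the ValueError to the caller
```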
diff --git a/PaddleSpeech/DeepASR/decoder/.gitignore b/PaddleSpeech/DeepASR/decoder/.gitignore
deleted file mode 100644
index ef5c97cfb5c06f3308980ca65c87e9c4b9440171..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/decoder/.gitignore
+++ /dev/null
@@ -1,4 +0,0 @@
-ThreadPool
-build
-post_latgen_faster_mapped.so
-pybind11
diff --git a/PaddleSpeech/DeepASR/decoder/post_latgen_faster_mapped.cc b/PaddleSpeech/DeepASR/decoder/post_latgen_faster_mapped.cc
deleted file mode 100644
index ad8aaa84803d61bbce3d76757954e47f8585ed8b..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/decoder/post_latgen_faster_mapped.cc
+++ /dev/null
@@ -1,305 +0,0 @@
-/* Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
-
-Licensed under the Apache License, Version 2.0 (the "License");
-you may not use this file except in compliance with the License.
-You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License. */
-
-#include "post_latgen_faster_mapped.h"
-#include <fstream>
-#include "ThreadPool.h"
-
-using namespace kaldi;
-typedef kaldi::int32 int32;
-using fst::SymbolTable;
-using fst::Fst;
-using fst::StdArc;
-
-Decoder::Decoder(std::string trans_model_in_filename,
- std::string word_syms_filename,
- std::string fst_in_filename,
- std::string logprior_in_filename,
- size_t beam_size,
- kaldi::BaseFloat acoustic_scale) {
- const char *usage =
- "Generate lattices using neural net model.\n"
- "Usage: post-latgen-faster-mapped [options] "
- " "
- " [ [] "
- "]\n";
- ParseOptions po(usage);
- allow_partial = false;
- this->acoustic_scale = acoustic_scale;
-
- config.Register(&po);
- int32 beam = 11;
- po.Register("acoustic-scale",
- &acoustic_scale,
- "Scaling factor for acoustic likelihoods");
- po.Register("word-symbol-table",
- &word_syms_filename,
- "Symbol table for words [for debug output]");
- po.Register("allow-partial",
- &allow_partial,
- "If true, produce output even if end state was not reached.");
-
- int argc = 2;
- std::string beam_opt = "--beam=" + std::to_string(beam_size);
- char *argv[] = {(char *)"post-latgen-faster-mapped",
- (char *)beam_opt.c_str()};
-
- po.Read(argc, argv);
-
- std::ifstream is_logprior(logprior_in_filename);
- logprior.Read(is_logprior, false);
-
- {
- bool binary;
- Input ki(trans_model_in_filename, &binary);
- this->trans_model.Read(ki.Stream(), binary);
- }
-
- this->determinize = config.determinize_lattice;
-
- this->word_syms = NULL;
- if (word_syms_filename != "") {
- if (!(word_syms = fst::SymbolTable::ReadText(word_syms_filename))) {
- KALDI_ERR << "Could not read symbol table from file "
- << word_syms_filename;
- }
- }
-
- // Input FST is just one FST, not a table of FSTs.
- this->decode_fst = fst::ReadFstKaldiGeneric(fst_in_filename);
-
- kaldi::LatticeFasterDecoder *decoder =
- new LatticeFasterDecoder(*decode_fst, config);
- decoder_pool.emplace_back(decoder);
-
- std::string lattice_wspecifier =
- "ark:|gzip -c > mapped_decoder_data/lat.JOB.gz";
- if (!(determinize ? compact_lattice_writer.Open(lattice_wspecifier)
- : lattice_writer.Open(lattice_wspecifier)))
- KALDI_ERR << "Could not open table for writing lattices: "
- << lattice_wspecifier;
-
- words_writer = new Int32VectorWriter("");
- alignment_writer = new Int32VectorWriter("");
-}
-
-Decoder::~Decoder() {
- if (this->word_syms) delete this->word_syms;
- delete this->decode_fst;
- for (size_t i = 0; i < decoder_pool.size(); ++i) {
- delete decoder_pool[i];
- }
- delete words_writer;
- delete alignment_writer;
-}
-
-
-void Decoder::decode_from_file(std::string posterior_rspecifier,
- size_t num_processes) {
- try {
- double tot_like = 0.0;
- kaldi::int64 frame_count = 0;
- // int num_success = 0, num_fail = 0;
-
- KALDI_ASSERT(ClassifyRspecifier(fst_in_filename, NULL, NULL) ==
- kNoRspecifier);
- SequentialBaseFloatMatrixReader posterior_reader("ark:" +
- posterior_rspecifier);
-
- Timer timer;
- timer.Reset();
- double elapsed = 0.0;
-
- for (size_t n = decoder_pool.size(); n < num_processes; ++n) {
- kaldi::LatticeFasterDecoder *decoder =
- new LatticeFasterDecoder(*decode_fst, config);
- decoder_pool.emplace_back(decoder);
- }
- elapsed = timer.Elapsed();
- ThreadPool thread_pool(num_processes);
-
- while (!posterior_reader.Done()) {
- timer.Reset();
- std::vector<std::future<std::string>> que;
- for (size_t i = 0; i < num_processes && !posterior_reader.Done(); ++i) {
- std::string utt = posterior_reader.Key();
- Matrix<BaseFloat> &loglikes(posterior_reader.Value());
- que.emplace_back(thread_pool.enqueue(std::bind(
- &Decoder::decode_internal, this, decoder_pool[i], utt, loglikes)));
- posterior_reader.Next();
- }
- timer.Reset();
- for (size_t i = 0; i < que.size(); ++i) {
- std::cout << que[i].get() << std::endl;
- }
- }
-
- } catch (const std::exception &e) {
- std::cerr << e.what();
- }
-}
-
-inline kaldi::Matrix<kaldi::BaseFloat> vector2kaldi_mat(
- const std::vector<std::vector<kaldi::BaseFloat>> &log_probs) {
- size_t num_frames = log_probs.size();
- size_t dim_label = log_probs[0].size();
- kaldi::Matrix<kaldi::BaseFloat> loglikes(
- num_frames, dim_label, kaldi::kSetZero, kaldi::kStrideEqualNumCols);
- for (size_t i = 0; i < num_frames; ++i) {
- memcpy(loglikes.Data() + i * dim_label,
- log_probs[i].data(),
- sizeof(kaldi::BaseFloat) * dim_label);
- }
- return loglikes;
-}
-
-std::vector<std::string> Decoder::decode_batch(
- std::vector<std::string> keys,
- const std::vector<std::vector<std::vector<kaldi::BaseFloat>>>
- &log_probs_batch,
- size_t num_processes) {
- ThreadPool thread_pool(num_processes);
- std::vector decoding_results; //(keys.size(), "");
-
- for (size_t n = decoder_pool.size(); n < num_processes; ++n) {
- kaldi::LatticeFasterDecoder *decoder =
- new LatticeFasterDecoder(*decode_fst, config);
- decoder_pool.emplace_back(decoder);
- }
-
- size_t index = 0;
- while (index < keys.size()) {
- std::vector<std::future<std::string>> res_in_que;
- for (size_t t = 0; t < num_processes && index < keys.size(); ++t) {
- kaldi::Matrix<kaldi::BaseFloat> loglikes =
- vector2kaldi_mat(log_probs_batch[index]);
- res_in_que.emplace_back(
- thread_pool.enqueue(std::bind(&Decoder::decode_internal,
- this,
- decoder_pool[t],
- keys[index],
- loglikes)));
- index++;
- }
- for (size_t i = 0; i < res_in_que.size(); ++i) {
- decoding_results.emplace_back(res_in_que[i].get());
- }
- }
- return decoding_results;
-}
-
-std::string Decoder::decode(
- std::string key,
- const std::vector<std::vector<kaldi::BaseFloat>> &log_probs) {
- kaldi::Matrix loglikes = vector2kaldi_mat(log_probs);
- return decode_internal(decoder_pool[0], key, loglikes);
-}
-
-
-std::string Decoder::decode_internal(
- LatticeFasterDecoder *decoder,
- std::string key,
- kaldi::Matrix<kaldi::BaseFloat> &loglikes) {
- if (loglikes.NumRows() == 0) {
- KALDI_WARN << "Zero-length utterance: " << key;
- // num_fail++;
- }
- KALDI_ASSERT(loglikes.NumCols() == logprior.Dim());
-
- loglikes.ApplyLog();
- loglikes.AddVecToRows(-1.0, logprior);
-
- DecodableMatrixScaledMapped matrix_decodable(
- trans_model, loglikes, acoustic_scale);
- double like;
- return this->DecodeUtteranceLatticeFaster(
- decoder, matrix_decodable, key, &like);
-}
-
-
-std::string Decoder::DecodeUtteranceLatticeFaster(
- LatticeFasterDecoder *decoder,
- DecodableInterface &decodable, // not const but is really an input.
- std::string utt,
- double *like_ptr) { // puts utterance's like in like_ptr on success.
- using fst::VectorFst;
- std::string ret = utt + ' ';
-
- if (!decoder->Decode(&decodable)) {
- KALDI_WARN << "Failed to decode file " << utt;
- return ret;
- }
- if (!decoder->ReachedFinal()) {
- if (allow_partial) {
- KALDI_WARN << "Outputting partial output for utterance " << utt
- << " since no final-state reached\n";
- } else {
- KALDI_WARN << "Not producing output for utterance " << utt
- << " since no final-state reached and "
- << "--allow-partial=false.\n";
- return ret;
- }
- }
-
- double likelihood;
- LatticeWeight weight;
- int32 num_frames;
- { // First do some stuff with word-level traceback...
- VectorFst<LatticeArc> decoded;
- if (!decoder->GetBestPath(&decoded))
- // Shouldn't really reach this point as already checked success.
- KALDI_ERR << "Failed to get traceback for utterance " << utt;
-
- std::vector<int32> alignment;
- std::vector<int32> words;
- GetLinearSymbolSequence(decoded, &alignment, &words, &weight);
- num_frames = alignment.size();
- // if (alignment_writer->IsOpen()) alignment_writer->Write(utt, alignment);
- if (word_syms != NULL) {
- for (size_t i = 0; i < words.size(); i++) {
- std::string s = word_syms->Find(words[i]);
- ret += s + ' ';
- }
- }
- likelihood = -(weight.Value1() + weight.Value2());
- }
-
- // Get lattice, and do determinization if requested.
- Lattice lat;
- decoder->GetRawLattice(&lat);
- if (lat.NumStates() == 0)
- KALDI_ERR << "Unexpected problem getting lattice for utterance " << utt;
- fst::Connect(&lat);
- if (determinize) {
- CompactLattice clat;
- if (!DeterminizeLatticePhonePrunedWrapper(
- trans_model,
- &lat,
- decoder->GetOptions().lattice_beam,
- &clat,
- decoder->GetOptions().det_opts))
- KALDI_WARN << "Determinization finished earlier than the beam for "
- << "utterance " << utt;
- // We'll write the lattice without acoustic scaling.
- if (acoustic_scale != 0.0)
- fst::ScaleLattice(fst::AcousticLatticeScale(1.0 / acoustic_scale), &clat);
- // disable output lattice temporarily
- // compact_lattice_writer.Write(utt, clat);
- } else {
- // We'll write the lattice without acoustic scaling.
- if (acoustic_scale != 0.0)
- fst::ScaleLattice(fst::AcousticLatticeScale(1.0 / acoustic_scale), &lat);
- // lattice_writer.Write(utt, lat);
- }
- return ret;
-}
diff --git a/PaddleSpeech/DeepASR/decoder/post_latgen_faster_mapped.h b/PaddleSpeech/DeepASR/decoder/post_latgen_faster_mapped.h
deleted file mode 100644
index 9c234b8681690b9f1e3d30b61ac3b97b7055887f..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/decoder/post_latgen_faster_mapped.h
+++ /dev/null
@@ -1,80 +0,0 @@
-/* Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
-
-Licensed under the Apache License, Version 2.0 (the "License");
-you may not use this file except in compliance with the License.
-You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License. */
-
-#include <string>
-#include <vector>
-#include "base/kaldi-common.h"
-#include "base/timer.h"
-#include "decoder/decodable-matrix.h"
-#include "decoder/decoder-wrappers.h"
-#include "fstext/kaldi-fst-io.h"
-#include "hmm/transition-model.h"
-#include "tree/context-dep.h"
-#include "util/common-utils.h"
-
-class Decoder {
-public:
- Decoder(std::string trans_model_in_filename,
- std::string word_syms_filename,
- std::string fst_in_filename,
- std::string logprior_in_filename,
- size_t beam_size,
- kaldi::BaseFloat acoustic_scale);
- ~Decoder();
-
- // Interface to accept the scores read from specifier and print
- // the decoding results directly
- void decode_from_file(std::string posterior_rspecifier,
- size_t num_processes = 1);
-
- // Accept the scores of one utterance and return the decoding result
- std::string decode(
- std::string key,
- const std::vector<std::vector<kaldi::BaseFloat>> &log_probs);
-
- // Accept the scores of utterances in batch and return the decoding results
- std::vector<std::string> decode_batch(
- std::vector<std::string> keys,
- const std::vector<std::vector<std::vector<kaldi::BaseFloat>>>
- &log_probs_batch,
- size_t num_processes = 1);
-
-private:
- // For decoding one utterance
- std::string decode_internal(kaldi::LatticeFasterDecoder *decoder,
- std::string key,
- kaldi::Matrix<kaldi::BaseFloat> &loglikes);
-
- std::string DecodeUtteranceLatticeFaster(kaldi::LatticeFasterDecoder *decoder,
- kaldi::DecodableInterface &decodable,
- std::string utt,
- double *like_ptr);
-
- fst::SymbolTable *word_syms;
- fst::Fst<fst::StdArc> *decode_fst;
- std::vector<kaldi::LatticeFasterDecoder *> decoder_pool;
- kaldi::Vector<kaldi::BaseFloat> logprior;
- kaldi::TransitionModel trans_model;
- kaldi::LatticeFasterDecoderConfig config;
-
- kaldi::CompactLatticeWriter compact_lattice_writer;
- kaldi::LatticeWriter lattice_writer;
- kaldi::Int32VectorWriter *words_writer;
- kaldi::Int32VectorWriter *alignment_writer;
-
- bool binary;
- bool determinize;
- kaldi::BaseFloat acoustic_scale;
- bool allow_partial;
-};
diff --git a/PaddleSpeech/DeepASR/decoder/pybind.cc b/PaddleSpeech/DeepASR/decoder/pybind.cc
deleted file mode 100644
index 4a9b27d4cf862e5c1492875512fdeba3e95ecb15..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/decoder/pybind.cc
+++ /dev/null
@@ -1,51 +0,0 @@
-/* Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
-
-Licensed under the Apache License, Version 2.0 (the "License");
-you may not use this file except in compliance with the License.
-You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License. */
-
-#include <pybind11/pybind11.h>
-#include <pybind11/stl.h>
-
-#include "post_latgen_faster_mapped.h"
-
-namespace py = pybind11;
-
-PYBIND11_MODULE(post_latgen_faster_mapped, m) {
- m.doc() = "Decoder for Deep ASR model";
-
- py::class_(m, "Decoder")
- .def(py::init())
- .def("decode_from_file",
- (void (Decoder::*)(std::string, size_t)) & Decoder::decode_from_file,
- "Decode for the probability matrices in specifier "
- "and print the transcriptions.")
- .def(
- "decode",
- (std::string (Decoder::*)(
- std::string, const std::vector<std::vector<kaldi::BaseFloat>>&)) &
- Decoder::decode,
- "Decode one input probability matrix "
- "and return the transcription.")
- .def("decode_batch",
- (std::vector<std::string> (Decoder::*)(
- std::vector<std::string>,
- const std::vector<std::vector<std::vector<kaldi::BaseFloat>>>&,
- size_t num_processes)) &
- Decoder::decode_batch,
- "Decode one batch of probability matrices "
- "and return the transcriptions.");
-}
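Once built (see decoder/setup.py and setup.sh below), the extension is driven from Python roughly as follows. A sketch only: the argument values are lifted from examples/aishell/infer_by_ckpt.sh elsewhere in this diff, and the call pattern mirrors infer_by_ckpt.py:

```python
# Sketch of Python-side usage of the compiled extension; the aishell
# example's paths serve as placeholders.
from post_latgen_faster_mapped import Decoder

decoder = Decoder(
    "aux/final.mdl",         # transition model
    "aux/graph/words.txt",   # word symbol table
    "aux/graph/HCLG.fst",    # decoding graph (HCLG FST)
    "aux/logprior",          # log prior probabilities
    11,                      # beam size
    0.059)                   # acoustic scale

# keys: utterance ids; probs: list of (num_frames x num_classes) matrices
# transcripts = decoder.decode_batch(keys, probs, num_processes=24)
```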
diff --git a/PaddleSpeech/DeepASR/decoder/setup.py b/PaddleSpeech/DeepASR/decoder/setup.py
deleted file mode 100644
index 81fc857cce5b57af5bce7b34a1f4243fb853c0b6..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/decoder/setup.py
+++ /dev/null
@@ -1,71 +0,0 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import os
-import glob
-from distutils.core import setup, Extension
-from distutils.sysconfig import get_config_vars
-
-try:
- kaldi_root = os.environ['KALDI_ROOT']
-except KeyError:
- raise ValueError("Environment variable 'KALDI_ROOT' is not defined. Please "
- "install kaldi and export KALDI_ROOT=<kaldi-root-dir>.")
-
-args = [
- '-std=c++11', '-fopenmp', '-Wno-sign-compare', '-Wno-unused-variable',
- '-Wno-unused-local-typedefs', '-Wno-unused-but-set-variable',
- '-Wno-deprecated-declarations', '-Wno-unused-function'
-]
-
-# remove warning about -Wstrict-prototypes
-(opt, ) = get_config_vars('OPT')
-os.environ['OPT'] = " ".join(flag for flag in opt.split()
- if flag != '-Wstrict-prototypes')
-os.environ['CC'] = 'g++'
-
-LIBS = [
- 'fst', 'kaldi-base', 'kaldi-util', 'kaldi-matrix', 'kaldi-tree',
- 'kaldi-hmm', 'kaldi-fstext', 'kaldi-decoder', 'kaldi-lat'
-]
-
-LIB_DIRS = [
- 'tools/openfst/lib', 'src/base', 'src/matrix', 'src/util', 'src/tree',
- 'src/hmm', 'src/fstext', 'src/decoder', 'src/lat'
-]
-LIB_DIRS = [os.path.join(kaldi_root, path) for path in LIB_DIRS]
-LIB_DIRS = [os.path.abspath(path) for path in LIB_DIRS]
-
-ext_modules = [
- Extension(
- 'post_latgen_faster_mapped',
- ['pybind.cc', 'post_latgen_faster_mapped.cc'],
- include_dirs=[
- 'pybind11/include', '.', os.path.join(kaldi_root, 'src'),
- os.path.join(kaldi_root, 'tools/openfst/src/include'), 'ThreadPool'
- ],
- language='c++',
- libraries=LIBS,
- library_dirs=LIB_DIRS,
- runtime_library_dirs=LIB_DIRS,
- extra_compile_args=args, ),
-]
-
-setup(
- name='post_latgen_faster_mapped',
- version='0.1.0',
- author='Paddle',
- author_email='',
- description='Decoder for Deep ASR model',
- ext_modules=ext_modules, )
diff --git a/PaddleSpeech/DeepASR/decoder/setup.sh b/PaddleSpeech/DeepASR/decoder/setup.sh
deleted file mode 100644
index 238cc64986900bae6fa0bb403d8134981212b8ea..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/decoder/setup.sh
+++ /dev/null
@@ -1,12 +0,0 @@
-set -e
-
-if [ ! -d pybind11 ]; then
- git clone https://github.com/pybind/pybind11.git
-fi
-
-if [ ! -d ThreadPool ]; then
- git clone https://github.com/progschj/ThreadPool.git
- echo -e "\n"
-fi
-
-python setup.py build_ext -i
diff --git a/PaddleSpeech/DeepASR/examples/aishell/.gitignore b/PaddleSpeech/DeepASR/examples/aishell/.gitignore
deleted file mode 100644
index c173dd880ae9e06c16989800e06d4d3d7a1a7d5f..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/examples/aishell/.gitignore
+++ /dev/null
@@ -1,4 +0,0 @@
-aux.tar.gz
-aux
-data
-checkpoints
diff --git a/PaddleSpeech/DeepASR/examples/aishell/download_pretrained_model.sh b/PaddleSpeech/DeepASR/examples/aishell/download_pretrained_model.sh
deleted file mode 100644
index a8813e241c4f6e40392dff6f173160d2bbd77175..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/examples/aishell/download_pretrained_model.sh
+++ /dev/null
@@ -1,15 +0,0 @@
-url=http://deep-asr-data.gz.bcebos.com/aishell_pretrained_model.tar.gz
-md5=7b51bde64e884f43901b7a3461ccbfa3
-
-wget -c $url
-
-echo "Checking md5 sum ..."
-md5sum_tmp=`md5sum aishell_pretrained_model.tar.gz | cut -d ' ' -f1`
-
-if [ "$md5sum_tmp" != "$md5" ]; then
- echo "Md5sum check failed, please remove and redownload " \
- "aishell_pretrained_model.tar.gz."
- exit 1
-fi
-
-tar xvf aishell_pretrained_model.tar.gz
diff --git a/PaddleSpeech/DeepASR/examples/aishell/infer_by_ckpt.sh b/PaddleSpeech/DeepASR/examples/aishell/infer_by_ckpt.sh
deleted file mode 100644
index 2d31757451849afc1412421376484d2ad41962bc..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/examples/aishell/infer_by_ckpt.sh
+++ /dev/null
@@ -1,18 +0,0 @@
-decode_to_path=./decoding_result.txt
-
-export CUDA_VISIBLE_DEVICES=0,1,2,3
-python -u ../../infer_by_ckpt.py --batch_size 96 \
- --checkpoint checkpoints/deep_asr.latest.checkpoint \
- --infer_feature_lst data/test_feature.lst \
- --mean_var data/global_mean_var \
- --frame_dim 80 \
- --class_num 3040 \
- --num_threads 24 \
- --beam_size 11 \
- --decode_to_path $decode_to_path \
- --trans_model aux/final.mdl \
- --log_prior aux/logprior \
- --vocabulary aux/graph/words.txt \
- --graphs aux/graph/HCLG.fst \
- --acoustic_scale 0.059 \
- --parallel
diff --git a/PaddleSpeech/DeepASR/examples/aishell/prepare_data.sh b/PaddleSpeech/DeepASR/examples/aishell/prepare_data.sh
deleted file mode 100644
index 8bb7ac5cccb2ba72fd6351fc1e6755f5135740d8..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/examples/aishell/prepare_data.sh
+++ /dev/null
@@ -1,43 +0,0 @@
-data_dir=~/.cache/paddle/dataset/speech/deep_asr_data/aishell
-data_url='http://deep-asr-data.gz.bcebos.com/aishell_data.tar.gz'
-lst_url='http://deep-asr-data.gz.bcebos.com/aishell_lst.tar.gz'
-aux_url='http://deep-asr-data.gz.bcebos.com/aux.tar.gz'
-md5=17669b8d63331c9326f4a9393d289bfb
-aux_md5=50e3125eba1e3a2768a6f2e499cc1749
-
-if [ ! -e $data_dir ]; then
- mkdir -p $data_dir
-fi
-
-if [ ! -e $data_dir/aishell_data.tar.gz ]; then
- echo "Download $data_dir/aishell_data.tar.gz ..."
- wget -c -P $data_dir $data_url
-else
- echo "Skip downloading for $data_dir/aishell_data.tar.gz has already existed!"
-fi
-
-echo "Checking md5 sum ..."
-md5sum_tmp=`md5sum $data_dir/aishell_data.tar.gz | cut -d ' ' -f1`
-
-if [ "$md5sum_tmp" != "$md5" ]; then
- echo "Md5sum check failed, please remove and redownload " \
- "$data_dir/aishell_data.tar.gz"
- exit 1
-fi
-
-echo "Untar aishell_data.tar.gz ..."
-tar xzf $data_dir/aishell_data.tar.gz -C $data_dir
-
-if [ ! -e data ]; then
- mkdir data
-fi
-
-echo "Download and untar lst files ..."
-wget -c -P data $lst_url
-tar xvf data/aishell_lst.tar.gz -C data
-
-ln -s $data_dir data/aishell
-
-echo "Download and untar aux files ..."
-wget -c $aux_url
-tar xvf aux.tar.gz
diff --git a/PaddleSpeech/DeepASR/examples/aishell/profile.sh b/PaddleSpeech/DeepASR/examples/aishell/profile.sh
deleted file mode 100644
index e7df868b9ea26db3d91be0c01d0b7ecb63c374de..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/examples/aishell/profile.sh
+++ /dev/null
@@ -1,7 +0,0 @@
-export CUDA_VISIBLE_DEVICES=0
-python -u ../../tools/profile.py --feature_lst data/train_feature.lst \
- --label_lst data/train_label.lst \
- --mean_var data/global_mean_var \
- --frame_dim 80 \
- --class_num 3040 \
- --batch_size 16
diff --git a/PaddleSpeech/DeepASR/examples/aishell/score_cer.sh b/PaddleSpeech/DeepASR/examples/aishell/score_cer.sh
deleted file mode 100644
index 70dfcbad4a8427adcc1149fbab02ec674dacde0c..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/examples/aishell/score_cer.sh
+++ /dev/null
@@ -1,4 +0,0 @@
-ref_txt=aux/test.ref.txt
-hyp_txt=decoding_result.txt
-
-python ../../score_error_rate.py --error_rate_type cer --ref $ref_txt --hyp $hyp_txt
diff --git a/PaddleSpeech/DeepASR/examples/aishell/train.sh b/PaddleSpeech/DeepASR/examples/aishell/train.sh
deleted file mode 100644
index 168581c0ee579ef62f138bb0d8f5bb8886beb90b..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/examples/aishell/train.sh
+++ /dev/null
@@ -1,14 +0,0 @@
-export CUDA_VISIBLE_DEVICES=4,5,6,7
-python -u ../../train.py --train_feature_lst data/train_feature.lst \
- --train_label_lst data/train_label.lst \
- --val_feature_lst data/val_feature.lst \
- --val_label_lst data/val_label.lst \
- --mean_var data/global_mean_var \
- --checkpoints checkpoints \
- --frame_dim 80 \
- --class_num 3040 \
- --print_per_batches 100 \
- --infer_models '' \
- --batch_size 16 \
- --learning_rate 6.4e-5 \
- --parallel
diff --git a/PaddleSpeech/DeepASR/images/learning_curve.png b/PaddleSpeech/DeepASR/images/learning_curve.png
deleted file mode 100644
index f09e8514e16fa09c8c32f3b455a5515f270df27a..0000000000000000000000000000000000000000
Binary files a/PaddleSpeech/DeepASR/images/learning_curve.png and /dev/null differ
diff --git a/PaddleSpeech/DeepASR/images/lstmp.png b/PaddleSpeech/DeepASR/images/lstmp.png
deleted file mode 100644
index 72c2fc28998b09218f5dfd9d4c4d09a773b4f503..0000000000000000000000000000000000000000
Binary files a/PaddleSpeech/DeepASR/images/lstmp.png and /dev/null differ
diff --git a/PaddleSpeech/DeepASR/infer.py b/PaddleSpeech/DeepASR/infer.py
deleted file mode 100644
index 84269261a95c381a9be21425abf43b98006f0886..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/infer.py
+++ /dev/null
@@ -1,108 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os
-import argparse
-import paddle.fluid as fluid
-import data_utils.augmentor.trans_mean_variance_norm as trans_mean_variance_norm
-import data_utils.augmentor.trans_add_delta as trans_add_delta
-import data_utils.augmentor.trans_splice as trans_splice
-import data_utils.async_data_reader as reader
-from data_utils.util import lodtensor_to_ndarray
-from data_utils.util import split_infer_result
-
-
-def parse_args():
- parser = argparse.ArgumentParser("Inference for stacked LSTMP model.")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=32,
- help='The number of sequences in one batch. (default: %(default)d)')
- parser.add_argument(
- '--device',
- type=str,
- default='GPU',
- choices=['CPU', 'GPU'],
- help='The device type. (default: %(default)s)')
- parser.add_argument(
- '--mean_var',
- type=str,
- default='data/global_mean_var_search26kHr',
- help="The path for feature's global mean and variance. "
- "(default: %(default)s)")
- parser.add_argument(
- '--infer_feature_lst',
- type=str,
- default='data/infer_feature.lst',
- help='The feature list path for inference. (default: %(default)s)')
- parser.add_argument(
- '--infer_label_lst',
- type=str,
- default='data/infer_label.lst',
- help='The label list path for inference. (default: %(default)s)')
- parser.add_argument(
- '--infer_model_path',
- type=str,
- default='./infer_models/deep_asr.pass_0.infer.model/',
- help='The directory for loading inference model. '
- '(default: %(default)s)')
- args = parser.parse_args()
- return args
-
-
-def print_arguments(args):
- print('----------- Configuration Arguments -----------')
- for arg, value in sorted(vars(args).iteritems()):
- print('%s: %s' % (arg, value))
- print('------------------------------------------------')
-
-
-def infer(args):
- """ Gets one batch of feature data and predicts labels for each sample.
- """
-
- if not os.path.exists(args.infer_model_path):
- raise IOError("Invalid inference model path!")
-
- place = fluid.CUDAPlace(0) if args.device == 'GPU' else fluid.CPUPlace()
- exe = fluid.Executor(place)
-
- # load model
- [infer_program, feed_dict,
- fetch_targets] = fluid.io.load_inference_model(args.infer_model_path, exe)
-
- ltrans = [
- trans_add_delta.TransAddDelta(2, 2),
- trans_mean_variance_norm.TransMeanVarianceNorm(args.mean_var),
- trans_splice.TransSplice()
- ]
-
- infer_data_reader = reader.AsyncDataReader(args.infer_feature_lst,
- args.infer_label_lst)
- infer_data_reader.set_transformers(ltrans)
-
- feature_t = fluid.LoDTensor()
- one_batch = infer_data_reader.batch_iterator(args.batch_size, 1).next()
-
- (features, labels, lod) = one_batch
- feature_t.set(features, place)
- feature_t.set_lod([lod])
-
- results = exe.run(infer_program,
- feed={feed_dict[0]: feature_t},
- fetch_list=fetch_targets,
- return_numpy=False)
-
- probs, lod = lodtensor_to_ndarray(results[0])
- preds = probs.argmax(axis=1)
- infer_batch = split_infer_result(preds, lod)
- for index, sample in enumerate(infer_batch):
- print("result %d: " % index, sample, '\n')
-
-
-if __name__ == '__main__':
- args = parse_args()
- print_arguments(args)
- infer(args)
diff --git a/PaddleSpeech/DeepASR/infer_by_ckpt.py b/PaddleSpeech/DeepASR/infer_by_ckpt.py
deleted file mode 100644
index 1e0fb15c6d6f05aa1e054b37333b0fa0cb5cd8d9..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/infer_by_ckpt.py
+++ /dev/null
@@ -1,273 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import sys
-import os
-import numpy as np
-import argparse
-import time
-
-import paddle.fluid as fluid
-import data_utils.augmentor.trans_mean_variance_norm as trans_mean_variance_norm
-import data_utils.augmentor.trans_add_delta as trans_add_delta
-import data_utils.augmentor.trans_splice as trans_splice
-import data_utils.augmentor.trans_delay as trans_delay
-import data_utils.async_data_reader as reader
-from data_utils.util import lodtensor_to_ndarray, split_infer_result
-from model_utils.model import stacked_lstmp_model
-from decoder.post_latgen_faster_mapped import Decoder
-from tools.error_rate import char_errors
-
-
-def parse_args():
- parser = argparse.ArgumentParser("Run inference by using checkpoint.")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=32,
- help='The number of sequences in one batch. (default: %(default)d)')
- parser.add_argument(
- '--beam_size',
- type=int,
- default=11,
- help='The beam size for decoding. (default: %(default)d)')
- parser.add_argument(
- '--minimum_batch_size',
- type=int,
- default=1,
- help='The minimum number of sequences in one batch. '
- '(default: %(default)d)')
- parser.add_argument(
- '--frame_dim',
- type=int,
- default=80,
- help='Frame dimension of feature data. (default: %(default)d)')
- parser.add_argument(
- '--stacked_num',
- type=int,
- default=5,
- help='Number of lstmp layers to stack. (default: %(default)d)')
- parser.add_argument(
- '--proj_dim',
- type=int,
- default=512,
- help='Project size of lstmp unit. (default: %(default)d)')
- parser.add_argument(
- '--hidden_dim',
- type=int,
- default=1024,
- help='Hidden size of lstmp unit. (default: %(default)d)')
- parser.add_argument(
- '--class_num',
- type=int,
- default=1749,
- help='Number of classes in label. (default: %(default)d)')
- parser.add_argument(
- '--num_threads',
- type=int,
- default=10,
- help='The number of threads for decoding. (default: %(default)d)')
- parser.add_argument(
- '--device',
- type=str,
- default='GPU',
- choices=['CPU', 'GPU'],
- help='The device type. (default: %(default)s)')
- parser.add_argument(
- '--parallel', action='store_true', help='If set, run in parallel.')
- parser.add_argument(
- '--mean_var',
- type=str,
- default='data/global_mean_var',
- help="The path for feature's global mean and variance. "
- "(default: %(default)s)")
- parser.add_argument(
- '--infer_feature_lst',
- type=str,
- default='data/infer_feature.lst',
- help='The feature list path for inference. (default: %(default)s)')
- parser.add_argument(
- '--checkpoint',
- type=str,
- default='./checkpoint',
- help="The checkpoint path to init model. (default: %(default)s)")
- parser.add_argument(
- '--trans_model',
- type=str,
- default='./graph/trans_model',
- help="The path to vocabulary. (default: %(default)s)")
- parser.add_argument(
- '--vocabulary',
- type=str,
- default='./graph/words.txt',
- help="The path to vocabulary. (default: %(default)s)")
- parser.add_argument(
- '--graphs',
- type=str,
- default='./graph/TLG.fst',
- help="The path to TLG graphs for decoding. (default: %(default)s)")
- parser.add_argument(
- '--log_prior',
- type=str,
- default="./logprior",
- help="The log prior probs for training data. (default: %(default)s)")
- parser.add_argument(
- '--acoustic_scale',
- type=float,
- default=0.2,
- help="Scaling factor for acoustic likelihoods. (default: %(default)f)")
- parser.add_argument(
- '--post_matrix_path',
- type=str,
- default=None,
- help="The path to output post prob matrix. (default: %(default)s)")
- parser.add_argument(
- '--decode_to_path',
- type=str,
- default='./decoding_result.txt',
- required=True,
- help="The path to output the decoding result. (default: %(default)s)")
- args = parser.parse_args()
- return args
-
-
-def print_arguments(args):
- print('----------- Configuration Arguments -----------')
- for arg, value in sorted(vars(args).iteritems()):
- print('%s: %s' % (arg, value))
- print('------------------------------------------------')
-
-
-class PostMatrixWriter:
- """ The writer for outputing the post probability matrix
- """
-
- def __init__(self, to_path):
- self._to_path = to_path
- with open(self._to_path, "w") as post_matrix:
- post_matrix.seek(0)
- post_matrix.truncate()
-
- def write(self, keys, probs):
- with open(self._to_path, "a") as post_matrix:
- if isinstance(keys, str):
- keys, probs = [keys], [probs]
-
- for key, prob in zip(keys, probs):
- post_matrix.write(key + " [\n")
- for i in range(prob.shape[0]):
- for j in range(prob.shape[1]):
- post_matrix.write(str(prob[i][j]) + " ")
- post_matrix.write("\n")
- post_matrix.write("]\n")
-
-
-class DecodingResultWriter:
- """ The writer for writing out decoding results
- """
-
- def __init__(self, to_path):
- self._to_path = to_path
- with open(self._to_path, "w") as decoding_result:
- decoding_result.seek(0)
- decoding_result.truncate()
-
- def write(self, results):
- with open(self._to_path, "a") as decoding_result:
- if isinstance(results, str):
- decoding_result.write(results.encode("utf8") + "\n")
- else:
- for result in results:
- decoding_result.write(result.encode("utf8") + "\n")
-
-
-def infer_from_ckpt(args):
- """Inference by using checkpoint."""
-
- if not os.path.exists(args.checkpoint):
- raise IOError("Invalid checkpoint!")
-
- prediction, avg_cost, accuracy = stacked_lstmp_model(
- frame_dim=args.frame_dim,
- hidden_dim=args.hidden_dim,
- proj_dim=args.proj_dim,
- stacked_num=args.stacked_num,
- class_num=args.class_num,
- parallel=args.parallel)
-
- infer_program = fluid.default_main_program().clone()
-
- # optimizer, placeholder
- optimizer = fluid.optimizer.Adam(
- learning_rate=fluid.layers.exponential_decay(
- learning_rate=0.0001,
- decay_steps=1879,
- decay_rate=1 / 1.2,
- staircase=True))
- optimizer.minimize(avg_cost)
-
- place = fluid.CPUPlace() if args.device == 'CPU' else fluid.CUDAPlace(0)
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
- # load checkpoint.
- fluid.io.load_persistables(exe, args.checkpoint)
-
- # init decoder
- decoder = Decoder(args.trans_model, args.vocabulary, args.graphs,
- args.log_prior, args.beam_size, args.acoustic_scale)
-
- ltrans = [
- trans_add_delta.TransAddDelta(2, 2),
- trans_mean_variance_norm.TransMeanVarianceNorm(args.mean_var),
- trans_splice.TransSplice(5, 5), trans_delay.TransDelay(5)
- ]
-
- feature_t = fluid.LoDTensor()
- label_t = fluid.LoDTensor()
-
- # infer data reader
- infer_data_reader = reader.AsyncDataReader(
- args.infer_feature_lst, drop_frame_len=-1, split_sentence_threshold=-1)
- infer_data_reader.set_transformers(ltrans)
-
- decoding_result_writer = DecodingResultWriter(args.decode_to_path)
- post_matrix_writer = None if args.post_matrix_path is None \
- else PostMatrixWriter(args.post_matrix_path)
-
- for batch_id, batch_data in enumerate(
- infer_data_reader.batch_iterator(args.batch_size,
- args.minimum_batch_size)):
- # load_data
- (features, labels, lod, name_lst) = batch_data
- features = np.reshape(features, (-1, 11, 3, args.frame_dim))
- features = np.transpose(features, (0, 2, 1, 3))
- feature_t.set(features, place)
- feature_t.set_lod([lod])
- label_t.set(labels, place)
- label_t.set_lod([lod])
-
- results = exe.run(infer_program,
- feed={"feature": feature_t,
- "label": label_t},
- fetch_list=[prediction, avg_cost, accuracy],
- return_numpy=False)
-
- probs, lod = lodtensor_to_ndarray(results[0])
- infer_batch = split_infer_result(probs, lod)
-
- print("Decoding batch %d ..." % batch_id)
- decoded = decoder.decode_batch(name_lst, infer_batch, args.num_threads)
-
- decoding_result_writer.write(decoded)
-
- if args.post_matrix_path is not None:
- post_matrix_writer.write(name_lst, infer_batch)
-
-
-if __name__ == '__main__':
- args = parse_args()
- print_arguments(args)
-
- infer_from_ckpt(args)
diff --git a/PaddleSpeech/DeepASR/model_utils/__init__.py b/PaddleSpeech/DeepASR/model_utils/__init__.py
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/PaddleSpeech/DeepASR/model_utils/model.py b/PaddleSpeech/DeepASR/model_utils/model.py
deleted file mode 100644
index 0b086b55a898a0a29f57132b438684a655e30caf..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/model_utils/model.py
+++ /dev/null
@@ -1,74 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import paddle.fluid as fluid
-
-
-def stacked_lstmp_model(feature,
- label,
- hidden_dim,
- proj_dim,
- stacked_num,
- class_num,
- parallel=False,
- is_train=True):
- """
- The model for DeepASR. The main structure is composed of stacked
- identical LSTMP (LSTM with recurrent projection) layers.
-
- When running in training and validation phase, the feeding dictionary
- is {'feature', 'label'}, fed by the LodTensor for feature data and
- label data respectively. And in inference, only `feature` is needed.
-
- Args:
- feature(Variable): The input feature data layer.
- label(Variable): The ground-truth label data layer.
- hidden_dim(int): The hidden state's dimension of the LSTMP layer.
- proj_dim(int): The projection size of the LSTMP layer.
- stacked_num(int): The number of stacked LSTMP layers.
- class_num(int): The number of output classes.
- parallel(bool): Run in parallel or not, default `False`.
- is_train(bool): Run in training phase or not, default `True`.
- """
- conv1 = fluid.layers.conv2d(
- input=feature,
- num_filters=32,
- filter_size=3,
- stride=1,
- padding=1,
- bias_attr=True,
- act="relu")
-
- pool1 = fluid.layers.pool2d(
- conv1, pool_size=3, pool_type="max", pool_stride=2, pool_padding=0)
-
- stack_input = pool1
- for i in range(stacked_num):
- fc = fluid.layers.fc(input=stack_input,
- size=hidden_dim * 4,
- bias_attr=None)
- proj, cell = fluid.layers.dynamic_lstmp(
- input=fc,
- size=hidden_dim * 4,
- proj_size=proj_dim,
- bias_attr=True,
- use_peepholes=True,
- is_reverse=False,
- cell_activation="tanh",
- proj_activation="tanh")
- bn = fluid.layers.batch_norm(
- input=proj,
- is_test=not is_train,
- momentum=0.9,
- epsilon=1e-05,
- data_layout='NCHW')
- stack_input = bn
-
- prediction = fluid.layers.fc(input=stack_input,
- size=class_num,
- act='softmax')
-
- cost = fluid.layers.cross_entropy(input=prediction, label=label)
- avg_cost = fluid.layers.mean(x=cost)
- acc = fluid.layers.accuracy(input=prediction, label=label)
- return prediction, avg_cost, acc
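A minimal sketch of wiring the model up with Fluid's pre-2.0 static-graph data layers, assuming the 11-frame splice with second-order deltas used by the readers in this diff (3 channels x 11 frames x 80 feature dims, with class_num 3040 as in the aishell scripts):

```python
# Illustrative only: shapes follow infer_by_ckpt.py's reshape/transpose,
# and fluid.layers.data is the old static-graph input layer.
import paddle.fluid as fluid

feature = fluid.layers.data(
    name='feature', shape=[3, 11, 80], dtype='float32', lod_level=1)
label = fluid.layers.data(
    name='label', shape=[1], dtype='int64', lod_level=1)

prediction, avg_cost, acc = stacked_lstmp_model(
    feature, label, hidden_dim=1024, proj_dim=512,
    stacked_num=5, class_num=3040)
```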
diff --git a/PaddleSpeech/DeepASR/score_error_rate.py b/PaddleSpeech/DeepASR/score_error_rate.py
deleted file mode 100644
index dde5a2448afffcae61c4d033159a5b081e6c79e8..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/score_error_rate.py
+++ /dev/null
@@ -1,80 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import argparse
-from tools.error_rate import char_errors, word_errors
-
-
-def parse_args():
- parser = argparse.ArgumentParser(
- "Score word/character error rate (WER/CER) "
- "for decoding result.")
- parser.add_argument(
- '--error_rate_type',
- type=str,
- default='cer',
- choices=['cer', 'wer'],
- help="Error rate type. (default: %(default)s)")
- parser.add_argument(
- '--special_tokens',
- type=str,
- default='',
- help="Special tokens in scoring CER, seperated by space. "
- "They shouldn't be splitted and should be treated as one special "
- "character. Example: ' ' "
- "(default: %(default)s)")
- parser.add_argument(
- '--ref', type=str, required=True, help="The ground truth text.")
- parser.add_argument(
- '--hyp', type=str, required=True, help="The decoding result text.")
- args = parser.parse_args()
- return args
-
-
-if __name__ == '__main__':
-
- args = parse_args()
- ref_dict = {}
- sum_errors, sum_ref_len = 0.0, 0
- sent_cnt, not_in_ref_cnt = 0, 0
-
- special_tokens = args.special_tokens.split(" ")
-
- with open(args.ref, "r") as ref_txt:
- line = ref_txt.readline()
- while line:
- del_pos = line.find(" ")
- key, sent = line[0:del_pos], line[del_pos + 1:-1].strip()
- ref_dict[key] = sent
- line = ref_txt.readline()
-
- with open(args.hyp, "r") as hyp_txt:
- line = hyp_txt.readline()
- while line:
- del_pos = line.find(" ")
- key, sent = line[0:del_pos], line[del_pos + 1:-1].strip()
- sent_cnt += 1
- line = hyp_txt.readline()
- if key not in ref_dict:
- not_in_ref_cnt += 1
- continue
-
- if args.error_rate_type == 'cer':
- for sp_tok in special_tokens:
- sent = sent.replace(sp_tok, '\0')
- errors, ref_len = char_errors(
- ref_dict[key].decode("utf8"),
- sent.decode("utf8"),
- remove_space=True)
- else:
- errors, ref_len = word_errors(ref_dict[key].decode("utf8"),
- sent.decode("utf8"))
- sum_errors += errors
- sum_ref_len += ref_len
-
- print("Error rate[%s] = %f (%d/%d)," %
- (args.error_rate_type, sum_errors / sum_ref_len, int(sum_errors),
- sum_ref_len))
- print("total %d sentences in hyp, %d not presented in ref." %
- (sent_cnt, not_in_ref_cnt))
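For context, the scorer expects each line of both files to have the form `utterance-id<space>sentence`. A hypothetical invocation (the file names and the sample line are made up):

```
# ref.txt / hyp.txt, one utterance per line, e.g.:
#   BAC009S0002W0122 the quick brown fox
python score_error_rate.py --error_rate_type wer --ref ref.txt --hyp hyp.txt
```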
diff --git a/PaddleSpeech/DeepASR/tools/_init_paths.py b/PaddleSpeech/DeepASR/tools/_init_paths.py
deleted file mode 100644
index 228dbae6bf95231030c1858c4d30b49f162f46e2..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/tools/_init_paths.py
+++ /dev/null
@@ -1,19 +0,0 @@
-"""Add the parent directory to $PYTHONPATH"""
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import os.path
-import sys
-
-
-def add_path(path):
- if path not in sys.path:
- sys.path.insert(0, path)
-
-
-this_dir = os.path.dirname(__file__)
-
-# Add project path to PYTHONPATH
-proj_path = os.path.join(this_dir, '..')
-add_path(proj_path)
diff --git a/PaddleSpeech/DeepASR/tools/error_rate.py b/PaddleSpeech/DeepASR/tools/error_rate.py
deleted file mode 100644
index 215ad39d24a551879d0fd8d4c8892161a0708370..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/tools/error_rate.py
+++ /dev/null
@@ -1,182 +0,0 @@
-# -*- coding: utf-8 -*-
-"""This module provides functions to calculate error rate in different level.
-e.g. wer for word-level, cer for char-level.
-"""
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import numpy as np
-
-
-def _levenshtein_distance(ref, hyp):
- """Levenshtein distance is a string metric for measuring the difference
- between two sequences. Informally, the levenshtein disctance is defined as
- the minimum number of single-character edits (substitutions, insertions or
- deletions) required to change one word into the other. We can naturally
- extend the edits to word level when calculate levenshtein disctance for
- two sentences.
- """
- m = len(ref)
- n = len(hyp)
-
- # special case
- if ref == hyp:
- return 0
- if m == 0:
- return n
- if n == 0:
- return m
-
- if m < n:
- ref, hyp = hyp, ref
- m, n = n, m
-
- # use O(min(m, n)) space
- distance = np.zeros((2, n + 1), dtype=np.int32)
-
- # initialize distance matrix
- for j in range(n + 1):
- distance[0][j] = j
-
- # calculate levenshtein distance
- for i in range(1, m + 1):
- prev_row_idx = (i - 1) % 2
- cur_row_idx = i % 2
- distance[cur_row_idx][0] = i
- for j in range(1, n + 1):
- if ref[i - 1] == hyp[j - 1]:
- distance[cur_row_idx][j] = distance[prev_row_idx][j - 1]
- else:
- s_num = distance[prev_row_idx][j - 1] + 1
- i_num = distance[cur_row_idx][j - 1] + 1
- d_num = distance[prev_row_idx][j] + 1
- distance[cur_row_idx][j] = min(s_num, i_num, d_num)
-
- return distance[m % 2][n]
-
-
-def word_errors(reference, hypothesis, ignore_case=False, delimiter=' '):
- """Compute the levenshtein distance between reference sequence and
- hypothesis sequence in word-level.
- :param reference: The reference sentence.
- :type reference: basestring
- :param hypothesis: The hypothesis sentence.
- :type hypothesis: basestring
- :param ignore_case: Whether case-sensitive or not.
- :type ignore_case: bool
- :param delimiter: Delimiter of input sentences.
- :type delimiter: char
- :return: Levenshtein distance and word number of reference sentence.
- :rtype: list
- """
- if ignore_case == True:
- reference = reference.lower()
- hypothesis = hypothesis.lower()
-
- ref_words = filter(None, reference.split(delimiter))
- hyp_words = filter(None, hypothesis.split(delimiter))
-
- edit_distance = _levenshtein_distance(ref_words, hyp_words)
- return float(edit_distance), len(ref_words)
-
-
-def char_errors(reference, hypothesis, ignore_case=False, remove_space=False):
- """Compute the levenshtein distance between reference sequence and
- hypothesis sequence in char-level.
- :param reference: The reference sentence.
- :type reference: basestring
- :param hypothesis: The hypothesis sentence.
- :type hypothesis: basestring
- :param ignore_case: Whether case-sensitive or not.
- :type ignore_case: bool
- :param remove_space: Whether remove internal space characters
- :type remove_space: bool
- :return: Levenshtein distance and length of reference sentence.
- :rtype: list
- """
- if ignore_case == True:
- reference = reference.lower()
- hypothesis = hypothesis.lower()
-
- join_char = ' '
- if remove_space == True:
- join_char = ''
-
- reference = join_char.join(filter(None, reference.split(' ')))
- hypothesis = join_char.join(filter(None, hypothesis.split(' ')))
-
- edit_distance = _levenshtein_distance(reference, hypothesis)
- return float(edit_distance), len(reference)
-
-
-def wer(reference, hypothesis, ignore_case=False, delimiter=' '):
- """Calculate word error rate (WER). WER compares reference text and
- hypothesis text in word-level. WER is defined as:
- .. math::
- WER = (Sw + Dw + Iw) / Nw
- where
- .. code-block:: text
- Sw is the number of words subsituted,
- Dw is the number of words deleted,
- Iw is the number of words inserted,
- Nw is the number of words in the reference
- We can use levenshtein distance to calculate WER. Please draw an attention
- that empty items will be removed when splitting sentences by delimiter.
- :param reference: The reference sentence.
- :type reference: basestring
- :param hypothesis: The hypothesis sentence.
- :type hypothesis: basestring
- :param ignore_case: Whether case-sensitive or not.
- :type ignore_case: bool
- :param delimiter: Delimiter of input sentences.
- :type delimiter: char
- :return: Word error rate.
- :rtype: float
- :raises ValueError: If word number of reference is zero.
- """
- edit_distance, ref_len = word_errors(reference, hypothesis, ignore_case,
- delimiter)
-
- if ref_len == 0:
- raise ValueError("Reference's word number should be greater than 0.")
-
- wer = float(edit_distance) / ref_len
- return wer
-
-
-def cer(reference, hypothesis, ignore_case=False, remove_space=False):
- """Calculate charactor error rate (CER). CER compares reference text and
- hypothesis text in char-level. CER is defined as:
- .. math::
- CER = (Sc + Dc + Ic) / Nc
- where
- .. code-block:: text
- Sc is the number of characters substituted,
- Dc is the number of characters deleted,
- Ic is the number of characters inserted
- Nc is the number of characters in the reference
- We can use levenshtein distance to calculate CER. Chinese input should be
- encoded to unicode. Please draw an attention that the leading and tailing
- space characters will be truncated and multiple consecutive space
- characters in a sentence will be replaced by one space character.
- :param reference: The reference sentence.
- :type reference: basestring
- :param hypothesis: The hypothesis sentence.
- :type hypothesis: basestring
- :param ignore_case: Whether case-sensitive or not.
- :type ignore_case: bool
- :param remove_space: Whether remove internal space characters
- :type remove_space: bool
- :return: Character error rate.
- :rtype: float
- :raises ValueError: If the reference length is zero.
- """
- edit_distance, ref_len = char_errors(reference, hypothesis, ignore_case,
- remove_space)
-
- if ref_len == 0:
- raise ValueError("Length of reference should be greater than 0.")
-
- cer = float(edit_distance) / ref_len
- return cer
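A small worked example of these helpers (the sentences are made up; the numbers follow directly from the edit distances):

```
from tools.error_rate import wer, cer, word_errors

# Word level: 'the fox' vs 'a fox' needs one substitution over 2 reference words.
print(word_errors('the fox', 'a fox'))              # (1.0, 2)
print(wer('the fox', 'a fox'))                      # 1/2 = 0.5

# Char level with remove_space=True: 'thefox' vs 'afox' has edit distance 3
# (delete 't', delete 'h', substitute 'e' -> 'a') over 6 reference characters.
print(cer('the fox', 'a fox', remove_space=True))   # 3/6 = 0.5
```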
diff --git a/PaddleSpeech/DeepASR/tools/profile.py b/PaddleSpeech/DeepASR/tools/profile.py
deleted file mode 100644
index d25e18f7db0111acf76e66478f8230aab1d5f760..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/tools/profile.py
+++ /dev/null
@@ -1,210 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import sys
-import numpy as np
-import argparse
-import time
-
-import paddle.fluid as fluid
-import paddle.fluid.profiler as profiler
-import _init_paths
-import data_utils.augmentor.trans_mean_variance_norm as trans_mean_variance_norm
-import data_utils.augmentor.trans_add_delta as trans_add_delta
-import data_utils.augmentor.trans_splice as trans_splice
-import data_utils.augmentor.trans_delay as trans_delay
-import data_utils.async_data_reader as reader
-from model_utils.model import stacked_lstmp_model
-from data_utils.util import lodtensor_to_ndarray
-
-
-def parse_args():
- parser = argparse.ArgumentParser("Profiling for the stacked LSTMP model.")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=32,
- help='The number of sequences in a batch. (default: %(default)d)')
- parser.add_argument(
- '--minimum_batch_size',
- type=int,
- default=1,
- help='The minimum number of sequences in a batch. '
- '(default: %(default)d)')
- parser.add_argument(
- '--frame_dim',
- type=int,
- default=120 * 11,
- help='Frame dimension of feature data. (default: %(default)d)')
- parser.add_argument(
- '--stacked_num',
- type=int,
- default=5,
- help='Number of lstmp layers to stack. (default: %(default)d)')
- parser.add_argument(
- '--proj_dim',
- type=int,
- default=512,
- help='Project size of lstmp unit. (default: %(default)d)')
- parser.add_argument(
- '--hidden_dim',
- type=int,
- default=1024,
- help='Hidden size of lstmp unit. (default: %(default)d)')
- parser.add_argument(
- '--class_num',
- type=int,
- default=1749,
- help='Number of classes in label. (default: %(default)d)')
- parser.add_argument(
- '--learning_rate',
- type=float,
- default=0.00016,
- help='Learning rate used to train. (default: %(default)f)')
- parser.add_argument(
- '--device',
- type=str,
- default='GPU',
- choices=['CPU', 'GPU'],
- help='The device type. (default: %(default)s)')
- parser.add_argument(
- '--parallel', action='store_true', help='If set, run in parallel.')
- parser.add_argument(
- '--mean_var',
- type=str,
- default='data/global_mean_var_search26kHr',
- help='mean var path')
- parser.add_argument(
- '--feature_lst',
- type=str,
- default='data/feature.lst',
- help='feature list path.')
- parser.add_argument(
- '--label_lst',
- type=str,
- default='data/label.lst',
- help='label list path.')
- parser.add_argument(
- '--max_batch_num',
- type=int,
- default=11,
- help='Maximum number of batches for profiling. (default: %(default)d)')
- parser.add_argument(
- '--first_batches_to_skip',
- type=int,
- default=1,
- help='Number of first batches to skip for profiling. '
- '(default: %(default)d)')
- parser.add_argument(
- '--print_train_acc',
- action='store_true',
- help='If set, output training accuracy.')
- parser.add_argument(
- '--sorted_key',
- type=str,
- default='total',
- choices=['None', 'total', 'calls', 'min', 'max', 'ave'],
- help='Different types of time to sort the profiling report. '
- '(default: %(default)s)')
- args = parser.parse_args()
- return args
-
-
-def print_arguments(args):
- print('----------- Configuration Arguments -----------')
- for arg, value in sorted(vars(args).items()):
- print('%s: %s' % (arg, value))
- print('------------------------------------------------')
-
-
-def profile(args):
- """profile the training process.
- """
-
- if not args.first_batches_to_skip < args.max_batch_num:
- raise ValueError("arg 'first_batches_to_skip' must be smaller than "
- "'max_batch_num'.")
- if not args.first_batches_to_skip >= 0:
- raise ValueError(
- "arg 'first_batches_to_skip' must not be smaller than 0.")
-
- _, avg_cost, accuracy = stacked_lstmp_model(
- frame_dim=args.frame_dim,
- hidden_dim=args.hidden_dim,
- proj_dim=args.proj_dim,
- stacked_num=args.stacked_num,
- class_num=args.class_num,
- parallel=args.parallel)
-
- optimizer = fluid.optimizer.Adam(
- learning_rate=fluid.layers.exponential_decay(
- learning_rate=args.learning_rate,
- decay_steps=1879,
- decay_rate=1 / 1.2,
- staircase=True))
- optimizer.minimize(avg_cost)
-
- place = fluid.CPUPlace() if args.device == 'CPU' else fluid.CUDAPlace(0)
- exe = fluid.Executor(place)
- exe.run(fluid.default_startup_program())
-
- ltrans = [
- trans_add_delta.TransAddDelta(2, 2),
- trans_mean_variance_norm.TransMeanVarianceNorm(args.mean_var),
- trans_splice.TransSplice(5, 5), trans_delay.TransDelay(5)
- ]
-
- data_reader = reader.AsyncDataReader(
- args.feature_lst, args.label_lst, -1, split_sentence_threshold=1024)
- data_reader.set_transformers(ltrans)
-
- feature_t = fluid.LoDTensor()
- label_t = fluid.LoDTensor()
-
- sorted_key = None if args.sorted_key == 'None' else args.sorted_key
- with profiler.profiler(args.device, sorted_key) as prof:
- frames_seen, start_time = 0, 0.0
- for batch_id, batch_data in enumerate(
- data_reader.batch_iterator(args.batch_size,
- args.minimum_batch_size)):
- if batch_id >= args.max_batch_num:
- break
- if args.first_batches_to_skip == batch_id:
- profiler.reset_profiler()
- start_time = time.time()
- frames_seen = 0
- # load_data
- (features, labels, lod, _) = batch_data
- features = np.reshape(features, (-1, 11, 3, args.frame_dim))
- features = np.transpose(features, (0, 2, 1, 3))
- feature_t.set(features, place)
- feature_t.set_lod([lod])
- label_t.set(labels, place)
- label_t.set_lod([lod])
-
- frames_seen += lod[-1]
-
- outs = exe.run(fluid.default_main_program(),
- feed={"feature": feature_t,
- "label": label_t},
- fetch_list=[avg_cost, accuracy]
- if args.print_train_acc else [],
- return_numpy=False)
-
- if args.print_train_acc:
- print("Batch %d acc: %f" %
- (batch_id, lodtensor_to_ndarray(outs[1])[0]))
- else:
- sys.stdout.write('.')
- sys.stdout.flush()
- time_consumed = time.time() - start_time
- frames_per_sec = frames_seen / time_consumed
- print("\nTime consumed: %f s, performance: %f frames/s." %
- (time_consumed, frames_per_sec))
-
-
-if __name__ == '__main__':
- args = parse_args()
- print_arguments(args)
- profile(args)
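A hypothetical profiling run; the flags match the argparse definitions above, but the list-file paths are just the script defaults and would need to point at real Kaldi-derived feature/label lists:

```
python tools/profile.py --device GPU --batch_size 32 --sorted_key total \
    --feature_lst data/feature.lst --label_lst data/label.lst \
    --max_batch_num 11 --first_batches_to_skip 1
```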
diff --git a/PaddleSpeech/DeepASR/train.py b/PaddleSpeech/DeepASR/train.py
deleted file mode 100644
index 1a1dd6cf9ea33bb546cc3bdf65c36be0441832cb..0000000000000000000000000000000000000000
--- a/PaddleSpeech/DeepASR/train.py
+++ /dev/null
@@ -1,372 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-import sys
-import os
-import numpy as np
-import argparse
-import time
-
-import paddle.fluid as fluid
-import data_utils.augmentor.trans_mean_variance_norm as trans_mean_variance_norm
-import data_utils.augmentor.trans_add_delta as trans_add_delta
-import data_utils.augmentor.trans_splice as trans_splice
-import data_utils.augmentor.trans_delay as trans_delay
-import data_utils.async_data_reader as reader
-from model_utils.model import stacked_lstmp_model
-
-
-def parse_args():
- parser = argparse.ArgumentParser("Training for stacked LSTMP model.")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=32,
- help='The number of sequences in a batch; this is the batch size per GPU. (default: %(default)d)'
- )
- parser.add_argument(
- '--minimum_batch_size',
- type=int,
- default=1,
- help='The minimum number of sequences in a batch. '
- '(default: %(default)d)')
- parser.add_argument(
- '--frame_dim',
- type=int,
- default=80,
- help='Frame dimension of feature data. (default: %(default)d)')
- parser.add_argument(
- '--stacked_num',
- type=int,
- default=5,
- help='Number of lstmp layers to stack. (default: %(default)d)')
- parser.add_argument(
- '--proj_dim',
- type=int,
- default=512,
- help='Project size of lstmp unit. (default: %(default)d)')
- parser.add_argument(
- '--hidden_dim',
- type=int,
- default=1024,
- help='Hidden size of lstmp unit. (default: %(default)d)')
- parser.add_argument(
- '--class_num',
- type=int,
- default=3040,
- help='Number of classes in label. (default: %(default)d)')
- parser.add_argument(
- '--pass_num',
- type=int,
- default=100,
- help='Epoch number to train. (default: %(default)d)')
- parser.add_argument(
- '--print_per_batches',
- type=int,
- default=100,
- help='Interval to print training accuracy. (default: %(default)d)')
- parser.add_argument(
- '--learning_rate',
- type=float,
- default=0.00016,
- help='Learning rate used to train. (default: %(default)f)')
- parser.add_argument(
- '--device',
- type=str,
- default='GPU',
- choices=['CPU', 'GPU'],
- help='The device type. (default: %(default)s)')
- parser.add_argument(
- '--parallel', action='store_true', help='If set, run in parallel.')
- parser.add_argument(
- '--mean_var',
- type=str,
- default='data/global_mean_var_search26kHr',
- help="The path for feature's global mean and variance. "
- "(default: %(default)s)")
- parser.add_argument(
- '--train_feature_lst',
- type=str,
- default='data/feature.lst',
- help='The feature list path for training. (default: %(default)s)')
- parser.add_argument(
- '--train_label_lst',
- type=str,
- default='data/label.lst',
- help='The label list path for training. (default: %(default)s)')
- parser.add_argument(
- '--val_feature_lst',
- type=str,
- default='data/val_feature.lst',
- help='The feature list path for validation. (default: %(default)s)')
- parser.add_argument(
- '--val_label_lst',
- type=str,
- default='data/val_label.lst',
- help='The label list path for validation. (default: %(default)s)')
- parser.add_argument(
- '--init_model_path',
- type=str,
- default=None,
- help="The model (checkpoint) path which the training resumes from. "
- "If None, train the model from scratch. (default: %(default)s)")
- parser.add_argument(
- '--checkpoints',
- type=str,
- default='./checkpoints',
- help="The directory for saving checkpoints. Do not save checkpoints "
- "if set to ''. (default: %(default)s)")
- parser.add_argument(
- '--infer_models',
- type=str,
- default='./infer_models',
- help="The directory for saving inference models. Do not save inference "
- "models if set to ''. (default: %(default)s)")
- args = parser.parse_args()
- return args
-
-
-def print_arguments(args):
- print('----------- Configuration Arguments -----------')
- for arg, value in sorted(vars(args).items()):
- print('%s: %s' % (arg, value))
- print('------------------------------------------------')
-
-
-def train(args):
- """train in loop.
- """
-
- # paths check
- if args.init_model_path is not None and \
- not os.path.exists(args.init_model_path):
- raise IOError("Invalid initial model path!")
- if args.checkpoints != '' and not os.path.exists(args.checkpoints):
- os.mkdir(args.checkpoints)
- if args.infer_models != '' and not os.path.exists(args.infer_models):
- os.mkdir(args.infer_models)
-
- train_program = fluid.Program()
- train_startup = fluid.Program()
-
- with fluid.program_guard(train_program, train_startup):
- with fluid.unique_name.guard():
- py_train_reader = fluid.layers.py_reader(
- capacity=10,
- shapes=([-1, 3, 11, args.frame_dim], [-1, 1]),
- dtypes=['float32', 'int64'],
- lod_levels=[1, 1],
- name='train_reader')
- feature, label = fluid.layers.read_file(py_train_reader)
- prediction, avg_cost, accuracy = stacked_lstmp_model(
- feature=feature,
- label=label,
- hidden_dim=args.hidden_dim,
- proj_dim=args.proj_dim,
- stacked_num=args.stacked_num,
- class_num=args.class_num)
- # optimizer = fluid.optimizer.Momentum(learning_rate=args.learning_rate, momentum=0.9)
- optimizer = fluid.optimizer.Adam(
- learning_rate=fluid.layers.exponential_decay(
- learning_rate=args.learning_rate,
- decay_steps=1879,
- decay_rate=1 / 1.2,
- staircase=True))
- optimizer.minimize(avg_cost)
- fluid.memory_optimize(train_program)
-
- test_program = fluid.Program()
- test_startup = fluid.Program()
- with fluid.program_guard(test_program, test_startup):
- with fluid.unique_name.guard():
- py_test_reader = fluid.layers.py_reader(
- capacity=10,
- shapes=([-1, 3, 11, args.frame_dim], [-1, 1]),
- dtypes=['float32', 'int64'],
- lod_levels=[1, 1],
- name='test_reader')
- feature, label = fluid.layers.read_file(py_test_reader)
- prediction, avg_cost, accuracy = stacked_lstmp_model(
- feature=feature,
- label=label,
- hidden_dim=args.hidden_dim,
- proj_dim=args.proj_dim,
- stacked_num=args.stacked_num,
- class_num=args.class_num)
- test_program = test_program.clone(for_test=True)
- place = fluid.CPUPlace() if args.device == 'CPU' else fluid.CUDAPlace(0)
- exe = fluid.Executor(place)
- exe.run(train_startup)
- exe.run(test_startup)
-
- if args.parallel:
- exec_strategy = fluid.ExecutionStrategy()
- exec_strategy.num_iteration_per_drop_scope = 10
- train_exe = fluid.ParallelExecutor(
- use_cuda=(args.device == 'GPU'),
- loss_name=avg_cost.name,
- exec_strategy=exec_strategy,
- main_program=train_program)
- test_exe = fluid.ParallelExecutor(
- use_cuda=(args.device == 'GPU'),
- main_program=test_program,
- exec_strategy=exec_strategy,
- share_vars_from=train_exe)
-
- # resume training if initial model provided.
- if args.init_model_path is not None:
- fluid.io.load_persistables(exe, args.init_model_path)
-
- ltrans = [
- trans_add_delta.TransAddDelta(2, 2),
- trans_mean_variance_norm.TransMeanVarianceNorm(args.mean_var),
- trans_splice.TransSplice(5, 5), trans_delay.TransDelay(5)
- ]
-
- # bind train_reader
- train_data_reader = reader.AsyncDataReader(
- args.train_feature_lst,
- args.train_label_lst,
- -1,
- split_sentence_threshold=1024)
-
- train_data_reader.set_transformers(ltrans)
-
- def train_data_provider():
- for data in train_data_reader.batch_iterator(args.batch_size,
- args.minimum_batch_size):
- yield batch_data_to_lod_tensors(args, data, fluid.CPUPlace())
-
- py_train_reader.decorate_tensor_provider(train_data_provider)
-
- if (os.path.exists(args.val_feature_lst) and
- os.path.exists(args.val_label_lst)):
- # test data reader
- test_data_reader = reader.AsyncDataReader(
- args.val_feature_lst,
- args.val_label_lst,
- -1,
- split_sentence_threshold=1024)
- test_data_reader.set_transformers(ltrans)
-
- def test_data_provider():
- for data in test_data_reader.batch_iterator(
- args.batch_size, args.minimum_batch_size):
- yield batch_data_to_lod_tensors(args, data, fluid.CPUPlace())
-
- py_test_reader.decorate_tensor_provider(test_data_provider)
-
- # validation
- def test(exe):
- # If test data not found, return invalid cost and accuracy
- if not (os.path.exists(args.val_feature_lst) and
- os.path.exists(args.val_label_lst)):
- return -1.0, -1.0
- batch_id = 0
- test_costs = []
- test_accs = []
- while True:
- if batch_id == 0:
- py_test_reader.start()
- try:
- if args.parallel:
- cost, acc = exe.run(
- fetch_list=[avg_cost.name, accuracy.name],
- return_numpy=False)
- else:
- cost, acc = exe.run(program=test_program,
- fetch_list=[avg_cost, accuracy],
- return_numpy=False)
- sys.stdout.write('.')
- sys.stdout.flush()
- test_costs.append(np.array(cost)[0])
- test_accs.append(np.array(acc)[0])
- batch_id += 1
- except fluid.core.EOFException:
- py_test_reader.reset()
- break
- return np.mean(test_costs), np.mean(test_accs)
-
- # train
- for pass_id in range(args.pass_num):
- pass_start_time = time.time()
- batch_id = 0
- while True:
- if batch_id == 0:
- py_train_reader.start()
- to_print = batch_id > 0 and (batch_id % args.print_per_batches == 0)
- try:
- if args.parallel:
- outs = train_exe.run(
- fetch_list=[avg_cost.name, accuracy.name]
- if to_print else [],
- return_numpy=False)
- else:
- outs = exe.run(program=train_program,
- fetch_list=[avg_cost, accuracy]
- if to_print else [],
- return_numpy=False)
- except fluid.core.EOFException:
- py_train_reader.reset()
- break
-
- if to_print:
- if args.parallel:
- print("\nBatch %d, train cost: %f, train acc: %f" %
- (batch_id, np.mean(outs[0]), np.mean(outs[1])))
- else:
- print("\nBatch %d, train cost: %f, train acc: %f" % (
- batch_id, np.array(outs[0])[0], np.array(outs[1])[0]))
- # save the latest checkpoint
- if args.checkpoints != '':
- model_path = os.path.join(args.checkpoints,
- "deep_asr.latest.checkpoint")
- fluid.io.save_persistables(exe, model_path, train_program)
- else:
- sys.stdout.write('.')
- sys.stdout.flush()
-
- batch_id += 1
- # run test
- val_cost, val_acc = test(test_exe if args.parallel else exe)
-
- # save checkpoint per pass
- if args.checkpoints != '':
- model_path = os.path.join(
- args.checkpoints,
- "deep_asr.pass_" + str(pass_id) + ".checkpoint")
- fluid.io.save_persistables(exe, model_path, train_program)
- # save inference model
- if args.infer_models != '':
- model_path = os.path.join(
- args.infer_models,
- "deep_asr.pass_" + str(pass_id) + ".infer.model")
- fluid.io.save_inference_model(model_path, ["feature"],
- [prediction], exe, train_program)
- # cal pass time
- pass_end_time = time.time()
- time_consumed = pass_end_time - pass_start_time
- # print info at pass end
- print("\nPass %d, time consumed: %f s, val cost: %f, val acc: %f\n" %
- (pass_id, time_consumed, val_cost, val_acc))
-
-
-def batch_data_to_lod_tensors(args, batch_data, place):
- features, labels, lod, name_lst = batch_data
- features = np.reshape(features, (-1, 11, 3, args.frame_dim))
- features = np.transpose(features, (0, 2, 1, 3))
- feature_t = fluid.LoDTensor()
- label_t = fluid.LoDTensor()
- feature_t.set(features, place)
- feature_t.set_lod([lod])
- label_t.set(labels, place)
- label_t.set_lod([lod])
- return feature_t, label_t
-
-
-if __name__ == '__main__':
- args = parse_args()
- print_arguments(args)
-
- train(args)
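A hypothetical training invocation matching the defaults above (all paths are placeholders for real Kaldi-derived feature/label lists):

```
python train.py --device GPU --parallel \
    --train_feature_lst data/feature.lst --train_label_lst data/label.lst \
    --val_feature_lst data/val_feature.lst --val_label_lst data/val_label.lst \
    --checkpoints ./checkpoints --infer_models ./infer_models
```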
diff --git a/PaddleSpeech/README.md b/PaddleSpeech/README.md
deleted file mode 100644
index 39f91c26bd90fdd0e8fa81a395d14c2d3826f7cd..0000000000000000000000000000000000000000
--- a/PaddleSpeech/README.md
+++ /dev/null
@@ -1,12 +0,0 @@
-Fluid Model Library
-===================
-
-Speech Recognition
-------------------
-
-Automatic Speech Recognition (ASR) is the technology that transcribes the lexical content of human speech into text a computer can take as input. Research on speech recognition went through a long period of exploration; after the HMM/GMM era progress was slow, and the rise of deep learning brought the field a renaissance. On many speech recognition tasks, using deep neural networks (DNNs) as acoustic models has outperformed GMMs, making ASR one of the most successful applications of deep learning. With recognition accuracy improving steadily, more and more speech products have shipped, such as voice input methods and smart-home devices represented by smart speakers — speech-based interaction is profoundly changing human life.
-
-Unlike [DeepSpeech](https://github.com/PaddlePaddle/DeepSpeech), where a deep learning model directly predicts word distributions end to end, this example is closer to the traditional speech recognition pipeline: it uses phonemes as the modeling unit and focuses on training the acoustic model, using [kaldi](http://www.kaldi-asr.org) for feature extraction and label alignment of the audio data and integrating kaldi's decoder to perform decoding.
-
-- [DeepASR](https://github.com/PaddlePaddle/models/blob/develop/PaddleSpeech/DeepASR/README_cn.md)
-
diff --git a/README.md b/README.md
index d1b7db180ddacbe497c0495d97ce6d70393758ab..1ecf77cdf7a5ff8218546b3a65798dd1c1d7bb2e 100644
--- a/README.md
+++ b/README.md
@@ -32,16 +32,16 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
| **Model** | **Overview** | **Dataset** | **Metrics** **top-1/top-5 accuracy(CV2)** |
| ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------- | ------------------------------------------------ |
-| [AlexNet](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification) | The first successful use of ReLU, Dropout and LRN in a CNN, with GPU-accelerated computation | ImageNet-2012 validation set | 56.72%/79.17% |
-| [VGG](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification) | Builds on AlexNet with small 3*3 convolution kernels and a deeper network, giving strong generalization ability | ImageNet-2012 validation set | 72.56%/90.93% |
-| [GoogleNet](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification) | Increases the depth and width of the network without increasing the computational load, achieving superior performance | ImageNet-2012 validation set | 70.70%/89.66% |
-| [ResNet](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification) | Residual Network; introduces a new residual structure that solves the problem of accuracy dropping as the network gets deeper | ImageNet-2012 validation set | 80.93%/95.33% |
-| [ResNet-D](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification) | Combines several recent ResNet improvement strategies; ResNet50_vd reaches 79.84% top-1 accuracy | ImageNet-2012 validation set | 79.84%/94.93% |
-| [Inception-v4](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification) | Combines Inception modules with residual connections; the ResNet-style structure greatly accelerates training and improves performance | ImageNet-2012 validation set | 80.77%/95.26% |
-| [MobileNet v1](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification) | Rebuilds the traditional convolution as a two-layer convolution structure, greatly reducing computation time with almost no loss of accuracy; well suited to mobile and embedded vision applications | ImageNet-2012 validation set | 70.99%/89.68% |
-| [MobileNet v2](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification) | A refinement of the MobileNet structure; adding skip connections directly on the thinner bottleneck layers and dropping the ReLU nonlinearity on the bottleneck layers yields better results | ImageNet-2012 validation set | 72.15%/90.65% |
-| [SE_ResNeXt](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification) | Adds SE (Squeeze-and-Excitation) modules on top of ResNeXt, improving recognition accuracy; took first place in the ILSVRC 2017 classification task | ImageNet-2012 validation set | 81.40%/95.48% |
-| [ShuffleNet v2](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification) | ECCV 2018; a lightweight CNN that strikes a good balance between speed and accuracy. At the same complexity it is more accurate than ShuffleNet and MobileNet v2, and better suited to mobile and autonomous-driving applications | ImageNet-2012 validation set | 70.03%/89.17% |
+| [AlexNet](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/image_classification) | The first successful use of ReLU, Dropout and LRN in a CNN, with GPU-accelerated computation | ImageNet-2012 validation set | 56.72%/79.17% |
+| [VGG](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/image_classification) | Builds on AlexNet with small 3*3 convolution kernels and a deeper network, giving strong generalization ability | ImageNet-2012 validation set | 72.56%/90.93% |
+| [GoogleNet](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/image_classification) | Increases the depth and width of the network without increasing the computational load, achieving superior performance | ImageNet-2012 validation set | 70.70%/89.66% |
+| [ResNet](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/image_classification) | Residual Network; introduces a new residual structure that solves the problem of accuracy dropping as the network gets deeper | ImageNet-2012 validation set | 80.93%/95.33% |
+| [ResNet-D](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/image_classification) | Combines several recent ResNet improvement strategies; ResNet50_vd reaches 79.84% top-1 accuracy | ImageNet-2012 validation set | 79.84%/94.93% |
+| [Inception-v4](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/image_classification) | Combines Inception modules with residual connections; the ResNet-style structure greatly accelerates training and improves performance | ImageNet-2012 validation set | 80.77%/95.26% |
+| [MobileNet v1](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/image_classification) | Rebuilds the traditional convolution as a two-layer convolution structure, greatly reducing computation time with almost no loss of accuracy; well suited to mobile and embedded vision applications | ImageNet-2012 validation set | 70.99%/89.68% |
+| [MobileNet v2](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/image_classification) | A refinement of the MobileNet structure; adding skip connections directly on the thinner bottleneck layers and dropping the ReLU nonlinearity on the bottleneck layers yields better results | ImageNet-2012 validation set | 72.15%/90.65% |
+| [SE_ResNeXt](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/image_classification) | Adds SE (Squeeze-and-Excitation) modules on top of ResNeXt, improving recognition accuracy; took first place in the ILSVRC 2017 classification task | ImageNet-2012 validation set | 81.40%/95.48% |
+| [ShuffleNet v2](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/image_classification) | ECCV 2018; a lightweight CNN that strikes a good balance between speed and accuracy. At the same complexity it is more accurate than ShuffleNet and MobileNet v2, and better suited to mobile and autonomous-driving applications | ImageNet-2012 validation set | 70.03%/89.17% |
### Object Detection
@@ -49,12 +49,12 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
| Model | Overview | Dataset | Metric: mAP |
| ------------------------------------------------------------ | ------------------------------------------------------------ | ---------- | ------------------------------------------------------- |
-| [SSD](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleDetection) | Inherits MobileNet's fast inference and easy deployment, and performs image object detection well on many kinds of devices | VOC07 test | mAP = 73.32% |
-| [Faster-RCNN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleDetection) | Creatively uses a convolutional network to generate region proposals itself and shares the convolutional network with the detection network, reducing the number of proposals while raising their quality | MS-COCO | ResNet 50 based, mAP(0.50:0.95) = 36.7% |
-| [Mask-RCNN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleDetection) | A classic two-stage framework; adds a segmentation branch on top of Faster R-CNN to produce mask results, decoupling mask and class prediction and yielding pixel-level detection results. | MS-COCO | ResNet 50 based, Mask mAP(0.50:0.95) = 31.4% |
-| [RetinaNet](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleDetection) | A classic one-stage framework composed of a ResNet backbone, an FPN structure, and two subnetworks for regressing object locations and predicting object classes respectively. Training with Focal Loss solves the foreground-background class imbalance of traditional one-stage detectors and further improves their accuracy. | MS-COCO | ResNet based, mAP(0.50:0.95) = 36% |
-| [YOLOv3](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleDetection) | An object detection network that balances speed and accuracy. Compared with the author's original darknet implementation of YOLO v3, the PaddlePaddle implementation follows the paper [Bag of Tricks for Image Classification with Convolutional Neural Networks](https://arxiv.org/pdf/1812.01187.pdf), adding mixup, label_smooth and other tricks that raise accuracy (mAP(0.5:0.95)) by 4.7 absolute percentage points over the original; adding synchronized batch normalization on top of this brings the total improvement to 5.9 absolute percentage points. | MS-COCO | DarkNet based, mAP(0.50:0.95) = 38.9% |
-| [PyramidBox](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/face_detection) | **PyramidBox** **is a face detection model developed in-house by Baidu** that uses contextual information to tackle the detection of hard faces; the network is highly expressive and robust. It took first place on the WIDER Face dataset in March 2018 | WIDER FACE | mAP (Easy/Medium/Hard set) = 96.0%/ 94.8%/ 88.8% |
+| [SSD](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleDetection) | Inherits MobileNet's fast inference and easy deployment, and performs image object detection well on many kinds of devices | VOC07 test | mAP = 73.32% |
+| [Faster-RCNN](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleDetection) | Creatively uses a convolutional network to generate region proposals itself and shares the convolutional network with the detection network, reducing the number of proposals while raising their quality | MS-COCO | ResNet 50 based, mAP(0.50:0.95) = 36.7% |
+| [Mask-RCNN](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleDetection) | A classic two-stage framework; adds a segmentation branch on top of Faster R-CNN to produce mask results, decoupling mask and class prediction and yielding pixel-level detection results. | MS-COCO | ResNet 50 based, Mask mAP(0.50:0.95) = 31.4% |
+| [RetinaNet](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleDetection) | A classic one-stage framework composed of a ResNet backbone, an FPN structure, and two subnetworks for regressing object locations and predicting object classes respectively. Training with Focal Loss solves the foreground-background class imbalance of traditional one-stage detectors and further improves their accuracy. | MS-COCO | ResNet based, mAP(0.50:0.95) = 36% |
+| [YOLOv3](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleDetection) | An object detection network that balances speed and accuracy. Compared with the author's original darknet implementation of YOLO v3, the PaddlePaddle implementation follows the paper [Bag of Tricks for Image Classification with Convolutional Neural Networks](https://arxiv.org/pdf/1812.01187.pdf), adding mixup, label_smooth and other tricks that raise accuracy (mAP(0.5:0.95)) by 4.7 absolute percentage points over the original; adding synchronized batch normalization on top of this brings the total improvement to 5.9 absolute percentage points. | MS-COCO | DarkNet based, mAP(0.50:0.95) = 38.9% |
+| [PyramidBox](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/face_detection) | **PyramidBox** **is a face detection model developed in-house by Baidu** that uses contextual information to tackle the detection of hard faces; the network is highly expressive and robust. It took first place on the WIDER Face dataset in March 2018 | WIDER FACE | mAP (Easy/Medium/Hard set) = 96.0%/ 94.8%/ 88.8% |
### Image Segmentation
@@ -62,8 +62,8 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
| Model | Overview | Dataset | Metrics |
| ------------------------------------------------------------ | ------------------------------------------------------------ | --------- | --------------- |
-| [ICNet](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/icnet) | Mainly for real-time semantic segmentation of images; balances speed and accuracy and is easy to deploy online | Cityscape | Mean IoU=67.0% |
-| [DeepLab V3+](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/deeplabv3%2B) | Fuses multi-scale information with an encoder-decoder while keeping the original dilated convolutions and the ASPP layer; its backbone uses the Xception model, improving the robustness and speed of semantic segmentation | Cityscape | Mean IoU=78.81% |
+| [ICNet](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/icnet) | Mainly for real-time semantic segmentation of images; balances speed and accuracy and is easy to deploy online | Cityscape | Mean IoU=67.0% |
+| [DeepLab V3+](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/deeplabv3%2B) | Fuses multi-scale information with an encoder-decoder while keeping the original dilated convolutions and the ASPP layer; its backbone uses the Xception model, improving the robustness and speed of semantic segmentation | Cityscape | Mean IoU=78.81% |
### Keypoint Detection
@@ -71,7 +71,7 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
| Model | Overview | Dataset | Metrics |
| ------------------------------------------------------------ | ------------------------------------------------------------ | ------------ | ------------ |
-| [Simple Baselines](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/human_pose_estimation) | Runner-up solution of the COCO 2018 keypoint detection task; a very simple network structure that achieves state-of-the-art results | COCO val2017 | AP = 72.7% |
+| [Simple Baselines](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/human_pose_estimation) | Runner-up solution of the COCO 2018 keypoint detection task; a very simple network structure that achieves state-of-the-art results | COCO val2017 | AP = 72.7% |
### Image Generation
@@ -79,13 +79,13 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
| Model | Overview | Dataset |
| ------------------------------------------------------------ | ------------------------------------------------------------ | ---------- |
-| [CGAN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleGAN) | Conditional generative adversarial network; a GAN with conditional constraints that conditions the model on extra information to guide the data generation process | Mnist |
-| [DCGAN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleGAN) | Deep convolutional generative adversarial network; combines GANs with convolutional networks to address the instability of GAN training | Mnist |
-| [Pix2Pix](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleGAN) | Image-to-image translation; converts one class of images into another using paired images, usable for style transfer | Cityscapes |
-| [CycleGAN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleGAN) | Image-to-image translation; converts one class of images into another using unpaired images, usable for style transfer | Cityscapes |
-| [StarGAN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleGAN) | Multi-domain attribute transfer; introduces an auxiliary classifier so a single discriminator can judge multiple attributes, usable for face attribute conversion | Celeba |
-| [AttGAN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleGAN) | Uses a classification loss and a reconstruction loss to ensure that only specific attributes change; usable for converting specific face attributes | Celeba |
-| [STGAN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleGAN) | Specific face attribute conversion; feeds in only the labels that change and introduces a GRU structure to better select the attributes to change | Celeba |
+| [CGAN](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleGAN) | Conditional generative adversarial network; a GAN with conditional constraints that conditions the model on extra information to guide the data generation process | Mnist |
+| [DCGAN](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleGAN) | Deep convolutional generative adversarial network; combines GANs with convolutional networks to address the instability of GAN training | Mnist |
+| [Pix2Pix](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleGAN) | Image-to-image translation; converts one class of images into another using paired images, usable for style transfer | Cityscapes |
+| [CycleGAN](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleGAN) | Image-to-image translation; converts one class of images into another using unpaired images, usable for style transfer | Cityscapes |
+| [StarGAN](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleGAN) | Multi-domain attribute transfer; introduces an auxiliary classifier so a single discriminator can judge multiple attributes, usable for face attribute conversion | Celeba |
+| [AttGAN](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleGAN) | Uses a classification loss and a reconstruction loss to ensure that only specific attributes change; usable for converting specific face attributes | Celeba |
+| [STGAN](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleGAN) | Specific face attribute conversion; feeds in only the labels that change and introduces a GRU structure to better select the attributes to change | Celeba |
### Scene Text Recognition
@@ -93,8 +93,8 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
| Model | Overview | Dataset | Metrics |
| ------------------------------------------------------------ | ------------------------------------------------------------ | -------------------------- | -------------- |
-| [CRNN-CTC](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/ocr_recognition) | Uses a CTC model to recognize a single line of English characters in an image; an end-to-end method for recognizing text-line images | Images of single-line English strings of variable length | Error rate = 22.3% |
-| [OCR Attention](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/ocr_recognition) | Uses attention to recognize a single line of English characters in an image; for end-to-end natural scene text recognition | Images of single-line English strings of variable length | Error rate = 15.8% |
+| [CRNN-CTC](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/ocr_recognition) | Uses a CTC model to recognize a single line of English characters in an image; an end-to-end method for recognizing text-line images | Images of single-line English strings of variable length | Error rate = 22.3% |
+| [OCR Attention](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/ocr_recognition) | Uses attention to recognize a single line of English characters in an image; for end-to-end natural scene text recognition | Images of single-line English strings of variable length | Error rate = 15.8% |
### Metric Learning
@@ -102,11 +102,11 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
| Model | Overview | Dataset | Metric: Recall@Rank-1 (trained with arcmargin) |
| ------------------------------------------------------------ | --------------------------------------------------------- | ------------------------------ | --------------------------------------------- |
-| [ResNet50 (no fine-tuning)](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/metric_learning) | Feature model trained with arcmargin loss | Stanford Online Product(SOP) | 78.11% |
-| [ResNet50 fine-tuned with triplet](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/metric_learning) | Feature model fine-tuned with triplet loss on top of arcmargin loss | Stanford Online Product(SOP) | 79.21% |
-| [ResNet50 fine-tuned with quadruplet](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/metric_learning) | Feature model fine-tuned with quadruplet loss on top of arcmargin loss | Stanford Online Product(SOP) | 79.59% |
-| [ResNet50 fine-tuned with eml](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/metric_learning) | Feature model fine-tuned with eml loss on top of arcmargin loss | Stanford Online Product(SOP) | 80.11% |
-| [ResNet50 fine-tuned with npairs](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/metric_learning) | Feature model fine-tuned with npairs loss on top of arcmargin loss | Stanford Online Product(SOP) | 79.81% |
+| [ResNet50 (no fine-tuning)](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/metric_learning) | Feature model trained with arcmargin loss | Stanford Online Product(SOP) | 78.11% |
+| [ResNet50 fine-tuned with triplet](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/metric_learning) | Feature model fine-tuned with triplet loss on top of arcmargin loss | Stanford Online Product(SOP) | 79.21% |
+| [ResNet50 fine-tuned with quadruplet](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/metric_learning) | Feature model fine-tuned with quadruplet loss on top of arcmargin loss | Stanford Online Product(SOP) | 79.59% |
+| [ResNet50 fine-tuned with eml](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/metric_learning) | Feature model fine-tuned with eml loss on top of arcmargin loss | Stanford Online Product(SOP) | 80.11% |
+| [ResNet50 fine-tuned with npairs](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/metric_learning) | Feature model fine-tuned with npairs loss on top of arcmargin loss | Stanford Online Product(SOP) | 79.81% |
### Video Classification and Action Localization
@@ -114,14 +114,14 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
| Model | Overview | Dataset | Metrics |
| ------------------------------------------------------------ | ------------------------------------------------------------ | -------------------------- | ----------- |
-| [TSN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleVideo) | Classic 2D-CNN based solution proposed at ECCV'16 | Kinetics-400 | Top-1 = 67% |
-| [Non-Local](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleVideo) | Model for non-local association modeling in video | Kinetics-400 | Top-1 = 74% |
-| [stNet](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleVideo) | Joint spatio-temporal video modeling method proposed at AAAI'19 | Kinetics-400 | Top-1 = 69% |
-| [TSM](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleVideo) | Simple and efficient spatio-temporal video modeling method based on temporal shift | Kinetics-400 | Top-1 = 70% |
-| [Attention LSTM](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleVideo) | A commonly used model; fast with high accuracy | Youtube-8M | GAP = 86% |
-| [Attention Cluster](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleVideo) | Attention-cluster fusion method for multimodal video features, proposed at CVPR'18 | Youtube-8M | GAP = 84% |
-| [NeXtVlad](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleVideo) | Best single model of the 2nd YouTube-8M challenge | Youtube-8M | GAP = 87% |
-| [C-TCN](https://github.com/PaddlePaddle/models/tree/develop/PaddleCV/PaddleVideo) | Winning solution of the 2018 ActivityNet challenge | ActivityNet1.3 | MAP=31% |
+| [TSN](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleVideo) | Classic 2D-CNN based solution proposed at ECCV'16 | Kinetics-400 | Top-1 = 67% |
+| [Non-Local](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleVideo) | Model for non-local association modeling in video | Kinetics-400 | Top-1 = 74% |
+| [stNet](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleVideo) | Joint spatio-temporal video modeling method proposed at AAAI'19 | Kinetics-400 | Top-1 = 69% |
+| [TSM](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleVideo) | Simple and efficient spatio-temporal video modeling method based on temporal shift | Kinetics-400 | Top-1 = 70% |
+| [Attention LSTM](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleVideo) | A commonly used model; fast with high accuracy | Youtube-8M | GAP = 86% |
+| [Attention Cluster](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleVideo) | Attention-cluster fusion method for multimodal video features, proposed at CVPR'18 | Youtube-8M | GAP = 84% |
+| [NeXtVlad](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleVideo) | Best single model of the 2nd YouTube-8M challenge | Youtube-8M | GAP = 87% |
+| [C-TCN](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleCV/PaddleVideo) | Winning solution of the 2018 ActivityNet challenge | ActivityNet1.3 | MAP=31% |
## PaddleNLP
@@ -129,7 +129,7 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
#### Lexical Analysis
-[LAC(Lexical Analysis of Chinese)](https://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/lexical_analysis) is a Chinese-oriented lexical analysis model developed in-house by Baidu: the input is a string, and the output is the word boundaries of the sentence together with part-of-speech and entity categories.
+[LAC(Lexical Analysis of Chinese)](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleNLP/lexical_analysis) is a Chinese-oriented lexical analysis model developed in-house by Baidu: the input is a string, and the output is the word boundaries of the sentence together with part-of-speech and entity categories.
| **Model** | **Precision** | **Recall** | **F1-score** |
| ---------------- | ------------- | ---------- | ------------ |
@@ -139,7 +139,7 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
#### Language Model
-[LSTM-based language model task](https://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/language_model): given an input word sequence (word-segmented for Chinese, tokenized for English), computes its PPL (language model perplexity, used to indicate how fluent a sentence is).
+[LSTM-based language model task](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleNLP/language_model): given an input word sequence (word-segmented for Chinese, tokenized for English), computes its PPL (language model perplexity, used to indicate how fluent a sentence is).
| **large config** | **train** | **valid** | **test** |
| ---------------- | --------- | --------- | -------- |
@@ -150,7 +150,7 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
#### Sentiment Analysis
-[Senta(Sentiment Classification)](https://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/sentiment_classification) is the sentiment analysis model of Baidu's AI open platform and a Chinese-oriented model developed in-house by Baidu; it is currently the best Chinese sentiment analysis model.
+[Senta(Sentiment Classification)](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleNLP/sentiment_classification) is the sentiment analysis model of Baidu's AI open platform and a Chinese-oriented model developed in-house by Baidu; it is currently the best Chinese sentiment analysis model.
| **Model** | **dev** | **test** | **Model (finetune)** | **dev** | **test** |
| ------------- | ------- | -------- | ---------------------------- | ------- | -------- |
@@ -164,7 +164,7 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
#### Dialogue Emotion Detection
-[EmoTect(Emotion Detection)](https://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/emotion_detection) focuses on recognizing user emotions in intelligent dialogue scenarios, and open-sources pretrained models trained on Baidu's massive data.
+[EmoTect(Emotion Detection)](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleNLP/emotion_detection) focuses on recognizing user emotions in intelligent dialogue scenarios, and open-sources pretrained models trained on Baidu's massive data.
| **Model** | **Chitchat** | **Customer service** | **Weibo** |
| -------- | -------- | -------- | -------- |
@@ -178,7 +178,7 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
#### Reading Comprehension
-[MRC(Machine Reading Comprehension)](https://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/reading_comprehension): machine reading comprehension (MRC) is one of the key tasks in natural language processing (NLP). The open-sourced DuReader upgrades the classic BiDAF reading comprehension model: it removes char-level embeddings, uses a [pointer network](https://arxiv.org/abs/1506.03134) in the prediction layer, and draws on some network structures from [R-NET](https://www.microsoft.com/en-us/research/wp-content/uploads/2017/05/r-net.pdf), bringing a large improvement in results.
+[MRC(Machine Reading Comprehension)](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleNLP/reading_comprehension): machine reading comprehension (MRC) is one of the key tasks in natural language processing (NLP). The open-sourced DuReader upgrades the classic BiDAF reading comprehension model: it removes char-level embeddings, uses a [pointer network](https://arxiv.org/abs/1506.03134) in the prediction layer, and draws on some network structures from [R-NET](https://www.microsoft.com/en-us/research/wp-content/uploads/2017/05/r-net.pdf), bringing a large improvement in results.
| **Model** | **Dev ROUGE-L** | **Test ROUGE-L** |
| -------------------------------------------------------- | --------------- | ---------------- |
@@ -266,7 +266,7 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
#### SimNet
-[SimNet(Similarity Net)](https://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/similarity_net) is a framework for computing short-text similarity: given two texts provided by the user, it computes a similarity score.
+[SimNet(Similarity Net)](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleNLP/similarity_net) is a framework for computing short-text similarity: given two texts provided by the user, it computes a similarity score.
| **Model** | **Baidu Zhidao** | **ECOM** | **QQSIM** | **UNICOM** | **LCQMC** |
| ------------ | ------------ | -------- | --------- | ---------- | --------- |
@@ -277,7 +277,7 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
#### Machine Translation
-[MT(machine translation)](https://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/neural_machine_translation/transformer): machine translation is the process of using a computer to convert one natural language (the source language) into another (the target language); the input is a source-language sentence and the output is the corresponding target-language sentence.
+[MT(machine translation)](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleNLP/neural_machine_translation/transformer): machine translation is the process of using a computer to convert one natural language (the source language) into another (the target language); the input is a source-language sentence and the output is the corresponding target-language sentence.
| **Test set** | **newstest2014** | **newstest2015** | **newstest2016** |
| ---------- | ---------------- | ---------------- | ---------------- |
@@ -286,7 +286,7 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
#### Automatic Dialogue Evaluation
-[Auto Dialogue Evaluation](https://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/dialogue_model_toolkit/auto_dialogue_evaluation) is mainly used to evaluate the response quality of open-domain dialogue systems; it helps companies or individuals quickly assess dialogue system response quality and reduces the cost of human evaluation.
+[Auto Dialogue Evaluation](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleNLP/dialogue_model_toolkit/auto_dialogue_evaluation) is mainly used to evaluate the response quality of open-domain dialogue systems; it helps companies or individuals quickly assess dialogue system response quality and reduces the cost of human evaluation.
After fine-tuning with a small amount of labeled data, the Spearman correlation coefficients between the automatic scores and human scores are shown in the table below.
@@ -296,7 +296,7 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
#### Dialogue General Understanding
-[DGU(Dialogue General Understanding)](https://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/dialogue_model_toolkit/dialogue_general_understanding) provides model training pipelines tailored to dialogue-understanding datasets, supporting classification, multi-label classification, sequence labeling and other tasks; users can customize models for their own datasets.
+[DGU(Dialogue General Understanding)](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleNLP/dialogue_model_toolkit/dialogue_general_understanding) provides model training pipelines tailored to dialogue-understanding datasets, supporting classification, multi-label classification, sequence labeling and other tasks; users can customize models for their own datasets.
| **task_name** | **udc** | **udc** | **udc** | **atis_slot** | **dstc2** | **atis_intent** | **swda** | **mrda** |
| ------------ | ------- | ------- | ------- | ------------- | ---------- | --------------- | -------- | -------- |
@@ -309,7 +309,7 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
#### DAM
-[Deep Attention Matching (DAM)](https://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/dialogue_model_toolkit/deep_attention_matching) is an open-domain multi-turn dialogue matching model. Given the multi-turn dialogue history and candidate responses, it ranks the most suitable reply.
+[Deep Attention Matching (DAM)](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleNLP/dialogue_model_toolkit/deep_attention_matching) is an open-domain multi-turn dialogue matching model. Given the multi-turn dialogue history and candidate responses, it ranks the most suitable reply.
| | Ubuntu Corpus | Douban Conversation Corpus | | | | | | | | |
| ---- | ------------- | -------------------------- | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- |
@@ -331,25 +331,16 @@ PaddlePaddle 提供了丰富的计算单元,使得用户可以采用模块化
| Model | Overview |
| ------------------------------------------------------------ | ------------------------------------------------------------ |
-| [TagSpace](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec) | Industrial-scale tag recommendation; concrete application scenarios include tag recommendation for news feeds |
-| [GRU4Rec](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec) | First applied RNNs (GRU) to session-based recommendation, with clear gains over traditional KNN and matrix factorization |
-| [SequenceSemanticRetrieval](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec) | Follows the ideas of the reference paper, predicting user behavior at multiple time granularities |
-| [DeepCTR](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec) | Implements only the DNN part of the model described in the DeepFM paper; DeepFM itself is given in other examples |
-| [Multiview-Simnet](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec) | Multi-view based; merges multiple feature views of users and items into one unified model |
-| [Word2Vec](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec) | word2vec model in skip-gram mode |
-| [GraphNeuralNetwork](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec) | Session-based recommender built on a graph neural network; better mines the rich transition patterns among items and generates accurate latent user vector representations |
-| [DeepInterestNetwork](https://github.com/PaddlePaddle/models/tree/develop/PaddleRec) | DIN uses an interest activation module (Activation Unit) to activate the user's historically clicked items with information about the candidate ads being estimated, thereby extracting the user's interests relevant to the current prediction target. |
+| [TagSpace](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleRec) | Industrial-scale tag recommendation; concrete application scenarios include tag recommendation for news feeds |
+| [GRU4Rec](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleRec) | First applied RNNs (GRU) to session-based recommendation, with clear gains over traditional KNN and matrix factorization |
+| [SequenceSemanticRetrieval](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleRec) | Follows the ideas of the reference paper, predicting user behavior at multiple time granularities |
+| [DeepCTR](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleRec) | Implements only the DNN part of the model described in the DeepFM paper; DeepFM itself is given in other examples |
+| [Multiview-Simnet](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleRec) | Multi-view based; merges multiple feature views of users and items into one unified model |
+| [Word2Vec](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleRec) | word2vec model in skip-gram mode |
+| [GraphNeuralNetwork](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleRec) | Session-based recommender built on a graph neural network; better mines the rich transition patterns among items and generates accurate latent user vector representations |
+| [DeepInterestNetwork](https://github.com/PaddlePaddle/models/tree/release/1.5/PaddleRec) | DIN uses an interest activation module (Activation Unit) to activate the user's historically clicked items with information about the candidate ads being estimated, thereby extracting the user's interests relevant to the current prediction target. |
-## Other Models
-
-| Model | Overview |
-| ------------------------------------------------------------ | ------------------------------------------------------------ |
-| [DeepASR](https://github.com/PaddlePaddle/models/blob/develop/PaddleSpeech/DeepASR/README_cn.md) | Configures and trains the acoustic model for speech recognition with the Fluid framework, and integrates Kaldi's decoder |
-| [DQN](https://github.com/PaddlePaddle/models/blob/develop/PaddleRL/DeepQNetwork/README_cn.md) | Value-based reinforcement learning algorithm; the first model to successfully combine deep learning with reinforcement learning |
-| [DoubleDQN](https://github.com/PaddlePaddle/models/blob/develop/PaddleRL/DeepQNetwork/README_cn.md) | Applies the Double Q idea to DQN to address the over-estimation problem |
-| [DuelingDQN](https://github.com/PaddlePaddle/models/blob/develop/PaddleRL/DeepQNetwork/README_cn.md) | Improves the DQN model and boosts its performance |
-
## License
This tutorial is contributed by [PaddlePaddle](https://github.com/PaddlePaddle/Paddle) and licensed under the [Apache-2.0 license](LICENSE).
diff --git a/fluid/AutoDL/LRC/README.md b/fluid/AutoDL/LRC/README.md
deleted file mode 100644
index 546cb19169b965af5a3d0d41c903e318d4dfc64a..0000000000000000000000000000000000000000
--- a/fluid/AutoDL/LRC/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [AutoDL/LRC](../../../AutoDL/LRC).
diff --git a/fluid/AutoDL/LRC/README_cn.md b/fluid/AutoDL/LRC/README_cn.md
deleted file mode 100644
index 6c87fd2d1cb5f6f4d187d665548ed7c74746bf10..0000000000000000000000000000000000000000
--- a/fluid/AutoDL/LRC/README_cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-This project has been migrated. Please visit [AutoDL/LRC](../../../AutoDL/LRC) to browse it.
diff --git a/fluid/DeepASR/README.md b/fluid/DeepASR/README.md
deleted file mode 100644
index b7d916c58649790055b2ddbdd32e914d02f14ebf..0000000000000000000000000000000000000000
--- a/fluid/DeepASR/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleSpeech/DeepASR](../../PaddleSpeech/DeepASR).
diff --git a/fluid/DeepASR/README_cn.md b/fluid/DeepASR/README_cn.md
deleted file mode 100644
index 51b0e724c810165810154915f41159d478398234..0000000000000000000000000000000000000000
--- a/fluid/DeepASR/README_cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-This project has been migrated. Please visit [PaddleSpeech/DeepASR](../../PaddleSpeech/DeepASR) to browse it.
diff --git a/fluid/DeepQNetwork/README.md b/fluid/DeepQNetwork/README.md
deleted file mode 100644
index f82d57f12cc4e97dae99d5a711ee495a9895aa91..0000000000000000000000000000000000000000
--- a/fluid/DeepQNetwork/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleRL/DeepQNetwork](../../PaddleRL/DeepQNetwork).
diff --git a/fluid/DeepQNetwork/README_cn.md b/fluid/DeepQNetwork/README_cn.md
deleted file mode 100644
index b90f215b2d8e0734db5a41b00ab02260021c8cf6..0000000000000000000000000000000000000000
--- a/fluid/DeepQNetwork/README_cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-This project has been migrated. Please visit [PaddleRL/DeepQNetwork](../../PaddleRL/DeepQNetwork) to browse it.
diff --git a/fluid/PaddleCV/HiNAS_models/README.md b/fluid/PaddleCV/HiNAS_models/README.md
deleted file mode 100644
index 1e33fea89e2d4e3a9b9ef2cad81012d082ccc504..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/HiNAS_models/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [AutoDL/HiNAS_models](../../../AutoDL/HiNAS_models).
diff --git a/fluid/PaddleCV/HiNAS_models/README_cn.md b/fluid/PaddleCV/HiNAS_models/README_cn.md
deleted file mode 100644
index 8ab7149b0aaef04c226aff0302e4282b0172c113..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/HiNAS_models/README_cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-This project has been migrated. Please visit [AutoDL/HiNAS_models](../../../AutoDL/HiNAS_models) to browse it.
diff --git a/fluid/PaddleCV/caffe2fluid/README.md b/fluid/PaddleCV/caffe2fluid/README.md
deleted file mode 100644
index 78702204ba32ffa63bcab4aef999267a5d7c1078..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/caffe2fluid/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [X2Paddle](https://github.com/PaddlePaddle/X2Paddle).
diff --git a/fluid/PaddleCV/deeplabv3+/README.md b/fluid/PaddleCV/deeplabv3+/README.md
deleted file mode 100644
index 94f81a780a21bda7e230bf513be427b08a6eaca2..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/deeplabv3+/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-This project has been migrated. Please visit [PaddleCV/deeplabv3+](../../../PaddleCV/deeplabv3+) to browse it.
diff --git a/fluid/PaddleCV/face_detection/README.md b/fluid/PaddleCV/face_detection/README.md
deleted file mode 100644
index e9319716f4f660ff75b571337575d8cd53c03a13..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/face_detection/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-This project has been migrated. Please visit [PaddleCV/face_detection](../../../PaddleCV/face_detection) to browse it.
diff --git a/fluid/PaddleCV/face_detection/README_cn.md b/fluid/PaddleCV/face_detection/README_cn.md
deleted file mode 100644
index e9319716f4f660ff75b571337575d8cd53c03a13..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/face_detection/README_cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/face_detection](../../../PaddleCV/face_detection) to browse it.
diff --git a/fluid/PaddleCV/gan/c_gan/README.md b/fluid/PaddleCV/gan/c_gan/README.md
deleted file mode 100644
index b36f7084c0a67ce35cc7e7a73333443919a98775..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/gan/c_gan/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/gan/c_gan](../../../../PaddleCV/gan/c_gan) to browse it.
diff --git a/fluid/PaddleCV/gan/cycle_gan/README.md b/fluid/PaddleCV/gan/cycle_gan/README.md
deleted file mode 100644
index 5db6d49b2cbdaa6af4224bc0707593908a05352d..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/gan/cycle_gan/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/gan/cycle_gan](../../../../PaddleCV/gan/cycle_gan) to browse it.
diff --git a/fluid/PaddleCV/human_pose_estimation/README.md b/fluid/PaddleCV/human_pose_estimation/README.md
deleted file mode 100644
index 6ced2b3b2cd19d413f2c8f2b139725c2e5ea14fc..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/human_pose_estimation/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleCV/human_pose_estimation](../../../PaddleCV/human_pose_estimation).
diff --git a/fluid/PaddleCV/human_pose_estimation/README_cn.md b/fluid/PaddleCV/human_pose_estimation/README_cn.md
deleted file mode 100644
index 84120d0c568b13bfbccead92cd7f9211193f7669..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/human_pose_estimation/README_cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/human_pose_estimation](../../../PaddleCV/human_pose_estimation) to browse it.
diff --git a/fluid/PaddleCV/icnet/README.md b/fluid/PaddleCV/icnet/README.md
deleted file mode 100644
index 72a3a91b0ae52894c641e61b489ff7a04c6f8106..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/icnet/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/icnet](../../../PaddleCV/icnet) to browse it.
diff --git a/fluid/PaddleCV/image_classification/README.md b/fluid/PaddleCV/image_classification/README.md
deleted file mode 100644
index 55392b8ac91e4a8c24d2f2d6ac63d695cb58e146..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleCV/image_classification](../../../PaddleCV/image_classification).
diff --git a/fluid/PaddleCV/image_classification/README_cn.md b/fluid/PaddleCV/image_classification/README_cn.md
deleted file mode 100644
index bb8850cff5fbd658addaba488301783d0e510a6c..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/README_cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/image_classification](../../../PaddleCV/image_classification) to browse it.
diff --git a/fluid/PaddleCV/image_classification/README_ngraph.md b/fluid/PaddleCV/image_classification/README_ngraph.md
deleted file mode 100644
index 55392b8ac91e4a8c24d2f2d6ac63d695cb58e146..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/image_classification/README_ngraph.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleCV/image_classification](../../../PaddleCV/image_classification).
diff --git a/fluid/PaddleCV/metric_learning/README.md b/fluid/PaddleCV/metric_learning/README.md
deleted file mode 100644
index 6afd28a457c639af25337cc02a6b5b64658845ff..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/metric_learning/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleCV/metric_learning](../../../PaddleCV/metric_learning).
diff --git a/fluid/PaddleCV/metric_learning/README_cn.md b/fluid/PaddleCV/metric_learning/README_cn.md
deleted file mode 100644
index 72417ed9badfc4858f314f143dd069d4ff6a0e6a..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/metric_learning/README_cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/metric_learning](../../../PaddleCV/metric_learning) to browse it.
diff --git a/fluid/PaddleCV/object_detection/README.md b/fluid/PaddleCV/object_detection/README.md
deleted file mode 100644
index 99b0f8db58cc8e2ef130c0054b40bf746b5ac2c8..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/object_detection/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleCV/object_detection](../../../PaddleCV/object_detection).
diff --git a/fluid/PaddleCV/object_detection/README_cn.md b/fluid/PaddleCV/object_detection/README_cn.md
deleted file mode 100644
index d3af497b9aecf23db4976970fbe16bc6c99bf6ff..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/object_detection/README_cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/object_detection](../../../PaddleCV/object_detection) to browse it.
diff --git a/fluid/PaddleCV/object_detection/README_quant.md b/fluid/PaddleCV/object_detection/README_quant.md
deleted file mode 100644
index 99b0f8db58cc8e2ef130c0054b40bf746b5ac2c8..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/object_detection/README_quant.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleCV/object_detection](../../../PaddleCV/object_detection).
diff --git a/fluid/PaddleCV/ocr_recognition/README.md b/fluid/PaddleCV/ocr_recognition/README.md
deleted file mode 100644
index aa675d6048ecdb025ef2273ee755354152adc32e..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/ocr_recognition/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/ocr_recognition](../../../PaddleCV/ocr_recognition) to browse it.
diff --git a/fluid/PaddleCV/rcnn/README.md b/fluid/PaddleCV/rcnn/README.md
deleted file mode 100644
index 1e96b373a0ad13424691921dd17e8f251b9cdfc7..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/rcnn/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleCV/rcnn](../../../PaddleCV/rcnn).
diff --git a/fluid/PaddleCV/rcnn/README_cn.md b/fluid/PaddleCV/rcnn/README_cn.md
deleted file mode 100644
index 83d5e0fc06448086e8807587798e804e3c634f97..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/rcnn/README_cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/rcnn](../../../PaddleCV/rcnn) to browse it.
diff --git a/fluid/PaddleCV/video/README.md b/fluid/PaddleCV/video/README.md
deleted file mode 100644
index bbef3af1c6f6715e4415041939e046d66f02f58d..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/video/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/video](../../../PaddleCV/video) to browse it.
diff --git a/fluid/PaddleCV/video/models/attention_cluster/README.md b/fluid/PaddleCV/video/models/attention_cluster/README.md
deleted file mode 100644
index 95056a71cb34304788168e15479a4aa1e2ecf3af..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/video/models/attention_cluster/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/video/models/attention_cluster](../../../../../PaddleCV/video/models/attention_cluster/) to browse it.
diff --git a/fluid/PaddleCV/video/models/attention_lstm/README.md b/fluid/PaddleCV/video/models/attention_lstm/README.md
deleted file mode 100644
index 044c88cbecafdc880ae0cd213f6df77a8ce1715f..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/video/models/attention_lstm/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/video/models/attention_lstm](../../../../../PaddleCV/video/models/attention_lstm/) to browse it.
diff --git a/fluid/PaddleCV/video/models/nextvlad/README.md b/fluid/PaddleCV/video/models/nextvlad/README.md
deleted file mode 100644
index ad3a926dd83c8d8825224c404dda76fff5238cbe..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/video/models/nextvlad/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/video/models/nextvlad](../../../../../PaddleCV/video/models/nextvlad/) to browse it.
diff --git a/fluid/PaddleCV/video/models/nonlocal_model/README.md b/fluid/PaddleCV/video/models/nonlocal_model/README.md
deleted file mode 100644
index 4f72316b5e761c7e2e421f76fc3f743ab4ac12fb..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/video/models/nonlocal_model/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/video/models/nonlocal_model](../../../../../PaddleCV/video/models/nonlocal_model/) to browse it.
diff --git a/fluid/PaddleCV/video/models/stnet/README.md b/fluid/PaddleCV/video/models/stnet/README.md
deleted file mode 100644
index 15cff5af0909a93c8cf244629878582aa6c2d12f..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/video/models/stnet/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/video/models/stnet](../../../../../PaddleCV/video/models/stnet/) to browse it.
diff --git a/fluid/PaddleCV/video/models/tsm/README.md b/fluid/PaddleCV/video/models/tsm/README.md
deleted file mode 100644
index c93c56618aff1cfd331b2c1bd9fccfbb8a4c7a08..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/video/models/tsm/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/video/models/tsm](../../../../../PaddleCV/video/models/tsm/) to browse it.
diff --git a/fluid/PaddleCV/video/models/tsn/README.md b/fluid/PaddleCV/video/models/tsn/README.md
deleted file mode 100644
index 8b4a986a63ea7746a3c7a648cd9d535803784ca3..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/video/models/tsn/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/video/models/tsn](../../../../../PaddleCV/video/models/tsn/) to browse it.
diff --git a/fluid/PaddleCV/video_classification/README.md b/fluid/PaddleCV/video_classification/README.md
deleted file mode 100644
index bb145d1e7d4538f8b1a6df5cf547d9c5ef5ae8c5..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/video_classification/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleCV/video_classification](../../../PaddleCV/video_classification).
diff --git a/fluid/PaddleCV/yolov3/README.md b/fluid/PaddleCV/yolov3/README.md
deleted file mode 100644
index d05d89ce182a23b2f74e2633f7ada32fc6390477..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/yolov3/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleCV/yolov3](../../../PaddleCV/yolov3).
diff --git a/fluid/PaddleCV/yolov3/README_cn.md b/fluid/PaddleCV/yolov3/README_cn.md
deleted file mode 100644
index 89080d674df265d37a3601b579622adf1829c747..0000000000000000000000000000000000000000
--- a/fluid/PaddleCV/yolov3/README_cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleCV/yolov3](../../../PaddleCV/yolov3) to browse it.
diff --git a/fluid/PaddleNLP/chinese_ner/README.md b/fluid/PaddleNLP/chinese_ner/README.md
deleted file mode 100644
index 06398c9164e5a39dbd444c78ebddb3ae46093574..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/chinese_ner/README.md
+++ /dev/null
@@ -1,3 +0,0 @@
-
-
-Hello, this project has been migrated. Please go to [PaddleNLP/unarchived/chinese_ner](../../../PaddleNLP/unarchived/chinese_ner/) to browse it.
diff --git a/fluid/PaddleNLP/deep_attention_matching_net/README.md b/fluid/PaddleNLP/deep_attention_matching_net/README.md
deleted file mode 100644
index 7f4995ff102baadd095d31560c274ba9d57eea9c..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/deep_attention_matching_net/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleNLP/unarchived/deep_attention_matching_net](../../../PaddleNLP/unarchived/deep_attention_matching_net).
diff --git a/fluid/PaddleNLP/language_model/gru/README.md b/fluid/PaddleNLP/language_model/gru/README.md
deleted file mode 100644
index 15176770b1e2df48790386ab0137fcbe7c5d4200..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/language_model/gru/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleNLP/unarchived/language_model/gru](../../../../PaddleNLP/unarchived/language_model/gru) to browse it.
diff --git a/fluid/PaddleNLP/language_model/lstm/README.md b/fluid/PaddleNLP/language_model/lstm/README.md
deleted file mode 100644
index 8358bea7d81494c81c14a833e653a36d5eceadfb..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/language_model/lstm/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleNLP/unarchived/language_model/lstm](../../../../PaddleNLP/unarchived/language_model/lstm) to browse it.
diff --git a/fluid/PaddleNLP/machine_reading_comprehension/README.md b/fluid/PaddleNLP/machine_reading_comprehension/README.md
deleted file mode 100644
index e9642bc36abc0cfa9b3f0fed27c044b706eb0074..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/machine_reading_comprehension/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleNLP/unarchived/machine_reading_comprehension](../../../PaddleNLP/unarchived/machine_reading_comprehension).
diff --git a/fluid/PaddleNLP/neural_machine_translation/README.md b/fluid/PaddleNLP/neural_machine_translation/README.md
deleted file mode 100644
index 0117e6214f596b87baf097724526b23db23820f8..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/neural_machine_translation/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleNLP/unarchived/neural_machine_translation](../../../PaddleNLP/unarchived/neural_machine_translation).
diff --git a/fluid/PaddleNLP/neural_machine_translation/rnn_search/README.md b/fluid/PaddleNLP/neural_machine_translation/rnn_search/README.md
deleted file mode 100644
index 005fb7e2e56c19583bfbeb7997c25fbef5f77578..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/neural_machine_translation/rnn_search/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleNLP/neural_machine_translation/rnn_search](../../../../PaddleNLP/neural_machine_translation/rnn_search) to browse it.
diff --git a/fluid/PaddleNLP/neural_machine_translation/transformer/README.md b/fluid/PaddleNLP/neural_machine_translation/transformer/README.md
deleted file mode 100644
index 47a4f78bbb1e18e55442807b0701aef08f370fc0..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/neural_machine_translation/transformer/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleNLP/neural_machine_translation/transformer](../../../../PaddleNLP/neural_machine_translation/transformer) to browse it.
diff --git a/fluid/PaddleNLP/sequence_tagging_for_ner/README.md b/fluid/PaddleNLP/sequence_tagging_for_ner/README.md
deleted file mode 100644
index 772c4249c2c635ed9a6070b72028ce3a78a6d548..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/sequence_tagging_for_ner/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleNLP/unarchived/sequence_tagging_for_ner](../../../PaddleNLP/unarchived/sequence_tagging_for_ner) to browse it.
diff --git a/fluid/PaddleNLP/text_classification/README.md b/fluid/PaddleNLP/text_classification/README.md
deleted file mode 100644
index 48b09ed2c4245a2efe8f97a08a3c80f573e94336..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/text_classification/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleNLP/unarchived/text_classification](../../../PaddleNLP/unarchived/text_classification) to browse it.
diff --git a/fluid/PaddleNLP/text_matching_on_quora/README.md b/fluid/PaddleNLP/text_matching_on_quora/README.md
deleted file mode 100644
index b735660f56cb775a582f393cb06eb725b8ad36e7..0000000000000000000000000000000000000000
--- a/fluid/PaddleNLP/text_matching_on_quora/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleNLP/unarchived/text_matching_on_quora](../../../PaddleNLP/unarchived/text_matching_on_quora).
diff --git a/fluid/PaddleRec/ctr/README.cn.md b/fluid/PaddleRec/ctr/README.cn.md
deleted file mode 100644
index 81cd20625701c13fce3a3f8ad119663a6e5c162c..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/ctr/README.cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleRec/ctr](../../../PaddleRec/ctr) to browse it.
diff --git a/fluid/PaddleRec/ctr/README.md b/fluid/PaddleRec/ctr/README.md
deleted file mode 100644
index 1aceff1350c2c28b13ec92ccf82e321bb3ddda04..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/ctr/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleRec/ctr](../../../PaddleRec/ctr).
diff --git a/fluid/PaddleRec/din/README.md b/fluid/PaddleRec/din/README.md
deleted file mode 100644
index 6e2df0301cf20434dc3479da8c93644f764c5c42..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/din/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleRec/din](../../../PaddleRec/din) to browse it.
diff --git a/fluid/PaddleRec/gnn/README.md b/fluid/PaddleRec/gnn/README.md
deleted file mode 100644
index 1ac21f3ee4712ead33f44322447d30fe5aa45918..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/gnn/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleRec/gnn](../../../PaddleRec/gnn) to browse it.
diff --git a/fluid/PaddleRec/gru4rec/README.md b/fluid/PaddleRec/gru4rec/README.md
deleted file mode 100644
index 9fe28eba00760b67c532e4624a5722cfd62feb57..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/gru4rec/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleRec/gru4rec](../../../PaddleRec/gru4rec) to browse it.
diff --git a/fluid/PaddleRec/multiview_simnet/README.cn.md b/fluid/PaddleRec/multiview_simnet/README.cn.md
deleted file mode 100644
index 9cf8e27bba4775800498c25b550f7bb19479f074..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/multiview_simnet/README.cn.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleRec/multiview_simnet](../../../PaddleRec/multiview_simnet) to browse it.
diff --git a/fluid/PaddleRec/multiview_simnet/README.md b/fluid/PaddleRec/multiview_simnet/README.md
deleted file mode 100644
index 8fba8e606256ad7ad65ec429b68e967809bc6a51..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/multiview_simnet/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleRec/multiview_simnet](../../../PaddleRec/multiview_simnet).
diff --git a/fluid/PaddleRec/ssr/README.md b/fluid/PaddleRec/ssr/README.md
deleted file mode 100644
index 15111907ccc21942c134a2a614ad341c37710272..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/ssr/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleRec/ssr](../../../PaddleRec/ssr) to browse it.
diff --git a/fluid/PaddleRec/tagspace/README.md b/fluid/PaddleRec/tagspace/README.md
deleted file mode 100644
index 67e3f88f7a2245829d0efbfe23a6566a0745fe41..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/tagspace/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleRec/tagspace](../../../PaddleRec/tagspace) to browse it.
diff --git a/fluid/PaddleRec/word2vec/README.md b/fluid/PaddleRec/word2vec/README.md
deleted file mode 100644
index 7504ff9c332bf86f606d6d8770cefb325fc29ce0..0000000000000000000000000000000000000000
--- a/fluid/PaddleRec/word2vec/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleRec/word2vec](../../../PaddleRec/word2vec) to browse it.
diff --git a/fluid/adversarial/README.md b/fluid/adversarial/README.md
deleted file mode 100644
index b43046d174c6fa7cc9517c043601d5a86e53604a..0000000000000000000000000000000000000000
--- a/fluid/adversarial/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-
-Hi!
-
-This directory has been deprecated.
-
-Please visit the project at [PaddleCV/adversarial](../../PaddleCV/adversarial).
diff --git a/fluid/policy_gradient/README.md b/fluid/policy_gradient/README.md
deleted file mode 100644
index b6ac95d0fba6bbb7552671fbc6e80d052a648045..0000000000000000000000000000000000000000
--- a/fluid/policy_gradient/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-
-Hello, this project has been migrated. Please go to [PaddleRL/policy_gradient](../../PaddleRL/policy_gradient) to browse it.
diff --git a/legacy/README.cn.md b/legacy/README.cn.md
deleted file mode 100644
index 72fb35ff3b239d8fa5e226f84aa09f084f593697..0000000000000000000000000000000000000000
--- a/legacy/README.cn.md
+++ /dev/null
@@ -1,136 +0,0 @@
-# Introduction to models
-
-[![Documentation Status](https://img.shields.io/badge/docs-latest-brightgreen.svg?style=flat)](https://github.com/PaddlePaddle/models)
-[![Documentation Status](https://img.shields.io/badge/中文文档-最新-brightgreen.svg)](https://github.com/PaddlePaddle/models)
-[![License](https://img.shields.io/badge/license-Apache%202-blue.svg)](LICENSE)
-
-PaddlePaddle provides a rich set of computational units that let you build a great variety of deep learning models in a modular way to solve different application problems. Here we provide neural network models for common machine learning tasks for everyone to study and use.
-
-
-## 1. Word embedding
-
-A word embedding represents a word with a real-valued vector, and each dimension of the vector captures some latent grammatical or semantic feature of the text; it is one of the most successful concepts deep learning has brought to natural language processing. More generally, word embeddings can also be applied to ordinary discrete features. Learning word embeddings is usually an unsupervised process, so it can take full advantage of massive unlabeled data to capture the relationships between features and to cope with sparse features, missing labels, and noisy data. However, in common word-embedding methods, the last layer of the model often faces a huge classification problem, which is the bottleneck of computing performance.
-
-In the word-embedding task, we show how to use Hierarchical-Sigmoid and Noise Contrastive Estimation (NCE) to accelerate word-embedding training.
-
-- 1.1 [Hsigmoid-accelerated word embedding training](https://github.com/PaddlePaddle/models/tree/develop/hsigmoid)
-- 1.2 [NCE-accelerated word embedding training](https://github.com/PaddlePaddle/models/tree/develop/nce_cost)
-
-
-## 2. RNN language model
-
-The language model is an important foundational model in natural language processing. Besides yielding word vectors (a by-product of training), it can also help us generate text: given a number of words, a language model predicts the next most likely word.
-
-In the task of generating text with a language model, we focus on the recurrent neural network language model; following the instructions in the documentation, you can quickly adapt it to your own training corpus and build fun models that write poetry or prose automatically.
-
-- 2.1 [Generating text with an RNN language model](https://github.com/PaddlePaddle/models/tree/develop/generate_sequence_by_rnn_lm)
-
-## 3. Click-through rate prediction
-
-A click-through rate (CTR) model predicts the probability that a user clicks on an ad, making a prediction for every ad impression; it is one of the core algorithms of advertising technology. Logistic regression learns large-scale sparse features well and dominated the early days of CTR prediction. In recent years, DNN models have gradually taken over the task thanks to their strong learning ability.
-
-In the CTR task, we first give Google's Wide & Deep model, which combines the strengths of a DNN, good at learning abstract features, and logistic regression, well suited to large-scale sparse features; it can be used as a relatively mature model framework and has seen some industrial adoption. We also provide a neural network model based on the deep factorization machine, which combines a factorization machine and a deep neural network to model low-order and high-order interactions of the input features, respectively.
-
-- 3.1 [Wide & Deep CTR model](https://github.com/PaddlePaddle/models/tree/develop/ctr/README.cn.md)
-- 3.2 [CTR model based on the deep factorization machine](https://github.com/PaddlePaddle/models/tree/develop/deep_fm)
-
-## 4. Text classification
-
-Text classification is one of the most fundamental tasks in natural language processing. Deep learning methods remove the need for complex feature engineering: they take raw text directly as input and optimize classification accuracy in a data-driven way.
-
-In the text classification task, taking sentiment classification as an example, we provide a DNN-based non-sequence model and a CNN-based sequence model for everyone to study and use (for an LSTM-based model, see the [Sentiment Analysis](http://www.paddlepaddle.org/docs/develop/book/06.understand_sentiment/index.cn.html) chapter of PaddleBook).
-
-- 4.1 [DNN/CNN-based sentiment classification](https://github.com/PaddlePaddle/models/tree/develop/text_classification)
-- 4.2 [Text classification with nested sequences](https://github.com/PaddlePaddle/models/tree/develop/nested_sequence/text_classification)
-
-## 5. Learning to rank
-
-Learning to rank (LTR) is one of the core problems in information retrieval and search engine research: a machine learning method learns a scoring function that scores the candidates to be ranked, and the order is then determined by the scores. Deep neural networks can be used to model the scoring function, forming various deep-learning-based LTR models.
-
-In the LTR task, we introduce a pairwise ranking model based on the RankLoss loss function and a listwise ranking model based on the LambdaRank loss function (for the pointwise strategy, see the [Recommender System](http://www.paddlepaddle.org/docs/develop/book/05.recommender_system/index.cn.html) chapter of PaddleBook).
-
-- 5.1 [Learning to rank with pairwise and listwise approaches](https://github.com/PaddlePaddle/models/tree/develop/ltr)
-
-## 6. Deep structured semantic model
-
-The deep structured semantic model (DSSM) is a neural-network-based framework for semantic matching that learns the semantic similarity between two streams of information entities or texts. DSSM uses a DNN, CNN, or RNN to map the two streams into the same continuous low-dimensional semantic space, where both can be represented simultaneously; a distance metric and a matching function are then defined to characterize and learn their semantic similarity within that space.
-
-In the DSSM task, we demonstrate how to model the semantic similarity between two strings. The model supports different network structures such as DNN (fully connected feed-forward network), CNN (convolutional network), and RNN (recurrent neural network), as well as different loss functions for classification, regression, and ranking. This example takes the simplest text data as input; by substituting your own training and prediction data, you can use it in real scenarios.
-
-- 6.1 [Deep structured semantic model](https://github.com/PaddlePaddle/models/tree/develop/dssm/README.cn.md)
-
-## 7. Named entity recognition
-
-Given an input sequence, a sequence tagging model assigns a category label to each element of the sequence; this is one of the most fundamental tasks in natural language processing. As deep learning methods have developed, using a recurrent neural network to learn feature representations of the input and a Conditional Random Field (CRF) to complete the tagging on top of those features has gradually become the standard solution to sequence tagging.
-
-In the sequence tagging task, taking Named Entity Recognition (NER) as an example, we show how to train an end-to-end sequence tagging model.
-
-- 7.1 [Named entity recognition](https://github.com/PaddlePaddle/models/tree/develop/sequence_tagging_for_ner)
-
-## 8. Sequence-to-sequence learning
-
-Sequence-to-sequence learning maps between two or even more variable-length sequences and has a wide range of applications, including machine translation, dialogue and question answering, ad-copy generation, auto-encoding (e.g. encoding financial profiles), and judging the semantic relatedness of multiple text strings.
-
-In the sequence-to-sequence task, we first take machine translation as an example and provide several improved models: a sequence-to-sequence model without the attention mechanism, which is the basis of all sequence-to-sequence learning models; Scheduled Sampling, which mitigates the error accumulation of RNN models during generation; and neural machine translation with an external memory mechanism, which strengthens the network's memory to handle complex sequence-to-sequence tasks. Beyond machine translation, we also provide a deep-LSTM-based model that generates classical Chinese poetry, i.e. same-language generation.
-
-- 8.1 [Neural machine translation without attention](https://github.com/PaddlePaddle/models/tree/develop/nmt_without_attention/README.cn.md)
-- 8.2 [Improving translation quality with Scheduled Sampling](https://github.com/PaddlePaddle/models/tree/develop/scheduled_sampling)
-- 8.3 [Neural machine translation with external memory](https://github.com/PaddlePaddle/models/tree/develop/mt_with_external_memory)
-- 8.4 [Generating classical Chinese poetry](https://github.com/PaddlePaddle/models/tree/develop/generate_chinese_poetry)
-
-## 9. Reading comprehension
-
-As deep learning and other new techniques keep pushing natural language processing forward, we cannot help asking: how do we confirm that a model truly understands human language and has some capacity for comprehension and reasoning? Looking at the classic NLP problems (lexical analysis, syntactic parsing, sentiment classification, poetry generation, and so on), their classic solutions are, in technical principle, still some distance away from "language understanding". To measure the gap between current NLP technology and this ultimate goal, we need a task that is sufficiently hard, quantifiable, and reproducible; this was the original motivation for the reading comprehension problem. Although current research shows that models performing well on existing reading-comprehension datasets still fall short of true language understanding, machine reading comprehension is nevertheless regarded as an important task for testing a model's progress toward it.
-
-Reading comprehension is in essence a form of question answering: the model "reads" a passage and then answers a given question. In this task, we introduce the Learning to Search approach, which casts reading comprehension as a multi-step decision process: finding the sentence that contains the answer, then the answer's start position within that sentence, and then its end position.
-
-- 9.1 [Globally Normalized Reader](https://github.com/PaddlePaddle/models/tree/develop/globally_normalized_reader)
-
-## 10. Question answering
-
-A question answering (QA) system answers users' questions automatically; it is one of the key tasks for verifying whether a machine has natural language understanding, and its research history can be traced back to the origin of artificial intelligence. Compared with retrieval systems, a QA system is a higher-level form of information service: what it returns to the user is no longer a ranked list of keyword-matched results, but a precise natural-language answer.
-
-In the QA task, we introduce a deep-learning-based end-to-end question answering system that turns QA into a sequence tagging problem. The end-to-end system learns from high-quality "question-evidence-answer" data to build a joint model that simultaneously learns the semantic mappings among the corpus, the knowledge base, and the question representation, turning the traditionally separate steps of question parsing, text retrieval, and answer extraction/generation into a single learnable process.
-
-- 10.1 [Factoid question answering via sequence tagging](https://github.com/PaddlePaddle/models/tree/develop/neural_qa)
-
-## 11. Image classification
-
-Compared with text, images convey information more vividly, more intuitively, and with more artistic appeal, and they are an important medium for transferring and exchanging information. Image classification distinguishes images of different categories according to their semantics. It is a fundamental problem in computer vision and the basis of higher-level vision tasks such as detection, segmentation, object tracking, and behavior analysis, with broad applications: face recognition and intelligent video analysis in security, traffic-scene recognition in transportation, content-based image retrieval and automatic album organization on the internet, image recognition in medicine, and more.
-
-In the image classification task, we show how to train AlexNet, VGG, GoogLeNet, ResNet, Inception-v4, Inception-Resnet-V2, and Xception models, and we provide conversion tools that turn model files trained with Caffe or TensorFlow into PaddlePaddle model files.
-
-- 11.1 [Converting Caffe model files to PaddlePaddle model files](https://github.com/PaddlePaddle/models/tree/develop/image_classification/caffe2paddle)
-- 11.2 [Converting TensorFlow model files to PaddlePaddle model files](https://github.com/PaddlePaddle/models/tree/develop/image_classification/tf2paddle)
-- 11.3 [AlexNet](https://github.com/PaddlePaddle/models/tree/develop/image_classification)
-- 11.4 [VGG](https://github.com/PaddlePaddle/models/tree/develop/image_classification)
-- 11.5 [Residual Network](https://github.com/PaddlePaddle/models/tree/develop/image_classification)
-- 11.6 [Inception-v4](https://github.com/PaddlePaddle/models/tree/develop/image_classification)
-- 11.7 [Inception-Resnet-V2](https://github.com/PaddlePaddle/models/tree/develop/image_classification)
-- 11.8 [Xception](https://github.com/PaddlePaddle/models/tree/develop/image_classification)
-
-## 12. Object detection
-
-The goal of object detection is, given an image or video frame, to have the computer find the locations of all objects in it and give each object's specific category. For humans, object detection is a very simple task; a computer, however, "sees" only matrices of values between 0 and 255, so it is hard for it to extract high-level semantic concepts such as a person or an object from an image or video frame, let alone localize where the object appears. At the same time, objects may appear anywhere in the image or frame, their shapes vary endlessly, and backgrounds differ widely; all of these factors make object detection a challenging problem for computers.
-
-In the object detection task, we introduce detection with SSD (Single Shot MultiBox Detector), one of the newer and better-performing algorithms in the field, characterized by both fast detection and high accuracy.
-
-- 12.1 [Single Shot MultiBox Detector](https://github.com/PaddlePaddle/models/tree/develop/ssd/README.cn.md)
-
-## 13. Scene text recognition
-
-Many scene images contain rich text information that plays an important role in understanding the images and greatly helps people perceive and comprehend scene content. Scene text recognition converts image information into text sequences under conditions of complex backgrounds, low resolution, diverse fonts, arbitrary layouts, and so on; it can be regarded as a special kind of translation: from image input to natural-language output. The development of scene text recognition has also spawned new applications, such as automatically recognizing the text on road signs to give street-view applications more accurate address information.
-
-In the scene text recognition task, we show how to combine CNN-based image feature extraction with RNN-based sequence translation, removing the need for hand-crafted features and character segmentation, and using automatically learned image features to perform end-to-end, unconstrained character localization and recognition.
-
-- 13.1 [Scene text recognition](https://github.com/PaddlePaddle/models/tree/develop/scene_text_recognition)
-
-## 14. Speech recognition
-
-Automatic speech recognition (ASR) converts the vocabulary content of human speech into computer-readable input, enabling machines to "understand" human speech; it plays an important role in voice assistants, voice input, voice interaction, and other applications. Deep learning has achieved remarkable results in speech recognition: end-to-end approaches merge the traditional acoustic model, lexicon, and language model into a single whole, no longer rely on the various conditional-independence assumptions of hidden Markov models, and make the model much simpler; a neural network takes speech features as input and directly outputs the recognized text. End-to-end deep learning has become the most important approach to speech recognition today.
-
-In the speech recognition task, we provide a complete pipeline based on the DeepSpeech2 model, including feature extraction, data augmentation, model training, language model, and decoding modules, together with a trained model and a hands-on demo, so that everyone can try speech recognition with their own voice.
-
-- 14.1 [Speech recognition: DeepSpeech2](https://github.com/PaddlePaddle/DeepSpeech)
-
-This tutorial is created by [PaddlePaddle](https://github.com/PaddlePaddle/Paddle) and licensed under the [Apache-2.0](LICENSE) license.
diff --git a/legacy/README.md b/legacy/README.md
deleted file mode 100644
index f0719c1a26c04341e8de327143dc826248bb3607..0000000000000000000000000000000000000000
--- a/legacy/README.md
+++ /dev/null
@@ -1,89 +0,0 @@
-
-# The models in this directory are no longer maintained and their use is not recommended. Please use the models under the Fluid directory instead.
-
-# Introduction to models
-
-[![Documentation Status](https://img.shields.io/badge/docs-latest-brightgreen.svg?style=flat)](https://github.com/PaddlePaddle/models)
-[![Documentation Status](https://img.shields.io/badge/中文文档-最新-brightgreen.svg)](https://github.com/PaddlePaddle/models)
-[![License](https://img.shields.io/badge/license-Apache%202-blue.svg)](LICENSE)
-
-PaddlePaddle provides a rich set of computational units to enable users to adopt a modular approach to solving various learning problems. In this repo, we demonstrate how to use PaddlePaddle to solve common machine learning tasks, providing several different neural network models that anyone can easily learn and use.
-
-## 1. Word Embedding
-
-A word embedding represents a word with a real-valued vector. Each dimension of the vector captures some latent grammatical or semantic feature of the text, and word embeddings are one of the most successful concepts in the field of natural language processing. Generalized word vectors can also be applied to ordinary discrete features. Learning word embeddings is usually an unsupervised process, so it can take full advantage of massive unlabeled data to capture the relationships between features and to cope with sparse features, missing labels, and noisy data. However, in common word-embedding methods, the last layer of the model often faces a large-scale classification problem, which is the bottleneck of computing performance.
-
-In the example of word vectors, we show how to use Hierarchical-Sigmoid and Noise Contrastive Estimation (NCE) to accelerate word-vector learning; a toy sketch of the NCE idea follows the links below.
-
-- 1.1 [Hsigmoid Accelerated Word Vector Training](https://github.com/PaddlePaddle/models/tree/develop/legacy/hsigmoid)
-- 1.2 [Noise Contrastive Estimation Accelerated Word Vector Training](https://github.com/PaddlePaddle/models/tree/develop/legacy/nce_cost)
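-
-As a toy numpy sketch of the NCE idea in its simplified negative-sampling form (the function and variable names here are ours, not the example's API): instead of a softmax over the whole vocabulary, each update solves k + 1 small binary problems.
-
-```python
-import numpy as np
-
-def sigmoid(x):
-    return 1.0 / (1.0 + np.exp(-x))
-
-def nce_style_loss(context, target_emb, noise_embs):
-    """Pull the observed word toward the context; push k noise words away.
-
-    context:    (d,)   hidden representation of the context
-    target_emb: (d,)   output embedding of the observed next word
-    noise_embs: (k, d) output embeddings of k words drawn from a noise distribution
-    """
-    pos = -np.log(sigmoid(np.dot(context, target_emb)))         # observed word labeled "real"
-    neg = -np.log(sigmoid(-np.dot(noise_embs, context))).sum()  # noise words labeled "fake"
-    return pos + neg
-
-rng = np.random.RandomState(0)
-d, k = 16, 5
-print(nce_style_loss(rng.randn(d), rng.randn(d), rng.randn(k, d)))
-```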
-
-
-## 2. RNN language model
-
-The language model is important in the field of natural language processing. Besides yielding word vectors (a by-product of language model training), it can also help us generate text: given a number of words, a language model predicts the next most likely word. In the example of using a language model to generate text, we focus on the recurrent neural network language model; following the instructions in the document, you can quickly adapt it to your own training corpus and build fun models that write poetry or prose automatically.
-
-- 2.1 [Generate text using the RNN language model](https://github.com/PaddlePaddle/models/tree/develop/legacy/generate_sequence_by_rnn_lm)
-
-## 3. Click-Through Rate prediction
-The click-through rate (CTR) model predicts the probability that a user will click on an ad and is widely used in advertising technology. Logistic regression, which learns large-scale sparse features well, dominated the early stages of CTR prediction. In recent years, DNN models have gradually taken over the task thanks to their strong learning ability.
-
-In the example of click-through rate estimation, we first give Google's Wide & Deep model, which combines the advantages of a DNN (good at learning abstract features) and logistic regression (well suited to large-scale sparse features). Then we provide the deep factorization machine for click-through rate prediction, which combines a factorization machine and a deep neural network to model both low-order and high-order interactions of the input features; a toy sketch of how the wide and deep logits combine follows the links below.
-
-- 3.1 [Click-Through Rate Model](https://github.com/PaddlePaddle/models/tree/develop/legacy/ctr)
-- 3.2 [Deep Factorization Machine for Click-Through Rate prediction](https://github.com/PaddlePaddle/models/tree/develop/legacy/deep_fm)
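-
-A toy numpy sketch of how the two parts of Wide & Deep combine (names are ours; the actual example builds this out of PaddlePaddle layers): the wide part is a linear model over sparse and cross features, the deep part contributes the DNN's top hidden layer, and the two logits are summed before a single sigmoid, with both parts trained jointly.
-
-```python
-import numpy as np
-
-def sigmoid(x):
-    return 1.0 / (1.0 + np.exp(-x))
-
-def wide_deep_ctr(sparse_x, deep_h, w_wide, w_deep, b):
-    """sparse_x: (n_wide,) sparse/cross features; deep_h: (n_deep,) top DNN layer."""
-    logit = np.dot(sparse_x, w_wide) + np.dot(deep_h, w_deep) + b
-    return sigmoid(logit)  # predicted click probability
-
-rng = np.random.RandomState(0)
-sparse_x = (rng.rand(100) < 0.05).astype(float)  # mostly-zero wide features
-print(wide_deep_ctr(sparse_x, rng.randn(32), rng.randn(100), rng.randn(32), 0.0))
-```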
-
-## 4. Text classification
-
-Text classification is one of the most basic tasks in natural language processing. Deep learning methods can eliminate complex feature engineering and use the raw text as input, optimizing classification accuracy in a data-driven way.
-
-For text classification, we provide models based on DNN and CNN. (For an LSTM-based model, please refer to the PaddleBook chapter [Sentiment Analysis](http://www.paddlepaddle.org/docs/develop/book/06.understand_sentiment/index.html).)
-
-- 4.1 [Sentiment analysis based on DNN / CNN](https://github.com/PaddlePaddle/models/tree/develop/legacy/text_classification)
-
-## 5. Learning to rank
-
-Learning to rank (LTR) is one of the core problems in information retrieval and search engine research. Training data is used by a learning algorithm to produce a ranking model that computes the relevance of documents for actual queries.
-A deep neural network can be used to model the scoring function, forming various deep-learning-based LTR models.
-
-The algorithms for learning to rank are usually categorized into three groups by their input representation and loss function: pointwise, pairwise, and listwise approaches. Here we demonstrate the RankLoss loss function (pairwise approach) and the LambdaRank loss function (listwise approach); a minimal sketch of the pairwise loss follows the link below. (For pointwise approaches, please refer to [Recommended System](http://www.paddlepaddle.org/docs/develop/book/05.recommender_system/index.html).)
-
-- 5.1 [Learning to rank based on pairwise and listwise approaches](https://github.com/PaddlePaddle/models/tree/develop/legacy/ltr)
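-
-A minimal numpy sketch of the pairwise loss (our own toy code, not the example's implementation): for a pair in which document i should rank above document j, the model is penalized with a logistic loss on the score difference.
-
-```python
-import numpy as np
-
-def pairwise_rank_loss(score_preferred, score_other):
-    """-log sigmoid(s_i - s_j): near zero once the preferred document scores higher."""
-    return np.log1p(np.exp(-(score_preferred - score_other)))
-
-print(pairwise_rank_loss(2.0, 0.5))  # correct order -> small loss
-print(pairwise_rank_loss(0.5, 2.0))  # violated order -> large loss
-```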
-
-## 6. Semantic model
-The deep structured semantic model (DSSM) uses a DNN to learn low-dimensional vector representations in a continuous semantic space and then models the semantic similarity between two sentences.
-
-In this example, we demonstrate how to use PaddlePaddle to implement a generic deep structured semantic model that models the semantic similarity between two strings. The model supports different network structures such as CNN (convolutional network), FC (fully connected network), and RNN (recurrent neural network), and different loss functions for classification, regression, and ranking; a cosine-similarity sketch of the matching step follows the link below.
-
-- 6.1 [Deep structured semantic model](https://github.com/PaddlePaddle/models/tree/develop/legacy/dssm)
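-
-A sketch of the matching step (numpy; the two encoder outputs below are stand-ins for the FC/CNN/RNN towers the example provides): each string is encoded into the shared space, and similarity is the cosine of the two vectors, which then feeds the classification, regression, or ranking loss.
-
-```python
-import numpy as np
-
-def cosine_similarity(u, v, eps=1e-8):
-    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + eps)
-
-rng = np.random.RandomState(0)
-left_vec = rng.randn(128)   # stand-in encoding of string A
-right_vec = rng.randn(128)  # stand-in encoding of string B
-print(cosine_similarity(left_vec, right_vec))  # value in [-1, 1]
-```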
-
-## 7. Sequence tagging
-
-Sequence tagging, which assigns a category label to each element of an input sequence, is one of the most basic tasks in natural language processing. Recurrent neural network models combined with a Conditional Random Field (CRF) are commonly used for sequence tagging tasks; a compact Viterbi decoding sketch follows the link below.
-
-Taking the Named Entity Recognition (NER) task as an example, we describe how to train an end-to-end sequence tagging model.
-
-- 7.1 [Named Entity Recognition](https://github.com/PaddlePaddle/models/tree/develop/legacy/sequence_tagging_for_ner)
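-
-For intuition, here is a compact numpy Viterbi decoder for the CRF step (illustrative only; the example itself relies on PaddlePaddle's built-in CRF layer): given per-token label scores from the RNN and a label-to-label transition matrix, it recovers the highest-scoring tag path.
-
-```python
-import numpy as np
-
-def viterbi(emissions, transitions):
-    """emissions: (T, L) token-label scores; transitions: (L, L) scores prev -> cur."""
-    T, L = emissions.shape
-    score = emissions[0].copy()            # best score of any path ending in each label
-    backptr = np.zeros((T, L), dtype=int)
-    for t in range(1, T):
-        cand = score[:, None] + transitions + emissions[t]  # indexed (prev, cur)
-        backptr[t] = cand.argmax(axis=0)
-        score = cand.max(axis=0)
-    path = [int(score.argmax())]
-    for t in range(T - 1, 0, -1):          # follow the back-pointers
-        path.append(int(backptr[t][path[-1]]))
-    return path[::-1]
-
-rng = np.random.RandomState(0)
-print(viterbi(rng.randn(6, 4), rng.randn(4, 4)))  # best label index per token
-```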
-
-## 8. Sequence to sequence learning
-
-Sequence-to-sequence models have a wide range of applications, including machine translation, dialogue systems, and parse tree generation.
-
-As an example of sequence-to-sequence learning, we take the machine translation task. We demonstrate the sequence-to-sequence mapping model without the attention mechanism, which is the basis of all sequence-to-sequence learning models; scheduled sampling, which mitigates error accumulation of the RNN model during generation (a sketch of the per-step decision follows the link below); and machine translation with an external memory mechanism.
-
-- 8.1 [Basic Sequence-to-sequence model](https://github.com/PaddlePaddle/models/tree/develop/legacy/nmt_without_attention)
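-
-The per-step decision behind scheduled sampling is small enough to sketch directly (numpy; names are ours): at each decoder step, feed the ground-truth previous word with probability eps, otherwise feed the model's own previous prediction, and decay eps as training progresses.
-
-```python
-import numpy as np
-
-rng = np.random.RandomState(0)
-
-def next_decoder_input(gold_prev, model_prev, eps):
-    """eps = 1 is pure teacher forcing; eps -> 0 trains on the model's own outputs."""
-    return gold_prev if rng.rand() < eps else model_prev
-
-for step in range(5):
-    eps = 0.9 ** step  # one common (exponential) decay schedule
-    print(step, next_decoder_input("gold", "predicted", eps))
-```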
-
-## 9. Image classification
-
-For the example of image classification, we show how to train AlexNet, VGG, GoogLeNet, ResNet, Inception-v4, Inception-Resnet-V2 and Xception models in PaddlePaddle. We also provide model conversion tools that convert Caffe or TensorFlow trained model files into PaddlePaddle model files.
-
-- 9.1 [Convert Caffe model files to PaddlePaddle model files](https://github.com/PaddlePaddle/models/tree/develop/legacy/image_classification/caffe2paddle)
-- 9.2 [Convert TensorFlow model files to PaddlePaddle model files](https://github.com/PaddlePaddle/models/tree/develop/legacy/image_classification/tf2paddle)
-- 9.3 [AlexNet](https://github.com/PaddlePaddle/models/tree/develop/legacy/image_classification)
-- 9.4 [VGG](https://github.com/PaddlePaddle/models/tree/develop/legacy/image_classification)
-- 9.5 [Residual Network](https://github.com/PaddlePaddle/models/tree/develop/legacy/image_classification)
-- 9.6 [Inception-v4](https://github.com/PaddlePaddle/models/tree/develop/legacy/image_classification)
-- 9.7 [Inception-Resnet-V2](https://github.com/PaddlePaddle/models/tree/develop/legacy/image_classification)
-- 9.8 [Xception](https://github.com/PaddlePaddle/models/tree/develop/legacy/image_classification)
-
-This tutorial is contributed by [PaddlePaddle](https://github.com/PaddlePaddle/Paddle) and licensed under the [Apache-2.0 license](LICENSE).
diff --git a/legacy/conv_seq2seq/README.md b/legacy/conv_seq2seq/README.md
deleted file mode 100644
index 5b22c2c17ea2ff3588e93219e86d81a831242211..0000000000000000000000000000000000000000
--- a/legacy/conv_seq2seq/README.md
+++ /dev/null
@@ -1,70 +0,0 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is v0.11.0. If you are on a version of PaddlePaddle earlier than v0.11.0, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
-
----
-
-# Convolutional Sequence to Sequence Learning
-This model implements the work in the following paper:
-
-Jonas Gehring, Michael Auli, David Grangier, et al. Convolutional Sequence to Sequence Learning. Association for Computational Linguistics (ACL), 2017
-
-# Data Preparation
-- The data used in this tutorial can be downloaded by running:
-
- ```bash
- sh download.sh
- ```
-
-- Each line in the data file contains one sample, and each sample consists of a source sentence and a target sentence separated by '\t'. To use your own data, organize it as follows (a short sketch of writing such a file appears after this block):
-
- ```
- <source sentence>\t<target sentence>
- ```
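-
- For example, a two-sentence toy file (hypothetical sentences; assumes the `data/` directory exists) can be written with:
-
- ```python
- pairs = [
-     ("thank you very much .", "vielen dank ."),
-     ("how are you ?", "wie geht es dir ?"),
- ]
- with open("data/train", "w") as f:
-     for src, trg in pairs:
-         f.write(src + "\t" + trg + "\n")  # one tab-separated pair per line
- ```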
-
-# Training a Model
-- Modify the following script if needed and then run:
-
- ```bash
- python train.py \
- --train_data_path ./data/train \
- --test_data_path ./data/test \
- --src_dict_path ./data/src_dict \
- --trg_dict_path ./data/trg_dict \
- --enc_blocks "[(256, 3)] * 5" \
- --dec_blocks "[(256, 3)] * 3" \
- --emb_size 256 \
- --pos_size 200 \
- --drop_rate 0.2 \
- --use_bn False \
- --use_gpu False \
- --trainer_count 1 \
- --batch_size 32 \
- --num_passes 20 \
- >train.log 2>&1
- ```
-
-# Inferring by a Trained Model
-- Infer by a trained model by running:
-
- ```bash
- python infer.py \
- --infer_data_path ./data/dev \
- --src_dict_path ./data/src_dict \
- --trg_dict_path ./data/trg_dict \
- --enc_blocks "[(256, 3)] * 5" \
- --dec_blocks "[(256, 3)] * 3" \
- --emb_size 256 \
- --pos_size 200 \
- --drop_rate 0.2 \
- --use_bn False \
- --use_gpu False \
- --trainer_count 1 \
- --max_len 100 \
- --batch_size 256 \
- --beam_size 1 \
- --is_show_attention False \
- --model_path ./params.pass-0.tar.gz \
- 1>infer_result 2>infer.log
- ```
-
-# Notes
-Since the current version of PaddlePaddle doesn't support weight normalization, we use batch normalization instead to ensure convergence when the network is deep.
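-
-For reference, weight normalization (Salimans & Kingma, 2016) reparameterizes each weight vector as `w = g * v / ||v||`, decoupling the direction `v` from a learned scale `g`, while batch normalization standardizes each layer's pre-activations with mini-batch statistics; both serve the same purpose here of keeping activation scales stable so that a deep stack of gated convolutions converges.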
diff --git a/legacy/conv_seq2seq/beamsearch.py b/legacy/conv_seq2seq/beamsearch.py
deleted file mode 100644
index dd8562f018c803d4f0d7bbba4a2a006ece904851..0000000000000000000000000000000000000000
--- a/legacy/conv_seq2seq/beamsearch.py
+++ /dev/null
@@ -1,197 +0,0 @@
-#coding=utf-8
-
-import sys
-import time
-import math
-import numpy as np
-
-import reader
-
-
-class BeamSearch(object):
- """
- Generate sequence by beam search
- """
-
- def __init__(self,
- inferer,
- trg_dict,
- pos_size,
- padding_num,
- batch_size=1,
- beam_size=1,
- max_len=100):
- self.inferer = inferer
- self.trg_dict = trg_dict
- self.reverse_trg_dict = reader.get_reverse_dict(trg_dict)
- self.word_padding = trg_dict.__len__()
- self.pos_size = pos_size
- self.pos_padding = pos_size
- self.padding_num = padding_num
- self.win_len = padding_num + 1
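-        # The stacked decoder convolutions look back a total of
-        # sum(context_len_i - 1) = padding_num tokens, so a window of
-        # padding_num + 1 previous words is enough to score the next word.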
- self.max_len = max_len
- self.batch_size = batch_size
- self.beam_size = beam_size
-
- def get_beam_input(self, batch, sample_list):
- """
- Get input for generation at the current iteration.
- """
- beam_input = []
-
- for sample_id in sample_list:
- for path in self.candidate_path[sample_id]:
- if len(path['seq']) < self.win_len:
- cur_trg = [self.word_padding] * (
- self.win_len - len(path['seq']) - 1
-                ) + [self.trg_dict['<s>']] + path['seq']
- cur_trg_pos = [self.pos_padding] * (
- self.win_len - len(path['seq']) - 1) + [0] + range(
- 1, len(path['seq']) + 1)
- else:
- cur_trg = path['seq'][-self.win_len:]
- cur_trg_pos = range(
- len(path['seq']) + 1 - self.win_len,
- len(path['seq']) + 1)
-
- beam_input.append(batch[sample_id] + [cur_trg] + [cur_trg_pos])
-
- return beam_input
-
- def get_prob(self, beam_input):
- """
- Get the probabilities of all possible tokens.
- """
- row_list = [j * self.win_len for j in range(len(beam_input))]
- prob = self.inferer.infer(beam_input, field='value')[row_list, :]
- return prob
-
- def _top_k(self, prob, k):
- """
-        Get indices of the k words with the highest probabilities.
- """
- return prob.argsort()[-k:][::-1]
-
- def beam_expand(self, prob, sample_list):
- """
- In every iteration step, the model predicts the possible next words.
- For each input sentence, the top beam_size words are selected as candidates.
- """
- top_words = np.apply_along_axis(self._top_k, 1, prob, self.beam_size)
-
- candidate_words = [[]] * len(self.candidate_path)
- idx = 0
-
- for sample_id in sample_list:
- for seq_id, path in enumerate(self.candidate_path[sample_id]):
- for w in top_words[idx, :]:
- score = path['score'] + math.log(prob[idx, w])
- candidate_words[sample_id] = candidate_words[sample_id] + [{
- 'word': w,
- 'score': score,
- 'seq_id': seq_id
- }]
- idx = idx + 1
-
- return candidate_words
-
- def beam_shrink(self, candidate_words, sample_list):
- """
-        Pruning step of the beam search: the beam_size most probable
-        sequences are kept in the beam for the next generation.
- """
- new_path = [[]] * len(self.candidate_path)
-
- for sample_id in sample_list:
- beam_words = sorted(
- candidate_words[sample_id],
- key=lambda x: x['score'],
- reverse=True)[:self.beam_size]
-
- complete_seq_min_score = None
- complete_path_num = len(self.complete_path[sample_id])
-
- if complete_path_num > 0:
- complete_seq_min_score = min(self.complete_path[sample_id],
- key=lambda x: x['score'])['score']
- if complete_path_num >= self.beam_size:
- beam_words_max_score = beam_words[0]['score']
- if beam_words_max_score < complete_seq_min_score:
- continue
-
- for w in beam_words:
-
-                if w['word'] == self.trg_dict['<e>']:
- if complete_path_num < self.beam_size or complete_seq_min_score <= w[
- 'score']:
-
- seq = self.candidate_path[sample_id][w['seq_id']]['seq']
- self.complete_path[sample_id] = self.complete_path[
- sample_id] + [{
- 'seq': seq,
- 'score': w['score']
- }]
-
- if complete_seq_min_score is None or complete_seq_min_score > w[
- 'score']:
- complete_seq_min_score = w['score']
- else:
- seq = self.candidate_path[sample_id][w['seq_id']]['seq'] + [
- w['word']
- ]
- new_path[sample_id] = new_path[sample_id] + [{
- 'seq': seq,
- 'score': w['score']
- }]
-
- return new_path
-
- def search_one_batch(self, batch):
- """
- Perform beam search on one mini-batch.
- """
- real_size = len(batch)
- self.candidate_path = [[{'seq': [], 'score': 0.}]] * real_size
- self.complete_path = [[]] * real_size
- sample_list = range(real_size)
-
- for i in xrange(self.max_len):
- beam_input = self.get_beam_input(batch, sample_list)
- prob = self.get_prob(beam_input)
-
- candidate_words = self.beam_expand(prob, sample_list)
- new_path = self.beam_shrink(candidate_words, sample_list)
- self.candidate_path = new_path
- sample_list = [
- sample_id for sample_id in sample_list
- if len(new_path[sample_id]) > 0
- ]
-
- if len(sample_list) == 0:
- break
-
- final_path = []
- for i in xrange(real_size):
- top_path = sorted(
- self.complete_path[i] + self.candidate_path[i],
- key=lambda x: x['score'],
- reverse=True)[:self.beam_size]
- final_path.append(top_path)
- return final_path
-
- def search(self, infer_data):
- """
- Perform beam search on all data.
- """
-
- def _to_sentence(seq):
- raw_sentence = [self.reverse_trg_dict[id] for id in seq]
- sentence = " ".join(raw_sentence)
- return sentence
-
- for pos in xrange(0, len(infer_data), self.batch_size):
- batch = infer_data[pos:min(pos + self.batch_size, len(infer_data))]
- self.final_path = self.search_one_batch(batch)
- for top_path in self.final_path:
- print _to_sentence(top_path[0]['seq'])
- sys.stdout.flush()
diff --git a/legacy/conv_seq2seq/download.sh b/legacy/conv_seq2seq/download.sh
deleted file mode 100644
index b1a924d25b1a10ade9f4be8b504933d1efa01905..0000000000000000000000000000000000000000
--- a/legacy/conv_seq2seq/download.sh
+++ /dev/null
@@ -1,22 +0,0 @@
-#!/usr/bin/env bash
-
-CUR_PATH=`pwd`
-git clone https://github.com/moses-smt/mosesdecoder.git
-git clone https://github.com/rizar/actor-critic-public
-
-export MOSES=`pwd`/mosesdecoder
-export LVSR=`pwd`/actor-critic-public
-
-cd actor-critic-public/exp/ted
-sh create_dataset.sh
-
-cd $CUR_PATH
-mkdir data
-cp actor-critic-public/exp/ted/prep/*-* data/
-cp actor-critic-public/exp/ted/vocab.* data/
-
-cd data
-python ../preprocess.py
-
-cd ..
-rm -rf actor-critic-public mosesdecoder
diff --git a/legacy/conv_seq2seq/infer.py b/legacy/conv_seq2seq/infer.py
deleted file mode 100644
index c804a84e71ffe920b72064cb05461d72c444ac73..0000000000000000000000000000000000000000
--- a/legacy/conv_seq2seq/infer.py
+++ /dev/null
@@ -1,236 +0,0 @@
-#coding=utf-8
-
-import sys
-import argparse
-import distutils.util
-import gzip
-
-import paddle.v2 as paddle
-from model import conv_seq2seq
-from beamsearch import BeamSearch
-import reader
-
-
-def parse_args():
- parser = argparse.ArgumentParser(
- description="PaddlePaddle Convolutional Seq2Seq")
- parser.add_argument(
- '--infer_data_path',
- type=str,
- required=True,
- help="Path of the dataset for inference")
- parser.add_argument(
- '--src_dict_path',
- type=str,
- required=True,
- help='Path of the source dictionary')
- parser.add_argument(
- '--trg_dict_path',
- type=str,
- required=True,
-        help='Path of the target dictionary')
- parser.add_argument(
- '--enc_blocks', type=str, help='Convolution blocks of the encoder')
- parser.add_argument(
- '--dec_blocks', type=str, help='Convolution blocks of the decoder')
- parser.add_argument(
- '--emb_size',
- type=int,
- default=256,
- help='Dimension of word embedding. (default: %(default)s)')
- parser.add_argument(
- '--pos_size',
- type=int,
- default=200,
- help='Total number of the position indexes. (default: %(default)s)')
- parser.add_argument(
- '--drop_rate',
- type=float,
- default=0.,
- help='Dropout rate. (default: %(default)s)')
- parser.add_argument(
- "--use_bn",
- default=False,
- type=distutils.util.strtobool,
- help="Use batch normalization or not. (default: %(default)s)")
- parser.add_argument(
- "--use_gpu",
- default=False,
- type=distutils.util.strtobool,
- help="Use gpu or not. (default: %(default)s)")
- parser.add_argument(
- "--trainer_count",
- default=1,
- type=int,
- help="Trainer number. (default: %(default)s)")
- parser.add_argument(
- '--max_len',
- type=int,
- default=100,
- help="The maximum length of the sentence to be generated. (default: %(default)s)"
- )
- parser.add_argument(
- "--batch_size",
- default=1,
- type=int,
- help="Size of a mini-batch. (default: %(default)s)")
- parser.add_argument(
- "--beam_size",
- default=1,
- type=int,
- help="The width of beam expansion. (default: %(default)s)")
- parser.add_argument(
- "--model_path",
- type=str,
- required=True,
- help="The path of trained model. (default: %(default)s)")
- parser.add_argument(
- "--is_show_attention",
- default=False,
- type=distutils.util.strtobool,
- help="Whether to show attention weight or not. (default: %(default)s)")
- return parser.parse_args()
-
-
-def infer(infer_data_path,
- src_dict_path,
- trg_dict_path,
- model_path,
- enc_conv_blocks,
- dec_conv_blocks,
- emb_dim=256,
- pos_size=200,
- drop_rate=0.,
- use_bn=False,
- max_len=100,
- batch_size=1,
- beam_size=1,
- is_show_attention=False):
- """
- Inference.
-
- :param infer_data_path: The path of the data for inference.
- :type infer_data_path: str
- :param src_dict_path: The path of the source dictionary.
- :type src_dict_path: str
- :param trg_dict_path: The path of the target dictionary.
- :type trg_dict_path: str
- :param model_path: The path of a trained model.
- :type model_path: str
- :param enc_conv_blocks: The scale list of the encoder's convolution blocks. And each element of
- the list contains output dimension and context length of the corresponding
- convolution block.
- :type enc_conv_blocks: list of tuple
- :param dec_conv_blocks: The scale list of the decoder's convolution blocks. And each element of
- the list contains output dimension and context length of the corresponding
- convolution block.
- :type dec_conv_blocks: list of tuple
- :param emb_dim: The dimension of the embedding vector.
- :type emb_dim: int
- :param pos_size: The total number of the position indexes, which means
- the maximum value of the index is pos_size - 1.
- :type pos_size: int
- :param drop_rate: Dropout rate.
- :type drop_rate: float
- :param use_bn: Whether to use batch normalization or not. False is the default value.
- :type use_bn: bool
- :param max_len: The maximum length of the sentence to be generated.
- :type max_len: int
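-    :param batch_size: The size of a mini-batch during decoding.
-    :type batch_size: int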
- :param beam_size: The width of beam expansion.
- :type beam_size: int
- :param is_show_attention: Whether to show attention weight or not. False is the default value.
- :type is_show_attention: bool
- """
- # load dict
- src_dict = reader.load_dict(src_dict_path)
- trg_dict = reader.load_dict(trg_dict_path)
- src_dict_size = src_dict.__len__()
- trg_dict_size = trg_dict.__len__()
-
- prob, weight = conv_seq2seq(
- src_dict_size=src_dict_size,
- trg_dict_size=trg_dict_size,
- pos_size=pos_size,
- emb_dim=emb_dim,
- enc_conv_blocks=enc_conv_blocks,
- dec_conv_blocks=dec_conv_blocks,
- drop_rate=drop_rate,
- with_bn=use_bn,
- is_infer=True)
-
- # load parameters
- parameters = paddle.parameters.Parameters.from_tar(gzip.open(model_path))
-
- padding_list = [context_len - 1 for (size, context_len) in dec_conv_blocks]
-    padding_num = sum(padding_list)
- infer_reader = reader.data_reader(
- data_file=infer_data_path,
- src_dict=src_dict,
- trg_dict=trg_dict,
- pos_size=pos_size,
- padding_num=padding_num)
-
- if is_show_attention:
- attention_inferer = paddle.inference.Inference(
- output_layer=weight, parameters=parameters)
- for i, data in enumerate(infer_reader()):
- src_len = len(data[0])
- trg_len = len(data[2])
- attention_weight = attention_inferer.infer(
- [data], field='value', flatten_result=False)
- attention_weight = [
- weight.reshape((trg_len, src_len))
- for weight in attention_weight
- ]
- print attention_weight
- break
- return
-
- infer_data = []
- for i, raw_data in enumerate(infer_reader()):
- infer_data.append([raw_data[0], raw_data[1]])
-
- inferer = paddle.inference.Inference(
- output_layer=prob, parameters=parameters)
-
- searcher = BeamSearch(
- inferer=inferer,
- trg_dict=trg_dict,
- pos_size=pos_size,
- padding_num=padding_num,
- max_len=max_len,
- batch_size=batch_size,
- beam_size=beam_size)
-
- searcher.search(infer_data)
- return
-
-
-def main():
- args = parse_args()
- enc_conv_blocks = eval(args.enc_blocks)
- dec_conv_blocks = eval(args.dec_blocks)
-
- sys.setrecursionlimit(10000)
-
- paddle.init(use_gpu=args.use_gpu, trainer_count=args.trainer_count)
-
- infer(
- infer_data_path=args.infer_data_path,
- src_dict_path=args.src_dict_path,
- trg_dict_path=args.trg_dict_path,
- model_path=args.model_path,
- enc_conv_blocks=enc_conv_blocks,
- dec_conv_blocks=dec_conv_blocks,
- emb_dim=args.emb_size,
- pos_size=args.pos_size,
- drop_rate=args.drop_rate,
- use_bn=args.use_bn,
- max_len=args.max_len,
- batch_size=args.batch_size,
- beam_size=args.beam_size,
- is_show_attention=args.is_show_attention)
-
-
-if __name__ == '__main__':
- main()
diff --git a/legacy/conv_seq2seq/model.py b/legacy/conv_seq2seq/model.py
deleted file mode 100644
index c31238f83172fdc3d6240095279d1c953ab272ae..0000000000000000000000000000000000000000
--- a/legacy/conv_seq2seq/model.py
+++ /dev/null
@@ -1,440 +0,0 @@
-#coding=utf-8
-
-import math
-
-import paddle.v2 as paddle
-
-__all__ = ["conv_seq2seq"]
-
-
-def gated_conv_with_batchnorm(input,
- size,
- context_len,
- context_start=None,
- learning_rate=1.0,
- drop_rate=0.,
- with_bn=False):
- """
- Definition of the convolution block.
-
- :param input: The input of this block.
- :type input: LayerOutput
- :param size: The dimension of the block's output.
- :type size: int
- :param context_len: The context length of the convolution.
- :type context_len: int
- :param context_start: The start position of the context.
- :type context_start: int
- :param learning_rate: The learning rate factor of the parameters in the block.
- The actual learning rate is the product of the global
- learning rate and this factor.
- :type learning_rate: float
- :param drop_rate: Dropout rate.
- :type drop_rate: float
- :param with_bn: Whether to use batch normalization or not. False is the default
- value.
- :type with_bn: bool
- :return: The output of the convolution block.
- :rtype: LayerOutput
- """
- input = paddle.layer.dropout(input=input, dropout_rate=drop_rate)
-
- context = paddle.layer.mixed(
- size=input.size * context_len,
- input=paddle.layer.context_projection(
- input=input, context_len=context_len, context_start=context_start))
-
- raw_conv = paddle.layer.fc(
- input=context,
- size=size * 2,
- act=paddle.activation.Linear(),
- param_attr=paddle.attr.Param(
- initial_mean=0.,
- initial_std=math.sqrt(4.0 * (1.0 - drop_rate) / context.size),
- learning_rate=learning_rate),
- bias_attr=False)
-
- if with_bn:
- raw_conv = paddle.layer.batch_norm(
- input=raw_conv,
- act=paddle.activation.Linear(),
- param_attr=paddle.attr.Param(learning_rate=learning_rate))
-
- with paddle.layer.mixed(size=size) as conv:
- conv += paddle.layer.identity_projection(raw_conv, size=size, offset=0)
-
- with paddle.layer.mixed(size=size, act=paddle.activation.Sigmoid()) as gate:
- gate += paddle.layer.identity_projection(
- raw_conv, size=size, offset=size)
-
- with paddle.layer.mixed(size=size) as gated_conv:
- gated_conv += paddle.layer.dotmul_operator(conv, gate)
-
- return gated_conv
-
-
-def encoder(token_emb,
- pos_emb,
- conv_blocks=[(256, 3)] * 5,
- num_attention=3,
- drop_rate=0.,
- with_bn=False):
- """
- Definition of the encoder.
-
- :param token_emb: The embedding vector of the input token.
- :type token_emb: LayerOutput
- :param pos_emb: The embedding vector of the input token's position.
- :type pos_emb: LayerOutput
- :param conv_blocks: The scale list of the convolution blocks. Each element of
- the list contains output dimension and context length of
- the corresponding convolution block.
- :type conv_blocks: list of tuple
- :param num_attention: The total number of the attention modules used in the decoder.
- :type num_attention: int
- :param drop_rate: Dropout rate.
- :type drop_rate: float
- :param with_bn: Whether to use batch normalization or not. False is the default
- value.
- :type with_bn: bool
- :return: The input token encoding.
- :rtype: LayerOutput
- """
- embedding = paddle.layer.addto(
- input=[token_emb, pos_emb],
- layer_attr=paddle.attr.Extra(drop_rate=drop_rate))
-
- proj_size = conv_blocks[0][0]
- block_input = paddle.layer.fc(
- input=embedding,
- size=proj_size,
- act=paddle.activation.Linear(),
- param_attr=paddle.attr.Param(
- initial_mean=0.,
- initial_std=math.sqrt((1.0 - drop_rate) / embedding.size),
- learning_rate=1.0 / (2.0 * num_attention)),
- bias_attr=True, )
-
- for (size, context_len) in conv_blocks:
- if block_input.size == size:
- residual = block_input
- else:
- residual = paddle.layer.fc(
- input=block_input,
- size=size,
- act=paddle.activation.Linear(),
- param_attr=paddle.attr.Param(learning_rate=1.0 /
- (2.0 * num_attention)),
- bias_attr=True)
-
- gated_conv = gated_conv_with_batchnorm(
- input=block_input,
- size=size,
- context_len=context_len,
- learning_rate=1.0 / (2.0 * num_attention),
- drop_rate=drop_rate,
- with_bn=with_bn)
-
- with paddle.layer.mixed(size=size) as block_output:
- block_output += paddle.layer.identity_projection(residual)
- block_output += paddle.layer.identity_projection(gated_conv)
-
- # halve the variance of the sum
- block_output = paddle.layer.slope_intercept(
- input=block_output, slope=math.sqrt(0.5))
-
- block_input = block_output
-
- emb_dim = embedding.size
- encoded_vec = paddle.layer.fc(
- input=block_output,
- size=emb_dim,
- act=paddle.activation.Linear(),
- param_attr=paddle.attr.Param(learning_rate=1.0 / (2.0 * num_attention)),
- bias_attr=True)
-
- encoded_sum = paddle.layer.addto(input=[encoded_vec, embedding])
-
- # halve the variance of the sum
- encoded_sum = paddle.layer.slope_intercept(
- input=encoded_sum, slope=math.sqrt(0.5))
-
- return encoded_vec, encoded_sum
-
-
-def attention(decoder_state, cur_embedding, encoded_vec, encoded_sum):
- """
- Definition of the attention.
-
- :param decoder_state: The hidden state of the decoder.
- :type decoder_state: LayerOutput
- :param cur_embedding: The embedding vector of the current token.
- :type cur_embedding: LayerOutput
- :param encoded_vec: The source token encoding.
- :type encoded_vec: LayerOutput
- :param encoded_sum: The sum of the source token's encoding and embedding.
- :type encoded_sum: LayerOutput
- :return: The context vector and the attention weight.
- :rtype: tuple of LayerOutput
- """
- residual = decoder_state
-
- state_size = decoder_state.size
- emb_dim = cur_embedding.size
- with paddle.layer.mixed(size=emb_dim, bias_attr=True) as state_summary:
- state_summary += paddle.layer.full_matrix_projection(decoder_state)
- state_summary += paddle.layer.identity_projection(cur_embedding)
-
- # halve the variance of the sum
- state_summary = paddle.layer.slope_intercept(
- input=state_summary, slope=math.sqrt(0.5))
-
- expanded = paddle.layer.expand(input=state_summary, expand_as=encoded_vec)
-
- m = paddle.layer.dot_prod(input1=expanded, input2=encoded_vec)
-
- attention_weight = paddle.layer.fc(input=m,
- size=1,
- act=paddle.activation.SequenceSoftmax(),
- bias_attr=False)
-
- scaled = paddle.layer.scaling(weight=attention_weight, input=encoded_sum)
-
- attended = paddle.layer.pooling(
- input=scaled, pooling_type=paddle.pooling.Sum())
-
- attended_proj = paddle.layer.fc(input=attended,
- size=state_size,
- act=paddle.activation.Linear(),
- bias_attr=True)
-
- attention_result = paddle.layer.addto(input=[attended_proj, residual])
-
- # halve the variance of the sum
- attention_result = paddle.layer.slope_intercept(
- input=attention_result, slope=math.sqrt(0.5))
- return attention_result, attention_weight
-
-
-def decoder(token_emb,
- pos_emb,
- encoded_vec,
- encoded_sum,
- dict_size,
- conv_blocks=[(256, 3)] * 3,
- drop_rate=0.,
- with_bn=False):
- """
- Definition of the decoder.
-
- :param token_emb: The embedding vector of the input token.
- :type token_emb: LayerOutput
- :param pos_emb: The embedding vector of the input token's position.
- :type pos_emb: LayerOutput
- :param encoded_vec: The source token encoding.
- :type encoded_vec: LayerOutput
- :param encoded_sum: The sum of the source token's encoding and embedding.
- :type encoded_sum: LayerOutput
- :param dict_size: The size of the target dictionary.
- :type dict_size: int
- :param conv_blocks: The scale list of the convolution blocks. Each element
- of the list contains output dimension and context length
- of the corresponding convolution block.
- :type conv_blocks: list of tuple
- :param drop_rate: Dropout rate.
- :type drop_rate: float
- :param with_bn: Whether to use batch normalization or not. False is the default
- value.
- :type with_bn: bool
- :return: The probability of the predicted token and the attention weights.
- :rtype: tuple of LayerOutput
- """
-
- def attention_step(decoder_state, cur_embedding, encoded_vec, encoded_sum):
- conditional = attention(
- decoder_state=decoder_state,
- cur_embedding=cur_embedding,
- encoded_vec=encoded_vec,
- encoded_sum=encoded_sum)
- return conditional
-
- embedding = paddle.layer.addto(
- input=[token_emb, pos_emb],
- layer_attr=paddle.attr.Extra(drop_rate=drop_rate))
-
- proj_size = conv_blocks[0][0]
- block_input = paddle.layer.fc(
- input=embedding,
- size=proj_size,
- act=paddle.activation.Linear(),
- param_attr=paddle.attr.Param(
- initial_mean=0.,
- initial_std=math.sqrt((1.0 - drop_rate) / embedding.size)),
- bias_attr=True, )
-
- weight = []
- for (size, context_len) in conv_blocks:
- if block_input.size == size:
- residual = block_input
- else:
- residual = paddle.layer.fc(input=block_input,
- size=size,
- act=paddle.activation.Linear(),
- bias_attr=True)
-
- decoder_state = gated_conv_with_batchnorm(
- input=block_input,
- size=size,
- context_len=context_len,
- context_start=0,
- drop_rate=drop_rate,
- with_bn=with_bn)
-
- group_inputs = [
- decoder_state,
- embedding,
- paddle.layer.StaticInput(input=encoded_vec),
- paddle.layer.StaticInput(input=encoded_sum),
- ]
-
- conditional, attention_weight = paddle.layer.recurrent_group(
- step=attention_step, input=group_inputs)
- weight.append(attention_weight)
-
- block_output = paddle.layer.addto(input=[conditional, residual])
-
- # halve the variance of the sum
- block_output = paddle.layer.slope_intercept(
- input=block_output, slope=math.sqrt(0.5))
-
- block_input = block_output
-
- out_emb_dim = embedding.size
- block_output = paddle.layer.fc(
- input=block_output,
- size=out_emb_dim,
- act=paddle.activation.Linear(),
- layer_attr=paddle.attr.Extra(drop_rate=drop_rate))
-
- decoder_out = paddle.layer.fc(
- input=block_output,
- size=dict_size,
- act=paddle.activation.Softmax(),
- param_attr=paddle.attr.Param(
- initial_mean=0.,
- initial_std=math.sqrt((1.0 - drop_rate) / block_output.size)),
- bias_attr=True)
-
- return decoder_out, weight
-
-
-def conv_seq2seq(src_dict_size,
- trg_dict_size,
- pos_size,
- emb_dim,
- enc_conv_blocks=[(256, 3)] * 5,
- dec_conv_blocks=[(256, 3)] * 3,
- drop_rate=0.,
- with_bn=False,
- is_infer=False):
- """
- Definition of convolutional sequence-to-sequence network.
-
- :param src_dict_size: The size of the source dictionary.
- :type src_dict_size: int
- :param trg_dict_size: The size of the target dictionary.
- :type trg_dict_size: int
- :param pos_size: The total number of the position indexes, which means
- the maximum value of the index is pos_size - 1.
- :type pos_size: int
- :param emb_dim: The dimension of the embedding vector.
- :type emb_dim: int
- :param enc_conv_blocks: The scale list of the encoder's convolution blocks. Each element
- of the list contains output dimension and context length of the
- corresponding convolution block.
- :type enc_conv_blocks: list of tuple
- :param dec_conv_blocks: The scale list of the decoder's convolution blocks. Each element
- of the list contains output dimension and context length of the
- corresponding convolution block.
- :type dec_conv_blocks: list of tuple
- :param drop_rate: Dropout rate.
- :type drop_rate: float
- :param with_bn: Whether to use batch normalization or not. False is the default value.
- :type with_bn: bool
- :param is_infer: Whether to build the network for inference.
- :type is_infer: bool
- :return: The cost layer, or the decoder output and attention weights
- when is_infer is True.
- :rtype: LayerOutput
- """
- src = paddle.layer.data(
- name='src_word',
- type=paddle.data_type.integer_value_sequence(src_dict_size))
- src_pos = paddle.layer.data(
- name='src_word_pos',
- type=paddle.data_type.integer_value_sequence(pos_size +
- 1)) # one for padding
-
- src_emb = paddle.layer.embedding(
- input=src,
- size=emb_dim,
- name='src_word_emb',
- param_attr=paddle.attr.Param(
- initial_mean=0., initial_std=0.1))
- src_pos_emb = paddle.layer.embedding(
- input=src_pos,
- size=emb_dim,
- name='src_pos_emb',
- param_attr=paddle.attr.Param(
- initial_mean=0., initial_std=0.1))
-
- num_attention = len(dec_conv_blocks)
- encoded_vec, encoded_sum = encoder(
- token_emb=src_emb,
- pos_emb=src_pos_emb,
- conv_blocks=enc_conv_blocks,
- num_attention=num_attention,
- drop_rate=drop_rate,
- with_bn=with_bn)
-
- trg = paddle.layer.data(
- name='trg_word',
- type=paddle.data_type.integer_value_sequence(trg_dict_size +
- 1)) # one for padding
- trg_pos = paddle.layer.data(
- name='trg_word_pos',
- type=paddle.data_type.integer_value_sequence(pos_size +
- 1)) # one for padding
-
- trg_emb = paddle.layer.embedding(
- input=trg,
- size=emb_dim,
- name='trg_word_emb',
- param_attr=paddle.attr.Param(
- initial_mean=0., initial_std=0.1))
- trg_pos_emb = paddle.layer.embedding(
- input=trg_pos,
- size=emb_dim,
- name='trg_pos_emb',
- param_attr=paddle.attr.Param(
- initial_mean=0., initial_std=0.1))
-
- decoder_out, weight = decoder(
- token_emb=trg_emb,
- pos_emb=trg_pos_emb,
- encoded_vec=encoded_vec,
- encoded_sum=encoded_sum,
- dict_size=trg_dict_size,
- conv_blocks=dec_conv_blocks,
- drop_rate=drop_rate,
- with_bn=with_bn)
-
- if is_infer:
- return decoder_out, weight
-
- trg_next_word = paddle.layer.data(
- name='trg_next_word',
- type=paddle.data_type.integer_value_sequence(trg_dict_size))
- cost = paddle.layer.classification_cost(
- input=decoder_out, label=trg_next_word)
-
- return cost
diff --git a/legacy/conv_seq2seq/preprocess.py b/legacy/conv_seq2seq/preprocess.py
deleted file mode 100644
index 1d5c7cdd7b5cc91e28854fa0bbeeffc9dcbe4e5c..0000000000000000000000000000000000000000
--- a/legacy/conv_seq2seq/preprocess.py
+++ /dev/null
@@ -1,30 +0,0 @@
-#coding=utf-8
-
-import cPickle
-
-
-def concat_file(file1, file2, dst_file):
- with open(dst_file, 'w') as dst:
- with open(file1) as f1:
- with open(file2) as f2:
- for line1, line2 in zip(f1, f2):
- line1 = line1.strip()
- line = line1 + '\t' + line2
- dst.write(line)
-
-
-if __name__ == '__main__':
- concat_file('dev.de-en.de', 'dev.de-en.en', 'dev')
- concat_file('test.de-en.de', 'test.de-en.en', 'test')
- concat_file('train.de-en.de', 'train.de-en.en', 'train')
-
- src_dict = cPickle.load(open('vocab.de', 'rb'))
- trg_dict = cPickle.load(open('vocab.en', 'rb'))
-
- with open('src_dict', 'w') as f:
- f.write('<s>\n<e>\nUNK\n')
- f.writelines('\n'.join(src_dict.keys()))
-
- with open('trg_dict', 'w') as f:
- f.write('<s>\n<e>\nUNK\n')
- f.writelines('\n'.join(trg_dict.keys()))
diff --git a/legacy/conv_seq2seq/reader.py b/legacy/conv_seq2seq/reader.py
deleted file mode 100644
index ad420af5faade1cd5ee7ef947f7f8920ce6a8bdb..0000000000000000000000000000000000000000
--- a/legacy/conv_seq2seq/reader.py
+++ /dev/null
@@ -1,67 +0,0 @@
-#coding=utf-8
-
-import random
-
-
-def load_dict(dict_file):
- word_dict = dict()
- with open(dict_file, 'r') as f:
- for i, line in enumerate(f):
- w = line.strip().split()[0]
- word_dict[w] = i
- return word_dict
-
-
-def get_reverse_dict(dictionary):
- reverse_dict = {dictionary[k]: k for k in dictionary.keys()}
- return reverse_dict
-
-
-def load_data(data_file, src_dict, trg_dict):
- UNK_IDX = src_dict['UNK']
- with open(data_file, 'r') as f:
- for line in f:
- line_split = line.strip().split('\t')
- if len(line_split) < 2:
- continue
- src, trg = line_split
- src_words = src.strip().split()
- trg_words = trg.strip().split()
- src_seq = [src_dict.get(w, UNK_IDX) for w in src_words]
- trg_seq = [trg_dict.get(w, UNK_IDX) for w in trg_words]
- yield src_seq, trg_seq
-
-
-def data_reader(data_file, src_dict, trg_dict, pos_size, padding_num):
- def reader():
- UNK_IDX = src_dict['UNK']
- word_padding = len(trg_dict)
- pos_padding = pos_size
-
- def _get_pos(pos_list, pos_size, pos_padding):
- return [pos if pos < pos_size else pos_padding for pos in pos_list]
-
- with open(data_file, 'r') as f:
- for line in f:
- line_split = line.strip().split('\t')
- if len(line_split) != 2:
- continue
- src, trg = line_split
- src = src.strip().split()
- src_word = [src_dict.get(w, UNK_IDX) for w in src]
- src_word_pos = range(len(src_word))
- src_word_pos = _get_pos(src_word_pos, pos_size, pos_padding)
-
- trg = trg.strip().split()
- trg_word = [trg_dict['<s>']] + [trg_dict.get(w, UNK_IDX) for w in trg]
- trg_word_pos = range(len(trg_word))
- trg_word_pos = _get_pos(trg_word_pos, pos_size, pos_padding)
-
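- # Left-pad the decoder input by padding_num positions (consumed by the
- # causal convolutions) and right-pad the labels so the lengths still match.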
- trg_next_word = trg_word[1:] + [trg_dict['<e>']]
- trg_word = [word_padding] * padding_num + trg_word
- trg_word_pos = [pos_padding] * padding_num + trg_word_pos
- trg_next_word = trg_next_word + [trg_dict['<e>']] * padding_num
- yield src_word, src_word_pos, trg_word, trg_word_pos, trg_next_word
-
- return reader
diff --git a/legacy/conv_seq2seq/train.py b/legacy/conv_seq2seq/train.py
deleted file mode 100644
index 4bd9a1af675ada5820bb375938a4675e6e71fbe1..0000000000000000000000000000000000000000
--- a/legacy/conv_seq2seq/train.py
+++ /dev/null
@@ -1,263 +0,0 @@
-#coding=utf-8
-
-import os
-import sys
-import time
-import argparse
-import distutils.util
-import gzip
-import numpy as np
-
-import paddle.v2 as paddle
-from model import conv_seq2seq
-import reader
-
-
-def parse_args():
- parser = argparse.ArgumentParser(
- description="PaddlePaddle Convolutional Seq2Seq")
- parser.add_argument(
- '--train_data_path',
- type=str,
- required=True,
- help="Path of the training set")
- parser.add_argument(
- '--test_data_path', type=str, help='Path of the test set')
- parser.add_argument(
- '--src_dict_path',
- type=str,
- required=True,
- help='Path of source dictionary')
- parser.add_argument(
- '--trg_dict_path',
- type=str,
- required=True,
- help='Path of target dictionary')
- parser.add_argument(
- '--enc_blocks', type=str, default="[(256, 3)] * 5",
- help='Convolution blocks of the encoder. (default: %(default)s)')
- parser.add_argument(
- '--dec_blocks', type=str, default="[(256, 3)] * 3",
- help='Convolution blocks of the decoder. (default: %(default)s)')
- parser.add_argument(
- '--emb_size',
- type=int,
- default=256,
- help='Dimension of word embedding. (default: %(default)s)')
- parser.add_argument(
- '--pos_size',
- type=int,
- default=200,
- help='Total number of the position indexes. (default: %(default)s)')
- parser.add_argument(
- '--drop_rate',
- type=float,
- default=0.,
- help='Dropout rate. (default: %(default)s)')
- parser.add_argument(
- "--use_bn",
- default=False,
- type=distutils.util.strtobool,
- help="Use batch normalization or not. (default: %(default)s)")
- parser.add_argument(
- "--use_gpu",
- default=False,
- type=distutils.util.strtobool,
- help="Use gpu or not. (default: %(default)s)")
- parser.add_argument(
- "--trainer_count",
- default=1,
- type=int,
- help="Trainer number. (default: %(default)s)")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=32,
- help="Size of a mini-batch. (default: %(default)s)")
- parser.add_argument(
- '--num_passes',
- type=int,
- default=15,
- help="Number of passes to train. (default: %(default)s)")
- return parser.parse_args()
-
-
-def create_reader(padding_num,
- train_data_path,
- test_data_path=None,
- src_dict=None,
- trg_dict=None,
- pos_size=200,
- batch_size=32):
-
- train_reader = paddle.batch(
- reader=paddle.reader.shuffle(
- reader=reader.data_reader(
- data_file=train_data_path,
- src_dict=src_dict,
- trg_dict=trg_dict,
- pos_size=pos_size,
- padding_num=padding_num),
- buf_size=10240),
- batch_size=batch_size)
-
- test_reader = None
- if test_data_path:
- test_reader = paddle.batch(
- reader=paddle.reader.shuffle(
- reader=reader.data_reader(
- data_file=test_data_path,
- src_dict=src_dict,
- trg_dict=trg_dict,
- pos_size=pos_size,
- padding_num=padding_num),
- buf_size=10240),
- batch_size=batch_size)
-
- return train_reader, test_reader
-
-
-def train(train_data_path,
- test_data_path,
- src_dict_path,
- trg_dict_path,
- enc_conv_blocks,
- dec_conv_blocks,
- emb_dim=256,
- pos_size=200,
- drop_rate=0.,
- use_bn=False,
- batch_size=32,
- num_passes=15):
- """
- Train the convolution sequence-to-sequence model.
-
- :param train_data_path: The path of the training set.
- :type train_data_path: str
- :param test_data_path: The path of the test set.
- :type test_data_path: str
- :param src_dict_path: The path of the source dictionary.
- :type src_dict_path: str
- :param trg_dict_path: The path of the target dictionary.
- :type trg_dict_path: str
- :param enc_conv_blocks: The scale list of the encoder's convolution blocks. And each element of
- the list contains output dimension and context length of the corresponding
- convolution block.
- :type enc_conv_blocks: list of tuple
- :param dec_conv_blocks: The scale list of the decoder's convolution blocks. And each element of
- the list contains output dimension and context length of the corresponding
- convolution block.
- :type dec_conv_blocks: list of tuple
- :param emb_dim: The dimension of the embedding vector.
- :type emb_dim: int
- :param pos_size: The total number of the position indexes, which means
- the maximum value of the index is pos_size - 1.
- :type pos_size: int
- :param drop_rate: Dropout rate.
- :type drop_rate: float
- :param use_bn: Whether to use batch normalization or not. False is the default value.
- :type use_bn: bool
- :param batch_size: The size of a mini-batch.
- :type batch_size: int
- :param num_passes: The total number of the passes to train.
- :type num_passes: int
- """
- # load dict
- src_dict = reader.load_dict(src_dict_path)
- trg_dict = reader.load_dict(trg_dict_path)
- src_dict_size = len(src_dict)
- trg_dict_size = len(trg_dict)
-
- optimizer = paddle.optimizer.Adam(learning_rate=1e-3)
-
- cost = conv_seq2seq(
- src_dict_size=src_dict_size,
- trg_dict_size=trg_dict_size,
- pos_size=pos_size,
- emb_dim=emb_dim,
- enc_conv_blocks=enc_conv_blocks,
- dec_conv_blocks=dec_conv_blocks,
- drop_rate=drop_rate,
- with_bn=use_bn,
- is_infer=False)
-
- # create parameters and trainer
- parameters = paddle.parameters.create(cost)
- trainer = paddle.trainer.SGD(cost=cost,
- parameters=parameters,
- update_equation=optimizer)
-
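- # Each causal convolution block in the decoder consumes (context_len - 1)
- # positions of left context, so the target sequence is padded by their sum.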
- padding_list = [context_len - 1 for (size, context_len) in dec_conv_blocks]
- padding_num = sum(padding_list)
- train_reader, test_reader = create_reader(
- padding_num=padding_num,
- train_data_path=train_data_path,
- test_data_path=test_data_path,
- src_dict=src_dict,
- trg_dict=trg_dict,
- pos_size=pos_size,
- batch_size=batch_size)
-
- feeding = {
- 'src_word': 0,
- 'src_word_pos': 1,
- 'trg_word': 2,
- 'trg_word_pos': 3,
- 'trg_next_word': 4
- }
-
- # create event handler
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % 20 == 0:
- cur_time = time.strftime('%Y.%m.%d %H:%M:%S', time.localtime())
- print "[%s]: Pass: %d, Batch: %d, TrainCost: %f, %s" % (
- cur_time, event.pass_id, event.batch_id, event.cost,
- event.metrics)
- sys.stdout.flush()
-
- if isinstance(event, paddle.event.EndPass):
- if test_reader is not None:
- cur_time = time.strftime('%Y.%m.%d %H:%M:%S', time.localtime())
- result = trainer.test(reader=test_reader, feeding=feeding)
- print "[%s]: Pass: %d, TestCost: %f, %s" % (
- cur_time, event.pass_id, result.cost, result.metrics)
- sys.stdout.flush()
- with gzip.open("output/params.pass-%d.tar.gz" % event.pass_id,
- 'w') as f:
- trainer.save_parameter_to_tar(f)
-
- if not os.path.exists('output'):
- os.mkdir('output')
-
- trainer.train(
- reader=train_reader,
- event_handler=event_handler,
- num_passes=num_passes,
- feeding=feeding)
-
-
-def main():
- args = parse_args()
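- # The block specs are Python expressions such as "[(256, 3)] * 5", hence
- # eval; ast.literal_eval would reject the list multiplication.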
- enc_conv_blocks = eval(args.enc_blocks)
- dec_conv_blocks = eval(args.dec_blocks)
-
- sys.setrecursionlimit(10000)
-
- paddle.init(use_gpu=args.use_gpu, trainer_count=args.trainer_count)
-
- train(
- train_data_path=args.train_data_path,
- test_data_path=args.test_data_path,
- src_dict_path=args.src_dict_path,
- trg_dict_path=args.trg_dict_path,
- enc_conv_blocks=enc_conv_blocks,
- dec_conv_blocks=dec_conv_blocks,
- emb_dim=args.emb_size,
- pos_size=args.pos_size,
- drop_rate=args.drop_rate,
- use_bn=args.use_bn,
- batch_size=args.batch_size,
- num_passes=args.num_passes)
-
-
-if __name__ == '__main__':
- main()
diff --git a/legacy/ctr/README.cn.md b/legacy/ctr/README.cn.md
deleted file mode 100644
index d717264c46529c4ca3be6500983558b0384a7d77..0000000000000000000000000000000000000000
--- a/legacy/ctr/README.cn.md
+++ /dev/null
@@ -1,369 +0,0 @@
-The code samples in this directory require PaddlePaddle v0.10.0. If your installed version of PaddlePaddle is lower than this requirement, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update your installation.
-
----
-
-# Click-Through Rate (CTR) Prediction
-
-The files in this directory and their descriptions:
-
-```
-├── README.md # this tutorial in markdown
-├── dataset.md # dataset processing tutorial
-├── images # images for this tutorial
-│ ├── lr_vs_dnn.jpg
-│ └── wide_deep.png
-├── infer.py # inference script
-├── network_conf.py # model network configuration
-├── reader.py # data reader
-├── train.py # training script
-├── utils.py # helper functions
-└── avazu_data_processer.py # sample data preprocessing script
-```
-
-## Background
-
-CTR (Click-Through Rate) \[[1](https://en.wikipedia.org/wiki/Click-through_rate)\]
-prediction estimates the probability that a user clicks on a specific link, and is an important step in ad serving. Accurate CTR estimation is essential for maximizing the revenue of an online advertising system.
-
-When there are multiple ad slots, CTR estimates usually serve as a basis for ranking. For example, in a search engine's ad system, when a user enters a commercially valuable query, the system roughly performs the following steps to show ads:
-
-1. Retrieve the set of ads related to the query.
-2. Filter by business rules and relevance.
-3. Rank by the auction mechanism and CTR.
-4. Show the ads.
-
-As we can see, CTR plays an important role in the final ranking.
-
-### Brief history
-In industry, CTR models have gone through the following stages:
-
-- Logistic Regression (LR) / GBDT + feature engineering
-- LR + DNN features
-- DNN + feature engineering
-
-LR dominated in the early days, but with its strong learning capacity and increasingly mature
-performance optimizations, the DNN model has gradually taken over the CTR prediction task.
-
-
-### LR vs DNN
-
-The figure below compares the structure of LR with a \(3x2\) DNN model:
-
-![](images/lr_vs_dnn.jpg)
-
-Figure 1. Comparison of LR and DNN model structures
-
-The blue arrows in the LR part map directly onto the corresponding structures in the DNN, so the two share some common ground (such as weighted sums).
-But at the same input dimension, LR's model complexity can be much lower than the DNN's (loosely, the more complex the model, the more potential it has to learn complex information);
-for LR to match the learning capacity of a DNN, it must increase the input dimension, i.e. the number of features,
-which is why LR is always bound to large-scale feature engineering.
-
-LR's advantage over DNN models is its capacity for large-scale sparse features, in both memory and computation, for which industry has very mature optimizations;
-a DNN model, on the other hand, can learn new features on its own, which improves feature efficiency to some extent
-and makes it more likely to learn better with features of the same scale.
-
-Later sections of this document demonstrate how to use PaddlePaddle to write a model that combines the strengths of both.
-
-
-## Data and task abstraction
-
-Taking `click` as the learning target, the task can be formulated in several ways:
-
-1. Learn `click` directly, as 0/1 binary classification
-2. Learning to rank, concretely pairwise rank (label 1 > 0) or listwise rank
-3. Compute each ad's click rate, pair up ads under the same query, and rank or classify with the higher-CTR ad above the lower one
-
-We directly use the first formulation as a classification task.
-
-We use the dataset of the Kaggle `Click-through rate prediction` task \[[2](https://www.kaggle.com/c/avazu-ctr-prediction/data)\] to demonstrate the model in this example.
-
-See [data process](./dataset.md) for the concrete feature processing.
-
-The input format of the demo model in this tutorial is as follows:
-
-```
-# <dnn input ids> \t <lr input sparse values> \t click
-1 23 190 \t 230:0.12 3421:0.9 23451:0.12 \t 0
-23 231 \t 1230:0.12 13421:0.9 \t 1
-```
-
-In detail:
-
-- `dnn input ids` is a one-hot representation; only the IDs whose value is 1 are listed (note this is not a variable-length input)
-- `lr input sparse values` uses the `ID:VALUE` representation; values are preferably scaled to the range `[-1, 1]`.
-
-In addition, training requires a file describing the input dimensions of the dnn and lr submodels, in the following format:
-
-```
-dnn_input_dim: <int>
-lr_input_dim: <int>
-```
-
-where `<int>` denotes an integer value.
-
-The script `avazu_data_processer.py` in this directory can process the downloaded demo dataset \[[2](#references)\]; its usage is as follows:
-
-```
-usage: avazu_data_processer.py [-h] --data_path DATA_PATH --output_dir
- OUTPUT_DIR
- [--num_lines_to_detect NUM_LINES_TO_DETECT]
- [--test_set_size TEST_SET_SIZE]
- [--train_size TRAIN_SIZE]
-
-PaddlePaddle CTR example
-
-optional arguments:
- -h, --help show this help message and exit
- --data_path DATA_PATH
- path of the Avazu dataset
- --output_dir OUTPUT_DIR
- directory to output
- --num_lines_to_detect NUM_LINES_TO_DETECT
- number of records to detect dataset's meta info
- --test_set_size TEST_SET_SIZE
- size of the validation dataset(default: 10000)
- --train_size TRAIN_SIZE
- size of the trainset (default: 100000)
-```
-
-- `data_path`: path of the data to be processed
-- `output_dir`: output path for the generated data
-- `num_lines_to_detect`: number of lines scanned in advance to collect the IDs
-- `test_set_size`: number of lines in the generated test set
-- `train_size`: number of lines in the generated training set
-
-## Wide & Deep Learning Model
-
-In 2016, Google proposed the Wide & Deep Learning framework to combine the advantages of DNNs, which suit learning abstract features, and LR, which suits large-scale sparse features.
-
-
-### Model overview
-
-The Wide & Deep Learning Model \[[3](#references)\] can be used as a relatively mature model framework,
-and it has seen industrial use in CTR prediction, so this document demonstrates using it for the CTR prediction task.
-
-The model structure is as follows:
-
-![](images/wide_deep.png)
-
-Figure 2. Wide & Deep Model
-
-The Wide part at the top of the model can accommodate large-scale sparse features and has some memorization ability for specific information (such as IDs);
-the Deep part at the bottom can learn implicit relations between features, giving better learning and inference ability with the same number of features.
-
-
-### Model input
-
-The model takes exactly three inputs:
-
-- `dnn_input`, the input of the Deep part
-- `lr_input`, the input of the Wide part
-- `click`, whether the ad was clicked, the label learned by the binary classification model
-
-```python
-dnn_merged_input = layer.data(
- name='dnn_input',
- type=paddle.data_type.sparse_binary_vector(data_meta_info['dnn_input']))
-
-lr_merged_input = layer.data(
- name='lr_input',
- type=paddle.data_type.sparse_binary_vector(data_meta_info['lr_input']))
-
-click = paddle.layer.data(name='click', type=dtype.dense_vector(1))
-```
-
-### The Wide part
-
-The Wide part uses the LR model directly, but the activation function is changed to `RELU` for speed.
-
-```python
-def build_lr_submodel():
- fc = layer.fc(
- input=lr_merged_input, size=1, name='lr', act=paddle.activation.Relu())
- return fc
-```
-
-### The Deep part
-
-The Deep part uses a standard multi-layer feed-forward DNN.
-
-```python
-def build_dnn_submodel(dnn_layer_dims):
- dnn_embedding = layer.fc(input=dnn_merged_input, size=dnn_layer_dims[0])
- _input_layer = dnn_embedding
- for i, dim in enumerate(dnn_layer_dims[1:]):
- fc = layer.fc(
- input=_input_layer,
- size=dim,
- act=paddle.activation.Relu(),
- name='dnn-fc-%d' % i)
- _input_layer = fc
- return _input_layer
-```
-
-### Combining the two
-
-The top-level outputs of the two submodels are summed with weights to form the model's output. The output uses `sigmoid` as the activation function, producing a prediction in (0, 1)
-that approximates the distribution of the binary labels in the training data and serves as the final CTR estimate.
-
-```python
-# combine DNN and LR submodels
-def combine_submodels(dnn, lr):
- merge_layer = layer.concat(input=[dnn, lr])
- fc = layer.fc(
- input=merge_layer,
- size=1,
- name='output',
- # use the sigmoid function to approximate CTR, which is a float value between 0 and 1.
- act=paddle.activation.Sigmoid())
- return fc
-```
-
-### Defining the training task
-```python
-dnn = build_dnn_submodel(dnn_layer_dims)
-lr = build_lr_submodel()
-output = combine_submodels(dnn, lr)
-
-# ==============================================================================
-# cost and train period
-# ==============================================================================
-classification_cost = paddle.layer.multi_binary_label_cross_entropy_cost(
- input=output, label=click)
-
-
-paddle.init(use_gpu=False, trainer_count=11)
-
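- # Gated Linear Unit (GLU): split the 2*size projection into a linear half
- # and a sigmoid gate, then combine the two element-wise below.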
-params = paddle.parameters.create(classification_cost)
-
-optimizer = paddle.optimizer.Momentum(momentum=0)
-
-trainer = paddle.trainer.SGD(
- cost=classification_cost, parameters=params, update_equation=optimizer)
-
-dataset = AvazuDataset(train_data_path, n_records_as_test=test_set_size)
-
-def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % 100 == 0:
- logging.warning("Pass %d, Samples %d, Cost %f" % (
- event.pass_id, event.batch_id * batch_size, event.cost))
-
- if event.batch_id % 1000 == 0:
- result = trainer.test(
- reader=paddle.batch(dataset.test, batch_size=1000),
- feeding=field_index)
- logging.warning("Test %d-%d, Cost %f" % (event.pass_id, event.batch_id,
- result.cost))
-
-
-trainer.train(
- reader=paddle.batch(
- paddle.reader.shuffle(dataset.train, buf_size=500),
- batch_size=batch_size),
- feeding=field_index,
- event_handler=event_handler,
- num_passes=100)
-```
-## Running training and testing
-Training the model takes the following steps:
-
-1. Prepare the training data
- 1. Download train.gz from [Kaggle CTR](https://www.kaggle.com/c/avazu-ctr-prediction/data)
- 2. Unzip train.gz to get train.txt
- 3. Run `mkdir -p output; python avazu_data_processer.py --data_path train.txt --output_dir output --num_lines_to_detect 1000 --test_set_size 100` to generate the demo data
-2. Run `python train.py --train_data_path ./output/train.txt --test_data_path ./output/test.txt --data_meta_file ./output/data.meta.txt --model_type=0` to start training
-
-In step 2 above, command-line arguments can be passed to `train.py` to customize training; the arguments and their usage are as follows
-
-```
-usage: train.py [-h] --train_data_path TRAIN_DATA_PATH
- [--test_data_path TEST_DATA_PATH] [--batch_size BATCH_SIZE]
- [--num_passes NUM_PASSES]
- [--model_output_prefix MODEL_OUTPUT_PREFIX] --data_meta_file
- DATA_META_FILE --model_type MODEL_TYPE
-
-PaddlePaddle CTR example
-
-optional arguments:
- -h, --help show this help message and exit
- --train_data_path TRAIN_DATA_PATH
- path of training dataset
- --test_data_path TEST_DATA_PATH
- path of testing dataset
- --batch_size BATCH_SIZE
- size of mini-batch (default:10000)
- --num_passes NUM_PASSES
- number of passes to train
- --model_output_prefix MODEL_OUTPUT_PREFIX
- prefix of path for model to store (default:
- ./ctr_models)
- --data_meta_file DATA_META_FILE
- path of data meta info file
- --model_type MODEL_TYPE
- model type, classification: 0, regression 1 (default
- classification)
-```
-
-- `train_data_path`: path of the training set
-- `test_data_path`: path of the test set
-- `num_passes`: number of passes to train
-- `data_meta_file`: see the description in [Data and task abstraction](#data-and-task-abstraction).
-- `model_type`: classification or regression
-
-
-## Making predictions with the trained model
-The trained model can be used to predict new data; the prediction data format is
-
-```
-# <dnn input ids> \t <lr input sparse values>
-1 23 190 \t 230:0.12 3421:0.9 23451:0.12
-23 231 \t 1230:0.12 13421:0.9
-```
-
-The only difference from the training data is that there is no label, i.e. the `click` value in the third column of the training data.
-
-`infer.py` is used as follows
-
-```
-usage: infer.py [-h] --model_gz_path MODEL_GZ_PATH --data_path DATA_PATH
- --prediction_output_path PREDICTION_OUTPUT_PATH
- [--data_meta_path DATA_META_PATH] --model_type MODEL_TYPE
-
-PaddlePaddle CTR example
-
-optional arguments:
- -h, --help show this help message and exit
- --model_gz_path MODEL_GZ_PATH
- path of model parameters gz file
- --data_path DATA_PATH
- path of the dataset to infer
- --prediction_output_path PREDICTION_OUTPUT_PATH
- path to output the prediction
- --data_meta_path DATA_META_PATH
- path of trainset's meta info, default is ./data.meta
- --model_type MODEL_TYPE
- model type, classification: 0, regression 1 (default
- classification)
-```
-
-- `model_gz_path`: path of the `gz`-compressed model
-- `data_path`: path of the data to predict
-- `prediction_output_path`: output path of the predictions
-- `data_meta_file`: see the description in [Data and task abstraction](#data-and-task-abstraction).
-- `model_type`: classification or regression
-
-The sample data can be predicted with the following command
-
-```
-python infer.py --model_gz_path <model_path> --data_path output/infer.txt --prediction_output_path predictions.txt --data_meta_path data.meta.txt
-```
-
-The final predictions are written to `predictions.txt`.
-
-## References
-1. https://en.wikipedia.org/wiki/Click-through_rate
-2. https://www.kaggle.com/c/avazu-ctr-prediction/data
-3. Cheng H T, Koc L, Harmsen J, et al. [Wide & deep learning for recommender systems](https://arxiv.org/pdf/1606.07792.pdf)[C]//Proceedings of the 1st Workshop on Deep Learning for Recommender Systems. ACM, 2016: 7-10.
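- # Dot-product attention: score each source position against the expanded
- # summary of the decoder state and the current target embedding.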
diff --git a/legacy/ctr/README.md b/legacy/ctr/README.md
deleted file mode 100644
index 9ace483be6126b31e064ce3014cea1b08664f8cf..0000000000000000000000000000000000000000
--- a/legacy/ctr/README.md
+++ /dev/null
@@ -1,343 +0,0 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is v0.10.0. If you are on a version of PaddlePaddle earlier than v0.10.0, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
-
----
-
-# Click-Through Rate Prediction
-
-## Introduction
-
-CTR (Click-Through Rate)\[[1](https://en.wikipedia.org/wiki/Click-through_rate)\]
-prediction estimates the probability that a user clicks on an advertisement. It is widely used in the advertising industry, and accurate click-through rate estimates are important for maximizing online advertising revenue.
-
-When there are multiple ad slots, CTR estimates are generally used as a basis for ranking. For example, in a search engine's ad system, when the user enters a query, the system typically performs the following steps to show relevant ads.
-
-1. Get the ad collection associated with the user's search term.
-2. Business rules and relevance filtering.
-3. Rank by auction mechanism and CTR.
-4. Show ads.
-
-Here, CTR plays a crucial role in the final ranking.
-
-### Brief history
-Historically, the CTR prediction model has been evolving as follows.
-
-- Logistic Regression(LR) / Gradient Boosting Decision Trees (GBDT) + feature engineering
-- LR + Deep Neural Network (DNN)
-- DNN + feature engineering
-
-In the early stages of development LR dominated, but in recent years DNN-based models have become the mainstream.
-
-
-### LR vs DNN
-
-The following figure compares the structures of the LR and DNN models:
-
-![](images/lr_vs_dnn.jpg)
-
-Figure 1. LR and DNN model structure comparison
-
-
-We can see that LR and DNN share some common structure. However, by adding activation units and further layers, a DNN can model non-linear relations between inputs and outputs, which enables it to achieve better results in CTR estimation.
-
-In the following, we demonstrate how to use PaddlePaddle to learn to predict CTR.
-
-## Data and Model formation
-
-Here `click` is the learning target. There are several ways to formulate the objective:
-
-1. Learn `click` directly, as 0/1 binary classification
-2. Learning to rank, with pairwise rank or listwise rank
-3. Estimate the click rate of each ad, then rank ads under the same query by click rate
-
-In this example, we use the first method.
-
-We use the Kaggle `Click-through rate prediction` task \[[2](https://www.kaggle.com/c/avazu-ctr-prediction/data)\].
-
-Please see the [data process](./dataset.md) for pre-processing data.
-
-The input data format for the demo model in this tutorial is as follows:
-
-```
-# <dnn input ids> \t <lr input sparse values> \t click
-1 23 190 \t 230:0.12 3421:0.9 23451:0.12 \t 0
-23 231 \t 1230:0.12 13421:0.9 \t 1
-```
-
-Description:
-
-- `dnn input ids`: one-hot encoded; only the IDs whose value is 1 are listed.
-- `lr input sparse values`: the `ID:VALUE` representation; values are preferably scaled to the range `[-1, 1]`.
-
-In addition, training requires a file describing the input dimensions of the dnn and lr submodels, in the following format:
-
-```
-dnn_input_dim: <int>
-lr_input_dim: <int>
-```
-
-`<int>` represents an integer value.
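-
-As a minimal sketch (the helper below is ours, not part of the tutorial code), one such input line can be parsed like this:
-
-```python
-def parse_line(line):
-    # "<dnn input ids> \t <id:value pairs> \t <click>"
-    dnn_part, lr_part, click = line.rstrip('\n').split('\t')
-    dnn_ids = [int(x) for x in dnn_part.split()]
-    lr_pairs = [(int(k), float(v)) for k, v in
-                (item.split(':') for item in lr_part.split())]
-    return dnn_ids, lr_pairs, int(click)
-```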
-
-`avazu_data_processer.py` can be used to pre-process the downloaded dataset \[[2](#references)\].
-
-```
-usage: avazu_data_processer.py [-h] --data_path DATA_PATH --output_dir
- OUTPUT_DIR
- [--num_lines_to_detect NUM_LINES_TO_DETECT]
- [--test_set_size TEST_SET_SIZE]
- [--train_size TRAIN_SIZE]
-
-PaddlePaddle CTR example
-
-optional arguments:
- -h, --help show this help message and exit
- --data_path DATA_PATH
- path of the Avazu dataset
- --output_dir OUTPUT_DIR
- directory to output
- --num_lines_to_detect NUM_LINES_TO_DETECT
- number of records to detect dataset's meta info
- --test_set_size TEST_SET_SIZE
- size of the validation dataset(default: 10000)
- --train_size TRAIN_SIZE
- size of the trainset (default: 100000)
-```
-
-- `data_path` The path of the data to be processed
-- `output_dir` The output directory for the generated data
-- `num_lines_to_detect` The number of lines scanned to collect the dataset's meta information
-- `test_set_size` The number of rows of the test set
-- `train_size` The number of rows of the training set
-
-## Wide & Deep Learning Model
-
-In 2016, Google proposed the Wide & Deep Learning framework to integrate the advantages of DNNs, which suit learning abstract features, and LR models, which suit large-scale sparse features.
-
-
-### Introduction to the model
-
-The Wide & Deep Learning Model\[[3](#references)\] is a relatively mature model framework that still sees industrial use in CTR prediction. Here we demonstrate the use of this model to complete the CTR prediction task.
-
-The model structure is as follows:
-
-![](images/wide_deep.png)
-
-Figure 2. Wide & Deep Model
-
-
-The Wide part at the top of the model can accommodate large-scale sparse features and has some memorization ability for specific information (such as IDs); the Deep part at the bottom can learn the implicit relations between features.
-
-
-### Model Input
-
-The model has the following three inputs:
-
-- `dnn_input`, the input of the Deep part
-- `lr_input`, the input of the Wide part
-- `click`, whether the ad was clicked, used as the label
-
-```python
-dnn_merged_input = layer.data(
- name='dnn_input',
- type=paddle.data_type.sparse_binary_vector(self.dnn_input_dim))
-
-lr_merged_input = layer.data(
- name='lr_input',
- type=paddle.data_type.sparse_vector(self.lr_input_dim))
-
-click = paddle.layer.data(name='click', type=dtype.dense_vector(1))
-```
-
-### Wide part
-
-The Wide part uses the LR model, but the activation function is changed to `RELU` for speed.
-
-```python
-def build_lr_submodel():
- fc = layer.fc(
- input=lr_merged_input, size=1, name='lr', act=paddle.activation.Relu())
- return fc
-```
-
-### Deep part
-
-The Deep part uses a standard multi-layer DNN.
-
-```python
-def build_dnn_submodel(dnn_layer_dims):
- dnn_embedding = layer.fc(input=dnn_merged_input, size=dnn_layer_dims[0])
- _input_layer = dnn_embedding
- for i, dim in enumerate(dnn_layer_dims[1:]):
- fc = layer.fc(
- input=_input_layer,
- size=dim,
- act=paddle.activation.Relu(),
- name='dnn-fc-%d' % i)
- _input_layer = fc
- return _input_layer
-```
-
-### Combine
-
-The outputs of the two submodels are concatenated, and the final layer uses the `sigmoid` activation to produce a prediction in (0, 1).
-
-```python
-# combine DNN and LR submodels
-def combine_submodels(dnn, lr):
- merge_layer = layer.concat(input=[dnn, lr])
- fc = layer.fc(
- input=merge_layer,
- size=1,
- name='output',
- # use the sigmoid function to approximate CTR, which is a float value between 0 and 1.
- act=paddle.activation.Sigmoid())
- return fc
-```
-
-### Training
-```python
-dnn = build_dnn_submodel(dnn_layer_dims)
-lr = build_lr_submodel()
-output = combine_submodels(dnn, lr)
-
-# ==============================================================================
-# cost and train period
-# ==============================================================================
-classification_cost = paddle.layer.multi_binary_label_cross_entropy_cost(
- input=output, label=click)
-
-
-paddle.init(use_gpu=False, trainer_count=11)
-
-params = paddle.parameters.create(classification_cost)
-
-optimizer = paddle.optimizer.Momentum(momentum=0)
-
-trainer = paddle.trainer.SGD(
- cost=classification_cost, parameters=params, update_equation=optimizer)
-
-dataset = AvazuDataset(train_data_path, n_records_as_test=test_set_size)
-
-def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % 100 == 0:
- logging.warning("Pass %d, Samples %d, Cost %f" % (
- event.pass_id, event.batch_id * batch_size, event.cost))
-
- if event.batch_id % 1000 == 0:
- result = trainer.test(
- reader=paddle.batch(dataset.test, batch_size=1000),
- feeding=field_index)
- logging.warning("Test %d-%d, Cost %f" % (event.pass_id, event.batch_id,
- result.cost))
-
-
-trainer.train(
- reader=paddle.batch(
- paddle.reader.shuffle(dataset.train, buf_size=500),
- batch_size=batch_size),
- feeding=field_index,
- event_handler=event_handler,
- num_passes=100)
-```
-
-## Run training and testing
-Training the model takes the following steps:
-
-1. Prepare training data
- 1. Download train.gz from [Kaggle CTR](https://www.kaggle.com/c/avazu-ctr-prediction/data) .
- 2. Unzip train.gz to get train.txt
- 3. Run `mkdir -p output; python avazu_data_processer.py --data_path train.txt --output_dir output --num_lines_to_detect 1000 --test_set_size 100` to generate the demo data
-2. Execute `python train.py --train_data_path ./output/train.txt --test_data_path ./output/test.txt --data_meta_file ./output/data.meta.txt --model_type=0`. Start training.
-
-The argument options for `train.py` are as follows.
-
-```
-usage: train.py [-h] --train_data_path TRAIN_DATA_PATH
- [--test_data_path TEST_DATA_PATH] [--batch_size BATCH_SIZE]
- [--num_passes NUM_PASSES]
- [--model_output_prefix MODEL_OUTPUT_PREFIX] --data_meta_file
- DATA_META_FILE --model_type MODEL_TYPE
-
-PaddlePaddle CTR example
-
-optional arguments:
- -h, --help show this help message and exit
- --train_data_path TRAIN_DATA_PATH
- path of training dataset
- --test_data_path TEST_DATA_PATH
- path of testing dataset
- --batch_size BATCH_SIZE
- size of mini-batch (default:10000)
- --num_passes NUM_PASSES
- number of passes to train
- --model_output_prefix MODEL_OUTPUT_PREFIX
- prefix of path for model to store (default:
- ./ctr_models)
- --data_meta_file DATA_META_FILE
- path of data meta info file
- --model_type MODEL_TYPE
- model type, classification: 0, regression 1 (default
- classification)
-```
-
-- `train_data_path` : The path of the training set
-- `test_data_path` : The path of the test set
-- `num_passes`: number of passes to train
-- `data_meta_file`: please refer to the description in [Data and Model formation](#data-and-model-formation).
-- `model_type`: model classification or regression
-
-
-## Using the trained model for prediction
-The trained model can be used to predict new data. The format of the prediction data is as follows.
-
-
-```
-# <dnn input ids> \t <lr input sparse values>
-1 23 190 \t 230:0.12 3421:0.9 23451:0.12
-23 231 \t 1230:0.12 13421:0.9
-```
-
-The only difference from the training data is that there are no labels (i.e. the `click` values).
-
-We can now use `infer.py` to perform inference.
-
-```
-usage: infer.py [-h] --model_gz_path MODEL_GZ_PATH --data_path DATA_PATH
- --prediction_output_path PREDICTION_OUTPUT_PATH
- [--data_meta_path DATA_META_PATH] --model_type MODEL_TYPE
-
-PaddlePaddle CTR example
-
-optional arguments:
- -h, --help show this help message and exit
- --model_gz_path MODEL_GZ_PATH
- path of model parameters gz file
- --data_path DATA_PATH
- path of the dataset to infer
- --prediction_output_path PREDICTION_OUTPUT_PATH
- path to output the prediction
- --data_meta_path DATA_META_PATH
- path of trainset's meta info, default is ./data.meta
- --model_type MODEL_TYPE
- model type, classification: 0, regression 1 (default
- classification)
-```
-
-- `model_gz_path`: path of the `gz`-compressed model parameters.
-- `data_path` : path of the data to predict.
-- `prediction_output_path`: path for the predicted values.
-- `data_meta_file` : please refer to [Data and Model formation](#data-and-model-formation).
-- `model_type` : classification or regression.
-
-The sample data can be predicted with the following command
-
-```
-python infer.py --model_gz_path <model_path> --data_path output/infer.txt --prediction_output_path predictions.txt --data_meta_path data.meta.txt
-```
-
-The final prediction is written to `predictions.txt`.
-
-## References
-1. https://en.wikipedia.org/wiki/Click-through_rate
-2. https://www.kaggle.com/c/avazu-ctr-prediction/data
-3. Cheng H T, Koc L, Harmsen J, et al. [Wide & deep learning for recommender systems](https://arxiv.org/pdf/1606.07792.pdf)[C]//Proceedings of the 1st Workshop on Deep Learning for Recommender Systems. ACM, 2016: 7-10.
diff --git a/legacy/ctr/avazu_data_processer.py b/legacy/ctr/avazu_data_processer.py
deleted file mode 100644
index dd3c1441f8f8b26473d15889198abb3593edfa51..0000000000000000000000000000000000000000
--- a/legacy/ctr/avazu_data_processer.py
+++ /dev/null
@@ -1,414 +0,0 @@
-import sys
-import csv
-import cPickle
-import argparse
-import os
-import numpy as np
-
-from utils import logger, TaskMode
-
-parser = argparse.ArgumentParser(description="PaddlePaddle CTR example")
-parser.add_argument(
- '--data_path', type=str, required=True, help="path of the Avazu dataset")
-parser.add_argument(
- '--output_dir', type=str, required=True, help="directory to output")
-parser.add_argument(
- '--num_lines_to_detect',
- type=int,
- default=500000,
- help="number of records to detect dataset's meta info")
-parser.add_argument(
- '--test_set_size',
- type=int,
- default=10000,
- help="size of the validation dataset(default: 10000)")
-parser.add_argument(
- '--train_size',
- type=int,
- default=100000,
- help="size of the trainset (default: 100000)")
-args = parser.parse_args()
-'''
-The fields of the dataset are:
-
- 0. id: ad identifier
- 1. click: 0/1 for non-click/click
- 2. hour: format is YYMMDDHH, so 14091123 means 23:00 on Sept. 11, 2014 UTC.
- 3. C1 -- anonymized categorical variable
- 4. banner_pos
- 5. site_id
- 6. site_domain
- 7. site_category
- 8. app_id
- 9. app_domain
- 10. app_category
- 11. device_id
- 12. device_ip
- 13. device_model
- 14. device_type
- 15. device_conn_type
- 16. C14-C21 -- anonymized categorical variables
-
-We will treat the following fields as categorical features:
-
- - C1
- - banner_pos
- - site_category
- - app_category
- - device_type
- - device_conn_type
-
-and some other features as id features:
-
- - id
- - site_id
- - app_id
- - device_id
-
-The `hour` field will be treated as a continuous feature and will be transformed
-to one-hot representation which has 24 bits.
-
-This script will output 3 files:
-
-1. train.txt
-2. test.txt
-3. infer.txt
-
-all the files are for demo.
-'''
-
-feature_dims = {}
-
-categorial_features = (
- 'C1 banner_pos site_category app_category ' + 'device_type device_conn_type'
-).split()
-
-id_features = 'id site_id app_id device_id _device_id_cross_site_id'.split()
-
-
-def get_all_field_names(mode=0):
- '''
- @mode: int
- 0 for train, 1 for test
- @return: list of str
- '''
- return categorial_features + ['hour'] + id_features + ['click'] \
- if mode == 0 else []
-
-
-class CategoryFeatureGenerator(object):
- '''
- Generator for categorical features.
-
- Register all records by calling `register` first, then call `gen` to generate
- one-hot representation for a record.
- '''
-
- def __init__(self):
- self.dic = {'unk': 0}
- self.counter = 1
-
- def register(self, key):
- '''
- Register record.
- '''
- if key not in self.dic:
- self.dic[key] = self.counter
- self.counter += 1
-
- def size(self):
- return len(self.dic)
-
- def gen(self, key):
- '''
- Generate one-hot representation for a record.
- '''
- if key not in self.dic:
- res = self.dic['unk']
- else:
- res = self.dic[key]
- return [res]
-
- def __repr__(self):
- return '<CategoryFeatureGenerator %d>' % len(self.dic)
-
-
-class IDfeatureGenerator(object):
- def __init__(self, max_dim, cross_fea0=None, cross_fea1=None):
- '''
- @max_dim: int
- Size of the id elements' space
- '''
- self.max_dim = max_dim
- self.cross_fea0 = cross_fea0
- self.cross_fea1 = cross_fea1
-
- def gen(self, key):
- '''
- Generate one-hot representation for records
- '''
- return [hash(key) % self.max_dim]
-
- def gen_cross_fea(self, fea1, fea2):
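- # NOTE: plain string concatenation can collide ("ab" + "c" == "a" + "bc");
- # a separator between the two values would make cross keys unambiguous.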
- key = str(fea1) + str(fea2)
- return self.gen(key)
-
- def size(self):
- return self.max_dim
-
-
-class ContinuousFeatureGenerator(object):
- def __init__(self, n_intervals):
- self.min = sys.maxint
- self.max = -sys.maxint - 1
- self.n_intervals = n_intervals
-
- def register(self, val):
- self.min = min(self.min, val)
- self.max = max(self.max, val)
-
- def gen(self, val):
- self.len_part = (self.max - self.min) / self.n_intervals
- return (val - self.min) / self.len_part
-
-
-# init all feature generators
-fields = {}
-for key in categorial_features:
- fields[key] = CategoryFeatureGenerator()
-for key in id_features:
- # for cross features
- if 'cross' in key:
- feas = key[1:].split('_cross_')
- fields[key] = IDfeatureGenerator(10000000, *feas)
- # for normal ID features
- else:
- fields[key] = IDfeatureGenerator(10000)
-
-# used as feed_dict in PaddlePaddle
-field_index = dict((key, id)
- for id, key in enumerate(['dnn_input', 'lr_input', 'click']))
-
-
-def detect_dataset(path, topn, id_fea_space=10000):
- '''
- Parse the first `topn` records to collect meta information of this dataset.
-
- NOTE the records should be randomly shuffled first.
- '''
- # create categorical statis objects.
- logger.warning('detecting dataset')
-
- with open(path, 'rb') as csvfile:
- reader = csv.DictReader(csvfile)
- for row_id, row in enumerate(reader):
- if row_id > topn:
- break
-
- for key in categorial_features:
- fields[key].register(row[key])
-
- for key, item in fields.items():
- feature_dims[key] = item.size()
-
- feature_dims['hour'] = 24
- feature_dims['click'] = 1
-
- feature_dims['dnn_input'] = sum(
- feature_dims[key] for key in categorial_features + ['hour']) + 1
- feature_dims['lr_input'] = sum(feature_dims[key]
- for key in id_features) + 1
- return feature_dims
-
-
-def load_data_meta(meta_path):
- '''
- Load dataset's meta information.
- '''
- feature_dims, fields = cPickle.load(open(meta_path, 'rb'))
- return feature_dims, fields
-
-
-def concat_sparse_vectors(inputs, dims):
- '''
- Concatenate several sparse vectors into one.
-
- @inputs: list
- list of sparse vectors
- @dims: list of int
- dimension of each sparse vector
- '''
- res = []
- assert len(inputs) == len(dims)
- start = 0
- for no, vec in enumerate(inputs):
- for v in vec:
- res.append(v + start)
- start += dims[no]
- return res
-
-
-class AvazuDataset(object):
- '''
- Load AVAZU dataset as train set.
- '''
-
- def __init__(self,
- train_path,
- n_records_as_test=-1,
- fields=None,
- feature_dims=None):
- self.train_path = train_path
- self.n_records_as_test = n_records_as_test
- self.fields = fields
- # default is train mode.
- self.mode = TaskMode.create_train()
-
- self.categorial_dims = [
- feature_dims[key] for key in categorial_features + ['hour']
- ]
- self.id_dims = [feature_dims[key] for key in id_features]
-
- def train(self):
- '''
- Load trainset.
- '''
- logger.info("load trainset from %s" % self.train_path)
- self.mode = TaskMode.create_train()
- with open(self.train_path) as f:
- reader = csv.DictReader(f)
-
- for row_id, row in enumerate(reader):
- # skip top n lines
- if self.n_records_as_test > 0 and row_id < self.n_records_as_test:
- continue
-
- rcd = self._parse_record(row)
- if rcd:
- yield rcd
-
- def test(self):
- '''
- Load testset.
- '''
- logger.info("load testset from %s" % self.train_path)
- self.mode = TaskMode.create_test()
- with open(self.train_path) as f:
- reader = csv.DictReader(f)
-
- for row_id, row in enumerate(reader):
- # skip top n lines
- if self.n_records_as_test > 0 and row_id > self.n_records_as_test:
- break
-
- rcd = self._parse_record(row)
- if rcd:
- yield rcd
-
- def infer(self):
- '''
- Load inferset.
- '''
- logger.info("load inferset from %s" % self.train_path)
- self.mode = TaskMode.create_infer()
- with open(self.train_path) as f:
- reader = csv.DictReader(f)
-
- for row_id, row in enumerate(reader):
- rcd = self._parse_record(row)
- if rcd:
- yield rcd
-
- def _parse_record(self, row):
- '''
- Parse a CSV row and get a record.
- '''
- record = []
- for key in categorial_features:
- record.append(self.fields[key].gen(row[key]))
- record.append([int(row['hour'][-2:])])
- dense_input = concat_sparse_vectors(record, self.categorial_dims)
-
- record = []
- for key in id_features:
- if 'cross' not in key:
- record.append(self.fields[key].gen(row[key]))
- else:
- fea0 = self.fields[key].cross_fea0
- fea1 = self.fields[key].cross_fea1
- record.append(self.fields[key].gen_cross_fea(row[fea0], row[
- fea1]))
-
- sparse_input = concat_sparse_vectors(record, self.id_dims)
-
- record = [dense_input, sparse_input]
-
- if not self.mode.is_infer():
- record.append(list((int(row['click']), )))
- return record
-
-
-def ids2dense(vec, dim):
- return vec
-
-
-def ids2sparse(vec):
- return ["%d:1" % x for x in vec]
-
-
-detect_dataset(args.data_path, args.num_lines_to_detect)
-dataset = AvazuDataset(
- args.data_path,
- args.test_set_size,
- fields=fields,
- feature_dims=feature_dims)
-
-output_trainset_path = os.path.join(args.output_dir, 'train.txt')
-output_testset_path = os.path.join(args.output_dir, 'test.txt')
-output_infer_path = os.path.join(args.output_dir, 'infer.txt')
-output_meta_path = os.path.join(args.output_dir, 'data.meta.txt')
-
-with open(output_trainset_path, 'w') as f:
- for id, record in enumerate(dataset.train()):
- if id and id % 10000 == 0:
- logger.info("load %d records" % id)
- if id > args.train_size:
- break
- dnn_input, lr_input, click = record
- dnn_input = ids2dense(dnn_input, feature_dims['dnn_input'])
- lr_input = ids2sparse(lr_input)
- line = "%s\t%s\t%d\n" % (' '.join(map(str, dnn_input)),
- ' '.join(map(str, lr_input)), click[0])
- f.write(line)
- logger.info('write to %s' % output_trainset_path)
-
-with open(output_testset_path, 'w') as f:
- for id, record in enumerate(dataset.test()):
- dnn_input, lr_input, click = record
- dnn_input = ids2dense(dnn_input, feature_dims['dnn_input'])
- lr_input = ids2sparse(lr_input)
- line = "%s\t%s\t%d\n" % (' '.join(map(str, dnn_input)),
- ' '.join(map(str, lr_input)), click[0])
- f.write(line)
- logger.info('write to %s' % output_testset_path)
-
-with open(output_infer_path, 'w') as f:
- for id, record in enumerate(dataset.infer()):
- dnn_input, lr_input = record
- dnn_input = ids2dense(dnn_input, feature_dims['dnn_input'])
- lr_input = ids2sparse(lr_input)
- line = "%s\t%s\n" % (
- ' '.join(map(str, dnn_input)),
- ' '.join(map(str, lr_input)), )
- f.write(line)
- if id > args.test_set_size:
- break
- logger.info('write to %s' % output_infer_path)
-
-with open(output_meta_path, 'w') as f:
- lines = [
- "dnn_input_dim: %d" % feature_dims['dnn_input'],
- "lr_input_dim: %d" % feature_dims['lr_input']
- ]
- f.write('\n'.join(lines))
- logger.info('write data meta into %s' % output_meta_path)
diff --git a/legacy/ctr/dataset.md b/legacy/ctr/dataset.md
deleted file mode 100644
index 16c0f9784bf3409ac5bbe704f932a9b28680fbf8..0000000000000000000000000000000000000000
--- a/legacy/ctr/dataset.md
+++ /dev/null
@@ -1,296 +0,0 @@
-# Data and preprocessing
-## Dataset
-
-This tutorial demonstrates how to preprocess the dataset of the Kaggle CTR task \[[3](#references)\] into the format required by this model; see [README.md](./README.md) for the detailed data format.
-
-The strength of the Wide & Deep Model \[[2](#references)\] is that it fuses dense features with large-scale sparse features,
-so feature processing also targets these two kinds of features:
-all dense values in the Deep part are converted to ID-like features
-and then turned into dense vector inputs through an embedding; the Wide part mainly raises the dimensionality through crosses of IDs.
-
-The dataset is stored in `csv` format with the following fields:
-
-- `id` : ad identifier
-- `click` : 0/1 for non-click/click
-- `hour` : format is YYMMDDHH, so 14091123 means 23:00 on Sept. 11, 2014 UTC.
-- `C1` : anonymized categorical variable
-- `banner_pos`
-- `site_id`
-- `site_domain`
-- `site_category`
-- `app_id`
-- `app_domain`
-- `app_category`
-- `device_id`
-- `device_ip`
-- `device_model`
-- `device_type`
-- `device_conn_type`
-- `C14-C21` : anonymized categorical variables
-
-
-## Feature extraction
-
-Below we briefly demonstrate several ways of extracting features.
-
-The features in the raw data fall into the following classes:
-
-1. ID features (sparse and numerous)
-- `id`
-- `site_id`
-- `app_id`
-- `device_id`
-
-2. Categorical features (sparse, but limited in number)
-
-- `C1`
-- `site_category`
-- `device_type`
-- `C14-C21`
-
-3. Numerical features converted to categorical features
-
-- hour (can be used as a number, or converted to a category at hour granularity)
-
-### Categorical features
-
-Categorical features can be extracted in two ways:
-
-1. A one-hot representation used directly as the feature
-2. Like word vectors, an Embedding that maps each category to a vector
-
-
-### ID features
-
-ID features are sparse but very numerous, so a direct one-hot representation would be too high-dimensional.
-
-They are usually processed as follows:
-
-1. Fix a maximum representation dimension N
-2. newid = id % N
-3. Use newid as a categorical feature
-
-Although this method has some collision probability, it can handle an arbitrary number of ID features while retaining much of their effect \[[2](#references)\].
-
-### Numerical features
-
-These are usually processed in one of two ways:
-
-- Normalize them and feed them to the model directly as features
-- Split the value range into intervals and turn them into categorical features, a sparse representation that blurs minor differences (see the sketch below)
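-
-As a sketch of the second option (the helper below is illustrative, not part of the tutorial code), equal-width bucketing can look like this:
-
-```python
-def bucketize(val, lo, hi, n_intervals):
-    # Map a continuous value to one of n_intervals equal-width buckets,
-    # clamping values that fall outside the observed [lo, hi] range.
-    val = min(max(val, lo), hi)
-    width = (hi - lo) / float(n_intervals)
-    return min(int((val - lo) / width), n_intervals - 1)
-
-bucketize(0.37, 0.0, 1.0, 10)  # -> 3
-```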
-
-## Feature processing
-
-
-### Categorical features
-
-A categorical feature takes finitely many values; in the model, we generally use an Embedding to map each value to a vector of continuous values.
-
-Such a feature is usually fed to the model in a one-hot representation; the processing is as follows:
-
-```python
-class CategoryFeatureGenerator(object):
- '''
- Generator for categorical features.
-
- Register all records by calling `register` first, then call `gen` to generate
- a one-hot representation for a record.
- '''
-
- def __init__(self):
- self.dic = {'unk': 0}
- self.counter = 1
-
- def register(self, key):
- '''
- Register record.
- '''
- if key not in self.dic:
- self.dic[key] = self.counter
- self.counter += 1
-
- def size(self):
- return len(self.dic)
-
- def gen(self, key):
- '''
- Generate one-hot representation for a record.
- '''
- if key not in self.dic:
- res = self.dic['unk']
- else:
- res = self.dic[key]
- return [res]
-
- def __repr__(self):
- return '<CategoryFeatureGenerator %d>' % len(self.dic)
-```
-
-`CategoryFeatureGenerator` needs to scan the dataset first to collect the set of values of each category before it can generate features.
-
-Our experimental dataset \[[3](https://www.kaggle.com/c/avazu-ctr-prediction/data)\] has already been shuffled, so scanning a fixed number of leading records approximates the full set of category values (equivalent to random sampling);
-low-frequency values that the scan misses can be represented by a special UNK value.
-
-```python
-fields = {}
-for key in categorial_features:
- fields[key] = CategoryFeatureGenerator()
-
-def detect_dataset(path, topn, id_fea_space=10000):
- '''
- Parse the first `topn` records to collect meta information of this dataset.
-
- NOTE the records should be randomly shuffled first.
- '''
- # create categorical statis objects.
-
- with open(path, 'rb') as csvfile:
- reader = csv.DictReader(csvfile)
- for row_id, row in enumerate(reader):
- if row_id > topn:
- break
-
- for key in categorial_features:
- fields[key].register(row[key])
-```
-
-Once the category information of the dataset has been registered, `CategoryFeatureGenerator` can generate the feature representation for a record:
-
-```python
-record = []
-for key in categorial_features:
- record.append(fields[key].gen(row[key]))
-```
-
-In this task, the categorical features are fed into the DNN.
-
-### ID features
-
-ID features are sparse and their value space is huge, so they are usually reduced to a bounded space with a modulo operation
-and then treated as categorical features. Here we feed the ID features into the LR model.
-
-```python
-class IDfeatureGenerator(object):
- def __init__(self, max_dim):
- '''
- @max_dim: int
- Size of the id elements' space
- '''
- self.max_dim = max_dim
-
- def gen(self, key):
- '''
- Generate one-hot representation for records
- '''
- return [hash(key) % self.max_dim]
-
- def size(self):
- return self.max_dim
-```
-
-`IDfeatureGenerator` needs no prior initialization and can generate features directly, for example
-
-```python
-record = []
-for key in id_features:
- if 'cross' not in key:
- record.append(fields[key].gen(row[key]))
-```
-
-### Cross features
-
-As the `wide` part of the Wide & Deep model, the LR model can take very wide input (a feature space of very high dimension).
-To make full use of this advantage, we demonstrate crossing features into combined features of much higher dimension and feeding them into the model for training.
-
-Here we again use the modulo operation to bound the size of the combined feature space; concretely, we add a `gen_cross_fea` method to `IDfeatureGenerator`:
-
-```python
-def gen_cross_fea(self, fea1, fea2):
- key = str(fea1) + str(fea2)
- return self.gen(key)
-```
-
-For example, if we suspect that `device_id` and `site_id` are correlated in the raw data (say, a device tends to visit particular sites),
-we can capture this kind of information by crossing the two.
-
-```python
-fea0 = fields[key].cross_fea0
-fea1 = fields[key].cross_fea1
-record.append(
- fields[key].gen_cross_fea(row[fea0], row[fea1]))
-```
-
-### Feature dimensions
-#### Deep submodel (DNN) features
-| Feature | Dimension |
-|------------------|-----------|
-| app_category | 21 |
-| site_category | 22 |
-| device_conn_type | 5 |
-| hour | 24 |
-| banner_pos | 7 |
-| **Total** | 79 |
-
-#### Wide submodel (LR) features
-| Feature | Dimension |
-|---------------------|-----------|
-| id | 10000 |
-| site_id | 10000 |
-| app_id | 10000 |
-| device_id | 10000 |
-| device_id X site_id | 1000000 |
-| **Total** | 1,040,000 |
-
-## Feeding the data into PaddlePaddle
-
-Both the Deep and Wide parts take input in the `sparse_binary_vector` format \[[1](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/api/v1/data_provider/pydataprovider2_en.rst)\]. The relevant features need to be concatenated before being fed in, and the model ultimately accepts only 3 inputs,
-namely
-
-1. `dnn input`, the input to the DNN
-2. `lr input`, the input to the LR
-3. `click`, the label
-
-The features are concatenated as follows:
-
-```python
-def concat_sparse_vectors(inputs, dims):
- '''
-    Concatenate sparse vectors into one.
-
-    @inputs: list
-        list of sparse vector
-    @dims: list of int
-        dimension of each sparse vector
- '''
- res = []
- assert len(inputs) == len(dims)
- start = 0
- for no, vec in enumerate(inputs):
- for v in vec:
- res.append(v + start)
- start += dims[no]
- return res
-```
-
-The final features are generated as follows:
-
-```python
-# dimensions of the features
-categorial_dims = [
- feature_dims[key] for key in categorial_features + ['hour']
-]
-id_dims = [feature_dims[key] for key in id_features]
-
-dense_input = concat_sparse_vectors(record, categorial_dims)
-sparse_input = concat_sparse_vectors(record, id_dims)
-
-record = [dense_input, sparse_input]
-record.append(list((int(row['click']), )))
-yield record
-```
-
-## References
-
-1. [PaddlePaddle PyDataProvider2 documentation](https://github.com/PaddlePaddle/Paddle/blob/develop/doc/api/v1/data_provider/pydataprovider2_en.rst)
-2. Mikolov T, Deoras A, Povey D, et al. [Strategies for training large scale neural network language models](https://www.researchgate.net/profile/Lukas_Burget/publication/241637478_Strategies_for_training_large_scale_neural_network_language_models/links/542c14960cf27e39fa922ed3.pdf)[C]//Automatic Speech Recognition and Understanding (ASRU), 2011 IEEE Workshop on. IEEE, 2011: 196-201.
-3. [Kaggle Avazu Click-Through Rate Prediction dataset](https://www.kaggle.com/c/avazu-ctr-prediction/data)
diff --git a/legacy/ctr/images/lr_vs_dnn.jpg b/legacy/ctr/images/lr_vs_dnn.jpg
deleted file mode 100644
index 50a0db583cd9b6e1a5bc0f83a28ab6e22d649931..0000000000000000000000000000000000000000
Binary files a/legacy/ctr/images/lr_vs_dnn.jpg and /dev/null differ
diff --git a/legacy/ctr/images/wide_deep.png b/legacy/ctr/images/wide_deep.png
deleted file mode 100644
index 616f88cb22607c1c6bcbe4312644f632ef284e8e..0000000000000000000000000000000000000000
Binary files a/legacy/ctr/images/wide_deep.png and /dev/null differ
diff --git a/legacy/ctr/infer.py b/legacy/ctr/infer.py
deleted file mode 100644
index 6541c74638df63a9304989c2ccaff0ff4c00463a..0000000000000000000000000000000000000000
--- a/legacy/ctr/infer.py
+++ /dev/null
@@ -1,79 +0,0 @@
-import gzip
-import argparse
-import itertools
-
-import paddle.v2 as paddle
-import network_conf
-from train import dnn_layer_dims
-import reader
-from utils import logger, ModelType
-
-parser = argparse.ArgumentParser(description="PaddlePaddle CTR example")
-parser.add_argument(
- '--model_gz_path',
- type=str,
- required=True,
- help="path of model parameters gz file")
-parser.add_argument(
- '--data_path', type=str, required=True, help="path of the dataset to infer")
-parser.add_argument(
- '--prediction_output_path',
- type=str,
- required=True,
- help="path to output the prediction")
-parser.add_argument(
- '--data_meta_path',
- type=str,
- default="./data.meta",
- help="path of trainset's meta info, default is ./data.meta")
-parser.add_argument(
- '--model_type',
- type=int,
- required=True,
- default=ModelType.CLASSIFICATION,
- help='model type, classification: %d, regression %d (default classification)'
- % (ModelType.CLASSIFICATION, ModelType.REGRESSION))
-
-args = parser.parse_args()
-
-paddle.init(use_gpu=False, trainer_count=1)
-
-
-class CTRInferer(object):
- def __init__(self, param_path):
- logger.info("create CTR model")
- dnn_input_dim, lr_input_dim = reader.load_data_meta(args.data_meta_path)
-        # create the model
- self.ctr_model = network_conf.CTRmodel(
- dnn_layer_dims,
- dnn_input_dim,
- lr_input_dim,
- model_type=ModelType(args.model_type),
- is_infer=True)
- # load parameter
- logger.info("load model parameters from %s" % param_path)
- self.parameters = paddle.parameters.Parameters.from_tar(
- gzip.open(param_path, 'r'))
- self.inferer = paddle.inference.Inference(
- output_layer=self.ctr_model.model,
- parameters=self.parameters, )
-
- def infer(self, data_path):
- logger.info("infer data...")
- dataset = reader.Dataset()
- infer_reader = paddle.batch(
-            dataset.infer(data_path), batch_size=1000)
- logger.warning('write predictions to %s' % args.prediction_output_path)
- output_f = open(args.prediction_output_path, 'w')
- for id, batch in enumerate(infer_reader()):
- res = self.inferer.infer(input=batch)
- predictions = [x for x in itertools.chain.from_iterable(res)]
- assert len(batch) == len(
- predictions), "predict error, %d inputs, but %d predictions" % (
- len(batch), len(predictions))
- output_f.write('\n'.join(map(str, predictions)) + '\n')
-
-
-if __name__ == '__main__':
- ctr_inferer = CTRInferer(args.model_gz_path)
- ctr_inferer.infer(args.data_path)
diff --git a/legacy/ctr/network_conf.py b/legacy/ctr/network_conf.py
deleted file mode 100644
index bcff49ee05e1d8cc80e2fdd28a771bf9bf9502e3..0000000000000000000000000000000000000000
--- a/legacy/ctr/network_conf.py
+++ /dev/null
@@ -1,104 +0,0 @@
-import paddle.v2 as paddle
-from paddle.v2 import layer
-from paddle.v2 import data_type as dtype
-from utils import logger, ModelType
-
-
-class CTRmodel(object):
- '''
-    A CTR model which implements the wide & deep learning model.
- '''
-
- def __init__(self,
- dnn_layer_dims,
- dnn_input_dim,
- lr_input_dim,
- model_type=ModelType.create_classification(),
- is_infer=False):
- '''
- @dnn_layer_dims: list of integer
- dims of each layer in dnn
- @dnn_input_dim: int
- size of dnn's input layer
- @lr_input_dim: int
- size of lr's input layer
- @is_infer: bool
-            whether to build an inference model
- '''
- self.dnn_layer_dims = dnn_layer_dims
- self.dnn_input_dim = dnn_input_dim
- self.lr_input_dim = lr_input_dim
- self.model_type = model_type
- self.is_infer = is_infer
-
- self._declare_input_layers()
-
- self.dnn = self._build_dnn_submodel_(self.dnn_layer_dims)
- self.lr = self._build_lr_submodel_()
-
- # model's prediction
- # TODO(superjom) rename it to prediction
- if self.model_type.is_classification():
- self.model = self._build_classification_model(self.dnn, self.lr)
- if self.model_type.is_regression():
- self.model = self._build_regression_model(self.dnn, self.lr)
-
- def _declare_input_layers(self):
- self.dnn_merged_input = layer.data(
- name='dnn_input',
- type=paddle.data_type.sparse_binary_vector(self.dnn_input_dim))
-
- self.lr_merged_input = layer.data(
- name='lr_input',
- type=paddle.data_type.sparse_float_vector(self.lr_input_dim))
-
- if not self.is_infer:
- self.click = paddle.layer.data(
- name='click', type=dtype.dense_vector(1))
-
- def _build_dnn_submodel_(self, dnn_layer_dims):
- '''
- build DNN submodel.
- '''
- dnn_embedding = layer.fc(input=self.dnn_merged_input,
- size=dnn_layer_dims[0])
- _input_layer = dnn_embedding
- for i, dim in enumerate(dnn_layer_dims[1:]):
- fc = layer.fc(input=_input_layer,
- size=dim,
- act=paddle.activation.Relu(),
- name='dnn-fc-%d' % i)
- _input_layer = fc
- return _input_layer
-
- def _build_lr_submodel_(self):
- '''
- config LR submodel
- '''
- fc = layer.fc(input=self.lr_merged_input,
- size=1,
- act=paddle.activation.Relu())
- return fc
-
- def _build_classification_model(self, dnn, lr):
- merge_layer = layer.concat(input=[dnn, lr])
- self.output = layer.fc(
- input=merge_layer,
- size=1,
- # use sigmoid function to approximate ctr rate, a float value between 0 and 1.
- act=paddle.activation.Sigmoid())
-
- if not self.is_infer:
- self.train_cost = paddle.layer.multi_binary_label_cross_entropy_cost(
- input=self.output, label=self.click)
- return self.output
-
- def _build_regression_model(self, dnn, lr):
- merge_layer = layer.concat(input=[dnn, lr])
- self.output = layer.fc(input=merge_layer,
- size=1,
- act=paddle.activation.Sigmoid())
- if not self.is_infer:
- self.train_cost = paddle.layer.square_error_cost(
- input=self.output, label=self.click)
- return self.output
diff --git a/legacy/ctr/reader.py b/legacy/ctr/reader.py
deleted file mode 100644
index cafa2349ed0e51a8de65dbeeea8b345edcf0a879..0000000000000000000000000000000000000000
--- a/legacy/ctr/reader.py
+++ /dev/null
@@ -1,64 +0,0 @@
-from utils import logger, TaskMode, load_dnn_input_record, load_lr_input_record
-
-feeding_index = {'dnn_input': 0, 'lr_input': 1, 'click': 2}
-
-
-class Dataset(object):
- def train(self, path):
- '''
- Load trainset.
- '''
- logger.info("load trainset from %s" % path)
- mode = TaskMode.create_train()
- return self._parse_creator(path, mode)
-
- def test(self, path):
- '''
- Load testset.
- '''
- logger.info("load testset from %s" % path)
- mode = TaskMode.create_test()
- return self._parse_creator(path, mode)
-
- def infer(self, path):
- '''
- Load infer set.
- '''
- logger.info("load inferset from %s" % path)
- mode = TaskMode.create_infer()
- return self._parse_creator(path, mode)
-
- def _parse_creator(self, path, mode):
- '''
- Parse dataset.
- '''
-
- def _parse():
- with open(path) as f:
- for line_id, line in enumerate(f):
- fs = line.strip().split('\t')
- dnn_input = load_dnn_input_record(fs[0])
- lr_input = load_lr_input_record(fs[1])
- if not mode.is_infer():
- click = [int(fs[2])]
- yield dnn_input, lr_input, click
- else:
- yield dnn_input, lr_input
-
- return _parse
-
-
-def load_data_meta(path):
- '''
- load data meta info from path, return (dnn_input_dim, lr_input_dim)
- '''
- with open(path) as f:
- lines = f.read().split('\n')
- err_info = "wrong meta format"
- assert len(lines) == 2, err_info
- assert 'dnn_input_dim:' in lines[0] and 'lr_input_dim:' in lines[
- 1], err_info
- res = map(int, [_.split(':')[1] for _ in lines])
- logger.info('dnn input dim: %d' % res[0])
- logger.info('lr input dim: %d' % res[1])
- return res
diff --git a/legacy/ctr/train.py b/legacy/ctr/train.py
deleted file mode 100644
index de7add61d65aba363cc17bed49d32c9054600108..0000000000000000000000000000000000000000
--- a/legacy/ctr/train.py
+++ /dev/null
@@ -1,112 +0,0 @@
-import argparse
-import gzip
-
-import reader
-import paddle.v2 as paddle
-from utils import logger, ModelType
-from network_conf import CTRmodel
-
-
-def parse_args():
- parser = argparse.ArgumentParser(description="PaddlePaddle CTR example")
- parser.add_argument(
- '--train_data_path',
- type=str,
- required=True,
- help="path of training dataset")
- parser.add_argument(
- '--test_data_path', type=str, help='path of testing dataset')
- parser.add_argument(
- '--batch_size',
- type=int,
- default=10000,
- help="size of mini-batch (default:10000)")
- parser.add_argument(
- '--num_passes', type=int, default=10, help="number of passes to train")
- parser.add_argument(
- '--model_output_prefix',
- type=str,
- default='./ctr_models',
- help='prefix of path for model to store (default: ./ctr_models)')
- parser.add_argument(
- '--data_meta_file',
- type=str,
- required=True,
- help='path of data meta info file', )
- parser.add_argument(
- '--model_type',
- type=int,
- required=True,
- default=ModelType.CLASSIFICATION,
- help='model type, classification: %d, regression %d (default classification)'
- % (ModelType.CLASSIFICATION, ModelType.REGRESSION))
-
- return parser.parse_args()
-
-
-dnn_layer_dims = [128, 64, 32, 1]
-
-# ==============================================================================
-# cost and train period
-# ==============================================================================
-
-
-def train():
- args = parse_args()
- args.model_type = ModelType(args.model_type)
- paddle.init(use_gpu=False, trainer_count=1)
- dnn_input_dim, lr_input_dim = reader.load_data_meta(args.data_meta_file)
-
- # create ctr model.
- model = CTRmodel(
- dnn_layer_dims,
- dnn_input_dim,
- lr_input_dim,
- model_type=args.model_type,
- is_infer=False)
-
- params = paddle.parameters.create(model.train_cost)
- optimizer = paddle.optimizer.AdaGrad()
-
- trainer = paddle.trainer.SGD(cost=model.train_cost,
- parameters=params,
- update_equation=optimizer)
-
- dataset = reader.Dataset()
-
- def __event_handler__(event):
- if isinstance(event, paddle.event.EndIteration):
- num_samples = event.batch_id * args.batch_size
- if event.batch_id % 100 == 0:
- logger.warning("Pass %d, Samples %d, Cost %f, %s" % (
- event.pass_id, num_samples, event.cost, event.metrics))
-
-            if event.batch_id % 1000 == 0:
-                if args.test_data_path:
-                    result = trainer.test(
-                        reader=paddle.batch(
-                            dataset.test(args.test_data_path),
-                            batch_size=args.batch_size),
-                        feeding=reader.feeding_index)
-                    logger.warning("Test %d-%d, Cost %f, %s" %
-                                   (event.pass_id, event.batch_id, result.cost,
-                                    result.metrics))
-                    test_cost = result.cost
-                else:
-                    # no test set is given; fall back to the training cost so
-                    # that `test_cost` is always defined before saving
-                    test_cost = event.cost
-
-                path = "{}-pass-{}-batch-{}-test-{}.tar.gz".format(
-                    args.model_output_prefix, event.pass_id, event.batch_id,
-                    test_cost)
-                with gzip.open(path, 'w') as f:
-                    trainer.save_parameter_to_tar(f)
-
- trainer.train(
- reader=paddle.batch(
- paddle.reader.shuffle(
- dataset.train(args.train_data_path), buf_size=500),
- batch_size=args.batch_size),
- feeding=reader.feeding_index,
- event_handler=__event_handler__,
- num_passes=args.num_passes)
-
-
-if __name__ == '__main__':
- train()
diff --git a/legacy/ctr/utils.py b/legacy/ctr/utils.py
deleted file mode 100644
index 437554c3c291d5a74cc0b3844c8684c73b189a19..0000000000000000000000000000000000000000
--- a/legacy/ctr/utils.py
+++ /dev/null
@@ -1,70 +0,0 @@
-import logging
-
-logging.basicConfig()
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-class TaskMode:
- TRAIN_MODE = 0
- TEST_MODE = 1
- INFER_MODE = 2
-
- def __init__(self, mode):
- self.mode = mode
-
- def is_train(self):
- return self.mode == self.TRAIN_MODE
-
- def is_test(self):
- return self.mode == self.TEST_MODE
-
- def is_infer(self):
- return self.mode == self.INFER_MODE
-
- @staticmethod
- def create_train():
- return TaskMode(TaskMode.TRAIN_MODE)
-
- @staticmethod
- def create_test():
- return TaskMode(TaskMode.TEST_MODE)
-
- @staticmethod
- def create_infer():
- return TaskMode(TaskMode.INFER_MODE)
-
-
-class ModelType:
- CLASSIFICATION = 0
- REGRESSION = 1
-
- def __init__(self, mode):
- self.mode = mode
-
- def is_classification(self):
- return self.mode == self.CLASSIFICATION
-
- def is_regression(self):
- return self.mode == self.REGRESSION
-
- @staticmethod
- def create_classification():
- return ModelType(ModelType.CLASSIFICATION)
-
- @staticmethod
- def create_regression():
- return ModelType(ModelType.REGRESSION)
-
-
-def load_dnn_input_record(sent):
- return map(int, sent.split())
-
-
-def load_lr_input_record(sent):
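-    # Parse an LR input record of the form "idx:val idx:val ...",
-    # e.g. "3:1.0 17:0.5" -> [(3, 1.0), (17, 0.5)].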
- res = []
- for _ in [x.split(':') for x in sent.split()]:
- res.append((
- int(_[0]),
- float(_[1]), ))
- return res
diff --git a/legacy/deep_fm/README.cn.md b/legacy/deep_fm/README.cn.md
deleted file mode 100644
index 1f651acbde0078340dab06c551f583ca2b1dd86c..0000000000000000000000000000000000000000
--- a/legacy/deep_fm/README.cn.md
+++ /dev/null
@@ -1,76 +0,0 @@
-The code samples in this directory require PaddlePaddle v0.10.0. If your installed PaddlePaddle is older than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html) to update your installation.
-
----
-
-# Click-Through Rate Prediction Based on Deep Factorization Machines
-
-## Introduction
-This model implements the DeepFM model proposed in the following paper:
-
-```text
-@inproceedings{guo2017deepfm,
- title={DeepFM: A Factorization-Machine based Neural Network for CTR Prediction},
- author={Huifeng Guo, Ruiming Tang, Yunming Ye, Zhenguo Li and Xiuqiang He},
- booktitle={the Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI)},
- pages={1725--1731},
- year={2017}
-}
-```
-
-DeepFM combines factorization machines and deep neural networks to model both low-order and high-order feature interactions. For details on factorization machines, please refer to the paper [Factorization Machines](https://www.csie.ntu.edu.tw/~b97053/paper/Rendle2010FM.pdf).
-
-## Dataset
-This example uses the Criteo dataset from the [Display Advertising Challenge](https://www.kaggle.com/c/criteo-display-ad-challenge/) hosted by Kaggle.
-
-Each row holds the features of one ad display, and the first column is a label indicating whether the ad was clicked. There are 39 features in total: 13 take integer values and the other 26 are categorical. The test set has no labels.
-
-Download the dataset:
-```bash
-cd data && ./download.sh && cd ..
-```
-
-## Model
-The DeepFM model is composed of a factorization machine (FM) and a deep neural network (DNN). All input features are fed to both the FM and the DNN, and their outputs are combined to form the final output. The embedding layer for sparse features in the DNN shares its parameters with the latent vectors (factors) of the FM layer.
-
-The factorization machine layer in PaddlePaddle computes the second-order feature interactions. The following code example combines the factorization machine layer with a fully connected layer to form the complete factorization machine:
-
-```python
-def fm_layer(input, factor_size):
- first_order = paddle.layer.fc(input=input, size=1, act=paddle.activation.Linear())
- second_order = paddle.layer.factorization_machine(input=input, factor_size=factor_size)
- fm = paddle.layer.addto(input=[first_order, second_order],
- act=paddle.activation.Linear(),
- bias_attr=False)
- return fm
-```
-
-## Data preparation
-Preprocess the raw dataset: the integer features are min-max normalized to [0, 1], and the categorical features are one-hot encoded. The raw dataset is split into two parts: 90% for training and the remaining 10% for validation during training.
-
-```bash
-python preprocess.py --datadir ./data/raw --outdir ./data
-```
-
-## Training
-The command line options for training can be listed with `python train.py -h`.
-
-Train the model:
-```bash
-python train.py \
- --train_data_path data/train.txt \
- --test_data_path data/valid.txt \
- 2>&1 | tee train.log
-```
-
-After training to batch 40000 of pass 9, the test AUC is 0.807178 and the cost is 0.445196.
-
-## Inference
-The command line options for inference can be listed with `python infer.py -h`.
-
-Run prediction on the test set:
-```bash
-python infer.py \
- --model_gz_path models/model-pass-9-batch-10000.tar.gz \
- --data_path data/test.txt \
- --prediction_output_path ./predict.txt
-```
diff --git a/legacy/deep_fm/README.md b/legacy/deep_fm/README.md
deleted file mode 100644
index 6e2c6fad38d2e9e9db8d17c4967196b4f1cc5a36..0000000000000000000000000000000000000000
--- a/legacy/deep_fm/README.md
+++ /dev/null
@@ -1,95 +0,0 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is v0.11.0. If you are on a version of PaddlePaddle earlier than v0.11.0, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
-
----
-
-# Deep Factorization Machine for Click-Through Rate prediction
-
-## Introduction
-This model implements the DeepFM proposed in the following paper:
-
-```text
-@inproceedings{guo2017deepfm,
- title={DeepFM: A Factorization-Machine based Neural Network for CTR Prediction},
- author={Huifeng Guo, Ruiming Tang, Yunming Ye, Zhenguo Li and Xiuqiang He},
- booktitle={the Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI)},
- pages={1725--1731},
- year={2017}
-}
-```
-
-DeepFM combines factorization machines and deep neural networks to model
-both low-order and high-order feature interactions. For details of the
-factorization machines, please refer to the paper [Factorization
-Machines](https://www.csie.ntu.edu.tw/~b97053/paper/Rendle2010FM.pdf).
-
-## Dataset
-This example uses Criteo dataset which was used for the [Display Advertising
-Challenge](https://www.kaggle.com/c/criteo-display-ad-challenge/)
-hosted by Kaggle.
-
-Each row is the features for an ad display and the first column is a label
-indicating whether this ad has been clicked or not. There are 39 features in
-total. 13 features take integer values and the other 26 features are
-categorical features. For the test dataset, the labels are omitted.
-
-Download dataset:
-```bash
-cd data && ./download.sh && cd ..
-```
-
-## Model
-The DeepFM model is composed of the factorization machine layer (FM) and deep
-neural networks (DNN). All the input features are fed to both the FM and the DNN.
-The outputs from the FM and the DNN are combined to form the final output. The embedding
-layer for sparse features in the DNN shares its parameters with the latent
-vectors (factors) of the FM layer.
-
-The factorization machine layer in PaddlePaddle computes the second-order
-feature interactions. The following code example combines the factorization
-machine layer and a fully connected layer to form the complete factorization
-machine:
-
-```python
-def fm_layer(input, factor_size):
- first_order = paddle.layer.fc(input=input, size=1, act=paddle.activation.Linear())
- second_order = paddle.layer.factorization_machine(input=input, factor_size=factor_size)
- fm = paddle.layer.addto(input=[first_order, second_order],
- act=paddle.activation.Linear(),
- bias_attr=False)
- return fm
-```
-
-## Data preparation
-To preprocess the raw dataset, the integer features are clipped and then min-max
-normalized to [0, 1], and the categorical features are one-hot encoded. The raw
-training dataset is split so that 90% is used for training and the other
-10% is used for validation during training (a sketch of the numeric treatment follows the command below).
-
-```bash
-python preprocess.py --datadir ./data/raw --outdir ./data
-```
-
-## Train
-The command line options for training can be listed by `python train.py -h`.
-
-To train the model:
-```bash
-python train.py \
- --train_data_path data/train.txt \
- --test_data_path data/valid.txt \
- 2>&1 | tee train.log
-```
-
-After training to batch 40000 of pass 9, the test AUC is `0.807178` and the test
-cost is `0.445196`.
-
-## Infer
-The command line options for inference can be listed by `python infer.py -h`.
-
-To run inference on the test dataset:
-```bash
-python infer.py \
- --model_gz_path models/model-pass-9-batch-10000.tar.gz \
- --data_path data/test.txt \
- --prediction_output_path ./predict.txt
-```
diff --git a/legacy/deep_fm/data/download.sh b/legacy/deep_fm/data/download.sh
deleted file mode 100755
index 466a22f2c6cc885cea0a1468f3043cb59c611b59..0000000000000000000000000000000000000000
--- a/legacy/deep_fm/data/download.sh
+++ /dev/null
@@ -1,8 +0,0 @@
-#!/bin/bash
-
-wget --no-check-certificate https://s3-eu-west-1.amazonaws.com/criteo-labs/dac.tar.gz
-tar zxf dac.tar.gz
-rm -f dac.tar.gz
-
-mkdir raw
-mv ./*.txt raw/
diff --git a/legacy/deep_fm/infer.py b/legacy/deep_fm/infer.py
deleted file mode 100755
index 40a5929780090d403b8b905f8e949f1f8a020eb3..0000000000000000000000000000000000000000
--- a/legacy/deep_fm/infer.py
+++ /dev/null
@@ -1,63 +0,0 @@
-import os
-import gzip
-import argparse
-import itertools
-
-import paddle.v2 as paddle
-
-from network_conf import DeepFM
-import reader
-
-
-def parse_args():
- parser = argparse.ArgumentParser(description="PaddlePaddle DeepFM example")
- parser.add_argument(
- '--model_gz_path',
- type=str,
- required=True,
- help="The path of model parameters gz file")
- parser.add_argument(
- '--data_path',
- type=str,
- required=True,
- help="The path of the dataset to infer")
- parser.add_argument(
- '--prediction_output_path',
- type=str,
- required=True,
- help="The path to output the prediction")
- parser.add_argument(
- '--factor_size',
- type=int,
- default=10,
- help="The factor size for the factorization machine (default:10)")
-
- return parser.parse_args()
-
-
-def infer():
- args = parse_args()
-
- paddle.init(use_gpu=False, trainer_count=1)
-
- model = DeepFM(args.factor_size, infer=True)
-
- parameters = paddle.parameters.Parameters.from_tar(
- gzip.open(args.model_gz_path, 'r'))
-
- inferer = paddle.inference.Inference(
- output_layer=model, parameters=parameters)
-
- dataset = reader.Dataset()
-
- infer_reader = paddle.batch(dataset.infer(args.data_path), batch_size=1000)
-
- with open(args.prediction_output_path, 'w') as out:
- for id, batch in enumerate(infer_reader()):
- res = inferer.infer(input=batch)
- predictions = [x for x in itertools.chain.from_iterable(res)]
- out.write('\n'.join(map(str, predictions)) + '\n')
-
-
-if __name__ == '__main__':
- infer()
diff --git a/legacy/deep_fm/network_conf.py b/legacy/deep_fm/network_conf.py
deleted file mode 100644
index 545fe07b8197e3379eb5a6f34c3134b813a4684e..0000000000000000000000000000000000000000
--- a/legacy/deep_fm/network_conf.py
+++ /dev/null
@@ -1,75 +0,0 @@
-import paddle.v2 as paddle
-
-dense_feature_dim = 13
-sparse_feature_dim = 117568
-
-
-def fm_layer(input, factor_size, fm_param_attr):
- first_order = paddle.layer.fc(input=input,
- size=1,
- act=paddle.activation.Linear())
- second_order = paddle.layer.factorization_machine(
- input=input,
- factor_size=factor_size,
- act=paddle.activation.Linear(),
- param_attr=fm_param_attr)
- out = paddle.layer.addto(
- input=[first_order, second_order],
- act=paddle.activation.Linear(),
- bias_attr=False)
- return out
-
-
-def DeepFM(factor_size, infer=False):
- dense_input = paddle.layer.data(
- name="dense_input",
- type=paddle.data_type.dense_vector(dense_feature_dim))
- sparse_input = paddle.layer.data(
- name="sparse_input",
- type=paddle.data_type.sparse_binary_vector(sparse_feature_dim))
- sparse_input_ids = [
- paddle.layer.data(
- name="C" + str(i),
- type=paddle.data_type.integer_value(sparse_feature_dim))
- for i in range(1, 27)
- ]
-
- dense_fm = fm_layer(
- dense_input,
- factor_size,
- fm_param_attr=paddle.attr.Param(name="DenseFeatFactors"))
- sparse_fm = fm_layer(
- sparse_input,
- factor_size,
- fm_param_attr=paddle.attr.Param(name="SparseFeatFactors"))
-
- def embedding_layer(input):
- return paddle.layer.embedding(
- input=input,
- size=factor_size,
- param_attr=paddle.attr.Param(name="SparseFeatFactors"))
-
- sparse_embed_seq = map(embedding_layer, sparse_input_ids)
- sparse_embed = paddle.layer.concat(sparse_embed_seq)
-
- fc1 = paddle.layer.fc(input=[sparse_embed, dense_input],
- size=400,
- act=paddle.activation.Relu())
- fc2 = paddle.layer.fc(input=fc1, size=400, act=paddle.activation.Relu())
- fc3 = paddle.layer.fc(input=fc2, size=400, act=paddle.activation.Relu())
-
- predict = paddle.layer.fc(input=[dense_fm, sparse_fm, fc3],
- size=1,
- act=paddle.activation.Sigmoid())
-
- if not infer:
- label = paddle.layer.data(
- name="label", type=paddle.data_type.dense_vector(1))
- cost = paddle.layer.multi_binary_label_cross_entropy_cost(
- input=predict, label=label)
- paddle.evaluator.classification_error(
- name="classification_error", input=predict, label=label)
- paddle.evaluator.auc(name="auc", input=predict, label=label)
- return cost
- else:
- return predict
diff --git a/legacy/deep_fm/preprocess.py b/legacy/deep_fm/preprocess.py
deleted file mode 100755
index 36ffea16637c19dee9352d17ed51a67edf582167..0000000000000000000000000000000000000000
--- a/legacy/deep_fm/preprocess.py
+++ /dev/null
@@ -1,164 +0,0 @@
-"""
-Preprocess Criteo dataset. This dataset was used for the Display Advertising
-Challenge (https://www.kaggle.com/c/criteo-display-ad-challenge).
-"""
-import os
-import sys
-import click
-import random
-import collections
-
-# There are 13 integer features and 26 categorical features
-continous_features = range(1, 14)
-categorial_features = range(14, 40)
-
-# Clip integer features. The clip point for each integer feature
-# is derived from the 95% quantile of the total values in each feature
-continous_clip = [20, 600, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
-
-
-class CategoryDictGenerator:
- """
- Generate dictionary for each of the categorical features
- """
-
- def __init__(self, num_feature):
- self.dicts = []
- self.num_feature = num_feature
- for i in range(0, num_feature):
- self.dicts.append(collections.defaultdict(int))
-
- def build(self, datafile, categorial_features, cutoff=0):
- with open(datafile, 'r') as f:
- for line in f:
- features = line.rstrip('\n').split('\t')
- for i in range(0, self.num_feature):
- if features[categorial_features[i]] != '':
- self.dicts[i][features[categorial_features[i]]] += 1
- for i in range(0, self.num_feature):
- self.dicts[i] = filter(lambda x: x[1] >= cutoff,
- self.dicts[i].items())
- self.dicts[i] = sorted(self.dicts[i], key=lambda x: (-x[1], x[0]))
- vocabs, _ = list(zip(*self.dicts[i]))
- self.dicts[i] = dict(zip(vocabs, range(1, len(vocabs) + 1)))
- self.dicts[i][''] = 0
-
- def gen(self, idx, key):
- if key not in self.dicts[idx]:
- res = self.dicts[idx]['']
- else:
- res = self.dicts[idx][key]
- return res
-
- def dicts_sizes(self):
- return map(len, self.dicts)
-
-
-class ContinuousFeatureGenerator:
- """
- Normalize the integer features to [0, 1] by min-max normalization
- """
-
- def __init__(self, num_feature):
- self.num_feature = num_feature
- self.min = [sys.maxint] * num_feature
- self.max = [-sys.maxint] * num_feature
-
- def build(self, datafile, continous_features):
- with open(datafile, 'r') as f:
- for line in f:
- features = line.rstrip('\n').split('\t')
- for i in range(0, self.num_feature):
- val = features[continous_features[i]]
- if val != '':
- val = int(val)
- if val > continous_clip[i]:
- val = continous_clip[i]
- self.min[i] = min(self.min[i], val)
- self.max[i] = max(self.max[i], val)
-
- def gen(self, idx, val):
- if val == '':
- return 0.0
- val = float(val)
- return (val - self.min[idx]) / (self.max[idx] - self.min[idx])
-
-
-@click.command("preprocess")
-@click.option("--datadir", type=str, help="Path to raw criteo dataset")
-@click.option("--outdir", type=str, help="Path to save the processed data")
-def preprocess(datadir, outdir):
- """
-    All 13 integer features are normalized to continuous values, and these
-    continuous features are combined into one vector of dimension 13.
-
-    Each of the 26 categorical features is one-hot encoded, and all the one-hot
-    vectors are combined into one sparse binary vector.
- """
- dists = ContinuousFeatureGenerator(len(continous_features))
- dists.build(os.path.join(datadir, 'train.txt'), continous_features)
-
- dicts = CategoryDictGenerator(len(categorial_features))
- dicts.build(
- os.path.join(datadir, 'train.txt'), categorial_features, cutoff=200)
-
- dict_sizes = dicts.dicts_sizes()
- categorial_feature_offset = [0]
- for i in range(1, len(categorial_features)):
- offset = categorial_feature_offset[i - 1] + dict_sizes[i - 1]
- categorial_feature_offset.append(offset)
-
- random.seed(0)
-
- # 90% of the data are used for training, and 10% of the data are used
- # for validation.
- with open(os.path.join(outdir, 'train.txt'), 'w') as out_train:
- with open(os.path.join(outdir, 'valid.txt'), 'w') as out_valid:
- with open(os.path.join(datadir, 'train.txt'), 'r') as f:
- for line in f:
- features = line.rstrip('\n').split('\t')
-
- continous_vals = []
- for i in range(0, len(continous_features)):
- val = dists.gen(i, features[continous_features[i]])
- continous_vals.append("{0:.6f}".format(val).rstrip('0')
- .rstrip('.'))
- categorial_vals = []
- for i in range(0, len(categorial_features)):
- val = dicts.gen(i, features[categorial_features[
- i]]) + categorial_feature_offset[i]
- categorial_vals.append(str(val))
-
- continous_vals = ','.join(continous_vals)
- categorial_vals = ','.join(categorial_vals)
- label = features[0]
- if random.randint(0, 9999) % 10 != 0:
- out_train.write('\t'.join(
- [continous_vals, categorial_vals, label]) + '\n')
- else:
- out_valid.write('\t'.join(
- [continous_vals, categorial_vals, label]) + '\n')
-
- with open(os.path.join(outdir, 'test.txt'), 'w') as out:
- with open(os.path.join(datadir, 'test.txt'), 'r') as f:
- for line in f:
- features = line.rstrip('\n').split('\t')
-
- continous_vals = []
- for i in range(0, len(continous_features)):
- val = dists.gen(i, features[continous_features[i] - 1])
- continous_vals.append("{0:.6f}".format(val).rstrip('0')
- .rstrip('.'))
- categorial_vals = []
- for i in range(0, len(categorial_features)):
- val = dicts.gen(i, features[categorial_features[
- i] - 1]) + categorial_feature_offset[i]
- categorial_vals.append(str(val))
-
- continous_vals = ','.join(continous_vals)
- categorial_vals = ','.join(categorial_vals)
- out.write('\t'.join([continous_vals, categorial_vals]) + '\n')
-
-
-if __name__ == "__main__":
- preprocess()
diff --git a/legacy/deep_fm/reader.py b/legacy/deep_fm/reader.py
deleted file mode 100644
index 1098ce423c9071864671be91dea81972e47fbc98..0000000000000000000000000000000000000000
--- a/legacy/deep_fm/reader.py
+++ /dev/null
@@ -1,58 +0,0 @@
-class Dataset:
- def _reader_creator(self, path, is_infer):
- def reader():
- with open(path, 'r') as f:
- for line in f:
- features = line.rstrip('\n').split('\t')
- dense_feature = map(float, features[0].split(','))
- sparse_feature = map(int, features[1].split(','))
- if not is_infer:
- label = [float(features[2])]
- yield [dense_feature, sparse_feature
- ] + sparse_feature + [label]
- else:
- yield [dense_feature, sparse_feature] + sparse_feature
-
- return reader
-
- def train(self, path):
- return self._reader_creator(path, False)
-
- def test(self, path):
- return self._reader_creator(path, False)
-
- def infer(self, path):
- return self._reader_creator(path, True)
-
-
-feeding = {
- 'dense_input': 0,
- 'sparse_input': 1,
- 'C1': 2,
- 'C2': 3,
- 'C3': 4,
- 'C4': 5,
- 'C5': 6,
- 'C6': 7,
- 'C7': 8,
- 'C8': 9,
- 'C9': 10,
- 'C10': 11,
- 'C11': 12,
- 'C12': 13,
- 'C13': 14,
- 'C14': 15,
- 'C15': 16,
- 'C16': 17,
- 'C17': 18,
- 'C18': 19,
- 'C19': 20,
- 'C20': 21,
- 'C21': 22,
- 'C22': 23,
- 'C23': 24,
- 'C24': 25,
- 'C25': 26,
- 'C26': 27,
- 'label': 28
-}
diff --git a/legacy/deep_fm/train.py b/legacy/deep_fm/train.py
deleted file mode 100755
index 92d48696d8845ac13b714b66f7810acdd35fe164..0000000000000000000000000000000000000000
--- a/legacy/deep_fm/train.py
+++ /dev/null
@@ -1,108 +0,0 @@
-import os
-import gzip
-import logging
-import argparse
-
-import paddle.v2 as paddle
-
-from network_conf import DeepFM
-import reader
-
-logging.basicConfig()
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def parse_args():
- parser = argparse.ArgumentParser(description="PaddlePaddle DeepFM example")
- parser.add_argument(
- '--train_data_path',
- type=str,
- required=True,
- help="The path of training dataset")
- parser.add_argument(
- '--test_data_path',
- type=str,
- required=True,
- help="The path of testing dataset")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=1000,
- help="The size of mini-batch (default:1000)")
- parser.add_argument(
- '--num_passes',
- type=int,
- default=10,
- help="The number of passes to train (default: 10)")
- parser.add_argument(
- '--factor_size',
- type=int,
- default=10,
- help="The factor size for the factorization machine (default:10)")
- parser.add_argument(
- '--model_output_dir',
- type=str,
- default='models',
- help='The path for model to store (default: models)')
-
- return parser.parse_args()
-
-
-def train():
- args = parse_args()
-
- if not os.path.isdir(args.model_output_dir):
- os.mkdir(args.model_output_dir)
-
- paddle.init(use_gpu=False, trainer_count=1)
-
- optimizer = paddle.optimizer.Adam(learning_rate=1e-4)
-
- model = DeepFM(args.factor_size)
-
- params = paddle.parameters.create(model)
-
- trainer = paddle.trainer.SGD(cost=model,
- parameters=params,
- update_equation=optimizer)
-
- dataset = reader.Dataset()
-
- def __event_handler__(event):
- if isinstance(event, paddle.event.EndIteration):
- num_samples = event.batch_id * args.batch_size
- if event.batch_id % 100 == 0:
- logger.warning("Pass %d, Batch %d, Samples %d, Cost %f, %s" %
- (event.pass_id, event.batch_id, num_samples,
- event.cost, event.metrics))
-
- if event.batch_id % 10000 == 0:
- if args.test_data_path:
- result = trainer.test(
- reader=paddle.batch(
- dataset.test(args.test_data_path),
- batch_size=args.batch_size),
- feeding=reader.feeding)
- logger.warning("Test %d-%d, Cost %f, %s" %
- (event.pass_id, event.batch_id, result.cost,
- result.metrics))
-
- path = "{}/model-pass-{}-batch-{}.tar.gz".format(
- args.model_output_dir, event.pass_id, event.batch_id)
- with gzip.open(path, 'w') as f:
- trainer.save_parameter_to_tar(f)
-
- trainer.train(
- reader=paddle.batch(
- paddle.reader.shuffle(
- dataset.train(args.train_data_path),
- buf_size=args.batch_size * 10000),
- batch_size=args.batch_size),
- feeding=reader.feeding,
- event_handler=__event_handler__,
- num_passes=args.num_passes)
-
-
-if __name__ == '__main__':
- train()
diff --git a/legacy/dssm/README.cn.md b/legacy/dssm/README.cn.md
deleted file mode 100644
index 140446ad2e071e8bc185d7788dcf33651a370d69..0000000000000000000000000000000000000000
--- a/legacy/dssm/README.cn.md
+++ /dev/null
@@ -1,294 +0,0 @@
-The code samples in this directory require PaddlePaddle v0.10.0. If your installed PaddlePaddle is older than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update your installation.
-
----
-
-# Deep Structured Semantic Models (DSSM)
-DSSM uses a DNN to learn low-dimensional representation vectors for text in a continuous semantic space and models the semantic similarity between two sentences. This example demonstrates how to use PaddlePaddle to implement a generic DSSM model for modeling the semantic similarity between two strings. The implementation supports a generic data format, so users can apply the model to real scenarios simply by substituting their own data.
-
-## Background
-DSSM \[[1](#references)\] is a classic semantic model proposed by Microsoft Research in 2013 for learning the semantic distance between two texts. More broadly, the model also generalizes to scenarios such as:
-
-1. CTR prediction, measuring the degree of relevance between a user search query and a set of candidate web pages (documents).
-2. Text relevance, measuring the degree of semantic correlation between two strings.
-3. Recommendation, measuring the degree of association between a user and a recommended item.
-
-DSSM has evolved into a framework that can naturally model the distance between two records. For text relevance, cosine similarity can characterize the semantic distance; for ranking search engine results, a rank loss can be attached on top of DSSM to train a ranking model.
-
-## Model overview
-In the original paper \[[1](#references)\], the DSSM model measures the latent semantic relation between a user search query and a set of documents. The model structure is as follows:
-
-
-
-Figure 1. The original DSSM structure
-
-
-The underlying idea is to **use a DNN to map high-dimensional feature vectors into continuous vectors in a low-dimensional space (the red boxes in the figure)** and **measure the semantic relevance between the search query and each candidate document with cosine similarity at the top layer**.
-
-For the loss at the very top, the original model uses negative sampling similar to Word2Vec: for each query, one positive example $D^+$ and 4 negative examples $D^-$ are drawn, the conditional probability of the positive document is computed over them, and the negative log-likelihood serves as the loss; this is the $P(D_1|Q)$ structure in Figure 1. See the original paper for details, and the formula sketched below.
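-
-As a reference sketch of that loss (following the original paper; $\gamma$ is a smoothing factor and $D'$ ranges over the sampled candidate documents):
-
-$$P(D^+|Q)=\frac{\exp\big(\gamma \cos(Q,D^+)\big)}{\sum_{D'}\exp\big(\gamma \cos(Q,D')\big)},\qquad L=-\log\prod_{(Q,D^+)}P(D^+|Q)$$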
-
-With subsequent optimization, the structure of the DSSM model was simplified \[[3](#references)\] and evolved into:
-
-
-
-Figure 2. The generic DSSM structure
-
-
-The blank boxes in the figure can be replaced by any model, such as fully connected (FC) layers, a convolutional network (CNN), or an RNN. This structure is designed to measure the semantic distance between two elements (such as strings). In practical tasks, the DSSM model serves as a basic building block, combined with different loss functions to implement specific functions, for example:
-
-- In learning to rank, adding a pairwise rank loss to the structure in Figure 2 turns it into a ranking model
-- In CTR prediction, treating click/no-click as 0/1 binary classification and adding a cross-entropy loss turns it into a classification model
-- When a pair of strings needs a score, cosine similarity can compute the similarity, turning it into a regression model
-
-This example provides a fairly generic solution. The supported task types are:
-
-- Classification
-- Regression within the range [-1, 1]
-- Pairwise rank
-
-For the structure that generates the low-dimensional semantic vectors, three options are supported:
-
-- FC, multi-layer fully connected layers
-- CNN, convolutional neural network
-- RNN, recurrent neural network
-
-## Model implementation
-The DSSM model can be divided into three parts: the left DNN, the right DNN, and the loss function on top. In complex tasks the structures of the two DNNs may differ. In the original paper, the left and right networks learn the semantic vectors of the query and the document respectively; since the two kinds of data differ, it is advisable to customize each DNN structure accordingly.
-
-**For simplicity and generality, this example uses the same structure for the left and right DNNs, so only the three options FC, CNN, and RNN are provided.**
-
-Three types of loss functions are supported: classification, regression, and ranking. For the regression and ranking losses, the degree of match between the two sides is computed with cosine similarity; for the classification task, the predicted class distribution is computed with softmax.
-
-Many of these topics are covered in detail in other tutorials, for example:
-
-- How to use a CNN or FC layers for text feature extraction: see [text classification](https://github.com/PaddlePaddle/models/blob/develop/text_classification/README.md#模型详解)
-- RNN/GRU: see [Machine Translation](https://github.com/PaddlePaddle/book/blob/develop/08.machine_translation/README.md#gated-recurrent-unit-gru)
-- Pairwise rank, i.e. learning to rank: see [learn to rank](https://github.com/PaddlePaddle/models/blob/develop/ltr/README.md)
-
-These principles are not repeated here; the rest of this document focuses on implementing the structures with PaddlePaddle.
-
-As shown in Figure 3, the regression and classification models have similar structures:
-
-
-
-Figure 3. DSSM for REGRESSION or CLASSIFICATION
-
-
-The most important components are the word embeddings, the two low-dimensional vector learners `(1)` and `(2)` in the figure (each can be implemented with any of RNN/CNN/FC), and the corresponding loss function at the top.
-
-The pairwise rank structure is more complex; the structure in Figure 4 appears twice, with the corresponding loss function added. The overall idea of the model is:
-- Given the same source, score the left and right targets separately as `(a)` and `(b)`; the learning objective is the ordering between (a) and (b)
-- `(a)` and `(b)` are similar to the structure in Figure 3 and are used to score a (source, target) pair
-- `(1)` and `(2)` actually share the same structure and both represent the same source; the figure unfolds them into two for presentation
-
-
-
-Figure 4. DSSM for Pairwise Rank
-
-
-The concrete implementations of each part follow; all the related code is contained in `./network_conf.py`.
-
-
-### Create the word embedding table for the text
-
-```python
-def create_embedding(self, input, prefix=''):
- """
- Create word embedding. The `prefix` is added in front of the name of
-    the embedding's learnable parameter.
-    """
-    logger.info("Create embedding table [%s] whose dimension is %d" %
- (prefix, self.dnn_dims[0]))
- emb = paddle.layer.embedding(
- input=input,
- size=self.dnn_dims[0],
- param_attr=ParamAttr(name='%s_emb.w' % prefix))
- return emb
-```
-
-Since the input to the embedding table is the list of word IDs for a sentence, the embedding table outputs a sequence of word vectors.
-
-### CNN implementation
-
-```python
-def create_cnn(self, emb, prefix=''):
-
- """
- A multi-layer CNN.
- :param emb: The word embedding.
- :type emb: paddle.layer
- :param prefix: The prefix will be added to of layers' names.
- :type prefix: str
- """
-
- def create_conv(context_len, hidden_size, prefix):
- key = "%s_%d_%d" % (prefix, context_len, hidden_size)
- conv = paddle.networks.sequence_conv_pool(
- input=emb,
- context_len=context_len,
- hidden_size=hidden_size,
- # set parameter attr for parameter sharing
- context_proj_param_attr=ParamAttr(name=key + "contex_proj.w"),
- fc_param_attr=ParamAttr(name=key + "_fc.w"),
- fc_bias_attr=ParamAttr(name=key + "_fc.b"),
- pool_bias_attr=ParamAttr(name=key + "_pool.b"))
- return conv
-
- conv_3 = create_conv(3, self.dnn_dims[1], "cnn")
- conv_4 = create_conv(4, self.dnn_dims[1], "cnn")
- return paddle.layer.concat(input=[conv_3, conv_4])
-```
-
-The CNN takes the sequence of word vectors, captures the key information of the original sentence through convolution and pooling, and finally outputs a semantic vector (which can be regarded as a sentence vector).
-
-In this implementation, the sentence vectors learned by CNNs with window sizes 3 and 4 are concatenated to form the final sentence vector.
-
-### RNN implementation
-
-An RNN is well suited to learning from variable-length sequences; using an RNN to learn sentence representations is almost standard practice in natural language processing tasks.
-
-```python
-def create_rnn(self, emb, prefix=''):
- """
- A GRU sentence vector learner.
- """
- gru = paddle.networks.simple_gru(
- input=emb,
- size=self.dnn_dims[1],
- mixed_param_attr=ParamAttr(name='%s_gru_mixed.w' % prefix),
- mixed_bias_param_attr=ParamAttr(name="%s_gru_mixed.b" % prefix),
- gru_param_attr=ParamAttr(name='%s_gru.w' % prefix),
- gru_bias_attr=ParamAttr(name="%s_gru.b" % prefix))
- sent_vec = paddle.layer.last_seq(gru)
- return sent_vec
-```
-
-### Multi-layer fully connected network (FC)
-
-```python
-def create_fc(self, emb, prefix=''):
-
- """
- A multi-layer fully connected neural networks.
- :param emb: The output of the embedding layer
- :type emb: paddle.layer
- :param prefix: A prefix will be added to the layers' names.
- :type prefix: str
- """
-
- _input_layer = paddle.layer.pooling(
- input=emb, pooling_type=paddle.pooling.Max())
- fc = paddle.layer.fc(
- input=_input_layer,
- size=self.dnn_dims[1],
- param_attr=ParamAttr(name='%s_fc.w' % prefix),
- bias_attr=ParamAttr(name="%s_fc.b" % prefix))
- return fc
-```
-
-When building the fully connected network, `paddle.layer.pooling` is first used to apply max pooling to the word vector sequence, converting the variable-length sequence into a fixed-dimensional vector that represents the semantics of the whole sentence. Max pooling reduces the influence of sentence length on the sentence vector.
-
-### Multi-layer DNN
-After the CNN/RNN/FC extracts the semantic vector, further FC layers can be stacked on top to form a deep DNN structure.
-
-```python
-def create_dnn(self, sent_vec, prefix):
- if len(self.dnn_dims) > 1:
- _input_layer = sent_vec
- for id, dim in enumerate(self.dnn_dims[1:]):
- name = "%s_fc_%d_%d" % (prefix, id, dim)
- fc = paddle.layer.fc(
- input=_input_layer,
- size=dim,
- act=paddle.activation.Tanh(),
- param_attr=ParamAttr(name='%s.w' % name),
- bias_attr=ParamAttr(name='%s.b' % name),
- )
- _input_layer = fc
- return _input_layer
-```
-
-### Classification and regression
-The classification and regression structures are fairly similar; for the concrete implementation, see the
-`_build_classification_or_regression_model` function in [network_conf.py](https://github.com/PaddlePaddle/models/blob/develop/dssm/network_conf.py).
-
-### Pairwise rank
-Pairwise rank reuses the DNN structure above: the same source is scored for similarity against two targets, and if the left target scores higher the prediction is 1, otherwise 0. For the implementation, see the `_build_rank_model` function in [network_conf.py](https://github.com/PaddlePaddle/models/blob/develop/dssm/network_conf.py).
-
-## Data format
-Simple example data is provided in `./data`.
-
-### Regression data format
-```
-# 3 fields each line:
-# - source word list
-# - target word list
-# - target
-<source words> \t <target words> \t <label>
-```
-
-For example:
-
-```
-苹果 六 袋 苹果 6s 0.1
-新手 汽车 驾驶 驾校 培训 0.9
-```
-### Classification data format
-```
-# 3 fields each line:
-# - source word list
-# - target word list
-# - target
-<source words> \t <target words> \t <label>
-```
-
-For example:
-
-```
-苹果 六 袋 苹果 6s 0
-新手 汽车 驾驶 驾校 培训 1
-```
-
-### Ranking data format
-```
-# 4 fields each line:
-# - source word list
-# - target1 word list
-# - target2 word list
-# - label
-<source words> \t <target1 words> \t <target2 words> \t <label>
-```
-
-For example:
-
-```
-苹果 六 袋 苹果 6s 新手 汽车 驾驶 1
-新手 汽车 驾驶 驾校 培训 苹果 6s 1
-```
-
-## Training
-
-You can run `python train.py -y 0 --model_arch 0 --class_num 2` directly with the example data in the `./data/classification` directory to verify that training a classification FC model works out of the box.
-
-Other model structures can also be customized from the command line; run `python train.py --help` for the detailed command line options.
-
-The most important parameters are:
-
-- `train_data_path` Training data path
-- `test_data_path` Test data path, optional
-- `source_dic_path` Source dictionary path
-- `target_dic_path` Target dictionary path
-- `model_type` The type of loss function: classification 0, rank 1, regression 2
-- `model_arch` Model architecture: FC 0, CNN 1, RNN 2
-- `dnn_dims` The dimensions of each model layer, default `256,128,64,32`, i.e. a 4-layer model with the dimensions above
-
-## Predicting with a trained model
-Run `python infer.py --help` for the detailed command line options. The important parameters are:
-
-- `data_path` Path of the data to predict on
-- `prediction_output_path` Prediction output path
-
-## References
-
-1. Huang P S, He X, Gao J, et al. Learning deep structured semantic models for web search using clickthrough data[C]//Proceedings of the 22nd ACM international conference on Conference on information & knowledge management. ACM, 2013: 2333-2338.
-2. [Microsoft Learning to Rank Datasets](https://www.microsoft.com/en-us/research/project/mslr/)
-3. [Gao J, He X, Deng L. Deep Learning for Web Search and Natural Language Processing[J]. Microsoft Research Technical Report, 2015.](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/wsdm2015.v3.pdf)
diff --git a/legacy/dssm/README.md b/legacy/dssm/README.md
deleted file mode 100644
index ad378f6cd52b0e08efbaac37848d1c167c086ac1..0000000000000000000000000000000000000000
--- a/legacy/dssm/README.md
+++ /dev/null
@@ -1,268 +0,0 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is v0.10.0. If you are on a version of PaddlePaddle earlier than v0.10.0, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
-
----
-
-# Deep Structured Semantic Models (DSSM)
-Deep Structured Semantic Models (DSSM) is a simple but powerful DNN-based model for matching web search queries with URL-based documents. This example demonstrates how to use PaddlePaddle to implement a generic DSSM model for modeling the semantic similarity between two strings.
-
-## Background Introduction
-DSSM \[[1](#references)\] is a classic semantic model proposed by Microsoft Research in 2013. It is used to learn the semantic distance between two texts. More broadly, the model also applies to scenarios such as:
-
-1. CTR prediction, which measures the degree of association between a user search query and candidate web pages.
-2. Text relevance, which measures the degree of semantic correlation between two strings.
-3. Recommendation, which measures the degree of association between a user and a recommended item.
-
-
-## Model Architecture
-
-In the original paper \[[1](#references)\], the DSSM model uses the latent semantic relation between the user search query and the documents as its metric. The model structure is as follows:
-
-
-
-Figure 1. DSSM in the original paper
-
-
-
-With subsequent optimization simplifying the structure \[[3](#references)\], the model becomes:
-
-
-
-Figure 2. DSSM generic structure
-
-
-The blank boxes in the figure can be replaced by any model, such as fully connected (FC) layers, a convolutional network (CNN), or an RNN. The structure is designed to measure the semantic distance between two elements (such as strings).
-
-In practice, the DSSM model serves as a basic building block, combined with different loss functions to achieve specific functions, such as:
-
-- In a ranking system, a pairwise rank loss turns it into a ranking model.
-- In CTR estimation, treating the click as a binary label and using a cross-entropy loss turns it into a classification model.
-- In a regression model, cosine similarity is used to calculate the similarity score.
-
-## Model Implementation
-At a high level, the DSSM model is composed of three components: the left DNN, the right DNN, and the loss function on top of them. In complex tasks, the structures of the left DNN and the right DNN can be different. In this example, we keep the two DNN structures the same, and any of FC, CNN, and RNN can be chosen for the DNN architecture.
-
-Loss functions for classification, regression, and ranking are all supported. For regression and ranking, the distance between the left and right DNN outputs is calculated by cosine similarity; in the classification task, the predicted distribution is calculated by softmax.
-
-Related tutorials:
-
-- For how CNN and FC layers perform text feature extraction, refer to [text classification](https://github.com/PaddlePaddle/models/blob/develop/text_classification/README.md#模型详解)
-- For RNN/GRU, see [Machine Translation](https://github.com/PaddlePaddle/book/blob/develop/08.machine_translation/README.md#gated-recurrent-unit-gru)
-- For pairwise rank learning, refer to [learn to rank](https://github.com/PaddlePaddle/models/blob/develop/ltr/README.md)
-
-Figure 3 shows the general architecture for both regression and classification models.
-
-
-
-Figure 3. DSSM for REGRESSION or CLASSIFICATION
-
-
-The structure of the Pairwise Rank is more complex, as shown in Figure 4.
-
-
-
-Figure 4. DSSM for Pairwise Rank
-
-
-Below, we describe how to build the DSSM model in PaddlePaddle. All the code is included in `./network_conf.py`.
-
-
-### Create a word vector table for the text
-```python
-def create_embedding(self, input, prefix=''):
- """
- Create word embedding. The `prefix` is added in front of the name of
-    the embedding's learnable parameter.
-    """
-    logger.info("Create embedding table [%s] whose dimension is %d" %
- (prefix, self.dnn_dims[0]))
- emb = paddle.layer.embedding(
- input=input,
- size=self.dnn_dims[0],
- param_attr=ParamAttr(name='%s_emb.w' % prefix))
- return emb
-```
-
-Since the input to the embedding table is the list of word IDs corresponding to a sentence, the embedding table outputs the sequence of word vectors.
-
-### CNN implementation
-```python
-def create_cnn(self, emb, prefix=''):
-
- """
- A multi-layer CNN.
- :param emb: The word embedding.
- :type emb: paddle.layer
- :param prefix: The prefix will be added to of layers' names.
- :type prefix: str
- """
-
- def create_conv(context_len, hidden_size, prefix):
- key = "%s_%d_%d" % (prefix, context_len, hidden_size)
- conv = paddle.networks.sequence_conv_pool(
- input=emb,
- context_len=context_len,
- hidden_size=hidden_size,
- # set parameter attr for parameter sharing
- context_proj_param_attr=ParamAttr(name=key + "contex_proj.w"),
- fc_param_attr=ParamAttr(name=key + "_fc.w"),
- fc_bias_attr=ParamAttr(name=key + "_fc.b"),
- pool_bias_attr=ParamAttr(name=key + "_pool.b"))
- return conv
-
- conv_3 = create_conv(3, self.dnn_dims[1], "cnn")
- conv_4 = create_conv(4, self.dnn_dims[1], "cnn")
- return paddle.layer.concat(input=[conv_3, conv_4])
-```
-
-The CNN accepts the word vector sequence from the embedding table, processes the data by convolution and pooling, and finally outputs a semantic vector.
-
-### RNN implementation
-
-An RNN is well suited to learning from variable-length sequences.
-
-```python
-def create_rnn(self, emb, prefix=''):
- """
- A GRU sentence vector learner.
- """
- gru = paddle.networks.simple_gru(
- input=emb,
- size=self.dnn_dims[1],
- mixed_param_attr=ParamAttr(name='%s_gru_mixed.w' % prefix),
- mixed_bias_param_attr=ParamAttr(name="%s_gru_mixed.b" % prefix),
- gru_param_attr=ParamAttr(name='%s_gru.w' % prefix),
- gru_bias_attr=ParamAttr(name="%s_gru.b" % prefix))
- sent_vec = paddle.layer.last_seq(gru)
- return sent_vec
-```
-
-### FC implementation
-
-```python
-def create_fc(self, emb, prefix=''):
-
- """
- A multi-layer fully connected neural networks.
- :param emb: The output of the embedding layer
- :type emb: paddle.layer
- :param prefix: A prefix will be added to the layers' names.
- :type prefix: str
- """
-
- _input_layer = paddle.layer.pooling(
- input=emb, pooling_type=paddle.pooling.Max())
- fc = paddle.layer.fc(
- input=_input_layer,
- size=self.dnn_dims[1],
- param_attr=ParamAttr(name='%s_fc.w' % prefix),
- bias_attr=ParamAttr(name="%s_fc.b" % prefix))
- return fc
-```
-
-In the construction of the FC network, we use `paddle.layer.pooling` to apply max pooling to the word vector sequence, transforming the variable-length sequence into a fixed-dimensional vector.
-
-### Multi-layer DNN implementation
-
-```python
-def create_dnn(self, sent_vec, prefix):
- if len(self.dnn_dims) > 1:
- _input_layer = sent_vec
- for id, dim in enumerate(self.dnn_dims[1:]):
- name = "%s_fc_%d_%d" % (prefix, id, dim)
- fc = paddle.layer.fc(
- input=_input_layer,
- size=dim,
- act=paddle.activation.Tanh(),
- param_attr=ParamAttr(name='%s.w' % name),
- bias_attr=ParamAttr(name='%s.b' % name),
- )
- _input_layer = fc
- return _input_layer
-```
-
-### Classification / Regression
-The structures for classification and regression are similar, and the same function can serve both tasks; a sketch is given below.
-Please check the function `_build_classification_or_regression_model` in [network_conf.py](https://github.com/PaddlePaddle/models/blob/develop/dssm/network_conf.py) for the detailed implementation.
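-
-As a rough sketch of what such a function can look like (hypothetical code, assuming `paddle.v2` is imported as `paddle` and the two semantic vectors `left` and `right` have already been built; the repository's actual implementation may differ):
-
-```python
-def build_classification_or_regression(left, right, class_num, is_classification):
-    if is_classification:
-        # classification: concatenate both vectors, predict a softmax distribution
-        merged = paddle.layer.concat(input=[left, right])
-        prediction = paddle.layer.fc(input=merged,
-                                     size=class_num,
-                                     act=paddle.activation.Softmax())
-        label = paddle.layer.data(
-            name="label", type=paddle.data_type.integer_value(class_num))
-        cost = paddle.layer.classification_cost(input=prediction, label=label)
-    else:
-        # regression: score the pair by cosine similarity in [-1, 1]
-        prediction = paddle.layer.cos_sim(a=left, b=right)
-        label = paddle.layer.data(
-            name="label", type=paddle.data_type.dense_vector(1))
-        cost = paddle.layer.square_error_cost(input=prediction, label=label)
-    return cost, prediction
-```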
-
-### Pairwise Rank
-
-Please check the function `_build_rank_model` in [network_conf.py](https://github.com/PaddlePaddle/models/blob/develop/dssm/network_conf.py) for the implementation; a rough sketch of the ranking head follows.
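-
-A rough sketch of the ranking head (hypothetical code; the label is assumed to be a 0/1 value indicating whether the left target should rank higher):
-
-```python
-def build_rank(source_vec, left_target_vec, right_target_vec):
-    # score each (source, target) pair by cosine similarity
-    left_score = paddle.layer.cos_sim(a=source_vec, b=left_target_vec)
-    right_score = paddle.layer.cos_sim(a=source_vec, b=right_target_vec)
-    label = paddle.layer.data(
-        name="label", type=paddle.data_type.dense_vector(1))
-    # pairwise ranking loss over the two scores
-    cost = paddle.layer.rank_cost(left=left_score, right=right_score, label=label)
-    return cost
-```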
-
-## Data Format
-Below are simple examples of the data in `./data`.
-
-### Regression data format
-```
-# 3 fields each line:
-# - source word list
-# - target word list
-# - target
-<source words> \t <target words> \t <label>
-```
-
-The example of this format is as follows.
-
-```
-Six bags of apples Apple 6s 0.1
-The new driver The driving school 0.9
-```
-
-### Classification data format
-```
-# 3 fields each line:
-# - source word list
-# - target word list
-# - target
-<source words> \t <target words> \t <label>
-```
-
-The example of this format is as follows.
-
-
-```
-Six bags of apples Apple 6s 0
-The new driver The driving school 1
-```
-
-
-### Ranking data format
-```
-# 4 fields each line:
-# - source word list
-# - target1 word list
-# - target2 word list
-# - label
-<source words> \t <target1 words> \t <target2 words> \t <label>
-```
-
-The example of this format is as follows.
-
-```
-Six bags of apples Apple 6s The new driver 1
-The new driver The driving school Apple 6s 1
-```
-
-## Training
-
-We use `python train.py -y 0 --model_arch 0 --class_num 2` with the data in `./data/classification` to train a DSSM model for classification. The parameters of the script `train.py` can be listed by running `python train.py --help`. Some important parameters are:
-
-- `train_data_path` Training data path
-- `test_data_path` Test data path, optional
-- `source_dic_path` Source dictionary path
-- `target_dic_path` Target dictionary path
-- `model_type` The type of loss function of the model: classification 0, rank 1, regression 2
-- `model_arch` Model structure: FC 0,CNN 1, RNN 2
-- `dnn_dims` The dimensions of the model layers, default `256,128,64,32`, i.e. a 4-layer model with those dimensions
-
-## To predict using the trained model
-
-The parameters of the script `infer.py` can be listed by running `python infer.py --help`. Some important parameters are:
-
-- `data_path` Path for the data to predict
-- `prediction_output_path` Prediction output path
-
-## References
-
-1. Huang P S, He X, Gao J, et al. Learning deep structured semantic models for web search using clickthrough data[C]//Proceedings of the 22nd ACM international conference on Conference on information & knowledge management. ACM, 2013: 2333-2338.
-2. [Microsoft Learning to Rank Datasets](https://www.microsoft.com/en-us/research/project/mslr/)
-3. [Gao J, He X, Deng L. Deep Learning for Web Search and Natural Language Processing[J]. Microsoft Research Technical Report, 2015.](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/wsdm2015.v3.pdf)
diff --git a/legacy/dssm/data/classification/test.txt b/legacy/dssm/data/classification/test.txt
deleted file mode 100644
index 66b819576ee3a8c45b7eff035a776d6d0b0e8120..0000000000000000000000000000000000000000
--- a/legacy/dssm/data/classification/test.txt
+++ /dev/null
@@ -1,2 +0,0 @@
-苹果 苹果 6s 0
-汽车 驾驶 驾校 培训 1
diff --git a/legacy/dssm/data/classification/train.txt b/legacy/dssm/data/classification/train.txt
deleted file mode 100644
index 05ab78a61fb31526b5b47519c872f031eff933c5..0000000000000000000000000000000000000000
--- a/legacy/dssm/data/classification/train.txt
+++ /dev/null
@@ -1,2 +0,0 @@
-苹果 六 袋 苹果 6s 0
-新手 汽车 驾驶 驾校 培训 1
diff --git a/legacy/dssm/data/rank/test.txt b/legacy/dssm/data/rank/test.txt
deleted file mode 100644
index 093e2c36bc9a5216e0257055a0ec3e84609c709b..0000000000000000000000000000000000000000
--- a/legacy/dssm/data/rank/test.txt
+++ /dev/null
@@ -1,2 +0,0 @@
-苹果 六 袋 苹果 6s 新手 汽车 驾驶 1
-新手 汽车 驾驶 驾校 培训 苹果 6s 0
diff --git a/legacy/dssm/data/rank/train.txt b/legacy/dssm/data/rank/train.txt
deleted file mode 100644
index 6557cfd0a566e99bd7b50790a0eb0e25ef204834..0000000000000000000000000000000000000000
--- a/legacy/dssm/data/rank/train.txt
+++ /dev/null
@@ -1,2 +0,0 @@
-苹果 六 袋 苹果 6s 新手 汽车 驾驶 1
-新手 汽车 驾驶 驾校 培训 苹果 6s 1
diff --git a/legacy/dssm/data/vocab.txt b/legacy/dssm/data/vocab.txt
deleted file mode 100644
index 86e431ddc926652771e67fcd25c4e4fcfb21c921..0000000000000000000000000000000000000000
--- a/legacy/dssm/data/vocab.txt
+++ /dev/null
@@ -1,10 +0,0 @@
-UNK
-苹果
-六
-袋
-6s
-新手
-汽车
-驾驶
-驾校
-培训
\ No newline at end of file
diff --git a/legacy/dssm/images/dssm.jpg b/legacy/dssm/images/dssm.jpg
deleted file mode 100644
index cd25f319191652a53e62f85fc3d39fb22ae56628..0000000000000000000000000000000000000000
Binary files a/legacy/dssm/images/dssm.jpg and /dev/null differ
diff --git a/legacy/dssm/images/dssm.png b/legacy/dssm/images/dssm.png
deleted file mode 100644
index 2f290a75fb267ff13a74565bce8ce08f1277a6de..0000000000000000000000000000000000000000
Binary files a/legacy/dssm/images/dssm.png and /dev/null differ
diff --git a/legacy/dssm/images/dssm2.jpg b/legacy/dssm/images/dssm2.jpg
deleted file mode 100644
index 71c7b0a92f34fd09f709dc382084c823601c4e34..0000000000000000000000000000000000000000
Binary files a/legacy/dssm/images/dssm2.jpg and /dev/null differ
diff --git a/legacy/dssm/images/dssm2.png b/legacy/dssm/images/dssm2.png
deleted file mode 100644
index 6f36b3feb42857b94e75c601e39d23147ceae3f2..0000000000000000000000000000000000000000
Binary files a/legacy/dssm/images/dssm2.png and /dev/null differ
diff --git a/legacy/dssm/images/dssm3.jpg b/legacy/dssm/images/dssm3.jpg
deleted file mode 100644
index 98f7cd102e1d85fe9aa04e294b163e3fab1304e9..0000000000000000000000000000000000000000
Binary files a/legacy/dssm/images/dssm3.jpg and /dev/null differ
diff --git a/legacy/dssm/infer.py b/legacy/dssm/infer.py
deleted file mode 100644
index 63a9657341d7d220b72696fd215d1850b1718f32..0000000000000000000000000000000000000000
--- a/legacy/dssm/infer.py
+++ /dev/null
@@ -1,139 +0,0 @@
-import argparse
-import itertools
-import distutils.util
-
-import reader
-import paddle.v2 as paddle
-from network_conf import DSSM
-from utils import logger, ModelType, ModelArch, load_dic
-
-parser = argparse.ArgumentParser(description="PaddlePaddle DSSM infer")
-parser.add_argument(
- "--model_path", type=str, required=True, help="The path of trained model.")
-parser.add_argument(
- "-i",
- "--data_path",
- type=str,
- required=True,
- help="The path of the data for inferring.")
-parser.add_argument(
- "-o",
- "--prediction_output_path",
- type=str,
- required=True,
- help="The path to save the predictions.")
-parser.add_argument(
- "-y",
- "--model_type",
- type=int,
- required=True,
- default=ModelType.CLASSIFICATION_MODE,
- help=("The model type: %d for classification, %d for pairwise rank, "
- "%d for regression (default: classification).") %
- (ModelType.CLASSIFICATION_MODE, ModelType.RANK_MODE,
- ModelType.REGRESSION_MODE))
-parser.add_argument(
- "-s",
- "--source_dic_path",
- type=str,
- required=False,
- help="The path of the source's word dictionary.")
-parser.add_argument(
- "--target_dic_path",
- type=str,
- required=False,
- help=("The path of the target's word dictionary, "
- "if this parameter is not set, the `source_dic_path` will be used."))
-parser.add_argument(
- "-a",
- "--model_arch",
- type=int,
- required=True,
- default=ModelArch.CNN_MODE,
- help="model architecture, %d for CNN, %d for FC, %d for RNN" %
- (ModelArch.CNN_MODE, ModelArch.FC_MODE, ModelArch.RNN_MODE))
-parser.add_argument(
- "--share_network_between_source_target",
- type=distutils.util.strtobool,
- default=False,
- help="whether to share network parameters between source and target")
-parser.add_argument(
- "--share_embed",
- type=distutils.util.strtobool,
- default=False,
- help="whether to share word embedding between source and target")
-parser.add_argument(
- "--dnn_dims",
- type=str,
- default="256,128,64,32",
-    help=("The dimensions of the dnn layers, default is `256,128,64,32`, "
-          "which means a dnn with 4 layers whose "
-          "dimensions are 256, 128, 64 and 32 will be created."))
-parser.add_argument(
- "-c",
- "--class_num",
- type=int,
- default=0,
- help="The number of categories for classification task.")
-
-args = parser.parse_args()
-args.model_type = ModelType(args.model_type)
-args.model_arch = ModelArch(args.model_arch)
-if args.model_type.is_classification():
-    assert args.class_num > 1, ("The parameter class_num must be set "
-                                "for the classification task.")
-
-layer_dims = map(int, args.dnn_dims.split(","))
-args.target_dic_path = args.source_dic_path if not args.target_dic_path \
- else args.target_dic_path
-
-paddle.init(use_gpu=False, trainer_count=1)
-
-
-class Inferer(object):
- def __init__(self, param_path):
- prediction = DSSM(
- dnn_dims=layer_dims,
- vocab_sizes=[
- len(load_dic(path))
- for path in [args.source_dic_path, args.target_dic_path]
- ],
- model_type=args.model_type,
- model_arch=args.model_arch,
- share_semantic_generator=args.share_network_between_source_target,
- class_num=args.class_num,
- share_embed=args.share_embed,
- is_infer=True)()
-
- # load parameter
- logger.info("Load the trained model from %s." % param_path)
-        with open(param_path, "rb") as f:
-            self.parameters = paddle.parameters.Parameters.from_tar(f)
- self.inferer = paddle.inference.Inference(
- output_layer=prediction, parameters=self.parameters)
-
- def infer(self, data_path):
- dataset = reader.Dataset(
- train_path=data_path,
- test_path=None,
- source_dic_path=args.source_dic_path,
- target_dic_path=args.target_dic_path,
- model_type=args.model_type, )
- infer_reader = paddle.batch(dataset.infer, batch_size=1000)
- logger.warning("Write predictions to %s." % args.prediction_output_path)
-
-        # write predictions batch by batch; the file is closed automatically
-        with open(args.prediction_output_path, "w") as output_f:
-            for batch in infer_reader():
-                res = self.inferer.infer(input=batch)
-                predictions = [" ".join(map(str, x)) for x in res]
-                assert len(batch) == len(predictions), (
-                    "Error! %d inputs are given, "
-                    "but only %d predictions are returned.") % (
-                        len(batch), len(predictions))
-                output_f.write("\n".join(predictions) + "\n")
-
-
-if __name__ == "__main__":
- inferer = Inferer(args.model_path)
- inferer.infer(args.data_path)
diff --git a/legacy/dssm/network_conf.py b/legacy/dssm/network_conf.py
deleted file mode 100644
index 6758982d59822f414c3e75941eed1fd5dd45165b..0000000000000000000000000000000000000000
--- a/legacy/dssm/network_conf.py
+++ /dev/null
@@ -1,292 +0,0 @@
-from paddle import v2 as paddle
-from paddle.v2.attr import ParamAttr
-from utils import TaskType, logger, ModelType, ModelArch
-
-
-class DSSM(object):
- def __init__(self,
- dnn_dims=[],
- vocab_sizes=[],
- model_type=ModelType.create_classification(),
- model_arch=ModelArch.create_cnn(),
- share_semantic_generator=False,
- class_num=None,
- share_embed=False,
- is_infer=False):
- """
-        :param dnn_dims: The dimension of each layer in the semantic vector
- generator.
- :type dnn_dims: list of int
-        :param vocab_sizes: The vocabulary sizes of the left and the right items.
- :type vocab_sizes: A list having 2 elements.
- :param model_type: The type of task to train the DSSM model. The value
- should be "rank: 0", "regression: 1" or
- "classification: 2".
- :type model_type: int
- :param model_arch: A value indicating the model architecture to use.
- :type model_arch: int
- :param share_semantic_generator: A flag indicating whether to share the
- semantic vector between the left and
- the right item.
- :type share_semantic_generator: bool
-        :param share_embed: A flag indicating whether to share the embeddings
- between the left and the right item.
- :type share_embed: bool
- :param class_num: The number of categories.
- :type class_num: int
- """
-        assert len(vocab_sizes) == 2, (
-            "vocab_sizes specifies the vocabulary sizes of the left and "
-            "right inputs; it should contain exactly 2 elements.")
-        assert len(dnn_dims) > 1, ("In the DNN model, at least two layers "
-                                   "are needed.")
-
- self.dnn_dims = dnn_dims
- self.vocab_sizes = vocab_sizes
- self.share_semantic_generator = share_semantic_generator
- self.share_embed = share_embed
- self.model_type = ModelType(model_type)
- self.model_arch = ModelArch(model_arch)
- self.class_num = class_num
- self.is_infer = is_infer
- logger.warning("Build DSSM model with config of %s, %s" %
- (self.model_type, self.model_arch))
- logger.info("The vocabulary size is : %s" % str(self.vocab_sizes))
-
- # bind model architecture
- _model_arch = {
- "cnn": self.create_cnn,
- "fc": self.create_fc,
- "rnn": self.create_rnn,
- }
-
- def _model_arch_creater(emb, prefix=""):
- sent_vec = _model_arch.get(str(model_arch))(emb, prefix)
- dnn = self.create_dnn(sent_vec, prefix)
- return dnn
-
- self.model_arch_creater = _model_arch_creater
-
- _model_type = {
- "classification": self._build_classification_model,
- "rank": self._build_rank_model,
- "regression": self._build_regression_model,
- }
- print("model type: ", str(self.model_type))
- self.model_type_creater = _model_type[str(self.model_type)]
-
- def __call__(self):
- return self.model_type_creater()
-
- def create_embedding(self, input, prefix=""):
- """
-        Create a word embedding. The `prefix` is added in front of the name
-        of the embedding's learnable parameter.
- """
-        logger.info("Create embedding table [%s] whose dimension is %d. " %
-                    (prefix, self.dnn_dims[0]))
- emb = paddle.layer.embedding(
- input=input,
- size=self.dnn_dims[0],
- param_attr=ParamAttr(name="%s_emb.w" % prefix))
- return emb
-
- def create_fc(self, emb, prefix=""):
- """
-        Create a sentence vector by max-pooling the word embeddings and applying one fully connected layer.
-
- :param emb: The output of the embedding layer
- :type emb: paddle.layer
- :param prefix: A prefix will be added to the layers' names.
- :type prefix: str
- """
- _input_layer = paddle.layer.pooling(
- input=emb, pooling_type=paddle.pooling.Max())
- fc = paddle.layer.fc(input=_input_layer,
- size=self.dnn_dims[1],
- param_attr=ParamAttr(name="%s_fc.w" % prefix),
- bias_attr=ParamAttr(
- name="%s_fc.b" % prefix, initial_std=0.))
- return fc
-
- def create_rnn(self, emb, prefix=""):
- """
- A GRU sentence vector learner.
- """
- gru = paddle.networks.simple_gru(
- input=emb,
- size=self.dnn_dims[1],
- mixed_param_attr=ParamAttr(name="%s_gru_mixed.w" % prefix),
- mixed_bias_param_attr=ParamAttr(name="%s_gru_mixed.b" % prefix),
- gru_param_attr=ParamAttr(name="%s_gru.w" % prefix),
- gru_bias_attr=ParamAttr(name="%s_gru.b" % prefix))
- sent_vec = paddle.layer.last_seq(gru)
- return sent_vec
-
- def create_cnn(self, emb, prefix=""):
- """
- A multi-layer CNN.
-
- :param emb: The word embedding.
- :type emb: paddle.layer
-        :param prefix: A prefix that will be added to the layers' names.
- :type prefix: str
- """
-
- def create_conv(context_len, hidden_size, prefix):
- key = "%s_%d_%d" % (prefix, context_len, hidden_size)
- conv = paddle.networks.sequence_conv_pool(
- input=emb,
- context_len=context_len,
- hidden_size=hidden_size,
- # set parameter attr for parameter sharing
-                context_proj_param_attr=ParamAttr(name=key + "_context_proj.w"),
- fc_param_attr=ParamAttr(name=key + "_fc.w"),
- fc_bias_attr=ParamAttr(name=key + "_fc.b"),
- pool_bias_attr=ParamAttr(name=key + "_pool.b"))
- return conv
-
- logger.info("create a sequence_conv_pool whose context width is 3.")
- conv_3 = create_conv(3, self.dnn_dims[1], "cnn")
- logger.info("create a sequence_conv_pool whose context width is 4.")
- conv_4 = create_conv(4, self.dnn_dims[1], "cnn")
-
- return paddle.layer.concat(input=[conv_3, conv_4])
-
- def create_dnn(self, sent_vec, prefix):
-        # If dnn_dims has more than one element, stack fc layers on top of the sentence vector.
- if len(self.dnn_dims) > 1:
- _input_layer = sent_vec
- for id, dim in enumerate(self.dnn_dims[1:]):
- name = "%s_fc_%d_%d" % (prefix, id, dim)
-                logger.info("create fc layer [%s] whose dimension is %d" %
-                            (name, dim))
- fc = paddle.layer.fc(input=_input_layer,
- size=dim,
- act=paddle.activation.Tanh(),
- param_attr=ParamAttr(name="%s.w" % name),
- bias_attr=ParamAttr(
- name="%s.b" % name, initial_std=0.))
- _input_layer = fc
- return _input_layer
-
- def _build_classification_model(self):
- logger.info("build classification model")
- assert self.model_type.is_classification()
- return self._build_classification_or_regression_model(
- is_classification=True)
-
- def _build_regression_model(self):
- logger.info("build regression model")
- assert self.model_type.is_regression()
- return self._build_classification_or_regression_model(
- is_classification=False)
-
- def _build_rank_model(self):
- """
- Build a pairwise rank model, and the cost is returned.
-
-        A pairwise rank model has 4 inputs:
- - source sentence
- - left_target sentence
- - right_target sentence
- - label, 1 if left_target should be sorted in front of
- right_target, otherwise 0.
- """
- logger.info("build rank model")
- assert self.model_type.is_rank()
- source = paddle.layer.data(
- name="source_input",
- type=paddle.data_type.integer_value_sequence(self.vocab_sizes[0]))
- left_target = paddle.layer.data(
- name="left_target_input",
- type=paddle.data_type.integer_value_sequence(self.vocab_sizes[1]))
- right_target = paddle.layer.data(
- name="right_target_input",
- type=paddle.data_type.integer_value_sequence(self.vocab_sizes[1]))
- if not self.is_infer:
- label = paddle.layer.data(
- name="label_input", type=paddle.data_type.integer_value(1))
-
- prefixs = "_ _ _".split(
- ) if self.share_semantic_generator else "source target target".split()
- embed_prefixs = "_ _ _".split(
- ) if self.share_embed else "source target target".split()
-
- word_vecs = []
- for id, input in enumerate([source, left_target, right_target]):
- x = self.create_embedding(input, prefix=embed_prefixs[id])
- word_vecs.append(x)
-
- semantics = []
- for id, input in enumerate(word_vecs):
- x = self.model_arch_creater(input, prefix=prefixs[id])
- semantics.append(x)
-
- # The cosine similarity score of source and left_target.
- left_score = paddle.layer.cos_sim(semantics[0], semantics[1])
- # The cosine similarity score of source and right target.
- right_score = paddle.layer.cos_sim(semantics[0], semantics[2])
-
- if not self.is_infer:
- # rank cost
- cost = paddle.layer.rank_cost(left_score, right_score, label=label)
-            # prediction = left_score - right_score,
-            # but this operator is not supported currently,
-            # so AUC is not computed for the rank task.
- return cost, None, label
- return right_score
-
- def _build_classification_or_regression_model(self, is_classification):
- """
- Build a classification/regression model, and the cost is returned.
-
- The classification/regression task expects 3 inputs:
- - source sentence
- - target sentence
- - classification label
-
- """
- if is_classification:
- assert self.class_num
-
- source = paddle.layer.data(
- name="source_input",
- type=paddle.data_type.integer_value_sequence(self.vocab_sizes[0]))
- target = paddle.layer.data(
- name="target_input",
- type=paddle.data_type.integer_value_sequence(self.vocab_sizes[1]))
- label = paddle.layer.data(
- name="label_input",
- type=paddle.data_type.integer_value(self.class_num)
- if is_classification else paddle.data_type.dense_vector(1))
-
- prefixs = "_ _".split(
- ) if self.share_semantic_generator else "source target".split()
- embed_prefixs = "_ _".split(
- ) if self.share_embed else "source target".split()
-
- word_vecs = []
- for id, input in enumerate([source, target]):
- x = self.create_embedding(input, prefix=embed_prefixs[id])
- word_vecs.append(x)
-
- semantics = []
- for id, input in enumerate(word_vecs):
- x = self.model_arch_creater(input, prefix=prefixs[id])
- semantics.append(x)
-
- if is_classification:
- concated_vector = paddle.layer.concat(semantics)
- prediction = paddle.layer.fc(input=concated_vector,
- size=self.class_num,
- act=paddle.activation.Softmax())
- cost = paddle.layer.classification_cost(
- input=prediction, label=label)
- else:
- prediction = paddle.layer.cos_sim(*semantics)
- cost = paddle.layer.square_error_cost(prediction, label)
-
- if not self.is_infer:
- return cost, prediction, label
- return prediction
diff --git a/legacy/dssm/reader.py b/legacy/dssm/reader.py
deleted file mode 100644
index c0530c50db3b2900844242e2df3ed9ab3020bcde..0000000000000000000000000000000000000000
--- a/legacy/dssm/reader.py
+++ /dev/null
@@ -1,121 +0,0 @@
-from utils import UNK, ModelType, TaskType, load_dic, \
-    sent2ids, logger
-
-
-class Dataset(object):
- def __init__(self, train_path, test_path, source_dic_path, target_dic_path,
- model_type):
- self.train_path = train_path
- self.test_path = test_path
- self.source_dic_path = source_dic_path
- self.target_dic_path = target_dic_path
- self.model_type = ModelType(model_type)
-
- self.source_dic = load_dic(self.source_dic_path)
- self.target_dic = load_dic(self.target_dic_path)
-
- _record_reader = {
- ModelType.CLASSIFICATION_MODE: self._read_classification_record,
- ModelType.REGRESSION_MODE: self._read_regression_record,
- ModelType.RANK_MODE: self._read_rank_record,
- }
-
-        assert isinstance(self.model_type, ModelType)
-        self.record_reader = _record_reader[self.model_type.mode]
- self.is_infer = False
-
- def train(self):
- '''
- Load trainset.
- '''
- logger.info("[reader] load trainset from %s" % self.train_path)
- with open(self.train_path) as f:
- for line_id, line in enumerate(f):
- yield self.record_reader(line)
-
- def test(self):
- '''
- Load testset.
- '''
- with open(self.test_path) as f:
- for line_id, line in enumerate(f):
- yield self.record_reader(line)
-
- def infer(self):
- self.is_infer = True
- with open(self.train_path) as f:
- for line in f:
- yield self.record_reader(line)
-
- def _read_classification_record(self, line):
-        '''
-        data format:
-        <source words> [TAB] <target words> [TAB] <label>
-
-        @line: str
-            a string line which represents a record.
-        '''
- fs = line.strip().split('\t')
-        assert len(fs) == 3, "wrong format for classification\n" + \
-            "the format should be " + \
-            "<source words> [TAB] <target words> [TAB] <label>"
- source = sent2ids(fs[0], self.source_dic)
- target = sent2ids(fs[1], self.target_dic)
- if not self.is_infer:
- label = int(fs[2])
- return (
- source,
- target,
- label, )
- return source, target
-
- def _read_regression_record(self, line):
-        '''
-        data format:
-        <source words> [TAB] <target words> [TAB] <label>
-
-        @line: str
-            a string line which represents a record.
-        '''
- fs = line.strip().split('\t')
-        assert len(fs) == 3, "wrong format for regression\n" + \
-            "the format should be " + \
-            "<source words> [TAB] <target words> [TAB] <label>"
- source = sent2ids(fs[0], self.source_dic)
- target = sent2ids(fs[1], self.target_dic)
- if not self.is_infer:
- label = float(fs[2])
- return (
- source,
- target,
- [label], )
- return source, target
-
- def _read_rank_record(self, line):
- '''
- data format:
-        <source words> [TAB] <left target words> [TAB] <right target words> [TAB] <label>
- '''
- fs = line.strip().split('\t')
-        assert len(fs) == 4, "wrong format for rank\n" + \
-            "the format should be " + \
-            "<source words> [TAB] <left target words> [TAB] <right target words> [TAB] <label>"
-
- source = sent2ids(fs[0], self.source_dic)
- left_target = sent2ids(fs[1], self.target_dic)
- right_target = sent2ids(fs[2], self.target_dic)
- if not self.is_infer:
- label = int(fs[3])
- return (source, left_target, right_target, label)
- return source, left_target, right_target
-
-
-if __name__ == '__main__':
- path = './data/classification/train.txt'
- test_path = './data/classification/test.txt'
- source_dic = './data/vocab.txt'
-    dataset = Dataset(path, test_path, source_dic, source_dic,
-                      ModelType.create_classification())
-
- for rcd in dataset.train():
- print rcd
diff --git a/legacy/dssm/train.py b/legacy/dssm/train.py
deleted file mode 100644
index d7ec8aa690ca7b6ace8478e09b2ce3b0f93a2583..0000000000000000000000000000000000000000
--- a/legacy/dssm/train.py
+++ /dev/null
@@ -1,278 +0,0 @@
-import argparse
-import distutils.util
-
-import paddle.v2 as paddle
-from network_conf import DSSM
-import reader
-from utils import TaskType, load_dic, logger, ModelType, ModelArch, display_args
-
-parser = argparse.ArgumentParser(description="PaddlePaddle DSSM example")
-
-parser.add_argument(
- "-i",
- "--train_data_path",
- type=str,
- required=False,
- help="The path of training data.")
-parser.add_argument(
- "-t",
- "--test_data_path",
- type=str,
- required=False,
- help="The path of testing data.")
-parser.add_argument(
- "-s",
- "--source_dic_path",
- type=str,
- required=False,
- help="The path of the source's word dictionary.")
-parser.add_argument(
- "--target_dic_path",
- type=str,
- required=False,
- help=("The path of the target's word dictionary, "
- "if this parameter is not set, the `source_dic_path` will be used"))
-parser.add_argument(
- "-b",
- "--batch_size",
- type=int,
- default=32,
- help="The size of mini-batch (default:32).")
-parser.add_argument(
- "-p",
- "--num_passes",
- type=int,
- default=10,
-    help="The number of passes to run (default: 10).")
-parser.add_argument(
- "-y",
- "--model_type",
- type=int,
- required=True,
- default=ModelType.CLASSIFICATION_MODE,
- help=("model type, %d for classification, %d for pairwise rank, "
- "%d for regression (default: classification).") %
- (ModelType.CLASSIFICATION_MODE, ModelType.RANK_MODE,
- ModelType.REGRESSION_MODE))
-parser.add_argument(
- "-a",
- "--model_arch",
- type=int,
- required=True,
- default=ModelArch.CNN_MODE,
- help="The model architecture, %d for CNN, %d for FC, %d for RNN." %
- (ModelArch.CNN_MODE, ModelArch.FC_MODE, ModelArch.RNN_MODE))
-parser.add_argument(
- "--share_network_between_source_target",
- type=distutils.util.strtobool,
- default=False,
- help="Whether to share network parameters between source and target.")
-parser.add_argument(
- "--share_embed",
- type=distutils.util.strtobool,
- default=False,
- help="Whether to share word embedding between source and target.")
-parser.add_argument(
- "--dnn_dims",
- type=str,
- default="256,128,64,32",
-    help=("The dimensions of the dnn layers, default is '256,128,64,32', "
-          "which means creating a 4-layer dnn whose layer dimensions are "
-          "256, 128, 64 and 32."))
-parser.add_argument(
- "--num_workers",
- type=int,
- default=1,
- help="The number of worker threads, default 1.")
-parser.add_argument(
- "--use_gpu",
- type=distutils.util.strtobool,
- default=False,
- help="Whether to use GPU devices (default: False)")
-parser.add_argument(
- "-c",
- "--class_num",
- type=int,
- default=0,
- help="The number of categories for classification task.")
-parser.add_argument(
- "--model_output_prefix",
- type=str,
- default="./",
- help="The prefix of the path to store the trained models (default: ./).")
-parser.add_argument(
- "-g",
- "--num_batches_to_log",
- type=int,
- default=100,
-    help=("The log period. Every num_batches_to_log batches, "
-          "a training log will be printed. (default: 100)"))
-parser.add_argument(
- "-e",
- "--num_batches_to_test",
- type=int,
- default=200,
-    help=("The test period. Every num_batches_to_test batches, "
-          "the model will be evaluated on the test data (default: 200)."))
-parser.add_argument(
- "-z",
- "--num_batches_to_save_model",
- type=int,
- default=400,
- help=("Every num_batches_to_save_model batches, "
- "a trained model will be saved (default: 400)."))
-
-args = parser.parse_args()
-args.model_type = ModelType(args.model_type)
-args.model_arch = ModelArch(args.model_arch)
-if args.model_type.is_classification():
-    assert args.class_num > 1, ("The parameter class_num must be set "
-                                "for the classification task.")
-
-layer_dims = [int(i) for i in args.dnn_dims.split(",")]
-args.target_dic_path = args.source_dic_path if not \
- args.target_dic_path else args.target_dic_path
-
-
-def train(train_data_path=None,
- test_data_path=None,
- source_dic_path=None,
- target_dic_path=None,
- model_type=ModelType.create_classification(),
- model_arch=ModelArch.create_cnn(),
- batch_size=32,
- num_passes=10,
- share_semantic_generator=False,
- share_embed=False,
- class_num=None,
- num_workers=1,
- use_gpu=False):
- """
- Train the DSSM.
- """
- default_train_path = "./data/rank/train.txt"
- default_test_path = "./data/rank/test.txt"
- default_dic_path = "./data/vocab.txt"
- if not model_type.is_rank():
- default_train_path = "./data/classification/train.txt"
- default_test_path = "./data/classification/test.txt"
-
- use_default_data = not train_data_path
-
- if use_default_data:
- train_data_path = default_train_path
- test_data_path = default_test_path
- source_dic_path = default_dic_path
- target_dic_path = default_dic_path
-
- dataset = reader.Dataset(
- train_path=train_data_path,
- test_path=test_data_path,
- source_dic_path=source_dic_path,
- target_dic_path=target_dic_path,
- model_type=model_type, )
-
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- dataset.train, buf_size=1000),
- batch_size=batch_size)
-
- test_reader = paddle.batch(
- paddle.reader.shuffle(
- dataset.test, buf_size=1000),
- batch_size=batch_size)
-
- paddle.init(use_gpu=use_gpu, trainer_count=num_workers)
-
- cost, prediction, label = DSSM(
- dnn_dims=layer_dims,
- vocab_sizes=[
- len(load_dic(path)) for path in [source_dic_path, target_dic_path]
- ],
- model_type=model_type,
- model_arch=model_arch,
- share_semantic_generator=share_semantic_generator,
- class_num=class_num,
- share_embed=share_embed)()
-
- parameters = paddle.parameters.create(cost)
-
- adam_optimizer = paddle.optimizer.Adam(
- learning_rate=2e-4,
- regularization=paddle.optimizer.L2Regularization(rate=1e-3),
- model_average=paddle.optimizer.ModelAverage(average_window=0.5))
-
- trainer = paddle.trainer.SGD(
- cost=cost,
- extra_layers=paddle.evaluator.auc(input=prediction, label=label)
- if not model_type.is_rank() else None,
- parameters=parameters,
- update_equation=adam_optimizer)
-
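-
-    # `feeding` maps each data layer name to the index of the corresponding
-    # field in the tuples yielded by the reader, so the trainer knows which
-    # column feeds which input layer.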
- feeding = {}
- if model_type.is_classification() or model_type.is_regression():
- feeding = {"source_input": 0, "target_input": 1, "label_input": 2}
- else:
- feeding = {
- "source_input": 0,
- "left_target_input": 1,
- "right_target_input": 2,
- "label_input": 3
- }
-
- def _event_handler(event):
- """
-        Event handler: logs the training cost, periodically evaluates on the test data, and saves model snapshots.
- """
- if isinstance(event, paddle.event.EndIteration):
- # output train log
- if event.batch_id % args.num_batches_to_log == 0:
- logger.info("Pass %d, Batch %d, Cost %f, %s" % (
- event.pass_id, event.batch_id, event.cost, event.metrics))
-
- # test model
- if event.batch_id > 0 and \
- event.batch_id % args.num_batches_to_test == 0:
- if test_reader is not None:
- if model_type.is_classification():
- result = trainer.test(
- reader=test_reader, feeding=feeding)
- logger.info("Test at Pass %d, %s" % (event.pass_id,
- result.metrics))
- else:
- result = None
- # save model
- if event.batch_id > 0 and \
- event.batch_id % args.num_batches_to_save_model == 0:
- model_desc = "{type}_{arch}".format(
- type=str(args.model_type), arch=str(args.model_arch))
- with open("%sdssm_%s_pass_%05d.tar" %
- (args.model_output_prefix, model_desc,
- event.pass_id), "w") as f:
- trainer.save_parameter_to_tar(f)
-
- trainer.train(
- reader=train_reader,
- event_handler=_event_handler,
- feeding=feeding,
- num_passes=num_passes)
-
- logger.info("Training has finished.")
-
-
-if __name__ == "__main__":
- display_args(args)
- train(
- train_data_path=args.train_data_path,
- test_data_path=args.test_data_path,
- source_dic_path=args.source_dic_path,
- target_dic_path=args.target_dic_path,
- model_type=ModelType(args.model_type),
- model_arch=ModelArch(args.model_arch),
- batch_size=args.batch_size,
- num_passes=args.num_passes,
- share_semantic_generator=args.share_network_between_source_target,
- share_embed=args.share_embed,
- class_num=args.class_num,
- num_workers=args.num_workers,
- use_gpu=args.use_gpu)
diff --git a/legacy/dssm/utils.py b/legacy/dssm/utils.py
deleted file mode 100644
index 97296fd5dcc2dc664c97dd83d658c8805221fc57..0000000000000000000000000000000000000000
--- a/legacy/dssm/utils.py
+++ /dev/null
@@ -1,127 +0,0 @@
-import logging
-import paddle
-
-UNK = 0
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def mode_attr_name(mode):
- return mode.upper() + "_MODE"
-
-
-def create_attrs(cls):
- for id, mode in enumerate(cls.modes):
- setattr(cls, mode_attr_name(mode), id)
-
-
-def make_check_method(cls):
- """
-    Create the `is_<mode>` check methods for the class.
- """
-
- def method(mode):
- def _method(self):
- return self.mode == getattr(cls, mode_attr_name(mode))
-
- return _method
-
- for id, mode in enumerate(cls.modes):
- setattr(cls, "is_" + mode, method(mode))
-
-
-def make_create_method(cls):
- def method(mode):
- @staticmethod
- def _method():
- key = getattr(cls, mode_attr_name(mode))
- return cls(key)
-
- return _method
-
- for id, mode in enumerate(cls.modes):
- setattr(cls, "create_" + mode, method(mode))
-
-
-def make_str_method(cls, type_name="unk"):
- def _str_(self):
- for mode in cls.modes:
- if self.mode == getattr(cls, mode_attr_name(mode)):
- return mode
-
- def _hash_(self):
- return self.mode
-
- setattr(cls, "__str__", _str_)
- setattr(cls, "__repr__", _str_)
- setattr(cls, "__hash__", _hash_)
- cls.__name__ = type_name
-
-
-def _init_(self, mode, cls):
- if isinstance(mode, int):
- self.mode = mode
- elif isinstance(mode, cls):
- self.mode = mode.mode
- else:
-        raise Exception("Wrong mode type; got type: %s, value: %s." %
-                        (type(mode), mode))
-
-
-def build_mode_class(cls):
- create_attrs(cls)
-    make_str_method(cls, cls.__name__)
- make_check_method(cls)
- make_create_method(cls)
-
-
-class TaskType(object):
- modes = "train test infer".split()
-
- def __init__(self, mode):
- _init_(self, mode, TaskType)
-
-
-class ModelType:
- modes = "classification rank regression".split()
-
- def __init__(self, mode):
- _init_(self, mode, ModelType)
-
-
-class ModelArch:
- modes = "fc cnn rnn".split()
-
- def __init__(self, mode):
- _init_(self, mode, ModelArch)
-
-
-build_mode_class(TaskType)
-build_mode_class(ModelType)
-build_mode_class(ModelArch)
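-
-# A minimal usage sketch of the mode classes built above:
-#
-#   model_type = ModelType.create_classification()
-#   assert model_type.is_classification()
-#   assert str(model_type) == "classification"
-#   assert ModelType.CLASSIFICATION_MODE == 0  # position in ModelType.modes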
-
-
-def sent2ids(sent, vocab):
- """
- transform a sentence to a list of ids.
- """
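-    # Example: with data/vocab.txt above (UNK -> 0, 苹果 -> 1, 6s -> 4),
-    # sent2ids(u"苹果 6s", vocab) returns [1, 4]; unknown words map to UNK.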
- return [vocab.get(w, UNK) for w in sent.split()]
-
-
-def load_dic(path):
- """
-    Load a word dictionary whose format is: each line is a single word; the (0-based) line number is used as the word id.
- """
- dic = {}
- with open(path) as f:
- for id, line in enumerate(f):
- w = line.strip()
- dic[w] = id
- return dic
-
-
-def display_args(args):
-    logger.info("The arguments passed by command line are:")
-    for k, v in sorted(vars(args).items()):
- logger.info("{}:\t{}".format(k, v))
diff --git a/legacy/generate_chinese_poetry/README.md b/legacy/generate_chinese_poetry/README.md
deleted file mode 100644
index c1ea00109075a64f549ec56ad8433f7c4846855a..0000000000000000000000000000000000000000
--- a/legacy/generate_chinese_poetry/README.md
+++ /dev/null
@@ -1,115 +0,0 @@
-Running the sample code in this directory requires PaddlePaddle v0.10.0. If the PaddlePaddle version installed in your environment is lower than this requirement, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update PaddlePaddle.
-
----
-
-# Chinese Classical Poetry Generation
-
-## Introduction
-Based on an encoder-decoder neural network model, this example performs verse-to-verse (sequence-to-sequence) training on the Complete Tang Poems, so that given a verse, the model generates the next verse.
-
-Both the encoder and the decoder in the model are stacked bi-directional LSTMs, 3 layers each by default, with an attention unit.
-
-The following is a brief directory structure and description of this example:
-
-```text
-.
-├── data                 # stores the training data and the dictionary
-│   ├── download.sh      # downloads the raw data
-├── README.md            # this document
-├── index.html           # this document (html format)
-├── preprocess.py        # raw data preprocessing
-├── generate.py          # script for generating verses
-├── network_conf.py      # model definition
-├── reader.py            # data reading interface
-├── train.py             # training script
-└── utils.py             # utility functions
-```
-
-## Data Processing
-### Raw Data Source
-This example uses the Complete Tang Poems collected in the [Chinese classical poetry database](https://github.com/chinese-poetry/chinese-poetry) as training data, about 54,000 Tang poems in total.
-
-### Downloading the Raw Data
-```bash
-cd data && ./download.sh && cd ..
-```
-### Data Preprocessing
-```bash
-python preprocess.py --datadir data/raw --outfile data/poems.txt --dictfile data/dict.txt
-```
-
-After the above script finishes, the processed training data poems.txt and the dictionary dict.txt are generated. The dictionary is built at the character level, from characters that appear at least 10 times.
-
-Each line of poems.txt describes one poem in three columns: title, author, and content. Within the content, verses are separated by `.`.
-
-Training data example:
-```text
-登鸛雀樓 王之渙 白日依山盡.黃河入海流.欲窮千里目.更上一層樓
-觀獵 李白 太守耀清威.乘閑弄晚暉.江沙橫獵騎.山火遶行圍.箭逐雲鴻落.鷹隨月兔飛.不知白日暮.歡賞夜方歸
-晦日重宴 陳嘉言 高門引冠蓋.下客抱支離.綺席珍羞滿.文場翰藻摛.蓂華彫上月.柳色藹春池.日斜歸戚里.連騎勒金羈
-```
-
-During training, each verse is used as the model input, and the next verse as the prediction target.
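-
-For example, the first poem above yields input/target pairs such as:
-
-```text
-input:  白日依山盡        target: 黃河入海流
-input:  黃河入海流        target: 欲窮千里目
-```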
-
-
-## Model Training
-The command line arguments of the training script [train.py](./train.py) can be viewed with `python train.py --help`. The main arguments are:
-- `num_passes`: number of training passes
-- `batch_size`: batch size
-- `use_gpu`: whether to use GPU
-- `trainer_count`: number of trainers, default is 1
-- `save_dir_path`: model storage path, default is the models directory under the current directory
-- `encoder_depth`: depth of the encoder LSTM, default is 3
-- `decoder_depth`: depth of the decoder LSTM, default is 3
-- `train_data_path`: training data path
-- `word_dict_path`: dictionary path
-- `init_model_path`: initial model path; no need to specify when training from scratch
-
-### Running the Training
-```bash
-python train.py \
- --num_passes 50 \
- --batch_size 256 \
- --use_gpu True \
- --trainer_count 1 \
- --save_dir_path models \
- --train_data_path data/poems.txt \
- --word_dict_path data/dict.txt \
- 2>&1 | tee train.log
-```
-After each training pass, the model parameters are saved under the models directory. Training logs are written to train.log.
-
-### Optimal Model Parameters
-Find the pass with the lowest cost and use the model parameters of that pass for subsequent prediction.
-```bash
-python -c 'import utils; utils.find_optimal_pass("./train.log")'
-```
-
-## Generating Verses
-Use the [generate.py](./generate.py) script to generate the next verse for input verses; its command line arguments can be viewed with `python generate.py --help`.
-The main arguments are:
-- `model_path`: the trained model parameter file
-- `word_dict_path`: dictionary path
-- `test_data_path`: input data path
-- `batch_size`: batch size, default is 1
-- `beam_size`: beam width in beam search, default is 5
-- `save_file`: output path
-- `use_gpu`: whether to use GPU
-
-### Running the Generation
-For example, save the verse `孤帆遠影碧空盡` in a file `input.txt` as the input for predicting the next verse, then run:
-```bash
-python generate.py \
- --model_path models/pass_00049.tar.gz \
- --word_dict_path data/dict.txt \
- --test_data_path input.txt \
- --save_file output.txt
-```
-The generated results are saved in the file `output.txt`. For the example input above, the generated verses are as follows:
-```text
--9.6987 萬 壑 清 風 黃 葉 多
--10.0737 萬 里 遠 山 紅 葉 深
--10.4233 萬 壑 清 波 紅 一 流
--10.4802 萬 壑 清 風 黃 葉 深
--10.9060 萬 壑 清 風 紅 葉 多
-```
diff --git a/legacy/generate_chinese_poetry/README_en.md b/legacy/generate_chinese_poetry/README_en.md
deleted file mode 100644
index e7bfe9ebe54a4819a8b345e0b206ac2e9b25fe73..0000000000000000000000000000000000000000
--- a/legacy/generate_chinese_poetry/README_en.md
+++ /dev/null
@@ -1,114 +0,0 @@
-Running the sample code in this directory requires PaddlePaddle v0.10.0 or later. If the PaddlePaddle version in your runtime environment is lower than this requirement, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update PaddlePaddle.
-
----
-
-# Chinese Ancient Poetry Generation
-
-## Introduction
-Based on an encoder-decoder neural network model, we perform verse-to-verse (sequence-to-sequence) training on the Complete Tang Poems: given an input verse, the model generates the next verse.
-
-Both the encoder and the decoder are stacked bi-directional LSTMs, each with three layers by default, together with an attention unit.
-
-The following is a brief directory structure and description of this example:
-
-```text
-.
-├── data # store training data and dictionary
-│ ├── download.sh # download raw data
-├── README.md # documentation
-├── index.html # document (html format)
-├── preprocess.py # raw data preprocessing
-├── generate.py # generate verse script
-├── network_conf.py # model definition
-├── reader.py # data reading interface
-├── train.py # training script
-└── utils.py # define utility functions
-```
-
-## Data Processing
-### Raw Data Source
-The training data of this example is The Complete Tang Poems in the [Chinese ancient poetry database](https://github.com/chinese-poetry/chinese-poetry). There are about 54,000 Tang poems.
-
-### Downloading Raw Data
-```bash
-cd data && ./download.sh && cd ..
-```
-### Data Preprocessing
-```bash
-python preprocess.py --datadir data/raw --outfile data/poems.txt --dictfile data/dict.txt
-```
-
-After the above script is executed, the processed training data "poems.txt" and dictionary "dict.txt" will be generated. The dictionary is built at the character level, from characters that appear at least 10 times.
-
-Each line in poems.txt describes one poem in three columns: title, author, and content. Verses within the content are separated by `.`.
-
-Training data example:
-```text
-登鸛雀樓 王之渙 白日依山盡.黃河入海流.欲窮千里目.更上一層樓
-觀獵 李白 太守耀清威.乘閑弄晚暉.江沙橫獵騎.山火遶行圍.箭逐雲鴻落.鷹隨月兔飛.不知白日暮.歡賞夜方歸
-晦日重宴 陳嘉言 高門引冠蓋.下客抱支離.綺席珍羞滿.文場翰藻摛.蓂華彫上月.柳色藹春池.日斜歸戚里.連騎勒金羈
-```
-
-During training, each verse is used as the model input, and the next verse is used as the prediction target.
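-
-For example, the first poem above yields input/target pairs such as:
-
-```text
-input:  白日依山盡        target: 黃河入海流
-input:  黃河入海流        target: 欲窮千里目
-```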
-
-
-## Model Training
-The command line arguments in the training script, ["train.py"](./train.py), can be viewed with `python train.py --help`. The main parameters are as follows:
-- `num_passes`: number of passes
-- `batch_size`: batch size
-- `use_gpu`: whether to use GPU
-- `trainer_count`: number of trainers, the default is 1
-- `save_dir_path`: model storage path, the default is the models directory under the current directory
-- `encoder_depth`: model encoder LSTM depth, default 3
-- `decoder_depth`: model decoder LSTM depth, default 3
-- `train_data_path`: training data path
-- `word_dict_path`: data dictionary path
-- `init_model_path`: initial model path; no need to specify when training from scratch
-
-### Training Execution
-```bash
-python train.py \
- --num_passes 50 \
- --batch_size 256 \
- --use_gpu True \
- --trainer_count 1 \
- --save_dir_path models \
- --train_data_path data/poems.txt \
- --word_dict_path data/dict.txt \
- 2>&1 | tee train.log
-```
-After each training pass, the model parameters are saved under the directory "models". Training logs are stored in "train.log".
-
-### Optimal Model Parameters
-Find the pass with the lowest cost and use the model parameters corresponding to the pass for subsequent prediction.
-```bash
-python -c 'import utils; utils.find_optimal_pass("./train.log")'
-```
-
-## Generating Verses
-Use the ["generate.py"](./generate.py) script to generate the next verse for the input verses. Command line arguments can be viewed with `python generate.py --help`.
-The main parameters are described as follows:
-- `model_path`: trained model parameter file
-- `word_dict_path`: data dictionary path
-- `test_data_path`: input data path
-- `batch_size`: batch size, default is 1
-- `beam_size`: search size in beam search, the default is 5
-- `save_file`: output save path
-- `use_gpu`: whether to use GPU
-
-### Perform Generation
-For example, save the verse `孤帆遠影碧空盡` in the file `input.txt` as input. To predict the next verse, execute the command:
-```bash
-python generate.py \
- --model_path models/pass_00049.tar.gz \
- --word_dict_path data/dict.txt \
- --test_data_path input.txt \
- --save_file output.txt
-```
-The result will be saved in the file "output.txt". For the above example input, the generated verses are as follows:
-```text
--9.6987 萬 壑 清 風 黃 葉 多
--10.0737 萬 里 遠 山 紅 葉 深
--10.4233 萬 壑 清 波 紅 一 流
--10.4802 萬 壑 清 風 黃 葉 深
--10.9060 萬 壑 清 風 紅 葉 多
-```
diff --git a/legacy/generate_chinese_poetry/data/download.sh b/legacy/generate_chinese_poetry/data/download.sh
deleted file mode 100755
index 988c09c0f27c81854d2e090913d2972cb0ffbb51..0000000000000000000000000000000000000000
--- a/legacy/generate_chinese_poetry/data/download.sh
+++ /dev/null
@@ -1,11 +0,0 @@
-#!/bin/bash
-
-git clone https://github.com/chinese-poetry/chinese-poetry.git
-
-if [ ! -d raw ]
-then
- mkdir raw
-fi
-
-mv chinese-poetry/json/poet.tang.* raw/
-rm -rf chinese-poetry
diff --git a/legacy/generate_chinese_poetry/generate.py b/legacy/generate_chinese_poetry/generate.py
deleted file mode 100755
index 952de15fbcfdb1193c30c6828b73d3a6a825b473..0000000000000000000000000000000000000000
--- a/legacy/generate_chinese_poetry/generate.py
+++ /dev/null
@@ -1,107 +0,0 @@
-import os
-import sys
-import gzip
-import logging
-import numpy as np
-import click
-
-import reader
-import paddle.v2 as paddle
-from paddle.v2.layer import parse_network
-from network_conf import encoder_decoder_network
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.WARNING)
-
-
-def infer_a_batch(inferer, test_batch, beam_size, id_to_text, fout):
- beam_result = inferer.infer(input=test_batch, field=["prob", "id"])
- gen_sen_idx = np.where(beam_result[1] == -1)[0]
- assert len(gen_sen_idx) == len(test_batch) * beam_size, ("%d vs. %d" % (
- len(gen_sen_idx), len(test_batch) * beam_size))
-
- start_pos, end_pos = 1, 0
- for i, sample in enumerate(test_batch):
- fout.write("%s\n" % (
- " ".join([id_to_text[w] for w in sample[0][1:-1]])
-        ))  # skip the start and end marks when printing the source sentence
- for j in xrange(beam_size):
- end_pos = gen_sen_idx[i * beam_size + j]
- fout.write("%s\n" % ("%.4f\t%s" % (beam_result[0][i][j], " ".join(
- id_to_text[w] for w in beam_result[1][start_pos:end_pos - 1]))))
- start_pos = end_pos + 2
- fout.write("\n")
-    fout.flush()
-
-
-@click.command("generate")
-@click.option(
- "--model_path",
- default="",
- help="The path of the trained model for generation.")
-@click.option(
- "--word_dict_path", required=True, help="The path of word dictionary.")
-@click.option(
- "--test_data_path",
- required=True,
- help="The path of input data for generation.")
-@click.option(
- "--batch_size",
- default=1,
- help="The number of testing examples in one forward pass in generation.")
-@click.option(
- "--beam_size", default=5, help="The beam expansion in beam search.")
-@click.option(
- "--save_file",
- required=True,
- help="The file path to save the generated results.")
-@click.option(
- "--use_gpu", default=False, help="Whether to use GPU in generation.")
-def generate(model_path, word_dict_path, test_data_path, batch_size, beam_size,
- save_file, use_gpu):
- assert os.path.exists(model_path), "The given model does not exist."
- assert os.path.exists(test_data_path), "The given test data does not exist."
-
- with gzip.open(model_path, "r") as f:
- parameters = paddle.parameters.Parameters.from_tar(f)
-
- id_to_text = {}
- assert os.path.exists(
- word_dict_path), "The given word dictionary path does not exist."
- with open(word_dict_path, "r") as f:
- for i, line in enumerate(f):
- id_to_text[i] = line.strip().split("\t")[0]
-
- paddle.init(use_gpu=use_gpu, trainer_count=1)
- beam_gen = encoder_decoder_network(
- word_count=len(id_to_text),
- emb_dim=512,
- encoder_depth=3,
- encoder_hidden_dim=512,
- decoder_depth=3,
- decoder_hidden_dim=512,
- bos_id=0,
- eos_id=1,
- max_length=9,
- beam_size=beam_size,
- is_generating=True)
-
- inferer = paddle.inference.Inference(
- output_layer=beam_gen, parameters=parameters)
-
- test_batch = []
- with open(save_file, "w") as fout:
- for idx, item in enumerate(
- reader.gen_reader(test_data_path, word_dict_path)()):
- test_batch.append([item])
- if len(test_batch) == batch_size:
- infer_a_batch(inferer, test_batch, beam_size, id_to_text, fout)
- test_batch = []
-
- if len(test_batch):
- infer_a_batch(inferer, test_batch, beam_size, id_to_text, fout)
- test_batch = []
-
-
-if __name__ == "__main__":
- generate()
diff --git a/legacy/generate_chinese_poetry/network_conf.py b/legacy/generate_chinese_poetry/network_conf.py
deleted file mode 100755
index b1314bd631142234e14d064e5e4aa1f47eaf16f4..0000000000000000000000000000000000000000
--- a/legacy/generate_chinese_poetry/network_conf.py
+++ /dev/null
@@ -1,128 +0,0 @@
-import paddle.v2 as paddle
-from paddle.v2.layer import parse_network
-
-__all__ = ["encoder_decoder_network"]
-
-
-def _bidirect_lstm_encoder(input, hidden_dim, depth):
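-    """
-    Build a stacked bidirectional LSTM encoder: two `depth`-layer LSTM
-    stacks are created, one whose first layer runs forward and one whose
-    first layer runs backward, with the direction alternating at every
-    layer (see the `reverse` argument below); the collected LSTM outputs
-    are concatenated into the final sequence encoding.
-    """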
- lstm_last = []
- for dirt in ["fwd", "bwd"]:
- for i in range(depth):
- input_proj = paddle.layer.mixed(
- name="__in_proj_%0d_%s__" % (i, dirt),
- size=hidden_dim * 4,
- bias_attr=True,
- input=[
- paddle.layer.full_matrix_projection(input_proj),
- paddle.layer.full_matrix_projection(
- lstm, param_attr=paddle.attr.Param(initial_std=5e-4)),
- ] if i else [paddle.layer.full_matrix_projection(input)])
- lstm = paddle.layer.lstmemory(
- input=input_proj,
- bias_attr=paddle.attr.Param(initial_std=0.),
- param_attr=paddle.attr.Param(initial_std=5e-4),
- reverse=i % 2 if dirt == "fwd" else not i % 2)
- lstm_last.append(lstm)
- return paddle.layer.concat(input=lstm_last)
-
-
-def _attended_decoder_step(word_count, enc_out, enc_out_proj,
- decoder_hidden_dim, depth, trg_emb):
- decoder_memory = paddle.layer.memory(
- name="__decoder_0__", size=decoder_hidden_dim, boot_layer=None)
-
- context = paddle.networks.simple_attention(
- encoded_sequence=enc_out,
- encoded_proj=enc_out_proj,
- decoder_state=decoder_memory)
-
- for i in range(depth):
- input_proj = paddle.layer.mixed(
- act=paddle.activation.Linear(),
- size=decoder_hidden_dim * 4,
- bias_attr=False,
- input=[
- paddle.layer.full_matrix_projection(input_proj),
- paddle.layer.full_matrix_projection(lstm)
- ] if i else [
- paddle.layer.full_matrix_projection(context),
- paddle.layer.full_matrix_projection(trg_emb)
- ])
- lstm = paddle.networks.lstmemory_unit(
- input=input_proj,
- input_proj_layer_attr=paddle.attr.ExtraLayerAttribute(
- error_clipping_threshold=25.),
- out_memory=decoder_memory if not i else None,
- name="__decoder_%d__" % (i),
- size=decoder_hidden_dim,
- act=paddle.activation.Tanh(),
- gate_act=paddle.activation.Sigmoid(),
- state_act=paddle.activation.Tanh())
-
- next_word = paddle.layer.fc(size=word_count,
- bias_attr=True,
- act=paddle.activation.Softmax(),
- input=lstm)
- return next_word
-
-
-def encoder_decoder_network(word_count,
- emb_dim,
- encoder_depth,
- encoder_hidden_dim,
- decoder_depth,
- decoder_hidden_dim,
- bos_id,
- eos_id,
- max_length,
- beam_size=10,
- is_generating=False):
- src_emb = paddle.layer.embedding(
- input=paddle.layer.data(
- name="src_word_id",
- type=paddle.data_type.integer_value_sequence(word_count)),
- size=emb_dim,
- param_attr=paddle.attr.ParamAttr(name="__embedding__"))
- enc_out = _bidirect_lstm_encoder(
- input=src_emb, hidden_dim=encoder_hidden_dim, depth=encoder_depth)
- enc_out_proj = paddle.layer.fc(act=paddle.activation.Linear(),
- size=encoder_hidden_dim,
- bias_attr=False,
- input=enc_out)
-
- decoder_group_name = "decoder_group"
- group_inputs = [
- word_count, paddle.layer.StaticInput(input=enc_out),
- paddle.layer.StaticInput(input=enc_out_proj), decoder_hidden_dim,
- decoder_depth
- ]
-
- if is_generating:
- gen_trg_emb = paddle.layer.GeneratedInput(
- size=word_count,
- embedding_name="__embedding__",
- embedding_size=emb_dim)
- return paddle.layer.beam_search(
- name=decoder_group_name,
- step=_attended_decoder_step,
- input=group_inputs + [gen_trg_emb],
- bos_id=bos_id,
- eos_id=eos_id,
- beam_size=beam_size,
- max_length=max_length)
-
- else:
- trg_emb = paddle.layer.embedding(
- input=paddle.layer.data(
- name="trg_word_id",
- type=paddle.data_type.integer_value_sequence(word_count)),
- size=emb_dim,
- param_attr=paddle.attr.ParamAttr(name="__embedding__"))
- lbl = paddle.layer.data(
- name="trg_next_word",
- type=paddle.data_type.integer_value_sequence(word_count))
- next_word = paddle.layer.recurrent_group(
- name=decoder_group_name,
- step=_attended_decoder_step,
- input=group_inputs + [trg_emb])
- return paddle.layer.classification_cost(input=next_word, label=lbl)
diff --git a/legacy/generate_chinese_poetry/preprocess.py b/legacy/generate_chinese_poetry/preprocess.py
deleted file mode 100755
index 4018e2e3cb83e00b4c65489f88b46b71f6f20a8f..0000000000000000000000000000000000000000
--- a/legacy/generate_chinese_poetry/preprocess.py
+++ /dev/null
@@ -1,76 +0,0 @@
-# -*- coding: utf-8 -*-
-import os
-import io
-import re
-import json
-import click
-import collections
-
-
-def build_vocabulary(dataset, cutoff=0):
- dictionary = collections.defaultdict(int)
- for data in dataset:
- for sent in data[2]:
- for char in sent:
- dictionary[char] += 1
- dictionary = filter(lambda x: x[1] >= cutoff, dictionary.items())
- dictionary = sorted(dictionary, key=lambda x: (-x[1], x[0]))
- vocab, _ = list(zip(*dictionary))
-    return (u"<unk>", u"<s>", u"<e>") + vocab
-
-
-@click.command("preprocess")
-@click.option("--datadir", type=str, help="Path to raw data")
-@click.option("--outfile", type=str, help="Path to save the training data")
-@click.option("--dictfile", type=str, help="Path to save the dictionary file")
-def preprocess(datadir, outfile, dictfile):
- dataset = []
-    note_pattern1 = re.compile(u"((.*?))", re.U)
-    note_pattern2 = re.compile(u"〖.*?〗", re.U)
-    note_pattern3 = re.compile(u"-.*?-。?", re.U)
-    note_pattern4 = re.compile(u"((.*$", re.U)
-    note_pattern5 = re.compile(u"。。.*))$", re.U)
-    note_pattern6 = re.compile(u"。。", re.U)
-    note_pattern7 = re.compile(u"[《》「」\[\]]", re.U)
- print("Load raw data...")
- for fn in os.listdir(datadir):
- with io.open(os.path.join(datadir, fn), "r", encoding="utf8") as f:
- for data in json.load(f):
- title = data['title']
- author = data['author']
- p = "".join(data['paragraphs'])
- p = "".join(p.split())
- p = note_pattern1.sub(u"", p)
- p = note_pattern2.sub(u"", p)
- p = note_pattern3.sub(u"", p)
- p = note_pattern4.sub(u"", p)
- p = note_pattern5.sub(u"。", p)
- p = note_pattern6.sub(u"。", p)
- p = note_pattern7.sub(u"", p)
-            if (p == u"" or u"{" in p or u"}" in p or u"{" in p or
-                    u"}" in p or u"、" in p or u":" in p or u";" in p or
-                    u"!" in p or u"?" in p or u"●" in p or u"□" in p or
-                    u"囗" in p or u")" in p):
- continue
- paragraphs = re.split(u"。|,", p)
- paragraphs = filter(lambda x: len(x), paragraphs)
- if len(paragraphs) > 1:
- dataset.append((title, author, paragraphs))
-
- print("Construct vocabularies...")
- vocab = build_vocabulary(dataset, cutoff=10)
- with io.open(dictfile, "w", encoding="utf8") as f:
- for v in vocab:
- f.write(v + "\n")
-
- print("Write processed data...")
- with io.open(outfile, "w", encoding="utf8") as f:
- for data in dataset:
- title = data[0]
- author = data[1]
- paragraphs = ".".join(data[2])
- f.write("\t".join((title, author, paragraphs)) + "\n")
-
-
-if __name__ == "__main__":
- preprocess()
diff --git a/legacy/generate_chinese_poetry/reader.py b/legacy/generate_chinese_poetry/reader.py
deleted file mode 100755
index 480db9dddf9e3ab6492b64ee11585ab77ea33bd7..0000000000000000000000000000000000000000
--- a/legacy/generate_chinese_poetry/reader.py
+++ /dev/null
@@ -1,51 +0,0 @@
-from utils import load_dict
-
-
-def train_reader(data_file_path, word_dict_file):
- def reader():
- word_dict = load_dict(word_dict_file)
-
-        unk_id = word_dict[u"<unk>"]
-        bos_id = word_dict[u"<s>"]
-        eos_id = word_dict[u"<e>"]
-
- with open(data_file_path, "r") as f:
- for line in f:
- line_split = line.strip().decode(
- "utf8", errors="ignore").split("\t")
- if len(line_split) < 3: continue
-
- poetry = line_split[2].split(".")
- poetry_ids = []
- for sen in poetry:
- if sen:
- poetry_ids.append([bos_id] + [
- word_dict.get(word, unk_id)
- for word in "".join(sen.split())
- ] + [eos_id])
- l = len(poetry_ids)
- if l < 2: continue
- for i in range(l - 1):
- yield poetry_ids[i], poetry_ids[i + 1][:-1], poetry_ids[
- i + 1][1:]
-
- return reader
-
-
-def gen_reader(data_file_path, word_dict_file):
- def reader():
- word_dict = load_dict(word_dict_file)
-
-        unk_id = word_dict[u"<unk>"]
-        bos_id = word_dict[u"<s>"]
-        eos_id = word_dict[u"<e>"]
-
- with open(data_file_path, "r") as f:
- for line in f:
- input_line = "".join(line.strip().decode(
- "utf8", errors="ignore").split())
- yield [bos_id] + [
- word_dict.get(word, unk_id) for word in input_line
- ] + [eos_id]
-
- return reader
diff --git a/legacy/generate_chinese_poetry/train.py b/legacy/generate_chinese_poetry/train.py
deleted file mode 100755
index 911c460a5dc94f8226735a26be898981c083c548..0000000000000000000000000000000000000000
--- a/legacy/generate_chinese_poetry/train.py
+++ /dev/null
@@ -1,134 +0,0 @@
-import os
-import gzip
-import logging
-import click
-
-import paddle.v2 as paddle
-import reader
-from paddle.v2.layer import parse_network
-from network_conf import encoder_decoder_network
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def save_model(trainer, save_path, parameters):
- with gzip.open(save_path, "w") as f:
- trainer.save_parameter_to_tar(f)
-
-
-def load_initial_model(model_path, parameters):
- with gzip.open(model_path, "rb") as f:
- parameters.init_from_tar(f)
-
-
-@click.command("train")
-@click.option(
- "--num_passes", default=10, help="Number of passes for the training task.")
-@click.option(
- "--batch_size",
- default=16,
- help="The number of training examples in one forward/backward pass.")
-@click.option(
- "--use_gpu", default=False, help="Whether to use gpu to train the model.")
-@click.option(
- "--trainer_count", default=1, help="The thread number used in training.")
-@click.option(
- "--save_dir_path",
- default="models",
-    help="The path to save the trained models.")
-@click.option(
- "--encoder_depth",
- default=3,
- help="The number of stacked LSTM layers in encoder.")
-@click.option(
- "--decoder_depth",
- default=3,
- help="The number of stacked LSTM layers in decoder.")
-@click.option(
-    "--train_data_path", required=True, help="The path of training data.")
-@click.option(
- "--word_dict_path", required=True, help="The path of word dictionary.")
-@click.option(
- "--init_model_path",
- default="",
-    help=("The path of a trained model used to initialize all "
-          "the model parameters."))
-def train(num_passes,
- batch_size,
- use_gpu,
- trainer_count,
- save_dir_path,
- encoder_depth,
- decoder_depth,
- train_data_path,
- word_dict_path,
- init_model_path=""):
- if not os.path.exists(save_dir_path):
- os.mkdir(save_dir_path)
- assert os.path.exists(
- word_dict_path), "The given word dictionary does not exist."
- assert os.path.exists(
- train_data_path), "The given training data does not exist."
-
- # initialize PaddlePaddle
- paddle.init(use_gpu=use_gpu, trainer_count=trainer_count)
-
- # define optimization method and the trainer instance
- optimizer = paddle.optimizer.Adam(
- learning_rate=1e-4,
- regularization=paddle.optimizer.L2Regularization(rate=1e-5),
- model_average=paddle.optimizer.ModelAverage(
- average_window=0.5, max_average_window=2500))
-
- cost = encoder_decoder_network(
- word_count=len(open(word_dict_path, "r").readlines()),
- emb_dim=512,
- encoder_depth=encoder_depth,
- encoder_hidden_dim=512,
- decoder_depth=decoder_depth,
- decoder_hidden_dim=512,
- bos_id=0,
- eos_id=1,
- max_length=9)
-
- parameters = paddle.parameters.create(cost)
- if init_model_path:
- load_initial_model(init_model_path, parameters)
-
- trainer = paddle.trainer.SGD(cost=cost,
- parameters=parameters,
- update_equation=optimizer)
-
- # define data reader
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.train_reader(train_data_path, word_dict_path),
- buf_size=1024000),
- batch_size=batch_size)
-
- # define the event_handler callback
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if (not event.batch_id % 1000) and event.batch_id:
- save_path = os.path.join(save_dir_path,
- "pass_%05d_batch_%05d.tar.gz" %
- (event.pass_id, event.batch_id))
- save_model(trainer, save_path, parameters)
-
- if not event.batch_id % 10:
- logger.info("Pass %d, Batch %d, Cost %f, %s" % (
- event.pass_id, event.batch_id, event.cost, event.metrics))
-
- if isinstance(event, paddle.event.EndPass):
- save_path = os.path.join(save_dir_path,
- "pass_%05d.tar.gz" % event.pass_id)
- save_model(trainer, save_path, parameters)
-
- # start training
- trainer.train(
- reader=train_reader, event_handler=event_handler, num_passes=num_passes)
-
-
-if __name__ == "__main__":
- train()
diff --git a/legacy/generate_chinese_poetry/utils.py b/legacy/generate_chinese_poetry/utils.py
deleted file mode 100755
index f8a20bf4203bc091c8002953b3b3d7df12be25ef..0000000000000000000000000000000000000000
--- a/legacy/generate_chinese_poetry/utils.py
+++ /dev/null
@@ -1,29 +0,0 @@
-import os
-import sys
-import re
-from collections import defaultdict
-
-
-def load_dict(word_dict_file):
- word_dict = {}
- with open(word_dict_file, "r") as fin:
- for i, line in enumerate(fin):
- key = line.strip().decode("utf8", errors="ignore").split("\t")[0]
- word_dict[key] = i
- return word_dict
-
-
-def find_optimal_pass(log_file):
- cost_info = defaultdict(list)
-    cost_pat = re.compile(r'Cost\s[\d]+\.[\d]+')
- pass_pat = re.compile(r'Pass\s[\d]+')
- with open(log_file, 'r') as flog:
- for line in flog:
- if not 'Cost' in line: continue
- pass_id = pass_pat.findall(line.strip())[0]
- cost = float(cost_pat.findall(line.strip())[0].replace('Cost ', ''))
- cost_info[pass_id].append(cost)
- print("optimal pass : %s" % sorted(
- cost_info.iteritems(),
- key=lambda x: sum(x[1]) / (len(x[1])),
- reverse=False)[0][0])
diff --git a/legacy/generate_sequence_by_rnn_lm/.gitignore b/legacy/generate_sequence_by_rnn_lm/.gitignore
deleted file mode 100644
index 203ec9a67426fee99e6228716433bb1bec8ff14f..0000000000000000000000000000000000000000
--- a/legacy/generate_sequence_by_rnn_lm/.gitignore
+++ /dev/null
@@ -1,3 +0,0 @@
-*.pyc
-*.tar.gz
-models
diff --git a/legacy/generate_sequence_by_rnn_lm/README.md b/legacy/generate_sequence_by_rnn_lm/README.md
deleted file mode 100644
index 756c60d67ec6d27d3f90e1783e300190a0010154..0000000000000000000000000000000000000000
--- a/legacy/generate_sequence_by_rnn_lm/README.md
+++ /dev/null
@@ -1,166 +0,0 @@
-Running the sample code in this directory requires PaddlePaddle v0.10.0. If the PaddlePaddle version installed in your environment is lower than this requirement, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update PaddlePaddle.
-
----
-
-# Generating Text with a Recurrent Neural Network Language Model
-
-A language model is a probability distribution model; simply put, it computes the probability of a sentence. With it, one can determine which word sequence is more likely, or, given several preceding words, predict the most likely next word. Language models are an important basic model in natural language processing.
-
-## Application Scenarios
-**Language models are applied in many fields**, such as:
-
-* **Automatic writing**: a language model can generate the next word from the preceding text; applied recursively, it can generate whole sentences, paragraphs, and articles.
-* **QA**: a language model can generate an Answer from a Question.
-* **Machine translation**: most mainstream machine translation models today follow the encoder-decoder paradigm, where the decoder is a conditional language model used to generate the target language.
-* **Spelling check**: a language model can compute the probability of a word sequence; the probability usually drops sharply at a spelling error, which can be used to detect spelling errors and provide correction candidates.
-* **Part-of-speech tagging, syntactic parsing, speech recognition......**
-
-## About This Example
-This example implements an RNN-based language model and uses it to generate text. The directory structure of this example is as follows:
-
-```text
-.
-├── data
-│   └── train_data_examples.txt        # sample data; provide your own data following this format
-├── config.py                # configuration file, covering data, training and inference settings
-├── generate.py              # prediction-task script, i.e. text generation
-├── beam_search.py           # implementation of the beam search algorithm
-├── network_conf.py          # all network structures used in this example are defined here; modify this file to change the model structure
-├── reader.py                # data reading interface
-├── README.md
-├── train.py                 # training-task script
-└── utils.py                 # common utility functions, e.g. building and loading the dictionary
-```
-
-## RNN Language Model
-### Introduction
-
-An RNN is a sequence model. The basic idea is: at time $t$, the hidden-layer output of the previous time step $t-1$ and the word vector of time $t$ are fed into the hidden layer together to obtain the feature representation of time $t$, which is then used to produce the prediction for time $t$; the recursion continues along the time dimension. RNNs are thus good at exploiting context and history, i.e. they have a "memory". In theory an RNN can model "long-range dependencies" (exploiting knowledge from long ago), but in practice this works poorly, so variants such as LSTM and GRU were proposed; they improve the memory cell of the vanilla RNN with gating mechanisms and overcome the difficulty of learning long sequences. This example uses LSTM or GRU units, selectable in the configuration.
-
-(Figure: an illustration of the "recurrence" in an RNN language model, where RNN broadly includes LSTM, GRU, etc.)
-
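-As a rough reference, the recurrence above can be written as the following generic RNN formulation (the LSTM/GRU variants used in this example refine the transition $f$ with gates):
-
-$$h_t = f(W_{xh} x_t + W_{hh} h_{t-1} + b_h), \qquad P(w_{t+1} \mid w_{1:t}) = \mathrm{softmax}(W_{hy} h_t + b_y)$$
-
-where $x_t$ is the word vector and $h_t$ the hidden state at time $t$.
-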
-### Model Implementation
-
-A brief overview of the RNN language model implementation in this example:
-
-- **Define the model parameters**: `config.py` defines the model's parameter variables.
-- **Define the model structure**: the `rnn_lm` **function** in `network_conf.py` defines the model's **structure**, as follows:
-    - Input layer: maps the input word (or character) sequence to vectors, i.e. the embedding layer `embedding`.
-    - Middle layer: implements the RNN layer according to the configuration, taking the `embedding` vector sequence from the previous step as input.
-    - Output layer: uses `softmax` to normalize and compute word probabilities.
-    - Loss: multi-class cross-entropy is used as the model's loss function.
-- **Train the model**: the `main` function in `train.py` implements training as follows:
-    - Prepare the input data: build and save the dictionary, and construct the readers for the train and test data.
-    - Initialize the model: including the model's structure and parameters.
-    - Build the trainer: this demo uses the Adam optimization algorithm.
-    - Define the callback: build an `event_handler` to track the loss during training and save the model parameters at the end of each pass.
-    - Train: train the model with the trainer.
-
-- **Generate text**: `generate.py` implements text generation as follows:
-    - Load the trained model and the dictionary file.
-    - Read the `gen_file` file, in which each line is a sentence prefix, and generate text for each prefix with the [beam search algorithm](https://github.com/PaddlePaddle/book/blob/develop/08.machine_translation/README.cn.md#柱搜索算法).
-    - Save the generated text together with its prefix to the file `gen_result`.
-
-## Usage
-
-Run this example as follows:
-
-1. Run `python train.py` to train the model (LSTM by default) and wait for training to finish.
-2. Run `python generate.py` to generate text. (The input defaults to `data/train_data_examples.txt`; the generated text is saved to `data/gen_result.txt` by default.)
-
-
-**To use your own corpus and customize the model, modify the configuration in `config.py`; the details of the adaptation work are as follows:**
-
-
-### Corpus adaptation
-
-* Clean the corpus: remove spaces, tabs and garbled characters from the raw text; remove digits, punctuation and special symbols as needed.
-* Format: one sentence per line; separate the words within a line with single spaces.
-* Configure the following parameters in `config.py` as needed:
-
-  ```python
-  train_file = "data/train_data_examples.txt"
-  test_file = ""
-
-  vocab_file = "data/word_vocab.txt"
-  model_save_dir = "models"
-  ```
-  1. `train_file`: path of the training data, **which must be tokenized in advance**.
-  2. `test_file`: path of the test data; if it is not empty, the model is evaluated on it at the end of each training `pass`.
-  3. `vocab_file`: path of the dictionary; if the file does not exist, a dictionary is built from the word-frequency statistics of the training corpus.
-  4. `model_save_dir`: directory where models are saved; created automatically if it does not exist.
-
-### Dictionary-building strategy
-- If the specified dictionary file does not exist, word frequencies are counted over the training data and the dictionary is built automatically. Two parameters in `config.py` control the process:
-
-  ```python
-  max_word_num = 51200 - 2
-  cutoff_word_fre = 0
-  ```
-  1. `max_word_num`: how many words the dictionary may contain at most.
-  2. `cutoff_word_fre`: the minimum frequency in the training corpus for a word to enter the dictionary.
-- For example, with `max_word_num = 5000` and `cutoff_word_fre = 10`, if the frequency statistics find only 3000 words occurring more than 10 times in the training corpus, the final dictionary contains those 3000 words.
-- Two special tokens are always added when the dictionary is built:
-  1. `<unk>`: any word that does not appear in the dictionary
-  2. `<e>`: the end-of-sentence mark
-
-  *Note: a larger dictionary yields richer generated text but longer training. After Chinese word segmentation a corpus can contain tens or even hundreds of thousands of distinct words; if `max_word_num` is too small, the share of `<unk>` becomes too high, while a very large `max_word_num` slows training severely (and also affects accuracy). An alternative is to train "by character", treating every Chinese character as a word: there are only a few thousand common characters, so the dictionary stays small without losing much information. However, the same character can mean very different things in different words, which sometimes hurts the model. Experiment and choose between "by word" and "by character" according to your situation.*
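-
-The resulting dictionary size follows a simple rule; here is a minimal sketch of it (assuming a frequency table `word_freq`, a name of ours, not from `utils.py`):
-
-```python
-# words below the frequency cutoff are dropped first, then the list is
-# truncated to max_word_num; <unk> and <e> are always added on top
-n_above_cutoff = sum(1 for freq in word_freq.values() if freq >= cutoff_word_fre)
-dict_size = min(max_word_num, n_above_cutoff) + 2
-```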
-
-### Model adaptation and training
-
-* Adjust the following settings in `config.py` to change the network structure of the RNN language model:
-
-  ```python
-  rnn_type = "lstm"  # "gru" or "lstm"
-  emb_dim = 256
-  hidden_size = 256
-  stacked_rnn_num = 2
-  ```
-  1. `rnn_type`: either "gru" or "lstm", selecting which RNN unit to use.
-  2. `emb_dim`: the dimension of the word embeddings.
-  3. `hidden_size`: the size of the RNN unit's hidden layer.
-  4. `stacked_rnn_num`: the number of stacked RNN units, forming a deeper model.
-
-* Run `python train.py` to train the model; models are saved to the directory given by `model_save_dir`.
-
-### Generating text on demand
-
-* Adjust the following variables in `config.py` as needed:
-
-  ```python
-  gen_file = "data/train_data_examples.txt"
-  gen_result = "data/gen_result.txt"
-  max_gen_len = 25  # the max number of words to generate
-  beam_size = 5
-  model_path = "models/rnn_lm_pass_00000.tar.gz"
-  ```
-  1. `gen_file`: the input file; each line is one sentence prefix, **which must be tokenized in advance**.
-  2. `gen_result`: the output file to which the generation results are written.
-  3. `max_gen_len`: the maximum length of each generated sentence; if the model fails to generate `<e>`, generation stops automatically after `max_gen_len` words.
-  4. `beam_size`: the expansion width of each beam search step.
-  5. `model_path`: path of the trained model.
-
-  `gen_file` holds the text prefixes to be completed, one prefix per line, e.g.:
-
-  ```text
-  若隐若现 地像 幽灵 , 像 死神
-  ```
-  Store the prefixes to be completed in this format;
-
-* Run `python generate.py` to generate text for the input prefixes with beam search. Below is sample output of the model:
-
-  ```text
-  81    若隐若现 地像 幽灵 , 像 死神
-  -12.2542    一样 。 他 是 个 怪物 <e>
-  -12.6889    一样 。 他 是 个 英雄 <e>
-  -13.9877    一样 。 他 是 我 的 敌人 <e>
-  -14.2741    一样 。 他 是 我 的 <e>
-  -14.6250    一样 。 他 是 我 的 朋友 <e>
-  ```
-  Where:
-  1. The first line `81  若隐若现 地像 幽灵 , 像 死神` is separated by `\t` into two columns:
-     - the first column is the index of the input prefix in the training sample set;
-     - the second column is the input prefix itself.
-  2. Lines 2 through `beam_size + 1` are the generation results, likewise separated by `\t` into two columns:
-     - the first column is the log probability of the generated sequence;
-     - the second column is the generated text. A normal generation ends with the mark `<e>`; if a result does not end with `<e>`, the maximum sequence length was exceeded and generation was forcibly stopped.
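-
-A minimal sketch (ours, not part of this example) of reading `gen_result` back into (prefix, candidates) groups, assuming the format above:
-
-```python
-def read_gen_result(path="data/gen_result.txt"):
-    groups, current = [], None
-    with open(path) as f:
-        for line in f:
-            line = line.rstrip("\n")
-            if not line:  # a blank line closes one prefix's group
-                current = None
-                continue
-            first, text = line.split("\t", 1)
-            if current is None:  # an "index\tprefix" line opens a group
-                current = {"prefix": text, "candidates": []}
-                groups.append(current)
-            else:  # a "log_prob\tgeneration" line
-                current["candidates"].append((float(first), text))
-    return groups
-```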
diff --git a/legacy/generate_sequence_by_rnn_lm/beam_search.py b/legacy/generate_sequence_by_rnn_lm/beam_search.py
deleted file mode 100644
index f6d1d3646cbd6b42a28b8531d2f11fe2856a5c4d..0000000000000000000000000000000000000000
--- a/legacy/generate_sequence_by_rnn_lm/beam_search.py
+++ /dev/null
@@ -1,172 +0,0 @@
-import os
-import math
-import numpy as np
-
-import paddle.v2 as paddle
-
-from utils import logger, load_reverse_dict
-
-__all__ = ["BeamSearch"]
-
-
-class BeamSearch(object):
- """
- Generating sequence by beam search
- NOTE: this class only implements generating one sentence at a time.
- """
-
- def __init__(self, inferer, word_dict_file, beam_size=1, max_gen_len=100):
- """
- constructor method.
-
-        :param inferer: object of paddle.inference.Inference used to run
-            forward computation on a test batch
-        :type inferer: paddle.inference.Inference
- :param word_dict_file: path of word dictionary file
- :type word_dict_file: str
- :param beam_size: expansion width in each iteration
-        :type beam_size: int
- :param max_gen_len: the maximum number of iterations
- :type max_gen_len: int
- """
- self.inferer = inferer
- self.beam_size = beam_size
- self.max_gen_len = max_gen_len
- self.ids_2_word = load_reverse_dict(word_dict_file)
- logger.info("dictionay len = %d" % (len(self.ids_2_word)))
-
-        try:
-            self.eos_id = next(x[0] for x in self.ids_2_word.items()
-                               if x[1] == "<e>")
-            self.unk_id = next(x[0] for x in self.ids_2_word.items()
-                               if x[1] == "<unk>")
-        except StopIteration:
-            logger.fatal(("the word dictionary must contain the ending mark "
-                          "<e> in the text generation task."))
-
- self.candidate_paths = []
- self.final_paths = []
-
-    def _top_k(self, softmax_out, k):
-        """
-        get indices of the words with the k highest probabilities.
-        NOTE: <unk> is excluded. If it is among the top k words, the word
-        with the (k + 1)-th highest probability is returned instead.
-
-        :param softmax_out: probability distribution over the dictionary
-        :type softmax_out: ndarray
-        :param k: number of word indices to return
-        :type k: int
-        :return: indices of the k words with the highest probabilities
-        :rtype: list
-        """
-        # sort word ids by probability in descending order, drop <unk>,
-        # and keep the first k ids
-        ids = softmax_out.argsort()[::-1]
-        return ids[ids != self.unk_id][:k]
-
- def _forward_batch(self, batch):
- """
- forward a test batch.
-
-        :param batch: the input data batch
-        :type batch: list
-        :return: probabilities of the predicted words
- :rtype: ndarray
- """
- return self.inferer.infer(input=batch, field=["value"])
-
- def _beam_expand(self, next_word_prob):
- """
-        In every iteration step, the model predicts the possible next words.
-        For each input sentence, the top k words are appended to the end of
-        the original sentence to form a new generated sentence.
-
-        :param next_word_prob: probabilities of the next words
-        :type next_word_prob: ndarray
-        :return: the expanded new sentences
-        :rtype: list
- """
- assert len(next_word_prob) == len(self.candidate_paths), (
- "Wrong forward computing results!")
- top_beam_words = np.apply_along_axis(self._top_k, 1, next_word_prob,
- self.beam_size)
- new_paths = []
- for i, words in enumerate(top_beam_words):
- old_path = self.candidate_paths[i]
- for w in words:
- log_prob = old_path["log_prob"] + math.log(next_word_prob[i][w])
- gen_ids = old_path["ids"] + [w]
- if w == self.eos_id:
- self.final_paths.append({
- "log_prob": log_prob,
- "ids": gen_ids
- })
- else:
- new_paths.append({"log_prob": log_prob, "ids": gen_ids})
- return new_paths
-
- def _beam_shrink(self, new_paths):
- """
-        Keep the top beam_size generated sequences with the highest
-        probabilities at the end of every generation iteration.
-
- :param new_paths: all possible generated sentences
- :type new_paths: list
- :return: a state flag to indicate whether to stop beam search
- :rtype: bool
- """
-
- if len(self.final_paths) >= self.beam_size:
- max_candidate_log_prob = max(
- new_paths, key=lambda x: x["log_prob"])["log_prob"]
- min_complete_path_log_prob = min(
- self.final_paths, key=lambda x: x["log_prob"])["log_prob"]
- if min_complete_path_log_prob >= max_candidate_log_prob:
- return True
-
- new_paths.sort(key=lambda x: x["log_prob"], reverse=True)
- self.candidate_paths = new_paths[:self.beam_size]
- return False
-
- def gen_a_sentence(self, input_sentence):
- """
-        generate a sequence for a given input
-
- :param input_sentence: one input_sentence
- :type input_sentence: list
- :return: the generated word sequences
- :rtype: list
- """
- self.candidate_paths = [{"log_prob": 0., "ids": input_sentence}]
- input_len = len(input_sentence)
-
- for i in range(self.max_gen_len):
- next_word_prob = self._forward_batch(
- [[x["ids"]] for x in self.candidate_paths])
- new_paths = self._beam_expand(next_word_prob)
-
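-            # prune finished paths that already score below the weakest
-            # newly expanded candidate path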
- min_candidate_log_prob = min(
- new_paths, key=lambda x: x["log_prob"])["log_prob"]
-
- path_to_remove = [
- path for path in self.final_paths
- if path["log_prob"] < min_candidate_log_prob
- ]
- for p in path_to_remove:
- self.final_paths.remove(p)
-
- if self._beam_shrink(new_paths):
- self.candidate_paths = []
- break
-
- gen_ids = sorted(
- self.final_paths + self.candidate_paths,
- key=lambda x: x["log_prob"],
- reverse=True)[:self.beam_size]
- self.final_paths = []
-
- def _to_str(x):
- text = " ".join(self.ids_2_word[idx]
- for idx in x["ids"][input_len:])
- return "%.4f\t%s" % (x["log_prob"], text)
-
-        return [_to_str(x) for x in gen_ids]
diff --git a/legacy/generate_sequence_by_rnn_lm/config.py b/legacy/generate_sequence_by_rnn_lm/config.py
deleted file mode 100644
index 23ae2e615e5baefd64db703d85c40778ecac7b5f..0000000000000000000000000000000000000000
--- a/legacy/generate_sequence_by_rnn_lm/config.py
+++ /dev/null
@@ -1,44 +0,0 @@
-import os
-
-################## for building word dictionary ##################
-
-max_word_num = 51200 - 2
-cutoff_word_fre = 0
-
-################## for training task #########################
-# path of training data
-train_file = "data/train_data_examples.txt"
-# path of testing data, if testing file does not exist,
-# testing will not be performed at the end of each training pass
-test_file = ""
-# path of word dictionary, if this file does not exist,
-# word dictionary will be built from training data.
-vocab_file = "data/word_vocab.txt"
-# directory to save the trained model;
-# a new directory is created if it does not exist
-model_save_dir = "models"
-
-batch_size = 32 # the number of training examples in one forward/backward pass
-num_passes = 20 # how many passes to train the model
-
-log_period = 50
-save_period_by_batches = 50
-
-use_gpu = False # to use gpu or not
-trainer_count = 1  # number of trainers
-
-################## for model configuration ##################
-rnn_type = "lstm" # "gru" or "lstm"
-emb_dim = 256
-hidden_size = 256
-stacked_rnn_num = 2
-
-################## for text generation ##################
-gen_file = "data/train_data_examples.txt"
-gen_result = "data/gen_result.txt"
-max_gen_len = 25 # the max number of words to generate
-beam_size = 5
-model_path = "models/rnn_lm_pass_00000.tar.gz"
-
-if not os.path.exists(model_save_dir):
- os.mkdir(model_save_dir)
diff --git a/legacy/generate_sequence_by_rnn_lm/data/train_data_examples.txt b/legacy/generate_sequence_by_rnn_lm/data/train_data_examples.txt
deleted file mode 100644
index db1ad611b0eb6882aac617baeebeea7c029eff7c..0000000000000000000000000000000000000000
--- a/legacy/generate_sequence_by_rnn_lm/data/train_data_examples.txt
+++ /dev/null
@@ -1,5 +0,0 @@
-我们 不会 伤害 你 的 。 他们 也 这么 说 。
-你 拥有 你 父亲 皇室 的 血统 。 是 合法 的 继承人 。
-叫 什么 你 可以 告诉 我 。
-你 并 没有 留言 说 要 去 哪里 。 是 的 , 因为 我 必须 要 去 完成 这件 事 。
-你 查出 是 谁 住 在 隔壁 房间 吗 ?
diff --git a/legacy/generate_sequence_by_rnn_lm/generate.py b/legacy/generate_sequence_by_rnn_lm/generate.py
deleted file mode 100644
index 07729e7404bce598790b78a46b073e01e4e46484..0000000000000000000000000000000000000000
--- a/legacy/generate_sequence_by_rnn_lm/generate.py
+++ /dev/null
@@ -1,74 +0,0 @@
-import os
-import sys
-import gzip
-import numpy as np
-
-import paddle.v2 as paddle
-
-from utils import logger, load_dict
-from beam_search import BeamSearch
-import config as conf
-from network_conf import rnn_lm
-
-
-def rnn_generate(gen_input_file, model_path, max_gen_len, beam_size,
- word_dict_file):
- """
- use RNN model to generate sequences.
-
- :param word_id_dict: vocab.
- :type word_id_dict: dictionary with content of "{word, id}",
- "word" is string type , "id" is int type.
- :param num_words: the number of the words to generate.
- :type num_words: int
- :param beam_size: beam width.
- :type beam_size: int
- :return: save prediction results to output_file
- """
-
- assert os.path.exists(gen_input_file), "test file does not exist!"
- assert os.path.exists(model_path), "trained model does not exist!"
- assert os.path.exists(
- word_dict_file), "word dictionary file does not exist!"
-
- # load word dictionary
- word_2_ids = load_dict(word_dict_file)
- try:
-        UNK_ID = word_2_ids["<unk>"]
-    except KeyError:
-        logger.fatal("the word dictionary must contain an <unk> token!")
- sys.exit(-1)
-
- # initialize paddle
- paddle.init(use_gpu=conf.use_gpu, trainer_count=conf.trainer_count)
-
- # load the trained model
- pred_words = rnn_lm(
- len(word_2_ids),
- conf.emb_dim,
- conf.hidden_size,
- conf.stacked_rnn_num,
- conf.rnn_type,
- is_infer=True)
-
- parameters = paddle.parameters.Parameters.from_tar(
- gzip.open(model_path, "r"))
-
- inferer = paddle.inference.Inference(
- output_layer=pred_words, parameters=parameters)
-
- generator = BeamSearch(inferer, word_dict_file, beam_size, max_gen_len)
- # generate text
-    with open(gen_input_file, "r") as fin, open(conf.gen_result, "w") as fout:
- for idx, line in enumerate(fin):
- fout.write("%d\t%s" % (idx, line))
- for gen_res in generator.gen_a_sentence([
- word_2_ids.get(w, UNK_ID)
- for w in line.lower().strip().split()
- ]):
- fout.write("%s\n" % gen_res)
- fout.write("\n")
-
-
-if __name__ == "__main__":
- rnn_generate(conf.gen_file, conf.model_path, conf.max_gen_len,
- conf.beam_size, conf.vocab_file)
diff --git a/legacy/generate_sequence_by_rnn_lm/images/ngram.png b/legacy/generate_sequence_by_rnn_lm/images/ngram.png
deleted file mode 100644
index 50f89f2d8021d022b169dd4815f5a5986ec9a168..0000000000000000000000000000000000000000
Binary files a/legacy/generate_sequence_by_rnn_lm/images/ngram.png and /dev/null differ
diff --git a/legacy/generate_sequence_by_rnn_lm/images/rnn.png b/legacy/generate_sequence_by_rnn_lm/images/rnn.png
deleted file mode 100644
index b3108d43de9183d75038e9572c84836a701f3a6d..0000000000000000000000000000000000000000
Binary files a/legacy/generate_sequence_by_rnn_lm/images/rnn.png and /dev/null differ
diff --git a/legacy/generate_sequence_by_rnn_lm/network_conf.py b/legacy/generate_sequence_by_rnn_lm/network_conf.py
deleted file mode 100644
index 55e0d00ea35b4fabc090f7a525289b6ec66760a1..0000000000000000000000000000000000000000
--- a/legacy/generate_sequence_by_rnn_lm/network_conf.py
+++ /dev/null
@@ -1,61 +0,0 @@
-import paddle.v2 as paddle
-
-
-def rnn_lm(vocab_dim,
- emb_dim,
- hidden_size,
- stacked_rnn_num,
- rnn_type="lstm",
- is_infer=False):
- """
- RNN language model definition.
-
- :param vocab_dim: size of vocabulary.
- :type vocab_dim: int
- :param emb_dim: dimension of the embedding vector
- :type emb_dim: int
-    :param rnn_type: the type of RNN cell.
-    :type rnn_type: str
- :param hidden_size: number of hidden unit.
- :type hidden_size: int
- :param stacked_rnn_num: number of stacked rnn cell.
- :type stacked_rnn_num: int
-    :return: the cost layer when training, or the prediction of the last
-        time step when is_infer is True.
-    :rtype: LayerOutput
- """
-
- # input layers
- input = paddle.layer.data(
- name="input", type=paddle.data_type.integer_value_sequence(vocab_dim))
- if not is_infer:
- target = paddle.layer.data(
- name="target",
- type=paddle.data_type.integer_value_sequence(vocab_dim))
-
- # embedding layer
- input_emb = paddle.layer.embedding(input=input, size=emb_dim)
-
- # rnn layer
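-    # stack stacked_rnn_num RNN layers: layer i consumes the output of
-    # layer i-1, and the first layer consumes the word embeddings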
- if rnn_type == "lstm":
- for i in range(stacked_rnn_num):
- rnn_cell = paddle.networks.simple_lstm(
- input=rnn_cell if i else input_emb, size=hidden_size)
- elif rnn_type == "gru":
- for i in range(stacked_rnn_num):
- rnn_cell = paddle.networks.simple_gru(
- input=rnn_cell if i else input_emb, size=hidden_size)
- else:
-        raise ValueError("rnn_type must be 'lstm' or 'gru'!")
-
-    # fc (fully connected) and output layer
- output = paddle.layer.fc(input=[rnn_cell],
- size=vocab_dim,
- act=paddle.activation.Softmax())
-
- if is_infer:
- last_word = paddle.layer.last_seq(input=output)
- return last_word
- else:
- cost = paddle.layer.classification_cost(input=output, label=target)
-
- return cost
diff --git a/legacy/generate_sequence_by_rnn_lm/reader.py b/legacy/generate_sequence_by_rnn_lm/reader.py
deleted file mode 100644
index 1c6bc7a8a83dbd028b11351cb55ace5b529b0268..0000000000000000000000000000000000000000
--- a/legacy/generate_sequence_by_rnn_lm/reader.py
+++ /dev/null
@@ -1,31 +0,0 @@
-import collections
-import os
-
-MIN_LEN = 3
-MAX_LEN = 100
-
-
-def rnn_reader(file_name, word_dict):
- """
- create reader for RNN, each line is a sample.
-
- :param file_name: file name.
- :param min_sentence_length: sentence's min length.
- :param max_sentence_length: sentence's max length.
- :param word_dict: vocab with content of '{word, id}',
- 'word' is string type , 'id' is int type.
- :return: data reader.
- """
-
- def reader():
-        UNK_ID = word_dict['<unk>']
- with open(file_name) as file:
- for line in file:
- words = line.strip().lower().split()
- if len(words) < MIN_LEN or len(words) > MAX_LEN:
- continue
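-                # append the end-of-sentence mark <e>; the target sequence
-                # is the input sequence shifted left by one word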
-                ids = [word_dict.get(w, UNK_ID)
-                       for w in words] + [word_dict['<e>']]
-                yield ids[:-1], ids[1:]
-
- return reader
diff --git a/legacy/generate_sequence_by_rnn_lm/train.py b/legacy/generate_sequence_by_rnn_lm/train.py
deleted file mode 100644
index 852dd327842f260e80c2386f2481d46f771873f4..0000000000000000000000000000000000000000
--- a/legacy/generate_sequence_by_rnn_lm/train.py
+++ /dev/null
@@ -1,127 +0,0 @@
-import os
-import sys
-import gzip
-
-import paddle.v2 as paddle
-import config as conf
-import reader
-from network_conf import rnn_lm
-from utils import logger, build_dict, load_dict
-
-
-def train(topology,
- train_reader,
- test_reader,
- model_save_dir="models",
- num_passes=10):
- """
- train model.
-
- :param topology: cost layer of the model to train.
-    :type topology: LayerOutput
-    :param train_reader: train data reader.
-    :type train_reader: collections.Iterable
-    :param test_reader: test data reader.
-    :type test_reader: collections.Iterable
-    :param model_save_dir: path to save the trained model
-    :type model_save_dir: str
-    :param num_passes: number of training passes (epochs)
-    :type num_passes: int
- """
- if not os.path.exists(model_save_dir):
- os.mkdir(model_save_dir)
-
- # initialize PaddlePaddle
- paddle.init(use_gpu=conf.use_gpu, trainer_count=conf.trainer_count)
-
- # create optimizer
- adam_optimizer = paddle.optimizer.Adam(
- learning_rate=1e-3,
- regularization=paddle.optimizer.L2Regularization(rate=1e-3),
- model_average=paddle.optimizer.ModelAverage(
- average_window=0.5, max_average_window=10000))
-
- # create parameters
- parameters = paddle.parameters.create(topology)
- # create sum evaluator
- sum_eval = paddle.evaluator.sum(topology)
- # create trainer
- trainer = paddle.trainer.SGD(cost=topology,
- parameters=parameters,
- update_equation=adam_optimizer,
- extra_layers=sum_eval)
-
- # define the event_handler callback
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if not event.batch_id % conf.log_period:
- logger.info("Pass %d, Batch %d, Cost %f, %s" % (
- event.pass_id, event.batch_id, event.cost, event.metrics))
-
- if (not event.batch_id %
- conf.save_period_by_batches) and event.batch_id:
- save_name = os.path.join(model_save_dir,
- "rnn_lm_pass_%05d_batch_%03d.tar.gz" %
- (event.pass_id, event.batch_id))
- with gzip.open(save_name, "w") as f:
- trainer.save_parameter_to_tar(f)
-
- if isinstance(event, paddle.event.EndPass):
- if test_reader is not None:
- result = trainer.test(reader=test_reader)
- logger.info("Test with Pass %d, %s" %
- (event.pass_id, result.metrics))
- save_name = os.path.join(model_save_dir, "rnn_lm_pass_%05d.tar.gz" %
- (event.pass_id))
- with gzip.open(save_name, "w") as f:
- trainer.save_parameter_to_tar(f)
-
- logger.info("start training...")
- trainer.train(
- reader=train_reader, event_handler=event_handler, num_passes=num_passes)
-
- logger.info("Training is finished.")
-
-
-def main():
- # prepare vocab
- if not (os.path.exists(conf.vocab_file) and
- os.path.getsize(conf.vocab_file)):
- logger.info(("word dictionary does not exist, "
- "build it from the training data"))
- build_dict(conf.train_file, conf.vocab_file, conf.max_word_num,
- conf.cutoff_word_fre)
- logger.info("load word dictionary.")
- word_dict = load_dict(conf.vocab_file)
- logger.info("dictionay size = %d" % (len(word_dict)))
-
- cost = rnn_lm(
- len(word_dict), conf.emb_dim, conf.hidden_size, conf.stacked_rnn_num,
- conf.rnn_type)
-
- # define reader
- reader_args = {
- "file_name": conf.train_file,
- "word_dict": word_dict,
- }
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.rnn_reader(**reader_args), buf_size=102400),
- batch_size=conf.batch_size)
- test_reader = None
- if os.path.exists(conf.test_file) and os.path.getsize(conf.test_file):
- test_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.rnn_reader(**reader_args), buf_size=65536),
- batch_size=conf.batch_size)
-
- train(
- topology=cost,
- train_reader=train_reader,
- test_reader=test_reader,
- model_save_dir=conf.model_save_dir,
- num_passes=conf.num_passes)
-
-
-if __name__ == "__main__":
- main()
diff --git a/legacy/generate_sequence_by_rnn_lm/utils.py b/legacy/generate_sequence_by_rnn_lm/utils.py
deleted file mode 100644
index 57edf3758e995a5c104b7a7cbf37edf1ba95dfbc..0000000000000000000000000000000000000000
--- a/legacy/generate_sequence_by_rnn_lm/utils.py
+++ /dev/null
@@ -1,84 +0,0 @@
-import os
-import logging
-from collections import defaultdict
-
-__all__ = ["build_dict", "load_dict"]
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.DEBUG)
-
-
-def build_dict(data_file,
-               save_path,
-               max_word_num,
-               cutoff_word_fre=5,
-               insert_extra_words=["<unk>", "<e>"]):
-    """
-    build the word dictionary from the training corpus.
-
-    :param data_file: path of the data file
-    :type data_file: str
-    :param save_path: path to save the word dictionary
-    :type save_path: str
-    :param max_word_num: at most the max_word_num most frequent words will
-        be added into the word vocabulary
-    :type max_word_num: int
-    :param cutoff_word_fre: words whose frequencies are less than
-        cutoff_word_fre will not be added into the word vocabulary.
-        NOTE: both constraints apply; the dictionary size is bounded by
-        max_word_num and by the frequency cutoff at the same time.
-    :type cutoff_word_fre: int
-    :param insert_extra_words: extra tokens defined by users that are added
-        into the word dictionary, usually the <unk> token and the
-        end-of-sentence mark <e>
-    :type insert_extra_words: list
-    """
- word_count = defaultdict(int)
- with open(data_file, "r") as f:
- for idx, line in enumerate(f):
- if not (idx + 1) % 100000:
- logger.debug("processing %d lines ... " % (idx + 1))
- words = line.strip().lower().split()
- for w in words:
- word_count[w] += 1
-
-    sorted_words = sorted(
-        word_count.items(), key=lambda x: x[1], reverse=True)
-
-    # keep only words whose frequency is no less than cutoff_word_fre,
-    # then truncate the list to at most max_word_num words
-    if sorted_words[-1][1] >= cutoff_word_fre:
-        stop_pos = len(sorted_words)
-    else:
-        stop_pos = next(idx for idx, v in enumerate(sorted_words)
-                        if v[1] < cutoff_word_fre)
-    stop_pos = min(max_word_num, stop_pos)
- with open(save_path, "w") as fdict:
- for w in insert_extra_words:
- fdict.write("%s\t-1\n" % (w))
- for idx, info in enumerate(sorted_words):
- if idx == stop_pos: break
- fdict.write("%s\t%d\n" % (info[0], info[-1]))
-
-
-def load_dict(dict_path):
-    """
-    load the word dictionary from the given file. Each line of the given
-    file is a word in the dictionary. The first column of the line,
-    separated by TAB, is the key, while the line index is the value.
-
-    :param dict_path: path of the word dictionary
-    :type dict_path: str
-    :return: the dictionary
-    :rtype: dict
-    """
-    with open(dict_path, "r") as fin:
-        return dict((line.strip().split("\t")[0], idx)
-                    for idx, line in enumerate(fin))
-
-
-def load_reverse_dict(dict_path):
-    """
-    load the word dictionary from the given file. Each line of the given
-    file is a word in the dictionary. The line index is the key, while the
-    first column of the line, separated by TAB, is the value.
-
-    :param dict_path: path of the word dictionary
-    :type dict_path: str
-    :return: the reverse dictionary
-    :rtype: dict
-    """
-    with open(dict_path, "r") as fin:
-        return dict((idx, line.strip().split("\t")[0])
-                    for idx, line in enumerate(fin))
diff --git a/legacy/globally_normalized_reader/.gitignore b/legacy/globally_normalized_reader/.gitignore
deleted file mode 100644
index 5707959556bb6b21e88a3a77d11f9222bba01485..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/.gitignore
+++ /dev/null
@@ -1,2 +0,0 @@
-*.txt
-*.pyc
diff --git a/legacy/globally_normalized_reader/README.cn.md b/legacy/globally_normalized_reader/README.cn.md
deleted file mode 100644
index b1d3910754538ffb2743a7eb80ee7225eabcd534..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/README.cn.md
+++ /dev/null
@@ -1,59 +0,0 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is v0.11.0. If you are on a version of PaddlePaddle earlier than v0.11.0, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
-
----
-
-# Globally Normalized Reader
-
-This model implements the work in the following paper:
-
-Jonathan Raiman and John Miller. Globally Normalized Reader. Empirical Methods in Natural Language Processing (EMNLP), 2017.
-
-If you use the dataset/code in your research, please cite the above paper:
-
-```text
-@inproceedings{raiman2015gnr,
- author={Raiman, Jonathan and Miller, John},
- booktitle={Empirical Methods in Natural Language Processing (EMNLP)},
- title={Globally Normalized Reader},
- year={2017},
-}
-```
-
-You can also visit https://github.com/baidu-research/GloballyNormalizedReader to get more information.
-
-
-# Installation
-
-1. Please use the [docker image](http://doc.paddlepaddle.org/develop/doc/getstarted/build_and_install/docker_install_en.html) to install the latest PaddlePaddle, by running:
- ```bash
- docker pull paddledev/paddle
- ```
-2. Download all necessary data by running:
- ```bash
- cd data && ./download.sh && cd ..
- ```
-3. Preprocess and featurize the data:
- ```bash
- python featurize.py --datadir data --outdir data/featurized --glove-path data/glove.840B.300d.txt
- ```
-
-# Training a Model
-
-- Configure the model by modifying `config.py` if needed, and then run:
-
- ```bash
- python train.py 2>&1 | tee train.log
- ```
-
-# Inferring with a Trained Model
-
-- Infer with a trained model by running:
- ```bash
- python infer.py \
- --model_path models/pass_00000.tar.gz \
- --data_dir data/featurized/ \
- --batch_size 2 \
- --use_gpu 0 \
- --trainer_count 1 \
- 2>&1 | tee infer.log
- ```
diff --git a/legacy/globally_normalized_reader/README.md b/legacy/globally_normalized_reader/README.md
deleted file mode 100644
index 9763a1c04fc5dd76da2003acfa53ba094f0582e4..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/README.md
+++ /dev/null
@@ -1,59 +0,0 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is v0.11.0. If you are on a version of PaddlePaddle earlier than v0.11.0, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
-
----
-
-# Globally Normalized Reader
-
-This model implements the work in the following paper:
-
-Jonathan Raiman and John Miller. Globally Normalized Reader. Empirical Methods in Natural Language Processing (EMNLP), 2017.
-
-If you use the dataset/code in your research, please cite the above paper:
-
-```text
-@inproceedings{raiman2015gnr,
- author={Raiman, Jonathan and Miller, John},
- booktitle={Empirical Methods in Natural Language Processing (EMNLP)},
- title={Globally Normalized Reader},
- year={2017},
-}
-```
-
-You can also visit https://github.com/baidu-research/GloballyNormalizedReader to get more information.
-
-
-# Installation
-
-1. Please use [docker image](http://doc.paddlepaddle.org/develop/doc/getstarted/build_and_install/docker_install_en.html) to install the latest PaddlePaddle, by running:
- ```bash
- docker pull paddledev/paddle
- ```
-2. Download all necessary data by running:
- ```bash
- cd data && ./download.sh && cd ..
- ```
-3. Preprocess and featurize the data:
- ```bash
- python featurize.py --datadir data --outdir data/featurized --glove-path data/glove.840B.300d.txt
- ```
-
-# Training a Model
-
-- Configure the model by modifying `config.py` if needed, and then run:
-
- ```bash
- python train.py 2>&1 | tee train.log
- ```
-
-# Inferring with a Trained Model
-
-- Infer with a trained model by running:
- ```bash
- python infer.py \
- --model_path models/pass_00000.tar.gz \
- --data_dir data/featurized/ \
- --batch_size 2 \
- --use_gpu 0 \
- --trainer_count 1 \
- 2>&1 | tee infer.log
- ```
diff --git a/legacy/globally_normalized_reader/basic_modules.py b/legacy/globally_normalized_reader/basic_modules.py
deleted file mode 100644
index a54b46b568b619c35a234502955f47708aec4115..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/basic_modules.py
+++ /dev/null
@@ -1,183 +0,0 @@
-#coding=utf-8
-
-import collections
-
-import paddle.v2 as paddle
-from paddle.v2.layer import parse_network
-
-__all__ = [
- "stacked_bidirectional_lstm",
- "stacked_bidirectional_lstm_by_nested_seq",
- "lstm_by_nested_sequence",
-]
-
-
-def stacked_bidirectional_lstm(inputs,
- hidden_dim,
- depth,
- drop_rate=0.,
- prefix=""):
- """ The stacked bi-directional LSTM.
-
- In PaddlePaddle recurrent layers have two different implementations:
- 1. recurrent layer implemented by recurrent_group: any intermedia states a
- recurent unit computes during one time step, such as hidden states,
- input-to-hidden mapping, memory cells and so on, is accessable.
- 2. recurrent layer as a whole: only outputs of the recurrent layer are
- accessable.
-
- The second type (recurrent layer as a whole) is more computation efficient,
- because recurrent_group is made up of many basic layers (including add,
- element-wise multiplications, matrix multiplication and so on).
-
- This function uses the second type to implement the stacked bi-directional
- LSTM.
-
- Arguments:
- - inputs: The input layer to the bi-directional LSTM.
- - hidden_dim: The dimension of the hidden state of the LSTM.
- - depth: Depth of the stacked bi-directional LSTM.
- - drop_rate: The drop rate to drop the LSTM output states.
- - prefix: A string which will be appended to name of each layer
- created in this function. Each layer in a network should
- has a unique name. The prefix makes this fucntion can be
- called multiple times.
- """
-
- if not isinstance(inputs, collections.Sequence):
- inputs = [inputs]
-
- lstm_last = []
- for dirt in ["fwd", "bwd"]:
- for i in range(depth):
- input_proj = paddle.layer.mixed(
- name="%s_in_proj_%0d_%s__" % (prefix, i, dirt),
- size=hidden_dim * 4,
- bias_attr=paddle.attr.Param(initial_std=0.),
- input=[paddle.layer.full_matrix_projection(lstm)] if i else [
- paddle.layer.full_matrix_projection(in_layer)
- for in_layer in inputs
- ])
- lstm = paddle.layer.lstmemory(
- input=input_proj,
- bias_attr=paddle.attr.Param(initial_std=0.),
- param_attr=paddle.attr.Param(initial_std=5e-4),
- reverse=(dirt == "bwd"))
- lstm_last.append(lstm)
-
- final_states = paddle.layer.concat(input=[
- paddle.layer.last_seq(input=lstm_last[0]),
- paddle.layer.first_seq(input=lstm_last[1]),
- ])
-
- lstm_outs = paddle.layer.concat(
- input=lstm_last,
- layer_attr=paddle.attr.ExtraLayerAttribute(drop_rate=drop_rate))
- return final_states, lstm_outs
-
-
-def lstm_by_nested_sequence(input_layer, hidden_dim, name="", reverse=False):
- """This is a LSTM implemended by nested recurrent_group.
-
- Paragraph is a nature nested sequence:
- 1. each paragraph is a sequence of sentence.
- 2. each sentence is a sequence of words.
-
- This function ueses the nested recurrent_group to implement LSTM.
- 1. The outer group iterates over sentence in a paragraph.
- 2. The inner group iterates over words in a sentence.
- 3. A LSTM is used to encode sentence, its final outputs is used to
- initialize memory of the LSTM that is used to encode the next sentence.
- 4. Parameters are shared among these sentence-encoding LSTMs.
- 5. Consequently, this function is just equivalent to concatenate all
- sentences in a paragraph into one (long) sentence, and use one LSTM to
- encode this new long sentence.
-
- Arguments:
- - input_layer: The input layer to the bi-directional LSTM.
- - hidden_dim: The dimension of the hidden state of the LSTM.
- - name: The name of the bi-directional LSTM.
- - reverse: The boolean parameter indicating whether to prcess
- the input sequence by the reverse order.
- """
-
- def lstm_outer_step(lstm_group_input, hidden_dim, reverse, name=''):
- outer_memory = paddle.layer.memory(
- name="__inner_%s_last__" % name, size=hidden_dim)
-
- def lstm_inner_step(input_layer, hidden_dim, reverse, name):
- inner_memory = paddle.layer.memory(
- name="__inner_state_%s__" % name,
- size=hidden_dim,
- boot_layer=outer_memory)
- input_proj = paddle.layer.fc(size=hidden_dim * 4,
- bias_attr=False,
- input=input_layer)
- return paddle.networks.lstmemory_unit(
- input=input_proj,
- name="__inner_state_%s__" % name,
- out_memory=inner_memory,
- size=hidden_dim,
- act=paddle.activation.Tanh(),
- gate_act=paddle.activation.Sigmoid(),
- state_act=paddle.activation.Tanh())
-
- inner_out = paddle.layer.recurrent_group(
- name="__inner_%s__" % name,
- step=lstm_inner_step,
- reverse=reverse,
- input=[lstm_group_input, hidden_dim, reverse, name])
-
- if reverse:
- inner_last_output = paddle.layer.first_seq(
- input=inner_out,
- name="__inner_%s_last__" % name,
- agg_level=paddle.layer.AggregateLevel.TO_NO_SEQUENCE)
- else:
- inner_last_output = paddle.layer.last_seq(
- input=inner_out,
- name="__inner_%s_last__" % name,
- agg_level=paddle.layer.AggregateLevel.TO_NO_SEQUENCE)
- return inner_out
-
- return paddle.layer.recurrent_group(
- input=[
- paddle.layer.SubsequenceInput(input_layer), hidden_dim, reverse,
- name
- ],
- step=lstm_outer_step,
- name="__outter_%s__" % name,
- reverse=reverse)
-
-
-def stacked_bidirectional_lstm_by_nested_seq(input_layer,
- depth,
- hidden_dim,
- prefix=""):
- """ The stacked bi-directional LSTM to process a nested sequence.
-
- The modules defined in this function is exactly equivalent to
- that defined in stacked_bidirectional_lstm, the only difference is the
- bi-directional LSTM defined in this function implemented by recurrent_group
- in PaddlePaddle, and receive a nested sequence as its input.
-
- Arguments:
- - inputs: The input layer to the bi-directional LSTM.
- - hidden_dim: The dimension of the hidden state of the LSTM.
- - depth: Depth of the stacked bi-directional LSTM.
- - prefix: A string which will be appended to name of each layer
- created in this function. Each layer in a network should
- has a unique name. The prefix makes this fucntion can be
- called multiple times.
- """
-
- lstm_final_outs = []
- for dirt in ["fwd", "bwd"]:
- for i in range(depth):
- lstm_out = lstm_by_nested_sequence(
- input_layer=(lstm_out if i else input_layer),
- hidden_dim=hidden_dim,
- name="__%s_%s_%02d__" % (prefix, dirt, i),
- reverse=(dirt == "bwd"))
- lstm_final_outs.append(lstm_out)
- return paddle.layer.concat(input=lstm_final_outs)
diff --git a/legacy/globally_normalized_reader/beam_decoding.py b/legacy/globally_normalized_reader/beam_decoding.py
deleted file mode 100644
index 5f7df266d7789186c309a261e0005e0547043a15..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/beam_decoding.py
+++ /dev/null
@@ -1,270 +0,0 @@
-#coding=utf-8
-
-import numpy as np
-
-__all__ = ["BeamDecoding"]
-
-
-class BeamDecoding(object):
- """
- Decode outputs of the PaddlePaddle layers into readable answers.
- """
-
- def __init__(self, documents, sentence_scores, selected_sentences,
- start_scores, selected_starts, end_scores, selected_ends):
- """ The constructor.
-
- Arguments:
-            - documents:          The one-hot input of the document words.
-            - sentence_scores:    The score of each sentence in a document.
-            - selected_sentences: The top k selected sentences. This is the
-                                  output of the paddle.layer.kmax_seq_score
-                                  layer in the model.
-            - start_scores:       The score of each word in a selected
-                                  sentence, indicating whether it is the
-                                  start of the answer.
-            - selected_starts:    The top k selected start spans. This is
-                                  the output of the
-                                  paddle.layer.kmax_seq_score layer in the
-                                  model.
-            - end_scores:         The score of each word in the
-                                  sub-sequence running from a selected
-                                  start to the end of the selected
-                                  sentence.
-            - selected_ends:      The top k selected end spans. This is the
-                                  output of the paddle.layer.kmax_seq_score
-                                  layer in the model.
-
- """
-
- self.documents = documents
-
- self.sentence_scores = sentence_scores
- self.selected_sentences = selected_sentences
-
- self.start_scores = start_scores
- self.selected_starts = selected_starts
-
- self.end_scores = end_scores
- self.selected_ends = selected_ends
- """
- sequence start position information for the three step search
- beam1 is to search the sequence index
- """
- self.beam1_seq_start_positions = []
- """beam2 is to search the start answer span"""
- self.beam2_seq_start_positions = []
- """beam3 is to search the end answer span """
- self.beam3_seq_start_positions = []
-
- self.ans_per_sample_in_a_batch = [0]
- self.all_searched_ans = []
-
- self.final_ans = [[] for i in range(len(documents))]
-
- def _build_beam1_seq_info(self):
- """
- The internal function to calculate the offset of each test sequence
- in a batch for the first beam in searching the answer sentence.
- """
-
- self.beam1_seq_start_positions.append([0])
- for idx, one_doc in enumerate(self.documents):
- for sentence in one_doc:
- self.beam1_seq_start_positions[-1].append(
- self.beam1_seq_start_positions[-1][-1] + len(sentence))
-
- if len(self.beam1_seq_start_positions) != len(self.documents):
- self.beam1_seq_start_positions.append(
- [self.beam1_seq_start_positions[-1][-1]])
-
- def _build_beam2_seq_info(self):
- """
- The internal function to calculate the offset of each test sequence
- in a batch for the second beam in searching the start spans.
- """
-
- seq_num, beam_size = self.selected_sentences.shape
- self.beam2_seq_start_positions.append([0])
- for i in range(seq_num):
- for j in range(beam_size):
- selected_id = int(self.selected_sentences[i][j])
- if selected_id == -1: break
- seq_len = self.beam1_seq_start_positions[i][
- selected_id + 1] - self.beam1_seq_start_positions[i][
- selected_id]
- self.beam2_seq_start_positions[-1].append(
- self.beam2_seq_start_positions[-1][-1] + seq_len)
-
- if len(self.beam2_seq_start_positions) != seq_num:
- self.beam2_seq_start_positions.append(
- [self.beam2_seq_start_positions[-1][-1]])
-
- def _build_beam3_seq_info(self):
- """
- The internal function to calculate the offset of each test sequence
- in a batch for the third beam in searching the end spans.
- """
-
- seq_num_in_a_batch = len(self.documents)
-
- seq_id = 0
- sub_seq_id = 0
- sub_seq_count = len(self.beam2_seq_start_positions[seq_id]) - 1
-
- self.beam3_seq_start_positions.append([0])
- sub_seq_num, beam_size = self.selected_starts.shape
- for i in range(sub_seq_num):
- seq_len = self.beam2_seq_start_positions[seq_id][
- sub_seq_id + 1] - self.beam2_seq_start_positions[seq_id][
- sub_seq_id]
- for j in range(beam_size):
- start_id = int(self.selected_starts[i][j])
- if start_id == -1: break
-
- self.beam3_seq_start_positions[-1].append(
- self.beam3_seq_start_positions[-1][-1] + seq_len - start_id)
-
- sub_seq_id += 1
- if sub_seq_id == sub_seq_count:
- if len(self.beam3_seq_start_positions) != seq_num_in_a_batch:
- self.beam3_seq_start_positions.append(
- [self.beam3_seq_start_positions[-1][-1]])
- sub_seq_id = 0
- seq_id += 1
- sub_seq_count = len(self.beam2_seq_start_positions[
- seq_id]) - 1
- assert (
- self.beam3_seq_start_positions[-1][-1] == self.end_scores.shape[0])
-
- def _build_seq_info_for_each_beam(self):
- """
- The internal function to calculate the offset of each test sequence
- in a batch for beams expanded at all the three search steps.
- """
-
- self._build_beam1_seq_info()
- self._build_beam2_seq_info()
- self._build_beam3_seq_info()
-
- def _cal_ans_per_sample_in_a_batch(self):
- """
-        The internal function to calculate how many candidate answers there
-        are for each test sequence in a batch.
- """
-
- start_row = 0
- for seq in self.beam3_seq_start_positions:
- end_row = start_row + len(seq) - 1
- ans_count = np.sum(self.selected_ends[start_row:end_row, :] != -1.)
-
- self.ans_per_sample_in_a_batch.append(
- self.ans_per_sample_in_a_batch[-1] + ans_count)
- start_row = end_row
-
-    def _get_valid_selected_ids(self, mat):
-        """
-        The internal function to post-process the output matrix of the
-        paddle.layer.kmax_seq_score layer. It strips away the special
-        delimiter -1 and flattens the original two-dimensional output
-        matrix into a python list.
-        """
-
- flattened = []
- height, width = mat.shape
- for i in range(height):
- for j in range(width):
- if mat[i][j] == -1.: break
- flattened.append([int(mat[i][j]), [i, j]])
- return flattened
-
- def decoding(self):
- """
-        Decode the forward (inference) results of the GNR network into
-        readable answers.
- """
-
- self._build_seq_info_for_each_beam()
- self._cal_ans_per_sample_in_a_batch()
-
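-        # stage 1: collect the end-span candidates produced by the third
-        # beam, together with their end scores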
- seq_id = 0
- sub_seq_id = 0
- sub_seq_count = len(self.beam3_seq_start_positions[seq_id]) - 1
-
- sub_seq_num, beam_size = self.selected_ends.shape
-        for i in range(sub_seq_num):
- seq_offset_in_batch = self.beam3_seq_start_positions[seq_id][
- sub_seq_id]
-            for j in range(beam_size):
- end_pos = int(self.selected_ends[i][j])
- if end_pos == -1: break
-
- self.all_searched_ans.append({
- "score": self.end_scores[seq_offset_in_batch + end_pos],
- "sentence_pos": -1,
- "start_span_pos": -1,
- "end_span_pos": end_pos,
- "parent_ids_in_prev_beam": i
- })
-
- sub_seq_id += 1
- if sub_seq_id == sub_seq_count:
- seq_id += 1
- if seq_id == len(self.beam3_seq_start_positions): break
-
- sub_seq_id = 0
- sub_seq_count = len(self.beam3_seq_start_positions[seq_id]) - 1
-
- assert len(self.all_searched_ans) == self.ans_per_sample_in_a_batch[-1]
-
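-        # stage 2: trace each candidate back through the start-span beam
-        # and accumulate the start scores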
- seq_id = 0
- sub_seq_id = 0
- sub_seq_count = len(self.beam2_seq_start_positions[seq_id]) - 1
- last_row_id = None
-
-        starts = self._get_valid_selected_ids(self.selected_starts)
- for i, ans in enumerate(self.all_searched_ans):
- ans["start_span_pos"] = starts[ans["parent_ids_in_prev_beam"]][0]
-
- seq_offset_in_batch = (
- self.beam2_seq_start_positions[seq_id][sub_seq_id])
- ans["score"] += self.start_scores[(
- seq_offset_in_batch + ans["start_span_pos"])]
- ans["parent_ids_in_prev_beam"] = starts[ans[
- "parent_ids_in_prev_beam"]][1][0]
-
- if last_row_id and last_row_id != ans["parent_ids_in_prev_beam"]:
- sub_seq_id += 1
-
- if sub_seq_id == sub_seq_count:
- seq_id += 1
- if seq_id == len(self.beam2_seq_start_positions): break
- sub_seq_count = len(self.beam2_seq_start_positions[seq_id]) - 1
- sub_seq_id = 0
- last_row_id = ans["parent_ids_in_prev_beam"]
-
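-        # stage 3: trace back to the sentence-selection beam, accumulate
-        # the sentence scores, then sort the answers per sample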
- offset_info = [0]
- for sen in self.beam1_seq_start_positions[:-1]:
- offset_info.append(offset_info[-1] + len(sen) - 1)
-        sen_ids = self._get_valid_selected_ids(self.selected_sentences)
- for ans in self.all_searched_ans:
- ans["sentence_pos"] = sen_ids[ans["parent_ids_in_prev_beam"]][0]
-            row_id = ans["parent_ids_in_prev_beam"] // beam_size  # integer division
- offset = offset_info[row_id - 1] if row_id else 0
- ans["score"] += self.sentence_scores[offset + ans["sentence_pos"]]
-
- for i in range(len(self.ans_per_sample_in_a_batch) - 1):
- start_pos = self.ans_per_sample_in_a_batch[i]
- end_pos = self.ans_per_sample_in_a_batch[i + 1]
-
- for ans in sorted(
- self.all_searched_ans[start_pos:end_pos],
- key=lambda x: x["score"],
- reverse=True):
- self.final_ans[i].append({
- "score": ans["score"],
- "label": [
- ans["sentence_pos"], ans["start_span_pos"],
- ans["end_span_pos"]
- ]
- })
-
- return self.final_ans
diff --git a/legacy/globally_normalized_reader/config.py b/legacy/globally_normalized_reader/config.py
deleted file mode 100644
index 2fa48b64d70600a3967895d09b22b8b23ffcd11b..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/config.py
+++ /dev/null
@@ -1,46 +0,0 @@
-#coding=utf-8
-
-__all__ = ["ModelConfig", "TrainerConfig"]
-
-
-class ModelConfig(object):
- vocab_size = 104810
- embedding_dim = 300
- embedding_droprate = 0.3
-
- lstm_depth = 3
- lstm_hidden_dim = 300
- lstm_hidden_droprate = 0.3
-
- passage_indep_embedding_dim = 300
- passage_aligned_embedding_dim = 300
-
- beam_size = 32
-
- dict_path = "data/featurized/vocab.txt"
- pretrained_emb_path = "data/featurized/embeddings.npy"
-
-
-class TrainerConfig(object):
- learning_rate = 1e-3
- l2_decay_rate = 5e-4
- gradient_clipping_threshold = 20
-
- data_dir = "data/featurized"
- save_dir = "models"
-
- use_gpu = False
- trainer_count = 1
- train_batch_size = trainer_count * 8
-
- epochs = 20
-
-    # This parameter is for debug printing.
-    # If it is set to 0, no parameter-status information will be printed.
- show_parameter_status_period = 0
- checkpoint_period = 100
- log_period = 5
-
- # This parameter is used to resume training.
- # This path can be set to a previously trained model.
- init_model_path = None
diff --git a/legacy/globally_normalized_reader/data/download.sh b/legacy/globally_normalized_reader/data/download.sh
deleted file mode 100755
index f089284a897478c686853754312720c0c91a5abd..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/data/download.sh
+++ /dev/null
@@ -1,7 +0,0 @@
-#!/bin/bash
-
-wget --no-check-certificate https://rajpurkar.github.io/SQuAD-explorer/dataset/train-v1.1.json -O train.json
-wget --no-check-certificate https://rajpurkar.github.io/SQuAD-explorer/dataset/dev-v1.1.json -O dev.json
-
-wget http://nlp.stanford.edu/data/glove.840B.300d.zip
-unzip glove.840B.300d.zip
diff --git a/legacy/globally_normalized_reader/evaluate.py b/legacy/globally_normalized_reader/evaluate.py
deleted file mode 100644
index c85ae0126d737cdb2d14d618aa07705faa490d90..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/evaluate.py
+++ /dev/null
@@ -1,96 +0,0 @@
-""" Official evaluation script for v1.1 of the SQuAD dataset. """
-from __future__ import print_function
-from collections import Counter
-import string
-import re
-import argparse
-import json
-import sys
-
-
-def normalize_answer(s):
- """Lower text and remove punctuation, articles and extra whitespace."""
-
- def remove_articles(text):
- return re.sub(r'\b(a|an|the)\b', ' ', text)
-
- def white_space_fix(text):
- return ' '.join(text.split())
-
- def remove_punc(text):
- exclude = set(string.punctuation)
- return ''.join(ch for ch in text if ch not in exclude)
-
- def lower(text):
- return text.lower()
-
- return white_space_fix(remove_articles(remove_punc(lower(s))))
-
-
-def f1_score(prediction, ground_truth):
- prediction_tokens = normalize_answer(prediction).split()
- ground_truth_tokens = normalize_answer(ground_truth).split()
- common = Counter(prediction_tokens) & Counter(ground_truth_tokens)
- num_same = sum(common.values())
- if num_same == 0:
- return 0
- precision = 1.0 * num_same / len(prediction_tokens)
- recall = 1.0 * num_same / len(ground_truth_tokens)
- f1 = (2 * precision * recall) / (precision + recall)
- return f1
-
-
-def exact_match_score(prediction, ground_truth):
- return (normalize_answer(prediction) == normalize_answer(ground_truth))
-
-
-def metric_max_over_ground_truths(metric_fn, prediction, ground_truths):
- scores_for_ground_truths = []
- for ground_truth in ground_truths:
- score = metric_fn(prediction, ground_truth)
- scores_for_ground_truths.append(score)
- return max(scores_for_ground_truths)
-
-
-def evaluate(dataset, predictions):
- f1 = exact_match = total = 0
- for article in dataset:
- for paragraph in article['paragraphs']:
- for qa in paragraph['qas']:
- total += 1
- if qa['id'] not in predictions:
- message = 'Unanswered question ' + qa['id'] + \
- ' will receive score 0.'
- print(message, file=sys.stderr)
- continue
- ground_truths = list(map(lambda x: x['text'], qa['answers']))
- prediction = predictions[qa['id']]
- exact_match += metric_max_over_ground_truths(
- exact_match_score, prediction, ground_truths)
- f1 += metric_max_over_ground_truths(f1_score, prediction,
- ground_truths)
-
- exact_match = 100.0 * exact_match / total
- f1 = 100.0 * f1 / total
-
- return {'exact_match': exact_match, 'f1': f1}
-
-
-if __name__ == '__main__':
- expected_version = '1.1'
- parser = argparse.ArgumentParser(
- description='Evaluation for SQuAD ' + expected_version)
- parser.add_argument('dataset_file', help='Dataset file')
- parser.add_argument('prediction_file', help='Prediction File')
- args = parser.parse_args()
- with open(args.dataset_file) as dataset_file:
- dataset_json = json.load(dataset_file)
- if (dataset_json['version'] != expected_version):
- print(
- 'Evaluation expects v-' + expected_version +
- ', but got dataset with v-' + dataset_json['version'],
- file=sys.stderr)
- dataset = dataset_json['data']
- with open(args.prediction_file) as prediction_file:
- predictions = json.load(prediction_file)
- print(json.dumps(evaluate(dataset, predictions)))
diff --git a/legacy/globally_normalized_reader/featurize.py b/legacy/globally_normalized_reader/featurize.py
deleted file mode 100644
index 9a5f3d26f15002777df441217a52fd53a0507dc0..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/featurize.py
+++ /dev/null
@@ -1,309 +0,0 @@
-#coding=utf-8
-"""
-Convert the raw json data into training and validation examples.
-"""
-from collections import Counter
-import json
-import os
-import io
-import string
-
-import click
-import numpy as np
-import ciseau
-
-from vocab import Vocab
-from evaluate import normalize_answer
-
-# Constants
-UNK = ""
-SOS = ""
-EOS = ""
-PAD = ""
-
-splits = ["train", "dev"]
-
-ARTICLES = {"a", "an", "the", "of"}
-
-# Keep the random embedding matrix the same between runs.
-np.random.seed(1234)
-
-
-def data_stream(path):
- """ Given a path json data in Pranav format, convert it to a stream
- question/context/answers tuple."""
- with io.open(path, "r") as handle:
- raw_data = json.load(handle)["data"]
- for ex in raw_data:
- for paragraph in ex["paragraphs"]:
- context = paragraph["context"]
- for qa in paragraph["qas"]:
- question = qa["question"]
- answers = qa["answers"]
- if "id" not in qa:
- qa_id = -1
- else:
- qa_id = qa["id"]
- yield question, context, answers, qa_id
-
-
-def build_vocabulary(datadir, outdir, glove_path):
- """Construct the vocabulary object used throughout."""
-    # We're not going to backprop through the word vectors,
-    # so both train and dev words end up in the vocab.
- counter = Counter()
- for split in splits:
- datapath = os.path.join(datadir, split + ".json")
-
- for question, context, _, _ in data_stream(datapath):
- for word in ciseau.tokenize(question, normalize_ascii=False):
- counter[normalize(word)] += 1
- for word in ciseau.tokenize(context, normalize_ascii=False):
- counter[normalize(word)] += 1
-
- common_words = [UNK, SOS, EOS, PAD] + [w for w, _ in counter.most_common()]
-
- vocab_path = os.path.join(outdir, "vocab.txt")
- with io.open(vocab_path, "w", encoding="utf8") as handle:
- handle.write("\n".join(common_words))
-
- return Vocab(outdir)
-
-
-def normalize_answer_tokens(tokens):
- start = 0
- end = len(tokens)
-
- while end - start > 1:
- first_token = tokens[start].rstrip().lower()
- if first_token in string.punctuation or first_token in ARTICLES:
- start += 1
- else:
- break
- while end - start > 1:
- last_token = tokens[end - 1].rstrip().lower()
- if last_token in string.punctuation:
- end -= 1
- else:
- break
- return start, end
-
-
-def tokenize_example(question, context, answers, strip_labels=True):
- # Q: How should we choose the right answer
- answer = answers[0]["text"]
- answer_start = answers[0]["answer_start"]
-
- if strip_labels:
- answer_tokens = ciseau.tokenize(answer, normalize_ascii=False)
- start_offset, end_offset = normalize_answer_tokens(answer_tokens)
- answer = "".join(answer_tokens[start_offset:end_offset])
- # add back the piece that was stripped off:
- answer_start = answer_start + len("".join(answer_tokens[:start_offset]))
-
- # replace answer string with placeholder
- placeholder = "XXXX"
- new_context = context[:answer_start] + placeholder + context[answer_start +
- len(answer):]
-
- token_context = ciseau.sent_tokenize(new_context, keep_whitespace=True)
- token_question = ciseau.tokenize(question)
-
- sentence_label = None
- for sent_idx, sent in enumerate(token_context):
- answer_start = None
- for idx, word in enumerate(sent):
- if placeholder in word:
- answer_start = idx
- break
-
- if answer_start is None:
- continue
-
- sentence_label = sent_idx
-
- # deal with cases where the answer is in the middle
- # of the word
- answer = word.replace(placeholder, answer)
- token_answer = ciseau.tokenize(answer)
-
- answer_end = answer_start + len(token_answer) - 1
- answer_sent = sent[:answer_start] + token_answer + sent[answer_start +
- 1:]
- break
-
- token_context[sentence_label] = answer_sent
-
- return token_question, token_context, sentence_label, answer_start, answer_end
-
-
-def normalize(word):
- return word.strip()
-
-
-def same_as_question_feature(question_idxs, context_idxs, vocab):
- question_words = [vocab.idx_to_word(idx) for idx in question_idxs]
-
-    # remove stop words and punctuation
- question_words = set([
- w.strip().lower() for w in question_words
- if w not in ARTICLES and w not in string.punctuation
- ])
-
- features = []
- for word_idx in context_idxs:
- word = vocab.idx_to_word(word_idx)
- features.append(int(word.strip().lower() in question_words))
-
- return features
-
-
-def repeated_word_features(context_idxs, vocab):
- context_words = [vocab.idx_to_word(idx) for idx in context_idxs]
-
- word_counter = {}
- for word in context_words:
- canon = word.strip().lower()
- if canon in word_counter:
- word_counter[canon] += 1
- else:
- word_counter[canon] = 1
-
- max_occur = max(word_counter.values())
- min_occur = min(word_counter.values())
- occur_range = max(1.0, max_occur - min_occur)
-
- repeated_words = []
- repeated_word_intensity = []
-
- for word in context_words:
- canon = word.strip().lower()
- count = word_counter[canon]
- repeated = float(count > 1 and canon not in ARTICLES and
- canon not in string.punctuation)
- intensity = float((count - min_occur) / occur_range)
-
- repeated_words.append(repeated)
- repeated_word_intensity.append(intensity)
-
- return repeated_words, repeated_word_intensity
-
-
-def convert_example_to_indices(example, outfile, vocab):
- print("Processing {}".format(outfile))
- question, context, answers, qa_id = example
-
- tokenized = tokenize_example(question, context, answers, strip_labels=True)
- token_question, token_context, ans_sent, ans_start, ans_end = tokenized
-
- # Convert to indices
- question_idxs = [vocab.word_to_idx(normalize(w)) for w in token_question]
-
- # + 1 for end of sentence
- sent_lengths = [len(sent) + 1 for sent in token_context]
- context_idxs = []
- for sent in token_context:
- for w in sent:
- context_idxs.append(vocab.word_to_idx(normalize(w)))
- context_idxs.append(vocab.eos)
-
- same_as_question = same_as_question_feature(question_idxs, context_idxs,
- vocab)
-
- repeated_words, repeated_intensity = repeated_word_features(context_idxs,
- vocab)
-
- features = {
- "question": question_idxs,
- "context": context_idxs,
- "ans_sentence": ans_sent,
- "ans_start": ans_start,
- "ans_end": ans_end,
- "sent_lengths": sent_lengths,
- "same_as_question_word": same_as_question,
- "repeated_words": repeated_words,
- "repeated_intensity": repeated_intensity,
- "qa_id": qa_id
- }
-
- # Hack!: This is not a great way to save indices...
- with io.open(outfile, "w", encoding="utf8") as handle:
- handle.write(unicode(json.dumps(features, ensure_ascii=False)))
-
-
-def featurize_example(question, context, vocab):
- # Convert to indices
- question_idxs = [
- vocab.word_to_idx(normalize(w))
- for w in ciseau.tokenize(
- question, normalize_ascii=False)
- ]
-
- context_sents = ciseau.sent_tokenize(
- context, keep_whitespace=True, normalize_ascii=False)
- # + 1 for end of sentence
- sent_lengths = [len(sent) + 1 for sent in context_sents]
- context_idxs = []
- for sent in context_sents:
- for w in sent:
- context_idxs.append(vocab.word_to_idx(normalize(w)))
- context_idxs.append(vocab.eos)
-
- same_as_question = same_as_question_feature(question_idxs, context_idxs,
- vocab)
- repeated_words, repeated_intensity = repeated_word_features(context_idxs,
- vocab)
-
- return (question_idxs, context_idxs, same_as_question, repeated_words,
- repeated_intensity, sent_lengths), context_sents
-
-
-def random_sample(data, k, replace=False):
- indices = np.arange(len(data))
- chosen_indices = np.random.choice(indices, k, replace=replace)
- return [data[idx] for idx in chosen_indices]
-
-
-@click.command()
-@click.option("--datadir", type=str, help="Path to raw data")
-@click.option("--outdir", type=str, help="Path to save the result")
-@click.option("--glove-path", default="/mnt/data/jmiller/glove.840B.300d.txt")
-def preprocess(datadir, outdir, glove_path):
- if not os.path.exists(outdir):
- os.makedirs(outdir)
-
- print("Constructing vocabularies...")
- vocab = build_vocabulary(datadir, outdir, glove_path)
- print("Finished...")
-
- print("Building word embedding matrix...")
- vocab.construct_embedding_matrix(glove_path)
- print("Finished...")
-
- # Create training featurizations
- for split in splits:
- results_path = os.path.join(outdir, split)
- os.makedirs(results_path)
-
- # process each example
- examples = list(data_stream(os.path.join(datadir, split + ".json")))
-
- for idx, example in enumerate(examples):
- outfile = os.path.join(results_path, str(idx) + ".json")
- convert_example_to_indices(example, outfile, vocab)
-
- print("Building evaluation featurization...")
- eval_feats = []
- for question, context, _, qa_id in data_stream(
- os.path.join(datadir, "dev.json")):
- features, tokenized_context = featurize_example(question, context,
- vocab)
- eval_feats.append((qa_id, tokenized_context, features))
-
- with io.open(
- os.path.join(outdir, "eval.json"), "w", encoding="utf8") as handle:
- handle.write(unicode(json.dumps(eval_feats, ensure_ascii=False)))
-
-
-if __name__ == "__main__":
- preprocess()
diff --git a/legacy/globally_normalized_reader/infer.py b/legacy/globally_normalized_reader/infer.py
deleted file mode 100644
index 397cb8ce9497e23919eb3b5752b180f4c67ce394..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/infer.py
+++ /dev/null
@@ -1,213 +0,0 @@
-#coding=utf-8
-
-import os
-import sys
-import argparse
-import gzip
-import logging
-import numpy as np
-
-import paddle.v2 as paddle
-from paddle.v2.layer import parse_network
-import reader
-
-from model import GNR
-from train import choose_samples
-from config import ModelConfig
-from beam_decoding import BeamDecoding
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def parse_cmd():
- """
-    Build the command line arguments parser for the inference task.
- """
- parser = argparse.ArgumentParser(
- description="Globally Normalized Reader in PaddlePaddle.")
- parser.add_argument(
- "--model_path",
- required=True,
- type=str,
- help="Path of the trained model to evaluate.",
- default="")
- parser.add_argument(
- "--data_dir",
- type=str,
- required=True,
- help="Path of the training and testing data.",
- default="")
- parser.add_argument(
- "--batch_size",
- type=int,
- required=False,
- help="The batch size for inferring.",
- default=1)
- parser.add_argument(
- "--use_gpu",
- type=int,
- required=False,
- help="Whether to run the inferring on GPU.",
- default=0)
- parser.add_argument(
- "--trainer_count",
- type=int,
- required=False,
- help=("The thread number used in inferring. When set "
- "use_gpu=True, the trainer_count cannot excess "
- "the gpu device number in your computer."),
- default=1)
- return parser.parse_args()
-
-
-def load_reverse_dict(dict_file):
- """ Build the dict which is used to map the word index to word string.
-
- The keys are word index and the values are word strings.
-
- Arguments:
- - dict_file: The path of a word dictionary.
- """
- word_dict = {}
- with open(dict_file, "r") as fin:
- for idx, line in enumerate(fin):
- word_dict[idx] = line.strip()
- return word_dict
-
-
-def print_result(test_batch, predicted_ans, ids_2_word, print_top_k=1):
- """ Print the readable predicted answers.
-
- Format of the output:
- query:\tthe input query.
- documents:\n
- 0\tthe first sentence in the document.
- 1\tthe second sentence in the document.
- ...
- gold:\t[i j k] the answer words.
- (i: the sentence index;
- j: the start span index;
- k: the end span index)
- top answers:
- score0\t[i j k] the answer with the highest score.
- score1\t[i j k] the answer with the second highest score.
-        (i, j, k have the same meaning as in gold.)
- ...
-
- By default, top 10 answers will be printed.
-
- Arguments:
- - test_batch: A test batch returned by reader.
- - predicted_ans: The beam decoding results.
-        - ids_2_word: The dict whose keys are word indices and whose values
-                      are word strings.
- - print_top_k: Indicating how many answers will be printed.
- """
-
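-    # Each sample follows the layout produced by reader.py: (question ids,
-    # per-sentence token ids, same-as-question flags, answer sentence index,
-    # answer start index, answer end offset relative to the start).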
- for i, sample in enumerate(test_batch):
- query_words = [ids_2_word[ids] for ids in sample[0]]
- print("query:\t%s" % (" ".join(query_words)))
-
- print("documents:")
- for j, sen in enumerate(sample[1]):
- sen_words = [ids_2_word[ids] for ids in sen]
- start = sample[4]
- end = sample[4] + sample[5] + 1
- print("%d\t%s" % (j, " ".join(sen_words)))
- print("gold:\t[%d %d %d] %s" % (
- sample[3], sample[4], sample[5], " ".join(
- [ids_2_word[ids] for ids in sample[1][sample[3]][start:end]])))
-
- print("top answers:")
- for k in range(print_top_k):
- label = predicted_ans[i][k]["label"]
- start = label[1]
- end = label[1] + label[2] + 1
- ans_words = [
- ids_2_word[ids] for ids in sample[1][label[0]][start:end]
- ]
- print("%.4f\t[%d %d %d] %s" %
- (predicted_ans[i][k]["score"], label[0], label[1], label[2],
- " ".join(ans_words)))
- print("\n")
-
-
-def infer_a_batch(inferer, test_batch, ids_2_word, out_layer_count):
- """ Call the PaddlePaddle's infer interface to infer by batch.
-
- Arguments:
- - inferer: The PaddlePaddle Inference object.
- - test_batch: A test batch returned by reader.
-        - ids_2_word: The dict whose keys are word indices and whose values
-                      are word strings.
- - out_layer_count: The number of output layers in the inferring process.
- """
-
- outs = inferer.infer(input=test_batch, flatten_result=False, field="value")
- decoder = BeamDecoding([sample[1] for sample in test_batch], *outs)
- print_result(test_batch, decoder.decoding(), ids_2_word, print_top_k=10)
-
-
-def infer(model_path,
- data_dir,
- batch_size,
- config,
- use_gpu=False,
- trainer_count=1):
- """ The inferring process.
-
- Arguments:
- - model_path: The path of trained model.
- - data_dir: The directory path of test data.
- - batch_size: The batch_size.
- - config: The model configuration.
- - use_gpu: Whether to run the inferring on GPU.
- - trainer_count: The thread number used in inferring. When set
-                     use_gpu=True, the trainer_count cannot exceed
- the gpu device number in your computer.
- """
-
- assert os.path.exists(model_path), "The model does not exist."
- paddle.init(use_gpu=use_gpu, trainer_count=trainer_count)
-
- ids_2_word = load_reverse_dict(config.dict_path)
-
- outputs = GNR(config, is_infer=True)
-
- # load the trained models
- parameters = paddle.parameters.Parameters.from_tar(
- gzip.open(model_path, "r"))
- logger.info("loading parameter is done.")
-
- inferer = paddle.inference.Inference(
- output_layer=outputs, parameters=parameters)
-
- _, valid_samples = choose_samples(data_dir)
- test_reader = reader.data_reader(valid_samples, is_train=False)
-
- test_batch = []
- for i, item in enumerate(test_reader()):
- test_batch.append(item)
- if len(test_batch) == batch_size:
- infer_a_batch(inferer, test_batch, ids_2_word, len(outputs))
- test_batch = []
-
- if len(test_batch):
- infer_a_batch(inferer, test_batch, ids_2_word, len(outputs))
- test_batch = []
-
-
-def main(args):
- infer(
- model_path=args.model_path,
- data_dir=args.data_dir,
- batch_size=args.batch_size,
- config=ModelConfig,
- use_gpu=args.use_gpu,
- trainer_count=args.trainer_count)
-
-
-if __name__ == "__main__":
- args = parse_cmd()
- main(args)
diff --git a/legacy/globally_normalized_reader/model.py b/legacy/globally_normalized_reader/model.py
deleted file mode 100644
index 5ca93808d2762cf038bd1c64d1bcae25d1c08e75..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/model.py
+++ /dev/null
@@ -1,324 +0,0 @@
-#!/usr/bin/env python
-#coding=utf-8
-
-import paddle.v2 as paddle
-from paddle.v2.layer import parse_network
-import basic_modules
-from config import ModelConfig
-
-__all__ = ["GNR"]
-
-
-def build_pretrained_embedding(name, data_type, emb_dim, emb_drop=0.):
- """create word a embedding layer which loads pre-trained embeddings.
-
- Arguments:
- - name: The name of the data layer which accepts one-hot input.
- - data_type: PaddlePaddle's data type for data layer.
-        - emb_dim: The dimension of the word embeddings.
-        - emb_drop: The dropout rate of the embedding layer.
-    """
-
- return paddle.layer.embedding(
- input=paddle.layer.data(
- name=name, type=data_type),
- size=emb_dim,
- param_attr=paddle.attr.Param(
- name="GloveVectors", is_static=True),
- layer_attr=paddle.attr.ExtraLayerAttribute(drop_rate=emb_drop), )
-
-
-def encode_question(input_embedding,
- lstm_hidden_dim,
- depth,
- passage_indep_embedding_dim,
- prefix=""):
- """build question encoding by using bidirectional LSTM.
-
-    Each question word is encoded by running a stack of bidirectional LSTMs
-    over the word embeddings in the question, producing hidden states. The
-    hidden states are used to compute a passage-independent question embedding.
-
- The final question encoding is constructed by concatenating the final
- hidden states of the forward and backward LSTMs and the passage-independent
- embedding.
-
- Arguments:
- - input_embedding: The question word embeddings.
- - lstm_hidden_dim: The dimension of bi-directional LSTM.
- - depth: The depth of stacked bi-directional LSTM.
- - passage_indep_embedding_dim: The dimension of passage-independent
- embedding.
-        - prefix: A string which will be added to the name of each layer
-                  created in this function. Each layer in a network must
-                  have a unique name; the prefix makes it possible to call
-                  this function multiple times.
- """
- # stacked bi-directional LSTM to process question embeddings.
- lstm_final, lstm_outs = basic_modules.stacked_bidirectional_lstm(
- input_embedding, lstm_hidden_dim, depth, 0., prefix)
-
- # compute passage-independent embeddings.
- candidates = paddle.layer.fc(input=lstm_outs,
- bias_attr=False,
- size=passage_indep_embedding_dim,
- act=paddle.activation.Linear())
- weights = paddle.layer.fc(input=lstm_outs,
- size=1,
- bias_attr=False,
- act=paddle.activation.SequenceSoftmax())
- weighted_candidates = paddle.layer.scaling(input=candidates, weight=weights)
- passage_indep_embedding = paddle.layer.pooling(
- input=weighted_candidates, pooling_type=paddle.pooling.Sum())
-
- return paddle.layer.concat(
- input=[lstm_final, passage_indep_embedding]), lstm_outs
-
-
-def question_aligned_passage_embedding(question_lstm_outs, document_embeddings,
- passage_aligned_embedding_dim):
- """create question aligned passage embedding.
-
- Arguments:
- - question_lstm_outs: The dimension of output of LSTM that process
- question word embedding.
- - document_embeddings: The document embeddings.
- - passage_aligned_embedding_dim: The dimension of passage aligned
- embedding.
- """
-
- def outer_sentence_step(document_embeddings, question_lstm_outs,
- passage_aligned_embedding_dim):
- """step function for PaddlePaddle's recurrent_group.
-
- In this function, the original input document_embeddings are scattered
- from nested sequence into sequence by recurrent_group in PaddlePaddle.
- The step function iterates over each sentence in the document.
-
- Arguments:
- - document_embeddings: The word embeddings of the document.
-            - question_lstm_outs: The outputs of the LSTM that processes
-                                  the question word embeddings.
- - passage_aligned_embedding_dim: The dimension of passage aligned
- embedding.
- """
-
- def inner_word_step(word_embedding, question_lstm_outs,
- question_outs_proj, passage_aligned_embedding_dim):
- """
- In this recurrent_group, sentence embedding has been scattered into
- word embeddings. The step function iterates over each word in one
- sentence in the document.
-
- Arguments:
- - word_embedding: The word embeddings of documents.
-                - question_lstm_outs: The outputs of the LSTM that
-                                      processes the question word embeddings.
- - question_outs_proj: The projection of question_lstm_outs
- into a new hidden space.
- - passage_aligned_embedding_dim: The dimension of passage
- aligned embedding.
- """
-
- doc_word_expand = paddle.layer.expand(
- input=word_embedding,
- expand_as=question_lstm_outs,
- expand_level=paddle.layer.ExpandLevel.FROM_NO_SEQUENCE)
-
- weights = paddle.layer.fc(
- input=[question_lstm_outs, doc_word_expand],
- size=1,
- bias_attr=False,
- act=paddle.activation.SequenceSoftmax())
- weighted_candidates = paddle.layer.scaling(
- input=question_outs_proj, weight=weights)
- return paddle.layer.pooling(
- input=weighted_candidates, pooling_type=paddle.pooling.Sum())
-
- question_outs_proj = paddle.layer.fc(input=question_lstm_outs,
- bias_attr=False,
- size=passage_aligned_embedding_dim)
- return paddle.layer.recurrent_group(
- input=[
- paddle.layer.SubsequenceInput(document_embeddings),
- paddle.layer.StaticInput(question_lstm_outs),
- paddle.layer.StaticInput(question_outs_proj),
- passage_aligned_embedding_dim,
- ],
- step=inner_word_step,
- name="iter_over_word")
-
- return paddle.layer.recurrent_group(
- input=[
- paddle.layer.SubsequenceInput(document_embeddings),
- paddle.layer.StaticInput(question_lstm_outs),
- passage_aligned_embedding_dim
- ],
- step=outer_sentence_step,
- name="iter_over_sen")
-
-
-def encode_documents(input_embedding, same_as_question, question_vector,
- question_lstm_outs, passage_indep_embedding_dim, prefix):
- """Build the final question-aware document embeddings.
-
-    Each word in the document is represented as the concatenation of its word
-    vector, the question vector, boolean features indicating whether the word
-    appears in the question or is repeated, and a question-aligned embedding.
-
-
- Arguments:
- - input_embedding: The word embeddings of the document.
- - same_as_question: The boolean features indicating if a word appears
- in the question or is repeated.
-        - question_vector: The final question encoding.
-        - question_lstm_outs: The outputs of the LSTM that processes the
-                              question word embeddings.
-        - passage_indep_embedding_dim: The dimension of passage independent
-                                       embedding.
-        - prefix: The prefix which will be added to the name of each layer
-                  in this function.
- """
-
- question_expanded = paddle.layer.expand(
- input=question_vector,
- expand_as=input_embedding,
- expand_level=paddle.layer.ExpandLevel.FROM_NO_SEQUENCE)
- question_aligned_embedding = question_aligned_passage_embedding(
- question_lstm_outs, input_embedding, passage_indep_embedding_dim)
- return paddle.layer.concat(input=[
- input_embedding, question_expanded, same_as_question,
- question_aligned_embedding
- ])
-
-
-def search_answer(doc_lstm_outs, sentence_idx, start_idx, end_idx, config,
- is_infer):
- """Search the answer from the document.
-
-    The search begins by selecting target sequences from a nested sequence
-    using paddle.layer.kmax_seq_score and paddle.layer.sub_nested_seq_layer.
-    In this first search step, the top beam-size sequences with the highest
-    scores, the indices of these top-k sequences in the original nested
-    sequence, and the ground truth (also called gold) together form a triple
-    that makes up the first beam.
-
-    Then, start and end positions are searched. In each of these searches,
-    the top-k positions with the highest scores are selected, and the
-    sub-sequences running from the selected starts to the ends of the
-    sentences are sliced out with paddle.layer.seq_slice for the next
-    search step.
-
-    Finally, paddle.layer.cross_entropy_over_beam takes all the beam
-    expansions, which contain the candidate targets found along the
-    three-step search, and computes the cross entropy over the expanded
-    beams, using all the candidates in the beam as the normalization factor.
-
-    Note that if the gold falls off the beam at search step t, the cost is
-    calculated over the beam at step t.
-
- Arguments:
-        - doc_lstm_outs: The outputs of the LSTM that processes the document
-                         words.
- - sentence_idx: Ground-truth indicating sentence index of the answer
- in the document.
- - start_idx: Ground-truth indicating start span index of the answer
- in the sentence.
- - end_idx: Ground-truth indicating end span index of the answer
- in the sentence.
-        - config: The model configuration.
-        - is_infer: The boolean parameter indicating inferring or training.
- """
-
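-    # beam step 1: score each sentence by the last LSTM state of that
-    # sentence and keep the top beam_size sentences as the first beam.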
- last_state_of_sentence = paddle.layer.last_seq(
- input=doc_lstm_outs, agg_level=paddle.layer.AggregateLevel.TO_SEQUENCE)
- sentence_scores = paddle.layer.fc(input=last_state_of_sentence,
- size=1,
- bias_attr=False,
- act=paddle.activation.Linear())
- topk_sentence_ids = paddle.layer.kmax_seq_score(
- input=sentence_scores, beam_size=config.beam_size)
- topk_sen = paddle.layer.sub_nested_seq(
- input=doc_lstm_outs, selected_indices=topk_sentence_ids)
-
- # expand beam to search start positions on selected sentences
- start_pos_scores = paddle.layer.fc(
- input=topk_sen,
- size=1,
- layer_attr=paddle.attr.ExtraLayerAttribute(
- error_clipping_threshold=5.0),
- bias_attr=False,
- act=paddle.activation.Linear())
- topk_start_pos_ids = paddle.layer.kmax_seq_score(
- input=start_pos_scores, beam_size=config.beam_size)
- topk_start_spans = paddle.layer.seq_slice(
- input=topk_sen, starts=topk_start_pos_ids, ends=None)
-
- # expand beam to search end positions on selected start spans
- _, end_span_embedding = basic_modules.stacked_bidirectional_lstm(
- topk_start_spans, config.lstm_hidden_dim, config.lstm_depth,
- config.lstm_hidden_droprate, "__end_span_embeddings__")
- end_pos_scores = paddle.layer.fc(input=end_span_embedding,
- size=1,
- bias_attr=False,
- act=paddle.activation.Linear())
- topk_end_pos_ids = paddle.layer.kmax_seq_score(
- input=end_pos_scores, beam_size=config.beam_size)
-
- if is_infer:
- return [
- sentence_scores, topk_sentence_ids, start_pos_scores,
- topk_start_pos_ids, end_pos_scores, topk_end_pos_ids
- ]
- else:
- return paddle.layer.cross_entropy_over_beam(input=[
- paddle.layer.BeamInput(sentence_scores, topk_sentence_ids,
- sentence_idx),
- paddle.layer.BeamInput(start_pos_scores, topk_start_pos_ids,
- start_idx),
- paddle.layer.BeamInput(end_pos_scores, topk_end_pos_ids, end_idx)
- ])
-
-
-def GNR(config, is_infer=False):
- """Build the globally normalized reader model.
-
- Arguments:
- - config: The model configuration.
- - is_infer: The boolean parameter indicating inferring or training.
- """
-
- # encode question words
- question_embeddings = build_pretrained_embedding(
- "question",
- paddle.data_type.integer_value_sequence(config.vocab_size),
- config.embedding_dim, config.embedding_droprate)
- question_vector, question_lstm_outs = encode_question(
- question_embeddings, config.lstm_hidden_dim, config.lstm_depth,
- config.passage_indep_embedding_dim, "__ques")
-
- # encode document words
- document_embeddings = build_pretrained_embedding(
- "documents",
- paddle.data_type.integer_value_sub_sequence(config.vocab_size),
- config.embedding_dim, config.embedding_droprate)
- same_as_question = paddle.layer.data(
- name="same_as_question",
- type=paddle.data_type.dense_vector_sub_sequence(1))
-
-    document_words_encoding = encode_documents(
- document_embeddings, same_as_question, question_vector,
- question_lstm_outs, config.passage_indep_embedding_dim, "__doc")
-
- doc_lstm_outs = basic_modules.stacked_bidirectional_lstm_by_nested_seq(
-        document_words_encoding, config.lstm_depth, config.lstm_hidden_dim,
- "__doc_lstm")
-
- # search the answer.
- sentence_idx = paddle.layer.data(
- name="sen_idx", type=paddle.data_type.integer_value(1))
- start_idx = paddle.layer.data(
- name="start_idx", type=paddle.data_type.integer_value(1))
- end_idx = paddle.layer.data(
- name="end_idx", type=paddle.data_type.integer_value(1))
- return search_answer(doc_lstm_outs, sentence_idx, start_idx, end_idx,
- config, is_infer)
-
-
-if __name__ == "__main__":
- print(parse_network(GNR(ModelConfig)))
diff --git a/legacy/globally_normalized_reader/reader.py b/legacy/globally_normalized_reader/reader.py
deleted file mode 100644
index c6642aa9242ebebdc758a44d6c1d09b5291f73e7..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/reader.py
+++ /dev/null
@@ -1,44 +0,0 @@
-#!/usr/bin/env python
-#coding=utf-8
-
-import os
-import random
-import json
-import logging
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def data_reader(data_list, is_train=True):
- """ Data reader.
-
- Arguments:
- - data_list: A python list which contains path of training samples.
- - is_train: A boolean parameter indicating this function is called
- in training or in inferring.
- """
-
- def reader():
- """shuffle the data list again at the begining of every pass"""
- if is_train:
- random.shuffle(data_list)
-
- for train_sample in data_list:
- data = json.load(open(train_sample, "r"))
-
- start_pos = 0
- doc = []
- same_as_question_word = []
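-            # Split the flat context ids back into per-sentence lists; each
-            # same_as_question flag is wrapped as [[x]] to match the
-            # dense_vector_sub_sequence input format expected by the model.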
- for l in data['sent_lengths']:
- doc.append(data['context'][start_pos:start_pos + l])
- same_as_question_word.append([
- [[x]] for x in data['same_as_question_word']
- ][start_pos:start_pos + l])
- start_pos += l
-
- yield (data['question'], doc, same_as_question_word,
- data['ans_sentence'], data['ans_start'],
- data['ans_end'] - data['ans_start'])
-
- return reader
diff --git a/legacy/globally_normalized_reader/train.py b/legacy/globally_normalized_reader/train.py
deleted file mode 100644
index 19c7d0d28033f15ece1e77d453aaf45d76fa8aa0..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/train.py
+++ /dev/null
@@ -1,238 +0,0 @@
-#coding=utf-8
-
-from __future__ import print_function
-
-import os
-import sys
-import logging
-import random
-import glob
-import gzip
-import numpy as np
-
-import reader
-import paddle.v2 as paddle
-from paddle.v2.layer import parse_network
-from model import GNR
-from config import ModelConfig, TrainerConfig
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def load_initial_model(model_path, parameters):
- """ Initalize parameters in the network from a trained model.
-
- This is useful in resuming the training from previously saved models.
-
- Arguments:
- - model_path: The path of a trained model.
- - parameters: The parameters in a network which will be initialized
- from the specified model.
- """
- with gzip.open(model_path, "rb") as f:
- parameters.init_from_tar(f)
-
-
-def load_pretrained_parameters(path):
- """ Load one pre-trained parameter.
-
- Arguments:
- - path: The path of the pre-trained parameter.
- """
- return np.load(path)
-
-
-def save_model(trainer, save_path, parameters):
- """ Save the trained parameters.
-
-    Arguments:
-        - trainer: The trainer object whose parameters will be saved.
-        - save_path: The path to save the trained parameters.
-        - parameters: The trained model parameters.
- """
- with gzip.open(save_path, "w") as f:
- trainer.save_parameter_to_tar(f)
-
-
-def show_parameter_init_info(parameters):
- """ Print the information of initialization mean and std of parameters.
-
- Arguments:
- - parameters: The parameters created in a model.
- """
- for p in parameters:
- logger.info("%s : initial_mean %.4f initial_std %.4f" %
- (p, parameters.__param_conf__[p].initial_mean,
- parameters.__param_conf__[p].initial_std))
-
-
-def show_parameter_status(parameters):
- """ Print some statistical information of parameters in a network.
-
- This is used for debugging the model.
-
- Arguments:
- - parameters: The parameters created in a model.
- """
- for p in parameters:
-
- value = parameters.get(p)
- grad = parameters.get_grad(p)
-
- avg_abs_value = np.average(np.abs(value))
- avg_abs_grad = np.average(np.abs(grad))
-
- logger.info(
- ("%s avg_abs_value=%.6f avg_abs_grad=%.6f "
- "min_value=%.6f max_value=%.6f min_grad=%.6f max_grad=%.6f") %
- (p, avg_abs_value, avg_abs_grad, value.min(), value.max(),
- grad.min(), grad.max()))
-
-
-def choose_samples(path):
- """Load filenames for train, dev, and augmented samples.
-
- Arguments:
- - path: The path of training data.
- """
- if not os.path.exists(os.path.join(path, "train")):
- print(
- "Non-existent directory as input path: {}".format(path),
- file=sys.stderr)
- sys.exit(1)
-
- # Get paths to all samples that we want to load.
- train_samples = glob.glob(os.path.join(path, "train", "*"))
- valid_samples = glob.glob(os.path.join(path, "dev", "*"))
-
- train_samples.sort()
- valid_samples.sort()
-
- random.shuffle(train_samples)
-
- return train_samples, valid_samples
-
-
-def build_reader(data_dir, batch_size):
- """Build the data reader for this model.
-
- Arguments:
- - data_dir: The path of training data.
- - batch_size: batch size for the training task.
- """
- train_samples, valid_samples = choose_samples(data_dir)
-
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.data_reader(train_samples), buf_size=102400),
- batch_size=batch_size)
-
- # testing data is not shuffled
- test_reader = paddle.batch(
- reader.data_reader(
- valid_samples, is_train=False),
- batch_size=batch_size)
- return train_reader, test_reader, len(train_samples)
-
-
-def build_event_handler(config, parameters, trainer):
- """Build the event handler for this model.
-
- Arguments:
- - config: The training task configuration for this model.
- - parameters: The parameters in the network.
- - trainer: The trainer object.
- """
-
- # End batch and end pass event handler
- def event_handler(event):
- """The event handler."""
- """
- To print the statistical information of gradients of any learnable
- parameter, the event: EndForwardBackward rather than EndIteration
- should be handled. For the reason that parameter gradients will be
- reset to zeros when EndIteration event happens in GPU training.
- """
- if config.show_parameter_status_period and \
- isinstance(event, paddle.event.EndForwardBackward):
- if not event.batch_id % config.show_parameter_status_period:
- show_parameter_status(parameters)
-
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id and not event.batch_id % config.checkpoint_period:
- save_path = os.path.join(config.save_dir,
- "checkpoint_param.latest.tar.gz")
- save_model(trainer, save_path, parameters)
-
- if not event.batch_id % config.log_period:
- logger.info("Pass %d, Batch %d, Cost %f" %
- (event.pass_id, event.batch_id, event.cost))
-
- if isinstance(event, paddle.event.EndPass):
- save_path = os.path.join(config.save_dir,
- "pass_%05d.tar.gz" % event.pass_id)
- save_model(trainer, save_path, parameters)
-
- return event_handler
-
-
-def train(model_config, trainer_config):
- """Training the GNR model.
-
- Arguments:
-        - model_config: The model configuration for this model.
- - trainer_config: The training task configuration for this model.
- """
-
- if not os.path.exists(trainer_config.save_dir):
- os.mkdir(trainer_config.save_dir)
-
- paddle.init(
- use_gpu=trainer_config.use_gpu,
- trainer_count=trainer_config.trainer_count)
-
- train_reader, test_reader, train_sample_count = build_reader(
- trainer_config.data_dir, trainer_config.train_batch_size)
- """
- Define the optimizer. The learning rate will decrease according to
- the following formula:
-
- lr = learning_rate * pow(learning_rate_decay_a,
- floor(num_samples_processed /
- learning_rate_decay_b))
- """
- optimizer = paddle.optimizer.Adam(
- learning_rate=trainer_config.learning_rate,
- gradient_clipping_threshold=trainer_config.gradient_clipping_threshold,
- regularization=paddle.optimizer.L2Regularization(
- trainer_config.l2_decay_rate),
- learning_rate_decay_a=0.5,
- learning_rate_decay_b=train_sample_count,
- learning_rate_schedule="discexp")
-
- # define network topology
- loss = GNR(model_config)
-
- parameters = paddle.parameters.create(loss)
-
- if trainer_config.init_model_path:
- load_initial_model(trainer_config.init_model_path, parameters)
- else:
- show_parameter_init_info(parameters)
- parameters.set(
- "GloveVectors",
- load_pretrained_parameters(ModelConfig.pretrained_emb_path))
-
- trainer = paddle.trainer.SGD(cost=loss,
- parameters=parameters,
- update_equation=optimizer)
-
- event_handler = build_event_handler(trainer_config, parameters, trainer)
- trainer.train(
- reader=train_reader,
- num_passes=trainer_config.epochs,
- event_handler=event_handler)
-
-
-if __name__ == "__main__":
- train(ModelConfig, TrainerConfig)
diff --git a/legacy/globally_normalized_reader/vocab.py b/legacy/globally_normalized_reader/vocab.py
deleted file mode 100644
index 345ec4fc5145fc72fa0f9d87091b4733a5c1bf59..0000000000000000000000000000000000000000
--- a/legacy/globally_normalized_reader/vocab.py
+++ /dev/null
@@ -1,284 +0,0 @@
-# -*- coding: utf-8 -*-
-import os
-import io
-import numpy as np
-
-# Constants
-UNK = ""
-SOS = ""
-EOS = ""
-PAD = ""
-VOCAB_DIM = 2196017
-EMBEDDING_DIM = 300
-WORD2VEC = None
-
-
-class Vocab(object):
- """Class to hold the vocabulary for the SquadDataset."""
-
- def __init__(self, path):
- self._id_to_word = []
- self._word_to_id = {}
- self._word_ending_tables = {}
- self._path = path
- self._pad = -1
- self._unk = None
- self._sos = None
- self._eos = None
-
- # first read in the base vocab
- with io.open(os.path.join(path, "vocab.txt"), "r") as f:
- for idx, line in enumerate(f):
- word_name = line.strip()
- if word_name == UNK:
- self._unk = idx
- elif word_name == SOS:
- self._sos = idx
- elif word_name == EOS:
- self._eos = idx
-
- self._id_to_word.append(word_name)
- self._word_to_id[word_name] = idx
-
- @property
- def unk(self):
- return self._unk
-
- @property
- def sos(self):
- return self._sos
-
- @property
- def eos(self):
- return self._eos
-
- @property
- def size(self):
- return len(self._id_to_word)
-
- def word_to_idx(self, word):
- if word in self._word_to_id:
- return self._word_to_id[word]
- return self.unk
-
- def idx_to_word(self, idx):
- if idx == self._pad:
- return PAD
- if idx < self.size:
- return self._id_to_word[idx]
- return "ERROR"
-
- def decode(self, idxs):
- return " ".join([self.idx_to_word(idx) for idx in idxs])
-
- def encode(self, sentence):
- return [self.word_to_idx(word) for word in sentence]
-
- @property
- def word_embeddings(self):
- embedding_path = os.path.join(self._path, "embeddings.npy")
- embeddings = np.load(embedding_path)
- return embeddings
-
- def construct_embedding_matrix(self, glove_path):
- # Randomly initialize word embeddings
- embeddings = np.random.randn(self.size,
- EMBEDDING_DIM).astype(np.float32)
-
- load_word_vectors(
- param=embeddings,
- vocab=self._id_to_word,
- path=glove_path,
- missing_word_alternative=missing_word_heuristic,
- missing_word_value=lambda: 0.0)
- embedding_path = os.path.join(self._path, "embeddings.npy")
- np.save(embedding_path, embeddings)
-
-
-def missing_word_heuristic(word, word2vec):
- """
- propose alternate spellings of a word to match against
- pretrained word vectors (so that if the original spelling
- has no pretrained vector, but alternate spelling does,
- a vector can be retrieved anyways.)
- """
- if len(word) > 5:
- # try to find similar words that share
- # the same 5 character ending:
- most_sim = word2vec.words_ending_in(word[-5:])
-
- if len(most_sim) > 0:
- most_sim = sorted(
- most_sim,
- reverse=True,
- key=lambda x: (
- (word[0].isupper() == x[0].isupper()) +
- (word.lower()[:3] == x.lower()[:3]) +
- (word.lower()[:4] == x.lower()[:4]) +
- (abs(len(word) - len(x)) < 5)
- )
- )
- return most_sim[:1]
- if all(not c.isalpha() for c in word):
- # this is a fully numerical answer (and non alpha)
- return ['13', '9', '100', '2.0']
-
- return [
- # add a capital letter
- word.capitalize(),
- # see if word has spurious period
- word.split(".")[0],
- # see if word has spurious backslash
- word.split("/")[0],
- # see if word has spurious parenthesis
- word.split(")")[0],
- word.split("(")[0]
- ]
-
-
-class Word2Vec(object):
- """
- Load word2vec result from file
- """
-
- def __init__(self, vocab_size, vector_size):
- self.syn0 = np.zeros((vocab_size, vector_size), dtype=np.float32)
- self.index2word = []
- self.vocab_size = vocab_size
- self.vector_size = vector_size
-
- def load_word2vec_format(self, path):
- with io.open(path, "r") as fin:
- for word_id in range(self.vocab_size):
- line = fin.readline()
- parts = line.rstrip("\n").rstrip().split(" ")
- if len(parts) != self.vector_size + 1:
- raise ValueError("invalid vector on line {}".format(
- word_id))
- word, weights = parts[0], [np.float32(x) for x in parts[1:]]
- self.syn0[word_id] = weights
- self.index2word.append(word)
- return self
-
-
-class FastWord2vec(object):
- """
- Load word2vec model, cache the embedding matrix using numpy
- and memory-map it so that future loads are fast.
- """
-
- def __init__(self, path):
- if not os.path.exists(path + ".npy"):
- word2vec = Word2Vec(VOCAB_DIM,
- EMBEDDING_DIM).load_word2vec_format(path)
-
- # save as numpy
- np.save(path + ".npy", word2vec.syn0)
- # also save the vocab
- with io.open(path + ".vocab", "w", encoding="utf8") as fout:
- for word in word2vec.index2word:
- fout.write(word + "\n")
-
- self.syn0 = np.load(path + ".npy", mmap_mode="r")
- self.index2word = [l.strip("\n") for l in io.open(path + ".vocab", "r")]
- self.word2index = {word: k for k, word in enumerate(self.index2word)}
- self._word_ending_tables = {}
- self._word_beginning_tables = {}
-
- def __getitem__(self, key):
- return np.array(self.syn0[self.word2index[key]])
-
- def __contains__(self, key):
- return key in self.word2index
-
- def words_ending_in(self, word_ending):
- if len(word_ending) == 0:
- return self.index2word
- self._build_word_ending_table(len(word_ending))
- return self._word_ending_tables[len(word_ending)].get(word_ending, [])
-
- def _build_word_ending_table(self, length):
- if length not in self._word_ending_tables:
- table = {}
- for word in self.index2word:
- if len(word) >= length:
- ending = word[-length:]
- if ending not in table:
- table[ending] = [word]
- else:
- table[ending].append(word)
- self._word_ending_tables[length] = table
-
- def words_starting_in(self, word_beginning):
- if len(word_beginning) == 0:
- return self.index2word
- self._build_word_beginning_table(len(word_beginning))
- return self._word_beginning_tables[len(word_beginning)].get(
- word_beginning, [])
-
- def _build_word_beginning_table(self, length):
- if length not in self._word_beginning_tables:
- table = {}
-            for word in self.index2word:
- if len(word) >= length:
- ending = word[:length]
- if ending not in table:
- table[ending] = [word]
- else:
- table[ending].append(word)
- self._word_beginning_tables[length] = table
-
- @staticmethod
- def get(path):
- global WORD2VEC
- if WORD2VEC is None:
- WORD2VEC = FastWord2vec(path)
- return WORD2VEC
-
-
-def load_word_vectors(param,
- vocab,
- path,
- verbose=True,
- missing_word_alternative=None,
- missing_word_value=None):
- """
- Add the pre-trained word embeddings stored under path to the parameter
- matrix `param` that has size `vocab x embedding_dim`.
-    Arguments:
-        param : np.array
-        vocab : list
-        path : str, location of the pretrained word embeddings
-        verbose : (optional) bool, whether to print how
-                  many words were recovered
-        missing_word_alternative : (optional) callable returning a list of
-                  alternative spellings to try for a missing word
-        missing_word_value : (optional) callable producing the value used
-                  for words that remain missing
-    """
- word2vec = FastWord2vec.get(path)
- missing = 0
- for idx, word in enumerate(vocab):
- try:
- param[idx, :] = word2vec[word]
- except KeyError:
- try:
- param[idx, :] = word2vec[word.lower()]
- except KeyError:
- found = False
- if missing_word_alternative is not None:
- alternatives = missing_word_alternative(word, word2vec)
- if isinstance(alternatives, str):
- alternatives = [alternatives]
- assert (isinstance(alternatives, list)), (
- "missing_word_alternative should return a list of strings."
- )
- for alternative in alternatives:
- if alternative in word2vec:
- param[idx, :] = word2vec[alternative]
- found = True
- break
- if not found:
- if missing_word_value is not None:
- param[idx, :] = missing_word_value()
- missing += 1
- if verbose:
- print("Loaded {} words, {} missing".format(
- len(vocab) - missing, missing))
diff --git a/legacy/hsigmoid/.gitignore b/legacy/hsigmoid/.gitignore
deleted file mode 100644
index 29a9367f0e91889df8654ad4293f0649de2074f0..0000000000000000000000000000000000000000
--- a/legacy/hsigmoid/.gitignore
+++ /dev/null
@@ -1,3 +0,0 @@
-*.pyc
-models
-
diff --git a/legacy/hsigmoid/README.md b/legacy/hsigmoid/README.md
deleted file mode 100644
index 619fc190acbbbfc2f792f3274e4dfec0042d8c1c..0000000000000000000000000000000000000000
--- a/legacy/hsigmoid/README.md
+++ /dev/null
@@ -1,166 +0,0 @@
-The program examples in this directory require PaddlePaddle v0.10.0. If your installed version of PaddlePaddle is lower than this requirement, please update it by following the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html).
-
----
-
-# Accelerating Word Embedding Training with Hsigmoid
-## Background
-In natural language processing, words have traditionally been represented with one-hot vectors. For example, given the dictionary ['我', '你', '喜欢'], the three words can be represented by the vectors [1,0,0], [0,1,0] and [0,0,1] respectively. This representation is simple, but when the vocabulary is large it leads to dimension explosion, and since any two word vectors are orthogonal, each vector carries limited information. To avoid or mitigate these drawbacks, word embeddings are now commonly used instead: a low-dimensional dense real-valued vector replaces the high-dimensional sparse one-hot vector. There are many ways to train word embeddings, neural network models such as CBOW and Skip-gram among them. These models are essentially classifiers, and when the vocabulary, i.e. the number of classes, is large, the traditional softmax becomes very time-consuming. PaddlePaddle provides the Hsigmoid layer and the NCE layer to speed up training. This document focuses on how to use the Hsigmoid layer to accelerate training; for more on word embeddings, please refer to the [word2vec chapter](https://github.com/PaddlePaddle/book/tree/develop/04.word2vec) of the PaddlePaddle Book.
-
-## Hsigmoid Layer
-The Hsigmoid layer comes from the paper \[[1](#references)\]. Hsigmoid stands for hierarchical sigmoid; the idea is to reduce computational complexity by building a binary classification tree, in which each leaf node represents a class and each non-leaf node is a binary classifier. Suppose there are four classes 0, 1, 2 and 3: softmax computes a score for each of the four classes and then normalizes them into probabilities. When there are many classes, computing the probability of every class is very expensive, so the Hsigmoid layer instead builds a balanced binary tree over the classes, as follows:
-
-Figure 1. (a) a balanced binary tree; (b) the path from the root to class 1
-
-Each non-leaf node in the binary tree is a binary (sigmoid) classifier: if the bit is 0 we move to the left child, otherwise to the right child, until a leaf node is reached. In this way every class corresponds to one path; for example, the path from the root to class 1 is encoded as 0, 1. During training, we follow the path of the true class, compute the loss of each classifier along the path, and combine these losses into the final loss. At prediction time, the model outputs the probability of every non-leaf classifier; the path encoding can be derived from these probabilities, and walking the path yields the final predicted class. The computational complexity of the traditional softmax is N (N being the dictionary size), while Hsigmoid reduces it to log(N); for theoretical details please refer to the paper \[[1](#references)\].
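-
-As an illustration, here is a small sketch (not part of the original example) of how a class index maps to its left/right path code under the complete-binary-tree numbering that the decoding function in the prediction section below also uses, where the leaf for class `label` sits at heap position `label + dict_size`:
-
-```python
-def path_to_class(label, dict_size):
-    """Toy sketch: the root-to-leaf path code (0 = left, 1 = right) of a class."""
-    heap = label + dict_size       # 1-indexed heap position of the class leaf
-    bits = bin(heap)[3:]           # drop the '0b1' prefix; the rest is the path
-    return [int(b) for b in bits]  # length is about log2(dict_size)
-
-print(path_to_class(1, 4))         # [0, 1]: go left at the root, then right
-```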
-
-## Data preparation
-### PTB data
-This example uses the Penn Treebank (PTB) dataset ([Tomas Mikolov's preprocessed version](http://www.fit.vutbr.cz/~imikolov/rnnlm/simple-examples.tgz)), which consists of three files: train, valid and test. train is used as the training data and valid as the test data. We train a 5-gram model, i.e. the first 4 words of each sample are used to predict the 5th word. PaddlePaddle provides the python package [paddle.dataset.imikolov](https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/v2/dataset/imikolov.py) for the PTB dataset, which downloads and preprocesses the data automatically. Preprocessing prepends the start symbol `<s>` and appends the end symbol `<e>` to every sentence, then slides a window (of size 5 here) from left to right, producing one sample per step. For example, "I have a dream that one day" yields `<s> I have a dream`, `I have a dream that`, `have a dream that one`, `a dream that one day` and `dream that one day <e>`. PaddlePaddle converts the words into ids as the output of preprocessing.
-
-### Custom data
-You can train the model on your own dataset. The key to using a custom dataset is implementing a reader interface for data processing. The reader must produce an iterator that parses each line of the data file and returns a python list, e.g. [1, 2, 3, 4, 5], the dictionary ids of the five words in one sliding window. PaddlePaddle then converts this list into the `paddle.data_type.integer_value` type as the input of the data layer. A sample wrapper is shown below:
-
-```python
-def reader_creator(filename, word_dict, n):
- def reader():
- with open(filename) as f:
-            UNK = word_dict['<unk>']
- for l in f:
-                l = ['<s>'] + l.strip().split() + ['<e>']
- if len(l) >= n:
- l = [word_dict.get(w, UNK) for w in l]
- for i in range(n, len(l) + 1):
- yield tuple(l[i - n:i])
- return reader
-
-
-def train_data(filename, word_dict, n):
- """
- Reader interface for training data.
-
- It returns a reader creator, each sample in the reader is a word ID tuple.
-
- :param filename: path of data file
- :type filename: str
- :param word_dict: word dictionary
- :type word_dict: dict
- :param n: sliding window size
- :type n: int
- """
- return reader_creator(filename, word_dict, n)
-```
-
-## Network architecture
-This example obtains word embeddings by training an N-gram language model; concretely, the previous 4 words are used to predict the current word. The network input is the word ids in the dictionary; the word vectors are looked up from the embedding table, the 4 word vectors are concatenated and fed into a fully connected hidden layer, followed by the `Hsigmoid` layer. See Figure 2 for the detailed architecture:
-
-Figure 2. Network configuration
-
-The code is as follows:
-
-```python
-def ngram_lm(hidden_size, embed_size, dict_size, gram_num=4, is_train=True):
- emb_layers = []
- embed_param_attr = paddle.attr.Param(
- name="_proj", initial_std=0.001, learning_rate=1, l2_rate=0)
- for i in range(gram_num):
- word = paddle.layer.data(
- name="__word%02d__" % (i),
- type=paddle.data_type.integer_value(dict_size))
- emb_layers.append(
- paddle.layer.embedding(
- input=word, size=embed_size, param_attr=embed_param_attr))
-
- target_word = paddle.layer.data(
- name="__target_word__", type=paddle.data_type.integer_value(dict_size))
-
- embed_context = paddle.layer.concat(input=emb_layers)
-
- hidden_layer = paddle.layer.fc(
- input=embed_context,
- size=hidden_size,
- act=paddle.activation.Sigmoid(),
- layer_attr=paddle.attr.Extra(drop_rate=0.5),
- bias_attr=paddle.attr.Param(learning_rate=2),
- param_attr=paddle.attr.Param(
- initial_std=1. / math.sqrt(embed_size * 8), learning_rate=1))
-
-    return paddle.layer.hsigmoid(
-        input=hidden_layer,
-        label=target_word,
-        num_classes=dict_size,
-        param_attr=paddle.attr.Param(name="sigmoid_w"),
-        bias_attr=paddle.attr.Param(name="sigmoid_b"))
-```
-
-Note that in PaddlePaddle the hsigmoid layer stores its learnable parameters as a matrix of size `[number of classes - 1 × hidden vector width]`. For prediction, the hsigmoid layer must be replaced with a fully connected operation **with `sigmoid` fixed as the activation**. The prediction output is a matrix of size `[batch_size × number of classes - 1]` (which degenerates to a vector when `batch_size = 1`). Each element of a row vector is the probability that the input vector belongs to the right child of one internal node. **When the fully connected operation loads the parameter matrix learned by the hsigmoid layer, the matrix must be transposed once.** The code snippet is as follows:
-
-```python
-return paddle.layer.mixed(
- size=dict_size - 1,
- input=paddle.layer.trans_full_matrix_projection(
- hidden_layer, param_attr=paddle.attr.Param(name="sigmoid_w")),
- act=paddle.activation.Sigmoid(),
- bias_attr=paddle.attr.Param(name="sigmoid_b"))
-```
-The `paddle.layer.mixed` in the snippet above must take PaddlePaddle's `paddle.layer.×_projection` layers as input. `paddle.layer.mixed` sums the results of its (possibly multiple) `projection` inputs as the output. `paddle.layer.trans_full_matrix_projection` transposes the parameter matrix $W$ when computing the matrix multiplication.
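-
-As a plain-numpy sketch (for illustration only, not PaddlePaddle API) of what this prediction pass computes with the transposed matrix:
-
-```python
-import numpy as np
-
-def hsigmoid_predict(hidden, sigmoid_w, sigmoid_b):
-    """Sketch: hidden is [batch x hidden_size]; sigmoid_w, as stored by the
-    hsigmoid layer, is [(dict_size - 1) x hidden_size]; sigmoid_b is [dict_size - 1]."""
-    logits = hidden.dot(sigmoid_w.T) + sigmoid_b   # uses the transpose of the stored W
-    return 1.0 / (1.0 + np.exp(-logits))           # per-node right-child probabilities
-```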
-
-## Training
-Training is straightforward: simply run ``` python train.py ```. On the first run, the program checks whether the imikolov dataset is present in the user's cache directory and downloads it automatically if not. During training, model information, mainly the training loss and the test loss, is printed every 100 iterations, and the model is saved once per pass.
-
-## Prediction
-Run the following at the command line:
-```bash
-python infer.py \
- --model_path "models/XX" \
- --batch_size 1 \
- --use_gpu false \
- --trainer_count 1
-```
-The parameters are as follows:
-- `model_path`: the path of the trained model. Required.
-- `batch_size`: the number of samples predicted in parallel in one batch. Optional, defaults to `1`.
-- `use_gpu`: whether to use GPU for prediction. Optional, defaults to `False`.
-- `trainer_count`: the number of threads used for prediction. Optional, defaults to `1`. **Note: the number of threads used for prediction must not exceed the number of samples predicted in parallel in one batch.**
-
-At prediction time, the path encoding is derived from the binary classification probabilities, and the path is then walked to obtain the final predicted class. The logic is as follows:
-
-```python
-def decode_res(infer_res, dict_size):
- """
-    Inferring probabilities are organized as a complete binary tree.
-    The actual labels are leaves (indices are counted from the class number).
-    This function traverses the paths decoded from the inferring results.
-    If the probability is > 0.5, go to the right child, otherwise go to the left child.
-
- param infer_res: inferring result
- param dict_size: class number
- return predict_lbls: actual class
- """
- predict_lbls = []
- infer_res = infer_res > 0.5
- for i, probs in enumerate(infer_res):
- idx = 0
- result = 1
- while idx < len(probs):
- result <<= 1
- if probs[idx]:
- result |= 1
- if probs[idx]:
- idx = idx * 2 + 2 # right child
- else:
- idx = idx * 2 + 1 # left child
-
- predict_lbl = result - dict_size
- predict_lbls.append(predict_lbl)
- return predict_lbls
-```
-
-The input data format of the prediction program is the same as in training. Given `have a dream that one`, for instance, the program generates a set of probabilities from `have a dream that` and decodes them into the predicted word, while `one` serves as the ground truth for evaluation. The decode function takes the predicted probabilities of a batch of samples and the dictionary size; its inner loop decodes each sample's output probabilities by the left-0 / right-1 rule, walking the path until a leaf node is reached.
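-
-As a hypothetical toy check of `decode_res` with `dict_size = 4` (classes 0-3): the probabilities below go left at the root (0.2 <= 0.5) and then right (0.9 > 0.5), i.e. the path 0, 1, which decodes to class 1:
-
-```python
-import numpy as np
-
-probs = np.array([[0.2, 0.9, 0.3]])    # one sample; dict_size - 1 = 3 node probabilities
-print(decode_res(probs, 4))            # [1]
-```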
-
-## References
-1. Morin, F., & Bengio, Y. (2005, January). [Hierarchical Probabilistic Neural Network Language Model](http://www.iro.umontreal.ca/~lisa/pointeurs/hierarchical-nnlm-aistats05.pdf). In Aistats (Vol. 5, pp. 246-252).
diff --git a/legacy/hsigmoid/images/binary_tree.png b/legacy/hsigmoid/images/binary_tree.png
deleted file mode 100644
index 3ea43c81356a658f3ca4469af98fd72b32799188..0000000000000000000000000000000000000000
Binary files a/legacy/hsigmoid/images/binary_tree.png and /dev/null differ
diff --git a/legacy/hsigmoid/images/network_conf.png b/legacy/hsigmoid/images/network_conf.png
deleted file mode 100644
index 02f8c257d4d7406aab760920a69bc94298fb48ea..0000000000000000000000000000000000000000
Binary files a/legacy/hsigmoid/images/network_conf.png and /dev/null differ
diff --git a/legacy/hsigmoid/images/path_to_1.png b/legacy/hsigmoid/images/path_to_1.png
deleted file mode 100644
index d07b046680393e2098cf3c9b7fc1a3c6045e9f65..0000000000000000000000000000000000000000
Binary files a/legacy/hsigmoid/images/path_to_1.png and /dev/null differ
diff --git a/legacy/hsigmoid/infer.py b/legacy/hsigmoid/infer.py
deleted file mode 100644
index 36a15f50714dac6ce3d9390a7bf43f5b58a8efdf..0000000000000000000000000000000000000000
--- a/legacy/hsigmoid/infer.py
+++ /dev/null
@@ -1,102 +0,0 @@
-import os
-import logging
-import gzip
-import click
-
-import paddle.v2 as paddle
-from network_conf import ngram_lm
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.WARNING)
-
-
-def decode_result(infer_res, dict_size):
- """
-    Inferring probabilities are organized as a complete binary tree.
-    The actual labels are leaves (indices are counted from the class number).
-    This function traverses the paths decoded from the inferring results.
-    If the probability is > 0.5, go to the right child, otherwise go to the left child.
-
- param infer_res: inferring result
- param dict_size: class number
- return predict_lbls: actual class
- """
- predict_lbls = []
- infer_res = infer_res > 0.5
- for i, probs in enumerate(infer_res):
- idx = 0
- result = 1
- while idx < len(probs):
- result <<= 1
- if probs[idx]:
- result |= 1
- if probs[idx]:
- idx = idx * 2 + 2 # right child
- else:
- idx = idx * 2 + 1 # left child
-
- predict_lbl = result - dict_size
- predict_lbls.append(predict_lbl)
- return predict_lbls
-
-
-def infer_a_batch(batch_ins, idx_word_dict, dict_size, inferer):
- infer_res = inferer.infer(input=batch_ins)
-
- predict_lbls = decode_result(infer_res, dict_size)
- predict_words = [idx_word_dict[lbl] for lbl in predict_lbls] # map to word
-
-    # Output format: word1 word2 word3 word4 -> predict label
- for i, ins in enumerate(batch_ins):
- print(" ".join([idx_word_dict[w]
- for w in ins]) + " -> " + predict_words[i])
-
-
-@click.command("infer")
-@click.option(
- "--model_path",
- default="",
- help="The path of the trained model for generation.")
-@click.option(
- "--batch_size",
- default=1,
- help="The number of testing examples in one forward batch in inferring.")
-@click.option(
- "--use_gpu", default=False, help="Whether to use GPU in inference or not.")
-@click.option(
- "--trainer_count",
- default=1,
- help="Whether to use GPU in inference or not.")
-def infer(model_path, batch_size, use_gpu, trainer_count):
- assert os.path.exists(model_path), "The trained model does not exist."
- assert (batch_size and trainer_count and batch_size >= trainer_count), (
- "batch_size and trainer_count must both be greater than 0. "
- "And batch_size must be equal to "
- "or greater than trainer_count.")
-
- paddle.init(use_gpu=use_gpu, trainer_count=trainer_count)
- word_dict = paddle.dataset.imikolov.build_dict(min_word_freq=2)
- dict_size = len(word_dict)
- prediction_layer = ngram_lm(
- is_train=False, hidden_size=256, embed_size=32, dict_size=dict_size)
-
- with gzip.open(model_path, "r") as f:
- parameters = paddle.parameters.Parameters.from_tar(f)
-
- inferer = paddle.inference.Inference(
- output_layer=prediction_layer, parameters=parameters)
- idx_word_dict = dict((v, k) for k, v in word_dict.items())
-
- batch_ins = []
- for ins in paddle.dataset.imikolov.test(word_dict, 5)():
- batch_ins.append(ins[:-1])
- if len(batch_ins) == batch_size:
- infer_a_batch(batch_ins, idx_word_dict, dict_size, inferer)
- batch_ins = []
-
- if len(batch_ins) > 0:
- infer_a_batch(batch_ins, idx_word_dict, dict_size, inferer)
-
-
-if __name__ == "__main__":
- infer()
diff --git a/legacy/hsigmoid/network_conf.py b/legacy/hsigmoid/network_conf.py
deleted file mode 100644
index aa1126c7ecb6a287a02c3a264123fca9b78465fd..0000000000000000000000000000000000000000
--- a/legacy/hsigmoid/network_conf.py
+++ /dev/null
@@ -1,59 +0,0 @@
-import math
-
-import paddle.v2 as paddle
-from paddle.v2.layer import parse_network
-
-
-def ngram_lm(hidden_size, embed_size, dict_size, gram_num=4, is_train=True):
- emb_layers = []
- embed_param_attr = paddle.attr.Param(
- name="_proj", initial_std=0.001, learning_rate=1, l2_rate=0)
- for i in range(gram_num):
- word = paddle.layer.data(
- name="__word%02d__" % (i),
- type=paddle.data_type.integer_value(dict_size))
- emb_layers.append(
- paddle.layer.embedding(
- input=word, size=embed_size, param_attr=embed_param_attr))
-
- target_word = paddle.layer.data(
- name="__target_word__", type=paddle.data_type.integer_value(dict_size))
-
- embed_context = paddle.layer.concat(input=emb_layers)
-
- hidden_layer = paddle.layer.fc(input=embed_context,
- size=hidden_size,
- act=paddle.activation.Sigmoid(),
- layer_attr=paddle.attr.Extra(drop_rate=0.5),
- bias_attr=paddle.attr.Param(learning_rate=2),
- param_attr=paddle.attr.Param(
- initial_std=1. /
- math.sqrt(embed_size * 8),
- learning_rate=1))
-
-    if is_train:
- return paddle.layer.hsigmoid(
- input=hidden_layer,
- label=target_word,
- num_classes=dict_size,
- param_attr=paddle.attr.Param(name="sigmoid_w"),
- bias_attr=paddle.attr.Param(name="sigmoid_b"))
- else:
- return paddle.layer.mixed(
- size=dict_size - 1,
- input=paddle.layer.trans_full_matrix_projection(
- hidden_layer, param_attr=paddle.attr.Param(name="sigmoid_w")),
- act=paddle.activation.Sigmoid(),
- bias_attr=paddle.attr.Param(name="sigmoid_b"))
-
-
-if __name__ == "__main__":
-    # This is to test and debug the network topology definition.
-    # Please set the hyper-parameters as needed.
- print(parse_network(
- ngram_lm(
- hidden_size=512,
- embed_size=512,
- dict_size=1024,
- gram_num=4,
- is_train=False)))
diff --git a/legacy/hsigmoid/train.py b/legacy/hsigmoid/train.py
deleted file mode 100644
index 8ee0717e26227828ba3d663296ed9150d7c144bd..0000000000000000000000000000000000000000
--- a/legacy/hsigmoid/train.py
+++ /dev/null
@@ -1,60 +0,0 @@
-import os
-import logging
-import gzip
-
-import paddle.v2 as paddle
-from network_conf import ngram_lm
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def main(save_dir="models"):
- if not os.path.exists(save_dir):
- os.mkdir(save_dir)
-
- paddle.init(use_gpu=False, trainer_count=1)
- word_dict = paddle.dataset.imikolov.build_dict(min_word_freq=2)
- dict_size = len(word_dict)
-
- cost = ngram_lm(hidden_size=256, embed_size=32, dict_size=dict_size)
-
- parameters = paddle.parameters.create(cost)
- adam_optimizer = paddle.optimizer.Adam(
- learning_rate=3e-3,
- regularization=paddle.optimizer.L2Regularization(8e-4))
- trainer = paddle.trainer.SGD(cost, parameters, adam_optimizer)
-
- def event_handler(event):
- if isinstance(event, paddle.event.EndPass):
- model_name = os.path.join(save_dir, "hsigmoid_pass_%05d.tar.gz" %
- event.pass_id)
- logger.info("Save model into %s ..." % model_name)
- with gzip.open(model_name, "w") as f:
- trainer.save_parameter_to_tar(f)
-
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id and event.batch_id % 10 == 0:
- result = trainer.test(
- paddle.batch(
- paddle.dataset.imikolov.test(word_dict, 5), 32))
- logger.info(
- "Pass %d, Batch %d, Cost %f, Test Cost %f" %
- (event.pass_id, event.batch_id, event.cost, result.cost))
-
- trainer.train(
- paddle.batch(
- paddle.reader.shuffle(
- lambda: paddle.dataset.imikolov.train(word_dict, 5)(),
- buf_size=1000),
- 64),
- num_passes=30,
- event_handler=event_handler)
-
-
-if __name__ == "__main__":
- main()
diff --git a/legacy/image_classification/README.md b/legacy/image_classification/README.md
deleted file mode 100644
index f041185acc6f972fa5b2759a7f64efc0f2000c80..0000000000000000000000000000000000000000
--- a/legacy/image_classification/README.md
+++ /dev/null
@@ -1,277 +0,0 @@
-The program examples in this directory require PaddlePaddle v0.11.0. If your installed version of PaddlePaddle is lower than this requirement, please update it by following the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html).
-
----
-
-Image Classification
-=======================
-
-This document describes how to perform image classification in PaddlePaddle with the AlexNet, VGG, GoogLeNet, ResNet, Inception-v4, Inception-ResNet-v2 and Xception models. For a description of the image classification problem and an introduction to these models, please refer to the [PaddlePaddle book](https://github.com/PaddlePaddle/book/tree/develop/03.image_classification).
-
-## Training the model
-
-### Initialization
-
-In the initialization stage, import the required packages and initialize PaddlePaddle.
-
-```python
-import gzip
-import argparse
-
-import paddle.v2.dataset.flowers as flowers
-import paddle.v2 as paddle
-import reader
-import vgg
-import resnet
-import alexnet
-import googlenet
-import inception_v4
-import inception_resnet_v2
-import xception
-
-
-# PaddlePaddle init
-paddle.init(use_gpu=False, trainer_count=1)
-```
-
-### Defining parameters and inputs
-
-Set the algorithm parameters (such as the data dimension, the number of classes and the batch size), and define the data input layer `image` and the class label `lbl`.
-The dataset used is [flowers](http://www.robots.ox.ac.uk/~vgg/data/flowers/102/), which contains 102 flower categories, so CLASS_DIM=102.
-
-```python
-# Use 3 * 331 * 331 or 3 * 299 * 299 for DATA_DIM in Inception-ResNet-v2.
-DATA_DIM = 3 * 224 * 224
-CLASS_DIM = 102
-BATCH_SIZE = 128
-
-image = paddle.layer.data(
- name="image", type=paddle.data_type.dense_vector(DATA_DIM))
-lbl = paddle.layer.data(
- name="label", type=paddle.data_type.integer_value(CLASS_DIM))
-```
-
-### Obtaining the model
-
-You can choose one of the AlexNet, VGG, GoogLeNet, ResNet, Inception-v4, Inception-ResNet-v2 and Xception models for image classification. Calling the corresponding method returns the network's final Softmax layer.
-
-1. Using the AlexNet model
-
-Given the input layer `image` and the number of classes `CLASS_DIM`, the Softmax layer of AlexNet can be obtained with the following code:
-
-```python
-out = alexnet.alexnet(image, class_dim=CLASS_DIM)
-```
-
-2. Using the VGG model
-
-Depending on the number of layers, VGG comes in VGG13, VGG16 and VGG19 variants. The code for the VGG16 model is as follows:
-
-```python
-out = vgg.vgg16(image, class_dim=CLASS_DIM)
-```
-
-Similarly, VGG13 and VGG19 can be obtained through the `vgg.vgg13` and `vgg.vgg19` methods respectively.
-
-3. Using the GoogLeNet model
-
-During training, GoogLeNet uses two auxiliary classifiers to strengthen the gradient signal and add extra regularization, so `googlenet.googlenet` returns three Softmax layers in total, as shown in the following code:
-
-```python
-out, out1, out2 = googlenet.googlenet(image, class_dim=CLASS_DIM)
-loss1 = paddle.layer.cross_entropy_cost(
- input=out1, label=lbl, coeff=0.3)
-paddle.evaluator.classification_error(input=out1, label=lbl)
-loss2 = paddle.layer.cross_entropy_cost(
- input=out2, label=lbl, coeff=0.3)
-paddle.evaluator.classification_error(input=out2, label=lbl)
-extra_layers = [loss1, loss2]
-```
-
-For the two auxiliary outputs, the loss function is computed and the classification error is evaluated separately for each; the losses are then passed as the extra_layers of the SGD trainer described later.
-
-4. Using the ResNet model
-
-The ResNet model can be obtained with the following code:
-
-```python
-out = resnet.resnet_imagenet(image, class_dim=CLASS_DIM)
-```
-
-5. Using the Inception-v4 model
-
-The Inception-v4 model can be obtained with the following code. The input size used in this example is `3 * 224 * 224` (the original paper uses `3 * 299 * 299`):
-
-```python
-out = inception_v4.inception_v4(image, class_dim=CLASS_DIM)
-```
-
-
-6. Using the Inception-ResNet-v2 model
-
-The provided Inception-ResNet-v2 model supports inputs of both `3 * 331 * 331` and `3 * 299 * 299`, and the dropout probability can be set freely. It can be used with the following code:
-
-```python
-out = inception_resnet_v2.inception_resnet_v2(
- image, class_dim=CLASS_DIM, dropout_rate=0.5, size=DATA_DIM)
-```
-
-Note that since its input size differs from that of the other models, when using Inception-ResNet-v2 together with the provided `reader.py`, please first change the size parameters of `paddle.image.simple_transform` in `reader.py` accordingly.
-
-7. Using the Xception model
-
-The Xception model can be obtained with the following code:
-
-```python
-out = xception.xception(image, class_dim=CLASS_DIM)
-```
-
-### Defining the loss function
-
-```python
-cost = paddle.layer.classification_cost(input=out, label=lbl)
-```
-
-### Creating parameters and the optimizer
-
-```python
-# Create parameters
-parameters = paddle.parameters.create(cost)
-
-# Create optimizer
-optimizer = paddle.optimizer.Momentum(
- momentum=0.9,
- regularization=paddle.optimizer.L2Regularization(rate=0.0005 *
- BATCH_SIZE),
- learning_rate=0.001 / BATCH_SIZE,
- learning_rate_decay_a=0.1,
- learning_rate_decay_b=128000 * 35,
- learning_rate_schedule="discexp", )
-```
-
-The learning rate schedule is specified through `learning_rate_decay_a` (abbreviated $a$), `learning_rate_decay_b` (abbreviated $b$) and `learning_rate_schedule`. Here a discrete exponential ("discexp") schedule is used, computed by the formula below, where $n$ is the cumulative number of samples processed so far and $lr_{0}$ is the `learning_rate` set in the parameters.
-
-$$ lr = lr_{0} \cdot a^{\lfloor \frac{n}{b} \rfloor} $$
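-
-A minimal sketch (plain Python, not PaddlePaddle API) of this "discexp" schedule with the values set above (`lr0 = 0.001 / BATCH_SIZE`, `a = 0.1`, `b = 128000 * 35`):
-
-```python
-import math
-
-def discexp_lr(num_samples_processed, lr0=0.001 / 128, a=0.1, b=128000 * 35):
-    # lr = lr0 * a ** floor(n / b), matching the formula above
-    return lr0 * a ** math.floor(num_samples_processed / b)
-```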
-
-
-### Defining data readers
-
-We first take the [flowers data](http://www.robots.ox.ac.uk/~vgg/data/flowers/102/index.html) as an example of how to define the input. The following code defines the readers for the flowers training and validation sets:
-
-```python
-train_reader = paddle.batch(
- paddle.reader.shuffle(
- flowers.train(),
- buf_size=1000),
- batch_size=BATCH_SIZE)
-test_reader = paddle.batch(
- flowers.valid(),
- batch_size=BATCH_SIZE)
-```
-
-To use other data, you first need to build an image list file. `reader.py` defines how such a file is read: it parses the image path and the class label from each line.
-
-An image list file is a text file in which each line consists of an image path and a class label, separated by a tab character. Class labels are integers, starting from 0. A sample fragment of an image list file is shown below (see also the parsing sketch after it):
-
-```
-dataset_100/train_images/n03982430_23191.jpeg 1
-dataset_100/train_images/n04461696_23653.jpeg 7
-dataset_100/train_images/n02441942_3170.jpeg 8
-dataset_100/train_images/n03733281_31716.jpeg 2
-dataset_100/train_images/n03424325_240.jpeg 0
-dataset_100/train_images/n02643566_75.jpeg 8
-```
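-
-A minimal sketch (an illustration, not the repository's `reader.py`) of parsing one line of such an image list file:
-
-```python
-def parse_image_list_line(line):
-    # each line is "<image path>\t<integer label>"
-    path, label = line.rstrip("\n").split("\t")
-    return path, int(label)
-```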
-
-For training, the image list files of the training set and the validation set must be specified separately. Assuming these two files are `train.list` and `val.list`, the data is read as follows:
-
-```python
-train_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.train_reader('train.list'),
- buf_size=1000),
- batch_size=BATCH_SIZE)
-test_reader = paddle.batch(
- reader.test_reader('val.list'),
- batch_size=BATCH_SIZE)
-```
-
-### Defining the event handler
-```python
-# End batch and end pass event handler
-def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % 1 == 0:
- print "\nPass %d, Batch %d, Cost %f, %s" % (
- event.pass_id, event.batch_id, event.cost, event.metrics)
- if isinstance(event, paddle.event.EndPass):
- with gzip.open('params_pass_%d.tar.gz' % event.pass_id, 'w') as f:
- parameters.to_tar(f)
-
- result = trainer.test(reader=test_reader)
- print "\nTest with Pass %d, %s" % (event.pass_id, result.metrics)
-```
-
-### Defining the trainer
-
-For AlexNet, VGG, ResNet, Inception-v4, Inception-ResNet-v2 and Xception, the trainer can be defined with the following code:
-
-```python
-# Create trainer
-trainer = paddle.trainer.SGD(
- cost=cost,
- parameters=parameters,
- update_equation=optimizer)
-```
-
-GoogLeNet has two extra output layers, so `extra_layers` must be specified, as shown below:
-
-```python
-# Create trainer
-trainer = paddle.trainer.SGD(
- cost=cost,
- parameters=parameters,
- update_equation=optimizer,
- extra_layers=extra_layers)
-```
-
-### Starting training
-
-```python
-trainer.train(
- reader=train_reader, num_passes=200, event_handler=event_handler)
-```
-
-## Applying the model
-Once the model is trained, the following code can be used to predict the class of a given image.
-
-```python
-import numpy as np
-
-# load parameters
-with gzip.open('params_pass_10.tar.gz', 'r') as f:
- parameters = paddle.parameters.Parameters.from_tar(f)
-
-file_list = [line.strip() for line in open(image_list_file)]
-test_data = [(paddle.image.load_and_transform(image_file, 256, 224, False)
- .flatten().astype('float32'), )
- for image_file in file_list]
-probs = paddle.infer(
- output_layer=out, parameters=parameters, input=test_data)
-lab = np.argsort(-probs)
-for file_name, result in zip(file_list, lab):
- print "Label of %s is: %d" % (file_name, result[0])
-```
-
-The code first loads the trained model from file (here the parameters saved after pass 10 are used as an example), then reads the images listed in `image_list_file`, a text file with one image path per line. It uses `paddle.infer` to classify each image in `image_list_file` and prints the results.
-
-## Using pretrained models
-To facilitate testing and fine-tuning, we provide pretrained models for some of the configurations in this example, currently ResNet50, ResNet101, and Vgg16 trained on the 1000-class ImageNet dataset. Use the script `model_download.sh` in the `models` directory to download them; for example, to download ResNet50, enter the `models` directory and run "`sh model_download.sh ResNet50`". When the download finishes, `Paddle_ResNet50.tar.gz` in the same directory is the trained model, which can be loaded in code in either of the following two ways:
-
-```python
-parameters = paddle.parameters.Parameters.from_tar(gzip.open('Paddle_ResNet50.tar.gz', 'r'))
-```
-
-```python
-parameters = paddle.parameters.create(cost)
-parameters.init_from_tar(gzip.open('Paddle_ResNet50.tar.gz', 'r'))
-```
-
-### Notes
-The file names inside the model archive correspond one-to-one to the parameter names in the model configuration; this correspondence is what parameter loading relies on. The pretrained models we provide all use the configurations from the example code. If you modify the network configuration, take care that the parameter names in your configuration still match the file names in the archive.
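-
-As a quick sanity check (a sketch, not part of the original documentation), you can list the parameter files inside the archive and compare them with the parameter names in your configuration:
-
-```python
-import gzip
-import tarfile
-
-# List the parameter data files; each parameter also has a *.protobuf
-# entry holding its configuration, which is skipped here.
-with gzip.open('Paddle_ResNet50.tar.gz', 'r') as f:
-    tar = tarfile.TarFile(fileobj=f)
-    print([m.name for m in tar.getmembers()
-           if not m.name.endswith('.protobuf')])
-```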
diff --git a/legacy/image_classification/alexnet.py b/legacy/image_classification/alexnet.py
deleted file mode 100644
index 1fbd05c0fbbb14319a148f9c38941e89f602a8d9..0000000000000000000000000000000000000000
--- a/legacy/image_classification/alexnet.py
+++ /dev/null
@@ -1,49 +0,0 @@
-import paddle.v2 as paddle
-
-__all__ = ['alexnet']
-
-
-def alexnet(input, class_dim):
- conv1 = paddle.layer.img_conv(
- input=input,
- filter_size=11,
- num_channels=3,
- num_filters=96,
- stride=4,
- padding=1)
- cmrnorm1 = paddle.layer.img_cmrnorm(
- input=conv1, size=5, scale=0.0001, power=0.75)
- pool1 = paddle.layer.img_pool(input=cmrnorm1, pool_size=3, stride=2)
-
- conv2 = paddle.layer.img_conv(
- input=pool1,
- filter_size=5,
- num_filters=256,
- stride=1,
- padding=2,
- groups=1)
- cmrnorm2 = paddle.layer.img_cmrnorm(
- input=conv2, size=5, scale=0.0001, power=0.75)
- pool2 = paddle.layer.img_pool(input=cmrnorm2, pool_size=3, stride=2)
-
- pool3 = paddle.networks.img_conv_group(
- input=pool2,
- pool_size=3,
- pool_stride=2,
- conv_num_filter=[384, 384, 256],
- conv_filter_size=3,
- pool_type=paddle.pooling.Max())
-
- fc1 = paddle.layer.fc(input=pool3,
- size=4096,
- act=paddle.activation.Relu(),
- layer_attr=paddle.attr.Extra(drop_rate=0.5))
- fc2 = paddle.layer.fc(input=fc1,
- size=4096,
- act=paddle.activation.Relu(),
- layer_attr=paddle.attr.Extra(drop_rate=0.5))
-
- out = paddle.layer.fc(input=fc2,
- size=class_dim,
- act=paddle.activation.Softmax())
- return out
diff --git a/legacy/image_classification/caffe2paddle/README.md b/legacy/image_classification/caffe2paddle/README.md
deleted file mode 100644
index c90e000186e974803494cd5d25df1fc71004c37b..0000000000000000000000000000000000000000
--- a/legacy/image_classification/caffe2paddle/README.md
+++ /dev/null
@@ -1,39 +0,0 @@
-## Usage
-
-`caffe2paddle.py` provides `ModelConverter`, an interface for converting Caffe-trained models into models that PaddlePaddle can use. It wraps conversion functions for layers commonly used in vision, such as Convolution and BatchNorm, and can convert common models such as VGG and ResNet. The basic conversion process is: load the model through Caffe's Python API, fetch the information of each layer in turn, adapt each layer's parameters to PaddlePaddle according to the layer type, and serialize and save them (layers without trainable parameters, such as Pooling, are skipped); the output is a model file that PaddlePaddle's Python API can load directly.
-
-The `ModelConverter` interface can be used as follows:
-
-```python
-# Set the following variables to the corresponding paths and file names
-caffe_model_file = "./ResNet-50-deploy.prototxt" # path of the Caffe network configuration file
-caffe_pretrained_file = "./ResNet-50-model.caffemodel" # path of the Caffe model file
-paddle_tar_name = "Paddle_ResNet50.tar.gz" # file name of the output Paddle model
-
-# Initialize and load the model from the given files
-converter = ModelConverter(caffe_model_file=caffe_model_file,
- caffe_pretrained_file=caffe_pretrained_file,
- paddle_tar_name=paddle_tar_name)
-# Run the conversion
-converter.convert()
-```
-
-`caffe2paddle.py` already implements the steps above; after changing the file-related variables, run `python caffe2paddle.py` to perform the conversion. In addition, to help verify the result, `ModelConverter` wraps `caffe_predict`, an interface for predicting with the Caffe API. Used as shown below, it prints a list of (class id, probability) pairs sorted by class probability:
-
-```python
-# img is the image path; mean_file is the path of the image mean file
-converter.caffe_predict(img="./cat.jpg", mean_file="./imagenet/ilsvrc_2012_mean.npy")
-```
-
-Note that layer parameters are named during the conversion. By default, PaddlePaddle's default layer and parameter naming rules are used: the layer name is built from the value in `wrap_name_default` plus the call count for that layer type, and parameter names use this layer name as a prefix. For example, the bias parameter of the first InnerProduct layer (the corresponding conversion function is shown below) will be named `___fc_layer_0__.wbias`.
-
-```python
-# Convert the parameters of an InnerProduct layer; the name value is used
-# to build the parameter names of the corresponding layer.
-# wrap_name_default sets the default name value to "fc_layer".
-@wrap_name_default("fc_layer")
-def convert_InnerProduct_layer(self, params, name=None):
-    ...
-```
-
-Consequently, when verifying and using a converted model, the PaddlePaddle network configuration should not specify layer names, and it must use the same topological order as the Caffe model. In particular, for branching structures such as ResNet, the two branches must be defined in the same order in PaddlePaddle as in Caffe; only then will the model parameters load correctly.
-
-If you do not want the default naming and have specified layer names in the PaddlePaddle network configuration, you can build a `dict` mapping layer names between the Caffe and PaddlePaddle configurations and pass it as `name_map` when calling `ModelConverter.convert`. The parameters of each layer will then be saved under the corresponding layer name, independent of the topological order. A `name_map` only needs to cover Caffe layers of type Convolution, InnerProduct, and BatchNorm: on the one hand, layers without trainable parameters such as Pooling need not be saved, so no conversion interface is provided for them; on the other hand, because of implementation differences between Caffe and PaddlePaddle, PaddlePaddle's batch_norm layer is a composite of Caffe's BatchNorm and Scale layers, so Scale layers receive special handling.
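-
-As a sketch (the layer names below are hypothetical, not taken from any real configuration), a `name_map` can be passed like this:
-
-```python
-# Hypothetical mapping: Caffe layer name -> layer name in the PaddlePaddle config.
-name_map = {"conv1": "my_conv1", "fc1000": "my_fc"}
-converter.convert(name_map=name_map)
-```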
diff --git a/legacy/image_classification/caffe2paddle/caffe2paddle.py b/legacy/image_classification/caffe2paddle/caffe2paddle.py
deleted file mode 100644
index b25c70bb91fb59f84a46c6ba077c99fa68b904eb..0000000000000000000000000000000000000000
--- a/legacy/image_classification/caffe2paddle/caffe2paddle.py
+++ /dev/null
@@ -1,191 +0,0 @@
-import os
-import struct
-import gzip
-import tarfile
-import cStringIO
-import numpy as np
-import cv2
-import caffe
-from paddle.proto.ParameterConfig_pb2 import ParameterConfig
-from paddle.trainer_config_helpers.default_decorators import wrap_name_default
-
-
-class ModelConverter(object):
- def __init__(self, caffe_model_file, caffe_pretrained_file,
- paddle_tar_name):
- self.net = caffe.Net(caffe_model_file, caffe_pretrained_file,
- caffe.TEST)
- self.tar_name = paddle_tar_name
- self.params = dict()
- self.pre_layer_name = ""
- self.pre_layer_type = ""
-
- def convert(self, name_map=None):
- layer_dict = self.net.layer_dict
- for layer_name in layer_dict.keys():
- layer = layer_dict[layer_name]
- layer_params = layer.blobs
- layer_type = layer.type
- if len(layer_params) > 0:
- self.pre_layer_name = getattr(
- self, "convert_" + layer_type + "_layer")(
- layer_params,
- name=None
- if name_map is None else name_map.get(layer_name))
- self.pre_layer_type = layer_type
- with gzip.open(self.tar_name, 'w') as f:
- self.to_tar(f)
- return
-
- def to_tar(self, f):
- tar = tarfile.TarFile(fileobj=f, mode='w')
- for param_name in self.params.keys():
- param_conf, param_data = self.params[param_name]
-
- confStr = param_conf.SerializeToString()
- tarinfo = tarfile.TarInfo(name="%s.protobuf" % param_name)
- tarinfo.size = len(confStr)
- buf = cStringIO.StringIO(confStr)
- buf.seek(0)
- tar.addfile(tarinfo, fileobj=buf)
-
- buf = cStringIO.StringIO()
- self.serialize(param_data, buf)
- tarinfo = tarfile.TarInfo(name=param_name)
- buf.seek(0)
- tarinfo.size = len(buf.getvalue())
- tar.addfile(tarinfo, buf)
-
- @staticmethod
- def serialize(data, f):
- f.write(struct.pack("IIQ", 0, 4, data.size))
- f.write(data.tobytes())
-
- @wrap_name_default("conv")
- def convert_Convolution_layer(self, params, name=None):
- for i in range(len(params)):
- data = np.array(params[i].data)
- if len(params) == 2:
- suffix = "0" if i == 0 else "bias"
- file_name = "_%s.w%s" % (name, suffix)
- else:
- file_name = "_%s.w%s" % (name, str(i))
- param_conf = ParameterConfig()
- param_conf.name = file_name
- dims = list(data.shape)
- if len(dims) == 1:
- dims.insert(1, 1)
- param_conf.dims.extend(dims)
- param_conf.size = reduce(lambda a, b: a * b, data.shape)
- self.params[file_name] = (param_conf, data.flatten())
-
- return name
-
- @wrap_name_default("fc_layer")
- def convert_InnerProduct_layer(self, params, name=None):
- for i in range(len(params)):
- data = np.array(params[i].data)
- if len(params) == 2:
- suffix = "0" if i == 0 else "bias"
- file_name = "_%s.w%s" % (name, suffix)
- else:
- file_name = "_%s.w%s" % (name, str(i))
- data = np.transpose(data)
- param_conf = ParameterConfig()
- param_conf.name = file_name
- dims = list(data.shape)
- if len(dims) < 2:
- dims.insert(0, 1)
- param_conf.size = reduce(lambda a, b: a * b, dims)
- param_conf.dims.extend(dims)
- self.params[file_name] = (param_conf, data.flatten())
- return name
-
- @wrap_name_default("batch_norm")
- def convert_BatchNorm_layer(self, params, name=None):
- scale = 1 / np.array(params[-1].data)[0] if np.array(params[-1].data)[
- 0] != 0 else 0
- for i in range(2):
- data = np.array(params[i].data) * scale
- file_name = "_%s.w%s" % (name, str(i + 1))
- param_conf = ParameterConfig()
- param_conf.name = file_name
- dims = list(data.shape)
- assert len(dims) == 1
- dims.insert(0, 1)
- param_conf.size = reduce(lambda a, b: a * b, dims)
- param_conf.dims.extend(dims)
- self.params[file_name] = (param_conf, data.flatten())
- return name
-
- def convert_Scale_layer(self, params, name=None):
- assert self.pre_layer_type == "BatchNorm"
- name = self.pre_layer_name
- for i in range(len(params)):
- data = np.array(params[i].data)
- suffix = "0" if i == 0 else "bias"
- file_name = "_%s.w%s" % (name, suffix)
- param_conf = ParameterConfig()
- param_conf.name = file_name
- dims = list(data.shape)
- assert len(dims) == 1
- dims.insert(0, 1)
- param_conf.size = reduce(lambda a, b: a * b, dims)
- if i == 1:
- param_conf.dims.extend(dims)
- self.params[file_name] = (param_conf, data.flatten())
- return name
-
- def caffe_predict(self,
- img,
- mean_file='./caffe/imagenet/ilsvrc_2012_mean.npy'):
- net = self.net
-
- net.blobs['data'].data[...] = load_image(img, mean_file=mean_file)
- out = net.forward()
-
- output_prob = net.blobs['prob'].data[0].flatten()
- print zip(np.argsort(output_prob)[::-1], np.sort(output_prob)[::-1])
-
-
-def load_image(file, resize_size=256, crop_size=224, mean_file=None):
- # load image
- im = cv2.imread(file)
- # resize
- h, w = im.shape[:2]
- h_new, w_new = resize_size, resize_size
- if h > w:
- h_new = resize_size * h / w
- else:
- w_new = resize_size * w / h
- im = cv2.resize(im, (h_new, w_new), interpolation=cv2.INTER_CUBIC)
- # crop
- h, w = im.shape[:2]
- h_start = (h - crop_size) / 2
- w_start = (w - crop_size) / 2
- h_end, w_end = h_start + crop_size, w_start + crop_size
- im = im[h_start:h_end, w_start:w_end, :]
- # transpose to CHW order
- im = im.transpose((2, 0, 1))
-
- if mean_file:
- mu = np.load(mean_file)
- mu = mu.mean(1).mean(1)
- im = im - mu[:, None, None]
- im = im / 255.0
- return im
-
-
-if __name__ == "__main__":
- caffe_model_file = "./ResNet-50-deploy.prototxt"
- caffe_pretrained_file = "./ResNet-50-model.caffemodel"
- paddle_tar_name = "Paddle_ResNet50.tar.gz"
-
- converter = ModelConverter(
- caffe_model_file=caffe_model_file,
- caffe_pretrained_file=caffe_pretrained_file,
- paddle_tar_name=paddle_tar_name)
- converter.convert()
-
- converter.caffe_predict("./cat.jpg",
- "./caffe/imagenet/ilsvrc_2012_mean.npy")
diff --git a/legacy/image_classification/googlenet.py b/legacy/image_classification/googlenet.py
deleted file mode 100644
index a60c01db3195fca367a8f5c7fa7955a67a54819c..0000000000000000000000000000000000000000
--- a/legacy/image_classification/googlenet.py
+++ /dev/null
@@ -1,181 +0,0 @@
-import paddle.v2 as paddle
-
-__all__ = ['googlenet']
-
-
-def inception(name, input, channels, filter1, filter3R, filter3, filter5R,
- filter5, proj):
- cov1 = paddle.layer.img_conv(
- name=name + '_1',
- input=input,
- filter_size=1,
- num_channels=channels,
- num_filters=filter1,
- stride=1,
- padding=0)
-
- cov3r = paddle.layer.img_conv(
- name=name + '_3r',
- input=input,
- filter_size=1,
- num_channels=channels,
- num_filters=filter3R,
- stride=1,
- padding=0)
- cov3 = paddle.layer.img_conv(
- name=name + '_3',
- input=cov3r,
- filter_size=3,
- num_filters=filter3,
- stride=1,
- padding=1)
-
- cov5r = paddle.layer.img_conv(
- name=name + '_5r',
- input=input,
- filter_size=1,
- num_channels=channels,
- num_filters=filter5R,
- stride=1,
- padding=0)
- cov5 = paddle.layer.img_conv(
- name=name + '_5',
- input=cov5r,
- filter_size=5,
- num_filters=filter5,
- stride=1,
- padding=2)
-
- pool1 = paddle.layer.img_pool(
- name=name + '_max',
- input=input,
- pool_size=3,
- num_channels=channels,
- stride=1,
- padding=1)
- covprj = paddle.layer.img_conv(
- name=name + '_proj',
- input=pool1,
- filter_size=1,
- num_filters=proj,
- stride=1,
- padding=0)
-
- cat = paddle.layer.concat(name=name, input=[cov1, cov3, cov5, covprj])
- return cat
-
-
-def googlenet(input, class_dim):
- # stage 1
- conv1 = paddle.layer.img_conv(
- name="conv1",
- input=input,
- filter_size=7,
- num_channels=3,
- num_filters=64,
- stride=2,
- padding=3)
- pool1 = paddle.layer.img_pool(
- name="pool1", input=conv1, pool_size=3, num_channels=64, stride=2)
-
- # stage 2
- conv2_1 = paddle.layer.img_conv(
- name="conv2_1",
- input=pool1,
- filter_size=1,
- num_filters=64,
- stride=1,
- padding=0)
- conv2_2 = paddle.layer.img_conv(
- name="conv2_2",
- input=conv2_1,
- filter_size=3,
- num_filters=192,
- stride=1,
- padding=1)
- pool2 = paddle.layer.img_pool(
- name="pool2", input=conv2_2, pool_size=3, num_channels=192, stride=2)
-
- # stage 3
- ince3a = inception("ince3a", pool2, 192, 64, 96, 128, 16, 32, 32)
- ince3b = inception("ince3b", ince3a, 256, 128, 128, 192, 32, 96, 64)
- pool3 = paddle.layer.img_pool(
- name="pool3", input=ince3b, num_channels=480, pool_size=3, stride=2)
-
- # stage 4
- ince4a = inception("ince4a", pool3, 480, 192, 96, 208, 16, 48, 64)
- ince4b = inception("ince4b", ince4a, 512, 160, 112, 224, 24, 64, 64)
- ince4c = inception("ince4c", ince4b, 512, 128, 128, 256, 24, 64, 64)
- ince4d = inception("ince4d", ince4c, 512, 112, 144, 288, 32, 64, 64)
- ince4e = inception("ince4e", ince4d, 528, 256, 160, 320, 32, 128, 128)
- pool4 = paddle.layer.img_pool(
- name="pool4", input=ince4e, num_channels=832, pool_size=3, stride=2)
-
- # stage 5
- ince5a = inception("ince5a", pool4, 832, 256, 160, 320, 32, 128, 128)
- ince5b = inception("ince5b", ince5a, 832, 384, 192, 384, 48, 128, 128)
- pool5 = paddle.layer.img_pool(
- name="pool5",
- input=ince5b,
- num_channels=1024,
- pool_size=7,
- stride=7,
- pool_type=paddle.pooling.Avg())
- dropout = paddle.layer.addto(
- input=pool5,
- layer_attr=paddle.attr.Extra(drop_rate=0.4),
- act=paddle.activation.Linear())
-
- out = paddle.layer.fc(input=dropout,
- size=class_dim,
- act=paddle.activation.Softmax())
-
- # fc for output 1
- pool_o1 = paddle.layer.img_pool(
- name="pool_o1",
- input=ince4a,
- num_channels=512,
- pool_size=5,
- stride=3,
- pool_type=paddle.pooling.Avg())
- conv_o1 = paddle.layer.img_conv(
- name="conv_o1",
- input=pool_o1,
- filter_size=1,
- num_filters=128,
- stride=1,
- padding=0)
- fc_o1 = paddle.layer.fc(name="fc_o1",
- input=conv_o1,
- size=1024,
- layer_attr=paddle.attr.Extra(drop_rate=0.7),
- act=paddle.activation.Relu())
- out1 = paddle.layer.fc(input=fc_o1,
- size=class_dim,
- act=paddle.activation.Softmax())
-
- # fc for output 2
- pool_o2 = paddle.layer.img_pool(
- name="pool_o2",
- input=ince4d,
- num_channels=528,
- pool_size=5,
- stride=3,
- pool_type=paddle.pooling.Avg())
- conv_o2 = paddle.layer.img_conv(
- name="conv_o2",
- input=pool_o2,
- filter_size=1,
- num_filters=128,
- stride=1,
- padding=0)
- fc_o2 = paddle.layer.fc(name="fc_o2",
- input=conv_o2,
- size=1024,
- layer_attr=paddle.attr.Extra(drop_rate=0.7),
- act=paddle.activation.Relu())
- out2 = paddle.layer.fc(input=fc_o2,
- size=class_dim,
- act=paddle.activation.Softmax())
-
- return out, out1, out2
diff --git a/legacy/image_classification/inception_resnet_v2.py b/legacy/image_classification/inception_resnet_v2.py
deleted file mode 100644
index 06e9eb6079cafb7f91d413886f7c08f5c0ae09e4..0000000000000000000000000000000000000000
--- a/legacy/image_classification/inception_resnet_v2.py
+++ /dev/null
@@ -1,329 +0,0 @@
-import paddle.v2 as paddle
-
-
-def conv_bn_layer(input,
- ch_out,
- filter_size,
- stride,
- padding=0,
- active_type=paddle.activation.Relu(),
- ch_in=None):
- """layer wrapper assembling convolution and batchnorm layer"""
- tmp = paddle.layer.img_conv(
- input=input,
- filter_size=filter_size,
- num_channels=ch_in,
- num_filters=ch_out,
- stride=stride,
- padding=padding,
- act=paddle.activation.Linear(),
- bias_attr=False)
- return paddle.layer.batch_norm(input=tmp, epsilon=0.001, act=active_type)
-
-
-def sequential_block(input, *layers):
- """helper function for sequential layers"""
- for layer in layers:
- layer_func, layer_conf = layer
- input = layer_func(input, **layer_conf)
- return input
-
-
-def mixed_5b_block(input):
- branch0 = conv_bn_layer(
- input, ch_in=192, ch_out=96, filter_size=1, stride=1)
- branch1 = sequential_block(input, (conv_bn_layer, {
- "ch_in": 192,
- "ch_out": 48,
- "filter_size": 1,
- "stride": 1
- }), (conv_bn_layer, {
- "ch_in": 48,
- "ch_out": 64,
- "filter_size": 5,
- "stride": 1,
- "padding": 2
- }))
- branch2 = sequential_block(input, (conv_bn_layer, {
- "ch_in": 192,
- "ch_out": 64,
- "filter_size": 1,
- "stride": 1
- }), (conv_bn_layer, {
- "ch_in": 64,
- "ch_out": 96,
- "filter_size": 3,
- "stride": 1,
- "padding": 1
- }), (conv_bn_layer, {
- "ch_in": 96,
- "ch_out": 96,
- "filter_size": 3,
- "stride": 1,
- "padding": 1
- }))
- branch3 = sequential_block(
- input,
- (paddle.layer.img_pool, {
- "pool_size": 3,
- "stride": 1,
- "padding": 1,
- "pool_type": paddle.pooling.Avg(),
- "exclude_mode": False
- }),
- (conv_bn_layer, {
- "ch_in": 192,
- "ch_out": 64,
- "filter_size": 1,
- "stride": 1
- }), )
- out = paddle.layer.concat(input=[branch0, branch1, branch2, branch3])
- return out
-
-
-def block35(input, scale=1.0):
- branch0 = conv_bn_layer(
- input, ch_in=320, ch_out=32, filter_size=1, stride=1)
- branch1 = sequential_block(input, (conv_bn_layer, {
- "ch_in": 320,
- "ch_out": 32,
- "filter_size": 1,
- "stride": 1
- }), (conv_bn_layer, {
- "ch_in": 32,
- "ch_out": 32,
- "filter_size": 3,
- "stride": 1,
- "padding": 1
- }))
- branch2 = sequential_block(input, (conv_bn_layer, {
- "ch_in": 320,
- "ch_out": 32,
- "filter_size": 1,
- "stride": 1
- }), (conv_bn_layer, {
- "ch_in": 32,
- "ch_out": 48,
- "filter_size": 3,
- "stride": 1,
- "padding": 1
- }), (conv_bn_layer, {
- "ch_in": 48,
- "ch_out": 64,
- "filter_size": 3,
- "stride": 1,
- "padding": 1
- }))
- out = paddle.layer.concat(input=[branch0, branch1, branch2])
- out = paddle.layer.img_conv(
- input=out,
- filter_size=1,
- num_channels=128,
- num_filters=320,
- stride=1,
- padding=0,
- act=paddle.activation.Linear(),
- bias_attr=None)
- out = paddle.layer.slope_intercept(out, slope=scale, intercept=0.0)
- out = paddle.layer.addto(input=[input, out], act=paddle.activation.Relu())
- return out
-
-
-def mixed_6a_block(input):
- branch0 = conv_bn_layer(
- input, ch_in=320, ch_out=384, filter_size=3, stride=2)
- branch1 = sequential_block(input, (conv_bn_layer, {
- "ch_in": 320,
- "ch_out": 256,
- "filter_size": 1,
- "stride": 1
- }), (conv_bn_layer, {
- "ch_in": 256,
- "ch_out": 256,
- "filter_size": 3,
- "stride": 1,
- "padding": 1
- }), (conv_bn_layer, {
- "ch_in": 256,
- "ch_out": 384,
- "filter_size": 3,
- "stride": 2
- }))
- branch2 = paddle.layer.img_pool(
- input,
- num_channels=320,
- pool_size=3,
- stride=2,
- pool_type=paddle.pooling.Max())
- out = paddle.layer.concat(input=[branch0, branch1, branch2])
- return out
-
-
-def block17(input, scale=1.0):
- branch0 = conv_bn_layer(
- input, ch_in=1088, ch_out=192, filter_size=1, stride=1)
- branch1 = sequential_block(input, (conv_bn_layer, {
- "ch_in": 1088,
- "ch_out": 128,
- "filter_size": 1,
- "stride": 1
- }), (conv_bn_layer, {
- "ch_in": 128,
- "ch_out": 160,
- "filter_size": [7, 1],
- "stride": 1,
- "padding": [3, 0]
- }), (conv_bn_layer, {
- "ch_in": 160,
- "ch_out": 192,
- "filter_size": [1, 7],
- "stride": 1,
- "padding": [0, 3]
- }))
- out = paddle.layer.concat(input=[branch0, branch1])
- out = paddle.layer.img_conv(
- input=out,
- filter_size=1,
- num_channels=384,
- num_filters=1088,
- stride=1,
- padding=0,
- act=paddle.activation.Linear(),
- bias_attr=None)
- out = paddle.layer.slope_intercept(out, slope=scale, intercept=0.0)
- out = paddle.layer.addto(input=[input, out], act=paddle.activation.Relu())
- return out
-
-
-def mixed_7a_block(input):
- branch0 = sequential_block(
- input,
- (conv_bn_layer, {
- "ch_in": 1088,
- "ch_out": 256,
- "filter_size": 1,
- "stride": 1
- }),
- (conv_bn_layer, {
- "ch_in": 256,
- "ch_out": 384,
- "filter_size": 3,
- "stride": 2
- }), )
- branch1 = sequential_block(
- input,
- (conv_bn_layer, {
- "ch_in": 1088,
- "ch_out": 256,
- "filter_size": 1,
- "stride": 1
- }),
- (conv_bn_layer, {
- "ch_in": 256,
- "ch_out": 288,
- "filter_size": 3,
- "stride": 2
- }), )
- branch2 = sequential_block(input, (conv_bn_layer, {
- "ch_in": 1088,
- "ch_out": 256,
- "filter_size": 1,
- "stride": 1
- }), (conv_bn_layer, {
- "ch_in": 256,
- "ch_out": 288,
- "filter_size": 3,
- "stride": 1,
- "padding": 1
- }), (conv_bn_layer, {
- "ch_in": 288,
- "ch_out": 320,
- "filter_size": 3,
- "stride": 2
- }))
- branch3 = paddle.layer.img_pool(
- input,
- num_channels=1088,
- pool_size=3,
- stride=2,
- pool_type=paddle.pooling.Max())
- out = paddle.layer.concat(input=[branch0, branch1, branch2, branch3])
- return out
-
-
-def block8(input, scale=1.0, no_relu=False):
- branch0 = conv_bn_layer(
- input, ch_in=2080, ch_out=192, filter_size=1, stride=1)
- branch1 = sequential_block(input, (conv_bn_layer, {
- "ch_in": 2080,
- "ch_out": 192,
- "filter_size": 1,
- "stride": 1
- }), (conv_bn_layer, {
- "ch_in": 192,
- "ch_out": 224,
- "filter_size": [3, 1],
- "stride": 1,
- "padding": [1, 0]
- }), (conv_bn_layer, {
- "ch_in": 224,
- "ch_out": 256,
- "filter_size": [1, 3],
- "stride": 1,
- "padding": [0, 1]
- }))
- out = paddle.layer.concat(input=[branch0, branch1])
- out = paddle.layer.img_conv(
- input=out,
- filter_size=1,
- num_channels=448,
- num_filters=2080,
- stride=1,
- padding=0,
- act=paddle.activation.Linear(),
- bias_attr=None)
- out = paddle.layer.slope_intercept(out, slope=scale, intercept=0.0)
- out = paddle.layer.addto(
- input=[input, out],
- act=paddle.activation.Linear() if no_relu else paddle.activation.Relu())
- return out
-
-
-def inception_resnet_v2(input,
- class_dim,
- dropout_rate=0.5,
- data_dim=3 * 331 * 331):
- conv2d_1a = conv_bn_layer(
- input, ch_in=3, ch_out=32, filter_size=3, stride=2)
- conv2d_2a = conv_bn_layer(
- conv2d_1a, ch_in=32, ch_out=32, filter_size=3, stride=1)
- conv2d_2b = conv_bn_layer(
- conv2d_2a, ch_in=32, ch_out=64, filter_size=3, stride=1, padding=1)
- maxpool_3a = paddle.layer.img_pool(
- input=conv2d_2b, pool_size=3, stride=2, pool_type=paddle.pooling.Max())
- conv2d_3b = conv_bn_layer(
- maxpool_3a, ch_in=64, ch_out=80, filter_size=1, stride=1)
- conv2d_4a = conv_bn_layer(
- conv2d_3b, ch_in=80, ch_out=192, filter_size=3, stride=1)
- maxpool_5a = paddle.layer.img_pool(
- input=conv2d_4a, pool_size=3, stride=2, pool_type=paddle.pooling.Max())
- mixed_5b = mixed_5b_block(maxpool_5a)
- repeat = sequential_block(mixed_5b, *([(block35, {"scale": 0.17})] * 10))
- mixed_6a = mixed_6a_block(repeat)
- repeat1 = sequential_block(mixed_6a, *([(block17, {"scale": 0.10})] * 20))
- mixed_7a = mixed_7a_block(repeat1)
- repeat2 = sequential_block(mixed_7a, *([(block8, {"scale": 0.20})] * 9))
- block_8 = block8(repeat2, no_relu=True)
- conv2d_7b = conv_bn_layer(
- block_8, ch_in=2080, ch_out=1536, filter_size=1, stride=1)
- avgpool_1a = paddle.layer.img_pool(
- input=conv2d_7b,
- pool_size=8 if data_dim == 3 * 299 * 299 else 9,
- stride=1,
- pool_type=paddle.pooling.Avg(),
- exclude_mode=False)
- drop_out = paddle.layer.dropout(input=avgpool_1a, dropout_rate=dropout_rate)
- out = paddle.layer.fc(input=drop_out,
- size=class_dim,
- act=paddle.activation.Softmax())
- return out
diff --git a/legacy/image_classification/inception_v4.py b/legacy/image_classification/inception_v4.py
deleted file mode 100644
index e171392640ba627ee6ee0c0a13091f11bff027ec..0000000000000000000000000000000000000000
--- a/legacy/image_classification/inception_v4.py
+++ /dev/null
@@ -1,525 +0,0 @@
-import paddle.v2 as paddle
-
-__all__ = ['inception_v4']
-
-
-def img_conv(name,
- input,
- num_filters,
- filter_size,
- stride,
- padding,
- num_channels=None):
- conv = paddle.layer.img_conv(
- name=name,
- input=input,
- num_channels=num_channels,
- num_filters=num_filters,
- filter_size=filter_size,
- stride=stride,
- padding=padding,
- act=paddle.activation.Linear())
- norm = paddle.layer.batch_norm(
- name=name + '_norm', input=conv, act=paddle.activation.Relu())
- return norm
-
-
-def stem(input):
- conv0 = img_conv(
- name='stem_conv0',
- input=input,
- num_channels=3,
- num_filters=32,
- filter_size=3,
- stride=2,
- padding=1)
- conv1 = img_conv(
- name='stem_conv1',
- input=conv0,
- num_channels=32,
- num_filters=32,
- filter_size=3,
- stride=1,
- padding=1)
- conv2 = img_conv(
- name='stem_conv2',
- input=conv1,
- num_channels=32,
- num_filters=64,
- filter_size=3,
- stride=1,
- padding=1)
-
- def block0(input):
- pool0 = paddle.layer.img_pool(
- name='stem_branch0_pool0',
- input=input,
- num_channels=64,
- pool_size=3,
- stride=2,
- pool_type=paddle.pooling.Max())
- conv0 = img_conv(
- name='stem_branch0_conv0',
- input=input,
- num_channels=64,
- num_filters=96,
- filter_size=3,
- stride=2,
- padding=1)
- return paddle.layer.concat(input=[pool0, conv0])
-
- def block1(input):
- l_conv0 = img_conv(
- name='stem_branch1_l_conv0',
- input=input,
- num_channels=160,
- num_filters=64,
- filter_size=1,
- stride=1,
- padding=0)
- l_conv1 = img_conv(
- name='stem_branch1_l_conv1',
- input=l_conv0,
- num_channels=64,
- num_filters=96,
- filter_size=3,
- stride=1,
- padding=1)
- r_conv0 = img_conv(
- name='stem_branch1_r_conv0',
- input=input,
- num_channels=160,
- num_filters=64,
- filter_size=1,
- stride=1,
- padding=0)
- r_conv1 = img_conv(
- name='stem_branch1_r_conv1',
- input=r_conv0,
- num_channels=64,
- num_filters=64,
- filter_size=(7, 1),
- stride=1,
- padding=(3, 0))
- r_conv2 = img_conv(
- name='stem_branch1_r_conv2',
- input=r_conv1,
- num_channels=64,
- num_filters=64,
- filter_size=(1, 7),
- stride=1,
- padding=(0, 3))
- r_conv3 = img_conv(
- name='stem_branch1_r_conv3',
- input=r_conv2,
- num_channels=64,
- num_filters=96,
- filter_size=3,
- stride=1,
- padding=1)
- return paddle.layer.concat(input=[l_conv1, r_conv3])
-
- def block2(input):
- conv0 = img_conv(
- name='stem_branch2_conv0',
- input=input,
- num_channels=192,
- num_filters=192,
- filter_size=3,
- stride=2,
- padding=1)
- pool0 = paddle.layer.img_pool(
- name='stem_branch2_pool0',
- input=input,
- num_channels=192,
- pool_size=3,
- stride=2,
- pool_type=paddle.pooling.Max())
- return paddle.layer.concat(input=[conv0, pool0])
-
- conv3 = block0(conv2)
- conv4 = block1(conv3)
- conv5 = block2(conv4)
- return conv5
-
-
-def Inception_A(input, depth):
- b0_pool0 = paddle.layer.img_pool(
- name='inceptA{0}_branch0_pool0'.format(depth),
- input=input,
- num_channels=384,
- pool_size=3,
- stride=1,
- padding=1,
- pool_type=paddle.pooling.Avg())
- b0_conv0 = img_conv(
- name='inceptA{0}_branch0_conv0'.format(depth),
- input=b0_pool0,
- num_channels=384,
- num_filters=96,
- filter_size=1,
- stride=1,
- padding=0)
- b1_conv0 = img_conv(
- name='inceptA{0}_branch1_conv0'.format(depth),
- input=input,
- num_channels=384,
- num_filters=96,
- filter_size=1,
- stride=1,
- padding=0)
- b2_conv0 = img_conv(
- name='inceptA{0}_branch2_conv0'.format(depth),
- input=input,
- num_channels=384,
- num_filters=64,
- filter_size=1,
- stride=1,
- padding=0)
- b2_conv1 = img_conv(
- name='inceptA{0}_branch2_conv1'.format(depth),
- input=b2_conv0,
- num_channels=64,
- num_filters=96,
- filter_size=3,
- stride=1,
- padding=1)
- b3_conv0 = img_conv(
- name='inceptA{0}_branch3_conv0'.format(depth),
- input=input,
- num_channels=384,
- num_filters=64,
- filter_size=1,
- stride=1,
- padding=0)
- b3_conv1 = img_conv(
- name='inceptA{0}_branch3_conv1'.format(depth),
- input=b3_conv0,
- num_channels=64,
- num_filters=96,
- filter_size=3,
- stride=1,
- padding=1)
- b3_conv2 = img_conv(
- name='inceptA{0}_branch3_conv2'.format(depth),
- input=b3_conv1,
- num_channels=96,
- num_filters=96,
- filter_size=3,
- stride=1,
- padding=1)
- return paddle.layer.concat(input=[b0_conv0, b1_conv0, b2_conv1, b3_conv2])
-
-
-def Inception_B(input, depth):
- b0_pool0 = paddle.layer.img_pool(
- name='inceptB{0}_branch0_pool0'.format(depth),
- input=input,
- num_channels=1024,
- pool_size=3,
- stride=1,
- padding=1,
- pool_type=paddle.pooling.Avg())
- b0_conv0 = img_conv(
- name='inceptB{0}_branch0_conv0'.format(depth),
- input=b0_pool0,
- num_channels=1024,
- num_filters=128,
- filter_size=1,
- stride=1,
- padding=0)
- b1_conv0 = img_conv(
- name='inceptB{0}_branch1_conv0'.format(depth),
- input=input,
- num_channels=1024,
- num_filters=384,
- filter_size=1,
- stride=1,
- padding=0)
- b2_conv0 = img_conv(
- name='inceptB{0}_branch2_conv0'.format(depth),
- input=input,
- num_channels=1024,
- num_filters=192,
- filter_size=1,
- stride=1,
- padding=0)
- b2_conv1 = img_conv(
- name='inceptB{0}_branch2_conv1'.format(depth),
- input=b2_conv0,
- num_channels=192,
- num_filters=224,
- filter_size=(1, 7),
- stride=1,
- padding=(0, 3))
- b2_conv2 = img_conv(
- name='inceptB{0}_branch2_conv2'.format(depth),
- input=b2_conv1,
- num_channels=224,
- num_filters=256,
- filter_size=(7, 1),
- stride=1,
- padding=(3, 0))
- b3_conv0 = img_conv(
- name='inceptB{0}_branch3_conv0'.format(depth),
- input=input,
- num_channels=1024,
- num_filters=192,
- filter_size=1,
- stride=1,
- padding=0)
- b3_conv1 = img_conv(
- name='inceptB{0}_branch3_conv1'.format(depth),
- input=b3_conv0,
- num_channels=192,
- num_filters=192,
- filter_size=(1, 7),
- stride=1,
- padding=(0, 3))
- b3_conv2 = img_conv(
- name='inceptB{0}_branch3_conv2'.format(depth),
- input=b3_conv1,
- num_channels=192,
- num_filters=224,
- filter_size=(7, 1),
- stride=1,
- padding=(3, 0))
- b3_conv3 = img_conv(
- name='inceptB{0}_branch3_conv3'.format(depth),
- input=b3_conv2,
- num_channels=224,
- num_filters=224,
- filter_size=(1, 7),
- stride=1,
- padding=(0, 3))
- b3_conv4 = img_conv(
- name='inceptB{0}_branch3_conv4'.format(depth),
- input=b3_conv3,
- num_channels=224,
- num_filters=256,
- filter_size=(7, 1),
- stride=1,
- padding=(3, 0))
- return paddle.layer.concat(input=[b0_conv0, b1_conv0, b2_conv2, b3_conv4])
-
-
-def Inception_C(input, depth):
- b0_pool0 = paddle.layer.img_pool(
- name='inceptC{0}_branch0_pool0'.format(depth),
- input=input,
- num_channels=1536,
- pool_size=3,
- stride=1,
- padding=1,
- pool_type=paddle.pooling.Avg())
- b0_conv0 = img_conv(
- name='inceptC{0}_branch0_conv0'.format(depth),
- input=b0_pool0,
- num_channels=1536,
- num_filters=256,
- filter_size=1,
- stride=1,
- padding=0)
- b1_conv0 = img_conv(
- name='inceptC{0}_branch1_conv0'.format(depth),
- input=input,
- num_channels=1536,
- num_filters=256,
- filter_size=1,
- stride=1,
- padding=0)
- b2_conv0 = img_conv(
- name='inceptC{0}_branch2_conv0'.format(depth),
- input=input,
- num_channels=1536,
- num_filters=384,
- filter_size=1,
- stride=1,
- padding=0)
- b2_conv1 = img_conv(
- name='inceptC{0}_branch2_conv1'.format(depth),
- input=b2_conv0,
- num_channels=384,
- num_filters=256,
- filter_size=(1, 3),
- stride=1,
- padding=(0, 1))
- b2_conv2 = img_conv(
- name='inceptC{0}_branch2_conv2'.format(depth),
- input=b2_conv0,
- num_channels=384,
- num_filters=256,
- filter_size=(3, 1),
- stride=1,
- padding=(1, 0))
- b3_conv0 = img_conv(
- name='inceptC{0}_branch3_conv0'.format(depth),
- input=input,
- num_channels=1536,
- num_filters=384,
- filter_size=1,
- stride=1,
- padding=0)
- b3_conv1 = img_conv(
- name='inceptC{0}_branch3_conv1'.format(depth),
- input=b3_conv0,
- num_channels=384,
- num_filters=448,
- filter_size=(1, 3),
- stride=1,
- padding=(0, 1))
- b3_conv2 = img_conv(
- name='inceptC{0}_branch3_conv2'.format(depth),
- input=b3_conv1,
- num_channels=448,
- num_filters=512,
- filter_size=(3, 1),
- stride=1,
- padding=(1, 0))
- b3_conv3 = img_conv(
- name='inceptC{0}_branch3_conv3'.format(depth),
- input=b3_conv2,
- num_channels=512,
- num_filters=256,
- filter_size=(3, 1),
- stride=1,
- padding=(1, 0))
- b3_conv4 = img_conv(
- name='inceptC{0}_branch3_conv4'.format(depth),
- input=b3_conv2,
- num_channels=512,
- num_filters=256,
- filter_size=(1, 3),
- stride=1,
- padding=(0, 1))
- return paddle.layer.concat(
- input=[b0_conv0, b1_conv0, b2_conv1, b2_conv2, b3_conv3, b3_conv4])
-
-
-def Reduction_A(input):
- b0_pool0 = paddle.layer.img_pool(
- name='ReductA_branch0_pool0',
- input=input,
- num_channels=384,
- pool_size=3,
- stride=2,
- pool_type=paddle.pooling.Max())
- b1_conv0 = img_conv(
- name='ReductA_branch1_conv0',
- input=input,
- num_channels=384,
- num_filters=384,
- filter_size=3,
- stride=2,
- padding=1)
- b2_conv0 = img_conv(
- name='ReductA_branch2_conv0',
- input=input,
- num_channels=384,
- num_filters=192,
- filter_size=1,
- stride=1,
- padding=0)
- b2_conv1 = img_conv(
- name='ReductA_branch2_conv1',
- input=b2_conv0,
- num_channels=192,
- num_filters=224,
- filter_size=3,
- stride=1,
- padding=1)
- b2_conv2 = img_conv(
- name='ReductA_branch2_conv2',
- input=b2_conv1,
- num_channels=224,
- num_filters=256,
- filter_size=3,
- stride=2,
- padding=1)
- return paddle.layer.concat(input=[b0_pool0, b1_conv0, b2_conv2])
-
-
-def Reduction_B(input):
- b0_pool0 = paddle.layer.img_pool(
- name='ReductB_branch0_pool0',
- input=input,
- num_channels=1024,
- pool_size=3,
- stride=2,
- pool_type=paddle.pooling.Max())
- b1_conv0 = img_conv(
- name='ReductB_branch1_conv0',
- input=input,
- num_channels=1024,
- num_filters=192,
- filter_size=1,
- stride=1,
- padding=0)
- b1_conv1 = img_conv(
- name='ReductB_branch1_conv1',
- input=b1_conv0,
- num_channels=192,
- num_filters=192,
- filter_size=3,
- stride=2,
- padding=1)
- b2_conv0 = img_conv(
- name='ReductB_branch2_conv0',
- input=input,
- num_channels=1024,
- num_filters=256,
- filter_size=1,
- stride=1,
- padding=0)
- b2_conv1 = img_conv(
- name='ReductB_branch2_conv1',
- input=b2_conv0,
- num_channels=256,
- num_filters=256,
- filter_size=(1, 7),
- stride=1,
- padding=(0, 3))
- b2_conv2 = img_conv(
- name='ReductB_branch2_conv2',
- input=b2_conv1,
- num_channels=256,
- num_filters=320,
- filter_size=(7, 1),
- stride=1,
- padding=(3, 0))
- b2_conv3 = img_conv(
- name='ReductB_branch2_conv3',
- input=b2_conv2,
- num_channels=320,
- num_filters=320,
- filter_size=3,
- stride=2,
- padding=1)
- return paddle.layer.concat(input=[b0_pool0, b1_conv1, b2_conv3])
-
-
-def inception_v4(input, class_dim):
- conv = stem(input)
-
- for i in range(4):
- conv = Inception_A(conv, i)
- conv = Reduction_A(conv)
- for i in range(7):
- conv = Inception_B(conv, i)
- conv = Reduction_B(conv)
- for i in range(3):
- conv = Inception_C(conv, i)
-
- pool = paddle.layer.img_pool(
- name='incept_avg_pool',
- input=conv,
- num_channels=1536,
- pool_size=7,
- stride=1,
- pool_type=paddle.pooling.Avg())
- drop = paddle.layer.dropout(input=pool, dropout_rate=0.2)
- out = paddle.layer.fc(name='incept_fc',
- input=drop,
- size=class_dim,
- act=paddle.activation.Softmax())
- return out
diff --git a/legacy/image_classification/infer.py b/legacy/image_classification/infer.py
deleted file mode 100644
index c73a0c811682209d8cea38089b9db0d315057c00..0000000000000000000000000000000000000000
--- a/legacy/image_classification/infer.py
+++ /dev/null
@@ -1,81 +0,0 @@
-import os
-import gzip
-import argparse
-import numpy as np
-from PIL import Image
-
-import paddle.v2 as paddle
-import reader
-import vgg
-import resnet
-import alexnet
-import googlenet
-import inception_v4
-import inception_resnet_v2
-import xception
-
-DATA_DIM = 3 * 224 * 224 # Use 3 * 331 * 331 or 3 * 299 * 299 for Inception-ResNet-v2.
-CLASS_DIM = 102
-
-
-def main():
- # parse the argument
- parser = argparse.ArgumentParser()
- parser.add_argument(
- 'data_list',
- help='The path of data list file, which consists of one image path per line'
- )
- parser.add_argument(
- 'model',
- help='The model for image classification',
- choices=[
- 'alexnet', 'vgg13', 'vgg16', 'vgg19', 'resnet', 'googlenet',
- 'inception-resnet-v2', 'inception_v4', 'xception'
- ])
- parser.add_argument(
- 'params_path', help='The file which stores the parameters')
- args = parser.parse_args()
-
- # PaddlePaddle init
- paddle.init(use_gpu=True, trainer_count=1)
-
- image = paddle.layer.data(
- name="image", type=paddle.data_type.dense_vector(DATA_DIM))
-
- if args.model == 'alexnet':
- out = alexnet.alexnet(image, class_dim=CLASS_DIM)
- elif args.model == 'vgg13':
- out = vgg.vgg13(image, class_dim=CLASS_DIM)
- elif args.model == 'vgg16':
- out = vgg.vgg16(image, class_dim=CLASS_DIM)
- elif args.model == 'vgg19':
- out = vgg.vgg19(image, class_dim=CLASS_DIM)
- elif args.model == 'resnet':
- out = resnet.resnet_imagenet(image, class_dim=CLASS_DIM)
- elif args.model == 'googlenet':
- out, _, _ = googlenet.googlenet(image, class_dim=CLASS_DIM)
- elif args.model == 'inception-resnet-v2':
- assert DATA_DIM == 3 * 331 * 331 or DATA_DIM == 3 * 299 * 299
- out = inception_resnet_v2.inception_resnet_v2(
- image, class_dim=CLASS_DIM, dropout_rate=0.5, data_dim=DATA_DIM)
- elif args.model == 'inception_v4':
- out = inception_v4.inception_v4(image, class_dim=CLASS_DIM)
- elif args.model == 'xception':
- out = xception.xception(image, class_dim=CLASS_DIM)
-
- # load parameters
- with gzip.open(args.params_path, 'r') as f:
- parameters = paddle.parameters.Parameters.from_tar(f)
-
- file_list = [line.strip() for line in open(args.data_list)]
- test_data = [(paddle.image.load_and_transform(image_file, 256, 224, False)
- .flatten().astype('float32'), ) for image_file in file_list]
- probs = paddle.infer(
- output_layer=out, parameters=parameters, input=test_data)
- lab = np.argsort(-probs)
- for file_name, result in zip(file_list, lab):
- print "Label of %s is: %d" % (file_name, result[0])
-
-
-if __name__ == '__main__':
- main()
diff --git a/legacy/image_classification/models/model_download.sh b/legacy/image_classification/models/model_download.sh
deleted file mode 100644
index 2a73c56ec5b73b33bf6a94ac63e5a2dee387b4a9..0000000000000000000000000000000000000000
--- a/legacy/image_classification/models/model_download.sh
+++ /dev/null
@@ -1,55 +0,0 @@
-#! /usr/bin/env bash
-
-function download() {
- URL=$1
- MD5=$2
- TARGET=$3
-
- if [ -e $TARGET ]; then
- md5_result=`md5sum $TARGET | awk -F[' '] '{print $1}'`
- if [ $MD5 == $md5_result ]; then
- echo "$TARGET already exists, download skipped."
- return 0
- fi
- fi
-
- wget -c $URL -O "$TARGET"
- if [ $? -ne 0 ]; then
- return 1
- fi
-
- md5_result=`md5sum $TARGET | awk -F[' '] '{print $1}'`
- if [ ! $MD5 == $md5_result ]; then
- return 1
- fi
-}
-
-case "$1" in
- "ResNet50")
- URL="http://cloud.dlnel.org/filepub/?uuid=f63f237a-698e-4a22-9782-baf5bb183019"
- MD5="eb4d7b5962c9954340207788af0d6967"
- ;;
- "ResNet101")
- URL="http://cloud.dlnel.org/filepub/?uuid=3d5fb996-83d0-4745-8adc-13ee960fc55c"
- MD5="7e71f24998aa8e434fa164a7c4fc9c02"
- ;;
- "Vgg16")
- URL="http://cloud.dlnel.org/filepub/?uuid=aa0e397e-474a-4cc1-bd8f-65a214039c2e"
- MD5="e73dc42507e6acd3a8b8087f66a9f395"
- ;;
- *)
- echo "The "$1" model is not provided currently."
- exit 1
- ;;
-esac
-TARGET="Paddle_"$1".tar.gz"
-
-echo "Download "$1" model ..."
-download $URL $MD5 $TARGET
-if [ $? -ne 0 ]; then
- echo "Fail to download the model!"
- exit 1
-fi
-
-
-exit 0
diff --git a/legacy/image_classification/reader.py b/legacy/image_classification/reader.py
deleted file mode 100644
index b6bad1a24c36d7ae182a4522dcd9d94b35f6ae3c..0000000000000000000000000000000000000000
--- a/legacy/image_classification/reader.py
+++ /dev/null
@@ -1,55 +0,0 @@
-import random
-from paddle.v2.image import load_and_transform
-import paddle.v2 as paddle
-from multiprocessing import cpu_count
-
-
-def train_mapper(sample):
- '''
- map image path to type needed by model input layer for the training set
- '''
- img, label = sample
- img = paddle.image.load_image(img)
- img = paddle.image.simple_transform(img, 256, 224, True)
- return img.flatten().astype('float32'), label
-
-
-def test_mapper(sample):
- '''
- map image path to type needed by model input layer for the test set
- '''
- img, label = sample
- img = paddle.image.load_image(img)
- img = paddle.image.simple_transform(img, 256, 224, True)
- return img.flatten().astype('float32'), label
-
-
-def train_reader(train_list, buffered_size=1024):
- def reader():
- with open(train_list, 'r') as f:
- lines = [line.strip() for line in f]
- for line in lines:
- img_path, lab = line.strip().split('\t')
- yield img_path, int(lab)
-
- return paddle.reader.xmap_readers(train_mapper, reader,
- cpu_count(), buffered_size)
-
-
-def test_reader(test_list, buffered_size=1024):
- def reader():
- with open(test_list, 'r') as f:
- lines = [line.strip() for line in f]
- for line in lines:
- img_path, lab = line.strip().split('\t')
- yield img_path, int(lab)
-
- return paddle.reader.xmap_readers(test_mapper, reader,
- cpu_count(), buffered_size)
-
-
-if __name__ == '__main__':
- for im in train_reader('train.list'):
- print len(im[0])
- for im in train_reader('test.list'):
- print len(im[0])
diff --git a/legacy/image_classification/resnet.py b/legacy/image_classification/resnet.py
deleted file mode 100644
index 5c884117e27a6401fb8fc689d0d5bca24c90c716..0000000000000000000000000000000000000000
--- a/legacy/image_classification/resnet.py
+++ /dev/null
@@ -1,97 +0,0 @@
-import paddle.v2 as paddle
-
-__all__ = ['resnet_imagenet', 'resnet_cifar10']
-
-
-def conv_bn_layer(input,
- ch_out,
- filter_size,
- stride,
- padding,
- active_type=paddle.activation.Relu(),
- ch_in=None):
- tmp = paddle.layer.img_conv(
- input=input,
- filter_size=filter_size,
- num_channels=ch_in,
- num_filters=ch_out,
- stride=stride,
- padding=padding,
- act=paddle.activation.Linear(),
- bias_attr=False)
- return paddle.layer.batch_norm(input=tmp, act=active_type)
-
-
-def shortcut(input, ch_out, stride):
- if input.num_filters != ch_out:
- return conv_bn_layer(input, ch_out, 1, stride, 0,
- paddle.activation.Linear())
- else:
- return input
-
-
-def basicblock(input, ch_out, stride):
- short = shortcut(input, ch_out, stride)
- conv1 = conv_bn_layer(input, ch_out, 3, stride, 1)
- conv2 = conv_bn_layer(conv1, ch_out, 3, 1, 1, paddle.activation.Linear())
- return paddle.layer.addto(
- input=[short, conv2], act=paddle.activation.Relu())
-
-
-def bottleneck(input, ch_out, stride):
- short = shortcut(input, ch_out * 4, stride)
- conv1 = conv_bn_layer(input, ch_out, 1, stride, 0)
- conv2 = conv_bn_layer(conv1, ch_out, 3, 1, 1)
- conv3 = conv_bn_layer(conv2, ch_out * 4, 1, 1, 0,
- paddle.activation.Linear())
- return paddle.layer.addto(
- input=[short, conv3], act=paddle.activation.Relu())
-
-
-def layer_warp(block_func, input, ch_out, count, stride):
- conv = block_func(input, ch_out, stride)
- for i in range(1, count):
- conv = block_func(conv, ch_out, 1)
- return conv
-
-
-def resnet_imagenet(input, class_dim, depth=50):
- cfg = {
- 18: ([2, 2, 2, 1], basicblock),
- 34: ([3, 4, 6, 3], basicblock),
- 50: ([3, 4, 6, 3], bottleneck),
- 101: ([3, 4, 23, 3], bottleneck),
- 152: ([3, 8, 36, 3], bottleneck)
- }
- stages, block_func = cfg[depth]
- conv1 = conv_bn_layer(
- input, ch_in=3, ch_out=64, filter_size=7, stride=2, padding=3)
- pool1 = paddle.layer.img_pool(input=conv1, pool_size=3, stride=2)
- res1 = layer_warp(block_func, pool1, 64, stages[0], 1)
- res2 = layer_warp(block_func, res1, 128, stages[1], 2)
- res3 = layer_warp(block_func, res2, 256, stages[2], 2)
- res4 = layer_warp(block_func, res3, 512, stages[3], 2)
- pool2 = paddle.layer.img_pool(
- input=res4, pool_size=7, stride=1, pool_type=paddle.pooling.Avg())
- out = paddle.layer.fc(input=pool2,
- size=class_dim,
- act=paddle.activation.Softmax())
- return out
-
-
-def resnet_cifar10(input, class_dim, depth=32):
- # depth should be one of 20, 32, 44, 56, 110, 1202
- assert (depth - 2) % 6 == 0
- n = (depth - 2) / 6
- nStages = {16, 64, 128}
- conv1 = conv_bn_layer(
- input, ch_in=3, ch_out=16, filter_size=3, stride=1, padding=1)
- res1 = layer_warp(basicblock, conv1, 16, n, 1)
- res2 = layer_warp(basicblock, res1, 32, n, 2)
- res3 = layer_warp(basicblock, res2, 64, n, 2)
- pool = paddle.layer.img_pool(
- input=res3, pool_size=8, stride=1, pool_type=paddle.pooling.Avg())
- out = paddle.layer.fc(input=pool,
- size=class_dim,
- act=paddle.activation.Softmax())
- return out
diff --git a/legacy/image_classification/tf2paddle/README.md b/legacy/image_classification/tf2paddle/README.md
deleted file mode 100644
index 821e039ea3938ea4273003afa21088a21dfd012e..0000000000000000000000000000000000000000
--- a/legacy/image_classification/tf2paddle/README.md
+++ /dev/null
@@ -1,54 +0,0 @@
-## Usage
-
-The utility class `TFModelConverter` in the `tf2paddle.py` script converts TensorFlow-trained model files into model files that PaddlePaddle can load. It currently supports the layers commonly used in vision models: convolution (`Convolution`), `Batch Normalization`, and fully connected (`Full Connection`) layers. Networks common in vision such as `ResNet` and `VGG` are built from these layers, so `ResNet` and `VGG` models trained with TensorFlow can be converted into PaddlePaddle-loadable models, for further use in pretraining or in building prediction services.
-
-The basic conversion flow is:
-1. Rewrite the TensorFlow model equivalently using the PaddlePaddle Python API.
-1. Trainable parameters in TensorFlow are represented by `Variable`s; fetch the network's Variables through TensorFlow's Python API.
-1. Determine the correspondence between the `Variable`s in the TensorFlow model and the trainable parameters of the `paddle.layer`s in PaddlePaddle.
-1. Adapt the TensorFlow `Variable`s as needed (see below), convert them into PaddlePaddle's parameter storage format, and serialize and save them.
-
-### Conventions to follow
-
-So that the `Variable`s in a TensorFlow model map correctly onto the trainable parameters of `paddle.layer`s, the current version imposes the following constraints:
-
-1. Currently only parameters trained by the three TensorFlow operators with trainable `Variable`s, namely `conv2d`, `batchnorm`, and `fc`, can be converted into PaddlePaddle model parameters.
-1. In the TensorFlow network configuration, `Variable`s within the same operator belong to the same scope; this is used to assign `Variable`s to their `paddle.layer`s.
-1. The scopes of `conv2d`, `batchnorm`, and `fc` must contain `conv`, `bn`, and `fc` respectively, which is how the type of the corresponding `paddle.layer` is inferred. Alternatively, this constraint can be avoided by passing a `layer_type_map` `dict` to `TFModelConverter`, mapping scopes to the types of the corresponding `paddle.layer`s.
-1. The `Variable` order in `conv2d` and `fc` is: trainable `Weight` first, then `Bias`. The `Variable` order in `batchnorm` is: `scale`, `shift`, `mean`, `var`. Mind the parameter storage order when mapping `Variable`s to the parameters at the corresponding positions of `paddle.layer.batch_norm`.
-1. The topological order of the TensorFlow network must match that of the PaddlePaddle network, especially the order in which branches are defined when the network contains branching structures, such as the two branches in ResNet's bottleneck module. This applies when both the conversion and the PaddlePaddle configuration use PaddlePaddle's default parameter naming, in which case parameters are named by topological order.
-1. If the PaddlePaddle network configuration sets trainable parameter names explicitly via `param_attr=paddle.attr.Param(name="XX")`, you can pass a `layer_name_map` or `param_name_map` dictionary (a Python `dict`) to `TFModelConverter`, so that during conversion `Variable` names are mapped to the names of the trainable parameters in the corresponding `paddle.layer.XX`.
-1. A `build_model` interface must be provided to build the TensorFlow network, load the model, and return the session. It can be written following this example:
-
- ```python
- def build_model():
- build_graph()
- sess = tf.Session(config=tf.ConfigProto(allow_soft_placement=True))
- sess.run(tf.tables_initializer())
- saver = tf.train.Saver()
- saver.restore(sess, 'model/model.ckpt')
- return sess
- ```
-
-### Running the conversion
-
-After following the conventions above, the `main` function of the `tf2paddle.py` script provides a calling example that converts a TensorFlow-trained `ResNet50` model into a PaddlePaddle-loadable model. To convert other custom models, change the values of the relevant variables and run `python tf2paddle.py` in a terminal.
-
-Below is a simple calling example:
-
-```python
-# Set the relevant variables
-tf_net = "TF_ResNet50" # name of the module that provides build_model
-paddle_tar_name = "Paddle_ResNet50.tar.gz" # file name of the output Paddle model
-
-# Initialize and load the model
-converter = TFModelConverter(tf_net=tf_net,
- paddle_tar_name=paddle_tar_name)
-# Run the conversion
-converter.convert()
-```
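-
-If the TensorFlow scopes do not contain `conv`, `bn`, or `fc` (see convention 3 above), a `layer_type_map` can be passed instead. A sketch continuing the example above, with made-up scope names:
-
-```python
-# Hypothetical scope names: map each TF scope to its paddle.layer type.
-converter = TFModelConverter(tf_net=tf_net,
-                             paddle_tar_name=paddle_tar_name,
-                             layer_type_map={"block1/unit1": "conv",
-                                             "block1/norm1": "bn"})
-converter.convert()
-```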
-
-### Notes
-
-1. Because TensorFlow's padding mechanism is rather special, when writing the PaddlePaddle network configuration, layers that need padding such as `paddle.layer.conv` may require computing the sizes and applying `paddle.layer.pad` outside `paddle.layer.conv`.
-1. Unlike TensorFlow, where image inputs usually use the NHWC data layout, PaddlePaddle organizes image input data in NCHW format, as sketched below.
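-
-The layout difference in the second note amounts to a transpose of the input array. A minimal sketch (not from the original documentation):
-
-```python
-import numpy as np
-
-# A TensorFlow-style batch: (batch, height, width, channels).
-nhwc = np.zeros((8, 224, 224, 3), dtype='float32')
-# Reorder to the (batch, channels, height, width) layout PaddlePaddle expects.
-nchw = nhwc.transpose((0, 3, 1, 2))
-print(nchw.shape)  # (8, 3, 224, 224)
-```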
diff --git a/legacy/image_classification/tf2paddle/tf2paddle.py b/legacy/image_classification/tf2paddle/tf2paddle.py
deleted file mode 100644
index 20b6cade17bf0c62ad86ebdf9da981a54bb63d32..0000000000000000000000000000000000000000
--- a/legacy/image_classification/tf2paddle/tf2paddle.py
+++ /dev/null
@@ -1,177 +0,0 @@
-import os
-import re
-import collections
-import struct
-import gzip
-import tarfile
-import cStringIO
-import numpy as np
-
-import tensorflow as tf
-
-from paddle.proto.ParameterConfig_pb2 import ParameterConfig
-from paddle.trainer_config_helpers.default_decorators import wrap_name_default
-
-
-class ModelConverter(object):
- def __init__(self,
- paddle_tar_name,
- param_name_map=None,
- layer_name_map=None,
- layer_type_map=None):
- self.tar_name = paddle_tar_name
- self.param_name_map = param_name_map
- self.layer_name_map = layer_name_map
- self.layer_type_map = layer_type_map
- self.params = dict()
-
- def convert(self):
- layers_params = self.arrange_layer_params()
- for layer_name in layers_params.keys():
- layer_params, layer_params_names, layer_type = layers_params[
- layer_name]
- if len(layer_params) > 0:
- if not layer_type:
- assert layer_type_map and (
- layer_type_map.get(layer_name) in ["conv", "bn", "fc"])
- layer_type = layer_type_map[layer_name]
- self.pre_layer_name = getattr(
- self, "convert_" + layer_type + "_layer")(
- layer_params,
- params_names=[
- self.param_name_map.get(name)
- if self.param_name_map else None
- for name in layer_params_names
- ],
- name=None if self.layer_name_map is None else
- self.layer_name_map.get(layer_name))
- with gzip.open(self.tar_name, 'w') as f:
- self.to_tar(f)
- return
-
- def to_tar(self, f):
- tar = tarfile.TarFile(fileobj=f, mode='w')
- for param_name in self.params.keys():
- param_conf, param_data = self.params[param_name]
-
- confStr = param_conf.SerializeToString()
- tarinfo = tarfile.TarInfo(name="%s.protobuf" % param_name)
- tarinfo.size = len(confStr)
- buf = cStringIO.StringIO(confStr)
- buf.seek(0)
- tar.addfile(tarinfo, fileobj=buf)
-
- buf = cStringIO.StringIO()
- self.serialize(param_data, buf)
- tarinfo = tarfile.TarInfo(name=param_name)
- buf.seek(0)
- tarinfo.size = len(buf.getvalue())
- tar.addfile(tarinfo, buf)
-
- @staticmethod
- def serialize(data, f):
- f.write(struct.pack("IIQ", 0, 4, data.size))
- f.write(data.tobytes())
-
-
-class TFModelConverter(ModelConverter):
- def __init__(self,
- tf_net,
- paddle_tar_name,
- param_name_map=None,
- layer_name_map=None,
- layer_type_map=None):
- super(TFModelConverter, self).__init__(paddle_tar_name, param_name_map,
- layer_name_map, layer_type_map)
- self.sess = __import__(tf_net).build_model()
-
- def arrange_layer_params(self):
- all_vars = tf.global_variables()
- layers_params = collections.OrderedDict()
- for var in all_vars:
- var_name = var.name
- scope_pos = var_name.rfind('/')
- if scope_pos != -1:
- layer_scope = var_name[:scope_pos]
- if layer_scope in layers_params:
- layer_params, layer_params_names, layer_type = layers_params[
- layer_scope]
- layer_params.append(var.eval(self.sess))
- layer_params_names.append(var_name)
- else:
- layer_type = re.search('conv|bn|fc', layer_scope)
- layers_params[layer_scope] = ([var.eval(self.sess)],
- [var_name], layer_type.group()
- if layer_type else None)
- return layers_params
-
- @wrap_name_default("conv")
- def convert_conv_layer(self, params, params_names=None, name=None):
- for i in range(len(params)):
- data = np.transpose(params[i], (
- 3, 2, 0, 1)) if len(params[i].shape) == 4 else params[i]
- if len(params) == 2:
- suffix = "0" if i == 0 else "bias"
- file_name = "_%s.w%s" % (name, suffix) if not (
- params_names and params_names[i]) else params_names[i]
- else:
- file_name = "_%s.w%s" % (name, str(i)) if not (
- params_names and params_names[i]) else params_names[i]
- param_conf = ParameterConfig()
- param_conf.name = file_name
- dims = list(data.shape)
- if len(dims) == 1:
- dims.insert(1, 1)
- param_conf.dims.extend(dims)
- param_conf.size = reduce(lambda a, b: a * b, data.shape)
- self.params[file_name] = (param_conf, data.flatten())
-
- @wrap_name_default("fc_layer")
- def convert_fc_layer(self, params, params_names=None, name=None):
- for i in range(len(params)):
- data = params[i]
- if len(params) == 2:
- suffix = "0" if i == 0 else "bias"
- file_name = "_%s.w%s" % (name, suffix) if not (
- params_names and params_names[i]) else params_names[i]
- else:
- file_name = "_%s.w%s" % (name, str(i)) if not (
- params_names and params_names[i]) else params_names[i]
- param_conf = ParameterConfig()
- param_conf.name = file_name
- dims = list(data.shape)
- if len(dims) < 2:
- dims.insert(0, 1)
- param_conf.size = reduce(lambda a, b: a * b, dims)
- param_conf.dims.extend(dims)
- self.params[file_name] = (param_conf, data.flatten())
- return name
-
- @wrap_name_default("batch_norm")
- def convert_bn_layer(self, params, params_names=None, name=None):
- params = [params[i] for i in (0, 2, 3, 1)]
- params_names = [params_names[i]
- for i in (0, 2, 3, 1)] if params_names else params_names
- for i in range(len(params)):
- data = params[i]
- file_name = "_%s.w%s" % (name, str(i)) if i < 3 else "_%s.w%s" % (
- name, "bias")
- file_name = file_name if not (params_names and
- params_names[i]) else params_names[i]
- param_conf = ParameterConfig()
- param_conf.name = file_name
- dims = list(data.shape)
- assert len(dims) == 1
- dims.insert(0, 1)
- param_conf.size = reduce(lambda a, b: a * b, dims)
- param_conf.dims.extend(dims)
- self.params[file_name] = (param_conf, data.flatten())
- return name
-
-
-if __name__ == "__main__":
- tf_net = "TF_ResNet"
- paddle_tar_name = "Paddle_ResNet50.tar.gz"
-
- converter = TFModelConverter(tf_net=tf_net, paddle_tar_name=paddle_tar_name)
- converter.convert()
diff --git a/legacy/image_classification/train.py b/legacy/image_classification/train.py
deleted file mode 100644
index d824e10d7e7b0fef2ce96fa3afc53e7ac51be856..0000000000000000000000000000000000000000
--- a/legacy/image_classification/train.py
+++ /dev/null
@@ -1,123 +0,0 @@
-import gzip
-import argparse
-
-import paddle.v2.dataset.flowers as flowers
-import paddle.v2 as paddle
-import reader
-import vgg
-import resnet
-import alexnet
-import googlenet
-import inception_v4
-import inception_resnet_v2
-import xception
-
-DATA_DIM = 3 * 224 * 224 # Use 3 * 331 * 331 or 3 * 299 * 299 for Inception-ResNet-v2.
-CLASS_DIM = 102
-BATCH_SIZE = 128
-
-
-def main():
- # parse the argument
- parser = argparse.ArgumentParser()
- parser.add_argument(
- 'model',
- help='The model for image classification',
- choices=[
- 'alexnet', 'vgg13', 'vgg16', 'vgg19', 'resnet', 'googlenet',
- 'inception-resnet-v2', 'inception_v4', 'xception'
- ])
- args = parser.parse_args()
-
- # PaddlePaddle init
- paddle.init(use_gpu=True, trainer_count=1)
-
- image = paddle.layer.data(
- name="image", type=paddle.data_type.dense_vector(DATA_DIM))
- lbl = paddle.layer.data(
- name="label", type=paddle.data_type.integer_value(CLASS_DIM))
-
- extra_layers = None
- learning_rate = 0.01
- if args.model == 'alexnet':
- out = alexnet.alexnet(image, class_dim=CLASS_DIM)
- elif args.model == 'vgg13':
- out = vgg.vgg13(image, class_dim=CLASS_DIM)
- elif args.model == 'vgg16':
- out = vgg.vgg16(image, class_dim=CLASS_DIM)
- elif args.model == 'vgg19':
- out = vgg.vgg19(image, class_dim=CLASS_DIM)
- elif args.model == 'resnet':
- out = resnet.resnet_imagenet(image, class_dim=CLASS_DIM)
- learning_rate = 0.1
- elif args.model == 'googlenet':
- out, out1, out2 = googlenet.googlenet(image, class_dim=CLASS_DIM)
- loss1 = paddle.layer.cross_entropy_cost(
- input=out1, label=lbl, coeff=0.3)
- paddle.evaluator.classification_error(input=out1, label=lbl)
- loss2 = paddle.layer.cross_entropy_cost(
- input=out2, label=lbl, coeff=0.3)
- paddle.evaluator.classification_error(input=out2, label=lbl)
- extra_layers = [loss1, loss2]
- elif args.model == 'inception-resnet-v2':
- assert DATA_DIM == 3 * 331 * 331 or DATA_DIM == 3 * 299 * 299
- out = inception_resnet_v2.inception_resnet_v2(
- image, class_dim=CLASS_DIM, dropout_rate=0.5, data_dim=DATA_DIM)
- elif args.model == 'inception_v4':
- out = inception_v4.inception_v4(image, class_dim=CLASS_DIM)
- elif args.model == 'xception':
- out = xception.xception(image, class_dim=CLASS_DIM)
-
- cost = paddle.layer.classification_cost(input=out, label=lbl)
-
- # Create parameters
- parameters = paddle.parameters.create(cost)
-
- # Create optimizer
- optimizer = paddle.optimizer.Momentum(
- momentum=0.9,
- regularization=paddle.optimizer.L2Regularization(rate=0.0005 *
- BATCH_SIZE),
- learning_rate=learning_rate / BATCH_SIZE,
- learning_rate_decay_a=0.1,
- learning_rate_decay_b=128000 * 35,
- learning_rate_schedule="discexp", )
-
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- flowers.train(),
- # To use other data, replace the above line with:
- # reader.train_reader('train.list'),
- buf_size=1000),
- batch_size=BATCH_SIZE)
- test_reader = paddle.batch(
- flowers.valid(),
- # To use other data, replace the above line with:
- # reader.test_reader('val.list'),
- batch_size=BATCH_SIZE)
-
- # Create trainer
- trainer = paddle.trainer.SGD(cost=cost,
- parameters=parameters,
- update_equation=optimizer,
- extra_layers=extra_layers)
-
- # End batch and end pass event handler
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % 1 == 0:
- print "\nPass %d, Batch %d, Cost %f, %s" % (
- event.pass_id, event.batch_id, event.cost, event.metrics)
- if isinstance(event, paddle.event.EndPass):
- with gzip.open('params_pass_%d.tar.gz' % event.pass_id, 'w') as f:
- trainer.save_parameter_to_tar(f)
-
- result = trainer.test(reader=test_reader)
- print "\nTest with Pass %d, %s" % (event.pass_id, result.metrics)
-
- trainer.train(
- reader=train_reader, num_passes=200, event_handler=event_handler)
-
-
-if __name__ == '__main__':
- main()
diff --git a/legacy/image_classification/vgg.py b/legacy/image_classification/vgg.py
deleted file mode 100644
index 4abfb4bf238484c3dbcf4b7f345a64aee55de6d1..0000000000000000000000000000000000000000
--- a/legacy/image_classification/vgg.py
+++ /dev/null
@@ -1,53 +0,0 @@
-import paddle.v2 as paddle
-
-__all__ = ['vgg13', 'vgg16', 'vgg19']
-
-
-def vgg(input, nums, class_dim):
- def conv_block(input, num_filter, groups, num_channels=None):
- return paddle.networks.img_conv_group(
- input=input,
- num_channels=num_channels,
- pool_size=2,
- pool_stride=2,
- conv_num_filter=[num_filter] * groups,
- conv_filter_size=3,
- conv_act=paddle.activation.Relu(),
- pool_type=paddle.pooling.Max())
-
- assert len(nums) == 5
- # the input image has 3 channels
- conv1 = conv_block(input, 64, nums[0], 3)
- conv2 = conv_block(conv1, 128, nums[1])
- conv3 = conv_block(conv2, 256, nums[2])
- conv4 = conv_block(conv3, 512, nums[3])
- conv5 = conv_block(conv4, 512, nums[4])
-
- fc_dim = 4096
- fc1 = paddle.layer.fc(input=conv5,
- size=fc_dim,
- act=paddle.activation.Relu(),
- layer_attr=paddle.attr.Extra(drop_rate=0.5))
- fc2 = paddle.layer.fc(input=fc1,
- size=fc_dim,
- act=paddle.activation.Relu(),
- layer_attr=paddle.attr.Extra(drop_rate=0.5))
- out = paddle.layer.fc(input=fc2,
- size=class_dim,
- act=paddle.activation.Softmax())
- return out
-
-
-def vgg13(input, class_dim):
- nums = [2, 2, 2, 2, 2]
- return vgg(input, nums, class_dim)
-
-
-def vgg16(input, class_dim):
- nums = [2, 2, 3, 3, 3]
- return vgg(input, nums, class_dim)
-
-
-def vgg19(input, class_dim):
- nums = [2, 2, 4, 4, 4]
- return vgg(input, nums, class_dim)
diff --git a/legacy/image_classification/xception.py b/legacy/image_classification/xception.py
deleted file mode 100644
index fbe8f4ed519829836e997e25cc47996ac82eb1a6..0000000000000000000000000000000000000000
--- a/legacy/image_classification/xception.py
+++ /dev/null
@@ -1,192 +0,0 @@
-import paddle.v2 as paddle
-
-__all__ = ['xception']
-
-
-def img_separable_conv_bn(name, input, num_channels, num_out_channels,
- filter_size, stride, padding, act):
- conv = paddle.networks.img_separable_conv(
- name=name,
- input=input,
- num_channels=num_channels,
- num_out_channels=num_out_channels,
- filter_size=filter_size,
- stride=stride,
- padding=padding,
- act=paddle.activation.Linear())
- norm = paddle.layer.batch_norm(name=name + '_norm', input=conv, act=act)
- return norm
-
-
-def img_conv_bn(name, input, num_channels, num_filters, filter_size, stride,
- padding, act):
- conv = paddle.layer.img_conv(
- name=name,
- input=input,
- num_channels=num_channels,
- num_filters=num_filters,
- filter_size=filter_size,
- stride=stride,
- padding=padding,
- act=paddle.activation.Linear())
- norm = paddle.layer.batch_norm(name=name + '_norm', input=conv, act=act)
- return norm
-
-
-def conv_block0(input,
- group,
- num_channels,
- num_filters,
- num_filters2=None,
- filter_size=3,
- pool_padding=0,
- entry_relu=True):
- if num_filters2 is None:
- num_filters2 = num_filters
-
- if entry_relu:
- act_input = paddle.layer.mixed(
- input=paddle.layer.identity_projection(input=input),
- act=paddle.activation.Relu())
- else:
- act_input = input
- conv0 = img_separable_conv_bn(
- name='xception_block{0}_conv0'.format(group),
- input=act_input,
- num_channels=num_channels,
- num_out_channels=num_filters,
- filter_size=filter_size,
- stride=1,
- padding=(filter_size - 1) / 2,
- act=paddle.activation.Relu())
- conv1 = img_separable_conv_bn(
- name='xception_block{0}_conv1'.format(group),
- input=conv0,
- num_channels=num_filters,
- num_out_channels=num_filters2,
- filter_size=filter_size,
- stride=1,
- padding=(filter_size - 1) / 2,
- act=paddle.activation.Linear())
- pool0 = paddle.layer.img_pool(
- name='xception_block{0}_pool'.format(group),
- input=conv1,
- pool_size=3,
- stride=2,
- padding=pool_padding,
- num_channels=num_filters2,
- pool_type=paddle.pooling.CudnnMax())
-
- shortcut = img_conv_bn(
- name='xception_block{0}_shortcut'.format(group),
- input=input,
- num_channels=num_channels,
- num_filters=num_filters2,
- filter_size=1,
- stride=2,
- padding=0,
- act=paddle.activation.Linear())
-
- return paddle.layer.addto(
- input=[pool0, shortcut], act=paddle.activation.Linear())
-
-
-def conv_block1(input, group, num_channels, num_filters, filter_size=3):
- act_input = paddle.layer.mixed(
- input=paddle.layer.identity_projection(input=input),
- act=paddle.activation.Relu())
- conv0 = img_separable_conv_bn(
- name='xception_block{0}_conv0'.format(group),
- input=act_input,
- num_channels=num_channels,
- num_out_channels=num_filters,
- filter_size=filter_size,
- stride=1,
- padding=(filter_size - 1) / 2,
- act=paddle.activation.Relu())
- conv1 = img_separable_conv_bn(
- name='xception_block{0}_conv1'.format(group),
- input=conv0,
- num_channels=num_filters,
- num_out_channels=num_filters,
- filter_size=filter_size,
- stride=1,
- padding=(filter_size - 1) / 2,
- act=paddle.activation.Relu())
- conv2 = img_separable_conv_bn(
- name='xception_block{0}_conv2'.format(group),
- input=conv1,
- num_channels=num_filters,
- num_out_channels=num_filters,
- filter_size=filter_size,
- stride=1,
- padding=(filter_size - 1) / 2,
- act=paddle.activation.Linear())
-
- shortcut = input
- return paddle.layer.addto(
- input=[conv2, shortcut], act=paddle.activation.Linear())
-
-
-def xception(input, class_dim):
- conv = img_conv_bn(
- name='xception_conv0',
- input=input,
- num_channels=3,
- num_filters=32,
- filter_size=3,
- stride=2,
- padding=1,
- act=paddle.activation.Relu())
- conv = img_conv_bn(
- name='xception_conv1',
- input=conv,
- num_channels=32,
- num_filters=64,
- filter_size=3,
- stride=1,
- padding=1,
- act=paddle.activation.Relu())
- conv = conv_block0(
- input=conv, group=2, num_channels=64, num_filters=128, entry_relu=False)
- conv = conv_block0(input=conv, group=3, num_channels=128, num_filters=256)
- conv = conv_block0(input=conv, group=4, num_channels=256, num_filters=728)
- for group in range(5, 13):
- conv = conv_block1(
- input=conv, group=group, num_channels=728, num_filters=728)
- conv = conv_block0(
- input=conv,
- group=13,
- num_channels=728,
- num_filters=728,
- num_filters2=1024)
- conv = img_separable_conv_bn(
- name='xception_conv14',
- input=conv,
- num_channels=1024,
- num_out_channels=1536,
- filter_size=3,
- stride=1,
- padding=1,
- act=paddle.activation.Relu())
- conv = img_separable_conv_bn(
- name='xception_conv15',
- input=conv,
- num_channels=1536,
- num_out_channels=2048,
- filter_size=3,
- stride=1,
- padding=1,
- act=paddle.activation.Relu())
- pool = paddle.layer.img_pool(
- name='xception_global_pool',
- input=conv,
- pool_size=7,
- stride=1,
- num_channels=2048,
- pool_type=paddle.pooling.CudnnAvg())
- out = paddle.layer.fc(name='xception_fc',
- input=pool,
- size=class_dim,
- act=paddle.activation.Softmax())
- return out
diff --git a/legacy/ltr/README.md b/legacy/ltr/README.md
deleted file mode 100644
index e7ce9f9215fd85ed3008627f3041a7000ecf219d..0000000000000000000000000000000000000000
--- a/legacy/ltr/README.md
+++ /dev/null
@@ -1,309 +0,0 @@
-Running the examples in this directory requires PaddlePaddle v0.10.0. If your installed version is older, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update PaddlePaddle.
-
----
-
-# Learning to Rank
-
-Learning to rank (LTR)\[[1](#references)\] is a family of machine learning methods for building ranking models, and plays an important role in machine learning scenarios such as information retrieval, natural language processing, and data mining. Its main purpose is, for a given set of documents, to produce a relevance-reflecting ordering of the documents for any query. In this example, two classic ranking models, RankNet\[[4](#references)\] and LambdaRank\[[6](#references)\], are trained on an annotated corpus; each produces a ranking model that can order the relevant documents for any query.
-
-## Background
-
-Learning to rank has received growing attention with the rapid growth of the Internet and is one of the common tasks in machine learning. Hand-written ranking rules can neither handle candidate data at massive scale nor assign appropriate weights to candidate data from different channels, so learned ranking models are applied very widely. LTR originated in information retrieval and is still a core module in many retrieval scenarios, such as ranking search-engine results, ranking candidate sets in recommender systems, and online advertising. This example illustrates the ranking model with a document retrieval task.
-
-
-
-Figure 1. The role of the ranking model in a search engine, a typical document retrieval application
-
-
-Assume a set of documents $S$. The document retrieval task is to order the documents according to their relevance to a request. Given a query, the ranking model scores each document, and the documents are arranged in descending order of score to produce the query result. At training time, a query is given together with the ideal ordering and scores of its documents; at prediction time, the ranking model generates a document ordering for the given query. Common learning-to-rank methods fall into the following three categories:
-
-- Pointwise methods
-
- Pointwise methods reduce ranking to regression. A single input sample is a **score-document** pair: the relevance score of each query-document pair is treated as a real-valued or ordinal score, so each query-document pair becomes one sample point (hence "pointwise") for training the ranking model. At prediction time, the model outputs the relevance score of the given query-document pair.
-
-- Pairwise methods
-
- Pairwise methods reduce ranking to binary classification. A single input sample is a **label-document pair**: for the multiple result documents of one query, any two documents are combined into a document pair as an input sample. That is, a binary classifier is learned which, for an input document pair (A, B) (hence "pairwise"), outputs label 1 or 0 according to whether A is more relevant than B. Classifying all document pairs yields a set of partial-order relations, from which an ordering of the whole document set can be constructed. The principle of this family is to reduce ranking errors by reducing the number of inverted document pairs in the ordering of the document set $S$.
-
-- Listwise methods
-
- Listwise methods optimize the ranked list directly. A single input sample is a whole **document permutation**. A suitable metric is constructed to measure the gap between the current ordering and the optimal ordering, and the ranking model is obtained by optimizing this metric. Since many ranking metrics are discontinuous, the optimization is difficult.
-
-
-
-Figure 2. The three families of ranking methods
-
-
-## Experimental data
-
-The experiments use the [LETOR](http://research.microsoft.com/en-us/um/beijing/projects/letor/LETOR4.0/Data/MQ2007.rar) benchmark corpus for learning to rank. Part of the data comes from query results of the Gov2 website; it contains about 1700 queries with result document lists whose relevance has been manually annotated. Each query has a unique query id and corresponds to multiple relevant documents, which form the result document list of that query. Each document is represented by a one-dimensional feature vector and carries a manually annotated relevance score with respect to the query.
-
-On the first run, this example automatically downloads and caches the LETOR MQ2007 dataset; no manual download is needed.
-
-The `mq2007` dataset can generate input in a format for each of the three types of ranking models; the generation format `format` must be specified.
-
-For example, call the interface as follows:
-
-```python
-pairwise_train_dataset = functools.partial(paddle.dataset.mq2007.train, format="pairwise")
-for label, left_doc, right_doc in pairwise_train_dataset():
- ...
-```
-
-## Model overview
-
-This example provides RankNet, a Pairwise method, and LambdaRank, a Listwise method, representing the two learning families. A Pointwise ranking model degenerates to a regression problem; please refer to the [recommender system](https://github.com/PaddlePaddle/book/blob/develop/05.recommender_system/README.cn.md) chapter of the PaddlePaddle Book.
-
-## The RankNet ranking model
-
-[RankNet](http://icml.cc/2015/wp-content/uploads/2015/06/icml_ranking.pdf) is a classic Pairwise learning-to-rank method and a typical feed-forward neural ranking model. Denote the $i$-th document in the document set $S$ by $U_i$ and its feature vector by $x_i$. For a given document pair $U_i$, $U_j$, RankNet maps each input feature vector $x$ to $f(x)$, obtaining $s_i=f(x_i)$ and $s_j=f(x_j)$. Writing the probability that $U_i$ is more relevant than $U_j$ as $P_{i,j}$, we have
-
-$$P_{i,j}=P(U_{i}>U_{j})=\frac{1}{1+e^{-\sigma (s_{i}-s_{j})}}$$
-
-Since most ranking metrics are discontinuous and non-smooth, RankNet needs a cost function $C$ that can be optimized. The cross entropy is used to measure the prediction cost, and the loss function $C$ is written as
-
-$$C_{i,j}=-\bar{P_{i,j}}\log P_{i,j}-(1-\bar{P_{i,j}})\log(1-P_{i,j})$$
-
-where $\bar{P_{i,j}}$ denotes the ground-truth probability, defined as
-
-$$\bar{P_{i,j}}=\frac{1}{2}(1+S_{i,j})$$
-
-and $S_{i,j} \in \{+1, 0, -1\}$ is the label of the pair formed by $U_i$ and $U_j$: $+1$ if $U_i$ is more relevant than $U_j$, $-1$ if less relevant, and $0$ if they tie.
-
-Combining the above yields a differentiable loss function
-
-$$C=\frac{1}{2}(1-S_{i,j})\sigma (s_{i}-s_{j})+\log(1+e^{-\sigma (s_{i}-s_{j})})$$
-
-which can be optimized with ordinary gradient descent. See [RankNet](http://icml.cc/2015/wp-content/uploads/2015/06/icml_ranking.pdf) for details.
-
-The gradient of the loss with respect to the score of document $U_i$ during ranking optimization is then
-
-$$\lambda_{i,j}=\frac{\partial C}{\partial s_{i}} = \sigma\left(\frac{1}{2}(1-S_{i,j})-\frac{1}{1+e^{\sigma (s_{i}-s_{j})}}\right)$$
-
-which can be read as how much document $U_i$ moves up or down in the current round of ranking optimization.
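-
-The cost and gradient above are easy to sanity-check numerically. Below is a minimal NumPy sketch (for illustration only, not part of the PaddlePaddle model; `ranknet_pair_cost` is a name introduced here):
-
-```python
-import numpy as np
-
-def ranknet_pair_cost(s_i, s_j, S_ij, sigma=1.0):
-    """Pairwise cost C and gradient lambda_ij for one document pair."""
-    diff = sigma * (s_i - s_j)
-    cost = 0.5 * (1.0 - S_ij) * diff + np.log(1.0 + np.exp(-diff))
-    lam = sigma * (0.5 * (1.0 - S_ij) - 1.0 / (1.0 + np.exp(diff)))
-    return cost, lam
-
-# The label agrees with the scores (S_ij=+1, s_i > s_j): the cost is small
-# and lambda is a small negative value, i.e. s_i is pushed up only slightly.
-print(ranknet_pair_cost(s_i=2.0, s_j=0.5, S_ij=1))
-```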
-
-Following this derivation, the RankNet network is built from several hidden layers and fully connected layers, as shown in the figure: the document features are transformed layer by layer by the hidden and fully connected layers, completing the transformation from the low-level feature space to a high-level feature space. The docA and docB branches are structurally symmetric, and both feed into the final RankCost layer.
-
-
-Figure 3. Structure of the RankNet network
-
-
-- Fully connected layer: every node of the previous layer is connected to the next layer; implemented here with `paddle.layer.fc`. Note that the fully connected layer feeding the RankCost layer has output dimension 1.
-- RankCost layer: the core of the RankNet ranking network. It measures whether docA is more relevant than docB, producing a prediction that is compared with the label. The cross entropy is used as the loss, optimized by gradient descent. See [RankNet](http://icml.cc/2015/wp-content/uploads/2015/06/icml_ranking.pdf)[4] for details.
-
-Since the Pairwise network structure is left-right symmetric, one half of the network can be defined and the other half shares its parameters: PaddlePaddle allows shared connections in a network, and parameters with the same name share their values. The sample code defining the RankNet structure with PaddlePaddle is the `half_ranknet` function in [ranknet.py](ranknet.py).
-
-The structure defined in `half_ranknet` is the same as in Figure 3: two layers, a fully connected layer with `hidden_size=10` and a fully connected layer with `hidden_size=1`. Here `input_dim` refers to the feature dimension of a **single document**, and the label takes values 1 or 0. Each input sample has the structure `<label>, <docA features>, <docB features>`. Taking docA as an example, its `input_dim` document features are transformed into 10-dimensional and then 1-dimensional features, which finally enter the RankCost layer, where docA and docB are compared to produce the prediction.
-
-### Training the RankNet model
-
-Train the `RankNet` model from the command line:
-```bash
-python train.py --model_type ranknet
-```
-The first run automatically downloads the data, trains the RankNet model, and stores the model parameters of every pass.
-
-### Prediction with the RankNet model
-
-Use the trained `RankNet` model for prediction from the command line:
-```bash
-python infer.py --model_type ranknet --test_model_path models/ranknet_params_0.tar.gz
-```
-
-This example provides both the training and the prediction part of the RankNet model. A trained model consists of the topology (note that `rank_cost` is not part of the inference topology) and the parameter file. Prediction reuses the `half_ranknet` topology from `ranknet` training and loads the model parameters from disk. The prediction input is the feature vector of a single document and the model outputs a relevance score; sorting the predicted scores gives the final document relevance ranking.
-
-## User-defined RankNet data
-
-The code above uses PaddlePaddle's built-in ranking data. To use data in a custom format, refer to the built-in `mq2007` dataset and write a new generator function. Suppose, for example, the input data has the following format and contains only the three documents doc0-doc2,
-
-with one record per line of the form `<query_id> <relevance_score> <feature_vector>` (featureid: feature_value):
-
-```
-query_id : 1, relevance_score:1, feature_vector 0:0.1, 1:0.2, 2:0.4 #doc0
-query_id : 1, relevance_score:2, feature_vector 0:0.3, 1:0.1, 2:0.4 #doc1
-query_id : 1, relevance_score:0, feature_vector 0:0.2, 1:0.4, 2:0.1 #doc2
-query_id : 2, relevance_score:0, feature_vector 0:0.1, 1:0.4, 2:0.1 #doc0
-.....
-```
-
-The input samples must be converted to the Pairwise input format, for example into the same structure as the mq2007 Pairwise format:
-
-`<label> <docA> <docB>`
-
-```
-1 doc1 doc0
-1 doc1 doc2
-1 doc0 doc2
-....
-```
-
-Note that in Pairwise-format data, label=1 conventionally means that docA is more relevant to the query than docB; in fact, the label information is implicit in the (docA, docB) combination. A record `0 docA docB` can simply be rewritten as `1 docB docA` by swapping the order.
-
-Moreover, enumerating all pairs makes the training data redundant, because the total order on the document set can be recovered from a subset of the partial-order relations. See the [Pairwise approach](http://www.machinelearning.org/proceedings/icml2007/papers/139.pdf)\[[5](#references)\] for related research; this example does not elaborate further. A generator skeleton follows:
-
-```python
-# a customized data generator
-def gen_pairwise_data(text_line_of_data):
- """
- return :
- ------
- label : np.array, shape=(1)
- docA_feature_vector : np.array, shape=(1, feature_dimension)
- docA_feature_vector : np.array, shape=(1, feature_dimension)
- """
- return label, docA_feature_vector, docB_feature_vector
-```
-
-Mapping to PaddlePaddle inputs, `integer_value` is a single integer and `dense_vector` is a one-dimensional real vector, matching the generator; the correspondence of the input data must be declared before training the model:
-
-```python
-# Define the input data order
-feeding = { "label":0,
- "left/data" :1,
- "right/data":2}
-```
-
-## The LambdaRank ranking model
-
-[LambdaRank](https://papers.nips.cc/paper/2971-learning-to-rank-with-nonsmooth-cost-functions.pdf)\[[6](#references)\] is a Listwise ranking method developed by Burges et al. from RankNet. It optimizes the metric NDCG (Normalized Discounted Cumulative Gain) by constructing a lambda function (hence the name LambdaRank), and the whole result document list of one query forms a single training sample. NDCG is one of the standard measures of the ranking quality of a document list; the NDCG score of the top $K$ documents is
-
-$$NDCG@K=Z_{K}\sum_{i=1}^{K}\frac{2^{rel_{i}}-1}{\log_{2}(i+1)}$$
-
-where $Z_{K}$ normalizes the ideal ranking to score 1. As derived for RankNet above, what ranking optimization needs is the gradient of the ranking error. The NDCG metric is non-smooth and discontinuous, so its gradient cannot be obtained directly; instead, $|\Delta NDCG|=|NDCG(new) - NDCG(old)|$ is introduced and the lambda function is constructed as
-
-$$\lambda_{i,j}=\frac{\partial C}{\partial s_{i}}=-\frac{\sigma }{1+e^{\sigma (s_{i}-s_{j})}}|\Delta NDCG|$$
-
-Substituting this for the gradient in RankNet yields the ranking model called [LambdaRank](https://papers.nips.cc/paper/2971-learning-to-rank-with-nonsmooth-cost-functions.pdf).
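-
-For intuition, NDCG@K can be computed in a few lines of NumPy (for illustration only; `dcg_at_k` and `ndcg_at_k` are helpers introduced here, with $Z_{K}$ realized as division by the ideal DCG):
-
-```python
-import numpy as np
-
-def dcg_at_k(rels, k):
-    rels = np.asarray(rels, dtype=float)[:k]
-    return ((2.0 ** rels - 1.0) / np.log2(np.arange(2, rels.size + 2))).sum()
-
-def ndcg_at_k(rels, k):
-    """rels: graded relevances in ranked order."""
-    ideal = dcg_at_k(sorted(rels, reverse=True), k)
-    return dcg_at_k(rels, k) / ideal if ideal > 0 else 0.0
-
-print(ndcg_at_k([2, 0, 1], k=3))  # imperfect ranking, < 1.0
-print(ndcg_at_k([2, 1, 0], k=3))  # ideal ranking, == 1.0
-```
-
-Swapping two documents changes this score, and $|\Delta NDCG|$ is exactly the factor LambdaRank multiplies into the RankNet gradient.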
-
-From this derivation, the LambdaRank network structure is very similar to that of RankNet, as shown below.
-
-
-
-Figure 4. Structure of the LambdaRank network
-
-
-The result document list of one query enters the network as a single sample; the RankCost layer is replaced by a LambdaCost layer, and the rest of the structure is the same as in RankNet.
-
-- LambdaCost layer: uses the NDCG difference as the lambda function. `score` is a one-dimensional sequence, and for a single training sample the fully connected layer outputs a 1x1 sequence; both sequences have length equal to the number of documents returned for that query. See [LambdaRank](https://papers.nips.cc/paper/2971-learning-to-rank-with-nonsmooth-cost-functions.pdf) for the detailed construction of the lambda function.
-
-The sample code defining the LambdaRank network structure with PaddlePaddle is the `lambda_rank` function in [lambda_rank.py](lambda_rank.py).
-
-The overall structure follows Figure 3. As in RankNet, fully connected layers are stacked; in [lambda_rank.py](lambda_rank.py) these are two hidden layers of sizes 128 and 10 followed by a size-1 output layer. `input_dim` here refers to the feature dimension of a **single document**. Each input sample has the structure `<label sequence>, <document feature sequence>`; the `input_dim` features of each document are transformed through the hidden layers into a 1-dimensional score, which finally enters the LambdaCost layer. Note that both label and data here have the format **dense_vector_sequence**, representing a **sequence** of document scores or document features.
-
-### Training the LambdaRank model
-
-Train the `LambdaRank` model from the command line:
-```bash
-python train.py --model_type lambdarank
-```
-The first run of the script automatically downloads the data, trains the LambdaRank model, and stores the model of every pass.
-
-### Prediction with the LambdaRank model
-
-Prediction with LambdaRank works like RankNet: the topology reuses the model definition in the code, and the corresponding parameter file is loaded from disk. The prediction input is a document list; the output is a relevance score for each document in the list, and re-sorting the documents by score gives the final document ranking.
-
-Use the trained `LambdaRank` model for prediction from the command line:
-```bash
-python infer.py --model_type lambdarank --test_model_path models/lambda_rank_params_0.tar.gz
-```
-
-## User-defined LambdaRank data
-
-The code above uses PaddlePaddle's built-in mq2007 data. To use data in a custom format, refer to the built-in `mq2007` dataset and write a generator function. Suppose, for example, the input data has the following format and contains only the three documents doc0-doc2,
-
-with one record per line of the form `<query_id> <relevance_score> <feature_vector>`:
-
-```
-query_id : 1, relevance_score:1, feature_vector 0:0.1, 1:0.2, 2:0.4 #doc0
-query_id : 1, relevance_score:2, feature_vector 0:0.3, 1:0.1, 2:0.4 #doc1
-query_id : 1, relevance_score:0, feature_vector 0:0.2, 1:0.4, 2:0.1 #doc2
-query_id : 2, relevance_score:0, feature_vector 0:0.1, 1:0.4, 2:0.1 #doc0
-query_id : 2, relevance_score:2, feature_vector 0:0.1, 1:0.4, 2:0.1 #doc1
-.....
-```
-
-It must be converted to the Listwise format, for example
-
-`<query_id> <relevance_score> <feature_vector>` grouped by query:
-
-```tex
-1 1 0.1,0.2,0.4
-1 2 0.3,0.1,0.4
-1 0 0.2,0.4,0.1
-
-2 0 0.1,0.4,0.1
-2 2 0.1,0.4,0.1
-......
-```
-
-**Notes on the data format**
-
-- The number of documents of every sample must be greater than the `NDCG_num` of the `lambda_cost` layer.
-- If all documents of a sample have relevance 0, the NDCG computation is invalid and the query can be judged invalid; such queries are filtered out during training.
-
-```python
-# self define data generator
-def gen_listwise_data(text_all_lines_of_data):
- """
- return :
- ------
- label : np.array, shape=(samples_num, )
- querylist : np.array, shape=(samples_num, feature_dimension)
- """
- return label_list, query_docs_feature_vector_matrix
-```
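-
-For illustration only, a minimal reader for the plain-text Listwise format above could group records by query as follows (`gen_listwise` is a hypothetical helper, not part of this example, and assumes the records of one query are stored contiguously):
-
-```python
-import numpy as np
-from itertools import groupby
-
-def gen_listwise(path):
-    with open(path) as f:
-        rows = [line.split() for line in f if line.strip()]
-    for qid, group in groupby(rows, key=lambda r: r[0]):
-        group = list(group)
-        label_list = np.array([float(r[1]) for r in group])
-        feature_matrix = np.array(
-            [[float(v) for v in r[2].split(",")] for r in group])
-        yield label_list, feature_matrix
-```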
-
-Mapping to PaddlePaddle inputs, `label` has type `dense_vector_sequence` (a sequence of scores) and `data` has type `dense_vector_sequence` (a sequence of feature vectors); `input_dim` is the one-dimensional feature dimension of a single document, matching the generator. The correspondence of the input data must be declared before training the model:
-
-```python
-# Define the input data order
-feeding = {"label":0,
- "data" : 1}
-```
-
-## Printing custom evaluation metrics during training
-
-Taking `RankNet` as an example, this section shows how to print a custom evaluation metric during training. The same method can be used to fetch the output matrix of any layer of the network during training.
-
-The `RankNet` network learns a scoring function that scores the left and right inputs; the larger the gap between the two scores, the better the scoring function separates positive from negative examples, and the better the model generalizes. Suppose we want to print the mean absolute difference between the scores of the left and right inputs during training. Computing this custom metric requires the output matrices of the score layers (the layers named `left_score` and `right_score` in `ranknet`) after each mini-batch. This can be done in two steps:
-
-1. In the `event_handler`, handle the predefined `paddle.event.EndIteration` or `paddle.event.EndPass` events of `PaddlePaddle`.
-2. Call `event.gm.getLayerOutputs` with the name of the desired layer to obtain its value after the forward computation of a mini-batch.
-
-The code example follows:
-
-```python
-def score_diff(right_score, left_score):
- return np.average(np.abs(right_score - left_score))
-
-def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % 25 == 0:
- diff = score_diff(
- event.gm.getLayerOutputs("right_score")["right_score"][
- "value"],
- event.gm.getLayerOutputs("left_score")["left_score"][
- "value"])
- logger.info(("Pass %d Batch %d : Cost %.6f, "
- "average absolute diff scores: %.6f") %
- (event.pass_id, event.batch_id, event.cost, diff))
-```
-
-## Conclusion
-
-LTR is widely used in practice. Ranking models are generally built with Pointwise, Pairwise, or Listwise methods. Using the LETOR mq2007 data, this example explains RankNet, a classic Pairwise method, and LambdaRank, a Listwise method, shows how to build the corresponding ranking model structures with the PaddlePaddle framework, and provides samples for custom data types. PaddlePaddle offers a flexible programming interface, and the same code can run LTR tasks on a single machine with a single GPU or distributed across multiple machines with multiple GPUs.
-
-## Notes
-
-1. As a demonstration of LTR, this example uses a **small network**; in applications, the network complexity should be adjusted and the network scale reset according to the actual situation.
-2. The feature vectors in the experimental data are joint **query-document** features. When using independent query and document features, the network can be built as in [DSSM](https://github.com/PaddlePaddle/models/tree/develop/dssm).
-
-## References
-
-1. https://en.wikipedia.org/wiki/Learning_to_rank
-2. Liu T Y. [Learning to rank for information retrieval](http://ftp.nowpublishers.com/article/DownloadSummary/INR-016)[J]. Foundations and Trends® in Information Retrieval, 2009, 3(3): 225-331.
-3. Li H. [Learning to rank for information retrieval and natural language processing](http://www.morganclaypool.com/doi/abs/10.2200/S00607ED2V01Y201410HLT026)[J]. Synthesis Lectures on Human Language Technologies, 2014, 7(3): 1-121.
-4. Burges C, Shaked T, Renshaw E, et al. [Learning to rank using gradient descent](http://machinelearning.wustl.edu/mlpapers/paper_files/icml2005_BurgesSRLDHH05.pdf)[C]//Proceedings of the 22nd international conference on Machine learning. ACM, 2005: 89-96.
-5. Cao Z, Qin T, Liu T Y, et al. [Learning to rank: from pairwise approach to listwise approach](http://machinelearning.wustl.edu/mlpapers/paper_files/icml2007_CaoQLTL07.pdf)[C]//Proceedings of the 24th international conference on Machine learning. ACM, 2007: 129-136.
-6. Burges C J C, Ragno R, Le Q V. [Learning to rank with nonsmooth cost functions](https://papers.nips.cc/paper/2971-learning-to-rank-with-nonsmooth-cost-functions.pdf)[C]//NIPS. 2006, 6: 193-200.
diff --git a/legacy/ltr/README_en.md b/legacy/ltr/README_en.md
deleted file mode 100644
index 94bea313f463cc9a5c01a5b8c99c18e6d3aac005..0000000000000000000000000000000000000000
--- a/legacy/ltr/README_en.md
+++ /dev/null
@@ -1,302 +0,0 @@
-Running the program samples in this directory requires PaddlePaddle v0.10.0. If your installed version is below this requirement, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update PaddlePaddle.
-
-# Learning To Rank
-
-Learning to rank[1] is a family of machine learning methods for building ranking models, which plays an important role in computer science scenarios such as information retrieval, natural language processing, and data mining. Its primary purpose is, for a given set of documents, to produce an ordering of the documents that reflects their relevance to any query request. In this example, two classic ranking models, RankNet[4] and LambdaRank[6], are trained on an annotated corpus; each resulting model can sort the relevant documents for any query request.
-
-## Background Information
-Learning to rank is an application of machine learning. Hand-written ranking rules can neither deal with candidate data at large scale nor give appropriate weights to candidate data from different channels, so learned ranking is used very widely in daily life. Learning to rank originated in the field of information retrieval and is still a core part of many information retrieval systems, such as ranking search results in search engines, ranking candidate data in recommendation systems, and online advertising. In this example, we use a document retrieval task to illustrate the learning-to-rank model.
-
-![image](https://github.com/PaddlePaddle/models/blob/develop/ltr/images/search_engine_example.png?raw=true)
-
-Figure 1. The role of the ranking model in a search engine, a typical document retrieval application.
-
-Assume a set of documents $S$. The document retrieval task is to order the documents according to their relevance to a request. Given a query, the query engine scores every document and arranges the documents in descending order of score to obtain the query result. At training time, the model is given a query together with the ideal ordering and scores of the corresponding documents; at prediction time, it generates a document ordering for the query it receives. Common learning-to-rank methods fall into the following three categories.
-
-- Pointwise approach
-
-Here the learning-to-rank problem is viewed as a regression problem. A single input sample is a **score-document** pair: the relevance score of each query-document pair is treated as a real number or an ordinal score, so each individual query-document pair is used as one sample point (the origin of the word pointwise) to train the ranking model. At prediction time, the model gives the relevance score of the specified query-document pair.
-- Pairwise approach
-
-Here the learning-to-rank problem is approximated by a classification problem: learning a binary classifier that can tell which document is better in a given pair of documents. A single input sample is a **label-document pair**. For the multiple result documents of one query, any two documents are combined to form a document pair as an input sample. That is, a binary classifier is learned that takes a pair of documents A-B (the origin of pairwise) and gives the classification label 1 or 0 according to whether the relevance of A is better than that of B. After classifying all document pairs, we obtain a set of partial-order relations from which the ordering of the whole document set can be constructed. The principle of this kind of method is to reduce the number of inverted document pairs in the ordering of the given document set $S$, and thereby optimize the ranking result.
-- Listwise approach
-
-These algorithms try to directly optimize one of the ranking evaluation measures, averaged over all queries in the training data. A single input sample is a whole **document permutation**. An appropriate measurement function is constructed to measure the difference between the current ranking and the optimal ranking, and the ranking model is obtained by optimizing this measure. Optimization is difficult because most ranking loss functions are not continuous, smooth functions.
-
-![image](https://github.com/PaddlePaddle/models/blob/develop/ltr/images/learning_to_rank.jpg?raw=true)
-
-Figure 2. The three families of ranking methods
-
-## Experimental data
-
-The experimental data uses the LETOR benchmark corpus for learning to rank. Part of the query results come from the Gov2 website; the corpus contains about 1700 query request result document lists, with manual annotations of the relevance of the documents. Each query has a unique query id and corresponds to a number of relevant documents, which form a query request result list. The feature vector of each document is represented by a one-dimensional array and corresponds to a manually annotated relevance score with respect to the query.
-
-This example automatically downloads and caches the LETOR MQ2007 dataset on the first run; no manual download is needed.
-
-The **mq2007** dataset provides a generation format for each of the three types of ranking models; the **format** must be specified.
-
-For example, call the interface as follows:
-
-```
-pairwise_train_dataset = functools.partial(paddle.dataset.mq2007.train, format="pairwise")
-for label, left_doc, right_doc in pairwise_train_dataset():
- ...
-```
-## Model overview
-
-For the ranking model, this example provides RankNet, a Pairwise method, and LambdaRank, a Listwise method, respectively representing two of the learning families. The ranking model of the Pointwise method degenerates to a regression problem; please refer to the [recommendation system](https://github.com/PaddlePaddle/book/blob/develop/05.recommender_system/README.cn.md) chapter of the PaddleBook.
-
-## RankNet model
-
-[RankNet](http://icml.cc/2015/wp-content/uploads/2015/06/icml_ranking.pdf) is a classic Pairwise learning-to-rank method and a typical feed-forward neural network ranking model. The $i$-th document in the document collection $S$ is denoted $U_i$ and its document feature vector $x_i$. For a given document pair $U_i$, $U_j$, RankNet maps each input document feature vector $x$ to $f(x)$ and gets $s_i=f(x_i)$, $s_j=f(x_j)$. The probability that $U_i$ is more relevant than $U_j$ is written $P_{i,j}$:
-
-$$P_{i,j}=P(U_{i}>U_{j})=\frac{1}{1+e^{-\sigma (s_{i}-s_{j})}}$$
-
-Because most ranking metric functions are non-continuous and non-smooth, RankNet needs a metric function $C$ that can be optimized. First, the cross entropy is used as the measure of the prediction cost, and the loss function $C$ is written as
-
-$$C_{i,j}=-\bar{P_{i,j}}logP_{i,j}-(1-\bar{P_{i,j}})log(1-P_{i,j})$$
-
-Here $\bar{P_{i,j}}$ represents the ground-truth probability, defined as
-
-$$\bar{P_{i,j}}=\frac{1}{2}(1+S_{i,j})$$
-
-$S_{i,j} \in \{+1, 0, -1\}$ represents the label of the pair consisting of $U_i$ and $U_j$: $+1$ if $U_i$ is more relevant than $U_j$, $-1$ if less relevant, and $0$ if they tie.
-
-Finally, a differentiable loss function is obtained:
-
-$$C=\frac{1}{2}(1-S_{i,j})\sigma (s_{i}-s_{j})+\log(1+e^{-\sigma (s_{i}-s_{j})})$$
-
-It can be optimized using conventional gradient descent methods. See [RankNet](http://icml.cc/2015/wp-content/uploads/2015/06/icml_ranking.pdf) for details.
-
-Meanwhile, the gradient of the loss with respect to the score of document $U_i$ in the ranking optimization process is
-
-$$\lambda_{i,j}=\frac{\partial C}{\partial s_{i}} = \sigma\left(\frac{1}{2}(1-S_{i,j})-\frac{1}{1+e^{\sigma (s_{i}-s_{j})}}\right)$$
-
-This expresses how much document $U_i$ should move up or down in the current round of ranking optimization.
-
-Based on the above inference, the RankNet network structure is constructed from several hidden layers and fully connected layers. As shown in the figure, the document features are transformed layer by layer by the hidden and fully connected layers, completing the transformation from the low-level feature space to a high-level feature space. The docA and docB branches are symmetric, and both are input into the final RankCost layer.
-
-![image](https://github.com/sunshine-2015/models/blob/patch-4/ltr/images/ranknet_en.png?raw=true)
-
-Figure 3. Structure of the RankNet network
-
-- Fully connected layer: every node in the previous layer is connected to the next layer. In this example **paddle.layer.fc** is used; note that the fully connected layer feeding the RankCost layer has output dimension 1.
-
-- RankCost layer: the core of the RankNet ranking network. It measures whether the relevance of docA is better than that of docB, gives the predicted value, and compares it with the label. Cross entropy is used as the loss, optimized with a gradient descent method. Details can be found in [RankNet](http://icml.cc/2015/wp-content/uploads/2015/06/icml_ranking.pdf) [4].
-
-Because the network structure in Pairwise is left-right symmetric, half of the network structure can be defined and the other half shares its parameters: PaddlePaddle allows sharing connections in the network structure, and parameters with the same name share their values. The sample code defining the network structure is the **half_ranknet** function in [ranknet.py](https://github.com/PaddlePaddle/models/blob/develop/ltr/ranknet.py).
-
-The structure defined in **half_ranknet** is the same as in Figure 3: two layers, a fully connected layer with **hidden_size=10** and a fully connected layer with **hidden_size=1**. In this example, **input_dim** refers to the feature dimension of a **single document**, and the label takes values 1 or 0. Each input sample has the structure `<label>, <docA features>, <docB features>`. Taking docA as an example, its **input_dim** document features are turned into 10-dimensional and then 1-dimensional features and finally input into the RankCost layer, where docA and docB are compared to give the predicted value.
-
-### RankNet model training
-
-Train the **RankNet** model from the command line:
-
-```
-python train.py --model_type ranknet
-```
-
-The initial execution automatically downloads the data and trains the RankNet model, storing the parameters of the model at each pass.
-
-### RankNet model prediction
-Use the trained **RankNet** model for prediction from the command line:
-
-```
-python infer.py --model_type ranknet --test_model_path models/ranknet_params_0.tar.gz
-```
-
-This example provides both the training and the prediction part of the RankNet model. After training, the model consists of two parts: the topology (note that **rank_cost** is not part of the inference topology) and the parameter file. Prediction reuses the **half_ranknet** topology from **ranknet** training, and the model parameters are loaded from disk. The prediction input is the feature vector of a single document and the model gives a relevance score; sorting the predicted scores gives the final document relevance ranking.
-
-## User-defined RankNet data
-
-The above code uses PaddlePaddle's built-in ranking data. If you want to use data in a custom format, you can refer to PaddlePaddle's built-in **mq2007** dataset and write a new generator function. For example, suppose the input data has the following format and contains only the three documents doc0-doc2,
-
-with one record per line of the form `<query_id> <relevance_score> <feature_vector>` (featureid: feature_value):
-
-```
-query_id : 1, relevance_score:1, feature_vector 0:0.1, 1:0.2, 2:0.4 #doc0
-query_id : 1, relevance_score:2, feature_vector 0:0.3, 1:0.1, 2:0.4 #doc1
-query_id : 1, relevance_score:0, feature_vector 0:0.2, 1:0.4, 2:0.1 #doc2
-query_id : 2, relevance_score:0, feature_vector 0:0.1, 1:0.4, 2:0.1 #doc0
-.....
-```
-
-
-The input samples need to be converted to the Pairwise input format, for example into the same structure as the mq2007 Pairwise format:
-
-`<label> <docA> <docB>`
-
-```
-1 doc1 doc0
-1 doc1 doc2
-1 doc0 doc2
-....
-```
-
-
-Note that in Pairwise format data, label=1 generally indicates that docA is more relevant to the query than docB; in fact, the label information is implicit in the combination of docA and docB. If there is a record **0 docA docB**, swap the order to construct **1 docB docA**.
-
-In addition, combining all pairs makes the training data redundant, because the total order relation on the document set can be recovered from a subset of the partial-order relations. See the [Pairwise approach](http://www.machinelearning.org/proceedings/icml2007/papers/139.pdf) [5] for related research; this example does not elaborate. A pair-generation sketch and the generator skeleton follow.
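-
-For illustration only, pairwise samples for one query can be generated from its (score, feature_vector) records as sketched below (`make_pairs` is a hypothetical helper, not part of this example):
-
-```
-import itertools
-
-def make_pairs(docs):
-    """docs: list of (relevance_score, feature_vector) for one query."""
-    for (s_a, v_a), (s_b, v_b) in itertools.combinations(docs, 2):
-        if s_a == s_b:
-            continue  # ties carry no pairwise preference
-        yield (1, v_a, v_b) if s_a > s_b else (1, v_b, v_a)
-
-# For query 1 above this yields the same three pairs as the listing
-# (possibly in a different order): 1 (doc1, doc0), 1 (doc1, doc2), 1 (doc0, doc2).
-pairs = list(make_pairs([(1, [0.1, 0.2, 0.4]),    # doc0
-                         (2, [0.3, 0.1, 0.4]),    # doc1
-                         (0, [0.2, 0.4, 0.1])]))  # doc2
-```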
-
-```
-# a customized data generator
-def gen_pairwise_data(text_line_of_data):
- """
- return :
- ------
- label : np.array, shape=(1)
- docA_feature_vector : np.array, shape=(1, feature_dimension)
- docA_feature_vector : np.array, shape=(1, feature_dimension)
- """
- return label, docA_feature_vector, docB_feature_vector
-```
-
-
-Corresponding to the paddle input, **integer_value** is a single integer and **dense_vector** is a one-dimensional real vector, matching the generator; the input data order must be specified before training the model:
-
-
-```
-# Define the input data order
-feeding = { "label":0,
- "left/data" :1,
- "right/data":2}
-```
-
-## LambdaRank ranking model
-[LambdaRank](https://papers.nips.cc/paper/2971-learning-to-rank-with-nonsmooth-cost-functions.pdf)[6] is a Listwise ranking method developed by Burges et al. [6] from RankNet. It optimizes the metric NDCG (Normalized Discounted Cumulative Gain) by constructing a lambda function (the origin of the LambdaRank name), and the whole result document list of one query is used as a single training sample. NDCG is one of the standard measures of the ranking quality of a document list; the NDCG score of the top $K$ documents is written
-
-$$NDCG@K=Z_{K}\sum_{i=1}^{K}\frac{2^{rel_{i}}-1}{\log_{2}(i+1)}$$
-
-where $Z_{K}$ normalizes the ideal ranking to score 1. As previously deduced for RankNet, document ranking needs the gradient of the ranking error. The NDCG metric function is non-smooth and non-continuous, so its gradient cannot be obtained directly. Therefore $|\Delta NDCG|=|NDCG(new) - NDCG(old)|$ is introduced to construct the lambda function
-
-$$\lambda_{i,j}=\frac{\partial C}{\partial s_{i}}=-\frac{\sigma }{1+e^{\sigma (s_{i}-s_{j})}}|\Delta NDCG|$$
-
-Replacing the gradient representation in RankNet with this yields the ranking model called [LambdaRank](https://papers.nips.cc/paper/2971-learning-to-rank-with-nonsmooth-cost-functions.pdf).
-
-From the above derivation we can see that the LambdaRank network structure is very similar to the RankNet structure, as the picture shows.
-
-![image](https://github.com/sunshine-2015/models/blob/patch-4/ltr/images/LambdaRank_EN.png?raw=true)
-
-Figure 4. Network structure of LambdaRank
-
-The result document list of one query replaces the document pair as the input sample, the RankCost layer is replaced by a LambdaCost layer, and the rest of the network is kept the same as in RankNet.
-
-- LambdaCost layer: the LambdaCost layer uses the NDCG difference as the lambda function. The score is a one-dimensional sequence, and for a single training sample the fully connected layer outputs a 1x1 sequence; both sequences have length equal to the number of documents obtained by the query. The construction of the lambda function is detailed in [LambdaRank](https://papers.nips.cc/paper/2971-learning-to-rank-with-nonsmooth-cost-functions.pdf).
-
-
-The LambdaRank network structure defined using PaddlePaddle is the **lambda_rank** function in [lambda_rank.py](https://github.com/PaddlePaddle/models/blob/develop/ltr/lambda_rank.py).
-
-The overall structure follows Figure 3. Similar to RankNet, fully connected layers are stacked; in lambda_rank.py these are two hidden layers of sizes 128 and 10 followed by a size-1 output layer. The input_dim in this example refers to the feature dimension of a single document. Each input sample has the structure `<label sequence>, <document feature sequence>`; the input_dim features of each document are transformed through the hidden layers into a 1-dimensional score and finally input into the LambdaCost layer. Note that both label and data here have the format **dense_vector_sequence**, representing a **sequence** of document scores or document features.
-
-### LambdaRank model training
-
-Train the **LambdaRank** model from the command line:
-
-```
-python train.py --model_type lambdarank
-```
-
-The first run of the script automatically downloads the data, trains the LambdaRank model, and stores the model of each pass.
-
-### LambdaRank model prediction
-The prediction process for LambdaRank is the same as for RankNet: the topology reuses the model definition in the code and loads the corresponding parameter file from disk. The prediction input is a document list; the output is a relevance score for each document in the list, and the documents are re-sorted by score to obtain the final ranking.
-
-Use the trained LambdaRank model for prediction:
-
-```
-python infer.py --model_type lambdarank --test_model_path models/lambda_rank_params_0.tar.gz
-```
-
-## Customize LambdaRank data
-The above code uses the built-in mq2007 data from PaddlePaddle. If you want to use data in a custom format, you can refer to the built-in mq2007 dataset in PaddlePaddle and write a generator function. For example, suppose the input data has the following format and contains only the three documents doc0-doc2,
-
-with one record per line of the form `<query_id> <relevance_score> <feature_vector>`:
-
-```
-query_id : 1, relevance_score:1, feature_vector 0:0.1, 1:0.2, 2:0.4 #doc0
-query_id : 1, relevance_score:2, feature_vector 0:0.3, 1:0.1, 2:0.4 #doc1
-query_id : 1, relevance_score:0, feature_vector 0:0.2, 1:0.4, 2:0.1 #doc2
-query_id : 2, relevance_score:0, feature_vector 0:0.1, 1:0.4, 2:0.1 #doc0
-query_id : 2, relevance_score:2, feature_vector 0:0.1, 1:0.4, 2:0.1 #doc1
-.....
-```
-
-Convert it to the Listwise format, for example
-
-`<query_id> <relevance_score> <feature_vector>` grouped by query:
-
-```
-1 1 0.1,0.2,0.4
-1 2 0.3,0.1,0.4
-1 0 0.2,0.4,0.1
-
-2 0 0.1,0.4,0.1
-2 2 0.1,0.4,0.1
-......
-```
-**Notes on the data format**
-- The number of documents of every sample in the data must be greater than the NDCG_num of the **lambda_cost** layer.
-- If all documents of a single sample have relevance 0, the NDCG computation is invalid and the query can be judged invalid; we filter out such queries during training.
-
-
-```
-# self define data generator
-def gen_listwise_data(text_all_lines_of_data):
- """
- return :
- ------
- label : np.array, shape=(samples_num, )
- querylist : np.array, shape=(samples_num, feature_dimension)
- """
- return label_list, query_docs_feature_vector_matrix
-```
-
-Corresponding to the PaddlePaddle input, **label** has type **dense_vector_sequence** and is the sequence of scores; **data** has type **dense_vector_sequence** and is the sequence of input feature vectors; **input_dim** is the one-dimensional feature vector dimension of a single document, matching the generator. The correspondence of the input data must be specified before training the model:
-
-
-```
-# Define the input data order
-feeding = {"label":0,
- "data" : 1}
-```
-
-
-## Printing custom evaluation metrics during training
-Here we take **RankNet** as an example of how to print custom evaluation metrics during training. The same method can also be used to obtain the value of any layer's output matrix during training.
-
-The RankNet network learns a scoring function that scores the left and right inputs; the greater the difference between the two scores, the stronger the scoring function's ability to separate positive from negative examples, and the better the model generalizes. Suppose we want the average absolute difference between the left and right scores during training. Computing this custom metric requires the output matrices, after each mini-batch, of the score layers (the layers named left_score and right_score in ranknet). We can do this in two steps:
-
-1. In the event_handler, handle the predefined paddle.event.EndIteration or paddle.event.EndPass events of PaddlePaddle.
-2. Call event.gm.getLayerOutputs with the name of the desired layer to obtain its value after the forward computation of a mini-batch.
-
-Here is the code example:
-
-```
-def score_diff(right_score, left_score):
- return np.average(np.abs(right_score - left_score))
-
-def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % 25 == 0:
- diff = score_diff(
- event.gm.getLayerOutputs("right_score")["right_score"][
- "value"],
- event.gm.getLayerOutputs("left_score")["left_score"][
- "value"])
- logger.info(("Pass %d Batch %d : Cost %.6f, "
- "average absolute diff scores: %.6f") %
- (event.pass_id, event.batch_id, event.cost, diff))
-```
-
-
-## Conclusion
-LTR is widely used in real life. Ranking models are generally built with Pointwise, Pairwise, or Listwise methods. Using the LETOR mq2007 data, this example explains RankNet, a classic Pairwise method, and LambdaRank, a Listwise method, shows how to use the PaddlePaddle framework to build the corresponding ranking models, and provides samples using custom data types. PaddlePaddle offers a flexible programming interface, and the same code can run LTR tasks on a single machine with a single GPU or distributed across multiple machines with multiple GPUs.
-## Notes
-1. As a demonstration of LTR, this example uses a small network. In applications, the network complexity should be adjusted and the network scale reset according to the actual situation.
-2. The feature vectors in the experimental data are joint query-document features. When using independent query and document features, [DSSM](https://github.com/PaddlePaddle/models/tree/develop/dssm) can be used to build the network.
-## References
-1. https://en.wikipedia.org/wiki/Learning_to_rank
-2. Liu T Y. [Learning to rank for information retrieval](http://ftp.nowpublishers.com/article/DownloadSummary/INR-016)[J]. Foundations and Trends® in Information Retrieval, 2009, 3(3): 225-331.
-3. Li H. [Learning to rank for information retrieval and natural language processing](http://www.morganclaypool.com/doi/abs/10.2200/S00607ED2V01Y201410HLT026)[J]. Synthesis Lectures on Human Language Technologies, 2014, 7(3): 1-121.
-4. Burges C, Shaked T, Renshaw E, et al. [Learning to rank using gradient descent](http://machinelearning.wustl.edu/mlpapers/paper_files/icml2005_BurgesSRLDHH05.pdf)[C]//Proceedings of the 22nd international conference on Machine learning. ACM, 2005: 89-96.
-5. Cao Z, Qin T, Liu T Y, et al. [Learning to rank: from pairwise approach to listwise approach](http://machinelearning.wustl.edu/mlpapers/paper_files/icml2007_CaoQLTL07.pdf)[C]//Proceedings of the 24th international conference on Machine learning. ACM, 2007: 129-136.
-6. Burges C J C, Ragno R, Le Q V. [Learning to rank with nonsmooth cost functions](https://papers.nips.cc/paper/2971-learning-to-rank-with-nonsmooth-cost-functions.pdf)[C]//NIPS. 2006, 6: 193-200.
diff --git a/legacy/ltr/images/LambdaRank_EN.png b/legacy/ltr/images/LambdaRank_EN.png
deleted file mode 100644
index d9fddec74320c801a79db57eccfb82cc780448c9..0000000000000000000000000000000000000000
Binary files a/legacy/ltr/images/LambdaRank_EN.png and /dev/null differ
diff --git a/legacy/ltr/images/lambdarank.jpg b/legacy/ltr/images/lambdarank.jpg
deleted file mode 100644
index 5a0b0da90eb077d63608c004ff346e7a49a467db..0000000000000000000000000000000000000000
Binary files a/legacy/ltr/images/lambdarank.jpg and /dev/null differ
diff --git a/legacy/ltr/images/learning_to_rank.jpg b/legacy/ltr/images/learning_to_rank.jpg
deleted file mode 100644
index 54083d1143dff90a3c8ff06a77d3ea354fa621ef..0000000000000000000000000000000000000000
Binary files a/legacy/ltr/images/learning_to_rank.jpg and /dev/null differ
diff --git a/legacy/ltr/images/ranknet.jpg b/legacy/ltr/images/ranknet.jpg
deleted file mode 100644
index 9767cec96cd2386218df261cf4d78e288fecbb3d..0000000000000000000000000000000000000000
Binary files a/legacy/ltr/images/ranknet.jpg and /dev/null differ
diff --git a/legacy/ltr/images/ranknet_en.png b/legacy/ltr/images/ranknet_en.png
deleted file mode 100644
index 6615a71d6025073c196972f367e81eb4867e1cb7..0000000000000000000000000000000000000000
Binary files a/legacy/ltr/images/ranknet_en.png and /dev/null differ
diff --git a/legacy/ltr/images/search_engine_example.png b/legacy/ltr/images/search_engine_example.png
deleted file mode 100644
index 36386085f45794f51d5cae79b3f9dd6ba004ded8..0000000000000000000000000000000000000000
Binary files a/legacy/ltr/images/search_engine_example.png and /dev/null differ
diff --git a/legacy/ltr/infer.py b/legacy/ltr/infer.py
deleted file mode 100644
index 3ec1842d72335b6f03d14dd56726024096ac6f7d..0000000000000000000000000000000000000000
--- a/legacy/ltr/infer.py
+++ /dev/null
@@ -1,115 +0,0 @@
-import os
-import gzip
-import functools
-import argparse
-import logging
-
-import paddle.v2 as paddle
-
-from ranknet import half_ranknet
-from lambda_rank import lambda_rank
-
-logger = logging.getLogger("paddle")
-
-
-def ranknet_infer(input_dim, model_path):
- """
- RankNet model inference interface.
- """
- # We only need half_ranknet to predict a rank score,
- # which can then be used to sort the documents.
- output = half_ranknet("right", input_dim)
- parameters = paddle.parameters.Parameters.from_tar(gzip.open(model_path))
-
- # Load the queries and their candidate documents;
- # RankNet ranks the candidates of each query.
- infer_query_id = []
- infer_data = []
- infer_doc_index = []
-
- # Convert to the mq2007 built-in plain_txt data format.
- plain_txt_test = functools.partial(
- paddle.dataset.mq2007.test, format="plain_txt")
-
- for query_id, relevance_score, feature_vector in plain_txt_test():
- infer_query_id.append(query_id)
- infer_data.append([feature_vector])
-
- # Predict a score for each document in infer_data; re-sorting the
- # documents by score in descending order gives the final ranking.
- scores = paddle.infer(
- output_layer=output, parameters=parameters, input=infer_data)
- for query_id, score in zip(infer_query_id, scores):
- print "query_id : ", query_id, " score : ", score
-
-
-def lambda_rank_infer(input_dim, model_path):
- """
- LambdaRank model inference interface.
- """
- output = lambda_rank(input_dim, is_infer=True)
- parameters = paddle.parameters.Parameters.from_tar(gzip.open(model_path))
-
- infer_query_id = None
- infer_data = []
- infer_data_num = 1
-
- fill_default_test = functools.partial(
- paddle.dataset.mq2007.test, format="listwise")
- for label, querylist in fill_default_test():
- infer_data.append([querylist])
- if len(infer_data) == infer_data_num:
- break
-
- # Predict a score for each document in infer_data; re-sorting the
- # documents by score in descending order gives the final ranking.
- predictions = paddle.infer(
- output_layer=output, parameters=parameters, input=infer_data)
- for i, score in enumerate(predictions):
- print i, score
-
-
-def parse_args():
- parser = argparse.ArgumentParser(
- description="PaddlePaddle learning to rank example.")
- parser.add_argument(
- "--model_type",
- type=str,
- help=("A flag indicating to run the RankNet or the LambdaRank model. "
- "Available options are: ranknet or lambdarank."),
- default="ranknet")
- parser.add_argument(
- "--use_gpu",
- type=bool,
- help="A flag indicating whether to use the GPU device in training.",
- default=False)
- parser.add_argument(
- "--trainer_count",
- type=int,
- help="The thread number used in training.",
- default=1)
- parser.add_argument(
- "--test_model_path",
- type=str,
- required=True,
- help=("The path of a trained model."))
- return parser.parse_args()
-
-
-if __name__ == "__main__":
- args = parse_args()
- assert os.path.exists(args.test_model_path), (
- "The trained model does not exist. Please set a correct path.")
-
- paddle.init(use_gpu=args.use_gpu, trainer_count=args.trainer_count)
-
- # Training dataset: mq2007, input_dim = 46, dense format.
- input_dim = 46
-
- if args.model_type == "ranknet":
- ranknet_infer(input_dim, args.test_model_path)
- elif args.model_type == "lambdarank":
- lambda_rank_infer(input_dim, args.test_model_path)
- else:
- logger.fatal(("A wrong value for parameter model type. "
- "Available options are: ranknet or lambdarank."))
diff --git a/legacy/ltr/lambda_rank.py b/legacy/ltr/lambda_rank.py
deleted file mode 100644
index cb96a2d752e11535ef07674f25964a6a0d4c0750..0000000000000000000000000000000000000000
--- a/legacy/ltr/lambda_rank.py
+++ /dev/null
@@ -1,44 +0,0 @@
-"""
-LambdaRank is a listwise rank model.
-https://papers.nips.cc/paper/2971-learning-to-rank-with-nonsmooth-cost-functions.pdf
-"""
-import paddle.v2 as paddle
-
-
-def lambda_rank(input_dim, is_infer=False):
- """
- The input data and label for LambdaRank must be sequences.
-
- parameters :
- input_dim, one document's dense feature vector dimension
-
- The format of the dense_vector_sequence is as follows:
- [[f, ...], [f, ...], ...], f is a float or an int number
- """
- data = paddle.layer.data("data",
- paddle.data_type.dense_vector_sequence(input_dim))
-
- # Define the hidden layer.
- hd1 = paddle.layer.fc(input=data,
- size=128,
- act=paddle.activation.Tanh(),
- param_attr=paddle.attr.Param(initial_std=0.01))
-
- hd2 = paddle.layer.fc(input=hd1,
- size=10,
- act=paddle.activation.Tanh(),
- param_attr=paddle.attr.Param(initial_std=0.01))
- output = paddle.layer.fc(input=hd2,
- size=1,
- act=paddle.activation.Linear(),
- param_attr=paddle.attr.Param(initial_std=0.01))
-
- if not is_infer:
- label = paddle.layer.data("label",
- paddle.data_type.dense_vector_sequence(1))
-
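- # NDCG_num=6: every query in the data must contain more documents than
- # NDCG_num, and queries whose documents all have zero relevance are
- # filtered out during training (see the README notes on the data format).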
- cost = paddle.layer.lambda_cost(
- input=output, score=label, NDCG_num=6, max_sort_size=-1)
- return cost
- else:
- return output
diff --git a/legacy/ltr/ranknet.py b/legacy/ltr/ranknet.py
deleted file mode 100644
index 8484f4353f395d1918dd9277ddaac29dfdad3e9a..0000000000000000000000000000000000000000
--- a/legacy/ltr/ranknet.py
+++ /dev/null
@@ -1,48 +0,0 @@
-"""
-ranknet is the classic pairwise learning to rank algorithm
-http://icml.cc/2015/wp-content/uploads/2015/06/icml_ranking.pdf
-"""
-import paddle.v2 as paddle
-
-
-def half_ranknet(name_prefix, input_dim):
- """
- parameter in same name will be shared in paddle framework,
- these parameters in ranknet can be used in shared state,
- e.g. left network and right network shared parameters in detail
- https://github.com/PaddlePaddle/Paddle/blob/develop/doc/design/api.md
- """
- # data layer
- data = paddle.layer.data(name_prefix + "_data",
- paddle.data_type.dense_vector(input_dim))
-
- # hidden layer
- hd1 = paddle.layer.fc(input=data,
- name=name_prefix + "_hidden",
- size=10,
- act=paddle.activation.Tanh(),
- param_attr=paddle.attr.Param(
- initial_std=0.01, name="hidden_w1"))
-
- # fully connected layer and output layer
- output = paddle.layer.fc(input=hd1,
- name=name_prefix + "_score",
- size=1,
- act=paddle.activation.Linear(),
- param_attr=paddle.attr.Param(
- initial_std=0.01, name="output"))
- return output
-
-
-def ranknet(input_dim):
- # label layer
- label = paddle.layer.data("label", paddle.data_type.dense_vector(1))
-
- # reuse the parameter in half_ranknet
- output_left = half_ranknet("left", input_dim)
- output_right = half_ranknet("right", input_dim)
-
- # rankcost layer
- cost = paddle.layer.rank_cost(
- name="cost", left=output_left, right=output_right, label=label)
- return cost
diff --git a/legacy/ltr/train.py b/legacy/ltr/train.py
deleted file mode 100644
index 2a4d16c7e04186ad26f1093d30f45e1c9a54c163..0000000000000000000000000000000000000000
--- a/legacy/ltr/train.py
+++ /dev/null
@@ -1,157 +0,0 @@
-import os
-import gzip
-import functools
-import argparse
-import logging
-import numpy as np
-
-import paddle.v2 as paddle
-
-from ranknet import ranknet
-from lambda_rank import lambda_rank
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def ranknet_train(input_dim, num_passes, model_save_dir):
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- paddle.dataset.mq2007.train, buf_size=100),
- batch_size=100)
- test_reader = paddle.batch(paddle.dataset.mq2007.test, batch_size=100)
-
- cost = ranknet(input_dim)
- parameters = paddle.parameters.create(cost)
-
- trainer = paddle.trainer.SGD(
- cost=cost,
- parameters=parameters,
- update_equation=paddle.optimizer.Adam(learning_rate=2e-4))
-
- feeding = {"label": 0, "left_data": 1, "right_data": 2}
-
- def score_diff(right_score, left_score):
- return np.average(np.abs(right_score - left_score))
-
- # Define end batch and end pass event handler
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % 25 == 0:
- diff = score_diff(
- event.gm.getLayerOutputs("left_score")["left_score"][
- "value"],
- event.gm.getLayerOutputs("right_score")["right_score"][
- "value"])
- logger.info(("Pass %d Batch %d : Cost %.6f, "
- "average absolute diff scores: %.6f") %
- (event.pass_id, event.batch_id, event.cost, diff))
-
- if isinstance(event, paddle.event.EndPass):
- result = trainer.test(reader=test_reader, feeding=feeding)
- logger.info("\nTest with Pass %d, %s" %
- (event.pass_id, result.metrics))
- with gzip.open(
- os.path.join(model_save_dir, "ranknet_params_%d.tar.gz" %
- (event.pass_id)), "w") as f:
- trainer.save_parameter_to_tar(f)
-
- trainer.train(
- reader=train_reader,
- event_handler=event_handler,
- feeding=feeding,
- num_passes=num_passes)
-
-
-def lambda_rank_train(input_dim, num_passes, model_save_dir):
- # The input for LambdaRank must be a sequence.
- fill_default_train = functools.partial(
- paddle.dataset.mq2007.train, format="listwise")
- fill_default_test = functools.partial(
- paddle.dataset.mq2007.test, format="listwise")
-
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- fill_default_train, buf_size=100), batch_size=32)
- test_reader = paddle.batch(fill_default_test, batch_size=32)
-
- cost = lambda_rank(input_dim)
- parameters = paddle.parameters.create(cost)
-
- trainer = paddle.trainer.SGD(
- cost=cost,
- parameters=parameters,
- update_equation=paddle.optimizer.Adam(learning_rate=1e-4))
-
- feeding = {"label": 0, "data": 1}
-
- # Define end batch and end pass event handler.
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- logger.info("Pass %d Batch %d Cost %.9f" %
- (event.pass_id, event.batch_id, event.cost))
- if isinstance(event, paddle.event.EndPass):
- result = trainer.test(reader=test_reader, feeding=feeding)
- logger.info("\nTest with Pass %d, %s" %
- (event.pass_id, result.metrics))
- with gzip.open(
- os.path.join(model_save_dir, "lambda_rank_params_%d.tar.gz"
- % (event.pass_id)), "w") as f:
- trainer.save_parameter_to_tar(f)
-
- trainer.train(
- reader=train_reader,
- event_handler=event_handler,
- feeding=feeding,
- num_passes=num_passes)
-
-
-def parse_args():
- parser = argparse.ArgumentParser(
- description="PaddlePaddle learning to rank example.")
- parser.add_argument(
- "--model_type",
- type=str,
- help=("A flag indicating to run the RankNet or the LambdaRank model. "
- "Available options are: ranknet or lambdarank."),
- default="ranknet")
- parser.add_argument(
- "--num_passes",
- type=int,
- help="The number of passes to train the model.",
- default=10)
- parser.add_argument(
- "--use_gpu",
- type=bool,
- help="A flag indicating whether to use the GPU device in training.",
- default=False)
- parser.add_argument(
- "--trainer_count",
- type=int,
- help="The thread number used in training.",
- default=1)
- parser.add_argument(
- "--model_save_dir",
- type=str,
- required=False,
- help=("The path to save the trained models."),
- default="models")
- return parser.parse_args()
-
-
-if __name__ == "__main__":
- args = parse_args()
- if not os.path.exists(args.model_save_dir): os.mkdir(args.model_save_dir)
-
- paddle.init(use_gpu=args.use_gpu, trainer_count=args.trainer_count)
-
- # Training dataset: mq2007, input_dim = 46, dense format.
- input_dim = 46
-
- if args.model_type == "ranknet":
- ranknet_train(input_dim, args.num_passes, args.model_save_dir)
- elif args.model_type == "lambdarank":
- lambda_rank_train(input_dim, args.num_passes, args.model_save_dir)
- else:
- logger.fatal(("A wrong value for parameter model type. "
- "Available options are: ranknet or lambdarank."))
diff --git a/legacy/mt_with_external_memory/README.md b/legacy/mt_with_external_memory/README.md
deleted file mode 100644
index 6643b4eb6c530c9fcaaf435ae999fc03eb628838..0000000000000000000000000000000000000000
--- a/legacy/mt_with_external_memory/README.md
+++ /dev/null
@@ -1,484 +0,0 @@
-The examples in this directory require PaddlePaddle v0.11.0. If your installed PaddlePaddle version is lower than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update your PaddlePaddle installation.
-
----
-
-# Neural Machine Translation with an External Memory Mechanism
-
-Neural Machine Translation (NMT) with an **external memory** mechanism is an important extension of the basic NMT model. It introduces a differentiable memory network as an additional memory unit, expanding the capacity and bandwidth of the working memory inside the translation model and assisting the temporary storage and retrieval of information in tasks such as translation, thereby improving model performance.
-
-Such models are not limited to translation: they apply broadly to other tasks that need a "large-capacity dynamic memory", such as machine reading comprehension / question answering, multi-turn dialogue, and long text generation. Moreover, since "memory" is a key component of cognition, it can be used to strengthen the performance of many other kinds of machine learning models.
-
-The external memory mechanism adopted here mainly refers to the **Neural Turing Machine** \[[1](#references)\] approach (described in detail below). Note that the Neural Turing Machine is only one attempt at simulating memory with neural networks. Memory mechanisms have been studied for a long time, and in the context of deep learning a series of valuable works have emerged in recent years, such as Memory Networks and Differentiable Neural Computers (DNC). This article discusses and implements only the Neural Turing Machine mechanism.
-
-The implementation mainly follows paper \[[2](#references)\] and assumes the reader has thoroughly read and understood the [Machine Translation](https://github.com/PaddlePaddle/book/tree/develop/08.machine_translation) chapter of the PaddlePaddle Book.
-
-
-## Model Overview
-
-### A Brief Introduction to Memory Mechanisms
-
-Memory is one of the key components of cognition. It gives cognition coherence over time and makes complex cognition (such as reasoning and planning, as opposed to static perception) possible. A flexible memory mechanism is one of the key capabilities machines need in order to imitate human intelligence.
-
-#### Static Memory
-
-Every machine learning model natively possesses some static memory capability, whether it is parametric (the parameters are the memory) or non-parametric (the samples are the memory), a traditional SVM (the support vectors are the memory) or a neural network (the connection weights are the memory). In almost all cases, however, this "memory" is **static**: once training finishes, the memory is frozen; at inference time the model is statically identical for every input and has no extra ability to memorize information across time steps.
-
-#### Dynamic Memory 1 --- Hidden State Vectors in RNNs
-
-When handling sequential cognitive problems (such as natural language processing or sequential decision making), processing at each time step depends on information from other time steps, so we usually need a persistent information pathway across time steps. Recurrent Neural Networks (RNNs) with a hidden state vector $h$ (or the cell state $c$ in LSTMs) possess exactly this kind of **dynamic memory**: at every time step the model can retrieve "memories" of past time steps from $h$ or $c$, and keep stacking new information onto them to update the memory. At inference time, different samples carry completely different sets of memories ($h$ or $c$), hence "dynamic".
-
-Admittedly, this intuitive account of the LSTM cell state $c$ is imprecise in several ways. From an optimization perspective, for example, $c$ (like the linear leaky structure in GRUs) is introduced so that the spectrum of the single-step Jacobian in the gradient computation stays closer to the identity matrix, which mitigates long-range gradient decay and eases optimization. Still, this does not prevent us from intuitively viewing it as an extra "linear pathway" that keeps the "memory channel" open: the LSTM cell state vector $c$ in Figure 1 (taken from [this post](http://colah.github.io/posts/2015-08-Understanding-LSTMs/)) can be regarded as such a "linear memory channel" for persisting information.
-
-
-
-Figure 1. The LSTM cell state vector as a "memory channel"
-
-
-#### Dynamic Memory 2 --- The Attention Mechanism in Seq2Seq
-
-However, a single vector $h$ or $c$ as described above has limited information bandwidth. In sequence-to-sequence models this bottleneck is most visible when information moves from the encoder to the decoder: encoding an entire variable-length source sentence into one fixed-length state vector risks substantial information loss.
-
-\[[3](#references)\] proposed the attention mechanism to overcome this difficulty. During decoding, the decoder no longer relies on a single sentence-level encoding vector from the encoder; instead it relies on the memory held in a group of vectors, one per source token (e.g. $h_t$). A set of learnable attention weights dynamically allocates attention and reads the information as a linear weighted sum, which is used to generate symbols at the different decoding time steps (see the [Machine Translation](https://github.com/PaddlePaddle/book/tree/develop/08.machine_translation) chapter of the PaddlePaddle Book). This distribution of attention strength can be viewed as content-based addressing (cf. the addressing description in the Neural Turing Machine \[[1](#references)\]): the read strength at each source position is determined by its content, acting as a "soft alignment" with the source sentence.
-
-Compared with the single state vector of the previous section, this "vector group" carries more, and more precise, information. It can be regarded as an unbounded external memory, effectively widening the memory bandwidth. "Unbounded" means the number of vectors in the group is not fixed: it varies with the number of tokens in the source sentence. The external memory is initialized with the per-token state vectors when encoding of the source sentence completes, and is then read throughout the decoding process.
-
-#### Dynamic Memory 3 --- Neural Turing Machines
-
-The Turing machine and the Von Neumann architecture are the prototypes of computer architecture: an arithmetic unit (e.g. algebraic computation), a controller (e.g. logical branching), and storage together form the core operating mechanism of modern computers. Neural Turing Machines \[[1](#references)\] try to use neural networks to simulate a differentiable Turing machine (i.e. learnable by gradient descent) in pursuit of more complex intelligence, whereas most ordinary machine learning models ignore explicit dynamic storage. The Neural Turing Machine is designed precisely to remedy this potential shortcoming.
-
-
-
-Figure 2. A cartoon of the Turing machine architecture
-
-
-The storage mechanism of a Turing machine is often pictured as read/write operations on a tape. A read head and a write head read information from or write it onto the tape; the movement of the tape and the actions and contents of the read/write heads are governed by a controller (see Figure 2, taken from [here](http://www.worldofcomputing.net/theory/turing-machine.html)); the tape length is usually finite.
-
-The Neural Turing Machine simulates the "tape" with a matrix $M \in \mathcal{R}^{n \times m}$, where $n$ is the number of memory vectors (also called memory slots) and $m$ is the length of each memory vector. A feedforward or recurrent neural network simulates the controller, which decides the distribution of read/write strength over the memory slots for each operation, i.e. the addressing:
-
- - Content-based addressing: the addressing strength depends on the contents of the memory slots and of the current read/write;
- - Location-based addressing: the addressing strength depends on the addressing strength of the previous operation (e.g. a shift);
- - Hybrid addressing: a mixture of the above (e.g. linear interpolation);
-
-(see paper \[[1](#references)\] for details). Based on the addressing result, the machine writes information into $M$ or reads it out for the rest of the network to use. See Figure 3, taken from \[[1](#references)\], for a schematic of the Neural Turing Machine.
-
-
-
-Figure 3. Schematic of the Neural Turing Machine architecture
-
-
-Compared with the attention mechanism of the previous section, the Neural Turing Machine has several similarities and differences. Similarities include:
-
-- Both use an external memory in the form of a matrix (or group of vectors).
-- Both use a differentiable addressing scheme.
-
-The differences are:
-
-- The Neural Turing Machine both reads and writes, making it a memory in the true sense; the attention mechanism initializes its memory contents once encoding completes (a simple cache rather than a differentiable write) and only reads, never writes, during decoding.
-- The Neural Turing Machine combines content-based with location-based addressing, which makes tasks that require "sequential addressing", such as sequence copying, easier; the attention mechanism uses only content-based addressing to realize soft alignment.
-- The Neural Turing Machine uses a bounded memory; the attention mechanism uses an unbounded one.
-
-#### Mixing the Three Memory Mechanisms to Strengthen the NMT Model
-
-Although the attention mechanism has become standard equipment in sequence-to-sequence models, its external memory only stores encoder information; inside the decoder, the information pathway still relies on the single RNN state vector $h$ or $c$. It is therefore natural to use the external memory mechanism of the Neural Turing Machine to supplement the decoder's single-vector information pathway.
-
-Accordingly, we mix the three dynamic memory mechanisms described above: the original RNN state vector and the attention mechanism are kept, and a bounded external memory based on a simplified Neural Turing Machine is added to supplement the decoder's single state vector. The overall model follows paper \[[2](#references)\]; the few implementation differences are listed in the [Other Discussion](#other-discussion) section.
-
-A question worth answering here: why not simply enlarge $h$ or $c$ to widen the information bandwidth?
-
-- First, enlarging $h$ or $c$ costs $O(n^2)$ in storage and computation (because of the state-to-state transition matrix), whereas the memory extension based on the Neural Turing Machine costs $O(n)$: its addressing operates on whole memory slots, and the controller's parameters depend only on $m$ (the memory slot size).
-- Second, the single-state-vector memory has one read/write strength, i.e. access is inherently **global**; the Neural Turing Machine mechanism is **local**, in that reads and writes essentially touch only some of the memory slots (the addressing distribution is sharp, so only a few slots receive truly large strength). This locality makes memory access cleaner, with less interference.
-
-
-
-
-### Model Architecture
-
-The overall network stacks a simplified Neural Turing Machine \[[1](#references)\] external memory module on top of the attention-based sequence-to-sequence architecture (i.e. RNNsearch \[[3](#references)\]).
-
-- The encoder uses the standard **bidirectional GRU** structure (not stacked), which we do not elaborate on here.
-- The decoder uses essentially the same structure as paper \[[2](#references)\]; see Figure 4 (adapted from paper \[[2](#references)\]).
-
-
-
-Figure 4. Schematic of the decoder enhanced with external memory
-
-
-The decoder diagram is explained as follows:
-
-1. $M_{t-1}^B$ and $M_t^B$ are the bounded external memory matrix at the previous and the current time step, respectively. $\textrm{read}^B$ and $\textrm{write}$ are the corresponding read and write heads (including their controllers). $r_t$ is the corresponding read-out vector.
-2. $M^S$ is the unbounded external memory matrix and $\textrm{read}^S$ its read head; together they implement the traditional attention mechanism. $c_t$ is the corresponding read-out vector.
-3. $y_{t-1}$ is the decoder's output symbol from the previous step, mapped through a word embedding and used as the current step's input; $y_t$ is the probability distribution over the decoder's output symbols at the current step.
-4. Everything inside the dashed box (except $M^S$) can be viewed as the bounded external memory module. Apart from this part, the network is essentially the same as RNNsearch \[[3](#references)\] (with the minor difference that the decoder state used for attention is refined: a hidden layer is stacked on it, and $y_{t-1}$ is fed in).
-
-
-## Implementation
-
-The algorithm is implemented in the following files:
-
-- `external_memory.py`: implements the simplified **Neural Turing Machine** in the `ExternalMemory` class, exposing initialization plus read and write functions.
-- `model.py`: model configuration functions, including the bidirectional GRU encoder (`bidirectional_gru_encoder`), the memory-enhanced decoder (`memory_enhanced_decoder`), and the memory-enhanced sequence-to-sequence model (`memory_enhanced_seq2seq`).
-- `data_utils.py`: data processing helpers.
-- `train.py`: model training.
-- `infer.py`: translation of a few sample sentences (model inference).
-
-### The `ExternalMemory` Class
-
-The `ExternalMemory` class implements a generic, simplified **Neural Turing Machine**. Compared with the full version, this class implements only content-based addressing (content addressing plus interpolation) and omits location-based addressing (convolutional shift and sharpening). Readers may extend it into a complete Neural Turing Machine on their own.
-
-The class is structured as follows:
-
-```python
-class ExternalMemory(object):
- """External neural memory class.
-
-    A simplified Neural Turing Machine (NTM) with only content-based
-    addressing (including content addressing and interpolation, but excluding
-    convolutional shift and sharpening). It serves as an external differentiable
-    memory bank, with differentiable write/read head controllers to store
- and read information dynamically as needed. Simple feedforward networks are
- used as the write/read head controllers.
-
- For more details, please refer to
- `Neural Turing Machines `_.
- """
-
- def __init__(self,
- name,
- mem_slot_size,
- boot_layer,
- initial_weight,
- readonly=False,
- enable_interpolation=True):
- """ Initialization.
-
- :param name: Memory name.
- :type name: basestring
- :param mem_slot_size: Size of memory slot/vector.
- :type mem_slot_size: int
- :param boot_layer: Boot layer for initializing the external memory. The
- sequence layer has sequence length indicating the number
- of memory slots, and size as memory slot size.
- :type boot_layer: LayerOutput
- :param initial_weight: Initializer for addressing weights.
- :type initial_weight: LayerOutput
- :param readonly: If true, the memory is read-only, and write function cannot
- be called. Default is false.
- :type readonly: bool
- :param enable_interpolation: If set true, the read/write addressing weights
- will be interpolated with the weights in the
- last step, with the affine coefficients being
- a learnable gate function.
- :type enable_interpolation: bool
- """
- pass
-
- def _content_addressing(self, key_vector):
- """Get write/read head's addressing weights via content-based addressing.
- """
- pass
-
- def _interpolation(self, head_name, key_vector, addressing_weight):
- """Interpolate between previous and current addressing weights.
- """
- pass
-
- def _get_addressing_weight(self, head_name, key_vector):
- """Get final addressing weights for read/write heads, including content
- addressing and interpolation.
- """
- pass
-
- def write(self, write_key):
- """Write onto the external memory.
- It cannot be called if "readonly" set True.
-
- :param write_key: Key vector for write heads to generate writing
- content and addressing signals.
- :type write_key: LayerOutput
- """
- pass
-
- def read(self, read_key):
- """Read from the external memory.
-
-        :param read_key: Key vector for read head to generate addressing
-                         signals.
-        :type read_key: LayerOutput
- :return: Content (vector) read from external memory.
- :rtype: LayerOutput
- """
- pass
-```
-
-The private methods are:
-
-- `_content_addressing`: computes the addressing strength for reads and writes via content-based addressing.
-- `_interpolation`: updates the current addressing strength via interpolated addressing (a linear combination of the current addressing strength and that of the previous time step).
-- `_get_addressing_weight`: invokes the two addressing operations above to obtain the final addressing strength for reading from and writing to the memory, as summarized in the formulas below.
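-
-In formulas (restating the comments in `external_memory.py`; the gate-weight name $W_g$ is our own notation), the addressing pipeline for a key vector $k$ is:
-
-$$a_i = v^T \textrm{tanh}(WM_i + Uk), \qquad \tilde{w}_t = \textrm{softmax}(a)$$
-
-$$g = \sigma(W_g k), \qquad w_t = g\tilde{w}_t + (1-g)w_{t-1}$$
-
-where $M_i$ is the $i$-th memory slot, $\tilde{w}_t$ is the content-based addressing weight at the current step, and the learnable scalar gate $g$ interpolates it with the previous step's final weight $w_{t-1}$.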
-
-
-The public interface consists of:
-
-- `__init__`: instance initialization.
-  - Argument `name`: name of the external memory unit; instances with the same name share the same external memory unit.
-  - Argument `mem_slot_size`: dimension of a single memory slot (vector).
-  - Argument `boot_layer`: layer used to initialize the memory slots. It must be a sequence type, whose sequence length determines the number of memory slots.
-  - Argument `initial_weight`: used to initialize the addressing strength.
-  - Argument `readonly`: whether to enable read-only mode (in read-only mode the instance can serve, for example, as an attention mechanism). When read-only mode is on, the `write` method cannot be called.
-  - Argument `enable_interpolation`: whether to enable interpolated addressing (it must be disabled when the instance is used as an attention mechanism, for example).
-- `write`: the write operation.
-  - Argument `write_key`: the output of some layer, whose information is used for the write head's addressing and for generating the content actually written.
-- `read`: the read operation.
-  - Argument `read_key`: the output of some layer, whose information is used for the read head's addressing.
-  - Returns: the information read, which can be fed directly into other layers.
-
-Some key implementation details:
-
-- The Neural Turing Machine's "external memory matrix" is implemented with `paddle.layer.memory`. The length of this sequence is the number of memory slots, and its `size` is the size of each memory slot (vector). The sequence relies on an external layer for initialization, and its number of memory slots is determined by the length of that layer's output sequence. The class can therefore implement not only bounded memory but also unbounded memory (where the number of memory slots varies).
-
- ```python
- self.external_memory = paddle.layer.memory(
- name=self.name,
- size=self.mem_slot_size,
- boot_layer=boot_layer)
- ```
-- The addressing logic of `ExternalMemory` is implemented in the two private methods `_content_addressing` and `_interpolation`. Reads and writes are implemented by the `read` and `write` functions, which include the addressing steps above. Read and write addressing are performed independently, unlike \[[2](#references)\] where the two share a single addressing strength; this keeps the class more general.
-- For simplicity, the controller is not factored into a dedicated module but is spread across the addressing and read/write functions. The controller consists mainly of the addressing operations and, during writes, the generation of the add/erase vectors: addressing lives in the private methods `_content_addressing` and `_interpolation`, and the add/erase vectors are generated in the `write` method. All of these simulate the controller with simple feedforward networks. Readers may try extracting the controller into its own module, or using a recurrent network as the controller.
-- `ExternalMemory` has a read-only mode, and interpolated addressing can be disabled. The main purpose is to let the class implement the traditional attention mechanism as a special case.
-
-- `ExternalMemory` can only be used together with `paddle.layer.recurrent_group`, specifically inside a user-defined `step` function (see the code for examples; a condensed sketch follows below). It cannot be used standalone.
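-
-  As an illustration, here is a condensed sketch of such a `step` function, abridged from `recurrent_decoder_step` in `model.py` (the attention read, the GRU update, and the output layer are omitted; `size`, `initial_state`, and the `*_init` layers are assumed to be defined in the enclosing scope):
-
-  ```python
-  def step(cur_embedding):
-      # decoder hidden state carried across time steps
-      state = paddle.layer.memory(
-          name="gru_decoder", size=size, boot_layer=initial_state)
-      # writable NTM-style memory with interpolated addressing
-      bounded_memory = ExternalMemory(
-          name="bounded_memory",
-          mem_slot_size=size,
-          boot_layer=bounded_memory_init,
-          initial_weight=bounded_memory_weight_init,
-          readonly=False,
-          enable_interpolation=True)
-      bounded_memory.write(state)  # write first, then read (see below)
-      memory_read = bounded_memory.read(state)
-      ...
-  ```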
-
-### `memory_enhanced_seq2seq` and Related Functions
-
-Three main functions are involved:
-
-```python
-def bidirectional_gru_encoder(input, size, word_vec_dim):
- """Bidirectional GRU encoder.
-
-    :param size: Hidden cell number in the encoder rnn.
-    :type size: int
-    :param word_vec_dim: Word embedding size.
-    :type word_vec_dim: int
- :return: Tuple of 1. concatenated forward and backward hidden sequence.
- 2. last state of backward rnn.
- :rtype: tuple of LayerOutput
- """
- pass
-
-
-def memory_enhanced_decoder(input, target, initial_state, source_context, size,
- word_vec_dim, dict_size, is_generating, beam_size):
- """GRU sequence decoder enhanced with external memory.
-
- The "external memory" refers to two types of memories.
- - Unbounded memory: i.e. attention mechanism in Seq2Seq.
- - Bounded memory: i.e. external memory in NTM.
- Both types of external memories can be implemented with
- ExternalMemory class, and are both exploited in this enhanced RNN decoder.
-
- The vanilla RNN/LSTM/GRU also has a narrow memory mechanism, namely the
- hidden state vector (or cell state in LSTM) carrying information through
- a span of sequence time, which is a successful design enriching the model
- with the capability to "remember" things in the long run. However, such a
- vector state is somewhat limited to a very narrow memory bandwidth. External
- memory introduced here could easily increase the memory capacity with linear
- complexity cost (rather than quadratic for vector state).
-
- This enhanced decoder expands its "memory passage" through two
- ExternalMemory objects:
- - Bounded memory for handling long-term information exchange within decoder
- itself. A direct expansion of traditional "vector" state.
- - Unbounded memory for handling source language's token-wise information.
- Exactly the attention mechanism over Seq2Seq.
-
- Notice that we take the attention mechanism as a particular form of external
- memory, with read-only memory bank initialized with encoder states, and a
- read head with content-based addressing (attention). From this view point,
- we arrive at a better understanding of attention mechanism itself and other
- external memory, and a concise and unified implementation for them.
-
- For more details about external memory, please refer to
- `Neural Turing Machines `_.
-
- For more details about this memory-enhanced decoder, please
- refer to `Memory-enhanced Decoder for Neural Machine Translation
- `_. This implementation is highly
- correlated to this paper, but with minor differences (e.g. put "write"
- before "read" to bypass a potential bug in V2 APIs. See
- (`issue `_).
- """
- pass
-
-
-
-def memory_enhanced_seq2seq(encoder_input, decoder_input, decoder_target,
- hidden_size, word_vec_dim, dict_size, is_generating,
- beam_size):
- """Seq2Seq Model enhanced with external memory.
-
- The "external memory" refers to two types of memories.
- - Unbounded memory: i.e. attention mechanism in Seq2Seq.
- - Bounded memory: i.e. external memory in NTM.
- Both types of external memories can be implemented with
- ExternalMemory class, and are both exploited in this Seq2Seq model.
-
-    :param encoder_input: Encoder input.
-    :type encoder_input: LayerOutput
-    :param decoder_input: Decoder input.
-    :type decoder_input: LayerOutput
-    :param decoder_target: Decoder target.
-    :type decoder_target: LayerOutput
-    :param hidden_size: Hidden cell number, both in encoder and decoder rnn.
-    :type hidden_size: int
-    :param word_vec_dim: Word embedding size.
-    :type word_vec_dim: int
-    :param dict_size: Vocabulary size.
-    :type dict_size: int
-    :param is_generating: Whether for beam search inference (True) or
-                          for training (False).
-    :type is_generating: bool
-    :param beam_size: Beam search width.
-    :type beam_size: int
- :return: Cost layer if is_generating=False; Beam search layer if
- is_generating = True.
- :rtype: LayerOutput
- """
- pass
-```
-
-- `bidirectional_gru_encoder` implements a bidirectional single-layer GRU (Gated Recurrent Unit) encoder. It returns two outputs: a sequence of token-level encoding vectors (forward and backward concatenated), and a sentence-level encoding vector for the whole source sentence (backward only). The former initializes the memory matrix used by the decoder's attention mechanism; the latter initializes the decoder's state vector.
-
-- `memory_enhanced_decoder` implements the GRU decoder enhanced with external memory. It uses the same `ExternalMemory` class to realize both external memory modules:
-
-  - Unbounded external memory, i.e. the traditional attention mechanism. It uses `ExternalMemory` with read-only mode on and interpolated addressing off, and initializes the memory matrix (`boot_layer`) with the encoder's first output (the token-level encoding sequence). The number of memory slots is therefore dynamic, determined by the number of source tokens.
-
- ```python
- unbounded_memory = ExternalMemory(
- name="unbounded_memory",
- mem_slot_size=size * 2,
- boot_layer=unbounded_memory_init,
- initial_weight=unbounded_memory_weight_init,
- readonly=True,
- enable_interpolation=False)
- ```
-  - Bounded external memory. It uses `ExternalMemory` with read-only mode off and interpolated addressing on. The encoder's first output is average-pooled, expanded to the specified sequence length, and perturbed with random noise (kept identical between training and inference) to initialize the memory matrix (`boot_layer`). The number of memory slots is therefore fixed. In code:
-
- ```python
- bounded_memory = ExternalMemory(
- name="bounded_memory",
- mem_slot_size=size,
- boot_layer=bounded_memory_init,
- initial_weight=bounded_memory_weight_init,
- readonly=False,
- enable_interpolation=True)
- ```
-
-  Note that in our implementation, the attention mechanism (the unbounded external memory) and the Neural Turing Machine (the bounded external memory) are realized by the same `ExternalMemory` class: the former is **read-only**, the latter **readable and writable**. This is done purely to unify our understanding of "attention" and "memory" and to provide a concise, unified implementation. The attention mechanism could equally be implemented with `paddle.networks.simple_attention`, as sketched below.
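-
-  For comparison, a minimal attention read with that helper might look roughly like this (argument names follow the machine translation chapter of the PaddlePaddle Book; `encoded_sequence`, `encoded_proj`, and `decoder_state` are assumed to be layers defined in the enclosing step function):
-
-  ```python
-  context = paddle.networks.simple_attention(
-      encoded_sequence=encoded_sequence,  # token-level encoder outputs
-      encoded_proj=encoded_proj,          # their learned projection
-      decoder_state=decoder_state)        # current decoder hidden state
-  ```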
-
-- `memory_enhanced_seq2seq` defines the whole sequence-to-sequence model with external memory and is the top-level model configuration function. It first encodes the source sentence with `bidirectional_gru_encoder`, then decodes with `memory_enhanced_decoder`.
-
-
-
-Additionally, this implementation moves the `write` operation of `ExternalMemory` ahead of `read` to work around a limitation in topology construction; see this [Issue](https://github.com/PaddlePaddle/Paddle/issues/2061). The two orderings are essentially equivalent.
-
-## Quick Start
-
-### Custom Data
-
-Data enters the training procedure through a parameterless `reader()` iterator function, so we need to construct one `reader()` iterator for the training data and one for the test data. A `reader()` uses `yield` to act as an iterator (so it can be consumed via `for instance in reader()`), for example:
-
-```python
-def reader():
- for instance in data_list:
- yield instance
-```
-
-Each sample yielded must be a triple containing: the encoder input token list (the source sequence, converted to IDs), the decoder input token list (the target sequence, converted to IDs and shifted right by one position), and the decoder output token list (the target sequence, converted to IDs).
-
-Users must tokenize the text themselves and build dictionaries to map tokens to IDs; a minimal sketch follows below.
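-
-As a toy illustration, a reader producing such triples from pre-tokenized parallel data might look as follows (the `src_dict`/`trg_dict` lookups and the `<s>`/`<e>` markers are our assumptions, modeled on `paddle.dataset.wmt14`):
-
-```python
-def make_reader(pairs, src_dict, trg_dict):
-    """pairs: a list of (source_tokens, target_tokens) tuples."""
-    def reader():
-        for src_tokens, trg_tokens in pairs:
-            src_ids = [src_dict[w] for w in src_tokens]
-            trg_ids = [trg_dict[w] for w in trg_tokens]
-            # decoder input is <s> + target; decoder output is target + <e>
-            yield (src_ids,
-                   [trg_dict['<s>']] + trg_ids,
-                   trg_ids + [trg_dict['<e>']])
-    return reader
-```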
-
-The PaddlePaddle interface [paddle.paddle.wmt14](https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/v2/dataset/wmt14.py) provides, by default, a preprocessed, smaller [subset of the WMT-14 English-French dataset](http://paddlepaddle.bj.bcebos.com/demo/wmt_shrinked_data/wmt14.tgz) (193319 training sentences, 6003 test sentences, dictionary size 30000), together with two reader creator functions:
-
-```python
-paddle.dataset.wmt14.train(dict_size)
-paddle.dataset.wmt14.test(dict_size)
-```
-
-Calling these two functions returns the corresponding `reader()` functions for `paddle.trainer.SGD.train` to consume. To use other data, refer to [paddle.paddle.wmt14](https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/v2/dataset/wmt14.py) to build a corresponding data creator, and substitute `paddle.dataset.wmt14.train` and `paddle.dataset.wmt14.test` with the corresponding function names.
-
-### Training
-
-Run from the command line:
-
-```bash
-python train.py
-```
-or customize some of the arguments, e.g.:
-
-```bash
-python train.py \
---dict_size 30000 \
---word_vec_dim 512 \
---hidden_size 1024 \
---memory_slot_num 8 \
---use_gpu False \
---trainer_count 1 \
---num_passes 100 \
---batch_size 128 \
---memory_perturb_stddev 0.1
-```
-
-This runs the training script; model checkpoints are saved periodically under the local `./checkpoints` directory. The meaning of each argument can be shown with
-
-```bash
-python train.py --help
-```
-
-
-### Decoding
-
-Run from the command line:
-
-```bash
-python infer.py
-```
-or customize some of the arguments, e.g.:
-
-```bash
-python infer.py \
---dict_size 30000 \
---word_vec_dim 512 \
---hidden_size 1024 \
---memory_slot_num 8 \
---use_gpu False \
---trainer_count 1 \
---memory_perturb_stddev 0.1 \
---infer_data_num 10 \
---model_filepath checkpoints/params.latest.tar.gz \
---beam_size 3
-```
-
-This runs the decoding script and produces sample translation results. The meaning of each argument can be shown with:
-
-```bash
-python infer.py --help
-```
-
-
-## Other Discussion
-
-#### Differences from the Implementation in Paper \[[2](#references)\]
-
-The differences are as follows:
-
-1. The content-based addressing formula differs: the paper uses $a = v^T(WM^B + Us)$, while this example uses $a = v^T \textrm{tanh}(WM^B + Us)$, to stay consistent with the attention addressing in \[[3](#references)\].
-2. The initialization of the bounded external memory differs: the paper uses $M^B = \sigma(W\sum_{i=0}^{i=n}h_i)/n + V$ with $V_{i,j} \sim \mathcal{N}(0, 0.1)$, while this example uses $M^B = \sigma(\frac{1}{n}W\sum_{i=0}^{i=n}h_i) + V$.
-3. The read and write addressing logic of the external memory differs: in the paper the two share one addressing strength, which amounts to a weight-tying regularizer. This example does not apply that regularizer; reads and writes are addressed independently.
-4. The order of reads and writes within a time step differs: the paper reads first and then writes, while this example writes first and then reads; the two are essentially equivalent.
-
-## References
-
-1. Alex Graves, Greg Wayne, Ivo Danihelka, [Neural Turing Machines](https://arxiv.org/abs/1410.5401). arXiv preprint arXiv:1410.5401, 2014.
-2. Mingxuan Wang, Zhengdong Lu, Hang Li, Qun Liu, [Memory-enhanced Decoder for Neural Machine Translation](https://arxiv.org/abs/1606.02003). In Proceedings of EMNLP, 2016, pages 278–286.
-3. Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio, [Neural Machine Translation by Jointly Learning to Align and Translate](https://arxiv.org/abs/1409.0473). arXiv preprint arXiv:1409.0473, 2014.
diff --git a/legacy/mt_with_external_memory/data_utils.py b/legacy/mt_with_external_memory/data_utils.py
deleted file mode 100644
index 6cc481fef60dd80c97a8649e2930db7f41b0b3b0..0000000000000000000000000000000000000000
--- a/legacy/mt_with_external_memory/data_utils.py
+++ /dev/null
@@ -1,15 +0,0 @@
-"""
- Contains data utilities.
-"""
-
-
-def reader_append_wrapper(reader, append_tuple):
- """
-    Data reader wrapper for appending extra data to an existing reader.
- """
-
- def new_reader():
- for ins in reader():
- yield ins + append_tuple
-
- return new_reader
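-
-
-# Usage sketch (illustration only, not part of the original module): both
-# train.py and infer.py wrap the wmt14 readers with this helper to append
-# a fixed memory perturbation tensor to every (src, trg, trg_next) sample:
-#
-#   reader = reader_append_wrapper(
-#       reader=paddle.dataset.wmt14.train(dict_size),
-#       append_tuple=(bounded_memory_perturbation, ))
-#
-# Each yielded instance is then (src, trg, trg_next, perturbation).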
diff --git a/legacy/mt_with_external_memory/external_memory.py b/legacy/mt_with_external_memory/external_memory.py
deleted file mode 100644
index 5f26e8d7b6ba4e649dee4f468ad5e16a2ad4973b..0000000000000000000000000000000000000000
--- a/legacy/mt_with_external_memory/external_memory.py
+++ /dev/null
@@ -1,186 +0,0 @@
-"""
- External neural memory class.
-"""
-import paddle.v2 as paddle
-
-
-class ExternalMemory(object):
- """External neural memory class.
-
-    A simplified Neural Turing Machine (NTM) with only content-based
-    addressing (including content addressing and interpolation, but excluding
-    convolutional shift and sharpening). It serves as an external differentiable
-    memory bank, with differentiable write/read head controllers to store
- and read information dynamically. Simple feedforward networks are
- used as the write/read head controllers.
-
- The ExternalMemory class could be utilized by many neural network structures
- to easily expand their memory bandwidth and accomplish a long-term memory
- handling. Besides, some existing mechanism can be realized directly with
- the ExternalMemory class, e.g. the attention mechanism in Seq2Seq (i.e. an
- unbounded external memory).
-
- Besides, the ExternalMemory class must be used together with
- paddle.layer.recurrent_group (within its step function). It can never be
- used in a standalone manner.
-
- For more details, please refer to
- `Neural Turing Machines `_.
-
- :param name: Memory name.
- :type name: basestring
- :param mem_slot_size: Size of memory slot/vector.
- :type mem_slot_size: int
- :param boot_layer: Boot layer for initializing the external memory. The
- sequence layer has sequence length indicating the number
- of memory slots, and size as memory slot size.
- :type boot_layer: LayerOutput
- :param initial_weight: Initializer for addressing weights.
- :type initial_weight: LayerOutput
- :param readonly: If true, the memory is read-only, and write function cannot
- be called. Default is false.
- :type readonly: bool
- :param enable_interpolation: If set true, the read/write addressing weights
- will be interpolated with the weights in the
- last step, with the affine coefficients being
- a learnable gate function.
- :type enable_interpolation: bool
- """
-
- def __init__(self,
- name,
- mem_slot_size,
- boot_layer,
- initial_weight,
- readonly=False,
- enable_interpolation=True):
- self.name = name
- self.mem_slot_size = mem_slot_size
- self.readonly = readonly
- self.enable_interpolation = enable_interpolation
- self.external_memory = paddle.layer.memory(
- name=self.name, size=self.mem_slot_size, boot_layer=boot_layer)
- self.initial_weight = initial_weight
- # set memory to constant when readonly=True
- if self.readonly:
- self.updated_external_memory = paddle.layer.mixed(
- name=self.name,
- input=[
- paddle.layer.identity_projection(input=self.external_memory)
- ],
- size=self.mem_slot_size)
-
- def _content_addressing(self, key_vector):
- """Get write/read head's addressing weights via content-based addressing.
- """
- # content-based addressing: a=tanh(W*M + U*key)
- key_projection = paddle.layer.fc(input=key_vector,
- size=self.mem_slot_size,
- act=paddle.activation.Linear(),
- bias_attr=False)
- key_proj_expanded = paddle.layer.expand(
- input=key_projection, expand_as=self.external_memory)
- memory_projection = paddle.layer.fc(input=self.external_memory,
- size=self.mem_slot_size,
- act=paddle.activation.Linear(),
- bias_attr=False)
- merged_projection = paddle.layer.addto(
- input=[key_proj_expanded, memory_projection],
- act=paddle.activation.Tanh())
- # softmax addressing weight: w=softmax(v^T a)
- addressing_weight = paddle.layer.fc(
- input=merged_projection,
- size=1,
- act=paddle.activation.SequenceSoftmax(),
- bias_attr=False)
- return addressing_weight
-
- def _interpolation(self, head_name, key_vector, addressing_weight):
- """Interpolate between previous and current addressing weights.
- """
- # prepare interpolation scalar gate: g=sigmoid(W*key)
- gate = paddle.layer.fc(input=key_vector,
- size=1,
- act=paddle.activation.Sigmoid(),
- bias_attr=False)
- # interpolation: w_t = g*w_t+(1-g)*w_{t-1}
- last_addressing_weight = paddle.layer.memory(
- name=self.name + "_addressing_weight_" + head_name,
- size=1,
- boot_layer=self.initial_weight)
- interpolated_weight = paddle.layer.interpolation(
- name=self.name + "_addressing_weight_" + head_name,
- input=[last_addressing_weight, addressing_weight],
- weight=paddle.layer.expand(
- input=gate, expand_as=addressing_weight))
- return interpolated_weight
-
- def _get_addressing_weight(self, head_name, key_vector):
- """Get final addressing weights for read/write heads, including content
- addressing and interpolation.
- """
- # current content-based addressing
- addressing_weight = self._content_addressing(key_vector)
-        # interpolation with previous addressing weight
- if self.enable_interpolation:
- return self._interpolation(head_name, key_vector, addressing_weight)
- else:
- return addressing_weight
-
- def write(self, write_key):
- """Write onto the external memory.
- It cannot be called if "readonly" set True.
-
- :param write_key: Key vector for write heads to generate writing
- content and addressing signals.
- :type write_key: LayerOutput
- """
- # check readonly
- if self.readonly:
- raise ValueError("ExternalMemory with readonly=True cannot write.")
- # get addressing weight for write head
- write_weight = self._get_addressing_weight("write_head", write_key)
- # prepare add_vector and erase_vector
- erase_vector = paddle.layer.fc(input=write_key,
- size=self.mem_slot_size,
- act=paddle.activation.Sigmoid(),
- bias_attr=False)
- add_vector = paddle.layer.fc(input=write_key,
- size=self.mem_slot_size,
- act=paddle.activation.Sigmoid(),
- bias_attr=False)
- erase_vector_expand = paddle.layer.expand(
- input=erase_vector, expand_as=self.external_memory)
- add_vector_expand = paddle.layer.expand(
- input=add_vector, expand_as=self.external_memory)
- # prepare scaled add part and erase part
- scaled_erase_vector_expand = paddle.layer.scaling(
- weight=write_weight, input=erase_vector_expand)
- erase_memory_part = paddle.layer.mixed(
- input=paddle.layer.dotmul_operator(
- a=self.external_memory,
- b=scaled_erase_vector_expand,
- scale=-1.0))
- add_memory_part = paddle.layer.scaling(
- weight=write_weight, input=add_vector_expand)
- # update external memory
- self.updated_external_memory = paddle.layer.addto(
- input=[self.external_memory, add_memory_part, erase_memory_part],
- name=self.name)
-
- def read(self, read_key):
- """Read from the external memory.
-
-        :param read_key: Key vector for read head to generate addressing
-                         signals.
-        :type read_key: LayerOutput
- :return: Content (vector) read from external memory.
- :rtype: LayerOutput
- """
-        # get addressing weight for read head
- read_weight = self._get_addressing_weight("read_head", read_key)
- # read content from external memory
- scaled = paddle.layer.scaling(
- weight=read_weight, input=self.updated_external_memory)
- return paddle.layer.pooling(
- input=scaled, pooling_type=paddle.pooling.Sum())
diff --git a/legacy/mt_with_external_memory/image/lstm_c_state.png b/legacy/mt_with_external_memory/image/lstm_c_state.png
deleted file mode 100644
index ce791573f359a2e3a346eb920b1edec6983cc184..0000000000000000000000000000000000000000
Binary files a/legacy/mt_with_external_memory/image/lstm_c_state.png and /dev/null differ
diff --git a/legacy/mt_with_external_memory/image/memory_enhanced_decoder.png b/legacy/mt_with_external_memory/image/memory_enhanced_decoder.png
deleted file mode 100644
index da6219be5e070c7985247fe41bfe16783e6bff07..0000000000000000000000000000000000000000
Binary files a/legacy/mt_with_external_memory/image/memory_enhanced_decoder.png and /dev/null differ
diff --git a/legacy/mt_with_external_memory/image/neural_turing_machine_arch.png b/legacy/mt_with_external_memory/image/neural_turing_machine_arch.png
deleted file mode 100644
index 86ae8d5dcdc5f05ab4c2e23ddd699cc83346a7ea..0000000000000000000000000000000000000000
Binary files a/legacy/mt_with_external_memory/image/neural_turing_machine_arch.png and /dev/null differ
diff --git a/legacy/mt_with_external_memory/image/turing_machine_cartoon.gif b/legacy/mt_with_external_memory/image/turing_machine_cartoon.gif
deleted file mode 100644
index e0425ffd3a04c3fcf7e46899fad28c801c448067..0000000000000000000000000000000000000000
Binary files a/legacy/mt_with_external_memory/image/turing_machine_cartoon.gif and /dev/null differ
diff --git a/legacy/mt_with_external_memory/infer.py b/legacy/mt_with_external_memory/infer.py
deleted file mode 100644
index 55ab09171cbe7b63db35a37bd2ec036869a46afe..0000000000000000000000000000000000000000
--- a/legacy/mt_with_external_memory/infer.py
+++ /dev/null
@@ -1,159 +0,0 @@
-"""
-    Contains the inference script for machine translation with external memory.
-"""
-import distutils.util
-import argparse
-import gzip
-import random
-
-import paddle.v2 as paddle
-from external_memory import ExternalMemory
-from model import memory_enhanced_seq2seq
-from data_utils import reader_append_wrapper
-
-parser = argparse.ArgumentParser(description=__doc__)
-parser.add_argument(
- "--dict_size",
- default=30000,
- type=int,
- help="Vocabulary size. (default: %(default)s)")
-parser.add_argument(
- "--word_vec_dim",
- default=512,
- type=int,
- help="Word embedding size. (default: %(default)s)")
-parser.add_argument(
- "--hidden_size",
- default=1024,
- type=int,
- help="Hidden cell number in RNN. (default: %(default)s)")
-parser.add_argument(
- "--memory_slot_num",
- default=8,
- type=int,
- help="External memory slot number. (default: %(default)s)")
-parser.add_argument(
- "--beam_size",
- default=3,
- type=int,
- help="Beam search width. (default: %(default)s)")
-parser.add_argument(
- "--use_gpu",
- default=False,
- type=distutils.util.strtobool,
- help="Use gpu or not. (default: %(default)s)")
-parser.add_argument(
- "--trainer_count",
- default=1,
- type=int,
- help="Trainer number. (default: %(default)s)")
-parser.add_argument(
- "--batch_size",
- default=5,
- type=int,
- help="Batch size. (default: %(default)s)")
-parser.add_argument(
- "--infer_data_num",
- default=3,
- type=int,
- help="Instance num to infer. (default: %(default)s)")
-parser.add_argument(
- "--model_filepath",
- default="checkpoints/params.latest.tar.gz",
- type=str,
- help="Model filepath. (default: %(default)s)")
-parser.add_argument(
- "--memory_perturb_stddev",
- default=0.1,
- type=float,
- help="Memory perturb stddev for memory initialization."
- "(default: %(default)s)")
-args = parser.parse_args()
-
-
-def parse_beam_search_result(beam_result, dictionary):
- """
- Beam search result parser.
- """
- sentence_list = []
- sentence = []
- for word in beam_result[1]:
- if word != -1:
- sentence.append(word)
- else:
- sentence_list.append(' '.join(
- [dictionary.get(word) for word in sentence[1:]]))
- sentence = []
- beam_probs = beam_result[0]
- beam_size = len(beam_probs[0])
- beam_sentences = [
- sentence_list[i:i + beam_size]
- for i in range(0, len(sentence_list), beam_size)
- ]
- return beam_probs, beam_sentences
-
-
-def infer():
- """
-    For inference.
- """
- # create network config
- source_words = paddle.layer.data(
- name="source_words",
- type=paddle.data_type.integer_value_sequence(args.dict_size))
- beam_gen = memory_enhanced_seq2seq(
- encoder_input=source_words,
- decoder_input=None,
- decoder_target=None,
- hidden_size=args.hidden_size,
- word_vec_dim=args.word_vec_dim,
- dict_size=args.dict_size,
- is_generating=True,
- beam_size=args.beam_size)
-
- # load parameters
- parameters = paddle.parameters.Parameters.from_tar(
- gzip.open(args.model_filepath))
-
- # prepare infer data
- infer_data = []
-    random.seed(0)  # fix the seed to keep the perturbation consistent across runs
- bounded_memory_perturbation = [[
- random.gauss(0, args.memory_perturb_stddev)
- for i in xrange(args.hidden_size)
- ] for j in xrange(args.memory_slot_num)]
- test_append_reader = reader_append_wrapper(
- reader=paddle.dataset.wmt14.test(args.dict_size),
- append_tuple=(bounded_memory_perturbation, ))
- for i, item in enumerate(test_append_reader()):
- if i < args.infer_data_num:
- infer_data.append((
- item[0],
- item[3], ))
-
- # run inference
- beam_result = paddle.infer(
- output_layer=beam_gen,
- parameters=parameters,
- input=infer_data,
- field=['prob', 'id'])
-
- # parse beam result and print
- source_dict, target_dict = paddle.dataset.wmt14.get_dict(args.dict_size)
- beam_probs, beam_sentences = parse_beam_search_result(beam_result,
- target_dict)
- for i in xrange(args.infer_data_num):
- print "\n***************************************************\n"
- print "src:", ' '.join(
- [source_dict.get(word) for word in infer_data[i][0]]), "\n"
- for j in xrange(args.beam_size):
- print "prob = %f : %s" % (beam_probs[i][j], beam_sentences[i][j])
-
-
-def main():
- paddle.init(use_gpu=args.use_gpu, trainer_count=args.trainer_count)
- infer()
-
-
-if __name__ == '__main__':
- main()
diff --git a/legacy/mt_with_external_memory/model.py b/legacy/mt_with_external_memory/model.py
deleted file mode 100644
index 7342ce145c0a1ef6a3cfffadf726d0fc1d78cc59..0000000000000000000000000000000000000000
--- a/legacy/mt_with_external_memory/model.py
+++ /dev/null
@@ -1,264 +0,0 @@
-"""
- Contains model configuration for external-memory-enhanced seq2seq.
-
- The "external memory" refers to two types of memories.
- - Unbounded memory: i.e. vanilla attention mechanism in Seq2Seq.
- - Bounded memory: i.e. external memory in NTM.
- Both types of external memories are exploited to enhance the vanilla
- Seq2Seq neural machine translation.
-
- The implementation primarily follows the paper
- `Memory-enhanced Decoder for Neural Machine Translation
- `_,
- with some minor differences (will be listed in README.md).
-
- For details about "external memory", please also refer to
- `Neural Turing Machines `_.
-"""
-import paddle.v2 as paddle
-from external_memory import ExternalMemory
-
-
-def bidirectional_gru_encoder(input, size, word_vec_dim):
- """Bidirectional GRU encoder.
-
-    :param size: Hidden cell number in the encoder rnn.
-    :type size: int
-    :param word_vec_dim: Word embedding size.
-    :type word_vec_dim: int
- :return: Tuple of 1. concatenated forward and backward hidden sequence.
- 2. last state of backward rnn.
- :rtype: tuple of LayerOutput
- """
- # token embedding
- embeddings = paddle.layer.embedding(input=input, size=word_vec_dim)
-    # token-level forward and backward encoding for attentions
- forward = paddle.networks.simple_gru(
- input=embeddings, size=size, reverse=False)
- backward = paddle.networks.simple_gru(
- input=embeddings, size=size, reverse=True)
- forward_backward = paddle.layer.concat(input=[forward, backward])
- # sequence-level encoding
- backward_first = paddle.layer.first_seq(input=backward)
- return forward_backward, backward_first
-
-
-def memory_enhanced_decoder(input, target, initial_state, source_context, size,
- word_vec_dim, dict_size, is_generating, beam_size):
- """GRU sequence decoder enhanced with external memory.
-
- The "external memory" refers to two types of memories.
- - Unbounded memory: i.e. attention mechanism in Seq2Seq.
- - Bounded memory: i.e. external memory in NTM.
- Both types of external memories can be implemented with
- ExternalMemory class, and are both exploited in this enhanced RNN decoder.
-
- The vanilla RNN/LSTM/GRU also has a narrow memory mechanism, namely the
- hidden state vector (or cell state in LSTM) carrying information through
- a span of sequence time, which is a successful design enriching the model
- with the capability to "remember" things in the long run. However, such a
- vector state is somewhat limited to a very narrow memory bandwidth. External
- memory introduced here could easily increase the memory capacity with linear
- complexity cost (rather than quadratic for vector state).
-
- This enhanced decoder expands its "memory passage" through two
- ExternalMemory objects:
- - Bounded memory for handling long-term information exchange within decoder
- itself. A direct expansion of traditional "vector" state.
- - Unbounded memory for handling source language's token-wise information.
- Exactly the attention mechanism over Seq2Seq.
-
- Notice that we take the attention mechanism as a particular form of external
- memory, with read-only memory bank initialized with encoder states, and a
- read head with content-based addressing (attention). From this view point,
- we arrive at a better understanding of attention mechanism itself and other
- external memory, and a concise and unified implementation for them.
-
- For more details about external memory, please refer to
- `Neural Turing Machines `_.
-
- For more details about this memory-enhanced decoder, please
- refer to `Memory-enhanced Decoder for Neural Machine Translation
- `_. This implementation is highly
- correlated to this paper, but with minor differences (e.g. put "write"
- before "read" to bypass a potential bug in V2 APIs. See
- (`issue `_).
-
-    :param input: Decoder input.
-    :type input: LayerOutput
-    :param target: Decoder target.
-    :type target: LayerOutput
-    :param initial_state: Initial hidden state.
-    :type initial_state: LayerOutput
-    :param source_context: Group of context hidden states for each token in the
-                           source sentence, for the attention mechanism.
-    :type source_context: LayerOutput
-    :param size: Hidden cell number in decoder rnn.
-    :type size: int
-    :param word_vec_dim: Word embedding size.
-    :type word_vec_dim: int
-    :param dict_size: Vocabulary size.
-    :type dict_size: int
-    :param is_generating: Whether for beam search inference (True) or
-                          for training (False).
-    :type is_generating: bool
-    :param beam_size: Beam search width.
-    :type beam_size: int
- :return: Cost layer if is_generating=False; Beam search layer if
- is_generating = True.
- :rtype: LayerOutput
- """
- # prepare initial bounded and unbounded memory
- bounded_memory_slot_init = paddle.layer.fc(input=paddle.layer.pooling(
- input=source_context, pooling_type=paddle.pooling.Avg()),
- size=size,
- act=paddle.activation.Sigmoid())
- bounded_memory_perturbation = paddle.layer.data(
- name='bounded_memory_perturbation',
- type=paddle.data_type.dense_vector_sequence(size))
- bounded_memory_init = paddle.layer.addto(
- input=[
- paddle.layer.expand(
- input=bounded_memory_slot_init,
- expand_as=bounded_memory_perturbation),
- bounded_memory_perturbation
- ],
- act=paddle.activation.Linear())
- bounded_memory_weight_init = paddle.layer.slope_intercept(
- input=paddle.layer.fc(input=bounded_memory_init, size=1),
- slope=0.0,
- intercept=0.0)
- unbounded_memory_init = source_context
- unbounded_memory_weight_init = paddle.layer.slope_intercept(
- input=paddle.layer.fc(input=unbounded_memory_init, size=1),
- slope=0.0,
- intercept=0.0)
-
-    # prepare step function for recurrent group
- def recurrent_decoder_step(cur_embedding):
- # create hidden state, bounded and unbounded memory.
- state = paddle.layer.memory(
- name="gru_decoder", size=size, boot_layer=initial_state)
- bounded_memory = ExternalMemory(
- name="bounded_memory",
- mem_slot_size=size,
- boot_layer=bounded_memory_init,
- initial_weight=bounded_memory_weight_init,
- readonly=False,
- enable_interpolation=True)
- unbounded_memory = ExternalMemory(
- name="unbounded_memory",
- mem_slot_size=size * 2,
- boot_layer=unbounded_memory_init,
- initial_weight=unbounded_memory_weight_init,
- readonly=True,
- enable_interpolation=False)
- # write bounded memory
- bounded_memory.write(state)
- # read bounded memory
- bounded_memory_read = bounded_memory.read(state)
- # prepare key for unbounded memory
- key_for_unbounded_memory = paddle.layer.fc(
- input=[bounded_memory_read, cur_embedding],
- size=size,
- act=paddle.activation.Tanh(),
- bias_attr=False)
- # read unbounded memory (i.e. attention mechanism)
- context = unbounded_memory.read(key_for_unbounded_memory)
- # gated recurrent unit
- gru_inputs = paddle.layer.fc(
- input=[context, cur_embedding, bounded_memory_read],
- size=size * 3,
- act=paddle.activation.Linear(),
- bias_attr=False)
- gru_output = paddle.layer.gru_step(
- name="gru_decoder", input=gru_inputs, output_mem=state, size=size)
- # step output
- return paddle.layer.fc(input=[gru_output, context, cur_embedding],
- size=dict_size,
- act=paddle.activation.Softmax(),
- bias_attr=True)
-
- if not is_generating:
- target_embeddings = paddle.layer.embedding(
- input=input,
- size=word_vec_dim,
- param_attr=paddle.attr.ParamAttr(name="_decoder_word_embedding"))
- decoder_result = paddle.layer.recurrent_group(
- name="decoder_group",
- step=recurrent_decoder_step,
- input=[target_embeddings])
- cost = paddle.layer.classification_cost(
- input=decoder_result, label=target)
- return cost
- else:
- target_embeddings = paddle.layer.GeneratedInput(
- size=dict_size,
- embedding_name="_decoder_word_embedding",
- embedding_size=word_vec_dim)
- beam_gen = paddle.layer.beam_search(
- name="decoder_group",
- step=recurrent_decoder_step,
- input=[target_embeddings],
- bos_id=0,
- eos_id=1,
- beam_size=beam_size,
- max_length=100)
- return beam_gen
-
-
-def memory_enhanced_seq2seq(encoder_input, decoder_input, decoder_target,
- hidden_size, word_vec_dim, dict_size, is_generating,
- beam_size):
- """Seq2Seq Model enhanced with external memory.
-
- The "external memory" refers to two types of memories.
- - Unbounded memory: i.e. attention mechanism in Seq2Seq.
- - Bounded memory: i.e. external memory in NTM.
- Both types of external memories can be implemented with
- ExternalMemory class, and are both exploited in this Seq2Seq model.
-
- Please refer to the function comments of memory_enhanced_decoder(...).
-
- For more details about external memory, please refer to
- `Neural Turing Machines `_.
-
- For more details about this memory-enhanced Seq2Seq, please
- refer to `Memory-enhanced Decoder for Neural Machine Translation
- `_.
-
-    :param encoder_input: Encoder input.
-    :type encoder_input: LayerOutput
-    :param decoder_input: Decoder input.
-    :type decoder_input: LayerOutput
-    :param decoder_target: Decoder target.
-    :type decoder_target: LayerOutput
-    :param hidden_size: Hidden cell number, both in encoder and decoder rnn.
-    :type hidden_size: int
-    :param word_vec_dim: Word embedding size.
-    :type word_vec_dim: int
-    :param dict_size: Vocabulary size.
-    :type dict_size: int
-    :param is_generating: Whether for beam search inference (True) or
-                          for training (False).
-    :type is_generating: bool
-    :param beam_size: Beam search width.
-    :type beam_size: int
- :return: Cost layer if is_generating=False; Beam search layer if
- is_generating = True.
- :rtype: LayerOutput
- """
- # encoder
- context_encodings, sequence_encoding = bidirectional_gru_encoder(
- input=encoder_input, size=hidden_size, word_vec_dim=word_vec_dim)
- # decoder
- return memory_enhanced_decoder(
- input=decoder_input,
- target=decoder_target,
- initial_state=sequence_encoding,
- source_context=context_encodings,
- size=hidden_size,
- word_vec_dim=word_vec_dim,
- dict_size=dict_size,
- is_generating=is_generating,
- beam_size=beam_size)
diff --git a/legacy/mt_with_external_memory/train.py b/legacy/mt_with_external_memory/train.py
deleted file mode 100644
index 38d1970cb1d6b988bfee387ad1283e22d66422fc..0000000000000000000000000000000000000000
--- a/legacy/mt_with_external_memory/train.py
+++ /dev/null
@@ -1,163 +0,0 @@
-"""
- Contains training script for machine translation with external memory.
-"""
-import argparse
-import sys
-import os
-import gzip
-import distutils.util
-import random
-
-import paddle.v2 as paddle
-from external_memory import ExternalMemory
-from model import memory_enhanced_seq2seq
-from data_utils import reader_append_wrapper
-
-parser = argparse.ArgumentParser(description=__doc__)
-parser.add_argument(
- "--dict_size",
- default=30000,
- type=int,
- help="Vocabulary size. (default: %(default)s)")
-parser.add_argument(
- "--word_vec_dim",
- default=512,
- type=int,
- help="Word embedding size. (default: %(default)s)")
-parser.add_argument(
- "--hidden_size",
- default=1024,
- type=int,
- help="Hidden cell number in RNN. (default: %(default)s)")
-parser.add_argument(
- "--memory_slot_num",
- default=8,
- type=int,
- help="External memory slot number. (default: %(default)s)")
-parser.add_argument(
- "--use_gpu",
- default=False,
- type=distutils.util.strtobool,
- help="Use gpu or not. (default: %(default)s)")
-parser.add_argument(
- "--trainer_count",
- default=1,
- type=int,
- help="Trainer number. (default: %(default)s)")
-parser.add_argument(
- "--num_passes",
- default=100,
- type=int,
- help="Training epochs. (default: %(default)s)")
-parser.add_argument(
- "--batch_size",
- default=5,
- type=int,
- help="Batch size. (default: %(default)s)")
-parser.add_argument(
- "--memory_perturb_stddev",
- default=0.1,
- type=float,
- help="Memory perturb stddev for memory initialization."
- "(default: %(default)s)")
-args = parser.parse_args()
-
-
-def train():
- """
- For training.
- """
- # create optimizer
- optimizer = paddle.optimizer.Adam(
- learning_rate=5e-5,
- gradient_clipping_threshold=5,
- regularization=paddle.optimizer.L2Regularization(rate=8e-4))
-
- # create network config
- source_words = paddle.layer.data(
- name="source_words",
- type=paddle.data_type.integer_value_sequence(args.dict_size))
- target_words = paddle.layer.data(
- name="target_words",
- type=paddle.data_type.integer_value_sequence(args.dict_size))
- target_next_words = paddle.layer.data(
- name='target_next_words',
- type=paddle.data_type.integer_value_sequence(args.dict_size))
- cost = memory_enhanced_seq2seq(
- encoder_input=source_words,
- decoder_input=target_words,
- decoder_target=target_next_words,
- hidden_size=args.hidden_size,
- word_vec_dim=args.word_vec_dim,
- dict_size=args.dict_size,
- is_generating=False,
- beam_size=None)
-
- # create parameters and trainer
- parameters = paddle.parameters.create(cost)
- trainer = paddle.trainer.SGD(cost=cost,
- parameters=parameters,
- update_equation=optimizer)
-
- # create data readers
- feeding = {
- "source_words": 0,
- "target_words": 1,
- "target_next_words": 2,
- "bounded_memory_perturbation": 3
- }
-    random.seed(0)  # fix the seed to keep the perturbation consistent across runs
- bounded_memory_perturbation = [[
- random.gauss(0, args.memory_perturb_stddev)
- for i in xrange(args.hidden_size)
- ] for j in xrange(args.memory_slot_num)]
- train_append_reader = reader_append_wrapper(
- reader=paddle.dataset.wmt14.train(args.dict_size),
- append_tuple=(bounded_memory_perturbation, ))
- train_batch_reader = paddle.batch(
- reader=paddle.reader.shuffle(
- reader=train_append_reader, buf_size=8192),
- batch_size=args.batch_size)
- test_append_reader = reader_append_wrapper(
- reader=paddle.dataset.wmt14.test(args.dict_size),
- append_tuple=(bounded_memory_perturbation, ))
- test_batch_reader = paddle.batch(
- reader=paddle.reader.shuffle(
- reader=test_append_reader, buf_size=8192),
- batch_size=args.batch_size)
-
- # create event handler
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % 10 == 0:
- print "Pass: %d, Batch: %d, TrainCost: %f, %s" % (
- event.pass_id, event.batch_id, event.cost, event.metrics)
- with gzip.open("checkpoints/params.latest.tar.gz", 'w') as f:
- trainer.save_parameter_to_tar(f)
- else:
- sys.stdout.write('.')
- sys.stdout.flush()
- if isinstance(event, paddle.event.EndPass):
- result = trainer.test(reader=test_batch_reader, feeding=feeding)
- print "Pass: %d, TestCost: %f, %s" % (event.pass_id, result.cost,
- result.metrics)
- with gzip.open("checkpoints/params.pass-%d.tar.gz" % event.pass_id,
- 'w') as f:
- trainer.save_parameter_to_tar(f)
-
- # run train
- if not os.path.exists('checkpoints'):
- os.mkdir('checkpoints')
- trainer.train(
- reader=train_batch_reader,
- event_handler=event_handler,
- num_passes=args.num_passes,
- feeding=feeding)
-
-
-def main():
- paddle.init(use_gpu=args.use_gpu, trainer_count=args.trainer_count)
- train()
-
-
-if __name__ == '__main__':
- main()
diff --git a/legacy/nce_cost/.gitignore b/legacy/nce_cost/.gitignore
deleted file mode 100644
index 203ec9a67426fee99e6228716433bb1bec8ff14f..0000000000000000000000000000000000000000
--- a/legacy/nce_cost/.gitignore
+++ /dev/null
@@ -1,3 +0,0 @@
-*.pyc
-*.tar.gz
-models
diff --git a/legacy/nce_cost/README.md b/legacy/nce_cost/README.md
deleted file mode 100644
index 25864ada5c5ab9c686070743f4745f7062047205..0000000000000000000000000000000000000000
--- a/legacy/nce_cost/README.md
+++ /dev/null
@@ -1,156 +0,0 @@
-The examples in this directory require PaddlePaddle v0.10.0. If your installed PaddlePaddle version is lower than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update your PaddlePaddle installation.
-
----
-
-# Accelerating Language Model Training with Noise-Contrastive Estimation
-
-## Why Noise-Contrastive Estimation Is Needed
-
-Language models are fundamental to many natural language processing tasks and are also an effective way to obtain word vector representations. A Neural Probabilistic Language Model (NPLM) describes the probability $P(\omega_1^T)$ that a word sequence $\omega_1,...,\omega_T$ belongs to some fixed language:
-$$P(\omega_1^T)= \prod_{t=1}^{T}P(\omega_t|\omega_1^{t-1})$$
-
-To make modeling and solving easier, a conditional independence assumption is usually introduced: the probability of word $\omega_t$ is affected only by the previous $n-1$ words, giving:
-
-$$ P(\omega_1^T) \approx \prod P(\omega_t|\omega_{t-n+1}^{t-1}) \tag{1}$$
-
-Equation ($1$) shows that modeling the conditional probability $P(\omega_t|\omega_{t-n+1},...,\omega_{t-1})$ is enough to compute the probability of the whole sequence $\omega_1,...,\omega_T$. The task of a language model can therefore be summarized simply as:
-
-**Given a vector representation $h$ of the preceding words, called the context, the model predicts the probability of the next target word $\omega$.**
-
-In an [$n$-gram language model](https://github.com/PaddlePaddle/book/tree/develop/04.word2vec), the context is the fixed preceding $n-1$ words; an [RNN language model](https://github.com/PaddlePaddle/models/tree/develop/generate_sequence_by_rnn_lm) can handle contexts of arbitrary length.
-
-Given a context $h$, an NPLM learns a scoring function $s_\theta(\omega, h)$ that measures the similarity between the context vector $h$ and the vector representation $\omega'$ of each possible next word. Normalizing the scores over the whole vocabulary (dividing by a normalization factor $Z$) yields the probability distribution of the target word $\omega$, where $\theta$ denotes the learnable parameters. This process is expressed in equation ($2$), which is exactly the `Softmax` computation:
-
-$$P_\theta^h(\omega) = \frac{\exp(s_\theta(\omega, h))}{Z},\quad Z=\sum_{\omega'} \exp(s_\theta(\omega', h)) \tag{2}$$
-
-Maximum Likelihood Estimation (MLE) is the most common criterion for learning probability ($2$). However, both estimating the probability $P_\theta^h(\omega)$ and computing the gradient of the likelihood require evaluating the normalization factor $Z$, whose cost grows linearly with the vocabulary size. When training large-scale language models, e.g. with vocabularies of a million words or more, training becomes extremely slow, so **we need an alternative learning criterion whose solution is computationally lighter.**
-
-Another article in models introduces [accelerating word vector training with Hsigmoid](https://github.com/PaddlePaddle/models/tree/develop/hsigmoid); here we introduce another sampling-based method for speeding up language model training: Noise-Contrastive Estimation (NCE) \[[1](#references)\].
-
-## What Is Noise-Contrastive Estimation
-
-Noise-contrastive estimation is a sampling-based criterion for probability density estimation. It estimates/fits the special class of probability functions composed of an unnormalized scoring function and a normalization factor \[[1](#references)\]. NCE avoids computing the normalization factor $Z$ over the whole vocabulary, and thus reduces the computational cost, by constructing the following auxiliary problem:
-
-Given a context $h$ and an arbitrary known noise distribution $P_n$, learn a binary classifier that estimates the probability that the target $\omega$ comes from the true distribution $P_\theta$ ($D = 1$) rather than from the noise distribution $P_n$ ($D = 0$). Assuming the negative samples drawn from the noise distribution are $k$ times as numerous as the target samples, we have:
-
-$$P(D=1|h,\omega) = \frac{P_\theta(h, \omega)}{P_\theta (h, \omega) + kP_n} \tag{3}$$
-
-We model the binary classification probability of equation ($3$) directly with a `Sigmoid` function, where $\Delta s_\theta(\omega,h) = s_\theta(\omega,h) - \log(kP_n(\omega))$ is the score shifted by the log of the scaled noise probability:
-
-$$P(D=1|h,\omega) = \sigma (\Delta s_\theta(\omega,h)) \tag{4}$$
-
-With this setup, maximum likelihood estimation can be carried out as binary classification: increase the probability of positive samples while decreasing that of negative samples [[2,3](#references)], i.e. minimize the following loss function:
-
-$$J^h(\theta) = E_{P_d^h}\left[ \log P^h(D=1|\omega,\theta) \right] + kE_{P_n}\left[ \log P^h(D=0|\omega,\theta) \right]$$
-$$\qquad\quad = E_{P_d^h}\left[ \log \sigma(\Delta s_\theta(\omega,h)) \right] + kE_{P_n}\left[ \log (1-\sigma(\Delta s_\theta(\omega,h))) \right] \tag{5}$$
-
-Equation ($5$) is the NCE loss defined by noise-contrastive estimation. Two questions remain:
-1. What is $s_\theta(\omega,h)$ in equation ($5$)?
- - In the neural network implementation, $s_\theta(\omega,h)$ is the unnormalized score.
- - The learnable parameter $W$ of the NCE cost layer is a $|V| \times d$ matrix, where $|V|$ is the vocabulary size and $d$ is the dimension of the context vector $h$;
- - During training, the true class $t$ of the next word is the positive class, and $k$ negative samples with classes $\{n_1, ..., n_k\}$ are drawn from the specified noise distribution;
- - The rows $\{t, n_1, ..., n_k\}$ of $W$ ($k + 1$ rows in total) are extracted and each scored against $h$ to obtain $s_\theta(\omega,h)$, from which the final loss follows via equation ($5$); a toy sketch is given below.
-2. How should the noise distribution be chosen?
- - In practice, any suitable noise distribution may be chosen (the noise distribution carries an implicit prior).
- - The most common choices are a `unigram` distribution over the whole vocabulary (word frequency counts) or an unbiased uniform distribution.
- - In PaddlePaddle, the uniform distribution is used by default if the user does not specify a noise distribution.
-
-When training with the NCE criterion, the computational cost of the last layer scales only linearly with the number of negative samples, and as that number grows, the NCE criterion converges to maximum likelihood estimation. The number of negative samples therefore controls how closely the normalized probability distribution is approximated.
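-
-To make the per-sample computation concrete, here is a toy NumPy sketch of equation ($5$) for a single context vector `h`, one positive class `t`, and `k` sampled negatives (all names are ours; it mirrors the row-selection logic described above rather than PaddlePaddle's internal implementation):
-
-```python
-import numpy as np
-
-def nce_loss(h, W, b, t, neg_ids, p_noise):
-    """Toy NCE loss for one context vector h of shape [d].
-
-    W: [|V|, d] score matrix, b: [|V|] bias, t: positive class id,
-    neg_ids: array of k sampled negative ids, p_noise: [|V|] noise probs.
-    """
-    ids = np.concatenate(([t], neg_ids))             # the k+1 touched rows
-    s = W[ids].dot(h) + b[ids]                       # unnormalized scores
-    delta = s - np.log(len(neg_ids) * p_noise[ids])  # s - log(k * P_n)
-    p_true = 1.0 / (1.0 + np.exp(-delta))            # sigmoid, eq. (4)
-    # eq. (5): positive term plus k negative terms (Monte Carlo estimate)
-    return -(np.log(p_true[0]) + np.sum(np.log(1.0 - p_true[1:])))
-```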
-
-## Dataset
-This example trains a 5-gram language model on the Penn Treebank (PTB) dataset ([Tomas Mikolov's preprocessed version](http://www.fit.vutbr.cz/~imikolov/rnnlm/simple-examples.tgz)). PaddlePaddle provides the [paddle.dataset.imikolov](https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/v2/dataset/imikolov.py) interface for convenient access to the PTB data; when no downloaded copy is found, the script downloads the data automatically and verifies file integrity. The corpus is English, with 42068 training sentences and 3761 test sentences.
-
-## Network Structure
-The detailed structure of the 5-gram neural probabilistic language model is shown in Figure 1:
-
-
-
-Figure 1. 5-gram network configuration
-
-
-The model consists of the following parts:
-
-1. **Input layer**: input samples consist of raw English words, each first converted to its dictionary id.
-
-2. **Word embedding layer**: the id representations are mapped through the embedding layer into continuous word vector representations, which better capture the semantic relations between words. After training, semantic similarity between words can be measured by the distance between their word vectors: the more similar the semantics, the smaller the distance.
-
-3. **Concatenation layer**: the word vectors are joined end to end into one long vector, which is convenient for the fully connected layer that follows.
-
-4. **Fully connected hidden layer**: the long vector is fed through a single hidden layer to produce a feature vector; the fully connected hidden layer increases the network's learning capacity.
-
-5. **NCE layer**: during training, `paddle.layer.nce` provided by PaddlePaddle can be used directly as the loss function; a condensed sketch of the full topology follows below.
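-
-Putting the five parts together, a condensed sketch of the training topology (abridged from `ngram_lm` in `network_conf.py`; parameter attributes are omitted) looks like this:
-
-```python
-# ids -> embeddings for the 4 context words, then concat -> fc -> nce
-emb_layers = [
-    paddle.layer.embedding(
-        input=paddle.layer.data(
-            name="__word%02d__" % i,
-            type=paddle.data_type.integer_value(dict_size)),
-        size=emb_size) for i in range(4)
-]
-hidden_layer = paddle.layer.fc(
-    input=paddle.layer.concat(input=emb_layers),
-    size=hidden_size,
-    act=paddle.activation.Tanh())
-```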
-
-
-## Training
-Run ``` python train.py ``` in the command line to start training directly.
-
-- On the first run, the program checks whether the user's cache directory contains the PTB dataset and downloads it automatically if not.
-- While running, the cost on the training set is printed every 10 batches.
-- At the end of each pass, the loss on the test set is computed and the latest model snapshot is saved.
-
-The NCE invocation in the model file `network_conf.py` is as follows:
-
-```python
-return paddle.layer.nce(
- input=hidden_layer,
- label=next_word,
- num_classes=dict_size,
- param_attr=paddle.attr.Param(name="nce_w"),
- bias_attr=paddle.attr.Param(name="nce_b"),
- num_neg_samples=25,
- neg_distribution=None)
-```
-
-Some important parameters of the NCE layer are explained below:
-
-| Parameter | Role | Notes |
-| :----------------------- | :--------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------------------------------- |
-| param\_attr / bias\_attr | Sets the parameter names | Makes the parameters easy to load at prediction time; see the Prediction section for details. |
-| num\_neg\_samples | Number of negative samples | Controls the ratio of positive to negative samples; the valid range is [1, vocabulary size - 1]. More negative samples slow down training but raise model accuracy. |
-| neg\_distribution | Distribution used to sample negative labels, uniform by default | Lets you control the per-class sampling weights for negative samples. For example, if the negative class "flood" should be distinguished more sharply when the positive class is "sunny", the sampling weight of "flood" can be increased. |
-
-## Prediction
-1. Run in the command line:
- ```bash
- python infer.py \
- --model_path "models/XX" \
- --batch_size 1 \
- --use_gpu false \
- --trainer_count 1
- ```
-    The arguments mean:
-    - `model_path`: path to the trained model. Required.
-    - `batch_size`: number of samples predicted in parallel per batch. Optional, default `1`.
-    - `use_gpu`: whether to use the GPU for prediction. Optional, default `False`.
-    - `trainer_count`: number of threads used in prediction. Optional, default `1`. **Note: the batch size must be greater than or equal to the number of prediction threads** (as asserted in `infer.py`).
-
-2. Note that **prediction and training use different computation logic**. Prediction uses a fully connected matrix multiplication followed by a `softmax` activation and outputs a probability distribution over the classes; it replaces the `paddle.layer.nce` layer used in training. In PaddlePaddle, the NCE layer stores its learnable parameters as a matrix of size `[number of classes × width of the previous layer's output]`. At prediction time, **the fully connected operation must transpose the parameters learned by the NCE layer when loading them**, as follows:
- ```python
- return paddle.layer.mixed(
- size=dict_size,
- input=paddle.layer.trans_full_matrix_projection(
- hidden_layer, param_attr=paddle.attr.Param(name="nce_w")),
- act=paddle.activation.Softmax(),
- bias_attr=paddle.attr.Param(name="nce_b"))
- ```
-    The `paddle.layer.mixed` in the snippet above must take PaddlePaddle `paddle.layer.×_projection` layers as input. `paddle.layer.mixed` sums the results of its (possibly multiple) input `projection`s as its output. `paddle.layer.trans_full_matrix_projection` transposes the parameter $W$ when computing the matrix multiplication.
-
-3. The prediction output format is as follows:
- ```text
- 0.6734 their may want to move
- ```
-
-    Each line is one prediction result, with 3 tab-separated columns:
-    - Column 1: the probability of the next word.
-    - Column 2: the next word predicted by the model.
-    - Column 3: the $n$ input words, space-separated.
-
-
-## References
-1. Gutmann M, Hyvärinen A. [Noise-contrastive estimation: A new estimation principle for unnormalized statistical models](http://proceedings.mlr.press/v9/gutmann10a/gutmann10a.pdf)[C]//Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics. 2010: 297-304.
-
-1. Mnih A, Kavukcuoglu K. [Learning word embeddings efficiently with noise-contrastive estimation](https://papers.nips.cc/paper/5165-learning-word-embeddings-efficiently-with-noise-contrastive-estimation.pdf)[C]//Advances in neural information processing systems. 2013: 2265-2273.
-
-1. Mnih A, Teh Y W. [A Fast and Simple Algorithm for Training Neural Probabilistic Language Models](https://arxiv.org/abs/1206.6426)[C]//Proceedings of the 29th International Conference on Machine Learning. 2012: 1751-1758.
diff --git a/legacy/nce_cost/images/network_conf.png b/legacy/nce_cost/images/network_conf.png
deleted file mode 100644
index 749f8a365db1e1c18d829a460de7c45b27892d19..0000000000000000000000000000000000000000
Binary files a/legacy/nce_cost/images/network_conf.png and /dev/null differ
diff --git a/legacy/nce_cost/infer.py b/legacy/nce_cost/infer.py
deleted file mode 100644
index 5498c84868c6d6fc0c1fe62363bc8701cc9b0942..0000000000000000000000000000000000000000
--- a/legacy/nce_cost/infer.py
+++ /dev/null
@@ -1,68 +0,0 @@
-import os
-import gzip
-import click
-import numpy as np
-
-import paddle.v2 as paddle
-from network_conf import ngram_lm
-
-
-def infer_a_batch(inferer, test_batch, id_to_word):
- probs = inferer.infer(input=test_batch)
- for i, res in enumerate(zip(test_batch, probs)):
- maxid = res[1].argsort()[-1]
- print("%.4f\t%s\t%s" % (res[1][maxid], id_to_word[maxid],
- " ".join([id_to_word[w] for w in res[0]])))
-
-
-@click.command("infer")
-@click.option(
- "--model_path",
- default="",
- help="The path of the trained model for generation.")
-@click.option(
- "--batch_size",
- default=1,
- help="The number of testing examples in one forward batch in inferring.")
-@click.option(
- "--use_gpu", default=False, help="Whether to use GPU in inference or not.")
-@click.option(
- "--trainer_count",
- default=1,
- help="Whether to use GPU in inference or not.")
-def infer(model_path, batch_size, use_gpu, trainer_count):
- assert os.path.exists(model_path), "The trained model does not exist."
- assert (batch_size and trainer_count and batch_size >= trainer_count), (
- "batch_size and trainer_count must both be greater than 0. "
- "And batch_size must be equal to "
- "or greater than trainer_count.")
-
- word_to_id = paddle.dataset.imikolov.build_dict()
- id_to_word = dict((v, k) for k, v in word_to_id.items())
- dict_size = len(word_to_id)
-
- paddle.init(use_gpu=use_gpu, trainer_count=trainer_count)
-
- # load the trained model.
- with gzip.open(model_path) as f:
- parameters = paddle.parameters.Parameters.from_tar(f)
- prediction_layer = ngram_lm(
- is_train=False, hidden_size=128, emb_size=512, dict_size=dict_size)
- inferer = paddle.inference.Inference(
- output_layer=prediction_layer, parameters=parameters)
-
- test_batch = []
- for idx, item in enumerate(paddle.dataset.imikolov.test(word_to_id, 5)()):
-        test_batch.append(item[:4])
-        if len(test_batch) == batch_size:
-            infer_a_batch(inferer, test_batch, id_to_word)
-            test_batch = []
-
- if len(test_batch):
- infer_a_batch(inferer, test_batch, id_to_word)
-        test_batch = []
-
-
-if __name__ == "__main__":
- infer()
diff --git a/legacy/nce_cost/network_conf.py b/legacy/nce_cost/network_conf.py
deleted file mode 100644
index bdec1b78de76be1398e3ec1afa156fe4fe114e48..0000000000000000000000000000000000000000
--- a/legacy/nce_cost/network_conf.py
+++ /dev/null
@@ -1,55 +0,0 @@
-import math
-
-import paddle.v2 as paddle
-from paddle.v2.layer import parse_network
-
-
-def ngram_lm(hidden_size, emb_size, dict_size, gram_num=4, is_train=True):
- emb_layers = []
- embed_param_attr = paddle.attr.Param(
- name="_proj", initial_std=0.001, learning_rate=1, l2_rate=0)
- for i in range(gram_num):
- word = paddle.layer.data(
- name="__word%02d__" % (i),
- type=paddle.data_type.integer_value(dict_size))
- emb_layers.append(
- paddle.layer.embedding(
- input=word, size=emb_size, param_attr=embed_param_attr))
- next_word = paddle.layer.data(
- name="__target_word__", type=paddle.data_type.integer_value(dict_size))
-
- context_embedding = paddle.layer.concat(input=emb_layers)
-
- hidden_layer = paddle.layer.fc(
- input=context_embedding,
- size=hidden_size,
- act=paddle.activation.Tanh(),
- param_attr=paddle.attr.Param(initial_std=1. / math.sqrt(emb_size * 8)))
-
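-    # During training, use the sampled NCE cost; at inference time the
-    # is_train=False branch below rebuilds a full softmax from the same
-    # "nce_w"/"nce_b" parameters.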
- if is_train:
- return paddle.layer.nce(input=hidden_layer,
- label=next_word,
- num_classes=dict_size,
- param_attr=paddle.attr.Param(name="nce_w"),
- bias_attr=paddle.attr.Param(name="nce_b"),
- num_neg_samples=25,
- neg_distribution=None)
- else:
- return paddle.layer.mixed(
- size=dict_size,
- input=paddle.layer.trans_full_matrix_projection(
- hidden_layer, param_attr=paddle.attr.Param(name="nce_w")),
- act=paddle.activation.Softmax(),
- bias_attr=paddle.attr.Param(name="nce_b"))
-
-
-if __name__ == "__main__":
-    # this is to test and debug the network topology definition.
- # please set the hyper-parameters as needed.
- print(parse_network(
- ngram_lm(
- hidden_size=256,
- emb_size=256,
- dict_size=1024,
- gram_num=4,
- is_train=True)))
diff --git a/legacy/nce_cost/train.py b/legacy/nce_cost/train.py
deleted file mode 100644
index 11ba1e1652ed37806b1d7a2bb7faf6a330195d47..0000000000000000000000000000000000000000
--- a/legacy/nce_cost/train.py
+++ /dev/null
@@ -1,53 +0,0 @@
-import os
-import logging
-import gzip
-
-import paddle.v2 as paddle
-from network_conf import ngram_lm
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def train(model_save_dir):
- if not os.path.exists(model_save_dir):
- os.mkdir(model_save_dir)
-
- paddle.init(use_gpu=False, trainer_count=1)
- word_dict = paddle.dataset.imikolov.build_dict()
- dict_size = len(word_dict)
-
- optimizer = paddle.optimizer.Adam(learning_rate=1e-4)
-
- cost = ngram_lm(hidden_size=128, emb_size=512, dict_size=dict_size)
- parameters = paddle.parameters.create(cost)
- trainer = paddle.trainer.SGD(cost, parameters, optimizer)
-
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id and not event.batch_id % 10:
- logger.info("Pass %d, Batch %d, Cost %f" %
- (event.pass_id, event.batch_id, event.cost))
- elif isinstance(event, paddle.event.EndPass):
- result = trainer.test(
- paddle.batch(paddle.dataset.imikolov.test(word_dict, 5), 64))
- logger.info("Test Pass %d, Cost %f" % (event.pass_id, result.cost))
-
- save_path = os.path.join(model_save_dir,
- "model_pass_%05d.tar.gz" % event.pass_id)
- logger.info("Save model into %s ..." % save_path)
- with gzip.open(save_path, "w") as f:
- trainer.save_parameter_to_tar(f)
-
- trainer.train(
- paddle.batch(
- paddle.reader.shuffle(
- lambda: paddle.dataset.imikolov.train(word_dict, 5)(),
- buf_size=1000),
- 64),
- num_passes=1000,
- event_handler=event_handler)
-
-
-if __name__ == "__main__":
- train(model_save_dir="models")
diff --git a/legacy/nested_sequence/README.md b/legacy/nested_sequence/README.md
deleted file mode 100644
index 4bb2a2cbfed71a48cf61544cb550016aa0cfa045..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/README.md
+++ /dev/null
@@ -1,9 +0,0 @@
-## Introduction
-
-Sequences are a type of input data faced by many machine learning and data mining tasks. Taking natural language processing as an example: sentences are composed of words, and multiple sentences in turn form paragraphs. A paragraph can therefore be viewed as a nested sequence (also called a double-layer sequence), each element of which is itself a sequence.
-
-The double-layer sequence is a very flexible way of organizing data supported by PaddlePaddle. It helps us better describe more complex data such as paragraphs and multi-turn dialogues. Taking a double-layer sequence as input, we can design a hierarchical network to better accomplish complex tasks.
-
-This unit introduces how to use double-layer sequences in PaddlePaddle.
-
-- [基于双层序列的文本分类](https://github.com/PaddlePaddle/models/tree/develop/nested_sequence/text_classification)
diff --git a/legacy/nested_sequence/README_en.md b/legacy/nested_sequence/README_en.md
deleted file mode 100644
index f2b55dbe7e22d8095dd1d039b4dd8c3525b6900a..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/README_en.md
+++ /dev/null
@@ -1,8 +0,0 @@
-## Introduction
-Sequences are a type of input data faced by many machine learning and data mining tasks. Taking natural language processing as an example, a sentence is composed of words, and a paragraph is composed of sentences. As a result, a paragraph can be seen as a nested sequence (also called a double sequence), each element of which is itself a sequence.
-
-The double sequence is a very flexible data organization method supported by PaddlePaddle, which helps us better describe more complex data such as paragraphs and multi-turn dialogues. With a double-layer sequence as input, we can design a hierarchical network to better accomplish complex tasks.
-
-This unit will introduce how to use a double sequence in PaddlePaddle.
-
-- [Text Classification Based on Double Sequence](https://github.com/PaddlePaddle/models/tree/develop/nested_sequence/text_classification)
diff --git a/legacy/nested_sequence/text_classification/.gitignore b/legacy/nested_sequence/text_classification/.gitignore
deleted file mode 100644
index dde3895fc112ad34a839b2fed9210ac2288a959b..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/.gitignore
+++ /dev/null
@@ -1,2 +0,0 @@
-.DS_Store
-*.pyc
diff --git a/legacy/nested_sequence/text_classification/README.md b/legacy/nested_sequence/text_classification/README.md
deleted file mode 100644
index 093bd9a4a57ae22f049eb83fabe4e8a9150bf335..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/README.md
+++ /dev/null
@@ -1,242 +0,0 @@
-Running the program examples in this directory requires PaddlePaddle v0.11.0. If your installed PaddlePaddle version is lower than this requirement, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update your PaddlePaddle installation.
-
----
-
-# Text Classification Based on Double Sequence
-
-## Introduction
-This example demonstrates how to organize long text input (usually at the paragraph or chapter level) as a double-layer sequence in PaddlePaddle, to complete the task of classifying long texts.
-
-## Model introduction
-We treat a piece of text as a sequence of sentences, and each sentence as a sequence of words.
-
-We first use a convolutional neural network to encode each sentence in the paragraph; then, the representation vectors of the sentences are passed through a pooling layer to obtain the encoded vector of the paragraph; finally, the paragraph vector is fed to the classifier (a fully connected layer with softmax) to obtain the final classification result.
-
-**The model structure is shown in the figure below**
-
-
-Figure 1. Text classification model based on a double-layer sequence
-
-
-The PaddlePaddle implementation of this network structure is in `network_conf.py`.
-
-To process a double-layer time series, we first need to transform it into single-layer time series data, and then process each single-layer time series. In PaddlePaddle, `recurrent_group` is the main tool that helps us build hierarchical models for processing double-layer sequences. Here, we use two nested `recurrent_group`s: the outer `recurrent_group` decomposes the paragraph into sentences, and its `step` function receives the sentence sequence as input; the inner `recurrent_group` decomposes each sentence into words, and its `step` function receives the non-sequential words as input.
-
-At the word level, we apply a CNN that takes word embeddings as input and outputs the learned sentence representation; at the paragraph level, we obtain the paragraph representation from the sentence representations through pooling.
-
-``` python
-nest_group = paddle.layer.recurrent_group(input=[paddle.layer.SubsequenceInput(emb),
- hidden_size],
- step=cnn_cov_group)
-```
-
-
-After decomposition, each single-layer sequence is passed through a CNN to learn its vector representation. The CNN consists of the following parts:
-
-- **Convolution layer**: convolution in text classification is performed over the time dimension, with the kernel width equal to the width of the matrix produced by the word embedding layer. Each convolution produces a "feature map", and multiple feature maps can be obtained by using kernels of different heights. This code uses convolution kernels of size 3 (the red box in Figure 1) and 4 (the blue box in Figure 1) by default.
-- **Max pooling layer**: max pooling is applied to each feature map produced by convolution. Since a feature map is itself a vector, max pooling simply selects the largest element of each vector. All the largest elements are then concatenated into a new vector.
-- **Linear projection layer**: the max-pooled results of the different convolutions are concatenated into one long vector, which is passed through a linear projection to obtain the representation vector of the corresponding single-layer sequence.
-
-The CNN is implemented as follows:
-```python
-def cnn_cov_group(group_input, hidden_size):
- """
- Convolution group definition.
- :param group_input: The input of this layer.
- :type group_input: LayerOutput
-    :param hidden_size: The size of the fully connected layer.
- :type hidden_size: int
- """
- conv3 = paddle.networks.sequence_conv_pool(
- input=group_input, context_len=3, hidden_size=hidden_size)
- conv4 = paddle.networks.sequence_conv_pool(
- input=group_input, context_len=4, hidden_size=hidden_size)
-
- linear_proj = paddle.layer.fc(input=[conv3, conv4],
- size=hidden_size,
- param_attr=paddle.attr.ParamAttr(name='_cov_value_weight'),
- bias_attr=paddle.attr.ParamAttr(name='_cov_value_bias'),
- act=paddle.activation.Linear())
-
- return linear_proj
-```
-PaddlePaddle provides a ready-made text sequence convolution module with pooling, `paddle.networks.sequence_conv_pool`, which can be called directly.
-
-After obtaining the representation vector of each sentence, all the sentence vectors are passed through an average pooling layer to obtain a single vector representation for the sample; this vector then goes through a fully connected layer to output the final prediction. The code is as follows:
-```python
-avg_pool = paddle.layer.pooling(input=nest_group,
- pooling_type=paddle.pooling.Avg(),
- agg_level=paddle.layer.AggregateLevel.TO_NO_SEQUENCE)
-
-prob = paddle.layer.mixed(size=class_num,
- input=[paddle.layer.full_matrix_projection(input=avg_pool)],
- act=paddle.activation.Softmax())
-```
-## Install dependencies
-```bash
-pip install -r requirements.txt
-```
-
-## Specify training configuration parameters
-
-Training and model configuration parameters are modified through the `config.py` script, which contains detailed explanations of the configurable parameters. For example:
-```python
-class TrainerConfig(object):
-
- # whether to use GPU for training
- use_gpu = False
- # the number of threads used in one machine
- trainer_count = 1
-
- # train batch size
- batch_size = 32
-
- ...
-
-
-class ModelConfig(object):
-
- # embedding vector dimension
- emb_size = 28
-
- ...
-```
-Adjust the parameters by modifying `config.py`. For example, modify the `use_gpu` parameter to specify whether to use the GPU for training.
-
-## Running with PaddlePaddle built-in data
-
-### Training
-Execute at the terminal:
-```bash
-python train.py
-```
-This runs the example on `imdb`, PaddlePaddle's built-in sentiment classification dataset.
-### Prediction
-After training, the model is stored in the specified directory (`models` by default). Execute at the terminal:
-```bash
-python infer.py --model_path 'models/params_pass_00000.tar.gz'
-```
-The prediction script loads the model trained for one pass and uses it to test the `imdb` test set.
-
-## Training and predicting with custom data
-
-### Training
-1. Data format
-
-Each line is one sample, with two tab-separated (`\t`) columns: the first column is the class label and the second is the input text.
-
-```
-positive This movie is very good. The actor is so handsome.
-negative What a terrible movie. I waste so much time.
-```
-
-2. Write the data reading interface
-
-To define a custom data reading interface, we only need to write a Python generator that implements the logic for **parsing the input text**. The following code fragment reads the raw data and returns types `paddle.data_type.integer_value_sub_sequence` and `paddle.data_type.integer_value`:
-```python
-def train_reader(data_dir, word_dict, label_dict):
- """
- Reader interface for training data
-
- :param data_dir: data directory
- :type data_dir: str
-    :param word_dict: the word dictionary;
-        the dictionary must have "UNK" in it.
-    :type word_dict: Python dict
-    :param label_dict: the label dictionary.
- :type label_dict: Python dict
- """
-
- def reader():
- UNK_ID = word_dict['']
- word_col = 1
- lbl_col = 0
-
- for file_name in os.listdir(data_dir):
- file_path = os.path.join(data_dir, file_name)
- if not os.path.isfile(file_path):
- continue
- with open(file_path, "r") as f:
- for line in f:
- line_split = line.strip().split("\t")
- doc = line_split[word_col]
- doc_ids = []
- for sent in doc.strip().split("."):
- sent_ids = [
- word_dict.get(w, UNK_ID)
- for w in sent.split()]
- if sent_ids:
- doc_ids.append(sent_ids)
-
- yield doc_ids, label_dict[line_split[lbl_col]]
-
- return reader
-```
-Note that in this example the English period `'.'` is used as the delimiter to split a piece of text into a number of sentences, and each sentence is represented as an array of indices into the word dictionary (`sent_ids`). Since the representation of the current sample (`doc_ids`) contains all the sentences of the text, its type is `paddle.data_type.integer_value_sub_sequence`.
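-
-As a toy illustration (with a hypothetical word dictionary), a two-sentence sample is converted into the nested list that `paddle.data_type.integer_value_sub_sequence` expects:
-
-```python
-word_dict = {"": 0, "this": 1, "movie": 2, "is": 3, "good": 4, "bad": 5}
-label_dict = {"positive": 0, "negative": 1}
-
-doc = "this movie is good. this movie is bad"
-doc_ids = [[word_dict.get(w, 0) for w in sent.split()]
-           for sent in doc.split(".") if sent.strip()]
-# doc_ids == [[1, 2, 3, 4], [1, 2, 3, 5]] -- one sub-sequence per sentence
-sample = (doc_ids, label_dict["positive"])
-```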
-
-
-3. Specify command line parameters for training
-
-The `train.py` training script accepts the following parameters:
-```
-Options:
- --train_data_dir TEXT The path of training dataset (default: None). If
- this parameter is not set, imdb dataset will be
- used.
- --test_data_dir TEXT The path of testing dataset (default: None). If this
- parameter is not set, imdb dataset will be used.
- --word_dict_path TEXT The path of word dictionary (default: None). If this
- parameter is not set, imdb dataset will be used. If
- this parameter is set, but the file does not exist,
-                          word dictionary will be built from the training data
-                          automatically.
-  --label_dict_path TEXT  The path of label dictionary (default: None). If this
-                          parameter is not set, imdb dataset will be used. If
-                          this parameter is set, but the file does not exist,
-                          label dictionary will be built from the training data
- automatically.
- --model_save_dir TEXT The path to save the trained models (default:
- 'models').
- --help Show this message and exit.
-```
-
-You can run this example directly by modifying the startup parameters of the `train.py` script. Taking the sample data in the `data` directory as an example, execute at the terminal:
-```bash
-python train.py \
- --train_data_dir 'data/train_data' \
- --test_data_dir 'data/test_data' \
- --word_dict_path 'word_dict.txt' \
- --label_dict_path 'label_dict.txt'
-```
-This trains the model on the sample data.
-
-### Prediction
-
-1. Specify command line parameters
-
-The `infer.py` script accepts the following parameters:
-
-```
-Options:
- --data_path TEXT The path of data for inference (default: None). If
- this parameter is not set, imdb test dataset will be
- used.
- --model_path TEXT The path of saved model. [required]
- --word_dict_path TEXT The path of word dictionary (default: None). If this
- parameter is not set, imdb dataset will be used.
-  --label_dict_path TEXT  The path of label dictionary (default: None). If this
- parameter is not set, imdb dataset will be used.
- --batch_size INTEGER The number of examples in one batch (default: 32).
- --help Show this message and exit.
-```
-
-2. Taking the sample data in the `data` directory as an example, execute at the terminal:
-```bash
-python infer.py \
- --data_path 'data/infer.txt' \
- --word_dict_path 'word_dict.txt' \
- --label_dict_path 'label_dict.txt' \
- --model_path 'models/params_pass_00000.tar.gz'
-```
-
-This runs prediction on the sample data.
diff --git a/legacy/nested_sequence/text_classification/README_en.md b/legacy/nested_sequence/text_classification/README_en.md
deleted file mode 100644
index f2cb5d16fd7f43b2db416b147f002f3dc59a034d..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/README_en.md
+++ /dev/null
@@ -1,241 +0,0 @@
-Running the sample code in this directory requires PaddlePaddle v0.11.0 or later. If the PaddlePaddle version on your device is lower than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html) to make an update.
-
-
----
-
-# Text Classification Based on Double Sequence
-
-## Introduction
-This example demonstrates how to organize long text input (usually paragraphs or chapters) as a double sequence in PaddlePaddle, to complete the task of classifying long texts.
-
-## Model introduction
-We treat a text as a sequence of sentences, and each sentence is a sequence of words.
-
-We first use a convolutional neural network to encode each sentence in the paragraph; then, the representation vector of each sentence is passed through a pooling layer to obtain the encoded vector of the paragraph; finally, the encoded vector of the paragraph is fed to the classifier (a fully connected layer with softmax) to obtain the final classification result.
-
-**The model structure is shown in the figure below**
-
-
-Figure 1. Text classification model based on a double-layer sequence
-
-
-The PaddlePaddle implementation of the network structure is in `network_conf.py`.
-
-To process a double-layer time series, we need to transform it into single-layer time series data and then process each single-layer time series. In PaddlePaddle, `recurrent_group` is the main tool that helps us build hierarchical models for processing double-layer sequences. Here, we use two nested `recurrent_group`s. The outer `recurrent_group` decomposes the paragraph into sentences, and its `step` function receives the sentence sequence as input; the inner `recurrent_group` decomposes each sentence into words, and its `step` function receives the non-sequential words as input.
-
-At the word level, we use a CNN to obtain the representation of a sentence from its word vectors. At the paragraph level, we obtain the representation of a paragraph from the representations of its sentences through pooling.
-
-``` python
-nest_group = paddle.layer.recurrent_group(input=[paddle.layer.SubsequenceInput(emb),
- hidden_size],
- step=cnn_cov_group)
-```
-
-After decomposition, each single-layer sequence is passed through a CNN to learn its vector representation. The CNN consists of the following parts:
-
-- **Convolution layer**: convolution in text classification is performed over the time dimension, with the kernel width equal to the width of the matrix produced by the word embedding layer. Each convolution produces a "feature map", and multiple feature maps can be obtained by using kernels of different heights. This code uses convolution kernels of size 3 (the red box in Figure 1) and 4 (the blue box in Figure 1) by default.
-- **Max pooling layer**: max pooling is applied to each feature map produced by convolution. Since a feature map is itself a vector, max pooling simply selects the largest element of each vector. All the largest elements are then concatenated into a new vector.
-- **Linear projection layer**: the max-pooled results of the different convolutions are concatenated into one long vector, which is passed through a linear projection to obtain the representation vector of the corresponding single-layer sequence.
-
-The CNN is implemented as follows:
-```python
-def cnn_cov_group(group_input, hidden_size):
- """
- Convolution group definition.
- :param group_input: The input of this layer.
- :type group_input: LayerOutput
-    :param hidden_size: The size of the fully connected layer.
- :type hidden_size: int
- """
- conv3 = paddle.networks.sequence_conv_pool(
- input=group_input, context_len=3, hidden_size=hidden_size)
- conv4 = paddle.networks.sequence_conv_pool(
- input=group_input, context_len=4, hidden_size=hidden_size)
-
- linear_proj = paddle.layer.fc(input=[conv3, conv4],
- size=hidden_size,
- param_attr=paddle.attr.ParamAttr(name='_cov_value_weight'),
- bias_attr=paddle.attr.ParamAttr(name='_cov_value_bias'),
- act=paddle.activation.Linear())
-
- return linear_proj
-```
-PaddlePaddle provides a ready-made text sequence convolution module with pooling, `paddle.networks.sequence_conv_pool`, which can be called directly.
-
-After obtaining the representation vector of each sentence, all the sentence vectors are passed through an average pooling layer to obtain a single vector representation for the sample. This vector then goes through a fully connected layer to output the final prediction. The code:
-```python
-avg_pool = paddle.layer.pooling(input=nest_group,
- pooling_type=paddle.pooling.Avg(),
- agg_level=paddle.layer.AggregateLevel.TO_NO_SEQUENCE)
-
-prob = paddle.layer.mixed(size=class_num,
- input=[paddle.layer.full_matrix_projection(input=avg_pool)],
- act=paddle.activation.Softmax())
-```
-## Install dependencies
-```bash
-pip install -r requirements.txt
-```
-
-## Specify training configuration parameters
-
-Training and model configuration parameters are modified through the `config.py` script, which contains detailed explanations of the configurable parameters. For example:
-```python
-class TrainerConfig(object):
-
- # whether to use GPU for training
- use_gpu = False
- # the number of threads used in one machine
- trainer_count = 1
-
- # train batch size
- batch_size = 32
-
- ...
-
-
-class ModelConfig(object):
-
- # embedding vector dimension
- emb_size = 28
-
- ...
-```
-Modify `config.py` to adjust the parameters. For example, we can specify whether or not to use the GPU for training by modifying `use_gpu`.
-## Running with PaddlePaddle built-in data
-
-### Train
-Execute at the terminal:
-```bash
-python train.py
-```
-This runs the example on `imdb`, PaddlePaddle's built-in sentiment classification dataset.
-### Prediction
-After training, the model will be stored in the specified directory (`models` by default). Execute the following command:
-
-```bash
-python infer.py --model_path 'models/params_pass_00000.tar.gz'
-```
-The prediction script loads the model trained for one pass and uses it to test the `imdb` test set.
-
-## Training and predicting with custom data
-
-### Train
-1. Data format
-
-Each line is one sample, with a class label and text separated by `\t`. The following are two samples:
-
-```
-positive This movie is very good. The actor is so handsome.
-negative What a terrible movie. I waste so much time.
-```
-
-2. Write the data reading interface
-
-To define a custom data reading interface, we only need to write a Python generator that implements the logic for **parsing the input text**. The following code fragment reads the raw data and returns types `paddle.data_type.integer_value_sub_sequence` and `paddle.data_type.integer_value`:
-```python
-def train_reader(data_dir, word_dict, label_dict):
- """
- Reader interface for training data
-
- :param data_dir: data directory
- :type data_dir: str
-    :param word_dict: the word dictionary;
-        the dictionary must have "UNK" in it.
-    :type word_dict: Python dict
-    :param label_dict: the label dictionary.
- :type label_dict: Python dict
- """
-
- def reader():
- UNK_ID = word_dict['']
- word_col = 1
- lbl_col = 0
-
- for file_name in os.listdir(data_dir):
- file_path = os.path.join(data_dir, file_name)
- if not os.path.isfile(file_path):
- continue
- with open(file_path, "r") as f:
- for line in f:
- line_split = line.strip().split("\t")
- doc = line_split[word_col]
- doc_ids = []
- for sent in doc.strip().split("."):
- sent_ids = [
- word_dict.get(w, UNK_ID)
- for w in sent.split()]
- if sent_ids:
- doc_ids.append(sent_ids)
-
- yield doc_ids, label_dict[line_split[lbl_col]]
-
- return reader
-```
-Note that in this example the English period `'.'` is used as the separator to split a piece of text into a number of sentences, and each sentence is represented as an array of indices into the word dictionary (`sent_ids`). Since the representation of the current sample (`doc_ids`) contains all the sentences of the text, its type is `paddle.data_type.integer_value_sub_sequence`.
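-
-For reference, these nested samples pair with the input declarations used by the network (see `network_conf.py` in this directory), which consume them as a sub-sequence and a plain integer label:
-
-```python
-data = paddle.layer.data(
-    "word", paddle.data_type.integer_value_sub_sequence(dict_dim))
-label = paddle.layer.data("label", paddle.data_type.integer_value(class_num))
-```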
-
-3. Specify command line parameters for training
-
-`train.py` contains the following parameters:
-```
-Options:
- --train_data_dir TEXT The path of training dataset (default: None). If
- this parameter is not set, imdb dataset will be
- used.
- --test_data_dir TEXT The path of testing dataset (default: None). If this
- parameter is not set, imdb dataset will be used.
- --word_dict_path TEXT The path of word dictionary (default: None). If this
- parameter is not set, imdb dataset will be used. If
- this parameter is set, but the file does not exist,
-                          word dictionary will be built from the training data
-                          automatically.
-  --label_dict_path TEXT  The path of label dictionary (default: None). If this
-                          parameter is not set, imdb dataset will be used. If
-                          this parameter is set, but the file does not exist,
-                          label dictionary will be built from the training data
- automatically.
- --model_save_dir TEXT The path to save the trained models (default:
- 'models').
- --help Show this message and exit.
-```
-
-Modify the startup parameters in the `train.py` script to run this example directly. Taking the sample data in the `data` directory as an example, execute at the terminal:
-```bash
-python train.py \
- --train_data_dir 'data/train_data' \
- --test_data_dir 'data/test_data' \
- --word_dict_path 'word_dict.txt' \
- --label_dict_path 'label_dict.txt'
-```
-This trains the model on the sample data.
-
-### Prediction
-
-1. Specify command line parameters
-
-`infer.py` contains the following parameters:
-
-```
-Options:
- --data_path TEXT The path of data for inference (default: None). If
- this parameter is not set, imdb test dataset will be
- used.
- --model_path TEXT The path of saved model. [required]
- --word_dict_path TEXT The path of word dictionary (default: None). If this
- parameter is not set, imdb dataset will be used.
-  --label_dict_path TEXT  The path of label dictionary (default: None). If this
- parameter is not set, imdb dataset will be used.
- --batch_size INTEGER The number of examples in one batch (default: 32).
- --help Show this message and exit.
-```
-
-2. Taking the sample data in the `data` directory as an example, execute at the terminal:
-```bash
-python infer.py \
- --data_path 'data/infer.txt' \
- --word_dict_path 'word_dict.txt' \
- --label_dict_path 'label_dict.txt' \
- --model_path 'models/params_pass_00000.tar.gz'
-```
-
-This runs prediction on the sample data.
diff --git a/legacy/nested_sequence/text_classification/config.py b/legacy/nested_sequence/text_classification/config.py
deleted file mode 100644
index 1a6e4681b158c5b2f757edc7da8156369fb9c7dc..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/config.py
+++ /dev/null
@@ -1,46 +0,0 @@
-__all__ = ["TrainerConfig", "ModelConfig"]
-
-
-class TrainerConfig(object):
-
- # Whether to use GPU in training or not.
- use_gpu = False
- # The number of computing threads.
- trainer_count = 1
-
- # The training batch size.
- batch_size = 32
-
- # The epoch number.
- num_passes = 10
-
- # The global learning rate.
- learning_rate = 1e-3
-
-    # The decay rate for L2 regularization.
- l2_learning_rate = 1e-3
-
-    # This parameter is used for the averaged SGD.
-    # Roughly average_window * (number of processed batches) parameters
-    # are used for averaging. To be accurate, between
-    # average_window * (number of processed batches) and
-    # 2 * average_window * (number of processed batches) parameters
-    # are used for averaging.
- average_window = 0.5
-
- # The buffer size of the data reader.
- # The number of buffer size samples will be shuffled in training.
- buf_size = 1000
-
- # The parameter is used to control logging period.
- # Training log will be printed every log_period.
- log_period = 100
-
-
-class ModelConfig(object):
-
- # The dimension of embedding vector.
- emb_size = 28
-
- # The hidden size of sentence vectors.
- hidden_size = 128
diff --git a/legacy/nested_sequence/text_classification/data/infer.txt b/legacy/nested_sequence/text_classification/data/infer.txt
deleted file mode 100644
index 8309d5c026d99a75464044a8f598d111547b6f6c..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/data/infer.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-I was overtaken by the emotion. Unforgettable rendering of a wartime story which is unknown to most people. The performances were faultless and outstanding.
-The original Vampires (1998) is one of my favorites. I was curious to see how a sequel would work considering they used none of the original characters. I was quite surprised at how this played out.
-Without question, the worst ELVIS film ever made. The movie portrays all Indians as drunk, stupid, and lazy. Watch ELVIS's skin change color throughout the film.
-I thought this movie was hysterical. I have watched it many times and recommend it highly. Mel Brooks, was excellent. The cast was fantastic..I don't understand how this movie gets a 2 out of 5 rating. I loved it.
\ No newline at end of file
diff --git a/legacy/nested_sequence/text_classification/data/test_data/test.txt b/legacy/nested_sequence/text_classification/data/test_data/test.txt
deleted file mode 100644
index d162dbbeba72db6ecd62c24844a2d6bb5c655382..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/data/test_data/test.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-positive I liked the film. Some of the action scenes were very interesting, tense and well done. I especially liked the opening scene which had a semi truck in it. Also the film is funny is several parts. I'd give the film an 8 out of 10.
-negative The plot for Descent, if it actually can be called a plot, has two noteworthy events. One near the beginning - one at the end. Together these events make up maybe 5% of the total movie time. Everything (and I mean _everything_) in between is basically the director's desperate effort to fill in the minutes.
-negative This film lacked something I couldn't put my finger on at first: charisma on the part of the leading actress. This inevitably translated to lack of chemistry when she shared the screen with her leading man. Even the romantic scenes came across as being merely the actors at play.
-negative I read the book a long time back and don't specifically remember the plot but do remember that I enjoyed it. Since I'm home sick on the couch it seemed like a good idea and Hey !! It is a Lifetime movie. The movie is populated with grade B actors and actresses. The female cast is right out of Desperate Housewives.
\ No newline at end of file
diff --git a/legacy/nested_sequence/text_classification/data/train_data/train.txt b/legacy/nested_sequence/text_classification/data/train_data/train.txt
deleted file mode 100644
index 4f392593bfee708ff839fb8739b4d5c660f26339..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/data/train_data/train.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-negative It was a Sunday night and I was waiting for the advertised movie on TV. They said it was a comedy! The movie started, 10 minutes passed, after that 30 minutes and I didn't laugh not even once. The fact is that the movie ended and I didn't get even on echance to laugh.
-negative I saw this piece of garbage on AMC last night, and wonder how it could be considered in any way an American Movie Classic. It was awful in every way. How badly did Jack Lemmon, James Stewart and the rest of the cast need cash that they would even consider doing this movie?
-positive its not as good as the first movie,but its a good solid movie its has good car chase scenes,on the remake of this movie there a story for are hero to drive fast as his trying to rush to the side of his ailing wife,the ending is great just a good fair movie to watch in my opinion.
-positive Rosalind Russell executes a power-house performance as Rosie Lord, a very wealthy woman with greedy heirs. With an Auntie Mame-type character, this actress can never go wrong. Her very-real terror at being in an insane assylum is a wonderful piece of acting. Everyone should watch this.
\ No newline at end of file
diff --git a/legacy/nested_sequence/text_classification/images/model.jpg b/legacy/nested_sequence/text_classification/images/model.jpg
deleted file mode 100644
index 4f63d8b55380f4210c2acf98251c8a4ce75a7efc..0000000000000000000000000000000000000000
Binary files a/legacy/nested_sequence/text_classification/images/model.jpg and /dev/null differ
diff --git a/legacy/nested_sequence/text_classification/infer.py b/legacy/nested_sequence/text_classification/infer.py
deleted file mode 100644
index 461eba4935c375c95d9c9570ed487553090ba461..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/infer.py
+++ /dev/null
@@ -1,106 +0,0 @@
-import sys
-import os
-import gzip
-import click
-
-import paddle.v2 as paddle
-
-import reader
-from network_conf import nested_net
-from utils import logger, load_dict, load_reverse_dict
-
-
-@click.command('infer')
-@click.option(
- "--data_path",
- default=None,
- help=("The path of data for inference (default: None). "
- "If this parameter is not set, "
- "imdb test dataset will be used."))
-@click.option(
- "--model_path", type=str, required=True, help="The path of saved model.")
-@click.option(
- "--word_dict_path",
- type=str,
- default=None,
- help=("The path of word dictionary (default: None). "
- "If this parameter is not set, imdb dataset will be used."))
-@click.option(
- "--label_dict_path",
- type=str,
- default=None,
- help=("The path of label dictionary (default: None)."
- "If this parameter is not set, imdb dataset will be used. "))
-@click.option(
- "--batch_size",
- type=int,
- default=32,
- help="The number of examples in one batch (default: 32).")
-def infer(data_path, model_path, word_dict_path, batch_size, label_dict_path):
- def _infer_a_batch(inferer, test_batch, ids_2_word, ids_2_label):
- probs = inferer.infer(input=test_batch, field=["value"])
- assert len(probs) == len(test_batch)
- for word_ids, prob in zip(test_batch, probs):
- sent_ids = []
- for sent in word_ids[0]:
- sent_ids.extend(sent)
- word_text = " ".join([ids_2_word[id] for id in sent_ids])
- print("%s\t%s\t%s" % (ids_2_label[prob.argmax()],
- " ".join(["{:0.4f}".format(p)
- for p in prob]), word_text))
-
- assert os.path.exists(model_path), "The trained model does not exist."
- logger.info("Begin to predict...")
- use_default_data = (data_path is None)
-
- if use_default_data:
- word_dict = reader.imdb_word_dict()
- word_reverse_dict = dict((value, key)
- for key, value in word_dict.iteritems())
-
- # The reversed label dict of the imdb dataset
- label_reverse_dict = {0: "positive", 1: "negative"}
- test_reader = reader.imdb_test(word_dict)
- class_num = 2
- else:
- assert os.path.exists(
- word_dict_path), "The word dictionary file does not exist"
- assert os.path.exists(
- label_dict_path), "The label dictionary file does not exist"
-
- word_dict = load_dict(word_dict_path)
- word_reverse_dict = dict((value, key)
- for key, value in word_dict.iteritems())
- label_reverse_dict = load_reverse_dict(label_dict_path)
- class_num = len(label_reverse_dict)
- test_reader = reader.infer_reader(data_path, word_dict)()
-
- dict_dim = len(word_dict)
-
- # initialize PaddlePaddle.
- paddle.init(use_gpu=False, trainer_count=1)
-
- prob_layer = nested_net(dict_dim, class_num, is_infer=True)
-
- # load the trained models.
- parameters = paddle.parameters.Parameters.from_tar(
- gzip.open(model_path, "r"))
- inferer = paddle.inference.Inference(
- output_layer=prob_layer, parameters=parameters)
-
- test_batch = []
- for idx, item in enumerate(test_reader):
- test_batch.append([item[0]])
- if len(test_batch) == batch_size:
- _infer_a_batch(inferer, test_batch, word_reverse_dict,
- label_reverse_dict)
- test_batch = []
-
- if len(test_batch):
- _infer_a_batch(inferer, test_batch, word_reverse_dict,
- label_reverse_dict)
- test_batch = []
-
-
-if __name__ == "__main__":
- infer()
diff --git a/legacy/nested_sequence/text_classification/network_conf.py b/legacy/nested_sequence/text_classification/network_conf.py
deleted file mode 100644
index 06cbe3cbc4513f75ab37be382d7bfe967c4dad79..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/network_conf.py
+++ /dev/null
@@ -1,61 +0,0 @@
-import paddle.v2 as paddle
-from config import ModelConfig as conf
-
-
-def cnn_cov_group(group_input, hidden_size):
- """
- Convolution group definition.
- :param group_input: The input of this layer.
- :type group_input: LayerOutput
-    :param hidden_size: The size of the fully connected layer.
- :type hidden_size: int
- """
- conv3 = paddle.networks.sequence_conv_pool(
- input=group_input, context_len=3, hidden_size=hidden_size)
- conv4 = paddle.networks.sequence_conv_pool(
- input=group_input, context_len=4, hidden_size=hidden_size)
-
- fc_param_attr = paddle.attr.ParamAttr(name='_cov_value_weight')
- fc_bias_attr = paddle.attr.ParamAttr(name='_cov_value_bias')
- linear_proj = paddle.layer.fc(input=[conv3, conv4],
- size=hidden_size,
- param_attr=[fc_param_attr, fc_param_attr],
- bias_attr=fc_bias_attr,
- act=paddle.activation.Linear())
-
- return linear_proj
-
-
-def nested_net(dict_dim, class_num, is_infer=False):
- """
- Nested network definition.
- :param dict_dim: Size of word dictionary.
- :type dict_dim: int
-    :param class_num: Number of instance classes.
-    :type class_num: int
-    :param is_infer: The boolean parameter indicating
-        whether the network is built for inference or training.
-    :type is_infer: bool
- """
- data = paddle.layer.data(
- "word", paddle.data_type.integer_value_sub_sequence(dict_dim))
-
- emb = paddle.layer.embedding(input=data, size=conf.emb_size)
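-    # The outer recurrent_group iterates over the sub-sequences (sentences)
-    # of each sample; cnn_cov_group then encodes every sentence from its
-    # word embeddings.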
- nest_group = paddle.layer.recurrent_group(
- input=[paddle.layer.SubsequenceInput(emb), conf.hidden_size],
- step=cnn_cov_group)
- avg_pool = paddle.layer.pooling(
- input=nest_group,
- pooling_type=paddle.pooling.Avg(),
- agg_level=paddle.layer.AggregateLevel.TO_NO_SEQUENCE)
- prob = paddle.layer.mixed(
- size=class_num,
- input=[paddle.layer.full_matrix_projection(input=avg_pool)],
- act=paddle.activation.Softmax())
-    if not is_infer:
- label = paddle.layer.data("label",
- paddle.data_type.integer_value(class_num))
- cost = paddle.layer.classification_cost(input=prob, label=label)
- return cost, prob, label
-
- return prob
diff --git a/legacy/nested_sequence/text_classification/reader.py b/legacy/nested_sequence/text_classification/reader.py
deleted file mode 100644
index a437422dd3b9f966e2c456770d2c849e710eda2a..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/reader.py
+++ /dev/null
@@ -1,227 +0,0 @@
-"""
-IMDB dataset.
-
-This module downloads IMDB dataset from
-http://ai.stanford.edu/%7Eamaas/data/sentiment/. This dataset contains a set
-of 25,000 highly polar movie reviews for training, and 25,000 for testing.
-Besides, this module also provides API for building dictionary.
-"""
-import collections
-import tarfile
-import Queue
-import re
-import string
-import threading
-import os
-
-import paddle.v2.dataset.common
-
-URL = 'http://ai.stanford.edu/%7Eamaas/data/sentiment/aclImdb_v1.tar.gz'
-MD5 = '7c2ac02c03563afcf9b574c7e56c153a'
-
-
-def tokenize(pattern):
- """
- Read files that match the given pattern. Tokenize and yield each file.
- """
- with tarfile.open(paddle.v2.dataset.common.download(URL, 'imdb',
- MD5)) as tarf:
- tf = tarf.next()
-        while tf is not None:
- if bool(pattern.match(tf.name)):
- # newline and punctuations removal and ad-hoc tokenization.
- docs = tarf.extractfile(tf).read().rstrip("\n\r").lower().split(
- '.')
- doc_list = []
- for doc in docs:
- doc = doc.strip()
- if doc:
- doc_without_punc = doc.translate(
- None, string.punctuation).strip()
- if doc_without_punc:
- doc_list.append(
- [word for word in doc_without_punc.split()])
- yield doc_list
- tf = tarf.next()
-
-
-def imdb_build_dict(pattern, cutoff):
- """
- Build a word dictionary from the corpus. Keys of the dictionary are words,
- and values are zero-based IDs of these words.
- """
- word_freq = collections.defaultdict(int)
- for doc_list in tokenize(pattern):
- for doc in doc_list:
- for word in doc:
- word_freq[word] += 1
-
- word_freq[''] = cutoff + 1
- word_freq = filter(lambda x: x[1] > cutoff, word_freq.items())
- dictionary = sorted(word_freq, key=lambda x: (-x[1], x[0]))
- words, _ = list(zip(*dictionary))
- word_idx = dict(zip(words, xrange(len(words))))
- return word_idx
-
-
-def reader_creator(pos_pattern, neg_pattern, word_idx, buffer_size):
- UNK = word_idx['']
-
- qs = [Queue.Queue(maxsize=buffer_size), Queue.Queue(maxsize=buffer_size)]
-
- def load(pattern, queue):
- for doc_list in tokenize(pattern):
- queue.put(doc_list)
- queue.put(None)
-
- def reader():
- # Creates two threads that loads positive and negative samples
- # into qs.
- t0 = threading.Thread(
- target=load, args=(
- pos_pattern,
- qs[0], ))
- t0.daemon = True
- t0.start()
-
- t1 = threading.Thread(
- target=load, args=(
- neg_pattern,
- qs[1], ))
- t1.daemon = True
- t1.start()
-
- # Read alternatively from qs[0] and qs[1].
- i = 0
- doc_list = qs[i].get()
-
-        while doc_list is not None:
- ids_list = []
- for doc in doc_list:
- ids_list.append([word_idx.get(w, UNK) for w in doc])
- yield ids_list, i % 2
- i += 1
- doc_list = qs[i % 2].get()
-
- # If any queue is empty, reads from the other queue.
- i += 1
- doc_list = qs[i % 2].get()
-        while doc_list is not None:
- ids_list = []
- for doc in doc_list:
- ids_list.append([word_idx.get(w, UNK) for w in doc])
- yield ids_list, i % 2
- doc_list = qs[i % 2].get()
-
- return reader()
-
-
-def imdb_train(word_idx):
- """
- IMDB training set creator.
-
-    It returns a reader creator; each sample in the reader is a zero-based ID
- subsequence and label in [0, 1].
-
- :param word_idx: word dictionary
- :type word_idx: dict
- :return: Training reader creator
- :rtype: callable
- """
- return reader_creator(
- re.compile("aclImdb/train/pos/.*\.txt$"),
- re.compile("aclImdb/train/neg/.*\.txt$"), word_idx, 1000)
-
-
-def imdb_test(word_idx):
- """
- IMDB test set creator.
-
-    It returns a reader creator; each sample in the reader is a zero-based ID
- subsequence and label in [0, 1].
-
- :param word_idx: word dictionary
- :type word_idx: dict
- :return: Test reader creator
- :rtype: callable
- """
- return reader_creator(
- re.compile("aclImdb/test/pos/.*\.txt$"),
- re.compile("aclImdb/test/neg/.*\.txt$"), word_idx, 1000)
-
-
-def imdb_word_dict():
- """
- Build a word dictionary from the corpus.
-
- :return: Word dictionary
- :rtype: dict
- """
- return imdb_build_dict(
- re.compile("aclImdb/((train)|(test))/((pos)|(neg))/.*\.txt$"), 150)
-
-
-def train_reader(data_dir, word_dict, label_dict):
- """
- Reader interface for training data
-
- :param data_dir: data directory
- :type data_dir: str
-    :param word_dict: the word dictionary;
-        the dictionary must have "UNK" in it.
-    :type word_dict: Python dict
-    :param label_dict: the label dictionary.
- :type label_dict: Python dict
- """
-
- def reader():
- UNK_ID = word_dict['']
- word_col = 1
- lbl_col = 0
-
- for file_name in os.listdir(data_dir):
- file_path = os.path.join(data_dir, file_name)
- if not os.path.isfile(file_path):
- continue
- with open(file_path, "r") as f:
- for line in f:
- line_split = line.strip().split("\t")
- doc = line_split[word_col]
- doc_ids = []
- for sent in doc.strip().split("."):
- sent_ids = [
- word_dict.get(w, UNK_ID) for w in sent.split()
- ]
- if sent_ids:
- doc_ids.append(sent_ids)
-
- yield doc_ids, label_dict[line_split[lbl_col]]
-
- return reader
-
-
-def infer_reader(file_path, word_dict):
- """
- Reader interface for prediction
-
-    :param file_path: path of the file holding the prediction data.
-    :type file_path: str
-    :param word_dict: the word dictionary;
-        the dictionary must have "UNK" in it.
-    :type word_dict: Python dict
- """
-
- def reader():
- UNK_ID = word_dict['']
-
- with open(file_path, "r") as f:
- for doc in f:
- doc_ids = []
- for sent in doc.strip().split("."):
- sent_ids = [word_dict.get(w, UNK_ID) for w in sent.split()]
- if sent_ids:
- doc_ids.append(sent_ids)
-
- yield doc_ids, doc
-
- return reader
diff --git a/legacy/nested_sequence/text_classification/requirements.txt b/legacy/nested_sequence/text_classification/requirements.txt
deleted file mode 100644
index dca9a909647e3b066931de2909c2d1e65c78c995..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/requirements.txt
+++ /dev/null
@@ -1 +0,0 @@
-click
diff --git a/legacy/nested_sequence/text_classification/train.py b/legacy/nested_sequence/text_classification/train.py
deleted file mode 100644
index 20ad59dd0d55d817c37fa415049d0a8d67e98bb0..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/train.py
+++ /dev/null
@@ -1,195 +0,0 @@
-import os
-import sys
-import gzip
-import click
-
-import paddle.v2 as paddle
-
-import reader
-from network_conf import nested_net
-from utils import build_word_dict, build_label_dict, load_dict, logger
-from config import TrainerConfig as conf
-
-
-@click.command('train')
-@click.option(
- "--train_data_dir",
- default=None,
- help=("The path of training dataset (default: None). "
- "If this parameter is not set, "
- "imdb dataset will be used."))
-@click.option(
- "--test_data_dir",
- default=None,
- help=("The path of testing dataset (default: None). "
- "If this parameter is not set, "
- "imdb dataset will be used."))
-@click.option(
- "--word_dict_path",
- type=str,
- default=None,
- help=("The path of word dictionary (default: None). "
- "If this parameter is not set, imdb dataset will be used. "
- "If this parameter is set, but the file does not exist, "
- "word dictionay will be built from "
- "the training data automatically."))
-@click.option(
- "--label_dict_path",
- type=str,
- default=None,
- help=("The path of label dictionary (default: None). "
- "If this parameter is not set, imdb dataset will be used. "
- "If this parameter is set, but the file does not exist, "
- "label dictionay will be built from "
- "the training data automatically."))
-@click.option(
- "--model_save_dir",
- type=str,
- default="models",
- help="The path to save the trained models (default: 'models').")
-def train(train_data_dir, test_data_dir, word_dict_path, label_dict_path,
- model_save_dir):
- """
-    :param train_data_dir: The path of training data; if this parameter
-        is not specified, the imdb dataset will be used to run this example.
-    :type train_data_dir: str
-    :param test_data_dir: The path of testing data; if this parameter
-        is not specified, the imdb dataset will be used to run this example.
-    :type test_data_dir: str
-    :param word_dict_path: The path of the word dictionary; if this parameter
-        is not specified, the imdb dataset will be used to run this example.
-    :type word_dict_path: str
-    :param label_dict_path: The path of the label dictionary; if this
-        parameter is not specified, the imdb dataset will be used to run
-        this example.
-    :type label_dict_path: str
-    :param model_save_dir: The directory where trained models are saved.
-    :type model_save_dir: str
- """
- if train_data_dir is not None:
-        assert word_dict_path and label_dict_path, (
-            "The parameters train_data_dir, word_dict_path and "
-            "label_dict_path should be set at the same time.")
-
- if not os.path.exists(model_save_dir):
- os.mkdir(model_save_dir)
-
- use_default_data = (train_data_dir is None)
-
- if use_default_data:
- logger.info(("No training data are porivided, "
- "use imdb to train the model."))
- logger.info("Please wait to build the word dictionary ...")
-
- word_dict = reader.imdb_word_dict()
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- lambda: reader.imdb_train(word_dict), buf_size=1000),
- batch_size=100)
- test_reader = paddle.batch(
- lambda: reader.imdb_test(word_dict), batch_size=100)
- class_num = 2
- else:
- if word_dict_path is None or not os.path.exists(word_dict_path):
- logger.info(("Word dictionary is not given, the dictionary "
- "is automatically built from the training data."))
-
- # build the word dictionary to map the original string-typed
- # words into integer-typed index
- build_word_dict(
- data_dir=train_data_dir,
- save_path=word_dict_path,
- use_col=1,
- cutoff_fre=0)
-
- if not os.path.exists(label_dict_path):
- logger.info(("Label dictionary is not given, the dictionary "
- "is automatically built from the training data."))
- # build the label dictionary to map the original string-typed
- # label into integer-typed index
- build_label_dict(
- data_dir=train_data_dir, save_path=label_dict_path, use_col=0)
-
- word_dict = load_dict(word_dict_path)
- label_dict = load_dict(label_dict_path)
-
- class_num = len(label_dict)
- logger.info("Class number is : %d." % class_num)
-
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.train_reader(train_data_dir, word_dict, label_dict),
- buf_size=conf.buf_size),
- batch_size=conf.batch_size)
-
- if test_data_dir is not None:
- # here, because training and testing data share a same format,
- # we still use the reader.train_reader to read the testing data.
- test_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.train_reader(test_data_dir, word_dict, label_dict),
- buf_size=conf.buf_size),
- batch_size=conf.batch_size)
- else:
- test_reader = None
-
- dict_dim = len(word_dict)
-
- logger.info("Length of word dictionary is : %d." % (dict_dim))
-
- paddle.init(use_gpu=conf.use_gpu, trainer_count=conf.trainer_count)
-
- # create optimizer
- adam_optimizer = paddle.optimizer.Adam(
- learning_rate=conf.learning_rate,
- regularization=paddle.optimizer.L2Regularization(
- rate=conf.l2_learning_rate),
- model_average=paddle.optimizer.ModelAverage(
- average_window=conf.average_window))
-
- # define network topology.
- cost, prob, label = nested_net(dict_dim, class_num, is_infer=False)
-
- # create all the trainable parameters.
- parameters = paddle.parameters.create(cost)
-
- # create the trainer instance.
- trainer = paddle.trainer.SGD(
- cost=cost,
- extra_layers=paddle.evaluator.auc(input=prob, label=label),
- parameters=parameters,
- update_equation=adam_optimizer)
-
- # feeding dictionary
- feeding = {"word": 0, "label": 1}
-
- def _event_handler(event):
- """
- Define the end batch and the end pass event handler.
- """
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % conf.log_period == 0:
- logger.info("Pass %d, Batch %d, Cost %f, %s\n" % (
- event.pass_id, event.batch_id, event.cost, event.metrics))
-
- if isinstance(event, paddle.event.EndPass):
- if test_reader is not None:
- result = trainer.test(reader=test_reader, feeding=feeding)
- logger.info("Test at Pass %d, %s \n" % (event.pass_id,
- result.metrics))
- with gzip.open(
- os.path.join(model_save_dir, "params_pass_%05d.tar.gz" %
- event.pass_id), "w") as f:
- trainer.save_parameter_to_tar(f)
-
- # begin training network
- trainer.train(
- reader=train_reader,
- event_handler=_event_handler,
- feeding=feeding,
- num_passes=conf.num_passes)
-
- logger.info("Training has finished.")
-
-
-if __name__ == "__main__":
- train()
diff --git a/legacy/nested_sequence/text_classification/utils.py b/legacy/nested_sequence/text_classification/utils.py
deleted file mode 100644
index 1535e31f46e05b331771ffe0971c1f105088e2c4..0000000000000000000000000000000000000000
--- a/legacy/nested_sequence/text_classification/utils.py
+++ /dev/null
@@ -1,95 +0,0 @@
-import os
-import logging
-from collections import defaultdict
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def build_word_dict(data_dir, save_path, use_col=1, cutoff_fre=1):
- """
- Build word dictionary from training data.
- :param data_dir: The directory of training dataset.
- :type data_dir: str
-    :param save_path: The path where the word dictionary will be saved.
-    :type save_path: str
-    :param use_col: The column index of the text after splitting a line.
-    :type use_col: int
-    :param cutoff_fre: A word will not be added to the dictionary if its
-        frequency is less than cutoff_fre.
-    :type cutoff_fre: int
- """
- values = defaultdict(int)
-
- for file_name in os.listdir(data_dir):
- file_path = os.path.join(data_dir, file_name)
- if not os.path.isfile(file_path):
- continue
- with open(file_path, "r") as fdata:
- for line in fdata:
- line_splits = line.strip().split("\t")
- if len(line_splits) < use_col:
- continue
- doc = line_splits[use_col]
- for sent in doc.strip().split("."):
- for w in sent.split():
- values[w] += 1
-
- values[''] = cutoff_fre
- with open(save_path, "w") as f:
- for v, count in sorted(
- values.iteritems(), key=lambda x: x[1], reverse=True):
- if count < cutoff_fre:
- break
- f.write("%s\t%d\n" % (v, count))
-
-
-def build_label_dict(data_dir, save_path, use_col=0):
- """
- Build label dictionary from training data.
- :param data_dir: The directory of training dataset.
- :type data_dir: str
-    :param save_path: The path where the label dictionary will be saved.
-    :type save_path: str
-    :param use_col: The column index of the label after splitting a line.
-    :type use_col: int
- """
- values = defaultdict(int)
-
- for file_name in os.listdir(data_dir):
- file_path = os.path.join(data_dir, file_name)
- if not os.path.isfile(file_path):
- continue
- with open(file_path, "r") as fdata:
- for line in fdata:
- line_splits = line.strip().split("\t")
- if len(line_splits) < use_col:
- continue
- values[line_splits[use_col]] += 1
-
- with open(save_path, "w") as f:
- for v, count in sorted(
- values.iteritems(), key=lambda x: x[1], reverse=True):
- f.write("%s\t%d\n" % (v, count))
-
-
-def load_dict(dict_path):
- """
- Load word dictionary from dictionary path.
- :param dict_path: The path of word dictionary.
-    :type dict_path: str
- """
- return dict((line.strip().split("\t")[0], idx)
- for idx, line in enumerate(open(dict_path, "r").readlines()))
-
-
-def load_reverse_dict(dict_path):
- """
- Load the reversed word dictionary from dictionary path.
- Index of each word is saved in key of the dictionary and the
- corresponding word saved in value of the dictionary.
- :param dict_path: The path of word dictionary.
-    :type dict_path: str
- """
- return dict((idx, line.strip().split("\t")[0])
- for idx, line in enumerate(open(dict_path, "r").readlines()))
diff --git a/legacy/neural_qa/.gitignore b/legacy/neural_qa/.gitignore
deleted file mode 100644
index 31ddc7f7df3074e948dab2369ea21d6ecaa55e9c..0000000000000000000000000000000000000000
--- a/legacy/neural_qa/.gitignore
+++ /dev/null
@@ -1,5 +0,0 @@
-*.DS_Store
-*.pyc
-data
-models/*.tar.gz
-pre-trained-models/*.tar.gz
diff --git a/legacy/neural_qa/README.md b/legacy/neural_qa/README.md
deleted file mode 100644
index a19d7020679ac0dfee44e3c7a65ebef05057507a..0000000000000000000000000000000000000000
--- a/legacy/neural_qa/README.md
+++ /dev/null
@@ -1,128 +0,0 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is v0.10.0. If you are on a version of PaddlePaddle earlier than v0.10.0, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
-
----
-
-# Neural Recurrent Sequence Labeling Model for Open-Domain Factoid Question Answering
-
-This model implements the work in the following paper:
-
-Peng Li, Wei Li, Zhengyan He, Xuguang Wang, Ying Cao, Jie Zhou, and Wei Xu. Dataset and Neural Recurrent Sequence Labeling Model for Open-Domain Factoid Question Answering. [arXiv:1607.06275](https://arxiv.org/abs/1607.06275).
-
-If you use the dataset/code in your research, please cite the above paper:
-
-```text
-@article{li:2016:arxiv,
- author = {Li, Peng and Li, Wei and He, Zhengyan and Wang, Xuguang and Cao, Ying and Zhou, Jie and Xu, Wei},
- title = {Dataset and Neural Recurrent Sequence Labeling Model for Open-Domain Factoid Question Answering},
- journal = {arXiv:1607.06275v2},
- year = {2016},
- url = {https://arxiv.org/abs/1607.06275v2},
-}
-```
-
-
-## Installation
-
-1. Install PaddlePaddle v0.10.5 by the following command. Note that v0.10.0 is not supported.
- ```bash
- # either one is OK
- # CPU
- pip install paddlepaddle
- # GPU
- pip install paddlepaddle-gpu
- ```
-2. Download the [WebQA](http://idl.baidu.com/WebQA.html) dataset by running
- ```bash
- cd data && ./download.sh && cd ..
- ```
-
-## Hyperparameters
-
-All the hyperparameters are defined in `config.py`. The default values are aligned with the paper.
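-
-As a minimal sketch of how the configuration behaves (assuming it is run from this example's root directory so `config` is importable), the object exposes both the raw values and a few derived properties:
-
-```python
-from config import TrainingConfig
-
-conf = TrainingConfig()
-print(conf.batch_size)       # 120 by default
-print(conf.label_num)        # 4, since label_schema defaults to "BIO2"
-print(conf.default_l2_rate)  # 8e-4 * batch_size / 6, recomputed on access
-```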
-
-## Training
-
-Training can be launched using the following command:
-
-```bash
-PYTHONPATH=data/evaluation:$PYTHONPATH python train.py 2>&1 | tee train.log
-```
-## Validation and Test
-
-WebQA provides two versions of the validation and test sets. Automatic validation and testing can be launched by
-
-```bash
-PYTHONPATH=data/evaluation:$PYTHONPATH python val_and_test.py models [ann|ir]
-```
-
-where
-
-* `models`: the directory where model files are stored. You can use `models` if `config.py` is not changed.
-* `ann`: using the validation and test sets with annotated evidence.
-* `ir`: using the validation and test sets with retrieved evidence.
-
-Note that validation and testing can run simultaneously with training; `val_and_test.py` will handle the related synchronization problems.
-
-Intermediate results are stored in the directory `tmp`. You can delete them safely after validation and test.
-
-The results should be comparable with those shown in Table 3 in the paper.
-
-## Inferring using a Trained Model
-
-Infer using a trained model by running:
-```bash
-PYTHONPATH=data/evaluation:$PYTHONPATH python infer.py \
- MODEL_FILE \
- INPUT_DATA \
- OUTPUT_FILE \
- 2>&1 | tee infer.log
-```
-
-where
-
-* `MODEL_FILE`: a trained model produced by `train.py`.
-* `INPUT_DATA`: input data in the same format as the validation/test sets of the WebQA dataset.
-* `OUTPUT_FILE`: results in the format specified in the WebQA dataset for the evaluation scripts.
-
-## Pre-trained Models
-
-We have provided two pre-trained models, one for the validation and test sets with annotated evidence, and one for those with retrieved evidence. These two models are selected according to the performance on the corresponding version of validation set, which is consistent with the paper.
-
-The models can be downloaded with
-```bash
-cd pre-trained-models && ./download-models.sh && cd ..
-```
-
-The evaluation result on the test set with annotated evidence can be achieved by
-
-```bash
-PYTHONPATH=data/evaluation:$PYTHONPATH python infer.py \
- pre-trained-models/params_pass_00010.tar.gz \
- data/data/test.ann.json.gz \
- test.ann.output.txt.gz
-
-PYTHONPATH=data/evaluation:$PYTHONPATH \
- python data/evaluation/evaluate-tagging-result.py \
- test.ann.output.txt.gz \
- data/data/test.ann.json.gz \
- --fuzzy --schema BIO2
-# The result should be
-# chunk_f1=0.739091 chunk_precision=0.686119 chunk_recall=0.800926 true_chunks=3024 result_chunks=3530 correct_chunks=2422
-```
-
-And the evaluation result on the test set with retrieved evidence can be reproduced by
-
-```bash
-PYTHONPATH=data/evaluation:$PYTHONPATH python infer.py \
- pre-trained-models/params_pass_00021.tar.gz \
- data/data/test.ir.json.gz \
- test.ir.output.txt.gz
-
-PYTHONPATH=data/evaluation:$PYTHONPATH \
- python data/evaluation/evaluate-voting-result.py \
- test.ir.output.txt.gz \
- data/data/test.ir.json.gz \
- --fuzzy --schema BIO2
-# The result should be
-# chunk_f1=0.749358 chunk_precision=0.727868 chunk_recall=0.772156 true_chunks=3024 result_chunks=3208 correct_chunks=2335
-```
diff --git a/legacy/neural_qa/config.py b/legacy/neural_qa/config.py
deleted file mode 100644
index b6c457a49933dd9a2ae75363fd48b43832d962e1..0000000000000000000000000000000000000000
--- a/legacy/neural_qa/config.py
+++ /dev/null
@@ -1,112 +0,0 @@
-import math
-
-__all__ = ["TrainingConfig", "InferConfig"]
-
-
-class CommonConfig(object):
- def __init__(self):
- # network size:
- # dimension of the question LSTM
- self.q_lstm_dim = 64
- # dimension of the attention layer
- self.latent_chain_dim = 64
- # dimension of the evidence LSTMs
- self.e_lstm_dim = 64
- # dimension of the qe.comm and ee.comm feature embeddings
- self.com_vec_dim = 2
- self.drop_rate = 0.05
-
- # CRF:
- # valid values are BIO and BIO2
- self.label_schema = "BIO2"
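-        # (BIO uses 3 labels; BIO2 appears to split the O tag into O1/O2
-        # around the answer span, giving 4 labels -- see label_num below
-        # and the label maps in reader.py)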
-
- # word embedding:
- # vocabulary file path
- self.word_dict_path = "data/embedding/wordvecs.vcb"
- # word embedding file path
- self.wordvecs_path = "data/embedding/wordvecs.txt"
- self.word_vec_dim = 64
-
- # saving model & logs:
- # dir for saving models
- self.model_save_dir = "models"
-
- # print training info every log_period batches
- self.log_period = 100
- # show parameter status every show_parameter_status_period batches
- self.show_parameter_status_period = 100
-
- @property
- def label_num(self):
- if self.label_schema == "BIO":
- return 3
- elif self.label_schema == "BIO2":
- return 4
- else:
- raise ValueError("wrong value for label_schema")
-
- @property
- def default_init_std(self):
- return 1 / math.sqrt(self.e_lstm_dim * 4)
-
- @property
- def default_l2_rate(self):
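-        # scales linearly with batch size (equals 8e-4 at batch_size == 6)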
- return 8e-4 * self.batch_size / 6
-
- @property
- def dict_dim(self):
- return len(self.vocab)
-
-
-class TrainingConfig(CommonConfig):
- def __init__(self):
- super(TrainingConfig, self).__init__()
-
- # data:
- # training data path
- self.train_data_path = "data/data/training.json.gz"
-
- # number of batches used in each pass
- self.batches_per_pass = 1000
- # number of passes to train
- self.num_passes = 25
- # batch size
- self.batch_size = 120
-
- # the ratio of negative samples used in training
- self.negative_sample_ratio = 0.2
- # the ratio of negative samples that contain golden answer string
- self.hit_ans_negative_sample_ratio = 0.25
-
- # keep only first B in golden labels
- self.keep_first_b = False
-
- # use GPU to train the model
- self.use_gpu = False
- # number of threads
- self.trainer_count = 1
-
- # random seeds:
- # data reader random seed, 0 for random seed
- self.seed = 0
- # paddle random seed, 0 for random seed
- self.paddle_seed = 0
-
- # optimizer:
- self.learning_rate = 1e-3
- # rmsprop
- self.rho = 0.95
- self.epsilon = 1e-4
- # model average
- self.average_window = 0.5
- self.max_average_window = 10000
-
-
-class InferConfig(CommonConfig):
- def __init__(self):
- super(InferConfig, self).__init__()
-
- self.use_gpu = False
- self.trainer_count = 1
- self.batch_size = 120
- self.wordvecs = None
diff --git a/legacy/neural_qa/infer.py b/legacy/neural_qa/infer.py
deleted file mode 100644
index 14bda05a54c24a38e347be91aa4dc84c71f82887..0000000000000000000000000000000000000000
--- a/legacy/neural_qa/infer.py
+++ /dev/null
@@ -1,82 +0,0 @@
-import os
-import sys
-import argparse
-
-import paddle.v2 as paddle
-
-import reader
-import utils
-import network
-import config
-
-from utils import logger
-
-
-class Infer(object):
- def __init__(self, conf):
- self.conf = conf
-
- self.settings = reader.Settings(
- vocab=conf.vocab, is_training=False, label_schema=conf.label_schema)
-
- # init paddle
- # TODO(lipeng17) v2 API does not support parallel_nn yet. Therefore, we
- # can only use CPU currently
- paddle.init(use_gpu=conf.use_gpu, trainer_count=conf.trainer_count)
-
- # define network
- self.tags_layer = network.inference_net(conf)
-
- def infer(self, model_path, data_path, output):
- test_reader = paddle.batch(
- paddle.reader.buffered(
- reader.create_reader(data_path, self.settings),
- size=self.conf.batch_size * 1000),
- batch_size=self.conf.batch_size)
-
- # load the trained models
- parameters = paddle.parameters.Parameters.from_tar(
- utils.open_file(model_path, "r"))
- inferer = paddle.inference.Inference(
- output_layer=self.tags_layer, parameters=parameters)
-
- def count_evi_ids(test_batch):
- num = 0
- for sample in test_batch:
- num += len(sample[reader.E_IDS])
- return num
-
- for test_batch in test_reader():
- tags = inferer.infer(
- input=test_batch, field=["id"], feeding=network.feeding)
- evi_ids_num = count_evi_ids(test_batch)
- assert len(tags) == evi_ids_num
- print >> output, ";\n".join(str(tag) for tag in tags) + ";"
-
-
-def parse_cmd():
- parser = argparse.ArgumentParser()
- parser.add_argument("model_path")
- parser.add_argument("data_path")
- parser.add_argument("output", help="'-' for stdout")
- return parser.parse_args()
-
-
-def main(args):
- conf = config.InferConfig()
- conf.vocab = utils.load_dict(conf.word_dict_path)
- logger.info("length of word dictionary is : %d." % len(conf.vocab))
-
- if args.output == "-":
- output = sys.stdout
- else:
- output = utils.open_file(args.output, "w")
-
- infer = Infer(conf)
- infer.infer(args.model_path, args.data_path, output)
-
- output.close()
-
-
-if __name__ == "__main__":
- main(parse_cmd())
diff --git a/legacy/neural_qa/network.py b/legacy/neural_qa/network.py
deleted file mode 100644
index 1fba80f46c57906c51c4e564a709aee89ae887e3..0000000000000000000000000000000000000000
--- a/legacy/neural_qa/network.py
+++ /dev/null
@@ -1,311 +0,0 @@
-import math
-import paddle.v2 as paddle
-
-import reader
-
-__all__ = ["training_net", "inference_net", "feeding"]
-
-feeding = {
- reader.Q_IDS_STR: reader.Q_IDS,
- reader.E_IDS_STR: reader.E_IDS,
- reader.QE_COMM_STR: reader.QE_COMM,
- reader.EE_COMM_STR: reader.EE_COMM,
- reader.LABELS_STR: reader.LABELS
-}
-
-
-def get_embedding(input, word_vec_dim, wordvecs):
- """
- Define word embedding
-
- :param input: layer input
- :type input: LayerOutput
- :param word_vec_dim: dimension of the word embeddings
- :type word_vec_dim: int
- :param wordvecs: word embedding matrix
- :type wordvecs: numpy array
- :return: embedding
- :rtype: LayerOutput
- """
- return paddle.layer.embedding(
- input=input,
- size=word_vec_dim,
- param_attr=paddle.attr.ParamAttr(
- name="wordvecs", is_static=True, initializer=lambda _: wordvecs))
-
-
-def encoding_question(question, q_lstm_dim, latent_chain_dim, word_vec_dim,
- drop_rate, wordvecs, default_init_std, default_l2_rate):
- """
- Define network for encoding question
-
- :param question: question token ids
- :type question: LayerOutput
- :param q_lstm_dim: dimension of the question LSTM
- :type q_lstm_dim: int
- :param latent_chain_dim: dimension of the attention layer
- :type latent_chain_dim: int
- :param word_vec_dim: dimension of the word embeddings
- :type word_vec_dim: int
- :param drop_rate: dropout rate
- :type drop_rate: float
- :param wordvecs: word embedding matrix
- :type wordvecs: numpy array
- :param default_init_std: default initial standard deviation
- :type default_init_std: float
- :param default_l2_rate: default l2 rate
- :type default_l2_rate: float
- :return: question encoding
- :rtype: LayerOutput
- """
- # word embedding
- emb = get_embedding(question, word_vec_dim, wordvecs)
-
- # question LSTM
- wx = paddle.layer.fc(act=paddle.activation.Linear(),
- size=q_lstm_dim * 4,
- input=emb,
- param_attr=paddle.attr.ParamAttr(
- name="_q_hidden1.w0",
- initial_std=default_init_std,
- l2_rate=default_l2_rate),
- bias_attr=paddle.attr.ParamAttr(
- name="_q_hidden1.wbias",
- initial_std=0,
- l2_rate=default_l2_rate))
- q_rnn = paddle.layer.lstmemory(
- input=wx,
- bias_attr=paddle.attr.ParamAttr(
- name="_q_rnn1.wbias", initial_std=0, l2_rate=default_l2_rate),
- param_attr=paddle.attr.ParamAttr(
- name="_q_rnn1.w0",
- initial_std=default_init_std,
- l2_rate=default_l2_rate))
- q_rnn = paddle.layer.dropout(q_rnn, drop_rate)
-
- # self attention
- fc = paddle.layer.fc(act=paddle.activation.Tanh(),
- size=latent_chain_dim,
- input=q_rnn,
- param_attr=paddle.attr.ParamAttr(
- name="_attention_layer1.w0",
- initial_std=default_init_std,
- l2_rate=default_l2_rate),
- bias_attr=False)
- weight = paddle.layer.fc(size=1,
- act=paddle.activation.SequenceSoftmax(),
- input=fc,
- param_attr=paddle.attr.ParamAttr(
- name="_attention_weight.w0",
- initial_std=default_init_std,
- l2_rate=default_l2_rate),
- bias_attr=False)
-
- scaled_q_rnn = paddle.layer.scaling(input=q_rnn, weight=weight)
-
- q_encoding = paddle.layer.pooling(
- input=scaled_q_rnn, pooling_type=paddle.pooling.Sum())
- return q_encoding
-
-
-def encoding_evidence(evidence, qe_comm, ee_comm, q_encoding, e_lstm_dim,
- word_vec_dim, com_vec_dim, drop_rate, wordvecs,
- default_init_std, default_l2_rate):
- """
- Define network for encoding evidence
-
- :param qe_comm: qe.ecomm features
- :type qe_comm: LayerOutput
- :param ee_comm: ee.ecomm features
- :type ee_comm: LayerOutput
- :param q_encoding: question encoding, a fixed-length vector
- :type q_encoding: LayerOutput
- :param e_lstm_dim: dimension of the evidence LSTMs
- :type e_lstm_dim: int
- :param word_vec_dim: dimension of the word embeddings
- :type word_vec_dim: int
- :param com_vec_dim: dimension of the qe.comm and ee.comm feature embeddings
- :type com_vec_dim: int
- :param drop_rate: dropout rate
- :type drop_rate: float
- :param wordvecs: word embedding matrix
- :type wordvecs: numpy array
- :param default_init_std: default initial standard deviation
- :type default_init_std: float
- :param default_l2_rate: default l2 rate
- :type default_l2_rate: float
- :return: evidence encoding
- :rtype: LayerOutput
- """
-
- def lstm(idx, reverse, inputs):
- """LSTM wrapper"""
- bias_attr = paddle.attr.ParamAttr(
- name="_e_hidden%d.wbias" % idx,
- initial_std=0,
- l2_rate=default_l2_rate)
- with paddle.layer.mixed(size=e_lstm_dim * 4, bias_attr=bias_attr) as wx:
- for i, input in enumerate(inputs):
- param_attr = paddle.attr.ParamAttr(
- name="_e_hidden%d.w%d" % (idx, i),
- initial_std=default_init_std,
- l2_rate=default_l2_rate)
- wx += paddle.layer.full_matrix_projection(
- input=input, param_attr=param_attr)
-
- e_rnn = paddle.layer.lstmemory(
- input=wx,
- reverse=reverse,
- bias_attr=paddle.attr.ParamAttr(
- name="_e_rnn%d.wbias" % idx,
- initial_std=0,
- l2_rate=default_l2_rate),
- param_attr=paddle.attr.ParamAttr(
- name="_e_rnn%d.w0" % idx,
- initial_std=default_init_std,
- l2_rate=default_l2_rate))
- e_rnn = paddle.layer.dropout(e_rnn, drop_rate)
- return e_rnn
-
- # share word embeddings with question
- emb = get_embedding(evidence, word_vec_dim, wordvecs)
-
- # copy q_encoding len(evidence) times
- q_encoding_expand = paddle.layer.expand(
- input=q_encoding, expand_as=evidence)
-
- # feature embeddings
- comm_initial_std = 1 / math.sqrt(64.0)
- qe_comm_emb = paddle.layer.embedding(
- input=qe_comm,
- size=com_vec_dim,
- param_attr=paddle.attr.ParamAttr(
- name="_cw_embedding.w0",
- initial_std=comm_initial_std,
- l2_rate=default_l2_rate))
-
- ee_comm_emb = paddle.layer.embedding(
- input=ee_comm,
- size=com_vec_dim,
- param_attr=paddle.attr.ParamAttr(
- name="_eecom_embedding.w0",
- initial_std=comm_initial_std,
- l2_rate=default_l2_rate))
-
- # evidence LSTMs
- first_layer_extra_inputs = [q_encoding_expand, qe_comm_emb, ee_comm_emb]
- e_rnn1 = lstm(1, False, [emb] + first_layer_extra_inputs)
- e_rnn2 = lstm(2, True, [e_rnn1])
- e_rnn3 = lstm(3, False, [e_rnn2, e_rnn1]) # with cross layer links
-
- return e_rnn3
-
-
-def define_data(dict_dim, label_num):
- """
- Define data layers
-
- :param dict_dim: number of words in the vocabulary
- :type dict_dim: int
- :param label_num: label numbers, BIO:3, BIO2:4
- :type label_num: int
- :return: data layers
- :rtype: tuple of LayerOutput
- """
- question = paddle.layer.data(
- name=reader.Q_IDS_STR,
- type=paddle.data_type.integer_value_sequence(dict_dim))
-
- evidence = paddle.layer.data(
- name=reader.E_IDS_STR,
- type=paddle.data_type.integer_value_sequence(dict_dim))
-
- qe_comm = paddle.layer.data(
- name=reader.QE_COMM_STR,
- type=paddle.data_type.integer_value_sequence(2))
-
- ee_comm = paddle.layer.data(
- name=reader.EE_COMM_STR,
- type=paddle.data_type.integer_value_sequence(2))
-
- label = paddle.layer.data(
- name=reader.LABELS_STR,
- type=paddle.data_type.integer_value_sequence(label_num),
- layer_attr=paddle.attr.ExtraAttr(device=-1))
-
- return question, evidence, qe_comm, ee_comm, label
-
-
-def define_common_network(conf):
- """
- Define common network
-
- :param conf: network conf
- :return: CRF features, golden labels
- :rtype: tuple
- """
- # define data layers
- question, evidence, qe_comm, ee_comm, label = \
- define_data(conf.dict_dim, conf.label_num)
-
- # encode question
- q_encoding = encoding_question(question, conf.q_lstm_dim,
- conf.latent_chain_dim, conf.word_vec_dim,
- conf.drop_rate, conf.wordvecs,
- conf.default_init_std, conf.default_l2_rate)
-
- # encode evidence
- e_encoding = encoding_evidence(
- evidence, qe_comm, ee_comm, q_encoding, conf.e_lstm_dim,
- conf.word_vec_dim, conf.com_vec_dim, conf.drop_rate, conf.wordvecs,
- conf.default_init_std, conf.default_l2_rate)
-
- # pre-compute CRF features
- crf_feats = paddle.layer.fc(act=paddle.activation.Linear(),
- input=e_encoding,
- size=conf.label_num,
- param_attr=paddle.attr.ParamAttr(
- name="_output.w0",
- initial_std=conf.default_init_std,
- l2_rate=conf.default_l2_rate),
- bias_attr=False)
- return crf_feats, label
-
-
-def training_net(conf):
- """
- Define training network
-
- :param conf: network conf
- :return: CRF cost
- :rtype: LayerOutput
- """
- e_encoding, label = define_common_network(conf)
- crf = paddle.layer.crf(input=e_encoding,
- label=label,
- size=conf.label_num,
- param_attr=paddle.attr.ParamAttr(
- name="_crf.w0",
- initial_std=conf.default_init_std,
- l2_rate=conf.default_l2_rate),
- layer_attr=paddle.attr.ExtraAttr(device=-1))
-
- return crf
-
-
-def inference_net(conf):
- """
-    Define inference network
-
-    :param conf: network conf
-    :return: CRF Viterbi decoding result
- :rtype: LayerOutput
- """
- e_encoding, label = define_common_network(conf)
- ret = paddle.layer.crf_decoding(
- input=e_encoding,
- size=conf.label_num,
- param_attr=paddle.attr.ParamAttr(name="_crf.w0"),
- layer_attr=paddle.attr.ExtraAttr(device=-1))
-
- return ret
diff --git a/legacy/neural_qa/pre-trained-models/download-models.sh b/legacy/neural_qa/pre-trained-models/download-models.sh
deleted file mode 100755
index 6dc4ce6606346820414875b417dff9295dee4b20..0000000000000000000000000000000000000000
--- a/legacy/neural_qa/pre-trained-models/download-models.sh
+++ /dev/null
@@ -1,17 +0,0 @@
-#!/bin/bash
-if [[ -f params_pass_00010.tar.gz ]] && [[ -f params_pass_00021.tar.gz ]]; then
- echo "data exist"
- exit 0
-else
- wget -c http://cloud.dlnel.org/filepub/?uuid=d9a00599-1f66-4549-867b-e958f96474ca \
- -O neural_seq_qa.pre-trained-models.2017-10-27.tar.gz
-fi
-
-if [[ `md5sum -c neural_seq_qa.pre-trained-models.2017-10-27.tar.gz.md5` =~ 'OK' ]] ; then
- tar xf neural_seq_qa.pre-trained-models.2017-10-27.tar.gz
- rm neural_seq_qa.pre-trained-models.2017-10-27.tar.gz
-else
- echo "download data error!" >> /dev/stderr
- exit 1
-fi
-
diff --git a/legacy/neural_qa/pre-trained-models/neural_seq_qa.pre-trained-models.2017-10-27.tar.gz.md5 b/legacy/neural_qa/pre-trained-models/neural_seq_qa.pre-trained-models.2017-10-27.tar.gz.md5
deleted file mode 100644
index 209d317a35764ea2aecc693abb1aae5f270afdd4..0000000000000000000000000000000000000000
--- a/legacy/neural_qa/pre-trained-models/neural_seq_qa.pre-trained-models.2017-10-27.tar.gz.md5
+++ /dev/null
@@ -1 +0,0 @@
-77339985bab7ba173e2f368d9f9d684b neural_seq_qa.pre-trained-models.2017-10-27.tar.gz
diff --git a/legacy/neural_qa/reader.py b/legacy/neural_qa/reader.py
deleted file mode 100644
index e55e77b601a1a1de694e17ccbf3d1e27eaa7c624..0000000000000000000000000000000000000000
--- a/legacy/neural_qa/reader.py
+++ /dev/null
@@ -1,409 +0,0 @@
-import sys
-import random
-from itertools import izip
-import json
-import traceback
-
-from datapoint import DataPoint, Evidence, EecommFeatures
-import utils
-from utils import logger
-
-__all__ = [
- "Q_IDS", "E_IDS", "LABELS", "QE_COMM", "EE_COMM", "Q_IDS_STR", "E_IDS_STR",
- "LABELS_STR", "QE_COMM_STR", "EE_COMM_STR", "Settings", "create_reader"
-]
-
-# slot names
-Q_IDS_STR = "q_ids"
-E_IDS_STR = "e_ids"
-LABELS_STR = "labels"
-QE_COMM_STR = "qe.comm"
-EE_COMM_STR = "ee.comm"
-
-Q_IDS = 0
-E_IDS = 1
-LABELS = 2
-QE_COMM = 3
-EE_COMM = 4
-
-NO_ANSWER = "no_answer"
-
-
-class Settings(object):
- """
- class for storing settings
- """
-
- def __init__(self,
- vocab,
- is_training,
- label_schema="BIO2",
- negative_sample_ratio=0.2,
- hit_ans_negative_sample_ratio=0.25,
- keep_first_b=False,
- seed=31425926):
- """
- Init function
-
- :param vocab: word dict
- :type vocab: dict
- :param is_training: True for training
- :type is_training: bool
- :param label_schema: label schema, valid values are BIO and BIO2,
- the default value is BIO2
- :type label_schema: str
- :param negative_sample_ratio: the ratio of negative samples used in
- training, the default value is 0.2
- :type negative_sample_ratio: float
- :param hit_ans_negative_sample_ratio: the ratio of negative samples
- that contain golden answer string, the default value is 0.25
- :type hit_ans_negative_sample_ratio: float
- :param keep_first_b: only keep the first B in golden tag sequence,
- the default value is False
- :type keep_first_b: bool
- :param seed: random seed, the default value is 31425926
- :type seed: int
- """
- self.negative_sample_ratio = negative_sample_ratio
- self.hit_ans_negative_sample_ratio = hit_ans_negative_sample_ratio
- self.keep_first_b = keep_first_b
- self.is_training = is_training
- self.vocab = vocab
-
- # set up label schema
- if label_schema == "BIO":
- B, I, O1, O2 = 0, 1, 2, 2
- elif label_schema == "BIO2":
- B, I, O1, O2 = 0, 1, 2, 3
- else:
- raise ValueError("label_schema should be BIO/BIO2")
- self.B, self.I, self.O1, self.O2 = B, I, O1, O2
- self.label_map = {
- "B": B,
- "I": I,
- "O1": O1,
- "O2": O2,
- "b": B,
- "i": I,
- "o1": O1,
- "o2": O2
- }
- self.label_num = len(set((B, I, O1, O2)))
-
- # id for OOV
- self.oov_id = 0
-
- # set up random seed
- random.seed(seed)
-
-        # bookkeeping messages
- logger.info("negative_sample_ratio: %f", negative_sample_ratio)
- logger.info("hit_ans_negative_sample_ratio: %f",
- hit_ans_negative_sample_ratio)
- logger.info("keep_first_b: %s", keep_first_b)
- logger.info("data reader random seed: %d", seed)
-
-
-class SampleStream(object):
- def __init__(self, filename, settings):
- self.filename = filename
- self.settings = settings
-
- def __iter__(self):
- return self.load_and_filter_samples(self.filename)
-
- def load_and_filter_samples(self, filename):
- def remove_extra_b(labels):
- if labels.count(self.settings.B) <= 1: return
-
- i = 0
- # find the first B
- while i < len(labels) and labels[i] == self.settings.O1:
- i += 1
- i += 1 # skip B
- # skip the following Is
- while i < len(labels) and labels[i] == self.settings.I:
- i += 1
- # change all the other tags to O2
- while i < len(labels):
- labels[i] = self.settings.O2
- i += 1
-
- def filter_and_preprocess_evidences(evidences):
- for i, evi in enumerate(evidences):
- # convert golden labels to labels ids
- if Evidence.GOLDEN_LABELS in evi:
- labels = [self.settings.label_map[l] \
- for l in evi[Evidence.GOLDEN_LABELS]]
- else:
- labels = [self.settings.O1] * len(evi[Evidence.E_TOKENS])
-
-                # determine whether the current evidence is negative
- answer_list = evi[Evidence.GOLDEN_ANSWERS]
- is_negative = len(answer_list) == 1 \
- and "".join(answer_list[0]).lower() == NO_ANSWER
-
- # drop positive evidences that do not contain golden answer
- # matches in training
- is_all_o1 = labels.count(self.settings.O1) == len(labels)
- if self.settings.is_training and is_all_o1 and not is_negative:
- evidences[i] = None # dropped
- continue
-
- if self.settings.keep_first_b:
- remove_extra_b(labels)
- evi[Evidence.GOLDEN_LABELS] = labels
-
- def get_eecom_feats_list(cur_sample_is_negative, eecom_feats_list,
- evidences):
- if not self.settings.is_training:
- return [item[EecommFeatures.EECOMM_FEATURES] \
- for item in eecom_feats_list]
-
- positive_eecom_feats_list = []
- negative_eecom_feats_list = []
-
- for eecom_feats_, other_evi in izip(eecom_feats_list, evidences):
- if not other_evi: continue
-
- eecom_feats = eecom_feats_[EecommFeatures.EECOMM_FEATURES]
- if not eecom_feats: continue
-
- other_evi_type = eecom_feats_[EecommFeatures.OTHER_E_TYPE]
- if cur_sample_is_negative and \
- other_evi_type != Evidence.POSITIVE:
- continue
-
- if other_evi_type == Evidence.POSITIVE:
- positive_eecom_feats_list.append(eecom_feats)
- else:
- negative_eecom_feats_list.append(eecom_feats)
-
- eecom_feats_list = positive_eecom_feats_list
- if negative_eecom_feats_list:
- eecom_feats_list += [negative_eecom_feats_list]
-
- return eecom_feats_list
-
- def process_tokens(data, tok_key):
- ids = [self.settings.vocab.get(token, self.settings.oov_id) \
- for token in data[tok_key]]
- return ids
-
- def process_evi(q_ids, evi, evidences):
- e_ids = process_tokens(evi, Evidence.E_TOKENS)
-
- labels = evi[Evidence.GOLDEN_LABELS]
- qe_comm = evi[Evidence.QECOMM_FEATURES]
- sample_type = evi[Evidence.TYPE]
-
- ret = [None] * 5
- ret[Q_IDS] = q_ids
- ret[E_IDS] = e_ids
- ret[LABELS] = labels
- ret[QE_COMM] = qe_comm
-
- eecom_feats_list = get_eecom_feats_list(
- sample_type != Evidence.POSITIVE,
- evi[Evidence.EECOMM_FEATURES_LIST], evidences)
- if not eecom_feats_list:
- return None
- else:
- ret[EE_COMM] = eecom_feats_list
- return ret
-
- with utils.DotBar(utils.open_file(filename)) as f_:
- for q_idx, line in enumerate(f_):
- # parse json line
- try:
- data = json.loads(line)
- except Exception:
- logger.fatal("ERROR LINE: %s", line.strip())
- traceback.print_exc()
- continue
-
- # convert question tokens to ids
- q_ids = process_tokens(data, DataPoint.Q_TOKENS)
-
- # process evidences
- evidences = data[DataPoint.EVIDENCES]
- filter_and_preprocess_evidences(evidences)
- for evi in evidences:
- if not evi: continue
- sample = process_evi(q_ids, evi, evidences)
- if sample: yield q_idx, sample, evi[Evidence.TYPE]
-
-
-class DataReader(object):
- def __iter__(self):
- return self
-
- def _next(self):
-        raise NotImplementedError()
-
- def next(self):
- data_point = self._next()
- return self.post_process_sample(data_point)
-
- def post_process_sample(self, sample):
- ret = list(sample)
-
- # choose eecom features randomly
- eecom_feats = random.choice(sample[EE_COMM])
- if not isinstance(eecom_feats[0], int):
- # the other evidence is a negative evidence
- eecom_feats = random.choice(eecom_feats)
- ret[EE_COMM] = eecom_feats
-
- return ret
-
-
-class TrainingDataReader(DataReader):
- def __init__(self, sample_stream, negative_ratio, hit_ans_negative_ratio):
- super(TrainingDataReader, self).__init__()
- self.positive_data = []
- self.hit_ans_negative_data = []
- self.other_negative_data = []
-
- self.negative_ratio = negative_ratio
- self.hit_ans_negative_ratio = hit_ans_negative_ratio
-
- self.p_idx = 0
- self.hit_idx = 0
- self.other_idx = 0
-
- self.load_samples(sample_stream)
-
- def add_data(self, positive, hit_negative, other_negative):
- if not positive: return
- self.positive_data.extend(positive)
- for samples, target_list in \
- zip((hit_negative, other_negative),
- (self.hit_ans_negative_data, self.other_negative_data)):
- if not samples: continue
-            # `0` is an index; see _next_negative_data()
- target_list.append([samples, 0])
-
- def load_samples(self, sample_stream):
- logger.info("loading data...")
- last_q_id, positive, hit_negative, other_negative = None, [], [], []
- for q_id, sample, type_ in sample_stream:
-            # flush accumulated samples whenever a new question starts
-            if last_q_id is not None and q_id != last_q_id:
- self.add_data(positive, hit_negative, other_negative)
- positive, hit_negative, other_negative = [], [], []
-
- last_q_id = q_id
- if type_ == Evidence.POSITIVE:
- positive.append(sample)
- elif type_ == Evidence.HIT_ANS_NEGATIVE:
- hit_negative.append(sample)
- elif type_ == Evidence.OTHER_NEGATIVE:
- other_negative.append(sample)
- else:
- raise ValueError("wrong type: %s" % str(type_))
- self.add_data(positive, hit_negative, other_negative)
-
- # we are not sure whether the input data is shuffled or not
- # so we shuffle them
- random.shuffle(self.positive_data)
- random.shuffle(self.hit_ans_negative_data)
- random.shuffle(self.other_negative_data)
-
- # set thresholds
- if len(self.positive_data) == 0:
- logger.fatal("zero positive sample")
- raise ValueError("zero positive sample")
-
- zero_hit = len(self.hit_ans_negative_data) == 0
- zero_other = len(self.other_negative_data) == 0
-
- if zero_hit and zero_other:
- logger.fatal("zero negative sample")
- raise ValueError("zero negative sample")
-
- if zero_hit:
- logger.warning("zero hit_ans_negative sample")
- self.hit_ans_neg_threshold = 0
- else:
- self.hit_ans_neg_threshold = \
- self.negative_ratio * self.hit_ans_negative_ratio
-
- self.other_neg_threshold = self.negative_ratio
- if zero_other:
- logger.warning("zero other_negative sample")
- self.hit_ans_neg_threshold = self.negative_ratio
- logger.info("loaded")
-
- def next_positive_data(self):
- if self.p_idx >= len(self.positive_data):
- random.shuffle(self.positive_data)
- self.p_idx = 0
-
- self.p_idx += 1
- return self.positive_data[self.p_idx - 1]
-
- def _next_negative_data(self, idx, negative_data):
- if idx >= len(negative_data):
- random.shuffle(negative_data)
- idx = 0
-
- # a negative evidence is sampled in two steps:
- # step 1: sample a question uniformly
- # step 2: sample a negative evidence corresponding to the question
- # uniformly
-        # bundle -> [samples, next_idx]
- bundle = negative_data[idx]
- if bundle[1] >= len(bundle[0]):
- random.shuffle(bundle[0])
- bundle[1] = 0
- bundle[1] += 1
- return idx + 1, bundle[0][bundle[1] - 1]
-
- def next_hit_ans_negative_data(self):
- self.hit_idx, data = self._next_negative_data(
- self.hit_idx, self.hit_ans_negative_data)
- return data
-
- def next_other_negative_data(self):
- self.other_idx, data = self._next_negative_data(
- self.other_idx, self.other_negative_data)
- return data
-
- def _next(self):
- rand = random.random()
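-        # with the default thresholds this draws hit-answer negatives with
-        # probability negative_ratio * hit_ans_negative_ratio, other
-        # negatives up to negative_ratio in total, and positives otherwise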
- if rand <= self.hit_ans_neg_threshold:
- return self.next_hit_ans_negative_data()
- elif rand < self.other_neg_threshold:
- return self.next_other_negative_data()
- else:
- return self.next_positive_data()
-
-
-class TestDataReader(DataReader):
- def __init__(self, sample_stream):
- super(TestDataReader, self).__init__()
- self.data_generator = iter(sample_stream)
-
- def _next(self):
- q_idx, sample, type_ = self.data_generator.next()
- return sample
-
-
-def create_reader(filename, settings, samples_per_pass=sys.maxint):
- if settings.is_training:
- training_reader = TrainingDataReader(
- SampleStream(filename, settings), settings.negative_sample_ratio,
- settings.hit_ans_negative_sample_ratio)
-
- def wrapper():
- for i, data in izip(xrange(samples_per_pass), training_reader):
- yield data
-
- return wrapper
- else:
-
- def wrapper():
- sample_stream = SampleStream(filename, settings)
- return TestDataReader(sample_stream)
-
- return wrapper
diff --git a/legacy/neural_qa/test/test_reader.py b/legacy/neural_qa/test/test_reader.py
deleted file mode 100644
index 2c3725b50301c3de701e1af82fd1beb367a7ed0f..0000000000000000000000000000000000000000
--- a/legacy/neural_qa/test/test_reader.py
+++ /dev/null
@@ -1,110 +0,0 @@
-import unittest
-import os
-import itertools
-import math
-import logging
-
-# set up python path
-topdir = os.path.join(os.path.dirname(os.path.abspath(__file__)), "..")
-import sys
-sys.path += [topdir, os.path.join(topdir, "data", "evaluation")]
-
-import reader
-import utils
-
-formatter = logging.Formatter(
- "[%(levelname)s %(asctime)s.%(msecs)d %(filename)s:%(lineno)d] %(message)s",
- datefmt='%Y-%m-%d %I:%M:%S')
-ch = logging.StreamHandler()
-ch.setFormatter(formatter)
-utils.logger.addHandler(ch)
-
-
-class Vocab(object):
- @property
- def data(self):
- word_dict_path = os.path.join(topdir, "data", "embedding",
- "wordvecs.vcb")
- return utils.load_dict(word_dict_path)
-
-
-class NegativeSampleRatioTest(unittest.TestCase):
- def check_ratio(self, negative_sample_ratio):
- for keep_first_b in [True, False]:
- settings = reader.Settings(
- vocab=Vocab().data,
- is_training=True,
- label_schema="BIO2",
- negative_sample_ratio=negative_sample_ratio,
- hit_ans_negative_sample_ratio=0.25,
- keep_first_b=keep_first_b)
-
- filename = os.path.join(topdir, "test", "trn_data.gz")
- data_stream = reader.create_reader(filename, settings)
- total, negative_num = 5000, 0
- for _, d in itertools.izip(xrange(total), data_stream()):
- labels = d[reader.LABELS]
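-                # samples whose labels contain no B tag (id 0) are negatives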
- if labels.count(0) == 0:
- negative_num += 1
-
- ratio = negative_num / float(total)
- self.assertLessEqual(math.fabs(ratio - negative_sample_ratio), 0.01)
-
- def runTest(self):
- for ratio in [1., 0.25, 0.]:
- self.check_ratio(ratio)
-
-
-class KeepFirstBTest(unittest.TestCase):
- def runTest(self):
- for keep_first_b in [True, False]:
- for label_schema in ["BIO", "BIO2"]:
- settings = reader.Settings(
- vocab=Vocab().data,
- is_training=True,
- label_schema=label_schema,
- negative_sample_ratio=0.2,
- hit_ans_negative_sample_ratio=0.25,
- keep_first_b=keep_first_b)
-
- filename = os.path.join(topdir, "test", "trn_data.gz")
- data_stream = reader.create_reader(filename, settings)
- total, at_least_one, one = 1000, 0, 0
- for _, d in itertools.izip(xrange(total), data_stream()):
- labels = d[reader.LABELS]
- b_num = labels.count(0)
- if b_num >= 1:
- at_least_one += 1
- if b_num == 1:
- one += 1
-
- self.assertLess(at_least_one, total)
- if keep_first_b:
- self.assertEqual(one, at_least_one)
- else:
- self.assertLess(one, at_least_one)
-
-
-class DictTest(unittest.TestCase):
- def runTest(self):
- settings = reader.Settings(
- vocab=Vocab().data,
- is_training=True,
- label_schema="BIO2",
- negative_sample_ratio=0.2,
- hit_ans_negative_sample_ratio=0.25,
- keep_first_b=True)
-
- filename = os.path.join(topdir, "test", "trn_data.gz")
- data_stream = reader.create_reader(filename, settings)
- q_uniq_ids, e_uniq_ids = set(), set()
- for _, d in itertools.izip(xrange(1000), data_stream()):
- q_uniq_ids.update(d[reader.Q_IDS])
- e_uniq_ids.update(d[reader.E_IDS])
-
- self.assertGreater(len(q_uniq_ids), 50)
- self.assertGreater(len(e_uniq_ids), 50)
-
-
-if __name__ == '__main__':
- unittest.main()
diff --git a/legacy/neural_qa/test/trn_data.gz b/legacy/neural_qa/test/trn_data.gz
deleted file mode 100644
index dcfc4b09c4def53e2d8489784e24a16cfcd2bbb2..0000000000000000000000000000000000000000
Binary files a/legacy/neural_qa/test/trn_data.gz and /dev/null differ
diff --git a/legacy/neural_qa/train.py b/legacy/neural_qa/train.py
deleted file mode 100644
index e09a1b7388131ca361a24a122df607870ceb2f36..0000000000000000000000000000000000000000
--- a/legacy/neural_qa/train.py
+++ /dev/null
@@ -1,154 +0,0 @@
-import sys
-import os
-import argparse
-import numpy as np
-
-import paddle.v2 as paddle
-
-import reader
-import utils
-import network
-import config
-from utils import logger
-
-
-def save_model(trainer, model_save_dir, parameters, pass_id):
-    path = os.path.join(model_save_dir, "params_pass_%05d.tar.gz" % pass_id)
-    with utils.open_file(path, "w") as f:
-        trainer.save_parameter_to_tar(f)
-    logger.info("model saved to %s" % path)
-
-
-def show_parameter_init_info(parameters):
- """
- Print the information of initialization mean and standard deviation of
- parameters
-
- :param parameters: the parameters created in a model
- """
- logger.info("Parameter init info:")
- for p in parameters:
- p_val = parameters.get(p)
- logger.info(("%-25s : initial_mean=%-7.4f initial_std=%-7.4f "
- "actual_mean=%-7.4f actual_std=%-7.4f dims=%s") %
- (p, parameters.__param_conf__[p].initial_mean,
- parameters.__param_conf__[p].initial_std, p_val.mean(),
- p_val.std(), parameters.__param_conf__[p].dims))
- logger.info("\n")
-
-
-def show_parameter_status(parameters):
- """
- Print some statistical information of parameters in a network
-
- :param parameters: the parameters created in a model
- """
- for p in parameters:
- abs_val = np.abs(parameters.get(p))
- abs_grad = np.abs(parameters.get_grad(p))
-
- logger.info(
- ("%-25s avg_abs_val=%-10.6f max_val=%-10.6f avg_abs_grad=%-10.6f "
- "max_grad=%-10.6f min_val=%-10.6f min_grad=%-10.6f") %
- (p, abs_val.mean(), abs_val.max(), abs_grad.mean(), abs_grad.max(),
- abs_val.min(), abs_grad.min()))
-
-
-def train(conf):
- if not os.path.exists(conf.model_save_dir):
- os.makedirs(conf.model_save_dir, mode=0755)
-
- settings = reader.Settings(
- vocab=conf.vocab,
- is_training=True,
- label_schema=conf.label_schema,
- negative_sample_ratio=conf.negative_sample_ratio,
- hit_ans_negative_sample_ratio=conf.hit_ans_negative_sample_ratio,
- keep_first_b=conf.keep_first_b,
- seed=conf.seed)
- samples_per_pass = conf.batch_size * conf.batches_per_pass
- train_reader = paddle.batch(
- paddle.reader.buffered(
- reader.create_reader(conf.train_data_path, settings,
- samples_per_pass),
- size=samples_per_pass),
- batch_size=conf.batch_size)
-
- # TODO(lipeng17) v2 API does not support parallel_nn yet. Therefore, we can
- # only use CPU currently
- paddle.init(
- use_gpu=conf.use_gpu,
- trainer_count=conf.trainer_count,
- seed=conf.paddle_seed)
-
- # network config
- cost = network.training_net(conf)
-
- # create parameters
-    # NOTE: parameter values are not initialized here; therefore, we need to
-    # print the parameter initialization info at the beginning of the first
-    # batch
- parameters = paddle.parameters.create(cost)
-
- # create optimizer
- rmsprop_optimizer = paddle.optimizer.RMSProp(
- learning_rate=conf.learning_rate,
- rho=conf.rho,
- epsilon=conf.epsilon,
- model_average=paddle.optimizer.ModelAverage(
- average_window=conf.average_window,
- max_average_window=conf.max_average_window))
-
- # create trainer
- trainer = paddle.trainer.SGD(cost=cost,
- parameters=parameters,
- update_equation=rmsprop_optimizer)
-
- # begin training network
- def _event_handler(event):
- """
- Define end batch and end pass event handler
- """
- if isinstance(event, paddle.event.EndIteration):
- sys.stderr.write(".")
- batch_num = event.batch_id + 1
- total_batch = conf.batches_per_pass * event.pass_id + batch_num
- if batch_num % conf.log_period == 0:
- sys.stderr.write("\n")
- logger.info("Total batch=%d Batch=%d CurrentCost=%f Eval: %s" \
- % (total_batch, batch_num, event.cost, event.metrics))
-
- if batch_num % conf.show_parameter_status_period == 0:
- show_parameter_status(parameters)
- elif isinstance(event, paddle.event.EndPass):
- save_model(trainer, conf.model_save_dir, parameters, event.pass_id)
- elif isinstance(event, paddle.event.BeginIteration):
- if event.batch_id == 0 and event.pass_id == 0:
- show_parameter_init_info(parameters)
-
- ## for debugging purpose
- #with utils.open_file("config", "w") as config:
- # print >> config, paddle.layer.parse_network(cost)
-
- trainer.train(
- reader=train_reader,
- event_handler=_event_handler,
- feeding=network.feeding,
- num_passes=conf.num_passes)
-
- logger.info("Training has finished.")
-
-
-def main():
- conf = config.TrainingConfig()
-
- logger.info("loading word embeddings...")
- conf.vocab, conf.wordvecs = utils.load_wordvecs(conf.word_dict_path,
- conf.wordvecs_path)
- logger.info("loaded")
- logger.info("length of word dictionary is : %d." % len(conf.vocab))
-
- train(conf)
-
-
-if __name__ == "__main__":
- main()
diff --git a/legacy/neural_qa/utils.py b/legacy/neural_qa/utils.py
deleted file mode 100644
index 4096e61bcd0fb38252650cac448de253a3885561..0000000000000000000000000000000000000000
--- a/legacy/neural_qa/utils.py
+++ /dev/null
@@ -1,108 +0,0 @@
-import argparse
-import gzip
-import logging
-import sys
-import numpy
-
-__all__ = [
- "open_file",
- "cumsum",
- "logger",
- "DotBar",
- "load_dict",
- "load_wordvecs",
-]
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def open_file(filename, *args1, **args2):
- """
- Open a file
-
- :param filename: name of the file
- :type filename: str
- :return: a file handler
- """
- if filename.endswith(".gz"):
- return gzip.open(filename, *args1, **args2)
- else:
- return open(filename, *args1, **args2)
-
-
-def cumsum(array):
- """
-    Calculate the cumulative sum of an array. For example, for array=[1, 2, 3]
-    the result is [1, 1+2, 1+2+3]
-
- :param array: input array
- :type array: python list or numpy array
- :return: the accumulated sum of array
- """
- if len(array) <= 1:
- return list(array)
- ret = list(array)
- for i in xrange(1, len(ret)):
- ret[i] += ret[i - 1]
- return ret
-
-
-class DotBar(object):
- """
- A simple dot bar
- """
-
- def __init__(self, obj, step=200, dots_per_line=50, f=sys.stderr):
- """
-        :param obj: an iterable object
-        :type obj: a python iterator
- :param step: print a dot every step iterations
- :type step: int
- :param dots_per_line: dots each line
- :type dots_per_line: int
- :param f: print dot to f, default value is sys.stderr
- :type f: a file handler
- """
- self.obj = obj
- self.step = step
- self.dots_per_line = dots_per_line
- self.f = f
-
-    def __enter__(self):
- self.obj.__enter__()
- self.idx = 0
- return self
-
- def __exit__(self, exc_type, exc_value, traceback):
- self.f.write("\n")
- if self.obj is sys.stdin or self.obj is sys.stdout:
- return
- self.obj.__exit__(exc_type, exc_value, traceback)
-
- def __iter__(self):
- return self
-
- def next(self):
- self.idx += 1
- if self.idx % self.step == 0:
- self.f.write(".")
- if self.idx % (self.step * self.dots_per_line) == 0:
- self.f.write("\n")
-
- return self.obj.next()
-
-
-def load_dict(word_dict_path):
- with open_file(word_dict_path) as f:
- # the first word must be OOV
- vocab = {k.rstrip("\n").split()[0].decode("utf-8"):i \
- for i, k in enumerate(f)}
- return vocab
-
-
-def load_wordvecs(word_dict_path, wordvecs_path):
- vocab = load_dict(word_dict_path)
- wordvecs = numpy.loadtxt(wordvecs_path, delimiter=",", dtype="float32")
- assert len(vocab) == wordvecs.shape[0]
- return vocab, wordvecs
diff --git a/legacy/neural_qa/val_and_test.py b/legacy/neural_qa/val_and_test.py
deleted file mode 100644
index 0285f8d68fe7406a235398411ce7990da86d1089..0000000000000000000000000000000000000000
--- a/legacy/neural_qa/val_and_test.py
+++ /dev/null
@@ -1,183 +0,0 @@
-import os
-import sys
-import argparse
-import time
-import traceback
-import subprocess
-import re
-
-import utils
-import infer
-import config
-from utils import logger
-
-
-def load_existing_results(eval_result_file):
- evals = {}
- with utils.open_file(eval_result_file) as f:
- for line in f:
- line = line.strip()
- if not line: continue
- pos = line.find(" ")
- pass_id, ret = int(line[len("Pass="):pos]), line[pos + 1:]
- evals[pass_id] = ret
- return evals
-
-
-__PATTERN_CHUNK_F1 = re.compile(r"chunk_f1=(\d+(\.\d+)?)")
-
-
-def find_best_pass(evals):
- results = []
- for pass_id, eval_ret in evals.iteritems():
- chunk_f1 = float(__PATTERN_CHUNK_F1.search(eval_ret).group(1))
- results.append((pass_id, chunk_f1))
-
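-    # highest chunk F1 wins; ties go to the earlier pass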
- results.sort(key=lambda item: (-item[1], item[0]))
- return results[0][0]
-
-
-def eval_one_pass(infer_obj, conf, model_path, data_path, eval_script):
- if not os.path.exists("tmp"): os.makedirs("tmp")
- # model file is not ready
- if not os.path.exists(model_path): return False
-
- output_path = os.path.join("tmp", "%s_%s.txt.gz" % (
- os.path.basename(model_path), os.path.basename(data_path)))
- with utils.open_file(output_path, "w") as output:
- try:
- infer_obj.infer(model_path, data_path, output)
- except Exception as ex:
- traceback.print_exc()
- return None
-
- cmd = [
- "python", eval_script, output_path, data_path, "--fuzzy", "--schema",
- conf.label_schema
- ]
- logger.info("cmd: %s" % " ".join(cmd))
- eval_ret = subprocess.check_output(cmd)
- if "chunk_f1" not in eval_ret:
- raise ValueError("Unknown error in cmd \"%s\"" % " ".join(cmd))
-
- return eval_ret
-
-
-def run_eval(infer_obj,
- conf,
- model_dir,
- input_path,
- eval_script,
- log_file,
- start_pass_id,
- end_pass_id,
- force_rerun=False):
- if not force_rerun and os.path.exists(log_file):
- evals = load_existing_results(log_file)
- else:
- evals = {}
- with utils.open_file(log_file, "w") as log:
- for i in xrange(start_pass_id, end_pass_id + 1):
- if i in evals:
- eval_ret = evals[i]
- else:
- pass_id = "%05d" % i
- model_path = os.path.join(model_dir,
- "params_pass_%s.tar.gz" % pass_id)
- logger.info("Waiting for model %s ..." % model_path)
- while True:
- eval_ret = eval_one_pass(infer_obj, conf, model_path,
- input_path, eval_script)
- if eval_ret:
- evals[i] = eval_ret
- break
-
- # wait for one minute and rerun
- time.sleep(60)
- print >> log, "Pass=%d %s" % (i, eval_ret.rstrip())
- log.flush()
- return evals
-
-
-def parse_cmd():
- parser = argparse.ArgumentParser()
- parser.add_argument("model_dir")
- parser.add_argument("data_type", choices=["ann", "ir"], default="ann")
- parser.add_argument(
- "--val_eval_output", help="validation set evaluation result file")
- parser.add_argument(
- "--tst_eval_output", help="test set evaluation result file")
- parser.add_argument("--start_pass_id", type=int, default=0)
- parser.add_argument(
- "--end_pass_id", type=int, default=24, help="this pass is included")
- parser.add_argument("--force_rerun", action="store_true")
- return parser.parse_args()
-
-
-__eval_scripts = {
- "ann": "data/evaluation/evaluate-tagging-result.py",
- "ir": "data/evaluation/evaluate-voting-result.py",
-}
-
-__val_data = {
- "ann": "./data/data/validation.ann.json.gz",
- "ir": "./data/data/validation.ir.json.gz",
-}
-
-__tst_data = {
- "ann": "./data/data/test.ann.json.gz",
- "ir": "./data/data/test.ir.json.gz",
-}
-
-
-def main(args):
- conf = config.InferConfig()
- conf.vocab = utils.load_dict(conf.word_dict_path)
- logger.info("length of word dictionary is : %d." % len(conf.vocab))
-
- if args.val_eval_output:
- val_eval_output = args.val_eval_output
- else:
- val_eval_output = "eval.val.%s.txt" % args.data_type
-
- if args.tst_eval_output:
- tst_eval_output = args.tst_eval_output
- else:
- tst_eval_output = "eval.tst.%s.txt" % args.data_type
-
- eval_script = __eval_scripts[args.data_type]
- val_data_file = __val_data[args.data_type]
- tst_data_file = __tst_data[args.data_type]
-
- infer_obj = infer.Infer(conf)
- val_evals = run_eval(
- infer_obj,
- conf,
- args.model_dir,
- val_data_file,
- eval_script,
- val_eval_output,
- args.start_pass_id,
- args.end_pass_id,
- force_rerun=args.force_rerun)
-
- best_pass_id = find_best_pass(val_evals)
-
- tst_evals = run_eval(
- infer_obj,
- conf,
- args.model_dir,
- tst_data_file,
- eval_script,
- tst_eval_output,
- start_pass_id=best_pass_id,
- end_pass_id=best_pass_id,
- force_rerun=args.force_rerun)
-
- logger.info("Best Pass=%d" % best_pass_id)
- logger.info("Validation: %s" % val_evals[best_pass_id])
- logger.info("Test : %s" % tst_evals[best_pass_id])
-
-
-if __name__ == "__main__":
- main(parse_cmd())
diff --git a/legacy/nmt_without_attention/README.cn.md b/legacy/nmt_without_attention/README.cn.md
deleted file mode 100644
index 2fd43bbdda53091506ca574d8c8b894870471c4f..0000000000000000000000000000000000000000
--- a/legacy/nmt_without_attention/README.cn.md
+++ /dev/null
@@ -1,344 +0,0 @@
-# Neural Machine Translation Model
-
-## Background Introduction
-Machine translation uses computers to convert a source language into an equivalent expression in a target language. It is an important research direction in natural language processing with broad application demand, and its methods have kept evolving. Traditional machine translation approaches are mainly rule-based or statistical: translation rules must be specified manually or linguistic features engineered by hand, so quality depends heavily on how well humans understand the source and target languages. In recent years, the rise of deep learning has made automatic feature learning possible. Deep learning first succeeded in image recognition and speech recognition, and then set off a wave of research in natural language processing fields such as machine translation. Deep learning models for machine translation directly learn the mapping from the source language to the target language, greatly reducing human involvement while significantly improving translation quality. This example shows how to build an end-to-end Neural Machine Translation (NMT) model with a Recurrent Neural Network (RNN) in PaddlePaddle.
-
-## Model Overview
-An RNN-based NMT model follows the encoder-decoder architecture, where both the encoder and the decoder are recurrent neural networks. Unrolling the two RNNs along the time steps yields the model structure below:
-
- Figure 1. Encoder-Decoder framework
-
-The input and output of an NMT model can be characters, words, or phrases. Without loss of generality, this example uses a word-based model to explain how the encoder and decoder work:
-
-- **Encoder**: encodes the source sentence into a vector that is fed to the decoder. The raw input of the encoder is the sequence of word `id`s $w = {w_1, w_2, ..., w_T}$ in one-hot representation. To reduce the input dimensionality and capture semantic relations between words, the model learns a word embedding (the familiar word vector) for every one-hot encoded word; for details see the [word vector](https://github.com/PaddlePaddle/book/blob/develop/04.word2vec/README.cn.md) chapter of PaddleBook. Finally, the RNN unit processes the input word by word to obtain the encoding vector of the whole sentence.
-
-- **Decoder**: takes the encoder output and decodes the target-language sequence $u = {u_1, u_2, ..., u_{T'}}$ word by word. At each time step, the RNN unit outputs a hidden vector, from which the conditional probability of the next target word, $P(u_i | w, u_1, u_2, ..., u_{t-1})$, is computed via `Softmax` normalization. Thus, given input $w$, the probability of the corresponding translation $u$ is
-
-$$ P(u_1,u_2,...,u_{T'} | w) = \prod_{t=1}^{t={T'}}p(u_t|w, u_1, u_2, ..., u_{t-1})$$
-
-Take Chinese-to-English translation as an example: the source language is Chinese and the target language is English. Below is a source sentence after word segmentation:
-
-```
-祝愿 祖国 繁荣 昌盛
-```
-
-The corresponding English translation is:
-
-```
-Wish motherland rich and powerful
-```
-
-In the preprocessing stage, we prepare a parallel corpus of source and target sentences and build a dictionary for each language. In the training stage, the model is trained on such paired parallel data. In the test stage, the model takes a Chinese sentence as input and automatically generates the English translation, which is then evaluated against reference translations. BLEU is one of the most popular automatic evaluation metrics in machine translation.
-
-### RNN Unit
-The original RNN stores the hidden state in a single vector, but RNNs of this structure are prone to vanishing gradients during training and have difficulty modeling long-range dependencies. Improved units were therefore proposed, notably the LSTM\[[1](#References)] and the GRU\[[2](#References)], which use gates to control what is remembered and what is forgotten, largely solving the long-term dependency problem for sequence data. The GRU used in this example has the following basic structure:
-
-
-
-Figure 2. GRU unit
-
-
-Besides the hidden state, the GRU contains two gates: an update gate and a reset gate. At every time step, the gates and the hidden state are updated by the formulas shown on the right of Figure 2; these two gates determine how the state is updated.
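-
-Since the figure itself is not reproduced here, for reference the standard GRU updates (following the formulation of Cho et al. \[[2](#References)]) are:
-
-$$z_t = \sigma(W_z x_t + U_z h_{t-1}), \qquad r_t = \sigma(W_r x_t + U_r h_{t-1})$$
-
-$$\tilde{h}_t = \tanh(W x_t + U(r_t \odot h_{t-1})), \qquad h_t = z_t \odot h_{t-1} + (1 - z_t) \odot \tilde{h}_t$$
-
-where $z_t$ is the update gate, $r_t$ the reset gate, and $\odot$ denotes element-wise multiplication.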
-
-### Bi-directional Encoder
-In the basic model above, when the encoder processes the input sentence sequentially, the state at the current time step contains only the history of the input, with no information about future time steps. For sequence modeling, however, future context also carries important information. A bi-directional encoder, shown in Figure 3, captures the context on both sides of the current input:
-
-
-Figure 3. Bi-directional encoder structure
-
-
-The bi-directional encoder\[[3](#References)\] in Figure 3 consists of two independent RNNs that encode the input sequence in the forward and backward directions respectively; the outputs of the two RNNs are then combined as the final encoder output.
-In PaddlePaddle, a bi-directional encoder is easily implemented with the corresponding APIs:
-
-```python
-src_word_id = paddle.layer.data(
- name='source_language_word',
- type=paddle.data_type.integer_value_sequence(source_dict_dim))
-
-# source embedding
-src_embedding = paddle.layer.embedding(
- input=src_word_id, size=word_vector_dim)
-
-# bidirectional GRU as encoder
-encoded_vector = paddle.networks.bidirectional_gru(
- input=src_embedding,
- size=encoder_size,
- fwd_act=paddle.activation.Tanh(),
- fwd_gate_act=paddle.activation.Sigmoid(),
- bwd_act=paddle.activation.Tanh(),
- bwd_gate_act=paddle.activation.Sigmoid(),
- return_seq=True)
-```
-
-### Beam Search Algorithm
-After training, in the generation stage the model decodes a target-language translation from a source-language input. A straightforward decoding strategy is to take the word with the highest conditional probability at each step as the current output. But a local optimum does not necessarily lead to a global optimum: greedy decoding does not guarantee that the resulting complete sentence is the most probable one, while searching the entire solution space is prohibitively expensive. The usual remedy is beam search, a heuristic graph-search algorithm that bounds the search width with a parameter $k$. Its main steps are:
-
-**1**. During decoding, always maintain $k$ partially decoded subsequences;
-
-**2**. At an intermediate step $t$, for each of the $k$ subsequences, compute the probability of the next word, take the $k$ most probable words, and combine them into $k^2$ new subsequences;
-
-**3**. Keep the $k$ most probable of the combinations from **2** to replace the original subsequences;
-
-**4**. Iterate until $k$ complete sentences are obtained as candidate translations.
-
-For more about beam search, see the [beam search](https://github.com/PaddlePaddle/book/blob/develop/08.machine_translation/README.cn.md#柱搜索算法) section of the [machine translation](https://github.com/PaddlePaddle/book/blob/develop/08.machine_translation/README.cn.md) chapter in PaddleBook. A minimal sketch of the search loop follows.
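-
-The sketch below is framework-free and assumes a hypothetical scoring function `next_word_probs(tokens, k)` that returns the $k$ most probable next words with their probabilities; in this example that role is played by the decoder network:
-
-```python
-import math
-
-
-def beam_search(next_word_probs, bos_id, eos_id, beam_size, max_length):
-    # each hypothesis is a (token id list, accumulated log probability) pair
-    beams = [([bos_id], 0.0)]
-    finished = []
-    for _ in range(max_length):
-        candidates = []
-        for tokens, score in beams:
-            # step 2: expand each of the k hypotheses with its top-k next words
-            for word_id, prob in next_word_probs(tokens, k=beam_size):
-                candidates.append((tokens + [word_id], score + math.log(prob)))
-        # step 3: keep only the k best of the k*k candidates
-        candidates.sort(key=lambda c: c[1], reverse=True)
-        beams = []
-        for tokens, score in candidates[:beam_size]:
-            if tokens[-1] == eos_id:
-                finished.append((tokens, score))  # a complete sentence
-            else:
-                beams.append((tokens, score))
-        if not beams:  # step 4: stop once all hypotheses are complete
-            break
-    return sorted(finished or beams, key=lambda c: c[1], reverse=True)
-```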
-
-
-### Decoder without Attention Mechanism
-- The [machine translation](https://github.com/PaddlePaddle/book/blob/develop/08.machine_translation/README.cn.md) chapter of PaddleBook already introduces the encoder-decoder structure with an attention mechanism; this example describes the encoder-decoder structure without attention. For more on attention, refer to PaddleBook and reference \[[3](#References)].
-
-PaddlePaddle ships good implementations of the popular RNN units, which can be called directly. To perform custom operations at each RNN time step, use PaddlePaddle's `recurrent_layer_group`: first define the single-step logic as a function, then let `recurrent_group()` call that step function in a loop over the whole sequence. The attention-free decoder in this example is implemented with `recurrent_layer_group`; its single-step function `gru_decoder_without_attention()` is shown below:
-
-```python
-# the initialization state for decoder GRU
-encoder_last = paddle.layer.last_seq(input=encoded_vector)
-encoder_last_projected = paddle.layer.fc(
- size=decoder_size, act=paddle.activation.Tanh(), input=encoder_last)
-
-# the step function for decoder GRU
-def gru_decoder_without_attention(enc_vec, current_word):
- '''
- Step function for gru decoder
- :param enc_vec: encoded vector of source language
- :type enc_vec: layer object
- :param current_word: current input of decoder
- :type current_word: layer object
- '''
- decoder_mem = paddle.layer.memory(
- name="gru_decoder",
- size=decoder_size,
- boot_layer=encoder_last_projected)
-
- context = paddle.layer.last_seq(input=enc_vec)
-
- decoder_inputs = paddle.layer.fc(
- size=decoder_size * 3, input=[context, current_word])
-
- gru_step = paddle.layer.gru_step(
- name="gru_decoder",
- act=paddle.activation.Tanh(),
- gate_act=paddle.activation.Sigmoid(),
- input=decoder_inputs,
- output_mem=decoder_mem,
- size=decoder_size)
-
- out = paddle.layer.fc(
- size=target_dict_dim,
- bias_attr=True,
- act=paddle.activation.Softmax(),
- input=gru_step)
- return out
-```
-
-The decoder behaves quite differently during training and during testing:
-
-- **Training**: the word embeddings of the target translation, `trg_embedding`, are passed as input to the step function `gru_decoder_without_attention()`; `recurrent_group()` calls the step function in a loop, and finally the cost between the target translation and the actual decoding is computed and returned;
-- **Testing**: the decoder predicts the next word from the last generated word; `GeneratedInput()` automatically fetches the word embeddings of the $k$ most probable words predicted by the model and passes them to the step function, and `beam_search()` drives the step function `gru_decoder_without_attention()` through the beam search and returns the result.
-
-The training and generation logic are implemented in the following `if-else` branches:
-
-```python
-group_input1 = paddle.layer.StaticInput(input=encoded_vector)
-group_inputs = [group_input1]
-
-decoder_group_name = "decoder_group"
-if is_generating:
- trg_embedding = paddle.layer.GeneratedInput(
- size=target_dict_dim,
- embedding_name="_target_language_embedding",
- embedding_size=word_vector_dim)
- group_inputs.append(trg_embedding)
-
- beam_gen = paddle.layer.beam_search(
- name=decoder_group_name,
- step=gru_decoder_without_attention,
- input=group_inputs,
- bos_id=0,
- eos_id=1,
- beam_size=beam_size,
- max_length=max_length)
-
- return beam_gen
-else:
- trg_embedding = paddle.layer.embedding(
- input=paddle.layer.data(
- name="target_language_word",
- type=paddle.data_type.integer_value_sequence(target_dict_dim)),
- size=word_vector_dim,
- param_attr=paddle.attr.ParamAttr(name="_target_language_embedding"))
- group_inputs.append(trg_embedding)
-
- decoder = paddle.layer.recurrent_group(
- name=decoder_group_name,
- step=gru_decoder_without_attention,
- input=group_inputs)
-
- lbl = paddle.layer.data(
- name="target_language_next_word",
- type=paddle.data_type.integer_value_sequence(target_dict_dim))
- cost = paddle.layer.classification_cost(input=decoder, label=lbl)
-
- return cost
-```
-
-## Data Preparation
-The data used in this example comes from [WMT14](http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/), a parallel French-English corpus. The [bitexts](http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/data/bitexts.tgz) portion serves as training data and [dev+test data](http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/data/dev+test.tgz) as validation and test data. PaddlePaddle already wraps a reader interface for this dataset and downloads it automatically on first run, so no manual data preparation is needed.
-
-## Training and Testing the Model
-
-### Model Training
-
-Starting the training is very simple: just run `python train.py` in a terminal. During training, the `train()` function in the `train.py` script performs the following steps in order:
-
-**a) Define the network, parse the network topology, and initialize the model parameters**
-
-```python
-# define the network topology.
-cost = seq2seq_net(source_dict_dim, target_dict_dim)
-parameters = paddle.parameters.create(cost)
-```
-
-**b) Set the optimization strategy for training and define the training data `reader`**
-
-```python
-# define optimization method
-optimizer = paddle.optimizer.RMSProp(
- learning_rate=1e-3,
- gradient_clipping_threshold=10.0,
- regularization=paddle.optimizer.L2Regularization(rate=8e-4))
-
-# define the trainer instance
-trainer = paddle.trainer.SGD(
- cost=cost, parameters=parameters, update_equation=optimizer)
-
-# define data reader
-wmt14_reader = paddle.batch(
- paddle.reader.shuffle(
- paddle.dataset.wmt14.train(source_dict_dim), buf_size=8192),
- batch_size=55)
-```
-
-**c) Define the event handler to log intermediate training results and save model snapshots**
-
-```python
-# define the event_handler callback
-def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if not event.batch_id % 100 and event.batch_id:
-            with gzip.open(
-                    os.path.join(save_path,
-                                 "nmt_without_att_%05d_batch_%05d.tar.gz" %
-                                 (event.pass_id, event.batch_id)), "w") as f:
-                parameters.to_tar(f)
-
- if event.batch_id and not event.batch_id % 10:
- logger.info("Pass %d, Batch %d, Cost %f, %s" % (
- event.pass_id, event.batch_id, event.cost, event.metrics))
-```
-
-**d) Start training**
-
-```python
-# start training
-trainer.train(
- reader=wmt14_reader, event_handler=event_handler, num_passes=2)
-```
-
-A sample of the output:
-
-```text
-Pass 0, Batch 0, Cost 267.674663, {'classification_error_evaluator': 1.0}
-.........
-Pass 0, Batch 10, Cost 172.892294, {'classification_error_evaluator': 0.953895092010498}
-.........
-Pass 0, Batch 20, Cost 177.989329, {'classification_error_evaluator': 0.9052488207817078}
-.........
-Pass 0, Batch 30, Cost 153.633665, {'classification_error_evaluator': 0.8643803596496582}
-.........
-Pass 0, Batch 40, Cost 168.170543, {'classification_error_evaluator': 0.8348183631896973}
-```
-
-### Generating Translations
-Generating translations with a trained model is also straightforward.
-
-1. First, modify the arguments passed to the `generate` function in `main` of the `generate.py` script to select the saved model used for generation. The default arguments are:
-
- ```python
- generate(
- source_dict_dim=30000,
- target_dict_dim=30000,
- batch_size=20,
- beam_size=3,
- model_path="models/nmt_without_att_params_batch_00100.tar.gz")
- ```
-
-2. Run `python generate.py` in a terminal. The `generate()` function in the script performs the following steps in order:
-
-    **a) Load the test samples**
-
- ```python
- # load data samples for generation
- gen_creator = paddle.dataset.wmt14.gen(source_dict_dim)
- gen_data = []
- for item in gen_creator():
- gen_data.append((item[0], ))
- ```
-
-    **b) Initialize the model and run `infer()` to generate beam-search translations for each input sample**
-
- ```python
- beam_gen = seq2seq_net(source_dict_dim, target_dict_dim, True)
- with gzip.open(init_models_path) as f:
- parameters = paddle.parameters.Parameters.from_tar(f)
- # prob is the prediction probabilities, and id is the prediction word.
- beam_result = paddle.infer(
- output_layer=beam_gen,
- parameters=parameters,
- input=gen_data,
- field=['prob', 'id'])
- ```
-
-    **c) Load the source and target dictionaries, convert the `id` sequences back to words, and print the results**
-
- ```python
-    gen_sen_idx = np.where(beam_result[1] == -1)[0]
-    assert len(gen_sen_idx) == len(gen_data) * beam_size
-
-    start_pos, end_pos = 1, 0
-    for i, sample in enumerate(gen_data):
- print(" ".join([
- src_dict[w] for w in sample[0][1:-1]
- ])) # skip the start and ending mark when print the source sentence
- for j in xrange(beam_size):
- end_pos = gen_sen_idx[i * beam_size + j]
- print("%.4f\t%s" % (beam_result[0][i][j], " ".join(
- trg_dict[w] for w in beam_result[1][start_pos:end_pos])))
- start_pos = end_pos + 2
- print("\n")
- ```
-
-With the beam search width set to 3 and a French sentence as input, translations are generated for the test data automatically, in the following format:
-
-```text
-Elles connaissent leur entreprise mieux que personne .
--3.754819 They know their business better than anyone .
--4.445528 They know their businesses better than anyone .
--5.026885 They know their business better than anybody .
-
-```
-- The first line is the input source sentence.
-- Lines 2 through beam_size + 1 are the `beam_size` translation results generated by beam search.
-    - Each of these lines is split by "\t" into two columns: the first is the log probability of the sentence, the second is the text of the translation.
-    - The symbol `<s>` marks the beginning of a sentence and `<e>` marks its end; any word not in the dictionary is replaced by `<unk>`.
-
-With this, we have implemented a basic machine translation model on PaddlePaddle. As we can see, PaddlePaddle provides flexible and rich APIs that make configuring all kinds of complex networks convenient. Machine translation itself is also a fast-moving field where new methods and ideas keep emerging; after working through this example, interested readers can build more sophisticated and better-performing translation models on top of PaddlePaddle.
-
-
-## References
-[1] Sutskever I, Vinyals O, Le Q V. [Sequence to Sequence Learning with Neural Networks](https://arxiv.org/abs/1409.3215)[J]. 2014, 4:3104-3112.
-
-[2] Cho K, Van Merriënboer B, Gulcehre C, et al. [Learning phrase representations using RNN encoder-decoder for statistical machine translation](http://www.aclweb.org/anthology/D/D14/D14-1179.pdf)[C]. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2014: 1724-1734.
-
-[3] Bahdanau D, Cho K, Bengio Y. [Neural machine translation by jointly learning to align and translate](https://arxiv.org/abs/1409.0473)[C]. Proceedings of ICLR 2015, 2015
diff --git a/legacy/nmt_without_attention/README.md b/legacy/nmt_without_attention/README.md
deleted file mode 100644
index deb7ff58ee9c4940964bea8f6a19ca1b54019b6e..0000000000000000000000000000000000000000
--- a/legacy/nmt_without_attention/README.md
+++ /dev/null
@@ -1,352 +0,0 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is v0.10.0. If you are on a version of PaddlePaddle earlier than v0.10.0, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
-
----
-
-# Neural Machine Translation Model
-
-## Background Introduction
-Neural Machine Translation (NMT) is a simple new architecture for getting machines to learn to translate. Traditional machine translation methods are mainly based on phrase-based statistical approaches built from separately engineered subcomponents, such as hand-crafted rules or statistical models, whereas NMT models use deep learning and representation learning. This example describes how to construct an end-to-end NMT model using a recurrent neural network (RNN) in PaddlePaddle.
-
-## Model Overview
-RNN-based neural machine translation follows the encoder-decoder architecture, in which both the encoder and the decoder are typically recurrent neural networks (RNNs). Below is an example diagram of the general NMT approach.
-
- Figure 1. Encoder-Decoder framework
-
-The input and output units of a neural machine translation model can be characters, words or phrases. This example illustrates word-based NMT.
-
-- **Encoder**: Encodes the source-language sentence into a vector that serves as input to the decoder. The raw input of the encoder is the word `id` sequence $w = {w_1, w_2, ..., w_T}$, expressed in one-hot encoding. To reduce the input dimension and to capture semantic associations between words, each one-hot encoded word is mapped to a word embedding, i.e. a word vector. For more information about word vectors, please refer to the [word vector](https://github.com/PaddlePaddle/book/blob/develop/04.word2vec/README.cn.md) chapter of PaddleBook. Finally, the RNN unit processes the input word by word to obtain the encoding vector of the complete sentence.
-
-- **Decoder**: Accepts the output of the encoder and decodes the target-language sequence $u = {u_1, u_2, ..., u_{T'}}$ element by element. At each time step, the RNN unit outputs a hidden vector, from which the conditional probability of the next target word, $P(u_t | w, u_1, u_2, ..., u_{t-1})$, is computed via `Softmax` normalization. Thus, given the input $w$, the probability of the corresponding translation result $u$ is
-
-$$P(u_1,u_2,...,u_{T'} | w) = \prod_{t=1}^{T'}p(u_t|w, u_1, u_2, ..., u_{t-1})$$
-
-Take Chinese-to-English translation as an example: the source language is Chinese and the target language is English. The following is a sentence after word segmentation of the source language.
-
-```
-祝愿 祖国 繁荣 昌盛
-```
-
-The corresponding English translation is:
-
-```
-Wish motherland rich and powerful
-```
-
-In the preprocessing step, we prepare the parallel corpus of the source and target languages and build their dictionaries respectively. In the training stage, the model is trained on the paired parallel corpus. In the test stage, the model automatically generates an English translation for each input, which is then evaluated against reference translations; BLEU is the most commonly used evaluation metric.
-
-### RNN unit
-The original RNN structure uses a single vector to store the hidden state. An RNN of this form is prone to the vanishing-gradient problem, which makes it difficult to model long-range dependencies. This issue can be addressed by using LSTM \[[1](#References)\] or GRU (Gated Recurrent Unit) \[[2](#References)\], both of which handle long-term dependencies by selectively forgetting previous information. In this example, we demonstrate a GRU-based model.
-
-
-
-Figure 2. GRU unit
-
-
-We can see that, in addition to the hidden state, the GRU contains two gates: the update gate and the reset gate. At each time step, the gates and the hidden state are updated according to the formulas on the right side of Figure 2. These two gates determine how the state is updated.
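-
-To make the formulas in Figure 2 concrete, here is a minimal NumPy sketch of a single GRU step; the weight matrices are assumed as inputs and bias terms are omitted, so this is an illustration rather than the PaddlePaddle implementation:
-
-```python
-import numpy as np
-
-def sigmoid(x):
-    return 1.0 / (1.0 + np.exp(-x))
-
-def gru_step(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
-    z = sigmoid(np.dot(x, Wz) + np.dot(h_prev, Uz))           # update gate
-    r = sigmoid(np.dot(x, Wr) + np.dot(h_prev, Ur))           # reset gate
-    h_cand = np.tanh(np.dot(x, Wh) + np.dot(r * h_prev, Uh))  # candidate state
-    return (1.0 - z) * h_prev + z * h_cand                    # new hidden state
-```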
-
-### Bi-directional Encoder
-In the basic model above, when the encoder processes the input sentence sequentially, the state at the current time step contains only information about past inputs and nothing about future parts of the sequence. For sequence modeling, however, the future context also carries important information. With a bi-directional encoder (Figure 3), we can capture both at the same time:
-
-
-
-Figure 3. Bi-directional encoder structure diagram
-
-
-
-The bi-directional encoder \[[3](#References)\] shown in Figure 3 consists of two independent RNNs that encode the input sequence in the forward and backward directions, respectively. The outputs of the two RNNs are then combined as the final encoding output.
-
-In PaddlePaddle, a bi-directional encoder can easily be built with the following API calls:
-
-```python
-src_word_id = paddle.layer.data(
- name='source_language_word',
- type=paddle.data_type.integer_value_sequence(source_dict_dim))
-
-# source embedding
-src_embedding = paddle.layer.embedding(
- input=src_word_id, size=word_vector_dim)
-
-# bidirectional GRU as encoder
-encoded_vector = paddle.networks.bidirectional_gru(
- input=src_embedding,
- size=encoder_size,
- fwd_act=paddle.activation.Tanh(),
- fwd_gate_act=paddle.activation.Sigmoid(),
- bwd_act=paddle.activation.Tanh(),
- bwd_gate_act=paddle.activation.Sigmoid(),
- return_seq=True)
-```
-
-### Beam Search Algorithm
-After training is completed, the model decodes a target-language translation for a given source-language input. A straightforward decoding strategy is to take, at every step, the word with the largest conditional probability as the output of the current time step. However, such local optima do not necessarily lead to the global optimum, while searching the full space is prohibitively expensive. The beam search algorithm is commonly used to solve this problem. Beam search is a heuristic graph search algorithm that controls the search width with a parameter $k$ (a minimal sketch in plain Python is given after the steps below), as follows:
-
-**1**. During decoding, always maintain $k$ decoded sub-sequences;
-
-**2**. At time step $t$, for each of the $k$ sub-sequences, compute the probability of the next word and take the $k$ words with the largest probabilities, yielding $k^2$ new candidate sub-sequences;
-
-**3**. Keep the $k$ candidates with the highest probabilities and use them to replace the original sub-sequences;
-
-**4**. Iterate until $k$ complete sentences are obtained as candidate translation results.
-
-For more information on beam search, refer to the [beam search](https://github.com/PaddlePaddle/book/blob/develop/08.machine_translation/README.md#beam-search-algorithm) section in PaddleBook [machine translation](https://github.com/PaddlePaddle/book/tree/develop/08.machine_translation) chapter.
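-
-As a concrete illustration of the four steps above, here is a minimal, framework-agnostic sketch of beam search. The step function `next_log_probs(prefix)`, assumed to return a mapping from word ids to log probabilities, and the `bos_id`/`eos_id` markers are illustrative assumptions, not part of the PaddlePaddle API:
-
-```python
-import heapq
-
-def beam_search(next_log_probs, bos_id, eos_id, beam_size=3, max_len=50):
-    beams = [(0.0, [bos_id])]  # (cumulative log probability, prefix)
-    finished = []
-    for _ in range(max_len):
-        candidates = []
-        for score, prefix in beams:
-            for word_id, lp in next_log_probs(prefix).items():
-                candidates.append((score + lp, prefix + [word_id]))
-        # keep the k best of the up-to-k^2 expansions
-        best = heapq.nlargest(beam_size, candidates, key=lambda c: c[0])
-        finished += [b for b in best if b[1][-1] == eos_id]
-        beams = [b for b in best if b[1][-1] != eos_id]
-        if not beams:
-            break
-    return heapq.nlargest(beam_size, finished or beams, key=lambda c: c[0])
-```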
-
-
-### Decoder without Attention mechanism
-The attention mechanism has been introduced in the [relevant section](https://github.com/PaddlePaddle/book/blob/develop/08.machine_translation/README.cn.md) of PaddleBook. This example demonstrates an Encoder-Decoder structure without the attention mechanism. For details of the attention mechanism, please refer to PaddleBook and reference \[[3](#References)\].
-
-In PaddlePaddle, commonly used RNN units can be conveniently called via APIs. For example, `recurrent_group()` can be used to implement custom behavior at each step of an RNN: first define the single-step logic as a function, and then `recurrent_group()` calls that function in a loop to process the entire sequence. In this example, the decoder without attention implements its single-step logic in the function `gru_decoder_without_attention()`. The corresponding code is as follows:
-
-
-```python
-# the initialization state for decoder GRU
-encoder_last = paddle.layer.last_seq(input=encoded_vector)
-encoder_last_projected = paddle.layer.fc(
- size=decoder_size, act=paddle.activation.Tanh(), input=encoder_last)
-
-# the step function for decoder GRU
-def gru_decoder_without_attention(enc_vec, current_word):
- '''
- Step function for gru decoder
- :param enc_vec: encoded vector of source language
- :type enc_vec: layer object
- :param current_word: current input of decoder
- :type current_word: layer object
- '''
- decoder_mem = paddle.layer.memory(
- name="gru_decoder",
- size=decoder_size,
- boot_layer=encoder_last_projected)
-
- context = paddle.layer.last_seq(input=enc_vec)
-
- decoder_inputs = paddle.layer.fc(
- size=decoder_size * 3, input=[context, current_word])
-
- gru_step = paddle.layer.gru_step(
- name="gru_decoder",
- act=paddle.activation.Tanh(),
- gate_act=paddle.activation.Sigmoid(),
- input=decoder_inputs,
- output_mem=decoder_mem,
- size=decoder_size)
-
- out = paddle.layer.fc(
- size=target_dict_dim,
- bias_attr=True,
- act=paddle.activation.Softmax(),
- input=gru_step)
- return out
-```
-
-In the model training and testing phase, the behavior of the decoder is different:
-
-- **Training phase**: The word embeddings of the target translation, `trg_embedding`, are passed as input to the single-step logic `gru_decoder_without_attention()`. The function `recurrent_group()` calls the single-step logic in a loop, and the cost is finally computed against the actual translation;
-- **Testing phase**: The decoder predicts the next word based on the last generated word. `GeneratedInput()` automatically fetches the embeddings of the $k$ most probable words predicted by the model and passes them to the single-step logic; `beam_search()` then calls `gru_decoder_without_attention()` to complete the beam search and returns the search results.
-
-The training and generation branches are implemented in the following `if-else` conditional structure:
-
-```python
-group_input1 = paddle.layer.StaticInput(input=encoded_vector)
-group_inputs = [group_input1]
-
-decoder_group_name = "decoder_group"
-if is_generating:
- trg_embedding = paddle.layer.GeneratedInput(
- size=target_dict_dim,
- embedding_name="_target_language_embedding",
- embedding_size=word_vector_dim)
- group_inputs.append(trg_embedding)
-
- beam_gen = paddle.layer.beam_search(
- name=decoder_group_name,
- step=gru_decoder_without_attention,
- input=group_inputs,
- bos_id=0,
- eos_id=1,
- beam_size=beam_size,
- max_length=max_length)
-
- return beam_gen
-else:
- trg_embedding = paddle.layer.embedding(
- input=paddle.layer.data(
- name="target_language_word",
- type=paddle.data_type.integer_value_sequence(target_dict_dim)),
- size=word_vector_dim,
- param_attr=paddle.attr.ParamAttr(name="_target_language_embedding"))
- group_inputs.append(trg_embedding)
-
- decoder = paddle.layer.recurrent_group(
- name=decoder_group_name,
- step=gru_decoder_without_attention,
- input=group_inputs)
-
- lbl = paddle.layer.data(
- name="target_language_next_word",
- type=paddle.data_type.integer_value_sequence(target_dict_dim))
- cost = paddle.layer.classification_cost(input=decoder, label=lbl)
-
- return cost
-```
-
-## Data Preparation
-The data used in this example is from [WMT14](http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/), a parallel corpus for French-to-English translation. [bitexts](http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/data/bitexts.tgz) is used as training data and [dev + test data](http://www-lium.univ-lemans.fr/~schwenk/cslm_joint_paper/data/dev+test.tgz) as validation and test data. PaddlePaddle ships a packaged read interface for this dataset; on the first run the program downloads it automatically, so no manual data preparation is needed.
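-
-For instance, a single training sample can be inspected through the packaged reader; the tuple layout (source word ids, target word ids, and the target sequence shifted by one as next-word labels) is the reader contract assumed by the training code in this example, and the dictionary size 30000 matches its defaults:
-
-```python
-import paddle.v2 as paddle
-
-# Each sample is a tuple of three id sequences.
-reader = paddle.dataset.wmt14.train(30000)
-src_ids, trg_ids, trg_ids_next = next(reader())
-print(len(src_ids), len(trg_ids), len(trg_ids_next))
-```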
-
-## Model Training and Testing
-
-### Model Training
-
-Starting model training is very simple: just execute `python train.py` in a command-line window. The `train()` function in the `train.py` script performs the following steps:
-
-**a) Define the network, parse the network structure, initialize the model parameters.**
-
-```python
-# define the network topolgy.
-cost = seq2seq_net(source_dict_dim, target_dict_dim)
-parameters = paddle.parameters.create(cost)
-```
-
-**b) Set the optimization strategy for training and define the training data `reader`**
-
-```python
-# define optimization method
-optimizer = paddle.optimizer.RMSProp(
- learning_rate=1e-3,
- gradient_clipping_threshold=10.0,
- regularization=paddle.optimizer.L2Regularization(rate=8e-4))
-
-# define the trainer instance
-trainer = paddle.trainer.SGD(
- cost=cost, parameters=parameters, update_equation=optimizer)
-
-# define data reader
-wmt14_reader = paddle.batch(
- paddle.reader.shuffle(
- paddle.dataset.wmt14.train(source_dict_dim), buf_size=8192),
- batch_size=55)
-```
-
-**c) Define the event handler to print intermediate training results and save model snapshots**
-
-```python
-# define the event_handler callback
-def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if not event.batch_id % 100 and event.batch_id:
- with gzip.open(
-                    os.path.join(save_dir_path,
-                                 "nmt_without_att_%05d_batch_%05d.tar.gz" %
-                                 (event.pass_id, event.batch_id)), "w") as f:
- parameters.to_tar(f)
-
- if event.batch_id and not event.batch_id % 10:
- logger.info("Pass %d, Batch %d, Cost %f, %s" % (
- event.pass_id, event.batch_id, event.cost, event.metrics))
-```
-
-**d) Start training**
-
-```python
-# start training
-trainer.train(
- reader=wmt14_reader, event_handler=event_handler, num_passes=2)
-```
-
-A sample of the output is as follows:
-
-```text
-Pass 0, Batch 0, Cost 267.674663, {'classification_error_evaluator': 1.0}
-.........
-Pass 0, Batch 10, Cost 172.892294, {'classification_error_evaluator': 0.953895092010498}
-.........
-Pass 0, Batch 20, Cost 177.989329, {'classification_error_evaluator': 0.9052488207817078}
-.........
-Pass 0, Batch 30, Cost 153.633665, {'classification_error_evaluator': 0.8643803596496582}
-.........
-Pass 0, Batch 40, Cost 168.170543, {'classification_error_evaluator': 0.8348183631896973}
-```
-
-### Generate Translation Results
-In PaddlePaddle, it is also easy to use a trained model to generate translations.
-
-1. First, modify the parameters passed to the `generate()` function in the `main` section of the `generate.py` script to choose which saved model to use. The default parameters are as follows:
-
- ```python
- generate(
- source_dict_dim=30000,
- target_dict_dim=30000,
- batch_size=20,
- beam_size=3,
- model_path="models/nmt_without_att_params_batch_00100.tar.gz")
- ```
-
-2. Then execute the command `python generate.py`. The `generate()` function in the script performs the following steps:
-
- **a) Load the test sample**
-
- ```python
- # load data samples for generation
- gen_creator = paddle.dataset.wmt14.gen(source_dict_dim)
- gen_data = []
- for item in gen_creator():
- gen_data.append((item[0], ))
- ```
-
-    **b) Initialize the model and run `infer()` to generate `beam search` translation results for each input sample**
-
- ```python
- beam_gen = seq2seq_net(source_dict_dim, target_dict_dim, True)
- with gzip.open(init_models_path) as f:
- parameters = paddle.parameters.Parameters.from_tar(f)
- # prob is the prediction probabilities, and id is the prediction word.
- beam_result = paddle.infer(
- output_layer=beam_gen,
- parameters=parameters,
- input=gen_data,
- field=['prob', 'id'])
- ```
-
-    **c) Load the source and target language dictionaries, convert the sentences represented as `id` sequences back into words, and print the results**
-
- ```python
- beam_result = inferer.infer(input=test_batch, field=["prob", "id"])
-
- gen_sen_idx = np.where(beam_result[1] == -1)[0]
- assert len(gen_sen_idx) == len(test_batch) * beam_size
-
- start_pos, end_pos = 1, 0
- for i, sample in enumerate(test_batch):
- print(" ".join([
- src_dict[w] for w in sample[0][1:-1]
- ])) # skip the start and ending mark when print the source sentence
- for j in xrange(beam_size):
- end_pos = gen_sen_idx[i * beam_size + j]
- print("%.4f\t%s" % (beam_result[0][i][j], " ".join(
- trg_dict[w] for w in beam_result[1][start_pos:end_pos])))
- start_pos = end_pos + 2
- print("\n")
- ```
-
-With the beam search width set to 3 and a French sentence as input, translations for the test data are generated automatically, in the following output format:
-
-```text
-Elles connaissent leur entreprise mieux que personne .
--3.754819 They know their business better than anyone .
--4.445528 They know their businesses better than anyone .
--5.026885 They know their business better than anybody .
-
-```
-- The first line is the input source-language sentence.
-- Lines 2 through beam_size + 1 are the `beam_size` translation results generated by beam search.
-  - Each of these lines is separated into two columns by "\t": the first column is the log probability of the sentence, and the second column is the text of the translation result.
-  - The symbol `<s>` marks the beginning of a sentence and the symbol `<e>` marks its end; any word not contained in the dictionary is replaced with the symbol `<unk>`.
-
-So far, we have implemented a basic machine translation model using PaddlePaddle. As we have seen, PaddlePaddle provides a flexible and rich set of APIs that enable users to easily configure all kinds of complex networks. NMT itself is a rapidly developing field in which new ideas continue to emerge. This example is a basic implementation; users can build more complex NMT models on top of PaddlePaddle.
-
-
-## References
-[1] Sutskever I, Vinyals O, Le Q V. [Sequence to Sequence Learning with Neural Networks](https://arxiv.org/abs/1409.3215)[J]. 2014, 4: 3104-3112.
-
-[2] Cho K, Van Merriënboer B, Gulcehre C, et al. [Learning phrase representations using RNN encoder-decoder for statistical machine translation](http://www.aclweb.org/anthology/D/D14/D14-1179.pdf) [C]. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2014: 1724-1734.
-
-[3] Bahdanau D, Cho K, Bengio Y. [Neural machine translation by jointly learning to align and translate](https://arxiv.org/abs/1409.0473)[C]. Proceedings of ICLR 2015, 2015
diff --git a/legacy/nmt_without_attention/generate.py b/legacy/nmt_without_attention/generate.py
deleted file mode 100644
index eeb02b6a9312b683311c5a9146b7443ceb2d2427..0000000000000000000000000000000000000000
--- a/legacy/nmt_without_attention/generate.py
+++ /dev/null
@@ -1,84 +0,0 @@
-import os
-import gzip
-import logging
-import numpy as np
-import paddle.v2 as paddle
-
-from network_conf import seq2seq_net
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.WARNING)
-
-
-def infer_a_batch(inferer, test_batch, beam_size, src_dict, trg_dict):
- beam_result = inferer.infer(input=test_batch, field=["prob", "id"])
-
- gen_sen_idx = np.where(beam_result[1] == -1)[0]
- assert len(gen_sen_idx) == len(test_batch) * beam_size
-
- start_pos, end_pos = 1, 0
- for i, sample in enumerate(test_batch):
- print(" ".join([
- src_dict[w] for w in sample[0][1:-1]
- ])) # skip the start and ending mark when print the source sentence
- for j in xrange(beam_size):
- end_pos = gen_sen_idx[i * beam_size + j]
- print("%.4f\t%s" % (beam_result[0][i][j], " ".join(
- trg_dict[w] for w in beam_result[1][start_pos:end_pos])))
- start_pos = end_pos + 2
- print("\n")
-
-
-def generate(source_dict_dim, target_dict_dim, model_path, beam_size,
- batch_size):
- """
- Sequence generation for NMT.
-
- :param source_dict_dim: size of source dictionary
- :type source_dict_dim: int
- :param target_dict_dim: size of target dictionary
- :type target_dict_dim: int
-    :param model_path: path of the initial model
-    :type model_path: string
-    :param beam_size: the expansion width in each generation step
-    :type beam_size: int
-    :param batch_size: the number of examples in one forward pass
-    :type batch_size: int
- """
-
- assert os.path.exists(model_path), "trained model does not exist."
-
- # step 1: prepare dictionary
- src_dict, trg_dict = paddle.dataset.wmt14.get_dict(source_dict_dim)
-
- # step 2: load the trained model
- paddle.init(use_gpu=False, trainer_count=1)
- with gzip.open(model_path) as f:
- parameters = paddle.parameters.Parameters.from_tar(f)
- beam_gen = seq2seq_net(
- source_dict_dim,
- target_dict_dim,
- beam_size=beam_size,
- max_length=100,
- is_generating=True)
- inferer = paddle.inference.Inference(
- output_layer=beam_gen, parameters=parameters)
-
- # step 3: iterating over the testing dataset
- test_batch = []
- for idx, item in enumerate(paddle.dataset.wmt14.gen(source_dict_dim)()):
- test_batch.append([item[0]])
- if len(test_batch) == batch_size:
- infer_a_batch(inferer, test_batch, beam_size, src_dict, trg_dict)
- test_batch = []
-
- if len(test_batch):
- infer_a_batch(inferer, test_batch, beam_size, src_dict, trg_dict)
- test_batch = []
-
-
-if __name__ == "__main__":
- generate(
- source_dict_dim=30000,
- target_dict_dim=30000,
- batch_size=20,
- beam_size=3,
- model_path="models/nmt_without_att_params_batch_00100.tar.gz")
diff --git a/legacy/nmt_without_attention/images/bidirectional-encoder.png b/legacy/nmt_without_attention/images/bidirectional-encoder.png
deleted file mode 100644
index ae0f39a75cc810fb9a60fc8fa0fb3d3c460a92e2..0000000000000000000000000000000000000000
Binary files a/legacy/nmt_without_attention/images/bidirectional-encoder.png and /dev/null differ
diff --git a/legacy/nmt_without_attention/images/encoder-decoder.png b/legacy/nmt_without_attention/images/encoder-decoder.png
deleted file mode 100644
index b75b4febc770c4090c00bcb84a9a0a6d0ee859b4..0000000000000000000000000000000000000000
Binary files a/legacy/nmt_without_attention/images/encoder-decoder.png and /dev/null differ
diff --git a/legacy/nmt_without_attention/images/gru.png b/legacy/nmt_without_attention/images/gru.png
deleted file mode 100644
index 0cde685b84106650a4df18ce335a23e6338d3d11..0000000000000000000000000000000000000000
Binary files a/legacy/nmt_without_attention/images/gru.png and /dev/null differ
diff --git a/legacy/nmt_without_attention/network_conf.py b/legacy/nmt_without_attention/network_conf.py
deleted file mode 100644
index 595df349be32d947763089ca225f1d98ebb1a0ae..0000000000000000000000000000000000000000
--- a/legacy/nmt_without_attention/network_conf.py
+++ /dev/null
@@ -1,129 +0,0 @@
-import paddle.v2 as paddle
-import sys
-import gzip
-
-
-def seq2seq_net(source_dict_dim,
- target_dict_dim,
- word_vector_dim=620,
- rnn_hidden_size=1000,
- beam_size=1,
- max_length=50,
- is_generating=False):
- """
- Define the network structure of NMT, including encoder and decoder.
-
- :param source_dict_dim: size of source dictionary
- :type source_dict_dim : int
- :param target_dict_dim: size of target dictionary
- :type target_dict_dim: int
- :param word_vector_dim: size of source language word embedding
- :type word_vector_dim: int
- :param rnn_hidden_size: size of hidden state of encoder and decoder RNN
- :type rnn_hidden_size: int
- :param beam_size: expansion width in each step when generating
- :type beam_size: int
- :param max_length: max iteration number in generation
- :type max_length: int
-    :param is_generating: whether to generate a sequence or to train
-    :type is_generating: bool
- """
-
- decoder_size = encoder_size = rnn_hidden_size
-
- src_word_id = paddle.layer.data(
- name="source_language_word",
- type=paddle.data_type.integer_value_sequence(source_dict_dim))
- src_embedding = paddle.layer.embedding(
- input=src_word_id, size=word_vector_dim)
-
- # use bidirectional_gru as the encoder
- encoded_vector = paddle.networks.bidirectional_gru(
- input=src_embedding,
- size=encoder_size,
- fwd_act=paddle.activation.Tanh(),
- fwd_gate_act=paddle.activation.Sigmoid(),
- bwd_act=paddle.activation.Tanh(),
- bwd_gate_act=paddle.activation.Sigmoid(),
- return_seq=True)
- #### Decoder
- encoder_last = paddle.layer.last_seq(input=encoded_vector)
- encoder_last_projected = paddle.layer.fc(size=decoder_size,
- act=paddle.activation.Tanh(),
- input=encoder_last)
-
- # gru step
- def gru_decoder_without_attention(enc_vec, current_word):
- """
- Step function for gru decoder
-
- :param enc_vec: encoded vector of source language
- :type enc_vec: layer object
- :param current_word: current input of decoder
- :type current_word: layer object
- """
- decoder_mem = paddle.layer.memory(
- name="gru_decoder",
- size=decoder_size,
- boot_layer=encoder_last_projected)
-
- context = paddle.layer.last_seq(input=enc_vec)
-
- decoder_inputs = paddle.layer.fc(size=decoder_size * 3,
- input=[context, current_word])
-
- gru_step = paddle.layer.gru_step(
- name="gru_decoder",
- act=paddle.activation.Tanh(),
- gate_act=paddle.activation.Sigmoid(),
- input=decoder_inputs,
- output_mem=decoder_mem,
- size=decoder_size)
-
- out = paddle.layer.fc(size=target_dict_dim,
- bias_attr=True,
- act=paddle.activation.Softmax(),
- input=gru_step)
- return out
-
- group_input1 = paddle.layer.StaticInput(input=encoded_vector)
- group_inputs = [group_input1]
-
- decoder_group_name = "decoder_group"
- if is_generating:
- trg_embedding = paddle.layer.GeneratedInput(
- size=target_dict_dim,
- embedding_name="_target_language_embedding",
- embedding_size=word_vector_dim)
- group_inputs.append(trg_embedding)
-
- beam_gen = paddle.layer.beam_search(
- name=decoder_group_name,
- step=gru_decoder_without_attention,
- input=group_inputs,
- bos_id=0,
- eos_id=1,
- beam_size=beam_size,
- max_length=max_length)
-
- return beam_gen
- else:
- trg_embedding = paddle.layer.embedding(
- input=paddle.layer.data(
- name="target_language_word",
- type=paddle.data_type.integer_value_sequence(target_dict_dim)),
- size=word_vector_dim,
- param_attr=paddle.attr.ParamAttr(name="_target_language_embedding"))
- group_inputs.append(trg_embedding)
-
- decoder = paddle.layer.recurrent_group(
- name=decoder_group_name,
- step=gru_decoder_without_attention,
- input=group_inputs)
-
- lbl = paddle.layer.data(
- name="target_language_next_word",
- type=paddle.data_type.integer_value_sequence(target_dict_dim))
- cost = paddle.layer.classification_cost(input=decoder, label=lbl)
-
- return cost
diff --git a/legacy/nmt_without_attention/train.py b/legacy/nmt_without_attention/train.py
deleted file mode 100644
index 373e4b8d91e29c0312346e82a966ba6b26ce9f06..0000000000000000000000000000000000000000
--- a/legacy/nmt_without_attention/train.py
+++ /dev/null
@@ -1,66 +0,0 @@
-import os
-import gzip
-import logging
-import paddle.v2 as paddle
-
-from network_conf import seq2seq_net
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def train(save_dir_path, source_dict_dim, target_dict_dim):
- '''
- Training function for NMT
-
- :param save_dir_path: path of the directory to save the trained models.
- :param save_dir_path: str
- :param source_dict_dim: size of source dictionary
- :type source_dict_dim: int
- :param target_dict_dim: size of target dictionary
- :type target_dict_dim: int
- '''
- if not os.path.exists(save_dir_path):
- os.mkdir(save_dir_path)
-
- # initialize PaddlePaddle
- paddle.init(use_gpu=False, trainer_count=1)
-
- cost = seq2seq_net(source_dict_dim, target_dict_dim)
- parameters = paddle.parameters.create(cost)
-
- # define optimization method and the trainer instance
- optimizer = paddle.optimizer.RMSProp(
- learning_rate=1e-3,
- gradient_clipping_threshold=10.0,
- regularization=paddle.optimizer.L2Regularization(rate=8e-4))
- trainer = paddle.trainer.SGD(cost=cost,
- parameters=parameters,
- update_equation=optimizer)
-
- # define data reader
- wmt14_reader = paddle.batch(
- paddle.reader.shuffle(
- paddle.dataset.wmt14.train(source_dict_dim), buf_size=8192),
- batch_size=8)
-
- # define the event_handler callback
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if not event.batch_id % 100 and event.batch_id:
- with gzip.open(
-                    os.path.join(save_dir_path,
-                                 "nmt_without_att_%05d_batch_%05d.tar.gz" %
-                                 (event.pass_id, event.batch_id)), "w") as f:
- trainer.save_parameter_to_tar(f)
-
- if event.batch_id and not event.batch_id % 10:
- logger.info("Pass %d, Batch %d, Cost %f, %s" % (
- event.pass_id, event.batch_id, event.cost, event.metrics))
-
- # start training
- trainer.train(
- reader=wmt14_reader, event_handler=event_handler, num_passes=2)
-
-
-if __name__ == '__main__':
- train(save_dir_path="models", source_dict_dim=30000, target_dict_dim=30000)
diff --git a/legacy/scene_text_recognition/README.md b/legacy/scene_text_recognition/README.md
deleted file mode 100644
index f10b4c0d5a966caa0e3deb6b6fd73bcd7538e2e9..0000000000000000000000000000000000000000
--- a/legacy/scene_text_recognition/README.md
+++ /dev/null
@@ -1,132 +0,0 @@
-Running the sample code in this directory requires PaddlePaddle v0.10.0. If your installed version of PaddlePaddle is lower than this requirement, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update it.
-
----
-
-# Scene Text Recognition (STR)
-
-## Introduction to the STR Task
-
-Many scene images contain rich textual information, which helps greatly in understanding the content and meaning of those images, so recognizing text in scene images is extremely important for extracting information from them. Advances in scene text recognition technology have also enabled new applications; for example, \[[1](#References)\] uses deep learning models to automatically recognize the text on street signs, helping street-view applications obtain more accurate address information.
-
-This example demonstrates how to accomplish **Scene Text Recognition (STR)** with PaddlePaddle. As shown in the figure below, given a scene image, `STR` needs to recognize the corresponding text "keep".
-
-
-
-Figure 1. Input data example "keep"
-
-
-
-## Training and Prediction with PaddlePaddle
-
-### Install dependencies
-```bash
-pip install -r requirements.txt
-```
-
-### Modify configuration parameters
-
-The `config.py` script contains the model configuration and training-related parameters, together with detailed explanations. A code snippet follows:
-```python
-class TrainerConfig(object):
-
- # Whether to use GPU in training or not.
- use_gpu = True
- # The number of computing threads.
- trainer_count = 1
-
- # The training batch size.
- batch_size = 10
-
- ...
-
-
-class ModelConfig(object):
-
- # Number of the filters for convolution group.
- filter_num = 8
-
- ...
-```
-
-Parameters can be adjusted by modifying the `config.py` script. For example, the `use_gpu` parameter specifies whether to train with a GPU.
-
-### Model training
-The training script [./train.py](./train.py) defines the following command-line options:
-
-```
-Options:
- --train_file_list_path TEXT The path of the file which contains path list
- of train image files. [required]
- --test_file_list_path TEXT The path of the file which contains path list
- of test image files. [required]
-  --label_dict_path TEXT        The path of label dictionary. If this parameter
-                                is set, but the file does not exist, label
-                                dictionary will be built from the training data
-                                automatically. [required]
- --model_save_dir TEXT The path to save the trained models (default:
- 'models').
- --help Show this message and exit.
-
-```
-
-- `train_file_list`: the list file of the training data. Each line consists of an image path and the corresponding label text, in the format:
-```
-word_1.png, "PROPER"
-word_2.png, "FOOD"
-```
-- `test_file_list`: the list file of the test data, in the same format as above.
-- `label_dict_path`: the path of the label dictionary for the training data. If no dictionary file exists at the specified path, one is built automatically from the labels in the training data.
-- `model_save_dir`: the directory in which model parameters are saved; the default is `./models`.
-
-### Step-by-step procedure:
-
-1. Download the data \[[2](#References)\] (Task 2.3: Word Recognition (2013 edition)) from the official website. There will be three files: `Challenge2_Training_Task3_Images_GT.zip`, `Challenge2_Test_Task3_Images.zip` and `Challenge2_Test_Task3_GT.txt`,
-corresponding to the training images with their words, the test images, and the words of the test data, respectively. Then run the following commands to unpack the data and move it into the target folders:
-
-```bash
-mkdir -p data/train_data
-mkdir -p data/test_data
-unzip Challenge2_Training_Task3_Images_GT.zip -d data/train_data
-unzip Challenge2_Test_Task3_Images.zip -d data/test_data
-mv Challenge2_Test_Task3_GT.txt data/test_data
-```
-
-2. Note the path of `gt.txt` in the training data folder (data/train_data) and the path of `Challenge2_Test_Task3_GT.txt` in the test data folder (data/test_data).
-
-3. Run the following command to start training:
-```bash
-python train.py \
---train_file_list_path 'data/train_data/gt.txt' \
---test_file_list_path 'data/test_data/Challenge2_Test_Task3_GT.txt' \
---label_dict_path 'label_dict.txt'
-```
-4. During training, model parameters are automatically backed up to the specified directory, `./models` by default.
-
-
-### Prediction
-Prediction is handled by `infer.py`, using best-path decoding: at each time step, the character with the highest probability is selected. Before running it, specify in `infer.py` the saved model path, the fixed image size, the batch_size (10 by default), the label dictionary path, and the list file of image files. Run the following command:
-```bash
-python infer.py \
---model_path 'models/params_pass_00000.tar.gz' \
---image_shape '173,46' \
---label_dict_path 'label_dict.txt' \
---infer_file_list_path 'data/test_data/Challenge2_Test_Task3_GT.txt'
-```
-to perform prediction; a minimal sketch of the decoding rule follows.
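-
-As an illustration, the best-path rule can be written in a few lines (this mirrors `ctc_greedy_decoder` in `decoder.py` of this directory; `probs` and `vocab` are assumed inputs):
-
-```python
-from itertools import groupby
-
-import numpy as np
-
-def best_path_decode(probs, vocab):
-    # probs: T x (len(vocab) + 1) matrix; the last column is the CTC blank.
-    ids = np.asarray(probs).argmax(axis=1)
-    blank = len(vocab)
-    # collapse consecutive repeats, then drop blanks
-    return "".join(vocab[i] for i, _ in groupby(ids) if i != blank)
-```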
-
-### Other datasets
-
-- [SynthText in the Wild Dataset](http://www.robots.ox.ac.uk/~vgg/data/scenetext/)(41G)
-- [ICDAR 2003 Robust Reading Competitions](http://www.iapr-tc11.org/mediawiki/index.php?title=ICDAR_2003_Robust_Reading_Competitions)
-
-### Notes
-
-- Because the `warp CTC` library that this model depends on only has a CUDA implementation, the model runs on GPUs only.
-- The model has many parameters and uses a large amount of GPU memory; in practice, adjust `batch_size` to control memory usage.
-- The dataset used in this example is small; if needed, other larger datasets \[[3](#References)\] can be used to train the model.
-
-## References
-
-1. [Google Now Using ReCAPTCHA To Decode Street View Addresses](https://techcrunch.com/2012/03/29/google-now-using-recaptcha-to-decode-street-view-addresses/)
-2. [Focused Scene Text](http://rrc.cvc.uab.es/?ch=2&com=introduction)
-3. [SynthText in the Wild Dataset](http://www.robots.ox.ac.uk/~vgg/data/scenetext/)
diff --git a/legacy/scene_text_recognition/config.py b/legacy/scene_text_recognition/config.py
deleted file mode 100644
index 9cc563549f409d7abf044a9cf9a95919f8bd6852..0000000000000000000000000000000000000000
--- a/legacy/scene_text_recognition/config.py
+++ /dev/null
@@ -1,75 +0,0 @@
-__all__ = ["TrainerConfig", "ModelConfig"]
-
-
-class TrainerConfig(object):
-
- # Whether to use GPU in training or not.
- use_gpu = True
-
- # The number of computing threads.
- trainer_count = 1
-
- # The training batch size.
- batch_size = 10
-
- # The epoch number.
- num_passes = 10
-
- # Parameter updates momentum.
- momentum = 0
-
- # The shape of images.
- image_shape = (173, 46)
-
- # The buffer size of the data reader.
- # The number of buffer size samples will be shuffled in training.
- buf_size = 1000
-
- # The parameter is used to control logging period.
- # Training log will be printed every log_period.
- log_period = 50
-
-
-class ModelConfig(object):
-
- # Number of the filters for convolution group.
- filter_num = 8
-
- # Use batch normalization or not in image convolution group.
- with_bn = True
-
- # The number of channels for block expand layer.
- num_channels = 128
-
- # The parameter stride_x in block expand layer.
- stride_x = 1
-
- # The parameter stride_y in block expand layer.
- stride_y = 1
-
- # The parameter block_x in block expand layer.
- block_x = 1
-
- # The parameter block_y in block expand layer.
- block_y = 11
-
- # The hidden size for gru.
- hidden_size = num_channels
-
- # Use norm_by_times or not in warp ctc layer.
- norm_by_times = True
-
- # The list for number of filter in image convolution group layer.
- filter_num_list = [16, 32, 64, 128]
-
- # The parameter conv_padding in image convolution group layer.
- conv_padding = 1
-
- # The parameter conv_filter_size in image convolution group layer.
- conv_filter_size = 3
-
- # The parameter pool_size in image convolution group layer.
- pool_size = 2
-
- # The parameter pool_stride in image convolution group layer.
- pool_stride = 2
diff --git a/legacy/scene_text_recognition/decoder.py b/legacy/scene_text_recognition/decoder.py
deleted file mode 100644
index 8ba02a453070f955acd281031f4b29608bcaf65c..0000000000000000000000000000000000000000
--- a/legacy/scene_text_recognition/decoder.py
+++ /dev/null
@@ -1,34 +0,0 @@
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-from itertools import groupby
-import numpy as np
-
-
-def ctc_greedy_decoder(probs_seq, vocabulary):
- """CTC greedy (best path) decoder.
- Path consisting of the most probable tokens are further post-processed to
- remove consecutive repetitions and all blanks.
- :param probs_seq: 2-D list of probabilities over the vocabulary for each
- character. Each element is a list of float probabilities
- for one character.
- :type probs_seq: list
- :param vocabulary: Vocabulary list.
- :type vocabulary: list
- :return: Decoding result string.
-    :rtype: str
- """
- # dimension verification
- for probs in probs_seq:
- if not len(probs) == len(vocabulary) + 1:
-            raise ValueError("probs_seq dimension mismatched with vocabulary")
- # argmax to get the best index for each time step
- max_index_list = list(np.array(probs_seq).argmax(axis=1))
- # remove consecutive duplicate indexes
- index_list = [index_group[0] for index_group in groupby(max_index_list)]
- # remove blank indexes
- blank_index = len(vocabulary)
- index_list = [index for index in index_list if index != blank_index]
- # convert index list to string
- return ''.join([vocabulary[index] for index in index_list])
diff --git a/legacy/scene_text_recognition/images/503.jpg b/legacy/scene_text_recognition/images/503.jpg
deleted file mode 100644
index 87253cd25a0e0f36b8430d01054ebe0d2f068356..0000000000000000000000000000000000000000
Binary files a/legacy/scene_text_recognition/images/503.jpg and /dev/null differ
diff --git a/legacy/scene_text_recognition/images/504.jpg b/legacy/scene_text_recognition/images/504.jpg
deleted file mode 100644
index ba19785d45c28e35fa2de2ffea0f5bf97e1ece09..0000000000000000000000000000000000000000
Binary files a/legacy/scene_text_recognition/images/504.jpg and /dev/null differ
diff --git a/legacy/scene_text_recognition/images/505.jpg b/legacy/scene_text_recognition/images/505.jpg
deleted file mode 100644
index f6c2b806cd63793f706cb87cf996e4e16b5cfe97..0000000000000000000000000000000000000000
Binary files a/legacy/scene_text_recognition/images/505.jpg and /dev/null differ
diff --git a/legacy/scene_text_recognition/images/ctc.png b/legacy/scene_text_recognition/images/ctc.png
deleted file mode 100644
index 45b7df3517758ab20ff796133204b385b634e039..0000000000000000000000000000000000000000
Binary files a/legacy/scene_text_recognition/images/ctc.png and /dev/null differ
diff --git a/legacy/scene_text_recognition/images/feature_vector.png b/legacy/scene_text_recognition/images/feature_vector.png
deleted file mode 100644
index f47473fb87462cc0b02270f6121928eae2710627..0000000000000000000000000000000000000000
Binary files a/legacy/scene_text_recognition/images/feature_vector.png and /dev/null differ
diff --git a/legacy/scene_text_recognition/images/transcription.png b/legacy/scene_text_recognition/images/transcription.png
deleted file mode 100644
index cba1f75838d720ab8e28c2d3aa977a008cc618e1..0000000000000000000000000000000000000000
Binary files a/legacy/scene_text_recognition/images/transcription.png and /dev/null differ
diff --git a/legacy/scene_text_recognition/infer.py b/legacy/scene_text_recognition/infer.py
deleted file mode 100644
index dfcf32acc0ba4d7b2a0d4cbc1bc02a528a746810..0000000000000000000000000000000000000000
--- a/legacy/scene_text_recognition/infer.py
+++ /dev/null
@@ -1,86 +0,0 @@
-import click
-import gzip
-
-import paddle.v2 as paddle
-from network_conf import Model
-from reader import DataGenerator
-from decoder import ctc_greedy_decoder
-from utils import get_file_list, load_dict, load_reverse_dict
-
-
-def infer_batch(inferer, test_batch, labels, reversed_char_dict):
- infer_results = inferer.infer(input=test_batch)
- num_steps = len(infer_results) // len(test_batch)
- probs_split = [
- infer_results[i * num_steps:(i + 1) * num_steps]
- for i in xrange(0, len(test_batch))
- ]
- results = []
- # Best path decode.
- for i, probs in enumerate(probs_split):
- output_transcription = ctc_greedy_decoder(
- probs_seq=probs, vocabulary=reversed_char_dict)
- results.append(output_transcription)
-
- for result, label in zip(results, labels):
- print("\nOutput Transcription: %s\nTarget Transcription: %s" %
- (result, label))
-
-
-@click.command('infer')
-@click.option(
- "--model_path", type=str, required=True, help=("The path of saved model."))
-@click.option(
- "--image_shape",
- type=str,
- required=True,
- help=("The fixed size for image dataset (format is like: '173,46')."))
-@click.option(
- "--batch_size",
- type=int,
- default=10,
- help=("The number of examples in one batch (default: 10)."))
-@click.option(
- "--label_dict_path",
- type=str,
- required=True,
- help=("The path of label dictionary. "))
-@click.option(
- "--infer_file_list_path",
- type=str,
- required=True,
- help=("The path of the file which contains "
- "path list of image files for inference."))
-def infer(model_path, image_shape, batch_size, label_dict_path,
- infer_file_list_path):
-
- image_shape = tuple(map(int, image_shape.split(',')))
- infer_file_list = get_file_list(infer_file_list_path)
-
- char_dict = load_dict(label_dict_path)
- reversed_char_dict = load_reverse_dict(label_dict_path)
- dict_size = len(char_dict)
- data_generator = DataGenerator(char_dict=char_dict, image_shape=image_shape)
-
- paddle.init(use_gpu=True, trainer_count=1)
- parameters = paddle.parameters.Parameters.from_tar(gzip.open(model_path))
- model = Model(dict_size, image_shape, is_infer=True)
- inferer = paddle.inference.Inference(
- output_layer=model.log_probs, parameters=parameters)
-
- test_batch = []
- labels = []
- for i, (image, label
- ) in enumerate(data_generator.infer_reader(infer_file_list)()):
- test_batch.append([image])
- labels.append(label)
- if len(test_batch) == batch_size:
- infer_batch(inferer, test_batch, labels, reversed_char_dict)
- test_batch = []
- labels = []
- if test_batch:
- infer_batch(inferer, test_batch, labels, reversed_char_dict)
-
-
-if __name__ == "__main__":
- infer()
diff --git a/legacy/scene_text_recognition/network_conf.py b/legacy/scene_text_recognition/network_conf.py
deleted file mode 100644
index c525c1315a56a4d13cf5cb04e28e2270dbfaf204..0000000000000000000000000000000000000000
--- a/legacy/scene_text_recognition/network_conf.py
+++ /dev/null
@@ -1,127 +0,0 @@
-from paddle import v2 as paddle
-from paddle.v2 import layer
-from paddle.v2 import evaluator
-from paddle.v2.activation import Relu, Linear
-from paddle.v2.networks import img_conv_group, simple_gru
-from config import ModelConfig as conf
-
-
-class Model(object):
- def __init__(self, num_classes, shape, is_infer=False):
- '''
- :param num_classes: The size of the character dict.
- :type num_classes: int
- :param shape: The size of the input images.
- :type shape: tuple of 2 int
- :param is_infer: The boolean parameter indicating
- inferring or training.
-        :type is_infer: bool
- '''
- self.num_classes = num_classes
- self.shape = shape
- self.is_infer = is_infer
- self.image_vector_size = shape[0] * shape[1]
-
- self.__declare_input_layers__()
- self.__build_nn__()
-
- def __declare_input_layers__(self):
- '''
- Define the input layer.
- '''
- # Image input as a float vector.
- self.image = layer.data(
- name='image',
- type=paddle.data_type.dense_vector(self.image_vector_size),
- height=self.shape[0],
- width=self.shape[1])
-
- # Label input as an ID list
- if not self.is_infer:
- self.label = layer.data(
- name='label',
- type=paddle.data_type.integer_value_sequence(self.num_classes))
-
- def __build_nn__(self):
- '''
- Build the network topology.
- '''
- # Get the image features with CNN.
- conv_features = self.conv_groups(self.image, conf.filter_num,
- conf.with_bn)
-
- # Expand the output of CNN into a sequence of feature vectors.
- sliced_feature = layer.block_expand(
- input=conv_features,
- num_channels=conf.num_channels,
- stride_x=conf.stride_x,
- stride_y=conf.stride_y,
- block_x=conf.block_x,
- block_y=conf.block_y)
-
- # Use RNN to capture sequence information forwards and backwards.
- gru_forward = simple_gru(
- input=sliced_feature, size=conf.hidden_size, act=Relu())
- gru_backward = simple_gru(
- input=sliced_feature,
- size=conf.hidden_size,
- act=Relu(),
- reverse=True)
-
- # Map the output of RNN to character distribution.
- self.output = layer.fc(input=[gru_forward, gru_backward],
- size=self.num_classes + 1,
- act=Linear())
-
- self.log_probs = paddle.layer.mixed(
- input=paddle.layer.identity_projection(input=self.output),
- act=paddle.activation.Softmax())
-
- # Use warp CTC to calculate cost for a CTC task.
- if not self.is_infer:
- self.cost = layer.warp_ctc(
- input=self.output,
- label=self.label,
- size=self.num_classes + 1,
- norm_by_times=conf.norm_by_times,
- blank=self.num_classes)
-
- self.eval = evaluator.ctc_error(input=self.output, label=self.label)
-
- def conv_groups(self, input, num, with_bn):
- '''
- Get the image features with image convolution group.
-
- :param input: Input layer.
- :type input: LayerOutput
- :param num: Number of the filters.
- :type num: int
- :param with_bn: Use batch normalization or not.
- :type with_bn: bool
- '''
- assert num % 4 == 0
-
- filter_num_list = conf.filter_num_list
- is_input_image = True
- tmp = input
-
- for num_filter in filter_num_list:
-
- if is_input_image:
- num_channels = 1
- is_input_image = False
- else:
- num_channels = None
-
- tmp = img_conv_group(
- input=tmp,
- num_channels=num_channels,
- conv_padding=conf.conv_padding,
- conv_num_filter=[num_filter] * (num / 4),
- conv_filter_size=conf.conv_filter_size,
- conv_act=Relu(),
- conv_with_batchnorm=with_bn,
- pool_size=conf.pool_size,
- pool_stride=conf.pool_stride, )
-
- return tmp
diff --git a/legacy/scene_text_recognition/reader.py b/legacy/scene_text_recognition/reader.py
deleted file mode 100644
index 91321e34bf6ae748dfbfcf8fff22ee890769616c..0000000000000000000000000000000000000000
--- a/legacy/scene_text_recognition/reader.py
+++ /dev/null
@@ -1,64 +0,0 @@
-import os
-import cv2
-
-from paddle.v2.image import load_image
-
-
-class DataGenerator(object):
- def __init__(self, char_dict, image_shape):
- '''
- :param char_dict: The dictionary class for labels.
- :type char_dict: class
- :param image_shape: The fixed shape of images.
- :type image_shape: tuple
- '''
- self.image_shape = image_shape
- self.char_dict = char_dict
-
- def train_reader(self, file_list):
- '''
- Reader interface for training.
-
- :param file_list: The path list of the image file for training.
- :type file_list: list
- '''
-
- def reader():
-            UNK_ID = self.char_dict['<unk>']  # id reserved for unknown characters
- for image_path, label in file_list:
- label = [self.char_dict.get(c, UNK_ID) for c in label]
- yield self.load_image(image_path), label
-
- return reader
-
- def infer_reader(self, file_list):
- '''
- Reader interface for inference.
-
- :param file_list: The path list of the image file for inference.
- :type file_list: list
- '''
-
- def reader():
- for image_path, label in file_list:
- yield self.load_image(image_path), label
-
- return reader
-
- def load_image(self, path):
- '''
-        Load an image and transform it into a 1-dimensional vector.
-
- :param path: The path of the image data.
- :type path: str
- '''
- image = load_image(path)
- image = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
-
- # Resize all images to a fixed shape.
- if self.image_shape:
- image = cv2.resize(
- image, self.image_shape, interpolation=cv2.INTER_CUBIC)
-
- image = image.flatten() / 255.
- return image
diff --git a/legacy/scene_text_recognition/requirements.txt b/legacy/scene_text_recognition/requirements.txt
deleted file mode 100644
index eb8ed79b09459ccc1fe16e2180a555fade31c58e..0000000000000000000000000000000000000000
--- a/legacy/scene_text_recognition/requirements.txt
+++ /dev/null
@@ -1,2 +0,0 @@
-click
-opencv-python
\ No newline at end of file
diff --git a/legacy/scene_text_recognition/train.py b/legacy/scene_text_recognition/train.py
deleted file mode 100644
index 0fac5c6ea4aed8ee92bb9078ea6ec6bf3f8e0b2b..0000000000000000000000000000000000000000
--- a/legacy/scene_text_recognition/train.py
+++ /dev/null
@@ -1,106 +0,0 @@
-import gzip
-import os
-import click
-
-import paddle.v2 as paddle
-from config import TrainerConfig as conf
-from network_conf import Model
-from reader import DataGenerator
-from utils import get_file_list, build_label_dict, load_dict
-
-
-@click.command('train')
-@click.option(
- "--train_file_list_path",
- type=str,
- required=True,
- help=("The path of the file which contains "
- "path list of train image files."))
-@click.option(
- "--test_file_list_path",
- type=str,
- required=True,
- help=("The path of the file which contains "
- "path list of test image files."))
-@click.option(
- "--label_dict_path",
- type=str,
- required=True,
- help=("The path of label dictionary. "
- "If this parameter is set, but the file does not exist, "
- "label dictionay will be built from "
-          "label dictionary will be built from "
-@click.option(
- "--model_save_dir",
- type=str,
- default="models",
- help="The path to save the trained models (default: 'models').")
-def train(train_file_list_path, test_file_list_path, label_dict_path,
- model_save_dir):
-
- if not os.path.exists(model_save_dir):
- os.mkdir(model_save_dir)
-
- train_file_list = get_file_list(train_file_list_path)
- test_file_list = get_file_list(test_file_list_path)
-
- if not os.path.exists(label_dict_path):
- print(("Label dictionary is not given, the dictionary "
- "is automatically built from the training data."))
- build_label_dict(train_file_list, label_dict_path)
-
- char_dict = load_dict(label_dict_path)
- dict_size = len(char_dict)
- data_generator = DataGenerator(
- char_dict=char_dict, image_shape=conf.image_shape)
-
- paddle.init(use_gpu=conf.use_gpu, trainer_count=conf.trainer_count)
- # Create optimizer.
- optimizer = paddle.optimizer.Momentum(momentum=conf.momentum)
- # Define network topology.
- model = Model(dict_size, conf.image_shape, is_infer=False)
- # Create all the trainable parameters.
- params = paddle.parameters.create(model.cost)
-
- trainer = paddle.trainer.SGD(cost=model.cost,
- parameters=params,
- update_equation=optimizer,
- extra_layers=model.eval)
- # Feeding dictionary.
- feeding = {'image': 0, 'label': 1}
-
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % conf.log_period == 0:
- print("Pass %d, batch %d, Samples %d, Cost %f, Eval %s" %
- (event.pass_id, event.batch_id, event.batch_id *
- conf.batch_size, event.cost, event.metrics))
-
- if isinstance(event, paddle.event.EndPass):
-            # Here, because the training and testing data share the same format,
- # we still use the reader.train_reader to read the testing data.
- result = trainer.test(
- reader=paddle.batch(
- data_generator.train_reader(test_file_list),
- batch_size=conf.batch_size),
- feeding=feeding)
- print("Test %d, Cost %f, Eval %s" %
- (event.pass_id, result.cost, result.metrics))
- with gzip.open(
- os.path.join(model_save_dir, "params_pass_%05d.tar.gz" %
- event.pass_id), "w") as f:
- trainer.save_parameter_to_tar(f)
-
- trainer.train(
- reader=paddle.batch(
- paddle.reader.shuffle(
- data_generator.train_reader(train_file_list),
- buf_size=conf.buf_size),
- batch_size=conf.batch_size),
- feeding=feeding,
- event_handler=event_handler,
- num_passes=conf.num_passes)
-
-
-if __name__ == "__main__":
- train()
diff --git a/legacy/scene_text_recognition/utils.py b/legacy/scene_text_recognition/utils.py
deleted file mode 100644
index 86bd3a1f477a710c3f245ee3dd582044f37a0f8d..0000000000000000000000000000000000000000
--- a/legacy/scene_text_recognition/utils.py
+++ /dev/null
@@ -1,69 +0,0 @@
-import os
-from collections import defaultdict
-
-
-def get_file_list(image_file_list):
- '''
- Generate the file list for training and testing data.
-
- :param image_file_list: The path of the file which contains
- path list of image files.
- :type image_file_list: str
- '''
- dirname = os.path.dirname(image_file_list)
- path_list = []
- with open(image_file_list) as f:
- for line in f:
- line_split = line.strip().split(',', 1)
- filename = line_split[0].strip()
- path = os.path.join(dirname, filename)
- label = line_split[1][2:-1].strip()
- if label:
- path_list.append((path, label))
-
- return path_list
-
-
-def build_label_dict(file_list, save_path):
- """
- Build label dictionary from training data.
-
- :param file_list: The list which contains the labels
- of training data.
- :type file_list: list
- :params save_path: The path where the label dictionary will be saved.
- :type save_path: str
- """
- values = defaultdict(int)
- for path, label in file_list:
- for c in label:
- if c:
- values[c] += 1
-
-    values['<unk>'] = 0  # reserve an entry for unknown characters
- with open(save_path, "w") as f:
- for v, count in sorted(
- values.iteritems(), key=lambda x: x[1], reverse=True):
- f.write("%s\t%d\n" % (v, count))
-
-
-def load_dict(dict_path):
- """
- Load label dictionary from the dictionary path.
-
- :param dict_path: The path of word dictionary.
- :type dict_path: str
- """
- return dict((line.strip().split("\t")[0], idx)
- for idx, line in enumerate(open(dict_path, "r").readlines()))
-
-
-def load_reverse_dict(dict_path):
- """
- Load the reversed label dictionary from dictionary path.
-
- :param dict_path: The path of word dictionary.
- :type dict_path: str
- """
- return dict((idx, line.strip().split("\t")[0])
- for idx, line in enumerate(open(dict_path, "r").readlines()))
diff --git a/legacy/scheduled_sampling/README.md b/legacy/scheduled_sampling/README.md
deleted file mode 100644
index 2a33f3b248e3cede611e5b4c8647286cc8fb791c..0000000000000000000000000000000000000000
--- a/legacy/scheduled_sampling/README.md
+++ /dev/null
@@ -1,224 +0,0 @@
-Running the sample code in this directory requires PaddlePaddle v0.10.0. If your installed version of PaddlePaddle is lower than this requirement, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update it.
-
----
-
-# Scheduled Sampling
-
-## Overview
-
-The goal of a sequence generation task is to maximize the probability of the target sequence given the source input. During training, the model uses the true elements of the target sequence as the decoder input at each step and maximizes the probability of the next element. During generation, the element decoded at the previous step is used as the current input to generate the next element. The probability distributions of the decoder input therefore differ between the training and generation phases.
-
-Scheduled Sampling \[[1](#References)\] is a method for resolving this mismatch between the input distributions at training and generation time. Early in training, it mainly uses the true elements of the target sequence as decoder input, which quickly guides the model from its randomly initialized state to a reasonable one. As training progresses, it gradually feeds more generated elements to the decoder, addressing the distribution mismatch.
-
-In a standard sequence-to-sequence model, if a wrong element is generated early in the sequence, subsequent input states are affected, and the error keeps accumulating as generation proceeds. Scheduled Sampling uses generated elements as decoder input with some probability, so that even when earlier steps are wrong, the training objective is still to maximize the probability of the true target sequence, and the model is trained in the correct direction. This makes the model more fault tolerant.
-
-## Algorithm Overview
-Scheduled Sampling is used only in the training phase of sequence-to-sequence models, not in the generation phase.
-
-When the decoder maximizes the probability of the $t$-th element during training, the standard sequence-to-sequence model uses the true element $y_{t-1}$ of the previous time step as input. Let $g_{t-1}$ denote the element generated at the previous time step; the Scheduled Sampling algorithm uses $g_{t-1}$ as the decoder input with a certain probability.
-
-Suppose training has reached the $i$-th mini-batch. Scheduled Sampling defines a probability $\epsilon_i$ that controls the decoder input. $\epsilon_i$ decays as $i$ grows; common definitions are:
-
- - Linear decay: $\epsilon_i=max(\epsilon,k-c*i)$, where $\epsilon$ bounds the minimum value of $\epsilon_i$, and $k$ and $c$ control the magnitude of the linear decay.
-
- - Exponential decay: $\epsilon_i=k^i$, where $0<k<1$ and $k$ controls the magnitude of the decay.
-
- - Inverse Sigmoid decay: $\epsilon_i=k/(k+e^{i/k})$, where $k>1$ and $k$ likewise controls the magnitude of the decay.
-
-Figure 1 shows the decay curves of these three schedules; a toy computation of all three is sketched after the figure.
-
-
-
-Figure 1. Decay curves of linear decay, exponential decay and inverse Sigmoid decay
-
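-
-As a toy illustration of the three schedules, the following computes $\epsilon_i$ directly from the formulas above; the parameter values are made up for illustration and are not tuned settings from this example:
-
-```python
-import math
-
-def linear(i, eps=0.1, k=1.0, c=1e-5):
-    return max(eps, k - c * i)
-
-def exponential(i, k=0.9999):
-    return k ** i
-
-def inverse_sigmoid(i, k=5000.0):
-    return k / (k + math.exp(i / k))
-
-# epsilon_i starts near 1.0 and decays as the mini-batch index i grows.
-print(linear(100000), exponential(100000), inverse_sigmoid(100000))
-```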
-
-As shown in Figure 2, at decoder time step $t$, Scheduled Sampling uses the true element $y_{t-1}$ of the previous time step as the decoder input with probability $\epsilon_i$, and the element $g_{t-1}$ generated at the previous time step with probability $1-\epsilon_i$. As Figure 1 shows, $\epsilon_i$ keeps decreasing as $i$ grows, so the decoder increasingly tends to use generated elements as input, and the data distributions of the training and generation phases become more and more consistent.
-
-
-
-Figure 2. Scheduled Sampling choosing different elements as the decoder input
-
-
-## Model Implementation
-
-Since Scheduled Sampling is an improvement on the sequence-to-sequence model, its overall implementation framework is quite similar. To keep the focus of this article, only the parts related to Scheduled Sampling are described here; the complete code is in `network_conf.py`.
-
-First, import the required packages and define the class `RandomScheduleGenerator`, which controls the decay probability, as follows:
-
-```python
-import numpy as np
-import math
-
-
-class RandomScheduleGenerator:
- """
-    The random sampling rate for the scheduled sampling algorithm, which uses a
-    decayed sampling rate.
-
- """
- ...
-```
-
-Below we define the three methods `__init__`, `getScheduleRate` and `processBatch` of the class `RandomScheduleGenerator`.
-
-The `__init__` method initializes the class. Its `schedule_type` parameter specifies which decay schedule to use; the options are `constant`, `linear`, `exponential` and `inverse_sigmoid`. `constant` uses a fixed $\epsilon_i$ for all mini-batches, `linear` is linear decay, `exponential` is exponential decay, and `inverse_sigmoid` is inverse Sigmoid decay. The parameters `a` and `b` of `__init__` are the parameters of the decay method and need to be tuned on the validation set. `self.schedule_computers` maps each schedule type to a function that computes $\epsilon_i$. The last line assigns the selected decay function to `self.schedule_computer` according to `schedule_type`.
-
-```python
-def __init__(self, schedule_type, a, b):
- """
-        schedule_type: the type of the decay. It supports constant, linear,
- exponential, and inverse_sigmoid right now.
- a: parameter of the decay (MUST BE DOUBLE)
- b: parameter of the decay (MUST BE DOUBLE)
- """
- self.schedule_type = schedule_type
- self.a = a
- self.b = b
- self.data_processed_ = 0
- self.schedule_computers = {
- "constant": lambda a, b, d: a,
- "linear": lambda a, b, d: max(a, 1 - d / b),
- "exponential": lambda a, b, d: pow(a, d / b),
- "inverse_sigmoid": lambda a, b, d: b / (b + math.exp(d * a / b)),
- }
- assert (self.schedule_type in self.schedule_computers)
- self.schedule_computer = self.schedule_computers[self.schedule_type]
-```
-
-`getScheduleRate` computes $\epsilon_i$ from the decay function and the amount of data already processed.
-
-```python
-def getScheduleRate(self):
- """
- Get the schedule sampling rate. Usually not needed to be called by the users
- """
- return self.schedule_computer(self.a, self.b, self.data_processed_)
-
-```
-
-The `processBatch` method samples according to the probability $\epsilon_i$ to obtain `indexes`; each element of `indexes` takes the value `0` with probability $\epsilon_i$ and the value `1` with probability $1-\epsilon_i$. `indexes` determines whether the decoder input is the true element or the generated one: `0` means the true element is used, and `1` means the generated element is used.
-
-```python
-def processBatch(self, batch_size):
- """
- Get a batch_size of sampled indexes. These indexes can be passed to a
-        MultiplexLayer to select from the ground truth and generated samples
- from the last time step.
- """
- rate = self.getScheduleRate()
- numbers = np.random.rand(batch_size)
- indexes = (numbers >= rate).astype('int32').tolist()
- self.data_processed_ += batch_size
- return indexes
-```
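-
-As a quick sanity check of the generator (a minimal usage sketch; the parameter values mirror the linear-decay defaults used by `gen_schedule_data` below):
-
-```python
-gen = RandomScheduleGenerator("linear", 0.75, 1000000)
-print(gen.getScheduleRate())  # 1.0 before any data has been processed
-print(gen.processBatch(5))    # [0, 0, 0, 0, 0] while the rate is still 1.0
-```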
-
-Scheduled Sampling adds an extra input `true_token_flag` on top of the sequence-to-sequence model to control the decoder input.
-
-```python
-true_token_flags = paddle.layer.data(
- name='true_token_flag',
- type=paddle.data_type.integer_value_sequence(2))
-```
-
-The original reader also needs to be wrapped so that it generates the data for `true_token_flag`. The following takes linear decay as an example to show how to call the `RandomScheduleGenerator` defined above to produce the input data for `true_token_flag`.
-
-```python
-def gen_schedule_data(reader,
- schedule_type="linear",
- decay_a=0.75,
- decay_b=1000000):
- """
- Creates a data reader for scheduled sampling.
-
- Output from the iterator that created by original reader will be
- appended with "true_token_flag" to indicate whether to use true token.
-
- :param reader: the original reader.
- :type reader: callable
- :param schedule_type: the type of sampling rate decay.
- :type schedule_type: str
- :param decay_a: the decay parameter a.
- :type decay_a: float
- :param decay_b: the decay parameter b.
- :type decay_b: float
-
- :return: the new reader with the field "true_token_flag".
- :rtype: callable
- """
- schedule_generator = RandomScheduleGenerator(schedule_type, decay_a, decay_b)
-
- def data_reader():
- for src_ids, trg_ids, trg_ids_next in reader():
- yield src_ids, trg_ids, trg_ids_next, \
- [0] + schedule_generator.processBatch(len(trg_ids) - 1)
-
- return data_reader
-```
-
-This code appends the data that controls the decoder input after the original input data (the source-sequence elements `src_ids`, the target-sequence elements `trg_ids`, and the next target elements `trg_ids_next`). Because the first element fed to the decoder is the sequence start token, the first element of the appended data is set to `0`, so the first decoding step always uses the first element of the true target sequence (the sequence start token).
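-
-As a quick illustration, the toy reader below (hypothetical; not part of the original code) is wrapped with `gen_schedule_data`. Under the default linear decay the initial sampling rate is $\max(0.75, 1 - 0/10^6) = 1.0$, so every flag of the first sample is deterministically `0`:
-
-```python
-def toy_reader():
-    # one sample: source ids, target ids, shifted target ids
-    yield [2, 5, 7], [1, 4, 6, 8, 9], [4, 6, 8, 9, 1]
-
-wrapped = gen_schedule_data(toy_reader)
-src_ids, trg_ids, trg_ids_next, flags = next(wrapped())
-print(flags)  # [0, 0, 0, 0, 0]: the rate is still 1.0, so only true tokens are used
-```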
-
-The decoder function called at every step of `recurrent_group` during training is as follows:
-
-```python
-def gru_decoder_with_attention_train(enc_vec, enc_proj, true_word,
- true_token_flag):
- """
- The decoder step for training.
- :param enc_vec: the encoder vector for attention
- :type enc_vec: LayerOutput
- :param enc_proj: the encoder projection for attention
- :type enc_proj: LayerOutput
- :param true_word: the ground-truth target word
- :type true_word: LayerOutput
- :param true_token_flag: the flag of using the ground-truth target word
- :type true_token_flag: LayerOutput
- :return: the softmax output layer
- :rtype: LayerOutput
- """
-
- decoder_mem = paddle.layer.memory(
- name='gru_decoder', size=decoder_size, boot_layer=decoder_boot)
-
- context = paddle.networks.simple_attention(
- encoded_sequence=enc_vec,
- encoded_proj=enc_proj,
- decoder_state=decoder_mem)
-
- gru_out_memory = paddle.layer.memory(
- name='gru_out', size=target_dict_dim)
-
- generated_word = paddle.layer.max_id(input=gru_out_memory)
-
- generated_word_emb = paddle.layer.embedding(
- input=generated_word,
- size=word_vector_dim,
- param_attr=paddle.attr.ParamAttr(name='_target_language_embedding'))
-
- current_word = paddle.layer.multiplex(
- input=[true_token_flag, true_word, generated_word_emb])
-
- decoder_inputs = paddle.layer.fc(
- input=[context, current_word],
- size=decoder_size * 3,
- act=paddle.activation.Linear(),
- bias_attr=False)
-
- gru_step = paddle.layer.gru_step(
- name='gru_decoder',
- input=decoder_inputs,
- output_mem=decoder_mem,
- size=decoder_size)
-
- out = paddle.layer.fc(
- name='gru_out',
- input=gru_step,
- size=target_dict_dim,
- act=paddle.activation.Softmax())
- return out
-```
-
-This function uses the `memory` layer `gru_out_memory` to remember the output generated at the previous step and picks the highest-probability word from it as the generated word `generated_word`. The `multiplex` layer then chooses between the true element `true_word` and the generated element, and the chosen result becomes the decoder input. The `multiplex` layer takes three inputs: `true_token_flag`, `true_word`, and `generated_word_emb`. For each position, if the value in `true_token_flag` is `0`, the `multiplex` layer outputs the corresponding element of `true_word`; if it is `1`, it outputs the corresponding element of `generated_word_emb`.
-
-## References
-
-[1] Bengio S, Vinyals O, Jaitly N, et al. [Scheduled sampling for sequence prediction with recurrent neural networks](http://papers.nips.cc/paper/5956-scheduled-sampling-for-sequence-prediction-with-recurrent-neural-networks)//Advances in Neural Information Processing Systems. 2015: 1171-1179.
diff --git a/legacy/scheduled_sampling/README_en.md b/legacy/scheduled_sampling/README_en.md
deleted file mode 100644
index cffb65adf904f294e94be9f4071d3b9674ca94a7..0000000000000000000000000000000000000000
--- a/legacy/scheduled_sampling/README_en.md
+++ /dev/null
@@ -1,222 +0,0 @@
-Running the sample code in this directory requires PaddlePaddle v0.10.0 or later. If the PaddlePaddle on your device is older than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update it.
-
----
-
-# Scheduled Sampling
-
-## Overview
-
-The goal of a sequence generation task is to maximize the probability of the target sequence given the source input. During training, the model uses the true elements of the target sequence as the decoder input at each step and maximizes the probability of the next element. During generation, however, the element decoded at the previous step is used as the current input to generate the next element. The probability distributions of the decoder's input data therefore differ between the training and generation phases.
-
-Scheduled sampling\[[1](#references)\] is a solution to the inconsistency in the distribution of input data during training and generation phases. In the early stage of training, this method uses the real elements in the target sequence as the decoder input and quickly guides the model from a randomly initialized state to a reasonable state. As training progresses, the method will gradually increase the use of the generated elements as decoder input to solve the problem of inconsistent data distribution.
-
-In a standard sequence-to-sequence model, if an incorrect element is generated early in the sequence, the subsequent input states are affected and the error keeps accumulating as generation continues. Scheduled sampling feeds generated elements to the decoder with a certain probability during training, so even when earlier generation steps contain errors, the training objective is still to maximize the probability of the true target sequence and the model is still trained in the right direction. This approach therefore increases the fault tolerance of the model.
-
-## Introduction to the Algorithm
-Scheduled sampling is used only in the training phase of the sequence-to-sequence model and not in the generation phase.
-
-The standard sequence-to-sequence model uses the true element $y_{t-1}$ from the previous step as the decoder input to maximize the probability of the $t$-th element. Let $g_{t-1}$ be the element generated at the previous step; the scheduled sampling algorithm instead uses $g_{t-1}$ as the decoder input with a certain probability.
-
-Suppose the model is being trained on the $i$-th mini-batch. To control the decoder's input, the scheduled sampling algorithm defines a probability variable $\epsilon_i$ that decays as $i$ increases. Some common definitions are:
-
-Linear decay: $\epsilon_i=\max(\epsilon, k-c \cdot i)$, where $\epsilon$ limits the minimum value of $\epsilon_i$, and $k$ and $c$ control the magnitude of the linear decay.
-
-Exponential decay: $\epsilon_i=k^i$, where $k<1$ and $k$ controls the magnitude of the decay.
-
-Inverse sigmoid decay: $\epsilon_i=k/(k+e^{i/k})$, where $k\ge 1$ and $k$ also controls the magnitude of the decay.
-
-
-
-Figure 1. Decay curves of linear decay, exponential decay, and inverse sigmoid decay
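-
-To make these schedules concrete, the standalone sketch below (not part of the original code; the constants `k`, `c`, and `eps` are illustrative choices) evaluates each decay at a few mini-batch indices:
-
-```python
-import math
-
-def linear(i, k=1.0, c=1e-4, eps=0.05):
-    # epsilon_i = max(eps, k - c * i)
-    return max(eps, k - c * i)
-
-def exponential(i, k=0.9995):
-    # epsilon_i = k ** i, with k < 1
-    return k ** i
-
-def inverse_sigmoid(i, k=500.0):
-    # epsilon_i = k / (k + exp(i / k)), with k >= 1
-    return k / (k + math.exp(i / k))
-
-for i in (0, 1000, 5000, 10000):
-    print("%5d  %.3f  %.3f  %.3f" % (i, linear(i), exponential(i), inverse_sigmoid(i)))
-```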
-
-
-As shown in Figure 2, at time step $t$ the decoder uses the true element $y_{t-1}$ from the previous step as its input with probability $\epsilon_i$, and the element $g_{t-1}$ generated at the previous step with probability $1-\epsilon_i$. From Figure 1 we see that $\epsilon_i$ decreases as $i$ increases, so the decoder increasingly uses generated elements as input, and the data distributions of the training and generation phases gradually become more consistent.
-
-
-
-Figure 2. The scheduled sampling algorithm selecting different elements as the decoder input
-
-
-## Model Implementation
-
-Since the scheduled sampling algorithm is just an improvement over the sequence-to-sequence model, its overall implementation framework is similar to that of the sequence-to-sequence model. Thus, only the parts related to scheduled sampling are described here. For the complete code, see `network_conf.py`.
-
-First, we import the required packages and define the class `RandomScheduleGenerator` that controls the decay probability as follows:
-
-```python
-import numpy as np
-import math
-
-
-class RandomScheduleGenerator:
-    """
-    The random sampling rate for the scheduled sampling algorithm, which uses a
-    decayed sampling rate.
-
-    """
-    ...
-```
-
-We will now define the three methods of class `RandomScheduleGenerator`: `__init__`, `getScheduleRate`, and `processBatch`.
-
-The `__init__` method initializes the class. The `schedule_type` parameter specifies which decay mode to use. The options are `constant`, `linear`, `exponential`, and `inverse_sigmoid`. Mode `constant` uses a fixed $\epsilon_i$ for all mini-batches; mode `linear` denotes linear decay; mode `exponential` denotes exponential decay; mode `inverse_sigmoid` denotes inverse sigmoid decay. Parameters `a` and `b` of the `__init__` method are the parameters of the decay scheme and need to be tuned on the validation set. `self.schedule_computers` maps each decay mode to a function that computes $\epsilon_i$. The last line assigns the selected decay function to `self.schedule_computer` according to `schedule_type`.
-
-```python
-def __init__(self, schedule_type, a, b):
-    """
-    schedule_type: the type of the decay. It supports constant, linear,
-    exponential, and inverse_sigmoid right now.
-    a: parameter of the decay (MUST BE DOUBLE)
-    b: parameter of the decay (MUST BE DOUBLE)
-    """
-    self.schedule_type = schedule_type
-    self.a = a
-    self.b = b
-    self.data_processed_ = 0
-    self.schedule_computers = {
-        "constant": lambda a, b, d: a,
-        "linear": lambda a, b, d: max(a, 1 - d / b),
-        "exponential": lambda a, b, d: pow(a, d / b),
-        "inverse_sigmoid": lambda a, b, d: b / (b + math.exp(d * a / b)),
-    }
-    assert (self.schedule_type in self.schedule_computers)
-    self.schedule_computer = self.schedule_computers[self.schedule_type]
-```
-
-`getScheduleRate` calculates $\epsilon_i$ based on the decay function and the amount of data already processed.
-
-```python
-def getScheduleRate(self):
-    """
-    Get the scheduled sampling rate. Users usually do not need to call this method.
-    """
-    return self.schedule_computer(self.a, self.b, self.data_processed_)
-
-```
-
-The `processBatch` method samples according to the probability $\epsilon_i$ and outputs `indexes`: each element of `indexes` is `0` with probability $\epsilon_i$ and `1` with probability $1-\epsilon_i$. `indexes` determines whether the decoder's input is a true element or a generated element: a value of `0` means the true element is used, and a value of `1` means the generated element is used.
-
-```python
-def processBatch(self, batch_size):
-    """
-    Get a batch_size of sampled indexes. These indexes can be passed to a
-    MultiplexLayer to select from the ground truth and generated samples
-    from the last time step.
-    """
-    rate = self.getScheduleRate()
-    numbers = np.random.rand(batch_size)
-    indexes = (numbers >= rate).astype('int32').tolist()
-    self.data_processed_ += batch_size
-    return indexes
-```
-
-The scheduled sampling algorithm adds one extra input variable, `true_token_flag`, to the sequence-to-sequence model to control the decoder input.
-
-```python
-true_token_flags = paddle.layer.data(
-    name='true_token_flag',
-    type=paddle.data_type.integer_value_sequence(2))
-```
-
-Here we also need to wrap the original reader and add a generator for the `true_token_flag` data. Using linear decay as an example, the following shows how to call the `RandomScheduleGenerator` defined above to generate the input data for `true_token_flag`.
-
-```python
-def gen_schedule_data(reader,
-                      schedule_type="linear",
-                      decay_a=0.75,
-                      decay_b=1000000):
-    """
-    Creates a data reader for scheduled sampling.
-
-    Output from the iterator created by the original reader will be
-    appended with "true_token_flag" to indicate whether to use the true token.
-
-    :param reader: the original reader.
-    :type reader: callable
-    :param schedule_type: the type of sampling rate decay.
-    :type schedule_type: str
-    :param decay_a: the decay parameter a.
-    :type decay_a: float
-    :param decay_b: the decay parameter b.
-    :type decay_b: float
-
-    :return: the new reader with the field "true_token_flag".
-    :rtype: callable
-    """
-    schedule_generator = RandomScheduleGenerator(schedule_type, decay_a, decay_b)
-
-    def data_reader():
-        for src_ids, trg_ids, trg_ids_next in reader():
-            yield src_ids, trg_ids, trg_ids_next, \
-                [0] + schedule_generator.processBatch(len(trg_ids) - 1)
-
-    return data_reader
-```
-
-This code appends the data that controls the decoder input after the original input data (i.e., the source-sequence elements `src_ids`, the target-sequence elements `trg_ids`, and the next target elements `trg_ids_next`). Since the first element fed to the decoder is the sequence start token, the first element of the appended data is set to `0`, indicating that the first decoding step always uses the first element of the true target sequence (i.e., the sequence start token).
-
-The decoder function called by each step of `recurrent_group` during training is as follows:
-
-```python
-def gru_decoder_with_attention_train(enc_vec, enc_proj, true_word,
-                                     true_token_flag):
-    """
-    The decoder step for training.
-    :param enc_vec: the encoder vector for attention
-    :type enc_vec: LayerOutput
-    :param enc_proj: the encoder projection for attention
-    :type enc_proj: LayerOutput
-    :param true_word: the ground-truth target word
-    :type true_word: LayerOutput
-    :param true_token_flag: the flag of using the ground-truth target word
-    :type true_token_flag: LayerOutput
-    :return: the softmax output layer
-    :rtype: LayerOutput
-    """
-
-    decoder_mem = paddle.layer.memory(
-        name='gru_decoder', size=decoder_size, boot_layer=decoder_boot)
-
-    context = paddle.networks.simple_attention(
-        encoded_sequence=enc_vec,
-        encoded_proj=enc_proj,
-        decoder_state=decoder_mem)
-
-    gru_out_memory = paddle.layer.memory(
-        name='gru_out', size=target_dict_dim)
-
-    generated_word = paddle.layer.max_id(input=gru_out_memory)
-
-    generated_word_emb = paddle.layer.embedding(
-        input=generated_word,
-        size=word_vector_dim,
-        param_attr=paddle.attr.ParamAttr(name='_target_language_embedding'))
-
-    current_word = paddle.layer.multiplex(
-        input=[true_token_flag, true_word, generated_word_emb])
-
-    decoder_inputs = paddle.layer.fc(
-        input=[context, current_word],
-        size=decoder_size * 3,
-        act=paddle.activation.Linear(),
-        bias_attr=False)
-
-    gru_step = paddle.layer.gru_step(
-        name='gru_decoder',
-        input=decoder_inputs,
-        output_mem=decoder_mem,
-        size=decoder_size)
-
-    out = paddle.layer.fc(
-        name='gru_out',
-        input=gru_step,
-        size=target_dict_dim,
-        act=paddle.activation.Softmax())
-    return out
-```
-
-The function uses the `memory` layer `gru_out_memory` to remember the output of the previous time step and selects the word with the highest probability from it as the generated word. The `multiplex` layer then makes a choice between the true element `true_word` and the generated element and uses the result as the decoder input. The `multiplex` layer takes three inputs: `true_token_flag`, `true_word`, and `generated_word_emb`. For each position, if the value in `true_token_flag` is `0`, the `multiplex` layer outputs the corresponding element of `true_word`; if it is `1`, it outputs the corresponding element of `generated_word_emb`.
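-
-The selection rule of the `multiplex` layer can be illustrated with a small NumPy sketch (an illustration of the rule only, not the actual layer implementation):
-
-```python
-import numpy as np
-
-flags = np.array([0, 1, 1, 0])                           # true_token_flag for 4 steps
-true_word = np.array([[1.0], [2.0], [3.0], [4.0]])       # ground-truth embeddings
-generated = np.array([[10.0], [20.0], [30.0], [40.0]])   # generated-word embeddings
-
-# Row i comes from true_word when flags[i] == 0, from generated when flags[i] == 1.
-selected = np.where(flags[:, None] == 0, true_word, generated)
-print(selected.ravel())  # [ 1. 20. 30.  4.]
-```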
-
-## References
-
-[1] Bengio S, Vinyals O, Jaitly N, et al. [Scheduled sampling for sequence prediction with recurrent neural networks](http://papers.nips.cc/paper/5956-scheduled-sampling-for-sequence-prediction-with-recurrent-neural-networks)//Advances in Neural Information Processing Systems. 2015: 1171-1179.
diff --git a/legacy/scheduled_sampling/generate.py b/legacy/scheduled_sampling/generate.py
deleted file mode 100644
index adde133744329636baabc8a409fc737f3b9106a3..0000000000000000000000000000000000000000
--- a/legacy/scheduled_sampling/generate.py
+++ /dev/null
@@ -1,91 +0,0 @@
-import gzip
-import argparse
-import distutils.util
-import paddle.v2 as paddle
-
-from network_conf import seqToseq_net
-
-
-def parse_args():
- parser = argparse.ArgumentParser(
- description="PaddlePaddle Scheduled Sampling")
- parser.add_argument(
- '--model_path',
- type=str,
- required=True,
- help="The path for trained model to load.")
- parser.add_argument(
- '--beam_size',
- type=int,
- default=3,
- help='The width of beam expansion. (default: %(default)s)')
- parser.add_argument(
- "--use_gpu",
- type=distutils.util.strtobool,
- default=False,
- help="Use gpu or not. (default: %(default)s)")
- parser.add_argument(
- "--trainer_count",
- type=int,
- default=1,
- help="Trainer number. (default: %(default)s)")
-
- return parser.parse_args()
-
-
-def generate(gen_data, dict_size, model_path, beam_size):
- beam_gen = seqToseq_net(dict_size, dict_size, beam_size, is_generating=True)
-
- with gzip.open(model_path, 'r') as f:
- parameters = paddle.parameters.Parameters.from_tar(f)
-
- # prob is the prediction probabilities, and id is the prediction word.
- beam_result = paddle.infer(
- output_layer=beam_gen,
- parameters=parameters,
- input=gen_data,
- field=['prob', 'id'])
-
- # get the dictionary
- src_dict, trg_dict = paddle.dataset.wmt14.get_dict(dict_size)
-
- # the delimited element of generated sequences is -1,
- # the first element of each generated sequence is the sequence length
- seq_list = []
- seq = []
- for w in beam_result[1]:
- if w != -1:
- seq.append(w)
- else:
- seq_list.append(' '.join([trg_dict.get(w) for w in seq[1:]]))
- seq = []
-
- prob = beam_result[0]
-    for i in xrange(len(gen_data)):
- print "\n*******************************************************\n"
- print "src:", ' '.join([src_dict.get(w) for w in gen_data[i][0]]), "\n"
- for j in xrange(beam_size):
- print "prob = %f:" % (prob[i][j]), seq_list[i * beam_size + j]
-
-
-if __name__ == '__main__':
- args = parse_args()
-
- dict_size = 30000
-
- paddle.init(use_gpu=args.use_gpu, trainer_count=args.trainer_count)
-
- # use the first 3 samples for generation
- gen_creator = paddle.dataset.wmt14.gen(dict_size)
- gen_data = []
- gen_num = 3
- for item in gen_creator():
- gen_data.append((item[0], ))
- if len(gen_data) == gen_num:
- break
-
- generate(
- gen_data,
- dict_size=dict_size,
- model_path=args.model_path,
- beam_size=args.beam_size)
diff --git a/legacy/scheduled_sampling/images/Scheduled_Sampling.jpg b/legacy/scheduled_sampling/images/Scheduled_Sampling.jpg
deleted file mode 100644
index 27f568a45f41af0aa3c0d982e64233e058302100..0000000000000000000000000000000000000000
Binary files a/legacy/scheduled_sampling/images/Scheduled_Sampling.jpg and /dev/null differ
diff --git a/legacy/scheduled_sampling/images/decay.jpg b/legacy/scheduled_sampling/images/decay.jpg
deleted file mode 100644
index ea0532750a96aaa6a39d84b1ee9b8bedba3916d7..0000000000000000000000000000000000000000
Binary files a/legacy/scheduled_sampling/images/decay.jpg and /dev/null differ
diff --git a/legacy/scheduled_sampling/network_conf.py b/legacy/scheduled_sampling/network_conf.py
deleted file mode 100644
index f331c15b958b739ae87f540ac12617fb08ac039b..0000000000000000000000000000000000000000
--- a/legacy/scheduled_sampling/network_conf.py
+++ /dev/null
@@ -1,196 +0,0 @@
-import paddle.v2 as paddle
-
-__all__ = ["seqToseq_net"]
-
-### Network Architecture
-word_vector_dim = 512 # dimension of word vector
-decoder_size = 512 # dimension of hidden unit in GRU Decoder network
-encoder_size = 512 # dimension of hidden unit in GRU Encoder network
-
-max_length = 250
-
-
-def seqToseq_net(source_dict_dim,
- target_dict_dim,
- beam_size,
- is_generating=False):
- """
- The definition of the sequence to sequence model
- :param source_dict_dim: the dictionary size of the source language
- :type source_dict_dim: int
- :param target_dict_dim: the dictionary size of the target language
- :type target_dict_dim: int
- :param beam_size: The width of beam expansion
- :type beam_size: int
- :param is_generating: whether in generating mode
- :type is_generating: Bool
- :return: the last layer of the network
- :rtype: LayerOutput
- """
-
- #### Encoder
- src_word_id = paddle.layer.data(
- name='source_language_word',
- type=paddle.data_type.integer_value_sequence(source_dict_dim))
- src_embedding = paddle.layer.embedding(
- input=src_word_id, size=word_vector_dim)
- src_forward = paddle.networks.simple_gru(
- input=src_embedding, size=encoder_size)
- src_reverse = paddle.networks.simple_gru(
- input=src_embedding, size=encoder_size, reverse=True)
- encoded_vector = paddle.layer.concat(input=[src_forward, src_reverse])
-
- #### Decoder
- encoded_proj = paddle.layer.fc(input=encoded_vector,
- size=decoder_size,
- act=paddle.activation.Linear(),
- bias_attr=False)
-
- reverse_first = paddle.layer.first_seq(input=src_reverse)
-
- decoder_boot = paddle.layer.fc(input=reverse_first,
- size=decoder_size,
- act=paddle.activation.Tanh(),
- bias_attr=False)
-
- def gru_decoder_with_attention_train(enc_vec, enc_proj, true_word,
- true_token_flag):
- """
- The decoder step for training.
- :param enc_vec: the encoder vector for attention
- :type enc_vec: LayerOutput
- :param enc_proj: the encoder projection for attention
- :type enc_proj: LayerOutput
- :param true_word: the ground-truth target word
- :type true_word: LayerOutput
- :param true_token_flag: the flag of using the ground-truth target word
- :type true_token_flag: LayerOutput
- :return: the softmax output layer
- :rtype: LayerOutput
- """
-
- decoder_mem = paddle.layer.memory(
- name='gru_decoder', size=decoder_size, boot_layer=decoder_boot)
-
- context = paddle.networks.simple_attention(
- encoded_sequence=enc_vec,
- encoded_proj=enc_proj,
- decoder_state=decoder_mem)
-
- gru_out_memory = paddle.layer.memory(
- name='gru_out', size=target_dict_dim)
-
- generated_word = paddle.layer.max_id(input=gru_out_memory)
-
- generated_word_emb = paddle.layer.embedding(
- input=generated_word,
- size=word_vector_dim,
- param_attr=paddle.attr.ParamAttr(name='_target_language_embedding'))
-
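-        # Pick this step's decoder input: the ground-truth word embedding when
-        # the flag is 0, the embedded generated word when the flag is 1.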
- current_word = paddle.layer.multiplex(
- input=[true_token_flag, true_word, generated_word_emb])
-
- decoder_inputs = paddle.layer.fc(input=[context, current_word],
- size=decoder_size * 3,
- act=paddle.activation.Linear(),
- bias_attr=False)
-
- gru_step = paddle.layer.gru_step(
- name='gru_decoder',
- input=decoder_inputs,
- output_mem=decoder_mem,
- size=decoder_size)
-
- out = paddle.layer.fc(name='gru_out',
- input=gru_step,
- size=target_dict_dim,
- act=paddle.activation.Softmax())
- return out
-
- def gru_decoder_with_attention_gen(enc_vec, enc_proj, current_word):
- """
- The decoder step for generating.
- :param enc_vec: the encoder vector for attention
- :type enc_vec: LayerOutput
- :param enc_proj: the encoder projection for attention
- :type enc_proj: LayerOutput
- :param current_word: the previously generated word
- :type current_word: LayerOutput
- :return: the softmax output layer
- :rtype: LayerOutput
- """
-
- decoder_mem = paddle.layer.memory(
- name='gru_decoder', size=decoder_size, boot_layer=decoder_boot)
-
- context = paddle.networks.simple_attention(
- encoded_sequence=enc_vec,
- encoded_proj=enc_proj,
- decoder_state=decoder_mem)
-
- decoder_inputs = paddle.layer.fc(input=[context, current_word],
- size=decoder_size * 3,
- act=paddle.activation.Linear(),
- bias_attr=False)
-
- gru_step = paddle.layer.gru_step(
- name='gru_decoder',
- input=decoder_inputs,
- output_mem=decoder_mem,
- size=decoder_size)
-
- out = paddle.layer.fc(name='gru_out',
- input=gru_step,
- size=target_dict_dim,
- act=paddle.activation.Softmax())
- return out
-
- decoder_group_name = "decoder_group"
- group_input1 = paddle.layer.StaticInput(input=encoded_vector, is_seq=True)
- group_input2 = paddle.layer.StaticInput(input=encoded_proj, is_seq=True)
-
- if not is_generating:
- trg_embedding = paddle.layer.embedding(
- input=paddle.layer.data(
- name='target_language_word',
- type=paddle.data_type.integer_value_sequence(target_dict_dim)),
- size=word_vector_dim,
- param_attr=paddle.attr.ParamAttr(name='_target_language_embedding'))
-
- true_token_flags = paddle.layer.data(
- name='true_token_flag',
- type=paddle.data_type.integer_value_sequence(2))
-
- group_inputs = [
- group_input1, group_input2, trg_embedding, true_token_flags
- ]
-
- decoder = paddle.layer.recurrent_group(
- name=decoder_group_name,
- step=gru_decoder_with_attention_train,
- input=group_inputs)
-
- lbl = paddle.layer.data(
- name='target_language_next_word',
- type=paddle.data_type.integer_value_sequence(target_dict_dim))
-
- cost = paddle.layer.classification_cost(input=decoder, label=lbl)
-
- return cost
- else:
- trg_embedding = paddle.layer.GeneratedInput(
- size=target_dict_dim,
- embedding_name='_target_language_embedding',
- embedding_size=word_vector_dim)
-
- group_inputs = [group_input1, group_input2, trg_embedding]
-
- beam_gen = paddle.layer.beam_search(
- name=decoder_group_name,
- step=gru_decoder_with_attention_gen,
- input=group_inputs,
- bos_id=0,
- eos_id=1,
- beam_size=beam_size,
- max_length=max_length)
- return beam_gen
diff --git a/legacy/scheduled_sampling/reader.py b/legacy/scheduled_sampling/reader.py
deleted file mode 100644
index c751aa91a01902c95cb8968367ff00e1afed7c96..0000000000000000000000000000000000000000
--- a/legacy/scheduled_sampling/reader.py
+++ /dev/null
@@ -1,42 +0,0 @@
-from utils import RandomScheduleGenerator
-
-
-def gen_schedule_data(reader,
- schedule_type="linear",
- decay_a=0.75,
- decay_b=1000000):
- """
- Creates a data reader for scheduled sampling.
-
-    Output from the iterator created by the original reader will be
-    appended with "true_token_flag" to indicate whether to use the true token.
-
- :param reader: the original reader.
- :type reader: callable
- :param schedule_type: the type of sampling rate decay.
- :type schedule_type: str
- :param decay_a: the decay parameter a.
- :type decay_a: float
- :param decay_b: the decay parameter b.
- :type decay_b: float
-
- :return: the new reader with the field "true_token_flag".
- :rtype: callable
- """
- schedule_generator = RandomScheduleGenerator(schedule_type, decay_a,
- decay_b)
-
- def data_reader():
- for src_ids, trg_ids, trg_ids_next in reader():
- yield src_ids, trg_ids, trg_ids_next, \
- [0] + schedule_generator.processBatch(len(trg_ids) - 1)
-
- return data_reader
-
-
-feeding = {
- 'source_language_word': 0,
- 'target_language_word': 1,
- 'target_language_next_word': 2,
- 'true_token_flag': 3
-}
diff --git a/legacy/scheduled_sampling/train.py b/legacy/scheduled_sampling/train.py
deleted file mode 100644
index 3c8532f1b8998a4f7c393b37efbdb412f57f3ed5..0000000000000000000000000000000000000000
--- a/legacy/scheduled_sampling/train.py
+++ /dev/null
@@ -1,127 +0,0 @@
-import os
-import sys
-import gzip
-import argparse
-import distutils.util
-import paddle.v2 as paddle
-
-import reader
-from network_conf import seqToseq_net
-
-
-def parse_args():
- parser = argparse.ArgumentParser(
- description="PaddlePaddle Scheduled Sampling")
- parser.add_argument(
- '--schedule_type',
- type=str,
- default="linear",
- help='The type of sampling rate decay. Supported type: constant, linear, exponential, inverse_sigmoid. (default: %(default)s)'
- )
- parser.add_argument(
- '--decay_a',
- type=float,
- default=0.75,
- help='The sampling rate decay parameter a. (default: %(default)s)')
- parser.add_argument(
- '--decay_b',
- type=float,
- default=1000000,
- help='The sampling rate decay parameter b. (default: %(default)s)')
- parser.add_argument(
- '--beam_size',
- type=int,
- default=3,
- help='The width of beam expansion. (default: %(default)s)')
- parser.add_argument(
- "--use_gpu",
- type=distutils.util.strtobool,
- default=False,
- help="Use gpu or not. (default: %(default)s)")
- parser.add_argument(
- "--trainer_count",
- type=int,
- default=1,
- help="Trainer number. (default: %(default)s)")
- parser.add_argument(
- '--batch_size',
- type=int,
- default=32,
- help="Size of a mini-batch. (default: %(default)s)")
- parser.add_argument(
- '--num_passes',
- type=int,
- default=10,
- help="Number of passes to train. (default: %(default)s)")
- parser.add_argument(
- '--model_output_dir',
- type=str,
- default='models',
- help="The path for model to store. (default: %(default)s)")
-
- return parser.parse_args()
-
-
-def train(dict_size, batch_size, num_passes, beam_size, schedule_type, decay_a,
- decay_b, model_dir):
- optimizer = paddle.optimizer.Adam(
- learning_rate=1e-4,
- regularization=paddle.optimizer.L2Regularization(rate=1e-5))
-
- cost = seqToseq_net(dict_size, dict_size, beam_size)
-
- parameters = paddle.parameters.create(cost)
-
- trainer = paddle.trainer.SGD(cost=cost,
- parameters=parameters,
- update_equation=optimizer)
-
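-    # Wrap the shuffled WMT14 reader so that each sample also carries a
-    # true_token_flag sequence; decay_a and decay_b control how fast the
-    # sampling rate decays as target tokens are processed.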
- wmt14_reader = reader.gen_schedule_data(
- paddle.reader.shuffle(
- paddle.dataset.wmt14.train(dict_size), buf_size=8192),
- schedule_type,
- decay_a,
- decay_b)
-
- # define event_handler callback
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % 10 == 0:
- print "\nPass %d, Batch %d, Cost %f, %s" % (
- event.pass_id, event.batch_id, event.cost, event.metrics)
- else:
- sys.stdout.write('.')
- sys.stdout.flush()
- if isinstance(event, paddle.event.EndPass):
- # save parameters
- with gzip.open(
- os.path.join(model_dir, 'params_pass_%d.tar.gz' %
- event.pass_id), 'w') as f:
- trainer.save_parameter_to_tar(f)
-
- # start to train
- trainer.train(
- reader=paddle.batch(
- wmt14_reader, batch_size=batch_size),
- event_handler=event_handler,
- feeding=reader.feeding,
- num_passes=num_passes)
-
-
-if __name__ == '__main__':
- args = parse_args()
-
- if not os.path.isdir(args.model_output_dir):
- os.mkdir(args.model_output_dir)
-
- paddle.init(use_gpu=args.use_gpu, trainer_count=args.trainer_count)
-
- train(
- dict_size=30000,
- batch_size=args.batch_size,
- num_passes=args.num_passes,
- beam_size=args.beam_size,
- schedule_type=args.schedule_type,
- decay_a=args.decay_a,
- decay_b=args.decay_b,
- model_dir=args.model_output_dir)
diff --git a/legacy/scheduled_sampling/utils.py b/legacy/scheduled_sampling/utils.py
deleted file mode 100644
index 80a56f21298174459737c94ea3aa3902fbc348c5..0000000000000000000000000000000000000000
--- a/legacy/scheduled_sampling/utils.py
+++ /dev/null
@@ -1,48 +0,0 @@
-import math
-import numpy as np
-
-
-class RandomScheduleGenerator:
- """
-    The random sampling rate for the scheduled sampling algorithm, which uses a decayed
- sampling rate.
- """
-
- def __init__(self, schedule_type, a, b):
- """
-        schedule_type: the type of the decay. It supports constant, linear,
- exponential, and inverse_sigmoid right now.
- a: parameter of the decay (MUST BE DOUBLE)
- b: parameter of the decay (MUST BE DOUBLE)
- """
- self.schedule_type = schedule_type
- self.a = a
- self.b = b
- self.data_processed_ = 0
- self.schedule_computers = {
- "constant": lambda a, b, d: a,
- "linear": lambda a, b, d: max(a, 1 - d / b),
- "exponential": lambda a, b, d: pow(a, d / b),
- "inverse_sigmoid": lambda a, b, d: b / (b + math.exp(d * a / b)),
- }
- assert (self.schedule_type in self.schedule_computers)
- self.schedule_computer = self.schedule_computers[self.schedule_type]
-
- def getScheduleRate(self):
- """
-        Get the scheduled sampling rate. Users usually do not need
-        to call this method.
- """
- return self.schedule_computer(self.a, self.b, self.data_processed_)
-
- def processBatch(self, batch_size):
- """
- Get a batch_size of sampled indexes. These indexes can be passed to a
-        MultiplexLayer to select from the ground truth and generated samples
- from the last time step.
- """
- rate = self.getScheduleRate()
- numbers = np.random.rand(batch_size)
- indexes = (numbers >= rate).astype('int32').tolist()
- self.data_processed_ += batch_size
- return indexes
diff --git a/legacy/sequence_tagging_for_ner/.gitignore b/legacy/sequence_tagging_for_ner/.gitignore
deleted file mode 100644
index c281422d174b37b53a951b59f67748920f0cf857..0000000000000000000000000000000000000000
--- a/legacy/sequence_tagging_for_ner/.gitignore
+++ /dev/null
@@ -1,2 +0,0 @@
-*.pyc
-*.tar.gz
diff --git a/legacy/sequence_tagging_for_ner/README.md b/legacy/sequence_tagging_for_ner/README.md
deleted file mode 100644
index 9870e3cf2edb4e0a0514b33c59c91d861a8caf5d..0000000000000000000000000000000000000000
--- a/legacy/sequence_tagging_for_ner/README.md
+++ /dev/null
@@ -1,177 +0,0 @@
-Running the sample code in this directory requires PaddlePaddle v0.10.0. If the PaddlePaddle on your device is older than this, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update it.
-
----
-
-# Named Entity Recognition
-
-Below is a brief overview of this example's directory structure:
-
-```text
-.
-├── data              # data this example depends on
-│   ├── download.sh
-├── images            # figures used in this document
-├── index.html
-├── infer.py          # inference script
-├── network_conf.py   # model definition
-├── reader.py         # data reading interface
-├── README.md         # this document
-├── train.py          # training script
-└── utils.py          # common utility functions
-```
-
-
-## Introduction
-
-Named Entity Recognition (NER), also called proper-name recognition, identifies entities with specific meanings in text, mainly person names, place names, organization names, and other proper nouns, and is a fundamental problem in natural language processing. An NER task usually involves two parts, detecting entity boundaries and determining entity categories, and can be solved as a sequence labeling problem.
-
-Sequence labeling tasks can be divided into three categories: sequence classification, segment classification, and temporal classification [[1](#references)]. This example only considers segment classification, i.e., assigning a label to each element of the input sequence. For NER, since entity boundaries have to be marked, the label set defined by the [BIO tagging scheme](http://www.paddlepaddle.org/docs/develop/book/07.label_semantic_roles/index.cn.html) is commonly used. Below is an example of an NER tagging result:
-
-
-
-Figure 1. An example of the BIO tagging scheme
-
-
-Entity boundaries and categories can be read directly off the sequence labeling result. Similarly, tasks such as word segmentation, part-of-speech tagging, chunking, and [semantic role labeling](http://www.paddlepaddle.org/docs/develop/book/07.label_semantic_roles/index.cn.html) can all be solved via sequence labeling. The usual way to attack such problems with neural networks is to let the earlier layers learn a feature representation of the input and let the last layer complete the task on top of those features. For sequence labeling this typically means learning features with an RNN-based network and feeding them into a CRF that performs the labeling, which effectively replaces the linear model of a traditional CRF with a nonlinear neural network. The CRF is kept because it uses sentence-level likelihood and therefore handles the label bias problem better [[2](#references)]. This example builds its model on this idea; although NER is used as the example, the model applies to many other sequence labeling tasks.
-
-Because sequence labeling problems are so widespread, classic sequence models such as the [CRF](http://www.paddlepaddle.org/docs/develop/book/07.label_semantic_roles/index.cn.html) were developed, but most of them can only use local information or require hand-crafted features. With the progress of deep learning research, sequence models such as Recurrent Neural Networks (RNNs) can capture the dependencies between sequence elements and learn feature representations directly from the raw input text, which makes them better suited to sequence labeling tasks. For more background, see the [semantic role labeling](https://github.com/PaddlePaddle/book/blob/develop/07.label_semantic_roles/README.cn.md) chapter of the PaddleBook.
-
-## Model Details
-
-The input of an NER task is a sentence, and the goal is to identify the entity boundaries and categories in it. Following paper \[[2](#references)\], we apply only some simple preprocessing to the raw sentence: every word is lowercased, and whether the original word was capitalized is kept as an extra feature that is fed to the model together with the words. The model is shown in Figure 2 and works as follows:
-
-1. Construct the input
- - input 1 is the sentence sequence, in one-hot representation;
- - input 2 is the capitalization-mark sequence, which marks whether each word of the sentence is capitalized, also in one-hot representation;
-2. the one-hot sentence sequence and capitalization-mark sequence are mapped through lookup tables into sequences of real-valued word vectors;
-3. the two word-vector sequences from step 2 are fed into a bidirectional RNN, which learns feature representations of the input sequence and produces a new feature sequence;
-4. a CRF takes the features learned in step 3 as input and the label sequence as the supervision signal to perform the sequence labeling.
-
-
-
-Figure 2. Network architecture of the NER model
-
-
-
-## Data
-
-This example uses the [CoNLL 2003 NER task](http://www.clips.uantwerpen.be/conll2003/ner/). For copyright reasons, the original Reuters data must be requested separately (free of charge); please follow the instructions on the original website to obtain it.
-
-+ We only place a few samples in the `train` and `test` files under the `data` directory to illustrate the input data format.
-+ This example additionally depends on
- 1. a vocabulary of the input text,
- 2. pre-trained word vectors for the words in the vocabulary, and
- 3. a dictionary of the target labels.
- The label dictionary is already included in the `data` directory as `data/target.txt`. The input-text vocabulary and the pre-trained word vectors come from the [Stanford CS224d](http://cs224d.stanford.edu/) course assignment. **To run this example, first run the `download.sh` script in the `data` directory to download the vocabulary and the pre-trained word vectors.** Afterwards the two files are placed in the `data` directory as `data/vocab.txt` and `data/wordVectors.txt`, respectively.
-
-The raw CoNLL 2003 data looks as follows:
-
-```
-U.N. NNP I-NP I-ORG
-official NN I-NP O
-Ekeus NNP I-NP I-PER
-heads VBZ I-VP O
-for IN I-PP O
-Baghdad NNP I-NP I-LOC
-. . O O
-```
-
-- The first column is the original sentence.
-- The second and third columns are the part-of-speech tags and the syntactic chunk tags; this example does not use them.
-- The fourth column is the NER label in the I-TYPE scheme.
- - The main difference between I-TYPE and BIO lies in when the chunk-start mark is used: I-TYPE uses the B tag only on the second of two adjacent entities of the same category and uses the I tag everywhere else. Sentences are separated by blank lines.
-
-The processing and reading of the raw data is done in `reader.py` and mainly consists of the following steps (a sketch of the label conversion in step 2 follows the list):
-
-1. extract the sentences and labels from the raw data files and build the sentence and label sequences;
-2. convert the I-TYPE labels into BIO labels;
-3. lowercase the words of the sentence sequence and build the capitalization-mark sequence;
-4. look up the integer index of each word in the vocabulary.
-
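-The I-TYPE to BIO conversion of step 2 can be sketched as follows (a standalone illustration, not the exact code in `reader.py`): a token gets a `B-` prefix when it starts a new entity, i.e., it is the first token of the sentence or the previous token belongs to a different entity type; otherwise its tag is kept:
-
-```python
-def itype_to_bio(tags):
-    bio = []
-    prev = "O"
-    for tag in tags:
-        if tag != "O" and (prev == "O" or prev[2:] != tag[2:]):
-            bio.append("B-" + tag[2:])  # start of a new entity chunk
-        else:
-            bio.append(tag)             # continuation, an I-TYPE B tag, or O
-        prev = tag
-    return bio
-
-print(itype_to_bio(["I-ORG", "O", "I-PER", "I-PER", "I-LOC"]))
-# ['B-ORG', 'O', 'B-PER', 'I-PER', 'B-LOC']
-```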
-
-After preprocessing, a training sample contains three parts that are fed to the neural network: (1) the word sequence; (2) the capitalization-mark sequence; (3) the label sequence. The table below shows one training sample:
-
-| Word sequence | Capitalization mark | Label |
-| ------------- | ------------------- | ----- |
-| u.n. | 1 | B-ORG |
-| official | 0 | O |
-| ekeus | 1 | B-PER |
-| heads | 0 | O |
-| for | 0 | O |
-| baghdad | 1 | B-LOC |
-| . | 0 | O |
-
-## Running
-### Writing the data reading interface
-
-A custom data reading interface only needs a Python generator that parses one training sample from the raw input text. The `data_reader` function in [reader.py](./reader.py) reads the raw data and returns three inputs of type `paddle.data_type.integer_value_sequence` (the index of each word in the dictionary, whether the word is capitalized, and the index of the label in the label dictionary), matching the three `data_layer`s defined in `network_conf.ner_net`; a minimal sketch follows.
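-
-The sketch below outlines such a generator (hypothetical helper name and dictionaries; the actual logic lives in [reader.py](./reader.py)). The `0` fallback assumes the unknown-word entry (`UUUNKKK`) is the first line of `vocab.txt`:
-
-```python
-def make_reader(corpus, word_dict, label_dict):
-    def reader():
-        for words, labels in corpus:
-            # word indexes; unknown words map to the UUUNKKK entry
-            word_ids = [word_dict.get(w.lower(), 0) for w in words]
-            # 1 if the original token starts with a capital letter, else 0
-            mark_ids = [1 if w[0].isupper() else 0 for w in words]
-            label_ids = [label_dict[l] for l in labels]
-            yield word_ids, mark_ids, label_ids
-    return reader
-```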
-
-### Training
-
-1. Run `sh data/download.sh`
-2. Modify the `main` function of `train.py` to specify the data paths
-
- ```python
- main(
- train_data_file="data/train",
- test_data_file="data/test",
- vocab_file="data/vocab.txt",
- target_file="data/target.txt",
- emb_file="data/wordVectors.txt",
- model_save_dir="models/")
- ```
-
-3. Run `python train.py`. **Note: running it as-is uses the sample data; please substitute real labeled data.**
-
- ```text
- commandline: --use_gpu=False --trainer_count=1
- Initing parameters..
- Init parameters done.
- Pass 0, Batch 0, Cost 41.430110, {'ner_chunk.precision': 0.01587301678955555, 'ner_chunk.F1-score': 0.028368793427944183, 'ner_chunk.recall': 0.13333334028720856, 'error': 0.939393937587738}
- Test with Pass 0, Batch 0, {'ner_chunk.precision': 0.0, 'ner_chunk.F1-score': 0.0, 'ner_chunk.recall': 0.0, 'error': 0.16260161995887756}
- ```
-
-### Inference
-1. Modify the `main` function of [infer.py](./infer.py) to specify the path of the model to test, the test data, the dictionary files, and the path of the label file. The default parameters are:
-
- ```python
- infer(
- model_path="models/params_pass_0.tar.gz",
- batch_size=2,
- test_data_file="data/test",
- vocab_file="data/vocab.txt",
- target_file="data/target.txt")
- ```
-
-2. Run `python infer.py` in the terminal to start inference. You will see predictions like the following (partial output of a model trained for 500 passes):
-
- ```text
- cricket O
- - O
- leicestershire B-ORG
- take O
- over O
- at O
- top O
- after O
- innings O
- victory O
- . O
- london B-LOC
- 1996-08-30 O
- west B-MISC
- indian I-MISC
- all-rounder O
- phil B-PER
- simmons I-PER
- took O
- four O
-
- ```
-    The output has two columns separated by "\t": the first column contains the input words and the second the predicted labels. Multiple input sequences are separated by blank lines.
-
-
-## References
-
-1. Graves A. [Supervised Sequence Labelling with Recurrent Neural Networks](http://www.cs.toronto.edu/~graves/preprint.pdf)[J]. Studies in Computational Intelligence, 2013, 385.
-2. Collobert R, Weston J, Bottou L, et al. [Natural Language Processing (Almost) from Scratch](http://www.jmlr.org/papers/volume12/collobert11a/collobert11a.pdf)[J]. Journal of Machine Learning Research, 2011, 12(1):2493-2537.
diff --git a/legacy/sequence_tagging_for_ner/data/download.sh b/legacy/sequence_tagging_for_ner/data/download.sh
deleted file mode 100644
index 99d81c1e0949e47187cd082947117eb4e6bd888d..0000000000000000000000000000000000000000
--- a/legacy/sequence_tagging_for_ner/data/download.sh
+++ /dev/null
@@ -1,16 +0,0 @@
-if [ -f assignment2.zip ]; then
- echo "data exist"
-else
- wget http://cs224d.stanford.edu/assignment2/assignment2.zip
-fi
-
-if [ $? -eq 0 ];then
- unzip assignment2.zip
- cp assignment2_release/data/ner/wordVectors.txt ./data
- cp assignment2_release/data/ner/vocab.txt ./data
- rm -rf assignment2.zip assignment2_release
-else
- echo "download data error!" >> /dev/stderr
- exit 1
-fi
-
diff --git a/legacy/sequence_tagging_for_ner/data/target.txt b/legacy/sequence_tagging_for_ner/data/target.txt
deleted file mode 100644
index e0fa4d8f6654be07b4d1188750abb861d7c6f264..0000000000000000000000000000000000000000
--- a/legacy/sequence_tagging_for_ner/data/target.txt
+++ /dev/null
@@ -1,9 +0,0 @@
-B-LOC
-I-LOC
-B-MISC
-I-MISC
-B-ORG
-I-ORG
-B-PER
-I-PER
-O
diff --git a/legacy/sequence_tagging_for_ner/data/test b/legacy/sequence_tagging_for_ner/data/test
deleted file mode 100644
index 66163e1a869d57303117dd94d59ff01be05de8f7..0000000000000000000000000000000000000000
--- a/legacy/sequence_tagging_for_ner/data/test
+++ /dev/null
@@ -1,128 +0,0 @@
-CRICKET NNP I-NP O
-- : O O
-LEICESTERSHIRE NNP I-NP I-ORG
-TAKE NNP I-NP O
-OVER IN I-PP O
-AT NNP I-NP O
-TOP NNP I-NP O
-AFTER NNP I-NP O
-INNINGS NNP I-NP O
-VICTORY NN I-NP O
-. . O O
-
-LONDON NNP I-NP I-LOC
-1996-08-30 CD I-NP O
-
-West NNP I-NP I-MISC
-Indian NNP I-NP I-MISC
-all-rounder NN I-NP O
-Phil NNP I-NP I-PER
-Simmons NNP I-NP I-PER
-took VBD I-VP O
-four CD I-NP O
-for IN I-PP O
-38 CD I-NP O
-on IN I-PP O
-Friday NNP I-NP O
-as IN I-PP O
-Leicestershire NNP I-NP I-ORG
-beat VBD I-VP O
-Somerset NNP I-NP I-ORG
-by IN I-PP O
-an DT I-NP O
-innings NN I-NP O
-and CC O O
-39 CD I-NP O
-runs NNS I-NP O
-in IN I-PP O
-two CD I-NP O
-days NNS I-NP O
-to TO I-VP O
-take VB I-VP O
-over IN I-PP O
-at IN B-PP O
-the DT I-NP O
-head NN I-NP O
-of IN I-PP O
-the DT I-NP O
-county NN I-NP O
-championship NN I-NP O
-. . O O
-
-Their PRP$ I-NP O
-stay NN I-NP O
-on IN I-PP O
-top NN I-NP O
-, , O O
-though RB I-ADVP O
-, , O O
-may MD I-VP O
-be VB I-VP O
-short-lived JJ I-ADJP O
-as IN I-PP O
-title NN I-NP O
-rivals NNS I-NP O
-Essex NNP I-NP I-ORG
-, , O O
-Derbyshire NNP I-NP I-ORG
-and CC I-NP O
-Surrey NNP I-NP I-ORG
-all DT O O
-closed VBD I-VP O
-in RP I-PRT O
-on IN I-PP O
-victory NN I-NP O
-while IN I-SBAR O
-Kent NNP I-NP I-ORG
-made VBD I-VP O
-up RP I-PRT O
-for IN I-PP O
-lost VBN I-NP O
-time NN I-NP O
-in IN I-PP O
-their PRP$ I-NP O
-rain-affected JJ I-NP O
-match NN I-NP O
-against IN I-PP O
-Nottinghamshire NNP I-NP I-ORG
-. . O O
-
-After IN I-PP O
-bowling VBG I-NP O
-Somerset NNP I-NP I-ORG
-out RP I-PRT O
-for IN I-PP O
-83 CD I-NP O
-on IN I-PP O
-the DT I-NP O
-opening NN I-NP O
-morning NN I-NP O
-at IN I-PP O
-Grace NNP I-NP I-LOC
-Road NNP I-NP I-LOC
-, , O O
-Leicestershire NNP I-NP I-ORG
-extended VBD I-VP O
-their PRP$ I-NP O
-first JJ I-NP O
-innings NN I-NP O
-by IN I-PP O
-94 CD I-NP O
-runs VBZ I-VP O
-before IN I-PP O
-being VBG I-VP O
-bowled VBD I-VP O
-out RP I-PRT O
-for IN I-PP O
-296 CD I-NP O
-with IN I-PP O
-England NNP I-NP I-LOC
-discard VBP I-VP O
-Andy NNP I-NP I-PER
-Caddick NNP I-NP I-PER
-taking VBG I-VP O
-three CD I-NP O
-for IN I-PP O
-83 CD I-NP O
-. . O O
-
diff --git a/legacy/sequence_tagging_for_ner/data/train b/legacy/sequence_tagging_for_ner/data/train
deleted file mode 100644
index cbf3e678c555a3b6db26fd14e38889f040f048ca..0000000000000000000000000000000000000000
--- a/legacy/sequence_tagging_for_ner/data/train
+++ /dev/null
@@ -1,139 +0,0 @@
-EU NNP I-NP I-ORG
-rejects VBZ I-VP O
-German JJ I-NP I-MISC
-call NN I-NP O
-to TO I-VP O
-boycott VB I-VP O
-British JJ I-NP I-MISC
-lamb NN I-NP O
-. . O O
-
-Peter NNP I-NP I-PER
-Blackburn NNP I-NP I-PER
-
-BRUSSELS NNP I-NP I-LOC
-1996-08-22 CD I-NP O
-
-The DT I-NP O
-European NNP I-NP I-ORG
-Commission NNP I-NP I-ORG
-said VBD I-VP O
-on IN I-PP O
-Thursday NNP I-NP O
-it PRP B-NP O
-disagreed VBD I-VP O
-with IN I-PP O
-German JJ I-NP I-MISC
-advice NN I-NP O
-to TO I-PP O
-consumers NNS I-NP O
-to TO I-VP O
-shun VB I-VP O
-British JJ I-NP I-MISC
-lamb NN I-NP O
-until IN I-SBAR O
-scientists NNS I-NP O
-determine VBP I-VP O
-whether IN I-SBAR O
-mad JJ I-NP O
-cow NN I-NP O
-disease NN I-NP O
-can MD I-VP O
-be VB I-VP O
-transmitted VBN I-VP O
-to TO I-PP O
-sheep NN I-NP O
-. . O O
-
-Germany NNP I-NP I-LOC
-'s POS B-NP O
-representative NN I-NP O
-to TO I-PP O
-the DT I-NP O
-European NNP I-NP I-ORG
-Union NNP I-NP I-ORG
-'s POS B-NP O
-veterinary JJ I-NP O
-committee NN I-NP O
-Werner NNP I-NP I-PER
-Zwingmann NNP I-NP I-PER
-said VBD I-VP O
-on IN I-PP O
-Wednesday NNP I-NP O
-consumers NNS I-NP O
-should MD I-VP O
-buy VB I-VP O
-sheepmeat NN I-NP O
-from IN I-PP O
-countries NNS I-NP O
-other JJ I-ADJP O
-than IN I-PP O
-Britain NNP I-NP I-LOC
-until IN I-SBAR O
-the DT I-NP O
-scientific JJ I-NP O
-advice NN I-NP O
-was VBD I-VP O
-clearer JJR I-ADJP O
-. . O O
-
-" " O O
-We PRP I-NP O
-do VBP I-VP O
-n't RB I-VP O
-support VB I-VP O
-any DT I-NP O
-such JJ I-NP O
-recommendation NN I-NP O
-because IN I-SBAR O
-we PRP I-NP O
-do VBP I-VP O
-n't RB I-VP O
-see VB I-VP O
-any DT I-NP O
-grounds NNS I-NP O
-for IN I-PP O
-it PRP I-NP O
-, , O O
-" " O O
-the DT I-NP O
-Commission NNP I-NP I-ORG
-'s POS B-NP O
-chief JJ I-NP O
-spokesman NN I-NP O
-Nikolaus NNP I-NP I-PER
-van NNP I-NP I-PER
-der FW I-NP I-PER
-Pas NNP I-NP I-PER
-told VBD I-VP O
-a DT I-NP O
-news NN I-NP O
-briefing NN I-NP O
-. . O O
-
-He PRP I-NP O
-said VBD I-VP O
-further JJ I-NP O
-scientific JJ I-NP O
-study NN I-NP O
-was VBD I-VP O
-required VBN I-VP O
-and CC O O
-if IN I-SBAR O
-it PRP I-NP O
-was VBD I-VP O
-found VBN I-VP O
-that IN I-SBAR O
-action NN I-NP O
-was VBD I-VP O
-needed VBN I-VP O
-it PRP I-NP O
-should MD I-VP O
-be VB I-VP O
-taken VBN I-VP O
-by IN I-PP O
-the DT I-NP O
-European NNP I-NP I-ORG
-Union NNP I-NP I-ORG
-. . O O
-
diff --git a/legacy/sequence_tagging_for_ner/data/vocab.txt b/legacy/sequence_tagging_for_ner/data/vocab.txt
deleted file mode 100644
index 19d518f85ccf1d6237142fba003599039f9a9905..0000000000000000000000000000000000000000
--- a/legacy/sequence_tagging_for_ner/data/vocab.txt
+++ /dev/null
@@ -1,100232 +0,0 @@
-UUUNKKK
-the
-,
-.
-of
-and
-in
-"
-a
-to
-was
-is
-for
-as
-)
-(
-on
-with
-by
-'s
-he
-that
-from
-it
-his
-at
-an
-are
-were
-which
-
-
-this
-be
-or
-also
-has
-had
-their
-one
-not
-first
-but
-new
-its
-they
-have
-who
-after
-DGDGDGDG
-'
-her
-been
-other
-she
-;
-two
-when
-%
-there
-all
-into
-time
-during
-:
-more
-school
-years
-most
-can
-some
-only
-over
-many
-used
-such
-would
-up
-out
-city
-may
-him
-world
-where
--
-later
-under
-these
-between
-then
-about
-made
-state
-known
-than
-united
-DGDG.DG
-DGDGDG
-however
-while
-year
-being
-states
-part
-three
-university
-became
-both
-through
-them
-no
-including
-national
-age
-war
-well
-will
-before
-early
-name
-family
-history
-high
-series
-since
-south
-area
-until
-album
-american
-second
-film
-several
-team
-people
-north
-born
-called
-number
-group
-music
-$
-use
-now
-life
-county
-work
-so
-i
-against
-each
-released
-company
-game
-if
-town
-band
-government
-same
-system
-station
-played
-season
-because
-population
-john
-located
-any
-college
-house
-although
-show
-home
-west
-DG.DG
-line
-often
-end
-four
-found
-york
-those
-did
-public
-career
-like
-day
-following
-began
-back
-very
-general
-british
-around
-could
-another
-local
-major
-river
-named
-small
-district
-service
-international
-still
-large
-song
-party
-based
-former
-league
-place
-due
-won
-along
-members
-member
-own
-century
-east
-church
-best
-much
-within
-even
-club
-death
-park
-children
-u.s.
-long
-set
-include
-last
-&
-do
-took
-left
-march
-what
-held
-main
-original
-built
-single
-book
-form
-september
-order
-old
-great
-road
-every
-air
-different
-served
-students
-june
-january
-october
-games
-down
-football
-off
-power
-law
-president
-july
-just
-army
-living
-late
-april
-building
-november
-way
-community
-species
-king
-you
-country
-august
-again
-times
-died
-december
-television
-among
-though
-published
-become
-development
-five
-using
-near
-london
-village
-english
-make
-water
-radio
-england
-see
-play
-third
-included
-final
-received
-island
-various
-few
-version
-said
-top
-center
-education
-given
-currently
-street
-son
-popular
-side
-man
-led
-income
-ii
-having
-moved
-point
-program
-father
-support
-february
-court
-black
-player
-came
-central
-white
-together
-n't
-council
-went
-next
-political
-men
-according
-take
-force
-live
-without
-role
-created
-military
-title
-common
-usually
-division
-business
-western
-period
-once
-de
-example
-land
-record
-produced
-field
-does
-production
-control
-us
-married
-written
-union
-california
-established
-young
-average
-DG.DGDG
-similar
-battle
-considered
-list
-office
-median
-research
-we
-services
-current
-present
-started
-head
-run
-never
-route
-appeared
-character
-site
-australia
-originally
-canada
-act
-class
-others
-women
-rock
-schools
-st.
-six
-story
-records
-available
-art
-himself
-term
-returned
-america
-special
-region
-award
-northern
-modern
-little
-design
-size
-election
-important
-areas
-works
-information
-million
-days
-association
-continued
-further
-founded
-position
-french
-short
-society
-language
-royal
-george
-case
-german
-body
-eventually
-red
-video
-formed
-open
-southern
-today
-william
-james
-upon
-board
-seen
-india
-lost
-level
-males
-addition
-release
-good
-should
-developed
-throughout
-must
-result
-free
-#
-making
-department
-females
-hall
-right
-per
-san
-forces
-playing
-=
-project
-lake
-human
-songs
-joined
-social
-worked
-how
-science
-older
-railway
-uk
-working
-less
-director
-opened
-elected
-events
-thus
-network
-police
-features
-european
-recorded
-sometimes
-taken
-love
-track
-cup
-training
-space
-either
-full
-systems
-get
-summer
-despite
-process
-go
-championship
-light
-largest
-wife
-david
-fire
-total
-groups
-centre
-episode
-news
-professional
-away
-night
-rather
-!
-return
-europe
-instead
-featured
-across
-players
-lead
-australian
-help
-soon
-washington
-sold
-mother
-construction
-almost
-below
-includes
-kingdom
-aircraft
-range
-famous
-minister
-style
-able
-generally
-...
-eastern
-books
-designed
-wrote
-car
-canadian
-model
-green
-tour
-independent
-middle
-announced
-followed
-itself
-market
-japanese
-official
-private
-media
-civil
-success
-months
-shows
-female
-health
-close
-indian
-countries
-hit
-tv
-robert
-event
-magazine
-natural
-france
-sea
-involved
-possible
-performance
-germany
-performed
-change
-paul
-chief
-teams
-type
-sports
-personal
-race
-brother
-organization
-points
-rights
-base
-japan
-my
-parts
-especially
-above
-technology
-remained
-attack
-far
-half
-successful
-leading
-federal
-data
-seven
-hill
-come
-movement
-action
-arts
-bridge
-big
-operations
-provide
-outside
-replaced
-primary
-standard
-committee
-killed
-industry
-front
-future
-appointed
-fact
-valley
-star
-daughter
-michael
-study
-beginning
-coast
-introduced
-movie
-student
-ship
-runs
-round
-museum
-recent
-win
-township
-too
-provided
-bay
-described
-charles
-section
-sent
-least
-awards
-saw
-thomas
-gave
-texas
-stage
-channel
-roman
-signed
-match
-changed
-blue
-brought
-put
-ground
-means
-china
-household
-whose
-gold
-novel
-added
-festival
-lower
-real
-campaign
-traditional
-navy
-higher
-limited
-characters
-completed
-institute
-films
-find
-christian
-recently
-management
-strong
-person
-ever
-degree
-sound
-culture
-decided
-finally
-]
-medical
-previous
-interest
-bank
-airport
-ten
-[
-money
-trade
-studies
-families
-past
-food
-computer
-location
-leader
-practice
-post
-low
-uses
-academy
-programs
-debut
-census
-energy
-behind
-parliament
-active
-lord
-prior
-operation
-households
-grand
-god
-commercial
-numerous
-course
-earth
-henry
-significant
-via
-certain
-peter
-taking
-NNNUMMM
-husband
-theatre
-capital
-always
-allowed
-required
-placed
-bill
-here
-ireland
-previously
-competition
-individuals
-owned
-command
-running
-hand
-staff
-theory
-DGDG.DGDG
-province
-highway
-africa
-reached
-unit
-studio
-regular
-governor
-word
-los
-chicago
-hospital
-musical
-move
-structure
-mark
-economic
-test
-catholic
-week
-me
-child
-earlier
-particularly
-whom
-writing
-richard
-dr.
-referred
-companies
-eight
-virginia
-approximately
-related
-themselves
-activities
-fourth
-access
-chinese
-republic
-reported
-issues
-complete
-start
-coach
-/
-native
-appears
-particular
-entire
-foreign
-remains
-smith
-appearance
-give
-towards
-religious
-geography
-plant
-associated
-contains
-variety
-terms
-buildings
-rest
-individual
-changes
-senior
-better
-awarded
-source
-feature
-material
-cover
-mission
-takes
-notable
-officer
-library
-lines
-majority
-plan
-press
-?
-issue
-date
-security
-attended
-mostly
-directed
-our
-queen
-brown
-finished
-collection
-woman
-stations
-room
-fall
-alone
-thought
-names
-already
-problems
-engine
-additional
-mountain
-florida
-commission
-larger
-shown
-spread
-legal
-conference
-friends
-need
-italian
-captain
-effect
-status
-longer
-met
-campus
-plays
-ran
-angeles
-caused
-turn
-going
-cases
-closed
-initially
-empire
-seat
-units
-professor
-soviet
-software
-russian
-miles
-ended
-separate
-key
-policy
-st
-got
-winning
-congress
-nature
-recording
-regional
-forms
-fort
-1980s
-mary
-zealand
-provides
-complex
-tracks
-results
-cross
-spent
-annual
-ancient
-forced
-historical
-evidence
-wide
-products
-relationship
-wales
-property
-operated
-actually
-plot
-therefore
-becoming
-port
-stated
-view
-create
-administration
-yet
-enough
-nearly
-attempt
-square
-call
-typically
-author
-mainly
-makes
-foundation
-cities
-friend
-code
-pennsylvania
-stadium
-numbers
-done
-might
-heart
-master
-whether
-opening
-sir
-noted
-believed
-artists
-loss
-hours
-cause
-defeated
-turned
-edition
-islands
-increased
-allow
-assembly
-leaving
-financial
-subject
-lived
-african
-usa
-highest
-fight
-rate
-britain
-prime
-1970s
-stone
-la
-lives
-offered
-manager
-your
-train
-influence
-true
-italy
-executive
-primarily
-appear
-commonly
-passed
-subsequently
-prince
-label
-dance
-contract
-era
-ohio
-greater
-launched
-democratic
-sister
-report
-starting
-greek
-baseball
-artist
-places
-scene
-houses
-plans
-surface
-nearby
-historic
-format
-broadcast
-stories
-heavy
-entered
-background
-someone
-quickly
-1990s
-castle
-pacific
-represented
-claimed
-forest
-dead
-spanish
-medal
-note
-jersey
-actor
-failed
-directly
-engineering
-branch
-experience
-upper
-physical
-beach
-jack
-shot
-carolina
-retired
-metal
-face
-scotland
-rule
-deal
-serving
-justice
-secretary
-facilities
-whole
-camp
-weeks
-corporation
-troops
-transport
-responsible
-decision
-value
-mr.
-temple
-minor
-highly
-jewish
-oil
-helped
-inside
-flight
-direct
-ships
-words
-irish
-authority
-specific
-x
-smaller
-article
-brothers
-care
-consists
-writer
-nations
-function
-albums
-remaining
-marriage
-residents
-meeting
-meaning
-tournament
-produce
-comes
-speed
-spring
-nine
-parents
-effects
-1960s
-bbc
-ability
-blood
-fleet
-singer
-carried
-leave
-types
-guitar
-biography
-told
-supported
-increase
-youth
-joseph
-cultural
-levels
-rules
-except
-keep
-simply
-latter
-victory
-drive
-saint
-serve
-discovered
-method
-languages
-m
-basketball
-junior
-winter
-girl
-selected
-goal
-elements
-chart
-boston
-j.
-avenue
-martin
-nation
-cut
-widely
-formerly
-problem
-boy
-voice
-digital
-listed
-immediately
-operating
-van
-singles
-equipment
-raised
-idea
-demographics
-versions
-quality
-pass
-growth
-continue
-conditions
-amount
-length
-response
-lee
-tower
-probably
-product
-serves
-boys
-daily
-railroad
-illinois
-stop
-defense
-dark
-territory
-columbia
-asked
-hot
-paper
-host
-ice
-tom
-newspaper
-corps
-cost
-largely
-c.
-chairman
-peace
-silver
-offers
-powers
-treatment
-shortly
-cars
-edward
-officers
-naval
-vote
-likely
-basis
-managed
-minutes
-earned
-senate
-industrial
-arrived
-louis
-iii
-elections
-hold
-cast
-presented
-leaders
-continues
-stars
-internet
-a.
-bishop
-hard
-politics
-things
-intended
-girls
-creek
-proposed
-male
-models
-doctor
-store
-covered
-ontario
-concept
-b
-jones
-academic
-producer
-fellow
-crew
-clear
-1st
-championships
-studied
-seasons
-bus
-difficult
-parish
-episodes
-republican
-moving
-constructed
-secondary
-multiple
-morning
-economy
-becomes
-prominent
-month
-michigan
-hotel
-allows
-review
-poverty
-destroyed
-needed
-traffic
-lack
-removed
-grew
-ball
-machine
-unique
-border
-know
-overall
-mexico
-mayor
-job
-composed
-projects
-poor
-hockey
-meet
-claims
-website
-wanted
-frequently
-knowledge
-georgia
-advanced
-dutch
-urban
-sun
-commander
-initial
-DGDG
-candidate
-giving
-basic
-pressure
-worldwide
-revealed
-+
-color
-acquired
-mass
-francisco
-soldiers
-goes
-officially
-fish
-solo
-workers
-travel
-extended
-remain
-wing
-image
-communities
-feet
-featuring
-literature
-distribution
-fans
-acting
-adopted
-entertainment
-russia
-frank
-municipality
-issued
-wall
-racing
-gas
-super
-fighting
-leadership
-programming
-assistant
-latin
-estate
-completely
-israel
-direction
-19th
-mentioned
-c
-efforts
-versus
-divided
-scored
-exchange
-gained
-environment
-unlike
-joe
-rail
-organizations
-johnson
-sydney
-b.
-regiment
-classes
-sources
-arms
-charge
-tree
-graduated
-20th
-build
-attention
-tells
-asia
-goals
-bar
-contemporary
-offer
-score
-online
-critical
---
-combined
-fifth
-victoria
-creation
-prison
-double
-parties
-something
-safety
-scientific
-conservative
-fame
-normal
-activity
-paris
-applied
-origin
-reason
-mount
-editor
-am
-liberal
-tradition
-renamed
-purchased
-marine
-emperor
-matches
-golden
-comedy
-simple
-bands
-join
-mike
-scottish
-begins
-relatively
-appearances
-focus
-planned
-horse
-titles
-coming
-judge
-enemy
-says
-honor
-potential
-animals
-trial
-surrounding
-massachusetts
-reference
-look
-secret
-powerful
-clubs
-combat
-rugby
-electric
-gives
-figure
-growing
-accepted
-dedicated
-situated
-say
-approach
-wood
-scott
-box
-toronto
-fell
-comic
-champion
-kept
-fields
-trains
-olympics
-philadelphia
-split
-couples
-supreme
-purpose
-laws
-agreement
-bob
-alternative
-sales
-iron
-providing
-theme
-beyond
-price
-presence
-quite
-spain
-technical
-windows
-principal
-e.g.
-pop
-jazz
-guard
-broadcasting
-letter
-convention
-rose
-claim
-facility
-analysis
-protection
-affairs
-williams
-reach
-assigned
-headquarters
-tried
-attacks
-farm
-refers
-saying
-DGDGDG.DG
-captured
-rural
-etc.
-ordered
-falls
-content
-no.
-birds
-memory
-memorial
-classic
-occur
-constitution
-distance
-independence
-tropical
-damage
-2nd
-steve
-cricket
-hands
-fiction
-attempts
-entitled
-concert
-plants
-pay
-perhaps
-disease
-dvd
-prize
-deep
-settlement
-performing
-global
-share
-formation
-weapons
-expected
-resulting
-transferred
-nuclear
-labour
-capacity
-opposition
-towns
-portion
-believe
-necessary
-crime
-agency
-murder
-methods
-comics
-acts
-mountains
-reserve
-alexander
-allowing
-application
-birth
-drama
-jim
-jackson
-recognized
-stock
-duke
-freedom
-search
-situation
-suffered
-housing
-account
-applications
-learning
-sense
-bring
-references
-break
-examples
-resources
-actress
-percent
-teaching
-m.
-compared
-bass
-lies
-relations
-floor
-letters
-ring
-hits
-fox
-cell
-cd
-atlantic
-opera
-forward
-effort
-text
-receive
-organized
-kill
-agent
-lady
-materials
-capita
-declared
-card
-leaves
-joint
-subsequent
-holds
-religion
-architecture
-drug
-greatest
-squadron
-resulted
-unknown
-vehicle
-steel
-let
-tax
-journal
-read
-expanded
-revolution
-bc
-rome
-infantry
-kong
-occurred
-fully
-chosen
-roles
-sites
-prevent
-reduced
-jr.
-perform
-develop
-sport
-firm
-s.
-defeat
-w.
-s
-inc.
-conducted
-younger
-agreed
-matter
-oldest
-standards
-twice
-connected
-holy
-pakistan
-owner
-defined
-extensive
-1950s
-hills
-flying
-representatives
-genus
-succeeded
-vehicles
-impact
-volume
-reports
-exist
-grade
-settled
-zone
-web
-page
-cells
-garden
-stores
-supply
-ministry
-transportation
-planning
-signal
-question
-tony
-classical
-granted
-block
-generation
-chris
-sons
-hosted
-promoted
-gun
-choice
-positions
-kind
-ways
-minnesota
-returning
-existing
-functions
-refused
-platform
-mind
-hong
-notably
-buried
-regions
-expansion
-audience
-performances
-unable
-needs
-teacher
-guest
-labor
-taught
-animal
-table
-communications
-theater
-rise
-dropped
-grant
-alongside
-al
-escape
-visit
-figures
-intelligence
-description
-notes
-cemetery
-contact
-critics
-programme
-ocean
-conflict
-adult
-churches
-attempted
-paid
-suggested
-fictional
-medicine
-seats
-reading
-ends
-actual
-creating
-internal
-graduate
-draft
-opposed
-onto
-setting
-effective
-wilson
-heard
-educational
-felt
-winner
-piece
-non-families
-publication
-achieved
-downtown
-edge
-canal
-controversy
-sign
-incorporated
-fought
-d.
-planet
-sunday
-specifically
-alliance
-easily
-nothing
-householder
-citizens
-philosophy
-brand
-wild
-couple
-existence
-cancer
-seems
-crown
-environmental
-electronic
-oxford
-sides
-roads
-v
-borough
-bad
-pilot
-aired
-heavily
-soccer
-want
-administrative
-i.e.
-elizabeth
-stand
-lieutenant
-actions
-broke
-begin
-wrestling
-device
-e.
-follow
-bought
-showed
-carry
-h.
-mixed
-reasons
-hour
-causes
-covers
-rare
-yards
-doing
-increasing
-users
-user
-estimated
-serious
-earl
-marked
-respectively
-metropolitan
-why
-3rd
-motion
-proved
-calls
-extremely
-possibly
-trying
-positive
-fine
-properties
-employed
-identified
-500
-risk
-pope
-fuel
-connection
-introduction
-invasion
-yellow
-articles
-provincial
-rank
-slightly
-founder
-christmas
-aid
-santa
-offices
-lot
-representative
-melbourne
-harry
-target
-finds
-display
-orchestra
-strength
-condition
-permanent
-screen
-ford
-chemical
-dog
-eye
-entirely
-votes
-districts
-weight
-designated
-officials
-humans
-stay
-trees
-techniques
-arthur
-sets
-spirit
-armed
-kansas
-treaty
-speech
-meant
-peak
-missouri
-indiana
-taylor
-davis
-olympic
-km
-habitat
-nominated
-rich
-copies
-institutions
-ultimately
-failure
-inspired
-andrew
-palace
-authorities
-universe
-centuries
-obtained
-typical
-maryland
-titled
-derived
-jews
-losing
-studios
-object
-phase
-americans
-magic
-northwest
-sex
-core
-mobile
-jesus
-identity
-orders
-nfl
-sixth
-cold
-attacked
-scale
-regarded
-partner
-imperial
-lyrics
-transfer
-writers
-toward
-link
-somewhat
-holding
-supporting
-faculty
-herself
-presidential
-trust
-flag
-save
-bird
-pieces
-bowl
-sexual
-starring
-follows
-participated
-skills
-boat
-chain
-miss
-avoid
-picture
-premier
-beat
-heritage
-popularity
-converted
-interview
-criminal
-determined
-asian
-talk
-retirement
-moon
-containing
-netherlands
-trail
-courses
-duty
-write
-engines
-sciences
-tennessee
-abandoned
-occurs
-factory
-vice
-objects
-contain
-founding
-hundred
-legislative
-wars
-standing
-incident
-literary
-fast
-getting
-korea
-heat
-shared
-shape
-brief
-operates
-bell
-n
-races
-ending
-foot
-try
-villages
-developing
-eyes
-express
-glass
-publishing
-locations
-p.
-refer
-briefly
-closely
-weekly
-r.
-contained
-federation
-require
-weather
-bush
-agricultural
-rivers
-patients
-cards
-protect
-lewis
-passenger
-communist
-killing
-challenge
-represent
-task
-arrested
-operate
-successfully
-otherwise
-christ
-broken
-iraq
-reaction
-ranked
-degrees
-flow
-controlled
-driver
-looking
-twelve
-routes
-gain
-stephen
-storm
-combination
-passing
-charts
-values
-none
-causing
-singing
-launch
-occupied
-politician
-recognition
-details
-manchester
-structures
-circuit
-ideas
-items
-fair
-streets
-commissioned
-maintained
-fm
-ca
-views
-occasionally
-brian
-g
-producing
-homes
-deputy
-singapore
-enter
-cable
-think
-visited
-billboard
-defence
-neighborhood
-influenced
-gets
-poetry
-baby
-4th
-decades
-domestic
-muslim
-competed
-pair
-universities
-institution
-ones
-resistance
-municipal
-newly
-motor
-v.
-colorado
-membership
-importance
-chance
-stands
-hollywood
-oregon
-howard
-recordings
-criticism
-don
-movies
-communication
-vietnam
-external
-underground
-golf
-bodies
-opportunity
-apparently
-links
-wind
-corner
-banks
-file
-laid
-personnel
-elementary
-address
-kentucky
-really
-sector
-equal
-sam
-credit
-maintain
-faith
-injury
-thousands
-el
-educated
-d
-entry
-advantage
-formal
-strike
-receiving
-shop
-finish
-merged
-reform
-contributed
-productions
-counties
-opposite
-orange
-mill
-hope
-wins
-grounds
-philippines
-constituency
-interior
-ad
-dry
-mode
-novels
-ethnic
-reality
-selling
-normally
-sought
-cambridge
-junction
-ben
-ray
-battalion
-threatened
-scenes
-nor
-regarding
-athletic
-adjacent
-depending
-passes
-courts
-promotion
-technique
-chapter
-starts
-selection
-apart
-brigade
-southeast
-establishment
-regularly
-lands
-vocals
-daniel
-rear
-definition
-walls
-investigation
-sequence
-instruments
-circle
-musicians
-purposes
-lane
-mid
-detroit
-designs
-die
-champions
-scheduled
-consisted
-huge
-evening
-interests
-path
-funds
-extension
-cape
-egypt
-scoring
-agriculture
-distributed
-straight
-gone
-miller
-sale
-climate
-count
-employees
-documentary
-earliest
-turns
-trained
-devices
-distinguished
-engaged
-reduce
-businesses
-extra
-candidates
-door
-bottom
-soul
-forests
-fashion
-plane
-evil
-approved
-improved
-hair
-korean
-cathedral
-maximum
-plus
-instance
-brain
-attorney
-audio
-consisting
-contest
-requires
-simon
-difference
-folk
-formula
-l.
-sections
-leads
-caught
-francis
-ruled
-lincoln
-register
-confirmed
-coal
-statement
-fund
-promote
-quarter
-kings
-purchase
-differences
-piano
-starred
-clark
-debate
-representing
-capable
-goods
-voted
-universal
-beautiful
-factor
-arena
-pictures
-demand
-visual
-survived
-crisis
-wave
-duties
-charges
-driving
-houston
-negative
-landing
-distinct
-dates
-guns
-saturday
-emergency
-diego
-reviews
-satellite
-attached
-contrast
-drawn
-legend
-blues
-inhabitants
-reputation
-khan
-e
-von
-DGDGDGDGDG
-funding
-spot
-advance
-publications
-discovery
-returns
-guitarist
-describes
-colony
-teachers
-sell
-hamilton
-iowa
-berlin
-colonel
-controversial
-amongst
-residential
-bronze
-message
-considerable
-reaching
-__
-entrance
-accident
-hired
-origins
-mississippi
-step
-improve
-abc
-tries
-argued
-bear
-acid
-behavior
-southwest
-polish
-solution
-dave
-chamber
-height
-scheme
-biggest
-easy
-parliamentary
-rescue
-fighter
-squad
-passengers
-admitted
-manufacturing
-whilst
-fishing
-islamic
-painting
-alan
-dynasty
-billion
-northeast
-industries
-resigned
-ward
-shopping
-senator
-f.
-colleges
-images
-decade
-*
-cycle
-questions
-attend
-louisiana
-replace
-describe
-tests
-mine
-amateur
-flat
-owners
-wisconsin
-costs
-mathematics
-1930s
-allen
-cabinet
-skin
-establish
-hence
-thing
-arm
-sri
-tell
-waters
-walk
-aspects
-respect
-subjects
-violence
-greece
-capture
-element
-massive
-severe
-mean
-dean
-arizona
-constant
-anything
-|
-sweden
-heads
-credited
-organisation
-relative
-session
-charged
-factors
-expressed
-scholars
-wear
-persons
-whereas
-determine
-assumed
-wings
-douglas
-harbor
-visitors
-iv
-showing
-involving
-testing
-meanwhile
-allied
-commonwealth
-increasingly
-learned
-everything
-engineer
-category
-mining
-18th
-patrick
-sky
-albert
-walter
-connecticut
-filled
-anderson
-uss
-weapon
-bond
-terminal
-musician
-orthodox
-afterwards
-oklahoma
-equivalent
-grown
-reception
-consider
-calling
-alabama
-gene
-temperature
-divisions
-additionally
-gay
-chose
-pattern
-soundtrack
-gallery
-charter
-vision
-guide
-investment
-parks
-hosts
-trip
-progressive
-compete
-measure
-requirements
-frequency
-networks
-disney
-components
-invited
-joining
-headed
-proper
-poland
-legislature
-focused
-tank
-fixed
-adam
-represents
-installed
-rooms
-well-known
-shore
-philip
-truth
-markets
-pro
-reign
-manner
-ago
-5th
-princess
-crossing
-strip
-meets
-participate
-colonial
-chapel
-fan
-significantly
-residence
-swedish
-responsibility
-progress
-twenty
-gardens
-iran
-corporate
-billy
-tennis
-traditionally
-moscow
-roughly
-indeed
-worth
-hero
-unusual
-baron
-suburb
-exists
-committed
-feel
-bruce
-stopped
-poet
-modified
-coverage
-understanding
-interstate
-gulf
-gordon
-greatly
-broad
-moore
-mile
-physics
-cleveland
-ceremony
-affected
-documents
-dream
-draw
-marketing
-boundary
-chair
-speak
-grow
-atlanta
-medieval
-separated
-anthony
-releases
-cbs
-output
-continuing
-medium
-mall
-learn
-watch
-neither
-appeal
-assistance
-miami
-insurance
-faced
-johnny
-request
-useful
-bible
-baltimore
-composition
-dan
-carrying
-arrival
-and/or
-salt
-registered
-pool
-budget
-composer
-replacement
-partnership
-legislation
-brazil
-kevin
-tribe
-parallel
-mental
-context
-broadway
-roll
-threat
-map
-experienced
-fired
-accused
-economics
-extreme
-drew
-strategy
-g.
-defensive
-appearing
-policies
-observed
-camera
-eric
-giant
-transit
-legacy
-coastal
-aviation
-license
-seattle
-arab
-collected
-roger
-kennedy
-inner
-i.
-losses
-ownership
-interested
-stood
-decline
-tools
-changing
-effectively
-fly
-fantasy
-offering
-le
-electrical
-directors
-ep
-outstanding
-rapid
-bureau
-lawrence
-trading
-storage
-retained
-anniversary
-actors
-limit
-phone
-jimmy
-finding
-bachelor
-hundreds
-maria
-nbc
-usage
-ages
-ride
-suit
-peninsula
-restored
-missing
-solid
-seventh
-max
-destruction
-missions
-everyone
-accompanied
-concerns
-metres
-machines
-dc
-electoral
-instrument
-restaurant
-linked
-exception
-superior
-suggests
-characteristics
-harvard
-centers
-ensure
-springs
-strongly
-animated
-achieve
-slow
-d.c.
-drugs
-survey
-rejected
-offensive
-dublin
-ryan
-journey
-kent
-advertising
-ridge
-mouth
-knight
-pittsburgh
-conservation
-tim
-processes
-benefit
-existed
-soil
-journalist
-compilation
-expedition
-delivered
-practices
-string
-movements
-microsoft
-rapidly
-aged
-newspapers
-finishing
-riding
-t
-safe
-wine
-rival
-dam
-desert
-artillery
-grey
-spiritual
-franklin
-relief
-prepared
-protein
-wounded
-jerry
-steam
-celebrated
-samuel
-lakes
-styles
-mix
-coalition
-injured
-personality
-frame
-ali
-happened
-stages
-agents
-israeli
-choose
-birmingham
-'m
-speaking
-horses
-spoken
-opinion
-shooting
-cdp
-finance
-accept
-accounts
-gang
-filmed
-yorkshire
-successor
-contributions
-rates
-gate
-suicide
-visible
-thousand
-maintenance
-resolution
-'re
-norway
-dragon
-welsh
-symbol
-uncle
-struck
-tunnel
-fear
-tribes
-mrs.
-associate
-damaged
-r
-finals
-disc
-stayed
-transmission
-temporary
-principle
-sisters
-roots
-colors
-adapted
-frequent
-rice
-ross
-seconds
-tend
-signs
-entering
-childhood
-dallas
-bomb
-del
-hampshire
-option
-branches
-adding
-drummer
-orleans
-authors
-influential
-moment
-strategic
-begun
-extent
-portrayed
-speaker
-anyone
-socialist
-covering
-translation
-bringing
-pick
-harris
-principles
-kelly
-representation
-territories
-processing
-favor
-forming
-illegal
-17th
-belief
-switzerland
-touring
-occupation
-bit
-austria
-concerned
-perfect
-window
-domain
-6th
-print
-summary
-liverpool
-operational
-instrumental
-clearly
-anne
-supporters
-drums
-measures
-ed
-laboratory
-cat
-7th
-happy
-nick
-dating
-evolution
-dangerous
-surgery
-governments
-vocal
-turning
-involvement
-integrated
-row
-clan
-consecutive
-russell
-quebec
-masters
-concerts
-creative
-priest
-rarely
-minute
-angel
-montreal
-completion
-t.
-patrol
-400
-printed
-raise
-gradually
-subtropical
-indians
-papers
-mechanical
-borders
-diocese
-knew
-malaysia
-sees
-frederick
-moves
-experimental
-charlie
-customers
-yard
-translated
-computers
-destroy
-snow
-constitutional
-listing
-tourism
-resident
-bears
-athletics
-footballer
-painted
-weekend
-credits
-formally
-cash
-architect
-involves
-occasions
-stewart
-alive
-mills
-adams
-co.
-themes
-nevertheless
-norman
-script
-treated
-ready
-jeff
-depression
-metro
-rolling
-airlines
-equipped
-declined
-swimming
-assault
-marshall
-rising
-conduct
-wheel
-grades
-talent
-richmond
-surrounded
-apple
-edinburgh
-crash
-keeping
-abbey
-matters
-historically
-pain
-gary
-leg
-buy
-wooden
-commissioner
-mainstream
-lasted
-opponent
-wayne
-queensland
-periods
-jane
-classified
-enjoyed
-send
-alleged
-sugar
-zero
-column
-sponsored
-arranged
-channels
-meters
-protected
-daughters
-essentially
-boundaries
-portuguese
-employment
-revolutionary
-austin
-h
-vancouver
-jordan
-criticized
-annually
-jason
-outer
-vessels
-vast
-tail
-matt
-et
-allies
-l
-andy
-thompson
-traditions
-literacy
-shares
-wants
-appropriate
-saints
-winners
-correct
-similarly
-drop
-edited
-cavalry
-attracted
-seem
-1900
-widespread
-walker
-abilities
-lay
-licensed
-component
-latest
-mounted
-uniform
-franchise
-ranks
-solar
-scientists
-dna
-narrow
-carrier
-prisoners
-dogs
-flows
-marvel
-produces
-document
-concern
-warner
-helping
-meetings
-visiting
-retail
-files
-practical
-islam
-legs
-vessel
-agencies
-surviving
-expensive
-settlers
-secure
-apparent
-civilian
-remainder
-tokyo
-ian
-buses
-guy
-DGDG.DGDGDGDGDGDG
-throne
-alternate
-f
-morgan
-charity
-16th
-supplies
-patient
-.DGDGDG
-shops
-punk
-brooklyn
-jobs
-1920s
-trophy
-seek
-norfolk
-mail
-fit
-expression
-endemic
-sessions
-else
-k
-blocks
-schedule
-hunter
-obtain
-passage
-substantial
-singh
-benefits
-margaret
-glasgow
-producers
-supports
-technologies
-engineers
-twin
-limits
-belgium
-genre
-apply
-nba
-friday
-asks
-drawing
-seeing
-toured
-christopher
-indicate
-p
-add
-telephone
-o
-carbon
-exhibition
-relationships
-unless
-eleven
-commons
-colour
-turkey
-excellent
-submarine
-wearing
-tall
-hardware
-replacing
-fruit
-statistics
-norwegian
-sarah
-adaptation
-denied
-partners
-enzyme
-campbell
-utah
-detailed
-encouraged
-rain
-pearl
-nelson
-pages
-breaking
-alex
-holiday
-immediate
-admiral
-hunting
-ahead
-boats
-roof
-victims
-departure
-partly
-remove
-struggle
-videos
-viewed
-indicated
-stanley
-thereafter
-lists
-soft
-claiming
-minimum
-unsuccessful
-hearing
-afghanistan
-cousin
-operator
-stating
-closer
-adventure
-minority
-falling
-driven
-fresh
-voting
-percentage
-walking
-segment
-poem
-robinson
-resort
-server
-competitions
-restoration
-commerce
-fred
-desire
-hidden
-wright
-jerusalem
-flowers
-innings
-lose
-honour
-sounds
-oak
-picked
-provinces
-conventional
-jean
-buffalo
-alberta
-adventures
-stream
-di
-roy
-beauty
-concluded
-infrastructure
-wedding
-plate
-carter
-wildlife
-debuted
-persian
-eighth
-cited
-balance
-christianity
-identical
-warren
-paintings
-overseas
-possibility
-pitch
-hole
-gray
-distinction
-mechanism
-tied
-farmers
-steps
-carl
-statistical
-cook
-experiences
-rating
-manufactured
-virgin
-barry
-elsewhere
-crowd
-lawyer
-continental
-hunt
-graham
-layer
-clothing
-vary
-diverse
-survive
-clinton
-opportunities
-12th
-logo
-matthew
-remote
-ferry
-honorary
-virtual
-shall
-kids
-decides
-congressional
-historian
-chemistry
-prix
-seeking
-false
-phrase
-appointment
-copy
-prices
-artistic
-airline
-filed
-supposed
-democrat
-concrete
-circumstances
-slowly
-restaurants
-besides
-las
-designer
-cargo
-mexican
-nomination
-soldier
-affair
-revival
-proposal
-tours
-hms
-continuous
-suffering
-understand
-collections
-8th
-ncaa
-favorite
-liberty
-immigrants
-denmark
-dominated
-stable
-executed
-honors
-enterprise
-collaboration
-evolved
-clinical
-missile
-symphony
-magazines
-conversion
-dollars
-essential
-locally
-ceased
-fc
-truck
-exclusively
-revenue
-tool
-possession
-drum
-attempting
-bristol
-mp
-enemies
-ratio
-dispute
-answer
-larry
-morris
-discussion
-phil
-interface
-bonus
-confused
-distinctive
-absence
-impossible
-injuries
-mac
-relation
-arkansas
-13th
-unfortunately
-generated
-advice
-judges
-populations
-arrest
-emerged
-shell
-rocks
-acted
-qualified
-theories
-connections
-manhattan
-peoples
-battles
-dramatic
-index
-tape
-feed
-wrong
-summit
-loan
-assist
-tourist
-portland
-biological
-intersection
-panel
-defeating
-repeated
-genetic
-marks
-dick
-dominant
-hawaii
-robin
-overview
-murray
-cinema
-affiliated
-taiwan
-plain
-confederate
-publisher
-thailand
-merger
-increases
-competitive
-alcohol
-maine
-10th
-nova
-cave
-grace
-argument
-doors
-broadcasts
-delta
-worn
-thinking
-hurricane
-influences
-parent
-neck
-indonesia
-anime
-harbour
-interpretation
-exercise
-electricity
-ultimate
-friendly
-mediterranean
-consumer
-believes
-usual
-self
-terry
-pure
-patterns
-bound
-egyptian
-drafted
-transition
-saved
-phoenix
-sword
-stones
-ruling
-competing
-lights
-exact
-parker
-indigenous
-trials
-worship
-15th
-eagle
-variations
-sand
-partially
-studying
-volunteer
-10,000
-apartment
-victor
-heights
-dj
-sailed
-puerto
-belt
-diamond
-devoted
-recommended
-explained
-tamil
-suitable
-mines
-raf
-comprehensive
-ghost
-loop
-baker
-platforms
-ann
-crystal
-adults
-dancing
-violent
-cincinnati
-attributed
-bed
-fairly
-reporter
-extensively
-poems
-argentina
-turkish
-decisions
-coat
-attending
-volunteers
-originated
-sacred
-displayed
-alice
-finland
-preserved
-participation
-terminus
-flights
-sweet
-traveled
-merchant
-seemed
-longest
-classification
-bobby
-writings
-merely
-y
-concerning
-occasion
-siege
-graphics
-alpha
-danish
-amendment
-meat
-mr
-drivers
-calendar
-mitchell
-regulations
-coaching
-plastic
-exactly
-forum
-muslims
-spending
-nickname
-exposed
-recovered
-alaska
-sentence
-1940s
-facing
-publicly
-monster
-athens
-symptoms
-14th
-alfred
-1905
-victorian
-graduating
-rangers
-guilty
-christians
-exposure
-ordinary
-density
-caribbean
-arrangement
-enforcement
-moist
-patent
-restricted
-looks
-arriving
-monastery
-shadow
-sing
-hull
-jurisdiction
-identify
-serial
-donald
-categories
-germans
-database
-9th
-implemented
-venue
-profile
-tribute
-festivals
-participants
-contribution
-ratings
-psychology
-procedure
-opponents
-hip
-archbishop
-bases
-moral
-animation
-reveals
-farming
-reportedly
-departments
-hudson
-woods
-manufacturer
-subsidiary
-statue
-ranging
-ambassador
-abroad
-pc
-clay
-aim
-dakota
-strange
-eat
-birthday
-sufficient
-prove
-helps
-sort
-interesting
-delaware
-cattle
-signals
-tanks
-boss
-monument
-jefferson
-municipalities
-descent
-martial
-initiative
-heroes
-duo
-affiliate
-implementation
-developments
-commanded
-locomotives
-11th
-closing
-atmosphere
-switch
-gap
-emphasis
-looked
-horror
-furthermore
-noise
-bone
-chase
-sole
-bridges
-magnetic
-solutions
-grammar
-input
-trouble
-lie
-portions
-rebuilt
-keith
-triple
-radar
-basin
-crimes
-gospel
-alien
-giants
-abuse
-eggs
-select
-ties
-railways
-peaked
-waste
-rocky
-shift
-editions
-scholarship
-mention
-monday
-organic
-muhammad
-tag
-deployed
-elder
-regime
-fled
-error
-researchers
-delivery
-controls
-antonio
-automatic
-youngest
-landed
-nazi
--DGDG.DGDGDGDGDGDG
-hindu
-eddie
-guests
-recovery
-pink
-assets
-deaths
-approval
-organ
-voters
-carries
-liquid
-foster
-'d
-trinity
-net
-reduction
-parking
-'ve
-characterized
-exit
-suggest
-baptist
-democracy
-clean
-preferred
-pursue
-kim
-anna
-tube
-venture
-waiting
-rifle
-legendary
-historians
-wish
-volumes
-scout
-pm
-maps
-root
-disaster
-writes
-manufacturers
-significance
-landscape
-benjamin
-warfare
-raw
-suspended
-depth
-rebellion
-monthly
-preparation
-cult
-characteristic
-bright
-experiment
-protocol
-cooper
-concentration
-campaigns
-variation
-decide
-penalty
-exclusive
-wolf
-critic
-sporting
-eliminated
-thanks
-chess
-instruction
-interviews
-pioneer
-matrix
-amounts
-swiss
-breeding
-demolished
-habitats
-therapy
-aimed
-sample
-wells
-ken
-radical
-fifteen
-jonathan
-vegas
-experiments
-marry
-specialized
-citizen
-rabbi
-warriors
-mystery
-fighters
-guards
-departed
-sequel
-ceo
-linear
-rocket
-random
-cartoon
-thirty
-1,000
-options
-rogers
-custom
-touch
-recreation
-barbara
-camps
-virtually
-grove
-tone
-comparison
-programmes
-escaped
-viewers
-execution
-behalf
-dennis
-logic
-bid
-connecting
-slaves
-dated
-ron
-1909
-protest
-alumni
-jet
-telling
-cotton
-arabic
-collins
-gates
-kilometers
-tenure
-architectural
-gift
-nicholas
-landmark
-shield
-faster
-so-called
-batting
-traded
-permitted
-parade
-beer
-battery
-knights
-undergraduate
-erected
-wealthy
-gods
-variable
-ticket
-democrats
-achievement
-difficulty
-vertical
-lords
-wealth
-artificial
-renowned
-pilots
-spoke
-harrison
-mainland
-danger
-suburbs
-turner
-angle
-milk
-ninth
-deck
-manage
-prayer
-gear
-valuable
-receives
-spend
-ottawa
-quick
-donated
-slave
-aside
-da
-texts
-topics
-w
-concepts
-fundamental
-un
-weak
-immigration
-choir
-banned
-'ll
-heaven
-nevada
-establishing
-accessible
-jay
-invented
-leeds
-copper
-colonies
-bills
-doctrine
-victim
-sharp
-resumed
-manual
-thin
-clock
-fill
-dress
-tigers
-illness
-societies
-rounds
-catch
-1903
-lions
-oxygen
-tea
-permission
-radiation
-feeling
-romantic
-managing
-1901
-dr
-package
-jump
-sentenced
-sung
-regard
-depicted
-vienna
-indicates
-cutting
-conclusion
-spaces
-collapse
-soap
-signing
-simultaneously
-scores
-funeral
-indoor
-ideal
-repair
-convinced
-handle
-switched
-coins
-theology
-portugal
-hell
-aftermath
-diet
-speakers
-maintains
-tales
-situations
-relocated
-favour
-oliver
-fun
-ottoman
-divine
-ill
-discussed
-denver
-coffee
-curriculum
-ministers
-certified
-holland
-aspect
-flew
-fallen
-ban
-medals
-ranking
-inspiration
-leagues
-sang
-powered
-posted
-manga
-drink
-presents
-adelaide
-expert
-assisted
-difficulties
-mtv
-roberts
-convicted
-descendants
-travels
-debt
-hear
-teach
-hebrew
-biology
-cultures
-tommy
-placing
-pole
-pulled
-ongoing
-outdoor
-vs.
-nhl
-burning
-corresponding
-succession
-mathematical
-missed
-destination
-resource
-enters
-ukraine
-sitting
-vi
-sleep
-connect
-guinea
-nebraska
-tiger
-designation
-ted
-k.
-owns
-ask
-removal
-crossed
-fitted
-cap
-promoting
-vocalist
-1902
-dual
-rush
-stored
-fate
-diseases
-reducing
-expand
-worst
-internationally
-demonstrated
-palm
-faces
-revenge
-traveling
-explain
-judicial
-taxes
-ralph
-roosevelt
-danny
-full-time
-pine
-evans
-cooperation
-referring
-craig
-defend
-suddenly
-murdered
-ports
-housed
-harold
-syndrome
-filming
-reaches
-explains
-painter
-scholar
-funded
-hop
-friendship
-elite
-intention
-bangladesh
-rick
-lifetime
-locomotive
-mm
-warning
-lloyd
-jury
-trio
-flash
-craft
-examination
-reporting
-boulevard
-settlements
-printing
-respective
-isolated
-cuba
-readers
-responded
-devil
-bars
-initiated
-trustees
-accurate
-pupils
-improvement
-manor
-celebration
-philippine
-lanka
-austrian
-drinking
-till
-beliefs
-mixture
-quantum
-en
-charlotte
-1861
-pointed
-improvements
-towers
-stress
-susan
-consistent
-boxing
-grandfather
-r&b
-literally
-protestant
-wider
-aboard
-virus
-stuart
-focuses
-shipping
-inducted
-boards
-reverse
-seal
-raising
-essex
-extend
-negotiations
-objective
-flood
-us$
-operators
-intermediate
-approaches
-hugh
-burned
-completing
-reflect
-detail
-check
-ruler
-silent
-gauge
-breed
-brisbane
-occasional
-journalism
-efficient
-intellectual
-wire
-veteran
-load
-targets
-ranges
-perceived
-affect
-globe
-talking
-aware
-sum
-demands
-commenced
-neighboring
-armies
-stops
-specialist
-edwards
-illustrated
-yale
-pitched
-tale
-manitoba
-touchdown
-prestigious
-variant
-maritime
-shoot
-cardinal
-regardless
-tie
-shakespeare
-arrive
-partial
-client
-21st
-muscle
-easier
-lowest
-consequently
-proof
-eldest
-angels
-consist
-putting
-attraction
-leo
-blind
-closure
-compound
-sheffield
-temporarily
-suburban
-afternoon
-framework
-derby
-wheels
-1890
-dollar
-perspective
-customer
-waves
-luke
-thereby
-lion
-brick
-bernard
-statements
-bat
-doctors
-efficiency
-attendance
-surprise
-acoustic
-reed
-centres
-wickets
-structural
-electronics
-lebanon
-acquisition
-dismissed
-unincorporated
-finnish
-feelings
-preserve
-codes
-steven
-j
-gameplay
-ma
-nights
-measured
-belongs
-mars
-1898
-remembered
-unions
-delhi
-premiered
-hart
-intense
-dealing
-approached
-skill
-jon
-familiar
-visits
-rob
-scientist
-displays
-commentary
-f.c.
-shorter
-eagles
-sharing
-montgomery
-maintaining
-tech
-signature
-iraqi
-mansion
-collective
-handling
-eve
-earning
-attacking
-renaissance
-buddhist
-spectrum
-detective
-ltd.
-voiced
-berkeley
-lodge
-confusion
-neil
-recognised
-beating
-seed
-fortune
-extinct
-disk
-firing
-1862
-pat
-requested
-noble
-vincent
-crater
-somerset
-thirteen
-destroyer
-wake
-reservoir
-requiring
-cancelled
-wallace
-followers
-pete
-disbanded
-organised
-1865
-knows
-oakland
-anchor
-predecessor
-arrives
-notice
-preservation
-newcastle
-pride
-stolen
-iranian
-shut
-100,000
-administered
-comments
-photography
-fluid
-purple
-perry
-breaks
-dying
-grave
-forth
-1885
-lesser
-mercury
-properly
-racial
-interchange
-understood
-vector
-bombing
-fewer
-paint
-footage
-shah
-actively
-declaration
-photographs
-empty
-bull
-butler
-1896
-backing
-revised
-sure
-liberation
-prisoner
-militia
-relevant
-diplomatic
-pushed
-realized
-ukrainian
-casualties
-sight
-gender
-disorder
-suspension
-helicopter
-ibm
-discovers
-obvious
-greg
-comprises
-elevation
-crosses
-discover
-constantly
-rider
-jail
-1899
-supplied
-madison
-axis
-gaining
-professionals
-3d
-tournaments
-pronounced
-tested
-abolished
-unity
-territorial
-reader
-durham
-argue
-non-profit
-robot
-fifa
-graduation
-receiver
-libraries
-1864
-mirror
-compact
-retreat
-enjoy
-sending
-newport
-helen
-advocate
-awareness
-namely
-hughes
-manages
-freight
-shock
-recruited
-burns
-interaction
-seeds
-farms
-northwestern
-fat
-insects
-catherine
-travelled
-phenomenon
-dies
-automobile
-involve
-java
-entity
-describing
-raid
-pradesh
-mouse
-rico
-resignation
-moderate
-guardian
-congregation
-agree
-juan
-outbreak
-restrictions
-inn
-particles
-dawn
-surrender
-varied
-contracts
-1895
-isle
-gilbert
-murphy
-rings
-dynamic
-veterans
-rhythm
-1894
-nashville
-belonging
-behaviour
-currency
-kilometres
-proteins
-batman
-flower
-diesel
-secured
-romania
-participating
-molecular
-sultan
-welfare
-lock
-qualifying
-hospitals
-seriously
-banking
-talks
-lisa
-generations
-isbn
-prototype
-prepare
-requirement
-tactical
-dependent
-1863
-explosion
-severely
-pirates
-inherited
-coaches
-dragons
-singers
-closest
-archaeological
-vermont
-.DG
-thick
-hungary
-killer
-honours
-asking
-plains
-presidency
-sailing
-allegedly
-ibn
-governed
-functional
-bishops
-graphic
-submitted
-kills
-separation
-configuration
-sandy
-consumption
-autumn
-twentieth
-updated
-procedures
-praised
-herbert
-pitcher
-exile
-canyon
-kiss
-clients
-dreams
-automatically
-array
-regulation
-accomplished
-bowling
-informed
-mutual
-considerably
-challenges
-1897
-der
-drove
-methodist
-long-term
-organisations
-promised
-prevented
-lp
-considering
-integration
-tourists
-variants
-hitler
-colours
-frontier
-grass
-optical
-necessarily
-seminary
-teeth
-reformed
-ski
-dubbed
-disappeared
-westminster
-celebrity
-columbus
-deals
-sat
-punishment
-advisory
-ensemble
-gregory
-predominantly
-ph.d.
-chiefs
-repeatedly
-companion
-bassist
-independently
-smooth
-karl
-rated
-messages
-tissue
-lighting
-explanation
-armenian
-interactive
-customs
-mysterious
-photo
-playoffs
-balls
-warrior
-1889
-contested
-manufacture
-reforms
-corruption
-harvey
-keys
-belgian
-perth
-sox
-foods
-sean
-survival
-theological
-presentation
-pack
-jan
-samples
-accommodate
-jacob
-altered
-byzantine
-varies
-emotional
-quarterback
-newer
-naturally
-divorce
-newton
-fifty
-creates
-governing
-wellington
-connects
-massacre
-shots
-p.m.
-nintendo
-appeals
-arnold
-psychological
-elevated
-short-lived
-temples
-fbi
-brings
-neighbouring
-diameter
-sullivan
-whatever
-excellence
-relating
-laser
-associations
-developers
-des
-fee
-spell
-cool
-stronger
-chorus
-1893
-southeastern
-cuts
-celtic
-sheriff
-du
-cm
-lift
-fa
-failing
-queens
-atomic
-worlds
-tasks
-bath
-wireless
-venues
-releasing
-conjunction
-algorithm
-equally
-belong
-nasa
-varying
-deemed
-poll
-sixteen
-wore
-n.
-eating
-millions
-quoted
-depot
-600
-sony
-segments
-hopes
-computing
-forcing
-rough
-navigation
-eighteen
-encountered
-aids
-absolute
-pairs
-equation
-zoo
-transformed
-witness
-attitude
-kid
-concentrated
-retain
-survivors
-authorized
-withdrew
-beneath
-commanding
-theatrical
-cruise
-warm
-u
-tenth
-sierra
-imprisoned
-storyline
-capabilities
-creatures
-florence
-deer
-epic
-priests
-desired
-layout
-haven
-chancellor
-girlfriend
-premiere
-encourage
-frog
-dialogue
-identification
-challenged
-encounter
-verse
-1870
-worker
-legion
-capitol
-vital
-saskatchewan
-nephew
-freeway
-romance
-circulation
-politicians
-lying
-mad
-1880
-todd
-eligible
-amsterdam
-strict
-generate
-civic
-grandson
-kinds
-snake
-spots
-tons
-grammy
-beam
-lineup
-chapters
-drives
-orbit
-voices
-impressive
-delay
-oral
-performers
-earn
-phillips
-forever
-practiced
-columns
-marines
-businessman
-formats
-employee
-planes
-scandal
-ballet
-gathered
-constitute
-marie
-infection
-thrown
-tomb
-reflected
-symbols
-aims
-qualify
-arc
-abraham
-assessment
-certainly
-sit
-margin
-discontinued
-define
-brazilian
-biblical
-enrolled
-hotels
-1888
-tactics
-sculpture
-treat
-discipline
-promise
-galaxy
-wet
-item
-hat
-payment
-exhibit
-taste
-substance
-gathering
-wait
-happen
-collect
-rally
-transformation
-tip
-registration
-rod
-pan
-saudi
-palestinian
-admission
-counter
-quiet
-deliver
-bow
-observation
-striking
-kick
-shoulder
-lover
-clothes
-smoke
-profit
-enhanced
-consequence
-stretch
-proximity
-keyboard
-burn
-playoff
-celebrate
-ammunition
-curve
-settle
-vinyl
-creature
-voyage
-archives
-rubber
-stroke
-egg
-mineral
-capability
-differ
-airfield
-duration
-deployment
-stick
-confidence
-bats
-distinguish
-blow
-convert
-assignment
-chicken
-shelter
-acknowledged
-withdrawal
-upset
-sacrifice
-diving
-trace
-sudden
-horizontal
-monitoring
-faction
-monk
-wednesday
-publish
-scattered
-judgment
-decrease
-sheet
-rebel
-workshop
-measurement
-compensation
-sick
-arch
-harmony
-bread
-corn
-possess
-smart
-merit
-finale
-terror
-implement
-fail
-colored
-fiscal
-announcement
-hide
-mature
-skilled
-bigger
-spy
-nurse
-prairie
-holder
-prominence
-sleeping
-motto
-welcome
-voluntary
-suffer
-trap
-ash
-arrow
-disabled
-vicinity
-eliminate
-bend
-slight
-membrane
-marathon
-bold
-bore
-precise
-comment
-peaceful
-hostile
-yield
-delegate
-luxury
-woodland
-bitter
-ecology
-lighter
-preceding
-steady
-communicate
-affiliation
-dense
-compatible
-burke
-inch
-wizard
-climb
-strait
-deceased
-limestone
-ally
-drag
-advised
-tongue
-healing
-blast
-maker
-upgrade
-ear
-honey
-steal
-smoking
-archive
-clinic
-trick
-midway
-feud
-equality
-morocco
-enhance
-geological
-dish
-accommodation
-reservation
-withdraw
-impression
-tract
-dropping
-update
-dairy
-marking
-retire
-plasma
-assumption
-magnitude
-silence
-commit
-slope
-photograph
-sealed
-joke
-pound
-courage
-listen
-meal
-discharge
-comfort
-crane
-explicit
-buck
-thrust
-derivative
-pierce
-dominican
-carriage
-inferior
-virtue
-observe
-nobility
-asylum
-succeeding
-scholarly
-pig
-isolation
-abandon
-decoration
-camping
-container
-breakthrough
-stack
-sprint
-desk
-burnt
-airplane
-expect
-coordinate
-supernatural
-recall
-bias
-radius
-terrace
-starter
-shipyard
-panic
-wage
-recommendation
-fauna
-desktop
-citation
-twist
-persecution
-dot
-sized
-alert
-aluminum
-gamma
-reef
-quarterly
-weekday
-keeper
-merchandise
-gore
-pressing
-robbery
-similarity
-shoe
-cord
-wade
-ms
-declare
-dad
-viewer
-offspring
-racism
-hay
-mint
-brave
-deposit
-lightweight
-burst
-noon
-oracle
-stocks
-liability
-deaf
-spiral
-accompany
-incredible
-emphasized
-crushed
-lamb
-imagination
-beloved
-void
-friction
-stuff
-rolled
-organism
-indication
-chord
-slip
-incoming
-vault
-indirect
-touched
-troubled
-bury
-insane
-specimen
-alarm
-happening
-drain
-magnificent
-resist
-inform
-motivation
-bare
-grocery
-gig
-abundance
-confession
-examine
-batch
-metallic
-emblem
-edit
-bent
-cake
-patronage
-worthy
-inadequate
-enthusiasm
-subordinate
-pedestrian
-integer
-livery
-bubble
-shifting
-unhappy
-hunger
-insect
-constellation
-gradual
-disability
-posterior
-offset
-freelance
-crack
-sauce
-famine
-packed
-bug
-acronym
-calculation
-farewell
-priesthood
-sink
-herb
-sandwich
-grassland
-blown
-derive
-blessing
-leap
-prophecy
-catching
-brush
-smile
-dash
-sustain
-predict
-stronghold
-butter
-badminton
-newscast
-potato
-robust
-satan
-confirmation
-consort
-precipitation
-honest
-topology
-magician
-combatant
-cooked
-buffer
-morality
-anxiety
-relieve
-enthusiastic
-saddle
-investor
-colorful
-nickel
-deliberate
-foul
-touching
-mock
-pile
-drought
-iris
-jewel
-mosaic
-stressed
-cod
-disappear
-tense
-innocence
-exploit
-toilet
-react
-descending
-digit
-handicap
-pledge
-evaluate
-modify
-backward
-angular
-warn
-curtain
-ambush
-disappointment
-shelf
-lever
-smell
-arrange
-parole
-wash
-checked
-ensign
-distortion
-lesson
-jaguar
-disadvantage
-trench
-tops
-rift
-fertility
-bundle
-straw
-dye
-consume
-concealed
-knot
-flip
-wholesale
-expose
-mutiny
-passionate
-feminine
-announce
-scan
-excavation
-approve
-lean
-sulfur
-heel
-abbreviation
-furnace
-predator
-astronomer
-compartment
-spray
-alumnus
-enclosure
-adjust
-scenery
-emerald
-confident
-instability
-initiate
-token
-computation
-vine
-mole
-freeze
-insert
-criterion
-bail
-wan
-prejudice
-weaponry
-liberated
-homicide
-lip
-gem
-silva
-hostility
-capitalist
-outward
-undertaking
-limb
-shining
-dwelling
-covert
-blowing
-slain
-precedent
-compose
-infectious
-recipe
-invalid
-defect
-nerves
-password
-rehearsal
-co-star
-conspicuous
-populace
-impulse
-sounding
-booster
-propose
-stupid
-wandering
-traverse
-mourning
-mammal
-reproduce
-shower
-ditch
-prevalence
-crab
-fused
-disturbance
-blaze
-lowering
-directorate
-pod
-veto
-reinforce
-graveyard
-credibility
-idle
-freud
-expectation
-nut
-intercourse
-rack
-moroccan
-worry
-narration
-proton
-snap
-ale
-yeast
-curb
-gag
-forecast
-polished
-vanished
-psychiatry
-aft
-waterway
-fracture
-tablet
-artifact
-answering
-theologian
-stimulate
-hash
-exclude
-descend
-follower
-entertainer
-sighted
-penis
-traveler
-slash
-disclosure
-barrister
-whiskey
-spouse
-fuck
-beak
-excel
-yen
-stalls
-intend
-blanket
-spawn
-seafood
-motel
-homo
-superficial
-violate
-porch
-hooks
-sewer
-homogeneous
-closet
-overdose
-maximize
-satisfactory
-dismiss
-cursed
-punish
-cub
-appreciate
-morrow
-charcoal
-introductory
-inmate
-cardiovascular
-lawful
-topping
-certainty
-abstraction
-telecommunication
-unlawful
-usher
-skirt
-kilometer
-pigment
-improper
-perceive
-smashing
-freeing
-crystalline
-demonic
-workings
-perch
-chariot
-continual
-indefinite
-mute
-annals
-memorabilia
-assimilation
-ardent
-woo
-bootleg
-classify
-stray
-cracking
-panorama
-overrun
-damned
-splash
-waking
-heterosexual
-prognosis
-cloak
-steer
-misery
-flush
-endeavor
-weakening
-branching
-accomplishment
-orchestrated
-pinnacle
-gin
-breeze
-converge
-thoroughfare
-vow
-discontent
-yielding
-chow
-assent
-awake
-upbringing
-sway
-cheat
-crabs
-anti-semitism
-plausible
-gen
-oblique
-lavender
-patterned
-projecting
-cholera
-stabilize
-agitation
-clergyman
-imperative
-lad
-ballast
-converse
-yin
-helix
-hypertension
-brandy
-dispersion
-psychotherapy
-weaken
-emit
-adoptive
-squeeze
-pour
-tor
-bump
-hike
-bless
-likeness
-cognition
-subculture
-smoky
-plumbing
-bulge
-spill
-choke
-chromatic
-allusion
-unfit
-accustomed
-benchmark
-singleton
-highness
-blur
-swat
-videotape
-simplify
-dissemination
-manslaughter
-stove
-seaplane
-sclerosis
-midday
-one-half
-flap
-interruption
-divert
-valor
-disapproval
-tee
-catchphrase
-uncover
-stamina
-cock
-advertiser
-unbroken
-confer
-cabbage
-sincere
-sadness
-persuasion
-glide
-levant
-recital
-rook
-pumpkin
-quarrel
-vodka
-lobster
-responsive
-equip
-gully
-bribe
-rooster
-troll
-brace
-multiply
-laundering
-polygon
-recess
-dividend
-inclination
-shipwreck
-divergent
-arafat
-occupancy
-overlook
-alienation
-paw
-decorate
-ovation
-heater
-interpolation
-solitude
-insignificant
-virtuoso
-liabilities
-intrigue
-thirst
-thunderstorm
-skid
-matriculation
-seawater
-disband
-outpatient
-solstice
-stylish
-counterfeit
-anvil
-stalk
-disparity
-limousine
-bloodshed
-moratorium
-bleach
-caricature
-centrist
-ascertain
-popcorn
-slant
-lapse
-lees
-limelight
-semen
-informative
-formalism
-diluted
-profiling
-fodder
-locomotion
-longing
-mother-in-law
-tapered
-bunk
-plunge
-ration
-puff
-aristocrat
-fingerprint
-seashore
-vet
-compel
-blink
-expanse
-injure
-physique
-ordinal
-nonexistent
-colorless
-validate
-melon
-air-conditioned
-beet
-windward
-mash
-sightseeing
-feline
-vicariate
-torment
-sucker
-dope
-hearth
-repayment
-deliberation
-lumbar
-meager
-barter
-backpack
-persuasive
-topographical
-gravitation
-counterculture
-harpoon
-disarm
-defy
-reorganize
-debilitating
-trustworthy
-jerk
-ptolemaic
-knit
-soar
-racket
-deceive
-hibernation
-hesitate
-presidio
-woodwind
-coiled
-buoy
-subsurface
-astute
-unsettled
-anesthetic
-cloudy
-ringed
-retrograde
-disorganized
-stand-in
-credence
-stinger
-perturbation
-depository
-soften
-reap
-attest
-resolute
-exorcist
-pelt
-populate
-visualize
-indulgence
-abstain
-subjugated
-peri
-check-in
-thorium
-sortie
-profane
-overriding
-causative
-magus
-elaboration
-freemason
-cucumber
-inexperience
-asexual
-academician
-fatherland
-amity
-gutter
-puebla
-headway
-anagram
-antecedent
-deepening
-trespass
-abeyance
-enclose
-debacle
-riverbank
-datum
-deluge
-scrape
-fingerboard
-docile
-grazed
-retraction
-tighten
-indiscriminate
-cant
-repent
-brythonic
-intolerable
-unplanned
-overpopulation
-campfire
-crib
-confine
-adversity
-tease
-corset
-unmistakable
-resentful
-cutlery
-virtuosity
-siphon
-unqualified
-microfilm
-shampoo
-unaccompanied
-placental
-diss
-daze
-headlight
-bedlam
-blunder
-clutter
-communicative
-incoherent
-tethered
-screwdriver
-sparkle
-experimenter
-know-how
-misused
-impartiality
-bifurcation
-inaction
-freudian
-generalize
-llano
-canard
-breakage
-hospitable
-opec
-grieve
-fortify
-derision
-teardrop
-tearful
-sunburst
-hock
-carnivore
-maradona
-arrears
-lighten
-betrothal
-emancipated
-composure
-self-control
-necessitate
-postponement
-fulcrum
-twitch
-collectivization
-stp
-grate
-madhouse
-supplant
-extort
-herbivore
-dryness
-flimsy
-hallowed
-cognac
-serviceman
-clank
-avifauna
-hurl
-moderating
-liquidate
-realtor
-temp
-typeset
-rede
-sociable
-leer
-redeployment
-nip
-one-fourth
-sterility
-numerator
-referent
-powdery
-oblige
-efferent
-astrophysicist
-delineation
-agape
-tasteless
-breathless
-admonition
-lacuna
-fasten
-divestiture
-medium-large
-biome
-seidel
-conceptualization
-seepage
-soliloquy
-stepper
-submerge
-inhale
-scurry
-tableware
-rightmost
-worthiness
-agitate
-sorrowful
-mocha
-perspiration
-instrumentality
-unsteady
-vaporization
-excrete
-amortization
-y2k
-feign
-self-preservation
-skim
-altaic
-fractious
-enumerate
-coldness
-nucleon
-copilot
-lurch
-autumnal
-sideswipe
-forefather
-surmise
-smug
-disassociate
-americanization
-exhale
-foreplay
-butyl
-falsify
-unrefined
-drizzle
-seventy-eight
-healthful
-ryukyuan
-faucet
-reposition
-recombine
-augur
-hellenism
-arpeggio
-caress
-calmness
-deferral
-fearlessness
-smother
-damask
-coeducation
-afflict
-unkind
-readjustment
-parlay
-nearness
-tip-off
-poser
-taster
-ramrod
-devalue
-situate
-discolor
-qibla
-periwinkle
-sterol
-crus
-homonym
-unacknowledged
-flatfish
-sweeten
-wastage
-achromatic
-neckline
-moneymaker
-light-skinned
-sagacity
-demerit
-sameness
-nonexistence
-whitsun
-streamliner
-unmitigated
-crosswise
-crepe
-unoriginal
-distinctness
-matchstick
-maim
-chink
-sandbag
-grapeshot
-mangle
-indent
-insubordinate
-abbreviate
-misprint
-inarticulate
-transgress
-worksheet
-fascinate
-slog
-noiseless
-louden
-acquirer
-tradespeople
-encase
-miasma
-breakable
-fossorial
-unclothed
-relinquishment
-litigate
-tibialis
-chalcedony
-toughen
-paleoecology
-liquefy
-chondrite
-impermanent
-wetness
-footrace
-crisscross
-deface
-calorific
-perusal
-decouple
-cremate
-aglow
-insole
-repercussion
-hailstone
-smirk
-insufflation
-jetting
-flog
-unblock
-heedless
-gherkin
-angstrom
-klutz
-bluster
-challah
-follow-through
-avulsion
-one-hitter
-chromosphere
-headshot
-blowfly
-suntan
-metazoan
-megillah
-explicate
-chessman
-foliate
-eparch
-unshared
-moisten
-handout
-hypochondria
-nonprofessional
-unfilmed
-teargas
-tideland
-capitate
-desensitize
-jeer
-scholiast
-astatine
-anorgasmia
-solemnize
-impute
-sicken
-relearn
-expiate
-regale
-quiddity
-unhurried
-flypaper
-fair-minded
-boondoggle
-pronounceable
-escapologist
-stirrer
-trudge
-enrollee
-tortuosity
-home-school
-stepparent
-fleabag
-unoriented
-unchain
-nonclassical
-larghetto
-homogenize
-self-starter
-procrastinate
-fossilize
-victimize
-nutriment
-plonk
-ramify
-cared-for
-swizzle
-preschooler
-niblick
-nursling
-refrigerate
-warmhearted
-cosmographer
-tone-deaf
-quandong
-rope-a-dope
-circularize
-pediculosis
-ill-natured
-agrology
-greengrocery
-foible
-red-flowered
-exudation
-supererogation
-half-light
-solanaceous
-taxability
-verbalization
-desorb
-mortify
-snuffer
-playgoer
-tombac
-bated
-backdate
-lowborn
-off-day
-tenpins
-prearrangement
-redeposit
-carouse
-notarize
-dissuasive
-gesticulate
-brachiate
-schlep
-complexify
-triple-crown
-protrusive
-expensiveness
-inhabitancy
-fomentation
-anserine
-streptobacillus
-terminative
-etiolation
-unsoiled
-aquaplane
-nonmoving
-clearheaded
-shinplaster
-outstroke
-actinism
-hutment
-trainband
-re-emphasise
-unbend
-streusel
-grainfield
-bilobate
-oenomel
-enervate
-sovietize
-haymow
-horsemint
-paleolith
-bronchitic
-insufflate
-palish
-uncork
-disembody
-hectometer
-sourdine
-cheeselike
-speechify
-incipiency
-branchy
-amorist
-rubricate
-desynchronize
-short-change
-offprint
-honorableness
-unexhausted
-retem
-undependability
-astrogate
-iodocompound
-inmarry
-conceivableness
-unerect
-washup
-shoot-'em-up
-degust
-containerful
-canyonside
-re-afforest
-shamanize
-prepossess
-negociate
-resplend
-milwaukee
-threw
-motors
-horn
-nineteenth
-acclaimed
-discuss
-superman
-creator
-surname
-mythology
-hybrid
-bearing
-sheep
-dinner
-riders
-certificate
-specified
-rookie
-lab
-escort
-hood
-installation
-bones
-geographic
-heir
-villa
-linux
-prey
-quest
-guidance
-resides
-provisions
-reconstruction
-conflicts
-goddess
-crops
-raymond
-editorial
-mason
-present-day
-fortress
-armor
-audiences
-scotia
-kate
-assassination
-autonomous
-willing
-diversity
-1891
-nationwide
-120
-os
-dissolved
-instructions
-all-star
-nationally
-recognize
-shaped
-personally
-toll
-depends
-al.
-ram
-hydrogen
-throw
-nancy
-gibson
-brunswick
-opposing
-indies
-anglican
-performs
-nicknamed
-conquest
-plaza
-experts
-humanity
-testament
-tribal
-bonds
-integral
-boom
-canon
-defended
-fourteen
-extends
-sits
-beijing
-leaf
-lancashire
-aka
-strings
-reactions
-permanently
-sergeant
-comprising
-delayed
-reserves
-reveal
-estates
-pregnant
-refugees
-shaw
-exercises
-surrey
-gothic
-portrait
-playstation
-mascot
-recipient
-montana
-addressed
-fellowship
-trend
-1860
-physically
-associates
-cannon
-naming
-indianapolis
-relatives
-auto
-chester
-opens
-probability
-belonged
-theorem
-restore
-compounds
-remember
-loved
-permit
-1881
-2000s
-milan
-truly
-citizenship
-numbered
-grants
-investigate
-sussex
-midnight
-enable
-shuttle
-joins
-aggressive
-achievements
-glen
-scouts
-accidentally
-1887
-grid
-hamlet
-attractions
-facts
-cardinals
-platinum
-nose
-travelling
-nigeria
-tampa
-arrangements
-peru
-philosophical
-hub
-expanding
-leonard
-fisher
-trademark
-utility
-terrain
-emerging
-bengal
-commitment
-farmer
-underwent
-spencer
-missionary
-dialect
-engage
-distant
-narrative
-1867
-tiny
-politically
-ltd
-bulgaria
-solely
-stability
-nearest
-holmes
-freshwater
-artwork
-cameras
-incumbent
-flagship
-protests
-theoretical
-vol
-garrison
-voltage
-pounds
-documented
-toy
-watching
-blade
-widow
-pit
-doubt
-venice
-recreational
-extending
-mentions
-collegiate
-stanford
-copyright
-contribute
-varieties
-lawsuit
-posts
-grain
-bennett
-hitting
-watson
-monitor
-exploration
-panama
-enlisted
-plates
-potentially
-demanded
-rio
-tables
-wonder
-allegations
-strikes
-push
-1884
-casino
-labels
-kenya
-trips
-consistently
-focusing
-revived
-springfield
-millennium
-circular
-romans
-highways
-loyal
-compiled
-auckland
-petersburg
-activist
-tune
-deeply
-intelligent
-suspected
-jose
-knowing
-shifted
-valid
-fusion
-joan
-ritual
-committees
-ruins
-conferences
-migration
-enabled
-syria
-arguments
-athletes
-conspiracy
-chest
-pace
-eventual
-heading
-pond
-carlos
-saving
-legends
-strictly
-homer
-neutral
-swing
-argues
-inaugural
-temperatures
-divorced
-centered
-clarke
-twins
-absorbed
-isaac
-pet
-consequences
-terrorist
-gravity
-1871
-calgary
-proceeded
-eu
-beaten
-ronald
-hoped
-bankruptcy
-geographical
-tobacco
-forty
-holes
-magical
-provision
-winds
-intervention
-construct
-cliff
-dozen
-bulk
-northeastern
-burial
-earthquake
-oscar
-doubles
-demo
-1876
-sensitive
-aaron
-armstrong
-excess
-1882
-recurring
-developer
-handed
-supporter
-southwestern
-cream
-spin
-founders
-advisor
-grows
-thai
-surfaces
-compositions
-q
-chile
-angry
-timber
-cole
-witnesses
-beta
-pipe
-circles
-5,000
-loose
-2,000
-contributing
-scope
-kitchen
-ny
-mechanics
-firms
-inland
-destroying
-directions
-edmonton
-les
-fees
-feeding
-physician
-aboriginal
-volleyball
-intent
-rotation
-stationed
-refuge
-brooks
-coached
-analog
-pursued
-pub
-czech
-beings
-dimension
-defending
-owen
-laura
-itv
-walks
-presumably
-stevens
-retiring
-teen
-songwriter
-fires
-estimates
-rhode
-unified
-slavery
-trails
-sequences
-refuses
-chaos
-re-elected
-beatles
-innovative
-learns
-consciousness
-palmer
-moreover
-citing
-runway
-publishers
-synopsis
-dame
-attract
-jamaica
-princeton
-winnipeg
-binding
-four-year
-clerk
-belfast
-cornwall
-decorated
-badly
-episcopal
-fed
-guild
-adoption
-plymouth
-cork
-altitude
-victories
-palestine
-skating
-secular
-demon
-competitors
-cameron
-upcoming
-lowland
-underlying
-z
-remarkable
-mario
-lifestyle
-dust
-criteria
-availability
-postal
-enrollment
-inhabited
-afl
-simpson
-rivalry
-rulers
-marcus
-removing
-busy
-layers
-carriers
-idol
-li
-nursing
-julian
-guitars
-estimate
-commented
-respected
-penn
-abu
-ranch
-accuracy
-glory
-heavyweight
-upgraded
-wishes
-notoc
-export
-fights
-cats
-1866
-feels
-memories
-ss
-acres
-colombia
-missiles
-proposals
-trucks
-rivals
-bacteria
-chuck
-1872
-supposedly
-catholics
-seemingly
-windsor
-memphis
-senators
-lunar
-ladies
-1868
-arcade
-praise
-proven
-dylan
-motorcycle
-monks
-headquartered
-transported
-kerala
-occurring
-architects
-hosting
-nationalist
-chambers
-commercially
-withdrawn
-boarding
-graduates
-conventions
-lessons
-lectures
-enormous
-civilians
-improving
-backed
-believing
-patron
-spider-man
-legally
-preparing
-molecules
-transmitter
-interpreted
-aerial
-punjab
-brass
-lightning
-remix
-colin
-profession
-moments
-acceptance
-unclear
-assembled
-beast
-impressed
-1878
-furniture
-shanghai
-continent
-speaks
-console
-notion
-marketed
-jets
-variables
-tends
-barrier
-cia
-rev.
-reliable
-vampire
-complicated
-apollo
-download
-onwards
-1883
-costume
-conductor
-errors
-incidents
-paying
-nest
-observations
-boeing
-synthesis
-indicating
-cloud
-poets
-kingston
-bicycle
-circus
-vii
-nerve
-honored
-townships
-conducting
-dimensions
-1874
-bulgarian
-arabia
-jam
-nathan
-dome
-skull
-airing
-milton
-deposits
-recover
-streams
-introduce
-pull
-bomber
-spite
-endangered
-20,000
-zones
-defender
-triangle
-1850
-duncan
-unlikely
-counts
-accounting
-idaho
-solomon
-invention
-speedway
-stem
-screenplay
-enterprises
-wilderness
-ethics
-kit
-happens
-sr.
-colleagues
-passion
-ace
-runner
-1.5
-sr
-civilization
-autobiography
-privately
-slot
-span
-sometime
-andrews
-attractive
-nonetheless
-pierre
-murders
-1873
-thermal
-investors
-subway
-consultant
-sustained
-betty
-healthy
-anywhere
-bin
-environments
-manuscript
-manila
-interred
-threats
-duck
-councils
-contrary
-ab
-xi
-engagement
-collaborated
-marching
-observatory
-1875
-prompted
-1848
-1879
-rebels
-ruth
-discrimination
-convoy
-elaborate
-consumers
-nato
-brands
-formula_1
-quarters
-odd
-repairs
-85
-breast
-sends
-canterbury
-lucas
-conversation
-presently
-gaming
-inches
-climbing
-genes
-alexandria
-succeed
-minimal
-pa
-commercials
-outcome
-brook
-journals
-suite
-first-class
-topic
-settings
-reserved
-hypothesis
-favourite
-substitute
-kenneth
-separately
-judaism
-ip
-1857
-doug
-dressed
-trapped
-assists
-draws
-batted
-fiji
-interactions
-performer
-contents
-quit
-cleared
-garage
-whenever
-barrel
-valve
-speeds
-assume
-careers
-marsh
-reconnaissance
-agrees
-justin
-diagnosis
-corridor
-targeted
-linda
-realizes
-highland
-controlling
-lens
-conceived
-tension
-1869
-collecting
-dealt
-mammals
-cooking
-floors
-lanes
-arose
-rochester
-darkness
-renewed
-revolt
-evaluation
-hearts
-investigations
-realm
-chip
-coup
-rachel
-mask
-neighborhoods
-ap
-rap
-journalists
-costa
-republicans
-collapsed
-loves
-lancaster
-custody
-spelling
-proportion
-powell
-walt
-blair
-corporations
-charleston
-oh
-editing
-ceremonies
-rifles
-thames
-josh
-vietnamese
-dale
-violin
-yankees
-lucy
-bailey
-hampton
-counsel
-recalled
-commune
-continuously
-bros.
-locals
-totally
-ivan
-managers
-instructor
-guided
-knee
-jungle
-treasury
-gross
-monsters
-photographer
-worse
-preston
-bits
-observer
-shoes
-halls
-seized
-cluster
-suggesting
-ballot
-consideration
-jennifer
-curtis
-touchdowns
-tickets
-mosque
-acquire
-lighthouse
-modes
-composers
-sanctuary
-poorly
-velocity
-witch
-blake
-exhibits
-eugene
-diana
-fraser
-planets
-coin
-protagonist
-dictionary
-fraud
-nixon
-bombs
-bench
-cyprus
-ai
-seeks
-syndicated
-presenter
-spacecraft
-trilogy
-geometry
-joy
-placement
-referendum
-foundations
-leather
-bradford
-1877
-possessed
-solve
-hometown
-responsibilities
-lots
-activated
-african-american
-monarch
-infinite
-delegates
-morrison
-buddhism
-munich
-hammer
-porter
-pocket
-1859
-50,000
-rescued
-monroe
-bombers
-cardiff
-ordained
-orientation
-please
-innocent
-hungarian
-banner
-breakfast
-accredited
-resemble
-cylinder
-presbyterian
-exterior
-cabin
-tight
-particle
-artifacts
-pursuit
-viii
-comedian
-shallow
-vegetation
-expelled
-innovation
-albany
-ellis
-teenage
-genesis
-governors
-newfoundland
-pollution
-gen.
-allan
-populated
-regiments
-imposed
-louisville
-poker
-loaded
-sail
-sponsor
-cox
-consent
-fever
-cricketer
-disputed
-staged
-raiders
-electron
-fastest
-handled
-dock
-proceeds
-preventing
-halifax
-mechanisms
-administrator
-holdings
-wound
-marc
-sonic
-quartet
-technological
-caesar
-uefa
-beaches
-tendency
-abstract
-glenn
-lutheran
-searching
-exceptions
-stan
-teachings
-reunited
-factories
-testimony
-floating
-conservatives
-kerry
-teaches
-ussr
-a.m.
-logan
-module
-respond
-portsmouth
-blog
-henderson
-commands
-puts
-knocked
-bradley
-dancer
-3,000
-overcome
-embassy
-crews
-melody
-brighton
-bike
-chains
-sizes
-reagan
-considers
-thursday
-espn
-1854
-prohibited
-photos
-transform
-sitcom
-keeps
-pretty
-throwing
-airborne
-editors
-imported
-primitive
-napoleon
-lasting
-genres
-mumbai
-ethiopia
-conquered
-nominations
-richardson
-sovereign
-cartoons
-certification
-canton
-burton
-bounded
-underway
-herald
-jamie
-priority
-carol
-guru
-finest
-crucial
-.DGDG
-applies
-lawyers
-undertaken
-complexity
-bypass
-davies
-gabriel
-religions
-disorders
-edmund
-athlete
-caps
-inclusion
-wheat
-decommissioned
-boxes
-nascar
-arsenal
-carved
-ac
-hardy
-malcolm
-extraordinary
-arctic
-indie
-scales
-advantages
-lt.
-neighbourhood
-mere
-termed
-explore
-expressway
-evangelical
-findings
-jesse
-romanian
-schemes
-x-men
-cheese
-likewise
-churchill
-imprisonment
-rises
-ernest
-wrestler
-promotional
-inquiry
-etymology
-julie
-technically
-proceedings
-introducing
-finite
-otto
-savage
-smallest
-parody
-omaha
-viewing
-marble
-toys
-critically
-detection
-likes
-torpedo
-acclaim
-meter
-full-length
-feast
-controller
-agreements
-derives
-reunion
-sphere
-leon
-jake
-trunk
-disputes
-congo
-invaded
-parkway
-moses
-hiding
-flora
-non
-relay
-museums
-randy
-superintendent
-plantation
-academics
-interim
-blacks
-stakes
-entities
-stern
-maurice
-qualification
-explicitly
-ulster
-post-war
-terrorism
-organisms
-essays
-harper
-rape
-classics
-troy
-combine
-badge
-button
-hurt
-filter
-websites
-differential
-prevention
-treasure
-casting
-petition
-kenny
-explorer
-provider
-rendered
-nepal
-systematic
-directing
-publicity
-quantity
-thunder
-fraternity
-speculation
-tough
-crazy
-telecommunications
-ce
-modeling
-insisted
-pump
-shrine
-obama
-baronet
-buying
-seating
-receptor
-mcdonald
-essay
-collector
-addresses
-portable
-inspector
-finger
-cherry
-implies
-crashed
-reynolds
-quantities
-handful
-ease
-leslie
-resembles
-paramount
-adopt
-governance
-opinions
-roster
-richards
-easter
-consolidated
-midland
-discussions
-protecting
-liver
-villain
-spirits
-line-up
-prizes
-equity
-doctorate
-bce
-basement
-resolved
-fatal
-machinery
-devon
-1855
-armoured
-listening
-struggled
-clergy
-dances
-mate
-correctly
-lap
-chronicle
-inception
-karen
-servant
-knife
-expense
-burma
-varsity
-lynn
-panels
-transmitted
-vulnerable
-darwin
-kingdoms
-presenting
-ranger
-campuses
-sin
-potter
-willie
-instances
-countryside
-papal
-pastor
-linking
-700
-longtime
-workshops
-hawk
-nobel
-hanging
-plateau
-notorious
-interviewed
-chassis
-raids
-embarked
-baldwin
-combining
-contacts
-emmy
-1856
-differs
-tributary
-harm
-macdonald
-interference
-tin
-planted
-loyalty
-orlando
-strategies
-martha
-thinks
-encounters
-1812
-julia
-reasonable
-amy
-nice
-sherman
-clause
-geneva
-hopkins
-condemned
-malta
-powder
-accordance
-cure
-paradise
-exhibitions
-premiership
-openly
-measuring
-crest
-emma
-relegated
-fitness
-equations
-batteries
-indonesian
-geology
-edges
-successive
-preliminary
-pleasure
-bag
-watts
-secretly
-jackie
-decreased
-questioned
-prefecture
-re-released
-auxiliary
-steep
-wise
-backup
-anthology
-confederation
-humor
-advances
-advent
-hip-hop
-graph
-reflects
-hoping
-convince
-airways
-flags
-renovation
-stint
-spotted
-minneapolis
-rhodes
-perception
-georgian
-gains
-contestants
-alignment
-log
-o'brien
-theodore
-presidents
-pen
-subdivided
-contracted
-uniforms
-substantially
-spike
-freed
-activists
-merchants
-edgar
-sharon
-cuban
-hans
-2.5
-daytime
-specially
-wyoming
-silk
-ancestors
-1851
-telegraph
-heath
-diploma
-pleasant
-publishes
-troop
-salem
-demise
-bryan
-serbia
-ancestry
-sketch
-sixty
-jeremy
-myth
-sudan
-rovers
-kirk
-naples
-blocked
-forbidden
-elderly
-submarines
-dirty
-calculated
-lecturer
-comparable
-nervous
-chocolate
-continuity
-attributes
-flooding
-cruiser
-carroll
-bullet
-complaints
-infant
-rbi
-freely
-ignored
-airports
-organizing
-proclaimed
-dialects
-pianist
-classrooms
-informal
-demonstrate
-adds
-binary
-vacuum
-maple
-exhibited
-venezuela
-homeland
-parishes
-cooperative
-processor
-jumping
-facilitate
-sings
-fur
-static
-cent
-offense
-tuesday
-cumberland
-nationals
-aided
-ft
-entries
-stake
-mickey
-coined
-demonstration
-evident
-purely
-americas
-alphabet
-fails
-everyday
-propaganda
-organize
-torture
-achieving
-marshal
-specification
-logical
-ecuador
-owing
-criticised
-routine
-flown
-excessive
-yugoslavia
-brad
-folded
-designers
-pakistani
-carnegie
-heating
-sacramento
-personalities
-rely
-dramatically
-miners
-squadrons
-marion
-zimbabwe
-dorothy
-philosopher
-dynamics
-stepped
-terrestrial
-fossil
-beside
-abortion
-aunt
-crop
-cowboys
-annie
-liberals
-filipino
-lucky
-scots
-wwe
-cultivation
-thoughts
-louise
-nottingham
-talented
-cage
-dining
-1849
-novelist
-petroleum
-algebra
-manuscripts
-tornado
-limitations
-constantinople
-carefully
-memorable
-naked
-airs
-generic
-gaelic
-dozens
-readily
-objectives
-stamps
-staying
-oval
-sicily
-conviction
-slopes
-traces
-employ
-requests
-paths
-joel
-phd
-uprising
-mistake
-legitimate
-organs
-hd
-rainbow
-chelsea
-renovated
-newman
-laboratories
-poles
-unofficial
-destinations
-mud
-gifts
-walked
-develops
-frequencies
-lesbian
-fruits
-pale
-defines
-uncertain
-derek
-hayes
-canceled
-albeit
-loans
-leicester
-google
-caroline
-graves
-argentine
-modifications
-commodore
-tyler
-cuisine
-phantom
-luther
-kyle
-25th
-tonight
-diagnosed
-monarchy
-locked
-width
-lou
-promising
-predicted
-charitable
-descended
-tribunal
-comprised
-all-time
-premises
-patch
-crawford
-subspecies
-surrendered
-missionaries
-applying
-obtaining
-robertson
-specimens
-rope
-abbreviated
-2011
--DGDG
-fault
-listeners
-cameo
-fork
-wished
-han
-correspondence
-cohen
-drawings
-formula_2
-tackles
-keen
-absent
-math
-cloth
-profits
-johnston
-integrity
-cornell
-prosecution
-arguing
-anger
-instant
-atoms
-yes
-parameters
-witnessed
-wang
-award-winning
-oz
-hardcore
-harsh
-ex
-rankings
-coupled
-1853
-cruz
-chemicals
-emily
-modest
-ba
-by-election
-default
-visitor
-realize
-1852
-payments
-noticed
-arguably
-empress
-wears
-detect
-similarities
-optional
->
-swan
-elephant
-vladimir
-chronic
-advocates
-tracking
-stopping
-utilized
-caves
-gather
-unemployment
-loses
-spider
-croatia
-luck
-bart
-etc
-motorway
-commanders
-avoided
-prefer
-challenging
-helicopters
-gateway
-eaten
-sustainable
-pregnancy
-cognitive
-divide
-monkey
-astronomy
-metals
-ironically
-unsuccessfully
-distances
-hindi
-volcanic
-embedded
-guidelines
-amazing
-gerald
-forgotten
-providers
-scenic
-flee
-lease
-treasurer
-loosely
-rim
-griffin
-aberdeen
-midlands
-qualities
-30,000
-commissioners
-disambiguation
-infamous
-shipped
-employer
-barnes
-contestant
-evolutionary
-masses
-ellen
-explored
-trek
-employs
-accordingly
-pin
-madrid
-leisure
-wisdom
-linguistic
-sophisticated
-phi
-roller
-supervision
-celebrations
-lecture
-erie
-spare
-sectors
-mob
-tended
-boyfriend
-sponsors
-explosive
-carnival
-switching
-giovanni
-1846
-vice-president
-bottle
-raja
-maiden
-re-election
-pedro
-shark
-toxic
-ion
-vista
-abdul
-claire
-floyd
-o.
-vale
-providence
-71
-barely
-cedar
-celebrities
-talents
-armenia
-fountain
-krishna
-ceremonial
-genera
-slide
-intensity
-frames
-greeks
-disciplines
-intact
-interrupted
-elimination
-strongest
-reactor
-shed
-outlets
-jointly
-violation
-lung
-phenomena
-spectacular
-immune
-highlands
-flexible
-transformers
-julius
-choosing
-answers
-holidays
-underwater
-generating
-hook
-dancers
-commentator
-DG.DGDGDG
-sue
-counted
-armored
-cycling
-xbox
-specialty
-servants
-livestock
-adrian
-88
-freshman
-battlefield
-rat
-tubes
-emigrated
-pier
-strips
-compression
-1847
-traced
-afterward
-consensus
-mutant
-persuaded
-reid
-duchy
-toyota
-tasmania
-yearly
-enabling
-determining
-shells
-somewhere
-cooling
-berry
-effectiveness
-booth
-statute
-tragedy
-feared
-teamed
-granite
-securities
-rides
-alter
-southampton
-brewery
-fragments
-keyboards
-translations
-buffy
-suspect
-correspondent
-referee
-saves
-norton
-acceptable
-rent
-brilliant
-highlights
-consulting
-remake
-doom
-johns
-advocated
-butterfly
-lunch
-spelled
-reversed
-cds
-mixing
-euro
-bedford
-investigated
-altogether
-grandmother
-risks
-invitation
-defunct
-cheap
-battalions
-thesis
-nucleus
-360
-trigger
-regulatory
-mild
-strain
-honda
-percussion
-deliberately
-delegation
-sponsorship
-donations
-automated
-classroom
-pottery
-surprised
-adequate
-fairy
-incorporate
-wartime
-narrator
-momentum
-deadly
-salary
-prospect
-stamp
-investments
-follow-up
-ms.
-proud
-kidnapped
-prolific
-reds
-lifted
-patriots
-generals
-wwf
-apartments
-auditorium
-bride
-barracks
-counterpart
-trainer
-theaters
-batsman
-boxer
-chartered
-measurements
-surgeon
-ore
-scouting
-fathers
-suffolk
-co
-seventeen
-cafe
-2.0
-monte
-intensive
-twelfth
-licence
-streak
-interact
-compromise
-physicians
-reviewed
-chi
-defenders
-sailors
-chronicles
-immigrant
-circuits
-turks
-peaks
-reverend
-winston
-gambling
-bloody
-reflecting
-polar
-whereby
-proceed
-composite
-monetary
-riot
-topped
-detention
-demolition
-cove
-cadet
-adaptations
-atmospheric
-lacking
-peerage
-tunnels
-choices
-buddha
-notation
-syracuse
-accompanying
-detected
-fantastic
-lone
-shire
-labeled
-servers
-leased
-competitor
-basically
-hollow
-stuck
-elvis
-1832
-synthetic
-holocaust
-definitions
-ferdinand
-employers
-sally
-slogan
-ecclesiastical
-1837
-endorsed
-vatican
-mercy
-barcelona
-wagon
-kane
-congressman
-siblings
-provisional
-constituencies
-wounds
-enables
-mandate
-diary
-shirt
-firearms
-acids
-makers
-pistol
-flies
-nash
-fake
-princes
-riots
-candy
-surveillance
-flowing
-safely
-vernon
-protective
-liked
-cottage
-emissions
-elliott
-wo
-fingers
-b.a.
-gm
-3000
-node
-portfolio
-beck
-sodium
-lateral
-1800
-imaging
-reduces
-swift
-phases
-filling
-tender
-monuments
-anthem
-licensing
-viscount
-torn
-tears
-venus
-infected
-accidents
-prophet
-rounded
-shortened
-trinidad
-depicting
-peer
-tracy
-capturing
-landmarks
-vegetables
-winchester
-flame
-augustus
-auction
-escapes
-timing
-hate
-complications
-dental
-vacant
-lengthy
-oriented
-drops
-shapes
-favored
-symbolic
-savings
-noting
-portal
-tide
-gandhi
-expertise
-preceded
-observers
-84
-riverside
-approaching
-uncommon
-newark
-north-west
-sunk
-responses
-acute
-1800s
-mandatory
-executives
-sovereignty
-prominently
-stalin
-hierarchy
-guides
-crowned
-ethical
-stance
-telescope
-demons
-heated
-malay
-coordinates
-moss
-modification
-underneath
-1836
-revealing
-joey
-demanding
-maxwell
-olive
-pressed
-reads
-aquatic
-fiber
-ashes
-middlesex
-aliens
-dealer
-cbc
-traders
-desperate
-commuter
-reside
-mps
-cry
-faithful
-confined
-preparatory
-initiatives
-1500
-lacked
-hawaiian
-1845
-incarnation
-cellular
-altar
-malaysian
-monica
-throat
-casey
-edwin
-derbyshire
-antenna
-titans
-steering
-famed
-sanskrit
-peaking
-ira
-holly
-webb
-hunters
-recommendations
-greene
-1830
-frozen
-eternal
-substances
-mrs
-natives
-anonymous
-wives
-notre
-best-known
-warrant
-dissolution
-depicts
-alike
-replied
-theft
-enacted
-rabbit
-descendant
-part-time
-nowadays
-two-year
-transaction
-pirate
-theatres
-mit
-watched
-allocated
-surgical
-harvest
-gardner
-na
-ashley
-rockets
-minds
-inspection
-large-scale
-successes
-luis
-defining
-authored
-fabric
-carson
-browns
-accessed
-sufficiently
-criminals
-coral
-carpenter
-bordered
-worcester
-explaining
-preference
-skiing
-automotive
-mentor
-tribune
-screening
-algorithms
-collectors
-midwest
-electrons
-annexed
-rapper
-shane
-uganda
-rna
-pike
-implied
-lacrosse
-bermuda
-tunes
-ahmed
-reinforced
-healthcare
-sunset
-migrated
-hispanic
-broadcaster
-concentrate
-functionality
-michelle
-accepts
-threatening
-nadu
-mormon
-bowler
-dolphins
-taxi
-mighty
-wagner
-1841
-ghana
-bet
-pursuing
-transactions
-sued
-phones
-blessed
-exam
-souls
-consortium
-rumors
-catalog
-enclosed
-shores
-tackle
-launching
-possesses
-accepting
-reward
-ham
-madonna
-carey
-hawks
-stark
-lit
-cubs
-pageant
-timeline
-simulation
-deity
-repeat
-jessica
-extensions
-fcc
-revelation
-jacksonville
-marriages
-alpine
-depend
-swim
-ballad
-clyde
-frost
-corners
-armour
-comeback
-europeans
-seas
-narrowly
-salmon
-resign
-jacques
-canberra
-poison
-inscription
-feedback
-patriarch
-decree
-1844
-tucker
-fierce
-modules
-polo
-regent
-brigadier
-beds
-shaft
-coleman
-seasonal
-minerals
-loud
-paved
-cherokee
-reproduction
-1838
-norwich
-rapids
-suggestion
-robots
-statewide
-enjoys
-ingredients
-brussels
-tier
-tallest
-pavilion
-chad
-atom
-dodgers
-fears
-expenses
-coordinator
-subdivision
-oath
-dodge
-harder
-molecule
-cairo
-broader
-combines
-all-american
-fletcher
-timothy
-shorts
-reopened
-airplay
-jenny
-ucla
-chapman
-autonomy
-precision
-hung
-railroads
-assuming
-walsh
-humorous
-drunk
-suited
-rehabilitation
-struggling
-blamed
-consumed
-tanzania
-self-titled
-prohibition
-bee
-supplement
-occupy
-secrets
-joshua
-prussia
-investigating
-valleys
-destiny
-dwarf
-carlo
-cow
-advancing
-astronomical
-governmental
-advocacy
-ceiling
-overhead
-examined
-tensions
-advertisements
-trailer
-ideology
-salvation
-b-side
-factions
-3.5
-ensuing
-comfortable
-luxembourg
-sc
-premium
-neighbors
-retains
-matching
-deeper
-sigma
-shoots
-inserted
-whitney
-breeds
-lynch
-excluded
-chips
-descriptions
-serie
-exceptional
-tapes
-asset
-arise
-practitioners
-gym
-remnants
-icon
-rex
-contributor
-unrelated
-regulated
-hamburg
-teenager
-resolve
-coronation
-sociology
-announcer
-warehouse
-sentences
-den
-collectively
-steadily
-regained
-identifying
-beverly
-purchasing
-specifications
-lovers
-goalkeeper
-rushing
-cartridge
-shepherd
-mick
-bugs
-verses
-thomson
-institutes
-dirt
-peers
-boost
-drinks
-warned
-rushed
-dudley
-destroyers
-obviously
-hyde
-ancestor
-genuine
-enlarged
-id
-suits
-triumph
-disco
-janet
-archdiocese
-rays
-treatments
-incomplete
-riley
-stockholm
-rendering
-glacier
-cleaning
-counterparts
-circa
-bavaria
-rage
-donna
-confronted
-jeffrey
-marched
-overnight
-revenues
-bang
-verb
-arlington
-ferguson
-outlet
-dixon
-scripts
-encouraging
-irving
-unconscious
-franz
-metric
-pioneers
-bombay
-gymnasium
-funk
-abundant
-wards
-determination
-marina
-roland
-trends
-flowering
-aging
-mohammed
-lineage
-vikings
-il
-councillor
-predators
-1835
-heavier
-oriental
-ears
-~
-wilhelm
-sidney
-constantine
-decay
-muscles
-warsaw
-rotating
-portrayal
-realistic
-inventor
-bells
-softball
-rational
-sega
-shocked
-nodes
-fe
-expectations
-archaeology
-kurt
-famously
-rams
-baroque
-drainage
-trivia
-anatomy
-ecological
-solved
-generator
-partition
-1842
-conception
-arabs
-okinawa
-karnataka
-wanting
-sharks
-breakdown
-hugo
-alma
-nobody
-folklore
-attained
-traits
-baltic
-curse
-croatian
-accreditation
-humanitarian
-hank
-dioxide
-swept
-shannon
-careful
-ribbon
-flesh
-alternatively
-borrowed
-continuation
-pyramid
-pbs
-incorporating
-long-time
-kicked
-boot
-herman
-collision
-exeter
-merge
-struggles
-proportional
-metre
-projected
-bombardment
-lithuania
-drake
-wu
-morton
-throws
-locks
-promises
-email
-halt
-villains
-georgetown
-pitchers
-constituted
-appealed
-retaining
-troubles
-finishes
-essence
-trent
-waterloo
-350
-illustrations
-unaware
-pot
-stomach
-birthplace
-additions
-columnist
-falcon
-humanities
-loading
-settling
-ruby
-urged
-undergoing
-dominion
-beginnings
-geoffrey
-formula_3
-mafia
-sinking
-reporters
-hemisphere
-johann
-webster
-jo
-hastings
-flames
-cycles
-sa
-preparations
-buddy
-hiking
-pupil
-saturn
-stereo
-40,000
-chen
-guaranteed
-counting
-networking
-seoul
-courthouse
-hartford
-wholly
-carr
-97
-south-east
-boots
-documentation
-inflation
-tibet
-terminology
-agenda
-invisible
-eden
-andhra
-dee
-mobility
-comprise
-mccarthy
-russians
-undergo
-crush
-encyclopedia
-kashmir
-reliability
-sank
-majors
-midfielder
-angela
-dedication
-injection
-rats
-ambitious
-everybody
-beats
-unusually
-cope
-warwick
-satellites
-dungeons
-survives
-constituent
-pointing
-phillip
-progressed
-survivor
-sara
-marx
-inheritance
-terrible
-spreading
-coloured
-shadows
-sterling
-sundays
-reminiscent
-mice
-invested
-debates
-slower
-wesley
-intel
-felix
-mini
-avengers
-DGDGDG.DGDG
-precisely
-afghan
-deciding
-yang
-discography
-crow
-aerospace
-marker
-reflection
-superhero
-brotherhood
-co-founder
-saga
-tram
-barn
-same-sex
-textile
-4,000
-manning
-disposal
-weekends
-lebanese
-calcium
-remarks
-mood
-freeman
-selective
-televised
-sailor
-plague
-councillors
-amphibious
-labs
-import
-financing
-bolton
-feathers
-tomorrow
-manuel
-incorporates
-rainfall
-apostolic
-wolves
-pioneering
-spells
-catalogue
-declining
-captive
-bronx
-resemblance
-uranium
-equilibrium
-unreleased
-dominance
-rca
-25,000
-damages
-noah
-grande
-formations
-statues
-summers
-functioning
-unexpected
-percy
-1843
-forbes
-roses
-petty
-catches
-storms
-lily
-patents
-offshore
-trevor
-professors
-eleventh
-hiatus
-accurately
-picks
-pushing
-benedict
-juvenile
-hurling
-passive
-designing
-communists
-mann
-lance
-macedonia
-slam
-packages
-hire
-jurisdictions
-ambulance
-introduces
-valentine
-cambodia
-zip
-contacted
-balanced
-instituted
-cadets
-affecting
-favorable
-mapping
-hal
-terminated
-interval
-rolls
-fitzgerald
-patterson
-taxonomy
-pepper
-judy
-echo
-asserted
-turnpike
-regards
-overs
-mel
-friedrich
-rouge
-corrupt
-serbian
-advancement
-pseudonym
-numbering
-playwright
-shirley
-garnered
-organizational
-directory
-ahmad
-punch
-charted
-denominations
-wicket
-sworn
-md
-odds
-submit
-gloucester
-mega
-kay
-practicing
-anticipated
-realizing
-ensuring
-lotus
-wonderful
-joyce
-lennon
-pipes
-irrigation
-referenced
-xavier
-sketches
-salisbury
-reluctant
-whites
-revision
-silicon
-pulse
-branded
-commissions
-recruit
-blades
-feat
-demonstrations
-scrapped
-feeds
-galleries
-researcher
-detachment
-strikeouts
-highlight
-striker
-mi
-cheshire
-tibetan
-quinn
-papua
-runner-up
-combinations
-afford
-stripes
-portraits
-erosion
-nominee
-angles
-posthumously
-lindsay
-2012
-horizon
-originating
-bulls
-cosmic
-brandon
-villagers
-norm
-ix
-elect
-conrad
-sinclair
-instantly
-15,000
-crowds
-jerome
-habit
-doyle
-maternal
-discussing
-occupies
-homosexual
-baghdad
-redevelopment
-carlton
-hr
-lawn
-eighteenth
-avoiding
-corpus
-privy
-auburn
-simpsons
-andrea
-lumber
-chiefly
-defeats
-atlas
-constraints
-fare
-soviets
-brady
-compare
-camden
-promotes
-clash
-fernando
-passages
-happiness
-accent
-swamp
-jenkins
-polls
-prevalent
-coventry
-fuller
-prosperity
-emotions
-modeled
-refuse
-conservatory
-interpretations
-repertoire
-turtle
-updates
-tan
-1880s
-frances
-teammate
-trout
-compulsory
-ricky
-mets
-relates
-delivering
-jupiter
-census-designated
-stealing
-exploring
-niagara
-complained
-30th
-hang
-irregular
-liu
-hercules
-rituals
-syrian
-criticisms
-obsolete
-clara
-mothers
-duel
-davidson
-privacy
-gasoline
-sabha
-orbital
-rosa
-negotiated
-drill
-shiva
-funny
-continually
-braves
-trustee
-admits
-archer
-vince
-helmet
-costumes
-mackenzie
-affects
-wages
-whale
-genius
-fraction
-motivated
-undertook
-itunes
-participates
-snakes
-reformation
-sells
-hulk
-lexington
-artery
-decisive
-occurrence
-seth
-adverse
-amtrak
-saxon
-travis
-concentrations
-competes
-botanical
-calvin
-spin-off
-displaced
-cups
-ho
-cultivated
-productive
-duet
-proprietary
-shipbuilding
-hancock
-prayers
-inability
-chances
-matthews
-vacation
-gases
-caste
-vocational
-hiv
-rebecca
-160
-locate
-cement
-tricks
-relied
-homosexuality
-companions
-devised
-lined
-se
-stems
-sutton
-quotes
-privileges
-ventures
-evacuation
-excluding
-homeless
-stripped
-beef
-inaugurated
-willis
-plaque
-extant
-popularly
-receptors
-induced
-greenwich
-kg
-ronnie
-seymour
-kiev
-newest
-sheridan
-obscure
-miniseries
-hp
-beaver
-afraid
-regain
-td
-rowing
-defendant
-dover
-temperate
-exotic
-raced
-welcomed
-overhaul
-photographic
-maya
-mare
-DGDGDGDGDGDG
-packaging
-broncos
-clare
-emergence
-900
-vic
-1834
-consecrated
-examinations
-chan
-pulling
-valued
-kappa
-amanda
-fence
-atari
-clips
-bosnia
-quarry
-comparative
-belle
-babylon
-mt.
-marco
-isles
-facto
-commemorate
-unlimited
-sexually
-azerbaijan
-adviser
-guantanamo
-e-mail
-permits
-cpu
-yacht
-protocols
-llc
-1831
-intake
-shields
-flooded
-colts
-spur
-fossils
-boyd
-darker
-clarence
-refusing
-inmates
-laurel
-atop
-downs
-prints
-scenario
-miniature
-dwight
-arabian
-ethiopian
-rite
-sculptures
-mgm
-cheaper
-congregations
-trades
-analyst
-peterson
-prehistoric
-col.
-declaring
-anthropology
-torah
-cassette
-lyon
-domains
-clement
-abbot
-pitt
-necessity
-bmw
-calcutta
-singer-songwriter
-vs
-outskirts
-bedroom
-leone
-violations
-discs
-complement
-plural
-showcase
-practically
-instructed
-speeches
-outfit
-500,000
-blend
-convent
-assisting
-ads
-coordination
-brake
-confrontation
-synagogue
-clearing
-zhang
-hannah
-landings
-reelection
-colleague
-lacks
-diamonds
-vein
-precursor
-cab
-clouds
-differently
-attitudes
-gibraltar
-runners
-200,000
-clayton
-guarantee
-patricia
-mar
-unix
-averaged
-starr
-cnn
-bullets
-aires
-germanic
-griffith
-delays
-1900s
-combustion
-estonia
-cancellation
-hereditary
-copenhagen
-bolt
-relieved
-sec
-economically
-sunshine
-bowie
-flank
-wool
-dawson
-buenos
-processed
-sandstone
-doubled
-liner
-lang
-boris
-pipeline
-thumb
-minorities
-sebastian
-willow
-trim
-mk
-penny
-overlooking
-unveiled
-accusations
-squares
-surveys
-sensor
-penalties
-amusement
-inscriptions
-howe
-rows
-pratt
-adjoining
-intervals
-engaging
-extinction
-mvp
-hammond
-north-east
-role-playing
-answered
-pharmaceutical
-gloria
-multimedia
-pioneered
-meyer
-volcano
-reasoning
-frankfurt
-victorious
-nsw
-fertile
-jubilee
-co-wrote
-oppose
-spokesman
-burlington
-communion
-formula_4
-soils
-subjected
-woody
-pius
-processors
-unstable
-juice
-prevents
-picking
-sensors
-specializing
-weakened
-marvin
-seals
-curved
-screened
-miracle
-teens
-nazis
-sheets
-semester
-dried
-panthers
-1815
-iso
-incorrect
-claude
-monopoly
-spectators
-tolerance
-lounge
-emerge
-encompasses
-a.d.
-mound
-exhaust
-dominate
-decorative
-offerings
-electorate
-fix
-kuwait
-umbrella
-watershed
-insufficient
-induction
-rebuilding
-flint
-normandy
-kindergarten
-knox
-expressions
-viking
-grip
-nationalism
-lobby
-psychiatric
-prague
-guys
-1-0
-resided
-recruiting
-pearson
-south-west
-thriller
-summoned
-chaired
-barker
-residing
-pressures
-debris
-1,500
-dayton
-laying
-menu
-nitrogen
-tertiary
-o'neill
-semi-final
-insight
-kernel
-profitable
-imagery
-mccartney
-taxation
-clans
-convenient
-refusal
-dishes
-suspects
-determines
-reprinted
-routing
-logistics
-24th
-financially
-transfers
-synod
-socialism
-backgrounds
-peel
-apprentice
-everywhere
-participant
-prone
-complaint
-colonists
-toledo
-hatch
-christine
-screens
-diabetes
-oblast
-viable
-arbitrary
-puzzle
-sacked
-ions
-surprisingly
-attachment
-crescent
-crusade
-bryant
-raven
-perfectly
-hbo
-sikh
-jacket
-speculated
-holders
-breach
-ideals
-projection
-addressing
-peters
-lengths
-institutional
-burden
-spawned
-hostage
-sculptor
-rs
-abolition
-plots
-mortgage
-dundee
-maharashtra
-innovations
-enzymes
-paired
-afc
-clusters
-chef
-contests
-securing
-prose
-eliminating
-intentions
-attracts
-compliance
-1890s
-iceland
-brett
-feminist
-rodney
-x.
-habits
-usd
-doc
-amended
-aa
-teenagers
-aluminium
-electromagnetic
-fibers
-cecil
-sub
-real-time
-miguel
-tooth
-randolph
-negotiate
-switches
-horns
-evacuated
-truman
-curves
-dolls
-2-1
-devastated
-installations
-leigh
-knockout
-promotions
-spatial
-acquiring
-statutory
-archie
-cliffs
-treating
-exposition
-residences
-markings
-facial
-corresponds
-exports
-professionally
-marty
-matched
-`
-kidney
-x-ray
-dubai
-addiction
-firmly
-flavor
-infections
-fury
-eurovision
-defenses
-breathing
-mcmahon
-implications
-59.5
-cowboy
-commentators
-augusta
-crossover
-recognizes
-guam
-solving
-steelers
-fortified
-utilize
-maggie
-meditation
-spa
-stole
-alley
-katherine
-1833
-right-handed
-cunningham
-nationality
-flats
-yuan
-augustine
-doll
-amino
-blame
-dos
-gifted
-alexandra
-emerson
-nigel
-pizza
-forestry
-relegation
-platoon
-50th
-molly
-helena
-brakes
-bark
-crystals
-lenses
-presumed
-locality
-ink
-failures
-affiliates
-turbine
-daniels
-chevrolet
-memoirs
-berkshire
-maybe
-incorporation
-aggregate
-boasts
-rector
-ninja
-1820
-highlighted
-mohammad
-genetics
-promptly
-catholicism
-suspicious
-benson
-albania
-cemeteries
-screenwriter
-transferring
-posed
-supermarket
-rita
-calm
-extraction
-patrons
-fulfill
-ladder
-packet
-skeleton
-madagascar
-asteroid
-hogan
-positioned
-plantations
-watches
-wreck
-remixes
-gdp
-clip
-pension
-majesty
-balloon
-subtle
-exiled
-darren
-levy
-retailers
-symmetry
-treaties
-refugee
-inherent
-wrestlers
-lafayette
-flute
-teammates
-persia
-chandler
-structured
-chrysler
-numerical
-tumor
-socially
-gloucestershire
-bulletin
-lottery
-manned
-kirby
-simmons
-kumar
-pagan
-lamp
-duchess
-perkins
-pga
-insignia
-alternatives
-basil
-resurrection
-collaborative
-submission
-messenger
-reply
-judged
-diane
-pending
-mc
-representations
-22nd
-pastoral
-weakness
-rebuild
-unnamed
-stranger
-desirable
-hardly
-phrases
-rectangular
-chamberlain
-1814
-testified
-resembling
-rejoined
-expired
-periodic
-harlem
-differing
-bolivia
-tremendous
-operas
-overwhelming
-philharmonic
-alcoholic
-tails
-resorts
-4.5
-emi
-fr
-repaired
-barton
-illusion
-limiting
-north-south
-jokes
-bruno
-informs
-fleming
-flour
-battleship
-basilica
-redskins
-rhine
-rented
-savannah
-eleanor
-shade
-pools
-precious
-logging
-parachute
-salvador
-corp.
-ferrari
-principally
-takeover
-tel
-mortality
-orchestral
-buttons
-rejection
-crosby
-amenities
-somehow
-proponents
-congresses
-anchored
-garcia
-concerto
-seniors
-convenience
-economist
-spanning
-carlisle
-23rd
-subset
-picnic
-three-year
-puppet
-celebrating
-administrators
-mentally
-fisheries
-spends
-downstream
-trumpet
-communism
-opted
-possessions
-algeria
-confluence
-explosives
-apparatus
-rails
-backwards
-utilities
-standardized
-sage
-falcons
-renewal
-pulp
-cone
-sights
-chooses
-individually
-grouped
-rover
-relies
-genocide
-singular
-sutherland
-oaks
-santiago
-sad
-recovering
-employing
-tang
-aesthetic
-digits
-flanders
-mhz
-satisfied
-conscious
-christie
-parsons
-exported
-centennial
-brooke
-assumes
-myspace
-campaigned
-judiciary
-barrett
-alternating
-challenger
-bout
-neighbor
-stays
-possibilities
-byron
-ludwig
-lowered
-piper
-blockade
-doctoral
-neighbours
-absolutely
-predecessors
-holden
-pronunciation
-audition
-builder
-oxide
-extract
-5000
-filmmaker
-plumage
-captivity
-broadly
-definitive
-rode
-fringe
-b.c.
-activation
-strengthen
-damaging
-discovering
-occupying
-dana
-katrina
-meals
-peasants
-validity
-trauma
-somalia
-diagram
-passport
-qualifications
-developmental
-attracting
-mid-1990s
-barber
-browser
-distribute
-blocking
-tidal
-o'connor
-wcw
-communal
-devotion
-dorset
-magnet
-onset
-tens
-tenor
-cables
-ordering
-hawkins
-dispatched
-encoded
-cane
-amber
-supervisor
-larvae
-mankind
-neurons
-retrieve
-vendors
-karachi
-rogue
-discrete
-identifies
-reject
-disappearance
-australians
-westward
-immense
-anyway
-gorge
-jockey
-threshold
-aurora
-madras
-deities
-suspicion
-blank
-oslo
-sanders
-sympathetic
-mortal
-spans
-semi-finals
-eponymous
-yemen
-makeup
-plenty
-all-ireland
-climbed
-hussein
-nobles
-cease
-co-founded
-terminals
-deny
-admiralty
-antiquity
-spinning
-inventory
-montanes
-sexuality
-entrepreneur
-haunted
-racer
-avg
-swords
-packers
-licenses
-blackburn
-dartmouth
-dinosaur
-stirling
-activism
-hey
-whip
-relate
-gill
-quote
-tire
-adolf
-myself
-encourages
-sensitivity
-beth
-detached
-setup
-decorations
-marshes
-ha
-propulsion
-barriers
-sands
-deputies
-applicable
-censorship
-ducks
-lonely
-dorsal
-anterior
-playboy
-eclipse
-reorganized
-resistant
-recruits
-flexibility
-meanings
-lantern
-surplus
-wong
-reissued
-beneficial
-romeo
-cao
-accession
-prolonged
-authentic
-williamson
-hydraulic
-heroic
-alias
-vitamin
-pilgrimage
-computational
-focal
-coaster
-allison
-replica
-devils
-shooter
-forts
-covenant
-inlet
-mountainous
-aston
-wikipedia
-enthusiasts
-tenants
-philosophers
-vintage
-bengali
-absorption
-nominal
-nordic
-bean
-ruined
-freestyle
-patrols
-meadows
-michel
-presley
-treason
-adjusted
-crafts
-ministries
-winters
-litigation
-realised
-monasteries
-currents
-remarked
-te
-granting
-emission
-frustrated
-klein
-registry
-stevenson
-stairs
-elliot
-worcestershire
-bo
-tissues
-homage
-hanover
-backs
-lincolnshire
-genome
-escaping
-seated
-waterfront
-voter
-lethal
-nile
-expeditions
-memoir
-clone
-spiders
-eva
-rigid
-warwickshire
-expeditionary
-confederacy
-exploitation
-devastating
-scarlet
-reggae
-halfway
-casual
-nightclub
-amazon
-best-selling
-reverted
-pharmacy
-exchanges
-exceed
-formula_5
-headmaster
-demographic
-crude
-stretches
-jesuit
-tips
-fog
-contemporaries
-donation
-unchanged
-bangkok
-financed
-soup
-brigades
-lifelong
-6,000
-bergen
-bees
-accomplishments
-pact
-aquarium
-lent
-prussian
-bizarre
-cyclone
-2d
-macintosh
-eisenhower
-sonny
-buchanan
-comet
-comparing
-guilt
-cobra
-correspond
-dive
-aligned
-confirm
-appreciation
-jumped
-underworld
-charities
-privilege
-consul
-thread
-elevator
-redesignated
-probable
-canadians
-mistaken
-tate
-myers
-surf
-retreated
-baba
-giles
-halloween
-pigs
-wines
-pays
-rotten
-controversies
-refined
-obligations
-unprecedented
-phillies
-albion
-nowhere
-translator
-youtube
-satisfy
-subscribers
-furious
-anton
-unification
-dividing
-assassinated
-excavations
-originates
-sticks
-accelerated
-undergone
-tina
-supervised
-resume
-1824
-enforce
-loch
-1829
-lahore
-v8
-renovations
-brutal
-rama
-l.a.
-breath
-warships
-raj
-wrapped
-mansfield
-upstream
-loving
-bags
-vegetable
-simpler
-choral
-proving
-yoga
-automobiles
-triangular
-complexes
-libya
-appointments
-slavic
-a&m
-violated
-shifts
-thirteenth
-northumberland
-admit
-hostilities
-gymnastics
-reinforcements
-pitching
-coordinated
-geometric
-printer
-discharged
-emeritus
-outline
-kosovo
-undefeated
-on-air
-conan
-1825
-hindus
-pie
-scholarships
-helsinki
-marginal
-contingent
-assignments
-announcing
-'n'
-strengthened
-spinal
-considerations
-fu
-metabolism
-angola
-1.2
-overtime
-progression
-unesco
-allegiance
-nineteen
-bred
-temporal
-hectares
-acre
-omega
-probe
-copied
-imprint
-triggered
-canals
-\
-1821
-lava
-bulldogs
-saxophone
-goodbye
-endowment
-rfc
-francesco
-advertisement
-nurses
-disciples
-taliban
-disastrous
-execute
-vocabulary
-80s
-songwriting
-poisoning
-liquor
-lowell
-rupert
-amalgamated
-premise
-constructing
-saxony
-prestige
-vowel
-methodology
-priory
-donor
-frankie
-converting
-specialists
-splitting
-arrows
-stretching
-amendments
-foremost
-cardiac
-ponds
-intimate
-tehran
-linebacker
-wilmington
-crimson
-gravel
-collar
-texture
-bleeding
-noel
-enforced
-fitting
-mistress
-omar
-kazakhstan
-jacobs
-avon
-acceleration
-proposition
-concludes
-illustration
-notices
-lowe
-ordnance
-fin
-discoveries
-rid
-exciting
-vhs
-gujarat
-wolfe
-troupe
-ds
-jennings
-cocaine
-moth
-lester
-endurance
-surprising
-wendy
-1818
-derivatives
-si
-turtles
-26th
-mutations
-cal
-denomination
-specials
-forget
-tired
-tap
-sophomore
-painters
-1776
-warming
-displaying
-traction
-wax
-cousins
-gaza
-experiencing
-sacks
-restriction
-abbott
-goodman
-fourteenth
-implementing
-graphical
-high-speed
-compressed
-benny
-clifford
-mack
-planetary
-staple
-long-running
-subscription
-influx
-1801
-rand
-chat
-travelers
-cruel
-sorts
-meantime
-docks
-thief
-jr
-antarctic
-weaver
-samoa
-departing
-chatham
-con
-cromwell
-ancestral
-czechoslovakia
-stunt
-demos
-gentle
-105
-fortunes
-7.5
-mater
-terrorists
-potomac
-psychic
-twilight
-sixteenth
-engineered
-'em
-swansea
-jewelry
-axe
-gt
-tributaries
-prescribed
-hiring
-bach
-accumulated
-xml
-robbie
-aforementioned
-caucus
-flourished
-mercer
-favoured
-erik
-algebraic
-rental
-simplified
-attorneys
-beans
-torque
-debated
-grandchildren
-della
-east-west
-outright
-shawn
-utilizing
-royals
-medication
-usc
-invaders
-tambon
-persistent
-chennai
-jaw
-tragic
-creators
-nursery
-lok
-farther
-ghosts
-constable
-redesigned
-mo
-capitalism
-livingston
-namesake
-walton
-checks
-belarus
-vaughan
-pose
-displacement
-sparked
-conferred
-filters
-thoroughbred
-preserving
-bates
-melodic
-guerrilla
-forensic
-6.5
-fool
-poetic
-culminating
-ordinance
-regulate
-jumps
-establishments
-prosecutor
-th
-huntington
-midst
-partnerships
-tomatoes
-forums
-heather
-loads
-debts
-lds
-strand
-realism
-mba
-regina
-niece
-presided
-dear
-mirrors
-cube
-prefix
-pathway
-lamps
-sid
-scorer
-gentleman
-issuing
-obstacles
-accidental
-bottles
-fry
-persuade
-gould
-deposited
-recruitment
-bacon
-indirectly
-barack
-tones
-dramas
-quincy
-specializes
-infrared
-talked
-ambient
-bloom
-fleeing
-histories
-inning
-baptism
-builds
-impacts
-humphrey
-obligation
-boroughs
-diminished
-baton
-27th
-cornish
-fielding
-calculations
-dose
-halted
-manipulation
-sunderland
-visions
-pubs
-1870s
-exchanged
-fortifications
-garrett
-handles
-rica
-accessories
-corpse
-lizard
-tuning
-depiction
-sting
-italians
-posting
-dismissal
-mistakes
-polytechnic
-divinity
-unpopular
-u.
-expressing
-contributes
-guiding
-slate
-linguistics
-lime
-par
-floods
-verbal
-wheeler
-inc
-chairs
-databases
-substrate
-install
-sake
-seventeenth
-pulitzer
-disappointed
-cc
-tonnes
-babies
-sleeve
-contractor
-exploded
-edison
-suffers
-smash
-hears
-playable
-dunn
-comply
-steele
-wimbledon
-interfaces
-surroundings
-fulton
-discusses
-worried
-profound
-resting
-shoulders
-dealers
-ivy
-nigerian
-diplomat
-syndicate
-assemblies
-elders
-listings
-haiti
-manifold
-investigators
-peripheral
-hyderabad
-broadcasters
-exceeded
-bangalore
-noteworthy
-ernst
-vowels
-violet
-patented
-citadel
-contributors
-holt
-rockefeller
-oversaw
-shareholders
-m.a.
-reconstructed
-congestion
-mystical
-atlantis
-abdullah
-markers
-<
-exams
-marijuana
-reconciliation
-valves
-fuselage
-characterised
-1826
-macarthur
-transparent
-payne
-dinosaurs
-recipients
-mongolia
-clearance
-mikhail
-concord
-merging
-python
-das
-wherein
-approximate
-chin
-brent
-uncertainty
-kerr
-nwa
-nc
-kramer
-kuala
-justified
-icc
-slender
-northampton
-psychedelic
-hinduism
-overlap
-jill
-separating
-real-life
-albanian
-rewarded
-disguise
-hubbard
-tavern
-yankee
-loops
-ag
-randall
-targeting
-modern-day
-expresses
-tires
-telugu
-patriotic
-mid-1980s
-overthrow
-belmont
-lgbt
-spears
-remixed
-popularized
-satirical
-xii
-captained
-labrador
-forge
-1816
-concurrent
-tent
-tear
-intersects
-domination
-hilton
-arises
-laurence
-wishing
-monkeys
-culminated
-tiles
-stafford
-omitted
-1827
-3.0
-analogous
-lyric
-clive
-instrumentation
-slang
-hatred
-michaels
-celestial
-rhythmic
-myanmar
-dell
-guinness
-busch
-receivers
-radioactive
-formula_6
-greens
-honolulu
-encoding
-ultra
-roma
-zinc
-countess
-eminent
-sioux
-slim
-turbo
-countdown
-mast
-pits
-iconic
-heirs
-garbage
-assassin
-postgraduate
-militant
-stretched
-patches
-optimal
-procession
-fearing
-nicaragua
-respiratory
-confronts
-richie
-monastic
-negro
-usaf
-relics
-daytona
-bind
-monitors
-isabella
-disappointing
-trusted
-hearings
-suppression
-tuberculosis
-unnecessary
-einstein
-dukes
-entirety
-1813
-barney
-owl
-spark
-meadow
-co-written
-programmer
-wires
-risen
-generators
-mutation
-hale
-weston
-outreach
-leicestershire
-1.6
-fold
-reorganization
-uruguay
-font
-sharply
-angus
-guardians
-mughal
-outcomes
-suppressed
-standings
-conway
-sofia
-cavity
-captives
-mans
-yu
-mayo
-ethnicity
-satire
-costly
-catcher
-coastline
-liz
-verdict
-indicator
-ga
-cruisers
-upright
-tutor
-flemish
-turf
-adopting
-disturbed
-kits
-bud
-adapt
-successors
-lionel
-groove
-ruin
-horace
-hassan
-grape
-rites
-behavioral
-anarchist
-@
-lambert
-beacon
-mutants
-ming
-derry
-yields
-dial
-slated
-prompting
-fda
-periodically
-aria
-fundraising
-documentaries
-productivity
-chang
-clifton
-sheikh
-hobart
-panther
-estuary
-centred
-apostles
-recorder
-afb
-pleased
-sampling
-feudal
-buzz
-leopold
-restoring
-aristotle
-wii
-irvine
-disguised
-owens
-affordable
-kannada
-wwii
-importantly
-imports
-avid
-exodus
-ned
-katie
-lorenzo
-300,000
-api
-hoover
-sophie
-cockpit
-scientology
-hazard
-attribute
-natalie
-samurai
-sunlight
-surround
-staffordshire
-gundam
-disabilities
-zombie
-orioles
-diagnostic
-2-0
-destructive
-charging
-exercised
-observing
-canonical
-ndp
-staging
-medium-sized
-coil
-senses
-shortage
-jew
-compiler
-syntax
-12,000
-translates
-barangays
-passerine
-appoint
-persona
-intermittent
-parallels
-evenings
-osaka
-colt
-tuition
-fender
-incorrectly
-prostitution
-ivory
-calculus
-educator
-guatemala
-excavated
-analysts
-thoroughly
-nicole
-vicar
-patriot
-capped
-foreigners
-noticeable
-battled
-heroin
-gazette
-escorted
-distributor
-supplying
-subdivisions
-geoff
-bandwidth
-blackpool
-polynomial
-beams
-deleted
-uncovered
-libertarian
-amiga
-upgrades
-hymn
-seller
-contexts
-readings
-at&t
-bluff
-marrying
-weber
-decreasing
-bloc
-lets
-denote
-veterinary
-deeds
-hague
-gigs
-knock
-pumps
-450
-glasses
-1819
-rotary
-two-thirds
-conquer
-campaigning
-ana
-implementations
-kidnapping
-humour
-dimensional
-drift
-cameroon
-conditioning
-kicking
-builders
-rpm
-ashore
-talbot
-poe
-musicals
-left-wing
-newsletter
-sensation
-applicants
-stephanie
-conclusions
-sunny
-plc
-treatise
-identities
-laos
-pad
-viral
-circulated
-elephants
-christina
-fairfax
-latvia
-cites
-jade
-nissan
-alto
-finalist
-tobago
-pulls
-brewing
-averaging
-anymore
-pitches
-donovan
-anita
-lagoon
-maid
-dante
-alps
-mozart
-reich
-nassau
-accord
-transcription
-rican
-conversely
-unanimous
-nina
-simultaneous
-belly
-dove
-toes
-perimeter
-plato
-basque
-advertised
-torch
-gravitational
-admired
-barbados
-hut
-painful
-compatibility
-mao
-hoffman
-marxist
-jealous
-lisbon
-variously
-noun
-preaching
-lyrical
-undertake
-jin
-investigative
-accounted
-climax
-shy
-ibrahim
-qur
-northward
-sweep
-chesapeake
-ugly
-distress
-munster
-archipelago
-castro
-laps
-oversight
-sect
-evaluated
-tudor
-henri
-co-operative
-exhausted
-nielsen
-intentionally
-confront
-lasts
-....
-justices
-lopez
-lifting
-thriving
-gerard
-unexpectedly
-denis
-suzuki
-ufc
-consolidation
-mornings
-bibliography
-extracted
-liturgy
-deed
-offence
-playground
-superseded
-typhoon
-supplier
-gentlemen
-nonprofit
-unionist
-1803
-scratch
-judith
-commando
-mounting
-90s
-drilling
-agnes
-yourself
-upwards
-forrest
-twisted
-turin
-branding
-santo
-visibility
-paperback
-moor
-1850s
-programmed
-zeppelin
-exits
-5.5
-solitary
-adobe
-upheld
-revive
-cache
-partnered
-conducts
-aviv
-wanderers
-dams
-allocation
-infinity
-neural
-enduring
-ipswich
-styled
-byrne
-miranda
-latino
-shelley
-weights
-sophia
-gerry
-respects
-reasonably
-weird
-wizards
-broadband
-hare
-norris
-unitary
-crusaders
-1860s
-accomplish
-reelected
-1798
-tulsa
-rodriguez
-corporal
-worms
-madness
-bankrupt
-racist
-seminars
-defects
-decreases
-rajasthan
-shirts
-letting
-surpassed
-hurricanes
-hello
-apocalypse
-macedonian
-curry
-frogs
-1.0
-capitals
-deficit
-telecom
-soprano
-enlightenment
-1806
-absorb
-fishermen
-captains
-28th
-washed
-crossroads
-sinatra
-zambia
-beard
-outlook
-seldom
-ensured
-fascist
-presiding
-wetlands
-1823
-stationary
-boone
-dvds
-dub
-constitutes
-galway
-universally
-questioning
-dig
-multiplayer
-landscapes
-ferries
-immunity
-gp
-evergreen
-cart
-excited
-butcher
-motorcycles
-baxter
-relativity
-comparatively
-empirical
-deacon
-imply
-108
-fried
-carmen
-liberties
-inactive
-wagons
-coconut
-bunny
-verbs
-receptions
-integrate
-daylight
-lin
-dignity
-karate
-physiology
-potassium
-wa
-ramp
-pants
-29th
-objections
-workforce
-prosperous
-proves
-signaling
-betrayed
-earnings
-rao
-twenty-five
-buyers
-fifteenth
-cw
-boycott
-amnesty
-anglo-saxon
-rigorous
-promoter
-countless
-spine
-myths
-resisted
-chloride
-emotion
-insists
-thanksgiving
-deposed
-plug
-tsar
-unrest
-boulder
-mantle
-oceans
-minimize
-courtyard
-consultation
-africans
-bacterial
-magnus
-admissions
-semiconductor
-venetian
-frigate
-wharf
-billed
-hon.
-incidence
-infants
-hailed
-brock
-chilean
-raises
-hobby
-straits
-viruses
-wiltshire
-sympathy
-seize
-scarborough
-aaa
-nutrition
-inputs
-harmful
-saddam
-prof.
-whales
-obliged
-recognizing
-arising
-tasked
-generous
-dug
-weaker
-slaughter
-islanders
-franco
-denotes
-ra
-nude
-embraced
-registers
-fellows
-liaison
-tai
-melodies
-ratified
-transmit
-peterborough
-2006-07
-104
-connector
-fairfield
-internally
-estonian
-swamps
-bite
-thornton
-icons
-baseman
-physicist
-limbs
-cents
-superstar
-convincing
-economies
-guess
-prediction
-enjoying
-spun
-rebounds
-commander-in-chief
-finances
-parameter
-coffin
-beethoven
-urdu
-marilyn
-immortal
-everett
-handbook
-mysteries
-renault
-renewable
-orchard
-mathematician
-bamboo
-lattice
-cubic
-lemon
-geelong
-106
-signatures
-croydon
-scripture
-pistols
-georges
-proclamation
-controllers
-f1
-assess
-convinces
-unfinished
-pillars
-siberia
-appalachian
-folding
-formula_7
-schmidt
-namibia
-wei
-pornography
-qing
-prospective
-tracked
-nagar
-unite
-freddie
-mandarin
-norse
-ernie
-creativity
-johannesburg
-elegant
-planting
-chromosome
-murderer
-examining
-dracula
-calculate
-49ers
-1822
-nottinghamshire
-definite
-connor
-ark
-statutes
-sensory
-emphasize
-antwerp
-solicitor
-montenegro
-meridian
-moisture
-yahoo
-literal
-counseling
-eccentric
-comparisons
-built-in
-penguin
-antique
-heinrich
-hymns
-formulation
-threatens
-vacancy
-render
-faculties
-moody
-mariners
-panzer
-routinely
-converts
-arithmetic
-educate
-realms
-lankan
-250,000
-specialised
-playhouse
-specify
-antioch
-tortured
-thor
-conversations
-velvet
-farmland
-protested
-giuseppe
-protesters
-mainline
-burmese
-lithuanian
-grapes
-photographed
-raleigh
-interfere
-mortar
-genoa
-sp
-angelo
-worm
-facade
-goat
-embrace
-contractors
-eager
-banana
-cross-country
-susceptible
-catalyst
-tyne
-deficiency
-armament
-mccoy
-coliseum
-dressing
-vampires
-gps
-istanbul
-lauren
-barrels
-bonnie
-uttar
-re
-degradation
-residency
-yeshiva
-titular
-lips
-correlation
-exclusion
-tyrone
-liturgical
-surge
-steals
-kitty
-mesa
-prisons
-quiz
-schooling
-thickness
-2,500
-cologne
-forested
-lois
-kicks
-exceptionally
-pal
-detained
-radial
-continents
-mccain
-lawson
-peggy
-cobb
-hansen
-darling
-limerick
-astronaut
-marches
-amid
-ya
-curious
-transports
-surveyed
-forged
-consistency
-rumours
-granddaughter
-magistrate
-rumored
-marketplace
-neglected
-outlined
-hazardous
-ranged
-bunker
-suggestions
-satisfaction
-governor-general
-santos
-visa
-sunrise
-grazing
-reluctantly
-marquess
-elaine
-cam
-charlton
-maturity
-remastered
-dismantled
-128
-epidemic
-luna
-wolverine
-searches
-savoy
-hanged
-plagued
-ballroom
-bald
-canvas
-clown
-vectors
-mls
-alloy
-locke
-emperors
-pilgrims
-narrated
-critique
-shri
-explores
-melissa
-thatcher
-likelihood
-70s
-1.3
-inevitable
-perennial
-inscribed
-1790
-carpet
-serpent
-filing
-selecting
-fritz
-separates
-urine
-affection
-deluxe
-proposes
-algae
-bert
-apache
-yi
-customary
-reno
-oldies
-gloves
-generates
-1600
-greenwood
-assumptions
-antarctica
-interceptions
-dexter
-prep
-dangers
-poster
-bricks
-conceptual
-honoured
-stein
-brother-in-law
-dwellings
-cop
-nests
-depressed
-eaton
-desires
-nightmare
-responds
-lockheed
-somebody
-owed
-compelled
-chaplain
-weekdays
-exploited
-prototypes
-sabbath
-needle
-recycling
-lobbying
-1789
-caucasus
-themed
-aggression
-chargers
-attain
-certificates
-interception
-hormone
-explorers
-voluntarily
-superhuman
-framed
-morse
-illustrator
-bologna
-mozambique
-arches
-depict
-denounced
-pi
-conclude
-pork
-expulsion
-motown
-packs
-ka
-preacher
-mistakenly
-teddy
-juliet
-ego
-gibbs
-homestead
-wherever
-1807
-dominic
-transmissions
-discourse
-collaborations
-monaco
-fireworks
-andre
-lima
-banker
-imaginary
-vaccine
-imagine
-titan
-undercover
-gradient
-wembley
-boiler
-tombs
-dice
-greenhouse
-destroys
-harmonic
-pursuits
-scenarios
-directive
-vengeance
-formidable
-min
-buyer
-wicked
-modular
-reputed
-containers
-emirates
-protestants
-semi
-foley
-manipulate
-sedan
-stiff
-watt
-composing
-pedal
-shake
-atp
-kolkata
-reproductive
-grange
-annexation
-gonna
-inexpensive
-unanimously
-8,000
-transitional
-amplifier
-drained
-nuts
-sparks
-seminal
-rodgers
-daisy
-northamptonshire
-weir
-divisional
-neutron
-sandra
-fatty
-coma
-defendants
-cartoonist
-eastward
-warden
-creed
-pets
-slovakia
-dalton
-warnings
-vishnu
-handsome
-audit
-acquitted
-neal
-apartheid
-feeder
-pasadena
-simplicity
-glucose
-fixture
-garland
-cascade
-float
-stephens
-urging
-tunisia
-venom
-mystic
-dispersed
-belts
-fond
-bentley
-slovenia
-drowned
-hind
-dependence
-shotgun
-hitchcock
-caldwell
-unwilling
-chiang
-pickup
-winding
-sins
-relocation
-steamed
-mod
-therapeutic
-forthcoming
-sears
-allah
-burgess
-postwar
-irene
-beetle
-roberto
-christchurch
-pneumonia
--DGDGDG.DGDGDGDGDGDG
-brennan
-fowler
-ornamental
-ginger
-1775
-revolves
-geographically
-moniker
-distinguishing
-browne
-mess
-bowled
-ravens
-dixie
-simulcast
-crowded
-busiest
-his/her
-sequels
-assured
-goose
-wakefield
-colonization
-justify
-rugged
-1792
-skinner
-coding
-nets
-disagreement
-8.5
-thank
-gum
-oils
-60,000
-resonance
-maj.
-j.d.
-gustav
-1.1
-bombings
-swami
-neville
-informally
-ideological
-right-wing
-divides
-conscience
-akin
-cortex
-objected
-spear
-quietly
-peasant
-op
-dickinson
-lungs
-niche
-saturdays
-richest
-1.8
-pins
-intercepted
-moderately
-harriet
-ecw
-notoriety
-exploits
-problematic
-porsche
-armenians
-catering
-filmmakers
-buckingham
-recognizable
-edith
-generalized
-taipei
-piedmont
-roth
-semifinals
-joints
-travellers
-besieged
-uc
-transporting
-turret
-snails
-splits
-southbound
-possessing
-number-one
-suffrage
-correction
-autobots
-1795
-motif
-energies
-slayer
-affinity
-disasters
-2007-08
-erupted
-liberia
-chords
-daddy
-stratford
-pillar
-nichols
-volunteered
-helm
-charm
-oversee
-scandinavian
-glacial
-kensington
-five-year
-castles
-mini-series
-dei
-sylvia
-discount
-ieee
-fencing
-kaiser
-veronica
-melting
-divers
-disciple
-latter-day
-replies
-shrewsbury
-blonde
-spontaneous
-garner
-discarded
-microphone
-johannes
-macau
-ie
-assam
-paradox
-manifesto
-detector
-ceramic
-er
-orion
-tuned
-lawsuits
-plea
-inverse
-sack
-wb
-pricing
-regimental
-ambiguous
-oasis
-sammy
-nationalists
-depths
-1811
-ascent
-invading
-sampled
-torpedoes
-spokesperson
-1805
-portray
-pleaded
-inappropriate
-restructuring
-31st
-ufo
-mir
-mlb
-1809
-interpret
-slots
-skater
-honduras
-cigarette
-rotor
-grouping
-laden
-pray
-rutherford
-sunni
-hector
-punished
-assistants
-reviewers
-candidacy
-finn
-compilations
-gangs
-supportive
-postponed
-glaciers
-nathaniel
-sperm
-au
-fulfilled
-jamaican
-unused
-overturned
-substituted
-colombian
-keyboardist
-helpful
-buckley
-killings
-predominant
-businessmen
-intends
-toby
-damascus
-paso
-grains
-a.k.a.
-elm
-paddy
-rendition
-routed
-2007-2008
-quaker
-brethren
-demonstrates
-trenton
-payload
-clues
-angered
-higgins
-ridges
-definitely
-posters
-meredith
-explanations
-taiwanese
-shropshire
-ton
-hermann
-cornelius
-cain
-petrol
-profiles
-stella
-energetic
-gavin
-scrap
-elects
-firstly
-rods
-portraying
-finalists
-behaviors
-recalls
-allegheny
-envelope
-assessed
-symmetric
-streetcar
-laurie
-refurbished
-traps
-1808
-arrests
-compass
-snyder
-stand-up
-analogue
-fragment
-megatron
-goldberg
-rotterdam
-sentiment
-committing
-randomly
-predictions
-rahman
-earthquakes
-acknowledge
-phased
-spectral
-efficiently
-paula
-reviewer
-1793
-wyatt
-cannes
-partisan
-barred
-fuels
-sindh
-brewers
-annex
-impose
-lucia
-regency
-fins
-monarchs
-destined
-disks
-saunders
-polymer
-northbound
-stranded
-ensued
-deserted
-laugh
-birch
-stoke
-tucson
-warship
-solidarity
-distributions
-bestowed
-lex
-2005-06
-beirut
-wichita
-oldham
-magnum
-humble
-abruptly
-submerged
-endless
-garfield
-contamination
-propagation
-eli
-pony
-high-profile
-migrants
-latitude
-ph
-nave
-galaxies
-poly
-sirius
-goa
-breeders
-reinstated
-colombo
-synonymous
-!!
-suffix
-demonstrating
-jays
-cum
-rudolf
-diverted
-logs
-genetically
-inter
-commitments
-washing
-yielded
-ne
-termination
-instructors
-simplest
-0.5
-vacated
-costello
-horton
-compensate
-zeus
-apology
-juniors
-109
-relying
-psychologist
-kabul
-guyana
-2006-2007
-interestingly
-waterfall
-eruption
-coincide
-piston
-breakup
-fischer
-launches
-canoe
-atoll
-booker
-cottages
-'n
-ceylon
-syndication
-taft
-apex
-formulated
-rejects
-somali
-gale
-commemorated
-morphology
-detailing
-antagonist
-prefers
-motive
-awaiting
-competent
-non-fiction
-ramsey
-xiii
-val
-sweeping
-craters
-royalty
-purchases
-mixes
-manually
-gubernatorial
-expo
-ankle
-commandant
-sellers
-approximation
-unavailable
-minus
-workplace
-mold
-wigan
-vicious
-grenade
-dolphin
-1.4
-initiation
-antony
-spice
-a1
-byrd
-buccaneers
-pablo
-tara
-declares
-hail
-uniquely
-diplomacy
-usb
-vc
-basket
-lumpur
-archibald
-frontman
-headlines
-convergence
-bethlehem
-peculiar
-scheduling
-4.0
-invade
-bravery
-fife
-granada
-barony
-advisors
-vows
-dec
-prc
-invest
-dresden
-dhaka
-1794
-cello
-ambition
-barrow
-threads
-touches
-consoles
-knocking
-bicycles
-nomenclature
-substrates
-illegally
-herd
-pp
-presentations
-english-speaking
-progressively
-asserts
-carrie
-responding
-vera
-wally
-contention
-bowls
-progresses
-stained
-fist
-investigator
-fremantle
-irwin
-towed
-induce
-lively
-nas
-honorable
-biographical
-punt
-cannons
-ants
-ll
-bahamas
-bathroom
-judgement
-cylinders
-remarkably
-capt.
-stud
-versa
-harp
-malayalam
-kyoto
-resurrected
-storylines
-ant
-sanctions
-differed
-useless
-und
-mongol
-aiming
-hunted
-inhabit
-ko
-inventions
-compton
-affluent
-tactic
-chasing
-cartridges
-guarded
-prevailing
-scandinavia
-gaps
-tanker
-proceeding
-tramway
-amalgamation
-ceded
-motorsports
-godfrey
-bikes
-ct
-distributing
-vanguard
-wilkinson
-1,200
-portrays
-trans
-hides
-dungeon
-regeneration
-deadline
-illegitimate
-screaming
-crashes
-orchestras
-clutch
-trader
-psi
-greenland
-ecosystem
-chalk
-ballistic
-masks
-consonants
-gastropod
-djs
-asleep
-donors
-coca-cola
-tow
-madame
-waterford
-podcast
-DGDG.DGDGDG
-occupational
-debbie
-1804
-amos
-assert
-elton
-coincided
-apostle
-cerebral
-mama
-ethanol
-knighted
-chased
-hertfordshire
-commodity
-marian
-spherical
-wesleyan
-doubts
-malaya
-salvage
-bordering
-editor-in-chief
-suppose
-one-year
-surfaced
-grateful
--DG
-protects
-1802
-sheer
-mclean
-textiles
-prospects
-comedic
-dart
-formula_8
-slowed
-ineffective
-ah
-analyzed
-forks
-sanctioned
-woodlands
-analytical
-safer
-tenant
-miracles
-trance
-woodward
-needing
-volkswagen
-segregation
-wight
-anti-aircraft
-deutsche
-drains
-rhetoric
-franchises
-magna
-sliding
-transforming
-visually
-gallagher
-leaked
-resembled
-automation
-olivia
-lakers
-phosphate
-justification
-witches
-replaces
-tempo
-ingredient
-sant
-interpreter
-sol
-caring
-sparta
-drastically
-concurrently
-arbor
-surgeons
-5.1
-entertaining
-co2
-heal
-futures
-subfamily
-battleships
-penetration
-gnu
-battling
-xvi
-dewey
-ballads
-deprived
-alvin
-def
-reassigned
-clinics
-mating
-yukon
-crete
-mae
-potatoes
-diaspora
-alterations
-skip
-denoted
-stellar
-originate
-reigning
-salle
-newscasts
-economists
-long-distance
-left-handed
-decks
-subjective
-surfing
-com
-fancy
-lending
-deborah
-palestinians
-nutrients
-bihar
-everton
-olympia
-bunch
-synthesizer
-selections
-amherst
-woodstock
-shoreline
-reptiles
-ministerial
-protector
-7,000
-commentaries
-knives
-immigrated
-perpendicular
-stanton
-flux
-theoretically
-baronetcy
-galactic
-leipzig
-hamlets
-intellectuals
-soda
-mandated
-checking
-lyons
-alison
-brunei
-purity
-schedules
-superiority
-short-term
-imposing
-leopard
-ignore
-glamorgan
-evolve
-moose
-paterson
-nickelodeon
-shelby
-belgrade
-talmud
-bahrain
-sinister
-toc
-confessed
-mayer
-culturally
-capsule
-boyle
-degraded
-ada
-differentiate
-ch
-strengthening
-scriptures
-marcos
-severed
-journeys
-configurations
-townsend
-rhythms
-osborne
-downward
-replay
-belize
-onward
-clue
-caliber
-suppliers
-opus
-anaheim
-beaumont
-scream
-fortunately
-memo
-monmouth
-resolutions
-nicholson
-medicinal
-desmond
-neolithic
-perspectives
-ware
-vanderbilt
-zion
-analyses
-lenin
-herzegovina
-eg
-headline
-kathleen
-strauss
-stripe
-co-authored
-hooker
-elves
-mythical
-screw
-muscular
-prepares
-nasal
-harding
-preferences
-metropolis
-luftwaffe
-statesman
-leonardo
-meaningful
-collingwood
-principality
-monterey
-devi
-steamer
-bollywood
-manufactures
-trafficking
-weighed
-mates
-programmers
-stevie
-dunes
-leyte
-utrecht
-brittany
-clocks
-terra
-jasper
-toxicity
-eliot
-mp3
-upward
-flyers
-fungi
-clarinet
-mercenaries
-defences
-ensembles
-evidenced
-english-language
-encryption
-delicate
-sacrifices
-2008-2009
-casualty
-u.k.
-nero
-year-round
-trait
-blows
-kathy
-hicks
-appreciated
-microwave
-monitored
-rodeo
-gwen
-kai
-emphasizes
-2.4
-gaa
-middleton
-packaged
-leinster
-surveyor
-seventy
-economical
-exceeding
-day-to-day
-canopy
-outlaw
-provoked
-contender
-40th
-terminate
-mayors
-believers
-auspices
-re-release
-dune
-breton
-utilizes
-secretariat
-tolkien
-invites
-pledged
-1796
-supplemented
-basal
-orbits
-ethan
-emigration
-medications
-ransom
-trams
-cooke
-illustrate
-georg
-vanessa
-puzzles
-charting
-digest
-co-operation
-anchorage
-havana
-viewpoint
-improvised
-tory
-domesday
-nicolas
-oxidation
-reeves
-positively
-abdomen
-wills
-cyril
-smithsonian
-reservations
-napoleonic
-polling
-ge
-hanson
-beverage
-nuns
-cutter
-modernization
-3-0
-medina
-coating
-oilers
-wheelchair
-scattering
-flashback
-practitioner
-axle
-tensor
-2.2
-restrict
-cr
-on-screen
-imminent
-messiah
-starters
-liam
-totals
-malls
-rooted
-dillon
-outspoken
-paternal
-coasts
-nails
-veins
-tna
-mackay
-suppress
-adjustment
-monty
-vapor
-globally
-click
-skate
-reliance
-archaeologists
-questionable
-complementary
-mom
-sur
-1783
-nemesis
-retailer
-piers
-leak
-recommend
-lil
-aba
-graffiti
-mph
-jain
-lorraine
-fore
-waterways
-1791
-rains
-ella
-doctrines
-populous
-harassment
-watkins
-confiscated
-borneo
-colloquially
-severity
-shine
-trump
-lucrative
-mitsubishi
-pawn
-eclectic
-1799
-friedman
-wilde
-avalon
-strokes
-sergei
-rink
-substitution
-ordination
-letterman
-translate
-trailing
-batters
-html
-sawyer
-symphonic
-2008-09
-niger
-mat
-sac
-styling
-topography
-delivers
-soundtracks
-elk
-bust
-deported
-packets
-aide
-bon
-yan
-halo
-koch
-emerges
-avenues
-eton
-subcommittee
-knoxville
-darts
-vh1
-runways
-entrusted
-inconsistent
-evelyn
-mu
-brendan
-retrieved
-po
-carriages
-multi-purpose
-1777
-sonata
-maharaja
-wooded
-schwartz
-hated
-wakes
-textbook
-re-recorded
-spies
-convened
-diffusion
-boer
-kinetic
-maltese
-goldman
-logos
-rests
-viaduct
-huddersfield
-khmer
-unfair
-optics
-chestnut
-experimented
-repeating
-shelters
-frustration
-offences
-canary
-crossings
-quad
-biodiversity
-metaphor
-removes
-denial
-budapest
-hendrix
-hospitality
-wonders
-fiat
-rutgers
-zen
-wavelength
-botany
-judging
-fits
-encouragement
-penal
-pathology
-organizers
-thieves
-squirrel
-conflicting
-turbines
-micro
-conditional
-coefficient
-umpire
-teresa
-hats
-archaic
-pontiac
-evan
-morale
-unfortunate
-textbooks
-maneuver
-motifs
-subsidiaries
-propeller
-wrist
-grief
-boiling
-acquisitions
-hazel
-hydroelectric
-mckay
-bilingual
-fined
-convey
-faded
-printers
-1797
-dragged
-mechanic
-mg
-buckinghamshire
-sided
-3.2
-cannabis
-motives
-enraged
-augmented
-yuri
-bounty
-retaliation
-cyrus
-hazards
-crying
-booklet
-invasive
-merlin
-wit
-finch
-penned
-stepping
-downfall
-entropy
-centenary
-otis
-technician
-patton
-streaming
-3-1
-rafael
-inauguration
-expecting
-sheila
-rwanda
-knots
-prescription
-fairs
-serviced
-warns
-malaria
-putnam
-playback
-peruvian
-tendencies
-bidding
-locking
-raped
-frontal
-assyrian
-andreas
-damon
-colliery
-co-host
-preferring
-150,000
-transforms
-buster
-technicians
-relaxed
-uhf
-pathways
-freezing
-coupling
-deposition
-nyc
-grounded
-staircase
-vertically
-precinct
-strains
-supremacy
-mw
-sexes
-1840s
-pigeon
-pietro
-formula_9
-roadway
-tong
-youths
-levi
-benton
-100th
-burger
-one-third
-weaving
-vfl
-co-produced
-pauline
-sahara
-doses
-kurdish
-tattoo
-erotic
-corrected
-averages
-consulted
-ratios
-indy
-thorough
-oswald
-rey
-thou
-novelty
-schneider
-abused
-staffed
-arbitration
-1780
-engagements
-o'clock
-rai
-pops
-chemist
-proportions
-announces
-ng
-bucks
-langley
-state-of-the-art
-jules
-olivier
-judah
-lama
-kilkenny
-installment
-listener
-chandra
-wolverhampton
-testify
-ported
-minors
-labelled
-topological
-foothills
-tractor
-bernie
-rainy
-abnormal
-ensures
-carmel
-hungry
-truce
-gus
-bluegrass
-welch
-disneyland
-gesture
-toe
-subgroup
-migratory
-whig
-stamford
-sorry
-tightly
-verified
-yugoslav
-scarce
-hayden
-govern
-retention
-seaside
-nouns
-republics
-cancel
-coats
-wilder
-cretaceous
-jericho
-mclaren
-robotic
-thee
-saloon
-herbs
-benefited
-semantic
-comedies
-flair
-6.3
-borne
-centralized
-pockets
-ashton
-tracts
-radios
-depictions
-deception
-ought
-135
-initials
-bliss
-gong
-oxfordshire
-aragon
-espionage
-bays
-assurance
-pharaoh
-unconstitutional
-neo
-librarian
-long-standing
-remnant
-rabbits
-eastenders
-malawi
-threaten
-nhs
-lucius
-landslide
-hq
-stephenson
-searched
-cows
-mercedes
-bauer
-bravo
-laughing
-5.0
-trash
-loaned
-inequality
-runners-up
-ji
-khz
-carthage
-mcgill
-martyr
-greenville
-eighteens
-shin
-mined
-prostitute
-huron
-macy
-vernacular
-pentagon
-gears
-pornographic
-openings
-ida
-treasures
-pamela
-spotlight
-hourly
-gypsy
-spurs
-commemorative
-blogs
-fluent
-in-house
-reactors
-ballots
-alec
-paradigm
-stimulation
-repeats
-rhodesia
-margins
-trolley
-werner
-cfl
-norms
-wired
-fluids
-presenters
-recession
-rematch
-endorsement
-regents
-miscellaneous
-punjabi
-remedy
-lifts
-7.1
-photographers
-mitch
-whistle
-camel
-bangor
-nuclei
-transverse
-polk
-beverages
-cater
-seizure
-exaggerated
-taped
-vertices
-foil
-theresa
-burnett
-cruises
-transmitting
-xiv
-grandparents
-sniper
-educators
-meteorological
-cinemas
-tying
-davenport
-evolving
-secondly
-smackdown
-joker
-clashes
-robbins
-implicated
-penguins
-guthrie
-postage
-captures
-pennant
-imam
-jackets
-mca
-intending
-killers
-walnut
-u.s.a.
-hume
-requesting
-thy
-stockton
-leningrad
-tasmanian
-ft.
-lure
-confusing
-manners
-fumble
-decent
-crust
-zhou
-saratoga
-dissertation
-insulin
-mutually
-marino
-repertory
-tar
-caravan
-cyclists
-blunt
-boating
-revolver
-actresses
-millionaire
-orissa
-mister
-banning
-lo
-modelling
-sikhs
-alliances
-catalyzes
-1788
-amidst
-hybrids
-4000
-convoys
-gregorian
-mirage
-vega
-armistice
-spartan
-darlington
-scrutiny
-supermarkets
-endowed
-accumulation
-raided
-airbus
-xv
-interdisciplinary
-ate
-paraguay
-tao
-seneca
-shame
-notified
-probation
-null
-sims
-tile
-contaminated
-raphael
-esther
-carver
-9.5
-avery
-invasions
-coupe
-masterpiece
-devotees
-convictions
-merrill
-faa
-disappears
-dickens
-claws
-staten
-favourable
-analyze
-malik
-bombed
-obsessed
-toad
-1700
-doncaster
-mid-1970s
-chant
-downloaded
-fueled
-shattered
-wiped
-riaa
-bowen
-closes
-versatile
-doris
-swindon
-marries
-salesman
-offenders
-manson
-clones
-secession
-commencement
-brenda
-noir
-co-educational
-fresno
-migrate
-foam
-activate
-golfer
-shrubs
-policing
-amplifiers
-managerial
-inclined
-heavenly
-strangers
-pumping
-relocate
-oceanic
-manifest
-sickness
-suites
-owls
-limitation
-quantitative
-zombies
-depart
-odyssey
-torres
-masked
-babe
-modulation
-philips
-marquis
-denies
-trusts
-pertaining
-high-level
-colourful
-pulmonary
-viola
-socialists
-neutrality
-potent
-flynn
-elbow
-handball
-230
-ajax
-chili
-impending
-fixtures
-elisabeth
-runoff
-ambrose
-orient
-ostensibly
-neglect
-bubbles
-attacker
-emmanuel
-heroine
-landowners
-man-made
-chronological
-prevailed
-detectives
-sherwood
-rockies
-suez
-eligibility
-lining
-misleading
-shortest
-ambitions
-obstacle
-spanned
-sparrow
-bye
-physiological
-correctional
-waited
-concessions
-onboard
-adventist
-sv
-ethernet
-collects
-russ
-nail
-awakening
-intentional
-serbs
-equestrian
-danube
-clair
-squash
-revisions
-dare
-milestone
-cooler
-acc
-sh
-dietary
-celebrates
-fictitious
-coke
-fragile
-opium
-malone
-biographer
-envisioned
-refinery
-munitions
-6.7
-federally
-affiliations
-slipped
-opener
-luigi
-mai
-snail
-settler
-chinatown
-tau
-nes
-egyptians
-rudolph
-ascension
-balkans
-6.0
-valencia
-gemini
-hostages
-spheres
-mecca
-elongated
-qatar
-courtney
-treats
-prelude
-atkinson
-contraction
-alberto
-goats
-martinez
-runaway
-negotiating
-odi
-pascal
-son-in-law
-kicker
-entrances
-astrology
-backbone
-clockwise
-tottenham
-ptolemy
-tags
-woreda
-pcs
-howell
-symbolism
-third-party
-ambassadors
-vigorous
-eddy
-backdrop
-stimulus
-picturesque
-inflammation
-swap
-indicted
-hitter
-withstand
-hid
-autism
-reissue
-6.1
-nj
-differentiation
-inflicted
-foreman
-3-2
-glider
-optimization
-pembroke
-citrus
-academia
-ole
-garry
-respectable
-violating
-astros
-sylvester
-exposing
-fatalities
-kung
-bounds
-weighing
-yorker
-isabel
-stout
-commenting
-mia
-spirituality
-metabolic
-avant-garde
-squads
-adler
-5.6
-implying
-barons
-stefan
-sewage
-policeman
-fatigue
-turmoil
-retrospective
-contrasted
-gunfire
-mushroom
-rewards
-engraved
-winger
-kendall
-6.8
-daly
-caution
-xp
-skyline
-rpg
-berger
-poole
-relevance
-pardon
-half-hour
-concession
-recognise
-cafeteria
-amphoe
-ye
-1778
-franks
-pivotal
-baptized
-ib
-cylindrical
-simulated
-unemployed
-burt
-bernstein
-betrayal
-tumors
-experimentation
-poisoned
-dull
-nesting
-knees
-locker
-fl
-shan
-decommissioning
-williamsburg
-concluding
-shrubland
-attends
-alderman
-rabbis
-murdoch
-civilizations
-remembers
-4.2
-brains
-decepticons
-rebranded
-constance
-first-team
-oversees
-jailed
-harness
-deteriorated
-philanthropist
-abuses
-moran
-straightforward
-renal
-cadillac
-samantha
-brewer
-dump
-baths
-digimon
-overland
-jensen
-ignoring
-tandem
-eroded
-debating
-positioning
-nolan
-virgil
-sic
-strengths
-authenticity
-raceway
-tabloid
-u2
-proponent
-indefinitely
-1830s
-ceramics
-monumental
-clever
-martyrs
-bohemia
-vibrant
-vain
-shareholder
-thence
-7.7
-jorge
-causeway
-redemption
-ecumenical
-westwood
-recipes
-violinist
-inverted
-happily
-berg
-posthumous
-bucket
-duff
-cheltenham
-vague
-chronology
-b-sides
-pt
-benedictine
-yiddish
-tastes
-reggie
-milford
-12.5
-confederates
-psychiatrist
-rainforest
-113
-solvent
-creations
-digging
-marge
-appliances
-balkan
-turnout
-000
-cody
-adulthood
-keynes
-sabotage
-seminar
-glorious
-ropes
-formula_10
-caretaker
-deploy
-downhill
-disadvantages
-electors
-fibre
-salon
-tsunami
-trailers
-posing
-abel
-ritchie
-willard
-transmitters
-xm
-mysore
-o'reilly
-ariel
-jonas
-facilitated
-discouraged
-inevitably
-pre-season
-meg
-adherents
-rom
-analogy
-encompassing
-resurgence
-aperture
-overlooked
-autobiographical
-gil
-empires
-sorted
-7.0
-diocesan
-dharma
-hereford
-tacoma
-gatherings
-tablets
-quentin
-overseeing
-bernardino
-gregg
-uzbekistan
-arterial
-approx
-maze
-codex
-unwanted
-administer
-discretion
-westfield
-wolfgang
-curator
-7.2
-periodicals
-emotionally
-rave
-iroquois
-albuquerque
-preview
-brigham
-nomadic
-advocating
-sums
-titus
-zionist
-persians
-colbert
-badges
-cooperate
-fuse
-o'donnell
-rudy
-intercollegiate
-play-off
-3.1
-ki
-orphan
-parted
-professions
-middlesbrough
-4.3
-freshmen
-2005-2006
-steiner
-live-action
-strawberry
-parental
-parasite
-hutchinson
-bourne
-hubert
-7.4
-diaries
-denny
-firearm
-humanoid
-sesame
-pos
-reel
-ebert
-advise
-roc
-volvo
-hannibal
-jamestown
-pam
-usable
-nokia
-senegal
-lea
-musically
-arboretum
-vivian
-botswana
-reginald
-bisexual
-maneuvers
-poured
-twenty-four
-owning
-intercontinental
-donkey
-athenian
-adaptive
-smuggling
-half-brother
-equals
-8.3
-bilateral
-syed
-lightly
-welding
-premature
-sicilian
-manipulated
-align
-alexis
-weddings
-cigarettes
-keller
-vickers
-bowlers
-reflections
-dispatch
-kelley
-vertex
-priorities
-archery
-fashionable
-mohamed
-soloist
-peabody
-diagonal
-seahawks
-stargate
-juno
-outsiders
-practised
-disliked
-padres
-distributors
-rumble
-stature
-perpetual
-coefficients
-sap
-digitally
-midi
-ign
-roofs
-efficacy
-antibodies
-ethel
-bengals
-tyson
-mercenary
-reprise
-miner
-gospels
-jakarta
-disagreed
-aces
-gallantry
-incentive
-pre
-astronauts
-carleton
-1.7
-greeted
-cheating
-dislike
-counselor
-gossip
-apples
-60s
-bragg
-mythological
-intro
-cindy
-stabbed
-sensing
-implements
-drafting
-worshipped
-charismatic
-vogue
-eighty
-obsession
-5.7
-2.1
-3.8
-bremen
-2.3
-caledonia
-excelled
-glands
-mellon
-surrounds
-theta
-godzilla
-saul
-quartz
-superb
-infringement
-unpublished
-axes
-appealing
-fleets
-incredibly
-preached
-timor
-extracurricular
-lyndon
-willingness
-gangster
-cassidy
-commemorating
-broker
-medley
-ghetto
-expenditure
-prosecutors
-eliza
-pasha
-obe
-dictatorship
-decimal
-academies
-burgundy
-severn
-nun
-co-starred
-ventilation
-connie
-reactive
-spokane
-alcoholism
-transformations
-nightly
-booked
-nocturnal
-shrimp
-explosions
-renew
-10.5
-rae
-chaplin
-vineyard
-dunedin
-schumacher
-dormitory
-gunn
-mla
-neighbour
-strained
-al-qaeda
-duplicate
-sails
-dictator
-slater
--DGDGDG
-butterflies
-johan
-stainless
-undoubtedly
-bids
-regret
-outfielder
-one-off
-disruption
-simulate
-waist
-drunken
-prophets
-herring
-chung
-abdominal
-salaries
-edible
-commodities
-versailles
-browning
-janeiro
-moe
-3.6
-blew
-capacities
-caliph
-m.d.
-penetrate
-ottomans
-insistence
-paisley
-syllable
-sweeney
-1784
-twain
-fia
-beatrice
-depended
-forcibly
-400,000
-rusty
-rhyme
-corrections
-lydia
-ricardo
-barlow
-roe
-501
-elias
-7.3
-courtesy
-outdoors
-modelled
-sentinel
-shaun
-contempt
-long-range
-jurassic
-matrices
-lookout
-mosques
-comune
-royalties
-diver
-cypress
-ryder
-liable
-audrey
-flotilla
-hampered
-collaborator
-briggs
-galloway
-decomposition
-humidity
-authorization
-bruins
-hints
-saskatoon
-roundabout
-quay
-swans
-viceroy
-dusty
-ruthless
-conform
-robotics
-slept
-116
-amc
-negotiation
-im
-airplanes
-accusing
-parkinson
-nikolai
-establishes
-multinational
-gettysburg
-fielded
-authorised
-nat
-optimus
-lays
-steward
-vibration
-scanning
-integers
-webber
-southward
-123
-densely
-hillary
-1-1
-outpost
-24-hour
-hutton
-mali
-racers
-verde
-moines
-footsteps
-garment
-7.8
-bosses
-hilly
-aristocratic
-merry
-fishes
-mussolini
-sustainability
-sixties
-purdue
-routines
-6.6
-7.6
-ni
-gran
-sour
-distinctly
-akbar
-revue
-122
-confidential
-impaired
-mono
-southernmost
-beers
-reversal
-keel
-igor
-haley
-appropriately
-festivities
-one-day
-delight
-superheroes
-indications
-pointer
-invariably
-8.1
-lan
-mongols
-mounds
-motions
-banquet
-balloons
-gamble
-verify
-protagonists
-wrestled
-firefighters
-airlift
-reviewing
-slovak
-abusive
-oppression
-habsburg
-luton
-scroll
-feasible
-inadvertently
-interrogation
-wrath
-anchors
-cicero
-guitarists
-rushes
-faults
-overwhelmed
-rotated
-sounded
-underside
-techno
-instructional
-vivid
-codice_1
-1781
-porn
-befriended
-5.3
-descriptive
-dolly
-harrisburg
-medalist
-unreliable
-erect
-categorized
-maximus
-coordinating
-lizards
-curvature
-calculating
-aerodrome
-daring
-vegetarian
-consonant
-forerunner
-organizer
-attic
-farrell
-sudbury
-template
-champagne
-barley
-1300
-distorted
-gladstone
-drowning
-heidelberg
-semantics
-header
-2.6
-occupants
-uncomfortable
-spared
-valerie
-martian
-shear
-waived
-proliferation
-comrades
-rivera
-peacock
-fronts
-cyclones
-claudius
-poultry
-ramsay
-succeeds
-127
-elevations
-8.0
-braking
-binds
-convex
-118
-3.3
-dynamo
-warmer
-unbeaten
-stony
-alfonso
-radically
-titanic
-frankenstein
-racecourse
-legions
-judo
-redundant
-35,000
-disagreements
-wildcats
-eastbound
-provost
-concentrating
-realization
-octave
-1787
-primetime
-nate
-accountability
-billie
-loyola
-distinctions
-insurgents
-clint
-6.2
-collier
-neptune
-parodies
-clarkson
-burroughs
-chu
-gotten
-papa
-injected
-patty
-swimmer
-2.8
-sherlock
-landlord
-pinned
-shootout
-horseshoe
-bodyguard
-philippe
-nl
-readiness
-melanie
-consuming
-cynthia
-rosemary
-oman
-denying
-pines
-signalling
-+DG
-unseen
-midfield
-siding
-shakedown
-disrupted
-cheyenne
-jaws
-volatile
-researched
-6.9
-biographies
-unto
-supplements
-equatorial
-cabaret
-claudia
-agreeing
-5.2
-ulysses
-invite
-aspiring
-on-line
-three-dimensional
-dominating
-bytes
-attendees
-formula_12
-lindsey
-westbound
-crowley
-114
-subsidies
-muddy
-inspire
-voyages
-doubling
-guarantees
-kris
-regis
-black-and-white
-shades
-qb
-lasers
-yamaha
-elena
-voodoo
-sanford
-forehead
-durable
-multiplication
-olsen
-maize
-stresses
-6.4
-intercept
-bucharest
-sermon
-formula_11
-consultants
-attested
-overlapping
-congregational
-anybody
-wanna
-prosecuted
-lu
-sanitation
-wentworth
-patience
-guadalcanal
-gently
-churchyard
-m1
-va
-rig
-rhys
-plaster
-intervene
-glove
-cornerstone
-blitz
-panchayat
-banjo
-indicators
-claus
-exceeds
-curling
-chaotic
-eats
-princely
-intrinsic
-thorn
-hillside
-jesuits
-creole
-emancipation
-9-12
-trousers
-electrified
-4.8
-precedence
-obey
-woven
-excitement
-ea
-manpower
-grassroots
-cruelty
-2.7
-tornadoes
-roanoke
-rip
-lsu
-fugitive
-laguna
-sheldon
-assertion
-proxy
-spectator
-atkins
-foundry
-groundbreaking
-installing
-notions
-uncredited
-nu
-forwards
-rodent
-cite
-napier
-salts
-serum
-warp
-magistrates
-multitude
-pixel
-northernmost
-pueblo
-palms
-zoe
-anticipation
-ave.
-abbas
-biotechnology
-pads
-assaulted
-sacrificed
-remembrance
-behave
-packing
-paolo
-arid
-iucn
-guise
-aristocracy
-athena
-mimic
-ascended
-rancho
-postseason
-len
-orthodoxy
-springsteen
-crypt
-avalanche
-tally
-ecosystems
-harley
-urgent
-palermo
-rotate
-abe
-4.7
-pre-war
-skies
-enzymology
-ta
-pts
-cyclic
-paramilitary
-contrasts
-storing
-plaintiff
-cooled
-compassion
-dodd
-scaled
-investing
-overly
-bakery
-executing
-kemp
-anglia
-woodrow
-vineyards
-trojan
-motorola
-high-end
-spaniards
-anti-tank
-vulcan
-illuminated
-shaping
-1779
-constituents
-lectured
-drummond
-tiffany
-amir
-beasts
-imitation
-baronetage
-sanchez
-abandonment
-middle-class
-5.9
-otago
-under-21
-8.2
-attach
-commissioning
-sf
-organist
-pamphlet
-weed
-gb
-temperament
-appellate
-jewels
-taller
-joanna
-enhancement
-anal
-becker
-freedoms
-endured
-beads
-kangaroo
-asphalt
-persisted
-exempt
-grab
-outlaws
-quo
-upstairs
-medicines
-duran
-spices
-fisherman
-thereof
-postmaster
-stricken
-crushing
-assaults
-bathurst
-orbiting
-lehigh
-godfather
-localized
-horseback
-skins
-synonym
-parades
-corrosion
-yates
-baccalaureate
-caucasian
-expenditures
-britannia
-nawab
-dynamite
-lovely
-rye
-dynasties
-sen
-derivation
-lenny
-dresses
-dots
-stagecoach
-nominally
-invertebrates
-es
-pavement
-confessions
-macmillan
-intricate
-reservoirs
-dyke
-grammatical
-earls
-newcomer
-susquehanna
-literate
-josef
-5.4
-sharpe
-nfc
-comedians
-neon
-upscale
-rallies
-sas
-depleted
-sinai
-ballard
-dissent
-directional
-romney
-abandoning
-backstage
-parrot
-incentives
-gunpowder
-play-by-play
-pearce
-royalist
-nebula
-laptop
-navigator
-8.8
-puget
-mid-1960s
-articulated
-ames
-courier
-climbs
-vastly
-coherent
-examiner
-reconcile
-flees
-accessibility
-glover
-phoebe
-lineman
-authorship
-9.1
-disclosed
-eps
-claw
-.1
-willy
-mausoleum
-bali
-jedi
-mesh
-diversified
-well-received
-fission
-550
-bartlett
-whoever
-coptic
-toss
-harrow
-accompaniment
-burrows
-dumped
-holloway
-inaccurate
-classed
-permitting
-luzon
-dormant
-attendant
-gaul
-macquarie
-laude
-remarried
-culinary
-coincidentally
-renomination
-parasites
-medici
-militants
-mead
-clapton
-upn
-avatar
-5.8
-meteor
-choirs
-attackers
-earnest
-franciscan
-whitman
-arranging
-mapped
-replacements
-incompatible
-sediment
-solos
-domino
-easton
-1782
-sermons
-combatants
-crows
-kw
-hypothetical
-lovecraft
-well-being
-cosmos
-constraint
-drying
-grossed
-renaming
-ignition
-ras
-honoring
-promo
-fielder
-legitimacy
-roach
-bandits
-guarding
-pilgrim
-disqualified
-dom
-transplant
-lb
-3.4
-su
-gala
-constructions
-awkward
-sahib
-designations
-stunning
-124
-polly
-phyllis
-refueling
-unmarried
-sediments
-bhutan
-mb
-circulating
-organizes
-sonar
-wedge
-sumatra
-collaborating
-weiss
-walters
-marcel
-massey
-speculative
-patel
-inverness
-protectorate
-lombard
-texans
-cabins
-eastwood
-grabs
-amplitude
-brahmins
-spam
-insertion
-annapolis
-savanna
-pushes
-ventura
-carving
-barnett
-simone
-ligament
-reprised
-amelia
-lynx
-gail
-wal-mart
-podium
-flavors
-db
-,000
-parma
-conceded
-huang
-intervened
-vedic
-leroy
-drumming
-stokes
-ribs
-dagger
-melville
-seventies
-bono
-fargo
-maxim
-raaf
-hezbollah
-coinage
-excuse
-virtues
-feather
-skeptical
-spruce
-freak
-orphanage
-repealed
-credentials
-magnesium
-weaknesses
-vaughn
-walled
-br
-bismarck
-ironic
-gems
-bournemouth
-repeal
-charming
-rhino
-downloads
-periodical
-terminating
-incurred
-hiroshima
-medial
-diagrams
-tariff
-earldom
-ok
-lansing
-achilles
-aero
-inherently
-kilda
-jung
-beckett
-volcanoes
-groom
-alternately
-outgoing
-replication
-perfection
-shortages
-desperately
-260
-query
-conductors
-hinted
-ska
-harvested
-kahn
-doe
-fares
-occupations
-canons
-crimean
-macleod
-chattanooga
-geared
-audi
-chesterfield
-jeremiah
-breathe
-palaces
-astronomers
-climates
-pay-per-view
-3.7
-norwood
-cbe
-messaging
-marlborough
-presses
-one-time
-cybertron
-examines
-vaudeville
-scholastic
-126
-gland
-poses
-kissing
-rufus
-thorpe
-bowman
-hamas
-odessa
-boasted
-mural
-researching
-humid
-gardening
-scared
-sizable
-pensacola
-calhoun
-cured
-ripley
-undertaker
-bohemian
-harvesting
-goddard
-forefront
-executions
-sai
-fines
-batter
-favorites
-basel
-retreating
-archeological
-b.s.
-ups
-highlanders
-crusader
-4.6
-biomedical
-witchcraft
-admiration
-catalan
-extras
-deviation
-mustang
-nirvana
-mortimer
-outputs
-organising
-sami
-localities
-gui
-designate
-temper
-characterization
-whereabouts
-plastics
-oyster
-configured
-cipher
-valiant
-prohibit
-revoked
-barge
-graeme
-influenza
-portage
-flock
-internment
-incarnations
-champ
-650
-monsoon
-resin
-recycled
-synthesized
-diversion
-enfield
-7.9
-zurich
-two-time
-groves
-analytic
-disturbing
-specifies
-harrington
-reclaimed
-despair
-bladder
-lauderdale
-kang
-arrays
-internationals
-hai
-dioceses
-haul
-auditioned
-165
-abducted
-fledgling
-rendezvous
-trivial
-bavarian
-.5
-endings
-stall
-connolly
-evidently
-trainers
-mazda
-verification
-commuters
-mandal
-incapable
-betting
-batsmen
-perished
-cue
-olson
-authentication
-vowed
-banknotes
-infancy
-jealousy
-widened
-rowland
-non-governmental
-sparse
-vascular
-grim
-lend
-high-quality
-trenches
-madhya
-clubhouse
-cared
-brawl
-transparency
-mauritius
-petitioned
-flaws
-procedural
-cradle
-in-game
-balcony
-fascism
-baroness
-overwhelmingly
-adhere
-hearst
-accountant
-dow
-dong
-recounts
-rana
-gertrude
-integrating
-affirmed
-waits
-re-opened
-bargaining
-inclusive
-voyager
-cummings
-tonga
-ignorance
-80,000
-parramatta
-bing
-rebirth
-mentioning
-morley
-riff
-stalled
-koreans
-harmonica
-hawthorn
-songwriters
-parcel
-stuttgart
-legislators
-trophies
-sufi
-sloan
-galveston
-cass
-banished
-heels
-swinging
-pollen
-altitudes
-awarding
-leafs
-cumbria
-overseen
-spurred
-humboldt
-clarity
-honourable
-homecoming
-scare
-ramps
-titanium
-filtering
-laureate
-fade
-cary
-8.7
-idf
-theorists
-formulas
-graded
-peach
-onion
-evenly
-commemoration
-reyes
-upgrading
-confesses
-ssr
-otter
-assign
-elf
-wonderland
-purse
-intersections
-supervisors
-connectivity
-disposition
-nuremberg
-vance
-contracting
-8.6
-utc
-aqueduct
-hygiene
-mach
-compromised
-rowan
-ely
-improvisation
-assassins
-bt
-rescues
-ogden
-interiors
-butch
-preserves
-north-western
-cleaner
-announcements
-schizophrenia
-colchester
-majestic
-kite
-shamrock
-indictment
-seeded
-denise
-explode
-4.1
-entrepreneurs
-weighs
-improves
-ejected
-goodwin
-neurological
-hoax
-chromosomes
-jumper
-babylonian
-marin
-communicating
-paste
-reacted
-270
-danced
-prostitutes
-kgb
-sql
-hornet
-triad
-isa
-surnames
-radicals
-ph.
-asa
-viva
-differentiated
-foe
-christi
-saipan
-topical
-cleaned
-mule
-jihad
-tribunals
-fills
-gigantic
-maclean
-lore
-ideally
-ave
-traveller
-pistons
-debuting
-telescopes
-beetles
-mcc
-sgt.
-liga
-9.2
-synthesizers
-charters
-aiding
-vacancies
-co-op
-arcadia
-originals
-zulu
-nike
-tuscany
-salute
-equivalence
-hellenistic
-isotopes
-launcher
-parodied
-disturbances
-mcgee
-mercedes-benz
-cheer
-fonts
-turbulent
-storey
-unidentified
-impressions
-ascending
-serials
-hanna
-invoked
-transitions
-ramon
-timely
-wrestlemania
-reilly
-altering
-negatively
-indus
-mohawk
-outlying
-barr
-viper
-ezra
-akron
-casinos
-paranormal
-rosie
-biochemistry
-rude
-suzanne
-porto
-cbd
-murdering
-thinkers
-hopper
-albans
-enhancing
-bounce
-thrash
-1774
-functioned
-redesign
-fundamentally
-nehru
-prefect
-70,000
-sci-fi
-graphs
-rodents
-adjunct
-hayward
-rejecting
-decca
-litter
-dubious
-alleging
-boarded
-epa
-coated
-khalid
-conversions
-sentencing
-collaborators
-antibiotics
-cairns
-carlson
-obedience
-1400
-+1
-interstellar
-anti-war
-colon
-continuum
-karma
-claremont
-peppers
-unsuitable
-servicing
-bonding
-cisco
-hussain
-masonry
-lambda
-sequential
-orb
-assassinate
-umar
-heyday
-analyzing
-boiled
-skeletal
-spielberg
-fidelity
-sporadic
-firth
-loyalist
-pune
-powerhouse
-elector
-dire
-8.9
-disposed
-devote
-nautical
-winged
-traitor
-flavour
-confinement
-tanner
-yearbook
-shepard
-est
-slides
-scar
-secrecy
-uneven
-subcontinent
-exemption
-organise
-kapoor
-liberalism
-interacting
-shia
-baked
-adequately
-residual
-motorsport
-computed
-vanity
-reefs
-helium
-membranes
-elgin
-matilda
-nora
-chrome
-gotham
-socket
-dickson
-conserved
-houghton
-mailing
-guildford
-prescott
-low-cost
-secretary-general
-edo
-sumner
-nationalities
-jewellery
-blacksmith
-remark
-conjecture
-vendor
-restless
-showcased
-maverick
-restrictive
-corey
-soluble
-decision-making
-hangs
-cosmetic
-fremont
-fest
-antrim
-not-for-profit
-reflective
-dissolve
-billing
-sardinia
-cheney
-manifestation
-murals
-repetition
-zodiac
-4.9
-novella
-westchester
-insulation
-poisonous
-mona
-bafta
-brandenburg
-experimenting
-concourse
-sadly
-wes
-jarrett
-shrines
-blanche
-revered
-4.4
-compares
-castes
-byte
-grill
-thirds
-make-up
-pathogen
-contrasting
-newsweek
-elmer
-viktor
-nominate
-descends
-hierarchical
-transcript
-preface
-violently
-blackwell
-clashed
-jeans
-brewster
-iberian
-melt
-first-ever
-regulating
-necks
-barnsley
-surveying
-weighted
-births
-hawker
-razor
-back-to-back
-hangar
-myrtle
-bouts
-showdown
-elastic
-punches
-newtown
-fiona
-menace
-fullback
-bison
-halftime
-twenty-one
-horizontally
-clauses
-hu
-unicode
-twenty-two
-131
-vested
-stadiums
-tnt
-bp
-libel
-brig.
-purported
-ambushed
-anti-submarine
-middleweight
-acquaintance
-foliage
-vans
-shea
-mongolian
-injunction
-saxophonist
-unconventional
-dictated
-ctv
-allmusic
-north-eastern
-navajo
-nominees
-gearbox
-insights
-280
-transatlantic
-phelps
-naturalist
-gonzalez
-dreaming
-electrification
-masculine
-simulator
-coyote
-turnover
-mckenzie
-europa
-booking
-haifa
-1785
-orphans
-mesopotamia
-appendix
-nikki
-fixing
-santana
-triple-a
-cinematography
-optic
-mustard
-mercantile
-1.9
-barnard
-redistribution
-father-in-law
-outnumbered
-fallout
-bodily
-paragraph
-windmill
-assessments
-stealth
-samson
-denton
-projectile
-mastered
-lazy
-badger
-crashing
-dependency
-cereal
-collaborate
-carla
-unacceptable
-grenades
-assemble
-stables
-spoof
-helmets
-9.3
-hedge
-ballpark
-greyhound
-hartley
-eighties
-eritrea
-burr
-ku
-sheltered
-envoy
-strata
-whaling
-grasses
-detainees
-palazzo
-swallow
-skyscraper
-ample
-systemic
-nectar
-caller
-immortality
-blended
-starvation
-simulations
-auditor
-arteries
-10.0
-candle
-chick
-watford
-11.5
-syllables
-striped
-acquainted
-projections
-boogie
-duffy
-salford
-1770
-spaceship
-taunton
-butt
-regulars
-freddy
-aspirations
-junta
-acidic
-lieu
-bites
-directs
-prostate
-aerodynamic
-schooner
-totaling
-modernist
-re-established
-creeks
-lied
-belarusian
-fourier
-gillespie
-nme
-quintet
-marathi
-lesions
-variance
-9.0
-cantonese
-judas
-accelerate
-high-altitude
-eureka
-muir
-memorandum
-libby
-hostel
-suns
-propelled
-detectors
-inspiring
-compelling
-informing
-moons
-weigh
-pepsi
-emitted
-cyber
-darius
-scrub
-1763
-reflex
-u-boat
-buys
-sexy
----
-sundance
-proposing
-sacrament
-undisclosed
-balancing
-orton
-competence
-frankish
-unauthorized
-terminates
-mckinley
-refurbishment
-alleviate
-parity
-verona
-sweat
-brahmin
-self-defense
-injuring
-hesse
-chechen
-regulator
-co-author
-antiquities
-decatur
-writ
-contend
-seizures
-gallons
-catastrophic
-1/2
-155
-scully
-berth
-phonetic
-sire
-bourbon
-unopposed
-hooked
-state-owned
-berries
-dwellers
-leftist
-8.4
-tides
-eastman
-spartans
-dunbar
-lara
-geese
-aesthetics
-DGDGDGDGDGDGDGDGDGDG
-starship
-ozone
-zach
-disrupt
-rumor
-peyton
-wharton
-25.0
-cheng
-clerical
-daleks
-jude
-repository
-canning
-consolidate
-admitting
-hates
-moors
-laundry
-deciduous
-hepburn
-elemental
-cambridgeshire
-solids
-canadiens
-pest
-credible
-nicola
-sidekick
-nam
-sms
-chimney
-shale
-donate
-welles
-collided
-neighbourhoods
-greenberg
-9.6
-humanist
-disused
-cathy
-crewe
-nt
-hewitt
-occult
-eternity
-'70s
-invariant
-pencil
-pegasus
-onstage
-summon
-airfields
-gao
-9.4
-giorgio
-moldova
-anthologies
-copying
-revelations
-folds
-havoc
-wessex
-bolshevik
-formula_14
-imagined
-andersen
-bearings
-feminism
-bmg
-gorilla
-disciplinary
-cracks
-autobot
-raiding
-superiors
-junctions
-pendleton
-bathing
-archaeologist
-priced
-repression
-mara
-socks
-commonplace
-sorcerer
-convertible
-lithium
-slade
-sneak
-elijah
-antennas
-apprenticeship
-policemen
-insanity
-olga
-systematically
-grasslands
-waltz
-uninhabited
-lola
-knesset
-puppets
-fiesta
-classmates
-tug
-socrates
-guerrero
-prized
-sheppard
-catchment
-ismail
-halves
-pow
-ar
-hairs
-hydro
-iq
-gina
-italia
-qualifier
-stefano
-tarzan
-melvin
-softer
-barrie
-safari
-sorority
-expands
-unmanned
-grams
-epsilon
-handheld
-35th
-lcd
-fools
-addison
-convict
-zappa
-verge
-bestseller
-asimov
-reigned
-vines
-curiosity
-merits
-fulfilling
-crocodile
-seismic
-zhao
-goldsmith
-bait
-pun
-camouflage
-stand-alone
-engages
-holiness
-steamboat
-gibbons
-shafts
-learnt
-nicky
-g.i.
-gardiner
-merton
-redeveloped
-hormones
-tomato
-tariffs
-tripoli
-staples
-klaus
-kaufman
-lush
-baritone
-amounted
-conqueror
-refrain
-illustrates
-pronouns
-borrowing
-abd
-asteroids
-diner
-137
-jacqueline
-selects
-npr
-gentry
-hawthorne
-feral
-alfa
-sealing
-calder
-shi
-customized
-navigable
-orchid
-converter
-mayoral
-cochrane
-warrington
-pluto
-creditors
-jun
-avoids
-comets
-horrible
-fulham
-ngo
-barrage
-fridays
-petitions
-jeanne
-lavish
-principals
-jaguars
-lennox
-staunch
-bose
-1100
-chola
-jarvis
-emphasizing
-attire
-jar
-bumper
-qin
-davey
-devout
-bahadur
-stimulated
-parry
-lao
-publicized
-ang
-implicit
-choreographer
-afforded
-liang
-fanny
-deliveries
-wasp
-palo
-pharmaceuticals
-afternoons
-branched
-pensions
-reinforcement
-flattened
-louisa
-caleb
-ubiquitous
-atrocities
-chemotherapy
-aspen
-est.
-insult
-garth
-144
-operative
-environmentally
-absorbing
-inviting
-durability
-exiles
-knocks
-sim
-whereupon
-expansions
-mergers
-commence
-natal
-blu-ray
-bachelors
-craven
-dip
-cumulative
-galicia
-illnesses
-pasture
-peck
-reductions
-sedimentary
-longitude
-photon
-grasp
-armagh
-falsely
-paints
-maureen
-gilmore
-laborers
-hilary
-lam
-hugely
-off-broadway
-erin
-shrub
-carbonate
-ourselves
-constabulary
-memorials
-tardis
-bella
-rubin
-mist
-reused
-fountains
-laughter
-resentment
-furnished
-privileged
-grandmaster
-inherit
-nicknames
-plotting
-peacekeeping
-hellenic
-protections
-psychologists
-newborn
-heresy
-wilkes
-pills
-zoning
-9.7
-doomed
-wasted
-applicant
-counters
-boon
-dupont
-formula_13
-bracket
-seminole
-apes
-uniting
-rf
-tipperary
-cullen
-empowered
-choreography
-3.9
-156
-barbarian
-filmmaking
-shankar
-stuffed
-echoes
-accessory
-0.1
-spreads
-cones
-rum
-ava
-galactica
-squire
-damien
-emil
-flipped
-commemorates
-tha
-jimi
-encompass
-sulfate
-reminds
-ewing
-glendale
-narrower
-1-2
-tobias
-estranged
-candles
-undesirable
-lal
-hauled
-rebellious
-sultanate
-snack
-lamar
-hillsborough
-authoritative
-cassie
-loser
-irrelevant
-brightly
-yesterday
-bonded
-10.3
-fern
-reconciled
-now-defunct
-forgiveness
-heiress
-ec
-charlemagne
-prowess
-o2
-katz
-cruising
-reese
-conglomerate
-cypriot
-fungus
-rd
-defiance
-yokosuka
-eileen
-toilets
-amassed
-fiddle
-arranger
-confirming
-pairing
-32-bit
-adjustments
-flyer
-sarawak
-toni
-swiftly
-waterfalls
-scorpion
-muzzle
-falkland
-celtics
-mustered
-jared
-fleetwood
-seinfeld
-identifier
-inhibitors
-grille
-alteration
-commencing
-silly
-dal
-paige
-warehouses
-enslaved
-hm
-pdf
-icelandic
-robbed
-ridley
-south-western
-romanesque
-realities
-din
-cans
-trajectory
-fishery
-listened
-9.8
-in-depth
-masonic
-piracy
-owes
-saigon
-1,000,000
-rossi
-naomi
-advises
-escorting
-9/11
-wrap
-sd
-regal
-glow
-mainstay
-dev
-euclidean
-cappella
-lilly
-brackets
-wrecked
-thicker
-frigates
-arenas
-mcdonnell
-bartholomew
-pleas
-yeah
-shreveport
-planners
-floppy
-antoine
-pretending
-kidnap
-philanthropy
-cinderella
-ultraviolet
-middletown
-bianca
-baptists
-catfish
-storytelling
-horsemen
-shortstop
-outrage
-relaxation
-ives
-+DGDG
-working-class
-intellect
-fairbanks
-maynard
-obstruction
-kilometre
-grimsby
-arson
-lifespan
-novice
-fitzroy
-hi
-outlines
-ellison
-piloted
-mccormick
-wilkins
-ling
-bored
-spines
-3-d
-fuji
-evacuate
-1786
-porcelain
-lisp
-ambiguity
-midtown
-erroneously
-leukemia
-backlash
-goodwill
-blizzard
-tango
-ioc
-'80s
-casa
-longevity
-grabbed
-preseason
-boise
-indicative
-crowns
-kisses
-jeep
-nairobi
-facebook
-flu
-gwynedd
-one-way
-lyricist
-11,000
-madden
-combo
-ammonia
-widening
-resigning
-frightened
-pontifical
-collisions
-distinguishes
-jammu
-rep.
-stunts
-34th
-sleeves
-11.1
-undergraduates
-abyss
-inactivated
-carolyn
-kildare
-peninsular
-humane
-mal
-patrolled
-reproduced
-asthma
-statehood
-rust
-interventions
-diplomats
-undead
-reunite
-peoria
-sightings
-flanks
-stacy
-pious
-wnba
-pictured
-tyre
-penang
-fats
-scandals
-peacefully
-ascribed
-klan
-aol
-alloys
-hackney
-resultant
-criticizing
-hacker
-liquids
-magneto
-conscription
-kaplan
-essendon
-octopus
-commandos
-appropriations
-orchards
-css
-decepticon
-burgh
-reddish
-libyan
-ratification
-tents
-bursts
-hawke
-marrow
-levine
-textual
-judd
-declines
-electrode
-misconduct
-linden
-pr
-indo-european
-comcast
-baku
-shocking
-intensified
-wiring
-stylistic
-cores
-jacobite
-adjustable
-botanist
-invitational
-deportation
-junk
-troopers
-dirk
-levin
-risky
-supper
-haynes
-chicks
-siberian
-showtime
-back-up
-objection
-munro
-olaf
-horde
-bern
-osborn
-low-level
-semifinal
-underwood
-immensely
-robson
-tractors
-dealings
-trapping
-bloomfield
-quota
-soyuz
-terence
-half-life
-beech
-turing
-beaufort
-cleopatra
-nrl
-1660
-rt
-aggressively
-lagos
-josephine
-poorer
-garments
-zack
-marley
-slick
-grayson
-centrally
-uae
-drastic
-revolutionaries
-patrolling
-horne
-algerian
-breasts
-zeta
-indigo
-lodges
-mcdowell
-wellesley
-groundwater
-disagree
-secretaries
-shetland
-terrier
-shout
-operatic
-eras
-promoters
-queue
-opposes
-aromatic
-_
-revolutions
-surely
-kali
-32nd
-ymca
-lim
-bishopric
-tex
-ti
-adorned
-metallica
-blockbuster
-brightness
-2.9
-anatolia
-caters
-erich
-evade
-boring
-mulder
-nineties
-coarse
-esoteric
-hurley
-luthor
-barren
-shawnee
-vulnerability
-marquette
-cheek
-reclaim
-nz
-cosmology
-impetus
-segregated
-cinematic
-harlan
-cp
-ornate
-compute
-dane
-i/o
-loyalists
-bookstore
-dominates
-hasan
-illustrious
-shouting
-macon
-longitudinal
-inuit
-fences
-rejoin
-mcgrath
-rolls-royce
-immersion
-skipper
-interchanges
-ebay
-downing
-archers
-timetable
-additive
-utterly
-neumann
-merritt
-chairperson
-macbeth
-flanked
-z.
-noticeably
-ellington
-bosnian
-louie
-fiery
-skaters
-inhibitor
-felony
-linen
-adapting
-feasibility
-andes
-classifications
-stockport
-strategically
-contiguous
-prop
-152
-johor
-transient
-fluorescent
-distracted
-husbands
-basins
-interrupt
-168
-outset
-mirza
-right-hand
-turkic
-hound
-narratives
-incarcerated
-danielle
-passports
-impeachment
-collapses
-regimes
-quakers
-reintroduced
-pentecostal
-cries
-cebu
-patti
-factual
-south-eastern
-formula_15
-obituary
-insurgency
-muse
-thrive
-papacy
-sampson
-gram
-industrialist
-fijian
-crippled
-finalized
-deactivated
-neutrons
-framing
-corridors
-quake
-auditions
-dilemma
-latvian
-translators
-v6
-lobbied
-aegean
-props
-piercing
-km/h
-veil
-dehydrogenase
-rescuing
-notch
-symposium
-relaunched
-1750
-independents
-repetitive
-huntsville
-righteous
-chapels
-taxonomic
-tx
-hernandez
-cougars
-evaluating
-oven
-dent
-reborn
-callsign
-recommends
-bordeaux
-taxis
-offenses
-norte
-microscopic
-stylized
-maroon
-immaculate
-unpleasant
-320
-henley
-rivalries
-casts
-polynomials
-parkland
-cardboard
-he/she
-borrow
-cosmetics
-stunned
-rc
-snooker
-3,500
-mcclellan
-grading
-pumped
-townshend
-catalonia
-israelis
-10.2
-60th
-inferno
-trombone
-reddy
-kenyan
-successively
-queer
-fatally
-documenting
-conveyed
-fortification
-hallmark
-elective
-syrup
-fide
-chickens
-evangelist
-accolades
-realises
-carvings
-arthritis
-undrafted
-ferris
-quorum
-chichester
-ing
-flashbacks
-reminded
-microscope
-standardization
-quarries
-hospitalized
-impoverished
-procurement
-genealogy
-truncated
-rutland
-preach
-pulses
-quadrant
-palatine
-blanc
-bloomington
-cops
-congenital
-mortally
-coded
-malvern
-costing
-pamphlets
-landowner
-tito
-33rd
-idols
-plight
-spectroscopy
-baja
-stepfather
-tu
-nad
-lowlands
-prohibits
-transformer
-unarmed
-asiatic
-bulldog
-duct
-herbal
-braun
-uv
-educating
-wexford
-commute
-fortunate
-10.4
-mammoth
-manifested
-headwaters
-mounts
-outlawed
-bridgeport
-janice
-specialising
-sm
-leiden
-rhymes
-theologians
-stereotypes
-t-shirts
-epoch
-cholesterol
-sharma
-gratitude
-nutritional
-detecting
-unofficially
-alicia
-cares
-milky
-surfer
-isotope
-markedly
-breakout
-cyborg
-strikers
-adjective
-strife
-coeducational
-siemens
-bret
-tubular
-affectionately
-cosmopolitan
-anxious
-contentious
-needles
-interspersed
-mcbride
-accumulate
-sanitary
-9,000
-wrexham
-inflammatory
-condensed
-bargain
-deco
-lange
-fabrication
-bureaucracy
-bard
-sentiments
-ape
-ping
-schism
-faulkner
-interpreting
-takeoff
-cervical
-bridget
-dissatisfied
-kara
-federalist
-montane
-intercity
-anarchists
-landfill
-ayrshire
-equator
-accents
-werewolf
-gel
-haplogroup
-jurisprudence
-governorate
-protesting
-censored
-narcotics
-romero
-bulb
-havilland
-duluth
-sorrow
-relational
-confirms
-mosquito
-embodied
-fabulous
-rebbe
-baird
-entertain
-bazaar
-rediscovered
-peptide
-aeronautical
-constructs
-fermentation
-tyrant
-bending
-36th
-moths
-posture
-handing
-manly
-cookie
-floats
-futuristic
-toulouse
-scrolls
-gunner
-coloring
-armada
-hermit
-reeve
-10.7
-injustice
-elevators
-cher
-fabrics
-gaius
-enriched
-shipment
-irs
-pressured
-major-general
-reworked
-translating
-unsafe
-projective
-primate
-notwithstanding
-salzburg
-ashland
-outdated
-mahabharata
-navigate
-raft
-multi
-libretto
-330
-high-tech
-piccadilly
-islander
-stemming
-ova
-residue
-floral
-hedgehog
-earle
-rumoured
-galileo
-cranes
-jiang
-gut
-remotely
-redwood
-meritorious
-tata
-deployments
-consecration
-conical
-crises
-pedestrians
-trafford
-heartland
-oceania
-bradshaw
-clade
-connectors
-exponential
-partisans
-occurrences
-cf.
-affirmative
-reappeared
-trident
-rowe
-josiah
-tilt
-rehearsals
-pots
-modem
-deepest
-strands
-concurrency
-transgender
-baggage
-rollers
-smashed
-richter
-governorship
-scripted
-haydn
-noisy
-cyclist
-benches
-erica
-medicare
-first-person
-nana
-luxurious
-advisers
-vocalists
-reunification
-coveted
-plum
-raider
-argyll
-salamander
-10.1
-frequented
-rockwell
-roche
-polymers
-seville
-upside
-embryo
-corvette
-prohibiting
-unpredictable
-rosenberg
-abduction
-adolescent
-unsigned
-rebelled
-convicts
-enrichment
-cheerleading
-hoc
-wetland
-spire
-recommissioned
-grossing
-cheryl
-angrily
-sparrows
-signage
-recounted
-praising
-currie
-dino
-concordia
-banners
-downloadable
-vigorously
-horrified
-edict
-42nd
-dion
-bog
-markham
-ritter
-kant
-lazarus
-mondays
-upkeep
-farmhouse
-stimuli
-collateral
-gareth
-causal
-greenfield
-teaming
-casablanca
-federated
-supplemental
-refining
-sonia
-bolsheviks
-endeavour
-championed
-sharif
-alzheimer
-scalar
-inhibit
-swelling
-bellevue
-carly
-quarterfinals
-unspecified
-anders
-wainwright
-skirts
-revamped
-gonzales
-hodge
-sudanese
-royce
-lena
-highlighting
-1700s
-indices
-tolerate
-rev
-jessie
-kidd
-regatta
-alton
-ass
-baylor
-waller
-dummy
-apparel
-orson
-couch
-nw
-136
-seaman
-reclamation
-revolving
-cigar
-restricting
-dogg
-wavelengths
-fi
-favors
-inexperienced
-fronted
-enroll
-two-lane
-magnolia
-clandestine
-escorts
-pba
-android
-duly
-9.9
-streamlined
-saturated
-adversary
-flourishing
-chases
-danes
-proficiency
-allergic
-cocoa
-consult
-formula_16
-qi
-sentient
-hodges
-calvert
-squid
-inline
-animator
-ccc
-psalms
-supersonic
-accusation
-forbade
-identifiable
-croix
-fascinated
-uptown
-ric
-lace
-draper
-refit
-spikes
-subdued
-amateurs
-michele
-fabricated
-purification
-10.6
-accompanies
-methane
-bipolar
-compiling
-perez
-eucharist
-para
-mastery
-bender
-2013
-ninety
-guerrillas
-radcliffe
-unfamiliar
-grover
-charley
-realise
-primates
-agility
-nitro
-implication
-break-up
-upstate
-frescoes
-chico
-whatsoever
-ict
-emulate
-laird
-racetrack
-jerseys
-outraged
-o'connell
-tajikistan
-tangent
-kickoff
-clerks
-suspicions
-dorchester
-monstrous
-hooper
-airspace
-artificially
-attaining
-himalayas
-1772
-recreated
-cpr
-megan
-sawmill
-fetus
-litre
-helens
-dockyard
-sticky
-cis
-pantheon
-ike
-lillian
-choreographed
-spoon
-sects
-tolls
-nasty
-distributes
-keystone
-peanuts
-coroner
-corona
-10.8
-apologized
-condemnation
-norma
-gage
-migrating
-vortex
-nm
-biking
-shelton
-rotational
-cues
-gma
-huey
-flaming
-residues
-cactus
-persecuted
-lt
-chancel
-maximilian
-cocktail
-lid
-reclassified
-pickups
-overshadowed
-whitehead
-avenge
-groupings
-walkers
-locus
-excerpts
-undeveloped
-torquay
-multicultural
-tore
-guggenheim
-dismay
-allowance
-grinding
-unitarian
-cassandra
-willem
-extracts
-heinz
-oneself
-atm
-wren
-lucknow
-corrupted
-impractical
-biased
-mechanically
-vomiting
-warhead
-m2
-turrets
-influencing
-pas
-cracked
-primaries
-anthropologist
-alphabetical
-caspian
-butte
-subsistence
-lac
-observes
-contenders
-undermine
-aztec
-discourage
-ringo
-venezuelan
-subic
-thwarted
-mindanao
-scaling
-38th
-eurasian
-beatty
-hypotheses
-jehovah
-warlord
-upland
-2009-2010
-weld
-scrooge
-kinase
-burundi
-transistor
-6000
-stormed
-guernsey
-edna
-at-large
-sited
-supplementary
-numerals
-proprietor
-bearer
-8-bit
-budgets
-wealthiest
-alba
-prima
-agrarian
-cola
-jima
-jeopardy
-rationale
-orkney
-osbourne
-olympiad
-illumination
-coincidence
-enforcing
-tokens
-alameda
-joaquin
-mummy
-hilda
-foxes
-labourers
-slalom
-supervising
-viz
-carriageway
-spaced
-walden
-pendulum
-huts
-annoyed
-putin
-evicted
-melted
-kitchener
-unpaid
-emanuel
-fluctuations
-marlins
-templar
-sooner
-ithaca
-perceptions
-275
-quarter-finals
-bundesliga
-telecast
-boilers
-nudity
-alf
-intervening
-hilbert
-novgorod
-4-1
-advising
-year-old
-venerable
-cpi
-154
-stacey
-andover
-cellar
-2004-05
-disrepair
-entertained
-modifying
-senna
-giuliani
-mma
-flawed
-mt
-bayou
-thematic
-migrant
-bae
-perl
-outfield
-phosphorus
-kraft
-ml
-angie
-withdrawing
-right-of-way
-jameson
-parochial
-brood
-merseyside
-insisting
-ill-fated
-symphonies
-believer
-reactivated
-fairchild
-gabrielle
-impedance
-cn
-abrupt
-cambodian
-pearls
-cd-rom
-orthogonal
-flores
-sabre
-symmetrical
-electronically
-figured
-prequel
-urge
-stereotypical
-satisfying
-paralysis
-xx
-rallied
-nan
-chao
-professorship
-sbs
-truss
-14,000
-astor
-grease
-dod
-hepatitis
-var
-observance
-zoology
-meade
-nov.
-crockett
-praying
-nazareth
-smallpox
-reacts
-mechanized
-heron
-pisa
-therapist
-spawning
-midwestern
-impress
-statistically
-twenty20
-fig
-yokohama
-corpses
-manuals
-rajput
-formula_17
-commendation
-kamen
-showcases
-iwo
-taxa
-climbers
-burials
-leverage
-a.j.
-sabres
-craftsmen
-formatted
-amp
-wreckage
-ref
-sibling
-18,000
-lars
-moray
-transylvania
-mansions
-partridge
-vito
-suriname
-disappearing
-encompassed
-privatization
-contradiction
-amd
-teller
-specialize
-neoclassical
-scuba
-axiom
-prodigy
-600,000
-habeas
-predictable
-hottest
-undergoes
-rue
-constants
-expects
-mbe
-brighter
-electrically
-pains
-uci
-remington
-terraces
-sweetheart
-cutler
-enchanted
-abrams
-reigns
-lieutenant-colonel
-lear
-ems
-photons
-frederic
-sporadically
-anarchism
-ex-wife
-malicious
-hmas
-ventral
-mushrooms
-pickering
-zagreb
-ty
-sociological
-marseille
-mermaid
-benign
-proficient
-popes
-gomez
-imperialism
-col
-keating
-consultancy
-prologue
-syriac
-sync
-newcomers
-isis
-lynne
-abby
-pigeons
-slab
-curly
-darrell
-tracing
-spearheaded
-swimmers
-renumbered
-burbank
-heightened
-arable
-adjusting
-mclaughlin
-haitian
-morphological
-deserts
-forgive
-reciprocal
-promulgated
-reruns
-ultrasound
-digestive
-maximal
-shady
-gliding
-posse
-estimation
-vhf
-garlic
-'90s
-rory
-exemplified
-bypassed
-401
-buick
-pixels
-larson
-walkway
-spit
-jock
-qualifiers
-salad
-steamship
-malibu
-kindness
-leyland
-sparsely
-youthful
-widowed
-faiths
-t-shirt
-169
-sly
-summons
-mitochondrial
-.2
-metaphysical
-surreal
-kingsley
-rappers
-ornaments
-lordship
-nobleman
-lucille
-administrations
-lobe
-yorktown
-moshe
-scent
-heats
-mennonite
-copa
-info
-retrieval
-miniatures
-ngc
-willamette
-distillery
-modernized
-thistle
-miriam
-shooters
-fraudulent
-judgments
-inference
-schultz
-^
-cho
-benin
-winslow
-bethel
-embarrassing
-bingham
-hack
-manipulating
-vanilla
-impacted
-shutter
-reset
-scripting
-domingo
-massage
-arrogant
-ripped
-20.0
-tricked
-comfortably
-absurd
-conceal
-kimball
-everest
-natasha
-traumatic
-tt
-bonaparte
-snowy
-schwarzenegger
-exiting
-benevolent
-sarajevo
-vase
-13,000
-bolts
-fay
-jaime
-magma
-scotch
-meiji
-recaptured
-thessaloniki
-sachs
-blessings
-brutally
-millennia
-compile
-juveniles
-peat
-paddle
-shelved
-aeronautics
-penelope
-harmless
-hurdles
-blossom
-favourites
-enigma
-lust
-14.3
-aubrey
-web-based
-metadata
-interfering
-hatfield
-tombstone
-rbis
-electrodes
-soho
-morrissey
-taxpayers
-durban
-tongues
-fraternities
-christy
-freighter
-burnley
-externally
-ngos
-spacious
-morales
-departs
-cad
-y.
-marital
-optimistic
-anand
-henson
-nord
-nod
-mri
-luggage
-abolish
-youngstown
-ionic
-legislatures
-aramaic
-blending
-prompt
-genie
-fractured
-universes
-shenandoah
-hose
-knob
-16.7
-inaccessible
-flocks
-re-signed
-haryana
-montrose
-exhibiting
-timed
-barrington
-reformers
-sizeable
-pollock
-inorganic
-gunnery
-temptation
-onions
-temperance
-prism
-contractual
-orphaned
-vibrations
-d-day
-rocker
-copeland
-purpose-built
-bundled
-sen.
-remembering
-narayan
-13.5
-11.8
-manny
-boosted
-shrew
-mv
-sabah
-submissions
-derrick
-rosario
-offender
-glad
-sal
-nec
-polarization
-deterioration
-rooney
-recorders
-springer
-nacional
-vent
-inspectors
-lastly
-sevens
-hires
-revisited
-kaye
-romantically
-darby
-blasts
-andrei
-complaining
-cb
-aviator
-nagasaki
-aziz
-gideon
-hickory
-interned
-cabinets
-emory
-strasbourg
-glowing
-calf
-enthusiast
-splendid
-seekers
-lyman
-16-bit
-pre-existing
-viet
-pci
-argyle
-southport
-docking
-spence
-ousted
-russo
-mediation
-2004-2005
-commits
-cremated
-assimilated
-dowager
-ke
-hardest
-shipyards
-puck
-chevy
-castile
-lakeside
-jagger
-administers
-citations
-hz
-mcguire
-darryl
-taboo
-anatomical
-servicemen
-dec.
-adherence
-fiercely
-cory
-hammersmith
-latent
-kodak
-linguists
-embankment
-dyer
-filipinos
-mcintyre
-draining
-ulithi
-theo
-goaltender
-afterlife
-duane
-semi-automatic
-guido
-pullman
-upton
-classmate
-quotations
-disillusioned
-rampage
-twenties
-wallis
-open-air
-rabbinical
-nippon
-amin
-backyard
-opt
-lured
-enhancements
-locating
-bananas
-whisky
-dar
-slice
-unsure
-tam
-registrar
-wiley
-dusk
-dragoons
-pods
-parasitic
-handel
-milling
-bows
-flourish
-zoological
-high-rise
-naive
-tapping
-biologist
-himalayan
-euros
-mozilla
-granville
-expressive
-reminder
-alessandro
-mccall
-thinner
-dentistry
-acton
-herodotus
-vertigo
-goblin
-greeting
-wildly
-rockford
-recoil
-ahl
-rus
-byzantines
-picket
-iphone
-quicker
-philanthropic
-kelvin
-2-3
-directorial
-riga
-buena
-rewritten
-k-12
-rarity
-rollins
-gcse
-autopsy
-foes
-drummers
-gaye
-atheist
-tolerant
-encore
-isolate
-elizabethan
-guiana
-stats
-endeavors
-ascii
-misses
-isaiah
-formula_18
-parliaments
-ness
-stade
-conquests
-antibody
-regulators
-flashes
-bully
-summarized
-pollard
-offshoot
-hadith
-wearer
-telephones
-interchangeable
-reluctance
-marjorie
-fir
-liking
-kin
-runtime
-utopia
-10.9
-ic
-inhibition
-2015
-exploding
-222
-shutdown
-1760
-completes
-tuna
-daphne
-tearing
-jelly
-explodes
-shu
-162
-logistical
-standalone
-constructive
-corbett
-alienated
-barges
-arranges
-manchuria
-marissa
-pak
-topeka
-rochdale
-netball
-bushes
-recreate
-emergencies
-drury
-cheers
-rudder
-bulletins
--DGDGDGDG
-coloration
-real-world
-telegram
-motorists
-architectures
-gimmick
-amphibians
-totaled
-gastrointestinal
-donegal
-baseline
-sykes
-screams
-hari
-inca
-correlated
-peanut
-arisen
-relinquished
-dysfunction
-hive
-throttle
-dubuque
-gracie
-villas
-defected
-burnham
-staggered
-mcpherson
-authoritarian
-brutality
-crafted
-crimea
-reg
-pe
-diffuse
-orr
-englishman
-rested
-chong
-navarre
-mathematicians
-hint
-lately
-lyle
-ringing
-swear
-tempest
-relocating
-gown
-glue
-campers
-compressor
-firmware
-ps
-baking
-alt
-prasad
-bel
-roxy
-enjoyment
-gsm
-minogue
-greensboro
-relax
-assure
-communes
-cafes
-domesticated
-harmon
-packard
-ignatius
-thom
-legality
-footballers
-airship
-homeworld
-remind
-trainee
-generalization
-administering
-readership
-begs
-academically
-piazza
-thyroid
-reuben
-smoothly
-epilepsy
-regression
-wilbur
-snout
-two-way
-reincarnation
-tracey
-heraldry
-colloquial
-wolff
-pg
-nanjing
-ashford
-contradictory
-206
-h2o
-campground
-myriad
-hardcover
-propellant
-nighttime
-prospered
-donaldson
-vandalism
-healed
-petals
-typing
-torso
-high-ranking
-all-america
-socioeconomic
-invincible
-bullock
-tanya
-cancers
-11.4
-yusuf
-rhineland-palatinate
-dictionaries
-stipulated
-gabon
-specialization
-gable
-constitutions
-geologist
-moderator
-custer
-stooges
-obesity
-aquinas
-eniwetok
-axioms
-pill
-naacp
-northwards
-cyclops
-contended
-investigates
-facilitates
-artemis
-gliders
-facilitating
-upazila
-sponsoring
-insurrection
-healy
-on-site
-arundel
-ipod
-marxism
-synchronized
-restraint
-israelites
-prank
-electronica
-ethos
-39th
-triggers
-siena
-jonah
-herds
-lister
-bs
-accuses
-leach
-seventh-day
-dissatisfaction
-sleeper
-cookies
-yo
-o'hara
-yarmouth
-zoom
-two-part
-antics
-renown
-16,000
-dalai
-willoughby
-rubble
-cessna
-2009-10
-appointing
-sender
-greed
-abnormalities
-parked
-sable
-antelope
-vinci
-african-americans
-fairies
-sept.
-geologic
-19th-century
-fellowships
-aloud
-non-stop
-handicapped
-cessation
-reforming
-crooked
-off-road
-reckless
-gardener
-enclave
-westinghouse
-spins
-inversion
-avoidance
-cinematographer
-odd-numbered
-morally
-ineligible
-anomaly
-trunks
-formative
-incomes
-siegel
-ccf
-ridden
-cordelia
-ingram
-hubs
-craftsman
-x-rays
-recapture
-molten
-poppy
-r&d
-dormitories
-outfits
-forecasts
-fascination
-flowed
-fading
-musa
-yellowstone
-exhaustion
-blackout
-rothschild
-marius
-browsers
-innate
-bluffs
-theorist
-adapter
-pedals
-sergio
-strive
-nitrate
-cults
-primal
-countered
-offseason
-johnstone
-slips
-bromley
-6-3
-horsepower
-adriatic
-ravi
-livingstone
-shipments
-lsd
-katharine
-theirs
-sardar
-utilization
-arrivals
-chloe
-blindness
-lothian
-headlined
-childless
-offended
-humber
-populist
-pompey
-2020
-minesweeper
-siam
-flushing
-insist
-relic
-mueang
-av
-cavendish
-trojans
-axel
-vijay
-retro
-11.2
-fx
-toro
-psycho
-olds
-haas
-asserting
-bette
-blazers
-probes
-rms
-trudeau
-plurality
-cricketers
-sportsman
-cecilia
-chlorine
-gaia
-spontaneously
-aisle
-aladdin
-narrows
-administratively
-jams
-acknowledges
-henrietta
-intensely
-kellogg
-griffiths
-epstein
-polled
-o'neal
-como
-melrose
-towing
-doherty
-restart
-breweries
-repulsed
-republished
-warbler
-quadratic
-ryu
-vip
-optimized
-weimar
-mahmud
-patsy
-remade
-notoriously
-lacey
-bedfordshire
-knowles
-succumbed
-mainz
-dialog
-cornerback
-understands
-1773
-career-high
-cloning
-electro
-beforehand
-fenton
-boar
-archival
-laughs
-purge
-contra
-ordinances
-jan.
-agile
-geo
-betsy
-worsened
-armory
-q.
-bray
-tbilisi
-adept
-cossacks
-feb
-buddhists
-valle
-dyes
-pliny
-optimum
-coco
-lego
-iteration
-huxley
-plaintiffs
-henchmen
-gunboat
-wounding
-gainesville
-dumb
-tchaikovsky
-renegade
-conquering
-fared
-skeletons
-homosexuals
-acorn
-metacritic
-flinders
-accordion
-recited
-wildcat
-crowe
-14.5
-mersey
-corinth
-mangrove
-exerted
-firepower
-psp
-oct.
-heap
-salvatore
-37th
-wendell
-mysticism
-veneration
-extraterrestrial
-damp
-berwick
-jasmine
-joachim
-tallahassee
-sax
-sonoma
-sandman
-schema
-minh
-barnet
-158
-unlocked
-alchemy
-honesty
-togo
-aura
-moreno
-gambit
-daredevil
-mandy
-coronary
-inquisition
-cpc
-11.3
-passions
-ethnically
-unaffected
-bankers
-endure
-speedy
-vertebrates
-characterize
-boxed
-kern
-padua
-tendon
-philosophies
-33.3
-dredd
-365
-basics
-necessitated
-substantive
-necklace
-hers
-sustaining
-fragmentation
-a-league
-science-fiction
-lew
-culver
-javanese
-stints
-sar
-ether
-blvd.
-ganga
-setback
-prematurely
-0.3
-iberia
-romano
-preachers
-fasting
-mayfield
-remodeled
-tipped
-would-be
-hardened
-dorsey
-triangles
-criticize
-keane
-caledonian
-asher
-landfall
-stallion
-disadvantaged
-tailored
-repairing
-smiths
-rudd
-buff
-vaccines
-22.5
-slavs
-u.s
-bumps
-larsen
-circumcision
-hahn
-lineages
-anarchy
-barangay
-behest
-850
-zoned
-1768
-bergman
-cultivars
-set-up
-loft
-nanny
-punched
-wr
-multiplied
-0.2
-iain
-inefficient
-merges
-whence
-marriott
-tycoon
-drone
-proofs
-razed
-avail
-dentist
-misunderstanding
-piles
-28.6
-340
-two-story
-om
-1765
-seaboard
-tokugawa
-quotation
-raging
-plotted
-heroism
-shoals
-four-lane
-fiba
-mani
-skulls
-initiating
-barbecue
-horrors
-ghent
-turk
-grafton
-fang
-heavens
-burgeoning
-paddington
-palmerston
-mcleod
-unleashed
-dopamine
-extremes
-27.5
-manifestations
-clemson
-verizon
-crook
-hasidic
-globalization
-catalytic
-mateo
-bcs
-rani
-exacerbated
-milo
-learners
-regulates
-synchronization
-excursion
-gaston
-subscriber
-robes
-zebra
-134
-nightclubs
-earns
-domestically
-embarrassment
-triples
-alain
-26.5
-algiers
-intestine
-lai
-malabar
-stepmother
-antigen
-23.5
-spectra
-derelict
-inspections
-aborigines
-showcasing
-kamal
-wrongly
-paine
-overlooks
-minted
-tesco
-heraldic
-constrained
-newbury
-volga
-assortment
-163
-neuroscience
-formula_19
-lodged
-benefactor
-fordham
-boca
-chemically
-deformation
-slaughtered
-skirmish
-12.7
-landlords
-shit
-assessing
-irony
-small-scale
-fayette
-transvaal
-diffraction
-hunts
-orthography
-prosecuting
-gestures
-accommodations
-expos
-obscurity
-incidentally
-plentiful
-embassies
-persist
-uniformly
-commandments
-complain
-handler
-post-graduate
-bastard
-astro
-riddle
-stir
-ordeal
-nonstop
-baronets
-damian
-guelph
-informant
-intersecting
-guangdong
-sulfide
-roadside
-elektra
-societal
-singer/songwriter
-botanic
-faint
-dai
-alkaline
-a-side
-rampant
-drills
-conservatism
-allotted
-plutonium
-specialties
-somerville
-rash
-spelt
-27.3
-ceasefire
-artisans
-wow
-hermes
-signify
-embark
-practise
-fetal
-capcom
-trafalgar
-adultery
-maestro
-mammalian
-accorded
-illustrating
-enrico
-sitcoms
-townspeople
-m3
-faulty
-boniface
-akira
-carousel
-impairment
-winery
-rightful
-micropolitan
-negligence
-engraving
-synagogues
-utilised
-mabel
-anti-semitic
-all-stars
-scarlett
-all-state
-recovers
-croats
-off-season
-lifeboat
-leaning
-delighted
-austro-hungarian
-11.6
-dough
-dre
-router
-kazan
-167
-embryonic
-accelerator
-tolerated
-ares
-toast
-funky
-friars
-shelves
-noises
-disarmament
-spacetime
-gmbh
-grassy
-erskine
-cathode
-pows
-clements
-aroused
-internship
-combs
-diaz
-seamen
-uphold
-wehrmacht
-colspan
-appoints
-paranoid
-nonsense
-scranton
-animations
-stain
-defective
-winnings
-gaines
-devastation
-payroll
-mormons
-honeymoon
-saline
-ode
-mon
-baden
-controversially
-pianos
-notification
-agatha
-maitland
-bukit
-shapiro
-1820s
-resolving
-nausea
-three-member
-waldo
-adjutant
-doubtful
-fai
-tab
-hyundai
-148
-dwarfs
-insider
-vassal
-motorized
-gamecube
-bosch
-crustaceans
-25.5
-barclay
-moonlight
-accessing
-2-2
-two-dimensional
-wrought
-edmond
-basalt
-appetite
-walsall
-fertilizer
-endorse
-plank
-logged
-full-scale
-climatic
-genuinely
-inertial
-illicit
-incidental
-infiltrate
-exchequer
-drafts
-graders
-decrees
-kibbutz
-acacia
-eruptions
-tesla
-afrikaans
-sorting
-silla
-modal
-consulate
-winfield
-183
-splinter
-mcconnell
-orbiter
-reformer
-usefulness
-textures
-katy
-bertrand
-surfers
-ching
-diarrhea
-reputedly
-packer
-gamers
-cakes
-shuffle
-farmed
-75,000
-weymouth
-retreats
-arden
-leasing
-movable
-saxons
-inns
-asians
-blaine
-lair
-ut
-rect
-sideways
-curt
-peacetime
-pests
-13.3
-10:00
-ari
-nasser
-rashid
-militias
-benoit
-hiro
-kirsten
-hemingway
-canaan
-swings
-renounced
-bloomberg
-usaaf
-orwell
-cossack
-anzac
-requiem
-hines
-knicks
-11.9
-importing
-thugs
-encrypted
-tapped
-vertebrae
-0.8
-deduction
-sprung
-75th
-illusions
-removable
-algebras
-ignited
-hectare
-dundas
-lubbock
-pause
-collectible
-overthrown
-gladiators
-slough
-pathogens
-rebound
-glastonbury
-morals
-urges
-purcell
-alerted
-boolean
-royale
-hampstead
-fashioned
-activates
-cryptography
-courtroom
-detrimental
-tories
-anti
-atrium
-damn
-replicate
-maratha
-transcontinental
-instinct
-wynn
-anti-communist
-luciano
-fundamentalist
-mainframe
-faust
-snoop
-michelangelo
-wycombe
-thrace
-22.2
-immature
-erection
-vice-chancellor
-vinegar
-promenade
-pleistocene
-kowloon
-receipt
-luger
-fairness
-tristan
-eminem
-communicated
-hardship
-rd.
-esperanto
-deteriorating
-forgot
-hays
-oakley
-robe
-navigational
-wat
-tropics
-mobilization
-oprah
-manx
-thebes
-boyz
-erstwhile
-cha
-recurrent
-elaborated
-linguist
-fillmore
-brink
-feng
-bingo
-o'sullivan
-seton
-demography
-aggravated
-low-income
-realising
--1
-kat
-solemn
-puppy
-12.1
-wasps
-4-0
-equivalents
-growers
-hotspur
-gladys
-timer
-antibiotic
-crisp
-jaya
-fuzzy
-caliphate
-fours
-nearer
-bb
-sha
-replicas
-boxers
-roswell
-simeon
-courtship
-distillation
-eyewitness
-davy
-remedies
-britons
-hackett
-cantor
-six-year
-anc
-rotherham
-pictorial
-validation
-mildred
-13.6
-mid-point
-iss
-brahma
-swapped
-twenty-six
-maguire
-vending
-hen
-hairy
-rewrite
-leases
-previews
-DGDGDG.DGDGDG
-adopts
-lough
-childbirth
-49th
-beg
-nonlinear
-mediated
-cuckoo
-wilcox
-164
-ltte
-aden
-wen
-articulation
-robbers
-rockers
-compensated
-garibaldi
-grooves
-emigrants
-misuse
-tun
-corp
-joanne
-superstars
-referees
-fungal
-stacked
-26.7
-ignores
-majored
-1689
-escarpment
-skier
-computerized
-burnside
-sabrina
-quezon
-non-existent
-chanting
-mcqueen
-tiberius
-bulgarians
-dispersal
-richer
-auditory
-supernova
-odor
-formula_20
-currencies
-soy
-aryan
-1769
-overt
-prom
-usl
-rosen
-fiberglass
-counselors
-nylon
-antagonists
-minnie
-mandolin
-deserved
-hess
-hodgson
-borden
-sturgeon
-cosby
-vitro
-lund
-int
-spacing
-kathryn
-labeling
-malacca
-indira
-structurally
-mixtures
-propositions
-k-1
-enoch
-heaviest
-sci
-physicists
-salerno
-handy
-sewing
-sargent
-goldstein
-tko
-irregularities
-bmx
-gupta
-silesia
-canned
-1715
-juris
-reversible
-senatorial
-11.7
-gi
-mckenna
-trademarks
-enlist
-coils
-rabbinic
-erasmus
-reversing
-wading
-footprint
-sighting
-montage
-calculator
-horatio
-warhol
-harmonies
-travancore
-paving
-1745
-stills
-crusades
-martyrdom
-crawl
-breadth
-blends
-spores
-dignitaries
-slowing
-pa.
-bari
-nguyen
-6-4
-intersect
-extravagant
-tagged
-gwr
-livelihood
-champlain
-erratic
-mandela
-launchers
-omnibus
-jolly
-interconnected
-luca
-lodging
-squirrels
-airmen
-disbanding
-yours
-sanction
-hui
-rents
-suicidal
-leland
-pediatric
-php
-arsenic
-marianne
-hartman
-csi
-entourage
-floated
-therapies
-indochina
-glimpse
-soto
-lends
-demoted
-eduardo
-woolwich
-rko
-dorian
-bowel
-tuck
-confuse
-dora
-thorne
-maldives
-exercising
-kobe
-wabash
-fernandez
-fifties
-pursuant
-feats
-ju
-mana
-whitewater
-ferrara
-devotional
-non-league
-mediocre
-navies
-islington
-penitentiary
-ono
-metaphysics
-bourgeois
-a2
-bombardier
-laurent
-unlock
-unearthed
-octagonal
-domes
-palette
-dumont
-lau
-olympus
-12.3
-top-10
-ulrich
-epithet
-assigning
-coolidge
-clarified
-pep
-swarm
-hilltop
-scrapping
-hastily
-rajputs
-palladium
-cryptic
-bridgewater
-upbeat
-non-commercial
-stacks
-pitted
-bede
-hindered
-caterpillar
-balfour
-consequent
-zenith
-tubing
-skepticism
-shaken
-cnbc
-poplar
-starboard
-feb.
-tonal
-wiki
-msc
-superboy
-dumping
-oscillator
-looney
-pinball
-choi
-keepers
-transcribed
-forsyth
-divergence
-janata
-rb
-maidstone
-1758
-northerly
-effected
-magnate
-catalina
-infiltration
-bedrock
-forage
-tonic
-industrialization
-implants
-dizzy
-darfur
-mid-19th
-inspected
-legislator
-spectacle
-handgun
-codified
-zhu
-graf
-fascinating
-airliner
-retitled
-toxin
-drifting
-douglass
-outsider
-taxpayer
-paralyzed
-re-named
-playlist
-flashing
-appropriated
-arcs
-melee
-ir
-corresponded
-trotsky
-ventured
-bsc
-guangzhou
-accords
-umpires
-arroyo
-mcnamara
-serge
-alp
-brokers
-cato
-aqua
-panda
-sheds
-keaton
-christened
-palin
-baum
-broughton
-juventus
-welterweight
-headlining
-tehsil
-awakened
-amend
-khrushchev
-lehman
-jalan
-bam
-euler
-fg
-austen
-1755
-26.3
-hasbro
-advantageous
-concentrates
-abba
-predicting
-clover
-1761
-shouted
-negligible
-courageous
-catastrophe
-marietta
-milne
-beheaded
-ramos
-unorthodox
-01
-filtered
-celsius
-tentacles
-alistair
-mandir
-grenada
-object-oriented
-spellings
-govt
-25.4
-expansive
-inward
-12.2
-pretend
-knitting
-islamist
-kimberly
-carmichael
-convection
-privatisation
-epilogue
-witnessing
-purportedly
-australasia
-britney
-jordanian
-diets
-hue
-miserable
-cas
-selfish
-mats
-kettle
-beverley
-24.1
-nunavut
-hydra
-wardrobe
-abs-cbn
-sectional
-supplanted
-doorway
-cemented
-zachary
-drifted
-reopen
-billboards
-proctor
-respondents
-mexican-american
-carole
-marred
-cedric
-psalm
-lulu
-viability
-labyrinth
-ridings
-mango
-archangel
-1910s
-racially
-amplified
-likened
-egan
-papyrus
-epidemiology
-liquidation
-microprocessor
-wilton
-embarrassed
-poorest
-manfred
-robins
-clancy
-25.6
-peer-reviewed
-advertise
-peshawar
-fearful
-biomass
-danville
-wary
-shack
-adi
-grumman
-enactment
-heathrow
-erased
-pagoda
-allegation
-whistler
-multiplex
-cdc
-slovenian
-23.1
-penultimate
-sasha
-equitable
-chechnya
-truths
-pretoria
-25.8
-pesticides
-frazier
-fk
-southend
-athenians
-mace
-expressly
-showers
-bribery
-expectancy
-clerics
-moat
-vauxhall
-shi'a
-supervise
-mutated
-naturalized
-vest
-variability
-scars
-smoked
-dio
-24.4
-cayman
-delegated
-drank
-hungarians
-witty
-fruition
-dobson
-suspend
-lesbians
-warrants
-mojo
-accountants
-clad
-brutus
-insults
-intestinal
-sprang
-co-hosted
-diva
-lorne
-26.2
-ravaged
-hershey
-subtitles
-migrations
-reins
-loaf
-malignant
-http
-hornets
-ortiz
-valuation
-08
-kilograms
-25.9
-breakaway
-grady
-hardships
-racehorse
-amplification
-potentials
-anson
-eno
-interviewing
-downed
-fax
-byu
-assigns
-clarify
-northrop
-spitfire
-fran
-masts
-0.6
-1766
-carte
-vandals
-13.2
-kb
-mcgregor
-tomas
-naga
-astral
-lettering
-jess
-tentatively
-appliance
-redding
-befriends
-drilled
-septa
-em
-massacres
-terrified
-panoramic
-bonuses
-speculate
-penetrated
-chieftain
-gull
-parentheses
-spanish-american
-projectiles
-audible
-downturn
-1650
-mariner
-containment
-maud
-sloping
-1762
-intimidation
-becky
-farley
-escalated
-1707
-qualitative
-newly-formed
-solvents
-penetrating
-puritan
-erwin
-socio-economic
-glossy
-inter-county
-boomerang
-footpath
-mysteriously
-bsa
-dahl
-fabian
-pastors
-non-linear
-telford
-heisman
-tankers
-networked
-specifying
-steroids
-epsom
-yarra
-manifolds
-tufts
-24.3
-wee
-ignorant
-bromwich
-habitation
-centimeters
-plugs
-menzies
-shiny
-380
-symptom
-cathedrals
-londonderry
-ja
-cochran
-confess
-masjid
-strap
-osama
-interchangeably
-arriva
-horizons
-furnishings
-rot
-suspense
-poseidon
-caine
-ox
-industrialized
-screenings
-scientifically
-normans
-restarted
-325
-fearless
-breeder
-observable
-waugh
-bulbs
-tcp
-entrepreneurship
-hands-on
-inhabits
-stampede
-parcels
-1764
-imitate
-bandit
-msa
-tailor
-delphi
-gunners
-midget
-waco
-seasonally
-indoors
-m4
-hades
-alaskan
-stamped
-eliminates
-scanner
-downey
-eels
-astoria
-26.8
-26.4
-roommate
-disciplined
-amphitheatre
-exploiting
-konami
-billions
-1,300
-outcry
-esteem
-standout
-acknowledging
-hoboken
-stump
-maternity
-involuntary
-postcode
-aborted
-ludlow
-peg
-old-fashioned
-hobbies
-top-level
-manchu
-falklands
-woodpecker
-swore
-reliably
-clearwater
-trey
-coney
-biochemical
-17.5
-unionists
-madurai
-martyn
-12.8
-inertia
-disclose
-phylogenetic
-hiv/aids
-kota
-coasters
-centro
-bitch
-resurfaced
-fractions
-pens
-cinnamon
-etruscan
-2014
-asbestos
-exporting
-kangaroos
-tarot
-tick
-6-2
-mastermind
-cautious
-messina
-journalistic
-golfers
-rankin
-urinary
-spectre
-docked
-layered
-mathematically
-tyres
-csx
-chants
-low-budget
-kawasaki
-embroiled
-niles
-lombardy
-ama
-cavaliers
-cowan
-12.4
-25.7
-fathered
-26.1
-detonated
-kurds
-deficiencies
-hobbs
-dew
-interment
-realignment
-kira
-serialized
-silhouette
-utica
-counterattack
-retracted
-haunting
-walla
-stringent
-intuitive
-mas
-outrageous
-antilles
-bash
-starch
-ferns
-overlaps
-justinian
-safeguard
-southwark
-che
-madeleine
-depots
-11.0
-cgi
-sequencing
-3-4
-decreed
-vicksburg
-gnome
-deformed
-pulpit
-gough
-27.8
-entertainers
-ukrainians
-covington
-24.6
-blond
-methyl
-carp
-ericsson
-trooper
-zionism
-coles
-expatriate
-209
-sumo
-slogans
-gov.
-antigua
-bondage
-refinement
-jacks
-anemia
-huntingdon
-life-long
-terre
-bonn
-berber
-23.3
-saab
-burrell
-12.0
-pelvic
-whitehall
-miraculous
-acetate
-discomfort
-cartman
-wed
-chronicled
-1,600
-wi-fi
-croft
-evaporation
-pinch
-distraught
-alta
-fokker
-12.9
-sleepy
-reorganisation
-marsden
-historia
-darcy
-maha
-fontana
-worries
-21.7
-lind
-mahatma
-aground
-lori
-intern
-whitaker
-hove
-footwear
-unicorn
-ditches
-gorbachev
-ensues
-intrusion
-embargo
-mass.
-overcoming
-mahal
-friendships
-calabria
-tending
-wording
-larkin
-imf
-sw
-lps
-strangely
-frontiers
-yachts
-vertebrate
-mic
-kalamazoo
-sourced
-yosemite
-lizzie
-beau
-preschool
-bertie
-23.8
-tanaka
-ethic
-precautions
-abelian
-formula_21
-bassett
-triggering
-seizing
-deng
-24.5
-underparts
-lowry
-foo
-gladiator
-undermined
-hegemony
-tributes
-flea
-wastes
-disgusted
-1999-2000
-sligo
-centerpiece
-hermitage
-on-campus
-abolitionist
-interscholastic
-21.4
-mrt
-meath
-kites
-ballarat
-carpenters
-devoid
-firefox
-inspirational
-24.8
-barbarians
-mid-atlantic
-hikers
-foreword
-trumpeter
-lauded
-speeding
-twenty-three
-matthias
-well-defined
-protestantism
-pouring
-mathews
-javascript
-harassed
-separatist
-finer
-uncanny
-nightingale
-peas
-218
-hitherto
-canoeing
-bleed
-leno
-islamabad
-motivations
-pulaski
-birkenhead
-mid-1950s
-nascent
-operatives
-45th
-forster
-177
-coe
-symbolize
-second-largest
-gras
-scipio
-leah
-australasian
-bundles
-touted
-massively
-collapsing
-2003-04
-receipts
-mohan
-marlowe
-macro
-gangsters
-deccan
-mouths
-schoolhouse
-temptations
-cy
-hempstead
-irons
-unicron
-pity
-17,000
-blackmail
-barking
-nypd
-projector
-pragmatic
-lax
-avro
-restrained
-microscopy
-newmarket
-patriotism
-hardin
-walpole
-accommodated
-malt
-biennial
-stabilization
-cobalt
-doppler
-reopening
-easiest
-notebook
-terminator
-pryor
-twenty-first
-weavers
-clemens
-headache
-platt
-piety
-predatory
-undisputed
-precursors
-saginaw
-bouncing
-reuse
-aqueous
-mendoza
-o'malley
-alvarez
-mustafa
-generosity
-hog
-proudly
-370
-compartments
-nyu
-dalek
-self-government
-weeds
-symbolizes
-aegis
-alexei
-fostered
-unrestricted
-ecstasy
-thuringia
-traversed
-ms-dos
-greer
-instigated
-allusions
-hurst
-montagu
-hubble
-compliant
-breached
-circumstance
-tentative
-bakersfield
-cheung
-emulation
-electing
-chun
-ayr
-modernism
-addicted
-vehicular
-revert
-char
-esp
-toxins
-m.s.
-jay-z
-15.4
-290
-1,100
-fis
-overlord
-minerva
-carnivorous
-mobilized
-mollusks
-27.4
-abigail
-oddly
-monotypic
-tabernacle
-lbs
-headlights
-ravenna
-1a
-sloop
-daley
-yarn
-hyperbolic
-re-issued
-blames
-boutique
-bets
-cs
-punishments
-midsummer
-bikini
-announcers
-28.1
-exited
-1642
-disgust
-50s
-stimulating
-selangor
-emery
-bottled
-41st
-subgroups
-pax
-isthmus
-24.2
-chelmsford
-lpga
-lucifer
-msnbc
-proclaiming
-fragmented
-resisting
-nutrient
-malayan
-x86
-159
-barbed
-chew
-linkage
-probabilities
-buren
-asymmetric
-atv
-dempsey
-27.6
-galilee
-conserve
-detachments
-warlock
-progressing
-casing
-typeface
-flex
-maru
-cabot
-45,000
-ultimatum
-sorties
-interfered
-fonda
-snp
-startup
-strives
-tobin
-sculpted
-encountering
-ridiculous
-waverley
-karaoke
-bypassing
-qc
-whitby
-hamid
-incompetent
-all-around
-shillings
-curricula
-coli
-longstanding
-mirrored
-sunken
-carts
-16.5
-deserve
-versatility
-attorney-general
-wick
-castillo
-psyche
-mayhem
-practising
-benito
-grimm
-giacomo
-implicitly
-dotted
-spinner
-+2
-guilds
-apprenticed
-jimmie
-humiliation
-sterile
-anastasia
-xu
-nietzsche
-penrith
-nexus
-doctrinal
-dns
-viewpoints
-whiting
-sub-saharan
-biting
-updating
-achieves
-mont
-bethany
-glam
-jackpot
-oratory
-fibres
-circuitry
-landscaping
-post-season
-pointe
-u.n.
-kimberley
-entrenched
-calendars
-pont
-confronting
-weasel
-pillow
-1771
-gop
-cultivate
-footbridge
-silica
-microphones
-queries
-pursues
-finely
-sheen
-adjectives
-hardness
-oricon
-platte
-disqualification
-inventing
-18.8
-doomsday
-petra
-13.8
-yvonne
-selby
-lambeth
-rigged
-advertisers
-rockingham
-harcourt
-magicians
-deploying
-secretive
-comptroller
-oct
-juniper
-shortcomings
-enquiry
-three-day
-zelda
-attendants
-chill
-rene
-buxton
-verdi
-sapphire
-spongebob
-420
-full-size
-palau
-mcmanus
-reprints
-olympian
-lipid
-captaincy
-sentimental
-persistence
-23.6
-portals
-ceilings
-stitch
-subsets
-half-way
-coltrane
-apologize
-condominiums
-protracted
-25.2
-infinitely
-meteorology
-placebo
-arjuna
-shook
-alamo
-reappears
-stonewall
-cartilage
-signifies
-rag
-22.6
-quoting
-welcoming
-compilers
-c1
-great-grandson
-subterranean
-infirmary
-mariana
-caverns
-culmination
-27.7
-exemplary
-flare
-fraternal
-rebellions
-nightmares
-marlin
-downwards
-181
-plaques
-retina
-pudding
-fulfillment
-26.9
-trillion
-parton
-chai
-balliol
-moored
-twitter
-bounced
-retractable
-mimi
-dietrich
-rehab
-distraction
-subaru
-pastures
-irrespective
-exert
-graphite
-cartwright
-obtains
-outing
-wanda
-12.6
-prayed
-makeshift
-brentwood
-faber
-rulings
-taurus
-pali
-sensible
-aquino
-20.5
-stabilized
-kew
-samsung
-adventurer
-richland
-horticultural
-64-bit
-capacitor
-offline
-adolescents
-binghamton
-patriarchate
-exchanging
-tamworth
-27.1
-conner
-solaris
-deter
-torre
-newell
-rib
-distract
-self-contained
-inserts
-dessert
-ursula
-falmouth
-protons
-premiers
-mage
-pardoned
-zeal
-planck
-periphery
-yunnan
-187
-vita
-scala
-pim
-20.8
-condor
-repatriation
-dole
-jingle
-awakens
-dispose
-15.5
-julio
-treble
-warring
-barron
-antisubmarine
-bhutto
-kyrgyzstan
-i-95
-telephony
-0.9
-elle
-messengers
-telepathic
-tagline
-motocross
-3.00
-cymbals
-lamont
-codice_2
-marlon
-gaussian
-ren
-anglo
-casper
-seniority
-substitutes
-surrogate
-departures
-leaflets
-lakeland
-pol
-novak
-reconstruct
-dryden
-semitic
-pivot
-palma
-178
-primes
-biplane
-colleen
-bryn
-quarantine
-imaginative
-disregard
-monorail
-joplin
-sausage
-cynical
-looted
-27.2
-danzig
-professed
-thermodynamics
-untouched
-inert
-lifestyles
-conduction
-lrt
-provocative
-starscream
-gifford
-underwear
-chewing
-excursions
-singaporean
-214
-invaluable
-23.9
-marta
-brit
-blinded
-connotations
-supervillain
-deprivation
-battered
-profitability
-kmt
-obscene
-jokingly
-bn
-irrational
-chancery
-echoed
-greats
-timeslot
-26.0
-pre-production
-bjp
-lexical
-sichuan
-ppp
-wheeled
-bacterium
-play-offs
-lakewood
-herons
-kurdistan
-non-partisan
-liechtenstein
-fresco
-gorman
-nico
-13.0
-skye
-14.1
-thrived
-covent
-grunge
-condemning
-fullerton
-theorized
-billings
-c2
-isp
-sk
-evasion
-enormously
-registering
-townsville
-three-time
-bedrooms
-world-class
-dawkins
-spock
-exponent
-shaking
-formula_22
-low-power
-armageddon
-tundra
-21.5
-44th
-gunmen
-one-man
-haskell
-dangerously
-ellsworth
-garter
-0.7
-callahan
-warped
-gee
-turbulence
-ollie
-mississauga
-accountable
-harald
-seaport
-paz
-post-world
-28.3
-ligand
-gathers
-richly
-anwar
-mundane
-sweets
-dim
-diablo
-fitzpatrick
-singled
-1,400
-nearing
-booths
-gilmour
-vulgar
-visionary
-bland
-frenzy
-knighthood
-pyramids
-compounded
-transitioned
-thigh
-relays
-jat
-co-starring
-monash
-booming
-visibly
-orderly
-scam
-pavel
-overture
-abs
-riches
-predation
-dogma
-monaghan
-rochelle
-perfume
-overflow
-formalized
-sober
-cooperatives
-subtitled
-eviction
-sociologist
-sinks
-shocks
-'60s
-argent
-crucifixion
-gallant
-mandates
-slug
-sugarcane
-dumfries
-jing
-reductase
-cougar
-bans
-maharaj
-lymph
-que
-deriving
-dissipated
-hanoi
-widest
-sticker
-swallowed
-choctaw
-mori
-22.7
-needy
-latex
-regaining
-unexplained
-await
-hand-to-hand
-stereotype
-235
-ata
-transistors
-canoes
-dwarves
-21.3
-assorted
-kennel
-sui
-mcintosh
-mina
-outbreaks
-lockhart
-dissident
-keynote
-youngsters
-serotonin
-43rd
-27.0
-demonstrators
-late-night
-serb
-crank
-magnets
-reconstituted
-skirmishes
-bernardo
-26.6
-24.7
-deferred
-nair
-faye
-carpets
-afflicted
-artefacts
-haines
-noodles
-submitting
-prehistory
-inferred
-23.4
-xerox
-salsa
-rss
-azad
-sidings
-cleric
-alarmed
-beale
-diminishing
-interceptor
-scans
-laredo
-haute
-jonny
-furnaces
-rotates
-limbo
-tammy
-bends
-hypothesized
-empties
-thurston
-lexicon
-edits
-condominium
-greyhawk
-bernhard
-taxon
-odin
-donnie
-cpus
-watergate
-bangladeshi
-inserting
-setbacks
-camille
-derogatory
-arched
-neuron
-expel
-owe
-byzantium
-awaited
-derwent
-rees
-transporter
-conclave
-ripe
-blaster
-augsburg
-rupture
-seduce
-prometheus
-bonnet
-13.4
-panhandle
-kidneys
-fundraiser
-triathlon
-immersed
-diminutive
-wilfred
-bai
-one-hour
-amman
-0.4
-mosley
-pretext
-colby
-eucalyptus
-purana
-revitalization
-americana
-pac
-markup
-airframe
-wastewater
-mcgraw
-looting
-earthly
-shaker
-resented
-fostering
-51st
-conservancy
-dorm
-walford
-folio
-terrell
-priscilla
-larva
-penance
-moot
-unequal
-willingly
-stumps
-shepherds
-scary
-beavers
-1688
-programmable
-vaccination
-4-2
-treatises
-ste.
-00
-levied
-defends
-mbta
-gunshot
-numeric
-maori
-reviving
-depletion
-executable
-wipe
-tung
-expiration
-holistic
-yoko
-funnel
-sprinter
-waged
-xander
-enrique
-zane
-pollutants
-23.7
-anthropomorphic
-28.0
-insulting
-flaw
-southwards
-verne
-4-3
-pharmacology
-partitions
-fitz
-consultative
-quran
-antiques
-21.6
-steaming
-napa
-voip
-student-run
-welded
-excludes
-stockings
-inquiries
-disperse
-comical
-detainee
-tyrol
-bbs
-url
-romani
-enamel
-appleton
-cleansing
-acoustics
-waterfowl
-elites
-luge
-measurable
-sloane
-camels
-brampton
-una
-sidewalk
-fdp
-23.2
-vulture
-nakhon
-montague
-stewardship
-selma
-jamming
-peugeot
-fallon
-trough
-gallipoli
-cuthbert
-lanarkshire
-fiddler
-interoperability
-cryptographic
-2600
-manganese
-marguerite
-doping
-transcendental
-rotations
-regan
-multi-use
-tungsten
-cleanup
-nov
-picard
-maui
-interpretive
-21.1
-polaris
-warmth
-priestly
-nonfiction
-nh
-improv
-18.2
-breaker
-scotty
-petit
-fitch
-trimmed
-13.1
-reptile
-marconi
-overcame
-1767
-snare
-clipper
-poirot
-1-3
-diaphragm
-thrill
-wheeling
-satisfies
-410
-pandit
-mckinney
-fictionalized
-meats
-constables
-mid-20th
-14.7
-prosecute
-fayetteville
-glamour
-07
-mays
-timbers
-amusing
-stardom
-canine
-standpoint
-waned
-embracing
-stemmed
-24.0
-eel
-dunlop
-widescreen
-utter
-goguryeo
-hua
-enact
-radars
-cobain
-blackwood
-dynamically
-modena
-cline
-tweed
-1640
-afloat
-heidi
-piping
-22.4
-rudimentary
-renders
-13.9
-beautifully
-hacking
-horticulture
-mart
-120,000
-gpa
-1086
-suharto
-localization
-eurasia
-smiling
-westport
-tutelage
-portfolios
-broom
-resolves
-avenger
-32.4
-visualization
-uneasy
-hadrian
-mos
-keenan
-perfected
-22.9
-buyout
-accelerating
-elliptical
-25.3
-enlargement
-28.2
-meaningless
-challengers
-horned
-disconnected
-bullying
-blackstone
-predicate
-conspirators
-latham
-winthrop
-extortion
-mta
-schleswig-holstein
-henceforth
-whitfield
-x-files
-xl
-intermittently
-ellie
-hagen
-hour-long
-erroneous
-forgery
-westerns
-bleak
-contingency
-morphine
-slugging
-goth
-annihilation
-28.8
-claudio
-yogi
-kristen
-foolish
-mallory
-phonology
-facets
-allentown
-unnoticed
-reindeer
-kremlin
-launceston
-attaching
-20.7
-flop
-29.5
-adolescence
-18.5
-housemates
-sensational
-peking
-fortresses
-potts
-hillman
-gershwin
-clarendon
-evansville
-stu
-deforestation
-fetch
-swedes
-800,000
-mata
-27.9
-condensation
-elusive
-sticking
-sleeps
-entrants
-orator
-devine
-durga
-vice-chairman
-waitress
-imperfect
-harlow
-minions
-1740
-kenyon
-roberta
-biologists
-sadler
-menus
-mixer
-waterman
-tulane
-mauritania
-parrots
-overweight
-14.6
-idiot
-capitalize
-boardwalk
-tf
-magdalene
-extradition
-beit
-28.4
-incline
-complemented
-emu
-fractures
-subscriptions
-io
-snapped
-goths
-csa
-reliant
-ealing
-well-established
-genealogical
-frees
-restructured
-repainted
-cute
-augment
-fisk
-indexed
-astonishing
-stakeholders
-13.7
-professionalism
-elliptic
-bethesda
-nozzle
-dunne
-somme
-claimant
-fireplace
-valentino
-dissenting
-gambia
-logically
-pipelines
-austrians
-6:00
-excessively
-pouch
-artworks
-polite
-tracker
-begging
-banksia
-debra
-implanted
-adhesive
-zanzibar
-ideologies
-saturation
-6-8
-vicky
-rn
-portrayals
-layton
-rallying
-prefixes
-secretion
-exhaustive
-kuomintang
-bureaucratic
-inspect
-ravine
-villainous
-unresolved
-breakers
-v2
-rosenthal
-summed
-ganges
-predicts
-cellulose
-explorations
-bellamy
-spoiled
-argonauts
-filmfare
-goddesses
-detonation
-adventurous
-brokerage
-silt
-californian
-high-performance
-1,800
-thunderbird
-eyre
-mrna
-daryl
-trier
-vigilante
-weakest
-alam
-plo
-cadre
-auschwitz
-profoundly
-21.9
-iaaf
-conductivity
-stripping
-hulls
-annette
-precipitated
-airwaves
-summits
-farmington
-batista
-outs
-asw
-shaman
-kiel
-216
-duma
-referencing
-illawarra
-licences
-nadia
-bess
-subordinates
-garnering
-20.3
-apprehended
-pigments
-bei
-tortoise
-halen
-nsa
-grail
-edged
-prosper
-cameos
-raptors
-extermination
-reprint
-relentless
-rv
-anthrax
-suck
-samoan
-cartel
-praises
-22.3
-22.8
-byrds
-susie
-earnhardt
-epistle
-retribution
-geffen
-thermodynamic
-15.2
-conveniently
-commended
-feldman
-bodyguards
-vicente
-parenting
-warhammer
-brothel
-dynastic
-lesotho
-aleksandr
-desperation
-hadley
-subsidy
-punitive
-mania
-parnell
-daffy
-midshipman
-mcgovern
-eds
-mooney
-surpassing
-potsdam
-battlestar
-22.1
-windy
-massacred
-hexagonal
-jacobson
-ri
-dialogues
-dustin
-alternated
-liar
-tn
-isd
-wander
-barrymore
-apt
-embroidery
-pantomime
-1756
-sans
-leighton
-selectively
-minsk
-heracles
-alligator
-belongings
-blanchard
-grissom
-punching
-denoting
-jai
-frontline
-censor
-gleason
-conscientious
-forbidding
-favoring
-monoxide
-synonyms
-hippie
-centauri
-installments
-favorably
-shortlisted
-jurist
-monologue
-blaming
-meteorite
-open-source
-antitrust
-ramayana
-quotient
-paxton
-sg-1
-cambrian
-troublesome
-grandstand
-airstrip
-voltages
-heller
-22.0
-buildup
-bayer
-stormy
-6-1
-vicki
-plead
-forrester
-cooks
-blackwater
-magpies
-pleading
-spying
-interrupts
-mei
-yellowish
-flamboyant
-helper
-climber
-mortgages
-wollongong
-cords
-novi
-1641
-someday
-transmits
-sandwiches
-redmond
-chavez
-heralded
-emilio
-trackage
-helms
-amnesia
-marcia
-adama
-comanche
-cried
-lewes
-conduit
-shrink
-theodor
-mari
-seam
-preventive
-filmography
-9:00
-fractional
-balboa
-gaulle
-awesome
-roaring
-berman
-inhibits
-balochistan
-anonymously
-nellie
-28.9
-taj
-tyranny
-greedy
-skateboarding
-shrinking
-lev
-wilfrid
-steroid
-rea
-martins
-falkirk
-austria-hungary
-electrostatic
-revolved
-vodafone
-zx
-ascend
-lakshmi
-superstructure
-post-secondary
-stigma
-exceedingly
-technicolor
-dwindled
-suggestive
-lieutenant-general
-barns
-beltway
-harpsichord
-fatah
-tees
-melancholy
-anthropological
-vaults
-coward
-whitley
-norwalk
-waltham
-saviour
-cushing
-mastering
-shootings
-sweeps
-petersen
-bitterly
-fittings
-29.2
-governs
-strickland
-310
-cf
-discounted
-drones
-herbie
-interacts
-torino
-knowledgeable
-chaim
-uxbridge
-siegfried
-rosary
-hedges
-turnbull
-dolby
-allman
-gauntlet
-correspondents
-dashboard
-jamal
-terri
-keene
-patriarchal
-folks
-fukuoka
-vetoed
-lanterns
-provoke
-179
-prost
-leaks
-hydroxide
-bequeathed
-romances
-medallion
-yeomanry
-roh
-tort
-cascades
-2003-2004
-hardwood
-smokey
-unmarked
-grind
-foray
-chittagong
-aerosmith
-riffs
-fairview
-rommel
-ashkenazi
-venerated
-kubrick
-goodness
-mosaics
-forging
-beaks
-luckily
-dolores
-sectarian
-gurney
-devonshire
-wrestle
-25.1
-1620
-demetrius
-sideline
-apis
-expire
-sinus
-aylesbury
-cree
-mocked
-taxed
-e4
-satanic
-hawkeye
-andromeda
-15.6
-avionics
-enigmatic
-polynesia
-rcaf
-grizzly
-ester
-rosh
-quartermaster
-mcmillan
-schuster
-bop
-oates
-overcrowding
-0-0
-chet
-emir
-eastwards
-loughborough
-ashby
-shen
-tattoos
-pelham
-whigs
-leakage
-pebbles
-hydrocarbons
-televisions
-fawcett
-kylie
-foiled
-yom
-planetarium
-28.5
-technologically
-sponge
-gpl
-postulated
-af
-hathaway
-recessive
-psychotic
-riviera
-decidedly
-aguilera
-curia
-mahmoud
-scorpions
-arturo
-erickson
-delaying
-chopped
-domenico
-copyrighted
-joseon
-worthington
-colonialism
-ridership
-sprague
-1754
-nephews
-planner
-spotting
-revolted
-vanuatu
-blount
-immoral
-euclid
-rigging
-bree
-a.c.
-namco
-fanfare
-ia
-azure
-soaring
-101st
-1643
-hiram
-gillian
-almond
-kissed
-brabant
-chairmanship
-siren
-densities
-deacons
-roaming
-esa
-medway
-jardine
-shockwave
-bryce
-sacraments
-roderick
-fairest
-sabine
-bradbury
-ns
-opel
-ghats
-attribution
-29.3
-dinah
-1720
-puja
-prophecies
-lobes
-het
-flutes
-formula_23
-29.4
-sikkim
-xiao
-playwrights
-researches
-percussionist
-20.9
-doves
-wheelbase
-bolster
-mogadishu
-bestselling
-u-boats
-duets
-acquires
-219
-kwan
-221
-arbitrarily
-dues
-post-production
-estrada
-juilliard
-cunning
-goethe
-hogg
-anomalies
-spindle
-scooter
-uprisings
-corinthian
-hamburger
-polynesian
-oldfield
-sherry
-saffron
-arresting
-vargas
-dives
-lesley
-swaziland
-substituting
-ind
-wig
-cheerful
-boulders
-21.8
-cid
-15.8
-maccabi
-introductions
-auctions
-blanco
-disgrace
-deserves
-enlightened
-tallinn
-locust
-camped
-gateshead
-conventionally
-batavia
-clinch
-ripper
-sublime
-gamespot
-carney
-oc
-1644
-entails
-closures
-banded
-futile
-cymru
-parting
-conditioned
-hartlepool
-primus
-priestley
-grimes
-foraging
-spalding
-19.4
-alfredo
-bertha
-calypso
-serena
-smuggled
-1690
-hex
-haunt
-half-sister
-scoreboard
-lick
-excommunicated
-262
-ammonium
-schoolboy
-fei
-bois
-deficits
-pneumatic
-defamation
-460
-turkmenistan
-directv
-scoop
-boast
-therein
-inventors
-tangible
-playful
-shoppers
-disparate
-persuades
-rune
-disposable
-axial
-consolation
-spirited
-indistinguishable
-isomorphic
-nostalgia
-242
-marston
-holliday
-anthropologists
-seeker
-westmoreland
-sprawling
-tintin
-rocco
-downstairs
-wagga
-javelin
-animosity
-imitated
-aldershot
-airflow
-marshals
-voivodeship
-sturdy
-oppressed
-skyscrapers
-antennae
-174
-recalling
-funerals
-spouses
-pentium
-pretends
-19.5
-terraced
-netscape
-mould
-ruse
-hopefully
-harbors
-theorems
-folly
-propagate
-middle-aged
-23.0
-almanac
-feminists
-irc
-mira
-rioting
-gulls
-14.9
-officership
-spade
-goliath
-yangtze
-stochastic
-nevis
-acceptor
-penthouse
-donnelly
-maths
-lieberman
-snowfall
-colder
-totalling
-shutouts
-coppola
-pelican
-28.7
-uterus
-concacaf
-codenamed
-respiration
-bribes
-westernmost
-kitchens
-vat
-nugent
-headaches
-coincides
-52nd
-tame
-competitiveness
-curtiss
-hemorrhage
-greenway
-retake
-parliamentarian
-flick
-opaque
-dun
-mfa
-casket
-stocked
-redundancy
-eisner
-abode
-circumference
-filtration
-bottoms
-cavern
-20.6
-checkpoint
-mentoring
-coinciding
-mitigate
-tackled
-herefordshire
-fundamentals
-jars
-editorials
-spree
-first-round
-flaps
-19.2
-ligands
-steak
-bennet
-nikon
-freeways
-eduard
-obscured
-inflatable
-4-6
-ribbons
-fantasies
-burlesque
-vents
-outback
-override
-iodine
-humanism
-mcnulty
-saladin
-insular
-codename
-iata
-highs
-eclipsed
-polluted
-anesthesia
-.3
-secluded
-regionally
-empowerment
-sine
-genital
-pressurized
-shabbat
-indifferent
-melton
-cesar
-interpersonal
-chippewa
-avian
-bayonet
-clippers
-biscuit
-aleppo
-iconography
-14.8
-forecasting
-icy
-corsica
-coyotes
-duplicated
-8000
-unifying
-colossus
-chop
-digestion
-intolerance
-provence
-voicing
-treacherous
-triton
-14.4
-pea
-situ
-organises
-flanagan
-statistic
-thursdays
-hebron
-smiley
-burglary
-axles
-scarcity
-02
-johnstown
-octavian
-tal
-muller
-muppet
-absorbs
-shivaji
-extracellular
-pointers
-merchandising
-linus
-vichy
-amarillo
-coca
-grays
-windham
-sino-japanese
-c4
-eyesight
-brittle
-nuisance
-carnatic
-intracellular
-pancreatic
-carnage
-quarterbacks
-coatings
-ailing
-francs
-teatro
-throughput
-1603
-maher
-sulphur
-17.6
-murderous
-s.a.
-induces
-amt
-implant
-reaper
-seeming
-wisden
-formula_24
-dependencies
-pro-life
-fascia
-ratu
-ordinarily
-unheard
-bony
-guadalupe
-planar
-holm
-broadened
-brant
-admirer
-stab
-pygmy
-truro
-homemade
-warheads
-unison
-grossman
-brest
-uniformed
-sutra
-audubon
-nomads
-sorcery
-manic
-propagated
-cylon
-trumpets
-seychelles
-annoying
-awful
-informatics
-repay
-declarations
-originality
-unbeknownst
-bluetooth
-kombat
-saber
-jen
-nanotechnology
-hips
-ala
-4,500
-percival
-latency
-lieutenants
-1757
-kv
-doctorates
-altman
-fredericksburg
-archived
-allegory
-begged
-reza
-16.3
-spd
-seward
-addict
-dominique
-22,000
-nadp
-alluvial
-jg
-bunkers
-carmine
-emmett
-cyanide
-kandahar
-losers
-radiator
-davison
-harmed
-svp
-16.1
-ticketing
-synthase
-assembling
-spinoff
-diamondbacks
-dauphin
-inflated
-comb
-wrapping
-pla
-stickers
-tagalog
-tray
-banu
-carlyle
-emile
-viacom
-atc
-swallows
-celia
-leith
-az
-amazon.com
-finley
-pros
-gregor
-lander
-synchronous
-whichever
-leary
-dutt
-wheaton
-aldermen
-dreamcast
-dividends
-confines
-bancroft
-747
-revere
-ao
-haji
-schubert
-gerhard
-envisaged
-addis
-lichfield
-e3
-semi-professional
-gis
-directives
-dragging
-trieste
-seasoned
-sympathies
-briefing
-sewers
-06
-recursive
-voice-over
-scissors
-pronoun
-dekalb
-spaceflight
-excise
-importation
-evaluations
-mattel
-treachery
-cheetah
-orchids
-wah
-hype
-bugle
-aphrodite
-typewriter
-224
-geologists
-resorted
-southland
-southside
-7000
-vitamins
-14.0
-layouts
-uncut
-hebrides
-flanking
-yamato
-assyrians
-milligan
-loeb
-1/4
-enshrined
-propellers
-widows
-jenna
-mavericks
-sliced
-pronounce
-beneficiary
-helpless
-preferential
-charmed
-90,000
-kazakh
-stirred
-scot
-consciously
-15.3
-bitten
-ivor
-guerilla
-slum
-dementia
-superfamily
-overboard
-1600s
-incarceration
-diode
-npc
-a.b.
-pacifist
-kingpin
-recount
-vultures
-infect
-traverses
-formulations
-lana
-quark
-humiliated
-pieter
-gothenburg
-take-off
-ars
-visitation
-malays
-vagina
-chopper
-sonnet
-canteen
-fidel
-plebiscite
-24.9
-departmental
-routers
-homology
-distressed
-fringes
-boyce
-faso
-nasdaq
-stabbing
-crtc
-19.8
-granger
-crawley
-emails
-diminish
-postmodern
-prophetic
-82nd
-creighton
-royalists
-oboe
-29.1
-liszt
-felipe
-deregulation
-sugars
-mons
-ins
-diplomas
-racks
-canucks
-brewed
-salient
-atonement
-box-office
-knuckles
-grievances
-linn
-misconception
-repel
-commuting
-gotta
-team-mate
-ahmedabad
-lite
-bronson
-accuse
-parlor
-30.0
-faisal
-asheville
-21.2
-ppg
-groningen
-jody
-tecumseh
-primer
-banda
-30.8
-twisting
-loneliness
-high-school
-pickett
-atypical
-recurrence
-marianas
-predates
-barring
-29.6
-potion
-astra
-boo
-wyndham
-1746
-milestones
-genesee
-viscosity
-slices
-twists
-scarecrow
-dealership
-subsidized
-tess
-powerless
-suv
-insurgent
-fucking
-argus
-plume
-riggs
-organisers
-constituting
-fades
-anecdotes
-osprey
-rustic
-matured
-allele
-specialises
-vader
-narayana
-gilded
-3/4
-taping
-hanuman
-assertions
-1662
-incursions
-taluk
-woodbridge
-underage
-microsystems
-earths
-skillful
-g4
-offending
-transplantation
-arboreal
-robb
-lms
-oxides
-kilmarnock
-awa
-macedon
-on-board
-rounding
-savior
-17.9
-formosa
-lender
-chevron
-drayton
-bogart
-clinched
-misunderstood
-oneida
-excalibur
-civilized
-carrington
-blurred
-usn
-pennington
-snacks
-qaeda
-oberlin
-hayley
-capone
-diy
-marylebone
-foreigner
-20.2
-kirkland
-bey
-folder
-leonid
-predetermined
-plutarch
-lark
-playmate
-nikita
-oppressive
-ashram
-dojo
-hails
-villanova
-hegel
-afl-cio
-horrific
-bsd
-electra
-travers
-motivational
-winfrey
-pandora
-reuters
-hum
-eta
-j.j.
-molded
-hutt
-430
-utmost
-mollusk
-tianjin
-disuse
-yell
-clinically
-pitchfork
-14.2
-feuded
-kearney
-embarking
-sacrificing
-swell
-enjoyable
-nativity
-brew
-hobson
-mcdermott
-igneous
-smyth
-halle
-80th
-firefly
-pip
-asean
-suffragan
-godwin
-scanned
-18.9
-tiers
-raffles
-leaking
-gillette
-contour
-festive
-air-breathing
-raoul
-tennant
-signifying
-harman
-agra
-reeds
-bowed
-dominions
-mikey
-incense
-inconclusive
-hat-trick
-20.4
-transcripts
-thunderbolt
-complied
-epitaph
-mantra
-rename
-hera
-federations
-mott
-heterogeneous
-additives
-sasebo
-prosecutions
-inquest
-sarcastic
-cereals
-murderers
-inflict
-wreath
-preferable
-allahabad
-blockading
-crichton
-frameworks
-imbalance
-cartesian
-stalemate
-snowboarding
-lindbergh
-zimbabwean
-campo
-millar
-sl
-beginners
-hirsch
-fulbright
-cornwallis
-hinder
-bute
-huston
-mueller
-lockwood
-colvin
-18.4
-crocker
-budding
-notables
-pervasive
-azerbaijani
-admirals
-centaur
-haze
-disobedience
-4a
-esquire
-swann
-heartbeat
-sassanid
-crocodiles
-biscuits
-christensen
-issuance
-onslaught
-armand
-ontology
-galen
-defenceman
-penzance
-formula_25
-1630
-1747
-quests
-triassic
-elsie
-brixton
-shutting
-thirty-five
-bio
-feasts
-mustangs
-salvaged
-microorganisms
-hampden
-plata
-merrick
-outsourcing
-decker
-landau
-rang
-skits
-velocities
-hallway
-sarasota
-typed
-sanderson
-radiohead
-17.4
-bakr
-robber
-tri
-curtailed
-1730
-chefs
-chomsky
-jaipur
-kinship
-bicentennial
-plundered
-wrigley
-stallions
-kitts
-howie
-florentine
-full-fledged
-15.9
-castor
-sergey
-westerly
-reformist
-iqbal
-disable
-decaying
-hamiltonian
-perak
-kabbalah
-cutoff
-interviewer
-grudge
-mulroney
-augustinian
-imposition
-medford
-behind-the-scenes
-thinker
-screenplays
-up-to-date
-deficient
-aptitude
-renting
-scarcely
-nader
-19.7
-albanians
-directories
-initiates
-eukaryotic
-dielectric
-vilnius
-rousseau
-oro
-chalmers
-headland
-ventricular
-gauges
-forbid
-303
-jargon
-tatars
-invoke
-ina
-mau
-somers
-sockets
-oswego
-experimentally
-picasso
-distilled
-hanna-barbera
-dunkirk
-improvisational
-cena
-suffixes
-bandmates
-mcclure
-29.0
-teleport
-adhered
-blueprint
-obelisk
-clustered
-bios
-aes
-bridging
-caribou
-baptised
-unilateral
-sylar
-lawton
-tay
-geraldine
-sumter
-hsbc
-amphitheater
-beecher
-roleplaying
-aitken
-yamamoto
-irritation
-financier
-gresham
-london-based
-bombarded
-blogger
-48th
-dorado
-cooley
-uppsala
-mildly
-gastric
-coda
-formulate
-souvenir
-squared
-padma
-tectonic
-mobster
-deva
-hester
-mojave
-gills
-stringer
-co-owner
-dynamical
-paratroopers
-roar
-crested
-194
-methodologies
-lazio
-16.2
-delaney
-homeowners
-percentages
-largo
-chinook
-minimizing
-revolts
-pineapple
-under-19
-ingrid
-stubborn
-rooftop
-extinguished
-plunged
-sycamore
-handwriting
-lenders
-overtaken
-gables
-replicated
-mythos
-abortions
-german-speaking
-ying
-alluded
-brodie
-prevail
-polymerase
-haarlem
-scorsese
-naia
-konstantin
-demolish
-1685
-dyson
-lasalle
-brody
-calais
-dragoon
-decisively
-filler
-16.4
-balances
-distal
-teal
-tbs
-congested
-charlottesville
-biopsy
-rockabilly
-comprehension
-erp
-raptor
-systematics
-pleasing
-venomous
-violins
-charisma
-jp
-slipping
-ramirez
-unconditional
-ide
-conformity
-semesters
-mather
-emd
-inactivation
-obligatory
-hornsby
-game-winning
-heinlein
-30.4
-compendium
-jfk
-ionian
-mardi
-appropriation
-loren
-yew
-dominica
-non-native
-francophone
-rhapsody
-cruised
-druid
-terns
-wand
-kidnaps
-hinton
-contradicted
-improperly
-colonels
-dmitry
-cochin
-emulator
-1649
-unorganized
-theodosius
-conformation
-jutland
-calvary
-hickman
-all-new
-buffett
-sidelined
-mountaineering
-lyceum
-backstory
-rensselaer
-melinda
-riemann
-armaments
-1609
-moseley
-bbc2
-strabo
-billiards
-vr
-frasier
-29.7
-100.0
-schuylkill
-227
-worldly
-rump
-acp
-thoracic
-decimated
-sault
-indra
-gecko
-cough
-mediaeval
-forfeited
-friar
-chaucer
-mecha
-broadcasted
-amar
-purified
-mortars
-sima
-unharmed
-basses
-crests
-baskets
-kipling
-d&d
-silvio
-uri
-hallucinations
-intuition
-reacting
-thaksin
-238
-rinpoche
-dismal
-sneaks
-pir
-natalia
-sco
-alexandre
-delegations
-retinal
-rumour
-tartan
-add-on
-finlay
-cavities
-roadways
-leptodactylidae
-indexes
-dedicate
-18.6
-cleavage
-hounds
-volts
-kart
-ozzy
-kathmandu
-hs
-19.6
-dreamed
-caldera
-olives
-sopranos
-mor
-gan
-korn
-dispatches
-fells
-bodywork
-dosage
-17.2
-kitten
-233
-fairgrounds
-wiener
-syn
-usda
-ratchet
-spicy
-burying
-capo
-javier
-opting
-wbc
-devlin
-dwell
-extracting
-swung
-hertford
-lengthened
-390
-antisemitism
-quotas
-tenets
-gs
-lausanne
-s.h.i.e.l.d.
-instructs
-paranoia
-teutonic
-self-determination
-zoos
-josie
-fanbase
-lag
-dmitri
-fallacy
-steppe
-tori
-gazetted
-graceful
-collagen
-dons
-burkina
-nightlife
-great-grandfather
-shortwave
-mariah
-powerpc
-loudoun
-cooperated
-fusiliers
-curiously
-dreamer
-counsellor
-re-entered
-dalhousie
-18.3
-kiln
-395
-tossed
-formulae
-tsr
-biosynthesis
-vlad
-surgeries
-fandom
-somaliland
-repatriated
-232
-1645
-animators
-affine
-complication
-magdalen
-malfunction
-sled
-700,000
-nagano
-hauling
-enron
-gags
-shogun
-smiles
-16.8
-bain
-1/3
-digby
-pathogenic
-ruskin
-hyperion
-intravenous
-propel
-calibration
-fashions
-jamboree
-herod
-occidental
-wadsworth
-skipped
-callers
-vaguely
-easternmost
-polarized
-anti-slavery
-panned
-co-ed
-gower
-horse-drawn
-assures
-feuding
-spec
-carbohydrates
-512
-allergy
-andean
-rnas
-17.1
-04
-sculptors
-weinstein
-hatchback
-timmy
-burrow
-cavalier
-mid-season
-idiom
-19.0
-three-way
-pierced
-uncontrolled
-tak
-passover
-litres
-mn
-reinforcing
-northumbria
-1701
-cross-section
-dubbing
-shandong
-garrisons
-oranges
-two-hour
-minimalist
-validated
-newfound
-5-0
-distrust
-29.9
-condemn
-weller
-portmanteau
-ghulam
-png
-kosher
-rotunda
-concertos
-polio
-straps
-tabor
-peroxide
-millwall
-scrolling
-kelso
-thorax
-conjugate
-contends
-geothermal
-supergroup
-wrongdoing
-dwayne
-gunboats
-mccann
-decoding
-brandt
-franc
-bile
-mcculloch
-deane
-gottfried
-overtly
-logistic
-cong
-1607
-joss
-vittorio
-specificity
-laptops
-annum
-wto
-resumes
-landis
-wednesdays
-pavilions
-brownsville
-unborn
-ou
-workstations
-wilhelmina
-smythe
-pretended
-monograph
-insulted
-impromptu
-aggregation
-paler
-cms
-slew
-liberate
-a4
-asl
-housewives
-uniqueness
-advert
-formula_26
-aeroplane
-plaid
-applause
-canyons
-activating
-lycoming
-meek
-jovi
-chisholm
-spp
-skit
-perpetrators
-barbarossa
-peril
-yorke
-shorthand
-sejm
-apologizes
-pandemic
-parliamentarians
-klux
-famer
-bantu
-enthusiastically
-lymphoma
-azores
-hussars
-primacy
-cutters
-archduke
-roast
-repeater
-timberlake
-kiwi
-mughals
-cramer
-rarer
-lyme
-criticizes
-16.0
-french-speaking
-lecturing
-pei
-left-hand
-3a
-hopeless
-pumpkins
-1680
-constitutionally
-suppressing
-interrogated
-duplication
-cray
-superstition
-verbally
-gillingham
-siamese
-concave
-gays
-albatross
-convergent
-dragonfly
-brushes
-colonized
-emergent
-albright
-inhabiting
-313
-boyer
-atherton
-yun
-liza
-protruding
-infiltrated
-DGDGDGDGDGDGDGDG
-marque
-enhances
-obverse
-undivided
-powys
-strategist
-pathological
-adversaries
-ornament
-unicef
-turnaround
-DGDGDGDGDGDGDG
-etiquette
-dandy
-rein
-mckee
-shortening
-manley
-chameleon
-1753
-guest-starred
-humphreys
-isla
-pleasures
-washburn
-cultured
-agony
-italics
-concentric
-greenpeace
-yao
-chieftains
-diverged
-30.3
-forwarded
-8:00
-rarities
-altercation
-stratton
-woolf
-landscaped
-bastion
-17.3
-mains
-gdr
-buds
-budd
-realist
-devolved
-postcards
-gallon
-rei
-dewitt
-gators
-memberships
-humiliating
-reflector
-refrigeration
-sher
-woodlawn
-dhabi
-summa
-gambino
-moritz
-footing
-unsolved
-fabaceae
-tackling
-housewife
-gilchrist
-counterpoint
-15.1
-unloading
-rupees
-repelled
-tricky
-figurines
-uninterrupted
-autosomal
-xxx
-greet
-vp
-liners
-15.0
-konrad
-lighted
-earthworks
-relativistic
-cassettes
-horowitz
-pre-school
-ine
-pawns
-slippery
-familiarity
-dab
-grantham
-stabs
-rouse
-moravian
-rhetorical
-lorentz
-disdain
-tb
-gestapo
-ayatollah
-phoenician
-pseudonyms
-rectory
-faraday
-wares
-fulfil
-madeira
-plywood
-loki
-ovens
-hostess
-pomona
-manus
-pence
-mallet
-optimism
-gaseous
-translucent
-roscoe
-alejandro
-wirral
-purposely
-bane
-extraordinarily
-planters
-gymnast
-afar
-withheld
-vivo
-binder
-epping
-thunderbolts
-twenty-seven
-aug
-brescia
-cider
-glance
-yerevan
-romanians
-chum
-adolph
-partnering
-thunderstorms
-rolf
-salman
-rentals
-forearm
-garnet
-novell
-rhinoceros
-farnham
-welcomes
-dilapidated
-classically
-braddock
-cornelia
-integrates
-interlocking
-sos
-julien
-guan
-cyrillic
-sampler
-hops
-sewell
-refrigerator
-bowers
-feature-length
-millet
-moorish
-habib
-gamer
-o'neil
-collide
-unloaded
-16.9
-anya
-mover
-oleg
-chic
-shrek
-thrilling
-vigil
-superficially
-DG.DGDGDGDG
-commuted
-mun
-nationalisation
-tuba
-raion
-discriminatory
-himachal
-1610
-vanishing
-spaghetti
-metz
-gambler
-gaunt
-shaikh
-functionally
-blackjack
-fertilization
-iraqis
-interpreters
-swifts
-berne
-macpherson
-adirondack
-calibre
-outfitted
-312
-exposes
-333
-second-in-command
-subversive
-blankets
-detractors
-landon
-streetcars
-maneuvering
-carthaginian
-maude
-wadi
-bandmate
-oahu
-dx
-ami
-intrigued
-encircled
-astrophysics
-swine
-19.3
-29.8
-lyn
-irb
-aides
-sigmund
-streisand
-branson
-252
-messianic
-geometrical
-ephraim
-deteriorate
-sanity
-abramoff
-j.p.
-2002-03
-atheism
-chr
-mora
-joked
-clique
-permissible
-kuhn
-phenomenal
-roosters
-airbase
-brightest
-morph
-one-shot
-ridiculed
-steinberg
-slabs
-ingestion
-scorpio
-josephus
-tramways
-lofty
-chambered
-bale
-sita
-finder
-bun
-hammers
-extremist
-alonso
-carnivores
-hanks
-trailed
-eskimos
-caterpillars
-llp
-standby
-warlords
-domed
-greatness
-weinberg
-woodruff
-longest-running
-triumphant
-keyes
-15.7
-zeno
-bumpers
-matheson
-stead
-hummingbird
-worshippers
-jurors
-loma
-ricci
-gallo
-tacitus
-iranians
-r.e.m.
-375
-amish
-locale
-catered
-haig
-stapleton
-life-threatening
-pejorative
-shotguns
-embryos
-chronologically
-full-service
-randomized
-screws
-sidewalks
-recite
-laois
-strongholds
-co-workers
-30.5
-323
-superimposed
-silas
-fanzine
-compassionate
-barksdale
-pastry
-riverfront
-adventurers
-codec
-recognises
-discus
-craze
-tub
-predictive
-thelma
-kayaking
-redefined
-cajun
-hangars
-265
-delete
-numerically
-295
-bowden
-d4
-persists
-firewall
-nextel
-jakob
-leibniz
-smallville
-environs
-intermediary
-revolve
-thayer
-bosco
-sister-in-law
-charger
-williamstown
-andretti
-alerts
-zee
-7:00
-afi
-parisian
-formula_27
-rowling
-mitigation
-ttc
-alludes
-renee
-nuggets
-stag
-malice
-synth
-rhinos
-wolverines
-devonport
-reorganised
-degenerate
-amadeus
-attachments
-subdue
-shrapnel
-carrot
-dodger
-arnhem
-punic
-jpg
-eleutherodactylus
-f-16
-bering
-co-writer
-ain
-edmunds
-rations
-oats
-testosterone
-bloch
-simms
-usenet
-numeral
-torpedoed
-mentors
-uploaded
-skiers
-downloading
-caffeine
-surat
-aircrew
-cannonball
-ironworks
-pap
-roam
-staffing
-xxiii
-visas
-capsules
-bonner
-villiers
-circulate
-merle
-sharia
-simcoe
-bartender
-forbids
-tac
-pimp
-faithfully
-horner
-martel
-barre
-kingfisher
-jats
-ruiz
-valence
-intimacy
-prize-winning
-scribe
-stricter
-veneto
-322
-marquee
-actuality
-overtones
-infuriated
-goofy
-dissidents
-brig
-chand
-hideout
-fantasia
-ftp
-mahler
-lviv
-tutors
-gnostic
-utopian
-suny
-podcasts
-stardust
-sadie
-nara
-sumerian
-pear
-toolkit
-frightening
-siva
-ponies
-speciality
-bowles
-nagoya
-whalers
-pasta
-leaps
-1748
-starving
-bharat
-tunis
-roasted
-frenchman
-haywood
-nauru
-olmsted
-distinguishable
-meir
-bandleader
-connacht
-puberty
-worthless
-stairway
-infrequent
-hydrocarbon
-sunflower
-dubois
-modernisation
-1742
-rm
-reflexes
-fend
-recitals
-misfortune
-herzog
-pashtun
-engravings
-acm
-mansell
-build-up
-taekwondo
-outflow
-alleles
-complains
-350,000
-blasted
-aga
-disapproved
-assaulting
-lilith
-swanson
-webbed
-misty
-refineries
-morecambe
-parc
-donating
-chained
-phalanx
-vox
-adverts
-drive-in
-6-0
-autoimmune
-relaunch
-unfavorable
-orpheus
-biosphere
-ledge
-khanate
-gilles
-hollis
-ebony
-hitch
-frisian
-fixation
-d'or
-tagore
-workload
-newt
-focussed
-flycatcher
-eagerly
-attrition
-tatar
-meta
-discrepancy
-attainment
-clicking
-greco
-recommending
-nepali
-moog
-reuniting
-maury
-tasman
-fatima
-workstation
-2a
-mediator
-gall
-steamers
-southerly
-xviii
-brute
-six-month
-ducal
-fund-raising
-arcades
-symbolizing
-wide-ranging
-reparations
-o'keefe
-sinhalese
-woodside
-wba
-stalingrad
-cesare
-syntactic
-mancha
-deterministic
-fixes
-hoffmann
-monmouthshire
-exempted
-adhesion
-diem
-refractive
-irvin
-unify
-tlc
-ledger
-optionally
-westland
-barefoot
-microbiology
-uniformity
-ramona
-breech
-high-definition
-navarro
-ops
-disintegration
-nepalese
-schenectady
-longford
-cassius
-auctioned
-federico
-affections
-indifference
-2002-2003
-myles
-groin
-nkvd
-rcmp
-fries
-lager
-hoyt
-salinity
-incest
-inner-city
-profiled
-connors
-germantown
-getaway
-rectangle
-edie
-subdistricts
-heineken
-punisher
-emigrate
-lowers
-privateer
-ejection
-infusion
-medusa
-kool
-calcio
-hypnotic
-gonzaga
-lancers
-tragically
-tamils
-ipa
-loudly
-dearborn
-diocletian
-envy
-dundalk
-exploratory
-bigelow
-rasmussen
-karim
-chiba
-conclusive
-outposts
-tempered
-sirens
-familial
-cohn
-germ
-widen
-lacy
-fujian
-thanked
-supervisory
-1741
-hrh
-analysed
-beneficiaries
-mediate
-hemp
-untitled
-clumsy
-idealism
-getty
-gentile
-helmut
-freetown
-47th
-martini
-pedestal
-director-general
-acrylic
-eyed
-booty
-huskies
-trainees
-mifflin
-recognisable
-watertown
-abkhazia
-enclosures
-dillard
-contacting
-lupus
-stew
-polka
-esteemed
-warblers
-tragedies
-schuyler
-asbury
-kindly
-powdered
-bnp
-bendigo
-normative
-figurative
-observational
-psychoanalysis
-dunfermline
-fragrance
-mcmaster
-acquaintances
-km2
-clemente
-edmonds
-schofield
-mime
-uplift
-hammerstein
-manifests
-curing
-vinnie
-delano
-boarders
-astrological
-beagle
-louder
-sindhi
-cranial
-edwardian
-locales
-preferably
-harrogate
-57th
-displeasure
-beggar
-waivers
-drier
-almighty
-infrequently
-30.6
-allegorical
-ancients
-osman
-incremental
-kearny
-insured
-moby
-bougainville
-francesca
-cummins
-ashamed
-greenock
-reckoning
-brownlow
-slit
-twenty-eight
-abrasive
-bullpen
-ozark
-jaffa
-vendetta
-patrician
-excerpt
-fibrous
-politburo
-hoop
-cctv
-msn
-sg
-belleville
-onscreen
-v1
-insulated
-carbine
-llywelyn
-toto
-reasoned
-archbishops
-nazism
-critiques
-discredited
-upload
-silverman
-amor
-ranching
-paramedics
-redistricting
-punishable
-algonquin
-embraces
-passaic
-pectoral
-blackhawks
-vaginal
-neill
-margo
-stitches
-counteract
-witt
-281
-jinnah
-asia-pacific
-plough
-invent
-materialism
-monoplane
-aidan
-stanza
-plummer
-princesses
-30.2
-weary
-resuming
-harrier
-puri
-rada
-pre-columbian
-receptive
-episodic
-consular
-pulmonate
-beckham
-impurities
-brentford
-moffat
-mort
-monticello
-dime
-apocryphal
-selector
-ionization
-minster
-shang
-abide
-frye
-stroud
-exquisite
-caudal
-eaters
-metrics
-zheng
-contradict
-leveled
-limburg
-orally
-ounces
-basing
-entrepreneurial
-interplay
-rowley
-nicholls
-cloister
-luminous
-n.y.
-resettlement
-guilford
-yong
-bmt
-claiborne
-tie-in
-ashcroft
-plug-in
-287
-volatility
-dinners
-co-stars
-sportscaster
-freehold
-winston-salem
-adhd
-materialized
-sena
-beauchamp
-pranks
-unnatural
-militarily
-lament
-drown
-hovering
-filaments
-geophysical
-centrifugal
-17.8
-adversely
-jure
-caracas
-coolant
-seabirds
-gaze
-fleshy
-vedas
-microbial
-gippsland
-shaggy
-chevalier
-starbucks
-aberdeenshire
-finns
-townsite
-niches
-diversification
-30.1
-endorsements
-harbours
-apron
-procured
-stat
-aiken
-harshly
-inducing
-dh
-interlude
-rancher
-surpass
-nell
-shearer
-frazer
-ein
-denison
-impulses
-dreamworks
-alphabetically
-540
-commutative
-becket
-bainbridge
-mocking
-shouts
-pretender
-abingdon
-loader
-70th
-discard
-pinto
-reverence
-315
-ernesto
-knoll
-dowry
-sexton
-bidder
-rhodesian
-on-going
-243
-faked
-eusebius
-ara
-apa
-tomlinson
-sacrificial
-zimmer
-lon
-articulate
-aquitaine
-weave
-masterpieces
-skinny
-hwy
-rocking
-annoyance
-nbl
-clovis
-silesian
-mantua
-conductive
-mitochondria
-multi-million
-shielding
-macclesfield
-streaks
-torchwood
-rockaway
-gatwick
-consolidating
-indexing
-farce
-constructors
-apocalyptic
-19.1
-invitations
-defection
-lancelot
-epics
-bungalow
-newington
-yonge
-piloting
-bandar
-cheated
-sprawl
-stool
-start-up
-interurban
-cellist
-joystick
-westwards
-kettering
-pellets
-devotee
-tad
-nikolay
-paulo
-um
-acs
-ax
-marlboro
-spector
-excelsior
-fanning
-colossal
-unc
-campania
-kamikaze
-18.7
-nucleotide
-quirky
-vols
-amalgam
-pyle
-vitality
-hinckley
-05
-3g
-mahogany
-nazarene
-aristocrats
-miami-dade
-fortran
-overlay
-sham
-gadgets
-openness
-archdeacon
-butterfield
-nvidia
-hemlock
-mein
-gayle
-publius
-reiterated
-sacking
-arrondissement
-forensics
-resonant
-ignite
-mogul
-virginity
-calligraphy
-nagpur
-nizam
-descendents
-ito
-oecd
-1751
-isomorphism
-quarter-final
-spiny
-mynetworktv
-iloilo
-porcupine
-ruben
-diurnal
-wondering
-urbanization
-hurry
-nea
-plc.
-yuki
-46th
-twickenham
-ex-husband
-richness
-chemists
-regrets
-khomeini
-unintentionally
-bedouin
-elsa
-bf
-vincenzo
-alsace
-vibe
-follies
-extremity
-co-owned
-duality
-bennington
-19.9
-chagrin
-j.c.
-jharkhand
-angled
-frey
-grundy
-johnnie
-ailments
-community-based
-sul
-molluscs
-sore
-aortic
-coates
-gourmet
-durant
-argos
-manipur
-topographic
-uphill
-mahayana
-deterrent
-gujarati
-remorse
-daimler
-ashanti
-31.3
-stately
-263
-neat
-1550
-eng
-hostels
-templeton
-arab-israeli
-airway
-discounts
-jive
-bertram
-duc
-estimating
-subchannel
-terrence
-bracelet
-grabbing
-preparedness
-parishioners
-1749
-kendrick
-multiverse
-hc
-pathfinder
-gestation
-aerodynamics
-sanjay
-serenity
-intoxicated
-doolittle
-projectors
-betray
-adamson
-innsbruck
-brunel
-ridicule
-quadrangle
-hopeful
-mystique
-leung
-porous
-slump
-barkley
-rapping
-aikido
-appraisal
-scramble
-taps
-curran
-paleolithic
-dutton
-forceful
-subclass
-selkirk
-ebenezer
-pomerania
-nemo
-moles
-sorceress
-gilman
-freemasonry
-1638
-scourge
-251
-under-secretary
-coercion
-metallurgy
-stalking
-collage
-9000
-leyton
-pia
-cairn
-salinas
-regretted
-unjust
-espoused
-scotsman
-beginner
-sps
-charlestown
-hopewell
-boyhood
-molds
-facelift
-hydrolysis
-alright
-vin
-weakly
-motorways
-battista
-susanna
-opal
-b-29
-south-central
-mainwaring
-repressed
-evanston
-resurrect
-grenadier
-ginsberg
-halton
-discredit
-liter
-grotesque
-equinox
-meier
-1752
-crouch
-magee
-fray
-pedigree
-rockin
-carcinoma
-dea
-dissolving
-shedding
-reappear
-calvinist
-dalmatia
-modesty
-ageing
-striving
-salim
-matlock
-mercia
-emphasised
-carolingian
-lighthouses
-borrower
-reagent
-bandai
-instincts
-borland
-evils
-legume
-larval
-assassinations
-scripps
-oblivion
-nameless
-gilliam
-sphinx
-cubes
-all-big
-emblems
-exe
-rizal
-chateau
-sunglasses
-creep
-kristin
-originator
-callaghan
-1683
-metrolink
-caitlin
-stoker
-barnabas
-scsi
-councilman
-freedman
-molding
-insectivorous
-salty
-neapolitan
-limp
-fugitives
-soaked
-superpowers
-confrontations
-docklands
-civilisation
-cosmological
-hailing
-xinjiang
-wis
-addams
-neath
-chute
-b-52
-stingray
-surprises
-footprints
-cale
-aisles
-lutherans
-winnie
-stalker
-405
-latterly
-instruct
-clutches
-tainted
-shellfish
-enrolment
-cantonment
-tait
-calculators
-cosmo
-bucky
-round-robin
-auguste
-nicaraguan
-arista
-tvb
-for-profit
-stv
-pre-recorded
-stephan
-arya
-irrigated
-brando
-gait
-estrogen
-cameraman
-20.1
-18.1
-syd
-majoring
-attaches
-amr
-drawback
-pans
-venous
-mischief
-brahms
-captors
-coat-of-arms
-turntable
-organiser
-nelly
-proclaim
-vehemently
-muster
-revivals
-smithfield
-hmcs
-10-year
-signatories
-cooperating
-nueva
-anticipating
-madam
-delicious
-rainier
-exponentially
-1708
-chromium
-18.0
-montevideo
-lansdowne
-regenerate
-shroud
-decoy
-oscillation
-1713
-ky
-roper
-cohesion
-pleads
-itv1
-shelling
-evoke
-samba
-pacers
-demeanor
-wingspan
-radiant
-epistemology
-shiloh
-lux
-khartoum
-janis
-246
-antiochus
-intramural
-foreground
-eradicate
-waving
-galvatron
-self-proclaimed
-curl
-mens
-254
-mathis
-doubted
-1450
-goldwater
-injections
-feces
-confucian
-a&r
-tongue-in-cheek
-bloggers
-humility
-non-violent
-embroidered
-1634
-censors
-lawler
-billionaire
-supremes
-sigismund
-templates
-warne
-occupant
-1670
-21.0
-rotc
-parrish
-correctness
-perpetrated
-jug
-weldon
-niven
-northfield
-dwindling
-sheath
-amazed
-clamp
-pl
-fermi
-euthanasia
-iona
-unrealistic
-diagnose
-relayed
-affixed
-alden
-borg
-hospice
-1725
-alleges
-maroons
-priestess
-whedon
-mentality
-maddox
-djibouti
-land-based
-mag
-noticing
-wellness
-singularity
-starling
-cu
-livermore
-encode
-ogre
-timeless
-5:00
-quitting
-loretta
-catchy
-spin-offs
-yelling
-tinker
-rockland
-polydor
-signaled
-tipton
-gypsies
-palisades
-excitation
-co-creator
-nested
-unplugged
-tm
-lobbyist
-klang
-jumbo
-a&e
-resupply
-seminoles
-insomnia
-albemarle
-lombardi
-stowe
-flavored
-og
-plunder
-uptake
-woodford
-revise
-miscarriage
-rideau
-discharges
-dictate
-gordy
-fables
-1605
-comdr.
-famicom
-imax
-cloned
-violates
-dread
-dulwich
-shelled
-jang
-no-one
-dictates
-ascot
-piccolo
-post-punk
-smelting
-imitating
-cistercian
-auditing
-deletion
-visconti
-c.e.
-parallax
-inventive
-hon
-maimonides
-brides
-farmlands
-spar
-woodpeckers
-goalie
-interprets
-abilene
-stoner
-iliad
-slap
-groundwork
-oldsmobile
-sulla
-subsystem
-pies
-punctuated
-lagrange
-symbolically
-feyenoord
-romanticism
-pillai
-syllabus
-samaj
-fluoride
-concorde
-fibrosis
-niki
-prem
-womb
-drexel
-ours
-robberies
-halfback
-henrik
-varma
-uncles
-1727
-lakeshore
-55th
-nantucket
-grounding
-cremation
-otters
-ambulances
-first-year
-sold-out
-mink
-bhai
-moravia
-mullen
-primordial
-filament
-curtains
-juggling
-ombudsman
-schoolteacher
-campsites
-ff
-fireman
-innes
-meps
-pershing
-wineries
-leavenworth
-lawns
-orchestration
-comrade
-271
-tully
-isabelle
-ascetic
-igbo
-pressings
-instantaneous
-eindhoven
-bulky
-ateneo
-antigens
-liber
-30s
-modernity
-inductive
-31.4
-mcrae
-sucked
-portico
-miocene
-hath
-waveform
-phish
-sergeants
-iggy
-untreated
-visuals
-oda
-woke
-mountaineers
-swampy
-authorizing
-radiated
-1066
-bonfire
-lei
-subtly
-exposures
-gatehouse
-gunman
-sixes
-simplistic
-intruder
-deviations
-tapestry
-1710
-je
-neuronal
-assay
-amphibian
-odis
-remodeling
-stanhope
-severus
-lesion
-despised
-debtor
-grosvenor
-minstrel
-subscribe
-determinant
-ua
-res
-5-1
-307
-1570
-247
-devise
-boosters
-goodyear
-inconsistencies
-frontage
-secreted
-low-lying
-earp
-restraints
-abdication
-1666
-anaerobic
-one-on-one
-11:00
-staunton
-cultivar
-quan
-polygamy
-crewmen
-wye
-begum
-asynchronous
-towering
-disrupting
-whitish
-twinned
-sagar
-lute
-fondness
-intertwined
-1540
-palate
-emits
-mulligan
-shakespearean
-bongo
-oss
-tahoe
-bbc1
-samar
-parkes
-overdrive
-244
-formula_28
-conrail
-alkali
-attributable
-omission
-celts
-bra
-vogel
-dada
-montfort
-bala
-meyers
-jc
-vocation
-304
-myra
-four-man
-macgregor
-leuven
-mendelssohn
-barbie
-guillaume
-genovese
-centurion
-glee
-chaudhry
-3-6
-carlin
-shasta
-kia
-malnutrition
-tempe
-psychosis
-321
-curate
-vanishes
-forfeit
-organisational
-tonnage
-existential
-windshield
-debit
-overheard
-yue
-nakamura
-non-denominational
-contaminants
-950
-hitman
-loot
-crag
-rabin
-banff
-tolstoy
-bi
-cockburn
-h.r.
-1604
-susceptibility
-sororities
-scanners
-greenish
-zimmerman
-relieving
-counter-terrorism
-grips
-krypton
-fumbles
-popeye
-formula_29
-overpass
-mccormack
-thrush
-oppenheimer
-dix
-empathy
-mariano
-pancras
-scoreless
-evangelism
-1714
-gruesome
-jockeys
-gritty
-eastbourne
-outlining
-ofsted
-cohesive
-swearing
-ati
-annan
-prakash
-mexicans
-sentry
-scrubs
-afghans
-lump
-ak
-cost-effective
-roddy
-re-establish
-lingua
-mackey
-hawley
-1250
-farthest
-naa
-accomplice
-drawbacks
-costumed
-liberator
-withers
-non-standard
-kerosene
-bayern
-clapham
-yuen
-thwart
-belvedere
-concerted
-charing
-artie
-persuading
-pembrokeshire
-riverdale
-smyrna
-parentage
-saltwater
-exemptions
-aug.
-erika
-incubation
-babu
-dd
-chronicler
-epithelial
-suitcase
-embodiment
-foxx
-ovid
-lovell
-lethbridge
-smugglers
-odysseus
-plagiarism
-phra
-capitalized
-texan
-marr
-two-day
-slack
-lerner
-panchayats
-jigsaw
-bharatiya
-agitated
-1665
-quill
-breaches
-31.0
-dipole
-pooh
-permian
-okay
-tilted
-surmounted
-nouveau
-17.7
-schulz
-flammable
-encampment
-wondered
-coping
-pebble
-5:30
-ingested
-schwarz
-self-sufficient
-premieres
-ntsc
-boasting
-1718
-va.
-hinged
-tg
-opengl
-sonnets
-margrave
-sedgwick
-indycar
-moderated
-characteristically
-tougher
-outwards
-milano
-fruitful
-ramones
-compositional
-connelly
-merrimack
-biologically
-thesaban
-corcoran
-phenotype
-cortical
-narrates
-tamar
-masons
-crackdown
-necessities
-detonate
-dressage
-volt
-bessie
-sora
-humankind
-slums
-curtin
-strathclyde
-freeware
-saracens
-loomis
-tighter
-holman
-narnia
-pemberton
-16.6
-incapacitated
-fetish
-sonora
-f3
-catalogs
-herbaceous
-reciting
-hendricks
-laughed
-mehta
-overload
-marlene
-chores
-paganism
-ansi
-relapse
-unwittingly
-ps2
-286
-kai-shek
-fergus
-firefighting
-thunderbirds
-sakura
-budgetary
-gretchen
-wielding
-goodnight
-relaxing
-laity
-excommunication
-summoning
-anytime
-maxi
-madeline
-expatriates
-c3
-pup
-defiant
-non-commissioned
-reaffirmed
-mascots
-gabe
-waikato
-icao
-mtr
-sheik
-abner
-oaths
-sonatas
-subunits
-renovate
-relinquish
-mayan
-cheeks
-brandeis
-wrecking
-arp
-mulberry
-procure
-qantas
-abstinence
-deathbed
-bonneville
-marathas
-decompression
-rubbish
-waldorf
-binomial
-tumour
-haile
-fluorescence
-fil
-lepidoptera
-self-esteem
-laterally
-therapists
-pty
-multiplier
-steered
-mcfarlane
-cages
-trolls
-stiffness
-personalized
-pelvis
-side-by-side
-quits
-irreducible
-tempted
-dimitri
-masque
-harmonics
-cleft
-year-long
-lexus
-promontory
-90th
-trotskyist
-fermented
-incompetence
-caulfield
-tod
-mouthpiece
-mayfair
-tahiti
-cadbury
-mcgowan
-1744
-concise
-falk
-tarantino
-polarity
-yoruba
-1-4
-subunit
-deduced
-horseman
-waiter
-swahili
-1,700
-o'leary
-m6
-masse
-deerfield
-leamington
-bubba
-rejoining
-bernice
-broward
-hurts
-basra
-telecasts
-verve
-stylus
-mbit/s
-2001-02
-nur
-bother
-quell
-puppies
-blazing
-gaiman
-zia
-aldo
-subspace
-sported
-sired
-flares
-gadget
-mawr
-illiterate
-connotation
-yeats
-nino
-housekeeper
-nong
-chaplains
-tramp
-erase
-kingship
-romulus
-limbaugh
-sadistic
-atwood
-harriers
-epirus
-husbandry
-qualifies
-nasir
-guts
-grotto
-425
-enrolls
-maxine
-apprentices
-musharraf
-porky
-janitor
-platonic
-mccabe
-ackerman
-historiography
-strung
-constructor
-hotly
-talisman
-spiritually
-salomon
-kimmel
-parr
-latitudes
-avant
-ppm
-pinocchio
-soloists
-meteorologist
-approving
-abound
-2-4
-dumps
-scorers
-penchant
-eradication
-cantons
-beresford
-236
-liquidity
-hollyoaks
-all-weather
-consumes
-emulated
-carbide
-hypnosis
-nominating
-correcting
-antoinette
-enforcer
-traditionalist
-euphrates
-wsop
-thru
-anglers
-crankshaft
-ts
-disgruntled
-airliners
-vida
-tempore
-hesitant
-unprotected
-procter
-concussion
-dat
-paperwork
-abortive
-medalists
-saliva
-lucien
-martinique
-kelsey
-methodists
-socialite
-rawls
-mahoney
-barnum
-edifice
-1632
-connective
-clearer
-plethora
-boycotted
-ancillary
-toughest
-sparkling
-garrick
-apoptosis
-stefani
-devonian
-thrower
-rohan
-thirty-two
-bmi
-pd
-rout
-hyman
-sandhurst
-mysterio
-pharmacist
-lest
-quarried
-conceive
-payable
-polish-lithuanian
-firestone
-countermeasures
-hurdle
-degeneration
-ragtime
-unreasonable
-detects
-alveolar
-gibbon
-bohemians
-one-sided
-dismantling
-paintball
-grids
-tasting
-nationalistic
-chunks
-talmudic
-charms
-plugged
-brevet
-sufferers
-dude
-marist
-frantic
-gator
-03
-cba
-donner
-comte
-detour
-shakti
-ores
-acharya
-non-human
-afield
-combating
-certifications
-jed
-serra
-repentance
-alarms
-convocation
-scottsdale
-ot
-behaves
-nic
-rodrigo
-bypasses
-fad
-crawling
-ire
-boa
-idw
-robo
-chapin
-nasl
-mannheim
-chakra
-seine
-ladders
-cocker
-engel
-unknowingly
-sophistication
-eugenics
-elmira
-guo
-aries
-tumultuous
-iec
-gallows
-contesting
-medically
-thirties
-placid
-lampoon
-1739
-barb
-confessor
-tennyson
-rayner
-guadalajara
-31.5
-swollen
-bobcats
-waverly
-mcneil
-copyrights
-joshi
-tuesdays
-decentralized
-motherboard
-six-day
-widower
-filesystem
-toulon
-skis
-staffs
-first-order
-millard
-haymarket
-canto
-bodybuilding
-awaken
-shutout
-stirring
-ci
-gradients
-pristine
-buchan
-dawes
-sensations
-maastricht
-long-lived
-hobbes
-alchemist
-philipp
-conn
-heaton
-double-a
-pcr
-marys
-affectionate
-mules
-oakwood
-stravinsky
-soo
-mackintosh
-darkest
-lien
-dehydration
-hysteria
-unsatisfactory
-reestablished
-spilled
-1722
-grooming
-anode
-loco
-world-wide
-coulter
-stearns
-robbing
-omni
-5a
-slapstick
-eskimo
-wraith
-yosef
-wraps
-roxbury
-saucer
-32.1
-1719
-jd
-isps
-lourdes
-haggard
-angelica
-milner
-entailed
-advancements
-glamorous
-zoroastrian
-oxidized
-bhopal
-zephyr
-balmain
-permutation
-heartbroken
-abridged
-valet
-peptides
-irma
-turbocharged
-drydock
-processions
-westside
-pvc
-bitterness
-punctuation
-classifying
-flax
-irreversible
-juries
-genders
-catalysts
-expires
-shaving
-emmerdale
-gauteng
-buoyancy
-kenton
-transfusion
-summertime
-phnom
-inhibited
-eco
-dionysius
-1647
-inhibiting
-doubleday
-jeffries
-uranus
-30.7
-restraining
-gorges
-worthwhile
-participatory
-leander
-31.6
-staring
-wasting
-dunk
-harlequin
-beeching
-hillsboro
-cohort
-ranches
-langdon
-fueling
-prerequisite
-jena
-stride
-robyn
-1702
-sewn
-anjou
-franca
-multi-instrumentalist
-hyatt
-chaney
-pavia
-eglinton
-holyoke
-j.r.
-motherwell
-duquesne
-vasily
-mukherjee
-jayne
-tripura
-right-arm
-pac-10
-easterly
-gmt
-centimetres
-resettled
-raul
-untimely
-enclosing
-disintegrated
-warcraft
-mum
-pores
-cl
-dolan
-trucking
-gf
-camino
-gottlieb
-hatching
-minimized
-ghanaian
-sages
-1602
-ripon
-bangla
-instructing
-20s
-outtakes
-hemoglobin
-unix-like
-guildhall
-fortnight
-jumpers
-superbike
-17.0
-innumerable
-oxidoreductase
-chopra
-mackinac
-sq.
-subtitle
-schumann
-renounce
-blenheim
-antidote
-31.7
-congressmen
-markov
-b-17
-campaigner
-vanish
-crosse
-storks
-contemplated
-heist
-jainism
-als
-anchoring
-mcnally
-tryon
-229
-furry
-herschel
-bdsm
-enrolling
-methanol
-monochrome
-poisons
-30.9
-genoese
-brookfield
-giro
-selassie
-dismisses
-skunk
-foote
-rawalpindi
-shaded
-blackadder
-partitioned
-barrio
-boswell
-emptied
-oxidoreductases
-chiltern
-bazar
-amorphous
-abbreviations
-mondo
-decorating
-playgrounds
-metamorphosis
-1580
-quintus
-storeys
-insofar
-ababa
-coursework
-lila
-crunch
-309
-rubbing
-learner
-frazioni
-curated
-hand-held
-rainforests
-sayings
-mecklenburg
-acropolis
-solidified
-shaved
-indispensable
-277
-patna
-plugins
-telepathy
-indentured
-fragmentary
-wielded
-gh
-hug
-amulet
-whittier
-brine
-browsing
-quieter
-pong
-mixtape
-tunisian
-reels
-doha
-premiums
-unlicensed
-1672
-12-month
-healer
-volta
-rerouted
-peta
-youngster
-ui
-disruptive
-adair
-granny
-muriel
-catharines
-marxist-leninist
-peregrine
-hispanics
-unlucky
-schroeder
-glazed
-jetty
-ennis
-257
-cpa
-dramatist
-mods
-rehabilitated
-ovarian
-benning
-natchez
-cauldron
-shun
-3500
-abercrombie
-chilton
-blasting
-1664
-hijackers
-marcello
-teamsters
-shanks
-lonnie
-dialing
-loom
-britten
-torrance
-mala
-phylogeny
-legate
-1623
-dreamwave
-eustace
-cohomology
-railcars
-mimics
-metamorphic
-officiated
-barbary
-rewarding
-rhyming
-uzbek
-andorra
-euston
-varna
-eerie
-moderation
-sato
-ventricle
-communicates
-unusable
-lettered
-partido
-progeny
-nit
-chariots
-glynn
-elasticity
-perched
-starlight
-1699
-alternates
-yeltsin
-249
-20th-century
-delle
-wicklow
-1625
-smoother
-coimbatore
-westerners
-fief
-farr
-403
-patriarchs
-refuted
-fastball
-lenape
-trondheim
-submachine
-anglo-american
-highgate
-lucian
-lobo
-nitric
-improbable
-burner
-bellingham
-inflicting
-couture
-obligated
-jellyfish
-gotti
-arte
-kok
-ojibwe
-bona
-undercarriage
-leaping
-octaves
-31.1
-equine
-progenitor
-cavite
-kinks
-283
-nauvoo
-kline
-gr
-tonkin
-1611
-assemblage
-gaylord
-recitation
-blackmore
-enix
-nilsson
-blondie
-altars
-environmentalist
-sprites
-culminates
-grenville
-misdemeanor
-1692
-gated
-aldrich
-fen
-divination
-clooney
-oshawa
-mileage
-annotated
-shorten
-interplanetary
-1628
-1651
-democracies
-scunthorpe
-metaphors
-vidal
-ellesmere
-klamath
-54th
-sulawesi
-happenings
-chadwick
-1621
-fondly
-borrows
-north-central
-407
-myer
-robby
-peirce
-goshen
-marshy
-czechoslovak
-chorale
-cora
-amazons
-clarion
-wintering
-furs
-definitively
-cytoplasm
-childish
-punta
-torsion
-sukarno
-alamos
-tutoring
-fahrenheit
-brilliance
-lg
-iit
-hebrews
-berlusconi
-aspiration
-thurman
-mathew
-cease-fire
-four-wheel
-talladega
-oversized
-stoney
-landry
-stryker
-arias
-scuttled
-naturalistic
-2.50
-polymerization
-interwar
-lingering
-moro
-saeed
-casually
-bolivian
-cheering
-kashmiri
-3-5
-quicksilver
-fertilizers
-tatum
-inhalation
-taxable
-upheaval
-zealanders
-weeping
-electives
-boron
-moreton
-paternity
-gregarious
-backstreet
-voltaire
-frankfort
-waiver
-attica
-zu
-cookbook
-sparking
-acme
-shackleton
-filippo
-ferocious
-aesthetically
-40s
-levee
-codice_3
-ipo
-solubility
-ducts
-ls
-conflicted
-sb
-estuaries
-sulfuric
-silently
-solomons
-benzodiazepines
-mellow
-co-producer
-bayesian
-shelly
-forcefully
-intriguing
-nath
-constitutionality
-dm
-self-taught
-cartoonists
-firefighter
-autistic
-sensibility
-redman
-vallejo
-inspirations
-amen
-mailed
-ingenious
-dulles
-moraine
-connaught
-quartets
-insecure
-precincts
-clockwork
-ugandan
-johansson
-1536
-astounding
-oromo
-huns
-palsy
-liddell
-rebounded
-joni
-godavari
-towel
-fenway
-leela
-negation
-chunk
-hertz
-1717
-infused
-distantly
-accommodating
-beggars
-corinthians
-narrowed
-orgasm
-raspberry
-65th
-asha
-1667
-bea
-threaded
-mares
-entangled
-nestor
-ashoka
-freaks
-isu
-sept
-itinerant
-errol
-ur
-cristina
-sephardic
-italic
-dil
-hurting
-niall
-pfc
-bebop
-refitted
-hatton
-calumet
-kala
-dipped
-7,500
-jekyll
-jackal
-echelon
-heretics
-dislikes
-placements
-equivalently
-leech
-crowning
-janus
-vassals
-francois
-falco
-g8
-alum
-53rd
-keyword
-greco-roman
-bunting
-aurangzeb
-bantam
-nineteenth-century
-hutchison
-bateman
-mosul
-talon
-paralleled
-cham
-reissues
-surya
-zedong
-snl
-freebsd
-1743
-degrade
-co-chair
-transept
-husky
-awe
-galician
-kruger
-2.97
-baden-powell
-davao
-homework
-flooring
-panelist
-slid
-risc
-rajiv
-schlesinger
-hanley
-non-traditional
-handlers
-recombination
-jia
-barbour
-mujahideen
-ponce
-tuscaloosa
-behavioural
-volunteering
-westbury
-megadeth
-shaolin
-kobayashi
-shoemaker
-mcnair
-columbian
-1616
-micah
-semiconductors
-optimize
-alastair
-lonesome
-fw
-gens
-aruba
-mezzanine
-fatality
-clough
-ushered
-irina
-licensee
-1652
-all-female
-capitalization
-buckner
-itu
-wasteland
-1622
-accumulating
-wittgenstein
-winona
-rationality
-fleischer
-retrieving
-flamenco
-closeness
-ligaments
-infield
-standardised
-kip
-co-editor
-bearded
-ecoregion
-startling
-institutionalized
-trademarked
-puma
-hinterland
-vincennes
-authoring
-dismissing
-groton
-1612
-1675
-self-propelled
-week-long
-rajendra
-braxton
-desai
-philology
-mayflower
-gino
-golan
-ornamentation
-overturn
-pdc
-jennie
-admirers
-lauder
-porte
-cambria
-duplex
-cannibal
-feuds
-asteraceae
-roulette
-confessional
-catalogues
-blyth
-cures
-sentinels
-rep
-sprite
-greys
-surrendering
-ten-year
-ignacio
-made-for-tv
-colouring
-tyrrell
-diagonally
-brainchild
-monza
-cuttings
-correlate
-folsom
-jervis
-devolution
-bonham
-distinctively
-intimately
-bathrooms
-annulled
-bead
-lodi
-grandma
-topps
-hijacking
-allocate
-idiosyncratic
-cthulhu
-ezekiel
-revolvers
-high-energy
-krai
-jonson
-anecdotal
-virtuous
-totalitarian
-huff
-passwords
-kepler
-adoration
-gemma
-simulcasting
-commoners
-molina
-overthrew
-wields
-bachchan
-reliever
-mosquitoes
-musket
-probate
-three-quarters
-disguises
-bundy
-laurier
-negros
-hewlett
-discrepancies
-lovin
-1637
-nostalgic
-manu
-bashir
-sis
-bottling
-incendiary
-bonanza
-modems
-markus
-workman
-catechism
-freedmen
-theses
-undersea
-skates
-slows
-goblins
-curzon
-saba
-chiropractic
-upper-class
-naughty
-smells
-plugin
-wanders
-refine
-weathering
-totem
-over-the-air
-banbury
-forty-five
-overhauled
-mille
-up-and-coming
-lemma
-enlisting
-venetians
-tompkins
-chappell
-terrifying
-northland
-notify
-rembrandt
-bureaus
-discontinue
-sabbatical
-worsening
-annuity
-giulio
-conceding
-louvre
-goo
-electrolyte
-karan
-renditions
-self-help
-deceptive
-disabling
-baer
-salamanca
-signalled
-studebaker
-handley
-59th
-burghs
-gwalior
-kano
-photosynthesis
-18th-century
-1654
-lapd
-boers
-cannibalism
-caption
-sq
-essayist
-furlongs
-nadir
-frederik
-voiceover
-welland
-premierships
-vick
-ddt
-taoist
-proprietors
-raga
-broaden
-lecturers
-pyrenees
-caf
-two-man
-melodrama
-elevate
-prefects
-vermilion
-lancer
-kippur
-serpentine
-eurobasket
-warranty
-minas
-donahue
-crease
-fable
-regains
-york-based
-solace
-ser
-1619
-vantage
-macaulay
-fingerprints
-steeply
-referral
-1659
-aptly
-sheehan
-tvs
-shalom
-1633
-1635
-charlottetown
-predominately
-osiris
-thorns
-catania
-arrowhead
-mchenry
-alters
-etching
-oblivious
-robeson
-uno
-elegance
-dwyer
-1704
-sludge
-amritsar
-cytochrome
-chimes
-frustrating
-2.91
-freeport
-foothold
-raya
-abusing
-draught
-coalitions
-470
-worshiped
-patten
-o'hare
-twa
-sanctioning
-vie
-jethro
-spoiler
-johanna
-vegan
-danbury
-westlake
-intergovernmental
-well-developed
-religiously
-macfarlane
-dysfunctional
-darjeeling
-williamsport
-1737
-asahi
-denham
-millie
-parable
-magellan
-watchdog
-islet
-cfb
-obese
-infielder
-compromising
-enlistment
-discourses
-cortez
-misfits
-ralston
-rake
-litchfield
-cushion
-glaciation
-climactic
-walkways
-278
-librarians
-counselling
-kei
-starfleet
-sought-after
-herrera
-massimo
-strapped
-anglesey
-diabetic
-pdb
-proteus
-1735
-principalities
-activision
-fastened
-wandered
-zaragoza
-bela
-driscoll
-yuma
-moods
-tui
-baruch
-precedes
-greta
-yam
-seer
-junkers
-24/7
-fliers
-fedex
-lanier
-perjury
-seminaries
-s.s.
-signatory
-coherence
-mismanagement
-volley
-forte
-downgraded
-saraswati
-payton
-goss
-wooster
-hazrat
-proclaims
-graz
-rubinstein
-whitworth
-longstreet
-dia
-abi
-unfairly
-314
-2/3
-aquifer
-guineas
-salazar
-choruses
-multi-party
-oxidative
-heartbreak
-forked
-raccoon
-sedition
-ryo
-udp
-wield
-minded
-leds
-289
-judson
-vail
-traversing
-springboard
-campion
-32.5
-air-to-air
-brabham
-humanistic
-jacinto
-in-flight
-skipping
-watering
-hydrophobic
-camelot
-sibley
-icu
-hatched
-deakin
-wicca
-1598
-voiceless
-muhammed
-erode
-souvenirs
-klingon
-assyria
-handmade
-hesitation
-magically
-maulana
-notts
-1733
-esplanade
-mika
-top-ten
-repercussions
-authorize
-cursor
-viennese
-lufthansa
-shrunk
-starbuck
-langford
-luc
-stumbled
-hussey
-brahman
-707
-cigars
-denouncing
-.45
-marbles
-undisturbed
-gules
-tijuana
-272
-bracken
-avignon
-negatives
-fearsome
-redeemed
-favours
-monkees
-cor
-punter
-stumbles
-aviators
-freiburg
-multiplying
-ding
-haldane
-generalizations
-contradictions
-apse
-sven
-accommodates
-adventists
-garvey
-reichstag
-rapture
-unreal
-t1
-genghis
-restitution
-oysters
-hindustani
-dionysus
-terrific
-glutamate
-galley
-sybil
-tanning
-hons
-dyed
-neurology
-2.93
-yves
-macross
-chiral
-dp
-nunn
-buggy
-poised
-1560
-1646
-ahmadinejad
-itc
-slr
-tarrant
-modernize
-1606
-heretical
-disproportionate
-56th
-overhears
-shinto
-2.95
-donny
-ploy
-wordsworth
-unconfirmed
-doo
-speculations
-oedipus
-obscenity
-woodbury
-epistles
-miraculously
-630
-x-factor
-benzene
-lucie
-earthen
-racketeering
-normalized
-phantoms
-nye
-heywood
-802.11
-obsidian
-tds
-ananda
-requisite
-culprit
-skateboard
-rfid
-cg
-ladd
-xiang
-withhold
-quadruple
-prompts
-deus
-hendrick
-displeased
-defied
-gogh
-villeneuve
-wyman
-pied
-maris
-leopards
-squeezed
-hiroshi
-staining
-2.98
-midlothian
-shenzhen
-provoking
-transliteration
-renumbering
-wager
-webcomic
-iyer
-escalating
-plump
-amps
-teleportation
-precarious
-luminaries
-cracker
-ani
-francine
-rouen
-symbolized
-chopin
-raju
-a5
-mangalore
-ironclad
-swastika
-stasis
-agnostic
-eccles
-albrecht
-regimen
-acquittal
-massif
-legalized
-single-engine
-s.c.
-halsey
-presbytery
-unionism
-30-minute
-underdog
-brownish
-scion
-mahdi
-hajj
-asterisk
-rsa
-spleen
-mendel
-flamingo
-montclair
-retelling
-reliefs
-astaire
-consecutively
-novo
-hasty
-liberian
-m5
-carrollton
-belinda
-brothels
-tunbridge
-psa
-gorgeous
-wylie
-1716
-envelopes
-taco
-ob
-anonymity
-restricts
-burnet
-stonehenge
-pinter
-dutchman
-inks
-bradman
-furthest
-westbrook
-wicket-keeper
-girlfriends
-designating
-fuses
-2.94
-holographic
-ashe
-fs
-elo
-solitaire
-galena
-guevara
-2007/08
-characterizes
-yeovil
-formula_30
-1997-98
-conducive
-kwajalein
-stalks
-fermanagh
-np
-angelina
-hitters
-1679
-mountbatten
-notebooks
-grappling
-hindenburg
-pencils
-southgate
-elmo
-inquirer
-universidad
-lawmakers
-deadliest
-blossoms
-snaps
-ceres
-preside
-benefiting
-258
-decoder
-oran
-pick-up
-crackers
-ville
-tamara
-nomad
-gita
-lyrically
-impartial
-sep
-linearly
-ev
-lucerne
-monterrey
-broome
-ankara
-constellations
-peripherals
-swallowing
-1636
-raman
-enmity
-nimrod
-bourke
-sig
-southerners
-non-productive
-corrective
-o'rourke
-muchmusic
-infidelity
-handguns
-meng
-second-hand
-airy
-tatiana
-grist
-staffers
-cowell
-msu
-britton
-griswold
-recognising
-hamlin
-ypres
-revamp
-anthems
-3.04
-brainwashed
-epithelium
-trapper
-ethnographic
-mandalay
-derailed
-clam
-hobbyists
-blight
-popularised
-grandeur
-thrilled
-acadia
-rockville
-inbound
-canonized
-ghostly
-thrice
-teamwork
-steeplechase
-vice-admiral
-workmen
-perrin
-u-20
-urea
-howitzer
-mtdna
-23,000
-gardeners
-self-released
-aachen
-arrogance
-comintern
-deans
-pickens
-ethnicities
-non-zero
-fra
-re-entry
-scatter
-genomes
-slovene
-solves
-slugs
-chimneys
-polygons
-grouse
-nil
-1709
-chimpanzees
-1703
-beattie
-encodes
-strangled
-mutilated
-app
-frith
-tuscan
-racine
-spanish-language
-1613
-ics
-paulie
-beret
-hittite
-anglicans
-delicacy
-individuality
-conveyor
-cuckoos
-redeem
-ligue
-1492
-chalice
-31.2
-squires
-georgina
-reis
-lindy
-pesticide
-finishers
-conjugation
-airspeed
-seriousness
-epidemics
-musica
-transited
-chainsaw
-environmentalists
-blackbird
-associative
-depeche
-storming
-opry
-expended
-arthurian
-shelbourne
-equated
-conifer
-hardtop
-qasim
-radford
-havre
-vane
-protectors
-foss
-2001-2002
-jingles
-biodiesel
-two-week
-baal
-osage
-wheatley
-rationing
-lakeview
-saud
-transsexual
-jean-pierre
-effigy
-cahill
-fairey
-alder
-conspiring
-telstra
-bogs
-1608
-hitchhiker
-emitting
-oxidase
-1.25
-reconsider
-napoli
-steph
-reservists
-150th
-carolinas
-composites
-postscript
-tenders
-cuisines
-embedding
-ganesh
-arkham
-caltech
-dartford
-claimants
-seeger
-sun-times
-e-mails
-bruges
-porta
-world-renowned
-3.03
-veracruz
-frs
-750,000
-1682
-hana
-345
-apc
-a3
-consuls
-endgame
-1711
-bandstand
-sip
-chivalry
-bram
-prairies
-1-yard
-pinkerton
-vikram
-missy
-bower
-dumbarton
-kedah
-gmc
-martyred
-31.8
-typhoid
-gaol
-uproar
-1697
-untrue
-utilising
-corsair
-thong
-loudspeaker
-cleaners
-brainiac
-wta
-multipurpose
-moncton
-bellows
-helios
-heidegger
-318
-7:30
-1639
-clothed
-253
-pico
-dl
-reminding
-peaches
-multan
-editorship
-everlasting
-appease
-hyper
-careless
-kanye
-wpa
-psychiatrists
-sutter
-1590
-tc
-girard
-flycatchers
-530
-takahashi
-buffet
-harden
-polyester
-overcrowded
-ex-girlfriend
-bk
-summaries
-adaption
-braid
-pylons
-reckoned
-simulators
-informational
-shinji
-grizzlies
-gunpoint
-cadence
-duggan
-mcdonough
-oem
-2.96
-worldview
-exponents
-penney
-caravaggio
-1653
-unhealthy
-gal
-bustling
-aurelius
-general-purpose
-signified
-groceries
-abdel
-selectors
-surveyors
-respectful
-rem
-boomer
-martino
-sincerity
-burnaby
-uthman
-accountancy
-5-4
-basingstoke
-odo
-popper
-wu-tang
-transplanted
-stiles
-roadrunner
-downside
-duval
-schiller
-heaters
-laughlin
-spraying
-rec
-monsignor
-year-end
-quarrying
-hendrik
-hardwick
-nantes
-judea
-corning
-pesos
-offaly
-bernadette
-polity
-scrimmage
-1712
-cheerleader
-dakar
-utilise
-gallic
-1658
-hrs
-35mm
-reverses
-1624
-plexus
-multi-national
-55,000
-cowley
-justine
-kambojas
-microprocessors
-batches
-asp
-agamemnon
-lessen
-.DGDGDGDG
-usgs
-wyandanch
-debuts
-extremists
-1629
-abort
-aftermarket
-pye
-retailing
-garages
-stagnation
-halting
-fontaine
-jaffna
-overton
-forgiven
-chabad
-jeannie
-snoopy
-gimme
-jagged
-carling
-skelton
-pell
-turquoise
-calories
-re-issue
-soma
-cns
-plasmodium
-jukebox
-trax
-morristown
-b1
-ivanov
-tr
-2.88
-al-din
-holstein
-stubbs
-rearranged
-forgetting
-rebranding
-ungulates
-llanelli
-thinly
-vaulted
-ethylene
-288
-pandavas
-muban
-camilla
-biker
-creationism
-fates
-headteacher
-sedans
-walther
-plainfield
-seri
-ritz
-valour
-chipset
-multiples
-eased
-syphilis
-worthing
-239
-tuner
-d'oyly
-lackawanna
-dung
-tomography
-stains
-prr
-1530
-resistor
-vagrant
-panasonic
-guillermo
-incumbents
-refresher
-battersea
-harriman
-jawaharlal
-architecturally
-buckland
-snes
-single-handedly
-karin
-ufos
-whipped
-1655
-everglades
-boosting
-stripper
-capacitors
-caen
-conspired
-phosphorylation
-monopolies
-cumbersome
-beacons
-vile
-fredericton
-aversion
-shostakovich
-menstrual
-securely
-sts
-k-8
-hallam
-urn
-dredging
-energon
-homologous
-necropolis
-maturation
-banco
-mervyn
-monologues
-o.c.
-mussels
-unintended
-grandpa
-bosworth
-zeros
-homers
-abbots
-dowling
-formula_33
-soc
-blackberry
-withholding
-hindustan
-co-writing
-pave
-theseus
-horsham
-regrouped
-6pm
-ramakrishna
-kc
-apulia
-crucible
-idealized
-clermont
-lorenz
-cheaply
-foodstuffs
-thornhill
-khalifa
-curses
-arduous
-rescinded
-tribesmen
-7-6
-c.j.
-denser
-nicks
-bleachers
-nautilus
-smear
-campsite
-affirm
-goalscorer
-pac-man
-upriver
-multilateral
-galleria
-nightfall
-eastside
-extrajudicial
-0pts
-pancreas
-pyotr
-remission
-kirkpatrick
-mowbray
-corals
-geek
-r2
-yells
-magdalena
-thirty-six
-disseminated
-counter-attack
-albedo
-279
-midshipmen
-rapes
-sane
-eros
-tyrosine
-teaser
-officiating
-jermaine
-whitlam
-veterinarian
-off-campus
-lakota
-dem
-cronin
-indecent
-confidentiality
-enlarge
-go-go
-debatable
-respecting
-stair
-pahlavi
-jj
-alonzo
-braille
-translink
-engulfed
-spitzer
-collars
-favre
-torus
-cleo
-receivership
-dravidian
-lonsdale
-supergirl
-sash
-regular-season
-65,000
-tiara
-contraception
-capua
-2.90
-slaying
-sikhism
-bu
-revitalize
-penh
-anselm
-rahul
-mil
-connell
-circumvent
-twigs
-blazer
-childs
-timur
-marginalized
-lombards
-gc
-chastity
-auditors
-rubens
-headphones
-antonia
-2006/07
-tunneling
-shui
-beryl
-gibb
-profanity
-harmonious
-waning
-readable
-kissinger
-jester
-shakur
-approximated
-propensity
-**
-laserdisc
-tutorial
-viscous
-reared
-299
-artisan
-drab
-crore
-1734
-sinners
-osi
-rhondda
-replenishment
-valdez
-excesses
-rangoon
-2.99
-peer-to-peer
-metering
-nee
-adele
-homotopy
-sapiens
-24,000
-andaman
-noaa
-platted
-wealthier
-unbalanced
-abolishing
-aberystwyth
-micronesia
-northside
-run-off
-langer
-oncoming
-grandsons
-farrar
-herding
-luce
-wrongful
-phonological
-archetype
-longfellow
-1618
-millimeters
-2016
-redistributed
-wodehouse
-britt
-a-level
-ticker
-aztecs
-1724
-cultivating
-babcock
-lewisham
-raged
-fireball
-sulu
-bothered
-injecting
-baekje
-makeover
-raritan
-isaacs
-overtook
-hijacked
-div
-breda
-mi5
-tangled
-stilts
-arjun
-bremerton
-outcrops
-speer
-droplets
-1614
-bischoff
-empower
-tremendously
-etched
-stimulates
-boil
-littleton
-palais
-elbe
-listens
-chromatography
-winkler
-dsp
-interracial
-benn
-oskar
-jong
-forwarding
-4-5
-minimise
-yakima
-detectable
-camber
-corrigan
-valentin
-garnett
-buckeye
-righteousness
-harkness
-plating
-carboniferous
-methodological
-city-states
-perpetrator
-ee
-nos.
-hybridization
-1626
-cowardly
-khalil
-upperparts
-abdicated
-prerogative
-ultraman
-madman
-harassing
-o'toole
-by-elections
-marvelous
-centralised
-dusky
-malden
-antiquarian
-farragut
-appellation
-postdoctoral
-co-hosts
-debian
-coleridge
-transnational
-hypocrisy
-jean-baptiste
-intelligible
-alphonse
-rebroadcast
-seams
-10:30
-seven-year
-barbra
-little-known
-weightlifting
-usability
-crt
-phonemes
-1693
-apparition
-sill
-580
-1350
-samaritan
-uttered
-lynchburg
-vizier
-himmler
-depressions
-settles
-carrion
-conceptions
-2,200
-solid-state
-latinos
-paddock
-obsessive
-sprayed
-furness
-manors
-rigs
-har
-realigned
-prohibitions
-graft
-poughkeepsie
-gowns
-2000-01
-cristo
-dsl
-shogunate
-ashok
-monographs
-radha
-dunlap
-propane
-aerobic
-borderline
-kindred
-curfew
-anew
-1723
-strikingly
-covenants
-720
-petrie
-curving
-mcfadden
-discoverer
-taoism
-emilia
-likud
-polyphonic
-flagstaff
-indeterminate
-reproductions
-larouche
-palmyra
-endeavours
-lullaby
-mugabe
-newsroom
-trajan
-marcy
-guerre
-isi
-recast
-diodes
-nang
-kato
-1588
-neutralize
-paleontology
-mandible
-flakes
-irt
-christoph
-2.92
-runes
-thug
-intellectually
-edema
-shave
-aide-de-camp
-woodson
-ncc
-levers
-alia
-portability
-chaser
-loyalties
-quixote
-lieutenant-governor
-roslin
-dreyfus
-1996-97
-lawless
-digger
-intonation
-condoms
-1736
-stabilizer
-checkpoints
-reprising
-self-described
-last-minute
-bloodstream
-multidisciplinary
-gypsum
-inductees
-1663
-cybernetic
-elisa
-susannah
-aggregates
-adc
-suva
-narrowing
-soaps
-traitors
-mekong
-chanted
-imperium
-amine
-swapping
-gateways
-32.2
-dept.
-aquaculture
-two-seat
-gangsta
-othello
-cross-border
-innovator
-synaptic
-aliases
-mathias
-osmond
-satin
-permeability
-paget
-bequest
-crossfire
-renfrew
-pereira
-epiphany
-pledges
-adrienne
-a6
-shao
-hairstyle
-shaftesbury
-rwandan
-freezes
-slay
-auspicious
-knapp
-cholas
-collegiately
-barclays
-autograph
-regalia
-ccd
-philo
-univision
-reworking
-elgar
-scriptural
-mucus
-thoughtful
-jungles
-congolese
-fascists
-suicides
-designates
-potters
-keegan
--2
-englewood
-iaf
-dina
-retaliated
-hapoel
-commences
-essentials
-shunned
-tilting
-rafting
-palatinate
-distanced
-whelan
-casimir
-kingfishers
-caligula
-flavours
-haredi
-opportunistic
-omen
-dinamo
-contemporaneous
-face-to-face
-bakshi
-formula_32
-referendums
-swam
-marauders
-merthyr
-spinners
-reinhardt
-light-hearted
-rennie
-wallabies
-busby
-proximal
-pas-de-calais
-lucca
-anno
-resumption
-sock
-cma
-flipping
-fists
-smokers
-stucco
-benevento
-logarithm
-bullies
-wickham
-reverting
-arthropods
-hackers
-kochi
-georgie
-segal
-fiasco
-deadpool
-footballing
-explanatory
-allergies
-alban
-chautauqua
-clowns
-680
-schleswig
-hmong
-ad-din
-gopal
-izzy
-32.3
-rishi
-mishnah
-enid
-utilitarian
-sculptural
-denominational
-checker
-zamboanga
-jammed
-nicotine
-faux
-quantico
-gerber
-centering
-sdp
-curricular
-31.9
-scarf
-adamant
-kbit/s
-pcb
-dso
-stipulation
-prudence
-converters
-mane
-rebuffed
-interpol
-.4
-daemon
-100m
-falun
-self-governing
-firewood
-murdock
-combinatorial
-ponte
-clustering
-solicitors
-xtreme
-suborder
-urbanized
-grammars
-nmr
-sinhala
-fugue
-recon
-intimidate
-tandy
-remit
-putney
-silverstone
-prime-time
-deducted
-1656
-3.02
-retrospect
-stylistically
-felicity
-viewership
-vere
-wrists
-norbert
-dobbs
-acidity
-doric
-tuxedo
-c-130
-3.06
-buckle
-executes
-deflection
-firsts
-simons
-1572
-world-famous
-annihilated
-shading
-caithness
-paramedic
-1691
-toon
-disturb
-antagonistic
-musgrave
-allege
-pituitary
-gulch
-thracian
-bolstered
-magi
-batu
-resists
-259
-towne
-fo
-louth
-medicaid
-keywords
-kan
-5-2
-rattlesnake
-juliana
-ultima
-woolworths
-slime
-plano
-revolutionized
-free-standing
-roundhouse
-undermining
-ceases
-inductee
-fowl
-wolfram
-livy
-f2
-nurture
-watchers
-journeyed
-co-production
-evangelicals
-persecutions
-goldie
-connery
-pocahontas
-cdr
-maloney
-sta
-lille
-plunkett
-padre
-goryeo
-whirlwind
-wafer
-owain
-theobald
-2000-2001
-punishing
-seamless
-waite
-magnetism
-fleece
-pagans
-wi
-magenta
-shipwrecks
-kata
-ricans
-axons
-addicts
-turboprop
-needham
-oa
-d5
-turku
-first-hand
-obispo
-winton
-rgb
-cretan
-riyadh
-videogame
-zhejiang
-intimidating
-69th
-inconsistency
-unwillingness
-carrick
-hepatic
-magister
-dissection
-gurus
-1705
-lateran
-amounting
-1729
-assisi
-potency
-feynman
-tern
-safest
-beowulf
-echoing
-quail
-metcalfe
-cio
-supercomputer
-amhara
-anus
-holby
-oberon
-tabs
-dirac
-featherweight
-hawkes
-brackish
-janine
-cunard
-irritated
-dma
-handwritten
-warmly
-capacitance
-rarest
-vipers
-pym
-buford
-1571
-immanuel
-far-reaching
-interfaith
-gophers
-symbiotic
-.0
-encroachment
-estadio
-ravana
-jean-claude
-reactionary
-patras
-3.01
-terriers
-1686
-half-time
-sultans
-aclu
-aida
-ripping
-atta
-bilbao
-decor
-purchaser
-unveiling
-mated
-leblanc
-positional
-ghosh
-76ers
-ord
-crucified
-avatars
-pre-kindergarten
-inter-war
-nestled
-no-hitter
-ranjit
-homing
-impart
-amstrad
-mullins
-1592
-cantilever
-acknowledgment
-possum
-agrippa
-reproducing
-hilarious
-veiled
-moira
-earmarked
-spills
-sapporo
-irresponsible
-dike
-rosewood
-pyramidal
-bullion
-voldemort
-hotspot
-intruders
-273
-anomalous
-abbasid
-rattle
-oakville
-marlow
-ramesses
-suitability
-hippo
-terribly
-asquith
-kayak
-standoff
-wilmot
-boggs
-saxony-anhalt
-ephesus
-instrumentals
-293
-hoods
-bis
-germain
-ayers
-associating
-brasil
-abandons
-ibf
-carta
-urgency
-krusty
-momentarily
-uncensored
-bnsf
-pacifica
-champs
-mischievous
-carotid
-hooded
-ruptured
-israelite
-vacations
-destitute
-chickasaw
-pixies
-overgrown
-minesweeping
-retarded
-formula_31
-waltrip
-full-sized
-andrey
-lewiston
-sterilization
-nostrils
-aloha
-vt
-pollination
-hustler
-mica
-montoya
-coburn
-precipitate
-scalp
-precede
-cerro
-publicist
-self-published
-10pm
-yakuza
-hordes
-1998-99
-spartak
-phipps
-pedagogy
-1684
-sikorsky
-evoked
-registrations
-pringle
-flaherty
-enumerated
-mesoamerican
-parabolic
-shovel
-ender
-computations
-khaki
-andres
-pertinent
-capri
-novelization
-necrosis
-coronado
-355
-spitting
-soundly
-triumphs
-incandescent
-yahya
-lipids
-tightened
-non-metropolitan
-skeptics
-hoops
-lambs
-toads
-slavonic
-oi
-stave
-1-6
-sheriffs
-stadion
-ute
-hdtv
-bipartisan
-10-12
-schaefer
-wrecks
-affords
-ascendancy
-head-to-head
-scooby-doo
-foundational
-matriculated
-contemplation
-ganglion
-007
-sondheim
-uruguayan
-wha
-piero
-wallet
-dipping
-tierney
-parthian
-ormond
-cupola
-fenced
-rotting
-gopher
-chronicling
-precedents
-brunt
-hispania
-escalation
-simulating
-horus
-matador
-sidelines
-gurkha
-capillary
-deb
-cramped
-primrose
-1627
-libre
-neutrino
-desegregation
-knott
-deutsch
-deserving
-salaam
-solicited
-goulburn
-deptford
-cuomo
-georgi
-catapult
-kurtz
-henchman
-suction
-dinghy
-stillwater
-marple
-rewriting
-yankovic
-galleys
-millers
-italian-american
-snipers
-immortals
-optioned
-nationalized
-blob
-gunther
-1681
-magdeburg
-nos
-white-tailed
-transitive
-pritchard
-guaranteeing
-veda
-ratify
-cylons
-better-known
-nurturing
-willows
-3.05
-lott
-intrepid
-cuff
-rotors
-marginally
-sutcliffe
-briefs
-aleutian
-ulcers
-ramadan
-khalsa
-eulogy
-kde
-cross-cultural
-colonisation
-firestorm
-courted
-polyethylene
-healey
-twenty-nine
-digs
-biscayne
-mass-produced
-stork
-suzy
-undertakes
-fenwick
-padding
-celeste
-catskill
-32.6
-boutiques
-passionately
-duvall
-6:30
-habitable
-aj
-frying
-behaviours
-alcoholics
-regularity
-roadster
-bas
-barbershop
-manipulative
-ghostbusters
-redirected
-ids
-presbyterians
-waders
-406
-1668
-liquidated
-leanings
-lounges
-three-part
-pals
-milieu
-esters
-pharmacies
-kermit
-fink
-spoil
-jewell
-disbelief
-appended
-complements
-hormonal
-biden
-wavy
-invoking
-pundits
-egrets
-analyzes
-sophomores
-kirkwood
-sainsbury
-fairmont
-nadph
-addington
-montenegrin
-carcass
-serene
-pickle
-inglewood
-rasputin
-planter
-oxen
-pee
-workhouse
-chorley
-pedagogical
-jla
-baez
-xix
-steeple
-dir
-disregarded
-unsuspecting
-moustache
-999
-biopic
-435
-presides
-astley
-tremolo
-prosthetic
-acl
-lombardo
-softly
-caesarea
-funniest
-layla
-apl
-lenox
-anticipate
-shareware
-looping
-quigley
-mortuary
-soared
-neurotransmitter
-glossary
-arun
-rho
-1721
-endogenous
-margarita
-manchukuo
-paralleling
-gees
-biggs
-pinky
-off-peak
-motivate
-dorothea
-ellery
-yokozuna
-pratap
-harbin
-tuff
-amur
-dillinger
-fewest
-cross-platform
-germania
-dao
-inhibitory
-carew
-riparian
-urbana-champaign
-saloons
-lorna
-sanctuaries
-lynette
-juneau
-swinton
-sandusky
-perceptual
-whistles
-asturias
-2,400
-1738
-1731
-mediums
-1673
-tanganyika
-drogheda
-renegades
-laplace
-hidalgo
-aroma
-overruled
-recollections
-1995-96
-hickey
-nag
-cronulla
-cavan
-d1
-decapitated
-269
-farnsworth
-1547
-whistling
-63rd
-dunham
-computer-generated
-super-heroes
-cbbc
-examiners
-floodplain
-farnborough
-maharishi
-bodied
-styx
-borrowers
-hush
-ounce
-simplex
-homelessness
-tillman
-pv
-betrothed
-infer
-furthered
-hinge
-wildfire
-til
-dared
-1601
-tora
-uterine
-selwyn
-dissenters
-re-introduced
-roi
-brecht
-encased
-hotter
-rts
-isolating
-monolithic
-psychologically
-1694
-snapper
-warranted
-masterson
-spades
-pled
-dani
-omitting
-encyclopaedia
-pang
-penrose
-skrull
-allowances
-fey
-regia
-seconded
-tampering
-plenary
-footscray
-parchment
-sackville
-blooms
-endorsing
-felicia
-islets
-tv3
-nra
-diversify
-crosstown
-stabilizing
-probabilistic
-servitude
-32.7
-taranto
-cognate
-nonpartisan
-talons
-soe
-moriarty
-discriminate
-feeders
-hillcrest
-courtenay
-channing
-orville
-joliet
-high-pressure
-orcs
-1732
-bumblebee
-yehuda
-unparalleled
-fronting
-footnotes
-yonkers
-motoring
-direct-to-video
-pundit
-commemorations
-sao
-conroy
-cordillera
-spectrometer
-fast-paced
-harare
-bursting
-alouettes
-402
-aquaman
-engraver
-humorously
-hatches
-foreseeable
-paragraphs
-government-owned
-pitts
-three-story
-martina
-ewell
-eminence
-yahweh
-middle-earth
-rigidity
-cartier
-bal
-huber
-muni
-pelicans
-wallingford
-8:30
-dealerships
-mentored
-baltar
-g3
-gigi
-colgate
-jens
-lame
-1541
-bayonne
-insiders
-prelate
-gendarmerie
-silvia
-fossa
-lner
-.22
-galactus
-newburgh
-transliterated
-sportscar
-intervenes
-okanagan
-1559
-infested
-inject
-westpac
-higgs
-cska
-sentai
-silenced
-58th
-puritans
-sling
-detachable
-grands
-nation-wide
-fumes
-hellfire
-cagney
-huguenot
-outbound
-411
-ranchers
-ist
-educates
-seduced
-stanfield
-unoccupied
-briscoe
-archetypal
-pcc
-quaternary
-bautista
-scrabble
-second-generation
-capitulation
-ley
-retardation
-harass
-longest-serving
-cheeses
-redgrave
-2008/09
-geographer
-panathinaikos
-rearrangement
-tyrannosaurus
-genomic
-shielded
-shrike
-delgado
-martinsville
-solon
-screenwriting
-renfrewshire
-cayuga
-berths
-hump
-dilute
-mobilize
-1617
-swordsman
-expressways
-entertainments
-tomahawk
-cut-off
-witherspoon
-1678
-nonviolent
-lagoons
-vases
-compress
-yemeni
-trumbull
-usac
-crumbling
-perforated
-yoke
-solihull
-safeguards
-brecon
-honorific
-yisrael
-md.
-dissimilar
-chernobyl
-lat
-hustle
-pixar
-dentists
-well-preserved
-1565
-seclusion
-deductions
-rota
-g1
-reinstatement
-uplands
-rtl
-mobsters
-ro
-sebring
-wai
-hoisted
-re-used
-laramie
-mi6
-iterations
-anti-inflammatory
-freshly
-rags
-kohl
-idealistic
-adidas
-mtv2
-lapsed
-spearhead
-dispensed
-sturt
-427
-furthering
-iaea
-caa
-cara
-determinism
-displacing
-bevan
-paf
-oakes
-kottayam
-discworld
-nhk
-signings
-sungai
-armando
-cai
-northwood
-frail
-wesson
-351
-bedding
-hermione
-1539
-inhabitant
-herr
-furman
-garrisoned
-permutations
-pai
-remarking
-marcellus
-510
-identifiers
-colman
-codice_4
-crc
-1585
-workout
-tack
-o'grady
-apical
-sickle
-executioner
-co-worker
-thrillers
-suitably
-upholding
-ovary
-freemasons
-outcast
-gutted
-maxima
-tyneside
-player-manager
-oscars
-envoys
-whisper
-mep
-phu
-bathtub
-stewards
-suspensions
-refund
-muldoon
-raton
-finisher
-eugenia
-inconvenient
-appalled
-oscillations
-7-12
-quercus
-subsumed
-reciprocity
-modulated
-ethyl
-netted
-amused
-lingerie
-eukaryotes
-strenuous
-alignments
-ouse
-5-6
-llewellyn
-oncology
-degrading
-findlay
-wilberforce
-ronstadt
-septuagint
-kerouac
-1674
-1687
-duels
-enclaves
-nijmegen
-rove
-creeping
-79th
-nadh
-tits
-camberwell
-unclassified
-grit
-molar
-coon
-aloft
-massed
-sensed
-cystic
-mcallister
-1520
-cfa
-tulip
-nucleotides
-refraction
-fjord
-comforts
-fright
-1631
-b&o
-mid-1930s
-j.b.
-promos
-musashi
-at-bats
-comprehend
-kessler
-reshuffle
-scarred
-consultations
-accrediting
-spurious
-bookstores
-buddies
-hoist
-necessitating
-32,000
-stalinist
-formula_34
-spiked
-uk-based
-top-down
-cadres
-evolves
-confessing
-pathologist
-2.89
-liao
-esophagus
-lucid
-re-branded
-pvi
-tiling
-eigenvalues
-thumbs
-second-highest
-operationally
-franciscans
-precepts
-admire
-elapsed
-meticulous
-calves
-vegetative
-660
-gerrard
-timpani
-cobham
-incursion
-backers
-privateers
-simplification
-invocation
-quantization
-1615
-undue
-beset
-cricketing
-flirting
-vedder
-courting
-assuring
-faroe
-subscribed
-cadmium
-bene
-ek
-71st
-possessive
-mancini
-ucc
-ziggy
-highlander
-operculum
-prescriptions
-straddles
-med
-jana
-reusable
-regenerative
-myocardial
-conceptually
-conveys
-subsided
-relented
-32.8
-custodian
-matteo
-noblemen
-goldfields
-elven
-greeley
-record-breaking
-clientele
-bloomsbury
-rumsfeld
-waterhouse
-cybermen
-multi-platinum
-oratorio
-kirtland
-marques
-buffers
-facet
-rajah
-nicaea
-impatient
-genevieve
-rosters
-1695
-mccullough
-harwood
-harem
-slits
-gauss
-rehearsing
-apiece
-restores
-enzo
-olin
-downes
-stances
-conley
-insoluble
-wiseman
-pastime
-rosalind
-swimsuit
-streamed
-columba
-kanpur
-adrenal
-jansen
-surrealist
-disbandment
-conurbation
-rioters
-pussy
-he-man
-clicks
-customization
-rainwater
-exonerated
-gravesend
-menon
-preoccupied
-daimyo
-butchers
-macedonians
-padded
-stoke-on-trent
-gannett
-burley
-lovejoy
-nih
-tian
-isthmian
-conte
-storyteller
-holbrook
-inequalities
-kenney
-melts
-gorillas
-purgatory
-shadowy
-goodrich
-tantric
-bearers
-recollection
-mirroring
-bolted
-lemur
-undo
-bac
-mallard
-protege
-fab
-cumulus
-2.75
-soulful
-scooby
-filly
-britannica
-fruitless
-colton
-alfie
-maserati
-radiating
-geronimo
-jeffery
-kneeling
-retroactively
-seduction
-insensitive
-takashi
-qui
-bong
-enrich
-integrals
-mixed-use
-tensile
-derided
-zorro
-rimmer
-scratching
-pickles
-greetings
-tranmere
-ais
-cay
-axed
-conveying
-georgians
-broadside
-gauls
-modernised
-kanji
-westphalia
-imus
-novelists
-olfactory
-filthy
-close-up
-drosophila
-yangon
-woodville
-whipple
-statesmen
-transferable
-stationery
-seeding
-clipping
-latimer
-castel
-yadav
-impossibility
-brill
-garde
-nationalised
-gunfight
-basie
-darkseid
-margot
-21,000
-dominick
-31,250
-warm-up
-gilligan
-f-4
-cripple
-kanagawa
-kinsman
-1555
-desks
-ayres
-hewlett-packard
-ronan
-repudiated
-censure
-crippling
-personification
-minot
-accra
-intangible
-theosophical
-parametric
-re-designated
-fn
-funerary
-dashed
-postman
-grosse
-cheque
-shiraz
-paschal
-madre
-subplot
-intoxication
-testifying
-amboy
-stepson
-daughter-in-law
-nuevo
-romana
-drugged
-homophobia
-endocrine
-liberalization
-termini
-jacobi
-scathing
-naturalism
-pacing
-wilkes-barre
-queenstown
-huckabee
-hun
-sauk
-femur
-betts
-refereed
-aretha
-honky
-collie
-rhea
-grainger
-chimpanzee
-seaweed
-toowoomba
-dickey
-sangha
-thrombosis
-nils
-spectrometry
-stressing
-dispense
-materiel
-chekhov
-1050
-eater
-kickboxing
-reputable
-provo
-futsal
-presumption
-kwazulu-natal
-eritrean
-ascents
-upside-down
-goode
-aceh
-pikes
-maoist
-bello
-32.0
-dukedom
-headquarter
-lhasa
-arousal
-unprepared
-zhuge
-carburetor
-squatters
-muskets
-foreclosure
-foresight
-pulau
-tierra
-fangs
-pediatrics
-manoeuvre
-intrusive
-jeanette
-auxiliaries
-rainey
-reappointed
-grilled
-cordial
-usages
-rogues
-antisemitic
-cruiserweight
-26,250
-spoils
-bottleneck
-bligh
-aac
-cupid
-understandable
-utensils
-sbc
-deceived
-taro
-fervent
-three-point
-tripod
-desoto
-refresh
-3.07
-gaya
-sperry
-h.w.
-upa
-hover
-servo
-coaxial
-s$
-thakur
-dooley
-tagging
-gainsborough
-repressive
-dilation
-runaways
-jute
-imposes
-uncompromising
-maidens
-tri-state
-malevolent
-schoolchildren
-animorphs
-single-player
-26,000
-locates
-hk$
-recharge
-cityrail
-ciphers
-herbivorous
-1676
-joking
-woodhouse
-chalukyas
-insulating
-7.62
-sree
-abnormally
-neo-classical
-tripartite
-wanderer
-apg
-liters
-kerrang
-deadlock
-prentice
-fooled
-pluralism
-acupuncture
-feinstein
-21,250
-buckets
-stanislaus
-chihuahua
-notched
-bhakti
-adonis
-peasantry
-masturbation
-eldorado
-d'angelo
-crusher
-court-martial
-all-male
-desiring
-cmt
-aus
-pageants
-quarks
-restrooms
-finchley
-criminology
-cosimo
-amends
-peloponnese
-napalm
-tenerife
-concubine
-remakes
-rpgs
-pfeiffer
-bicameral
-umayyad
-heston
-imp
-tormented
-1994-95
-e.p.
-blogging
-rung
-caches
-msp
-minoan
-fuchs
-rafts
-mela
-lunatic
-ingalls
-michelin
-front-line
-alcohols
-bloodline
-sangam
-xxi
-pore
-yitzhak
-cj
-msg
-muses
-eau
-weil
-excepting
-ragged
-elmore
-gamblers
-mutt
-walrus
-deflected
-mattress
-gentrification
-voss
-taper
-undefined
-davie
-reboot
-watchtower
-naturalization
-all-pro
-industrialists
-peebles
-nicosia
-vicenza
-apr
-nd
-journeyman
-jsa
-risked
-298
-ofcom
-portman
-seleucid
-loudspeakers
-foyer
-nanak
-graphically
-sedentary
-blythe
-bobo
-handicrafts
-leeward
-reunions
-sped
-sheng
-renton
-summation
-sympathizers
-livonia
-taunts
-informer
-neilson
-discharging
-reintroduction
-handover
-restrain
-frieze
-anecdote
-diesel-electric
-3.08
-iww
-gallup
-tripled
-hessian
-rub
-exporter
-fitzwilliam
-enlists
-bowery
-contemplating
-bluebird
-dum
-avi
-tobruk
-ad-hoc
-illuminate
-hartmann
-wallpaper
-majorities
-foramen
-dreadful
-semi-arid
-bao
-broderick
-co-director
-inanimate
-chas
-plantings
-popping
-recourse
-czar
-hydroxyl
-bromide
-unethical
-hylidae
-ul
-ducati
-ifa
-385
-interdiction
-partake
-extremities
-1993-94
-re-formed
-nagorno-karabakh
-subgenus
-templars
-amiens
-egalitarian
-umm
-oceanographic
-opioid
-bikers
-cormorants
-nucleic
-mandel
-sari
-gia
-soybeans
-miki
-vulnerabilities
-iso/iec
-maas
-kean
-marv
-causality
-sydenham
-airbags
-messier
-lightfoot
-undulating
-gretzky
-pylon
-celibacy
-455
-7pm
-lads
-f4
-messerschmitt
-typified
-virtualization
-1696
-adder
-marysville
-gama
-eiffel
-noor
-dunstan
-dickerson
-frome
-emo
-spreadsheet
-bribed
-talib
-lamented
-bataan
-genitive
-manassas
-druids
-armitage
-radioactivity
-time-consuming
-whyte
-deflect
-onondaga
-keats
-recounting
-toshiba
-insectivores
-stalwart
-winnebago
-lyell
-nite
-nanking
-khaled
-conclusively
-1669
-busan
-wynne
-aquarius
-mongoose
-hearn
-plankton
-pianists
-leprosy
-leven
-1706
-theravada
-globular
-neatly
-toms
-agnew
-blaise
-voc
-leahy
-ipc
-ae
-rajya
-glenwood
-alphabets
-anglian
-pinus
-trotter
-slider
-shakes
-ud
-meningitis
-progressives
-symptomatic
-one-act
-shipwrecked
-hopping
-dilution
-sawmills
-tread
-hoard
-aussie
-wingate
-affidavit
-vertebral
-medic
-loci
-ascap
-tenet
-silicone
-leviathan
-aggregator
-451
-varanasi
-finitely
-mages
-nk
-sturm
-austere
-discontinuation
-carols
-stupa
-d'italia
-mid-air
-ceasing
-propagating
-ong
-escalators
-side-effects
-buttocks
-universalist
-omit
-rower
-resins
-unintentional
-sorbonne
-zi
-adhering
-ingenuity
-coniferous
-trickster
-eocene
-1790s
-episcopate
-phonograph
-piraeus
-delft
-shilling
-nome
-panicked
-betraying
-kidman
-glebe
-61st
-sequentially
-inheriting
-roughriders
-intifada
-conwy
-sheryl
-takeshi
-1:1
-ipv6
-petrochemical
-ural
-mimicking
-state-run
-compliment
-decayed
-pooja
-smelter
-democratically
-hud
-crayfish
-osceola
-aspirin
-rampart
-confectionery
-inhuman
-footnote
-museo
-high-pitched
-baloch
-hecht
-co-anchor
-streamline
-dreadnought
-kuwaiti
-knock-out
-fishers
-correlations
-like-minded
-pushkin
-specifics
-timid
-arras
-tuskegee
-sag
-fannie
-aya
-scrambled
-dortmund
-ortega
-desolate
-lamborghini
-distortions
-stare
-leila
-boleyn
-prejudices
-communicator
-matchup
-ryde
-harpers
-startled
-addictive
-blvd
-cog
-minden
-synthesize
-kasparov
-corrosive
-ruining
-discriminated
-montauk
-linnaeus
-singly
-421
-dine
-redford
-kabir
-4-year
-emissary
-smoker
-noodle
-adaptable
-microbes
-femme
-lovett
-presumptive
-salted
-moto
-complicity
-thierry
-prides
-1698
-mindset
-vasquez
-nationale
-emptying
-preamble
-posits
-pda
-brantford
-1657
-27,000
-laval
-waterproof
-flamingos
-two-disc
-silvery
-avenged
-torrent
-570
-vedanta
-kharkiv
-conor
-linton
-tenchi
-approximations
-drm
-cornered
-correspondingly
-coercive
-steppes
-sleepers
-hao
-illustrators
-cons
-egerton
-16-year-old
-exalted
-boreal
-ganesha
-flyweight
-fudge
-webs
-lynda
-gu
-1558
-oily
-rawlings
-mortals
-folders
-muskegon
-butts
-tuttle
-thatched
-berthed
-quantify
-cornea
-gravely
-wray
-transnistria
-heuristic
-repaid
-italianate
-postings
-gunnar
-pinning
-battlefields
-xenon
-natively
-respite
-bi-monthly
-augmentation
-hf
-jett
-tigris
-allotment
-gallatin
-nailed
-leans
-mennonites
-blasphemy
-bohol
-collieries
-lubrication
-jpeg
-septum
-harps
-x-force
-competency
-paladin
-barks
-seti
-kinder
-jurists
-33.0
-kilpatrick
-airships
-nikola
-unicameral
-angling
-netting
-spiegel
-peng
-2.86
-applauded
-m/s
-kaohsiung
-headers
-krause
-whitehouse
-modulo
-admirable
-rowdy
-eos
-vassar
-ticonderoga
-eyebrows
-cnc
-wedgwood
-linebackers
-emanating
-perugia
-hikaru
-wits
-boulton
-inshore
-entail
-3.10
-umno
-estes
-cyst
-bullied
-sensual
-saleh
-forester
-thirty-three
-zine
-discern
-mcneill
-preeminent
-28,000
-3.09
-rocked
-lagrangian
-watcher
-chanel
-ponting
-contradicts
-flyover
-iguana
-hagar
-corby
-dingle
-washes
-reactivity
-formatting
-babes
-vasco
-lukewarm
-angelus
-reggio
-ground-breaking
-bran
-looms
-longhorns
-safeway
-risking
-ed.
-ciudad
-zorn
-dieter
-rammed
-karel
-verdun
-muscat
-psionic
-martians
-subsystems
-happier
-druze
-inciting
-caravans
-diverge
-norwegians
-subfamilies
-harming
-russo-japanese
-bahia
-eucharistic
-huffman
-bros
-pounding
-copley
-dissolves
-disarray
-payoff
-scalia
-cif
-mites
-fsb
-doran
-buckeyes
-kidderminster
-eloquent
-thaw
-karlsruhe
-mmorpg
-countrymen
-campgrounds
-paulus
-1992-93
-carmarthen
-glaze
-embellished
-subgenre
-ishmael
-hmv
-clarification
-hk
-mcs
-reb
-roving
-glyn
-dammed
-biennale
-putative
-nakajima
-elisha
-snowball
-abram
-helene
-starved
-northridge
-cabs
-raping
-disraeli
-altoona
-pacemaker
-reims
-siddeley
-78th
-ecm
-peachtree
-beasley
-ogle
-mindoro
-ethereal
-32.9
-angolan
-popularize
-spares
-constantius
-417
-powerplant
-cusack
-inept
-footpaths
-somatic
-adv
-olympians
-mya
-free-to-air
-alla
-motley
-goggles
-ricketts
-terracotta
-lupin
-finsbury
-allegro
-mathura
-genitalia
-expressionism
-formula_36
-pointless
-musketeers
-psychoanalytic
-intestines
-9:30
-kinsey
-0-1
-molloy
-berklee
-w3c
-blackfriars
-columnists
-powhatan
-cta
-genitals
-winifred
-debugging
-figurehead
-ballantine
-b2
-impressionist
-1596
-meltdown
-glitter
-oxley
-publicised
-parlance
-overseer
-corfu
-tenuous
-botha
-shlomo
-aslan
-purvis
-afp
-c.c.
-366
-headmasters
-lirr
-jahan
-champaign
-rightly
-navigating
-fyodor
-invader
-lutz
-puns
-zak
-kyrgyz
-bracelets
-fiancee
-creditor
-consented
-watercolor
-ballerina
-bassoon
-tupolev
-jem
-transplants
-silo
-schiff
-nyse
-dugout
-medallist
-skywalker
-behold
-ppv
-aggie
-chalukya
-impede
-pheasant
-differentiable
-thrissur
-aeneas
-cornwell
-crucifix
-subconscious
-ballets
-astonished
-dumas
-uncontrollable
-purged
-squat
-co-founders
-luo
-alexa
-luminosity
-middlebury
-whittaker
-422
-bighorn
-sexiest
-bakers
-imran
-bandy
-staggering
-estelle
-milder
-beware
-4x4
-shines
-iglesias
-gravy
-jodie
-calibrated
-honeywell
-hwang
-interferes
-redress
-unilaterally
-knowingly
-katana
-bauhaus
-uta
-gramophone
-cymbal
-silvers
-swain
-completeness
-landslides
-lm
-mannerisms
-epitome
-camshaft
-insurers
-city-state
-bolan
-pittsburg
-webpage
-blender
-interscope
-cuneiform
-sarcophagus
-damping
-nimitz
-bridgwater
-subversion
-osu
-botched
-searle
-vapour
-babel
-gearing
-autocratic
-payloads
-drags
-sinful
-newly-created
-reclusive
-halley
-gael
-ayer
-28,750
-hakim
-horst
-firebird
-mccallum
-disproportionately
-lifeguard
-chongqing
-hornby
-asymmetrical
-cede
-4wd
-lures
-thruway
-built-up
-lumen
-fiftieth
-formula_35
-negotiator
-pakistanis
-redeemer
-kari
-37.5
-200th
-shearing
-azeri
-icarus
-warrick
-lanark
-284
-ghz
-alston
-partitioning
-pompeii
-teri
-balloting
-shoshone
-moresby
-geary
-neale
-planks
-cichlidae
-ultrasonic
-genomics
-assemblyman
-inactivity
-replays
-gondola
-cliffhanger
-shoal
-waterbury
-mpaa
-braces
-gustavo
-ulmus
-e5
-celine
-placer
-mnemonic
-elms
-homepage
-non-muslim
-mythic
-checkmate
-ecstatic
-gillis
-stocking
-descartes
-cabrera
-internationale
-santander
-merv
-questionnaire
-66th
-postpone
-by-product
-sportsmen
-encyclical
-supervillains
-341
-bookshop
-bartlet
-frans
-wv
-sodomy
-amazingly
-hough
-dardanelles
-bani
-junius
-namor
-handset
-swamy
-33.6
-dade
-agar
-hee
-125,000
-berliner
-stallone
-complexities
-kernels
-frick
-healthier
-sushi
-pj
-rav
-refractory
-clegg
-totalled
-leto
-himalaya
-sisterhood
-usurper
-n.c.
-tipping
-sqn
-brash
-unfolding
-tenacious
-monsieur
-nanda
-peckham
-naylor
-shrews
-instigation
-yaw
-ruhr
-conjectured
-cellars
-t.j.
-perceives
-ga.
-chino
-menachem
-1726
-onslow
-groening
-subordinated
-benz
-mauro
-rockstar
-admittance
-asgard
-federalism
-brokered
-crate
-topless
-predated
-vixen
-brunner
-maneuverability
-weathered
-oceanography
-austronesian
-seaway
-atheists
-7-5
-meteorites
-overpowered
-coop
-littoral
-mcknight
-moreau
-sow
-19,000
-latrobe
-indore
-377
-bonanno
-comedy-drama
-laine
-yeoman
-non-executive
-engels
-reverb
-c5
-95th
-phasing
-iceberg
-src
-iis
-moldavia
-negev
-operetta
-telethon
-5-7
-gianni
-friesland
-collin
-archeology
-jian
-atrophy
-albury
-woking
-lettuce
-hammered
-lifeboats
-bourgeoisie
-spitfires
-ado
-lorry
-outboard
-showbiz
-light-years
-censuses
-declan
-splendor
-rescuers
-functor
-high-resolution
-daewoo
-sportsmanship
-glued
-kendra
-best-seller
-fong
-unpaved
-golem
-347
-aw
-guideline
-minefields
-harvester
-four-door
-292
-non-hispanic
-timbaland
-niels
-ruff
-twentieth-century
-sepulchre
-shiv
-vice-versa
-1568
-nymph
-hogarth
-aau
-poaching
-tedious
-narasimha
-itn
-acetylcholine
-loftus
-booksellers
-tonk
-theatrically
-condon
-athanasius
-inexplicably
-perseverance
-zaire
-treehouse
-reentry
-pro-choice
-badgers
-busts
-restructure
-benefactors
-vj
-coerced
-3m
-spires
-vir
-three-month
-csu
-gotra
-headmistress
-zeon
-ando
-gila
-corrugated
-comma
-renewing
-carpathian
-third-person
-lauper
-non-muslims
-brandywine
-gamelan
-helpers
-octagon
-subdistrict
-kiosk
-wittenberg
-reprisals
-3.11
-wildwood
-vidya
-newsreader
-bailiff
-tangerine
-corleone
-inked
-sanctity
-7-0
-powering
-ascends
-DGDGDGDG.DG
-graced
-mystics
-gannon
-monies
-sinha
-seaton
-usnr
-jag
-spectacles
-bayside
-roofing
-endowments
-sorghum
-2.85
-delusions
-wil
-depp
-dignified
-ewart
-ccp
-assurances
-burman
-boucher
-ibis
-spooner
-frampton
-barbuda
-generously
-follow-on
-yogurt
-stub
-dufferin
-shingle
-furnish
-decomposed
-helical
-tessa
-watanabe
-dales
-paley
-breslau
-psv
-luisa
-17-year-old
-sociologists
-bl
-relentlessly
-postcard
-apologised
-2.83
-ldp
-notary
-then-current
-mckean
-argentinian
-remodelled
-chandigarh
-kievan
-hogwarts
-first-time
-solidly
-holyrood
-fuelled
-pickford
-miyazaki
-olympiacos
-thiruvananthapuram
-sequoia
-crowther
-conscripted
-long-lasting
-manure
-coronet
-geiger
-oakdale
-lavigne
-chrysalis
-anatolian
-1671
-adrenaline
-anatoly
-ros
-karzai
-peso
-brice
-contagious
-sever
-parson
-carve
-mirren
-directx
-allende
-cuyahoga
-nui
-finney
-2.87
-juices
-behaved
-entry-level
-ridgeway
-bratislava
-sweater
-ayutthaya
-fung
-classifies
-rug
-typography
-masquerade
-dowd
-westmorland
-pentecost
-jewry
-pease
-e-commerce
-archbishopric
-darth
-uml
-stockade
-lokomotiv
-anu
-speculates
-betrays
-kilns
-polymorphism
-dax
-selene
-cypriots
-assembler
-accademia
-tendons
-hogs
-33.5
-munson
-customize
-honeycomb
-ray-finned
-clams
-vestiges
-alluding
-chatsworth
-elmwood
-cheapest
-scheming
-stuyvesant
-metaphorical
-three-year-old
-lessened
-cottonwood
-goin
-sars
-spooky
-telescopic
-narragansett
-tsui
-iba
-2006/2007
-fructose
-airtime
-machine-gun
-privately-owned
-dismayed
-cleese
-polygram
-hackensack
-finches
-gar
-nap
-swears
-a.f.c.
-noses
-hatcher
-modulus
-363
-air-cooled
-zeke
-hulme
-pall
-ferrell
-toa
-howling
-gridiron
-showings
-atchison
-alhambra
-barium
-kanawha
-all-out
-machining
-guessed
-valkyrie
-commercialization
-moreland
-wormhole
-bagh
-rewrote
-replenish
-punts
-mcfarland
-yorkers
-anglo-irish
-capella
-yar
-supercar
-pharmacological
-old-time
-photoshop
-iceman
-dartmoor
-screwed
-dysplasia
-geforce
-offensives
-valea
-hirst
-mantis
-dossier
-customarily
-prefixed
-formulating
-sahitya
-animate
-dependents
-cv
-call-up
-prudent
-iced
-sofa
-single-season
-fabio
-fusing
-priori
-sylhet
-etienne
-telemundo
-attenuation
-acrobatic
-hideous
-panamanian
-mips
-twister
-henan
-giordano
-mountaineer
-half-century
-biographers
-mysql
-stoughton
-d2
-restorations
-sith
-cagliari
-nb
-odeon
-pears
-contraband
-confederations
-mormonism
-doria
-auditioning
-aorta
-morello
-platoons
-snk
-i-70
-indebted
-townland
-two-year-old
-kenji
-chicano
-kshatriya
-fished
-randi
-snell
-czechs
-waved
-357
-sandler
-297
-itchy
-lomax
-elam
-petrels
-ehrlich
-umberto
-jahangir
-1001
-jura
-stimulant
-bridgehead
-misspelled
-20-year
-b.j.
-bolivar
-15-year-old
-1260
-hastened
-alva
-1587
-humphries
-ainsworth
-granular
-diagnostics
-356
-lehmann
-cilicia
-cubans
-v12
-teas
-1584
-ney
-councilor
-ganglia
-scribes
-seceded
-machinations
-376
-resilience
-pampanga
-balconies
-comstock
-rediscovery
-unprofitable
-ger
-sparring
-desertion
-favouring
-murad
-kofi
-429
-embodies
-lucinda
-bibles
-carrera
-flo
-rad
-inaccuracies
-metro-north
-bungalows
-majlis
-woes
-specter
-flake
-polyhedron
-valparaiso
-denominator
-moulton
-intermarriage
-autoroute
-swaps
-cleanliness
-converges
-hippies
-kenosha
-laced
-ams
-tunic
-trans-canada
-transmembrane
-upperside
-paralympic
-corneal
-outburst
-unfounded
-construed
-comic-book
-towson
-gaeta
-raza
-compulsive
-clearances
-subsidence
-causation
-arlene
-ik
-elaborately
-352
-1-5
-sala
-worrying
-parris
-tutored
-steinbeck
-mcdaniel
-1599
-wicks
-multilingual
-approves
-seedlings
-alkyl
-34.5
-bohr
-usmc
-chaparral
-nco
-behaving
-fresnel
-sufism
-dykes
-methamphetamine
-stow
-feathered
-coining
-metro-goldwyn-mayer
-archimedes
-baptismal
-aldridge
-abbotsford
-81st
-suleiman
-saitama
-passer
-chairmen
-whore
-bristow
-riemannian
-arvn
-scant
-drinker
-airman
-molotov
-mci
-unproductive
-rosetta
-geologically
-ljubljana
-cca
-masking
-ladakh
-schmitt
-drow
-awaits
-motte
-seljuk
-timbre
-pyongyang
-pregnancies
-deference
-equalled
-grossly
-oldenburg
-cheadle
-ophthalmology
-23,750
-denim
-thoroughfares
-desserts
-shipley
-trappers
-catheter
-stressful
-irregularly
-henning
-chipmunks
-clap
-dancehall
-d'arcy
-shay
-2,300
-kalinga
-wiping
-mitra
-tashkent
-crassus
-habsburgs
-1542
-akhtar
-forelimbs
-candid
-consummated
-iterative
-monarchies
-myron
-shaffer
-disarmed
-matte
-gcc
-mussel
-33.1
-heyman
-mauser
-praetorian
-mcnabb
-steen
-!!!
-intrigues
-hamm
-arran
-splicing
-confiscation
-rearing
-jean-paul
-hsu
-drawer
-objectivity
-bruckner
-coors
-trish
-bins
-essen
-50.0
-octane
-schwab
-displace
-seamus
-euphoria
-sucking
-fuze
-tajik
-anil
-sweetness
-summerslam
-heretic
-oni
-mordechai
-xvii
-409
-beastie
-2100
-kodiak
-5,500
-waterline
-kampong
-taverns
-stampeders
-sledge
-fiance
-realty
-cos
-outings
-sunbury
-consistory
-fujiwara
-circling
-guinea-bissau
-gregorio
-uneventful
-plucked
-woolly
-flatter
-criticising
-compromises
-montessori
-sauces
-gian
-pedersen
-hendon
-nr
-cheerleaders
-transponder
-329
-hurled
-noam
-troupes
-valencian
-florian
-laments
-proactive
-hamstring
-pausanias
-origen
-wilma
-butterworth
-wilshire
-.50
-warlike
-virgo
-formula_37
-ferrer
-gobind
-nawaz
-reconstructions
-piquet
-nurtured
-laing
-expressionist
-mudd
-slams
-canaveral
-3-3
-spaulding
-scandalous
-snipes
-aek
-ecuadorian
-berks
-robotech
-misguided
-woolen
-barak
-hares
-woodworking
-hyperspace
-marxists
-carthaginians
-fao
-hibernian
-stagnant
-bullseye
-disseminate
-co-edited
-uttarakhand
-kootenay
-localised
-logie
-2.84
-rocca
-testimonial
-identically
-midrash
-booklets
-studs
-shrewd
-4:00
-strachan
-5-year
-slightest
-quartered
-brushed
-97.3
-albino
-kamehameha
-russel
-emirate
-tama
-pollack
-calmly
-pugh
-reiner
-u-17
-mid-august
-brookside
-crooks
-affirming
-lipstick
-mobs
-prepaid
-antichrist
-nepean
-saito
-darrow
-osteopathic
-chauncey
-pierson
-macrophages
-ripple
-tia
-minesweepers
-decathlon
-serenade
-wargames
-vice-chair
-boulogne
-rr
-kirkby
-intermodal
-ips
-snapping
-tae
-convincingly
-tull
-hainan
-vespasian
-pendant
-telemetry
-cremona
-mubarak
-strove
-stoned
-dianne
-475
-pons
-kershaw
-first-generation
-hydrographic
-richelieu
-pancake
-unsupported
-turn-based
-bionic
-caruso
-sturgis
-looming
-klondike
-dispensation
-roost
-electorates
-byproduct
-burdens
-shuttles
-kafka
-crates
-grimlock
-yo-yo
-opined
-run-up
-deli
-gruber
-filth
-77th
-vladivostok
-virginian
-stateside
-leftover
-craftsmanship
-repeaters
-remy
-playa
-cfs
-1230
-tranquility
-33.8
-alms
-pelagic
-lithography
-halliwell
-barnaby
-staircases
-soles
-housemate
-bondi
-newry
-victors
-sagan
-dci
-executor
-anointed
-aris
-caustic
-janssen
-taman
-sergius
-taxing
-rufous
-showroom
-ccm
-disassembled
-minimally
-brenner
-peloponnesian
-dacia
-winn
-shahi
-mecklenburg-vorpommern
-rabaul
-tooling
-hounslow
-eccentricity
-labelling
-marshalls
-1589
-jails
-undone
-r1
-kestrel
-weeknights
-spokeswoman
-doi
-exhumed
-shahid
-stomp
-officio
-franco-prussian
-segmented
-pew
-probing
-digitized
-clipped
-obliterated
-devin
-rhineland
-geyser
-fluently
-sura
-fling
-revisionist
-2.82
-paragon
-spore
-tangier
-doorstep
-burglar
-automaton
-copious
-facades
-elwood
-frowned
-anacostia
-gl
-durand
-nucleolar
-carpentry
-riverview
-webbing
-sharper
-pegs
-vigor
-1567
-horribly
-cherished
-kinetics
-reloaded
-foggy
-inherits
-bot
-lunchtime
-pidgin
-stoppage
-s1
-bowing
-invertebrate
-wali
-kilmer
-morden
-cashier
-ssp
-prokofiev
-conifers
-trestle
-contours
-cranston
-tilbury
-crossbow
-1990-91
-unscrupulous
-poignant
-ossetia
-macros
-cirque
-a.s.
-gutenberg
-bindings
-libertarians
-yung
-dionne
-601
-cdu
-beauregard
-reassignment
-chests
-hamish
-decays
-cleary
-single-family
-usurped
-polishing
-nus
-lowery
-plovdiv
-carillon
-absentee
-94.9
-thessaly
-pathophysiology
-whitechapel
-chipping
-nesbitt
-blatant
-conforming
-tilly
-schreiber
-snowboard
-softened
-overlapped
-sooners
-hawking
-theodora
-naidu
-326
-all-rounder
-wilt
-vojvodina
-shaykh
-interconnect
-alibi
-text-align
-s.j.
-sera
-syndromes
-wiggins
-donned
-auf
-reinstate
-jazeera
-idris
-ominous
-adapters
-photovoltaic
-504
-contractions
-isidore
-valdemar
-jazzy
-1526
-stax
-vignettes
-wrench
-lathe
-decode
-2.45
-howitzers
-encirclement
-olav
-jolla
-clemency
-militiamen
-whipping
-leveraged
-attila
-individualism
-echl
-overtures
-carbohydrate
-arcane
-outgrowth
-skink
-karst
-h2
-64th
-avondale
-tireless
-stanzas
-ibiza
-sausages
-hallows
-misinterpreted
-dorms
-hawkesbury
-calamity
-nobunaga
-categorization
-2005/06
-sorcerers
-sakai
-bronco
-demarcation
-septic
-1500s
-cef
-abstracts
-ostrich
-dislocation
-typefaces
-gorky
-jb
-d.d.
-shredder
-asymptotic
-transitioning
-proverbs
-wien
-bureaucrats
-battlecruiser
-sinestro
-ghazi
-materialize
-greenlee
-csp
-arming
-margate
-redoubt
-hardening
-reagents
-dialysis
-refute
-stratocaster
-kriegsmarine
-anfield
-ks
-mooring
-stretcher
-yd
-kryptonite
-caricatures
-biscay
-pars
-snipe
-clergymen
-hoy
-saws
-timelines
-phobos
-overheating
-devoting
-prefectures
-divisive
-beatrix
-shattering
-mimicry
-orbison
-irresistible
-steuben
-34.4
-whips
-trekking
-massa
-sitar
-seater
-ilya
-annunciation
-orthopedic
-rickenbacker
-distracting
-590
-cmc
-weezer
-gustave
-latina
-fill-in
-eid
-cornet
-memos
-jogging
-outwardly
-sadat
-alight
-toddler
-dogmatic
-consensual
-multicast
-cmos
-sanatorium
-extramarital
-hunan
-gotha
-2400
-lowly
-mpa
-karna
-unskilled
-brevard
-gustavus
-knobs
-terran
-dario
-jojo
-prescribe
-unfolded
-jeeves
-ziegler
-converging
-donut
-blackman
-prospectors
-hannover
-goodies
-parlour
-goldfish
-clays
-nanaimo
-algol
-misnomer
-kroger
-hinting
-342
-capitalists
-toei
-lancia
-one-dimensional
-equipping
-sequenced
-nutcracker
-aichi
-antiaircraft
-bletchley
-missal
-abruzzo
-foucault
-stockwell
-mendes
-theophilus
-sutras
-rowed
-72nd
-collaborates
-naturalists
-hinds
-angelic
-transcendent
-diatonic
-biddle
-salads
-automata
-cyberpunk
-carrots
-rachael
-haste
-shree
-jains
-minuscule
-single-seat
-devo
-1525
-fosters
-petrograd
-refrigerated
-mingus
-athlone
-attentions
-det
-361
-bexley
-rossini
-croce
-fiend
-diddy
-asterix
-hone
-bioinformatics
-110th
-covertly
-morbid
-breen
-buell
-dermot
-whispering
-draped
-atrial
-fosse
-nac
-evaded
-boredom
-408
-godmother
-capricorn
-muted
-brookline
-plated
-radiology
-zealous
-protestors
-mineralogy
-6,500
-86th
-fema
-steels
-vacate
-pyruvate
-angers
-nameplate
-tsarist
-endothelial
-ulm
-dera
-attenborough
-gervais
-ryerson
-langston
-lytton
-artistically
-brat
-bravely
-sangh
-aux
-carinthia
-miley
-jiangsu
-dagenham
-coldplay
-wifi
-tibetans
-chengdu
-resilient
-rigby
-virgins
-predate
-narcotic
-maids
-laminated
-490
-sheaf
-orbitals
-62nd
-montserrat
-cherries
-artistry
-hiawatha
-affirmation
-italiana
-1553
-wwi
-tiananmen
-anti-fascist
-parachutes
-embezzlement
-mackinnon
-long-tailed
-pastureland
-edessa
-accolade
-bales
-sinner
-forties
-factbook
-deo
-lc
-all-conference
-redirect
-re-recording
-corbin
-quasar
-ex-boyfriend
-u-21
-provocation
-brilliantly
-adrift
-antidepressants
-marlo
-on-base
-symonds
-schoolmaster
-transpired
-burgoyne
-nashua
-l'
-raster
-knack
-1150
-jeddah
-interrupting
-bund
-commoner
-gul
-malwa
-analyse
-awoke
-incur
-resigns
-burrowing
-adjoins
-tabriz
-dailies
-358
-sorrows
-thoma
-blacklisted
-labourer
-kiribati
-1521
-multi-engine
-sanfl
-unimportant
-2007/2008
-maggiore
-overlaid
-legalization
-hovercraft
-ramifications
-redwall
-insure
-luang
-vesicles
-whispers
-garfunkel
-conscripts
-impressing
-suing
-blinding
-multiculturalism
-505
-glycol
-pregame
-orientations
-morningside
-6-7
-aps
-metzger
-indemnity
-provenance
-confusingly
-130,000
-tequila
-tableau
-dandenong
-evangelists
-18-hole
-bluish
-edging
-12:00
-substation
-morgue
-workflow
-highest-rated
-smuggle
-vue
-pri
-mistreatment
-bridle
-ramblers
-humanoids
-p2p
-corolla
-pashto
-menacing
-tupelo
-songbook
-khel
-ramparts
-amis
-selena
-hoof
-electrochemical
-levski
-+DG.DGDG
-groucho
-soundgarden
-quarterfinal
-antagonism
-danforth
-pretenders
-unanswered
-facsimile
-3.13
-roscommon
-dickie
-extensible
-upsets
-keeler
-itf
-bantamweight
-commandment
-kree
-rife
-349
-aggies
-smuts
-tafe
-vijaya
-miyamoto
-h5n1
-surgically
-fundamentalism
-agonist
-xerxes
-whimsical
-macroscopic
-oiler
-newtonian
-kamloops
-hacked
-darkened
-conservatoire
-unwritten
-pauli
-saddles
-concede
-herbivores
-xxii
-diefenbaker
-pitting
-dispensing
-i.e
-bro
-hawes
-circulatory
-asu
-harker
-murat
-hierarchies
-lynching
-tricycle
-lira
-deems
-corwin
-nicol
-navigators
-mangroves
-deirdre
-hives
-co-created
-mcg
-werewolves
-biases
-posited
-selenium
-flared
-sideshow
-testimonies
-2.25
-duckworth
-thaddeus
-sputnik
-bint
-nazionale
-reznor
-rath
-aristotelian
-disorderly
-assailant
-cranbrook
-khorasan
-superstitious
-1595
-olney
-fez
-vw
-tattooed
-honestly
-acknowledgement
-hemispheres
-sandford
-infarction
-firemen
-prudential
-forgeries
-deliverance
-measles
-grohl
-emigrant
-antlers
-endpoint
-briefcase
-rudyard
-thales
-infestation
-hakka
-431
-dewan
-experiential
-bhp
-sousa
-2200
-inspecting
-season-by-season
-sn
-disallowed
-mcguinness
-stun
-beatings
-dial-up
-deliberations
-synergy
-applicability
-primitives
-blum
-emulsion
-reclaiming
-12-inch
-rookies
-confidant
-pao
-1991-92
-toleration
-nrc
-umbilical
-bigfoot
-abdallah
-insurer
-rims
-apprehension
-rhine-westphalia
-koh
-hitachi
-strang
-outcrop
-k-9
-suitor
-loudon
-loanwords
-suspending
-darien
-359
-soups
-cbi
-alamein
--3
-metcalf
-insolvency
-subduction
-importer
-std
-dominicans
-9pm
-stratification
-notting
-ornithologist
-stinson
-galois
-colliding
-placenta
-thrasher
-transcriptions
-neonatal
-secede
-sacs
-diligence
-intimidated
-bumped
-hells
-englishmen
-visser
-snuff
-dara
-381
-confucianism
-brinkley
-trujillo
-haw
-recombinant
-hospitalization
-bryson
-modesto
-extrasolar
-overground
-chesterton
-perish
-727
-bouncer
-fanciful
-1340
-turkeys
-hopelessly
-lublin
-pfizer
-omits
-cinder
-tidewater
-intuitively
-metalcore
-subtype
-jurisdictional
-gelechiidae
-acetic
-hopi
-sunil
-divas
-affinities
-discretionary
-metabolites
-divisible
-baines
-contending
-re-opening
-linkages
-hairdresser
-evocative
-bowser
-brookings
-elise
-galbraith
-slag
-unfaithful
-unforgettable
-niro
-megawatts
-formula_38
-aneurysm
-wrapper
-grinder
-aerobatic
-palawan
-weill
-chisel
-quetta
-lala
-mamluks
-oscillators
-at-grade
-kale
-havok
-penalized
-unleash
-shaky
-leafy
-stipe
-sensibilities
-hoa
-adheres
-republicanism
-sensei
-deepened
-frets
-bainimarama
-cookery
-mikado
-ankles
-ayala
-sunda
-ridgway
-groomed
-stalling
-v3
-moo
-stacking
-sds
-rapist
-pseudomonas
-kendal
-eights
-dibiase
-plover
-durango
-maniac
-andalusia
-checkers
-tirana
-demosthenes
-precluded
-uav
-broadsheet
-dryer
-completions
-parietal
-peek
-pharmacists
-ivo
-mid-june
-paradigms
-etiology
-scofield
-acer
-juliette
-liberating
-barristers
-coffins
-breaching
-capes
-synapse
-scratchy
-holborn
-secrete
-hyacinth
-kenilworth
-underestimated
-subtypes
-vengeful
-eject
-deduce
-rear-admiral
-ting
-parser
-goldwyn
-patrice
-wattle
-directorship
-fsn
-ailment
-raines
-monophyletic
-vinton
-33.7
-chauhan
-7-8
-mccarty
-tres
-longhorn
-bras
-sunnyside
-mechanised
-karelia
-adsl
-weeknight
-erecting
-monde
-schoenberg
-suitors
-lyttelton
-ullman
-tambourine
-mahendra
-1586
-supermarine
-antietam
-conti
-depreciation
-detriment
-underdeveloped
-non-religious
-breakwater
-mishra
-declassified
-extra-curricular
-suede
-dpp
-madsen
-kidnappers
-segmentation
-duchies
-fixed-wing
-sewerage
-posey
-rerun
-ubuntu
-wildcard
-hi-fi
-outermost
-cali
-327
-jolie
-dumbledore
-power-ups
-strawberries
-inter-city
-thighs
-toxicology
-chee
-biloxi
-novices
-appomattox
-gallen
-hideyoshi
-stepdaughter
-marseilles
-assimilate
-suresh
-amharic
-pharaohs
-2b
-fractal
-cortina
-nepenthes
-413
-u21
-bodine
-0.01
-inseparable
-ocular
-amputated
-polygonal
-bookseller
-magnification
-delightful
-woodwork
-arbuckle
-x2
-sanctum
-sparingly
-photographing
-blooming
-encircling
-modestly
-thrift
-raytheon
-isuzu
-correlates
-thirty-one
-cephalopods
-rigorously
-sforza
-wildstorm
-mlas
-newsletters
-mak
-pups
-stipend
-normalization
-eredivisie
-2.55
-vai
-3:00
-boldly
-interns
-lewin
-manilow
-blanks
-niko
-sarkar
-pulsar
-deranged
-murrow
-triplets
-blackhawk
-repulse
-maddie
-rosalie
-lefty
-92.3
-crompton
-chesney
-lesser-known
-1549
-woodcock
-thane
-internships
-disembarked
-aman
-muay
-bg
-hayek
-2.80
-homelands
-ol'
-valens
-lemonade
-honduran
-tucked
-p&o
-bec
-fahey
-patil
-vulgate
-777
-non-jewish
-moldovan
-quarto
-merced
-increments
-refreshing
-flocked
-loy
-aloysius
-impersonating
-wests
-off-screen
-benetton
-dissociation
-emigrating
-perseus
-sod
-scratched
-poisson
-1535
-mitre
-pussycat
-investiture
-krueger
-herrmann
-prussians
-296
-200m
-minto
-telangana
-t2
-wandsworth
-yachting
-bankstown
-symbiote
-34.1
-granth
-niigata
-mesopotamian
-plush
-co-directed
-interstitial
-space-time
-grinnell
-vibrato
-rickey
-mayne
-encoder
-retires
-3.15
-formula_39
-33.2
-33.9
-shatter
-moorland
-384
-franconia
-lifecycle
-swp
-ustad
-anguish
-all-round
-prodigious
-tactile
-romanization
-sgi
-diverting
-allowable
-3.12
-140,000
-maximizing
-polyhedra
-ethnography
-favourably
-majorca
-vibrating
-rajshahi
-newberry
-copland
-arlen
-deafness
-awami
-algonquian
-helga
-foothill
-attendances
-corman
-coyle
-apostasy
-khanna
-monique
-enforcers
-cryogenic
-hennepin
-sternberg
-aquariums
-videotaped
-awakes
-smarter
-gram-negative
-futurama
-analytics
-rudi
-mears
-fortuna
-tuvalu
-ietf
-interruptions
-chucky
-10-15
-second-place
-bum
-recursion
-miramar
-colonize
-particulate
-euphorbiaceae
-bartolomeo
-andersson
-catacombs
-mornington
-hatchet
-gnomes
-coa
-midler
-bogus
-adriana
-thani
-womack
-67th
-lows
-second-team
-erasure
-re-enter
-muppets
-emmet
-metra
-pinot
-nubian
-stoves
-trot
-antebellum
-arno
-chicago-based
-
-madan
-xena
-budweiser
-unscathed
-devas
-falconer
-osce
-franken
-s-bahn
-mistook
-murchison
-weighting
-snider
-headgear
-34.3
-habitual
-cathay
-lymphocytes
-aldehyde
-enacting
-third-largest
-state-wide
-enthroned
-creepy
-intensively
-dag
-canister
-southall
-upmarket
-harwich
-chromosomal
-heroines
-amitabh
-analogues
-wuhan
-zamora
-353
-grasshopper
-symmetries
-neutrinos
-privatised
-uyghur
-kami
-appendages
-perm
-northumbrian
-zara
-certify
-candice
-grupo
-negligent
-liangshan
-propositional
-copernicus
-vis
-bridger
-mpeg-4
-710
-karabakh
-disgraced
-husayn
-blanca
-slocum
-ringside
-darin
-predating
-graces
-fragrant
-sponsorships
-orca
-emphasise
-sportswriter
-colouration
-then-president
-lina
-toughness
-glyphs
-goody
-liaoning
-daybreak
-alkaloids
-appointees
-insecurity
-aec
-ronson
-diagnoses
-dresser
-confers
-kittens
-jenner
-tonbridge
-tavistock
-deepwater
-walcott
-meadowlands
-decency
-sponges
-consulates
-regensburg
-la.
-saddened
-bitmap
-darpa
-plow
-vigilance
-mackie
-elixir
-harford
-windermere
-deportivo
-pilar
-castile-la
-elicit
-montpellier
-calculates
-weiner
-collectibles
-rehearsed
-infertility
-clarinets
-undetected
-brompton
-corral
-arne
-electrician
-scalable
-linemen
-yoshida
-deceit
-delilah
-retrieves
-delisted
-suppressor
-quicktime
-por
-lolita
-unfolds
-a.p.
-tsn
-acadian
-gulag
-1593
-commissar
-giuliano
-hotline
-rainer
-argon
-scarface
-bethune
-ard
-waddell
-floating-point
-alarming
-afonso
-oriole
-dimorphism
-dizziness
-wicker
-receptionist
-s.r.
-inhaled
-tit
-headingley
-antioxidant
-fast-food
-rahim
-pessimistic
-paws
-weald
-moi
-chimera
-galt
-mian
-34.8
-newsday
-dnieper
-sed
-felice
-acceded
-nf3
-palmetto
-analysing
-elkhart
-tantra
-1677
-realisation
-bittorrent
-esq.
-differentiating
-blackface
-neo-nazi
-purify
-trendy
-tallied
-unbelievable
-i-a
-pf
-grenoble
-extratropical
-thirty-nine
-venturing
-mazinger
-DGDGDGDG.DGDG
-contrived
-tran
-haddock
-paroled
-broadening
-ww2
-gutierrez
-irgun
-rodman
-four-time
-donkeys
-sendai
-on-stage
-nadine
-tyrannical
-balinese
-demolishing
-1581
-foyle
-manipulations
-skyway
-cervantes
-sion
-kincaid
-analyzer
-aberration
-gruffydd
-adolphus
-terminally
-powerball
-p.s.
-blossomed
-vaishnava
-annuals
-shatner
-cda
-courtiers
-watermelon
-notations
-heinkel
-veer
-pocono
-vividly
-afrika
-marcelo
-cauchy
-1.00
-aif
-drunkenness
-login
-barth
-houdini
-17th-century
-gluten
-pre-game
-rin
-kimura
-posterity
-omnivorous
-akkadian
-plenipotentiary
-wallachia
-fateful
-xenophon
-disagrees
-evaluates
-maidenhead
-plattsburgh
-vests
-flak
-puzzled
-frankston
-34.2
-mcclelland
-rucker
-kaufmann
-slashed
-1534
-stoddard
-preset
-comoros
-backfired
-turkmen
-programing
-conquerors
-bannister
-registries
-one-room
-nonzero
-downriver
-bayard
-penske
-555
-purchasers
-untenable
-percussive
-custard
-widget
-91st
-deprive
-lubricant
-prospector
-cockney
-overtake
-treviso
-pre-christian
-kal
-shimon
-cofactor
-archway
-post-apocalyptic
-entente
-taranaki
-aquila
-exxon
-edgbaston
-lizzy
-doge
-haganah
-tariq
-tuners
-frivolous
-long-lost
-sweating
-2:00
-castilian
-luv
-hattie
-2.81
-pontoon
-pennies
-guangxi
-two-stroke
-unavoidable
-charterhouse
-overtaking
-padilla
-searchlight
-gygax
-revising
-catwoman
-jodi
-bicycling
-predominate
-mesoamerica
-dissipation
-rapp
-windmills
-reprisal
-mahon
-ontological
-overran
-columbine
-innocents
-3.14
-alford
-perilous
-multiplicity
-encapsulated
-intercession
-crafting
-bakeries
-1538
-buzzard
-macomb
-cade
-bedside
-overwhelm
-2.63
-dunmore
-froze
-immunology
-camper
-duplicates
-negroes
-monongahela
-fullest
-294
-625
-gustaf
-1594
-denounce
-ribble
-averted
-bianchi
-pittsfield
-infighting
-bfa
-observatories
-mindless
-career-best
-chiswick
-sidi
-farah
-b/w
-flurry
-musk
-collusion
-lowercase
-mayall
-malware
-rachmaninoff
-ef
-altrincham
-scratches
-gloss
-open-ended
-railcar
-b12
-semi-pro
-toppled
-discernible
-cdma
-montreux
-silky
-twinning
-tossing
-torches
-logarithmic
-delinquent
-eugen
-morissette
-hydrology
-geodesic
-frisbee
-gloomy
-b-24
-4pts
-family-owned
-mesozoic
-stadia
-redevelop
-ifc
-mahone
-institut
-kauai
-bambi
-reverts
-19-year-old
-payout
-99.9
-costas
-guardsmen
-boardman
-co-curricular
-nostra
-delusion
-entitlement
-espiritu
-billiard
-haircut
-dichotomy
-howl
-stiller
-layman
-dodds
-broccoli
-calloway
-lng
-ebb
-90210
-manhood
-kun
-wacky
-ich
-k2
-eloquence
-complexion
-shaka
-esposito
-mbc
-cheddar
-guesses
-plazas
-infatuated
-nicklaus
-stringed
-athos
-vesper
-leona
-tsang
-demonstrator
-pratchett
-nicely
-jodhpur
-sucks
-crossovers
-crick
-jinx
-goldfinger
-environmentalism
-hinges
-indestructible
-immortalized
-cranberry
-emitter
-hou
-farnese
-parsing
-namibian
-etobicoke
-precaution
-1582
-psip
-unilever
-cortes
-bylaws
-jackman
-vila
-childress
-prophesied
-spoons
-shortcut
-yoshi
-corvettes
-mobil
-quechua
-divorcing
-maia
-nf6
-comatose
-levelled
-silurian
-freeview
-2.44
-footy
-guerra
-gunned
-headstone
--DG.DG
-suzerainty
-35.3
-amending
-nsf
-lingayen
-sae
-havelock
-travolta
-wardens
-keswick
-samara
-perching
-wink
-amicable
-e1
-siu
-puppeteer
-woollen
-lili
-endanger
-carboxylic
-bdo
-97.1
-schloss
-swarthmore
-nord-pas-de-calais
-penobscot
-laymen
-prolong
-abbess
-scarring
-cocoon
-uma
-zoologist
-laureates
-briton
-1569
-to/from
-dalrymple
-widnes
-tioga
-forays
-internet-based
-confucius
-golds
-bovine
-blocker
-raining
-kinnikuman
-coagulation
-bisected
-penicillin
-huguenots
-psychoactive
-inclusions
-manoeuvres
-idyllic
-ieyasu
-alexios
-sharpened
-interacted
-headlamps
-daggers
-l2
-degli
-hiss
-retook
-parvati
-bijapur
-purges
-grasping
-dido
-lachlan
-hairless
-bessarabia
-2.54
-demille
-prefabricated
-labelle
-resonator
-s&p
-seabed
-jacobean
-1566
-carbonyl
-axon
-sarge
-justifying
-lejeune
-moulded
-amphetamine
-2.35
-predeceased
-powders
-stourbridge
-warmed
-rebuttal
-2008/2009
-heralds
-byers
-juicy
-untouchables
-11:30
-piled
-triplet
-severing
-426
-recognitions
-carmela
-xia
-dysentery
-invariants
-scc
-hobbit
-mucous
-blueprints
-kamboja
-standstill
-shorty
-corgan
-dragonlance
-souza
-cassava
-hysterical
-960
-dae
-piacenza
-licking
-sonja
-ante
-hennessy
-astrologer
-lipton
-tumbling
-legislated
-accomplices
-calvinism
-gonzo
-competitively
-end-to-end
-rapport
-handcuffs
-co-ordination
-tammany
-kieran
-riccardo
-advisable
-5-3
-ephemeral
-augusto
-pawtucket
-spoleto
-pangasinan
-hebei
-puncture
-marshland
-coyne
-visayas
-flips
-nuneaton
-boas
-scrum
-gladly
-goodall
-covalent
-girdle
-ipcc
-giraffe
-fibt
-much-needed
-radium
-leavitt
-croquet
-ginny
-precession
-3-year
-customizable
-paleontologist
-pheasants
-bremer
-mahmood
-philly
-maxillary
-tipu
-centre-right
-prof
-cherbourg
-guessing
-coven
-bloor
-konkani
-5pm
-engined
-tivoli
-arcana
-casanova
-v10
-borges
-1583
-legitimately
-gelatin
-bouquet
-coordinators
-+3
-crumb
-aire
-pre-islamic
-alfalfa
-eight-year
-disclaimer
-pahang
-rambo
-hollister
-depressive
-assamese
-alleys
-individualist
-grieving
-headless
-impostor
-phelan
-lambton
-fairport
-revitalized
-guardianship
-offend
-promiscuous
-buffaloes
-imparted
-nahuatl
-inundated
-iau
-vijayanagara
-ola
-dafydd
-messy
-creamy
-cerberus
-irl
-cowardice
-dikes
-shanti
-nurseries
-diameters
-qt
-yaakov
-2.53
-reprimanded
-hillsides
-expelling
-cilia
-situational
-outlived
-eads
-zandt
-bora
-causa
-eldon
-99.5
-suri
-affective
-serrano
-genotype
-clot
-plummeted
-larissa
-asuka
-brough
-gazelle
-irreverent
-divested
-nwo
-lackluster
-diners
-candies
-governess
-astrologers
-clarissa
-canby
-excellency
-cyberspace
-reciprocating
-assessor
-anglophone
-stamping
-cariboo
-routledge
-1240
-34.6
-tj
-guadeloupe
-pantera
-18-year-old
-newsgroup
-ararat
-dexterity
-introspective
-gentiles
-torrens
-cornelis
-icing
-sanger
-predicament
-ddr
-2.47
-fuzz
-parkin
-ayyavazhi
-baa
-guillotine
-pre-eminent
-swelled
-kemper
-redstone
-anaconda
-head-on
-coconuts
-monolith
-categorical
-turkestan
-takeda
-lega
-townhouses
-sketched
-bayan
-mcewen
-draco
-obstetrics
-triumvirate
-keppel
-putt
-dystrophy
-carmarthenshire
-scaffold
-shi'ar
-excreted
-season-ending
-7-9
-lago
-capote
-gulliver
-trombonist
-hofmann
-marymount
-lash
-jiu-jitsu
-nom
-aspired
-complimentary
-postulates
-macabre
-priceless
-surrealism
-infects
-chauffeur
-fortnightly
-staines
-minutemen
-mdc
-clem
-nix
-hedgehogs
-fieldwork
-399
-re-open
-corea
-magpie
-lithuanians
-initiator
-aspirated
-vestibule
-realistically
-r.a.
-driveway
-battlecruisers
-bushy
-devising
-funimation
-falsetto
-2.48
-gwent
-gpu
-bight
-picton
-vinson
-speckled
-excuses
-equalization
-linz
-shaheed
-dtv
-mutilation
-hangzhou
-petri
-accomplishing
-1180
-burgesses
-brow
-beige
-proletarian
-formula_40
-taluka
-worsen
-hawai'i
-footwork
-avert
-incensed
-misunderstandings
-peres
-neared
-rheumatoid
-forman
-strides
-stuntman
-diodorus
-regroup
-reptilian
-clarksville
-felton
-underprivileged
-gurdjieff
-contingents
-fret
-macroeconomic
-snatch
-makoto
-musicologist
-concealing
-revocation
-texaco
-botanists
-sunnydale
-sexism
-hauser
-marburg
-rifled
-inglis
-bobsleigh
-ludacris
-509
-feudalism
-isfahan
-unload
-sheba
-frisco
-vestments
-hammerhead
-522
-sakhalin
-fenders
-scorn
-well-to-do
-statistician
-disgusting
-csc
-beyer
-opie
-moderates
-344
-widgets
-cotta
-1453
-dogged
-432
-caesars
-storied
-2.52
-2.57
-ayodhya
-cardigan
-mourners
-caboose
-phineas
-phylum
-hens
-cohorts
-tarmac
-ter
-deuce
-ucl
-iodide
-audits
-cabernet
-entomology
-whittle
-popularizing
-linguistically
-cps
-appleby
-harrah
-undetermined
-lindley
-uh
-snorna
-ewan
-levies
-globes
-evokes
-untold
-seventy-five
-evaporated
-pilate
-windscreen
-iihf
-bmc
-160,000
-rigor
-harlequins
-loreto
-adnan
-mufti
-hula
-criss
-2.60
-indo-aryan
-hexham
-hardback
-supervises
-decider
-kingsway
-lohan
-second-order
-merck
-romanov
-colne
-bronzes
-sayyid
-diggers
-levinson
-delia
-mcdougall
-devotes
-shunting
-slumped
-excite
-346
-kellerman
-printmaking
-hippocampus
-quaint
-tupac
-incite
-kiosks
-fedora
-2.49
-gretna
-gurdwara
-dic
-mitzvah
-informants
-dials
-perpetuate
-d3
-lata
-tarsus
-juba
-amon
-tk
-senegalese
-intermediates
-summarily
-erectile
-adelphi
-rowers
-numan
-worshipful
-bitterns
-kirov
-struts
-figaro
-analgesic
-imitations
-1989-90
-cleave
-honed
-novello
-sst
-740
-hernando
-nab
-decentralization
-jah
-renato
-stenosis
-capra
-oblong
-gamut
-obeyed
-sassoon
-u-bahn
-elongate
-exegesis
-patio
-fabled
-carlow
-inoue
-lancet
-op-ed
-seaford
-fachhochschule
-madrigal
-youngblood
-confrontational
-austerity
-sho
-retirees
-preakness
-pericles
-retract
-rolfe
-383
-98.5
-slammed
-125th
-phraya
-seagull
-mustache
-farc
-oriel
-text-based
-angkor
-shank
-34.0
-rugs
-pickled
-barrows
-accumulates
-benchmarks
-unsustainable
-upstart
-mandeville
-girder
-cutaneous
-roxanne
-marsha
-aloof
-sayers
-monogamous
-telluride
-14-year-old
-entomologist
-lubricants
-besieging
-desi
-tracer
-ils
-1591
-annotations
-non-government
-alger
-triads
-yeo
-ubc
-ales
-coachella
-apu
-vicarage
-exchanger
-rambler
-dex
-dermatitis
-prefectural
-semi-autobiographical
-aaf
-safavid
-remedial
-?!
-popped
-milt
-1552
-mid-july
-disfigured
-roommates
-touchstone
-transduction
-canis
-kmart
-harpercollins
-springboks
-nabokov
-670
-pashtuns
-inquire
-amplify
-rascal
-configure
-orestes
-whitey
-banknote
-byway
-implantation
-fluffy
-jafar
-gawain
-vaikundar
-chrissie
-ground-based
-jurong
-irishman
-eradicated
-summarizes
-1556
-36.4
-unites
-q2
-jeb
-guardia
-punks
-calvados
-787
-dougherty
-tarn
-sauvignon
-shem
-basaltic
-sandwiched
-democratic-republican
-wolseley
-garuda
-proms
-smu
-batangas
-aki
-afro
-trivandrum
-onyx
-homophobic
-x1
-gta
-nandi
-ilford
-midpoint
-sardinian
-mohinder
-grocer
-sloth
-brigitte
-aspire
-jacobites
-teenaged
-shires
-wishbone
-kalimantan
-citigroup
-piet
-noonan
-woodman
-dorn
-kravitz
-clears
-cools
-obnoxious
-radiate
-mid-september
-disaffected
-herat
-kuban
-indictments
-slovenes
-2.79
-krupp
-chattahoochee
-rehabilitate
-crickets
-blige
-triumphal
-thanos
-bhavan
-sabotaged
-ibsen
-landlocked
-apathy
-potable
-kfc
-staffer
-2.46
-whitefish
-salter
-sitka
-362
-527
-townsfolk
-beavis
-paced
-embraer
-livia
-yamada
-riker
-grozny
-dram
-inman
-ecologically
-schrader
-countering
-loophole
-agendas
-alerting
-sy
-actin
-wrc
-400m
-mid-size
-2.00
-pitcairn
-trimble
-folkestone
-carbonated
-patiala
-cyan
-point-to-point
-jamison
-coo
-hg
-councilors
-1988-89
-news/talk
-605
-boland
-upholstery
-upsilon
-misled
-eisteddfod
-heisenberg
-scouted
-kyo
-plainly
-combe
-bundestag
-harsher
-evangelion
-edgware
-busway
-adenosine
-iga
-oneness
-sca
-fuego
-govan
-ucd
-35.1
-rolex
-ply
-biafra
-unfavourable
-understudy
-endpoints
-ballistics
-recreating
-cysts
-lundy
-pasco
-rhoda
-poke
-self-employed
-allocations
-childers
-fanatical
-whittington
-casco
-prowl
-394
-strut
-northwich
-monogram
-circumstantial
-2030
-episcopacy
-lem
-cyclical
-nymphs
-sabretooth
-bergamo
-watchmen
-accelerates
-walloon
-breckinridge
-requisitioned
-nuncio
-unruly
-unfold
-dispatching
-goon
-extremism
-impaled
-intercepting
-workplaces
-lockout
-joyful
-distort
-quattro
-made-for-television
-twenty-fifth
-5-10
-naruto
-444
-asda
-embody
-entrant
-stuffing
-sieve
-miramax
-deterrence
-mavis
-97.5
-intrinsically
-superliga
-seamlessly
-miyagi
-mendocino
-stewie
-sinbad
-cantata
-comp
-cvs
-upsetting
-grendel
-consulship
-contextual
-tre
-e6
-spanish-speaking
-deviate
-trusting
-loring
-kinney
-lucchese
-saad
-rips
-kiki
-reunites
-esteban
-35.5
-mig
-atmospheres
-salamis
-songbird
-re-emerged
-four-star
-o'shea
-ayub
-ims
-urbana
-standish
-netherland
-raiden
-goldsmiths
-slapped
-mosses
-skopje
-four-cylinder
-diesels
-skated
-pulley
-lindisfarne
-yee
-congregate
-rx
-gisborne
-timetables
-frustrations
-grenadines
-primo
-leveling
-brazilians
-exclaimed
-2.77
-cytoplasmic
-extinguish
-bridal
-bachman
-shards
-36,250
-10000
-kiowa
-braintree
-spartanburg
-phenomenology
-patched
-hammett
-eluded
-delirious
-interstates
-dot-com
-repositories
-lice
-coquitlam
-bridgend
-gargoyles
-339
-pepin
-codecs
-tunku
-ralf
-dornier
-35.7
-vasa
-orogeny
-disulfide
-scrambling
-dissuade
-kev
-348
-1290
-postulate
-arbiter
-rackets
-2.56
-baroda
-meandering
-blackheath
-unrecognized
-breastfeeding
-geddes
-tutu
-1:00
-preclude
-porters
-ntsb
-spectroscopic
-34.7
-hb
-formality
-sevastopol
-tester
-krauss
-muridae
-reminiscences
-fdr
-blitzkrieg
-sloped
-imagines
-suffice
-ovw
-ajay
-kennett
-monrovia
-low-key
-byng
-sandoval
-plaintext
-clinicians
-donohue
-valletta
-suvs
-lib
-categorised
-accesses
-bogie
-unaltered
-octavia
-sanborn
-auden
-pacino
-townhouse
-ardennes
-fagan
-bonaventure
-penniless
-beretta
-730
-depressing
-ncr
-soybean
-sneaking
-visceral
-nihon
-christendom
-beehive
-36,000
-tryout
-steadfast
-aram
-dutchess
-xin
-madrigals
-oxidizing
-selectivity
-barcode
-bader
-ncos
--18
-thermonuclear
-benfica
-1578
-pepe
-inroads
-babylonia
-hamster
-spoonbills
-go-ahead
-high-risk
-bock
-cic
-layne
-nene
-boathouse
-gourd
-1513
-domestication
-visigoths
-kandy
-objectively
-pop-up
-eisenstein
-i-80
-arad
-timeframe
-loveless
-sonya
-cooney
-jocelyn
-multi-media
-ziegfeld
-revoke
-usd$
-hooke
-kayla
-angeles-based
-verifiable
-termites
-bentham
-worshipping
-busted
-369
-nationalization
-courtyards
-minimizes
-grad
-steamboats
-apprehend
-melanoma
-over-the-counter
-uncovering
-daf
-sleek
-wingfield
-whirlpool
-2.76
-reece
-sabina
-compaq
-fossilized
-tutorials
-ica
-ohl
-pomeroy
-kyiv
-american-born
-guatemalan
-sustenance
-deuteronomy
-anion
-rathbone
-cassino
-broth
-modalities
-ivanhoe
-granby
-motorbike
-teak
-tiger-cats
-benzodiazepine
-gemstone
-donington
-huntley
-andros
-perpetually
-industrialisation
-argo
-35.4
-nsc
-calcareous
-dismantle
-374
-toned
-ledges
-small-town
-granules
-ibises
-soundwave
-tat
-shunt
-provisionally
-factorization
-cabal
-ordovician
-greyish
-prenatal
-2.61
-nubia
-edt
-unstoppable
-pic
-coffey
-pca
-rizzo
-amassing
-conferring
-fy
-infinitesimal
-birdie
-above-mentioned
-erhard
-aguinaldo
-waring
-subsection
-sdk
-fluency
-sartre
-hologram
-kursk
-baptiste
-talia
-kilogram
-qpr
-2-6
-lenient
-jus
-cams
-bubbling
-torrey
-schooled
-kee
-au$
-antigonus
-conformal
-lotto
-resistors
-940
-saatchi
-wakeman
-emblazoned
-f5
-pondicherry
-phonemic
-exaggeration
-catharine
-oriya
-dahlia
-brougham
-bowdoin
-everyman
-900,000
-heals
-aung
-budge
-goodwood
-forgives
-railings
-nederland
-heraclius
-osgood
-chamberlin
-petrov
-pop/rock
-lloyds
-kongo
-aerosol
-eltham
-rescheduled
-selina
-sihanouk
-methodism
-2.38
-cutbacks
-reconfigured
-stoic
-marche
-whiskers
-impassable
-sati
-honorius
-gymnasts
-discord
-antecedents
-amerindian
-philippa
-kavanagh
-kristina
-honorably
-overloaded
-kt
-jul
-darmstadt
-internazionale
-winch
-manhunter
-mordecai
-delirium
-four-day
-mossad
-libertadores
-valedictorian
-undocumented
-oust
-darla
-15-minute
-perpetuated
-makati
-sssi
-tilden
-tyndall
-mull
-lifeless
-condom
-770
-particulars
-single-member
-phoneme
-isometric
-thereupon
-gunter
-noi
-captaining
-unbiased
-bulacan
-ellipse
-asymmetry
-ru
-micky
-turnovers
-accelerators
-cocktails
-recoveries
-darnell
-wane
-ilocos
-whiteman
-deadlines
-i-75
-sea-birds
-crores
-lemurs
-mckinnon
-oboes
-locos
-caprica
-persepolis
-hurried
-consummate
-crossword
-stylist
-metcard
-1543
-2.43
-motivating
-swaminarayan
-impediment
-anorexia
-1460
-ronin
-secretions
-kain
-cybernetics
-alas
-riverboat
-110,000
-nuys
-stetson
-adf
-excretion
-achaemenid
-boyne
-kannur
-torturing
-illyrian
-cosmonaut
-ferret
-gaba
-decipher
-guntur
-trajectories
-isley
-obstructed
-balsam
-caucuses
-permissions
-lassie
-sadar
-chilled
-d6
-dialectic
-refinements
-1480
-one-to-one
-oha
-blinds
-scraps
-gwar
-peerages
-bandung
-mid-sized
-ironman
-irradiation
-eunice
-dictatorial
-aunts
-pulsed
-psalter
-life-size
-mchale
-pauses
-ferdinando
-xing
-suspecting
-golfing
-tithe
-triumphed
-evangelistic
-tupper
-thoreau
-backcountry
-bootlegs
-dishonest
-overlying
-molson
-ionized
-workpiece
-'til
-hausa
-queues
-converged
-lactic
-belushi
-maronite
-sub-division
-repackaged
-hilliard
-foaled
-garb
-1576
-misl
-thrusters
-gaon
-kishore
-scavenger
-hindrance
-ulcer
-well-respected
-stour
-pontus
-1579
-sailboat
-gosford
-caspar
-malagasy
-belgians
-hazara
-dismounted
-vu
-floodlights
-laurens
-underpass
-harvests
-fouls
-mid-to-late
-tongan
-enumeration
-taunting
-proletariat
-mamluk
-islamists
-garvin
-fallujah
-riddles
-socio-political
-intergalactic
-reload
-defying
-lse
-australis
-elizabethtown
-rejoice
-yamuna
-1562
-fundamentalists
-caicos
-lookup
-autonomic
-remediation
-flicker
-org
-brookes
-uncertainties
-tenancy
-olympique
-waterworks
-shallower
-mitigated
-wahid
-hendry
-manhunt
-glimpses
-anti-catholic
-parenthood
-amputation
-ill-health
-juror
-simplifies
-p2
-tollway
-hannity
-ftc
-lockdown
-tulu
-pastiche
-multiplexing
-thrones
-impregnated
-coker
-appalling
-foal
-viejo
-mcarthur
-deviant
-bobbie
-corroborated
-cordon
-yamaguchi
-paradoxical
-hsc
-anguilla
-goff
-ancona
-hh
-sarnia
-msx
-whitehorse
-decoded
-piggy
-posh
-complimented
-greig
-lumley
-occupiers
-fernandes
-brooding
-fawn
-alexandrian
-herndon
-watered
-warburton
-tasted
-roseanne
-smolensk
-pats
-livorno
-graze
-electromechanical
-aur
-whim
-samuels
-depaul
-mints
-pre-world
-neglecting
-non-white
-pipers
-foy
-forgave
-undertakings
-karla
-bestow
-476
-zz
-smog
-nozzles
-modifies
-imagining
-compressors
-pinoy
-mundi
-steinman
-snape
-altarpiece
-workington
-q1
-metroid
-glens
-unauthorised
-inwards
-unconnected
-discreet
-0.05
-ovaries
-exclamation
-ornamented
-longs
-uga
-berbers
-roadblock
-abolitionists
-rupee
-emptiness
-poetical
-atf
-writs
-federalists
-infrastructures
-chopping
-scania
-highbury
-hapless
-kama
-inconvenience
-cladding
-salina
-'50s
-+DGDGDG
-laporte
-dummies
-bowes
-leppard
-convoluted
-eeg
-tint
-flagler
-valuables
-1986-87
-tendered
-sprinkled
-iverson
-needless
-harbinger
-wharves
-bnei
-then-new
-campaigners
-viv
-zhong
-trolleys
-licensure
-obstructions
-yeager
-inspires
-solder
-synthpop
-matchbox
-antidepressant
-yat-sen
-netanyahu
-2.67
-adil
-dubs
-fujita
-goto
-akhenaten
-separatists
-athenaeum
-pixie
-farrow
-kingswood
-smashes
-riverine
-mclachlan
-waging
-euroleague
-dekker
-crittenden
-eternally
-salvadoran
-mitosis
-alligators
-outsourced
-mockingbird
-pounder
-kurukshetra
-flavius
-cvp
-spyware
-darkly
-havens
-hurd
-hervey
-catalogued
-ashfield
-pasteur
-airdrie
-sandown
-tsing
-urquhart
-sylvan
-gatineau
-wim
-vegetarianism
-hoo
-haden
-unworthy
-menagerie
-flathead
-tva
-banter
-kuan
-arbitron
-wairarapa
-wolfman
-molybdenum
-dalit
-sedative
-pusan
-yule
-sark
-ince
-bridged
-pennine
-aoc
-monsanto
-bacchus
-centerville
-hornblower
-elvira
-senshi
-mpp
-cobras
-seng
-inflow
-j.t.
-polyphony
-cromer
-kentish
-brac
-parke
-2.42
-silos
-night-time
-pallas
-timmins
-avraham
-mitt
-shakira
-faking
-formaldehyde
-appalachia
-coupon
-rsc
-madge
-illuminating
-channeled
-opposites
-castings
-lamas
-92.5
-geller
-frazione
-changi
-cargoes
-swells
-motels
-96.5
-ca2
-lockers
-ssc
-pittman
-??
-riverina
-entombed
-restroom
-pubic
-unambiguous
-fps
-antiquated
-eastleigh
-swabia
-perlman
-incision
-woredas
-loc
-tokyopop
-yip
-rcn
-quito
-marlena
-discontinuous
-evidences
-mid-2006
-zonal
-attributing
-nunnery
-stockbridge
-flipper
-rusk
-callaway
-gripping
-buccaneer
-figs
-sibelius
-asc
-burg
-mail-order
-fripp
-snapshot
-condenser
-lilac
-archivist
-inter-american
-enlarging
-sialkot
-succumbing
-androids
-stockholders
-mores
-dux
-speciation
-bulger
-spaniel
-bystanders
-chitty
-suetonius
-fillies
-wayside
-elevating
-applegate
-372
-puppetry
-monoclonal
-riva
-molasses
-hotchkiss
-delineated
-92.7
-lillooet
-perthshire
-cadiz
-sandals
-thrives
-convene
-castello
-specialise
-rhododendron
-roundup
-rutledge
-vos
-racked
-b5
-splendour
-farina
-parapet
-boney
-bogies
-statuary
-mio
-harrell
-sunnis
-t-mobile
-superpower
-complicate
-lobbyists
-inflorescence
-sauna
-mug
-mummies
-usagi
-kurosawa
-rodham
-quantified
-ig
-thucydides
-bake
-vestry
-coexist
-matron
-pogo
-bodie
-misconceptions
-germination
-labours
-carefree
-mottled
-invisibility
-teleported
-imperialist
-irritating
-2.71
-hydro-electric
-450,000
-rationalism
-backer
-puranas
-halakha
-mambo
-alpert
-laborer
-kos
-coal-fired
-sancho
-s3
-legionnaires
-cetera
-improvise
-verbandsgemeinde
-hoskins
-hanoverian
-caramel
-zipper
-broadest
-shelburne
-bartley
-herbarium
-chhattisgarh
-scarab
-rosedale
-hwa
-iota
-aoi
-berhad
-gaiety
-negeri
-aud
-manticore
-catawba
-sweetened
-slugger
-ruffin
-browse
-rifleman
-hefner
-rectum
-kalam
-lodgings
-tanzanian
-cotswold
-allegany
-rubbed
-stevenage
-riflemen
-kaine
-diseased
-uninterested
-peppermint
-gnosticism
-sloppy
-issuer
-rectified
-painfully
-algorithmic
-symplectic
-ibanez
-efficiencies
-mts
-consternation
-hernia
-domitian
-reincarnated
-lifeline
-pomeranian
-reticulum
-reputations
-biometric
-jcpenney
-rancid
-pasir
-brainwashing
-mackerel
-428
-mulholland
-klm
-mindy
-tibia
-bunbury
-flannery
-combos
-yelled
-sunbeam
-datuk
-plasticity
-dramatized
-camouflaged
-underwriting
-i-5
-faerie
-roseville
-govinda
-cern
-gestalt
-middleware
-2.72
-hamer
-imbued
-recycle
-bahru
-meher
-mehmed
-alvarado
-empirically
-dunning
-returner
-banishment
-guested
-heathen
-i-10
-tacit
-ghat
-suzie
-fecal
-ramone
-zeeland
-banshee
-starfire
-interludes
-underboss
-dedicating
-molars
-hutch
-ticino
-fsa
-steely
-sills
-bia
-flatbush
-motherland
-bookings
-she-hulk
-hutchins
-dennison
-vespers
-party-list
-afrikaner
-sufferings
-standardize
-slander
-friuli
-meme
-396
-dingo
-kwon
-bibi
-vole
-1330
-bicarbonate
-traditionalists
-lucan
-merrie
-bleu
-.6
-geophysics
-co-chairman
-crossbar
-empowering
-bok
-serpents
-92.1
-panelists
-767
-dendritic
-woolley
-mandala
-formula_42
-nitrous
-sequencer
-barbican
-praetor
-almonds
-gums
-interwoven
-pu
-by-pass
-grating
-surry
-inversely
-sprout
-hanger
-hermits
-phoenicians
-textured
-breathed
-straddling
-choking
-renovating
-privatized
-35.2
-nano
-lemons
-skillfully
-taran
-bebe
-schematic
-goebbels
-kinross
-vries
-imsa
-nominative
-stocky
-tripp
-clean-up
-compounding
-gtp
-koko
-1780s
-monson
-amuro
-1537
-missoula
-savages
-autry
-passageway
-stilwell
-consorts
-juncture
-raps
-sextus
-nursed
-inpatient
-weblog
-caddo
-yazoo
-synthetase
-accretion
-gooch
-kam
-0.25
-grier
-ablaze
-edmondson
-sidewinder
-95.5
-perpetuity
-deutschland
-movers
-pusher
-hungerford
-tugs
-qarase
-secularism
-indulge
-anglicanism
-steed
-tezuka
-c64
-2.65
-2.59
-mitcham
-topper
-dependable
-gomes
-hillel
-tinsley
-cover-up
-impulsive
-2.58
-heaps
-dis
-iwa
-alco
-protease
-licenced
-imo
-aligarh
-shredded
-freighters
-end-user
-roasting
-swordfish
-l1
-usf
-barricades
-surface-to-air
-bastards
-rococo
-oromia
-jalal
-lilies
-kirsty
-ftse
-constipation
-12:30
-platelets
-2.78
-chatterjee
-2-5
-snowman
-fairmount
-canwest
-minor-league
-campbelltown
-shanxi
-bernoulli
-mx
-cosa
-zapata
-individualized
-kenmore
-ub
-brand-new
-plateaus
-frantically
-mississippian
-emcee
-tremor
-sanitarium
-rab
-usaid
-sextet
-droughts
-unattended
-braided
-gazetteer
-astray
-crystallography
-pythagorean
-lita
-nsl
-scythian
-undecided
-flask
-evading
-real-estate
-manta
-emeralds
-selfless
-saxophones
-setanta
-mccracken
-mer
-mesolithic
-cask
-picnics
-dato
-playlists
-meryl
-caro
-hummingbirds
-ammo
-biff
-nematode
-dystopian
-leger
-unmatched
-salons
-cathcart
-migraine
-framingham
-slayers
-tarawa
-forty-two
-f-15
-steam-powered
-leek
-lovelace
-scriptwriter
-descriptor
-mashed
-four-part
-silicate
-keynesian
-ayn
-roan
-itching
-antonius
-meigs
-supercharger
-greenbelt
-misadventures
-short-range
-burgundian
-gd
-unpopularity
-isomers
-partying
-viticulture
-yr
-sub-genre
-porridge
-authorisation
-hoses
-castleford
-spinach
-373
-vices
-humayun
-yugi
-trimming
-cassini
-waddington
-strathcona
-purebred
-mediacorp
-34.9
-shazam
-innovators
-94.1
-y'
-infernal
-interconnection
-synods
-irish-american
-airsoft
-lifesaving
-after-school
-chirac
-vidyalaya
-javed
-96.7
-mendip
-tfw
-1220
-darwinism
-arunachal
-roebuck
-doon
-rounder
-486
-noxious
-audio-visual
-glyph
-sojourn
-peep
-ps3
-1575
-bodmin
-lemmon
-peacekeepers
-b3
-dt
-2-year
-westmeath
-pertains
-1987-88
-2.30
-rp
-leaflet
-grandiose
-35.0
-emplacements
-unlockable
-swims
-maw
-arian
-atwater
-stalked
-newbery
-dewsbury
-40.0
-rockhampton
-c6
-g2
-powerfully
-93.3
-kilbride
-tenderness
-ben-gurion
-conversational
-predefined
-shannara
-vindicated
-1529
-wafl
-tome
-schindler
-girolamo
-syringe
-mid-1800s
-carnivals
-spilling
-serfs
-1470
-kraus
-paraded
-1490
-shad
-naturalised
-eaves
-impacting
-hydride
-fundraisers
-forcible
-petite
-verbatim
-metabolite
-forgets
-wolfson
-mombasa
-ojibwa
-reinhold
-shutters
-darlene
-predacons
-vonnegut
-romford
-gracious
-boro
-fanatic
-wholly-owned
-eldridge
-crumbled
-esophageal
-cardiology
-m.p.
-soliciting
-mid-october
-impeded
-rl
-hypoglycemia
-wiccan
-urgently
-felled
-lehi
-veritas
-karting
-ringling
-yogyakarta
-scaffolding
-2.74
-cirrus
-lattices
-brochure
-gq
-boulevards
-donetsk
-formula_43
-formula_41
-generative
-observant
-coldest
-rdf
-bhima
-secessionist
-pillsbury
-eventing
-coburg
-trickle
-gaetano
-fulfills
-triangulation
-smt
-co-operate
-steeped
-marko
-fricative
-nipple
-sixtus
-coughing
-lichens
-dann
-countable
-withdrawals
-1984-85
-parsley
-landers
-bagley
-boomed
-beechcraft
-millionaires
-zac
-xian
-personified
-stumble
-subtraction
-neuropathy
-zuma
-winkle
-unregistered
-plagues
-hillbilly
-unsurprisingly
-ehud
-1545
-srpska
-rhonda
-valhalla
-periscope
-111th
-gambier
-usfl
-spokes
-rood
-f-14
-us-based
-capsized
-chau
-holocene
-replenished
-celebratory
-bushland
-razak
-coalfield
-georgios
-teton
-tenured
-styria
-507
-sully
-sayid
-gorham
-femoral
-supersonics
-eclipses
-morehead
-chairwoman
-1573
-busta
-pilgrimages
-vee
-glaucoma
-allotments
-p1
-cyprinidae
-mourn
-edgewood
-unita
-layoffs
-manipulates
-flywheel
-metallurgical
-filings
-reinhard
-howland
-thanking
-2.70
-osteoporosis
-laker
-deuterium
-roadshow
-vitale
-pogroms
-maison
-yap
-chaka
-viki
-drip
-pvda
-shone
-cagayan
-pellet
-reassembled
-potawatomi
-gilgamesh
-lopes
-84th
-367
-1563
-holme
-woodley
-panics
-french-language
-94.7
-kar
-far-right
-1b
-crustacean
-propelling
-paradoxically
-substitutions
-vert
-raimi
-breaststroke
-staunchly
-repute
-redox
-kl
-hairpin
-petrel
-frosty
-sovereigns
-campos
-93.5
-magnates
-preoccupation
-musicianship
-scraped
-semicircular
-colonna
-euphemism
-vermin
-reba
-cheats
-lotteries
-breakdowns
-fujitsu
-tonne
-syrians
-earhart
-ketone
-tax-exempt
-husain
-cheered
-beachhead
-resorting
-imprints
-freezer
-multiplicative
-complies
-welk
-raquel
-absentia
-roundtable
-self-produced
-wea
-gillies
-shrouded
-captions
-ethiopians
-epo
-gadsden
-daniele
-lakefront
-dualism
-lossless
-hydrolases
-battleground
-ghazni
-bally
-mohd
-appointee
-sephardi
-456
-tas
-all-white
-neurotic
-lugano
-grebes
-shizuoka
-subpoena
-ramesh
-artistes
-slipper
-haulage
-undiscovered
-pouches
-gorton
-fluctuating
-36.5
-buller
-truk
-lowther
-disruptions
-hallways
-patagonia
-minton
-citytv
-complicating
-on-demand
-2.40
-cargill
-timberwolves
-co-pilot
-dalian
-blackfoot
-motherhood
-mutton
-quintessential
-fermat
-capuchin
-entice
-retinue
-cowper
-sind
-mustaine
-aer
-exhausting
-kassel
-francia
-bos
-rojas
-vandenberg
-disjoint
-endeavoured
-inlets
-up-tempo
-self-sufficiency
-antimatter
-nikos
-unproven
-12-year-old
-foresters
-handshake
-dissipate
-gaynor
-worf
-bots
-sundown
-homomorphism
-mishap
-leonidas
-leia
-pdas
-cd/dvd
-seva
-greenhouses
-ambushes
-zoroastrianism
-hues
-voorhees
-veracity
-contemplative
-automaker
-tabletop
-mie
-fumbled
-d.j.
-pilkington
-licks
-corollary
-unattractive
-1985-86
-6am
-cautioned
-decadent
-withstood
-computer-aided
-pontiff
-ambivalent
-ozzie
-pippin
-m.e.
-pryce
-birdman
-wiesbaden
-koi
-38.5
-emmys
-nightwing
-peckinpah
-780
-dissonance
-kell
-intelligentsia
-1548
-kcb
-leaned
-pio
-glenelg
-radio-television
-391
-spas
-evacuees
-blackish
-beulah
-coy
-zambian
-clinging
-hanseatic
-fsu
-vertebra
-diverges
-utv
-shona
-clocked
-biz
-retainers
-vga
-phobia
-coogan
-tropic
-ineffectual
-92.9
-algernon
-covariant
-halliday
-longview
-mf
-petrified
-svetlana
-anti-ship
-cartagena
-lengthening
-kinases
-kunming
-symposia
-crispin
-stansted
-2.51
-paperbacks
-intrusions
-vaud
-keighley
-pictish
-radiocarbon
-mahathir
-cytokines
-degenerated
-levees
-carcetti
-eliezer
-yuga
-all-girls
-bd
-co-sponsored
-electrophoresis
-tana
-puddle
-distaste
-dsc
-ravines
-tok
-mda
-sexist
-roald
-wada
-talkies
-rewind
-kfar
-trombones
-rosso
-skynyrd
-.300
-minuteman
-allie
-porting
-figueroa
-berkley
-s2
-morphs
-rayleigh
-constantin
-detergent
-iu
-hooves
-non-mandatory
-unwelcome
-delights
-bloomingdale
-aligning
-marksman
-skids
-tenses
-erotica
-hausdorff
-mowgli
-synapses
-edson
-1998-1999
-jacobsen
-carcasses
-pancho
-foreseen
-bayreuth
-barra
-luring
-lansbury
-pala
-crevices
-minstrels
-tiller
-stateless
-drc
-militaries
-autographs
-printings
-bir
-2.73
-invariance
-accrued
-staley
-vigo
-musume
-convair
-bumbling
-stratified
-disenchanted
-unravel
-metroplex
-coen
-ssi
-grapevine
-merovingian
-pcl
-dsm
-schweitzer
-shamans
-starfish
-extradited
-blueberry
-sacramental
-set-top
-insufficiently
-616
-unsubstantiated
-detain
-harland
-frankly
-yarrow
-stamens
-vern
-congestive
-flack
-brisk
-hypertext
-puccini
-meerut
-nigger
-orc
-kelp
-690
-e.t.
-4400
-dreaded
-archiving
-sosa
-valentina
-khazar
-hillsong
-33,750
-smuggler
-verity
-freemen
-round-trip
-jean-jacques
-teasing
-reardon
-cramps
-harrowing
-scapegoat
-scuffle
-viewable
-california-based
-third-place
-rained
-didier
-uplifting
-partick
-glycogen
-colonia
-camus
-confides
-killarney
-udupi
-turpin
-timers
-mycenaean
-front-end
-thackeray
-fortitude
-barricade
-daegu
-lassen
-pandya
-paralympics
-1527
-1170
-admires
-ruthven
-depose
-pledging
-twin-engine
-76th
-tiber
-modus
-nv
-35.6
-airshow
-ridgewood
-exclusivity
-sabin
-rfa
-18,750
-briar
-1320
-busters
-ranidae
-eugenio
-zionists
-mocks
-yasser
-bobcat
-valeria
-fifths
-juggernaut
-alter-ego
-97.7
-incheon
-asymptomatic
-sabu
-cutting-edge
-cris
-deem
-dieppe
-1577
-tarantula
-10.00
-salah
-sefton
-rak
-winemaking
-radon
-tartu
-pct
-proportionally
-nullified
-icbm
-bulletproof
-kshatriyas
-tappan
-silverstein
-lashes
-2.39
-mithridates
-cowl
-springtime
-redfern
-2:1
-kitt
-voluminous
-retroactive
-substantiated
-mandi
-jbl
-qian
-shaanxi
-lactose
-mailer
-homeport
-1544
-27,500
-bruised
-federals
-apologies
-DGDG.DGDGDGDG
-koji
-acorns
-starcraft
-worsley
-bittersweet
-gautama
-larks
-eventful
-1215
-avicenna
-mpeg
-madtv
-mid-nineteenth
-dissected
-django
-kowalski
-conspiracies
-nightcrawler
-grandchild
-sade
-457
-gielgud
-redd
-holtz
-exterminated
-haim
-j-pop
-demeter
-tapering
-nll
-juanita
-2.68
-aquatics
-contraceptive
-cirrhosis
-dlr
-redeployed
-taito
-22,500
-stubby
-ias
-culled
-m25
-neve
-mar.
-imams
-392
-prisms
-amenhotep
-radicalism
-inflamed
-gargoyle
-idolatry
-fast-growing
-beauties
-hymnal
-twenty-second
-befriend
-morphed
-wrangler
-aimee
-blockaded
-stonework
-xfm
-tis
-wildfires
-uppermost
-beryllium
-nikolas
-reflexive
-mutineers
-naxos
-sanhedrin
-sweepstakes
-30-year
-codice_5
-haskins
-divisor
-canonization
-mahesh
-lymphatic
-resignations
-forsythe
-nva
-cyndi
-in-line
-seaward
-despatches
-tripping
-tightening
-1204
-pinyin
-huntsman
-adjourned
-deactivation
-abdicate
-innermost
-jv
-luka
-1561
-lowndes
-vanquished
-colfax
-linkin
-menlo
-warfield
-kuru
-phrasing
-coexistence
-d'hondt
-isc
-stillman
-.7
-445
-dries
-unedited
-73rd
-robocop
-instill
-nodules
-radeon
-sportsnet
-k-4
-breakthroughs
-braga
-2009/10
-booed
-valery
-torrential
-tenement
-crystallization
-renunciation
-decadence
-defoe
-postmodernism
-ticks
-bedtime
-1510
-4:30
-isolates
-macho
-fips
-436
-supermodel
-oaxaca
-centre-left
-cumming
-gw
-picts
-altair
-96.3
-rejoins
-goku
-glitch
-fens
-housekeeping
-pdl
-nai
-suture
-conch
-1770s
-conjugated
-jiangxi
-non-album
-515
-mastercard
-cursive
-acrimonious
-ferrers
-namur
-hatchery
-tycho
-tinian
-crap
-gatorade
-sporty
-jimenez
-tien
-barnstaple
-belo
-norah
-firefight
-totality
-bimonthly
-trailhead
-hazen
-egbert
-humorist
-tzu
-yost
-coleraine
-ufa
-cordova
-piped
-lecter
-burgos
-vinny
-pte
-blouse
-licinius
-tendulkar
-infomercials
-preferentially
-batgirl
-fissure
-speculators
-1982-83
-barbaric
-apnea
-koenig
-bj
-757
-tomlin
-robustness
-newsmagazine
-remo
-recluse
-haddon
-modifier
-anti-terrorism
-altona
-plumes
-rafters
-keypad
-co-ordinated
-beaulieu
-schilling
-yardley
-ecu
-furlong
-goh
-bounces
-bounding
-96.1
-spoofed
-aix
-solent
-ramat
-medea
-trackbed
-99th
-presidium
-thutmose
-oder
-grenfell
-coed
-kelowna
-inga
-hamill
-xenia
-mcphee
-hamasaki
-emissaries
-glorified
-vernal
-lyra
-encamped
-elbert
-maki
-fledged
-redrawn
-castleton
-stele
-taos
-newcomb
-sieges
-langton
-dill
-latch
-kamchatka
-bonifacio
-installer
-stepney
-plebeian
-midge
-prado
-permeable
-hydrochloric
-earrings
-scooters
-maa
-banach
-heckler
-goblet
-tunings
-exemplifies
-mid-2007
-geographers
-long-awaited
-steeper
-didactic
-generically
-pacifism
-coups
-telenovela
-trl
-kristian
-aliyah
-watercraft
-ketchup
-saf
-happiest
-re-use
-reconstructing
-quarrels
-portraiture
-automate
-non-aligned
-wgn
-ruc
-sams
-assailants
-abnormality
-isotopic
-unser
-typhus
-cation
-25-64
-hawkeyes
-basu
-solidify
-trill
-94.5
-agostino
-sinfonia
-seed-eating
-465
-heber
-olde
-augustin
-pawnee
-seaplanes
-selves
-recap
-barrios
-anthracite
-sj
-montpelier
-loon
-burgers
-incited
-drawbridge
-woburn
-overdubs
-almeida
-paraphernalia
-long-necked
-minoru
-tirelessly
-i-40
-hypothalamus
-pere
-krebs
-cablevision
-arbuthnot
-beeston
-bendix
-internationalist
-tron
-acne
-radiators
-tampere
-ferried
-easing
-trina
-patrese
-under-20
-heart-shaped
-watersheds
-milanese
-strom
-2,700
-spc
-3.17
-pepys
-euripides
-outstretched
-next-generation
-aditya
-diploid
-caron
-cysteine
-2,000,000
-uni
-lotte
-marketers
-turismo
-riddled
-thoroughbreds
-responders
-haq
-matsumoto
-ahern
-dearly
-fas
-embossed
-hydropower
-taming
-paradoxes
-shifter
-nizhny
-cedars
-evangeline
-kola
-lnwr
-fakes
-placings
-binoculars
-tyrants
-snippets
-samos
-trapeze
-onlookers
-dury
-kilt
-retaliate
-primeval
-94th
-accrington
-1574
-transferases
-ornithologists
-city-wide
-atms
-quatermass
-twenty-third
-diction
-allocating
-turban
-roamed
-cropped
-overcomes
-legacies
-caius
-stupidity
-rayon
-chickamauga
-bacharach
-vivekananda
-dewar
-kauffman
-electromagnetism
-foxtel
-cnr
-twig
-prohibitive
-sedimentation
-soaking
-tortoises
-publicize
-deportations
-pretense
-sportscenter
-stratum
-straus
-ascertained
-pvt.
-taggart
-quilt
-mid-december
-invercargill
-runcorn
-unseated
-kamakura
-inventories
-palos
-bpi
-headings
-severin
-well-suited
-reestablish
-mankato
-apo
-christophe
-deseret
-salaried
-marshalling
-setlist
-grueling
-shriver
-acta
-episcopalian
-ochre
-barbera
-vasari
-taiping
-petrus
-alcatraz
-corinne
-avril
-supplementation
-sevier
-wayward
-lend-lease
-180,000
-wheelchairs
-bien
-2,600
-licensees
-elbows
-1564
-honoris
-andalusian
-doubtless
-undamaged
-stags
-tanis
-plotline
-10am
-archetypes
-bucs
-shuffled
-pervez
-reenactment
-selznick
-landlady
-xanadu
-patchwork
-quang
-fcs
-salamanders
-haiku
-mathieu
-espn2
-morphisms
-2005/2006
-gard
-iago
-kol
-primera
-stylised
-2018
-fieldhouse
-shangri-la
-ashraf
-immobile
-flagg
-retrofitted
-synoptic
-canvases
-eec
-wanganui
-impresario
-cancerous
-mcclintock
-gyms
-freer
-differentiates
-cob
-winans
-d20
-reggaeton
-dey
-flavia
-sucrose
-clube
-write-in
-non-military
-escalator
-wabc
-home-made
-flightless
-1485
-livin
-morgantown
-unnecessarily
-widens
-simulcasts
-versed
-insulator
-qajar
-gehrig
-greets
-brosnan
-greener
-sola
-cpt
-childcare
-adorn
-rabbitohs
-lucasarts
-pataki
-thirty-four
-futility
-parklands
-itt
-condo
-87th
-hillsdale
-flung
-non-english
-bleaching
-solicit
-8pm
-hourglass
-admissible
-thresholds
-lukas
-kush
-hubei
-shard
-troubling
-mcmurray
-attlee
-manageable
-0-2
-categorize
-solano
-maturing
-astrid
-holomorphic
-ue
-plantagenet
-transposition
-stratus
-shipbuilders
-dugan
-loire
-northstar
-wbo
-webcast
-pannonia
-followup
-pliocene
-president-elect
-ayurvedic
-15-year
-clio
-thalamus
-disowned
-ginn
-dst
-burwood
-ve
-illuminati
-mayhew
-all-purpose
-cine
-carteret
-commercialized
-gaffney
-kingman
-swindle
-heenan
-marsupial
-atticus
-contradicting
-sander
-crags
-scorched
-chalet
-stipulates
-herne
-compatriots
-ishikawa
-crabtree
-mellitus
-affliction
-trios
-interferometer
-second-round
-dohc
-amenable
-meticulously
-disintegrate
-nel
-metrical
-unsatisfied
-bubblegum
-35.8
-heed
-sava
-fisa
-brigadier-general
-deeded
-companionship
-travelcard
-valli
-arles
-pastel
-homeric
-gabriele
-tiki
-impala
-nwfp
-rodrigues
-mephisto
-grievous
-orienteering
-whitmore
-lull
-plumber
-cockatoo
-prevails
-1130
-nephi
-lithgow
-bloodlines
-tuscarora
-guwahati
-hermetic
-booby
-charlene
-t4
-levitation
-devour
-pee-wee
-lotta
-marsalis
-pronunciations
-atolls
-computer-based
-minbari
-infecting
-slipknot
-haakon
-ravage
-discarding
-lasso
-overjoyed
-lilian
-kinshasa
-benedetto
-headliner
-huffington
-dazzling
-5-8
-thule
-munch
-armadillo
-severance
-emphatic
-re-edited
-larynx
-citizenry
-418
-k-5
-macintyre
-expendable
-morrell
-pinpoint
-leviticus
-jacqui
-chilling
-wisely
-tidy
-turbocharger
-harms
-lomond
-imdb
-dialectical
-recessed
-0.0
-disconnect
-surged
-intakes
-hagan
-2.69
-illustrative
-handsets
-trays
-mesquite
-incisors
-regenerated
-pathogenesis
-curtail
-lasker
-salam
-illiteracy
-apogee
-modality
-ever-present
-flashy
-effingham
-ambedkar
-berets
-bipedal
-tutsi
-coupons
-nomura
-gwinnett
-methylation
-tania
-acolytes
-controllable
-gallop
-flirt
-tomcat
-daz
-richey
-fluke
-marksmanship
-434
-call-in
-prays
-emulating
-disposing
-seleucus
-high-frequency
-stopover
-sherbrooke
-afa
-telecaster
-quaid
-replayed
-grasshoppers
-intermission
-mcguinn
-rondo
-denominated
-ytv
-ciphertext
-wyeth
-chien
-typesetting
-anglicized
-tunstall
-a.k.a
-jamieson
-scraping
-domini
-turntables
-newsom
-meta-analysis
-tabitha
-critically-acclaimed
-lowestoft
-veneer
-non-christian
-rimini
-simplifying
-meditations
-gosport
-thirty-seven
-1517
-pau
-mol
-cady
-mcentire
-pinochet
-tula
-in-between
-uaap
-franchised
-abul
-clasp
-awry
-paraphyletic
-laguardia
-separator
-preservative
-tart
-mccloud
-chalcedon
-rostov
-doreen
-greaves
-deir
-harmonium
-sharkey
-midwife
-u-19
-ringer
-evesham
-hawkman
-mhc
-warburg
-neurotransmitters
-monarchist
-vice-presidential
-hbc
-mahan
-chowk
-tyrell
-structuring
-azteca
-rationalist
-enslavement
-milburn
-prez
-ouachita
-cataract
-xuan
-valois
-trans-atlantic
-2.64
-galton
-fateh
-jeter
-pirie
-sceptical
-circled
-bandura
-schoolboys
-tectonics
-rundown
-openers
-isomer
-tcu
-well-regarded
-nada
-slanted
-dg
-coworkers
-thinning
-evacuating
-courtier
-front-wheel
-howells
-barstow
-gto
-coughlin
-achievable
-intermediaries
-sandringham
-cfo
-centurions
-skewed
-leif
-solenoid
-blackrock
-bamberg
-iptv
-malpractice
-1551
-serling
-nlp
-ochs
-ecole
-fleeting
-1524
-ridgefield
-characterizing
-attractiveness
-tfl
-leandro
-subs
-bewitched
-chs
-predominance
-baudelaire
-stockpile
-three-piece
-overarching
-clouded
-flue
-trigonometric
-jillian
-shastri
-deathmatch
-lobbies
-byrnes
-exterminate
-famers
-eateries
-edicts
-embittered
-seabird
-tamilnadu
-gifu
-outlandish
-unforeseen
-crematorium
-trumps
-cac
-ophelia
-intolerant
-ayurveda
-mcelroy
-north/south
-phoned
-dyeing
-vigilant
-jacky
-vying
-kendo
-regius
-composes
-turbojet
-4500
-gant
-sculpting
-counterintelligence
-haag
-kula
-minotaur
-incarnate
-cloverleaf
-pathetic
-poetics
-wyandotte
-aarhus
-95.3
-agglomeration
-alcibiades
-breathtaking
-isdn
-elie
-deconstruction
-emus
-sprays
-disappearances
-sounders
-hereafter
-crandall
-soot
-three-hour
-74th
-depriving
-ilp
-rehearse
-titania
-zaft
-wigs
-apocrypha
-usns
-slurry
-epidermis
-intersected
-motherboards
-minkowski
-two-storey
-chaldean
-unregulated
-turbofan
-dropout
-handily
-1970-71
-carton
-geopolitical
-tiled
-rafi
-commissary
-weyl
-sincerely
-nicobar
-omer
-surigao
-mower
-i-94
-hythe
-parishad
-subiaco
-herbicides
-820
-skinned
-squall
-ninjas
-os/2
-astm
-sylvania
-glengarry
-comyn
-geisha
-irishmen
-northcote
-forerunners
-tadpoles
-doo-wop
-majuro
-prospecting
-hellman
-hanford
-actuarial
-factional
-punjabis
-rhett
-seger
-discal
-1997-1998
-adored
-t.i.
-creatively
-fallow
-2.33
-tremont
-parodying
-499
-seagulls
-palme
-bagpipes
-skytrain
-elicited
-falsified
-diverging
-midseason
-conjoined
-norepinephrine
-four-piece
-boaters
-tabla
-croke
-fiorentina
-shulman
-codification
-satisfactorily
-lichen
-banding
-compiles
-understandably
-stratford-upon-avon
-tugboat
-liston
-corso
-gmtv
-pnc
-hedwig
-sobriety
-sheraton
-lum
-baring
-countryman
-n2
-stott
-wargame
-newton-john
-misgivings
-konkan
-bsp
-megalithic
-hatteras
-modoc
-infractions
-inefficiency
-shears
-pardons
-mockery
-lisle
-bandages
-ideologically
-gogol
-flintoff
-howrah
-aggregated
-deere
-od
-ivar
-bivalve
-amazonian
-lineups
-nar
-pasquale
-fina
-1415
-impairments
-trask
-3-dimensional
-herpes
-samarkand
-garza
-svt
-tasty
-fertilized
-slicing
-3.20
-puente
-greenery
-bessemer
-legation
-454
-grapple
-garda
-mid-april
-remuneration
-prototyping
-momo
-adhesives
-thickened
-smithers
-isotropic
-danvers
-gts
-myung
-bayonets
-catalysis
-2.66
-high-powered
-zeiss
-akademi
-deutscher
-elegans
-two-door
-unsolicited
-retainer
-haverford
-u.p.
-thanjavur
-compost
-legislate
-soleil
-objectionable
-de-facto
-3.16
-millstone
-n1
-bruton
-5-speed
-businesswoman
-babur
-reimbursement
-dts
-purports
-itinerary
-agricola
-riksdag
-impassioned
-long-established
-cfr
-jean-luc
-npcs
-yolk
-paducah
-valerian
-tracklisting
-underlined
-ashdown
-binaries
-taunt
-1557
-feeble
-skidmore
-shiite
-affleck
-finalised
-vistula
-6a
-fibonacci
-eon
-on-field
-percentile
-faisalabad
-etruscans
-buri
-mungo
-moulin
-swinburne
-spitz
-fatwa
-exhibitors
-arbroath
-tritium
-adherent
-heatseekers
-nafta
-cronkite
-leed
-xy
-mccord
-shakers
-apra
-379
-535
-mucosa
-ste
-wayland
-vijayanagar
-roshan
-lestat
-shrikes
-gandhara
-two-player
-benthic
-anastasio
-plovers
-predictor
-apertura
-self-defence
-1080
-quizzes
-first-run
-visigothic
-dips
-meteors
-389
-biofuels
-blacksmiths
-rabies
-93.1
-sla
-arnie
-phony
-kampala
-covariance
-thirty-eight
-frankford
-electrolysis
-beards
-466
-ansett
-ludovico
-elia
-peculiarities
-spaceships
-helmand
-glare
-deathly
-conforms
-cowles
-holotype
-irenaeus
-coots
-salome
-airframes
-forza
-hooters
-geckos
-affirms
-undistinguished
-pavn
-j.m.
-capitulated
-dassault
-bonsai
-b6
-joys
-looped
-bayview
-fiennes
-tl
-redlands
-warehousing
-low-powered
-subsonic
-sarkozy
-weis
-meuse
-bodybuilder
-strobe
-parakeet
-instantaneously
-verifying
-quine
-fluctuated
-judi
-meena
-aggressor
-tabloids
-expedient
-etta
-acapulco
-skeptic
-auerbach
-obedient
-hacienda
-sell-out
-hangman
-forgiving
-auld
-meriden
-pinfall
-soured
-jacopo
-sneakers
-occipital
-africanus
-testicular
-toothpaste
-ncis
-abstained
-expounded
-perils
-windsurfing
-hippodrome
-theodoric
-quinlan
-shortlist
-shuts
-samir
-eb
-ret
-novellas
-mikael
-herc
-arctiidae
-tosh
-goalkeeping
-marathons
-epoxy
-jab
-determinants
-eur
-inge
-annulment
-pruning
-conceivable
-steinway
-war-time
-universiade
-zola
-disparities
-mins
-95.7
-ogilvy
-computationally
-kearns
-radiotherapy
-rechargeable
-stiffer
-elkins
-pps
-ibadan
-pips
-catchers
-besar
-marne
-bronchitis
-octavius
-hectic
-2.62
-blockage
-columbo
-oba
-deanery
-coppa
-terrance
-monet
-halide
-waterside
-hoi
-pdp
-hayashi
-93.7
-shortfall
-hoare
-ansari
-poitiers
-neutralized
-sharjah
-mcfly
-kabuki
-modifiers
-2.36
-sgt
-momentary
-marek
-eas
-discouraging
-rihanna
-89th
-1190
-nuclear-powered
-lakh
-multi-ethnic
-machined
-obie
-mullet
-silverware
-johansen
-estimator
-lamina
-a-team
-jhelum
-crewmembers
-antimony
-motionless
-previewed
-9am
-dalmatian
-comix
-substandard
-toaster
-alphabetic
-nonsensical
-eternals
-enterprising
-citric
-shareholding
-amal
-blazon
-superconducting
-whitchurch
-stuttering
-subsp
-nasr
-hydrothermal
-705
-dwelt
-enzymatic
-flagged
-mundo
-emphysema
-bbq
-e2
-month-long
-maeda
-hofstra
-michaela
-cmg
-troubadour
-samadhi
-coadjutor
-warners
-ferreira
-numismatic
-inferiority
-cenotaph
-serine
-macs
-98.7
-2x
-slotted
-avocets
-ilo
-transcend
-incubator
-glade
-midas
-coeur
-circumcised
-flanker
-manchurian
-fredrik
-nez
-hefty
-aip
-csr
-nicene
-1s
-waylon
-sturges
-smtp
-blackmailed
-oscillating
-cleverly
-regionals
-submissive
-respondent
-canute
-capping
-dang
-precocious
-nha
-glossop
-slowest
-stillborn
-yardage
-uw
-refuelling
-shyam
-pounded
-penning
-sauber
-hitchens
-ssl
-acumen
-moores
-butters
-charteris
-backups
-callan
-calico
-masculinity
-geri
-kham
-teague
-sheltering
-machen
-chasm
-quickest
-nadi
-serrated
-wildflowers
-motogp
-aureus
-unholy
-bioethics
-bharati
-indignation
-ged
-3.25
-nist
-mourned
-evers
-carnation
-tanager
-lemieux
-untrained
-simulates
-kenner
-cellphone
-synchronize
-alcoa
-birdlife
-banerjee
-furnishing
-undersecretary
-savvy
-mothership
-6-5
-manatee
-10-0
-negara
-outbuildings
-slasher
-screamed
-wolsey
-justus
-post-soviet
-612
-surabaya
-duller
-audiovisual
-salvo
-tosses
-kronos
-conglomerates
-zhi
-ejaculation
-lakhs
-cascading
-whl
-stadio
-geodetic
-priya
-vfa
-36.2
-islip
-tithes
-taizong
-lockyer
-belligerent
-.38
-unwin
-tailed
-starlings
-deserters
-fiske
-natures
-increment
-bangs
-parti
-drifts
-teletext
-kumari
-orphanages
-pacheco
-tranquil
-arbitrage
-malloy
-willson
-morrill
-pensioners
-hashim
-one-night
-all-black
-norad
-106.7
-rote
-haida
-830
-lelouch
-guidebook
-transactional
-fhm
-colville
-kingsbury
-382
-torrington
-authenticated
-ligase
-nape
-4-door
-galapagos
-mid-november
-kuching
-policymakers
-fairbairn
-premiering
-tsa
-separable
-tourette
-hyder
-maddy
-appel
-deadwood
-gloster
-hearsay
-truthful
-ardmore
-sickly
-valerius
-8,500
-agassiz
-mou
-calle
-hag
-ubisoft
-33.4
-recuperate
-snatched
-trespassing
-defenseman
-watling
-randomness
-upi
-creoles
-symbiosis
-endlessly
-forefathers
-kickers
-judgements
-origami
-sevenoaks
-well-equipped
-adel
-compatriot
-cryptanalysis
-medulla
-maximise
-hardee
-brereton
-dore
-nether
-3.18
-deleting
-bothwell
-tachycardia
-hargreaves
-cations
-dbe
-amicus
-rac
-cypher
-613
-cortland
-good-natured
-blurring
-forty-eight
-wetter
-amun
-ccs
-rigidly
-711
-rik
-conferencing
-representational
-mclennan
-common-law
-impenetrable
-banshees
-streep
-aragonese
-gaels
-villanueva
-superfluous
-caprice
-milliseconds
-claudette
-re-joined
-a320
-duarte
-devoured
-caged
-vandal
-two-volume
-scepter
-convective
-prewar
-wootton
-commotion
-oration
-meiosis
-loa
-8-10
-bards
-team-mates
-raisins
-truckers
-reebok
-descendent
-automorphism
-92nd
-debtors
-dens
-1270
-unrequited
-multi-year
-indonesians
-mend
-monasticism
-vowing
-af2
-top-ranked
-patterning
-texarkana
-anglo-norman
-exacting
-kerrigan
-radiological
-damper
-muskogee
-toothed
-hallucinogenic
-crewman
-coimbra
-centrepiece
-dhs
-bv
-birla
-blackest
-remanded
-nf
-checkered
-980
-provincetown
-33,000
-inflection
-railing
-brahmaputra
-nuances
-mo.
-maugham
-nigra
-polisario
-firehouse
-hutu
-genealogies
-canes
-anubis
-quimby
-flushed
-booms
-mid-south
-disengagement
-somalis
-fanzines
-varnish
-minimalism
-diligent
-hallucination
-rushdie
-rossetti
-acosta
-bixby
-invalidated
-brezhnev
-wile
-al-daula
-cogan
-firebox
-mitchum
-gilead
-kiryat
-wasatch
-tricia
-vocally
-cockroaches
-greensburg
-damnation
-sunbird
-taker
-haunts
-chaco
-side-scrolling
-plundering
-o&o
-sasuke
-farquhar
-tetris
-majid
-rollo
-rebadged
-ei
-peralta
-asi
-thrushes
-pint
-testicles
-cancelling
-wycliffe
-36.8
-evict
-vomit
-stavanger
-maritimes
-swirling
-surfacing
-apologise
-2,100
-capoeira
-positivism
-takeo
-watchman
-offa
-eldar
-avars
-nittany
-newscaster
-lbc
-raitt
-moldavian
-anaesthetic
-idioms
-rajesh
-gallium
-flapping
-incurring
-devaluation
-germs
-bretton
-sheboygan
-dolomite
-flawless
-ionizing
-shamanism
-amg
-1310
-1430
-mus
-gilt
-hurriedly
-spenser
-axl
-hiller
-menstruation
-2.37
-0.02
-p-51
-397
-rectifier
-tapestries
-vesta
-2.41
-posix
-reappearance
-kaur
-mchugh
-friendlies
-geneticist
-acetone
-disrespect
-sown
-spirals
-incriminating
-bunyan
-nik
-catapulted
-defer
-conspicuously
-manawatu
-hypoxia
-wholesome
-1440
-escalate
-hispaniola
-balearic
-dukakis
-tanakh
-creationist
-xiongnu
-istria
-antonov
-sixty-five
-ghee
-catholicos
-bushnell
-bombard
-mcduck
-kryptonian
-hodder
-schopenhauer
-infertile
-avex
-88th
-hollows
-lusitania
-theron
-unbearable
-translational
-neri
-addictions
-taunted
-dripping
-fastest-growing
-deville
-gallifrey
-ita
-sekolah
-hydrostatic
-glaser
-cambrai
-sweetwater
-offbeat
-shocker
-ogilvie
-handkerchief
-lutheranism
-potions
-soundscan
-dashing
-scopes
-bellini
-royston
-cormorant
-gilroy
-skeletor
-equate
-sceptre
-congruent
-judoka
-calcite
-spartacus
-teased
-high-density
-optimizing
-lbw
-529
-khatami
-kevlar
-exorcism
-lode
-dateline
-polyurethane
-701
-lillie
-franck
-multi-level
-wachovia
-nighthawk
-emc
-implosion
-kelantan
-automakers
-485
-chardonnay
-salvia
-four-way
-britpop
-espresso
-crozier
-hoosier
-halved
-courtly
-unmasked
-metalworking
-motorised
-deductive
-bradfield
-hales
-brookhaven
-insightful
-tri-cities
-notional
-napolitano
-suzhou
-mid-may
-cuffs
-opossum
-dcc
-t3
-trimester
-passively
-stahl
-invokes
-fives
-candlelight
-scapa
-d.a.
-pre-roman
-nc3
-subcultures
-1533
-kansai
-speyer
-rennes
-1515
-barracuda
-polybius
-lysine
-elegy
-flemington
-rereleased
-top-rated
-joyous
-laotian
-stockbroker
-menial
-selig
-seductive
-vaux
-grubbs
-aykroyd
-theropod
-palladian
-tolerances
-gemstones
-drifters
-isaf
-telepathically
-r.j.
-heck
-donuts
-spl
-trawler
-sundial
-bi-weekly
-marten
-gravestone
-68th
-irani
-endangering
-sandpipers
-matriarch
-merriam
-deprecated
-bauxite
-85,000
-moulds
-etchings
-capel
-ay
-970
-gn
-antimicrobial
-ogg
-36.7
-m16
-ornithology
-transiting
-ingham
-bluth
-srinagar
-platelet
-vostok
-profited
-courant
-ffa
-carmelite
-realtime
-endorses
-unmodified
-glazing
-filename
-xhosa
-canopies
-bridegroom
-i-aa
-caliphs
-lisburn
-nerd
-tommaso
-curley
-rambling
-run-down
-ecliptic
-nutmeg
-pre-k
-trudy
-930
-re-equipped
-minolta
-downer
-wily
-traumatized
-marti
-nantwich
-shag
-starve
-smitten
-rafferty
-gout
-murakami
-psychopathic
-2,800
-conjectures
-esc
-formula_44
-rectify
-3,200
-computable
-disseminating
-saban
-carbondale
-disregarding
-pompous
-ott
-exerts
-ramsgate
-rosebud
-stumbling
-luncheon
-iolaus
-him/her
-acura
-co-located
-modernizing
-cosworth
-sourcing
-stinging
-puffin
-mako
-woodbine
-sedai
-foxy
-tabled
-contemplate
-cantrell
-shales
-subterminal
-greyhounds
-eighteenth-century
-vibrational
-obstruct
-brea
-ervin
-hasidim
-navel
-kennedys
-ffestiniog
-southwell
-herrick
-smalltalk
-aia
-mdma
-bahamian
-jeeps
-yushchenko
-pooled
-all-americans
-firsthand
-reachable
-hallmarks
-add-ons
-hokkaido
-moyer
-waterville
-lumbering
-theotokos
-chartres
-vitae
-thirtieth
-unaffiliated
-malfunctioning
-luster
-bathe
-babbitt
-phage
-karloff
-mello
-napster
-khao
-imperfections
-615
-tortures
-wareham
-.500
-finesse
-haploid
-taylors
-vouchers
-lswr
-perfumes
-maynooth
-subtracted
-inflexible
-pcm
-scca
-cherokees
-jerzy
-shayne
-vero
-sedation
-condemns
-hoon
-steppenwolf
-computer-controlled
-sliders
-1532
-polaroid
-llb
-furtado
-sandia
-crawled
-blundell
-hieroglyphs
-arca
-uncharted
-hippopotamus
-sparky
-nps
-575
-illyria
-newlands
-vilna
-wasteful
-telekinesis
-ats
-self-confidence
-maximals
-centerline
-side-effect
-cimarron
-sagas
-snowmobile
-marvels
-refrigerators
-reprieve
-towels
-forfeiture
-9-11
-16mm
-daunting
-hard-working
-theban
-scythians
-508
-rta
-believable
-910
-calif.
-ija
-kharkov
-sarcastically
-nayak
-corsican
-persephone
-circumnavigation
-mandibular
-xi'an
-out-of-state
-lobos
-demi
-85th
-shiro
-valentinian
-expositions
-flavoring
-deryni
-retaliatory
-angst
-carrey
-infatuation
-cpn
-ocala
-ptc
-clumps
-breads
-kr
-enveloped
-raccoons
-qf
-terengganu
-som
-collegium
-hem
-roofed
-.9
-thrashers
-shingles
-98.3
-nga
-vax
-metabolized
-overturning
-encroaching
-brownie
-kwai
-marla
-sparing
-hoysala
-bulawayo
-idiopathic
-coupes
-compensating
-35.9
-extravaganza
-dov
-b.b.
-dislocated
-dangling
-silat
-featherstone
-satires
-comarca
-compulsion
-auger
-dupree
-.8
-hazzard
-shuster
-pizarro
-chowdhury
-unhcr
-breeches
-misplaced
-preventative
-sukhoi
-bara
-emulators
-pcp
-khyber
-dispatcher
-beached
-veterinarians
-post-doctoral
-nugget
-grimaldi
-pendragon
-rocha
-tcp/ip
-pentagonal
-rehnquist
-accompanist
-zoot
-arbitrator
-itch
-36.6
-paton
-backbench
-gaz
-eigenvalue
-coldfield
-sauer
-spillway
-exporters
-akita
-staked
-mahindra
-gulfstream
-blister
-slapping
-rok
-affluence
-musicology
-furiously
-hikes
-acoustical
-beal
-leper
-unimpressed
-shipboard
-cautiously
-hla
-1975-76
-provisioning
-worden
-tinted
-feldspar
-103.1
-pinafore
-emporium
-hornbill
-dueling
-quays
-esl
-volcanism
-canaanite
-soa
-4/4
-urls
-hobgoblin
-fraught
-tribeca
-xie
-leesburg
-blacklist
-uso
-post-hardcore
-militarism
-glutathione
-interlaced
-geometries
-occlusion
-chroniclers
-hexadecimal
-buzzer
-morphing
-modi
-self-sustaining
-pappas
-unconsciousness
-nola
-summerfield
-narrating
-pimlico
-sdn
-non-resident
-duress
-ure
-subprime
-rebounding
-buzzcocks
-rosecrans
-hain
-clout
-12,500
-96th
-guano
-mandibles
-continuo
-anti-social
-hesitated
-toscanini
-pepsico
-}
-a-10
-502
-ramming
-indio
-safeguarding
-desecration
-edgy
-l.p.
-jal
-vadim
-yamagata
-anselmo
-slashing
-lyre
-fabius
-lo-fi
-unannounced
-citywide
-inxs
-throwback
-rodolfo
-figuring
-carradine
-mis
-slovaks
-fta
-airfoil
-elba
-headliners
-lido
-kaveri
-sea-level
-pelosi
-anis
-borgo
-summing
-c&o
-maida
-lifetimes
-motorist
-glycine
-sandstones
-precambrian
-hamburgers
-volumetric
-injustices
-thirdly
-congresswoman
-pomp
-amines
-perot
-1528
-headset
-blois
-4ad
-ever-increasing
-airforce
-one-quarter
-dixieland
-fillings
-chou
-illini
-meanders
-imprinted
-matures
-mong
-dentition
-pop-rock
-gymnasiums
-pyrotechnics
-convents
-frankfurter
-oshkosh
-grenadiers
-accusative
-pkp
-prescribing
-funicular
-venturi
-onshore
-formula_45
-perce
-kbe
-ronde
-wac
-competencies
-optus
-nablus
-sachin
-futurist
-practicable
-clydebank
-luz
-incubus
-nasional
-dutta
-criticise
-mid-1920s
-rosette
-entwistle
-zambezi
-hara
-alternation
-matty
-degenerative
-finkelstein
-industrially
-machado
-1280
-dimaggio
-foreshadowed
-nswrl
-vigour
-diddley
-flashlight
-vulgaris
-romanized
-negated
-confided
-dredged
-bev
-drifter
-7-3
-q4
-uva
-euphorbia
-basso
-odes
-inchon
-gansu
-vial
-caregivers
-anhui
-osa
-paleontologists
-missa
-kutch
-mitchel
-mediators
-hawaiians
-insecticide
-visor
-forgo
-palisade
-analogies
-co-ordinate
-cleaver
-mcdonalds
-cram
-intertidal
-badged
-rearguard
-racquet
-cate
-ambulatory
-embarks
-counterfeiting
-salmonella
-keg
-anthropogenic
-winningest
-paleozoic
-filmation
-commonality
-36.0
-davids
-hyenas
-shoreditch
-mullah
-roxburgh
-crewed
-1978-79
-dll
-shenyang
-epistemological
-gert
-five-day
-101.1
-orientated
-otc
-new-found
-ipl
-anti-apartheid
-belcher
-schulze
-non-political
-hideki
-dagestan
-tet
-milking
-94.3
-gingrich
-vela
-suikoden
-erupts
-technologists
-cst
-co-ordinator
-hyphen
-banquets
-alcock
-annandale
-nightjars
-universality
-anakin
-collared
-bruises
-97th
-rubles
-selim
-walthamstow
-kareem
-walleye
-bukhara
-giulia
-gruff
-colitis
-aurobindo
-trackers
-apologetics
-chubby
-durrell
-five-star
-matrilineal
-hossein
-dk
-heng
-jt
-spadina
-pender
-whitefield
-dislodge
-gsn
-twofold
-unlocking
-grafting
-ferocity
-clashing
-fencer
-trawlers
-magadha
-mansur
-xiaoping
-codice_6
-impure
-DGDGDGDGDGDGDGDGDGDGDGDGDG
-inferences
-oxnard
-candida
-skateboards
-hayworth
-woolsey
-kennet
-pur
-sandinista
-loos
-98.1
-resale
-rascals
-lorain
-reina
-oyo
-ribosomal
-surmised
-refreshments
-above-ground
-homebrew
-tilton
-foreshadowing
-subcutaneous
-pillows
-bled
-ingersoll
-lunches
-dachau
-kalyan
-abkhaz
-votive
-socialization
-manna
-shortland
-hollins
-dinh
-lightest
-undercut
-jackass
-nagel
-nis
-overpower
-schaeffer
-slaps
-dyslexia
-remus
-pours
-air-conditioning
-rimsky-korsakov
-c.s.
-papuan
-legio
-433
-yak
-kagan
-synchrotron
-stash
-lothair
-tumble
-supremacist
-thanet
-financiers
-mayport
-plunging
-centerfold
-1760s
-door-to-door
-tricking
-monti
-tfs
-bailed
-giancarlo
-lavishly
-camaro
-seafarers
-predictably
-aqueducts
-krishnan
-blagojevich
-sakamoto
-niue
-pacer
-funnels
-nss
-tetra
-congratulated
-mumford
-graceland
-hyena
-unsaturated
-ofc
-crux
-dallas-fort
-dios
-ac/dc
-thirsty
-vms
-fronds
-backstroke
-sepals
-insufficiency
-dirichlet
-samaria
-shmuel
-a380
-skrulls
-navi
-cabo
-diemen
-impounded
-thefts
-paddles
-t.v.
-lawrenceville
-1522
-one-piece
-harney
-sluice
-yu-gi-oh
-alexius
-dirge
-biotech
-conciliation
-submits
-abbeys
-mcguinty
-code-named
-roxas
-juxtaposition
-curie
-revd
-satirist
-aural
-slippers
-booted
-germinate
-vogt
-infamously
-public-private
-flavoured
-comically
-unwieldy
-guglielmo
-unbound
-pathan
-beheading
-fireplaces
-steamships
-zona
-96.9
-vigilantes
-failings
-paging
-rosy
-sia
-sulcus
-1546
-briefings
-tarnished
-860
-icebreaker
-sommers
-interactivity
-rubiaceae
-moonshine
-altos
-culminate
-girders
-dc-3
-beloit
-shopkeeper
-whitecaps
-v4
-rendell
-tem
-quantification
-cir
-emin
--4
-outbursts
-schooners
-naismith
-2004/05
-concealment
-desist
-nehemiah
-levitt
-jaffe
-piercings
-interregnum
-seuss
-stormont
-verma
-equates
-sorenson
-oars
-makin
-mid-2000s
-estuarine
-revels
-long-haul
-mehdi
-pequot
-enquiries
-transcendence
-hawkwind
-mythologies
-conserving
-layering
-emphasises
-dubrovnik
-watery
-abrahams
-thurmond
-resounding
-88.1
-wept
-dazed
-icrc
-enquirer
-kwame
-equipments
-hagerstown
-ust
-residencies
-1380
-marisa
-outro
-machinist
-overthrowing
-waveforms
-1512
-censured
-babyface
-wilco
-salticidae
-vapors
-gillan
-2005-2007
-littered
-enforces
-mev
-mismatch
-re-establishment
-hpv
-1996-1997
-khadr
-ruddy
-93.9
-grievance
-carapace
-repton
-mooted
-437
-449
-narn
-splashed
-doubleheader
-paused
-humberside
-bode
-wicketkeeper
-plotters
-grinstead
-athabasca
-alumnae
-blue-collar
-motets
-sigint
-supplementing
-dravida
-handicraft
-flopped
-huntly
-callisto
-sti
-signer
-greenbrier
-sub-continent
-redbridge
-participle
-reflectors
-visualized
-subways
-eighty-four
-higher-level
-murmansk
-rousing
-leftists
-4x100
-hacks
-foreshore
-petro
-queensway
-esk
-cerebellum
-qu
-lashed
-audited
-dlp
-nightline
-trade-off
-pseudo
-zalman
-nomen
-vesuvius
-wreckers
-ivanovich
-aryans
-1483
-sleigh
-chuan
-pews
-single-a
-catalyzed
-by-products
-atreides
-unbounded
-stimulants
-shaver
-astrophysical
-aster
-lacquer
-woolworth
-crass
-ukulele
-aircrews
-seo
-overseers
-knitted
-strongman
-36.3
-compressing
-un-american
-2:30
-pupa
-symantec
-faltered
-williston
-charred
-scheduler
-imphal
-solute
-logics
-barnstable
-backpacking
-dicks
-jonestown
-hippocrates
-low-floor
-ribosome
-recursively
-moynihan
-ranji
-offsets
-2.31
-endzone
-fielders
-sacral
-ganguly
-ldl
-saville
-rasa
-bulldozer
-pearse
-yarborough
-dayan
-mid-2008
-benevolence
-mitigating
-positron
-purdy
-dene
-leica
-603
-merciless
-incessant
-mycobacterium
-disposals
-herron
-vanadium
-tether
-unexplored
-1980-81
-debugger
-goldeneye
-cropping
-mandrake
-gondwana
-refrained
-quayle
-shrank
-babylonians
-virulent
-spiritualism
-remarry
-mga
-asif
-pitman
-broadleaf
-626
-uriah
-nok
-hitmen
-redone
-odell
-roush
-corky
-aoki
-badr
-sluggish
-self-imposed
-cutlass
-baillie
-western-style
-obstructing
-mixers
-scribner
-sigurd
-convolution
-mccook
-moser
-vibrate
-refuel
-morbidity
-kampung
-goring
-post-independence
-mercilessly
-corus
-shoot-out
-whitmer
-h.p.
-shearwater
-digested
-undergrowth
-subjugation
-malankara
-scares
-cupboard
-receded
-overkill
-resourceful
-epidemiological
-invests
-armchair
-heathcote
-jarman
-undaunted
-banked
-1410
-delinquency
-calicut
-patria
-primogeniture
-ak-47
-observances
-absurdity
-tofu
-dispensary
-thetford
-admittedly
-zohar
-prithviraj
-beaconsfield
-newborns
-97.9
-lh
-studded
-miramichi
-hobo
-memorably
-neuromuscular
-instituting
-sweeter
-atalanta
-tonawanda
-issa
-methionine
-persistently
-pontypridd
-semi-detached
-catwalk
-misrepresented
-luoyang
-formula_46
-junkyard
-yalta
-koala
-hallelujah
-anti-soviet
-dilemmas
-j.s.
-helsing
-binge
-utterance
-cheques
-multiracial
-seafaring
-veritable
-peacemaker
-brownstone
-motu
-meditative
-adolphe
-macaque
-kinabalu
-hitomi
-agreeable
-remoteness
-butt-head
-citrate
-glenda
-pursuers
-cruciate
-95.1
-annabel
-kana
-encircle
-sevilla
-watchful
-llandaff
-pilasters
-hedley
-magruder
-1503
-scams
-basset
-briefed
-mannequin
-alyssa
-sunfish
-deen
-umaga
-embankments
-100.7
-pallet
-trickery
-creamery
-midwifery
-mattie
-taboos
-certiorari
-fortis
-pinewood
-homemaker
-palliative
-insecticides
-xiu
-daedalus
-pinellas
-blockers
-ginsburg
-telegraphy
-thameslink
-skipton
-rong
-anti-government
-cosgrove
-homme
-cluny
-sengoku
-relativism
-watercolour
-degrassi
-stunted
-griggs
-five-member
-kraken
-incurable
-morphologically
-17,500
-hensley
-parallelism
-delusional
-f/a
-raucous
-yeti
-transporters
-one-hit
-dupage
-laughton
-menard
-lug
-j.w.
-clarinetist
-batten
-meehan
-photosynthetic
-resent
-cast-iron
-usafe
-1497
-sodor
-mccaffrey
-rusher
-leitrim
-kot
-harley-davidson
-38,750
-b-movie
-lysander
-magnitudes
-esmeralda
-disillusionment
-cruciform
-bulkhead
-millimetres
-testes
-scum
-tremors
-non-violence
-solvable
-ensue
-spinoza
-yvette
-baguio
-tirunelveli
-incomprehensible
-aan
-gif
-gesta
-obliquely
-dasa
-zealander
-polystyrene
-ladysmith
-borel
-altruism
-garbo
-rnzaf
-exteriors
-ponsonby
-transducer
-38,000
-hunchback
-kenwood
-greasy
-sideman
-zed
-f.a.
-widths
-surrenders
-amarna
-anaesthesia
-swarms
-intermarried
-pheromones
-actuators
-burnout
-superposition
-interlingua
-maricopa
-higher-order
-telstar
-colosseum
-hugs
-whitehaven
-kilowatts
-dares
-serbo-croatian
-alesi
-algoma
-brazos
-gent
-infringed
-phonetically
-chivas
-mccauley
-kagoshima
-eiji
-chillicothe
-imprison
-wallaby
-schizophrenic
-boltzmann
-daft
-neanderthal
-avila
-progressions
-agate
-checklist
-metaphorically
-republika
-brackett
-gaspar
-hibiscus
-sylviidae
-kuiper
-clotting
-adalbert
-maghreb
-mobutu
-bristles
-waldron
-ermine
-optically
-soweto
-monomer
-mlc
-baht
-legumes
-skype
-10-day
-anti-nuclear
-alcs
-mistreated
-paco
-aristophanes
-morpheus
-hst
-dory
-roddenberry
-scientologists
-thea
-chlorophyll
-pythagoras
-h1
-lefebvre
-isabela
-counterclockwise
-israeli-palestinian
-searchable
-doughty
-beaton
-polemic
-accented
-uil
-beatified
-masthead
-bitumen
-credential
-championing
-disparaging
-baikal
-frighten
-fonseca
-grader
-mpc
-hollandia
-kk
-nimh
-doodle
-rigg
-budgeting
-encyclopedias
-chipsets
-seaforth
-ocs
-cuny
-streatham
-rabindranath
-99.3
-palakkad
-azimuth
-liaisons
-preemptive
-flintstones
-praxis
-jrotc
-tradesmen
-antivirus
-foretold
-syllabic
-minnelli
-ipv4
-pravda
-windings
-perturbations
-orifice
-91.5
-mulcahy
-hamza
-1994-1995
-adp
-ozma
-spaniard
-ses
-rourke
-landmass
-kohler
-knopf
-weston-super-mare
-barrow-in-furness
-dunbartonshire
-ame
-790
-heartfelt
-player-coach
-antithesis
-pogrom
-desolation
-hardie
-cornering
-choked
-circuses
-articular
-36.1
-hatter
-honouring
-killian
-workable
-sprints
-grin
-laban
-run-time
-eloise
-dui
-pietersen
-audley
-durbin
-hydraulics
-taxicab
-symbian
-capillaries
-nebulae
-mbeki
-politely
-ning
-fd
-culpeper
-audacious
-carlsbad
-narrowest
-peary
-91.9
-monotheistic
-blue-green
-tot
-lancastrian
-intensify
-blackie
-scotts
-encyclopedic
-pegged
-outcasts
-unsold
-noyes
-concordat
-ecc
-subservient
-cacti
-balmoral
-cytokine
-tribulations
-militaristic
-ori
-shou
-slaughterhouse
-sustains
-theosophy
-purest
-subroutine
-cheetahs
-huggins
-concurred
-eyelids
-unclean
-schenck
-kayaks
-gascoigne
-yucca
-sprinkler
-ashburton
-ciara
-32,500
-2.34
-91.7
-chappelle
-cristian
-impervious
-boomers
-erasing
-foreign-born
-abomination
-aguirre
-452
-csm
-becca
-indica
-trenchard
-testers
-formalised
-1296
-ajmer
-satirized
-showy
-fenian
-corrie
-daria
-elphaba
-repulsion
-duan
-callie
-810
-varney
-shinn
-runic
-cts
-comin
-drawers
-henrico
-scorponok
-opioids
-kucinich
-beaux-arts
-enforceable
-fletch
-semaphore
-stereotyped
-beni
-ataxia
-choo
-zedd
-oliva
-1983-84
-overdubbed
-moulding
-marchers
-cie
-tannery
-wallonia
-1370
-radii
-parsi
-scone
-toros
-sefer
-rapier
-brimstone
-dap
-dwells
-urethra
-bomberman
-spliced
-nance
-250cc
-basques
-stratigraphic
-sidon
-infinitive
-wranglers
-decried
-denys
-503
-liturgies
-hof
-freestanding
-annotation
-lobsters
-diwan
-anderlecht
-summarised
-badlands
-selo
-infiltrating
-ravages
-reproduces
-ophthalmic
-quashed
-denbigh
-flintshire
-comprehensively
-richfield
-brochures
-icann
-lesnar
-hand-made
-pilipinas
-typology
-oar
-rsl
-trolleybus
-clef
-two-point
-long-serving
-brann
-1969-70
-divorces
-offside
-thrashing
-drago
-moveable
-infocom
-xun
-eastland
-saracen
-obi-wan
-gx
-absolution
-dipper
-beano
-3:30
-springbok
-heavy-duty
-aig
-flourishes
-melchior
-ramjet
-ishtar
-freshness
-culloden
-bee-eaters
-millicent
-carcinogenic
-paulina
-p.j.
-assays
-99.1
-inquisitor
-2000-2002
-benefitted
-kalyani
-adkins
-redirects
-farid
-3.19
-energized
-lx
-washer
-scotrail
-brushing
-bibliographic
-loveland
-yai
-1516
-everly
-vong
-bankhead
-chops
-action-adventure
-rundgren
-jos
-phonetics
-vestibular
-nieces
-belgrave
-fleas
-upc
-unaired
-drinkers
-buoyant
-speight
-congruence
-licentiate
-fouling
-mortensen
-leonora
-impeached
-knuckle
-tawny
-swivel
-leiter
-canisters
-lobed
-anachronistic
-kinsmen
-mamoru
-deep-sea
-sorrento
-thickening
-dispositions
-repetitions
-racecar
-materially
-erupt
-positives
-orsini
-equalizer
-fighter-bomber
-duryodhana
-belknap
-storefront
-scalability
-hailey
-wilkerson
-pacification
-obstructive
-ibc
-petrarch
-deviated
-attains
-memorialized
-rrna
-notability
-ghazal
-magda
-grapefruit
-apparitions
-threefold
-verdicts
-thrusting
-caddy
-hydrolase
-diligently
-epcot
-re-launched
-kemble
-thematically
-grills
-suisse
-numa
-yardbirds
-colegio
-venizelos
-springing
-tusk
-litt
-pontefract
-suraj
-rehman
-sadr
-tahir
-korg
-oka
-binh
-sodom
-cartography
-midwives
-picnicking
-meu
-hotspots
-seder
-haverhill
-azul
-emblematic
-bren
-mistaking
-516
-rocko
-montezuma
-salinger
-gesserit
-aquaria
-brundle
-gamespy
-95.9
-halogen
-prickly
-folk-rock
-vindication
-noida
-poppins
-improvisations
-hcl
-2017
-bpm
-screenwriters
-first-degree
-chats
-piranha
-pennines
-kennebec
-imparting
-georgiana
-takers
-pastries
-poltava
-softening
-melba
-abv
-eunuch
-democratization
-anointing
-tannins
-arl
-agua
-heathland
-tisch
-armoury
-pollinators
-marshfield
-forsaken
-emp
-monotheism
-pug
-treen
-unlocks
-budgeted
-geraldton
-bua
-vvd
-headstock
-4kids
-broadbent
-headdress
-mantras
-blackened
-mistrust
-free-market
-riordan
-golding
-agni
-garret
-isobel
-ayrton
-posteriorly
-plat
-war-torn
-behar
-aficionados
-belfry
-90-minute
-polanski
-lecce
-limo
-reminders
-strengthens
-superstore
-ceredigion
-perdue
-dabbled
-purposefully
-birthdays
-robotnik
-multitasking
-20-30
-neue
-hunterdon
-thorp
-khun
-okinawan
-histamine
-inhospitable
-heredity
-detonator
-asante
-abyssinia
-homegrown
-slaughtering
-hick
-howarth
-olympiads
-langham
-incestuous
-pillaged
-coincidental
-rant
-apaches
-100.3
-mew
-lackey
-n.j.
-egon
-saks
-strikeout
-papyri
-willed
-pharisees
-narmada
-brockton
-museveni
-joes
-audiobook
-earthwork
-rebate
-kao
-kath
-pan-american
-ponderosa
-protectionist
-ctc
-crushes
-declarative
-odour
-varese
-clocking
-ferro
-laverne
-wetherby
-ghraib
-fissile
-debussy
-blues-rock
-fords
-fulda
-orgy
-2050
-geldof
-38.3
-icl
-novelisation
-mitral
-qr
-depositing
-parthians
-resold
-1024
-tusks
-angina
-retold
-eyewitnesses
-gabby
-igniting
-labonte
-abuja
-513
-d'este
-anglo-dutch
-detritus
-left-arm
-legge
-ursa
-getter
-mellencamp
-insolvent
-rulebook
-tempting
-profess
-xs
-lavinia
-burney
-581
-exmouth
-kcmg
-crowding
-sharpness
-assembles
-wordplay
-holley
-homicides
-west-central
-derrida
-cordoba
-roos
-manmohan
-1501
-randal
-tanglewood
-aca
-inlaid
-hoodoo
-flexor
-at-risk
-metrorail
-creme
-cosmonauts
-kilburn
-jnr
-nagaland
-scioto
-cutthroat
-coastlines
-follicles
-vandalized
-adjusts
-ost
-ostend
-historicity
-hamper
-brigid
-105.7
-assertive
-atholl
-morphism
-aberdare
-improvising
-distractions
-abstractions
-cranmer
-snelling
-mera
-invertible
-chrono
-overdue
-procuring
-udf
-i-90
-javan
-crenshaw
-holton
-phan
-raisin
-milled
-meads
-sei
-bigotry
-flagpole
-wedges
-wettest
-stratigraphy
-clinching
-hindsight
-topaz
-snowden
-tenacity
-buoys
-pairings
-askew
-baz
-solis
-koran
-tubers
-kargil
-sooty
-bridgeman
-proto-indo-european
-gaiden
-maharajah
-conservationist
-vpn
-waterpark
-therese
-idents
-diversionary
-curled
-intercultural
-wrongs
-fairtrade
-mcclain
-serfdom
-cloths
-angelique
-minigames
-pragmatism
-first-choice
-gdi
-bream
-gnaeus
-ischemic
-spicer
-cerebellar
-bastions
-irregulars
-airings
-crohn
-flotation
-donates
-hilt
-unbreakable
-38.1
-modulate
-alienating
-revisit
-starz
-aeneid
-1531
-mists
-weevil
-microgravity
-37.8
-bhagat
-multivariate
-mcloughlin
-slumber
-dias
-ultralight
-benelux
-maxie
-mcas
-dss
-patuxent
-tristar
-anti-jewish
-uncovers
-uscgc
-reconsidered
-messing
-tvnz
-vaulting
-hohenzollern
-abreast
-c.f.
-yolanda
-city-based
-oddity
-home-and-away
-elphinstone
-37.1
-gst
-armin
-firework
-2025
-mee
-9mm
-tonality
-1509
-lynyrd
-creeds
-amicably
-polemical
-bagged
-36.9
-enz
-quirk
-hoe
-cleanly
-myrtaceae
-award-nominated
-leaching
-tulagi
-rtc
-omissions
-knopfler
-unlv
-gallus
-coldstream
-mendez
-fawkes
-arco
-kiran
-bakker
-anheuser-busch
-tunica
-lui
-handcuffed
-damming
-instilled
-tryptophan
-beaker
-ringwood
-34,000
-este
-taras
-mohr
-umpiring
-kut
-saif
-denbighshire
-aisha
-mawson
-non-trivial
-stroll
-conning
-gminas
-mainframes
-distributive
-pontificate
-najaf
-fairground
-a-levels
-aso
-juniata
-ingraham
-darwen
-goya
-ruckman
-alegre
-hijack
-2s
-roadie
-betterment
-ets
-devito
-sunnyvale
-melancholic
-jae
-albano
-murong
-incontinence
-blockhouse
-anabolic
-kaunas
-amigaos
-ricochet
-2.32
-tokusatsu
-schoolgirl
-702
-alkaloid
-resuscitation
-underhill
-ainu
-wilkie
-fiedler
-1511
-doorways
-parthenon
-fiduciary
-hoyle
-ivey
-alexey
-smallwood
-partridges
-dolce
-glycerol
-third-generation
-gpo
-second-class
-kinda
-eustis
-milliken
-ej
-taff
-abdur
-633
-condoleezza
-fulani
-rayburn
-645
-aron
-superstitions
-tydfil
-beep
-double-decker
-klf
-bab
-ulf
-angler
-billington
-gump
-koizumi
-orinoco
-andrade
-leanne
-payback
-toi
-siouxsie
-vivendi
-bathhouse
-huckleberry
-costner
-daw
-apologizing
-buffs
-reconciling
-gremlin
-robison
-cannery
-csn
-curators
-racehorses
-chancellorsville
-semi-circular
-achille
-wayans
-monocoque
-abrasion
-porosity
-impersonator
-mid-range
-krieger
-lard
-okayama
-mobilised
-ici
-ans
-incompatibility
-drizzt
-berserk
-involuntarily
-refuges
-spoofing
-county-level
-2-yard
-mite
-ammonite
-wendover
-practicality
-powertrain
-keiko
-rumi
-auntie
-waxy
-villager
-french-canadian
-clarifying
-invulnerable
-foals
-fervor
-pele
-1995-1996
-mishaps
-savile
-cusp
-12-year
-swarming
-moa
-1.75
-outta
-santee
-hypothermia
-merciful
-1974-75
-kitson
-cardinality
-munroe
-mailbox
-dmx
-underbelly
-hafiz
-ischemia
-nordstrom
-epicenter
-annabelle
-derg
-zigzag
-dedham
-m.b.a.
-yeh
-aleksander
-hetman
-ajit
-maurya
-streaked
-peshwa
-roethlisberger
-halford
-pigmentation
-beardsley
-brickyard
-nutritious
-laila
-cirencester
-teo
-abiding
-tetsuya
-r3
-anzio
-ousting
-hesiod
-claxton
-responder
-diogenes
-microscopes
-nightshade
-wham
-troilus
-equus
-reprinting
-aurelia
-2.29
-cichlids
-refectory
-tunney
-preyed
-donning
-effeminate
-hi-tech
-rattlers
-nortel
-bolger
-owings
-anti-japanese
-athol
-oceanside
-six-cylinder
-drusilla
-tlingit
-pan-african
-two-party
-melaka
-foundries
-wavelet
-wollaston
-gad
-ucsd
-birding
-al-aqsa
-1973-74
-104.9
-trusses
-starkey
-inflected
-bledsoe
-ferrying
-viewfinder
-misfortunes
-melina
-arif
-misrepresentation
-clementine
-nairn
-fulltime
-foundered
-disenfranchised
-lismore
-conservationists
-arnaud
-replicating
-cpsu
-pegg
-uconn
-simba
-mariposa
-paratrooper
-pentateuch
-artur
-trisha
-greggs
-kraftwerk
-dodo
-predestination
-habitually
-ostia
-denizens
-reclining
-itu-t
-mizrahi
-inbreeding
-dived
-tps
-silverton
-polis
-utes
-kabbalistic
-non-western
-landline
-polluting
-newbridge
-tailoring
-leaky
-well-documented
-witham
-refreshment
-vas
-indecisive
-braithwaite
-polypeptide
-weatherman
-rifling
-snowdon
-proxies
-outweigh
-pb
-uwe
-highschool
-vivien
-oban
-engendered
-starman
-aerobics
-haplotype
-obeying
-chronically
-epc
-roblin
-phenol
-thiele
-instrumentalists
-pbl
-composting
-vm
-macromedia
-bested
-reloading
-1,900
-outgrew
-witwatersrand
-greenleaf
-creams
-eurostar
-anima
-joyner
-porno
-kure
-rollin
-earners
-injective
-selectable
-gorkha
-akers
-checksum
-redshift
-bilal
-schenker
-borgia
-506
-casio
-mid-table
-decorator
-rivaled
-lugo
-avenging
-squeezing
-succumb
-perceiving
-yarns
-two-month
-half-mile
-cholmondeley
-orang
-vacationing
-comedienne
-monro
-singaporeans
-progesterone
-bloodless
-marinas
-great-uncle
-akram
-2099
-haru
-homeowner
-superintendents
-seizes
-marsupials
-scavengers
-entanglement
-griffins
-under-18
-beebe
-firstborn
-1495
-specializations
-prog
-lazar
-equidistant
-bolshoi
-bassists
-soong
-stela
-negotiators
-eh
-viciously
-ect
-avn
-ellicott
-perfecting
-mathers
-shrinkage
-vann
-montes
-seduces
-ragas
-deformity
-citibank
-nsdap
-ironside
-borghese
-cp/m
-lighthearted
-aor
-post-rock
-apostrophe
-vinod
-lcc
-coombe
-cricetidae
-platter
-unambiguously
-complying
-d'
-abington
-kidnappings
-nx
-sheena
-w.h.
-full-power
-cocks
-za
-faustus
-highfield
-kimble
-poblacion
-bosons
-under-17
-recites
-vistas
-falcone
-gce
-9.00
-sump
-ibid
-omnipotent
-keenly
-gipsy
-spacey
-ninian
-melanin
-hoya
-1:30
-takumi
-bidders
-randle
-bugatti
-tum
-buttresses
-yakovlev
-boxset
-circe
-jaden
-excision
-catamaran
-consults
-expulsions
-anglicised
-exuberant
-interrogations
-kayfabe
-left-leaning
-crazed
-pressuring
-42,000
-doubly
-lst
-clinician
-1992-1993
-greenwald
-dv
-londo
-10-20
-colonised
-brickwork
-winwood
-archon
-grampian
-overbearing
-muncie
-hingham
-shorebirds
-pertain
-bathed
-fitchburg
-hibbert
-preminger
-kollam
-olmec
-norristown
-whitbread
-nox
-psc
-aboriginals
-kumamoto
-tasha
-interrelated
-roadhouse
-timo
-radiance
-microcomputer
-bowyer
-398
-sharps
-yugoslavian
-rainbows
-jax
-mollusc
-banyan
-nourishment
-aether
-innuendo
-kankakee
-agrippina
-umass
-heine
-northgate
-fergusson
-negate
-translocation
-twenty-fourth
-chocolates
-pago
-bonita
-109th
-gleaned
-loam
-infamy
-repulsive
-pirated
-shigeru
-jacksonian
-439
-confidently
-matsui
-eq
-miletus
-tropicana
-bilingualism
-purview
-sophocles
-gilda
-reflux
-retaken
-smoothing
-pronouncing
-37.3
-svg
-redshirt
-coos
-centralization
-gristmill
-frusciante
-backwater
-zimmermann
-mcghee
-pearlman
-fouled
-wholesalers
-radnor
-maarten
-ner
-chipped
-barisan
-petitioning
-all-metal
-91.3
-evoking
-hardline
-omagh
-wonka
-oas
-unremarkable
-fabrizio
-102.7
-toyline
-two-stage
-angell
-pcbs
-oxfam
-chinatowns
-aeg
-kierkegaard
-ernakulam
-uptempo
-consumerism
-vajrayana
-paulsen
-toho
-rayne
-palmas
-hylton
-orbs
-morehouse
-cec
-maccabees
-447
-friary
-rappahannock
-edi
-disproved
-1295
-frankel
-bahr
-allude
-albini
-compensatory
-u-2
-iupac
-quails
-organists
-ovulation
-underlies
-pura
-liqueur
-trento
-528
-longwood
-fijians
-yeomen
-charon
-ormonde
-ohm
-ceos
-atr
-buzzing
-unjustly
-overfishing
-536
-walsingham
-urbanism
-sata
-hollies
-matinee
-rickman
-emsworth
-liquefied
-muscovy
-effluent
-fingernails
-datu
-arnulf
-jacobus
-skydiving
-westgate
-koo
-estefan
-mid-twentieth
-glock
-sitter
-tinto
-hemel
-7.30
-fermions
-foresaw
-warmest
-halstead
-danielson
-antiseptic
-1523
-cession
-vor
-heartless
-east-central
-blumenthal
-esso
-1990-1991
-dcs
-watermill
-weep
-caching
-abergavenny
-asparagus
-mid-90s
-decals
-walid
-disembodied
-counterattacks
-immorality
-suarez
-704
-typhoons
-lehrer
-midline
-cnet
-vice-captain
-stc
-procurator
-aggressiveness
-mpeg-2
-102.5
-gai
-wogan
-illusory
-hazing
-peake
-1981-82
-g5
-eck
-adjoint
-re-releases
-declension
-multi-disciplinary
-top-flight
-goetz
-cervix
-diaper
-hama
-undertones
-neustadt
-precipitating
-wexler
-zap
-breyer
-mid-18th
-souths
-problem-solving
-harnesses
-kapp
-bruising
-gerrit
-meeker
-candace
-pascoe
-bickering
-ruckus
-dup
-intercepts
-full-back
-by-laws
--5
-german-american
-emphatically
-laud
-faith-based
-elmhurst
-commandeered
-aharon
-refutation
-dramatists
-b.i.g.
-posner
-propellants
-out-of-town
-commodores
-malfunctions
-cannibals
-diwali
-fukushima
-passerines
-worrell
-phosphatase
-sangeet
-2-disc
-acf
-ise
-tripped
-non-sectarian
-accumulator
-strummer
-lashley
-understandings
-kyi
-shameful
-mid-level
-roque
-cappadocia
-bizarro
-rza
-millimeter
-biodegradable
-514
-conformational
-sportsperson
-93rd
-bihari
-tridentine
-infante
-full-page
-aylmer
-restyled
-wiser
-cuttlefish
-owned-and-operated
-existent
-abad
-recuperating
-unger
-rosebery
-keele
-prawn
-pan-european
-encodings
-1519
-linwood
-downplayed
-gander
-birthright
-congratulations
-jig
-lauda
-social-democratic
-squaw
-wollstonecraft
-tiff
-hilo
-dft
-excavating
-piney
-non-negative
-limited-edition
-pickets
-formula_47
-bootstrap
-ethnology
-a.h.
-largemouth
-premio
-s/he
-iff
-merman
-flutter
-detergents
-alchemical
-carpark
-90.9
-yamashita
-revives
-detainment
-co-exist
-mariam
-shortcuts
-limitless
-quartzite
-plasmid
-antioxidants
-sortied
-backfield
-obsolescence
-yumi
-inductance
-drysdale
-canines
-khulna
-imperator
-damme
-refereeing
-carruthers
-mid-way
-finnegan
-delany
-i-35
-tiverton
-checkout
-reprocessing
-faris
-mohawks
-rear-wheel
-tenements
-lovable
-nation-state
-37,000
-punctured
-selectmen
-wasserman
-laconia
-freamon
-detach
-marimba
-waveguide
-deux
-somber
-formula_49
-murky
-morten
-83rd
-incas
-mid-january
-numb
-cornice
-two-tier
-justifies
-ruck
-over-the-top
-elkhorn
-corzine
-3b
-galvanized
-kristy
-boycotts
-dennett
-hokkien
-bizkit
-lma
-spate
-dougal
-sdf
-500cc
-shafer
-tov
-617
-conklin
-agha
-rigveda
-penetrates
-redcliffe
-importers
-upheavals
-gleeson
-valladolid
-highest-ranking
-lugosi
-tyra
-xtra
-dalits
-larne
-dps
-dijon
-parkman
-santi
-tsingtao
-gish
-trna
-specialities
-alasdair
-pds
-u-shaped
-croton
-batchelor
-theologically
-out-of-print
-altamont
-courtesan
-armidale
-1up
-riverton
-noriega
-gharana
-tartar
-mid-february
-1160
-bromsgrove
-deming
-retrofit
-smc
-slowdown
-gba
-absorbers
-liberators
-gaulish
-foxe
-saucers
-exarchate
-blackboard
-emmylou
-drax
-pvp
-moderators
-moll
-despises
-uncontested
-vivaldi
-memorized
-pki
-abrahamic
-unchecked
-lpg
-spat
-shanty
-pane
-zhen
-aran
-rialto
-ravel
-dilip
-spiro
-bozeman
-carina
-worksop
-facilitator
-fluctuation
-falstaff
-hydration
-concubines
-675
-grahame
-madhu
-adapts
-inhumane
-lillehammer
-echelons
-riel
-khwaja
-anglo-saxons
-bistro
-laurentian
-suburbia
-cloisters
-insulators
-supernovae
-blackouts
-16th-century
-dru
-forty-four
-hazelwood
-old-growth
-oscar-winning
-101.5
-physiotherapy
-bobsledder
-dac
-drivetrain
-mcwilliams
-unchallenged
-decryption
-sud
-monomers
-herder
-walken
-deanna
-ambience
-hoppers
-huck
-reston
-dames
-emphasising
-sowing
-2k
-thickets
-oglethorpe
-cushions
-branford
-ionosphere
-chandragupta
-k-6
-absences
-libra
-zoey
-haired
-treasured
-command-line
-soca
-ota
-518
-formula_48
-urs
-minefield
-hossain
-natchitoches
-ballymena
-wilber
-connexion
-carlito
-goaltenders
-potash
-luxuries
-secretary-treasurer
-crumble
-fairytale
-eamon
-harz
-avalanches
-asad
-hobbyist
-csf
-elfman
-giza
-skylark
-caster
-unfriendly
-crematoria
-factored
-relaying
-438
-infringing
-medallions
-custom-made
-smothers
-faq
-evidence-based
-appalachians
-cling
-iridescent
-devious
-self-interest
-asb
-rui
-rechristened
-modulator
-plumb
-argento
-burdened
-livonian
-alphanumeric
-three-letter
-parables
-gazprom
-excels
-beanie
-coverings
-h.g.
-mizoram
-facilitation
-cote
-nagas
-gcses
-savant
-daltrey
-cinque
-rios
-desertification
-rh
-crm
-colley
-hoffa
-13-year-old
-mini-album
-apostate
-beckwith
-solemnly
-r.c.
-boynton
-mountainside
-huygens
-pecos
-467
-machete
-bharata
-meltzer
-stover
-caveat
-sissy
-recife
-soak
-wormwood
-sama
-novitiate
-31,000
-e-learning
-etheridge
-plunger
-canceling
-telekinetic
-concurring
-subtracting
-shubert
-intensities
-wellman
-rotorua
-cordell
-healers
-airbag
-feist
-impair
-noe
-neely
-lennie
-alumina
-1979-80
-4pm
-hooking
-expiry
-4-cylinder
-landless
-smokeless
-bikaner
-614
-aquifers
-494
-smithy
-dinar
-bodhisattva
-industrialised
-grafted
-undp
-downsized
-adversarial
-uploading
-hypotension
-cementing
-piezoelectric
-hashimoto
-dispel
-enamored
-mcewan
-searchers
-coulomb
-bassline
-phuket
-semi-finalists
-wrens
-yazid
-corregidor
-generalised
-ills
-5.00
-strangle
-fairing
-interrogate
-garber
-neva
-prototypical
-tetrahedral
-fec
-velasco
-yakov
-gorgon
-maxx
-toll-free
-akc
-integrative
-ariadne
-vellore
-theistic
-stop-motion
-gunsmoke
-responsiveness
-u.s.-based
-subjectivity
-redhill
-diphthongs
-cerritos
-spatially
-mop
-follicle
-instinctive
-workmanship
-zander
-trans-siberian
-106.3
-throats
-elongation
-equaliser
-osgoode
-gagarin
-jacoby
-magnified
-mcafee
-453
-six-man
-redondo
-seditious
-limassol
-pyrrhus
-icf
-enright
-bollinger
-montgomerie
-flirts
-slow-moving
-albatrosses
-philosophic
-luxor
-sleazy
-hin
-edouard
-elan
-privates
-characterizations
-leaguer
-asceticism
-demotion
-prelates
-occasioned
-capt
-abc-tv
-lun
-tweety
-camillo
-approximates
-snowstorm
-well-educated
-jughead
-chasers
-acclamation
-462
-guerillas
-encephalitis
-nanyang
-arn
-bisects
-basilan
-ferrous
-bagwell
-prodigal
-marketable
-sicilies
-mistresses
-equaled
-bromine
-steinbrenner
-oo
-bisons
-iva
-gallia
-lapwings
-combative
-derivations
-humanists
-slingshot
-conestoga
-pallava
-rosenbaum
-tarleton
-whalley
-bullard
-sembilan
-basse-normandie
-104.7
-carnarvon
-hearted
-shewa
-heme
-bramble
-donne
-bagpipe
-goons
-apportionment
-thankful
-streamlining
-ozarks
-brahmo
-depopulated
-calming
-subcommittees
-vegetarians
-newsreel
-kaoru
-macrae
-luxemburg
-terrorized
-yellowknife
-fortieth
-102.3
-quipped
-kristofferson
-gyrus
-monochromatic
-602
-philatelic
-goalkeepers
-schell
-bosnia-herzegovina
-pollutant
-zell
-corkscrew
-schafer
-supercharged
-prefrontal
-plug-ins
-ww
-gush
-yau
-labored
-worshipers
-gare
-nikolaus
-condensate
-fibrillation
-siddiqui
-i-15
-hippolyta
-tanjung
-warping
-agonists
-homeopathy
-calibers
-kindergartens
-15-20
-polypropylene
-leapt
-easley
-kilimanjaro
-archeologists
-impurity
-filibuster
-admiring
-porphyry
-principia
-health-care
-bountiful
-leash
-multi-cultural
-restorative
-rekindled
-bellaire
-534
-artiste
-sint
-heer
-trolleybuses
-machinima
-separatism
-488
-jaundice
-impersonation
-coll
-mccready
-eschatology
-ism
-cashel
-silencing
-laissez-faire
-manipulator
-ld
-warts
-exoskeleton
-kirkcaldy
-roadrunners
-cuddy
-olmert
-overuse
-piecemeal
-divya
-castellano
-geraint
-jama
-merkel
-co-executive
-yong-min
-secures
-hoff
-narcissus
-pauls
-whiteley
-chilliwack
-vittoria
-telly
-deepen
-newsprint
-herkimer
-105.9
-melayu
-hittites
-disordered
-inflate
-strathmore
-minamoto
-freeholders
-521
-labors
-novosibirsk
-a$
-egmont
-scilly
-enniskillen
-internees
-zirconium
-uplifted
-pohl
-lightness
-bosley
-seafront
-aguilar
-yoon
-undoing
-lassiter
-bianco
-curative
-evasive
-thurgood
-606
-kel
-conyers
-issn
-tigh
-imparts
-spars
-nsaids
-nanotubes
-steiger
-lichtenstein
-cutty
-104.3
-tse
-interceptors
-ransacked
-matisse
-3200
-veena
-qutb
-almaty
-scurvy
-torbay
-loathing
-energetically
-dodson
-lapland
-capp
-parlophone
-oliphant
-kaya
-maldon
-bombarding
-aurangabad
-coopers
-downtime
-abbasi
-faceless
-prolog
-immunization
-pino
-pinkish
-anglo-french
-burdon
-downsizing
-uneducated
-calvinists
-reiss
-paddling
-half-hourly
-jonathon
-deering
-karbala
-nav
-3,600
-cockroach
-nz$
-dinobots
-prospectus
-allenby
-alienate
-oliveira
-tempos
-coombs
-germanium
-chrysostom
-stagg
-epr
-edsel
-resurgent
-coenzyme
-algal
-lorries
-edelman
-p.c.
-cayley
-permissive
-baffled
-durrani
-pfa
-bloemfontein
-taffy
-reuptake
-rimes
-kozhikode
-crusoe
-mccool
-juridical
-hellboy
-summarize
-analogs
-braunschweig
-rebuked
-materialised
-polygraph
-compliments
-oeuvre
-in-store
-huo
-chechens
-dismissals
-denning
-djing
-pta
-eponym
-anemone
-cocked
-ponder
-537
-forfar
-killeen
-wyvern
-strafing
-versace
-bagram
-jeffreys
-chiles
-exclaims
-motility
-understory
-malin
-distilleries
-bailout
-nebuchadnezzar
-tree-lined
-ionia
-dashes
-d'amato
-chiapas
-gurkhas
-3.21
-blistering
-necklaces
-juxtaposed
-major-league
-disclosing
-dorje
-samaritans
-concomitant
-homeostasis
-freda
-liv
-watauga
-instituto
-lingered
-nefarious
-dockers
-berle
-dervish
-caldecott
-compostela
-focussing
-site-specific
-devereux
-incognito
-derailment
-polyps
-non-european
-lcms
-feeney
-romo
-trachea
-apec
-bombardments
-prabhu
-inclement
-sfsr
-pentax
-refitting
-mellor
-mediating
-irkutsk
-komodo
-thickly
-533
-co-operatives
-corticosteroids
-1514
-seaports
-yeshivas
-lawrie
-pareto
-endicott
-admixture
-blindfolded
-dominoes
-prada
-prerequisites
-allard
-bassoons
-ghettos
-minibus
-chipmunk
-sucre
-phosphates
-formula_50
-teesside
-entrapment
-89.9
-infantile
-escobar
-antipathy
-unexploded
-toasted
-maneuvered
-miri
-hickok
-beckley
-etna
-gigabit
-conveyance
-transposed
-inspectorate
-thrills
-1976-77
-scorecard
-reinvented
-ilyushin
-blighted
-otsego
-watertight
-predicates
-replete
-stoller
-rooks
-daydream
-607
-meer
-cooperates
-medics
-wga
-dedicates
-akon
-authorizes
-transceiver
-complementing
-rowlands
-msgr
-1140
-postbellum
-scented
-romulan
-symbolise
-swede
-kerman
-anti-personnel
-re-united
-socialize
-headlands
-3.22
-spectacularly
-kyoko
-aurelian
-faiz
-inoperable
-centralia
-lothar
-characterisation
-razorbacks
-diminishes
-speck
-androgen
-rebelling
-90.5
-culp
-supersport
-brim
-figurine
-dismembered
-foundling
-deepak
-jolson
-andrzej
-caballero
-subliminal
-gsa
-hynes
-jackals
-tyndale
-understated
-chia
-bursa
-urbino
-mixed-race
-bulbous
-gyro
-scapular
-gol
-cherie
-low-end
-ritually
-claypool
-morlocks
-indulged
-holdsworth
-epp
-poking
-ctu
-demobilization
-ruger
-gandalf
-altai
-neo-gothic
-inquired
-extinctions
-stalwarts
-medvedev
-reintroduce
-eddington
-paoli
-olbermann
-plame
-kristine
-possessor
-parrott
-buddhas
-90.7
-sandalwood
-moselle
-subhash
-basildon
-wolcott
-isoforms
-emmaus
-monoid
-sailboats
-busing
-mockumentary
-bohemond
-mappings
-1977-78
-globalisation
-edina
-amba
-mcgraw-hill
-mpg
-lollipop
-scoreline
-skylab
-reddish-brown
-apollonius
-respectability
-superfund
-ttl
-hippolytus
-enos
-sumatran
-highest-grossing
-eberhard
-pk
-pauling
-0.03
-booting
-post-colonial
-proviso
-al-andalus
-fitzsimmons
-congratulate
-multnomah
-spacewalk
-towpath
-incitement
-hansa
-rectangles
-re-enactment
-valera
-impersonal
-pettigrew
-shadowed
-umbria
-mastodon
-excused
-murfreesboro
-vocalizations
-cq
-jeffersonville
-storyboard
-ewe
-cabling
-lookouts
-1967-68
-abney
-faire
-menominee
-deleterious
-psoriasis
-biao
-voor
-38.6
-m8
-muck
-umts
-brevity
-xxxx
-popularization
-hangover
-sherborne
-98.9
-ber
-subordination
-redwoods
-eda
-natwest
-candlestick
-mcclatchy
-dahlgren
-scarp
-featurette
-hexagon
-advaita
-dunkeld
-blower
-cochlear
-aomori
-etymological
-2006-2008
-bayes
-cautions
-lupe
-1390
-hamdan
-topple
-nagai
-caernarfon
-retrial
-adsorption
-webcam
-nepotism
-11pm
-commutation
-sprouts
-taoiseach
-evo
-coals
-escapement
-renzo
-retrospectively
-baptisms
-watermark
-azam
-qwest
-tuareg
-malo
-petitioner
-cartooning
-lumps
-stitched
-adolfo
-objector
-468
-ebu
-memento
-neurologist
-chiron
-firings
-cocos
-etruria
-super-powered
-biophysics
-roslyn
-bonny
-madly
-deformities
-aviary
-provident
-635
-mano
-aleister
-37.0
-j.g.
-sigil
-cdf
-mises
-metalwork
-marbled
-certifying
-99.7
-rau
-csis
-remodel
-meagher
-carrara
-liguria
-sign-on
-stand-off
-ternary
-inappropriately
-ahom
-proclamations
-ven
-mullin
-linga
-nsu
-mauricio
-nordiques
-witte
-droid
-wald
-aaliyah
-1010
-forges
-blatantly
-actuator
-defensively
-wmd
-aristide
-swoop
-anathema
-eragon
-childlike
-molested
-38.2
-rollercoaster
-ranchi
-presque
-tms
-2003/04
-524
-shinjuku
-evaporate
-104th
-impunity
-suzuka
-olivet
-ugo
-6-10
-tekken
-sprocket
-namespace
-s.c.r.
-linfield
-anti-american
-h3
-shams
-disrupts
-pseudoscience
-d'etat
-tru
-emanuele
-methuen
-pontchartrain
-materialistic
-whitesnake
-gloom
-gerais
-skynet
-noose
-snohomish
-bulwark
-elitist
-exogenous
-classicism
-repose
-univac
-fruiting
-gilpin
-wretched
-carbines
-stung
-retriever
-moorings
-cleanse
-446
-urals
-lead-off
-savers
-disprove
-innis
-harvick
-bakar
-industrious
-532
-sandro
-sagittarius
-ijn
-verdes
-pardo
-bacillus
-hangout
-wildflower
-guinean
-maryam
-perverse
-mondale
-voids
-90.3
-umpired
-untouchable
-hairstyles
-2700
-nc6
-cichlid
-cyprian
-netware
-stratosphere
-freeform
-nia
-teleports
-glottal
-italo
-dictators
-injures
-1498
-turmeric
-diagnosing
-telekom
-steelhead
-erick
-subsidised
-blimp
-rocketry
-snicket
-sisko
-halides
-scythe
-thorny
-sil
-pate
-slats
-xor
-nhra
-holst
-autobahn
-pre-release
-outgrown
-ntv
-infantryman
-maurizio
-powderfinger
-c-47
-conservatorium
-thessalonica
-dilated
-non-coding
-velodrome
-hocking
-londoners
-fsc
-waukesha
-inconspicuous
-movin
-robles
-secondhand
-timon
-amphetamines
-464
-oxidizer
-remand
-corazon
-bingley
-dislocations
-newham
-masala
-o'day
-languished
-malton
-frisia
-lian
-calmed
-notated
-despondent
-high-temperature
-emacs
-kalgoorlie
-leyden
-10.30
-1-8
-39.5
-multi-user
-prinz
-448
-great-grandchildren
-fergie
-wirth
-opulent
-cantwell
-paraphrase
-waffen-ss
-chloroplasts
-slur
-weizmann
-biannual
-dissension
-lagging
-brawn
-acrobatics
-underlie
-ruggiero
-combinatorics
-minardi
-dribbling
-holyhead
-detonating
-gastropods
-purists
-debug
-fluorine
-epileptic
-undeniable
-karelian
-1040
-masterful
-neurosurgery
-murry
-lagged
-philosophically
-excised
-nitrite
-plying
-3.23
-council-manager
-ecg
-632
-connally
-convulsions
-holed
-anticipates
-lwt
-administrated
-dimensionless
-bernini
-domineering
-fatale
-couplings
-eriksson
-juke
-pinnacles
-eroding
-unhappiness
-kenyatta
-rind
-manukau
-whiteside
-playwriting
-methodical
-vangelis
-menopause
-cyborgs
-fob
-infallible
-behemoth
-wafers
-a.i.
-indiscriminately
-lr
-hammering
-wrongfully
-gawler
-ncs
-hildebrand
-tewkesbury
-alito
-adr
-ritualistic
-eschewed
-tights
-osamu
-corrupting
-satrap
-ramsar
-mayday
-reentered
-baluchistan
-unknowns
-farber
-massoud
-sadiq
-handyman
-squarepants
-8-0
-quinton
-bullwinkle
-auvergne
-overlords
-ventricles
-nineveh
-cacao
-bernd
-nucleophilic
-semi-regular
-khazars
-thermometer
-cooperstown
-gaither
-pox
-featureless
-clwyd
-quack
-doer
-622
-hillbillies
-setups
-hypersensitivity
-gerd
-outage
-backside
-vajpayee
-ferndale
-cot
-ecotourism
-durst
-self-destruct
-porpoises
-bushfires
-binocular
-maelstrom
-1505
-91.1
-orangeville
-dbs
-a.g.
-spool
-autopilot
-srinivasa
-shalt
-charan
-atherosclerosis
-value-added
-lawfully
-cyclotron
-gwynne
-jabalpur
-cleland
-anti-nazi
-inguinal
-bartow
-willesden
-belton
-schillinger
-taillights
-blasters
-thunderball
-reconstructive
-clippings
-penarth
-crept
-german-born
-reconstructionist
-quadrilateral
-qualcomm
-wemyss
-ronny
-zune
-xhtml
-esprit
-yearning
-quirks
-propriety
-cataclysm
-prius
-lampooned
-winchell
-crosley
-embolism
-guzman
-breckenridge
-vermeer
-debaters
-sasaki
-komnenos
-looser
-relish
-intracranial
-indoctrination
-wap
-madigan
-dor
-frodo
-singularities
-107.1
-broussard
-celestine
-symbolises
-titian
-brooker
-thermopylae
-sub-divided
-nanoparticles
-foyt
-pantry
-petroglyphs
-38.9
-adiabatic
-108th
-astride
-hard-hitting
-blink-182
-brno
-baywatch
-christianization
-anti-discrimination
-contraceptives
-25-year
-dulce
-meera
-seaquest
-georgy
-bestsellers
-susa
-beda
-lipscomb
-ces
-soriano
-telemark
-creeper
-canter
-amass
-broomfield
-nipissing
-well-trained
-spillane
-korps
-twente
-enriching
-jindal
-antipope
-meld
-waive
-power-up
-kenshin
-sketching
-blavatsky
-37,500
-paraguayan
-mdr
-lockport
-scaly
-disrespectful
-airlock
-belvoir
-merchantmen
-kinsella
-xxiv
-heliport
-westcott
-primers
-panes
-dative
-nutty
-lollapalooza
-superdome
-astroturf
-kanyakumari
-37.2
-rajan
-rena
-satanism
-imitators
-patrimony
-felixstowe
-mallee
-winless
-senseless
-archaea
-hayman
-89.5
-dacian
-ioannis
-vyacheslav
-sidekicks
-ugc
-camry
-disembark
-permafrost
-pharyngeal
-vermillion
-ecologist
-gwynn
-donatello
-ablation
-473
-471
-clr
-managua
-stalag
-combustible
-infra-red
-tachyon
-boll
-csl
-fingerprinting
-bunt
-sheaves
-grayling
-english-born
-optometry
-palatal
-rbc
-belgaum
-functors
-aeroplanes
-arakan
-survivability
-ast
-couplets
-standardisation
-odom
-lucio
-marauder
-cynicism
-iia
-shrubby
-stockman
-815
-ikea
-transitway
-alanis
-perelman
-vortices
-defaults
-narada
-crump
-strontium
-palumbo
-merion
-skilful
-headley
-justifiable
-allerton
-draftsman
-mech
-kelson
-inking
-imogen
-veal
-supra
-bosniaks
-buckwheat
-griffon
-chatting
-bootlegging
-encloses
-c.w.
-signet
-glinda
-barham
-conformed
-kilgore
-attenuated
-petaluma
-loggers
-mislead
-toussaint
-counsels
-nematodes
-mechelen
-g6
-trinitarian
-pes
-mauzas/mahallas
-comanches
-serialization
-hainaut
-aquileia
-contras
-tailgate
-satya
-digidestined
-annular
-sauron
-superspeedway
-garrard
-souter
-ocean-going
-degeneres
-overhanging
-azusa
-leonards
-low-pressure
-venezia
-grindcore
-rebus
-repelling
-arouse
-resistive
-prefecture-level
-8.00
-re-built
-porgy
-shima
-vanier
-shevchenko
-giselle
-shetty
-relinquishing
-redhead
-mj
-puss
-secretarial
-ecoregions
-pungent
-cci
-37.9
-biltmore
-bennie
-affording
-cfc
-novara
-erfurt
-straightened
-puffy
-tertullian
-aspires
-dms
-ariane
-placekicker
-storehouse
-1360
-ficus
-infantrymen
-berea
-clontarf
-stingrays
-polytope
-plantain
-steelworks
-channeling
-gust
-p53
-edgewater
-llyn
-whiz
-majapahit
-despatched
-at-bat
-appreciable
-nipples
-tris
-moksha
-pandava
-6-12
-utilises
-teng
-moretti
-rodger
-dissipating
-monck
-ysgol
-bewildered
-equilibria
-interrogators
-a.a.
-newsnight
-inverter
-mmorpgs
-fetuses
-danilo
-defaulted
-utilitarianism
-furies
-crowell
-474
-brickworks
-crawler
-objectors
-roskilde
-cosine
-ntt
-athrun
-clove
-bridgetown
-cucumbers
-top-selling
-crescendo
-canadian-born
-isidro
-seeley
-unconditionally
-ahn
-braden
-ursus
-lurking
-protectionism
-simi
-7-1
-histology
-hezekiah
-handsworth
-compositing
-riddick
-106.9
-darshan
-hargrove
-4x
-chandelier
-morin
-dissented
-hvac
-jean-marie
-perumal
-clapping
-morro
-stingers
-self-made
-nonconformist
-knockouts
-intents
-benji
-bouchard
-perestroika
-oligarchy
-playboys
-spoilers
-lyases
-shiga
-liberally
-clausura
-grigory
-glassware
-simcity
-anz
-tumours
-ry
-scepticism
-halliburton
-shoreham
-spyder
-prue
-dist
-co-opted
-nin
-kirkham
-amiable
-sdl
-libertarianism
-461
-limpopo
-annexing
-daisuke
-roo
-rerecorded
-ishaq
-1291
-intensification
-howards
-captivated
-dishonesty
-yancey
-reconquest
-byproducts
-axiomatic
-canfield
-rann
-subjunctive
-stettin
-plantar
-41,250
-0.04
-generality
-pakenham
-inhumans
-unsecured
-bracing
-driest
-'cause
-glencoe
-insemination
-burnie
-maples
-liege
-hermon
-aeolian
-rostock
-hitchin
-tho
-outnumber
-vieira
-even-numbered
-self-reliance
-trappings
-nand
-agincourt
-domicile
-sunnah
-pomegranate
-delenn
-gametes
-mezzo-soprano
-ventilated
-centaurs
-wolds
-brickell
-pediment
-craterlets
-trigonometry
-mid-march
-fundy
-whitcomb
-shrinks
-comic-con
-magnetization
-equating
-singer/guitarist
-saltire
-m&m
-starships
-cassel
-adverbs
-acyl
-fuentes
-crystallized
-north-northeast
-z80
-newland
-alumna
-daugherty
-evolutions
-89.3
-octopuses
-academical
-wonderfully
-noh
-drowns
-shamrocks
-kincardine
-a-z
-tailors
-vented
-balaji
-anteriorly
-billet
-alderson
-smack
-minaret
-supersymmetry
-vander
-steers
-sakurai
-sandal
-37.6
-landfills
-1493
-29,000
-wnbc
-belk
-selden
-ashworth
-alabaster
-bottlenose
-toddlers
-alternator
-kennington
-frontenac
-minion
-k-league
-chairlift
-dominguez
-blakely
-compuserve
-caliban
-escapees
-turpentine
-sufferer
-alberti
-meteorologists
-fingal
-under-23
-90.1
-pretentious
-marl
-caveman
-seddon
-responsa
-caterina
-bangles
-disguising
-wedlock
-88.9
-laid-back
-img
-sixpence
-torts
-irritable
-starfighter
-abbeville
-yui
-hypocritical
-willey
-920
-acrobat
-1502
-malmesbury
-counterweight
-schneerson
--6
-reorganizing
-unquestionably
-casings
-6.00
-rodeos
-loosened
-cistern
-concurrence
-colonials
-parque
-petre
-skateboarder
-heparin
-biceps
-thelonious
-tweaked
-458
-gucci
-dmc
-diversions
-551
-slamming
-nevsky
-narbonne
-2009/2010
-infiltrates
-eyelid
-fenn
-ryukyu
-designator
-proscribed
-gujrat
-cocteau
-craving
-godard
-ssa
-stylings
--10
-dostoevsky
-bushido
-coldwater
-qingdao
-lifeguards
-k-2
-lanao
-divisors
-pre-determined
-tiring
-bruford
-savory
-universalism
-takeaway
-corliss
-rj
-divest
-willful
-proscenium
-re-created
-tilley
-usury
-nda
-triptych
-divider
-szlachta
-ahmadiyya
-attainable
-homicidal
-lances
-promulgation
-gory
-manitou
-windfall
-103.5
-inker
-choate
-gec
-clueless
-creationists
-beets
-chroma
-drugstore
-dandelion
-lumsden
-seedy
-velma
-mainstays
-refounded
-snowdonia
-38.4
-4-speed
-beleaguered
-re-appointed
-552
-deterred
-bumping
-2-door
-norden
-achaea
-glc
-impersonated
-565
-namesakes
-boop
-humpback
-fealty
-moorcock
-oligocene
-swirl
-kokomo
-wurlitzer
-continuance
-walnuts
-recruiters
-3,300
-sneaked
-rosanna
-nsb
-b4
-galerie
-sourcebook
-transgenic
-goro
-unintelligible
-foils
-sherri
-humming
-strewn
-106.5
-fogerty
-shinty
-vijayawada
-ayumi
-moonstone
-bracknell
-nettles
-espy
-devotions
-lapses
-microtubules
-viterbo
-incised
-racquetball
-getz
-chaste
-cowes
-krill
-litters
-fast-moving
-fetched
-sasquatch
-draughtsman
-cetacea
-suave
-metastatic
-lain
-shimbun
-sankt
-khali
-sapp
-satyajit
-twenty-sixth
-ever-changing
-re-written
-prescriptive
-citv
-overruns
-redacted
-fullness
-grieg
-earth-two
-bernal
-loner
-pretensions
-rhee
-cebuano
-wailers
-maldonado
-bannerman
-irv
-portobello
-goodrem
-nesmith
-celery
-lusaka
-urns
-machina
-107.5
-459
-arborea
-indomitable
-pascagoula
-jessup
-staphylococcus
-flogging
-airlifted
-revues
-professing
-zemo
-sparc
-grumpy
-larceny
-fetishism
-supercomputers
-eyeball
-smokes
-excelling
-in-state
-diapers
-thais
-quagmire
-cortisol
-cushman
-paola
-demilitarized
-gilly
-werribee
-volpe
-peart
-ingesting
-aeroflot
-defenseless
-photojournalist
-videotapes
-self-styled
-impasse
-earthy
-huntingdonshire
-vampiric
-vestigial
-olivine
-recur
-granary
-dodging
-grubb
-backbencher
-1-2-3
-pallets
-barrens
-choreographers
-sorensen
-cpl
-aud$
-hartland
-government-sponsored
-co-hosting
-tancred
-825
-philanthropists
-v-2
-mestizo
-300th
-categorically
-gatos
-radiative
-splice
-basements
-hartwell
-bak
-outages
-538
-tic
-divinely
-giffen
-insubordination
-104.1
-amalia
-f-22
-reparation
-couriers
-bw
-shortness
-kiwanis
-glades
-89.7
-record-setting
-clanton
-vcr
-vimy
-tickle
-barrack
-rcs
-empiricism
-chaffee
-haasan
-bukovina
-8.30
-mid-2009
-haney
-readability
-pentathlon
-cmu
-chevrons
-annales
-testifies
-sacco
-kc-135
-blacktown
-hopkinson
-gatling
-beecham
-amalfi
-moroni
-equities
-condos
-sub-district
-edsa
-mpi
-honeyeater
-romansh
-kisan
-mauryan
-arezzo
-wakayama
-albee
-haworth
-,3
-srs
-laver
-shing
-regattas
-three-week
-smoothed
-folkloric
-formula_51
-horsley
-palacio
-fahd
-limes
-ile
-storybook
-tiberias
-godhead
-amusements
-tengku
-ismaili
-sncf
-placenames
-babysitter
-cutie
-transponders
-ruthenian
-three-phase
-tralee
-probationary
-neuter
-coulthard
-fb
-a-1
-leiber
-mame
-leatherhead
-hereby
-crossley
-unrecorded
-scat
-iskcon
-mewar
-orthopaedic
-bayshore
-107.9
-fleur
-sweeper
-chime
-encapsulation
-abundantly
-469
-amulets
-soldering
-miao
-fireballs
-gnr
-trang
-typographical
-retconned
-1999-2001
-harbhajan
-feigned
-redcar
-moltke
-reconnect
-obtainable
-argumentation
-moab
-hondo
-samuelson
-hodgkin
-grocers
-dragonflies
-one-fifth
-multidimensional
-unitas
-masquerading
-accomplishes
-kripke
-factorial
-advertises
-vols.
-underpinnings
-setter
-crucially
-o.j.
-faints
-tcm
-segundo
-specs
-encephalopathy
-vestal
-barbs
-galle
-deviates
-lenoir
-478
-patronized
-douai
-playmates
-bhatt
-paypal
-steyr
-frege
-stormwater
-anamorphic
-tailings
-rhoads
-mayonnaise
-urination
-oppenheim
-fantastical
-earthworms
-pater
-shopper
-caper
-caregiver
-lumberjack
-six-week
-phenomenological
-monotone
-squarely
-kn
-huw
-luminance
-cotabato
-covenanters
-gott
-prepositions
-vicars
-impersonate
-virulence
-hov
-whit
-prosser
-heartbreakers
-brion
-afferent
-blot
-selfishness
-shimizu
-incubated
-dunblane
-refurbishing
-syndicates
-unions/wards
-brier
-prokaryotes
-jettisoned
-usps
-matrimonial
-ht
-legalize
-sxsw
-water-soluble
-virtua
-{
-cotter
-cranium
-ruthlessly
-talkin
-frieda
-olden
-nkrumah
-metropolitans
-streamers
-knut
-threading
-1506
-bufonidae
-arecaceae
-andronikos
-plucking
-brevis
-north-northwest
-oracles
-asperger
-hospitalised
-hotbed
-103.9
-545
-codice_7
-lemay
-piotr
-corvallis
-non-members
-girth
-lewd
-tallulah
-asd
-0-3
-bicyclists
-caerphilly
-beautification
-carlotta
-embattled
-discerned
-keying
-imbalances
-fifty-five
-arabella
-sten
-anatomically
-csiro
-galleon
-fittipaldi
-marwan
-colorectal
-aeronautica
-2003-2005
-four-stroke
-phrygian
-shekhar
-krystal
-ocd
-vocations
-schott
-sculptured
-j.h.
-cormac
-nous
-holyfield
-sommer
-zayed
-housatonic
-purporting
-dogwood
-huntress
-northerners
-4-4
-fontainebleau
-transgression
-randwick
-shimla
-paxson
-perforation
-scissor
-matins
-semblance
-halesowen
-doj
-chub
-parcells
-elinor
-extruded
-blowout
-usurp
-1327
-fibreglass
-michener
-pandas
-blakey
-tigre
-kaiserslautern
-enchantment
-feedstock
-paulson
-allin
-rbs
-splintered
-w.c.
-uscg
-platypus
-geeks
-single-handed
-dreamland
-harnessed
-ex-officio
-putty
-papaya
-shura
-mayoralty
-meth
-long-legged
-hellas
-teaneck
-rajas
-tsushima
-trafficked
-rothbard
-coupler
-cd4
-mid-century
-dismissive
-ucce
-mandating
-subtract
-antipsychotic
-baronial
-changeover
-brendon
-abduct
-7-eleven
-forty-six
-ashtabula
-2.28
-catapults
-spangled
-annenberg
-liars
-nuovo
-dempster
-trapp
-methadone
-armature
-sub-divisions
-melodramatic
-garhwal
-schmid
-chaining
-magik
-compressive
-fine-grained
-shapur
-bochum
-apricot
-cutscenes
-major-label
-scheldt
-snows
-davros
-bluebell
-multiplexed
-cherish
-levu
-102.9
-direct3d
-emf
-septimus
-lioness
-narendra
-grattan
-palomar
-panay
-non-jews
-armani
-exchangers
-presser
-fizz
-bactria
-comiskey
-purbeck
-garson
-636
-zahir
-thyme
-comforted
-able-bodied
-elway
-unstressed
-pennants
-9.30
-inflorescences
-berra
-southwick
-dihedral
-reprimand
-prithvi
-washoe
-hauptmann
-fracturing
-tilak
-phone-in
-caving
-constructivism
-dordrecht
-sub-group
-fellini
-rooftops
-westlife
-scaring
-cyanobacteria
-jtwc
-pattison
-ratcliffe
-acuity
-craton
-grc
-correspondences
-2002-2004
-479
-akiva
-glycoprotein
-talkie
-dampers
-ceding
-kosmos
-diphosphate
-2.27
-credo
-qianlong
-changer
-mini-con
-denman
-aeschylus
-rushmore
-mehmet
-dragnet
-exmoor
-yulia
-visakhapatnam
-g7
-ecb
-socialized
-ashburn
-obituaries
-west-northwest
-recreations
-geostationary
-emotive
-avocado
-belli
-riba
-37.7
-nonviolence
-geometrically
-gaby
-nestorian
-martindale
-rowell
-codices
-huan
-nabisco
-warangal
-madea
-tubby
-nzr
-104.5
-vaishnavism
-podiums
-biathlon
-desalination
-whoopi
-constructivist
-shader
-recruiter
-zoroastrians
-molestation
-lino
-anachronism
-humerus
-hummel
-durations
-openbsd
-hamad
-magyars
-bhushan
-cbgb
-m60
-larue
-macapagal-arroyo
-tinge
-lyndhurst
-1972-73
-prodi
-cygnus
-flange
-carnal
-aphasia
-atlantean
-rushden
-ozzfest
-roughness
-elca
-autonomously
-cadmus
-sindhis
-pre-crisis
-wiz
-nickelback
-montebello
-acidosis
-songbirds
-businessweek
-uninsured
-woodard
-exemplify
-ons
-livelihoods
-barrichello
-ardeidae
-ramen
-bombshell
-damselfly
-castletown
-hapkido
-behan
-gros
-redneck
-arf
-universiteit
-hein
-iguanas
-sportswriters
-gtr
-harrelson
-meru
-collaboratively
-orford
-repo
-daventry
-keng
-spi
-blandford
-tauranga
-brt
-haslam
-jalandhar
-showman
-calorie
-alois
-grierson
-taku
-+DGDGDGDG
-seducing
-grime
-underscore
-kampuchea
-stakeholder
-think-tank
-indo-pacific
-octet
-star-spangled
-scorned
-jabal
-guyed
-hamsters
-shoved
-shapeshifting
-spoofs
-nothin
-the.net
-differentials
-yomiuri
-cataracts
-well-connected
-annihilate
-gamal
-unwitting
-distillers
-origination
-sargon
-fogg
-amundsen
-els
-3pm
-amigos
-receptacle
-psychedelia
-litany
-1070
-unwise
-forty-three
-niobe
-flashed
-waffle
-bse
-keratin
-non-consecutive
-homeward
-1282
-laryngeal
-programmatic
-unequivocally
-borromeo
-semple
-meijer
-aum
-flume
-ramallah
-kolhapur
-kuo
-codice_8
-trampoline
-astronautics
-lexie
-webmaster
-lucretia
-disestablished
-bonin
-tamarind
-forty-one
-palladio
-film-maker
-adjudication
-3.30
-crowes
-injects
-gelderland
-personages
-jeannette
-jugular
-verlag
-esperanza
-pyre
-foreskin
-right-hander
-clades
-liger
-levon
-sambo
-mauretania
-chancellors
-amo
-british-born
-oud
-instrumentalist
-traceable
-burghers
-4-h
-bohs
-match-up
-pharynx
-usk
-minter
-utterances
-dispersing
-multicellular
-starry
-50/50
-east-northeast
-x-man
-rcc
-berthold
-496
-493
-musique
-beria
-separations
-uda
-occitan
-hearty
-1234
-taiga
-bbl
-defaced
-483
-kaleidoscope
-sarcasm
-mut
-ktla
-sathya
-harte
-fifty-two
-pairwise
-astonishment
-ent
-638
-vagabond
-q3
-diacritics
-1307
-mies
-high-power
-breuer
-therapeutics
-co-ordinating
-magyar
-weaves
-hit-and-run
-costal
-franchising
-914
-skew
-b-2
-foursome
-lofts
-semi-autonomous
-chaitanya
-hammock
-earthenware
-riser
-elko
-bottom-up
-homesteads
-knudsen
-microchip
-thoth
-twinkle
-amide
-wreak
-sepoys
-pampa
-excavate
-655
-humiliate
-ludhiana
-muskoka
-ihsa
-potteries
-idlewild
-allegiances
-capitalizing
-transpose
-sem
-farthing
-neanderthals
-unfulfilled
-sleuth
-voracious
-mcnab
-vice-presidents
-phitsanulok
-nh3
-superfortress
-doorman
-cursing
-summarizing
-inaccurately
-jaeger
-lamenting
-fatimid
-mallow
-9-0
-ansar
-malla
-infrastructural
-bom
-eves
-proportionality
-immigrate
-synths
-rectal
-yevgeny
-beatle
-wolverton
-televisa
-meddling
-petitioners
-ishii
-mollie
-mid-day
-descriptors
-sepia
-num
-erectus
-uric
-mayberry
-epidermal
-o'brian
-udaipur
-organelles
-high-voltage
-491
-sopwith
-arquette
-2.26
-concocted
-computes
-pentagram
-brainstem
-morbius
-inexplicable
-four-track
-insistent
-rifts
-brownlee
-yad
-bituminous
-tigray
-innervation
-all-night
-okrug
-sepsis
-noc
-luzerne
-stings
-sinusoidal
-bachmann
-geist
-rangel
-1993-1994
-561
-issuers
-teapot
-bhangra
-canaries
-trevelyan
-pinckney
-cbse
-sennett
-crs
-two-and-a-half
-carpal
-eko
-thatch
-decompose
-melamine
-mountjoy
-hardwicke
-fasteners
-chios
-gonzalo
-swayed
-squatting
-mangoes
-seto
-103rd
-cost-cutting
-gannets
-precautionary
-constriction
-grunt
-1508
-videogames
-c.d.
-bechtel
-gk
-kimono
-tilapia
-neverland
-p-38
-saarland
-rube
-642
-malachi
-mab
-grasped
-paok
-tetrahedron
-denunciation
-dunstable
-goan
-ipoh
-boac
-poodle
-627
-diffused
-iliac
-five-minute
-webcomics
-mimicked
-loughton
-loudness
-chesham
-yucatan
-cobol
-dolph
-zooplankton
-mrc
-partizan
-milltown
-ima
-downwind
-1991-1992
-indistinct
-valenti
-belasco
-664
-rupp
-white-collar
-benchley
-trw
-rangefinder
-anti-hero
-chua
-invades
-giver
-endoplasmic
-belvidere
-predisposition
-catchphrases
-ria
-fifi
-unicorns
-ilex
-dispossessed
-timeout
-humbert
-trott
-kemal
-boomtown
-lansky
-cpp
-quaternions
-installs
-gagne
-yemenite
-jayhawks
-hillock
-thruster
-chloroform
-hardwoods
-remixing
-hom
-adipose
-meagre
-ryanair
-acronyms
-histone
-rivet
-carrickfergus
-driftwood
-bahn
-twin-engined
-wsu
-endeavored
-kenan
-rpf
-shuffling
-harbored
-ironhide
-baronetcies
-raistlin
-seasoning
-duchamp
-amstel
-pro-democracy
-eurythmics
-526
-memorize
-saar
-minima
-glittering
-blofeld
-prong
-prunus
-antonin
-titleholder
-schiavo
-dc-10
-crowbar
-elon
-wikis
-zaman
-ktm
-microcontroller
-constitutive
-riddler
-nicolai
-marooned
-ohrid
-curiosities
-dans
-stale
-p.i.
-frenchmen
-mccloskey
-firewalls
-vo
-serialised
-clouseau
-northam
-7.00
-storer
-liveries
-abbie
-alex.
-ukip
-custodial
-buckles
-homeported
-extirpated
-godley
-martens
-burners
-self-service
-maneuverable
-obi
-2pac
-bookkeeper
-transfiguration
-goswami
-mayweather
-formula_52
-circadian
-tuam
-cng
-attentive
-great-granddaughter
-ovals
-jewett
-adenine
-corsairs
-cava
-adjudged
-midterm
-locusts
-grandview
-greenspan
-fp
-2pts
-aacccc
-duisburg
-leisurely
-1225
-conceptualized
-dungannon
-defector
-cabana
-498
-21-year-old
-kostas
-abl
-newly-built
-kita
-1499
-cadogan
-maung
-kazuo
-anderton
-strathfield
-offshoots
-609
-tearfully
-oregonian
-vitreous
-bermondsey
-rethink
-rewa
-macneil
-italiano
-dmz
-tippecanoe
-muhlenberg
-debs
-odense
-lacus
-haller
-amma
-unthinkable
-stowed
-doughnut
-caswell
-jokers
-buntings
-bolling
-cytosol
-nicolson
-corrects
-palmdale
-hummer
-kurgan
-nocturne
-hasidism
-velar
-dhl
-henriette
-drumline
-atrocity
-zeitung
-tubercles
-1504
-foxtrot
-754
-expansionist
-formulaic
-poco
-camellia
-follette
-cancels
-bermudian
-rashi
-balanchine
-magsaysay
-shameless
-1518
-macroeconomics
-waseda
-callum
-donato
-hemphill
-rds
-artois
-strafford
-human-like
-ypsilanti
-kojima
-bakewell
-afon
-amenity
-irvington
-shute
-bungie
-professorships
-codeine
-canopus
-manitowoc
-kashi
-mcauley
-fiefs
-garo
-ferment
-udinese
-submersible
-beazley
-samplers
-sketchy
-lye
-collectives
-parthia
-beatification
-vied
-noa
-tench
-1212
-105.3
-10-6
-h.h.
-unchanging
-bodleian
-naperville
-nazim
-palatable
-newquay
-tp
-sizing
-autofocus
-emporia
-absorber
-exertion
-kirwan
-fridge
-extrinsic
-paraffin
-philistines
-mid-2005
-aru
-bhumibol
-1120
-scanlon
-kalmar
-anniversaries
-littlefield
-sahel
-wobble
-kickoffs
-faulk
-roz
-swapo
-celibate
-waziristan
-stabilizers
-mcp
-riverbed
-poachers
-tantamount
-conciliatory
-raves
-bragging
-gatekeeper
-cozy
-roux
-rostrum
-gaur
-postures
-notches
-talmadge
-diamondback
-mercian
-urchin
-shukla
-lula
-gillett
-maasai
-guinevere
-richman
-amesbury
-watercolors
-cortlandt
-agora
-neff
-v-shaped
-patchy
-s&w
-durante
-cornette
-p.e.
-s4c
-ganymede
-hurler
-tsuen
-mobilizing
-noll
-fribourg
-caretakers
-electrodynamics
-zim
-acadians
-pleural
-mamma
-antibacterial
-lacs
-defies
-panties
-aphids
-militancy
-targa
-surges
-dingoes
-instinctively
-santorum
-mass-market
-urbanisation
-dermal
-stirrup
-telegrams
-in-laws
-givens
-existentialism
-alderney
-introverted
-morgenthau
-barris
-attests
-hasselbeck
-stitching
-homesick
-salyut
-proportionate
-maclaren
-sarai
-asic
-bergerac
-unconsciously
-goa'uld
-li'l
-errand
-haider
-albumin
-hippos
-landmines
-pst
-welker
-chiesa
-mms
-couplers
-wigmore
-volk
-ordain
-tactically
-low-rise
-mcveigh
-vb
-miserably
-configurable
-misdeeds
-deciphered
-deutsches
-brockville
-countrywide
-deduces
-salih
-37.4
-impressively
-blubber
-approximating
-intricately
-blue-eyed
-eames
-puyo
-sumptuous
-two-seater
-babar
-galloping
-ker
-corbusier
-haber
-couplet
-ez
-castiglione
-booze
-worley
-thrusts
-zur
-groupe
-1750s
-spender
-ivf
-appeasement
-e85
-bookmakers
-jaina
-sra
-w.w.
-homewood
-benazir
-interspace
-mayor-council
-dendrites
-grimshaw
-ellwood
-mannix
-halakhic
-flops
-saki
-kabuto
-avowed
-bate
-kaitlin
-dali
-eleazar
-pepperdine
-ploughing
-valais
-transcends
-bruin
-theropods
-wausau
-42.9
-vikramaditya
-fram
-colonist
-coerce
-soledad
-wargaming
-storytellers
-sugababes
-wizardry
-blu
-kidder
-kkk
-discourages
-agee
-mordred
-pre-emptive
-ambiguities
-taskforce
-brugge
-irreconcilable
-dipterocarpaceae
-koichi
-rejuvenated
-ecclesia
-cowie
-beaverton
-newhall
-wylde
-1494
-changeling
-ascribe
-ndebele
-evert
-shelbyville
-chulalongkorn
-oberliga
-boobies
-newsagent
-mildura
-swath
-higham
-reshaped
-mamet
-lemuel
-scuderia
-statisticians
-zafar
-strachey
-88.5
-non-lethal
-feasting
-opt-out
-mishima
-170,000
-paymaster
-generational
-galatea
-iyengar
-472
-osha
-3.24
-infraction
-callas
-caesium
-stoppard
-1475
-mi2
-buda
-sportswear
-allama
-2004-2006
-notaries
-palaeolithic
-top-40
-pancakes
-ulnar
-exclave
-shinawatra
-elvin
-wma
-southfield
-isaak
-mired
-nitride
-cabell
-inlays
-pecking
-microhylidae
-pompeius
-agong
-crayon
-withering
-nervosa
-kinnear
-heffernan
-transfusions
-abalone
-haight
-acutely
-agm
-lawlessness
-krugman
-trebizond
-tiebreaker
-rebuke
-e-flat
-yuba
-fac
-eelam
-justly
-trampled
-klezmer
-imprecise
-808
-paparazzi
-nita
-harmonia
-necromancer
-1461
-impressionism
-npa
-griffey
-uprooted
-shamir
-dorking
-mingled
-resurrecting
-cavalrymen
-optimizations
-mcauliffe
-dol
-reinforces
-586
-stencil
-rodimus
-kda
-chairing
-loudest
-mohamad
-resonances
-reversion
-interferometry
-convening
-uspto
-courland
-subvert
-incinerator
-103.7
-stator
-meissen
-succumbs
-marcie
-fuzhou
-haruka
-bcci
-larch
-eamonn
-discards
-mattresses
-uttering
-,2
-masaki
-post-game
-pestilence
-9-7
-chantilly
-cilla
-multi-sport
-one-party
-722
-trims
-traci
-comecon
-reactants
-stockdale
-grub
-1989-1990
-barenaked
-ubs
-kona
-goers
-rooting
-cruces
-stellenbosch
-implausible
-bananarama
-usm
-unsound
-amit
-kerama
-9-3
-sohc
-chippenham
-twenty-seventh
-jilin
-hindering
-berengar
-copperfield
-tangle
-539
-moribund
-werder
-capstone
-fluctuate
-grooved
-udi
-morpeth
-spiced
-troughs
-mazes
-anic
-off-duty
-bayley
-tiered
-objecting
-oxidant
-squamous
-gooding
-socorro
-cwa
-38.0
-taxiway
-gov
-migrates
-icp
-khitan
-sub-committee
-tgv
-petronas
-mair
-colonnade
-gabor
-newsgroups
-lakeville
-ozawa
-evangelization
-castlereagh
-usman
-netbsd
-ventriloquist
-edda
-cineplex
-cavalcade
-chanson
-mid-term
-goren
-desirability
-bagan
-cityscape
-masterminded
-atl
-fainted
-kneel
-full-blown
-uyghurs
-zenit
-bolo
-newgate
-39.1
-brun
-munda
-hunter-gatherers
-sombre
-wielder
-meander
-ennobled
-halts
-undeterred
-grosso
-det.
-subtlety
-public-use
-pennsylvanian
-puzzling
-blondes
-hatchlings
-rededicated
-chicane
-trollope
-shuttered
-asm
-mutate
-psylocke
-apolitical
-purses
-gondar
-wushu
-ruston
-samiti
-oy
-1275
-millington
-fredrick
-czw
-sunspot
-dimes
-rosemont
-lopsided
-christening
-metellus
-wala
-16,250
-rationally
-coolest
-zuckerman
-limited-access
-ratzinger
-rah
-trueman
-quinine
-soothing
-menelaus
-bwv
-cd-r
-duffield
-limestones
-amplitudes
-skips
-steadman
-soros
-pro-slavery
-etymologies
-brockman
-523
-purposeful
-spud
-dressings
-ruinous
-+4
-galatasaray
-preponderance
-vinatieri
-trinidadian
-eunuchs
-malachy
-acetyl-coa
-aldous
-elude
-darwinian
-chimp
-padang
-thresher
-bazooka
-grogan
-nyssa
-flemming
-schaffer
-714
-zhukov
-mundy
-multi-player
-lionheart
-tangential
-apologetic
-russo-turkish
-542
-envision
-pant
-recanted
-abacus
-vladislav
-alchemists
-ruggles
-brooklands
-amrita
-vibes
-cranford
-kbs
-davos
-553
-landmine
-rlfc
-campeonato
-antiviral
-88.3
-holi
-nine-year
-732
-strapping
-saka
-termite
-gantry
-best-of
-dearest
-spina
-off-shore
-sores
-ambrosia
-viscounts
-boson
-rader
-alaric
-discontinuity
-datasets
-kells
-bettered
-hairspray
-1/8
-unduly
-xanth
-discriminating
-brin
-elsevier
-uplink
-ucf
-maplewood
-sacha
-ripening
-chantal
-capitan
-fromm
-2.24
-udo
-kohn
-hau
-suwon
-smelling
-milhouse
-stockyards
-mbbs
-intelsat
-germaine
-masson
-swenson
-petaling
-marblehead
-brubeck
-hideo
-non-binding
-tikva
-kars
-ailerons
-plotinus
-preselection
-haddad
-boils
-labuan
-bullitt
-rabbinate
-befriending
-famines
-rar
-paralysed
-taz
-papandreou
-introspection
-avellino
-n-terminal
-zora
-rhein
-frehley
-gunship
-nihilism
-criminally
-moya
-hyland
-tver
-loosen
-barnwell
-zvi
-glycolysis
-symbolising
-milly
-muskingum
-cocky
-cuenca
-gmina
-zulus
-beaux
-bly
-prendergast
-defuse
-shoplifting
-carne
-donal
-wingman
-marmalade
-teck
-undeclared
-team-up
-playfair
-golders
-dairies
-petticoat
-folic
-mariachi
-faroese
-rightfully
-recapitulation
-inlay
-upsurge
-qu'appelle
-suckers
-tubal
-caries
-indentation
-petr
-powerpoint
-camaraderie
-mommy
-edd
-demobilized
-signalman
-stipulations
-first-place
-muralitharan
-sphincter
-reigate
-trincomalee
-papadopoulos
-swabian
-sigel
-heresies
-cui
-chl
-chp
-1-10
-hohenstaufen
-baffin
-pskov
-terracing
-wahoo
-infanta
-arlo
-momentous
-scavenging
-edgerton
-pettit
-telephoned
-b'nai
-embed
-full-color
-1471
-102nd
-gyroscope
-legionary
-defectors
-castration
-pictou
-non-specific
-macdougall
-birkbeck
-pickwick
-solver
-postsynaptic
-unjustified
-leominster
-spirituals
-junkie
-ark.
-surreptitiously
-jenson
-guyanese
-bradenton
-cruelly
-rockport
-whidbey
-rastafari
-lead-up
-limehouse
-gomer
-ifbb
-ogaden
-yum
-rothesay
-embarrass
-mercado
-voronezh
-624
-628
-slp
-errant
-registrars
-rickshaws
-biochemist
-triennial
-pingu
-88.7
-lytle
-counterterrorism
-hackman
-sharpshooter
-sma
-zebras
-seminarians
-prentiss
-spaceport
-sternum
-whorls
-5-yard
-patrilineal
-tyrannidae
-loo
-reassure
-mcmullen
-linlithgow
-kilo
-overridden
-respectfully
-flatly
-karpov
-nullification
-aloe
-exxonmobil
-rollover
-catskills
-blossoming
-wyler
-chertsey
-reburied
-augie
-annoy
-edwardes
-taint
-2004/2005
-showa
-transmutation
-odors
-codon
-goaltending
-487
-devastator
-kino
-irradiated
-unresponsive
-penciled
-second-most
-atsc
-rua
-wolfpack
-cyrenaica
-reo
-madera
-reductive
-viz.
-clutching
-filipina
-renwick
-prichard
-spontaneity
-jerky
-read-only
-valenzuela
-edf
-golgi
-glazer
-misinformation
-showrooms
-2001-2005
-boyfriends
-gosling
-b.d.
-tamer
-burleigh
-7-10
-german-language
-chilly
-infanticide
-mythbusters
-bosom
-undesired
-proconsul
-accentuated
-abbottabad
-105.5
-reeling
-griff
-undescribed
-burrito
-clapp
-frau
-lower-level
-bespoke
-clydesdale
-grote
-pbx
-gurps
-taganrog
-zf
-ember
-streptococcus
-masterplan
-electrocuted
-coastguard
-uic
-amazement
-moshav
-sina
-reactivation
-immunities
-fawr
-595
-effortlessly
-cromarty
-x11
-1314
-onassis
-546
-cobbler
-spring-loaded
-water-cooled
-reformatted
-bhs
-pacify
-yunus
-coronal
-brainerd
-yuna
-metastasis
-sarandon
-patricians
-wimax
-darters
-appreciative
-cetaceans
-jonesboro
-stationers
-vehement
-wavell
-schumer
-2001-2003
-hywel
-snapshots
-draconian
-prismatic
-depauw
-wfan
-blotches
-non-functional
-woodhead
-unsympathetic
-106.1
-motorcade
-desdemona
-pynchon
-deejay
-satchel
-ursuline
-thc
-mensa
-caan
-janie
-yadkin
-gosforth
-naka
-roseland
-drona
-une
-morte
-uavs
-confidante
-keeled
-deep-water
-creeps
-lief
-pottsville
-crawls
-tampered
-cropper
-maier
-weatherford
-graaf
-eclipsing
-coriolis
-nicolaus
-theocracy
-semicircle
-20-year-old
-tristram
-superhumans
-username
-kop
-autodesk
-blinking
-delimited
-639
-yoshino
-unlawfully
-hydroelectricity
-water-based
-teleplay
-901
-pom
-aha
-phenotypic
-norsk
-family-friendly
-travelogue
-reassured
-girardeau
-trigeminal
-christos
-appendicitis
-flp
-distracts
-rinks
-stabilised
-tortilla
-chap
-hani
-uktv
-non-alcoholic
-elsinore
-harborough
-scorecards
-imaginable
-105.1
-gamma-ray
-socializing
-beersheba
-sulaiman
-kiefer
-kazi
-toolbar
-bulgars
-oswestry
-flynt
-aesop
-merriman
-afoul
-madhavan
-magog
-nir
-technicality
-galilean
-stalybridge
-speculating
-estonians
-innkeeper
-3do
-717
-nlcs
-zanuck
-springdale
-attainder
-hartnell
-broad-based
-jie
-restaurateur
-salivary
-bribing
-coverdale
-naik
-yousef
-life-saving
-1152
-colm
-buns
-cheever
-coniston
-indulgences
-saya
-loopholes
-abernethy
-tribunes
-kalan
-gneiss
-ptsd
-775
-volition
-swayze
-rajkumar
-baldy
-keir
-spousal
-pandemonium
-bau
-catalyze
-delmar
-newhart
-dynamos
-ludicrous
-pre-university
-gatsby
-pokes
-horny
-notifying
-t.c.
-horrifying
-nilgiri
-loaders
-1968-69
-731
-suplex
-lamia
-front-page
-glitches
-endures
-degrades
-cit
-inf
-xd
-xt
-kaz
-hark
-clp
-regnal
-1258
-plinth
-auxerre
-rosicrucian
-tainan
-voigt
-compacted
-minibuses
-connick
-arg
-chaudhary
-veitch
-openoffice.org
-481
-tribals
-tarantulas
-marvellous
-yaroslav
-gagged
-gog
-llanview
-s4
-roemer
-adaptor
-fane
-ditko
-prt
-dead-end
-cristobal
-nuffield
-crypto
-eggman
-hugging
-bilbo
-worshiping
-daycare
-gwyn
-spinnaker
-disintegrating
-tfg
-handicaps
-farmstead
-bethnal
-mangeshkar
-unaided
-adriano
-cort
-rmb
-ascendant
-profusely
-isola
-rowena
-anti-defamation
-nca
-rabat
-pelts
-karnak
-susanne
-inadequacy
-mallorca
-off-the-shelf
-tweedy
-toner
-linder
-minsky
-weathers
-affixes
-cryonics
-dwarven
-sumer
-withdraws
-agios
-wanamaker
-cinemascope
-hardball
-antisocial
-maggots
-rawlins
-619
-frauds
-sahrawi
-faithless
-ly
-taliesin
-wp
-holcomb
-39.3
-posen
-bozo
-lathrop
-outperformed
-cwm
-redefine
-willingham
-delicacies
-ecologists
-cctld
-moffett
-extravagance
-562
-lutyens
-sacristy
-recoverable
-saco
-radially
-arpanet
-antipsychotics
-coulee
-chenango
-jhansi
-lada
-ooh
-bmp
-amigo
-presume
-symons
-bubonic
-gallinules
-provincially
-collages
-koda
-halleck
-skinheads
-morphemes
-patronymic
-vickie
-visage
-surpluses
-slattery
-mindful
-enrollments
-hager
-parse
-ivanova
-inversions
-parva
-whispered
-puntland
-toki
-cancellations
-nathalie
-wg
-aargau
-seiya
-kirin
-orchestrations
-gp2
-alicante
-aral
-six-lane
-orlov
-distilling
-trowbridge
-oddities
-mccutcheon
-dottie
-unsuited
-siobhan
-634
-eglin
-thana
-hubbell
-norcross
-eoin
-jagannath
-homogeneity
-konstantinos
-1348
-1349
-wec
-irrawaddy
-impeccable
-deadman
-unionized
-can-am
-omniscient
-laborious
-felling
-choristers
-vivienne
-fla.
-di-pertuan
-pertwee
-judaic
-dispenser
-p3
-rg
-aar
-demoralized
-tanjong
-fluxus
-tiffin
-bundeswehr
-adagio
-linnean
-urchins
-syncopated
-taut
-absinthe
-otway
-portia
-gunslinger
-zevon
-chandos
-disinformation
-marmaduke
-mea
-quint
-jarre
-choudhury
-second-best
-544
-bridlington
-princeps
-officiate
-599
-tpa
-kellie
-eo
-karr
-loggia
-dmk
-geer
-minting
-inez
-asmodeus
-paiute
-mang
-arapaho
-nuttall
-besant
-clitoris
-serviceable
-rashtriya
-abscess
-concourses
-factoring
-sherpa
-aosta
-reconquista
-mid-1940s
-hardman
-tubbs
-brownfield
-tisdale
-tombstones
-celluloid
-answerable
-branagh
-self-supporting
-amidships
-tact
-dha
-assassinating
-semi-finalist
-epithets
-pyrotechnic
-popov
-interdict
-onsite
-lessening
-5.56
-aalborg
-actuated
-single-track
-paces
-caledon
-miscellany
-strayed
-mudflats
-pierpont
-hariri
-double-sided
-welwyn
-fdic
-0.75
-appropriateness
-inclinations
-technologist
-42.5
-eloped
-panning
-sweetener
-indo-greek
-banish
-line-ups
-heliopolis
-xxv
-flattering
-fareham
-u.c.
-brunette
-scudder
-endymion
-strasberg
-fiqh
-kon
-manifesting
-tehsils
-raipur
-ait
-arianism
-ouster
-kirkuk
-nava
-kuznetsov
-effie
-metamorphoses
-deltas
-petros
-narvik
-no-nonsense
-martinsburg
-bacall
-dench
-rejuvenation
-biofuel
-melodifestivalen
-mechanization
-analyte
-reformatory
-antigone
-tilburg
-narrators
-vieques
-live-in
-regulus
-computerised
-honoree
-tcl
-forrestal
-camarines
-frisians
-south-southeast
-keri
-slav
-fricatives
-unwarranted
-sutures
-leaguers
-loach
-melatonin
-astrodome
-halcyon
-boosts
-interpolated
-lansdale
-dethroned
-lederer
-metheny
-placate
-reflectivity
-concepcion
-despot
-toyama
-kamp
-widely-used
-workbench
-digital-only
-gabriella
-anti-corruption
-usha
-effecting
-annuities
-breakbeat
-hornchurch
-groovy
-clearfield
-naca
-lucasfilm
-o'keeffe
-bushwick
-paramilitaries
-kryten
-raked
-synchronised
-weiland
-bridgestone
-kinky
-katanga
-wazir
-small-time
-post-traumatic
-poked
-yudhisthira
-d.h.
-101.9
-sinuses
-gyeongju
-14-0
-belizean
-phosphor
-cryer
-nerds
-federer
-sagebrush
-menelik
-dosing
-cottrell
-tooting
-rmit
-bilinear
-viaducts
-critiqued
-j2
-smoothness
-fujimori
-lra
-avoca
-yama
-khas
-shinkansen
-immigrating
-anti-drug
-overlays
-born-again
-crakes
-jaques
-five-time
-shibuya
-vella
-royle
-testaments
-gales
-dior
-jib
-perera
-bor
-preexisting
-mena
-millbrook
-gunung
-sugarloaf
-fretboard
-rhs
-marciano
-lectureship
-ministered
-hunter-gatherer
-fett
-assemblages
-tryouts
-1985-1986
-kathakali
-conic
-keyser
-outlawing
-stranraer
-hrt
-exacerbate
-penn.
-ragusa
-mig-21
-galahad
-downbeat
-gimmicks
-tmnt
-deductible
-dandridge
-frida
-20-yard
-mock-up
-khlong
-self-sacrifice
-heartache
-concordance
-schaumburg
-q&a
-ejecta
-parlors
-hillier
-bullshit
-cumbernauld
-reserving
-sidereal
-preservatives
-cra
-montferrat
-stools
-helmholtz
-pankhurst
-cephalopod
-116th
-carswell
-r&r
-prestwich
-stl
-mukti
-500th
-borrowings
-watercourse
-dupri
-jiu
-nabi
-denzel
-tng
-608
-kanda
-carman
-dimorphic
-ppi
-renouncing
-all-australian
-pimpernel
-heilongjiang
-biweekly
-bungee
-38.7
-7-2
-freelancer
-win-loss
-succulent
-lewinsky
-monotonous
-forestall
-resents
-uaw
-subjecting
-perihelion
-morrisons
-bolingbroke
-dura
-wavefunction
-101.7
-phytoplankton
-safi
-underpowered
-beckenham
-second-half
-surtees
-ranulf
-banging
-ver
-inaccuracy
-df
-angering
-nichol
-re-organized
-wc
-borussia
-protrude
-non-toxic
-innocuous
-besiege
-stairways
-gyatso
-1478
-expanses
-bhd
-raine
-counsellors
-herbicide
-distancing
-psychoanalyst
-rami
-2800
-flamethrower
-tdi
-in-situ
-rattan
-1971-72
-professes
-immunodeficiency
-skyrocketed
-hermitian
-astern
-coachman
-snr
-non-championship
-yavanas
-patriarchy
-undying
-bayliss
-uppercase
-mpr
-marcella
-43,000
-powerbook
-emg
-gob
-gartner
-take-over
-nfp
-fess
-skunks
-bouvier
-bastille
-bidirectional
-rte
-invulnerability
-bofors
-ila
-68000
-pumas
-lia
-nao
-spreadsheets
-malleable
-khamenei
-lenz
-cyclo-cross
-dulcimer
-kanu
-brechin
-scr
-inasmuch
-amanita
-fingertips
-approvals
-barbet
-skylight
-ctvglobemedia
-tcs
-shorea
-pdp-11
-headship
-crofton
-wop
-biggest-selling
-gatefold
-pn
-weeklies
-two-level
-daimlerchrysler
-chaff
-belatedly
-h.m.s.
-tbn
-mitford
-voucher
-elstree
-epps
-narrow-gauge
-ibaraki
-signers
-epinephrine
-retto
-transitory
-casale
-rother
-mcdougal
-40-yard
-austell
-jiao
-natick
-hislop
-lemmings
-gilberto
-receding
-quilts
-close-knit
-tout
-devizes
-legendre
-sanctorum
-tele
-bedchamber
-deol
-sardonic
-oxbow
-grammer
-perplexed
-gon
-ahab
-darnley
-indecency
-hyperactivity
-denials
-4d
-abominable
-riveted
-zadar
-marbury
-autographed
-darley
-implicate
-imitates
-usp
-laszlo
-kailash
-tamed
-complicates
-acoustically
-judeo-christian
-realtors
-tubs
-impregnable
-sugden
-governorates
-remedied
-jolt
-outrun
-dredge
-beefcake
-post-mortem
-shiner
-tactician
-tymoshenko
-hurwitz
-castaways
-shaheen
-murmur
-rakyat
-minarets
-bishoprics
-androgynous
-traffickers
-landforms
-redditch
-kalahari
-discreetly
-clc
-normality
-vitis
-p.a.
-substantiate
-larsson
-nep
-gram-positive
-imperials
-distrusted
-amalric
-618
-christiansen
-cramp
-accuser
-salish
-minivan
-stent
-top-secret
-fissures
-backfire
-rawson
-kiwis
-98th
-firewire
-parkersburg
-cunt
-cowen
-personas
-f-86
-croat
-nodal
-re-record
-tutankhamun
-nolte
-then-governor
-tyr
-greville
-awkwardly
-hyperactive
-fiends
-aberrant
-three-storey
-zahn
-stripped-down
-nida
-squier
-seaver
-kress
-rabid
-fon
-goldstone
-whitlock
-unseat
-558
-45.5
-touristic
-pulleys
-flirtatious
-commentating
-eavesdropping
-oatmeal
-willi
-belted
-taira
-harpo
-prospero
-hoosiers
-pha
-tabulated
-sto
-grander
-reanimated
-tallis
-comforting
-lebesgue
-barque
-hap
-pima
-inductor
-preliminaries
-emplacement
-singularly
-strode
-literatures
-seldon
-anarchic
-waxman
-m7
-visayan
-willa
-winder
-909
-near-death
-2n
-frisch
-romantics
-exarch
-minos
-lustre
-bookkeeping
-tei
-peeled
-a.e.
-bibliotheca
-mgr
-misfit
-pipits
-563
-dacre
-fwa
-baru
-shawl
-eparchy
-jarrah
-citi
-corporeal
-tso
-rearward
-metamorphism
-slipstream
-indelible
-flautist
-interdependent
-brower
-macdonnell
-tirupati
-indigent
-spammers
-maryville
-ettore
-arie
-fictionalised
-wyre
-lillywhite
-kushan
-conde
-cleaved
-tra
-extinguishing
-two-handed
-izumi
-pamplona
-bossa
-nimoy
-transjordan
-on-camera
-wahlberg
-gels
-espn.com
-toshio
-enslave
-cinematographers
-preposition
-berlioz
-stalinism
-khans
-surpasses
-1185
-jessop
-bhatti
-hyksos
-sampdoria
-hydrophilic
-off-site
-freleng
-munition
-rushton
-barneveld
-knotted
-minigame
-lumped
-encrypt
-bissell
-staves
-clear-cut
-schechter
-dubbo
-anhydrous
-walworth
-dianetics
-wtc
-stink
-yoo
-modulating
-cladistic
-hemi
-buzzards
-assesses
-taichung
-vlaams
-immutable
-indecision
-kweli
-verna
-morey
-disobeyed
-mc2
-peering
-birney
-roundabouts
-r5
-camberley
-generalizes
-deport
-disarming
-overflowing
-modernise
-squidward
-llandudno
-vmware
-karam
-sweaters
-g-unit
-despise
-overcast
-proto
-montmorency
-2.23
-heep
-fairway
-wagtails
-collyer
-falsehood
-woodall
-michal
-jamia
-qumran
-enticed
-nakano
-quantized
-clarita
-parachuting
-avis
-defensible
-cross-sectional
-ribeiro
-whistleblower
-sakas
-half-centuries
-proverb
-stine
-even-toed
-misidentified
-voided
-colette
-oca
-maclaine
-blindly
-keely
-platters
-paleocene
-knockdown
-waterwheel
-650,000
-maura
-kaliningrad
-boccaccio
-vasili
-kumaon
-tensors
-gilgit
-sobriquet
-x3
-liffey
-cartographer
-hecate
-baumann
-8-12
-adamantium
-eczema
-unforgiven
-ato
-keeley
-bandicoot
-rotator
-kurzweil
-huai
-genet
-angora
-wilmer
-belisarius
-ido
-janelle
-betel
-porpoise
-permanence
-sufis
-tnn
-camogie
-intelligences
-lorient
-lahti
-digipak
-kamala
-spellman
-dicaprio
-galvin
-kinston
-belafonte
-newnham
-handlebars
-bustamante
-amoral
-palatial
-wilford
-.30
-shadowing
-testis
-cashed
-6.30
-benghazi
-rothwell
-omicron
-merino
-cantonal
-congregationalist
-kanata
-uninhabitable
-emulates
-dormancy
-disjointed
-rework
-sawdust
-sotto
-peleliu
-calabar
-woodwinds
-staccato
-nmda
-b-flat
-alleyn
-wheelock
-toga
-hoddle
-shanahan
-240,000
-picker
-cookbooks
-rants
-dakshina
-890
-belles
-latins
-idi
-leniency
-sipowicz
-mila
-theocratic
-hennig
-rutan
-swallowtail
-bisexuality
-deceleration
-hiatt
-597
-lenore
-omnivores
-543
-poi
-pre-colonial
-moyne
-mansour
-gg
-haydon
-guang
-margery
-sram
-wry
-dissonant
-bloodhound
-sneaky
-hadfield
-keio
-89.1
-feu
-melchizedek
-kushner
-.357
-cet
-interdependence
-rumba
-beveridge
-pinion
-1966-67
-rawlinson
-animax
-kull
-edvard
-temperamental
-speedily
-underdogs
-clawed
-siskiyou
-repechage
-julianne
-wal
-upanishads
-itv2
-libri
-universals
-viscountcy
-wein
-pugin
-newhouse
-parham
-muscovite
-bhattacharya
-ferranti
-trobe
-berar
-grammatically
-krajina
-prostrate
-nyasaland
-blowback
-irritant
-upfront
-four-year-old
-pythons
-phaeton
-punting
-pontypool
-ustinov
-cooperatively
-midsomer
-blas
-mckinsey
-ghoul
-idealist
-aamir
-operettas
-catalunya
-fukui
-supersede
-482
-ssh
-low-density
-gaskell
-malian
-masterworks
-chak
-vice-principal
-berenice
-uematsu
-mmc
-hsv
-umbrellas
-commodus
-trescothick
-curlew
-pre-modern
-jemima
-basking
-falwell
-boxcar
-rubio
-keokuk
-a.r.
-boatswain
-dreamt
-disapproves
-apologists
-capitalise
-sandpiper
-ethnologue
-conquistadors
-moderne
-two-piece
-103.3
-proliferated
-purging
-jeju
-podcasting
-disoriented
-toyotomi
-rosslyn
-548
-hightower
-fpga
-scarves
-vmi
-hearne
-darter
-bi-annual
-apportioned
-96.4
-anbar
-left-back
-vignette
-throbbing
-connoisseur
-39.4
-o'donoghue
-narita
-wrest
-miwok
-updike
-condorcet
-cowling
-milepost
-harada
-marchand
-steepest
-unspoken
-wipers
-rawhide
-ointment
-tv2
-inkjet
-cashmere
-atpase
-nhu
-fogarty
-clackamas
-imaginations
-571
-marais
-arnaz
-masood
-fishman
-keeling
-bahama
-bitrate
-faraway
-assad
-gprs
-annealing
-buries
-angelos
-invalidate
-gorillaz
-spinelli
-ruud
-straczynski
-shockley
-mumps
-95.8
-haus
-mohs
-phs
-backfires
-amersham
-kazhagam
-frei
-half-back
-102.1
-oakey
-cert
-outsold
-cme
-rostral
-redfield
-athlon
-land-grant
-telus
-1455
-lifeforms
-dermatology
-secunderabad
-gowda
-gist
-lga
-v-8
-2pm
-flattening
-battering
-cutaway
-31,875
-moduli
-headstrong
-muffin
-621
-eidos
-dian
-catullus
-westover
-graff
-meister
-alnwick
-outram
-genocidal
-periyar
-manorial
-linker
-counter-insurgency
-manasseh
-authenticate
-banat
-dnipropetrovsk
-aberrations
-morea
-promiscuity
-kenai
-wildest
-hindley
-jebel
-stanislav
-newlyweds
-four-hour
-j.a.
-lossy
-magick
-loosening
-spotter
-44.4
-t-34
-dialed
-finite-dimensional
-thetis
-allying
-godly
-crafty
-weakens
-curls
-initialization
-transducers
-stepan
-ecclesiastic
-712
-consenting
-leib
-kmaq
-seraphim
-maidan
-b-25
-prowler
-fairhaven
-superdraft
-mellotron
-14-year
-decomposing
-internacional
-raye
-huon
-leakey
-adrien
-i-69
-discounting
-in-studio
-endo
-pnp
-cheech
-earring
-adequacy
-jor-el
-dees
-a&p
-capistrano
-methodius
-nucleation
-semitone
-smelt
-rhubarb
-injunctions
-krasnoyarsk
-juana
-peppered
-hunter-killer
-refreshed
-pyroclastic
-plastered
-escherichia
-beckman
-freiherr
-hand-drawn
-excitatory
-palliser
-scherzo
-644
-imriel
-gordie
-pastimes
-virginians
-skaggs
-dost
-litvinenko
-septimius
-jefferies
-athleticism
-woodworth
-koufax
-osmosis
-kennesaw
-sharman
-questionnaires
-changchun
-xfl
-anti-german
-rheumatism
-soon-to-be
-667
-nissen
-hewson
-asen
-38.8
-bowe
-hydrochloride
-darkwing
-toba
-isolationist
-39.6
-39.2
-glaxosmithkline
-appendage
-aly
-phosphoric
-jacksons
-immunoglobulin
-caviar
-otl
-pq
-millennial
-frostbite
-ghar
-mccollum
-bramley
-fatimah
-puller
-chilli
-1346
-charleroi
-condense
-desimone
-tsb
-misawa
-activator
-hominid
-rogan
-redeeming
-corgi
-jezebel
-husserl
-alstom
-patiently
-492
-altruistic
-baronies
-hibs
-oceana
-sylvie
-pauper
-lom
-souichiro
-bystander
-semi-independent
-rcp
-rudra
-best-of-seven
-droppings
-messer
-eintracht
-arnett
-estrangement
-in-ring
-cerebrospinal
-serif
-advisories
-branca
-all-news
-heirloom
-sebastiano
-discontinuing
-donelson
-hibernate
-maryborough
-displacements
-elihu
-rosberg
-rhone
-pre-historic
-visalia
-forehand
-ginebra
-axelrod
-2-d
-kiddie
-alekhine
-workspace
-bullen
-melanesian
-pontifex
-mckellen
-skirmishing
-gidget
-lanza
-liddy
-portishead
-resonate
-vulcans
-gasket
-superconductivity
-state-sponsored
-argonaut
-shep
-policy-making
-botticelli
-dakotas
-skinhead
-montego
-uab
-tca
-prevost
-fringed
-sr-71
-cressida
-haddington
-plugging
-delos
-hl
-alleviated
-neopagan
-t-bone
-kwh
-forty-seven
-coauthored
-715
-dally
-indented
-ergonomics
-whitten
-sonics
-lingo
-stomachs
-marionette
-vitus
-anschluss
-intertoto
-tricolour
-o'farrell
-knickerbocker
-ruble
-porches
-steinbach
-jalisco
-ghostface
-bismuth
-irate
-equilateral
-overpowering
-cyrene
-reinvestment
-kea
-radley
-derleth
-backdrops
-post-modern
-nadal
-cielo
-neu
-n.w.
-mercier
-neutralizing
-argentia
-amethyst
-intercom
-referrals
-765
-lightship
-burris
-unrated
-darya
-priming
-ol
-bayswater
-nasi
-651
-algebraically
-wm
-boars
-romanus
-beadle
-pagodas
-fistula
-hand-picked
-dona
-borealis
-chadian
-goole
-momma
-surfactant
-bannon
-vagus
-afterword
-805
-magnetosphere
-smoot
-divx
-hornsey
-yury
-octahedral
-pid
-longed
-aspinall
-non-professional
-metrodome
-clarice
-family-oriented
-rekindle
-cytotoxic
-grisly
-tenors
-imc
-renderings
-thraupidae
-actuaries
-cyrano
-northeastward
-east/west
-lactate
-non-combat
-tabula
-apertures
-477
-all-girl
-jewelers
-bcc
-penile
-bernhardt
-incomparable
-apothecary
-non-proliferation
-county-wide
-judgeship
-1999/2000
-bucknell
-leong
-one-woman
-artesian
-bufo
-fragrances
-abt
-blockbusters
-lyne
-magnifying
-tillamook
-ragnarok
-conditioner
-squatter
-phishing
-willett
-pendle
-lua
-bedridden
-emden
-spanking
-c$
-centuries-old
-jamil
-backhand
-infallibility
-cartels
-meghalaya
-taxonomists
-colourless
-cronus
-kelli
-cross-dressing
-prescribes
-well-rounded
-knuth
-bylaw
-nama
-kikuyu
-chromatin
-gennaro
-629
-felon
-triomphe
-afro-american
-zhuang
-esquimalt
-rina
-neologism
-trucker
-client-server
-21,875
-unstructured
-belair
-strictest
-moesia
-backpackers
-soderbergh
-marauding
-daystar
-navigated
-palaiologos
-bunnies
-udall
-1999-2002
-remittances
-bayeux
-roms
-brandi
-kanal
-clementi
-junkies
-slavonia
-sedge
-tek
-homozygous
-fibula
-perks
-contraindicated
-1138
-sienna
-zines
-blisters
-scheer
-cheeky
-harmison
-barbadian
-reva
-rhizomes
-karaite
-midweek
-shambles
-all-boys
-725
-benefice
-rigdon
-psilocybin
-tredegar
-ferromagnetic
-criminality
-centreville
-labatt
-bulbul
-declarer
-halim
-oases
-ichiro
-puppeteers
-sherrill
-safed
-rutter
-pergamon
-vitruvius
-godolphin
-forde
-futura
-straining
-loew
-thirty-second
-lurie
-beachfront
-nimbus
-7-inch
-dataset
-kpmg
-heelers
-c-terminal
-ornithological
-kd
-wenatchee
-razors
-writer/director
-typewriters
-pstn
-pinchot
-prohibitively
-averse
-irwell
-chewed
-agustin
-bulldozers
-unpowered
-shaughnessy
-ikeda
-dit
-illingworth
-gauthier
-hine
-disables
-bootle
-cousteau
-late-1990s
-margherita
-1.50
-stupas
-1210
-blasphemous
-txt
-chera
-neutrophils
-nellore
-coxswain
-bukowski
-machiavelli
-smedley
-flounder
-924
-blacked
-!?
-deflation
-mss
-fer
-baskerville
-petey
-micrometres
-satyagraha
-stained-glass
-professorial
-ablett
-contributory
-epochs
-expedite
-azov
-expandable
-3,400
-cross-over
-fiu
-41.2
-schmitz
-flashman
-kinnock
-commentated
-greenhill
-specialisation
-1299
-straddle
-kish
-yukio
-millville
-fated
-tazewell
-seafloor
-adore
-bogged
-raynor
-pisces
-single-elimination
-diff
-wannabe
-m.c.
-antiwar
-fuqua
-wellbeing
-merseyrail
-recoup
-cochise
-nathanael
-homeopathic
-geneon
-hams
-hersh
-taser
-camcorder
-wort
-restarting
-wipo
-568
-ident
-pygmalion
-reinsurance
-pescara
-stimson
-playford
-aiden
-mme
-decrypt
-cecily
-lise
-rpi
-ewen
-ump
-migs
-sorkin
-justifications
-barisal
-gremlins
-eastgate
-terrorizing
-enthalpy
-tos
-infill
-mccomb
-cpusa
-dothan
-cadastral
-steakhouse
-franchisees
-ketones
-papp
-wellcome
-crotch
-tremblay
-ramapo
-amnesiac
-u.s.s.r.
-monckton
-1154
-m62
-dobie
-proportionately
-sardis
-muang
-planktonic
-semi-retirement
-alcott
-,5
-natalya
-mini-games
-pollinated
-grammys
-thurber
-battlements
-bergeron
-cross-examination
-refurbish
-two-tone
-lantau
-spits
-hemorrhagic
-contreras
-aleph
-plas
-bhubaneswar
-fiorello
-663
-deloitte
-kitsch
-truscott
-murrumbidgee
-monteverdi
-xylophone
-saintly
-suisun
-baboon
-kaskaskia
-lakshmana
-frederica
-birthdate
-hanukkah
-push-pull
-telecoms
-narcissistic
-cannock
-spooks
-stranglers
-2.22
-savannas
-nederlandse
-boardroom
-spasms
-caesarean
-aroostook
-painstaking
-columnar
-batley
-sandbox
-bel-air
-hundredth
-triphosphate
-3.27
-orthographic
-cowdery
-mencken
-burlingame
-jk
-otho
-horrocks
-mingo
-secaucus
-mid-american
-leake
-gazebo
-quasi
-shove
-blandings
-groot
-tls
-wcc
-lockerbie
-passers-by
-supercontinent
-cantos
-portnoy
-microfinance
-shaquille
-caucasians
-bolland
-straighten
-trophic
-parte
-1101
-cd-roms
-epiphone
-transits
-brophy
-remarriage
-hagia
-colson
-offenbach
-576
-pillaging
-soule
-cherwell
-661
-25,625
-morel
-gameshow
-gere
-four-story
-skokie
-inflationary
-propagates
-lids
-hialeah
-folger
-sanctus
-glut
-wilds
-unnumbered
-pgp
-latif
-hoaxes
-trainings
-thermally
-chart-topping
-knievel
-mothra
-howden
-non-japanese
-richthofen
-sarin
-underpinning
-schemas
-gentler
-czechoslovakian
-accipitridae
-cima
-antler
-incisive
-1740s
-neurosis
-bretagne
-whiplash
-free-form
-1946-47
-steaks
-transcriptional
-carlsberg
-progenitors
-dropkick
-mamie
-northport
-3-yard
-schengen
-boggy
-mercator
-lavra
-bumpy
-razorback
-merchantman
-jerks
-counter-clockwise
-encores
-youssef
-okeechobee
-metamorphosed
-midori
-masterton
-expiring
-silhouettes
-nook
-formula_53
-bm
-marchant
-pessimism
-cammell
-stereoscopic
-opiate
-aikman
-originators
-eliminator
-identifications
-sloops
-queuing
-taupo
-mahoning
-74-gun
-banzai
-determinations
-outkast
-fretless
-demonstrative
-purley
-henk
-oilfield
-masseria
-loretto
-exerting
-rheumatic
-boisterous
-spinoffs
-deceiving
-moloney
-stokowski
-trs
-lynched
-maurer
-megazord
-7am
-undermines
-nimble
-coromandel
-unflattering
-mademoiselle
-preempted
-typecast
-sauropod
-rikki
-66.7
-ujjain
-formby
-osmotic
-albin
-diphtheria
-amato
-optare
-ospreys
-shania
-streaking
-obscura
-semi-rural
-einar
-mysterons
-cyathea
-consigned
-featherston
-hime
-re-assigned
-carburetors
-yanukovych
-film-making
-quinta
-.44
-admirably
-psychosocial
-albertville
-correa
-pre-empted
-khon
-misspelling
-lodger
-dorman
-benitez
-cel
-self-awareness
-re-establishing
-demolitions
-latifah
-everquest
-sibyl
-idiots
-rendezvoused
-1216
-lundgren
-psychotherapist
-self-portrait
-paloma
-chavo
-wipes
-govind
-artful
-bloodiest
-counteroffensive
-picketing
-allo
-6-speed
-pathologists
-casks
-gripped
-wadham
-longus
-clicked
-sana
-pyne
-bouncy
-durbar
-668
-tabasco
-gondor
-1189
-iridium
-clamps
-seibel
-ppc
-tabulation
-threshing
-side-project
-stieglitz
-guin
-frantz
-custom-built
-xiamen
-lieder
-donizetti
-natsuki
-portadown
-affordability
-disloyalty
-statuette
-vindictive
-states-general
-queanbeyan
-guayaquil
-ucsb
-berserker
-fbs
-satsuma
-gozo
-novelties
-hanlon
-gato
-nephites
-pronouncements
-lcs
-ismay
-1998-2000
-fars
-anti-trust
-sprinkling
-venting
-1306
-eccleston
-liberalisation
-expropriated
-margie
-pathos
-rawat
-kitab
-pleasanton
-barricaded
-gretsch
-bratton
-infiniti
-bundaberg
-saxe-coburg
-felonies
-582
-gti
-gtk
-racking
-incapacity
-ketchum
-jools
-pooling
-clamped
-gimp
-lochaber
-timings
-sculls
-augmenting
-phoebus
-biogeography
-barman
-guile
-lavas
-ashikaga
-invents
-gt1
-taka
-ihl
-bloodthirsty
-jellicoe
-grisham
-lovat
-knighton
-ironclads
-bartram
-rodin
-memoranda
-556
-marketer
-facies
-814
-mathematica
-p-47
-unsettling
-halter
-customised
-kibaki
-mobilisation
-hermeneutics
-hydrogenation
-stara
-superleague
-oldman
-comer
-gascony
-h.m.
-bramall
-unleashing
-floss
-post-crisis
-dougie
-x-1
-lol
-landgrave
-vanbrugh
-ethically
-segovia
-stubbornly
-storm-petrels
-vihar
-allyson
-volodymyr
-mckim
-41.7
-titration
-self-conscious
-acreage
-clausen
-extraterrestrials
-lilley
-hikari
-fortescue
-maxilla
-unexpired
-clitheroe
-brom
-ransome
-neosho
-bassey
-tsubasa
-hibernia
-ammon
-fiercest
-rijeka
-ruslan
-watercolours
-krakow
-danby
-vesicle
-regio
-yat
-xpress
-adulterous
-tala
-ora
-supercross
-lougheed
-melodious
-tunnelling
-formula_55
-alleviating
-despatch
-franconian
-pallavas
-six-part
-practises
-bluestone
-backdoor
-airmail
-procopius
-port-au-prince
-spinks
-gunshots
-stockholder
-krantz
-gingerbread
-dwarfed
-paycheck
-belgrano
-baca
-dimer
-storch
-mansa
-remodelling
-papas
-adaptability
-sativa
-cru
-drusus
-queenie
-vx
-barbaro
-womanizer
-hazelton
-quadrants
-bi-directional
-satiric
-batons
-theremin
-vadodara
-yellows
-thespian
-cranbourne
-pern
-herzl
-gord
-aro
-destro
-gnp
-zoroaster
-trembling
-co-ordinates
-buffering
-melvins
-cardiomyopathy
-c-2
-woodcut
-heavyweights
-rivets
-panjang
-tamaulipas
-babbler
-656
-wozniak
-underrated
-deploys
-haringey
-djinn
-sellout
-hambleton
-loris
-bugsy
-rothstein
-devises
-protour
-lugs
-grass-roots
-scruggs
-libertad
-putra
-ecumenism
-scud
-halal
-polygamous
-intractable
-dca
-faithfulness
-gratification
-quartermaine
-warfarin
-rollout
-alimony
-30,625
-mot
-axillary
-497
-itasca
-wipeout
-contemplates
-guardsman
-spiritualist
-eis
-1381
-hard-fought
-eichmann
-penda
-half-blood
-crowder
-troughton
-nagging
-motorcyclists
-mtn
-immemorial
-stairwell
-lucullus
-screenshots
-dreyfuss
-meisner
-multiplexes
-maluku
-vanes
-nation-states
-minneapolis-st
-509th
-satriani
-nri
-ould
-roda
-viticultural
-flyby
-curd
-good-bye
-8-1
-ebro
-troma
-katha
-vuitton
-parganas
-u18
-dedications
-clampett
-masa
-vets
-winehouse
-locarno
-conversing
-lavey
-transformative
-orochi
-husk
-btcc
-legionaries
-laidlaw
-compagnie
-steadfastly
-telnet
-immaterial
-carvalho
-harboring
-wagoner
-iwc
-famagusta
-takara
-sidecar
-strident
-vecchio
-ticking
-suvorov
-ossie
-kidnapper
-megumi
-ostracized
-easement
-rubies
-fragility
-bibb
-ragnar
-wiles
-diabolical
-holkar
-cardozo
-fiore
-callao
-hellcat
-nitrates
-impotence
-hasten
-pierrot
-vsevolod
-salon.com
-tippett
-passageways
-indo-pakistani
-tinamou
-taxiways
-1965-66
-baggy
-skimming
-mammy
-arapahoe
-sandburg
-sequestration
-cardamom
-farouk
-camillus
-reincorporated
-goosebumps
-bureaucrat
-salesmen
-individualistic
-tirpitz
-monarchical
-confining
-ntfs
-1187
-40.5
-catatonic
-dyck
-ramanujan
-candido
-rolle
-loggins
-unwillingly
-rockdale
-637
-ibo
-slates
-centimeter
-mcgwire
-subfield
-hatshepsut
-carpathians
-drive-by
-venables
-doordarshan
-1,250
-ribbed
-tomboy
-skene
-freya
-surinam
-reade
-piscataway
-amuse
-civitas
-popstars
-dumpster
-boarder
-hunslet
-allendale
-psu
-dtm
-azmi
-yoda
-wither
-fireproof
-lesbianism
-prawns
-west-southwest
-boi
-worded
-dodecahedron
-festa
-bastian
-perryville
-californians
-m-day
-tubercle
-stavros
-hannay
-benediction
-simla
-fleck
-teletype
-usurpation
-rafe
-587
-immersive
-rackham
-decoders
-pewter
-alcorn
-1960-61
-hydrated
-tadpole
-wonsan
-chiller
-whisperer
-sda
-rtd
-mussorgsky
-catfishes
-np-complete
-gerda
-decapitation
-e.j.
-rauch
-parachuted
-schwarzschild
-refill
-berkowitz
-rosyth
-disclosures
-magnetite
-half-brothers
-azeris
-bracts
-maccoll
-poppies
-554
-557
-crankcase
-yamazaki
-alif
-carcinogen
-decoys
-adeline
-blocs
-peculiarity
-pogues
-bskyb
-wholesaler
-stig
-wingtips
-keyhole
-rosenfeld
-zan
-martyrology
-flippers
-sabbat
-exasperated
-sonoran
-allure
-tolled
-necro
-imaged
-olcott
-tubules
-changers
-menander
-freaky
-mhs
-mannerist
-zaphod
-lawman
-backlog
-manatees
-carlsson
-ibrox
-48,000
-msi
-prs
-cadaver
-cavernous
-tribulation
-anglo-catholic
-vaccinations
-winemakers
-joiner
-gracefully
-rickshaw
-hanshin
-baen
-seedling
-morant
-pavlov
-rydberg
-overprinted
-eurozone
-sangamon
-growl
-udc
-diuretic
-masoretic
-dnc
-zucker
-roarke
-emt
-subalpine
-co-founding
-supposition
-janeway
-widener
-barone
-camphor
-40.7
-sew
-topmost
-vlachs
-cahn
-semi-retired
-gaslight
-nightjar
-bioavailability
-knightsbridge
-four-month
-inalienable
-kru
-ascoli
-ultras
-jocks
-sago
-superlative
-sharpening
-degree-granting
-tabby
-97.4
-liberal-progressive
-littlejohn
-buttercup
-misa
-veered
-changsha
-anchorman
-white-eye
-semiotics
-ketch
-thymus
-non-player
-troup
-elope
-flexion
-ilia
-macgyver
-iditarod
-ronaldo
-masada
-patching
-speculator
-commedia
-araki
-thq
-mails
-melfi
-condensing
-skyhawk
-chills
-islay
-grudgingly
-haphazard
-withered
-semi-permanent
-myeloma
-waals
-single-sex
-siting
-piling
-luanda
-manger
-micheal
-orthonormal
-government-funded
-fournier
-predicated
-rohit
-hieroglyphic
-runestone
-moma
-b-47
-bellow
-400th
-oft
-cmb
-cmp
-escalante
-herein
-glorification
-leichhardt
-blackwall
-marquez
-tubman
-cpm
-phenotypes
-monophonic
-tynan
-elaborates
-cmi
-sub-groups
-adela
-ultimates
-tapings
-showmanship
-raiser
-gunnison
-559
-peacocks
-slays
-shepparton
-kesteven
-dislodged
-jalalabad
-keitel
-gravesite
-redness
-amory
-1984-1985
-linger
-ryoko
-24-bit
-bahraini
-readmitted
-marg
-dcu
-nikolaos
-icbms
-rougher
-bahawalpur
-mouton
-halas
-802
-outclassed
-resurfacing
-sweetest
-udon
-bade
-kreis
-hasdrubal
-simulcasted
-lando
-rochford
-proverbial
-ergonomic
-angelou
-clearings
-twenty-eighth
-gravelly
-barda
-dissociative
-vta
-gte
-obeys
-nickels
-damsel
-myelin
-miko
-substituents
-outweighed
-inelastic
-norodom
-2.21
-pyke
-fathom
-fireflies
-syncretism
-perforations
-contingencies
-benched
-palestrina
-adorno
-fitzmaurice
-zod
-steubenville
-dahomey
-smiled
-abra
-germanicus
-tv4
-murcia
-camcorders
-crustal
-ach
-beefheart
-maktoum
-erikson
-greenstone
-faulted
-tmc
-precludes
-solzhenitsyn
-algarve
-anti-gay
-unionville
-protozoa
-decimation
-44,000
-a38
-39.0
-exemplar
-772
-misaki
-trillium
-p-40
-offensively
-secretory
-munn
-hooghly
-admonished
-678
-invicta
-t-cell
-musik
-semantically
-birdsong
-alisa
-petal
-warpath
-gynecology
-marchioness
-thung
-zomba
-amway
-sentience
-flirted
-conjure
-ratliff
-silks
-jpl
-blaenau
-escrow
-indium
-cavett
-nullify
-elyon
-testbed
-quirino
-darrin
-andesite
-ainslie
-volusia
-bossy
-rickard
-dou
-materia
-eason
-otley
-playfully
-coriander
-profusion
-remi
-hollowed
-non-invasive
-raglan
-spengler
-svalbard
-satoru
-shi'ite
-beery
-inadvertent
-hoarding
-alamanni
-tattooing
-akali
-sealift
-hangings
-30-40
-dissertations
-longworth
-4.00
-mid-year
-rebrand
-fleury
-vamp
-belated
-aemilius
-libido
-materialist
-defecting
-debutant
-unparished
-plunges
-penciller
-propounded
-ringers
-mushtaq
-larnaca
-conn.
-shuttleworth
-alix
-alphonso
-dehydrated
-underestimate
-nailing
-gwyneth
-uttara
-fulk
-lass
-poona
-granddaughters
-dictatorships
-oren
-maumee
-raving
-mermaids
-rivaling
-meriwether
-spindles
-yerba
-crutches
-acetylene
-weirs
-ginseng
-anions
-academicians
-five-piece
-aafc
-40.3
-christiania
-xenophobic
-encircles
-astana
-bevel
-accede
-re-instated
-minami
-marshmallow
-kibbutzim
-nuanced
-roch
-seacrest
-organometallic
-devouring
-yuji
-esau
-familia
-rubric
-sakuraba
-hooligan
-abate
-1730s
-corvus
-philippi
-fainting
-pastoralists
-injectors
-40.4
-bremner
-pavements
-bettis
-reitz
-gurgaon
-closed-circuit
-manchus
-toluene
-reconquered
-echolocation
-postsecondary
-morita
-painless
-100.5
-saddleback
-helmed
-corrado
-najib
-apprehensive
-jaworski
-argonne
-newcastle-upon-tyne
-southerner
-dfa
-amorous
-bedi
-overhang
-well-liked
-001
-lomas
-gerardo
-pg-13
-exec
-ghouls
-heroics
-obscuring
-tamaki
-npl
-sweethearts
-subduing
-pyrite
-ghb
-wacker
-culpepper
-isherwood
-decolonization
-orangutan
-stockpiles
-106th
-folklorist
-herders
-x-2
-paignton
-sundry
-embarcadero
-punchline
-forecourt
-marduk
-aleksey
-outfitting
-dinas
-1191
-astrakhan
-mahadeva
-gobi
-reconvened
-d.w.
-amador
-holderness
-nashik
-berates
-kempe
-langhorne
-elis
-997
-addie
-subplots
-mayes
-cixi
-dickenson
-alans
-mcginnis
-mosby
-collated
-rainstorm
-divinities
-mcgarry
-549
-apokolips
-jog
-1297
-solicitation
-thiamine
-lithosphere
-candler
-c-1
-life-cycle
-imola
-rescind
-ament
-disapprove
-duodenum
-1301
-trouser
-indefatigable
-ifl
-straub
-annoys
-lamm
-kya
-oystercatchers
-watercourses
-107th
-formula_54
-meditating
-evacuations
-p450
-carmody
-hermosa
-marinus
-gunships
-kublai
-lengthen
-locket
-sibylla
-sochi
-lewisburg
-counterbalance
-yer
-lentils
-6-9
-friendliness
-uncompleted
-kramnik
-pecan
-boughton
-lek
-biogenesis
-rosenborg
-senile
-offstage
-mikan
-valdosta
-zfc
-craddock
-frenetic
-somoza
-burrard
-overloading
-endearing
-unassuming
-mastiff
-life-sized
-fairlie
-bandage
-cliffe
-fryer
-sires
-quonset
-sdi
-slacker
-llangollen
-mush
-ehime
-hooligans
-arrington
-716
-overlordship
-agassi
-nettle
-rspb
-hamel
-dilbert
-sixty-four
-hasegawa
-first-rate
-alves
-gaddafi
-enver
-carat
-philby
-costco
-malthus
-philological
-jpmorgan
-a9
-three-game
-undemocratic
-parachutist
-skagit
-airworthy
-ides
-testa
-thurrock
-lucent
-0-6-0
-1402
-125cc
-1-9
-quadra
-geniuses
-chatter
-polemics
-synthesised
-tailback
-dagon
-dominus
-formula_59
-geriatric
-asymptotically
-175,000
-.25
-desktops
-batty
-shimmer
-telepath
-ellipsoid
-korda
-mims
-exploitative
-tsu
-dumplings
-abstracted
-nyman
-577
-defensed
-nei
-psych
-cercle
-joint-venture
-blackmailing
-berkman
-arch-rival
-burch
-m40
-sga
-db2
-air-sea
-xenophobia
-homunculus
-chandeliers
-karmapa
-boyars
-skegness
-sainte
-mpla
-countenance
-bengalis
-wailing
-gcb
-katya
-blackness
-lahiri
-bladed
-gabriela
-sistine
-injector
-sabino
-oulu
-minder
-k1
-myosin
-growling
-stares
-mowat
-neptunes
-malaysians
-fmri
-legged
-catarina
-muscogee
-gpus
-brak
-,4
-mam
-extrusion
-animus
-bawdy
-mes
-galilei
-684
-2003-2007
-623
-hangul
-swamped
-silverado
-0.08
-bodhi
-slut
-khuzestan
-microcontrollers
-kimber
-truckee
-copycat
-magellanic
-pascual
-subscribing
-1398
-locally-produced
-fylde
-gmail
-tcg
-scrotum
-oneonta
-vandalia
-whaler
-laminate
-totalitarianism
-woe
-melanesia
-unobstructed
-globetrotters
-tuc
-coalesced
-misinterpretation
-1488
-1487
-reynard
-ascetics
-nestle
-tulare
-marikina
-anne-marie
-latium
-auster
-highwayman
-hopf
-reiko
-sit-in
-honing
-orpheum
-unstaffed
-vorenus
-rif
-floris
-plath
-memoriam
-sprouted
-metairie
-collider
-mbs
-mountaintop
-aisne
-ladoga
-hazlitt
-bolder
-houten
-kitsap
-pleases
-suzi
-atelier
-allocates
-489
-schoharie
-assiniboine
-fittingly
-hesketh
-fuss
-l'amour
-lambeau
-giga
-tsai
-dehradun
-fanaticism
-jeong
-kolb
-coverts
-non-title
-v-1
-roxie
-shorelines
-tilghman
-936
-plied
-39.7
-holston
-rhenish
-re-routed
-cameroonian
-farrer
-brantley
-boaz
-lorena
-formula_56
-aldrin
-sympathizer
-kook
-upperclassmen
-dunaway
-contaminant
-cleburne
-imposter
-delimitation
-sheringham
-perverted
-kwok
-naam
-gilmer
-kenseth
-adebisi
-alkenes
-bhutanese
-telepaths
-proteases
-auctioneer
-co-chaired
-miho
-jenin
-shih
-filton
-alor
-insidious
-aerobatics
-prince-archbishopric
-tsim
-fevers
-voice-overs
-pra
-city-owned
-symmetrically
-admin
-dreamy
-wynyard
-643
-shahrukh
-kilowatt
-radix
-kitano
-boardings
-regenerating
-duomo
-polonia
-bergmann
-687
-bonfires
-temperaments
-4,200
-deactivate
-kroll
-stanwyck
-1316
-artaxerxes
-rages
-stoddart
-coatbridge
-riddell
-indianola
-dramatics
-erupting
-petrelli
-deletions
-caracalla
-indulging
-grown-up
-kashmiris
-ena
-swanage
-odette
-oe
-lf
-gracilis
-inquiring
-mankiewicz
-conker
-tivo
-aye
-oic
-907
-905
-canandaigua
-intercut
-two-face
-bulging
-695
-coley
-moline
-endeared
-nachman
-bas-relief
-zork
-hydrodynamic
-piaget
-cgs
-567
-1962-63
-mugs
-tutti
-mexicana
-matsushita
-juarez
-anglo-indian
-amf
-isan
-eraser
-105th
-bannockburn
-supranational
-spotsylvania
-chestnuts
-christgau
-x-linked
-chumash
-neuroimaging
-quelled
-daniela
-gesellschaft
-celestials
-msl
-aslam
-f-35
-deflections
-stratovolcano
-bhagavad
-leveraging
-quicksand
-variegated
-glial
-blok
-swanton
-crain
-jeweler
-ismailis
-romsey
-grammarian
-calamities
-savagely
-stallings
-41.5
-lamanites
-singha
-oviedo
-codice_9
-atco
-3,800
-hysteresis
-revolting
-derail
-omens
-convenor
-gwendolyn
-9-10
-leitch
-omnipresent
-passable
-shyness
-woken
-cribs
-divulge
-feisty
-contrapuntal
-sit-down
-olsson
-diomedes
-9-1-1
-downy
-optimally
-sse
-wilks
-buckethead
-conductance
-slurs
-oed
-stumped
-clift
-nontrivial
-lovingly
-oberst
-chantry
-cripps
-sausalito
-malkin
-925
-grandstands
-unclaimed
-patronised
-wav
-red-tailed
-siddha
-non-voting
-haro
-anti-virus
-purifying
-persico
-infringe
-upturned
-2.20
-churchman
-laxmi
-incrementally
-impaling
-infirm
-krasnodar
-tobey
-mccrae
-zanu-pf
-mui
-logarithms
-loons
-topologies
-spandau
-zeitgeist
-waitakere
-rubicon
-shogi
-tyco
-impeach
-radio-controlled
-salesian
-1274
-jetfire
-maurier
-overcoat
-3,700
-kessel
-gris
-principled
-diller
-incipient
-iterated
-floodlit
-novelette
-93.8
-heian
-apologist
-canna
-distorting
-cockpits
-vena
-heliocentric
-shellac
-minty
-bernese
-mafiosi
-maruti
-mandurah
-1630s
-al-qaida
-pin-up
-eliyahu
-eller
-fixer
-sequestered
-dall
-malaise
-bathgate
-ostrogoths
-voicemail
-garnier
-1241
-plethodontidae
-sorrel
-letts
-lv
-inboard
-sebastopol
-lord-lieutenant
-dimitrios
-saxton
-trialled
-unrelenting
-ini
-bana
-+5
-cavanagh
-1482
-lewisville
-mathilde
-duped
-588
-pluck
-perversion
-keels
-kagyu
-mingle
-crappie
-standup
-boyband
-fenner
-prisoner-of-war
-superannuation
-prerogatives
-duplicating
-hiv-positive
-escondido
-876
-berwick-upon-tweed
-seawall
-teddington
-radu
-primavera
-multiparty
-screeching
-maximian
-hongkong
-0.15
-1268
-guilders
-shatters
-f.b.i.
-wanton
-tiamat
-secondarily
-reassures
-tamers
-letterkenny
-legitimize
-janesville
-sharpton
-playfield
-ntl
-wiggin
-bedingfield
-mattingly
-adrianople
-amaranth
-seibu
-1030
-petah
-constrain
-galba
-mcluhan
-batt
-all-rookie
-scots-irish
-sotheby
-salma
-dalby
-insatiable
-f6
-ice-cream
-mamas
-tani
-marengo
-murugan
-aas
-waldemar
-tomorrowland
-predominates
-indivisible
-hambledon
-karoo
-low-frequency
-clarksburg
-utters
-tractate
-bache
-abb
-achaean
-crediting
-shearwaters
-713
-folate
-thesaurus
-tanith
-obligate
-harewood
-inadmissible
-foresee
-milian
-farrah
-caritas
-lathes
-5500
-sawtooth
-jovian
-mattered
-omani
-hindmarsh
-peale
-velasquez
-u-23
-payson
-yeung
-kiedis
-circulates
-heterozygous
-93.2
-gautam
-reprises
-terrors
-sabc
-adirondacks
-ensigns
-melastomataceae
-dweller
-903
-ossetian
-vercelli
-xtc
-wisteria
-haggis
-animating
-metrobus
-shek
-vienne
-eyeglasses
-havant
-reapers
-alyson
-dfc
-painstakingly
-byelection
-nol
-conceals
-apollon
-newburyport
-antoni
-kms
-mccandless
-refrigerant
-compresses
-cassia
-waynesville
-end-of-war
-whitewash
-cordero
-soundscapes
-fredric
-semper
-allred
-sanctification
-ex-members
-gustafson
-fukuda
-654
-mst
-gru
-heim
-hattiesburg
-light-weight
-91.4
-40.9
-nikephoros
-levelling
-cognomen
-poindexter
-cajon
-danko
-dunwich
-ringgold
-undress
-panhellenic
-shari
-watchmaker
-buchenwald
-daud
-noblewoman
-far-flung
-nutt
-wrested
-ophthalmologist
-coalesce
-antioquia
-howler
-imprisoning
-dravid
-chard
-kovacs
-switchboard
-maharani
-embarkation
-electrolytes
-socratic
-conservatories
-abernathy
-spiraling
-1347
-talley
-xue
-bithynia
-crusading
-rewrites
-crouching
-apps
-fms
-tamagotchi
-hospitaller
-rnc
-unwavering
-kasem
-psychopath
-vientiane
-glans
-plums
-faversham
-batak
-pita
-ordo
-grower
-hollander
-orel
-colonizers
-wishart
-armadale
-plowman
-talkback
-victorians
-supercritical
-kensal
-jac
-syllabary
-yaroslavl
-fictions
-evaporates
-alcazar
-20,625
-reichenbach
-farringdon
-paracetamol
-nuova
-storefronts
-airtight
-tie-ins
-doin
-sayed
-roberson
-1469
-rulership
-k.c.
-rescuer
-pho
-mcmurdo
-snornas
-cranberries
-4-8
-perceptive
-mohegan
-broads
-multi-member
-sangre
-iambic
-mini-cons
-mcbain
-preludes
-wallsend
-718
-undetectable
-saranac
-quin
-plimpton
-drapery
-quintana
-absalom
-compasses
-chidambaram
-porthmadog
-chula
-laurana
-1199
-carolla
-chilcotin
-n.b.
-sassy
-overlain
-ingolstadt
-965
-theism
-huerta
-aintree
-disobeying
-buckshot
-1265
-eckhart
-omb
-place-names
-steelworkers
-andi
-arkwright
-input/output
-flintlock
-trs-80
-akuma
-brees
-bytecode
-meditate
-burl
-eco-friendly
-mks
-631
-al-islam
-dharwad
-fitzhugh
-luft
-railroading
-2-dimensional
-ningbo
-toynbee
-1401
-envious
-acker
-absolved
-tricolor
-duster
-qigong
-kraven
-mosquitos
-darken
-enlai
-coxed
-antiquaries
-superboy-prime
-nosferatu
-resurface
-anjali
-waxes
-kundalini
-low-speed
-charade
-champa
-entercom
-angevin
-adlai
-cheatham
-u-turn
-nootka
-estrella
-sfr
-nuke
-paulding
-chert
-``
-felons
-burleson
-confiscate
-harnessing
-heuristics
-deployable
-suffield
-hollingsworth
-grp
-hassett
-cataclysmic
-gamba
-do-it-yourself
-hannan
-goalscoring
-m.f.a.
-shadwell
-drucker
-interchanging
-microcomputers
-beos
-geordie
-electrolytic
-yash
-off-limits
-disloyal
-grameen
-4-yard
-sappho
-goldschmidt
-abrogated
-heighten
-vlsi
-venkata
-harun
-reacquired
-corel
-microelectronics
-bol
-backend
-attu
-1957-58
-652
-co-signed
-sek
-41.4
-crist
-unleashes
-pena
-malformation
-ochoa
-reversals
-flirtation
-outfitters
-re-organisation
-1303
-screech
-mitsui
-convertibles
-rasheed
-doped
-cobden
-3.33
-parasympathetic
-skirting
-maj
-illogical
-pheromone
-patera
-pon
-pocklington
-clots
-walk-off
-creedence
-jimbo
-pomo
-aggravating
-brews
-kyushu
-borax
-mahjong
-ubiquity
-cuttack
-al-sadr
-contractually
-clearinghouse
-moiety
-horoscope
-b&b
-bata
-ontologies
-utilisation
-chansons
-seabrook
-sherwin
-co-consecrators
-gowen
-distasteful
-seagate
-wellingborough
-darkroom
-bogue
-garwood
-chutes
-raith
-howlett
-dreamers
-fertilisation
-nines
-roused
-rhythmically
-b-29s
-chepstow
-subgenres
-haruhi
-cse
-audiobooks
-steyn
-jetix
-vliet
-conspecific
-portillo
-bundelkhand
-ishida
-crean
-toggle
-591
-cinerama
-plenum
-gymnures
-alana
-karakoram
-kondo
-delorean
-wright-patterson
-kintetsu
-khilji
-o'callaghan
-babbage
-flutie
-valiantly
-stratos
-inhaling
-pasig
-intravenously
-coax
-ripples
-embryology
-demented
-awd
-mccluskey
-windhoek
-weser
-mildenhall
-martine
-parsonage
-isambard
-salmond
-11-12
-1308
-bodega
-jenks
-leverkusen
-rika
-highest-charting
-abdul-jabbar
-a-list
-leonhard
-cellos
-miura
-biota
-vce
-gordo
-tinged
-berezovsky
-luthier
-pesaro
-fencers
-bugis
-augustan
-s.e.
-browder
-paulinus
-barberini
-lindberg
-arius
-opiates
-smp
-flagella
-573
-revivalist
-4-7
-seagoing
-provokes
-lindgren
-spurring
-drywall
-ambiance
-dbms
-keyed
-polymorphic
-hendrickson
-yuuzhan
-asylums
-troyes
-wyandot
-hashing
-qinghai
-katowice
-father-son
-pre-raphaelite
-lae
-crosscountry
-medan
-mcshane
-ismael
-aqha
-macworld
-truncation
-amour
-42.3
-ectopic
-orientalist
-kasi
-myrna
-anti-terrorist
-uncharacteristically
-sano
-petula
-contented
-fabricating
-volkov
-spurned
-aleksei
-peretz
-jigs
-hansard
-overalls
-ingest
-rhymney
-strong-willed
-entomological
-montefiore
-bef
-marrakech
-coincident
-malkovich
-spasm
-stadtholder
-cashew
-intransitive
-collapsible
-longitudinally
-soi
-1332
-hebert
-homogenous
-culprits
-nordland
-yank
-97.8
-relents
-kerensky
-dramatization
-reedy
-intranet
-anniston
-b-17s
-parenthesis
-geyer
-margarine
-detoxification
-pediatrician
-mews
-8am
-676
-clogged
-jean-louis
-saran
-tarlac
-northeasterly
-lamprey
-keck
-buchholz
-posit
-aldehydes
-rinehart
-short-tailed
-solway
-birr
-whalen
-well-meaning
-putsch
-megaman
-seca
-gora
-sub-mariner
-langdale
-embodying
-awdry
-adoptions
-cleethorpes
-toronto-based
-nagarjuna
-weasels
-shankara
-predominated
-barmaid
-bronchial
-rinaldo
-daviess
-anticonvulsant
-goalless
-ghaziabad
-sarum
-mds
-welt
-pavarotti
-siri
-95.6
-1425
-ripken
-gnosis
-stormwatch
-1377
-tavares
-proximate
-great-great-grandson
-immobilized
-gershon
-trotskyists
-scholz
-furrow
-dunsany
-8-track
-3.26
-40.6
-ligation
-colonizing
-rationalisation
-l'aquila
-non-union
-sterilized
-c-5
-matha
-reinterred
-cattaraugus
-stonehouse
-tua
-nona
-interagency
-cashman
-fifty-fourth
-96.6
-96.2
-875
-uf
-hypersonic
-state-level
-mckeon
-fanatics
-reforestation
-pizzeria
-toil
-joysticks
-quesnel
-killa
-backgammon
-fingering
-flamengo
-stepbrother
-pancreatitis
-nay
-grayish
-obote
-undercard
-gormley
-lead-in
-gusts
-caliente
-grasse
-harrisonburg
-confraternity
-supergrass
-westin
-marat
-vandross
-pro-independence
-seacoast
-ratna
-glassy
-egyptology
-mothballed
-firs
-cotswolds
-parkdale
-brazen
-mamba
-copra
-mammary
-kick-off
-928
-headroom
-petrovich
-lieber
-hillingdon
-planing
-tokushima
-chancellery
-anti-christian
-chez
-xa
-stretton
-criticality
-smartphone
-floodplains
-bluntly
-two-sided
-drug-related
-bremerhaven
-edifices
-adderley
-apr.
-revitalizing
-krug
-one-story
-mentorship
-chitral
-guerin
-lancs
-detonates
-humberto
-zygote
-defamatory
-machinegun
-cates
-bushman
-1435
-replying
-end-users
-funkadelic
-brunton
-107.3
-hearse
-lurid
-17-year
-willpower
-barbers
-epitomized
-sanyo
-gow
-denounces
-cloves
-taupin
-kor
-pterosaur
-elysian
-longbow
-titusville
-pentland
-hela
-decrepit
-hyperinflation
-t'pol
-radiography
-awol
-3600
-ratner
-presto
-didcot
-bleached
-conceivably
-globus
-conformance
-condiment
-upper-middle
-bitches
-boston-based
-narain
-een
-faro
-outtake
-absorbent
-alun
-strangler
-a.v.
-aeon
-minced
-panini
-speechwriter
-deg
-salton
-13-year
-minangkabau
-skinks
-befitting
-tonto
-2002-2005
-heavy-handed
-feller
-cupertino
-iconoclastic
-geospatial
-salutes
-subsidize
-rammstein
-yasmin
-topologically
-shipman
-warms
-outland
-pajamas
-cranks
-10-year-old
-dvorak
-hetfield
-1961-62
-nelvana
-719
-woodburn
-bendis
-manoj
-namath
-ferraris
-oxygenated
-embalming
-permeated
-ogres
-implicating
-holograms
-vibraphone
-3.29
-asmara
-nerissa
-khe
-australian-born
-akiyama
-kneale
-interoperable
-deftones
-sawing
-figuratively
-reapportionment
-volcanics
-afm
-qazi
-yanks
-thankfully
-tenures
-toponymy
-hec
-six-year-old
-decarboxylase
-wold
-ifr
-euboea
-njcaa
-niger-congo
-nouvelle
-thomason
-terme
-notifications
-strasburg
-perfusion
-eod
-vacationers
-sinuous
-single-game
-hating
-smeared
-101.3
-11am
-urinate
-tvr
-snares
-thicket
-conjunctions
-mota
-wallington
-parnassus
-well-drained
-gazing
-amok
-r8
-zick
-zanesville
-91.8
-switzer
-dba
-kehoe
-baud
-recuperation
-mcgann
-amniotic
-cli
-spook
-smalley
-saa
-break-in
-suzaku
-leathery
-barnacle
-timbuktu
-cso
-jello
-megabytes
-kennan
-nevill
-most-watched
-brownies
-perishable
-brm
-kurtzman
-ushering
-peeling
-subroutines
-npsl
-glazes
-bulkheads
-femininity
-mearns
-sop
-shoaib
-fittest
-repellent
-schuman
-107.7
-subtext
-t.a.
-pilbara
-polytheistic
-techtv
-bellsouth
-anthologized
-drs.
-plotlines
-placentia
-flor
-rincon
-tsunamis
-knightley
-694
-penistone
-stabilise
-xinhua
-faithfull
-laurels
-fernand
-stedman
-entitlements
-holiest
-awash
-gullies
-lorimer
-sickles
-dangerfield
-babs
-polychrome
-walkman
-aerials
-pillboxes
-sheerness
-scapula
-prosthesis
-morgana
-bloods
-corry
-wysiwyg
-morison
-encroached
-mcsweeney
-spout
-n3
-pml
-cbn
-seaworld
-fantail
-94.6
-ncp
-woden
-merry-go-round
-2002/03
-bolero
-hisham
-collis
-macklin
-freeland
-poul
-twenty-ninth
-220,000
-koblenz
-46,000
-victimized
-well-publicized
-daisies
-1988-1989
-thrall
-weekes
-jorgensen
-non-competitive
-dissuaded
-scottie
-sepultura
-authoritarianism
-qom
-underwing
-prato
-inigo
-strenuously
-uup
-2.18
-segregationist
-enduro
-dmus
-purplish
-special-purpose
-stol
-alkene
-maggot
-negating
-toshiko
-sandi
-refunds
-monarchists
-buttress
-ampang
-bivalves
-chyna
-ibs
-defraud
-ws
-dreyer
-finders
-tring
-gasses
-predacon
-laminar
-veolia
-wcl
-tule
-goldfrapp
-renaud
-imac
-stoll
-bicester
-macular
-arenabowl
-gliese
-elks
-gac
-docudrama
-nears
-pickard
-oudh
-encantadia
-hyperplasia
-casement
-minelayer
-lukashenko
-transsexuals
-fluttering
-grandest
-kingsford
-soundness
-bpa
-paratransit
-fora
-escapades
-interconnecting
-lbf
-no-confidence
-1389
-electrocution
-three-man
-moyers
-1472
-ayya
-roadmap
-old-style
-withstanding
-antoninus
-under-16
-dialectal
-apia
-gumbel
-tapioca
-scoops
-geeta
-hissing
-seligman
-indo
-brasenose
-zerg
-1312
-spinster
-nws
-coming-of-age
-pabst
-passchendaele
-dietz
-anastasius
-stallman
-memorization
-instigator
-toorak
-earthbound
-scimitar
-philippine-american
-heterodox
-wickes
-cohabitation
-shacks
-cleansed
-1964-65
-voyageurs
-kazimierz
-looe
-overflows
-transformational
-1147
-luiz
-marwar
-linearity
-mimico
-1325
-ichi
-n64
-paradiso
-re-designed
-galvanic
-bernadotte
-p4
-commutator
-kuroda
-scituate
-whey
-emmerich
-untested
-defections
-heaney
-tromp
-cleves
-1337
-macartney
-flier
-declination
-amjad
-mediates
-1386
-mccrea
-debre
-sarcoma
-simmonds
-witton
-bester
-coote
-git
-politicized
-wallach
-co-driver
-sixtieth
-kufa
-meri
-kissimmee
-functionalism
-1018
-mcenroe
-sacra
-haemorrhage
-pleasurable
-kabila
-billingsley
-1476
-colborne
-headphone
-winsor
-sich
-bauman
-yorkville
-maywood
-batavian
-caloric
-grunts
-cobbled
-lyricists
-deathstroke
-sukhothai
-mauritian
-benedictines
-balm
-tosca
-bodybuilders
-polytopes
-effigies
-medium-to-large
-parapsychology
-sentries
-romagna
-cusps
-gujranwala
-abutments
-ostensible
-krishnamurti
-lengthwise
-brownian
-ferraro
-reoccupied
-ewald
-samoans
-quinnipiac
-vang
-dryers
-calaveras
-internationalization
-profitably
-betjeman
-three-speed
-faltering
-pritzker
-formula_57
-commensurate
-prosthetics
-philbin
-trusteeship
-doobie
-multi-storey
-timescale
-ricard
-mutinied
-thomasville
-chambersburg
-chanter
-deliberative
-choujin
-resettle
-rehired
-rallidae
-pindar
-bogdanovich
-extraneous
-galesburg
-extensor
--DGDG.DG
-cleve
-christa
-caplan
-owensboro
-20-25
-nickerson
-shrimps
-classy
-1489
-c/d
-kinski
-bct
-nlrb
-koirala
-bushmen
-guises
-topsoil
-queried
-sheepdog
-vig
-shonen
-newly-established
-phenylalanine
-1m
-physiologist
-carib
-matson
-exhausts
-churning
-autobiographies
-toppling
-mcmurtry
-numbness
-aei
-brumbies
-auk
-fuca
-organically
-pka
-native-born
-ied
-dictating
-self-inflicted
-exorbitant
-musculoskeletal
-c7
-forty-nine
-putter
-horrid
-stockton-on-tees
-shg
-polymorphisms
-selfridge
-sylvain
-trapdoor
-functionaries
-subchannels
-graciously
-oryx
-cristiano
-olimpia
-quantitatively
-rosales
-sappers
-wiggles
-eugenie
-smit
-snorkeling
-wenlock
-fledging
-saurashtra
-mitsuomi
-walkin
-pillage
-canadian-american
-lydian
-second-oldest
-paterno
-indic
-washers
-substratum
-shephard
-deified
-southpaw
-sultana
-gunning
-bigby
-short-handed
-bullhead
-tanjore
-1399
-cheerfully
-skatepark
-walk-in
-on/off
-ignites
-masint
-lighters
-librettist
-hv
-genji
-flavio
-4:3
-653
-nothingness
-lantz
-commissariat
-paderborn
-sadhana
-706
-joondalup
-angiotensin
-allstars
-polytheism
-intracoastal
-blacksburg
-easts
-retooled
-matlab
-delves
-cz
-moyles
-one-eyed
-duchovny
-philologist
-redknapp
-badfinger
-four-speed
-neotropical
-veronese
-well-off
-rumford
-macapagal
-885
-1263
-omo
-heinie
-branigan
-cautionary
-mohanlal
-edgeworth
-painkillers
-kuril
-armorial
-1660s
-pardee
-appendices
-eca
-wgbh
-1987-1988
-schist
-second-degree
-parley
-modell
-657
-linde
-retort
-ramachandran
-bellarmine
-atone
-keogh
-wildman
-loaves
-albarn
-bidwell
-daihatsu
-heinemann
-fiefdom
-2001/02
-clapper
-yogic
-extractor
-bogdan
-calvinistic
-sambar
-gulfport
-recharged
-ever-growing
-earner
-pernicious
-two-legged
-iskandar
-abenaki
-goulding
-malformations
-prejudiced
-tarkovsky
-timorese
-stereotyping
-modularity
-marxian
-acheson
-drexler
-dogfight
-adria
-extrapolation
-seahorse
-potted
-c-17
-2007-2009
-epg
-minimization
-grafts
-callous
-clarks
-somerton
-workouts
-four-issue
-marillion
-1385
-6500
-rmc
-f-16s
-squids
-jvc
-lunga
-deckers
-mangaka
-jean-michel
-berthing
-arcot
-psychics
-woodcuts
-tajiks
-cally
-lar
-nie
-voce
-meteoric
-florist
-bootlegged
-1235
-boehm
-fabricate
-subang
-musics
-matra
-near-fatal
-pandyan
-pandyas
-prosody
-baugh
-nore
-saratov
-uruk
-litigants
-pasted
-impreza
-bevin
-molyneux
-rochefort
-leeway
-suffocation
-dazzler
-millen
-emilie
-invoice
-siddhartha
-vfr
-trainsets
-christianized
-pola
-warrnambool
-pliers
-second-level
-akiko
-masato
-pevsner
-hosiery
-pulsating
-next-door
-wronged
-subjectively
-dells
-pth
-saharan
-dlc
-emitters
-narayanan
-mixtapes
-elucidated
-darbhanga
-1202
-harmonize
-bangladeshis
-724
-viceroyalty
-piedmontese
-tetrapods
-liturgics
-renard
-waxwings
-conus
-plowing
-resemblances
-cassavetes
-goodison
-nansen
-maximized
-muda
-raspberries
-offs
-rounders
-autocephalous
-glorify
-tartarus
-nisan
-alloa
-scanty
-hijab
-precipitous
-6502
-1227
-counter-offensive
-yeerk
-sheva
-borderlands
-necessitates
-damansara
-mond
-alexandrovich
-ashdod
--8
-aqaba
-normanton
-clandestinely
-server-side
-lettres
-584
-brodsky
-gio
-896
-megara
-1650s
-wimborne
-medes
-scolds
-off-line
-b.g.
-fitter
-kellner
-perches
-aulus
-recklessly
-student-teacher
-uniontown
-fagin
-south-southwest
-facelifted
-slimmer
-marcher
-takeovers
-universiti
-insurmountable
-ius
-ditched
-labia
-user-defined
-isham
-exp
-stomping
-lusk
-palpatine
-glas
-nala
-transvestite
-nayaka
-tracery
-etymologically
-hampson
-infomercial
-netanya
-gregson
-menudo
-crammed
-gogo
-burge
-rootes
-berwickshire
-thereon
-soundboard
-benet
-econometrics
-jarrow
-ardrossan
-extricate
-nine-ball
-tarpon
-xyz
-eschewing
-unsung
-busking
-salas
-co-authors
-symington
-softcore
-giri
-sub-zero
-aicte
-gestational
-gatt
-railhead
-mid-sixties
-creditable
-miss.
-sprinkle
-murphys
-str
-muds
-samford
-angiosperms
-pompano
-naas
-asphyxiation
-cantabria
-left-hander
-chita
-rapprochement
-sayer
-vasudeva
-d'alene
-rhodri
-wor
-stagnated
-shovels
-civics
-clog
-longley
-nicki
-iommi
-codice_10
-welling
-meshes
-high-class
-1963-64
-bci
-kazuki
-micrometers
-verwaltungsgemeinschaft
-umpqua
-stifling
-orcas
-real-valued
-dimethyl
-beyonder
-suckling
-okeh
-bookmaker
-16:9
-siculus
-arusha
-post-nominal
-transpersonal
-moult
-a40
-gurudwara
-burghley
-backus
-ultravox
-deplorable
-centipede
-ingrained
-apennines
-rima
-cleon
-tallaght
-40.2
-danner
-loomed
-ansel
-bure
-ecclestone
-glories
-vfb
-luria
-1954-55
-dalles
-pullo
-envisions
-kusanagi
-39.8
-asti
-jetstream
-hankey
-divisione
-nozick
-carrillo
-100.9
-moorgate
-japanese-american
-formula_58
-edwina
-sequoyah
-clerestory
-jubal
-umwa
-omsk
-dance-pop
-brokaw
-nellis
-catalans
-plebeians
-underserved
-fal
-stutter
-u.n.c.l.e.
-cloaked
-rudders
-guizhou
-tax-free
-mandan
-taha
-ch3
-solanum
-loess
-book-length
-idiomatic
-archeologist
-smartest
-dw
-plaines
-browner
-college-level
-pharma
-zuni
-nofx
-acland
-beamed
-wag
-badajoz
-macaques
-exacted
-bertolt
-martell
-underline
-manes
-benchmarking
-1,500,000
-segmental
-recesses
-bossier
-hardworking
-wrangling
-michelson
-zooms
-instructive
-agave
-copulation
-morpheme
-auc
-starlet
-remaster
-definable
-snorkel
-msps
-zhuo
-shipbuilder
-moslem
-a8
-maul
-systolic
-paar
-formalities
-egyptologist
-timekeeping
-burgas
-priestesses
-myrrh
-coterminous
-29,375
-orientalis
-1302
-pacemakers
-duopoly
-tie-breaker
-mccarthyism
-bhagavata
-demesne
-bonar
-splitter
-donn
-petting
-abwehr
-jitter
-azhar
-rasmus
-const
-vagrants
-netaji
-all-england
-transference
-upminster
-breakfasts
-hd2
-broadwater
-gorda
-abydos
-l-shaped
-coutts
-jugglers
-depravity
-maniacs
-botham
-equalling
-nob
-sympathized
-pontius
-abolishment
-aniston
-spangler
-back-end
-poltergeist
-sme
-castell
-metatarsal
-renshaw
-newsagents
-gluck
-iglesia
-moorhead
-zayd
-colloidal
-issaquah
-qcd
-kurita
-maquis
-kachin
-956
-centrifuge
-1226
-wilts
-heerenveen
-hypertrophy
-sessile
-hargrave
-hieronymus
-hoxha
-tera
-countermeasure
-sittings
-culling
-embellishments
-mogwai
-batcave
-havering
-njsiaa
-1285
-nys
-diptera
-kenora
-rossendale
-gravestones
-concerti
-fitzgibbon
-valeri
-stifle
-kangxi
-five-year-old
-cubism
-shelving
-top-five
-panelling
-birdwatching
-en-route
-1292
-sandwell
-laxman
-uncompressed
-co-presented
-www
-lenticular
-manmade
-poitier
-1405
-lilo
-cafeterias
-leprechaun
-hesitates
-,6
-lionsgate
-conceiving
-continuations
-1111
-12-0
-icj
-nods
-hairdressers
-ptv
-walz
-nieuwe
-cto
-fowey
-kirke
-5.30
-eyck
-buttermilk
-bizerte
-nevin
-corydon
-lys
-keren
-pre-trial
-pld
-asaph
-915
-cannibalistic
-philharmonia
-unsurpassed
-harmonized
-frigid
-executioners
-orchidaceae
-yisroel
-eckert
-contraindications
-kiley
-chemung
-3.28
-wallenberg
-circumvented
-tablelands
-nikaya
-efi
-schalke
-follicular
-dingwall
-solvers
-dubliners
-howlin
-sprinters
--9
-reneged
-1075
-smethwick
-sleepless
-boylston
-safekeeping
-noli
-ade
-bardstown
-win32
-prestwick
-rugrats
-infestations
-orators
-eigenvectors
-tipp
-nicolae
-teleporting
-subclasses
-wfm
-cuzco
-dreamtime
-foggia
-30-yard
-mib
-thermostat
-applet
-albatros
-blunders
-elland
-zvezda
-postdiscal
-colburn
-sandbar
-violators
-puckett
-variances
-manly-warringah
-wedged
-massie
-1459
-afs
-rer
-re-published
-surcharge
-bearcats
-pirelli
-pariah
-1305
-maltin
-gowrie
-non-technical
-all-seater
-sharepoint
-coahuila
-oru
-aswan
-douglas-fir
-east-southeast
-intermedia
-3,000,000
-flannel
-kennard
-showered
-operand
-unreliability
-ably
-thang
-aleutians
-sino-soviet
-bodo
-cookware
-nagi
-occultism
-lactation
-re-titled
-firebrand
-bemis
-great-nephew
-hughie
-basalts
-unscheduled
-seabees
-doren
-camacho
-ponca
-top-tens
-miscarriages
-mallets
-reportage
-coarser
-seely
-hhs
-heads-up
-alsop
-tolliver
-genosha
-anatomist
-snub
-breezy
-neiman
-pre-order
-patricks
-christiana
-canadair
-wgn-tv
-killah
-962
-1375
-j.k.
-headstones
-mannerheim
-growths
-oxidize
-detested
-cukor
-three-volume
-collett
-roti
-cann
-panipat
-pigmented
-rabi
-maharana
-barratt
-2.15
-92.6
-prankster
-kung-fu
-prolifically
-tirade
-zemeckis
-768
-affidavits
-jest
-castellaneta
-touchscreen
-molesworth
-plasmas
-bava
-giotto
-gresley
-v8s
-sof
-ditton
-sharpshooters
-dungeness
-strategists
-arnott
-adamantly
-erinsborough
-lalonde
-hephaestus
-98.4
-lascelles
-massing
-gault
-nevers
-pembina
-morricone
-bharatpur
-nutter
-smallmouth
-tigger
-569
-26,875
-672
-waistcoat
-cromwellian
-chivalric
-bexar
-stews
-god-like
-carbon-carbon
-bridgnorth
-acquiesced
-violinists
-al-mu
-bottlenecks
-colic
-knowlton
-blurb
-adjoined
-ssris
-punishes
-rekha
-inyo
-melvyn
-pterosaurs
-weisman
-countywide
-adak
-tracklist
-disengaged
-spurrier
-lignin
-rhesus
-castrated
-wethersfield
-bca
-rehoboth
-lemony
-forthright
-wapping
-law-enforcement
-rampaging
-nine-year-old
-campy
-zealot
-leclerc
-mildew
-95.2
-kaori
-integrable
-arginine
-experimenters
-commendations
-nisbet
-espousing
-mahadev
-hussar
-swipe
-hateful
-eliade
-opts
-pasok
-ket
-runyon
-galatians
-utada
-outscored
-ad&d
-datsun
-late-20th
-gymnastic
-apostolate
-spearheading
-alkmaar
-pitfalls
-parasol
-cliques
-utf-8
-indo-iranian
-hotham
-pietermaritzburg
-hus
-palmar
-seventy-two
-lignite
-clubbing
-satmar
-speke
-homilies
-auchinleck
-lipoprotein
-uvf
-courteous
-haughty
-right-back
-immunological
-carbons
-anxieties
-eder
-kasey
-krabs
-warrenton
-coffers
-buganda
-inextricably
-cumin
-strangling
-dysart
-austral
-aramco
-warminster
-ocr
-uwa
-gryphon
-frugal
-dearth
-ona
-cross-town
-tigranes
-demiurge
-745
-self-destructive
-tri-nations
-fifty-third
-neurobiology
-carnivora
-macaw
-rosettes
-striping
-abn
-kwong
-msf
-rearranging
-haman
-attributions
-goldsboro
-narva
-18s
-wittelsbach
-meldrum
-scp
-sauropods
-seitz
-mulan
-maldivian
-one-handed
-ciliary
-zoids
-knowsley
-everson
-child-like
-phoenicia
-storyboards
-reusing
-ordinals
-swamiji
-martineau
-abbe
-gelatinous
-tatra
-dismemberment
-tuple
-clamping
-perturbed
-reshape
-sgml
-clearest
-brierley
-resellers
-waffen
-uzbeks
-tatyana
-u19
-throckmorton
-1272
-quasars
-asura
-veils
-648
-wind-up
-bakunin
-civilisations
-brainstorm
-boeotia
-crispy
-schoolmates
-pineville
-valdivia
-811
-mauled
-regimens
-saakashvili
-troubleshooting
-bandera
-operands
-deputation
-bhaskar
-ruddock
-hdl
-saboteurs
-aonb
-hardiness
-drenthe
-pullen
-menswear
-capitoline
-jak
-learjet
-obasanjo
-appreciates
-jinan
-bukhari
-wuxia
-formula_60
-1060
-protester
-semifinalists
-tamura
-669
-semisimple
-h4
-haliburton
-opc
-lapel
-conduits
-r.d.
-joule
-gape
-grt
-inverclyde
-klingons
-lod
-firebirds
-1484
-earthworm
-bucking
-4-track
-trichy
-gt3
-kangra
-tabular
-waratah
-typographic
-udine
-40.8
-sasso
-cleans
-tapers
-suspiciously
-raycom
-brookwood
-nowak
-integra
-pvr
-ris
-leaded
-roark
-o'shaughnessy
-luba
-collides
-1441
-hamptons
-gravano
-state-funded
-2000/01
-brightman
-spider-woman
-hydrazine
-troon
-assr
-ebbw
-kink
-farmhouses
-conflated
-rothman
-dreadnoughts
-1298
-dengue
-government-in-exile
-obelix
-evolutionarily
-leben
-r.e.
-dien
-arta
-ejector
-eleonora
-stanmore
-frescos
-geass
-pearly
-late-1980s
-newby
-haigh
-enchantress
-itanium
-re-located
-welshman
-1-7
-confuses
-satyr
-mulatto
-dvb
-fitzalan
-appleseed
-1356
-silliman
-alu
-mobiles
-boycotting
-blackmails
-clump
-aldi
-tincture
-glenbrook
-haldimand
-cga
-oscillate
-occultist
-bestows
-twining
-monogamy
-refueled
-reciprocated
-discos
-snodgrass
-kunst
-folkways
-proliferate
-shopkeepers
-smokies
-feelin
-goodhue
-samtgemeinde
-rsi
-sachem
-stockhausen
-js
-moir
-westbourne
-hosokawa
-ventilator
-dauntless
-90.6
-sauter
-d'amore
-pimps
-bateson
-fetching
-hard-line
-waratahs
-1388
-multiple-choice
-smurf
-camperdown
-uneconomic
-reiser
-bluesy
-30-day
-proboscis
-folktales
-overbrook
-horan
-statham
-mid-west
-negri
-toney
-chrysanthemum
-reps
-sulfides
-sindhu
-weta
-gossard
-emmons
-conant
-rivington
-walk-on
-chiu
-downtrodden
-tsinghua
-mobb
-lubin
-o'meara
-shrunken
-mg2
-saarinen
-splashing
-classifier
-voles
-user-friendly
-burwell
-hoxton
-impediments
-wiggum
-basemen
-greyfriars
-ferrets
-egret
-kio
-resistivity
-lanchester
-castlemaine
-ogdensburg
-sankara
-uli
-24,375
-agribusiness
-ercole
-lymington
-alda
-mucosal
-mclain
-bax
-marianna
-narcissism
-fucked
-brevetted
-avestan
-medica
-swank
-immortalised
-zinoviev
-sutta
-230,000
-line-of-sight
-cements
-fruity
-lune
-strictures
-opp
-penner
-iran-contra
-marksmen
-monsoons
-wordperfect
-toluca
-smurfs
-high-grade
-downforce
-28,125
-c-span
-a-ha
-nanette
-paganini
-faulting
-holbein
-syncretic
-conveniences
-shivers
-jumpstart
-muhajirs
-1947-48
-roden
-tynwald
-cayenne
-kenobi
-gali
-ramstein
-caisson
-childe
-russellville
-simonson
-witney
-u.s.s.
-665
-thalia
-harshness
-meeks
-8-2
-hemispherical
-suu
-firecracker
-92.2
-perlis
-spoiling
-thirty-first
-maur
-inspector-general
-goodson
-karn
-barnegat
-caray
-caxton
-uralic
-bookmarks
-corte
-ruffed
-ashton-under-lyne
-enchanting
-centrality
-conquers
-engelbert
-matchmaker
-saybrook
-recumbent
-stane
-1090
-utama
-government-run
-superconductors
-andrus
-sorely
-kava
-surfboard
-brindisi
-waxing
-burdett
-hinsdale
-eisenberg
-delve
-longo
-usisl
-hominids
-coldly
-decked
-epistemic
-musial
-conspire
-ductile
-animism
-depolarization
-lumberjacks
-jonsson
-bhishma
-soundstage
-ballparks
-levett
-terrorize
-gnostics
-ln
-haveli
-ayla
-amyloid
-illuminates
-hindman
-rajputana
-gyllenhaal
-100-yard
-avar
-gastonia
-minorca
-mendenhall
-balked
-cisneros
-52,000
-bada
-befell
-3x
-adm
-ojai
-aiel
-zig
-oakleigh
-reiki
-satara
-famitsu
-flatulence
-funhouse
-haplochromis
-populism
-blamey
-antiretroviral
-latinized
-ferric
-underscored
-grey-brown
-cavanaugh
-mashhad
-blackbeard
-seamanship
-baits
-duns
-gongs
-olof
-chitra
-cable-stayed
-8-9
-japonica
-fedayeen
-leadoff
-1507
-notre-dame
-convalescent
-condiments
-rutherglen
-ambivalence
-lorelei
-krynn
-dykstra
-matterhorn
-2003/2004
-eyebrow
-overtone
-steeleye
-sledgehammer
-predominating
-1277
-sub-tropical
-12-string
-a.t.
-brahmana
-lifter
-tufted
-harringay
-irreparable
-hannigan
-droids
-arrayed
-pella
-ironstone
-ligurian
-18-month
-11-year-old
-carbonates
-citytrain
-3.50
-locator
-lis
-afton
-anisotropic
-revel
-carlingford
-lolo
-klinger
-fifty-one
-fibroblasts
-cavitation
-exaltation
-petrochemicals
-preble
-rpo
-tcc
-scrapyard
-mendota
-sensu
-eben
-parabola
-sunroof
-bast
-teflon
-beaverbrook
-double-deck
-rudge
-seria
-worships
-wester
-superbly
-batson
-class-a
-nightwish
-oat
-motet
-kikuchi
-mega-city
-errands
-riki
-fulfilment
-sufficiency
-likens
-hammill
-sais
-plainview
-drunkard
-682
-andronicus
-misdemeanors
-upper-level
-gandy
-hast
-aftab
-rapporteur
-hajime
-peerless
-colle
-counter-reformation
-skiff
-copepods
-y-dna
-braced
-corbet
-starke
-mistral
-straddled
-morgenstern
-allstate
-ish
-1451
-alberni
-ess
-hopwood
-jud
-frobenius
-baudelaires
-basilisk
-expropriation
-kalman
-ethniki
-viti
-henryk
-mestizos
-subregion
-interchanged
-pentecostals
-preying
-1099
-atria
-transpennine
-taiji
-talker
-jeffers
-emboldened
-daemons
-lucha
-arachnids
-chronos
-deadpan
-regressive
-jas
-mid-morning
-b-1
-llamas
-covina
-civet
-rapidity
-pleasantly
-balkenende
-eurofighter
-awhile
-derbies
-leh
-seamstress
-neopaganism
-purefoods
-balding
-labial
-curragh
-accumulations
-leadville
-tampico
-fourth-largest
-botvinnik
-conditioners
-reinterpreted
-bhavani
-springhill
-flickr
-hants
-adl
-kodansha
-henshaw
-vico
-angelis
-jani
-warr
-leeches
-gabba
-kingsland
-man-thing
-catastrophes
-tarnovo
-purim
-3:1
-gotland
-zydeco
-newsstand
-gruffudd
-more-or-less
-photoelectric
-waitangi
-mymensingh
-rajkot
-renoir
-uribe
-matsuda
-s.m.
-sus
-manas
-westville
-762
-novartis
-sephiroth
-round-up
-uwf
-arthropod
-handbooks
-bushfire
-mitotic
-pyro
-landrieu
-anza
-saskia
-speakeasy
-shred
-reservist
-rashad
-pluralistic
-reprogrammed
-ochraceous
-eth
-gorizia
-mataram
-noguchi
-angiogenesis
-1424
-coffeehouse
-scrutinized
-waldman
-mantel
-outliers
-earth-616
-wands
-temecula
-playmaker
-five-eighth
-stipulate
-taney
-qos
-resection
-tics
-glaad
-anas
-mimosa
-lasky
-skillet
-mmp
-cantus
-inter-school
-lintel
-1219
-lhs
-necked
-cinco
-qe2
-maitreya
-melanchthon
-572
-662
-waheed
-ballooning
-sandberg
-mackaye
-jma
-vidarbha
-humphry
-7500
-capers
-yeerks
-cartoonish
-potemkin
-gorse
-mose
-583
-christiane
-casas
-combats
-ela
-karol
-pre-dates
-bideford
-fluted
-mourns
-ising
-mantegna
-thenceforth
-lumpy
-10-7
-masami
-1300s
-ultimo
-yoghurt
-silverchair
-kleine
-bridport
-jetblue
-cunha
-voight
-consigliere
-belarusians
-varuna
-cooker
-catenary
-utero
-reconfiguration
-demarcated
-wetting
-collette
-innervated
-scholes
-strolling
-walling
-zwolle
-dmu
-rundle
-grebe
-beckford
-hirohito
-cunninghame
-seiko
-sigh
-23d
-96.8
-egress
-woodblock
-anak
-clemons
-cardassian
-texas-based
-diorama
-schwimmer
-bulges
-paltz
-gbit/s
-scindia
-interlinked
-carpi
-jong-il
-a-18
-yeasts
-bashar
-grandmasters
-disappoint
-caecilius
-nfs
-apalachicola
-+DG.DG
-theorizes
-cronenberg
-s.p.a.
-uninjured
-wodonga
-krazy
-944
-tailplane
-spruance
-unscientific
-aryl
-rolando
-buggies
-forages
-caltrans
-hucknall
--7
-transylvanian
-secularization
-pompidou
-1477
-pippa
-takoma
-depositors
-mid-80s
-breviary
-.303
-1400s
-mid-2004
-pivoting
-undifferentiated
-languedoc
-hulu
-hercule
-mpls
-florio
-two-minute
-fennell
-rebates
-bittern
-badami
-propylene
-oscar-nominated
-falaise
-nerdy
-morotai
-riverhead
-screwball
-helier
-orchestrating
-bacolod
-takedown
-faintly
-elegantly
-cllr
-context-free
-ghul
-pandey
-watersports
-active-duty
-hdmi
-uday
-22-year-old
-emap
-toons
-jehan
-megawati
-olympiakos
-thine
-canonically
-vineland
-down-to-earth
-miyuki
-yudhoyono
-ankh-morpork
-whomever
-cellini
-buscema
-1054
-800m
-forecaster
-dower
-misha
-lunenburg
-noriko
-sapotaceae
-hutchings
-non-indigenous
-amer
-iwate
-osteoarthritis
-narrate
-47,000
-antitank
-shingo
-pints
-speedometer
-dobro
-concedes
-dfl
-quadrupled
-wavefront
-retorted
-43.5
-breathes
-shinobi
-dimitrov
-spacer
-viol
-1464
-churn
-eure
-mineola
-ravenswood
-polymeric
-e.c.
-facie
-handbag
-montcalm
-constricted
-longshore
-lamarr
-akshay
-laswell
-mander
-sloths
-kirchner
-ba'ath
-44.1
-romano-british
-walmart
-rona
-935
-seagram
-snrna
-arora
-grated
-dugdale
-forcetoc
-1361
-1284
-peyote
-three-quarter
-ovate
-joubert
-formers
-varley
-heathfield
-cantatas
-maglev
-41.8
-phat
-foregoing
-cul-de-sac
-skirmishers
-socio-cultural
-counter-strike
-amu
-oma
-transparently
-lessing
-bluegill
-kauravas
-kroq
-g.o.
-post-playing
-sassanids
-saltzman
-urban-rural
-decontamination
-ripen
-stiffened
-burj
-kof
-mk2
-crudely
-telex
-longman
-materialise
-neeson
-anisotropy
-penitent
-executors
-b-movies
-instigating
-sada
-fluvial
-bega
-conflagration
-amalgamate
-moonee
-adjuvant
-2001-2004
-mykola
-weft
-re-worked
-urbanised
-hoke
-pacified
-nuthatches
-564
-eccentricities
-non-academic
-monferrato
-pula
-mccay
-wanderings
-pcf
-worst-case
-lupo
-subjection
-1067
-721
-esperance
-chequered
-vel
-lubricating
-dazzle
-polarizing
-imps
-patroness
-toddy
-chunky
-wash.
-bronstein
-stoop
-mujib
-guidebooks
-lydon
-arousing
-bandon
-unplayable
-bleeds
-fes
-podocarpus
-persevered
-heretofore
-cri
-destabilize
-mischa
-compensates
-g'kar
-eretz
-synonymously
-hemings
-didsbury
-chukchi
-impeding
-tote
-reimbursed
-glaring
-1198
-biel
-favoritism
-sprinklers
-ironwood
-keita
-fillers
-zastava
-1473
-triglycerides
-how-to
-b-26
-1266
-leys
-josquin
-bou
-blooded
-caltrain
-pelton
-marcellinus
-brahe
-funnies
-i-84
-ck
-warnock
-euphonium
-overstated
-swordsmanship
-abatement
-audacity
-counterattacked
-periodicity
-1279
-payoffs
-doty
-thalberg
-metrostars
-forearms
-abdulaziz
-urinating
-eia
-solicitor-general
-brooch
-great-grandmother
-bardot
-pantomimes
-auditoriums
-rtp
-abounded
-freefall
-tomsk
-97.6
-skimmers
-umbra
-informers
-deca
-87.5
-1998-2001
-gradius
-mah
-673
-preservationists
-takagi
-chrissy
-non-aggression
-giraffes
-pna
-anhydride
-barbarism
-mobius
-greening
-forecastle
-cleomenes
-eldred
-recapturing
-babblers
-pisgah
-rpc
-firmer
-gearboxes
-blindfold
-draupadi
-melanogaster
-chastised
-maeve
-suda
-centrum
-hygienic
-psychotropic
-exon
-malignancy
-jeopardize
-sgc
-retaking
-mangled
-wankel
-dinky
-abstaining
-quench
-counterproductive
-dept
-cataloging
-kingstown
-wearable
-exclaiming
-bridgeton
-hoplites
-judean
-massawa
-cluj
-earache
-rockne
-doubting
-neverwinter
-depopulation
-focke-wulf
-686
-stonehill
-derisively
-sobers
-hayter
-arrowheads
-akilam
-uncontrollably
-horrendous
-hafez
-hyrum
-non-selective
-wisbech
-1439
-zhongshan
-granitic
-wardle
-wrinkle
-doak
-crestwood
-upham
-ennio
-thwarting
-bork
-hieroglyphics
-forty-third
-kora
-tambo
-phonographic
-haplogroups
-midfielders
-idler
-coalfields
-hard-boiled
-unbeatable
-tolerable
-soloveitchik
-brubaker
-abounds
-francisco-based
-eight-year-old
-l'arc
-dunhill
-squamish
-avanti
-keroro
-dramedy
-maven
-volleys
-saas
-f-zero
-operandi
-pressurised
-bygone
-half-dozen
-vosges
-telenor
-haldeman
-atheistic
-bayfield
-wyclef
-mickie
-ashlee
-zacharias
-intersex
-11-year
-halsted
-stoneman
-20-minute
-northallerton
-doane
-lazlo
-glandular
-ingo
-telefilm
-microwaves
-sokoto
-dawood
-re-creation
-precept
-backlot
-tibeto-burman
-witten
-gort
-protrusion
-habana
-mariel
-genial
-2009-present
-thirty-third
-spee
-waterlogged
-reals
-encino
-rearrange
-yams
-moyle
-wondrous
-headbangers
-maxentius
-tethys
-protrudes
-bakerloo
-frigatebirds
-zany
-bamber
-utley
-occidentalis
-howes
-trilobites
-newsstands
-3-point
-oyama
-gsi
-kashgar
-semigroup
-showgirl
-ioannina
-806
-backseat
-physiographic
-cheka
-39.9
-rit
-bba
-branco
-mummified
-bunnymen
-sanh
-counter-intelligence
-41.1
-folios
-ailsa
-haaretz
-evgeny
-nostril
-longshot
-sentral
-roa
-memons
-4-4-0
-lafitte
-bethpage
-scylla
-buoyed
-endow
-cosenza
-camarillo
-gramm
-dormer
-uzi
-taal
-cibber
-tuanku
-second-year
-coffman
-stopper
-cuculidae
-98.2
-steinitz
-sectarianism
-toth
-azaria
-eight-hour
-litton
-netflix
-fjords
-cuvier
-epigrams
-metros
-jousting
-harlingen
-urbe
-p.l.
-polymath
-nagy
-ridder
-0.07
-arkhangelsk
-valuations
-electrics
-tumbler
-excavators
-43.8
-tiburon
-re-mastered
-outperform
-ween
-blowers
-42.6
-thi
-glick
-rorschach
-quakes
-dalziel
-wario
-hydrobiidae
-avenida
-zeroes
-run-in
-sprouting
-foodservice
-whittingham
-daimler-benz
-alita
-yellowhead
-istana
-louvain
-dioxin
-fido
-spirou
-1254
-eagerness
-clea
-maxims
-helplessly
-1323
-scooped
-normand
-endoscopic
-berryman
-mckeown
-dictation
-wedderburn
-kingsport
-qld
-agusan
-hipparchus
-adorns
-didi
-sabotaging
-canadensis
-petter
-trigg
-undated
-disassembly
-interfacing
-whaley
-chutney
-c&g
-mosfet
-palli
-droplet
-bek
-stebbins
-&
-manheim
-linseed
-cuticle
-inverting
-proselytizing
-mazarin
-bustle
-imelda
-linklater
-entitles
-gaea
-cheesy
-98.0
-rimbaud
-joost
-d'souza
-rsfsr
-stasi
-yekaterinburg
-falter
-suppresses
-piledriver
-impoundment
-sankey
-kurnool
-creswell
-bap
-parkside
-hatta
-subhas
-econometric
-dimmer
-mircea
-ponders
-chemokine
-flatten
-whetstone
-copywriter
-fannin
-roosting
-unhindered
-dsi
-dunwoody
-aku
-canarsie
-holger
-1246
-multi-racial
-haileybury
-whitford
-lz
-deportes
-heil
-fairweather
-deeming
-co-existence
-quads
-cerf
-takeuchi
-contravention
-teramo
-macduff
-amityville
-mangum
-constrictor
-lexi
-stymied
-catterick
-tomko
-accosted
-ernestine
-1326
-insensitivity
-fifty-second
-genotypes
-crossrail
-unavailability
-radioed
-tyrrhenian
-jomo
-hypnotism
-geysers
-finkel
-furlough
-blackett
-592
-cmj
-magnesia
-bruni
-drummed
-tsung
-pkk
-hecker
-stench
-impresses
-karts
-garlands
-41.6
-a12
-castaway
-gash
-jha
-pharmacokinetics
-ij
-tableaux
-raffle
-disengage
-lifes
-holster
-yatra
-arecibo
-psl
-uncharacteristic
-singin
-konica
-bharti
-23-year-old
-begg
-intrepidity
-patan
-hoh
-codebase
-monoamine
-arion
-ch-oh
-c'est
-malan
-inclines
-incidences
-objectivist
-gratuitous
-dfb
-libyans
-disembarking
-thirsk
-sorel
-bosque
-lyall
-campanile
-limoges
-al-malik
-vardar
-1-12
-mallett
-symbolised
-abyssinian
-ghq
-calliope
-procreation
-tranquillity
-dwindle
-tonya
-chlorinated
-radioisotope
-turnip
-two-run
-otaku
-cassell
-siti
-zig-zag
-nci
-sivan
-firmness
-macdowell
-nobile
-tarragona
-rockman
-hangers
-srivijaya
-nth
-devalued
-theorize
-microcosm
-rhyolite
-1496
-hersey
-grits
-christology
-1447
-affix
-41.9
-nilpotent
-pommel
-purine
-moebius
-prophylaxis
-wetmore
-sourceforge
-turnstiles
-muscicapidae
-apcs
-timaliidae
-courthouses
-scharnhorst
-shotokan
-subspaces
-fluidity
-sula
-perpetuating
-unreported
-higginson
-melendez
-erna
-keble
-all-party
-pus
-venkateswara
-ife
-semis
-first-line
-contaminate
-kempton
-sbi
-1020
-instilling
-demeo
-gangland
-longchamp
-kurtis
-biotic
-hand-written
-incessantly
-self-centered
-saint-germain
-overrule
-slc
-lapis
-16-year
-ganassi
-0.06
-corley
-paphos
-treadmill
-damning
-dumbo
-martius
-disputing
-c.a.
-jaspers
-intruded
-deform
-1218
-zechariah
-fifty-fifth
-wightman
-satoshi
-sheedy
-takhisis
-stengel
-wiesel
-snoqualmie
-cached
-horwich
-benefices
-seven-day
-costanza
-polyvinyl
-stepsister
-1382
-enticing
-optimised
-pathet
-akai
-mcbeal
-frenzied
-oam
-icehouse
-naha
-under-19s
-stortford
-1283
-sackett
-arpa
-heritable
-clung
-sprained
-baath
-confounded
-sepoy
-linesman
-erudition
-hermaphrodite
-decomposes
-hydrate
-sigourney
-waterbirds
-venereal
-huddle
-quasimodo
-habibullah
-polycarbonate
-gatti
-neurologic
-comilla
-sutlej
-ack
-lorre
-minuet
-meissner
-gemayel
-hopefuls
-97.2
-turnbuckle
-captivating
-shani
-farrington
-moultrie
-balikpapan
-kwun
-conceit
-giroux
-townes
-synthesizing
-contrabass
-rimfire
-jyoti
-mikhailovich
-10,500
-partaking
-josip
-sweyn
-newhaven
-health-related
-ramblin
-toppings
-elspeth
-87.7
-leng
-cavour
-w.d.
-crosbie
-dfs
-condolences
-lavatory
-reassert
-formula_61
-726
-vengi
-kamil
-smirnov
-wm.
-palaus
-stirlingshire
-mstislav
-developmentally
-priam
-chicagoland
-impotent
-spoonful
-extents
-gaits
-willfully
-kanon
-futurism
-vespertilionidae
-feinberg
-switchover
-gunma
-disestablishment
-malling
-spokesmen
-hanafi
-reminisce
-inadequately
-1479
-venetia
-telenovelas
-vtol
-hvdc
-glencairn
-kennels
-granites
-drive-thru
-1422
-dayal
-retardant
-wenceslaus
-bole
-moy
-accrue
-98.6
-iconoclasm
-pallium
-reverberation
-conmebol
-lyase
-shui-bian
-opposite-sex
-commutes
-carnahan
-antalya
-1431
-arrhythmia
-slaton
-wodeyar
-classicist
-introns
-ungar
-diadem
-louisbourg
-stereophonic
-0.50
-1368
-aras
-weaned
-ancaster
-1pm
-concertmaster
-tripos
-1363
-pocatello
-yearwood
-canara
-malenko
-nudist
-leann
-chaplaincy
-wrangel
-pretorius
-tahitian
-colliers
-fancied
-seven-member
-98.8
-waukegan
-transgressions
-olathe
-sniff
-hallett
-sanga
-bolshevism
-817
-46.7
-f-100
-agro
-uncompetitive
-ice-t
-levitate
-1452
-wimsey
-ceuta
-ulu
-1419
-sallie
-qatari
-crunchy
-whitestone
-elohim
-friedkin
-oude
-stigmata
-bognor
-sensitivities
-bourges
-sudoku
-alexandros
-vitellius
-oktoberfest
-litoria
-1214
-flav
-flay
-singer-songwriters
-jarring
-rectification
-lauraceae
-nevins
-zbyszko
-bulloch
-interspaces
-colquhoun
-strabane
-aic
-10-3
-moats
-sevenfold
-kaos
-linings
-hertha
-uriel
-daughtry
-spivey
-metromedia
-xr
-bonzo
-reticular
-blackford
-miz
-tryst
-irritability
-kavanaugh
-despotic
-ukridge
-galerius
-stanislavski
-16s
-wynton
-nalanda
-tradeoff
-guiscard
-aniline
-gallienus
-fim
-capps
-gigante
-hogue
-velde
-hewett
-hijacker
-pryde
-jovial
-red-haired
-41.3
-bilayer
-conspirator
-hizb
-drowsiness
-diarist
-nudes
-alucard
-bma
-age-old
-zest
-1955-56
-2019
-faunal
-chingford
-qualms
-eris
-co-write
-785
-maybach
-unaccredited
-mumtaz
-sub-species
-filamentous
-ueda
-698
-ilkley
-amore
-quarantined
-mcalpine
-merdeka
-balhae
-reformists
-2003-2006
-multi-story
-throwers
-winslet
-altimeter
-ahmednagar
-toots
-pretzel
-methylene
-o.p.
-thornbury
-sncc
-siluriformes
-newcastle-under-lyme
-eea
-hypothyroidism
-service-oriented
-wendt
-cingular
-688
-hiroyuki
-mowers
-trivium
-battletech
-maltby
-linotype
-pre-election
-discontented
-docherty
-10-yard
-ill-defined
-gaughan
-infidels
-seale
-almshouses
-a.l.
-daoud
-decal
-scriabin
-microseconds
-low-quality
-weiser
-processional
-pulsars
-leonean
-neglects
-c.m.
-whizzer
-calderon
-manohar
-arrhythmias
-robinsons
-mcdonagh
-mclane
-integrator
-kristi
-lanois
-mistletoe
-crabbe
-bolzano
-sampras
-thar
-swedenborg
-antipater
-broadmoor
-dedekind
-1125
-fodor
-multichannel
-thermoplastic
-mafic
-dorsum
-apathetic
-readied
-last-ditch
-shabby
-x-play
-discerning
-598
-manon
-reshaping
-menai
-tadeusz
-segregate
-toolbox
-tiflis
-unrwa
-repealing
-2.17
-ciaran
-paucity
-hildesheim
-phosphorylated
-karo
-dominator
-bascule
-parsis
-r.l.
-7-4
-cornhuskers
-97.0
-minivans
-carpathia
-snowflake
-rudiments
-padova
-persisting
-low-grade
-aspartame
-knotts
-1958-59
-biograph
-inoculation
-cdt
-bluefield
-farcical
-replaceable
-clodius
-ags
-bhagwan
-film-makers
-spider-girl
-fiber-optic
-spied
-farmingdale
-frills
-greek-letter
-pumice
-cherryh
-land-use
-childrens
-53.3
-yvon
-1318
-musselburgh
-deformations
-10-minute
-jujutsu
-stooge
-dvr
-frictional
-rajasthani
-formula_64
-agitators
-scrappy
-wedge-shaped
-mccaughey
-minnow
-multi-faceted
-npt
-579
-calendarists
-unidirectional
-veeck
-friel
-aardvark
-lower-class
-mccourt
-cleaves
-raking
-deadlocked
-ibex
-above-average
-4-lane
-sager
-klass
-ashlar
-disappointments
-psychical
-swale
-skeeter
-womens
-dagmar
-43,750
-badawi
-pressman
-invert
-1720s
-wkrp
-partway
-mikoto
-bbb
-cloaks
-lardner
-duos
-dailey
-echidna
-abstention
-immediacy
-colston
-maryhill
-mortis
-1315
-obata
-gympie
-quesada
-aldwych
-all-powerful
-a7
-cristal
-seaborne
-crossgen
-detract
-dinesh
-roddick
-wert
-fervour
-smithson
-domus
-f-111
-benares
-abated
-5,000,000
-harkonnen
-hodgkinson
-palomino
-falsification
-candidature
-phrygia
-consequential
-fleshed
-wyn
-50m
-ilm
-nicktoons
-heyer
-low-wing
-checkerboard
-philemon
-gastronomy
-dsa
-euphemia
-parathyroid
-jerez
-patrolman
-hippy
-biggie
-vitali
-islami
-lavin
-dtc
-circumscribed
-non-verbal
-kittery
-erythrocytes
-1217
-kirkman
-onerous
-starsky
-hurrah
-1245
-abductions
-josephson
-squeak
-lupine
-totnes
-saraswat
-liskeard
-pre-revolutionary
-94.4
-doughnuts
-vorkosigan
-quid
-eac
-spline
-homesteaders
-bombastic
-snowmobiling
-jammer
-superstation
-afoot
-blume
-woolston
-guest-starring
-lacrimal
-pidgeon
-outhouse
-telegraphs
-cofounder
-barbecues
-weybridge
-colditz
-harpist
-impropriety
-1236
-ulema
-philautus
-validating
-bencher
-laxative
-inter-provincial
-ploughed
-torrents
-1264
-117th
-unabridged
-obsessions
-demeaning
-disorientation
-irrigate
-f-1
-molotov-ribbentrop
-40.1
-eurodance
-simd
-pamir
-alkanes
-sens
-yitzchok
-jointed
-eastlake
-2-7
-kohlberg
-rowntree
-queensberry
-nighthawks
-setzer
-univ.
-orman
-suffixed
-hei
-toyah
-dhcp
-otome
-rathore
-nacelles
-wheelwright
-snead
-4/5
-649
-space-based
-bowels
-908
-hypnotized
-repress
-x-23
-drivetime
-oceanus
-rectors
-rtv
-cynic
-gellar
-ammonites
-bowl-shaped
-romanos
-kwaito
-wokingham
-femmes
-helloween
-illyrians
-tallow
-flatts
-taliaferro
-saugus
-wilf
-7-11
-marnie
-reviled
-comprehensible
-wrinkles
-squirt
-merlot
-fatboy
-1206
-aliasing
-anti-lock
-birthing
-sweeteners
-judicature
-radiates
-bloodied
-swaying
-kadima
-photonic
-statically
-never-ending
-dembski
-step-by-step
-burgenland
-cold-blooded
-ricks
-near-earth
-ranga
-n.a.
-non-volatile
-steampunk
-venona
-untrustworthy
-***
-forty-fourth
-expend
-talus
-american-style
-lepidus
-one-of-a-kind
-xli
-honeysuckle
-schweizer
-earnestly
-holmenkollen
-ohlone
-cx
-undeniably
-urban-type
-non-canonical
-incineration
-articulates
-reinvent
-i-20
-arkin
-paulette
-asian-american
-1355
-kasper
-repositioned
-vikas
-stifled
-effector
-puffs
-gelding
-subscript
-discipleship
-ciro
-hemolytic
-ryman
-conga
-kaifeng
-prehensile
-ponzi
-ramadi
-fiorentino
-analgesics
-galore
-breck
-jetstar
-duckling
-branning
-aib
-co-captain
-krzysztof
-10s
-afrikaners
-brodmann
-904
-woogie
-plaxton
-nisei
-egos
-1110
-mst3k
-laziness
-temps
-enugu
-quince
-hemmings
-gentoo
-kickboxer
-kiernan
-hema
-unabated
-alpena
-aten
-goebel
-polysaccharides
-petersfield
-lans
-kolmogorov
-matthau
-blantyre
-748
-pacs
-maxis
-o'hanlon
-rime
-kumara
-pro-am
-bails
-teases
-thon
-counter-attacks
-sieg
-argentines
-1408
-yorkist
-rheims
-prospekt
-all-volunteer
-helices
-hanno
-meenakshi
-dabbling
-kilwinning
-lampeter
-wingtip
-dawg
-yearling
-bores
--20
-41,000
-looters
-client-side
-talk-show
-mctell
-keaggy
-volhynia
-hora
-3-8
-mercurial
-zo
-rejoicing
-pasternak
-lovelock
-urology
-mancuso
-vitesse
-die-cast
-doohan
-grilles
-huac
-peet
-aerosols
-adjectival
-isleworth
-lavery
-amygdala
-capa
-heartbreaking
-solemnity
-burgundians
-cynon
-then-unknown
-hooton
-trawling
-bentinck
-9,500
-594
-eri
-townley
-adriaan
-monad
-centaurus
-nukem
-mechanistic
-kanto
-athabaskan
-coulson
-flexing
-filial
-unworkable
-sargodha
-11.30
-duchesne
-placename
-swimwear
-1456
-muncy
-parkways
-evander
-neoclassicism
-prakrit
-fwv
-macphail
-1304
-mvs
-barbels
-shackles
-lingus
-cycled
-hirschfeld
-whangarei
-clerked
-pincer
-carmelites
-hazleton
-brotherly
-mll
-neron
-taconic
-tumbled
-drillers
-ayyubid
-bamford
-l&swr
-htv
-seasonings
-homeschooling
-thanh
-sandys
-multipliers
-waiters
-leibowitz
-persson
-1414
-negra
-vcd
-switcher
-tanja
-millionth
-cordite
-runnin
-saint-denis
-braniff
-consecrate
-fedor
-stabler
-terai
-solidifying
-oneworld
-carbonic
-cour
-appian
-tannin
-climaxes
-banja
-39,000
-newts
-troopship
-carpeting
-strangulation
-o'donovan
-joanie
-1248
-collectable
-frat
-absolutism
-48.5
-hydrological
-lyricism
-steeles
-multi-dimensional
-brookville
-r4
-phyla
-coincidences
-contralto
-claudine
-yuko
-doggett
-1956-57
-bogota
-approachable
-shaders
-seaview
-moraines
-profiting
-rubik
-anti-imperialist
-mcdowall
-front-runner
-mycenae
-violets
-blm
-mulgrave
-samhita
-ilse
-pentatonic
-tohoku
-patagonian
-roca
-karamanlis
-furor
-ewa
-negotiates
-beaked
-keno
-three-fourths
-mazur
-compaction
-gymnasia
-jit
-excrement
-1317
-phantasy
-busier
-hydraulically
-dramatised
-ooze
-scents
-endocrinology
-thabo
-panthera
-exaggerate
-erc
-d.l.
-icd
-ratt
-nfb
-1959-60
-moonraker
-fingered
-unease
-empowers
-grif
-ambon
-tilings
-ky.
-codrington
-gabled
-trending
-100.1
-interbreeding
-'40s
-cusco
-vireo
-1095
-oconee
-tyner
-unconvinced
-friedlander
-charlevoix
-tillis
-ready-made
-chronicon
-perrier
-bjelke-petersen
-42.4
-mohun
-avesta
-choe
-talismans
-activex
-abell
-darkening
-t-rex
-revisionism
-landy
-peterhead
-manse
-90.2
-legates
-kharagpur
-binns
-curveball
-ys
-equinoxes
-sleaford
-disapproving
-varietal
-ruthenium
-fatherhood
-shape-shifting
-deserter
-valuing
-frightful
-shambhala
-diazepam
-iti
-belorussian
-word-of-mouth
-forty-eighth
-unequivocal
-mortgaged
-internationalism
-quaternion
-one-person
-fortunato
-illuminations
-romulans
-assessors
-transcribe
-1267
-paladins
-gretel
-8-6
-incapacitate
-mcginley
-anti-racist
-concatenation
-allergens
-dozier
-nagin
-maguindanao
-smeaton
-landtag
-rusted
-geodesics
-eliminations
-93.6
-impermeable
-pop-culture
-sobbing
-confounding
-mossy
-breezes
-stretford
-softness
-fincher
-acceptability
-diverts
-godunov
-co-operated
-carmelo
-mirpur
-kiska
-salo
-suffragette
-metacomet
-rankine
-wsc
-irreplaceable
-crossroad
-entertains
-deduct
-pre-independence
-homolka
-intoxicating
-starrcade
-leftovers
-lcl
-conger
-distressing
-immerse
-simplicial
-champlin
-sfc
-bruise
-darke
-merlyn
-out-of-court
-cordless
-moorehead
-bouncers
-okada
-kickapoo
-choy
-thereto
-hadi
-denali
-bestowing
-discoloration
-primed
-surfactants
-awoken
-1335
-ulterior
-unassisted
-moeller
-infringements
-erebus
-marland
-nakayama
-1229
-fieldturf
-carell
-dilla
-householders
-johnathan
-sprinting
-war-era
-unp
-rnli
-diabetics
-anti-religious
-stonehaven
-tamper
-macbook
-hematopoietic
-boundless
-kamui
-1322
-calipers
-dhillon
-barometer
-fremen
-murree
-berated
-boonville
-facings
-florin
-salve
-krupa
-8080
-hannes
-1261
-dynamism
-850,000
-runestones
-simca
-multifaceted
-joo
-malfunctioned
-taki
-macromolecules
-merengue
-levellers
-brouwer
-archduchess
-freeholder
-searchlights
-debunked
-lethargy
-analgesia
-blood-brain
-thame
-46,250
-wrx
-camborne
-timmons
-centripetal
-bimbo
-fft
-part-owner
-midwinter
-ueshiba
-oif
-wednesbury
-clevedon
-non-empty
-poo
-horry
-millais
-hol
-godlike
-interdenominational
-iwgp
-swartz
-sordid
-scouring
-pizzas
-jahn
-triceratops
-guanine
-payouts
-phillipsburg
-rhymed
-macaroni
-riesling
-lakshman
-positivist
-disavowed
-fourths
-chandrasekhar
-menuhin
-campbells
-945
-metered
-formula_62
-pipit
-camara
-shepperton
-macdill
-haan
-ganz
-49.5
-testimonials
-wpt
-twine
-treadwell
-lgpl
-permanente
-niobium
-11.00
-heterogeneity
-threes
-danubian
-womanizing
-gamepro
-clinker
-lawford
-southsea
-wrappers
-rosehill
-replicates
-davide
-samnites
-sniffing
-1997-2001
-gt-r
-polder
-fraudulently
-841
-plympton
-oblate
-fci
-geomagnetic
-pieced
-waikiki
-longish
-brianna
-furtherance
-grevillea
-khawaja
-koster
-spezia
-halibut
-ashmore
-e.w.
-chiara
-platz
-forebears
-c.b.
-593
-boe
-euthanized
-ne-yo
-al-azhar
-malick
-egremont
-freitas
-diagonals
-pre-history
-gorakhpur
-retirements
-buckler
-ethers
-burford
-hibiki
-taxila
-clasps
-kratos
-canola
-ilan
-ten-year-old
-bajaj
-boylan
-piaa
-endoscopy
-inundation
-cav
-postpartum
-broadsides
-sinensis
-outpouring
-il-sung
-plante
-romain
-shiki
-all-sports
-javelins
-look-alike
-1640s
-jamaat
-africana
-tabu
-ungainly
-biliary
-saccharomyces
-865
-bcp
-musculature
-spassky
-demarco
-mende
-belichick
-puglia
-petar
-mechs
-subconsciously
-well-maintained
-1392
-endemol
-articled
-bhojpuri
-radisson
-1465
-1406
-landsberg
-42.2
-1244
-harmonization
-matchups
-prion
-tut
-ptt
-n.v.
-off-air
-abusers
-751
-koontz
-opentype
-1221
-psychopathology
-gilan
-bahamut
-non-payment
-worsens
-enroute
-94.2
-grau
-doss
-shippers
-anabaptist
-zinn
-shallows
-vaporized
-monism
-wls
-watters
-nim
-hippocampal
-erred
-littlewood
-milos
-tetanus
-scrapbook
-counterinsurgency
-multi-channel
-erlewine
-carelessly
-motorbikes
-brel
-maracaibo
-karlsson
-dopaminergic
-orme
-willcox
-vips
-ppa
-pangaea
-luk
-interlock
-glaciated
-llama
-roadblocks
-jacobian
-ulama
-rs-232
-opacity
-tantalum
-brachial
-prejudicial
-orpington
-14th-century
-resnick
-93.4
-93.0
-godparents
-off-white
-gennady
-flagging
-srikakulam
-i-64
-cherub
-23,125
-b.e.
-meetinghouse
-toyo
-waccamaw
-antonescu
-synovial
-mccrory
-sayre
-mediocrity
-haridwar
-mariko
-committeeman
-reactant
-time-sharing
-charly
-wilkesboro
-ringworld
-chaperone
-cullum
-iha
-retroflex
-hansel
-pausing
-coheed
-tympanum
-47.5
-strasser
-beechwood
-eurocopter
-cea
-otherworld
-1135
-yorkton
-seventeenth-century
-methyltransferase
-bulimia
-thornycroft
-739
-wmo
-rattlesnakes
-basilicata
-ceremonially
-cosplay
-gerbil
-dargah
-leopoldo
-katakana
-paltrow
-zidane
-naturist
-hoisting
-newsboys
-mephistopheles
-reinterpretation
-peduncle
-slashes
-91.2
-inr
-3-7
-precondition
-jitsu
-9-1
-nagisa
-brackley
-wiretapping
-crystallographic
-al-assad
-ebrahim
-o'higgins
-975
-jsp
-1256
-intensifying
-rigors
-heligoland
-ftl
-espinosa
-foci
-gangrene
-malatesta
-simian
-joffrey
-pacts
-scrubbing
-1127
-beholder
-soissons
-speculum
-metis
-daffodil
-meniscus
-ligatures
-metrology
-joon
-fretted
-rapid-fire
-astounded
-550,000
-coves
-1345
-militum
-septal
-jugs
-transaminase
-amano
-826
-20/20
-cobourg
-schreyer
-friz
-wofford
-huesca
-tournai
-britannic
-decius
-densetsu
-squandered
-glosses
-quadrature
-trusty
-empathic
-munnetra
-armbar
-heflin
-changeable
-ballina
-idolized
-grade-separated
-llewelyn
-erith
-agp
-ingush
-samhain
-cytoskeleton
-carabinieri
-boz
-1983-1984
-grinch
-paleontological
-hunza
-grothendieck
-palmach
-paediatric
-wah-wah
-menendez
-azazel
-crossbows
-awacs
-tvt
-first-born
-talat
-charlatans
-nagle
-tsp
-mainstage
-seiu
-unsupervised
-leukaemia
-westmount
-soulcalibur
-alachua
-synergistic
-hasselhoff
-mongo
-re-create
-seco
-prokaryotic
-658
-absurdist
-feingold
-gizmo
-wampanoag
-ghostwriter
-pop-punk
-astrolabe
-strived
-strahan
-ikki
-94.8
-marton
-ground-level
-lumpkin
-1d
-cogs
-hae
-attesting
-serf
-wrestles
-braganza
-pruitt
-storrs
-15th-century
-meltwater
-glentoran
-multi-family
-psion
-bludgeon
-ntr
-independiente
-pontoons
-alleviation
-a1a
-yasuda
-all-league
-brushy
-nyerere
-stabling
-pekin
-wt
-breslin
-homomorphisms
-artes
-canonicity
-tramlink
-romanticized
-1311
-feigning
-fanclub
-toroidal
-hiroden
-ogawa
-straightening
-wearers
-glasnost
-saxo
-kartli
-biak
-germinal
-arbour
-wanneroo
-nanoscale
-ocampo
-reducible
-conlon
-sub-lieutenant
-suggs
-smokin
-debater
-argentino
-althea
-broods
-sdram
-revell
-chebyshev
-64,000
-scarfo
-harum
-nawabs
-fawlty
-kinematics
-encampments
-ilk
-in-service
-ards
-numidia
-qualitatively
-bumble
-25-yard
-kongu
-narrations
-ryuichi
-lymphoid
-zamindar
-mahavira
-hamming
-kronstadt
-engle
-froggy
-fugazi
-42.1
-danica
-lesbos
-turan
-beslan
-hubris
-mid-late
-l'enfant
-all-sec
-articulating
-everard
-reel-to-reel
-greenaway
-aap
-shusha
-canisius
-1486
-fifty-first
-pinder
-messrs.
-aeromedical
-zanu
-espoo
-mauna
-fte
-multi-layered
-reuter
-95.4
-inefficiencies
-copperhead
-gamera
-newly-elected
-bareilly
-zsa
-mid-summer
-delineate
-judicious
-dispensers
-rais
-wolfsburg
-charmaine
-psychosomatic
-mohandas
-functionalities
-consents
-bratz
-f.w.
-vellum
-etch
-melons
-siskel
-acr
-adm.
-dined
-transcribing
-nicollet
-point-blank
-kesey
-behaviorism
-shoddy
-yasin
-bushmaster
-wallkill
-ultramafic
-carnes
-margera
-sanctified
-avia
-inductors
-stabilisation
-marigold
-drudge
-yelena
-ifs
-chafee
-u.s.-led
-2m
-loewe
-43.2
-inwood
-nlf
-mountings
-dashiell
-sayles
-youngs
-bashing
-iman
-ramses
-fast-medium
-99.6
-motorcycling
-bahini
-haka
-phong
-asakura
-regionalliga
-1174
-gcmg
-oxbridge
-iban
-politeness
-barwon
-uninvited
-diversifying
-dc-9
-forty-second
-monopolistic
-then-wife
-castelli
-hadar
-ecac
-armas
-flattery
-characterise
-dudes
-hy
-hulbert
-braids
-nyack
-tractive
-obadiah
-lafferty
-lucrezia
-rashtrakutas
-colwyn
-runabout
-geneseo
-asturian
-alveoli
-8.50
-inuktitut
-summerland
-kismet
-rcd
-tasso
-butchered
-hovers
-dancefloor
-fairer
-ako
-lumumba
-m.c.c.
-skeena
-appreciating
-misdiagnosed
-brattleboro
-allegra
-carrier-based
-juma
-ormskirk
-flattered
-birger
-fooling
-all-district
-retford
-mesoderm
-umatilla
-hedrick
-wegener
-disciplinarian
-short-story
-57.1
-flaccus
-gianfranco
-rahal
-kandyan
-tanah
-churchmen
-tatarstan
-racy
-mvc
-sammartino
-evangelista
-aleut
-10k
-vietcong
-traynor
-dvi
-personage
-proust
-demerara
-prerecorded
-pernambuco
-dereham
-boc
-kolar
-contentment
-rykodisc
-gerlach
-sharpen
-ayesha
-inebriated
-halberstam
-magnificat
-chifley
-igg
-htc
-eurosport
-eilat
-long-suffering
-haystack
-socceroos
-colter
-coppell
-moffatt
-onetime
-commercial-free
-cauliflower
-fortuitous
-anti-cancer
-backlund
-959
-shoestring
-anatomic
-quarreled
-showgrounds
-baboons
-scrope
-skyhawks
-ruyter
-mc5
-electroacoustic
-belfort
-albertson
-753
-behr
-mullingar
-jiro
-annesley
-wrights
-expansionism
-rockwood
-locative
-yassin
-incubate
-untamed
-ell
-stefanie
-shaper
-incantations
-targum
-sm-liiga
-beograd
-whitehurst
-toole
-hosea
-debutante
-matrimony
-beluga
-slrs
-kke
-tripods
-callsigns
-bloomer
-hauer
-two-step
-manlius
-iyasu
-baptistery
-repudiation
-coren
-mec
-omg
-hyperplane
-kui
-subscribes
-atonal
-franchisee
-2.16
-cotto
-natsume
-lithographs
-disagreeing
-horseshoes
-ganja
-melkite
-nadezhda
-perpignan
-weems
-jesu
-karras
-824
-onsen
-strippers
-obsessive-compulsive
-budokan
-volker
-jaded
-jutting
-fredericks
-rectus
-ghastly
-wickedness
-surrealistic
-mackinaw
-barrera
-musings
-schwarzkopf
-americanized
-solvay
-nervously
-iscariot
-grahamstown
-morphin
-ovoid
-wardell
-one-month
-swingin
-amara
-hayate
-24-year-old
-bartle
-pontic
-10,000,000
-12.00
-lepage
-quantifiers
-benaud
-rasul
-674
-disinherited
-scm
-sela
-741
-ichigo
-decile
-goldsworthy
-quills
-bicol
-interrogator
-best-of-three
-mote
-nacogdoches
-ubiquitin
-hoge
-airstrikes
-bassano
-krieg
-arcee
-pinker
-deceptively
-teitelbaum
-co-producing
-boilermakers
-secunda
-1481
-surrogacy
-schur
-843
-clary
-mcduffie
-brayton
-rfu
-lauryn
-brandishing
-tollywood
-caterham
-terraforming
-ritz-carlton
-nta
-elston
-sanremo
-begley
-triumphantly
-erving
-krautrock
-evie
-tualatin
-taf
-caricom
-3.75
-izzie
-pointedly
-waka
-mrsa
-dockyards
-mizo
-makepeace
-ramana
-xps
-roundly
-cpgb
-legg
-92.4
-dentin
-shamanic
-post-retirement
-rosenblatt
-rolla
-big-budget
-withington
-kigali
-dmt
-plagiarized
-low-pass
-kf
-kintyre
-3-way
-trixie
-giorgi
-offutt
-birkin
-parading
-d-league
-somethin'
-nfa
-barnyard
-ordway
-sx
-formless
-treks
-caw
-malolos
-seagal
-traversal
-pari
-rosas
-.40
-eubanks
-hod
-tempera
-bemidji
-chamorro
-pharrell
-1207
-iftikhar
-salar
-kenna
-jintao
-whist
-rahway
-nazir
-moyes
-dershowitz
-stronach
-old-school
-grooms
-abbasids
-7.50
-mid-nineties
-vario
-downright
-recreates
-hinson
-bohras
-arcata
-gann
-aoyama
-33,125
-42.7
-nonstandard
-pflp
-sequencers
-rapides
-1986-1987
-seltzer
-powerplay
-aegon
-bicolor
-thrifty
-lagomorphs
-inoperative
-trogons
-+7
-zooming
-ze
-painkiller
-spotters
-venera
-phanom
-delancey
-matriarchal
-steptoe
-acb
-kirill
-enhancer
-gt2
-dennehy
-inset
-shodown
-custodians
-ccr
-rearmament
-otherworldly
-torii
-letitia
-fending
-dak
-bettina
-rabbah
-dela
-knossos
-sympathetically
-1449
-chapelle
-goulet
-scotti
-mirabilis
-liebe
-march-april
-tegan
-proline
-duxbury
-exothermic
-vachon
-prabhupada
-sua
-lefevre
-directorates
-gompers
-catlin
-lightened
-gass
-ruf
-heydrich
-golda
-cofactors
-megiddo
-g.o.b.
-lordships
-awan
-likable
-uncooperative
-orrin
-mozambican
-bellefonte
-jef
-fallacies
-1403
-lawlor
-barat
-carn
-mammon
-murrayfield
-wir
-soldered
-zenon
-remastering
-womanhood
-re-launch
-bhat
-776
-nostradamus
-jarrod
-rerum
-grinders
-firestar
-tempt
-nicodemus
-shinhwa
-gli
-knightly
-pastels
-1205
-lusignan
-wenzel
-roundel
-sarmiento
-eerste
-bolt-action
-now-famous
-aligns
-azuma
-whorl
-stereotypically
-fiddlers
-butthole
-reactivate
-fifty-six
-shrill
-daman
-beeswax
-harwell
-lsi
-long-held
-manton
-krista
-cultivators
-bunches
-cognates
-spf
-pro-israel
-amerindians
-stranglehold
-teruel
-transversely
-anti-treaty
-rozelle
-tallies
-tus
-ophir
-celebrant
-48.6
-largs
-midrange
-gre
-pandulf
-lrc
-charest
-o'dwyer
-outfielders
-ramsden
-aeros
-emilia-romagna
-insee
-superconductor
-overnights
-jaco
-nazrul
-erudite
-n.e.
-vortigern
-skuas
-eradicating
-reimburse
-95.0
-1192
-deke
-mayotte
-abm
-incurs
-signal-to-noise
-heseltine
-scalars
-nbn
-hideaway
-intelligibility
-singlet
-atia
-hartigan
-8086
-wvu
-thicknesses
-deciphering
-nucleolus
-candi
-veblen
-interferon
-shankly
-fandango
-acu
-billionaires
-2-8
-amplifying
-pagasa
-perkin
-gascoyne
-piglet
-hes
-ahura
-693
-brae
-3.31
-lagan
-instigate
-skateboarders
-re-evaluation
-coruscant
-unranked
-wawa
-superheroine
-transcended
-pwg
-1096
-scarlets
-brenton
-wooing
-extinguisher
-schelling
-hurstville
-ejecting
-smartphones
-enjoined
-upanishad
-beall
-aurelio
-hing
-red-light
-vom
-1341
-perennials
-minna
-43.6
-mago
-rostam
-mid-17th
-makassar
-kabhi
-grenier
-banaras
-kono
-badu
-eminently
-DG.DGDGDGDGDG
-hardys
-daejeon
-dalglish
-levites
-transgendered
-redistribute
-prolonging
-1139
-gbp
-chem
-hrm
-calms
-reveille
-stimpy
-2008-present
-1474
-capitalisation
-evanescence
-dini
-about.com
-tatsuya
-gullible
-galina
-connex
-christiaan
-dormammu
-fluxes
-archaeopteryx
-aegina
-pop-oriented
-dsm-iv
-carmack
-abuts
-42,500
-nihilistic
-dbase
-ios
-foxton
-pvt
-semyon
-excitable
-codemasters
-niacin
-dixit
-8-4
-wheatland
-pre-dated
-especial
-equalize
-streamer
-unbuilt
-underwriters
-sukhumi
-subcontractors
-broadhurst
-winterbourne
-underpants
-deviance
-regionalism
-enda
-juggler
-tuber
-craigie
-milligrams
-qwerty
-zaki
-watkin
-spiteful
-heinous
-rhacophoridae
-college-preparatory
-nightmarish
-bobbi
-unrestrained
-custos
-gaping
-zar
-suse
-high-volume
-all-terrain
-kavi
-reefer
-livre
-smock
-s&m
-roofline
-arminius
-double-blind
-cnd
-gallaudet
-ingots
-dictum
-entwined
-kaiba
-swaraj
-mustapha
-a.k.
-zachariah
-sfx
-sfa
-intermingled
-shuswap
-angiography
-bough
-jarl
-fain
-slings
-fredonia
-five-story
-pre-defined
-close-ups
-maccabiah
-co-sponsors
-sa'id
-biarritz
-ground-colour
-vala
-resolutely
-bukharin
-pma
-736
-borthwick
-vestige
-gonads
-subside
-enya
-behn
-abramson
-benham
-cip
-naughton
-anticoagulant
-roeper
-headband
-doorbell
-pollux
-bankrupted
-msts
-cyr
-cortona
-externalities
-allegan
-kmfdm
-thirty-fourth
-dimitris
-premeditated
-aprile
-mohsen
-padstow
-bit/s
-tyldesley
-1948-49
-anstey
-hoss
-longer-term
-graphing
-thacher
-gurion
-glenorchy
-shakin
-thunders
-hork-bajir
-appellant
-educationist
-abject
-plausibility
-1313
-retailed
-anatole
-xanthi
-circuitous
-kadyrov
-gallbladder
-92.0
-cpd
-raunchy
-rhizome
-overdubbing
-dobruja
-crabb
-o'dell
-condone
-landform
-disheartened
-elementals
-tweaks
-manhole
-one-hundred
-boyar
-memes
-hazy
-stepchildren
-isolde
-pulteney
-bicknell
-rock-cut
-reubens
-seers
-objective-c
-ar-15
-gulden
-unproduced
-tins
-90.0
-803
-asio
-podgorica
-helmsley
-89.8
-homages
-mwe
-53.8
-summerhill
-clerkenwell
-sona
-relieves
-abercorn
-malmsteen
-offload
-500m
-orbis
-imploded
-unwell
-airstaff
-southbridge
-coolers
-scandinavians
-flagrant
-tomoe
-operable
-2000-2003
-vitally
-tolling
-hawn
-inflating
-saleem
-banque
-platformer
-50-year
-albertsons
-1690s
-southbank
-sctv
-horsemanship
-muskie
-bim
-transepts
-lynde
-waltzes
-1951-52
-banc
-1997-2000
-loi
-hynde
-yolande
-precipitates
-cere
-formalize
-cnt
-frobisher
-1259
-plowed
-2004-2007
-paseo
-mukesh
-sundanese
-skylights
-schwantz
-alauddin
-josefa
-dafoe
-undirected
-recklessness
-subjugate
-condors
-yalu
-pellegrino
-cair
-hitchhiking
-redirection
-arashi
-lookin
-longinus
-pacelli
-omi
-motta
-keough
-nuno
-ojibway
-homoerotic
-1444
-soli
-2.19
-a-4
-landsman
-harmer
-baldur
-five-part
-bargains
-sleds
-trulli
-chiranjeevi
-publican
-chawla
-deserting
-grose
-sadhu
-anti-british
-sterne
-icr
-knowle
-DGDG.DGDGDGDGDG
-clannad
-mccreary
-ambrosian
-newcombe
-intellivision
-psd
-biko
-balustrade
-diuretics
-subpoenas
-rusting
-waldegrave
-synthesiser
-waynesboro
-feltham
-balaclava
-coetzee
-oireachtas
-phasianidae
-gamecocks
-723
-profiler
-escher
-katipunan
-rebuilds
-ambrosius
-dogfish
-egotistical
-bbfc
-mami
-miyu
-jungian
-1137
-palearctic
-shrublands
-conseil
-two-term
-vlan
-outcropping
-saanich
-tyburn
-moti
-grigori
-underpinned
-lankans
-doings
-pendergast
-l3
-balch
-rawal
-sultanates
-pattaya
-mazar
-yuwen
-toadie
-o.k.
-accentuate
-kaspar
-holtzman
-outed
-spellbound
-class-action
-middlewich
-eternia
-schoolmate
-decorum
-agusta
-unam
-1080i
-ltc
-brunson
-arsenio
-pierces
-677
-sawan
-sanjeev
-1371
-bahram
-scaife
-godiva
-throng
-stoneham
-snowmobiles
-humus
-censoring
-ers
-ingeborg
-yourselves
-lightsaber
-revlon
-curvilinear
-cibc
-galante
-possums
-2.14
-minehead
-breese
-1155
-probed
-13th-century
-repairman
-maes
-cyclonus
-espouse
-hebden
-ih
-roker
-dern
-co-presenter
-afzal
-planescape
-disrepute
-illusionist
-isometry
-graveyards
-naturals
-non-orthodox
-prick
-10m
-rajaraja
-copps
-sensuous
-786
-13-14
-oulton
-communicators
-basham
-post-wwii
-cloverdale
-biblioteca
-boing
-russert
-onside
-869
-pronounces
-ambler
-talkers
-interviewers
-trix
-horta
-consignment
-bugti
-glockenspiel
-congenial
-smasher
-atal
-794
-papillon
-cardinalate
-gheorghe
-substations
-b.f.
-743
-reunified
-fricke
-cicely
-roxette
-executables
-mmu
-gehry
-insecurities
-spotlights
-burglars
-734
-1-11
-flavian
-haughton
-ricoh
-forty-fifth
-all-tournament
-thiessen
-indebtedness
-guesthouse
-battaglia
-spurt
-racetracks
-toonami
-carlist
-90.8
-cauty
-cheras
-kokoda
-snarl
-2010-2011
-cerevisiae
-scull
-matsudaira
-konnan
-affront
-deteriorates
-uswa
-canaria
-ringmaster
-sab
-firman
-grammy-nominated
-copula
-bbwaa
-ratan
-duplexes
-rebekah
-stiletto
-malory
-khabarovsk
-dav
-kampf
-1378
-lepers
-1232
-1448
-re-entering
-allophones
-flintstone
-moroder
-restatement
-houser
-adjudicated
-itchen
-super-hero
-pseudonymous
-godot
-phallus
-fact-finding
-jinn
-pinched
-totten
-helipad
-megapixel
-proudhon
-1085
-1997-1999
-sudamericana
-okhotsk
-sadism
-timelike
-belmonte
-doth
-vizcaya
-martello
-tatchell
-shins
-jna
-badging
-unimproved
-marmion
-non-catholic
-blanton
-wake-up
-wilding
-atlanteans
-subatomic
-1098
-blackbirds
-wreaths
-3,100
-cella
-89.6
-carelessness
-fea
-embers
-pinhead
-60.0
-rotulorum
-spitsbergen
-irregularity
-mulch
-cumbia
-brokeback
-inquires
-dfw
-mclellan
-gasification
-shuttled
-inedible
-eun
-citybus
-nowell
-cresswell
-belmore
-boatman
-kayako
-1286
-starks
-nanometers
-pay-tv
-seed-eaters
-callander
-lullabies
-reconciles
-webley
-crone
-skeet
-48.8
-alanna
-bilirubin
-arrian
-xo
-whirling
-fastener
-varela
-extrapolated
-preaches
-southam
-ncl
-lofton
-frist
-pro-government
-motorcyclist
-epidural
-expediency
-lashing
-destinies
-jardin
-legato
-cutscene
-thad
-aldgate
-neel
-vandellas
-geauga
-a10
-bpd
-gwangju
-futurity
-1445
-mckellar
-perceval
-spacewalks
-belial
-morn
-weatherby
-fifty-three
-southey
-chianti
-haitians
-andante
-janissaries
-madura
-gristle
-highly-regarded
-croc
-aspergillus
-atropine
-sivaji
-velcro
-etheria
-hatorah
-redefining
-mortem
-isro
-kingfish
-a/s
-1351
-presbyter
-llm
-pastorate
-plasmids
-lamentations
-vilas
-aut
-hok
-bellman
-transcaucasian
-saver
-gla
-gatherers
-incisions
-globo
-marischal
-sli
-orbited
-baloo
-dabney
-rashtrakuta
-mohave
-sandgrouse
-twyford
-foetus
-overpasses
-kpa
-barents
-gendarmes
-adverb
-microcode
-normandie
-lca
-ciel
-chedi
-solanki
-fisichella
-downplay
-vorbis
-formula_65
-meadville
-tellico
-10-2
-dorr
-smk
-smg
-jeffersons
-newtype
-filip
-reformulated
-latches
-maslin
-may-june
-maqam
-piss
-well-formed
-made-up
-cooder
-sagamore
-opm
-downie
-j1
-dep
-gaogaigar
-inflections
-flanges
-allston
-mass-production
-lufkin
-nigerians
-farscape
-presbyterianism
-emirs
-campanella
-myotis
-dof
-red-headed
-mcintire
-wallop
-counter-revolutionary
-scarsdale
-saudis
-dosages
-weeps
-rethinking
-ncrna
-veera
-ploughs
-sarnoff
-hoysalas
-vise
-neem
-violist
-helmsman
-968
-auer
-svenska
-1237
-sverdlovsk
-brut
-melding
-lubricated
-festus
-ahmet
-pyrolysis
-lumet
-blogosphere
-gair
-prieta
-nervousness
-fallopian
-godalming
-mukhtar
-zeit
-berwyn
-arai
-armistead
-kagami
-hydrolyzed
-al-mansur
-rioja
-mar-vell
-fervently
-crampton
-sepp
-scholl
-linens
-carolines
-muon
-surakarta
-clostridium
-.32
-non-classical
-whammy
-clyne
-pacquiao
-excepted
-lanceolate
-fertiliser
-kyu
-liefeld
-carnaval
-tarim
-tasker
-xlii
-e-book
-grilling
-usta
-wrekin
-genteel
-conscript
-basayev
-riverbanks
-overworld
-bracketed
-texel
-aja
-orangutans
-beiyang
-azalea
-delicately
-1201
-crofts
-non-breeding
-saboteur
-ura
-tuples
-parkview
-sunbirds
-rof
-paralytic
-minnetonka
-delores
-criticises
-1466
-ratnam
-rakesh
-penrhyn
-artesia
-dessau
-tupou
-shim
-curler
-albers
-brzezinski
-1333
-pms
-good-looking
-bouillon
-kluge
-restated
-kalmyk
-morena
-masahiro
-alvaro
-tou
-pondering
-ruthlessness
-976
-kaiju
-1255
-gigging
-frustrate
-pleiades
-haque
-paramus
-bourbons
-hemming
-1194
-3.35
-payer
-liggett
-penfield
-sawai
-hyperbaric
-gayatri
-asses
-sverdrup
-chadderton
-kash
-blackmun
-pakhtuns
-houseman
-rizzuto
-fourfold
-entitle
-calderwood
-gur
-leroux
-mek
-fazal
-blatchford
-3100
-navin
-high-technology
-non-christians
-yeon
-mansoor
-fangio
-cartan
-d'amour
-framers
-perennially
-ovas
-m9
-connemara
-byes
-compactness
-pervert
-t-cells
-opinionated
-khoisan
-96.0
-undefended
-pseudorandom
-crombie
-hurries
-bishopsgate
-pivots
-tecmo
-montgomeryshire
-kieron
-902
-4s
-dwarfism
-bushey
-sephardim
-pandu
-ducky
-prostaglandin
-chota
-tustin
-chakras
-replicator
-soricidae
-bogle
-basslines
-dugouts
-iiis
-coursing
-cisterns
-cohan
-kobra
-overijssel
-hyannis
-hallie
-attache
-lire
-doukas
-buy-out
-wpix
-kenta
-ppl
-deheubarth
-willkie
-wedded
-underclass
-quickening
-855
-dreamgirls
-bps
-pineapples
-thirty-fifth
-ryazan
-nonverbal
-2.10
-oop
-pitney
-927
-alianza
-risborough
-bielefeld
-gunfighter
-paralegal
-triage
-high-value
-grandad
-dari
-cloaking
-94.0
-melcher
-eugenius
-cle
-7s
-ssg
-devanagari
-abundances
-non-conformist
-stents
-1324
-biplanes
-1016
-chagatai
-oocyte
-arma
-cmdr.
-khatri
-sancti
-subpoenaed
-exportation
-sagging
-redirecting
-u.s.c.
-newsradio
-albertus
-roshi
-gallo-roman
-ema
-tochigi
-reaffirming
-giorgos
-milli
-bickerton
-chatswood
-trillian
-swerve
-fluctuates
-raghu
-eniac
-forego
-homebush
-indignant
-gildas
-hamelin
-oxytocin
-seu
-michaelmas
-self-identified
-shoji
-gnat
-egalitarianism
-ashura
-shepton
-szczecin
-goals-against
-boldness
-atlases
-trotting
-bitstream
-tiberium
-elamite
-assize
-vasil
-sidearm
-brome
-comparably
-bhajan
-hedonistic
-chenoweth
-longwave
-deltora
-smalls
-vallance
-hamblin
-psychobilly
-banton
-tec
-infogrames
-jeanie
-shilton
-aws
-two-person
-expedited
-deforest
-varro
-million-dollar
-89.2
-867
-peculiarly
-extradite
-electricians
-culpability
-datta
-ige
-fam
-soares
-middays
-bellefontaine
-1979-1980
-8.10
-omri
-overgrazing
-dancin
-maximizes
-villard
-doublet
-two-game
-impersonations
-deregulated
-asuras
-astonishingly
-streeter
-seven-year-old
-relient
-hornbills
-armband
-slippage
-creosote
-optimist
-cup-shaped
-dryandra
-dawa
-neutralization
-lymphocyte
-gulab
-659
-55.6
-condenses
-apollonia
-rajinikanth
-18-year
-eggplant
-epiphanius
-heroically
-cathedra
-estella
-isomerase
-irritate
-dink
-mpumalanga
-parshah
-gisela
-warlocks
-ccg
-todo
-gerson
-qureshi
-maceo
-orks
-r32
-tokelau
-csd
-eggers
-dollhouse
-aliphatic
-rix
-neoconservative
-medallists
-priors
-wiesenthal
-schola
-deposing
-hgtv
-chahar
-t.k.
-sengkang
-vissi
-contemptuous
-anti-doping
-ahs
-pyridine
-spearman
-black-headed
-sudetenland
-marxism-leninism
-shirin
-rivalled
-blowfish
-tropes
-telegraphic
-ajah
-plus/minus
-dokken
-quickness
-sardines
-miniaturized
-andalite
-mutagenic
-deceitful
-bandwagon
-mihai
-thundering
-phylloxera
-bycatch
-nyro
-non-uniform
-coryell
-validly
-tuggeranong
-whitelaw
-regress
-necktie
-showground
-hyphenated
-vought
-10-11
-yolo
-carpentier
-ablative
-montel
-bandaged
-tassel
-creepers
-wilbert
-778
-maddalena
-816
-bandaranaike
-46.2
-captor
-biffle
-etv
-desh
-airworthiness
-oden
-agia
-s-class
-aruch
-nevil
-glidden
-1418
-stupor
-erratically
-zamindars
-cornfield
-tudors
-578
-912
-wrinkled
-bamako
-rapped
-watterson
-olave
-policed
-moana
-aliyev
-1102
-timeslots
-formula_68
-goodbyes
-multi-role
-almanack
-welby
-dowie
-pica
-1462
-weigel
-wingless
-prop.
-ferrero
-cots
-gallacher
-rakim
-shula
-romp
-assimilating
-bringer
-mortification
-647
-35,625
-rishon
-hmp
-hildegard
-rk
-r$
-srebrenica
-thirunal
-wallasey
-gbe
-kayakers
-post-industrial
-mii
-stunting
-sunsets
-orangeburg
-sabra
-hajduk
-disgraceful
-tulsi
-harappan
-1015
-bushrangers
-anthurium
-archdeaconry
-45-minute
-codice_11
-bunn
-mishandling
-3-day
-tejano
-urza
-balzac
-souness
-celebes
-128-bit
-caporegime
-12.30
-sacrum
-vg
-keke
-repressions
-erlangen
-acetyl
-extender
-geosynchronous
-superoxide
-rhyl
-'30s
-ammianus
-marginalised
-acheron
-snatches
-shion
-workhorse
-opossums
-seamount
-voiceovers
-machin
-dzogchen
-sansom
-theoretic
-emmis
-hurrian
-wernher
-karina
-shana
-northwesterly
-gleaming
-carniola
-clothesline
-878
-metahuman
-coffs
-escalates
-bienville
-vidhan
-w.g.
-surging
-4k
-guava
-cdi
-leaped
-:1
-lolth
-shenton
-rampur
-murthy
-anka
-fouad
-giveaway
-mcvie
-balochis
-gazeta
-krewe
-sarasvati
-kanchi
-bertone
-culvert
-decade-long
-stryfe
-diatoms
-peekskill
-thaler
-secularized
-doghouse
-mccaw
-serializability
-nikkatsu
-asp.net
-two-state
-luise
-grammarians
-unguarded
-geraldo
-aforesaid
-pogue
-assizes
-ajith
-leu
-banal
-adorning
-yanni
-wacken
-chelyabinsk
-fausto
-1999-2003
-inter-house
-alpini
-926
-lj
-redline
-corr
-initialized
-hagiography
-ascribes
-corsets
-91.6
-srt
-ingress
-merited
-araneta
-multicolored
-barbiturates
-rijn
-syme
-pinhole
-gropius
-peterhouse
-beekeeping
-puyallup
-d12
-putouts
-matta
-multiprocessor
-printmaker
-pathanamthitta
-wildside
-immovable
-despotism
-tourney
-natura
-sanz
-kerberos
-cheong
-wattles
-fba
-quenching
-basf
-lipa
-analog-to-digital
-reseller
-percussionists
-photius
-kohima
-luxe
-herded
-makarov
-coursers
-illyana
-dawning
-forty-ninth
-tag-team
-hadron
-antifungal
-powerhouses
-brunch
-slavia
-aymara
-fire-control
-chocobo
-non-official
-ere
-hep
-1309
-y-chromosome
-h.c.
-taiko
-15-yard
-recharging
-ferengi
-non-public
-grampa
-khmelnytsky
-undersides
-50.4
-pith
-self-serving
-frontrunner
-brava
-4-6-0
-amply
-bradstreet
-waterboys
-methylphenidate
-incantation
-airstrips
-lottie
-implored
-unsealed
-dili
-pro-soviet
-linehan
-fleur-de-lis
-codeshare
-strangelove
-ruislip
-efficacious
-spoken-word
-punggol
-three-digit
-maga
-lysis
-kael
-formula_63
-ryle
-ubu
-bair
-jazz-rock
-phenolic
-delicatessen
-jami
-lingam
-urizen
-toothless
-bpo
-fln
-a-7
-vociferous
-pyrophosphate
-levittown
-schnabel
-bessel
-hortons
-90.4
-solutes
-45.7
-hamar
-tain
-denzil
-sumitomo
-wad
-soundscape
-lob
-cadenza
-porteous
-capacitive
-unfettered
-unido
-cabriolet
-reminiscing
-44.5
-levan
-895
-kazakhs
-vimes
-riau
-blyton
-dodgeball
-62.5
-augustana
-delved
-deep-fried
-1421
-scientologist
-soult
-gaga
-terse
-taub
-13,500
-clubbed
-publicizing
-comm
-unione
-2022
-homeomorphic
-ransomed
-airstrike
-politic
-goodfellas
-gauntlets
-picardy
-protoss
-:00
-davenant
-romer
-valerio
-xmas
-cpb
-attainted
-replicators
-brittain
-stragglers
-oddball
-fairlight
-miers
-spelman
-blacker
-re-appeared
-calmer
-cletus
-unconvincing
-pederson
-tamerlane
-chartist
-buffered
-albinus
-insertions
-sokol
-helo
-weinberger
-cerebus
-tabbed
-verhoeven
-theodorus
-holywell
-scolded
-arithmetical
-chattel
-extra-time
-exquisitely
-battery-powered
-crisps
-krypto
-carnot
-fandorin
-lahr
-kumasi
-robards
-acyclic
-gravis
-864
-nadeem
-sss
-hino
-hyun
-1171
-vainly
-cauca
-wcbs-tv
-boscombe
-kopp
-88.6
-metronome
-mems
-mmm
-distinctiveness
-milkman
-dispatchers
-pinehurst
-islamia
-imitative
-bastrop
-girona
-fellowes
-1468
-videocassette
-1242
-gotthard
-verkhovna
-haeckel
-currier
-monadnock
-condita
-gulbarga
-rationalization
-tippu
-dodgson
-5-9
-millenium
-koto
-dereliction
-orations
-groundhog
-closings
-mansehra
-artemisia
-ludendorff
-karthik
-castlevania
-1429
-emmeline
-eddies
-disqualify
-wasson
-vocoder
-r.s.
-deplete
-waal
-octahedron
-limpet
-arb
-corvair
-windowing
-quarrelled
-1179
-exhorted
-approbation
-convener
-parkhurst
-r.m.
-delray
-syfy
-eighty-five
-rossellini
-glows
-978
-p.g.
-jordanes
-jozef
-fanned
-hashish
-broca
-bosporus
-vitebsk
-yerkes
-23-yard
-roping
-brindley
-lingual
-sidebar
-beanstalk
-human-computer
-palpable
-tekle
-2g
-2c
-barrhead
-shunning
-hoshi
-innovate
-riis
-flashlights
-journeying
-revamping
-slush
-clavicle
-disobey
-paralyzing
-finalize
-dour
-863
-866
-hasse
-ajc
-proteomics
-aback
-cloaca
-hurdler
-nobuo
-ratites
-kouji
-pomfret
-muff
-gans
-wynette
-6-yard
-connoisseurs
-sopra
-myst
-rebroadcasts
-tenfold
-cardwell
-coghlan
-breccia
-witter
-snug
-supercars
-instalment
-chenab
-vf
-bilaspur
-unattached
-kahane
-1071
-tenney
-oxus
-houma
-woodham
-dimera
-dormition
-wlw
-april-may
-heide
-dyfed
-tuft
-diarrhoea
-cranfield
-supernumerary
-hooliganism
-toda
-stb
-bunning
-969
-splinters
-amalgamations
-vihara
-mistrial
-nagara
-whitacre
-trad
-wheelers
-836
-filesystems
-calyx
-houseboat
-dishwasher
-dvb-t
-estado
-8-8
-dismount
-complicit
-switchblade
-diethyl
-delinquents
-hpa
-demoscene
-ribosomes
-fuchsia
-private-sector
-preventable
-banqueting
-kom
-livejournal
-sculpt
-lindwall
-ovum
-yuriy
-tubule
-zala
-lebron
-reassuring
-goldin
-five-door
-tmf
-wic
-12th-century
-claymore
-gravels
-predisposed
-histidine
-sahih
-fano
-dampier
-non-musical
-wantage
-silmarillion
-lora
-aryeh
-colostethus
-silken
-bothering
-webby
-terrains
-discloses
-ghalib
-728
-alappuzha
-farris
-loewen
-87.9
-ekaterina
-viscountess
-esme
-petronius
-barwick
-schlegel
-1173
-clarifies
-capablanca
-punto
-spink
-w.s.
-implacable
-cucamonga
-gouda
-rubenstein
-imap
-epigenetic
-wiretap
-vigils
-carding
-knutsford
-11-0
-driffield
-o'bannon
-specks
-smb
-fairlane
-mannerism
-binion
-pauley
-munger
-diesel-powered
-msm
-cbt
-casson
-forward-facing
-bedouins
-overhauls
-custis
-jvm
-trioxide
-noone
-iana
-lordi
-schoolcraft
-chemnitz
-44.2
-matchplay
-incinerated
-chevelle
-crayons
-brinkman
-baldness
-saddled
-schwerin
-keira
-hijazi
-bix
-iom
-antonine
-fatimids
-5-star
-mini-game
-elim
-thurso
-guybrush
-stornoway
-bantry
-apprenticeships
-lilah
-damiano
-infosys
-gruppe
-shapeshifter
-berkhamsted
-j.e.
-zatanna
-martingale
-carstairs
-surbiton
-shikoku
-renewables
-non-canon
-redesigning
-kazuya
-halachic
-ew
-refuting
-periodontal
-saugeen
-passau
-u16
-co-developed
-skiffle
-lectern
-cae
-striptease
-megawatt
-wie
-ushers
-ziff
-781
-diliman
-alb
-88.8
-d66
-rashidun
-operant
-caversham
-sssis
-dialling
-convenes
-guppy
-arguable
-intra
-wraparound
-biologic
-tyumen
-weightlifter
-graduations
-t.s.
-916
-byline
-aishwarya
-88.2
-magnificence
-liquefaction
-chewbacca
-1394
-reliquary
-49.3
-hipper
-collarbone
-ault
-caramon
-yaga
-cartilaginous
-ldap
-feld
-34,375
-necronomicon
-sentosa
-keener
-retard
-invalided
-re-arranged
-circumscription
-diggings
-rmp
-saddleworth
-bogey
-beiderbecke
-interments
-stelae
-rehabilitating
-statler
-superheated
-j.l.
-weehawken
-sobotka
-reuss
-dishonor
-1953-54
-asoka
-5k
-luxembourgian
-buccleuch
-deland
-lithic
-o'dea
-veers
-lpc
-nhtsa
-asaf
-dejected
-stonington
-baiting
-cheesecake
-decision-makers
-tuolumne
-lacoste
-yala
-nextstep
-proclus
-gridley
-ii-era
-mid-tempo
-chiefdoms
-alertness
-unspoiled
-komuter
-retorts
-goldust
-constans
-dando
-shilpa
-92.8
-rivas
-roby
-xml-based
-coolness
-bronfman
-meacham
-toponym
-highwaymen
-asn
-shai
-abreu
-unsw
-kp
-shylock
-fete
-methuselah
-fidelis
-shattuck
-vaz
-ywca
-closely-related
-hel
-pre-industrial
-mazari
-hayabusa
-n-1
-valentines
-invid
-sifton
-free-flowing
-riptide
-phallic
-bloodbath
-dhamma
-biofeedback
-astringent
-muzik
-broodmare
-power-play
-xiahou
-forewing
-semi-annual
-carshalton
-ministering
-corsa
-darul
-al-husayni
-4x400
-bracks
-eschew
-kyrie
-occ
-ephemera
-fau
-noddy
-47.1
-phlox
-gigolo
-mafioso
-minicomputers
-durgapur
-fractionation
-anise
-kohen
-recommenced
-skippy
-morningstar
-jukes
-jochen
-biswas
-araceae
-raze
-blaxploitation
-jirga
-miro
-a14
-remixer
-kindle
-kesselring
-napkin
-huddleston
-tameside
-broadview
-discriminant
-cathal
-carmella
-mobley
-douglas-home
-bowland
-anxiolytic
-rosamund
-age-related
-commonalities
-saha
-91.0
-all-africa
-single-stranded
-raziel
-anatidae
-runnymede
-sizemore
-quilting
-ache
-agitating
-two-headed
-twitty
-ill.
-valmiki
-minicomputer
-seigneur
-kumbakonam
-lurgan
-naiad
-no-fly
-oki
-orin
-presidente
-styrene
-vizianagaram
-turbidity
-per-capita
-1231
-al-haytham
-hypergeometric
-hardcastle
-hala
-beauvais
-dualistic
-c-4
-trotters
-bactrian
-company-owned
-cockatoos
-notching
-bests
-allosaurus
-pajama
-gernsback
-stealthy
-biogas
-strathearn
-mpv
-maines
-adjudicator
-822
-linderman
-shingen
-potenza
-sliver
-rohe
-non-nuclear
-ende
-aggressors
-t-pain
-second-longest
-kebab
-khaldun
-lakehurst
-arjan
-butlers
-huard
-traitorous
-injurious
-hejaz
-halperin
-uther
-kilmore
-concur
-long-billed
-heffer
-alderley
-fivb
-aphorisms
-87.8
-nogueira
-betancourt
-mukden
-orne
-nazarbayev
-forney
-switchfoot
-mashiro
-smothered
-virtus
-bo-bobo
-intercostal
-surety
-kashrut
-whatley
-pubis
-best-of-five
-b7
-miscarried
-squaring
-hematite
-procol
-hammerheads
-southwold
-l.c.
-netherton
-southeasterly
-refilled
-r.i.
-stratofortress
-powerplants
-misappropriation
-milla
-uca
-re-enters
-aphex
-coast-to-coast
-usga
-outrigger
-wcbs
-energy-efficient
-fein
-lalor
-strat
-teochew
-ornette
-carthy
-palmers
-1996-1998
-cassation
-1253
-sohrab
-creon
-suffocated
-ecija
-sheeting
-ite
-vaccinated
-amedeo
-asap
-post-cold
-light-emitting
-1952-53
-medium-fast
-tawa
-banfield
-antiquary
-apsley
-strangest
-modernists
-2021
-hordak
-fermenting
-reclassification
-curlews
-bartok
-geschichte
-stoning
-+10
-felder
-authorising
-884
-ellice
-bazaars
-namond
-blanchett
-larimer
-virtuosic
-bloodstone
-eldritch
-1294
-fifty-four
-post-civil
-pilipino
-disinfectant
-massillon
-828
-sixteen-year-old
-1454
-freire
-goc
-self-respect
-8mm
-crawlers
-strays
-morgaine
-wishaw
-specie
-shiver
-conditionally
-predictability
-sounder
-weasley
-post-election
-dim-witted
-mh
-nesbit
-tusculum
-parlement
-h.s.
-dreary
-viasat
-cardoso
-wale
-reimagined
-bishan
-camas
-chagall
-sainthood
-46.5
-fredrikstad
-sirenia
-craggy
-turn-of-the-century
-lenten
-eyepiece
-tokai
-venn
-unprovoked
-darden
-tirtha
-alessandria
-c'mon
-celica
-batmobile
-slashdot
-1397
-pedophilia
-dmg
-inevitability
-thun
-shimin
-jammin
-955
-sixty-third
-npd
-ratchasima
-m.m.
-lebedev
-striated
-messrs
-numismatics
-kasumi
-viscera
-esteghlal
-coosa
-haplotypes
-reduplication
-highrise
-projekt
-pacey
-cotten
-intelligently
-enveloping
-bailiffs
-bardo
-suborbital
-nebulous
-chameleons
-847
-godsmack
-aileen
-shawls
-hayne
-3s
-geppetto
-sinaloa
-creatinine
-flavouring
-1257
-chievo
-pager
-batik
-honorees
-non-essential
-frock
-dengeki
-1014
-salonika
-lippe
-oddie
-linea
-burchard
-inflight
-jatra
-ignatieff
-histological
-non-u.s.
-1238
-pricewaterhousecoopers
-102.2
-takeoffs
-mcnaughton
-plagioclase
-elly
-radiometric
-fentanyl
-raekwon
-coxless
-fiscally
-95,000
-re-organization
-noether
-mickelson
-gnu/linux
-ece
-elissa
-needlework
-ubon
-killen
-1458
-bukidnon
-certifies
-lepanto
-adenauer
-teixeira
-appraised
-sheared
-geneticists
-condescending
-pzl
-angham
-stirner
-tattered
-budgie
-captioning
-hennessey
-dimitar
-rupiah
-zdf
-fighter-bombers
-schulman
-icosahedron
-steady-state
-hob
-hutson
-danse
-cumbrian
-shortfalls
-xslt
-re-recordings
-ambiguously
-acceptors
-ringtone
-w.a.
-16-18
-pejoratively
-shikai
-legrand
-ssb
-silvestri
-grouper
-openvms
-rapa
-rocksteady
-eskdale
-quicken
-tel-aviv
-1342
-ramus
-mendelsohn
-yupik
-ionospheric
-illyricum
-educations
-lich
-landes
-tibor
-catshark
-formula_66
-all-encompassing
-ilfracombe
-49.2
-735
-intermixed
-annexations
-1211
-findley
-amazonas
-jobson
-ryland
-steric
-wolfowitz
-covalently
-benzoate
-moskva
-non-citizens
-marquesas
-benicia
-pupae
-violas
-nucleoside
-overflowed
-allergen
-double-headed
-quantifying
-timaru
-thicke
-kasabian
-helplessness
-arch-rivals
-9-5
-cowdrey
-rosin
-anti-establishment
-vegetated
-krs-one
-tulips
-44.6
-wildland
-billets
-sabri
-restarts
-sixty-six
-moped
-anesthetics
-posada
-delco
-pachinko
-counseled
-annexe
-decibel
-hamilcar
-pandan
-bipartite
-nott
-karur
-afire
-petipa
-megaphone
-jobless
-beltrami
-hodson
-lacroix
-multiprocessing
-brisco
-belew
-seifert
-inerrancy
-1186
-reflectance
-unverified
-barret
-robespierre
-impatience
-exuberance
-dischord
-p-funk
-yitzchak
-clamshell
-gasp
-loup
-demetrios
-likeable
-yuezhi
-ferrier
-ohno
-ift
-caricatured
-canmore
-coot
-ravaging
-jabba
-gracchus
-internalized
-slimy
-kinnick
-tino
-chubb
-impressionistic
-louse
-wraiths
-fouts
-tingle
-wirt
-hyped
-ryuji
-goshawk
-ferber
-megalopolis
-baltimore-washington
-bullfrog
-costar
-faw
-backpacks
-mito
-alister
-osc
-pillman
-montero
-off-shoot
-subnet
-barrowman
-86.4
-804
-nal
-20:00
-puranic
-thera
-pappy
-lafontaine
-drovers
-odie
-gauri
-cybele
-grained
-chanda
-blane
-spurgeon
-bisbee
-1387
-sign-off
-faison
-middle-eastern
-mckeithen
-redruth
-adopters
-badenoch
-fsh
-middelburg
-lauer
-boden
-seeps
-stabilizes
-19,375
-scaled-down
-prim
-aaaa
-ignace
-plp
-1012
-shastra
-lampson
-nuance
-wilbraham
-1239
-midair
-kolbe
-limbu
-boorman
-wracked
-short-listed
-tampines
-concertgebouw
-scythia
-unforgiving
-mailman
-handpicked
-nautiloids
-marik
-reith
-argosy
-jetta
-thevar
-lilydale
-subcontractor
-squashed
-aipac
-wilsons
-robs
-guaranty
-mura
-carrefour
-albinism
-disputation
-bakufu
-eumenes
-reinstating
-five-o
-alga
-k3
-raheny
-wusa
-headsets
-moussa
-jez
-ratifying
-tropicbirds
-abubakar
-hewn
-concoction
-disjunct
-1224
-bolognese
-ephemeris
-pastorius
-deism
-noncommissioned
-throes
-yonder
-samguk
-prahran
-farren
-kym
-matsya
-cronies
-vandalised
-kiyoshi
-ossian
-hollinger
-sobieski
-depleting
-hayato
-99.8
-awb
-sleight
-overheat
-thrashed
-deplored
-amistad
-matabeleland
-malaga
-stradivarius
-billericay
-corelli
-ebbets
-abetting
-angioplasty
-communally
-snobbish
-jawad
-payers
-five-pointed
-aileron
-porphyria
-lanny
-bunton
-leeson
-tacitly
-733
-transpires
-wingers
-1247
-5.25
-straws
-hoshino
-bardic
-2.11
-forty-first
-mismatched
-nairs
-willowbrook
-archiepiscopal
-self-consciousness
-sandstorm
-ano
-rma
-desecrated
-smithville
-expository
-third-highest
-jurchen
-lunn
-sinfonietta
-shoves
-pratincoles
-attuned
-sunbathing
-pansy
-sepulveda
-interleague
-underarm
-placards
-sunfire
-castrum
-kornheiser
-klaw
-frosted
-marcion
-trefoil
-trapezoidal
-jsc
-follett
-1396
-menken
-eckerd
-negus
-intricacies
-self-destruction
-palettes
-autos
-1428
-polytechnics
-thorndike
-mtc
-juvenal
-dallara
-wolfenstein
-veneta
-upholds
-pleasantville
-mournful
-833
-mok
-jubilees
-non-state
-operon
-climatology
-slayton
-arar
-mugen
-hiv-1
-aotearoa
-high-scoring
-ilium
-newlyn
-melling
-ostentatious
-tirumala
-cuneo
-sparrowhawk
-naturalisation
-kandi
-25-year-old
-q-tip
-sangster
-mitchells
-chimps
-erroll
-suffices
-1164
-mouldings
-prebendary
-straight-line
-omnidirectional
-corvidae
-valenciennes
-lubavitch
-airasia
-rhd
-mieszko
-oromiffa
-school-age
-wigner
-lusatia
-cashes
-tientsin
-vali
-acca
-medullary
-uist
-greenough
-second-tier
-perchlorate
-5-11
-computer-animated
-rejuvenate
-intruding
-barnstorming
-orillia
-flexed
-electrifying
-tsimshian
-axum
-icm
-adventuring
-coughlan
-kosala
-fasts
-airpark
-arup
-rmi
-vials
-fantagraphics
-gameday
-all-century
-hamada
-gauguin
-nettie
-yukari
-whiter
-wsa
-cistercians
-gynaecology
-plummeting
-cenozoic
-k-theory
-fairtax
-high-intensity
-swagger
-hartnett
-gymkhana
-panellist
-puffins
-vishal
-irish-born
-796
-komsomol
-i-295
-scs
-coelho
-aldeburgh
-elucidate
-insulate
-frescoed
-elwes
-elaborating
-jeweller
-headhunters
-zbigniew
-segue
-off-beat
-1463
-debauchery
-bolus
-montour
-samad
-animist
-bequests
-dogmas
-thapar
-thermoelectric
-notifies
-deleuze
-gossett
-court-martialed
-galleons
-unscripted
-bobbing
-unpaired
-gerhardt
-maddux
-fluoroquinolones
-farrakhan
-uckfield
-macaws
-catford
-savatage
-pan-blue
-burly
-animatronic
-inoki
-nazca
-blah
-billingham
-ilsa
-1148
-lesh
-mero
-9.50
-eclipso
-stuarts
-f.e.
-analytically
-p.t.
-lippi
-reaping
-strikeforce
-repubblica
-shevardnadze
-osler
-juniperus
-unending
-statoil
-1491
-lynton
-ulan
-re-erected
-wikimedia
-hornaday
-scutes
-cvt
-erg
-sandor
-prolactin
-abies
-battenberg
-tulloch
-telltale
-harshest
-waddle
-naoki
-gauze
-powerups
-patapsco
-penitential
-downlink
-poggio
-ba'al
-inwardly
-1:2
-egyptologists
-kinesiology
-menzel
-1271
-flava
-floatplane
-nfu
-shand
-untapped
-hindquarters
-timm
-oldbury
-sixty-second
-associazione
-moloch
-constantia
-gam
-petrucci
-yana
-farman
-pran
-pdi
-hedging
-slumping
-cloutier
-zambales
-bleacher
-99.2
-nakagawa
-shalit
-supremely
-golems
-carron
-mintz
-coxeter
-twice-weekly
-waxed
-greenbank
-single-shot
-w.r.
-mits
-cerrado
-halpern
-dinghies
-wuthering
-yrs
-tenochtitlan
-amx
-bat-like
-hokuto
-raffaele
-naf
-tunics
-49.6
-sothern
-iran-iraq
-maimed
-cleanest
-crushers
-xxvi
-electromagnet
-camerata
-spm
-moated
-lipman
-trammell
-osho
-growls
-zelazny
-showalter
-well-organized
-constructicons
-unattainable
-mccurdy
-kuchma
-automorphisms
-adar
-5200
-savio
-unimpressive
-phaser
-chauvel
-milam
-mitanni
-berrien
-54.5
-rct
-chakraborty
-shortcoming
-seeadler
-kling
-rotherhithe
-bilston
-clonal
-menengah
-.10
-lifelike
-6-cylinder
-bulla
-gcr
-hand-painted
-season-high
-ghia
-archangels
-51.5
-touche
-monaro
-832
-mittal
-chit
-dramatisation
-hovered
-schenley
-hellions
-distrustful
-1436
-8.20
-kaduna
-appaloosa
-crocus
-henriksen
-quadrangular
-heritability
-mondeo
-1.85
-coldfusion
-evocation
-iz
-dispelled
-827
-36,875
-rangpur
-omega-3
-vilified
-wwc
-velez
-reappearing
-avs
-uffizi
-beira
-labor-intensive
-southworth
-0.12
-punky
-baran
-kimi
-mallon
-wwwf
-unconcerned
-arba
-sapir
-unprofessional
-marshlands
-n&w
-deeside
-catch-all
-coeliac
-arslan
-gooden
-cana
-nephilim
-1208
-inquisitive
-cyclonic
-threesome
-corse
-comitatus
-oakenfold
-vitaphone
-brauer
-netherworld
-phonics
-toothbrush
-blizzards
-goodfellow
-self-directed
-kremer
-vocabularies
-substantia
-1177
-kashima
-nitroglycerin
-hammadi
-annulus
-recitative
-2002-2006
-gopi
-yishuv
-anhalt
-append
-elegiac
-foams
-smoothbore
-mosman
-xxl
-alamitos
-remote-controlled
-ladislaus
-burges
-jaye
-goble
-macao
-artist-in-residence
-seattle-based
-farooq
-echols
-msrp
-microprose
-neidhart
-thirty-ninth
-yavapai
-bosniak
-sotho
-referential
-teletoon
-1223
-mitzi
-blush
-lapwing
-radio-friendly
-freund
-transcriptase
-hasmonean
-anti-capitalist
-quanta
-gioia
-5s
-misamis
-epworth
-irda
-dhar
-fornication
-jcp
-salix
-transshipment
-snippet
-stm
-dtd
-kind-hearted
-jacen
-kinematic
-1996-2000
-exclusionary
-forger
-jailhouse
-sixty-sixth
-saint-pierre
-racketeer
-satirizing
-feodor
-ineffectiveness
-firenze
-handbags
-41.0
-wyong
-parle
-horgan
-intrastate
-exons
-sacd
-megabyte
-2.13
-tzvi
-6800
-annes
-972
-1362
-copacabana
-.308
-acerbic
-sultry
-citron
-postgresql
-nach
-annexes
-cws
-newly-founded
-a330
-fibromyalgia
-gullah
-jamaicans
-nussbaum
-ildefonso
-awadh
-denby
-rubs
-bathhouses
-factionalism
-b-36
-carolus
-cleeve
-colorist
-piaggio
-sagi
-peddler
-deft
-senders
-sleeveless
-armpit
-tenderloin
-691
-cybertronian
-unpopulated
-malley
-mcmichael
-motile
-1982-1983
-thracians
-conservator
-kooper
-bonilla
-679
-devos
-canvassing
-choppers
-harvesters
-gainsbourg
-trion
-gaudiya
-anasazi
-lsa
-rolland
-chf
-letchworth
-displaces
-99.4
-gana
-helpline
-visionaries
-magill
-wulf
-northbrook
-parabellum
-necrotic
-barros
-polyglot
-1249
-high-capacity
-entanglements
-pdpa
-soya
-speedball
-laon
-mossley
-48.3
-mutates
-shankill
-winterthur
-x-treme
-bcal
-younis
-1590s
-etudes
-interpretative
-sensuality
-wooten
-h.e.
-wilders
-refrains
-evita
-commissars
-talksport
-munsters
-tolerates
-mtu
-barca
-motivates
-720p
-cabral
-tommie
-fastback
-babysitting
-dalmatians
-wabc-tv
-jumpsuit
-georgette
-fly-half
-similar-looking
-spun-off
-caney
-spotty
-51.7
-ambala
-tigrinya
-pretence
-spacelab
-pattani
-hutcheson
-longshoremen
-sunburn
-munir
-newlywed
-residuals
-non-profits
-lynwood
-nilgiris
-chokes
-uprated
-wortley
-khosrau
-kapur
-saharanpur
-nevermind
-silting
-monkton
-766
-phobias
-sverre
-black-tailed
-rowdies
-takings
-forceps
-lssp
-danson
-wru
-premonition
-derisive
-walgreens
-barbell
-garratt
-raskin
-scuttle
-coronations
-decibels
-congreve
-bloated
-sneezing
-dormouse
-gari
-transferase
-henge
-post-shakedown
-savimbi
-bhosle
-1093
-l.j.
-dprk
-zag
-sheppey
-adware
-downe
-wsm
-harkin
-87.6
-gonville
-carsten
-sigrid
-ribbentrop
-566
-trapezoid
-sproul
-abysmal
-allround
-golconda
-heckling
-bagel
-kawai
-stringfellow
-realists
-psmith
-mommsen
-incubates
-1998-2002
-steller
-dungy
-taskmaster
-19-year
-usurping
-aldosterone
-darko
-cornerstones
-camshafts
-dhahran
-gossamer
-afterburner
-riverwalk
-camrose
-gallantly
-summerside
-pouteria
-stay-at-home
-bohm
-eruptive
-aromas
-260,000
-confetti
-degeneracy
-brainstorming
-coexisted
-falconry
-phrased
-gillen
-beitar
-m.j.
-bcd
-claret
-mariinsky
-intraocular
-britanniae
-miu
-redpath
-burbage
-stonemason
-strikebreakers
-gilder
-eazy-e
-murrell
-direct-to-dvd
-1329
-loathed
-bas-reliefs
-ballantyne
-sandhills
-interventional
-rinse
-kitiara
-begat
-leggett
-oxalate
-arw
-drooping
-nigga
-shoving
-mediterranean-type
-colorized
-bre
-matchday
-restlessness
-evatt
-chartering
-signposted
-ashman
-spotswood
-biotin
-supertramp
-twelve-year-old
-sereno
-bolden
-fakir
-milledgeville
-sup
-treecreepers
-fiume
-penciler
-anti-abortion
-molt
-lsts
-cataloguing
-farwell
-hugged
-aco
-manipal
-mudhoney
-light-rail
-hospitallers
-all-wheel
-koppel
-whistled
-standardizing
-wh
-consortia
-truthfully
-peritoneal
-ralphie
-makarios
-3.32
-bruiser
-particulates
-howick
-oriente
-1002
-willys
-6-inch
-narthex
-shimmering
-murshidabad
-presumes
-springvale
-rawdon
-interrogating
-1914-1918
-muskrat
-snake-like
-lfo
-njt
-pantano
-raz
-tri-county
-liquors
-lock-up
-wireline
-meridians
-oakham
-ghazals
-corvids
-geomorphology
-silverberg
-arcturus
-flickering
-mendicant
-bulbuls
-dastardly
-westerner
-santini
-culbertson
-frawley
-1395
-lilandra
-renata
-jetties
-vicarious
-ancestries
-853
-komatsu
-shinoda
-bravest
-voyagers
-ribera
-wombat
-759
-accompaniments
-eff
-sandiego
-spacesuit
-1136
-qvc
-sowerby
-kats
-offsetting
-obturator
-13-0
-sneeze
-3c
-smpte
-air-to-ground
-blooper
-bemba
-scheckter
-hobbits
-brannigan
-uncommonly
-german-occupied
-white-throated
-pilatus
-4.30
-donegan
-50.7
-naugatuck
-flipside
-foch
-40-year
-mckeever
-sculpin
-bootsy
-acolyte
-infidel
-sunn
-biggar
-brigg
-taubman
-dummer
-995
-keweenaw
-pastored
-51.3
-glanville
-cone-shaped
-cvo
-diplomatically
-nettwerk
-neuberger
-rothko
-scallops
-imperious
-coaxed
-caning
-cartographic
-langmuir
-doucet
-chelan
-pyar
-1184
-trapani
-roselle
-kelleher
-state-controlled
-vanquish
-shs
-vfw
-iolanthe
-asme
-flutist
-damodar
-runciman
-sixty-fifth
-qs
-cranwell
-cochlea
-50-50
-take-away
-699
-home-grown
-kinsale
-undiagnosed
-sperling
-priscus
-wreaking
-harbourfront
-evictions
-verdant
-zaku
-masashi
-michell
-pelletier
-saluting
-tarp
-nanoha
-doylestown
-ethnikos
-lsc
-intercede
-zwei
-coin-operated
-romany
-waid
-seahorses
-tir
-sunscreen
-plummet
-balochi
-odoacer
-hyperolius
-gratian
-cingulate
-pushcart
-ginzburg
-hydrocephalus
-tolentino
-metaxas
-cep
-markowitz
-lsm
-leda
-messed
-vinaya
-nits
-reimer
-historicism
-time-travel
-hinders
-placeholder
-seatbelt
-geochemistry
-feliz
-rosenwald
-bpp
-iola
-danza
-astin
-perrault
-schuller
-53,000
-boren
-tuen
-salami
-downfield
-malhotra
-dud
-jedburgh
-kjv
-tighe
-opportune
-hyperoliidae
-yohannes
-aminotransferase
-hellish
-dalla
-pns
-kohat
-bodley
-842
-pedigrees
-attock
-gangetic
-gender-neutral
-outmoded
-gromit
-meantone
-posturing
-assistive
-meighen
-appropriating
-50.8
-gettin
-tamarack
-keillor
-malvaceae
-goderich
-shimane
-verney
-blazed
-1980-1981
-maribor
-anti-globalization
-categorizing
-ground-attack
-wanstead
-diffuser
-tsuyoshi
-otero
-ulcerative
-brest-litovsk
-beaded
-contactless
-spyker
-bongos
-depredations
-lotharingia
-iie
-aelius
-pylori
-disinfection
-inconceivable
-jumo
-797
-photonics
-hyperlinks
-kodokan
-dads
-niebuhr
-cased
-greenlandic
-incapacitating
-schiphol
-obsolescent
-arians
-baikonur
-mousetrap
-ohv
-ohr
-bhi
-dniester
-existentialist
-hiroki
-fata
-centric
-carbon-nitrogen
-incompressible
-bellona
-1950-51
-gramercy
-low-energy
-amalgamating
-radioisotopes
-carvers
-toot
-simmering
-mowing
-bunsen
-jaisalmer
-greenpoint
-yusef
-bandra
-medium-pace
-humbug
-deviants
-a.o.
-nishi
-grossmith
-2/4
-madeley
-bronte
-swingers
-yani
-stourton
-unreachable
-fireside
-arrangers
-696
-ergodic
-brito
-1116
-hofstadter
-marchese
-blurry
-metropole
-laroche
-chalfont
-photogenic
-turbochargers
-wain
-wail
-prager
-landshut
-vyasa
-ninety-nine
-atkin
-671
-qadir
-piro
-47.6
-monahan
-sheri
-scrapes
-suitcases
-busting
-chabot
-1343
-oppressors
-headman
-cyclopedia
-harrods
-off-field
-wyckoff
-barbieri
-hse
-emplaced
-49.1
-likenesses
-disagreeable
-1409
-junagadh
-swinger
-troublemaker
-npp
-reeder
-lithium-ion
-seljuks
-physiotherapist
-usefully
-stuckists
-carden
-asano
-cultivable
-cocking
-ramachandra
-relict
-755
-stewarts
-codice_12
-40-50
-ampex
-sgs
-docs
-neko
-commercialize
-elkton
-anastacia
-mim
-tetrahedra
-hamden
-post-operative
-sarcophagi
-borehole
-carcinogens
-camanachd
-galaxie
-irrevocably
-walhalla
-penalised
-teleporter
-lannister
-gibberish
-dte
-re-emergence
-verdean
-lineal
-closeted
-mizuki
-anabaptists
-contagion
-triffids
-am/fm
-talkative
-ten-day
-tinnitus
-self-evident
-51.4
-51.6
-1443
-synchronizing
-hayat
-interbank
-intramolecular
-sayyaf
-2900
-sotomayor
-durkin
-bungle
-fwd
-closets
-al-walid
-kum
-egoism
-knaresborough
-bundling
-grandfathered
-haverfordwest
-1,2
-eliciting
-bagels
-maribyrnong
-pillbox
-sapphires
-sclc
-0-4
-1944-45
-tapas
-bereft
-dolittle
-1.35
-jaan
-m14
-literals
-1278
-samnite
-ews
-transportable
-double-stranded
-mercantilism
-oleksandr
-maxime
-rhododendrons
-barbus
-electro-mechanical
-d'agostino
-kiri
-hellmuth
-subsections
-reni
-immanent
-european-american
-52.6
-chittenden
-macarius
-kahne
-liberace
-symbolist
-in-joke
-mgs
-5am
-89.4
-861
-levantine
-donaghy
-abstentions
-cna
-seismology
-salvaging
-dostoyevsky
-stipulating
-hanrahan
-ossett
-endometrial
-exhibitor
-brigands
-rattling
-shenzhou
-regurgitation
-then-girlfriend
-writer-in-residence
-douro
-wolof
-reinvigorated
-1995-1997
-baraka
-coc
-baden-baden
-autocorrelation
-maskhadov
-25-30
-oblasts
-tri-city
-vulva
-extremadura
-runnels
-altus
-sohail
-highpoint
-overworked
-khai
-bellagio
-pillared
-videoconferencing
-kelis
-publica
-rightist
-beekeepers
-h2s
-gladiatorial
-9x
-magoo
-sids
-tetralogy
-disjunction
-ostrander
-renominated
-abutting
-fdi
-commendable
-jizya
-decrypted
-northwestward
-4,300
-ary
-spanky
-kidz
-statehouse
-multitrack
-susana
-vickery
-depraved
-washroom
-apotheosis
-darlinghurst
-humpty
-yanam
-pre-show
-zagora
-railed
-nitrogenous
-bartel
-unep
-sikandar
-ameliorate
-defiantly
-genealogists
-repugnant
-30-second
-daggett
-fresher
-koku
-libertine
-timberline
-satish
-disturbs
-redshirted
-wosm
-lunsford
-gojjam
-kittiwakes
-andropov
-bna
-timecode
-tympanic
-deum
-arabi
-kanchipuram
-pawley
-engrossed
-hairston
-yossi
-statins
-burchill
-miser
-lower-case
-813
-salk
-touro
-glennon
-depowered
-syncope
-aitchison
-arpeggios
-1998/99
-castling
-hoey
-pedophile
-mid-america
-revisiting
-drainages
-18:00
-neurosurgeon
-pix
-two-hourly
-ferrite
-syndicalist
-emeryville
-3dfx
-moguls
-interglacial
-kojak
-gotras
-ponytail
-pizzicato
-lci
-mild-mannered
-instabilities
-shags
-considine
-56,000
-thick-knees
-omnipotence
-exaggerating
-run-ins
-jabs
-11,500
-mendon
-forewings
-bloopers
-metternich
-lippincott
-adulyadej
-assailed
-coronel
-pmc
-tempus
-aiello
-suan
-toyland
-tornados
-ramanuja
-5-5
-recessions
-tentacle
-slung
-pasolini
-landward
-ajpw
-seema
-yousuf
-hoggard
-7a
-987
-mexicali
-ltp
-uneconomical
-belconnen
-alluvium
-tdm
-jehu
-embry
-ringtones
-watermills
-ketamine
-shreds
-cadman
-sep.
-nilsen
-belloc
-enthralled
-1442
-cardholders
-enchanter
-amery
-n.d.
-plumbers
-copts
-internals
-dalkeith
-concours
-pedestrianised
-conyngham
-gaff
-morbihan
-7-yard
-truthfulness
-9.10
-pilings
-ridiculing
-littlehampton
-hindemith
-zildjian
-mk1
-sdsu
-graydon
-fads
-goldsborough
-j.r.r.
-aphrodisiac
-schlumberger
-corio
-demme
-oratorios
-hellespont
-reconsideration
-orchestrate
-legalised
-low-profile
-11-18
-ug
-whitewashed
-profaci
-timah
-four-dimensional
-lyudmila
-789
-43.1
-gyan
-il-2
-nami
-brecker
-dribble
-hulagu
-edu
-gaithersburg
-castagnoli
-presets
-croker
-anti-air
-airtran
-subsoil
-kirklees
-2007-present
-jaxx
-cyclin
-sociopolitical
-higher-end
-moen
-juventud
-meech
-unionization
-kresge
-1413
-hiphop
-stoops
-carers
-buccal
-kaaba
-dreadlocks
-dipoles
-pantheism
-cerrito
-nieuport
-corrs
-mabry
-kingsbridge
-student-athletes
-ingmar
-mvd
-4:1
-55.4
-powerline
-brahmi
-wildebeest
-culpable
-peal
-perceptible
-docket
-ocelli
-rsm
-non-operational
-promissory
-foghorn
-stratemeyer
-ten-pin
-44.3
-1252
-cdm
-top-tier
-flatness
-ilkeston
-talavera
-miroslav
-aline
-explosively
-idc
-quidditch
-feint
-one-week
-laplacian
-pape
-season-long
-sub-region
-prine
-insomniac
-visualizing
-sebastien
-bakelite
-carbonaceous
-polytechnical
-offscreen
-birdland
-presynaptic
-marinated
-amanullah
-physiologic
-1993-1995
-rosewater
-kulothunga
-hadassah
-hashes
-variational
-brigantine
-tarrytown
-averell
-sublimation
-seb
-forklift
-matsu
-poaceae
-cellophane
-i-85
-railtrack
-monday-friday
-boulanger
-typist
-823
-bellies
-atman
-ghidorah
-katarina
-1276
-hashomer
-dampen
-mazandaran
-wagers
-wolfsbane
-diphthong
-hathor
-neustria
-0.11
-gunns
-gau
-monika
-12-hour
-eurydice
-donnell
-trashed
-well-designed
-bustards
-axonal
-franciscus
-ragan
-layoff
-christopherson
-lynchings
-heathcliff
-conjured
-re-issues
-yehoshua
-thebans
-butterflyfish
-tye
-bottomley
-legionnaire
-diophantine
-franke
-capricious
-esf
-marmara
-kruse
-slaveholders
-thurles
-straighter
-paneling
-kunstler
-eschatological
-bonne
-43.4
-807
-grout
-pce
-bwf
-kumble
-desron
-kovil
-10-4
-sniping
-2000-2004
-imprinting
-mouthparts
-disaffection
-self-discipline
-agata
-hardinge
-redland
-conservatively
-underwriter
-nanded
-inescapable
-mcw
-ingle
-margraviate
-armen
-vegeta
-shamanistic
-allaire
-telemarketing
-high-wing
-zentraedi
-loran
-jawahar
-45.1
-attestation
-cowdenbeath
-aaj
-slipway
-shiromani
-slavers
-grimoire
-lop
-pickguard
-koine
-volgograd
-jase
-wark
-story-telling
-landholders
-psilocybe
-pineda
-paraphrased
-homologation
-abhishek
-okazaki
-portables
-canvey
-att
-sau
-disallow
-bilge
-kix
-discursive
-pre-teen
-ctesiphon
-takuya
-congratulates
-1289
-cofounded
-ringleader
-delisle
-dismember
-goby
-bic
-semien
-848
-unisys
-x-15
-wiper
-cooch
-kass
-syringes
-tightrope
-stinky
-tranz
-wholeheartedly
-fluff
-jyllands-posten
-permeates
-non-communist
-flail
-fifty-sixth
-postcolonial
-suo
-anti-zionist
-violator
-semi-nomadic
-cyclase
-discoverers
-homily
-hammocks
-cartons
-simonds
-thallium
-rabuka
-cwt
-flowered
-putrajaya
-wayang
-backroom
-granularity
-molde
-palgrave
-passant
-miquelon
-guaimar
-miscellanea
-kuna
-tmd
-finno-ugric
-scg
-pare
-bentonville
-engler
-transcending
-retaliates
-ludington
-40-year-old
-46.3
-irreversibly
-hainault
-akane
-shulchan
-regrowth
-mayr
-clonmel
-sikar
-diu
-woomera
-meatpacking
-wiccans
-taxicabs
-5,200
-cheviot
-longueuil
-compels
-psychometric
-1344
-herts
-haugen
-reciprocate
-muffins
-ch2
-tableland
-mostar
-muar
-sesquicentennial
-smelters
-s.p.
-ula
-gottschalk
-nourished
-busses
-fun-loving
-championnat
-1336
-gluing
-inter-island
-welton
-pikmin
-male-dominated
-sedgley
-gondoliers
-melo
-pageantry
-pisano
-scallop
-repented
-gbc
-lunt
-pen-name
-ostriches
-1438
-rank-and-file
-bickle
-hendrie
-highest-paid
-eco-tourism
-befall
-mangal
-estero
-limped
-sedate
-kuk
-modulators
-selhurst
-chamois
-binaural
-corvinus
-muscatine
-nib
-zarathustra
-gse
-cockermouth
-culebra
-kwantung
-uttoxeter
-bookshops
-rookery
-zeo
-mow
-gori
-turbans
-straights
-fca
-accenture
-shauna
-bonnaroo
-humbled
-flatbed
-pirena
-gutman
-emulsions
-sammamish
-crum
-ert
-takao
-toland
-diebold
-913
-nicomedia
-daya
-evaporative
-cheam
-panoz
-yoder
-disastrously
-snowshoe
-momenta
-point-of-view
-zum
-thigpen
-scalloped
-unrevealed
-1457
-diamond-shaped
-readout
-kartik
-congleton
-welder
-879
-liens
-fancies
-draconis
-trope
-seiji
-silted
-iib
-lamy
-discontinuities
-9.20
-lld
-medially
-c.p.
-mecklenburg-strelitz
-52.5
-52.4
-10cc
-washington-based
-latvians
-baibars
-reiter
-683
-comuni
-androscoggin
-echuca
-timber-framed
-1427
-kyuss
-apm
-warder
-----
-baptize
-one-tenth
-pendlebury
-ethnomusicology
-levying
-bereavement
-buckskin
-multiplies
-roode
-mmr
-twelver
-u.m.
-brassey
-palacios
-8-3
-peano
-congregationalists
-744
-multi-billion
-chania
-thinned
-testicle
-bullfighting
-muharram
-qua
-immunotherapy
-aneurysms
-maranatha
-binders
-cim
-kross
-10-1
-quraysh
-doppelganger
-catheters
-851
-mengistu
-wotton
-wok
-yagi
-kristiansand
-welshpool
-brooms
-well-founded
-roadbed
-murano
-45.2
-vocalization
-cantilevered
-curbing
-forma
-jermyn
-razzie
-baksh
-981
-olean
-a.w.
-amides
-849
-subterfuge
-61.5
-iits
-difranco
-atrocious
-yutaka
-dayak
-combing
-sahiwal
-renner
-bipod
-burglaries
-1288
-maintainer
-blunkett
-snag
-sokolov
-hopelessness
-super-villains
-battleford
-gregoire
-aef
-furnariidae
-anticholinergic
-gurley
-tartans
-davina
-ponta
-spla
-vashem
-pickers
-nrhp
-kubota
-32-yard
-moltisanti
-carracci
-ulna
-harnett
-shias
-973
-grinning
-geshe
-iwan
-fish-eating
-emanation
-basile
-european-style
-carmina
-nestlings
-fellatio
-cassady
-opportunist
-irvan
-hed
-huskers
-grandis
-tinggi
-echinoderms
-wf
-albus
-patter
-addy
-intercalated
-viceroys
-creston
-unary
-pujols
-copd
-self-rule
-pygmies
-melodramas
-f-104
-housings
-non-speaking
-physiologically
-tempestuous
-zamalek
-prophylactic
-scree
-stillson
-ephesians
-allegories
-paras
-roebling
-privately-held
-rashes
-beatriz
-purport
-situationist
-lonergan
-mahony
-crevice
-12-15
-baile
-doble
-biohazard
-pdp-10
-zuo
-twiggy
-blass
-massless
-cardenas
-kamenev
-ghibli
-grief-stricken
-noisily
-rattus
-pretzels
-turlough
-chairlifts
-computability
-sulayman
-cheshunt
-maranzano
-ongar
-wilhelmshaven
-voluptuous
-authorise
-kaunda
-estradiol
-cosmodrome
-wessels
-amihan
-zeng
-rcm
-lanham
-cdo
-goings
-173rd
-moscone
-ihs
-interrogative
-hekmatyar
-late-1970s
-yamanashi
-porth
-safeties
-kcvo
-hildreth
-musl
-mitzvot
-xmpp
-semitones
-unionidae
-bbn
-adductor
-disconnection
-837
-wintered
-maclennan
-seq.
-savona
-kalinin
-tumulus
-bloat
-tewksbury
-amadeo
-8-7
-magnetized
-faster-than-light
-howdy
-shaivism
-self-financed
-naoko
-semi-official
-non-medical
-birkett
-farrelly
-quiver
-hydrogenated
-ijssel
-estudiantes
-convalescence
-austrasia
-karp
-top-notch
-ahsan
-floodwaters
-wasim
-balkh
-relive
-ashbourne
-ill-advised
-pascha
-erne
-r-rated
-anti-aliasing
-lumbee
-f.c
-laces
-matti
-formic
-ethnological
-brumby
-open-minded
-complainant
-strathspey
-brawls
-dillingham
-mato
-fart
-leukocytes
-grassed
-re-married
-overhearing
-selinsgrove
-8s
-witchblade
-yuk
-lugar
-raisers
-bereaved
-pre-race
-odum
-incan
-habermas
-marri
-hinman
-jalil
-pseudoscientific
-1991-1993
-yusuke
-fuad
-ageless
-paredes
-shaukat
-gusto
-grantor
-worldcon
-evades
-zealots
-kilsyth
-wagering
-temasek
-drenched
-cutout
-nucleophile
-pneumoniae
-prater
-mcguigan
-repeatable
-o'quinn
-wachusett
-dark-skinned
-personalised
-clothier
-pudsey
-tikal
-boles
-fpu
-nogales
-l'engle
-10.10
-kpd
-polynesians
-rothenberg
-malini
-arika
-scuola
-rambaldi
-flashpoint
-daws
-wesker
-hoban
-alon
-1331
-delmarva
-pensioner
-despairing
-unbridled
-poitou
-anemones
-mindfulness
-kerk
-kau
-unrecognizable
-c/c
-najibullah
-spastic
-livable
-sneaker
-akan
-zia-ul-haq
-prolongation
-evermore
-buckminster
-nunes
-jeet
-middleburg
-loyd
-corfe
-creatine
-solothurn
-rejections
-transpo
-tendering
-tourer
-eroticism
-cataloged
-heyward
-adana
-bogor
-moberly
-overdoses
-aaas
-markey
-nefertiti
-frogmen
-delimit
-leathers
-1369
-receivable
-modulates
-audioslave
-dass
-newsman
-album-oriented
-kiama
-fabrications
-entrenchments
-cryptosystem
-transubstantiation
-993
-guided-missile
-seleucids
-visitations
-weatherfield
-bubbly
-townlands
-musser
-neoliberal
-gamaliel
-warthog
-shyamalan
-numb3rs
-1262
-rectifiers
-vivek
-snoring
-imprisons
-tomasz
-kotor
-epaulettes
-koloff
-rox
-crumlin
-grecian
-crookes
-15-18
-yasna
-brazzaville
-viceregal
-gooseberry
-coal-mining
-dpi
-1620s
-muammar
-sandinistas
-sobel
-hypnotize
-lags
-teodoro
-sunflowers
-orenburg
-keanu
-11-16
-yankton
-unguided
-1995/96
-banca
-startups
-basilicas
-2,900
-silverlight
-rakes
-chemin
-arabians
-selleck
-starstruck
-kubert
-reinventing
-758
-ilam
-slinky
-bellshill
-bluefish
-harlech
-dubh
-harmonix
-maximinus
-hedonism
-arabidopsis
-fuzes
-arnis
-reaffirm
-re-imagined
-beals
-kyra
-festivity
-ramzi
-ackermann
-kauri
-haggerty
-gollancz
-garmisch-partenkirchen
-86.7
-civilised
-sapper
-affable
-mmo
-dacians
-renormalization
-baie
-sealand
-winstanley
-atahualpa
-dinobot
-warrantless
-red-brown
-wolstenholme
-corpora
-herz
-airpower
-re-branding
-saleen
-esd
-mcd
-ratnagiri
-competences
-sullivans
-retcon
-forty-seventh
-co-chairs
-cobh
-demento
-cabrini
-951
-washtenaw
-medved
-vz
-+6
-joly
-trochilidae
-ashkelon
-sub-surface
-neoprene
-buttes
-gyeongsang
-54.2
-naught
-iai
-obliterate
-kamel
-double-digit
-kannauj
-dworkin
-mashonaland
-lancing
-nameplates
-lech
-micron
-vidor
-lethality
-jcr
-giovanna
-statuettes
-rlm
-utd
-kendricks
-perle
-crewmember
-kkr
-moriah
-almagro
-sayle
-lcdr
-yamauchi
-stereophonics
-bursary
-tomkins
-moderns
-gervase
-verifies
-parra
-apocynaceae
-anti-communism
-british-based
-zog
-infineon
-krang
-thomond
-magisterium
-consonantal
-inborn
-bickford
-unlisted
-hysterectomy
-vitaly
-apricots
-faribault
-moby-dick
-oxo
-0-6
-satirizes
-jm
-pozzo
-eso
-garnished
-polish-soviet
-depression-era
-gowdy
-teignmouth
-paddlers
-keisha
-circassian
-conquistador
-mordaunt
-predilection
-prowse
-leucine
-circumventing
-pueblos
-maule
-case-by-case
-ede
-leavers
-3.40
-recoilless
-syro-malabar
-1321
-alenia
-zentradi
-lifespans
-brownback
-nietzschean
-kingsville
-resonators
-dodgy
-marga
-choughs
-hdb
-refloated
-susumu
-beas
-newstead
-attendee
-legalizing
-invictus
-bigoted
-travertine
-two-page
-shere
-xylem
-fennel
-alexandru
--15
-egregious
-dashwood
-19:00
-dodecanese
-phair
-elysium
-hite
-nonesuch
-savo
-angelfish
-lisboa
-hockley
-ohmsford
-bais
-dejazmach
-guttural
-heartbreaker
-lambasted
-raab
-subject-matter
-paragliding
-25.00
-batticaloa
-clogging
-caria
-15-day
-listowel
-propels
-marvell
-h6
-underfunded
-venison
-sunspots
-nyingma
-mccarran
-consul-general
-demonstrably
-performance-enhancing
-welds
-then-husband
-anuradhapura
-chinchilla
-unpainted
-makai
-pbr
-scoreboards
-zilla
-merrion
-dioecious
-vst
-tourniquet
-3gpp
-rhiannon
-sulfite
-hoople
-perfectionist
-manservant
-lincolns
-desiderius
-buchman
-44.7
-beyonce
-marwari
-banister
-karolina
-kroner
-endothelium
-sambalpur
-dissociate
-topanga
-harmsworth
-chordal
-gsp
-landulf
-superseding
-legitimized
-nestorius
-silkworm
-abolitionism
-awardee
-xxxviii
-mcminnville
-hypermarket
-detonators
-kagawa
-tallying
-gerrymandering
-hauraki
-aragorn
-839
-orff
-wetton
-bricklayer
-potholes
-kenichi
-e-3
-histogram
-halloran
-komi
-banderas
-philistine
-haughey
-4x100m
-beachy
-dede
-cobblestone
-analogously
-cambodians
-humbucker
-cardiologist
-54,000
-asx
-arraigned
-nyquist
-chiropractors
-fastening
-tantalus
-lessee
-oau
-redo
-preceptor
-umayyads
-overpowers
-spaceshipone
-didgeridoo
-peruvians
-0.14
-aik
-obp
-rossiter
-spamming
-tengo
-teeming
-capello
-soiled
-predictors
-1/10
-micro-organisms
-ridgeline
-leduc
-teething
-gandharva
-blau
-searing
-ridged
-balthasar
-howth
-terrapins
-redhawks
-thievery
-maser
-streetwise
-keiji
-teg
-anishinaabe
-speechless
-collectivist
-trove
-flocking
-middens
-irrevocable
-87.2
-panesar
-glucagon
-messes
-subarctic
-babbar
-nishimura
-abitur
-earthforce
-joscelin
-nayaks
-strada
-goalscorers
-darbar
-d'oro
-gandhian
-craps
-winterbottom
-read/write
-tractatus
-fortifying
-self-penned
-hooray
-ramey
-ulsan
-longoria
-verandah
-debriefing
-reaped
-afro-caribbean
-unquestioned
-milkshake
-heaped
-overrunning
-noche
-mcalister
-1992-1995
-rafah
-candia
-katharina
-ligature
-servilia
-water-powered
-reminisces
-contravariant
-kooning
-agung
-pathi
-comparator
-kal-el
-fasa
-kosh
-1969-1970
-creel
-plurals
-caer
-mumbles
-covenanter
-prunes
-mg/kg
-cuddalore
-slowness
-988
-rainham
-haviland
-adb
-protrusions
-hist
-renters
-skippers
-50-60
-barossa
-formosan
-72,000
-racquets
-overgrowth
-resisters
-zulfiqar
-salic
-ilocano
-rediffusion
-whitton
-rafter
-amalie
-novaya
-goof
-preah
-hcc
-giddings
-abductor
-835
-hardaway
-snowboarder
-erm
-marrickville
-glyndebourne
-photoreceptor
-barbarous
-seaworthy
-encoders
-century-old
-aker
-subcompact
-imi
-nasution
-surveyor-general
-1156
-riveting
-ghani
-sixty-seventh
-lunacy
-choctaws
-amerika
-copernican
-763
-evas
-deon
-noongar
-bacteriology
-muruga
-helston
-warley
-abc1
-privatize
-runt
-areal
-m11
-beltline
-weedon
-botulinum
-northcliffe
-babington
-reitman
-cayce
-obsessively
-ericson
-crochet
-sontag
-urethral
-1404
-starburst
-deviating
-qiu
-israel-lebanon
-toya
-mirkin
-hypnotist
-1353
-pto
-sympathisers
-nigh
-downgrade
-propagandist
-1115
-jayson
-sagittal
-ironing
-kafr
-brutalist
-matera
-eye-catching
-whiteness
-concierge
-twi
-rerelease
-taurine
-downpatrick
-charnwood
-p-type
-columbidae
-refocused
-sennacherib
-despicable
-klerk
-angleton
-addendum
-aggrieved
-constancy
-mapuche
-segues
-brolin
-47.4
-scraper
-aza
-bums
-methodically
-jeffersonian
-casein
-heatwave
-carle
-curries
-hi-hat
-messner
-sheaths
-uddin
-supercup
-fowls
-sixty-fourth
-chatterton
-g-protein
-pannonian
-gwg
-ozeki
-golly
-rall
-shulgin
-kathie
-galli
-eightieth
-snark
-ojo
-pre-arranged
-rydell
-patella
-hopetoun
-sondra
-kolchak
-teja
-cluttered
-rsp
-bc4
-vytautas
-megaton
-cera
-reavers
-hagley
-congregated
-989
-tacked
-arvind
-chlorides
-roni
-bls
-balian
-250th
-50.2
-moonlighting
-interconnections
-rectilinear
-lightyear
-lower-order
-showering
-hannon
-peeping
-bollocks
-amravati
-chileans
-lacombe
-zakat
-1376
-b-52s
-krum
-godless
-bogan
-fro
-emberizidae
-nori
-orange-nassau
-inositol
-886
-henley-on-thames
-aprons
-heeled
-etsi
-whores
-frere
-hickson
-deschanel
-filmon
-safran
-vladimirovich
-cogswell
-48.7
-spectrograph
-transonic
-candide
-foiling
-dpr
-flogged
-subgenera
-wwa
-khedive
-unsanitary
-ackroyd
-usba
-flotsam
-broglie
-kohli
-metastable
-kwinana
-mbna
-noncommercial
-carriageways
-turners
-iic
-all-in-one
-taluks
-cakobau
-backhouse
-i-495
-siad
-secretes
-sali
-personae
-10.50
-elson
-unrivalled
-brawling
-verdon
-shashi
-1423
-12-13
-99.0
-karaites
-zaidi
-hippolyte
-m.i.a.
-1417
-spahn
-bifurcated
-knightmare
-35-yard
-penryn
-achievers
-ganilau
-hypercube
-boosh
-weierstrass
-negativity
-antigonish
-vod
-fortuyn
-lacan
-batesville
-pauly
-artefact
-uncountable
-co-conspirators
-publicise
-43.3
-domine
-landor
-86.9
-38,125
-abkhazian
-uncomplicated
-upwelling
-dolenz
-emanated
-sein
-pro-union
-audiencia
-shel
-cadillacs
-adyar
-tci
-deyoung
-lint
-weevils
-draftees
-b.c.e.
-pipestone
-wreaked
-torched
-hashem
-stickney
-flicks
-mujibur
-writer/producer
-mcm
-one-game
-enviable
-alconbury
-early-1990s
-anarky
-elfin
-turnpikes
-eircom
-whatcom
-22-yard
-deja
-obstinate
-kilroy
-esher
-vara
-smrt
-retracting
-exhilarating
-chek
-sensationalist
-costigan
-augustinians
-claes
-helmuth
-analyzers
-fila
-izzard
-panchala
-kanazawa
-hampering
-gravitated
-freese
-marden
-white-winged
-backwaters
-broach
-deutschen
-annonaceae
-serpentor
-opendocument
-mammoths
-chiroptera
-arcseconds
-shays
-kearsarge
-outsource
-moaning
-hazmat
-h&m
-beis
-barnstormers
-finlayson
-thirty-sixth
-free-lance
-epistolary
-queueing
-sorbus
-arrondissements
-unadorned
-saxophonists
-28-yard
-trac
-heralding
-ansgar
-cradock
-mineko
-everytime
-load-bearing
-tetley
-devore
-humbly
-sofie
-foix
-dual-carriageway
-gelder
-localize
-swampland
-seneschal
-pipa
-corydoras
-shopped
-coroners
-hameed
-maus
-polyp
-vee-jay
-halevi
-sastri
-sameer
-squeaky
-hochschule
-i.d.
-torrid
-teluk
-strutt
-minnehaha
-amrit
-barbosa
-scotus
-aurea
-arsenals
-heilbronn
-discrete-time
-pirin
-8500
-rosaceae
-al-ma
-archdiocesan
-albi
-sullen
-croesus
-augments
-hawick
-madhava
-johnsons
-1352
-fifty-seven
-faustina
-snowboarders
-1157
-claridge
-kirkwall
-aratus
-goodell
-duxford
-cky
-lotion
-pieve
-grayskull
-re-run
-j.f.
-22d
-subdivide
-braff
-strangles
-liquorice
-animaniacs
-busey
-multilevel
-warringah
-53.2
-preys
-ganesan
-ibuprofen
-sistan
-muswell
-1981-1982
-smriti
-wideband
-dicky
-unenforceable
-r.f.c.
-clowes
-lani
-nikhil
-substructure
-greystone
-1989-1999
-zarqawi
-venerate
-colonialists
-karlheinz
-hemet
-jealously
-fiendish
-wentz
-rumsey
-6-6
-grantee
-nominates
-ridding
-guitarist/vocalist
-spindle-shaped
-42.0
-relegating
-ziggurat
-ashburnham
-grunwald
-hokies
-sellars
-apollodorus
-chaps
-damped
-jubilant
-fabletown
-shamil
-roared
-bfi
-heth
-45.9
-orland
-josephs
-estoppel
-snuka
-formula_70
-bellerophon
-nant
-1228
-cosigned
-octopussy
-spaniels
-paralyze
-x86-64
-timekeeper
-micrometer
-cls
-cowichan
-tere
-hertogenbosch
-madcap
-shrugged
-seep
-instalments
-wilkin
-duty-free
-dtt
-involution
-overshot
-abo
-cays
-nahum
-postponing
-fire-fighting
-habilitation
-advani
-first-past-the-post
-brogan
-kebangsaan
-brindle
-trappist
-proterozoic
-hass
-twos
-kdka
-temporally
-kher
-berrigan
-enderby
-a-channel
-studious
-rushen
-kees
-swordsmen
-toboggan
-gilding
-repainting
-9-yard
-seventieth
-libor
-friezes
-accusers
-waltons
-7-7
-bur
-junko
-thorold
-callender
-estimations
-meany
-ventnor
-meccano
-hematoma
-luxembourg.
-akio
-occam
-kib
-ork
-ditka
-gratings
-allyn
-tendou
-masao
-tanned
-laparoscopic
-8a
-cassock
-zhuhai
-jcs
-treatable
-mcevoy
-tumult
-long-haired
-subgraph
-gahan
-encroachments
-left-right
-bolin
-megafauna
-parasitism
-wiggle
-costuming
-thirty-seventh
-3300
-hashemite
-blueberries
-hoopoes
-crave
-eurogamer
-dirigible
-ignaz
-pocock
-brearley
-fae
-electro-optical
-tism
-bydgoszcz
-finnmark
-sierras
-olongapo
-capsid
-worldcom
-melange
-hakeem
-moldings
-plosive
-hairdressing
-mccombs
-talal
-yellow-green
-acetaldehyde
-initio
-white-bellied
-tanagers
-elated
-amplifies
-toponyms
-tomy
-klub
-malate
-49.7
-recyclable
-matanzas
-striatum
-middle-east
-heyford
-escapist
-alamogordo
-cowgirl
-agarwal
-pmr
-insuring
-grieved
-gcap
-1293
-detaining
-scalpel
-frankland
-rollback
-eagleton
-yoshihiro
-kestrels
-gargan
-autocad
-ligonier
-boyzone
-moonbase
-pascack
-photojournalism
-subtractive
-w.b.
-surah
-diehard
-cadell
-male-line
-hijra
-newstalk
-businesspeople
-rappaport
-southend-on-sea
-tilson
-280,000
-sdlp
-bil
-garofalo
-tilts
-eggleston
-franklyn
-otello
-bagging
-nrk
-zwingli
-confection
-drop-in
-blaylock
-trianon
-ineptitude
-interviewees
-iconostasis
-mastectomy
-ruan
-assiniboia
-khagan
-cruickshank
-ariana
-earth-like
-kropotkin
-russification
-covey
-glenfield
-paxman
-shas
-sibi
-peninsulas
-chesnutt
-macdonell
-divorcee
-pelagius
-us-led
-zebulon
-esb
-vvs
-caton
-s.w.
-siebert
-montmartre
-hickam
-minnows
-westernized
-macready
-non-american
-112th
-solidification
-march/april
-tints
-brienne
-pro-western
-srb
-sharad
-bishkek
-demeanour
-borderers
-nepos
-gat
-passersby
-gassed
-belladonna
-drdo
-breastplate
-referenda
-1am
-pde
-87.3
-sear
-silicates
-backtracking
-45.3
-sedona
-fap
-self-adjoint
-strood
-agronomy
-cookers
-1175
-sevendust
-ont
-destabilizing
-reaver
-pavlovich
-immunosuppressive
-igloo
-psoe
-arcing
-duelist
-trumpeters
-dagobert
-88.4
-86.5
-chakwal
-parisi
-telescoping
-randomization
-unhurt
-blatt
-b&m
-1391
-hegarty
-schatz
-apatow
-brien
-popmatters
-mandelson
-songkhla
-oftentimes
-sdf-1
-p.o.
-cheri
-stalactites
-xxxiii
-cellmate
-street-level
-khasi
-alsatian
-hj
-quiroga
-harrigan
-lato
-1338
-palaeontology
-underlings
-48.1
-hanif
-claud
-mimo
-langlands
-guderian
-coffeehouses
-underrepresented
-thapa
-mingling
-leaderless
-internecine
-bottomless
-besting
-zordon
-jann
-932
-squibb
-brazier
-50.5
-tyme
-1197
-pegu
-pre-julian
-prequels
-pampas
-crosswords
-subclavian
-seleucia
-headlamp
-berm
-petiole
-sixty-three
-worrall
-fulford
-1372
-godwits
-8.40
-munshi
-prod
-snatchers
-soledons
-greenback
-oxon
-descents
-pre-selection
-disparaged
-arima
-9-hole
-bushell
-bernier
-1183
-subcategory
-salvator
-dignitary
-multi-track
-kettles
-unfavorably
-ballou
-1365
-bell-shaped
-scarlatti
-torneo
-andries
-06:00
-aix-en-provence
-ivana
-stared
-tellers
-ligases
-outpaced
-prescient
-outlasted
-transceivers
-drawdown
-welle
-howitt
-pawar
-windows-based
-schleicher
-atacama
-1997/98
-lunatics
-parmenides
-0.13
-renderer
-adda
-hombre
-zilog
-orvieto
-trans-am
-magners
-payday
-self-reliant
-ucsf
-peloton
-drumsticks
-reproducible
-menaced
-plumaged
-helter
-altadena
-dissociated
-innovated
-drang
-loin
-snook
-iblis
-re-development
-archimedean
-roadsides
-saaf
-uninteresting
-maronites
-petersham
-grigg
-pervades
-arsenide
-boos
-bsnl
-jepson
-inla
-mra
-tongs
-bgp
-digivolve
-akiba
-boogiepop
-sidcup
-dieu
-stephane
-gena
-anxiously
-tenths
-dermis
-reassurance
-1-15
-searcy
-trevino
-birt
-densest
-greifswald
-truetype
-juried
-montano
-nats
-reuven
-casals
-hangin
-red-hot
-49.4
-adverbial
-peacekeeper
-magnuson
-monogatari
-gaozu
-d'amico
-anambra
-5-door
-brocade
-stroudsburg
-religiosity
-jaggery
-whiteboard
-trt
-i-35w
-1222
-winglets
-blockades
-dishonorable
-clarkston
-dubose
-prodrive
-bsf
-tremaine
-sketchbook
-snow-covered
-4,800
-892
-remainders
-931
-laney
-dds
-lawnmower
-sedgemoor
-o'dowd
-rasta
-mikoyan
-polgas
-color-coded
-idm
-kamran
-medal-winning
-short-haul
-roughnecks
-eckersley
-enfranchised
-divestment
-miconia
-allsvenskan
-jeri
-theroux
-cvc
-near-threatened
-modding
-ascorbic
-dais
-saheb
-faceted
-sandilands
-boothby
-justiciar
-ppe
-neckar
-supersymmetric
-1153
-hyrcanus
-chanute
-bestial
-soapy
-colmar
-dred
-hoot
-captain-coach
-cookstown
-baseless
-gor
-mustering
-gordian
-hur
-filene
-prongs
-salieri
-suman
-circulations
-vats
-uke
-masi
-7.10
-epiphytic
-ice-free
-dildo
-pathologies
-742
-bordentown
-non-residential
-nine-month
-1119
-cruzeiro
-schematics
-july-august
-nn
-rance
-leghorn
-off-camera
-superset
-dalston
-mainsail
-top-scored
-godson
-asin
-malai
-tehachapi
-harkins
-oamaru
-troubadours
-encapsulate
-arsonist
-keres
-maclay
-subtribe
-sedalia
-dyk
-puffing
-osmium
-toasters
-recede
-roseau
-underlain
-zippers
-unveil
-trills
-1939-1945
-limousines
-hastert
-blimps
-invincibility
-88.0
-colds
-yea
-rayman
-cabinet-level
-emigre
-utricularia
-gerontology
-enamoured
-chloroplast
-touting
-parsers
-nicer
-multi-colored
-nainital
-corduroy
-hx
-gruden
-chronometer
-lauri
-hirsi
-929
-fallowfield
-tolland
-marquesses
-sciuridae
-beamish
-velociraptor
-dolphy
-borderland
-hesse-kassel
-teh
-cornhill
-rspca
-power-sharing
-knolls
-thrawn
-replenishing
-diggs
-yogananda
-epigraph
-prophesy
-barbets
-centrale
-1c
-844
-3h
-kerner
-synonymy
-stroma
-overheated
-1996-1999
-sewall
-dunst
-back-story
-concertina
-phospholipids
-b2b
-alle
-slidell
-c.h.
-7800
-coxon
-thuringian
-talbott
-soothe
-linares
-zev
-featurettes
-epicurus
-ager
-rifkin
-kingsmill
-dq
-crocodilians
-verisign
-sica
-transom
-refunded
-arakawa
-third-round
-cuatro
-balakirev
-2005-2008
-impressionists
-afterglow
-v5
-apoel
-trunkline
-restorer
-disunity
-skanda
-williamite
-esopus
-kron
-stowage
-meares
-equerry
-ionians
-lactobacillus
-curtius
-obstetric
-orientale
-desirous
-manuela
-dayne
-ueno
-million-selling
-retraining
-whims
-seguin
-tirol
-mizuho
-chowder
-o'byrne
-psg
-wd
-ducats
-knute
-hims
-hebe
-interlochen
-infeasible
-non-residents
-prefaced
-goldilocks
-azkaban
-bloodletting
-splashes
-guha
-lieutenancy
-tynemouth
-feudatory
-shugo
-0,1
-52.9
-sian
-gdynia
-footloose
-eudora
-wicomico
-aml
-weddell
-marionettes
-illegality
-redecorated
-anaphylaxis
-nonnegative
-cno
-kumi
-castilla
-47.2
-sheung
-homeroom
-sympathize
-morozov
-1432
-anachronisms
-shariah
-dampened
-cowards
-4,600
-lapeer
-catanzaro
-imager
-three-and-a-half
-b&w
-amersfoort
-crosstalk
-tsx
-carcassonne
-non-us
-ohms
-dsv
-agnus
-bomba
-pentecostalism
-offal
-mccoll
-concussive
-tapir
-literati
-corroborate
-thumping
-obc
-bounties
-declaratory
-kutcher
-latour
-vcs
-proudest
-bergson
-microarray
-strafed
-natsumi
-ucr
-amoeba
-berenson
-pocketed
-lytham
-worshipper
-infuse
-baleen
-camilo
-g.a.
-ketchikan
-tradesman
-cathars
-10km
-mapai
-takada
-weingarten
-sez
-datagram
-pitkin
-patroclus
-glace
-plm
-migraines
-helton
-trinkets
-s.d.
-majumdar
-distorts
-menorah
-kembla
-starkweather
-rigour
-stumpings
-codice_13
-blinky
-ghori
-coaling
-hmnzs
-alighieri
-heraclea
-midhurst
-1446
-pfp
-contaminating
-selva
-sledding
-lady-in-waiting
-cpl.
-burslem
-glabra
-autocracy
-bampton
-resourcefulness
-bigamy
-conclaves
-peltier
-pickerel
-7.70
-mousavi
-selmer
-bertelsmann
-ecs
-busways
-pleated
-ree
-ogc
-zellers
-anciently
-robusta
-phenom
-achebe
-choirmaster
-complacency
-loyally
-wildfowl
-glenview
-daddies
-grubs
-spanner
-bamba
-115th
-durandal
-corny
-sheehy
-determinate
-poli
-silencer
-madrasah
-etf
-non-whites
-12-14
-compacts
-disliking
-fino
-states-based
-norrington
-lungfish
-hfs
-ut-tahrir
-igm
-hisar
-commercialism
-transposing
-kfi
-lindner
-perrot
-stadler
-w.t.
-conglomeration
-krone
-stewardess
-masataka
-myeloid
-histoire
-689
-rivals.com
-pre-paid
-ligure
-m42
-enrolments
-codd
-gallican
-oems
-thx
-enschede
-stillwell
-boxcars
-crypts
-cantina
-steaua
-marky
-sibiu
-1339
-permeate
-roped
-bibby
-mulliner
-formula_71
-winstone
-lifehouse
-encapsulates
-cosmetology
-amberley
-helicon
-hellsing
-e.v.
-3.38
-mestre
-easyjet
-thunderwing
-santoro
-tarakan
-revs
-coworker
-disreputable
-hagman
-daimon
-petraeus
-selborne
-delhomme
-freescale
-lajoie
-50.6
-cics
-delbert
-panna
-glorifying
-biopsies
-breaux
-pro-german
-alltel
-lowenstein
-riku
-excavator
-aimlessly
-mork
-corino
-changeup
-earth-one
-driveshaft
-frostburg
-honeydew
-forward-looking
-kakheti
-ten-minute
-muntz
-speedwell
-a/c
-deighton
-bowerman
-torsional
-repositioning
-carty
-bitters
-proteolytic
-holloman
-troi
-joh
-droitwich
-lleida
-storting
-mahinda
-encrypting
-spoilt
-harty
-1364
-hanssen
-euronext
-enna
-glycoproteins
-evarts
-palladino
-exterminator
-schuler
-benigno
-meted
-sif
-purser
-granddad
-frelimo
-graphene
-h1n1
-condit
-vagueness
-erastus
-kelvins
-unfeasible
-fukuyama
-225,000
-molise
-interdimensional
-svr
-fairbank
-satie
-ocelot
-1500m
-ane
-chorister
-1118
-upjohn
-naguib
-retransmission
-tld
-eto
-nichiren
-prosaic
-marini
-toffee
-1203
-bobsled
-lensing
-engravers
-galatia
-lifeson
-mp3s
-birley
-subverted
-korner
-param
-leni
-four-week
-leggings
-793
-hentai
-sitters
-m.g.
-crake
-flagellum
-pattie
-self-deprecating
-wetzel
-irfan
-mcgoohan
-domiciled
-blalock
-ubi
-tonsils
-villainess
-asch
-dever
-waitresses
-sixty-first
-gridlock
-umi
-microkernel
-councilmember
-jenkin
-freiberg
-tantrum
-mhd
-greencastle
-ummah
-subtleties
-wintour
-duca
-pudong
-rudely
-herero
-queenston
-hiker
-aeruginosa
-lozano
-mineralogical
-chiefdom
-flapper
-escuela
-soloing
-cruikshank
-headlong
-coram
-themistocles
-1994-1996
-tricyclic
-taguchi
-sarsfield
-re-sign
-cultic
-rfp
-ramsbottom
-1680s
-electrophilic
-matsuri
-ullrich
-cadavers
-nfpa
-round-the-world
-dda
-paraplegic
-arleigh
-guitar/vocals
-semifinalist
-gaborone
-scarpa
-career-highs
-rapists
-cobbs
-uts
-conservators
-nitrazepam
-f-15e
-cayo
-hawkgirl
-klitschko
-mcinnes
-marella
-evangelicalism
-biomechanics
-regum
-yggdrasil
-mathewson
-51.2
-well-versed
-ensenada
-aed
-voblast
-besson
-883
-hutchence
-theoretician
-skagway
-langur
-2am
-sussman
-chica
-uptight
-jumble
-fagen
-nomi
-blakeney
-cobbett
-muyo
-schwinn
-sheela
-asper
-decentralised
-letran
-onegin
-postmortem
-cartographers
-minamata
-eagan
-patricio
-bms
-blobs
-qa
-supple
-flatlands
-svensson
-1035
-resell
-macleay
-muzaffar
-colloid
-legible
-craiova
-emmanuelle
-re-write
-kegs
-demigod
-ladybird
-journal-constitution
-khurasan
-hepworth
-jonze
-leoni
-nephropathy
-w.j.
-46.9
-norske
-escambia
-american-made
-wallpapers
-jour
-angiosperm
-stroh
-natak
-scherer
-oppositions
-sangli
-brownell
-ksc
-walcheren
-redaction
-demarest
-corporals
-pistoia
-premolars
-etzion
-ringway
-outcroppings
-bourque
-forty-sixth
-partisanship
-horsa
-nanticoke
-news-press
-cecile
-newcomen
-tura
-bostwick
-ewes
-barnacles
-october-november
-byblos
-peritonitis
-betamax
-skelter
-entendre
-sportive
-dez
-airstream
-edta
-120th
-0.10
-weintraub
-752
-cbp
-baytown
-millburn
-talcott
-foolishly
-+8
-unhappily
-nicias
-quraish
-unevenly
-libro
-vanda
-coombes
-fsf
-koil
-jeezy
-portis
-kilts
-exclaim
-interlocked
-magics
-andriy
-lexx
-tpc
-cromford
-corrector
-cellphones
-isadora
-wimmer
-intermountain
-melisende
-henschel
-h.l.
-bulmer
-peto
-lavelle
-infinite-dimensional
-drug-induced
-kohanim
-adat
-koro
-superpowered
-dc-8
-outflank
-thunderous
-troposphere
-co-leader
-1122
-udt
-chirality
-theorizing
-zong
-drakensberg
-triode
-sahaba
-1670s
-h.j.
-qmjhl
-off-label
-838
-lyoko
-tilling
-effortless
-wiaa
-naru
-presaged
-mef
-zapp
-midgets
-centroid
-cahokia
-globemaster
-prout
-two-phase
-fogel
-strucker
-archivists
-postgraduates
-971
-4-12
-paramagnetic
-tendrils
-kingwood
-misquoted
-37-yard
-conniving
-satirised
-obtuse
-celiac
-all-india
-prioritize
-pininfarina
-walkley
-laurentiis
-armors
-expunged
-cantopop
-incredibles
-liberal-conservative
-quire
-fmc
-300m
-ivs
-unsavory
-westley
-debenhams
-airedale
-launchpad
-louis-philippe
-scolopacidae
-1141
-16:00
-ruthie
-timex
-cassiopeia
-dello
-riven
-nucleons
-post-grunge
-demigods
-calculi
-bjorn
-misogynistic
-frink
-youth-oriented
-u20
-theorised
-ingushetia
-wenham
-celle
-nogai
-upfield
-baw
-fante
-munk
-hdd
-emlyn
-kmb
-101.6
-cull
-reassessment
-drogo
-obscures
-stonebridge
-wintergreen
-al-rahman
-manowar
-doonesbury
-herbst
-neverending
-ellet
-fee-paying
-hashanah
-kivu
-radhakrishnan
-oghuz
-eilean
-morelos
-zululand
-pantograph
-helensburgh
-wps
-buyouts
-sandbach
-42.8
-palance
-elyria
-bodkin
-lauro
-6.50
-921
-malek
-mainichi
-six-issue
-besiegers
-mallards
-f-5
-1008
-schon
-ukiah
-steeler
-springville
-hopkirk
-starjammers
-adepts
-maputo
-yuka
-broun
-wiretaps
-keelung
-bonito
-murtagh
-star-studded
-lebaron
-d'artagnan
-saluted
-tonopah
-haftarah
-aglaia
-gib
-1251
-rov
-macbride
-kleiner
-may/june
-sak
-grayscale
-unfiltered
-critters
-arkady
-elitism
-1195
-rationalized
-hopton
-nuri
-autopsies
-avebury
-turkana
-lorelai
-1123
-hurlers
-ceti
-optimality
-linoleum
-7.60
-niemeyer
-belemnites
-trimet
-unleavened
-panchen
-hovind
-dreamcoat
-branham
-1269
-adt
-ybor
-rossa
-56.3
-whirl
-enki
-7.80
-3800
-papilio
-spaceman
-phalaropes
-sikes
-zakaria
-ergative
-aq
-flamethrowers
-spilt
-serangoon
-subsist
-mola
-servile
-modernising
-dimming
-cardiothoracic
-zeebrugge
-scrip
-six-time
-codify
-confiscating
-lucena
-i-65
-howson
-montecito
-stranding
-menezes
-fenians
-tobe
-haripur
-prebend
-stephenville
-abiotic
-counter-attacked
-sittingbourne
-thumbnail
-1358
-leeuwarden
-sheepshead
-collegian
-orta
-dizzee
-burne-jones
-savarkar
-poa
-interconnects
-kilobytes
-bds
-absent-minded
-pronouncement
-pubmed
-unitarians
-easington
-johar
-fulling
-yhwh
-oval-shaped
-mig-29
-freeborn
-deshmukh
-saraiki
-bod
-conover
-unhelpful
-holcombe
-chishti
-andie
-listenership
-hubli
-renfro
-uris
-mehra
-disadvantageous
-tibial
-wsb
-87.0
-1411
-carloman
-horizonte
-sextant
-middleman
-tightness
-chambliss
-braose
-deventer
-mifune
-tanana
-rubella
-air-raid
-australopithecus
-risque
-manitoulin
-pathfinders
-nak
-lantana
-abducting
-peritoneum
-tss
-digests
-igcse
-hanbury
-mannequins
-plaguing
-857
-heady
-jobber
-boxy
-aetna
-tuscarawas
-spg
-santosh
-ordinations
-northbridge
-nen
-lackland
-rediscover
-configuring
-plows
-u-16
-interprovincial
-hyphae
-debby
-undersized
-sturrock
-weblogs
-gamebook
-red-brick
-terek
-mid-ocean
-moscoe
-varian
-undisciplined
-bsb
-aronson
-burra
-engulf
-tweak
-wenders
-cauvery
-safeguarded
-summoner
-937
-hymenoptera
-wordless
-1426
-vitter
-alleluia
-borman
-coho
-reconnected
-tdp
-principate
-antares
-moluccas
-newfield
-sinn
-rummy
-mid-western
-frequenting
-godin
-snubbed
-shinigami
-waqf
-doral
-microbiological
-accelerations
-burgher
-provable
-self-governance
-stile
-discernment
-presupposes
-snorri
-saris
-giffard
-semmes
-resurfaces
-dijkstra
-keiichi
-wallets
-mile-long
-kravis
-bannock
-untouchability
-downtempo
-hypothesize
-walkout
-unsophisticated
-ruminants
-diacritic
-e8
-potosi
-glycemic
-773
-metacarpal
-extolling
-teotihuacan
-highbridge
-katerina
-sembawang
-alamance
-macdonalds
-dmca
-romancing
-baffle
-huh
-parkers
-monroeville
-mantell
-siebel
-dairying
-televise
-strongbow
-frosts
-bifida
-deckard
-wickliffe
-eucalypt
-tormenting
-saplings
-cua
-symon
-vasopressin
-megamix
-snrnas
-nankai
-shabazz
-javits
-ayu
-akasha
-burdick
-270,000
-agudat
-ebola
-1357
-debartolo
-sixty-eighth
-thundercracker
-kickbacks
-t-2
-ansbach
-c-47s
-dusted
-november-december
-gabler
-customizing
-brockway
-butane
-shibata
-self-expression
-46.4
-libertines
-newsome
-bobbin
-agi
-extraterritorial
-memorizing
-mauve
-ayman
-exemplars
-kapil
-jannetty
-trailblazer
-glenrothes
-kirshner
-elric
-shillong
-bobble
-anglo-scottish
-foreclosed
-neuralgia
-hearths
-srirangam
-argumentative
-lindemann
-sambora
-massinger
-7.90
-photoshoot
-799
-aids-related
-snatching
-supercomputing
-chlamydia
-lemming
-nog
-seraikis
-wolfmother
-58.3
-matting
-kup
-paratroops
-geocentric
-gml
-joannes
-mire
-jamiroquai
-hartwig
-orrell
-woodcocks
-shockingly
-backline
-poise
-confederated
-hybridisation
-hauls
-slappy
-kiko
-hird
-spry
-oerlikon
-whitstable
-volunteerism
-singlehandedly
-unaccounted
-absolutist
-pmi
-csv
-samael
-cracow
-infusions
-vidyasagar
-laundromat
-skimmer
-dropouts
-spivak
-khoury
-generalist
-ucb
-+0
-anti-gravity
-lox
-schismatic
-fritillary
-couric
-dra
-hyperbola
-priyanka
-clings
-crispus
-koya
-stigmatized
-azrael
-wart
-fortean
-michiel
-rlc
-verner
-todmorden
-fiestas
-archenemy
-westheimer
-50.3
-fail-safe
-963
-charadriidae
-wfp
-cytosolic
-priddy
-zulfikar
-callback
-cbeebies
-spr
-dinkins
-8.70
-2006-present
-gelfand
-calabasas
-103.2
-nrg
-trotskyism
-regrouping
-burdensome
-alena
-hilltops
-braemar
-cunliffe
-pecuniary
-whisked
-lumberton
-persimmon
-intron
-formalization
-naves
-carthusian
-2.12
-morrisville
-chaffin
-takuma
-1367
-impossibly
-adsorbed
-revie
-wooed
-handlebar
-zug
-fistful
-wisner
-kathak
-rete
-zodiacal
-hydrofoil
-revoking
-madero
-knowledge-based
-juju
-abducts
-cromartie
-debi
-pencilled
-chester-le-street
-greenbush
-bluffton
-tiga
-canticle
-mehmood
-carb
-fishguard
-sawa
-prins
-fifty-eighth
-polities
-glides
-stena
-struve
-counterexample
-walser
-heaviside
-bioware
-levene
-laboured
-granholm
-maypole
-madelyne
-nilo-saharan
-popolo
-inbred
-purifiers
-permaculture
-vivacious
-tractable
-pinkie
-stinking
-fairyland
-campanian
-phys
-cubitt
-torments
-gradation
-mithila
-miscegenation
-botev
-suntory
-hirundinidae
-pulps
-ekman
-1107
-mitterrand
-registrants
-corba
-stauffenberg
-essences
-jansson
-derringer
-738
-gyroscopes
-flemings
-barrages
-culkin
-annul
-ridgely
-embezzling
-f.e.a.r.
-zenobia
-imputed
-epi
-spv
-icebreakers
-hira
-harpur
-purnell
-suppressive
-re-runs
-producer/director
-922
-syntheses
-kosi
-bewdley
-koehler
-xc
-virology
-shwe
-bajoran
-fraunhofer
-eastham
-nonsectarian
-nitrocellulose
-azar
-margolis
-tovey
-supposes
-killswitch
-bueno
-non-cooperation
-agitator
-kiarostami
-bsn
-elegies
-wahab
-acrimony
-emanate
-raina
-stout-bodied
-vcu
-1974-1976
-marginalization
-olf
-bynum
-mckeesport
-24.00
-expertly
-mujer
-nucleosynthesis
-talc
-manufactory
-quarterdeck
-anoxic
-unreadable
-3:2
-wheatbelt
-trois
-istituto
-taieri
-assen
-twu
-namgyal
-pre-transition
-afl-nfl
-51.1
-abhimanyu
-haj
-transcaucasia
-adjudicate
-aldred
-bicentenary
-sayaka
-moravians
-manoa
-xbmc
-colombians
-amina
-kawartha
-mangano
-beachcomber
-rondeau
-kildonan
-shiba
-candu
-turboprops
-laycock
-paddies
-akagi
-footballs
-terrebonne
-haye
-fifty-eight
-axa
-newspaperman
-hoag
-lanyard
-powerlifting
-multitudes
-pud
-isl
-incarnated
-bohdan
-cacapon
-ens
-10.40
-greek-speaking
-ramiro
-nuthatch
-privateering
-paprika
-byram
-flatland
-holdout
-revolutionize
-aeration
-buffyverse
-lookalike
-gouverneur
-herzliya
-e.a.
-channelled
-lipase
-waitrose
-uninhibited
-freightliner
-nondescript
-reintroducing
-bagot
-two-member
-cesena
-dahmer
-cronulla-sutherland
-quebecers
-electropop
-comers
-rivadavia
-gigas
-unpredictability
-standards-based
-livres
-arvin
-cosell
-lazuli
-ephedrine
-denouement
-kuma
-samanid
-campese
-hoppy
-helles
-arndt
-impetuous
-fula
-ironton
-busoni
-946
-unanimity
-sou
-layfield
-unsightly
-commonly-used
-zinta
-patristic
-hurls
-ipods
-cardiopulmonary
-1393
-cat-like
-sino-indian
-compressible
-my-otome
-aird
-firmus
-suffocate
-yamhill
-hirsh
-vogler
-generalisation
-hilversum
-nikolayevich
-reichswehr
-prd
-lyapunov
-reintegration
-kolyma
-bandurist
-demonology
-falsifying
-aab
-athelstan
-tuskers
-bourchier
-stiglitz
-rishis
-pradeep
-arrakis
-bhd.
-defensor
-preemption
-54.8
-hiragana
-edwardsville
-exacerbating
-ndc
-mid-life
-sarmatians
-famas
-rubra
-bramwell
-percutaneous
-off-stage
-pusey
-mid-2003
-tancredo
-senanayake
-ashkenazic
-pisan
-scuttling
-heeded
-pitiful
-dunsmuir
-disposes
-rie
-risley
-51.9
-elseworlds
-whitgift
-francophonie
-hedda
-straying
-cubist
-galeazzo
-dunder
-onn
-aske
-sls
-vis-a-vis
-invoices
-housemaster
-gulshan
-whyalla
-yamin
-shirk
-yehudi
-xxix
-secundus
-voa
-repaint
-responsibly
-abscesses
-jhang
-tarski
-boethius
-co-ops
-erinaceomorpha
-lemnos
-ynez
-suebi
-dalymount
-kaga
-turok
-ico
-flip-flop
-hydrography
-highest-selling
-venis
-1087
-feudatories
-overestimated
-maytag
-undercurrent
-sustainment
-beauvoir
-chicopee
-nilotic
-manifestos
-aranda
-ping-pong
-1,350
-inconsequential
-orf
-jackrabbit
-endocarditis
-zircon
-aching
-licorice
-self-aware
-aggregations
-k-rock
-rebooted
-kalevala
-golkar
-warez
-vesey
-masamune
-universalists
-neologisms
-querying
-lowrie
-aggravate
-scintillation
-petworth
-kalki
-360-degree
-mig-15
-lightening
-jessore
-strozzi
-neha
-subcategories
-1999-00
-luttrell
-cyclades
-disintegrates
-e.b.
-kronecker
-malayo-polynesian
-tyrion
-deepdale
-reminiscence
-takei
-cabrillo
-fazio
-746
-749
-third-placed
-infest
-43.9
-downland
-poudre
-rattles
-mix-up
-landholdings
-gmp
-njpw
-chernigov
-trimmer
-tcf
-thrombin
-filenames
-bx
-sondhi
-chinese-american
-washrooms
-6-year
-tucci
-unearth
-molinari
-gove
-farhad
-pathans
-jagdgeschwader
-redesignation
-btec
-wuppertal
-rell
-reassemble
-incongruous
-mahavishnu
-tamalpais
-easy-going
-mongkut
-etowah
-clairvaux
-sit-ins
-trivially
-loansharking
-tachometer
-florins
-cavallo
-orange-red
-vorticity
-rnase
-kaw
-maysville
-sholom
-ex-convict
-blue-grey
-normalizing
-overhand
-mosh
-kampen
-place-name
-commandery
-warranties
-ravenscroft
-aldermaston
-wheelie
-tots
-acrobats
-servius
-refutes
-rerouting
-parallelogram
-subcarrier
-intramuscular
-microtubule
-x-rated
-lafarge
-venango
-roadkill
-aylwin
-torry
-codice_15
-falconidae
-back-and-forth
-plautus
-mortlake
-zahra
-mediumwave
-2-8-0
-barua
-subramaniam
-fid
-i-25
-swept-back
-sani
-cormack
-hoorn
-homeboys
-anhydrides
-hegemonic
-vcrs
-urmia
-gev
-1437
-precipice
-rikishi
-calligrapher
-nicotinamide
-goleta
-chandan
-lcds
-discordant
-muawiyah
-in-law
-serampore
-sentimentality
-grandfathers
-screamers
-etfs
-swatch
-dordogne
-neoplatonism
-medevac
-schramm
-koroma
-s6
-s5
-misogyny
-freising
-glbt
-kelton
-naresh
-mullan
-structuralism
-arbitrators
-beyblade
-a30
-lilienthal
-danton
-ticketmaster
-interleaved
-january-february
-horacio
-rtm
-kefauver
-bascom
-reinstalled
-symbolics
-lares
-lamberton
-masques
-sascha
-longbridge
-mudra
-puran
-2cd
-m113
-ranfurly
-rittenhouse
-krall
-manstein
-simran
-malappuram
-ellipses
-anglo-boer
-branden
-combed
-cosmas
-0-10
-wolters
-crouse
-unmolested
-camilleri
-twenty-year
-cee
-classifiers
-knelt
-heerlen
-tejas
-weei
-parkville
-minke
-toomey
-sheffer
-clairvoyance
-lawry
-bh
-ben-hur
-dowding
-tikkun
-chok
-redundancies
-adjournment
-aiba
-pantages
-tailwheel
-in-a-row
-brotherhoods
-reenter
-jocasta
-calligraphic
-124th
-archimandrite
-unanticipated
-non-compliance
-ucs
-henrique
-parrett
-bioengineering
-synge
-unraveling
-mason-dixon
-linford
-bhagavan
-trekkers
-crescent-shaped
-orchestre
-bootleggers
-russa
-ianto
-anti-colonial
-oglala
-cisticolidae
-stewed
-all-union
-middlebrook
-asclepius
-ilyich
-shinya
-knebworth
-creator-owned
-leasehold
-cannae
-contemporaneously
-bellum
-scrapers
-realign
-billerica
-mithras
-avonmouth
-quranic
-dindigul
-kabaka
-savoury
-insectoid
-ouija
-devastate
-puyi
-scoured
-aemilia
-roleplay
-frum
-leacock
-mixtec
-harsha
-rodd
-drawn-out
-digitization
-souther
-lasseter
-diol
-day-lewis
-ikon
-roop
-lapham
-zou
-freelancing
-cut-out
-1182
-iep
-winx
-maclachlan
-vaal
-kahlan
-orchestrator
-769
-mosher
-naboo
-panacea
-tarsier
-backwoods
-tonsure
-interbreed
-intuitionistic
-testable
-realizations
-extra-terrestrial
-scavenge
-lycia
-stoichiometric
-daschle
-cissy
-roosts
-1088
-alcan
-1407
-fusions
-gorey
-stubs
-muslin
-delaunay
-stirrups
-sharecroppers
-sofer
-saadia
-782
-face-down
-habitations
-fuelling
-mid-70s
-oligarchs
-palembang
-gwydion
-radish
-leadbeater
-wisest
-usu
-prioritized
-spandex
-ots
-s-type
-shankaracharya
-evaporator
-87.1
-crystallize
-shafter
-rajneesh
-fathoms
-olmstead
-flaring
-timurid
-macgowan
-onus
-rohtak
-v-twin
-familiarize
-jawa
-diametrically
-43.7
-86.8
-four-game
-948
-sfo
-tomo
-koei
-whampoa
-moviegoers
-stuckist
-haar
-inflectional
-dobbins
-sailplane
-ipp
-pre-rendered
-skydome
-nebo
-folktale
-monotype
-calabash
-wont
-yume
-cdna
-4.50
-hn
-unsportsmanlike
-zain
-kingmaker
-48.9
-interrogates
-tykes
-cbm
-arcuate
-aacsb
-brockley
-merah
-double-edged
-moni
-artis
-mersenne
-buhl
-tuva
-9-2
-chiaroscuro
-sing-along
-44.0
-basilio
-stone-built
-audio/video
-58,000
-debunking
-grandparent
-sitwell
-inayat
-flowery
-horse-racing
-lifters
-shotts
-polyphyletic
-85.5
-maar
-kawada
-mcginty
-alitalia
-1287
-cocoons
-d.c.-based
-vanua
-kantian
-lilli
-sphagnum
-buna
-supp
-alwar
-102.4
-neurodegenerative
-lanyon
-underscores
-gutters
-clansmen
-spectrometers
-romita
-villarreal
-dragster
-housework
-mockingly
-elli
-bso
-negroponte
-nasals
-struthers
-goaded
-serjeant
-avco
-delius
-rha
-magnetron
-minimising
-flensburg
-wildc
-superhighway
-1949-50
-second-placed
-eine
-joggers
-helmer
-acculturation
-pritchett
-griese
-ines
-codons
-begotten
-blaney
-geopolitics
-punchbowl
-collation
-phillippe
-wilmslow
-non-combatants
-shreve
-scrupulous
-generalissimo
-d'annunzio
-svd
-g-force
-doused
-akins
-sviatoslav
-joaquim
-solheim
-lusty
-drunkenly
-goad
-beekman
-60-minute
-mada
-kakatiya
-mcgreevey
-100.4
-synchronisation
-manda
-nusrat
-annibale
-tutsis
-frankincense
-musso
-fujifilm
-hotaru
-maximization
-leftwich
-sub-label
-poverty-stricken
-jamshedpur
-antihistamines
-sinkhole
-101.8
-skerries
-digraph
-unchained
-ragga
-798
-holroyd
-muti
-swarmed
-dorcas
-blanking
-assemblers
-jagir
-yuichi
-ramada
-multi-part
-rasch
-negi
-boba
-helio
-spano
-areca
-eight-ball
-well-behaved
-longterm
-navan
-scabbard
-weitz
-lead-acid
-fifteen-year-old
-menotti
-wnyw
-forecasters
-ledbury
-malus
-supervalu
-congregants
-platonism
-oxygenation
-hinkley
-cochabamba
-jatt
-benbow
-thrombocytopenia
-gavel
-hamersley
-x5
-kotaro
-boykin
-radner
-schweppes
-crp
-ovoviviparous
-2-lane
-1384
-mesoscale
-schoolers
-harryhausen
-oblates
-trentino
-musicologists
-ayckbourn
-webzine
-depress
-39,375
-juli
-waterborne
-sakya
-goodridge
-kuantan
-puisne
-homburg
-colmes
-prostatic
-thal
-testator
-genting
-saic
-manoeuvring
-relished
-cephalic
-papi
-talos
-presidencies
-circumpolar
-workday
-scorpius
-thomaston
-kilotons
-gleefully
-zainab
-cfd
-outlier
-duala
-machu
-faxes
-undecidable
-frc
-oso
-giambattista
-jagan
-8.90
-stationing
-tolman
-uppingham
-zakir
-teamster
-patois
-yoakam
-nusa
-pani
-lymphomas
-ajahn
-enthronement
-jeffs
-kodagu
-prided
-vfl/afl
-diallo
-deflecting
-mozarabic
-schizoid
-reactance
-proportioned
-ditto
-buscemi
-tweaking
-apses
-outwit
-burkett
-self-guided
-troika
-deliberated
-aborigine
-crash-landed
-turbografx-16
-isolationism
-day-long
-alvis
-canaanites
-furore
-933
-rubrics
-watercress
-wealden
-thammarat
-pawlenty
-yngwie
-guidon
-8.60
-one-stop
-audie
-dwm
-british-american
-superiore
-quips
-forgettable
-moller
-degas
-bts
-mego
-zither
-alberton
-fenchurch
-founds
-mctaggart
-allosteric
-oundle
-fangoria
-seaborg
-herculaneum
-tottori
-bidar
-cradley
-underwritten
-pathologic
-86.3
-dissolute
-bearden
-vaulter
-orem
-steffen
-emiliano
-shinobu
-magnetically
-affinis
-al-gaddafi
-decades-long
-jumpin
-victorville
-soft-spoken
-ammonoid
-samarra
-shipton
-14-15
-court-ordered
-palanca
-piso
-ellerslie
-fuckin
-whitening
-re-introduction
-sleepwalking
-13-15
-inauspicious
-bookmaking
-reductionism
-shafi
-wirtz
-360,000
-janney
-ooty
-governors-general
-lastman
-screamer
-rabindra
-bice
-rosamond
-banishing
-rouvas
-ferrie
-bina
-jabbar
-regicide
-leninist
-852
-webpages
-bellflower
-dunster
-dannii
-microstructure
-gacy
-toyoda
-haut
-hendersonville
-computer-assisted
-hertzberg
-centralize
-hoch
-enders
-undercoat
-x-axis
-r.h.
-bere
-clerkship
-tex.
-azzam
-saini
-1379
-raff
-reverie
-shahbaz
-gojoseon
-ghs
-matroid
-sys
-lube
-mollari
-officeholders
-dnr
-iberians
-retrospectives
-baldry
-wheelbarrow
-brats
-foliot
-moccasin
-footlights
-eriksen
-cecelia
-geochemical
-kobol
-nazi-occupied
-ausf
-vajra
-seabury
-large-format
-intermolecular
-antihistamine
-xliii
-pursuivant
-non-chinese
-1366
-cumnock
-al-maliki
-hypercard
-dushanbe
-schwann
-isb
-power-hungry
-fessenden
-glenville
-1000th
-docomo
-comox
-viswanathan
-diluting
-lintels
-half-siblings
-abdali
-smits
-nast
-tsukuba
-padi
-algerians
-sagara
-ortho
-vorlons
-9.70
-tuticorin
-myopia
-ecco
-hoar
-paltry
-september-october
-castelnuovo
-geotechnical
-three-minute
-iim
-cornucopia
-unisex
-inventiveness
-tellurium
-castra
-telos
-xen
-aquitania
-sicilia
-vasant
-1977-1988
-burnell
-icse
-monoidal
-hsien
-halberstadt
-garand
-protestations
-placerville
-ahmanson
-mgb
-malahide
-arrernte
-glarus
-coughs
-fairings
-2002/2003
-capitalised
-musketeer
-brockport
-b-boy
-kaneko
-multi-volume
-1176
-chittoor
-anti-clockwise
-manipuri
-foreheads
-battersby
-popham
-marad
-eons
-isao
-aller
-giannini
-hard-core
-941
-flanaess
-hortense
-spiky
-calabrese
-62,000
-earley
-ds9
-craniofacial
-masud
-moorlands
-megas
-ther
-cancri
-limavady
-single-engined
-sedaka
-waldeck
-lafleur
-choline
-unknowing
-escorial
-injectable
-photoreceptors
-9.40
-serres
-sealy
-fenimore
-attleboro
-45.8
-rosina
-cinhil
-uhuru
-rearrangements
-scoliosis
-uracil
-sorin
-zm
-anstruther
-quip
-applets
-pendants
-madder
-excruciating
-40,625
-kovel
-eyeballs
-nidal
-ltv
-vollmer
-alcuin
-ibelin
-arching
-separators
-ayre
-amputee
-mausoleums
-kord
-ifpi
-surfin
-dharmendra
-sinkholes
-forthwith
-foreign-language
-grauman
-lawndale
-stannard
-mauer
-596
-maslow
-tapa
-sattar
-soke
-syntactically
-cheyne
-881
-shar
-quatro
-musicality
-pinsky
-hipster
-airbases
-knowland
-nadar
-alexi
-tibbs
-manistee
-kabc
-doab
-afridi
-bunnell
-mambazo
-bysshe
-factually
-wartburg
-supine
-frelinghuysen
-fixated
-pata
-liberman
-pacha
-waterproofing
-manningham
-lowes
-rothmans
-beltran
-obelisks
-747-400
-ministership
-mechagodzilla
-pg&e
-substituent
-uber
-tsars
-boni
-beermen
-tonks
-mammootty
-schaffhausen
-savagery
-intercooler
-multi-millionaire
-addressable
-goldfield
-brawley
-robie
-post-communist
-reus
-83.8
-mudstone
-maeby
-burnsville
-bucolic
-sandie
-tewodros
-lysenko
-diffuses
-s.l.
-teleological
-1996/97
-vulgarity
-rippon
-hither
-omniscience
-aquarists
-memon
-lauzon
-unwed
-sydney-based
-unimpeded
-upshur
-s-1
-noland
-interviewee
-appia
-chaz
-adcock
-geodesy
-rabe
-dumpling
-divergences
-lautoka
-lynott
-bobs
-southeastward
-lechmere
-evaporating
-formula_67
-flavorings
-deimos
-bonney
-wangaratta
-icebergs
-gaozong
-clusiaceae
-crumbles
-pierrepont
-june-july
-unctad
-856
-dy
-drop-off
-10-inch
-handloom
-normed
-bulkeley
-venter
-menahem
-veranda
-d-generation
-thickest
-high-flying
-rutaceae
-ugliest
-clos
-tes
-45.6
-uh-1
-summerville
-esrb
-trc
-velikovsky
-rummel
-1383
-enslaving
-greenford
-half-lives
-hypertensive
-quitman
-antisymmetric
-tuffy
-54.1
-kenpo
-nieuw
-1.15
-mashup
-sicilians
-nena
-buddah
-die-hard
-85.7
-sputum
-basinger
-utr
-monastics
-tigra
-taber
-thiru
-toccata
-21-yard
-causey
-hayao
-portola
-benefitting
-ael
-cmf
-thermometers
-claris
-maryknoll
-diop
-open-wheel
-biophysical
-mem
-fracas
-cloyne
-uncritical
-polynomial-time
-emaciated
-tolbert
-1433
-tithing
-burridge
-debye
-long-form
-seraph
-risers
-enlivened
-subalgebra
-guanajuato
-woodcraft
-slurred
-thamnophilidae
-delphic
-stadtbahn
-endosperm
-12-yard
-reasserted
-clemenceau
-kyw-tv
-sybase
-lanegan
-amro
-endearment
-lemuria
-tunable
-atreyu
-aficionado
-porirua
-hassell
-gehenna
-ulrika
-3.36
-reconfigurable
-kadapa
-potting
-bahasa
-midbrain
-alberts
-mcdaniels
-kaurava
-penwith
-c.v.
-smokescreen
-bardhaman
-coterie
-wilsonville
-unintelligent
-life-forms
-janson
-f-105
-afro-cuban
-cardiacs
-pranksters
-83.3
-ascalon
-ichikawa
-theodoros
-seabiscuit
-oto
-dialup
-lundberg
-orden
-boogeyman
-re-enacted
-odorless
-hayle
-abercromby
-0.09
-pre-set
-pco
-chine
-crooner
-drover
-tillie
-noarlunga
-trackway
-whooping
--12
-braidwood
-mineralization
-shakhtar
-bijou
-chittor
-kerwin
-songhai
-58.5
-sioned
-self-replicating
-embezzled
-kttv
-sfu
-holmgren
-disorganised
-ahmose
-hss
-top-scoring
-flatt
-gentlemanly
-dse
-himes
-euro-american
-manioc
-imhotep
-valero
-henna
-alleyway
-shema
-grable
-star-shaped
-trebek
-rapunzel
-schurz
-crips
-bookshelf
-estrogens
-grodd
-reemerged
-uninspired
-downers
-legalisation
-interventionist
-eyelashes
-americus
-mochizuki
-isamu
-hopea
-zygomatic
-antislavery
-joint-stock
-houlihan
-tassels
-31-yard
-dinka
-wessel
-kyokushin
-loh
-lacklustre
-vespa
-viviparous
-anti-ballistic
-vacating
-caproni
-fallin
-poetess
-dallas/fort
-philadelphia-based
-gallardo
-protists
-terminators
-50.1
-pineal
-updraft
-sono
-mtb
-antbird
-uvular
-claymores
-hypothesised
-alexandrov
-center-right
-cytosine
-anglo-chinese
-all-russian
-fiddling
-heifer
-radiographic
-deauville
-overwork
-yeshivat
-hetherington
-detox
-uninformed
-skoda
-lieut.
-khost
-multimodal
-iigs
-congratulating
-gigabytes
-infraorder
-tremble
-praia
-be7
-1978-1979
-ssri
-cockle
-h5
-romanies
-k.k.
-arendt
-bron
-bowditch
-nansemond
-dendrobatidae
-valign
-dewhurst
-benatar
-tapeworm
-115,000
-six-speed
-tsetse
-eclecticism
-vodou
-samy
-newshour
-koa
-fifty-seventh
-sanya
-chart-topper
-labials
-boothe
-long-run
-microbe
-candor
-jaap
-frolic
-depositions
-wigwam
-tur
-quelling
-huss
-wimp
-brazoria
-joann
-volos
-off-network
-xeric
-muniz
-jimma
-bamberger
-vrije
-theophrastus
-purkinje
-spotless
-steinem
-amb
-1113
-dainty
-2-cd
-shue
-holzman
-neo-nazis
-unobtrusive
-conjunctivitis
-typhon
-agonizing
-tachibana
-samut
-buckling
-annihilator
-karmic
-colloquium
-tweet
-duck-like
-mutating
-medill
-h.b.
-countably
-single-party
-shirazi
-mensheviks
-kirtley
-edgardo
-kogarah
-58.8
-nontraditional
-k.d.
-promulgate
-evensong
-bullocks
-market-based
-lapped
-gildersleeve
-klee
-risorgimento
-unceremoniously
-segura
-formula_69
-enlighten
-undoubted
-ex-servicemen
-essequibo
-breakdancing
-estrildid
-adaptions
-conceives
-meola
-topside
-phagocytosis
-navarra
-dartington
-punted
-intubation
-medium-range
-tarentum
-jalgaon
-haywards
-eze
-curtailing
-postgame
-bookie
-interleukin
-ervine
-numerology
-big-name
-trans-mississippi
-gilley
-mesenteric
-shinichi
-yaqui
-germano
-prefaces
-zambrano
-myristicaceae
-southwesterly
-drs
-protein-coupled
-leckie
-leche
-telco
-asgardian
-brinsley
-abay
-wfc
-generis
-recension
-boudreau
-banting
-acetaminophen
-forsake
-ultron
-rabinowitz
-trampling
-hemodialysis
-merz
-educationally
-localisation
-xe
-mycroft
-garrity
-mandelbrot
-sats
-sdr
-brighouse
-mccown
-vampirism
-sonthi
-maximally
-mescaline
-brydon
-almirante
-thornley
-rinaldi
-proc
-birdy
-fourteen-year-old
-hilfiger
-publicly-traded
-spokespersons
-abuser
-kennewick
-yadavas
-ctrl
-upliftment
-zaynab
-telic
-donoghue
-nouri
-qiao
-kurupt
-56.5
-limbic
-spoilage
-co-owners
-suk
-antigenic
-cellulosic
-mottram
-overshadow
-sheathed
-broz
-unicellular
-straight-forward
-darna
-garnish
-lithographic
-vinifera
-0-5
-faeries
-sibu
-workgroup
-crayola
-buckled
-nicotinic
-godspeed
-p-3
-hammurabi
-cantona
-huq
-allgemeine
-appa
-gagnon
-bowring
-angra
-luxembourgish
-suffocating
-tanneries
-cholinergic
-alboreto
-sans-serif
-cag
-,1
-unreasonably
-emmitt
-tempering
-elahi
-talas
-quinto
-defra
-jaycees
-52.3
-sherburne
-porcupines
-antisense
-erdington
-'08
-embellishment
-transients
-m20
-honinbo
-eatery
-btc
-non-local
-haroon
-57,000
-eretria
-websphere
-jumanji
-watchung
-2000/2001
-bushels
-beachside
-maces
-wildhearts
-homebase
-wide-angle
-roud
-lorikeet
-divina
-bhadra
-lally
-hibernians
-northolt
-1/5
-objectivism
-babri
-regrow
-costanzo
-delong
-eider
-recalcitrant
-jamuna
-braham
-1188
-entebbe
-warplanes
-kerch
-anti-union
-rosenblum
-chalcolithic
-berner
-leftmost
-fyfe
-jara
-maniacal
-854
-1467
-geert
-anant
-14-10
-shimoga
-routh
-jesper
-tayside
-o'halloran
-energia
-logitech
-archrival
-scullin
-o'riordan
-unelected
-kray
-aarne-thompson
-rookwood
-counter-culture
-vss
-agreed-upon
-swissair
-smelly
-bridgman
-namespaces
-hp-ux
-fsm
-caldas
-sedated
-bosphorus
-hippocratic
-roxana
-soviet-era
-pinnate
-opryland
-cumans
-fluoxetine
-topalov
-bayezid
-wallenstein
-bulldozed
-squats
-elmendorf
-mudge
-innocently
-xn
-1011
-hyperthermia
-kollel
-juggle
-ashkenazim
-ecclesiae
-shaven
-aristides
-malted
-m50
-22:00
-1373
-moms
-belden
-odinga
-physicalism
-two-car
-voorhis
-sp1
-adventism
-keeble
-polysaccharide
-fledge
-complementarity
-tennessean
-dunphy
-pre-primary
-morelli
-blunted
-solvency
-balad
-g.p.
-2gb
-o'driscoll
-all-acc
-non-emergency
-imamura
-oilfields
-16,500
-balder
-adornment
-blackthorne
-greensand
-tingling
-44.9
-callow
-asus
-tleilaxu
-hinkle
-c8
-hartz
-desjardins
-drewry
-alliterative
-endor
-scipione
-shifters
-cascadia
-dutiful
-8.80
-lobotomy
-bharathi
-sog
-ravenloft
-zelaya
-autochthonous
-ackland
-immersing
-scorching
-818
-chana
-ainge
-1200s
-mino
-shivering
-supposing
-tajiri
-ramah
-rapidkl
-orm
-sherrod
-elayne
-bestiality
-waimea
-msiri
-radio-canada
-rogaland
-supplication
-pleven
-pre-reformation
-madi
-alcor
-bankruptcies
-lb&scr
-gilad
-inconsistently
-phoney
-phill
-man-eating
-ethane
-birdwood
-tuan
-exponentiation
-pokey
-ssm
-deschutes
-murtaugh
-5.50
-setae
-otsuka
-camorra
-arce
-silversmith
-velvety
-chapple
-girton
-bonavista
-spann
-barbel
-naturalis
-dawned
-gunsmith
-carnoustie
-sonning
-printable
-jellies
-diced
-gleneagles
-rogerson
-bryansk
-vesting
-conjuring
-1992-1994
-naturally-occurring
-dignities
-glycosides
-recompense
-imago
-859
-numidian
-unencrypted
-murnau
-masturbating
-struct
-nvc
-joram
-granaries
-shemp
-akkad
-gemara
-chapultepec
-christo
-flavin
-fatehpur
-kerri
-1004
-six-sided
-desiree
-fibrin
-icicle
-sanguine
-ellimist
-imperialists
-maruyama
-testaverde
-finality
-aggregating
-cyrix
-challis
-tacos
-hybridized
-squawk
-fishkill
-vanden
-norge
-flodden
-patronizing
-50.9
-clitoral
-frontera
-davis-monthan
-pero
-free-fall
-blotting
-belper
-allardyce
-microeconomics
-coloratura
-topically
-maritima
-aitkin
-bluefin
-al-bashir
-iwerks
-low-altitude
-freelanced
-mugging
-indenture
-campbeltown
-stritch
-komo
-safes
-rebuffs
-colan
-mcquaid
-spaceflights
-garston
-aci
-simo
-strumming
-followings
-vetting
-ase
-upu
-reconquer
-honeymooners
-reinvention
-confections
-lcr
-supercell
-earlham
-peels
-sharples
-termen
-zippo
-697
-offloading
-caboolture
-gambian
-thump
-manolo
-carmona
-notitia
-guernica
-plucky
-christus
-plies
-collegio
-skywarp
-fighter-interceptor
-mads
-rebutted
-succinct
-pastoralist
-two-digit
-hellraiser
-duckett
-arrowsmith
-zemin
-adult-oriented
-motorboat
-no-hitters
-creech
-ill-equipped
-halos
-calton
-hdi
-balti
-iwata
-iroc
-logue
-p5
-stadt
-exclusives
-actus
-icefield
-three-tier
-freberg
-jaa
-initiators
-wep
-actinides
-ansonia
-sassafras
-muchalls
-shrouds
-alleyne
-epigram
-yugo
-yazd
-motherly
-kaeo
-942
-sumerians
-rosey
-flac
-climaxed
-shorn
-face-off
-49.8
-oun
-danone
-zemlya
-jewellers
-malady
-jenkinson
-duntroon
-peavey
-daniell
-1169
-pah
-cap'n
-wormholes
-jeune
-graziani
-airtel
-shortline
-middlemen
-homolog
-behe
-ampeg
-1.65
-janes
-lymon
-abuna
-fester
-2000-2005
-cabinda
-soria
-catesby
-androgens
-eaa
-doa
-filk
-oram
-doheny
-54.7
-awkwardness
-rhinitis
-toshinden
-3-inch
-bohlen
-sadducees
-moskowitz
-frohman
-puno
-gaspard
-sandbags
-abdi
-certificated
-commas
-syllogism
-lockett
-repudiate
-westview
-bellary
-846
-iconoclast
-regnant
-831
-dutch-speaking
-kouga
-saucy
-graben
-trackways
-raghavendra
-ligier
-floridian
-observables
-puffer
-stilt
-plessey
-lucretius
-gmac
-shavers
-nonempty
-skool
-sandhya
-regierungsbezirk
-1181
-tsuna
-arthurs
-anti-smoking
-15-16
-witwicky
-kreuznach
-pasts
-atwell
-daeng
-grifter
-mechanicals
-fertilize
-proteaceae
-revolutionised
-morell
-hals
-lawley
-0-9
-prozac
-gakuen
-hutcherson
-lemmy
-draughts
-overrode
-hormel
-undergarments
-hardens
-sewanee
-soren
-koli
-larkspur
-louisburg
-parsed
-lidar
-26-year-old
-benno
-terrapin
-monophosphate
-rebreather
-skirted
-petioles
-sopot
-corned
-preposterous
-staubach
-h.d.
-helder
-eurosceptic
-46.8
-1530s
-agl
-tenma
-ili
-carotenoids
-adsl2
-landowning
-sealers
-sesotho
-normalize
-desiccation
-borat
-indie-rock
-rainmaker
-manifestly
-47.8
-graviton
-cottle
-calatrava
-expander
-arthouse
-redux
-1031
-luhrmann
-sweepers
-provisioned
-haruna
-brownwood
-milgram
-beeches
-phenyl
-perls
-minimalistic
-affiliating
-sweatshop
-repressor
-novelizations
-inuyasha
-marfa
-nc-17
-myna
-shogakukan
-terrans
-tyrolean
-fios
-deflated
-intercolonial
-curatorial
-zwickau
-unearthly
-glendora
-elocution
-pointy
-tyke
-bleecker
-maredudd
-rabbani
-preempt
-ecclesiastes
-tapia
-ersatz
-87.4
-meliaceae
-pyroxene
-moralistic
-kantner
-veruca
-labelmates
-formula_73
-nunnally
-chauvinism
-ino
-seaham
-artin
-13-3
-multi-platform
-pil
-livid
-educationalist
-halicarnassus
-poached
-bemused
-humint
-bst
-bartoli
-morgoth
-johanson
-thirty-eighth
-fugues
-angeli
-beatnik
-cardona
-idt
-moroccans
-fanciers
-johnsonville
-shultz
-polish-american
-deterring
-elucidation
-crookston
-2:3
-1233
-gentleness
-co-found
-tetracycline
-aguri
-dano
-sputtering
-elio
-zoltan
-larger-than-life
-condoned
-scholasticism
-otranto
-27-yard
-nitra
-p.r.
-fh
-chopsticks
-erma
-b.v.
-soames
-craugastor
-schisms
-roeder
-chuvash
-acquiescence
-pantyhose
-supplanting
-baseband
-stabilising
-pavlova
-villar
-polska
-swc
-authentically
-opa
-kunwar
-venoms
-pby
-purer
-pitbull
-banishes
-retells
-rastafarian
-hiruma
-saussure
-hayakawa
-ese
-masbate
-dakin
-bunks
-14.00
-elkin
-slacks
-toru
-goalkicker
-tramps
-792
-winnetka
-sandhill
-chippendale
-xxvii
-piecewise
-3.34
-persecuting
-coogee
-o'flaherty
-basilar
-midnapore
-deluded
-aprilia
-centipedes
-pre-date
-sidmouth
-demoralised
-second-to-last
-pilsen
-timbres
-carolinian
-vetch
-pradhan
-kismayo
-uac
-1994/95
-membranous
-dsb
-surnamed
-lawmen
-tussle
-reallocated
-one-armed
-perdition
-royton
-insurrections
-koon
-basti
-caerleon
-haywire
-pforzheim
-53.5
-ailey
-chaudhuri
-baha
-expletive
-beatz
-p.o.d.
-moxon
-kanaka
-imperatives
-bruckheimer
-clairvoyant
-cefn
-lectionary
-falciparum
-machias
-big-box
-machineguns
-hsn
-poston
-collet
-iqaluit
-9/10
-relaxes
-extensional
-bromeliaceae
-bengaluru
-paradis
-hhc
-luminescence
-anti-spam
-macewan
-trippe
-paik
-groundskeeper
-coauthor
-kawaguchi
-resets
-tsarevich
-steppin
-salcedo
-crayford
-gini
-sori
-balsa
-einsatzgruppen
-surayud
-wensleydale
-fop
-baidoa
-cwmbran
-hordaland
-eax
-strawbs
-demographically
-phylogenetics
-7/8
-protectorates
-collard
-outnumbering
-noetherian
-diehl
-fatigued
-chamillionaire
-lewistown
-e.m.
-sacrilege
-bergin
-zeffirelli
-rosita
-jourdan
-agathon
-hypodermic
-fansite
-egm
-loadings
-riotous
-pristina
-closed-loop
-shriners
-pulsing
-dah
-geisel
-50-yard
-tortillas
-s.k.
-holography
-tortuous
-kasaragod
-jewish-american
-roars
-dialogs
-andromedae
-limping
-telemovie
-self-image
-alessandra
-shinde
-mesons
-whitespace
-bellinger
-seething
-newsreels
-non-existence
-nwt
-jilted
-monopolized
-mynydd
-taiyuan
-overrides
-peripatetic
-wettin
-joust
-sohn
-hieratic
-re-examined
-baldock
-sighs
-boogie-woogie
-sukkot
-mcb
-glyphosate
-utep
-nmc
-coking
-761
-wingham
-virgilio
-feathering
-isoform
-caserta
-57.5
-turton
-re-organised
-dissipates
-121st
-wbz
-givenchy
-foliation
-rous
-plantains
-treasuries
-mid-16th
-dodges
-u1
-menino
-messiaen
-henricus
-crawfish
-2.08
-untranslated
-contessa
-quorn
-magisterial
-kuyper
-welf
-sedges
-hristo
-paleobiology
-conformations
-madoc
-grambling
-all-japan
-bergstrom
-rosencrantz
-100.8
-sawn
-stanier
-gabi
-downsview
-tolka
-ckd
-moreira
-piha
-indeterminacy
-sabian
-fritsch
-yon
-overlaying
-owsley
-secam
-1977-1978
-principe
-m-1
-sous
-coleoptera
-hadlee
-vic-20
-ostracism
-northants
-outflanked
-toshi
-blazes
-junge
-zamindari
-subwoofer
-muta
-karsten
-complacent
-sirmium
-cev
-vanzetti
-planking
-palouse
-lsat
-dualdisc
-smokestack
-swanston
-ballgame
-cisalpine
-peace-keeping
-pre-dreadnought
-antonelli
-thurlow
-vayu
-1939-40
-tudela
-okamoto
-breguet
-wojciech
-mckechnie
-fela
-zaid
-popish
-peloponnesus
-duhamel
-stonyhurst
-nodding
-kishan
-showboat
-continuities
-strangeness
-quintessons
-perky
-n-gage
-ulric
-caloocan
-bladders
-terranova
-interuniversity
-shunted
-skinning
-1076
-sinan
-kati
-bogdanov
-tattlers
-gambled
-archaeologically
-chui
-adrenergic
-ameer
-methylated
-oviparous
-powerbomb
-sixteenth-century
-bhim
-cubicle
-icmp
-osvaldo
-shels
-perinatal
-homeomorphism
-bodyline
-ironwork
-straight-6
-buka
-oberoi
-mexborough
-quartering
-interzonal
-balk
-panday
-iconographic
-ladino
-sunder
-ub40
-bola
-lozenge
-hohokam
-sub-genres
-cubase
-faeces
-gci
-blesses
-waaf
-floored
-multi-core
-picchu
-fickle
-tokio
-1045
-florid
-barreto
-trade-offs
-mailboxes
-vishwa
-steinmetz
-probst
-chieti
-zakynthos
-karunanidhi
-troutman
-2-hour
-fbo
-1354
-orthodontic
-currant
-sidhu
-kubo
-roussel
-tacky
-shabana
-flik
-anarcho-punk
-1993/94
-spacek
-shanklin
-treacle
-manzano
-maisie
-iar
-cpg
-topsail
-reproach
-gratia
-yeoh
-cavemen
-carlile
-khadi
-764
-bastogne
-belkin
-shamokin
-carberry
-installers
-icty
-oops
-ecozone
-minato
-hobie
-charbonneau
-tedder
-verden
-agudath
-plotter
-100-year
-primality
-oroville
-ershad
-laemmle
-oppositional
-xxxi
-grandin
-weighty
-piedras
-ratepayers
-bhaktivedanta
-stansfield
-dehydratase
-incompletely
-endometrium
-1143
-gracia
-larp
-afresh
-rebroadcaster
-odbc
-bulgakov
-sympatric
-edm
-tatanka
-kaolin
-erythema
-harmoniously
-83.5
-belang
-videography
-pulver
-bevis
-ndtv
-ladner
-bricked
-chalky
-lennart
-vocalist/guitarist
-buckhead
-gruelling
-langevin
-quintin
-mwc
-prakasam
-hjalmar
-columb
-al-zarqawi
-serially
-rootstock
-espouses
-nicanor
-corker
-opensolaris
-casemates
-reelin
-sharpstown
-tft
-cashing
-barfield
-succinctly
-navratilova
-fillet
-afghani
-antonioni
-1106
-voinovich
-kaen
-tsv
-succubus
-kiva
-charioteer
-revitalised
-herold
-10-8
-heung
-0.35
-ketcham
-trilby
-tacloban
-chamonix
-airmobile
-regenerates
-fauquier
-48.2
-exhumation
-travelogues
-potchefstroom
-salesperson
-she-ra
-beamer
-pre-dating
-cin
-ironbridge
-xuanzong
-ched
-wendel
-anti-monitor
-white-eyes
-501st
-preamp
-preconditions
-savitri
-dama
-refractor
-eskrima
-vieux
-storia
-overshoot
-cla
-steamroller
-cullman
-starkly
-polley
-digraphs
-idealised
-great-great-grandfather
-tanga
-point-and-click
-olivetti
-self-harm
-armbands
-loverboy
-tocco
-scrambler
-tarquinius
-trestles
-calibrate
-sylvian
-qubit
-gentian
-hauptbahnhof
-warspite
-pollok
-nbs
-slideshow
-severino
-fip
-resonated
-hyperdrive
-emphases
-pinkney
-multipath
-laugh-in
-tendinitis
-propped
-hamed
-nirmal
-madina
-intelligencer
-quatre
-tuckahoe
-56.8
-nwr
-calvi
-flushes
-fpgas
-triathlete
-fishponds
-cpe
-half-timbered
-jesup
-stanislaw
-sabata
-jeopardized
-seppuku
-paramore
-mertz
-timeouts
-initialism
-bonaduce
-mk3
-dogger
-mediawiki
-baka
-bostock
-chetwynd
-hellion
-1959-1960
-scariest
-duong
-re-education
-swazi
-walliams
-soraya
-lapointe
-parameterized
-intuit
-tejada
-longest-lived
-royalton
-toasting
-sveriges
-norsemen
-higginbotham
-appius
-drongos
-minx
-1145
-sinker
-ignatz
-sardine
-aches
-mainspring
-tv-series
-pram
-cognizant
-egg-shaped
-cdn
-redeveloping
-jet-powered
-yul
-xuanzang
-enceladus
-grandmothers
-esmond
-abruzzi
-magno
-haircuts
-comebacks
-ferruccio
-wimpy
-983
-crucis
-anteaters
-kudos
-60.1
-beith
-lachine
-accordionist
-stereolab
-apiaceae
-handoff
-heatley
-shamed
-souris
-ipsec
-cinemax
-e-zpass
-ribot
-korfball
-capsaicin
-chabon
-venusian
-45.0
-clockmaker
-cen
-airey
-invocations
-rufc
-khayyam
-archons
-rioted
-gaudy
-radstock
-haddonfield
-pissed
-rauschenberg
-tinamous
-3-speed
-chaves
-topo
-grosset
-pamunkey
-michie
-shoalhaven
-latched
-flasks
-s-video
-scrubland
-kottke
-anti-chinese
-overzealous
-vanga
-deleon
-fully-fledged
-kirsch
-p6
-janjua
-lupino
-yossarian
-shuler
-sorrell
-widmark
-atlanta-based
-trilobite
-heterocyclic
-removals
-repurchase
-waggoner
-chattopadhyay
-baskin
-curiae
-deep-seated
-rso
-duro
-982
-all-day
-bisley
-aditi
-peppercorn
-dippers
-isometries
-minyan
-reestablishment
-899
-maccarthy
-roches
-swire
-mccaskill
-sociedad
-idiosyncrasies
-caputo
-yass
-biosciences
-washboard
-nobby
-hegelian
-boule
-thought-provoking
-phytophthora
-salahuddin
-mox
-masaya
-uncomfortably
-malfoy
-handcrafted
-peptic
-unbeknown
-ormsby
-viggo
-free-living
-laterite
-raney
-wythe
-tarred
-jansch
-thanatos
-abacha
-munros
-revulsion
-recurred
-beinn
-catamarans
-gwydir
-heikki
-boomerangs
-belen
-84.6
-126th
-post-dispatch
-funafuti
-kobold
-bibliographical
-cano
-thwaites
-godoy
-eula
-ratcliff
-panelled
-messe
-tisha
-dorsally
-norreys
-320,000
-plaudits
-munna
-jeolla
-penduline
-liberalized
-patties
-viridis
-low-temperature
-sundar
-abdulla
-micronesian
-bucer
-chhatrapati
-disinterest
-pelted
-unneeded
-moai
-sombrero
-rustlers
-come-from-behind
-jehangir
-neurogenesis
-recuperated
-eutrophication
-holism
-mahfouz
-system-wide
-unibody
-t-bag
-intrude
-populus
-westboro
-flon
-clipboard
-cliftonville
-o.s.
-aleksandar
-limiter
-dphil
-poh
-nakhchivan
-absorbance
-stang
-brunelleschi
-gavaskar
-52.8
-dehn
-invitation-only
-resetting
-100.6
-pigott
-pincus
-angas
-circumnavigate
-eakins
-dispenses
-nau
-duelling
-bancorp
-plunket
-gaillard
-spiking
-conjectural
-sampradaya
-doggy
-jdm
-89.0
-lombok
-pierluigi
-vca
-tomoko
-becher
-egghead
-stapp
-1945-46
-kipper
-take-two
-dotterels
-gongsun
-babson
-dusting
-second-rate
-hypoplasia
-webmasters
-beckton
-gardena
-daan
-diageo
-webisodes
-borgnine
-beaumaris
-rock-oriented
-gaap
-tricycles
-hannaford
-tsc
-valverde
-gms
-unfashionable
-kyrenia
-tasteful
-noreen
-m.b.
-loddon
-aquarii
-11-5
-westmont
-lupton
-thud
-bombo
-needlessly
-fractals
-laser-guided
-metabolize
-breathable
-shuttling
-intrauterine
-manetho
-xana
-westerberg
-ricki
-dearne
-brinker
-arabesque
-barbossa
-mcvey
-48.4
-8-inch
-molokai
-biya
-vanya
-goch
-robben
-wader
-scoutmaster
-vitti
-baddeley
-45.4
-rw
-barometric
-mna
-aai
-unspeakable
-thistles
-fayre
-merseburg
-squabbles
-futebol
-suwannee
-bcm
-scheider
-hoynes
-uns
-silverdale
-somersault
-bohun
-balin
-shredding
-alfreton
-defecation
-jabotinsky
-wccw
-29-yard
-ilona
-fineness
-finisterre
-yle
-rebbes
-33-yard
-mascis
-tullius
-nissim
-shriek
-linc
-frontiersmen
-biomes
-malnourished
-harmonizing
-grandnephew
-gating
-unkempt
-ntp
-subscription-based
-mtm
-jayasuriya
-july/august
-electroplating
-wolde
-siegen
-o-o
-geranium
-breaths
-evidentiary
-chequers
-lochs
-kader
-neuropsychology
-wellstone
-thf
-6/8
-reddick
-disbursed
-tortuga
-hoppus
-well-armed
-aspartate
-imre
-full-screen
-bosman
-bursaries
-zealously
-lomonosov
-al-hakim
-bluewater
-hollingworth
-divisie
-haring
-meo
-ove
-billeted
-bhatia
-capsize
-wobbly
-brinton
-lipophilic
-cicada
-writer-director
-enactments
-sce
-ussf
-then-popular
-cronquist
-self-referential
-citroen
-45s
-scotiabank
-colo
-transworld
-kare
-phelsuma
-player/coach
-scopus
-graaff
-howser
-co-existed
-desilu
-repatriate
-arti
-doctored
-nahin
-euphoric
-biotite
-nima
-matchmaking
-footed
-carpentaria
-avp
-counter-terrorist
-councilwoman
-lukather
-xxxv
-sambhaji
-wristwatch
-phalanges
-irix
-srp
-mcparland
-toxteth
-cultus
-obliges
-varden
-catan
-collinson
-wdc
-darussalam
-52.7
-msv
-kennon
-xiong
-rdx
-moussaoui
-agc
-unsanctioned
-gls
-leachate
-dellums
-corunna
-millikan
-murtha
-rossum
-sectoral
-shapeshifters
-goalkicking
-862
-c2c
-armadillos
-53.4
-threonine
-tackler
-cne
-northfleet
-einhorn
-occultation
-sso
-nanchang
-fionn
-heterosexuality
-riche
-normanby
-man-of-the-match
-wooly
-lecherous
-cosy
-anteater
-psr
-regine
-pottinger
-rimutaka
-leverett
-belait
-dominators
-1990-1992
-averroes
-polyhedral
-mikawa
-heaths
-cece
-icd-10
-postoperative
-oclc
-enlistments
-urticaria
-mahi
-silvester
-flinn
-lightnin
-languishing
-1243
-bourassa
-tsarina
-fume
-tows
-stromberg
-hiked
-connotes
-glasnevin
-giamatti
-stavropol
-excerpted
-aat
-stratotanker
-bouton
-.400
-appraisals
-jolo
-190,000
-nanometer
-idling
-shuja
-neutrophil
-eap
-theatrics
-charoen
-stanwix
-langlois
-fianna
-macmurray
-894
-coutances
-nikolaevich
-carus
-pittston
-streetscape
-hella
-dromore
-cascaded
-menthol
-homely
-asan
-todt
-harmlessly
-cationic
-entitling
-silage
-m.i.t.
-verus
-rigoletto
-stormtroopers
-motacillidae
-pallid
-espada
-udr
-nebel
-pelecaniformes
-1374
-comnenus
-siltstone
-conocophillips
-persuasions
-archpriest
-mid-career
-fudd
-khattak
-eadie
-manos
-american-based
-887
-pleadings
-podocarpaceae
-pippen
-shahin
-areva
-macmahon
-dundonald
-rut
-panicking
-glimmer
-laysan
-geste
-kamo
-colgan
-millward
-lower-cost
-consummation
-tulku
-bizarrely
-simm
-coder
-unpalatable
-garh
-1080p
-bettie
-relatedness
-egwene
-conflation
-martinus
-wythenshawe
-randell
-machine-guns
-crystallizes
-turnouts
-trolling
-jh
-al-sabah
-kosciuszko
-kasai
-11-yard
-recant
-water-filled
-dodged
-westphalian
-heike
-morphy
-blaikie
-chehalis
-merckx
-hayfield
-keeneland
-784
-bragged
-vorarlberg
-loews
-ledyard
-conran
-spring/summer
-solapur
-microclimate
-supercilium
-alo
-rti
-52.1
-impairs
-continuous-time
-mid-afternoon
-tintagel
-surin
-montesquieu
-pinchas
-semiotic
-articulations
-alsace-lorraine
-endorser
-cornmeal
-pinner
-propulsive
-brite
-siddur
-nureyev
-funaki
-boldface
-work-related
-alluring
-berating
-rovere
-eurocup
-masterwork
-angara
-musselman
-ex-president
-juni
-mortified
-zobel
--11
-hesse-darmstadt
-time-dependent
-210,000
-co-produce
-asexually
-systema
-marmot
-86.1
-bayt
-szabo
-itineraries
-upholstered
-intensifies
-dce
-biomechanical
-shunters
-bato
-abet
-ppr
-bird-like
-qed
-scudetto
-stoltz
-51,250
-yoshio
-thermionic
-single-deck
-daylights
-siouan
-buckman
-blazoned
-shoten
-arbogast
-megrahi
-skipjack
-hengist
-all-inclusive
-grice
-bedded
-treeless
--DG.DGDG
-vicar-general
-mell
-eisler
-collegians
-siem
-umd
-kaa
-ramis
-zend
-conundrum
-lehr
-stoppages
-stratagem
-copse
-reutemann
-fitzherbert
-aardman
-stowell
-malts
-lemoore
-pre-eminence
-lucile
-zant
-methotrexate
-puna
-zephyrs
-haris
-ambidextrous
-omdurman
-indo-fijians
-isiah
-1975-1976
-olam
-ogun
-novus
-11.40
-crediton
-nuba
-boulez
-nandini
-manfredi
-kalamata
-durocher
-hounded
-18,125
-reabsorption
-overmind
-mauritanian
-al-zawahiri
-prognostic
-rosser
-neos
-unrecognised
-cutts
-scrubbers
-dewine
-full-body
-wrecker
-sakharov
-pliable
-24-yard
-quincey
-duplessis
-18-20
-suze
-ringleaders
-delamere
-funnier
-rubidium
-maxillofacial
-cgt
-rotted
-m1911
-moros
-qiang
-arequipa
-eland
-kamboh
-admissibility
-cayetano
-likening
-uranium-235
-donmar
-tidings
-elagabalus
-miscommunication
-dimebag
-francophones
-fmv
-pse
-1935-36
-millwood
-top-of-the-line
-lindo
-castlebar
-domitius
-dht
-al-rashid
-decking
-overwritten
-crepuscular
-sool
-2.06
-vincente
-sulley
-antananarivo
-abstractly
-parashurama
-logician
-corporatism
-sba
-lbj
-iphigenia
-claro
-intuitions
-bagdad
-ghazan
-ucsc
-andani
-1.96
-aime
-771
-mso
-fothergill
-stephanus
-huelva
-chatfield
-resorption
-redbirds
-penns
-grammy-winning
-lapin
-suppressors
-lycaenidae
-samus
-bustard
-1993-1996
-semi-desert
-sabotages
-cubana
-ssd
-starwood
-1133
-795
-bueller
-wishful
-pilates
-dryland
-reticle
-papineau
-wintertime
-shrubbery
-feldstein
-bruner
-naveen
-magnolias
-attalus
-shinsengumi
-devdas
-hagerman
-bornean
-gimli
-prescot
-bigot
-coty
-doberman
-doig
-102.8
-phenix
-alcove
-dominik
-diaconate
-mcgriff
-saltpeter
-robey
-crackle
-luciana
-hailsham
-toucan
-masha
-chewy
-defused
-larisa
-siddharth
-13,750
-kingsmen
-to-day
-gosse
-nahal
-gero
-darkside
-hetty
-corrine
-naturae
-1131
-brassica
-kareena
-vidin
-layup
-effusion
-7,200
-trice
-valchek
-986
-light-heavyweight
-technicalities
-appreciably
-bergh
-jeroen
-57.6
-jeevan
-fawkner
-well-placed
-moorhouse
-md5
-doan
-slieve
-re-discovered
-two-track
-prelature
-go-between
-my-hime
-bier
-tham
-dmitriy
-biennially
-gsc
-one-point
-54.3
-binnie
-meow
-bloke
-neophyte
-jandek
-fifty-ninth
-ballplayers
-haemophilia
-taw
-rassilon
-51.8
-skanderbeg
-hydroxylase
-lerma
-gowan
-aew
-sprain
-kurd
-renaldo
-wcml
-pruett
-dostum
-mcallen
-wild-card
-koma
-post-gazette
-redeemable
-10.20
-islamism
-mcnary
-resonating
-catharsis
-anti-racism
-glabrous
-indo-china
-postmaster-general
-shadowrun
-rubbers
-ahimsa
-marita
-celina
-heraklion
-californication
-isf
-remittance
-nampa
-rodan
-photosensitive
-shuya
-adage
-solana
-untied
-diaghilev
-ov
-mcreynolds
-aliso
-gannet
-manston
-photochemical
-italicized
-pss
-araliaceae
-sedaris
-marquardt
-goemon
-kalas
-liquidator
-1146
-afro-asiatic
-dolmen
-inexhaustible
-paro
-widely-known
-qasr
-dreamlike
-palahniuk
-51,000
-grotius
-mithun
-decoupling
-valentia
-carnell
-3.45
-rabobank
-janne
-soundings
-contraption
-reconstitution
-trip-hop
-samajwadi
-jedediah
-danceable
-3/8
-campus-wide
-sabers
-logger
-shoshenq
-freesat
-twang
-40mm
-akiyoshi
-bicker
-ilchester
-baldrick
-bakura
-metastases
-sylvanus
-arse
-dupuis
-longa
-megahertz
-minar
-winemaker
-aves
-fusarium
-bigg
-mro
-goran
-mcgruder
-tafsir
-torvalds
-osteopathy
-suba
-queercore
-holidaymakers
-toppers
-apothecaries
-matres
-weatherly
-rubus
-soth
-winches
-animas
-co-sponsor
-apostates
-59.4
-spitalfields
-colonnades
-modeler
-pesach
-unknowable
-madinah
-extinguishers
-cisticola
-bilaterally
-7.20
-ulmer
-communicable
-suffern
-soane
-paddocks
-immobilize
-stockpiled
-smelled
-nativist
-multi-lingual
-commissionings
-30,417
-forgetful
-ka-zar
-annapurna
-53.6
-schweinfurt
-gravina
-zina
-zagros
-lachey
-kisumu
-lawmaker
-thimble
-bureaucracies
-pacifists
-poway
-affricates
-columbiana
-lacmta
-cranky
-non-destructive
-caraway
-patina
-boult
-tigres
-poonch
-saddler
-jigme
-silvanus
-amadou
-1281
-milroy
-a13
-appice
-09:00
-1973-1974
-stennis
-hsi
-palani
-semarang
-corday
-prudhoe
-malietoa
-peele
-baji
-sphingidae
-slighted
-a/b
-babangida
-zorin
-ludvig
-kalmyks
-raby
-albertine
-ramble
-caput
-biju
-endangerment
-papen
-hee-dong
-monocle
-gpc
-polarised
-1166
-halligan
-monaural
-sheldrake
-devens
-mutualism
-heald
-misrepresenting
-eisenach
-octal
-737-200
-staking
-lauds
-venkatesh
-self-regulation
-siraj
-canty
-gramsci
-nobis
-chinese-language
-colonel-in-chief
-hew
-874
-desperado
-boyish
-marquessate
-vindicator
-considerate
-eee
-chislehurst
-vecna
-naresuan
-montez
-entranced
-clytemnestra
-a&w
-kaman
-horseracing
-2001/2002
-legco
-annis
-padgett
-quadrupole
-bassa
-ex-lover
-mcclung
-oceanfront
-bamboos
-gedo
-guineafowl
-muscarinic
-fakenham
-saltash
-tepper
-olivares
-well-balanced
-livius
-grampus
-brodeur
-hygroscopic
-sinnott
-supremo
-efta
-1974-1975
-addai
-e.o.
-radioshack
-mbt
-tami
-corriere
-unrealized
-aarti
-amiss
-middle-order
-cloistered
-freckles
-albani
-fard
-58.2
-ilyas
-fluoridation
-rattler
-ballrooms
-1108
-janos
-synthesisers
-lamarche
-g.w.
-gurdaspur
-ulceration
-nashe
-goodspeed
-indo-fijian
-greenmount
-malthouse
-roc-a-fella
-1976-1977
-andersons
-horwitz
-del.
-storeroom
-immunocompromised
-florida-based
-vyborg
-keystrokes
-s.b.
-equaling
-1.05
-ghg
-hellenes
-neots
-kaito
-11-1
-apologises
-choppy
-l7
-shoop
-hamzah
-8-yard
-haggadah
-cullinan
-breathitt
-ouray
-magoffin
-panmure
-pre-ordered
-roughing
-953
-spey
-cadences
-sensationalism
-red-billed
-rodentia
-berkshires
-zt
-dinars
-easterners
-transmigration
-cath
-undressed
-anantapur
-tocqueville
-bude
-tranter
-orly
-fick
-gunderson
-incl
-verlaine
-droit
-mourne
-npb
-deepens
-schreiner
-.11
-purves
-thiel
-vitoria
-toss-up
-20mm
-tetsuo
-anionic
-mouthpieces
-mpeg-1
-paya
-102.6
-acid-base
-altarpieces
-49,000
-g.e.
-rifting
-popularise
-nongovernmental
-birdwatchers
-exclusions
-pham
-84.1
-defrauded
-dorsett
-9.60
-precast
-swiftsure
-5,600
-approximant
-smokebox
-erlang
-riefenstahl
-coasting
-steerable
-lanning
-lmp
-1055
-rhodium
-scour
-colophon
-taqi
-shanna
-picketed
-ciws
-capos
-stikine
-so2
-parsimony
-midgley
-botolph
-11-13
-jee
-wizarding
-yearbooks
-diminution
-cookson
-supersedes
-30-year-old
-adic
-stegosaurus
-ledbetter
-180th
-gudgeon
-akershus
-nagash
-tessellation
-co-operating
-scarfe
-d'ampezzo
-eerily
-n-type
-syr
-araya
-mesothelioma
-prostheses
-lustful
-yuvraj
-neurotoxin
-strychnine
-matsuyama
-asker
-jemison
-poynter
-jean-marc
-korolyov
-laboring
-lalit
-naps
-ectoderm
-morganatic
-transdev
-fanshawe
-century-fox
-chaminade
-booing
-namie
-k-mart
-picaresque
-albay
-lanky
-iscsi
-21:00
-shinano
-non-departmental
-nutley
-alcatel
-matabele
-aransas
-five-speed
-bradwell
-stam
-heartwood
-reenactors
-tillage
-cubic-inch
-handsomely
-grainy
-bathsheba
-emanates
-microarchitecture
-satyricon
-stitt
-farmsteads
-56.6
-49.0
-enema
-chuckie
-bonkers
-alessi
-exegetical
-choreographic
-skyrocket
-glinka
-liverworts
-pasay
-jabez
-phool
-furst
-wrangell
-1167
-1163
-bandmaster
-makkah
-pais
-californica
-themis
-nagata
-countrylink
-baumgartner
-rion
-incumbency
-mraz
-agawam
-phaedra
-lancasters
-anti-
-surfside
-chea
-buenaventura
-pavlo
-regolith
-slitheen
-yoshiki
-5,400
-stand-out
-parco
-karnal
-54.4
-poacher
-oac
-5m
-monocots
-bareback
-dhanbad
-sutro
-mid-engined
-kernow
-inco
-heartily
-supercenter
-wavering
-spca
-suki
-heraclitus
-mercurio
-stipends
-farnworth
-kellen
-pre-professional
-dogon
-senescence
-panchayath
-tenzin
-dowitchers
-confide
-spawns
-komen
-trans-neptunian
-gaskets
-hac
-1178
-men-at-arms
-gilberts
-thomsen
-b-type
-kummer
-hodgins
-mcilroy
-battuta
-lawes
-laverton
-chacha
-basle
-mendeleev
-apprehending
-watley
-r.b.
-hotelier
-goel
-h-1b
-peacemakers
-xxxvi
-okra
-c&c
-clogher
-slane
-twitchell
-hoshiarpur
-mphil
-eeprom
-geoffroy
-oranje
-bailiwick
-samsara
-upswing
-chandni
-30,833
-watsuki
-harlock
-yangzhou
-rez
-coster
-neurophysiology
-achaeans
-majere
-sectioned
-re-design
-droopy
-bi-partisan
-frequents
-abelard
-west-east
-sabo
-pnm
-randers
-bonk
-moondragon
-faubus
-zumwalt
-easygoing
-laridae
-electro-motive
-herniation
-neuroscientist
-delphine
-upturn
-encroach
-unfairness
-outrages
-frollo
-ruffians
-harpy
-synchronicity
-squalid
-decipherment
-kla
-tumuli
-trounced
-allspark
-carlsen
-equivocal
-caracaras
-kandrakar
-carburettor
-then-mayor
-langtry
-leaner
-k.m.
-deftly
-manche
-plosives
-larrabee
-sunningdale
-iwakuni
-1416
-9.80
-waterboarding
-akatsuki
-ballston
-time-traveling
-hypochlorite
-hermie
-viagra
-bouzouki
-chara
-tish
-badi
-aerialbots
-muirhead
-mobilise
-anorthosis
-wernicke
-milbank
-ugliness
-riggins
--13
-jinja
-hindwings
-auks
-tohru
-banten
-extemporaneous
-subnational
-neela
-hawkers
-ravenous
-patenting
-plumstead
-bellick
-hydrologic
-stewarton
-opelika
-quisling
-antelopes
-sidebottom
-slava
-valente
-daytimes
-sajid
-cobble
-mannes
-icici
-fsln
-stablemate
-stylidium
-paracelsus
-x-pac
-kgo
-1580s
-sensitization
-irises
-benzyl
-smarts
-winsford
-17-14
-badoglio
-sunway
-applaud
-forlorn
-couches
-diario
-spektor
-copyist
-siddons
-seventy-first
-meggie
-koopa
-strider
-roubles
-megaliths
-provenzano
-nicoll
-shanley
-asser
-ruptures
-gond
-dinwiddie
-equalised
-magica
-monolingual
-thacker
-calne
-9a
-pterygoid
-penciling
-natya
-file-sharing
-astronautical
-one-minute
-westfall
-dressler
-duwamish
-kurus
-mbira
-tyree
-lonestar
-aldus
-renews
-turki
-bowlby
-collaborationist
-naphthalene
-2006/7
-differentially
-droop
-blockhouses
-rajaji
-incompatibilities
-eurobeat
-11.10
-meadowbank
-erb
-nicolette
-francie
-patently
-nswrfl
-rabble
-kapitan
-brighten
-dhoni
-foie
-stereochemistry
-facilitators
-squalor
-tritone
-antillean
-conjugal
-56.4
-pemba
-rossington
-.00
-fatherly
-275,000
-lakehead
-cogeneration
-naphtha
-tell-tale
-jupp
-cavil
-librarianship
-sandakan
-danio
-pitzer
-payette
-amana
-buono
-millstones
-tattnall
-19-yard
-rickie
-inaugurate
-manne
-osmania
-arabic-speaking
-tswana
-grindhouse
-brutish
-carolinians
-distillate
-geezer
-syncopation
-heb
-snitch
-zippy
-harrod
-downpour
-redefinition
-hoyland
-187th
-dagbon
-opcode
-pericardium
-center-left
-galaxias
-kalat
-frunze
-consumables
-pournelle
-archelaus
-stepford
-gilliland
-co-owns
-persecute
-rosewall
-coextensive
-1005
-105.4
-wellsville
-46.6
-phonons
-ayyappa
-nationhood
-gasworks
-sixty-two
-vasodilation
-fredo
-risings
-cheetham
-82.5
-smirnoff
-carrasco
-citylink
-pointwise
-shiina
-kossuth
-serapis
-330,000
-burstyn
-apalachee
-korat
-infective
-peron
-chams
-kadena
-storekeeper
-sues
-tines
-elas
-presse
-unrepentant
-self-identification
-queensbury
-mementos
-bijective
-jolley
-dth
-redoubts
-nisqually
-hayride
-6400
-divinorum
-astigmatism
-left-to-right
-uluru
-bausch
-butkus
-enunciated
-fasted
-beenleigh
-deflects
-3com
-rojo
-cashbox
-siew
-zuko
-cassin
-terrane
-scaggs
-9.90
-ammons
-sarees
-softens
-bruticus
-arethusa
-nasrallah
-kirksville
-grillo
-co-ruler
-seahawk
-suchet
-uncooked
-insides
-victorias
-hillfort
-magmatron
-crawfordsville
-55.9
-consonance
-wukong
-conjugacy
-anoka
-glucocorticoid
-codice_16
-lida
-1134
-stoneware
-hopkinton
-betta
-keefe
-milpitas
-1077
-katt
-tetrachloride
-ne'er
-hermaphroditic
-bierce
-stratospheric
-askin
-centre-forward
-bodhisattvas
-quintessence
-jule
-fund-raiser
-lak
-clun
-solar-powered
-broadmeadows
-einem
-columbanus
-maunsell
-irian
-half-human
-paa
-quiescent
-802.1
-transaxle
-wfl
-41-yard
-blitzwing
-melchett
-zardari
-oflag
-avidly
-orientalism
-991
-lasse
-834
-kaminski
-duple
-zapotec
-lattimore
-lifeform
-rationed
-nowra
-bulaga
-curbs
-resveratrol
-glances
-mcclane
-purusha
-all-important
-flin
-detours
-dutchmen
-savoia
-borja
-r.r.
-urbane
-lynbrook
-dimensionality
-lycanthropy
-gnutella
-quarter-mile
-recurs
-hpc
-sky1
-dregs
-jadakiss
-cvr
-taaffe
-ecl
-tortola
-hasina
-118th
-i4
-agriculturally
-preclinical
-cost-benefit
-fourth-place
-4,000,000
-illegitimacy
-fowley
-evernham
-shifty
-reenactments
-kimchi
-dacca
-ioan
-bridge-tunnel
-ryukyus
-correlating
-dubstep
-brasilia
-myocardium
-futurists
-fenix
-masterly
-banger
-domain-specific
-906
-haroun
-gamboa
-militarized
-pandanus
-noda
-piave
-nlc
-sedis
-partito
-airdate
-storm-petrel
-jowett
-exertions
-rajahmundry
-non-hodgkin
-sive
-u23
-hsieh
-disfranchised
-jevons
-round-the-clock
-366th
-1196
-claustrophobic
-noize
-glimpsed
-judaea
-tuzla
-101.0
-stradlin
-1319
-ephedra
-kallang
-1172
-steinhardt
-verger
-791
-recitations
-everclear
-memnon
-fourth-generation
-86.0
-86.2
-albacore
-maleficent
-user-generated
-yokota
-longwell
-frideric
-stringing
-beehives
-fuel-efficient
-rickmansworth
-lisi
-agnosticism
-49.9
-sagami
-tce
-beaming
-trg
-jaswant
-estevez
-moorabbin
-dn
-sculling
-tablature
-inflows
-923
-kandinsky
-l4
-intaglio
-cachet
-tiempo
-decimus
-lb.
-fungicide
-kjell
-crunk
-monocytes
-dua
-skala
-meli
-sauerkraut
-ebook
-full-text
-fishburne
-imagin
-snps
-arundell
-43.0
-singling
-dungan
-sevan
-molonglo
-splint
-enteric
-ge'ez
-tol
-postmodernist
-chaykin
-berryhill
-44.8
-897
-rhombic
-tambov
-radiologist
-meghan
-934
-gdc
-kuch
-probus
-thymidine
-smoldering
-sajjad
-faience
-mrp
-wheatfield
-1971-1972
-repurposed
-3am
-i-81
-alina
-order-in-council
-eshkol
-dti
-ferrante
-garros
-vasoconstriction
-gussie
-cerebro
-synesthesia
-razer
-germanus
-berta
-soper
-birthmark
-naturopathic
-stoicism
-academie
-hadamard
-kura
-nila
-lyta
-drive-through
-segway
-tripolitania
-plebs
-crescents
-humbucking
-sna
-saroyan
-20m
-daffodils
-trans-pacific
-student-led
-pechenegs
-suncoast
-s/o
-deegan
-captioned
-bitola
-tdma
-pyrimidine
-shaul
-5,300
-tpi
-trumper
-annabella
-turdidae
-harboured
-altercations
-asf
-bes
-sportscasters
-homestead-miami
-mutable
-annihilating
-multi-stage
-overhangs
-eni
-canticles
-mkii
-nephite
-recordable
-bodyshell
-hypnotherapy
-three-judge
-wicket-taker
-ussher
-patnaik
-latakia
-hanan
-dione
-pan-arab
-ingres
-harbouring
-serotype
-:30
-prana
-gephardt
-langan
-mind-control
-rationals
-passivity
-vnaf
-barringer
-flyway
-highbrow
-misra
-trentham
-trini
-kallen
-oratorical
-victimization
-wrap-around
-lamentation
-ninety-five
-madog
-feliciano
-plataea
-perpetuation
-pickling
-afflictions
-neufeld
-husks
-1.45
-nodule
-filipino-american
-break-away
-mannered
-82.4
-kadokawa
-corinna
-sle
-dib
-keilor
-pressurization
-magnetometer
-cipriani
-hake
-garten
-curio
-rundfunk
-reorganizations
-gyeonggi
-47.3
-27.50
-on-chip
-pottstown
-overviews
-threskiornithidae
-rocketed
-consuelo
-badal
-carondelet
-15000
-austerlitz
-watchdogs
-kapellmeister
-zaporozhian
-wyck
-cavalli
-kaede
-nutcrackers
-faslane
-lazenby
-equites
-i.q.
-pulcher
-frascati
-spongy
-leytonstone
-familiarly
-flints
-almanacs
-kushiel
-rustam
-action-oriented
-qui-gon
-noire
-bellanca
-hairline
-monotremes
-aminoacyl
-treads
-offroad
-bogra
-unnerved
-nitin
-esch
-1334
-verso
-grudging
-turandot
-hata
-ecclesiastics
-phocas
-colonise
-vada
-hmc
-gondolas
-parthenogenesis
-unselfish
-17:00
-c-class
-galas
-1972-1973
-alvar
-froese
-1994-1997
-gauche
-superorder
-humana
-hauntings
-chernihiv
-bedford-stuyvesant
-trilateral
-aflame
-stopford
-gormenghast
-quizzing
-optima
-thorburn
-1995-1998
-baynes
-135,000
-apoplexy
-boatmen
-cudahy
-earnshaw
-eusebio
-kapiti
-saavedra
-ntu
-rila
-senatus
-snowe
-mordru
-eardrum
-powis
-85.9
-falling-out
-hitoshi
-roll-off
-khandesh
-coretta
-televangelist
-modulations
-pleurisy
-kleist
-ninoy
-lubavitcher
-hillclimb
-sel
-init
-stradbroke
-softbank
-outdo
-spt
-noggin
-biddulph
-alphonsus
-verifier
-august-september
-othman
-multi-function
-haute-normandie
-staton
-omonia
-cmm
-sabarimala
-chancellorship
-nacelle
-variably
-hanscom
-repossessed
-gourds
-cedarville
-iger
-c.i.
-lieutenant-commander
-bem
-opelousas
-impales
-neuropathic
-mercutio
-schorr
-15-17
-splatter
-greendale
-divulged
-fightin
-caveats
-jammers
-piggott
-canso
-silber
-halfpenny
-cowherd
-austar
-cityhood
-reredos
-thelema
-sasami
-mattei
-beetlejuice
-ossification
-winamp
-arbitrate
-ant-man
-kimbrough
-3.42
-appraiser
-bewildering
-burks
-marketplaces
-burrowers
-acholi
-throttling
-playability
-gung
-cycads
-ramzan
-retrofitting
-belay
-returnees
-morcha
-virginal
-9001
-coulsdon
-leftwing
-harel
-pro-business
-milazzo
-wels
-ondo
-long-chain
-goatee
-five-man
-kaas
-caligari
-decommission
-oiled
-co-lead
-brannan
-deitch
-furnishes
-nls
-poc
-1027
-manhasset
-sie
-faun
-montenegrins
-cussler
-fairplay
-1091
-odm
-ctf
-jailer
-garifuna
-warton
-nasik
-prettiest
-aquamarine
-huis
-wooldridge
-burnette
-epaminondas
-mccune
-switchback
-o'kelly
-nobuyuki
-euphemisms
-cometary
-pomegranates
-sonet
-corded
-starches
-sub-districts
-reyna
-trig
-nine-member
-crier
-anaemia
-131st
-vincentian
-genki
-oleander
-mcleish
-accreditations
-rivoli
-agitations
-early-mid
-artiodactyla
-splashdown
-giraud
-mondrian
-alloying
-maliki
-telkom
-sinned
-idukki
-edah
-hypothetically
-intros
-cuxhaven
-amaterasu
-elicits
-start-ups
-novoselic
-lade
-hovey
-atsushi
-afrc
-41,875
-wynter
-sprockets
-take-out
-i-55
-guarantor
-earth-based
-humperdinck
-2040
-leonie
-epicurean
-halpin
-e.s.
-ullmann
-ryoma
-lexicography
-adenocarcinoma
-gajah
-moesha
-sulpicius
-shuffles
-sandpaper
-n.s.
-fetishes
-thumbelina
-taylorsville
-resistances
-schoolgirls
-chur
-baluch
-formulates
-joyfully
-seashells
-klagenfurt
-venturers
-embalmed
-sit-com
-bourdieu
-mirai
-bannu
-ndr
-akash
-multi-day
-drunks
-equalized
-gera
-ngati
-dasgupta
-subarachnoid
-actuation
-refactoring
-nts
-unmistakably
-niv
-anglo-burmese
-thao
-inter-state
-ignazio
-ctr
-harts
-85.4
-incapacitation
-maan
-batumi
-kabaddi
-occupier
-buber
-arak
-rockfish
-luscious
-bloating
-hynek
-beekeeper
-minis
-unifil
-fancher
-deena
-novikov
-kozani
-ramgarh
-scifi
-dharam
-olmos
-ern
-heave
-fabolous
-calcification
-smithtown
-automating
-indentations
-archmage
-culverts
-ilkhanate
-anatoli
-dan-air
-wieland
-feedwater
-bizet
-kalamandalam
-scottsboro
-shriner
-camerons
-siro
-soas
-equips
-great-great
-remaking
-sportif
-g.b.
-albanese
-non-food
-maithili
-villainy
-merlo
-brasiliensis
-disenchantment
-toews
-moffitt
-dystopia
-chakri
-hessen
-glancing
-kurri
-takayama
-extolled
-akola
-duplicity
-anji
-1965-1966
-quadraphonic
-parama
-diastolic
-swashbuckling
-peretti
-extensibility
-shandling
-cadena
-hiromi
-desensitization
-0.16
-brand-name
-srm
-unbecoming
-sarno
-balthazar
-kif
-altenburg
-sodas
-devereaux
-shor
-paupers
-inanna
-macneill
-dorfman
-mirth
-2006-7
-vanderjagt
-aruna
-garbled
-michaelis
-blackberries
-dla
-anh
-radio-frequency
-chillies
-barroso
-salta
-libera
-clumsiness
-dorians
-lousy
-sharona
-misato
-conolly
-buyeo
-sift
-1193
-sylva
-scavenged
-safavids
-chapbook
-abattoir
-janey
-magaddino
-circumferential
-hadden
-avc
-chickadees
-raynham
-shachtman
-organics
-styrofoam
-watermelons
-adorable
-enharmonic
-thalidomide
-vinick
-bourget
-city-county
-cranleigh
-rooming
-hilal
-smears
-maharshi
-tie-breaking
-clacton
-dockside
-waldheim
-42-yard
-maharastra
-transplanting
-mz
-transkei
-valen
-weyerhaeuser
-perrine
-four-barrel
-1986-1988
-crutch
-pares
-nacho
-spreckels
-d'adda
-1937-38
-trawl
-arellano
-matchless
-sapienza
-metformin
-wasc
-carranza
-glasshouse
-clogs
-artichoke
-payoh
-dorney
-goalies
-bracewell
-harpsichords
-bijection
-viktoria
-buckaroo
-miriya
-harmonically
-alouette
-dissecting
-indisputable
-lemond
-54.6
-nse
-o'er
-westeros
-vegans
-vicissitudes
-allis
-petered
-uncorrelated
-melds
-poulsen
-unfilled
-truffles
-tufa
-danaya
-mtl
-bluebeard
-hijri
-ega
-bookman
-casady
-minister-president
-morag
-oarsmen
-first-half
-cluj-napoca
-irena
-buffon
-charlemont
-mediaworks
-ridiculously
-14:00
-replanted
-anti-clerical
-plantarum
-traviata
-perdiccas
-jackpots
-catch-22
-blackthorn
-millett
-macula
-hanes
-ebm
-mailings
-crowne
-greco-turkish
-salado
-elint
-tracers
-overstreet
-orcus
-prydain
-bikram
-36-yard
-ludgate
-prieto
-razing
-asst
-sandefjord
-barcodes
-crumbs
-millfield
-obscenities
-nejd
-masterminds
-delillo
-giggs
-kho
-re-enactments
-misraq
-alopecia
-cheboygan
-malignancies
-post-match
-leapfrog
-i.r.s.
-ruy
-fella
-greased
-mcgillivray
-57.7
-kircher
-anti-mutant
-aff
-esu
-re-started
-free-spirited
-daoist
-eschews
-unleaded
-non-polar
-outerbridge
-rohr
-gastroenteritis
-sakata
-self-appointed
-gneisenau
-subsea
-kalisz
-one-liners
-sixers
-slovakian
-intramuros
-vlach
-ueki
-3.39
-barracudas
-manzanita
-oswalt
-vld
-2005-present
-isha
-1359
-pampered
-beek
-trinh
-ieds
-rationalize
-knell
-offloaded
-fasti
-buss
-latte
-gauchos
-non-combatant
-bushwhackers
-jemaah
-whither
-stylists
-erdman
-bolsover
-equivalency
-cortese
-warmblood
-amaro
-chennault
-backplane
-mahidol
-rocket-propelled
-televoting
-murray-darling
-monteith
-league-leading
-1933-34
-benguet
-glutamine
-ergo
-nargis
-disfavor
-recidivism
-101.4
-concretions
-grandes
-microeconomic
-spratt
-worsted
-delian
-cleverness
-frits
-mcglynn
-glasser
-quli
-tzadik
-homa
-home-based
-xxxiv
-sequent
-funders
-vibrator
-unearned
-lineker
-currituck
-siloam
-beautify
-colloquialism
-racemic
-attwood
-25m
-badd
-hard-edged
-happy-go-lucky
-balham
-chor
-adores
-gonsalves
-ggg
-stirs
-steuart
-waltman
-arraignment
-bahmani
-heriot
-pahari
-tatler
-supermen
-endocytosis
-r.w.
-1412
-panavision
-technics
-formula_72
-corpsman
-s-adenosyl-l-methionine
-scalped
-gruenwald
-hanwell
-motivator
-r.n.
-jumeirah
-kandor
-helvetica
-trainspotting
-relaxant
-greeneville
-saget
-elora
-gossage
-stamper
-urich
-gohan
-goodenough
-conned
-dugald
-10-13
-twista
-4,400
-nii
-cct
-mckernan
-swash
-merrow
-o'hagan
-pettibone
-anthemic
-shipowner
-glenmore
-salutation
-1.01
-bib
-rattray
-hyndman
-chettiar
-walmsley
-koalas
-quaestor
-bodom
-wagtail
-exempting
-triceps
-cepeda
-hamadan
-sila
-skilling
-103.0
-fishbone
-lameness
-devoutly
-cosgrave
-casbah
-bulgar
-puk
-non-metallic
-procellariids
-theyyam
-pharmacopoeia
-marthanda
-shechem
-multi-state
-hedgerows
-obstetrician
-busiek
-fag
-18,500
-omon
-murphey
-jayanti
-bickley
-119th
-wheeljack
-8800
-nourish
-quackenbush
-edrich
-snappy
-kuti
-yamasee
-aea
-pongal
-labrum
-greenblatt
-handhelds
-kapu
-stowaway
-ellipsis
-life-like
-afn
-scrubber
-annecy
-ann-margret
-1273
-snowshoeing
-sumac
-asshole
-arcadian
-liebig
-roussillon
-tanar
-udal
-hoplite
-mishna
-'98
-capitols
-frame/s
-eight-man
-mendelian
-ciao
-mcmath
-infantino
-0.17
-shepherding
-2.05
-chore
-protozoan
-1144
-porterfield
-kristallnacht
-taoyuan
-kiro
-meaningfully
-fifth-largest
-excitedly
-quash
-namm
-ferrol
-lipinski
-lucania
-defrauding
-rheingold
-disproven
-connectedness
-47,500
-reasoner
-peroxidase
-buch
-infiltrators
-four-team
-misc
-52.2
-shotton
-entrails
-generalizing
-tuah
-involvements
-lene
-iligan
-15.00
-temur
-tib
-anadromous
-lalita
-mayorga
-nukes
-gottlob
-semi-trailer
-skaven
-ternate
-osp
-osb
-27,083
-erewash
-bator
-courtesans
-mended
-balaban
-ostrobothnia
-radnorshire
-krona
-feely
-destructor
-allium
-hamann
-hitz
-xxxix
-dimbleby
-windlass
-anglais
-gunston
-bargained
-houseguests
-u-235
-tickling
-deion
-glazunov
-fortier
-ramprakash
-whitburn
-laika
-grosser
-hcg
-seventy-third
-regretting
-lireo
-hypocrite
-shizuka
-woy
-clenched
-frente
-nek
-outweighs
-rupa
-number-two
-seconda
-roams
-acevedo
-aspirant
-commonest
-malachite
-early-to-mid
-amoy
-serendipity
-pry
-metathesis
-schenk
-tailpiece
-esri
-blacklisting
-gola
-vl
-sogn
-alcohol-related
-canteens
-coakley
-as-yet
-wnew
-canoeists
-droylsden
-rylands
-45-yard
-oshima
-polycyclic
-abutment
-anglo-afghan
-balarama
-antifreeze
-abyssal
-deepa
-spinor
-salonga
-sc.
-siliguri
-zhivago
-a.d
-2007-8
-gargantuan
-mdi
-five-month
-vim
-diffeomorphism
-khoo
-85.1
-transperth
-carted
-addressee
-tipo
-kadesh
-biomolecules
-bahar
-subfields
-pinching
-law-abiding
-nibley
-swainson
-gediminas
-apophis
-fiz
-nrp
-katia
-eratosthenes
-84.7
-preterite
-lyte
-dita
-non-arab
-brussels-capital
-quotients
-o-methylation
-saale
-one-shots
-school-based
-pro-nazi
-hellenized
-munchkin
-ernle
-pizzarelli
-4-wheel
-fairy-tale
-baroni
-plasterwork
-gripen
-revelstoke
-2wd
-977
-romy
-daydreams
-lewy
-flawlessly
-springwood
-mcardle
-oscillatory
-nathanson
-entryway
-tareen
-jaroslav
-ethnologist
-kou
-fujisawa
-anthropic
-third-class
-ranald
-exhortation
-punters
-konstanz
-perfumery
-indole
-drinkwater
-urania
-pergamum
-moshi
-hockney
-blue-white
-notepad
-jatin
-knorr
-adige
-shortens
-upwind
-vasyl
-ayp
-gold-plated
-sridevi
-dont
-tcas
-labouring
-sheathing
-balan
-podiatric
-nirmala
-inmarsat
-rezoning
-trilogies
-magadan
-nemechek
-55.2
-dumfriesshire
-lance-corporal
-86.6
-gadfly
-melilla
-tv-movie
-rebar
-thermodynamically
-solanaceae
-alomar
-perdy
-pachuca
-scotch-irish
-beary
-hinterlands
-andra
-raeburn
-post-intelligencer
-impersonates
-figment
-1936-37
-inadequacies
-cottbus
-mascara
-profiteering
-bertolucci
-mrnas
-massena
-11.50
-imt
-inhibitions
-1,150
-firecrackers
-uncoordinated
-tams
-ryton
-bauhinia
-miming
-jost
-six-story
-typological
-ampleforth
-sashes
-goldfarb
-lsp
-beka
-billiton
-mikko
-947
-espinoza
-teito
-belltower
-non-steroidal
-morgans
-sabinus
-kongsberg
-10-9
-jari
-outlay
-sloboda
-kostroma
-skaife
-near-by
-1995-1999
-zito
-bz
-milkweed
-beng
-laypeople
-yarbrough
-ripa
-repopulated
-fitzsimons
-sportiva
-bravado
-washu
-rostropovich
-molesting
-capitulate
-vagabonds
-spacesuits
-workaround
-liman
-pantheons
-duce
-supercouple
-hawk-like
-tramcars
-lechuck
-o'odham
-us-69
-shipp
-friesian
-passeriformes
-actium
-sandgate
-necromancy
-darn
-strindberg
-capaldi
-canongate
-j.e.b.
-anandpur
-hort
-salting
-gatlin
-jacobin
-rulemaking
-khatris
-bisset
-policewoman
-rothrock
-proto-germanic
-gir
-nunez
-henty
-procellariiformes
-mcadam
-rial
-opulence
-sdap
-strictness
-uesugi
-laut
-three-issue
-wallachian
-navajos
-t-72
-liftoff
-dapper
-pittsford
-arshad
-85.0
-theophanes
-cursory
-piranhas
-retinopathy
-nyx
-sisyphus
-reignited
-examinees
-24.50
-puig
-sancta
-rajaram
-cardinal-priest
-puducherry
-supa
-dervishes
-102.0
-eye-witness
-syrinx
-precognition
-remotes
-a-type
-nabc
-livings
-bibliographies
-disassociated
-harpenden
-madisonville
-smartcard
-22,083
-6600
-neruda
-maoists
-melded
-colonie
-uscf
-dilworth
-emotionless
-texcoco
-inimitable
-0800
-8-5
-3400
-asphyxia
-maliciously
-madhav
-fresh-water
-felis
-herbalist
-inspects
-nieto
-rudin
-lefroy
-p&g
-e7
-superfluid
-1932-33
-tanna
-fujii
-glomerular
-hilde
-fatu
-keach
-depositional
-57.9
-devours
-hormuz
-mirab
-1081
-otitis
-lage
-lapels
-yuvan
-maddison
-skywest
-pussycats
-a-class
-mineo
-underdark
-mid-continent
-zubin
-karamazov
-1,750
-kuopio
-mongolians
-refinancing
-backlight
-austro-prussian
-catabolism
-subaltern
-spiritism
-chickahominy
-foulis
-keewatin
-shackled
-1970-1971
-peddling
-bolles
-aunty
-bardon
-2008-9
-clackmannanshire
-sweatshops
-fivefold
-eugenic
-hunk
-sieur
-temagami
-iter
-khufu
-centre-half
-rafsanjani
-55.3
-blouses
-murata
-khong
-bogarde
-philately
-paddled
-:2
-dual-purpose
-rogen
-al-mahdi
-eigenvector
-charleville
-jamelia
-nigerien
-jaclyn
-stencils
-preprocessor
-tuberous
-gertie
-chambre
-yoh
-28-year-old
-self-realization
-dibble
-futuna
-shorting
-timms
-arvid
-denikin
-rosenzweig
-gundy
-rootkit
-niu
-alleghany
-national-level
-mycelium
-bosstones
-grasso
-montville
-bootable
-babyshambles
-siachen
-woonsocket
-extorted
-jerkins
-mukerji
-holdover
-saqqara
-communique
-astbury
-top-20
-temne
-goma
-bethe
-gangrel
-crossbreeding
-tv-am
-kinkaid
-reggina
-riboflavin
-cursus
-equestrians
-zita
-48,750
-anga
-deflector
-hindko
-toastmasters
-1999-2004
-joya
-1162
-dvd-audio
-non-venomous
-invariable
-staind
-lindell
-afterthought
-gujjar
-svend
-woodhaven
-1960-1961
-chiyoda
-pilani
-poirier
-naoto
-resp
-sherbet
-grimlord
-jacor
-baur
-expounding
-mirabeau
-halacha
-untiring
-stuffy
-gamestop
-titleholders
-unapproved
-lda
-30m
-vlaanderen
-ogham
-bsl
-voisin
-anker
-hymnals
-893
-transnistrian
-t-38
-family-run
-noosa
-shipowners
-double-ended
-walkinshaw
-renounces
-extra-dimensional
-deaneries
-beckinsale
-oppress
-weisz
-patentability
-85.8
-octavio
-defalco
-aventura
-ruts
-hatchbacks
-headwater
-serre
-well-attended
-i-96
-donat
-serous
-transgressive
-abou
-prow
-briers
-dogtown
-51.0
-resonates
-stillness
-toolkits
-officinalis
-watchtowers
-acidification
-gandhinagar
-caliper
-ahriman
-ledoux
-889
-pittwater
-kes
-kulwicki
-consanguinity
-scalps
-re-releasing
-brideshead
-phosphors
-xpw
-piven
-barassi
-semi-aquatic
-urogenital
-lalo
-adjacency
-tamla
-abadan
-4-10
-blevins
-cfcs
-dpm
-loong
-anansi
-cockrell
-investigational
-f-117
-walkabout
-bhandari
-kenyans
-vh-1
-braveheart
-granule
-volterra
-well-built
-demobilisation
-saturate
-passeridae
-'97
-calum
-enola
-fourth-round
-capriccio
-utena
-paice
-quadrupedal
-neurosciences
-straitjacket
-cawnpore
-backlit
-soter
-130th
-massino
-mccray
-indochinese
-russian-born
-journeymen
-sharpsburg
-usurpers
-disfigurement
-collectables
-hadrons
-dinajpur
-3.41
-bayi
-murali
-borno
-paulin
-1.000
-initiations
-arum
-bernardi
-brits
-shimonoseki
-treacy
-friendlier
-hazlewood
-aphid
-omc
-maturin
-quetzal
-behari
-calving
-ocarina
-iseult
-goldmine
-amateurish
-selectman
-dominantly
-aquarian
-mugged
-lithograph
-mendelson
-ajs
-ral
-ocu
-navarrese
-curitiba
-takarazuka
-krush
-surinamese
-lexicographer
-maesteg
-laodicea
-47.9
-pc-based
-jago
-matsuzaka
-sika
-lightspeed
-dieting
-chouteau
-regressed
-saint-jean
-antes
-entablature
-caniff
-comunale
-1989-1991
-half-hearted
-snowboards
-58.1
-68k
-lermontov
-tagus
-proprio
-harburg
-adoring
-seletar
-aber
-bolstering
-grazer
-whistleblowers
-1068
-yakshagana
-sejanus
-wahl
-antofagasta
-split-screen
-vivisection
-sakha
-geordi
-non-fatal
-episcopalians
-henny
-titi
-reivers
-1/6
-turtledove
-a-6
-interosseous
-diablos
-24.10
-gumbo
-gano
-postumus
-flat-topped
-wild-type
-coir
-lavoisier
-criollo
-monotonic
-pectoralis
-n7
-smes
-phospholipid
-sixty-one
-jurai
-jordi
-b.m.
-shelford
-satyrs
-clearview
-ediacaran
-arliss
-gbl
-jvp
-bluffing
-reasonableness
-stauffer
-shimada
-quatermain
-rakoff
-portsea
-trainor
-radiata
-guillen
-subdeacon
-ivorian
-polyphemus
-outbid
-38-yard
-twilley
-egmond
-falkner
-edel
-cabarrus
-tefillin
-freeboard
-battambang
-atb
-preeminence
-erythematosus
-graduate-level
-merc
-baffles
-publicly-funded
-vindhya
-aschaffenburg
-blackheart
-grado
-jeweled
-1128
-olajuwon
-friedrichshafen
-mikel
-aarp
-molitor
-cambuslang
-nagercoil
-sundaram
-cfp
-flotillas
-holler
-sowell
-edom
-polycarp
-a46
-eero
-novena
-no-frills
-impeller
-super-strength
-haram
-wmr
-schulte
-tamsin
-cross-channel
-comerica
-toul
-tooele
-synthetically
-all-filipino
-prefontaine
-melaleuca
-pylos
-xanatos
-penge
-hankyu
-gillard
-combiner
-brod
-non-british
-wove
-hamtramck
-deluca
-trias
-retiree
-nock
-verily
-krispy
-turley
-prolapse
-floundered
-tennent
-hydrant
-pollute
-rado
-ginga
-chlorate
-yoma
-25.70
-bosonic
-logicians
-avarice
-sattler
-schwegmann
-lissa
-setlists
-872
-pasteurization
-yoshiaki
-sociopath
-mulla
-brunell
-mochi
-girlie
-tali
-meliphagidae
-tasking
-admiralties
-masayuki
-noncommutative
-bothell
-beardmore
-tepid
-unbelievers
-unseating
-mizuno
-kozak
-waterson
-lahn
-endometriosis
-thieving
-hostetler
-ikari
-nae
-averting
-quod
-gchq
-unafraid
-bigsby
-three-run
-mcphail
-huyton
-thampi
-centavos
-expectant
-cbs-tv
-kaddish
-47.7
-jun.
-zao
-pyrmont
-gamsakhurdia
-technion
-squeezes
-kortrijk
-jeane
-psn
-ramshackle
-satirize
-kagura
-cameramen
-bentsen
-ryedale
-mopeds
-porcaro
-dominatrix
-funfair
-dct
-maelgwn
-160th
-parakeets
-mapleton
-holodeck
-qazvin
-marquise
-livers
-baryshnikov
-modo
-longhouse
-dedicatory
-tans
-sinusitis
-polarisation
-agoura
-draconians
-mezzo
-fringing
-current-day
-saguenay
-saluzzo
-wavered
-sarma
-hednesford
-scab
-24.60
-kindersley
-63.4
-gotcha
-boyden
-wide-body
-venta
-valance
-brothers-in-law
-fidonet
-botcon
-psionics
-z1
-jenn
-1x
-heavies
-iteratively
-chatterley
-bink
-carvey
-bonnier
-quadrennial
-chikara
-woodroffe
-nonconformists
-lorentzian
-v-12
-divan
-olt
-cynan
-setters
-runner-ups
-pipistrelle
-microsoft.net
-aoa
-munsee
-tdc
-gela
-engelbart
-lasik
-olap
-january/february
-moraes
-magnificently
-e-40
-21-year
-vickerman
-14,500
-chipman
-codice_14
-e.d.
-seatbelts
-shekhawati
-asis
-beachwood
-suncorp
-pagnell
-high-security
-safford
-greenup
-nikah
-o'mahony
-landesliga
-neuropsychological
-patco
-tesh
-inexpensively
-cultists
-nedney
-strabismus
-penetrator
-mostyn
-vaucluse
-aspirants
-watership
-albertina
-earshot
-pre-med
-unter
-non-playing
-khalistan
-limpets
-capet
-kroc
-adjutant-general
-unacceptably
-shahu
-coppice
-gandharvas
-powerpuff
-firma
-tharp
-potrero
-rurik
-foment
-fossett
-kudu
-scrivener
-freelancers
-defray
-cricklewood
-tseung
-actuary
-titmice
-michi
-curtiss-wright
-sesto
-mosca
-cagalli
-reconstitute
-varus
-timescales
-pronominal
-tetragrammaton
-regionalist
-arianna
-senecio
-codeword
-23.60
-hassle
-neasden
-homocysteine
-detaching
-lakoff
-statuses
-cimino
-pambansa
-royer
-banjos
-26,667
-zakk
-tweeter
-azerbaijanis
-kirkcudbright
-honeyeaters
-myristica
-seibert
-galactose
-submariners
-padmini
-appledore
-unhinged
-kidding
-telok
-tinley
-riegel
-jabir
-malheur
-odb
-60.2
-nema
-127th
-oscilloscope
-etr
-vice-mayor
-eppes
-tandon
-shaan
-greiner
-27-year-old
-alte
-in-fighting
-lector
-eberron
-kulkarni
-prune
-cannell
-candidacies
-centimetre
-fabrice
-s.o.s.
-kishimoto
-sonu
-soulless
-sarmatian
-tigress
-non-physical
-leal
-s.v.
-leelanau
-spyglass
-acuff
-jacanas
-weedy
-electromagnets
-preselected
-anchorages
-assented
-maharajas
-non-eu
-scantily
-noblest
-musicale
-thicken
-pitfall
-gmg
-russian-american
-puritanical
-endear
-stallworth
-siddique
-yup
-oxegen
-intermezzo
-d-pad
-eckstein
-matos
-jamz
-halla
-astrocytes
-on-the-fly
-grrrl
-modifiying
-rooke
-guildenstern
-lunisolar
-korman
-stavka
-swadeshi
-pail
-aerodynamically
-studley
-ockham
-fdny
-beater
-asansol
-n4
-n.w.a.
-murine
-uppercut
-pendulous
-grasmere
-liaodong
-ingleside
-a300
-eufaula
-vallarta
-umbilicus
-ozu
-m/v
-newbold
-telomerase
-military-industrial
-low-scoring
-10:1
-bss
-sandbanks
-pre-draft
-jsr
-jubilation
-pompadour
-luthiers
-flavonoids
-26.00
-abhisit
-mi-8
-mogollon
-seawolf
-ntc
-unrivaled
-chthon
-majoris
-musgrove
-hawarden
-unopened
-appeased
-bung
-machinists
-steep-sided
-coolies
-tpm
-province-wide
-cormier
-arado
-80386
-moldovans
-hdnet
-aashto
-foreboding
-karajan
-elizabethton
-ottavio
-khet
-cavaliere
-causeways
-nielson
-63,000
-hijaz
-nando
-iuz
-integrators
-gaveston
-calvo
-xzibit
-septs
-dadu
-shandy
-pimentel
-recurved
-retief
-rafiq
-dalbergia
-outstripped
-nessus
-janusz
-alloted
-foulkes
-pachypodium
-blackarachnia
-vosper
-linnet
-binney
-double-disc
-tuileries
-richmondshire
-heyerdahl
-headbutt
-lebel
-tomita
-quarter-century
-simcha
-nahyan
-pitti
-directorships
-shalimar
-superscript
-redesigns
-coleco
-nass
-fairclough
-tiwari
-wirelessly
-on-the-job
-mammography
-ireton
-ramillies
-cajuns
-epson
-transamerica
-malformed
-sewed
-tagbilaran
-drapes
-4m
-frankenheimer
-sociobiology
-interpolations
-pangolins
-24.80
-bottlers
-alyn
-flashdance
-12.50
-mics
-february-march
-expels
-malekith
-2001-2006
-late-season
-wct
-tullahoma
-kajol
-novae
-sanam
-okehampton
-lethargic
-riccarton
-normie
-1105
-354th
-panzergrenadier
-hoyts
-biggin
-skiddaw
-multithreading
-master-planned
-furrows
-tock
-op-amp
-all-city
-woodfull
-taproot
-giger
-sontaran
-coubertin
-palenque
-bhawan
-buzzsaw
-rivero
-horseradish
-bacup
-43-yard
-feuerbach
-pre-made
-killebrew
-ready-to-wear
-veneers
-bluestar
-makino
-pyaar
-driveways
-manorama
-hurtful
-chessboard
-gotch
-baily
-glaswegian
-chelicerae
-lumina
-bellas
-tubb
-elevates
-labeouf
-brynner
-hellmouth
-20.00
-bookmark
-logansport
-smitty
-arminian
-attar
-pikemen
-aldebaran
-cloture
-software-based
-junie
-cnmi
-stuka
-amyotrophic
-iroquoian
-paternalistic
-48.0
-debarked
-subedar
-cleats
-meles
-wadebridge
-chitose
-stigwood
-banstead
-eleni
-bellerive
-rarotonga
-fasciculus
-interreligious
-zarek
-bova
-nonhuman
-uncaring
-asghar
-funneled
-bonnets
-5,800
-kahan
-elp
-iap
-santamaria
-unripe
-muddled
-seanchan
-attractor
-game-tying
-2004-2008
-gyroscopic
-i-270
-instated
-sammo
-zagato
-pre-programmed
-jcb
-ravalomanana
-prostaglandins
-gratz
-ranjan
-stockpiling
-adhesions
-cluniac
-hanja
-dtp
-85.6
-0000
-ecclesial
-padmasambhava
-1124
-greenleft
-koninklijke
-i-93
-sleet
-bouchet
-calderdale
-meara
-c.r.
-kander
-suspenders
-spe
-caddick
-kido
-endemism
-iver
-zaibatsu
-igf-1
-rukmini
-gua
-servicemembers
-harajuku
-lucilla
-crts
-decriminalization
-acth
-haydock
-panegyric
-in-place
-upolu
-berke
-cobbles
-grasps
-digesting
-geoghegan
-joc
-karta
-ld50
-a20
-bathory
-sandeep
-mpd
-canonised
-mig-21s
-scatters
-sande
-esdras
-joris
-tungabhadra
-arica
-photocopied
-eventuality
-beane
-knave
-gaafu
-thin-film
-efate
-sintering
-symmes
-vasectomy
-bhabha
-cimmerian
-plessis
-demobilised
-partlow
-12pm
-mattoon
-atul
-craigavon
-tubas
-bromberg
-parsifal
-anti-competitive
-plateia
-egham
-malaspina
-bandini
-madd
-pondered
-orde
-'09
-onitsha
-yokoyama
-gimbal
-hurston
-raspy
-legendarium
-cmyk
-safra
-vinland
-stubbornness
-imad
-pangolin
-dovid
-frizzell
-overhear
-brainwash
-rathmines
-jorma
-ibarra
-jiri
-showpiece
-bloodlust
-quadratus
-burberry
-cracknell
-iiia
-feigns
-batts
-ssx
-aleksandra
-dhol
-mccafferty
-boondocks
-facultative
-whelen
-chelating
-koreatown
-snowfalls
-non-party
-amoco
-hatzair
-federative
-catoctin
-antipas
-summarising
-anticancer
-horten
-guelphs
-bottomed
-clowning
-september/october
-ppd
-outgunned
-kanishka
-sor
-lasallian
-25a
-dealey
-disorganization
-68,000
-gianluca
-sp2
-venstre
-crosland
-specular
-appetites
-spymaster
-baserunner
-drammen
-xianbei
-vira
-matsuura
-berton
-midleton
-cbl
-plumas
-mazzini
-e-class
-masquerades
-rear-engined
-deactivating
-underperforming
-ambassadorial
-schlitz
-suguru
-capensis
-abaddon
-gaucho
-rsv
-stilicho
-unm
-birgit
-defibrillator
-nsx
-7.40
-leached
-impulsively
-xeon
-monotony
-photovoltaics
-catharina
-bellinzona
-amphipods
-crutchfield
-trimmings
-grata
-khor
-taher
-dundrum
-dace
-ashrams
-mannar
-parfitt
-barrick
-stunner
-abdullahi
-norbury
-princesa
-tancredi
-destefano
-saunas
-disbarred
-bera
-junior-senior
-ze'ev
-record-keeping
-1996-2001
-uncreated
-combaticons
-hydrants
-seventy-fifth
-nijinsky
-panionios
-botulism
-puddings
-one-yard
-nuit
-makran
-engg
-schlessinger
-haa
-akrotiri
-strato
-franny
-idleness
-coiling
-non-member
-modelers
-pseudoephedrine
-bandler
-jis
-itil
-abri
-dividers
-cross-sections
-cienfuegos
-keyless
-yorba
-first-grade
-agostini
-riband
-spermatozoa
-groff
-kul
-farmville
-macchi
-plectrum
-unyielding
-23.50
-faenza
-palpitations
-karimnagar
-trelawny
-savoie
-torfaen
-secularist
-greymouth
-joinery
-repayments
-pfi
-disappointingly
-classless
-cheater
-traumas
-pinyon
-13-yard
-67,000
-drumheller
-schwa
-ww1
-rampal
-mid-90
-iden
-ronne
-medellin
-pitta
-circassians
-fugees
-spira
-vettel
-rimmed
-wagstaff
-reckon
-grama
-eyewear
-chirp
-gluconeogenesis
-essenes
-barnhart
-zosimus
-lunda
-triremes
-co-emperor
-pso
-snaffle
-overproduction
-junker
-kalb
-dhimmis
-ameche
-sanada
-christen
-amram
-twitching
-pallidus
-cersei
-jarmusch
-stilted
-bradycardia
-marinos
-easy-to-use
-buy-in
-mati
-geico
-homebuilt
-prions
-laibach
-779
-unmoved
-liotta
-sargeant
-merriam-webster
-yds
-wooley
-tarr
-beja
-softwood
-folksy
-lucida
-negotiable
-lovech
-dogra
-textural
-miso
-bureaux
-nah
-magmas
-kilos
-spasticity
-avedon
-reabsorbed
-46-yard
-0.00
-joyride
-klaas
-conroe
-sluices
-mind-body
-gladwin
-supermassive
-chthonic
-shenanigans
-trellis
-spad
-liaquat
-chief-of-staff
-gymnosperms
-rdbms
-satyam
-instrumented
-kilrathi
-tinkering
-casuarina
-wirksworth
-foerster
-gnrh
-wiiware
-nubians
-mertens
-coshocton
-santino
-reinvigorate
-lett
-under-age
-whitson
-roja
-ephrata
-eubank
-arch-nemesis
-storehouses
-pagers
-perris
-bayly
-hazell
-carrack
-aviculture
-1/9
-necker
-13.00
-uneaten
-maniots
-yancy
-anti-matter
-ika
-borer
-balmer
-59.8
-55.8
-25,417
-burress
-makhno
-groupie
-f4u
-spats
-remitted
-crusts
-sudeten
-tamblyn
-rfk
-disparagingly
-croghan
-courtland
-merrimac
-moghul
-honeycutt
-carpeted
-kah
-roehampton
-coriolanus
-warenne
-sockers
-gangadhar
-fajardo
-fsk
-unobserved
-tallman
-newsworld
-nsr
-155th
-0.22
-alliteration
-formica
-trainset
-grottoes
-soundsystem
-105,000
-iac
-rutger
-cupro-nickel
-vilma
-mccloy
-17-0
-fmln
-kinescope
-six-member
-serengeti
-tatooine
-fichte
-jaynes
-best-preserved
-pohang
-tazz
-1967-1968
-warfighting
-ccl
-kinnaird
-disruptor
-concretely
-capell
-bacardi
-tauri
-seagrass
-hyoid
-zhili
-mcduff
-i-91
-annihilus
-moolah
-participles
-gabonese
-country-rock
-grandi
-e.h.
-leveson-gower
-nassar
-hypoxic
-perfumed
-mahabharat
-over-sized
-matsunaga
-kirkbride
-unimaginable
-84.2
-cml
-cmd
-delta-v
-scholastics
-cacique
-quantrill
-phospholipase
-revisits
-ichabod
-contralateral
-wholeness
-mooresville
-dioramas
-974
-coprocessor
-guarini
-muri
-viborg
-pentangle
-d.s.
-unaccustomed
-sloop-of-war
-warszawa
-artisanal
-winterton
-ene
-littering
-arda
-vukovar
-precipitously
-inducement
-bling
-moodie
-jurgen
-err
-birdcage
-ivins
-fleischmann
-khin
-rel
-kneels
-0.18
-outworld
-high-strength
-sueur
-ethnographer
-availed
-lacerations
-suffused
-near-future
-marquand
-bechuanaland
-scolding
-goings-on
-kotzebue
-huna
-elroy
-popstar
-michels
-desegregated
-tga
-knvb
-welbeck
-telephoto
-zonder
-hopkin
-diatribe
-inaudible
-blohm
-1097
-catacomb
-oms
-maximillian
-1990/91
-gromov
-dethklok
-incisor
-cattlemen
-darkstar
-grosmont
-breech-loading
-sub-editor
-nocona
-orbach
-psychotria
-janette
-ambrogio
-lamiaceae
-qawwali
-swastikas
-ashurst
-relent
-disenfranchisement
-ksu
-53.7
-outmatched
-lindh
-bahu
-loder
-1929-30
-1073
-lalla
-venable
-mita
-nemours
-chalets
-bensonhurst
-paridae
-tryin
-liebman
-superstores
-bastia
-petoskey
-1038
-nederlands
-ced
-mcadoo
-sievers
-potlatch
-bedell
-regnum
-estoril
-groundnut
-clausewitz
-25.50
-kastner
-toma
-endgames
-animalistic
-dhu
-deadshot
-whiskies
-evola
-canvass
-adjuster
-dz
-brentano
-engender
-wakulla
-dyslexic
-macalister
-unicycle
-ejects
-sheepskin
-k.s.
-mplayer
-antiochian
-silverbacks
-6,800
-wheelchair-accessible
-tux
-nutshell
-antiphons
-longboat
-neuroendocrine
-florets
-salamandastron
-spectacled
-classifieds
-mele
-x-mansion
-brabazon
-implantable
-tesseract
-aam
-jada
-anthers
-palas
-homesickness
-melli
-haemoglobin
-984
-katich
-cagle
-karimov
-polotsk
-velu
-jonesy
-augusti
-lichtenberg
-snob
-devendra
-risk-free
-felsic
-rustling
-spiegelman
-utm
-biogeographic
-crowsnest
-manet
-i-44
-tarver
-80.0
-kubica
-yavin
-troublemakers
-comrie
-perceptor
-caved
-athenaeus
-sartorius
-foremast
-nonspecific
-takayuki
-maf
-berners-lee
-earthsea
-protea
-a15
-vv
-ladbroke
-802.3
-lloydminster
-weatherwax
-ethology
-colobus
-kandel
-musick
-bowdon
-thammasat
-brundage
-neoconservatives
-trocadero
-fib
-nicest
-baggins
-middle-income
-belz
-stralsund
-iori
-106.6
-u.k
-cometh
-synchrony
-school-wide
-underweight
-chemotherapeutic
-jeeva
-willington
-rollie
-meghna
-damasus
-flaunt
-alcalde
-loggerheads
-dyadic
-oakhurst
-bakri
-vectra
-pfg
-kermanshah
-kantor
-pinsent
-cutouts
-discrediting
-12.90
-hiroko
-ranson
-pinecrest
-horoscopes
-darlings
-icknield
-detentions
-39-yard
-wastelands
-panoramas
-freshers
-jebediah
-nicieza
-disinterested
-vetted
-nels
-abdus
-respirator
-joffe
-phloem
-ansell
-unitarianism
-kushans
-strangford
-ed.d.
-u3
-harvie
-1938-39
-chuckles
-redan
-kima
-lamarck
-brule
-maltreatment
-self-improvement
-usoc
-dga
-goldblum
-malfeasance
-kaurna
-myasthenia
-allay
-jw
-oriskany
-surratt
-cleaving
-mle
-empoli
-quayside
-footbag
-tetrarch
-chater
-kalidasa
-hierarchically
-gekko
-vuelta
-apollinaris
-elazar
-metroline
-arnim
-compile-time
-60.7
-gladio
-torak
-brixham
-russet
-rowboat
-lvn
-lochalsh
-hanau
-nonproliferation
-instar
-deira
-macdonough
-just-in-time
-vice-marshal
-spunky
-chaitra
-gurukul
-staid
-thessalonians
-rock-and-roll
-blissful
-one-eighth
-commercialisation
-chatto
-dorgan
-11.60
-4200
-unitrans
-tuvan
-943
-microbiologist
-17-yard
-mohicans
-then-record
-one-sixth
-solidity
-nicephorus
-mangan
-opines
-formula_76
-carbene
-gawad
-optimise
-uncircumcised
-self-identify
-crematory
-scrupulously
-chon
-targaryen
-moyo
-antiphon
-newar
-cannonballs
-linley
-walley
-homesteading
-biometrics
-booklist
-spearheads
-tadashi
-floriana
-kenzo
-quadriceps
-seremban
-cbr
-bambino
-duk
-swf
-impedes
-ruleset
-iuris
-kroeger
-uncultivated
-ephron
-tw
-gruen
-formula_74
-yaqub
-belford
-1158
-denier
-deicide
-lamellae
-enrol
-pennywise
-mxyzptlk
-semi-precious
-scanlan
-floristic
-ginkgo
-pili
-okanogan
-fils
-fusilier
-54.9
-dimers
-ardently
-errata
-noelle
-presbyteries
-oster
-amex
-baffling
-bloomsburg
-1934-35
-suspends
-shinui
-farian
-anti-shipping
-betula
-capilano
-laz
-egged
-ogasawara
-optimisation
-nerva
-yah
-caird
-hartog
-hassler
-valjean
-sain
-berk
-misidentification
-1121
-gakuin
-q.v.
-5,700
-konoe
-weyburn
-fortas
-wapentake
-nizamuddin
-shampoos
-r-type
-macha
-transoxiana
-kitsune
-matagorda
-ashington
-84.5
-gamerankings
-homerton
-puffed
-hornung
-dialectics
-appetizer
-speedster
-meccan
-pikeville
-taxidermy
-voltage-gated
-focusses
-kilobyte
-follow-ups
-bugles
-blondel
-bairnsdale
-trumped
-maligned
-montaigne
-carville
-tolerating
-democritus
-hamdi
-vermeil
-davila
-mossflower
-janae
-wmca
-kendriya
-otomo
-ojeda
-mondial
-rapeseed
-passat
-u17
-henin
-rata
-moan
-vallis
-vagrancy
-agastya
-chibi
-glossed
-al-ghazali
-bagration
-alby
-aidid
-matadors
-1.10
-feroz
-pedagogue
-rancheria
-lutes
-postmark
-scrum-half
-swedish-speaking
-kitchener-waterloo
-kellaway
-hoffer
-uman
-barberton
-bathers
-sainte-marie
-toulmin
-gulzar
-symphysis
-gerhart
-latta
-barzani
-twisters
-olusegun
-resuscitate
-borrowdale
-pacific-10
-0-60
-selman
-dissenter
-cosponsored
-wollo
-convexity
-balaam
-ennerdale
-macdiarmid
-cather
-close-range
-bago
-heidfeld
-anti-poverty
-tiresias
-wheatstone
-sebaceous
-bimini
-ncsa
-justiciary
-53.1
-izz
-provincia
-ludovic
-pathak
-mutinies
-droves
-rgs
-timbered
-bramhall
-1930-31
-ointments
-wendigo
-fuld
-ninjutsu
-cliffside
-muto
-brocken
-wey
-aspley
-profuse
-cultura
-fernanda
-crime-fighting
-premarital
-rattled
-non-euclidean
-papermaking
-cbsa
-illuminator
-patanjali
-mclendon
-margraves
-kesh
-mid-ohio
-blurr
-alief
-figgis
-nuku
-1-13
-glendalough
-mckimson
-kyon
-precentor
-flinch
-sickened
-studd
-shenouda
-ramnagar
-kyla
-fj
-rhinox
-boddy
-mcv
-dunkin'
-hunley
-6x6
-kipp
-tsutomu
-sutch
-a.i.m.
-unbelievably
-meuse-argonne
-tikka
-pika
-fe2
-threepenny
-mccarter
-loras
-palmieri
-juri
-adan
-cil
-brydges
-1-year
-coolie
-japan-only
-3-9
-penticton
-misbehavior
-sino-tibetan
-tbi
-drg
-henriques
-luas
-overwinter
-re-emerge
-circularly
-pallbearers
-denunciations
-kozlov
-brenham
-twelve-year
-carvel
-parsecs
-xorn
-corroborating
-groundlings
-one-design
-speedwagon
-hagler
-891
-thibault
-surrealists
-insurgencies
-vasu
-iida
-fama
-calcasieu
-traceability
-spithead
-formula_75
-heifetz
-ramu
-vernet
-refracted
-mid-2000
-unwary
-shawangunk
-puddles
-2km
-sanding
-preterm
-stressors
-titling
-sunita
-11.90
-enberg
-yachtsman
-japw
-scruffy
-windom
-nanotube
-salicylate
-103.6
-imro
-placard
-perissodactyls
-mineralogist
-tysons
-multiethnic
-gammon
-moree
-debrecen
-kreuger
-yugoslavs
-witi
-31,667
-farleigh
-m&t
-suh
-harmonisation
-voortrekkers
-whr
-aalto
-c&s
-cheery
-succinate
-picatinny
-escanaba
-staggers
-25.80
-magnusson
-1966-1967
-08:00
-ziaur
-numeracy
-sanjak
-houseguest
-bedminster
-jurchens
-raichur
-57.3
-timberland
-26.20
-sov
-sundara
-backyards
-pinks
-fecundity
-adachi
-masterclasses
-laurents
-capuchins
-erling
-stoics
-consulate-general
-state-supported
-1142
-relocations
-wagggs
-sema
-gascon
-barbeau
-sarbanes-oxley
-ronda
-vesa
-deportees
-1.95
-1.93
-sib
-disdained
-two-letter
-shorted
-birnbaum
-1.44
-neodymium
-sangiovese
-eulogized
-cranborne
-speared
-60.5
-wiang
-grignard
-moncrieff
-collectivism
-meritocracy
-fermionic
-seventy-second
-marcantonio
-ankh
-mcnutt
-bah
-shinra
-inhomogeneous
-burdwan
-ralphs
-eagleson
-cuban-american
-three-term
-101.2
-pesci
-zatch
-specialism
-schwyz
-ayahuasca
-openstep
-grosseto
-garish
-pontine
-murrieta
-main-line
-anole
-glucocorticoids
-sabines
-enriquez
-suburbanization
-r16
-tash
-tass
-drumcondra
-cud
-tenn.
-gluttony
-mb/s
-gumball
-barthes
-rennell
-locally-owned
-hookers
-siddhanta
-npo
-0.33
-immaculata
-w.e.
-bussed
-hattori
-haase
-pared
-bunce
-roll-out
-atherstone
-massasoit
-asner
-smyslov
-headington
-sachsenhausen
-zeller
-63.5
-sisal
-broomstick
-tanners
-meshed
-constructively
-pathum
-michiko
-francesa
-iras
-dinsmore
-hilaire
-bardeen
-well-dressed
-wissahickon
-morty
-candelaria
-munday
-middling
-manumission
-far-left
-orienting
-groza
-ashbury
-interspecific
-embry-riddle
-moura
-omnia
-abominations
-annam
-aftershocks
-machynlleth
-1149
-sadc
-pendulums
-somerfield
-templer
-reit
-greif
-fungicides
-runescape
-vice-presidency
-nicholl
-brannon
-all-ages
-sharm
-wadia
-austerities
-966
-aristarchus
-rondout
-multiplexer
-electroweak
-mid-eighties
-muffler
-kzinti
-berbice
-sedbergh
-gals
-survivin
-porco
-klara
-maclaurin
-shooto
-freebirds
-audax
-psychodynamic
-tolan
-sleeker
-carinae
-belzer
-tawau
-addiscombe
-brussel
-newly-constructed
-alt-country
-lillee
-splenic
-pinch-hitter
-melua
-ostinato
-pushkar
-anschutz
-open-pit
-tpg
-hearns
-jolene
-vetoes
-sabato
-mean-spirited
-deshpande
-jono
-ascertaining
-wistful
-c-c
-23.80
-asr
-shenhua
-meerkat
-vitalis
-pollinator
-fayed
-1.30
-belleau
-dawlish
-anchovies
-assuage
-drive-time
-iraklis
-hulking
-hussite
-compressibility
-23.30
-bentonite
-chastises
-roselli
-mightiest
-congas
-deceptions
-warps
-multiplications
-refocus
-ione
-2.09
-whitwell
-buskers
-stoppers
-interlaken
-gallura
-vide
-gurage
-retinoic
-generics
-tadcaster
-dwelled
-vous
-783
-steelmaking
-1904-1905
-functionary
-picidae
-bikinis
-parapets
-pappus
-rumpole
-pro-british
-dasara
-over-all
-hydrides
-rtf
-varangian
-26.80
-pdt
-emts
-plumages
-pretrial
-kavala
-salto
-ebbsfleet
-alcantara
-rwd
-nuptial
-x10
-entrapped
-selous
-tarbell
-siggraph
-uncontroversial
-intravascular
-datalink
-pinkett
-manji
-nourse
-dossiers
-cabildo
-miglia
-minstrelsy
-vicus
-24.20
-ferc
-icelanders
-romaine
-extendable
-schlosser
-sixty-seven
-dual-core
-rotana
-eisley
-takht
-kossol
-jordon
-glues
-rock/pop
-acyltransferases
-courageously
-unabashed
-fathering
-greenham
-waterston
-pisco
-liquid-cooled
-sukkur
-intamin
-chastain
-indigestion
-hillis
-all-women
-french-born
-kutuzov
-aarau
-landover
-ganapati
-kiratas
-popovich
-patentable
-kaji
-kadamba
-freikorps
-revelry
-free-kick
-eisen
-cuga
-bobtail
-histones
-alpina
-washout
-gebhard
-fulwood
-tanu
-t.r.
-yoshimi
-cactaceae
-roncesvalles
-enumerates
-mettle
-personhood
-unfurled
-pamlico
-o'reily
-c.k.
-nonprofits
-55.1
-pandits
-technocrats
-anp
-statesboro
-courts-martial
-sunan
-katoomba
-zorba
-raindrops
-khoja
-avram
-taif
-pistone
-kumite
-taree
-conjunctive
-ifor
-kakadu
-pasting
-x8
-hyla
-larvik
-excitations
-nizamabad
-murtaza
-matoran
-middleborough
-sub-standard
-justo
-then-owner
-chorlton
-ottokar
-higher-quality
-tootsie
-cosmetically
-sarabhai
-2-10
-2-12
-9-13
-aerts
-surfboards
-45rpm
--30
-cameroons
-bursar
-haug
-muerte
-pls
-routemaster
-wigram
-molineux
-nicaraguans
-annick
-muntjac
-eston
-nya
-26-yard
-kanyon
-koga
-tracheal
-accoutrements
-rickles
-free-for-all
-e10
-underlining
-60-70
-offends
-fbla
-associativity
-memorialize
-biotrog
-jacquet
-casso
-portcullis
-20km
-aun
-agon
-bordello
-56.1
-collinsville
-enquire
-embellish
-islamization
-secord
-cuesta
-dolomites
-34-yard
-pasty
-appleyard
-ziad
-bouldering
-paintbrush
-amazonia
-on-road
-berri
-high-stakes
-hypothesizes
-formalist
-cavell
-hereward
-encumbered
-ambushing
-schembechler
-mp4
-dme
-binyamin
-ragsdale
-impedances
-gulfton
-shogo
-boeotian
-letterpress
-tynes
-rusticated
-generale
-horncastle
-18-49
-millsaps
-corda
-do'urden
-inequities
-noyce
-mechanisation
-elva
-epiphanes
-galant
-millisecond
-almaz
-heterodyne
-four-engined
-flamborough
-strum
-lauenburg
-a34
-ecsc
-krabi
-tingley
-honiton
-snowbird
-fleisher
-joop
-bains
-masai
-1021
-spacers
-stationmaster
-tangents
-nawal
-echr
-silty
-doyen
-military-style
-ducktales
-25.40
-maulvi
-mixed-species
-datura
-26.30
-rauf
-mahajan
-aubert
-carboxyl
-godspell
-platforming
-fluminense
-zaheer
-similarly-named
-25.30
-tektronix
-5-inch
-ths
-segued
-gallico
-jamaat-e-islami
-goldenrod
-myopathy
-theon
-strath
-christabel
-1990-1993
-cacophony
-inculcate
-workloads
-kasim
-jovovich
-chalabi
-babi
-ferryman
-shophouses
-benjy
-diogo
-bg5
-trucked
-nido
-zandvoort
-isothermal
-crusty
-spank
-cavers
-bitwise
-mundus
-falsity
-phrenology
-simplifications
-deveraux
-kanzaki
-gratiot
-overprotective
-grammophon
-dham
-freestone
-23.00
-harington
-immobilization
-59.6
-allusions/references
-hallucinatory
-saps
-three-person
-64.5
-amperes
-rostov-on-don
-sassanian
-timestamp
-uhl
-reshuffled
-olomouc
-two-cylinder
-non-exclusive
-meese
-mentone
-mianwali
-14-7
-payphone
-paper-based
-shakuhachi
-manco
-aviva
-laisenia
-mottos
-esper
-ajaccio
-muttering
-requisition
-nabors
-druitt
-gwilym
-acknowledgments
-higashi
-renegotiate
-whiston
-epigraphic
-perfectionism
-eloquently
-roorkee
-credentialing
-frentzen
-cottesloe
-pedophiles
-phpbb
-jettison
-capsicum
-finnegans
-madra
-lyonnais
-i.m.
-cetacean
-???
-fairford
-human-readable
-digitizing
-quagga
-hksar
-crippen
-hoth
-stepwise
-finery
-islamophobia
-frontiersman
-four-wheeled
-upped
-aop
-11.20
-66,000
-azaleas
-making-of
-85.2
-brickman
-blodgett
-historiae
-antipolo
-kwik
-bootlegger
-bhavnagar
-zhengzhou
-enameled
-almshouse
-muscled
-entre
-cheapside
-gauging
-chetan
-coonan
-seventeen-year-old
-opole
-2e
-grinds
-jezreel
-ices
-madrasa
-lapping
-ensoniq
-jalpaiguri
-cynwyd
-10.70
-scaliger
-condyle
-bion
-sweaty
-jetsons
-scribal
-vilayet
-councilmembers
-picea
-sandow
-dqb1
-glendon
-ducking
-lavalle
-garagiola
-bolded
-coppin
-eigenstates
-canova
-lindstrom
-subsides
-aberhart
-colliculus
-naseby
-vocalion
-aizu
-summerall
-nasrullah
-seni
-iplayer
-download-only
-pentose
-fermoy
-thymine
-guitar-based
-urmston
-carbon-oxygen
-vitagraph
-kindled
-mockingbirds
-fha
-cavalera
-hunch
-burk
-jiva
-k9
-micallef
-hybridize
-somatosensory
-adora
-disfranchisement
-carrow
-snuck
-duodenal
-maupin
-rfcs
-tripling
-shiel
-chessington
-aftershock
-stanwell
-absolve
-dhruva
-titulus
-cragg
-5km
-135th
-courtauld
-psm
-ruabon
-cribb
-lazer
-oversize
-post-apartheid
-scherzinger
-pallidum
-april/may
-r6
-24.30
-jell-o
-mitsuko
-opticians
-towner
-edelweiss
-vectoring
-derecho
-ice-skating
-sugiyama
-pneumothorax
-northrup
-sengupta
-deceiver
-plainclothes
-karyotype
-submerging
-raimondo
-sub-par
-reticent
-kreme
-stovall
-amphion
-kurata
-cti
-karak
-single-lens
-poulton
-web-site
-sorsogon
-piast
-nour
-thirteen-year-old
-malaita
-ardashir
-papillae
-honorifics
-2-point
-sewa
-kalashnikov
-phds
-implores
-single-minded
-seventy-sixth
-tranquilizer
-alpheus
-chinn
-mignon
-tomei
-hi-nrg
-tico
-right-to-left
-moorestown
-six-hour
-psh
-miceli
-1998-2003
-backman
-minutely
-bedok
-2006-2009
-nudge
-inferring
-willenhall
-sugarland
-subramanya
-lanna
-courtrooms
-mandrel
-holarctic
-trespassers
-takamatsu
-girish
-umma
-catheterization
-wageningen
-exhaustively
-castellan
-1939-1940
-sassari
-hca
-iveco
-pasar
-matanikau
-jisc
-verilog
-thien
-heredia
-starrett
-battleaxe
-resplendent
-osorkon
-birkdale
-term-limited
-villegas
-mongke
-ood
-re-engineered
-highest-ranked
-carousels
-bmo
-centre-back
-judaica
-oil-based
-grimoald
-herdsmen
-nisibis
-hueneme
-pz
-enantiomer
-sackler
-tufnell
-tweeddale
-high-rises
-skimmed
-travails
-
-silverbolt
-imperfection
-fait
-dendropsophus
-7.92
-wharfedale
-pedder
-1078
-keefer
-petes
-bcl
-weekender
-edinburg
-schooldays
-23.20
-hyaline
-categorizes
-caissons
-echizen
-1560s
-hartwick
-career-ending
-single-seater
-fico
-johore
-pleasance
-naphtali
-hoek
-plaintive
-insula
-mauch
-writhing
-medicinally
-calkins
-universelle
-turboshaft
-gimnasia
-immeasurable
-on-set
-betsey
-shadowcat
-philadelphians
-feelgood
-tarnish
-athletically
-sabot
-lbscr
-katsina
-mov
-24-7
-maron
-icosahedral
-hopped
-writer-artist
-sternwheeler
-jaafar
-phase-out
-paynter
-gatton
-kenn
-mirko
-dewi
-10.80
-56.7
-ranting
-denatured
-mechanicsburg
-satrapy
-ugauga
-fru
-toning
-insecticons
-cost-effectiveness
-kellett
-etcetera
-a41
-stackhouse
-sub-units
-dissections
-ossory
-campeche
-haiphong
-anticipatory
-hacksaw
-gravitationally
-sklar
-rakhine
-caterer
-constrict
-cessnock
-whisker
-yenisei
-combatting
-army-navy
-kittanning
--40
-biosynthetic
-nain
-underhanded
-daredevils
-policy-makers
-byfield
-colac
-sass
-locksmith
-spader
-breasted
-paston
-druk
-biomarkers
-subtree
-newsworthy
-disobedient
-hashemi
-reine
-jadavpur
-revaluation
-506th
-attired
-ecr
-guster
-dobbin
-cottontail
-helium-3
-pina
-phocion
-chesley
-tacks
-peddlers
-mytilene
-75.0
-gos
-urethane
-agaricus
-3.43
-blacket
-soapstone
-nondenominational
-meatballs
-keystroke
-menaces
-sweetman
-tripuri
-shied
-anarcho-capitalism
-corticosteroid
-duras
-waffles
-staveley
-reisman
-unmixed
-adamawa
-3.37
-lingers
-diyala
-tashi
-ruthenia
-purnima
-1987-1989
-emailed
-benalla
-andor
-bandgap
-posteriori
-mandrell
-waldorf-astoria
-daron
-kapital
-boingo
-rane
-webcasts
-tns
-klia
-6d
-sherif
-sidwell
-katyusha
-lory
-cristoforo
-colma
-100.2
-malcom
-escutcheon
-sanyal
-noto
-bobbio
-agr
-corinthos
-vlasov
-cox-2
-shootouts
-bayless
-l-1011
-dvd-rom
-disjunctive
-bootstrapping
-leffler
-congratulatory
-stiffening
-monopolize
-polyunsaturated
-uninvolved
-mash-up
-westjet
-semiautomatic
-kroeber
-honus
-ssn
-elkie
-pira
-waitaki
-well-loved
-kisaeng
-lancastrians
-ranma
-carothers
-siskins
-olly
-sinden
-easements
-airwolf
-coprime
-sorbian
-corruptions
-collegial
-58.4
-maxi-single
-manakin
-novelisations
-thyristor
-pcd
-bourn
-syrups
-hasselt
-shueisha
-deletes
-ravenhill
-propeller-driven
-tati
-r2-d2
-mongul
-montparnasse
-bexleyheath
-macnab
-seceding
-ohsaa
-pastiches
-kalachakra
-formate
-puerta
-leachman
-pui
-fiordland
-fedorov
-cusick
-hsinchu
-verano
-frogman
-filbert
-engelmann
-mcl
-dooly
-ranvir
-jn
-foxhound
-catchments
-isbell
-progressivism
-stelmach
-bambaataa
-gujjars
-hmo
-gopala
-yate
-song-writing
-ferntree
-habibie
-formula_80
-pro-cathedral
-hewes
-bondholders
-imereti
-quetzalcoatl
-eyesore
-finalizing
-peony
-rosanne
-garak
-lenton
-on-off
-nibbles
-antihero
-natatorium
-paramour
-hatcheries
-drips
-sedgefield
-fetches
-i-275
-22.00
-hadiths
-four-seat
-arbutus
-lostock
-vijayan
-double-cd
-landsat
-farmyard
-weyr
-1944-1945
-baskervilles
-wiry
-galaxia
-sergeant-major
-shimano
-vdc
-multipole
-uti
-under-15
-kalari
-makara
-csonka
-csg
-annabeth
-taxiing
-goucher
-tsurugi
-atreus
-iles
-hosni
-mccrary
-insead
-merrily
-gershom
-jeon
-consoled
-penitence
-twelve-tone
-1007
-peaceable
-williamsville
-mabuse
-cklw
-lodhi
-rush-hour
-infotainment
-screamin
-soka
-sclerophyll
-vectis
-homophonic
-saprissa
-ablest
-jaegers
-zapu
-overbridge
-rejoiced
-dacite
-sub-family
-mandinka
-arch-enemy
-wnyc
-16,875
-opteron
-tokamak
-denson
-naya
-philmont
-optometrists
-yaoi
-infra
-s.h.
-supermajority
-marcelle
-batang
-lindau
-koresh
-us-acan
-fath
-neyland
-re-building
-tma
-odetta
-downham
-capon
-kbr
-dispersive
-mfc
-nuer
-hessians
-cubby
-congdon
-roane
-kalpana
-panellists
-rorty
-ardeshir
-devane
-antar
-boumediene
-alexia
-cardinale
-bridesmaid
-mispronounced
-anaesthetics
-chiffre
-muromachi
-ruhollah
-baylis
-chindits
-helvetic
-male-female
-sojourner
-refurbishments
-stephanopoulos
-hemmed
-slaughters
-capstan
-103.8
-26.70
-tsuchiya
-koan
-raffi
-codenames
-groombridge
-fermion
--17
-intendant
-aghast
-1968-1969
-lanzhou
-froude
-pdr
-magni
-pre-orders
-pattinson
-smearing
-cayuse
-junior/senior
-bott
-saarc
-scramjet
-glickman
-showbusiness
-treo
-nhut
-painterly
-masta
-32x
-moneypenny
-piz
-1995-2000
-reichert
-stix
-taverner
-transmilenio
-barfleur
-alanine
-archetypical
-cross-linking
-maladies
-gadd
-knapsack
-rourkela
-transliterations
-marjory
-cradles
-oil-rich
-bevilacqua
-prabang
-vasey
-ambrosio
-recasting
-geosciences
-nguni
-homs
-insinuated
-sweetly
-tullamore
-gasoline-powered
-compline
-tamoxifen
-lightwave
-yuu
-torricelli
-reynaud
-ries
-dort
-handedness
-capron
-chern
-westerville
-kaiserliche
-unrepresented
-gabbro
-croup
-kfor
-jaber
-mastroianni
-overburden
-collison
-sailings
-aldermanic
-nef
-aldolase
-landa
-ahmadi
-nieman
-sejong
-rabbitt
-vitriolic
-geraghty
-backbeat
-55.5
-vrindavan
-saddest
-caryl
-guccione
-entrust
-baobab
-lamon
-1132
-zuid-holland
-lacing
-refraining
-hrc
-izumo
-chattering
-gigli
-ledesma
-bcg
-malleus
-xlr
-yadava
-maxey
-safar
-ileum
-10.60
-naseem
-huddled
-jesters
-ickx
-sunga
-acadiana
-ghadar
-obstetricians
-pinch-hit
-second-floor
-vict
-coalville
-evergreens
-mikami
-concatenated
-asli
-tullio
-latin-american
-mcclendon
-jackfruit
-chitradurga
-foreknowledge
-aphorism
-scoundrel
-4800
-erythromycin
-duryea
-hexafluoride
-armia
-perla
-apeldoorn
-savonarola
-strategos
-cosmogony
-c-130s
-cladogram
-p.d.
-lentil
-9-6
-swimsuits
-oise
-double-track
-pre-installed
-watsonville
-go-kart
-lamacq
-erisa
-nagari
-intercounty
-wertheimer
-gavan
-quart
-103.4
-mnemonics
-maja
-maersk
-critiquing
-bohemund
-martz
-manto
-84.8
-cup-winning
-interregional
-jardines
-decrying
-brushless
-gondry
-mckagan
-28.00
-shahnameh
-mapp
-big-screen
-seventy-six
-pfc.
-letourneau
-spherically
-ies
-niklas
-techno-organic
-ropeway
-traralgon
-systole
-baltistan
-24.40
-2005-6
-kunti
-rumbling
-satisfiability
-stabled
-zeb
-sulphate
-sourdough
-26.90
-kryptonians
-mid-twenties
-sturtevant
-cost-saving
-hsdpa
-now-closed
-alliant
-newswire
-sufyan
-buf
-mulvey
-pati
-cantu
-meadowbrook
-antechinus
-3.46
-inaugurating
-riddim
-recreationally
-betar
-voyaged
-fringillidae
-nestling
-azarbaijan
-dovecote
-owner-occupied
-meddle
-lycurgus
-bernicia
-belligerents
-divines
-essington
-apennine
-catonsville
-margulies
-27.90
-tebbit
-dudgeon
-kuki
-misjudged
-baptistry
-46.1
-coxe
-hoodlum
-stryper
-profumo
-psittacidae
-vejle
-basiliscus
-ampere
-almaden
-uap
-lodovico
-anshan
-s-ivb
-rediscovering
-lakenheath
-amphora
-frontispiece
-gravure
-868
-muttiah
-baudouin
-levesque
-nostrand
-elizondo
-boonton
-ssw
-csir
-qubits
-gauged
-s-3
-impairing
-context-sensitive
-whos
-mathieson
-unum
-mid-week
-herbalists
-uncouth
-104.8
-dominos
-banbridge
-speedboat
-face-up
-babb
-weg
-tice
-moron
-saragossa
-praja
-kvp
-giveaways
-stansbury
-58.9
-e-business
-takin
-pyrrhic
-youghal
-agumon
-mozzarella
-desam
-mapes
-sixty-ninth
-perineum
-71.4
-trackside
-sellafield
-retrievers
-boldklub
-dorf
-vanities
-tegucigalpa
-fos
-playland
-on-and-off
-anti-vietnam
-antiqua
-revivalism
-noumea
-blurs
-invesco
-kelana
-1964-1965
-mackenzies
-cathar
-4,700
-bleep
-stagecoaches
-kaul
-zoid
-1006
-andersonville
-guelders
-tew
-tondo
-binalshibh
-reichenau
-bexhill
-desperados
-reseda
-kerb
-snooping
-romberg
-thesz
-1:10
-958
-tavernier
-almarhum
-trp
-sio2
-mengele
-quarles
-neto
-wed.
-hoagland
-borodin
-sultanpur
-taksin
-dri
-triglyceride
-englund
-reinvested
-hard-pressed
-alleyways
-hudgens
-maximo
-ferdowsi
-panjab
-metabolised
-daine
-fortaleza
-pervaded
-https
-neston
-iguanodon
-puritanism
-27.00
-karman
-hexameter
-reinhart
-frenchtown
-emea
-oriana
-vashon
-ferri
-hallstatt
-one-mile
-bown
-vining
-scheherazade
-civility
-dlls
-satraps
-dnb
-78,000
-stannary
-milken
-w.i.t.c.h.
-bocelli
-brokering
-libertyville
-pursuance
-reuther
-piqua
-wmv
-thundercats
-midsection
-pancrase
-iredell
-splints
-mycorrhizal
-1993-1997
-tomaso
-dnipro
-mounties
-turbofans
-hexagons
-ofsaa
-storz
-parodic
-self-discovery
-negates
-swizz
-engulfing
-gilwell
-saxena
-pagar
-annunziata
-reauthorization
-cadwallon
-veering
-eightfold
-lendl
-laozi
-scoping
-bostrom
-mcalester
-inari
-terminalia
-diomede
-azo
-wesel
-ebon
-corpuscles
-periplus
-co-writers
-neonates
-borehamwood
-pombo
-promicin
-avr
-ober
-fan-made
-mohini
-yellowcard
-nabil
-sharan
-irrationality
-microtonal
-carbone
-percolation
-ruthin
-wotan
-suga
-ranh
-caduceus
-55.7
-cinerea
-organelle
-two-speed
-odile
-guttenberg
-tsonga
-proteobacteria
-forfeiting
-ovi
-mutinous
-doink
-malayali
-t-shaped
-tindal
-sugary
-invisibles
-surjective
-wulfstan
-sharapova
-1991-1994
-generalship
-sws
-characterises
-cathodes
-soulfly
-bharatanatyam
-mung
-willits
-drugging
-indymedia
-sub-plot
-telefutura
-ammar
-ossuary
-altay
-fixed-point
-martinelli
-fainter
-detonations
-1017
-honeycombs
-alwyn
-hallucinating
-faure
-ditching
-bostic
-karuna
-columbarium
--16
-perdido
-polices
-mrf
-pivoted
-intervarsity
-ascites
-stringers
-prasanna
-.223
-early-1980s
-feverish
-remorseful
-aleksandrovich
-moorer
-symbology
-cheriton
-brio
-sea-going
-retry
-karlin
-receptacles
-pres.
-stoltenberg
-teilhard
-kong-based
-teena
-pics
-unfazed
-arneson
-stagger
-guitar-driven
-fantasyland
-broadens
-hindwing
-klaatu
-nera
-askari
-whack
-ahvaz
-mcu
-26.10
-vagaries
-canterbury-bankstown
-packwood
-prodrug
-inflicts
-ransack
-novum
-17-18
-angeline
-mse
-scouters
-neuhaus
-gally
-queenscliff
-81.8
-ucas
-urie
-mnm
-pbc
-gosh
-poodles
-kunio
-tarek
-wapiti
-umc
-non-mainstream
-rebut
-anticonvulsants
-gnomon
-gaeltacht
-bartered
-karenina
-unt
-coupland
-straker
-frosinone
-groote
-scotian
-stapleford
-'06
-nesn
-reconfigure
-carpio
-disseminates
-recollect
-shipmates
-canoer
-parkwood
-wigtown
-arsonists
-ansaldo
-08-09
-concurs
-pulverized
-spilotro
-tamiya
-laffey
-gelb
-magnify
-railfans
-thiers
-cockerel
-servos
-a19
-rilke
-pre-match
-donati
-duesenberg
-stuffs
-fairways
-one-page
-bacteriophage
-muffled
-chloramphenicol
-misanthropic
-nazaire
-brus
-eases
-992
-hasselblad
-hymen
-odometer
-thermite
-mideast
-legalism
-decidable
-vreeland
-caskets
-microarrays
-tikhon
-country-wide
-accredits
-folklorists
-phraseology
-craxi
-marca
-all-electric
-hallucinogen
-shibe
-underlines
-fassbinder
-naim
-nergal
-lukes
-looking-glass
-m&a
-misao
-iea
-osijek
-yttrium
-electrochemistry
-rafik
-larger-scale
-jop
-fairley
-flicking
-genealogist
-306th
-1988-1990
-garr
-b.p.
-vaishali
-zacatecas
-diatomic
-tenedos
-sorge
-1059
-manti
-preclassic
-archies
-machine-readable
-guajira
-wheldon
-sleepiness
-maginot
-cheaters
-instigators
-carola
-rod-shaped
-exton
-centralism
-thawed
-dotson
-unspoilt
-imbruglia
-photometric
-cannabinoid
-cattell
-44-yard
-refits
-benelli
-spiralling
-tritt
-ziva
-old-age
-pragmatics
-wee-bey
-eudoxia
-leilani
-dwaraka
-wendish
-consumable
-stuckey
-sociocultural
-kund
-nisha
-bagby
-aaronic
-mary-kate
-welford
-coppi
-florey
-chinensis
-agama
-prince-bishop
-swee
-embryogenesis
-psygnosis
-incorporeal
-tretyakov
-lessor
-mcdiarmid
-butternut
-mcavoy
-vaasa
-4b
-tweezers
-12-4
-hysterically
-moka
-pov
-gasser
-poset
-pompeo
-pear-shaped
-pressburger
-cth
-hyborian
-mardan
-anyang
-c-type
-charlatan
-lemans
-diecast
-christadelphians
-universalis
-daytime-only
-kavu
-wulff
-homeschooled
-re-founded
-mitsuru
-px
-telomeres
-oaklands
-galla
-reiterating
-ischia
-degenerates
-chernivtsi
-kitana
-altan
-ascari
-skippered
-morelia
-overdraft
-dvd-video
-kimiko
-praefectus
-kulick
-949
-sekigahara
-toyed
-perpendicularly
-zira
-collinwood
-driftless
-universitas
-nicht
-cuellar
-mandamus
-reams
-scinax
-flip-flops
-squint
-groban
-yellow-bellied
-cassio
-fingernail
-fabricius
-uploads
-mechanicsville
-re-supply
-sphenoid
-almora
-streaky
-corroboration
-rakshasa
-guruvayur
-emesa
-intimated
-floorboards
-girly
-trevithick
-16-17
-schaller
-positivity
-cipriano
-niel
-pestle
-chouinard
-four-digit
-dutifully
-elwyn
-4-6-2
-952
-jessy
-nambu
-americorps
-b.r.
-marder
-overlies
-c-3po
-fontenelle
-7-year
-dugongs
-participations
-grey-headed
-well-intentioned
-6,200
-pwc
-gibbes
-pickpocket
-0.28
-madlib
-fulmar
-vive
-durian
-senior-level
-t-33
-junichiro
-wipro
-fowles
-guesting
-atg
-10a
-kazaa
-non-disclosure
-soreness
-poppa
-foolishness
-functionalist
-on-location
-seether
-rousse
-85.3
-varga
-rusyn
-nibs
-tantras
-boric
-vaishnavite
-granta
-four-page
-prednisone
-cnut
-subscripts
-clarksdale
-surly
-penicillium
-neopets
-1046
-z-cars
-hougang
-soapbox
-holles
-reining
-subducted
-gerbils
-feria
-lascivious
-84.4
-bartels
-lezion
-archdioceses
-chumbawamba
-pisani
-collings
-huger
-heterosexuals
-confusions
-guideway
-hailes
-milland
-wakw
-pre-clinical
-defencemen
-housman
-hoards
-wenceslas
-renames
-ecclesiology
-anda
-mintage
-televising
-zombified
-loopy
-destruct
-nonsuch
-coningsby
-1161
-sahu
-workaholic
-donnchadh
-chinaman
-starkville
-sante
-glamorganshire
-hyperglycemia
-enantiomers
-chiropractor
-57.8
-weider
-gironde
-schiavone
-kj
-rosemount
-impelled
-dslr
-re-imagining
-appellations
-jue
-lobelia
-leukocyte
-mulgrew
-anja
-bharu
-cama
-pada
-damietta
-chartreuse
-mizrachi
-haneef
-super-villain
-attics
-suis
-pensive
-saree
-shoppe
-theresienstadt
-lactone
-chaffey
-osterman
-krishan
-mahavamsa
-malawian
-burbridge
-jocelin
-victorian-era
-buu
-20-17
-dirksen
-ort
-kh
-turnips
-ulla
-balalaika
-cornices
-balao
-mottola
-subservience
-47-yard
-chipperfield
-belmar
-lysimachus
-52.0
-chinna
-nigam
-geta
-razor-sharp
-unmasking
-keady
-jeph
-buran
-dumpty
-sikander
-yawning
-aspasia
-englehart
-subdues
-matamoros
-pinal
-cardio
-bedworth
-vauban
-vocative
-5x
-18-yard
-hourman
-blomfield
-macgillivray
-joslyn
-embouchure
-braunfels
-trisomy
-humps
-macadam
-izu
-disregards
-penta
-safeco
-ragdoll
-jaz
-40-man
-jagdish
-tasers
-magmatic
-petunia
-smacked
-evangelize
-mctavish
-harried
-hoyas
-oreo
-jenifer
-alessio
-ariosto
-1103
-leavened
-madawaska
-conceited
-1928-29
-stobart
-dorp
-switchers
-thugs-n-harmony
-ladle
-latrines
-recurvirostridae
-76.2
-daigo
-buttery
-visualisation
-wxyz
-ghc
-14-16
-jugurtha
-pera
-ripened
-dawe
-scheveningen
-gravedigger
-oboist
-markt
-grantville
-single-phase
-gille
-28.20
-cinnabar
-lk
-single-celled
-gately
-55.0
-sociopathic
-vocals/guitar
-sailer
-tikrit
-fallacious
-telemachus
-rambam
-smw
-dsg
-27.60
-957
-re-developed
-bley
-inverters
-jeunesse
-antithetical
-ballinger
-malco
-telemedicine
-wansdyke
-workhouses
-antonis
-0.24
-lessens
-unquestionable
-henkel
-dimple
-berenguer
-ceylonese
-blt
-80286
-herrings
-orangery
-pachelbel
-yearns
-aventine
-nystagmus
-anti-miscegenation
-frontpage
-a-2
-schoolwork
-mckie
-invincibles
-26.50
-repentant
-non-renewable
-fossilised
-searles
-albacete
-pirro
-groveland
-panis
-easel
-resurrects
-jawbreaker
-singletary
-punning
-hawken
-towneley
-schuldiner
-koshi
-horseshoe-shaped
-84.9
-raions
-parekh
-mandaluyong
-undervalued
-colchis
-waltzing
-bispham
-forestville
-optician
-eklund
-foia
-liddle
-jayavarman
-emcees
-c-train
-trogon
-norths
-showgirls
-hava
-akhil
-nhc
-haidar
-hempel
-m&s
-vrain
-mccarron
-nobilis
-co-ownership
-o.b.
-krystle
-november/december
-joi
--DGDG.DGDGDGDG
-urbano
-five-week
-muzzle-loading
-kanya
-zeppelins
-fader
-13-7
-biagio
-bingen
-monkhouse
-medak
-ex-member
-topspin
-flash-cut
-bristol-myers
-11.80
-oxted
-cuboid
-yasir
-dhofar
-bolt-on
-fado
-snobbery
-nazar
-darkhold
-fordyce
-mfp
-callis
-rupaul
-lock-in
-canoga
-re-examination
-kochanski
-cresta
-'94
-mcclaren
-tantrums
-langkawi
-l5
-25.20
-kuntz
-qurayza
-textron
-anquetil
-macv
-shrugs
-newschannel
-abraxas
-lilburn
-sub-regions
-737s
-nuchal
-malang
-nares
-self-funded
-grosbeak
-rda
-pressler
-cordy
-boh
-stato
-piqued
-portlaoise
-zama
-hodgkins
-ampersand
-spiel
-dattatreya
-lynnwood
-glycosylation
-noontime
-evanescent
-53.9
-unpunished
-ocp
-demotic
-scrubby
-antiope
-councilmen
-nxd4
-keibler
-4.25
-blue-black
-ovale
-exaggerations
-w-league
-jobim
-abrasives
-torrio
-guanosine
-naively
-nifong
-svm
-dilatation
-uru
-garmin
-qualia
-carbon-14
-w.k.
-chaturvedi
-signed-on
-sfl
-ganj
-winnie-the-pooh
-werke
-dori
-up-front
-kassa
-douala
-muthu
-buttermere
-top-up
-fides
-graecia
-black-throated
-chom
-glassboro
-inordinate
-cermak
-brooksville
-calorimeter
-peranakan
-beecroft
-lotions
-parmar
-houtman
-balco
-karyn
-gaddis
-traylor
-lossiemouth
-governorships
-nuh
-chick-fil-a
-business-to-business
-playtime
-reales
-palustris
-koren
-mildmay
-DG.DGDGDGDGDGDG
-allawi
-muslimgauze
-civilis
-63.2
-fomenting
-cawdor
-formula_77
-facs
-monikers
-hedy
-monteiro
-buzzers
-pettis
-deniers
-riverboats
-lynd
-gouged
-heiden
-golgotha
-bhajans
-compactly
-mundelein
-proscription
-mudaliar
-corin
-aikikai
-rcr
-selly
-orthopedics
-parkhead
-burana
-whoopee
-atn
-samizdat
-somnath
-eilonwy
-accelerometers
-feuerstein
-sarabande
-deodorant
-carcinomas
-cmll
-conagra
-hibbard
-kinglets
-gomorrah
-regrettable
-reiterates
-pucci
-harran
-22-year
-moulay
-ravan
-cobordism
-drape
-retinol
-9-8
-santer
-amraam
-caribs
-reykjavik
-tvontario
-workflows
-d-1
-millbrae
-gruppo
-bioterrorism
-raincoat
-montag
-pankaj
-meaty
-niazi
-chadbourne
-philco
-leinart
-cable-only
-bluesman
-mordant
-singstar
-whorehouse
-hulks
-olefin
-dewdney
-haditha
-pre-fabricated
-mcferrin
-unassigned
-rubbery
-doktor
-14-yard
-laconic
-quantifier
-matted
-unsc
-zener
-trabzon
-indelicato
-nurmi
-al-aziz
-waronker
-snickers
-foursquare
-accc
-lostwithiel
-mulhouse
-flir
-haga
-canarian
-kickback
-p-orridge
-jarred
-1032
-ill-suited
-1991/92
-taciturn
-exhaled
-goethals
-machiavellian
-rohini
-asides
-northway
-uchida
-curtiz
-habitability
-sarpsborg
-toowong
-ifp
-hyperbole
-i-195
-pollitt
-carpe
-wead
-inheritor
-1.98
-ambani
-safehouse
-xliv
-cristofori
-condolence
-cassano
-scammer
-pancamo
-stanshall
-turgenev
-kcr
-yigal
-adopter
-heimlich
-otu
-tapper
-usar
-stalkers
-anti-authoritarian
-satanists
-deaconess
-natan
-steeds
-koop
-20,833
-homesteaded
-alemannic
-infusing
-zamorin
-dogfights
-conserves
-patric
-alisha
-osh
-atar
-lausd
-liceo
-oregano
-hann
-naja
-mecklenburg-schwerin
-maran
-cem
-cei
-rhinebeck
-driller
-1109
-allanon
-maracas
-cassander
-notionally
-in-demand
-opeth
-donnington
-stacie
-unmask
-lisu
-desegregate
-jtag
-kaun
-marcian
-stroking
-uucp
-excellently
-dreier
-earthlings
-epiphytes
-gametophyte
-pancha
-osmani
-59.1
-59.3
-legazpi
-a-b
-enacts
-j'
-westall
-orfeo
-hooley
-unspectacular
-ohba
-shuriken
-trailways
-fontane
-loughlin
-paribas
-adventureland
-basotho
-gummi
-darlin
-bann
-gillibrand
-transmetal
-d'arc
-pruned
-guzzi
-hillgruber
-tsuji
-chromite
-bhakta
-leonor
-a350
-stickler
-peopled
-phonogram
-mantidactylus
-four-car
-noord-brabant
-shrieking
-10-10
-heartlands
-alli
-28.10
-exorcisms
-arbus
-fulfils
-mwanza
-11-day
-vastness
-orazio
-mahanadi
-cheema
-plausibly
-fine-tuning
-madama
-al-hasan
-a11
-combustor
-ross-shire
-j2ee
-ofdm
-exley
-single-sided
-much-loved
-hoarse
-mithra
-interactively
-valles
-higuchi
-causally
-three-headed
-polyandry
-woodcreeper
-voyageur
-borstal
-40-foot
-worn-out
-kiya
-lavington
-lucero
-super-human
-two-room
-surrogates
-aep
-cvd
-cultivates
-84.3
-caricaturist
-totems
-mpt
-municipally
-manville
-wms
-wmc
-vitriol
-oporto
-vampiro
-humes
-stomata
-encrusted
-dinucleotide
-repaying
-amidah
-sullavan
-mycoplasma
-dystonia
-eitan
-lavrov
-ploughshares
-tanka
-380th
-inveterate
-and1
-arawak
-pistachio
-crumpled
-dhritarashtra
-inoculated
-streptomyces
-isg
-kamikazes
-lampooning
-turnstile
-pigtails
-deuterocanonical
-pty.
-loony
-marchetti
-buru
-hashmi
-arty
-tenpin
-retd
-excites
-2000ad
-fallback
-tollemache
-ardabil
-co-defendants
-gilford
-tipper
-tengen
-refracting
-needle-like
-almelo
-davitt
-oper
-non-contact
-bioscience
-mossi
-hayling
-fiori
-psf
-cowlitz
-oliveri
-school-aged
-improprieties
-magar
-tunguska
-kirkintilloch
-2.03
-165,000
-anti-israel
-61.7
-uprights
-non-fictional
-forelegs
-bosko
-cornel
-bricklayers
-benes
-interceded
-triune
-knyaz
-shanties
-bugged
-spiers
-phalacrocoracidae
-o'doherty
-simsbury
-unprocessed
-throwaway
-swati
-single-stage
-skua
-pdo
-dampening
-90-degree
-havel
-edgier
-calexico
-6.80
-defensemen
-3.51
-shipwright
-cohens
-conjures
-giddens
-philp
-usas
-churu
-trunked
-vishwanath
-quenched
-bivouac
-overwinters
-ramaswamy
-verdonk
-16-inch
-exchange-traded
-weatherboard
-tumbleweed
-sousuke
-chitin
-ramla
-ql
-weaning
-intentionality
-colorguard
-non-free
-orthographies
-blish
-ose
-ettrick
-kenrick
-toontown
-sct
-mineworkers
-quatrain
-crespo
-young-adult
-achtung
-dunton
-re-working
-mahila
-camra
-nor'easter
-pikas
-svo
-rossville
-erases
-boilermaker
-leptons
-informix
-pugwash
-kents
-zapper
-presences
-torsten
-guilder
-afterwords
-botetourt
-kevan
-electronegativity
-one-and-a-half
-expellees
-3-year-old
-volz
-saxe-weimar
-patrika
-tex-mex
-self-censorship
-cut-down
-odious
-readmission
-euryalus
-jupiler
-gorski
-yesteryear
-naima
-hawksmoor
-lesabre
-sulzer
-13:00
-nihil
-p.m
-fernandina
-harbord
-omac
-murrah
-callosum
-1061
-glycerin
-wbcn
-birbhum
-dimmed
-shiller
-schirmer
-mystically
-abrolhos
-atvs
-todos
-garang
-newmark
-fairleigh
-contractile
-f-16c
-research-based
-accumulators
-overspill
-cre
-goenka
-cii
-trevally
-63.6
-mcconaughey
-gamebooks
-hazelnut
-sangeetha
-fizzled
-relapsed
-municipium
-mummers
-suhrawardy
-adena
-diffusing
-25.10
-niggers
-ala.
-tyrian
-czarist
-hegde
-honeybees
-jerrold
-zogby
-dream-like
-twirling
-rian
-crossbones
-pro-american
-powershell
-fatigues
-straight-4
-endeavouring
-signpost
-groundnuts
-swisher
-state-maintained
-umbc
-devolve
-gell
-mongrel
-pursuer
-creamer
-killam
-munoz
-perc
-longclaws
-last-second
-heisei
-parachutists
-erythropoietin
-bengt
-blemish
-llwyd
-actionable
-ginza
-online-only
-semi-active
-publix
-malarial
-zoologists
-lampang
-kontinental
-jayawardene
-favreau
-coloureds
-cfm
-handedly
-emanations
-e-books
-libretti
-commagene
-bonelli
-macht
-nsp
-eastport
-pujas
-deliverables
-full-featured
-hyperthyroidism
-indict
-stackpole
-magen
-baselines
-woefully
-ghola
-jebb
-barmouth
-harar
-bhagavathy
-well-written
-khalidi
-tirthankar
-ind.
-evinced
-pro-wrestling
-constraining
-submersion
-benedetti
-paparizou
-sarkis
-overhill
-serna
-vice-regal
-namakkal
-aust
-disquiet
-autonomist
-c-terminus
-f-series
-pietra
-khurd
-antivenom
-twit
-unmanageable
-benegal
-nepomuk
-skaro
-pincers
-beckoned
-lortel
-freewheel
-abid
-3km
-ma'am
-sangguniang
-navicular
-fish-eater
-10.90
-holywood
-janaki
-behrens
-shau
-petco
-sahaja
-hustlers
-pupate
-breakwaters
-widney
-tnf
-otus
-wfmu
-camargo
-dryas
-vik
-bloodshot
-dimond
-abc2
-backpacker
-tmi
-dwarka
-u12
-extramural
-7600
-co-conspirator
-fettes
-dhoom
-topham
-portrush
-thawing
-ntac
-lycos
-livewire
-outlive
-castellated
-kloster
-2.04
-wip
-pediments
-phonon
-bannatyne
-leatherface
-levis
-nitrox
-table-top
-faustino
-hallowell
-tseng
-piran
-sealion
-editorially
-bd3
-scriptwriting
-r.k.
-rieti
-maoism
-eleusis
-flings
-aliya
-sleeved
-phon
-ophthalmologists
-pastoralism
-industry-wide
-puram
-mudslides
-eldredge
-k-3
-maksim
-nebulos
-loya
-bushranger
-mattia
-wantagh
-croatians
-vesalius
-internet-only
-donnybrook
-23.70
-sharpest
-jambi
-6-month
-idiotic
-reprogramming
-regazzoni
-woof
-ethiopic
-farish
-sternly
-action/adventure
-royall
-sununu
-subcamp
-v.c.
-osf
-flaubert
-tamarin
-folksongs
-initialize
-fierro
-whittlesey
-winnfield
-leeming
-schacht
-terrill
-wigley
-chatty
-fulcher
-maastrichtian
-modigliani
-65.5
-ho-chunk
-netherlandish
-pining
-1069
-kodaikanal
-waterfield
-therefrom
-stand-by
-mabinogion
-thieu
-sixty-eight
-terminations
-seventy-four
-hillhouse
-simenon
-waveguides
-grassley
-pseudo-random
-mowry
-radiologic
-mayumi
-earps
-flaky
-bakhsh
-coit
-agnelli
-hafner
-breakin
-supply-side
-kookaburra
-blaxland
-downstate
-daenerys
-overprints
-holon
-kavita
-debased
-tz
-phosphine
-portraitist
-intertwining
-tojo
-freyberg
-pcha
-peden
-phosphodiesterase
-great-aunt
-x4
-hnoms
-texians
-nani
-kunz
-wnet
-gibbet
-shanker
-12.20
-expressiveness
-begonia
-bladensburg
-bisecting
-moonlit
-orthogonality
-gtv
-froth
-quarreling
-coton
-protruded
-gundams
-pre-hispanic
-lateef
-lta
-racists
-effy
-directorate-general
-affricate
-prepped
-jct
-60-day
-sadf
-prospering
-celta
-tosefta
-christological
-narrowness
-rione
-meon
-perk
-harpsichordist
-dyspnea
-0.85
-crossbills
-sealant
-rabun
-hard-rock
-canavan
-meditated
-waas
-disincorporated
-tammi
-artiodactyl
-1044
-arr
-leticia
-dena
-deferring
-non-discrimination
-holmfirth
-woolmer
-cicadas
-makem
-ji-soo
-haft
-syntactical
-bonecrusher
-chowan
-cuevas
-heavier-than-air
-wrightson
-i-77
-elster
-treetops
-nueces
-49er
-plaistow
-aguascalientes
-tress
-voivode
-unnaturally
-superion
-inquisitorial
-donjon
-26.40
-sime
-paratroop
-vma
-wgi
-kinloch
-gringo
-pratibha
-giggles
-eval
-keltner
-aklan
-dioguardi
-mciver
-corris
-generalitat
-finnerty
-helots
-mcmillen
-fifty-nine
-vyvyan
-clocktower
-jux
-enosis
-revelers
-shipper
-newley
-catechetical
-corroded
-teela
-dural
-hand-built
-cobweb
-uridine
-green-wood
-anacardiaceae
-eloy
-gelsenkirchen
-deluise
-gourock
-best-sellers
-2048
-gingival
-bangsar
-17-10
-gaudio
-treblinka
-yogis
-decemberists
-garforth
-nullarbor
-prat
-undertow
-tenby
-charcot
-narses
-spacemen
-mapper
-chalukyan
-83.9
-mcnaught
-exonyms
-dolmens
-laetitia
-standardbred
-danelaw
-peeler
-fording
-anagrams
-sansa
-superbus
-appending
-roundels
-wayanad
-toriyama
-bahujan
-franky
-electrify
-corrales
-actor/director
-hand-in-hand
-tarkenton
-ss7
-romanovs
-68.8
-gosselin
-falkenberg
-herriman
-wjw
-goeben
-mbr
-thiol
-moradabad
-danorum
-timaeus
-towanda
-reena
-arabica
-wohl
-2010-11
-sadako
-projectionist
-digg
-wsbk
-squabbling
-chazz
-malakand
-lodgepole
-cress
-mid-seventies
-kirkstall
-tcr
-concentrator
-arcola
-moneys
-hci
-conkling
-tarquin
-alresford
-thr
-animistic
-kilcullen
-ossipee
-ukiyo-e
-hardeman
-flattop
-neh
-commend
-demonstrable
-trantor
-wulfgar
-chandi
-smet
-pectin
-finnigan
-adamski
-canuck
-monosyllabic
-ibrd
-croaker
-kilby
-t5
-marinette
-oligarchic
-airbrush
-d'art
-ruddigore
-banka
-itv4
-tiraspol
-venda
-inter-ethnic
-northwoods
-audiophile
-sideburns
-8-11
-indisputably
-iapetus
-akal
-queers
-sabir
-exchangeable
-love-hate
-clk
-toh
-three-star
-pelops
-wardha
-voronin
-satguru
-irascible
-nitze
-primorsky
-knickers
-macrophage
-asics
-selflessness
-java-based
-sowed
-chavis
-psychopathy
-sulfates
-nakashima
-borowski
-xenosaga
-steepness
-aucoin
-ultramarine
-rifkind
-ihr
-canids
-dainik
-felines
-sabercats
-d'honneur
-maffei
-pomponius
-parkgate
-empty-handed
-subcontracted
-isosceles
-skelly
-unissued
-precariously
-pyjamas
-wiradjuri
-eleftherios
-seafort
-hakodate
-breakcore
-accredit
-edens
-wuchang
-rothbart
-taa
-bodice
-agence
-fluorite
-srinivas
-applescript
-urawa
-kazakhstani
-rosse
-pronghorn
-musab
-dithmarschen
-cliffjumper
-tangles
-21,500
-1962-1963
-dukat
-thirumal
-holstein-gottorp
-pharos
-mccarver
-espnu
-rebukes
-aoun
-entrenchment
-colonoscopy
-decarlo
-tovar
-powerboat
-idema
-dupe
-fynbos
-bezel
-disclaimed
-normalcy
-pollak
-f7
-candidly
-charmer
-quietest
-high-explosive
-two-pronged
-guis
-gillman
-e-1
-2-player
-24.90
-fenech
-bloodstock
-bolzano-bozen
-joad
-navarino
-nutting
-blowin
-snmp
-9-4
-cesaris
-up-curved
-tessie
-coarsely
-seeping
-sabena
-kristensen
-ottumwa
-drop-out
-desertions
-hampden-sydney
-'99
-wcau
-snouts
-bahlika
-oceanographer
-taft-hartley
-pre-civil
-firuz
-capper
-drakes
-waza
-arrigo
-ghaffar
-transferrin
-imagineering
-irreparably
-serpico
-agriculturists
-theni
-far-fetched
-impostors
-belgravia
-stawell
-peterman
-fountainhead
-unenthusiastic
-bladen
-counterfactual
-bespectacled
-tephra
-unia
-seanad
-tallgrass
-adaptors
-panagiotis
-madang
-perineal
-rallye
-chough
-two-fold
-kempston
-bonaire
-single-disc
-badri
-horticulturist
-roxxon
-triana
-slaver
-30.00
-outpointed
-compleat
-crevasse
-tughlaq
-flamsteed
-dyskinesia
-homological
-kormoran
-marz
-hdr
-vaporize
-bused
-repels
-tetras
-edelstein
-high-throughput
-massaro
-madoka
-permittivity
-jaxa
-ghibellines
-pollicis
-mackillop
-smothering
-turco
-easterbrook
-beppe
-dornoch
-humvee
-47.0
-gendered
-rutles
-all-county
-greenhills
-semicha
-jap
-awad
-synch
-moans
-anupam
-canseco
-scottish-born
-jjb
-maintainers
-norrie
-slpp
-pacifier
-myrick
-gaas
-parvathi
-co-education
-deathstrike
-planer
-chc
-e.e.
-multifunctional
-dreamin
-ppb
-preschools
-paramaribo
-ellsberg
-thesiger
-ghor
-transversal
-iihs
-darkwave
-menasor
-deputed
-gowanus
-stress-energy
-euan
-kulthum
-nata
-moulder
-sawed
-un-named
-lunas
-cento
-birthed
-shapeshift
-basketry
-cushitic
-prisoners-of-war
-sedatives
-white-headed
-o'loughlin
-20000
-libreville
-webcams
-scruples
-stereos
-yoshiyuki
-sephirot
-fet
-fe3
-larkana
-birchwood
-buncombe
-exocet
-kannapolis
-butuan
-burgomaster
-lubitsch
-synchro
-lydney
-recto
-halogens
-al-ahly
-poehler
-e.f.
-fach
-hydroponics
-bhikkhu
-nadja
-heritages
-stretchers
-erykah
-macalester
-brashear
-goffin
-silja
-saltillo
-allingham
-amherstburg
-telemedia
-team-high
-rfd
-queensferry
-phantasm
-everlast
-sella
-coccothrinax
-t.a.t.u.
-loggerhead
-msds
-ivrea
-hardliners
-southwestward
-astrazeneca
-jewison
-lolly
-cannan
-overpopulated
-livgren
-sleepwalker
-fly-by-wire
-mxpx
-ecma
-deconstructed
-dimas
-palaeontologist
-envisioning
-airlink
-mamiya
-chipper
-subsequence
-cotes
-eartha
-irby
-highline
-oxidizes
-marvelman
-understatement
-finnic
-siang
-time-life
-vologda
-dunoon
-locklear
-carters
-norquist
-gastroenterology
-werft
-aitc
-al-khattab
-epperson
-mykhailo
-loke
-poop
-filioque
-sheiks
-greely
-kunal
-belur
-phinney
-56.9
-56.0
-lazzaro
-trow
-lofgren
-imu
-georgiev
-warhawk
-boshin
-refinance
-phenols
-seg
-1026
-landmasses
-tannen
--14
-dependant
-masaaki
-lmg
-honiara
-non-surgical
-dmv
-patchogue
-duckburg
-wherefore
-sa'ad
-nono
-armor-piercing
-dioxins
-hevesi
-renville
-byelorussian
-supremum
-radome
-grodno
-physiognomy
-kunar
-headhunter
-mdl
-maggid
-adornments
-park-like
-accursed
-meaux
-horwood
-rnr
-iwasaki
-chimaera
-ecce
-yn
-1000m
-ngai
-belgica
-orix
-1/16
-free-agent
-saruman
-asta
-sprouse
-talkshow
-+DGDG.DG
-channon
-gopuram
-infliction
-mid-1990
-strother
-hexagram
-bagdasarian
-blip
-poisonings
-herm
-uttam
-c-46
-pion
-inkerman
-blackfish
-co-president
-subspecialty
-lustig
-northcott
-kennelly
--DGDG.DGDGDGDGDG
-enumerable
-churned
-ctp
-43,125
-toasts
-lamport
-60.4
-ducted
-camerlengo
-brodhead
-kotoko
-bardwell
-wcg
-licensor
-keil
-bunyip
-appletalk
-vallabhbhai
-wall-e
-parotid
-realignments
-longleaf
-mexica
-whs
-detuned
-hdv
-oxycodone
-tetroxide
-bitmaps
-rigel
-nkomo
-jarasandha
-reyne
-waitemata
-scada
-adelphia
-gautier
-ssu
-toca
-kazoo
-hina
-sleaze
-whoa
-truex
-patrik
-hoppe
-hardboiled
-parbat
-coconino
-caruana
-deflate
-tv1
-ex-gay
-multinationals
-nakuru
-dingy
-58.7
-latur
-tuscola
-piston-engined
-pubescent
-celsus
-24.70
-mundaring
-rennet
-whithorn
-cryptozoology
-charnock
-henke
-re-broadcast
-monoclinic
-nebraska-lincoln
-hornswoggle
-29-year-old
-implanting
-doro
-fullmetal
-assemblymen
-zavala
-gigantea
-hipped
-entrainment
-castaneda
-frisell
-trilling
-sph
-polycystic
-castelo
-avary
-fv
-southwood
-nuclides
-jind
-extorting
-otterbein
-gollum
-psychedelics
-life-time
-wisecracking
-corsi
-mercosur
-mirra
-wachowski
-graziano
-dijk
-wurundjeri
-shenmue
-dubinsky
-todor
-2x2
-mi-24
-pertained
-prabhakar
-tarka
-kalla
-f3000
-menshikov
-illuminatus
-scrawled
-voith
-delving
-63.8
-rickover
-limon
-ullah
-coolio
-kune
-yeahs
-pwm
-globalized
-slitting
-intel-based
-torvill
-1.12
-skerry
-isomeric
-caperton
-898
-morlock
-nataraja
-enlil
-digitorum
-62.9
-62.1
-shemesh
-mahaprabhu
-ludo
-shiites
-morland
-stop-gap
-barreled
-khok
-defacto
-jaffrey
-snakehead
-wbz-tv
-derailleur
-truxtun
-idp
-snide
-pre-planned
-october/november
-massapequa
-lier
-grandison
-franciszek
-newsmakers
-seamounts
-dau
-vrs
-neurologists
-azzurri
-pva
-mahar
-isomorphisms
-cfe
-battisti
-emerick
-lapierre
-gibney
-fenny
-mors
-husband-and-wife
-osan
-nabu
-lyneham
-krist
-berardinelli
-windowless
-bechet
-thongs
-cryptically
-8/10
-beesley
-azide
-slackware
-hardstands
-141st
-cross-linked
-inexorably
-56.2
-inaccessibility
-purist
-doctorow
-inactivate
-life-changing
-eoka
-horsey
-29,583
-samvat
-chlorite
-gurewitz
-salama
-heathens
-endothermic
-anticline
-shapley
-mogo
-giorno
-pay-per-views
-whodunit
-07:00
-southington
-ingot
-maroubra
-johnsbury
-uea
-stakeout
-nanci
-rigger
-weathermen
-mostafa
-1057
-caguas
-hothouse
-1.88
-guberniya
-owerri
-xxviii
-be3
-durkheim
-olinda
-bellwether
-5:1
-lumiere
-co-anchored
-kerslake
-salience
-lorin
-trackless
-petrolia
-karas
-karat
-bretons
-self-reported
-lorton
-short-tempered
-chapbooks
-edgehill
-jaunpur
-roasts
-kapampangan
-gamete
-beheld
-stuns
-electroshock
-al-jazeera
-successions
-phosphorylase
-gores
-expound
-solberg
-three-level
-2.01
-fenland
-mcgrady
-timeshare
-61.1
-monosaccharides
-hatley
-decorators
-ors
-kenshiro
-erlanger
-antagonized
-bushehr
-bama
-pelly
-clymer
-philomena
-bovis
-emerita
-iam
-valentinus
-23.90
-mcnamee
-1.99
-public-key
-abramovich
-hispano-suiza
-salafi
-glassworks
-kmc
-itagaki
-handkerchiefs
-clawson
-quadriplegic
-justicia
-mithril
-two-wheel
-winer
-intrudes
-hindutva
-entrained
-condemnations
-hippias
-brigantes
-hecla
-torrence
-redraw
-atletico
-seung
-vino
-furioso
-clunes
-seabee
-berio
-antipodes
-pik
-alamgir
-och
-ock
-oakridge
-nias
-tiberian
-firmament
-retroviruses
-wordpress
-lethbridge-stewart
-schliemann
-bendahara
-goddamn
-kyocera
-nanomachines
-lockup
-passmore
-banga
-shirdi
-hattersley
-gilboa
-toccoa
-citrix
-colonial-era
-rolleston
-ouyang
-brahmanas
-spotlighted
-olwen
-82.8
-sofas
-eponymously
-rhind
-a.f.
-electroconvulsive
-h-1
-babbling
-captain-general
-mileposts
-1063
-sixty-nine
-swayne
-rps
-handicapping
-cottingham
-bonjour
-attenuate
-novick
-paec
-linx
-twink
-pappu
-mendham
-nihilist
-microcredit
-relocates
-microbrewery
-beerbohm
-uhtred
-cawley
-badalamenti
-delfino
-secreting
-compo
-supermarionation
-corsicana
-reincarnations
-armalite
-misdirection
-side-to-side
-coro
-duvalier
-1003
-prp
-shirtless
-heightening
-ritualized
-442nd
-mauldin
-747s
-institutionalised
-knockin
-ullswater
-hemolysis
-hobgoblins
-d'estaing
-welty
-rogersville
-betis
-upmc
-complimenting
-wam
-insect-like
-polikarpov
-best-remembered
-81.3
-dahir
-wombats
-rameau
-immanence
-ornately
-thyself
-mullally
-ulverston
-vash
-denney
-polenta
-pwr
-graveney
-tonsured
-exemplifying
-preconceived
-auxin
-polycrystalline
-disa
-glan
-extractive
-bi-lo
-walvis
-jamalpur
-butchering
-oli
-caspase
-proofing
-multics
-sacagawea
-ziv
-disconnecting
-smacks
-rockcliffe
-manning-sanders
-zariski
-1019
-shugden
-catt
-watkinson
-fleer
-halfpipe
-hydrates
-pantone
-shaves
-bunche
-kristol
-callus
-garin
-clubman
-sammarinese
-wolfenden
-jacuzzi
-ceiba
-hazaras
-ramdas
-tinplate
-zartan
-mongooses
-delroy
-monopole
-patriarca
-ultrasonography
-rbd
-unfavourably
-pastes
-psychically
-forres
-ghostwritten
-chaya
-380,000
-mobo
-26,500
-finck
-adoptees
-eirene
-ducked
-harries
-duiker
-tutelary
-timepieces
-alpharetta
-canossa
-botkin
-nhls
-poliovirus
-brera
-1159
-immingham
-mince
-maree
-abia
-verbena
-huma
-bridgford
-warm-blooded
-5/6
-hulton
-1.80
-siddiqi
-mohican
-neocortex
-co-headlining
-sideband
-digivolution
-halfling
-sadist
-pocketing
-27.30
-seki
-sharpie
-kcrw
-brogden
-ptfe
-hugues
-environmentally-friendly
-phocis
-carrol
-orono
-palindromic
-masaru
-persevere
-saint-louis
-flagships
-pluralist
-fulvia
-microscopically
-rjr
-willingdon
-145th
-0.19
-woodhull
-quickie
-bjarne
-milnes
-mahavir
-worldviews
-agnatic
-imitator
-nxe5
-wallowa
-arthroscopic
-dunks
-caecilia
-vls
-kumaran
-yantra
-fistfight
-shedd
-kyd
-student-athlete
-grappa
-adders
-malle
-todaro
-traditionalism
-paddler
-tempers
-heckled
-franco-german
-wlan
-1112
-childbearing
-silences
-manzarek
-48-yard
-solidifies
-cardin
-choiseul
-sakaguchi
-hotmail
-post-roman
-bulwell
-hutus
-summarises
-authorial
-pohnpei
-1988/89
-suzette
-bintang
-sweeny
-freenet
-newly-arrived
-liouville
-amici
-pro-communist
-pfister
-cubits
-supers
-swartzwelder
-cotangent
-13.10
-non-conference
-gatting
-texian
-audra
-scrubbed
-saiva
-friedel
-ecommerce
-cassadine
-e.r.
-kosciusko
-serotonergic
-simony
-girlguiding
-naan
-mansard
-break-out
-gomi
-dcl
-baig
-atty.
-elastica
-tatsumi
-hemme
-latveria
-ecowas
-urartu
-simmered
-amputees
-marmora
-helmeted
-antonello
-portugese
-super-soldier
-jordanians
-1.02
-anesthesiology
-59.7
-beetham
-south-facing
-ominously
-overwhelms
-yummy
-manzil
-newscenter
-madhuri
-asheboro
-netley
-laevis
-lemberg
-t-55
-so-named
-alles
-submanifold
-hemispheric
-0.95
-omnivore
-neotropics
-cheetor
-formalizing
-leitmotif
-warders
-be'er
-peder
-rebelde
-crothers
-unadilla
-kha
-foibles
-4-dimensional
-sifting
-kier
-vibrates
-chigwell
-walsham
-roundheads
-dvina
-butz
-ritalin
-epitomised
-biggles
-64k
-wymondham
-ldv
-four-storey
-snowshoes
-foreland
-bsi
-couched
-emberizid
-retrovirus
-topsy
-bumiputra
-low-carbohydrate
-mcinerney
-linux-based
-kayo
-syzygium
-sixth-form
-josette
-muggleton
-lusitanian
-foregone
-andreotti
-preoccupations
-non-catholics
-dewayne
-hepatocytes
-seventy-three
-jatropha
-breakneck
-scrambles
-pooram
-spetsnaz
-agena
-tamales
-codice_17
-mishneh
-usair
-kirchhoff
-ynys
-v/line
-idealists
-soarers
-june/july
-vallee
-macgill
-non-equilibrium
-photocopying
-shawnees
-kinsley
-kodachrome
-tradename
-1,050
-sub-class
-beckmann
-trompe
-apace
-25.60
-27.20
-bayfront
-oughton
-gojong
-magallanes
-pembrey
-xxxvii
-seedeaters
-kokoro
-adelbert
-apoptotic
-majora
-teufel
-petrosian
-trilingual
-reacher
-filigree
-vianney
-abed
-spacex
-fended
-crotty
-boothroyd
-a-series
-lidocaine
-ebionites
-randleman
-teeter
-sira
-excretory
-undercutting
-vandegrift
-zygmunt
-1053
-battlegroup
-marci
-high-priced
-archbold
-preteen
-subverting
-dusters
-inverness-shire
-ferny
-curfews
-madmen
-dryad
-pez
-haddam
-esr
-underhand
-vikrama
-beagles
-flat-bottomed
-dimmu
-eastwick
-guna
-whately
-dharmapala
-sienkiewicz
-pottawatomie
-hexes
-entrusts
-torr
-grantees
-ehrman
-mase
-fixed-line
-19-20
-townsmen
-mussar
-langar
-1940-41
-fabre
-61.3
-pdp-8
-restrictor
-mahim
-castries
-then-prime
-smelted
-prearranged
-overpressure
-s.i.
-prideaux
-mending
-kyw
-sciatic
-6b
-km/s
-cd2
-resuscitated
-p.k.
-brawler
-algeciras
-yorks
-wijk
-vibranium
-salonica
-izmir
-korolev
-60.3
-interfraternity
-renewals
-strayhorn
-inverses
-82.6
-co-productions
-alexandrovna
-eifel
-herter
-hematology
-truest
-riffing
-maroney
-henze
-mbti
-filho
-impracticable
-annika
-craigieburn
-absecon
-.20
-tarts
-griqualand
-schieffer
-pressley
-23.40
-anastomosis
-slammiversary
-wrought-iron
-renuka
-5.10
-lohengrin
-104.2
-uncaf
-scf
-rafale
-olli
-23:00
-dished
-piel
-argentinos
-v.s.
-raou
-midden
-sorvino
-hermaphrodites
-bennetts
-sheikhs
-rossdale
-betz
-dyffryn
-princedom
-oxidised
-granulated
-pirandello
-esse
-furthers
-waca
-telecasting
-niobrara
-mastoid
-gamemaster
-manilla
-battelle
-militare
-caslon
-inbev
-ciprofloxacin
-jaffer
-lep
-mha
-natured
-poetically
-tynecastle
-palazzi
-functionals
-dharamsala
-choa
-onizuka
-pedestals
-keneally
-bohn
-exonerate
-indicus
-self-management
-grog
-konishi
-earth-2
-post-processing
-liir
-woodsworth
-bando
-findhorn
-cori
-criminalized
-handbrake
-emley
-wilmette
-kansas-nebraska
-shoegazing
-lidia
-smr
-haya
-badakhshan
-eiko
-hnlms
-violetta
-bano
-reale
-bhagalpur
-schmeling
-loe
-grandiflora
-lamellar
-sundarbans
-celje
-dob
-incubating
-arial
-postcodes
-rathfarnham
-treves
-bueng
-gidley
-rotavirus
-zook
-hisd
-klemperer
-21-14
-gyros
-palfrey
-t.h.
-13.30
-haphazardly
-trost
-conny
-hootie
-new-born
-dented
-hillocks
-framebuffer
-blomberg
-sado
-margrethe
-ecmascript
-siskin
-cogent
-non-interference
-positing
-punctures
-nif
-tebow
-mabini
-same-named
-yasuhiro
-vocs
-primula
-.17
-kingdome
-onomatopoeia
-138th
-widebody
-hijackings
-headmen
-militsiya
-three-stage
-1126
-moc
-21st-century
-mothersbaugh
-efflux
-hanns
-palmeiras
-ranieri
-114th
-stefanos
-eesti
-mugger
-toiled
-frightens
-stethoscope
-underpasses
-yarralumla
-whiteboards
-tarsal
-zor
-dubna
-boyes
-emb
-backwardness
-jondalar
-banditry
-self-organization
-agri
-donatus
-ethnonym
-sns
-valdes
-ying-jeou
-gillig
-african-born
-roskam
-goudie
-thompsons
-aewa
-u6
-chachapoyas
-recherche
-gainer
-mutagen
-slicks
-dingane
-mutters
-zaza
-161st
-puzo
-dpj
-geum
-short-line
-no-man
-nh4
-photometry
-swearing-in
-mitty
-buzzed
-wittman
-a/k/a
-punahou
-yaksha
-biola
-1992/93
-electrolux
-saluda
-ccgs
-attilio
-fretting
-re-join
-pawan
-fount
-competently
-miliband
-malabsorption
-insanely
-brahmans
-three-toed
-27,917
-benue
-wenner
-oncologist
-miseries
-decedent
-valachi
-duncombe
-averill
-alkylation
-daub
-mujahid
-addu
-suvarnabhumi
-vladikavkaz
-737-300
-hajji
-tight-knit
-bytown
-malkmus
-borgir
-britannicus
-small-arms
-attainments
-11.70
-rtn
-rtr
-pan-hellenic
-crispa
-smallish
-sather
-madrox
-capsizing
-biffy
-celestia
-mentalist
-mixmaster
-albicans
-bourse
-poma
-non-permanent
-taguig
-ribose
-crea
-schoolyard
-envisages
-howerd
-hirano
-open-top
-previn
-totenkopf
-khosla
-krofft
-non-deterministic
-bluebirds
-diab
-lingle
-remover
-sustainer
-blockheads
-populists
-tepe
-holyoake
-libertas
-portus
-lenard
-mariae
-microorganism
-anyhow
-polygyny
-radionuclides
-matzo
-kakinada
-melancholia
-celeron
-guinan
-drongo
-zygodactyl
-silvermane
-hulman
-winks
-leet
-meadowlark
-decentralisation
-manhwa
-cantabrian
-flatley
-nadler
-trypsin
-mmx
-prather
-topiary
-straight-ahead
-oxyrhynchus
-aled
-meristem
-drowsy
-tailless
-tri-service
-well-made
-pienaar
-minidisc
-asim
-surendra
-prover
-brags
-near-perfect
-bournville
-unassailable
-belding
-stenhouse
-acetyltransferase
-bpr
-seventy-fourth
-tarun
-nerf
-thumper
-1989/90
-macc
-vasile
-farces
-kermode
-anti-roll
-1941-42
-aldenham
-reductionist
-neuman
-lowbrow
-coddington
-galls
-shavuot
-chama
-shackelford
-quartic
-cupolas
-vibrio
-overheads
-aedes
-18-19
-dogfighting
-atma
-tedeschi
-175th
-vaca
-cannula
-shari'a
-pyridoxal
-heli
-severs
-2.02
-ibl
-ead
-25,833
-folketing
-jasta
-paraphrases
-chigi
-santorini
-boudin
-spasmodic
-elvish
-royalty-free
-16-yard
-jetliner
-reedbeds
-melford
-decennial
-post-race
-all-region
-aemilianus
-ruffian
-ickes
-bludgeoned
-new-york
-phenytoin
-g20
-flash-based
-wasserstein
-orkut
-kwakwaka
-lebeau
-autologous
-poppet
-carnitine
-basara
-julianna
-stopgap
-milkshakes
-jailbreak
-bakhtiar
-lexis
-tywin
-attleborough
-scleroderma
-harriott
-shaiva
-hypermedia
-rahi
-spendthrift
-personalization
-albie
-third-world
-sd-6
-wheatear
-droning
-fibroblast
-brabourne
-duleep
-naqshbandi
-affirmations
-tiptree
-zope
-beata
-proby
-mathur
-kanchanaburi
-flatiron
-canny
-latifolia
-baan
-electrotechnical
-zirconia
-carrboro
-expositor
-fgc
-b.a
-sempronius
-anti-semite
-leman
-nynaeve
-sasser
-technetium
-francisca
-rsaf
-lorenzen
-arap
-espnews
-spinors
-airshows
-faceplate
-nanna
-plaine
-whois
-12.40
-alfieri
-cygwin
-w.a.s.p.
-kamboj
-6.25
-preschoolers
-v.i.p.
-moulting
-sicyon
-dmso
-re-aired
-kole
-fumbling
-loesser
-quasi-war
-little-used
-floorball
-stocker
-esfahan
-.75
-bundesautobahn
-zhivkov
-pott
-spokespeople
-shawcross
-technische
-event-driven
-confound
-alewife
-illegible
-61.9
-candyman
-schomberg
-disinfectants
-schwartzman
-kawabata
-retracts
-telegraphed
-schule
-pallavi
-kadambas
-melas
-jumbled
-breadfruit
-expounds
-buchwald
-pontificum
-nevermore
-killaloe
-d.o.
-werther
-60.6
-hts
-wcs
-santurce
-gayoom
-cycleway
-dube
-poussin
-junoon
-ductus
-flt
-28,333
-shermans
-afan
-diarmuid
-jurgens
-stalagmites
-biomedicine
-onehunga
-maire
-screenshot
-fiddles
-westcountry
-silvestre
-northwest-southeast
-erzurum
-spiller
-aversa
-maundy
-first-aid
-ballymoney
-gfr
-ungrateful
-bowes-lyon
-ambleside
-newnan
-lio
-teahouse
-mutagenesis
-cruger
-loblaw
-instantiated
-distt
-dachshund
-monger
-immaturity
-mantellidae
-defile
-9600
-liard
-previewing
-well-studied
-inter-religious
-tevye
-tozer
-butlins
-eke
-mauthausen
-moorman
-100,000,000
-linnaean
-venuto
-chalkboard
-tachikawa
-bunty
-egil
-souk
-underoath
-ever-expanding
-hydro-lyase
-nascimento
-herbalism
-cybill
-tailor-made
-smattering
-scaramanga
-dovetail
-nhlpa
-frater
-carbo
-kyaw
-oatley
-microrna
-rthk
-hma
-dome-shaped
-realschule
-habbaniya
-mcaleese
-coexisting
-cd+dvd
-treepies
-rijksmuseum
-hilarion
-ballin
-housemaid
-xb
-teheran
-seta
-kak
-weightlessness
-al-kindi
-zn
-zr
-theophilos
-dox
-40-minute
-raa
-berge
-m.p.h.
-pge
-langella
-colo-colo
-bardi
-membrane-bound
-pericardial
-amaravati
-chace
-koen
-hominem
-gouging
-sarcoidosis
-amarnath
-shimmy
-ddu
-cremorne
-hitches
-yeshe
-kananaskis
-jair
-hemu
-rx-7
-fock
-welders
-moondance
-burnings
-level-headed
-hahnemann
-saadi
-pleura
-iannucci
-non-flying
-calcified
-garriott
-devilish
-re-enlisted
-shaik
-lamoille
-ventimiglia
-bryne
-sdh
-blackburne
-melos
-light-colored
-grand-am
-reestablishing
-indictable
-butchery
-caribe
-illam
-yeshivah
-kogan
-appellants
-stagecraft
-khanh
-theosophists
-mainmast
-buendia
-majin
-cgmp
-moondog
-i-395
-squatted
-lexisnexis
-keynsham
-pyrgos
-6,400
-mariani
-insignificance
-aub
-g.h.
-mande
-dysphagia
-criss-cross
-vibhushan
-bruns
-eccellenza
-homonymous
-beccles
-casemate
-deforested
-loni
-folksong
-self-study
-efron
-ascribing
-josey
-smut
-talbert
-galliano
-manasquan
-fussy
-pandionidae
-lycaenids
-berenger
-pitsea
-daydreaming
-b-series
-woodinville
-bushings
-meson
-istrian
-neuroscientists
-isr
-anti-french
-annamalai
-rhetorically
-57.4
-anti-americanism
-criswell
-allegiant
-londinium
-pharm
-treading
-longhouses
-schellenberg
-geier
-star-ledger
-delafield
-soricomorpha
-basseterre
-bluto
-r/c
-soldiered
-adulation
-wechsler
-gujar
-aycock
-ryuki
-sigmar
-casuals
-mosfets
-vartan
-wouter
-caravelle
-sindhia
-widowhood
-swordplay
-5-12
-icts
-l.d.
-winchelsea
-khushab
-osorio
-ziff-davis
-pro-war
-fanu
-diosdado
-heylin
-kanaan
-capiz
-geeky
-journaling
-gara
-thorson
-shiatsu
-panvel
-satanist
-unsurprising
-bushel
-cakewalk
-lariat
-madagascariensis
-tizard
-bearcat
-quapaw
-gilani
-marbella
-f-102
-83.1
-perron
-kirtan
-ctl
-guisborough
-libero
-amw
-huachuca
-bailing
-self-incrimination
-wides
-dimly
-robarts
-cwc
-clapboard
-supramolecular
-liana
-vibrators
-reselling
-82.9
-t.m.
-awl
-makuta
-evel
-ratzenberger
-short-circuit
-tiepolo
-conversant
-him/herself
-see-saw
-vint
-agglutinative
-quilted
-layover
-kaka
-commanderies
-punctuality
-marketability
-medium-size
-00:00
-killingworth
-cartwheel
-cowgirls
-feodorovna
-jamiat
-ptarmigan
-montauban
-tomes
-infirmity
-proselytism
-kherson
-shuichi
-bahman
-wyse
-preproduction
-claymation
-basestar
-four-member
-nop
-batiste
-vadis
-marah
-hplc
-svu
-genk
-erythrocyte
-disaffiliated
-leventhal
-katzman
-methyltransferases
-chungking
-bledisloe
-gratefully
-park-and-ride
-aubin
-reiji
-11-3
-geddy
-precluding
-ipr
-6.40
-celesta
-tiruchirapalli
-interchangeability
-garrow
-areala
-samuil
-palindrome
-ninh
-elden
-orhan
-cambyses
-vergara
-12a
-energizer
-waspinator
-c-band
-ahmadnagar
-n.h.
-6.10
-whitehill
-foreclosures
-contrabassoon
-tipler
-coupee
-washita
-styne
-absconded
-bfc
-box-set
-ptolemies
-leotardo
-serotypes
-aurochs
-incites
-michinoku
-re-telling
-bvi
-violette
-groan
-anti-piracy
-ved
-nagaur
-2.07
-hcv
-decalogue
-375,000
-non-credit
-d'ange
-universitario
-photomultiplier
-egg-laying
-adu
-snot
-shalmaneser
-haslemere
-gangas
-fruit-dove
-jenni
-mdp
-atx
-esta
-62.8
-dissect
-kovalainen
-ddp
-frederiksen
-rudeness
-grewal
-zal
-photoresist
-tuya
-prebble
-interrelationships
-chided
-kabbalists
-trifle
-agartala
-airtrain
-scabies
-miracleman
-vba
-koller
-teng-hui
-semi-open
-chambal
-vireos
-hotshot
-botafogo
-crosshairs
-sourav
-natty
-stumping
-monstrosity
-understeer
-imperialistic
-suffragist
-zweig
-zouk
-hemophilia
-gatlinburg
-mandolins
-katyn
-broad-leaved
-lerwick
-bb4
-rippling
-bolshoy
-adina
-vgn
-ghibelline
-casque
-rhetorician
-suvla
-21,667
-heaviness
-bellatrix
-27.10
-luapula
-147th
-imogene
-6.70
-breadwinner
-visualizations
-radiophonic
-lela
-300px
-marsala
-dyess
-hafnium
-ehr
-bhatta
-vaunted
-politkovskaya
-syphon
-gleb
-carrizo
-chuang
-hawtrey
-coachbuilders
-sniffer
-ghatak
-tannenbaum
-murton
-2004-5
-marija
-halych
-reay
-strawbridge
-e.g
-versioning
-zermatt
-humiliates
-mael
-full-forward
-hadleigh
-power-to-weight
-tahoma
-chaldeans
-k5
-foul-mouthed
-citadels
-phagocytes
-stenographer
-pinatubo
-sparkman
-stratojet
-poncho
-try-out
-3.44
-protein-protein
-camaenidae
-sandbank
-zahid
-taillight
-henlopen
-buzzy
-etihad
-shakey
-mohali
-1942-43
-demona
-1km
-schneersohn
-cornbread
-yeadon
-rockall
-yarmouk
-quandary
-ffl
-emmy-nominated
-labyrinthine
-frontbench
-cornyn
-steffi
-craved
-pooley
-lothians
-chisels
-holford
-tilbrook
-chukotka
-sundials
-g.c.
-cazenovia
-fantasio
-rives
-caetano
-neurotoxic
-masochism
-khalaf
-misusing
-15:00
-razorlight
-27.40
-coens
-resupplied
-m26
-makuuchi
-desa
-dimwitted
-alums
-ippolito
-mundial
-ursae
-countout
-bartercard
-barden
-surmises
-epicentre
-bhasa
-kirti
-sld
-wittig
-burscough
-seventy-eighth
-haku
-plutonic
-whiteface
-paediatrics
-starchy
-consett
-branko
-scrapper
-32,083
-monocacy
-vipassana
-goosen
-gardenia
-lymphocytic
-setar
-merida
-deadweight
-dustbin
-cabeza
-no2
-grizzled
-maggia
-asce
-chainsaws
-arana
-turn-around
-config
-xxxii
-farquharson
-tahu
-cur
-phenomenally
-leeuw
-conchita
-querns
-florent
-hetmanate
-automatics
-peacemaking
-eckstine
-astle
-marcuse
-adarsh
-katayama
-leptin
-dubium
-court-martialled
-aquilae
-molesey
-ahoy
-tiago
-twiggs
-brgy
-concha
-coq
-telenet
-cockerell
-wavelets
-14-13
-helge
-renegotiated
-sportscars
-aeons
-sunstone
-meccans
-canting
-ipso
-longest-lasting
-schoen
-gustafsson
-rhotic
-collies
-bosh
-msw
-tammuz
-russian-speaking
-barranquilla
-cannabinoids
-ashmolean
-intergenerational
-krab
-three-car
-13.50
-pinpointed
-text-type
-theologies
-male/female
-ferociously
-melle
-fullerenes
-81.4
-finster
-dupuy
-unwind
-rizvi
-secombe
-livesey
-static-x
-coldwell
-sackville-west
-metroad
-nunchaku
-elliston
-calarts
-67.5
-0.21
-suspenseful
-casebook
-bolitoglossa
-737-800
-broyles
-editable
-coolly
-boutsen
-estimators
-birders
-ruffled
-zand
-sah
-yili
-mopping
-misleadingly
-recaps
-boosey
-fremantlemedia
-complainants
-80.4
-joinville
-pullback
-solly
-criminalize
-sportswoman
-lah
-cedars-sinai
-rosenbloom
-kirschner
-cyp3a4
-godhood
-darkhawk
-qadi
-quacker
-reorganise
-player/manager
-simple-minded
-chronometers
-galvanised
-hartington
-solidus
-barrayar
-overdub
-mukim
-eddystone
-fehr
-anthea
-scholten
-humanely
-technology-based
-danang
-genentech
-atomics
-doggie
-brag
-broch
-jawbone
-ouroboros
-giessen
-cartilages
-transact
-floater
-ganda
-mehr
-denticles
-bionicle
-arcadius
-grinspoon
-fearlessly
-magistracy
-haberdashers
-istar
-kurz
-protonated
-bajo
-bartlesville
-spong
-rutten
-fauntleroy
-vintages
-22.30
-self-promotion
-pratinidhi
-pemigewasset
-davidian
-karnes
-repurchased
-forebrain
-tannic
-homeostatic
-tramping
-hsin
-acquiesce
-pasties
-breakpoint
-zapatista
-montford
-toothache
-radula
-2-stroke
-distemper
-seattle-tacoma
-mckeen
-havasu
-bulfinch
-plessy
-orgasms
-poliomyelitis
-artillerymen
-thwarts
-non-living
-pugachev
-seesaw
-farben
-pre-law
-stapled
-flyable
-re-christened
-mschif
-fovea
-oldcastle
-comelec
-frontend
-lentz
-lemonheads
-unpretentious
-vasilevsky
-reformulation
-crimp
-foreseeing
-nothofagus
-jessel
-by-passed
-theodosia
-free-floating
-planetoid
-uncollected
-elma
-fawley
-godwinson
-s7
-midsize
-smillie
-lampard
-walk-through
-maybank
-cherubim
-talukas
-basquiat
-katori
-5-15
-physicality
-subsidizing
-warty
-piniella
-hitching
-12000
-gentrified
-homered
-joules
-swooping
-vacuoles
-sillago
-quinte
-blumberg
-1023
-barbiturate
-realnetworks
-heysel
-r.i.p.
-balfe
-alfons
-embeddings
-alexandrine
-turturro
-latymer
-herut
-workingmen
-83.6
-nymphalidae
-melnyk
-sanitized
-spinetail
-tranche
-hideaki
-exd5
-coders
-1991-1995
-high-yield
-slf
-overconfident
-out-of-control
-thurgau
-villi
-guyot
-isopropyl
-franko
-ibni
-two-wheeled
-pick-ups
-hermite
-forsberg
-gein
-hopkinsville
-cubicles
-spider-like
-saintes
-sidewinders
-metrication
-borda
-khanda
-hoists
-hulled
-tathagatagarbha
-co-inventor
-geno
-katsura
-high-velocity
-mirada
-isomerases
-tiredness
-ibuki
-yishun
-babaji
-hsp
-self-absorbed
-daivari
-postulating
-fullerene
-liss
-flan
-sanguinis
-irritates
-contrition
-cd-i
-hydrodynamics
-piezo
-messick
-ormoc
-rights-of-way
-viareggio
-birches
-malad
-turnkey
-atlus
-v.f.d.
-newly-acquired
-geoscience
-heir-apparent
-semi-transparent
-hesperia
-caradoc
-quahog
-rnai
-schoolroom
-hildebrandt
-lewton
-underwrite
-footings
-brachiopods
-institutionalization
-folklife
-vpro
-angier
-foodstuff
-josepha
-kokhba
-countertenor
-jotaro
-weeding
-itzhak
-subdural
-lito
-remonstrance
-nephthys
-belgic
-lucidity
-university-level
-zengi
-billabong
-walkover
-bowsprit
-catdog
-starfighters
-single-use
-formannskapsdistrikt
-kerkorian
-ipecac
-miraj
-hamline
-kohala
-hegemon
-conaway
-trigonal
-62.3
-carnaby
-ningxia
-viera
-shrove
-lanthanum
-77,000
-rubberized
-indraprastha
-horse-riding
-class-based
-nid
-twelve-step
-strongpoints
-t20
-aon
-wataru
-abidjan
-desborough
-organum
-pvs
-bieber
-badshah
-mils
-retrenchment
-bie
-pollinate
-jitterbug
-hoagy
-high-water
-adjourn
-toolset
-wivenhoe
-socialites
-downdraft
-wilk
-pellegrini
-accuweather
-winfs
-carbonation
-guanzhong
-oradea
-ebs
-pfaff
-escalade
-yoweri
-raper
-ottonian
-neilston
-speciosa
-cfrb
-deliberating
-devgan
-venlo
-inti
-newkirk
-rnzn
-pokerstars
-infuriating
-jacaranda
-gatley
-lothrop
-pasteurized
-liebknecht
-conjugations
-aldol
-oboro
-lemaire
-26.60
-lacquered
-diskette
-passerby
-stardock
-paavo
-dado
-lorber
-liutprand
-appling
-kunduz
-sasa
-rhona
-kcbs
-cpo
-ferenc
-ptl
-honourary
-elmont
-gluteal
-human-made
-raheem
-asma
-suborders
-imperfectly
-smersh
-cretans
-cortisone
-daksha
-balloch
-catiline
-skaneateles
-nightwatchman
-unconstitutionally
-performative
-kbc
-28.70
-'96
-ainsley
-lgb
-friedan
-dragonball
-preity
-hansson
-ufcw
-t-bar
-tonka
-lukin
-sbr
-sbb
-wrightsville
-ordinaries
-littlest
-12-1
-hauge
-exner
-alg
-mangold
-ottawa-carleton
-vestfold
-waveney
-biryani
-three-pronged
-windstorm
-8x8
-polyamory
-battlefront
-curbed
-stander
-ravindra
-pleasonton
-rdc
-hitched
-maggs
-vollenhoven
-cantref
-glo
-geylang
-yellowjacket
-nipper
-agarose
-truancy
-fermilab
-roseburg
-habitants
-1961-1962
-hosking
-naumann
-chalcedonian
-mirna
-haugesund
-stavro
-82.1
-roundhay
-.380
-gounod
-gurjar
-wurm
-muramasa
-treasonous
-byker
-pw
-cachar
-tiresome
-16.00
-kulin
-phlegm
-khama
-abaco
-giancana
-yasuo
-salween
-geriatrics
-lawmaking
-ceremoniously
-beato
-skittles
-chambly
-booz
-misrepresentations
-single-day
-dreux
-dendrite
-savini
-deller
-unstated
-ccb
-morgen
-gardiners
-talwar
-frazee
-cinna
-thermae
-tfr
-b-3
-gambela
-opportunism
-gametap
-well-ordered
-jewelled
-luma
-gcvo
-debarge
-loitering
-calwyn
-0.65
-DGDG.DGDGDGDGDGDGDG
-lhc
-gazelles
-frankness
-stutz
-ashi
-emberizids
-b'rith
-enlightening
-pangea
-raad
-jusuf
-masur
-photocopy
-stunticons
-zimbalist
-thu
-3-door
-occluded
-bronwyn
-spinney
-three-pointers
-maltings
-money-making
-facile
-pentameter
-mandriva
-nanobots
-mayen
-badham
-dhani
-hanifa
-inter-governmental
-epoxide
-daba
-franklins
-sustainably
-acela
-faggot
-shavings
-preslav
-wenger
-mortain
-re-form
-housewares
-chamba
-tomar
-riversdale
-m48
-formula_78
-upshaw
-long-eared
-krom
-superchargers
-wau
-delineating
-agouti
-savalas
-mounth
-nitty
-arawn
-near-infrared
-bohra
-salesians
-texting
-bankura
-ipsilateral
-rinsed
-polymerases
-'07
-trooping
-crocidura
-hypervisor
-underpin
-lts
-plainchant
-loney
-35,417
-mdm
-blackmailer
-62.6
-davion
-kamaraj
-megalith
-leoben
-maintainability
-midwood
-3aw
-sikorski
-evens
-hornblende
-outselling
-singlish
-gethsemane
-prophesies
-vacuole
-brasco
-27.70
-caul
-merwin
-redeemers
-belting
-rainiers
-d7
-notational
-formicidae
-hasbrouck
-queeg
-forst
-ruthenians
-formula_81
-tengah
-piatt
-roundhead
-copolymer
-alloway
-honk
-auroral
-ponty
-adenovirus
-biondi
-gertz
-pelle
-pushrod
-dorff
-krasny
-walkerville
-kenley
-tetsu
-cilician
-mex
-ituri
-brackus
-melbourne-based
-morose
-mula
-noord-holland
-hfc
-solow
-top-class
-loamy
-floodgates
-franco-spanish
-riedel
-catlett
-screensaver
-pkr
-subdividing
-sahl
-oblongata
-llandovery
-fhwa
-ifriqiya
-scriptorium
-113th
-chivington
-t-1
-amwell
-ajanta
-mcaleer
-balbo
-clavier
-lompoc
-aponeurosis
-harpoons
-franzen
-biathlete
-salleh
-gerakan
-zimbabweans
-groen
-6.90
-hagushi
-normalised
-1931-32
-arcgis
-3700
-nou
-gluons
-tartars
-reprimands
-preconceptions
-ziyad
-acsi
-krishnadevaraya
-electro-acoustic
-endowing
-farallon
-gynecologist
-shola
-theodoret
-validates
-kaneohe
-romanum
-convoked
-muldaur
-state-based
-dancesport
-esotericism
-tenterden
-conchords
-boole
-plateaux
-gah
-menem
-ottaway
-sparhawk
-entomologists
-bojangles
-cd8
-counter-productive
-incriminate
-d.r.
-non-indian
-empresses
-swab
-leese
-busk
-nicolaas
-46.0
-tenzing
-janna
-emmy-winning
-entrusting
-nikkei
-octavo
-consejo
-assiduously
-.177
-curvy
-sahni
-73,000
-alds
-circumnavigated
-shareef
-sunoco
-matrox
-e-type
-lautenberg
-moria
-3/5
-nightingales
-ascorbate
-schreck
-quaid-e-azam
-28.60
-francolins
-giyorgis
-trunking
-boner
-rajeev
-mummification
-risa
-choreograph
-amuso
-sittin
-foramina
-suffragettes
-republica
-fonzie
-mullica
-matilde
-imprimatur
-silencers
-assi
-volante
-sfs
-oberleutnant
-davisville
-laski
-tugboats
-zoran
-five-game
-prk
-desdiv
-1979-1981
-invective
-everwood
-six-string
-redbook
-klum
-hebridean
-avoidable
-smi
-zords
-precognitive
-andreev
-summerfest
-jaffee
-12.10
-drug-free
-tsi
-epl
-in-school
-cyberman
-asgardians
-malina
-bask
-soliton
-uninfected
-brassy
-blakemore
-artform
-merriweather
-lamberto
-sapient
-naipaul
-ganon
-chiao
-crestview
-kutaisi
-2000-2006
-1009
-i-4
-wertheim
-doron
-pellew
-coffees
-upchurch
-hauteville
-takaya
-chanakya
-song-and-dance
-polypeptides
-eosinophilic
-clavell
-dimples
-personifications
-carom
-gummy
-anguished
-biblically
-savin
-hatebreed
-topi
-liqueurs
-uvb
-namboothiri
-1074
-kakar
-barta
-fsv
-vincents
-ghirlandaio
-pardubice
-tabora
-titicaca
-skardu
-athy
-healings
-macomber
-xenomania
-praenomen
-phillipe
-ebsen
-yvan
-morwell
-ballance
-brassicaceae
-tambor
-constantino
-belov
-calthorpe
-2010s
-malda
-yoichi
-provan
-compulsorily
-pull-out
-symeon
-spheroid
-mosconi
-geena
-25.90
-goldfinch
-sheard
-chilperic
-cxd4
-rufinus
-appin
-veneti
-summum
-curbside
-jamnagar
-lahey
-mid-2002
-chondroitin
-subsisted
-moorfields
-trumbo
-1963-1964
-kathiawar
-anti-catholicism
-odissi
-sungei
-washingtonian
-tie-break
-catalysed
-srivastava
-chanticleer
-synergies
-ballade
-lovecraftian
-3,900
-harlot
-cfi
-chromate
-dnd
-derrike
-1570s
-vestment
-misconstrued
-weinstock
-tilney
-tyagi
-highfields
-mensch
-earldoms
-chronologies
-tenafly
-4-way
-rincewind
-add-in
-zarak
-tiglath-pileser
-puy
-bywater
-organa
-simmer
-peroxides
-balustrades
-1984-1986
-chiming
-systematized
-baptista
-mineralized
-hellyer
-wasco
-20,417
-hawkshaw
-cornett
-dupin
-bennion
-givat
-de-emphasized
-full-blooded
-reprieved
-cincinnatus
-superscalar
-porus
-arctica
-mayon
-medicated
-agriculturalists
-disclaimers
-isospin
-feds
-cotentin
-courier-journal
-stoa
-arbil
-kinloss
-repressing
-miskito
-ki-moon
-magne
-nosy
-lamotta
-vernier
-seventy-seventh
-crom
-stichting
-all-steel
-tallapoosa
-self-publishing
-agnostics
-k.v.
-siler
-kasay
-krio
-steamy
-kawashima
-superwoman
-anju
-ammann
-badan
-kersey
-amdahl
-1985-1987
-robina
-neave
-vergil
-trnc
-mahdist
-structuralist
-late-1960s
-n-terminus
-acyl-coa
-cowbell
-1-a
-schlafly
-cerebrovascular
-d.b.
-racemes
-thibodaux
-brunet
-ishihara
-tsuburaya
-stroked
-pastrana
-stubble
-purpurea
-northup
-edinger
-1997-2002
-botts
-1094
-multi-talented
-estevan
-theismann
-tupi
-supremacists
-greek-catholic
-petrova
-ilc
-decommissionings
-akito
-liane
-phosgene
-accrual
-82.3
-galland
-highly-rated
-cala
-spheroidal
-half-day
-uncial
-bachata
-claustrophobia
-ataru
-killin
-conf
-149th
-qutub
-adua
-molting
-cathartic
-ayyub
-caracciolo
-z/os
-powerglide
-visors
-soni
-tutte
-prepayment
-ironsides
-recirculation
-seven-time
-borisov
-broadcom
-gudrun
-synodical
-bavarians
-two-tiered
-kalimpong
-mid-eighteenth
-51s
-chuckle
-monoculture
-togolese
-candela
-slow-motion
-astronema
-bluesbreakers
-roadsters
-monnet
-hunyadi
-bewick
-thiago
-eschenbach
-1,200,000
-justina
-dcm
-rianz
-self-indulgent
-bundesrat
-saxe-meiningen
-whitty
-stubblefield
-time-based
-wickersham
-+13
-champaign-urbana
-mwanawasa
-rpa
-waris
-glamis
-dors
-input-output
-microglia
-alhaji
-kcal
-tragicomedy
-wcco
-cupar
-fosdick
-c.l.
-abnett
-bhoja
-allophone
-re-evaluated
-organizationally
-moorpark
-witley
-schecter
-malar
-arik
-subba
-64.8
-pacifics
-pharaonic
-phaedrus
-dance-oriented
-armagnac
-betelgeuse
-minimised
-tann
-dunking
-2008/9
-marshmallows
-3-space
-fast-track
-armes
-patrolmen
-gurdwaras
-cantankerous
-soba
-off-guard
-appenzell
-tormentor
-islas
-scheele
-parliament-funkadelic
-bioremediation
-pickler
-saccharin
-investigatory
-hapgood
-statics
-drugstores
-formula_79
-koestler
-conjunctiva
-clownfish
-fuming
-edirne
-lor
-hicksville
-kas
-lagash
-roxborough
-coghill
-farmhand
-ramage
-semicolon
-zena
-lydbrook
-vaillant
-energetics
-ushl
-biruni
-stradivari
-slaughterhouses
-austereo
-4,100
-wasabi
-burro
-champaran
-barghouti
-renin
-pulakesi
-reissuing
-poon
-somare
-80-90
-lilia
-pipelining
-hmmwv
-low-priced
-downton
-revenant
-fnc
-braved
-histadrut
-grassi
-pentonville
-krefeld
-72.5
-1986/87
-49-yard
-frond
-schumpeter
-frazetta
-karmal
-satirically
-acquit
-prinze
-devaney
-legislations
-gazala
-multilayer
-crofters
-formula_82
-reconstructionism
-shizuru
-nefaria
-uprooting
-aromatics
-pop3
-spa-francorchamps
-davidians
-feted
-luxuriant
-zoidberg
-dany
-1047
-pro-
-waterstone
-ring-shaped
-wollaton
-peremptory
-machi
-chivalrous
-bru
-inger
-clefts
-disarms
-gaffe
-whiley
-autun
-newsted
-70.3
-m.r.
-ileana
-fastpitch
-i-78
-sanda
-hyraxes
-sunrunner
-anti-immigration
-rustin
-narciso
-dvaita
-hocus
-comorbid
-70mm
-phylogenetically
-altenkirchen
-aristobulus
-tattersall
-glutamic
-campi
-orndorff
-monopoles
-circumflex
-watermen
-halving
-indiscretions
-bronchi
-vegetal
-merzbow
-u.s.-mexico
-clintons
-arto
-spall
-nacl
-marsa
-stilton
-newly-appointed
-elopement
-ishi
-ellroy
-spiritualists
-subantarctic
-oxalic
-soler
-overactive
-lustrous
-phages
-2007/8
-mfs
-blanketed
-helsingborg
-miscible
-saturnalia
-camelback
-rone
-wheelhouse
-invitee
-harvin
-sloughs
-maddow
-minimax
-mid-1880s
-smarty
-determiners
-l-3
-handiwork
-connectives
-redrawing
-golani
-festschrift
-osseo
-deist
-yagna
-defiled
-mcminn
-chokeslam
-13-10
-lower-income
-biannually
-femina
-wdr
-archosaurs
-yann
-swerved
-copp
-neoplastic
-apopka
-kawamura
-seekonk
-indulges
-baratheon
-lorca
-sundae
-perlmutter
-designators
-prodigies
-miscalculation
-20,500
-harrer
-mireille
-pitjantjatjara
-82.2
-vesicular
-50:50
-usat
-lowcountry
-tajuddin
-ambit
-v.v.
-rabb
-vroom
-awardees
-sturdier
-zapruder
-jonesville
-buttoned
-hornbeam
-e-mailed
-summerlin
-optometrist
-pastore
-hmi
-sandry
-rumen
-hatti
-pennyroyal
-shimoda
-flammability
-conspires
-nieves
-boog
-samudra
-kumaratunga
-on-orbit
-chay
-hobhouse
-import/export
-rishikas
-jad
-giardia
-gutsy
-foxboro
-distaff
-malpas
-002
-fara
-ubaldo
-therion
-weightless
-karsh
-garonne
-bearskin
-hadji
-shippensburg
-pepi
-maclellan
-rapoport
-mezzogiorno
-chambering
-prunella
-cudworth
-benedictus
-kahuna
-eek
-sub-series
-unconquered
-solicits
-1065
-welsh-speaking
-phosphide
-antonina
-tardive
-pires
-oksana
-coloma
-all-nba
-taxonomically
-discouragement
-foreshadow
-telefunken
-wrasse
-vanna
-105.0
-haast
-bunter
-wbca
-schnitzler
-pontevedra
-streator
-criss-crossed
-feuer
-pupal
-efe
-20-40
-mazhar
-corrino
-DGDGDG.DGDGDGDG
-l.l.
-ilokano
-wallerstein
-lovitz
-cartmel
-fashioning
-low-voltage
-hori
-hrw
-airfoils
-lorsch
-stroheim
-eam
-mahasabha
-nickleby
-marlboros
-hiccup
-ahly
-ldr
-rc4
-contentions
-binet
-pretenses
-teres
-bsg
-starlin
-partula
-quintessentially
-hwv
-saltonstall
-stagflation
-kagame
-improviser
-deval
-twee
-clee
-62.2
-gunge
-zurab
-10-14
-polonium
-swingarm
-shockwaves
-xan
-hersheypark
-bayne
-1013
-clyro
-regrettably
-quintilian
-wath
-idb
-laudanum
-baire
-refoundation
-leporidae
-europe-wide
-kiani
-laharl
-atomism
-dsps
-rukh
-kricfalusi
-carnap
-toombs
-rahn
-eurostat
-bowral
-calla
-orogenic
-globulin
-minix
-fetzer
-silberman
-naturism
-sks
-italian-born
-vani
-telecine
-sociolinguistics
-winnipesaukee
-basswood
-charney
-boneyard
-confocal
-populating
-analogical
-unappreciated
-calwell
-preeti
-stairwells
-octahedra
-imine
-teasdale
-bulkier
-leonov
-vernor
-cand
-raconteur
-well-funded
-spica
-parkour
-well-planned
-palghat
-pullout
-decameron
-c-141
-mangere
-loran-c
-p.w.
-man-portable
-artifice
-conacher
-re-interred
-sufficed
-ishak
-ratchathani
-fall-out
-swasey
-6,000,000
-gratis
-anti-democratic
-messalina
-prophetess
-falconbridge
-productively
-zubaydah
-blackfeet
-puru
-goodie
-balcombe
-workchoices
-nymphalid
-wilby
-impersonators
-elbridge
-disqualifying
-koz
-catena
-watton
-malvasia
-vixens
-mogami
-aceveda
-adjuncts
-sluggers
-vindicate
-briarwood
-poppin
-condell
-aeschines
-ratsiraka
-pre-empt
-inaba
-luff
-hatha
-harron
-jer
-half-sisters
-almagest
-shepley
-hunnic
-bukka
-nightwatch
-treasurers
-thirlwell
-70-80
-kimo
-nabis
-klansmen
-broomball
-nomine
-mjolnir
-on-call
-albertan
-inhaler
-dunston
-rescript
-ceratopsians
-puke
-male-only
-passiflora
-priories
-gopinath
-61.4
-lebrun
-baphomet
-lee-enfield
-garu
-cathie
-kerrville
-ninety-six
-200-300
-non-contract
-panola
-oakton
-leonese
-pepsi-cola
-foreshadows
-geocities
-guyton
-parenteral
-postnatal
-jumong
-uleb
-sumba
-imai
-sanaa
-f.m.
-cravings
-lindelof
-celebrezze
-benders
-klebold
-campana
-syllabics
-englishwoman
-rendez-vous
-studbook
-ordaining
-gristmills
-60.8
-downgrading
-53.0
-prester
-bramlett
-casca
-rebounder
-gesualdo
-hesperus
-cundinamarca
-parkins
-cripples
-russes
-gonorrhea
-avm
-cchs
-gouldman
-tomahawks
-lindros
-oingo
-hemsworth
-quarrymen
-evaluator
-sogdian
-leitner
-reamer
-exhortations
-unocal
-nondeterministic
-janitors
-sachiko
-aberavon
-biracial
-ossining
-lso
-pounce
-gnats
-xuzhou
-thylacine
-hipaa
-survivalist
-amat
-10-5
-lorrie
-florists
-columella
-peppard
-doukhobors
-nought
-moiraine
-ocasek
-fabien
-hadad
-torturer
-flagellation
-thes
-shang-chi
-uncivilized
-nerv
-mridangam
-queenborough
-tenasserim
-winn-dixie
-aqualung
-conon
-looper
-groupware
-nacht
-kosar
-jeepneys
-mansi
-safire
-co-champions
-efa
-saxe-altenburg
-nexon
-usg
-uproot
-trigon
-cuckoo-shrikes
-immortus
-14-3
-ampthill
-zetland
-adal
-mnf
-m.k.
-shadowlands
-incertae
-oldendorf
-placebo-controlled
-kat-tun
-zb
-bannered
-quik
-orangemen
-ludlum
-internationalized
-citynews
-krakatoa
-dematteis
-235u
-newly-opened
-laboriously
-windswept
-hatters
-gmo
-hagee
-buchalter
-telepresence
-disobeys
-turbos
-tyger
-longridge
-octets
-chiricahua
-lostprophets
-2,500,000
-caprivi
-ephrem
-mdt
-62.4
-victrix
-stonemasons
-defusing
-zoetrope
-mimetic
-undertakers
-binkley
-top-5
-shawinigan
-miserables
-manhua
-bambang
-h&k
-guus
-wenn
-b-cell
-bilabial
-sarong
-mannose
-hyperventilation
-sde
-spiritualized
-formula_83
-earmarks
-zephaniah
-directshow
-rexall
-self-educated
-shadrach
-fuente
-aiko
-webmail
-rip-off
-phang
-indiscretion
-irr
-absurdities
-bikeway
-demerged
-bramante
-overblown
-dermatologist
-switchgear
-broadsword
-clatsop
-odets
-busse
-misalignment
-chosin
-objectification
-simpkins
-libellulidae
-oddly-shaped
-rossall
-cohoes
-enhancers
-contravened
-travesty
-drescher
-glutinous
-accelerometer
-hurdy
-desha
-bloor-danforth
-kine
-moulana
-magdala
-klassen
-meretz
-jewishness
-cervus
-tikhonov
-catatonia
-alkylating
-trowel
-shortlived
-maidu
-quilon
-mastership
-al-nafis
-straightaway
-kickstart
-grell
-saxe
-berns
-manama
-aromatase
-cva
-ordeals
-r&aw
-spinifex
-jordin
-southern-most
-medeski
-dessie
-mmhg
-holinshed
-inara
-deniliquin
-khilafat
-provocateur
-emmitsburg
-methven
-rutile
-claydon
-kock
-shuang
-londoner
-bedard
-nexstar
-poore
-gza
-6.60
-maggio
-pigface
-conversos
-xanthine
-350cc
-nutritionist
-big-game
-outpatients
-reeled
-dfes
-lahar
-isonzo
-neoplatonic
-33,333
-ory
-cross-shaped
-benner
-timoshenko
-eventuated
-4-8-4
-narc
-custom-designed
-sourcebooks
-mattias
-trafficker
-on-the-air
-viburnum
-cde
-meshuggah
-cerium
-6,600
-stromal
-nighy
-atoka
-silhouetted
-stendal
-lampoons
-maloof
-mizzen
-molino
-infilled
-african-eurasian
-kirishima
-fengtian
-oddparents
-recognizably
-high-income
-softcover
-bulwer-lytton
-katholieke
-kazuma
-nepheline
-cheeseburger
-self-consciously
-daken
-pinelands
-hinde
-semi-formal
-carrom
-veni
-bezirk
-kistler
-truant
-longhurst
-bissett
-moel
-vanilli
-kapurthala
-bradlee
-zai
-exhalation
-adem
-yojimbo
-peridotite
-boozer
-lahaina
-cendant
-reaves
-zaria
-pharcyde
-beelzebub
-ladin
-monta
-neuromancer
-lazare
-after-hours
-flukes
-bolsa
-bg7
-orphnoch
-postural
-bandana
-articulatory
-beothuk
-secessionists
-rohde
-borger
-micrometre
-dukie
-5.40
-self-knowledge
-preserver
-spaatz
-oort
-2,250
-ruppelt
-contrarian
-outdoorsman
-jiancheng
-eight-day
-hubbert
-pantai
-chatwin
-superstring
-1992-1996
-fibrinogen
-kiplinger
-mangala
-kingside
-anglo-german
-enamelled
-phy
-manik
-fogo
-poppers
-canavese
-nhat
-suds
-64.7
-westenra
-s/n
-autocrat
-coweta
-whopper
-legaspi
-phonation
-12-pounder
-wisp
-boso
-diderot
-unruh
-hardwired
-puncher
-nahar
-16-valve
-oshii
-tautology
-low-intensity
-9500
-daa520
-pesky
-four-engine
-mitscher
-claidi
-fidei
-yawkey
-quetzals
-daa
-blagoevgrad
-ferretti
-specially-designed
-vex
-margolin
-joiners
-derivational
-etisalat
-emba
-motilal
-maskelyne
-kountze
-67.3
-ill-prepared
-lajos
-gazi
-consiglio
-zaporizhia
-shapira
-y-axis
-nine-hole
-synthetics
-mapam
-mullard
-corson
-antagonize
-collinsworth
-coffeyville
-dispensaries
-devan
-spiderhunters
-malthusian
-metahumans
-ivano-frankivsk
-radyr
-indah
-imperceptible
-purveyors
-maly
-hornish
-guidry
-craftspeople
-mandelbaum
-southlake
-pinna
-groundhogs
-al-arian
-westphal
-a-3
-128k
-undeserved
-megastore
-pained
-kenmare
-maughan
-celyn
-whereof
-menashe
-danni
-indesign
-genova
-satori
-hypocrites
-olaus
-naropa
-o'hearn
-ipkf
-22.60
-vana
-boudica
-teesdale
-miyako
-hillenburg
-unmik
-lemoyne
-tripe
-habra
-norseman
-hydrocodone
-chippewas
-sleepover
-132nd
-corpo
-preservationist
-keo
-quadrophenia
-itis
-zapf
-taunus
-lipan
-amniotes
-ammunitions
-concussions
-sargasso
-court-appointed
-liquified
-idolizes
-chandrika
-peeps
-wayfarer
-wasdale
-unplaced
-boastful
-contrastive
-c&a
-feedstocks
-ohs
-siegelman
-keirsey
-klotz
-alternators
-nms
-wellspring
-hamirpur
-sorbitol
-0-7
-milch
-cruella
-thabit
-57.2
-butterley
-eastlink
-metuchen
-noni
-macadamia
-dzong
-three-fifths
-listeria
-re1
-1.33
-liha
-sublingual
-silverwood
-goltz
-sterns
-illus
-gillick
-1550s
-aggravation
-leaside
-gluten-free
-chapeltown
-deporting
-lebowski
-sports-related
-accordions
-red-green
-tork
-energize
-sulids
-spirito
-toomer
-opencast
-evangelizing
-airdrop
-kates
-milius
-kalu
-dyers
-fjordane
-liliuokalani
-cd&v
-benford
-non-playable
-capelli
-psychotherapists
-isostatic
-howley
-mariamman
-fold-out
-synthesizes
-first-season
-pett
-amitabha
-vetus
-christian-democratic
-gretton
-dicta
-giggling
-frailty
-norwest
-franchitti
-calvino
-khoikhoi
-enon
-tagg
-ryker
-yakutsk
-dispensationalism
-msr
-zouaves
-barbarella
-launder
-warbeck
-23.10
-conversed
-predynastic
-antara
-bjork
-three-disc
-pmid
-ekaterinburg
-hand-operated
-make-a-wish
-sidemen
-borobudur
-savai'i
-chivers
-erol
-kadavu
-salat
-ormandy
-awu
-sarita
-mccallister
-lenni
-universidade
-underperformed
-affray
-jitney
-narcissist
-gallaher
-sadomasochism
-maravich
-gath
-canonsburg
-gusty
-destabilized
-blom
-soviet/russian
-five-wicket
-jesmond
-kilmaurs
-digitalis
-bhargava
-rear-view
-broiler
-saint-cloud
-brasileiro
-voz
-colecovision
-lightbody
-gonadal
-aylesford
-counter-earth
-mignola
-13-16
-sangat
-boru
-0-4-0
-knbc
-mascagni
-chasse
-perdita
-communis
-takis
-1104
-common-sense
-dependability
-purveyor
-1.55
-4am
-paolini
-coolock
-h.a.
-caddie
-bormann
-hsa
-miyoshi
-naumburg
-bitburg
-jeu
-hurtado
-sigebert
-glaciations
-mislabeled
-video-game
-adare
-envelop
-encapsulating
-loyalsock
-salmo
-pertussis
-axing
-fetters
-exotica
-bambara
-jacque
-epigraphy
-clementina
-misread
-cabramatta
-shunts
-armee
-itsuki
-jorhat
-downplaying
-cee-lo
-woodsman
-fishbowl
-scribbled
-0.45
-obliging
-schists
-edgefield
-tarps
-khair
-qew
-hordern
-towered
-1:15
-natale
-scape
-championship-winning
-taita
-giller
-sersi
-bookmarking
-v7
-mastic
-olpc
-sapna
-deodato
-adriaen
-kleene
-rebreathers
-lofthouse
-mikveh
-socialise
-nairne
-22.70
-abbotts
-taleb
-sows
-mobbing
-righting
-smudge
-murshid
-kleinman
-woofer
-bodhidharma
-0.23
-luminary
-1941-1942
-bushwalking
-pazz
-pekan
-elc
-shined
-rewari
-crisler
-old-timers
-culhwch
-digipack
-17-7
-forgetfulness
-c.i.p.
-masuda
-endomorphism
-d.o.a.
-seance
-outflows
-carer
-courcy
-payee
-ballymore
-noakes
-hls
-beenie
-hiddink
-sydnor
-unwound
-gcd
-low-flying
-blackstock
-unappealing
-groundless
-sliema
-59.0
-revitalise
-seven-game
-ebt
-kariba
-dng
-publics
-faringdon
-machida
-hase
-evicting
-schemer
-nitschke
-coolgardie
-merivale
-counterintuitive
-tatton
-i-405
-kefalonia
-frill
-dark-haired
-ex-con
-chemotaxis
-salicylic
-dammam
-spedding
-enumerating
-vhdl
-disassemble
-campfires
-camerino
-windowed
-iacocca
-tumbles
-didion
-vliw
-82.7
-broxbourne
-biosystems
-kamm
-disinterred
-two-mile
-nisga'a
-vegf
-giddy
-genii
-hudsons
-retellings
-sef
-blocky
-lalu
-heartburn
-baccarat
-agoraphobia
-pti
-monoliths
-beastly
-castellammarese
-pedantic
-thither
-1058
-giulietta
-philandering
-nourishing
-malazan
-sidra
-gullit
-tripper
-huish
-louis-dreyfus
-smoothie
-commends
-o-ring
-blore
-p-39
-mattson
-mindbender
-rehydration
-imlay
-elision
-odd-toed
-ditty
-merkin
-centennials
-gund
-sampaguita
-mismanaged
-stockland
-jukeboxes
-yokoi
-yuchi
-ballooned
-caddies
-ivc
-kaukonen
-uu
-ovations
-aerojet
-rougeau
-ticketed
-aperiodic
-chandrapur
-kurumada
-chagas
-menteith
-erosional
-vassalage
-bakke
-one-carbon
-maxfield
-tree-like
-government-controlled
-kotli
-drakh
-paisa
-loveline
-montreal-based
-valiente
-wardlaw
-physic
-ecclesiastica
-re-mixed
-99-year
-thamesmead
-61.8
-anglo-sikh
-younghusband
-radiographs
-larus
-rostra
-pagani
-sanpete
-colquitt
-tema
-frsc
-tybalt
-joppa
-dowland
-boortz
-quickdraw
-pdm
-repopulate
-sarwar
-backstretch
-charlesworth
-s60
-unescorted
-transhumanist
-divo
-albigensian
-ku-ring-gai
-donegall
-biomarker
-five-hour
-acanthus
-midnighter
-22.50
-adivasi
-lessig
-francais
-quantifiable
-yearned
-sweatshirt
-under-21s
-figural
-toying
-yori
-graig
-expending
-eighty-eight
-miis
-cuc
-fwcc
-fonseka
-lmp2
-radicalized
-bagshot
-inverurie
-footman
-gaumont
-sortable
-rosemarie
-doughboy
-welsh-language
-blain
-dwan
-tarja
-music-related
-narcolepsy
-wends
-bandmembers
-blue-gray
-shoemakers
-tamale
-jemaine
-belfield
-darwish
-nika
-sandhu
-accruing
-missourians
-105.6
-freezers
-cro
-maculata
-naqvi
-penalize
-murasaki
-petone
-hyperlink
-npv
-0.36
-karwar
-bridleway
-floridians
-thorney
-xfree86
-schemata
-vso
-adelson
-friesen
-lud
-mcr
-mockup
-laserjet
-cryptologic
-scalping
-permanganate
-wbbm-tv
-burlap
-revolutionizing
-cyzicus
-secant
-defibrillation
-petrillo
-chandu
-hye
-kwa
-dalkey
-sakon
-christoffel
-fanon
-nayyar
-virgen
-virginiana
-orientable
-ects
-dissensions
-perpetua
-kooks
-accardo
-kavieng
-arachidonic
-anglo-egyptian
-ogata
-cocke
-yayoi
-butadiene
-icfi
-taiyo
-upwardly
-mcdade
-frimley
-saturninus
-lumens
-coase
-dc-3s
-inequity
-amaranthus
-ellis-bextor
-morioka
-constrains
-cardholder
-auctioning
-tuke
-panch
-litigant
-empathetic
-kielty
-ultra-orthodox
-goonies
-osten
-muzaffarnagar
-downturned
-crozet
-peshitta
-62.7
-lunge
-lilliput
-mindaugas
-bonhoeffer
-zin
-sadd
-zips
-moctezuma
-dog-like
-cryptographers
-jamey
-dawns
-alfaro
-dfv
-kirstie
-inside-out
-idg
-civil-military
-cully
-5600
-clerc
-zindagi
-sdo
-transitivity
-gamecock
-mombi
-diya
-superbrawl
-ahasuerus
-w2
-1084
-palaeologus
-yester
-taxus
-laurentius
-sede
-vodacom
-denmark-norway
-immolation
-emr
-2005-2009
-alkalinity
-biafran
-anscombe
-perciformes
-montages
-13.20
-simson
-apsl
-anaphora
-goodson-todman
-gladden
-post-revolutionary
-impressionable
-lifeforce
-saunderson
-mamaroneck
-quantum-mechanical
-batasang
-21-0
-unordered
-refrigerants
-hd-2
-dunums
-7-bit
-northern-most
-frugality
-herrington
-moga
-queued
-cooksey
-bullring
-tidwell
-herzen
-varietals
-schistosomiasis
-scions
-3.60
-bellarine
-107.4
-neary
-kudrow
-hartt
-matric
-non-ferrous
-dreiser
-mourner
-hesperiidae
-70.6
-four-volume
-indefensible
-ecj
-throop
-soulja
-yasht
-brazing
-tehama
-southdown
-i2
-lias
-baryon
-hassall
-sekhar
-conquistadores
-supermini
-amnon
-forfeits
-legit
-halden
-bowmen
-loria
-induct
-onam
-varnishes
-idps
-two-cd
-absa
-reticulata
-survivable
-non-believers
-ginninderra
-partook
-lyles
-lavished
-marinduque
-abrogation
-austro-asiatic
-appendectomy
-motherhouse
-telfer
-quam
-lasorda
-minhas
-trustworthiness
-struan
-subgiant
-teasers
-ilmenite
-onda
-insinuating
-ugarit
-turncoat
-katra
-ranchos
-gelug
-g.s.
-rambova
-pacman
-autozone
-movietone
-amiri
-anthemius
-gambon
-duquette
-asunder
-hofer
-maximising
-10:15
-pok
-run-d.m.c.
-moveon
-edulis
-concisely
-napkins
-bonobo
-d&c
-mey
-lamington
-skymaster
-spearmint
-juts
-openid
-outdone
-mlp
-floatplanes
-circlet
-backronym
-pan-green
-fibrils
-feilding
-vomited
-ursinus
-f-number
-maggi
-bhati
-strainer
-saint-domingue
-zamia
-podolia
-1911-12
-1983-1985
-markandeya
-pianoforte
-demonstratives
-lummis
-makerere
-chabad-lubavitch
-day/night
-hillhead
-hundley
-639-3
-third-string
-venu
-johnsen
-riina
-mind-controlled
-terabytes
-estrus
-marcelino
-big-time
-wnt
-unbranched
-talmage
-o'kane
-stealthily
-traill
-104.0
-trinian
-wizkids
-gostkowski
-bruch
-amusingly
-crafton
-burritos
-tonge
-kirat
-oogie
-defcon
-backbenchers
-plastron
-tuatara
-kitesurfing
-burnin
-arnos
-ayyubids
-seventy-one
-65.6
-shahar
-z100
-transmissible
-tsh
-serov
-poque
-passos
-businessperson
-fuhrman
-deification
-piscataqua
-next-to-last
-11-2
-ried
-105.2
-oui
-spammer
-hoang
-hydroxy
-megalodon
-lamin
-.338
-recut
-letter-writing
-marigny
-59.2
-amphorae
-yoshimoto
-ncea
-juhi
-dange
-wwe.com
-wagnerian
-osei
-secondaries
-envelopment
-binti
-jakes
-northville
-shunter
-decapitating
-satyanarayana
-yajna
-chorionic
-msd
-carlebach
-reema
-rpn
-cannington
-epica
-onstar
-pteranodon
-blackmoor
-palencia
-reassess
-stunningly
-terrazzo
-bagrat
-carburettors
-leconte
-brie
-payola
-tiridates
-gisulf
-airside
-thirty-minute
-norquay
-32,917
-glycoside
-tba
-cappadocian
-kien
-imidazole
-hendy
-kender
-pila
-bialik
-sehwag
-exasperation
-corrupts
-inouye
-secularisation
-halberd
-blacksmithing
-mip
-knee-length
-remick
-0.27
-martti
-lohmann
-pekar
-jeanine
-covergirl
-iaa
-alai
-immobility
-atheneum
-woolpack
-baptizing
-oilseeds
-noricum
-mutter
-mahaffey
-luyendyk
-eloi
-snapple
-gds
-bullpup
-rolston
-match-fixing
-outlooks
-rasht
-themyscira
-chickpeas
-ahrens
-jaffar
-dna-binding
-1540s
-throwdown
-dreamworld
-dewolf
-pargana
-bernays
-cce
-suleyman
-knud
-buttonquails
-simultaneity
-wpc
-framlingham
-plaskett
-moluccan
-dc-4
-neapolis
-isang
-mili
-cisc
-kilbirnie
-toymaker
-wgp
-achaia
-sinise
-short-run
-gargano
-jabberwocky
-fetter
-twc
-cinta
-palatka
-reordering
-aelita
-denial-of-service
-sobolev
-1041
-celie
-jarrell
-greenlight
-self-organizing
-cordoned
-kitzmiller
-suppressant
-fingerstyle
-bri
-hollings
-kobolds
-volcanos
-hoes
-dildos
-jujitsu
-drooling
-unconstrained
-gents
-tormentors
-mesic
-northerner
-evgeni
-ghassan
-cotterill
-four-song
-interest-free
-banagher
-s.f.
-pulchra
-sylmar
-clemence
-schroder
-triplane
-gordons
-matriculating
-seasiders
-aiaw
-atcc
-50mm
-pwllheli
-robustus
-prodded
-faridpur
-balancer
-nocs
-disperses
-atchafalaya
-niagara-on-the-lake
-jazzmaster
-c-3
-13.80
-gy
-grylls
-militarization
-saburo
-haultain
-shp
-x-wing
-10-round
-genericized
-buttressed
-dongle
-dzerzhinsky
-nodular
-kickin
-arona
-entravision
-ashtray
-m10
-rhythmical
-ingenue
-market-oriented
-pashons
-rabelais
-bryanston
-catkins
-qinetiq
-nannies
-oocytes
-junks
-reorientation
-bild
-incremented
-shakuntala
-bucher
-equalizing
-sob
-druidic
-misappropriated
-oita
-armley
-koppen
-remsen
-superweapon
-canale
-cross-party
-hoyer
-passe
-rumelia
-subtropics
-ewood
-sirocco
-endoderm
-thome
-vigan
-alm
-1.90
-warfighter
-edc
-statius
-combi
-underutilized
-turdus
-carmelita
-fiefdoms
-breslov
-md-80
-brith
-delonge
-gwadar
-sapien
-dowel
-belhaven
-yannis
-tailing
-h.t.
-interjection
-1,650
-odex
-wexner
-veidt
-celt
-vril
-atresia
-crossbred
-s&t
-webern
-kernan
-chaosium
-fiesole
-postlethwaite
-interzone
-sound-on-film
-vhsl
-gandhiji
-kanga
-evelina
-protoculture
-khobar
-archuleta
-liberal-progressives
-motorcars
-ogmore
-unsworth
-bestiary
-streetlights
-dockery
-laclede
-sportschannel
-seigenthaler
-valium
-macnamara
-higson
-todi
-sienese
-evidential
-stad
-alexandrina
-hardanger
-krajowa
-emissivity
-capellan
-railhawks
-apologising
-12.60
-pressburg
-ronkonkoma
-mandapam
-furio
-deaths/1
-1990-1994
-demersal
-dibrugarh
-pori
-enlarges
-5300
-u-18
-oup
-fasano
-under-represented
-ves
-al-arab
-golitsyn
-wealdstone
-1940-1941
-iskra
-allon
-cataphracts
-mudvayne
-gottfredson
-22a
-spiritus
-khiri
-tite
-hashed
-12m
-64.2
-centrolenidae
-aimless
-kutty
-kadir
-multi-coloured
-beeson
-haymes
-end-stage
-fusiform
-southbury
-attilan
-2004-present
-boogaloo
-balasore
-costliest
-crotalus
-primum
-kensuke
-metallo
-dement
-continentals
-toler
-off-off-broadway
-15m
-conventual
-aguas
-mpps
-63.3
-betti
-bidston
-hydrotherapy
-slinger
-spiracles
-headmistresses
-sota
-selsey
-vassilis
-nagapattinam
-lippmann
-no-fault
-marcellin
-vaduz
-dosa
-kalle
-briley
-townhomes
-metatron
-oilman
-neurotrophic
-thirty-year
-crouched
-lohia
-miniaturization
-katja
-christmas-themed
-hagel
-berliners
-4-stroke
-navigations
-formulary
-ethelred
-misfire
-30-50
-budo
-thorfinn
-chris-craft
-first-innings
-kurohime
-margulis
-hibernating
-bloxham
-triplett
-thessalian
-asterism
-wtvj
-tymnet
-35-year
-hackettstown
-ferlinghetti
-eddard
-poot
-sammi
-80.6
-ballo
-hampi
-doxycycline
-wetted
-3-11
-heartbeats
-kore
-neame
-althusser
-ingleby
-choreographing
-spode
-pingree
-f-150
-incredulous
-orange-yellow
-agen
-bih
-sacchi
-darrel
-burren
-solomonic
-rhenium
-chrism
-loaning
-layard
-signum
-greco-bactrian
-rabban
-pseudocode
-lemke
-naoh
-off-spinner
-13.90
-go-karts
-popularising
-hetchy
-one-eleven
-qassam
-borrelia
-hofje
-penman
-doraemon
-dagbladet
-non-intervention
-fillion
-pearsall
-artsakh
-ibt
-colli
-narwhal
-kaluga
-ochotonidae
-jagat
-oswiu
-algren
-qaida
-jillette
-rotax
-fillets
-korchnoi
-sophists
-tindall
-su-27
-courteney
-datatype
-roomed
-peptidoglycan
-peafowl
-ostrom
-urinal
-macromolecular
-chaumont
-amiel
-deltoid
-oscillates
-vyas
-chirping
-eleven-year-old
-cabarets
-stanislas
-rowhouses
-takako
-carbamazepine
-khajuraho
-baited
-committal
-prorogued
-girvan
-revivalists
-dearden
-elst
-emms
-cathodic
-vitor
-kudo
-foghat
-thatta
-timestream
-proffered
-joab
-melanocytes
-1986-1989
-iemma
-bintulu
-seato
-tmp
-3.47
-bromfield
-tiwi
-yasukuni
-topographically
-cookham
-rictor
-mossadegh
-despenser
-thorogood
-egfr
-marmots
-trutv
-hetch
-awed
-dresdner
-tomentosa
-snags
-1989-1992
-gode
-centerfire
-sru
-airlie
-ffs
-out-of-band
-raghavan
-rachis
-rachid
-generico
-musha
-thirlmere
-pennell
-riperton
-ringgit
-anti-aging
-lambourn
-padlock
-gabaa
-signoria
-oya
-skydive
-facsimiles
-dlm
-suitland
-mahalakshmi
-foragers
-hor
-low-tech
-beaneaters
-seongnam
-jihadist
-83.7
-homologue
-pathogenicity
-edenton
-tuberosity
-ducting
-isidor
-interjections
-dataflow
-zor-el
-sarojini
-three-tiered
-verisimilitude
-arman
-thunderdome
-nephrology
-sunaina
-castleknock
-longmeadow
-hullabaloo
-northlands
-ving
-r/t
-vashi
-orgies
-laserbeak
-drogue
-isar
-arbuthnott
-tug-of-war
-deasy
-woodridge
-tutt
-implicates
-american-led
-waley
-dentures
-nicopolis
-deforming
-ouellet
-larwood
-accidentals
-sequester
-mutoh
-linkletter
-luwian
-naji
-2150
-kyung
-pro-russian
-white-faced
-pteropodidae
-lysias
-stannis
-cueva
-wych
-kuhl
-scissorhands
-marmoset
-essa
-nuestro
-retried
-vilhelm
-arjen
-bwi
-batra
-raden
-stouffville
-gmb
-depreciated
-3:15
-sarath
-swindler
-mcmullan
-s.p.d.
-fuel-injected
-excommunicate
-cojuangco
-curacy
-nappy
-longifolia
-sadhus
-six-foot
-trachtenberg
-andry
-cartouche
-il-76
-last-place
-louw
-quiberon
-tarantella
-valkyries
-4096
-geniculate
-pataliputra
-oklahoman
-12-day
-gladius
-kingussie
-goschen
-bunts
-jps
-gabber
-cota
-ikarus
-webcasting
-79.5
-kootenai
-prema
-vaidya
-chiaki
-lath
-tamiami
-neckties
-humanitas
-w1
-holies
-colloids
-beel
-schley
-rosendale
-morimoto
-monophyly
-spinet
-micrograms
-longsword
-allmendinger
-makar
-scoundrels
-ev1
-self-declared
-maurus
-bioluminescence
-3166-1
-nationally-syndicated
-subgenius
-tbm
-cordilleran
-post-deployment
-contra-rotating
-vikki
-harter
-out-of-date
-starbase
-bahnhof
-conlan
-reliving
-steranko
-motherfucker
-5.80
-54.0
-nsg
-micelles
-taxol
-plantinga
-0.29
-black-backed
-shroff
-communitarian
-tahlequah
-castile-leon
-beatmania
-hiwassee
-holladay
-dslrs
-tourmaline
-synoptics
-shinnecock
-vil
-kahler
-rheinmetall
-11-14
-trastevere
-overdosed
-gallstones
-male-to-female
-double-action
-catv
-28.40
-egr
-hernias
-balu
-dressmaker
-bernanke
-sankranti
-harihara
-pert
-culberson
-swaggart
-dionysian
-grimston
-hogmanay
-costarred
-rahu
-messines
-6,300
-leva
-gantt
-hawthorns
-mishandled
-matin
-pvo
-katamari
-drood
-ebr
-bb5
-kantakouzenos
-sneha
-groats
-premolar
-macauley
-tika
-adroit
-pleats
-basheer
-ospf
-parsnip
-chi-square
-morarji
-higbee
-man-to-man
-broadstairs
-tolson
-nbc-tv
-dror
-sandoz
-tamayo
-dryburgh
-myoglobin
-simferopol
-purina
-cockfighting
-ezrin
-infix
-7200
-i-76
-three-pointer
-veganism
-v.p.
-athenry
-qrs
-aldington
-inexorable
-general-in-chief
-kanin
-award-winner
-1920-21
-kinmen
-storyville
-frigg
-wcbs-fm
-blackdown
-delph
-detmer
-casus
-impregnate
-three-year-olds
-deakins
-medleys
-shikari
-loughor
-x64
-piecing
-l.l.c.
-neches
-murti
-1.23
-gv
-innately
-harish
-1921-22
-wilmore
-syllabi
-oled
-quintero
-flaminia
-doshi
-pongo
-aryabhata
-papilla
-pridi
-panhard
-interlacing
-immunosuppression
-mantles
-menninger
-sisler
-lenawee
-rhoades
-moche
-headscarf
-issei
-crashers
-neuse
-dimeo
--25
-rivalling
-iyers
-philanthropies
-asthmatic
-gobies
-tiswas
-gi-tae
-gnarls
-exterminating
-jove
-inquisitors
-ronnies
-bonobos
-3-time
-villafranca
-lenovo
-13.60
-stealer
-katong
-thermo
-decorates
-ashokan
-gimbutas
-licencing
-gender-specific
-catalpa
-yakubu
-tibi
-kumbha
-terkel
-hard-nosed
-12-2
-non-degree
-bangui
-cetshwayo
-slackers
-chieftainship
-dentary
-alc
-chillout
-devoto
-sexsmith
-reprogram
-tallmadge
-colubrid
-bda
-12-bit
-voiding
-exhorting
-arsacid
-frears
-pontificalis
-1861-1865
-i-695
-mlk
-ianthe
-virginis
-rhuddlan
-delegating
-shug
-penshurst
-ljungberg
-thetan
-federici
-moench
-16-0
-auteur
-reworkings
-idealization
-mahakali
-midnite
-countercultural
-lip-synching
-16-19
-greenest
-rida
-snm
-rectifying
-multi-modal
-ceratopsian
-gadag
-spouts
-droll
-primitivism
-tardy
-raposo
-christadelphian
-bisphosphate
-lickey
-vorlon
-novela
-bimal
-parlayed
-kaki
-irem
-prioress
-dharmapuri
-daubert
-pedalboard
-boronia
-polyphenols
-deathlok
-gardnerian
-asami
-ranulph
-mbh
-hot-tempered
-close-in
-denominators
-connellsville
-ugaritic
-stargazer
-musburger
-ps1
-boston-area
-sebelius
-neoptolemus
-58.6
-publicists
-croxton
-dubin
-cp24
-danmark
-mmi
-bartenders
-eysenck
-deniz
-avaya
-self-assembly
-renate
-alighting
-wjbk
-whimsy
-stribling
-off-again
-chums
-overwrite
-whittled
-28.50
-kamara
-playset
-crowninshield
-tremonti
-overestimate
-iridescence
-occultists
-ahluwalia
-saraswathi
-mariology
-currys
-bpc
-confluent
-portales
-prepositional
-foresees
-bamburgh
-ilbo
-zemun
-re-evaluate
-64.4
-mauri
-sickle-cell
-xenopus
-pulping
-rupturing
-gumby
-regimented
-preece
-hallamshire
-kamiya
-applauding
-third-team
-remedios
-icac
-ninety-eight
-fini
-academe
-tavis
-goldar
-dilfer
-white-browed
-hesh
-weight-loss
-menteri
--0
-cahuilla
-onlf
-khl
-kilian
-maximin
-ridgemont
-re-signing
-bossi
-z2
-obliterating
-mazinkaiser
-intermingling
-third-year
-fso
-fss
-ia-32
-konya
-breakdance
-adios
-5,100
-woolton
-grimsley
-stepsisters
-guillemots
-razi
-5d
-gtx
-cladistics
-regencies
-tahsil
-copolymers
-moorefield
-chemins
-eridani
-nosebleed
-gibran
-clarifications
-asiatic-pacific
-siltation
-jci
-whisk
-loincloth
-mesmerized
-wrathful
-325th
-inna
-pre-dawn
-kiely
-teleprinter
-malia
-stallard
-ludwigshafen
-kimberlite
-vallenato
-invalidating
-rhodope
-doting
-kristof
-atlantics
-pekka
-vikernes
-non-german
-blaydon
-5-minute
-eurasians
-utensil
-wolfriders
-mcvay
-annadurai
-monday-saturday
-jeroboam
-moveon.org
-depletes
-ill-treatment
-unheralded
-reconnoiter
-ashleigh
-dq2
-dizon
-sitio
-glyndwr
-turbojets
-kossoff
-mudgee
-iro
-keitaro
-masefield
-redmen
-dinos
-sahadeva
-eyal
-kaziranga
-sprinted
-jarry
-olympias
-tallon
-chekov
-low-risk
-perri
-mcneal
-reprehensible
-deschamps
-kahneman
-gwendoline
-jacobo
-a45
-xindi
-end-of-year
-whiptail
-philibert
-decorah
-polski
-6-man
-schwarzenberg
-naral
-ohc
-9x19mm
-low-fat
-pupillary
-instantiation
-clutha
-greenhalgh
-hi-5
-megazine
-raikes
-cryptographer
-rup
-refugio
-tigran
-anopheles
-yasmine
-nsaid
-letcher
-videographer
-piedra
-plebiscites
-weissman
-yagami
-well-informed
-pato
-glucosamine
-madhubala
-darragh
-triggerfish
-labyrinths
-tuppence
-ambika
-qm
-match-winning
-strassburg
-jarno
-doldrums
-complex-valued
-barbee
-virchow
-deejays
-peleus
-aguila
-wews
-24,583
-shakespearian
-basten
-hohner
-shango
-explication
-gioacchino
-kodama
-malka
-risk-taking
-fuselages
-azadi
-dietitians
-gadolinium
-zeenat
-banged
-ranelagh
-belgae
-riskier
-geographies
-bowmanville
-killigrew
-twr
-loewy
-banneker
-aurigae
-miccosukee
-proctors
-delaval
-papillomavirus
-benzoic
-conakry
-np-hard
-nannini
-rock/metal
-yuya
-relegations
-wilburys
-gododdin
-ztt
-baghdadi
-arul
-gurren
-milia
-gujral
-novato
-1943-44
-mlf
-1092
-tlb
-yuppie
-farsley
-ayyappan
-fore-and-aft
-prosecutorial
-rotund
-jtf
-eliott
-abetted
-valis
-2000-present
-mccullum
-photorealistic
-swr
-penna
-k.a.
-morne
-sparx
-borah
-volatiles
-medians
-vinh
-x51mm
-protagoras
-fem
-cricinfo
-maff
-x.org
-atelopus
-tyranids
-red-eyed
-norberto
-courbet
-ccha
-zadok
-left-winger
-hellcats
-suet
-severo
-c.g.
-acanthaceae
-rapaport
-inverts
-sayreville
-mid-60s
-byars
-lvov
-in-jokes
-6.20
-gynt
-alkane
-circumvention
-kashyap
-gnc
-pearling
-rocor
-carronades
-charleson
-intercalary
-eckankar
-danaus
-wendi
-saward
-bookchin
-r10
-snakebite
-maccallum
-tarzi
-unfunded
-lupertazzi
-cygni
-ashbrook
-potty
-ezio
-sydow
-22.40
-hcp
-birchall
-nakhichevan
-raas
-falsifiable
-anti-masonic
-leishmaniasis
-wide-open
-bindi
-aragonite
-lapua
-reflagged
-serhiy
-policyholders
-nerc
-robed
-skeid
-vaccaro
-wich
-yona
-bankroll
-counterexamples
-agis
-turlock
-halve
-vazquez
-capitalistic
-bleek
-confirmations
-anjouan
-spreader
-mantapa
-alarmingly
-tzedek
-lindquist
-eventuate
-vco
-humbuckers
-arminianism
-tyagaraja
-virion
-layperson
-bagua
-mns
-pacaf
-doniphan
-karlovy
-crock
-+9
-wilhelmine
-full-grown
-xj
-81.5
-hosmer
-wheal
-sooke
-vasai
-ncb
-extracurriculars
-morgentaler
-lecithin
-spp.
-outgassing
-fava
-clydach
-13-1
-rattrap
-premonstratensian
-auctioneers
-dispassionate
-kobzar
-inductively
-satay
-north-facing
-0.20
-copperplate
-bakhtin
-oculomotor
-malevich
-scooping
-canaris
-volkoff
-fujimoto
-balrog
-v.a.
-millan
-manoeuvrability
-msdn
-8-year
-electro-magnetic
-hastening
-crb
-ath
-umist
-afterschool
-terroir
-banovina
-helvellyn
-re-registered
-400-series
-calabrian
-calverton
-1927-28
-lacustrine
-tudo
-sealants
-red-legged
-influenzae
-oolong
-kani
-morphogenetic
-grudges
-maac
-halland
-canker
-air-launched
-carboxylase
-deadbeat
-olimpico
-bbses
-comedy/drama
-semi-major
-1129
-beholden
-moh
-single-point
-springburn
-utsav
-elitserien
-startle
-wchl
-sanilac
-abhorrent
-labasa
-wwdc
-parlours
-skidded
-bracton
-dunciad
-castigated
-almoravids
-soars
-re-classified
-discontinuance
-asv
-panto
-iru
-danske
-fossey
-durrant
-mahalia
-amado
-intramurals
-packhorse
-chol
-wharfs
-skelmersdale
-auditor-general
-bunshichi
-iri
-clavichord
-loraine
-candlesticks
-servings
-fitton
-kem
-noori
-sublabel
-roon
-mando
-eradicator
-comunista
-newtons
-nereus
-4-star
-cupressus
-minchin
-morphos
-aho
-ardsley
-dahlonega
-sot
-snatcher
-blowhole
-sele
-aoba
-najran
-ettinger
-erhu
-5.70
-charybdis
-versicolor
-chae
-ice-covered
-6s
-ncsoft
-herger
-kwang
-instinctual
-nunatak
-leverages
-michilimackinac
-pederasty
-ream
-5/8
-nettleton
-skyhook
-buckaroos
-84.0
-solange
-typifies
-upp
-vfx
-two-in-one
-warman
-samu
-bergen-belsen
-mujahedin
-fragilis
-archean
-mongoloid
-interfaced
-glagolitic
-altamaha
-deinonychus
-turbid
-3.48
-pitino
-ilves
-u14
-tsukasa
-256.0
-wampum
-manicured
-nuvolari
-westway
-foot-and-mouth
-kneecap
-bartleby
-yoshinori
-dogpatch
-carinthian
-atopic
-drukpa
-adyghe
-leeton
-v-e
-baddiel
-outlast
-samanids
-asuncion
-hiligaynon
-ahoms
-arrl
-hedberg
-schenkel
-metropolitana
-diffracted
-fushimi
-m.v.
-niamh
-p.f.
-wags
-be2
-buckmaster
-aldine
-orth
-millis
-stenhousemuir
-reimagining
-avrohom
-dichromate
-pou
-plunge-dive
-tgs
-lochner
-locomotive-hauled
-mostel
-gubbio
-fluoroquinolone
-esalen
-bassi
-dawgs
-war-related
-deathbird
-aetolian
-agt
-curds
-matriculate
-mannie
-webbe
-instars
-ashina
-corder
-taxonomies
-acbl
-dolerite
-blindside
-mcinnis
-buttered
-iberville
-shofar
-kaeng
-swt
-urim
-sifu
-biodynamic
-indulgent
-naac
-25,500
-choroid
-tethers
-metrobuses
-motte-and-bailey
-haverstraw
-dai-gurren
-cahir
-ducklings
-ejb
-nuu-chah-nulth
-mxr
-syndicalism
-1981-1983
-libration
-veen
-birstall
-teymourtash
-unrounded
-gyanendra
-voe
-eyepatch
-dextromethorphan
-bracebridge
-good-hearted
-folksinger
-high-precision
-inter-university
-acls
-weare
-otahuhu
-pinkham
-canibus
-equitation
-wiebe
-11:15
-s-adenosyl
-smaller-scale
-hideously
-lesueur
-rulebooks
-lampreys
-lyla
-extroverted
-moos
-visby
-twentynine
-gabel
-lockjaw
-wasl
-greenstreet
-shoplifter
-syne
-one-by-one
-raat
-rundstedt
-325,000
-goldmark
-soir
-manisha
-arduin
-lue
-life-style
-h0
-harrisville
-bigler
-chamoun
-befallen
-tedd
-nlaka
-thinnest
-aang
-esen
-underpaid
-taejo
-kiffin
-septimal
-kitna
-14.10
-goldmoon
-civ
-lairds
-rothermere
-non-motorized
-disconnects
-conveyors
-sectorisation
-j-league
-stoichiometry
-butanol
-barty
-anaya
-mullaney
-frodsham
-hot-air
-midlife
-azadegan
-peafowls
-boya
-paperboard
-time-varying
-hispano
-venturer
-glay
-subpopulations
-fergal
-ruslana
-avatara
-elswick
-lamer
-mutes
-a.m
-fraxinus
-shtetl
-bee-eater
-relegate
-1986-1990
-1947-1948
-aecl
-enabler
-jazz-influenced
-neuwirth
-80.5
-cust
-brittas
-a500
-durazzo
-dhugal
-alcester
-laughingthrush
-wintry
-turnham
-krohn
-kinniku
-sapindaceae
-kisangani
-mon-khmer
-sabatini
-siempre
-pucca
-sudhir
-ossified
-grandsire
-h&r
-66.2
-burrough
-limber
-orochimaru
-ejective
-loxton
-29,167
-csb
-second-line
-arcade-style
-patani
-electress
-sdc
-haswell
-yegor
-pankow
-timesharing
-best-loved
-barger
-toothpick
-1043
-calley
-interpolating
-fuscus
-brigadoon
-agito
-gassing
-mid-1850s
-orthostatic
-silurians
-isma'il
-krycek
-parenti
-springbrook
-503rd
-shakir
-y1
-ddr2
-106.2
-growler
-hertzog
-imperiled
-cassiodorus
-sumpter
-regt
-underdevelopment
-beggs
-harrold
-blithe
-sno
-tomino
-co-regent
-dasein
-forking
-d-ca
-pushpa
-madoff
-yari
-elephantine
-abounding
-tumkur
-noma
-makhdoom
-rosella
-mousehole
-divinatory
-half-plane
-1,3
-damselflies
-condensers
-lemar
-public/private
-pincher
-banjul
-ruffo
-mohsin
-barbecued
-nephritis
-diamante
-sev
-swanwick
-talpur
-challoner
-unflinching
-arikara
-shepherded
-effendi
-chertoff
-zond
-logbook
-pornographers
-doody
-shanghainese
-gj
-dyna
-hearers
-calamitous
-hoylake
-allianz
-kovalev
-sophist
-22,188
-koy
-k7
-playsets
-ferroelectric
-recollects
-edmonson
-sainik
-sethi
-veronicas
-techs
-delimiter
-poplars
-non-mormon
-142nd
-politico
-1520s
-giscard
-imlach
-melodi
-ciliated
-sporophyte
-nakai
-daja
-gauliga
-atef
-,500
-higher-dimensional
-rosslare
-userland
-turgut
-luhansk
-pegram
-ergot
-flyovers
-pedra
-holmdel
-sro
-tuo
-19s
-mylar
-ventspils
-ayat
-militaire
-lagann
-rachelle
-allport
-manuka
-malting
-life-support
-vieja
-kreator
-bagheera
-amortized
-minimoog
-gaultier
-mikkelsen
-rhinelander
-l-dopa
-tolosa
-ndsu
-swarovski
-flatwoods
-hitless
-harchester
-pufferfish
-cimmerians
-on-again
-craigslist
-reenacted
-telematics
-noth
-slop
-breve
-platinum-selling
-ratbat
-yuval
-finlandia
-everhart
-daintree
-kempeitai
-anastomoses
-metlife
-veliko
-hamner
-charopidae
-disconcerting
-bunge
-solti
-crossman
-unction
-hoobastank
-perpetuates
-everitt
-gaitskell
-hackworth
-distiller
-fertilisers
-condottiero
-macinnes
-steeples
-chattels
-shimura
-arsi
-waypoint
-leas
-sivana
-swayamsevak
-stoked
-exelon
-wabasha
-socials
-traumatised
-dania
-rotuma
-dong-joo
-psychogenic
-transpositions
-1034
-schiaparelli
-3/2
-paintballs
-yagan
-maudsley
-jolliffe
-varsha
-now-extinct
-bowness
-serological
-extenders
-patmos
-n11
-gentes
-lahaye
-bose-einstein
-wildness
-1064
-haymanot
-al-quds
-composer-in-residence
-supermodels
-portarlington
-wakefulness
-gazes
-rushcliffe
-63.9
-highest-scoring
-coalescence
-0.30
-1957-1958
-ayton
-xenu
-nunzio
-vygotsky
-clubhouses
-vinay
-moda
-banquo
-funen
-maio
-popol
-hoechst
-dalry
-manzanar
-runnings
-mid-teens
-rabba
-schlager
-khaybar
-kiara
-fifth-generation
-mossman
-myoclonus
-sulphuric
-joie
-herakles
-bandshell
-pmd
-jiminy
-bfg
-fredholm
-grb
-yazdegerd
-biosafety
-aksum
-dork
-ostrovsky
-kahlo
-schlieffen
-dsw
-world-view
-milked
-straightedge
-aad
-sigler
-re-make
-laurentia
-usfws
-hypothalamic
-state-of-the
-syrah
-8-ball
-godfathers
-swathes
-osment
-sporangia
-compuware
-waynesburg
-snowmaking
-hajar
-showmen
-tatters
-carvajal
-satsuki
-re-acquired
-jayhawk
-kentwood
-bettering
-quorums
-carves
-nxe4
-kstp
-pulpits
-radin
-presuming
-taraki
-acacius
-empiricist
-burry
-sennar
-moca
-jauhar
-carioca
-tetrarchy
-assemblywoman
-chelios
-wyllie
-senter
-355th
-jcw
-maegashira
-spital
-underpins
-24-25
-far-off
-self-confident
-langerhans
-chides
-mtp
-gorm
-calogero
-pendolino
-tinctures
-sate
-16v
-studi
-anatoliy
-nyanza
-ud-din
-fine-tuned
-besser
-docent
-andal
-olentangy
-h2o2
-large-sized
-vrc
-gas-fired
-grosz
-maza
-kpix
-brenna
-solas
-thomastown
-psychopaths
-offertory
-huxtable
-3-manifold
-territorially
-chondrites
-stercorariidae
-yellow-brown
-silkworms
-nmos
-bullet-proof
-cft
-faridabad
-obliteration
-hofmeyr
-provincials
-mzilikazi
-perishes
-rorke
-3-manifolds
-haight-ashbury
-pema
-disheveled
-rive
-staal
-lundin
-khana
-phar
-hookah
-zamboni
-scholastica
-tricuspid
-pook
-nbdl
-yair
-politecnico
-rego
-frosting
-philippus
-gex
-long-sleeved
-giorgione
-doodles
-relapses
-mul
-pre-selected
-tongariro
-finbarr
-macaca
-dextrose
-pass-through
-walkie-talkie
-padraig
-hissar
-64.6
-ditzy
-cantillation
-gokhale
-warzone
-westernization
-paclitaxel
-solstices
-crotone
-jourgensen
-josep
-take-offs
-mercies
-noradrenaline
-laxity
-issf
-hsiao
-buprenorphine
-beetleborgs
-mccreery
-eccentrics
-sastry
-baseballs
-aami
-ambo
-gilson
-1056
-nuyorican
-hydrofluoric
-hamlisch
-americanus
-sprawled
-dyne
-dpa
-dmd
-appraisers
-ruggero
-salley
-hypsiboas
-initiatory
-buh
-romanorum
-i.o.
-cro-magnon
-vashti
-viscoelastic
-admonishes
-humanized
-attell
-newscasters
-shut-down
-bennelong
-parasitology
-deuces
-irreligious
-continua
-xpath
-cropland
-trick-taking
-uvalde
-fernie
-albertans
-tsvangirai
-stingy
-aliquot
-spring-fed
-dionysos
-kuno
-skyraider
-guarani
-triestina
-stumpy
-melora
-ferox
-alcmene
-orp
-earphones
-viipuri
-telfair
-wilkens
-schock
-caltex
-paulista
-giugiaro
-datatypes
-brahmagupta
-entendres
-legibility
-kelo
-tourneur
-eighty-first
-voortrekker
-wickford
-canvassed
-kherubim
-counter-intuitive
-minimums
-corbijn
-geoid
-
-lucci
-bill-clattering
-absurdly
-indigenously
-profanities
-runge
-verbandsliga
-florissant
-thickens
-hyginus
-human-powered
-hawksbill
-sailfish
-fifth-round
-zam
-compensator
-waalwijk
-abdelaziz
-schottky
-stashed
-cels
-non-sexual
-multipoint
-neuwied
-cnidarians
-rpms
-polanyi
-almonte
-mcadams
-sonically
-florencio
-fondue
-decoherence
-vacaville
-samaja
-pelley
-inferiore
-single-story
-centrifugation
-garners
-downsize
-alternations
-altaf
-lilburne
-schulich
-quicksort
-miserly
-mcclanahan
-totila
-marshalltown
-zinfandel
-dozer
-alleghenies
-partial-birth
-ultra-high
-tvn
-tropospheric
-m-16
-messenia
-monals
-whittlesea
-policymaking
-_____
-nathan-turner
-glenbogle
-o-methyltransferase
-1037
-mardin
-timiskaming
-mags
-al-ittihad
-pernod
-re-numbered
-sivaganga
-kallis
-non-directional
-chiton
-autoimmunity
-a310
-skewer
-marmon
-kazarian
-b&q
-tcb
-werth
-machar
-middle-of-the-road
-entreaties
-xylene
-muzaffarpur
-monooxygenase
-harmonicas
-aetolians
-jessi
-typo
-vjs
-soothsayer
-backhoe
-granulation
-monge
-regularization
-redshifts
-unconsolidated
-altdorf
-litke
-comair
-semiregular
-perfective
-p300
-venlafaxine
-clow
-adelskalender
-towcester
-imamate
-counterrevolutionary
-coby
-mendis
-velletri
-espace
-solenoids
-l'homme
-cymbeline
-mccoys
-golic
-neoliberalism
-debub
-hostesses
-81.9
-khat
-randa
-aqa
-g.d.
-fmf
-140th
-devitt
-parasitoids
-cetus
-neutralise
-forgoing
-bucca
-kanara
-sectionals
-2024
-jabot
-mig-17
-broached
-arachnoid
-call-ups
-d.c
-melia
-erle
-sayeed
-sommerfeld
-lte
-mohler
-unheard-of
-7,800
-hustings
-1922-23
-dde
-opcw
-re-introduce
-taube
-brightside
-nahr
-super-g
-jarama
-abu-jamal
-topoisomerase
-paperboy
-beastmen
-enchantments
-cassian
-synodic
-crim
-maroni
-beetroot
-antwren
-kahnawake
-yeshivot
-gelasius
-end-of-life
-yzerman
-dougall
-morial
-eight-page
-fairuz
-hot-headed
-orthodontics
-spyro
-gradations
-sivananda
-122nd
-re-routing
-britishers
-signifier
-gombak
-self-worth
-elin
-patellar
-gwendolen
-nordisk
-dnf
-hab
-barford
-jayadeva
-a48
-lomu
-o'banion
-sixx
-trevor-roper
-druggist
-girardi
-hakufu
-marcis
-lord-in-waiting
-newberg
-ravinia
-markos
-coons
-shakeup
-fatherless
-interferences
-loafing
-khatib
-seinen
-cohesiveness
-shamsher
-mua
-mystified
-project-based
-bento
-dicey
-d'ivoire
-swathe
-ostrogothic
-23,333
-riverbend
-nenagh
-axton
-salzwedel
-104.4
-c0
-jobbers
-breckland
-linney
-jamshed
-kaze
-delon
-aix-la-chapelle
-1942-1945
-teardrops
-asg
-danity
-ascender
-quadrate
-nimmo
-masovia
-porcine
-reles
-pend
-piercy
-pel
-chargeable
-gigabyte
-22.90
-pontecorvo
-wnbc-tv
-floundering
-upstarts
-xiaolin
-four-sided
-mamo
-cosi
-cross-disciplinary
-izard
-e-gold
-steve-o
-rickets
-pictographs
-mujeres
-centrifuges
-kensei
-sandon
-tegh
-counter-revolution
-palio
-safaris
-auglaize
-delors
-boulter
-lilium
-dishwashers
-ruxton
-schick
-plassey
-glenside
-heineman
-61.2
-inductions
-diamagnetic
-swope
-svc
-jeffords
-leuchars
-hoofdklasse
-parka
-autechre
-strugglers
-achim
-chelation
-caffrey
-llg
-wilful
-renn
-treecreeper
-cricklade
-tussauds
-siv
-embarrassingly
-berchtesgaden
-teratoma
-dansk
-skybus
-colossians
-stereographic
-goodale
-reaffirms
-1114
-1117
-thaliana
-dufour
-ardwick
-mlg
-marrs
-odf
-ctm
-cut-up
-salishan
-freewheeling
-guinn
-henslowe
-thumbnails
-outfall
-statecraft
-alania
-repulsing
-beatlemania
-kwak
-duddy
-ballplayer
-habitus
-frp
-6200
-cervera
-post-glacial
-aradia
-brielle
-interweaving
-longhair
-cronje
-subwoofers
-archaean
-hernan
-trenchant
-roelstra
-m.l.
-oci
-wallin
-losman
-fiyero
-freethought
-29.20
-borate
-unwrapped
-daren
-lateness
-manado
-wiseguy
-defecate
-assertiveness
-trikala
-pedagogic
-shoppingtown
-wakanda
-matias
-rendel
-shellfire
-enthused
-nof
-god-given
-habilis
-disallowing
-zanetti
-mittens
-tfi
-arminia
-talabani
-bayh
-watchmaking
-supercluster
-antic
-yep
-corrientes
-baguazhang
-coastway
-imd
-78.5
-westerwald
-michaud
-shaddam
-disequilibrium
-comstar
-inniskilling
-contoured
-gatekeepers
-enfranchisement
-jaro
-seventy-seven
-gauleiter
-swearengen
-owais
-exempts
-yakub
-wordstar
-botox
-bridewell
-concho
-daradas
-yablonski
-lunules
-spurge
-vuh
-lado
-christmastime
-luray
-gulbenkian
-vultee
-144th
-dandolo
-redden
-covens
-arnprior
-cyclecar
-infuriates
-catton
-appanage
-bulleid
-turacos
-apocalyptica
-19,583
-ahtisaari
-14-6
-blackley
-transwarp
-12.70
-counteracting
-mihrab
-smokestacks
-crd
-paroxetine
-laddie
-therefor
-boettcher
-mizar
-grisman
-baluchi
-redubbed
-comprehending
-orbiters
-abune
-2112
-ncf
-homophones
-pokhara
-pwd
-choudhary
-camila
-ritmo
-spurlock
-bopara
-honeybee
-holgate
-metalurh
-67.6
-osteoblasts
-morrie
-fcat
-tanenbaum
-casters
-tragopans
-melodically
-simile
-gruner
-muzak
-billa
-lti
-baltazar
-mirzapur
-defaulting
-thermals
-newly-independent
-hooch
-twisty
-plana
-chastise
-aos
-mehran
-edibility
-altiplano
-anjuman
-cheeseman
-theming
-proteasome
-paperclip
-keselowski
-'68
-knitwear
-meneses
-science-based
-kempthorne
-oku
-third-level
-hoeven
-lever-action
-perfecto
-dazzled
-manalo
-littler
-waycross
-sexploitation
-1912-13
-eruv
-zanla
-copes
-salvi
-teco
-tapi
-echocardiography
-a340
-luckett
-hoel
-atlanta-fulton
-clijsters
-hypo
-reshevsky
-cardassians
-meitei
-flds
-rohmer
-kaizer
-keb
-zenda
-colonic
-wasser
-500s
-ashton-tate
-inimical
-paedophile
-decoupled
-liquidating
-metamorphose
-wagered
-luken
-bagalkot
-sharp-edged
-azevedo
-mcgeorge
-askar
-swaths
-lpf
-delerium
-stupendous
-lafourche
-mobbed
-verbose
-plaisance
-hecke
-winner-take-all
-shi'as
-hurlburt
-amani
-mpl
-27.80
-e-2
-bookworm
-orthopaedics
-subron
-rowlett
-kittson
-anodes
-cuddly
-ridin
-midriff
-arth
-kairouan
-friedmann
-1089
-al-tabari
-mid-days
-positivists
-vian
-crema
-kz
-glauca
-raghunath
-truong
-ovule
-salvos
-off-break
-redcoats
-unviable
-coleshill
-waver
-****
-xcode
-anarcho-syndicalist
-35e
-whopping
-tilsit
-adie
-ablaut
-spandrels
-zealand-born
-grangemouth
-vancomycin
-dougan
-jovan
-masako
-varner
-ricin
-20-10
-land-owning
-grouchy
-tando
-estacado
-bartimaeus
-toltec
-firesign
-sandworms
-proxima
-pewsey
-commending
-rhema
-phoenixville
-chinooks
-suter
-arby
-hendra
-spools
-plotkin
-chromodynamics
-ubaidah
-concertante
-horvath
-bleasdale
-gouge
-softworks
-kiss-fm
-opperman
-financials
-raphe
-dauphine
-chairpersons
-notley
-heusen
-modifieds
-harpies
-verulam
-nrotc
-cb1
-sonali
-half-breed
-anvils
-contax
-jts
-turbonegro
-on-time
-jazz-funk
-colorings
-maecenas
-hoadley
-lifton
-12-18
-overhauling
-ventrally
-penne
-kampfgruppe
-litmus
-barrackpore
-foxhole
-hspa
-desultory
-loye
-uintah
-tio
-rahne
-solari
-half-forward
-nullifying
-clermont-ferrand
-newtownabbey
-3-sphere
-modicum
-proprietorship
-herpesvirus
-shahaji
-30mm
-paulet
-tushar
-mithridatic
-two-barrel
-jacko
-pathmark
-naish
-blanda
-romuald
-unearthing
-six-figure
-papillary
-re-engined
-digicel
-stari
-grimly
-nacionalista
-irresponsibility
-guida
-nikko
-ah-64
-broadus
-call-and-response
-gurdy
-wpial
-mima
-gingee
-bissau
-bakken
-oracular
-chiseled
-uninitiated
-sub-orbital
-pow-wow
-gushing
-madhvacharya
-belluno
-vaikom
-sepik
-u-14
-catastrophically
-photoplay
-letterbox
-selwood
-theatricals
-dotting
-josias
-rugeley
-yardstick
-rainfalls
-sleater-kinney
-bpl
-kalyana
-f-15s
-cytokinesis
-newcap
-roomy
-readying
-friedland
-cordage
-lactantius
-chapelry
-olive-green
-12s
-standardise
-sookie
-chlorination
-biomolecular
-machetes
-bcl-2
-nitti
-wenzhou
-warmia
-viterbi
-chlorpromazine
-pittas
-lmc
-grieves
-millbank
-chromatids
-muons
-ouagadougou
-wakeboarding
-spinel
-celebrants
-fleetway
-hypermarkets
-68.5
-loral
-susquehannock
-spooked
-tucows
-reynold
-flugelhorn
-premised
-hogeschool
-gestural
-cib
-inter-collegiate
-four-and-a-half
-socialistic
-spect
-zoomed
-pontiacs
-selecta
-collen
-adduced
-habs
-salian
-invalids
-lampert
-sridhar
-proteges
-oppidum
-potentiality
-spen
-jace
-thousandth
-gracile
-invitees
-huaorani
-giolitti
-bettendorf
-non-abelian
-liberec
-kanem
-ringtail
-pavlos
-ttp
-rolph
-domo
-dually
-orage
-sahiba
-dop
-wonderwall
-anthroposophy
-frenulum
-scale-space
-braiding
-mth
-yashwantrao
-moxie
-o'gorman
-alaskans
-escudo
-well-remembered
-brigand
-skywalk
-1,850
-cribbs
-stratford-on-avon
-yoyo
-dichroic
-1980-1982
-pintail
-ipx
-self-portraits
-sayuri
-cadwalader
-leptodactylus
-insincere
-application-specific
-al-awlaki
-radians
-subdominant
-basha
-unital
-noahide
-lodz
-idyll
-drakken
-ambar
-rham
-manticoran
-dentate
-beneficent
-chatham-kent
-superfamilies
-toohey
-eluding
-jouett
-depute
-sinica
-ebi
-stylistics
-igarashi
-highsmith
-ptah
-carmilla
-reticulated
-rawtenstall
-peppy
-susann
-5.90
-traxxas
-equalise
-moriscos
-rochon
-mirabella
-pahlavas
-sprig
-stillbirth
-acknowledgements
-roderic
-bunkhouse
-mountie
-hughson
-danae
-aiadmk
-d'arthur
-allyl
-14.30
-anesthesiologist
-medlock
-o'rahilly
-slingers
-walkable
-potentiometer
-actu
-romanised
-spearing
-frayed
-21-7
-coitus
-chubut
-tomcats
-mamodo
-emmerson
-wynonna
-meno
-1151
-khj
-disparage
-bugler
-partials
-fastidious
-bookies
-elucidating
-scamp
-hagrid
-hijikata
-ballymun
-wharfe
-novodevichy
-mallah
-newtownards
-chainmail
-outrageously
-berbera
-c60
-orozco
-tomboyish
-xcel
-ecn
-2-9
-65.1
-kalimba
-villepin
-isn
-norilsk
-pugsley
-evangel
-samo
-brusilov
-antithrombin
-translocated
-d.m.
-json
-shortstops
-gingiva
-luminescent
-defenceless
-ground-dwelling
-2-year-old
-oppland
-pendergrass
-postalveolar
-anecdotally
-zeeman
-vekoma
-ozaki
-neuroleptic
-pawnshop
-brahui
-spanos
-fmw
-calamari
-ivr
-gurnee
-mid-engine
-kalahandi
-radiologists
-retrace
-skylar
-etheric
-buffoon
-glennie
-hasta
-cadfael
-killington
-bagong
-93,000
-brooklyn-based
-nazianzus
-eir
-worthies
-azim
-leninism
-impressment
-ents
-1062
-woaa
-beausoleil
-v-22
-langland
-ramchandra
-haloperidol
-lightnings
-now-demolished
-romblon
-drive/genesis
-misadventure
-subud
-beeps
-beauclerk
-baltics
-kronor
-amylase
-rovigo
-replanting
-samedi
-.42
-gangotri
-reise
-6m
-open-access
-imjin
-four-letter
-lurks
-neretva
-4217
-andalucia
-voskhod
-sharky
-nozomi
-historica
-filled-in
-yaks
-pipelined
-photosystem
-tenterfield
-bisexuals
-lieut-col
-salt-water
-muggeridge
-verstappen
-automorphic
-fennelly
-herzberg
-reemergence
-varaha
-floaters
-tycoons
-fajr
-potvin
-mig-25
-gagliano
-tillotson
-sinope
-lilacs
-60.9
-quedlinburg
-beranabus
-uan
-houk
-non-smoking
-drb1
-newly-promoted
-botnet
-panchalas
-acrostic
-pilibhit
-sandbars
-varicose
-164th
-minelaying
-ossetians
-subtitling
-tinbergen
-symbiotes
-resettling
-psy
-prying
-gmr
-cave-in
-echos
-sigmoid
-58.0
-understrength
-paquin
-hand-crafted
-buteo
-mcneely
-65.7
-over-run
-tomi
-wace
-maranao
-alen
-ccny
-levite
-multifunction
-re-unification
-strega
-pres
-ancyra
-four-wheel-drive
-kington
-throngs
-pleiku
-tylenol
-2000-2007
-grosses
-pevensey
-hulse
-existences
-satsang
-07-08
-sicard
-climaxing
-foaming
-bar-ilan
-lomb
-anahuac
-short-distance
-escapee
-castrol
-afshar
-landi
-shimazu
-carbureted
-vals
-unseeded
-kilgour
-larkhall
-lw
-ahearn
-backstop
-wryly
-muskies
-ayako
-shrew-forms
-incubators
-lagomorpha
-lorimar
-amartya
-waterlooville
-tomislav
-pamux
-hanyu
-2ue
-swearingen
-islamiyah
-hrd
-370,000
-renzi
-yami
-args
-osteomyelitis
-oversteer
-deathtrap
-kazuhiro
-psychotherapeutic
-volksdeutsche
-hokey
-co-winner
-thrasybulus
-drapers
-tuoba
-autocar
-128th
-schultze
-mcbean
-forewarned
-timetabled
-apsara
-takeru
-1941-1945
-rfl
-1.14
-nibelungen
-clu
-symphonia
-climate-controlled
-rear-mounted
-cowra
-ilitch
-aigle
-zhan
-cushioned
-bramber
-historiographical
-spencers
-1100s
-malinga
-wheezing
-mashpee
-proffitt
-britannian
-lebar
-ystrad
-gsr
-nikolayev
-fipresci
-misdirected
-risdon
-shilo
-textus
-clarkin
-rhacophorus
-speakership
-konitz
-rokeby
-wilno
-grads
-dhrupad
-66.4
-deputized
-amazo
-tyranid
-russian-language
-search-and-rescue
-pershore
-oke
-codice_18
-vtec
-expressionistic
-boardwalks
-hartshorne
-suwa
-vrt
-servlet
-samanta
-aldergrove
-185,000
-screwing
-harriot
-landholder
-sorley
-urinals
-anglophones
-boilerplate
-litigated
-unperturbed
-1,450
-meyerbeer
-nantahala
-philadelphian
-clearness
-outgrow
-actionscript
-fri
-denholm
-trans-saharan
-plumer
-usaac
-salwa
-off-center
-foretelling
-cmo
-nepa
-neagh
-pre-packaged
-cowpens
-banhart
-holdfast
-metaphoric
-campagnolo
-gunfighters
-well-marked
-shivpuri
-vives
-ibogaine
-futhark
-d-boys
-foundress
-hornell
-blatty
-feathery
-miskatonic
-mur
-adilabad
-gendo
-ppt
-sociale
-pulsation
-harriette
-eosinophils
-zoster
-exoskeletons
-yannick
-harriett
-wonderswan
-turbodiesel
-small-block
-neoplasia
-w.f.
-yongle
-cuffe
-dayanand
-formula_90
-starbug
-nxt
-shudder
-polymorph
-croatan
-74,000
-high-fidelity
-mylapore
-adrs
-fook
-1.83
-70.0
-paan
-vfd
-shiho
-sanibel
-lysosomes
-pini
-limousin
-kokborok
-snowcocks
-squelch
-cwi
-taniguchi
-relievers
-fastest-selling
-milhaud
-escapade
-hallucinogens
-fortissimo
-supergiant
-sandlot
-powergen
-resinous
-doxology
-gerrold
-nerul
-destabilization
-imbrium
-woodfield
-2002-2007
-kittredge
-harbaugh
-0.55
-aiims
-yello
-burnished
-low-light
-nkf
-1919-1920
-mitten
-wy
-fatman
-kittie
-birendra
-d.i.
-gondal
-conners
-didymus
-61.6
-yas
-gajapati
-capitis
-mtbe
-triplex
-exiling
-siphoned
-mwai
-criminologist
-zevi
-anti-socialist
-morava
-kirovohrad
-baftas
-wdm
-yano
-elrod
-amaral
-swum
-decently
-trejo
-jnana
-cd1
-triathlons
-83.4
-1xtra
-santiam
-tirupur
-memoria
-exilarch
-blankenship
-baserunners
-blued
-bluey
-anodized
-villaraigosa
-kajang
-wriothesley
-tine
-22.80
-britomart
-underclassmen
-liberality
-mass-produce
-recombined
-breakups
-amfm
-tansen
-jammy
-benoni
-bhiwani
-background-color
-co-cathedral
-rappelling
-three-act
-resizing
-preis
-uneasiness
-faring
-uas
-stelios
-gira
-b-24s
-transdermal
-balts
-backflip
-gers
-bolivarian
-lantos
-currants
-campsie
-blacklight
-barbatus
-chandran
-expressionists
-delgada
-red-figure
-brambles
-riggers
-touts
-kfz
-lessard
-dieback
-consciences
-dissents
-thats
-esquivel
-kalpa
-ruehl
-domaine
-1036
-spyros
-kennecott
-autoclave
-rwa
-350th
-shouldered
-sladen
-life-force
-wmaq
-piura
-shahab
-voltron
-l.m.
-eaker
-westdale
-mesas
-fingleton
-pgce
-leasable
-muthappan
-bigod
-knuckleball
-moustaches
-shehu
-cabbages
-apd
-chicha
-delp
-ngee
-e.u.
-collimated
-naser
-letterer
-disgaea
-kpn
-skeets
-gundagai
-mottling
-wtbs
-colloquy
-brittonum
-zelig
-takeaways
-repackaging
-mcdevitt
-wristbands
-judicially
-sfio
-deprogramming
-schwarze
-shorewood
-conejo
-piller
-bungay
-undergarment
-microfiche
-segamat
-dalley
-kal-l
-suffragans
-late-19th
-tugela
-tridents
-phoning
-eastchester
-flanged
-incompleteness
-movimiento
-mosasaurs
-kvapil
-lepton
-degarmo
-mohalla
-reversibly
-reification
-schmit
-misdiagnosis
-truax
-fermo
-1914-15
-notte
-eike
-severian
-gelli
-unifies
-81.7
-oil-for-food
-castellanos
-self-confessed
-gehringer
-unsprung
-paullus
-subdiv
-24p
-turnhout
-thoroughness
-ardagh
-tooke
-in-person
-poor-quality
-prosimians
-composited
-pangborn
-ehsan
-razz
-impalement
-latrine
-lipoproteins
-kinderhook
-gladwell
-speicher
-reconditioned
-seedless
-jsf
-marceau
-japheth
-gastritis
-kdp
-nds
-r64
-scoffs
-widely-read
-bramalea
-modok
-annas
-colima
-decarboxylation
-one-two
-brainwave
-governmentally
-bolam
-madraiwiwi
-harpe
-burin
-impactor
-toughened
-seaweeds
-declassification
-non-commutative
-unix-based
-3-13
-dualshock
-flecks
-sidewall
-playgirl
-lamination
-barco
-semiarid
-f.k.
-linville
-idl
-ninety-two
-popliteal
-two-to-one
-lysosomal
-satu
-haplo
-timepiece
-bembridge
-fadden
-giani
-formula_86
-lamson
-hauts-de-seine
-nekrasov
-tawang
-halmahera
-beatbox
-amita
-taino
-mukhopadhyay
-question-and-answer
-bonhomme
-pump-action
-10.000
-reassessed
-pithy
-reproached
-guillory
-rctv
-kinsler
-scold
-inkster
-35-year-old
-etude
-mantova
-raygun
-selmon
-carruth
-quatrains
-mclachlin
-devoe
-correggio
-three-wheeled
-indi
-examinee
-beaty
-comital
-waypoints
-re-armament
-adelman
-combat-ready
-cimbri
-rule-based
-dkw
-bc5
-salernitana
-unami
-hafey
-sudirman
-sapling
-resolver
-enka
-bundang
-ecp
-taneyev
-corben
-envied
-rakshasas
-lestrade
-surface-to-surface
-rarefied
-pandering
-underscoring
-burchell
-waterworld
-humours
-caedmon
-lycus
-mhsaa
-legh
-co-authoring
-bomp
-shepp
-a23
-anoa'i
-fullmer
-suntrust
-crosswind
-nisi
-gainey
-vinogradov
-home-town
-manzoni
-leaderboard
-pardes
-5/4
-1052
-mosel
-and.net
-stroger
-satana
-studdard
-undine
-quabbin
-juryo
-1168
-six-piece
-barsoom
-aelia
-bux
-halder
-reval
-bagrationi
-reichsmark
-ondaatje
-zanardi
-kasur
-thalassery
-asterisks
-belies
-vantaa
-am2
-glitz
-terranes
-trike
-daiei
-minneapolis-saint
-warrandyte
-weirdness
-narcosis
-ogier
-vaticanus
-anaphase
-toye
-ferrigno
-1912-1913
-bothers
-huangpu
-liberates
-vakhtang
-eight-month
-corneille
-sculthorpe
-gurung
-syl
-recensions
-anti-life
-iniquity
-appeasing
-icq
-dithering
-beaters
-oyl
-eighteen-year-old
-hardtops
-caudate
-tatham
-caiman
-acma
-khera
-no-limit
-helden
-modifiable
-snagged
-armories
-nonpolar
-ps/2
-taussig
-minghella
-broder
-mordor
-poconos
-komal
-2003-2008
-laertes
-ea-6b
-piggyback
-daunte
-a1gp
-cycad
-njrotc
-akademie
-maillard
-10:45
-borosilicate
-awf
-medgar
-evey
-audiophiles
-mencius
-gracias
-undertone
-wobblies
-ternopil
-alamosa
-pythias
-dobby
-pollster
-bcn
-balakrishnan
-redemptive
-self-financing
-submergence
-intelligence-gathering
-imbedded
-biman
-oirats
-bdnf
-kamisese
-glottis
-looker
-second-row
-maior
-kimberly-clark
-no-hit
-2-oxoglutarate
-boneless
-brooches
-hic
-dowdy
-quakerism
-ejaculate
-shuttlecock
-bass-baritone
-fraggle
-critter
-kerns
-verges
-full-contact
-squabble
-potentiation
-frisby
-wafa
-digbeth
-crespi
-mandrill
-madchester
-demurred
-tankian
-ceaseless
-kishi
-red-bellied
-gay-themed
-demoed
-acetylation
-morriston
-debunk
-domitia
-sepang
-religio
-masri
-petrology
-vasculature
-leidy
-nuestra
-aquabats
-1979-1982
-tsg
-kageyama
-hermeneutic
-bettencourt
-levante
-nordhausen
-gulu
-archtop
-margret
-gessle
-ouimet
-prefab
-blaisdell
-oberman
-backscatter
-2005/6
-muggle
-sobeys
-fogs
-ellipsoidal
-survivals
-hunky
-1/1
-savannahs
-sprinkles
-valderrama
-awakenings
-crosslinking
-esi
-surcharges
-farhan
-binks
-damone
-rebuttals
-three-sided
-madhusudan
-lba
-atlantique
-wicketless
-reburial
-contradistinction
-globules
-hmos
-rml
-barbe
-cryptology
-hedland
-fnla
-dtmf
-jocular
-77.5
-german-jewish
-supercard
-allee
-heirlooms
-utf-16
-rajapaksa
-dilly
-reichsbahn
-gcl
-devries
-merkle
-tisbury
-whining
-piya
-1079
-vande
-greenstein
-elua
-saumarez
-ureter
-clemenza
-hempfield
-al-arabi
-euwe
-2026
-channa
-underling
-flasher
-inkwell
-53,750
-conspiratorial
-hall-of-famer
-dixons
-beaudesert
-flyleaf
-nealon
-bagge
-she-wolf
-schunck
-2-14
-frontlines
-schneier
-honky-tonk
-laminates
-blarney
-glareolidae
-gainax
-al-ashraf
-tannadice
-gemmell
-kfrc
-smilodon
-gynecological
-machismo
-chadwell
-kristiania
-hypostasis
-dorsetshire
-carollo
-nevus
-parviz
-voltaic
-larchmont
-deep-rooted
-slingsby
-comas
-perforating
-platformers
-inattention
-morville
-shigella
-client/server
-geraci
-repertoires
-stacker
-pae
-acetylcholinesterase
-anglicisation
-doolin
-xaverian
-diraja
-ekg
-anni
-cockrum
-conder
-kensit
-herberger
-cobblers
-2-time
-brooke-taylor
-windshields
-ramenman
-lamoureux
-cynics
-mcclernand
-kastoria
-destino
-catch-up
-crockery
-vanunu
-urgell
-moylan
-paladine
-snc
-nakata
-sidhe
-mariette
-tillich
-lptv
-neo-fascist
-archana
-kotter
-nhi
-placidia
-nicolet
-tresham
-ewtn
-duque
-xpt
-stds
-garrido
-praenomina
-bachelorette
-warbird
-aude
-1988-1992
-vaas
-lavon
-boscawen
-minear
-pompton
-colosimo
-leavening
-lagers
-regnery
-biggers
-wilburn
-botley
-tyrannulet
-dmp
-ieuan
-pterodactyl
-torana
-dowsing
-behaviorist
-korine
-monon
-time-lapse
-freehand
-misreading
-partch
-innisfail
-eishockey
-inter-parliamentary
-faldo
-mediacom
-lucina
-livni
-single-coil
-3-disc
-yellow-billed
-kentuckians
-kazoku
-fakhr
-photobook
-satavahana
-bredesen
-consolidations
-funakoshi
-epyx
-paraparaumu
-mutsu
-carats
-acheulean
-gracey
-italian-americans
-15,500
-colwell
-early-morning
-valkenburg
-shrug
-rajab
-ameritech
-heffron
-pitbulls
-bugger
-trine
-kunkel
-1028
-monuc
-soccer-specific
-pre-european
-dwi
-aldabra
-11,250
-sudden-death
-a6m
-selatan
-turnoff
-youkilis
-raison
-bobbili
-solitons
-wallflowers
-psychoanalytical
-semi-satellite
-blayney
-encinitas
-wmmr
-tibbetts
-tinsel
-bricker
-counterweights
-ill-tempered
-lexmark
-rogallo
-tosa
-thant
-hedmark
-re-elect
-goblets
-aksai
-richler
-akhmatova
-clothe
-alireza
-divisibility
-foligno
-jhunjhunu
-bilderberg
-swamigal
-pcw
-carpels
-avec
-brandes
-hat-tricks
-barc
-turntablist
-durning
-paternoster
-palanquin
-chacon
-prime-minister
-playfulness
-burckhardt
-gfp
-criollos
-croom
-cinematographic
-wherry
-redistricted
-autogyro
-horsehair
-pharoah
-jellybean
-big-league
-cousy
-jawi
-l'anse
-thermoregulation
-snecma
-toshack
-redbacks
-abstracting
-redbird
-griddle
-pcu
-65.2
-mouthwash
-porphyrin
-varangians
-75mm
-gamedaily
-catechumens
-alef
-ehret
-digweed
-gryffindor
-lety
-consecrating
-vestibules
-kanan
-holsworthy
-nowlan
-dsu
-junglefowl
-navigates
-sorum
-oesophagus
-bupropion
-.333
-confectioner
-cliffhangers
-noirs
-trevi
-englanders
-plater
-outplayed
-59.9
-sucralose
-musicological
-curates
-skolnick
-4077th
-rayford
-sinecure
-greenlit
-4-4-2
-phosphohydrolase
-poolside
-plesiosaurs
-binky
-malet
-copperbelt
-rivendell
-kuznetsova
-ikshvaku
-birks
-pulliam
-n5
-in-patient
-awfully
-tittle
-nazmul
-boreham
-4300
-420,000
-manzanera
-yorkists
-kalingas
-medline
-stoddert
-cumbric
-lollipops
-absenteeism
-scops
-punchy
-watervliet
-three-door
-kaibab
-tro
-sinfield
-non-scientific
-lexcorp
-centraal
-upson
-top-seeded
-reconfirmed
-culverhouse
-webtv
-awning
-arncliffe
-naseeruddin
-dro
-lower-priced
-stevedores
-rationalised
-buttstock
-hibbing
-meijin
-portuguese-speaking
-clean-cut
-counterbalanced
-erosive
-franziska
-rst
-awe-inspiring
-dalmeny
-3.65
-halflings
-aeolus
-andria
-propagator
-avocados
-yomi
-panter
-vaishnavas
-left-sided
-spidey
-adenoma
-times-picayune
-hota
-hgh
-erste
-burkholderia
-elita
-glassman
-lav
-kullu
-tintoretto
-salis
-spurfowls
-aviano
-kanno
-deathrock
-.12
-lippo
-virata
-costars
-saifullah
-ribald
-hever
-zabriskie
-resaca
-charlottenburg
-americano
-anti-spyware
-stamos
-stenton
-watsons
-codice_19
-zookeeper
-ferruginous
-lurk
-o-rings
-imagawa
-hemorrhaging
-evin
-quanzhou
-verein
-backbenches
-barycentric
-24-year
-francolin
-yamasaki
-incoherence
-qam
-skewers
-greenport
-sturgess
-frg
-stapes
-hemorrhoids
-firstgroup
-suma
-miaa
-jankel
-kadett
-technet
-herdsman
-browser-based
-2003-present
-8088
-tolle
-cream-colored
-lamda
-scintillating
-variable-length
-iczn
-sankar
-rocketdyne
-parenchyma
-abarth
-acte
-pelorus
-tinned
-cotati
-subpar
-luscombe
-thanagar
-auxilia
-hixon
-naib
-philipps
-20,000,000
-emusic
-bashkortostan
-moieties
-5b
-t.d.
-failsafe
-defamed
-joakim
-plover-like
-dats
-palatalized
-den-o
-haystacks
-ballon
-consolidates
-disfiguring
-rain-affected
-iamblichus
-nagashima
-cleef
-lowrey
-time-out
-mpo
-suzerain
-tracie
-corundum
-kunda
-isca
-radiotelephone
-nessarose
-baritones
-pergola
-stille
-constitutionalist
-xul
-cuirass
-kellar
-karolinska
-2-6-0
-transpiration
-predispose
-fitzwalter
-sainis
-disciplining
-aylsham
-ecosoc
-33-year-old
-miron
-quanah
-falsehoods
-28.90
-reinstituted
-campanulaceae
-ordos
-guardhouse
-three-mile
-1985-1988
-kunlun
-disbursement
-pique
-inexact
-puteri
-yarraville
-restigouche
-allylic
-deut
-krishnamurthy
-tooheys
-peeters
-slouch
-naushad
-weardale
-emoticons
-skyrocketing
-pragmatist
-pazzi
-0s
-bamboozle
-warrens
-epitope
-burstein
-greenspace
-polders
-alii
-cressy
-bandpass
-unaffordable
-muggs
-hasharon
-oligonucleotides
-anjar
-t-6
-1.92
-tostig
-cassegrain
-neagle
-navidad
-spirally
-passamaquoddy
-chippawa
-palpi
-oil-fired
-oncogene
-branston
-1968-1970
-haver
-dhanush
-beechworth
-saiyan
-dilate
-nystrom
-corsham
-familiars
-ysandre
-wimpey
-lycan
-world-record
-didache
-lettice
-napo
-single-issue
-d'alembert
-lewisohn
-savery
-curtailment
-p-40s
-ghaznavid
-brainard
-finian
-harnack
-dowell
-jdl
-syon
-sclera
-mid-1890s
-macerata
-conneaut
-denotational
-incorruptible
-mormaer
-snafu
-jacobins
-red-orange
-parched
-nick-named
-blotch
-morganton
-pyridoxine
-comber
-pitstop
-santley
-sarto
-quimper
-truer
-vercingetorix
-wide-scale
-indo-scythian
-slipcase
-voroshilov
-bjj
-hazlehurst
-ugg
-al-farabi
-air-ground
-moroz
-rockwall
-cappuccino
-fubuki
-orontes
-bas-rhin
-mr2
-2/5
-genn
-bsac
-imperiex
-streaker
-finial
-terahertz
-bayport
-pcg
-65.3
-caballeros
-quarrelsome
-subcellular
-qub
-hopman
-tydings
-colenso
-arterioles
-midvale
-137th
-baio
-sandhi
-hsl
-delawares
-l&t
-easybeats
-faizabad
-mook
-u-238
-howls
-hawkmoon
-smf
-greek-american
-conall
-velour
-fpa
-belshazzar
-skied
-queensboro
-tapirs
-swansong
-astarte
-eggshell
-hrithik
-tulse
-send-up
-donnas
-seras
-1.06
-basanti
-loricariidae
-zebrafish
-henn
-seele
-seascapes
-pushers
-trashy
-ne5
-cellier
-abodes
-middlefield
-bmd
-vidar
-goliad
-solingen
-n'djamena
-prahlad
-serviceability
-biddeford
-zutphen
-self-regulating
-bashed
-buskerud
-rambert
-n6
-recrystallization
-keppler
-rmr
-lift-off
-townend
-manhunters
-infers
-esiason
-unpredictably
-araucaria
-flintheart
-jordison
-muskellunge
-novy
-wigtownshire
-krav
-zille
-flambeau
-meatus
-unapologetic
-heptathlon
-hackle
-ribavirin
-x6
-silliness
-hibbs
-merlino
-shach
-boycie
-kony
-forename
-privations
-avellaneda
-medora
-gianna
-nella
-1h
-untried
-standouts
-tomoya
-sensorimotor
-meropidae
-bensalem
-krogh
-reincarnate
-maathai
-lindfield
-proactively
-lubbers
-21-17
-frasers
-clitic
-boii
-banna
-tasslehoff
-lozenges
-dual-mode
-edgcumbe
-d.g.
-prakriti
-srinivasan
-julianus
-borchardt
-imputation
-archos
-cush
-closeup
-tarkington
-boombox
-alberic
-compactflash
-dta
-shinedown
-1975-1977
-spearfish
-outerwear
-black-faced
-donnan
-mirabel
-o'conor
-agriculturist
-ussba
-aerated
-5.60
-alipore
-sisson
-theogony
-bip
-harbison
-07/08
-adenylate
-pcie
-lazarev
-bettelheim
-sana'a
-gourami
-trenchcoat
-bucyrus
-kirton
-chicory
-hoopoe
-leones
-perplexing
-eba
-yukawa
-yogam
-cyanosis
-volchok
-humanae
-klickitat
-deleo
-kaminsky
-pfd
-tapp
-1.27
-bares
-extol
-timor-leste
-jorn
-rhp
-hedgerow
-beddgelert
-1.13
-well-stocked
-hyphens
-refusals
-sealer
-delineates
-graeff
-revision3
-lantis
-litanies
-vidyapeeth
-mortgage-backed
-milngavie
-black-eyed
-hanratty
-sasr
-1,1
-handcuff
-boma
-13.70
-aiaa
-aim-9
-trillions
-trower
-bearsden
-re-wrote
-woodcarving
-beswick
-bonnar
-dallas-based
-agustawestland
-dobell
-praiseworthy
-khanates
-harken
-pazuzu
-wrenches
-make-shift
-edoardo
-ladainian
-w.m.
-exciter
-flatmate
-eleonore
-smartly
-privet
-ishq
-cloven
-zukofsky
-after-effects
-tamia
-o3
-singapura
-corrode
-hags
-chirk
-galea
-shippen
-salviati
-kaikoura
-invigorating
-veggie
-school-record
-jes
-0.51
-irrepressible
-deducing
-interoperate
-bivins
-veit
-draperies
-zhukovsky
-plummets
-garderobe
-hakata
-m18
-on-street
-srl
-kabbah
-ffc
-worm-like
-haren
-idolatrous
-recirculating
-caley
-wetsuit
-rizzoli
-avonlea
-hydrogens
-hurstbridge
-twp
-sistema
-4-day
-basutoland
-infidelities
-bregman
-abd-allah
-cilento
-overexposure
-catalyse
-saint-omer
-hagedorn
-6x
-rost
-lansingburgh
-1.94
-scow
-bramham
-umarov
-smallhausen
-bedbugs
-decretals
-breastworks
-plumpton
-datable
-taoists
-loathe
-mcgavin
-dacha
-power-pop
-ulus
-predestined
-tove
-birefringence
-enrages
-adduct
-picketers
-agg
-wpi
-lensman
-headrests
-m24
-tennille
-wanaka
-neponset
-waterton
-stiffen
-ponderous
-neoplasms
-diapause
-perigee
-noctuidae
-usaa
-hansford
-re-unite
-hoary
-rockbridge
-thoughtless
-darnall
-eger
-alemannia
-aversive
-pis
-ailes
-pagano
-ital
-dexamethasone
-aparna
-depersonalization
-sub-unit
-unreformed
-ahold
-fortuitously
-gibraltarian
-locally-based
-sawfish
-ecovillage
-coccyx
-materializes
-recycles
-tvc
-bjd
-avey
-mayans
-batanes
-komuro
-zeta-jones
-roxburghshire
-hortus
-pinnock
-firebombing
-stratovarius
-tf1
-urd
-urb
-destitution
-marsters
-aalst
-murmurs
-woodring
-xeelee
-zhytomyr
-uart
-lower-middle
-by-law
-stoneleigh
-modafinil
-blackshirts
-apologia
-dibny
-mashups
-wtop
-shockey
-herschell
-adrianne
-misshapen
-coin-only
-carreras
-yastrzemski
-lumberman
-warri
-hfcs
-pitviper
-channelized
-vamana
-empennage
-critchley
-faria
-bitterroot
-awnings
-chromed
-1994-96
-bonaventura
-kathir
-difficile
-offshoring
-classis
-nica
-no-contest
-barretto
-tejon
-ridicules
-klemm
-nul
-houseboats
-paddick
-t.b.
-senussi
-leonis
-sterett
-eight-week
-efx
-terrorcons
-tete
-68.9
-banffshire
-equatoria
-vondas
-77.4
-eared
-arouses
-el-sheikh
-adsense
-superintended
-brix
-kedleston
-intestate
-gbs
-simonsen
-tmz
-chex
-directly-elected
-teagarden
-endangers
-oberheim
-confraternities
-monophysite
-jamborees
-lakeman
-nyclu
-mayville
-gellibrand
-border-style
-wadden
-ayacucho
-exorcise
-neosapien
-ultraverse
-18.00
-busselton
-steam-driven
-espanyol
-depew
-sloat
-rheumatology
-non-revenue
-carpetbaggers
-mindscape
-khatun
-recce
-arirang
-mcandrew
-29.00
-gravitate
-gtc
-mechwarrior
-nismo
-dqa1
-pre-college
-rhombus
-frogmore
-coronavirus
-mfg.
-valiants
-brigs
-life-giving
-jagiellon
-demjanjuk
-letterhead
-sustrans
-cammie
-andheri
-capulet
-sigsworth
-grazers
-saipa
-ignominious
-wallflower
-gatecrasher
-pritam
-hamamatsu
-plastids
-russells
-bogside
-coxsackie
-free-swimming
-mitsuda
-drosera
-cropredy
-misbehaving
-mysteron
-pumpkinhead
-drubbing
-janissary
-piri
-vanni
-entrenching
-handrails
-kach
-707s
-short-legged
-innervate
-capito
-nira
-fck
-polychlorinated
-twelfth-century
-manali
-muara
-giannis
-casablancas
-b-58
-earnestness
-sittard
-preferment
-starhawk
-baryons
-japanese-held
-kingdom-based
-gleam
-uinta
-nederlander
-amparo
-treasonable
-st-pierre
-oleic
-b'av
-fudan
-1.22
-re-enact
-n.m.
-paiste
-marisol
-saguaro
-ulrike
-grosseteste
-marcin
-downingtown
-sinton
-rajasuya
-tsugaru
-sn2
-DGDGDGDGDGDGDGDGDG
-ar-rahman
-bozzio
-manalapan
-pasa
-dpt
-ussocom
-trichur
-sandton
-anatomists
-ricotta
-bumblebees
-thain
-dysfunctions
-saida
-rocket-powered
-schiffer
-ringwald
-1988-1991
-issachar
-reawakened
-alka
-theakston
-tilman
-escarpments
-nihal
-callimachus
-bian
-fusca
-entailing
-kalika
-puc
-odawara
-0-8
-mimed
-reverse-engineered
-hoodlums
-chaperones
-matrons
-dejohnette
-gpb
-koki
-mordo
-geophysicist
-phrynobatrachus
-pozzi
-1:3
-clamor
-canonry
-al-masri
-bdd
-dehra
-proliferating
-oerth
-moxy
-ladislas
-gorsedd
-vinayak
-wollheim
-short-season
-befits
-menshevik
-counterfeits
-docker
-obed
-atem
-langa
-privation
-hox
-oversubscribed
-petawawa
-woodmere
-poynton
-sycorax
-co-head
-vicomte
-rehabilitative
-porthole
-broonzy
-sudanic
-safin
-xilinx
-graafschap
-maximum-security
-breathy
-leftward
-kittatinny
-sensibly
-para-military
-1924-25
-vlf
-eight-track
-leutnant
-slammer
-maudlin
-amenemhat
-sloan-kettering
-equiano
-zuiderzee
-renu
-ruprecht
-23-25
-madox
-cheval
-83.2
-roaches
-redlegs
-yahia
-lynley
-lightbulb
-gibbard
-frylock
-truckload
-rmn
-leka
-maois
-leroi
-scholia
-ad-libbed
-templi
-second-person
-preta
-sarwan
-mig-23
-artaud
-d-type
-arcam
-nasim
-buckie
-dugong
-lyceums
-consequentially
-ntpc
-boinc
-mid-15th
-1983/84
-weisberg
-rapidan
-lincoln-douglas
-saari
-yeshu
-prising
-odakyu
-pucks
-slow-growing
-capetian
-macneice
-unalaska
-mazepa
-effluents
-mardle
-humoral
-tiro
-katzenberg
-bowker
-mon-el
-burtt
-chetty
-lucassen
-ntuc
-purpura
-kshetra
-overbeck
-directionality
-decima
-idly
-artist-run
-kamina
-roly
-g.m.
-d-von
-brickhill
-pec
-sayyed
-torchlight
-1923-24
-dornan
-bozell
-chudleigh
-crucifixes
-hard-to-find
-340,000
-intently
-newsrooms
-pnr
-troglodytes
-rapallo
-3.55
-kronberg
-radiometer
-merri
-3e
-orderlies
-mahwah
-oberhausen
-vpd
-thatcherism
-therapsids
-in-home
-telewest
-witcher
-colangelo
-a319
-squib
-transit-oriented
-clydeside
-bruyn
-fining
-hieroglyph
-aronofsky
-globose
-g.t.
-stream-of-consciousness
-centimes
-carolingians
-qe7
-warhols
-gyula
-halloweentown
-deafening
-out-takes
-animatronics
-nurtures
-tudeh
-1919-20
-voivodship
-panicles
-pinscher
-k-pop
-arbat
-bookish
-vinyls
-portlethen
-weeklong
-tradeoffs
-hubie
-intimates
-broadsheets
-ilr
-enrollees
-gorka
-romm
-dumaguete
-nagesh
-activations
-tribhuvan
-richardsonian
-tobolsk
-angelika
-rigaud
-disambiguate
-slfp
-prioritizing
-randomised
-okw
-electrocardiogram
-selfridges
-directness
-lope
-aeug
-antti
-brookdale
-olefins
-20-22
-outwash
-antin
-neuilly-sur-seine
-mnr
-trichloride
-transcona
-yolks
-mid-city
-muhajir
-whine
-trashing
-merchiston
-1994-1998
-sook
-hyams
-inclusiveness
-cult-like
-peay
-dibley
-nidaros
-bcr
-speedup
-reroute
-unni
-earthrealm
-perun
-uninspiring
-goslar
-daulat
-climes
-undesirables
-rci
-kilborn
-hoost
-michalis
-orrery
-clastic
-aromanian
-mechatronics
-carboxylate
-preorder
-shumway
-undecorated
-petticoats
-21.00
-mdx
-cooma
-welega
-mimes
-therapeutically
-subrange
-proulx
-ruta
-ludi
-unlit
-armfield
-verwoerd
-engrossing
-blitzer
-birobidzhan
-mousa
-politicization
-waddy
-cystitis
-fancier
-i1
-splat
-taichi
-antibes
-schapiro
-avogadro
-fifo
-anti-bolshevik
-bhuj
-choirboys
-fernwood
-thyssen
-pag
-parent-teacher
-paresh
-ovules
-chiarelli
-playgroup
-formula_84
-yeshua
-peripherally
-antonino
-shermer
-cotillion
-phosphatidylinositol
-cyclura
-self-fulfilling
-sieves
-shoah
-mcginn
-burhanpur
-yechiel
-taormina
-sedley
-tughluq
-claflin
-burhinidae
-tacna
-hak
-lodestone
-rafique
-wayzata
-salva
-funston
-bresnan
-1987-1990
-hodgman
-paigc
-43d
-superhumanly
-hemsley
-colla
-ring-tailed
-americal
-pillory
-ansley
-rattigan
-heeley
-truffle
-wladimir
-electrophoretic
-horemheb
-gregorios
-slg
-neo-noir
-aromanians
-underused
-bayous
-potocki
-pedaling
-exploitable
-64.3
-jasmin
-mosiah
-lovemaking
-bajor
-penetrations
-mccully
-scoffed
-kaname
-colebrook
-wplj
-haptic
-thrips
-x45mm
-katsu
-bowline
-___
-trimaran
-lankester
-pippi
-cushioning
-okamura
-a0
-ionize
-deaminase
-saith
-87,000
-witching
-chefoo
-dagupan
-putonghua
-halfdan
-suddenlink
-usbl
-centralizing
-mohair
-healthiest
-crayak
-khammam
-gulbuddin
-radials
-batwoman
-57.0
-white-fronted
-maguires
-tsardom
-re-activated
-meck
-301st
-desorption
-sobor
-knollys
-okuda
-lanfranc
-moranis
-pordenone
-daoism
-connote
-encasing
-veles
-langport
-ugolino
-castille
-1985/86
-waterskiing
-decisis
-manhattanville
-pontian
-graeco-roman
-krull
-rears
-sundari
-zebu
-mahadevan
-eleusinian
-eyrie
-athene
-fons
-rear-end
-bromeliads
-kammerer
-interleaving
-cointelpro
-legless
-facelock
-logies
-rhinoplasty
-craighead
-channelling
-anti-muslim
-catch-phrase
-bujold
-yag
-dookie
-vectored
-converses
-prodding
-verdugo
-zhdanov
-hilux
-libs
-coarse-grained
-ferredoxin
-snarling
-citicorp
-razzle
-margarete
-single-ended
-sanjana
-nawaf
-betacam
-decapitate
-ferragamo
-boardgame
-scheuer
-sarada
-murillo
-nectandra
-4-bit
-kilbourne
-rainhill
-methos
-83.0
-dangun
-btv
-womad
-hypnos
-schermerhorn
-postpositions
-ramnad
-third-degree
-opt-outs
-alcatel-lucent
-dufresne
-clumping
-iwi
-reedley
-swirled
-schoolteachers
-agf
-fratton
-gwin
-pfalz
-stolons
-hammon
-weka
-stasiak
-dierks
-ngoc
-funicello
-konstantinovich
-tsawwassen
-brocket
-boldt
-bagger
-orthorhombic
-wran
-ludlam
-nrm
-vires
-llanrwst
-masham
-klugman
-astrium
-tripwire
-transhumanism
-bahnsen
-deidre
-wheatcroft
-papakura
-backboard
-sfi
-exclaves
-strongpoint
-chikuma
-necho
-esterhazy
-biofilm
-vintners
-bourdon
-foretell
-atsugi
-busty
-wass
-tope
-fou
-l&n
-macrovision
-eastview
-yelverton
-skien
-freaked
-embayment
-season-opening
-ching-kuo
-saeki
-rille
-warred
-gaskin
-mistranslation
-1.03
-1.04
-unabashedly
-tainter
-multi-unit
-14.60
-coalbrookdale
-trikes
-amaze
-procellarum
-eicher
-mycobacteria
-hamble
-hanworth
-prestonpans
-sarbanes
-surmounting
-valkenburgh
-shorthair
-corleonesi
-stroller
-adjoin
-hsing
-gaekwad
-flustered
-despero
-cathleen
-reassembly
-serialism
-cbf
-laxatives
-stricker
-gins
-bewilderment
-galston
-amd64
-mispronunciation
-mortician
-counteracted
-durie
-zico
-agrawal
-briatore
-bluebells
-thampuran
-badia
-blastocyst
-many-worlds
-meikle
-brute-force
-blakeslee
-kuroki
-yim
-0.31
-gaffer
-mahe
-pwa
-truitt
-1925-26
-gogarty
-slanderous
-feith
-sevierville
-icahn
-pop/r
-temazepam
-ghostwriters
-fulvius
-velo
-gangtok
-0.26
-criminalization
-melfort
-sub-machine
-casella
-kordofan
-tert
-vivi
-aesa
-khaleda
-nonfunctional
-bridgeheads
-midian
-recommissioning
-mid-winter
-9s
-nygaard
-stampa
-denizen
-ballista
-astrotrain
-bartali
-olympic-sized
-sudha
-modern-style
-rls
-8-year-old
-cobby
-gsd
-re-commissioned
-hyperparathyroidism
-visualised
-stokely
-66.1
-saltmarsh
-endotracheal
-deron
-temptress
-okb
-1.78
-calophyllum
-chihiro
-cubed
-claudian
-psychohistory
-exhorts
-r.g.
-roll-up
-scharf
-sugriva
-atri
-rih
-maharashtrian
-meadowhall
-sensationalized
-atrioventricular
-ananias
-keizer
-monistic
-matchlock
-tercentenary
-muqtada
-self-loathing
-logistically
-dannebrog
-amirs
-22,917
-pennock
-nocera
-brainy
-armouries
-fully-equipped
-suribachi
-carlene
-knickerbockers
-rawa
-after-dinner
-duralumin
-happold
-questioner
-agesilaus
-mcclory
-battler
-dvb-s
-megachurch
-relaid
-muslim-majority
-striations
-cheyney
-mandu
-tinkerer
-jarreau
-104.6
-boehner
-splc
-19,167
-freight-only
-puli
-lika
-santangelo
-adam-12
-oystercatcher
-montagne
-shenley
-anushilan
-celso
-ballpoint
-shepherdess
-multi-layer
-pictorials
-constructible
-presuppositions
-man-at-arms
-lanthanides
-grindstone
-tichborne
-coachbuilder
-quartermaster-general
-reducibility
-plasminogen
-rallycross
-humongous
-penza
-tpf
-crandon
-contraptions
-kiso
-kevorkian
-dursley
-confiscations
-ald
-utne
-0.57
-spokesmodel
-kempsey
-non-core
-datong
-corporates
-irritants
-yakult
-markie
-indoctrinated
-thrax
-kodos
-fatter
-windage
-muhsin
-chartists
-mangini
-gewehr
-adjara
-foodborne
-crieff
-ackerley
-ansalon
-annemarie
-dumbbell
-pfeffer
-opjhl
-stringham
-21.30
-modine
-sportier
-transbay
-vdot
-uo
-run-out
-pitot
-renouf
-hartsville
-info-gap
-veii
-vancouver-based
-abridgment
-cah
-acworth
-streetball
-superlatives
-paintsville
-jetpack
-henle
-catto
-healesville
-saru
-kis
-decelerate
-mashing
-ribeira
-westerlies
-ufology
-sweatshirts
-grob
-promulgating
-cerulean
-dunams
-masturbate
-a-12
-in-vitro
-aseptic
-tay-sachs
-macnaghten
-goldrush
-outspent
-cimon
-marfan
-forcefield
-inheritors
-hola
-malign
-break-through
-partington
-falla
-shearers
-ector
-vac
-demogorgon
-carotene
-earthling
-nemi
-canards
-tortugas
-laci
-pillay
-portly
-chagos
-py
-hazelnuts
-boller
-buckhorn
-sliven
-transvestites
-digitised
-sequitur
-1913-14
-one-reel
-weissmuller
-sacrilegious
-delacorte
-empedocles
-transistorized
-maududi
-aisa
-quadrille
-mid-14th
-tiv
-ayreon
-ladybug
-kayastha
-syros
-129th
-undulations
-fulci
-flaking
-1899-1902
-declamation
-zcta
-gadi
-azt
-5:15
-tinea
-scl
-mbda
-petruchio
-genocides
-28.30
-load-carrying
-tented
-depatie-freleng
-reciprocates
-cottam
-panettiere
-twomey
-counterparty
-shego
-shahpur
-eraserheads
-impinge
-staffel
-1033
-3-phosphate
-heriot-watt
-tarbert
-jaune
-multi-lane
-batchelder
-jacek
-circulars
-belgorod
-dvd-r
-cyc
-65.8
-hasler
-strap-on
-beerschot
-masterminding
-qala
-anemic
-ripens
-flyte
-k'iche
-adric
-skydiver
-deimon
-stuckism
-light-sensitive
-lha
-coto
-justifiably
-theoreticians
-biblia
-berowra
-chery
-1-14
-mafeking
-muskrats
-godber
-hyrule
-waverider
-icewind
-mayakovsky
-doce
-valorous
-cuernavaca
-provocations
-birtles
-brueghel
-rushworth
-ventrals
-shortcake
-pers
-women-only
-campbell-bannerman
-rajagopalachari
-dumber
-bardia
-matawan
-lightman
-shir
-fanta
-netbooks
-ambergris
-double-crossed
-non-russian
-ballasts
-arrestor
-bitty
-atum
-nehalem
-md-11
-celtiberian
-raigad
-westford
-0.46
-boughs
-hydra-matic
-scriptwriters
-g-3
-loxley
-kuttner
-saadat
-barolo
-ojos
-wassily
-1984/85
-ballincollig
-wjr
-filemaker
-ligne
-mid-2001
-reclaims
-overrated
-fiver
-fulgencio
-aah
-playbill
-deeping
-unsinkable
-cmv
-rehearing
-gaffes
-chasseurs
-mandell
-sarnath
-prinia
-eeyore
-crudiv
-churchtown
-chuk
-cotonou
-bottomland
-paduka
-brca1
-lior
-omineca
-bobbitt
-hartsfield
-bredbury
-mbps
-timeshift
-23-year
-slovo
-gauci
-fifteen-minute
-naum
-mclanahan
-dhaalu
-renamo
-isadore
-panoply
-mcclean
-1936-1939
-yesudas
-marja
-gaddi
-oscan
-bottom-dwelling
-bla
-paged
-homespun
-u.s.a
-uif
-dallin
-gid
-fts
-mattachine
-tweedsmuir
-oberland
-galloped
-mathilda
-touch-screen
-acis
-shahzada
-eichelberger
-yamani
-forded
-jacana
-decolonisation
-soggy
-zamenhof
-inspiron
-reappointment
-flory
-byam
-non-conventional
-rasool
-snooty
-josaphat
-enlargements
-florentines
-septet
-m56
-stocksbridge
-basford
-pugad
-templeman
-eutectic
-anthroposophical
-ilea
-inter-disciplinary
-sub-regional
-dormers
-arvada
-cupboards
-sub-clan
-mikoyan-gurevich
-untidy
-positrons
-katahdin
-nmu
-cristiana
-gerardus
-lungren
-sains
-pringles
-1.38
-m27
-90-day
-cuming
-waistband
-1.20
-1.24
-unheated
-segrave
-gigged
-dagar
-monkstown
-bratt
-vinge
-gas-powered
-4.75
-ires
-materiality
-siphons
-kiddush
-upendra
-apatite
-elastomer
-nernst
-vociferously
-kydd
-submarine-launched
-skolian
-spofforth
-tigard
-tshwane
-situs
-makeovers
-billups
-unfocused
-guianas
-1967-1969
-exigencies
-re-shot
-phetchabun
-ankeny
-carbocation
-bnc
-goal-line
-picturing
-uys
-crankshafts
-fyne
-rangefinders
-kitagawa
-bernardin
-gysin
-ringworm
-vanillin
-tussock
-incunabula
-margam
-blackstreet
-neves
-neurotoxicity
-lycian
-diez
-elastomers
-sallust
-nine-day
-rpr
-vapid
-upnp
-m74
-tunny
-brandis
-ajmal
-rungs
-submariner
-tarpaulin
-nippur
-overclocking
-switchbacks
-mahomet
-amstelveen
-dunvegan
-baek
-garageband
-canid
-battlezone
-uka
-telefilms
-frsa
-chronica
-c-54
-to-date
-normals
-beatboxing
-cartersville
-24-21
-umbrian
-unico
-pubescens
-damen
-brimley
-trekked
-sharaf
-calamus
-galil
-conurbations
-ostwald
-forex
-rasp
-petroglyph
-sassi
-escovedo
-slob
-utsler
-artforum
-laughable
-yoshikawa
-melksham
-samira
-formula_88
-fan-shaped
-amami
-34,167
-jory
-hiccups
-hellblazer
-candied
-surplus-value
-luchino
-vishnuvardhana
-1051
-acha
-souci
-manish
-radicalization
-cardigans
-lohman
-despotate
-character-based
-yazdi
-cordially
-ogpu
-patricius
-vampirella
-ozymandias
-raimondi
-u15
-winick
-menomonee
-atos
-marrakesh
-305th
-spellbinder
-coolangatta
-uaa
-leszek
-cease-and-desist
-allodial
-cobbold
-khiva
-carabiniers
-saverio
-1.07
-betfair
-a-sides
-sunstorm
-pre-contact
-congeners
-poulenc
-pisin
-feelers
-egocentric
-blanding
-sallis
-mitrovica
-charis
-roya
-ymir
-scriptura
-vexed
-extrapolate
-sone
-naina
-chicoutimi
-ruggedness
-symbionts
-wspu
-praeger
-scola
-grahams
-amaya
-remotest
-kurchatov
-guiseley
-torinese
-palimpsest
-apollonian
-conceptualize
-spafford
-nurul
-ferb
-silversmiths
-rucka
-spatula
-13-12
-gleaning
-mid-1700s
-big-band
-dropsy
-writer/artist
-pudendal
-greensborough
-munter
-leumit
-bowstring
-pothole
-orgasmic
-treader
-adducts
-neckerchief
-cannonade
-extravagantly
-landen
-unionize
-cambie
-showrunner
-gallica
-endorphins
-tremlett
-floppies
-cephalothorax
-lenihan
-dhole
-bhardwaj
-re-examine
-cashiers
-n8
-legislating
-hsr
-phibun
-kirkus
-indo-pak
-bryophytes
-gell-mann
-cott
-castlegar
-bafl
-105.8
-ludden
-mccaffery
-tybee
-ashy
-telamon
-wiman
-newsweekly
-dents
-horsfieldia
-diels-alder
-restorers
-mylne
-olimpija
-trovatore
-elminster
-17/35
-splendidly
-griffen
-sp.
-frown
-begawan
-disorienting
-harborne
-shackle
-alyosha
-64.9
-bussey
-remagen
-mce
-carisbrooke
-groh
-cherubs
-redeye
-rawkus
-taymyr
-baileys
-ultrasparc
-interphase
-letty
-dxe5
-out-of-order
-2-way
-soest
-prestel
-koxinga
-low-resolution
-imploring
-skyteam
-idylls
-rock-climbing
-68.2
-28,500
-spennymoor
-finnair
-parished
-armrest
-arteriosus
-stopwatch
-foederati
-pbm
-30,938
-olive-brown
-unnerving
-co-producers
-sgp
-bonet
-gcw
-amyl
-florina
-uhura
-strike-shortened
-88,000
-fargo-moorhead
-castres
-benedek
-amaury
-copier
-fertilizing
-ipm
-underbrush
-roko
-laundries
-slickers
-user-created
-pusser
-hirshhorn
-sisto
-truckin
-volcker
-bockwinkel
-air-to-surface
-murch
-seafield
-kaif
-auras
-rowand
-f-5e
-tamako
-clovelly
-rebroadcasters
-skinners
-perdana
-14.20
-munchkins
-haupt
-ethno-linguistic
-62.0
-winnemucca
-tukhachevsky
-advices
-essentialism
-morphogenesis
-visualise
-praha
-kil
-pilsner
-aoe
-satomi
-thap
-luanne
-fdc
-imbue
-hl7
-cronyn
-meucci
-tipi
-sundew
-voyaging
-donelly
-66.5
-0.34
-al-hilal
-hurlbut
-yeong
-sabor
-kulaks
-saxby
-lukoil
-mackerras
-collingswood
-five-issue
-ktvu
-greenslade
-formula_87
-cantors
-castleman
-iorwerth
-ratlam
-7.65
-7:45
-craterlet
-inviolable
-lopburi
-mercurius
-manion
-togetherness
-rra
-motorcar
-abhay
-levitating
-pileggi
-pencak
-grappled
-pimm
-gwh
-bentivoglio
-kahl
-peiper
-unraveled
-elmham
-afforestation
-blundetto
-guangxu
-modbury
-dain
-lazier
-21,563
-riverwind
-scatman
-blameless
-faarooq
-kothi
-yukimura
-musketry
-glaucus
-yoritomo
-veronika
-fareed
-12-6
-tomasi
-watchet
-inver
-litigator
-gatto
-pflueger
-kampar
-ventriloquism
-glissando
-jio
-elgon
-stora
-lederman
-sixshot
-axially
-goold
-meanest
-nics
-polyamide
-rinsing
-miike
-n9ne
-ausable
-kring
-slumdog
-redevelopments
-chestertown
-maciej
-trousdale
-frito-lay
-legalise
-gt40
-minch
-non-migratory
-trypanosoma
-well-crafted
-rhein-lahn
-willems
-preload
-schalk
-eulogies
-larder
-acqua
-karaga
-margai
-bartell
-rigvedic
-long-dead
-brp
-skatalites
-mamadou
-helium-4
-tare
-1:4
-istvan
-kugler
-fast-flowing
-gou
-mussoorie
-uranyl
-memel
-anti-flag
-allister
-stay-behind
-ducharme
-chotiner
-cliveden
-bookbinding
-staub
-burchett
-bankside
-kalina
-zale
-long-nosed
-four-minute
-u4
-busa
-mecham
-wooden-hulled
-martial-arts
-lakeport
-non-vegetarian
-cyclophosphamide
-hmrc
-usra
-tabulating
-narathiwat
-parisians
-rodolphe
-toiling
-arnaldo
-renta
-invisibly
-rockslide
-adamstown
-b.f.a.
-144,000
-cattleman
-tiana
-tolima
-regurgitated
-angelico
-swiftlet
-southborough
-conradin
-29.80
-argenta
-styris
-millay
-kuang
-sportspeople
-text-only
-suryavanshi
-nifty
-similkameen
-a&f
-nlm
-first-century
-incinerators
-unsatisfying
-personable
-ifad
-dikshitar
-birkenfeld
-puncturing
-nation-building
-sines
-barenboim
-weighton
-madani
-arachnid
-80.2
-frenkel
-cycladic
-exif
-walke
-nossa
-ledford
-odu
-olduvai
-rvs
-cades
-gimbels
-manzikert
-henie
-free-trade
-chromaticism
-castellani
-doman
-aitape
-bratty
-manaus
-communalism
-1914-1915
-chalke
-bassetlaw
-rindt
-renegotiation
-seay
-5.45
-licht
-wsl
-woodsmen
-fel
-perugino
-maxton
-legnica
-stargirl
-hellinger
-siliceous
-mandla
-igf
-anti-cult
-lethem
-vocational-technical
-0-14
-agadir
-gade
-medeiros
-mihail
-welter
-nyon
-gudang
-abele
-bruck
-trailheads
-yeates
-shoup
-schottenheimer
-newsreaders
-ballesteros
-kofun
-gest
-inter-racial
-insolent
-tonkawa
-littlewoods
-avitus
-0.001
-pch
-grigor
-lyly
-dighton
-patinkin
-taba
-wanger
-cammy
-baldassare
-pre-taped
-a-frame
-drumbeat
-masvingo
-thuy
-lowden
-pumbaa
-members-only
-mog
-testamentary
-paralegals
-o'mara
-ume
-eunos
-fly-in
-scapegoats
-76,000
-cyclopean
-boyington
-1955-1956
-goal-scoring
-deeley
-alkynes
-lateralward
-elohist
-malenkov
-1.08
-surreptitious
-hickox
-piperaceae
-f-4s
-laudatory
-ontonagon
-natsu
-backtrack
-asheton
-jiabao
-wardour
-saiga
-14.50
-garigliano
-zainichi
-rebuff
-fleiss
-pcmcia
-74.5
-marmite
-opuntia
-nerdcore
-mud-brick
-bmj
-kliment
-drag-and-drop
-toscano
-time-travelling
-izod
-elisabetta
-dunked
-spada
-colas
-proa
-0.43
-matsuoka
-skorzeny
-unpolished
-bloomery
-anglo-maratha
-germinated
-pre-broadway
-leeuwenhoek
-rosemead
-gra
-masahiko
-nazia
-enmore
-start/finish
-hydrangea
-majesties
-diorite
-salespeople
-millersville
-gorgoroth
-stellate
-bungled
-chaloner
-radhika
-0.99
-kieft
-republication
-interneurons
-charmian
-gro
-finales
-amputations
-unbalance
-turvey
-internalization
-d&h
-whittemore
-x0
-81.1
-toothbrushes
-aspera
-ramin
-ipi
-friuli-venezia
-hammarby
-chonburi
-gellius
-grapevines
-zutons
-periscopes
-sensitized
-terni
-alyce
-lamu
-vajiravudh
-nso
-pinson
-asiana
-grittier
-inhabitation
-giv
-pelias
-clematis
-hideouts
-4.40
-suzanna
-noord
-lugdunum
-zang
-reattached
-confessors
-refilling
-aherne
-democratic-republicans
-cluedo
-kairos
-naftali
-3-10
-dumnonia
-femoris
-schefflera
-toungoo
-ferghana
-haslingden
-cardston
-out-of-pocket
-fuerte
-airi
-sterilize
-mowed
-formula_85
-abitibi
-tauber
-ifugao
-impingement
-12.80
-iupui
-irfu
-ultra-humanite
-khotan
-lote
-hausen
-wade-giles
-eighty-three
-14.40
-newsboy
-glean
-w.l.
-meta-data
-realplayer
-ultimecia
-fhs
-four-legged
-15.50
-marte
-collodion
-dorsalis
-chobham
-gyeongbu
-eme
-emm
-mononucleosis
-1954-1955
-ataman
-atterbury
-50-100
-vlaamse
-randolph-macon
-dosanjh
-satirists
-peachey
-interbellum
-loosing
-nephron
-althorp
-nunc
-darebin
-kilmacolm
-20-0
-wolfie
-yarm
-waleed
-humfrey
-quirinal
-glib
-kanta
-homeschool
-neuropsychiatric
-commonsense
-gdansk
-ezhava
-teleology
-campagna
-hkd
-u5
-ourense
-oconto
-f0
-nostrum
-ladytron
-cdma2000
-ant1
-officer-in-charge
-ridgecrest
-kriya
-rexroth
-airdrieonians
-sublette
-nisa
-tinder
-4km
-breisgau
-sidonie
-sidestep
-mp5
-bardas
-nusantara
-labelmate
-sub-culture
-willibrord
-mismatches
-demoralizing
-bottler
-scandalized
-massages
-kasha
-lammermoor
-ongole
-rambunctious
-terenure
-quintanilla
-haridas
-1083
-wahhabi
-shintaro
-re-hired
-higdon
-shariff
-byways
-violeta
-tarik
-krylov
-wetaskiwin
-uranium-238
-meda
-monoecious
--200
-yellowfin
-motoko
-106.8
-19,500
-mousse
-usmani
-dreamcatcher
-medavoy
-yatsura
-miserere
-ductility
-civilizing
-13-episode
-anarchistic
-retaliating
-uwb
-pythagoreans
-tuk
-tmt
-behead
-varina
-kubek
-paean
-overtakes
-tanda
-rocs
-reevaluation
-homologues
-bgcolor
-performance-based
-triangulated
-ngong
-kcrc
-biostatistics
-yoshimura
-millcreek
-yusupov
-mqm
-goldenberg
-1.91
-folland
-gabbana
-tramcar
-toenails
-bachir
-axtell
-electrophysiology
-xna
-basse
-nambour
-uncorrected
-joslin
-nagurski
-kincheloe
-22pts
-karai
-sendak
-11th-century
-omd
-f-84
-herodias
-schlock
-congregating
-gizzard
-dialer
-barkin
-kippax
-susi
-.280
-wmms
-1909-10
-chalked
-lif
-gner
-djembe
-licentious
-amateurism
-kcs
-spratly
-25,313
-molest
-stenson
-codomain
-x-statix
-fireproofing
-nuclide
-coercing
-24-17
-republish
-kebabs
-ferryboat
-reconstructs
-colcord
-markle
-longreach
-blacktop
-renny
-1918-1919
-sidetracked
-wegner
-free-tailed
-yongzheng
-knollwood
-stifler
-grue
-sch
-jamshid
-yemenis
-reintegrated
-mesaba
-yod
-najd
-katar
-rino
-canadian-based
-sonars
-raffaello
-shorrock
-bellbird
-inter-services
-doublets
-alem
-cross-city
-ravager
-p-1
-free-of-charge
-forelimb
-wouk
-disodium
-philpot
-40m
-myrsinaceae
-nicolo
-trochanter
-myerson
-shakyamuni
-almere
-citys
-conjoint
-vaseline
-prestatyn
-yellow-throated
-109s
-goyer
-elbaradei
-xinhai
-laya
-conman
-seeped
-tight-fitting
-four-player
-man-of-war
-63.1
-misrule
-0.39
-escapism
-bindu
-faqir
-warrensburg
-gimbel
-alborz
-torturous
-muraco
-weatherall
-repossession
-self-identity
-calcitonin
-cpbl
-individuation
-sissi
-gerontius
-pipettes
-thynne
-bioluminescent
-swag
-denuded
-d'lo
-tuy
-stainton
-sobs
-calista
-hater
-1997-2003
-flowerpecker
-bajirao
-agamotto
-coif
-geo-political
-1977-1980
-dj/producer
-iulia
-haruki
-sailplanes
-outspokenness
-friulian
-marshalled
-partakes
-guptas
-revisionists
-indo-greeks
-yount
-jorgenson
-towle
-asynchronously
-copyleft
-spigot
-khs
-clayface
-lookups
-millipedes
-muralist
-ean
-unh
-schutz
-lachman
-shamisen
-gneisses
-officialdom
-playout
-45,625
-landkreis
-krishnagiri
-ayler
-kesari
-eudes
-sakic
-gelber
-proselytize
-lumbermen
-nobili
-farrier
-well-kept
-joest
-yoav
-hooft
-chanced
-grazia
-teaspoon
-inauthentic
-lamour
-compositor
-torpor
-stipendiary
-anarcho-capitalists
-pacifying
-deliverable
-jangle
-speeded
-roza
-alata
-apna
-emmer
-boitano
-friedberg
-microchips
-slocan
-scanian
-baxley
-powerlessness
-set-piece
-sandino
-bloodaxe
-sett
-roding
-jeppesen
-chipotle
-niguel
-panton
-mentmore
-dhoti
-apostolos
-fullscreen
-cascada
-hagerty
-480,000
-kingswinford
-harari
-dni
-duals
-goober
-first-come
-andolan
-boldon
-herrin
-rehovot
-macrophylla
-offsite
-newall
-nrs
-clasped
-kariya
-haussmann
-yesler
-pharisee
-pietism
-fugate
-babak
-snorkelling
-ngawang
-westborough
-henne
-iskander
-liken
-sunstreaker
-paris-based
-gabrieli
-lodewijk
-hostname
-jackdaw
-diphenhydramine
-signor
-seventy-ninth
-lupinus
-overpriced
-0.61
-linoleic
-pontryagin
-ehf
-agilent
-cameronians
-halvorsen
-subsidise
-64.0
-subdomain
-visigoth
-zetec
-tongatapu
-geber
-agglutination
-moonstar
-caped
-rednecks
-krol
-brotherton
-c&w
-hand-carved
-rhun
-hoad
-1987/88
-mellen
-schouten
-borscht
-hps
-ruminant
-phytochemicals
-woollahra
-biggleswade
-quasi-judicial
-boutros
-shap
-high-efficiency
-water-borne
-colusa
-heisler
-yilgarn
-broomhill
-safa
-eka
-go-betweens
-delahunt
-stateful
-abcd
-800th
-superbad
-tjader
-mirages
-dantes
-frio
-psfs
-tamim
-stempel
-slanting
-conemaugh
-matapan
-bookmobile
-coronets
-speedo
-esol
-sergeyev
-bnl
-cinders
-cura
-round-tower
-atk
-sanath
-hydroponic
-ratchaburi
-35w
-crofting
-simard
-lazaro
-apamea
-ammen
-samothrace
-redvers
-cavs
-left-field
-stef
-dinnerware
-kheda
-motorboats
-vaudreuil
-koussevitzky
-giallo
-cosima
-virago
-four-point
-alkyne
-waterhole
-eeoc
-kool-aid
-hefei
-vid
-populi
-rampling
-medusae
-ellyn
-somatostatin
-vratsa
-jodo
-impartially
-socialising
-colet
-sakaki
-8,200
-danziger
-milverton
-palamas
-shikellamy
-bookcase
-tesoro
-pulldown
-kaku
-iho
-megson
-sub-branch
-leghari
-schoonmaker
-postmenopausal
-ensconced
-nandan
-gastineau
-booby-trapped
-hage
-arrhenius
-scargill
-orinda
-kielce
-non-citizen
-balor
-pythian
-vini
-missteps
-hansom
-thayil
-favouritism
-o-level
-gata
-posidonius
-mortaza
-poling
-kabylie
-raindance
-heavily-armed
-preiss
-plainville
-heaving
-fizzy
-fortepiano
-1899-1900
-clubland
-earthlink
-scn
-mexican-americans
-handguard
-cranked
-mistry
-chatman
-splitters
-indo-aryans
-blackball
-milnrow
-00s
-arklow
-jersey-based
-anaximander
-radyo
-fullers
-fidler
-invalidity
-british-built
-moldy
-bioactive
-toiletries
-lilting
-gardocki
-ha-ha
-16-13
-llosa
-homeownership
-cranbury
-macmanus
-biogenic
-ch4
-kazakov
-lamond
-symbiont
-ramped
-take-up
-b-1b
-tic-tac-toe
-thalamic
-crerar
-action-packed
-arinc
-ocaml
-robotboy
-autopista
-sirsa
-avner
-subframe
-duplications
-voivod
-gemina
-annus
-ipn
-fertig
-technocratic
-rottweiler
-swanley
-yukam
-mtvu
-radclyffe
-amyrlin
-wmap
-manhattan-bound
-lhd
-ponthieu
-kult
-moneylenders
-guilin
-upstanding
-sub-national
-ulc
-oder-neisse
-64.1
-zulia
-26,563
-84,000
-pithoragarh
-bellville
-fairburn
-iko
-calabro
-kaylee
-three-masted
-concoct
-fagaceae
-etcher
-hypertrophic
-neutrals
-wimmera
-hollingshead
-rohilla
-unravels
-shackleford
-sympathizing
-brandished
-haruko
-shinrikyo
-unseemly
-basta
-keystones
-velazquez
-respectably
-neutropenia
-clifden
-tiernan
-cd32
-nodaway
-otherness
-thomasina
-caceres
-cisplatin
-waa
-cuyo
-diavolo
-ncu
-wiese
-demetrio
-southard
-magalona
-neo-renaissance
-blakeley
-laius
-armillary
-gambles
-maka
-milkmen
-luan
-bandara
-subcortical
-pawned
-sunt
-character-driven
-harb
-senor
-2023
-z3
-motion-picture
-dagwood
-spinola
-3r
-microsatellite
-rusholme
-circleville
-bisphenol
-al-hazmi
-romilly
-pumila
-1,2,3
-r-1
-socialista
-myp
-grabowski
-kerguelen
-kashan
-thirtysomething
-unfree
-scafell
-radiations
-80.1
-calrissian
-keeton
-bornholm
-unipolar
-virile
-wkbw
-asianet
-grates
-shapers
-lehrman
-socal
-slatkin
-yendi
-vasculitis
-pumper
-unfamiliarity
-newly-discovered
-31,500
-typify
-attractively
-cabled
-unexposed
-atlantica
-magnanimous
-iop
-alington
-infinitum
-penitents
-boroughbridge
-wangler
-dynamometer
-essanay
-mulga
-2-8-2
-veggietales
-unarmored
-reengineering
-lean-to
-woodworkers
-lari
-rocroi
-8:45
-thyroiditis
-iolo
-71,000
-maybelle
-graeca
-nayudu
-pigskin
-1810s
-panicker
-salcombe
-stinks
-double-elimination
-kamphaeng
-kiril
-meester
-enmeshed
-flavorful
-oikawa
-transcendentalism
-elman
-mazer
-moderna
-s-adenosylhomocysteine
-dasher
-belus
-bivector
-pranayama
-cavalieri
-eprom
-akerman
-yuriko
-chafing
-kiniski
-psychomotor
-busied
-mlb.com
-ritt
-windscreens
-ulaanbaatar
-loins
-mainstreaming
-non-greek
-signe
-passers
-schip
-metzler
-inuvik
-epitaxy
-self-loading
-unceasing
-savarese
-goldhagen
-violoncello
-templestowe
-timespan
-dudayev
-placido
-neelam
-gloved
-neo-assyrian
-ecd
-dushku
-ibbetson
-23,500
-fronte
-benguela
-allsopp
-viols
-constitutionalism
-1926-27
-woodgate
-m1917
-bez
-gungahlin
-kalenjin
-signposts
-rosco
-talha
-molester
-torrie
-breccias
-ventana
-rainn
-joao
-subclades
-saml
-mauchly
-alphaville
-presbyters
-p-n
-m12
-esm
-zubair
-mich.
-ontogeny
-oj
-tsao
-ibu
-secondment
-fatehabad
-riaz
-stingaree
-foxworthy
-annualized
-montalban
-jinsen
-harri
--50
-giambi
-twillingate
-littell
-impinging
-m.h.
-zeolite
-krasner
-co-headlined
-motorman
-come-back
-1956-1957
-paden
-pneumatics
-celbridge
-carpool
-urbain
-ovata
-tangra
-yoel
-concavity
-dublins
-damocles
-sulzberger
-meine
-lambertville
-craves
-tensas
-ittihad
-taijiquan
-leverhulme
-injunctive
-glossolalia
-charnham
-ventilators
-adonai
-skydivers
-fertilised
-ont.
-arterials
-sadomasochistic
-trapezius
-remixers
-nonlinearity
-nla
-undertail
-landlines
-diaphragms
-anaphylactic
-mccausland
-facemask
-brendel
-nambiar
-teme
-terrestrially
-heliodorus
-mich
-tamir
-mcnerney
-bookbinder
-pre-1927
-vidalia
-trackball
-eiger
-evapotranspiration
-pescarolo
-indias
-rearmed
-culshaw
-bigeye
-inscrutable
-line-out
-navarone
-3.56
-glomerulus
-entrench
-plaquemines
-sorell
-talysh
-irredentism
-hagi
-typ
-skowhegan
-catriona
-e-government
-tracklistings
-cahaba
-shum
-writable
-moyet
-kindling
-tif
-suboptimal
-co-branded
-ravers
-dramatizations
-15,625
-harappa
-bawn
-feliciana
-wyverns
-todman
-shatterstar
-grigsby
-carabao
-codice_20
-codice_22
-mandarins
-chungcheong
-parvifolia
-loca
-nonpoint
-agathocles
-bassist/vocalist
-manichaeism
-skyler
-agrigento
-seamer
-dry-weather
-atmos
-tripathi
-hamidullah
-durden
-swatara
-temescal
-kaizen
-talleyrand
-accentors
-feijenoord
-dinoflagellates
-derfel
-sapped
-hobos
-mibu
-erodes
-jeered
-coped
-insolation
-crawfords
-boateng
-ime
-margaretta
-nitrites
-mgh
-coherently
-montalvo
-nuclear-armed
-ravenel
-re-admitted
-strike-slip
-alexandr
-mokokchung
-highlife
-misogynist
-three-team
-bi-annually
-.200
-potsdamer
-postel
-janeane
-stephenie
-casserole
-registrant
-bunched
-aitutaki
-peaceville
-flexibly
-montacute
-tympani
-altamira
-soviet-backed
-cardew
-1999-2005
-chronograph
-mhow
-bethea
-teutoburg
-mais
-tiaras
-chrominance
-spes
-sizzler
-h3n2
-chatroom
-cremonese
-blushing
-purl
-nalgonda
-22.10
-gaskill
-pathe
-hatem
-moomin
-misiones
-doma
-virk
-ramayan
-ralt
-atlee
-soupy
-pudukkottai
-solipsism
-ideon
-hyponatremia
-mallika
-sericulture
-titov
-g-1
-bosnians
-cancun
-gamow
-neurone
-edgecombe
-unforgivable
-ccds
-wjz
-ahuja
-81.0
-24,500
-fomented
-responsum
-urien
-rumney
-carhart
-glazier
-all-digital
-drinkable
-solamnia
-hemiptera
-easy-listening
-confiding
-absolutive
-hebner
-recs
-35,833
-jutes
-vampyre
-epidemiologist
-bch
-meerkats
-ill-conceived
-serota
-savard
-horovitz
-contorted
-tindouf
-pegging
-depositary
-yarkand
-warragul
-fengguan
-tetrode
-schimmel
-richwood
-stationer
-keesler
-tayler
-beaufighter
-aksumite
-clink
-serj
-re-creating
-scurlock
-jailing
-afsc
-misbehaviour
-yes/no
-long-duration
-nnamdi
-unequalled
-luteola
-fencibles
-randhir
-drumhead
-bolas
-ddc
-galindo
-preppy
-braddon
-amazigh
-levator
-full-duplex
-stagings
-cover-dated
-familias
-pro-active
-union-tribune
-colonialist
-78rpm
-i.t.
-amphipolis
-disston
-agron
-kedron
-verband
-moistened
-geats
-astroworld
-gsb
-approvingly
-nutbush
-jablonski
-neutered
-nursultan
-karmas
-ch-47
-c99
-audiology
-preseli
-bloomers
-platforms/service
-canneries
-voyeurism
-bulleen
-anther
-partiality
-audibly
-overused
-moraceae
-carnacki
-blunden
-micromanagement
-refills
-spottiswoode
-holberg
-slytherin
-15-30
-khumalo
-sheahan
-team-based
-njcl
-wuxi
-buddhahood
-shake-up
-nanotech
-nmi
-vice-consul
-73.5
-janikowski
-erk
-unready
-barfly
-snorting
-signification
-wmu
-drupal
-mccovey
-mesilla
-convertibility
-fifteenth-century
-gente
-headbanger
-apolipoprotein
-mccotter
-viljoen
-assignee
-mullum
-hierarch
-pytheas
-charlize
-tharavadu
-campestris
-marrero
-roskill
-okavango
-lounging
-impale
-bride-to-be
-farran
-tu-144
-single-chip
-absolut
-ponti
-viana
-quantifies
-supermax
-trod
-winterland
-sewickley
-aspin
-4-inch
-sips
-re-incorporated
-anderssen
-ermita
-buxom
-dashboards
-kiner
-goldmember
-kabbalist
-abut
-kekkonen
-divalent
-dalmuir
-myotismon
-marsyas
-shirer
-spriggs
-liverpudlian
-ah-1
-masaccio
-tirith
-bemoaned
-cumulatively
-rinzai
-tensioned
-out-of-body
-soundcheck
-platini
-knotty
-desiccated
-switchable
-interposed
-kabc-tv
-winstead
-autolycus
-deformable
-phocians
-82d
-disproving
-asian/pacific
-hammonton
-remasters
-synopses
-mk4
-ayalon
-woio
-categorise
-hagfish
-stolypin
-13c
-raymundo
-bhera
-jojoba
-eckart
-laga
-iterator
-aircrafts
-s.w.a.t.
-adoor
-semigroups
-aurel
-fallible
-jukka
-m203
-livio
-01:00
-terrigen
-shore-based
-swadesh
-mihoshi
-vetinari
-francaise
-hinata
-piaf
-0.56
-localizing
-2004-2009
-northshore
-kookaburras
-obu
-lazard
--100
-aguadilla
-mesenchymal
-quae
-smithereens
-norml
-art-deco
-klimt
-fujikawa
-hadiya
-ayaka
-nspcc
-broomsticks
-toit
-vlc
-juelz
-third-rate
-spoor
-levi-civita
-barbette
-hagelin
-binational
-pironi
-27,188
-1022
-.41
-swaffham
-1.97
-molto
-yavatmal
-8k
-stormers
-eurystheus
-plein
-digitize
-gonda
-mountford
-baha'i
-breather
-oryol
-regionalisation
-sugarhill
-hebb
-jamaal
-neutra
-university-based
-avante
-chicagoans
-tombigbee
-topos
-michiru
-uh-60
-marzipan
-retardants
-scoresby
-airco
-familiarization
-unconformity
-iolani
-still-life
-pandolfo
-salicaceae
-getafix
-debonair
-jagjit
-kyril
-paulino
-varvara
-syphax
-auburndale
-ancash
-boreas
-import-export
-maktab
-blondell
-mouser
-6-phosphate
-yehudah
-sons-in-law
-kamini
-brassiere
-wco
-verdy
-kunoichi
-tirthankara
-1990-1995
-third-best
-horthy
-kingswear
-nyarlathotep
-kroon
-batou
-bangura
-trillo
-aussies
-soviet-style
-cherenkov
-saami
-subclade
-amnion
-enjoin
-filmworks
-biaggi
-wapakoneta
-kannan
-grodin
-mittagong
-dufay
-sprigg
-pldt
-gilby
-lsr
-conoco
-niyazov
-1985-1989
-65.9
-oef
-chalmette
-blatter
-veche
-diacritical
-pantographs
-spiti
-overwintering
-untraceable
-self-assessment
-moslems
-el-p
-brockie
-woodend
-blackmon
-togliatti
-geetha
-kassim
-250px
-dawley
-irredentist
-c.i.a.
-helmond
-fira
-ledgers
-loathsome
-kemerovo
-translocations
-npg
-fairywren
-kaloyan
-sunbelt
-hairdo
-0.38
-81,000
-frictions
-teshuva
-rushville
-valla
-modernistic
-1990-2000
-borlaug
-cambium
-monotonically
-trans-tasman
-gujaratis
-fumi
-lunchroom
-quinone
-company-wide
-single-cylinder
-frag
-baganda
-gelugpa
-franking
-pavonia
-hillenbrand
-rnp
-hamlyn
-dirtiest
-distrito
-fearne
-kotler
-self-concept
-logit
-slaf
-waymarked
-verhofstadt
-mnc
-litani
-astrometry
-photolithography
-shoo
-smoltz
-evp
-erinaceidae
-veikkausliiga
-blockages
-shihab
-sportatorium
-screwjob
-mehndi
-chinese-speaking
-oppressor
-havers
-seina
-masers
-bhavana
-mclagan
-re-start
-faxed
-08/09
-socotra
-kokanee
-bloomed
-leading-edge
-ccas
-densmore
-bi-level
-leadon
-futbol
-ceri
-guale
-trews
-re-appear
-eisenman
-clayey
-acrylamide
-marinelli
-juggalo
-bsu
-ele
-pugliese
-janamejaya
-cannibalized
-northmen
-tanabe
-kisei
-neo-aramaic
-county-owned
-minucius
-yellowtail
-kelmscott
-1904-05
-mescalero
-barkly
-concessionary
-tobit
-lansdown
-rocher
-maldive
-medianews
-107.6
-unveils
-elfangor
-accumbens
-directeur
-163rd
-irked
-wrather
-backbones
-bokaro
-tintern
-5100
-nt$
-filter-feed
-3-12
-rnvr
-abda
-wine-making
-calpurnius
-granuloma
-suge
-alpha-2
-dayananda
-archways
-satb
-papeete
-omnium
-lieb
-tawe
-1.70
-tacking
-rukia
-schyster
-mediolanum
-enniscorthy
-mikheil
-baraboo
-biogeographical
-bungling
-takapuna
-nbr
-berti
-gold-colored
-fluorides
-levuka
-barna
-every-day
-winooski
-brookvale
-wachuku
-zeroth
-massacring
-poppaea
-assur
-3.85
-cayabyab
-jobe
-vomits
-phonofilm
-hillyer
-longton
-overexpression
-communicants
-iveagh
-foden
-parishioner
-granulomas
-tenko
-stegall
-oligonucleotide
-near-complete
-self-doubt
-recouped
-culham
-startin
-boorish
-chillers
-alcala
-hesitantly
-statesmanship
-ameritrade
-travail
-daguerreotype
-shuns
-acquaint
-efteling
-gribble
-angostura
-clockmakers
-jodrell
-critias
-enquired
-year-olds
-struble
-pennsylvanians
-bristle
-millwright
-lythgoe
-retraced
-dolley
-amsl
-schirra
-schnauzer
-rogge
-handrail
-uppland
-shipwrights
-gamepad
-a47
-pccw
-garlick
-rsvp
-cheekbone
-underestimating
-hawkish
-reforma
-bondarenko
-wide-spread
-s.t.a.r.
-aneurin
-bravura
-acd
-79.9
-m61
-passband
-ayeka
-psb
-houston-based
-millimetre
-edberg
-forti
-mobilising
-brianza
-winnowing
-hydromorphone
-cognitively
-brembo
-burdette
-vereeniging
-serenades
-carthusians
-telomere
-jascha
-needful
-taskbar
-nakedness
-brahmos
-propellor
-brachii
-touchline
-151st
-double-album
-maneater
-bening
-tongva
-geet
-lcp
-constricting
-up-market
-outwood
-ow
-r-7
-goalposts
-architrave
-huffer
-nasd
-makings
-retakes
-mizzou
-mfi
-tenali
-pne
-hern
-full-colour
-kaeding
-milstein
-acrylics
-cuando
-varun
-gentium
-d9
-cajal
-jel
-fluoresce
-tezcatlipoca
-drava
-community-oriented
-prang
-kabyle
-rottnest
-intermarry
-71.1
-30,313
-rideout
-barkerville
-protoform
-stymie
-bussy
-petropavlovsk
-grazier
-kuni
-adjudicators
-triskelion
-mcchesney
-kuranda
-rochambeau
-tander
-pounders
-e-series
-qnx
-dvt
-mercers
-
-auriga
-wxrk
-longfield
-foxborough
-narmer
-kristoff
-german-style
-lusts
-a&t
-sudeley
-headwear
-nebbiolo
-urusei
-boddington
-honeypot
-dlf
-dopant
-pagerank
-0.44
-employability
-nessie
-caccia
-phenobarbital
-pre-hospital
-c-usa
-boal
-nazirite
-reshot
-paintwork
-taxidermist
-anticlockwise
-celastraceae
-marcius
-noblesse
-campden
-fergana
-bto
-bolting
-ferrata
-laborde
-aanestad
-sahibzada
-3.52
-dronfield
-head-up
-well-read
-ka-tet
-baillieu
-heterochromatin
-standard-definition
-extrapolating
-krsna
-shirvan
-sedna
-plowright
-greywacke
-sokal
-titanate
-abrasions
-quoc
-plekhanov
-artest
-anti-terror
-attercliffe
-kalmykia
-loeffler
-mercians
-daemonites
-balto
-cgc
-cupe
-chauhans
-bebo
-greyer
-optimizer
-cliburn
-hootenanny
-powerman
-ehs
-conjugates
-metroland
-aish
-catman
-sekiwake
-debenture
-s-2
-subtler
-five-point
-neuro-linguistic
-marias
-synchronously
-huth
-cpas
-digressions
-slash-and-burn
-broadstone
-holsters
-rationalists
-uncas
-demining
-hedin
-candlemas
-catalonian
-thrombus
-garcinia
-antwerpen
-cafaro
-terza
-eprdf
-trample
-nutritionally
-transpacific
-yodeling
-barakat
-sherinian
-centcom
-cobo
-akayev
-chilkoot
-basij
-shearman
-automatism
-tibbets
-punch-out
-throsby
-mnp
-lupa
-tabb
-jiaotong
-groupies
-webserver
-carse
-harmonie
-singapore-based
-unmentioned
-11-4
-missenden
-bosanquet
-kutztown
-atenulf
-crokes
-sportspersons
-thonburi
-iarc
-panavia
-hydramatic
-.260
-63.0
-lerman
-d'asti
-helicase
-drupada
-pathankot
-attentional
-bruxelles
-susskind
-yangban
-morini
-woodvale
-trypticon
-krakauer
-aacs
-rannoch
-culturing
-kamarupa
-reminisced
-ycl
-fogelberg
-yoni
-a-c
-haemophilus
-niosh
-i-16
-bme
-barrhaven
-chessie
-rescheduling
-eckhardt
-vanderburgh
-supertram
-khurshid
-1943-1944
-forefinger
-kbps
-taya
-harmonised
-melita
-satavahanas
-greywater
-lecco
-wadena
-apartment-style
-77.8
-81.2
-15a
-havisham
-limbang
-hoffs
-tannenberg
-simonton
-ambridge
-marler
-tenggara
-macdougal
-redemptorist
-ramanathan
-omnipresence
-30.20
-renan
-thurn
-groat
-mccree
-song-writer
-nct
-speckles
-flavourings
-platon
-fugu
-phonographs
-cosford
-hussites
-muzaffarabad
-kasuga
-najwa
-bryden
-glyoxylate
-on-ice
-poulson
-rhodesians
-acn
-dilettante
-absalon
-1.11
-maren
-scaffolds
-vapours
-tonson
-non-working
-coloni
-morang
-yatton
-guesthouses
-ashish
-wyden
-evashevski
-tortoiseshell
-10x
-grievously
-oglesby
-paraphilia
-meas
-handouts
-unhesitatingly
-jacquard
-tonite
-ssv
-longitudes
-theosophist
-sauvage
-kanna
-ratisbon
-whippet
-junejo
-physiography
-neer
-shoprite
-thronged
-ashot
-hansi
-a16
-blood-stained
-didius
-cisl
-tokyu
-titchmarsh
-fluidized
-unarmoured
-bettors
-marathwada
-garciaparra
-bullfighter
-gender-based
-in-country
-sintered
-zing
-calibres
-federline
-warmup
-alvord
-eustathius
-girija
-timken
-sociolinguistic
-concolor
-anti-pornography
-lampkin
-ivanovna
-backhaul
-heschel
-week-end
-jil
-nsd
-sarit
-thora
-teknologi
-gloating
-iudaea
-acceding
-mineralisation
-uncommitted
-quickened
-jhiaxus
-beezus
-rivulet
-dhikr
-cook-off
-evangelized
-vtr
-cryptographically
-mcchord
-tolly
-pollyanna
-creb
-bilson
-karate-do
-piasecki
-intermarriages
-hometowns
-swoon
-khamis
-beltane
-ulyanovsk
-decompositions
-gef
-wtcc
-millpond
-victorino
-short-sighted
-i-74
-warkworth
-camiguin
-o.t.o.
-leaven
-annealed
-vagal
-benavides
-salacious
-woodhall
-interloper
-senhora
-oland
-helspont
-mitrokhin
-co-discovered
-ephemerides
-wami
-sasi
-modi'in
-nasiriyah
-ingvar
-dueled
-orval
-gouache
-rosalyn
-loblaws
-parvus
-near-total
-gay-straight
-kulturkampf
-4-11
-vermeulen
-dunkerque
-pilton
-shenfield
-carryover
-ofi
-philander
-hilarity
-shruti
-0-12
-wyche
-gamla
-treefrog
-shh
-mayflies
-hookworm
-genies
-gaurav
-valine
-burp
-middle-school
-5-ht2a
-erector
-esv
-satellite-fed
-e-zine
-obfuscation
-scheff
-mouvement
-corbeil
-bunko
-jur
-wwor-tv
-macoupin
-top-scorer
-nivea
-weidenfeld
-gas-operated
-north-american
-baator
-subcaudals
-sonorous
-non-contiguous
-durg
-1896-97
-jala
-catron
-hyperplanes
-well-prepared
-pollo
-ive
-blassie
-hospitalisation
-rededication
-52,500
-lgv
-gatiss
-cassatt
-rosalia
-theresia
-karcher
-girdles
-saman
-potgieter
-humanitarianism
-alesha
-darkforce
-scarecrows
-wtvt
-christer
-shama
-haza
-disfellowshipped
-menhaden
-cronyism
-yauza
-ranganatha
-euroseries
-piccolomini
-zucchini
-ufm
-ya'akov
-swales
-phosphorescent
-tsongas
-margarets
-rkc
-unenviable
-bhairavi
-mashgiach
-foreign-owned
-2h
-cerebrum
-hoberman
-kloss
-godman
-okello
-amm
-spunk
-a61
-curll
-tars
-sipe
-heatherton
-spofford
-tobermory
-joffre
-best-ever
-mushers
-7:15
-roermond
-digestible
-shindig
-reprocessed
-thune
-divinyls
-funan
-rastelli
-calor
-as/400
-salai
-kemetic
-cytoskeletal
-picador
-llantrisant
-unley
-emelianenko
-spectres
-+44
-hering
-kellman
-plaaf
-l.b.
-benvenuto
-xavin
-feverishly
-alyx
-idempotent
-inter-cities
-hetzel
-representable
-charnel
-litvinov
-set-ups
-ilkhan
-cowbridge
-susilo
-rusa
-wigginton
-o'haire
-promenades
-6,700
-mardonius
-meninga
-neorealism
-plant-based
-psychoanalysts
-citgo
-kvm
-chalker
-winky
-athanasios
-61,000
-marae
-marar
-exoneration
-1980/81
-1982/83
-ichkeria
-crudup
-bathinda
-34,583
-kentville
-thin-walled
-languish
-shallowness
-oxaloacetate
-begrudgingly
-sabella
-prosodic
-biddy
-parent-child
-shirkuh
-vulpes
-chattan
-vibraphonist
-postclassic
-regenerator
-font-size
-amelioration
-bhonsle
-baler
-spewing
-dilorenzo
-strategical
-curvatures
-curial
-mweru
-applewhite
-lakebed
-2031
-lini
-maudling
-ler
-megatokyo
-somervell
-ake
-zea
-montanus
-interfacial
-nightbeat
-1942-1943
-tenon
-rudess
-dima
-instrumentally
-iet
-cringe
-hombres
-sangro
-capuano
-scram
-plesiosaur
-otani
-dogmatism
-interlace
-secs
-fisher-price
-alou
-adulterated
-eight-lane
-llanberis
-69.4
-deucalion
-cnrs
-17-13
-klaxons
-stomped
-collectivity
-all-area
-ysidro
-aromatherapy
-crybaby
-pilbeam
-hannett
-epstein-barr
-flexure
-jeonju
-andorran
-sawgrass
-inheritances
-gertrud
-15s
-groth
-blessington
-mcgurk
-doda
-siryn
-finned
-histon
-philadelphus
-videotaping
-dyan
-relapsing
-pre-invasion
-peristome
-inu
-baul
-eucalypts
-mdf
-e.l.
-arghun
-formwork
-cryin
-lugard
-electroclash
-whitetail
-squirts
-oligarch
-railwaymen
-ganapathi
-skimpy
-histologic
-s-pulse
-8.25
-meromorphic
-tertia
-olver
-radioworks
-dumbfounded
-birkhoff
-dhimmi
-reawakening
-caiaphas
-1995-2001
-lackeys
-hygromiidae
-hemorrhages
-6:1
-pedipalps
-2029
-chevrolets
-home-field
-uscis
-minding
-promotion/relegation
-ginuwine
-batholith
-panj
-tylor
-gvsu
-swirls
-queenside
-cruelties
-1998-2004
-llnl
-fortune-telling
-siddhi
-cino
-gimpo
-junee
-ruffles
-marcomanni
-mcclurg
-swarbrick
-80.8
-turney
-privatizing
-kerik
-manayunk
-ntm
-nucleophiles
-poojas
-mfd
-merriment
-bessborough
-philoctetes
-craik
-clares
-froissart
-baboy
-rifampicin
-chicago-area
-eyewall
-halftone
-s-100
-wauchope
-jibril
-weenie
-iterate
-jean-philippe
-0.32
-stf
-third-wave
-conditionals
-basilian
-zep
-aeryn
-leeuwen
-miraflores
-eighty-six
-geoengineering
-synth-pop
-sickbay
-rahl
-pieridae
-kilner
-hammonds
-usfs
-signalized
-faultless
-icicles
-pyrenean
-300-400
-stratocasters
-snitsky
-sunos
-mandya
-+11
-issac
-subsisting
-ghosting
-goldbug
-credentialed
-schemed
-gamekeeper
-abbr
-yakutat
-sayville
-student-faculty
-alderton
-alembic
-hillage
-imrie
-papoose
-subtidal
-sub-committees
-kewell
-vinnytsia
-skat
-elpidio
-veld
-i-iv
-doulton
-regno
-hout
-dunce
-enemas
-cahoots
-like-named
-rampton
-discotheque
-ppf
-26-28
-mehboob
-winnsboro
-owari
-preamplifier
-hakluyt
-traub
-shiri
-1989-1993
-intergang
-codman
-depailler
-frou
-flayed
-wampas
-trungpa
-lpa
-fco
-alloyed
-electro-industrial
-fescue
-fernald
-quartier
-thomasson
-embargoes
-glia
-107.2
-torg
-damar
-hubley
-electronegative
-phosphorylates
-co-creators
-metaseries
-donohoe
-transacted
-stanstead
-ecf
-bex
-jayaram
-pando
-busia
-gibraltarians
-hexane
-junichi
-hakoah
-heyting
-reproducibility
-a50
-dinsdale
-canales
-whitmire
-ped
-24,167
-revelator
-groundswell
-mencia
-3.49
-ahmadis
-shrieks
-anticommunist
-usama
-miamisburg
-alten
-much-publicized
-dl&w
-schilt
-oceanian
-devco
-fullarton
-insubstantial
-one-term
-autosport
-vasistha
-yimou
-funder
-18-inch
-s8
-comenius
-panache
-erl
-amphitrite
-jackhammer
-surfliner
-integrand
-zabala
-herta
-hashana
-petrobras
-subhuman
-authorizations
-europaea
-meggan
-fetterman
-zigbee
-fabriclive
-system/360
-4-car
-abides
-mcateer
-phonologically
-cria
-m-theory
-linnell
-helsby
-2f
-necrophilia
-dgc
-razorbeast
-sarazen
-4g
-julii
-tenryu
-bildt
-reoccupation
-parrotbills
-cruse
-napanee
-long-delayed
-side-wheel
-geos
-sops
-bugging
-crvena
-motorcoach
-gerasimov
-bindon
-chatterbox
-nivelle
-rdp
-persis
-m23
-domhnall
-fonteyn
-cubical
-keisuke
-luteum
-neo-confucianism
-kidron
-mohammedan
-tripitaka
-roadworks
-dengler
-vanquishing
-clovers
-heckle
-bridwell
-sparred
-komeito
-boras
-code-name
-ericaceae
-mutya
-palatalization
-ryun
-linchpin
-cortana
-papilionidae
-mwr
-barasat
-calendrical
-nihilo
-doxa
-gkn
-nerang
-mycology
-dongfeng
-modellers
-york-penn
-bishopbriggs
-guilderland
-naloxone
-wynnewood
-lupone
-cpac
-soham
-nunciature
-eriko
-byfleet
-willowdale
-oakland-alameda
-bhave
-wttw
-sertraline
-macrocarpa
-magha
-sayonara
-debord
-free-range
-plainsong
-bors
-nosed
-1039
-reordered
-rationalizing
-ludus
-nonconformity
-uncritically
-cravath
-cajamarca
-non-singular
-abbado
-ledo
-taytay
-segni
-utp
-icrm
-78.9
-lynam
-braes
-then-boyfriend
-stg
-bardsley
-allington
-paros
-7-day
-plainmoor
-grammaticus
-otp
-hard-fi
-inapplicable
-well-supported
-zwicky
-30,500
-shawmut
-almohads
-artfully
-kyles
-historique
-madhavi
-understorey
-crutcher
-quillen
-trinamool
-leclair
-nakula
-jordanhill
-ludmila
-sharmila
-npn
-hornpipe
-record-holder
-caldicot
-waiau
-fully-automatic
-brick-built
-forgers
-21.70
-iron-sulfur
-renbourn
-swe
-deputised
-c&slr
-maro
-entreri
-grandees
-zarah
-giovan
-ilorin
-mazdoor
-pneumocystis
-douglases
-n9
-kirchberg
-gamesmaster
-0.42
-paleogene
-lanzarote
-afterburning
-sahasranama
-desgrange
-thunderclan
-clansman
-tangail
-upper-middle-class
-showband
-kentigern
-netobjects
-77.3
-motegi
-statesville
-undercurrents
-isocrates
-agfa
-chiquititas
-kaffir
-uniate
-loath
-21.90
-analytes
-sockeye
-cloris
-montini
-cieszyn
-matruh
-clonazepam
-f6f
-jeux
-takato
-sclater
-francorum
-johnsson
-boreholes
-morphologies
-laurance
-zavod
-almohad
------
-rovaniemi
-betterton
-flockhart
-pinwheel
-funnel-shaped
-tollbooth
-treacherously
-schwaben
-impound
-fridtjof
-2-4-0
-boxwood
-arguello
-fifth-place
-messalla
-untie
-jindrak
-risch
-satellite-based
-technos
-dhruv
-shaeffer
-skirmished
-franchot
-tooley
-anggun
-novorossiysk
-radioman
-15-inch
-ravenglass
-abseiling
-fourth-highest
-sorbs
-code-switching
-benfield
-nowitzki
-biella
-reme
-rationales
-nanami
-arachne
-doritos
-pickaway
-1048
-adac
-peace-loving
-dc-6
-corks
-etrigan
-waterless
-73.3
-weng
-ulbricht
-lahars
-paktia
-rajouri
-mursili
-vizag
-taji
-1.76
-ange
-bhagirathi
-marie-louise
-colluded
-ormeau
-dethrone
-guanyin
-well-wishers
-decrypts
-wittering
-mekon
-miklos
-scopolamine
-hansberry
-aristo
-sha-1
-desiccant
-colgate-palmolive
-magny-cours
-tippin
-miyake
-i-29
-azlan
-uk-wide
-three-track
-wombwell
-norville
-odiham
-swabians
-brc
-hellstrom
-whanganui
-walmley
-heysham
-tenable
-mellifera
-ariola
-rackers
-syzygy
-uniondale
-potez
-cued
-paychecks
-bharath
-cryopreservation
-spotlighting
-mogilev
-sinusoids
-hunnicutt
-anacreon
-yearlong
-menger
-bregenz
-mercersburg
-mulu
-magnox
-inhumanity
-tifa
-udet
-nazario
-three-legged
-aerodromes
-c.t.
-laserwriter
-comparability
-irrorated
-aermacchi
-16.30
-oaklawn
-oversea
-uto-aztecan
-two-parter
-heilmann
-unwinding
-salzman
-pyarelal
-cabela
-revson
-standard-bearer
-ensnare
-pennzoil
-cpv
-overprint
-nagra
-inbuilt
-sunland
-workweek
-subducting
-thiokol
-blanked
-intervertebral
-navistar
-italiani
-khoi
-teterboro
-amida
-audio/visual
-franco-american
-briefer
-niantic
-buro
-oversimplification
-bohannon
-rogier
-fille
-corrib
-miran
-whitewood
-footbridges
-scotstoun
-vung
-kinkade
-baldacci
-drawl
-yochanan
-derr
-campari
-empath
-half-staff
-etoile
-burki
-raila
-grothe
-sahyadri
-endemics
-seine-et-marne
-steerage
-cumae
-cason
-ater
-farnum
-healthsouth
-cloverfield
-roel
-origine
-sardars
-unchangeable
-grandia
-extractions
-duguid
-fica
-micronutrients
-marron
-d'onofrio
-pictograms
-counterfeiters
-oncorhynchus
-granulocytes
-effectual
-p.v.
-uloom
-tarzana
-caplin
-labi
-kalem
-redwing
-tty
-tira
-wille
-pellow
-dollis
-re-writing
-pinjarra
-1943-1945
-transmuted
-kristal
-kaweah
-flugzeugbau
-twh
-whipwreck
-2p
-solvation
-apsidal
-hattrick
-stomps
-consultancies
-o'carroll
-fieschi
-henshall
-nanga
-uncharged
-mangaverse
-andreasen
-murcer
-abhorred
-distended
-geralt
-interferometric
-mellons
-westman
-phileas
-lynden
-skousen
-verandahs
-lulworth
-radiofrequency
-pasqual
-naturalness
-arafura
-easa
-callbacks
-insignis
-c-in-c
-dhalla
-sildenafil
-shul
-deacetylase
-chincoteague
-wyke
-complementation
-marmont
-kurland
-novas
-corozal
-ferrocarril
-faceoff
-1983-1986
-serology
-jibe
-zooey
-hargeisa
-korean-american
-poinsettia
-greve
-bronxville
-february/march
-psytrance
-ocoee
-sonorants
-alcide
-despaired
-error-prone
-irtysh
-wykeham
-bharuch
-biharis
-orions
-newmont
-unibond
-rufina
-volya
-feminization
-dolgellau
-reformatting
-g-men
-guillemot
-qayyum
-bronc
-sinless
-samrat
-auric
-unus
-non-random
-microstates
-tamm
-miloradovich
-karner
-phir
-platts
-periya
-vaccinium
-eustachian
-borodino
-bioshock
-exude
-jerking
-engrave
-brasserie
-bnf
-rurouni
-cynically
-brach
-hanse
-overshadowing
-asci
-kilowog
-impious
-toronado
-toft
-splintering
-shiji
-sveti
-kakapo
-cohasset
-savonia
-cuirassiers
-kanuri
-anti-mormon
-nyren
-full-width
-harvestmen
-stormbringer
-zosterops
-comeau
-matrikas
-gfdl
-black-owned
-pangs
-kohut
-detroit-based
-hsh
-belldandy
-agoncillo
-krishnaraja
-bhel
-fulmer
-fisker
-miscalculated
-ala-ud-din
-masorti
-renfield
-pervading
-notchback
-5-ht3
-nappa
-zabel
-izaak
-cartoon-like
-illmatic
-pearland
-melodica
-cuckoo-shrike
-berners
-rtos
-ferrera
-castrato
-100s
-homemakers
-stromboli
-saint-michel
-hypotonia
-rupe
-dignitatum
-baulkham
-out-patient
-gellish
-chimeric
-computerization
-dualist
-holidaying
-theophany
-kodaira
-pfeifer
-valeyard
-tahmasp
-netto
-waiving
-archipelagos
-belizeans
-sanat
-29.60
-sub-categories
-rhos
-mcneese
-68.6
-mercyful
-infringes
-arthritic
-apsrtc
-vc-1
-seven-minute
-person-to-person
-sydal
-evs
-bobbins
-shuman
-pushmataha
-karimi
-brewhouse
-administrating
-superheater
-word-for-word
-barbier
-manzanillo
-laupheim
-swett
-collazo
-lasing
-southridge
-fairytales
-hierarchs
-ddos
-k-love
-stottlemeyer
-ijtihad
-carausius
-kap
-tompkinsville
-bosse
-higurashi
-damm
-masterman
-unconditioned
-lungi
-hydroplane
-vivre
-jerald
-codewords
-sectioning
-turramurra
-berberis
-fatwas
-carboxy-lyase
-wpf
-michio
-engl
-aborting
-ifrs
-67.9
-massachusetts-based
-japanese-language
-stromness
-off-ramp
-wildcards
-spearmen
-controlled-access
-fcm
-powerup
-overwriting
-inverkeithing
-litex
-lipschitz
-chromaticity
-hisense
-fatso
-hualien
-ebooks
-legros
-nd5
-prothrombin
-waylaid
-sav
-oncogenes
-carbenes
-vettori
-twigg
-grifters
-carrere
-reappraisal
-dishonour
-winograd
-tunas
-ticked
-guilt-ridden
-zamiaceae
-klimov
-ignacy
-malinowski
-menziesii
-10-30
-hinault
-coenagrionidae
-antiguan
-business-oriented
-stroker
-naranjo
-soldiery
-flattens
-catecholamines
-bonnyrigg
-nnewi
-66.3
-neurath
-zarya
-taranee
-4:15
-turpan
-contravene
-navasota
-lionhead
-plait
-alki
-terrorizer
-oppressing
-escadrille
-forager
-subsiding
-saugerties
-dynamited
-muerto
-musters
-powwow
-sbd
-toshiro
-slapp
-sunnybank
-handspring
-ehlers
-finglas
-gab
-birrell
-parimutuel
-sensitively
-prostration
-aviemore
-chim
-3-2-1
-culverted
-demerger
-torquemada
-kelman
-j'onn
-enriches
-fledermaus
-verena
-leotard
-tsongkhapa
-adh
-stompin
-rimington
-1.18
-comsat
-aska
-seyyed
-neapolitans
-shaara
-kranz
-non-relativistic
-tvx
-clairmont
-corstorphine
-shihan
-thoros
-turkistan
-l-glutamate
-propagandists
-stegner
-expletives
-wister
-katina
-goldbach
-ilene
-bramah
-brinks
-cementum
-irradiance
-gunfights
-phibes
-giganteus
-laskin
-nifl
-dyce
-sexology
-banteay
-107.8
-codey
-essar
-cameroun
-herford
-abdal
-0.52
-megophryidae
-ex-military
-rockfield
-pt-109
-caraga
-crackling
-psas
-dilley
-e-6
-70.1
-zopiclone
-biasing
-geils
-tsunku
-waistcoats
-ricca
-peromyscus
-macfadyen
-15.40
-1.34
-mcsorley
-raconteurs
-solid-fuel
-29.10
-shikarpur
-brokerages
-millman
-anurag
-yantai
-integrations
-low-carbon
-speier
-wtmj
-talyllyn
-flemmi
-hyderabadi
-time-honored
-shorncliffe
-hojo
-thermostats
-dohrn
-odot
-afflicting
-hachette
-mullets
-westlaw
-xaver
-drina
-gunsmiths
-roraima
-smoothies
-urinated
-recedes
-cer
-15-0
-yc
-steelman
-yahtzee
-chora
-fishy
-igp
-pteropus
-eastwest
-eftpos
-szeged
-buttock
-stouffer
-levert
-seimas
-babette
-wreaks
-netbook
-eavesdrop
-thorsten
-lyrae
-muttley
-two-act
-menin
-12-3
-szasz
-bullough
-vereniging
-carcinogenesis
-service-learning
-leflore
-illicitly
-dirhams
-mannion
-skyward
-chenier
-firmin
-correll
-mlm
-bizzy
-mckelvey
-owatonna
-localism
-edgemont
-subjugating
-starostin
-rito
-non-royal
-larkins
-merce
-mongia
-dendrochronology
-belanger
-clarkstown
-vesuvio
-outperforming
-shirakawa
-merril
-dollywood
-q.c.
-haredim
-wanderlust
-planitia
-steelbacks
-ablution
-83,000
-ibne
-arietis
-sinew
-self-similar
-re-take
-bolte
-annexin
-10mm
-basavanna
-hinn
-formula_93
-mento
-merkur
-perverts
-68.3
-cornerbacks
-m-type
-alexisonfire
-failover
-acquaviva
-splines
-exceptionalism
-cercospora
-o'jays
-tenuis
-bibliophile
-embrittlement
-inchicore
-heparan
-phaya
-chono
-guest-star
-jinju
-searcher
-lct
-leibstandarte
-kedar
-m1a1
-honeyguides
-eez
-whirlpools
-prata
-babysit
-dubia
-vermandois
-cibo
-65.4
-ormiston
-78.6
-pkc
-studium
-12-10
-pretension
-rubisco
-subotica
-subglacial
-sinkings
-receivables
-subacute
-mancunian
-florenz
-wbtv
-chilterns
-walkthrough
-0.62
-self-protection
-monbiot
-virden
-pinstripes
-ethelbert
-clarisse
-taca
-brushwood
-fingertip
-militari
-curable
-tiwanaku
-wittmann
-gold-mining
-saltpetre
-18/36
-mansfeld
-76.5
-alinghi
-omnes
-sangakkara
-adulteration
-bhola
-igniter
-willimantic
-mclemore
-sitiveni
-tohei
-loanword
-neo-liberal
-briskly
-coi
-angrier
-flinx
-self-defeating
-shneur
-pacifico
-oliveros
-gematria
-mrinal
-potok
-millipede
-brande
-metalworkers
-gand
-deh
-forecasted
-effusive
-argive
-nesta
-bertin
-zfs
-caradog
-boons
-wyrm
-suprema
-tyrwhitt
-leiningen
-cresson
-chiam
-foner
-relf
-tlaxcala
-benko
-settlor
-hideyuki
-suiting
-foshan
-kusatsu
-voyeur
-badley
-shoring
-81.6
-posttraumatic
-junco
-transpire
-mance
-1.66
-ekiti
-criterium
-x-y
-neuroblastoma
-29,688
-nachos
-misaligned
-toklas
-soldiering
-tourenwagen
-baggio
-servilius
-ki-in
-kilmainham
-reck
-time-frame
-parsippany
-quivers
-digoxin
-fma
-nailsea
-fibber
-dems
-lubricate
-asiad
-serco
-neuf
-frisky
-chiffon
-fieseler
-shekhawat
-agrarians
-policyholder
-82,000
-erindale
-darlaston
-half-mast
-stigler
-clack
-veli
-67.2
-67.1
-codebook
-wktu
-hubertus
-tilda
-wastegate
-1.16
-carbery
-bembo
-connivance
-seru
-balaklava
-pindus
-12:1
-apparatuses
-chosun
-infiltrator
-steerpike
-troubleshooter
-beddoes
-purna
-chumphon
-arndale
-9w
-133rd
-closers
-tangy
-seven-month
-snowmelt
-slither
-ivanovo
-ludovisi
-primm
-kashiwa
-longnose
-vertov
-oirat
-sanctaphrax
-bellew
-80.3
-compatibles
-autodidactus
-i-mode
-hauler
-sheetmetal
-juha
-goldenthal
-trnava
-divesting
-stefansson
-erlich
-see-through
-lovesick
-poinsett
-lidcombe
-ccw
-bophuthatswana
-hongwu
-niranjan
-keyshia
-animates
-raji
-choco
-marlies
-armm
-upstaged
-gluteus
-4.10
-14.90
-zecter
-tribesman
-gatchina
-sceptics
-disease-causing
-blood-horse
-tanuki
-broadwood
-typewritten
-milloy
-renga
-v&a
-jodorowsky
-tsukamoto
-emeric
-amgen
-sbu
-kardinal
-brasher
-ashurbanipal
-leaden
-partij
-stanly
-hoekstra
-sudler
-costin
-prothonotary
-sonority
-personalize
-jalaluddin
-queene
-matriarchy
-edexcel
-historiographer
-scotto
-nowruz
-belzoni
-five-way
-duffel
-shahjahan
-urmila
-crevasses
-steinburg
-eielson
-vk
-penetanguishene
-kearsley
-corwood
-headcount
-ungulate
-l'oeil
-briarcliff
-naved-ul-hasan
-bourbaki
-commercialised
-painesville
-edgeley
-sagal
-plasticizers
-stickleback
-nokomis
-etawah
-dkk
-chaska
-zugzwang
-serrations
-wsb-tv
-landholding
-bethell
-semolina
-wttg
-yolngu
-ashraful
-septimania
-formosus
-africanist
-2,3
-paid-for
-rhinoceroses
-gilbertson
-jango
-ded
-iranian-american
-iipm
-tarry
-kasten
-woodcutter
-hecuba
-verbeek
-disharmony
-a-5
-tujunga
-okie
-boreanaz
-gopalan
-deerhoof
-republik
-yellow-orange
-kaba
-cpf
-yellow-white
-nilson
-synthesise
-lorillard
-buyid
-lews
-franti
-ibge
-70.5
-hildy
-pluribus
-vibrancy
-lacson
-hands-free
-black-capped
-epigastric
-1nt
-reconsidering
-nithsdale
-mbeya
-anthropomorphized
-sushruta
-holness
-clumsily
-independencia
-hangal
-lauria
-mammal-like
-embarrasses
-colonelcy
-pikesville
-wide-eyed
-roose
-goi
-fundus
-icn
-sidecars
-gounder
-behring
-yersinia
-codice_21
-mantled
-fidelio
-mycotoxins
-honking
-mcentee
-clozapine
-gerund
-twinkling
-nem
-d.a.r.e.
-pentre
-acushnet
-chauvinistic
-quixotic
-perse
-dhawan
-anti-federalist
--23
-suir
-lemp
-skyfire
-mcvicar
-haja
-decapitates
-seashell
-60m
-metamaterials
-71.3
-mva
-braj
-restive
-kemsley
-buriram
-gorbals
-1970-1972
-sarl
-engelberg
-re-invented
-cassowary
-in-car
-neturei
-fabless
-osteoclasts
-eisa
-goads
-chunichi
-typesetter
-dilawar
-neopagans
-anglo-japanese
-inconveniences
-crossbill
-lutea
-virunga
-brachytherapy
-30.50
-bezalel
-jambalaya
-lewontin
-raleigh-durham
-nagaoka
-hkcee
-reverent
-porthcawl
-polkas
-spillage
-bde
-marmosets
-datagrams
-blix
-ogoni
-sherbourne
-magnitogorsk
-franchi
-zima
-parkhill
-mops
-lebensraum
-vigilantism
-galashiels
-kartel
-odp
-jakub
-countdowns
-ctw
-asahara
-pro-french
-sdtv
-warlpiri
-148th
-wedge-tailed
-2:15
-wiggly
-whdh
-busily
-gaspare
-westhoughton
-thyroxine
-wrongdoings
-aiff
-foreshortening
-copperheads
-disown
-cashin
-febrile
-llandeilo
-168th
-b-class
-laminitis
-gallegos
-orena
-biron
-hospices
-amerie
-ksl
-ballycastle
-66.9
-differenced
-khar
-nyeri
-vardon
-manzoor
-port-royal
-ncube
-battin
-stich
-steinberger
-burhanuddin
-charu
-weyland
-abdu
-whorf
-voi
-manichaean
-taishan
-anglorum
-echternach
-brasses
-glamor
-detrick
-thylakoid
-andong
-proserpine
-hephaestion
-tankard
-derailing
-lcm
-9-year-old
-sealink
-povenmire
-hermine
-detaches
-kidsgrove
-integument
-elastin
-loners
-tors
-simoni
-australian-based
-mayenne
-runny
-unrepresentative
-jacked
-aarons
-malic
-minutiae
-jiangnan
-parenthetical
-wavetable
-65.0
-gangway
-macisaac
-northwind
-robustly
-hunstanton
-kincardineshire
-gleaves
-ideation
-alamodome
-historicist
-rosea
-photocopies
-usatf
-demmin
-herniated
-shoegaze
-rodes
-g.g.
-gopalakrishnan
-brutes
-breadalbane
-filipinas
-infielders
-pilon
-cumulonimbus
-salusbury
-chloroquine
-guqin
-noory
-arquebus
-misericords
-egidio
-semi-legendary
-tigrayan
-basa
-foxwoods
-publicising
-bastarnae
-hiei
-unadulterated
-kulam
-kingly
-fol
-juntas
-pupation
-queenslander
-chanderpaul
-cyrille
-magrath
-20-win
-vnc
-time-domain
-ifield
-quickfire
-legatus
-25-minute
-tlemcen
-shota
-mugar
-ribbing
-probert
-leeuwin
-bismark
-pabna
-sheikhupura
-sentra
-metabotropic
-bataille
-courtois
-canam
-caz
-agrobacterium
-codifying
-kamov
-haslett
-yallop
-khandoba
-hurtling
-kerrey
-in-kind
-rabia
-elfquest
-noticias
-dingley
-rossen
-dbx
-i-35e
-stealers
-batasuna
-satterthwaite
-5400
-entheogenic
-wied
-ac-130
-mutilations
-angelides
-kmpc
-wingo
-westmore
-aquilonia
-martok
-takashima
-difford
-catanduanes
-norges
-prestressed
-laypersons
-necrotizing
-13-6
-akeem
-upstage
-reuses
-cackling
-demilitarization
-lemont
-garam
-budden
-moosa
-shodan
-steere
-67.4
-javad
-panopticon
-catalinas
-sprains
-quadrupeds
-dowson
-stefania
-blissfully
-stossel
-translit
-gotee
-rias
-brooking
-eighty-second
-21.50
-guimaras
-ashutosh
-pierrepoint
-repellents
-rathaus
-cowher
-denville
-dramaturgy
-80.7
-darkover
-lipari
-yuva
-plast
-ellendale
-roppongi
-bynes
-busker
-72.7
-kurram
-noboru
-triphenylphosphine
-californios
-alesia
-ehrenberg
-shawshank
-calleigh
-hommes
-guerrera
-juxtaposing
-rotem
-houck
-metallurgist
-uar
-pharmacologically
-mcmahons
-antrobus
-nclb
-urabi
-vallely
-longarm
-suel
-zandig
-manoel
-torquatus
-fone
-tettenhall
-d.p.
-henniker
-irn
-jere
-ellensburg
-hermanos
-word-final
-paedophilia
-nuptials
-chevette
-asdic
-theodicy
-solicitations
-bulwer
-harlington
-malinovsky
-scorch
-calipari
-608,827
-bacteriological
-brecknockshire
-half-million
-isin
-sirte
-shiori
-sheol
-outhouses
-1.89
-nayar
-re-using
-cyclorama
-sylvestris
-spoked
-ezhavas
-mabinogi
-parkhouse
-loveday
-bdu
-seac
-rapide
-delrina
-williamston
-grivas
-hawkhurst
-umphrey
-handmaid
-red-and-white
-expats
-ekiden
-anoint
-shildon
-quilmes
-mixed-sex
-peristyle
-vasiliev
-tradable
-hastinapur
-507th
-reconnection
-voy
-bilton
-deforms
-wmds
-encyclicals
-1978-1981
-sneezes
-sanded
-50-foot
-piemonte
-fogle
-deltic
-mishawaka
-niners
-chaturthi
-rikers
-co-channel
-bochco
-green-skinned
-sukumaran
-coldcut
-23-24
-wardak
-glaucous
-democratisation
-harlesden
-tg4
-sigils
-strongarm
-plos
-ricimer
-mini-tour
-coad
-bijan
-e-cycle
-1987-1991
-left-turn
-undercroft
-malvo
-underfloor
-rejuvenating
-rumbold
-baader
-succesful
-bullfrogs
-abrogate
-eighty-one
-parmesan
-wamba
-formalisms
-wearside
-aldiss
-volkhov
-tenaciously
-rainstorms
-alexakis
-mottoes
-abdoulaye
-helicobacter
-belied
-anarcho-capitalist
-fieldstone
-zucco
-zytek
-sub-systems
-d-2
-lorien
-yoshizawa
-two-hundred
-garver
-ondine
-suffren
-cash-strapped
-waists
-saye
-lifford
-myopic
-siniora
-nestles
-languid
-norther
-jnf
-lyubov
-office-holders
-butoh
-14.70
-maddock
-canadas
-louis-joseph
-gaf
-jellinek
-gillmore
-seba
-botti
-tenderly
-123rd
-4c
-brading
-32,188
-simula
-malm
-wolf-like
-vess
-christianisation
-shada
-qat
-chattisgarh
-tribalism
-sio
-trippin
-lifeblood
-newfoundlanders
-frutiger
-fraserburgh
-anterograde
-liquidators
-high-gain
-gardie
-ashwell
-semiramis
-gesner
-24,688
-alderaan
-turaga
-skyways
-cronos
-tola
-builth
-preveza
-square-shaped
-abiola
-computer-related
-rendlesham
-quant
-nutrient-rich
-chinle
-yoshitsune
-cienega
-monomeric
-scudamore
-newey
-salak
-todays
-paranor
-demographers
-nlds
-angustifolia
-germanization
-fla
-burgage
-seventy-nine
-arkle
-pennsauken
-re-brand
-mckenney
-cabanatuan
-klos
-penises
-begbie
-srv
-down-curved
-cephiro
-conformist
-disavow
-silvestro
-1958-1959
-.15
-colonising
-theos
-100.00
-point-of-sale
-payless
-lyngby
-brmb
-rainford
-corsehill
-phin
-reynaldo
-receptus
-floes
-vvaw
-prizren
-pawling
-front-engined
-badrinath
-wral
-leontius
-heneage
-dawud
-29.70
-quee
-e-mu
-geng
-contarini
-mcgarrigle
-inheritable
-after-market
-chatti
-netbeans
-rozen
-kinki
-exocrine
-paascu
-sirhind
-r18
-under-14
-78.3
-konark
-dominici
-owyhee
-muharraq
-myint
-scobee
-acetal
-takhli
-8km
-lamontagne
-zarahemla
-smollett
-trabuco
-miyazawa
-ellora
-giglio
-then-u.s.
-troopships
-griscom
-havemeyer
-ecker
-caryn
-gulp
-fpl
-comus
-kone
-thurstan
-savants
-laissez
-cranking
-carcinogenicity
-threepwood
-golightly
-sarek
-kibble
-radames
-imeem
-bedlington
-retooling
-overruling
-andijan
-hollie
-adventitious
-berate
-medio
-astonish
-a-flat
-eastern-most
-libris
-keratitis
-galan
-organophosphate
-kelsang
-brandenburg-prussia
-chews
-elessedil
-fda-approved
-gutmann
-asherah
-gathas
-valledupar
-basileus
-autoharp
-disingenuous
-north-easterly
-alimentary
-rayalaseema
-english-style
-boobs
-0.47
-0.41
-cosey
-splendens
-waded
-tev
-vasilyevich
-outwitted
-decimate
-grs
-indescribable
-nestorianism
-msida
-carfax
-dickies
-ossicles
-chemokines
-fossella
-essayed
-huawei
-lpd
-aggregators
-anahim
-ev'ry
-toucans
-okara
-simpang
-montvale
-asada
-tanoli
-histologically
-alejo
-repents
-levada
-post-office
-labienus
-fluorescein
-adwa
-mahabali
-voltmeter
-niklaus
-correlative
-vespucci
-vodrey
-imbibed
-gasping
-workin
-vronsky
-gt4
-chander
-rhus
-shamelessly
-empson
-aetolia
-rbcs
-shorthorn
-dinara
-brandreth
-shusaku
-toron
-villehardouin
-kavner
-newbattle
-plucks
-stelling
-darkens
-luckey
-myitkyina
-fordson
-hustling
-titchfield
-schickele
-rohrer
-brakeman
-patra
-kii
-mcguffey
-tarsiers
-bionics
-cobblestones
-sentul
-gametrailers
-remembrances
-spinosaurus
-referents
-well-researched
-repaved
-gsk
-kors
-tik-tok
-intermissions
-drumstick
-valtellina
-comitia
-66.8
-overo
-foxfire
-redstart
-claps
-wecht
-bogert
-flexes
-macchio
-longmont
-ruppert
-insularis
-deeb
-hydroxylation
-casted
-scalise
-hydro-man
-meloy
-edlund
-mantlo
-pocomoke
-diggle
-nflpa
-magnanimity
-bellicose
-sunnybrook
-zogu
-5150
-dispossession
-pandemics
-gular
-self-consistent
-vest-agder
-pre-tax
-yakumo
-esham
-nishikawa
-dark-colored
-hamiltons
-heiberg
-densely-populated
-graveside
-wfla
-yoshitaka
-scarman
-crisfield
-cowrote
-post-16
-grech
-xanten
-lecompton
-jscript
-machon
-semi-fast
-sharer
-thyestes
-garman
-gentilly
-ambling
-chembai
-rhein-main
-bhagavathi
-pipped
-dnssec
-gleaner
-determinative
-36,667
-jernigan
-mcninja
-barletta
-caringbah
-interferometers
-cabinetry
-free-throw
-6.75
-startlingly
-toko
-palpation
-lleu
-societas
-wheatsheaf
-luftflotte
-mullens
-shuck
-ppk
-rop
-spicks
-well-recognized
-constanze
-leatherback
-univariate
-unterberger
-theophylline
-anglo-russian
-hamama
-dispensations
-dorma
-kottonmouth
-coggeshall
-larter
-restrains
-helfer
-scuzz
-dramatizes
-defacing
-laurasia
-republican-controlled
-cooter
-navis
-deconstructing
-mukul
-archy
-pascale
-wem
-kilbourn
-demerits
-tutuila
-naltrexone
-corbridge
-bremgarten
-50000
-thornburgh
-c128
-janitorial
-glaxo
-ffmpeg
-utkal
-1.82
-full-bodied
-e-4
-ideologues
-chisum
-cdl
-sunraysia
-alkan
-grapplers
-commandeer
-dmb
-lohner
-vinayaka
-jadeja
-akihiro
-alcoves
-k4
-myriam
-northeast-southwest
-koolhaas
-travelodge
-yonsei
-co-led
-palmeiro
-pharmacologic
-tyrannus
-madusa
-habibi
-mq
-jadzia
-herren
-apollinare
-cryptosporidium
-chapterhouse
-panjabi
-gigan
-1965-1967
-four-disc
-volturno
-tonle
-quercetin
-cockcroft
-trite
-amblin
-fazil
-neurofibromatosis
-1946-1947
-garran
-two-reel
-redskin
-ranbir
-staatsoper
-liberalize
-silicosis
-tranquilizers
-hurrying
-petz
-sinaiticus
-faery
-fortunatus
-nonce
-ayanami
-tailgating
-lagerfeld
-araujo
-sa-2
-menton
-creepshow
-hydrofoils
-derick
-eurocentric
-enol
-attention-deficit
-h7
-whereafter
-prophesying
-sportsplex
-partita
-anagni
-kudzu
-kozlowski
-ascanio
-dexterous
-jka
-saint-laurent
-aughrim
-nuala
-logon
-kpfa
-caracol
-self-employment
-wack
-dimapur
-locum
-tla
-distributional
-noakhali
-lippman
-evaluative
-diamantina
-bawa
-susu
-morticia
-foraminifera
-souda
-oberstdorf
-virility
-q102
-cambias
-rymer
-longleat
-talktalk
-many-body
-uah
-aspartic
-quackery
-syndicator
-tangentially
-corden
-carcinoid
-transtech
-disdainful
-platform-independent
-seascape
-metafictional
-unobtainable
-50km
-neurosurgical
-baro
-posco
-andress
-salgado
-non-compete
-rudman
-macdermot
-planed
-warde
-sirohi
-producer/engineer
-starhub
-sepahan
-hakea
-tolhurst
-industry-standard
-rushers
-koper
-trireme
-hackberry
-sansone
-bamar
-noy
-serbians
-arend
-agyeman
-neoconservatism
-superfast
-lic
-crewmates
-bg4
-nextwave
-beleriand
-wikileaks
-starflyer
-kodo
-bostonian
-ragweed
-15-25
-disqualifications
-collinear
-pleasence
-himera
-lashkar
-jankowski
-downtowns
-briefcases
-poale
-mmt
-coucy
-biscoe
-checksums
-neuburg
-flexner
-monic
-teacup
-improvisers
-non-teaching
-wiggs
-us-89
-achelous
-isocyanate
-ahi
-skewered
-ganjam
-0.63
-gendarme
-betrayer
-momoko
-ramsbury
-sarina
-hornsea
-mariya
-temeraire
-unterseebootsflottille
-turman
-vez
-tuneful
-hushed
-inferential
-knighthawks
-premonitions
-targetmasters
-lusatian
-44,375
-greenwell
-dalhart
-hummus
-105mm
-bangka
-p.p.
-sp3
-bironas
-d-ny
-shillelagh
-semi-natural
-berhampur
-akwa
-suevi
-full-on
-sifted
-fontenot
-orifices
-damrosch
-sledging
-medias
-patek
-skein
-in-band
-tempel
-clavinet
-6:15
-lono
-adur
-usl-1
-swabs
-fixed-route
-alok
-caisse
-latencies
-deprives
-bertil
-dorpat
-jinks
-cryogenically
-trelawney
-17-19
-servite
-barlaam
-grantley
-5-day
-efs
-gamgee
-stevo
-soro
-maatschappij
-free-roaming
-viic
-bowne
-mutilating
-lafrance
-makah
-1948-1949
-matthieu
-comings
-myatt
-carob
-elmet
-spartacist
-0600
-heimdall
-esthetic
-loge
-5.75
-self-report
-chardin
-tetrapod
-ptr
-toodyay
-ifv
-dermatological
-waterpolo
-zs
-27,813
-ibi
-barts
-categorisation
-rangi
-wiglaf
-morven
-around-the-world
-pwi
-manju
-jfs
-super-speed
-neuroses
-juste
-prohibitionist
-lecythidaceae
-stockley
-landrum
-burwash
-gepids
-nizami
-yun-fat
-petro-canada
-21-10
-campobello
-liukin
-jms
-bizzarrini
-matto
-oryza
-khaganate
-african-caribbean
-oskaloosa
-retails
-overextended
-polonius
-deferens
-lower-end
-bulova
-vix
-unicast
-unloved
-jcc
-baxendale
-cad/cam
-159th
-jayant
-givers
-galvez
-outmaneuvered
-eleven-year
-anberlin
-insulin-like
-daun
-stijl
-annona
-kiriakis
-drawbar
-wata
-rustavi
-leeks
-maestri
-504th
-mcquarrie
-walmer
-churubusco
-masterfully
-dreadnaught
-pav
-octant
-1:20
-sulfotransferase
-8tv
-heimskringla
-bankrolled
-arv
-deepavali
-rollerball
-rahm
-snooks
-keni
-jackins
-moomba
-catherall
-traktor
-musher
-wpvi
-condamine
-egotism
-15km
-ingria
-ampa
-14-17
-hellhound
-bafana
-romper
-squeal
-reanimate
-carbody
-sangin
-scart
-yasushi
-kriss
-reevaluate
-varghese
-varmint
-mccowan
-cyp2d6
-tumut
-hopalong
-ragtag
-peled
-stuntmen
-ealham
-kothari
-re-engineering
-bettor
-jawed
-scaevola
-cardiganshire
-ocmulgee
-pluripotent
-proprioception
-cimb
-novgorodian
-polonnaruwa
-well-deserved
-girondins
-guruji
-miler
-banksy
-rapido
-unheeded
-borge
-godric
-jboss
-600th
-9:1
-warthogs
-zahedi
-repudiating
-pescadores
-goer
-commentarii
-paiutes
-hollerith
-aranjuez
-lambe
-borg-warner
-sakuya
-circulator
-detests
-perrins
-cabbie
-voicings
-mellowed
-17s
-norwegian-american
-seedeater
-guilfoyle
-acg
-79.8
-witold
-okapi
-last-named
-nanites
-aln
-sub-field
-jataka
-spacy
-wincanton
-gulistan
-uehara
-ingatestone
-balki
-atiyah
-rto
-octa
-gilbey
-barnardo
-shc
-labine
-drei
-donnellan
-cmx
-clickable
-hastinapura
-triadic
-lycra
-house-to-house
-gravenhurst
-songstress
-dhule
-hassel
-dragonriders
-bullfinch
-roeg
-foodland
-asymmetrically
-mastication
-thiocyanate
-huckaby
-d'ken
-backdated
-oxygen-rich
-palladius
-a57
-centralist
-irondequoit
-pinang
-alpes
-montalto
-0900
-oadby
-franchione
-ousmane
-karaikal
-sindi
-lunges
-akilattirattu
-wraxall
-fruitland
-hbos
-decimating
-shelduck
-uffington
-gothic-style
-sabr
-efren
-chichibu
-anti-immigrant
-sodus
-monorails
-penghu
-giustiniani
-otomi
-eight-team
-tewa
-subramania
-opt-in
-plexiglas
-kweller
-enchant
-61.0
-5-string
-vidi
-isomerization
-20-14
-nems
-whitsunday
-glycerine
-a35
-uranian
-aftertaste
-shatt
-us-50
-mudcats
-kanawa
-perforce
-goud
-ortigas
-phillis
-leguizamo
-bimetallic
-kreutz
-avgas
-oleh
-qaa
-an-nasir
-misinformed
-alawi
-legnano
-strathbogie
-amora
-lassus
-linearized
-sirna
-shardik
-thespis
-pictus
-ual
-palaeozoic
-3.58
-breakpoints
-snetterton
-penniman
-raich
-thinly-veiled
-ohmic
-brickley
-hanyang
-shaar
-phils
-purifier
-dut
-tehuantepec
-molnar
-megatons
-go-to
-walkden
-lyran
-dreher
-battlestars
-coffelt
-phosphorylate
-coverup
-locsin
-schoenfeld
-gailey
-epitaphs
-winkie
-hampers
-rollergirls
-spoonbill
-aylward
-campinas
-patronize
-cellists
-yorn
-tah
-highveld
-brize
-expat
-kasdan
-edington
-eldoret
-eady
-nabonidus
-authenticating
-polgar
-lancre
-bemani
-logia
-syunik
-wagnalls
-radek
-simonides
-qamar
-tillsonburg
-polanco
-comique
-prefigured
-allens
-nomina
-playmore
-marginata
-cernan
-lawgiver
-redesdale
-syngas
-mclaws
-manteca
-brinsmead
-cafferty
-madrassa
-sialic
-noblesville
-h-b
-closed-end
-caterers
-blitzstein
-haskalah
-92,000
-accipiter
-beynon
-wachtel
-17.00
-laxey
-cantaloupe
-lesage
-sarthe
-tplf
-biro
-kornilov
-feder
-abaya
-navas
-t-37
-aetiology
-continence
-malevolence
-skullcap
-merrell
-annelids
-ragi
-montlake
-al-bayt
-protectobots
-dousing
-19-21
-feluda
-rockhurst
-kislev
-smarties
-shamsuddin
-kra
-gymraeg
-bootham
-redeems
-life-form
-hiv-infected
-79.3
-hore
-coola
-magherafelt
-monocular
-aznar
-charpentier
-gusev
-curwen
-prabhakaran
-purnia
-chugach
-bhindranwale
-theoria
-non-magnetic
-anticoagulants
-outstation
-labour-intensive
-unawares
-sapiro
-ariete
-noriyuki
-arriba
-all-over
-unencumbered
-jagadish
-blackening
-mehsud
-14-2
-bloxwich
-ngoni
-lobsang
-spacetimes
-14.80
-liaise
-asakusa
-bolshaya
-magnavox
-coveralls
-pocus
-rekindling
-low-ranking
-lepus
-tujue
-burswood
-buechner
-salt-n-pepa
-nursemaid
-cyclohexane
-fumio
-vimana
-metrocard
-tada
-grete
-luckman
-documenta
-perfunctory
-byrum
-russe
-klien
-jaunty
-spongiform
-flicka
-temposhark
-cazares
-popp
-provocatively
-dukinfield
-anjelica
-tampons
-21-23
-vt.
-karori
-eurobarometer
-helvetii
-cockles
-semiconducting
-deutz
--80
-bapu
-bartolo
-ethnomusicologist
-wlr
-f27
-apulian
-songbooks
-arain
-ext
-netlist
-saldanha
-moin
-over-the-rhine
-femi
-hillview
-ronaldinho
-17-3
-20-23
-opcodes
-junto
-leonine
-saltcoats
-longdendale
-busboy
-envelops
-aina
-raghava
-paraiso
-americium
-benkenstein
-faroes
-leadbelly
-codimension
-mactaggert
-all-in
-hydroquinone
-deepika
-epifanio
-eidolon
-half-pay
-colney
-musing
-subregions
-industrie
-relaunching
-directivity
-kodungallur
-cygnet
-maupassant
-inter-related
-seon
-embryological
-sze
-mcsween
-sharrock
-pavlik
-anglo-australian
-dotty
-gunto
-simpleton
-subdomains
-astrobiology
-species-rich
-slapshot
-jatiya
-definiteness
-al-amin
-wuji
-ulva
-nishapur
-brewpub
-spirulina
-revolutionists
-seacroft
-beli
-lindenwood
-orderings
-torquil
-ryszard
-sipping
-ambassadorship
-axils
-weapons-grade
-googie
-deep-level
-mid-flight
-entropic
-lavan
-ecclesall
-vpns
-motorhome
-106.4
-jumblatt
-waseca
-endosymbiotic
-dratch
-ngf
-dvb-h
-zedekiah
-spacelike
-10-point
-espers
-sister-ship
-raymonde
-blackall
-eichsfeld
-sapa
-tupperware
-coelom
-tvo
-self-dual
-keyport
-malaco
-eld
-anthon
-rog
-bishnupur
-croaking
-habakkuk
-cronbach
-boho
-shorthanded
-shira
-froom
-145,000
-10/10
-hoste
-broad-spectrum
-swathi
-knx
-rangeland
-a21
-akihabara
-non-agricultural
-hdpe
-tillinghast
-villosa
-3-car
-soulmate
-bronner
-zolder
-brownson
-fishbase
-30.40
-cowbird
-jingwei
-penetrative
-nazarenes
-oppositely
-metastasio
-atsf
-phai
-lisieux
-pentacle
-kolombangara
-schifrin
-marienburg
-greases
-parapsychological
-ayato
-fixx
-kfwb
-tangiers
-kathi
-microlensing
-cabinetmaker
-svyatoslav
-lindon
-foxworth
-synapsids
-afv
-vestries
-regularized
-chrisman
-non-hierarchical
-marana
-sethu
-hirayama
-katsuya
-osorno
-anadyr
-wbt
-wbf
-nudum
-o1
-decriminalized
-buckhurst
-boku
-gratifying
-outgrowths
-midgard
-gadgetry
-tawi-tawi
-shure
-vlissingen
-errico
-macleish
-disc-shaped
-tartary
-spurts
-fmr
-spondylitis
-razzaq
-meanie
-jussi
-kaganovich
-69.5
-institutionally
-gackt
-bayerische
-ols
-chalcis
-securitate
-hyles
-n.e.c.
-burundian
-gorkhas
-arteriosclerosis
-darkman
-reconnecting
-gunny
-metropolitano
-tepals
-fabric-covered
-ijaw
-umber
-juche
-letelier
-segar
-notarized
-29.50
-weisinger
-cherno
-o'duffy
-marika
-commensal
-henman
-supergravity
-tarrasch
-urc
-anti-crime
-27-year
-cozumel
-snort
-bridie
-runequest
-pmp
-virginia-based
-seshadri
-othniel
-habash
-sturbridge
-schachter
-pru
-deoxygenated
-80.9
-inoffensive
-personals
-moorcroft
-praetorius
-kayenta
-disque
-mukada
-fibronectin
-1950-1951
-six-game
-triborough
-oded
-topol
-repin
-veselin
-datasheet
-lydda
-wrong-doing
-4100
-wattage
-kosovar
-herpetology
-transputer
-goodly
-.375
-taryn
-20a
-mundell
-belsize
-madhva
-malak
-oakwell
-sphinxes
-rasen
-beseech
-12:01
-sleeman
-scrophulariaceae
-wsf
-31,563
-redistributing
-biggio
-pre-raphaelites
-c.o.
-bihu
-psyop
-uatu
-satna
-iovine
-radom
-lengthier
-springbank
-bechstein
-abelson
-nwobhm
-nian
-lugton
-otolaryngology
-ironmaster
-kobashi
-sickening
-quemoy
-narrowband
-cinchona
-evaluators
-tvm
-shapely
-metzenbaum
-alsa
-struma
-souled
-galaga
-nkosi
-3abn
-kynaston
-chromophore
-redeploy
-snake-eyes
-pipework
-interning
-tyrannosaurids
-hard-wired
-blinn
-sherbro
-roasters
-jugantar
-o'casey
-bryozoans
-rhames
-giacinto
-cramping
-re-emerging
-bruenor
-semi-presidential
-digi
-leesville
-bothy
-polychaete
-villars
-piebald
-semi-private
-threshers
-nakanishi
-protoplanetary
-rishikesh
-al-ghamdi
-asahel
-t9
-neo-geo
-kiting
-baley
-1910-11
-pediatricians
-m-net
-a470
-cryptid
-spaceplane
-hohenheim
-reutimann
-chetham
-elen
-morbidly
-multituberculata
-wmata
-gazed
-cowal
-npf
-reformism
-misunderstand
-khalq
-jamar
-yellowcake
-37,083
-bioko
-anabasis
-autocannon
-megacity
-potentiometers
-boudinot
-vsa
-tita
-subtribes
-adieu
-cadore
-bricklin
-autocephaly
-wave-like
-serifs
-dubrow
-tehri
-nangarhar
-armey
-brusque
-amerigo
-cauda
-nitrile
-zawinul
-slaving
-15.10
-juggled
-newswatch
-chun-li
-bharadwaj
-prods
-roerich
-belcourt
-yaya
-prca
-leonia
-jacketed
-inhalant
-prologues
-accentor
-short-sleeved
-keepsake
-gelato
-japanese-americans
-arsenate
-elmyra
-someplace
-wino
-long-playing
-sledges
-al-manar
-gbr
-pacem
-clute
-weds
-euphemistically
-denotation
-printmakers
-filaret
-vasudevan
-calero
-shigure
-chiquita
-1978-1980
-alida
-gadol
-hypnotised
-pinsk
-whats
-pagliacci
-sharecropper
-thiomersal
-anangu
-bashkir
-lavatories
-ribcage
-ashen
-jamba
-schleswig-flensburg
-creases
-sterols
-fabrica
-deram
-pgs
-mik
-freethinkers
-icky
-tog
-viewtiful
-scher
-bautzen
-druce
-montreuil
-hunte
-sturgeons
-oblation
-greco-persian
-hayton
-mprp
-vipera
-agulhas
-polish-born
-boces
-wescott
-cannondale
-ninetieth
-obviate
-avocet
-ipomoea
-lairs
-gregori
-rame
-merl
-antena
-0.72
-triumvir
-volar
-carex
-fraudster
-ft/s
-povich
-unga
-tri-series
-sacro
-stabiliser
-emmen
-carentan
-merit-based
-minho
-sintra
-carnelian
-brenta
-boli
-nationalize
-13.40
-dimensionally
-anse
-wadis
-azikiwe
-dholpur
-schmeichel
-embossing
-houlton
-lyng
-bhanu
-stepanovich
-30-35
-ensnared
-calcaneus
-student-produced
-kfyo
-convalescing
-argives
-mcgivern
-mishnaic
-ironed
-federalized
-negativland
-lesseps
-paradigmatic
-trak
-succulents
-oozing
-lawrenceburg
-kdka-tv
-wani
-steadfastness
-integrally
-omarion
-karabiner
-ornithischian
-crackdowns
-digi-egg
-kinetoscope
-plumed
-nullity
-hans-joachim
-bastar
-alejandra
-kulongoski
-alessandri
-pickin
-eof
-vibert
-byomkesh
-narberth
-setia
-aspirational
-post-conflict
-advantaged
-loko
-zhuangzi
-osd
-dur
-convocations
-glynis
-
-leixlip
-eudoxus
-digambar
-esbjerg
-caras
-deas
-carto
-andamanese
-polluters
-playbook
-adami
-quintets
-mattox
-pahs
-hfe
-leongatha
-wavertree
-palaeontological
-prats
-hund
-grindelwald
-jetsam
-woodie
-kazuhiko
-itv3
-digges
-minette
-mikhalkov
-post-9
-scobie
-ipsas
-foldable
-jol
-mcburney
-picayune
-blacktip
-shoppes
-eubie
-vioxx
-syrus
-monastir
-salvin
-formula_95
-pigot
-satterfield
-edsall
-radziwill
-salutations
-padmavati
-osbert
-liminal
-tzara
-pinnae
-lmu
-headways
-laface
-knighthoods
-jayaprakash
-roadmaster
-tapaculo
-snedden
-alevi
-b.com
-nibbler
-sharada
-carrott
-newdigate
-leber
-boraginaceae
-19.00
-plumtree
-dinny
-wysp
-1966-1968
-ackles
-innuendos
-asroc
-noss
-mihdhar
-bilayers
-jeopardizing
-poros
-mushroomed
-corporis
-pinacoteca
-einsiedeln
-thins
-phthalate
-semana
-scotties
-mannitol
-sherriff
-licata
-pramod
-manvers
-five-volume
-siletz
-fantastically
-megalomaniac
-betas
-106.0
-saponi
-battlespace
-grenland
-jags
-discriminates
-winchilsea
-'92
-jacinta
-honorarium
-vivant
-sringeri
-mcclinton
-schiedam
-silico
-fonthill
-unfccc
-thich
-luann
-creche
-1953-1954
-cecum
-eit
-counterpunch
-fourche
-gakhars
-ym
-ashta
-p-90
-pre-flight
-punk-rock
-bui
-cooktown
-snaefell
-wulsin
-tusker
-cornes
-revellers
-peshwas
-sbt
-bouteflika
-fant
-bernards
-equivalences
-gaudin
-interurbans
-mella
-hauppauge
-sindri
-freie
-peterlee
-spinosa
-garn
-inflates
-barla
-1-dimensional
-fut
-overman
-adhemar
-off-world
-bornu
-sukkah
-seiken
-landman
-razia
-grossmont
-double-tracked
-freestyles
-bridesmaids
-agu
-taekwon-do
-tempi
-regionalised
-khodro
-braincase
-poms
-etl
-100-meter
-kimbolton
-beza
-12-16
-procrastination
-scoville
-awi
-btrieve
-spiraled
-midrashic
-nasmyth
-raynald
-trepidation
-scribble
-acceptably
-self-perpetuating
-biologics
-emigres
-pavers
-jennison
-duratec
-37,917
-postgate
-huda
-i.v.
-massimiliano
-lfp
-homesteader
-f-106
-weise
-1973-1975
-isas
-non-smokers
-tyseley
-capercaillie
-peeks
-highjump
-broeck
-kc-135r
-eulerian
-neda
-francoist
-gamesradar
-zeolites
-thirteenth-century
-single-storey
-heidenreich
-cisterna
-dorrit
-depth-first
-efim
-rotorcraft
-anti-western
-celadon
-rtgs
-gfs
-redbeard
-recaro
-hingis
-tarnum
-veloso
-chal
-marriner
-cheikh
-al-shehhi
-dreamweaver
-greystones
-gyrich
-familiaris
-western-most
-dunraven
-cepheid
-polyakov
-finke
-apollos
-buxar
-sogdiana
-lefschetz
-munsey
-umbro
-waddesdon
-frakes
-.01
-stolz
-cephalosporins
-lowenthal
-mcat
-societe
-co-designed
-bethlem
-serafin
-ip-based
-guarda
-ogress
-deniro
-clarinetists
-sequins
-reiterate
-waifs
-stn
-baia
-sandlin
-pathis
-9660
-ciano
-indiantown
-levey
-pdp-1
-cross-strait
-rien
-malvina
-153rd
-esque
-jayadratha
-mugello
-beddington
-stendhal
-decry
-traurig
-menno
-aridity
-particularity
-vei
-stateline
-braathens
-apted
-choses
-high-dose
-umbridge
-sufjan
-chingy
-lutterworth
-darting
-waterberg
-maclagan
-philpott
-bahir
-hirt
-hisako
-fantails
-bushveld
-yura
-79.1
-reinfeldt
-t&t
-antediluvian
-catwalks
-cuccurullo
-morrowind
-epeli
-moonsault
-mallya
-s15
-1480s
--22
-second-wave
-cliches
-metamorpho
-69.2
-interlake
-brar
-sparkplug
-vicodin
-spicules
-kusel
-unwisely
-0.48
-quantock
-stereoisomers
-caux
-pangbourne
-ncbi
-rucksack
-usi
-niem
-trebuchet
-atherosclerotic
-leopardi
-s&s
-haters
-bij
-hummelstown
-leeb
-rosenheim
-wrgb
-holbeck
-congaree
-wiegand
-gruel
-avantgarde
-bd7
-gca
-neneh
-vsc
-ventilating
-mercantilist
-timidity
-pre-cast
-tyramine
-lauterbach
-eteocles
-kac
-isoleucine
-flicked
-correia
-walberg
-extrapyramidal
-hasluck
-holland-dozier-holland
-bawang
-alprazolam
-falafel
-manan
-scalding
-meinhof
-cyd
-teru
-english-speakers
-snowmass
-ponnani
-bartering
-multimillionaire
-venkat
-outranked
-capac
-thunderhead
-ricordi
-jelinek
-gravatt
-festuca
-look-out
-x39mm
-getafe
-woodchucks
-ble
-yakama
-impi
-brush-footed
-hollands
-boletus
-chch
-breakbeats
-calderone
-feed-in
-poum
-williamtown
-ung
-1971-1973
-licked
-ahenobarbus
-26-year
-mabo
-mtd
-denarius
-wimpole
-72.4
-lanner
-microsd
-bagnall
-paraphrasing
-balsillie
-hunsdon
-zx81
-segregating
-praga
-toraja
-nyt
-fazlul
-branchline
-24-0
-dhivehi
-lilla
-guro
-delacroix
-headdresses
-beare
-gerstein
-second-season
-vamos
-wellingtons
-ilgwu
-wolfhounds
-tav
-vongola
-19:30
-pegmatites
-sub-antarctic
-counter-rotating
-chiyo
-denaturation
-mudstones
-re-arrested
-4-d
-amira
-bisson
-non-african
-cross-cutting
-tremayne
-1.37
-tordesillas
-brb
-almada
-0.76
-chernomorets
-silkeborg
-corella
-nightlight
-108.3
-dot-matrix
-gangstas
-glacis
-leanbow
-v.i.
-aybak
-paperless
-amv
-air-dropped
-mctiernan
-kruis
-cottonseed
-typographer
-human-sized
-under-graduate
-bhaskara
-plastering
-adamo
-76.9
-tourville
-flippant
-oddfellows
-burnand
-kuykendall
-repetitious
-anti-castro
-transfection
-japanese-style
-cambio
-26-27
-atto
-evinrude
-strigidae
-oryzomys
-teva
-yreka
-ramanathapuram
-perishing
-nch
-pollan
-prelims
-automates
-tramiel
-azamgarh
-atrox
-laman
-daioh
-fezzan
-homogenization
-inaugurations
-full-powered
-fatalistic
-narai
-kayamkulam
-6:45
-dewa
-re-raised
-herringbone
-arcadians
-dromaeosaurids
-liuzzi
-yellowing
-kamla
-notochord
-buenavista
-nimmi
-rocketship
-lateen
-perspex
-itp
-robbin
-mbas
-gombe
-devagiri
-podhoretz
-semi-evergreen
-boxford
-mucky
-cleisthenes
-gunslingers
-avigdor
-gladiolus
-sempervirens
-tuf
-zolpidem
-1976-1978
-sawyers
-opensuse
-germane
-terje
-tuvok
-magnifico
-cheraw
-miscavige
-know-nothing
-bailly
-meningeal
-colinas
-tambaram
-twelve-string
-westies
-i-66
-d8
-frogger
-multicore
-aberfeldy
-gorgias
-stabat
-ethnographers
-deryck
-blewett
-power-law
-purdah
-anti-radiation
-mikhailov
-paich
-zoonotic
-stynes
-cubit
-loog
-dutcher
-wristwatches
-figureheads
-sickert
-kennywood
-self-righteous
-gorgan
-pre-medical
-cantt
-hogfather
-edney
-ribozyme
-mystra
-khairpur
-koons
-lizard-like
-savaged
-rosselli
-eboli
-aramis
-whitsett
-trivalent
-scouter
-red-necked
-socom
-peccary
-mediumship
-paternalism
-staterooms
-english-based
-killy
-granulomatous
-36ers
-8x
-neunkirchen
-morella
-mosier
-mlw
-troubridge
-boydell
-kunsthalle
-wyle
-quintuple
-jeannine
-pennebaker
-gowri
-decrement
-concessionaire
-bhrigu
-ammonoosuc
-gatherer
-post-football
-cbrn
-dickman
-telecasters
-swallow-like
-charisse
-polignac
-mcconville
-zhongzong
-powelliphanta
-leafless
-prafulla
-siphoning
-exd4
-conjurer
-mushroomhead
-panglima
-skyhooks
-awk
-laurea
-kemptville
-prouty
-ill-gotten
-antbirds
-waites
-gigantes
-clarett
-road-going
-trailblazers
-santayana
-wallacea
-butchie
-muna
-marj
-bordure
-ultramarathon
-retrained
-starland
-gowon
-non-overlapping
-fellers
-anglo-indians
-rufino
-collating
-clean-shaven
-nineteen-year-old
-dok
-knowable
-shalya
-dioxygenase
-benteen
-oboler
-staph
-stasov
-ghazali
-nowshera
-then-recent
-rajnikanth
-tidbits
-timezone
-wairoa
-rewritable
-tv5
-emulsifier
-shumen
-lajpat
-quixtar
-29.90
-cnh
-muth
-parikshit
-sarat
-ushakov
-bakhtiari
-peddie
-.79
-fleischman
-stompers
-oecs
-bronski
-afrobeat
-laterals
-hallman
-janina
-pulpwood
-squaresoft
-jaja
-shahriar
-cardoza
-reproductively
-yusa
-plaited
-aswad
-darkthrone
-galveston-houston
-roncalli
-raam
-workarounds
-wint
-peete
-loblolly
-alboin
-chalfant
-basadi
-satpura
-staunchest
-quinault
-invergordon
-in-memory
-dudek
-jelena
-kooyong
-improbably
-7,600
-lateralis
-glascock
-gniezno
-teat
-satake
-paneled
-compiz
-5800
-7,300
-wpp
-geographia
-evm
-surma
-daeva
-kasturi
-elaborations
-hopedale
-prekindergarten
-wknr
-laxmikant
-carin
-furse
-347th
-plainer
-cycliste
-dik
-humors
-characterising
-manliness
-terris
-bildungsroman
-toastmaster
-hensel
-troms
-srbs
-knitters
-ohio-based
-benicio
-hazaribagh
-intergroup
-petrovna
-sippy
-69.1
-ordine
-bluescreen
-f-8
-takano
-hatt
-locksley
-17-16
-parappa
-fratelli
-set-theoretic
-herreshoff
-dooku
-innards
-fictive
-parametrization
-shadowpact
-i.g.
-sebold
-planina
-bhasha
-molle
-rumah
-1969-1971
-whosoever
-scottsbluff
-linearization
-nipigon
-niccolo
-chassidic
-jato
-hupp
-resch
-panagia
-1:12
-morihei
-gazettes
-kobzars
-rfi
-trf
-chinas
-sarcoplasmic
-barras
-enunciation
-meran
-kettlewell
-einaudi
-hra
-nasrani
-elimelech
-wanker
-quintesson
-inputting
-nobly
-latching
-hoddesdon
-wieman
-yukos
-early-stage
-negligently
-temenggong
-hcm
-sahar
-need-based
-scrutinize
-3pts
-human-rights
-aikawa
-mazama
-overloads
-ulam
-embu
-prenuptial
-unidentifiable
-mantovani
-raymer
-amargosa
-satcom
-wellbore
-hyogo
-romualdez
-3.64
-unravelling
-dimmers
-nullius
-gp2x
-gamezone
-blustery
-texturing
-vanyel
-graphemes
-aetc
-saint-simon
-zarathos
-5-ht
-henninger
-unixware
-lentulus
-reexamination
-bont
-77.2
-vit
-essentialist
-obeid
-roxane
-coracoid
-queso
-buuren
-1908-09
-improbability
-uncapped
-290,000
-desalvo
-shammi
-holodomor
-innu
-christ-centered
-lieutenant-admiral
-mpas
-streamlines
-locational
-changelings
-gs1
-metallicity
-thangals
-ussuri
-mcmurry
-raquette
-catalase
-riigikogu
-mof
-al-biruni
-sciatica
-peruzzi
-ineffable
-hasdell
-starlets
-boulogne-sur-mer
-post-vietnam
-newburg
-mallrats
-helvetia
-tapps
-fatih
-mexicano
-barged
-vecchia
-gado
-bb7
-french-american
-open-circuit
-llantwit
-fuyuki
-transmute
-inflame
-farida
-midazolam
-underfoot
-luby
-skilfully
-hypoallergenic
-callistus
-.250
-agronomist
-fatalism
-jijiga
-mid-80
-roeselare
-jalore
-hombu
-tisza
-countship
-morais
-boadicea
-thorstein
-thummim
-varnished
-3.70
-marriageable
-haras
-biggins
-galvanometer
-crosier
-mediastinum
-ninomiya
-woh
-perkiomen
-daim
-melvoin
-ebcdic
-kisaragi
-unburned
-neepawa
-lifelines
-hidatsa
-reconnoitered
-ferriby
-wigston
-monocytogenes
-caloris
-nelsons
-hubcaps
-bristly
-serval
-standard-gauge
-abeokuta
-vamps
-suleman
-norn
-zero-sum
-erasers
-feroze
-multistage
-insectivore
-sacredness
-semyonov
-janssens
-bradykinin
-kenzie
-montefeltro
-barres
-keyboardists
-cytometry
-ballinasloe
-hadid
-blowup
-drewe
-nagumo
-cabbagetown
-centinela
-ohne
-balija
-straight-to-video
-5:45
-oberg
-medicago
-rustler
-kajal
-unpatriotic
-senn
-wergeland
-barwell
-mahkota
-weeklys
-louvers
-zwick
-erzhu
-marineland
-legalistic
-maceration
-bernt
-hellspawn
-tegea
-hakone
-rav4
-snowflakes
-hadera
-muscaria
-berisha
-96,000
-pentapolis
-magnes
-mini-comic
-contractible
-lushan
-hsus
-filey
-jaunt
-grisons
-geometer
-abcs
-gagauz
-rydal
-blaustein
-lovering
-hollowed-out
-sherrie
-mauryas
-gang-related
-neuro
-wet-season
-padmanabha
-icw
-reger
-anglin
-reinterpret
-vividness
-rigi
-qq
-research-oriented
-sfor
-nasalized
-pem
-nomex
-shabbos
-ukc
-truett
-abdurrahman
-emas
-lianas
-frenchy
-71.6
-1919-1922
-quieted
-diglossia
-obs
-emetic
-maybury
-lovebirds
-laffoon
-sarris
-neutralised
-eplf
-hepatocellular
-corbels
-revelle
-cairngorms
-hardenberg
-thunderer
-coolmore
-inspiral
-panamint
-cleghorn
-half-duplex
-echmiadzin
-farkas
-caenorhabditis
-defeo
-scolaire
-28.80
-unsuspected
-kirkdale
-watrous
-yezhov
-salivation
-komnenian
-herpetologist
-2003-4
-rodwell
-ouellette
-convicting
-currey
-unjustifiable
-gomphidae
-snobby
-ratatouille
-deceives
-allam
-getae
-frieza
-pooch
-cochranella
-kuskokwim
-self-professed
-wamphyri
-ravensbourne
-minuit
-:3
-maschera
-girls-only
-masumi
-majorana
-cytology
-riviere
-discolored
-estates-general
-macrobius
-rvr
-foran
-boerne
-persuaders
-eiri
-anticiliary
-shellcode
-metron
-trawls
-checklists
-erhardt
-delko
-cornstarch
-defreitas
-rearrested
-peltasts
-awt
-tornal
-shawano
-greytown
-laurer
-handbills
-byl
-homestar
-ravin
-holmberg
-bacque
-slu
-samosata
-americanism
-spier
-adjudicating
-griselda
-degenerating
-mesmerizing
-vannevar
-pedestrian-friendly
-reconnects
-mullahs
-liftback
-parivar
-caracciola
-gandaki
-qd
-mauger
-eutyches
-kasba
-encomienda
-eludes
-pycnonotidae
-carbides
-coxsone
-fingerings
-raincoats
-pardoning
-trinny
-pipefish
-ovipositor
-mrap
-pilcher
-cpan
-athan
-habersham
-berthier
-gaster
-halpert
-cul
-ladywood
-fireboat
-heihachi
-edutainment
-r7
-qari
-liebling
-gearhart
-kaushik
-hungnam
-ybrahim
-lanz
-miltiades
-vasodilator
-corpsmen
-sempill
-thecla
-langside
-routemasters
-bwa
-tippmann
-willibald
-michaux
-widmer
-liri
-ghaznavids
-khushi
-lrs
-gt500
-tfc
-ebadi
-error-correcting
-necromancers
-bhimsen
-legitimizing
-campbellton
-piratical
-tichenor
-toller
-constantinian
-carlota
-arnon
-cognizance
-noncompliance
-bonnett
-koel
-tolmie
-penances
-post-career
-sigue
-fahim
-przewalski
-rize
-serevi
-eberle
-a27
-cuza
-0.64
-dashi
-asaka
-ragman
-bedwyn
-semi-rigid
-khenpo
-ilocanos
-honeyguide
-stovepipe
-smalltown
-gxp
-erythrina
-illusionary
-filo
-ronettes
-hinrich
-guar
-b8
-gonadotropin
-eber
-davin
-catlins
-nsync
-rangeley
-versification
-compere
-bpf
-disillusion
-chetniks
-domenici
-14-18
-kittinger
-diffie
-waldensians
-rootstocks
-amasa
-hazeltine
-halflife
-lovato
-hw
-cian
-sogo
-79.4
-sea-based
-amyloidosis
-baucus
-aloisi
-tessellations
-podge
-fiorina
-deeply-forked
-flett
-irwindale
-ahmadu
-seda
-wais
-tdy
-psycholinguistics
-bulimulus
-demidov
-kensico
-floe
-bandi
-14c
-barba
-vansittart
-montecassino
-yiannis
-gpm
-rheostatics
-meron
-polonaise
-concomitantly
-unaccountable
-pama
-68.7
-karakorum
-jml
-canford
-kirche
-carrouges
-graetz
-melk
-coligny
-wasa
-bradys
-outstandingly
-portholes
-gairdner
-4,059
-vsu
-hobsbawm
-lymm
-uppers
-namba
-proofreading
-albo
-browsed
-hypochondriac
-hipp
-roks
-constructionism
-unr
-calmodulin
-fastnet
-pather
-ashwin
-akihito
-retouched
-tirupathi
-mactan
-kriol
-cero
-pgm
-doogie
-grabber
-adx
-kristie
-salsoul
-sugawara
-non-performing
-finials
-thunderchief
-fcw
-potpourri
-great-grandparents
-modernizations
-arimathea
-jorgen
-yellow-headed
-vacante
-grazie
-masih
-snnpr
-gohar
-scrin
-rehearses
-webkit
-brack
-25-26
-keychain
-40,833
-harrodsburg
-6100
-mitochondrion
-infocomm
-knockdowns
-mused
-diadochi
-mtt
-tempts
-spandrel
-shaivite
-humeral
-metaphase
-139th
-stx
-comprehended
-mdot
-dislodging
-comal
-kori
-putti
-recused
-koragg
-multi-mode
-first-week
-equis
-coonoor
-intraspecific
-nicolay
-estragon
-maroc
-bathes
-lucidum
-cfu
-treeline
-r.s.c.
-therin
-raggedy
-tricksters
-noster
-robur
-playroom
-norwell
-millom
-orientis
-ccu
-kitamura
-dupre
-devourer
-brautigan
-kersal
-sandridge
-jamali
-maji
-aspinwall
-sriram
-stricto
-sudarshan
-absorptive
-biot
-nipa
-inst
-off-spin
-1.28
-patriae
-viewings
-tni
-starfy
-valerii
-soko
-a31
-crossan
-somdet
-88s
-auraria
-chickenpox
-roundarm
-a/v
-henchard
-losey
-monohull
-carnac
-salona
-savino
-70.4
-cardington
-onedin
-2-2-2
-disley
-cottesmore
-well-understood
-comden
-rebroadcasting
-outwith
-onomatopoeic
-texoma
-yayo
-35,313
-tackett
-sterculiaceae
-xvid
-zavvi
-wednesfield
-reform-minded
-verging
-0.66
-ror
-naivety
-boalt
-homologs
-biren
-nabha
-walkerton
-velella
-docu-drama
-tufail
-niamey
-ecusa
-ground-floor
-centrepoint
-dere
-ephram
-goanna
-kory
-semi-solid
-formula_92
-formula_96
-rantzen
-decebalus
-westlink
-pinnipeds
-guardship
-glass-reinforced
-0.53
-beakers
-university-wide
-dongguan
-100km
-tangara
-scoria
-macneille
-dobkin
-kinser
-ertl
-nonconvex
-dunford
-wahdat
-qassim
-sandip
-anti-jacksonian
-oberth
-1.36
-hypatia
-bowell
-agathe
-tsarskoye
-tempelhof
-sango
-lec
-peo
-insolence
-devers
-burlesques
-30km
-first-floor
-escudero
-calusa
-olybrius
-cante
-spermatogenesis
-unvoiced
-nimt
-overconfidence
-grey-green
-quintuplets
-rondebosch
-wingmen
-geomancy
-17.30
-zalm
-bullinger
-belleview
-leweni
-narcissa
-renminbi
-soft-bodied
-24-hours
-bantustan
-open-loop
-chanters
-g-forces
-6-string
-legroom
-mccue
-tobi
-21,750
-ideologue
-hobbled
-bub
-yardy
-classico
-chafed
-counselled
-tullamarine
-ronn
-pevensie
-jakobson
-finzi
-elytra
-redlining
-hypotenuse
-scorcher
-kdwb
-greentree
-turcotte
-wickenburg
-alim
-floodlight
-turgeon
-kermadec
-ladies-in-waiting
-pmoi
-baltica
-kyl
-sharda
-fitzgeralds
-gulick
-walbrook
-half-year
-shakopee
-xrco
-giudicato
-strouse
-excimer
-knema
-gzip
-silvan
-silenus
-mahama
-fasces
-uveitis
-nscaa
-167th
-creuset
-amul
-greenbul
-toefl
-aroras
-slaven
-pounced
-newnes
-owego
-s.t.
-gazza
-rothschilds
-haun
-chittaranjan
-aftra
-news-talk
-abductors
-meols
-quist
-makonnen
-rahbani
-cravat
-airfare
-skara
-hankou
-paran
-63.7
-unexplainable
-jackrabbits
-isotonic
-janzen
-e3e3e3
-landwehr
-risd
-scornful
-dupas
-amidohydrolase
-wewak
-leges
-tobymac
-obfuscated
-elimia
-sabatier
-documentary-style
-epitopes
-sturluson
-ghose
-forestalled
-dovber
-bruijn
-tamas
-1981-1985
-ultra-violet
-ardour
-7-13
-dorris
-willebrand
-fibroids
-nitromethane
-on2
-polidori
-fibration
-childebert
-minako
-counter-measures
-ex-the
-despina
-make-believe
-rusyns
-ethnolinguistic
-paull
-unkle
-jadwiga
-antpitta
-kilohertz
-prepackaged
-shivamogga
-mattawa
-damas
-misnamed
-sherrington
-skarloey
-medtronic
-naxalite
-encyclopaedic
-markland
-1977-1979
-lsb
-portmore
-fiero
-tribunate
-msx2
-skuld
-tri-star
-immobilised
-workspaces
-rotoscoping
-bwo
-tapeworms
-pettersson
-honourably
-eds.
-kotla
-six-episode
-unreserved
-mutha
-ndongo
-y&r
-cdb
-aflac
-wsvn
-garris
-reines
-impatiens
-amiyumi
-durai
-purulia
-1-16
-safina
-ipt
-chorizo
-mail-in
-osney
-fire-breathing
-mistreating
-laffer
-garbutt
-sendmail
-aks
-silverdome
-xanthogaleruca
-platen
-bermudians
-kirana
-medfield
-tungusic
-wiking
-1917-18
-steffens
-wuab
-extragalactic
-bathtubs
-f8
-derricks
-mumbling
-blow-up
-anzus
-second-string
-railfan
-horsfield
-deists
-mythopoeic
-27-0
-79.2
-hillard
-alleviates
-clallam
-selway
-secondo
-chadha
-baenre
-graver
-high-a
-blowpipe
-warriner
-bighead
-chian
-pelota
-ferrymead
-dojos
-cbo
-100-200
-sawada
-bowerbird
-aldersgate
-heinze
-blixen
-taddeo
-widmerpool
-culm
-tailfins
-winterberg
-pay-as-you-go
-dsn
-goossens
-northey
-comerford
-dodi
-giardello
-takedowns
-pannier
-late-1950s
-4.01
-senusret
-ilaiyaraaja
-mightily
-shivalik
-refutations
-aniello
-7,700
-gawker
-decodes
-whacked
-sankaran
-kadampa
-tetraploid
-kae
-steamtown
-fenice
-quaife
-mossberg
--24
-starro
-franeker
-infotech
-rocketeer
-immunoglobulins
-longden
-sivas
-morts
-shekels
-vice-chancellors
-tuffs
-lorrain
-seiyu
-gastroesophageal
-transparencies
-shammai
-chaman
-silchar
-rosebank
-1978/79
-intermedius
-ziauddin
-jihadists
-ishvara
-moise
-longer-range
-pratiharas
-stipules
-kautz
-1952-1953
-collegeville
-rollei
-multirole
-fault-tolerant
-honeyman
-turmel
-pushy
-domenic
-rockpile
-blemishes
-luda
-theaceae
-prosopis
-picky
-cusa
-elyse
-poppo
-a-20
-brushstrokes
-photosphere
-uua
-unblemished
-giraldus
-heirens
-deka
-ndrangheta
-streptomycin
-o-1
-maat
-tuckerman
-sulman
-172nd
-pharsalus
-guardrail
-28-year
-cdu/csu
-monona
-urbanist
-shevlin
-kollywood
-grusin
-a18
-rheged
-three-lane
-etops
-murex
-post-translational
-cahan
-kudarat
-formula_89
-horley
-soapnet
-tytler
-hatay
-philipsburg
-fulke
-crimped
-dacosta
-westerfield
-traprock
-70.2
-tansy
-cartoony
-breeden
-cementation
-64-gun
-179th
-lenca
-mcgonagall
-quillan
-wolpert
-colour-coded
-re-packaged
-all-decade
-methylmercury
-upregulation
-a49
-nahmanides
-nro
-sweats
-quietness
-near-term
-hanmer
-masayoshi
-25,938
-co-anchors
-maerad
-tongeren
-ibp
-baraga
-astragalus
-south-westerly
-cruisin
-reisterstown
-100mm
-possessors
-sparsely-populated
-mukilteo
-khachaturian
-coble
-dorrance
-cochinchina
-showplace
-perlecan
-foro
-combatives
-coontz
-opechancanough
-must-carry
-tozzi
-goldblatt
-guindy
-marly
-kember
-ishiguro
-babka
-three-match
-ssangyong
-leir
-28-29
-milnor
-oeste
-engr
-hyracoidea
-fat32
-alnus
-ethnocentric
-padraic
-ostrog
-biwa
-a.n.
-truncating
-virtualized
-dearie
-sarod
-antinous
-inside-the-park
-phantasmagoria
-gardermoen
-mmog
-austria-este
-madrasahs
-pana
-panza
-hershel
-banknorth
-actor/comedian
-tell-all
-lpfm
-stabile
-lubyanka
-unfailing
-early-1970s
-kinta
-fenris
-nmt
-four-fifths
-pua
-rhi
-wer
-tracii
-iwao
-furnas
-tgf
-mahlon
-f-type
-saosin
-broadrick
-then-senator
-conwell
-t0
-nutan
-corsini
-tnc
-upperclass
-s.n.
-brattle
-a55
-saenz
-tocharian
-bayhawks
-ranaldo
-phenethylamine
-amberg
-kangsar
-75.5
-sharpless
-taelons
-pegasi
-tajima
-xchange
-hause
-luzhkov
-oakmont
-klsx
-miryang
-rania
-1994-2000
-rickety
-ksfo
-holabird
-22.20
-diplopia
-plagiarizing
-actinide
-bassein
-ushant
-gagetown
-mien
-marvelettes
-allopatric
-cockade
-banavasi
-facetiously
-fisheye
-vice-governor
-lasher
-phet
-coady
-ladyland
-tvxq
-rapturous
-jamerson
-fairlawn
-crossfield
-ovechkin
-short-form
-kristiansund
-akim
-olfaction
-praed
-mushy
-ruellia
-shinshu
-mizoguchi
-3000m
-technocracy
-abductees
-superbas
-redick
-piceno
-irreverence
-proof-of-concept
-tickled
-manduca
-burgon
-golos
-curule
-bell-tower
-mcgough
-hairstreak
-gabbard
-magh
-guarino
-sanitorium
-kamber
-oatlands
-moraga
-pmg
-smallholders
-heartaches
-transsexualism
-2008-2010
-younkers
-llyr
-us-40
-spellcasting
-dardenne
-gauley
-dvd/cd
-papillion
-end-of-season
-ramtha
-mid-1900s
-generalisations
-lithe
-tari
-hellebore
-decathlete
-barka
-nibble
-histocompatibility
-smolin
-hanjour
-zucchero
-vermouth
-debden
-single-line
-3.53
-dispelling
-uhud
-levenson
-alemanni
-jahwist
-sunline
-treorchy
-d10
-nlb
-indigestible
-recollected
-tyers
-leko
-erskineville
-shee
-arrogantly
-1991-1996
-eoc
-optimistically
-abril
-steyning
-mcnish
-sarda
-gouraud
-ambergate
-caprock
-wallaroo
-p7
-mejia
-bangash
-subpopulation
-electrostatics
-stis
-airfix
-sifaka
-endorheic
-incalculable
-obeisance
-pre-olympic
-alys
-luongo
-lorica
-orzabal
-nawa
-lipson
-ktn
-coatesville
-numismatists
-arabization
-devol
-yellowjackets
-2004/5
-vannes
-lytic
-oligopoly
-librettos
-depressant
-southold
-mawashi
-laskaris
-warded
-marwood
-dinalupihan
-foundering
-sartain
-waiheke
-trifecta
-hildegarde
-evian
-seacliff
-mrd
-doby
-postmarks
-innovating
-stop-over
-folke
-aruban
-paracha
-similar-sized
-single-payer
-driverless
-sociability
-janov
-doxorubicin
-heuer
-webelos
-inducting
-palamedes
-five-a-side
-half-man
-andranik
-7400
-sfry
-unfpa
-natural-born
-ixion
-pre-mrna
-kufra
-stresemann
-bxc3
-ywam
-multi-functional
-18-pounder
-hanky
-morrall
-ridgeley
-motorhead
-elke
-throttled
-six-song
-1992-1997
-spadefoot
-hostettler
-cals
-zipra
-DGDGDGDG.DGDGDG
-entrap
-pronto
-magid
-amorites
-rambles
-pre-code
-neoplasm
-ostracised
-producer-director
-middle/high
-foulke
-kulp
-1903-04
-belmondo
-mineral-rich
-1915-1916
-windpipe
-prince-bishopric
-decavalcante
-mega-cd
-spallation
-delfin
-angevins
-speckle
-mikulski
-poti
-saponins
-karlstad
-coasted
-legoland
-mariucci
-caeciliidae
-tta
-kisii
-koinonia
-statist
-halfa
-murkowski
-poznan
-albumen
-77.9
-r0
-mcenery
-personify
-giant-size
-dsd
-h&h
-tats
-crl
-sinc
-loman
-ravenshaw
-amyntas
-lingpa
-mgmt
-replicant
-flamen
-slat
-uvm
-pawnbroker
-rodina
-rhodopsin
-delbarton
-marsaxlokk
-ordre
-heinsohn
-tembo
-1470s
-croxley
-weiler
-perimeters
-shafi'i
-frisk
-chuo
-irondale
-vdl
-tortosa
-greenall
-broek
-dysphoria
-cantorum
-navvies
-arkell
-spectabilis
-hammad
-daylesford
-acuminata
-mediabase
-67.8
-dispirited
-orphic
-high-impact
-slicker
-cosponsors
-indiscreet
-hokum
-semele
-lacerta
-woolfolk
-president/ceo
-rast
-tbc
-godflesh
-bacteriologist
-jalna
-seminarian
-dirck
-platteville
-agglomerations
-duchesses
-dyspepsia
-scandium
-hamdard
-balaghat
-prabhat
-clerkships
-distill
-sule
-longueville
-centralisation
-laburnum
-paramedical
-villani
-fabiano
-limburgish
-ataris
-wrack
-timmerman
-watchmakers
-shantou
-fruitcake
-0.37
-asymmetries
-1000km
-tendai
-bauchi
-fagus
-brakhage
-0.86
-anthocyanins
-shinzo
-sisu
-gadsby
-paria
-morneau
-40,417
-wcha
-incredulity
-merian
-lighter-than-air
-ingall
-emeril
-balakrishna
-spilsby
-greenlawn
-ingeniously
-bettman
-x-sampa
-workbook
-aristagoras
-centra
-uyezd
-petliura
-majic
-ceramide
-oligosaccharides
-jupiters
-mavor
-tangshan
-wfld
-dos-based
-carley
-gyeonggi-do
-bracegirdle
-vocalized
-intan
-compulsively
-amalthea
-auricular
-sketchbooks
-super-soldiers
-konda
-ape-like
-diophantus
-lenski
-sanofi-aventis
-college-age
-ocracoke
-pythia
-heigl
-kurama
-non-racial
-streptococcal
-kavya
-superstardom
-skiffs
-daspletosaurus
-seah
-arash
-cetinje
-marinovich
-n.t.
-montiel
-misericordia
-shuttlecraft
-kokutai
-angioedema
-intercalation
-rewinding
-her/his
-ebdon
-kgo-tv
-multilinear
-ahn-chae
-jodha
-stagehands
-mid-2010
-super-powers
-hughenden
-decimalisation
-whn
-sandeman
-ongc
-seismically
-pudgy
-dinapoli
-connah
-illiac
-spvgg
-lowder
-79.6
-queensbridge
-matura
-torc
-maspeth
-sandanski
-kelvingrove
-emancipate
-c9
-harveys
-cost-efficient
-notarial
-wgs
-bhagwati
-silvertown
-yomiko
-ventre
-non-rigid
-shaef
-lyari
-popery
-caloundra
-duckworth-lewis
-zhob
-artabanus
-fifteen-year
-lakeba
-up-and-down
-slezak
-indrajit
-tornus
-silverlink
-werrington
-cjk
-andel
-parsees
-seko
-mironov
-pugnacious
-huy
-canadien
-belgacom
-libelous
-bukharan
-quick-witted
-insurrectionary
-kailua
-touhy
-udder
-hutto
-scheffer
-sanchi
-pollsters
-buckcherry
-pennyworth
-48-hour
-horsforth
-ameliorated
-bayle
-trefriw
-tashlin
-distributary
-prolegomena
-gobiidae
-chappaqua
-aukerman
-teledyne
-pions
-intercellular
-leofric
-nyit
-urbis
-huizenga
-locrian
-proto-language
-conlin
-vetoing
-schulberg
-garters
-borlase
-ethicist
-wiranto
-on-the-spot
-ifrcs
-feni
-elmasry
-drupe
-kcbs-tv
-dorne
-bazan
-deconstruct
-bulkley
-18:30
-balai
-sangley
-populaire
-gobots
-lycans
-nicklin
-half-moon
-apalis
-everingham
-69.8
-sun-god
-1029
-nankana
-castanea
-cornaro
-nizams
-down-turned
-chavan
-double-deckers
-liliaceae
-woronora
-multithreaded
-0.54
-kashin
-varitek
-tsereteli
-belittled
-fodio
-belge
-roosendaal
-unrealistically
-cuyler
-antshrike
-bridgeville
-saint-maurice
-prensa
-kindler
-obafemi
-elaida
-glu
-norco
-kilbey
-bohan
-herodian
-hawai
-re-aligned
-aksu
-non-formal
-audio-animatronic
-lehto
-hesse-cassel
-8,300
-ote
-monch
-inupiat
-430,000
-field-marshal
-vampaneze
-forbush
-tgif
-androgenic
-self-reference
-bhau
-chaoyang
-34-year-old
-rambeau
-goldy
-seaborn
-screwattack
-hendryx
-tena
-nunchuk
-iza
-esmonde
-babble
-mclaury
-hotdog
-workpieces
-25000
-hutter
-musil
-kitakyushu
-moosehead
-tvp
-midges
-qara
-mendi
-aopa
-rameses
-dlamini
-tsm
-bws
-valon
-sarovar
-15.20
-tatami
-redback
-50-minute
-aptos
-overdosing
-dmytro
-teredo
-qahtani
-aorist
-resiliency
-abravanel
-irrationally
-rakuten
-hattin
-llanos
-ligon
-alkalosis
-coal-burning
-38,333
-pericarditis
-jushin
-katif
-karami
-short-wave
-eph
-surplice
-adoptee
-unbundling
-austrian-born
-djoser
-shoeless
-nyaya
-al-nasir
-subhadra
-timucua
-ha-levi
-harlin
-mimas
-computron
-tyana
-hizbullah
-aluf
-ivermectin
-1.42
-veg
-gregan
-faneuil
-kediri
-allot
-sherds
-wls-tv
-mainlanders
-demarcate
-fogh
-bittner
-labile
-unverifiable
-4004
-red-winged
-29.30
-noninvasive
-f-4e
-herlihy
-bucked
-impiety
-39,583
-tuohy
-wingback
-nauruan
-kro
-selah
-helaman
-wajid
-darkling
-kaserne
-bastien
-h8
-canted
-nev
-rimless
-keris
-79.7
-cross-references
-katsumi
-apus
-laure
-draping
-munchausen
-pectoris
-dangle
-kawanishi
-hart-davis
-recolored
-serravalle
-black-necked
-brandywell
-trover
-buhler
-biographic
-tartrate
-melanomas
-komarov
-man-hours
-saen
-faunas
-tipsy
-gerbert
-call-sign
-tlds
-comely
-mearsheimer
-cosmologies
-sonnenberg
-benoist
-soyinka
-teats
-24-14
-yuan-ti
-clann
-baathist
-kerstin
-solovyov
-msnbc.com
-wole
-aadi
-merseybeat
-khouri
-sanssouci
-donte
-ravinder
-sizzling
-hyattsville
-respirators
-negras
-non-avian
-shuji
-pff
-railcard
-echidnas
-chariton
-koni
-gormanston
-funchal
-low-wage
-bartz
-155,000
-parang
-daisley
-hjk
-dillingen
-chaozhou
-columban
-fsi
-fsp
-thruxton
-nayau
-txdot
-killzone
-ird
-zenawi
-1.26
-moshoeshoe
-unida
-sikri
-rindge
-jelle
-1.17
-mid-major
-5c
-karaj
-gic
-tickell
-lites
-lepore
-deano
-survivorship
-sanatan
-kryvyi
-cumann
-4x200
-crossland
-bille
-moulins
-polyacrylamide
-lammers
-geo-force
-cavalcanti
-guercio
-nutritionists
-anchovy
-cisticolas
-dypsis
-popcap
-attenuator
-amble
-near-field
-ostrava
-20,313
-patentee
-brittan
-gold-medal
-2-1/2
-dunant
-shingon
-coolbrith
-namboothiris
-hochschild
-hermeticism
-ouspensky
-renege
-wide-range
-saket
-refocusing
-eichler
-alkalis
-duddon
-66.6
-barnhill
-rlds
-shigeo
-finalization
-24-cell
-borglum
-googly
-gynecologic
-volney
-carrigan
-gcs
-warnes
-quiero
-zink
-implode
-thiosulfate
-centreline
-flycatching
-2-day
-scalzi
-qila
-niekro
-vassallo
-autodromo
-ebla
-kenwright
-institutionalism
-posies
-kuen
-psychometrics
-story-arc
-grappler
-galifianakis
-liberalizing
-re-occupied
-two-night
-dalia
-amorc
-sainz
-bolinas
-physio
-marts
-arsinoe
-154th
-salla
-break-ups
-hilltoppers
-troubleshooters
-counter-battery
-eyeshield
-cavalryman
-nearshore
-guastalla
-pavan
-impounds
-woodberry
-cyane
-thakin
-mascarenhas
-'88
-music-making
-hindrances
-mckendree
-haken
-gudi
-accreted
-beavercreek
-tanduay
-consequentialist
-cat-and-mouse
-ochil
-promethium
-jupitus
-granz
-qaboos
-ocho
-mid-1999
-polyamorous
-trd
-nanomaterials
-quakertown
-s-shaped
-varius
-hyo
-sex-positive
-pardus
-dellinger
-prowling
-eresby
-bryon
-debilitated
-judgeships
-harshad
-chuuk
-mid-thirties
-d-reaper
-layden
-ihc
-schnitzel
-chiefship
-sharecropping
-supraorbital
-open-heart
-konno
-emirau
-magan
-molland
-portuguesa
-ey
-siete
-ensor
-downhole
-star-crossed
-shilts
-shac
-orangey-pink
-weu
-wendlinger
-rauma
-nickell
-k.p.
-curbishley
-radm
-jl
-revo
-sashimi
-woodsball
-disbarment
-bdp
-kinoshita
-doli
-martinet
-pu-erh
-brdc
-mangotsfield
-okey
-paglia
-dactylic
-pettitte
-unscom
-flytrap
-l.e.
-botch
-baranov
-dodgeville
-non-electrified
-silang
-wooler
-innsmouth
-enfilade
-dichotomous
-four-person
-externality
-cotinga
-jadis
-kokand
-8,600
-saori
-aveling
-re-invention
-merionethshire
-schleiermacher
-uncirculated
-theocritus
-d-branes
-uttaranchal
-snowed
-restyling
-thewlis
-terpenes
-boyds
-mgm/ua
-crucibles
-tongans
-super-regional
-ulster-scots
-strzelecki
-halogenated
-verismo
-82.0
-cuddle
-ilsley
-passy
-tts
-kewanee
-twinbee
-neoplan
-dantzig
-franklinton
-akaka
-fanboy
-al-nasr
-crowhurst
-superdelegates
-smathers
-dissociates
-sloman
-chauncy
-denarii
-gillmor
-kaffa
-kolokotronis
-amati
-wintringham
-dominaria
-wean
-moke
-crasher
-subtracts
-k-os
-t-54
-bellboy
-lethe
-pinaceae
-lepida
-grizzard
-unimaginative
-walang
-anr
-ahisa
-wyld
-pugni
-subgame
-rael
-demonata
-eddings
-36,563
-gabo
-repeatability
-embrun
-3-hour
-congeniality
-anaesthetist
-sidewheel
-upshot
-saxe-coburg-gotha
-raiford
-shara
-windjammer
-scallions
-monarchic
-raghav
-siebold
-gourlay
-advocaat
-pitaka
-visio
-superheterodyne
-ceefax
-spacefaring
-swv
-yucaipa
-saidin
-helpmann
-chauvinist
-dingman
-duchenne
-derailments
-maeterlinck
-geode
-darug
-tuas
-nilus
-fumaroles
-alcyone
-wsdl
-wood-burning
-applebaum
-coseley
-1,550
-eight-pointed
-rexford
-danites
-komusubi
-hillyard
-vam
-muli
-jutta
-marshman
-khasas
-nennius
-larrys
-couscous
-infinities
-yore
-enum
-perryman
-desantis
-peperomia
-wcf
-shawm
-kraal
-democratic-farmer-labor
-sadao
-meres
-wbbm
-funneling
-warding
-recurrences
-brockovich
-edrington
-nudie
-eihl
-misheard
-cibola
-qut
-siberry
-mallikarjuna
-mceachern
-pre-requisite
-spinous
-millersburg
-drachmas
-kenesaw
-repens
-intellectualism
-carafa
-134th
-hydrolyzes
-dactyl
-spouting
-sirhan
-quintile
-commutativity
-e36
-wadding
-fabiola
-gansevoort
-schaal
-brambilla
-14a
-menier
-santiniketan
-suhasini
-video-on-demand
-78.1
-i-68
-belsky
-bwr
-tail-end
-guiteau
-almoner
-hudson-bergen
-stricklin
-fee-based
-arkestra
-janjaweed
-suppiluliuma
-bodin
-menzoberranzan
-crestline
-sutil
-misinterpretations
-third-most
-europe/africa
-heliconia
-maistre
-forward/center
-ansa
-kenedy
-bolliger
-rothbury
-69,000
-oedema
-steadicam
-edriss
-neues
-bushing
-pikachu
-gipps
-4.80
-zukor
-mollis
-ziya
-40-60
-post-reformation
-fumarate
-teletubbies
-brightly-colored
-co-defendant
-marianus
-super-skrull
-bacteriophages
-ayaz
-hambro
-4-9
-mid-1943
-strollers
-vaqueros
-krait
-diskless
-carded
-bergner
-gritstone
-xang
-disfavour
-electrophysiological
-armpits
-inattentive
-noonday
-n/a
-sols
-necaxa
-foa
-naturopathy
-senlis
-icarly
-thallus
-syosset
-sarangi
-proprioceptive
-olivo
-paix
-sawed-off
-coeli
-bha
-lahnstein
-twombly
-subrahmanya
-yaris
-fuu
-rahxephon
-dsm-iv-tr
-sizzle
-benveniste
-fastow
-retinoblastoma
-decoction
-heterozygote
-blassingame
-rm1
-bubbled
-lalitpur
-meisel
-great-niece
-earmark
-donnelley
-laidley
-leveller
-mucilage
-allmovie
-insipid
-felicitas
-dakshin
-77.6
-santror
-toreador
-bunyoro
-karri
-englefield
-langres
-tiltrotor
-minimi
-mandals
-dalliance
-potala
-ill-will
-prabha
-text-to-speech
-soul/r
-marinetti
-borrego
-vulcanization
-zebulun
-polgara
-monteverde
-speeder
-avowedly
-tilde
-danaher
-batwing
-dioscorides
-kera
-haveri
-right-winger
-cupcake
-kerem
-nelligan
-supertall
-endogamous
-t.o.
-anaxagoras
-yoshiko
-arbiters
-rossman
-stative
-submersibles
-bostonians
-limbuwan
-star-telegram
-____
-vbscript
-mccalla
-k-type
-bare-knuckle
-planeta
-fili
-nanoseconds
-reformat
-azeotrope
-bini
-heanor
-multilayered
-filius
-germinating
-meinhard
-valproate
-camellias
-decicco
-matara
-marquisate
-3270
-sheree
-flinging
-3.66
-winking
-al-qa
-reassembling
-haudenosaunee
-neurotransmission
-dint
-mundine
-aleuts
-7-year-old
-ndb
-ananta
-general-interest
-kantha
-cd-single
-caernarvon
-bachar
-guth
-outflanking
-videoclip
-bemoans
-saphenous
-dizzying
-baranof
-1982-1985
-conjoiners
-septicemia
-katonah
-3ak
-coucal
-eustatius
-alesis
-utz
-wilfried
-babbit
-rothes
-determinacy
-payphones
-defecating
-adrastus
-fuel/air
-hodgepodge
-low-growing
-klinghoffer
-kilkis
-ketogenic
-lathom
-arcaded
-early-season
-shain
-smacking
-selflessly
-1.73
-diouf
-waupaca
-sdb
-gastown
-i.s.
-kempo
-parit
-swiftness
-lutenist
-lisgar
-sergeyevich
-penmanship
-flaccid
-tasburgh
-milosevic
-jshaa
-mig-29s
-butterscotch
-leat
-gaijin
-bactericidal
-raimund
-mcanally
-snugly
-cerdic
-ballmer
-rejoices
-malouf
-outagamie
-wertham
-haneda
-rodi
-hiva
-spielman
-mikasa
-abdallahi
-farewells
-iwai
-elberfeld
-sequestering
-typescript
-engulfs
-dalida
-golub
-alsager
-damiani
-markin
-vorontsov
-djerba
-grana
-darwaza
-liaising
-nederlandsche
-haack
-top-line
-80-yard
-hongo
-kuwata
-resoundingly
-DGDGDGDGDGDGDGDGDGDGDG
-rice-oxley
-tiebreak
-edification
-friendless
-vorster
-rickards
-3-part
-burros
-plodding
-righted
-neglectful
-sudi
-afsan
-pro-european
-mayaguez
-mylene
-big-block
-kipchak
-74.6
-left-footed
-lymphedema
-emanu-el
-wellard
-subah
-otra
-g-spot
-fontenoy
-endeavoring
-delisting
-a24
-affixing
-multi-threaded
-stockbrokers
-climatological
-trikoupis
-pedicle
-sugiura
-redington
-saltires
-trumbauer
-mure
-valvular
-ramification
-ganji
-dosbox
-interpolate
-5000m
-steichen
-schnitzer
-threnody
-gast
-shenango
-gastropoda
-malesia
-fine-tune
-bitton
-seles
-6809
-prynne
-holdup
-bonnell
-muswellbrook
-1951-1952
-four-term
-qward
-8,118
-samburu
-flossie
-juxtaposes
-1082
-schoolhouses
-regurgitate
-around-the-clock
-hopatcong
-saotome
-rehoboam
-cupped
-31.20
-mansbridge
-conciliar
-atascadero
-ratz
-tlaloc
-rezoned
-non-linearity
-greco-buddhist
-self-destructs
-leaderships
-myx
-misapplied
-taib
-moralist
-andromache
-lorn
-lactarius
-hangovers
-indusia
-pryderi
-comyns
-antonym
-johnno
-lagwagon
-nobu
-narre
-1976/77
-0.58
-anarcho-syndicalism
-fuel-air
-birtwistle
-tuscumbia
-kangri
-112.5
-brucellosis
-non-financial
-hidaka
-hematocrit
-deactivates
-holkham
-c-h
-dodoma
-ifk
-maca
-goldfaden
-guidi
-bzp
-vicarius
-timeliness
-fluor
-amakusa
-comforter
-peth
-laryngitis
-barajas
-nwc
-grampians
-minotaurs
-draconic
-synonymized
-ayo
-munia
-tkach
-cegep
-lanceolata
-fann
-lambie
-harasses
-vla
-export-import
-despairs
-garstang
-black-winged
-headspace
-goslin
-sneed
-kilroy-silk
-thoms
-ndola
-swanee
-sderot
-senility
-glutton
-pof
-bopper
-softimage
-waxahachie
-kaczynski
-universo
-vap
-extradimensional
-off-track
-oft-repeated
-coggins
-amasis
-kawakami
-sood
-et/pt
-polecat
-spineless
-supernaturally
-turreted
-statin
-horbury
-btl
-wrko
-sixth-largest
-khimki
-8600
-superintend
-morfa
-arakanese
-tuska
-leadup
-eti
-straight-six
-chuvashia
-yow
-tuli
-gmos
-sachar
-1981/82
-hamedan
-kaliyan
-scitech
-enchantix
-meckel
-highly-publicized
-lahontan
-georgia-pacific
-editor-at-large
-self-written
-dwyane
-malam
-single-level
-interlibrary
-makkal
-boran
-sbvt
-rockett
-mailers
-yixian
-ext3
-exudes
-proteoglycans
-trie
-overeating
-modeller
-abimelech
-airbrushed
-bedser
-indymac
-heilman
-shamus
-nitzsche
-maculatus
-contemptible
-hessle
-danylo
-orange-brown
-popoff
-halverson
-atac
-prudently
-paroxysmal
-downsides
-honoria
-amrish
-batok
-generalists
-flails
-thorndyke
-lake-effect
-bundi
-lalibela
-rockier
-well-represented
-introversion
-stephanos
-alben
-anadarko
-calorimetry
-deprotonation
-loughnane
-maras
-karelians
-cadoc
-carjacking
-alibis
-shoguns
-tene
-outlawz
-immunizations
-hypercholesterolemia
-anglo-mysore
-greektown
-kur
-hunas
-get-togethers
-hamsterdam
-therewith
-interminable
-davinci
-78.7
-virions
-revelatory
-sidgwick
-match-ups
-eighty-two
-self-healing
-westen
-severinsen
-arab-american
-nolasco
-speech-language
-coreopsis
-inverell
-polyphase
-lmp1
-snowballs
-velika
-dhi
-karun
-lankaran
-putatively
-kloof
-reactionaries
-wavenumber
-viswanath
-tecumsehs
-ganthet
-drumlins
-monod
-milagro
-lutra
-kakatiyas
-sese
-legionella
-can-can
-epe
-storerooms
-svein
-pimping
-dietetic
-new-build
-foh
-vilsack
-montt
-colonnaded
-kazimir
-buggery
-esterase
-samnium
-banta
-hors
-morshead
-71.7
-krzyzewski
-lithographer
-armour-piercing
-barro
-baalbek
-gripper
-misrepresent
-quiksilver
-low-volume
-energie
-judgmental
-kalka
-archdale
-trautmann
-bandleaders
-beed
-bande
-amol
-g-d
-kaftan
-mmd
-jibanananda
-gomel
-anchorwoman
-bulli
--DGDG.DGDGDGDGDGDGDG
-biodegradation
-thunderclap
-urrutia
-salutary
-600m
-blagdon
-servetus
-british-controlled
-transboundary
-larrikin
-scythes
-psychokinesis
-oreille
-macbrayne
-stith
-courcel
-kerby
-0.98
-pisans
-orgone
-coushatta
-crf
-drummoyne
-hermosillo
-chalcopyrite
-shoelaces
-superluminal
-2:2
-balladeer
-cassels
-aluva
-stepsons
-margaretha
-experian
-chandio
-putamen
-encarta
-landslip
-saura
-mondavi
-kahani
-pompilius
-cottagers
-clotaire
-salchow
-jass
-tri-valley
-subbed
-grantland
-nitta
-dodder
-av-8b
-two-issue
-deadeye
-piglets
-segev
-sebald
-lammas
-helicarrier
-cangrande
-eigenstate
-widowers
-popa
-akashi
-reve
-pgd
-frontside
-handa
-abap
-abaca
-billowing
-candelabra
-communions
-martin-in-the-fields
-crafters
-august/september
-srgb
-burrs
-jourdain
-tabernacles
-coraline
-vizard
-rouses
-dynkin
-coleen
-k.o.
-squealing
-jubba
-leaper
-commanding-in-chief
-myers-briggs
-highwood
-honecker
-subversives
-uia
-mazzola
-big-city
-calea
-srinath
-flirtations
-loathes
-uncoupled
-mutualistic
-dudelange
-non-interactive
-pleasants
-self-catering
-serrata
-youghiogheny
-sebadoh
-monosaccharide
-yael
-faired
-meenachil
-crowfoot
-lurleen
-topscorer
-propagandistic
-baldi
-visa-free
-broxburn
-rogelio
-polychaetes
-kimba
-cassville
-instant-runoff
-kayser
-ojcl
-ny1
-talmudical
-illiniwek
-ketoacidosis
-kennebunkport
-hannu
-big-eared
-hypoglycemic
-lir
-anuradha
-hoped-for
-riza
-davidic
-chayefsky
-briquettes
-denigrated
-guesswork
-jone
-emceed
-basel-country
-sulphide
-neuroprotective
-serpentinite
-albom
-crossbone
-brum
-bisping
-kwik-e-mart
-flaminius
-footplate
-massenet
-hali
-externalism
-machel
-stoat
-crenellated
-wigglesworth
-captiva
-aediles
-plamondon
-uni-president
-ectodermal
-dodie
-gallego
-stilo
-testarossa
-y-shaped
-doremi
-1993-1998
-9,600
-dynasts
-fragmenting
-mux
-roofless
-mesta
-chuen
-anare
-tishman
-shadyside
-jager
-relevancy
-nogi
-zerilli
-1996-98
-fusco
-fuga
-lorazepam
-co-manager
-centerpoint
-protuberance
-conniff
-elham
-n'dour
-tlr
-oreste
-yoakum
-regen
-kbo
-legations
-hokuriku
-self-assured
-hyades
-o'riley
-day-use
-wolpe
-marshallese
-fitts
-aplastic
-capstar
-breathalyzer
-ayia
-misinterpret
-mactavish
-multimillion
-27-24
-unicom
-cell-phone
-adcs
-outlands
-o'regan
-photojournalists
-targ
-ild
-vorderman
-mabillard
-jaane
-abhorrence
-photodiode
-robichaud
-affectation
-assonet
-comic-strip
-mtf
-3.57
-goodwrench
-kptv
-arditi
-osm
-onr
-reeducation
-codice_23
-pasi
-paulk
-32-page
-rollicking
-1917-1918
-vadivelu
-osterley
-quem
-metronidazole
-homebuilding
-neuritis
-single-user
-symbolical
-toseland
-sub-system
-clubbers
-ebersol
-compacta
-bitterest
-neuer
-caistor
-scott-heron
-hafizullah
-nowa
-presupposition
-valmet
-trampolining
-undermanned
-kombi
-320th
-antidotes
-adenomas
-charente-maritime
-wetherell
-plimsoll
-basant
diff --git a/legacy/sequence_tagging_for_ner/images/BIO tag example.png b/legacy/sequence_tagging_for_ner/images/BIO tag example.png
deleted file mode 100644
index 88ee9e84b7cc9ed8fd794c66c0929c1351d34d8e..0000000000000000000000000000000000000000
Binary files a/legacy/sequence_tagging_for_ner/images/BIO tag example.png and /dev/null differ
diff --git a/legacy/sequence_tagging_for_ner/images/ner_label_ins.png b/legacy/sequence_tagging_for_ner/images/ner_label_ins.png
deleted file mode 100644
index a3667c82e0bd012eea50b1d451a7a4063d26aa54..0000000000000000000000000000000000000000
Binary files a/legacy/sequence_tagging_for_ner/images/ner_label_ins.png and /dev/null differ
diff --git a/legacy/sequence_tagging_for_ner/images/ner_model_en.png b/legacy/sequence_tagging_for_ner/images/ner_model_en.png
deleted file mode 100644
index da541cda7e9632cfdac86df6f3f7d3e4c462b85b..0000000000000000000000000000000000000000
Binary files a/legacy/sequence_tagging_for_ner/images/ner_model_en.png and /dev/null differ
diff --git a/legacy/sequence_tagging_for_ner/images/ner_network.png b/legacy/sequence_tagging_for_ner/images/ner_network.png
deleted file mode 100644
index e9c07e34ac287ed04301bdede87dcc53377881b7..0000000000000000000000000000000000000000
Binary files a/legacy/sequence_tagging_for_ner/images/ner_network.png and /dev/null differ
diff --git a/legacy/sequence_tagging_for_ner/infer.py b/legacy/sequence_tagging_for_ner/infer.py
deleted file mode 100644
index cf48bc249c80fd44415d643ffb60bfb0feec4e1f..0000000000000000000000000000000000000000
--- a/legacy/sequence_tagging_for_ner/infer.py
+++ /dev/null
@@ -1,62 +0,0 @@
-import gzip
-
-import reader
-import paddle.v2 as paddle
-from network_conf import ner_net
-from utils import load_dict, load_reverse_dict
-
-
-def infer(model_path, batch_size, test_data_file, vocab_file, target_file):
- def _infer_a_batch(inferer, test_data, id_2_word, id_2_label):
- probs = inferer.infer(input=test_data, field=["id"])
- assert len(probs) == sum(len(x[0]) for x in test_data)
-
-        start_id = 0
-        for idx, test_sample in enumerate(test_data):
-            # start_id accumulates across samples: each sample consumes
-            # len(test_sample[0]) entries of probs
-            for w, tag in zip(test_sample[0],
-                              probs[start_id:start_id + len(test_sample[0])]):
-                print("%s\t%s" % (id_2_word[w], id_2_label[tag]))
-            print("\n")
-            start_id += len(test_sample[0])
-
- word_dict = load_dict(vocab_file)
- word_dict_len = len(word_dict)
- word_reverse_dict = load_reverse_dict(vocab_file)
-
- label_dict = load_dict(target_file)
- label_reverse_dict = load_reverse_dict(target_file)
- label_dict_len = len(label_dict)
-
- # initialize PaddlePaddle
- paddle.init(use_gpu=False, trainer_count=1)
- parameters = paddle.parameters.Parameters.from_tar(
- gzip.open(model_path, "r"))
-
- predict = ner_net(
- word_dict_len=word_dict_len,
- label_dict_len=label_dict_len,
- is_train=False)
-
- inferer = paddle.inference.Inference(
- output_layer=predict, parameters=parameters)
-
- test_data = []
- for i, item in enumerate(
- reader.data_reader(test_data_file, word_dict, label_dict)()):
- test_data.append([item[0], item[1]])
- if len(test_data) == batch_size:
- _infer_a_batch(inferer, test_data, word_reverse_dict,
- label_reverse_dict)
- test_data = []
-
- _infer_a_batch(inferer, test_data, word_reverse_dict, label_reverse_dict)
- test_data = []
-
-
-if __name__ == "__main__":
- infer(
- model_path="models/params_pass_0.tar.gz",
- batch_size=2,
- test_data_file="data/test",
- vocab_file="data/vocab.txt",
- target_file="data/target.txt")
diff --git a/legacy/sequence_tagging_for_ner/network_conf.py b/legacy/sequence_tagging_for_ner/network_conf.py
deleted file mode 100644
index b8bc8da0890d331ff2de808ee954e4071c3416d0..0000000000000000000000000000000000000000
--- a/legacy/sequence_tagging_for_ner/network_conf.py
+++ /dev/null
@@ -1,113 +0,0 @@
-import math
-
-import paddle.v2 as paddle
-import paddle.v2.evaluator as evaluator
-
-
-def ner_net(word_dict_len, label_dict_len, stack_num=2, is_train=True):
- mark_dict_len = 2
- word_dim = 50
- mark_dim = 5
- hidden_dim = 300
-
- word = paddle.layer.data(
- name="word",
- type=paddle.data_type.integer_value_sequence(word_dict_len))
- word_embedding = paddle.layer.embedding(
- input=word,
- size=word_dim,
- param_attr=paddle.attr.Param(
- name="emb", initial_std=math.sqrt(1. / word_dim), is_static=True))
-
- mark = paddle.layer.data(
- name="mark",
- type=paddle.data_type.integer_value_sequence(mark_dict_len))
- mark_embedding = paddle.layer.embedding(
- input=mark, size=mark_dim, param_attr=paddle.attr.Param(initial_std=0.))
-
- word_caps_vector = paddle.layer.concat(
- input=[word_embedding, mark_embedding])
-
- mix_hidden_lr = 1e-3
- rnn_para_attr = paddle.attr.Param(initial_std=0.0, learning_rate=0.1)
- hidden_para_attr = paddle.attr.Param(
- initial_std=1. / math.sqrt(hidden_dim) / 3, learning_rate=mix_hidden_lr)
-
- # the first forward and backward rnn layer share the
- # input-to-hidden mappings.
- hidden = paddle.layer.fc(
- name="__hidden00__",
- size=hidden_dim,
- act=paddle.activation.Tanh(),
- bias_attr=paddle.attr.Param(initial_std=1. / math.sqrt(hidden_dim) / 3),
- input=word_caps_vector,
- param_attr=paddle.attr.Param(initial_std=1. / math.sqrt(hidden_dim) /
- 3))
-
- fea = []
- for direction in ["fwd", "bwd"]:
- for i in range(stack_num):
- if i:
- hidden = paddle.layer.fc(
- name="__hidden%02d_%s__" % (i, direction),
- size=hidden_dim,
- act=paddle.activation.STanh(),
- bias_attr=paddle.attr.Param(initial_std=1.),
- input=[hidden, rnn],
- param_attr=[hidden_para_attr, rnn_para_attr])
-
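-            # layers in the same stack alternate direction: the "fwd" stack
-            # starts with a forward layer (reverse=False) and flips at each
-            # level, while the "bwd" stack starts reversed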
- rnn = paddle.layer.recurrent(
- name="__rnn%02d_%s__" % (i, direction),
- input=hidden,
- act=paddle.activation.Relu(),
- bias_attr=paddle.attr.Param(initial_std=1.),
- reverse=i % 2 if direction == "fwd" else not i % 2,
- param_attr=rnn_para_attr)
- fea += [hidden, rnn]
-
- rnn_fea = paddle.layer.fc(
- size=hidden_dim,
- bias_attr=paddle.attr.Param(initial_std=1. / math.sqrt(hidden_dim) / 3),
- act=paddle.activation.STanh(),
- input=fea,
- param_attr=[hidden_para_attr, rnn_para_attr] * 2)
-
-    # NOTE: This fully connected layer calculates the emission feature for
-    # the CRF layer. Because paddle.layer.crf performs global normalization
-    # over all possible sequences internally, it expects UNSCALED emission
-    # feature weights.
-    # Please do not add any nonlinear activation to this fully connected
-    # layer. The default activation of paddle.layer.fc is tanh, so it must be
-    # set to linear explicitly here.
- emission = paddle.layer.fc(size=label_dict_len,
- bias_attr=False,
- input=rnn_fea,
- act=paddle.activation.Linear(),
- param_attr=paddle.attr.Param(
- initial_std=1. / math.sqrt(hidden_dim) / 3))
-
- if is_train:
- target = paddle.layer.data(
- name="target",
- type=paddle.data_type.integer_value_sequence(label_dict_len))
-
- crf = paddle.layer.crf(size=label_dict_len,
- input=emission,
- label=target,
- param_attr=paddle.attr.Param(
- name="crfw",
- initial_std=1. / math.sqrt(hidden_dim) / 3,
- learning_rate=mix_hidden_lr))
-
- crf_dec = paddle.layer.crf_decoding(
- size=label_dict_len,
- input=emission,
- label=target,
- param_attr=paddle.attr.Param(name="crfw"))
- return crf, crf_dec, target
- else:
- predict = paddle.layer.crf_decoding(
- size=label_dict_len,
- input=emission,
- param_attr=paddle.attr.Param(name="crfw"))
- return predict
diff --git a/legacy/sequence_tagging_for_ner/reader.py b/legacy/sequence_tagging_for_ner/reader.py
deleted file mode 100644
index 5050d0bf499e59db505758b0af9eed71e6af7de7..0000000000000000000000000000000000000000
--- a/legacy/sequence_tagging_for_ner/reader.py
+++ /dev/null
@@ -1,66 +0,0 @@
-"""
-Conll03 dataset.
-"""
-
-from utils import *
-
-__all__ = ["data_reader"]
-
-
-def canonicalize_digits(word):
- if any([c.isalpha() for c in word]): return word
- word = re.sub("\d", "DG", word)
- if word.startswith("DG"):
- word = word.replace(",", "") # remove thousands separator
- return word
-
-
-def canonicalize_word(word, wordset=None, digits=True):
- word = word.lower()
- if digits:
-        if (wordset is not None) and (word in wordset): return word
- word = canonicalize_digits(word) # try to canonicalize numbers
-    if (wordset is None) or (word in wordset): return word
- else: return "UUUNKKK" # unknown token
-
-
-def data_reader(data_file, word_dict, label_dict):
- """
-    The dataset can be obtained from http://www.clips.uantwerpen.be/conll2003/ner/.
-    This function returns a reader creator; each sample produced by the reader
-    contains: a word id sequence, a capitalization mark sequence, and a label
-    id sequence.
-
- :return: reader creator
- :rtype: callable
- """
-
- def reader():
- UNK_IDX = word_dict["UUUNKKK"]
-
- sentence = []
- labels = []
- with open(data_file, "r") as f:
- for line in f:
- if len(line.strip()) == 0:
- if len(sentence) > 0:
- word_idx = [
- word_dict.get(
- canonicalize_word(w, word_dict), UNK_IDX)
- for w in sentence
- ]
- mark = [1 if w[0].isupper() else 0 for w in sentence]
- label_idx = [label_dict[l] for l in labels]
- yield word_idx, mark, label_idx
- sentence = []
- labels = []
- else:
- segs = line.strip().split()
- sentence.append(segs[0])
- # transform I-TYPE to BIO schema
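-                    # an I-TYPE tag that begins a new entity (i.e. the
-                    # previous tag is not of the same TYPE) becomes B-TYPE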
- if segs[-1] != "O" and (len(labels) == 0 or
- labels[-1][1:] != segs[-1][1:]):
- labels.append("B" + segs[-1][1:])
- else:
- labels.append(segs[-1])
-
- return reader
diff --git a/legacy/sequence_tagging_for_ner/train.py b/legacy/sequence_tagging_for_ner/train.py
deleted file mode 100644
index 04b748f5858f87c44b90828f5262c1bb2d2b5a76..0000000000000000000000000000000000000000
--- a/legacy/sequence_tagging_for_ner/train.py
+++ /dev/null
@@ -1,110 +0,0 @@
-import os
-import gzip
-import numpy as np
-
-import reader
-from utils import logger, load_dict, get_embedding
-from network_conf import ner_net
-
-import paddle.v2 as paddle
-import paddle.v2.evaluator as evaluator
-
-
-def main(train_data_file,
- test_data_file,
- vocab_file,
- target_file,
- emb_file,
- model_save_dir,
- num_passes=100,
- batch_size=64):
- if not os.path.exists(model_save_dir):
- os.mkdir(model_save_dir)
-
- word_dict = load_dict(vocab_file)
- label_dict = load_dict(target_file)
-
- word_vector_values = get_embedding(emb_file)
-
- word_dict_len = len(word_dict)
- label_dict_len = len(label_dict)
-
- paddle.init(use_gpu=False, trainer_count=1)
-
- # define network topology
- crf_cost, crf_dec, target = ner_net(word_dict_len, label_dict_len)
- evaluator.sum(name="error", input=crf_dec)
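-    # the label set consists of O plus B-TYPE/I-TYPE tags for each entity
-    # type, so the number of chunk types is (label_dict_len - 1) / 2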
- evaluator.chunk(
- name="ner_chunk",
- input=crf_dec,
- label=target,
- chunk_scheme="IOB",
- num_chunk_types=(label_dict_len - 1) / 2)
-
- # create parameters
- parameters = paddle.parameters.create(crf_cost)
- parameters.set("emb", word_vector_values)
-
- # create optimizer
- optimizer = paddle.optimizer.Momentum(
- momentum=0,
- learning_rate=2e-4,
- regularization=paddle.optimizer.L2Regularization(rate=8e-4),
- gradient_clipping_threshold=25,
- model_average=paddle.optimizer.ModelAverage(
- average_window=0.5, max_average_window=10000), )
-
- trainer = paddle.trainer.SGD(cost=crf_cost,
- parameters=parameters,
- update_equation=optimizer,
- extra_layers=crf_dec)
-
- train_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.data_reader(train_data_file, word_dict, label_dict),
- buf_size=1000),
- batch_size=batch_size)
- test_reader = paddle.batch(
- paddle.reader.shuffle(
- reader.data_reader(test_data_file, word_dict, label_dict),
- buf_size=1000),
- batch_size=batch_size)
-
- feeding = {"word": 0, "mark": 1, "target": 2}
-
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
- if event.batch_id % 5 == 0:
- logger.info("Pass %d, Batch %d, Cost %f, %s" % (
- event.pass_id, event.batch_id, event.cost, event.metrics))
- if event.batch_id % 50 == 0:
- result = trainer.test(reader=test_reader, feeding=feeding)
- logger.info("\nTest with Pass %d, Batch %d, %s" %
- (event.pass_id, event.batch_id, result.metrics))
-
- if isinstance(event, paddle.event.EndPass):
- # save parameters
- with gzip.open(
- os.path.join(model_save_dir, "params_pass_%d.tar.gz" %
- event.pass_id), "w") as f:
- trainer.save_parameter_to_tar(f)
-
- result = trainer.test(reader=test_reader, feeding=feeding)
- logger.info("\nTest with Pass %d, %s" % (event.pass_id,
- result.metrics))
-
- trainer.train(
- reader=train_reader,
- event_handler=event_handler,
- num_passes=num_passes,
- feeding=feeding)
-
-
-if __name__ == "__main__":
- main(
- train_data_file="data/train",
- test_data_file="data/test",
- vocab_file="data/vocab.txt",
- target_file="data/target.txt",
- emb_file="data/wordVectors.txt",
- model_save_dir="models/")
diff --git a/legacy/sequence_tagging_for_ner/utils.py b/legacy/sequence_tagging_for_ner/utils.py
deleted file mode 100644
index f40f1bb19481e34288ede7247f4fbe827be6f590..0000000000000000000000000000000000000000
--- a/legacy/sequence_tagging_for_ner/utils.py
+++ /dev/null
@@ -1,47 +0,0 @@
-#!/usr/bin/env python
-# -*- coding: utf-8 -*-
-import logging
-import os
-import re
-import argparse
-import numpy as np
-from collections import defaultdict
-
-logger = logging.getLogger("paddle")
-logger.setLevel(logging.INFO)
-
-
-def get_embedding(emb_file='data/wordVectors.txt'):
- """
- Get the trained word vector.
- """
- return np.loadtxt(emb_file, dtype=float)
-
-
-def load_dict(dict_path):
- """
- Load the word dictionary from the given file.
-    Each line of the given file is a word, which can include multiple columns
-    separated by tab.
-
-    This function takes the first column (columns in a line are separated by
-    tab) as the key, and the line number (the index of the word in the
-    dictionary) as the value.
- """
-
- return dict((line.strip().split("\t")[0], idx)
- for idx, line in enumerate(open(dict_path, "r").readlines()))
-
-
-def load_reverse_dict(dict_path):
- """
- Load the word dictionary from the given file.
-    Each line of the given file is a word, which can include multiple columns
-    separated by tab.
-
-    This function takes the line number (the index of the word in the
-    dictionary) as the key, and the first column (columns in a line are
-    separated by tab) as the value.
- """
- return dict((idx, line.strip().split("\t")[0])
- for idx, line in enumerate(open(dict_path, "r").readlines()))
diff --git a/legacy/ssd/README.cn.md b/legacy/ssd/README.cn.md
deleted file mode 100644
index f9dbde507cae09e4ae2593b25a5062da378ebc1a..0000000000000000000000000000000000000000
--- a/legacy/ssd/README.cn.md
+++ /dev/null
@@ -1,242 +0,0 @@
-The program samples in this directory require PaddlePaddle v0.10.0. If your installed version of PaddlePaddle is lower than this requirement, please update it by following the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html).
-
----
-
-# SSD Object Detection
-## Overview
-SSD (Single Shot MultiBox Detector) is one of the newer and better-performing algorithms in object detection\[[1](#references)\], offering both fast detection speed and high detection accuracy. PaddlePaddle already integrates SSD; this example shows how to use the SSD model in PaddlePaddle for object detection. Below we first briefly introduce how SSD works, then describe the files in this example and how to use them, then explain how to train, evaluate, and run detection on the PASCAL VOC dataset, and finally briefly describe how to use SSD on your own dataset.
-
-## How SSD Works
-SSD uses a single convolutional neural network for end-to-end detection: the input is the raw image and the output is the detection result, with no external tools or pipelines needed for feature extraction, candidate-box generation, and so on. The paper uses VGG16\[[2](#references)\] as the base network for image feature extraction, but SSD makes several changes to the original VGG16 network:
-
-1. The final fully connected layers fc6 and fc7 are turned into convolution layers, whose parameters are obtained by subsampling the original fc6 and fc7 parameters.
-2. The pool5 layer is changed from 2x2-s2 (kernel size 2x2, stride 2) to 3x3-s1-p1 (kernel size 3x3, stride 1, padding 1).
-3. Priorbox layers are attached after the conv4\_3, conv7, conv8\_2, conv9\_2, conv10\_2, and pool11 layers; a priorbox layer generates a series of rectangular candidate boxes from its input feature map. See \[[1](#references)\] for a more detailed introduction.
-
-The figure below shows the overall structure of the model (input image size: 300x300):
-
-
-
-Figure 1. SSD network architecture
-
-
-Each rectangular box in the figure represents a convolution layer; the last two rectangles represent, respectively, the aggregation of the convolution layers' outputs and the post-processing stage. In the prediction phase, the network outputs a set of candidate rectangles, each containing a position and per-class scores; the second-to-last rectangle in the figure represents the aggregation of the network's detection results. Because the candidate rectangles are numerous and overlap heavily, post-processing is needed to select a small number of high-quality boxes, mainly via non-maximum suppression (NMS).
-
-As the network structure shows, candidate rectangles are generated on multiple feature maps. Different feature maps have different receptive fields, so the image is scanned at multiple scales; compared with other detection methods this yields richer candidate boxes and thus higher detection accuracy. At the same time, SSD computes the positions and class scores of the candidate boxes with only a small extension of VGG16, so the whole process is completed by a single convolutional network and is therefore fast.
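-
-To make the non-maximum suppression step mentioned above concrete, here is a minimal, illustrative sketch of greedy NMS. It is not the implementation inside PaddlePaddle's detection output layer; boxes are assumed to be ```[xmin, ymin, xmax, ymax]``` lists, and the 0.45 default mirrors ```__C.NET.DETOUT.NMS_THRESHOLD``` in ```config/pascal_voc_conf.py```:
-
-```python
-def iou(box_a, box_b):
-    # intersection-over-union of two [xmin, ymin, xmax, ymax] boxes
-    iw = max(0.0, min(box_a[2], box_b[2]) - max(box_a[0], box_b[0]))
-    ih = max(0.0, min(box_a[3], box_b[3]) - max(box_a[1], box_b[1]))
-    inter = iw * ih
-    union = ((box_a[2] - box_a[0]) * (box_a[3] - box_a[1]) +
-             (box_b[2] - box_b[0]) * (box_b[3] - box_b[1]) - inter)
-    return inter / union if union > 0 else 0.0
-
-
-def nms(boxes, scores, overlap_threshold=0.45):
-    # greedily keep the highest-scoring box, then drop any later box that
-    # overlaps an already-kept box by more than the threshold
-    order = sorted(range(len(boxes)), key=lambda i: -scores[i])
-    keep = []
-    for i in order:
-        if all(iou(boxes[i], boxes[j]) < overlap_threshold for j in keep):
-            keep.append(i)
-    return keep
-```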
-
-## Example Overview
-This example contains the following files:
-
-
-Table 1. Example files
-
-| File | Description |
-| --- | --- |
-| train.py | Training script |
-| eval.py | Evaluation script for a trained model |
-| infer.py | Detection script: runs detection given images and a model |
-| visual.py | Visualizes detection results |
-| image_util.py | Common functions for image preprocessing |
-| data_provider.py | Data processing script: generates the data needed for training, evaluation, or detection |
-| config/pascal_voc_conf.py | Neural network hyperparameter configuration file |
-| data/label_list | Class label list |
-| data/prepare_voc_data.py | Prepares the PASCAL VOC training data lists |
-
-
-The training phase requires data preprocessing, including cropping and sampling; these operations are implemented in ```image_util.py``` and ```data_provider.py```.
-
-Note: **```config/vgg_config.py``` is the parameter configuration file, containing the training parameters, neural network parameters, and so on. The parameters in this file are configured for the PASCAL VOC dataset; when training on your own data, modify them accordingly.**
-
-The ```data/prepare_voc_data.py``` script generates the file lists, including the split into training and test sets. Before using it, download and extract the data; VOC2007 and VOC2012 are used by default.
-
-## PASCAL VOC Dataset
-### Data Preparation
-
-1. First download the datasets: VOC2007\[[3](#references)\] and VOC2012\[[4](#references)\]. VOC2007 contains both a training set and a test set, while VOC2012 contains only a training set. Extract the downloaded data so that the directory structure is ```data/VOCdevkit/VOC2007``` and ```data/VOCdevkit/VOC2012```.
-1. Enter the ```data``` directory and run ```python prepare_voc_data.py``` to generate ```trainval.txt``` and ```test.txt```. The core function is:
-
- ```python
- def prepare_filelist(devkit_dir, years, output_dir):
- trainval_list = []
- test_list = []
- for year in years:
- trainval, test = walk_dir(devkit_dir, year)
- trainval_list.extend(trainval)
- test_list.extend(test)
- random.shuffle(trainval_list)
- with open(osp.join(output_dir, 'trainval.txt'), 'w') as ftrainval:
- for item in trainval_list:
- ftrainval.write(item[0] + ' ' + item[1] + '\n')
-
- with open(osp.join(output_dir, 'test.txt'), 'w') as ftest:
- for item in test_list:
- ftest.write(item[0] + ' ' + item[1] + '\n')
- ```
-
- This function first processes the data for each year, then randomly shuffles the list of training image file paths, and finally saves the training and test file lists. By default, ```prepare_voc_data.py``` and ```VOCdevkit``` are assumed to be in the same directory, and the generated file lists are written to that directory as well.
-
- Note that ```trainval.txt``` contains the training data of both VOC2007 and VOC2012, while ```test.txt``` contains only the VOC2007 test data.
-
- The first few lines of ```trainval.txt``` look like:
-
- ```text
- VOCdevkit/VOC2007/JPEGImages/000005.jpg VOCdevkit/VOC2007/Annotations/000005.xml
- VOCdevkit/VOC2007/JPEGImages/000007.jpg VOCdevkit/VOC2007/Annotations/000007.xml
- VOCdevkit/VOC2007/JPEGImages/000009.jpg VOCdevkit/VOC2007/Annotations/000009.xml
- ```
-
- Each line has two fields: the first is the relative path of an image file, and the second is the relative path of the corresponding annotation file.
-
-### Preparing the Pre-trained Model
-Download the pre-trained VGG-16 model. We provide a converted model at [http://paddlemodels.bj.bcebos.com/v2/vgg_model.tar.gz](http://paddlemodels.bj.bcebos.com/v2/vgg_model.tar.gz); download it and place it at ```vgg/vgg_model.tar.gz```.
-
-### Training
-Run ```python train.py``` to start training. Note that this example only supports a CUDA GPU environment and cannot be trained on CPU, mainly because CPU training is very slow; in practice GPUs are generally used for image tasks. This implementation hard-codes the use of cuDNN, and no CPU version is provided. Some key logic of ```train.py```:
-
-```python
-paddle.init(use_gpu=True, trainer_count=4)
-data_args = data_provider.Settings(
- data_dir='./data',
- label_file='label_list',
- resize_h=cfg.IMG_HEIGHT,
- resize_w=cfg.IMG_WIDTH,
- mean_value=[104,117,124])
-train(train_file_list='./data/trainval.txt',
- dev_file_list='./data/test.txt',
- data_args=data_args,
- init_model_path='./vgg/vgg_model.tar.gz')
-```
-
-The main steps are:
-
-1. Call ```paddle.init``` to train with 4 GPUs.
-2. Call ```data_provider.Settings``` to configure the parameters needed for data preprocessing. ```cfg.IMG_HEIGHT``` and ```cfg.IMG_WIDTH``` are set in the configuration file ```config/vgg_config.py```; both are 300 here. 300x300 is a typical configuration that balances efficiency and detection accuracy; it can be extended to 512x512 by modifying the configuration file.
-3. Call ```train``` to run the training, where ```train_file_list``` specifies the training data list, ```dev_file_list``` specifies the evaluation data list, and ```init_model_path``` specifies the location of the pre-trained model.
-4. During training, log information is printed: each batch outputs the current pass number, the cost of the current batch, and the mAP (mean Average Precision). A model is saved after every pass, by default into the ```checkpoints``` directory (note: it must be created in advance).
-
-Below is the mAP curve of SSD300x300 on the VOC dataset (train: VOC2007+VOC2012, test: VOC2007); after 140 passes the mAP reaches 71.52%.
-
-
-
-Figure 2. SSD300x300 mAP convergence curve
-
-
-
-### Evaluation
-Run ```python eval.py``` to evaluate the model. The key logic of ```eval.py``` is as follows:
-
-```python
-paddle.init(use_gpu=True, trainer_count=4) # use 4 gpus
-
-data_args = data_provider.Settings(
- data_dir='./data',
- label_file='label_list',
- resize_h=cfg.IMG_HEIGHT,
- resize_w=cfg.IMG_WIDTH,
- mean_value=[104, 117, 124])
-
-eval(
- eval_file_list='./data/test.txt',
- batch_size=4,
- data_args=data_args,
- model_path='models/pass-00000.tar.gz')
-```
-
-Call ```paddle.init``` to evaluate with 4 GPUs; ```data_provider.Settings``` is configured as in the training stage; call ```eval``` to run the evaluation, where ```eval_file_list``` specifies the evaluation data list, ```batch_size``` specifies the batch size during evaluation, and ```model_path``` specifies the model location. When evaluation finishes, the ```loss``` and ```mAP``` are printed.
-
-### Image Detection
-Run ```python infer.py``` to run detection on images with a trained model. The key logic of ```infer.py``` is as follows:
-
-```python
-infer(
- eval_file_list='./data/infer.txt',
- save_path='infer.res',
- data_args=data_args,
- batch_size=4,
- model_path='models/pass-00000.tar.gz',
- threshold=0.3)
-```
-
-Here ```eval_file_list``` specifies the image path list; ```save_path``` specifies where to save the prediction results; ```data_args``` is as above; ```batch_size``` is the number of samples predicted at a time; ```model_path``` is the model location; and ```threshold``` is the confidence threshold: only detections with a score greater than or equal to this value are output. Some sample lines of ```infer.res```:
-
-```text
-VOCdevkit/VOC2007/JPEGImages/006936.jpg 12 0.997844 131.255611777 162.271582842 396.475315094 334.0
-VOCdevkit/VOC2007/JPEGImages/006936.jpg 14 0.998557 229.160234332 49.5991278887 314.098775387 312.913876176
-VOCdevkit/VOC2007/JPEGImages/006936.jpg 14 0.372522 187.543615699 133.727034628 345.647156239 327.448492289
-...
-```
-
-Each line contains four tab-separated fields: the path of the detected image, the category of the detected box, the confidence score, and four coordinate values (separated by spaces).
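-
-As a minimal illustration of this format, the result file could be parsed as follows (a sketch only; the file name follows the ```save_path='infer.res'``` argument above, and the 0.5 cutoff is an arbitrary example):
-
-```python
-# read infer.res and keep detections whose confidence passes a cutoff
-with open("infer.res") as f:
-    for line in f:
-        img_path, label, score, coords = line.strip().split("\t")
-        xmin, ymin, xmax, ymax = [float(v) for v in coords.split()]
-        if float(score) >= 0.5:  # arbitrary example cutoff
-            print(img_path, label, xmin, ymin, xmax, ymax)
-```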
-
-The example also provides a visualization script: run ```python visual.py```, specifying the path of the detection results and the output directory. By default the visualized images are saved in ```./visual_res```. Below are visualizations of some images inferred with a trained model:
-
-
-
-
-
-
-Figure 3. SSD300x300 detection visualization examples
-
-
-
-## Custom Dataset
-Training PaddlePaddle SSD on your own data requires two key preparations. First, adapt the data to an input format the network can accept; a recommended structure, using ```train.txt``` as an example, is:
-
-```text
-image00000_file_path image00000_annotation_file_path
-image00001_file_path image00001_annotation_file_path
-image00002_file_path image00002_annotation_file_path
-...
-```
-
-The file has two whitespace-separated columns: the first is the path of an image file, and the second is the path of the corresponding annotation file. Reading the image files is straightforward; parsing the annotations is slightly more involved. In this example the annotations are stored as xml files, so they are parsed in ```data_provider.py```; the core logic is:
-
-```python
-bbox_labels = []
-root = xml.etree.ElementTree.parse(label_path).getroot()
-for object in root.findall('object'):
- bbox_sample = []
- # start from 1
- bbox_sample.append(float(settings.label_list.index(
- object.find('name').text)))
- bbox = object.find('bndbox')
- difficult = float(object.find('difficult').text)
- bbox_sample.append(float(bbox.find('xmin').text)/img_width)
- bbox_sample.append(float(bbox.find('ymin').text)/img_height)
- bbox_sample.append(float(bbox.find('xmax').text)/img_width)
- bbox_sample.append(float(bbox.find('ymax').text)/img_height)
- bbox_sample.append(difficult)
- bbox_labels.append(bbox_sample)
-```
-
-Each annotation record contains: label, xmin, ymin, xmax, ymax, and is\_difficult, where is\_difficult indicates whether the object is a hard example; if you do not need it, simply set this field to zero. Your own data also needs corresponding parsing logic. Suppose the annotation data (e.g. image00000\_annotation\_file\_path) is stored in the following format:
-
-```text
-label1 xmin1 ymin1 xmax1 ymax1
-label2 xmin2 ymin2 xmax2 ymax2
-...
-```
-
-Each line corresponds to one object and has 5 fields: the first is the label (note that the background is 0, so labels must be numbered from 1), and the remaining four are coordinates. The parsing logic can then be changed to:
-
-```python
-bbox_labels = []
-with open(label_path) as flabel:
- for line in flabel:
- bbox_sample = []
- bbox = [float(i) for i in line.strip().split()]
- label = bbox[0]
- bbox_sample.append(label)
- bbox_sample.append(bbox[1]/float(img_width))
- bbox_sample.append(bbox[2]/float(img_height))
- bbox_sample.append(bbox[3]/float(img_width))
- bbox_sample.append(bbox[4]/float(img_height))
- bbox_sample.append(0.0)
- bbox_labels.append(bbox_sample)
-```
-
-**Also note that the hyperparameters related to the network structure must be adjusted according to the image size and the sizes of the objects to detect. Please create your own configuration file modeled on ```config/vgg_config.py```; for advice on setting the parameters, refer to the paper \[[1](#references)\].**
-
-## References
-1. Wei Liu, Dragomir Anguelov, Dumitru Erhan, Christian Szegedy, Scott Reed, Cheng-Yang Fu, Alexander C. Berg. [SSD: Single shot multibox detector](https://arxiv.org/abs/1512.02325). European conference on computer vision. Springer, Cham, 2016.
-2. Simonyan, Karen, and Andrew Zisserman. [Very deep convolutional networks for large-scale image recognition](https://arxiv.org/abs/1409.1556). arXiv preprint arXiv:1409.1556 (2014).
-3. [The PASCAL Visual Object Classes Challenge 2007](http://host.robots.ox.ac.uk/pascal/VOC/voc2007/index.html)
-4. [Visual Object Classes Challenge 2012 (VOC2012)](http://host.robots.ox.ac.uk/pascal/VOC/voc2012/index.html)
diff --git a/legacy/ssd/README.md b/legacy/ssd/README.md
deleted file mode 100644
index 7ad8a6936acfbebe6c21527e104847643aa036b6..0000000000000000000000000000000000000000
--- a/legacy/ssd/README.md
+++ /dev/null
@@ -1,233 +0,0 @@
-The minimum PaddlePaddle version needed for the code sample in this directory is v0.10.0. If you are on a version of PaddlePaddle earlier than v0.10.0, [please update your installation](http://www.paddlepaddle.org/docs/develop/documentation/en/build_and_install/pip_install_en.html).
-
----
-
-# Single Shot MultiBox Detector (SSD) Object Detection
-
-## Introduction
-Single Shot MultiBox Detector (SSD) is one of the newer and better-performing object detection algorithms\[[1](#references)\], characterized by fast detection and high detection accuracy, and PaddlePaddle integrates it out of the box. This example demonstrates how to use the SSD model in PaddlePaddle for object detection. We first give a brief introduction to how SSD works, then describe how to train, evaluate, and test on the PASCAL VOC dataset, and finally how to use SSD on a custom dataset.
-
-## SSD Architecture
-SSD uses a convolutional neural network to achieve end-to-end detection: the input is the original image and the output is the detection result, without relying on external tools or pipelines for feature extraction or candidate-box generation. SSD uses VGG16\[[2](#references)\] as its base network for feature extraction, with the following changes to the original VGG16:
-
-1. The final fully connected layers fc6 and fc7 are turned into convolution layers, whose parameters are obtained by subsampling the original fc6 and fc7 parameters.
-2. The pool5 layer is changed from 2x2-s2 (kernel size 2x2, stride 2) to 3x3-s1-p1 (kernel size 3x3, stride 1, padding 1).
-3. Priorbox layers are attached after the conv4\_3, conv7, conv8\_2, conv9\_2, conv10\_2, and pool11 layers; a priorbox layer generates a series of rectangular candidate boxes based on its input feature map. A more detailed introduction to SSD can be found in the paper\[[1](#References)\].
-
-Below is the overall structure of the model (input image size: 300x300):
-
-
-
-Figure 1. SSD network architecture
-
-
-Each box in the figure represents a convolution layer, and the last two rectangles represent, respectively, the aggregation of the convolution layers' outputs and the post-processing phase. In the prediction phase, the network outputs a set of candidate rectangles, each containing two types of information: the position and the category score. The network produces thousands of predictions at various scales and aspect ratios before performing non-maximum suppression, resulting in a handful of final detections.
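-
-To make the non-maximum suppression step concrete, here is a minimal, illustrative sketch of greedy NMS. It is not the implementation inside PaddlePaddle's detection output layer; boxes are assumed to be ```[xmin, ymin, xmax, ymax]``` lists, and the 0.45 default mirrors ```__C.NET.DETOUT.NMS_THRESHOLD``` in ```config/pascal_voc_conf.py```:
-
-```python
-def iou(box_a, box_b):
-    # intersection-over-union of two [xmin, ymin, xmax, ymax] boxes
-    iw = max(0.0, min(box_a[2], box_b[2]) - max(box_a[0], box_b[0]))
-    ih = max(0.0, min(box_a[3], box_b[3]) - max(box_a[1], box_b[1]))
-    inter = iw * ih
-    union = ((box_a[2] - box_a[0]) * (box_a[3] - box_a[1]) +
-             (box_b[2] - box_b[0]) * (box_b[3] - box_b[1]) - inter)
-    return inter / union if union > 0 else 0.0
-
-
-def nms(boxes, scores, overlap_threshold=0.45):
-    # greedily keep the highest-scoring box, then drop any later box that
-    # overlaps an already-kept box by more than the threshold
-    order = sorted(range(len(boxes)), key=lambda i: -scores[i])
-    keep = []
-    for i in order:
-        if all(iou(boxes[i], boxes[j]) < overlap_threshold for j in keep):
-            keep.append(i)
-    return keep
-```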
-
-## Example Overview
-This example contains the following files:
-
-
-Table 1. Directory structure
-
-| File | Description |
-| --- | --- |
-| train.py | Training script |
-| eval.py | Evaluation script for a trained model |
-| infer.py | Prediction using the trained model |
-| visual.py | Visualization of the test results |
-| image_util.py | Common functions for image preprocessing |
-| data_provider.py | Data processing script: generates the data needed for training, evaluation, or detection |
-| config/pascal_voc_conf.py | Neural network hyperparameter configuration file |
-| data/label_list | Label list |
-| data/prepare_voc_data.py | Prepares the PASCAL VOC training data lists |
-
-
-The training phase requires pre-processing of the data, including cropping and sampling; this is done in ```image_util.py``` and ```data_provider.py```. ```config/vgg_config.py``` is the hyperparameter configuration file, and ```data/prepare_voc_data.py``` is used to generate the file lists for the training and test sets. You need to download and extract the data first; VOC2007 and VOC2012 are used by default.
-
-## PASCAL VOC Data set
-
-### Data Preparation
-First download the datasets. VOC2007\[[3](#References)\] contains both training and test sets, while VOC2012\[[4](#References)\] contains only a training set. Extract the downloaded data into ```data/VOCdevkit/VOC2007``` and ```data/VOCdevkit/VOC2012```. Next, run ```data/prepare_voc_data.py``` to generate ```trainval.txt``` and ```test.txt```. The relevant function is as follows:
-
-```python
-def prepare_filelist(devkit_dir, years, output_dir):
- trainval_list = []
- test_list = []
- for year in years:
- trainval, test = walk_dir(devkit_dir, year)
- trainval_list.extend(trainval)
- test_list.extend(test)
- random.shuffle(trainval_list)
- with open(osp.join(output_dir, 'trainval.txt'), 'w') as ftrainval:
- for item in trainval_list:
- ftrainval.write(item[0] + ' ' + item[1] + '\n')
-
- with open(osp.join(output_dir, 'test.txt'), 'w') as ftest:
- for item in test_list:
- ftest.write(item[0] + ' ' + item[1] + '\n')
-```
-
-The data in ```trainval.txt``` will look like:
-
-```
-VOCdevkit/VOC2007/JPEGImages/000005.jpg VOCdevkit/VOC2007/Annotations/000005.xml
-VOCdevkit/VOC2007/JPEGImages/000007.jpg VOCdevkit/VOC2007/Annotations/000007.xml
-VOCdevkit/VOC2007/JPEGImages/000009.jpg VOCdevkit/VOC2007/Annotations/000009.xml
-```
-
-The first field is the relative path of the image file, and the second field is the relative path of the corresponding label file.
-
-
-### Using the Pre-trained Model
-We also provide a converted, pre-trained VGG-16 model with good performance. To use it, download the file http://paddlemodels.bj.bcebos.com/v2/vgg_model.tar.gz and place it as ```vgg/vgg_model.tar.gz```.
-
-### Training
-Next, run ```python train.py``` to train the model. Note that this example only supports the CUDA GPU environment and cannot be trained on CPU, mainly because CPU-only training would be very slow.
-
-```python
-paddle.init(use_gpu=True, trainer_count=4)
-data_args = data_provider.Settings(
- data_dir='./data',
- label_file='label_list',
- resize_h=cfg.IMG_HEIGHT,
- resize_w=cfg.IMG_WIDTH,
- mean_value=[104,117,124])
-train(train_file_list='./data/trainval.txt',
- dev_file_list='./data/test.txt',
- data_args=data_args,
- init_model_path='./vgg/vgg_model.tar.gz')
-```
-
-A description of this script:
-
-1. Call ```paddle.init``` to train with 4 GPUs.
-2. ```data_provider.Settings()``` passes in the data preprocessing parameters. ```cfg.IMG_HEIGHT``` and ```cfg.IMG_WIDTH``` are set in ```config/vgg_config.py```; 300x300 is a typical configuration that balances accuracy and efficiency, and it can be extended to 512x512 by modifying the configuration file.
-3. In the ```train()``` function, ```train_file_list``` specifies the training data list, ```dev_file_list``` specifies the evaluation data list, and ```init_model_path``` specifies the location of the pre-trained model.
-4. The training process prints log information: every batch outputs the current pass number, the current batch cost, and the mAP (mean Average Precision). A model is saved after every pass, into the default directory ```checkpoints``` (which needs to be created in advance).
-
-The following shows the SSD300x300 mAP curve on the VOC dataset (train: VOC2007+VOC2012, test: VOC2007); the mAP reaches 71.52% after 140 passes.
-
-
-
-Figure 2. SSD300x300 mAP convergence curve
-
-
-
-### Model Assessment
-Next, run ```python eval.py``` to evaluate the model.
-
-```python
-paddle.init(use_gpu=True, trainer_count=4) # use 4 gpus
-
-data_args = data_provider.Settings(
- data_dir='./data',
- label_file='label_list',
- resize_h=cfg.IMG_HEIGHT,
- resize_w=cfg.IMG_WIDTH,
- mean_value=[104, 117, 124])
-
-eval(
- eval_file_list='./data/test.txt',
- batch_size=4,
- data_args=data_args,
- model_path='models/pass-00000.tar.gz')
-```
-
-### Object Detection
-Run ```python infer.py``` to perform object detection using the trained model.
-
-```python
-infer(
- eval_file_list='./data/infer.txt',
- save_path='infer.res',
- data_args=data_args,
- batch_size=4,
- model_path='models/pass-00000.tar.gz',
- threshold=0.3)
-```
-
-
-Here ```eval_file_list``` specifies the image path list; ```save_path``` specifies where to save the prediction results; ```batch_size``` is the number of samples predicted at a time; ```model_path``` is the model location; and ```threshold``` is the confidence threshold, so only detections scoring at or above it are written out. Some sample lines of ```infer.res```:
-
-
-```text
-VOCdevkit/VOC2007/JPEGImages/006936.jpg 12 0.997844 131.255611777 162.271582842 396.475315094 334.0
-VOCdevkit/VOC2007/JPEGImages/006936.jpg 14 0.998557 229.160234332 49.5991278887 314.098775387 312.913876176
-VOCdevkit/VOC2007/JPEGImages/006936.jpg 14 0.372522 187.543615699 133.727034628 345.647156239 327.448492289
-...
-```
-
-Each line contains four tab-separated fields: the path of the detected image, the category of the detected box, the confidence score, and four coordinate values (separated by spaces).
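-
-As a minimal illustration of this format, the result file could be parsed as follows (a sketch only; the file name follows the ```save_path='infer.res'``` argument above, and the 0.5 cutoff is an arbitrary example):
-
-```python
-# read infer.res and keep detections whose confidence passes a cutoff
-with open("infer.res") as f:
-    for line in f:
-        img_path, label, score, coords = line.strip().split("\t")
-        xmin, ymin, xmax, ymax = [float(v) for v in coords.split()]
-        if float(score) >= 0.5:  # arbitrary example cutoff
-            print(img_path, label, xmin, ymin, xmax, ymax)
-```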
-
-Below are example results after running ```python visual.py``` to visualize the model output. By default the visualized images are saved in ```./visual_res```.
-
-
-
-
-
-
-Figure 3. SSD300x300 Visualization Example
-
-
-
-## Using a Custom Dataset
-Training the PaddlePaddle SSD model on a custom dataset is also straightforward: just provide the data in an input format the network can accept. Below is a recommended structure, using ```train.txt``` as an example.
-
-```text
-image00000_file_path image00000_annotation_file_path
-image00001_file_path image00001_annotation_file_path
-image00002_file_path image00002_annotation_file_path
-...
-```
-
-The first column is the image file path, and the second column is the path of the corresponding annotation file. If the annotations are stored in xml format, ```data_provider.py``` parses them as follows.
-
-```python
-bbox_labels = []
-root = xml.etree.ElementTree.parse(label_path).getroot()
-for object in root.findall('object'):
- bbox_sample = []
- # start from 1
- bbox_sample.append(float(settings.label_list.index(
- object.find('name').text)))
- bbox = object.find('bndbox')
- difficult = float(object.find('difficult').text)
- bbox_sample.append(float(bbox.find('xmin').text)/img_width)
- bbox_sample.append(float(bbox.find('ymin').text)/img_height)
- bbox_sample.append(float(bbox.find('xmax').text)/img_width)
- bbox_sample.append(float(bbox.find('ymax').text)/img_height)
- bbox_sample.append(difficult)
- bbox_labels.append(bbox_sample)
-```
-
-Suppose the annotation data (e.g. image00000\_annotation\_file\_path) is stored as follows:
-
-```text
-label1 xmin1 ymin1 xmax1 ymax1
-label2 xmin2 ymin2 xmax2 ymax2
-...
-```
-
-Each row corresponds to one object and has 5 fields. The first is the label (note that the background is 0, so labels must be numbered from 1), and the remaining four are the coordinates. The parsing logic can then be changed as follows:
-
-```python
-bbox_labels = []
-with open(label_path) as flabel:
- for line in flabel:
- bbox_sample = []
- bbox = [float(i) for i in line.strip().split()]
- label = bbox[0]
- bbox_sample.append(label)
- bbox_sample.append(bbox[1]/float(img_width))
- bbox_sample.append(bbox[2]/float(img_height))
- bbox_sample.append(bbox[3]/float(img_width))
- bbox_sample.append(bbox[4]/float(img_height))
- bbox_sample.append(0.0)
- bbox_labels.append(bbox_sample)
-```
-
-Also note that the hyperparameters related to the network structure must be adjusted according to the image size and the sizes of the objects to detect. Create your own configuration file modeled on ```config/vgg_config.py```; for advice on setting the parameters, please refer to \[[1](#References)\].
-
-## References
-1. Wei Liu, Dragomir Anguelov, Dumitru Erhan, Christian Szegedy, Scott Reed, Cheng-Yang Fu, Alexander C. Berg. [SSD: Single shot multibox detector](https://arxiv.org/abs/1512.02325). European conference on computer vision. Springer, Cham, 2016.
-2. Simonyan, Karen, and Andrew Zisserman. [Very deep convolutional networks for large-scale image recognition](https://arxiv.org/abs/1409.1556). arXiv preprint arXiv:1409.1556 (2014).
-3. [The PASCAL Visual Object Classes Challenge 2007](http://host.robots.ox.ac.uk/pascal/VOC/voc2007/index.html)
-4. [Visual Object Classes Challenge 2012 (VOC2012)](http://host.robots.ox.ac.uk/pascal/VOC/voc2012/index.html)
diff --git a/legacy/ssd/config/__init__.py b/legacy/ssd/config/__init__.py
deleted file mode 100644
index 8b137891791fe96927ad78e64b0aad7bded08bdc..0000000000000000000000000000000000000000
--- a/legacy/ssd/config/__init__.py
+++ /dev/null
@@ -1 +0,0 @@
-
diff --git a/legacy/ssd/config/pascal_voc_conf.py b/legacy/ssd/config/pascal_voc_conf.py
deleted file mode 100644
index 416423345e5cded9afce4c0083f4523434845484..0000000000000000000000000000000000000000
--- a/legacy/ssd/config/pascal_voc_conf.py
+++ /dev/null
@@ -1,92 +0,0 @@
-from easydict import EasyDict as edict
-import numpy as np
-
-__C = edict()
-cfg = __C
-
-__C.TRAIN = edict()
-
-__C.IMG_WIDTH = 300
-__C.IMG_HEIGHT = 300
-__C.IMG_CHANNEL = 3
-__C.CLASS_NUM = 21
-__C.BACKGROUND_ID = 0
-
-# training settings
-__C.TRAIN.LEARNING_RATE = 0.001 / 4
-__C.TRAIN.MOMENTUM = 0.9
-__C.TRAIN.BATCH_SIZE = 32
-__C.TRAIN.NUM_PASS = 200
-__C.TRAIN.L2REGULARIZATION = 0.0005 * 4
-__C.TRAIN.LEARNING_RATE_DECAY_A = 0.1
-__C.TRAIN.LEARNING_RATE_DECAY_B = 16551 * 80
-__C.TRAIN.LEARNING_RATE_SCHEDULE = 'discexp'
-
-__C.NET = edict()
-
-# configuration for multibox_loss_layer
-__C.NET.MBLOSS = edict()
-__C.NET.MBLOSS.OVERLAP_THRESHOLD = 0.5
-__C.NET.MBLOSS.NEG_POS_RATIO = 3.0
-__C.NET.MBLOSS.NEG_OVERLAP = 0.5
-
-# configuration for detection_map
-__C.NET.DETMAP = edict()
-__C.NET.DETMAP.OVERLAP_THRESHOLD = 0.5
-__C.NET.DETMAP.EVAL_DIFFICULT = False
-__C.NET.DETMAP.AP_TYPE = "11point"
-
-# configuration for detection_output_layer
-__C.NET.DETOUT = edict()
-__C.NET.DETOUT.CONFIDENCE_THRESHOLD = 0.01
-__C.NET.DETOUT.NMS_THRESHOLD = 0.45
-__C.NET.DETOUT.NMS_TOP_K = 400
-__C.NET.DETOUT.KEEP_TOP_K = 200
-
-# configuration for priorbox_layer from conv4_3
-__C.NET.CONV4 = edict()
-__C.NET.CONV4.PB = edict()
-__C.NET.CONV4.PB.MIN_SIZE = [30]
-__C.NET.CONV4.PB.MAX_SIZE = []
-__C.NET.CONV4.PB.ASPECT_RATIO = [2.]
-__C.NET.CONV4.PB.VARIANCE = [0.1, 0.1, 0.2, 0.2]
-
-# configuration for priorbox_layer from fc7
-__C.NET.FC7 = edict()
-__C.NET.FC7.PB = edict()
-__C.NET.FC7.PB.MIN_SIZE = [60]
-__C.NET.FC7.PB.MAX_SIZE = [114]
-__C.NET.FC7.PB.ASPECT_RATIO = [2., 3.]
-__C.NET.FC7.PB.VARIANCE = [0.1, 0.1, 0.2, 0.2]
-
-# configuration for priorbox_layer from conv6_2
-__C.NET.CONV6 = edict()
-__C.NET.CONV6.PB = edict()
-__C.NET.CONV6.PB.MIN_SIZE = [114]
-__C.NET.CONV6.PB.MAX_SIZE = [168]
-__C.NET.CONV6.PB.ASPECT_RATIO = [2., 3.]
-__C.NET.CONV6.PB.VARIANCE = [0.1, 0.1, 0.2, 0.2]
-
-# configuration for priorbox_layer from conv7_2
-__C.NET.CONV7 = edict()
-__C.NET.CONV7.PB = edict()
-__C.NET.CONV7.PB.MIN_SIZE = [168]
-__C.NET.CONV7.PB.MAX_SIZE = [222]
-__C.NET.CONV7.PB.ASPECT_RATIO = [2., 3.]
-__C.NET.CONV7.PB.VARIANCE = [0.1, 0.1, 0.2, 0.2]
-
-# configuration for priorbox_layer from conv8_2
-__C.NET.CONV8 = edict()
-__C.NET.CONV8.PB = edict()
-__C.NET.CONV8.PB.MIN_SIZE = [222]
-__C.NET.CONV8.PB.MAX_SIZE = [276]
-__C.NET.CONV8.PB.ASPECT_RATIO = [2., 3.]
-__C.NET.CONV8.PB.VARIANCE = [0.1, 0.1, 0.2, 0.2]
-
-# configuration for priorbox_layer from pool6
-__C.NET.POOL6 = edict()
-__C.NET.POOL6.PB = edict()
-__C.NET.POOL6.PB.MIN_SIZE = [276]
-__C.NET.POOL6.PB.MAX_SIZE = [330]
-__C.NET.POOL6.PB.ASPECT_RATIO = [2., 3.]
-__C.NET.POOL6.PB.VARIANCE = [0.1, 0.1, 0.2, 0.2]
diff --git a/legacy/ssd/data/label_list b/legacy/ssd/data/label_list
deleted file mode 100644
index 87df23ce0aebcd5ab96fc91c868598c3333da59c..0000000000000000000000000000000000000000
--- a/legacy/ssd/data/label_list
+++ /dev/null
@@ -1,21 +0,0 @@
-background
-aeroplane
-bicycle
-bird
-boat
-bottle
-bus
-car
-cat
-chair
-cow
-diningtable
-dog
-horse
-motorbike
-person
-pottedplant
-sheep
-sofa
-train
-tvmonitor
diff --git a/legacy/ssd/data/prepare_voc_data.py b/legacy/ssd/data/prepare_voc_data.py
deleted file mode 100644
index a652956e91ab8277bc6670d4dc85905fc52a3203..0000000000000000000000000000000000000000
--- a/legacy/ssd/data/prepare_voc_data.py
+++ /dev/null
@@ -1,63 +0,0 @@
-import os
-import os.path as osp
-import re
-import random
-
-devkit_dir = './VOCdevkit'
-years = ['2007', '2012']
-
-
-def get_dir(devkit_dir, year, type):
- return osp.join(devkit_dir, 'VOC' + year, type)
-
-
-def walk_dir(devkit_dir, year):
- filelist_dir = get_dir(devkit_dir, year, 'ImageSets/Main')
- annotation_dir = get_dir(devkit_dir, year, 'Annotations')
- img_dir = get_dir(devkit_dir, year, 'JPEGImages')
- trainval_list = []
- test_list = []
- added = set()
-
- for _, _, files in os.walk(filelist_dir):
- for fname in files:
- img_ann_list = []
-            if re.match(r'[a-z]+_trainval\.txt', fname):
- img_ann_list = trainval_list
-            elif re.match(r'[a-z]+_test\.txt', fname):
- img_ann_list = test_list
- else:
- continue
- fpath = osp.join(filelist_dir, fname)
- for line in open(fpath):
- name_prefix = line.strip().split()[0]
- if name_prefix in added:
- continue
- added.add(name_prefix)
- ann_path = osp.join(annotation_dir, name_prefix + '.xml')
- img_path = osp.join(img_dir, name_prefix + '.jpg')
- assert os.path.isfile(ann_path), 'file %s not found.' % ann_path
- assert os.path.isfile(img_path), 'file %s not found.' % img_path
- img_ann_list.append((img_path, ann_path))
-
- return trainval_list, test_list
-
-
-def prepare_filelist(devkit_dir, years, output_dir):
- trainval_list = []
- test_list = []
- for year in years:
- trainval, test = walk_dir(devkit_dir, year)
- trainval_list.extend(trainval)
- test_list.extend(test)
- random.shuffle(trainval_list)
- with open(osp.join(output_dir, 'trainval.txt'), 'w') as ftrainval:
- for item in trainval_list:
- ftrainval.write(item[0] + ' ' + item[1] + '\n')
-
- with open(osp.join(output_dir, 'test.txt'), 'w') as ftest:
- for item in test_list:
- ftest.write(item[0] + ' ' + item[1] + '\n')
-
-
-prepare_filelist(devkit_dir, years, '.')
diff --git a/legacy/ssd/data_provider.py b/legacy/ssd/data_provider.py
deleted file mode 100644
index e59d324b497977ec02c1f728cb49a432f864382c..0000000000000000000000000000000000000000
--- a/legacy/ssd/data_provider.py
+++ /dev/null
@@ -1,175 +0,0 @@
-# Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserved
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import image_util
-from paddle.utils.image_util import *
-import random
-from PIL import Image
-import numpy as np
-import xml.etree.ElementTree
-import os
-
-
-class Settings(object):
- def __init__(self, data_dir, label_file, resize_h, resize_w, mean_value):
- self._data_dir = data_dir
- self._label_list = []
- label_fpath = os.path.join(data_dir, label_file)
- for line in open(label_fpath):
- self._label_list.append(line.strip())
-
- self._resize_height = resize_h
- self._resize_width = resize_w
- self._img_mean = np.array(mean_value)[:, np.newaxis, np.newaxis].astype(
- 'float32')
-
- @property
- def data_dir(self):
- return self._data_dir
-
- @property
- def label_list(self):
- return self._label_list
-
- @property
- def resize_h(self):
- return self._resize_height
-
- @property
- def resize_w(self):
- return self._resize_width
-
- @property
- def img_mean(self):
- return self._img_mean
-
-
-def _reader_creator(settings, file_list, mode, shuffle):
- def reader():
- with open(file_list) as flist:
- lines = [line.strip() for line in flist]
- if shuffle:
- random.shuffle(lines)
- for line in lines:
- if mode == 'train' or mode == 'test':
- img_path, label_path = line.split()
- img_path = os.path.join(settings.data_dir, img_path)
- label_path = os.path.join(settings.data_dir, label_path)
- elif mode == 'infer':
- img_path = os.path.join(settings.data_dir, line)
-
- img = Image.open(img_path)
- img_width, img_height = img.size
- img = np.array(img)
-
- # layout: label | xmin | ymin | xmax | ymax | difficult
- if mode == 'train' or mode == 'test':
- bbox_labels = []
- root = xml.etree.ElementTree.parse(label_path).getroot()
- for object in root.findall('object'):
- bbox_sample = []
- # start from 1
- bbox_sample.append(
- float(
- settings.label_list.index(
- object.find('name').text)))
- bbox = object.find('bndbox')
- difficult = float(object.find('difficult').text)
- bbox_sample.append(
- float(bbox.find('xmin').text) / img_width)
- bbox_sample.append(
- float(bbox.find('ymin').text) / img_height)
- bbox_sample.append(
- float(bbox.find('xmax').text) / img_width)
- bbox_sample.append(
- float(bbox.find('ymax').text) / img_height)
- bbox_sample.append(difficult)
- bbox_labels.append(bbox_sample)
-
- sample_labels = bbox_labels
- if mode == 'train':
- batch_sampler = []
- # hard-code here
- batch_sampler.append(
- image_util.sampler(1, 1, 1.0, 1.0, 1.0, 1.0, 0.0,
- 0.0))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.1,
- 0.0))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.3,
- 0.0))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.5,
- 0.0))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.7,
- 0.0))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.9,
- 0.0))
- batch_sampler.append(
- image_util.sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.0,
- 1.0))
- """ random crop """
- sampled_bbox = image_util.generate_batch_samples(
- batch_sampler, bbox_labels, img_width, img_height)
-
- if len(sampled_bbox) > 0:
- idx = int(random.uniform(0, len(sampled_bbox)))
- img, sample_labels = image_util.crop_image(
- img, bbox_labels, sampled_bbox[idx], img_width,
- img_height)
-
- img = Image.fromarray(img)
- img = img.resize((settings.resize_w, settings.resize_h),
- Image.ANTIALIAS)
- img = np.array(img)
-
- if mode == 'train':
- mirror = int(random.uniform(0, 2))
- if mirror == 1:
- img = img[:, ::-1, :]
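-                    # a horizontal flip maps x -> 1 - x in normalized
-                    # coordinates, so xmin and xmax swap roles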
- for i in xrange(len(sample_labels)):
- tmp = sample_labels[i][1]
- sample_labels[i][1] = 1 - sample_labels[i][3]
- sample_labels[i][3] = 1 - tmp
-
-            if len(img.shape) == 3:
-                # HWC -> CHW, the layout expected by the network
-                img = np.swapaxes(img, 1, 2)
-                img = np.swapaxes(img, 1, 0)
-
- img = img.astype('float32')
- img -= settings.img_mean
- img = img.flatten()
-
- if mode == 'train' or mode == 'test':
-                if mode == 'train' and len(sample_labels) == 0:
-                    continue  # skip crops that dropped every ground-truth box
- yield img.astype('float32'), sample_labels
- elif mode == 'infer':
- yield img.astype('float32')
-
- return reader
-
-
-def train(settings, file_list, shuffle=True):
- return _reader_creator(settings, file_list, 'train', shuffle)
-
-
-def test(settings, file_list):
- return _reader_creator(settings, file_list, 'test', False)
-
-
-def infer(settings, file_list):
- return _reader_creator(settings, file_list, 'infer', False)
diff --git a/legacy/ssd/eval.py b/legacy/ssd/eval.py
deleted file mode 100644
index 4f585eae25804b1c70f9baafd5d33246a6531505..0000000000000000000000000000000000000000
--- a/legacy/ssd/eval.py
+++ /dev/null
@@ -1,47 +0,0 @@
-import paddle.v2 as paddle
-import data_provider
-import vgg_ssd_net
-import os
-import gzip
-from config.pascal_voc_conf import cfg
-
-
-def eval(eval_file_list, batch_size, data_args, model_path):
- cost, detect_out = vgg_ssd_net.net_conf(mode='eval')
-
- assert os.path.isfile(model_path), 'Invalid model.'
- parameters = paddle.parameters.Parameters.from_tar(gzip.open(model_path))
-
- optimizer = paddle.optimizer.Momentum()
-
- trainer = paddle.trainer.SGD(cost=cost,
- parameters=parameters,
- extra_layers=[detect_out],
- update_equation=optimizer)
-
- feeding = {'image': 0, 'bbox': 1}
-
- reader = paddle.batch(
- data_provider.test(data_args, eval_file_list), batch_size=batch_size)
-
- result = trainer.test(reader=reader, feeding=feeding)
-
- print "TestCost: %f, Detection mAP=%g" % \
- (result.cost, result.metrics['detection_evaluator'])
-
-
-if __name__ == "__main__":
- paddle.init(use_gpu=True, trainer_count=4) # use 4 gpus
-
- data_args = data_provider.Settings(
- data_dir='./data',
- label_file='label_list',
- resize_h=cfg.IMG_HEIGHT,
- resize_w=cfg.IMG_WIDTH,
- mean_value=[104, 117, 124])
-
- eval(
- eval_file_list='./data/test.txt',
- batch_size=4,
- data_args=data_args,
- model_path='models/pass-00000.tar.gz')
diff --git a/legacy/ssd/image_util.py b/legacy/ssd/image_util.py
deleted file mode 100644
index ba8744eda0a078acd38cad9b10ca7511185efc43..0000000000000000000000000000000000000000
--- a/legacy/ssd/image_util.py
+++ /dev/null
@@ -1,161 +0,0 @@
-from PIL import Image
-import numpy as np
-import random
-import math
-
-
-class sampler():
- def __init__(self, max_sample, max_trial, min_scale, max_scale,
- min_aspect_ratio, max_aspect_ratio, min_jaccard_overlap,
- max_jaccard_overlap):
- self.max_sample = max_sample
- self.max_trial = max_trial
- self.min_scale = min_scale
- self.max_scale = max_scale
- self.min_aspect_ratio = min_aspect_ratio
- self.max_aspect_ratio = max_aspect_ratio
- self.min_jaccard_overlap = min_jaccard_overlap
- self.max_jaccard_overlap = max_jaccard_overlap
-
-
-class bbox():
- def __init__(self, xmin, ymin, xmax, ymax):
- self.xmin = xmin
- self.ymin = ymin
- self.xmax = xmax
- self.ymax = ymax
-
-
-def bbox_area(src_bbox):
- width = src_bbox.xmax - src_bbox.xmin
- height = src_bbox.ymax - src_bbox.ymin
- return width * height
-
-
-def generate_sample(sampler):
-    scale = random.uniform(sampler.min_scale, sampler.max_scale)
-    # constrain the aspect ratio so that both the sampled width
-    # (scale * sqrt(ar)) and height (scale / sqrt(ar)) stay within [0, 1]
-    min_aspect_ratio = max(sampler.min_aspect_ratio, (scale**2.0))
-    max_aspect_ratio = min(sampler.max_aspect_ratio, 1 / (scale**2.0))
- aspect_ratio = random.uniform(min_aspect_ratio, max_aspect_ratio)
- bbox_width = scale * (aspect_ratio**0.5)
- bbox_height = scale / (aspect_ratio**0.5)
- xmin_bound = 1 - bbox_width
- ymin_bound = 1 - bbox_height
- xmin = random.uniform(0, xmin_bound)
- ymin = random.uniform(0, ymin_bound)
- xmax = xmin + bbox_width
- ymax = ymin + bbox_height
- sampled_bbox = bbox(xmin, ymin, xmax, ymax)
- return sampled_bbox
-
-
-def jaccard_overlap(sample_bbox, object_bbox):
- if sample_bbox.xmin >= object_bbox.xmax or \
- sample_bbox.xmax <= object_bbox.xmin or \
- sample_bbox.ymin >= object_bbox.ymax or \
- sample_bbox.ymax <= object_bbox.ymin:
- return 0
- intersect_xmin = max(sample_bbox.xmin, object_bbox.xmin)
- intersect_ymin = max(sample_bbox.ymin, object_bbox.ymin)
- intersect_xmax = min(sample_bbox.xmax, object_bbox.xmax)
- intersect_ymax = min(sample_bbox.ymax, object_bbox.ymax)
- intersect_size = (intersect_xmax - intersect_xmin) * (
- intersect_ymax - intersect_ymin)
- sample_bbox_size = bbox_area(sample_bbox)
- object_bbox_size = bbox_area(object_bbox)
- overlap = intersect_size / (
- sample_bbox_size + object_bbox_size - intersect_size)
- return overlap
-
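-# Example for jaccard_overlap above (illustrative, not part of the original
-# module): for bbox(0, 0, 1, 1) and bbox(0.5, 0.5, 1.5, 1.5) the
-# intersection is 0.5 * 0.5 = 0.25 and the union is 1 + 1 - 0.25 = 1.75,
-# so jaccard_overlap returns 0.25 / 1.75 ~= 0.1429.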
-
-def satisfy_sample_constraint(sampler, sample_bbox, bbox_labels):
- if sampler.min_jaccard_overlap == 0 and sampler.max_jaccard_overlap == 0:
- return True
- for i in range(len(bbox_labels)):
- object_bbox = bbox(bbox_labels[i][1], bbox_labels[i][2],
- bbox_labels[i][3], bbox_labels[i][4])
- overlap = jaccard_overlap(sample_bbox, object_bbox)
- if sampler.min_jaccard_overlap != 0 and \
- overlap < sampler.min_jaccard_overlap:
- continue
- if sampler.max_jaccard_overlap != 0 and \
- overlap > sampler.max_jaccard_overlap:
- continue
- return True
- return False
-
-
-def generate_batch_samples(batch_sampler, bbox_labels, image_width,
-                           image_height):
-    sampled_bbox = []
-    for sampler in batch_sampler:
-        found = 0
-        for i in range(sampler.max_trial):
-            if found >= sampler.max_sample:
-                break
-            sample_bbox = generate_sample(sampler)
-            if satisfy_sample_constraint(sampler, sample_bbox, bbox_labels):
-                sampled_bbox.append(sample_bbox)
-                found = found + 1
-    return sampled_bbox
-
-
-def clip_bbox(src_bbox):
- src_bbox.xmin = max(min(src_bbox.xmin, 1.0), 0.0)
- src_bbox.ymin = max(min(src_bbox.ymin, 1.0), 0.0)
- src_bbox.xmax = max(min(src_bbox.xmax, 1.0), 0.0)
- src_bbox.ymax = max(min(src_bbox.ymax, 1.0), 0.0)
- return src_bbox
-
-
-def meet_emit_constraint(src_bbox, sample_bbox):
- center_x = (src_bbox.xmax + src_bbox.xmin) / 2
- center_y = (src_bbox.ymax + src_bbox.ymin) / 2
- if center_x >= sample_bbox.xmin and \
- center_x <= sample_bbox.xmax and \
- center_y >= sample_bbox.ymin and \
- center_y <= sample_bbox.ymax:
- return True
- return False
-
-
-def transform_labels(bbox_labels, sample_bbox):
- proj_bbox = bbox(0, 0, 0, 0)
- sample_labels = []
- for i in range(len(bbox_labels)):
- sample_label = []
- object_bbox = bbox(bbox_labels[i][1], bbox_labels[i][2],
- bbox_labels[i][3], bbox_labels[i][4])
- if not meet_emit_constraint(object_bbox, sample_bbox):
- continue
- sample_width = sample_bbox.xmax - sample_bbox.xmin
- sample_height = sample_bbox.ymax - sample_bbox.ymin
- proj_bbox.xmin = (object_bbox.xmin - sample_bbox.xmin) / sample_width
- proj_bbox.ymin = (object_bbox.ymin - sample_bbox.ymin) / sample_height
- proj_bbox.xmax = (object_bbox.xmax - sample_bbox.xmin) / sample_width
- proj_bbox.ymax = (object_bbox.ymax - sample_bbox.ymin) / sample_height
- proj_bbox = clip_bbox(proj_bbox)
- if bbox_area(proj_bbox) > 0:
- sample_label.append(bbox_labels[i][0])
- sample_label.append(float(proj_bbox.xmin))
- sample_label.append(float(proj_bbox.ymin))
- sample_label.append(float(proj_bbox.xmax))
- sample_label.append(float(proj_bbox.ymax))
- sample_label.append(bbox_labels[i][5])
- sample_labels.append(sample_label)
- return sample_labels
-
-
-def crop_image(img, bbox_labels, sample_bbox, image_width, image_height):
- sample_bbox = clip_bbox(sample_bbox)
- xmin = int(sample_bbox.xmin * image_width)
- xmax = int(sample_bbox.xmax * image_width)
- ymin = int(sample_bbox.ymin * image_height)
- ymax = int(sample_bbox.ymax * image_height)
- sample_img = img[ymin:ymax, xmin:xmax]
- sample_labels = transform_labels(bbox_labels, sample_bbox)
- return sample_img, sample_labels
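-
-
-# Typical use of this module (see data_provider.py in this directory):
-#     samplers = [sampler(1, 50, 0.3, 1.0, 0.5, 2.0, 0.1, 0.0)]
-#     boxes = generate_batch_samples(samplers, bbox_labels, width, height)
-#     cropped_img, cropped_labels = crop_image(img, bbox_labels, boxes[0],
-#                                              width, height)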
diff --git a/legacy/ssd/images/SSD300x300_map.png b/legacy/ssd/images/SSD300x300_map.png
deleted file mode 100644
index a40a1e028be7ba979052034c152028976bc4b715..0000000000000000000000000000000000000000
Binary files a/legacy/ssd/images/SSD300x300_map.png and /dev/null differ
diff --git a/legacy/ssd/images/ssd_network.png b/legacy/ssd/images/ssd_network.png
deleted file mode 100644
index 193caa0168a4f981506ad7b97f8b9fb35557ed20..0000000000000000000000000000000000000000
Binary files a/legacy/ssd/images/ssd_network.png and /dev/null differ
diff --git a/legacy/ssd/images/vis_1.jpg b/legacy/ssd/images/vis_1.jpg
deleted file mode 100644
index c317462ee6053df15fa8d44d0f35398e47156e8d..0000000000000000000000000000000000000000
Binary files a/legacy/ssd/images/vis_1.jpg and /dev/null differ
diff --git a/legacy/ssd/images/vis_2.jpg b/legacy/ssd/images/vis_2.jpg
deleted file mode 100644
index 7bc59b239cb9c123087fdecbb210ad52a3a35f10..0000000000000000000000000000000000000000
Binary files a/legacy/ssd/images/vis_2.jpg and /dev/null differ
diff --git a/legacy/ssd/images/vis_3.jpg b/legacy/ssd/images/vis_3.jpg
deleted file mode 100644
index a79598343a7e2707ba79c2e8891d7af0c24df491..0000000000000000000000000000000000000000
Binary files a/legacy/ssd/images/vis_3.jpg and /dev/null differ
diff --git a/legacy/ssd/images/vis_4.jpg b/legacy/ssd/images/vis_4.jpg
deleted file mode 100644
index 96b2c99c9ef986cc0d4802b31c33f076fce6f965..0000000000000000000000000000000000000000
Binary files a/legacy/ssd/images/vis_4.jpg and /dev/null differ
diff --git a/legacy/ssd/infer.py b/legacy/ssd/infer.py
deleted file mode 100644
index c0bc79189935d8bdd59f17756b9c95581870f36a..0000000000000000000000000000000000000000
--- a/legacy/ssd/infer.py
+++ /dev/null
@@ -1,98 +0,0 @@
-import paddle.v2 as paddle
-import data_provider
-import vgg_ssd_net
-import os
-import numpy as np
-import gzip
-from PIL import Image
-from config.pascal_voc_conf import cfg
-
-
-def _infer(inferer, infer_data, threshold):
- ret = []
- infer_res = inferer.infer(input=infer_data)
- keep_inds = np.where(infer_res[:, 2] >= threshold)[0]
- for idx in keep_inds:
- ret.append([
- infer_res[idx][0], infer_res[idx][1] - 1, infer_res[idx][2],
- infer_res[idx][3], infer_res[idx][4], infer_res[idx][5],
- infer_res[idx][6]
- ])
- return ret
-
-
-def save_batch_res(ret_res, img_w, img_h, fname_list, fout):
- for det_res in ret_res:
- img_idx = int(det_res[0])
- label = int(det_res[1])
- conf_score = det_res[2]
- xmin = det_res[3] * img_w[img_idx]
- ymin = det_res[4] * img_h[img_idx]
- xmax = det_res[5] * img_w[img_idx]
- ymax = det_res[6] * img_h[img_idx]
- fout.write(fname_list[img_idx] + '\t' + str(label) + '\t' + str(
- conf_score) + '\t' + str(xmin) + ' ' + str(ymin) + ' ' + str(xmax) +
- ' ' + str(ymax))
- fout.write('\n')
-
-
-def infer(eval_file_list, save_path, data_args, batch_size, model_path,
- threshold):
- detect_out = vgg_ssd_net.net_conf(mode='infer')
-
- assert os.path.isfile(model_path), 'Invalid model.'
- parameters = paddle.parameters.Parameters.from_tar(gzip.open(model_path))
-
- inferer = paddle.inference.Inference(
- output_layer=detect_out, parameters=parameters)
-
- reader = data_provider.infer(data_args, eval_file_list)
- all_fname_list = [line.strip() for line in open(eval_file_list).readlines()]
-
- test_data = []
- fname_list = []
- img_w = []
- img_h = []
-    idx = 0
-    # run inference batch by batch; bbox coords are scaled back to each
-    # image's original size before saving
- with open(save_path, 'w') as fout:
- for img in reader():
- test_data.append([img])
- fname_list.append(all_fname_list[idx])
-            w, h = Image.open(
-                os.path.join(data_args.data_dir, fname_list[-1])).size
- img_w.append(w)
- img_h.append(h)
- if len(test_data) == batch_size:
- ret_res = _infer(inferer, test_data, threshold)
- save_batch_res(ret_res, img_w, img_h, fname_list, fout)
- test_data = []
- fname_list = []
- img_w = []
- img_h = []
-
- idx += 1
-
- if len(test_data) > 0:
- ret_res = _infer(inferer, test_data, threshold)
- save_batch_res(ret_res, img_w, img_h, fname_list, fout)
-
-
-if __name__ == "__main__":
- paddle.init(use_gpu=True, trainer_count=1)
-
- data_args = data_provider.Settings(
- data_dir='./data',
- label_file='label_list',
- resize_h=cfg.IMG_HEIGHT,
- resize_w=cfg.IMG_WIDTH,
- mean_value=[104, 117, 124])
-
- infer(
- eval_file_list='./data/infer.txt',
- save_path='infer.res',
- data_args=data_args,
- batch_size=4,
- model_path='models/pass-00000.tar.gz',
- threshold=0.3)
diff --git a/legacy/ssd/train.py b/legacy/ssd/train.py
deleted file mode 100644
index 38b27f7bbf698d697d6d4313395b8cede112d2d9..0000000000000000000000000000000000000000
--- a/legacy/ssd/train.py
+++ /dev/null
@@ -1,83 +0,0 @@
-import paddle.v2 as paddle
-import data_provider
-import vgg_ssd_net
-import os, sys
-import gzip
-from config.pascal_voc_conf import cfg
-
-
-def train(train_file_list, dev_file_list, data_args, init_model_path):
- optimizer = paddle.optimizer.Momentum(
- momentum=cfg.TRAIN.MOMENTUM,
- learning_rate=cfg.TRAIN.LEARNING_RATE,
- regularization=paddle.optimizer.L2Regularization(
- rate=cfg.TRAIN.L2REGULARIZATION),
- learning_rate_decay_a=cfg.TRAIN.LEARNING_RATE_DECAY_A,
- learning_rate_decay_b=cfg.TRAIN.LEARNING_RATE_DECAY_B,
- learning_rate_schedule=cfg.TRAIN.LEARNING_RATE_SCHEDULE)
-
- cost, detect_out = vgg_ssd_net.net_conf('train')
-
- parameters = paddle.parameters.create(cost)
- if not (init_model_path is None):
- assert os.path.isfile(init_model_path), 'Invalid model.'
- parameters.init_from_tar(gzip.open(init_model_path))
-
- trainer = paddle.trainer.SGD(cost=cost,
- parameters=parameters,
- extra_layers=[detect_out],
- update_equation=optimizer)
-
- feeding = {'image': 0, 'bbox': 1}
-
- train_reader = paddle.batch(
- data_provider.train(data_args, train_file_list),
- batch_size=cfg.TRAIN.BATCH_SIZE) # generate a batch image each time
-
- dev_reader = paddle.batch(
- data_provider.test(data_args, dev_file_list),
- batch_size=cfg.TRAIN.BATCH_SIZE)
-
- def event_handler(event):
- if isinstance(event, paddle.event.EndIteration):
-            if event.batch_id % 100 == 0:  # log metrics every 100 batches
- print "\nPass %d, Batch %d, TrainCost %f, Detection mAP=%f" % \
- (event.pass_id,
- event.batch_id,
- event.cost,
- event.metrics['detection_evaluator'])
- else:
- sys.stdout.write('.')
- sys.stdout.flush()
-
-        if isinstance(event, paddle.event.EndPass):
-            if not os.path.exists('checkpoints'):
-                os.makedirs('checkpoints')
-            with gzip.open('checkpoints/params_pass_%05d.tar.gz' % \
-                    event.pass_id, 'w') as f:
- trainer.save_parameter_to_tar(f)
- result = trainer.test(reader=dev_reader, feeding=feeding)
- print "\nTest with Pass %d, TestCost: %f, Detection mAP=%g" % \
- (event.pass_id,
- result.cost,
- result.metrics['detection_evaluator'])
-
- trainer.train(
- reader=train_reader,
- event_handler=event_handler,
- num_passes=cfg.TRAIN.NUM_PASS,
- feeding=feeding)
-
-
-if __name__ == "__main__":
- paddle.init(use_gpu=True, trainer_count=4)
- data_args = data_provider.Settings(
- data_dir='./data',
- label_file='label_list',
- resize_h=cfg.IMG_HEIGHT,
- resize_w=cfg.IMG_WIDTH,
- mean_value=[104, 117, 124])
- train(
- train_file_list='./data/trainval.txt',
- dev_file_list='./data/test.txt',
- data_args=data_args,
- init_model_path='./vgg/vgg_model.tar.gz')
diff --git a/legacy/ssd/vgg_ssd_net.py b/legacy/ssd/vgg_ssd_net.py
deleted file mode 100644
index 9d7434acd5eea1453f282ce3e065810aed02f8e8..0000000000000000000000000000000000000000
--- a/legacy/ssd/vgg_ssd_net.py
+++ /dev/null
@@ -1,297 +0,0 @@
-import paddle.v2 as paddle
-from config.pascal_voc_conf import cfg
-
-
-def net_conf(mode):
-    """Network configuration. Three modes are supported: 'train', 'eval'
-    and 'infer'. In 'train' and 'eval' mode, the multibox loss layer and
-    the detection output layer are both returned (with a mAP evaluator
-    attached); in 'infer' mode, only the detection output layer is returned.
-    """
- default_l2regularization = cfg.TRAIN.L2REGULARIZATION
-
- default_bias_attr = paddle.attr.ParamAttr(l2_rate=0.0, learning_rate=2.0)
- default_static_bias_attr = paddle.attr.ParamAttr(is_static=True)
-
- def get_param_attr(local_lr, regularization):
- is_static = False
- if local_lr == 0.0:
- is_static = True
- return paddle.attr.ParamAttr(
- learning_rate=local_lr, l2_rate=regularization, is_static=is_static)
-
- def get_loc_conf_filter_size(aspect_ratio_num, min_size_num, max_size_num):
- loc_filter_size = (
- aspect_ratio_num * 2 + min_size_num + max_size_num) * 4
- conf_filter_size = (
- aspect_ratio_num * 2 + min_size_num + max_size_num) * cfg.CLASS_NUM
- return loc_filter_size, conf_filter_size
-
- def conv_group(stack_num, name_list, input, filter_size_list, num_channels,
- num_filters_list, stride_list, padding_list,
- common_bias_attr, common_param_attr, common_act):
- conv = input
- in_channels = num_channels
- for i in xrange(stack_num):
- conv = paddle.layer.img_conv(
- name=name_list[i],
- input=conv,
- filter_size=filter_size_list[i],
- num_channels=in_channels,
- num_filters=num_filters_list[i],
- stride=stride_list[i],
- padding=padding_list[i],
- bias_attr=common_bias_attr,
- param_attr=common_param_attr,
- act=common_act)
- in_channels = num_filters_list[i]
- return conv
-
- def vgg_block(idx_str, input, num_channels, num_filters, pool_size,
- pool_stride, pool_pad):
- layer_name = "conv%s_" % idx_str
- stack_num = 3
- name_list = [layer_name + str(i + 1) for i in xrange(3)]
-
- conv = conv_group(stack_num, name_list, input, [3] * stack_num,
- num_channels, [num_filters] * stack_num,
- [1] * stack_num, [1] * stack_num, default_bias_attr,
- get_param_attr(1, default_l2regularization),
- paddle.activation.Relu())
-
- pool = paddle.layer.img_pool(
- input=conv,
- pool_size=pool_size,
- num_channels=num_filters,
- pool_type=paddle.pooling.CudnnMax(),
- stride=pool_stride,
- padding=pool_pad)
- return conv, pool
-
- def mbox_block(layer_idx, input, num_channels, filter_size, loc_filters,
- conf_filters):
- mbox_loc_name = layer_idx + "_mbox_loc"
- mbox_loc = paddle.layer.img_conv(
- name=mbox_loc_name,
- input=input,
- filter_size=filter_size,
- num_channels=num_channels,
- num_filters=loc_filters,
- stride=1,
- padding=1,
- bias_attr=default_bias_attr,
- param_attr=get_param_attr(1, default_l2regularization),
- act=paddle.activation.Identity())
-
- mbox_conf_name = layer_idx + "_mbox_conf"
- mbox_conf = paddle.layer.img_conv(
- name=mbox_conf_name,
- input=input,
- filter_size=filter_size,
- num_channels=num_channels,
- num_filters=conf_filters,
- stride=1,
- padding=1,
- bias_attr=default_bias_attr,
- param_attr=get_param_attr(1, default_l2regularization),
- act=paddle.activation.Identity())
-
- return mbox_loc, mbox_conf
-
- def ssd_block(layer_idx, input, img_shape, num_channels, num_filters1,
- num_filters2, aspect_ratio, variance, min_size, max_size):
- layer_name = "conv" + layer_idx + "_"
- stack_num = 2
- conv1_name = layer_name + "1"
- conv2_name = layer_name + "2"
- conv2 = conv_group(stack_num, [conv1_name, conv2_name], input, [1, 3],
- num_channels, [num_filters1, num_filters2], [1, 2],
- [0, 1], default_bias_attr,
- get_param_attr(1, default_l2regularization),
- paddle.activation.Relu())
-
- loc_filters, conf_filters = get_loc_conf_filter_size(
- len(aspect_ratio), len(min_size), len(max_size))
- mbox_loc, mbox_conf = mbox_block(conv2_name, conv2, num_filters2, 3,
- loc_filters, conf_filters)
- mbox_priorbox = paddle.layer.priorbox(
- input=conv2,
- image=img_shape,
- min_size=min_size,
- max_size=max_size,
- aspect_ratio=aspect_ratio,
- variance=variance)
-
- return conv2, mbox_loc, mbox_conf, mbox_priorbox
-
- img = paddle.layer.data(
- name='image',
- type=paddle.data_type.dense_vector(cfg.IMG_CHANNEL * cfg.IMG_HEIGHT *
- cfg.IMG_WIDTH),
- height=cfg.IMG_HEIGHT,
- width=cfg.IMG_WIDTH)
-
- stack_num = 2
- conv1_2 = conv_group(stack_num, ['conv1_1', 'conv1_2'], img,
- [3] * stack_num, 3, [64] * stack_num, [1] * stack_num,
- [1] * stack_num, default_static_bias_attr,
- get_param_attr(0, 0), paddle.activation.Relu())
-
- pool1 = paddle.layer.img_pool(
- name="pool1",
- input=conv1_2,
- pool_type=paddle.pooling.CudnnMax(),
- pool_size=2,
- num_channels=64,
- stride=2)
-
- stack_num = 2
- conv2_2 = conv_group(stack_num, ['conv2_1', 'conv2_2'], pool1, [3] *
- stack_num, 64, [128] * stack_num, [1] * stack_num,
- [1] * stack_num, default_static_bias_attr,
- get_param_attr(0, 0), paddle.activation.Relu())
-
- pool2 = paddle.layer.img_pool(
- name="pool2",
- input=conv2_2,
- pool_type=paddle.pooling.CudnnMax(),
- pool_size=2,
- num_channels=128,
- stride=2)
-
- conv3_3, pool3 = vgg_block("3", pool2, 128, 256, 2, 2, 0)
-
- conv4_3, pool4 = vgg_block("4", pool3, 256, 512, 2, 2, 0)
- conv4_3_mbox_priorbox = paddle.layer.priorbox(
- input=conv4_3,
- image=img,
- min_size=cfg.NET.CONV4.PB.MIN_SIZE,
- aspect_ratio=cfg.NET.CONV4.PB.ASPECT_RATIO,
- variance=cfg.NET.CONV4.PB.VARIANCE)
- conv4_3_norm = paddle.layer.cross_channel_norm(
- name="conv4_3_norm",
- input=conv4_3,
- param_attr=paddle.attr.ParamAttr(
- initial_mean=20, initial_std=0, is_static=False, learning_rate=1))
- CONV4_PB = cfg.NET.CONV4.PB
- loc_filter_size, conf_filter_size = get_loc_conf_filter_size(
- len(CONV4_PB.ASPECT_RATIO),
- len(CONV4_PB.MIN_SIZE), len(CONV4_PB.MAX_SIZE))
- conv4_3_norm_mbox_loc, conv4_3_norm_mbox_conf = \
- mbox_block("conv4_3_norm", conv4_3_norm, 512, 3,
- loc_filter_size, conf_filter_size)
-
- conv5_3, pool5 = vgg_block("5", pool4, 512, 512, 3, 1, 1)
-
- stack_num = 2
- fc7 = conv_group(stack_num, ['fc6', 'fc7'], pool5, [3, 1], 512, [1024] *
- stack_num, [1] * stack_num, [1, 0], default_bias_attr,
- get_param_attr(1, default_l2regularization),
- paddle.activation.Relu())
-
- FC7_PB = cfg.NET.FC7.PB
- loc_filter_size, conf_filter_size = get_loc_conf_filter_size(
- len(FC7_PB.ASPECT_RATIO), len(FC7_PB.MIN_SIZE), len(FC7_PB.MAX_SIZE))
- fc7_mbox_loc, fc7_mbox_conf = mbox_block("fc7", fc7, 1024, 3,
- loc_filter_size, conf_filter_size)
- fc7_mbox_priorbox = paddle.layer.priorbox(
- input=fc7,
- image=img,
- min_size=cfg.NET.FC7.PB.MIN_SIZE,
- max_size=cfg.NET.FC7.PB.MAX_SIZE,
- aspect_ratio=cfg.NET.FC7.PB.ASPECT_RATIO,
- variance=cfg.NET.FC7.PB.VARIANCE)
-
- conv6_2, conv6_2_mbox_loc, conv6_2_mbox_conf, conv6_2_mbox_priorbox = \
- ssd_block("6", fc7, img, 1024, 256, 512,
- cfg.NET.CONV6.PB.ASPECT_RATIO,
- cfg.NET.CONV6.PB.VARIANCE,
- cfg.NET.CONV6.PB.MIN_SIZE,
- cfg.NET.CONV6.PB.MAX_SIZE)
- conv7_2, conv7_2_mbox_loc, conv7_2_mbox_conf, conv7_2_mbox_priorbox = \
- ssd_block("7", conv6_2, img, 512, 128, 256,
- cfg.NET.CONV7.PB.ASPECT_RATIO,
- cfg.NET.CONV7.PB.VARIANCE,
- cfg.NET.CONV7.PB.MIN_SIZE,
- cfg.NET.CONV7.PB.MAX_SIZE)
- conv8_2, conv8_2_mbox_loc, conv8_2_mbox_conf, conv8_2_mbox_priorbox = \
- ssd_block("8", conv7_2, img, 256, 128, 256,
- cfg.NET.CONV8.PB.ASPECT_RATIO,
- cfg.NET.CONV8.PB.VARIANCE,
- cfg.NET.CONV8.PB.MIN_SIZE,
- cfg.NET.CONV8.PB.MAX_SIZE)
-
- pool6 = paddle.layer.img_pool(
- name="pool6",
- input=conv8_2,
- pool_size=3,
- num_channels=256,
- stride=1,
- pool_type=paddle.pooling.Avg())
- POOL6_PB = cfg.NET.POOL6.PB
- loc_filter_size, conf_filter_size = get_loc_conf_filter_size(
- len(POOL6_PB.ASPECT_RATIO),
- len(POOL6_PB.MIN_SIZE), len(POOL6_PB.MAX_SIZE))
- pool6_mbox_loc, pool6_mbox_conf = mbox_block(
- "pool6", pool6, 256, 3, loc_filter_size, conf_filter_size)
- pool6_mbox_priorbox = paddle.layer.priorbox(
- input=pool6,
- image=img,
- min_size=cfg.NET.POOL6.PB.MIN_SIZE,
- max_size=cfg.NET.POOL6.PB.MAX_SIZE,
- aspect_ratio=cfg.NET.POOL6.PB.ASPECT_RATIO,
- variance=cfg.NET.POOL6.PB.VARIANCE)
-
- mbox_priorbox = paddle.layer.concat(
- name="mbox_priorbox",
- input=[
- conv4_3_mbox_priorbox, fc7_mbox_priorbox, conv6_2_mbox_priorbox,
- conv7_2_mbox_priorbox, conv8_2_mbox_priorbox, pool6_mbox_priorbox
- ])
-
- loc_loss_input = [
- conv4_3_norm_mbox_loc, fc7_mbox_loc, conv6_2_mbox_loc, conv7_2_mbox_loc,
- conv8_2_mbox_loc, pool6_mbox_loc
- ]
-
- conf_loss_input = [
- conv4_3_norm_mbox_conf, fc7_mbox_conf, conv6_2_mbox_conf,
- conv7_2_mbox_conf, conv8_2_mbox_conf, pool6_mbox_conf
- ]
-
- detection_out = paddle.layer.detection_output(
- input_loc=loc_loss_input,
- input_conf=conf_loss_input,
- priorbox=mbox_priorbox,
- confidence_threshold=cfg.NET.DETOUT.CONFIDENCE_THRESHOLD,
- nms_threshold=cfg.NET.DETOUT.NMS_THRESHOLD,
- num_classes=cfg.CLASS_NUM,
- nms_top_k=cfg.NET.DETOUT.NMS_TOP_K,
- keep_top_k=cfg.NET.DETOUT.KEEP_TOP_K,
- background_id=cfg.BACKGROUND_ID,
- name="detection_output")
-
- if mode == 'train' or mode == 'eval':
- bbox = paddle.layer.data(
- name='bbox', type=paddle.data_type.dense_vector_sequence(6))
- loss = paddle.layer.multibox_loss(
- input_loc=loc_loss_input,
- input_conf=conf_loss_input,
- priorbox=mbox_priorbox,
- label=bbox,
- num_classes=cfg.CLASS_NUM,
- overlap_threshold=cfg.NET.MBLOSS.OVERLAP_THRESHOLD,
- neg_pos_ratio=cfg.NET.MBLOSS.NEG_POS_RATIO,
- neg_overlap=cfg.NET.MBLOSS.NEG_OVERLAP,
- background_id=cfg.BACKGROUND_ID,
- name="multibox_loss")
- paddle.evaluator.detection_map(
- input=detection_out,
- label=bbox,
- overlap_threshold=cfg.NET.DETMAP.OVERLAP_THRESHOLD,
- background_id=cfg.BACKGROUND_ID,
- evaluate_difficult=cfg.NET.DETMAP.EVAL_DIFFICULT,
- ap_type=cfg.NET.DETMAP.AP_TYPE,
- name="detection_evaluator")
- return loss, detection_out
- elif mode == 'infer':
- return detection_out
diff --git a/legacy/ssd/visual.py b/legacy/ssd/visual.py
deleted file mode 100644
index 278fd34af1a7e817012c27f38647f9ce76f0c803..0000000000000000000000000000000000000000
--- a/legacy/ssd/visual.py
+++ /dev/null
@@ -1,33 +0,0 @@
-import cv2
-import os
-
-data_dir = './data'
-infer_file = './infer.res'
-out_dir = './visual_res'
-
-if not os.path.exists(out_dir):
-    os.makedirs(out_dir)
-
-path_to_im = dict()
-
-for line in open(infer_file):
- img_path, _, _, _ = line.strip().split('\t')
- if img_path not in path_to_im:
- im = cv2.imread(os.path.join(data_dir, img_path))
- path_to_im[img_path] = im
-
-for line in open(infer_file):
- img_path, label, conf, bbox = line.strip().split('\t')
- xmin, ymin, xmax, ymax = map(float, bbox.split(' '))
- xmin = int(round(xmin))
- ymin = int(round(ymin))
- xmax = int(round(xmax))
- ymax = int(round(ymax))
-
-    img = path_to_im[img_path]
-    # infer.res stores pixel coordinates, so normalize xmin by the image
-    # width before using it to vary the box color
-    ratio = xmin / float(img.shape[1])
-    cv2.rectangle(img, (xmin, ymin), (xmax, ymax),
-                  (0, (1 - ratio) * 255, ratio * 255), 2)
-
-for img_path in path_to_im:
- im = path_to_im[img_path]
- out_path = os.path.join(out_dir, os.path.basename(img_path))
- cv2.imwrite(out_path, im)
-
-print 'Done.'
diff --git a/legacy/text_classification/.gitignore b/legacy/text_classification/.gitignore
deleted file mode 100644
index c979e15d6bf2cfd7b517c97f7a53bb57c0b1b139..0000000000000000000000000000000000000000
--- a/legacy/text_classification/.gitignore
+++ /dev/null
@@ -1,4 +0,0 @@
-data
-*.tar.gz
-*.log
-*.pyc
diff --git a/legacy/text_classification/README.md b/legacy/text_classification/README.md
deleted file mode 100644
index 0617e19d3061c2288b2c59dfe53e2053cb8d3be2..0000000000000000000000000000000000000000
--- a/legacy/text_classification/README.md
+++ /dev/null
@@ -1,202 +0,0 @@
-Running the program examples in this directory requires PaddlePaddle v0.10.0. If your installed PaddlePaddle version is lower than this requirement, please follow the instructions in the [installation document](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/pip_install_cn.html) to update your PaddlePaddle installation.
-
----
-
-# Text Classification
-
-The files contained in this directory and what they are for:
-
-```text
-.
-├── images              # figures used in this document
-│   ├── cnn_net.png
-│   └── dnn_net.png
-├── index.html          # the document
-├── infer.py            # the prediction script
-├── network_conf.py     # all network structures used in this example are defined here; start from this file to modify a model
-├── reader.py           # the data-reading interface; start from this file to use data in a custom format
-├── README.md           # the document
-├── run.sh              # the training launcher; running it starts a training job with the default arguments
-├── train.py            # the training script
-└── utils.py            # common helpers, e.g. logging, command-line parsing, building and loading the vocabulary
-```
-
-## Introduction
-Text classification, which decides the category a given piece of text belongs to, is an important fundamental task in natural language processing. The [sentiment classification](https://github.com/PaddlePaddle/book/blob/develop/06.understand_sentiment/README.cn.md) lesson in [PaddleBook](https://github.com/PaddlePaddle/book) is exactly such a typical text classification task, and its workflow is:
-
-1. Collect user review data from movie review sites.
-2. Clean and label the data.
-3. Design a model.
-4. Evaluate how well the model learns.
-
-A trained classifier can **automatically judge** whether the sentiment of a newly appearing user review is positive or negative, which plays an important role in tasks such as public opinion monitoring, marketing planning, and brand value assessment. The process above is also the routine flow to follow when tackling a new text classification task. It shows the great advantage of deep learning methods: **complicated feature engineering is unnecessary; only basic cleaning and labeling of the raw text is required**.
-
-The [sentiment classification](https://github.com/PaddlePaddle/book/blob/develop/06.understand_sentiment/README.cn.md) lesson in [PaddleBook](https://github.com/PaddlePaddle/book) introduces a fairly complex stacked bidirectional LSTM model. Recurrent neural networks have a clear advantage in complex tasks that require understanding language semantics, but they are computationally heavy and usually demand more tuning skill. In tasks with limited computation time, other models are also worth considering. Beyond computation time there is a more important point: **model selection is often the basis of a successful machine learning task**. The goal of a machine learning task is always to improve generalization, i.e. the ability to predict unknown, unseen samples:
-
-1. A model that is too simple cannot fit the training samples accurately, so it certainly cannot be expected to predict unknown samples outside the training set; this is the **underfitting** problem.
-2. Conversely, a model that is too complex easily "memorizes" every sample in the training set, yet has no ability to recognize unknown samples that never appeared there; this is the **overfitting** problem.
-
-"No Free Lunch (NFL)" is one of the basic principles of machine learning: no model is inherently superior to all others. Designing and choosing a model rests on understanding the characteristics of different models, but it is also a process of repeated experiments and evaluation. In this example we continue by introducing several of the most commonly used text classification models, which differ in capacity and complexity, so you can compare how well they learn and choose the appropriate one for different scenarios.
-
-## Model details
-
-`network_conf.py` contains the following models:
-
-1. `fc_net`: a DNN model. It is a non-sequence model built from basic fully connected layers.
-2. `convolution_net`: a shallow CNN model. It is a basic sequence model that accepts variable-length sequence input and extracts features within a local window.
-
-Taking sentiment classification as an example, here is a brief look at the difference between sequence and non-sequence models. Sentiment classification is a common text classification task: the model automatically judges whether the sentiment expressed in a text is positive or negative. Take the sentence "The apple is not bad", where "not bad" is the key to its sentiment:
-
-- The DNN model only knows that the sentence contains a "not" and a "bad"; the order between the two is lost when the text enters the network, so the network never gets a chance to learn the order information of the sequence.
-- The CNN model takes the text sequence as input and preserves the order information in "not bad".
-
-Their respective characteristics, briefly summarized (the sketch after this list makes the order argument concrete):
-
-1. A DNN can be far cheaper to compute than CNN / RNN models, which is an advantage in tasks with response-time requirements.
-2. A DNN mostly captures frequent-word features and may suffer from word segmentation errors, but it remains an effective model for tasks that keyword features handle well, such as spam SMS detection.
-3. On most text classification tasks that require some semantic understanding (e.g. using context to resolve ambiguity), sequence models such as CNN / RNN usually outperform DNN models.
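-
-To make the order argument concrete, here is a minimal plain-Python sketch (illustrative only, not part of this example's code): a bag-of-words view maps "not bad" and "bad not" to the same representation, while a sequence view keeps them apart.
-
-```python
-from collections import Counter
-
-a = "the apple is not bad".split()
-b = "the apple is bad not".split()
-
-# order-free view: only word counts survive, roughly what the DNN sees
-print(Counter(a) == Counter(b))  # True: the two inputs are indistinguishable
-
-# sequence view: order is preserved, what a CNN / RNN consumes
-print(a == b)  # False: the two inputs remain distinct
-```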
-
-### 1. The DNN model
-
-**The DNN model structure is shown in the figure below:**
-
-
-Figure 1. The DNN text classification model in this example (images/dnn_net.png)
-
-
-The PaddlePaddle implementation of this DNN structure is the `fc_net` function in `network_conf.py`. The model consists of the following parts:
-
-- **Word embedding layer**: to better represent the semantic relations between words, every word is first mapped to a vector of fixed dimension. After training, the semantic similarity of two words can be measured by the distance between their word vectors: the more similar, the closer. See the [word embedding](https://github.com/PaddlePaddle/book/tree/develop/04.word2vec) chapter of PaddleBook for more on word vectors.
-
-- **Max pooling layer**: max pooling runs over the time dimension. Pooling removes the differences in word count across samples and keeps, at each index of the word vectors, the maximum value observed over time, so the vector sequence produced by the embedding layer is reduced to a single fixed-dimension vector. For example, if the sequence before pooling is `[[2,3,5],[7,3,6],[1,4,0]]`, the pooled result is `[7,4,6]` (a numpy sketch follows this section).
-
-- **Fully connected hidden layers**: the pooled vector is fed into two consecutive fully connected hidden layers.
-
-- **Output layer**: the number of output neurons equals the number of classes, e.g. 2 in binary classification. With the Softmax activation, the output is a normalized probability distribution summing to 1, so the output of the $i$-th neuron can be read as the predicted probability that the sample belongs to class $i$.
-
-By default this DNN model classifies its input into two classes (`class_dim=2`), the embedding (word vector) dimension is 28 (`emd_dim=28`), and both hidden layers use the Tanh activation (`act=paddle.activation.Tanh()`). Note that the model's input is a sequence of integers rather than raw words: for convenience, words are id-ized in advance by frequency, i.e. each word is replaced by its index in the dictionary.
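-
-A quick numpy check of the max pooling example above (illustrative only):
-
-```python
-import numpy as np
-
-# 3 time steps (words), embedding dimension 3
-seq = np.array([[2, 3, 5],
-                [7, 3, 6],
-                [1, 4, 0]])
-
-# max over the time dimension yields one fixed-size vector per sample
-print(seq.max(axis=0))  # [7 4 6]
-```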
-
-### 2. The CNN model
-
-**The CNN model structure is shown in the figure below:**
-
-
-Figure 2. The CNN text classification model in this example (images/cnn_net.png)
-
-
-The PaddlePaddle implementation of this CNN structure is the `convolution_net` function in `network_conf.py`. The model consists of the following parts (a numpy sketch of the convolution and pooling arithmetic follows this list):
-
-- **Word embedding layer**: as in the DNN, words are mapped to fixed-dimension vectors and the distance between vectors represents the semantic similarity between words. As shown in Figure 2, each word vector is taken as a row vector, and the row vectors of all words in the text are stacked into a matrix. With an embedding dimension of 5 and the 7-word sentence "The cat sat on the red mat", the resulting matrix is 7*5. See the [word embedding](https://github.com/PaddlePaddle/book/tree/develop/04.word2vec) chapter of PaddleBook for more on word vectors.
-
-- **Convolution layer**: convolution in text classification runs over the time dimension, i.e. the kernel width equals the width of the matrix produced by the embedding layer, and the kernel slides along the height of the matrix. The result of the convolution is called a "feature map". With kernel height $h$, matrix height $N$, and stride 1, the feature map is a vector of height $N+1-h$. Several kernels of different heights can be used at the same time to obtain several feature maps.
-
-- **Max pooling layer**: max pooling is applied to each feature map separately. Since a feature map is itself a vector, max pooling here simply selects the largest element of each vector. The maxima are then concatenated into a new vector, whose dimension clearly equals the number of feature maps, i.e. the number of kernels. For example, suppose four different kernels produce the feature maps `[2,3,5]`, `[8,2,1]`, `[5,7,7,6]` and `[4,5,1,8]` (their sizes differ because the kernel heights differ); per-map max pooling gives `[5]`, `[8]`, `[7]` and `[8]`, and concatenating the results gives `[5,8,7,8]`.
-
-- **Fully connected and output layer**: the max pooling result goes through a fully connected layer to the output. As in the DNN model, the number of output neurons equals the number of classes and the outputs sum to 1.
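-
-A numpy sketch of the arithmetic above (illustrative only): a kernel of height h sliding with stride 1 over an N-step sequence yields a feature map of length N + 1 - h, and max pooling keeps one value per map.
-
-```python
-import numpy as np
-
-N, h, emb = 7, 3, 5
-mat = np.random.rand(N, emb)     # 7 words, embedding dimension 5
-kernel = np.random.rand(h, emb)  # the kernel spans the full embedding width
-
-feature_map = np.array(
-    [np.sum(mat[i:i + h] * kernel) for i in range(N + 1 - h)])
-print(len(feature_map))  # 5 == N + 1 - h
-
-maps = [np.array([2, 3, 5]), np.array([8, 2, 1]),
-        np.array([5, 7, 7, 6]), np.array([4, 5, 1, 8])]
-print([int(m.max()) for m in maps])  # [5, 8, 7, 8]
-```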
-
-The input data type of the CNN network is the same as the DNN's. PaddlePaddle ships a ready-made text sequence convolution module with pooling, `paddle.networks.sequence_conv_pool`, which can be called directly. Its `context_len` argument specifies how many words a kernel covers at a time, i.e. the kernel height in Figure 2, and `hidden_size` specifies the number of kernels of that type. By default, this example uses 128 kernels of size 3 and 128 kernels of size 4; after max pooling and concatenation their outputs form a 256-dimensional vector, which is passed through one fully connected layer to produce the final prediction.
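-
-A sketch of how the two kernel groups could be wired together with this module (paddle.v2 style; the variable names are illustrative, and the exact call should be checked against `network_conf.py`):
-
-```python
-import paddle.v2 as paddle
-
-# `data`, `emb_dim` and `class_dim` are assumed defined as in network_conf.py
-emb = paddle.layer.embedding(input=data, size=emb_dim)
-
-conv_3 = paddle.networks.sequence_conv_pool(
-    input=emb, context_len=3, hidden_size=128)  # 128 kernels of height 3
-conv_4 = paddle.networks.sequence_conv_pool(
-    input=emb, context_len=4, hidden_size=128)  # 128 kernels of height 4
-
-# the concatenated 256-d vector goes through one fully connected layer
-prob = paddle.layer.fc(input=[conv_3, conv_4],
-                       size=class_dim,
-                       act=paddle.activation.Softmax())
-```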
-
-## Running with PaddlePaddle's built-in data
-
-### How to train
-
-Run `sh run.sh` in a terminal to run this example directly on PaddlePaddle's built-in sentiment classification dataset `paddle.dataset.imdb`. You will see output like:
-
-```text
-Pass 0, Batch 0, Cost 0.696031, {'__auc_evaluator_0__': 0.47360000014305115, 'classification_error_evaluator': 0.5}
-Pass 0, Batch 100, Cost 0.544438, {'__auc_evaluator_0__': 0.839249312877655, 'classification_error_evaluator': 0.30000001192092896}
-Pass 0, Batch 200, Cost 0.406581, {'__auc_evaluator_0__': 0.9030032753944397, 'classification_error_evaluator': 0.2199999988079071}
-Test at Pass 0, {'__auc_evaluator_0__': 0.9289745092391968, 'classification_error_evaluator': 0.14927999675273895}
-```
-A log line is printed every 100 batches, containing: (1) the pass id; (2) the batch id; (3) the values of the evaluation metrics on the current batch, in order. The metrics are specified when the network topology is configured; the output above reports the AUC and the classification error rate on the training set.
-
-### How to predict
-
-After training finishes, the model is stored in the current working directory by default. Run `python infer.py` in a terminal and the prediction script will load the trained model and run prediction.
-
-- By default it loads the DNN model produced by one pass of training on `paddle.dataset.imdb.train` and tests it on `paddle.dataset.imdb.test`.
-
-You will see output like:
-
-```text
-positive 0.9275 0.0725 previous reviewer gave a much better of the films plot details than i could what i recall mostly is that it was just so beautiful in every sense emotionally visually just br if you like movies that are wonderful to look at and also have emotional content to which that beauty is relevant i think you will be glad to have seen this extraordinary and unusual work of br on a scale of 1 to 10 id give it about an the only reason i shy away from 9 is that it is a mood piece if you are in the mood for a really artistic very romantic film then its a 10 i definitely think its a mustsee but none of us can be in that mood all the time so overall
-negative 0.0300 0.9700 i love scifi and am willing to put up with a lot scifi are usually